---
language: en
tags:
- summarization
model-index:
- name: google/pegasus-xsum
results:
- task:
type: summarization
name: Summarization
dataset:
name: samsum
type: samsum
config: samsum
split: train
metrics:
- name: ROUGE-1
type: rouge
value: 21.8096
verified: true
- name: ROUGE-2
type: rouge
value: 4.2525
verified: true
- name: ROUGE-L
type: rouge
value: 17.4469
verified: true
- name: ROUGE-LSUM
type: rouge
value: 18.8907
verified: true
- name: loss
type: loss
value: 3.0317161083221436
verified: true
- name: gen_len
type: gen_len
value: 20.3122
verified: true
- task:
type: summarization
name: Summarization
dataset:
name: xsum
type: xsum
config: default
split: test
metrics:
- name: ROUGE-1
type: rouge
value: 46.8623
verified: true
- name: ROUGE-2
type: rouge
value: 24.4533
verified: true
- name: ROUGE-L
type: rouge
value: 39.0548
verified: true
- name: ROUGE-LSUM
type: rouge
value: 39.0994
verified: true
- name: loss
type: loss
value: 1.5717021226882935
verified: true
- name: gen_len
type: gen_len
value: 22.8821
verified: true
- task:
type: summarization
name: Summarization
dataset:
name: cnn_dailymail
type: cnn_dailymail
config: 3.0.0
split: test
metrics:
- name: ROUGE-1
type: rouge
value: 22.2062
verified: true
- name: ROUGE-2
type: rouge
value: 7.6701
verified: true
- name: ROUGE-L
type: rouge
value: 15.4046
verified: true
- name: ROUGE-LSUM
type: rouge
value: 19.2182
verified: true
- name: loss
type: loss
value: 2.681241273880005
verified: true
- name: gen_len
type: gen_len
value: 25.0234
verified: true
---
### Pegasus Models
See Docs: [here](https://huggingface.co/transformers/master/model_doc/pegasus.html)
Original TF 1 code [here](https://github.com/google-research/pegasus)
Authors: Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu on Dec 18, 2019
Maintained by: [@sshleifer](https://twitter.com/sam_shleifer)
Task: Summarization
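As a quick sanity check, the checkpoint can be loaded through the `transformers` pipeline API. This is a minimal sketch: the sample article below is illustrative, and the weights are downloaded on first use.

```python
from transformers import pipeline

# Load this card's checkpoint for abstractive summarization.
summarizer = pipeline("summarization", model="google/pegasus-xsum")

article = (
    "PG&E stated it scheduled the blackouts in response to forecasts for high "
    "winds amid dry conditions. The aim is to reduce the risk of wildfires."
)
print(summarizer(article)[0]["summary_text"])
```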
The following is copied from the authors' README.
# Mixed & Stochastic Checkpoints
We trained a PEGASUS model with sampled gap-sentence ratios on both C4 and HugeNews, and stochastically sampled important sentences. The updated results are reported in the table below.
| dataset | C4 | HugeNews | Mixed & Stochastic|
| ---- | ---- | ---- | ----|
| xsum | 45.20/22.06/36.99 | 47.21/24.56/39.25 | 47.60/24.83/39.64|
| cnn_dailymail | 43.90/21.20/40.76 | 44.17/21.47/41.11 | 44.16/21.56/41.30|
| newsroom | 45.07/33.39/41.28 | 45.15/33.51/41.33 | 45.98/34.20/42.18|
| multi_news | 46.74/17.95/24.26 | 47.52/18.72/24.91 | 47.65/18.75/24.95|
| gigaword | 38.75/19.96/36.14 | 39.12/19.86/36.24 | 39.65/20.47/36.76|
| wikihow | 43.07/19.70/34.79 | 41.35/18.51/33.42 | 46.39/22.12/38.41 *|
| reddit_tifu | 26.54/8.94/21.64 | 26.63/9.01/21.60 | 27.99/9.81/22.94|
| big_patent | 53.63/33.16/42.25 | 53.41/32.89/42.07 | 52.29/33.08/41.66 *|
| arxiv | 44.70/17.27/25.80 | 44.67/17.18/25.73 | 44.21/16.95/25.67|
| pubmed | 45.49/19.90/27.69 | 45.09/19.56/27.42 | 45.97/20.15/28.25|
| aeslc | 37.69/21.85/36.84 | 37.40/21.22/36.45 | 37.68/21.25/36.51|
| billsum | 57.20/39.56/45.80 | 57.31/40.19/45.82 | 59.67/41.58/47.59|
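Each cell in the table above packs three scores as ROUGE-1/ROUGE-2/ROUGE-L. A tiny helper (hypothetical, just for reading the table programmatically) splits a cell into named scores, tolerating the trailing `*` markers:

```python
def parse_rouge(cell: str) -> dict:
    """Split an 'R1/R2/RL' table cell into named ROUGE F1 scores."""
    r1, r2, rl = (float(v.strip(" *")) for v in cell.split("/"))
    return {"rouge1": r1, "rouge2": r2, "rougeL": rl}

# The xsum row under "Mixed & Stochastic":
print(parse_rouge("47.60/24.83/39.64"))  # → {'rouge1': 47.6, 'rouge2': 24.83, 'rougeL': 39.64}
```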
The "Mixed & Stochastic" model has the following changes from pegasus-large in the paper:
- trained on both C4 and HugeNews (the dataset mixture is weighted by the number of examples in each).
- trained for 1.5M steps instead of 500k (we observed slower convergence on pretraining perplexity).
- the model uniformly samples a gap-sentence ratio between 15% and 45%.
- important sentences are sampled with 20% uniform noise added to their importance scores.
- the SentencePiece tokenizer is updated to encode the newline character.

(*) The numbers for the wikihow and big_patent datasets are not comparable because of changes in tokenization and data:
- the wikihow dataset contains newline characters, which are useful for paragraph segmentation; the C4 and HugeNews models' SentencePiece tokenizer does not encode newlines and loses this information.
- we updated the BigPatent dataset to preserve casing; some format cleaning also changed (please refer to the change in TFDS).
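The stochastic gap-sentence selection described above can be sketched as follows. This is a hypothetical helper, not the authors' code; `importance` stands for per-sentence importance scores:

```python
import random

def choose_gap_sentences(importance, gsr_low=0.15, gsr_high=0.45,
                         noise=0.20, rng=random):
    """Sample a gap-sentence ratio uniformly in [15%, 45%], perturb the
    importance scores with 20% uniform noise, and mask the top-scoring
    sentences."""
    gsr = rng.uniform(gsr_low, gsr_high)
    noisy = [s * (1 + rng.uniform(-noise, noise)) for s in importance]
    k = max(1, round(gsr * len(importance)))
    ranked = sorted(range(len(importance)), key=lambda i: noisy[i], reverse=True)
    return sorted(ranked[:k])  # indices of sentences to mask

random.seed(0)
print(choose_gap_sentences([0.9, 0.1, 0.8, 0.3, 0.7, 0.2, 0.6, 0.4, 0.5, 0.0]))
```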
### Citation
```
@misc{zhang2019pegasus,
title={PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization},
author={Jingqing Zhang and Yao Zhao and Mohammad Saleh and Peter J. Liu},
year={2019},
eprint={1912.08777},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
---
license: apache-2.0
tags:
- vision
- image-classification
datasets:
- imagenet-1k
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg
example_title: Tiger
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg
example_title: Teapot
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg
example_title: Palace
---
# ConvNeXt V2 (tiny-sized model)
ConvNeXt V2 model pretrained using the FCMAE framework and fine-tuned on the ImageNet-1K dataset at resolution 224x224. It was introduced in the paper [ConvNeXt V2: Co-designing and Scaling ConvNets with Masked Autoencoders](https://arxiv.org/abs/2301.00808) by Woo et al. and first released in [this repository](https://github.com/facebookresearch/ConvNeXt-V2).
Disclaimer: The team releasing ConvNeXt V2 did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
ConvNeXt V2 is a pure convolutional model (ConvNet) that introduces a fully convolutional masked autoencoder framework (FCMAE) and a new Global Response Normalization (GRN) layer to ConvNeXt. ConvNeXt V2 significantly improves the performance of pure ConvNets on various recognition benchmarks.
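The GRN layer mentioned above can be sketched in PyTorch roughly as described in the paper. This is a simplified sketch operating on channels-last tensors, not the exact `transformers` implementation:

```python
import torch
import torch.nn as nn

class GRN(nn.Module):
    """Global Response Normalization: aggregate a global per-channel statistic,
    normalize it across channels, and use it to recalibrate the features."""
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1, 1, 1, dim))
        self.beta = nn.Parameter(torch.zeros(1, 1, 1, dim))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, H, W, C), channels last
        gx = torch.norm(x, p=2, dim=(1, 2), keepdim=True)     # global aggregation, (N, 1, 1, C)
        nx = gx / (gx.mean(dim=-1, keepdim=True) + self.eps)  # divisive normalization
        return self.gamma * (x * nx) + self.beta + x          # residual recalibration

x = torch.randn(2, 7, 7, 96)
print(GRN(96)(x).shape)  # torch.Size([2, 7, 7, 96])
```

Note that at initialization gamma and beta are zero, so the layer starts as an identity mapping.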

## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=convnextv2) to look for
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes:
```python
from transformers import AutoImageProcessor, ConvNextV2ForImageClassification
import torch
from datasets import load_dataset
dataset = load_dataset("huggingface/cats-image")
image = dataset["test"]["image"][0]
preprocessor = AutoImageProcessor.from_pretrained("facebook/convnextv2-tiny-1k-224")
model = ConvNextV2ForImageClassification.from_pretrained("facebook/convnextv2-tiny-1k-224")
inputs = preprocessor(image, return_tensors="pt")
with torch.no_grad():
logits = model(**inputs).logits
# model predicts one of the 1000 ImageNet classes
predicted_label = logits.argmax(-1).item()
print(model.config.id2label[predicted_label])
```
For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/convnextv2).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2301-00808,
author = {Sanghyun Woo and
Shoubhik Debnath and
Ronghang Hu and
Xinlei Chen and
Zhuang Liu and
In So Kweon and
Saining Xie},
title = {ConvNeXt {V2:} Co-designing and Scaling ConvNets with Masked Autoencoders},
journal = {CoRR},
volume = {abs/2301.00808},
year = {2023},
url = {https://doi.org/10.48550/arXiv.2301.00808},
doi = {10.48550/arXiv.2301.00808},
eprinttype = {arXiv},
eprint = {2301.00808},
timestamp = {Tue, 10 Jan 2023 15:10:12 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2301-00808.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
-0.007808685302734375,
0.0093231201171875,
0.0318603515625,
0.0782470703125,
0.00930023193359375,
0.00444793701171875,
0.05438232421875,
-0.0155487060546875,
-0.03106689453125,
-0.04156494140625,
0.0057220458984375,
-0.01195526123046875,
0.0210418701171875,
0.00811767578125,
0.03814697265625,
0.01377105712890625,
-0.0246734619140625,
0.0295257568359375,
0.025970458984375,
-0.04193115234375,
-0.026702880859375,
0.06060791015625,
0.003448486328125,
-0.0081329345703125,
0.055755615234375,
-0.020782470703125,
-0.032989501953125,
0.09112548828125,
0.0386962890625,
0.0682373046875,
-0.005413055419921875,
0.007587432861328125,
0.0738525390625,
0.03326416015625,
-0.006763458251953125,
-0.004520416259765625,
0.00365447998046875,
-0.054229736328125,
-0.007411956787109375,
-0.038726806640625,
-0.0032253265380859375,
0.021148681640625,
-0.048614501953125,
0.039825439453125,
-0.039276123046875,
-0.00971221923828125,
0.0024394989013671875,
0.0360107421875,
-0.08770751953125,
0.038604736328125,
0.0117645263671875,
0.08001708984375,
-0.058624267578125,
0.06915283203125,
0.037353515625,
-0.03448486328125,
-0.08441162109375,
-0.0380859375,
0.0015020370483398438,
-0.047149658203125,
0.036407470703125,
0.027191162109375,
0.02606201171875,
0.0120086669921875,
-0.079345703125,
-0.0665283203125,
0.1007080078125,
0.01482391357421875,
-0.046661376953125,
0.0175323486328125,
-0.0102386474609375,
0.037353515625,
-0.0293731689453125,
0.03607177734375,
0.01148223876953125,
0.0277099609375,
0.0272979736328125,
-0.05767822265625,
0.0165863037109375,
-0.038330078125,
0.0109710693359375,
-0.00782012939453125,
-0.07647705078125,
0.06787109375,
-0.01751708984375,
0.0015697479248046875,
0.00472259521484375,
0.061248779296875,
-0.0038433074951171875,
0.0284576416015625,
0.031341552734375,
0.0304412841796875,
0.03948974609375,
-0.0208892822265625,
0.0828857421875,
-0.0007123947143554688,
0.05303955078125,
0.0771484375,
0.035369873046875,
0.035003662109375,
0.01580810546875,
-0.007015228271484375,
0.0306549072265625,
0.08087158203125,
-0.033294677734375,
0.032806396484375,
0.0123138427734375,
0.006008148193359375,
-0.01446533203125,
-0.0028209686279296875,
-0.0384521484375,
0.0302734375,
0.0189971923828125,
-0.0309906005859375,
0.00910186767578125,
0.016326904296875,
0.007717132568359375,
-0.0282440185546875,
-0.0188446044921875,
0.035858154296875,
0.019012451171875,
-0.03564453125,
0.0645751953125,
-0.00763702392578125,
0.0557861328125,
-0.0261993408203125,
-0.0016050338745117188,
-0.0297393798828125,
0.0175018310546875,
-0.0225372314453125,
-0.050811767578125,
0.0223541259765625,
-0.024444580078125,
-0.013275146484375,
0.00553131103515625,
0.057373046875,
-0.03082275390625,
-0.047210693359375,
0.0211944580078125,
-0.0024127960205078125,
0.0161285400390625,
-0.0006775856018066406,
-0.07354736328125,
0.01727294921875,
-0.0059814453125,
-0.0377197265625,
0.012237548828125,
0.0276641845703125,
-0.014495849609375,
0.043121337890625,
0.048248291015625,
-0.01806640625,
0.006671905517578125,
-0.0250091552734375,
0.06573486328125,
-0.0215606689453125,
-0.0174407958984375,
-0.046356201171875,
0.044891357421875,
-0.01837158203125,
-0.028778076171875,
0.04412841796875,
0.050689697265625,
0.0787353515625,
-0.01232147216796875,
0.049346923828125,
-0.0284576416015625,
-0.0011806488037109375,
-0.0169219970703125,
0.0450439453125,
-0.05877685546875,
-0.0067138671875,
-0.0157012939453125,
-0.04559326171875,
-0.0302886962890625,
0.040313720703125,
-0.01290130615234375,
0.016510009765625,
0.03814697265625,
0.07379150390625,
-0.019683837890625,
-0.016143798828125,
0.019927978515625,
0.0259552001953125,
0.0192108154296875,
0.042724609375,
0.015899658203125,
-0.07757568359375,
0.03271484375,
-0.058380126953125,
-0.0167388916015625,
-0.04296875,
-0.048736572265625,
-0.060089111328125,
-0.058563232421875,
-0.043304443359375,
-0.05767822265625,
-0.0156402587890625,
0.073486328125,
0.082275390625,
-0.06573486328125,
-0.0017242431640625,
-0.0176849365234375,
0.001068115234375,
-0.034820556640625,
-0.017852783203125,
0.05126953125,
0.01319122314453125,
-0.05511474609375,
-0.019744873046875,
-0.0017223358154296875,
0.016448974609375,
-0.01226806640625,
-0.022186279296875,
-0.0118255615234375,
-0.004474639892578125,
0.0440673828125,
0.037017822265625,
-0.0369873046875,
-0.0240936279296875,
0.0027313232421875,
-0.0196075439453125,
0.00942230224609375,
0.03558349609375,
-0.044769287109375,
0.048797607421875,
0.036712646484375,
0.019195556640625,
0.05609130859375,
-0.0186767578125,
0.005962371826171875,
-0.054718017578125,
0.040496826171875,
0.004673004150390625,
0.01995849609375,
0.023529052734375,
-0.042724609375,
0.04583740234375,
0.040863037109375,
-0.04345703125,
-0.052001953125,
0.003604888916015625,
-0.1080322265625,
-0.00666046142578125,
0.08966064453125,
-0.007198333740234375,
-0.0295867919921875,
0.003887176513671875,
-0.0128173828125,
0.041778564453125,
-0.007198333740234375,
0.0289306640625,
0.026702880859375,
0.0013637542724609375,
-0.0389404296875,
-0.04156494140625,
0.041656494140625,
-0.007137298583984375,
-0.033599853515625,
-0.0249481201171875,
0.01467132568359375,
0.0269012451171875,
0.0081939697265625,
0.033660888671875,
-0.00809478759765625,
0.0286102294921875,
0.0159149169921875,
0.033477783203125,
-0.0283203125,
-0.01070404052734375,
-0.01593017578125,
-0.00981903076171875,
-0.0258026123046875,
-0.033599853515625
]
] |
valhalla/t5-base-qa-qg-hl | 2020-12-11T22:03:44.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"question-generation",
"dataset:squad",
"arxiv:1910.10683",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | valhalla | null | null | valhalla/t5-base-qa-qg-hl | 16 | 71,089 | transformers | 2022-03-02T23:29:05 | ---
datasets:
- squad
tags:
- question-generation
widget:
- text: "generate question: <hl> 42 <hl> is the answer to life, the universe and everything. </s>"
- text: "question: What is 42 context: 42 is the answer to life, the universe and everything. </s>"
license: mit
---
## T5 for multi-task QA and QG
This is a multi-task [t5-base](https://arxiv.org/abs/1910.10683) model trained for question answering and answer-aware question generation.
For question generation, the answer span is highlighted within the text with special highlight tokens (`<hl>`) and the input is prefixed with `generate question: `. For QA, the input is formatted as `question: question_text context: context_text </s>`
You can play with the model using the inference API. Here's how to use it:
For QG
`generate question: <hl> 42 <hl> is the answer to life, the universe and everything. </s>`
For QA
`question: What is 42 context: 42 is the answer to life, the universe and everything. </s>`
For more details, see [this](https://github.com/patil-suraj/question_generation) repo.
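The same input formats can also be built and inspected without the repo's custom pipelines. The helper functions below are an illustrative sketch (their names are ours, not part of the repo); the model-loading calls are commented out because they download weights:

```python
def build_qg_input(context: str, answer: str) -> str:
    """Highlight the answer span with <hl> tokens and add the QG task prefix."""
    highlighted = context.replace(answer, f"<hl> {answer} <hl>", 1)
    return f"generate question: {highlighted} </s>"

def build_qa_input(question: str, context: str) -> str:
    """Format a question/context pair for the QA task."""
    return f"question: {question} context: {context} </s>"

text = "42 is the answer to life, the universe and everything."
print(build_qg_input(text, "42"))
# generate question: <hl> 42 <hl> is the answer to life, the universe and everything. </s>
print(build_qa_input("What is 42 ?", text))
# question: What is 42 ? context: 42 is the answer to life, the universe and everything. </s>

# To run the model itself (downloads weights on first use):
# from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
# tokenizer = AutoTokenizer.from_pretrained("valhalla/t5-base-qa-qg-hl")
# model = AutoModelForSeq2SeqLM.from_pretrained("valhalla/t5-base-qa-qg-hl")
# ids = tokenizer(build_qg_input(text, "42"), return_tensors="pt").input_ids
# print(tokenizer.decode(model.generate(ids)[0], skip_special_tokens=True))
```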
### Model in action 🚀
You'll need to clone the [repo](https://github.com/patil-suraj/question_generation).
[](https://colab.research.google.com/github/patil-suraj/question_generation/blob/master/question_generation.ipynb)
```python
from pipelines import pipeline
nlp = pipeline("multitask-qa-qg", model="valhalla/t5-base-qa-qg-hl")
# to generate questions simply pass the text
nlp("42 is the answer to life, the universe and everything.")
# => [{'answer': '42', 'question': 'What is the answer to life, the universe and everything?'}]
# for qa pass a dict with "question" and "context"
nlp({
"question": "What is 42 ?",
"context": "42 is the answer to life, the universe and everything."
})
# => 'the answer to life, the universe and everything'
``` | 1,902 | [
[
-0.035308837890625,
-0.07659912109375,
0.036895751953125,
0.01535797119140625,
-0.0035152435302734375,
-0.0019969940185546875,
0.028045654296875,
-0.00937652587890625,
-0.0030117034912109375,
0.03173828125,
-0.0804443359375,
-0.0236053466796875,
-0.005603790283203125,
0.0230560302734375,
-0.0037059783935546875,
0.0826416015625,
0.019195556640625,
0.0016231536865234375,
-0.01678466796875,
0.003570556640625,
-0.034332275390625,
-0.0268707275390625,
-0.054107666015625,
-0.0081634521484375,
0.033416748046875,
0.0223541259765625,
0.0181121826171875,
0.019866943359375,
0.01496124267578125,
0.01558685302734375,
-0.010345458984375,
0.020904541015625,
-0.02362060546875,
0.025634765625,
-0.01354217529296875,
-0.0247039794921875,
-0.032989501953125,
-0.0232086181640625,
0.0286407470703125,
0.0341796875,
0.00872039794921875,
0.053619384765625,
-0.00823211669921875,
0.019744873046875,
-0.04754638671875,
0.0110931396484375,
-0.04180908203125,
-0.00630950927734375,
0.00994873046875,
-0.004810333251953125,
-0.007293701171875,
-0.030181884765625,
-0.0007886886596679688,
-0.045257568359375,
0.01285552978515625,
0.00980377197265625,
0.0584716796875,
0.0276336669921875,
-0.044097900390625,
-0.035675048828125,
-0.0176239013671875,
0.05889892578125,
-0.048828125,
0.0127716064453125,
0.004421234130859375,
0.007228851318359375,
-0.00858306884765625,
-0.0692138671875,
-0.06756591796875,
0.0084381103515625,
0.01499176025390625,
0.022369384765625,
0.017333984375,
-0.00537109375,
0.03558349609375,
0.026214599609375,
-0.0526123046875,
-0.039520263671875,
-0.0435791015625,
-0.003528594970703125,
0.0279388427734375,
0.032470703125,
0.0303802490234375,
-0.044677734375,
-0.0200042724609375,
-0.006931304931640625,
-0.03424072265625,
0.01690673828125,
0.0035800933837890625,
0.00028133392333984375,
0.01849365234375,
0.046142578125,
-0.0229949951171875,
0.052703857421875,
0.0163421630859375,
0.00249481201171875,
0.018035888671875,
-0.047454833984375,
-0.01378631591796875,
-0.033477783203125,
0.06402587890625,
0.0272369384765625,
0.00222015380859375,
-0.003490447998046875,
0.0213775634765625,
-0.007198333740234375,
0.036895751953125,
-0.062225341796875,
-0.0007185935974121094,
0.06695556640625,
-0.0060882568359375,
-0.024169921875,
-0.021697998046875,
-0.043182373046875,
-0.01139068603515625,
0.0116729736328125,
0.04571533203125,
-0.0306854248046875,
-0.013885498046875,
0.00583648681640625,
-0.0208587646484375,
0.03350830078125,
0.00788116455078125,
-0.06268310546875,
0.004589080810546875,
0.0256500244140625,
0.023101806640625,
0.007633209228515625,
-0.0396728515625,
-0.02734375,
0.007328033447265625,
-0.0185546875,
0.0440673828125,
-0.01226043701171875,
-0.01079559326171875,
-0.0073394775390625,
0.0164642333984375,
-0.0107879638671875,
-0.0289764404296875,
0.035552978515625,
-0.056884765625,
0.054534912109375,
-0.028045654296875,
-0.044097900390625,
-0.032501220703125,
0.044403076171875,
-0.040557861328125,
0.0833740234375,
0.01507568359375,
-0.07427978515625,
0.005130767822265625,
-0.075439453125,
-0.006366729736328125,
0.011474609375,
0.01239776611328125,
-0.033599853515625,
-0.02374267578125,
0.027130126953125,
0.038238525390625,
-0.01678466796875,
0.01453399658203125,
-0.016082763671875,
-0.007030487060546875,
0.012237548828125,
0.01021575927734375,
0.0836181640625,
0.02630615234375,
-0.00872039794921875,
0.021514892578125,
-0.0467529296875,
0.047027587890625,
0.00955963134765625,
-0.042877197265625,
0.015869140625,
-0.005863189697265625,
0.00640106201171875,
0.03485107421875,
0.0340576171875,
-0.048065185546875,
0.016143798828125,
-0.01739501953125,
0.04742431640625,
0.0254364013671875,
0.036285400390625,
0.0012502670288085938,
-0.04302978515625,
0.052886962890625,
-0.00130462646484375,
0.01416015625,
-0.0084686279296875,
-0.06304931640625,
-0.0556640625,
0.01137542724609375,
0.009124755859375,
0.06427001953125,
-0.061004638671875,
0.03289794921875,
0.0035419464111328125,
-0.040313720703125,
-0.04656982421875,
0.004383087158203125,
0.0268096923828125,
0.043792724609375,
0.034637451171875,
0.0013675689697265625,
-0.054534912109375,
-0.03973388671875,
-0.00980377197265625,
-0.00913238525390625,
-0.005413055419921875,
0.019622802734375,
0.063720703125,
-0.005611419677734375,
0.0643310546875,
-0.045928955078125,
0.000911712646484375,
-0.02032470703125,
0.006519317626953125,
0.0158538818359375,
0.037200927734375,
0.02642822265625,
-0.07080078125,
-0.034210205078125,
-0.035400390625,
-0.06103515625,
-0.0017871856689453125,
-0.0017576217651367188,
-0.0195465087890625,
-0.0064544677734375,
0.0211639404296875,
-0.051025390625,
0.0013675689697265625,
0.0311279296875,
-0.03192138671875,
0.0538330078125,
0.0017910003662109375,
0.020599365234375,
-0.1380615234375,
0.029571533203125,
-0.032623291015625,
-0.0149383544921875,
-0.05206298828125,
0.04010009765625,
0.007518768310546875,
-0.0247955322265625,
-0.052642822265625,
0.05096435546875,
-0.022674560546875,
0.01169586181640625,
-0.0091705322265625,
-0.00662994384765625,
0.0103759765625,
0.05023193359375,
-0.01180267333984375,
0.0625,
0.030364990234375,
-0.0596923828125,
0.046417236328125,
0.04205322265625,
-0.01885986328125,
0.0225830078125,
-0.0579833984375,
0.027191162109375,
0.00409698486328125,
0.019683837890625,
-0.080322265625,
-0.0256805419921875,
0.023345947265625,
-0.052978515625,
-0.022674560546875,
0.004077911376953125,
-0.050079345703125,
-0.046478271484375,
-0.0199432373046875,
0.026031494140625,
0.0404052734375,
-0.0170135498046875,
0.030792236328125,
0.0261383056640625,
-0.016357421875,
-0.053741455078125,
-0.032470703125,
-0.0106353759765625,
-0.0229034423828125,
-0.056060791015625,
0.0186767578125,
-0.038238525390625,
0.0027561187744140625,
0.00782012939453125,
0.0080718994140625,
-0.0240936279296875,
0.0172576904296875,
0.0143280029296875,
0.01165771484375,
-0.005237579345703125,
0.022918701171875,
0.00916290283203125,
0.01132965087890625,
0.01336669921875,
-0.0284423828125,
0.0501708984375,
-0.0269775390625,
-0.01140594482421875,
-0.01163482666015625,
0.049957275390625,
0.04705810546875,
-0.0308074951171875,
0.0455322265625,
0.048583984375,
-0.039154052734375,
-0.008636474609375,
-0.03155517578125,
-0.00748443603515625,
-0.029571533203125,
0.02838134765625,
-0.03515625,
-0.060577392578125,
0.05279541015625,
0.01045989990234375,
0.029388427734375,
0.052276611328125,
0.06793212890625,
-0.01181793212890625,
0.07171630859375,
0.028533935546875,
0.026397705078125,
0.0214996337890625,
-0.0241546630859375,
0.003734588623046875,
-0.0765380859375,
-0.018646240234375,
-0.023529052734375,
0.00862884521484375,
-0.027069091796875,
-0.035369873046875,
0.046142578125,
0.0260009765625,
-0.037506103515625,
0.01352691650390625,
-0.05535888671875,
0.0303802490234375,
0.06341552734375,
0.0010700225830078125,
0.003498077392578125,
-0.0255126953125,
-0.004779815673828125,
0.02203369140625,
-0.0655517578125,
-0.0364990234375,
0.094970703125,
0.0015783309936523438,
0.03125,
0.0019512176513671875,
0.059051513671875,
-0.0069580078125,
0.0012445449829101562,
-0.037933349609375,
0.040313720703125,
0.0016088485717773438,
-0.0810546875,
-0.027801513671875,
-0.01538848876953125,
-0.06640625,
0.016357421875,
-0.0183258056640625,
-0.05438232421875,
-0.0103607177734375,
0.0070037841796875,
-0.0567626953125,
0.01385498046875,
-0.059722900390625,
0.07745361328125,
-0.0029144287109375,
-0.0274810791015625,
0.006008148193359375,
-0.050079345703125,
0.032501220703125,
0.00644683837890625,
-0.0022640228271484375,
0.01233673095703125,
0.0195159912109375,
0.06146240234375,
-0.015533447265625,
0.0601806640625,
-0.0095672607421875,
0.001407623291015625,
0.045257568359375,
-0.006816864013671875,
0.0161285400390625,
0.0239105224609375,
0.0033283233642578125,
-0.019989013671875,
0.026824951171875,
-0.02215576171875,
-0.026611328125,
0.04351806640625,
-0.050048828125,
-0.039520263671875,
-0.0258636474609375,
-0.07342529296875,
-0.01424407958984375,
0.0299530029296875,
0.018646240234375,
0.01259613037109375,
-0.0010595321655273438,
0.006130218505859375,
0.05462646484375,
-0.033111572265625,
0.02801513671875,
0.025787353515625,
-0.016204833984375,
-0.03546142578125,
0.050384521484375,
-0.00017058849334716797,
0.01654052734375,
0.0340576171875,
0.00543975830078125,
-0.04510498046875,
-0.0084075927734375,
-0.04931640625,
0.00815582275390625,
-0.053253173828125,
-0.005809783935546875,
-0.06988525390625,
-0.027191162109375,
-0.048919677734375,
0.00827789306640625,
-0.0030384063720703125,
-0.034271240234375,
-0.0241851806640625,
0.0023403167724609375,
0.02606201171875,
0.037322998046875,
0.00580596923828125,
-0.0005679130554199219,
-0.061431884765625,
0.054168701171875,
0.058685302734375,
-0.006130218505859375,
-0.0132293701171875,
-0.01099395751953125,
0.0032062530517578125,
0.0122833251953125,
-0.057769775390625,
-0.077880859375,
0.01273345947265625,
0.00485992431640625,
0.02252197265625,
-0.002010345458984375,
0.00836181640625,
0.039947509765625,
-0.032684326171875,
0.08050537109375,
-0.0059814453125,
-0.0745849609375,
0.04119873046875,
-0.0338134765625,
0.04107666015625,
0.0251922607421875,
0.019500732421875,
-0.0538330078125,
-0.03570556640625,
-0.0594482421875,
-0.06817626953125,
0.04205322265625,
0.0197906494140625,
0.00684356689453125,
-0.011260986328125,
0.020538330078125,
0.005107879638671875,
0.031982421875,
-0.0704345703125,
-0.01538848876953125,
-0.043670654296875,
-0.01029205322265625,
0.0243682861328125,
-0.01406097412109375,
-0.0224151611328125,
-0.00981903076171875,
0.053070068359375,
-0.029876708984375,
0.030792236328125,
0.01340484619140625,
0.004974365234375,
0.005504608154296875,
0.047821044921875,
0.0455322265625,
0.0550537109375,
-0.0325927734375,
-0.00811004638671875,
0.030364990234375,
-0.0146026611328125,
0.0003402233123779297,
-0.0018320083618164062,
-0.0191192626953125,
-0.0245513916015625,
0.011871337890625,
0.049224853515625,
-0.030792236328125,
-0.04107666015625,
0.0166168212890625,
-0.0032596588134765625,
-0.03759765625,
-0.0261077880859375,
0.0160980224609375,
0.034698486328125,
0.016937255859375,
0.027923583984375,
-0.0227813720703125,
0.004505157470703125,
-0.04656982421875,
0.01192474365234375,
0.015869140625,
-0.01030731201171875,
-0.00955963134765625,
0.0731201171875,
0.0139923095703125,
-0.0274505615234375,
0.05438232421875,
-0.036773681640625,
-0.056488037109375,
0.06597900390625,
0.041015625,
0.0556640625,
0.008331298828125,
0.0335693359375,
0.041961669921875,
0.0089569091796875,
0.0233154296875,
0.0819091796875,
-0.0034618377685546875,
-0.058837890625,
-0.040008544921875,
-0.048919677734375,
-0.042572021484375,
0.03753662109375,
-0.03839111328125,
-0.01641845703125,
-0.01727294921875,
0.000827789306640625,
0.0005974769592285156,
0.0113067626953125,
-0.046417236328125,
0.052764892578125,
-0.006099700927734375,
0.04278564453125,
-0.04791259765625,
0.044281005859375,
0.0860595703125,
-0.0455322265625,
-0.073974609375,
0.00568389892578125,
-0.031097412109375,
-0.05352783203125,
0.04815673828125,
0.005657196044921875,
0.020721435546875,
0.02899169921875,
-0.050384521484375,
-0.053741455078125,
0.09197998046875,
0.00799560546875,
0.00565338134765625,
-0.024078369140625,
0.0113372802734375,
0.04339599609375,
-0.0183868408203125,
0.02557373046875,
0.0004062652587890625,
0.02667236328125,
0.0015764236450195312,
-0.051605224609375,
0.0147857666015625,
-0.0439453125,
-0.0109710693359375,
-0.0015649795532226562,
-0.054718017578125,
0.08648681640625,
-0.034210205078125,
-0.00868988037109375,
0.0135650634765625,
0.0291748046875,
0.05517578125,
0.002277374267578125,
0.0418701171875,
0.039031982421875,
0.0501708984375,
-0.016448974609375,
0.0689697265625,
-0.05389404296875,
0.048126220703125,
0.062744140625,
0.001201629638671875,
0.040130615234375,
0.044097900390625,
-0.0029506683349609375,
0.0288848876953125,
0.0645751953125,
-0.00830841064453125,
0.037841796875,
0.025970458984375,
-0.004863739013671875,
-0.0308837890625,
-0.005191802978515625,
-0.0256805419921875,
0.015411376953125,
0.018829345703125,
0.0013017654418945312,
-0.0203704833984375,
-0.007762908935546875,
-0.01470947265625,
-0.0302581787109375,
-0.00960540771484375,
0.048553466796875,
-0.0018301010131835938,
-0.0804443359375,
0.056976318359375,
-0.005970001220703125,
0.0250396728515625,
-0.048370361328125,
-0.014892578125,
0.00453948974609375,
-0.0003523826599121094,
-0.009857177734375,
-0.06976318359375,
0.01904296875,
-0.0023860931396484375,
-0.0263214111328125,
-0.0222625732421875,
0.037139892578125,
-0.0276641845703125,
-0.02813720703125,
-0.01197052001953125,
0.048614501953125,
0.0132293701171875,
-0.0019512176513671875,
-0.071044921875,
-0.040863037109375,
0.01047515869140625,
-0.0272674560546875,
-0.00811767578125,
0.0229644775390625,
0.003826141357421875,
0.069091796875,
0.040863037109375,
-0.01230621337890625,
0.03570556640625,
-0.00684356689453125,
0.055816650390625,
-0.049041748046875,
-0.045806884765625,
-0.0266265869140625,
0.06512451171875,
0.00681304931640625,
-0.051788330078125,
0.04705810546875,
0.03729248046875,
0.05352783203125,
-0.036346435546875,
0.0743408203125,
-0.043212890625,
0.050811767578125,
-0.033416748046875,
0.04754638671875,
-0.057647705078125,
-0.01004791259765625,
-0.013427734375,
-0.051483154296875,
-0.00601959228515625,
0.0343017578125,
-0.007038116455078125,
0.004238128662109375,
0.07275390625,
0.058258056640625,
0.00989532470703125,
-0.01132965087890625,
0.008148193359375,
0.00606536865234375,
0.0219879150390625,
0.072998046875,
0.06610107421875,
-0.050933837890625,
0.050872802734375,
-0.01140594482421875,
0.00884246826171875,
0.01439666748046875,
-0.03900146484375,
-0.064453125,
-0.06011962890625,
-0.01187896728515625,
-0.034423828125,
-0.00621795654296875,
0.03839111328125,
0.05224609375,
-0.06219482421875,
-0.0034999847412109375,
-0.02691650390625,
0.0223388671875,
-0.0258941650390625,
-0.0219268798828125,
0.03302001953125,
-0.032989501953125,
-0.061614990234375,
0.015045166015625,
-0.0170440673828125,
-0.01003265380859375,
-0.015838623046875,
0.01654052734375,
-0.03558349609375,
0.0022983551025390625,
0.027191162109375,
0.0195770263671875,
-0.0521240234375,
-0.0254058837890625,
0.0179443359375,
-0.002002716064453125,
-0.007228851318359375,
0.0240325927734375,
-0.047943115234375,
0.0235443115234375,
0.0562744140625,
0.044891357421875,
0.029510498046875,
0.007038116455078125,
0.03936767578125,
-0.055694580078125,
-0.031951904296875,
0.0279388427734375,
0.01393890380859375,
0.01262664794921875,
-0.023529052734375,
0.042388916015625,
0.00795745849609375,
-0.060211181640625,
-0.048828125,
0.025054931640625,
-0.07501220703125,
-0.02960205078125,
0.092529296875,
-0.00928497314453125,
-0.017059326171875,
-0.01505279541015625,
-0.0341796875,
0.035491943359375,
-0.0164642333984375,
0.06256103515625,
0.05548095703125,
-0.01641845703125,
-0.01526641845703125,
-0.044830322265625,
0.03839111328125,
0.04168701171875,
-0.08819580078125,
-0.01525115966796875,
0.0211944580078125,
0.04595947265625,
0.031707763671875,
0.055267333984375,
0.00777435302734375,
0.043731689453125,
-0.00199127197265625,
0.024688720703125,
-0.014434814453125,
0.004245758056640625,
0.010955810546875,
0.046539306640625,
-0.0211639404296875,
-0.042083740234375
]
] |
neulab/codebert-python | 2023-02-27T20:56:57.000Z | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"arxiv:2302.05527",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | neulab | null | null | neulab/codebert-python | 18 | 70,759 | transformers | 2022-09-23T15:01:36 | This is a `microsoft/codebert-base-mlm` model, trained for 1,000,000 steps (with `batch_size=32`) on **Python** code from the `codeparrot/github-code-clean` dataset, on the masked-language-modeling task.
It is intended to be used in CodeBERTScore: [https://github.com/neulab/code-bert-score](https://github.com/neulab/code-bert-score), but it can be used for any other task as well.
For more information, see: [https://github.com/neulab/code-bert-score](https://github.com/neulab/code-bert-score)
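As a RoBERTa-style masked language model, it can be queried directly with the `transformers` fill-mask pipeline. A minimal sketch (the helper name is ours; the pipeline call is commented out because it downloads model weights):

```python
def mask_identifier(code: str, target: str) -> str:
    """Replace the first occurrence of `target` with RoBERTa's <mask> token."""
    return code.replace(target, "<mask>", 1)

snippet = "def greet(name):\n    print(name)"
masked = mask_identifier(snippet, "print")
print(masked)
# def greet(name):
#     <mask>(name)

# To have the model fill in the mask (downloads weights on first use):
# from transformers import pipeline
# fill = pipeline("fill-mask", model="neulab/codebert-python")
# for candidate in fill(masked):
#     print(candidate["token_str"], round(candidate["score"], 3))
```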
## Citation
If you use this model for research, please cite:
```
@article{zhou2023codebertscore,
url = {https://arxiv.org/abs/2302.05527},
author = {Zhou, Shuyan and Alon, Uri and Agarwal, Sumit and Neubig, Graham},
title = {CodeBERTScore: Evaluating Code Generation with Pretrained Models of Code},
publisher = {arXiv},
year = {2023},
}
``` | 853 | [
[
-0.00965118408203125,
-0.042510986328125,
-0.00791168212890625,
0.0369873046875,
-0.0023441314697265625,
-0.007648468017578125,
-0.014007568359375,
-0.01129913330078125,
0.005466461181640625,
0.047576904296875,
-0.04351806640625,
-0.0478515625,
-0.02813720703125,
-0.00760650634765625,
-0.0443115234375,
0.1014404296875,
0.0162811279296875,
0.024749755859375,
-0.0037288665771484375,
-0.0015354156494140625,
-0.0263824462890625,
-0.0455322265625,
-0.027679443359375,
-0.0174713134765625,
0.036468505859375,
0.011993408203125,
0.040191650390625,
0.032989501953125,
0.0286102294921875,
0.00785064697265625,
0.01338958740234375,
-0.0162200927734375,
-0.049163818359375,
-0.00833892822265625,
0.017852783203125,
-0.0406494140625,
-0.06640625,
0.02813720703125,
0.0237884521484375,
0.05865478515625,
0.006130218505859375,
0.04937744140625,
0.01453399658203125,
0.059722900390625,
-0.033416748046875,
0.009033203125,
-0.050689697265625,
0.00225830078125,
0.02642822265625,
0.01055145263671875,
-0.04168701171875,
-0.0304412841796875,
0.004344940185546875,
-0.017059326171875,
0.02618408203125,
-0.004215240478515625,
0.07476806640625,
0.0015954971313476562,
-0.0102081298828125,
-0.02178955078125,
-0.04156494140625,
0.058685302734375,
-0.04913330078125,
0.0030574798583984375,
0.0295562744140625,
0.013427734375,
0.0078582763671875,
-0.062255859375,
-0.0196075439453125,
-0.032379150390625,
0.0164642333984375,
-0.023834228515625,
-0.0304718017578125,
-0.015716552734375,
0.049591064453125,
0.00164031982421875,
-0.06536865234375,
-0.015869140625,
-0.06787109375,
-0.020233154296875,
0.0406494140625,
0.01386260986328125,
-0.008514404296875,
-0.006374359130859375,
-0.04779052734375,
0.00801849365234375,
-0.05218505859375,
0.022064208984375,
0.038055419921875,
0.019439697265625,
-0.01337432861328125,
0.033416748046875,
-0.006710052490234375,
0.06640625,
-0.01126861572265625,
0.0102691650390625,
0.04583740234375,
-0.01495361328125,
-0.03765869140625,
-0.004974365234375,
0.060791015625,
0.0231475830078125,
0.055816650390625,
-0.004791259765625,
-0.025604248046875,
-0.004917144775390625,
0.037017822265625,
-0.07958984375,
-0.055267333984375,
0.0250396728515625,
-0.04083251953125,
-0.03753662109375,
0.0272369384765625,
-0.0175628662109375,
-0.0015735626220703125,
-0.014251708984375,
0.041107177734375,
-0.0242462158203125,
-0.015625,
0.0019311904907226562,
0.01007843017578125,
0.017791748046875,
0.0171966552734375,
-0.031005859375,
0.0020122528076171875,
0.040985107421875,
0.06414794921875,
0.0012407302856445312,
-0.0248565673828125,
-0.029937744140625,
-0.041229248046875,
-0.04669189453125,
0.01363372802734375,
-0.022216796875,
-0.0104522705078125,
0.0030155181884765625,
0.0158843994140625,
-0.01483917236328125,
-0.049957275390625,
0.0148773193359375,
-0.056427001953125,
0.00957489013671875,
0.0123291015625,
-0.037017822265625,
-0.0224761962890625,
0.0276336669921875,
-0.06243896484375,
0.076416015625,
0.027557373046875,
-0.0150604248046875,
0.03717041015625,
-0.05487060546875,
-0.006763458251953125,
0.0213165283203125,
-0.0031299591064453125,
-0.03851318359375,
-0.01137542724609375,
-0.013092041015625,
0.035308837890625,
0.00940704345703125,
0.042510986328125,
-0.0161590576171875,
-0.046173095703125,
0.0286865234375,
-0.0294342041015625,
0.07061767578125,
0.033294677734375,
-0.027191162109375,
0.005321502685546875,
-0.07769775390625,
0.022186279296875,
-0.002178192138671875,
-0.0360107421875,
0.0017757415771484375,
-0.0247802734375,
0.0261383056640625,
0.0313720703125,
0.03759765625,
-0.03863525390625,
0.0345458984375,
-0.01593017578125,
0.026824951171875,
0.042572021484375,
-0.012420654296875,
0.027984619140625,
-0.04150390625,
0.057769775390625,
-0.005802154541015625,
0.01433563232421875,
-0.0307769775390625,
-0.0521240234375,
-0.06524658203125,
-0.021636962890625,
0.05816650390625,
0.0265960693359375,
-0.0277252197265625,
0.05096435546875,
-0.0182037353515625,
-0.0469970703125,
-0.047149658203125,
0.01654052734375,
0.022308349609375,
0.0167236328125,
0.0275726318359375,
-0.01849365234375,
-0.044158935546875,
-0.057159423828125,
-0.0225982666015625,
0.0110626220703125,
-0.0175323486328125,
0.006103515625,
0.07342529296875,
-0.03375244140625,
0.0789794921875,
-0.0294189453125,
-0.0232086181640625,
-0.0103912353515625,
0.0199432373046875,
0.04986572265625,
0.06591796875,
0.03448486328125,
-0.045257568359375,
-0.03973388671875,
-0.03619384765625,
-0.0462646484375,
0.007472991943359375,
-0.0182037353515625,
-0.0097198486328125,
0.028411865234375,
0.035247802734375,
-0.0156402587890625,
0.039794921875,
0.06463623046875,
-0.0250701904296875,
0.04620361328125,
-0.00388336181640625,
0.01444244384765625,
-0.06719970703125,
0.024749755859375,
0.0018558502197265625,
-0.029083251953125,
-0.0462646484375,
-0.0028896331787109375,
0.0202789306640625,
-0.0282440185546875,
-0.03826904296875,
0.015167236328125,
-0.024139404296875,
0.0221405029296875,
-0.0251922607421875,
-0.026947021484375,
-0.00034928321838378906,
0.0606689453125,
-0.017974853515625,
0.04833984375,
0.041259765625,
-0.045745849609375,
0.018402099609375,
0.0004775524139404297,
-0.052581787109375,
-0.00505828857421875,
-0.0660400390625,
0.0278472900390625,
0.027313232421875,
0.005390167236328125,
-0.0709228515625,
-0.0006017684936523438,
0.0241241455078125,
-0.055419921875,
0.005466461181640625,
-0.022247314453125,
-0.05218505859375,
-0.0033092498779296875,
-0.01523590087890625,
0.04071044921875,
0.05181884765625,
-0.0231170654296875,
0.02667236328125,
0.01190185546875,
0.0092926025390625,
-0.0406494140625,
-0.05206298828125,
-0.00043320655822753906,
-0.004047393798828125,
-0.04669189453125,
-0.0032329559326171875,
-0.01346588134765625,
0.003582000732421875,
-0.0019254684448242188,
-0.01155853271484375,
-0.00910186767578125,
-0.0017642974853515625,
0.040985107421875,
0.027374267578125,
-0.016510009765625,
0.02398681640625,
-0.0284271240234375,
-0.0037384033203125,
0.0279693603515625,
-0.0166168212890625,
0.0584716796875,
-0.013916015625,
-0.01216888427734375,
-0.00882720947265625,
0.014495849609375,
0.034454345703125,
-0.009185791015625,
0.0806884765625,
0.0389404296875,
-0.033416748046875,
-0.03594970703125,
-0.0237579345703125,
-0.01061248779296875,
-0.029693603515625,
0.034759521484375,
-0.01568603515625,
-0.049560546875,
0.0278778076171875,
0.0029621124267578125,
-0.0023746490478515625,
0.0364990234375,
0.04730224609375,
-0.005313873291015625,
0.05712890625,
0.04119873046875,
-0.02960205078125,
0.03216552734375,
-0.046417236328125,
0.01119232177734375,
-0.03424072265625,
-0.0275421142578125,
-0.04437255859375,
-0.0191650390625,
-0.044586181640625,
-0.042266845703125,
0.00940704345703125,
0.022430419921875,
-0.04595947265625,
0.05419921875,
-0.03759765625,
0.016845703125,
0.05621337890625,
0.0245513916015625,
0.0084075927734375,
0.01136016845703125,
-0.0140533447265625,
0.01195526123046875,
-0.05645751953125,
-0.056121826171875,
0.1041259765625,
0.0322265625,
0.06890869140625,
-0.00890350341796875,
0.054656982421875,
0.038055419921875,
0.0270538330078125,
-0.032806396484375,
0.03717041015625,
0.00804901123046875,
-0.06231689453125,
-0.00476837158203125,
-0.037872314453125,
-0.085693359375,
-0.005847930908203125,
-0.01160430908203125,
-0.046630859375,
-0.01003265380859375,
0.01314544677734375,
-0.01462554931640625,
0.002735137939453125,
-0.05352783203125,
0.08062744140625,
-0.01092529296875,
-0.004451751708984375,
0.004550933837890625,
-0.04052734375,
0.0186309814453125,
-0.0216522216796875,
0.024169921875,
-0.00844573974609375,
0.009307861328125,
0.07220458984375,
-0.0289154052734375,
0.058807373046875,
-0.0203857421875,
-0.005298614501953125,
0.0189971923828125,
0.00728607177734375,
0.03900146484375,
-0.015777587890625,
-0.0110015869140625,
0.04559326171875,
-0.00904083251953125,
-0.03875732421875,
-0.020050048828125,
0.061370849609375,
-0.049896240234375,
-0.0097198486328125,
-0.041046142578125,
-0.047637939453125,
0.0127105712890625,
0.01033782958984375,
0.0186614990234375,
0.039093017578125,
0.003223419189453125,
0.04290771484375,
0.038299560546875,
-0.024932861328125,
0.02740478515625,
0.042205810546875,
-0.024688720703125,
-0.016571044921875,
0.07159423828125,
0.00647735595703125,
0.028961181640625,
0.01264190673828125,
-0.019317626953125,
0.004291534423828125,
-0.02142333984375,
-0.012420654296875,
0.0038318634033203125,
-0.05084228515625,
-0.01593017578125,
-0.044189453125,
-0.034698486328125,
-0.034942626953125,
-0.0007519721984863281,
-0.0171356201171875,
-0.014984130859375,
-0.03314208984375,
0.004634857177734375,
0.0163421630859375,
0.040191650390625,
0.01165771484375,
0.0005049705505371094,
-0.061126708984375,
0.0163116455078125,
-0.0034656524658203125,
0.02703857421875,
-0.0078125,
-0.05474853515625,
-0.05621337890625,
0.01629638671875,
-0.0174407958984375,
-0.0546875,
0.045623779296875,
-0.0033473968505859375,
0.048431396484375,
0.007904052734375,
0.01097869873046875,
0.0303192138671875,
-0.0251922607421875,
0.055877685546875,
0.01163482666015625,
-0.05975341796875,
0.038665771484375,
-0.01212310791015625,
0.04718017578125,
0.0408935546875,
0.053009033203125,
-0.00957489013671875,
-0.0355224609375,
-0.05352783203125,
-0.07440185546875,
0.04931640625,
0.034576416015625,
0.025421142578125,
0.0191650390625,
0.002597808837890625,
0.0009222030639648438,
0.038299560546875,
-0.09405517578125,
-0.0355224609375,
-0.0015707015991210938,
-0.0177459716796875,
-0.00731658935546875,
-0.01399993896484375,
-0.01200103759765625,
-0.0264892578125,
0.0626220703125,
-0.0019989013671875,
0.0247802734375,
-0.00901031494140625,
-0.042816162109375,
-0.0021076202392578125,
0.0018892288208007812,
0.0604248046875,
0.063232421875,
-0.040130615234375,
-0.022247314453125,
-0.01517486572265625,
-0.046875,
-0.01104736328125,
0.006069183349609375,
0.00794219970703125,
0.00485992431640625,
0.038055419921875,
0.049560546875,
0.00806427001953125,
-0.0589599609375,
0.048248291015625,
0.0118255615234375,
-0.046173095703125,
-0.03790283203125,
0.02117919921875,
0.0035266876220703125,
0.028839111328125,
0.04400634765625,
0.03399658203125,
-0.00589752197265625,
-0.01824951171875,
0.03326416015625,
0.0212860107421875,
-0.059478759765625,
0.0021953582763671875,
0.051666259765625,
0.0082855224609375,
-0.039642333984375,
0.05621337890625,
-0.026092529296875,
-0.0333251953125,
0.0648193359375,
0.01088714599609375,
0.0576171875,
0.01885986328125,
-0.0010805130004882812,
0.043304443359375,
0.0390625,
0.01306915283203125,
0.0204925537109375,
-0.00005602836608886719,
-0.042022705078125,
-0.0186920166015625,
-0.06402587890625,
-0.010345458984375,
0.0283050537109375,
-0.055694580078125,
0.0217437744140625,
-0.01280975341796875,
-0.0068359375,
-0.0007014274597167969,
0.004627227783203125,
-0.0599365234375,
0.022216796875,
0.00896453857421875,
0.08087158203125,
-0.05462646484375,
0.08563232421875,
0.03851318359375,
-0.040252685546875,
-0.0733642578125,
-0.002162933349609375,
-0.02117919921875,
-0.0677490234375,
0.0743408203125,
0.02777099609375,
0.01416015625,
0.00290679931640625,
-0.062255859375,
-0.0243988037109375,
0.07037353515625,
0.014251708984375,
-0.0550537109375,
0.006748199462890625,
0.01284027099609375,
0.042724609375,
-0.051239013671875,
0.0121307373046875,
0.0234375,
-0.0029449462890625,
-0.01381683349609375,
-0.055999755859375,
-0.006946563720703125,
-0.040069580078125,
-0.00504302978515625,
-0.0085906982421875,
-0.010467529296875,
0.10052490234375,
-0.0223846435546875,
0.0160064697265625,
0.024749755859375,
0.0175323486328125,
0.041595458984375,
0.01180267333984375,
0.03973388671875,
0.0211029052734375,
0.022216796875,
-0.0009236335754394531,
0.0552978515625,
-0.05889892578125,
0.07366943359375,
0.07659912109375,
0.00348663330078125,
0.05126953125,
0.0157012939453125,
-0.0239410400390625,
0.05255126953125,
0.042694091796875,
-0.0467529296875,
0.0269927978515625,
0.04736328125,
-0.00447845458984375,
-0.0123291015625,
0.032135009765625,
-0.042510986328125,
0.018463134765625,
0.0014667510986328125,
-0.073974609375,
-0.00446319580078125,
-0.0015039443969726562,
0.006561279296875,
-0.015838623046875,
-0.02276611328125,
0.03436279296875,
0.000518798828125,
-0.04083251953125,
0.0726318359375,
-0.00656890869140625,
0.046478271484375,
-0.0494384765625,
-0.02490234375,
-0.00908660888671875,
0.02093505859375,
-0.0087432861328125,
-0.031524658203125,
-0.019439697265625,
0.0250244140625,
-0.026092529296875,
-0.0250701904296875,
0.037384033203125,
-0.035614013671875,
-0.05926513671875,
0.020782470703125,
0.0251312255859375,
0.036590576171875,
-0.0257110595703125,
-0.06842041015625,
0.0012578964233398438,
0.010345458984375,
-0.0282135009765625,
0.03167724609375,
0.00704193115234375,
0.0163116455078125,
0.05206298828125,
0.053192138671875,
-0.007472991943359375,
0.01507568359375,
0.0160675048828125,
0.062255859375,
-0.05853271484375,
-0.0256500244140625,
-0.0584716796875,
0.049560546875,
0.00800323486328125,
-0.037506103515625,
0.06427001953125,
0.07080078125,
0.06591796875,
-0.03424072265625,
0.0545654296875,
-0.00894927978515625,
0.01910400390625,
-0.02642822265625,
0.0654296875,
-0.036773681640625,
0.0214385986328125,
-0.0311737060546875,
-0.068115234375,
-0.00775909423828125,
0.046173095703125,
0.0025119781494140625,
0.029510498046875,
0.039886474609375,
0.08538818359375,
0.0200958251953125,
-0.011444091796875,
0.03790283203125,
0.00479888916015625,
0.01275634765625,
0.042572021484375,
0.045379638671875,
-0.055938720703125,
0.056610107421875,
-0.012969970703125,
-0.015106201171875,
-0.02618408203125,
-0.054779052734375,
-0.0693359375,
-0.037689208984375,
-0.03302001953125,
-0.061798095703125,
-0.01523590087890625,
0.0784912109375,
0.067138671875,
-0.08062744140625,
-0.03448486328125,
-0.0164031982421875,
-0.0144195556640625,
-0.0225372314453125,
-0.020416259765625,
0.00439453125,
-0.035675048828125,
-0.058868408203125,
0.00420379638671875,
-0.006313323974609375,
-0.0208282470703125,
-0.035125732421875,
-0.011749267578125,
-0.0272369384765625,
-0.0163116455078125,
0.0294036865234375,
-0.00261688232421875,
-0.03314208984375,
-0.0016641616821289062,
0.0172271728515625,
-0.037200927734375,
0.004436492919921875,
0.057159423828125,
-0.059844970703125,
0.039703369140625,
0.0177001953125,
0.017120361328125,
0.0338134765625,
0.000027954578399658203,
0.043731689453125,
-0.0560302734375,
-0.005100250244140625,
0.017974853515625,
0.021087646484375,
-0.000044226646423339844,
-0.01004791259765625,
0.0389404296875,
0.018341064453125,
-0.047027587890625,
-0.060089111328125,
-0.0098876953125,
-0.0814208984375,
-0.0206146240234375,
0.10516357421875,
-0.0288848876953125,
-0.0004742145538330078,
-0.003147125244140625,
-0.007648468017578125,
0.006656646728515625,
-0.034881591796875,
0.030029296875,
0.040008544921875,
0.00222015380859375,
-0.00835418701171875,
-0.05487060546875,
0.0224761962890625,
0.0052642822265625,
-0.040557861328125,
-0.0256195068359375,
0.019439697265625,
0.046051025390625,
0.0176849365234375,
0.041229248046875,
0.01184844970703125,
0.0243377685546875,
0.0182037353515625,
0.03271484375,
-0.027496337890625,
-0.032196044921875,
-0.036956787109375,
0.034149169921875,
-0.005207061767578125,
-0.043060302734375
]
] |
microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract | 2023-11-06T18:04:15.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"fill-mask",
"exbert",
"en",
"arxiv:2007.15779",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | microsoft | null | null | microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract | 43 | 70,570 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- exbert
license: mit
widget:
- text: "[MASK] is a tyrosine kinase inhibitor."
---
## MSR BiomedBERT (abstracts only)
<div style="border: 2px solid orange; border-radius:10px; padding:0px 10px; width: fit-content;">
* This model was previously named **"PubMedBERT (abstracts)"**.
* You can either adopt the new model name `microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract` or upgrade your `transformers` library to version 4.22 or later if you need to keep loading the model under its old name.
</div>
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general-domain corpora, such as newswire and the Web. A prevailing assumption is that even domain-specific pretraining can benefit by starting from general-domain language models. [Recent work](https://arxiv.org/abs/2007.15779) shows that for domains with abundant unlabeled text, such as biomedicine, pretraining language models from scratch results in substantial gains over continual pretraining of general-domain language models.
This BiomedBERT is pretrained from scratch using _abstracts_ from [PubMed](https://pubmed.ncbi.nlm.nih.gov/). This model achieves state-of-the-art performance on several biomedical NLP tasks, as shown on the [Biomedical Language Understanding and Reasoning Benchmark](https://aka.ms/BLURB).
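## Usage

A minimal sketch of loading the checkpoint for fill-mask prediction, mirroring the widget example above. It assumes the `transformers` library is installed; the old repository name (`microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract`) resolves to the same weights on `transformers` 4.22+.

```python
NEW_NAME = "microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract"
OLD_NAME = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract"  # pre-rename id


def fill_mask(text: str, model_name: str = NEW_NAME):
    """Predict candidate tokens for [MASK] in `text`.

    Downloads the model weights on first call, so this needs network access.
    """
    # Deferred import so the module loads even without transformers installed.
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model=model_name)
    return unmasker(text)


# Usage (requires network access to fetch the weights):
# fill_mask("[MASK] is a tyrosine kinase inhibitor.")
```

Each returned prediction is a dict with `token_str` and `score` fields; the exact output format can vary slightly across `transformers` versions.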
## Citation
If you find BiomedBERT useful in your research, please cite the following paper:
```latex
@misc{pubmedbert,
author = {Yu Gu and Robert Tinn and Hao Cheng and Michael Lucas and Naoto Usuyama and Xiaodong Liu and Tristan Naumann and Jianfeng Gao and Hoifung Poon},
title = {Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing},
year = {2020},
eprint = {arXiv:2007.15779},
}
```
<a href="https://huggingface.co/exbert/?model=microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract&modelKind=bidirectional&sentence=Gefitinib%20is%20an%20EGFR%20tyrosine%20kinase%20inhibitor,%20which%20is%20often%20used%20for%20breast%20cancer%20and%20NSCLC%20treatment.&layer=10&heads=..0,1,2,3,4,5,6,7,8,9,10,11&threshold=0.7&tokenInd=17&tokenSide=right&maskInds=..&hideClsSep=true">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 2,306 | [
[
-0.0138702392578125,
-0.041290283203125,
0.040557861328125,
0.005443572998046875,
-0.0291900634765625,
0.006866455078125,
-0.017364501953125,
-0.0394287109375,
0.0213470458984375,
0.021820068359375,
-0.031890869140625,
-0.046722412109375,
-0.0556640625,
0.022430419921875,
-0.00641632080078125,
0.0968017578125,
0.00011610984802246094,
0.0180816650390625,
-0.025421142578125,
-0.0179443359375,
0.0068359375,
-0.0611572265625,
-0.0377197265625,
-0.037567138671875,
0.044891357421875,
-0.015869140625,
0.03955078125,
0.0256195068359375,
0.03521728515625,
0.0235137939453125,
-0.01554107666015625,
-0.003200531005859375,
-0.0217742919921875,
-0.006870269775390625,
-0.00925445556640625,
-0.00818634033203125,
-0.05963134765625,
0.0108184814453125,
0.039703369140625,
0.07489013671875,
0.00009316205978393555,
-0.004608154296875,
0.0137176513671875,
0.057708740234375,
-0.0285186767578125,
0.0009851455688476562,
-0.032867431640625,
0.0010166168212890625,
-0.011932373046875,
-0.013458251953125,
-0.03533935546875,
-0.0204925537109375,
0.03887939453125,
-0.036041259765625,
0.0210723876953125,
-0.0021114349365234375,
0.09613037109375,
0.002185821533203125,
-0.01611328125,
0.01190948486328125,
-0.032623291015625,
0.070556640625,
-0.07440185546875,
0.034454345703125,
0.030548095703125,
-0.0009851455688476562,
0.00005060434341430664,
-0.08306884765625,
-0.023468017578125,
-0.025421142578125,
-0.0154571533203125,
0.00931549072265625,
-0.04290771484375,
0.01067352294921875,
0.005626678466796875,
0.00039124488830566406,
-0.0731201171875,
-0.01422882080078125,
-0.05413818359375,
-0.0202178955078125,
0.0303802490234375,
-0.0152435302734375,
0.025970458984375,
0.006313323974609375,
-0.03057861328125,
-0.003204345703125,
-0.057037353515625,
-0.00656890869140625,
-0.004817962646484375,
0.01256561279296875,
-0.0144805908203125,
0.0216217041015625,
0.00841522216796875,
0.067138671875,
-0.0003376007080078125,
0.004070281982421875,
0.0701904296875,
-0.0285186767578125,
-0.016937255859375,
0.0027408599853515625,
0.07427978515625,
0.00817108154296875,
0.03533935546875,
-0.0152587890625,
-0.0015058517456054688,
-0.01010894775390625,
0.037078857421875,
-0.06561279296875,
-0.03253173828125,
0.034759521484375,
-0.04132080078125,
-0.007373809814453125,
-0.01165008544921875,
-0.03070068359375,
0.0029449462890625,
-0.0275421142578125,
0.04254150390625,
-0.04998779296875,
-0.0037403106689453125,
0.0262298583984375,
0.007232666015625,
0.0020351409912109375,
0.0098419189453125,
-0.045013427734375,
0.010101318359375,
0.0092620849609375,
0.062744140625,
-0.0296630859375,
-0.019805908203125,
-0.01953125,
0.0067596435546875,
0.0025787353515625,
0.061279296875,
-0.0379638671875,
-0.0088348388671875,
-0.0028018951416015625,
0.0254974365234375,
-0.0193939208984375,
-0.031646728515625,
0.016632080078125,
-0.03948974609375,
0.0174560546875,
0.004673004150390625,
-0.0361328125,
-0.008819580078125,
-0.00936126708984375,
-0.029815673828125,
0.045013427734375,
0.00356292724609375,
-0.06719970703125,
0.00629425048828125,
-0.050384521484375,
-0.042205810546875,
-0.01096343994140625,
-0.01611328125,
-0.038543701171875,
0.00518035888671875,
0.004299163818359375,
0.034576416015625,
-0.00394439697265625,
0.0164337158203125,
-0.0147247314453125,
0.0010929107666015625,
0.0113067626953125,
0.0024814605712890625,
0.06939697265625,
0.006481170654296875,
-0.0262908935546875,
0.0212554931640625,
-0.06781005859375,
0.025665283203125,
0.00997161865234375,
-0.0268096923828125,
-0.0236358642578125,
-0.00547027587890625,
-0.002239227294921875,
0.028411865234375,
0.01715087890625,
-0.0443115234375,
0.00112152099609375,
-0.049224853515625,
0.0406494140625,
0.041473388671875,
0.0005922317504882812,
0.03253173828125,
-0.021697998046875,
0.042327880859375,
0.00835418701171875,
-0.00023937225341796875,
0.00728607177734375,
-0.038116455078125,
-0.038330078125,
-0.032684326171875,
0.0455322265625,
0.037750244140625,
-0.0572509765625,
0.048004150390625,
-0.0128021240234375,
-0.0220489501953125,
-0.0533447265625,
-0.0017385482788085938,
0.041595458984375,
0.039306640625,
0.06585693359375,
-0.041595458984375,
-0.048004150390625,
-0.0762939453125,
-0.021575927734375,
0.0083770751953125,
-0.0123291015625,
0.007343292236328125,
0.0377197265625,
-0.03948974609375,
0.061920166015625,
-0.029022216796875,
-0.017486572265625,
-0.033203125,
0.02655029296875,
0.025787353515625,
0.05194091796875,
0.038818359375,
-0.0426025390625,
-0.03778076171875,
-0.0210113525390625,
-0.048736572265625,
-0.0236358642578125,
0.007595062255859375,
-0.019439697265625,
0.00925445556640625,
0.042388916015625,
-0.0513916015625,
0.034454345703125,
0.04681396484375,
-0.0170135498046875,
0.050140380859375,
-0.0335693359375,
-0.01236724853515625,
-0.0726318359375,
0.025146484375,
0.00344085693359375,
-0.022674560546875,
-0.071533203125,
-0.0168304443359375,
0.001766204833984375,
0.0027217864990234375,
-0.0386962890625,
0.03753662109375,
-0.045135498046875,
0.0243682861328125,
-0.0211334228515625,
0.01071929931640625,
0.01316070556640625,
0.04290771484375,
0.0266571044921875,
0.05029296875,
0.046630859375,
-0.049468994140625,
-0.020904541015625,
0.036956787109375,
-0.02154541015625,
-0.0014696121215820312,
-0.0889892578125,
0.007282257080078125,
-0.02471923828125,
0.0215301513671875,
-0.067626953125,
0.00543212890625,
0.01210784912109375,
-0.047698974609375,
0.043212890625,
0.011505126953125,
-0.0156402587890625,
-0.0069732666015625,
-0.0232391357421875,
0.0249481201171875,
0.04888916015625,
-0.01084136962890625,
0.04022216796875,
0.0322265625,
-0.040069580078125,
-0.04730224609375,
-0.061614990234375,
-0.0160064697265625,
0.02197265625,
-0.03863525390625,
0.046966552734375,
-0.016387939453125,
0.0071868896484375,
-0.004852294921875,
-0.005260467529296875,
-0.01557159423828125,
-0.0177154541015625,
0.012451171875,
0.03033447265625,
-0.0189971923828125,
0.017333984375,
0.01097869873046875,
-0.0163726806640625,
-0.0010929107666015625,
-0.01139068603515625,
0.04486083984375,
-0.01404571533203125,
-0.012359619140625,
-0.017822265625,
0.030975341796875,
0.0271759033203125,
-0.04132080078125,
0.0716552734375,
0.040069580078125,
-0.0165252685546875,
0.005718231201171875,
-0.0201416015625,
-0.02313232421875,
-0.034454345703125,
0.04437255859375,
0.006008148193359375,
-0.07318115234375,
0.0118560791015625,
-0.00940704345703125,
0.003650665283203125,
0.041778564453125,
0.05108642578125,
-0.0012693405151367188,
0.076904296875,
0.049652099609375,
0.01068878173828125,
0.01513671875,
-0.01425933837890625,
0.026031494140625,
-0.0662841796875,
-0.00521087646484375,
-0.038818359375,
-0.01177215576171875,
-0.01378631591796875,
-0.0311431884765625,
0.023193359375,
-0.002567291259765625,
-0.0256195068359375,
0.035064697265625,
-0.05548095703125,
0.0157928466796875,
0.033905029296875,
0.0193328857421875,
0.011688232421875,
0.01434326171875,
-0.042938232421875,
-0.00930023193359375,
-0.052978515625,
-0.046661376953125,
0.085205078125,
0.02972412109375,
0.045501708984375,
0.005313873291015625,
0.0501708984375,
0.005401611328125,
0.034454345703125,
-0.027008056640625,
0.034912109375,
-0.01126861572265625,
-0.058746337890625,
-0.007770538330078125,
-0.03271484375,
-0.0968017578125,
0.0136260986328125,
-0.025177001953125,
-0.0677490234375,
0.029205322265625,
0.0136871337890625,
-0.048614501953125,
0.00910186767578125,
-0.049774169921875,
0.06793212890625,
-0.0216217041015625,
-0.0211334228515625,
0.00795745849609375,
-0.07525634765625,
0.00428009033203125,
-0.0183868408203125,
0.0193023681640625,
-0.0032863616943359375,
0.0020542144775390625,
0.066650390625,
-0.032684326171875,
0.0643310546875,
-0.0124969482421875,
0.0098724365234375,
0.00604248046875,
-0.0219268798828125,
0.021881103515625,
-0.0220794677734375,
0.0079498291015625,
0.029754638671875,
0.0144805908203125,
-0.029541015625,
-0.00791168212890625,
0.0244293212890625,
-0.070556640625,
-0.03057861328125,
-0.052764892578125,
-0.018157958984375,
-0.0308990478515625,
0.0198516845703125,
0.056915283203125,
0.034271240234375,
-0.012054443359375,
0.028961181640625,
0.06610107421875,
-0.056182861328125,
0.0113372802734375,
0.051422119140625,
-0.01605224609375,
-0.024444580078125,
0.044891357421875,
-0.0016880035400390625,
0.023895263671875,
0.032501220703125,
-0.003589630126953125,
-0.020599365234375,
-0.051544189453125,
-0.0028133392333984375,
0.03680419921875,
-0.03662109375,
-0.0172882080078125,
-0.0823974609375,
-0.04150390625,
-0.038421630859375,
-0.006488800048828125,
-0.0214385986328125,
-0.029144287109375,
-0.026153564453125,
-0.0007548332214355469,
0.01678466796875,
0.0338134765625,
-0.017578125,
0.0142059326171875,
-0.07366943359375,
0.0213775634765625,
0.006237030029296875,
0.0192108154296875,
0.004161834716796875,
-0.058807373046875,
-0.0230560302734375,
0.00728607177734375,
-0.015838623046875,
-0.06793212890625,
0.04522705078125,
0.03466796875,
0.05596923828125,
0.017120361328125,
-0.00261688232421875,
0.0200958251953125,
-0.07415771484375,
0.048065185546875,
0.03466796875,
-0.0509033203125,
0.040252685546875,
-0.01788330078125,
0.03936767578125,
0.058990478515625,
0.069091796875,
-0.0159454345703125,
-0.029998779296875,
-0.0496826171875,
-0.09210205078125,
0.041290283203125,
0.0263824462890625,
0.0024261474609375,
-0.0169525146484375,
0.01107025146484375,
0.0039215087890625,
0.02227783203125,
-0.0677490234375,
-0.04107666015625,
-0.00849151611328125,
-0.021881103515625,
-0.01568603515625,
-0.021484375,
-0.0185089111328125,
-0.0594482421875,
0.06365966796875,
0.005008697509765625,
0.06524658203125,
0.03765869140625,
-0.0274810791015625,
0.00913238525390625,
0.026397705078125,
0.050262451171875,
0.06689453125,
-0.03497314453125,
0.00640869140625,
0.004276275634765625,
-0.0533447265625,
0.003025054931640625,
0.033905029296875,
0.00972747802734375,
0.02685546875,
0.03118896484375,
0.04833984375,
0.0137176513671875,
-0.04937744140625,
0.052459716796875,
-0.0010213851928710938,
-0.03692626953125,
-0.0117034912109375,
-0.00917816162109375,
0.0154876708984375,
0.00801849365234375,
0.03216552734375,
0.01245880126953125,
0.00055694580078125,
-0.0285797119140625,
0.0297698974609375,
0.0189971923828125,
-0.0419921875,
-0.031005859375,
0.061737060546875,
0.006130218505859375,
0.000797271728515625,
0.0267791748046875,
-0.00826263427734375,
-0.043060302734375,
0.0218048095703125,
0.048553466796875,
0.06561279296875,
-0.0194549560546875,
0.01165008544921875,
0.037841796875,
0.017059326171875,
0.006664276123046875,
0.02044677734375,
0.023162841796875,
-0.060089111328125,
-0.049591064453125,
-0.06982421875,
-0.004825592041015625,
0.02490234375,
-0.039154052734375,
-0.0181121826171875,
-0.038604736328125,
-0.035247802734375,
0.0279998779296875,
-0.0128326416015625,
-0.049041748046875,
0.0235443115234375,
-0.001956939697265625,
0.0614013671875,
-0.0526123046875,
0.079345703125,
0.0760498046875,
-0.04095458984375,
-0.0506591796875,
-0.017852783203125,
-0.00554656982421875,
-0.05548095703125,
0.058837890625,
-0.00036144256591796875,
-0.0006527900695800781,
-0.002899169921875,
-0.066162109375,
-0.0626220703125,
0.0592041015625,
0.0151214599609375,
-0.04931640625,
-0.0125732421875,
-0.005184173583984375,
0.057037353515625,
-0.027069091796875,
0.01372528076171875,
0.0285797119140625,
0.01727294921875,
-0.0115814208984375,
-0.062164306640625,
0.017974853515625,
-0.048919677734375,
-0.00823211669921875,
-0.0007996559143066406,
-0.022430419921875,
0.08624267578125,
-0.007022857666015625,
0.00513458251953125,
0.00910186767578125,
0.042938232421875,
0.0277862548828125,
0.0032978057861328125,
0.0180816650390625,
0.0293731689453125,
0.0513916015625,
-0.002655029296875,
0.0802001953125,
-0.035736083984375,
0.0255126953125,
0.0765380859375,
-0.0139617919921875,
0.06109619140625,
0.031585693359375,
-0.03143310546875,
0.0655517578125,
0.033203125,
0.00875091552734375,
0.05596923828125,
0.01800537109375,
-0.0196075439453125,
-0.0149993896484375,
0.011444091796875,
-0.059539794921875,
0.0108184814453125,
0.01331329345703125,
-0.054046630859375,
-0.01285552978515625,
0.00693511962890625,
0.0171051025390625,
-0.0102386474609375,
-0.0010089874267578125,
0.03790283203125,
0.02777099609375,
-0.020721435546875,
0.058135986328125,
-0.0106964111328125,
0.050689697265625,
-0.083740234375,
0.00391387939453125,
-0.0006656646728515625,
0.01654052734375,
-0.011199951171875,
-0.0283203125,
0.013671875,
0.002079010009765625,
-0.0180511474609375,
-0.01244354248046875,
0.062286376953125,
-0.03607177734375,
-0.0216522216796875,
0.025970458984375,
0.0478515625,
0.0249481201171875,
-0.007198333740234375,
-0.0716552734375,
-0.00409698486328125,
0.01134490966796875,
-0.031829833984375,
0.045318603515625,
0.0159454345703125,
0.0268096923828125,
0.033203125,
0.045684814453125,
0.0181121826171875,
-0.0169677734375,
0.0042572021484375,
0.0716552734375,
-0.04278564453125,
-0.0208282470703125,
-0.055267333984375,
0.046844482421875,
-0.01080322265625,
-0.0169219970703125,
0.050872802734375,
0.033416748046875,
0.049896240234375,
-0.031494140625,
0.0574951171875,
-0.00091552734375,
0.057037353515625,
-0.027313232421875,
0.08154296875,
-0.051971435546875,
0.000946044921875,
-0.033660888671875,
-0.061279296875,
-0.034637451171875,
0.07769775390625,
-0.0291900634765625,
0.036102294921875,
0.076904296875,
0.04925537109375,
0.007266998291015625,
-0.0185394287109375,
0.017120361328125,
0.03472900390625,
0.0006985664367675781,
0.045013427734375,
0.03607177734375,
-0.017669677734375,
0.0238037109375,
0.00119781494140625,
-0.02923583984375,
-0.010498046875,
-0.06695556640625,
-0.0762939453125,
-0.040557861328125,
-0.037933349609375,
-0.04498291015625,
0.0233001708984375,
0.087646484375,
0.05621337890625,
-0.08197021484375,
0.0080108642578125,
0.01044464111328125,
-0.0303802490234375,
-0.00702667236328125,
-0.010162353515625,
0.042266845703125,
-0.0240936279296875,
-0.0196533203125,
0.0208587646484375,
0.0194854736328125,
0.0153961181640625,
0.00785064697265625,
-0.0000029802322387695312,
-0.060211181640625,
0.01155853271484375,
0.052093505859375,
0.043121337890625,
-0.03125,
-0.01462554931640625,
-0.00806427001953125,
-0.0249481201171875,
0.017059326171875,
0.03680419921875,
-0.06048583984375,
0.023193359375,
0.0242462158203125,
0.05389404296875,
0.03570556640625,
-0.007137298583984375,
0.0517578125,
-0.06719970703125,
0.0051727294921875,
0.028900146484375,
0.0278167724609375,
0.02020263671875,
0.00017786026000976562,
0.034393310546875,
0.01354217529296875,
-0.05389404296875,
-0.04302978515625,
-0.006195068359375,
-0.07318115234375,
-0.047149658203125,
0.08197021484375,
-0.015716552734375,
-0.0185394287109375,
-0.0216827392578125,
-0.00891876220703125,
0.0244598388671875,
-0.016357421875,
0.03863525390625,
0.0384521484375,
-0.00849151611328125,
-0.0025844573974609375,
-0.053253173828125,
0.0592041015625,
0.04486083984375,
-0.055389404296875,
-0.0225982666015625,
0.0163116455078125,
0.0163421630859375,
0.0268402099609375,
0.06634521484375,
-0.0216827392578125,
0.01513671875,
-0.0258026123046875,
0.037200927734375,
0.005710601806640625,
-0.01824951171875,
-0.02960205078125,
-0.0082855224609375,
-0.00698089599609375,
-0.0014171600341796875
]
] |
bert-base-german-dbmdz-uncased | 2023-04-06T13:43:06.000Z | [
"transformers",
"pytorch",
"jax",
"safetensors",
"bert",
"fill-mask",
"de",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | bert-base-german-dbmdz-uncased | 2 | 70,117 | transformers | 2022-03-02T23:29:04 | ---
language: de
license: mit
---
This model is the same as [dbmdz/bert-base-german-uncased](https://huggingface.co/dbmdz/bert-base-german-uncased). See the [dbmdz/bert-base-german-uncased model card](https://huggingface.co/dbmdz/bert-base-german-uncased) for details on the model.
| 281 | [
[
-0.026824951171875,
-0.059112548828125,
0.018646240234375,
0.022247314453125,
-0.02972412109375,
0.01241302490234375,
0.00730133056640625,
-0.01593017578125,
0.05645751953125,
0.059295654296875,
-0.0814208984375,
-0.0421142578125,
-0.01224517822265625,
-0.027496337890625,
-0.0102691650390625,
0.054229736328125,
0.00644683837890625,
0.043426513671875,
-0.023956298828125,
-0.01358795166015625,
-0.00506591796875,
-0.0623779296875,
-0.034881591796875,
-0.06854248046875,
0.0227203369140625,
-0.006694793701171875,
0.07244873046875,
0.01544952392578125,
0.036163330078125,
0.021514892578125,
-0.0273590087890625,
-0.040557861328125,
-0.00771331787109375,
-0.0244598388671875,
-0.0041046142578125,
-0.023651123046875,
-0.07257080078125,
0.0035915374755859375,
0.036376953125,
0.07073974609375,
-0.04815673828125,
0.0021343231201171875,
-0.0187530517578125,
0.038238525390625,
-0.0230560302734375,
0.0269012451171875,
-0.01476287841796875,
0.021575927734375,
-0.0064239501953125,
0.0231781005859375,
-0.039459228515625,
0.0025634765625,
0.016754150390625,
-0.0280609130859375,
0.035064697265625,
-0.0164642333984375,
0.08111572265625,
-0.01085662841796875,
-0.02423095703125,
0.0035762786865234375,
-0.03717041015625,
0.0264892578125,
-0.050689697265625,
0.04669189453125,
-0.0209808349609375,
0.02947998046875,
-0.037078857421875,
-0.0260162353515625,
-0.0250701904296875,
-0.01177215576171875,
-0.00849151611328125,
0.0214385986328125,
-0.041717529296875,
0.03369140625,
0.025054931640625,
0.0283660888671875,
-0.049530029296875,
-0.00567626953125,
-0.0435791015625,
-0.0224151611328125,
0.046783447265625,
-0.00933837890625,
0.0185546875,
-0.01593017578125,
-0.0594482421875,
-0.021514892578125,
-0.05633544921875,
-0.004970550537109375,
0.03271484375,
0.042083740234375,
-0.04107666015625,
0.037353515625,
-0.00627899169921875,
0.049346923828125,
0.0242462158203125,
0.0101165771484375,
0.048126220703125,
0.0298309326171875,
-0.046600341796875,
0.0196533203125,
0.005035400390625,
0.03369140625,
0.0199432373046875,
-0.031402587890625,
-0.02276611328125,
-0.005664825439453125,
0.0233917236328125,
-0.06500244140625,
-0.039459228515625,
0.0007634162902832031,
-0.07562255859375,
0.0014591217041015625,
0.032135009765625,
-0.021942138671875,
0.007236480712890625,
0.00638580322265625,
0.0297393798828125,
-0.01158905029296875,
-0.043792724609375,
0.01450347900390625,
-0.023773193359375,
0.036163330078125,
0.003368377685546875,
-0.06011962890625,
0.030853271484375,
0.0426025390625,
0.028533935546875,
-0.0166015625,
0.01265716552734375,
-0.0101165771484375,
0.0172119140625,
-0.034027099609375,
0.0511474609375,
-0.04052734375,
-0.057891845703125,
0.024810791015625,
0.0186767578125,
0.0206298828125,
-0.03662109375,
0.06280517578125,
-0.05804443359375,
-0.0009527206420898438,
-0.039886474609375,
-0.0377197265625,
-0.0128173828125,
0.02490234375,
-0.07611083984375,
0.07452392578125,
0.0183868408203125,
-0.04058837890625,
0.055877685546875,
-0.046356201171875,
-0.0251922607421875,
0.03253173828125,
-0.013946533203125,
-0.034332275390625,
0.0264739990234375,
-0.02862548828125,
0.029876708984375,
-0.0017032623291015625,
0.0000762939453125,
-0.05096435546875,
-0.02496337890625,
-0.010589599609375,
-0.0056610107421875,
0.068115234375,
0.00904083251953125,
0.01021575927734375,
0.00618743896484375,
-0.08795166015625,
0.02734375,
0.022613525390625,
-0.04193115234375,
-0.03509521484375,
-0.012969970703125,
0.0177154541015625,
0.01261138916015625,
0.036468505859375,
-0.06072998046875,
0.0216827392578125,
-0.0174407958984375,
0.0016546249389648438,
0.036529541015625,
0.0034923553466796875,
0.02783203125,
-0.032806396484375,
0.024932861328125,
0.01259613037109375,
0.03753662109375,
0.0194091796875,
-0.04052734375,
-0.0482177734375,
-0.030548095703125,
0.035125732421875,
0.026611328125,
-0.034271240234375,
0.05682373046875,
-0.003582000732421875,
-0.05975341796875,
-0.01551055908203125,
-0.02142333984375,
0.035369873046875,
0.01421356201171875,
0.0162506103515625,
-0.059326171875,
-0.035125732421875,
-0.10076904296875,
0.00856781005859375,
-0.017242431640625,
-0.00531768798828125,
0.0281219482421875,
0.016876220703125,
-0.0301361083984375,
0.046478271484375,
-0.02655029296875,
-0.004833221435546875,
-0.01141357421875,
0.01324462890625,
0.062164306640625,
0.04022216796875,
0.08001708984375,
-0.0361328125,
-0.039398193359375,
-0.0273590087890625,
-0.0323486328125,
0.002071380615234375,
0.01023101806640625,
-0.0259246826171875,
0.02093505859375,
0.0175933837890625,
-0.08544921875,
0.043670654296875,
0.0213623046875,
-0.035400390625,
0.0151214599609375,
-0.05389404296875,
0.00260162353515625,
-0.0648193359375,
0.00495147705078125,
-0.0015039443969726562,
-0.003574371337890625,
-0.059417724609375,
0.017425537109375,
0.03009033203125,
0.01390838623046875,
-0.0218658447265625,
0.0229034423828125,
-0.0516357421875,
-0.0419921875,
-0.0015668869018554688,
-0.02667236328125,
-0.00952911376953125,
0.036895751953125,
0.02056884765625,
0.0213623046875,
0.04425048828125,
-0.014312744140625,
0.03875732421875,
0.032989501953125,
-0.02056884765625,
0.056884765625,
-0.062286376953125,
0.0018606185913085938,
-0.02825927734375,
0.0217437744140625,
-0.05291748046875,
0.0099639892578125,
0.01393890380859375,
0.0006275177001953125,
0.03546142578125,
-0.025390625,
-0.06329345703125,
-0.044921875,
-0.00971221923828125,
-0.0005545616149902344,
0.060150146484375,
-0.06658935546875,
0.06451416015625,
0.00946044921875,
-0.01128387451171875,
-0.040191650390625,
-0.05633544921875,
-0.015380859375,
-0.020294189453125,
-0.03985595703125,
0.06488037109375,
-0.039398193359375,
-0.0235748291015625,
0.009124755859375,
-0.0293426513671875,
-0.0290069580078125,
-0.00506591796875,
0.038482666015625,
0.05706787109375,
-0.03399658203125,
-0.003368377685546875,
0.00849151611328125,
-0.0055389404296875,
0.0191650390625,
0.0251312255859375,
0.046844482421875,
-0.008636474609375,
-0.013031005859375,
-0.001506805419921875,
0.042877197265625,
0.033477783203125,
0.00876617431640625,
0.056732177734375,
0.01666259765625,
-0.055206298828125,
-0.0287322998046875,
-0.045135498046875,
-0.023345947265625,
-0.04052734375,
-0.01404571533203125,
-0.06884765625,
-0.022186279296875,
0.05462646484375,
0.014495849609375,
-0.007167816162109375,
0.04791259765625,
0.05023193359375,
-0.0092620849609375,
0.07550048828125,
0.08331298828125,
-0.011505126953125,
0.02392578125,
-0.00365447998046875,
-0.0030956268310546875,
-0.0291290283203125,
-0.033477783203125,
-0.0101165771484375,
-0.0029468536376953125,
-0.00621795654296875,
-0.007007598876953125,
-0.010345458984375,
0.0229034423828125,
-0.04705810546875,
0.034149169921875,
-0.047149658203125,
0.03485107421875,
0.0654296875,
0.0118560791015625,
0.0219879150390625,
0.02825927734375,
-0.03289794921875,
-0.005580902099609375,
-0.049896240234375,
-0.02288818359375,
0.07080078125,
0.0170440673828125,
0.0533447265625,
0.015869140625,
0.048248291015625,
0.04425048828125,
0.0270233154296875,
-0.013153076171875,
0.022857666015625,
-0.01471710205078125,
-0.075439453125,
0.016510009765625,
-0.01271820068359375,
-0.056365966796875,
0.01335906982421875,
-0.04052734375,
-0.044097900390625,
0.01222991943359375,
-0.00946044921875,
-0.0290679931640625,
0.022247314453125,
-0.06329345703125,
0.07208251953125,
-0.0207672119140625,
0.017059326171875,
-0.02288818359375,
-0.033050537109375,
-0.00010961294174194336,
0.019317626953125,
-0.006359100341796875,
-0.0073699951171875,
0.03875732421875,
0.046783447265625,
-0.050262451171875,
0.045135498046875,
-0.036285400390625,
-0.0167999267578125,
0.0160980224609375,
0.0008482933044433594,
0.0243988037109375,
0.01085662841796875,
0.005046844482421875,
0.023956298828125,
0.0025730133056640625,
-0.0308990478515625,
-0.0210113525390625,
0.047393798828125,
-0.053924560546875,
-0.0311279296875,
-0.0271148681640625,
-0.01458740234375,
-0.0099945068359375,
0.032012939453125,
0.0214385986328125,
0.039031982421875,
-0.05474853515625,
-0.006542205810546875,
0.0677490234375,
-0.0032958984375,
0.033050537109375,
0.07611083984375,
0.0013523101806640625,
0.007312774658203125,
0.01418304443359375,
-0.01678466796875,
0.0036792755126953125,
0.037353515625,
0.0028209686279296875,
-0.02294921875,
-0.0235748291015625,
-0.048492431640625,
0.021697998046875,
-0.0408935546875,
-0.0113067626953125,
-0.0304107666015625,
-0.0430908203125,
-0.00502777099609375,
-0.024322509765625,
-0.0281982421875,
-0.0281829833984375,
-0.014129638671875,
-0.0240478515625,
0.058349609375,
0.041717529296875,
-0.028533935546875,
0.02783203125,
-0.038818359375,
0.0168609619140625,
0.0256500244140625,
0.06976318359375,
-0.01511383056640625,
-0.03485107421875,
0.03704833984375,
0.00917816162109375,
0.01036834716796875,
-0.05169677734375,
0.00751495361328125,
-0.009735107421875,
0.07489013671875,
0.04522705078125,
0.00464630126953125,
0.034027099609375,
-0.07196044921875,
0.03466796875,
0.05517578125,
-0.047271728515625,
0.00931549072265625,
-0.0207061767578125,
-0.00795745849609375,
0.0297393798828125,
0.01491546630859375,
-0.019989013671875,
0.0190277099609375,
-0.08062744140625,
-0.05023193359375,
0.057708740234375,
0.0007991790771484375,
0.0303192138671875,
0.021392822265625,
0.0167999267578125,
0.024078369140625,
0.03271484375,
-0.059600830078125,
-0.07000732421875,
-0.02764892578125,
0.0014314651489257812,
-0.005947113037109375,
-0.049285888671875,
-0.030975341796875,
-0.01306915283203125,
0.0521240234375,
0.021575927734375,
0.0413818359375,
-0.01264190673828125,
0.00968170166015625,
-0.01220703125,
0.004291534423828125,
0.0294952392578125,
0.00878143310546875,
-0.0631103515625,
-0.0264892578125,
0.0013151168823242188,
-0.0195159912109375,
-0.0196533203125,
0.0252227783203125,
-0.004413604736328125,
0.033050537109375,
0.01702880859375,
0.06512451171875,
0.03497314453125,
-0.043609619140625,
0.05377197265625,
0.007904052734375,
-0.037109375,
-0.058258056640625,
-0.004779815673828125,
0.024871826171875,
0.041778564453125,
0.0103302001953125,
-0.01450347900390625,
0.0162811279296875,
-0.0673828125,
0.0262908935546875,
0.057891845703125,
-0.0254364013671875,
-0.00939178466796875,
0.05743408203125,
0.01389312744140625,
-0.054962158203125,
0.0579833984375,
-0.01522064208984375,
-0.0288543701171875,
0.061248779296875,
0.047882080078125,
0.048828125,
-0.00806427001953125,
0.0333251953125,
0.007389068603515625,
0.0229949951171875,
-0.0012865066528320312,
0.060272216796875,
-0.0222320556640625,
-0.0546875,
0.005626678466796875,
-0.025177001953125,
-0.039306640625,
-0.00495147705078125,
-0.085205078125,
0.0063934326171875,
-0.055206298828125,
-0.0124969482421875,
0.00751495361328125,
-0.0009245872497558594,
-0.042022705078125,
0.0246124267578125,
0.02154541015625,
0.1102294921875,
-0.040252685546875,
0.076904296875,
0.05804443359375,
-0.0160369873046875,
-0.045806884765625,
-0.0301361083984375,
-0.01287841796875,
-0.08477783203125,
0.04046630859375,
-0.0009655952453613281,
0.010101318359375,
-0.035369873046875,
-0.0202789306640625,
-0.07037353515625,
0.0762939453125,
0.007537841796875,
-0.021575927734375,
0.0000527501106262207,
-0.006946563720703125,
0.05059814453125,
-0.022705078125,
0.027313232421875,
0.034912109375,
0.0272064208984375,
0.0328369140625,
-0.051910400390625,
-0.01338958740234375,
-0.036468505859375,
0.0032787322998046875,
0.014801025390625,
-0.0665283203125,
0.06640625,
0.009033203125,
-0.014739990234375,
0.0172119140625,
0.0301971435546875,
0.0308990478515625,
-0.0201873779296875,
0.043548583984375,
0.0491943359375,
0.0389404296875,
-0.0174560546875,
0.049163818359375,
-0.0134735107421875,
0.0379638671875,
0.056610107421875,
-0.04296875,
0.05126953125,
0.01271820068359375,
-0.01560211181640625,
0.06787109375,
0.03515625,
-0.046844482421875,
0.056976318359375,
0.0247650146484375,
-0.00504302978515625,
-0.02471923828125,
0.023193359375,
-0.057403564453125,
0.01535797119140625,
0.024871826171875,
-0.0209808349609375,
-0.044921875,
-0.024749755859375,
-0.00421142578125,
0.0016698837280273438,
-0.045440673828125,
0.034271240234375,
-0.01641845703125,
-0.0183563232421875,
0.02667236328125,
0.02752685546875,
0.06549072265625,
-0.052398681640625,
0.00684356689453125,
-0.01560211181640625,
0.01383209228515625,
0.003040313720703125,
-0.040313720703125,
0.00952911376953125,
-0.01262664794921875,
-0.05072021484375,
-0.0242767333984375,
0.051055908203125,
-0.027496337890625,
-0.054931640625,
0.038421630859375,
0.0276031494140625,
0.01482391357421875,
0.0142974853515625,
-0.07391357421875,
0.0013780593872070312,
-0.01349639892578125,
-0.00876617431640625,
0.004150390625,
0.00308990478515625,
-0.00165557861328125,
0.051544189453125,
0.0159759521484375,
-0.006473541259765625,
0.0171051025390625,
0.0160369873046875,
0.039825439453125,
-0.030670166015625,
-0.048126220703125,
-0.0253448486328125,
0.0204315185546875,
-0.0416259765625,
-0.023529052734375,
0.0487060546875,
0.06884765625,
0.05157470703125,
-0.055694580078125,
0.037322998046875,
0.0008878707885742188,
0.04254150390625,
-0.018646240234375,
0.06964111328125,
-0.0419921875,
-0.0252838134765625,
-0.032135009765625,
-0.0762939453125,
-0.002197265625,
0.075439453125,
0.041015625,
0.007472991943359375,
0.0198822021484375,
0.0361328125,
-0.0201873779296875,
0.01806640625,
0.0224456787109375,
0.0229034423828125,
-0.01326751708984375,
0.0377197265625,
0.02801513671875,
-0.01187896728515625,
0.01418304443359375,
-0.02734375,
-0.022430419921875,
-0.0162506103515625,
-0.0673828125,
-0.11602783203125,
-0.03033447265625,
-0.025634765625,
-0.014495849609375,
0.00286102294921875,
0.04425048828125,
0.078369140625,
-0.052276611328125,
-0.005558013916015625,
0.004375457763671875,
-0.008056640625,
-0.006954193115234375,
-0.01071929931640625,
0.0341796875,
0.046722412109375,
-0.041778564453125,
0.022491455078125,
0.0161285400390625,
0.035125732421875,
-0.00446319580078125,
0.0121612548828125,
0.0204620361328125,
0.031829833984375,
0.03619384765625,
0.0263214111328125,
-0.037017822265625,
-0.031158447265625,
-0.0125885009765625,
-0.009063720703125,
0.01490020751953125,
0.039947509765625,
-0.0440673828125,
0.0015802383422851562,
0.0144805908203125,
0.023529052734375,
0.054046630859375,
-0.0036258697509765625,
0.041259765625,
-0.060028076171875,
0.0300140380859375,
-0.015380859375,
0.0634765625,
0.002086639404296875,
-0.0227203369140625,
0.036956787109375,
0.00583648681640625,
0.004772186279296875,
-0.0545654296875,
0.0248260498046875,
-0.1253662109375,
-0.025604248046875,
0.0537109375,
0.01319122314453125,
-0.006927490234375,
0.02960205078125,
-0.01282501220703125,
-0.00621795654296875,
-0.03033447265625,
0.053802490234375,
0.06787109375,
0.01953125,
-0.013641357421875,
-0.0181427001953125,
0.0102691650390625,
0.01145172119140625,
-0.0361328125,
-0.00975799560546875,
0.0162353515625,
0.0361328125,
0.00763702392578125,
0.032989501953125,
-0.031707763671875,
0.021484375,
-0.00048470497131347656,
0.0533447265625,
0.006755828857421875,
-0.016998291015625,
-0.01385498046875,
-0.001811981201171875,
-0.02264404296875,
-0.0036945343017578125
]
] |
Helsinki-NLP/opus-mt-en-es | 2023-08-16T11:29:28.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"marian",
"text2text-generation",
"translation",
"en",
"es",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-es | 53 | 69,728 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
- es
tags:
- translation
license: apache-2.0
---
### eng-spa
* source group: English
* target group: Spanish
* OPUS readme: [eng-spa](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-spa/README.md)
* model: transformer
* source language(s): eng
* target language(s): spa
* model: transformer
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus-2020-08-18.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-spa/opus-2020-08-18.zip)
* test set translations: [opus-2020-08-18.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-spa/opus-2020-08-18.test.txt)
* test set scores: [opus-2020-08-18.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-spa/opus-2020-08-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newssyscomb2009-engspa.eng.spa | 31.0 | 0.583 |
| news-test2008-engspa.eng.spa | 29.7 | 0.564 |
| newstest2009-engspa.eng.spa | 30.2 | 0.578 |
| newstest2010-engspa.eng.spa | 36.9 | 0.620 |
| newstest2011-engspa.eng.spa | 38.2 | 0.619 |
| newstest2012-engspa.eng.spa | 39.0 | 0.625 |
| newstest2013-engspa.eng.spa | 35.0 | 0.598 |
| Tatoeba-test.eng.spa | 54.9 | 0.721 |
### System Info:
- hf_name: eng-spa
- source_languages: eng
- target_languages: spa
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-spa/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['en', 'es']
- src_constituents: {'eng'}
- tgt_constituents: {'spa'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-spa/opus-2020-08-18.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-spa/opus-2020-08-18.test.txt
- src_alpha3: eng
- tgt_alpha3: spa
- short_pair: en-es
- chrF2_score: 0.721
- bleu: 54.9
- brevity_penalty: 0.978
- ref_len: 77311.0
- src_name: English
- tgt_name: Spanish
- train_date: 2020-08-18 00:00:00
- src_alpha2: en
- tgt_alpha2: es
- prefer_old: False
- long_pair: eng-spa
- helsinki_git_sha: d2f0910c89026c34a44e331e785dec1e0faa7b82
- transformers_git_sha: f7af09b4524b784d67ae8526f0e2fcc6f5ed0de9
- port_machine: brutasse
- port_time: 2020-08-24-18:20 | 2,401 | [
[
-0.032562255859375,
-0.04827880859375,
0.0204315185546875,
0.0357666015625,
-0.02642822265625,
-0.0167999267578125,
-0.0211181640625,
-0.02978515625,
0.0217742919921875,
0.018890380859375,
-0.047332763671875,
-0.05841064453125,
-0.043548583984375,
0.0271759033203125,
-0.00371551513671875,
0.06866455078125,
-0.0081787109375,
0.0091094970703125,
0.0347900390625,
-0.0296478271484375,
-0.03533935546875,
-0.0171051025390625,
-0.05242919921875,
-0.022308349609375,
0.03240966796875,
0.0185699462890625,
0.033447265625,
0.031524658203125,
0.037017822265625,
0.0242919921875,
-0.03125,
0.021514892578125,
-0.01142120361328125,
-0.00868988037109375,
-0.0072021484375,
-0.035858154296875,
-0.041778564453125,
-0.02001953125,
0.062408447265625,
0.042327880859375,
0.01184844970703125,
0.032623291015625,
-0.005168914794921875,
0.051055908203125,
-0.0117034912109375,
0.0083770751953125,
-0.0386962890625,
-0.003086090087890625,
-0.027435302734375,
-0.0293426513671875,
-0.043548583984375,
-0.0200958251953125,
0.007724761962890625,
-0.041900634765625,
0.0079345703125,
0.01184844970703125,
0.12347412109375,
0.006099700927734375,
-0.02728271484375,
-0.00951385498046875,
-0.0279388427734375,
0.055694580078125,
-0.05670166015625,
0.0298004150390625,
0.03369140625,
-0.01059722900390625,
-0.009307861328125,
-0.0291290283203125,
-0.0196075439453125,
0.0015239715576171875,
-0.021209716796875,
0.0164337158203125,
-0.0192108154296875,
-0.01534271240234375,
0.01409149169921875,
0.04083251953125,
-0.05303955078125,
0.0074462890625,
-0.0267791748046875,
-0.0109100341796875,
0.034576416015625,
0.0083160400390625,
0.0224761962890625,
-0.037567138671875,
-0.028900146484375,
-0.0299835205078125,
-0.04144287109375,
0.016448974609375,
0.0302886962890625,
0.02978515625,
-0.03363037109375,
0.0467529296875,
-0.00757598876953125,
0.040374755859375,
0.00508880615234375,
-0.003726959228515625,
0.0537109375,
-0.04534912109375,
-0.010955810546875,
-0.0179595947265625,
0.0906982421875,
0.0190277099609375,
-0.0007314682006835938,
0.0034236907958984375,
-0.0216827392578125,
-0.018341064453125,
-0.00908660888671875,
-0.0626220703125,
0.01184844970703125,
0.01922607421875,
-0.0227508544921875,
-0.0149383544921875,
0.0043182373046875,
-0.0594482421875,
0.0113525390625,
0.00608062744140625,
0.039398193359375,
-0.0594482421875,
-0.022369384765625,
0.031707763671875,
-0.0015039443969726562,
0.0173187255859375,
-0.002872467041015625,
-0.0308685302734375,
0.01548004150390625,
0.0236968994140625,
0.072265625,
-0.01055908203125,
-0.03216552734375,
-0.0133514404296875,
0.00556182861328125,
-0.009857177734375,
0.052093505859375,
-0.00772857666015625,
-0.033935546875,
-0.01033782958984375,
0.033111572265625,
-0.012908935546875,
-0.0130615234375,
0.06414794921875,
-0.020843505859375,
0.047027587890625,
-0.0257568359375,
-0.03997802734375,
-0.02734375,
0.0247344970703125,
-0.06036376953125,
0.09783935546875,
0.01166534423828125,
-0.06561279296875,
0.0285186767578125,
-0.06622314453125,
-0.0178985595703125,
-0.0048675537109375,
0.014068603515625,
-0.054443359375,
-0.0017032623291015625,
0.0187225341796875,
0.02740478515625,
-0.0291748046875,
0.03436279296875,
0.0003707408905029297,
-0.0201568603515625,
0.0030670166015625,
-0.02642822265625,
0.09588623046875,
0.01497650146484375,
-0.0411376953125,
0.0019426345825195312,
-0.055572509765625,
-0.0007686614990234375,
0.0269927978515625,
-0.0298309326171875,
-0.0178680419921875,
-0.00786590576171875,
0.01654052734375,
0.0095367431640625,
0.0205535888671875,
-0.03936767578125,
0.0255889892578125,
-0.055816650390625,
0.0201873779296875,
0.056884765625,
0.0152740478515625,
0.0201263427734375,
-0.032440185546875,
0.0272674560546875,
0.01543426513671875,
0.006866455078125,
0.007068634033203125,
-0.04168701171875,
-0.05767822265625,
-0.02154541015625,
0.04296875,
0.0484619140625,
-0.054046630859375,
0.0657958984375,
-0.05303955078125,
-0.061431884765625,
-0.05084228515625,
-0.0171661376953125,
0.039398193359375,
0.0170745849609375,
0.03936767578125,
-0.01546478271484375,
-0.0364990234375,
-0.073486328125,
-0.015838623046875,
-0.01541900634765625,
0.0020008087158203125,
0.0176239013671875,
0.062225341796875,
0.0004410743713378906,
0.044281005859375,
-0.02996826171875,
-0.03778076171875,
-0.0159912109375,
0.0146484375,
0.03765869140625,
0.050506591796875,
0.05474853515625,
-0.062255859375,
-0.04205322265625,
0.00736236572265625,
-0.044677734375,
-0.0103302001953125,
-0.0017547607421875,
-0.01172637939453125,
0.0287322998046875,
-0.004856109619140625,
-0.041961669921875,
0.020050048828125,
0.04217529296875,
-0.06732177734375,
0.032806396484375,
-0.015533447265625,
0.033843994140625,
-0.10791015625,
0.0137786865234375,
-0.0003533363342285156,
-0.00444793701171875,
-0.02923583984375,
-0.0000756382942199707,
0.0022220611572265625,
0.01202392578125,
-0.04412841796875,
0.060211181640625,
-0.04852294921875,
-0.0014371871948242188,
0.0286407470703125,
0.006114959716796875,
0.004913330078125,
0.05780029296875,
-0.006717681884765625,
0.0755615234375,
0.043701171875,
-0.03082275390625,
0.0000032186508178710938,
0.02850341796875,
-0.028656005859375,
0.0185089111328125,
-0.052642822265625,
-0.01540374755859375,
0.02398681640625,
-0.00213623046875,
-0.054168701171875,
-0.00688934326171875,
0.01418304443359375,
-0.055755615234375,
0.0189361572265625,
-0.00934600830078125,
-0.052581787109375,
-0.017181396484375,
-0.029541015625,
0.03192138671875,
0.02972412109375,
-0.013519287109375,
0.055816650390625,
0.01343536376953125,
-0.0018310546875,
-0.045257568359375,
-0.0635986328125,
-0.00250244140625,
-0.016204833984375,
-0.052032470703125,
0.0293426513671875,
-0.00910186767578125,
0.0058135986328125,
0.0171051025390625,
0.0029582977294921875,
-0.0147552490234375,
0.00978851318359375,
0.00238037109375,
0.0198974609375,
-0.02252197265625,
0.003887176513671875,
0.001064300537109375,
-0.00643157958984375,
-0.0155792236328125,
-0.01384735107421875,
0.061187744140625,
-0.0347900390625,
-0.0174713134765625,
-0.0517578125,
0.0119781494140625,
0.040802001953125,
-0.0301361083984375,
0.08050537109375,
0.0390625,
-0.0220794677734375,
0.01025390625,
-0.043701171875,
0.0023479461669921875,
-0.030731201171875,
0.0287017822265625,
-0.0484619140625,
-0.05322265625,
0.064697265625,
0.0182342529296875,
0.0190277099609375,
0.07318115234375,
0.0537109375,
0.01171875,
0.0499267578125,
0.0233306884765625,
0.0077362060546875,
0.04156494140625,
-0.048583984375,
-0.00984954833984375,
-0.0531005859375,
-0.022705078125,
-0.059326171875,
-0.009490966796875,
-0.063720703125,
-0.0179443359375,
0.0220794677734375,
-0.00885772705078125,
-0.01238250732421875,
0.06134033203125,
-0.042938232421875,
0.025390625,
0.0433349609375,
0.0137786865234375,
0.024627685546875,
-0.00600433349609375,
-0.031219482421875,
-0.0066680908203125,
-0.0377197265625,
-0.0419921875,
0.0867919921875,
0.0252838134765625,
0.01222991943359375,
0.021759033203125,
0.050537109375,
0.00936126708984375,
0.009307861328125,
-0.0430908203125,
0.047149658203125,
-0.0096282958984375,
-0.06719970703125,
-0.0287322998046875,
-0.0302581787109375,
-0.07403564453125,
0.0251617431640625,
-0.0177001953125,
-0.047882080078125,
0.013519287109375,
-0.00730133056640625,
-0.00420379638671875,
0.048187255859375,
-0.059661865234375,
0.070556640625,
0.0009775161743164062,
-0.0276031494140625,
0.00836944580078125,
-0.0374755859375,
0.0087432861328125,
-0.00470733642578125,
0.0189056396484375,
-0.01366424560546875,
-0.01287841796875,
0.0648193359375,
-0.0210418701171875,
0.042449951171875,
-0.00980377197265625,
-0.004913330078125,
0.0151214599609375,
0.01522064208984375,
0.0416259765625,
-0.01044464111328125,
-0.02166748046875,
0.0283050537109375,
0.0078125,
-0.04156494140625,
-0.0138702392578125,
0.04052734375,
-0.061187744140625,
-0.032867431640625,
-0.04327392578125,
-0.044158935546875,
-0.0037746429443359375,
0.037078857421875,
0.04010009765625,
0.038330078125,
-0.0070343017578125,
0.043975830078125,
0.054534912109375,
-0.021148681640625,
0.036895751953125,
0.04168701171875,
0.0006847381591796875,
-0.041900634765625,
0.053375244140625,
0.0211181640625,
0.01285552978515625,
0.034912109375,
0.004451751708984375,
-0.0181427001953125,
-0.061004638671875,
-0.03399658203125,
0.033050537109375,
-0.0264892578125,
-0.025665283203125,
-0.045501708984375,
-0.00295257568359375,
-0.027435302734375,
0.00926971435546875,
-0.032806396484375,
-0.0290374755859375,
-0.010772705078125,
-0.024993896484375,
0.031585693359375,
0.027130126953125,
0.005321502685546875,
0.0155487060546875,
-0.060150146484375,
0.00954437255859375,
-0.0195770263671875,
0.034942626953125,
-0.0243682861328125,
-0.059356689453125,
-0.0232086181640625,
-0.0018186569213867188,
-0.0201263427734375,
-0.0797119140625,
0.040863037109375,
-0.00335693359375,
0.024810791015625,
0.005138397216796875,
0.004364013671875,
0.04541015625,
-0.03692626953125,
0.07623291015625,
-0.0029277801513671875,
-0.0653076171875,
0.046661376953125,
-0.0294036865234375,
0.027587890625,
0.050811767578125,
0.022216796875,
-0.0252838134765625,
-0.052703857421875,
-0.06573486328125,
-0.07025146484375,
0.057830810546875,
0.044189453125,
-0.007904052734375,
-0.006381988525390625,
0.00395965576171875,
-0.00019669532775878906,
-0.0149383544921875,
-0.09320068359375,
-0.036285400390625,
0.0121002197265625,
-0.029693603515625,
0.01000213623046875,
-0.0301055908203125,
-0.01264190673828125,
-0.0196533203125,
0.08245849609375,
0.0155792236328125,
0.01377105712890625,
0.034637451171875,
-0.01277923583984375,
0.0038585662841796875,
0.0292205810546875,
0.050506591796875,
0.033905029296875,
-0.0207977294921875,
-0.0101470947265625,
0.032745361328125,
-0.038787841796875,
0.007965087890625,
0.0034580230712890625,
-0.041351318359375,
0.0263214111328125,
0.0389404296875,
0.06683349609375,
0.0185089111328125,
-0.0347900390625,
0.04388427734375,
-0.0035343170166015625,
-0.034912109375,
-0.031646728515625,
-0.0206451416015625,
0.007747650146484375,
0.0083770751953125,
0.0254364013671875,
-0.003124237060546875,
0.004550933837890625,
-0.01464080810546875,
0.006389617919921875,
0.0092620849609375,
-0.014129638671875,
-0.0299530029296875,
0.0396728515625,
0.007083892822265625,
-0.027130126953125,
0.0156707763671875,
-0.022003173828125,
-0.0281219482421875,
0.043243408203125,
0.021759033203125,
0.08319091796875,
-0.0156707763671875,
-0.0089111328125,
0.05572509765625,
0.04058837890625,
-0.0054779052734375,
0.0259857177734375,
0.0199737548828125,
-0.04473876953125,
-0.0271759033203125,
-0.057403564453125,
0.00634765625,
0.008056640625,
-0.056884765625,
0.030242919921875,
0.0032444000244140625,
-0.0274505615234375,
-0.00911712646484375,
0.031402587890625,
-0.0458984375,
0.00345611572265625,
-0.02923583984375,
0.0810546875,
-0.0767822265625,
0.056549072265625,
0.054229736328125,
-0.061370849609375,
-0.07568359375,
-0.0117340087890625,
-0.021148681640625,
-0.0438232421875,
0.03985595703125,
0.00008082389831542969,
0.0013580322265625,
-0.0010509490966796875,
-0.019989013671875,
-0.0594482421875,
0.08868408203125,
0.025970458984375,
-0.0252532958984375,
-0.01861572265625,
-0.0005040168762207031,
0.043731689453125,
-0.00223541259765625,
0.0190277099609375,
0.029388427734375,
0.05780029296875,
-0.0094451904296875,
-0.08355712890625,
0.01076507568359375,
-0.036285400390625,
-0.0017442703247070312,
0.022735595703125,
-0.0606689453125,
0.060150146484375,
0.01435089111328125,
-0.0181427001953125,
0.0030078887939453125,
0.042877197265625,
0.0287017822265625,
0.002544403076171875,
0.040008544921875,
0.0670166015625,
0.0318603515625,
-0.043365478515625,
0.07177734375,
-0.0309906005859375,
0.049560546875,
0.067138671875,
0.021026611328125,
0.0584716796875,
0.04046630859375,
-0.026214599609375,
0.045074462890625,
0.060699462890625,
-0.01410675048828125,
0.0239410400390625,
-0.010528564453125,
-0.003650665283203125,
-0.01424407958984375,
-0.0126953125,
-0.039947509765625,
0.0258026123046875,
0.007022857666015625,
-0.0144500732421875,
-0.00409698486328125,
-0.01432037353515625,
0.031524658203125,
0.006855010986328125,
-0.01206207275390625,
0.050506591796875,
-0.01549530029296875,
-0.051177978515625,
0.051055908203125,
0.00040435791015625,
0.050567626953125,
-0.048553466796875,
0.0062713623046875,
-0.0185394287109375,
0.0081024169921875,
-0.008819580078125,
-0.06744384765625,
0.0252532958984375,
0.0178680419921875,
-0.0204010009765625,
-0.0214080810546875,
0.012725830078125,
-0.032196044921875,
-0.05462646484375,
0.032379150390625,
0.036285400390625,
0.0196533203125,
0.0178070068359375,
-0.058746337890625,
0.0009069442749023438,
0.01274871826171875,
-0.061553955078125,
-0.0032482147216796875,
0.06317138671875,
0.004398345947265625,
0.053985595703125,
0.03533935546875,
0.0192108154296875,
0.01065826416015625,
0.002262115478515625,
0.0538330078125,
-0.055023193359375,
-0.0352783203125,
-0.062469482421875,
0.049072265625,
-0.01105499267578125,
-0.043365478515625,
0.04986572265625,
0.0621337890625,
0.06500244140625,
-0.00347137451171875,
0.0229034423828125,
-0.02056884765625,
0.04010009765625,
-0.047821044921875,
0.049896240234375,
-0.06903076171875,
0.007419586181640625,
-0.01424407958984375,
-0.057159423828125,
-0.021087646484375,
0.021881103515625,
-0.0194244384765625,
-0.002544403076171875,
0.07513427734375,
0.058624267578125,
0.01092529296875,
-0.02679443359375,
0.0009479522705078125,
0.03179931640625,
0.0209808349609375,
0.06243896484375,
0.0187225341796875,
-0.06451416015625,
0.054962158203125,
-0.0218505859375,
0.00298309326171875,
-0.00010991096496582031,
-0.05584716796875,
-0.059539794921875,
-0.057098388671875,
-0.0134735107421875,
-0.0343017578125,
-0.008575439453125,
0.076904296875,
0.02325439453125,
-0.072265625,
-0.02423095703125,
-0.0003349781036376953,
0.00965118408203125,
-0.0203094482421875,
-0.01971435546875,
0.054046630859375,
-0.01168060302734375,
-0.07989501953125,
0.0109710693359375,
0.003704071044921875,
0.008392333984375,
0.0010728836059570312,
-0.003448486328125,
-0.0491943359375,
-0.001827239990234375,
0.01947021484375,
0.00800323486328125,
-0.06390380859375,
-0.01480865478515625,
0.0121612548828125,
-0.0221099853515625,
0.0179290771484375,
0.0020389556884765625,
-0.0195770263671875,
0.0117645263671875,
0.051605224609375,
0.0294342041015625,
0.04302978515625,
-0.01078033447265625,
0.0212554931640625,
-0.05621337890625,
0.0361328125,
0.02264404296875,
0.0484619140625,
0.01398468017578125,
-0.01300048828125,
0.0643310546875,
0.029937744140625,
-0.0221405029296875,
-0.08013916015625,
-0.004261016845703125,
-0.08990478515625,
-0.00778961181640625,
0.08050537109375,
-0.0178070068359375,
-0.026702880859375,
0.0158233642578125,
-0.0181121826171875,
0.026641845703125,
-0.034149169921875,
0.03875732421875,
0.0697021484375,
0.0224609375,
0.00946044921875,
-0.033050537109375,
0.01885986328125,
0.03662109375,
-0.058197021484375,
-0.01151275634765625,
0.0212249755859375,
0.032562255859375,
0.0252838134765625,
0.0535888671875,
-0.0293121337890625,
0.0148468017578125,
-0.0111236572265625,
0.0267181396484375,
-0.017333984375,
-0.006214141845703125,
-0.0190887451171875,
0.0152435302734375,
-0.005645751953125,
-0.01503753662109375
]
] |
Yntec/epiCPhotoGasm | 2023-10-31T12:57:32.000Z | [
"diffusers",
"Photorealistic",
"Realism",
"Girls",
"epinikion",
"text-to-image",
"stable-diffusion",
"stable-diffusion-diffusers",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/epiCPhotoGasm | 6 | 69,453 | diffusers | 2023-10-01T17:51:17 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- Photorealistic
- Realism
- Girls
- epinikion
- text-to-image
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
---
Original page: https://civitai.com/models/132632?modelVersionId=145885
UPDATE: Now with the MoistMixV2VAE baked in!
Comparison and prompt:

macro studio photo of old antique Victorian marmor figurine of cute Rinoa, chibi Rinoa Heartilly, eyeliner, very slim, arm warmers, necklace, sleeveless coat, black shirt, blue skirt, arm ribbon, bike shorts, boots, long hair, black hair, cozy home in the background with candles and plants, Rinoa on flat round porcelain base, by Michelangelo
Samples and prompt:


(hyperrealist painting of a girl as genie with a sun on each shoulder ), 1940, magazine ad, iconic. by Daniel F. Gerhartz and greg rutkowski, aggressive color palette, elegant, dream, fantasy, dynamic lighting, beautiful, poster, wlop, trending on artstation, wallpaper, 4 k, award winning, digital art, very | 1,392 | [
[
-0.0215606689453125,
-0.03125,
0.01314544677734375,
0.01338958740234375,
-0.0277862548828125,
-0.01178741455078125,
0.0190277099609375,
-0.0284271240234375,
0.05364990234375,
0.06072998046875,
-0.037445068359375,
-0.00981903076171875,
-0.03216552734375,
-0.01451873779296875,
-0.0014095306396484375,
0.0280914306640625,
0.01239013671875,
0.0262451171875,
-0.024932861328125,
0.007259368896484375,
-0.0157623291015625,
0.01184844970703125,
-0.0210113525390625,
-0.001201629638671875,
0.01016998291015625,
0.0465087890625,
0.050872802734375,
0.012115478515625,
0.0142669677734375,
0.02618408203125,
0.004688262939453125,
0.030609130859375,
-0.02984619140625,
0.0050048828125,
0.0005102157592773438,
-0.037811279296875,
-0.0176849365234375,
0.0203399658203125,
0.01259613037109375,
0.0144195556640625,
0.01366424560546875,
0.00908660888671875,
0.0018749237060546875,
0.03436279296875,
-0.07244873046875,
0.0033359527587890625,
-0.0111846923828125,
-0.0167999267578125,
-0.01305389404296875,
-0.0229949951171875,
-0.043487548828125,
-0.0307769775390625,
-0.0090789794921875,
-0.0565185546875,
0.02081298828125,
-0.035888671875,
0.0870361328125,
-0.0004413127899169922,
-0.0548095703125,
-0.0266876220703125,
-0.064697265625,
0.04132080078125,
-0.04144287109375,
0.03302001953125,
-0.0035076141357421875,
0.038055419921875,
-0.051513671875,
-0.07269287109375,
-0.031280517578125,
0.01526641845703125,
0.004032135009765625,
0.0538330078125,
-0.0186920166015625,
-0.0667724609375,
0.01413726806640625,
0.009674072265625,
-0.04388427734375,
-0.0294036865234375,
-0.02276611328125,
0.01934814453125,
0.033172607421875,
-0.01309967041015625,
0.0374755859375,
0.0051116943359375,
-0.0665283203125,
-0.0215606689453125,
-0.0556640625,
0.0180816650390625,
0.004550933837890625,
-0.00753021240234375,
-0.0219268798828125,
0.053314208984375,
-0.01235198974609375,
0.0234527587890625,
0.0083160400390625,
-0.00945281982421875,
0.0174713134765625,
-0.01470184326171875,
-0.02764892578125,
-0.028350830078125,
0.066650390625,
0.0645751953125,
0.0087890625,
0.01363372802734375,
0.00635528564453125,
-0.026336669921875,
0.0108184814453125,
-0.101318359375,
-0.025634765625,
-0.00588226318359375,
-0.0215606689453125,
-0.00962066650390625,
0.042327880859375,
-0.059539794921875,
-0.00945281982421875,
0.01367950439453125,
0.01568603515625,
-0.0165557861328125,
-0.033416748046875,
0.00891876220703125,
-0.00841522216796875,
0.0224609375,
0.034698486328125,
-0.045074462890625,
0.01309967041015625,
0.031280517578125,
0.04296875,
0.036407470703125,
0.03619384765625,
-0.00815582275390625,
0.008087158203125,
-0.0374755859375,
0.0672607421875,
-0.031768798828125,
-0.03460693359375,
-0.007038116455078125,
0.024932861328125,
0.0037059783935546875,
-0.055145263671875,
0.05889892578125,
-0.04144287109375,
0.020172119140625,
-0.04449462890625,
-0.0240020751953125,
-0.024993896484375,
-0.020294189453125,
-0.045867919921875,
0.0455322265625,
0.040069580078125,
-0.0665283203125,
0.05682373046875,
-0.0013685226440429688,
0.0018224716186523438,
0.011322021484375,
-0.007724761962890625,
-0.046478271484375,
0.0233917236328125,
-0.0024738311767578125,
0.0221405029296875,
-0.04852294921875,
-0.036895751953125,
-0.058380126953125,
-0.03192138671875,
0.038055419921875,
-0.0176239013671875,
0.07464599609375,
0.030853271484375,
-0.053985595703125,
-0.01416015625,
-0.06402587890625,
0.022247314453125,
0.04986572265625,
-0.00008189678192138672,
-0.032440185546875,
-0.03265380859375,
0.00710296630859375,
0.034027099609375,
0.031829833984375,
-0.036407470703125,
0.0014333724975585938,
-0.01337432861328125,
0.00984954833984375,
0.02801513671875,
0.0194854736328125,
0.00627899169921875,
-0.040802001953125,
0.04931640625,
0.005870819091796875,
0.037567138671875,
-0.00868988037109375,
-0.0445556640625,
-0.0831298828125,
-0.04425048828125,
0.0189056396484375,
0.02740478515625,
-0.059295654296875,
0.01776123046875,
0.0003190040588378906,
-0.07623291015625,
-0.04034423828125,
0.00909423828125,
0.0287628173828125,
0.01303863525390625,
-0.004299163818359375,
-0.03704833984375,
-0.01071929931640625,
-0.09832763671875,
-0.004749298095703125,
-0.005329132080078125,
-0.0181884765625,
0.043365478515625,
0.01419830322265625,
-0.00760650634765625,
0.036102294921875,
-0.02886962890625,
-0.00792694091796875,
0.01690673828125,
-0.002262115478515625,
0.0308380126953125,
0.0419921875,
0.07501220703125,
-0.0684814453125,
-0.054718017578125,
0.0011320114135742188,
-0.031585693359375,
-0.0008664131164550781,
0.0208282470703125,
-0.0299072265625,
-0.0134124755859375,
0.010467529296875,
-0.05767822265625,
0.0528564453125,
0.01666259765625,
-0.04119873046875,
0.051177978515625,
-0.037567138671875,
0.06622314453125,
-0.10076904296875,
-0.0192413330078125,
0.00611114501953125,
-0.01108551025390625,
-0.0243682861328125,
0.037139892578125,
0.0240936279296875,
0.0155487060546875,
-0.06817626953125,
0.043060302734375,
-0.06396484375,
0.00829315185546875,
-0.02545166015625,
-0.006771087646484375,
0.0199127197265625,
0.0022735595703125,
-0.01389312744140625,
0.0684814453125,
0.0355224609375,
-0.032806396484375,
0.027313232421875,
0.022216796875,
-0.03607177734375,
0.028839111328125,
-0.07122802734375,
0.00995635986328125,
0.005123138427734375,
0.0001169443130493164,
-0.07879638671875,
-0.0175323486328125,
0.037689208984375,
-0.047882080078125,
0.0162506103515625,
-0.01165008544921875,
-0.054168701171875,
-0.016845703125,
-0.03143310546875,
0.0305938720703125,
0.049957275390625,
-0.02349853515625,
0.0259246826171875,
0.01235198974609375,
0.01910400390625,
-0.015045166015625,
-0.0496826171875,
0.0060272216796875,
-0.0259857177734375,
-0.046600341796875,
0.02044677734375,
-0.025848388671875,
-0.03643798828125,
-0.020843505859375,
-0.004116058349609375,
-0.01947021484375,
-0.001071929931640625,
0.0379638671875,
0.026947021484375,
-0.025177001953125,
-0.048004150390625,
-0.00514984130859375,
-0.004871368408203125,
0.003986358642578125,
0.017120361328125,
0.03424072265625,
-0.019561767578125,
-0.03192138671875,
-0.058502197265625,
0.0187225341796875,
0.07391357421875,
-0.004425048828125,
0.053802490234375,
0.04547119140625,
-0.0341796875,
0.01020050048828125,
-0.058837890625,
-0.0170745849609375,
-0.03167724609375,
-0.032623291015625,
-0.050537109375,
-0.0186614990234375,
0.0252532958984375,
0.00873565673828125,
-0.0160064697265625,
0.045867919921875,
0.049163818359375,
-0.0035076141357421875,
0.07373046875,
0.01947021484375,
0.03558349609375,
0.0280914306640625,
-0.055145263671875,
-0.0038928985595703125,
-0.0556640625,
-0.031829833984375,
-0.032501220703125,
-0.021759033203125,
-0.058624267578125,
-0.05780029296875,
0.00665283203125,
0.0161285400390625,
-0.0192413330078125,
0.0533447265625,
-0.023773193359375,
0.017181396484375,
0.038726806640625,
0.04315185546875,
0.0158843994140625,
-0.0003669261932373047,
-0.0009455680847167969,
-0.029998779296875,
-0.027435302734375,
-0.04595947265625,
0.054229736328125,
-0.0063629150390625,
0.03790283203125,
0.038299560546875,
0.036163330078125,
-0.0032806396484375,
0.0236053466796875,
-0.022613525390625,
0.0328369140625,
-0.00876617431640625,
-0.0628662109375,
0.031585693359375,
-0.0178070068359375,
-0.061920166015625,
0.037353515625,
-0.037200927734375,
-0.033111572265625,
0.04547119140625,
-0.018310546875,
-0.0224761962890625,
0.0195770263671875,
-0.061370849609375,
0.048126220703125,
-0.024505615234375,
-0.06500244140625,
0.00839996337890625,
-0.01274871826171875,
0.0396728515625,
0.030914306640625,
0.006153106689453125,
-0.01259613037109375,
-0.025909423828125,
0.0231475830078125,
-0.0307464599609375,
0.03704833984375,
-0.008575439453125,
0.01299285888671875,
0.0201568603515625,
0.026947021484375,
0.0188446044921875,
0.04730224609375,
-0.0152435302734375,
-0.0443115234375,
0.00197601318359375,
-0.049407958984375,
-0.05987548828125,
0.07342529296875,
-0.0390625,
-0.046356201171875,
-0.041748046875,
-0.0092620849609375,
0.0058746337890625,
0.0233306884765625,
0.042510986328125,
0.0584716796875,
-0.04248046875,
0.02935791015625,
0.048797607421875,
-0.005161285400390625,
0.00848388671875,
0.028472900390625,
-0.0156707763671875,
-0.030426025390625,
0.034576416015625,
-0.0023403167724609375,
0.04180908203125,
0.022735595703125,
0.0150909423828125,
0.005496978759765625,
-0.0151214599609375,
-0.031280517578125,
0.0438232421875,
-0.029510498046875,
-0.0049591064453125,
-0.052215576171875,
-0.006374359130859375,
-0.042510986328125,
-0.0018863677978515625,
-0.0308685302734375,
-0.0250244140625,
-0.053680419921875,
0.0174407958984375,
0.02734375,
0.0556640625,
0.0112762451171875,
-0.0055084228515625,
-0.031005859375,
0.0004911422729492188,
0.042236328125,
-0.004241943359375,
-0.0184326171875,
-0.02911376953125,
0.0214080810546875,
0.02752685546875,
-0.027923583984375,
-0.0653076171875,
0.04241943359375,
-0.00921630859375,
0.009063720703125,
0.0648193359375,
0.0137176513671875,
0.05108642578125,
-0.0224151611328125,
0.0450439453125,
0.039154052734375,
-0.0128021240234375,
0.0306854248046875,
-0.034576416015625,
0.0196990966796875,
0.0548095703125,
0.0264129638671875,
-0.0117645263671875,
-0.0205841064453125,
-0.07427978515625,
-0.058380126953125,
0.0150299072265625,
0.02911376953125,
0.0221405029296875,
-0.005191802978515625,
0.0218963623046875,
0.032501220703125,
0.0172882080078125,
-0.03277587890625,
-0.0291900634765625,
0.0017976760864257812,
0.01885986328125,
-0.0022487640380859375,
-0.0408935546875,
-0.00890350341796875,
-0.03192138671875,
0.05804443359375,
0.0015764236450195312,
0.0352783203125,
0.0034027099609375,
0.0118865966796875,
-0.0016870498657226562,
0.012420654296875,
0.0721435546875,
0.078369140625,
-0.053466796875,
-0.008544921875,
-0.01157379150390625,
-0.0186767578125,
0.0040130615234375,
-0.0163421630859375,
-0.02862548828125,
0.01552581787109375,
0.0149688720703125,
0.055206298828125,
0.05096435546875,
-0.04730224609375,
0.057373046875,
-0.04241943359375,
0.0106658935546875,
-0.06256103515625,
0.03277587890625,
0.037811279296875,
0.042236328125,
0.00616455078125,
0.0085601806640625,
0.032623291015625,
-0.06414794921875,
0.0147552490234375,
0.0390625,
-0.040252685546875,
-0.021514892578125,
0.064208984375,
-0.018096923828125,
-0.03302001953125,
0.022857666015625,
-0.014312744140625,
-0.0150604248046875,
0.054443359375,
0.049407958984375,
0.055206298828125,
-0.019683837890625,
0.043487548828125,
0.0241546630859375,
-0.0018606185913085938,
0.02001953125,
0.032440185546875,
-0.0038776397705078125,
-0.0266571044921875,
0.0360107421875,
-0.0177459716796875,
-0.02197265625,
-0.0006303787231445312,
-0.049041748046875,
0.040496826171875,
-0.050689697265625,
-0.003170013427734375,
-0.010284423828125,
0.008331298828125,
-0.061737060546875,
0.042999267578125,
-0.018951416015625,
0.08697509765625,
-0.063232421875,
0.0628662109375,
0.02056884765625,
-0.041015625,
-0.04766845703125,
-0.004398345947265625,
0.01464080810546875,
-0.0498046875,
0.02850341796875,
0.019805908203125,
-0.0030765533447265625,
-0.034912109375,
-0.0233917236328125,
-0.056396484375,
0.090576171875,
0.0216827392578125,
-0.02911376953125,
0.0154571533203125,
-0.0225372314453125,
0.03399658203125,
-0.055450439453125,
0.07330322265625,
0.040008544921875,
0.0276031494140625,
0.06854248046875,
-0.036712646484375,
-0.011138916015625,
-0.06500244140625,
0.0055694580078125,
-0.0051116943359375,
-0.08990478515625,
0.0675048828125,
-0.0185699462890625,
-0.029449462890625,
0.053466796875,
0.06756591796875,
0.018829345703125,
0.0297393798828125,
0.04803466796875,
0.056640625,
0.01309967041015625,
-0.0212860107421875,
0.09344482421875,
0.0149993896484375,
0.0005712509155273438,
0.07000732421875,
-0.0187835693359375,
0.053192138671875,
0.008148193359375,
-0.0241851806640625,
0.038848876953125,
0.0706787109375,
0.00449371337890625,
0.050140380859375,
0.0104522705078125,
-0.038116455078125,
-0.00745391845703125,
-0.01605224609375,
-0.048370361328125,
0.0222930908203125,
0.00250244140625,
-0.0250091552734375,
-0.006458282470703125,
0.0131378173828125,
0.01324462890625,
0.0014638900756835938,
0.01154327392578125,
0.046966552734375,
0.01328277587890625,
-0.0252685546875,
0.046966552734375,
-0.0199127197265625,
0.028472900390625,
-0.0301055908203125,
-0.00044536590576171875,
-0.016876220703125,
0.022613525390625,
-0.00998687744140625,
-0.0419921875,
-0.005443572998046875,
-0.01493072509765625,
-0.005779266357421875,
-0.0087890625,
0.0469970703125,
-0.01486968994140625,
-0.06219482421875,
0.0272674560546875,
0.0196075439453125,
0.029693603515625,
0.0227508544921875,
-0.0721435546875,
0.00801849365234375,
-0.00014209747314453125,
-0.029815673828125,
-0.002895355224609375,
0.036651611328125,
0.0221405029296875,
0.047027587890625,
0.0199432373046875,
0.0205535888671875,
0.012908935546875,
0.0045318603515625,
0.0543212890625,
-0.02581787109375,
-0.0152587890625,
-0.0265045166015625,
0.047149658203125,
-0.026611328125,
-0.043487548828125,
0.07073974609375,
0.06060791015625,
0.042236328125,
-0.020904541015625,
0.0218048095703125,
0.00521087646484375,
0.0199737548828125,
-0.045623779296875,
0.052642822265625,
-0.08648681640625,
-0.0087738037109375,
-0.0267333984375,
-0.0892333984375,
-0.006572723388671875,
0.034027099609375,
0.0192413330078125,
0.029144287109375,
0.03289794921875,
0.06475830078125,
-0.0235748291015625,
-0.002979278564453125,
0.01081085205078125,
0.0228424072265625,
0.037078857421875,
0.02374267578125,
0.04620361328125,
-0.047515869140625,
-0.0036029815673828125,
-0.045013427734375,
-0.0330810546875,
-0.049224853515625,
-0.049652099609375,
-0.06781005859375,
-0.060699462890625,
-0.0303955078125,
-0.0377197265625,
0.0016546249389648438,
0.0634765625,
0.056884765625,
-0.06219482421875,
-0.004619598388671875,
0.041046142578125,
-0.0089263916015625,
-0.01219940185546875,
-0.01715087890625,
0.007598876953125,
0.05023193359375,
-0.0718994140625,
0.038177490234375,
0.026611328125,
0.03277587890625,
-0.0172882080078125,
0.0263824462890625,
-0.0217742919921875,
0.0196533203125,
0.0125732421875,
0.01070404052734375,
-0.0419921875,
-0.00983428955078125,
-0.00510406494140625,
-0.0136566162109375,
0.01568603515625,
0.06756591796875,
-0.0250244140625,
0.031097412109375,
0.06121826171875,
0.00107574462890625,
0.057159423828125,
-0.0157623291015625,
0.02593994140625,
0.0084991455078125,
0.024078369140625,
0.0049896240234375,
0.054351806640625,
0.0265655517578125,
-0.039031982421875,
0.032867431640625,
0.022674560546875,
-0.0335693359375,
-0.0496826171875,
0.026275634765625,
-0.111083984375,
-0.026947021484375,
0.065185546875,
0.0044708251953125,
-0.061920166015625,
0.037750244140625,
-0.0211639404296875,
0.0165557861328125,
-0.0156402587890625,
0.039886474609375,
0.049468994140625,
0.01336669921875,
-0.02532958984375,
-0.068603515625,
0.004486083984375,
0.0153350830078125,
-0.048736572265625,
-0.02044677734375,
0.021759033203125,
0.041015625,
0.00623321533203125,
0.023101806640625,
-0.038330078125,
0.04888916015625,
-0.015655517578125,
0.0150299072265625,
-0.004608154296875,
-0.0235137939453125,
0.03662109375,
0.00548553466796875,
0.0026760101318359375,
-0.02923583984375
]
] |
bhadresh-savani/distilbert-base-uncased-emotion | 2023-03-22T08:44:05.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"distilbert",
"text-classification",
"emotion",
"en",
"dataset:emotion",
"arxiv:1910.01108",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | bhadresh-savani | null | null | bhadresh-savani/distilbert-base-uncased-emotion | 86 | 69,160 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
license: apache-2.0
tags:
- text-classification
- emotion
- pytorch
datasets:
- emotion
metrics:
- Accuracy, F1 Score
thumbnail: https://avatars3.githubusercontent.com/u/32437151?s=460&u=4ec59abc8d21d5feea3dab323d23a5860e6996a4&v=4
model-index:
- name: bhadresh-savani/distilbert-base-uncased-emotion
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: emotion
type: emotion
config: default
split: test
metrics:
- type: accuracy
value: 0.927
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzQxOGRmMjFlZThmZWViNjNmNGMzMTdjMGNjYjg1YWUzOTI0ZDlmYjRhYWMzMDA3Yjg2N2FiMTdmMzk0ZjJkOSIsInZlcnNpb24iOjF9.mOqr-hgNrnle7WCPy3Mo7M3fITFppn5gjpNagGMf_TZfB6VZnPKfZ51UkNFQlBtUlcm0U8vwPkF79snxwvCoDw
- type: precision
value: 0.8880230732280744
name: Precision Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYjZiN2NjNTkyN2M3ZWM2ZDZiNDk1OWZhN2FmNTAwZDIzMmQ3NTU2Yjk2MTgyNjJmMTNjYTYzOTc1NDdhYTljYSIsInZlcnNpb24iOjF9.0rWHmCZ2PyZ5zYkSeb_tFdQG9CHS5PdpOZ9kOfrIzEXyZ968daayaOJi2d6iO84fnauE5hZiIAUPsx24Vr4nBA
- type: precision
value: 0.927
name: Precision Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZmRhNWM1NDQ4ZjkyYjAxYjQ5MzQzMDA1ZDIzYWU3YTE4NTI2ZTMwYWI2ZWQ4NzQ3YzJkODYzMmZhZDI1NGRlNCIsInZlcnNpb24iOjF9.NlII1s42Mr_DMzPEoR0ntyh5cDW0405TxVkWhCgXLJTFAdnivH54-zZY4av1U5jHPTeXeWwZrrrbMwHCRBkoCw
- type: precision
value: 0.9272902840835793
name: Precision Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODhkNmM5NmYyMzA4MjkwOTllZDgyMDQ1NzZkN2QzOTAyOTMyNGFlZTU4NzM5NmM5NWQ1YmUxYmRmNjA5YjhhNCIsInZlcnNpb24iOjF9.oIn1KT-BOpFNLXiKL29frMvgHhWZMHWc9Q5WgeR7UaMEO7smkK8J3j5HAMy17Ktjv2dh783-f76N6gyJ_NewCg
- type: recall
value: 0.8790126653780703
name: Recall Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYjhlNzczNDY2NDVlM2UwMjAzOWQxYTAyNWZkNGZlYmNjODNiZTEzMTcxNTE3MTAxNjNkOTFiMmRiMzViMzJmZiIsInZlcnNpb24iOjF9.AXp7omMuUZFJ6mzAVTQPMke7QoUtoi4RJSSE7Xbnp2pNi7y-JtznKdm---l6RfqcHPlI0jWr7TVGoFsWZ64YAg
- type: recall
value: 0.927
name: Recall Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjEyYmZiZDQ4MzM1ZmQ2ZmJhZWU4OTVkNmViYjA5NzhiN2MxODE0MzUxZTliZTk0MzViZDAyNGU4MDFjYjM1MSIsInZlcnNpb24iOjF9.9lazxLXbPOdwhqoYtIudwRwjfNVZnUu7KvGRklRP_RAoQStAzgmWMIrT3ckX_d5_6bKZH9fIdujUn5Qz-baKBw
- type: recall
value: 0.927
name: Recall Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMWVhMzY0YTA4YmQzYTg4YTBiMzQ5YzRiZWJhMjM1NjUzZGQxZmQ5M2NkZDcyNTQ0ZmJjN2NkY2ZiYjg0OWI0ZCIsInZlcnNpb24iOjF9.QgTv726WCTyvrEct0NM8Zpc3vUnDbIwCor9EH941-zpJtuWr-xpdZzYZFJfILkVA0UUn1y6Jz_ABfkfBeyZTBg
- type: f1
value: 0.8825061528287809
name: F1 Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNzQzZTJkMDAwOTUwMzY3ZjI2MjIxYjlmZTg3YTdhNTc4ZjYyMmQ2NDQzM2FmYzk3OGEzNjhhMTk3NTQ3OTlhNyIsInZlcnNpb24iOjF9.hSln1KfKm0plK7Qao9vlubFtAl1M7_UYHNM6La9gEZlW_apnU1Mybz03GT2XZORgOVPe9JmgygvZByxQhpsYBw
- type: f1
value: 0.927
name: F1 Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNzljODQ3NjE3MDRkODE3ZjFlZmY5MjYyOGJlNDQ4YzdlZGRiMTI5OGZiZWM2ODkyZjMyZWQ3MTkzYWU5YThkOCIsInZlcnNpb24iOjF9.7qfBw39fv22jSIJoY71DkOVr9eBB-srhqSi09bCcUC7Huok4O2Z_vB7gO_Rahh9sFgKVu1ZATusjTmOLQr0fBw
- type: f1
value: 0.926876082854655
name: F1 Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjJhN2UzODgxOWQ0Y2E3YTcwZTQxMDE0ZWRmYThjOWVhYWQ1YjBhMzk0YWUxNzE2ZjFhNWM5ZmE2ZmI1YTczYSIsInZlcnNpb24iOjF9.nZW0dBdLmh_FgNw6GaITvSJFX-2C_Iku3NanU8Rip7FSiRHozKPAjothdQh9MWQnq158ZZGPPVIjtyIvuTSqCw
- type: loss
value: 0.17403268814086914
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTVjZmFiOGQwZGY1OTU5YWFkNGZjMTlhOGI4NjE3MGI4ZDhkODcxYmJiYTQ3NWNmMWM0ODUyZDI1MThkYTY3ZSIsInZlcnNpb24iOjF9.OYz5BI3Lz8LgjAqVnD6NcrG3UAG0D3wjKJ7G5298RRGaNpb621ycisG_7UYiWixY7e2RJafkfRiplmkdczIFDQ
---
# Distilbert-base-uncased-emotion
## Model description:
[Distilbert](https://arxiv.org/abs/1910.01108) is trained with knowledge distillation during the pre-training phase, which reduces the size of a BERT model by 40% while retaining 97% of its language-understanding capability. It is smaller and faster than BERT and other BERT-based models.
[Distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) finetuned on the emotion dataset using the HuggingFace Trainer with the following hyperparameters:
```
learning rate 2e-5,
batch size 64,
num_train_epochs=8,
```
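The card states only these three hyperparameters, so a sketch of the corresponding Trainer setup necessarily fills in the rest with assumptions (output directory, dataset wiring, and all other `TrainingArguments` defaults are hypothetical):

```python
# Hyperparameters reported in the card; everything beyond these three values
# is an assumption, since the full Trainer configuration is not published.
training_kwargs = dict(
    learning_rate=2e-5,              # "learning rate 2e-5"
    per_device_train_batch_size=64,  # "batch size 64"
    num_train_epochs=8,              # "num_train_epochs=8"
)

# With transformers installed, these would feed into TrainingArguments, e.g.:
# from transformers import TrainingArguments, Trainer
# args = TrainingArguments(output_dir="distilbert-emotion", **training_kwargs)
# Trainer(model=model, args=args, train_dataset=ds["train"]).train()
```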
## Model Performance Comparison on the Emotion Dataset from Twitter:
| Model | Accuracy | F1 Score | Test Samples per Second |
| --- | --- | --- | --- |
| [Distilbert-base-uncased-emotion](https://huggingface.co/bhadresh-savani/distilbert-base-uncased-emotion) | 93.8 | 93.79 | 398.69 |
| [Bert-base-uncased-emotion](https://huggingface.co/bhadresh-savani/bert-base-uncased-emotion) | 94.05 | 94.06 | 190.152 |
| [Roberta-base-emotion](https://huggingface.co/bhadresh-savani/roberta-base-emotion) | 93.95 | 93.97| 195.639 |
| [Albert-base-v2-emotion](https://huggingface.co/bhadresh-savani/albert-base-v2-emotion) | 93.6 | 93.65 | 182.794 |
## How to Use the model:
```python
from transformers import pipeline
classifier = pipeline("text-classification",model='bhadresh-savani/distilbert-base-uncased-emotion', return_all_scores=True)
prediction = classifier("I love using transformers. The best part is wide range of support and its easy to use", )
print(prediction)
"""
Output:
[[
{'label': 'sadness', 'score': 0.0006792712374590337},
{'label': 'joy', 'score': 0.9959300756454468},
{'label': 'love', 'score': 0.0009452480007894337},
{'label': 'anger', 'score': 0.0018055217806249857},
{'label': 'fear', 'score': 0.00041110432357527316},
{'label': 'surprise', 'score': 0.0002288572577526793}
]]
"""
```
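The pipeline returns one list of label/score dicts per input text. A small helper (hypothetical, not part of the model card) can pick the top emotion from such output:

```python
def top_emotion(scores):
    """Return the (label, score) pair with the highest score."""
    best = max(scores, key=lambda d: d["score"])
    return best["label"], best["score"]

# Using the example output shown above:
example = [
    {"label": "sadness", "score": 0.0006792712374590337},
    {"label": "joy", "score": 0.9959300756454468},
    {"label": "love", "score": 0.0009452480007894337},
    {"label": "anger", "score": 0.0018055217806249857},
    {"label": "fear", "score": 0.00041110432357527316},
    {"label": "surprise", "score": 0.0002288572577526793},
]
label, score = top_emotion(example)
print(label)  # joy
```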
## Dataset:
[Twitter-Sentiment-Analysis](https://huggingface.co/nlp/viewer/?dataset=emotion).
## Training procedure
[Colab Notebook](https://github.com/bhadreshpsavani/ExploringSentimentalAnalysis/blob/main/SentimentalAnalysisWithDistilbert.ipynb)
## Eval results
```json
{
'test_accuracy': 0.938,
'test_f1': 0.937932884041714,
'test_loss': 0.1472451239824295,
'test_mem_cpu_alloc_delta': 0,
'test_mem_cpu_peaked_delta': 0,
'test_mem_gpu_alloc_delta': 0,
'test_mem_gpu_peaked_delta': 163454464,
'test_runtime': 5.0164,
'test_samples_per_second': 398.69
}
```
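As a quick sanity check on these numbers (this check is mine, not from the card), the reported throughput times the runtime recovers the size of the emotion test split:

```python
# test_samples_per_second * test_runtime should give the number of
# test examples (the emotion test split has 2000 examples).
test_runtime = 5.0164
test_samples_per_second = 398.69
n_samples = round(test_runtime * test_samples_per_second)
print(n_samples)  # 2000
```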
## References:
* [Natural Language Processing with Transformers by Lewis Tunstall, Leandro von Werra, and Thomas Wolf](https://learning.oreilly.com/library/view/natural-language-processing/9781098103231/) | 7,010 | [
[
-0.032928466796875,
-0.04730224609375,
0.01535797119140625,
0.03564453125,
-0.0216064453125,
0.00907135009765625,
-0.0262451171875,
-0.018707275390625,
0.0214691162109375,
-0.0050048828125,
-0.052947998046875,
-0.046875,
-0.061126708984375,
-0.00528717041015625,
-0.005886077880859375,
0.08489990234375,
0.0033550262451171875,
0.00547027587890625,
0.0008177757263183594,
-0.00836944580078125,
-0.01030731201171875,
-0.04547119140625,
-0.039947509765625,
-0.037017822265625,
0.0213470458984375,
0.0124359130859375,
0.0386962890625,
0.01006317138671875,
0.040740966796875,
0.03009033203125,
-0.02703857421875,
-0.0165863037109375,
-0.040069580078125,
0.0007405281066894531,
0.0265350341796875,
-0.033447265625,
-0.03887939453125,
0.00522613525390625,
0.02801513671875,
0.039642333984375,
-0.003833770751953125,
0.0245513916015625,
0.01244354248046875,
0.07244873046875,
-0.0296173095703125,
0.040740966796875,
-0.0413818359375,
0.0086212158203125,
-0.00803375244140625,
0.0080108642578125,
-0.031494140625,
-0.025177001953125,
0.0291748046875,
-0.021484375,
0.0231781005859375,
0.007015228271484375,
0.08868408203125,
0.028717041015625,
-0.01332855224609375,
-0.0159759521484375,
-0.02935791015625,
0.07537841796875,
-0.06243896484375,
0.0162506103515625,
0.01605224609375,
0.00847625732421875,
0.0033664703369140625,
-0.05499267578125,
-0.054656982421875,
-0.003650665283203125,
-0.02020263671875,
0.0292205810546875,
-0.03790283203125,
-0.0014553070068359375,
0.0290374755859375,
0.04058837890625,
-0.03387451171875,
-0.0162506103515625,
-0.019805908203125,
-0.00626373291015625,
0.04974365234375,
0.0088043212890625,
0.008514404296875,
-0.03216552734375,
-0.0257415771484375,
-0.0297088623046875,
0.005489349365234375,
0.030303955078125,
0.0192413330078125,
0.02215576171875,
-0.0289154052734375,
0.033721923828125,
-0.0069732666015625,
0.02838134765625,
0.0288238525390625,
-0.01282501220703125,
0.0697021484375,
0.006717681884765625,
-0.02752685546875,
0.0132904052734375,
0.0858154296875,
0.0305938720703125,
0.021392822265625,
0.0096282958984375,
-0.006092071533203125,
0.01336669921875,
-0.0008974075317382812,
-0.065185546875,
-0.032867431640625,
0.027496337890625,
-0.037353515625,
-0.02728271484375,
0.0017986297607421875,
-0.066162109375,
-0.0079193115234375,
-0.0163116455078125,
0.0255889892578125,
-0.05517578125,
-0.02703857421875,
0.01282501220703125,
-0.0139923095703125,
-0.0005478858947753906,
0.0009260177612304688,
-0.0682373046875,
0.007259368896484375,
0.02703857421875,
0.058319091796875,
0.005107879638671875,
-0.0174102783203125,
-0.0007948875427246094,
-0.03997802734375,
-0.0024566650390625,
0.034332275390625,
-0.0063018798828125,
-0.0225677490234375,
-0.0135040283203125,
0.00101470947265625,
-0.005237579345703125,
-0.01549530029296875,
0.055267333984375,
-0.0196075439453125,
0.025482177734375,
-0.01230621337890625,
-0.03302001953125,
-0.02154541015625,
0.0191192626953125,
-0.047607421875,
0.10284423828125,
0.0230712890625,
-0.0810546875,
0.0220184326171875,
-0.047454833984375,
-0.0201416015625,
-0.0251007080078125,
0.0233001708984375,
-0.044342041015625,
0.00957489013671875,
0.0196075439453125,
0.047607421875,
-0.01508331298828125,
0.0208282470703125,
-0.025543212890625,
-0.019805908203125,
0.0213623046875,
-0.032867431640625,
0.08428955078125,
0.01247406005859375,
-0.042449951171875,
-0.004932403564453125,
-0.06976318359375,
0.00909423828125,
0.0225067138671875,
-0.0168914794921875,
-0.0172271728515625,
-0.0217437744140625,
0.01192474365234375,
0.01190185546875,
0.0252838134765625,
-0.043701171875,
0.00954437255859375,
-0.042633056640625,
0.01415252685546875,
0.052520751953125,
-0.0156402587890625,
0.026763916015625,
-0.00980377197265625,
0.0257720947265625,
0.0162353515625,
0.016265869140625,
0.0129852294921875,
-0.03729248046875,
-0.07159423828125,
-0.032867431640625,
0.02117919921875,
0.040252685546875,
-0.03411865234375,
0.06756591796875,
-0.021026611328125,
-0.065185546875,
-0.06048583984375,
-0.00496673583984375,
0.0234375,
0.06292724609375,
0.041900634765625,
-0.006916046142578125,
-0.063720703125,
-0.0572509765625,
0.00016558170318603516,
-0.0239715576171875,
0.0025806427001953125,
0.00020372867584228516,
0.033050537109375,
-0.035186767578125,
0.07684326171875,
-0.0509033203125,
-0.014373779296875,
-0.020751953125,
0.033782958984375,
0.05517578125,
0.0251312255859375,
0.05194091796875,
-0.048980712890625,
-0.061767578125,
-0.0256500244140625,
-0.06427001953125,
-0.005916595458984375,
0.01007080078125,
-0.01332855224609375,
0.0297698974609375,
-0.0013818740844726562,
-0.05401611328125,
0.032623291015625,
0.03692626953125,
-0.027130126953125,
0.03900146484375,
-0.005931854248046875,
0.006198883056640625,
-0.0849609375,
0.0032939910888671875,
0.015106201171875,
0.00396728515625,
-0.05010986328125,
-0.03277587890625,
-0.0009074211120605469,
0.020294189453125,
-0.039215087890625,
0.0343017578125,
-0.0301971435546875,
0.019866943359375,
-0.0006093978881835938,
0.0007867813110351562,
0.00646209716796875,
0.059051513671875,
0.007160186767578125,
0.0252838134765625,
0.053802490234375,
-0.0223541259765625,
0.03997802734375,
0.038665771484375,
-0.027801513671875,
0.0278472900390625,
-0.04693603515625,
0.0028285980224609375,
-0.0217437744140625,
0.01503753662109375,
-0.082275390625,
-0.0022983551025390625,
0.01226043701171875,
-0.05426025390625,
0.0273284912109375,
-0.00496673583984375,
-0.03314208984375,
-0.0433349609375,
-0.037322998046875,
0.005680084228515625,
0.06695556640625,
-0.0377197265625,
0.047760009765625,
0.01183319091796875,
-0.0017805099487304688,
-0.06500244140625,
-0.0643310546875,
-0.022430419921875,
-0.0209503173828125,
-0.047943115234375,
0.0207977294921875,
-0.019256591796875,
-0.01331329345703125,
-0.0009284019470214844,
-0.0041046142578125,
-0.0053558349609375,
-0.001190185546875,
0.033294677734375,
0.0379638671875,
-0.0012006759643554688,
0.0104522705078125,
-0.0010986328125,
-0.014373779296875,
0.016571044921875,
0.0183258056640625,
0.0572509765625,
-0.04022216796875,
0.003025054931640625,
-0.039642333984375,
-0.006549835205078125,
0.036834716796875,
0.00823211669921875,
0.0679931640625,
0.0718994140625,
-0.026580810546875,
-0.002162933349609375,
-0.030364990234375,
-0.00482177734375,
-0.036102294921875,
0.0257415771484375,
-0.0244140625,
-0.050201416015625,
0.043914794921875,
0.01149749755859375,
-0.007671356201171875,
0.064453125,
0.047943115234375,
-0.017852783203125,
0.07830810546875,
0.0303802490234375,
-0.03131103515625,
0.030364990234375,
-0.05157470703125,
0.0181427001953125,
-0.061004638671875,
-0.029144287109375,
-0.0214080810546875,
-0.038818359375,
-0.04656982421875,
-0.01175689697265625,
0.0175933837890625,
0.02288818359375,
-0.04010009765625,
0.021881103515625,
-0.056976318359375,
0.01081085205078125,
0.044952392578125,
0.01186370849609375,
0.00305938720703125,
0.0005984306335449219,
-0.01434326171875,
-0.0232696533203125,
-0.03179931640625,
-0.025054931640625,
0.0758056640625,
0.04534912109375,
0.06060791015625,
-0.00223541259765625,
0.053070068359375,
0.01392364501953125,
0.0252838134765625,
-0.061676025390625,
0.033416748046875,
-0.007717132568359375,
-0.04449462890625,
-0.0173492431640625,
-0.03790283203125,
-0.0555419921875,
0.006229400634765625,
-0.026763916015625,
-0.062255859375,
0.0016584396362304688,
0.007205963134765625,
-0.0204925537109375,
0.024627685546875,
-0.0645751953125,
0.061614990234375,
-0.0292205810546875,
-0.01403045654296875,
0.0182647705078125,
-0.06085205078125,
0.018310546875,
-0.0029582977294921875,
-0.001983642578125,
-0.0219879150390625,
0.0294189453125,
0.056884765625,
-0.02587890625,
0.06646728515625,
-0.0260009765625,
0.01505279541015625,
0.03179931640625,
-0.007450103759765625,
0.029144287109375,
0.00391387939453125,
-0.0157012939453125,
0.033538818359375,
-0.0081024169921875,
-0.0237579345703125,
-0.032470703125,
0.046630859375,
-0.07275390625,
-0.01534271240234375,
-0.0587158203125,
-0.035186767578125,
-0.018707275390625,
0.010101318359375,
0.043792724609375,
0.01324462890625,
-0.0077667236328125,
0.0236663818359375,
0.04998779296875,
-0.0111236572265625,
0.043701171875,
0.017791748046875,
-0.007152557373046875,
-0.0335693359375,
0.045501708984375,
-0.0117034912109375,
-0.0026988983154296875,
0.0293121337890625,
0.025482177734375,
-0.048309326171875,
-0.013763427734375,
-0.00980377197265625,
0.021881103515625,
-0.043853759765625,
-0.0269012451171875,
-0.0582275390625,
-0.02740478515625,
-0.045806884765625,
-0.0038242340087890625,
-0.036773681640625,
-0.03179931640625,
-0.0379638671875,
-0.021881103515625,
0.05816650390625,
0.028778076171875,
-0.00689697265625,
0.029571533203125,
-0.06146240234375,
0.01494598388671875,
0.01268768310546875,
0.03448486328125,
0.0023784637451171875,
-0.049407958984375,
-0.0081634521484375,
0.018402099609375,
-0.029510498046875,
-0.0577392578125,
0.052337646484375,
0.0107879638671875,
0.0251922607421875,
0.0304107666015625,
0.0151214599609375,
0.0567626953125,
-0.02203369140625,
0.06256103515625,
0.0443115234375,
-0.08514404296875,
0.047027587890625,
-0.0079498291015625,
0.01751708984375,
0.043609619140625,
0.043975830078125,
-0.034576416015625,
-0.01324462890625,
-0.059967041015625,
-0.0767822265625,
0.0704345703125,
0.0278472900390625,
0.0023517608642578125,
0.00615692138671875,
0.0027484893798828125,
0.000017344951629638672,
0.02374267578125,
-0.06463623046875,
-0.0521240234375,
-0.034332275390625,
-0.05035400390625,
-0.0176239013671875,
-0.0249786376953125,
0.002960205078125,
-0.046722412109375,
0.061431884765625,
0.006420135498046875,
0.036529541015625,
0.0211639404296875,
0.00035309791564941406,
-0.00566864013671875,
0.005802154541015625,
0.01806640625,
0.01316070556640625,
-0.057373046875,
-0.006656646728515625,
0.0196380615234375,
-0.047515869140625,
0.00665283203125,
0.018585205078125,
0.010772705078125,
0.0186767578125,
0.037017822265625,
0.083984375,
-0.0029697418212890625,
-0.036895751953125,
0.0382080078125,
-0.0131378173828125,
-0.0305023193359375,
-0.04095458984375,
-0.0064544677734375,
0.01155853271484375,
0.0248565673828125,
0.0276336669921875,
0.01203155517578125,
0.005336761474609375,
-0.038177490234375,
0.0137481689453125,
0.00968170166015625,
-0.04559326171875,
-0.031524658203125,
0.046234130859375,
0.0004146099090576172,
-0.007904052734375,
0.054046630859375,
-0.016632080078125,
-0.05657958984375,
0.042327880859375,
0.0186767578125,
0.07611083984375,
-0.0017108917236328125,
0.01535797119140625,
0.04254150390625,
0.0087432861328125,
-0.01155853271484375,
0.0296630859375,
0.0132293701171875,
-0.0540771484375,
-0.01531219482421875,
-0.06817626953125,
-0.01080322265625,
0.00942230224609375,
-0.05828857421875,
0.01332855224609375,
-0.0318603515625,
-0.034454345703125,
0.00551605224609375,
0.01462554931640625,
-0.057708740234375,
0.0297393798828125,
0.00896453857421875,
0.0709228515625,
-0.06243896484375,
0.059600830078125,
0.045318603515625,
-0.037322998046875,
-0.08001708984375,
-0.00907135009765625,
-0.007335662841796875,
-0.046295166015625,
0.06219482421875,
0.01800537109375,
0.0066986083984375,
0.001956939697265625,
-0.0343017578125,
-0.049407958984375,
0.08941650390625,
0.0204925537109375,
-0.041534423828125,
0.007350921630859375,
0.0123291015625,
0.07000732421875,
-0.015655517578125,
0.043487548828125,
0.050201416015625,
0.032623291015625,
0.01319122314453125,
-0.052276611328125,
-0.006275177001953125,
-0.038238525390625,
-0.00997161865234375,
0.016845703125,
-0.061614990234375,
0.08258056640625,
0.0004906654357910156,
0.0016641616821289062,
-0.0156402587890625,
0.049713134765625,
0.02362060546875,
0.0307464599609375,
0.048187255859375,
0.06658935546875,
0.049713134765625,
-0.0345458984375,
0.059783935546875,
-0.0205078125,
0.07135009765625,
0.0615234375,
-0.007602691650390625,
0.05242919921875,
0.0310211181640625,
-0.0323486328125,
0.055938720703125,
0.050750732421875,
-0.006809234619140625,
0.048614501953125,
0.0107421875,
-0.011322021484375,
0.0010232925415039062,
0.01529693603515625,
-0.027496337890625,
0.034088134765625,
0.01043701171875,
-0.02740478515625,
0.004695892333984375,
-0.0018711090087890625,
0.0191802978515625,
-0.00670623779296875,
-0.0015382766723632812,
0.0391845703125,
0.0006546974182128906,
-0.033843994140625,
0.059814453125,
-0.018524169921875,
0.0711669921875,
-0.045196533203125,
0.0089263916015625,
-0.01412200927734375,
0.024810791015625,
-0.020111083984375,
-0.06475830078125,
0.0163116455078125,
0.0144195556640625,
-0.0189666748046875,
-0.016754150390625,
0.042877197265625,
-0.0300140380859375,
-0.05096435546875,
0.0360107421875,
0.01885986328125,
0.00428009033203125,
-0.01192474365234375,
-0.0728759765625,
0.00568389892578125,
0.0167694091796875,
-0.052886962890625,
-0.00007468461990356445,
0.043182373046875,
0.03302001953125,
0.042633056640625,
0.043914794921875,
0.00276947021484375,
0.005054473876953125,
0.0035915374755859375,
0.06976318359375,
-0.053680419921875,
-0.0200042724609375,
-0.0716552734375,
0.0687255859375,
-0.0195465087890625,
-0.03216552734375,
0.052276611328125,
0.04205322265625,
0.0482177734375,
-0.0148468017578125,
0.05816650390625,
-0.0301971435546875,
0.041595458984375,
-0.0247802734375,
0.05511474609375,
-0.043975830078125,
-0.0009517669677734375,
-0.03515625,
-0.0635986328125,
-0.0131683349609375,
0.05157470703125,
-0.0144195556640625,
0.01189422607421875,
0.05792236328125,
0.038818359375,
0.003971099853515625,
-0.004970550537109375,
0.0003371238708496094,
0.031463623046875,
0.01165008544921875,
0.048980712890625,
0.041015625,
-0.05401611328125,
0.0382080078125,
-0.059600830078125,
-0.026336669921875,
-0.0175323486328125,
-0.06463623046875,
-0.08441162109375,
-0.0458984375,
-0.018218994140625,
-0.04052734375,
-0.02740478515625,
0.0697021484375,
0.040985107421875,
-0.06463623046875,
-0.0034580230712890625,
-0.0009751319885253906,
-0.009735107421875,
-0.0145416259765625,
-0.0230712890625,
0.039093017578125,
-0.01366424560546875,
-0.07269287109375,
-0.0017290115356445312,
-0.005062103271484375,
0.012664794921875,
0.000629425048828125,
-0.012908935546875,
-0.0186614990234375,
-0.00893402099609375,
0.04510498046875,
-0.005584716796875,
-0.040740966796875,
-0.01287078857421875,
0.01096343994140625,
-0.01178741455078125,
0.0137939453125,
0.0145721435546875,
-0.03900146484375,
0.032196044921875,
0.051361083984375,
0.0233612060546875,
0.05035400390625,
0.0009531974792480469,
0.00568389892578125,
-0.067138671875,
0.0261383056640625,
0.022369384765625,
0.041473388671875,
0.0233612060546875,
-0.03155517578125,
0.04534912109375,
0.02874755859375,
-0.039154052734375,
-0.053924560546875,
-0.016998291015625,
-0.103515625,
-0.0040740966796875,
0.0716552734375,
-0.01047515869140625,
-0.0297088623046875,
0.0267333984375,
-0.02716064453125,
0.044677734375,
-0.053009033203125,
0.0703125,
0.05975341796875,
-0.0178375244140625,
-0.004638671875,
-0.019866943359375,
0.033447265625,
0.02996826171875,
-0.0439453125,
-0.0099334716796875,
0.0165252685546875,
0.018707275390625,
0.0300445556640625,
0.043243408203125,
0.00249481201171875,
-0.0082550048828125,
0.00623321533203125,
0.03619384765625,
0.000823974609375,
-0.001628875732421875,
-0.0202789306640625,
-0.0054931640625,
-0.01495361328125,
-0.0245208740234375
]
] |
kha-white/manga-ocr-base | 2022-06-22T15:34:05.000Z | [
"transformers",
"pytorch",
"vision-encoder-decoder",
"image-to-text",
"ja",
"dataset:manga109s",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | image-to-text | kha-white | null | null | kha-white/manga-ocr-base | 55 | 68,470 | transformers | 2022-03-02T23:29:05 | ---
language: ja
tags:
- image-to-text
license: apache-2.0
datasets:
- manga109s
---
# Manga OCR
Optical character recognition for Japanese text, with the main focus being Japanese manga.
It uses the [Vision Encoder Decoder](https://huggingface.co/docs/transformers/model_doc/vision-encoder-decoder) framework.
Manga OCR can be used as a general-purpose printed Japanese OCR, but its main goal is to provide high-quality
text recognition that is robust to scenarios specific to manga:
- both vertical and horizontal text
- text with furigana
- text overlaid on images
- a wide variety of fonts and font styles
- low quality images
Code is available [here](https://github.com/kha-white/manga_ocr).
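A minimal inference sketch using the generic `transformers` Vision Encoder Decoder API (assumptions: the `transformers` and `Pillow` packages are installed, the input filename is hypothetical, and the space-stripping helper and generation settings are illustrative, not the exact post-processing used by the manga-ocr package):

```python
def clean_ocr_text(text: str) -> str:
    # Japanese text is written without spaces; decoding can introduce them,
    # so strip ASCII and ideographic spaces from the raw output (assumption).
    return text.replace(" ", "").replace("\u3000", "")

if __name__ == "__main__":
    from PIL import Image
    from transformers import AutoImageProcessor, AutoTokenizer, VisionEncoderDecoderModel

    processor = AutoImageProcessor.from_pretrained("kha-white/manga-ocr-base")
    tokenizer = AutoTokenizer.from_pretrained("kha-white/manga-ocr-base")
    model = VisionEncoderDecoderModel.from_pretrained("kha-white/manga-ocr-base")

    image = Image.open("panel.png").convert("RGB")  # hypothetical input image
    pixel_values = processor(images=image, return_tensors="pt").pixel_values
    generated_ids = model.generate(pixel_values, max_length=64)
    text = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
    print(clean_ocr_text(text))
```

The model load is kept under the `__main__` guard so the helper can be reused without downloading weights.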
| 705 | [
[
-0.029510498046875,
-0.024566650390625,
0.041595458984375,
-0.0018148422241210938,
-0.04339599609375,
-0.0051116943359375,
0.01398468017578125,
-0.048858642578125,
0.0208282470703125,
0.057098388671875,
-0.0246124267578125,
-0.059112548828125,
-0.0286102294921875,
0.042877197265625,
-0.030029296875,
0.052093505859375,
-0.00754547119140625,
0.00293731689453125,
0.00843048095703125,
-0.0180511474609375,
-0.030548095703125,
-0.037628173828125,
-0.058837890625,
-0.017974853515625,
0.04071044921875,
0.03369140625,
0.049468994140625,
0.043212890625,
0.071044921875,
0.01983642578125,
0.0025634765625,
0.01123809814453125,
-0.014068603515625,
-0.00817108154296875,
0.002056121826171875,
-0.059173583984375,
-0.011749267578125,
-0.014556884765625,
0.04486083984375,
0.0002925395965576172,
0.01446533203125,
-0.026519775390625,
-0.00949859619140625,
0.0576171875,
-0.06884765625,
0.0144195556640625,
-0.005367279052734375,
0.057769775390625,
-0.01059722900390625,
-0.0272674560546875,
-0.032073974609375,
-0.0182952880859375,
-0.0211029052734375,
-0.0491943359375,
0.0077667236328125,
-0.01227569580078125,
0.0670166015625,
0.0097503662109375,
-0.03131103515625,
-0.0565185546875,
-0.0718994140625,
0.02752685546875,
-0.0035858154296875,
0.037872314453125,
0.057220458984375,
0.0316162109375,
0.02947998046875,
-0.06524658203125,
-0.041412353515625,
-0.0028171539306640625,
-0.02606201171875,
0.03887939453125,
0.0084228515625,
-0.00043702125549316406,
0.036773681640625,
0.05584716796875,
-0.02392578125,
0.0118865966796875,
-0.032623291015625,
-0.0186004638671875,
0.038421630859375,
0.027740478515625,
0.070556640625,
-0.00714111328125,
-0.0197296142578125,
-0.01410675048828125,
-0.022308349609375,
0.0035457611083984375,
0.01325225830078125,
-0.00969696044921875,
0.00104522705078125,
0.059234619140625,
0.0029048919677734375,
0.04461669921875,
-0.00820159912109375,
-0.029754638671875,
0.0208282470703125,
-0.02691650390625,
-0.00228118896484375,
0.022064208984375,
0.06561279296875,
0.046356201171875,
0.0253448486328125,
-0.0171966552734375,
-0.0160369873046875,
0.0192108154296875,
0.03338623046875,
-0.0706787109375,
0.005748748779296875,
-0.0171661376953125,
-0.055816650390625,
-0.02093505859375,
0.01375579833984375,
-0.08477783203125,
-0.0487060546875,
-0.00865936279296875,
0.0194549560546875,
-0.039306640625,
-0.003826141357421875,
0.0261383056640625,
-0.05047607421875,
0.0270233154296875,
0.0205535888671875,
-0.07403564453125,
0.004154205322265625,
0.0107269287109375,
0.0789794921875,
-0.0006575584411621094,
-0.006427764892578125,
0.002780914306640625,
0.005634307861328125,
-0.0242156982421875,
0.05670166015625,
-0.026031494140625,
-0.040283203125,
0.00690460205078125,
0.01122283935546875,
0.00873565673828125,
-0.01142120361328125,
0.05816650390625,
-0.037933349609375,
0.00885772705078125,
-0.005794525146484375,
-0.00948333740234375,
-0.028167724609375,
0.0040740966796875,
-0.081298828125,
0.0650634765625,
-0.0017604827880859375,
-0.0594482421875,
0.03692626953125,
-0.039276123046875,
-0.04730224609375,
0.006816864013671875,
-0.0125885009765625,
-0.039276123046875,
0.007396697998046875,
0.021270751953125,
0.0162506103515625,
-0.017608642578125,
-0.04571533203125,
0.007152557373046875,
-0.05206298828125,
0.019500732421875,
-0.030731201171875,
0.045318603515625,
0.0289459228515625,
-0.0119781494140625,
-0.021514892578125,
-0.081298828125,
-0.02203369140625,
0.030487060546875,
-0.029144287109375,
-0.045745849609375,
0.0160369873046875,
0.01023101806640625,
-0.0087127685546875,
0.026824951171875,
-0.04681396484375,
0.022430419921875,
-0.01406097412109375,
0.053009033203125,
0.01175689697265625,
-0.0045623779296875,
0.0311431884765625,
-0.001003265380859375,
0.0244598388671875,
-0.01270294189453125,
0.017669677734375,
-0.0288543701171875,
-0.05340576171875,
-0.050262451171875,
-0.033782958984375,
0.019622802734375,
0.06817626953125,
-0.06402587890625,
0.032684326171875,
0.00681304931640625,
-0.049957275390625,
-0.024566650390625,
-0.01483154296875,
0.0218963623046875,
0.025665283203125,
0.01477813720703125,
-0.044342041015625,
-0.034637451171875,
-0.034881591796875,
0.01221466064453125,
-0.0107574462890625,
0.00753021240234375,
-0.00986480712890625,
0.037994384765625,
-0.026580810546875,
0.04705810546875,
-0.05548095703125,
-0.05035400390625,
-0.00902557373046875,
-0.0028324127197265625,
0.0243377685546875,
0.02703857421875,
0.0367431640625,
-0.0926513671875,
-0.06597900390625,
0.038909912109375,
-0.0550537109375,
0.0003421306610107422,
0.00638580322265625,
-0.0210418701171875,
0.01352691650390625,
0.060272216796875,
-0.03314208984375,
0.0667724609375,
0.0303192138671875,
-0.03131103515625,
0.0312347412109375,
-0.0207061767578125,
0.043853759765625,
-0.08331298828125,
0.024017333984375,
0.0025882720947265625,
-0.02484130859375,
-0.044342041015625,
0.029388427734375,
0.015411376953125,
-0.034332275390625,
-0.001987457275390625,
0.0125732421875,
-0.0291290283203125,
-0.0153656005859375,
-0.013275146484375,
-0.00513458251953125,
0.015411376953125,
0.03680419921875,
0.045013427734375,
0.067626953125,
0.00429534912109375,
-0.030487060546875,
0.0174713134765625,
0.0262603759765625,
-0.04241943359375,
0.06561279296875,
-0.060089111328125,
0.0188140869140625,
-0.02947998046875,
-0.0120086669921875,
-0.08489990234375,
-0.025299072265625,
0.057769775390625,
-0.019439697265625,
0.0241546630859375,
0.0127716064453125,
-0.062042236328125,
-0.035888671875,
-0.01367950439453125,
0.0292510986328125,
0.0460205078125,
-0.0313720703125,
0.056488037109375,
0.024658203125,
0.018157958984375,
-0.029754638671875,
-0.08221435546875,
-0.00867462158203125,
0.00461578369140625,
-0.004238128662109375,
0.0242462158203125,
0.007152557373046875,
0.0311431884765625,
0.004535675048828125,
0.016632080078125,
-0.0341796875,
-0.0075836181640625,
0.01580810546875,
0.00478363037109375,
-0.0298309326171875,
-0.00650787353515625,
0.008209228515625,
-0.02197265625,
-0.03350830078125,
0.01548004150390625,
0.05340576171875,
0.009368896484375,
-0.0311431884765625,
-0.05474853515625,
0.03302001953125,
0.07232666015625,
-0.0167236328125,
0.038818359375,
0.04962158203125,
-0.04278564453125,
0.01360321044921875,
-0.0216522216796875,
0.0095977783203125,
-0.03460693359375,
0.041717529296875,
-0.0458984375,
-0.043670654296875,
0.04864501953125,
0.0170745849609375,
-0.0167999267578125,
0.0469970703125,
0.025665283203125,
-0.015411376953125,
0.08935546875,
0.05474853515625,
-0.01043701171875,
0.042205810546875,
0.003818511962890625,
0.01226043701171875,
-0.06646728515625,
-0.03289794921875,
-0.0621337890625,
-0.0146636962890625,
-0.0135345458984375,
-0.0072784423828125,
0.005359649658203125,
0.0265045166015625,
0.0005888938903808594,
0.052459716796875,
-0.07000732421875,
0.061553955078125,
0.043426513671875,
0.030029296875,
0.0540771484375,
0.0300140380859375,
-0.0017480850219726562,
-0.0203704833984375,
-0.021270751953125,
-0.033203125,
0.056732177734375,
0.0328369140625,
0.04766845703125,
-0.01043701171875,
0.037689208984375,
0.030487060546875,
0.01163482666015625,
-0.08013916015625,
0.038665771484375,
-0.0478515625,
-0.05462646484375,
-0.013519287109375,
-0.016632080078125,
-0.07562255859375,
0.00885772705078125,
-0.0143280029296875,
-0.0439453125,
0.041259765625,
-0.01308441162109375,
0.00873565673828125,
0.03387451171875,
-0.034912109375,
0.055755615234375,
0.002777099609375,
0.031707763671875,
0.0049896240234375,
-0.0224456787109375,
-0.00955963134765625,
-0.01091766357421875,
-0.0038356781005859375,
0.017303466796875,
-0.023162841796875,
0.0372314453125,
-0.0229034423828125,
0.07135009765625,
0.014129638671875,
-0.0159149169921875,
0.0240020751953125,
-0.023468017578125,
0.00623321533203125,
-0.0173492431640625,
-0.00942230224609375,
0.03155517578125,
0.003963470458984375,
0.0133819580078125,
-0.03887939453125,
-0.0217132568359375,
-0.06854248046875,
0.0035266876220703125,
-0.042388916015625,
0.0009822845458984375,
0.0164642333984375,
0.0626220703125,
0.066162109375,
0.043701171875,
-0.0254669189453125,
0.0172576904296875,
0.0286102294921875,
-0.01168060302734375,
0.003963470458984375,
0.032623291015625,
-0.03558349609375,
-0.06256103515625,
0.07855224609375,
0.011566162109375,
0.0185089111328125,
0.06121826171875,
0.01812744140625,
-0.00948333740234375,
-0.0281829833984375,
-0.046661376953125,
0.01055908203125,
-0.07684326171875,
-0.0169525146484375,
-0.0313720703125,
-0.039337158203125,
0.01061248779296875,
-0.0286102294921875,
-0.007843017578125,
0.0053558349609375,
-0.046417236328125,
0.01702880859375,
0.0170745849609375,
0.051025390625,
0.0076141357421875,
0.029510498046875,
-0.0621337890625,
0.0565185546875,
0.0187225341796875,
0.0213623046875,
-0.0147247314453125,
0.00307464599609375,
-0.0187530517578125,
-0.00789642333984375,
-0.01499176025390625,
-0.06744384765625,
0.0218963623046875,
0.015594482421875,
0.0233154296875,
0.042633056640625,
-0.01248931884765625,
0.0289764404296875,
-0.044097900390625,
0.045318603515625,
0.051483154296875,
-0.0635986328125,
0.036834716796875,
-0.0087432861328125,
0.0289459228515625,
0.05511474609375,
0.057220458984375,
-0.049957275390625,
-0.0240020751953125,
0.01959228515625,
-0.0252227783203125,
0.06304931640625,
0.017425537109375,
0.0021114349365234375,
0.0239105224609375,
0.04107666015625,
0.01082611083984375,
0.0121612548828125,
-0.045501708984375,
-0.0093841552734375,
-0.0369873046875,
-0.040008544921875,
-0.01450347900390625,
-0.0287322998046875,
0.01065826416015625,
-0.0189361572265625,
-0.0021572113037109375,
-0.00969696044921875,
0.060455322265625,
0.03961181640625,
-0.0088958740234375,
-0.0003376007080078125,
-0.0160064697265625,
0.0400390625,
-0.005306243896484375,
-0.01142120361328125,
-0.01038360595703125,
-0.01971435546875,
-0.09991455078125,
0.01464080810546875,
-0.0260009765625,
-0.045318603515625,
0.003276824951171875,
0.0260772705078125,
0.06866455078125,
-0.00330352783203125,
-0.024505615234375,
0.0187530517578125,
-0.0176544189453125,
-0.031829833984375,
-0.018829345703125,
0.0102691650390625,
-0.017425537109375,
0.002719879150390625,
0.02978515625,
0.006908416748046875,
0.03143310546875,
-0.0635986328125,
-0.00896453857421875,
0.004009246826171875,
-0.04425048828125,
-0.0087127685546875,
0.040435791015625,
0.0206298828125,
-0.03680419921875,
0.059661865234375,
-0.004825592041015625,
-0.059661865234375,
0.058746337890625,
0.05523681640625,
0.06591796875,
-0.0088043212890625,
0.0247650146484375,
0.041534423828125,
0.0369873046875,
-0.0053253173828125,
0.0377197265625,
-0.0128631591796875,
-0.0714111328125,
-0.01202392578125,
-0.04205322265625,
-0.047149658203125,
-0.0214691162109375,
-0.04962158203125,
0.04632568359375,
-0.055419921875,
-0.0297698974609375,
-0.004100799560546875,
-0.0251617431640625,
-0.0264434814453125,
0.036163330078125,
0.0172119140625,
0.06005859375,
-0.034210205078125,
0.034912109375,
0.0504150390625,
-0.0548095703125,
-0.025177001953125,
0.01136016845703125,
-0.00940704345703125,
-0.079833984375,
0.046356201171875,
0.041229248046875,
-0.014404296875,
0.00995635986328125,
-0.0264434814453125,
-0.046142578125,
0.07159423828125,
-0.005023956298828125,
-0.03289794921875,
-0.0209197998046875,
0.03460693359375,
0.05316162109375,
-0.017181396484375,
0.0179290771484375,
0.01495361328125,
0.0307769775390625,
0.00508880615234375,
-0.04302978515625,
0.0013904571533203125,
-0.03271484375,
0.011383056640625,
0.031158447265625,
-0.0548095703125,
0.03369140625,
0.034210205078125,
-0.037139892578125,
0.0242767333984375,
0.03643798828125,
-0.0084075927734375,
0.0281829833984375,
0.0217132568359375,
0.0408935546875,
0.03350830078125,
-0.01461029052734375,
0.07843017578125,
-0.00653076171875,
0.018096923828125,
0.0269012451171875,
0.001445770263671875,
0.053558349609375,
0.0049896240234375,
-0.01715087890625,
0.05364990234375,
0.0246429443359375,
-0.032440185546875,
0.061553955078125,
-0.0261077880859375,
0.01451873779296875,
0.00627899169921875,
0.0028171539306640625,
-0.03326416015625,
0.03369140625,
0.054534912109375,
-0.036773681640625,
0.0191192626953125,
0.0152435302734375,
-0.0216217041015625,
-0.0091552734375,
-0.049560546875,
0.057708740234375,
0.0182647705078125,
-0.032012939453125,
0.026947021484375,
0.0078277587890625,
0.054290771484375,
-0.05816650390625,
-0.00003415346145629883,
0.0125732421875,
0.020965576171875,
-0.023162841796875,
-0.08441162109375,
0.043426513671875,
-0.0170745849609375,
-0.027862548828125,
0.0169677734375,
0.0841064453125,
-0.0088348388671875,
-0.03521728515625,
0.0236053466796875,
-0.037322998046875,
0.0092010498046875,
-0.002307891845703125,
-0.041046142578125,
0.023101806640625,
-0.00528717041015625,
0.0099334716796875,
0.005367279052734375,
0.03289794921875,
0.0282440185546875,
0.026458740234375,
0.032318115234375,
-0.012725830078125,
-0.016754150390625,
0.010284423828125,
0.025054931640625,
-0.041290283203125,
-0.053741455078125,
-0.0523681640625,
0.043853759765625,
-0.0229949951171875,
-0.047393798828125,
0.072265625,
0.0211944580078125,
0.04241943359375,
-0.041473388671875,
0.0369873046875,
0.0016355514526367188,
0.00023448467254638672,
-0.06109619140625,
0.07720947265625,
-0.0673828125,
-0.0517578125,
-0.039398193359375,
-0.07232666015625,
-0.059326171875,
0.07879638671875,
0.007534027099609375,
-0.006702423095703125,
0.0628662109375,
0.032623291015625,
-0.0045623779296875,
0.021820068359375,
0.0212554931640625,
-0.0279388427734375,
-0.0081329345703125,
0.0243988037109375,
0.048614501953125,
-0.05596923828125,
0.041107177734375,
-0.0211181640625,
-0.0128173828125,
-0.035430908203125,
-0.035675048828125,
-0.096923828125,
-0.0531005859375,
-0.0251617431640625,
-0.033935546875,
-0.016845703125,
0.0019683837890625,
0.03643798828125,
-0.0545654296875,
0.0017614364624023438,
-0.0240020751953125,
0.0008111000061035156,
-0.000942230224609375,
-0.0174102783203125,
0.0284271240234375,
-0.0010728836059570312,
-0.06793212890625,
-0.0230560302734375,
0.0291290283203125,
0.01076507568359375,
-0.0128631591796875,
-0.0019321441650390625,
0.01149749755859375,
0.00021135807037353516,
0.039215087890625,
0.051605224609375,
-0.009552001953125,
0.0026798248291015625,
0.0020046234130859375,
-0.040924072265625,
0.0189666748046875,
0.0584716796875,
0.004268646240234375,
0.032684326171875,
0.067138671875,
0.048828125,
0.036590576171875,
-0.0262298583984375,
0.039093017578125,
-0.016632080078125,
0.0168914794921875,
0.003910064697265625,
0.0225982666015625,
0.00801849365234375,
-0.045684814453125,
0.04962158203125,
0.039520263671875,
-0.038787841796875,
-0.0474853515625,
0.0233001708984375,
-0.06231689453125,
-0.011871337890625,
0.06536865234375,
-0.00408935546875,
-0.042388916015625,
-0.01551055908203125,
-0.06231689453125,
0.01122283935546875,
-0.0185546875,
0.01502227783203125,
0.057220458984375,
0.0279388427734375,
-0.0274810791015625,
-0.01543426513671875,
0.022674560546875,
-0.0082244873046875,
-0.04949951171875,
-0.04974365234375,
0.01251220703125,
-0.0009398460388183594,
0.06463623046875,
0.06524658203125,
-0.045867919921875,
0.0290069580078125,
0.02496337890625,
0.00441741943359375,
0.007480621337890625,
-0.0284271240234375,
-0.0213470458984375,
0.0207672119140625,
-0.019866943359375,
-0.0243988037109375
]
] |
google/t5-v1_1-xl | 2023-01-24T16:52:38.000Z | [
"transformers",
"pytorch",
"tf",
"t5",
"text2text-generation",
"en",
"dataset:c4",
"arxiv:2002.05202",
"arxiv:1910.10683",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/t5-v1_1-xl | 14 | 68,354 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- c4
license: apache-2.0
---
[Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) Version 1.1
## Version 1.1
[T5 Version 1.1](https://github.com/google-research/text-to-text-transfer-transformer/blob/master/released_checkpoints.md#t511) includes the following improvements compared to the original T5 model:
- GEGLU activation in the feed-forward hidden layer, rather than ReLU (see [here](https://arxiv.org/abs/2002.05202)).
- Dropout was turned off in pre-training (quality win). Dropout should be re-enabled during fine-tuning.
- Pre-trained on C4 only, without mixing in the downstream tasks.
- No parameter sharing between the embedding and classifier layers.
- "xl" and "xxl" replace "3B" and "11B". The model shapes are slightly different: larger `d_model` and smaller `num_heads` and `d_ff`.
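The GEGLU feed-forward variant mentioned above can be sketched in a few lines (a NumPy illustration of the idea from the linked paper, not the actual T5 implementation; the weight names and dimensions are hypothetical):

```python
import numpy as np

def gelu(x: np.ndarray) -> np.ndarray:
    # tanh approximation of the GELU activation
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def geglu_ffn(x: np.ndarray, W: np.ndarray, V: np.ndarray, W2: np.ndarray) -> np.ndarray:
    # GEGLU: a GELU-gated branch multiplied elementwise by a linear branch,
    # followed by the output projection (three weight matrices instead of two).
    return (gelu(x @ W) * (x @ V)) @ W2

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8))    # (batch, d_model)
W = rng.standard_normal((8, 16))   # d_model -> d_ff
V = rng.standard_normal((8, 16))   # d_model -> d_ff (gate branch)
W2 = rng.standard_normal((16, 8))  # d_ff -> d_model
out = geglu_ffn(x, W, V, W2)
print(out.shape)  # (2, 8)
```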
**Note**: T5 Version 1.1 was pre-trained on C4 only, without any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
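Since the checkpoint is not fine-tuned, a downstream task has to be cast into T5's text-to-text format and trained; a minimal sketch (the task prefix, example strings, and `dropout_rate` value are illustrative, and the model load is guarded because the xl checkpoint is large):

```python
def to_text_to_text(task_prefix: str, source: str) -> str:
    # T5 casts every problem as text-to-text by prepending a task prefix.
    return f"{task_prefix}: {source}"

if __name__ == "__main__":
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained("google/t5-v1_1-xl")
    # Re-enable dropout for fine-tuning (it was turned off during pre-training).
    model = T5ForConditionalGeneration.from_pretrained("google/t5-v1_1-xl", dropout_rate=0.1)

    inputs = tokenizer(to_text_to_text("summarize", "A long article ..."), return_tensors="pt")
    labels = tokenizer("A short summary.", return_tensors="pt").input_ids
    loss = model(**inputs, labels=labels).loss  # feed this into your training loop
```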
Pretraining Dataset: [C4](https://huggingface.co/datasets/c4)
Other Community Checkpoints: [here](https://huggingface.co/models?search=t5-v1_1)
Paper: [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910.10683.pdf)
Authors: *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu*
## Abstract
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new “Colossal Clean Crawled Corpus”, we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.

| 2,673 | [
[
-0.0215301513671875,
-0.026824951171875,
0.0295867919921875,
0.015838623046875,
-0.0155792236328125,
0.01049041748046875,
-0.017913818359375,
-0.05303955078125,
-0.012451171875,
0.033538818359375,
-0.052734375,
-0.043670654296875,
-0.07012939453125,
0.01508331298828125,
-0.04840087890625,
0.0972900390625,
-0.01422119140625,
-0.01439666748046875,
0.0011510848999023438,
-0.0031642913818359375,
-0.02667236328125,
-0.032501220703125,
-0.0640869140625,
-0.0270538330078125,
0.0286102294921875,
0.027984619140625,
0.0205535888671875,
0.0275115966796875,
0.0537109375,
0.01300048828125,
-0.0006923675537109375,
-0.006435394287109375,
-0.0501708984375,
-0.029083251953125,
-0.026519775390625,
-0.01445770263671875,
-0.036376953125,
0.007022857666015625,
0.04522705078125,
0.054107666015625,
0.00208282470703125,
0.0189208984375,
0.02545166015625,
0.0450439453125,
-0.052734375,
0.01412200927734375,
-0.04534912109375,
0.0151824951171875,
-0.0076446533203125,
0.0008721351623535156,
-0.05047607421875,
-0.01508331298828125,
0.039337158203125,
-0.056610107421875,
0.0254669189453125,
-0.0088958740234375,
0.091796875,
0.02740478515625,
-0.037628173828125,
-0.0191192626953125,
-0.048614501953125,
0.06646728515625,
-0.045166015625,
0.0287017822265625,
0.00934600830078125,
0.0283966064453125,
0.01160430908203125,
-0.0894775390625,
-0.034149169921875,
-0.0023097991943359375,
-0.0089874267578125,
0.004276275634765625,
-0.0216827392578125,
-0.003570556640625,
0.0077362060546875,
0.036102294921875,
-0.035430908203125,
0.016021728515625,
-0.05010986328125,
-0.019500732421875,
0.036773681640625,
-0.0180816650390625,
0.02362060546875,
0.0011930465698242188,
-0.04913330078125,
-0.0191497802734375,
-0.04052734375,
0.007904052734375,
-0.01666259765625,
0.02447509765625,
-0.0257415771484375,
-0.0070343017578125,
-0.0015516281127929688,
0.048553466796875,
0.01122283935546875,
-0.004730224609375,
0.026519775390625,
-0.04791259765625,
-0.0174407958984375,
-0.015716552734375,
0.06573486328125,
0.0135955810546875,
0.0216217041015625,
-0.03228759765625,
-0.0016078948974609375,
-0.0212860107421875,
0.03228759765625,
-0.0731201171875,
-0.033538818359375,
-0.0060272216796875,
-0.02801513671875,
-0.038116455078125,
0.007663726806640625,
-0.0457763671875,
-0.00408935546875,
-0.0189361572265625,
0.0411376953125,
-0.04302978515625,
-0.0204620361328125,
0.026336669921875,
0.00273895263671875,
0.032135009765625,
0.04071044921875,
-0.07843017578125,
0.035430908203125,
0.03607177734375,
0.06304931640625,
-0.044830322265625,
-0.0254669189453125,
-0.041748046875,
-0.0005431175231933594,
-0.0101776123046875,
0.06097412109375,
-0.024200439453125,
-0.017547607421875,
-0.006435394287109375,
0.01291656494140625,
-0.0190277099609375,
-0.0232086181640625,
0.060089111328125,
-0.031280517578125,
0.043212890625,
-0.020660400390625,
-0.034454345703125,
-0.038970947265625,
0.0128021240234375,
-0.05120849609375,
0.0755615234375,
0.0135650634765625,
-0.044586181640625,
0.035491943359375,
-0.0655517578125,
-0.033233642578125,
-0.01319122314453125,
0.0274658203125,
-0.030731201171875,
-0.0174102783203125,
0.02520751953125,
0.04290771484375,
-0.007678985595703125,
0.00518035888671875,
-0.017242431640625,
-0.02130126953125,
-0.01262664794921875,
-0.004486083984375,
0.06805419921875,
0.0223388671875,
-0.023406982421875,
0.0040435791015625,
-0.047149658203125,
0.013824462890625,
-0.0011167526245117188,
-0.0231781005859375,
0.01023101806640625,
-0.022369384765625,
0.011444091796875,
0.03173828125,
0.0215606689453125,
-0.0249481201171875,
0.01995849609375,
-0.0195159912109375,
0.0401611328125,
0.040313720703125,
-0.01451873779296875,
0.06573486328125,
-0.031829833984375,
0.038299560546875,
0.004100799560546875,
0.004665374755859375,
-0.01184844970703125,
-0.0167388916015625,
-0.055908203125,
-0.008880615234375,
0.050567626953125,
0.0537109375,
-0.0511474609375,
0.043060302734375,
-0.04205322265625,
-0.03955078125,
-0.04559326171875,
0.006214141845703125,
0.0280609130859375,
0.047882080078125,
0.05712890625,
-0.01934814453125,
-0.04278564453125,
-0.0411376953125,
-0.0206146240234375,
0.004268646240234375,
-0.006744384765625,
-0.002193450927734375,
0.0362548828125,
-0.01580810546875,
0.05841064453125,
-0.0233306884765625,
-0.041229248046875,
-0.04461669921875,
0.01361846923828125,
-0.004100799560546875,
0.046051025390625,
0.05230712890625,
-0.04534912109375,
-0.040496826171875,
0.0070343017578125,
-0.058837890625,
-0.01340484619140625,
-0.012054443359375,
-0.006816864013671875,
0.0241241455078125,
0.04425048828125,
-0.0196533203125,
0.0238494873046875,
0.06298828125,
-0.0184173583984375,
0.0271759033203125,
-0.0106201171875,
0.0011806488037109375,
-0.11767578125,
0.0294189453125,
0.00341796875,
-0.038177490234375,
-0.05615234375,
-0.0008916854858398438,
0.0205078125,
0.006381988525390625,
-0.04327392578125,
0.046661376953125,
-0.036956787109375,
0.005558013916015625,
-0.0198822021484375,
0.0139617919921875,
-0.000484466552734375,
0.039459228515625,
-0.0087738037109375,
0.060333251953125,
0.036041259765625,
-0.060699462890625,
-0.006084442138671875,
0.0323486328125,
-0.0150909423828125,
0.0099334716796875,
-0.04522705078125,
0.0321044921875,
0.0004887580871582031,
0.0345458984375,
-0.06591796875,
0.0201263427734375,
0.0310211181640625,
-0.04437255859375,
0.042816162109375,
-0.00970458984375,
-0.015289306640625,
-0.01496124267578125,
-0.0267333984375,
0.0215301513671875,
0.049041748046875,
-0.04730224609375,
0.040191650390625,
0.01068115234375,
0.001811981201171875,
-0.052093505859375,
-0.055999755859375,
0.01483917236328125,
-0.0194854736328125,
-0.047882080078125,
0.0640869140625,
0.00130462646484375,
0.0183258056640625,
-0.00428009033203125,
-0.006099700927734375,
-0.0212554931640625,
0.016693115234375,
-0.01192474365234375,
0.02001953125,
-0.002239227294921875,
0.00775146484375,
0.01071929931640625,
-0.021392822265625,
-0.0028400421142578125,
-0.034881591796875,
0.022064208984375,
-0.01031494140625,
0.015777587890625,
-0.04083251953125,
0.0010251998901367188,
0.0234832763671875,
-0.0201263427734375,
0.0555419921875,
0.06951904296875,
-0.0188751220703125,
-0.0234375,
-0.0205078125,
-0.0157928466796875,
-0.034515380859375,
0.03228759765625,
-0.037750244140625,
-0.07611083984375,
0.0310516357421875,
-0.017120361328125,
0.0229034423828125,
0.052276611328125,
0.006557464599609375,
-0.00290679931640625,
0.0489501953125,
0.082275390625,
-0.02520751953125,
0.050018310546875,
-0.03369140625,
0.0206146240234375,
-0.06732177734375,
-0.01204681396484375,
-0.050811767578125,
-0.022308349609375,
-0.04840087890625,
-0.0215911865234375,
0.003635406494140625,
0.01934814453125,
-0.01377105712890625,
0.040252685546875,
-0.029144287109375,
0.027099609375,
0.01482391357421875,
0.01233673095703125,
0.029815673828125,
0.00727081298828125,
0.0027923583984375,
-0.01442718505859375,
-0.05889892578125,
-0.037445068359375,
0.09197998046875,
0.023895263671875,
0.03863525390625,
0.006793975830078125,
0.048797607421875,
0.032745361328125,
0.03302001953125,
-0.055999755859375,
0.033905029296875,
-0.0345458984375,
-0.0208587646484375,
-0.0270538330078125,
-0.03460693359375,
-0.08697509765625,
0.0225830078125,
-0.0372314453125,
-0.05438232421875,
-0.010498046875,
0.0011301040649414062,
-0.0088348388671875,
0.037689208984375,
-0.060699462890625,
0.0782470703125,
0.004848480224609375,
-0.0152130126953125,
-0.0008702278137207031,
-0.0574951171875,
0.0192718505859375,
-0.01055145263671875,
-0.00342559814453125,
0.007419586181640625,
-0.004955291748046875,
0.055389404296875,
-0.0166168212890625,
0.051055908203125,
-0.0092010498046875,
-0.006526947021484375,
0.00042366981506347656,
0.0002187490463256836,
0.039764404296875,
-0.029815673828125,
-0.0001348257064819336,
0.0245361328125,
-0.0015010833740234375,
-0.042327880859375,
-0.037109375,
0.0328369140625,
-0.060211181640625,
-0.0237884521484375,
-0.0229644775390625,
-0.021209716796875,
-0.004650115966796875,
0.02606201171875,
0.0352783203125,
0.01363372802734375,
-0.015899658203125,
0.0264129638671875,
0.053924560546875,
-0.01136016845703125,
0.04388427734375,
0.0266571044921875,
-0.02130126953125,
-0.0060272216796875,
0.053009033203125,
-0.00018072128295898438,
0.037506103515625,
0.046417236328125,
0.007579803466796875,
-0.02801513671875,
-0.058837890625,
-0.037322998046875,
0.01505279541015625,
-0.047332763671875,
-0.00994110107421875,
-0.06072998046875,
-0.03094482421875,
-0.045166015625,
-0.01013946533203125,
-0.034759521484375,
-0.0219268798828125,
-0.038116455078125,
-0.01934814453125,
0.01093292236328125,
0.050933837890625,
0.00970458984375,
0.0169525146484375,
-0.07977294921875,
0.00849151611328125,
0.004718780517578125,
0.01751708984375,
-0.0032100677490234375,
-0.07550048828125,
-0.01163482666015625,
0.00196075439453125,
-0.02862548828125,
-0.050933837890625,
0.03619384765625,
0.030426025390625,
0.0296478271484375,
0.01238250732421875,
0.00632476806640625,
0.0390625,
-0.02874755859375,
0.05841064453125,
0.016876220703125,
-0.0892333984375,
0.030242919921875,
-0.0233154296875,
0.029754638671875,
0.058563232421875,
0.0418701171875,
-0.034576416015625,
-0.00797271728515625,
-0.051910400390625,
-0.050262451171875,
0.05938720703125,
0.0133056640625,
-0.00508880615234375,
0.037384033203125,
0.0229644775390625,
0.02667236328125,
-0.004192352294921875,
-0.0693359375,
-0.0108795166015625,
-0.01151275634765625,
-0.0153350830078125,
-0.01142120361328125,
0.007602691650390625,
0.031005859375,
-0.028717041015625,
0.0443115234375,
-0.01505279541015625,
0.0239715576171875,
0.0248260498046875,
-0.038421630859375,
0.0136566162109375,
0.0181732177734375,
0.043243408203125,
0.05816650390625,
-0.018310546875,
-0.0064697265625,
0.035888671875,
-0.049102783203125,
-0.0029468536376953125,
0.01580810546875,
-0.010650634765625,
-0.0058135986328125,
0.033233642578125,
0.0634765625,
0.0244903564453125,
-0.0187225341796875,
0.043365478515625,
-0.010406494140625,
-0.048614501953125,
-0.01113128662109375,
0.00487518310546875,
-0.007904052734375,
-0.00569915771484375,
0.027130126953125,
0.01922607421875,
0.0234375,
-0.032745361328125,
0.00930023193359375,
0.00516510009765625,
-0.038055419921875,
-0.040069580078125,
0.0472412109375,
0.029998779296875,
-0.01134490966796875,
0.058563232421875,
-0.019561767578125,
-0.0426025390625,
0.029754638671875,
0.043121337890625,
0.077392578125,
-0.007343292236328125,
0.0260467529296875,
0.045806884765625,
0.026947021484375,
-0.01152801513671875,
-0.00814056396484375,
-0.0180816650390625,
-0.061187744140625,
-0.063232421875,
-0.03363037109375,
-0.0360107421875,
0.0111846923828125,
-0.050811767578125,
0.03460693359375,
-0.0247650146484375,
0.0151824951171875,
-0.0006327629089355469,
0.01445770263671875,
-0.06219482421875,
0.0156402587890625,
0.0115509033203125,
0.0716552734375,
-0.0582275390625,
0.07952880859375,
0.053558349609375,
-0.0222015380859375,
-0.0650634765625,
0.0036830902099609375,
-0.02471923828125,
-0.047119140625,
0.03240966796875,
0.02264404296875,
-0.01273345947265625,
0.0169219970703125,
-0.05072021484375,
-0.072021484375,
0.09930419921875,
0.036376953125,
-0.0260009765625,
-0.0214080810546875,
0.006282806396484375,
0.038970947265625,
-0.0240478515625,
0.0135040283203125,
0.045684814453125,
0.028533935546875,
0.01971435546875,
-0.09429931640625,
0.0201568603515625,
-0.0196380615234375,
-0.00933074951171875,
0.01666259765625,
-0.0399169921875,
0.05328369140625,
-0.024200439453125,
-0.0258331298828125,
-0.0010442733764648438,
0.0552978515625,
0.0018320083618164062,
0.0185699462890625,
0.040802001953125,
0.05792236328125,
0.061431884765625,
-0.01508331298828125,
0.0882568359375,
-0.0036716461181640625,
0.034912109375,
0.07952880859375,
-0.0010499954223632812,
0.062744140625,
0.02410888671875,
-0.0204315185546875,
0.044830322265625,
0.04156494140625,
0.0097198486328125,
0.043182373046875,
0.003528594970703125,
-0.0043487548828125,
-0.006198883056640625,
0.0086212158203125,
-0.033172607421875,
0.0256195068359375,
0.01288604736328125,
-0.0237884521484375,
-0.032745361328125,
0.00439453125,
0.0162353515625,
-0.006439208984375,
-0.0142364501953125,
0.072509765625,
0.00629425048828125,
-0.050018310546875,
0.047637939453125,
-0.00568389892578125,
0.0723876953125,
-0.04412841796875,
-0.0003113746643066406,
-0.0221099853515625,
0.0163116455078125,
-0.0184478759765625,
-0.05401611328125,
0.032501220703125,
-0.006839752197265625,
-0.00945281982421875,
-0.050811767578125,
0.07330322265625,
-0.0237274169921875,
-0.017913818359375,
0.030548095703125,
0.040252685546875,
0.018585205078125,
-0.0096893310546875,
-0.055694580078125,
-0.017120361328125,
0.0202484130859375,
-0.007244110107421875,
0.036376953125,
0.03631591796875,
0.005382537841796875,
0.050872802734375,
0.043792724609375,
-0.001789093017578125,
0.01073455810546875,
0.0034084320068359375,
0.05389404296875,
-0.054840087890625,
-0.0386962890625,
-0.044403076171875,
0.03662109375,
-0.004337310791015625,
-0.0399169921875,
0.046783447265625,
0.030181884765625,
0.08843994140625,
-0.00971221923828125,
0.05810546875,
-0.0017175674438476562,
0.041595458984375,
-0.0458984375,
0.04852294921875,
-0.039642333984375,
0.00714111328125,
-0.0250091552734375,
-0.06378173828125,
-0.025146484375,
0.03955078125,
-0.0233612060546875,
0.016510009765625,
0.07391357421875,
0.036834716796875,
-0.006519317626953125,
0.000995635986328125,
0.01824951171875,
-0.0007948875427246094,
0.038177490234375,
0.062255859375,
0.041015625,
-0.06719970703125,
0.066162109375,
-0.0168914794921875,
-0.004749298095703125,
-0.005306243896484375,
-0.07720947265625,
-0.0616455078125,
-0.05621337890625,
-0.028900146484375,
-0.0169677734375,
0.004795074462890625,
0.04937744140625,
0.064697265625,
-0.0478515625,
-0.001949310302734375,
-0.0215911865234375,
-0.005527496337890625,
-0.0145416259765625,
-0.0165863037109375,
0.0272064208984375,
-0.051605224609375,
-0.06072998046875,
0.005458831787109375,
-0.0022430419921875,
0.00527191162109375,
0.01030731201171875,
-0.00530242919921875,
-0.0232086181640625,
-0.032745361328125,
0.044586181640625,
0.021820068359375,
-0.0244140625,
-0.0238494873046875,
0.002689361572265625,
-0.007080078125,
0.0185089111328125,
0.044403076171875,
-0.06597900390625,
0.01397705078125,
0.03521728515625,
0.07757568359375,
0.06439208984375,
-0.01000213623046875,
0.043060302734375,
-0.043731689453125,
-0.00864410400390625,
0.01328277587890625,
0.00815582275390625,
0.0272064208984375,
-0.01416015625,
0.0513916015625,
0.0124969482421875,
-0.040374755859375,
-0.03515625,
-0.00954437255859375,
-0.09588623046875,
-0.01404571533203125,
0.07977294921875,
-0.0162811279296875,
-0.0156707763671875,
0.00241851806640625,
-0.01142120361328125,
0.02471923828125,
-0.0243377685546875,
0.06103515625,
0.06072998046875,
0.01318359375,
-0.0291595458984375,
-0.03662109375,
0.05169677734375,
0.044586181640625,
-0.0882568359375,
-0.0247802734375,
0.01430511474609375,
0.033416748046875,
0.00333404541015625,
0.042327880859375,
-0.011383056640625,
0.0190277099609375,
-0.030426025390625,
0.0146942138671875,
-0.0011587142944335938,
-0.0302581787109375,
-0.04327392578125,
0.011077880859375,
-0.0169830322265625,
-0.0261993408203125
]
] |
mrm8488/distill-bert-base-spanish-wwm-cased-finetuned-spa-squad2-es | 2023-01-20T12:05:38.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"question-answering",
"es",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | question-answering | mrm8488 | null | null | mrm8488/distill-bert-base-spanish-wwm-cased-finetuned-spa-squad2-es | 33 | 67,450 | transformers | 2022-03-02T23:29:05 | ---
language: es
thumbnail: https://i.imgur.com/jgBdimh.png
license: apache-2.0
---
# BETO (Spanish BERT) + Spanish SQuAD2.0 + distillation using 'bert-base-multilingual-cased' as teacher
This model is a version of [BETO](https://github.com/dccuchile/beto) fine-tuned on [SQuAD-es-v2.0](https://github.com/ccasimiro88/TranslateAlignRetrieve) and **distilled** for **Q&A**.
Distillation makes the model **smaller, faster, cheaper and lighter** than [bert-base-spanish-wwm-cased-finetuned-spa-squad2-es](https://github.com/huggingface/transformers/blob/master/model_cards/mrm8488/bert-base-spanish-wwm-cased-finetuned-spa-squad2-es/README.md).
It was fine-tuned on the same dataset, but with **distillation** during training, as mentioned above (plus one additional training epoch).
The **teacher model** for the distillation was `bert-base-multilingual-cased` — the same teacher used for `distilbert-base-multilingual-cased`, AKA [**DistilmBERT**](https://github.com/huggingface/transformers/tree/master/examples/distillation) (which is, on average, twice as fast as **mBERT-base**).
## Details of the downstream task (Q&A) - Dataset
<details>
[SQuAD-es-v2.0](https://github.com/ccasimiro88/TranslateAlignRetrieve)
| Dataset | # Q&A |
| ----------------------- | ----- |
| SQuAD2.0 Train | 130 K |
| SQuAD2.0-es-v2.0 | 111 K |
| SQuAD2.0 Dev | 12 K |
| SQuAD-es-v2.0-small Dev | 69 K |
</details>
## Model training
The model was trained on a Tesla P100 GPU and 25GB of RAM with the following command:
```bash
!export SQUAD_DIR=/path/to/squad-v2_spanish \
&& python transformers/examples/distillation/run_squad_w_distillation.py \
--model_type bert \
--model_name_or_path dccuchile/bert-base-spanish-wwm-cased \
--teacher_type bert \
--teacher_name_or_path bert-base-multilingual-cased \
--do_train \
--do_eval \
--do_lower_case \
--train_file $SQUAD_DIR/train-v2.json \
--predict_file $SQUAD_DIR/dev-v2.json \
--per_gpu_train_batch_size 12 \
--learning_rate 3e-5 \
--num_train_epochs 5.0 \
--max_seq_length 384 \
--doc_stride 128 \
--output_dir /content/model_output \
--save_steps 5000 \
--threads 4 \
--version_2_with_negative
```
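The distillation objective in this setup combines the usual span-prediction loss with a soft cross-entropy between the student's and the teacher's logits, softened by a temperature. Here is a minimal pure-Python sketch of that soft-target term (the temperature value and this standalone implementation are illustrative assumptions, not the exact code in `run_squad_w_distillation.py`):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of logits.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def soft_cross_entropy(student_logits, teacher_logits, temperature=2.0):
    # Soft-target loss: cross-entropy of the student's distribution against
    # the teacher's, both softened by the same temperature. Scaled by T^2 so
    # gradient magnitudes stay comparable to the hard-label loss.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    ce = -sum(pt * math.log(ps) for pt, ps in zip(p_teacher, p_student))
    return temperature ** 2 * ce
```

In the actual training script, a term like this is combined with the standard start/end position losses on the SQuAD labels.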
## Results
TBA
### Model in action
Fast usage with **pipelines**:
```python
from transformers import pipeline
# Important: at the time of writing, the QA pipeline is not compatible with fast tokenizers,
# so pass {"use_fast": False} to the tokenizer, as in the following example:
nlp = pipeline(
'question-answering',
model='mrm8488/distill-bert-base-spanish-wwm-cased-finetuned-spa-squad2-es',
tokenizer=(
'mrm8488/distill-bert-base-spanish-wwm-cased-finetuned-spa-squad2-es',
{"use_fast": False}
)
)
nlp(
{
'question': '¿Para qué lenguaje está trabajando?',
'context': 'Manuel Romero está colaborando activamente con huggingface/transformers ' +
'para traer el poder de las últimas técnicas de procesamiento de lenguaje natural al idioma español'
}
)
# Output: {'answer': 'español', 'end': 169, 'score': 0.67530957344621, 'start': 163}
```
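The `start` and `end` fields in the pipeline's output are character offsets into the original context, so the predicted answer can be recovered by slicing. A small sketch, using a hypothetical result dict rather than a real model call:

```python
def extract_answer(context: str, result: dict) -> str:
    # The QA pipeline reports character offsets into the context string,
    # so slicing recovers the predicted answer span.
    return context[result["start"]:result["end"]]

context = "Manuel Romero está colaborando activamente con huggingface/transformers"
# Hypothetical pipeline output for illustration:
result = {"answer": "Manuel Romero", "start": 0, "end": 13, "score": 0.9}
assert extract_answer(context, result) == result["answer"]
```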
Play with this model and `pipelines` in a Colab:
<a href="https://colab.research.google.com/github/mrm8488/shared_colab_notebooks/blob/master/Using_Spanish_BERT_fine_tuned_for_Q%26A_pipelines.ipynb" target="_parent"><img src="https://camo.githubusercontent.com/52feade06f2fecbf006889a904d221e6a730c194/68747470733a2f2f636f6c61622e72657365617263682e676f6f676c652e636f6d2f6173736574732f636f6c61622d62616467652e737667" alt="Open In Colab" data-canonical-src="https://colab.research.google.com/assets/colab-badge.svg"></a>
<details>
1. Set the context and ask some questions:

2. Run predictions:

</details>
More about Hugging Face `pipelines`? Check out this Colab:
<a href="https://colab.research.google.com/github/mrm8488/shared_colab_notebooks/blob/master/Huggingface_pipelines_demo.ipynb" target="_parent"><img src="https://camo.githubusercontent.com/52feade06f2fecbf006889a904d221e6a730c194/68747470733a2f2f636f6c61622e72657365617263682e676f6f676c652e636f6d2f6173736574732f636f6c61622d62616467652e737667" alt="Open In Colab" data-canonical-src="https://colab.research.google.com/assets/colab-badge.svg"></a>
> Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488)
> Made with <span style="color: #e25555;">♥</span> in Spain | 4,598 | [
[
-0.041015625,
-0.051788330078125,
0.0174102783203125,
0.0230865478515625,
-0.0157623291015625,
0.0220184326171875,
-0.01483154296875,
-0.026519775390625,
0.022003173828125,
0.00948333740234375,
-0.06427001953125,
-0.03271484375,
-0.051727294921875,
0.00943756103515625,
-0.01038360595703125,
0.0899658203125,
-0.01371002197265625,
0.01427459716796875,
-0.004756927490234375,
-0.004833221435546875,
-0.031951904296875,
-0.035430908203125,
-0.05291748046875,
-0.0285491943359375,
0.0285491943359375,
0.0146026611328125,
0.042510986328125,
0.03839111328125,
0.03302001953125,
0.0305633544921875,
-0.02105712890625,
0.002079010009765625,
-0.0251617431640625,
0.01155853271484375,
0.008392333984375,
-0.03533935546875,
-0.03662109375,
0.0019521713256835938,
0.03289794921875,
0.028045654296875,
0.005146026611328125,
0.0241546630859375,
0.00498199462890625,
0.038970947265625,
-0.0271148681640625,
0.03546142578125,
-0.0389404296875,
-0.0035800933837890625,
0.0101318359375,
0.006877899169921875,
-0.017181396484375,
-0.016143798828125,
0.0132598876953125,
-0.03204345703125,
0.0279998779296875,
0.003368377685546875,
0.09649658203125,
0.0294342041015625,
-0.010040283203125,
-0.0248260498046875,
-0.036834716796875,
0.06475830078125,
-0.061065673828125,
0.01438140869140625,
0.0102691650390625,
0.0244140625,
-0.0125885009765625,
-0.06439208984375,
-0.051971435546875,
0.004680633544921875,
-0.015411376953125,
0.02587890625,
-0.00916290283203125,
-0.01055145263671875,
0.02044677734375,
0.0235748291015625,
-0.036529541015625,
0.0013704299926757812,
-0.049530029296875,
-0.0233917236328125,
0.048828125,
0.00577545166015625,
0.0089569091796875,
-0.01898193359375,
-0.03936767578125,
-0.036346435546875,
-0.02899169921875,
0.028717041015625,
0.0278167724609375,
0.041015625,
-0.0287933349609375,
0.038238525390625,
-0.0175628662109375,
0.0276031494140625,
0.01074981689453125,
0.00229644775390625,
0.035400390625,
-0.0209503173828125,
-0.019683837890625,
-0.004703521728515625,
0.08349609375,
0.02288818359375,
0.0216522216796875,
-0.0080413818359375,
-0.00576019287109375,
0.00426483154296875,
-0.00452423095703125,
-0.07720947265625,
-0.0192413330078125,
0.042449951171875,
-0.01238250732421875,
-0.0191497802734375,
0.0016813278198242188,
-0.055633544921875,
0.00588226318359375,
-0.007415771484375,
0.037628173828125,
-0.03363037109375,
-0.0144500732421875,
0.006641387939453125,
-0.0310516357421875,
0.027099609375,
0.005706787109375,
-0.062103271484375,
-0.006317138671875,
0.0389404296875,
0.0631103515625,
0.00821685791015625,
-0.0283203125,
-0.03656005859375,
-0.01445770263671875,
-0.004436492919921875,
0.0430908203125,
-0.01427459716796875,
-0.0247039794921875,
-0.005840301513671875,
0.02685546875,
-0.0157928466796875,
-0.0312347412109375,
0.032470703125,
-0.027252197265625,
0.036163330078125,
-0.03656005859375,
-0.0292816162109375,
-0.013153076171875,
0.01549530029296875,
-0.0311737060546875,
0.096923828125,
0.0143585205078125,
-0.0587158203125,
0.0303192138671875,
-0.048431396484375,
-0.0243988037109375,
-0.00815582275390625,
0.01523590087890625,
-0.036224365234375,
-0.017486572265625,
0.02484130859375,
0.04608154296875,
-0.0223236083984375,
0.0195465087890625,
-0.0276031494140625,
-0.0190277099609375,
0.00826263427734375,
-0.006183624267578125,
0.09014892578125,
0.0145111083984375,
-0.0333251953125,
0.008575439453125,
-0.043426513671875,
0.00868988037109375,
0.009490966796875,
-0.02276611328125,
-0.00008225440979003906,
-0.00954437255859375,
-0.0042877197265625,
0.03253173828125,
0.022705078125,
-0.03717041015625,
0.016326904296875,
-0.02447509765625,
0.05078125,
0.0628662109375,
-0.01044464111328125,
0.0287628173828125,
-0.027862548828125,
0.044281005859375,
0.00574493408203125,
0.01534271240234375,
0.0007796287536621094,
-0.05413818359375,
-0.07818603515625,
-0.04473876953125,
0.0238800048828125,
0.0518798828125,
-0.0562744140625,
0.038238525390625,
-0.0152435302734375,
-0.051177978515625,
-0.048004150390625,
-0.0085296630859375,
0.0247650146484375,
0.05615234375,
0.038482666015625,
-0.0095977783203125,
-0.057098388671875,
-0.06365966796875,
0.003734588623046875,
-0.036834716796875,
-0.012237548828125,
0.0112457275390625,
0.07086181640625,
-0.0080413818359375,
0.07177734375,
-0.044342041015625,
-0.0114288330078125,
-0.016937255859375,
0.01018524169921875,
0.040313720703125,
0.045318603515625,
0.048553466796875,
-0.042816162109375,
-0.047149658203125,
-0.013580322265625,
-0.0584716796875,
0.0088958740234375,
0.004459381103515625,
-0.0229644775390625,
0.00595855712890625,
0.02545166015625,
-0.0472412109375,
0.006305694580078125,
0.036834716796875,
-0.0345458984375,
0.039764404296875,
-0.01031494140625,
0.00800323486328125,
-0.09173583984375,
0.01031494140625,
-0.0016546249389648438,
-0.007160186767578125,
-0.038909912109375,
0.007122039794921875,
-0.00830078125,
0.0006647109985351562,
-0.05877685546875,
0.049468994140625,
-0.0248565673828125,
0.0135345458984375,
0.01103973388671875,
-0.01091766357421875,
0.0117340087890625,
0.0518798828125,
0.004688262939453125,
0.061981201171875,
0.060821533203125,
-0.04278564453125,
0.02880859375,
0.033599853515625,
-0.017822265625,
0.02374267578125,
-0.0694580078125,
0.0092926025390625,
-0.0125885009765625,
0.01346588134765625,
-0.0833740234375,
-0.010650634765625,
0.03289794921875,
-0.05023193359375,
0.0290985107421875,
-0.0221405029296875,
-0.040191650390625,
-0.046417236328125,
-0.02655029296875,
-0.004146575927734375,
0.059417724609375,
-0.034820556640625,
0.035797119140625,
0.0200958251953125,
0.00043272972106933594,
-0.052093505859375,
-0.061004638671875,
-0.0247650146484375,
-0.040679931640625,
-0.06048583984375,
0.0274810791015625,
-0.024169921875,
0.00007301568984985352,
-0.007793426513671875,
-0.00991058349609375,
-0.030731201171875,
0.0085296630859375,
0.008087158203125,
0.0447998046875,
-0.005840301513671875,
-0.00849151611328125,
0.0044097900390625,
0.00484466552734375,
0.01441192626953125,
0.003528594970703125,
0.05224609375,
-0.030792236328125,
0.0100555419921875,
-0.031890869140625,
0.0141143798828125,
0.062408447265625,
-0.0125579833984375,
0.07177734375,
0.068359375,
-0.0098419189453125,
0.0004353523254394531,
-0.0251617431640625,
-0.031341552734375,
-0.039825439453125,
0.023681640625,
-0.03125,
-0.055572509765625,
0.07171630859375,
0.0102081298828125,
0.01245880126953125,
0.0518798828125,
0.051788330078125,
-0.0333251953125,
0.08160400390625,
0.0291748046875,
-0.003753662109375,
0.031402587890625,
-0.06622314453125,
0.0130615234375,
-0.06854248046875,
-0.03857421875,
-0.05120849609375,
-0.032379150390625,
-0.052947998046875,
-0.035430908203125,
0.0294342041015625,
0.022064208984375,
-0.035247802734375,
0.049102783203125,
-0.047119140625,
0.01245880126953125,
0.047821044921875,
0.00528717041015625,
0.0014085769653320312,
-0.007785797119140625,
-0.0149688720703125,
0.0011386871337890625,
-0.070068359375,
-0.03497314453125,
0.07659912109375,
0.0291748046875,
0.0260009765625,
-0.006999969482421875,
0.058349609375,
-0.007099151611328125,
0.021087646484375,
-0.06280517578125,
0.040191650390625,
-0.006855010986328125,
-0.07025146484375,
-0.01442718505859375,
-0.021728515625,
-0.06475830078125,
0.0238800048828125,
-0.01122283935546875,
-0.050384521484375,
0.01262664794921875,
0.0091094970703125,
-0.0214691162109375,
0.020111083984375,
-0.07318115234375,
0.07647705078125,
-0.00487518310546875,
-0.02545166015625,
0.0170135498046875,
-0.05096435546875,
0.02099609375,
0.0106048583984375,
0.0166473388671875,
-0.0172271728515625,
0.00835418701171875,
0.0574951171875,
-0.049468994140625,
0.055877685546875,
-0.0193328857421875,
0.005420684814453125,
0.03936767578125,
-0.0165863037109375,
0.043243408203125,
0.002490997314453125,
-0.01611328125,
0.023834228515625,
0.01204681396484375,
-0.0272979736328125,
-0.0350341796875,
0.041961669921875,
-0.055999755859375,
-0.032012939453125,
-0.045196533203125,
-0.04071044921875,
-0.006473541259765625,
0.0093994140625,
0.036468505859375,
0.006290435791015625,
0.00818634033203125,
0.0143585205078125,
0.031951904296875,
-0.007762908935546875,
0.0528564453125,
0.031341552734375,
-0.002162933349609375,
-0.01107025146484375,
0.048553466796875,
0.0027141571044921875,
0.0169219970703125,
0.0157470703125,
0.01873779296875,
-0.044677734375,
-0.027099609375,
-0.044647216796875,
0.0247344970703125,
-0.042144775390625,
-0.0258941650390625,
-0.044097900390625,
-0.01515960693359375,
-0.037811279296875,
-0.003757476806640625,
-0.039459228515625,
-0.027008056640625,
-0.03704833984375,
-0.002696990966796875,
0.053863525390625,
0.0196685791015625,
0.0010814666748046875,
0.0220489501953125,
-0.044036865234375,
0.0153656005859375,
0.026153564453125,
0.019378662109375,
-0.013214111328125,
-0.048126220703125,
-0.013763427734375,
0.0235595703125,
-0.02313232421875,
-0.0604248046875,
0.04241943359375,
0.00835418701171875,
0.0305938720703125,
0.003047943115234375,
0.011199951171875,
0.06494140625,
-0.032257080078125,
0.06256103515625,
0.0151214599609375,
-0.068359375,
0.045867919921875,
-0.02459716796875,
0.01337432861328125,
0.0396728515625,
0.04278564453125,
-0.0292510986328125,
-0.0396728515625,
-0.047821044921875,
-0.08172607421875,
0.05694580078125,
0.024322509765625,
0.01238250732421875,
-0.0160675048828125,
0.0063629150390625,
-0.006824493408203125,
0.0196685791015625,
-0.04949951171875,
-0.03839111328125,
-0.018707275390625,
-0.0001175999641418457,
0.0034542083740234375,
0.007335662841796875,
-0.004100799560546875,
-0.042236328125,
0.06689453125,
-0.0075531005859375,
0.025390625,
0.0233917236328125,
0.00766754150390625,
0.01375579833984375,
0.01025390625,
0.0213775634765625,
0.0295257568359375,
-0.03717041015625,
-0.0245513916015625,
0.01505279541015625,
-0.0305938720703125,
0.006744384765625,
0.005218505859375,
-0.0170135498046875,
0.0168914794921875,
0.01788330078125,
0.07525634765625,
0.0006084442138671875,
-0.048187255859375,
0.026947021484375,
-0.0086517333984375,
-0.0242767333984375,
-0.0236968994140625,
0.007659912109375,
0.0094146728515625,
0.031951904296875,
0.022796630859375,
0.0101165771484375,
-0.006866455078125,
-0.051422119140625,
0.0026397705078125,
0.03460693359375,
-0.026397705078125,
-0.0240478515625,
0.0633544921875,
0.0097503662109375,
-0.014251708984375,
0.047760009765625,
-0.02325439453125,
-0.058502197265625,
0.0645751953125,
0.0264739990234375,
0.066162109375,
0.0019683837890625,
0.023345947265625,
0.055816650390625,
0.01454925537109375,
-0.01727294921875,
0.0304718017578125,
0.0090179443359375,
-0.052581787109375,
-0.030914306640625,
-0.0540771484375,
-0.01727294921875,
0.01261138916015625,
-0.055572509765625,
0.03533935546875,
-0.0309295654296875,
-0.022003173828125,
-0.00002372264862060547,
0.00618743896484375,
-0.0711669921875,
0.02032470703125,
-0.011932373046875,
0.05718994140625,
-0.076904296875,
0.0595703125,
0.0526123046875,
-0.047760009765625,
-0.0787353515625,
-0.0198822021484375,
-0.01971435546875,
-0.060638427734375,
0.0545654296875,
0.00522613525390625,
0.006084442138671875,
0.002666473388671875,
-0.0271148681640625,
-0.059326171875,
0.09014892578125,
0.029510498046875,
-0.0306396484375,
-0.006999969482421875,
0.0022678375244140625,
0.053314208984375,
-0.02215576171875,
0.0404052734375,
0.04876708984375,
0.03851318359375,
0.0253753662109375,
-0.07232666015625,
0.0018949508666992188,
-0.02362060546875,
-0.0147857666015625,
0.0038299560546875,
-0.06689453125,
0.08135986328125,
-0.0232086181640625,
-0.00583648681640625,
-0.0022792816162109375,
0.043487548828125,
0.0308837890625,
0.010589599609375,
0.02880859375,
0.038818359375,
0.054718017578125,
-0.032073974609375,
0.083740234375,
-0.026214599609375,
0.05694580078125,
0.07232666015625,
0.0038928985595703125,
0.05438232421875,
0.04766845703125,
-0.034820556640625,
0.044708251953125,
0.0528564453125,
-0.018096923828125,
0.037200927734375,
0.0120391845703125,
-0.005802154541015625,
-0.0038928985595703125,
0.0012712478637695312,
-0.038543701171875,
0.041900634765625,
-0.0015697479248046875,
-0.0232086181640625,
-0.0020160675048828125,
-0.0173492431640625,
0.027435302734375,
-0.0146636962890625,
-0.00679779052734375,
0.03289794921875,
-0.01541900634765625,
-0.06597900390625,
0.07452392578125,
-0.00782012939453125,
0.068359375,
-0.049896240234375,
0.014678955078125,
-0.0209808349609375,
0.0084991455078125,
-0.006992340087890625,
-0.058197021484375,
0.0273590087890625,
0.01474761962890625,
-0.0223541259765625,
-0.031951904296875,
0.0169219970703125,
-0.035064697265625,
-0.050872802734375,
0.0179443359375,
0.036590576171875,
0.0216217041015625,
0.0034160614013671875,
-0.06658935546875,
0.010101318359375,
0.014984130859375,
-0.02227783203125,
0.016448974609375,
0.0187530517578125,
0.0164031982421875,
0.053558349609375,
0.051544189453125,
0.006317138671875,
0.01416015625,
-0.00516510009765625,
0.06927490234375,
-0.02142333984375,
-0.020111083984375,
-0.06353759765625,
0.07354736328125,
-0.0014333724975585938,
-0.039154052734375,
0.053955078125,
0.05450439453125,
0.07403564453125,
-0.0246429443359375,
0.05352783203125,
-0.034912109375,
0.0277099609375,
-0.02545166015625,
0.052459716796875,
-0.04705810546875,
0.012542724609375,
-0.02130126953125,
-0.06512451171875,
0.004390716552734375,
0.0570068359375,
-0.01090240478515625,
0.00406646728515625,
0.05615234375,
0.056549072265625,
0.0013895034790039062,
-0.0256805419921875,
-0.01355743408203125,
0.0179595947265625,
0.031494140625,
0.045501708984375,
0.03570556640625,
-0.06878662109375,
0.04974365234375,
-0.049896240234375,
-0.0058135986328125,
0.005825042724609375,
-0.064697265625,
-0.07391357421875,
-0.05322265625,
-0.0401611328125,
-0.04071044921875,
-0.01087188720703125,
0.058685302734375,
0.060760498046875,
-0.0670166015625,
-0.02001953125,
-0.010040283203125,
0.0100250244140625,
-0.0159149169921875,
-0.0223236083984375,
0.03155517578125,
-0.02484130859375,
-0.0928955078125,
0.0184173583984375,
-0.010772705078125,
0.01580810546875,
-0.00543212890625,
-0.0091400146484375,
-0.026214599609375,
-0.0108642578125,
0.03814697265625,
0.019989013671875,
-0.04278564453125,
-0.0281524658203125,
0.0216064453125,
0.00962066650390625,
0.016510009765625,
0.0266876220703125,
-0.063720703125,
0.033599853515625,
0.05035400390625,
0.016021728515625,
0.05712890625,
-0.0157012939453125,
0.03271484375,
-0.0595703125,
0.0345458984375,
0.0256195068359375,
0.033599853515625,
0.0193023681640625,
-0.0173187255859375,
0.04180908203125,
0.0206146240234375,
-0.0284881591796875,
-0.0711669921875,
-0.0010328292846679688,
-0.07379150390625,
-0.0157623291015625,
0.07696533203125,
-0.0238037109375,
-0.0164337158203125,
0.01377105712890625,
-0.01090240478515625,
0.050018310546875,
-0.040374755859375,
0.061370849609375,
0.06658935546875,
-0.004520416259765625,
0.01071929931640625,
-0.039154052734375,
0.03143310546875,
0.03619384765625,
-0.052490234375,
-0.0146942138671875,
0.01259613037109375,
0.03656005859375,
0.0082550048828125,
0.0234832763671875,
0.003780364990234375,
0.0035419464111328125,
-0.0134735107421875,
0.0019121170043945312,
-0.01215362548828125,
-0.00921630859375,
-0.004245758056640625,
-0.002956390380859375,
-0.0281524658203125,
-0.0318603515625
]
] |
huggingface/CodeBERTa-small-v1 | 2022-06-27T15:48:41.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"roberta",
"fill-mask",
"code",
"dataset:code_search_net",
"arxiv:1909.09436",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | huggingface | null | null | huggingface/CodeBERTa-small-v1 | 53 | 67,432 | transformers | 2022-03-02T23:29:05 | ---
language: code
thumbnail: https://cdn-media.huggingface.co/CodeBERTa/CodeBERTa.png
datasets:
- code_search_net
---
# CodeBERTa
CodeBERTa is a RoBERTa-like model trained on the [CodeSearchNet](https://github.blog/2019-09-26-introducing-the-codesearchnet-challenge/) dataset from GitHub.
Supported languages:
```shell
"go"
"java"
"javascript"
"php"
"python"
"ruby"
```
The **tokenizer** is a Byte-level BPE tokenizer trained on the corpus using Hugging Face `tokenizers`.
Because it is trained on a corpus of code (vs. natural language), it encodes the corpus efficiently (the sequences are between 33% and 50% shorter, compared to the same corpus tokenized by gpt2/roberta).
The (small) **model** is a 6-layer, 84M parameters, RoBERTa-like Transformer model – that’s the same number of layers & heads as DistilBERT – initialized from the default initialization settings and trained from scratch on the full corpus (~2M functions) for 5 epochs.
### Tensorboard for this training ⤵️
[](https://tensorboard.dev/experiment/irRI7jXGQlqmlxXS0I07ew/#scalars)
## Quick start: masked language modeling prediction
```python
PHP_CODE = """
public static <mask> set(string $key, $value) {
if (!in_array($key, self::$allowedKeys)) {
throw new \InvalidArgumentException('Invalid key given');
}
self::$storedValues[$key] = $value;
}
""".lstrip()
```
### Does the model know how to complete simple PHP code?
```python
from transformers import pipeline
fill_mask = pipeline(
"fill-mask",
model="huggingface/CodeBERTa-small-v1",
tokenizer="huggingface/CodeBERTa-small-v1"
)
fill_mask(PHP_CODE)
## Top 5 predictions:
#
' function' # prob 0.9999827146530151
'function' #
' void' #
' def' #
' final' #
```
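Under the hood, `fill-mask` scores every vocabulary token at the masked position and returns the top candidates by softmax probability. A toy sketch of that ranking step — the logits and mini-vocabulary below are made up for illustration, not taken from the model:

```python
import math

def top_k_predictions(logits, vocab, k=5):
    # Softmax over the logits at the masked position, then take the k
    # highest-probability tokens.
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(vocab, probs), key=lambda t: t[1], reverse=True)
    return ranked[:k]

vocab = [" function", "function", " void", " def", " final", " var"]
logits = [9.8, 4.1, 3.7, 3.2, 2.9, 1.0]  # hypothetical scores
print(top_k_predictions(logits, vocab))
```

The sharp gap between the top logit and the rest mirrors the near-1.0 probability the real model assigns to ` function` above.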
### Yes! That was easy 🎉 What about some Python (warning: this is going to be meta)
```python
PYTHON_CODE = """
def pipeline(
task: str,
model: Optional = None,
framework: Optional[<mask>] = None,
**kwargs
) -> Pipeline:
pass
""".lstrip()
```
Results:
```python
'framework', 'Framework', ' framework', 'None', 'str'
```
> This program can auto-complete itself! 😱
### Just for fun, let's try to mask natural language (not code):
```python
fill_mask("My name is <mask>.")
# {'sequence': '<s> My name is undefined.</s>', 'score': 0.2548016905784607, 'token': 3353}
# {'sequence': '<s> My name is required.</s>', 'score': 0.07290805131196976, 'token': 2371}
# {'sequence': '<s> My name is null.</s>', 'score': 0.06323737651109695, 'token': 469}
# {'sequence': '<s> My name is name.</s>', 'score': 0.021919190883636475, 'token': 652}
# {'sequence': '<s> My name is disabled.</s>', 'score': 0.019681859761476517, 'token': 7434}
```
This (kind of) works because code contains comments (which contain natural language).
Of course, the most frequent name for a Computer scientist must be undefined 🤓.
## Downstream task: [programming language identification](https://huggingface.co/huggingface/CodeBERTa-language-id)
See the model card for **[`huggingface/CodeBERTa-language-id`](https://huggingface.co/huggingface/CodeBERTa-language-id)** 🤯.
<br>
## CodeSearchNet citation
<details>
```bibtex
@article{husain_codesearchnet_2019,
title = {{CodeSearchNet} {Challenge}: {Evaluating} the {State} of {Semantic} {Code} {Search}},
shorttitle = {{CodeSearchNet} {Challenge}},
url = {http://arxiv.org/abs/1909.09436},
urldate = {2020-03-12},
journal = {arXiv:1909.09436 [cs, stat]},
author = {Husain, Hamel and Wu, Ho-Hsiang and Gazit, Tiferet and Allamanis, Miltiadis and Brockschmidt, Marc},
month = sep,
year = {2019},
note = {arXiv: 1909.09436},
}
```
</details>
| 3,696 | [
[
-0.0289764404296875,
-0.041412353515625,
0.0268707275390625,
0.0201263427734375,
-0.0147857666015625,
0.017852783203125,
-0.033233642578125,
-0.0235137939453125,
0.031494140625,
0.022674560546875,
-0.034698486328125,
-0.0576171875,
-0.055816650390625,
0.01018524169921875,
-0.03814697265625,
0.087158203125,
0.00988006591796875,
-0.00027441978454589844,
0.0067596435546875,
0.0031452178955078125,
-0.0254974365234375,
-0.0458984375,
-0.04217529296875,
-0.0117340087890625,
0.0318603515625,
-0.002384185791015625,
0.051300048828125,
0.058746337890625,
0.03131103515625,
0.025665283203125,
-0.012664794921875,
-0.0008826255798339844,
-0.021270751953125,
-0.0139617919921875,
0.0030384063720703125,
-0.048095703125,
-0.0191192626953125,
-0.0099639892578125,
0.025421142578125,
0.047210693359375,
0.0032062530517578125,
0.03240966796875,
-0.01525115966796875,
0.044158935546875,
-0.042633056640625,
0.03350830078125,
-0.057037353515625,
0.0010271072387695312,
-0.0175323486328125,
-0.0050048828125,
-0.02923583984375,
-0.03729248046875,
0.0034332275390625,
-0.022674560546875,
0.0222930908203125,
-0.00453948974609375,
0.08203125,
0.02764892578125,
-0.010284423828125,
-0.0175018310546875,
-0.038055419921875,
0.0570068359375,
-0.05877685546875,
0.007137298583984375,
0.029571533203125,
0.0074310302734375,
-0.0134735107421875,
-0.066650390625,
-0.047882080078125,
0.0009083747863769531,
-0.01476287841796875,
0.0030612945556640625,
-0.02801513671875,
-0.001056671142578125,
0.025665283203125,
0.0193023681640625,
-0.062255859375,
-0.01641845703125,
-0.051177978515625,
-0.034942626953125,
0.04180908203125,
-0.002162933349609375,
0.033233642578125,
-0.03729248046875,
-0.0230865478515625,
-0.008453369140625,
-0.026458740234375,
0.02337646484375,
0.024139404296875,
0.032470703125,
-0.0288848876953125,
0.030303955078125,
-0.018829345703125,
0.061981201171875,
-0.00563812255859375,
-0.0010938644409179688,
0.048309326171875,
-0.0173187255859375,
-0.0204315185546875,
-0.009674072265625,
0.08697509765625,
0.0033092498779296875,
0.0291290283203125,
-0.0134429931640625,
-0.006374359130859375,
0.0306243896484375,
0.0005860328674316406,
-0.057586669921875,
-0.03497314453125,
0.0300750732421875,
-0.032135009765625,
-0.039031982421875,
0.0269622802734375,
-0.061065673828125,
-0.004974365234375,
-0.0167083740234375,
0.032867431640625,
-0.0293121337890625,
-0.00969696044921875,
0.0194091796875,
-0.005279541015625,
0.013519287109375,
-0.0011091232299804688,
-0.05511474609375,
0.0173187255859375,
0.052581787109375,
0.0711669921875,
-0.0036296844482421875,
-0.0202178955078125,
-0.03472900390625,
-0.03948974609375,
-0.015411376953125,
0.038055419921875,
-0.040802001953125,
-0.0084228515625,
-0.0045166015625,
0.0182342529296875,
-0.01593017578125,
-0.0258636474609375,
0.030670166015625,
-0.046356201171875,
0.023956298828125,
0.01099395751953125,
-0.0267791748046875,
-0.0164337158203125,
-0.0001550912857055664,
-0.05279541015625,
0.079345703125,
0.0069427490234375,
-0.052154541015625,
0.0079193115234375,
-0.05462646484375,
-0.01461029052734375,
-0.0003299713134765625,
-0.01212310791015625,
-0.0330810546875,
-0.019317626953125,
0.0094146728515625,
0.0260467529296875,
-0.016937255859375,
0.044219970703125,
-0.017120361328125,
-0.0313720703125,
0.03424072265625,
-0.002105712890625,
0.0916748046875,
0.03106689453125,
-0.048431396484375,
0.00803375244140625,
-0.066162109375,
0.0162200927734375,
0.018524169921875,
-0.0215911865234375,
0.00318145751953125,
-0.007404327392578125,
0.01123046875,
0.02532958984375,
0.0255279541015625,
-0.032196044921875,
0.0203704833984375,
-0.04534912109375,
0.053924560546875,
0.051513671875,
-0.0013103485107421875,
0.0251312255859375,
-0.011871337890625,
0.05426025390625,
-0.01184844970703125,
0.0157470703125,
-0.01776123046875,
-0.0517578125,
-0.0640869140625,
-0.032012939453125,
0.053558349609375,
0.0509033203125,
-0.051727294921875,
0.04510498046875,
-0.0291900634765625,
-0.0517578125,
-0.046112060546875,
0.004802703857421875,
0.05059814453125,
0.025390625,
0.03033447265625,
-0.0307464599609375,
-0.0736083984375,
-0.056854248046875,
-0.018218994140625,
-0.010284423828125,
-0.007587432861328125,
0.0148773193359375,
0.05023193359375,
-0.031768798828125,
0.07916259765625,
-0.060150146484375,
-0.019989013671875,
0.0006041526794433594,
0.01227569580078125,
0.048309326171875,
0.045867919921875,
0.04254150390625,
-0.06353759765625,
-0.043182373046875,
-0.02142333984375,
-0.046722412109375,
-0.0033283233642578125,
0.0021991729736328125,
-0.01261138916015625,
0.0191650390625,
0.026275634765625,
-0.04022216796875,
0.03399658203125,
0.0399169921875,
-0.03082275390625,
0.0245819091796875,
-0.00629425048828125,
-0.00104522705078125,
-0.07952880859375,
0.02557373046875,
-0.005809783935546875,
-0.00957489013671875,
-0.0509033203125,
0.01446533203125,
0.01407623291015625,
-0.01593017578125,
-0.0245513916015625,
0.029876708984375,
-0.031402587890625,
0.011810302734375,
-0.003116607666015625,
0.0218353271484375,
-0.0158843994140625,
0.0655517578125,
0.004398345947265625,
0.0538330078125,
0.056396484375,
-0.03424072265625,
0.0246429443359375,
0.0157470703125,
-0.0225372314453125,
-0.004055023193359375,
-0.0550537109375,
0.022857666015625,
-0.0017938613891601562,
0.015411376953125,
-0.0728759765625,
-0.0117645263671875,
0.02508544921875,
-0.06787109375,
0.00588226318359375,
-0.03265380859375,
-0.03863525390625,
-0.0364990234375,
-0.0294189453125,
0.033935546875,
0.051177978515625,
-0.03826904296875,
0.032318115234375,
0.032745361328125,
-0.0008349418640136719,
-0.050445556640625,
-0.05120849609375,
0.0208282470703125,
-0.012664794921875,
-0.0445556640625,
0.04083251953125,
-0.033721923828125,
-0.0011911392211914062,
-0.01352691650390625,
0.009368896484375,
-0.0165252685546875,
0.0045928955078125,
0.0198516845703125,
0.04510498046875,
-0.0035762786865234375,
0.0021762847900390625,
-0.0279541015625,
-0.0204925537109375,
0.01494598388671875,
-0.032745361328125,
0.0626220703125,
-0.02142333984375,
-0.006443023681640625,
-0.0172576904296875,
-0.002773284912109375,
0.035888671875,
-0.038604736328125,
0.05059814453125,
0.049591064453125,
-0.043304443359375,
-0.0028667449951171875,
-0.037322998046875,
-0.0025920867919921875,
-0.033172607421875,
0.0150909423828125,
-0.0201873779296875,
-0.061126708984375,
0.060943603515625,
0.020721435546875,
-0.019805908203125,
0.044464111328125,
0.041595458984375,
0.0303955078125,
0.07904052734375,
0.034393310546875,
-0.0296630859375,
0.032379150390625,
-0.04864501953125,
0.034149169921875,
-0.0467529296875,
-0.0209808349609375,
-0.054534912109375,
-0.007472991943359375,
-0.06964111328125,
-0.049530029296875,
0.01424407958984375,
0.0008754730224609375,
-0.014892578125,
0.06451416015625,
-0.0670166015625,
0.0187835693359375,
0.044403076171875,
0.0106964111328125,
0.00922393798828125,
-0.0129241943359375,
-0.00875091552734375,
-0.005100250244140625,
-0.034454345703125,
-0.0298004150390625,
0.09075927734375,
0.014556884765625,
0.048431396484375,
0.00634002685546875,
0.0731201171875,
0.0191650390625,
-0.0047760009765625,
-0.047637939453125,
0.0386962890625,
0.005218505859375,
-0.046234130859375,
-0.01148223876953125,
-0.03363037109375,
-0.07818603515625,
0.0120697021484375,
-0.007965087890625,
-0.07135009765625,
0.0176849365234375,
-0.005645751953125,
-0.0267791748046875,
0.0072479248046875,
-0.041748046875,
0.06396484375,
-0.00595855712890625,
-0.019439697265625,
0.00800323486328125,
-0.06304931640625,
0.0234527587890625,
-0.0027103424072265625,
0.03240966796875,
0.00513458251953125,
-0.0009851455688476562,
0.06591796875,
-0.03497314453125,
0.0611572265625,
-0.00887298583984375,
-0.0019369125366210938,
0.019561767578125,
0.006099700927734375,
0.0379638671875,
0.01079559326171875,
-0.0007243156433105469,
0.0262298583984375,
0.010223388671875,
-0.030487060546875,
-0.043609619140625,
0.059295654296875,
-0.053985595703125,
-0.0256195068359375,
-0.042449951171875,
-0.008453369140625,
0.01116943359375,
0.037506103515625,
0.0350341796875,
0.033477783203125,
0.006744384765625,
0.007297515869140625,
0.048431396484375,
-0.0122528076171875,
0.031890869140625,
0.036376953125,
-0.0200958251953125,
-0.041656494140625,
0.08013916015625,
0.00397491455078125,
-0.0008144378662109375,
0.03131103515625,
0.004638671875,
-0.006557464599609375,
-0.027923583984375,
-0.0231781005859375,
0.0208740234375,
-0.0562744140625,
-0.0372314453125,
-0.05914306640625,
-0.03387451171875,
-0.032012939453125,
-0.0029087066650390625,
-0.02447509765625,
-0.032440185546875,
-0.021942138671875,
-0.005153656005859375,
0.02947998046875,
0.035858154296875,
0.013580322265625,
-0.01129913330078125,
-0.039306640625,
0.0223846435546875,
0.0107574462890625,
0.038421630859375,
-0.0110931396484375,
-0.0401611328125,
-0.020904541015625,
0.0032958984375,
-0.000013113021850585938,
-0.0595703125,
0.04107666015625,
0.0016155242919921875,
0.031158447265625,
0.0103912353515625,
0.0105133056640625,
0.048797607421875,
-0.02313232421875,
0.06951904296875,
0.00737762451171875,
-0.085205078125,
0.052978515625,
-0.01348114013671875,
0.02362060546875,
0.038665771484375,
0.0218505859375,
-0.04217529296875,
-0.0426025390625,
-0.061065673828125,
-0.0723876953125,
0.055999755859375,
0.034454345703125,
0.01352691650390625,
-0.01471710205078125,
0.01554107666015625,
-0.01284027099609375,
0.0144500732421875,
-0.08099365234375,
-0.046600341796875,
-0.0165252685546875,
-0.0291595458984375,
0.0069427490234375,
-0.00522613525390625,
-0.0020580291748046875,
-0.0237884521484375,
0.055023193359375,
-0.021484375,
0.041351318359375,
0.0191650390625,
-0.0196990966796875,
0.0033397674560546875,
0.004337310791015625,
0.05609130859375,
0.041229248046875,
-0.01678466796875,
0.0083160400390625,
0.013763427734375,
-0.03021240234375,
-0.01378631591796875,
0.0162200927734375,
-0.01056671142578125,
0.012237548828125,
0.035858154296875,
0.053741455078125,
0.027801513671875,
-0.059844970703125,
0.051422119140625,
-0.011322021484375,
-0.017303466796875,
-0.049285888671875,
0.010589599609375,
0.00037932395935058594,
0.01224517822265625,
0.033416748046875,
0.02520751953125,
0.018402099609375,
-0.024444580078125,
0.008087158203125,
0.0242156982421875,
-0.033477783203125,
-0.0229644775390625,
0.05657958984375,
-0.0114593505859375,
-0.039886474609375,
0.01395416259765625,
-0.0298919677734375,
-0.06610107421875,
0.0672607421875,
0.0281829833984375,
0.073486328125,
0.01439666748046875,
0.00489044189453125,
0.055328369140625,
0.017791748046875,
-0.0017986297607421875,
0.02374267578125,
-0.004428863525390625,
-0.037322998046875,
-0.012054443359375,
-0.046112060546875,
0.002178192138671875,
0.0037021636962890625,
-0.0421142578125,
0.0347900390625,
-0.04107666015625,
-0.017059326171875,
0.00787353515625,
0.00751495361328125,
-0.05841064453125,
0.013153076171875,
0.01288604736328125,
0.06488037109375,
-0.052093505859375,
0.0675048828125,
0.041015625,
-0.059722900390625,
-0.063720703125,
-0.011474609375,
-0.0013494491577148438,
-0.060394287109375,
0.061309814453125,
0.009918212890625,
-0.01026153564453125,
0.002269744873046875,
-0.056243896484375,
-0.087158203125,
0.09130859375,
0.020782470703125,
-0.022369384765625,
-0.00909423828125,
-0.0037250518798828125,
0.054107666015625,
-0.0294189453125,
0.035675048828125,
0.034637451171875,
0.03375244140625,
-0.01934814453125,
-0.072509765625,
0.0217437744140625,
-0.0472412109375,
0.02093505859375,
0.00881195068359375,
-0.059844970703125,
0.06964111328125,
-0.015838623046875,
-0.01873779296875,
0.0218963623046875,
0.04180908203125,
0.01019287109375,
0.00943756103515625,
0.0291900634765625,
0.0318603515625,
0.0511474609375,
-0.0178070068359375,
0.0562744140625,
-0.054718017578125,
0.067138671875,
0.058013916015625,
0.019927978515625,
0.055328369140625,
0.0286712646484375,
-0.0379638671875,
0.06585693359375,
0.039093017578125,
-0.0272216796875,
0.03326416015625,
0.034332275390625,
-0.005279541015625,
0.0007319450378417969,
0.01457977294921875,
-0.028839111328125,
0.04510498046875,
0.0003268718719482422,
-0.040496826171875,
0.01190185546875,
-0.0061187744140625,
0.0258026123046875,
0.0226898193359375,
0.01018524169921875,
0.055206298828125,
-0.009490966796875,
-0.06781005859375,
0.0699462890625,
0.01465606689453125,
0.07421875,
-0.04400634765625,
-0.003948211669921875,
-0.01495361328125,
0.00821685791015625,
-0.0330810546875,
-0.03125,
-0.0027675628662109375,
0.01308441162109375,
-0.022003173828125,
-0.0008196830749511719,
0.036865234375,
-0.0391845703125,
-0.03997802734375,
0.037841796875,
0.00914764404296875,
0.0085296630859375,
0.0054779052734375,
-0.055999755859375,
0.01248931884765625,
0.013397216796875,
-0.01357269287109375,
0.0176849365234375,
0.0124053955078125,
0.0145111083984375,
0.0609130859375,
0.052337646484375,
0.0070343017578125,
0.01166534423828125,
-0.0073699951171875,
0.06890869140625,
-0.06048583984375,
-0.03375244140625,
-0.066650390625,
0.039337158203125,
-0.0021991729736328125,
-0.0234527587890625,
0.0445556640625,
0.0596923828125,
0.066650390625,
-0.0180511474609375,
0.06243896484375,
-0.0286407470703125,
0.03082275390625,
-0.039703369140625,
0.05084228515625,
-0.03411865234375,
0.0174102783203125,
-0.0394287109375,
-0.055450439453125,
-0.0264434814453125,
0.047119140625,
-0.01102447509765625,
0.0129241943359375,
0.048126220703125,
0.08074951171875,
0.0088348388671875,
-0.0198516845703125,
0.0035305023193359375,
0.0029392242431640625,
0.02667236328125,
0.0572509765625,
0.0209808349609375,
-0.05859375,
0.0570068359375,
-0.041595458984375,
-0.02020263671875,
-0.018646240234375,
-0.054351806640625,
-0.0635986328125,
-0.0614013671875,
-0.031494140625,
-0.041351318359375,
-0.0037670135498046875,
0.08172607421875,
0.0533447265625,
-0.07562255859375,
-0.0198516845703125,
-0.0160980224609375,
0.01464080810546875,
-0.01311492919921875,
-0.0265350341796875,
0.034332275390625,
-0.02838134765625,
-0.07073974609375,
0.019439697265625,
0.005199432373046875,
-0.0178985595703125,
-0.01349639892578125,
-0.006786346435546875,
-0.02557373046875,
-0.005878448486328125,
0.025177001953125,
0.031463623046875,
-0.06884765625,
-0.0216217041015625,
0.0164642333984375,
-0.0300140380859375,
0.00635528564453125,
0.04388427734375,
-0.07476806640625,
0.049346923828125,
0.03887939453125,
0.03057861328125,
0.0401611328125,
-0.0151824951171875,
0.029388427734375,
-0.0408935546875,
0.0143890380859375,
0.00707244873046875,
0.03826904296875,
0.01476287841796875,
-0.0361328125,
0.0521240234375,
0.0266571044921875,
-0.038238525390625,
-0.0633544921875,
0.00315093994140625,
-0.07562255859375,
-0.00811004638671875,
0.078125,
-0.01401519775390625,
-0.01377105712890625,
0.00301361083984375,
-0.02386474609375,
0.031402587890625,
-0.03271484375,
0.06524658203125,
0.035980224609375,
0.006191253662109375,
-0.0020198822021484375,
-0.0396728515625,
0.0455322265625,
0.0269775390625,
-0.0379638671875,
-0.01091766357421875,
0.0070953369140625,
0.035614013671875,
0.041229248046875,
0.04730224609375,
-0.01385498046875,
0.0225830078125,
-0.0174407958984375,
0.0230255126953125,
-0.0008521080017089844,
-0.00415802001953125,
-0.040740966796875,
0.005474090576171875,
0.0023212432861328125,
-0.01166534423828125
]
] |
cardiffnlp/tweet-topic-21-multi | 2023-05-28T04:56:09.000Z | [
"transformers",
"pytorch",
"tf",
"roberta",
"text-classification",
"en",
"dataset:cardiffnlp/tweet_topic_multi",
"arxiv:2209.09824",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cardiffnlp | null | null | cardiffnlp/tweet-topic-21-multi | 42 | 67,231 | transformers | 2022-06-06T14:52:42 | ---
language: en
widget:
- text: It is great to see athletes promoting awareness for climate change.
datasets:
- cardiffnlp/tweet_topic_multi
license: mit
metrics:
- f1
- accuracy
pipeline_tag: text-classification
---
# tweet-topic-21-multi
This model is based on a [TimeLMs](https://github.com/cardiffnlp/timelms) language model trained on ~124M tweets from January 2018 to December 2021 (see [here](https://huggingface.co/cardiffnlp/twitter-roberta-base-2021-124m)), and fine-tuned for multi-label topic classification on a corpus of 11,267 [tweets](https://huggingface.co/datasets/cardiffnlp/tweet_topic_multi). The model is suitable for English text.
- Reference Paper: [TweetTopic](https://arxiv.org/abs/2209.09824) (COLING 2022).
<b>Labels</b>:
| <span style="font-weight:normal">0: arts_&_culture</span> | <span style="font-weight:normal">5: fashion_&_style</span> | <span style="font-weight:normal">10: learning_&_educational</span> | <span style="font-weight:normal">15: science_&_technology</span> |
|-----------------------------|---------------------|----------------------------|--------------------------|
| 1: business_&_entrepreneurs | 6: film_tv_&_video | 11: music | 16: sports |
| 2: celebrity_&_pop_culture | 7: fitness_&_health | 12: news_&_social_concern | 17: travel_&_adventure |
| 3: diaries_&_daily_life | 8: food_&_dining | 13: other_hobbies | 18: youth_&_student_life |
| 4: family | 9: gaming | 14: relationships | |
## Full classification example
```python
from transformers import AutoModelForSequenceClassification, TFAutoModelForSequenceClassification
from transformers import AutoTokenizer
import numpy as np
from scipy.special import expit
MODEL = "cardiffnlp/tweet-topic-21-multi"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
# PT
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
class_mapping = model.config.id2label
text = "It is great to see athletes promoting awareness for climate change."
tokens = tokenizer(text, return_tensors='pt')
output = model(**tokens)
scores = output[0][0].detach().numpy()
scores = expit(scores)
predictions = (scores >= 0.5) * 1
# TF
#tf_model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
#class_mapping = tf_model.config.id2label
#text = "It is great to see athletes promoting awareness for climate change."
#tokens = tokenizer(text, return_tensors='tf')
#output = tf_model(**tokens)
#scores = output[0][0]
#scores = expit(scores)
#predictions = (scores >= 0.5) * 1
# Map to classes
for i in range(len(predictions)):
if predictions[i]:
print(class_mapping[i])
```
Output:
```
news_&_social_concern
sports
```
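The multi-label decision step in the example above can be illustrated on its own: each raw logit is passed through a sigmoid and thresholded at 0.5 independently of the other labels. A minimal sketch with made-up logits (not real model output):

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

# Hypothetical raw logits for three labels (illustrative values only)
logits = np.array([2.0, -1.0, 0.3])
scores = expit(logits)            # sigmoid -> independent per-label probabilities
predictions = (scores >= 0.5) * 1 # threshold each label separately
print(predictions)                # -> [1 0 1]
```

Because each label is thresholded independently, a tweet can receive zero, one, or several topics, which is exactly what the full example prints.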
### BibTeX entry and citation info
Please cite the [reference paper](https://aclanthology.org/2022.coling-1.299/) if you use this model.
```bibtex
@inproceedings{antypas-etal-2022-twitter,
title = "{T}witter Topic Classification",
author = "Antypas, Dimosthenis and
Ushio, Asahi and
Camacho-Collados, Jose and
Silva, Vitor and
Neves, Leonardo and
Barbieri, Francesco",
booktitle = "Proceedings of the 29th International Conference on Computational Linguistics",
month = oct,
year = "2022",
address = "Gyeongju, Republic of Korea",
publisher = "International Committee on Computational Linguistics",
url = "https://aclanthology.org/2022.coling-1.299",
pages = "3386--3400"
}
``` | 3,526 | [
[
-0.027740478515625,
-0.04052734375,
0.0085906982421875,
0.0192718505859375,
-0.0169677734375,
0.0242767333984375,
-0.0166168212890625,
-0.0207977294921875,
0.0229034423828125,
0.0052490234375,
-0.043914794921875,
-0.047210693359375,
-0.0557861328125,
-0.00640869140625,
-0.0244140625,
0.0665283203125,
-0.004055023193359375,
-0.0043487548828125,
0.0188140869140625,
-0.0102081298828125,
-0.0190277099609375,
-0.0372314453125,
-0.046722412109375,
-0.02362060546875,
0.021759033203125,
0.02288818359375,
0.0235748291015625,
0.04022216796875,
0.0241546630859375,
0.0274200439453125,
-0.0017337799072265625,
0.0004329681396484375,
-0.0194091796875,
0.00046563148498535156,
0.002490997314453125,
-0.0047149658203125,
-0.025543212890625,
0.01383209228515625,
0.048004150390625,
0.040191650390625,
0.010284423828125,
0.026702880859375,
0.0128631591796875,
0.0211181640625,
-0.01806640625,
0.016204833984375,
-0.02459716796875,
-0.002330780029296875,
-0.01523590087890625,
-0.022705078125,
-0.0204315185546875,
-0.01496124267578125,
0.0073394775390625,
-0.032379150390625,
0.0312042236328125,
-0.007579803466796875,
0.095947265625,
0.01380157470703125,
-0.031982421875,
-0.00731658935546875,
-0.0323486328125,
0.0716552734375,
-0.0489501953125,
0.0237274169921875,
0.03167724609375,
0.003704071044921875,
0.008636474609375,
-0.04400634765625,
-0.0406494140625,
0.0047607421875,
-0.00632476806640625,
0.01392364501953125,
-0.0086669921875,
-0.0192718505859375,
0.004940032958984375,
0.01422882080078125,
-0.050079345703125,
-0.006061553955078125,
-0.038299560546875,
-0.0016088485717773438,
0.04876708984375,
0.00667572021484375,
0.01708984375,
-0.0238494873046875,
-0.01552581787109375,
-0.01235198974609375,
-0.01049041748046875,
0.0012598037719726562,
0.0096893310546875,
0.02740478515625,
-0.036590576171875,
0.0443115234375,
-0.01496124267578125,
0.0455322265625,
0.006381988525390625,
-0.0221099853515625,
0.04986572265625,
-0.032562255859375,
-0.0019121170043945312,
-0.01082611083984375,
0.079345703125,
0.03143310546875,
0.023345947265625,
-0.01372528076171875,
0.0029754638671875,
0.006103515625,
-0.013916015625,
-0.06732177734375,
-0.017852783203125,
0.022796630859375,
-0.036285400390625,
-0.03515625,
0.002910614013671875,
-0.0689697265625,
0.0017671585083007812,
-0.01192474365234375,
0.040557861328125,
-0.039581298828125,
-0.030242919921875,
-0.003032684326171875,
-0.018280029296875,
0.00988006591796875,
0.003444671630859375,
-0.055816650390625,
-0.0006818771362304688,
0.029510498046875,
0.06365966796875,
0.0012254714965820312,
-0.04376220703125,
-0.006320953369140625,
0.0212554931640625,
-0.0240478515625,
0.050262451171875,
-0.039520263671875,
-0.0166473388671875,
-0.0121307373046875,
-0.00044345855712890625,
-0.0262451171875,
-0.003337860107421875,
0.02581787109375,
-0.016021728515625,
0.04180908203125,
-0.006542205810546875,
-0.0413818359375,
0.00188446044921875,
0.01763916015625,
-0.0266265869140625,
0.0880126953125,
0.01070404052734375,
-0.0679931640625,
0.036376953125,
-0.0716552734375,
-0.0257415771484375,
-0.01140594482421875,
-0.0006093978881835938,
-0.038330078125,
-0.01548004150390625,
0.0198516845703125,
0.037139892578125,
-0.01549530029296875,
0.04180908203125,
-0.0302734375,
-0.01392364501953125,
0.009521484375,
-0.0245819091796875,
0.07489013671875,
0.019195556640625,
-0.035980224609375,
0.015350341796875,
-0.049407958984375,
0.0013341903686523438,
0.0148773193359375,
-0.0197296142578125,
-0.01076507568359375,
-0.0182342529296875,
0.0099945068359375,
0.04217529296875,
0.010833740234375,
-0.040069580078125,
0.0232086181640625,
-0.017059326171875,
0.05029296875,
0.050262451171875,
0.008697509765625,
0.0282440185546875,
-0.0478515625,
0.0226287841796875,
0.002620697021484375,
0.01003265380859375,
-0.004150390625,
-0.049468994140625,
-0.0521240234375,
-0.0226287841796875,
0.01490020751953125,
0.02947998046875,
-0.05596923828125,
0.038299560546875,
-0.029937744140625,
-0.064453125,
-0.036285400390625,
-0.0191650390625,
0.01116180419921875,
0.04071044921875,
0.042510986328125,
0.018951416015625,
-0.04736328125,
-0.038665771484375,
-0.021881103515625,
-0.029449462890625,
0.023223876953125,
0.0192718505859375,
0.058685302734375,
-0.01287841796875,
0.05828857421875,
-0.0438232421875,
-0.016937255859375,
-0.01454925537109375,
0.0240478515625,
0.028228759765625,
0.0546875,
0.0601806640625,
-0.06243896484375,
-0.05255126953125,
-0.030792236328125,
-0.056121826171875,
-0.007904052734375,
0.01274871826171875,
-0.0140533447265625,
0.032012939453125,
0.0232391357421875,
-0.04119873046875,
0.032867431640625,
0.02484130859375,
-0.053558349609375,
0.03173828125,
0.005405426025390625,
0.0218963623046875,
-0.1036376953125,
0.0036067962646484375,
0.0163421630859375,
0.0118865966796875,
-0.053558349609375,
-0.0146331787109375,
-0.00823974609375,
0.00040030479431152344,
-0.028900146484375,
0.06890869140625,
-0.0161285400390625,
0.022796630859375,
-0.00862884521484375,
-0.01422119140625,
-0.0018606185913085938,
0.037506103515625,
0.0141143798828125,
0.042205810546875,
0.050048828125,
-0.050079345703125,
0.0198516845703125,
0.0167083740234375,
-0.01090240478515625,
0.0188751220703125,
-0.035919189453125,
-0.01020050048828125,
-0.001445770263671875,
0.021392822265625,
-0.08856201171875,
-0.0233306884765625,
0.028564453125,
-0.06146240234375,
0.028961181640625,
-0.00322723388671875,
-0.04620361328125,
-0.05206298828125,
-0.028778076171875,
0.0300140380859375,
0.024810791015625,
-0.033447265625,
0.042144775390625,
0.035858154296875,
0.007328033447265625,
-0.050628662109375,
-0.063720703125,
-0.004154205322265625,
-0.036773681640625,
-0.0435791015625,
0.03155517578125,
-0.021514892578125,
0.00036072731018066406,
0.01739501953125,
-0.00040841102600097656,
-0.013031005859375,
0.017364501953125,
0.00893402099609375,
0.0260467529296875,
-0.011260986328125,
0.0179595947265625,
-0.0177459716796875,
0.00032019615173339844,
-0.00682830810546875,
-0.0299224853515625,
0.0726318359375,
-0.019287109375,
0.0012083053588867188,
-0.036834716796875,
0.0168304443359375,
0.032501220703125,
-0.0175018310546875,
0.07305908203125,
0.0718994140625,
-0.034088134765625,
0.00157928466796875,
-0.0408935546875,
-0.0159149169921875,
-0.034149169921875,
0.041351318359375,
-0.0394287109375,
-0.05908203125,
0.042816162109375,
0.02423095703125,
0.005809783935546875,
0.07537841796875,
0.0458984375,
-0.00821685791015625,
0.058807373046875,
0.034576416015625,
-0.002834320068359375,
0.044219970703125,
-0.060882568359375,
0.0032958984375,
-0.03826904296875,
-0.041961669921875,
-0.04962158203125,
-0.012115478515625,
-0.07122802734375,
-0.0227203369140625,
0.0024814605712890625,
-0.00516510009765625,
-0.056854248046875,
0.034393310546875,
-0.053070068359375,
0.00974273681640625,
0.04986572265625,
0.00534820556640625,
-0.00884246826171875,
0.0019817352294921875,
-0.006877899169921875,
-0.01404571533203125,
-0.055999755859375,
-0.0290069580078125,
0.0771484375,
0.025360107421875,
0.03662109375,
0.0177764892578125,
0.07000732421875,
0.0019378662109375,
0.03790283203125,
-0.0440673828125,
0.037628173828125,
-0.03369140625,
-0.07159423828125,
-0.031982421875,
-0.043609619140625,
-0.064453125,
0.0181732177734375,
-0.01763916015625,
-0.06683349609375,
0.01473236083984375,
-0.01244354248046875,
-0.00914764404296875,
0.04315185546875,
-0.04815673828125,
0.06866455078125,
-0.0119476318359375,
-0.03302001953125,
0.00917816162109375,
-0.037841796875,
0.0198974609375,
-0.01496124267578125,
0.0299224853515625,
-0.0216217041015625,
-0.004184722900390625,
0.09259033203125,
-0.0216522216796875,
0.06292724609375,
-0.0091705322265625,
0.0192108154296875,
0.025543212890625,
-0.017486572265625,
-0.0011186599731445312,
0.00075531005859375,
-0.01220703125,
0.01082611083984375,
-0.0010404586791992188,
-0.035003662109375,
-0.00791168212890625,
0.052032470703125,
-0.08697509765625,
-0.045166015625,
-0.0618896484375,
-0.036529541015625,
-0.0118255615234375,
0.031768798828125,
0.044158935546875,
0.0377197265625,
0.0031452178955078125,
0.017974853515625,
0.0230255126953125,
-0.0173187255859375,
0.0716552734375,
0.032562255859375,
-0.0127716064453125,
-0.040191650390625,
0.07122802734375,
0.024261474609375,
0.0009984970092773438,
0.040374755859375,
0.0340576171875,
-0.03082275390625,
-0.0198516845703125,
-0.00594329833984375,
0.0369873046875,
-0.045135498046875,
-0.0241241455078125,
-0.06561279296875,
-0.0352783203125,
-0.066162109375,
-0.0027618408203125,
-0.02105712890625,
-0.044677734375,
-0.042236328125,
-0.0064239501953125,
0.030487060546875,
0.053070068359375,
-0.02105712890625,
0.0258026123046875,
-0.04144287109375,
0.02093505859375,
0.006908416748046875,
0.0247039794921875,
0.00421905517578125,
-0.06585693359375,
-0.0119171142578125,
-0.00708770751953125,
-0.04083251953125,
-0.0618896484375,
0.049407958984375,
0.0245819091796875,
0.05218505859375,
0.024749755859375,
0.0083160400390625,
0.046905517578125,
-0.019195556640625,
0.06365966796875,
0.0251922607421875,
-0.07171630859375,
0.04278564453125,
-0.0287322998046875,
0.0270538330078125,
0.025146484375,
0.032623291015625,
-0.045166015625,
-0.039398193359375,
-0.07275390625,
-0.07110595703125,
0.073974609375,
0.01383209228515625,
0.01421356201171875,
-0.0108795166015625,
0.0007929801940917969,
-0.0035991668701171875,
0.020751953125,
-0.06878662109375,
-0.0576171875,
-0.044464111328125,
-0.0254974365234375,
-0.0240020751953125,
-0.015777587890625,
0.0034885406494140625,
-0.03692626953125,
0.073974609375,
0.002490997314453125,
0.040557861328125,
-0.0055389404296875,
0.01303863525390625,
-0.002716064453125,
0.01129150390625,
0.048095703125,
0.052490234375,
-0.033660888671875,
0.0096588134765625,
0.0027484893798828125,
-0.035797119140625,
0.0164794921875,
0.022705078125,
-0.0018329620361328125,
0.021026611328125,
0.04754638671875,
0.058013916015625,
0.004055023193359375,
0.00042176246643066406,
0.0294189453125,
-0.00940704345703125,
-0.0284576416015625,
-0.0222625732421875,
-0.007476806640625,
0.00493621826171875,
0.011016845703125,
0.050384521484375,
0.0118865966796875,
-0.007747650146484375,
-0.0430908203125,
0.0298919677734375,
0.0083770751953125,
-0.019317626953125,
-0.018341064453125,
0.062042236328125,
-0.0038909912109375,
-0.018707275390625,
0.0226898193359375,
-0.016571044921875,
-0.06561279296875,
0.055938720703125,
0.027740478515625,
0.06744384765625,
-0.0269927978515625,
0.028564453125,
0.061798095703125,
0.017181396484375,
-0.007678985595703125,
0.0290985107421875,
0.0128936767578125,
-0.051849365234375,
-0.0232391357421875,
-0.04736328125,
-0.0037841796875,
0.0194854736328125,
-0.02899169921875,
0.024749755859375,
-0.0179595947265625,
-0.0280914306640625,
0.022064208984375,
0.0227508544921875,
-0.0517578125,
0.0257415771484375,
0.006488800048828125,
0.064697265625,
-0.07568359375,
0.07611083984375,
0.0599365234375,
-0.042510986328125,
-0.07537841796875,
0.009368896484375,
-0.0160064697265625,
-0.04290771484375,
0.0711669921875,
0.01861572265625,
0.02197265625,
0.006580352783203125,
-0.043365478515625,
-0.0855712890625,
0.07855224609375,
0.0054931640625,
-0.006908416748046875,
-0.015655517578125,
0.01493072509765625,
0.039306640625,
-0.0275726318359375,
0.0341796875,
0.0244903564453125,
0.038604736328125,
0.023223876953125,
-0.06573486328125,
-0.00720977783203125,
-0.02777099609375,
-0.0178070068359375,
0.00920867919921875,
-0.07476806640625,
0.0831298828125,
-0.0109100341796875,
0.0044097900390625,
-0.008270263671875,
0.0533447265625,
0.04315185546875,
0.0364990234375,
0.02496337890625,
0.0389404296875,
0.050537109375,
-0.034515380859375,
0.0653076171875,
-0.0323486328125,
0.0618896484375,
0.0706787109375,
0.0307769775390625,
0.043487548828125,
0.034454345703125,
-0.023223876953125,
0.0192108154296875,
0.07403564453125,
-0.013946533203125,
0.056243896484375,
0.005054473876953125,
0.0038089752197265625,
-0.017669677734375,
-0.0143890380859375,
-0.041656494140625,
0.03173828125,
0.0236053466796875,
-0.023101806640625,
-0.013641357421875,
0.007350921630859375,
0.023193359375,
-0.0165863037109375,
-0.01910400390625,
0.038330078125,
0.01058197021484375,
-0.05157470703125,
0.053009033203125,
0.005832672119140625,
0.06640625,
-0.028656005859375,
0.01001739501953125,
-0.00643157958984375,
0.016998291015625,
-0.0127716064453125,
-0.07586669921875,
0.0103912353515625,
-0.0033397674560546875,
-0.005069732666015625,
-0.020751953125,
0.0401611328125,
-0.044281005859375,
-0.047576904296875,
0.032440185546875,
0.03009033203125,
0.01145172119140625,
0.01904296875,
-0.07733154296875,
0.00006330013275146484,
0.007415771484375,
-0.036376953125,
-0.0023517608642578125,
0.040771484375,
0.00423431396484375,
0.050994873046875,
0.029205322265625,
-0.0020999908447265625,
0.0233612060546875,
0.0003948211669921875,
0.06280517578125,
-0.046295166015625,
-0.037139892578125,
-0.0740966796875,
0.0288848876953125,
-0.006221771240234375,
-0.040374755859375,
0.058197021484375,
0.05694580078125,
0.054840087890625,
-0.0006070137023925781,
0.062286376953125,
-0.0196533203125,
0.03985595703125,
-0.015167236328125,
0.054473876953125,
-0.0601806640625,
0.0029239654541015625,
-0.022369384765625,
-0.055450439453125,
-0.0225830078125,
0.043701171875,
-0.037200927734375,
0.029632568359375,
0.052215576171875,
0.03070068359375,
0.004421234130859375,
-0.00557708740234375,
0.00807952880859375,
0.0214996337890625,
0.031951904296875,
0.041748046875,
0.052154541015625,
-0.04638671875,
0.04473876953125,
-0.056549072265625,
-0.009246826171875,
-0.0232696533203125,
-0.050079345703125,
-0.08428955078125,
-0.068115234375,
-0.039459228515625,
-0.056182861328125,
-0.001880645751953125,
0.073974609375,
0.049102783203125,
-0.0740966796875,
-0.0203857421875,
0.0112152099609375,
0.00640869140625,
-0.0014896392822265625,
-0.02191162109375,
0.058624267578125,
-0.01983642578125,
-0.060546875,
-0.004604339599609375,
-0.0173187255859375,
0.022796630859375,
0.01129913330078125,
-0.00583648681640625,
-0.05023193359375,
-0.006153106689453125,
0.0242767333984375,
0.005214691162109375,
-0.042144775390625,
-0.0206146240234375,
0.0084381103515625,
-0.02569580078125,
0.004566192626953125,
0.01947021484375,
-0.03173828125,
0.00958251953125,
0.03875732421875,
0.0200653076171875,
0.060546875,
-0.0097808837890625,
0.006511688232421875,
-0.0455322265625,
0.01425933837890625,
0.014617919921875,
0.0266571044921875,
0.026580810546875,
-0.004657745361328125,
0.054840087890625,
0.040985107421875,
-0.0338134765625,
-0.067138671875,
-0.0201873779296875,
-0.069580078125,
-0.03375244140625,
0.09686279296875,
0.0018205642700195312,
-0.01451873779296875,
-0.00786590576171875,
0.00731658935546875,
0.056396484375,
-0.053009033203125,
0.055877685546875,
0.057342529296875,
-0.0026073455810546875,
-0.00843048095703125,
-0.041748046875,
0.033172607421875,
0.00830841064453125,
-0.049224853515625,
-0.02008056640625,
-0.00426483154296875,
0.039276123046875,
0.023712158203125,
0.05108642578125,
0.01043701171875,
0.006542205810546875,
-0.005382537841796875,
0.0049591064453125,
-0.0108184814453125,
-0.0162811279296875,
-0.004032135009765625,
0.0176239013671875,
-0.02435302734375,
-0.029937744140625
]
] |
succinctly/text2image-prompt-generator | 2022-08-20T06:01:10.000Z | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"text2image",
"prompting",
"en",
"dataset:succinctly/midjourney-prompts",
"license:cc-by-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | succinctly | null | null | succinctly/text2image-prompt-generator | 228 | 67,056 | transformers | 2022-07-21T22:17:43 | ---
language:
- "en"
thumbnail: "https://drive.google.com/uc?export=view&id=1JWwrxQbr1s5vYpIhPna_p2IG1pE5rNiV"
tags:
- text2image
- prompting
license: "cc-by-2.0"
datasets:
- "succinctly/midjourney-prompts"
---
This is a GPT-2 model fine-tuned on the [succinctly/midjourney-prompts](https://huggingface.co/datasets/succinctly/midjourney-prompts) dataset, which contains 250k text prompts that users issued to the [Midjourney](https://www.midjourney.com/) text-to-image service over a one-month period. For more details on how this dataset was scraped, see [Midjourney User Prompts & Generated Images (250k)](https://www.kaggle.com/datasets/succinctlyai/midjourney-texttoimage).
This prompt generator can be used to auto-complete prompts for any text-to-image model (including the DALL·E family):

Note that, while this model can be used together with any text-to-image model, it occasionally produces Midjourney-specific tags. Users can specify certain requirements via [double-dashed parameters](https://midjourney.gitbook.io/docs/imagine-parameters) (e.g. `--ar 16:9` sets the aspect ratio to 16:9, and `--no snake` asks the model to exclude snakes from the generated image) or set the importance of various entities in the image via [explicit weights](https://midjourney.gitbook.io/docs/user-manual#advanced-text-weights) (e.g. `hot dog::1.5 food::-1` is likely to produce the image of an animal instead of a frankfurter).
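When feeding generated prompts to a non-Midjourney model, the double-dashed parameters described above can be stripped off or inspected first. The helper below is a hypothetical, simplified parser (the flag names are Midjourney's; the function is not part of this model or any library):

```python
import re

def split_prompt(prompt: str):
    """Separate the text portion of a Midjourney-style prompt from its
    trailing --flag parameters (a simplified, illustrative parser)."""
    flags = dict(re.findall(r"--(\w+)\s+([^\s-][^-]*?)(?=\s+--|\s*$)", prompt))
    text = re.split(r"\s+--", prompt)[0].strip()
    return text, flags

text, flags = split_prompt("a cyberpunk city at night --ar 16:9 --no snake")
print(text)   # -> "a cyberpunk city at night"
print(flags)  # -> {"ar": "16:9", "no": "snake"}
```

A real parser would need to handle edge cases such as hyphens inside flag values, but this is enough to route the plain-text portion of a generated prompt to another text-to-image model.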
When using this model, please attribute credit to [Succinctly AI](https://succinctly.ai). | 1,626 | [
[
-0.03619384765625,
-0.050933837890625,
0.054412841796875,
-0.003818511962890625,
-0.029693603515625,
-0.0214996337890625,
0.00762176513671875,
-0.0274658203125,
-0.003063201904296875,
0.0247650146484375,
-0.06341552734375,
-0.028076171875,
-0.052459716796875,
0.01763916015625,
-0.02008056640625,
0.094970703125,
-0.0059661865234375,
-0.0196990966796875,
-0.00843048095703125,
0.00865936279296875,
-0.042724609375,
-0.01513671875,
-0.052093505859375,
-0.00901031494140625,
0.054473876953125,
0.040374755859375,
0.06591796875,
0.041351318359375,
0.02459716796875,
0.014251708984375,
0.0077362060546875,
0.007045745849609375,
-0.019256591796875,
0.0025310516357421875,
-0.01715087890625,
-0.004669189453125,
-0.033538818359375,
0.024505615234375,
0.0294647216796875,
0.0095062255859375,
0.007659912109375,
0.0191192626953125,
0.0128326416015625,
0.04656982421875,
-0.0198211669921875,
0.0170745849609375,
-0.028564453125,
-0.00986480712890625,
-0.0113677978515625,
-0.00553131103515625,
-0.0169677734375,
-0.0155792236328125,
0.019134521484375,
-0.06402587890625,
0.0264129638671875,
0.01142120361328125,
0.10455322265625,
0.01377105712890625,
-0.040679931640625,
-0.0467529296875,
-0.024932861328125,
0.0261688232421875,
-0.031951904296875,
-0.0039825439453125,
0.042266845703125,
0.028839111328125,
-0.00316619873046875,
-0.0731201171875,
-0.05328369140625,
-0.00887298583984375,
-0.03326416015625,
0.0215301513671875,
-0.032470703125,
0.00018727779388427734,
0.040313720703125,
0.01052093505859375,
-0.0787353515625,
-0.01103973388671875,
-0.0211639404296875,
-0.005126953125,
0.035186767578125,
0.0103607177734375,
0.045562744140625,
-0.04571533203125,
-0.02899169921875,
-0.01202392578125,
-0.045654296875,
0.01342010498046875,
0.035125732421875,
-0.004772186279296875,
-0.022430419921875,
0.056793212890625,
-0.00785064697265625,
0.053863525390625,
0.021759033203125,
0.0147705078125,
0.010284423828125,
-0.040191650390625,
-0.005863189697265625,
-0.0243988037109375,
0.062255859375,
0.0635986328125,
0.03448486328125,
-0.006683349609375,
-0.031280517578125,
-0.0022735595703125,
0.002696990966796875,
-0.08172607421875,
-0.053497314453125,
0.0103607177734375,
-0.030426025390625,
-0.0044097900390625,
-0.0118865966796875,
-0.048797607421875,
-0.0107879638671875,
-0.0274505615234375,
0.040252685546875,
-0.03387451171875,
0.0013837814331054688,
-0.01904296875,
-0.017852783203125,
0.01114654541015625,
0.033966064453125,
-0.055419921875,
-0.00830841064453125,
0.012359619140625,
0.08154296875,
0.01119232177734375,
-0.0159912109375,
-0.0245208740234375,
0.0053558349609375,
-0.01318359375,
0.08154296875,
-0.03900146484375,
-0.05670166015625,
-0.0217437744140625,
0.034759521484375,
-0.002498626708984375,
-0.01273345947265625,
0.04571533203125,
-0.04486083984375,
0.04254150390625,
-0.04302978515625,
-0.037628173828125,
-0.0162506103515625,
0.0175323486328125,
-0.043914794921875,
0.07574462890625,
0.0281829833984375,
-0.052337646484375,
0.03973388671875,
-0.04888916015625,
-0.014373779296875,
0.0101776123046875,
-0.013916015625,
-0.04327392578125,
-0.024078369140625,
0.03057861328125,
0.034271240234375,
-0.031494140625,
0.01947021484375,
-0.014923095703125,
-0.01404571533203125,
-0.00745391845703125,
-0.019287109375,
0.033660888671875,
0.0261077880859375,
-0.0013427734375,
-0.007495880126953125,
-0.0474853515625,
0.010284423828125,
0.028656005859375,
-0.0099029541015625,
-0.0228271484375,
-0.0269775390625,
0.019683837890625,
0.02874755859375,
0.031982421875,
-0.032470703125,
0.05291748046875,
-0.00342559814453125,
0.040802001953125,
0.0435791015625,
0.0261688232421875,
0.04034423828125,
-0.03271484375,
0.04864501953125,
0.00669097900390625,
0.0167083740234375,
-0.04022216796875,
-0.05328369140625,
-0.0167694091796875,
-0.0253143310546875,
0.0208740234375,
0.035369873046875,
-0.06304931640625,
0.034271240234375,
-0.022125244140625,
-0.0413818359375,
-0.0164794921875,
-0.01873779296875,
0.03155517578125,
0.060882568359375,
0.025909423828125,
-0.0301055908203125,
-0.01629638671875,
-0.06463623046875,
-0.002147674560546875,
-0.007305145263671875,
-0.00803375244140625,
0.032501220703125,
0.04510498046875,
-0.01172637939453125,
0.07623291015625,
-0.047576904296875,
-0.0023345947265625,
0.0069427490234375,
0.0263671875,
0.0277862548828125,
0.033782958984375,
0.06494140625,
-0.061309814453125,
-0.055938720703125,
-0.02813720703125,
-0.041046142578125,
-0.017364501953125,
-0.031341552734375,
-0.0266876220703125,
0.007663726806640625,
0.0102691650390625,
-0.0687255859375,
0.044586181640625,
0.02874755859375,
-0.06475830078125,
0.05322265625,
-0.0296783447265625,
0.023895263671875,
-0.09320068359375,
0.005950927734375,
0.0204010009765625,
-0.025177001953125,
-0.0192108154296875,
-0.0177459716796875,
0.0201873779296875,
-0.005588531494140625,
0.005496978759765625,
0.060516357421875,
-0.051025390625,
-0.004177093505859375,
-0.0210723876953125,
-0.006931304931640625,
0.0207672119140625,
0.02606201171875,
-0.005580902099609375,
0.086669921875,
0.050933837890625,
-0.0254974365234375,
0.02105712890625,
0.0274505615234375,
-0.0102996826171875,
0.047576904296875,
-0.063720703125,
0.0208740234375,
-0.008026123046875,
0.0252838134765625,
-0.0941162109375,
-0.039825439453125,
0.037017822265625,
-0.042327880859375,
0.03265380859375,
-0.0408935546875,
-0.05938720703125,
-0.0338134765625,
-0.0071868896484375,
0.0300750732421875,
0.047698974609375,
-0.036346435546875,
0.0361328125,
0.0133514404296875,
-0.03314208984375,
-0.013702392578125,
-0.03411865234375,
0.0232086181640625,
-0.015289306640625,
-0.041656494140625,
0.025634765625,
-0.010101318359375,
0.03460693359375,
0.01438140869140625,
0.0185546875,
0.005886077880859375,
-0.0276641845703125,
0.0318603515625,
0.04888916015625,
0.01140594482421875,
0.005229949951171875,
0.01158905029296875,
-0.0191802978515625,
0.0008983612060546875,
-0.033050537109375,
0.0399169921875,
0.0008020401000976562,
-0.0174407958984375,
-0.033111572265625,
0.0257568359375,
0.0300140380859375,
-0.001468658447265625,
0.03302001953125,
0.049835205078125,
-0.0286865234375,
0.0128936767578125,
-0.028656005859375,
-0.0098114013671875,
-0.036285400390625,
0.0234527587890625,
-0.0335693359375,
-0.02374267578125,
0.042083740234375,
0.0011272430419921875,
0.0030002593994140625,
0.0491943359375,
0.041900634765625,
-0.0284423828125,
0.073486328125,
0.0219879150390625,
0.003643035888671875,
0.04022216796875,
-0.0265350341796875,
0.0004973411560058594,
-0.0614013671875,
-0.0189971923828125,
-0.0294189453125,
-0.01806640625,
-0.046295166015625,
-0.0192108154296875,
0.042022705078125,
0.00722503662109375,
-0.03857421875,
0.0303955078125,
-0.06304931640625,
0.0433349609375,
0.05133056640625,
0.0177154541015625,
0.007534027099609375,
0.014373779296875,
-0.00904083251953125,
-0.00936126708984375,
-0.04736328125,
-0.06109619140625,
0.083251953125,
-0.005924224853515625,
0.047698974609375,
-0.004573822021484375,
0.033721923828125,
0.03369140625,
0.0183868408203125,
-0.0433349609375,
0.040496826171875,
-0.015960693359375,
-0.039947509765625,
-0.020416259765625,
-0.0225830078125,
-0.0751953125,
-0.018798828125,
-0.00463104248046875,
-0.047393798828125,
0.01001739501953125,
0.0224609375,
-0.036651611328125,
0.00791168212890625,
-0.07684326171875,
0.0601806640625,
-0.011383056640625,
-0.0196990966796875,
0.01110076904296875,
-0.054351806640625,
0.0018777847290039062,
-0.003509521484375,
-0.0010499954223632812,
-0.00945281982421875,
-0.0035190582275390625,
0.048095703125,
-0.04034423828125,
0.058746337890625,
-0.029144287109375,
0.009552001953125,
0.031585693359375,
0.00007557868957519531,
0.062103271484375,
0.01044464111328125,
0.006256103515625,
-0.004901885986328125,
0.0153961181640625,
-0.032135009765625,
-0.0380859375,
0.046112060546875,
-0.0771484375,
-0.021514892578125,
-0.041107177734375,
-0.042572021484375,
0.006130218505859375,
0.012969970703125,
0.054840087890625,
0.0390625,
0.01212310791015625,
-0.0230255126953125,
0.033935546875,
-0.0183868408203125,
0.04132080078125,
0.0299530029296875,
-0.00850677490234375,
-0.03277587890625,
0.053802490234375,
0.031524658203125,
0.0108795166015625,
-0.005901336669921875,
0.00023758411407470703,
-0.045654296875,
-0.02606201171875,
-0.0692138671875,
0.0235137939453125,
-0.060882568359375,
-0.0256195068359375,
-0.04705810546875,
-0.0187225341796875,
-0.02777099609375,
-0.019256591796875,
-0.002716064453125,
-0.03167724609375,
-0.04522705078125,
-0.0037097930908203125,
0.047271728515625,
0.048553466796875,
-0.003124237060546875,
0.037872314453125,
-0.06378173828125,
0.0283966064453125,
0.021728515625,
0.01006317138671875,
-0.008575439453125,
-0.06396484375,
-0.005855560302734375,
-0.004150390625,
-0.0419921875,
-0.07373046875,
0.035675048828125,
0.01336669921875,
0.0213165283203125,
0.0107421875,
-0.0076904296875,
0.046875,
-0.046661376953125,
0.0811767578125,
0.0303955078125,
-0.0396728515625,
0.05078125,
-0.05078125,
0.036865234375,
0.01873779296875,
0.03271484375,
-0.0394287109375,
-0.0079193115234375,
-0.0504150390625,
-0.07672119140625,
0.0478515625,
0.0293731689453125,
0.0159149169921875,
0.015899658203125,
0.0548095703125,
0.009552001953125,
0.0008244514465332031,
-0.0589599609375,
-0.014892578125,
-0.022674560546875,
-0.01959228515625,
-0.0020389556884765625,
-0.0330810546875,
-0.0079345703125,
-0.0210723876953125,
0.059600830078125,
-0.00826263427734375,
0.0155029296875,
0.0284271240234375,
0.0038433074951171875,
-0.0196990966796875,
0.0037078857421875,
0.01837158203125,
0.035888671875,
-0.0032825469970703125,
-0.01399993896484375,
-0.01509857177734375,
-0.044586181640625,
0.0096435546875,
0.0304412841796875,
-0.03497314453125,
0.01044464111328125,
0.017333984375,
0.05853271484375,
-0.016143798828125,
-0.0157318115234375,
0.035614013671875,
-0.01300811767578125,
-0.01192474365234375,
-0.0198822021484375,
0.006256103515625,
-0.00719451904296875,
0.0174407958984375,
0.0264739990234375,
0.0111236572265625,
0.026763916015625,
-0.027923583984375,
0.0161285400390625,
0.0223236083984375,
-0.00878143310546875,
-0.037750244140625,
0.055206298828125,
0.00818634033203125,
0.0004906654357910156,
0.061065673828125,
-0.05377197265625,
-0.026092529296875,
0.053619384765625,
0.021514892578125,
0.06182861328125,
0.0063018798828125,
0.0239105224609375,
0.0494384765625,
0.021728515625,
-0.0016803741455078125,
0.038818359375,
0.006725311279296875,
-0.031707763671875,
-0.0010251998901367188,
-0.0379638671875,
-0.02081298828125,
0.003864288330078125,
-0.05108642578125,
0.03399658203125,
-0.01010894775390625,
-0.045623779296875,
0.006923675537109375,
0.002582550048828125,
-0.074462890625,
0.032806396484375,
0.0149078369140625,
0.052734375,
-0.056854248046875,
0.0582275390625,
0.0692138671875,
-0.047210693359375,
-0.07843017578125,
0.007572174072265625,
0.006137847900390625,
-0.058380126953125,
0.027557373046875,
0.0113677978515625,
0.0156707763671875,
0.0173797607421875,
-0.058502197265625,
-0.057098388671875,
0.0994873046875,
0.0014781951904296875,
-0.058929443359375,
-0.032379150390625,
0.0080108642578125,
0.031829833984375,
-0.02740478515625,
0.04278564453125,
0.03375244140625,
0.03411865234375,
0.027099609375,
-0.064697265625,
0.019317626953125,
-0.029754638671875,
0.01326751708984375,
-0.00040793418884277344,
-0.05181884765625,
0.06451416015625,
-0.03656005859375,
-0.0176849365234375,
0.0206451416015625,
0.0274810791015625,
0.023529052734375,
0.039947509765625,
0.041778564453125,
0.04180908203125,
0.04754638671875,
-0.024566650390625,
0.09393310546875,
-0.036224365234375,
0.048492431640625,
0.08734130859375,
-0.0010633468627929688,
0.015838623046875,
0.028533935546875,
-0.0122222900390625,
0.021881103515625,
0.0955810546875,
-0.029571533203125,
0.064453125,
-0.0171356201171875,
0.00244903564453125,
-0.0099334716796875,
0.005706787109375,
-0.03424072265625,
-0.0045623779296875,
0.030487060546875,
-0.03436279296875,
-0.0263824462890625,
0.002582550048828125,
0.00902557373046875,
-0.0305633544921875,
-0.0158843994140625,
0.0518798828125,
0.0010833740234375,
-0.05291748046875,
0.033050537109375,
-0.0133514404296875,
0.03631591796875,
-0.042083740234375,
-0.0024356842041015625,
-0.013702392578125,
-0.024993896484375,
-0.002826690673828125,
-0.09222412109375,
0.0127716064453125,
0.01953125,
-0.029693603515625,
-0.0155029296875,
0.0595703125,
-0.03228759765625,
-0.0440673828125,
-0.002605438232421875,
0.035552978515625,
0.0166473388671875,
-0.0082244873046875,
-0.07373046875,
-0.02740478515625,
-0.0031757354736328125,
-0.04754638671875,
0.00891876220703125,
0.058624267578125,
0.0026397705078125,
0.037109375,
0.04296875,
0.007328033447265625,
-0.0058135986328125,
0.005489349365234375,
0.08111572265625,
-0.037353515625,
-0.05023193359375,
-0.058441162109375,
0.048309326171875,
-0.0094451904296875,
-0.0341796875,
0.041900634765625,
0.053558349609375,
0.05035400390625,
-0.0181732177734375,
0.07763671875,
-0.01947021484375,
0.0203399658203125,
-0.05078125,
0.05657958984375,
-0.062469482421875,
0.00737762451171875,
-0.01727294921875,
-0.048187255859375,
0.0104217529296875,
0.0257568359375,
-0.04296875,
0.0188446044921875,
0.04119873046875,
0.076904296875,
-0.006694793701171875,
-0.0005359649658203125,
0.010223388671875,
0.00217437744140625,
0.0217742919921875,
0.0294952392578125,
0.05792236328125,
-0.034912109375,
0.042877197265625,
-0.008270263671875,
-0.01397705078125,
0.00235748291015625,
-0.06231689453125,
-0.052978515625,
-0.06524658203125,
-0.034332275390625,
-0.0423583984375,
0.013153076171875,
0.0634765625,
0.062103271484375,
-0.04541015625,
0.01885986328125,
-0.0059814453125,
-0.0247955322265625,
0.02392578125,
-0.01468658447265625,
0.039398193359375,
-0.01354217529296875,
-0.056243896484375,
-0.021728515625,
0.00957489013671875,
0.046112060546875,
0.015838623046875,
0.0011415481567382812,
0.00537872314453125,
0.0005645751953125,
0.0282745361328125,
0.00783538818359375,
-0.03619384765625,
-0.0182952880859375,
-0.002155303955078125,
-0.0241546630859375,
0.01389312744140625,
0.04150390625,
-0.039306640625,
0.03851318359375,
0.021759033203125,
0.00893402099609375,
0.033538818359375,
0.0086669921875,
0.0215911865234375,
-0.05706787109375,
0.0068817138671875,
0.00202178955078125,
0.039459228515625,
0.0294952392578125,
-0.056488037109375,
0.038726806640625,
0.0254364013671875,
-0.062103271484375,
-0.037872314453125,
-0.00009363889694213867,
-0.077392578125,
-0.0133209228515625,
0.09710693359375,
-0.0178070068359375,
0.0026187896728515625,
0.01010894775390625,
-0.036041259765625,
0.0299224853515625,
-0.052337646484375,
0.060577392578125,
0.0728759765625,
-0.01001739501953125,
-0.0188446044921875,
-0.04248046875,
0.041351318359375,
0.005950927734375,
-0.06097412109375,
-0.01247406005859375,
0.051666259765625,
0.041168212890625,
-0.003986358642578125,
0.04833984375,
0.0004496574401855469,
0.0260162353515625,
-0.016326904296875,
0.0031795501708984375,
-0.008087158203125,
-0.0231781005859375,
-0.00897979736328125,
-0.0166778564453125,
-0.005374908447265625,
-0.0249176025390625
]
] |
openlm-research/open_llama_13b | 2023-06-16T05:47:29.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"dataset:togethercomputer/RedPajama-Data-1T",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | openlm-research | null | null | openlm-research/open_llama_13b | 438 | 67,000 | transformers | 2023-06-15T10:51:45 | ---
license: apache-2.0
datasets:
- togethercomputer/RedPajama-Data-1T
---
# OpenLLaMA: An Open Reproduction of LLaMA
In this repo, we present a permissively licensed open source reproduction of Meta AI's [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/) large language model. We are releasing 3B, 7B and 13B models trained on 1T tokens. We provide PyTorch and JAX weights of pre-trained OpenLLaMA models, as well as evaluation results and comparison against the original LLaMA models. Please see the [project homepage of OpenLLaMA](https://github.com/openlm-research/open_llama) for more details.
## Weights Release, License and Usage
We release the weights in two formats: an EasyLM format to be used with our [EasyLM framework](https://github.com/young-geng/EasyLM), and a PyTorch format to be used with the [Hugging Face transformers](https://huggingface.co/docs/transformers/index) library. Both our training framework EasyLM and the checkpoint weights are licensed permissively under the Apache 2.0 license.
### Loading the Weights with Hugging Face Transformers
Preview checkpoints can be directly loaded from Hugging Face Hub. **Please note that it is advised to avoid using the Hugging Face fast tokenizer for now, as we’ve observed that the auto-converted fast tokenizer sometimes gives incorrect tokenizations.** This can be achieved by directly using the `LlamaTokenizer` class, or passing in the `use_fast=False` option for the `AutoTokenizer` class. See the following example for usage.
```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM
# model_path = 'openlm-research/open_llama_3b'
# model_path = 'openlm-research/open_llama_7b'
model_path = 'openlm-research/open_llama_13b'
tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(
model_path, torch_dtype=torch.float16, device_map='auto',
)
prompt = 'Q: What is the largest animal?\nA:'
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
generation_output = model.generate(
input_ids=input_ids, max_new_tokens=32
)
print(tokenizer.decode(generation_output[0]))
```
For more advanced usage, please follow the [transformers LLaMA documentation](https://huggingface.co/docs/transformers/main/model_doc/llama).
### Evaluating with LM-Eval-Harness
The model can be evaluated with [lm-eval-harness](https://github.com/EleutherAI/lm-evaluation-harness). However, due to the aforementioned tokenizer issue, we need to avoid using the fast tokenizer to obtain the correct results. This can be achieved by passing in `use_fast=False` to [this part of lm-eval-harness](https://github.com/EleutherAI/lm-evaluation-harness/blob/4b701e228768052cfae9043dca13e82052ca5eea/lm_eval/models/huggingface.py#LL313C9-L316C10), as shown in the example below:
```python
tokenizer = self.AUTO_TOKENIZER_CLASS.from_pretrained(
pretrained if tokenizer is None else tokenizer,
revision=revision + ("/" + subfolder if subfolder is not None else ""),
use_fast=False
)
```
### Loading the Weights with EasyLM
For using the weights in our EasyLM framework, please refer to the [LLaMA documentation of EasyLM](https://github.com/young-geng/EasyLM/blob/main/docs/llama.md). Note that unlike the original LLaMA model, our OpenLLaMA tokenizer and weights are trained completely from scratch, so obtaining the original LLaMA tokenizer and weights is no longer necessary. Note that we use the BOS (beginning of sentence) token (id=1) during training, so it is best to prepend this token during few-shot evaluation.
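As an illustration of the BOS note above, a minimal sketch of prepending the token before evaluation. The helper is our own hypothetical example, intended for tokenizers that do not already add BOS themselves (the `transformers` `LlamaTokenizer` adds it by default):

```python
BOS_TOKEN_ID = 1  # OpenLLaMA beginning-of-sentence token id

def prepend_bos(token_ids: list[int]) -> list[int]:
    """Ensure a token sequence starts with the BOS token (id=1)
    before few-shot evaluation; no-op if it is already present."""
    if token_ids and token_ids[0] == BOS_TOKEN_ID:
        return token_ids
    return [BOS_TOKEN_ID] + token_ids

print(prepend_bos([52, 17, 9]))  # → [1, 52, 17, 9]
print(prepend_bos([1, 52, 17]))  # already has BOS, returned unchanged
```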
## Dataset and Training
We train our models on the [RedPajama](https://www.together.xyz/blog/redpajama) dataset released by [Together](https://www.together.xyz/), which is a reproduction of the LLaMA training dataset containing over 1.2 trillion tokens. We follow exactly the same preprocessing steps and training hyperparameters as the original LLaMA paper, including model architecture, context length, training steps, learning rate schedule, and optimizer. The only difference between our setting and the original one is the dataset used: OpenLLaMA employs the RedPajama dataset rather than the one utilized by the original LLaMA.
We train the models on cloud TPU-v4s using [EasyLM](https://github.com/young-geng/EasyLM), a JAX-based training pipeline we developed for training and fine-tuning large language models. We employ a combination of normal data parallelism and [fully sharded data parallelism (also known as ZeRO stage 3)](https://engineering.fb.com/2021/07/15/open-source/fsdp/) to balance training throughput and memory usage. Overall we reach a throughput of over 2200 tokens / second / TPU-v4 chip for our 7B model.
## Evaluation
We evaluated OpenLLaMA on a wide range of tasks using [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness). The LLaMA results are generated by running the original LLaMA model on the same evaluation metrics. We note that our results for the LLaMA model differ slightly from the original LLaMA paper, which we believe is a result of different evaluation protocols. Similar differences have been reported in [this issue of lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness/issues/443). Additionally, we present the results of GPT-J, a 6B parameter model trained on the [Pile](https://pile.eleuther.ai/) dataset by [EleutherAI](https://www.eleuther.ai/).
The original LLaMA model was trained for 1 trillion tokens and GPT-J was trained for 500 billion tokens. We present the results in the table below. OpenLLaMA exhibits comparable performance to the original LLaMA and GPT-J across a majority of tasks, and outperforms them in some tasks.
| **Task/Metric** | GPT-J 6B | LLaMA 7B | LLaMA 13B | OpenLLaMA 7B | OpenLLaMA 3B | OpenLLaMA 13B |
| ---------------------- | -------- | -------- | --------- | ------------ | ------------ | ------------- |
| anli_r1/acc | 0.32 | 0.35 | 0.35 | 0.33 | 0.33 | 0.33 |
| anli_r2/acc | 0.34 | 0.34 | 0.36 | 0.36 | 0.32 | 0.33 |
| anli_r3/acc | 0.35 | 0.37 | 0.39 | 0.38 | 0.35 | 0.40 |
| arc_challenge/acc | 0.34 | 0.39 | 0.44 | 0.37 | 0.34 | 0.41 |
| arc_challenge/acc_norm | 0.37 | 0.41 | 0.44 | 0.38 | 0.37 | 0.44 |
| arc_easy/acc | 0.67 | 0.68 | 0.75 | 0.72 | 0.69 | 0.75 |
| arc_easy/acc_norm | 0.62 | 0.52 | 0.59 | 0.68 | 0.65 | 0.70 |
| boolq/acc | 0.66 | 0.75 | 0.71 | 0.71 | 0.68 | 0.75 |
| hellaswag/acc | 0.50 | 0.56 | 0.59 | 0.53 | 0.49 | 0.56 |
| hellaswag/acc_norm | 0.66 | 0.73 | 0.76 | 0.72 | 0.67 | 0.76 |
| openbookqa/acc | 0.29 | 0.29 | 0.31 | 0.30 | 0.27 | 0.31 |
| openbookqa/acc_norm | 0.38 | 0.41 | 0.42 | 0.40 | 0.40 | 0.43 |
| piqa/acc | 0.75 | 0.78 | 0.79 | 0.76 | 0.75 | 0.77 |
| piqa/acc_norm | 0.76 | 0.78 | 0.79 | 0.77 | 0.76 | 0.79 |
| record/em | 0.88 | 0.91 | 0.92 | 0.89 | 0.88 | 0.91 |
| record/f1 | 0.89 | 0.91 | 0.92 | 0.90 | 0.89 | 0.91 |
| rte/acc | 0.54 | 0.56 | 0.69 | 0.60 | 0.58 | 0.64 |
| truthfulqa_mc/mc1 | 0.20 | 0.21 | 0.25 | 0.23 | 0.22 | 0.25 |
| truthfulqa_mc/mc2 | 0.36 | 0.34 | 0.40 | 0.35 | 0.35 | 0.38 |
| wic/acc | 0.50 | 0.50 | 0.50 | 0.51 | 0.48 | 0.47 |
| winogrande/acc | 0.64 | 0.68 | 0.70 | 0.67 | 0.62 | 0.70 |
| Average | 0.52 | 0.55 | 0.57 | 0.55 | 0.53 | 0.57 |
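The per-model averages in the last row are simple means over the 21 task rows; as a quick sanity check, this sketch reproduces the GPT-J 6B average from the values in the table:

```python
# Metric values for GPT-J 6B, copied from the 21 task rows above.
gpt_j_scores = [
    0.32, 0.34, 0.35, 0.34, 0.37, 0.67, 0.62, 0.66, 0.50, 0.66, 0.29,
    0.38, 0.75, 0.76, 0.88, 0.89, 0.54, 0.20, 0.36, 0.50, 0.64,
]

average = sum(gpt_j_scores) / len(gpt_j_scores)
print(round(average, 2))  # → 0.52, matching the "Average" row
```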
We removed the tasks CB and WSC from our benchmark, as our model performs suspiciously well on these two tasks. We hypothesize that there could be benchmark data contamination in the training set.
## Contact
We would love to get feedback from the community. If you have any questions, please open an issue or contact us.
OpenLLaMA is developed by:
[Xinyang Geng](https://young-geng.xyz/)* and [Hao Liu](https://www.haoliu.site/)* from Berkeley AI Research.
*Equal Contribution
## Acknowledgment
We thank the [Google TPU Research Cloud](https://sites.research.google/trc/about/) program for providing part of the computation resources. We’d like to specially thank Jonathan Caton from TPU Research Cloud for helping us organize compute resources, and Rafi Witten from the Google Cloud team and James Bradbury from the Google JAX team for helping us optimize our training throughput. We’d also like to thank Charlie Snell, Gautier Izacard, Eric Wallace, Lianmin Zheng and our user community for the discussions and feedback.
The OpenLLaMA 13B model is trained in collaboration with [Stability AI](https://stability.ai/), and we thank Stability AI for providing the computation resources. We’d like to especially thank David Ha and Shivanshu Purohit for coordinating the logistics and providing engineering support.
## Reference
If you found OpenLLaMA useful in your research or applications, please cite using the following BibTeX:
```
@software{openlm2023openllama,
author = {Geng, Xinyang and Liu, Hao},
title = {OpenLLaMA: An Open Reproduction of LLaMA},
  month = may,
year = 2023,
url = {https://github.com/openlm-research/open_llama}
}
```
```
@software{together2023redpajama,
author = {Together Computer},
title = {RedPajama-Data: An Open Source Recipe to Reproduce LLaMA training dataset},
  month = apr,
year = 2023,
url = {https://github.com/togethercomputer/RedPajama-Data}
}
```
```
@article{touvron2023llama,
title={Llama: Open and efficient foundation language models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and others},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
```
| 10,637 | [
[
-0.0240478515625,
-0.053375244140625,
0.0178070068359375,
0.029296875,
-0.0181884765625,
-0.002948760986328125,
-0.022796630859375,
-0.0421142578125,
0.0281829833984375,
0.0193634033203125,
-0.030670166015625,
-0.05029296875,
-0.048858642578125,
0.006305694580078125,
-0.01520538330078125,
0.08544921875,
-0.023651123046875,
-0.0115814208984375,
0.001285552978515625,
-0.0250091552734375,
-0.011016845703125,
-0.026123046875,
-0.053131103515625,
-0.029693603515625,
0.031524658203125,
0.01568603515625,
0.04583740234375,
0.0360107421875,
0.038604736328125,
0.0252532958984375,
-0.0226593017578125,
0.0163421630859375,
-0.03814697265625,
-0.019378662109375,
0.021240234375,
-0.04052734375,
-0.050140380859375,
0.004024505615234375,
0.03912353515625,
0.0266876220703125,
-0.02447509765625,
0.041534423828125,
-0.0028514862060546875,
0.039703369140625,
-0.0401611328125,
0.02362060546875,
-0.04071044921875,
0.0091552734375,
-0.0231170654296875,
-0.0049591064453125,
-0.0201873779296875,
-0.029266357421875,
-0.007190704345703125,
-0.0579833984375,
0.0019550323486328125,
0.001834869384765625,
0.08892822265625,
0.0252227783203125,
-0.01788330078125,
-0.017669677734375,
-0.029510498046875,
0.061431884765625,
-0.0616455078125,
0.0105743408203125,
0.0335693359375,
0.01235198974609375,
-0.00861358642578125,
-0.059722900390625,
-0.052886962890625,
-0.007434844970703125,
-0.00885009765625,
0.00977325439453125,
-0.02166748046875,
-0.01018524169921875,
0.0220184326171875,
0.04559326171875,
-0.0360107421875,
0.0178375244140625,
-0.04205322265625,
-0.00981903076171875,
0.05767822265625,
0.021453857421875,
0.0104217529296875,
-0.01056671142578125,
-0.03765869140625,
-0.0176849365234375,
-0.052886962890625,
0.028289794921875,
0.016876220703125,
0.0218505859375,
-0.035491943359375,
0.0484619140625,
-0.021148681640625,
0.0382080078125,
0.00894927978515625,
-0.04180908203125,
0.05322265625,
-0.0300750732421875,
-0.03271484375,
0.0035037994384765625,
0.06591796875,
0.02996826171875,
0.0016012191772460938,
0.00958251953125,
-0.01366424560546875,
-0.00717926025390625,
-0.0067596435546875,
-0.05987548828125,
-0.0027523040771484375,
0.0194091796875,
-0.036407470703125,
-0.0267181396484375,
0.002582550048828125,
-0.0411376953125,
-0.0098876953125,
-0.01087188720703125,
0.034027099609375,
-0.0158233642578125,
-0.0182342529296875,
0.02227783203125,
0.00970458984375,
0.033111572265625,
0.032684326171875,
-0.055206298828125,
0.01611328125,
0.03582763671875,
0.07135009765625,
-0.0035858154296875,
-0.0279541015625,
-0.01959228515625,
0.0018243789672851562,
-0.019989013671875,
0.042083740234375,
-0.0079803466796875,
-0.0236968994140625,
-0.01015472412109375,
0.007144927978515625,
-0.01739501953125,
-0.03558349609375,
0.0374755859375,
-0.03173828125,
0.017913818359375,
-0.01346588134765625,
-0.01385498046875,
-0.022796630859375,
0.02105712890625,
-0.044464111328125,
0.10052490234375,
0.0077056884765625,
-0.05511474609375,
0.02392578125,
-0.057037353515625,
-0.008514404296875,
-0.0202789306640625,
0.0117950439453125,
-0.051483154296875,
-0.00537109375,
0.032501220703125,
0.0300750732421875,
-0.032745361328125,
0.01399993896484375,
-0.017242431640625,
-0.0369873046875,
0.01015472412109375,
-0.0169525146484375,
0.0826416015625,
0.020904541015625,
-0.0352783203125,
0.019622802734375,
-0.0687255859375,
-0.004322052001953125,
0.044647216796875,
-0.046142578125,
-0.006252288818359375,
-0.0212860107421875,
-0.003086090087890625,
0.003978729248046875,
0.033477783203125,
-0.04315185546875,
0.0312042236328125,
-0.024810791015625,
0.037872314453125,
0.06744384765625,
-0.01442718505859375,
0.01264190673828125,
-0.03466796875,
0.03131103515625,
0.01233673095703125,
0.0179595947265625,
-0.0150299072265625,
-0.047332763671875,
-0.0750732421875,
-0.0421142578125,
0.01019287109375,
0.0311279296875,
-0.0218963623046875,
0.034698486328125,
-0.01255035400390625,
-0.053314208984375,
-0.057037353515625,
0.01654052734375,
0.030242919921875,
0.032867431640625,
0.03753662109375,
-0.02130126953125,
-0.04278564453125,
-0.06396484375,
0.001064300537109375,
-0.0215606689453125,
0.01242828369140625,
0.02276611328125,
0.055419921875,
-0.025177001953125,
0.06414794921875,
-0.041900634765625,
-0.031768798828125,
-0.0147705078125,
-0.00507354736328125,
0.049163818359375,
0.032073974609375,
0.0518798828125,
-0.0299530029296875,
-0.03924560546875,
0.0036869049072265625,
-0.06219482421875,
-0.00539398193359375,
-0.0014400482177734375,
-0.010345458984375,
0.022796630859375,
0.010894775390625,
-0.06640625,
0.048431396484375,
0.0423583984375,
-0.0286865234375,
0.04107666015625,
-0.009368896484375,
0.003299713134765625,
-0.072509765625,
0.018707275390625,
-0.00586700439453125,
-0.00862884521484375,
-0.034271240234375,
0.0194549560546875,
0.0006237030029296875,
0.003826141357421875,
-0.04962158203125,
0.0521240234375,
-0.0299224853515625,
-0.00788116455078125,
0.00742340087890625,
0.0014104843139648438,
-0.0015001296997070312,
0.05322265625,
-0.01067352294921875,
0.0704345703125,
0.0347900390625,
-0.0311279296875,
0.0235443115234375,
0.0240478515625,
-0.037628173828125,
0.022979736328125,
-0.059051513671875,
0.0190582275390625,
-0.0002982616424560547,
0.03466796875,
-0.07513427734375,
-0.013641357421875,
0.032440185546875,
-0.02294921875,
0.01325225830078125,
0.009918212890625,
-0.038665771484375,
-0.049285888671875,
-0.04864501953125,
0.028045654296875,
0.038818359375,
-0.052459716796875,
0.018280029296875,
0.0104827880859375,
0.009552001953125,
-0.052764892578125,
-0.052398681640625,
-0.0084381103515625,
-0.025726318359375,
-0.042724609375,
0.0266265869140625,
-0.00545501708984375,
-0.01319122314453125,
-0.00821685791015625,
-0.006717681884765625,
0.0017328262329101562,
0.01381683349609375,
0.025848388671875,
0.0209808349609375,
-0.025146484375,
-0.00919342041015625,
-0.006443023681640625,
-0.004306793212890625,
-0.00916290283203125,
0.003925323486328125,
0.05401611328125,
-0.0287017822265625,
-0.0335693359375,
-0.053131103515625,
-0.00910186767578125,
0.03753662109375,
-0.018402099609375,
0.06903076171875,
0.05194091796875,
-0.0220184326171875,
0.016387939453125,
-0.0406494140625,
0.0118560791015625,
-0.035614013671875,
0.0195159912109375,
-0.032135009765625,
-0.06439208984375,
0.047393798828125,
0.01458740234375,
0.0186767578125,
0.055908203125,
0.057830810546875,
0.00457000732421875,
0.060089111328125,
0.031829833984375,
-0.019775390625,
0.0248870849609375,
-0.044708251953125,
-0.0009260177612304688,
-0.07537841796875,
-0.03839111328125,
-0.037994384765625,
-0.0305938720703125,
-0.0273895263671875,
-0.034515380859375,
0.0250701904296875,
0.0251922607421875,
-0.0498046875,
0.0293121337890625,
-0.03961181640625,
0.020660400390625,
0.04608154296875,
0.01464080810546875,
0.0307464599609375,
0.004215240478515625,
-0.01195526123046875,
0.00514984130859375,
-0.036865234375,
-0.03900146484375,
0.1063232421875,
0.039825439453125,
0.055084228515625,
0.0093231201171875,
0.06396484375,
0.0064849853515625,
0.035003662109375,
-0.0423583984375,
0.035888671875,
0.0211944580078125,
-0.044921875,
-0.01171112060546875,
-0.01520538330078125,
-0.07818603515625,
0.037322998046875,
-0.0069122314453125,
-0.0711669921875,
0.0032367706298828125,
-0.006572723388671875,
-0.0276641845703125,
0.033203125,
-0.036895751953125,
0.05126953125,
-0.0205841064453125,
-0.0182342529296875,
-0.00891876220703125,
-0.035003662109375,
0.045989990234375,
-0.01319122314453125,
0.0118255615234375,
-0.01338958740234375,
-0.0170745849609375,
0.06744384765625,
-0.053375244140625,
0.06256103515625,
-0.0131378173828125,
-0.01284027099609375,
0.03948974609375,
-0.0160980224609375,
0.03900146484375,
-0.0020198822021484375,
-0.0182037353515625,
0.0350341796875,
-0.009490966796875,
-0.03314208984375,
-0.0186309814453125,
0.055389404296875,
-0.09027099609375,
-0.056488037109375,
-0.040435791015625,
-0.0296783447265625,
0.01342010498046875,
0.01207733154296875,
0.01425933837890625,
0.0024261474609375,
0.0030422210693359375,
0.0175323486328125,
0.0275421142578125,
-0.0299224853515625,
0.0457763671875,
0.0335693359375,
-0.029266357421875,
-0.042022705078125,
0.05523681640625,
0.00209808349609375,
0.010589599609375,
0.01277923583984375,
0.017333984375,
-0.02001953125,
-0.03515625,
-0.04107666015625,
0.028839111328125,
-0.04498291015625,
-0.0272674560546875,
-0.045806884765625,
-0.0168304443359375,
-0.0279541015625,
-0.0078277587890625,
-0.0260162353515625,
-0.037139892578125,
-0.033721923828125,
-0.011627197265625,
0.04888916015625,
0.06341552734375,
0.0023975372314453125,
0.03643798828125,
-0.0384521484375,
0.0150604248046875,
0.015960693359375,
0.01453399658203125,
0.0131988525390625,
-0.051177978515625,
-0.0202484130859375,
0.00006598234176635742,
-0.0465087890625,
-0.0504150390625,
0.0287628173828125,
0.01067352294921875,
0.036041259765625,
0.032073974609375,
-0.00878143310546875,
0.07666015625,
-0.02069091796875,
0.0750732421875,
0.0287322998046875,
-0.06512451171875,
0.044525146484375,
-0.01617431640625,
0.01334381103515625,
0.036224365234375,
0.0303802490234375,
-0.0206298828125,
-0.02215576171875,
-0.049285888671875,
-0.07073974609375,
0.06622314453125,
0.0176239013671875,
-0.0023174285888671875,
0.0074462890625,
0.0202178955078125,
0.003559112548828125,
0.01511383056640625,
-0.0849609375,
-0.030303955078125,
-0.0161895751953125,
-0.0147247314453125,
-0.01116180419921875,
-0.005401611328125,
-0.0150604248046875,
-0.038818359375,
0.042694091796875,
0.0015106201171875,
0.031707763671875,
0.0184478759765625,
-0.019073486328125,
-0.01387786865234375,
-0.0002582073211669922,
0.05877685546875,
0.04656982421875,
-0.0179595947265625,
-0.01255035400390625,
0.030914306640625,
-0.03924560546875,
0.01546478271484375,
-0.0014925003051757812,
-0.018402099609375,
-0.00939178466796875,
0.03814697265625,
0.07183837890625,
0.0166015625,
-0.03997802734375,
0.04095458984375,
0.005207061767578125,
-0.0179443359375,
-0.0251007080078125,
0.0004992485046386719,
0.01238250732421875,
0.024169921875,
0.034576416015625,
-0.006404876708984375,
-0.0139617919921875,
-0.039337158203125,
-0.004486083984375,
0.033599853515625,
-0.00034928321838378906,
-0.0233917236328125,
0.06671142578125,
0.00521087646484375,
-0.02142333984375,
0.033111572265625,
0.0050201416015625,
-0.035919189453125,
0.061492919921875,
0.049713134765625,
0.050018310546875,
-0.015960693359375,
0.0006737709045410156,
0.045135498046875,
0.0276336669921875,
-0.004108428955078125,
0.018768310546875,
-0.007183074951171875,
-0.0283966064453125,
-0.0181884765625,
-0.06982421875,
-0.0235748291015625,
0.01548004150390625,
-0.0426025390625,
0.0263519287109375,
-0.04290771484375,
-0.0152435302734375,
-0.02716064453125,
0.019622802734375,
-0.06689453125,
0.01035308837890625,
0.0039005279541015625,
0.07440185546875,
-0.051116943359375,
0.058624267578125,
0.046722412109375,
-0.050079345703125,
-0.07550048828125,
-0.020172119140625,
-0.004436492919921875,
-0.09210205078125,
0.058929443359375,
0.0248870849609375,
0.01166534423828125,
-0.008087158203125,
-0.03192138671875,
-0.088134765625,
0.11431884765625,
0.014068603515625,
-0.0382080078125,
0.003192901611328125,
0.014984130859375,
0.03704833984375,
-0.0164642333984375,
0.0443115234375,
0.0361328125,
0.04278564453125,
-0.0036678314208984375,
-0.089111328125,
0.01983642578125,
-0.0216217041015625,
-0.002704620361328125,
0.005603790283203125,
-0.08099365234375,
0.08892822265625,
-0.0225830078125,
-0.0007476806640625,
0.021514892578125,
0.05029296875,
0.0390625,
0.032318115234375,
0.02874755859375,
0.0748291015625,
0.06488037109375,
-0.01287841796875,
0.082275390625,
-0.01253509521484375,
0.046875,
0.057861328125,
-0.0117950439453125,
0.0682373046875,
0.035491943359375,
-0.047210693359375,
0.04296875,
0.06353759765625,
0.0035114288330078125,
0.0312042236328125,
0.0158233642578125,
-0.010711669921875,
0.00847625732421875,
0.005054473876953125,
-0.056610107421875,
0.035369873046875,
0.0125885009765625,
-0.026824951171875,
-0.01416778564453125,
-0.0102996826171875,
0.0158233642578125,
-0.0174102783203125,
-0.0278472900390625,
0.042022705078125,
0.00287628173828125,
-0.034149169921875,
0.07135009765625,
0.0147247314453125,
0.07354736328125,
-0.043182373046875,
0.015625,
-0.0228424072265625,
0.01493072509765625,
-0.032196044921875,
-0.04583740234375,
0.00922393798828125,
0.01294708251953125,
0.009765625,
-0.00887298583984375,
0.03558349609375,
-0.0101776123046875,
-0.03460693359375,
0.02069091796875,
0.021514892578125,
0.0194091796875,
0.0168609619140625,
-0.056488037109375,
0.02593994140625,
-0.003299713134765625,
-0.0609130859375,
0.033111572265625,
0.0134735107421875,
-0.0026645660400390625,
0.049468994140625,
0.06500244140625,
0.001308441162109375,
0.020751953125,
-0.01007080078125,
0.07904052734375,
-0.053680419921875,
-0.024139404296875,
-0.06463623046875,
0.03887939453125,
0.00045013427734375,
-0.046844482421875,
0.058990478515625,
0.05047607421875,
0.0621337890625,
-0.0026760101318359375,
0.031463623046875,
-0.00785064697265625,
0.01806640625,
-0.04229736328125,
0.05572509765625,
-0.057525634765625,
0.01004791259765625,
-0.0171661376953125,
-0.072021484375,
-0.0245513916015625,
0.0640869140625,
-0.0166473388671875,
0.0021533966064453125,
0.040435791015625,
0.056793212890625,
0.00714874267578125,
-0.00798797607421875,
-0.0015420913696289062,
0.0232391357421875,
0.0245513916015625,
0.06439208984375,
0.056610107421875,
-0.053924560546875,
0.037811279296875,
-0.0270843505859375,
-0.0123443603515625,
-0.02923583984375,
-0.05816650390625,
-0.061187744140625,
-0.0277252197265625,
-0.0224456787109375,
-0.021148681640625,
-0.00894927978515625,
0.082763671875,
0.040283203125,
-0.0435791015625,
-0.03338623046875,
0.0082244873046875,
0.0095367431640625,
-0.00875091552734375,
-0.01434326171875,
0.04095458984375,
-0.008941650390625,
-0.0631103515625,
0.0254058837890625,
0.0034427642822265625,
0.007747650146484375,
-0.022369384765625,
-0.02392578125,
-0.0192108154296875,
-0.0005526542663574219,
0.047576904296875,
0.0252227783203125,
-0.07098388671875,
-0.020904541015625,
-0.0162200927734375,
-0.0222625732421875,
0.0216827392578125,
0.02178955078125,
-0.05914306640625,
0.01024627685546875,
0.0177459716796875,
0.03765869140625,
0.061492919921875,
-0.00412750244140625,
0.0025177001953125,
-0.031707763671875,
0.034820556640625,
-0.0120697021484375,
0.03253173828125,
0.01094818115234375,
-0.0235443115234375,
0.06005859375,
0.0225830078125,
-0.033966064453125,
-0.07763671875,
-0.017303466796875,
-0.0882568359375,
0.0019969940185546875,
0.0838623046875,
-0.02215576171875,
-0.03753662109375,
0.02423095703125,
-0.029266357421875,
0.01507568359375,
-0.033050537109375,
0.049285888671875,
0.04718017578125,
-0.009002685546875,
-0.0033283233642578125,
-0.042022705078125,
0.0139617919921875,
0.0245819091796875,
-0.0584716796875,
-0.0239715576171875,
0.01141357421875,
0.025634765625,
0.017333984375,
0.0682373046875,
-0.00885009765625,
0.01409149169921875,
-0.007602691650390625,
0.00803375244140625,
-0.024169921875,
-0.007015228271484375,
-0.0246429443359375,
0.012725830078125,
0.006744384765625,
-0.0255126953125
]
] |
bigscience/bloomz-560m | 2023-05-27T17:27:11.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"bloom",
"text-generation",
"ak",
"ar",
"as",
"bm",
"bn",
"ca",
"code",
"en",
"es",
"eu",
"fon",
"fr",
"gu",
"hi",
"id",
"ig",
"ki",
"kn",
"lg",
"ln",
"ml",
"mr",
"ne",
"nso",
"ny",
"or",
"pa",
"pt",
"rn",
"rw",
"sn",
"st",
"sw",
"ta",
"te",
"tn",
"ts",
"tum",
"tw",
"ur",
"vi",
"wo",
"xh",
"yo",
"zh",
"zu",
"dataset:bigscience/xP3",
"arxiv:2211.01786",
"license:bigscience-bloom-rail-1.0",
"model-index",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | bigscience | null | null | bigscience/bloomz-560m | 82 | 66,911 | transformers | 2022-10-08T16:14:42 | ---
datasets:
- bigscience/xP3
license: bigscience-bloom-rail-1.0
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
pipeline_tag: text-generation
widget:
- text: "一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。Would you rate the previous review as positive, neutral or negative?"
example_title: "zh-en sentiment"
- text: "一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。你认为这句话的立场是赞扬、中立还是批评?"
example_title: "zh-zh sentiment"
- text: "Suggest at least five related search terms to \"Mạng neural nhân tạo\"."
example_title: "vi-en query"
- text: "Proposez au moins cinq mots clés concernant «Réseau de neurones artificiels»."
example_title: "fr-fr query"
- text: "Explain in a sentence in Telugu what is backpropagation in neural networks."
example_title: "te-en qa"
- text: "Why is the sky blue?"
example_title: "en-en qa"
- text: "Write a fairy tale about a troll saving a princess from a dangerous dragon. The fairy tale is a masterpiece that has achieved praise worldwide and its moral is \"Heroes Come in All Shapes and Sizes\". Story (in Spanish):"
example_title: "es-en fable"
- text: "Write a fable about wood elves living in a forest that is suddenly invaded by ogres. The fable is a masterpiece that has achieved praise worldwide and its moral is \"Violence is the last refuge of the incompetent\". Fable (in Hindi):"
example_title: "hi-en fable"
model-index:
- name: bloomz-560m
results:
- task:
type: Coreference resolution
dataset:
type: winogrande
name: Winogrande XL (xl)
config: xl
split: validation
revision: a80f460359d1e9a67c006011c94de42a8759430c
metrics:
- type: Accuracy
value: 52.41
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (en)
config: en
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 51.01
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (fr)
config: fr
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 51.81
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (jp)
config: jp
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 52.03
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (pt)
config: pt
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 53.99
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (ru)
config: ru
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 53.97
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (zh)
config: zh
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 54.76
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r1)
config: r1
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 33.4
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r2)
config: r2
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 33.4
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r3)
config: r3
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 33.5
- task:
type: Natural language inference
dataset:
type: super_glue
name: SuperGLUE (cb)
config: cb
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 53.57
- task:
type: Natural language inference
dataset:
type: super_glue
name: SuperGLUE (rte)
config: rte
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 67.15
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ar)
config: ar
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 44.46
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (bg)
config: bg
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 39.76
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (de)
config: de
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 39.36
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (el)
config: el
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 40.96
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (en)
config: en
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 46.43
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (es)
config: es
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 44.98
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (fr)
config: fr
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 45.54
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (hi)
config: hi
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 41.81
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ru)
config: ru
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 39.64
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (sw)
config: sw
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 38.35
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (th)
config: th
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 35.5
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (tr)
config: tr
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 37.31
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ur)
config: ur
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 38.96
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (vi)
config: vi
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 44.74
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (zh)
config: zh
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 44.66
- task:
type: Program synthesis
dataset:
type: openai_humaneval
name: HumanEval
config: None
split: test
revision: e8dc562f5de170c54b5481011dd9f4fa04845771
metrics:
- type: Pass@1
value: 2.18
- type: Pass@10
value: 4.11
- type: Pass@100
value: 9.00
- task:
type: Sentence completion
dataset:
type: story_cloze
name: StoryCloze (2016)
config: "2016"
split: validation
revision: e724c6f8cdf7c7a2fb229d862226e15b023ee4db
metrics:
- type: Accuracy
value: 60.29
- task:
type: Sentence completion
dataset:
type: super_glue
name: SuperGLUE (copa)
config: copa
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 52.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (et)
config: et
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 53.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (ht)
config: ht
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 49.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (id)
config: id
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 57.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (it)
config: it
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 52.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (qu)
config: qu
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 55.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (sw)
config: sw
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 56.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (ta)
config: ta
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 58.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (th)
config: th
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 58.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (tr)
config: tr
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 61.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (vi)
config: vi
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 61.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (zh)
config: zh
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 61.0
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (ar)
config: ar
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 54.4
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (es)
config: es
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 56.45
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (eu)
config: eu
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 50.56
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (hi)
config: hi
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 55.79
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (id)
config: id
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 57.84
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (my)
config: my
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 47.05
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (ru)
config: ru
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 53.14
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (sw)
config: sw
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 51.36
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (te)
config: te
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 54.86
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (zh)
config: zh
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 56.52
---

# Table of Contents
1. [Model Summary](#model-summary)
2. [Use](#use)
3. [Limitations](#limitations)
4. [Training](#training)
5. [Evaluation](#evaluation)
6. [Citation](#citation)
# Model Summary
> We present BLOOMZ & mT0, a family of models capable of following human instructions in dozens of languages zero-shot. We finetune BLOOM & mT5 pretrained multilingual language models on our crosslingual task mixture (xP3) and find the resulting models capable of crosslingual generalization to unseen tasks & languages.
- **Repository:** [bigscience-workshop/xmtf](https://github.com/bigscience-workshop/xmtf)
- **Paper:** [Crosslingual Generalization through Multitask Finetuning](https://arxiv.org/abs/2211.01786)
- **Point of Contact:** [Niklas Muennighoff](mailto:niklas@hf.co)
- **Languages:** Refer to [bloom](https://huggingface.co/bigscience/bloom) for pretraining & [xP3](https://huggingface.co/datasets/bigscience/xP3) for finetuning language proportions. It understands both pretraining & finetuning languages.
- **BLOOMZ & mT0 Model Family:**
<div class="max-w-full overflow-auto">
<table>
<tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/bigscience/xP3>xP3</a>. Recommended for prompting in English.</th>
</tr>
<tr>
<td>Parameters</td>
<td>300M</td>
<td>580M</td>
<td>1.2B</td>
<td>3.7B</td>
<td>13B</td>
<td>560M</td>
<td>1.1B</td>
<td>1.7B</td>
<td>3B</td>
<td>7.1B</td>
<td>176B</td>
</tr>
<tr>
<td>Finetuned Model</td>
<td><a href=https://huggingface.co/bigscience/mt0-small>mt0-small</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-base>mt0-base</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-large>mt0-large</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-xl>mt0-xl</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl>mt0-xxl</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-560m>bloomz-560m</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-1b1>bloomz-1b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-1b7>bloomz-1b7</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-3b>bloomz-3b</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1>bloomz-7b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a></td>
</tr>
<tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/bigscience/xP3mt>xP3mt</a>. Recommended for prompting in non-English.</th>
</tr>
<tr>
<td>Finetuned Model</td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl-mt>mt0-xxl-mt</a></td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1-mt>bloomz-7b1-mt</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-mt>bloomz-mt</a></td>
</tr>
<tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/Muennighoff/P3>P3</a>. Released for research purposes only. Strictly inferior to above models!</th>
</tr>
<tr>
<td>Finetuned Model</td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl-p3>mt0-xxl-p3</a></td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1-p3>bloomz-7b1-p3</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-p3>bloomz-p3</a></td>
</tr>
<tr>
<th colspan="12">Original pretrained checkpoints. Not recommended.</th>
</tr>
<tr>
<td>Pretrained Model</td>
<td><a href=https://huggingface.co/google/mt5-small>mt5-small</a></td>
<td><a href=https://huggingface.co/google/mt5-base>mt5-base</a></td>
<td><a href=https://huggingface.co/google/mt5-large>mt5-large</a></td>
<td><a href=https://huggingface.co/google/mt5-xl>mt5-xl</a></td>
<td><a href=https://huggingface.co/google/mt5-xxl>mt5-xxl</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-560m>bloom-560m</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-1b1>bloom-1b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-1b7>bloom-1b7</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-3b>bloom-3b</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-7b1>bloom-7b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloom>bloom</a></td>
</tr>
</table>
</div>
# Use
## Intended use
We recommend using the model to perform tasks expressed in natural language. For example, given the prompt "*Translate to English: Je t’aime.*", the model will most likely answer "*I love you.*". Some prompt ideas from our paper:
- 一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。你认为这句话的立场是赞扬、中立还是批评?
- Suggest at least five related search terms to "Mạng neural nhân tạo".
- Write a fairy tale about a troll saving a princess from a dangerous dragon. The fairy tale is a masterpiece that has achieved praise worldwide and its moral is "Heroes Come in All Shapes and Sizes". Story (in Spanish):
- Explain in a sentence in Telugu what is backpropagation in neural networks.
**Feel free to share your generations in the Community tab!**
## How to use
### CPU
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-560m"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
### GPU
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers accelerate
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-560m"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype="auto", device_map="auto")
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
### GPU in 8bit
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers accelerate bitsandbytes
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-560m"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto", load_in_8bit=True)
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
<!-- Necessary for whitespace -->
###
# Limitations
**Prompt Engineering:** Performance may vary depending on the prompt. For BLOOMZ models, we recommend making it very clear when the input stops, to avoid the model trying to continue it. For example, the prompt "*Translate to English: Je t'aime*" without the full stop (.) at the end may result in the model trying to continue the French sentence. Better prompts are e.g. "*Translate to English: Je t'aime.*", "*Translate to English: Je t'aime. Translation:*" or "*What is "Je t'aime." in English?*", where it is clear to the model when it should answer. Further, we recommend providing the model with as much context as possible. For example, if you want it to answer in Telugu, then tell the model so, e.g. "*Explain in a sentence in Telugu what is backpropagation in neural networks.*".
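As a rough illustration of this recommendation, the helper below (a hypothetical sketch, not part of the BLOOMZ codebase) appends a terminator and an explicit answer cue to a raw input before it is handed to the tokenizer:

```python
def make_prompt(text: str, instruction: str = "Translate to English") -> str:
    """Build a prompt with an unambiguous end-of-input marker.

    A full stop plus an explicit cue ("Translation:") tells the model
    where the input ends and where its answer should begin. This helper
    is illustrative only; adapt the cue to your task.
    """
    text = text.strip()
    if not text.endswith((".", "!", "?")):
        text += "."  # terminate the input so the model does not continue it
    return f"{instruction}: {text} Translation:"

print(make_prompt("Je t'aime"))
# Translate to English: Je t'aime. Translation:
```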
# Training
## Model
- **Architecture:** Same as [bloom-560m](https://huggingface.co/bigscience/bloom-560m), also refer to the `config.json` file
- **Finetuning steps:** 1750
- **Finetuning tokens:** 3.67 billion
- **Finetuning layout:** 1x pipeline parallel, 1x tensor parallel, 1x data parallel
- **Precision:** float16
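The token and step counts above imply a global batch of roughly two million tokens per step. A back-of-the-envelope check, assuming tokens were spread evenly across steps:

```python
finetuning_tokens = 3.67e9   # total finetuning tokens reported above
finetuning_steps = 1750      # total finetuning steps reported above

# Average tokens consumed per optimizer step.
tokens_per_step = finetuning_tokens / finetuning_steps
print(f"~{tokens_per_step / 1e6:.1f}M tokens per step")
# ~2.1M tokens per step
```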
## Hardware
- **CPUs:** AMD CPUs with 512GB memory per node
- **GPUs:** 64 A100 80GB GPUs with 8 GPUs per node (8 nodes) using NVLink 4 inter-gpu connects, 4 OmniPath links
- **Communication:** NCCL-communications network with a fully dedicated subnet
## Software
- **Orchestration:** [Megatron-DeepSpeed](https://github.com/bigscience-workshop/Megatron-DeepSpeed)
- **Optimizer & parallelism:** [DeepSpeed](https://github.com/microsoft/DeepSpeed)
- **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch) (pytorch-1.11 w/ CUDA-11.5)
- **FP16 if applicable:** [apex](https://github.com/NVIDIA/apex)
# Evaluation
We refer to Table 7 from our [paper](https://arxiv.org/abs/2211.01786) & [bigscience/evaluation-results](https://huggingface.co/datasets/bigscience/evaluation-results) for zero-shot results on unseen tasks. The sidebar reports zero-shot performance of the best prompt per dataset config.
# Citation
```bibtex
@article{muennighoff2022crosslingual,
title={Crosslingual generalization through multitask finetuning},
author={Muennighoff, Niklas and Wang, Thomas and Sutawika, Lintang and Roberts, Adam and Biderman, Stella and Scao, Teven Le and Bari, M Saiful and Shen, Sheng and Yong, Zheng-Xin and Schoelkopf, Hailey and others},
journal={arXiv preprint arXiv:2211.01786},
year={2022}
}
``` | 24,197 | [
[
-0.031890869140625,
-0.042938232421875,
0.0226593017578125,
0.0293426513671875,
-0.005767822265625,
-0.006259918212890625,
-0.0251617431640625,
-0.02508544921875,
0.0313720703125,
-0.01214599609375,
-0.06854248046875,
-0.03985595703125,
-0.040496826171875,
0.01151275634765625,
0.0011548995971679688,
0.05950927734375,
-0.010223388671875,
0.01222991943359375,
0.0021953582763671875,
-0.0036716461181640625,
-0.021392822265625,
-0.0304107666015625,
-0.05572509765625,
-0.04498291015625,
0.038665771484375,
0.0124359130859375,
0.036956787109375,
0.039215087890625,
0.0224609375,
0.0287628173828125,
-0.0242919921875,
0.005496978759765625,
-0.0160064697265625,
-0.01032257080078125,
0.0021953582763671875,
-0.028961181640625,
-0.054931640625,
-0.0051422119140625,
0.043731689453125,
0.044708251953125,
0.01464080810546875,
0.0219268798828125,
0.023590087890625,
0.039215087890625,
-0.03411865234375,
0.0276947021484375,
-0.0030155181884765625,
0.0296173095703125,
-0.01291656494140625,
0.0030574798583984375,
-0.0114288330078125,
-0.0240020751953125,
-0.0030574798583984375,
-0.059234619140625,
0.014923095703125,
0.00982666015625,
0.100341796875,
0.0016393661499023438,
0.0029964447021484375,
0.00493621826171875,
-0.02471923828125,
0.07623291015625,
-0.066162109375,
0.030364990234375,
0.0309295654296875,
-0.003505706787109375,
0.0010786056518554688,
-0.045928955078125,
-0.05914306640625,
-0.00534820556640625,
-0.0254669189453125,
0.0309295654296875,
-0.0186004638671875,
-0.0119476318359375,
0.0195465087890625,
0.037933349609375,
-0.05255126953125,
0.005290985107421875,
-0.0249481201171875,
-0.0175323486328125,
0.042144775390625,
0.01543426513671875,
0.042572021484375,
-0.02337646484375,
-0.019683837890625,
-0.03240966796875,
-0.03436279296875,
0.01113128662109375,
0.01253509521484375,
0.040008544921875,
-0.048736572265625,
0.030364990234375,
-0.00688934326171875,
0.04534912109375,
0.02239990234375,
0.0003857612609863281,
0.057464599609375,
-0.0360107421875,
-0.028717041015625,
-0.0192718505859375,
0.0897216796875,
0.015869140625,
0.004062652587890625,
-0.007099151611328125,
0.007904052734375,
-0.0147705078125,
-0.001033782958984375,
-0.07183837890625,
-0.004486083984375,
0.0223388671875,
-0.042999267578125,
-0.025482177734375,
-0.0083160400390625,
-0.07403564453125,
0.0088043212890625,
-0.0168609619140625,
0.0517578125,
-0.043182373046875,
-0.0280303955078125,
0.016082763671875,
0.0011663436889648438,
0.0155792236328125,
0.0113677978515625,
-0.07073974609375,
0.012969970703125,
0.0236968994140625,
0.06805419921875,
-0.01090240478515625,
-0.043121337890625,
0.0017957687377929688,
0.005100250244140625,
-0.01102447509765625,
0.03875732421875,
-0.01226806640625,
-0.0294647216796875,
-0.0241241455078125,
0.0238800048828125,
-0.0333251953125,
-0.006832122802734375,
0.043121337890625,
-0.0084991455078125,
0.046356201171875,
-0.04254150390625,
-0.025787353515625,
-0.0152740478515625,
0.02203369140625,
-0.03985595703125,
0.08013916015625,
0.01548004150390625,
-0.06842041015625,
0.012664794921875,
-0.07232666015625,
-0.0175628662109375,
-0.01476287841796875,
-0.0010194778442382812,
-0.0513916015625,
-0.0272216796875,
0.033416748046875,
0.038421630859375,
-0.017364501953125,
-0.0191497802734375,
-0.021942138671875,
-0.001605987548828125,
-0.0014638900756835938,
-0.0118255615234375,
0.07916259765625,
0.0194854736328125,
-0.04705810546875,
0.018341064453125,
-0.0494384765625,
0.0101776123046875,
0.04150390625,
-0.0165557861328125,
0.0084075927734375,
-0.03216552734375,
-0.002010345458984375,
0.035675048828125,
0.0237579345703125,
-0.038787841796875,
0.01361846923828125,
-0.040313720703125,
0.0487060546875,
0.0469970703125,
-0.004215240478515625,
0.032073974609375,
-0.039337158203125,
0.036865234375,
0.01277923583984375,
0.011444091796875,
-0.019561767578125,
-0.03271484375,
-0.06378173828125,
-0.01531982421875,
0.01953125,
0.03619384765625,
-0.040740966796875,
0.042327880859375,
-0.0226287841796875,
-0.048126220703125,
-0.0273284912109375,
0.0005822181701660156,
0.044281005859375,
0.052581787109375,
0.049896240234375,
-0.00380706787109375,
-0.0435791015625,
-0.05902099609375,
-0.00019121170043945312,
-0.007320404052734375,
0.00994110107421875,
0.040069580078125,
0.05731201171875,
-0.00998687744140625,
0.039215087890625,
-0.04632568359375,
-0.0037479400634765625,
-0.0292816162109375,
0.00321197509765625,
0.0206451416015625,
0.059967041015625,
0.042327880859375,
-0.057647705078125,
-0.0333251953125,
0.00067138671875,
-0.06903076171875,
0.0171051025390625,
0.0015239715576171875,
-0.0303802490234375,
0.0084991455078125,
0.0254364013671875,
-0.057037353515625,
0.03497314453125,
0.022979736328125,
-0.03790283203125,
0.045501708984375,
-0.0174560546875,
0.018096923828125,
-0.09918212890625,
0.0316162109375,
0.011962890625,
0.005359649658203125,
-0.04840087890625,
0.01396942138671875,
0.00501251220703125,
0.004344940185546875,
-0.044281005859375,
0.066650390625,
-0.036590576171875,
0.0129852294921875,
0.00252532958984375,
-0.00799560546875,
0.0178375244140625,
0.055572509765625,
0.01351165771484375,
0.05316162109375,
0.0517578125,
-0.05072021484375,
0.0226287841796875,
0.042816162109375,
-0.00926971435546875,
0.0273895263671875,
-0.064453125,
-0.0044403076171875,
0.0006465911865234375,
0.01091766357421875,
-0.064208984375,
-0.0168609619140625,
0.0309295654296875,
-0.055267333984375,
0.046966552734375,
0.00461578369140625,
-0.03961181640625,
-0.061370849609375,
-0.024322509765625,
0.023193359375,
0.040802001953125,
-0.037994384765625,
0.0285797119140625,
-0.0009245872497558594,
0.0060882568359375,
-0.042572021484375,
-0.07135009765625,
-0.01168060302734375,
-0.0289154052734375,
-0.06500244140625,
0.04620361328125,
-0.0154266357421875,
0.01311492919921875,
-0.0178070068359375,
0.004764556884765625,
-0.006748199462890625,
-0.00382232666015625,
0.0247802734375,
0.03216552734375,
-0.0285491943359375,
0.005588531494140625,
-0.01148223876953125,
0.004302978515625,
-0.00042700767517089844,
-0.0185699462890625,
0.054290771484375,
-0.0186309814453125,
-0.007137298583984375,
-0.05731201171875,
0.01137542724609375,
0.0399169921875,
-0.0121002197265625,
0.06793212890625,
0.06884765625,
-0.03326416015625,
0.0074615478515625,
-0.0295257568359375,
-0.0286407470703125,
-0.039947509765625,
0.01126861572265625,
-0.02392578125,
-0.04779052734375,
0.0538330078125,
0.0196075439453125,
-0.0030670166015625,
0.056671142578125,
0.0472412109375,
0.01131439208984375,
0.071533203125,
0.042755126953125,
-0.006214141845703125,
0.036895751953125,
-0.05029296875,
0.01059722900390625,
-0.07232666015625,
-0.0355224609375,
-0.029571533203125,
-0.023040771484375,
-0.018463134765625,
-0.0248565673828125,
0.018218994140625,
0.00487518310546875,
-0.0478515625,
0.038177490234375,
-0.05145263671875,
-0.001861572265625,
0.04632568359375,
0.0269622802734375,
-0.007579803466796875,
0.0003521442413330078,
-0.03662109375,
-0.012542724609375,
-0.05633544921875,
-0.0169525146484375,
0.07257080078125,
0.0205535888671875,
0.031280517578125,
-0.00751495361328125,
0.050048828125,
-0.01690673828125,
-0.00362396240234375,
-0.038787841796875,
0.032073974609375,
0.0033817291259765625,
-0.051971435546875,
-0.0241241455078125,
-0.029144287109375,
-0.086181640625,
0.02130126953125,
-0.035369873046875,
-0.071533203125,
0.013824462890625,
0.0234527587890625,
-0.055694580078125,
0.036163330078125,
-0.052703857421875,
0.08099365234375,
-0.0157623291015625,
-0.057342529296875,
0.01232147216796875,
-0.048309326171875,
0.01346588134765625,
0.0280914306640625,
0.0201568603515625,
0.00740814208984375,
0.01568603515625,
0.062164306640625,
-0.04547119140625,
0.06353759765625,
-0.0107421875,
0.00691986083984375,
0.02130126953125,
-0.0165863037109375,
0.0244598388671875,
-0.0118865966796875,
-0.00482177734375,
0.00494384765625,
-0.0050201416015625,
-0.03521728515625,
-0.026092529296875,
0.06048583984375,
-0.0673828125,
-0.03515625,
-0.041290283203125,
-0.03973388671875,
-0.00952911376953125,
0.03594970703125,
0.0469970703125,
0.0174407958984375,
0.0051116943359375,
-0.00415802001953125,
0.04864501953125,
-0.025238037109375,
0.052764892578125,
0.01097869873046875,
-0.01436614990234375,
-0.01788330078125,
0.0703125,
0.005855560302734375,
0.00789642333984375,
0.0291290283203125,
0.0291900634765625,
-0.0272979736328125,
-0.02935791015625,
-0.039398193359375,
0.037109375,
-0.02447509765625,
-0.022796630859375,
-0.06500244140625,
-0.0265960693359375,
-0.059051513671875,
-0.01282501220703125,
-0.0321044921875,
-0.0333251953125,
-0.042938232421875,
-0.013092041015625,
0.0355224609375,
0.033660888671875,
-0.018951416015625,
0.0251922607421875,
-0.038299560546875,
0.0267333984375,
0.0179595947265625,
0.0227813720703125,
0.01556396484375,
-0.040740966796875,
-0.016082763671875,
0.017578125,
-0.043487548828125,
-0.05029296875,
0.051300048828125,
0.00154876708984375,
0.039154052734375,
0.0174407958984375,
-0.0264434814453125,
0.0611572265625,
-0.034393310546875,
0.06109619140625,
0.03192138671875,
-0.0633544921875,
0.047119140625,
-0.028961181640625,
0.037078857421875,
0.02728271484375,
0.0399169921875,
-0.030853271484375,
-0.0122222900390625,
-0.057830810546875,
-0.06817626953125,
0.0577392578125,
0.0242919921875,
0.0022125244140625,
0.00624847412109375,
0.0288848876953125,
-0.00518798828125,
0.007221221923828125,
-0.0723876953125,
-0.046051025390625,
-0.036834716796875,
-0.0201568603515625,
-0.004436492919921875,
0.0073394775390625,
-0.002384185791015625,
-0.044586181640625,
0.052764892578125,
0.0018739700317382812,
0.04278564453125,
0.02215576171875,
0.0010614395141601562,
-0.0028820037841796875,
0.00832366943359375,
0.044464111328125,
0.031341552734375,
-0.005016326904296875,
-0.016387939453125,
0.0155181884765625,
-0.05047607421875,
0.000972747802734375,
0.00580596923828125,
-0.021942138671875,
-0.00986480712890625,
0.0168304443359375,
0.06549072265625,
0.0159759521484375,
-0.01177215576171875,
0.03253173828125,
-0.0030307769775390625,
-0.027862548828125,
-0.020965576171875,
0.01104736328125,
0.0252685546875,
0.0159454345703125,
0.0176849365234375,
0.005069732666015625,
0.0008492469787597656,
-0.02935791015625,
0.0020465850830078125,
0.030548095703125,
-0.01953125,
-0.037353515625,
0.06695556640625,
-0.0038776397705078125,
-0.0029621124267578125,
0.022857666015625,
-0.0234832763671875,
-0.0577392578125,
0.05029296875,
0.047698974609375,
0.045989990234375,
-0.0211334228515625,
0.004985809326171875,
0.07598876953125,
0.00624847412109375,
-0.0165863037109375,
0.0248565673828125,
0.0018768310546875,
-0.039581298828125,
-0.020751953125,
-0.060333251953125,
0.0007762908935546875,
0.02618408203125,
-0.047332763671875,
0.0278778076171875,
-0.037109375,
-0.017608642578125,
0.0184173583984375,
0.0204925537109375,
-0.0579833984375,
0.042388916015625,
0.018951416015625,
0.06231689453125,
-0.0555419921875,
0.057159423828125,
0.047698974609375,
-0.062225341796875,
-0.075927734375,
-0.00778961181640625,
0.0015583038330078125,
-0.07122802734375,
0.06378173828125,
0.010650634765625,
0.01104736328125,
0.012451171875,
-0.045928955078125,
-0.08514404296875,
0.09942626953125,
0.005809783935546875,
-0.01898193359375,
-0.0220489501953125,
0.00380706787109375,
0.041473388671875,
-0.01526641845703125,
0.0309295654296875,
0.0252532958984375,
0.0487060546875,
0.02020263671875,
-0.0694580078125,
0.027252197265625,
-0.045562744140625,
-0.003482818603515625,
-0.002811431884765625,
-0.08465576171875,
0.09185791015625,
-0.0128936767578125,
-0.009002685546875,
0.00345611572265625,
0.060638427734375,
0.0277862548828125,
0.0148773193359375,
0.01543426513671875,
0.060272216796875,
0.03692626953125,
-0.0235748291015625,
0.0750732421875,
-0.02899169921875,
0.041900634765625,
0.0582275390625,
0.017059326171875,
0.043121337890625,
0.0256500244140625,
-0.038330078125,
0.04071044921875,
0.04833984375,
-0.021270751953125,
0.0204010009765625,
0.0167083740234375,
-0.00560760498046875,
-0.00684356689453125,
0.01052093505859375,
-0.048248291015625,
0.006862640380859375,
0.0301666259765625,
-0.02276611328125,
-0.00261688232421875,
0.007045745849609375,
0.0276947021484375,
-0.002613067626953125,
-0.03558349609375,
0.027862548828125,
0.00946807861328125,
-0.0513916015625,
0.0518798828125,
-0.00382232666015625,
0.07550048828125,
-0.039947509765625,
0.0192108154296875,
-0.01131439208984375,
0.01291656494140625,
-0.0294342041015625,
-0.0552978515625,
0.01401519775390625,
-0.004726409912109375,
-0.009002685546875,
-0.01438140869140625,
0.0360107421875,
-0.0236968994140625,
-0.04595947265625,
0.0223846435546875,
0.026580810546875,
0.00882720947265625,
0.004734039306640625,
-0.08111572265625,
0.0029277801513671875,
-0.00241851806640625,
-0.034454345703125,
0.0148468017578125,
0.01366424560546875,
0.0157623291015625,
0.053863525390625,
0.04400634765625,
0.00872039794921875,
0.02728271484375,
-0.005695343017578125,
0.06298828125,
-0.052703857421875,
-0.035919189453125,
-0.0623779296875,
0.042205810546875,
-0.01015472412109375,
-0.025970458984375,
0.07952880859375,
0.04266357421875,
0.059967041015625,
-0.005474090576171875,
0.060516357421875,
-0.01776123046875,
0.044921875,
-0.02984619140625,
0.07000732421875,
-0.05950927734375,
-0.0187530517578125,
-0.027374267578125,
-0.0377197265625,
-0.0238037109375,
0.059783935546875,
-0.0204925537109375,
0.041656494140625,
0.058135986328125,
0.04949951171875,
-0.00969696044921875,
-0.004222869873046875,
-0.004093170166015625,
0.0293731689453125,
0.01308441162109375,
0.0643310546875,
0.024871826171875,
-0.056182861328125,
0.0286712646484375,
-0.05072021484375,
-0.0022296905517578125,
-0.018646240234375,
-0.048126220703125,
-0.0682373046875,
-0.0523681640625,
-0.036407470703125,
-0.041473388671875,
-0.007602691650390625,
0.06549072265625,
0.055877685546875,
-0.06707763671875,
-0.01520538330078125,
-0.013336181640625,
0.00010687112808227539,
-0.01084136962890625,
-0.0178070068359375,
0.055267333984375,
-0.0218353271484375,
-0.07183837890625,
0.005840301513671875,
0.0008912086486816406,
0.03961181640625,
-0.005184173583984375,
-0.01499176025390625,
-0.03057861328125,
-0.003696441650390625,
0.023895263671875,
0.04864501953125,
-0.03497314453125,
-0.00701904296875,
0.01273345947265625,
-0.015716552734375,
0.02642822265625,
0.0245361328125,
-0.03973388671875,
0.00799560546875,
0.03546142578125,
0.0220489501953125,
0.0518798828125,
-0.0144195556640625,
0.025360107421875,
-0.036590576171875,
0.017822265625,
0.0126190185546875,
0.034515380859375,
0.0264892578125,
-0.033782958984375,
0.0280303955078125,
0.0199432373046875,
-0.042999267578125,
-0.057769775390625,
-0.00861358642578125,
-0.08453369140625,
-0.0163726806640625,
0.08544921875,
-0.0207977294921875,
-0.050537109375,
0.025970458984375,
-0.0106201171875,
0.043182373046875,
-0.02593994140625,
0.04754638671875,
0.05731201171875,
-0.02239990234375,
-0.00897216796875,
-0.0445556640625,
0.041656494140625,
0.044647216796875,
-0.0650634765625,
-0.01222991943359375,
0.01097869873046875,
0.0328369140625,
0.0306549072265625,
0.03076171875,
-0.0194854736328125,
0.01552581787109375,
-0.0006337165832519531,
0.01551055908203125,
-0.01371002197265625,
0.0032672882080078125,
-0.02874755859375,
-0.002613067626953125,
-0.0231781005859375,
-0.0201568603515625
]
] |
guillaumekln/faster-whisper-base | 2023-05-12T18:57:32.000Z | [
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"zh",
"de",
"es",
"ru",
"ko",
"fr",
"ja",
"pt",
"tr",
"pl",
"ca",
"nl",
"ar",
"sv",
"it",
"id",
"hi",
"fi",
"vi",
"he",
"uk",
"el",
"ms",
"cs",
"ro",
"da",
"hu",
"ta",
"no",
"th",
"ur",
"hr",
"bg",
"lt",
"la",
"mi",
"ml",
"cy",
"sk",
"te",
"fa",
"lv",
"bn",
"sr",
"az",
"sl",
"kn",
"et",
"mk",
"br",
"eu",
"is",
"hy",
"ne",
"mn",
"bs",
"kk",
"sq",
"sw",
"gl",
"mr",
"pa",
"si",
"km",
"sn",
"yo",
"so",
"af",
"oc",
"ka",
"be",
"tg",
"sd",
"gu",
"am",
"yi",
"lo",
"uz",
"fo",
"ht",
"ps",
"tk",
"nn",
"mt",
"sa",
"lb",
"my",
"bo",
"tl",
"mg",
"as",
"tt",
"haw",
"ln",
"ha",
"ba",
"jw",
"su",
"license:mit",
"region:us"
] | automatic-speech-recognition | guillaumekln | null | null | guillaumekln/faster-whisper-base | 8 | 66,726 | ctranslate2 | 2023-03-23T10:19:37 | ---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- 'no'
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper base model for CTranslate2
This repository contains the conversion of [openai/whisper-base](https://huggingface.co/openai/whisper-base) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
model = WhisperModel("base")
segments, info = model.transcribe("audio.mp3")
for segment in segments:
    print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-base --output_dir faster-whisper-base \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
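The CTranslate2 quantization documentation linked above lists int8 among the supported compute types. As a rough sketch of what a symmetric int8 quantization of a weight tensor looks like (a deliberate simplification for illustration; this is not CTranslate2's actual implementation):

```python
def quantize_int8(weights):
    # Symmetric per-tensor int8 quantization: w ≈ scale * q with q in [-127, 127].
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    # Recover an approximation of the original weights from the int8 values.
    return [x * scale for x in q]

weights = [0.02, -0.05, 0.013]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
```

The per-element reconstruction error is bounded by the scale, which is why int8 inference typically trades a small accuracy loss for lower memory use and faster CPU execution.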
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-base).**
| 1,996 | [
[
0.00960540771484375,
-0.0276031494140625,
0.0156707763671875,
0.032379150390625,
-0.031890869140625,
-0.019561767578125,
-0.033294677734375,
-0.0244140625,
0.0005741119384765625,
0.059051513671875,
-0.034149169921875,
-0.047576904296875,
-0.04425048828125,
-0.0308685302734375,
-0.0256195068359375,
0.07037353515625,
-0.01025390625,
0.0172119140625,
0.030426025390625,
-0.007053375244140625,
-0.03265380859375,
-0.019775390625,
-0.048004150390625,
-0.0294189453125,
0.00490570068359375,
0.01666259765625,
0.0458984375,
0.029205322265625,
0.029266357421875,
0.017913818359375,
-0.0260772705078125,
-0.006488800048828125,
-0.0238800048828125,
-0.006011962890625,
0.01486968994140625,
-0.049774169921875,
-0.056671142578125,
0.00583648681640625,
0.0479736328125,
0.0206451416015625,
-0.0225982666015625,
0.0288848876953125,
-0.019287109375,
0.0280609130859375,
-0.04345703125,
0.0206146240234375,
-0.04205322265625,
-0.0006642341613769531,
-0.0107574462890625,
-0.0201873779296875,
-0.044891357421875,
-0.025054931640625,
0.039794921875,
-0.0611572265625,
0.01322174072265625,
-0.007732391357421875,
0.06390380859375,
0.0198516845703125,
-0.04022216796875,
-0.0252685546875,
-0.06884765625,
0.0704345703125,
-0.054840087890625,
0.022857666015625,
0.0198974609375,
0.0330810546875,
0.01442718505859375,
-0.0784912109375,
-0.00969696044921875,
-0.01087188720703125,
0.01409912109375,
0.0264892578125,
-0.0322265625,
0.020843505859375,
0.0133514404296875,
0.038848876953125,
-0.049652099609375,
-0.0171051025390625,
-0.052581787109375,
-0.0309600830078125,
0.0294189453125,
0.004436492919921875,
0.0167999267578125,
-0.01528167724609375,
-0.0277557373046875,
-0.041351318359375,
-0.04583740234375,
0.0014247894287109375,
0.03741455078125,
0.01904296875,
-0.045989990234375,
0.046661376953125,
0.004520416259765625,
0.028839111328125,
0.01314544677734375,
-0.0213165283203125,
0.03546142578125,
-0.0181884765625,
-0.0136871337890625,
0.036956787109375,
0.0430908203125,
0.03680419921875,
0.01268768310546875,
0.0158233642578125,
-0.01727294921875,
-0.000324249267578125,
0.00960540771484375,
-0.081787109375,
-0.03472900390625,
0.0209503173828125,
-0.06671142578125,
-0.022247314453125,
0.0019130706787109375,
-0.01490020751953125,
0.00823211669921875,
-0.00959014892578125,
0.044189453125,
-0.04010009765625,
-0.050323486328125,
0.0291290283203125,
-0.0428466796875,
0.01207733154296875,
0.0257415771484375,
-0.062103271484375,
0.038604736328125,
0.0408935546875,
0.08062744140625,
0.0019550323486328125,
-0.00725555419921875,
-0.019500732421875,
0.0231170654296875,
-0.00850677490234375,
0.040008544921875,
0.0028438568115234375,
-0.04888916015625,
-0.00312042236328125,
-0.010162353515625,
-0.00804901123046875,
-0.04827880859375,
0.0465087890625,
-0.01806640625,
0.0251922607421875,
0.0178985595703125,
-0.0201263427734375,
-0.0164031982421875,
0.0022430419921875,
-0.050140380859375,
0.07684326171875,
0.039764404296875,
-0.05279541015625,
-0.011993408203125,
-0.05755615234375,
-0.0151214599609375,
-0.00966644287109375,
0.036041259765625,
-0.025482177734375,
0.0190887451171875,
-0.006290435791015625,
-0.00632476806640625,
-0.0268096923828125,
0.0229949951171875,
-0.004688262939453125,
-0.03204345703125,
0.017974853515625,
-0.031158447265625,
0.0751953125,
0.0213623046875,
-0.002353668212890625,
0.025787353515625,
-0.046630859375,
0.006267547607421875,
-0.005077362060546875,
-0.0274200439453125,
-0.02294921875,
-0.012725830078125,
0.03814697265625,
-0.00910186767578125,
0.0244903564453125,
-0.040679931640625,
0.0193328857421875,
-0.042022705078125,
0.05206298828125,
0.02838134765625,
-0.002979278564453125,
0.0364990234375,
-0.0308990478515625,
0.004436492919921875,
0.0170440673828125,
0.0281829833984375,
-0.002986907958984375,
-0.0428466796875,
-0.055023193359375,
-0.004657745361328125,
0.02618408203125,
0.035736083984375,
-0.0300445556640625,
0.0266571044921875,
-0.0235137939453125,
-0.06781005859375,
-0.06573486328125,
-0.031463623046875,
0.021514892578125,
0.02960205078125,
0.04595947265625,
-0.0082244873046875,
-0.061279296875,
-0.059967041015625,
-0.005352020263671875,
-0.034332275390625,
-0.01393890380859375,
0.01239013671875,
0.03472900390625,
-0.0143280029296875,
0.05377197265625,
-0.046173095703125,
-0.03302001953125,
-0.0195159912109375,
0.02880859375,
0.0224151611328125,
0.06329345703125,
0.049407958984375,
-0.059417724609375,
-0.0171661376953125,
-0.0153656005859375,
-0.0232391357421875,
-0.004680633544921875,
-0.0006632804870605469,
-0.01055908203125,
0.0016498565673828125,
-0.0018148422241210938,
-0.060211181640625,
0.03350830078125,
0.057159423828125,
-0.0243377685546875,
0.0439453125,
0.00662994384765625,
-0.0014553070068359375,
-0.09539794921875,
0.013427734375,
-0.004390716552734375,
-0.01139068603515625,
-0.051971435546875,
-0.00024437904357910156,
0.0158233642578125,
0.00975799560546875,
-0.05230712890625,
0.04315185546875,
-0.0107574462890625,
-0.004817962646484375,
-0.0179595947265625,
-0.02838134765625,
-0.00844573974609375,
0.0243988037109375,
0.0364990234375,
0.056915283203125,
0.034423828125,
-0.029083251953125,
0.0145416259765625,
0.04302978515625,
-0.01015472412109375,
0.0097503662109375,
-0.07861328125,
0.01256561279296875,
0.018402099609375,
0.037017822265625,
-0.04803466796875,
0.0030498504638671875,
0.0206451416015625,
-0.053924560546875,
0.009796142578125,
-0.058258056640625,
-0.032684326171875,
-0.009765625,
-0.034210205078125,
0.035247802734375,
0.0570068359375,
-0.02801513671875,
0.056396484375,
0.016693115234375,
0.007244110107421875,
-0.0123138427734375,
-0.07733154296875,
-0.005809783935546875,
-0.007541656494140625,
-0.05621337890625,
0.056396484375,
-0.020233154296875,
-0.01306915283203125,
-0.00521087646484375,
-0.006778717041015625,
-0.0215911865234375,
-0.00959014892578125,
0.039337158203125,
0.028839111328125,
-0.037506103515625,
-0.0217437744140625,
0.031524658203125,
-0.031524658203125,
0.00933074951171875,
-0.043060302734375,
0.058441162109375,
-0.014068603515625,
-0.0016384124755859375,
-0.044677734375,
0.00888824462890625,
0.04278564453125,
-0.016998291015625,
0.031280517578125,
0.049468994140625,
-0.03076171875,
-0.0142822265625,
-0.033660888671875,
-0.01213836669921875,
-0.034912109375,
0.0176849365234375,
-0.029327392578125,
-0.052886962890625,
0.031982421875,
0.00811004638671875,
0.0027408599853515625,
0.05950927734375,
0.049224853515625,
0.006504058837890625,
0.07904052734375,
0.029541015625,
0.0175628662109375,
0.03106689453125,
-0.0570068359375,
-0.0208740234375,
-0.08709716796875,
-0.0242156982421875,
-0.046539306640625,
-0.006805419921875,
-0.0194244384765625,
-0.018768310546875,
0.034820556640625,
0.01490020751953125,
-0.03179931640625,
0.052825927734375,
-0.051666259765625,
-0.004817962646484375,
0.040313720703125,
0.01355743408203125,
0.026397705078125,
0.0025653839111328125,
0.01495361328125,
-0.00855255126953125,
-0.021636962890625,
-0.02740478515625,
0.08465576171875,
0.044677734375,
0.05487060546875,
0.02276611328125,
0.045654296875,
0.009979248046875,
0.01458740234375,
-0.06439208984375,
0.01505279541015625,
-0.017120361328125,
-0.04150390625,
0.002521514892578125,
-0.02130126953125,
-0.0460205078125,
0.00632476806640625,
0.004955291748046875,
-0.038726806640625,
0.006805419921875,
0.0035915374755859375,
-0.0048828125,
0.027557373046875,
-0.038665771484375,
0.051422119140625,
0.006084442138671875,
0.0281829833984375,
-0.0157623291015625,
-0.0284271240234375,
0.037750244140625,
0.018341064453125,
-0.0281982421875,
0.0041656494140625,
-0.00421142578125,
0.07415771484375,
-0.052520751953125,
0.0592041015625,
-0.0321044921875,
-0.0142669677734375,
0.050018310546875,
0.01143646240234375,
0.02301025390625,
0.0078277587890625,
-0.0144195556640625,
0.03662109375,
0.0250701904296875,
-0.0030269622802734375,
-0.02557373046875,
0.03875732421875,
-0.09619140625,
-0.004985809326171875,
-0.014251708984375,
-0.039520263671875,
0.0220947265625,
0.00859832763671875,
0.047119140625,
0.04766845703125,
-0.0036182403564453125,
0.015899658203125,
0.0511474609375,
0.0159912109375,
0.025115966796875,
0.045867919921875,
-0.0034847259521484375,
-0.059478759765625,
0.051727294921875,
0.01461029052734375,
0.022216796875,
0.031280517578125,
0.0258636474609375,
-0.03924560546875,
-0.07440185546875,
-0.03472900390625,
0.0018749237060546875,
-0.046295166015625,
-0.027679443359375,
-0.051727294921875,
-0.042236328125,
-0.038604736328125,
0.01251983642578125,
-0.048095703125,
-0.053466796875,
-0.03973388671875,
0.018402099609375,
0.050384521484375,
0.03253173828125,
0.0026912689208984375,
0.05419921875,
-0.07989501953125,
0.0223388671875,
0.0006833076477050781,
0.00724029541015625,
0.01448822021484375,
-0.07159423828125,
-0.007110595703125,
0.01348876953125,
-0.02276611328125,
-0.063720703125,
0.0322265625,
0.002201080322265625,
0.01502227783203125,
0.0088653564453125,
0.004550933837890625,
0.05804443359375,
-0.016510009765625,
0.071533203125,
0.025543212890625,
-0.0889892578125,
0.051971435546875,
-0.03790283203125,
0.0125885009765625,
0.038787841796875,
0.007465362548828125,
-0.04510498046875,
-0.00463104248046875,
-0.04486083984375,
-0.057373046875,
0.05804443359375,
0.0333251953125,
-0.0024967193603515625,
0.0193023681640625,
0.0145416259765625,
0.010772705078125,
0.01531982421875,
-0.054351806640625,
-0.0259552001953125,
-0.0545654296875,
-0.032745361328125,
0.0256805419921875,
-0.021087646484375,
-0.005031585693359375,
-0.01251983642578125,
0.0556640625,
-0.0148773193359375,
0.02703857421875,
0.038177490234375,
-0.0201568603515625,
-0.014892578125,
0.0021209716796875,
0.04803466796875,
0.01004791259765625,
-0.0469970703125,
-0.006855010986328125,
0.01184844970703125,
-0.0645751953125,
-0.00283050537109375,
-0.0006208419799804688,
-0.027862548828125,
0.0156707763671875,
0.04058837890625,
0.0638427734375,
0.0278167724609375,
-0.021575927734375,
0.048095703125,
-0.003963470458984375,
-0.033203125,
-0.0595703125,
0.01177978515625,
0.0198516845703125,
0.01161956787109375,
0.038543701171875,
0.0240325927734375,
0.0213165283203125,
-0.031402587890625,
-0.0120849609375,
0.005889892578125,
-0.036376953125,
-0.039794921875,
0.053436279296875,
0.01358795166015625,
-0.0272979736328125,
0.048797607421875,
0.0125579833984375,
-0.020721435546875,
0.042205810546875,
0.058837890625,
0.09381103515625,
-0.00896453857421875,
0.00185394287109375,
0.0391845703125,
0.017364501953125,
-0.01473236083984375,
0.04949951171875,
-0.0137786865234375,
-0.035491943359375,
-0.0291748046875,
-0.049041748046875,
-0.0350341796875,
0.006439208984375,
-0.070068359375,
0.0227508544921875,
-0.04168701171875,
-0.018524169921875,
0.0005517005920410156,
0.005279541015625,
-0.036376953125,
-0.00015020370483398438,
0.0181884765625,
0.10870361328125,
-0.04632568359375,
0.08465576171875,
0.043243408203125,
-0.0303192138671875,
-0.066650390625,
-0.017303466796875,
-0.005924224853515625,
-0.048919677734375,
0.049774169921875,
0.0084686279296875,
-0.00870513916015625,
0.004608154296875,
-0.052703857421875,
-0.06524658203125,
0.1016845703125,
0.0048980712890625,
-0.03759765625,
-0.018798828125,
0.00756072998046875,
0.037200927734375,
-0.0443115234375,
0.0521240234375,
0.025238037109375,
0.056610107421875,
0.00849151611328125,
-0.088134765625,
0.0004534721374511719,
-0.0160675048828125,
0.01654052734375,
0.0037746429443359375,
-0.057769775390625,
0.09088134765625,
-0.007610321044921875,
-0.0033111572265625,
0.061859130859375,
0.051239013671875,
0.0309600830078125,
0.0216522216796875,
0.038818359375,
0.035125732421875,
0.0235748291015625,
-0.0232086181640625,
0.04144287109375,
-0.0167236328125,
0.0390625,
0.06561279296875,
-0.0242156982421875,
0.0732421875,
0.0228118896484375,
-0.0021457672119140625,
0.060394287109375,
0.0267791748046875,
-0.023590087890625,
0.044952392578125,
-0.0024089813232421875,
-0.0017442703247070312,
-0.004505157470703125,
-0.005218505859375,
-0.0401611328125,
0.050079345703125,
0.0321044921875,
-0.0264434814453125,
-0.01470947265625,
-0.01335906982421875,
0.003749847412109375,
-0.01739501953125,
-0.02813720703125,
0.04937744140625,
-0.0099639892578125,
-0.033050537109375,
0.05242919921875,
0.0287322998046875,
0.07098388671875,
-0.07623291015625,
-0.01380157470703125,
0.02044677734375,
0.01371002197265625,
-0.022918701171875,
-0.056549072265625,
0.032867431640625,
-0.0133514404296875,
-0.0291748046875,
-0.003452301025390625,
0.053985595703125,
-0.04669189453125,
-0.02337646484375,
0.029937744140625,
0.0172576904296875,
0.0199737548828125,
-0.02459716796875,
-0.05474853515625,
0.0309600830078125,
0.01544952392578125,
-0.023956298828125,
0.01377105712890625,
-0.01123809814453125,
-0.003322601318359375,
0.031280517578125,
0.07110595703125,
0.01434326171875,
0.002292633056640625,
0.0109405517578125,
0.054473876953125,
-0.04937744140625,
-0.052276611328125,
-0.0295562744140625,
0.044403076171875,
-0.0005841255187988281,
-0.057281494140625,
0.035400390625,
0.06549072265625,
0.037750244140625,
-0.024627685546875,
0.052398681640625,
0.0018329620361328125,
0.030242919921875,
-0.05755615234375,
0.056671142578125,
-0.02130126953125,
-0.01313018798828125,
-0.0009984970092773438,
-0.0621337890625,
0.006427764892578125,
0.02691650390625,
0.00466156005859375,
-0.013580322265625,
0.043487548828125,
0.060394287109375,
-0.0113983154296875,
0.01959228515625,
-0.0010633468627929688,
0.0286407470703125,
0.0224151611328125,
0.044586181640625,
0.04986572265625,
-0.073486328125,
0.05340576171875,
-0.0270843505859375,
-0.00446319580078125,
-0.006317138671875,
-0.045867919921875,
-0.06719970703125,
-0.041107177734375,
-0.03350830078125,
-0.047088623046875,
-0.0019989013671875,
0.0673828125,
0.07666015625,
-0.05340576171875,
-0.033111572265625,
0.0007734298706054688,
-0.008636474609375,
-0.0017900466918945312,
-0.0216827392578125,
0.034393310546875,
0.024444580078125,
-0.056793212890625,
0.043701171875,
0.004833221435546875,
0.033843994140625,
-0.0223541259765625,
-0.032745361328125,
0.02197265625,
-0.00556182861328125,
0.026275634765625,
0.004268646240234375,
-0.059814453125,
-0.0086212158203125,
-0.00858306884765625,
-0.0025959014892578125,
0.01473236083984375,
0.05908203125,
-0.060089111328125,
0.0026187896728515625,
0.047454833984375,
-0.0172882080078125,
0.055206298828125,
-0.017425537109375,
0.022064208984375,
-0.041229248046875,
0.0311279296875,
0.021087646484375,
0.030731201171875,
0.01180267333984375,
-0.00959014892578125,
0.030303955078125,
0.022491455078125,
-0.017730712890625,
-0.06683349609375,
0.002330780029296875,
-0.1055908203125,
-0.0034275054931640625,
0.0802001953125,
0.0000033974647521972656,
-0.025421142578125,
0.01220703125,
-0.0465087890625,
0.0287017822265625,
-0.049560546875,
0.0077056884765625,
0.01026153564453125,
0.0256195068359375,
-0.005855560302734375,
-0.0203857421875,
0.0282440185546875,
-0.0014944076538085938,
-0.0262603759765625,
0.0032863616943359375,
0.0103759765625,
0.04278564453125,
0.0350341796875,
0.04364013671875,
-0.035400390625,
0.036407470703125,
0.0197906494140625,
0.033172607421875,
-0.031982421875,
-0.02783203125,
-0.02264404296875,
0.004131317138671875,
-0.005764007568359375,
-0.0242156982421875
]
] |
google/pegasus-cnn_dailymail | 2023-01-24T16:42:26.000Z | [
"transformers",
"pytorch",
"rust",
"pegasus",
"text2text-generation",
"summarization",
"en",
"arxiv:1912.08777",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | summarization | google | null | null | google/pegasus-cnn_dailymail | 52 | 66,309 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- summarization
---
### Pegasus Models
See Docs: [here](https://huggingface.co/transformers/master/model_doc/pegasus.html)
Original TF 1 code [here](https://github.com/google-research/pegasus)
Authors: Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu on Dec 18, 2019
Maintained by: [@sshleifer](https://twitter.com/sam_shleifer)
Task: Summarization
The following is copied from the authors' README.
# Mixed & Stochastic Checkpoints
We train a Pegasus model with sampled gap sentence ratios on both C4 and HugeNews, and stochastically sample important sentences. The updated results are reported in this table.
| dataset | C4 | HugeNews | Mixed & Stochastic|
| ---- | ---- | ---- | ----|
| xsum | 45.20/22.06/36.99 | 47.21/24.56/39.25 | 47.60/24.83/39.64|
| cnn_dailymail | 43.90/21.20/40.76 | 44.17/21.47/41.11 | 44.16/21.56/41.30|
| newsroom | 45.07/33.39/41.28 | 45.15/33.51/41.33 | 45.98/34.20/42.18|
| multi_news | 46.74/17.95/24.26 | 47.52/18.72/24.91 | 47.65/18.75/24.95|
| gigaword | 38.75/19.96/36.14 | 39.12/19.86/36.24 | 39.65/20.47/36.76|
| wikihow | 43.07/19.70/34.79 | 41.35/18.51/33.42 | 46.39/22.12/38.41 *|
| reddit_tifu | 26.54/8.94/21.64 | 26.63/9.01/21.60 | 27.99/9.81/22.94|
| big_patent | 53.63/33.16/42.25 | 53.41/32.89/42.07 | 52.29/33.08/41.66 *|
| arxiv | 44.70/17.27/25.80 | 44.67/17.18/25.73 | 44.21/16.95/25.67|
| pubmed | 45.49/19.90/27.69 | 45.09/19.56/27.42 | 45.97/20.15/28.25|
| aeslc | 37.69/21.85/36.84 | 37.40/21.22/36.45 | 37.68/21.25/36.51|
| billsum | 57.20/39.56/45.80 | 57.31/40.19/45.82 | 59.67/41.58/47.59|
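Each table cell packs ROUGE-1/ROUGE-2/ROUGE-L into a single slash-separated string; a small helper (illustrative only, not part of the original release) to split a cell into floats:

```python
def parse_rouge(cell: str) -> dict:
    """Split a 'R1/R2/RL' table cell (e.g. '47.60/24.83/39.64') into floats."""
    # A trailing ' *' marks numbers that are not comparable across columns.
    r1, r2, rl = (float(part) for part in cell.rstrip(" *").split("/"))
    return {"rouge1": r1, "rouge2": r2, "rougeL": rl}
```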
The "Mixed & Stochastic" model has the following changes (from pegasus-large in the paper):
- trained on both C4 and HugeNews (the dataset mixture is weighted by their number of examples).
- trained for 1.5M steps instead of 500k (we observe slower convergence on pretraining perplexity).
- the model uniformly samples a gap-sentence ratio between 15% and 45%.
- important sentences are sampled with 20% uniform noise added to their importance scores.
- the SentencePiece tokenizer is updated to be able to encode the newline character.
(*) the wikihow and big_patent numbers are not comparable because of changes in tokenization and data:
- the wikihow dataset contains newline characters, which are useful for paragraph segmentation; the C4 and HugeNews models' SentencePiece tokenizer doesn't encode newlines and loses this information.
- we update the BigPatent dataset to preserve casing; some format cleanings are also changed (please refer to the changes in TFDS).
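The stochastic sampling described in the change list can be sketched in a few lines. This is a hedged illustration of the idea, not the authors' pretraining code; all names are my own:

```python
import random

def sample_gap_sentence_ratio(rng: random.Random) -> float:
    # The gap-sentence ratio is drawn uniformly from [0.15, 0.45].
    return rng.uniform(0.15, 0.45)

def noisy_importance(scores, rng: random.Random, noise: float = 0.20):
    # Multiply each importance score by a factor in [0.8, 1.2]
    # (20% uniform noise) before selecting the top sentences.
    return [s * (1.0 + rng.uniform(-noise, noise)) for s in scores]

def pick_important(sentences, scores, rng: random.Random, ratio: float):
    # Select roughly ratio * len(sentences) sentences by noisy importance,
    # returning them in document order.
    k = max(1, round(ratio * len(sentences)))
    noisy = noisy_importance(scores, rng)
    ranked = sorted(range(len(sentences)), key=lambda i: noisy[i], reverse=True)
    return [sentences[i] for i in sorted(ranked[:k])]
```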
## Citation
```bibtex
@misc{zhang2019pegasus,
title={PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization},
author={Jingqing Zhang and Yao Zhao and Mohammad Saleh and Peter J. Liu},
year={2019},
eprint={1912.08777},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 3,332 | [
[
-0.0284423828125,
-0.05816650390625,
0.0289306640625,
0.020721435546875,
-0.0264892578125,
-0.0250701904296875,
-0.0107269287109375,
-0.033721923828125,
0.039398193359375,
0.0221405029296875,
-0.058349609375,
-0.045867919921875,
-0.05474853515625,
-0.0013942718505859375,
-0.030426025390625,
0.07452392578125,
-0.0017833709716796875,
-0.00412750244140625,
0.0084228515625,
-0.0005640983581542969,
-0.01175689697265625,
-0.0239105224609375,
-0.047027587890625,
0.006916046142578125,
0.02874755859375,
0.010345458984375,
0.047882080078125,
0.047943115234375,
0.0340576171875,
0.018829345703125,
-0.035491943359375,
-0.007762908935546875,
-0.01480865478515625,
-0.0264434814453125,
0.0186309814453125,
-0.01070404052734375,
-0.0252227783203125,
0.001728057861328125,
0.04132080078125,
0.06671142578125,
-0.01157379150390625,
0.003543853759765625,
0.0160064697265625,
0.035400390625,
-0.030181884765625,
0.008392333984375,
-0.0201873779296875,
0.0182037353515625,
-0.0190887451171875,
0.00919342041015625,
-0.003978729248046875,
-0.0144500732421875,
0.01387786865234375,
-0.0701904296875,
0.034088134765625,
0.0023059844970703125,
0.1077880859375,
0.0213775634765625,
-0.0289764404296875,
-0.011077880859375,
-0.006038665771484375,
0.058685302734375,
-0.0736083984375,
0.022552490234375,
0.00725555419921875,
-0.0090179443359375,
-0.01537322998046875,
-0.080322265625,
-0.055023193359375,
0.0004405975341796875,
-0.0034332275390625,
0.02386474609375,
-0.00885772705078125,
0.0015459060668945312,
0.0230712890625,
0.045013427734375,
-0.034637451171875,
0.02032470703125,
-0.040374755859375,
-0.0171051025390625,
0.044281005859375,
0.01348114013671875,
0.0145721435546875,
-0.0273895263671875,
-0.04046630859375,
-0.00629425048828125,
-0.0272216796875,
0.031982421875,
0.0192108154296875,
0.00994873046875,
-0.0282745361328125,
0.039886474609375,
-0.0210418701171875,
0.050872802734375,
0.00022602081298828125,
-0.003993988037109375,
0.05657958984375,
-0.04058837890625,
-0.0268707275390625,
0.0006480216979980469,
0.0748291015625,
0.0408935546875,
0.006961822509765625,
0.01100921630859375,
0.0030384063720703125,
-0.00890350341796875,
0.0081634521484375,
-0.07208251953125,
0.004390716552734375,
0.017120361328125,
-0.038299560546875,
-0.0189056396484375,
0.0309295654296875,
-0.05914306640625,
-0.003589630126953125,
-0.01012420654296875,
0.0216827392578125,
-0.032806396484375,
-0.004810333251953125,
0.033843994140625,
-0.019775390625,
0.02093505859375,
0.0257415771484375,
-0.0665283203125,
-0.0005354881286621094,
0.04937744140625,
0.07427978515625,
0.0031070709228515625,
-0.0426025390625,
-0.0224609375,
-0.009735107421875,
-0.03057861328125,
0.047821044921875,
-0.021087646484375,
-0.00873565673828125,
0.002773284912109375,
0.02203369140625,
-0.0283355712890625,
-0.0110931396484375,
0.06866455078125,
-0.025421142578125,
0.052642822265625,
-0.0251312255859375,
-0.049774169921875,
-0.01146697998046875,
0.01287841796875,
-0.052001953125,
0.082763671875,
0.01139068603515625,
-0.086181640625,
0.052703857421875,
-0.04669189453125,
-0.0171356201171875,
-0.0251922607421875,
0.0027217864990234375,
-0.04998779296875,
-0.0149383544921875,
0.039459228515625,
0.0239410400390625,
-0.01148223876953125,
0.0275115966796875,
-0.0245208740234375,
-0.0355224609375,
0.005130767822265625,
-0.0143890380859375,
0.07269287109375,
0.01422119140625,
-0.040374755859375,
0.0124359130859375,
-0.048736572265625,
-0.0192413330078125,
-0.00749969482421875,
-0.0228424072265625,
-0.004154205322265625,
-0.0180206298828125,
0.0027942657470703125,
0.035400390625,
0.0111236572265625,
-0.0443115234375,
0.01276397705078125,
-0.055694580078125,
0.050048828125,
0.047882080078125,
0.02630615234375,
0.0174102783203125,
-0.034454345703125,
0.0306243896484375,
0.033172607421875,
0.0158538818359375,
-0.0246429443359375,
-0.03753662109375,
-0.08526611328125,
-0.0242767333984375,
0.032867431640625,
0.022186279296875,
-0.0389404296875,
0.0587158203125,
-0.020263671875,
-0.028839111328125,
-0.034881591796875,
-0.0161895751953125,
0.0102996826171875,
0.057861328125,
0.032928466796875,
-0.0223541259765625,
-0.0309295654296875,
-0.0811767578125,
-0.0151519775390625,
-0.00710296630859375,
-0.0145263671875,
-0.00250244140625,
0.05126953125,
-0.0109100341796875,
0.06781005859375,
-0.042327880859375,
-0.01189422607421875,
-0.01092529296875,
0.00939178466796875,
0.037567138671875,
0.035736083984375,
0.031494140625,
-0.058013916015625,
-0.01450347900390625,
-0.035919189453125,
-0.050506591796875,
-0.0005636215209960938,
-0.01438140869140625,
-0.01088714599609375,
0.020477294921875,
0.057098388671875,
-0.07318115234375,
0.03094482421875,
0.0141754150390625,
-0.0413818359375,
0.044403076171875,
0.0008440017700195312,
0.00971221923828125,
-0.104736328125,
0.0120086669921875,
0.018524169921875,
0.00899505615234375,
-0.038909912109375,
-0.0073394775390625,
0.002445220947265625,
-0.00605010986328125,
-0.0450439453125,
0.03399658203125,
-0.0413818359375,
0.0011692047119140625,
0.007350921630859375,
0.008880615234375,
0.0102996826171875,
0.05474853515625,
-0.00019800662994384766,
0.058135986328125,
0.0292816162109375,
-0.043426513671875,
-0.0014104843139648438,
0.035186767578125,
-0.05670166015625,
0.003753662109375,
-0.05706787109375,
-0.0205230712890625,
-0.01139068603515625,
0.0305938720703125,
-0.068359375,
-0.02105712890625,
0.0149078369140625,
-0.04315185546875,
0.0029811859130859375,
0.02178955078125,
-0.026580810546875,
-0.03900146484375,
-0.03985595703125,
0.00926971435546875,
0.0229034423828125,
-0.0219879150390625,
0.0173492431640625,
0.027618408203125,
-0.02886962890625,
-0.055816650390625,
-0.07574462890625,
0.003849029541015625,
-0.026763916015625,
-0.06744384765625,
0.047271728515625,
0.007305145263671875,
0.0026721954345703125,
0.004314422607421875,
-0.006069183349609375,
-0.0098419189453125,
0.006511688232421875,
0.00914764404296875,
0.0249176025390625,
-0.0250701904296875,
0.01317596435546875,
-0.0003726482391357422,
-0.0124969482421875,
-0.00325775146484375,
-0.02178955078125,
0.031982421875,
-0.005588531494140625,
-0.00982666015625,
-0.038665771484375,
0.00821685791015625,
0.036285400390625,
-0.01276397705078125,
0.06475830078125,
0.04632568359375,
-0.028594970703125,
0.01166534423828125,
-0.03704833984375,
-0.00982666015625,
-0.033172607421875,
0.042144775390625,
-0.0197296142578125,
-0.09283447265625,
0.042724609375,
0.02362060546875,
0.0169677734375,
0.07135009765625,
0.037445068359375,
0.0049896240234375,
0.045074462890625,
0.047698974609375,
-0.0159149169921875,
0.035736083984375,
-0.033111572265625,
0.006229400634765625,
-0.049102783203125,
-0.02130126953125,
-0.031982421875,
-0.0208740234375,
-0.035888671875,
-0.024871826171875,
0.0264739990234375,
0.02227783203125,
-0.03485107421875,
0.0238189697265625,
-0.030364990234375,
0.015228271484375,
0.054901123046875,
0.0033435821533203125,
0.01425933837890625,
0.002292633056640625,
-0.0253753662109375,
-0.02099609375,
-0.059539794921875,
-0.0285186767578125,
0.06146240234375,
0.032318115234375,
0.027862548828125,
0.0011425018310546875,
0.0311431884765625,
0.010833740234375,
0.00554656982421875,
-0.046051025390625,
0.0299224853515625,
-0.004146575927734375,
-0.04388427734375,
-0.03656005859375,
-0.042388916015625,
-0.0750732421875,
0.03167724609375,
-0.0195159912109375,
-0.062225341796875,
0.0198974609375,
-0.0166473388671875,
-0.047271728515625,
0.0158233642578125,
-0.04498291015625,
0.08538818359375,
0.0144500732421875,
-0.0196533203125,
0.0023937225341796875,
-0.06024169921875,
0.043365478515625,
-0.0003986358642578125,
0.0166473388671875,
0.0006361007690429688,
-0.01031494140625,
0.075927734375,
-0.061279296875,
0.04412841796875,
-0.00861358642578125,
0.004352569580078125,
0.0177001953125,
-0.04327392578125,
0.031341552734375,
-0.0070648193359375,
-0.00736236572265625,
0.0159149169921875,
-0.0084228515625,
-0.032806396484375,
-0.0226898193359375,
0.03509521484375,
-0.061126708984375,
-0.0477294921875,
-0.0450439453125,
-0.0263214111328125,
-0.00472259521484375,
0.02362060546875,
0.058258056640625,
0.0248565673828125,
-0.0192413330078125,
0.0183868408203125,
0.0310211181640625,
-0.0286407470703125,
0.062744140625,
0.03045654296875,
0.01361846923828125,
-0.04437255859375,
0.0216064453125,
0.0133209228515625,
0.00688934326171875,
0.0211944580078125,
-0.0036602020263671875,
-0.03216552734375,
-0.02630615234375,
-0.036865234375,
0.0259246826171875,
-0.0215301513671875,
0.0023479461669921875,
-0.07086181640625,
-0.035919189453125,
-0.0552978515625,
-0.005420684814453125,
-0.0177154541015625,
-0.052734375,
-0.038055419921875,
-0.0234222412109375,
0.00917816162109375,
0.0281829833984375,
0.012603759765625,
0.025970458984375,
-0.05572509765625,
-0.0017900466918945312,
0.0106048583984375,
0.006130218505859375,
-0.0028553009033203125,
-0.06756591796875,
-0.036865234375,
-0.003124237060546875,
-0.042572021484375,
-0.05072021484375,
0.032867431640625,
0.006496429443359375,
0.03607177734375,
0.038787841796875,
0.0079498291015625,
0.059234619140625,
-0.0286102294921875,
0.08941650390625,
0.034088134765625,
-0.06854248046875,
0.02764892578125,
-0.033294677734375,
0.0302886962890625,
0.04998779296875,
0.01861572265625,
-0.06060791015625,
-0.040283203125,
-0.07049560546875,
-0.0810546875,
0.06365966796875,
0.030181884765625,
0.0011320114135742188,
0.004276275634765625,
0.01473236083984375,
-0.01296234130859375,
0.0302734375,
-0.06268310546875,
-0.013824462890625,
-0.0231781005859375,
-0.025360107421875,
-0.0101318359375,
-0.022857666015625,
0.0023746490478515625,
-0.005832672119140625,
0.055938720703125,
0.0141448974609375,
0.0220947265625,
0.039642333984375,
-0.0015010833740234375,
0.00984954833984375,
0.0218658447265625,
0.060516357421875,
0.04156494140625,
-0.01007080078125,
-0.01229095458984375,
0.005878448486328125,
-0.04193115234375,
0.0015010833740234375,
0.04803466796875,
-0.022369384765625,
0.0090179443359375,
0.04730224609375,
0.06939697265625,
0.0162200927734375,
-0.035614013671875,
0.06683349609375,
-0.0069122314453125,
-0.04052734375,
-0.043792724609375,
0.0013370513916015625,
0.00519561767578125,
0.0196075439453125,
0.031219482421875,
-0.001598358154296875,
0.016571044921875,
-0.0216217041015625,
0.0235137939453125,
0.001766204833984375,
-0.034637451171875,
-0.01245880126953125,
0.06231689453125,
0.01296234130859375,
0.002277374267578125,
0.034576416015625,
-0.01556396484375,
-0.03826904296875,
0.058441162109375,
0.0177459716796875,
0.058624267578125,
-0.004222869873046875,
0.00801849365234375,
0.050750732421875,
0.042388916015625,
-0.019775390625,
-0.0094146728515625,
0.0128631591796875,
-0.04315185546875,
-0.0333251953125,
-0.048248291015625,
-0.00975799560546875,
0.03045654296875,
-0.041259765625,
0.033782958984375,
-0.0159759521484375,
-0.0113525390625,
0.00724029541015625,
0.00951385498046875,
-0.0233001708984375,
0.021484375,
-0.006740570068359375,
0.09417724609375,
-0.06695556640625,
0.048309326171875,
0.03387451171875,
-0.049468994140625,
-0.073974609375,
0.01983642578125,
-0.0014581680297851562,
-0.034576416015625,
0.0343017578125,
0.0445556640625,
0.0301666259765625,
-0.00803375244140625,
-0.0255279541015625,
-0.07354736328125,
0.08538818359375,
0.0193328857421875,
-0.041656494140625,
-0.0240936279296875,
0.01335906982421875,
0.036102294921875,
-0.0194091796875,
0.0184173583984375,
0.035858154296875,
0.0262298583984375,
0.01535797119140625,
-0.06402587890625,
0.0006775856018066406,
-0.045379638671875,
-0.00496673583984375,
0.024200439453125,
-0.08740234375,
0.08929443359375,
-0.005092620849609375,
-0.019500732421875,
0.00885772705078125,
0.056427001953125,
0.038177490234375,
0.034942626953125,
0.0450439453125,
0.0833740234375,
0.055694580078125,
-0.01277923583984375,
0.07080078125,
-0.0283050537109375,
0.031585693359375,
0.06756591796875,
0.011138916015625,
0.045623779296875,
0.0298309326171875,
-0.0123748779296875,
0.041046142578125,
0.08197021484375,
-0.005519866943359375,
0.033416748046875,
0.008514404296875,
-0.00835418701171875,
-0.00018978118896484375,
0.0036945343017578125,
-0.05035400390625,
0.01312255859375,
0.0160369873046875,
-0.0440673828125,
-0.00812530517578125,
-0.00913238525390625,
0.032562255859375,
-0.0260467529296875,
-0.0148468017578125,
0.0384521484375,
0.01532745361328125,
-0.058013916015625,
0.033905029296875,
0.0232696533203125,
0.055908203125,
-0.0426025390625,
0.02130126953125,
-0.01837158203125,
0.004489898681640625,
-0.01474761962890625,
-0.040374755859375,
0.01837158203125,
0.003597259521484375,
-0.006031036376953125,
-0.0029582977294921875,
0.037750244140625,
-0.035247802734375,
-0.048797607421875,
0.0016765594482421875,
0.0185699462890625,
0.0018396377563476562,
-0.0018625259399414062,
-0.06072998046875,
-0.0216827392578125,
0.010650634765625,
-0.047271728515625,
-0.00994110107421875,
0.050048828125,
0.01221466064453125,
0.0271759033203125,
0.044464111328125,
-0.0021038055419921875,
0.0030059814453125,
0.0026092529296875,
0.07501220703125,
-0.075439453125,
-0.0782470703125,
-0.060333251953125,
0.05499267578125,
-0.031341552734375,
-0.05914306640625,
0.0643310546875,
0.060638427734375,
0.0467529296875,
0.00766754150390625,
0.048004150390625,
0.0010967254638671875,
0.045745849609375,
-0.07135009765625,
0.036773681640625,
-0.052978515625,
0.0201873779296875,
-0.031829833984375,
-0.061309814453125,
-0.0220184326171875,
0.041015625,
-0.023040771484375,
0.01544189453125,
0.07061767578125,
0.06304931640625,
-0.0002143383026123047,
0.0218658447265625,
-0.006744384765625,
0.0258331298828125,
0.0197296142578125,
0.05810546875,
0.05755615234375,
-0.045562744140625,
0.042816162109375,
-0.0032501220703125,
-0.017974853515625,
-0.0156707763671875,
-0.049407958984375,
-0.057647705078125,
-0.03997802734375,
-0.0205230712890625,
-0.036041259765625,
-0.01226806640625,
0.047332763671875,
0.054534912109375,
-0.03448486328125,
0.003383636474609375,
-0.0229339599609375,
-0.01444244384765625,
0.0023670196533203125,
-0.021270751953125,
0.050201416015625,
-0.02679443359375,
-0.050262451171875,
0.003307342529296875,
0.008270263671875,
0.01425933837890625,
-0.000347137451171875,
0.001312255859375,
-0.0182342529296875,
-0.00783538818359375,
0.0169525146484375,
0.004192352294921875,
-0.04327392578125,
-0.00624847412109375,
0.0087738037109375,
-0.025482177734375,
0.00734710693359375,
0.048431396484375,
-0.0247344970703125,
0.0005850791931152344,
0.02069091796875,
0.04925537109375,
0.073486328125,
0.006938934326171875,
0.0184173583984375,
-0.049591064453125,
0.039337158203125,
0.004039764404296875,
0.0419921875,
0.0198822021484375,
-0.0206146240234375,
0.043670654296875,
0.035858154296875,
-0.04534912109375,
-0.04217529296875,
-0.007537841796875,
-0.0816650390625,
-0.0298004150390625,
0.07366943359375,
-0.01226043701171875,
-0.027008056640625,
0.002132415771484375,
0.004337310791015625,
0.0260009765625,
-0.039306640625,
0.062225341796875,
0.0787353515625,
0.00440216064453125,
-0.01183319091796875,
-0.04345703125,
0.035888671875,
0.034637451171875,
-0.061798095703125,
-0.017913818359375,
0.0478515625,
0.0123138427734375,
0.0169525146484375,
0.06646728515625,
-0.0136871337890625,
0.018829345703125,
0.00922393798828125,
0.001491546630859375,
-0.0035152435302734375,
-0.0021114349365234375,
-0.0207672119140625,
0.0192413330078125,
-0.0166168212890625,
-0.0255126953125
]
] |
SG161222/Realistic_Vision_V2.0 | 2023-04-01T10:11:25.000Z | [
"diffusers",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | null | SG161222 | null | null | SG161222/Realistic_Vision_V2.0 | 290 | 66,202 | diffusers | 2023-03-21T13:06:00 | ---
license: creativeml-openrail-m
---
<b>Please read this!</b><br>
For version 2.0 it is recommended to use the model with this VAE (to improve generation quality and get rid of blue artifacts): https://huggingface.co/stabilityai/sd-vae-ft-mse-original<br>
This model is available on <a href="https://www.mage.space/">Mage.Space</a>, <a href="https://sinkin.ai/">Sinkin.ai</a>, <a href="https://getimg.ai/">GetImg.ai</a> and (<a href="https://randomseed.co/">RandomSeed.co</a> - NSFW content)
<hr/>
<b>I use this template to get good generation results:
Prompt:</b>
RAW photo, *subject*, (high detailed skin:1.2), 8k uhd, dslr, soft lighting, high quality, film grain, Fujifilm XT3
<b>Example:</b> RAW photo, a close up portrait photo of 26 y.o woman in wastelander clothes, long haircut, pale skin, slim body, background is city ruins, (high detailed skin:1.2), 8k uhd, dslr, soft lighting, high quality, film grain, Fujifilm XT3
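The template above can be filled in programmatically; a minimal sketch (the constant and helper names are my own, not part of the model release):

```python
PROMPT_TEMPLATE = (
    "RAW photo, {subject}, (high detailed skin:1.2), 8k uhd, dslr, "
    "soft lighting, high quality, film grain, Fujifilm XT3"
)

def build_prompt(subject: str) -> str:
    # Substitute the *subject* placeholder in the recommended template.
    return PROMPT_TEMPLATE.format(subject=subject)
```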
<b>Negative Prompt:</b>
(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime:1.4), text, close up, cropped, out of frame, worst quality, low quality, jpeg artifacts, ugly, duplicate, morbid, mutilated, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, mutation, deformed, blurry, dehydrated, bad anatomy, bad proportions, extra limbs, cloned face, disfigured, gross proportions, malformed limbs, missing arms, missing legs, extra arms, extra legs, fused fingers, too many fingers, long neck<br>
<b>OR</b><br>
(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime, mutated hands and fingers:1.4), (deformed, distorted, disfigured:1.3), poorly drawn, bad anatomy, wrong anatomy, extra limb, missing limb, floating limbs, disconnected limbs, mutation, mutated, ugly, disgusting, amputation
<b>Euler A or DPM++ 2M Karras with 25 steps<br>
CFG Scale 3.5 - 7<br>
Hires. fix with Latent upscaler<br>
0 Hires steps and Denoising strength 0.25-0.45<br>
Upscale by 1.1-2.0</b> | 2,001 | [
[
-0.0364990234375,
-0.046661376953125,
0.019378662109375,
0.01468658447265625,
-0.04083251953125,
-0.004543304443359375,
0.0194854736328125,
-0.033599853515625,
0.0325927734375,
0.047271728515625,
-0.055755615234375,
-0.041900634765625,
-0.033233642578125,
-0.0126495361328125,
-0.01495361328125,
0.05108642578125,
0.005126953125,
0.01509857177734375,
-0.0079193115234375,
0.01194000244140625,
-0.0465087890625,
-0.0009832382202148438,
-0.054962158203125,
-0.00839996337890625,
0.016448974609375,
0.04248046875,
0.050933837890625,
0.0596923828125,
0.02362060546875,
0.01995849609375,
-0.00868988037109375,
0.0159912109375,
-0.0517578125,
-0.00196075439453125,
-0.01495361328125,
-0.0247955322265625,
-0.07244873046875,
-0.000858306884765625,
0.03448486328125,
0.01346588134765625,
-0.0016880035400390625,
0.0199127197265625,
-0.0037994384765625,
0.052978515625,
-0.045867919921875,
-0.007274627685546875,
-0.01059722900390625,
0.01052093505859375,
-0.036834716796875,
-0.00406646728515625,
-0.00792694091796875,
-0.0155487060546875,
-0.0323486328125,
-0.07171630859375,
0.038177490234375,
-0.00496673583984375,
0.09039306640625,
0.0141143798828125,
-0.01904296875,
0.01392364501953125,
-0.06719970703125,
0.046966552734375,
-0.05377197265625,
0.0274200439453125,
0.0216217041015625,
0.03521728515625,
0.003871917724609375,
-0.07415771484375,
-0.04833984375,
0.004741668701171875,
0.007213592529296875,
0.036651611328125,
-0.033172607421875,
0.0034503936767578125,
0.048095703125,
0.054046630859375,
-0.05865478515625,
0.004413604736328125,
-0.0592041015625,
-0.015167236328125,
0.0626220703125,
0.02008056640625,
0.031219482421875,
-0.0211181640625,
-0.047821044921875,
-0.025299072265625,
-0.06573486328125,
-0.0042877197265625,
0.033935546875,
-0.0088653564453125,
-0.026611328125,
0.049224853515625,
0.00286865234375,
0.03411865234375,
0.038909912109375,
-0.0134429931640625,
0.0172882080078125,
-0.0260467529296875,
-0.01666259765625,
-0.034271240234375,
0.052734375,
0.0645751953125,
0.002246856689453125,
0.023773193359375,
0.002071380615234375,
0.001514434814453125,
0.038330078125,
-0.083740234375,
-0.0153656005859375,
0.0163116455078125,
-0.05078125,
-0.0171966552734375,
-0.00481414794921875,
-0.091796875,
-0.0197601318359375,
-0.0178375244140625,
0.0243072509765625,
-0.03656005859375,
-0.036834716796875,
-0.011383056640625,
-0.0163421630859375,
0.0259857177734375,
0.0304412841796875,
-0.06640625,
0.00794219970703125,
0.0119171142578125,
0.04388427734375,
0.0167083740234375,
0.00739288330078125,
-0.00421142578125,
0.01123809814453125,
-0.039459228515625,
0.05218505859375,
0.000408172607421875,
-0.0390625,
-0.02362060546875,
0.0111846923828125,
0.0142669677734375,
-0.039398193359375,
0.054443359375,
-0.02093505859375,
0.0221099853515625,
-0.01296234130859375,
-0.036285400390625,
-0.015167236328125,
-0.0171966552734375,
-0.06304931640625,
0.053314208984375,
0.03936767578125,
-0.05963134765625,
0.03955078125,
-0.0289154052734375,
-0.006076812744140625,
0.01430511474609375,
-0.0028247833251953125,
-0.043243408203125,
0.0126495361328125,
0.011444091796875,
0.01849365234375,
-0.00897216796875,
0.0004749298095703125,
-0.0221710205078125,
-0.034881591796875,
-0.02508544921875,
-0.039459228515625,
0.059356689453125,
0.03192138671875,
-0.04034423828125,
0.0011816024780273438,
-0.0711669921875,
0.002727508544921875,
0.03955078125,
-0.0108184814453125,
0.006832122802734375,
0.0018930435180664062,
0.016082763671875,
0.03448486328125,
0.0225982666015625,
-0.031036376953125,
0.009307861328125,
-0.04046630859375,
-0.007625579833984375,
0.038909912109375,
0.01374053955078125,
0.01995849609375,
-0.041412353515625,
0.052703857421875,
0.00848388671875,
0.0269927978515625,
-0.0142669677734375,
-0.037933349609375,
-0.0733642578125,
-0.02020263671875,
-0.0006527900695800781,
0.030731201171875,
-0.07061767578125,
0.026611328125,
-0.004680633544921875,
-0.055389404296875,
-0.03692626953125,
0.0010318756103515625,
0.0367431640625,
0.0258941650390625,
0.034271240234375,
-0.0318603515625,
-0.050537109375,
-0.08563232421875,
0.004581451416015625,
-0.002471923828125,
-0.01806640625,
0.023193359375,
0.01430511474609375,
0.00258636474609375,
0.04083251953125,
-0.031280517578125,
-0.01464080810546875,
-0.0012578964233398438,
-0.0115509033203125,
0.0074462890625,
0.04547119140625,
0.061004638671875,
-0.0606689453125,
-0.03228759765625,
-0.00055694580078125,
-0.054107666015625,
-0.00292205810546875,
0.0088653564453125,
-0.01218414306640625,
0.01514434814453125,
0.029815673828125,
-0.0467529296875,
0.05072021484375,
0.036651611328125,
-0.053436279296875,
0.056610107421875,
-0.01959228515625,
0.031982421875,
-0.10015869140625,
0.03021240234375,
0.0155792236328125,
-0.0272979736328125,
-0.051025390625,
0.0321044921875,
0.002765655517578125,
-0.01522064208984375,
-0.055267333984375,
0.04833984375,
-0.03369140625,
0.01003265380859375,
-0.01264190673828125,
-0.006900787353515625,
0.0176239013671875,
0.0264129638671875,
-0.0063934326171875,
0.047088623046875,
0.032867431640625,
-0.036376953125,
0.0325927734375,
0.0158538818359375,
-0.01415252685546875,
0.0655517578125,
-0.053863525390625,
0.00958251953125,
-0.0104522705078125,
-0.00444793701171875,
-0.04547119140625,
-0.035064697265625,
0.039520263671875,
-0.04547119140625,
0.042999267578125,
0.0005578994750976562,
-0.025665283203125,
-0.045379638671875,
-0.0301666259765625,
0.011962890625,
0.06396484375,
-0.042144775390625,
0.035552978515625,
0.007526397705078125,
0.001739501953125,
-0.033721923828125,
-0.04498291015625,
-0.0026454925537109375,
-0.0307769775390625,
-0.0565185546875,
0.03131103515625,
-0.01064300537109375,
-0.0125732421875,
-0.002147674560546875,
-0.008880615234375,
-0.00634002685546875,
-0.0164947509765625,
0.021240234375,
0.021759033203125,
-0.034912109375,
-0.038787841796875,
0.029083251953125,
-0.003093719482421875,
-0.00919342041015625,
0.0105743408203125,
0.03594970703125,
0.01464080810546875,
-0.0306549072265625,
-0.037506103515625,
0.031158447265625,
0.07159423828125,
-0.0015459060668945312,
0.0249481201171875,
0.0689697265625,
-0.0467529296875,
0.0095672607421875,
-0.0457763671875,
-0.0023937225341796875,
-0.03582763671875,
0.0279693603515625,
-0.02056884765625,
-0.049041748046875,
0.04998779296875,
0.0006103515625,
0.00365447998046875,
0.078369140625,
0.04437255859375,
-0.0229339599609375,
0.09649658203125,
0.037261962890625,
0.0193328857421875,
0.03717041015625,
-0.045013427734375,
-0.01122283935546875,
-0.06732177734375,
-0.032470703125,
-0.010101318359375,
-0.026153564453125,
-0.0491943359375,
-0.05224609375,
0.02325439453125,
0.03350830078125,
-0.025787353515625,
0.0284881591796875,
-0.038177490234375,
0.0262298583984375,
0.025787353515625,
0.0200042724609375,
0.00592041015625,
0.0207977294921875,
0.0057220458984375,
-0.0221099853515625,
-0.04736328125,
-0.0472412109375,
0.07574462890625,
0.0182647705078125,
0.0487060546875,
0.0164642333984375,
0.041534423828125,
0.01177978515625,
0.0158233642578125,
-0.026611328125,
0.037811279296875,
-0.044525146484375,
-0.08074951171875,
-0.0007834434509277344,
-0.0197296142578125,
-0.0828857421875,
0.0157623291015625,
-0.0404052734375,
-0.0750732421875,
0.042388916015625,
0.0318603515625,
-0.00920867919921875,
0.03607177734375,
-0.05194091796875,
0.058929443359375,
-0.0030612945556640625,
-0.056610107421875,
-0.0196380615234375,
-0.042999267578125,
0.04656982421875,
0.000059545040130615234,
0.00543975830078125,
-0.0001951456069946289,
0.006404876708984375,
0.03973388671875,
-0.0408935546875,
0.049530029296875,
-0.02056884765625,
0.004547119140625,
0.047760009765625,
0.0062408447265625,
0.020843505859375,
0.0101776123046875,
0.00959014892578125,
0.011383056640625,
0.00013005733489990234,
-0.034637451171875,
-0.0309906005859375,
0.055755615234375,
-0.050994873046875,
-0.054473876953125,
-0.019500732421875,
-0.01274871826171875,
0.01508331298828125,
0.02630615234375,
0.07037353515625,
0.0278778076171875,
-0.0270233154296875,
0.00922393798828125,
0.04150390625,
-0.01218414306640625,
0.0284423828125,
0.016143798828125,
-0.0312347412109375,
-0.0496826171875,
0.07281494140625,
0.0122833251953125,
0.0391845703125,
-0.00768280029296875,
-0.00046443939208984375,
-0.0103759765625,
-0.024017333984375,
-0.071044921875,
0.0214385986328125,
-0.03582763671875,
-0.02435302734375,
-0.021575927734375,
-0.022491455078125,
-0.0269927978515625,
-0.01605224609375,
-0.04107666015625,
-0.023773193359375,
-0.05181884765625,
0.0011959075927734375,
0.0209503173828125,
0.04876708984375,
-0.0092926025390625,
0.01032257080078125,
-0.043548583984375,
0.03753662109375,
0.02020263671875,
0.027923583984375,
-0.00614166259765625,
-0.02825927734375,
-0.0183868408203125,
0.01445770263671875,
-0.061065673828125,
-0.08099365234375,
0.037322998046875,
-0.009124755859375,
0.038482666015625,
0.035888671875,
-0.02728271484375,
0.057464599609375,
-0.0173187255859375,
0.08392333984375,
0.032623291015625,
-0.0438232421875,
0.048065185546875,
-0.0560302734375,
0.0293426513671875,
0.03271484375,
0.01274871826171875,
-0.03057861328125,
-0.02093505859375,
-0.08013916015625,
-0.06842041015625,
0.057952880859375,
0.036895751953125,
0.01161956787109375,
0.00775909423828125,
0.0345458984375,
0.00922393798828125,
0.010955810546875,
-0.05023193359375,
-0.046173095703125,
-0.039520263671875,
0.00827789306640625,
-0.00457763671875,
-0.027191162109375,
0.0069732666015625,
-0.0323486328125,
0.05816650390625,
0.006839752197265625,
0.032745361328125,
0.015106201171875,
0.021728515625,
-0.044647216796875,
-0.00988006591796875,
0.042633056640625,
0.044403076171875,
-0.02484130859375,
-0.0158538818359375,
0.0085296630859375,
-0.047760009765625,
0.0095672607421875,
0.0043182373046875,
-0.0323486328125,
0.013427734375,
0.0153961181640625,
0.07305908203125,
-0.00496673583984375,
-0.01448822021484375,
0.03729248046875,
-0.007419586181640625,
-0.0227813720703125,
-0.017333984375,
0.0184173583984375,
0.0113067626953125,
0.01509857177734375,
0.01861572265625,
0.0187225341796875,
0.021759033203125,
-0.0241546630859375,
-0.0017900466918945312,
0.0094451904296875,
-0.02276611328125,
-0.019989013671875,
0.0650634765625,
0.00765228271484375,
-0.031158447265625,
0.03857421875,
-0.033905029296875,
-0.00872802734375,
0.061981201171875,
0.054962158203125,
0.052581787109375,
-0.043243408203125,
0.015777587890625,
0.07330322265625,
0.0260162353515625,
0.00229644775390625,
0.0411376953125,
0.0306396484375,
-0.022064208984375,
-0.0296783447265625,
-0.048797607421875,
-0.01003265380859375,
0.05279541015625,
-0.046112060546875,
0.050384521484375,
-0.044158935546875,
-0.01151275634765625,
-0.00957489013671875,
-0.003940582275390625,
-0.054656982421875,
0.046844482421875,
0.022735595703125,
0.058563232421875,
-0.065673828125,
0.035308837890625,
0.05609130859375,
-0.0469970703125,
-0.07061767578125,
-0.0184173583984375,
0.0184173583984375,
-0.0313720703125,
0.0059661865234375,
-0.006061553955078125,
0.007724761962890625,
0.01300811767578125,
-0.046966552734375,
-0.08050537109375,
0.10809326171875,
0.040863037109375,
-0.042022705078125,
0.002513885498046875,
-0.0166168212890625,
0.05078125,
-0.019927978515625,
0.0125579833984375,
0.027801513671875,
0.0377197265625,
0.018524169921875,
-0.058258056640625,
0.0087738037109375,
-0.042510986328125,
0.025177001953125,
0.00785064697265625,
-0.0718994140625,
0.057952880859375,
-0.0091400146484375,
-0.04119873046875,
0.027374267578125,
0.048095703125,
0.037689208984375,
0.0428466796875,
0.030548095703125,
0.058441162109375,
0.0248565673828125,
-0.00368499755859375,
0.081298828125,
-0.0162200927734375,
0.027679443359375,
0.052490234375,
0.0244903564453125,
0.03277587890625,
0.0142059326171875,
-0.0204010009765625,
0.03521728515625,
0.064697265625,
-0.025115966796875,
0.0304412841796875,
0.016876220703125,
-0.0125274658203125,
-0.00373077392578125,
0.0021610260009765625,
-0.04498291015625,
0.03326416015625,
0.036529541015625,
-0.0335693359375,
-0.01103973388671875,
-0.002777099609375,
-0.0009274482727050781,
0.010406494140625,
-0.022979736328125,
0.050933837890625,
0.002956390380859375,
-0.0283660888671875,
0.0438232421875,
0.0007867813110351562,
0.040435791015625,
-0.03753662109375,
-0.0244903564453125,
-0.034820556640625,
-0.008331298828125,
-0.028289794921875,
-0.0413818359375,
0.0305328369140625,
-0.001766204833984375,
-0.022979736328125,
0.00005358457565307617,
0.0682373046875,
-0.010711669921875,
-0.05474853515625,
0.01221466064453125,
0.037384033203125,
0.033447265625,
0.0168609619140625,
-0.05487060546875,
-0.0033016204833984375,
0.01003265380859375,
-0.034912109375,
0.00853729248046875,
0.01039886474609375,
0.0033855438232421875,
0.028411865234375,
0.0299072265625,
0.01157379150390625,
-0.01171875,
-0.0007638931274414062,
0.06231689453125,
-0.0338134765625,
-0.0201568603515625,
-0.04638671875,
0.067138671875,
-0.019866943359375,
-0.0308837890625,
0.055419921875,
0.0491943359375,
0.06439208984375,
-0.024688720703125,
0.040191650390625,
-0.0218658447265625,
0.0115966796875,
-0.046539306640625,
0.06634521484375,
-0.0670166015625,
-0.00872802734375,
-0.031982421875,
-0.0985107421875,
-0.01158905029296875,
0.056732177734375,
0.01103973388671875,
0.03759765625,
0.048095703125,
0.057952880859375,
-0.01690673828125,
-0.0181427001953125,
0.03643798828125,
0.00937652587890625,
0.006031036376953125,
0.030731201171875,
0.045135498046875,
-0.0421142578125,
0.0193939208984375,
-0.03680419921875,
-0.0158233642578125,
-0.0202789306640625,
-0.042144775390625,
-0.043548583984375,
-0.046630859375,
-0.036285400390625,
-0.044403076171875,
0.0029888153076171875,
0.04302978515625,
0.070068359375,
-0.041534423828125,
0.00013077259063720703,
0.0021114349365234375,
-0.00847625732421875,
-0.019683837890625,
-0.01910400390625,
0.016571044921875,
0.034332275390625,
-0.0718994140625,
0.02386474609375,
0.0156097412109375,
0.0268402099609375,
-0.0265350341796875,
-0.00431060791015625,
-0.0082550048828125,
-0.0033664703369140625,
0.02960205078125,
0.033935546875,
-0.0545654296875,
-0.0197601318359375,
-0.0140228271484375,
-0.0021038055419921875,
0.011566162109375,
0.0338134765625,
-0.03302001953125,
0.052978515625,
0.035125732421875,
-0.01302337646484375,
0.060791015625,
0.00537109375,
0.03253173828125,
-0.05609130859375,
-0.0015354156494140625,
0.0157623291015625,
0.032196044921875,
0.0267181396484375,
-0.052215576171875,
0.0367431640625,
0.0286102294921875,
-0.034881591796875,
-0.033233642578125,
0.025360107421875,
-0.083251953125,
-0.012664794921875,
0.0797119140625,
-0.0003371238708496094,
-0.01534271240234375,
0.01251220703125,
-0.039459228515625,
0.043060302734375,
-0.01302337646484375,
0.03955078125,
0.03289794921875,
-0.00251007080078125,
-0.0215301513671875,
-0.048248291015625,
0.03643798828125,
0.003265380859375,
-0.05413818359375,
-0.00872039794921875,
0.061981201171875,
0.0239715576171875,
0.029510498046875,
0.051239013671875,
-0.047119140625,
0.04461669921875,
0.03082275390625,
0.0411376953125,
-0.018890380859375,
-0.006244659423828125,
-0.03570556640625,
-0.0004987716674804688,
-0.00604248046875,
-0.0029850006103515625
]
] |
Salesforce/blip-vqa-base | 2023-08-01T14:48:05.000Z | [
"transformers",
"pytorch",
"tf",
"blip",
"question-answering",
"visual-question-answering",
"arxiv:2201.12086",
"license:bsd-3-clause",
"autotrain_compatible",
"has_space",
"region:us"
] | visual-question-answering | Salesforce | null | null | Salesforce/blip-vqa-base | 54 | 66,190 | transformers | 2022-12-12T17:51:53 | ---
pipeline_tag: 'visual-question-answering'
tags:
- visual-question-answering
inference: false
language:
- en
license: bsd-3-clause
---
# BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
Model card for BLIP trained on visual question answering (base architecture, with ViT base backbone).
|  |
|:--:|
| <b>Figure from the official BLIP repository. Image source: https://github.com/salesforce/BLIP</b> |
## TL;DR
The authors of the [paper](https://arxiv.org/abs/2201.12086) write in the abstract:
*Vision-Language Pre-training (VLP) has advanced the performance for many vision-language tasks. However, most existing pre-trained models only excel in either understanding-based tasks or generation-based tasks. Furthermore, performance improvement has been largely achieved by scaling up the dataset with noisy image-text pairs collected from the web, which is a suboptimal source of supervision. In this paper, we propose BLIP, a new VLP framework which transfers flexibly to both vision-language understanding and generation tasks. BLIP effectively utilizes the noisy web data by bootstrapping the captions, where a captioner generates synthetic captions and a filter removes the noisy ones. We achieve state-of-the-art results on a wide range of vision-language tasks, such as image-text retrieval (+2.7% in average recall@1), image captioning (+2.8% in CIDEr), and VQA (+1.6% in VQA score). BLIP also demonstrates strong generalization ability when directly transferred to videolanguage tasks in a zero-shot manner. Code, models, and datasets are released.*
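The caption bootstrapping described in the abstract (a captioner generates synthetic captions, a filter removes noisy pairs) can be sketched generically. This is an illustrative reconstruction of the CapFilt idea, not the authors' implementation; `captioner` and `filter_score` are placeholder callables.

```python
def bootstrap_captions(images, web_captions, captioner, filter_score, threshold=0.5):
    """Generic sketch of BLIP's CapFilt bootstrapping: for each image,
    generate a synthetic caption, then keep only (image, caption) pairs
    that the filter scores as well-matched."""
    cleaned = []
    for img, web_cap in zip(images, web_captions):
        synth_cap = captioner(img)  # synthetic caption for this image
        for cap in (web_cap, synth_cap):
            if filter_score(img, cap) >= threshold:  # drop noisy pairs
                cleaned.append((img, cap))
    return cleaned
```

The cleaned set is then used as higher-quality supervision for pre-training.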
## Usage
You can use this model for visual question answering.
### Using the PyTorch model
#### Running the model on CPU
<details>
<summary> Click to expand </summary>
```python
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForQuestionAnswering
processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-base")
model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-base")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt")
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
>>> 1
```
</details>
#### Running the model on GPU
##### In full precision
<details>
<summary> Click to expand </summary>
```python
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForQuestionAnswering
processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-base")
model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-base").to("cuda")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda")
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
>>> 1
```
</details>
##### In half precision (`float16`)
<details>
<summary> Click to expand </summary>
```python
import torch
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForQuestionAnswering
processor = BlipProcessor.from_pretrained("ybelkada/blip-vqa-base")
model = BlipForQuestionAnswering.from_pretrained("ybelkada/blip-vqa-base", torch_dtype=torch.float16).to("cuda")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16)
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
>>> 1
```
</details>
## BibTex and citation info
```
@misc{https://doi.org/10.48550/arxiv.2201.12086,
doi = {10.48550/ARXIV.2201.12086},
url = {https://arxiv.org/abs/2201.12086},
author = {Li, Junnan and Li, Dongxu and Xiong, Caiming and Hoi, Steven},
keywords = {Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 4,782 | [
[
-0.0207672119140625,
-0.045074462890625,
-0.002285003662109375,
0.03363037109375,
-0.0277252197265625,
-0.0018978118896484375,
-0.030670166015625,
-0.04644775390625,
-0.004505157470703125,
0.0205078125,
-0.0280914306640625,
-0.0274505615234375,
-0.034423828125,
-0.001270294189453125,
-0.01250457763671875,
0.046417236328125,
0.0146026611328125,
0.00904083251953125,
-0.00824737548828125,
0.002193450927734375,
-0.01503753662109375,
-0.01727294921875,
-0.037384033203125,
-0.000431060791015625,
0.0010805130004882812,
0.0198516845703125,
0.035064697265625,
0.03643798828125,
0.054046630859375,
0.031982421875,
-0.01035308837890625,
0.00925445556640625,
-0.020782470703125,
-0.0216827392578125,
-0.00921630859375,
-0.057891845703125,
-0.0143890380859375,
-0.0004596710205078125,
0.045928955078125,
0.0447998046875,
0.003326416015625,
0.031494140625,
0.002521514892578125,
0.039459228515625,
-0.055511474609375,
0.0251312255859375,
-0.05865478515625,
0.00437164306640625,
-0.003612518310546875,
-0.014923095703125,
-0.030364990234375,
-0.006435394287109375,
0.00762939453125,
-0.064453125,
0.04302978515625,
0.0074920654296875,
0.11932373046875,
0.0271148681640625,
0.0185546875,
-0.01497650146484375,
-0.0286407470703125,
0.06396484375,
-0.0438232421875,
0.03662109375,
0.007366180419921875,
0.0226898193359375,
0.003376007080078125,
-0.06671142578125,
-0.057952880859375,
-0.0139007568359375,
-0.0091400146484375,
0.031768798828125,
-0.0187530517578125,
-0.002918243408203125,
0.028839111328125,
0.0273284912109375,
-0.048187255859375,
-0.0055999755859375,
-0.06182861328125,
-0.0222320556640625,
0.038848876953125,
-0.007965087890625,
0.01593017578125,
-0.02496337890625,
-0.03643798828125,
-0.032867431640625,
-0.035888671875,
0.0282440185546875,
-0.004360198974609375,
0.01435089111328125,
-0.0301666259765625,
0.051971435546875,
-0.00678253173828125,
0.06573486328125,
0.0224151611328125,
-0.021453857421875,
0.04962158203125,
-0.0255584716796875,
-0.037322998046875,
-0.01201629638671875,
0.07623291015625,
0.043426513671875,
0.023284912109375,
0.0032100677490234375,
0.003597259521484375,
0.004268646240234375,
0.0028820037841796875,
-0.0712890625,
-0.0260009765625,
0.018218994140625,
-0.0302734375,
-0.01392364501953125,
0.00286102294921875,
-0.07421875,
-0.00737762451171875,
0.00093841552734375,
0.039581298828125,
-0.041229248046875,
-0.01593017578125,
0.0163421630859375,
-0.0263519287109375,
0.0330810546875,
0.02813720703125,
-0.06134033203125,
-0.0052642822265625,
0.020263671875,
0.06890869140625,
0.01271820068359375,
-0.04901123046875,
-0.031951904296875,
0.00769805908203125,
-0.023101806640625,
0.037078857421875,
-0.006420135498046875,
-0.0088958740234375,
-0.00206756591796875,
0.011810302734375,
-0.00638580322265625,
-0.03875732421875,
-0.006435394287109375,
-0.022247314453125,
0.0203399658203125,
-0.0107879638671875,
-0.0204010009765625,
-0.02459716796875,
0.021331787109375,
-0.0208282470703125,
0.06915283203125,
0.0031719207763671875,
-0.06103515625,
0.045654296875,
-0.038177490234375,
-0.0213775634765625,
0.0312347412109375,
-0.02056884765625,
-0.039642333984375,
-0.010009765625,
0.033355712890625,
0.031707763671875,
-0.018951416015625,
0.0008282661437988281,
-0.0142822265625,
-0.026123046875,
0.006786346435546875,
-0.023040771484375,
0.0831298828125,
0.0008950233459472656,
-0.04962158203125,
0.00347137451171875,
-0.059844970703125,
-0.002498626708984375,
0.0165252685546875,
-0.0252838134765625,
0.0006928443908691406,
-0.019256591796875,
0.0148773193359375,
0.01226806640625,
0.04620361328125,
-0.046966552734375,
0.0006422996520996094,
-0.034210205078125,
0.0294189453125,
0.041656494140625,
-0.01666259765625,
0.0241546630859375,
-0.000031828880310058594,
0.0272064208984375,
0.01409912109375,
0.0263671875,
-0.0242156982421875,
-0.048553466796875,
-0.0771484375,
-0.0292205810546875,
0.0003714561462402344,
0.053863525390625,
-0.07318115234375,
0.0281982421875,
-0.0189361572265625,
-0.042236328125,
-0.052520751953125,
0.01450347900390625,
0.055389404296875,
0.064453125,
0.04730224609375,
-0.030487060546875,
-0.039794921875,
-0.05828857421875,
0.017303466796875,
-0.024871826171875,
0.0006804466247558594,
0.0222015380859375,
0.044036865234375,
-0.00888824462890625,
0.057464599609375,
-0.03790283203125,
-0.021270751953125,
-0.024810791015625,
0.0084686279296875,
0.02947998046875,
0.048370361328125,
0.0550537109375,
-0.06072998046875,
-0.0289459228515625,
0.007354736328125,
-0.06396484375,
0.008514404296875,
-0.01059722900390625,
-0.0093994140625,
0.034637451171875,
0.039886474609375,
-0.050872802734375,
0.04913330078125,
0.030029296875,
-0.0159759521484375,
0.051971435546875,
-0.0197296142578125,
-0.001178741455078125,
-0.07208251953125,
0.0292816162109375,
0.013824462890625,
-0.003887176513671875,
-0.021514892578125,
0.01035308837890625,
0.01296234130859375,
-0.0038661956787109375,
-0.05352783203125,
0.048583984375,
-0.044921875,
-0.0179290771484375,
0.01000213623046875,
-0.0066070556640625,
0.003917694091796875,
0.05438232421875,
0.0233306884765625,
0.061065673828125,
0.0811767578125,
-0.050323486328125,
0.03363037109375,
0.034881591796875,
-0.032501220703125,
0.0262451171875,
-0.059814453125,
-0.00926971435546875,
-0.0045928955078125,
-0.0163116455078125,
-0.083740234375,
-0.004909515380859375,
0.02197265625,
-0.054290771484375,
0.0242767333984375,
-0.032806396484375,
-0.02642822265625,
-0.048370361328125,
-0.01461029052734375,
0.017547607421875,
0.0406494140625,
-0.048309326171875,
0.0256195068359375,
0.0113067626953125,
0.0118865966796875,
-0.0687255859375,
-0.08270263671875,
0.00304412841796875,
0.01165008544921875,
-0.04547119140625,
0.0298614501953125,
-0.0017938613891601562,
0.01024627685546875,
0.00841522216796875,
0.0123748779296875,
-0.003070831298828125,
-0.0211334228515625,
0.0180206298828125,
0.0369873046875,
-0.0294036865234375,
-0.013336181640625,
-0.02789306640625,
0.004840850830078125,
-0.006198883056640625,
-0.0184326171875,
0.062225341796875,
-0.03436279296875,
-0.006420135498046875,
-0.048797607421875,
-0.0012493133544921875,
0.041107177734375,
-0.03759765625,
0.0369873046875,
0.05865478515625,
-0.01629638671875,
-0.00452423095703125,
-0.040740966796875,
0.0015859603881835938,
-0.044158935546875,
0.04425048828125,
-0.018402099609375,
-0.028778076171875,
0.04107666015625,
0.0242156982421875,
0.0025959014892578125,
0.016845703125,
0.0562744140625,
-0.01511383056640625,
0.042083740234375,
0.055999755859375,
0.0059814453125,
0.054290771484375,
-0.07025146484375,
-0.005489349365234375,
-0.052337646484375,
-0.030914306640625,
-0.00732421875,
-0.00875091552734375,
-0.032562255859375,
-0.03436279296875,
0.01454925537109375,
0.02044677734375,
-0.0244140625,
0.0194091796875,
-0.046295166015625,
0.0164642333984375,
0.0631103515625,
0.0142822265625,
-0.0105133056640625,
0.0136260986328125,
-0.0198211669921875,
0.00286102294921875,
-0.051971435546875,
-0.01476287841796875,
0.0791015625,
0.01393890380859375,
0.050872802734375,
-0.0196990966796875,
0.03826904296875,
-0.029388427734375,
0.0091400146484375,
-0.050323486328125,
0.05706787109375,
-0.0175018310546875,
-0.042724609375,
-0.0256805419921875,
-0.022735595703125,
-0.06781005859375,
0.019012451171875,
-0.0223236083984375,
-0.06134033203125,
0.0226898193359375,
0.03466796875,
-0.01837158203125,
0.025115966796875,
-0.06494140625,
0.074951171875,
-0.033447265625,
-0.046966552734375,
0.01346588134765625,
-0.0511474609375,
0.018280029296875,
0.0271148681640625,
-0.006877899169921875,
0.0241851806640625,
0.01073455810546875,
0.05096435546875,
-0.041778564453125,
0.06634521484375,
-0.029327392578125,
0.0309600830078125,
0.027191162109375,
-0.0180206298828125,
-0.005016326904296875,
-0.0045623779296875,
0.016845703125,
0.0225677490234375,
-0.0016565322875976562,
-0.042083740234375,
-0.037506103515625,
0.00858306884765625,
-0.0562744140625,
-0.03594970703125,
-0.02777099609375,
-0.03436279296875,
0.0018796920776367188,
0.03564453125,
0.056121826171875,
0.0239105224609375,
0.0246124267578125,
0.0083465576171875,
0.0230560302734375,
-0.035858154296875,
0.056915283203125,
0.0206146240234375,
-0.031768798828125,
-0.0350341796875,
0.0736083984375,
0.001049041748046875,
0.0156402587890625,
0.0254974365234375,
0.0127716064453125,
-0.026763916015625,
-0.04400634765625,
-0.05487060546875,
0.036163330078125,
-0.04522705078125,
-0.0271453857421875,
-0.021026611328125,
-0.0227813720703125,
-0.038970947265625,
-0.019500732421875,
-0.037261962890625,
-0.007236480712890625,
-0.030731201171875,
0.013397216796875,
0.03741455078125,
0.01953125,
-0.0093994140625,
0.033050537109375,
-0.03460693359375,
0.033721923828125,
0.031707763671875,
0.0198822021484375,
-0.00211334228515625,
-0.039642333984375,
-0.0110015869140625,
0.01126861572265625,
-0.0205841064453125,
-0.0518798828125,
0.047454833984375,
0.017486572265625,
0.033447265625,
0.031494140625,
-0.03173828125,
0.084228515625,
-0.0276336669921875,
0.058563232421875,
0.040985107421875,
-0.072021484375,
0.055755615234375,
0.0008788108825683594,
0.01348114013671875,
0.03521728515625,
0.0189208984375,
-0.0216217041015625,
-0.027008056640625,
-0.041351318359375,
-0.06756591796875,
0.05047607421875,
0.00734710693359375,
-0.004360198974609375,
0.0196075439453125,
0.019775390625,
-0.0177764892578125,
0.022125244140625,
-0.061279296875,
-0.0205078125,
-0.049835205078125,
-0.011932373046875,
-0.012298583984375,
0.008544921875,
0.0142822265625,
-0.052337646484375,
0.0298614501953125,
-0.007259368896484375,
0.03179931640625,
0.033050537109375,
-0.0350341796875,
-0.0038814544677734375,
-0.0265045166015625,
0.042510986328125,
0.045989990234375,
-0.0223236083984375,
0.001979827880859375,
-0.006992340087890625,
-0.071533203125,
-0.015899658203125,
0.005950927734375,
-0.0268707275390625,
0.0020656585693359375,
0.03759765625,
0.0711669921875,
-0.003307342529296875,
-0.044189453125,
0.058319091796875,
0.006183624267578125,
-0.01806640625,
-0.022674560546875,
0.0030670166015625,
-0.004657745361328125,
0.02093505859375,
0.046356201171875,
0.01306915283203125,
-0.015625,
-0.03564453125,
0.0168609619140625,
0.033050537109375,
-0.00772857666015625,
-0.019256591796875,
0.05517578125,
-0.0037937164306640625,
-0.0163116455078125,
0.050079345703125,
-0.0311279296875,
-0.050079345703125,
0.06158447265625,
0.04986572265625,
0.0335693359375,
-0.0015926361083984375,
0.0234832763671875,
0.046112060546875,
0.034698486328125,
0.00494384765625,
0.042388916015625,
0.005340576171875,
-0.06585693359375,
-0.029144287109375,
-0.057464599609375,
-0.0259246826171875,
0.0239105224609375,
-0.04144287109375,
0.030670166015625,
-0.052734375,
0.0018701553344726562,
0.01134490966796875,
0.0113067626953125,
-0.06451416015625,
0.0301513671875,
0.01690673828125,
0.0606689453125,
-0.0562744140625,
0.041107177734375,
0.06768798828125,
-0.07080078125,
-0.0667724609375,
-0.01165771484375,
-0.0271148681640625,
-0.0843505859375,
0.06695556640625,
0.024200439453125,
-0.00627899169921875,
0.00229644775390625,
-0.06573486328125,
-0.0555419921875,
0.075927734375,
0.036102294921875,
-0.03204345703125,
-0.003894805908203125,
0.011138916015625,
0.0440673828125,
-0.00908660888671875,
0.017059326171875,
0.0056915283203125,
0.031402587890625,
0.029632568359375,
-0.0706787109375,
-0.0014543533325195312,
-0.0308990478515625,
-0.0060882568359375,
-0.016693115234375,
-0.055908203125,
0.076171875,
-0.033233642578125,
-0.011810302734375,
-0.004608154296875,
0.056396484375,
0.030853271484375,
0.0144805908203125,
0.02813720703125,
0.044952392578125,
0.046417236328125,
0.005138397216796875,
0.06268310546875,
-0.0224151611328125,
0.036712646484375,
0.05755615234375,
0.0178985595703125,
0.0634765625,
0.04644775390625,
-0.0146942138671875,
0.0232696533203125,
0.037017822265625,
-0.04425048828125,
0.033660888671875,
0.007266998291015625,
0.02166748046875,
-0.0093536376953125,
0.0216827392578125,
-0.0242919921875,
0.05767822265625,
0.030853271484375,
-0.022613525390625,
-0.006732940673828125,
0.005641937255859375,
-0.0105438232421875,
-0.0148162841796875,
-0.036865234375,
0.0211334228515625,
-0.01403045654296875,
-0.044647216796875,
0.07769775390625,
-0.019195556640625,
0.08172607421875,
-0.021514892578125,
-0.005603790283203125,
-0.0133819580078125,
0.0176239013671875,
-0.02239990234375,
-0.07049560546875,
0.009490966796875,
0.00223541259765625,
0.0006341934204101562,
0.001438140869140625,
0.0259246826171875,
-0.033447265625,
-0.07159423828125,
0.017791748046875,
0.0217132568359375,
0.0271453857421875,
0.01325225830078125,
-0.06890869140625,
0.0012464523315429688,
0.00734710693359375,
-0.016143798828125,
-0.0099945068359375,
0.0272674560546875,
0.00885009765625,
0.054901123046875,
0.053009033203125,
0.0307769775390625,
0.05059814453125,
-0.00688934326171875,
0.059600830078125,
-0.04254150390625,
-0.029327392578125,
-0.050567626953125,
0.046783447265625,
-0.0133514404296875,
-0.05157470703125,
0.046783447265625,
0.0626220703125,
0.0810546875,
-0.021026611328125,
0.044769287109375,
-0.0202178955078125,
0.01096343994140625,
-0.046356201171875,
0.0628662109375,
-0.056793212890625,
-0.0092926025390625,
-0.03662109375,
-0.043914794921875,
-0.041961669921875,
0.0762939453125,
-0.0206451416015625,
-0.0009064674377441406,
0.04022216796875,
0.08563232421875,
-0.021514892578125,
-0.0419921875,
0.0188140869140625,
0.02264404296875,
0.0171356201171875,
0.05157470703125,
0.045745849609375,
-0.042266845703125,
0.0531005859375,
-0.0479736328125,
-0.0163116455078125,
-0.01271820068359375,
-0.048614501953125,
-0.07257080078125,
-0.057403564453125,
-0.031158447265625,
-0.0189666748046875,
-0.0030193328857421875,
0.034820556640625,
0.06243896484375,
-0.050567626953125,
-0.022613525390625,
-0.0194854736328125,
0.001422882080078125,
-0.014739990234375,
-0.01552581787109375,
0.043121337890625,
-0.03515625,
-0.060150146484375,
-0.003643035888671875,
0.0222015380859375,
0.0169677734375,
-0.01300811767578125,
0.004650115966796875,
-0.0221710205078125,
-0.0280609130859375,
0.030120849609375,
0.0401611328125,
-0.049072265625,
-0.01019287109375,
0.0107269287109375,
-0.008514404296875,
0.0276947021484375,
0.0183563232421875,
-0.05108642578125,
0.034912109375,
0.02838134765625,
0.0272064208984375,
0.06475830078125,
-0.013153076171875,
0.00859832763671875,
-0.05157470703125,
0.061126708984375,
0.00902557373046875,
0.03900146484375,
0.036376953125,
-0.0188751220703125,
0.02294921875,
0.030487060546875,
-0.0130615234375,
-0.06524658203125,
0.0037860870361328125,
-0.09674072265625,
-0.01800537109375,
0.0863037109375,
-0.02142333984375,
-0.05474853515625,
0.00772857666015625,
-0.017547607421875,
0.027587890625,
-0.00921630859375,
0.043701171875,
0.01220703125,
0.000705718994140625,
-0.041748046875,
-0.017486572265625,
0.032196044921875,
0.022064208984375,
-0.0467529296875,
-0.0133514404296875,
0.0268707275390625,
0.03350830078125,
0.0450439453125,
0.042388916015625,
-0.0015859603881835938,
0.04058837890625,
0.013946533203125,
0.042694091796875,
-0.01776123046875,
-0.0181427001953125,
-0.0132904052734375,
0.0026493072509765625,
-0.0135955810546875,
-0.053619384765625
]
] |
petals-team/StableBeluga2 | 2023-08-23T18:00:41.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"en",
"dataset:conceptofmind/cot_submix_original",
"dataset:conceptofmind/flan2021_submix_original",
"dataset:conceptofmind/t0_submix_original",
"dataset:conceptofmind/niv2_submix_original",
"arxiv:2307.09288",
"arxiv:2306.02707",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | petals-team | null | null | petals-team/StableBeluga2 | 10 | 65,949 | transformers | 2023-08-12T22:04:01 | ---
datasets:
- conceptofmind/cot_submix_original
- conceptofmind/flan2021_submix_original
- conceptofmind/t0_submix_original
- conceptofmind/niv2_submix_original
language:
- en
pipeline_tag: text-generation
---
# Stable Beluga 2
## Changes in this fork
This repository contains the model from the [stabilityai/StableBeluga2](https://huggingface.co/stabilityai/StableBeluga2) repository with the following changes:
1. **Storing weights in `bfloat16` instead of `float32`.**
This leads to 2x smaller files and a small quality loss, which is not significant compared to the loss caused by NF4 quantization used in Petals by default.
1. **Storing weights in small shards.**
Each transformer block is stored in its own shard (1.71 GB each). The input and output embeddings and adjacent layernorms are in a separate shard (1.05 GB) too.
This way, Petals clients and servers don't have to download any excess data besides the layers they actually use.
1. **Using [Safetensors](https://github.com/huggingface/safetensors) instead of Pickle.**
This allows faster loading with smaller RAM requirements.
We provide the original README below. Please refer there for model details and licensing information.
## Model Description
`Stable Beluga 2` is a Llama 2 70B model fine-tuned on an Orca-style dataset.
## Usage
Start chatting with `Stable Beluga 2` using the following code snippet:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga2", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("stabilityai/StableBeluga2", torch_dtype=torch.float16, low_cpu_mem_usage=True, device_map="auto")
system_prompt = "### System:\nYou are Stable Beluga, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n"
message = "Write me a poem please"
prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
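As a small sketch, the prompt assembled inline above can be factored into a helper. The helper name is hypothetical, not part of the model's API; it just reproduces the same string format.

```python
def build_prompt(system: str, message: str) -> str:
    """Assemble a Stable Beluga 2 prompt in the format used by the
    snippet above: a system block, a user block, then an empty
    assistant block for the model to complete."""
    return f"### System:\n{system}\n\n### User: {message}\n\n### Assistant:\n"

prompt = build_prompt(
    "You are Stable Beluga, an AI that follows instructions extremely well.",
    "Write me a poem please",
)
print(prompt)
```

The resulting string can be passed to the tokenizer exactly as in the usage example.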
Stable Beluga 2 should be used with this prompt format:
```
### System:
This is a system prompt, please behave and help the user.
### User:
Your prompt here
### Assistant:
The output of Stable Beluga 2
```
## Other Beluga Models
[StableBeluga 1 - Delta](https://huggingface.co/stabilityai/StableBeluga1-Delta)
[StableBeluga 13B](https://huggingface.co/stabilityai/StableBeluga-13B)
[StableBeluga 7B](https://huggingface.co/stabilityai/StableBeluga-7B)
## Model Details
* **Developed by**: [Stability AI](https://stability.ai/)
* **Model type**: Stable Beluga 2 is an auto-regressive language model fine-tuned on Llama2 70B.
* **Language(s)**: English
* **Library**: [HuggingFace Transformers](https://github.com/huggingface/transformers)
* **License**: Fine-tuned checkpoints (`Stable Beluga 2`) are licensed under the [STABLE BELUGA NON-COMMERCIAL COMMUNITY LICENSE AGREEMENT](https://huggingface.co/stabilityai/StableBeluga2/blob/main/LICENSE.txt)
* **Contact**: For questions and comments about the model, please email `lm@stability.ai`
### Training Dataset
`Stable Beluga 2` is trained on our internal Orca-style dataset.
### Training Procedure
Models are trained via supervised fine-tuning on the aforementioned datasets in mixed precision (BF16) and optimized with AdamW. We outline the following hyperparameters:
| Dataset | Batch Size | Learning Rate |Learning Rate Decay| Warm-up | Weight Decay | Betas |
|-------------------|------------|---------------|-------------------|---------|--------------|-------------|
| Orca pt1 packed | 256 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) |
| Orca pt2 unpacked | 512 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) |
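The learning-rate schedule from the table (100 warm-up steps, then cosine decay from 3e-5 to 3e-6) can be sketched as below. This is an illustrative reconstruction, not the training code; in particular, the linear warm-up shape is an assumption.

```python
import math

def cosine_lr(step, total_steps, warmup=100, peak=3e-5, floor=3e-6):
    """Sketch of the schedule in the table: linear warm-up for `warmup`
    steps, then cosine decay from `peak` down to `floor`."""
    if step < warmup:
        return peak * (step + 1) / warmup  # assumed linear warm-up
    progress = (step - warmup) / max(1, total_steps - warmup)
    return floor + 0.5 * (peak - floor) * (1 + math.cos(math.pi * progress))

print(cosine_lr(0, 1000))     # early in warm-up
print(cosine_lr(1000, 1000))  # decays toward the 3e-6 floor
```

The peak learning rate, floor, and warm-up length are taken directly from the table; both Orca phases use the same values.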
## Ethical Considerations and Limitations
Beluga is a new technology that carries risks with use. Testing conducted to date has been in English and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Beluga's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Beluga, developers should perform safety testing and tuning tailored to their specific applications of the model.
## How to cite
```bibtex
@misc{StableBelugaModels,
url={[https://huggingface.co/stabilityai/StableBeluga2](https://huggingface.co/stabilityai/StableBeluga2)},
title={Stable Beluga models},
author={Mahan, Dakota and Carlow, Ryan and Castricato, Louis and Cooper, Nathan and Laforte, Christian}
}
```
## Citations
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 6,735 | [
[
-0.03302001953125,
]
] |
timm/resnet18.tv_in1k | 2023-04-05T18:04:22.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:1512.03385",
"license:bsd-3-clause",
"region:us"
] | image-classification | timm | null | null | timm/resnet18.tv_in1k | 0 | 65,350 | timm | 2023-04-05T18:04:15 | ---
tags:
- image-classification
- timm
library_name: timm
license: bsd-3-clause
---
# Model card for resnet18.tv_in1k
A ResNet-B image classification model.
This model features:
* ReLU activations
* single-layer 7x7 convolution with pooling
* 1x1 convolution shortcut downsample
Trained on ImageNet-1k; these are the original torchvision model weights.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 11.7
- GMACs: 1.8
- Activations (M): 2.5
- Image size: 224 x 224
- **Papers:**
- Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385
- **Original:** https://github.com/pytorch/vision
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnet18.tv_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnet18.tv_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 64, 56, 56])
# torch.Size([1, 128, 28, 28])
# torch.Size([1, 256, 14, 14])
# torch.Size([1, 512, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnet18.tv_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 512, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec|
|------------------------------------------|--------|-----|-----|-----------|-----|-----|-------|
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 |
|[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 |610 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 |
|[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 |
|[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 |
|[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 |
|[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 |
|[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 |
|[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 |
|[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 |
|[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 |
|[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 |
|[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 |
|[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 |
|[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 |
|[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 |80.22|94.63|25.6 |4.4 |11.9 |3162 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 |
|[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 |
|[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 |
|[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 |
|[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 |
|[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 |
|[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 |
|[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 |
|[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 |
|[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 |
|[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 |
|[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 |
|[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 |
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 |
|[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 |
|[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 |
|[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 |
|[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 |
|[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 |
|[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 |76.42|92.87|21.8 |3.7 |3.7 |5984 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 |
|[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 |
|[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 |
|[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 |
|[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 |
|[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 |71.49|90.07|11.7 |1.8 |2.5 |10187 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 |
|[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 |
|[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 |
## Citation
```bibtex
@article{He2015,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {arXiv preprint arXiv:1512.03385},
year = {2015}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- ontonotes
widget:
- text: "On September 1st George Washington won 1 dollar."
---
## English NER in Flair (Ontonotes fast model)
This is the fast version of the 18-class NER model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **89.3** (Ontonotes)
Predicts 18 tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
| CARDINAL | cardinal value |
| DATE | date value |
| EVENT | event name |
| FAC | building name |
| GPE | geo-political entity |
| LANGUAGE | language name |
| LAW | law name |
| LOC | location name |
| MONEY | money name |
| NORP | affiliation |
| ORDINAL | ordinal value |
| ORG | organization name |
| PERCENT | percent value |
| PERSON | person name |
| PRODUCT | product name |
| QUANTITY | quantity value |
| TIME | time value |
| WORK_OF_ART | name of work of art |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/ner-english-ontonotes-fast")
# make example sentence
sentence = Sentence("On September 1st George Washington won 1 dollar.")
# predict NER tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('ner'):
print(entity)
```
This yields the following output:
```
Span [2,3]: "September 1st" [− Labels: DATE (0.9655)]
Span [4,5]: "George Washington" [− Labels: PERSON (0.8243)]
Span [7,8]: "1 dollar" [− Labels: MONEY (0.8022)]
```
So, the entities "*September 1st*" (labeled as a **date**), "*George Washington*" (labeled as a **person**) and "*1 dollar*" (labeled as **money**) are found in the sentence "*On September 1st George Washington won 1 dollar*".
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
# 1. load the corpus (Ontonotes does not ship with Flair; you need to download it and reformat it into column format yourself)
corpus: Corpus = ColumnCorpus(
"resources/tasks/onto-ner",
column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
tag_to_bioes="ner",
)
# 2. what tag do we want to predict?
tag_type = 'ner'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
    # FastText embeddings (trained on web crawl data)
WordEmbeddings('en-crawl'),
# contextual string embeddings, forward
FlairEmbeddings('news-forward-fast'),
# contextual string embeddings, backward
FlairEmbeddings('news-backward-fast'),
]
# embedding stack consists of Flair and FastText embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/ner-english-ontonotes-fast',
train_with_dev=True,
max_epochs=150)
```
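The `tag_to_bioes="ner"` argument above converts the annotations to the BIOES scheme, where `S` marks a single-token entity and `E` marks the last token of a multi-token one. As a rough sketch of how such tags decode back into entity spans (a hypothetical helper written for illustration, not part of Flair's API):

```python
# Minimal sketch of BIOES span decoding (illustrative, not Flair's implementation).
def bioes_to_spans(tags):
    """Collect (start, end, label) spans from a BIOES tag sequence (end exclusive)."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        prefix, _, label = tag.partition("-")
        if prefix == "S":                          # single-token entity
            spans.append((i, i + 1, label))
            start = None
        elif prefix == "B":                        # entity begins
            start = i
        elif prefix == "E" and start is not None:  # entity ends
            spans.append((start, i + 1, label))
            start = None
        elif prefix == "O":                        # outside any entity
            start = None
    return spans

tags = ["O", "B-DATE", "E-DATE", "B-PERSON", "E-PERSON", "O", "S-MONEY"]
print(bioes_to_spans(tags))  # → [(1, 3, 'DATE'), (3, 5, 'PERSON'), (6, 7, 'MONEY')]
```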
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
| 4,450 | [
[
-0.021484375,
-0.0462646484375,
0.01161956787109375,
0.0161590576171875,
-0.01340484619140625,
-0.00530242919921875,
-0.016387939453125,
-0.0303802490234375,
0.049713134765625,
0.0233917236328125,
-0.0274658203125,
-0.036956787109375,
-0.0411376953125,
0.0228118896484375,
0.0013284683227539062,
0.0927734375,
0.0164031982421875,
0.017059326171875,
-0.005184173583984375,
-0.006381988525390625,
-0.021087646484375,
-0.043121337890625,
-0.04840087890625,
-0.0191192626953125,
0.04150390625,
0.03179931640625,
0.03265380859375,
0.056121826171875,
0.035858154296875,
0.020965576171875,
-0.0174407958984375,
0.0012235641479492188,
-0.00734710693359375,
-0.005489349365234375,
-0.01338958740234375,
-0.028839111328125,
-0.0667724609375,
0.0107421875,
0.042877197265625,
0.02471923828125,
0.00618743896484375,
0.0136566162109375,
0.0014677047729492188,
0.0128173828125,
-0.0152587890625,
0.0321044921875,
-0.0494384765625,
-0.0213623046875,
-0.018310546875,
-0.01456451416015625,
-0.024993896484375,
-0.0303955078125,
0.0177001953125,
-0.046783447265625,
0.005901336669921875,
0.01523590087890625,
0.09991455078125,
0.00606536865234375,
-0.0281524658203125,
-0.0230865478515625,
-0.0250091552734375,
0.0611572265625,
-0.0711669921875,
0.01678466796875,
0.0279693603515625,
-0.0069732666015625,
-0.0016021728515625,
-0.058380126953125,
-0.04815673828125,
-0.01119232177734375,
-0.013031005859375,
0.0143280029296875,
-0.005619049072265625,
-0.012451171875,
0.0196685791015625,
0.018096923828125,
-0.04779052734375,
-0.01389312744140625,
-0.01181793212890625,
-0.0135040283203125,
0.0592041015625,
0.02154541015625,
0.011810302734375,
-0.038299560546875,
-0.033782958984375,
-0.018280029296875,
-0.0248565673828125,
-0.0003917217254638672,
0.008087158203125,
0.044097900390625,
-0.02081298828125,
0.03790283203125,
0.0018281936645507812,
0.052001953125,
0.0181732177734375,
-0.02960205078125,
0.0423583984375,
-0.0299530029296875,
-0.01091766357421875,
0.0011281967163085938,
0.0692138671875,
0.0234832763671875,
0.013031005859375,
-0.004222869873046875,
-0.01506805419921875,
0.01454925537109375,
-0.0180511474609375,
-0.052215576171875,
-0.01617431640625,
0.0086517333984375,
-0.0232391357421875,
-0.032470703125,
-0.006702423095703125,
-0.060638427734375,
-0.0199737548828125,
-0.00421142578125,
0.047637939453125,
-0.031524658203125,
-0.0035572052001953125,
0.0015707015991210938,
-0.02801513671875,
0.011566162109375,
0.0136566162109375,
-0.060577392578125,
0.007110595703125,
0.0267486572265625,
0.0501708984375,
0.0265655517578125,
-0.020721435546875,
-0.0283966064453125,
-0.0096435546875,
-0.00826263427734375,
0.054718017578125,
-0.0259246826171875,
-0.020172119140625,
-0.00836181640625,
0.00609588623046875,
-0.0213165283203125,
-0.01184844970703125,
0.0423583984375,
-0.034698486328125,
0.0294647216796875,
-0.0211639404296875,
-0.054107666015625,
-0.020782470703125,
0.029510498046875,
-0.050048828125,
0.064453125,
0.0008306503295898438,
-0.09271240234375,
0.031829833984375,
-0.036376953125,
-0.033721923828125,
0.006336212158203125,
0.002655029296875,
-0.033843994140625,
-0.008697509765625,
0.0110626220703125,
0.050506591796875,
-0.01953125,
0.0236053466796875,
-0.025970458984375,
-0.0005750656127929688,
0.0135955810546875,
0.004428863525390625,
0.06622314453125,
-0.000009179115295410156,
-0.0235748291015625,
0.00417327880859375,
-0.0728759765625,
-0.0138702392578125,
0.0232391357421875,
-0.036163330078125,
-0.0259246826171875,
0.0012874603271484375,
0.01934814453125,
0.0219879150390625,
0.014862060546875,
-0.041656494140625,
0.042816162109375,
-0.03759765625,
0.03924560546875,
0.034942626953125,
0.0023097991943359375,
0.044097900390625,
-0.03564453125,
0.0308685302734375,
-0.0026340484619140625,
-0.01445770263671875,
-0.011627197265625,
-0.05511474609375,
-0.04913330078125,
-0.01898193359375,
0.04180908203125,
0.06494140625,
-0.0478515625,
0.041839599609375,
-0.03057861328125,
-0.050201416015625,
-0.028594970703125,
-0.01849365234375,
0.02294921875,
0.0521240234375,
0.037933349609375,
-0.017822265625,
-0.063232421875,
-0.041656494140625,
-0.025146484375,
-0.016510009765625,
0.0221710205078125,
0.0264739990234375,
0.06219482421875,
-0.0103302001953125,
0.059722900390625,
-0.037200927734375,
-0.035919189453125,
-0.03216552734375,
0.018310546875,
0.035858154296875,
0.04156494140625,
0.0255126953125,
-0.0457763671875,
-0.047943115234375,
-0.007114410400390625,
-0.030487060546875,
0.01824951171875,
-0.02130126953125,
0.00909423828125,
0.033538818359375,
0.0297698974609375,
-0.034576416015625,
0.039306640625,
0.0219573974609375,
-0.056884765625,
0.037353515625,
-0.0007905960083007812,
-0.010772705078125,
-0.10955810546875,
0.024169921875,
0.0229949951171875,
-0.017242431640625,
-0.03985595703125,
-0.02398681640625,
0.0103912353515625,
0.0218658447265625,
-0.0180206298828125,
0.06640625,
-0.02459716796875,
0.0174102783203125,
0.0020503997802734375,
0.01120758056640625,
0.0090179443359375,
0.023895263671875,
0.025390625,
0.024810791015625,
0.033447265625,
-0.038665771484375,
0.006137847900390625,
0.038818359375,
-0.032470703125,
0.01141357421875,
-0.0406494140625,
-0.010955810546875,
-0.0096588134765625,
0.0182952880859375,
-0.0787353515625,
-0.013824462890625,
0.02191162109375,
-0.06500244140625,
0.044769287109375,
-0.0006971359252929688,
-0.024169921875,
-0.0253753662109375,
-0.01763916015625,
0.0021686553955078125,
0.031463623046875,
-0.0261383056640625,
0.0462646484375,
0.023468017578125,
-0.002105712890625,
-0.054656982421875,
-0.055267333984375,
-0.0097808837890625,
-0.020721435546875,
-0.04913330078125,
0.044708251953125,
-0.004795074462890625,
-0.01092529296875,
0.0135650634765625,
0.0086517333984375,
0.00156402587890625,
0.012420654296875,
0.0085601806640625,
0.036102294921875,
-0.0168914794921875,
0.006702423095703125,
-0.01715087890625,
-0.00025916099548339844,
-0.00273895263671875,
-0.01265716552734375,
0.04443359375,
-0.0086517333984375,
0.0369873046875,
-0.031829833984375,
0.0011043548583984375,
0.0168609619140625,
-0.0238800048828125,
0.068115234375,
0.052398681640625,
-0.0340576171875,
-0.0072479248046875,
-0.0307464599609375,
-0.021484375,
-0.028350830078125,
0.0438232421875,
-0.033935546875,
-0.050140380859375,
0.047210693359375,
0.0194854736328125,
0.0146942138671875,
0.06768798828125,
0.0313720703125,
0.003948211669921875,
0.07696533203125,
0.045166015625,
-0.01337432861328125,
0.03668212890625,
-0.04119873046875,
0.007228851318359375,
-0.057464599609375,
-0.0236053466796875,
-0.04522705078125,
-0.00769805908203125,
-0.057769775390625,
-0.0158538818359375,
0.00769805908203125,
0.0252227783203125,
-0.03680419921875,
0.039825439453125,
-0.03582763671875,
0.01322174072265625,
0.04296875,
-0.011993408203125,
0.0107879638671875,
-0.006061553955078125,
-0.021728515625,
-0.0174407958984375,
-0.057525634765625,
-0.035980224609375,
0.08416748046875,
0.030029296875,
0.051300048828125,
-0.004383087158203125,
0.0633544921875,
0.0009093284606933594,
0.038818359375,
-0.05877685546875,
0.032806396484375,
-0.015289306640625,
-0.06658935546875,
-0.00862884521484375,
-0.0220947265625,
-0.06756591796875,
0.010101318359375,
-0.037689208984375,
-0.06207275390625,
0.018310546875,
0.0158233642578125,
-0.041961669921875,
0.0308685302734375,
-0.022216796875,
0.07733154296875,
-0.005767822265625,
-0.023895263671875,
0.01497650146484375,
-0.06341552734375,
0.019500732421875,
0.004650115966796875,
0.030914306640625,
-0.0121002197265625,
-0.005306243896484375,
0.07763671875,
-0.02166748046875,
0.067138671875,
0.004543304443359375,
0.0173492431640625,
0.018585205078125,
-0.00409698486328125,
0.039947509765625,
0.018310546875,
-0.01250457763671875,
0.0035915374755859375,
-0.007904052734375,
-0.01080322265625,
-0.01140594482421875,
0.050628662109375,
-0.05316162109375,
-0.022918701171875,
-0.0655517578125,
-0.0220489501953125,
0.0008339881896972656,
0.01297760009765625,
0.053924560546875,
0.046875,
-0.01328277587890625,
-0.00583648681640625,
0.029510498046875,
-0.015716552734375,
0.053558349609375,
0.0318603515625,
-0.0277252197265625,
-0.06243896484375,
0.06842041015625,
0.01313018798828125,
-0.0034694671630859375,
0.03765869140625,
0.0219268798828125,
-0.03424072265625,
-0.008392333984375,
-0.03131103515625,
0.038238525390625,
-0.043426513671875,
-0.0321044921875,
-0.05908203125,
-0.00939178466796875,
-0.0645751953125,
-0.00843048095703125,
-0.016357421875,
-0.0458984375,
-0.055511474609375,
0.0004420280456542969,
0.031829833984375,
0.06329345703125,
-0.020904541015625,
0.020599365234375,
-0.05511474609375,
-0.01141357421875,
-0.00029850006103515625,
0.002593994140625,
-0.00550079345703125,
-0.0711669921875,
-0.0236663818359375,
-0.015899658203125,
-0.030914306640625,
-0.0782470703125,
0.07470703125,
0.0219268798828125,
0.026641845703125,
0.03057861328125,
-0.00960540771484375,
0.036224365234375,
-0.031097412109375,
0.060699462890625,
0.00894927978515625,
-0.0709228515625,
0.03546142578125,
-0.018707275390625,
0.01092529296875,
0.0232696533203125,
0.058929443359375,
-0.043426513671875,
-0.004913330078125,
-0.06573486328125,
-0.075927734375,
0.05279541015625,
-0.01070404052734375,
-0.00025010108947753906,
-0.0234527587890625,
0.017669677734375,
-0.00899505615234375,
0.0049285888671875,
-0.07550048828125,
-0.0423583984375,
-0.0174102783203125,
-0.0166473388671875,
-0.03179931640625,
-0.0150604248046875,
0.01739501953125,
-0.042083740234375,
0.0838623046875,
-0.0063629150390625,
0.0277252197265625,
0.0299530029296875,
0.0035991668701171875,
0.005985260009765625,
0.01261138916015625,
0.044952392578125,
0.021636962890625,
-0.0311431884765625,
-0.0122833251953125,
0.018829345703125,
-0.0272369384765625,
-0.01384735107421875,
0.0223236083984375,
-0.00914764404296875,
0.016265869140625,
0.034912109375,
0.06396484375,
0.015838623046875,
-0.0238800048828125,
0.039703369140625,
-0.007061004638671875,
-0.01515960693359375,
-0.033843994140625,
-0.0276641845703125,
0.01323699951171875,
0.0119476318359375,
0.01523590087890625,
0.0092926025390625,
0.00246429443359375,
-0.044647216796875,
0.00897979736328125,
0.03131103515625,
-0.0322265625,
-0.03955078125,
0.07220458984375,
0.006267547607421875,
-0.0127410888671875,
0.031707763671875,
-0.0452880859375,
-0.060455322265625,
0.051910400390625,
0.052764892578125,
0.05517578125,
-0.0181732177734375,
0.009246826171875,
0.0628662109375,
0.02166748046875,
-0.01116180419921875,
0.05816650390625,
0.030975341796875,
-0.06402587890625,
-0.02813720703125,
-0.06884765625,
-0.0021991729736328125,
0.019683837890625,
-0.042327880859375,
0.0330810546875,
-0.03289794921875,
-0.039093017578125,
0.030792236328125,
0.0254364013671875,
-0.05841064453125,
0.03009033203125,
0.02264404296875,
0.08221435546875,
-0.07073974609375,
0.070556640625,
0.0771484375,
-0.054290771484375,
-0.08624267578125,
-0.013031005859375,
0.00394439697265625,
-0.0391845703125,
0.062347412109375,
0.0208282470703125,
0.03619384765625,
0.0147857666015625,
-0.038665771484375,
-0.098876953125,
0.07318115234375,
-0.0165863037109375,
-0.03369140625,
-0.01329803466796875,
-0.0203857421875,
0.0228729248046875,
-0.032470703125,
0.043182373046875,
0.031494140625,
0.039337158203125,
-0.0041351318359375,
-0.0699462890625,
0.0036029815673828125,
-0.0212860107421875,
-0.01171112060546875,
0.01568603515625,
-0.049652099609375,
0.0892333984375,
-0.0235137939453125,
-0.01042938232421875,
0.01910400390625,
0.060699462890625,
0.005218505859375,
0.0191497802734375,
0.0172271728515625,
0.0673828125,
0.055450439453125,
-0.018463134765625,
0.07330322265625,
-0.0267333984375,
0.045501708984375,
0.08563232421875,
-0.006191253662109375,
0.07440185546875,
0.02081298828125,
-0.004604339599609375,
0.0521240234375,
0.051666259765625,
-0.004146575927734375,
0.040008544921875,
0.01544189453125,
-0.0008592605590820312,
-0.0257415771484375,
-0.0103607177734375,
-0.03411865234375,
0.041473388671875,
0.02630615234375,
-0.040740966796875,
0.004261016845703125,
-0.009246826171875,
0.035675048828125,
-0.006435394287109375,
-0.030517578125,
0.058380126953125,
0.004795074462890625,
-0.041107177734375,
0.033843994140625,
0.0135345458984375,
0.07550048828125,
-0.033294677734375,
0.003173828125,
-0.0124969482421875,
0.02838134765625,
-0.0130157470703125,
-0.038482666015625,
0.016204833984375,
-0.0178070068359375,
-0.01543426513671875,
-0.00395965576171875,
0.05303955078125,
-0.042327880859375,
-0.03619384765625,
0.0188446044921875,
0.029541015625,
0.00923919677734375,
-0.000057816505432128906,
-0.054901123046875,
-0.0088958740234375,
0.005435943603515625,
-0.043975830078125,
0.014984130859375,
0.015777587890625,
-0.003345489501953125,
0.0306243896484375,
0.032470703125,
0.0023097991943359375,
-0.0014247894287109375,
-0.0191650390625,
0.05804443359375,
-0.0682373046875,
-0.035614013671875,
-0.06805419921875,
0.047027587890625,
-0.0063323974609375,
-0.053680419921875,
0.0633544921875,
0.06182861328125,
0.060760498046875,
-0.005649566650390625,
0.058685302734375,
-0.031494140625,
0.051849365234375,
-0.01090240478515625,
0.06793212890625,
-0.06182861328125,
-0.00827789306640625,
-0.0184783935546875,
-0.045013427734375,
-0.033843994140625,
0.05413818359375,
-0.026580810546875,
-0.00774383544921875,
0.0484619140625,
0.050201416015625,
0.01511383056640625,
0.0002491474151611328,
0.0019893646240234375,
0.0298309326171875,
-0.002849578857421875,
0.032928466796875,
0.040924072265625,
-0.047943115234375,
0.0244903564453125,
-0.041473388671875,
-0.01381683349609375,
-0.024169921875,
-0.07489013671875,
-0.0728759765625,
-0.05426025390625,
-0.035491943359375,
-0.06658935546875,
-0.01678466796875,
0.087646484375,
0.0292205810546875,
-0.067626953125,
-0.0196685791015625,
0.01158905029296875,
-0.00232696533203125,
-0.0007600784301757812,
-0.0191192626953125,
0.0347900390625,
-0.01462554931640625,
-0.05218505859375,
0.0248870849609375,
-0.01216888427734375,
0.01062774658203125,
0.0161590576171875,
0.0003914833068847656,
-0.0501708984375,
0.0166015625,
0.033172607421875,
0.0262603759765625,
-0.05096435546875,
-0.006740570068359375,
0.0189971923828125,
-0.0257415771484375,
0.00966644287109375,
0.021942138671875,
-0.059295654296875,
0.014801025390625,
0.05596923828125,
0.0197296142578125,
0.035125732421875,
-0.0037784576416015625,
0.0158538818359375,
-0.038665771484375,
-0.002437591552734375,
0.0277252197265625,
0.040191650390625,
0.0234375,
-0.0147705078125,
0.034210205078125,
0.036346435546875,
-0.05224609375,
-0.053009033203125,
-0.0195770263671875,
-0.077880859375,
-0.01323699951171875,
0.08270263671875,
-0.013031005859375,
-0.036865234375,
0.00738525390625,
-0.00551605224609375,
0.03704833984375,
-0.03369140625,
0.028289794921875,
0.0338134765625,
0.0004544258117675781,
0.0170440673828125,
-0.024261474609375,
0.05841064453125,
0.03057861328125,
-0.04425048828125,
-0.023895263671875,
0.014312744140625,
0.043121337890625,
0.02618408203125,
0.048248291015625,
0.00827789306640625,
0.0107879638671875,
0.01062774658203125,
0.036895751953125,
0.01129913330078125,
-0.01267242431640625,
-0.038482666015625,
-0.006420135498046875,
-0.005176544189453125,
-0.01360321044921875
]
] |
cl-tohoku/bert-base-japanese-v3 | 2023-05-19T00:31:53.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"pretraining",
"ja",
"dataset:cc100",
"dataset:wikipedia",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | null | cl-tohoku | null | null | cl-tohoku/bert-base-japanese-v3 | 20 | 64,383 | transformers | 2023-05-19T00:13:53 | ---
license: apache-2.0
datasets:
- cc100
- wikipedia
language:
- ja
widget:
- text: 東北大学で[MASK]の研究をしています。
---
# BERT base Japanese (unidic-lite with whole word masking, CC-100 and jawiki-20230102)
This is a [BERT](https://github.com/google-research/bert) model pretrained on texts in the Japanese language.
This version of the model processes input texts with word-level tokenization based on the Unidic 2.1.2 dictionary (available in the [unidic-lite](https://pypi.org/project/unidic-lite/) package), followed by WordPiece subword tokenization.
Additionally, the model is trained with whole word masking enabled for the masked language modeling (MLM) objective.
The code used for pretraining is available at [cl-tohoku/bert-japanese](https://github.com/cl-tohoku/bert-japanese/).
## Model architecture
The model architecture is the same as the original BERT base model; 12 layers, 768 dimensions of hidden states, and 12 attention heads.
## Training Data
The model is trained on the Japanese portion of [CC-100 dataset](https://data.statmt.org/cc-100/) and the Japanese version of Wikipedia.
For Wikipedia, we generated a text corpus from the [Wikipedia Cirrussearch dump file](https://dumps.wikimedia.org/other/cirrussearch/) as of January 2, 2023.
The corpus files generated from CC-100 and Wikipedia are 74.3GB and 4.9GB in size and consist of approximately 392M and 34M sentences, respectively.
To split texts into sentences, we used [fugashi](https://github.com/polm/fugashi) with the [mecab-ipadic-NEologd](https://github.com/neologd/mecab-ipadic-neologd) dictionary (v0.0.7).
## Tokenization
The texts are first tokenized by MeCab with the Unidic 2.1.2 dictionary and then split into subwords by the WordPiece algorithm.
The vocabulary size is 32768.
We used [fugashi](https://github.com/polm/fugashi) and [unidic-lite](https://github.com/polm/unidic-lite) packages for the tokenization.
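As a rough illustration of the WordPiece step, here is a minimal sketch of greedy longest-match subword splitting over a toy vocabulary (the function and vocabulary are made up for illustration; the actual model uses a 32768-entry vocabulary):

```python
# Minimal sketch of WordPiece-style greedy longest-match subword splitting.
def wordpiece(word, vocab, unk="[UNK]"):
    pieces, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while end > start:                 # try the longest substring first
            cand = word[start:end]
            if start > 0:
                cand = "##" + cand         # continuation pieces carry the "##" prefix
            if cand in vocab:
                piece = cand
                break
            end -= 1
        if piece is None:                  # no piece matches: the whole word is unknown
            return [unk]
        pieces.append(piece)
        start = end
    return pieces

vocab = {"token", "##ization", "##ize", "un", "##known"}
print(wordpiece("tokenization", vocab))  # → ['token', '##ization']
```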
## Training
We trained the model first on the CC-100 corpus for 1M steps and then on the Wikipedia corpus for another 1M steps.
For the masked language modeling (MLM) objective, we introduced whole word masking, in which all of the subword tokens corresponding to a single word (as tokenized by MeCab) are masked at once.
To train each model, we used a v3-8 instance of Cloud TPUs provided by [TPU Research Cloud](https://sites.research.google/trc/about/).
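The whole word masking scheme can be sketched as follows (an illustrative reimplementation with made-up helper names, not the actual training code; words are chosen at random and every subword of a chosen word is masked together):

```python
# Minimal sketch of whole word masking for the MLM objective (illustrative only).
import random

def whole_word_mask(word_subwords, mask_prob=0.3, seed=0):
    """word_subwords: list of words, each given as a list of its subword tokens."""
    rng = random.Random(seed)
    tokens, labels = [], []
    for subwords in word_subwords:
        if rng.random() < mask_prob:       # mask the whole word at once
            tokens.extend(["[MASK]"] * len(subwords))
            labels.extend(subwords)        # MLM targets at the masked positions
        else:
            tokens.extend(subwords)
            labels.extend([None] * len(subwords))
    return tokens, labels

words = [["東北"], ["大学"], ["で"], ["自然", "##言語"], ["処理"]]
tokens, labels = whole_word_mask(words)
print(tokens)
```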
## Licenses
The pretrained models are distributed under the Apache License 2.0.
## Acknowledgments
This model was trained with Cloud TPUs provided by the [TPU Research Cloud](https://sites.research.google/trc/about/) program. | 2,629 | [
[
-0.037567138671875,
-0.0655517578125,
0.01416778564453125,
0.005336761474609375,
-0.05072021484375,
0.0031642913818359375,
-0.0273284912109375,
-0.0285491943359375,
0.04022216796875,
0.040802001953125,
-0.05206298828125,
-0.043212890625,
-0.04083251953125,
0.0027980804443359375,
-0.0251617431640625,
0.091796875,
0.00237274169921875,
0.01464080810546875,
0.0090484619140625,
0.0115203857421875,
-0.01409149169921875,
-0.041015625,
-0.0452880859375,
-0.019378662109375,
0.038330078125,
0.0154266357421875,
0.03472900390625,
0.03179931640625,
0.0128173828125,
0.01525115966796875,
-0.002094268798828125,
0.007450103759765625,
-0.0428466796875,
-0.0236663818359375,
-0.006832122802734375,
-0.02392578125,
-0.01032257080078125,
0.00885009765625,
0.0408935546875,
0.058563232421875,
0.00476837158203125,
0.006618499755859375,
-0.0084075927734375,
0.0296630859375,
-0.042633056640625,
0.00025153160095214844,
-0.0556640625,
-0.0052947998046875,
-0.0247955322265625,
0.01666259765625,
-0.024139404296875,
0.01271820068359375,
0.01404571533203125,
-0.0577392578125,
0.0254974365234375,
0.0043487548828125,
0.08697509765625,
0.0012617111206054688,
-0.00888824462890625,
-0.0249786376953125,
-0.0297393798828125,
0.053314208984375,
-0.07080078125,
0.0273284912109375,
0.03924560546875,
-0.01007080078125,
-0.0087127685546875,
-0.0694580078125,
-0.0596923828125,
-0.0071258544921875,
-0.0017757415771484375,
0.00803375244140625,
-0.004032135009765625,
0.008392333984375,
0.0218353271484375,
0.0216827392578125,
-0.051177978515625,
0.02056884765625,
-0.039276123046875,
-0.0217437744140625,
0.03631591796875,
-0.00745391845703125,
0.032470703125,
-0.03594970703125,
-0.0330810546875,
-0.01055908203125,
-0.03851318359375,
0.0033779144287109375,
0.027374267578125,
0.0125274658203125,
-0.0019388198852539062,
0.044403076171875,
-0.009124755859375,
0.035919189453125,
-0.0129852294921875,
-0.019439697265625,
0.0232391357421875,
-0.0308837890625,
-0.025115966796875,
0.005706787109375,
0.07861328125,
0.015228271484375,
0.0291900634765625,
-0.0196990966796875,
-0.01036834716796875,
0.003223419189453125,
0.01971435546875,
-0.0609130859375,
-0.01873779296875,
0.00833892822265625,
-0.03619384765625,
-0.0212554931640625,
0.0151519775390625,
-0.0445556640625,
-0.00406646728515625,
0.001293182373046875,
0.054473876953125,
-0.044769287109375,
-0.014923095703125,
0.016143798828125,
-0.01308441162109375,
0.00926971435546875,
-0.0034160614013671875,
-0.06878662109375,
0.0164337158203125,
0.0408935546875,
0.0631103515625,
-0.00970458984375,
-0.0188751220703125,
-0.0017728805541992188,
0.00489044189453125,
-0.0235595703125,
0.03240966796875,
-0.0189208984375,
-0.03216552734375,
0.003993988037109375,
0.01490020751953125,
-0.01788330078125,
-0.01483154296875,
0.037750244140625,
-0.03997802734375,
0.040374755859375,
-0.006366729736328125,
-0.062347412109375,
-0.01904296875,
0.01087188720703125,
-0.040283203125,
0.08355712890625,
0.0041961669921875,
-0.0689697265625,
0.0144195556640625,
-0.058807373046875,
-0.0280914306640625,
0.01568603515625,
0.009063720703125,
-0.0297393798828125,
0.007381439208984375,
0.017730712890625,
0.023468017578125,
0.0027370452880859375,
0.02142333984375,
-0.010162353515625,
-0.0300750732421875,
0.01971435546875,
-0.024078369140625,
0.08697509765625,
0.0093536376953125,
-0.04473876953125,
0.0025119781494140625,
-0.056365966796875,
0.003002166748046875,
0.00830841064453125,
-0.024017333984375,
-0.0318603515625,
-0.01529693603515625,
0.0196990966796875,
0.0174407958984375,
0.035186767578125,
-0.060211181640625,
0.0077362060546875,
-0.051300048828125,
0.0264129638671875,
0.051422119140625,
-0.0081939697265625,
0.0238037109375,
0.002529144287109375,
0.0270233154296875,
0.0021991729736328125,
0.02142333984375,
-0.0173492431640625,
-0.04388427734375,
-0.08758544921875,
-0.0253143310546875,
0.059844970703125,
0.040191650390625,
-0.06353759765625,
0.062286376953125,
-0.0330810546875,
-0.033966064453125,
-0.062744140625,
0.01027679443359375,
0.03265380859375,
0.0289459228515625,
0.0250396728515625,
-0.036163330078125,
-0.046142578125,
-0.07073974609375,
0.00806427001953125,
-0.003902435302734375,
-0.016204833984375,
0.0016498565673828125,
0.057586669921875,
-0.03338623046875,
0.06402587890625,
-0.0260162353515625,
-0.0292205810546875,
-0.020751953125,
0.017669677734375,
0.0175628662109375,
0.03564453125,
0.0264739990234375,
-0.052459716796875,
-0.042724609375,
-0.016265869140625,
-0.037353515625,
-0.004924774169921875,
-0.0013055801391601562,
-0.01849365234375,
0.0228729248046875,
0.04815673828125,
-0.04840087890625,
0.038299560546875,
0.042755126953125,
-0.0255889892578125,
0.0260162353515625,
-0.016448974609375,
-0.023223876953125,
-0.1085205078125,
0.0264129638671875,
-0.0160675048828125,
-0.0182342529296875,
-0.051239013671875,
-0.0007634162902832031,
-0.00304412841796875,
-0.006587982177734375,
-0.03338623046875,
0.048248291015625,
-0.040985107421875,
0.0013875961303710938,
-0.029144287109375,
0.01204681396484375,
-0.006103515625,
0.056671142578125,
0.00824737548828125,
0.05584716796875,
0.035003662109375,
-0.04718017578125,
0.0142364501953125,
0.005950927734375,
-0.058990478515625,
-0.0059967041015625,
-0.054473876953125,
0.007785797119140625,
-0.0018205642700195312,
0.0160369873046875,
-0.0687255859375,
-0.007755279541015625,
0.0273284912109375,
-0.04254150390625,
0.03265380859375,
0.02142333984375,
-0.059844970703125,
-0.02899169921875,
-0.036865234375,
0.00711822509765625,
0.043609619140625,
-0.033111572265625,
0.03179931640625,
0.034088134765625,
-0.002361297607421875,
-0.058074951171875,
-0.055908203125,
0.0194091796875,
0.0094146728515625,
-0.031463623046875,
0.043182373046875,
-0.0141448974609375,
0.0013055801391601562,
0.0160980224609375,
0.0002758502960205078,
-0.0005617141723632812,
0.006923675537109375,
0.0140228271484375,
0.03167724609375,
-0.0035724639892578125,
0.0016298294067382812,
0.003116607666015625,
-0.005970001220703125,
-0.00760650634765625,
-0.01953125,
0.07177734375,
0.01192474365234375,
0.0010471343994140625,
-0.0267486572265625,
0.0199432373046875,
0.0171966552734375,
-0.00589752197265625,
0.0872802734375,
0.06829833984375,
-0.0306549072265625,
-0.00986480712890625,
-0.044921875,
-0.00994873046875,
-0.032012939453125,
0.0357666015625,
-0.0238189697265625,
-0.07476806640625,
0.036865234375,
0.0194549560546875,
0.0200347900390625,
0.05023193359375,
0.046417236328125,
-0.014892578125,
0.071044921875,
0.057586669921875,
-0.0284881591796875,
0.043487548828125,
-0.03131103515625,
0.0231170654296875,
-0.06591796875,
-0.0280303955078125,
-0.0264129638671875,
-0.0227813720703125,
-0.052978515625,
-0.034881591796875,
0.0144195556640625,
0.0220794677734375,
-0.0078887939453125,
0.02886962890625,
-0.031158447265625,
0.04071044921875,
0.053955078125,
0.00818634033203125,
0.010467529296875,
0.026611328125,
-0.0297088623046875,
-0.005779266357421875,
-0.052001953125,
-0.034515380859375,
0.085693359375,
0.0325927734375,
0.037078857421875,
-0.00739288330078125,
0.042816162109375,
0.0104217529296875,
0.02398681640625,
-0.050628662109375,
0.044219970703125,
-0.038665771484375,
-0.0830078125,
-0.0281219482421875,
-0.0292510986328125,
-0.07330322265625,
0.0253143310546875,
-0.0182647705078125,
-0.05450439453125,
-0.000058710575103759766,
-0.028961181640625,
-0.0024204254150390625,
0.034149169921875,
-0.043792724609375,
0.06256103515625,
-0.0137176513671875,
0.0100555419921875,
-0.01141357421875,
-0.0650634765625,
0.0272674560546875,
-0.0173187255859375,
0.021270751953125,
-0.004444122314453125,
-0.01180267333984375,
0.083984375,
-0.034271240234375,
0.07281494140625,
-0.0064849853515625,
0.0003180503845214844,
0.00933074951171875,
-0.0216827392578125,
0.00957489013671875,
-0.0048065185546875,
0.01434326171875,
0.043182373046875,
-0.00966644287109375,
-0.034027099609375,
-0.017547607421875,
0.051483154296875,
-0.07940673828125,
-0.0267486572265625,
-0.03179931640625,
-0.02734375,
-0.004302978515625,
0.039031982421875,
0.05914306640625,
0.01788330078125,
-0.0185699462890625,
0.031280517578125,
0.069580078125,
-0.024261474609375,
0.0428466796875,
0.045989990234375,
-0.0213470458984375,
-0.0367431640625,
0.060211181640625,
0.01389312744140625,
0.00946807861328125,
0.041229248046875,
0.001220703125,
-0.0284881591796875,
-0.0347900390625,
-0.031707763671875,
0.038665771484375,
-0.037994384765625,
-0.0027141571044921875,
-0.064453125,
-0.037017822265625,
-0.045684814453125,
0.004337310791015625,
-0.021881103515625,
-0.03179931640625,
-0.0364990234375,
-0.0088958740234375,
0.0018672943115234375,
0.040802001953125,
-0.0008029937744140625,
0.037200927734375,
-0.037139892578125,
0.025238037109375,
0.0180206298828125,
0.01593017578125,
-0.01275634765625,
-0.0574951171875,
-0.032806396484375,
0.016265869140625,
-0.007350921630859375,
-0.055450439453125,
0.027252197265625,
0.006526947021484375,
0.044158935546875,
0.03662109375,
-0.005054473876953125,
0.042449951171875,
-0.034027099609375,
0.07257080078125,
0.0292205810546875,
-0.07684326171875,
0.034423828125,
-0.0191497802734375,
0.0270233154296875,
0.050872802734375,
0.041229248046875,
-0.04986572265625,
-0.02899169921875,
-0.061614990234375,
-0.0653076171875,
0.05645751953125,
0.020111083984375,
0.03070068359375,
-0.0029087066650390625,
0.0309600830078125,
-0.002902984619140625,
0.019378662109375,
-0.08502197265625,
-0.032623291015625,
-0.0273284912109375,
-0.0246124267578125,
-0.0269317626953125,
-0.040130615234375,
0.003665924072265625,
-0.0232696533203125,
0.07977294921875,
0.0028324127197265625,
0.018157958984375,
0.01044464111328125,
-0.0270538330078125,
0.00638580322265625,
0.00786590576171875,
0.050628662109375,
0.041534423828125,
-0.017425537109375,
0.0047607421875,
0.005828857421875,
-0.055908203125,
-0.0184478759765625,
0.01180267333984375,
-0.035186767578125,
0.040252685546875,
0.0357666015625,
0.08966064453125,
0.024993896484375,
-0.05596923828125,
0.046844482421875,
0.0008554458618164062,
-0.0310211181640625,
-0.017303466796875,
0.0012922286987304688,
0.01198577880859375,
-0.01258087158203125,
0.0299224853515625,
-0.03570556640625,
-0.00960540771484375,
-0.04205322265625,
0.0007686614990234375,
0.03558349609375,
-0.02069091796875,
-0.0169830322265625,
0.040740966796875,
0.01111602783203125,
-0.012481689453125,
0.056182861328125,
-0.015899658203125,
-0.057159423828125,
0.0308380126953125,
0.0401611328125,
0.0635986328125,
-0.0121307373046875,
0.0242156982421875,
0.038543701171875,
0.044952392578125,
0.005741119384765625,
-0.002193450927734375,
-0.004161834716796875,
-0.063232421875,
-0.0283355712890625,
-0.0667724609375,
-0.01194000244140625,
0.040252685546875,
-0.02960205078125,
0.01213836669921875,
-0.0421142578125,
-0.005313873291015625,
0.0018339157104492188,
0.035003662109375,
-0.031646728515625,
0.027069091796875,
0.01568603515625,
0.06658935546875,
-0.05145263671875,
0.08642578125,
0.047637939453125,
-0.034423828125,
-0.05999755859375,
-0.007480621337890625,
-0.0281829833984375,
-0.0712890625,
0.039764404296875,
0.019989013671875,
0.01308441162109375,
0.0051116943359375,
-0.05322265625,
-0.056182861328125,
0.07122802734375,
0.00391387939453125,
-0.04071044921875,
-0.017303466796875,
0.0160064697265625,
0.049530029296875,
-0.031646728515625,
0.002216339111328125,
0.02392578125,
0.0203094482421875,
-0.0014247894287109375,
-0.0731201171875,
-0.0173492431640625,
-0.0236663818359375,
0.0216064453125,
0.007289886474609375,
-0.037017822265625,
0.07080078125,
0.005649566650390625,
-0.024871826171875,
0.0116424560546875,
0.040557861328125,
0.0091552734375,
0.003284454345703125,
0.04608154296875,
0.06475830078125,
0.04705810546875,
0.007198333740234375,
0.060089111328125,
-0.025970458984375,
0.0357666015625,
0.065673828125,
0.01541900634765625,
0.06280517578125,
0.030517578125,
-0.0027484893798828125,
0.058502197265625,
0.0533447265625,
-0.009918212890625,
0.0517578125,
0.004688262939453125,
-0.0197296142578125,
-0.0054779052734375,
-0.017364501953125,
-0.0238800048828125,
0.0322265625,
0.0284271240234375,
-0.039886474609375,
-0.006580352783203125,
0.00783538818359375,
0.0287933349609375,
-0.0225982666015625,
-0.0284271240234375,
0.06329345703125,
0.0087127685546875,
-0.0462646484375,
0.03204345703125,
0.032928466796875,
0.078125,
-0.0699462890625,
0.032623291015625,
-0.0166778564453125,
0.007678985595703125,
0.0015697479248046875,
-0.039306640625,
0.004032135009765625,
0.0177459716796875,
-0.017242431640625,
-0.01168060302734375,
0.055145263671875,
-0.047637939453125,
-0.0517578125,
0.02008056640625,
0.0169830322265625,
0.024078369140625,
0.00032711029052734375,
-0.0672607421875,
0.004669189453125,
0.02191162109375,
-0.027130126953125,
0.032135009765625,
0.019317626953125,
-0.0011034011840820312,
0.030364990234375,
0.0699462890625,
0.00968170166015625,
0.01678466796875,
0.0164642333984375,
0.05908203125,
-0.04058837890625,
-0.0418701171875,
-0.0635986328125,
0.0239715576171875,
-0.01983642578125,
-0.0285797119140625,
0.0614013671875,
0.045654296875,
0.07666015625,
-0.0093994140625,
0.0411376953125,
-0.01003265380859375,
0.039581298828125,
-0.0411376953125,
0.056060791015625,
-0.05126953125,
-0.00009906291961669922,
-0.0157470703125,
-0.07672119140625,
-0.00760650634765625,
0.0567626953125,
0.00446319580078125,
0.00006723403930664062,
0.047393798828125,
0.05609130859375,
0.006023406982421875,
-0.01451873779296875,
0.0192413330078125,
0.01416778564453125,
0.01529693603515625,
0.04132080078125,
0.0330810546875,
-0.0467529296875,
0.038299560546875,
-0.04022216796875,
-0.01114654541015625,
-0.01035308837890625,
-0.049285888671875,
-0.08636474609375,
-0.048431396484375,
-0.0156097412109375,
-0.0244598388671875,
-0.0117645263671875,
0.05084228515625,
0.05303955078125,
-0.052642822265625,
-0.017425537109375,
-0.006832122802734375,
-0.01033782958984375,
0.0174407958984375,
-0.0191650390625,
0.033843994140625,
-0.0218353271484375,
-0.061004638671875,
0.016815185546875,
0.0133056640625,
0.00493621826171875,
-0.036163330078125,
0.0014476776123046875,
-0.0401611328125,
0.00659942626953125,
0.04217529296875,
0.007755279541015625,
-0.055572509765625,
-0.004627227783203125,
0.005596160888671875,
-0.021728515625,
-0.0215911865234375,
0.0531005859375,
-0.0394287109375,
0.0408935546875,
0.0294036865234375,
0.0396728515625,
0.06475830078125,
-0.01373291015625,
0.0180206298828125,
-0.070068359375,
0.02435302734375,
0.004314422607421875,
0.024200439453125,
0.0277252197265625,
-0.01495361328125,
0.039764404296875,
0.038330078125,
-0.0162353515625,
-0.061309814453125,
-0.0070648193359375,
-0.07220458984375,
-0.0518798828125,
0.08026123046875,
-0.00916290283203125,
-0.030303955078125,
-0.005191802978515625,
-0.0070953369140625,
0.03216552734375,
-0.0008540153503417969,
0.052581787109375,
0.07977294921875,
0.028045654296875,
-0.0161590576171875,
-0.02886962890625,
0.031402587890625,
0.0313720703125,
-0.048858642578125,
-0.0253143310546875,
0.0180206298828125,
0.044769287109375,
0.0291900634765625,
0.06463623046875,
-0.0014362335205078125,
0.00460052490234375,
-0.006168365478515625,
0.0176544189453125,
-0.00473785400390625,
-0.016448974609375,
-0.014923095703125,
-0.0017862319946289062,
-0.032623291015625,
-0.0284271240234375
]
] |
textattack/roberta-base-CoLA | 2021-05-20T22:05:35.000Z | [
"transformers",
"pytorch",
"jax",
"roberta",
"text-classification",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | textattack | null | null | textattack/roberta-base-CoLA | 12 | 64,150 | transformers | 2022-03-02T23:29:05 | ## TextAttack Model Card
This `roberta-base` model was fine-tuned for sequence classification using TextAttack and the glue dataset loaded using the `nlp` library. The model was fine-tuned
for 5 epochs with a batch size of 32, a learning
rate of 2e-05, and a maximum sequence length of 128.
Since this was a classification task, the model was trained with a cross-entropy loss function.
The best score the model achieved on this task was 0.850431447746884, as measured by the
eval set accuracy, found after 1 epoch.
For more information, check out [TextAttack on Github](https://github.com/QData/TextAttack).
| 528 | [
[
-0.0041656494140625,
-0.039794921875,
0.024322509765625,
0.0146026611328125,
-0.0160369873046875,
-0.0018739700317382812,
-0.01122283935546875,
-0.033721923828125,
-0.00078582763671875,
0.01093292236328125,
-0.032806396484375,
-0.051605224609375,
-0.0458984375,
0.003086090087890625,
-0.032257080078125,
0.10504150390625,
0.02020263671875,
-0.009552001953125,
-0.01885986328125,
-0.004772186279296875,
-0.03729248046875,
-0.022308349609375,
-0.04241943359375,
-0.026458740234375,
0.034942626953125,
0.0309906005859375,
0.06005859375,
0.06561279296875,
0.054412841796875,
0.0161285400390625,
-0.0236358642578125,
0.0010499954223632812,
-0.040069580078125,
-0.024810791015625,
-0.01031494140625,
-0.05572509765625,
-0.05517578125,
0.00722503662109375,
0.03802490234375,
0.0019330978393554688,
-0.0172576904296875,
0.05047607421875,
0.003566741943359375,
0.0494384765625,
-0.0321044921875,
-0.006984710693359375,
-0.054107666015625,
0.0158843994140625,
-0.01145172119140625,
-0.008087158203125,
-0.054534912109375,
-0.0147705078125,
0.0166473388671875,
-0.047576904296875,
0.006366729736328125,
0.0207366943359375,
0.07562255859375,
0.0221099853515625,
-0.01465606689453125,
0.0011386871337890625,
-0.030303955078125,
0.07391357421875,
-0.049560546875,
0.01503753662109375,
0.05743408203125,
0.004138946533203125,
-0.00800323486328125,
-0.063232421875,
-0.01303863525390625,
0.015716552734375,
0.0168304443359375,
-0.0120697021484375,
-0.007122039794921875,
-0.006641387939453125,
0.0265655517578125,
0.031280517578125,
-0.0848388671875,
0.01074981689453125,
-0.050262451171875,
-0.01995849609375,
0.0777587890625,
0.032562255859375,
0.022125244140625,
-0.00959014892578125,
-0.0482177734375,
-0.01898193359375,
-0.006946563720703125,
-0.00328826904296875,
0.0238037109375,
0.014617919921875,
-0.01470184326171875,
0.03802490234375,
0.0042572021484375,
0.059234619140625,
0.004184722900390625,
-0.03143310546875,
0.017333984375,
-0.0032711029052734375,
-0.0447998046875,
-0.00027251243591308594,
0.07244873046875,
0.023529052734375,
0.006542205810546875,
0.0025615692138671875,
-0.012176513671875,
-0.007415771484375,
0.0248870849609375,
-0.052764892578125,
-0.028839111328125,
0.033782958984375,
-0.04693603515625,
-0.034271240234375,
-0.001766204833984375,
-0.0247955322265625,
-0.01430511474609375,
-0.019927978515625,
0.043548583984375,
-0.0595703125,
0.002788543701171875,
0.023284912109375,
-0.027923583984375,
0.0182952880859375,
0.0150146484375,
-0.04852294921875,
0.0186004638671875,
0.044158935546875,
0.08819580078125,
-0.031982421875,
-0.01297760009765625,
0.0088958740234375,
-0.01885986328125,
-0.0281219482421875,
0.0716552734375,
-0.034210205078125,
-0.00821685791015625,
-0.02801513671875,
-0.0014667510986328125,
-0.00472259521484375,
-0.016571044921875,
0.03863525390625,
-0.0300750732421875,
0.0335693359375,
0.01544189453125,
-0.07330322265625,
-0.034637451171875,
0.007717132568359375,
-0.059234619140625,
0.0867919921875,
0.0229949951171875,
-0.041229248046875,
0.043121337890625,
-0.0345458984375,
-0.01001739501953125,
0.0122222900390625,
0.0258331298828125,
-0.059326171875,
0.01097869873046875,
0.0013284683227539062,
0.041168212890625,
-0.0310211181640625,
0.027923583984375,
-0.02825927734375,
-0.03802490234375,
0.01546478271484375,
-0.0299835205078125,
0.0771484375,
0.005878448486328125,
-0.0010852813720703125,
-0.00959014892578125,
-0.07147216796875,
0.028045654296875,
-0.0024394989013671875,
-0.033355712890625,
-0.0079193115234375,
-0.033294677734375,
0.02001953125,
0.01558685302734375,
0.0211944580078125,
-0.036712646484375,
0.039093017578125,
-0.02899169921875,
0.0195770263671875,
0.054931640625,
-0.0032444000244140625,
0.02288818359375,
-0.04937744140625,
0.0452880859375,
-0.005779266357421875,
0.01076507568359375,
-0.024139404296875,
-0.053741455078125,
-0.0281982421875,
-0.0187225341796875,
0.041595458984375,
0.052703857421875,
-0.023651123046875,
0.034698486328125,
-0.01068878173828125,
-0.0726318359375,
-0.034881591796875,
0.0010528564453125,
0.0244140625,
0.01377105712890625,
0.0283966064453125,
-0.0127410888671875,
-0.02862548828125,
-0.0589599609375,
-0.0214691162109375,
-0.0160675048828125,
-0.0128631591796875,
0.0009012222290039062,
0.0609130859375,
0.0102081298828125,
0.06903076171875,
-0.0614013671875,
-0.041229248046875,
0.01291656494140625,
0.0269927978515625,
0.027252197265625,
0.03570556640625,
0.0277557373046875,
-0.04693603515625,
-0.035491943359375,
-0.0172882080078125,
-0.033721923828125,
0.01369476318359375,
-0.0078582763671875,
0.00588226318359375,
0.0252685546875,
0.0226287841796875,
-0.041168212890625,
0.050018310546875,
0.05426025390625,
-0.049530029296875,
0.051666259765625,
-0.0113677978515625,
0.00800323486328125,
-0.10369873046875,
0.013458251953125,
0.0031890869140625,
-0.0258941650390625,
-0.0299072265625,
-0.00952911376953125,
0.003509521484375,
-0.032257080078125,
-0.05859375,
0.042999267578125,
-0.03057861328125,
0.0129241943359375,
-0.01050567626953125,
-0.011993408203125,
0.0114593505859375,
0.051544189453125,
-0.001964569091796875,
0.05987548828125,
0.0224456787109375,
-0.030029296875,
0.0232391357421875,
0.0240478515625,
-0.018585205078125,
0.034027099609375,
-0.061553955078125,
0.036865234375,
0.007312774658203125,
0.005390167236328125,
-0.08056640625,
-0.0234832763671875,
-0.0247802734375,
-0.04644775390625,
0.0171661376953125,
0.00836181640625,
-0.034881591796875,
-0.0183563232421875,
-0.043792724609375,
0.040283203125,
0.033843994140625,
-0.0223541259765625,
0.033050537109375,
0.036468505859375,
0.02203369140625,
-0.03741455078125,
-0.050018310546875,
-0.00905609130859375,
-0.032135009765625,
-0.0423583984375,
0.0396728515625,
-0.0252685546875,
0.0219879150390625,
-0.015106201171875,
0.007415771484375,
-0.027191162109375,
-0.0095062255859375,
0.025909423828125,
0.01232147216796875,
-0.006534576416015625,
0.03863525390625,
-0.00911712646484375,
-0.009735107421875,
-0.011077880859375,
-0.0100860595703125,
0.040130615234375,
-0.0225677490234375,
0.0125579833984375,
-0.044769287109375,
-0.00409698486328125,
0.04754638671875,
0.004547119140625,
0.08721923828125,
0.048370361328125,
-0.043792724609375,
-0.01708984375,
-0.01544952392578125,
-0.00634002685546875,
-0.035003662109375,
0.029022216796875,
-0.02996826171875,
-0.0538330078125,
0.043304443359375,
0.0137481689453125,
0.001491546630859375,
0.0579833984375,
0.034637451171875,
0.00269317626953125,
0.0616455078125,
0.053192138671875,
-0.02020263671875,
0.0158233642578125,
-0.019378662109375,
-0.00264739990234375,
-0.0479736328125,
-0.0215911865234375,
-0.0225372314453125,
-0.0253448486328125,
-0.042388916015625,
-0.037445068359375,
0.0135955810546875,
0.0229949951171875,
-0.01500701904296875,
0.053497314453125,
-0.05975341796875,
0.03717041015625,
0.03729248046875,
0.0243072509765625,
0.006290435791015625,
-0.01560211181640625,
0.006999969482421875,
-0.00218963623046875,
-0.05743408203125,
-0.0249481201171875,
0.09246826171875,
0.046966552734375,
0.0565185546875,
-0.0088958740234375,
0.0380859375,
0.033966064453125,
-0.005157470703125,
-0.06085205078125,
0.043609619140625,
-0.00865936279296875,
-0.0443115234375,
-0.03985595703125,
0.0012264251708984375,
-0.06329345703125,
-0.0287322998046875,
-0.0302734375,
-0.058563232421875,
-0.01042938232421875,
0.02362060546875,
-0.0120697021484375,
0.021392822265625,
-0.039306640625,
0.06768798828125,
0.01288604736328125,
-0.0138092041015625,
0.00023174285888671875,
-0.05047607421875,
0.03106689453125,
-0.015655517578125,
-0.01354217529296875,
-0.0296478271484375,
-0.0026874542236328125,
0.080810546875,
-0.0172271728515625,
0.037384033203125,
0.02252197265625,
0.0008769035339355469,
0.02264404296875,
-0.005329132080078125,
0.0279541015625,
-0.01177978515625,
0.0005807876586914062,
0.0296478271484375,
0.00980377197265625,
-0.0232086181640625,
-0.041748046875,
0.015655517578125,
-0.03948974609375,
0.0005469322204589844,
-0.0298919677734375,
-0.054168701171875,
0.0146636962890625,
-0.0021648406982421875,
0.046875,
0.039093017578125,
-0.0092926025390625,
0.024383544921875,
0.058349609375,
-0.022857666015625,
0.045379638671875,
0.03082275390625,
-0.015655517578125,
-0.0305328369140625,
0.0784912109375,
0.0222320556640625,
0.0139312744140625,
0.023773193359375,
0.0203704833984375,
-0.00882720947265625,
-0.01131439208984375,
-0.0222625732421875,
0.017303466796875,
-0.0308837890625,
-0.05126953125,
-0.032318115234375,
-0.032012939453125,
-0.02850341796875,
-0.002941131591796875,
-0.038330078125,
-0.04876708984375,
-0.050689697265625,
-0.00762176513671875,
0.05615234375,
0.03704833984375,
0.0026416778564453125,
0.0247650146484375,
-0.06683349609375,
0.0033817291259765625,
0.0003657341003417969,
0.0433349609375,
-0.018951416015625,
-0.064697265625,
-0.043853759765625,
-0.0235748291015625,
-0.0217132568359375,
-0.06591796875,
0.0301971435546875,
0.045806884765625,
0.012664794921875,
0.028289794921875,
0.004032135009765625,
0.04779052734375,
-0.04718017578125,
0.080322265625,
-0.00411224365234375,
-0.061737060546875,
0.0523681640625,
-0.0292816162109375,
0.07061767578125,
0.0202484130859375,
0.049560546875,
-0.0190582275390625,
-0.044097900390625,
-0.073486328125,
-0.060546875,
0.03765869140625,
0.00775146484375,
-0.01381683349609375,
0.0145416259765625,
0.028076171875,
0.000865936279296875,
-0.0032215118408203125,
-0.053131103515625,
-0.0157623291015625,
0.017120361328125,
-0.036407470703125,
0.004329681396484375,
-0.024200439453125,
0.0108795166015625,
-0.0201263427734375,
0.06805419921875,
-0.01434326171875,
0.04168701171875,
-0.0033969879150390625,
-0.0157623291015625,
0.00858306884765625,
0.003910064697265625,
0.0718994140625,
0.047119140625,
-0.0226593017578125,
-0.018798828125,
0.0227508544921875,
-0.06451416015625,
0.014739990234375,
0.0022735595703125,
0.007381439208984375,
-0.006542205810546875,
0.04742431640625,
0.06414794921875,
-0.008544921875,
-0.048095703125,
0.031768798828125,
-0.0040435791015625,
-0.01082611083984375,
-0.024261474609375,
0.01540374755859375,
-0.013916015625,
0.01507568359375,
0.02850341796875,
-0.0158233642578125,
0.0269622802734375,
-0.00458526611328125,
0.04864501953125,
0.0183563232421875,
-0.053131103515625,
-0.031768798828125,
0.054412841796875,
-0.018463134765625,
-0.047393798828125,
0.0587158203125,
-0.0007143020629882812,
-0.031982421875,
0.039825439453125,
0.04107666015625,
0.067626953125,
-0.047607421875,
0.0268096923828125,
0.040496826171875,
0.01971435546875,
-0.01910400390625,
0.01346588134765625,
0.014190673828125,
-0.039093017578125,
-0.0259552001953125,
-0.06158447265625,
-0.0109405517578125,
0.0196990966796875,
-0.06597900390625,
0.03314208984375,
-0.0172271728515625,
-0.037841796875,
0.005886077880859375,
0.0114593505859375,
-0.039581298828125,
0.04730224609375,
-0.0076446533203125,
0.0938720703125,
-0.0904541015625,
0.04827880859375,
0.045318603515625,
-0.0124359130859375,
-0.06561279296875,
-0.0270538330078125,
0.014190673828125,
-0.040985107421875,
0.04217529296875,
0.0206451416015625,
0.01422119140625,
-0.004116058349609375,
-0.062103271484375,
-0.07672119140625,
0.08123779296875,
0.015411376953125,
-0.034027099609375,
-0.00991058349609375,
0.0021343231201171875,
0.04791259765625,
-0.0177459716796875,
0.023895263671875,
0.050567626953125,
0.000885009765625,
-0.0226593017578125,
-0.085205078125,
-0.01079559326171875,
-0.0268707275390625,
0.0059661865234375,
0.0220794677734375,
-0.050323486328125,
0.058380126953125,
-0.006633758544921875,
-0.0007700920104980469,
-0.0006937980651855469,
0.06622314453125,
0.0174102783203125,
0.0174102783203125,
0.0250091552734375,
0.06829833984375,
0.05499267578125,
0.0021419525146484375,
0.06304931640625,
-0.022735595703125,
0.04779052734375,
0.09393310546875,
0.0174713134765625,
0.07440185546875,
0.01271820068359375,
-0.017608642578125,
0.0399169921875,
0.0535888671875,
-0.01177978515625,
0.026031494140625,
0.00905609130859375,
0.0098419189453125,
-0.002231597900390625,
-0.00848388671875,
0.005954742431640625,
0.042877197265625,
0.0243377685546875,
-0.0302276611328125,
0.01971435546875,
0.027252197265625,
0.01248931884765625,
-0.01922607421875,
-0.0128631591796875,
0.061309814453125,
0.004962921142578125,
-0.028076171875,
0.034759521484375,
-0.00801849365234375,
0.02630615234375,
-0.0304718017578125,
-0.0217437744140625,
0.00875091552734375,
0.00865936279296875,
-0.0184783935546875,
-0.0753173828125,
0.019439697265625,
-0.030242919921875,
-0.0321044921875,
0.004970550537109375,
0.060821533203125,
-0.046966552734375,
-0.032440185546875,
-0.01007843017578125,
-0.008331298828125,
0.0238189697265625,
0.00862884521484375,
-0.0614013671875,
0.00823974609375,
-0.007640838623046875,
-0.04498291015625,
0.01496124267578125,
0.02227783203125,
0.0013742446899414062,
0.0611572265625,
0.036834716796875,
-0.020843505859375,
-0.0009489059448242188,
-0.041595458984375,
0.03717041015625,
-0.048675537109375,
-0.03570556640625,
-0.064697265625,
0.0618896484375,
-0.00853729248046875,
-0.0655517578125,
0.05621337890625,
0.06414794921875,
0.0399169921875,
-0.007221221923828125,
0.01387786865234375,
0.00603485107421875,
0.041839599609375,
-0.043701171875,
0.039154052734375,
-0.042816162109375,
0.0010814666748046875,
-0.031402587890625,
-0.06024169921875,
-0.026611328125,
0.04742431640625,
0.01549530029296875,
-0.0102081298828125,
0.055023193359375,
0.05096435546875,
0.0163421630859375,
0.00603485107421875,
0.0263824462890625,
-0.00359344482421875,
0.01207733154296875,
0.050018310546875,
0.01068878173828125,
-0.07623291015625,
0.0246734619140625,
-0.00958251953125,
-0.011962890625,
-0.005542755126953125,
-0.079833984375,
-0.07373046875,
-0.06256103515625,
-0.038177490234375,
-0.043243408203125,
-0.006320953369140625,
0.0633544921875,
0.05780029296875,
-0.04150390625,
-0.01551055908203125,
0.006378173828125,
-0.00482177734375,
-0.0217132568359375,
-0.023284912109375,
0.0494384765625,
-0.00826263427734375,
-0.050689697265625,
0.009521484375,
0.0279541015625,
0.01263427734375,
-0.004810333251953125,
0.00858306884765625,
-0.0386962890625,
0.01557159423828125,
0.031982421875,
0.0059814453125,
-0.0238037109375,
-0.0301055908203125,
0.002147674560546875,
-0.0066986083984375,
0.00893402099609375,
0.034454345703125,
-0.0545654296875,
0.03875732421875,
0.050018310546875,
0.03765869140625,
0.038330078125,
0.004398345947265625,
0.05828857421875,
-0.062042236328125,
0.01262664794921875,
0.019683837890625,
-0.007114410400390625,
0.0055694580078125,
-0.039886474609375,
0.051116943359375,
0.004329681396484375,
-0.0772705078125,
-0.070556640625,
-0.00797271728515625,
-0.0635986328125,
-0.0203857421875,
0.060638427734375,
-0.01335906982421875,
-0.04656982421875,
0.00298309326171875,
-0.0068206787109375,
0.01209259033203125,
-0.0255889892578125,
0.04278564453125,
0.057373046875,
-0.0167388916015625,
-0.0216064453125,
-0.01470947265625,
0.06378173828125,
0.0116424560546875,
-0.05657958984375,
-0.024658203125,
0.00872802734375,
0.0205230712890625,
0.031524658203125,
0.045745849609375,
0.0094146728515625,
0.0289764404296875,
0.017822265625,
0.006927490234375,
0.005924224853515625,
-0.049072265625,
-0.0293426513671875,
0.0164947509765625,
-0.01324462890625,
-0.01024627685546875
]
] |
humarin/chatgpt_paraphraser_on_T5_base | 2023-10-07T20:18:02.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"en",
"dataset:humarin/chatgpt-paraphrases",
"license:openrail",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | humarin | null | null | humarin/chatgpt_paraphraser_on_T5_base | 107 | 63,980 | transformers | 2023-03-17T18:22:37 | ---
license: openrail
datasets:
- humarin/chatgpt-paraphrases
language:
- en
library_name: transformers
inference:
parameters:
num_beams: 5
num_beam_groups: 5
num_return_sequences: 5
repetition_penalty: 10.01
diversity_penalty: 3.01
no_repeat_ngram_size: 2
temperature: 0.7
max_length: 128
widget:
- text: What are the best places to see in New York?
example_title: New York tourist attractions
- text: When should I go to the doctor?
example_title: Doctor's time
- text: >-
Rammstein's album Mutter was recorded in the south of France in May and June
2000, and mixed in Stockholm in October of that year.
example_title: Rammstein's album Mutter
pipeline_tag: text2text-generation
---
This model was trained on our [ChatGPT paraphrase dataset](https://huggingface.co/datasets/humarin/chatgpt-paraphrases).
This dataset is based on the [Quora Question Pairs dataset](https://www.kaggle.com/competitions/quora-question-pairs), texts from [SQuAD 2.0](https://huggingface.co/datasets/squad_v2), and the [CNN/DailyMail news dataset](https://huggingface.co/datasets/cnn_dailymail).
This model is based on the T5-base model. We used transfer learning to get our model to generate paraphrases on par with ChatGPT, and we consider it one of the best paraphrasers available on Hugging Face.
[Kaggle](https://www.kaggle.com/datasets/vladimirvorobevv/chatgpt-paraphrases) link
[Author's LinkedIn](https://www.linkedin.com/in/vladimir-vorobev/) link
## Deploying example
```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Fall back to CPU when no GPU is available
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("humarin/chatgpt_paraphraser_on_T5_base")
model = AutoModelForSeq2SeqLM.from_pretrained("humarin/chatgpt_paraphraser_on_T5_base").to(device)
def paraphrase(
question,
num_beams=5,
num_beam_groups=5,
num_return_sequences=5,
repetition_penalty=10.0,
diversity_penalty=3.0,
no_repeat_ngram_size=2,
temperature=0.7,
max_length=128
):
input_ids = tokenizer(
f'paraphrase: {question}',
return_tensors="pt", padding="longest",
max_length=max_length,
truncation=True,
    ).input_ids.to(device)  # inputs must live on the same device as the model
outputs = model.generate(
input_ids, temperature=temperature, repetition_penalty=repetition_penalty,
num_return_sequences=num_return_sequences, no_repeat_ngram_size=no_repeat_ngram_size,
num_beams=num_beams, num_beam_groups=num_beam_groups,
max_length=max_length, diversity_penalty=diversity_penalty
)
res = tokenizer.batch_decode(outputs, skip_special_tokens=True)
return res
```
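Note that `model.generate` enforces a few relationships among these diverse beam search settings. The helper below (our own name, not part of `transformers`) sketches the checks that matter for the defaults used above:

```python
def check_generation_params(num_beams=5, num_beam_groups=5,
                            num_return_sequences=5, diversity_penalty=3.0):
    """Sanity-check the diverse beam search settings used above."""
    if num_beams % num_beam_groups != 0:
        raise ValueError("num_beams must be divisible by num_beam_groups")
    if num_beam_groups > 1 and diversity_penalty <= 0.0:
        raise ValueError("group diversity requires diversity_penalty > 0")
    if num_return_sequences > num_beams:
        raise ValueError("num_return_sequences cannot exceed num_beams")
    return True

check_generation_params()  # the defaults above pass
```

With `num_beam_groups=5` and `num_beams=5`, each group runs a single beam, which is what makes `diversity_penalty` effective here.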
## Usage examples
**Input:**
```python
text = 'What are the best places to see in New York?'
paraphrase(text)
```
**Output:**
```python
['What are some must-see places in New York?',
'Can you suggest some must-see spots in New York?',
'Where should one go to experience the best NYC has to offer?',
'Which places should I visit in New York?',
'What are the top destinations to explore in New York?']
```
**Input:**
```python
text = "Rammstein's album Mutter was recorded in the south of France in May and June 2000, and mixed in Stockholm in October of that year."
paraphrase(text)
```
**Output:**
```python
['In May and June 2000, Rammstein travelled to the south of France to record his album Mutter, which was mixed in Stockholm in October of that year.',
'The album Mutter by Rammstein was recorded in the south of France during May and June 2000, with mixing taking place in Stockholm in October of that year.',
'The album Mutter by Rammstein was recorded in the south of France during May and June 2000, with mixing taking place in Stockholm in October of that year. It',
'Mutter, the album released by Rammstein, was recorded in southern France during May and June 2000, with mixing taking place between October and September.',
'In May and June 2000, Rammstein recorded his album Mutter in the south of France, with the mix being made at Stockholm during October.']
```
## Train parameters
```python
epochs = 5
batch_size = 64
max_length = 128
lr = 5e-5
batches_qty = 196465
betas = (0.9, 0.999)
eps = 1e-08
```
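Assuming `batches_qty` counts batches per epoch (our reading of the parameters above; it is not stated explicitly), the implied training scale works out as:

```python
epochs = 5
batch_size = 64
batches_per_epoch = 196465  # "batches_qty" above, assumed to be per-epoch

examples_per_epoch = batches_per_epoch * batch_size
total_optimizer_steps = epochs * batches_per_epoch

print(examples_per_epoch)     # 12573760 examples seen per epoch
print(total_optimizer_steps)  # 982325 optimizer steps over the full run
```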
### BibTeX entry and citation info
```bibtex
@inproceedings{chatgpt_paraphraser,
  author={Vorobev, Vladimir and Kuznetsov, Maxim},
title={A paraphrasing model based on ChatGPT paraphrases},
year={2023}
}
``` | 4,366 | [
[
-0.0256500244140625,
-0.041259765625,
0.0318603515625,
0.0219573974609375,
-0.0180206298828125,
-0.023345947265625,
-0.0084381103515625,
-0.0135040283203125,
0.00887298583984375,
0.0256500244140625,
-0.0261077880859375,
-0.0360107421875,
-0.036285400390625,
0.00817108154296875,
-0.043121337890625,
0.09014892578125,
-0.01134490966796875,
0.0013036727905273438,
-0.004154205322265625,
-0.0199127197265625,
-0.0108642578125,
-0.0294189453125,
-0.035369873046875,
-0.01116180419921875,
0.0168609619140625,
0.00676727294921875,
0.03912353515625,
0.031005859375,
0.0276031494140625,
0.0256500244140625,
-0.01873779296875,
0.004627227783203125,
-0.0297393798828125,
-0.00434112548828125,
0.0015277862548828125,
-0.0217437744140625,
-0.01013946533203125,
0.006061553955078125,
0.044189453125,
0.0316162109375,
-0.0178070068359375,
0.0238037109375,
0.007160186767578125,
0.01557159423828125,
-0.0201568603515625,
0.0102081298828125,
-0.046539306640625,
0.0024566650390625,
0.007305145263671875,
-0.00791168212890625,
-0.01300811767578125,
-0.032196044921875,
0.0149383544921875,
-0.036956787109375,
0.0188751220703125,
-0.0001347064971923828,
0.10345458984375,
0.005062103271484375,
-0.0182037353515625,
-0.0168609619140625,
-0.0239105224609375,
0.068603515625,
-0.06695556640625,
0.00846099853515625,
0.038116455078125,
0.01113128662109375,
-0.0036449432373046875,
-0.06256103515625,
-0.044677734375,
-0.022918701171875,
-0.0214996337890625,
0.0173187255859375,
-0.01338958740234375,
-0.03155517578125,
0.0107574462890625,
0.027496337890625,
-0.058746337890625,
-0.043212890625,
-0.051910400390625,
-0.0236358642578125,
0.03594970703125,
0.009429931640625,
0.0292816162109375,
-0.04010009765625,
-0.03521728515625,
-0.0259552001953125,
-0.0157470703125,
0.0008831024169921875,
0.005214691162109375,
-0.005901336669921875,
-0.02423095703125,
0.037353515625,
-0.020782470703125,
0.0372314453125,
0.0154266357421875,
-0.0012722015380859375,
0.036712646484375,
-0.043121337890625,
0.0027408599853515625,
-0.022247314453125,
0.0894775390625,
0.051727294921875,
0.018341064453125,
-0.02734375,
-0.021240234375,
-0.00849151611328125,
-0.0241546630859375,
-0.053985595703125,
-0.04693603515625,
0.010711669921875,
-0.035308837890625,
-0.0287933349609375,
-0.004840850830078125,
-0.062286376953125,
0.002498626708984375,
-0.01149749755859375,
0.056915283203125,
-0.058929443359375,
-0.007541656494140625,
0.004119873046875,
-0.041748046875,
0.0220947265625,
-0.012908935546875,
-0.05438232421875,
0.020050048828125,
0.03192138671875,
0.08251953125,
-0.003955841064453125,
-0.049713134765625,
-0.0205230712890625,
0.01425933837890625,
-0.0109100341796875,
0.05291748046875,
-0.0208892822265625,
-0.01409149169921875,
-0.035552978515625,
-0.005130767822265625,
-0.0164794921875,
-0.04736328125,
0.0253448486328125,
-0.0300750732421875,
0.0389404296875,
-0.00875091552734375,
-0.042572021484375,
-0.006221771240234375,
0.0135345458984375,
-0.0251312255859375,
0.07989501953125,
0.0212249755859375,
-0.06988525390625,
-0.007083892822265625,
-0.02679443359375,
-0.0300445556640625,
-0.0255279541015625,
-0.00920867919921875,
-0.02740478515625,
-0.0042877197265625,
0.036590576171875,
0.0312347412109375,
-0.01934814453125,
0.018951416015625,
-0.01641845703125,
-0.017608642578125,
0.038543701171875,
-0.0246124267578125,
0.08343505859375,
0.0225830078125,
-0.02667236328125,
-0.0036602020263671875,
-0.058746337890625,
0.00457763671875,
0.01074981689453125,
-0.0330810546875,
-0.006072998046875,
-0.01491546630859375,
0.0213623046875,
0.024627685546875,
0.029083251953125,
-0.036346435546875,
-0.01373291015625,
-0.0445556640625,
0.068359375,
0.049774169921875,
0.02471923828125,
0.032928466796875,
-0.060638427734375,
0.03167724609375,
0.011871337890625,
0.01175689697265625,
-0.00189208984375,
-0.040557861328125,
-0.033050537109375,
-0.00601959228515625,
0.022003173828125,
0.061370849609375,
-0.04620361328125,
0.03314208984375,
-0.0206451416015625,
-0.058685302734375,
-0.038909912109375,
0.001346588134765625,
0.0207977294921875,
0.060821533203125,
0.05029296875,
0.006195068359375,
-0.04229736328125,
-0.06842041015625,
-0.0191802978515625,
-0.015167236328125,
0.0047454833984375,
0.01418304443359375,
0.036834716796875,
0.00775909423828125,
0.053436279296875,
-0.044342041015625,
0.001556396484375,
-0.017852783203125,
-0.0004260540008544922,
0.0302276611328125,
0.0531005859375,
0.047271728515625,
-0.05853271484375,
-0.0498046875,
-0.01947021484375,
-0.0694580078125,
-0.006381988525390625,
-0.003803253173828125,
-0.033905029296875,
-0.0086517333984375,
0.0374755859375,
-0.06365966796875,
0.012664794921875,
0.0275421142578125,
-0.01540374755859375,
0.0443115234375,
-0.01232147216796875,
0.017547607421875,
-0.10345458984375,
0.00440216064453125,
-0.0014324188232421875,
-0.007045745849609375,
-0.03472900390625,
0.0028839111328125,
0.0022182464599609375,
-0.0161895751953125,
-0.040771484375,
0.046142578125,
-0.0238800048828125,
0.0260162353515625,
0.0011816024780273438,
0.01107025146484375,
-0.005126953125,
0.04693603515625,
-0.01110076904296875,
0.05218505859375,
0.05975341796875,
-0.048980712890625,
0.0325927734375,
0.046478271484375,
-0.0364990234375,
0.02752685546875,
-0.056671142578125,
0.002025604248046875,
-0.0003535747528076172,
0.0216522216796875,
-0.0814208984375,
-0.01543426513671875,
0.031829833984375,
-0.05291748046875,
-0.01250457763671875,
0.00775146484375,
-0.03997802734375,
-0.01442718505859375,
-0.017547607421875,
0.0222015380859375,
0.06732177734375,
-0.027801513671875,
0.04144287109375,
0.0250701904296875,
0.00099945068359375,
-0.02728271484375,
-0.052581787109375,
0.0092315673828125,
-0.044769287109375,
-0.04766845703125,
0.016632080078125,
-0.01244354248046875,
-0.0015611648559570312,
0.004993438720703125,
-0.00792694091796875,
-0.023712158203125,
-0.0079193115234375,
0.0024738311767578125,
0.0144500732421875,
-0.01082611083984375,
-0.00826263427734375,
-0.01049041748046875,
-0.0036220550537109375,
0.0108642578125,
-0.020050048828125,
0.048492431640625,
-0.0243377685546875,
0.01499176025390625,
-0.05108642578125,
0.03302001953125,
0.038909912109375,
-0.0079803466796875,
0.04364013671875,
0.057708740234375,
-0.008941650390625,
0.01171112060546875,
-0.029815673828125,
-0.021331787109375,
-0.035400390625,
0.0287322998046875,
-0.0328369140625,
-0.03375244140625,
0.050811767578125,
0.01861572265625,
0.00331878662109375,
0.050018310546875,
0.034393310546875,
-0.0253448486328125,
0.07489013671875,
0.005584716796875,
-0.00786590576171875,
0.0263671875,
-0.033203125,
-0.008026123046875,
-0.058837890625,
-0.0183563232421875,
-0.0249176025390625,
-0.033477783203125,
-0.043792724609375,
-0.031494140625,
0.03265380859375,
0.00583648681640625,
-0.0186309814453125,
0.050018310546875,
-0.04095458984375,
0.024078369140625,
0.050262451171875,
0.0296783447265625,
-0.002960205078125,
0.005550384521484375,
-0.01041412353515625,
0.0081634521484375,
-0.053009033203125,
-0.03179931640625,
0.0838623046875,
0.0118560791015625,
0.045257568359375,
-0.0009293556213378906,
0.06298828125,
-0.00669097900390625,
-0.0012493133544921875,
-0.03472900390625,
0.06494140625,
-0.00833892822265625,
-0.04754638671875,
-0.0179443359375,
-0.03875732421875,
-0.0672607421875,
0.0231781005859375,
0.0089569091796875,
-0.048309326171875,
0.0007810592651367188,
0.005550384521484375,
-0.022064208984375,
0.01433563232421875,
-0.05633544921875,
0.062255859375,
-0.006389617919921875,
0.001270294189453125,
0.0020465850830078125,
-0.0633544921875,
0.0173797607421875,
0.0009450912475585938,
0.00772857666015625,
-0.0138397216796875,
-0.00476837158203125,
0.08538818359375,
-0.032867431640625,
0.04290771484375,
-0.01047515869140625,
0.01605224609375,
0.0278778076171875,
0.000858306884765625,
0.033294677734375,
0.0023593902587890625,
-0.0186767578125,
0.01568603515625,
0.027679443359375,
-0.040557861328125,
-0.0421142578125,
0.05535888671875,
-0.0714111328125,
-0.036468505859375,
-0.02294921875,
-0.058441162109375,
-0.01129913330078125,
0.00943756103515625,
0.056854248046875,
0.040374755859375,
0.006072998046875,
0.0153656005859375,
0.03460693359375,
-0.0129547119140625,
0.037445068359375,
0.01343536376953125,
-0.006374359130859375,
-0.05108642578125,
0.07293701171875,
0.0171661376953125,
0.021240234375,
0.026458740234375,
0.045257568359375,
-0.0198974609375,
-0.031768798828125,
-0.0223846435546875,
0.040557861328125,
-0.059234619140625,
-0.006839752197265625,
-0.07177734375,
-0.0192718505859375,
-0.05780029296875,
0.006763458251953125,
-0.006160736083984375,
-0.04022216796875,
-0.0162811279296875,
0.01261138916015625,
0.03936767578125,
0.0225067138671875,
-0.01497650146484375,
0.019989013671875,
-0.053192138671875,
0.038330078125,
0.005451202392578125,
-0.01497650146484375,
-0.005718231201171875,
-0.07000732421875,
-0.0152130126953125,
-0.003017425537109375,
-0.033935546875,
-0.08770751953125,
0.03485107421875,
0.0199127197265625,
0.050750732421875,
0.0123443603515625,
0.01012420654296875,
0.059417724609375,
-0.03948974609375,
0.06414794921875,
0.01122283935546875,
-0.0645751953125,
0.0513916015625,
-0.0308074951171875,
0.029052734375,
0.035614013671875,
0.03521728515625,
-0.038543701171875,
-0.05810546875,
-0.07275390625,
-0.073486328125,
0.047637939453125,
0.030609130859375,
0.02545166015625,
-0.012542724609375,
0.03118896484375,
-0.0034656524658203125,
0.01995849609375,
-0.07147216796875,
-0.03375244140625,
-0.02996826171875,
-0.0361328125,
-0.01837158203125,
0.01366424560546875,
-0.0202484130859375,
-0.045074462890625,
0.0528564453125,
0.0185394287109375,
0.041656494140625,
0.0293731689453125,
-0.01462554931640625,
0.0175628662109375,
-0.0047760009765625,
0.04290771484375,
0.06085205078125,
-0.0213623046875,
0.01824951171875,
0.04290771484375,
-0.0272369384765625,
0.0070648193359375,
-0.00096893310546875,
-0.022979736328125,
0.0216217041015625,
0.0274200439453125,
0.0679931640625,
0.0084686279296875,
-0.04644775390625,
0.040374755859375,
0.0087738037109375,
-0.0299530029296875,
-0.02752685546875,
0.007747650146484375,
0.0159759521484375,
0.0302276611328125,
0.017669677734375,
0.010009765625,
0.00940704345703125,
-0.030426025390625,
0.033721923828125,
0.01349639892578125,
-0.01438140869140625,
-0.0220947265625,
0.0599365234375,
-0.0216064453125,
-0.01343536376953125,
0.0439453125,
-0.00817108154296875,
-0.048919677734375,
0.05010986328125,
0.03521728515625,
0.0595703125,
-0.01129150390625,
0.01096343994140625,
0.0394287109375,
0.01184844970703125,
-0.01467132568359375,
0.0161895751953125,
-0.01366424560546875,
-0.0374755859375,
-0.0024509429931640625,
-0.046661376953125,
-0.0033740997314453125,
0.031005859375,
-0.051177978515625,
0.0163726806640625,
-0.0257415771484375,
-0.00533294677734375,
0.00018835067749023438,
-0.0292205810546875,
-0.041961669921875,
0.014617919921875,
-0.016204833984375,
0.06439208984375,
-0.07928466796875,
0.0438232421875,
0.05145263671875,
-0.033599853515625,
-0.07421875,
0.0201568603515625,
-0.0293121337890625,
-0.06280517578125,
0.050140380859375,
0.0012311935424804688,
0.0112152099609375,
0.0137176513671875,
-0.04376220703125,
-0.0694580078125,
0.07489013671875,
0.0135650634765625,
-0.00971221923828125,
-0.01213836669921875,
-0.007537841796875,
0.04486083984375,
-0.01081085205078125,
0.0201568603515625,
0.05145263671875,
0.0242462158203125,
0.015777587890625,
-0.06439208984375,
0.01500701904296875,
-0.0209197998046875,
-0.005374908447265625,
0.00246429443359375,
-0.057342529296875,
0.08935546875,
-0.0018053054809570312,
0.010711669921875,
0.0184173583984375,
0.06561279296875,
0.02069091796875,
0.017608642578125,
0.040313720703125,
0.053192138671875,
0.038909912109375,
0.0037364959716796875,
0.076416015625,
-0.031463623046875,
0.0478515625,
0.07635498046875,
0.0098419189453125,
0.07470703125,
0.046661376953125,
-0.0225830078125,
0.0234375,
0.0562744140625,
-0.022979736328125,
0.074462890625,
0.01274871826171875,
-0.0229644775390625,
-0.01012420654296875,
0.027069091796875,
-0.03466796875,
0.023712158203125,
0.003452301025390625,
-0.0202484130859375,
-0.034210205078125,
0.0217742919921875,
0.01458740234375,
-0.0158233642578125,
0.0004470348358154297,
0.05609130859375,
0.00588226318359375,
-0.0689697265625,
0.06842041015625,
0.0007419586181640625,
0.048309326171875,
-0.0286712646484375,
0.019012451171875,
-0.0246734619140625,
0.002933502197265625,
-0.00036334991455078125,
-0.0638427734375,
0.027984619140625,
-0.0074005126953125,
-0.01078033447265625,
-0.01104736328125,
0.0294189453125,
-0.041961669921875,
-0.043792724609375,
0.0174102783203125,
0.0232696533203125,
0.050445556640625,
-0.0003616809844970703,
-0.08099365234375,
-0.0206451416015625,
0.0108489990234375,
-0.0360107421875,
0.01800537109375,
0.044525146484375,
0.016265869140625,
0.045257568359375,
0.034271240234375,
0.01861572265625,
0.0291748046875,
-0.022735595703125,
0.060028076171875,
-0.047637939453125,
-0.046630859375,
-0.0723876953125,
0.022430419921875,
-0.007411956787109375,
-0.041717529296875,
0.07122802734375,
0.0562744140625,
0.05828857421875,
-0.0133819580078125,
0.059967041015625,
-0.01593017578125,
0.0467529296875,
-0.040313720703125,
0.05364990234375,
-0.0419921875,
0.0007104873657226562,
-0.01470947265625,
-0.058990478515625,
0.01096343994140625,
0.0679931640625,
-0.030242919921875,
0.01080322265625,
0.064453125,
0.08953857421875,
-0.01317596435546875,
-0.01505279541015625,
0.01299285888671875,
0.0296783447265625,
0.0121002197265625,
0.056915283203125,
0.0615234375,
-0.07061767578125,
0.0750732421875,
-0.0460205078125,
-0.004108428955078125,
-0.005584716796875,
-0.043121337890625,
-0.05780029296875,
-0.0701904296875,
-0.0419921875,
-0.055389404296875,
-0.0033740997314453125,
0.05963134765625,
0.042572021484375,
-0.06646728515625,
-0.0235595703125,
-0.00848388671875,
-0.01190185546875,
-0.0216064453125,
-0.022979736328125,
0.049102783203125,
-0.02374267578125,
-0.060882568359375,
0.017425537109375,
-0.012176513671875,
0.01513671875,
0.01061248779296875,
0.003124237060546875,
-0.0362548828125,
0.01189422607421875,
0.0240478515625,
0.00902557373046875,
-0.053466796875,
-0.0194549560546875,
-0.0009584426879882812,
-0.02294921875,
0.0161285400390625,
0.018646240234375,
-0.0325927734375,
-0.0004508495330810547,
0.0538330078125,
0.03216552734375,
0.04302978515625,
0.00562286376953125,
0.022674560546875,
-0.032867431640625,
-0.002483367919921875,
0.00844573974609375,
0.02276611328125,
0.0164794921875,
-0.006267547607421875,
0.0223846435546875,
0.03594970703125,
-0.054473876953125,
-0.06109619140625,
-0.01314544677734375,
-0.09405517578125,
-0.0269622802734375,
0.1029052734375,
-0.01422119140625,
-0.0260467529296875,
0.00042700767517089844,
-0.02020263671875,
0.0457763671875,
-0.01995849609375,
0.060882568359375,
0.06536865234375,
-0.025726318359375,
-0.0016374588012695312,
-0.061065673828125,
0.0309906005859375,
0.0521240234375,
-0.04193115234375,
-0.020050048828125,
0.0163726806640625,
0.06353759765625,
0.00797271728515625,
0.0421142578125,
-0.008392333984375,
0.0205230712890625,
-0.00394439697265625,
-0.01232147216796875,
0.00024390220642089844,
0.00891876220703125,
-0.0120849609375,
0.0185394287109375,
-0.0207977294921875,
-0.03875732421875
]
] |
nitrosocke/redshift-diffusion | 2023-05-16T09:25:37.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"image-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | nitrosocke | null | null | nitrosocke/redshift-diffusion | 605 | 63,559 | diffusers | 2022-11-06T16:48:49 | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/nitrosocke/redshift-diffusion/resolve/main/images/redshift-diffusion-samples-01s.jpg"
tags:
- stable-diffusion
- text-to-image
- image-to-image
---
### Redshift Diffusion
This is a fine-tuned Stable Diffusion model trained on high-resolution 3D artwork.
Use the tokens **_redshift style_** in your prompts for the effect.
**The name:** I used Cinema 4D for a long time as my go-to modeling software and always liked the Redshift renderer it shipped with. That is why I was sad to see the poor results base Stable Diffusion produces for this token. This model is my attempt at fixing that and at showing my passion for this render engine.
**If you enjoy my work and want to test new models before release, please consider supporting me**
[](https://patreon.com/user?u=79196446)
**Characters rendered with the model:**

**Cars and Landscapes rendered with the model:**

#### Prompt and settings for Tony Stark:
**(redshift style) robert downey jr as ironman**
**Negative prompt:** glasses helmet
_Steps: 40, Sampler: DPM2 Karras, CFG scale: 7, Seed: 908018284, Size: 512x704_
#### Prompt and settings for the Ford Mustang:
**redshift style Ford Mustang**
_Steps: 20, Sampler: DPM2 Karras, CFG scale: 7, Seed: 579593863, Size: 704x512_
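For reference, web-UI settings like the ones above can be approximated in 🧨 Diffusers. The sketch below is a hedged translation, not an exact reproduction: `KDPM2DiscreteScheduler` with `use_karras_sigmas=True` is assumed to be the closest match for the "DPM2 Karras" sampler, and outputs will still differ between front ends even with the same seed.

```python
import torch
from diffusers import StableDiffusionPipeline, KDPM2DiscreteScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "nitrosocke/redshift-diffusion", torch_dtype=torch.float16
).to("cuda")
# Assumed closest analogue of the web UI's "DPM2 Karras" sampler
pipe.scheduler = KDPM2DiscreteScheduler.from_config(
    pipe.scheduler.config, use_karras_sigmas=True
)

generator = torch.Generator("cuda").manual_seed(579593863)  # Seed: 579593863
image = pipe(
    "redshift style Ford Mustang",
    num_inference_steps=20,  # Steps: 20
    guidance_scale=7,        # CFG scale: 7
    width=704, height=512,   # Size: 704x512
    generator=generator,
).images[0]
image.save("mustang.png")
```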
This model was trained with ShivamShrirao's diffusers-based DreamBooth training script, using prior-preservation loss and the _train-text-encoder_ flag, for 11,000 steps.
### Gradio
A [Gradio](https://github.com/gradio-app/gradio) web UI is available to run redshift-diffusion:
[](https://huggingface.co/spaces/nitrosocke/Redshift-Diffusion-Demo)
### 🧨 Diffusers
This model can be used just like any other Stable Diffusion model. For more information,
please have a look at the [Stable Diffusion documentation](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion).
You can also export the model to [ONNX](https://huggingface.co/docs/diffusers/optimization/onnx), [MPS](https://huggingface.co/docs/diffusers/optimization/mps) and/or FLAX/JAX.
```python
from diffusers import StableDiffusionPipeline
import torch
model_id = "nitrosocke/redshift-diffusion"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda")
prompt = "redshift style magical princess with golden hair"
image = pipe(prompt).images[0]
image.save("./magical_princess.png")
```
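Since the card is also tagged for image-to-image, a stylization pass over an existing picture can be sketched as below. This is a minimal example under stated assumptions: the input file path and the `strength` value are illustrative choices, not values from the model card.

```python
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "nitrosocke/redshift-diffusion", torch_dtype=torch.float16
).to("cuda")

# Hypothetical input image; resize to a Stable Diffusion friendly resolution
init_image = Image.open("input.png").convert("RGB").resize((512, 512))

image = pipe(
    prompt="redshift style sports car",
    image=init_image,
    strength=0.6,      # lower values keep more of the input image
    guidance_scale=7,
).images[0]
image.save("stylized.png")
```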
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) | 3,854 | [
[
-0.039581298828125,
-0.056060791015625,
0.031463623046875,
0.0238189697265625,
-0.0220184326171875,
-0.0163726806640625,
0.00838470458984375,
-0.0318603515625,
0.0203704833984375,
0.037841796875,
-0.055816650390625,
-0.044464111328125,
-0.051513671875,
-0.0037441253662109375,
-0.024505615234375,
0.08294677734375,
-0.009429931640625,
0.0028781890869140625,
-0.0027561187744140625,
0.0040435791015625,
-0.0107421875,
-0.0026035308837890625,
-0.05108642578125,
-0.02789306640625,
0.03363037109375,
0.00812530517578125,
0.059814453125,
0.032958984375,
0.04290771484375,
0.0237274169921875,
-0.0362548828125,
-0.01214599609375,
-0.0421142578125,
-0.00870513916015625,
-0.0098419189453125,
-0.016021728515625,
-0.055694580078125,
0.00616455078125,
0.048736572265625,
0.018524169921875,
-0.0307769775390625,
-0.0033779144287109375,
-0.01515960693359375,
0.046844482421875,
-0.0374755859375,
-0.005458831787109375,
-0.0178680419921875,
0.007404327392578125,
-0.01161956787109375,
0.0250396728515625,
-0.002147674560546875,
-0.035614013671875,
0.01209259033203125,
-0.059600830078125,
0.03399658203125,
-0.0111236572265625,
0.08636474609375,
0.02423095703125,
-0.0204925537109375,
0.01059722900390625,
-0.043609619140625,
0.050567626953125,
-0.058258056640625,
0.041046142578125,
0.00879669189453125,
0.03997802734375,
0.0167083740234375,
-0.0811767578125,
-0.04400634765625,
0.0035552978515625,
0.0038318634033203125,
0.035308837890625,
-0.0159454345703125,
-0.006961822509765625,
0.01343536376953125,
0.0343017578125,
-0.052734375,
-0.00951385498046875,
-0.05224609375,
-0.0066070556640625,
0.0338134765625,
0.0140838623046875,
0.00722503662109375,
-0.0128326416015625,
-0.048004150390625,
-0.0238037109375,
-0.03759765625,
-0.0260162353515625,
0.031982421875,
0.0083770751953125,
-0.053466796875,
0.047210693359375,
-0.013946533203125,
0.0484619140625,
0.0181732177734375,
0.0159454345703125,
0.03643798828125,
-0.0145416259765625,
-0.0307769775390625,
-0.03460693359375,
0.059326171875,
0.049163818359375,
0.0025806427001953125,
-0.006595611572265625,
0.000278472900390625,
0.0036258697509765625,
0.01128387451171875,
-0.08453369140625,
-0.0355224609375,
0.041229248046875,
-0.0238037109375,
-0.01506805419921875,
-0.00774383544921875,
-0.0758056640625,
-0.0229644775390625,
0.0088348388671875,
0.0268096923828125,
-0.02392578125,
-0.053955078125,
0.032562255859375,
-0.05133056640625,
0.0020751953125,
0.030975341796875,
-0.056610107421875,
-0.004009246826171875,
0.014862060546875,
0.08062744140625,
-0.0089569091796875,
-0.0118255615234375,
0.0031604766845703125,
0.0247344970703125,
0.0031795501708984375,
0.044036865234375,
-0.0200653076171875,
-0.0391845703125,
-0.00812530517578125,
0.019134521484375,
0.00568389892578125,
-0.0386962890625,
0.046783447265625,
-0.049530029296875,
0.034149169921875,
-0.005596160888671875,
-0.027435302734375,
-0.0087127685546875,
0.01175689697265625,
-0.04339599609375,
0.043609619140625,
0.021575927734375,
-0.0721435546875,
0.017578125,
-0.0733642578125,
-0.0079803466796875,
0.00537109375,
0.023223876953125,
-0.053619384765625,
-0.00824737548828125,
-0.01715087890625,
0.037078857421875,
0.00453948974609375,
0.00051116943359375,
-0.036407470703125,
-0.00624847412109375,
-0.0159454345703125,
-0.0163726806640625,
0.09979248046875,
0.02490234375,
-0.0244293212890625,
0.01137542724609375,
-0.0513916015625,
-0.01287078857421875,
0.024688720703125,
-0.01497650146484375,
0.00876617431640625,
-0.028045654296875,
0.025115966796875,
0.01515960693359375,
0.0220794677734375,
-0.04632568359375,
0.001773834228515625,
-0.0254974365234375,
0.028076171875,
0.056671142578125,
0.0137176513671875,
0.0340576171875,
-0.0197906494140625,
0.04315185546875,
0.0264739990234375,
0.02081298828125,
0.02276611328125,
-0.0562744140625,
-0.052398681640625,
-0.0206146240234375,
0.005855560302734375,
0.020538330078125,
-0.0411376953125,
0.024932861328125,
0.013519287109375,
-0.058349609375,
-0.0197906494140625,
-0.0214996337890625,
0.0029811859130859375,
0.049560546875,
0.031341552734375,
-0.0247955322265625,
-0.033477783203125,
-0.04803466796875,
0.0108795166015625,
-0.005352020263671875,
-0.01319122314453125,
0.007411956787109375,
0.04803466796875,
-0.0517578125,
0.057220458984375,
-0.050689697265625,
-0.0193634033203125,
0.004802703857421875,
0.00867462158203125,
0.0341796875,
0.056915283203125,
0.057342529296875,
-0.053466796875,
-0.045318603515625,
0.0008130073547363281,
-0.054290771484375,
-0.004520416259765625,
0.0147705078125,
-0.035430908203125,
0.0121307373046875,
0.01386260986328125,
-0.0755615234375,
0.0247344970703125,
0.06024169921875,
-0.065673828125,
0.044647216796875,
-0.0281524658203125,
0.00508880615234375,
-0.07537841796875,
0.0283966064453125,
0.01824951171875,
-0.0294036865234375,
-0.043243408203125,
0.02166748046875,
-0.00206756591796875,
-0.007080078125,
-0.06719970703125,
0.06683349609375,
-0.021728515625,
0.0284576416015625,
-0.0095672607421875,
0.00975799560546875,
0.00980377197265625,
0.029296875,
0.007091522216796875,
0.03472900390625,
0.048126220703125,
-0.048004150390625,
0.0182037353515625,
0.034637451171875,
-0.01776123046875,
0.046844482421875,
-0.0712890625,
-0.005290985107421875,
-0.0305938720703125,
-0.0006189346313476562,
-0.07769775390625,
-0.01351165771484375,
0.044158935546875,
-0.02288818359375,
0.0145263671875,
-0.0174407958984375,
-0.034454345703125,
-0.02459716796875,
-0.00395965576171875,
0.010498046875,
0.0675048828125,
-0.0343017578125,
0.036102294921875,
0.02685546875,
0.010650634765625,
-0.0224151611328125,
-0.057830810546875,
-0.030303955078125,
-0.04498291015625,
-0.06744384765625,
0.05120849609375,
-0.03582763671875,
-0.0196075439453125,
-0.0107421875,
0.015106201171875,
-0.017242431640625,
0.0037555694580078125,
0.01776123046875,
0.0190582275390625,
0.0163116455078125,
-0.033203125,
0.0135650634765625,
-0.01666259765625,
-0.0005183219909667969,
-0.00484466552734375,
0.034637451171875,
-0.004230499267578125,
-0.00394439697265625,
-0.0550537109375,
0.013671875,
0.06549072265625,
0.003963470458984375,
0.07391357421875,
0.07257080078125,
-0.022216796875,
0.004261016845703125,
-0.01419830322265625,
-0.0277862548828125,
-0.040496826171875,
0.0160980224609375,
0.00020265579223632812,
-0.048919677734375,
0.047698974609375,
0.00940704345703125,
0.035247802734375,
0.054595947265625,
0.055938720703125,
-0.0218353271484375,
0.0836181640625,
0.048858642578125,
0.03350830078125,
0.055450439453125,
-0.060821533203125,
-0.01152801513671875,
-0.068359375,
-0.025634765625,
-0.0281982421875,
-0.024932861328125,
-0.0180206298828125,
-0.034637451171875,
0.03607177734375,
0.0299530029296875,
-0.064453125,
0.0189208984375,
-0.037811279296875,
0.03228759765625,
0.01267242431640625,
0.018402099609375,
0.021148681640625,
0.003612518310546875,
-0.01263427734375,
-0.0096435546875,
-0.046142578125,
-0.0209808349609375,
0.04974365234375,
0.0394287109375,
0.053375244140625,
0.028106689453125,
0.0430908203125,
0.016204833984375,
0.0286407470703125,
-0.0157623291015625,
0.0323486328125,
-0.004955291748046875,
-0.065673828125,
-0.003429412841796875,
-0.0340576171875,
-0.0645751953125,
0.0279693603515625,
-0.017578125,
-0.031280517578125,
0.018768310546875,
0.02056884765625,
-0.020751953125,
0.0197601318359375,
-0.060516357421875,
0.0794677734375,
0.0024566650390625,
-0.047821044921875,
-0.007091522216796875,
-0.047454833984375,
0.03570556640625,
0.0246124267578125,
-0.005092620849609375,
-0.0128173828125,
0.0035686492919921875,
0.060882568359375,
-0.03533935546875,
0.068359375,
-0.04150390625,
-0.0016679763793945312,
0.007061004638671875,
-0.00043463706970214844,
0.037353515625,
0.0100250244140625,
-0.0172576904296875,
0.0310516357421875,
-0.0001010894775390625,
-0.03875732421875,
-0.022491455078125,
0.050384521484375,
-0.06640625,
-0.0279693603515625,
-0.0310211181640625,
-0.020904541015625,
0.0097808837890625,
0.03155517578125,
0.043792724609375,
-0.003894805908203125,
-0.013031005859375,
-0.0107269287109375,
0.0419921875,
-0.0014371871948242188,
0.043121337890625,
0.03082275390625,
-0.02862548828125,
-0.04461669921875,
0.0478515625,
-0.006427764892578125,
0.04547119140625,
-0.0111541748046875,
0.0310821533203125,
-0.039520263671875,
-0.046142578125,
-0.046295166015625,
0.05181884765625,
-0.048065185546875,
-0.01305389404296875,
-0.050201416015625,
-0.005306243896484375,
-0.01497650146484375,
-0.0225982666015625,
-0.031219482421875,
-0.03424072265625,
-0.06683349609375,
-0.0016384124755859375,
0.052001953125,
0.040435791015625,
-0.01117706298828125,
0.03936767578125,
-0.038726806640625,
0.019622802734375,
0.01134490966796875,
0.0379638671875,
0.00972747802734375,
-0.046051025390625,
-0.02752685546875,
0.01788330078125,
-0.042755126953125,
-0.050018310546875,
0.04791259765625,
-0.007282257080078125,
0.02679443359375,
0.040069580078125,
-0.0008234977722167969,
0.06378173828125,
-0.04278564453125,
0.076171875,
0.04437255859375,
-0.049163818359375,
0.029144287109375,
-0.047210693359375,
0.035675048828125,
0.031982421875,
0.041961669921875,
-0.02972412109375,
-0.041656494140625,
-0.06903076171875,
-0.05487060546875,
0.035491943359375,
0.0219879150390625,
0.00595855712890625,
-0.0006561279296875,
0.0303955078125,
-0.004276275634765625,
0.0165557861328125,
-0.0692138671875,
-0.0341796875,
-0.030426025390625,
0.01047515869140625,
0.0103302001953125,
0.00489044189453125,
-0.0222320556640625,
-0.02044677734375,
0.0667724609375,
0.006351470947265625,
0.044769287109375,
0.0153350830078125,
0.0251617431640625,
-0.03326416015625,
-0.0200653076171875,
0.04351806640625,
0.048370361328125,
-0.038726806640625,
-0.0213623046875,
-0.0173187255859375,
-0.03240966796875,
-0.0111083984375,
0.0044403076171875,
-0.01340484619140625,
0.00870513916015625,
-0.00485992431640625,
0.043243408203125,
-0.0168609619140625,
-0.032745361328125,
0.037841796875,
-0.01483154296875,
-0.0246124267578125,
-0.0197601318359375,
0.012847900390625,
0.021697998046875,
0.043060302734375,
-0.0002415180206298828,
0.036468505859375,
0.004634857177734375,
-0.0142059326171875,
0.004802703857421875,
0.053253173828125,
-0.0275726318359375,
-0.0281219482421875,
0.10235595703125,
0.017303466796875,
-0.0196533203125,
0.0411376953125,
-0.00545501708984375,
-0.019561767578125,
0.03607177734375,
0.051483154296875,
0.0716552734375,
-0.01030731201171875,
0.028106689453125,
0.0318603515625,
-0.0033664703369140625,
-0.01904296875,
0.0281982421875,
0.01129913330078125,
-0.050567626953125,
-0.00969696044921875,
-0.042633056640625,
-0.029876708984375,
-0.0247650146484375,
-0.03656005859375,
0.040740966796875,
-0.048980712890625,
-0.0241241455078125,
-0.026458740234375,
-0.0024890899658203125,
-0.045623779296875,
0.01540374755859375,
-0.0115966796875,
0.07623291015625,
-0.066650390625,
0.0518798828125,
0.025848388671875,
-0.037353515625,
-0.020233154296875,
-0.00737762451171875,
-0.0139312744140625,
-0.0438232421875,
0.035308837890625,
-0.00897216796875,
-0.00597381591796875,
0.00872039794921875,
-0.040496826171875,
-0.059600830078125,
0.09674072265625,
0.04083251953125,
-0.0283355712890625,
-0.00902557373046875,
-0.0226287841796875,
0.042083740234375,
-0.01047515869140625,
0.034881591796875,
0.0288238525390625,
0.041595458984375,
0.034027099609375,
-0.03936767578125,
-0.011260986328125,
-0.008514404296875,
-0.002292633056640625,
0.00797271728515625,
-0.07305908203125,
0.0831298828125,
-0.020477294921875,
-0.025909423828125,
0.028228759765625,
0.053558349609375,
0.038848876953125,
0.01061248779296875,
0.02374267578125,
0.0692138671875,
0.05596923828125,
-0.00762939453125,
0.09173583984375,
-0.005077362060546875,
0.05377197265625,
0.04290771484375,
-0.00440216064453125,
0.051544189453125,
0.033050537109375,
-0.02056884765625,
0.0570068359375,
0.041259765625,
0.00389862060546875,
0.06134033203125,
-0.0041351318359375,
-0.0189208984375,
0.00414276123046875,
0.0081024169921875,
-0.051605224609375,
-0.0150909423828125,
0.0022983551025390625,
-0.03118896484375,
-0.01413726806640625,
-0.003284454345703125,
0.01280975341796875,
-0.0237274169921875,
-0.01538848876953125,
0.02947998046875,
-0.00791168212890625,
-0.01229095458984375,
0.0699462890625,
0.002613067626953125,
0.06573486328125,
-0.056884765625,
-0.00968170166015625,
-0.0179443359375,
0.01348114013671875,
-0.0287322998046875,
-0.05462646484375,
0.03973388671875,
-0.00988006591796875,
-0.0186767578125,
-0.032440185546875,
0.005245208740234375,
-0.0304107666015625,
-0.04443359375,
0.021575927734375,
0.01165771484375,
0.024322509765625,
0.0174560546875,
-0.07171630859375,
0.025787353515625,
0.01177978515625,
-0.01275634765625,
0.0135498046875,
0.01180267333984375,
0.04229736328125,
0.04058837890625,
0.033447265625,
0.01053619384765625,
0.00598907470703125,
-0.0102081298828125,
0.0557861328125,
-0.048309326171875,
-0.044647216796875,
-0.058807373046875,
0.082763671875,
-0.00823974609375,
-0.038726806640625,
0.07012939453125,
0.04022216796875,
0.062042236328125,
-0.0241241455078125,
0.04376220703125,
-0.00811004638671875,
0.0413818359375,
-0.03753662109375,
0.07965087890625,
-0.06768798828125,
0.0142669677734375,
-0.047607421875,
-0.07135009765625,
-0.03448486328125,
0.07232666015625,
-0.002307891845703125,
0.0199432373046875,
0.032470703125,
0.062469482421875,
-0.01690673828125,
0.0025463104248046875,
0.018218994140625,
0.020477294921875,
0.035858154296875,
0.02508544921875,
0.037811279296875,
-0.047698974609375,
0.01428985595703125,
-0.024688720703125,
-0.0153045654296875,
-0.01079559326171875,
-0.07061767578125,
-0.06597900390625,
-0.03936767578125,
-0.04510498046875,
-0.052886962890625,
-0.01103973388671875,
0.034637451171875,
0.07745361328125,
-0.0386962890625,
-0.0191802978515625,
-0.030303955078125,
0.0184783935546875,
-0.0090484619140625,
-0.0224151611328125,
0.016448974609375,
0.027801513671875,
-0.08056640625,
0.01241302490234375,
-0.0007419586181640625,
0.053955078125,
-0.038726806640625,
-0.0237274169921875,
-0.0195465087890625,
-0.031646728515625,
0.03009033203125,
0.0216217041015625,
-0.038604736328125,
-0.00890350341796875,
-0.0220489501953125,
0.016754150390625,
0.0132598876953125,
0.0261688232421875,
-0.05816650390625,
0.0292205810546875,
0.038970947265625,
0.0160980224609375,
0.068359375,
-0.0031280517578125,
0.0279541015625,
-0.038970947265625,
0.008575439453125,
0.0247344970703125,
0.0299072265625,
0.012481689453125,
-0.03228759765625,
0.03582763671875,
0.033843994140625,
-0.05572509765625,
-0.0537109375,
0.01611328125,
-0.08892822265625,
-0.0240020751953125,
0.07574462890625,
0.00001913309097290039,
-0.016876220703125,
-0.005207061767578125,
-0.018035888671875,
0.00592041015625,
-0.0340576171875,
0.0447998046875,
0.0225982666015625,
-0.025970458984375,
-0.031280517578125,
-0.027740478515625,
0.03448486328125,
0.0164337158203125,
-0.042388916015625,
0.01259613037109375,
0.04833984375,
0.040985107421875,
0.032073974609375,
0.04248046875,
-0.0246124267578125,
0.0182647705078125,
-0.01091766357421875,
0.00466156005859375,
-0.00003123283386230469,
-0.0166015625,
-0.042755126953125,
0.02032470703125,
-0.01666259765625,
-0.01148223876953125
]
] |
xlnet-large-cased | 2023-01-24T14:50:34.000Z | [
"transformers",
"pytorch",
"tf",
"xlnet",
"text-generation",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1906.08237",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | null | null | null | xlnet-large-cased | 19 | 63,530 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: mit
datasets:
- bookcorpus
- wikipedia
---
# XLNet (large-sized model)
XLNet is a model pre-trained on the English language. It was introduced in the paper [XLNet: Generalized Autoregressive Pretraining for Language Understanding](https://arxiv.org/abs/1906.08237) by Yang et al. and first released in [this repository](https://github.com/zihangdai/xlnet/).
Disclaimer: The team releasing XLNet did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
XLNet is a new unsupervised language representation learning method based on a novel generalized permutation language modeling objective. Additionally, XLNet employs Transformer-XL as the backbone model, exhibiting excellent performance for language tasks involving long context. Overall, XLNet achieves state-of-the-art (SOTA) results on various downstream language tasks including question answering, natural language inference, sentiment analysis, and document ranking.
## Intended uses & limitations
The model is mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlnet) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
## Usage
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import XLNetTokenizer, XLNetModel
tokenizer = XLNetTokenizer.from_pretrained('xlnet-large-cased')
model = XLNetModel.from_pretrained('xlnet-large-cased')
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```
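As noted above, the model is mostly intended as a starting point for fine-tuning. A minimal sketch of loading it with a sequence-classification head is shown below; the label count and example sentence are placeholders, and the newly added head is randomly initialized, so the predictions are only meaningful after fine-tuning on your own data.

```python
import torch
from transformers import XLNetTokenizer, XLNetForSequenceClassification

tokenizer = XLNetTokenizer.from_pretrained('xlnet-large-cased')
# num_labels=2 is an illustrative choice for a binary task;
# the classification head is freshly initialized and must be fine-tuned.
model = XLNetForSequenceClassification.from_pretrained(
    'xlnet-large-cased', num_labels=2
)

inputs = tokenizer("This movie was great!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 2)
predicted_class = logits.argmax(dim=-1).item()
```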
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1906-08237,
author = {Zhilin Yang and
Zihang Dai and
Yiming Yang and
Jaime G. Carbonell and
Ruslan Salakhutdinov and
Quoc V. Le},
title = {XLNet: Generalized Autoregressive Pretraining for Language Understanding},
journal = {CoRR},
volume = {abs/1906.08237},
year = {2019},
url = {http://arxiv.org/abs/1906.08237},
eprinttype = {arXiv},
eprint = {1906.08237},
timestamp = {Mon, 24 Jun 2019 17:28:45 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1906-08237.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 2,699 | [
[
-0.0310211181640625,
-0.054779052734375,
0.0222930908203125,
0.006011962890625,
-0.010589599609375,
-0.0121917724609375,
-0.0230255126953125,
-0.03448486328125,
0.0243682861328125,
0.0286865234375,
-0.0279693603515625,
-0.02764892578125,
-0.04656982421875,
0.006748199462890625,
-0.037109375,
0.08154296875,
-0.00931549072265625,
-0.01378631591796875,
0.007648468017578125,
-0.0161590576171875,
-0.00351715087890625,
-0.06500244140625,
-0.06988525390625,
-0.0311126708984375,
0.048858642578125,
0.0018739700317382812,
0.036163330078125,
0.047119140625,
0.0148773193359375,
0.034393310546875,
-0.023651123046875,
-0.0012998580932617188,
-0.0278778076171875,
-0.0149383544921875,
0.0028400421142578125,
-0.031585693359375,
-0.047607421875,
0.00930023193359375,
0.0557861328125,
0.061065673828125,
0.0020542144775390625,
0.01496124267578125,
0.014984130859375,
0.03448486328125,
-0.034942626953125,
0.0119781494140625,
-0.0284576416015625,
0.01654052734375,
-0.01293182373046875,
0.0092620849609375,
-0.02252197265625,
-0.00365447998046875,
0.0166473388671875,
-0.030609130859375,
0.0120697021484375,
0.01739501953125,
0.08880615234375,
-0.01010894775390625,
-0.031524658203125,
-0.004009246826171875,
-0.03350830078125,
0.060089111328125,
-0.0577392578125,
0.0308380126953125,
0.019500732421875,
0.006275177001953125,
0.01190948486328125,
-0.08355712890625,
-0.05340576171875,
-0.0275726318359375,
-0.021270751953125,
0.0151824951171875,
-0.0292816162109375,
0.006298065185546875,
0.0283966064453125,
0.032257080078125,
-0.061920166015625,
0.01459503173828125,
-0.022857666015625,
-0.011444091796875,
0.04522705078125,
-0.0017147064208984375,
0.0191192626953125,
-0.0265350341796875,
-0.01751708984375,
-0.022003173828125,
-0.032135009765625,
0.0188446044921875,
0.038848876953125,
0.0225982666015625,
-0.020965576171875,
0.035675048828125,
-0.01032257080078125,
0.04986572265625,
0.01068115234375,
0.0193328857421875,
0.04296875,
-0.02301025390625,
-0.02655029296875,
-0.0002295970916748047,
0.094970703125,
-0.0036258697509765625,
0.014862060546875,
-0.005889892578125,
-0.0198822021484375,
-0.0188140869140625,
0.0121917724609375,
-0.0667724609375,
-0.01363372802734375,
0.021759033203125,
-0.04119873046875,
-0.0209197998046875,
0.0081329345703125,
-0.0300445556640625,
0.0018091201782226562,
-0.032928466796875,
0.041778564453125,
-0.0273895263671875,
-0.039276123046875,
-0.007709503173828125,
0.01013946533203125,
0.00746917724609375,
-0.0008335113525390625,
-0.0528564453125,
0.01505279541015625,
0.041107177734375,
0.07366943359375,
-0.0065460205078125,
-0.0296630859375,
-0.0270843505859375,
-0.03424072265625,
-0.0290679931640625,
0.043121337890625,
-0.022308349609375,
0.00949859619140625,
-0.00023627281188964844,
0.02203369140625,
-0.0063018798828125,
-0.02716064453125,
0.02044677734375,
-0.037628173828125,
0.019256591796875,
0.0010738372802734375,
-0.032684326171875,
-0.0180816650390625,
0.01181793212890625,
-0.0533447265625,
0.07421875,
0.00754547119140625,
-0.072021484375,
0.00714111328125,
-0.05401611328125,
-0.0127105712890625,
-0.009765625,
-0.004901885986328125,
-0.0477294921875,
-0.004978179931640625,
0.0034198760986328125,
0.04693603515625,
-0.007129669189453125,
0.0230712890625,
-0.0193328857421875,
-0.008056640625,
0.0172119140625,
-0.0117034912109375,
0.083984375,
0.02886962890625,
-0.03436279296875,
0.01464080810546875,
-0.05389404296875,
0.01131439208984375,
0.011932373046875,
-0.0153961181640625,
-0.0202178955078125,
-0.019012451171875,
0.021759033203125,
0.0204315185546875,
0.019805908203125,
-0.038177490234375,
0.00792694091796875,
-0.04248046875,
0.047210693359375,
0.039215087890625,
-0.030120849609375,
0.0310821533203125,
-0.00543975830078125,
0.031463623046875,
0.0175628662109375,
0.00534820556640625,
-0.0179595947265625,
-0.0197601318359375,
-0.062469482421875,
0.0030689239501953125,
0.046966552734375,
0.040924072265625,
-0.0477294921875,
0.050140380859375,
-0.017333984375,
-0.031768798828125,
-0.0248565673828125,
0.00807952880859375,
0.0477294921875,
0.024139404296875,
0.029815673828125,
-0.0193634033203125,
-0.0538330078125,
-0.0657958984375,
-0.00244140625,
-0.0031757354736328125,
0.0037250518798828125,
0.01947021484375,
0.047637939453125,
-0.02386474609375,
0.06292724609375,
-0.0305328369140625,
-0.0153961181640625,
-0.050140380859375,
0.03338623046875,
0.030914306640625,
0.035675048828125,
0.045440673828125,
-0.054656982421875,
-0.050689697265625,
0.0111846923828125,
-0.051422119140625,
-0.004283905029296875,
0.0061187744140625,
-0.0084381103515625,
0.046051025390625,
0.049652099609375,
-0.0357666015625,
0.0325927734375,
0.058349609375,
-0.03485107421875,
0.045379638671875,
-0.0104217529296875,
-0.0118408203125,
-0.1058349609375,
0.025482177734375,
0.004795074462890625,
-0.032012939453125,
-0.043609619140625,
0.000579833984375,
0.0082244873046875,
-0.00882720947265625,
-0.017730712890625,
0.052642822265625,
-0.05224609375,
0.00630950927734375,
-0.0225372314453125,
0.016815185546875,
-0.003299713134765625,
0.04315185546875,
0.01403045654296875,
0.04681396484375,
0.043609619140625,
-0.035247802734375,
0.038299560546875,
0.01898193359375,
-0.02142333984375,
0.02264404296875,
-0.0675048828125,
0.01511383056640625,
-0.007659912109375,
0.01654052734375,
-0.05908203125,
0.001384735107421875,
0.006259918212890625,
-0.04022216796875,
0.041717529296875,
-0.010955810546875,
-0.0305938720703125,
-0.044647216796875,
-0.001495361328125,
0.021087646484375,
0.036163330078125,
-0.0345458984375,
0.04736328125,
0.018035888671875,
-0.0133209228515625,
-0.06390380859375,
-0.06036376953125,
0.0128631591796875,
0.0058746337890625,
-0.0443115234375,
0.030059814453125,
-0.00885772705078125,
-0.0015783309936523438,
0.00910186767578125,
0.0023517608642578125,
-0.01418304443359375,
-0.01555633544921875,
0.0055999755859375,
0.007511138916015625,
-0.03314208984375,
0.00655364990234375,
-0.0150299072265625,
-0.017852783203125,
0.0008864402770996094,
-0.032989501953125,
0.050506591796875,
-0.0140380859375,
-0.00013625621795654297,
-0.03338623046875,
0.038238525390625,
0.0211029052734375,
-0.016510009765625,
0.06439208984375,
0.0584716796875,
-0.021575927734375,
-0.014373779296875,
-0.044586181640625,
-0.0192108154296875,
-0.03509521484375,
0.06182861328125,
-0.021026611328125,
-0.07476806640625,
0.039154052734375,
0.0133819580078125,
0.0012731552124023438,
0.039276123046875,
0.0299835205078125,
0.00400543212890625,
0.07513427734375,
0.06072998046875,
-0.01311492919921875,
0.044097900390625,
-0.04510498046875,
0.0250701904296875,
-0.07440185546875,
0.003017425537109375,
-0.03900146484375,
-0.016357421875,
-0.05084228515625,
-0.006046295166015625,
0.005252838134765625,
0.004901885986328125,
-0.036468505859375,
0.046234130859375,
-0.036224365234375,
0.007099151611328125,
0.05291748046875,
0.00783538818359375,
-0.0009775161743164062,
-0.00037288665771484375,
-0.035675048828125,
0.01561737060546875,
-0.04974365234375,
-0.0199737548828125,
0.084716796875,
0.02447509765625,
0.0389404296875,
-0.004062652587890625,
0.041839599609375,
-0.0024662017822265625,
0.004703521728515625,
-0.052154541015625,
0.035125732421875,
-0.0203094482421875,
-0.050323486328125,
-0.02935791015625,
-0.057037353515625,
-0.09326171875,
0.00308990478515625,
-0.0202484130859375,
-0.0679931640625,
0.006969451904296875,
0.0177154541015625,
-0.0279541015625,
0.03424072265625,
-0.0577392578125,
0.0672607421875,
-0.0240478515625,
-0.029083251953125,
0.00424957275390625,
-0.045379638671875,
0.00756072998046875,
-0.0074310302734375,
-0.00499725341796875,
0.0252838134765625,
0.0177154541015625,
0.062042236328125,
-0.046417236328125,
0.0675048828125,
-0.007793426513671875,
0.00833892822265625,
0.01535797119140625,
-0.01279449462890625,
0.04315185546875,
-0.00823974609375,
-0.0004286766052246094,
0.03497314453125,
0.0006070137023925781,
-0.020843505859375,
-0.036407470703125,
0.042938232421875,
-0.0889892578125,
-0.035919189453125,
-0.038116455078125,
-0.03460693359375,
-0.0019893646240234375,
0.0294952392578125,
0.039276123046875,
0.052459716796875,
-0.0043487548828125,
0.024749755859375,
0.051116943359375,
-0.038238525390625,
0.047210693359375,
0.026824951171875,
-0.0263671875,
-0.033172607421875,
0.046417236328125,
0.031890869140625,
0.0117034912109375,
0.04840087890625,
0.017120361328125,
-0.0292510986328125,
-0.041229248046875,
-0.01010894775390625,
0.0249481201171875,
-0.038299560546875,
-0.017822265625,
-0.0751953125,
-0.054443359375,
-0.048095703125,
0.00771331787109375,
-0.036712646484375,
-0.021728515625,
-0.0191802978515625,
-0.0059356689453125,
0.0277862548828125,
0.0574951171875,
-0.0162506103515625,
0.033935546875,
-0.040740966796875,
0.014068603515625,
0.032958984375,
0.02642822265625,
0.00591278076171875,
-0.054718017578125,
-0.0286865234375,
0.0023555755615234375,
-0.025604248046875,
-0.042938232421875,
0.040618896484375,
0.0084991455078125,
0.046417236328125,
0.036468505859375,
-0.0007719993591308594,
0.03857421875,
-0.037109375,
0.04827880859375,
0.0335693359375,
-0.0694580078125,
0.03564453125,
-0.012237548828125,
0.0256500244140625,
0.01317596435546875,
0.039764404296875,
-0.05206298828125,
-0.01287078857421875,
-0.050567626953125,
-0.0831298828125,
0.0692138671875,
0.01447296142578125,
0.0280609130859375,
0.0090789794921875,
0.03253173828125,
-0.0110015869140625,
0.01458740234375,
-0.08349609375,
-0.0438232421875,
-0.03570556640625,
-0.0222930908203125,
-0.0169525146484375,
-0.037689208984375,
0.01012420654296875,
-0.026275634765625,
0.0533447265625,
-0.008514404296875,
0.0540771484375,
0.0232696533203125,
-0.0158233642578125,
0.01312255859375,
0.0081787109375,
0.038787841796875,
0.0491943359375,
-0.01378631591796875,
0.0062408447265625,
0.007778167724609375,
-0.050384521484375,
-0.00507354736328125,
0.03271484375,
-0.0074310302734375,
0.0001493692398071289,
0.03466796875,
0.08587646484375,
-0.0025501251220703125,
-0.01082611083984375,
0.0494384765625,
-0.0005288124084472656,
-0.035430908203125,
-0.047149658203125,
0.0006299018859863281,
0.009613037109375,
0.0219573974609375,
0.029754638671875,
-0.00518035888671875,
0.001506805419921875,
-0.033538818359375,
0.00992584228515625,
0.0265655517578125,
-0.04168701171875,
-0.031585693359375,
0.044891357421875,
0.0020694732666015625,
-0.01123046875,
0.049835205078125,
-0.02099609375,
-0.04888916015625,
0.045654296875,
0.043914794921875,
0.07391357421875,
-0.0142822265625,
0.0080108642578125,
0.05133056640625,
0.024871826171875,
-0.005092620849609375,
0.007678985595703125,
0.004730224609375,
-0.06634521484375,
-0.046783447265625,
-0.05352783203125,
-0.007198333740234375,
0.0382080078125,
-0.036224365234375,
0.013580322265625,
-0.0234375,
-0.0267181396484375,
0.002956390380859375,
0.02471923828125,
-0.049713134765625,
0.02215576171875,
0.0164947509765625,
0.07513427734375,
-0.051361083984375,
0.06280517578125,
0.06768798828125,
-0.0367431640625,
-0.08880615234375,
-0.0151519775390625,
-0.0133056640625,
-0.054290771484375,
0.07025146484375,
0.01526641845703125,
0.01213836669921875,
0.009124755859375,
-0.049713134765625,
-0.0726318359375,
0.078857421875,
0.019989013671875,
-0.058990478515625,
-0.004589080810546875,
0.017578125,
0.029388427734375,
-0.0221099853515625,
0.0545654296875,
0.0159149169921875,
0.036163330078125,
0.0011701583862304688,
-0.07623291015625,
0.00888824462890625,
-0.04058837890625,
0.01334381103515625,
0.00359344482421875,
-0.058868408203125,
0.0870361328125,
-0.00006663799285888672,
0.005054473876953125,
0.007030487060546875,
0.055572509765625,
0.0022296905517578125,
0.004764556884765625,
0.040435791015625,
0.0537109375,
0.050079345703125,
-0.0189056396484375,
0.0760498046875,
-0.01471710205078125,
0.040252685546875,
0.0670166015625,
0.002689361572265625,
0.059326171875,
0.01393890380859375,
-0.0131988525390625,
0.044281005859375,
0.03790283203125,
0.0015420913696289062,
0.02288818359375,
0.0169677734375,
0.00916290283203125,
-0.01812744140625,
0.020111083984375,
-0.030914306640625,
0.040771484375,
0.007495880126953125,
-0.04949951171875,
-0.00677490234375,
0.003917694091796875,
0.037445068359375,
-0.0157012939453125,
-0.02691650390625,
0.0489501953125,
0.0181732177734375,
-0.046630859375,
0.052490234375,
-0.0061798095703125,
0.05963134765625,
-0.058349609375,
0.009246826171875,
-0.02911376953125,
0.0219573974609375,
-0.010894775390625,
-0.04779052734375,
0.01922607421875,
-0.006252288818359375,
-0.019989013671875,
-0.01465606689453125,
0.031280517578125,
-0.02520751953125,
-0.050323486328125,
0.035430908203125,
0.023651123046875,
0.01039886474609375,
-0.0139007568359375,
-0.0703125,
0.00428009033203125,
-0.0018749237060546875,
-0.05389404296875,
0.025787353515625,
0.048614501953125,
0.004741668701171875,
0.040802001953125,
0.040008544921875,
0.017242431640625,
-0.00757598876953125,
-0.002780914306640625,
0.0594482421875,
-0.0570068359375,
-0.044219970703125,
-0.05181884765625,
0.038726806640625,
-0.019195556640625,
-0.035430908203125,
0.04351806640625,
0.03424072265625,
0.061187744140625,
0.0107574462890625,
0.0711669921875,
-0.0231170654296875,
0.04071044921875,
-0.03961181640625,
0.08013916015625,
-0.0582275390625,
-0.00980377197265625,
-0.02374267578125,
-0.0675048828125,
-0.0255126953125,
0.062744140625,
-0.0287017822265625,
0.03826904296875,
0.07330322265625,
0.06158447265625,
-0.0161285400390625,
-0.017120361328125,
0.03424072265625,
0.055450439453125,
0.00794219970703125,
0.033203125,
0.044891357421875,
-0.040252685546875,
0.053497314453125,
-0.0228424072265625,
-0.0244293212890625,
-0.0146636962890625,
-0.07568359375,
-0.084716796875,
-0.063232421875,
-0.0309600830078125,
-0.034332275390625,
0.007556915283203125,
0.07672119140625,
0.07757568359375,
-0.07391357421875,
-0.019500732421875,
-0.01541900634765625,
-0.01055908203125,
-0.0180816650390625,
-0.0201263427734375,
0.05206298828125,
-0.03759765625,
-0.05462646484375,
-0.01042938232421875,
0.0026702880859375,
-0.0002837181091308594,
-0.01812744140625,
0.00531768798828125,
-0.01453399658203125,
-0.00347900390625,
0.049530029296875,
0.026641845703125,
-0.031524658203125,
-0.0152130126953125,
0.0105438232421875,
-0.0135040283203125,
0.00409698486328125,
0.04022216796875,
-0.048431396484375,
0.0124359130859375,
0.0321044921875,
0.03704833984375,
0.0460205078125,
-0.0236053466796875,
0.039947509765625,
-0.049407958984375,
0.0195770263671875,
-0.00518798828125,
0.042816162109375,
0.024383544921875,
-0.0255584716796875,
0.026123046875,
0.01146697998046875,
-0.054046630859375,
-0.045867919921875,
0.0048980712890625,
-0.07391357421875,
-0.0215911865234375,
0.0975341796875,
-0.0254974365234375,
-0.033355712890625,
-0.008636474609375,
-0.0102691650390625,
0.02789306640625,
-0.0218505859375,
0.04742431640625,
0.03466796875,
0.00771331787109375,
-0.037689208984375,
-0.044647216796875,
0.0226898193359375,
0.015655517578125,
-0.04864501953125,
-0.003910064697265625,
0.00942230224609375,
0.0220794677734375,
0.0247650146484375,
0.03448486328125,
0.0007014274597167969,
0.0006475448608398438,
-0.0087890625,
0.0296630859375,
-0.015838623046875,
-0.01163482666015625,
-0.005889892578125,
0.0004317760467529297,
-0.01082611083984375,
-0.0026149749755859375
]
] |
facebook/esmfold_v1 | 2023-03-22T17:39:28.000Z | [
"transformers",
"pytorch",
"esm",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | null | facebook | null | null | facebook/esmfold_v1 | 13 | 63,367 | transformers | 2022-11-01T18:24:14 | ---
license: mit
---
# ESMFold
ESMFold is a state-of-the-art end-to-end protein folding model based on an ESM-2 backbone. It does not require any database lookup or multiple sequence alignment (MSA) step, and therefore needs no external databases to make predictions. As a result, inference is significantly faster than AlphaFold2. For details on the model architecture and training, please refer to the [accompanying paper](https://www.science.org/doi/10.1126/science.ade2574).
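Because ESMFold takes a plain amino-acid string as input (no MSA or template search), a lightweight sanity check on the sequence is often the only preprocessing needed before folding. The helper below is a hypothetical sketch, not part of the `transformers` API: it strips whitespace, uppercases the sequence, and rejects characters outside the 20 standard amino acids.

```python
# Hypothetical pre-flight check for ESMFold input; not part of transformers.
STANDARD_AMINO_ACIDS = set("ACDEFGHIKLMNPQRSTVWY")

def clean_sequence(seq: str) -> str:
    """Normalize a raw amino-acid string and fail fast on invalid residues."""
    cleaned = "".join(seq.split()).upper()  # drop whitespace/newlines, uppercase
    invalid = set(cleaned) - STANDARD_AMINO_ACIDS
    if invalid:
        raise ValueError(f"Non-standard residues found: {sorted(invalid)}")
    return cleaned

# Example: a fragment with FASTA-style line breaks and lowercase letters.
print(clean_sequence("mkta yiak\nqrqisfvksh"))  # prints "MKTAYIAKQRQISFVKSH"
```

Running a check like this first avoids spending GPU time on a sequence the model cannot fold cleanly.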
If you're interested in using ESMFold in practice, please check out the associated [tutorial notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/protein_folding.ipynb). | 696 | [
[
-0.036712646484375,
-0.03839111328125,
0.022491455078125,
0.0037593841552734375,
-0.00930023193359375,
-0.00699615478515625,
0.04925537109375,
-0.0394287109375,
0.0343017578125,
0.038604736328125,
-0.07183837890625,
-0.041900634765625,
-0.05194091796875,
-0.003490447998046875,
-0.0330810546875,
0.0474853515625,
0.0093536376953125,
0.01549530029296875,
-0.042327880859375,
-0.0196533203125,
0.045623779296875,
-0.007904052734375,
-0.0272674560546875,
-0.06170654296875,
0.025787353515625,
0.0206756591796875,
0.0225982666015625,
0.06170654296875,
0.0269775390625,
0.020477294921875,
-0.048095703125,
0.0062255859375,
-0.033966064453125,
0.0209808349609375,
0.00934600830078125,
-0.043975830078125,
-0.055084228515625,
0.01322174072265625,
0.0042266845703125,
0.051055908203125,
-0.003902435302734375,
0.046722412109375,
0.001972198486328125,
0.0931396484375,
-0.03961181640625,
-0.01285552978515625,
-0.02447509765625,
0.01506805419921875,
-0.015777587890625,
-0.003887176513671875,
-0.021820068359375,
-0.0160980224609375,
0.0165557861328125,
-0.0364990234375,
0.0193939208984375,
0.0233612060546875,
0.06280517578125,
-0.001972198486328125,
-0.048919677734375,
-0.020782470703125,
0.00072479248046875,
0.0289306640625,
-0.033172607421875,
0.0362548828125,
0.036834716796875,
0.05157470703125,
-0.03363037109375,
-0.0645751953125,
-0.0197906494140625,
0.01305389404296875,
0.01910400390625,
0.01715087890625,
0.0272216796875,
0.0265045166015625,
0.037384033203125,
0.03155517578125,
-0.0787353515625,
0.005222320556640625,
-0.0501708984375,
0.01410675048828125,
0.035064697265625,
-0.01282501220703125,
0.0276641845703125,
-0.0123138427734375,
-0.061126708984375,
0.01332855224609375,
-0.07330322265625,
-0.005130767822265625,
0.036590576171875,
-0.0004391670227050781,
-0.020477294921875,
0.05767822265625,
-0.0160980224609375,
0.04510498046875,
0.00380706787109375,
0.017303466796875,
0.0288238525390625,
0.00971221923828125,
-0.03143310546875,
-0.0183563232421875,
0.00792694091796875,
0.054443359375,
-0.01776123046875,
-0.02276611328125,
-0.00832366943359375,
-0.01308441162109375,
-0.0018825531005859375,
-0.092529296875,
-0.0255279541015625,
0.07012939453125,
-0.053955078125,
0.003814697265625,
-0.004024505615234375,
-0.04473876953125,
-0.026458740234375,
-0.0175933837890625,
0.044097900390625,
-0.0335693359375,
0.0062408447265625,
0.026947021484375,
-0.043243408203125,
0.0633544921875,
0.007144927978515625,
-0.053955078125,
0.0604248046875,
0.0615234375,
0.10504150390625,
-0.0093536376953125,
-0.018310546875,
-0.03277587890625,
0.016143798828125,
-0.0094451904296875,
0.09417724609375,
-0.006305694580078125,
-0.0216217041015625,
-0.0083465576171875,
0.0265045166015625,
0.00739288330078125,
-0.003765106201171875,
0.04559326171875,
-0.0340576171875,
0.0229034423828125,
-0.03350830078125,
-0.06573486328125,
-0.052398681640625,
0.004016876220703125,
-0.045806884765625,
0.0556640625,
-0.00786590576171875,
-0.04058837890625,
0.013580322265625,
-0.056365966796875,
-0.03741455078125,
0.0007610321044921875,
0.0134429931640625,
-0.04913330078125,
0.00885009765625,
0.0053863525390625,
0.0149688720703125,
-0.01885986328125,
-0.0210418701171875,
-0.046417236328125,
-0.01549530029296875,
-0.01287078857421875,
0.02801513671875,
0.0625,
0.0149993896484375,
-0.003452301025390625,
-0.01546478271484375,
-0.062469482421875,
0.01465606689453125,
0.006317138671875,
-0.0171051025390625,
-0.009674072265625,
0.006256103515625,
0.005229949951171875,
0.025970458984375,
0.0086669921875,
-0.050506591796875,
0.048309326171875,
-0.032318115234375,
0.03594970703125,
0.0587158203125,
0.006725311279296875,
0.058074951171875,
-0.0474853515625,
0.0124359130859375,
-0.031524658203125,
0.031280517578125,
-0.038330078125,
-0.054046630859375,
-0.0631103515625,
-0.0170745849609375,
0.00821685791015625,
0.00904083251953125,
-0.0024776458740234375,
0.00815582275390625,
0.01910400390625,
-0.0438232421875,
-0.0179443359375,
-0.0177764892578125,
0.0303192138671875,
0.0298309326171875,
0.03961181640625,
-0.0109405517578125,
-0.043182373046875,
-0.0762939453125,
-0.0159149169921875,
0.0008301734924316406,
-0.0283203125,
-0.00772857666015625,
0.0673828125,
0.00131988525390625,
0.0291595458984375,
-0.0396728515625,
-0.0255889892578125,
-0.0027923583984375,
0.017333984375,
0.0186614990234375,
0.030853271484375,
0.026824951171875,
-0.059326171875,
-0.02386474609375,
-0.032989501953125,
-0.061859130859375,
-0.013153076171875,
0.01184844970703125,
-0.018310546875,
0.014373779296875,
0.035308837890625,
-0.048248291015625,
0.0164337158203125,
0.06695556640625,
-0.0305328369140625,
0.0196990966796875,
-0.026885986328125,
0.00817108154296875,
-0.087890625,
0.0092926025390625,
-0.0038204193115234375,
-0.0124053955078125,
-0.046112060546875,
0.018341064453125,
0.0129852294921875,
-0.03131103515625,
-0.041107177734375,
0.0311737060546875,
-0.07525634765625,
-0.03863525390625,
-0.006389617919921875,
-0.0181884765625,
0.0333251953125,
0.0377197265625,
-0.0155181884765625,
0.0357666015625,
0.042755126953125,
-0.0304107666015625,
-0.036346435546875,
0.03466796875,
-0.0027179718017578125,
0.017333984375,
-0.07342529296875,
0.047821044921875,
-0.027679443359375,
0.03143310546875,
-0.07666015625,
-0.01617431640625,
0.004009246826171875,
-0.032958984375,
0.03350830078125,
-0.046875,
-0.01001739501953125,
-0.0281219482421875,
-0.0662841796875,
-0.00875091552734375,
0.031036376953125,
-0.034576416015625,
0.0728759765625,
0.05194091796875,
0.00508880615234375,
-0.022216796875,
-0.06488037109375,
-0.01036834716796875,
0.0105438232421875,
-0.0487060546875,
0.046142578125,
-0.0291748046875,
0.00605010986328125,
-0.014007568359375,
-0.035308837890625,
0.00801849365234375,
-0.006458282470703125,
0.060516357421875,
-0.01093292236328125,
0.01010894775390625,
-0.00232696533203125,
0.0343017578125,
-0.0262908935546875,
-0.0322265625,
-0.04779052734375,
0.02691650390625,
-0.0199127197265625,
-0.037445068359375,
-0.03619384765625,
0.0638427734375,
0.08685302734375,
-0.0225982666015625,
0.057403564453125,
-0.01419830322265625,
-0.04736328125,
-0.031951904296875,
-0.021575927734375,
-0.025299072265625,
-0.035186767578125,
0.020751953125,
-0.0284576416015625,
-0.0298004150390625,
0.0574951171875,
-0.01513671875,
0.033111572265625,
0.006809234619140625,
0.0154876708984375,
-0.024810791015625,
0.080322265625,
0.056915283203125,
0.05596923828125,
0.0043182373046875,
-0.048980712890625,
0.0056610107421875,
-0.08026123046875,
-0.056732177734375,
-0.05572509765625,
-0.045654296875,
-0.0308990478515625,
-0.056427001953125,
0.01861572265625,
0.0171661376953125,
-0.0206298828125,
0.044830322265625,
-0.0256500244140625,
0.03778076171875,
0.0252685546875,
-0.0005025863647460938,
0.0218505859375,
0.0033321380615234375,
-0.017242431640625,
0.022369384765625,
-0.06170654296875,
-0.04150390625,
0.07904052734375,
0.03143310546875,
0.049530029296875,
0.00547027587890625,
0.038116455078125,
0.0220794677734375,
0.019683837890625,
-0.06439208984375,
0.04278564453125,
-0.007568359375,
-0.041656494140625,
-0.003688812255859375,
0.0054931640625,
-0.042236328125,
0.0261077880859375,
-0.0330810546875,
-0.045684814453125,
0.00507354736328125,
0.0105743408203125,
0.0005564689636230469,
0.0246429443359375,
-0.06378173828125,
0.0082855224609375,
-0.01300811767578125,
-0.00839996337890625,
0.0103912353515625,
-0.04937744140625,
0.005489349365234375,
0.01544952392578125,
0.0089874267578125,
-0.0175933837890625,
-0.0204925537109375,
0.0699462890625,
-0.046356201171875,
0.035430908203125,
-0.006412506103515625,
0.031036376953125,
-0.00963592529296875,
0.0256500244140625,
0.0433349609375,
-0.0160369873046875,
-0.0219268798828125,
0.0256500244140625,
0.00225830078125,
-0.0443115234375,
-0.046539306640625,
0.04559326171875,
-0.031402587890625,
-0.0203704833984375,
-0.0310211181640625,
-0.01477813720703125,
0.0033206939697265625,
0.019439697265625,
0.022216796875,
0.050811767578125,
0.005573272705078125,
0.00855255126953125,
0.04132080078125,
-0.005756378173828125,
0.0133209228515625,
0.07476806640625,
-0.02801513671875,
-0.0184783935546875,
0.025421142578125,
0.059661865234375,
0.021087646484375,
0.0430908203125,
0.0024127960205078125,
-0.036163330078125,
-0.060302734375,
-0.0306549072265625,
0.031890869140625,
-0.042877197265625,
-0.007328033447265625,
-0.08221435546875,
-0.0023136138916015625,
-0.0198822021484375,
-0.0286407470703125,
-0.06170654296875,
-0.06414794921875,
0.0285797119140625,
-0.03204345703125,
0.0180206298828125,
0.0374755859375,
0.0022125244140625,
0.005870819091796875,
-0.031494140625,
0.035552978515625,
0.0033054351806640625,
0.02764892578125,
-0.01800537109375,
-0.032196044921875,
-0.00991058349609375,
-0.026885986328125,
-0.0295562744140625,
-0.066650390625,
0.005794525146484375,
0.039825439453125,
0.042327880859375,
0.0304107666015625,
-0.008453369140625,
0.0233306884765625,
-0.052215576171875,
0.046539306640625,
0.0148773193359375,
-0.048980712890625,
0.0498046875,
-0.00562286376953125,
0.03924560546875,
0.028839111328125,
0.048187255859375,
-0.028594970703125,
-0.0160675048828125,
-0.02734375,
-0.09613037109375,
0.026519775390625,
0.01439666748046875,
-0.02374267578125,
0.02490234375,
0.0114593505859375,
0.0384521484375,
0.008056640625,
-0.0235595703125,
-0.0287017822265625,
0.01141357421875,
0.0240478515625,
0.016571044921875,
-0.0309295654296875,
-0.01154327392578125,
0.0030059814453125,
0.057342529296875,
-0.0298919677734375,
0.031341552734375,
0.0186767578125,
-0.01418304443359375,
-0.00867462158203125,
-0.0176544189453125,
0.0307159423828125,
0.052703857421875,
-0.047149658203125,
0.01558685302734375,
0.00618743896484375,
-0.0105438232421875,
-0.0117645263671875,
0.0146484375,
-0.037353515625,
-0.020172119140625,
0.032012939453125,
0.02655029296875,
0.01180267333984375,
-0.021392822265625,
0.0201873779296875,
-0.00994873046875,
-0.046966552734375,
0.0069580078125,
-0.0216827392578125,
0.025482177734375,
0.0271148681640625,
0.0350341796875,
0.0232391357421875,
0.039520263671875,
-0.0419921875,
0.0396728515625,
0.0047760009765625,
-0.027069091796875,
-0.037567138671875,
0.052093505859375,
0.02716064453125,
-0.043853759765625,
0.020904541015625,
0.01175689697265625,
-0.033843994140625,
0.0382080078125,
0.04559326171875,
0.06695556640625,
-0.028076171875,
0.033172607421875,
0.0253448486328125,
0.01788330078125,
-0.003925323486328125,
0.0298004150390625,
-0.0137786865234375,
-0.006351470947265625,
-0.005199432373046875,
-0.0760498046875,
-0.0304412841796875,
0.007198333740234375,
-0.03240966796875,
0.0311126708984375,
-0.0214996337890625,
0.0094451904296875,
0.005207061767578125,
0.00823211669921875,
-0.022216796875,
-0.00803375244140625,
0.006183624267578125,
0.126708984375,
-0.0743408203125,
0.030792236328125,
0.045196533203125,
-0.0211181640625,
-0.006755828857421875,
-0.01395416259765625,
0.045196533203125,
-0.037628173828125,
0.00717926025390625,
0.050445556640625,
-0.02825927734375,
-0.0033721923828125,
-0.03240966796875,
-0.050537109375,
0.09765625,
0.02764892578125,
-0.036834716796875,
-0.028594970703125,
-0.022308349609375,
0.028594970703125,
-0.0260162353515625,
0.032073974609375,
-0.004375457763671875,
0.01355743408203125,
0.016815185546875,
-0.0035457611083984375,
-0.009124755859375,
-0.036956787109375,
-0.0007758140563964844,
0.01006317138671875,
-0.06488037109375,
0.0531005859375,
-0.03424072265625,
-0.03668212890625,
0.0216217041015625,
0.05291748046875,
0.0170440673828125,
0.034576416015625,
0.0229949951171875,
0.04498291015625,
0.0738525390625,
0.001041412353515625,
0.0780029296875,
-0.03497314453125,
0.03839111328125,
0.0723876953125,
-0.0241546630859375,
0.04461669921875,
0.054443359375,
0.00078582763671875,
0.0217437744140625,
0.05584716796875,
0.006252288818359375,
0.03179931640625,
0.01415252685546875,
0.01190948486328125,
-0.009796142578125,
-0.0011749267578125,
-0.032958984375,
0.022125244140625,
0.00885009765625,
-0.019989013671875,
-0.0255889892578125,
-0.001194000244140625,
-0.00041031837463378906,
-0.0034198760986328125,
-0.0084381103515625,
0.042327880859375,
0.025543212890625,
-0.042022705078125,
0.034423828125,
-0.008575439453125,
0.0134124755859375,
-0.05718994140625,
-0.01198577880859375,
-0.0244598388671875,
-0.01255035400390625,
-0.01306915283203125,
-0.05950927734375,
0.0229949951171875,
-0.00525665283203125,
-0.0148773193359375,
-0.018585205078125,
0.04498291015625,
-0.0219268798828125,
-0.0202789306640625,
0.055206298828125,
0.03997802734375,
0.0159759521484375,
0.0186614990234375,
-0.0631103515625,
0.0006966590881347656,
-0.042877197265625,
-0.06646728515625,
0.0297698974609375,
-0.01666259765625,
0.0122528076171875,
0.08636474609375,
0.056640625,
-0.020751953125,
-0.0283966064453125,
-0.0007891654968261719,
0.0654296875,
-0.0288848876953125,
-0.038330078125,
-0.0267486572265625,
0.04107666015625,
-0.0241546630859375,
-0.01126861572265625,
0.038787841796875,
0.07916259765625,
0.04046630859375,
-0.00982666015625,
0.038543701171875,
0.0036945343017578125,
0.0185089111328125,
-0.0543212890625,
0.061309814453125,
-0.060272216796875,
-0.001506805419921875,
0.00963592529296875,
-0.053375244140625,
-0.0310211181640625,
0.01519775390625,
0.00676727294921875,
0.00042819976806640625,
0.0540771484375,
0.0869140625,
-0.01355743408203125,
0.0148468017578125,
0.022369384765625,
0.0259857177734375,
0.030609130859375,
0.025543212890625,
0.0134429931640625,
-0.052703857421875,
0.00958251953125,
0.00433349609375,
-0.0316162109375,
-0.033233642578125,
-0.06640625,
-0.07366943359375,
-0.0643310546875,
-0.005519866943359375,
-0.03851318359375,
0.04144287109375,
0.06378173828125,
0.09490966796875,
-0.04327392578125,
-0.007904052734375,
-0.012969970703125,
0.0017213821411132812,
-0.019256591796875,
-0.00275421142578125,
0.031890869140625,
0.006397247314453125,
-0.05108642578125,
0.03277587890625,
0.07586669921875,
0.0099945068359375,
-0.0022144317626953125,
-0.013580322265625,
-0.03289794921875,
0.040069580078125,
0.044586181640625,
0.0311126708984375,
-0.0169219970703125,
-0.006549835205078125,
0.01380157470703125,
0.0045623779296875,
0.007549285888671875,
0.0447998046875,
-0.022613525390625,
0.0445556640625,
0.07421875,
0.01331329345703125,
0.06768798828125,
-0.00543212890625,
0.06060791015625,
0.0030803680419921875,
-0.0022830963134765625,
-0.01096343994140625,
0.0283203125,
0.0265045166015625,
-0.0211334228515625,
0.0281219482421875,
0.01506805419921875,
-0.037567138671875,
-0.050689697265625,
0.0452880859375,
-0.08795166015625,
-0.0169219970703125,
0.056793212890625,
-0.01442718505859375,
-0.06427001953125,
0.0008959770202636719,
-0.006866455078125,
0.021942138671875,
-0.019439697265625,
0.0482177734375,
0.018463134765625,
-0.025299072265625,
-0.001567840576171875,
-0.034393310546875,
0.043487548828125,
0.007335662841796875,
-0.045806884765625,
0.000904083251953125,
-0.01153564453125,
0.024566650390625,
-0.0169677734375,
0.040130615234375,
-0.03472900390625,
0.0170745849609375,
0.012542724609375,
-0.0108642578125,
-0.054290771484375,
-0.042236328125,
-0.0265655517578125,
0.037109375,
-0.0182647705078125,
-0.00527191162109375
]
] |
dandelin/vilt-b32-finetuned-vqa | 2022-08-02T13:03:04.000Z | [
"transformers",
"pytorch",
"vilt",
"visual-question-answering",
"arxiv:2102.03334",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | visual-question-answering | dandelin | null | null | dandelin/vilt-b32-finetuned-vqa | 283 | 63,366 | transformers | 2022-03-02T23:29:05 | ---
tags:
- visual-question-answering
license: apache-2.0
widget:
- text: "What's the animal doing?"
src: "https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg"
- text: "What is on top of the building?"
src: "https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg"
---
# Vision-and-Language Transformer (ViLT), fine-tuned on VQAv2
Vision-and-Language Transformer (ViLT) model fine-tuned on [VQAv2](https://visualqa.org/). It was introduced in the paper [ViLT: Vision-and-Language Transformer
Without Convolution or Region Supervision](https://arxiv.org/abs/2102.03334) by Kim et al. and first released in [this repository](https://github.com/dandelin/ViLT).
Disclaimer: The team releasing ViLT did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Intended uses & limitations
You can use the raw model for visual question answering.
### How to use
Here is how to use this model in PyTorch:
```python
from transformers import ViltProcessor, ViltForQuestionAnswering
import requests
from PIL import Image
# prepare image + question
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
text = "How many cats are there?"
processor = ViltProcessor.from_pretrained("dandelin/vilt-b32-finetuned-vqa")
model = ViltForQuestionAnswering.from_pretrained("dandelin/vilt-b32-finetuned-vqa")
# prepare inputs
encoding = processor(image, text, return_tensors="pt")
# forward pass
outputs = model(**encoding)
logits = outputs.logits
idx = logits.argmax(-1).item()
print("Predicted answer:", model.config.id2label[idx])
```
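The last two lines of the snippet above take an `argmax` over the logits and map the winning index back to an answer string via `model.config.id2label`. As a self-contained illustration of that final step (using made-up logits and a toy three-answer `id2label`, whereas the real VQAv2 head covers a few thousand candidate answers), the sketch below also applies a softmax so the scores can be read as probabilities:

```python
import math

# Toy stand-ins for model.config.id2label and outputs.logits.
id2label = {0: "yes", 1: "no", 2: "2"}
logits = [1.2, -0.3, 3.1]  # one raw score per candidate answer

# Softmax turns raw scores into probabilities that sum to 1.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# argmax picks the highest-scoring answer, exactly as in the snippet above.
idx = max(range(len(logits)), key=lambda i: logits[i])
print("Predicted answer:", id2label[idx])
for i, p in sorted(enumerate(probs), key=lambda x: -x[1]):
    print(f"  {id2label[i]}: {p:.3f}")
```

Inspecting the full probability distribution rather than just the argmax is useful when several answers score closely.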
## Training data
(to do)
## Training procedure
### Preprocessing
(to do)
### Pretraining
(to do)
## Evaluation results
(to do)
### BibTeX entry and citation info
```bibtex
@misc{kim2021vilt,
title={ViLT: Vision-and-Language Transformer Without Convolution or Region Supervision},
author={Wonjae Kim and Bokyung Son and Ildoo Kim},
year={2021},
eprint={2102.03334},
archivePrefix={arXiv},
primaryClass={stat.ML}
}
``` | 2,152 | [
[
-0.045166015625,
-0.06494140625,
0.00382232666015625,
0.00940704345703125,
-0.023162841796875,
-0.014129638671875,
-0.0168609619140625,
-0.024139404296875,
-0.002536773681640625,
0.036346435546875,
-0.04925537109375,
-0.018585205078125,
-0.043975830078125,
-0.005100250244140625,
-0.028350830078125,
0.07843017578125,
-0.006717681884765625,
0.01477813720703125,
-0.0272979736328125,
-0.0164794921875,
-0.0273895263671875,
-0.034210205078125,
-0.03277587890625,
-0.030120849609375,
0.02496337890625,
0.0164642333984375,
0.0362548828125,
0.02459716796875,
0.0467529296875,
0.0277557373046875,
-0.02117919921875,
0.006153106689453125,
-0.028778076171875,
-0.008087158203125,
-0.00970458984375,
-0.052001953125,
-0.02679443359375,
-0.007442474365234375,
0.040771484375,
0.0258026123046875,
0.006893157958984375,
0.04095458984375,
0.0178985595703125,
0.034942626953125,
-0.04302978515625,
0.01123809814453125,
-0.0350341796875,
0.01132965087890625,
0.004764556884765625,
-0.01788330078125,
-0.0251007080078125,
-0.0236053466796875,
0.021148681640625,
-0.033294677734375,
0.030303955078125,
-0.0208892822265625,
0.09466552734375,
0.032257080078125,
-0.0183868408203125,
0.00843048095703125,
-0.04742431640625,
0.0511474609375,
-0.03192138671875,
0.024688720703125,
0.011688232421875,
0.03515625,
0.0070953369140625,
-0.08203125,
-0.05621337890625,
0.00432586669921875,
-0.00992584228515625,
0.018707275390625,
-0.0291595458984375,
0.0009255409240722656,
0.041473388671875,
0.03179931640625,
-0.055633544921875,
-0.0285186767578125,
-0.05950927734375,
-0.0203094482421875,
0.037139892578125,
0.00043487548828125,
0.0200347900390625,
-0.0129547119140625,
-0.03533935546875,
-0.038848876953125,
-0.020538330078125,
0.030731201171875,
-0.005268096923828125,
-0.0024814605712890625,
-0.0226898193359375,
0.06158447265625,
-0.0234832763671875,
0.055145263671875,
0.00616455078125,
0.0017604827880859375,
0.037139892578125,
-0.027130126953125,
-0.027984619140625,
-0.0152435302734375,
0.07098388671875,
0.043426513671875,
0.047607421875,
0.008544921875,
0.003570556640625,
0.0023975372314453125,
0.01132965087890625,
-0.07843017578125,
-0.0252227783203125,
0.00598907470703125,
-0.0202484130859375,
-0.034698486328125,
0.01055145263671875,
-0.06280517578125,
-0.0021877288818359375,
-0.014495849609375,
0.045562744140625,
-0.01861572265625,
-0.0232086181640625,
-0.0120849609375,
-0.009796142578125,
0.053253173828125,
0.01202392578125,
-0.050079345703125,
0.01331329345703125,
0.0175933837890625,
0.055145263671875,
0.01386260986328125,
-0.029937744140625,
-0.04290771484375,
-0.0440673828125,
-0.01497650146484375,
0.046112060546875,
-0.016204833984375,
-0.0184173583984375,
0.00034737586975097656,
0.0284423828125,
-0.0104217529296875,
-0.02996826171875,
0.02679443359375,
-0.03167724609375,
0.035675048828125,
-0.01306915283203125,
-0.0148773193359375,
-0.02935791015625,
0.018768310546875,
-0.02789306640625,
0.0865478515625,
0.016387939453125,
-0.0675048828125,
0.0126800537109375,
-0.034271240234375,
-0.007183074951171875,
0.011322021484375,
-0.0037326812744140625,
-0.042755126953125,
-0.00963592529296875,
0.03436279296875,
0.0269927978515625,
-0.0038089752197265625,
0.0228271484375,
-0.0192413330078125,
-0.0325927734375,
0.0296173095703125,
-0.043792724609375,
0.07464599609375,
0.004741668701171875,
-0.03607177734375,
0.020416259765625,
-0.041839599609375,
0.021270751953125,
0.01450347900390625,
-0.017822265625,
0.0031223297119140625,
-0.0169677734375,
0.0178985595703125,
0.0257568359375,
0.0254058837890625,
-0.0285186767578125,
0.0243682861328125,
-0.037353515625,
0.03448486328125,
0.034698486328125,
-0.00595855712890625,
0.0333251953125,
-0.0027618408203125,
0.04705810546875,
0.002910614013671875,
0.0335693359375,
-0.002483367919921875,
-0.041839599609375,
-0.088134765625,
0.0081024169921875,
0.01605224609375,
0.057952880859375,
-0.08306884765625,
0.00952911376953125,
-0.026031494140625,
-0.044677734375,
-0.049896240234375,
0.0005970001220703125,
0.038909912109375,
0.048858642578125,
0.04498291015625,
-0.00986480712890625,
-0.046966552734375,
-0.0682373046875,
-0.005126953125,
-0.005489349365234375,
0.0145721435546875,
0.0177154541015625,
0.04302978515625,
-0.0184783935546875,
0.056671142578125,
-0.032379150390625,
0.007755279541015625,
-0.0186614990234375,
0.005046844482421875,
0.0187225341796875,
0.0377197265625,
0.048675537109375,
-0.0762939453125,
-0.03857421875,
-0.0006451606750488281,
-0.06158447265625,
0.0015230178833007812,
0.0183563232421875,
-0.02069091796875,
0.01401519775390625,
0.04107666015625,
-0.0372314453125,
0.057586669921875,
0.043548583984375,
-0.036895751953125,
0.04254150390625,
-0.0013904571533203125,
0.00661468505859375,
-0.0926513671875,
0.0111236572265625,
0.0028133392333984375,
-0.028778076171875,
-0.043365478515625,
0.01372528076171875,
0.0116119384765625,
-0.006748199462890625,
-0.051055908203125,
0.06158447265625,
-0.0229339599609375,
0.0105438232421875,
-0.0223388671875,
-0.01041412353515625,
0.004596710205078125,
0.051361083984375,
0.0093994140625,
0.059234619140625,
0.059814453125,
-0.043426513671875,
0.049652099609375,
0.040618896484375,
-0.0175018310546875,
0.03570556640625,
-0.072265625,
0.006031036376953125,
-0.00870513916015625,
-0.0055084228515625,
-0.0682373046875,
-0.015411376953125,
0.04461669921875,
-0.060577392578125,
0.0190887451171875,
-0.032562255859375,
-0.017547607421875,
-0.053619384765625,
-0.006435394287109375,
0.036285400390625,
0.059295654296875,
-0.039886474609375,
0.04278564453125,
-0.002414703369140625,
-0.0033016204833984375,
-0.060333251953125,
-0.0792236328125,
-0.00772857666015625,
-0.00860595703125,
-0.049530029296875,
0.007415771484375,
-0.019866943359375,
-0.0023212432861328125,
0.01250457763671875,
-0.00688934326171875,
-0.0189208984375,
-0.0103759765625,
0.0159912109375,
0.03240966796875,
-0.03363037109375,
0.0007305145263671875,
-0.00449371337890625,
-0.0129547119140625,
0.006320953369140625,
-0.0195465087890625,
0.0430908203125,
-0.041107177734375,
-0.0089111328125,
-0.033477783203125,
0.007312774658203125,
0.045928955078125,
-0.051605224609375,
0.0400390625,
0.07177734375,
-0.0260162353515625,
-0.0005645751953125,
-0.03680419921875,
-0.01477813720703125,
-0.037445068359375,
0.04058837890625,
-0.0225982666015625,
-0.062103271484375,
0.036590576171875,
0.019195556640625,
-0.0075531005859375,
0.04742431640625,
0.056488037109375,
-0.00785064697265625,
0.0616455078125,
0.054901123046875,
0.02008056640625,
0.0545654296875,
-0.04443359375,
0.0029964447021484375,
-0.06182861328125,
-0.02557373046875,
-0.01038360595703125,
-0.017913818359375,
-0.05133056640625,
-0.041259765625,
0.0281219482421875,
0.0177154541015625,
-0.018890380859375,
0.0274658203125,
-0.06695556640625,
0.028564453125,
0.0587158203125,
0.012481689453125,
-0.01010894775390625,
0.01207733154296875,
-0.007221221923828125,
0.002140045166015625,
-0.052276611328125,
-0.0227813720703125,
0.07659912109375,
0.021636962890625,
0.0521240234375,
-0.0172119140625,
0.03521728515625,
-0.01995849609375,
0.0024089813232421875,
-0.060516357421875,
0.04302978515625,
-0.00727081298828125,
-0.044952392578125,
-0.0146026611328125,
-0.01235198974609375,
-0.0621337890625,
0.0261993408203125,
-0.0283355712890625,
-0.0712890625,
0.033233642578125,
0.0173187255859375,
-0.0220489501953125,
0.0156402587890625,
-0.055328369140625,
0.0830078125,
-0.019927978515625,
-0.0328369140625,
0.0146331787109375,
-0.05560302734375,
0.0233917236328125,
0.0218048095703125,
-0.017578125,
0.00988006591796875,
0.0361328125,
0.055999755859375,
-0.033477783203125,
0.060394287109375,
-0.017547607421875,
0.0288848876953125,
0.048431396484375,
-0.007541656494140625,
0.01415252685546875,
0.00933837890625,
0.0247650146484375,
0.01629638671875,
0.0214691162109375,
-0.0328369140625,
-0.045867919921875,
0.0390625,
-0.06195068359375,
-0.035980224609375,
-0.038177490234375,
-0.029754638671875,
0.00787353515625,
0.0288543701171875,
0.05303955078125,
0.046966552734375,
-0.0025844573974609375,
0.028656005859375,
0.061737060546875,
-0.01495361328125,
0.034271240234375,
0.01226043701171875,
-0.034271240234375,
-0.036712646484375,
0.0687255859375,
0.0007758140563964844,
0.01934814453125,
0.0290069580078125,
0.0214691162109375,
-0.0245208740234375,
-0.01435089111328125,
-0.036834716796875,
0.0157012939453125,
-0.05865478515625,
-0.01202392578125,
-0.04644775390625,
-0.05474853515625,
-0.031982421875,
0.0038166046142578125,
-0.02789306640625,
-0.00885772705078125,
-0.023223876953125,
0.00007617473602294922,
0.041259765625,
0.030731201171875,
0.007419586181640625,
0.026519775390625,
-0.044647216796875,
0.0294952392578125,
0.044769287109375,
0.0143280029296875,
-0.0205078125,
-0.046539306640625,
-0.0142974853515625,
0.028167724609375,
-0.0235443115234375,
-0.059234619140625,
0.0250396728515625,
0.012786865234375,
0.03240966796875,
0.00870513916015625,
-0.01044464111328125,
0.05438232421875,
-0.0216217041015625,
0.054473876953125,
0.01285552978515625,
-0.060882568359375,
0.0496826171875,
-0.005588531494140625,
0.0279998779296875,
0.032562255859375,
0.0034618377685546875,
-0.040618896484375,
-0.00827789306640625,
-0.0523681640625,
-0.06298828125,
0.04412841796875,
0.024322509765625,
0.0233612060546875,
0.0224761962890625,
0.0256805419921875,
-0.016265869140625,
0.01427459716796875,
-0.07366943359375,
-0.030792236328125,
-0.051116943359375,
-0.0149993896484375,
-0.0112457275390625,
-0.0282745361328125,
-0.0012979507446289062,
-0.057037353515625,
0.0394287109375,
-0.017974853515625,
0.039886474609375,
0.042510986328125,
-0.0198822021484375,
-0.0044097900390625,
0.00011497735977172852,
0.040069580078125,
0.036712646484375,
-0.00891876220703125,
0.0059814453125,
0.017791748046875,
-0.041656494140625,
-0.00768280029296875,
0.0166778564453125,
-0.0213623046875,
0.004669189453125,
0.033294677734375,
0.09222412109375,
-0.01190948486328125,
-0.02911376953125,
0.06024169921875,
-0.01554107666015625,
-0.035491943359375,
-0.038665771484375,
0.01206207275390625,
0.00499725341796875,
0.045196533203125,
0.00876617431640625,
0.0202789306640625,
-0.0005998611450195312,
-0.0166778564453125,
0.0102996826171875,
0.0210418701171875,
-0.052886962890625,
-0.018829345703125,
0.0645751953125,
0.0002104043960571289,
-0.020751953125,
0.0560302734375,
-0.00067901611328125,
-0.05145263671875,
0.05291748046875,
0.020599365234375,
0.06658935546875,
-0.00716400146484375,
0.0181884765625,
0.054107666015625,
0.026611328125,
0.0057525634765625,
0.03985595703125,
-0.00444793701171875,
-0.052764892578125,
-0.0303497314453125,
-0.0537109375,
-0.007373809814453125,
0.0150604248046875,
-0.06463623046875,
0.0294952392578125,
-0.0216217041015625,
-0.0093994140625,
0.0014162063598632812,
-0.0070953369140625,
-0.08074951171875,
0.034942626953125,
0.02581787109375,
0.06549072265625,
-0.05950927734375,
0.061492919921875,
0.0665283203125,
-0.05072021484375,
-0.06591796875,
-0.0056304931640625,
-0.0182952880859375,
-0.0750732421875,
0.03680419921875,
0.0098876953125,
0.0017156600952148438,
0.01556396484375,
-0.07318115234375,
-0.057464599609375,
0.08087158203125,
0.0207366943359375,
-0.033843994140625,
-0.0008511543273925781,
-0.0010547637939453125,
0.04534912109375,
-0.0005316734313964844,
0.0276336669921875,
0.0218658447265625,
0.03448486328125,
0.01995849609375,
-0.06610107421875,
-0.005825042724609375,
-0.04705810546875,
0.004177093505859375,
-0.00020515918731689453,
-0.04461669921875,
0.06634521484375,
-0.0196075439453125,
-0.0016260147094726562,
0.0143890380859375,
0.0582275390625,
0.0013437271118164062,
0.0085601806640625,
0.0288848876953125,
0.032806396484375,
0.04229736328125,
-0.0228271484375,
0.0787353515625,
-0.008514404296875,
0.055877685546875,
0.0716552734375,
0.03448486328125,
0.059234619140625,
0.046905517578125,
-0.01216888427734375,
0.0263824462890625,
0.049072265625,
-0.03173828125,
0.035125732421875,
0.0048980712890625,
0.00226593017578125,
-0.0293121337890625,
0.00258636474609375,
-0.03375244140625,
0.03851318359375,
0.009796142578125,
-0.0267333984375,
-0.01025390625,
0.0128631591796875,
-0.008941650390625,
-0.011505126953125,
-0.0140380859375,
0.044952392578125,
0.01084136962890625,
-0.05859375,
0.068603515625,
-0.020538330078125,
0.045928955078125,
-0.03558349609375,
-0.0098114013671875,
-0.014892578125,
0.0208282470703125,
-0.01149749755859375,
-0.07904052734375,
0.0164642333984375,
0.007602691650390625,
-0.018310546875,
-0.00027632713317871094,
0.05145263671875,
-0.04638671875,
-0.061492919921875,
0.0157012939453125,
0.03582763671875,
0.019287109375,
-0.00568389892578125,
-0.06768798828125,
-0.0128326416015625,
0.0157623291015625,
-0.0379638671875,
-0.003021240234375,
0.0272216796875,
-0.009765625,
0.049713134765625,
0.03619384765625,
-0.01108551025390625,
0.02313232421875,
-0.0017633438110351562,
0.06683349609375,
-0.03753662109375,
-0.015655517578125,
-0.06195068359375,
0.06463623046875,
0.0027179718017578125,
-0.045867919921875,
0.0384521484375,
0.033111572265625,
0.0716552734375,
-0.011627197265625,
0.0579833984375,
-0.01038360595703125,
0.01305389404296875,
-0.03759765625,
0.0577392578125,
-0.060882568359375,
-0.032470703125,
-0.0218658447265625,
-0.073486328125,
-0.024261474609375,
0.05621337890625,
-0.02532958984375,
0.005706787109375,
0.05474853515625,
0.06707763671875,
-0.01535797119140625,
-0.031707763671875,
0.0242462158203125,
0.0219573974609375,
0.016815185546875,
0.040557861328125,
0.05426025390625,
-0.056396484375,
0.05810546875,
-0.04345703125,
-0.00679779052734375,
-0.0233917236328125,
-0.0401611328125,
-0.08074951171875,
-0.060211181640625,
-0.035491943359375,
-0.042999267578125,
-0.0010080337524414062,
0.041473388671875,
0.051605224609375,
-0.0584716796875,
0.0016813278198242188,
-0.01554107666015625,
-0.0005917549133300781,
-0.0155487060546875,
-0.0192413330078125,
0.0288543701171875,
-0.0033664703369140625,
-0.06011962890625,
-0.0004837512969970703,
-0.004390716552734375,
0.0235137939453125,
-0.0145416259765625,
-0.006683349609375,
-0.0071563720703125,
-0.0213775634765625,
0.033447265625,
0.0296783447265625,
-0.06610107421875,
-0.019073486328125,
0.021636962890625,
-0.01458740234375,
0.0163116455078125,
0.01812744140625,
-0.053924560546875,
0.057708740234375,
0.05047607421875,
0.032958984375,
0.06201171875,
-0.0017786026000976562,
0.030792236328125,
-0.04095458984375,
0.03173828125,
0.01235198974609375,
0.035736083984375,
0.0258026123046875,
-0.0182952880859375,
0.028045654296875,
0.0279998779296875,
-0.037750244140625,
-0.046905517578125,
0.0023651123046875,
-0.0899658203125,
-0.0046234130859375,
0.095703125,
-0.0035152435302734375,
-0.044952392578125,
0.0272216796875,
-0.042724609375,
0.0489501953125,
0.001953125,
0.0186767578125,
0.038055419921875,
0.0173492431640625,
-0.040252685546875,
-0.05010986328125,
0.0229949951171875,
0.0185394287109375,
-0.036712646484375,
-0.0225830078125,
0.01174163818359375,
0.041168212890625,
0.0111236572265625,
0.029144287109375,
-0.004138946533203125,
0.034332275390625,
0.0030498504638671875,
0.0467529296875,
-0.00917816162109375,
-0.0185089111328125,
-0.025238037109375,
-0.0132293701171875,
-0.006954193115234375,
-0.045867919921875
]
] |
jhgan/ko-sroberta-multitask | 2022-06-13T16:34:48.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"ko",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | jhgan | null | null | jhgan/ko-sroberta-multitask | 39 | 63,320 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
language: ko
---
# ko-sroberta-multitask
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["안녕하세요?", "한국어 문장 임베딩을 위한 버트 모델입니다."]
model = SentenceTransformer('jhgan/ko-sroberta-multitask')
embeddings = model.encode(sentences)
print(embeddings)
```
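The vectors returned by `encode` are typically compared with cosine similarity (sentence-transformers also provides `util.cos_sim` for this). A dependency-free sketch of the same computation on toy vectors standing in for two 768-dimensional sentence embeddings:

```python
import math

def cos_sim(a, b):
    # cosine similarity = dot(a, b) / (||a|| * ||b||)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# toy 4-d stand-ins for two sentence embeddings
emb1 = [0.1, 0.3, -0.2, 0.4]
emb2 = [0.1, 0.25, -0.1, 0.5]
print(round(cos_sim(emb1, emb2), 4))
```

Values close to 1 indicate semantically similar sentences; in practice you would pass the rows of `model.encode(sentences)` to this function (or to `sentence_transformers.util.cos_sim`).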
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('jhgan/ko-sroberta-multitask')
model = AutoModel.from_pretrained('jhgan/ko-sroberta-multitask')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
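The masked averaging in `mean_pooling` above is worth spelling out: padded positions must not contribute to the mean, and the token count is clamped so empty masks cannot divide by zero. A dependency-free sketch of the same idea on toy token embeddings:

```python
# toy stand-ins: 2 real tokens + 1 padding token, 3-d embeddings
token_embeddings = [[1.0, 2.0, 3.0], [3.0, 2.0, 1.0], [9.0, 9.0, 9.0]]
attention_mask = [1, 1, 0]  # the last position is padding

dims = len(token_embeddings[0])
summed = [0.0] * dims
for emb, m in zip(token_embeddings, attention_mask):
    if m:  # only real tokens contribute to the sum
        summed = [s + e for s, e in zip(summed, emb)]

count = max(sum(attention_mask), 1)  # clamp, mirroring torch.clamp(..., min=1e-9)
mean = [s / count for s in summed]
print(mean)  # the padding row [9, 9, 9] is ignored
```

The torch version in the snippet above does exactly this, vectorized over the whole batch via the expanded attention mask.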
## Evaluation Results
Results were obtained by multi-task training on the KorSTS and KorNLI training sets, then evaluating on the KorSTS evaluation set.
- Cosine Pearson: 84.77
- Cosine Spearman: 85.60
- Euclidean Pearson: 83.71
- Euclidean Spearman: 84.40
- Manhattan Pearson: 83.70
- Manhattan Spearman: 84.38
- Dot Pearson: 82.42
- Dot Spearman: 82.33
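The Spearman scores above are rank correlations between the model's predicted similarities and the gold STS scores (scaled by 100). A minimal sketch of the computation, assuming no tied values so the simple closed-form formula applies:

```python
def spearman(xs, ys):
    # rank each list (assumption: no ties, which keeps the formula simple)
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

pred = [0.9, 0.2, 0.5, 0.7]  # toy cosine similarities
gold = [4.8, 1.0, 2.5, 4.0]  # toy human STS scores
print(spearman(pred, gold))  # identical rankings give 1.0
```

In practice the `EmbeddingSimilarityEvaluator` used during training (see below) computes these correlations, with proper tie handling, for the cosine, Euclidean, Manhattan, and dot-product scores.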
## Training
The model was trained with the parameters:
**DataLoader**:
`sentence_transformers.datasets.NoDuplicatesDataLoader.NoDuplicatesDataLoader` of length 8885 with parameters:
```
{'batch_size': 64}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```
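MultipleNegativesRankingLoss treats the other pairs in a batch as negatives: for each anchor, the true paired sentence should receive a higher (scaled) cosine similarity than every other candidate in the batch, which is expressed as a cross-entropy over each row of the similarity matrix. A dependency-free sketch with toy similarity scores (the real loss works on embeddings produced by the model):

```python
import math

def mnr_loss(sim_matrix, scale=20.0):
    # sim_matrix[i][j] = cos_sim(anchor_i, candidate_j); diagonal holds true pairs
    loss = 0.0
    for i, row in enumerate(sim_matrix):
        logits = [scale * s for s in row]
        m = max(logits)  # subtract the max for numerical stability
        log_z = m + math.log(sum(math.exp(l - m) for l in logits))
        loss += log_z - logits[i]  # cross-entropy with target index i
    return loss / len(sim_matrix)

# toy 2x2 cosine similarities: true pairs on the diagonal score higher
sims = [[0.9, 0.1],
        [0.2, 0.8]]
print(mnr_loss(sims))  # small loss, since the diagonal dominates each row
```

When the diagonal does not dominate (true pairs ranked below in-batch negatives), the loss grows sharply, which is what pushes paired sentences together during training.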
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 719 with parameters:
```
{'batch_size': 8, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"epochs": 5,
"evaluation_steps": 1000,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 360,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
- Ham, J., Choe, Y. J., Park, K., Choi, I., & Soh, H. (2020). KorNLI and KorSTS: New Benchmark Datasets for Korean Natural Language Understanding. arXiv preprint arXiv:2004.03289.
- Reimers, N., & Gurevych, I. (2019). Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks. arXiv preprint arXiv:1908.10084.
- Reimers, N., & Gurevych, I. (2020). Making Monolingual Sentence Embeddings Multilingual Using Knowledge Distillation. EMNLP 2020.
| 4,732 | [
[
-0.017822265625,
-0.06536865234375,
0.0301055908203125,
0.0253143310546875,
-0.021728515625,
-0.0262451171875,
-0.03173828125,
-0.00295257568359375,
0.0198822021484375,
0.0250396728515625,
-0.04937744140625,
-0.043365478515625,
-0.052581787109375,
0.007022857666015625,
-0.01171112060546875,
0.06610107421875,
-0.0158233642578125,
0.0016183853149414062,
-0.01435089111328125,
-0.01073455810546875,
-0.0279998779296875,
-0.0248260498046875,
-0.03692626953125,
-0.02569580078125,
0.0232696533203125,
0.020751953125,
0.035430908203125,
0.0382080078125,
0.014495849609375,
0.0312042236328125,
0.00263214111328125,
0.00209808349609375,
-0.0311126708984375,
0.0014905929565429688,
0.006317138671875,
-0.036468505859375,
-0.0007357597351074219,
0.013519287109375,
0.039825439453125,
0.0296173095703125,
-0.006694793701171875,
0.0195465087890625,
0.00676727294921875,
0.0494384765625,
-0.036224365234375,
0.033538818359375,
-0.0303955078125,
0.01226043701171875,
-0.0018873214721679688,
0.00888824462890625,
-0.0298614501953125,
-0.01324462890625,
0.01947021484375,
-0.040802001953125,
0.0140228271484375,
0.0173492431640625,
0.087646484375,
0.033477783203125,
-0.0219879150390625,
-0.037841796875,
-0.015289306640625,
0.073486328125,
-0.065673828125,
0.0281982421875,
0.024444580078125,
0.003948211669921875,
-0.005870819091796875,
-0.0631103515625,
-0.057952880859375,
-0.019287109375,
-0.032745361328125,
0.0242919921875,
-0.01446533203125,
0.0022125244140625,
0.0109710693359375,
0.0185546875,
-0.05987548828125,
-0.00917816162109375,
-0.02850341796875,
-0.0173187255859375,
0.04217529296875,
0.004673004150390625,
0.0268402099609375,
-0.0513916015625,
-0.03131103515625,
-0.02752685546875,
0.0005278587341308594,
0.011474609375,
0.021270751953125,
0.01197052001953125,
-0.0230255126953125,
0.06085205078125,
-0.0190277099609375,
0.048614501953125,
-0.0035800933837890625,
-0.0034389495849609375,
0.050384521484375,
-0.03857421875,
-0.0215606689453125,
0.0015087127685546875,
0.08233642578125,
0.024169921875,
0.02398681640625,
0.002483367919921875,
0.0011768341064453125,
0.01114654541015625,
0.0109710693359375,
-0.0548095703125,
-0.034942626953125,
0.0274505615234375,
-0.035614013671875,
-0.019287109375,
0.01175689697265625,
-0.06878662109375,
0.0010833740234375,
-0.0098876953125,
0.0472412109375,
-0.044891357421875,
-0.00527191162109375,
0.0286407470703125,
-0.01715087890625,
0.00461578369140625,
-0.015045166015625,
-0.038421630859375,
0.0084075927734375,
0.01145172119140625,
0.07183837890625,
0.003108978271484375,
-0.04486083984375,
-0.017364501953125,
-0.01454925537109375,
0.0023326873779296875,
0.047698974609375,
-0.020294189453125,
-0.00492095947265625,
0.00010144710540771484,
0.0237579345703125,
-0.056640625,
-0.0193023681640625,
0.049041748046875,
-0.01593017578125,
0.057281494140625,
0.005390167236328125,
-0.056640625,
-0.0265655517578125,
0.0225067138671875,
-0.0472412109375,
0.095947265625,
0.0210418701171875,
-0.074462890625,
0.01505279541015625,
-0.060211181640625,
-0.017730712890625,
-0.0162200927734375,
-0.0007929801940917969,
-0.052215576171875,
-0.006275177001953125,
0.036041259765625,
0.054534912109375,
0.003925323486328125,
0.0224609375,
-0.0194549560546875,
-0.0219879150390625,
0.022247314453125,
-0.0239715576171875,
0.07916259765625,
0.01453399658203125,
-0.0276031494140625,
0.021392822265625,
-0.052337646484375,
0.01067352294921875,
0.028472900390625,
-0.02740478515625,
-0.0186614990234375,
-0.018341064453125,
0.0271453857421875,
0.037200927734375,
0.0203399658203125,
-0.047088623046875,
0.007568359375,
-0.0384521484375,
0.046875,
0.04058837890625,
-0.000017523765563964844,
0.031341552734375,
-0.027679443359375,
0.035614013671875,
0.01300811767578125,
-0.00727081298828125,
-0.0156402587890625,
-0.026641845703125,
-0.06298828125,
-0.0301513671875,
0.0278167724609375,
0.048675537109375,
-0.05987548828125,
0.06597900390625,
-0.03509521484375,
-0.046051025390625,
-0.06939697265625,
0.002197265625,
0.02874755859375,
0.031280517578125,
0.04058837890625,
0.009185791015625,
-0.043243408203125,
-0.06768798828125,
-0.0214691162109375,
0.01413726806640625,
-0.0016527175903320312,
0.0251007080078125,
0.051910400390625,
-0.025848388671875,
0.064697265625,
-0.055389404296875,
-0.037750244140625,
-0.038726806640625,
0.007049560546875,
0.0225830078125,
0.03936767578125,
0.036590576171875,
-0.053619384765625,
-0.0491943359375,
-0.0377197265625,
-0.058013916015625,
0.00980377197265625,
-0.01165008544921875,
-0.01148223876953125,
0.0173187255859375,
0.04168701171875,
-0.0672607421875,
0.0234375,
0.048797607421875,
-0.041595458984375,
0.04644775390625,
-0.0198516845703125,
0.0024509429931640625,
-0.10906982421875,
0.0120849609375,
0.00725555419921875,
-0.01100921630859375,
-0.040374755859375,
-0.0006432533264160156,
0.006378173828125,
-0.00293731689453125,
-0.03277587890625,
0.043426513671875,
-0.037017822265625,
0.0206298828125,
-0.005859375,
0.036102294921875,
0.0031833648681640625,
0.05877685546875,
-0.01340484619140625,
0.053253173828125,
0.041259765625,
-0.0452880859375,
0.0256500244140625,
0.037506103515625,
-0.04351806640625,
0.0158538818359375,
-0.05517578125,
0.0005707740783691406,
-0.0124359130859375,
0.0246124267578125,
-0.0948486328125,
-0.0197601318359375,
0.0285797119140625,
-0.040618896484375,
0.0070343017578125,
0.01430511474609375,
-0.0506591796875,
-0.044677734375,
-0.032135009765625,
0.0113067626953125,
0.024078369140625,
-0.03173828125,
0.04583740234375,
0.0174102783203125,
-0.00959014892578125,
-0.043670654296875,
-0.06524658203125,
-0.01117706298828125,
-0.024078369140625,
-0.05535888671875,
0.033111572265625,
-0.00859832763671875,
0.0120849609375,
0.0172882080078125,
0.00827789306640625,
0.0028743743896484375,
-0.00470733642578125,
0.0013284683227539062,
0.0218658447265625,
0.0006480216979980469,
0.0121002197265625,
0.006427764892578125,
-0.0125579833984375,
0.0019311904907226562,
-0.0093536376953125,
0.0662841796875,
-0.00733184814453125,
-0.006755828857421875,
-0.0401611328125,
0.0083465576171875,
0.032745361328125,
-0.00992584228515625,
0.074462890625,
0.08087158203125,
-0.025634765625,
0.00431060791015625,
-0.038848876953125,
-0.01519775390625,
-0.034912109375,
0.05816650390625,
-0.027801513671875,
-0.06781005859375,
0.03802490234375,
0.00902557373046875,
-0.0034198760986328125,
0.05157470703125,
0.04058837890625,
0.00128173828125,
0.061065673828125,
0.036651611328125,
-0.015869140625,
0.03851318359375,
-0.038177490234375,
0.021392822265625,
-0.072998046875,
-0.01207733154296875,
-0.022918701171875,
-0.0261993408203125,
-0.051849365234375,
-0.02874755859375,
0.0210418701171875,
-0.0008745193481445312,
-0.01309967041015625,
0.038726806640625,
-0.0472412109375,
0.01418304443359375,
0.04876708984375,
0.014373779296875,
-0.003612518310546875,
-0.0073394775390625,
-0.026824951171875,
-0.00959014892578125,
-0.062347412109375,
-0.0290069580078125,
0.07061767578125,
0.02642822265625,
0.033416748046875,
-0.0181427001953125,
0.05096435546875,
-0.0070343017578125,
-0.00621795654296875,
-0.055816650390625,
0.03857421875,
-0.01175689697265625,
-0.032989501953125,
-0.0318603515625,
-0.032806396484375,
-0.07720947265625,
0.0400390625,
-0.00989532470703125,
-0.06494140625,
-0.004322052001953125,
-0.014892578125,
-0.0264434814453125,
0.0133209228515625,
-0.068359375,
0.09161376953125,
0.005283355712890625,
-0.001804351806640625,
-0.00719451904296875,
-0.053314208984375,
0.0221405029296875,
0.01325225830078125,
0.01517486572265625,
-0.01311492919921875,
0.004703521728515625,
0.0760498046875,
-0.005680084228515625,
0.0574951171875,
-0.015869140625,
0.0138702392578125,
0.021331787109375,
-0.018218994140625,
0.02923583984375,
0.0003445148468017578,
-0.0029888153076171875,
0.00911712646484375,
0.00603485107421875,
-0.027618408203125,
-0.0404052734375,
0.055816650390625,
-0.0611572265625,
-0.019012451171875,
-0.0361328125,
-0.061004638671875,
0.0008778572082519531,
0.0261993408203125,
0.0364990234375,
0.01422882080078125,
0.0008225440979003906,
0.0287017822265625,
0.034027099609375,
-0.02532958984375,
0.0418701171875,
0.0143890380859375,
-0.01024627685546875,
-0.04547119140625,
0.059112548828125,
0.01363372802734375,
0.0022335052490234375,
0.017120361328125,
0.0193634033203125,
-0.03936767578125,
-0.01218414306640625,
-0.02655029296875,
0.032501220703125,
-0.0369873046875,
-0.0086212158203125,
-0.0777587890625,
-0.0257110595703125,
-0.046112060546875,
-0.0016460418701171875,
-0.0294952392578125,
-0.023590087890625,
-0.0265045166015625,
-0.0148773193359375,
0.0191802978515625,
0.0322265625,
0.01503753662109375,
0.025970458984375,
-0.050750732421875,
0.020294189453125,
-0.0004973411560058594,
0.01312255859375,
-0.00420379638671875,
-0.046539306640625,
-0.033782958984375,
0.00748443603515625,
-0.0212860107421875,
-0.060791015625,
0.04644775390625,
0.00830841064453125,
0.042449951171875,
0.00841522216796875,
0.00945281982421875,
0.049560546875,
-0.032928466796875,
0.06695556640625,
0.0059967041015625,
-0.068603515625,
0.03912353515625,
-0.00911712646484375,
0.047943115234375,
0.0406494140625,
0.0253448486328125,
-0.04925537109375,
-0.031646728515625,
-0.054046630859375,
-0.07757568359375,
0.06085205078125,
0.03143310546875,
0.016387939453125,
-0.0017042160034179688,
0.0220184326171875,
-0.01128387451171875,
0.020263671875,
-0.0760498046875,
-0.03643798828125,
-0.023834228515625,
-0.049774169921875,
-0.0218048095703125,
-0.014801025390625,
0.0037746429443359375,
-0.0268402099609375,
0.05963134765625,
-0.0023193359375,
0.04534912109375,
0.0271453857421875,
-0.0224609375,
0.0177001953125,
0.020477294921875,
0.04022216796875,
0.0189361572265625,
-0.005161285400390625,
0.00743865966796875,
0.01477813720703125,
-0.04180908203125,
-0.0004668235778808594,
0.035491943359375,
-0.0067596435546875,
0.01336669921875,
0.02508544921875,
0.07110595703125,
0.0328369140625,
-0.0435791015625,
0.041168212890625,
-0.00347137451171875,
-0.028045654296875,
-0.03204345703125,
-0.0005698204040527344,
0.01904296875,
0.02093505859375,
0.01149749755859375,
-0.003910064697265625,
-0.0012903213500976562,
-0.019561767578125,
0.01904296875,
0.015472412109375,
-0.0240631103515625,
-0.00777435302734375,
0.04705810546875,
-0.015869140625,
-0.00945281982421875,
0.06365966796875,
-0.0279541015625,
-0.052581787109375,
0.037109375,
0.042724609375,
0.0662841796875,
-0.00826263427734375,
0.0202789306640625,
0.046051025390625,
0.0221405029296875,
-0.005496978759765625,
0.0172882080078125,
0.0164947509765625,
-0.0638427734375,
-0.0198974609375,
-0.048675537109375,
0.01395416259765625,
0.0153350830078125,
-0.053924560546875,
0.014434814453125,
-0.0018014907836914062,
-0.004974365234375,
-0.006866455078125,
0.01342010498046875,
-0.061248779296875,
0.009033203125,
-0.01470947265625,
0.05157470703125,
-0.06866455078125,
0.0662841796875,
0.059112548828125,
-0.044830322265625,
-0.06005859375,
0.005359649658203125,
-0.01227569580078125,
-0.060760498046875,
0.033660888671875,
0.04010009765625,
0.0164642333984375,
0.0217132568359375,
-0.033203125,
-0.07635498046875,
0.102783203125,
0.01800537109375,
-0.0296478271484375,
-0.01300811767578125,
0.004413604736328125,
0.032135009765625,
-0.0279541015625,
0.015472412109375,
0.0372314453125,
0.031585693359375,
-0.01024627685546875,
-0.06329345703125,
0.01806640625,
-0.0246124267578125,
0.007274627685546875,
-0.001651763916015625,
-0.05157470703125,
0.06866455078125,
-0.00968170166015625,
-0.01629638671875,
-0.0029125213623046875,
0.052978515625,
0.038818359375,
0.0159759521484375,
0.038482666015625,
0.062255859375,
0.052703857421875,
-0.001880645751953125,
0.07177734375,
-0.0343017578125,
0.0596923828125,
0.076904296875,
0.0138092041015625,
0.06494140625,
0.036773681640625,
-0.020538330078125,
0.04718017578125,
0.04095458984375,
-0.0172119140625,
0.047027587890625,
0.012115478515625,
0.004688262939453125,
0.0036029815673828125,
0.00954437255859375,
-0.00994873046875,
0.0352783203125,
0.0137481689453125,
-0.036529541015625,
-0.006683349609375,
0.02587890625,
0.0200653076171875,
0.00006109476089477539,
0.0081634521484375,
0.04559326171875,
0.00785064697265625,
-0.044952392578125,
0.031768798828125,
0.00986480712890625,
0.070068359375,
-0.038177490234375,
0.01297760009765625,
0.01477813720703125,
0.022857666015625,
-0.001880645751953125,
-0.050323486328125,
0.0193634033203125,
-0.0213623046875,
-0.00812530517578125,
-0.00891876220703125,
0.047515869140625,
-0.056732177734375,
-0.048675537109375,
0.0285491943359375,
0.0394287109375,
0.00897979736328125,
-0.01016998291015625,
-0.08514404296875,
0.0025081634521484375,
0.0129241943359375,
-0.051483154296875,
0.017120361328125,
0.02130126953125,
0.03448486328125,
0.04718017578125,
0.036712646484375,
-0.00940704345703125,
0.01558685302734375,
0.0047149658203125,
0.062347412109375,
-0.047332763671875,
-0.047088623046875,
-0.0673828125,
0.05255126953125,
-0.01171112060546875,
-0.035552978515625,
0.06707763671875,
0.04522705078125,
0.07379150390625,
-0.0172271728515625,
0.051605224609375,
-0.020751953125,
0.0305938720703125,
-0.050933837890625,
0.0640869140625,
-0.038665771484375,
-0.0099945068359375,
-0.0240478515625,
-0.06915283203125,
-0.0115966796875,
0.06964111328125,
-0.0259246826171875,
0.01739501953125,
0.0738525390625,
0.05926513671875,
-0.0079803466796875,
-0.00461578369140625,
0.00954437255859375,
0.0259246826171875,
0.0159759521484375,
0.038360595703125,
0.019439697265625,
-0.0640869140625,
0.048309326171875,
-0.04473876953125,
-0.00543975830078125,
-0.00701141357421875,
-0.04278564453125,
-0.07403564453125,
-0.0596923828125,
-0.0303497314453125,
-0.02734375,
-0.0128021240234375,
0.077880859375,
0.039794921875,
-0.052337646484375,
-0.00738525390625,
-0.01520538330078125,
-0.01125335693359375,
-0.024993896484375,
-0.02740478515625,
0.050537109375,
-0.045013427734375,
-0.062255859375,
0.0182952880859375,
-0.0113983154296875,
-0.0008821487426757812,
-0.0199737548828125,
0.0011234283447265625,
-0.047821044921875,
-0.00015437602996826172,
0.042510986328125,
-0.0192108154296875,
-0.05975341796875,
-0.01413726806640625,
0.009613037109375,
-0.0201873779296875,
-0.0018177032470703125,
0.02874755859375,
-0.044281005859375,
0.0311279296875,
0.0285491943359375,
0.037506103515625,
0.053619384765625,
-0.009979248046875,
0.0185394287109375,
-0.05987548828125,
0.0194091796875,
0.004734039306640625,
0.03839111328125,
0.0209503173828125,
-0.03350830078125,
0.038604736328125,
0.0198516845703125,
-0.048614501953125,
-0.04638671875,
-0.006317138671875,
-0.075927734375,
-0.0243988037109375,
0.087890625,
-0.024169921875,
-0.0309295654296875,
-0.0013103485107421875,
-0.0271759033203125,
0.039093017578125,
-0.0261993408203125,
0.064453125,
0.076171875,
-0.01230621337890625,
-0.0207672119140625,
-0.055450439453125,
0.0237579345703125,
0.037445068359375,
-0.0528564453125,
-0.0064697265625,
0.01503753662109375,
0.033905029296875,
0.0207672119140625,
0.044708251953125,
0.00299835205078125,
-0.0010156631469726562,
0.004795074462890625,
0.01010894775390625,
-0.00946807861328125,
-0.00140380859375,
-0.0232086181640625,
0.0095367431640625,
-0.037841796875,
-0.03997802734375
]
] |
cointegrated/roberta-large-cola-krishna2020 | 2023-06-13T09:38:15.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"text-classification",
"arxiv:2010.05700",
"endpoints_compatible",
"region:us"
] | text-classification | cointegrated | null | null | cointegrated/roberta-large-cola-krishna2020 | 5 | 63,008 | transformers | 2022-03-02T23:29:05 | This is a RoBERTa-large classifier trained on the CoLA corpus [Warstadt et al., 2019](https://www.mitpressjournals.org/doi/pdf/10.1162/tacl_a_00290),
which contains sentences paired with grammatical acceptability judgments. The model can be used to evaluate the fluency of machine-generated English sentences, e.g. for evaluating text style transfer.
The model was trained as part of the paper [Krishna et al., 2020. Reformulating Unsupervised Style Transfer as Paraphrase Generation](https://arxiv.org/abs/2010.05700), and its original version is available on [the project page](http://style.cs.umass.edu). We converted this model from Fairseq to Transformers format. All credit goes to the authors of the original paper.
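As a rough sketch of fluency scoring with this model (assuming the standard `transformers` text-classification pipeline; the label names such as `LABEL_0`/`LABEL_1` come from the model config and should be confirmed via `clf.model.config.id2label`):

```python
from transformers import pipeline

# Load the converted CoLA acceptability classifier from the Hub.
clf = pipeline(
    "text-classification",
    model="cointegrated/roberta-large-cola-krishna2020",
)

# Score a candidate sentence; the returned dict holds a label and a
# confidence score in [0, 1]. Which label denotes "acceptable" is an
# assumption to verify against clf.model.config.id2label.
result = clf("The quick brown fox jumps over the lazy dog.")[0]
print(result["label"], result["score"])
```

In a style-transfer evaluation loop, the same pipeline can be applied to each generated sentence and the per-label scores averaged over the corpus.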
## Citation
If you find this model useful, please cite the original work:
```
@inproceedings{style20,
    author = {Kalpesh Krishna and John Wieting and Mohit Iyyer},
    title = {Reformulating Unsupervised Style Transfer as Paraphrase Generation},
    booktitle = {Empirical Methods in Natural Language Processing},
    year = {2020},
}
``` | 1,057 | [
[
0.0021209716796875,
-0.047271728515625,
0.038299560546875,
0.0224456787109375,
-0.0006799697875976562,
-0.0168914794921875,
-0.033203125,
-0.016632080078125,
-0.004241943359375,
0.034423828125,
-0.0112762451171875,
-0.0302886962890625,
-0.04449462890625,
0.0288848876953125,
-0.049896240234375,
0.08709716796875,
0.01126861572265625,
0.0109710693359375,
-0.0192413330078125,
-0.0030155181884765625,
-0.033966064453125,
-0.0262603759765625,
-0.0419921875,
-0.01267242431640625,
0.0150299072265625,
0.026031494140625,
0.0592041015625,
0.0128936767578125,
0.0115966796875,
0.029693603515625,
-0.0185089111328125,
0.002048492431640625,
-0.03656005859375,
0.01280975341796875,
-0.0089569091796875,
-0.025238037109375,
-0.039459228515625,
0.004299163818359375,
0.0531005859375,
0.0404052734375,
-0.0014123916625976562,
0.00817108154296875,
0.0282745361328125,
0.03790283203125,
-0.03515625,
0.03546142578125,
-0.06561279296875,
-0.0175933837890625,
-0.0099945068359375,
-0.004642486572265625,
-0.07965087890625,
-0.0288848876953125,
-0.004444122314453125,
-0.04638671875,
0.0179595947265625,
0.0106353759765625,
0.083251953125,
0.01070404052734375,
-0.04638671875,
-0.027313232421875,
-0.041656494140625,
0.059417724609375,
-0.02386474609375,
0.033935546875,
0.01378631591796875,
0.0289459228515625,
-0.0141143798828125,
-0.08660888671875,
-0.0595703125,
-0.00945281982421875,
-0.0022296905517578125,
0.01334381103515625,
-0.0307159423828125,
-0.01355743408203125,
0.0038166046142578125,
0.02447509765625,
-0.04449462890625,
-0.021270751953125,
-0.0406494140625,
-0.01013946533203125,
0.0374755859375,
-0.0015401840209960938,
0.0199127197265625,
-0.021392822265625,
-0.03173828125,
-0.01416015625,
-0.0241241455078125,
0.001983642578125,
0.00469207763671875,
0.034423828125,
-0.00995635986328125,
0.04510498046875,
-0.00933074951171875,
0.0692138671875,
-0.0056610107421875,
-0.01238250732421875,
0.046356201171875,
-0.0220947265625,
-0.0175628662109375,
-0.01442718505859375,
0.055450439453125,
0.0272979736328125,
0.062347412109375,
-0.0218048095703125,
-0.01241302490234375,
-0.0150909423828125,
0.0234527587890625,
-0.0576171875,
-0.0338134765625,
-0.0082244873046875,
-0.050048828125,
-0.016143798828125,
0.03472900390625,
-0.0643310546875,
-0.0053253173828125,
-0.032562255859375,
0.024810791015625,
-0.048248291015625,
0.01090240478515625,
0.00885772705078125,
-0.0158843994140625,
0.0183868408203125,
0.0206146240234375,
-0.04595947265625,
0.0051116943359375,
0.044708251953125,
0.06634521484375,
0.0004253387451171875,
-0.0221405029296875,
-0.041473388671875,
0.0156402587890625,
-0.00989532470703125,
0.053558349609375,
-0.031524658203125,
-0.03240966796875,
0.00940704345703125,
0.0176849365234375,
-0.033782958984375,
-0.03546142578125,
0.082275390625,
-0.04302978515625,
0.0210113525390625,
0.0224761962890625,
-0.03790283203125,
-0.0029449462890625,
0.010650634765625,
-0.038055419921875,
0.0672607421875,
0.0192413330078125,
-0.05670166015625,
0.02447509765625,
-0.057830810546875,
-0.041717529296875,
0.004833221435546875,
0.00998687744140625,
-0.040496826171875,
-0.006755828857421875,
0.005489349365234375,
0.03057861328125,
-0.022705078125,
0.0364990234375,
-0.0316162109375,
-0.0222015380859375,
0.041015625,
-0.039825439453125,
0.08282470703125,
0.04620361328125,
0.006145477294921875,
0.0234832763671875,
-0.08233642578125,
0.0100250244140625,
0.00799560546875,
-0.0256195068359375,
-0.023101806640625,
-0.039154052734375,
0.047698974609375,
0.01526641845703125,
0.036407470703125,
-0.041229248046875,
0.0082550048828125,
-0.0006160736083984375,
0.045989990234375,
0.049896240234375,
-0.00702667236328125,
0.03155517578125,
-0.0357666015625,
0.041259765625,
-0.03094482421875,
0.0166473388671875,
-0.0008001327514648438,
-0.04254150390625,
-0.051300048828125,
-0.043365478515625,
0.040069580078125,
0.035430908203125,
-0.04254150390625,
0.04962158203125,
-0.024444580078125,
-0.04779052734375,
-0.026153564453125,
-0.00701904296875,
0.02276611328125,
0.04266357421875,
0.043609619140625,
0.0221710205078125,
-0.050994873046875,
-0.06475830078125,
-0.0205230712890625,
-0.00023508071899414062,
-0.00913238525390625,
-0.005283355712890625,
0.0367431640625,
-0.0210113525390625,
0.0692138671875,
-0.04339599609375,
-0.011932373046875,
-0.0150146484375,
0.030181884765625,
0.01363372802734375,
0.040374755859375,
0.056976318359375,
-0.06793212890625,
-0.05755615234375,
-0.01415252685546875,
-0.04876708984375,
-0.0208892822265625,
0.0034465789794921875,
-0.022857666015625,
0.023651123046875,
0.0518798828125,
-0.031646728515625,
0.0265045166015625,
0.035614013671875,
-0.04254150390625,
0.037689208984375,
-0.0137939453125,
0.01837158203125,
-0.1063232421875,
0.01519012451171875,
0.007415771484375,
-0.0311737060546875,
-0.0217742919921875,
0.00981903076171875,
0.014312744140625,
-0.0112762451171875,
-0.020904541015625,
0.03863525390625,
-0.03240966796875,
0.01250457763671875,
-0.017913818359375,
-0.005290985107421875,
0.0140380859375,
0.0168304443359375,
-0.00969696044921875,
0.08526611328125,
0.0300445556640625,
-0.040069580078125,
0.00852203369140625,
0.043792724609375,
-0.0208740234375,
0.068359375,
-0.056610107421875,
0.0080718994140625,
-0.0052947998046875,
0.028472900390625,
-0.092041015625,
-0.0103302001953125,
0.004444122314453125,
-0.04449462890625,
0.01358795166015625,
-0.007244110107421875,
-0.054962158203125,
-0.01450347900390625,
-0.005931854248046875,
0.01407623291015625,
0.054412841796875,
-0.05364990234375,
0.04443359375,
0.01507568359375,
-0.010894775390625,
-0.01233673095703125,
-0.046295166015625,
0.0080718994140625,
-0.034820556640625,
-0.05389404296875,
0.02386474609375,
-0.0181121826171875,
-0.02642822265625,
-0.00928497314453125,
0.0237884521484375,
-0.0236968994140625,
0.00826263427734375,
0.021942138671875,
0.02264404296875,
-0.007595062255859375,
0.019683837890625,
-0.01345062255859375,
-0.0006203651428222656,
0.0020389556884765625,
-0.005893707275390625,
0.055877685546875,
-0.0167236328125,
0.016754150390625,
-0.050689697265625,
0.006755828857421875,
0.047454833984375,
-0.0143585205078125,
0.05548095703125,
0.048675537109375,
-0.0114288330078125,
-0.01226043701171875,
-0.01247406005859375,
0.0119476318359375,
-0.03424072265625,
0.0145721435546875,
-0.044952392578125,
-0.041473388671875,
0.02294921875,
0.00975799560546875,
-0.027801513671875,
0.06268310546875,
0.0272674560546875,
0.010101318359375,
0.06207275390625,
0.028900146484375,
0.004642486572265625,
0.043792724609375,
-0.0225982666015625,
0.007373809814453125,
-0.047393798828125,
-0.01543426513671875,
-0.0821533203125,
-0.00994110107421875,
-0.0528564453125,
-0.0284576416015625,
0.0168914794921875,
-0.0003440380096435547,
-0.031494140625,
0.035888671875,
-0.0208892822265625,
0.0281982421875,
0.050445556640625,
0.0126800537109375,
0.042724609375,
-0.00782012939453125,
0.006160736083984375,
-0.00516510009765625,
-0.029144287109375,
-0.051788330078125,
0.08966064453125,
0.015960693359375,
0.045806884765625,
0.0212860107421875,
0.05108642578125,
0.0166015625,
0.0193634033203125,
-0.05426025390625,
0.04315185546875,
-0.0271148681640625,
-0.03082275390625,
-0.01123809814453125,
-0.0306243896484375,
-0.08807373046875,
-0.00991058349609375,
-0.0298919677734375,
-0.048126220703125,
-0.00423431396484375,
-0.0022945404052734375,
-0.0008406639099121094,
0.0006418228149414062,
-0.061004638671875,
0.0809326171875,
0.0084381103515625,
-0.0207366943359375,
-0.00603485107421875,
-0.0306396484375,
0.01471710205078125,
-0.0070953369140625,
0.0153961181640625,
0.0204925537109375,
0.014404296875,
0.052642822265625,
-0.02606201171875,
0.0406494140625,
-0.007221221923828125,
-0.0014438629150390625,
0.0258026123046875,
0.005054473876953125,
0.03240966796875,
0.0017271041870117188,
-0.021820068359375,
-0.003215789794921875,
0.0026760101318359375,
-0.03753662109375,
-0.04241943359375,
0.0457763671875,
-0.0509033203125,
-0.0224761962890625,
-0.03887939453125,
-0.0616455078125,
-0.0129241943359375,
0.0204315185546875,
0.0220947265625,
0.0296478271484375,
-0.0142822265625,
0.052825927734375,
0.027923583984375,
0.006206512451171875,
0.020904541015625,
0.01486968994140625,
-0.021942138671875,
-0.0217742919921875,
0.054534912109375,
-0.0015668869018554688,
0.0188140869140625,
0.025604248046875,
0.0180511474609375,
-0.0289459228515625,
-0.043914794921875,
-0.01241302490234375,
0.0189361572265625,
-0.046295166015625,
-0.0165557861328125,
-0.0445556640625,
-0.02532958984375,
-0.06396484375,
-0.016754150390625,
-0.00025200843811035156,
-0.031463623046875,
-0.0372314453125,
0.00862884521484375,
0.043365478515625,
0.06329345703125,
0.023773193359375,
0.0306549072265625,
-0.050933837890625,
0.00910186767578125,
0.017822265625,
0.00446319580078125,
-0.0071868896484375,
-0.0809326171875,
0.001430511474609375,
0.0011777877807617188,
-0.0360107421875,
-0.05462646484375,
0.043426513671875,
0.017333984375,
0.0418701171875,
0.017303466796875,
0.01654052734375,
0.04736328125,
-0.0281829833984375,
0.051116943359375,
0.004528045654296875,
-0.07855224609375,
0.02996826171875,
-0.0261077880859375,
0.01702880859375,
0.0576171875,
0.039306640625,
-0.0207977294921875,
-0.0304718017578125,
-0.072265625,
-0.06304931640625,
0.0231781005859375,
0.0062255859375,
0.01062774658203125,
-0.0184173583984375,
0.0079345703125,
0.0160369873046875,
0.00978851318359375,
-0.06536865234375,
0.0001558065414428711,
-0.005859375,
-0.050872802734375,
-0.0198974609375,
-0.0176849365234375,
0.0033359527587890625,
-0.00732421875,
0.063720703125,
-0.013702392578125,
0.01457977294921875,
0.005603790283203125,
-0.02838134765625,
0.026611328125,
0.038055419921875,
0.0361328125,
0.0355224609375,
-0.0197296142578125,
0.0016193389892578125,
0.00206756591796875,
-0.021697998046875,
-0.018402099609375,
0.01558685302734375,
-0.026092529296875,
0.00545501708984375,
0.021209716796875,
0.045806884765625,
0.0169677734375,
-0.05389404296875,
0.0574951171875,
-0.00902557373046875,
-0.0242462158203125,
-0.05169677734375,
0.0005884170532226562,
-0.0196380615234375,
0.02630615234375,
0.023956298828125,
0.0207061767578125,
0.038055419921875,
-0.0550537109375,
0.03082275390625,
0.033203125,
-0.025238037109375,
-0.0162506103515625,
0.06396484375,
0.01270294189453125,
-0.04547119140625,
0.037872314453125,
-0.0418701171875,
-0.03216552734375,
0.028106689453125,
0.06268310546875,
0.05499267578125,
0.01421356201171875,
0.01287078857421875,
0.046875,
-0.0177764892578125,
-0.0121917724609375,
0.01471710205078125,
-0.003856658935546875,
-0.0386962890625,
-0.01268768310546875,
-0.051422119140625,
-0.005794525146484375,
0.0165252685546875,
-0.059722900390625,
0.0540771484375,
-0.017578125,
0.004215240478515625,
0.006877899169921875,
-0.033843994140625,
-0.051971435546875,
0.005313873291015625,
-0.0148468017578125,
0.06707763671875,
-0.08404541015625,
0.06964111328125,
0.056427001953125,
-0.048248291015625,
-0.06146240234375,
0.038787841796875,
-0.016021728515625,
-0.0684814453125,
0.06060791015625,
0.01280975341796875,
0.0011320114135742188,
0.00441741943359375,
-0.0257568359375,
-0.048370361328125,
0.07196044921875,
0.0182647705078125,
-0.0300445556640625,
-0.004581451416015625,
-0.02093505859375,
0.05523681640625,
-0.0246429443359375,
0.0225830078125,
0.026641845703125,
0.035064697265625,
0.01390838623046875,
-0.09039306640625,
-0.007556915283203125,
-0.0256195068359375,
0.00785064697265625,
-0.004974365234375,
-0.049072265625,
0.07965087890625,
-0.007904052734375,
0.00412750244140625,
0.0555419921875,
0.06591796875,
0.0252227783203125,
0.033355712890625,
0.055755615234375,
0.03753662109375,
0.05560302734375,
-0.005130767822265625,
0.057952880859375,
-0.0224456787109375,
0.047760009765625,
0.09515380859375,
-0.0086669921875,
0.082275390625,
0.042205810546875,
-0.03460693359375,
0.050994873046875,
0.01280975341796875,
-0.005214691162109375,
0.0311126708984375,
0.021820068359375,
0.0039520263671875,
-0.009033203125,
-0.00431060791015625,
-0.02294921875,
0.059783935546875,
0.003932952880859375,
-0.056610107421875,
-0.00862884521484375,
0.0218963623046875,
0.0233612060546875,
0.05645751953125,
-0.01837158203125,
0.04345703125,
-0.0263214111328125,
-0.06610107421875,
0.054534912109375,
-0.01120758056640625,
0.060882568359375,
-0.0110931396484375,
-0.01348876953125,
0.002349853515625,
0.0310516357421875,
-0.0081634521484375,
-0.08056640625,
0.0318603515625,
0.0134735107421875,
-0.0194549560546875,
-0.025177001953125,
0.033111572265625,
-0.031463623046875,
-0.047332763671875,
0.01727294921875,
0.01788330078125,
0.0287933349609375,
0.0107879638671875,
-0.048431396484375,
-0.005588531494140625,
0.009857177734375,
0.0021991729736328125,
0.0167694091796875,
0.046600341796875,
-0.005298614501953125,
0.0306396484375,
0.0252685546875,
-0.007358551025390625,
0.004413604736328125,
0.028106689453125,
0.035400390625,
-0.033935546875,
-0.0479736328125,
-0.0338134765625,
0.0355224609375,
-0.0194854736328125,
-0.037109375,
0.06536865234375,
0.06561279296875,
0.06695556640625,
-0.022491455078125,
0.053436279296875,
-0.016937255859375,
0.0496826171875,
-0.043304443359375,
0.043548583984375,
-0.046600341796875,
-0.00792694091796875,
-0.0305328369140625,
-0.051727294921875,
-0.025115966796875,
0.06787109375,
-0.0246429443359375,
0.0217132568359375,
0.062164306640625,
0.05743408203125,
-0.003536224365234375,
0.0119476318359375,
0.01342010498046875,
0.028228759765625,
0.0128936767578125,
0.032135009765625,
0.03900146484375,
-0.039581298828125,
0.0697021484375,
-0.0095977783203125,
-0.01026153564453125,
-0.0311737060546875,
-0.0615234375,
-0.057647705078125,
-0.08465576171875,
-0.0307464599609375,
-0.06402587890625,
0.0086517333984375,
0.0714111328125,
0.036102294921875,
-0.050994873046875,
-0.0126190185546875,
-0.0306396484375,
-0.02490234375,
-0.0077362060546875,
-0.02374267578125,
0.0181121826171875,
-0.037994384765625,
-0.09649658203125,
0.0164337158203125,
-0.02349853515625,
0.004150390625,
0.010101318359375,
-0.00240325927734375,
-0.033966064453125,
-0.022308349609375,
0.041259765625,
0.005420684814453125,
-0.0355224609375,
-0.045806884765625,
-0.0009775161743164062,
-0.0117645263671875,
0.008941650390625,
0.03955078125,
-0.03851318359375,
0.021484375,
0.0355224609375,
0.033721923828125,
0.02435302734375,
0.00396728515625,
0.029541015625,
-0.048492431640625,
0.02099609375,
0.005939483642578125,
0.045166015625,
0.029449462890625,
-0.0100250244140625,
0.042633056640625,
0.0242767333984375,
-0.06610107421875,
-0.06353759765625,
0.004302978515625,
-0.1046142578125,
-0.0282745361328125,
0.1116943359375,
-0.016021728515625,
-0.02960205078125,
-0.004413604736328125,
-0.043792724609375,
0.035797119140625,
-0.04290771484375,
0.04827880859375,
0.04864501953125,
0.0027675628662109375,
0.00003361701965332031,
-0.02264404296875,
0.01235198974609375,
0.0145263671875,
-0.05474853515625,
-0.002529144287109375,
0.0361328125,
0.0308837890625,
0.0225372314453125,
0.0156097412109375,
-0.0140380859375,
0.01861572265625,
-0.006015777587890625,
0.02288818359375,
-0.00328826904296875,
-0.0286102294921875,
-0.034149169921875,
0.0218658447265625,
0.0411376953125,
-0.021820068359375
]
] |
timm/eva02_large_patch14_clip_224.merged2b_s4b_b131k | 2023-04-10T22:00:30.000Z | [
"open_clip",
"zero-shot-image-classification",
"clip",
"license:mit",
"region:us",
"has_space"
] | zero-shot-image-classification | timm | null | null | timm/eva02_large_patch14_clip_224.merged2b_s4b_b131k | 3 | 62,041 | open_clip | 2023-04-10T21:53:29 | ---
tags:
- zero-shot-image-classification
- clip
library_name: open_clip
license: mit
---
# Model card for eva02_large_patch14_clip_224.merged2b_s4b_b131k
| 155 | [
[
-0.044036865234375,
-0.01142120361328125,
0.0239105224609375,
0.042877197265625,
-0.046905517578125,
0.02569580078125,
0.0279388427734375,
0.00013887882232666016,
0.05792236328125,
0.06634521484375,
-0.0521240234375,
-0.017669677734375,
-0.0374755859375,
0.00429534912109375,
-0.04058837890625,
0.04376220703125,
-0.006893157958984375,
0.01068878173828125,
-0.005634307861328125,
0.0026302337646484375,
-0.042327880859375,
-0.04376220703125,
-0.05267333984375,
-0.016693115234375,
0.031768798828125,
0.053009033203125,
0.0201263427734375,
0.033660888671875,
0.055145263671875,
0.034271240234375,
-0.019012451171875,
0.02459716796875,
-0.0265350341796875,
-0.021636962890625,
-0.00542449951171875,
-0.0294647216796875,
-0.0966796875,
0.01032257080078125,
0.0657958984375,
0.0030918121337890625,
-0.028961181640625,
0.0212554931640625,
-0.00372314453125,
-0.00666046142578125,
-0.0253143310546875,
-0.0006594657897949219,
-0.0214996337890625,
0.00004988908767700195,
-0.022705078125,
-0.01354217529296875,
-0.015380859375,
-0.012115478515625,
-0.037017822265625,
-0.041656494140625,
0.007480621337890625,
0.0029010772705078125,
0.0875244140625,
0.0095367431640625,
-0.0155792236328125,
0.023162841796875,
-0.04632568359375,
0.046905517578125,
-0.038726806640625,
0.038726806640625,
0.0190887451171875,
0.01442718505859375,
-0.0187835693359375,
-0.0537109375,
-0.0176239013671875,
0.04083251953125,
0.01168060302734375,
0.0006823539733886719,
-0.028167724609375,
0.0008168220520019531,
0.04559326171875,
0.029754638671875,
-0.009185791015625,
0.024017333984375,
-0.0816650390625,
-0.0229034423828125,
0.04925537109375,
0.0205078125,
0.0150299072265625,
-0.0112152099609375,
-0.06591796875,
-0.003154754638671875,
-0.03082275390625,
-0.044891357421875,
0.056365966796875,
-0.019744873046875,
-0.0404052734375,
0.04473876953125,
0.0078887939453125,
0.031707763671875,
0.0039215087890625,
0.008392333984375,
0.021453857421875,
-0.035491943359375,
-0.039093017578125,
-0.0113677978515625,
0.0151519775390625,
0.06005859375,
-0.0309295654296875,
0.043548583984375,
-0.0173797607421875,
-0.02325439453125,
0.02239990234375,
-0.07232666015625,
-0.02362060546875,
-0.01331329345703125,
-0.0411376953125,
-0.0295867919921875,
0.0311431884765625,
-0.077880859375,
0.01021575927734375,
-0.0074310302734375,
0.0268707275390625,
0.0011758804321289062,
-0.038970947265625,
-0.0046234130859375,
-0.04266357421875,
0.01415252685546875,
0.018951416015625,
-0.037994384765625,
0.021148681640625,
0.056121826171875,
0.038360595703125,
0.02142333984375,
-0.0035533905029296875,
-0.0101165771484375,
0.01605224609375,
-0.04412841796875,
0.053558349609375,
-0.03118896484375,
-0.044647216796875,
0.0100860595703125,
0.042755126953125,
0.00905609130859375,
-0.03460693359375,
0.0360107421875,
-0.041168212890625,
-0.004077911376953125,
-0.0272979736328125,
-0.0289154052734375,
-0.006641387939453125,
0.0234832763671875,
-0.066650390625,
0.08245849609375,
0.032623291015625,
-0.030517578125,
0.018829345703125,
-0.0443115234375,
0.0254364013671875,
0.040008544921875,
0.022369384765625,
-0.0289764404296875,
-0.00356292724609375,
-0.036956787109375,
0.0082550048828125,
0.003299713134765625,
-0.003681182861328125,
-0.03363037109375,
-0.0239105224609375,
-0.00902557373046875,
-0.021453857421875,
0.038604736328125,
0.0290679931640625,
0.033905029296875,
0.0067596435546875,
-0.032806396484375,
-0.0017805099487304688,
0.0104217529296875,
-0.004299163818359375,
-0.0202178955078125,
-0.053985595703125,
0.02294921875,
0.002269744873046875,
0.036590576171875,
-0.03179931640625,
0.032806396484375,
0.02777099609375,
0.018463134765625,
0.0484619140625,
0.0260772705078125,
0.01300811767578125,
-0.0235137939453125,
0.05035400390625,
-0.01654052734375,
0.0231170654296875,
0.0002429485321044922,
-0.0294036865234375,
-0.056121826171875,
-0.040740966796875,
0.0268707275390625,
0.019439697265625,
-0.008026123046875,
0.06060791015625,
0.020355224609375,
-0.056182861328125,
0.0210418701171875,
0.0019989013671875,
0.024505615234375,
0.007076263427734375,
0.02313232421875,
-0.0301055908203125,
-0.06903076171875,
-0.08343505859375,
0.00962066650390625,
-0.001605987548828125,
-0.006465911865234375,
0.0285797119140625,
0.043121337890625,
-0.0609130859375,
0.0204925537109375,
-0.06097412109375,
-0.033966064453125,
-0.04364013671875,
0.0062103271484375,
0.0088653564453125,
0.04010009765625,
0.053741455078125,
-0.0257110595703125,
-0.006908416748046875,
-0.0323486328125,
-0.0323486328125,
-0.0164794921875,
0.00815582275390625,
-0.0300445556640625,
-0.0012722015380859375,
0.051055908203125,
-0.03045654296875,
0.052276611328125,
0.0537109375,
-0.055511474609375,
-0.00792694091796875,
-0.03924560546875,
0.037017822265625,
-0.07952880859375,
-0.00493621826171875,
-0.0034503936767578125,
-0.0261077880859375,
-0.023193359375,
-0.007541656494140625,
0.024658203125,
0.01039886474609375,
-0.045745849609375,
0.039642333984375,
-0.038299560546875,
0.01363372802734375,
-0.0265350341796875,
-0.01885986328125,
0.0195770263671875,
-0.0013475418090820312,
-0.013671875,
0.037200927734375,
-0.0019102096557617188,
-0.01161956787109375,
0.05029296875,
0.045257568359375,
0.0234527587890625,
0.03680419921875,
-0.02294921875,
-0.0084686279296875,
-0.01800537109375,
0.0291900634765625,
-0.044342041015625,
-0.056793212890625,
0.0019464492797851562,
0.01378631591796875,
0.03802490234375,
-0.0172882080078125,
-0.02520751953125,
-0.06951904296875,
-0.02813720703125,
0.07855224609375,
0.04443359375,
-0.04254150390625,
0.045684814453125,
0.020904541015625,
-0.00594329833984375,
-0.0162200927734375,
-0.04913330078125,
-0.009674072265625,
-0.0191802978515625,
-0.029510498046875,
0.041961669921875,
-0.005779266357421875,
0.0008091926574707031,
-0.00785064697265625,
-0.00966644287109375,
-0.04132080078125,
-0.0282745361328125,
0.0384521484375,
0.0511474609375,
-0.022308349609375,
-0.022857666015625,
0.0017099380493164062,
-0.0239410400390625,
-0.00638580322265625,
0.037933349609375,
0.039581298828125,
0.0096588134765625,
-0.005321502685546875,
-0.01081085205078125,
0.028717041015625,
0.0604248046875,
-0.01702880859375,
0.037078857421875,
0.0266571044921875,
-0.0457763671875,
-0.02001953125,
-0.060516357421875,
-0.019622802734375,
-0.031494140625,
0.036468505859375,
-0.0174713134765625,
-0.0204925537109375,
0.043060302734375,
-0.01016998291015625,
-0.015960693359375,
0.046295166015625,
0.01390838623046875,
0.021728515625,
0.05401611328125,
0.043975830078125,
0.0176849365234375,
0.045867919921875,
-0.044647216796875,
0.0286407470703125,
-0.07489013671875,
-0.03839111328125,
-0.01290130615234375,
-0.0263824462890625,
-0.004398345947265625,
-0.06036376953125,
-0.01103973388671875,
0.039886474609375,
-0.01290130615234375,
0.0579833984375,
-0.024627685546875,
0.05902099609375,
0.040435791015625,
0.0120849609375,
-0.003993988037109375,
-0.0230865478515625,
0.037322998046875,
0.008514404296875,
-0.029937744140625,
-0.0179901123046875,
0.07830810546875,
0.049163818359375,
0.049774169921875,
0.0022449493408203125,
0.0640869140625,
0.0122528076171875,
0.0122222900390625,
-0.00901031494140625,
0.052032470703125,
-0.0196075439453125,
-0.053314208984375,
0.005023956298828125,
0.016845703125,
-0.05517578125,
-0.0160369873046875,
-0.017120361328125,
-0.03729248046875,
0.0192718505859375,
0.0278472900390625,
-0.0282745361328125,
0.03204345703125,
-0.055328369140625,
0.0826416015625,
-0.0020008087158203125,
0.00341796875,
-0.032196044921875,
0.00392913818359375,
0.0635986328125,
-0.008148193359375,
-0.005062103271484375,
-0.01079559326171875,
0.0047149658203125,
0.0787353515625,
-0.0474853515625,
0.00910186767578125,
-0.004726409912109375,
0.00522613525390625,
0.022491455078125,
-0.0153350830078125,
0.04443359375,
0.029327392578125,
0.0241546630859375,
0.0177154541015625,
0.014495849609375,
-0.0252532958984375,
-0.0218658447265625,
0.064208984375,
-0.02520751953125,
-0.01416015625,
-0.0258636474609375,
0.005474090576171875,
0.026580810546875,
-0.0078277587890625,
0.04736328125,
0.0675048828125,
0.000042319297790527344,
0.0293121337890625,
0.030242919921875,
0.0018510818481445312,
0.032562255859375,
0.041015625,
-0.03204345703125,
-0.052825927734375,
0.045196533203125,
-0.00238037109375,
0.006832122802734375,
-0.014434814453125,
-0.004604339599609375,
-0.0164794921875,
0.0007143020629882812,
-0.001316070556640625,
0.026458740234375,
-0.0163116455078125,
-0.058013916015625,
-0.0238800048828125,
-0.046478271484375,
-0.00852203369140625,
-0.022125244140625,
-0.05194091796875,
-0.0031299591064453125,
-0.034027099609375,
-0.03466796875,
0.051055908203125,
0.0665283203125,
-0.0443115234375,
0.05712890625,
-0.07476806640625,
0.0214385986328125,
0.022552490234375,
0.054168701171875,
-0.030975341796875,
-0.054595947265625,
0.00447845458984375,
-0.0224761962890625,
-0.045074462890625,
-0.07586669921875,
0.02667236328125,
-0.015777587890625,
0.038604736328125,
0.04388427734375,
-0.01322174072265625,
0.0665283203125,
-0.04058837890625,
0.0819091796875,
0.039886474609375,
-0.057464599609375,
0.0200042724609375,
-0.0181427001953125,
-0.011474609375,
0.01444244384765625,
0.00616455078125,
-0.0186920166015625,
-0.014923095703125,
-0.10321044921875,
-0.0640869140625,
0.03387451171875,
-0.0024623870849609375,
0.0064697265625,
-0.0089263916015625,
-0.0016584396362304688,
-0.00954437255859375,
-0.0009832382202148438,
-0.0174102783203125,
-0.04632568359375,
-0.006977081298828125,
-0.015106201171875,
0.020721435546875,
-0.0638427734375,
-0.0273895263671875,
-0.03106689453125,
0.041351318359375,
0.018707275390625,
0.05029296875,
-0.0071868896484375,
0.0210418701171875,
-0.0095977783203125,
-0.005977630615234375,
0.07952880859375,
0.0158843994140625,
-0.04547119140625,
0.0083160400390625,
0.006511688232421875,
-0.032745361328125,
-0.00418853759765625,
0.0269012451171875,
0.020477294921875,
0.01056671142578125,
0.036346435546875,
0.0570068359375,
0.0472412109375,
-0.006984710693359375,
0.057769775390625,
-0.00798797607421875,
-0.0282745361328125,
-0.028900146484375,
-0.006595611572265625,
-0.0185394287109375,
0.0298919677734375,
-0.0025005340576171875,
0.024383544921875,
0.019500732421875,
-0.027984619140625,
0.0355224609375,
0.052459716796875,
-0.03564453125,
-0.0269622802734375,
0.037994384765625,
0.04486083984375,
-0.06365966796875,
0.0401611328125,
-0.0023365020751953125,
-0.050750732421875,
0.0673828125,
0.051727294921875,
0.056854248046875,
-0.04345703125,
-0.00403594970703125,
0.048797607421875,
0.0033168792724609375,
-0.031005859375,
0.03277587890625,
0.006183624267578125,
-0.03192138671875,
-0.0023479461669921875,
-0.0266265869140625,
-0.0198516845703125,
0.01363372802734375,
-0.0806884765625,
0.06109619140625,
-0.0271148681640625,
-0.0182342529296875,
-0.020355224609375,
-0.0019893646240234375,
-0.052581787109375,
0.011993408203125,
0.019622802734375,
0.1273193359375,
-0.07977294921875,
0.064697265625,
0.02740478515625,
-0.0171356201171875,
-0.07586669921875,
-0.04571533203125,
-0.020416259765625,
-0.051177978515625,
0.0299530029296875,
0.007541656494140625,
0.04376220703125,
-0.0192413330078125,
-0.038238525390625,
-0.08538818359375,
0.076904296875,
-0.003940582275390625,
-0.0421142578125,
0.016876220703125,
-0.0523681640625,
-0.0006380081176757812,
-0.036956787109375,
0.03253173828125,
0.056488037109375,
0.055816650390625,
0.041961669921875,
-0.06622314453125,
-0.023773193359375,
-0.015350341796875,
-0.009735107421875,
0.04791259765625,
-0.0738525390625,
0.050628662109375,
0.003498077392578125,
0.03057861328125,
0.037628173828125,
0.03192138671875,
0.0093841552734375,
0.006015777587890625,
0.0187225341796875,
0.057098388671875,
0.038909912109375,
-0.007526397705078125,
0.017730712890625,
-0.0142974853515625,
0.035858154296875,
0.05902099609375,
-0.0225982666015625,
0.060699462890625,
0.05975341796875,
-0.0016984939575195312,
0.058135986328125,
0.037078857421875,
-0.050201416015625,
0.02581787109375,
-0.01264190673828125,
-0.00464630126953125,
-0.036651611328125,
0.02001953125,
-0.06951904296875,
0.0294036865234375,
0.0198211669921875,
-0.068359375,
-0.026641845703125,
-0.0150146484375,
-0.00031685829162597656,
0.00335693359375,
-0.03900146484375,
0.038177490234375,
-0.01424407958984375,
-0.043792724609375,
-0.00567626953125,
0.0173492431640625,
0.025970458984375,
-0.059417724609375,
-0.0457763671875,
-0.0156707763671875,
0.027496337890625,
-0.018768310546875,
-0.053741455078125,
0.0377197265625,
-0.0135498046875,
-0.031494140625,
-0.0159759521484375,
0.02569580078125,
-0.043792724609375,
-0.05841064453125,
0.0253448486328125,
-0.03948974609375,
0.01096343994140625,
0.0162811279296875,
-0.022247314453125,
0.049102783203125,
0.002655029296875,
0.005458831787109375,
0.0157928466796875,
0.005077362060546875,
-0.0007538795471191406,
0.0186614990234375,
0.024658203125,
0.01445770263671875,
0.0186920166015625,
-0.04840087890625,
0.053863525390625,
-0.040008544921875,
-0.0484619140625,
-0.041473388671875,
0.048553466796875,
-0.046539306640625,
-0.04425048828125,
0.049713134765625,
0.07733154296875,
0.016876220703125,
-0.029449462890625,
0.01806640625,
0.0126190185546875,
0.032501220703125,
-0.01097869873046875,
0.02685546875,
-0.037567138671875,
0.00914764404296875,
0.00074005126953125,
-0.06610107421875,
-0.0147247314453125,
0.03521728515625,
0.0223236083984375,
-0.0291595458984375,
0.0499267578125,
0.049713134765625,
-0.018768310546875,
0.00189208984375,
0.03106689453125,
0.0118408203125,
0.0201263427734375,
0.0207977294921875,
0.0254364013671875,
-0.053619384765625,
0.0121917724609375,
0.000492095947265625,
-0.04437255859375,
-0.050567626953125,
-0.06549072265625,
-0.06890869140625,
-0.0160064697265625,
-0.0281524658203125,
-0.036468505859375,
-0.038848876953125,
0.07196044921875,
0.073486328125,
-0.035064697265625,
0.006237030029296875,
0.001636505126953125,
0.022735595703125,
0.027313232421875,
-0.0124969482421875,
0.006465911865234375,
0.008453369140625,
-0.032806396484375,
0.01226043701171875,
0.025146484375,
0.034271240234375,
0.005756378173828125,
-0.0220489501953125,
0.031036376953125,
0.042572021484375,
0.02197265625,
0.035614013671875,
-0.04937744140625,
-0.034515380859375,
-0.022430419921875,
0.017425537109375,
-0.00679779052734375,
0.046966552734375,
-0.030731201171875,
-0.01300811767578125,
0.029388427734375,
0.001514434814453125,
0.02996826171875,
0.030517578125,
0.057403564453125,
-0.06060791015625,
0.04449462890625,
-0.01277923583984375,
0.047698974609375,
-0.017425537109375,
-0.00380706787109375,
0.036834716796875,
0.0275726318359375,
-0.044586181640625,
-0.104736328125,
-0.0021228790283203125,
-0.1060791015625,
0.01303863525390625,
0.0986328125,
0.011016845703125,
-0.02490234375,
0.0241546630859375,
-0.056488037109375,
-0.0090179443359375,
-0.02569580078125,
-0.0102386474609375,
0.038848876953125,
0.01016998291015625,
-0.0266571044921875,
-0.005268096923828125,
0.03533935546875,
-0.026641845703125,
-0.054840087890625,
-0.0183258056640625,
0.0012598037719726562,
0.0252838134765625,
0.0257110595703125,
0.01227569580078125,
-0.00432586669921875,
0.034942626953125,
0.0233306884765625,
0.038116455078125,
-0.01293182373046875,
-0.03472900390625,
-0.0174102783203125,
-0.00543212890625,
-0.0010232925415039062,
-0.049713134765625
]
] |
cross-encoder/stsb-TinyBERT-L-4 | 2021-08-05T08:41:47.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"text-classification",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | cross-encoder | null | null | cross-encoder/stsb-TinyBERT-L-4 | 1 | 61,245 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
---
# Cross-Encoder for Semantic Textual Similarity
This model was trained using [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class.
## Training Data
This model was trained on the [STS benchmark dataset](http://ixa2.si.ehu.eus/stswiki/index.php/STSbenchmark). The model predicts a score between 0 and 1 indicating the semantic similarity of two sentences.
## Usage and Performance
Pre-trained models can be used like this:
```python
from sentence_transformers import CrossEncoder

model = CrossEncoder('cross-encoder/stsb-TinyBERT-L-4')
scores = model.predict([('Sentence 1', 'Sentence 2'), ('Sentence 3', 'Sentence 4')])
```
The model will predict scores for the pairs `('Sentence 1', 'Sentence 2')` and `('Sentence 3', 'Sentence 4')`.
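If sentence_transformers is not available, the same scoring can be done with plain Transformers. A minimal sketch, assuming the checkpoint carries a single-label regression head and that a sigmoid maps the raw logit into the 0–1 similarity range (the activation `CrossEncoder` applies by default for single-label models):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "cross-encoder/stsb-TinyBERT-L-4"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

pairs = [("Sentence 1", "Sentence 2"), ("Sentence 3", "Sentence 4")]
# Cross-encoders take both sentences in one input, separated by [SEP].
features = tokenizer(
    [a for a, _ in pairs],
    [b for _, b in pairs],
    padding=True,
    truncation=True,
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**features).logits      # shape: (num_pairs, 1)
scores = torch.sigmoid(logits).squeeze(-1)  # similarity scores in [0, 1]
print(scores.tolist())
```

Note the sigmoid is applied manually here; the raw model output is an unbounded logit.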
You can also use this model without sentence_transformers, relying on the plain Transformers ``AutoModelForSequenceClassification`` class. | 941 | [
[
-0.01690673828125,
-0.060394287109375,
0.01953125,
0.01727294921875,
-0.02838134765625,
-0.002544403076171875,
0.0121612548828125,
-0.0129241943359375,
0.009002685546875,
0.0518798828125,
-0.051177978515625,
-0.031951904296875,
-0.034912109375,
0.033111572265625,
-0.06365966796875,
0.0767822265625,
-0.0112457275390625,
0.0297698974609375,
-0.052154541015625,
-0.0252227783203125,
-0.0271148681640625,
-0.03216552734375,
-0.0362548828125,
-0.018707275390625,
0.00989532470703125,
0.025970458984375,
0.028564453125,
0.00870513916015625,
0.024505615234375,
0.0301513671875,
-0.0011396408081054688,
0.00783538818359375,
-0.0177764892578125,
0.005352020263671875,
-0.023223876953125,
-0.0455322265625,
0.0126953125,
-0.01318359375,
0.0279388427734375,
0.018524169921875,
-0.01026153564453125,
0.035064697265625,
0.0028057098388671875,
0.022064208984375,
-0.010772705078125,
-0.01507568359375,
-0.053009033203125,
0.01477813720703125,
0.0054473876953125,
-0.0208740234375,
-0.00428009033203125,
-0.046478271484375,
-0.00510406494140625,
-0.041900634765625,
0.0286865234375,
0.0009126663208007812,
0.0823974609375,
0.025970458984375,
-0.0367431640625,
-0.01611328125,
-0.0311126708984375,
0.061187744140625,
-0.038787841796875,
0.0136566162109375,
0.027099609375,
0.0240325927734375,
0.006763458251953125,
-0.06268310546875,
-0.07562255859375,
0.008941650390625,
-0.01654052734375,
0.0089569091796875,
-0.032623291015625,
-0.0399169921875,
0.020721435546875,
0.0261993408203125,
-0.0745849609375,
-0.0037899017333984375,
-0.05712890625,
-0.0266571044921875,
0.029296875,
0.033355712890625,
0.0189971923828125,
-0.043304443359375,
-0.056396484375,
-0.01456451416015625,
-0.0093994140625,
0.0186614990234375,
0.0263824462890625,
-0.0028400421142578125,
-0.0074462890625,
0.04461669921875,
-0.0308685302734375,
0.03997802734375,
0.0158843994140625,
0.016754150390625,
0.05511474609375,
-0.039337158203125,
-0.001026153564453125,
-0.019287109375,
0.078857421875,
0.0192108154296875,
0.0362548828125,
0.00426483154296875,
0.00013077259063720703,
0.01297760009765625,
0.031494140625,
-0.044708251953125,
-0.006351470947265625,
0.020111083984375,
-0.0298919677734375,
-0.0318603515625,
0.0103912353515625,
-0.034576416015625,
0.010345458984375,
-0.0157470703125,
0.059051513671875,
-0.038330078125,
0.0010986328125,
0.0277862548828125,
-0.028533935546875,
0.030914306640625,
-0.001468658447265625,
-0.03070068359375,
0.0340576171875,
0.042022705078125,
0.045074462890625,
-0.01184844970703125,
-0.048126220703125,
-0.025634765625,
-0.02459716796875,
0.00904083251953125,
0.0718994140625,
-0.0287322998046875,
-0.01280975341796875,
-0.012176513671875,
0.0218658447265625,
-0.01151275634765625,
-0.03912353515625,
0.045074462890625,
-0.0423583984375,
0.0694580078125,
-0.022430419921875,
-0.050750732421875,
-0.035064697265625,
0.0433349609375,
-0.0560302734375,
0.091064453125,
0.0227203369140625,
-0.06787109375,
0.0123748779296875,
-0.02899169921875,
-0.035308837890625,
0.006252288818359375,
-0.0105438232421875,
-0.05194091796875,
-0.0171051025390625,
0.0209808349609375,
0.01349639892578125,
-0.024078369140625,
0.00921630859375,
-0.006305694580078125,
-0.0300445556640625,
0.0279998779296875,
-0.01971435546875,
0.06048583984375,
0.00330352783203125,
-0.0214691162109375,
0.007778167724609375,
-0.048828125,
0.0284271240234375,
0.014739990234375,
-0.0214385986328125,
-0.0203399658203125,
-0.034942626953125,
0.0216522216796875,
0.0293731689453125,
0.0276031494140625,
-0.05657958984375,
-0.0172271728515625,
-0.0162811279296875,
0.0216217041015625,
0.03546142578125,
0.01375579833984375,
0.013916015625,
-0.0347900390625,
0.068115234375,
0.020599365234375,
0.013214111328125,
0.01139068603515625,
-0.041961669921875,
-0.052642822265625,
0.0240020751953125,
0.0235748291015625,
0.06494140625,
-0.05987548828125,
0.060302734375,
-0.0110015869140625,
-0.04150390625,
-0.058624267578125,
0.0190887451171875,
0.0230865478515625,
0.025146484375,
0.049346923828125,
-0.016693115234375,
-0.05499267578125,
-0.07318115234375,
-0.03826904296875,
0.00246429443359375,
0.00106048583984375,
0.0009717941284179688,
0.06756591796875,
-0.00787353515625,
0.07763671875,
-0.03466796875,
-0.0159912109375,
-0.028167724609375,
0.01110076904296875,
0.0040435791015625,
0.050048828125,
0.033935546875,
-0.07745361328125,
-0.049102783203125,
-0.0280609130859375,
-0.05645751953125,
0.0032024383544921875,
0.0001119375228881836,
-0.01275634765625,
0.006893157958984375,
0.0309600830078125,
-0.054962158203125,
0.030731201171875,
0.031707763671875,
-0.017364501953125,
0.0283203125,
0.0009975433349609375,
0.02490234375,
-0.10723876953125,
-0.000888824462890625,
-0.0104217529296875,
-0.0255126953125,
-0.0297393798828125,
0.00572967529296875,
0.0022716522216796875,
-0.0032711029052734375,
-0.031646728515625,
0.035491943359375,
-0.007396697998046875,
0.00954437255859375,
-0.01473236083984375,
0.00823974609375,
0.0233306884765625,
0.0408935546875,
0.0016155242919921875,
0.0516357421875,
0.035003662109375,
-0.0379638671875,
0.040557861328125,
0.050506591796875,
-0.051727294921875,
0.03521728515625,
-0.07879638671875,
0.0258636474609375,
-0.00336456298828125,
0.02325439453125,
-0.0733642578125,
0.01270294189453125,
0.02752685546875,
-0.0526123046875,
-0.034393310546875,
0.0085906982421875,
-0.04119873046875,
-0.05438232421875,
-0.026947021484375,
0.04815673828125,
0.03619384765625,
-0.046539306640625,
0.04461669921875,
0.01654052734375,
-0.00453948974609375,
-0.053466796875,
-0.0704345703125,
-0.027191162109375,
-0.0023670196533203125,
-0.043853759765625,
0.01113128662109375,
-0.01409149169921875,
0.0202178955078125,
0.019195556640625,
0.0037689208984375,
-0.01568603515625,
-0.00647735595703125,
0.02166748046875,
0.0080413818359375,
-0.023468017578125,
0.01519012451171875,
0.011566162109375,
-0.011322021484375,
0.00778961181640625,
-0.023651123046875,
0.06439208984375,
-0.00850677490234375,
-0.023468017578125,
-0.03375244140625,
0.03472900390625,
0.026763916015625,
-0.022796630859375,
0.051605224609375,
0.049774169921875,
-0.02880859375,
-0.0171051025390625,
-0.048553466796875,
-0.0034122467041015625,
-0.035308837890625,
0.03607177734375,
-0.03326416015625,
-0.07122802734375,
0.03729248046875,
0.0236053466796875,
-0.03271484375,
0.03857421875,
0.031463623046875,
0.0060882568359375,
0.055877685546875,
0.0357666015625,
-0.0146636962890625,
0.0178680419921875,
-0.01499176025390625,
0.015533447265625,
-0.036285400390625,
-0.03338623046875,
-0.0341796875,
-0.0155181884765625,
-0.03692626953125,
-0.04461669921875,
0.012725830078125,
-0.0003342628479003906,
-0.0102691650390625,
0.048370361328125,
-0.04901123046875,
0.05194091796875,
0.05230712890625,
0.01284027099609375,
0.0084686279296875,
0.015960693359375,
0.00714111328125,
0.007076263427734375,
-0.052215576171875,
-0.020111083984375,
0.0751953125,
-0.0021114349365234375,
0.045654296875,
-0.0208740234375,
0.04864501953125,
0.0143280029296875,
-0.017242431640625,
-0.03265380859375,
0.05194091796875,
-0.021697998046875,
-0.042938232421875,
-0.0196990966796875,
-0.04376220703125,
-0.08343505859375,
0.02203369140625,
-0.0120697021484375,
-0.013885498046875,
0.005077362060546875,
-0.0166778564453125,
-0.03924560546875,
0.0191802978515625,
-0.021697998046875,
0.08660888671875,
-0.0233306884765625,
0.004138946533203125,
-0.0254058837890625,
-0.038604736328125,
0.020782470703125,
-0.0133056640625,
-0.006046295166015625,
0.002536773681640625,
-0.002223968505859375,
0.06402587890625,
-0.0276641845703125,
0.044677734375,
0.0059967041015625,
0.014556884765625,
0.0283966064453125,
-0.013275146484375,
0.0085601806640625,
-0.003658294677734375,
-0.0036258697509765625,
0.016143798828125,
0.00783538818359375,
-0.031036376953125,
-0.022979736328125,
0.057708740234375,
-0.07061767578125,
-0.0272979736328125,
-0.02703857421875,
-0.02642822265625,
0.0079193115234375,
0.01323699951171875,
0.04541015625,
0.022979736328125,
-0.0301055908203125,
0.004322052001953125,
0.0286712646484375,
-0.022796630859375,
0.007080078125,
0.0452880859375,
-0.010711669921875,
-0.041900634765625,
0.041778564453125,
-0.003238677978515625,
0.0129852294921875,
0.04376220703125,
0.00433349609375,
-0.0286865234375,
-0.013702392578125,
-0.01222991943359375,
0.01290130615234375,
-0.057830810546875,
-0.029632568359375,
-0.076904296875,
-0.04559326171875,
-0.042633056640625,
0.02276611328125,
-0.00875091552734375,
-0.044036865234375,
-0.022796630859375,
-0.0193939208984375,
0.05438232421875,
0.04400634765625,
-0.00960540771484375,
0.015106201171875,
-0.06329345703125,
0.051300048828125,
0.0308990478515625,
0.00830078125,
-0.005863189697265625,
-0.061187744140625,
-0.0107574462890625,
-0.006046295166015625,
-0.020263671875,
-0.04156494140625,
0.03265380859375,
-0.004634857177734375,
0.0404052734375,
0.0010156631469726562,
0.004108428955078125,
0.039306640625,
-0.0244140625,
0.054656982421875,
0.01207733154296875,
-0.073974609375,
0.0438232421875,
0.012420654296875,
0.03387451171875,
0.0555419921875,
0.054107666015625,
-0.045440673828125,
-0.0267486572265625,
-0.0457763671875,
-0.05645751953125,
0.054412841796875,
0.020294189453125,
0.0308990478515625,
-0.00708770751953125,
0.01343536376953125,
0.0374755859375,
0.00931549072265625,
-0.07659912109375,
-0.0171051025390625,
-0.038330078125,
-0.031219482421875,
-0.0013303756713867188,
-0.011077880859375,
0.01027679443359375,
-0.037872314453125,
0.04779052734375,
0.00907135009765625,
-0.00396728515625,
0.014678955078125,
-0.018798828125,
-0.006526947021484375,
0.02313232421875,
0.0214080810546875,
0.022003173828125,
-0.0017642974853515625,
-0.016815185546875,
0.0301971435546875,
-0.0249786376953125,
0.0012922286987304688,
0.01947021484375,
-0.0278472900390625,
0.0219268798828125,
0.0217742919921875,
0.06488037109375,
0.007259368896484375,
-0.03662109375,
0.050445556640625,
-0.01308441162109375,
-0.0227813720703125,
-0.0633544921875,
-0.00391387939453125,
0.0050506591796875,
0.0341796875,
0.00943756103515625,
0.0175628662109375,
0.00632476806640625,
-0.03594970703125,
0.0293426513671875,
0.0138092041015625,
-0.0228271484375,
-0.01422119140625,
0.050079345703125,
0.00804901123046875,
-0.046844482421875,
0.05889892578125,
-0.0107269287109375,
-0.084228515625,
0.05072021484375,
0.021881103515625,
0.05572509765625,
-0.006366729736328125,
0.0276641845703125,
0.050811767578125,
0.01525115966796875,
-0.022186279296875,
0.05072021484375,
-0.01146697998046875,
-0.07257080078125,
0.0028820037841796875,
-0.042938232421875,
-0.0252685546875,
0.02105712890625,
-0.07586669921875,
0.0250244140625,
-0.0223236083984375,
-0.00931549072265625,
-0.0006489753723144531,
-0.01032257080078125,
-0.060394287109375,
0.0189971923828125,
-0.0015716552734375,
0.058135986328125,
-0.0687255859375,
0.055206298828125,
0.044281005859375,
-0.06109619140625,
-0.054534912109375,
-0.003376007080078125,
-0.01136016845703125,
-0.06390380859375,
0.049591064453125,
0.0269622802734375,
0.00714111328125,
-0.01145172119140625,
-0.04345703125,
-0.05889892578125,
0.07757568359375,
-0.01222991943359375,
-0.03656005859375,
-0.0007839202880859375,
0.03765869140625,
0.047393798828125,
-0.0235137939453125,
0.027862548828125,
0.036956787109375,
0.0175323486328125,
-0.017242431640625,
-0.06365966796875,
0.0109100341796875,
-0.051361083984375,
-0.01494598388671875,
0.003223419189453125,
-0.04608154296875,
0.08843994140625,
0.0015411376953125,
-0.0029735565185546875,
0.045135498046875,
0.037322998046875,
0.0270538330078125,
0.023468017578125,
0.046875,
0.055419921875,
0.0423583984375,
0.0063934326171875,
0.0740966796875,
-0.0195465087890625,
0.034820556640625,
0.08966064453125,
-0.0241851806640625,
0.07159423828125,
0.0229949951171875,
-0.01073455810546875,
0.07098388671875,
0.025848388671875,
-0.0457763671875,
0.03448486328125,
0.017791748046875,
0.01258087158203125,
-0.033905029296875,
0.01470947265625,
-0.04144287109375,
0.039306640625,
-0.005779266357421875,
-0.024688720703125,
-0.0032176971435546875,
0.0168609619140625,
-0.028472900390625,
0.0311737060546875,
0.0007848739624023438,
0.043487548828125,
0.0016584396362304688,
-0.051971435546875,
0.036468505859375,
-0.006927490234375,
0.0665283203125,
-0.0582275390625,
0.0017499923706054688,
-0.023223876953125,
0.025909423828125,
-0.0022563934326171875,
-0.07183837890625,
0.0137176513671875,
-0.00244140625,
-0.0303192138671875,
-0.0005316734313964844,
0.059112548828125,
-0.050750732421875,
-0.052093505859375,
0.04315185546875,
0.046295166015625,
0.0165557861328125,
-0.01419830322265625,
-0.07666015625,
-0.0217437744140625,
0.004543304443359375,
0.002410888671875,
-0.0009059906005859375,
0.043609619140625,
0.006195068359375,
0.0498046875,
0.04425048828125,
-0.019561767578125,
0.01202392578125,
0.0304412841796875,
0.059417724609375,
-0.076171875,
-0.04815673828125,
-0.04022216796875,
0.01435089111328125,
-0.0247802734375,
-0.048675537109375,
0.06890869140625,
0.07122802734375,
0.0780029296875,
-0.02716064453125,
0.054840087890625,
0.013763427734375,
0.0689697265625,
-0.028564453125,
0.046173095703125,
-0.053009033203125,
0.00937652587890625,
-0.0012769699096679688,
-0.04534912109375,
0.01043701171875,
0.03857421875,
-0.018280029296875,
0.00382232666015625,
0.07159423828125,
0.07391357421875,
-0.011962890625,
0.0189208984375,
0.0019512176513671875,
0.0125579833984375,
-0.027191162109375,
0.057220458984375,
0.08154296875,
-0.0682373046875,
0.07562255859375,
-0.0224609375,
0.032135009765625,
0.00478363037109375,
-0.00936126708984375,
-0.07501220703125,
-0.038665771484375,
-0.0328369140625,
-0.034210205078125,
-0.00841522216796875,
0.046783447265625,
0.03460693359375,
-0.0831298828125,
-0.005649566650390625,
0.004711151123046875,
0.01044464111328125,
-0.0128936767578125,
-0.020233154296875,
0.0168304443359375,
0.0011310577392578125,
-0.042938232421875,
0.00830841064453125,
-0.004558563232421875,
-0.0034580230712890625,
-0.004932403564453125,
0.0024013519287109375,
-0.037506103515625,
0.01348114013671875,
0.0244903564453125,
0.00881195068359375,
-0.052825927734375,
-0.02191162109375,
0.0003452301025390625,
-0.0295867919921875,
0.0027217864990234375,
0.03985595703125,
-0.080322265625,
-0.00255584716796875,
0.059661865234375,
0.036285400390625,
0.0509033203125,
0.017578125,
0.0290985107421875,
-0.033111572265625,
0.002399444580078125,
0.017303466796875,
0.0276031494140625,
0.038970947265625,
-0.020172119140625,
0.039642333984375,
0.025238037109375,
-0.03228759765625,
-0.04541015625,
-0.00266265869140625,
-0.08258056640625,
-0.0290985107421875,
0.07977294921875,
-0.005664825439453125,
-0.01947021484375,
-0.01308441162109375,
-0.00998687744140625,
0.02288818359375,
-0.0142364501953125,
0.053985595703125,
0.032073974609375,
0.013824462890625,
-0.008697509765625,
-0.005649566650390625,
0.00861358642578125,
0.0556640625,
-0.069580078125,
-0.03515625,
0.003910064697265625,
0.06439208984375,
0.0128173828125,
0.03326416015625,
-0.005191802978515625,
0.037933349609375,
0.01143646240234375,
0.0077056884765625,
-0.0023326873779296875,
0.01348114013671875,
-0.0303497314453125,
0.01383209228515625,
-0.04815673828125,
-0.044189453125
]
] |
Lykon/AnyLoRA | 2023-06-07T22:45:12.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"art",
"artistic",
"en",
"license:other",
"has_space",
"region:us"
] | text-to-image | Lykon | null | null | Lykon/AnyLoRA | 30 | 61,212 | diffusers | 2023-03-23T01:08:25 | ---
language:
- en
license: other
tags:
- stable-diffusion
- text-to-image
- art
- artistic
- diffusers
inference: false
---
For info: https://civitai.com/models/23900/anylora
For info on AAM: https://civitai.com/models/84586/aam-anylora-anime-mix-anime-screencap-style-model | 276 | [
[
-0.023834228515625,
-0.03387451171875,
0.0311431884765625,
0.01459503173828125,
-0.0097198486328125,
-0.0127105712890625,
0.0400390625,
-0.0256195068359375,
0.0533447265625,
0.04571533203125,
-0.045654296875,
-0.029541015625,
-0.0046539306640625,
-0.021820068359375,
-0.027099609375,
0.0249481201171875,
0.0026760101318359375,
0.0357666015625,
0.016082763671875,
-0.01404571533203125,
-0.04766845703125,
-0.03167724609375,
-0.08740234375,
-0.0066070556640625,
0.04083251953125,
0.059814453125,
0.055145263671875,
0.045989990234375,
0.031280517578125,
0.0192413330078125,
0.0031375885009765625,
0.00897979736328125,
-0.01256561279296875,
0.008270263671875,
0.0028781890869140625,
-0.054107666015625,
-0.07415771484375,
0.0135650634765625,
0.050445556640625,
0.0252838134765625,
-0.0088043212890625,
0.03466796875,
-0.0279388427734375,
0.034271240234375,
-0.06640625,
0.019989013671875,
-0.00377655029296875,
0.007099151611328125,
-0.01552581787109375,
-0.01544952392578125,
-0.00724029541015625,
-0.0122528076171875,
-0.01971435546875,
-0.06353759765625,
-0.0025196075439453125,
0.0202789306640625,
0.06988525390625,
0.01003265380859375,
-0.04510498046875,
-0.01275634765625,
-0.0677490234375,
0.02264404296875,
-0.03143310546875,
0.028656005859375,
0.0221099853515625,
0.0484619140625,
-0.005023956298828125,
-0.02239990234375,
-0.020538330078125,
0.015533447265625,
0.0030002593994140625,
0.01453399658203125,
-0.03271484375,
0.0158538818359375,
-0.0010766983032226562,
0.034881591796875,
-0.051483154296875,
-0.01543426513671875,
-0.049346923828125,
0.0052490234375,
0.050994873046875,
0.0149383544921875,
0.047027587890625,
-0.0025196075439453125,
-0.035736083984375,
-0.01007843017578125,
-0.04681396484375,
-0.0229644775390625,
0.0318603515625,
-0.0165863037109375,
-0.0230560302734375,
0.03497314453125,
0.002838134765625,
0.062408447265625,
-0.002323150634765625,
0.00687408447265625,
-0.0049591064453125,
-0.0284271240234375,
-0.0297088623046875,
0.00548553466796875,
0.013763427734375,
0.046966552734375,
0.002269744873046875,
0.01312255859375,
0.00482940673828125,
-0.01262664794921875,
0.01184844970703125,
-0.08062744140625,
-0.040496826171875,
0.01505279541015625,
-0.0650634765625,
0.0013446807861328125,
0.017242431640625,
-0.03521728515625,
-0.0160369873046875,
0.0008378028869628906,
0.033905029296875,
-0.026580810546875,
-0.04180908203125,
-0.0087890625,
-0.01116943359375,
0.018310546875,
0.01995849609375,
-0.043487548828125,
0.0213775634765625,
0.0005211830139160156,
0.05078125,
-0.0096282958984375,
-0.00786590576171875,
0.0008378028869628906,
0.00045180320739746094,
-0.029205322265625,
0.03863525390625,
-0.0131072998046875,
-0.038055419921875,
0.01285552978515625,
0.00894927978515625,
-0.0294952392578125,
-0.047088623046875,
0.033538818359375,
-0.0655517578125,
-0.02703857421875,
0.004131317138671875,
-0.034942626953125,
-0.059112548828125,
0.0044708251953125,
-0.068359375,
0.01727294921875,
-0.00506591796875,
-0.03863525390625,
0.0149688720703125,
-0.06365966796875,
0.0018453598022460938,
0.01116943359375,
0.0060577392578125,
-0.005619049072265625,
0.0019330978393554688,
-0.03106689453125,
0.0014438629150390625,
0.014892578125,
-0.0164947509765625,
-0.03558349609375,
-0.00982666015625,
0.039031982421875,
-0.04205322265625,
0.06317138671875,
0.06451416015625,
0.025360107421875,
0.035064697265625,
-0.08660888671875,
-0.014923095703125,
0.021270751953125,
0.00806427001953125,
-0.0002624988555908203,
-0.02667236328125,
0.0179595947265625,
0.0275726318359375,
0.035919189453125,
-0.010467529296875,
0.001514434814453125,
0.033538818359375,
0.00469207763671875,
0.0157470703125,
0.034088134765625,
0.00791168212890625,
-0.05926513671875,
0.07647705078125,
0.002132415771484375,
0.061065673828125,
-0.02947998046875,
-0.042449951171875,
-0.10015869140625,
-0.015533447265625,
0.01418304443359375,
0.038665771484375,
-0.042083740234375,
0.02642822265625,
0.043304443359375,
-0.0606689453125,
-0.05401611328125,
0.002368927001953125,
0.0213775634765625,
0.013702392578125,
0.0123291015625,
-0.046417236328125,
-0.058837890625,
-0.07452392578125,
0.0045928955078125,
-0.0022525787353515625,
0.004009246826171875,
0.03045654296875,
0.02099609375,
0.0025424957275390625,
0.0494384765625,
-0.02899169921875,
-0.03338623046875,
-0.01035308837890625,
-0.008636474609375,
0.03338623046875,
0.0251922607421875,
0.07568359375,
-0.057159423828125,
-0.04876708984375,
-0.028533935546875,
-0.043914794921875,
-0.0093841552734375,
0.0251312255859375,
-0.0264892578125,
-0.00705718994140625,
0.0216064453125,
-0.0303802490234375,
0.06610107421875,
0.04644775390625,
-0.045257568359375,
0.020233154296875,
-0.03131103515625,
0.0501708984375,
-0.0789794921875,
0.035308837890625,
-0.0018301010131835938,
-0.035736083984375,
-0.04681396484375,
0.016357421875,
0.00811004638671875,
-0.0027446746826171875,
-0.072509765625,
0.03759765625,
-0.0565185546875,
0.0007100105285644531,
-0.036834716796875,
-0.007144927978515625,
-0.0025844573974609375,
0.0302581787109375,
0.0124053955078125,
0.045013427734375,
0.03802490234375,
-0.046234130859375,
0.016204833984375,
0.0279388427734375,
-0.03314208984375,
0.0290374755859375,
-0.040802001953125,
0.006130218505859375,
-0.0015306472778320312,
0.017730712890625,
-0.07232666015625,
-0.02325439453125,
0.0263824462890625,
-0.0190277099609375,
0.001987457275390625,
-0.030303955078125,
-0.049102783203125,
-0.003040313720703125,
-0.0146484375,
0.06396484375,
0.0161590576171875,
-0.05859375,
0.036529541015625,
0.0316162109375,
-0.0169525146484375,
0.00809478759765625,
-0.0258026123046875,
-0.0024585723876953125,
-0.0341796875,
-0.030059814453125,
0.03558349609375,
-0.0186309814453125,
-0.034088134765625,
-0.0189971923828125,
0.00948333740234375,
-0.040771484375,
-0.006725311279296875,
0.0343017578125,
0.054168701171875,
-0.0224456787109375,
-0.01093292236328125,
-0.024444580078125,
-0.0008268356323242188,
-0.00513458251953125,
0.0389404296875,
0.061492919921875,
-0.01220703125,
-0.00899505615234375,
-0.05413818359375,
0.040191650390625,
0.041412353515625,
0.0023174285888671875,
0.051544189453125,
0.0202178955078125,
-0.048980712890625,
0.01454925537109375,
-0.060455322265625,
-0.0008096694946289062,
-0.0318603515625,
0.006023406982421875,
-0.0186004638671875,
0.0016431808471679688,
0.049346923828125,
0.0299530029296875,
-0.018341064453125,
0.07550048828125,
0.0265960693359375,
0.00894927978515625,
0.1102294921875,
0.04779052734375,
0.004886627197265625,
0.05413818359375,
-0.039031982421875,
-0.0303955078125,
-0.050201416015625,
-0.02313232421875,
-0.0200042724609375,
-0.0244140625,
0.00045418739318847656,
-0.0268096923828125,
0.0106353759765625,
0.0285797119140625,
-0.055877685546875,
0.024810791015625,
-0.01441192626953125,
0.029754638671875,
0.036376953125,
0.007129669189453125,
0.059417724609375,
-0.025665283203125,
0.0206756591796875,
0.021087646484375,
-0.055816650390625,
-0.033477783203125,
0.05975341796875,
0.0124359130859375,
0.061798095703125,
0.026397705078125,
-0.000347137451171875,
0.01513671875,
0.023529052734375,
-0.0294647216796875,
0.0133209228515625,
-0.0174407958984375,
-0.08843994140625,
0.030303955078125,
-0.0010738372802734375,
-0.05438232421875,
0.0236053466796875,
-0.02996826171875,
-0.023468017578125,
0.04644775390625,
0.0293121337890625,
-0.023773193359375,
0.035797119140625,
-0.057403564453125,
0.048187255859375,
-0.037200927734375,
0.027069091796875,
-0.00977325439453125,
-0.049346923828125,
0.035858154296875,
0.004413604736328125,
0.037200927734375,
0.003009796142578125,
0.007289886474609375,
0.0528564453125,
-0.06207275390625,
0.05975341796875,
0.007171630859375,
-0.0015354156494140625,
0.02264404296875,
-0.00994873046875,
-0.01444244384765625,
0.025970458984375,
-0.00020587444305419922,
-0.0026035308837890625,
-0.005462646484375,
-0.0094757080078125,
-0.005962371826171875,
0.0458984375,
-0.06591796875,
-0.0151824951171875,
-0.04486083984375,
-0.026611328125,
0.0031795501708984375,
0.0175323486328125,
0.071044921875,
0.0379638671875,
-0.0274658203125,
-0.0013790130615234375,
0.039031982421875,
0.01337432861328125,
0.041473388671875,
0.042266845703125,
-0.060150146484375,
-0.05517578125,
0.043121337890625,
0.00992584228515625,
0.024871826171875,
0.01456451416015625,
-0.0081024169921875,
-0.01036834716796875,
-0.01100921630859375,
-0.0333251953125,
0.01849365234375,
-0.05010986328125,
-0.01873779296875,
-0.0178070068359375,
-0.046234130859375,
-0.01557159423828125,
-0.0005927085876464844,
-0.0531005859375,
-0.0452880859375,
-0.044677734375,
0.01148223876953125,
0.060943603515625,
0.0738525390625,
0.018829345703125,
0.047943115234375,
-0.039306640625,
0.01904296875,
0.0081329345703125,
0.021240234375,
0.007965087890625,
-0.0411376953125,
0.003692626953125,
0.00621795654296875,
-0.06585693359375,
-0.07171630859375,
0.048919677734375,
0.023468017578125,
0.030975341796875,
0.06719970703125,
-0.020416259765625,
0.052001953125,
-0.0218658447265625,
0.059051513671875,
0.051422119140625,
-0.0400390625,
0.035064697265625,
-0.0051116943359375,
0.02789306640625,
0.05181884765625,
0.03656005859375,
-0.0242156982421875,
-0.0025424957275390625,
-0.06622314453125,
-0.06536865234375,
0.0296478271484375,
0.0206146240234375,
-0.00754547119140625,
0.02984619140625,
0.03094482421875,
0.01727294921875,
0.0272979736328125,
-0.050384521484375,
-0.07598876953125,
-0.0178985595703125,
0.0025844573974609375,
-0.0138092041015625,
-0.00551605224609375,
-0.003894805908203125,
-0.03009033203125,
0.04974365234375,
0.01371002197265625,
0.0002130270004272461,
0.0224151611328125,
-0.018798828125,
-0.01203155517578125,
0.01390838623046875,
0.039398193359375,
0.05596923828125,
-0.042144775390625,
0.00482177734375,
-0.007720947265625,
-0.03314208984375,
-0.0025730133056640625,
0.01264190673828125,
-0.02154541015625,
0.02154541015625,
0.0038604736328125,
0.08154296875,
0.012451171875,
-0.03314208984375,
0.029754638671875,
-0.017608642578125,
-0.00453948974609375,
-0.05047607421875,
0.02374267578125,
0.0012407302856445312,
0.029083251953125,
0.0377197265625,
-0.0033435821533203125,
0.039276123046875,
-0.07171630859375,
-0.00911712646484375,
0.007732391357421875,
-0.0256805419921875,
-0.0256195068359375,
0.08514404296875,
0.0081329345703125,
-0.04034423828125,
0.049102783203125,
-0.0035037994384765625,
-0.015899658203125,
0.055755615234375,
0.052001953125,
0.0552978515625,
-0.058380126953125,
0.01093292236328125,
0.0141143798828125,
-0.015655517578125,
0.0031452178955078125,
0.03741455078125,
0.00989532470703125,
-0.034088134765625,
0.01522064208984375,
-0.0146331787109375,
-0.055755615234375,
0.0183258056640625,
-0.05145263671875,
0.033233642578125,
-0.038238525390625,
-0.023345947265625,
-0.00902557373046875,
0.003787994384765625,
-0.04315185546875,
0.05419921875,
0.0213470458984375,
0.09674072265625,
-0.0533447265625,
0.06463623046875,
0.05328369140625,
-0.06298828125,
-0.049285888671875,
-0.033721923828125,
-0.0006532669067382812,
-0.051239013671875,
0.00659942626953125,
0.010650634765625,
0.00208282470703125,
-0.0189666748046875,
-0.04095458984375,
-0.068603515625,
0.09881591796875,
0.0243988037109375,
-0.037811279296875,
-0.002758026123046875,
-0.0362548828125,
0.0220794677734375,
-0.07403564453125,
0.03704833984375,
0.032470703125,
0.0303955078125,
0.031097412109375,
-0.0430908203125,
-0.01465606689453125,
-0.053497314453125,
0.01084136962890625,
-0.005390167236328125,
-0.08526611328125,
0.056396484375,
0.005054473876953125,
-0.00690460205078125,
0.047393798828125,
0.04779052734375,
0.039337158203125,
0.0219573974609375,
0.048370361328125,
0.018585205078125,
0.02960205078125,
0.0191497802734375,
0.080810546875,
-0.0211639404296875,
0.00447845458984375,
0.08270263671875,
-0.01230621337890625,
0.0653076171875,
0.0241851806640625,
-0.00800323486328125,
0.0293121337890625,
0.07958984375,
-0.028106689453125,
0.040435791015625,
-0.0012760162353515625,
-0.02899169921875,
0.003582000732421875,
-0.01323699951171875,
-0.032623291015625,
0.0229644775390625,
0.0220947265625,
-0.052703857421875,
0.004611968994140625,
-0.007663726806640625,
-0.0030803680419921875,
-0.0252838134765625,
-0.055999755859375,
0.0361328125,
0.016632080078125,
-0.047607421875,
0.040557861328125,
-0.005260467529296875,
0.041259765625,
-0.04901123046875,
-0.01214599609375,
-0.0143585205078125,
0.0035190582275390625,
-0.023895263671875,
-0.0196075439453125,
0.01282501220703125,
-0.016754150390625,
-0.0202178955078125,
-0.0008573532104492188,
0.0621337890625,
-0.0443115234375,
-0.037353515625,
0.00534820556640625,
-0.004604339599609375,
0.02764892578125,
0.01104736328125,
-0.057830810546875,
0.0024547576904296875,
-0.01898193359375,
-0.0092010498046875,
0.004268646240234375,
-0.049713134765625,
0.003635406494140625,
0.050384521484375,
0.011077880859375,
0.0246734619140625,
0.0213775634765625,
-0.00019681453704833984,
0.0090179443359375,
-0.034423828125,
-0.0355224609375,
-0.037261962890625,
0.0146942138671875,
-0.0248260498046875,
-0.052581787109375,
0.0653076171875,
0.07000732421875,
0.042022705078125,
-0.039031982421875,
0.04071044921875,
0.0241851806640625,
0.0045928955078125,
-0.046844482421875,
0.09136962890625,
-0.0467529296875,
-0.002025604248046875,
0.016815185546875,
-0.09197998046875,
-0.0243988037109375,
0.052215576171875,
0.03704833984375,
0.0316162109375,
0.03131103515625,
0.030792236328125,
-0.014373779296875,
0.01404571533203125,
0.0161895751953125,
0.0245208740234375,
-0.00014781951904296875,
-0.0007653236389160156,
0.045928955078125,
-0.0340576171875,
0.0244140625,
-0.0305938720703125,
-0.024993896484375,
-0.043548583984375,
-0.06170654296875,
-0.046844482421875,
-0.021087646484375,
-0.043731689453125,
-0.0124969482421875,
-0.00926971435546875,
0.052581787109375,
0.0819091796875,
-0.050537109375,
-0.042755126953125,
0.01143646240234375,
-0.00621795654296875,
0.0022487640380859375,
-0.0168304443359375,
0.032501220703125,
0.0257110595703125,
-0.0650634765625,
0.031219482421875,
0.0133209228515625,
0.021514892578125,
-0.022216796875,
0.021759033203125,
-0.0272216796875,
0.044586181640625,
0.012451171875,
0.048126220703125,
-0.0226898193359375,
-0.004070281982421875,
-0.0286102294921875,
-0.004367828369140625,
0.01453399658203125,
0.046844482421875,
0.00983428955078125,
0.02093505859375,
0.036041259765625,
-0.0188751220703125,
0.038482666015625,
0.006130218505859375,
0.06085205078125,
-0.03485107421875,
0.00417327880859375,
-0.0168304443359375,
0.06756591796875,
0.0015420913696289062,
-0.02679443359375,
0.045928955078125,
-0.00408172607421875,
-0.044830322265625,
-0.03753662109375,
0.0171356201171875,
-0.10528564453125,
-0.010894775390625,
0.080078125,
-0.005191802978515625,
-0.0240325927734375,
0.00809478759765625,
-0.03997802734375,
0.032806396484375,
-0.018310546875,
0.033233642578125,
0.052764892578125,
-0.016693115234375,
-0.018096923828125,
-0.04901123046875,
0.021331787109375,
0.004184722900390625,
-0.0601806640625,
-0.045867919921875,
0.0229034423828125,
0.039398193359375,
0.01456451416015625,
0.037261962890625,
-0.048309326171875,
0.042388916015625,
-0.0184173583984375,
0.032562255859375,
0.01313018798828125,
-0.0272979736328125,
-0.0269012451171875,
0.004974365234375,
0.01558685302734375,
-0.04779052734375
]
] |
TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ | 2023-09-27T13:00:46.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"dataset:shahules786/orca-chat",
"dataset:rombodawg/MegaCodeTraining112k",
"dataset:theblackcat102/evol-codealpaca-v1",
"dataset:nickrosh/Evol-Instruct-Code-80k-v1",
"license:other",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ | 21 | 60,978 | transformers | 2023-08-02T11:32:29 | ---
license: other
datasets:
- shahules786/orca-chat
- rombodawg/MegaCodeTraining112k
- theblackcat102/evol-codealpaca-v1
- nickrosh/Evol-Instruct-Code-80k-v1
model_name: Llama2 13B Orca v2 8K
inference: false
model_creator: OpenAssistant
model_link: https://huggingface.co/OpenAssistant/llama2-13b-orca-v2-8k-3166
model_type: llama
quantized_by: TheBloke
base_model: OpenAssistant/llama2-13b-orca-v2-8k-3166
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Llama2 13B Orca v2 8K - GPTQ
- Model creator: [OpenAssistant](https://huggingface.co/OpenAssistant)
- Original model: [Llama2 13B Orca v2 8K](https://huggingface.co/OpenAssistant/llama2-13b-orca-v2-8k-3166)
## Description
This repo contains GPTQ model files for [OpenAssistant's Llama2 13B Orca v2 8K](https://huggingface.co/OpenAssistant/llama2-13b-orca-v2-8k-3166).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
## Repositories available
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGML models for CPU+GPU inference](https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GGML)
* [OpenAssistant's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/OpenAssistant/llama2-13b-orca-v2-8k-3166)
## Prompt template: OpenAssistant
```
<|prompter|>{prompt}<|endoftext|><|assistant|>
```
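For reference, the template can be filled in programmatically before tokenisation. This is a minimal sketch; the `format_prompt` helper is hypothetical, not part of any library:

```python
def format_prompt(prompt: str) -> str:
    # Wrap a user message in the OpenAssistant special tokens
    # expected by this model.
    return f"<|prompter|>{prompt}<|endoftext|><|assistant|>"

print(format_prompt("Tell me about AI"))
```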
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All GPTQ files are made with AutoGPTQ.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have issues with models that use Act Order plus Group Size.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. The choice of calibration dataset can affect quantisation accuracy; note that it is not the same as the dataset used to train the model.
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only affects the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ/tree/main) | 4 | 128 | No | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1) | 8192 | 7.26 GB | Yes | Most compatible option. Good inference speed in AutoGPTQ and GPTQ-for-LLaMa. Lower inference quality than other options. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1) | 8192 | 8.00 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. Poor AutoGPTQ CUDA speed. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1) | 8192 | 7.51 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. Poor AutoGPTQ CUDA speed. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1) | 8192 | 7.26 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. Poor AutoGPTQ CUDA speed. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1) | 8192 | 13.36 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements and to improve AutoGPTQ speed. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [Evol Instruct Code](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1) | 8192 | 13.65 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. Poor AutoGPTQ CUDA speed. |
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, e.g. `TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ:gptq-4bit-32g-actorder_True`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see below.
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to set GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
## How to use this GPTQ model from Python code
First make sure you have [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ) 0.3.1 or later installed:
```
pip3 install auto-gptq
```
If you have problems installing AutoGPTQ, please build from source instead:
```
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip3 install .
```
Then try the following example code:
```python
from transformers import AutoTokenizer, pipeline, logging
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
model_name_or_path = "TheBloke/OpenAssistant-Llama2-13B-Orca-v2-8K-3166-GPTQ"
use_triton = False
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(model_name_or_path,
use_safetensors=True,
trust_remote_code=False,
device="cuda:0",
use_triton=use_triton,
quantize_config=None)
"""
# To download from a specific branch, use the revision parameter, as in this example:
# Note that `revision` requires AutoGPTQ 0.3.1 or later!
model = AutoGPTQForCausalLM.from_quantized(model_name_or_path,
revision="gptq-4bit-32g-actorder_True",
use_safetensors=True,
trust_remote_code=False,
device="cuda:0",
quantize_config=None)
"""
prompt = "Tell me about AI"
prompt_template = f'''<|prompter|>{prompt}<|endoftext|><|assistant|>
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
# Prevent printing spurious transformers error when using pipeline with AutoGPTQ
logging.set_verbosity(logging.CRITICAL)
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
temperature=0.7,
top_p=0.95,
repetition_penalty=1.15
)
print(pipe(prompt_template)[0]['generated_text'])
```
## Compatibility
The files provided will work with AutoGPTQ (CUDA and Triton modes), GPTQ-for-LLaMa (only CUDA has been tested), and Occ4m's GPTQ-for-LLaMa fork.
ExLlama works with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility.
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute.
Thanks to the [chirper.ai](https://chirper.ai) team!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Sam, theTransient, Jonathan Leane, Steven Wood, webtim, Johann-Peter Hartmann, Geoffrey Montalvo, Gabriel Tamborski, Willem Michiel, John Villwock, Derek Yates, Mesiah Bishop, Eugene Pentland, Pieter, Chadd, Stephen Murray, Daniel P. Andersen, terasurfer, Brandon Frisco, Thomas Belote, Sid, Nathan LeClaire, Magnesian, Alps Aficionado, Stanislav Ovsiannikov, Alex, Joseph William Delisle, Nikolai Manek, Michael Davis, Junyu Yang, K, J, Spencer Kim, Stefan Sabev, Olusegun Samson, transmissions 11, Michael Levine, Cory Kujawski, Rainer Wilmers, zynix, Kalila, Luke @flexchar, Ajan Kanaga, Mandus, vamX, Ai Maven, Mano Prime, Matthew Berman, subjectnull, Vitor Caleffi, Clay Pascal, biorpg, alfie_i, 阿明, Jeffrey Morgan, ya boyyy, Raymond Fosdick, knownsqashed, Olakabola, Leonard Tan, ReadyPlayerEmma, Enrico Ros, Dave, Talal Aujan, Illia Dulskyi, Sean Connelly, senxiiz, Artur Olbinski, Elle, Raven Klaugh, Fen Risland, Deep Realms, Imad Khwaja, Fred von Graf, Will Dee, usrbinkat, SuperWojo, Alexandros Triantafyllidis, Swaroop Kallakuri, Dan Guido, John Detwiler, Pedro Madruga, Iucharbius, Viktor Bowallius, Asp the Wyvern, Edmond Seymore, Trenton Dambrowitz, Space Cruiser, Spiking Neurons AB, Pyrater, LangChain4j, Tony Hughes, Kacper Wikieł, Rishabh Srivastava, David Ziegler, Luke Pendergrass, Andrey, Gabriel Puliatti, Lone Striker, Sebastain Graf, Pierre Kircher, Randy H, NimbleBox.ai, Vadim, danny, Deo Leter
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: OpenAssistant's Llama2 13B Orca v2 8K
- wandb: [jlhr5cf2](https://wandb.ai/open-assistant/supervised-finetuning/runs/jlhr5cf2)
- sampling-report: [2023-07-31_OpenAssistant_llama2-13b-orca-v2-8k-3166_sampling_llama2_prompt.json](https://open-assistant.github.io/oasst-model-eval/?f=https%3A%2F%2Fraw.githubusercontent.com%2FOpen-Assistant%2Foasst-model-eval%2Fmain%2Fsampling_reports%2Foasst-pretrained%2F2023-07-31_OpenAssistant_llama2-13b-orca-v2-8k-3166_sampling_llama2_prompt.json)
## Model Configuration
```
llama2-13b-orca-v2-8k:
rng_seed: 0xe1291f21
show_dataset_stats: true
random_offset_probability: 0.0
use_custom_sampler: true
sort_by_length: false
dtype: fp16
log_dir: /mnt/data/ikka/data_cache/llama2_13b_orcav2_logs
output_dir: /mnt/data/ikka/data_cache/llama2_13b_orcav2
learning_rate: 1e-5
model_name: conceptofmind/LLongMA-2-13b
deepspeed_config: configs/zero_config_pretrain.json
weight_decay: 0.000001
max_length: 8192
warmup_steps: 100
peft_model: false
use_flash_attention: true
gradient_checkpointing: true
gradient_accumulation_steps: 4
per_device_train_batch_size: 2
per_device_eval_batch_size: 1
residual_dropout: 0.0
eval_steps: 200
save_steps: 200
num_train_epochs: 1
save_total_limit: 4
superhot: false
superhot_config:
type: linear
scaling_factor: 2
datasets:
- orca-chat: # shahules786/orca-chat
data_files: orca-chat-gpt4-8k.json
max_val_set: 5000
val_split: 0.1
- evol-codealpaca-v1: # theblackcat102/evol-codealpaca-v1
fill_min_length: 20000
val_split: 0.1
- megacode: # rombodawg/MegaCodeTraining112k
fill_min_length: 24000
val_split: 0.1
max_val_set: 1000
- evol_instruct_code: # nickrosh/Evol-Instruct-Code-80k-v1
fill_min_length: 24000
val_split: 0.1
max_val_set: 1000
```
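As a quick sanity check on the configuration above, the effective per-device batch size follows from `per_device_train_batch_size` times `gradient_accumulation_steps`. This is a sketch of that arithmetic only; the true global batch also scales with the number of GPUs, which the config does not state:

```python
# Values taken from the model configuration above.
per_device_train_batch_size = 2
gradient_accumulation_steps = 4
max_length = 8192

# Sequences contributing to each optimizer step, per device.
effective_batch_per_device = per_device_train_batch_size * gradient_accumulation_steps

# Upper bound on tokens per optimizer step per device,
# assuming every sequence is packed to the full max_length.
tokens_per_step_per_device = effective_batch_per_device * max_length

print(effective_batch_per_device)   # 8
print(tokens_per_step_per_device)   # 65536
```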
| 15,563 | [
[
-0.035614013671875,
-0.06854248046875,
0.01009368896484375,
0.00679779052734375,
-0.0216522216796875,
-0.01204681396484375,
0.01210784912109375,
-0.044525146484375,
0.0226898193359375,
0.0278472900390625,
-0.04046630859375,
-0.046478271484375,
-0.024871826171875,
-0.00702667236328125,
-0.019439697265625,
0.0736083984375,
0.006134033203125,
-0.015625,
0.00695037841796875,
-0.0216522216796875,
-0.0175628662109375,
-0.028076171875,
-0.07037353515625,
-0.01137542724609375,
0.0260009765625,
0.01023101806640625,
0.060791015625,
0.047149658203125,
0.01422882080078125,
0.0232391357421875,
-0.0110321044921875,
-0.0016021728515625,
-0.040802001953125,
-0.00472259521484375,
0.006687164306640625,
-0.0174102783203125,
-0.05828857421875,
0.0034008026123046875,
0.032989501953125,
0.010772705078125,
-0.0299224853515625,
0.0207366943359375,
0.0023326873779296875,
0.045166015625,
-0.036163330078125,
0.022491455078125,
-0.02001953125,
-0.0011463165283203125,
-0.0230865478515625,
0.01422119140625,
0.0000661015510559082,
-0.04547119140625,
0.006561279296875,
-0.0631103515625,
0.00490570068359375,
0.0059051513671875,
0.08721923828125,
0.01340484619140625,
-0.03521728515625,
-0.0011005401611328125,
-0.022064208984375,
0.042510986328125,
-0.0762939453125,
0.0168914794921875,
0.0205841064453125,
0.01172637939453125,
-0.022491455078125,
-0.064453125,
-0.040863037109375,
-0.0036373138427734375,
-0.005664825439453125,
0.022857666015625,
-0.025909423828125,
0.004261016845703125,
0.0362548828125,
0.05572509765625,
-0.06402587890625,
-0.0094757080078125,
-0.025299072265625,
-0.006134033203125,
0.0643310546875,
0.018524169921875,
0.03173828125,
-0.0198822021484375,
-0.0228271484375,
-0.0355224609375,
-0.057891845703125,
0.016998291015625,
0.036285400390625,
0.0048828125,
-0.04827880859375,
0.034881591796875,
-0.022003173828125,
0.0369873046875,
0.01184844970703125,
-0.007495880126953125,
0.0233612060546875,
-0.041778564453125,
-0.038360595703125,
-0.0208587646484375,
0.091064453125,
0.01202392578125,
-0.0238037109375,
0.0264434814453125,
-0.0006847381591796875,
-0.00969696044921875,
-0.002323150634765625,
-0.0770263671875,
-0.035125732421875,
0.04547119140625,
-0.041290283203125,
-0.01454925537109375,
-0.00200653076171875,
-0.0662841796875,
-0.00702667236328125,
-0.0006642341613769531,
0.0289459228515625,
-0.0452880859375,
-0.0276031494140625,
0.0153961181640625,
-0.012847900390625,
0.0433349609375,
0.0260467529296875,
-0.0582275390625,
0.036376953125,
0.0204315185546875,
0.05548095703125,
0.01064300537109375,
-0.005435943603515625,
-0.0256500244140625,
-0.00562286376953125,
-0.00611114501953125,
0.03961181640625,
-0.00994873046875,
-0.0340576171875,
-0.02301025390625,
0.0198822021484375,
-0.0098419189453125,
-0.0155029296875,
0.05059814453125,
-0.0233612060546875,
0.0213470458984375,
-0.0421142578125,
-0.0247039794921875,
-0.033660888671875,
-0.0035858154296875,
-0.049652099609375,
0.09271240234375,
0.03704833984375,
-0.06475830078125,
0.01291656494140625,
-0.0570068359375,
-0.01105499267578125,
0.001895904541015625,
-0.002460479736328125,
-0.04608154296875,
-0.0154876708984375,
0.016510009765625,
0.0165863037109375,
-0.023284912109375,
0.004581451416015625,
-0.0282440185546875,
-0.00934600830078125,
0.0016603469848632812,
-0.0195159912109375,
0.113037109375,
0.025909423828125,
-0.0300750732421875,
0.003326416015625,
-0.045562744140625,
0.0073089599609375,
0.04241943359375,
-0.0143585205078125,
-0.0031909942626953125,
-0.02239990234375,
0.004730224609375,
0.0033512115478515625,
0.0203704833984375,
-0.039398193359375,
0.03857421875,
-0.01959228515625,
0.0645751953125,
0.0450439453125,
-0.0023651123046875,
0.0177154541015625,
-0.0274200439453125,
0.04302978515625,
-0.00013840198516845703,
0.043304443359375,
0.0093536376953125,
-0.057403564453125,
-0.045989990234375,
-0.0204925537109375,
0.0361328125,
0.03594970703125,
-0.047943115234375,
0.0311126708984375,
-0.00525665283203125,
-0.056396484375,
-0.034698486328125,
-0.01256561279296875,
0.026153564453125,
0.031524658203125,
0.0290985107421875,
-0.047607421875,
-0.02764892578125,
-0.05108642578125,
0.014251708984375,
-0.02923583984375,
0.005199432373046875,
0.0428466796875,
0.06414794921875,
-0.005985260009765625,
0.055877685546875,
-0.053497314453125,
-0.0179443359375,
0.0006847381591796875,
0.00330352783203125,
0.021331787109375,
0.04620361328125,
0.061187744140625,
-0.053131103515625,
-0.03924560546875,
-0.00846099853515625,
-0.05633544921875,
-0.004207611083984375,
0.003734588623046875,
-0.03607177734375,
0.0218505859375,
0.00907135009765625,
-0.08184814453125,
0.0526123046875,
0.036865234375,
-0.04290771484375,
0.0582275390625,
-0.022979736328125,
0.00551605224609375,
-0.0740966796875,
0.016021728515625,
0.006900787353515625,
-0.0218048095703125,
-0.031097412109375,
0.029296875,
-0.003082275390625,
0.00830841064453125,
-0.034820556640625,
0.047149658203125,
-0.0404052734375,
0.0017080307006835938,
0.0169677734375,
0.00005233287811279297,
0.0174560546875,
0.039520263671875,
-0.01181793212890625,
0.059234619140625,
0.039520263671875,
-0.0296630859375,
0.042694091796875,
0.044647216796875,
-0.008270263671875,
0.023284912109375,
-0.06463623046875,
0.019012451171875,
0.005046844482421875,
0.035430908203125,
-0.0738525390625,
-0.0258941650390625,
0.050750732421875,
-0.0361328125,
0.033935546875,
-0.02197265625,
-0.0232086181640625,
-0.031707763671875,
-0.04656982421875,
0.03466796875,
0.04669189453125,
-0.033050537109375,
0.045318603515625,
0.037841796875,
-0.004985809326171875,
-0.036468505859375,
-0.04852294921875,
-0.0136871337890625,
-0.02099609375,
-0.05401611328125,
0.038787841796875,
-0.00728607177734375,
-0.0040740966796875,
0.00279998779296875,
0.00250244140625,
0.005916595458984375,
-0.00896453857421875,
0.0275115966796875,
0.0260162353515625,
-0.0188446044921875,
-0.0232391357421875,
0.0140228271484375,
0.004100799560546875,
0.002864837646484375,
-0.022064208984375,
0.038360595703125,
-0.021881103515625,
-0.00620269775390625,
-0.01983642578125,
0.01421356201171875,
0.034912109375,
-0.01013946533203125,
0.057373046875,
0.051116943359375,
-0.0207977294921875,
0.006984710693359375,
-0.028778076171875,
-0.004123687744140625,
-0.035125732421875,
0.0018520355224609375,
-0.017303466796875,
-0.045806884765625,
0.04791259765625,
0.032928466796875,
0.0159912109375,
0.058441162109375,
0.0367431640625,
0.006999969482421875,
0.061614990234375,
0.030792236328125,
-0.01290130615234375,
0.03387451171875,
-0.05242919921875,
-0.0086669921875,
-0.068359375,
-0.0265655517578125,
-0.036376953125,
-0.014129638671875,
-0.05352783203125,
-0.02301025390625,
0.0289764404296875,
0.0130615234375,
-0.058197021484375,
0.0543212890625,
-0.0533447265625,
0.00714111328125,
0.0501708984375,
0.0278472900390625,
0.0174560546875,
-0.00299072265625,
-0.00722503662109375,
0.0164794921875,
-0.046051025390625,
-0.0240936279296875,
0.08319091796875,
0.0291595458984375,
0.053192138671875,
0.0212554931640625,
0.03436279296875,
0.00994110107421875,
0.0136871337890625,
-0.0361328125,
0.039703369140625,
0.0025997161865234375,
-0.0460205078125,
-0.027740478515625,
-0.043487548828125,
-0.0765380859375,
0.007480621337890625,
-0.00457763671875,
-0.060791015625,
0.03875732421875,
0.0030765533447265625,
-0.045501708984375,
0.0223846435546875,
-0.052947998046875,
0.080078125,
-0.007236480712890625,
-0.02447509765625,
0.0114898681640625,
-0.056732177734375,
0.033447265625,
0.01447296142578125,
-0.007030487060546875,
-0.0078125,
-0.011199951171875,
0.057037353515625,
-0.070068359375,
0.05072021484375,
-0.015716552734375,
-0.0052490234375,
0.045745849609375,
-0.00576019287109375,
0.043426513671875,
0.0154571533203125,
-0.0022792816162109375,
0.03607177734375,
0.0268096923828125,
-0.033599853515625,
-0.022003173828125,
0.042755126953125,
-0.07037353515625,
-0.032318115234375,
-0.037109375,
-0.0290374755859375,
0.01425933837890625,
0.0036869049072265625,
0.03857421875,
0.0289764404296875,
-0.00411224365234375,
-0.005352020263671875,
0.053802490234375,
-0.033355712890625,
0.031951904296875,
0.021240234375,
-0.0218048095703125,
-0.04437255859375,
0.0625,
0.00971221923828125,
0.01540374755859375,
0.0126190185546875,
0.0030078887939453125,
-0.03814697265625,
-0.041290283203125,
-0.053497314453125,
0.0283660888671875,
-0.031280517578125,
-0.0330810546875,
-0.051605224609375,
-0.03204345703125,
-0.036529541015625,
0.026092529296875,
-0.0236968994140625,
-0.04248046875,
-0.038726806640625,
0.0037670135498046875,
0.06378173828125,
0.035400390625,
-0.013336181640625,
0.02178955078125,
-0.0474853515625,
0.0234527587890625,
0.0250701904296875,
0.01280975341796875,
0.005229949951171875,
-0.055694580078125,
-0.0007214546203613281,
0.01503753662109375,
-0.056732177734375,
-0.070068359375,
0.059112548828125,
0.01036834716796875,
0.0251617431640625,
0.0299835205078125,
0.0149383544921875,
0.059783935546875,
-0.00897216796875,
0.07159423828125,
0.00296783447265625,
-0.07269287109375,
0.036895751953125,
-0.04248046875,
0.01447296142578125,
0.02862548828125,
0.03875732421875,
-0.01514434814453125,
-0.017974853515625,
-0.05657958984375,
-0.062347412109375,
0.045562744140625,
0.0350341796875,
-0.0080108642578125,
0.01355743408203125,
0.041595458984375,
0.004428863525390625,
0.0107879638671875,
-0.060577392578125,
-0.045867919921875,
-0.03173828125,
-0.0024814605712890625,
0.017425537109375,
-0.004703521728515625,
-0.01213836669921875,
-0.03900146484375,
0.072998046875,
-0.007537841796875,
0.054840087890625,
0.0274200439453125,
0.0116424560546875,
-0.01091766357421875,
0.005153656005859375,
0.031585693359375,
0.038665771484375,
-0.0282440185546875,
-0.024932861328125,
0.00844573974609375,
-0.0648193359375,
-0.0012226104736328125,
0.03253173828125,
-0.02020263671875,
-0.016510009765625,
-0.0010786056518554688,
0.04998779296875,
-0.0198516845703125,
-0.025115966796875,
0.028167724609375,
-0.01983642578125,
-0.024658203125,
-0.0223236083984375,
0.012725830078125,
0.01605224609375,
0.02960205078125,
0.02069091796875,
-0.01374053955078125,
0.02435302734375,
-0.04888916015625,
-0.004741668701171875,
0.0400390625,
-0.0095977783203125,
-0.0252532958984375,
0.0701904296875,
-0.004444122314453125,
0.006343841552734375,
0.0555419921875,
-0.0265655517578125,
-0.028778076171875,
0.063720703125,
0.024017333984375,
0.06005859375,
-0.0192718505859375,
0.015869140625,
0.04461669921875,
0.01215362548828125,
-0.0061187744140625,
0.041595458984375,
0.01010894775390625,
-0.039794921875,
-0.021881103515625,
-0.0462646484375,
-0.02685546875,
0.02740478515625,
-0.056610107421875,
0.0162200927734375,
-0.04010009765625,
-0.0283355712890625,
-0.00534820556640625,
0.02569580078125,
-0.03802490234375,
0.011566162109375,
-0.0004203319549560547,
0.0775146484375,
-0.05181884765625,
0.063232421875,
0.0384521484375,
-0.043609619140625,
-0.0767822265625,
-0.0246124267578125,
0.005313873291015625,
-0.046051025390625,
0.01904296875,
-0.008087158203125,
0.02386474609375,
0.00354766845703125,
-0.0582275390625,
-0.0648193359375,
0.10845947265625,
0.031005859375,
-0.036224365234375,
-0.01200103759765625,
-0.0009112358093261719,
0.0252227783203125,
-0.019500732421875,
0.058135986328125,
0.04840087890625,
0.0273284912109375,
0.0149993896484375,
-0.0689697265625,
0.0404052734375,
-0.023834228515625,
0.0007343292236328125,
0.0170440673828125,
-0.0845947265625,
0.08551025390625,
-0.00630950927734375,
-0.0110015869140625,
0.0299530029296875,
0.047149658203125,
0.0281219482421875,
0.0024433135986328125,
0.0204315185546875,
0.06549072265625,
0.05413818359375,
-0.027740478515625,
0.08184814453125,
-0.014556884765625,
0.0474853515625,
0.0589599609375,
-0.00022161006927490234,
0.055999755859375,
0.0087127685546875,
-0.047210693359375,
0.045806884765625,
0.076904296875,
-0.01087188720703125,
0.0289764404296875,
0.0029468536376953125,
-0.0232696533203125,
0.00038743019104003906,
0.0016565322875976562,
-0.056610107421875,
0.00968170166015625,
0.0290985107421875,
-0.0233001708984375,
-0.005504608154296875,
-0.0205841064453125,
-0.0004284381866455078,
-0.04656982421875,
-0.01708984375,
0.03704833984375,
0.0258331298828125,
-0.0172119140625,
0.06591796875,
-0.0005311965942382812,
0.03936767578125,
-0.040557861328125,
-0.0133209228515625,
-0.030975341796875,
-0.010162353515625,
-0.03021240234375,
-0.052490234375,
0.01421356201171875,
-0.018585205078125,
0.0006680488586425781,
-0.0021762847900390625,
0.048858642578125,
-0.01451873779296875,
-0.0162811279296875,
0.01995849609375,
0.0309906005859375,
0.0298614501953125,
-0.01395416259765625,
-0.08453369140625,
0.01971435546875,
0.004566192626953125,
-0.04638671875,
0.037139892578125,
0.0294036865234375,
0.00839996337890625,
0.052520751953125,
0.045623779296875,
-0.00128936767578125,
-0.0004649162292480469,
-0.019500732421875,
0.07733154296875,
-0.055572509765625,
-0.0152740478515625,
-0.059234619140625,
0.037933349609375,
-0.005428314208984375,
-0.033935546875,
0.05865478515625,
0.0477294921875,
0.057769775390625,
-0.0006718635559082031,
0.045745849609375,
-0.04150390625,
0.01371002197265625,
-0.0299835205078125,
0.06304931640625,
-0.048797607421875,
0.0155487060546875,
-0.0269317626953125,
-0.0615234375,
0.0022430419921875,
0.0501708984375,
-0.01776123046875,
0.0217437744140625,
0.038848876953125,
0.068359375,
-0.004917144775390625,
0.0157470703125,
0.0080718994140625,
0.036834716796875,
0.010528564453125,
0.0701904296875,
0.0433349609375,
-0.07598876953125,
0.04296875,
-0.021728515625,
-0.0181884765625,
-0.0044708251953125,
-0.06671142578125,
-0.058074951171875,
-0.0263824462890625,
-0.04351806640625,
-0.05548095703125,
0.00215911865234375,
0.06512451171875,
0.05828857421875,
-0.0533447265625,
-0.0240936279296875,
-0.0131988525390625,
0.00531005859375,
-0.0182647705078125,
-0.022308349609375,
0.03680419921875,
0.0235443115234375,
-0.056610107421875,
0.00585174560546875,
0.004581451416015625,
0.022491455078125,
-0.01355743408203125,
-0.01499176025390625,
-0.0163726806640625,
0.00043511390686035156,
0.039031982421875,
0.06005859375,
-0.04107666015625,
-0.0020656585693359375,
-0.01418304443359375,
-0.0093536376953125,
0.02630615234375,
0.0163726806640625,
-0.056854248046875,
-0.0012216567993164062,
0.03900146484375,
0.006511688232421875,
0.059722900390625,
0.0091705322265625,
0.0246429443359375,
-0.01519012451171875,
0.00255584716796875,
0.0038509368896484375,
0.031524658203125,
0.0007834434509277344,
-0.040771484375,
0.05548095703125,
0.0240631103515625,
-0.05126953125,
-0.051116943359375,
-0.01282501220703125,
-0.09539794921875,
-0.01043701171875,
0.07904052734375,
-0.02392578125,
-0.03692626953125,
-0.006015777587890625,
-0.0272064208984375,
0.01971435546875,
-0.0572509765625,
0.03338623046875,
0.02392578125,
-0.016143798828125,
-0.018707275390625,
-0.0567626953125,
0.04193115234375,
0.01209259033203125,
-0.0731201171875,
-0.004535675048828125,
0.0294036865234375,
0.0281829833984375,
0.006183624267578125,
0.05755615234375,
-0.023712158203125,
0.0247802734375,
0.01010894775390625,
0.00763702392578125,
-0.01007843017578125,
0.01058197021484375,
-0.0284576416015625,
0.0021724700927734375,
-0.02105712890625,
-0.0036716461181640625
]
] |
Helsinki-NLP/opus-mt-tc-big-en-ro | 2023-10-10T10:46:29.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"marian",
"text2text-generation",
"translation",
"opus-mt-tc",
"en",
"ro",
"license:cc-by-4.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-tc-big-en-ro | 2 | 60,062 | transformers | 2022-04-13T14:55:54 | ---
language:
- en
- ro
tags:
- translation
- opus-mt-tc
license: cc-by-4.0
model-index:
- name: opus-mt-tc-big-en-ro
results:
- task:
name: Translation eng-ron
type: translation
args: eng-ron
dataset:
name: flores101-devtest
type: flores_101
args: eng ron devtest
metrics:
- name: BLEU
type: bleu
value: 40.4
- task:
name: Translation eng-ron
type: translation
args: eng-ron
dataset:
name: newsdev2016
type: newsdev2016
args: eng-ron
metrics:
- name: BLEU
type: bleu
value: 36.4
- task:
name: Translation eng-ron
type: translation
args: eng-ron
dataset:
name: tatoeba-test-v2021-08-07
type: tatoeba_mt
args: eng-ron
metrics:
- name: BLEU
type: bleu
value: 48.6
- task:
name: Translation eng-ron
type: translation
args: eng-ron
dataset:
name: newstest2016
type: wmt-2016-news
args: eng-ron
metrics:
- name: BLEU
type: bleu
value: 34.0
---
# opus-mt-tc-big-en-ro
Neural machine translation model for translating from English (en) to Romanian (ro).
This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models are originally trained using the amazing framework of [Marian NMT](https://marian-nmt.github.io/), an efficient NMT implementation written in pure C++. The models have been converted to PyTorch using the Hugging Face `transformers` library. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines use the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).
* Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please cite them if you use this model.)
```bibtex
@inproceedings{tiedemann-thottingal-2020-opus,
title = "{OPUS}-{MT} {--} Building open translation services for the World",
author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh},
booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
month = nov,
year = "2020",
address = "Lisboa, Portugal",
publisher = "European Association for Machine Translation",
url = "https://aclanthology.org/2020.eamt-1.61",
pages = "479--480",
}
@inproceedings{tiedemann-2020-tatoeba,
title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
author = {Tiedemann, J{\"o}rg},
booktitle = "Proceedings of the Fifth Conference on Machine Translation",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.wmt-1.139",
pages = "1174--1182",
}
```
## Model info
* Release: 2022-02-25
* source language(s): eng
* target language(s): ron
* model: transformer-big
* data: opusTCv20210807+bt ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge))
* tokenization: SentencePiece (spm32k,spm32k)
* original model: [opusTCv20210807+bt_transformer-big_2022-02-25.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ron/opusTCv20210807+bt_transformer-big_2022-02-25.zip)
* more information on released models: [OPUS-MT eng-ron README](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-ron/README.md)
## Usage
A short code example:
```python
from transformers import MarianMTModel, MarianTokenizer
src_text = [
">>ron<< A bad writer's prose is full of hackneyed phrases.",
">>ron<< Zero is a special number."
]
model_name = "Helsinki-NLP/opus-mt-tc-big-en-ro"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))
for t in translated:
    print(tokenizer.decode(t, skip_special_tokens=True))
# expected output:
# Proza unui scriitor prost este plină de fraze tocite.
# Zero este un număr special.
```
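The `>>ron<<` prefix in the source sentences above is the OPUS-MT target-language token. A tiny helper for adding it (purely illustrative; this helper is not part of the `transformers` API):

```python
def with_target_token(sentences, lang="ron"):
    """Prefix each source sentence with the OPUS-MT target-language token."""
    return [f">>{lang}<< {s}" for s in sentences]

print(with_target_token(["Zero is a special number."]))
# → ['>>ron<< Zero is a special number.']
```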
You can also use OPUS-MT models with the transformers pipelines, for example:
```python
from transformers import pipeline
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-en-ro")
print(pipe(">>ron<< A bad writer's prose is full of hackneyed phrases."))
# expected output: Proza unui scriitor prost este plină de fraze tocite.
```
## Benchmarks
* test set translations: [opusTCv20210807+bt_transformer-big_2022-02-25.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ron/opusTCv20210807+bt_transformer-big_2022-02-25.test.txt)
* test set scores: [opusTCv20210807+bt_transformer-big_2022-02-25.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ron/opusTCv20210807+bt_transformer-big_2022-02-25.eval.txt)
* benchmark results: [benchmark_results.txt](benchmark_results.txt)
* benchmark output: [benchmark_translations.zip](benchmark_translations.zip)
| langpair | testset | chr-F | BLEU | #sent | #words |
|----------|---------|-------|-------|-------|--------|
| eng-ron | tatoeba-test-v2021-08-07 | 0.68606 | 48.6 | 5508 | 40367 |
| eng-ron | flores101-devtest | 0.64876 | 40.4 | 1012 | 26799 |
| eng-ron | newsdev2016 | 0.62682 | 36.4 | 1999 | 51300 |
| eng-ron | newstest2016 | 0.60702 | 34.0 | 1999 | 48945 |
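The chr-F column above is the character n-gram F-score (chrF). As a rough, simplified sketch of what that metric measures (the published scores come from the OPUS-MT evaluation pipeline, whose exact configuration may differ):

```python
from collections import Counter

def char_ngrams(text: str, n: int) -> Counter:
    """Character n-grams of a string, with whitespace removed."""
    s = text.replace(" ", "")
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))

def chrf(hypothesis: str, reference: str, max_n: int = 6, beta: float = 2.0) -> float:
    """Simplified sentence-level chrF in [0, 1]: average n-gram precision/recall
    for n = 1..max_n, combined with an F-beta score (beta = 2 weights recall)."""
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        overlap = sum((hyp & ref).values())
        precisions.append(overlap / max(sum(hyp.values()), 1))
        recalls.append(overlap / max(sum(ref.values()), 1))
    p, r = sum(precisions) / max_n, sum(recalls) / max_n
    if p + r == 0:
        return 0.0
    return (1 + beta**2) * p * r / (beta**2 * p + r)

print(chrf("Zero este un număr special.", "Zero este un număr special."))  # identical strings → 1.0
```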
## Acknowledgements
The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
## Model conversion info
* transformers version: 4.16.2
* OPUS-MT git hash: 3405783
* port time: Wed Apr 13 17:55:46 EEST 2022
* port machine: LM0-400-22516.local
| 6,578 | [
[
-0.0225830078125,
-0.04412841796875,
0.019134521484375,
0.0246124267578125,
-0.034912109375,
-0.0201568603515625,
-0.040435791015625,
-0.0231170654296875,
0.012847900390625,
0.02777099609375,
-0.032318115234375,
-0.05078125,
-0.04248046875,
0.028289794921875,
-0.0159454345703125,
0.0640869140625,
-0.0251007080078125,
0.0236968994140625,
0.01727294921875,
-0.023345947265625,
-0.01641845703125,
-0.03472900390625,
-0.0246734619140625,
-0.028961181640625,
0.0245819091796875,
0.0098419189453125,
0.03338623046875,
0.04486083984375,
0.045501708984375,
0.0261993408203125,
-0.01617431640625,
0.016510009765625,
-0.0137481689453125,
-0.006862640380859375,
0.0014467239379882812,
-0.03765869140625,
-0.042022705078125,
-0.00823211669921875,
0.0657958984375,
0.040771484375,
0.008575439453125,
0.0263214111328125,
0.0059661865234375,
0.041046142578125,
-0.00823974609375,
0.01165008544921875,
-0.050262451171875,
0.0060882568359375,
-0.0219879150390625,
-0.026153564453125,
-0.047332763671875,
-0.00952911376953125,
0.002162933349609375,
-0.04156494140625,
0.005550384521484375,
0.006542205810546875,
0.09747314453125,
0.0179901123046875,
-0.021087646484375,
-0.0079803466796875,
-0.05731201171875,
0.07611083984375,
-0.057525634765625,
0.04693603515625,
0.0131683349609375,
-0.0004978179931640625,
-0.01085662841796875,
-0.05108642578125,
-0.04486083984375,
-0.00537109375,
-0.01438140869140625,
0.0238037109375,
-0.027740478515625,
-0.01157379150390625,
0.0189361572265625,
0.040130615234375,
-0.049591064453125,
0.0037899017333984375,
-0.03607177734375,
-0.01154327392578125,
0.032379150390625,
0.0015134811401367188,
0.01412200927734375,
-0.02703857421875,
-0.03369140625,
-0.03277587890625,
-0.0533447265625,
0.0043182373046875,
0.0232696533203125,
0.026611328125,
-0.039398193359375,
0.052398681640625,
0.0001690387725830078,
0.053741455078125,
0.0003829002380371094,
-0.0010175704956054688,
0.0509033203125,
-0.03997802734375,
-0.0153350830078125,
-0.015106201171875,
0.093017578125,
0.0185699462890625,
0.01050567626953125,
-0.01885986328125,
-0.01323699951171875,
-0.00913238525390625,
-0.0139007568359375,
-0.0628662109375,
0.011871337890625,
0.0204010009765625,
-0.03021240234375,
-0.0050201416015625,
-0.003665924072265625,
-0.044464111328125,
0.01470184326171875,
-0.02264404296875,
0.039581298828125,
-0.0516357421875,
-0.0290069580078125,
0.0214385986328125,
0.0026874542236328125,
0.023345947265625,
0.00318145751953125,
-0.04248046875,
-0.0023555755615234375,
0.033203125,
0.0693359375,
-0.00794219970703125,
-0.037689208984375,
-0.0360107421875,
-0.009185791015625,
-0.0177764892578125,
0.03851318359375,
-0.0099334716796875,
-0.0199432373046875,
-0.01378631591796875,
0.021575927734375,
-0.018524169921875,
-0.02294921875,
0.061737060546875,
-0.0236968994140625,
0.0406494140625,
-0.01117706298828125,
-0.02667236328125,
-0.0167694091796875,
0.016876220703125,
-0.0312347412109375,
0.0892333984375,
0.00717926025390625,
-0.06817626953125,
0.00650787353515625,
-0.047393798828125,
-0.0200042724609375,
-0.005191802978515625,
0.01061248779296875,
-0.035491943359375,
0.00948333740234375,
0.01580810546875,
0.03515625,
-0.047332763671875,
0.032135009765625,
0.004634857177734375,
-0.01268768310546875,
0.01485443115234375,
-0.041046142578125,
0.0802001953125,
0.0241546630859375,
-0.03765869140625,
0.0116729736328125,
-0.055389404296875,
-0.000050067901611328125,
0.0090179443359375,
-0.034912109375,
-0.014892578125,
-0.008209228515625,
0.01114654541015625,
0.023193359375,
0.0159759521484375,
-0.047576904296875,
0.013336181640625,
-0.04803466796875,
0.036590576171875,
0.0513916015625,
-0.020782470703125,
0.026123046875,
-0.0167694091796875,
0.033050537109375,
0.01399993896484375,
-0.006557464599609375,
-0.01427459716796875,
-0.045562744140625,
-0.0736083984375,
-0.0218658447265625,
0.048309326171875,
0.0467529296875,
-0.082763671875,
0.055877685546875,
-0.057647705078125,
-0.061737060546875,
-0.057952880859375,
-0.01983642578125,
0.043426513671875,
0.0259857177734375,
0.0506591796875,
-0.01227569580078125,
-0.03948974609375,
-0.0615234375,
-0.0221710205078125,
-0.01442718505859375,
-0.00872802734375,
0.00864410400390625,
0.050811767578125,
-0.0175628662109375,
0.051605224609375,
-0.01617431640625,
-0.0288543701171875,
-0.022125244140625,
0.01108551025390625,
0.043914794921875,
0.0550537109375,
0.03118896484375,
-0.0550537109375,
-0.049652099609375,
0.027130126953125,
-0.05322265625,
-0.005252838134765625,
-0.005733489990234375,
-0.017547607421875,
0.0229339599609375,
0.01457977294921875,
-0.047882080078125,
0.00928497314453125,
0.051513671875,
-0.0338134765625,
0.03338623046875,
-0.0189208984375,
0.01171875,
-0.11297607421875,
0.0184326171875,
-0.003673553466796875,
-0.0114593505859375,
-0.04791259765625,
0.00724029541015625,
0.006969451904296875,
0.007457733154296875,
-0.048583984375,
0.047821044921875,
-0.043609619140625,
0.0020885467529296875,
0.0266571044921875,
0.001682281494140625,
-0.003742218017578125,
0.05804443359375,
0.00464630126953125,
0.0623779296875,
0.041961669921875,
-0.03973388671875,
0.016937255859375,
0.034210205078125,
-0.0233917236328125,
0.021514892578125,
-0.056488037109375,
0.0030002593994140625,
0.0112762451171875,
0.007099151611328125,
-0.045745849609375,
0.0107574462890625,
0.0316162109375,
-0.054962158203125,
0.03375244140625,
-0.02459716796875,
-0.05499267578125,
-0.0236358642578125,
-0.00925445556640625,
0.029327392578125,
0.038726806640625,
-0.032958984375,
0.055938720703125,
0.005405426025390625,
0.0029506683349609375,
-0.046478271484375,
-0.068115234375,
0.006763458251953125,
-0.0218353271484375,
-0.060211181640625,
0.0328369140625,
-0.009429931640625,
0.0014524459838867188,
0.001163482666015625,
0.00838470458984375,
0.0046539306640625,
-0.0012636184692382812,
-0.0036258697509765625,
0.01251220703125,
-0.033294677734375,
0.0014362335205078125,
-0.0042724609375,
-0.026885986328125,
-0.01366424560546875,
-0.04583740234375,
0.06317138671875,
-0.0380859375,
-0.01007843017578125,
-0.050994873046875,
0.0152740478515625,
0.056793212890625,
-0.043243408203125,
0.07080078125,
0.05352783203125,
-0.018341064453125,
0.0105743408203125,
-0.029693603515625,
-0.0024852752685546875,
-0.03326416015625,
0.0323486328125,
-0.04486083984375,
-0.049407958984375,
0.05096435546875,
0.016876220703125,
0.018951416015625,
0.06622314453125,
0.05712890625,
0.0164794921875,
0.0634765625,
0.02471923828125,
0.007762908935546875,
0.0276031494140625,
-0.054779052734375,
0.0131072998046875,
-0.06951904296875,
-0.00730133056640625,
-0.0543212890625,
-0.0135345458984375,
-0.06182861328125,
-0.04901123046875,
0.02276611328125,
-0.0006351470947265625,
-0.0035305023193359375,
0.055511474609375,
-0.0404052734375,
0.00772857666015625,
0.0391845703125,
-0.01275634765625,
0.027130126953125,
0.01605224609375,
-0.03765869140625,
-0.0246734619140625,
-0.048370361328125,
-0.040191650390625,
0.0841064453125,
0.032196044921875,
0.0251312255859375,
0.01163482666015625,
0.0452880859375,
-0.0126953125,
0.023834228515625,
-0.048431396484375,
0.03912353515625,
-0.02349853515625,
-0.04443359375,
-0.015167236328125,
-0.0518798828125,
-0.07275390625,
0.044464111328125,
-0.01302337646484375,
-0.046478271484375,
0.01444244384765625,
-0.0016355514526367188,
-0.003360748291015625,
0.047149658203125,
-0.053924560546875,
0.073486328125,
-0.01183319091796875,
-0.0246734619140625,
0.0007929801940917969,
-0.039764404296875,
0.0113525390625,
-0.0013647079467773438,
0.0174560546875,
0.0009598731994628906,
0.005466461181640625,
0.059295654296875,
-0.0245819091796875,
0.0374755859375,
-0.0028820037841796875,
-0.0157012939453125,
0.00739288330078125,
-0.0004646778106689453,
0.040435791015625,
-0.0152740478515625,
-0.02239990234375,
0.044952392578125,
-0.003803253173828125,
-0.0246124267578125,
-0.0152740478515625,
0.042633056640625,
-0.0643310546875,
-0.0260467529296875,
-0.033660888671875,
-0.042816162109375,
0.00799560546875,
0.0312347412109375,
0.051177978515625,
0.0469970703125,
0.00006556510925292969,
0.0426025390625,
0.04193115234375,
-0.03741455078125,
0.031829833984375,
0.042022705078125,
-0.005859375,
-0.035797119140625,
0.07135009765625,
0.0260009765625,
0.027740478515625,
0.0499267578125,
0.01336669921875,
-0.016204833984375,
-0.050445556640625,
-0.06231689453125,
0.03814697265625,
-0.037841796875,
-0.0177154541015625,
-0.06036376953125,
-0.0058441162109375,
-0.0236358642578125,
0.01108551025390625,
-0.0452880859375,
-0.043914794921875,
-0.0175323486328125,
-0.00452423095703125,
0.0283660888671875,
0.0258941650390625,
0.0007100105285644531,
0.0248565673828125,
-0.0665283203125,
0.0133056640625,
-0.02001953125,
0.0224609375,
-0.012603759765625,
-0.06256103515625,
-0.03253173828125,
0.01514434814453125,
-0.0308380126953125,
-0.0626220703125,
0.052642822265625,
0.0028820037841796875,
0.023834228515625,
0.00982666015625,
0.0089569091796875,
0.044464111328125,
-0.0513916015625,
0.059661865234375,
0.01043701171875,
-0.07147216796875,
0.0227813720703125,
-0.03033447265625,
0.0185394287109375,
0.025604248046875,
0.0186767578125,
-0.052215576171875,
-0.04510498046875,
-0.055206298828125,
-0.0753173828125,
0.0699462890625,
0.04534912109375,
0.004566192626953125,
0.00421142578125,
0.0008826255798339844,
-0.0006055831909179688,
0.00550079345703125,
-0.0810546875,
-0.0406494140625,
-0.006282806396484375,
-0.0202178955078125,
-0.01380157470703125,
-0.00792694091796875,
0.001220703125,
-0.02728271484375,
0.0770263671875,
0.0034999847412109375,
0.03765869140625,
0.0294342041015625,
-0.024932861328125,
-0.004489898681640625,
0.022918701171875,
0.0526123046875,
0.032928466796875,
-0.01079559326171875,
0.005718231201171875,
0.0283660888671875,
-0.034210205078125,
0.0008769035339355469,
0.0109100341796875,
-0.0252685546875,
0.030670166015625,
0.029754638671875,
0.07611083984375,
0.006992340087890625,
-0.0279388427734375,
0.03564453125,
-0.01183319091796875,
-0.024932861328125,
-0.030242919921875,
-0.0299224853515625,
0.012939453125,
0.003833770751953125,
0.0259246826171875,
0.00899505615234375,
-0.01537322998046875,
-0.0179290771484375,
0.0008139610290527344,
0.0136871337890625,
-0.0265045166015625,
-0.044586181640625,
0.056121826171875,
0.01300811767578125,
-0.0247039794921875,
0.036834716796875,
-0.0163726806640625,
-0.05364990234375,
0.03314208984375,
0.031463623046875,
0.08062744140625,
-0.01230621337890625,
-0.00018489360809326172,
0.055206298828125,
0.046142578125,
-0.0079345703125,
0.01181793212890625,
-0.001617431640625,
-0.0479736328125,
-0.03424072265625,
-0.06298828125,
0.0016918182373046875,
0.005939483642578125,
-0.04583740234375,
0.03277587890625,
0.00440216064453125,
-0.007228851318359375,
-0.01189422607421875,
0.0081634521484375,
-0.050018310546875,
0.0013561248779296875,
-0.0134124755859375,
0.067626953125,
-0.066650390625,
0.0714111328125,
0.04315185546875,
-0.03851318359375,
-0.0654296875,
-0.0079345703125,
-0.03369140625,
-0.04296875,
0.047882080078125,
0.01416015625,
-0.00202178955078125,
0.01312255859375,
-0.01520538330078125,
-0.06500244140625,
0.0745849609375,
0.035125732421875,
-0.03118896484375,
-0.00611114501953125,
0.026702880859375,
0.051971435546875,
-0.011322021484375,
0.023040771484375,
0.0294036865234375,
0.0537109375,
-0.00565338134765625,
-0.08221435546875,
-0.00276947021484375,
-0.044342041015625,
-0.00013256072998046875,
0.021453857421875,
-0.04876708984375,
0.07659912109375,
0.010101318359375,
-0.0232391357421875,
0.0152587890625,
0.052734375,
0.009002685546875,
-0.00025653839111328125,
0.0268402099609375,
0.058929443359375,
0.0361328125,
-0.0377197265625,
0.07952880859375,
-0.03448486328125,
0.04071044921875,
0.06085205078125,
0.0180206298828125,
0.06689453125,
0.04425048828125,
-0.019287109375,
0.0338134765625,
0.041839599609375,
-0.0064849853515625,
0.0177001953125,
-0.0024814605712890625,
0.00254058837890625,
-0.01024627685546875,
-0.00495147705078125,
-0.054443359375,
0.029083251953125,
0.0221099853515625,
-0.024932861328125,
-0.00408935546875,
-0.005191802978515625,
0.0184173583984375,
-0.0024890899658203125,
-0.001140594482421875,
0.037750244140625,
0.0111846923828125,
-0.056793212890625,
0.08416748046875,
0.0243988037109375,
0.057525634765625,
-0.041595458984375,
0.01332855224609375,
-0.01354217529296875,
0.0287017822265625,
-0.006534576416015625,
-0.0406494140625,
0.0200042724609375,
0.020721435546875,
-0.0135650634765625,
-0.04388427734375,
0.002803802490234375,
-0.049652099609375,
-0.056396484375,
0.034332275390625,
0.035675048828125,
0.0294342041015625,
0.00928497314453125,
-0.051239013671875,
-0.0012531280517578125,
0.0198516845703125,
-0.039886474609375,
0.00061798095703125,
0.049957275390625,
-0.0018377304077148438,
0.03765869140625,
0.05364990234375,
0.0199737548828125,
0.021392822265625,
-0.01236724853515625,
0.052337646484375,
-0.037322998046875,
-0.034759521484375,
-0.0650634765625,
0.059234619140625,
0.006221771240234375,
-0.04193115234375,
0.0631103515625,
0.0623779296875,
0.0731201171875,
-0.0138397216796875,
0.038177490234375,
-0.01157379150390625,
0.0347900390625,
-0.043914794921875,
0.050262451171875,
-0.062225341796875,
0.023345947265625,
-0.016632080078125,
-0.07843017578125,
-0.0263671875,
0.0360107421875,
-0.01885986328125,
-0.0127105712890625,
0.061279296875,
0.056396484375,
-0.009124755859375,
-0.0285186767578125,
0.0166168212890625,
0.04425048828125,
0.0301513671875,
0.05267333984375,
0.03912353515625,
-0.0679931640625,
0.051239013671875,
-0.0266571044921875,
-0.004344940185546875,
-0.00679779052734375,
-0.05059814453125,
-0.058624267578125,
-0.058624267578125,
-0.01235198974609375,
-0.0372314453125,
-0.0150146484375,
0.07647705078125,
0.02490234375,
-0.06658935546875,
-0.0287017822265625,
-0.006496429443359375,
0.0137176513671875,
-0.01517486572265625,
-0.01381683349609375,
0.0426025390625,
-0.01081085205078125,
-0.08123779296875,
0.0228271484375,
0.00536346435546875,
0.007198333740234375,
-0.01065826416015625,
-0.020965576171875,
-0.025421142578125,
-0.015228271484375,
0.023681640625,
0.007488250732421875,
-0.0640869140625,
-0.0020809173583984375,
0.0222930908203125,
-0.01320648193359375,
0.022216796875,
0.029998779296875,
-0.03173828125,
0.0328369140625,
0.0518798828125,
0.036865234375,
0.044281005859375,
-0.0210418701171875,
0.0498046875,
-0.049652099609375,
0.036712646484375,
0.016387939453125,
0.042022705078125,
0.037017822265625,
-0.004207611083984375,
0.043670654296875,
0.0267333984375,
-0.018280029296875,
-0.08642578125,
0.0030879974365234375,
-0.0682373046875,
-0.0002593994140625,
0.090087890625,
-0.021514892578125,
-0.026824951171875,
0.01241302490234375,
-0.01308441162109375,
0.037445068359375,
-0.0185546875,
0.032501220703125,
0.05108642578125,
0.036285400390625,
0.00981903076171875,
-0.042144775390625,
0.0196533203125,
0.04876708984375,
-0.0399169921875,
0.003955841064453125,
0.00798797607421875,
0.0131072998046875,
0.0307769775390625,
0.029083251953125,
-0.0201416015625,
0.00447845458984375,
-0.0201568603515625,
0.03472900390625,
-0.00885772705078125,
-0.01409912109375,
-0.031463623046875,
-0.009735107421875,
-0.011993408203125,
-0.0111083984375
]
] |
declare-lab/flan-alpaca-gpt4-xl | 2023-08-21T06:49:02.000Z | [
"transformers",
"pytorch",
"safetensors",
"t5",
"text2text-generation",
"dataset:tatsu-lab/alpaca",
"arxiv:2308.09662",
"arxiv:2306.04757",
"arxiv:2210.11416",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | declare-lab | null | null | declare-lab/flan-alpaca-gpt4-xl | 36 | 59,808 | transformers | 2023-04-16T07:00:20 | ---
license: apache-2.0
datasets:
- tatsu-lab/alpaca
---
## 🍮 🦙 Flan-Alpaca: Instruction Tuning from Humans and Machines
📣 Introducing **Red-Eval** to evaluate the safety of LLMs using several jailbreaking prompts. With **Red-Eval**, GPT-4 could be jailbroken/red-teamed with a 65.1% attack success rate, and ChatGPT 73% of the time, as measured on the DangerousQA and HarmfulQA benchmarks. More details are here: [Code](https://github.com/declare-lab/red-instruct) and [Paper](https://arxiv.org/abs/2308.09662).
📣 We developed Flacuna by fine-tuning Vicuna-13B on the Flan collection. Flacuna is better than Vicuna at problem-solving. Access the model here https://huggingface.co/declare-lab/flacuna-13b-v1.0.
📣 Curious to know the performance of 🍮 🦙 **Flan-Alpaca** on large-scale LLM evaluation benchmark, **InstructEval**? Read our paper [https://arxiv.org/pdf/2306.04757.pdf](https://arxiv.org/pdf/2306.04757.pdf). We evaluated more than 10 open-source instruction-tuned LLMs belonging to various LLM families including Pythia, LLaMA, T5, UL2, OPT, and Mosaic. Codes and datasets: [https://github.com/declare-lab/instruct-eval](https://github.com/declare-lab/instruct-eval)
📣 **FLAN-T5** is also useful in text-to-audio generation. Find our work at [https://github.com/declare-lab/tango](https://github.com/declare-lab/tango) if you are interested.
Our [repository](https://github.com/declare-lab/flan-alpaca) contains code for extending the [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca)
synthetic instruction tuning to existing instruction-tuned models such as [Flan-T5](https://arxiv.org/abs/2210.11416).
We have a [live interactive demo](https://huggingface.co/spaces/joaogante/transformers_streaming) thanks to [Joao Gante](https://huggingface.co/joaogante)!
We are also benchmarking many instruction-tuned models at [declare-lab/flan-eval](https://github.com/declare-lab/flan-eval).
Our pretrained models are fully available on Hugging Face 🤗:
| Model | Parameters | Instruction Data | Training GPUs |
|----------------------------------------------------------------------------------|------------|----------------------------------------------------------------------------------------------------------------------------------------------------|-----------------|
| [Flan-Alpaca-Base](https://huggingface.co/declare-lab/flan-alpaca-base) | 220M | [Flan](https://github.com/google-research/FLAN), [Alpaca](https://github.com/tatsu-lab/stanford_alpaca) | 1x A6000 |
| [Flan-Alpaca-Large](https://huggingface.co/declare-lab/flan-alpaca-large) | 770M | [Flan](https://github.com/google-research/FLAN), [Alpaca](https://github.com/tatsu-lab/stanford_alpaca) | 1x A6000 |
| [Flan-Alpaca-XL](https://huggingface.co/declare-lab/flan-alpaca-xl) | 3B | [Flan](https://github.com/google-research/FLAN), [Alpaca](https://github.com/tatsu-lab/stanford_alpaca) | 1x A6000 |
| [Flan-Alpaca-XXL](https://huggingface.co/declare-lab/flan-alpaca-xxl) | 11B | [Flan](https://github.com/google-research/FLAN), [Alpaca](https://github.com/tatsu-lab/stanford_alpaca) | 4x A6000 (FSDP) |
| [Flan-GPT4All-XL](https://huggingface.co/declare-lab/flan-gpt4all-xl) | 3B | [Flan](https://github.com/google-research/FLAN), [GPT4All](https://github.com/nomic-ai/gpt4all) | 1x A6000 |
| [Flan-ShareGPT-XL](https://huggingface.co/declare-lab/flan-sharegpt-xl) | 3B | [Flan](https://github.com/google-research/FLAN), [ShareGPT](https://github.com/domeccleston/sharegpt)/[Vicuna](https://github.com/lm-sys/FastChat) | 1x A6000 |
| [Flan-Alpaca-GPT4-XL*](https://huggingface.co/declare-lab/flan-alpaca-gpt4-xl) | 3B | [Flan](https://github.com/google-research/FLAN), [GPT4-Alpaca](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM) | 1x A6000 |
*recommended for better performance
### Why?
[Alpaca](https://crfm.stanford.edu/2023/03/13/alpaca.html) represents an exciting new direction
to approximate the performance of large language models (LLMs) like ChatGPT cheaply and easily.
Concretely, they leverage an LLM such as GPT-3 to generate instructions as synthetic training data.
The synthetic data which covers more than 50k tasks can then be used to finetune a smaller model.
However, the original implementation is less accessible due to licensing constraints of the
underlying [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/) model.
Furthermore, users have noted [potential noise](https://github.com/tloen/alpaca-lora/issues/65) in the synthetic
dataset. Hence, it may be better to explore a fully accessible model that is already trained on high-quality (but
less diverse) instructions such as [Flan-T5](https://arxiv.org/abs/2210.11416).
### Usage
```python
from transformers import pipeline
prompt = "Write an email about an alpaca that likes flan"
model = pipeline(model="declare-lab/flan-alpaca-gpt4-xl")
model(prompt, max_length=128, do_sample=True)
# Dear AlpacaFriend,
# My name is Alpaca and I'm 10 years old.
# I'm excited to announce that I'm a big fan of flan!
# We like to eat it as a snack and I believe that it can help with our overall growth.
# I'd love to hear your feedback on this idea.
# Have a great day!
# Best, AL Paca
``` | 5,815 | [
[
-0.0518798828125,
-0.06793212890625,
0.0211181640625,
0.01837158203125,
-0.002849578857421875,
0.0016946792602539062,
-0.021148681640625,
-0.0540771484375,
0.03399658203125,
0.0155487060546875,
-0.034759521484375,
-0.045074462890625,
-0.041168212890625,
-0.00424957275390625,
-0.0282135009765625,
0.07464599609375,
-0.00725555419921875,
-0.017913818359375,
0.026519775390625,
-0.02520751953125,
-0.0161285400390625,
-0.0260772705078125,
-0.051239013671875,
-0.01374053955078125,
0.040252685546875,
0.01456451416015625,
0.05047607421875,
0.04888916015625,
0.0221710205078125,
0.02392578125,
-0.0146636962890625,
0.029998779296875,
-0.024169921875,
-0.0269012451171875,
0.0216522216796875,
-0.027618408203125,
-0.04644775390625,
-0.008697509765625,
0.02081298828125,
0.0167999267578125,
-0.014312744140625,
0.0180511474609375,
-0.0098724365234375,
0.059112548828125,
-0.043121337890625,
0.0260772705078125,
-0.037750244140625,
0.0012407302856445312,
-0.01971435546875,
-0.003490447998046875,
-0.003871917724609375,
-0.034271240234375,
-0.0034389495849609375,
-0.056243896484375,
0.021453857421875,
-0.004302978515625,
0.0823974609375,
0.01129150390625,
-0.01540374755859375,
-0.04449462890625,
-0.056610107421875,
0.0430908203125,
-0.06365966796875,
0.0296783447265625,
0.028411865234375,
0.019378662109375,
-0.024810791015625,
-0.04425048828125,
-0.051788330078125,
-0.0221710205078125,
-0.00408935546875,
0.0128631591796875,
-0.0048980712890625,
-0.0033626556396484375,
0.0110321044921875,
0.049713134765625,
-0.03021240234375,
-0.005168914794921875,
-0.0345458984375,
-0.00820159912109375,
0.04876708984375,
-0.017852783203125,
0.0233001708984375,
0.01129913330078125,
-0.023834228515625,
-0.035888671875,
-0.0472412109375,
0.010894775390625,
0.019805908203125,
0.0263519287109375,
-0.03271484375,
0.033721923828125,
-0.011444091796875,
0.048309326171875,
-0.0125579833984375,
-0.021484375,
0.038970947265625,
-0.020782470703125,
-0.0180816650390625,
-0.0046234130859375,
0.086181640625,
0.0017671585083007812,
-0.0017185211181640625,
0.0195159912109375,
-0.041717529296875,
-0.002323150634765625,
-0.00010192394256591797,
-0.04620361328125,
-0.00885009765625,
0.0231170654296875,
-0.0208587646484375,
-0.0386962890625,
0.0089569091796875,
-0.07550048828125,
-0.00213623046875,
0.004550933837890625,
0.04925537109375,
-0.0511474609375,
-0.01934814453125,
0.0092926025390625,
0.01216888427734375,
0.042510986328125,
0.0209197998046875,
-0.0882568359375,
0.021392822265625,
0.06549072265625,
0.08563232421875,
0.008148193359375,
-0.0240478515625,
-0.027618408203125,
-0.00556182861328125,
-0.0308685302734375,
0.045806884765625,
-0.01238250732421875,
-0.0257568359375,
-0.004425048828125,
0.01074981689453125,
-0.020904541015625,
-0.030975341796875,
0.04833984375,
-0.029937744140625,
0.0081329345703125,
-0.0310516357421875,
-0.035308837890625,
-0.0121917724609375,
-0.0054168701171875,
-0.061859130859375,
0.07977294921875,
0.0245208740234375,
-0.040679931640625,
0.01019287109375,
-0.07769775390625,
-0.03643798828125,
-0.0233917236328125,
0.00370025634765625,
-0.04254150390625,
-0.00501251220703125,
0.031768798828125,
0.01519775390625,
-0.032440185546875,
0.0074462890625,
0.001789093017578125,
-0.0399169921875,
0.01029205322265625,
-0.0239715576171875,
0.05841064453125,
0.031158447265625,
-0.05706787109375,
0.0171356201171875,
-0.06634521484375,
-0.0083770751953125,
0.029449462890625,
-0.0275421142578125,
0.0187835693359375,
-0.0200347900390625,
-0.01064300537109375,
-0.0048980712890625,
0.016845703125,
-0.02490234375,
0.01473236083984375,
-0.03448486328125,
0.0516357421875,
0.050689697265625,
-0.01110076904296875,
0.0177001953125,
-0.045654296875,
0.036163330078125,
-0.0184783935546875,
0.0220489501953125,
-0.01462554931640625,
-0.045440673828125,
-0.09100341796875,
-0.030487060546875,
0.013427734375,
0.051300048828125,
-0.0416259765625,
0.04205322265625,
-0.0105133056640625,
-0.050994873046875,
-0.04559326171875,
0.0203399658203125,
0.0261688232421875,
0.029998779296875,
0.045379638671875,
-0.004161834716796875,
-0.0316162109375,
-0.05523681640625,
0.005889892578125,
-0.01309967041015625,
0.0018148422241210938,
0.0120849609375,
0.05487060546875,
-0.0258636474609375,
0.046356201171875,
-0.039825439453125,
-0.0271148681640625,
-0.017913818359375,
-0.005313873291015625,
0.0254974365234375,
0.0445556640625,
0.060150146484375,
-0.036834716796875,
-0.01375579833984375,
0.0180511474609375,
-0.0445556640625,
0.00608062744140625,
0.00406646728515625,
-0.020477294921875,
0.0265350341796875,
0.01422119140625,
-0.0684814453125,
0.0258026123046875,
0.04345703125,
-0.01477813720703125,
0.04156494140625,
-0.00952911376953125,
0.01004791259765625,
-0.06005859375,
0.01349639892578125,
0.00044083595275878906,
-0.01146697998046875,
-0.037841796875,
0.01285552978515625,
-0.004016876220703125,
-0.003021240234375,
-0.047515869140625,
0.045166015625,
-0.0277252197265625,
-0.00508880615234375,
-0.00681304931640625,
-0.00971221923828125,
0.0084686279296875,
0.052093505859375,
-0.00429534912109375,
0.08251953125,
0.0270538330078125,
-0.040771484375,
0.0263519287109375,
0.015838623046875,
-0.0297088623046875,
-0.01271820068359375,
-0.0623779296875,
0.01611328125,
0.018585205078125,
0.039520263671875,
-0.035491943359375,
-0.0287017822265625,
0.0435791015625,
-0.01409149169921875,
0.023956298828125,
0.00357818603515625,
-0.0300750732421875,
-0.045013427734375,
-0.042999267578125,
0.0175018310546875,
0.041595458984375,
-0.0618896484375,
0.037933349609375,
0.0146942138671875,
0.0244598388671875,
-0.043701171875,
-0.049102783203125,
-0.01390838623046875,
-0.0269622802734375,
-0.045684814453125,
0.0225830078125,
-0.005634307861328125,
-0.00042247772216796875,
-0.0192718505859375,
-0.0059051513671875,
0.005870819091796875,
-0.0006346702575683594,
0.01009368896484375,
0.0176239013671875,
-0.0297393798828125,
-0.006839752197265625,
-0.010772705078125,
0.00594329833984375,
-0.01146697998046875,
-0.0222015380859375,
0.056488037109375,
-0.05816650390625,
-0.0125274658203125,
-0.04443359375,
0.006740570068359375,
0.04559326171875,
-0.033782958984375,
0.076171875,
0.0697021484375,
-0.01293182373046875,
-0.01312255859375,
-0.048187255859375,
-0.004711151123046875,
-0.04107666015625,
0.006137847900390625,
-0.029083251953125,
-0.057861328125,
0.045196533203125,
0.01036834716796875,
0.021575927734375,
0.03912353515625,
0.03289794921875,
-0.00467681884765625,
0.0435791015625,
0.030029296875,
-0.01172637939453125,
0.048126220703125,
-0.0517578125,
0.005161285400390625,
-0.053314208984375,
-0.01326751708984375,
-0.032562255859375,
-0.01251220703125,
-0.05389404296875,
-0.054962158203125,
0.0242462158203125,
0.00864410400390625,
-0.0254058837890625,
0.0308074951171875,
-0.04864501953125,
0.0340576171875,
0.042510986328125,
0.0113983154296875,
0.00347137451171875,
-0.003597259521484375,
-0.0006895065307617188,
0.0302581787109375,
-0.04388427734375,
-0.04388427734375,
0.07769775390625,
0.038909912109375,
0.036102294921875,
0.0076446533203125,
0.062255859375,
0.00891876220703125,
0.0287933349609375,
-0.054901123046875,
0.040985107421875,
-0.0081939697265625,
-0.032989501953125,
-0.01251220703125,
-0.030792236328125,
-0.0767822265625,
0.0233306884765625,
-0.004703521728515625,
-0.0556640625,
0.00409698486328125,
0.0240936279296875,
-0.024139404296875,
0.0390625,
-0.061614990234375,
0.06597900390625,
-0.0379638671875,
-0.0219268798828125,
0.004306793212890625,
-0.038909912109375,
0.051513671875,
-0.014007568359375,
0.0333251953125,
-0.01849365234375,
-0.01092529296875,
0.05963134765625,
-0.07666015625,
0.055816650390625,
-0.011474609375,
-0.03167724609375,
0.05029296875,
-0.00881195068359375,
0.04052734375,
0.004329681396484375,
-0.0262451171875,
0.0284881591796875,
0.0149993896484375,
-0.03546142578125,
-0.04229736328125,
0.0673828125,
-0.075927734375,
-0.04638671875,
-0.036163330078125,
-0.0160675048828125,
-0.00958251953125,
0.006526947021484375,
0.02392578125,
0.0203857421875,
-0.006969451904296875,
0.00008732080459594727,
0.0439453125,
-0.04241943359375,
0.0296478271484375,
0.02178955078125,
-0.02789306640625,
-0.050079345703125,
0.08172607421875,
-0.00347900390625,
0.0220184326171875,
0.03857421875,
0.031463623046875,
-0.01036834716796875,
-0.0295867919921875,
-0.0506591796875,
0.0279998779296875,
-0.04632568359375,
-0.0203399658203125,
-0.041168212890625,
-0.0186920166015625,
-0.0243682861328125,
-0.01401519775390625,
-0.047607421875,
-0.045013427734375,
-0.039886474609375,
-0.01113128662109375,
0.0523681640625,
0.049041748046875,
0.001922607421875,
0.0279083251953125,
-0.056427001953125,
0.0288238525390625,
0.01366424560546875,
0.0211029052734375,
0.0043182373046875,
-0.0303192138671875,
-0.0091705322265625,
0.0250091552734375,
-0.03485107421875,
-0.062164306640625,
0.0408935546875,
0.03521728515625,
0.025421142578125,
0.018463134765625,
-0.01201629638671875,
0.052459716796875,
-0.02667236328125,
0.06317138671875,
0.005588531494140625,
-0.07659912109375,
0.06658935546875,
-0.030975341796875,
0.011962890625,
0.040191650390625,
0.02569580078125,
-0.0178985595703125,
-0.0195770263671875,
-0.04022216796875,
-0.07342529296875,
0.052459716796875,
0.0254974365234375,
-0.0006103515625,
-0.0018701553344726562,
0.02923583984375,
0.01401519775390625,
0.003978729248046875,
-0.05926513671875,
-0.0282440185546875,
-0.03558349609375,
-0.0183868408203125,
-0.002960205078125,
0.0008630752563476562,
-0.01983642578125,
-0.0270538330078125,
0.06817626953125,
-0.00974273681640625,
0.032470703125,
0.0198516845703125,
0.0001500844955444336,
-0.005706787109375,
0.0148162841796875,
0.0704345703125,
0.0308990478515625,
-0.0256805419921875,
-0.01136016845703125,
0.0254974365234375,
-0.042633056640625,
0.01381683349609375,
0.0013580322265625,
-0.027801513671875,
-0.01236724853515625,
0.040985107421875,
0.08251953125,
0.0086517333984375,
-0.050628662109375,
0.0284576416015625,
-0.0153045654296875,
-0.02178955078125,
-0.03271484375,
0.03839111328125,
0.0117034912109375,
0.006317138671875,
0.0231170654296875,
0.00699615478515625,
-0.0177764892578125,
-0.04925537109375,
0.0024089813232421875,
0.0229339599609375,
-0.006610870361328125,
-0.037322998046875,
0.054595947265625,
0.0200958251953125,
-0.03228759765625,
0.0200042724609375,
-0.01971435546875,
-0.042449951171875,
0.061798095703125,
0.04052734375,
0.054595947265625,
-0.01494598388671875,
0.0142974853515625,
0.04461669921875,
0.0278167724609375,
-0.01666259765625,
0.01200103759765625,
-0.00873565673828125,
-0.040802001953125,
0.0015954971313476562,
-0.07806396484375,
-0.0192413330078125,
0.031494140625,
-0.0263671875,
0.0255889892578125,
-0.049102783203125,
-0.007068634033203125,
-0.01116180419921875,
0.0188751220703125,
-0.050079345703125,
0.0069427490234375,
0.001373291015625,
0.06964111328125,
-0.052459716796875,
0.0751953125,
0.03515625,
-0.03515625,
-0.07568359375,
-0.0096893310546875,
0.0132904052734375,
-0.058685302734375,
0.0131378173828125,
0.022064208984375,
-0.0179595947265625,
-0.0223236083984375,
-0.0286865234375,
-0.0733642578125,
0.10302734375,
0.030029296875,
-0.05230712890625,
0.013580322265625,
0.017425537109375,
0.051055908203125,
-0.020904541015625,
0.030303955078125,
0.07733154296875,
0.041748046875,
0.0139007568359375,
-0.073486328125,
0.010101318359375,
-0.0243377685546875,
-0.00585174560546875,
0.00965118408203125,
-0.094482421875,
0.05657958984375,
-0.017852783203125,
-0.00222015380859375,
0.01108551025390625,
0.0750732421875,
0.0360107421875,
0.0200958251953125,
0.04010009765625,
0.031341552734375,
0.0670166015625,
-0.01345062255859375,
0.0843505859375,
-0.0268707275390625,
0.020965576171875,
0.06982421875,
-0.0016498565673828125,
0.0543212890625,
0.0190887451171875,
-0.0323486328125,
0.0297088623046875,
0.054656982421875,
0.002010345458984375,
0.024658203125,
0.007289886474609375,
-0.038360595703125,
0.019927978515625,
0.01064300537109375,
-0.0537109375,
0.035308837890625,
0.038299560546875,
-0.0227508544921875,
0.022613525390625,
-0.006801605224609375,
0.0191650390625,
-0.0193023681640625,
-0.005962371826171875,
0.03643798828125,
0.01316070556640625,
-0.046539306640625,
0.08807373046875,
0.0013704299926757812,
0.0743408203125,
-0.059478759765625,
0.01007843017578125,
-0.03009033203125,
0.0111083984375,
-0.0271148681640625,
-0.031707763671875,
0.0189971923828125,
0.004657745361328125,
0.0170745849609375,
0.005123138427734375,
0.040130615234375,
-0.01971435546875,
-0.0479736328125,
0.033355712890625,
0.0159149169921875,
0.0104522705078125,
0.0242462158203125,
-0.060577392578125,
0.045806884765625,
0.01523590087890625,
-0.0447998046875,
0.0228729248046875,
0.021942138671875,
-0.0007305145263671875,
0.058837890625,
0.06097412109375,
0.01398468017578125,
0.0172882080078125,
0.0116119384765625,
0.07208251953125,
-0.05267333984375,
-0.017303466796875,
-0.051788330078125,
0.0038242340087890625,
0.017791748046875,
-0.02239990234375,
0.0357666015625,
0.048187255859375,
0.05718994140625,
-0.006198883056640625,
0.0445556640625,
-0.0171356201171875,
0.017333984375,
-0.042755126953125,
0.04681396484375,
-0.0472412109375,
0.04559326171875,
-0.0234832763671875,
-0.060546875,
-0.0081787109375,
0.056182861328125,
-0.0128021240234375,
0.021575927734375,
0.03472900390625,
0.07464599609375,
-0.00543975830078125,
0.02264404296875,
0.0010442733764648438,
0.00919342041015625,
0.058837890625,
0.049346923828125,
0.048065185546875,
-0.051727294921875,
0.048370361328125,
-0.055328369140625,
-0.01264190673828125,
-0.010986328125,
-0.054595947265625,
-0.03875732421875,
-0.033203125,
-0.0207061767578125,
-0.01332855224609375,
0.01080322265625,
0.08251953125,
0.0594482421875,
-0.06927490234375,
-0.029998779296875,
-0.019775390625,
0.006015777587890625,
-0.0264434814453125,
-0.0185699462890625,
0.041961669921875,
-0.0130767822265625,
-0.058807373046875,
0.050201416015625,
-0.0013103485107421875,
0.023834228515625,
-0.00678253173828125,
-0.0161285400390625,
-0.02557373046875,
0.0078887939453125,
0.0404052734375,
0.053863525390625,
-0.0548095703125,
-0.0218505859375,
-0.019866943359375,
-0.0009222030639648438,
0.02301025390625,
0.0404052734375,
-0.046142578125,
0.0054473876953125,
0.0177154541015625,
0.027984619140625,
0.054718017578125,
0.003932952880859375,
0.028167724609375,
-0.036285400390625,
0.030059814453125,
0.00484466552734375,
0.02996826171875,
0.02734375,
-0.0254669189453125,
0.061798095703125,
0.003509521484375,
-0.043914794921875,
-0.052886962890625,
-0.008544921875,
-0.0841064453125,
-0.0075531005859375,
0.08428955078125,
-0.0209808349609375,
-0.0521240234375,
0.0250244140625,
-0.01751708984375,
0.02783203125,
-0.0458984375,
0.04949951171875,
0.0308990478515625,
-0.0174102783203125,
-0.002285003662109375,
-0.058837890625,
0.04449462890625,
0.0303497314453125,
-0.0810546875,
-0.00457000732421875,
0.0174713134765625,
0.0252838134765625,
0.0123748779296875,
0.052032470703125,
-0.00811767578125,
0.00801849365234375,
-0.00920867919921875,
0.0016813278198242188,
-0.01236724853515625,
-0.0023632049560546875,
-0.01404571533203125,
-0.00463104248046875,
-0.005924224853515625,
-0.0192413330078125
]
] |
tiiuae/falcon-40b | 2023-09-29T14:32:25.000Z | [
"transformers",
"pytorch",
"falcon",
"text-generation",
"custom_code",
"en",
"de",
"es",
"fr",
"dataset:tiiuae/falcon-refinedweb",
"arxiv:2205.14135",
"arxiv:1911.02150",
"arxiv:2101.00027",
"arxiv:2005.14165",
"arxiv:2104.09864",
"arxiv:2306.01116",
"license:apache-2.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | tiiuae | null | null | tiiuae/falcon-40b | 2,322 | 59,300 | transformers | 2023-05-24T12:08:30 | ---
datasets:
- tiiuae/falcon-refinedweb
language:
- en
- de
- es
- fr
inference: false
license: apache-2.0
---
# 🚀 Falcon-40B
**Falcon-40B is a 40B-parameter causal decoder-only model built by [TII](https://www.tii.ae) and trained on 1,000B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the Apache 2.0 license.**
*Paper coming soon 😊.*
🤗 To get started with Falcon (inference, finetuning, quantization, etc.), we recommend reading [this great blog post from HF](https://huggingface.co/blog/falcon)!
## Why use Falcon-40B?
* **It is the best open-source model currently available.** Falcon-40B outperforms [LLaMA](https://github.com/facebookresearch/llama), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1), [MPT](https://huggingface.co/mosaicml/mpt-7b), etc. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
* **It features an architecture optimized for inference**, with FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135)) and multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)).
* **It is made available under a permissive Apache 2.0 license allowing for commercial use**, without any royalties or restrictions.
⚠️ **This is a raw, pretrained model, which should be further finetuned for most use cases.** If you are looking for a version better suited to taking generic instructions in a chat format, we recommend taking a look at [Falcon-40B-Instruct](https://huggingface.co/tiiuae/falcon-40b-instruct).
💸 **Looking for a smaller, less expensive model?** [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b) is Falcon-40B's little brother!
```python
from transformers import AutoTokenizer
import transformers
import torch
model = "tiiuae/falcon-40b"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
device_map="auto",
)
sequences = pipeline(
"Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
max_length=200,
do_sample=True,
top_k=10,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!**
For fast inference with Falcon, check out [Text Generation Inference](https://github.com/huggingface/text-generation-inference)! Read more in this [blog post](https://huggingface.co/blog/falcon).
You will need **at least 85-100GB of memory** to swiftly run inference with Falcon-40B.
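The 85-100GB figure follows from simple arithmetic on the parameter count and dtype. A minimal back-of-the-envelope sketch (the ~15% overhead factor for activations and the KV cache is our own assumption, not a figure from the card):

```python
# Rough memory estimate for Falcon-40B inference.
# Weights dominate: n_params * bytes_per_param. The overhead factor
# for activations and the KV cache is an illustrative assumption.
def inference_memory_gb(n_params_b=40, bytes_per_param=2, overhead=1.15):
    """Return an estimated inference memory footprint in GB."""
    return n_params_b * 1e9 * bytes_per_param * overhead / 1e9

weights_only = inference_memory_gb(overhead=1.0)  # 80 GB of weights in bfloat16
with_overhead = inference_memory_gb()             # ~92 GB, within the 85-100GB range
```

This is why the 40GB A100s used for training cannot hold the model singly at inference time either; multi-GPU sharding via `device_map="auto"` (as in the snippet above) spreads the weights across available devices.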
# Model Card for Falcon-40B
## Model Details
### Model Description
- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English, German, Spanish, French (and limited capabilities in Italian, Portuguese, Polish, Dutch, Romanian, Czech, Swedish);
- **License:** Apache 2.0 license.
### Model Source
- **Paper:** *coming soon*.
## Uses
### Direct Use
Research on large language models; as a foundation for further specialization and finetuning for specific use cases (e.g., summarization, text generation, chatbot, etc.)
### Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
Falcon-40B is trained mostly on English, German, Spanish, and French, with limited capabilities also in Italian, Portuguese, Polish, Dutch, Romanian, Czech, and Swedish. It will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
### Recommendations
We recommend that users of Falcon-40B consider finetuning it for their specific set of tasks of interest, and that guardrails and appropriate precautions be taken for any production use.
## How to Get Started with the Model
```python
from transformers import AutoTokenizer
import transformers
import torch
model = "tiiuae/falcon-40b"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
device_map="auto",
)
sequences = pipeline(
"Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
max_length=200,
do_sample=True,
top_k=10,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
## Training Details
### Training Data
Falcon-40B was trained on 1,000B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), a high-quality filtered and deduplicated web dataset which we enhanced with curated corpora. Significant components from our curated corpora were inspired by The Pile ([Gao et al., 2020](https://arxiv.org/abs/2101.00027)).
| **Data source** | **Fraction** | **Tokens** | **Sources** |
|--------------------|--------------|------------|-----------------------------------|
| [RefinedWeb-English](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) | 75% | 750B | massive web crawl |
| RefinedWeb-Europe | 7% | 70B | European massive web crawl |
| Books | 6% | 60B | |
| Conversations | 5% | 50B | Reddit, StackOverflow, HackerNews |
| Code | 5% | 50B | |
| Technical | 2% | 20B | arXiv, PubMed, USPTO, etc. |
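The mixture table above can be sanity-checked in a couple of lines: the fractions should sum to 100% and the token counts to the 1,000B total stated earlier. A small sketch transcribing the table:

```python
# Pretraining data mixture for Falcon-40B, transcribed from the table:
# source -> (fraction in %, tokens in billions).
mixture = {
    "RefinedWeb-English": (75, 750),
    "RefinedWeb-Europe":  (7,  70),
    "Books":              (6,  60),
    "Conversations":      (5,  50),
    "Code":               (5,  50),
    "Technical":          (2,  20),
}

total_fraction = sum(frac for frac, _ in mixture.values())    # 100 (%)
total_tokens_b = sum(toks for _, toks in mixture.values())    # 1000 (billion tokens)
```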
RefinedWeb-Europe is made of the following languages:
| **Language** | **Fraction of multilingual data** | **Tokens** |
|--------------|-----------------------------------|------------|
| German | 26% | 18B |
| Spanish | 24% | 17B |
| French | 23% | 16B |
| _Italian_ | 7% | 5B |
| _Portuguese_ | 4% | 3B |
| _Polish_ | 4% | 3B |
| _Dutch_ | 4% | 3B |
| _Romanian_ | 3% | 2B |
| _Czech_ | 3% | 2B |
| _Swedish_ | 2% | 1B |
The data was tokenized with the Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b) tokenizer.
### Training Procedure
Falcon-40B was trained on 384 A100 40GB GPUs, using a 3D parallelism strategy (TP=8, PP=4, DP=12) combined with ZeRO.
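The three parallelism degrees multiply out to the GPU count: each data-parallel replica is split over pipeline stages, and each stage is sharded over tensor-parallel workers. A one-line check:

```python
# 3D parallelism layout for Falcon-40B training:
# tensor parallelism (TP) x pipeline parallelism (PP) x data parallelism (DP).
TP, PP, DP = 8, 4, 12
world_size = TP * PP * DP  # total GPUs = 384 A100 40GB
```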
#### Training Hyperparameters
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|------------|-------------------------------------------|
| Precision | `bfloat16` | |
| Optimizer | AdamW | |
| Learning rate | 1.85e-4 | 4B tokens warm-up, cosine decay to 1.85e-5 |
| Weight decay | 1e-1 | |
| Z-loss | 1e-4 | |
| Batch size | 1152 | 100B tokens ramp-up |
#### Speeds, Sizes, Times
Training started in December 2022 and took two months.
## Evaluation
*Paper coming soon.*
See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results.
## Technical Specifications
### Model Architecture and Objective
Falcon-40B is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
The architecture is broadly adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), with the following differences:
* **Positional embeddings:** rotary ([Su et al., 2021](https://arxiv.org/abs/2104.09864));
* **Attention:** multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135));
* **Decoder-block:** parallel attention/MLP with two layer norms.
For multiquery, we are using an internal variant which uses independent keys and values per tensor-parallel degree.
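Rotary embeddings encode position by rotating pairs of dimensions in each query/key head vector by position-dependent angles. A minimal NumPy sketch of the idea (illustrative only; not the fused kernel Falcon actually uses):

```python
import numpy as np

def rotary_embed(x, position, base=10000.0):
    """Apply rotary position embeddings (Su et al., 2021) to a single
    head vector x of even dimension d. Consecutive dimension pairs are
    rotated by angles position * base**(-2i/d). Illustrative sketch only."""
    d = x.shape[-1]
    assert d % 2 == 0, "rotary embeddings pair up dimensions"
    inv_freq = base ** (-np.arange(0, d, 2) / d)   # (d/2,) per-pair frequencies
    angles = position * inv_freq
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[0::2], x[1::2]                      # even / odd halves of each pair
    out = np.empty_like(x)
    out[0::2] = x1 * cos - x2 * sin                # 2D rotation of each pair
    out[1::2] = x1 * sin + x2 * cos
    return out
```

Because each pair is only rotated, the vector norm is preserved, and position 0 is the identity; attention scores between rotated queries and keys then depend on relative position.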
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|-----------|----------------------------------------|
| Layers | 60 | |
| `d_model` | 8192 | |
| `head_dim` | 64 | Reduced to optimise for FlashAttention |
| Vocabulary | 65024 | |
| Sequence length | 2048 | |
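The table implies the query-head count, since `head_dim` times the number of query heads must equal `d_model` (with multiquery attention, keys and values are shared, so this counts query heads only; the card does not state the head count explicitly, this is derived):

```python
# Derived from the architecture table: query heads = d_model / head_dim.
d_model, head_dim = 8192, 64
n_query_heads = d_model // head_dim  # 128
```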
### Compute Infrastructure
#### Hardware
Falcon-40B was trained on AWS SageMaker, on 384 A100 40GB GPUs in P4d instances.
#### Software
Falcon-40B was trained on a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.).
## Citation
*Paper coming soon* 😊. In the meantime, you can use the following information to cite:
```
@article{falcon40b,
title={{Falcon-40B}: an open large language model with state-of-the-art performance},
author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme},
year={2023}
}
```
To learn more about the pretraining dataset, see the 📓 [RefinedWeb paper](https://arxiv.org/abs/2306.01116).
```
@article{refinedweb,
title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
journal={arXiv preprint arXiv:2306.01116},
eprint={2306.01116},
eprinttype = {arXiv},
url={https://arxiv.org/abs/2306.01116},
year={2023}
}
```
## License
Falcon-40B is made available under the Apache 2.0 license.
## Contact
falconllm@tii.ae | 11,314 | [
[
-0.04833984375,
-0.057708740234375,
0.0007948875427246094,
0.026885986328125,
-0.007659912109375,
0.006488800048828125,
-0.0198211669921875,
-0.04290771484375,
0.02447509765625,
0.0160064697265625,
-0.044891357421875,
-0.0361328125,
-0.05230712890625,
0.0147857666015625,
-0.024871826171875,
0.0828857421875,
0.0026836395263671875,
-0.0004706382751464844,
0.01270294189453125,
0.0017299652099609375,
0.00714874267578125,
-0.036163330078125,
-0.0667724609375,
-0.01093292236328125,
0.02935791015625,
0.0223541259765625,
0.038970947265625,
0.0675048828125,
0.045379638671875,
0.0250396728515625,
-0.020721435546875,
0.0153656005859375,
-0.036163330078125,
-0.0191650390625,
-0.006069183349609375,
-0.02630615234375,
-0.02392578125,
-0.01300811767578125,
0.048431396484375,
0.04412841796875,
-0.0004138946533203125,
0.016326904296875,
-0.0106353759765625,
0.047698974609375,
-0.038604736328125,
0.02874755859375,
-0.032958984375,
0.005619049072265625,
-0.02667236328125,
0.01617431640625,
-0.0234375,
0.00904083251953125,
-0.0235443115234375,
-0.060150146484375,
0.0220794677734375,
0.0127716064453125,
0.0975341796875,
0.0169219970703125,
-0.034820556640625,
-0.02337646484375,
-0.0294952392578125,
0.056976318359375,
-0.055023193359375,
0.0355224609375,
0.0187835693359375,
0.019989013671875,
-0.0191650390625,
-0.07891845703125,
-0.039825439453125,
-0.0089111328125,
0.005855560302734375,
0.0276336669921875,
-0.0226287841796875,
-0.0043792724609375,
0.03271484375,
0.023651123046875,
-0.035430908203125,
0.01222991943359375,
-0.03961181640625,
-0.0189056396484375,
0.0555419921875,
0.0041656494140625,
0.0184783935546875,
-0.0182647705078125,
-0.0295867919921875,
-0.03021240234375,
-0.0228271484375,
0.0333251953125,
0.0404052734375,
0.0227813720703125,
-0.02703857421875,
0.0408935546875,
-0.0345458984375,
0.040283203125,
0.039642333984375,
0.0007410049438476562,
0.035308837890625,
-0.0323486328125,
-0.0169677734375,
-0.01409149169921875,
0.08551025390625,
0.01812744140625,
0.019500732421875,
-0.01247406005859375,
-0.004871368408203125,
-0.0009984970092773438,
0.004451751708984375,
-0.0787353515625,
0.004840850830078125,
0.0215301513671875,
-0.036956787109375,
-0.0152740478515625,
0.0322265625,
-0.0616455078125,
0.00942230224609375,
0.009002685546875,
-0.00391387939453125,
-0.040069580078125,
-0.02825927734375,
0.01512908935546875,
-0.0244293212890625,
0.01641845703125,
-0.01401519775390625,
-0.07452392578125,
0.0245361328125,
0.041015625,
0.056732177734375,
-0.004344940185546875,
-0.048736572265625,
-0.04205322265625,
0.0026702880859375,
-0.0270538330078125,
0.043914794921875,
-0.0330810546875,
-0.0311431884765625,
-0.01491546630859375,
0.0291900634765625,
-0.01125335693359375,
-0.01273345947265625,
0.06378173828125,
-0.0213470458984375,
0.0198211669921875,
-0.03240966796875,
-0.04595947265625,
-0.023284912109375,
0.0026340484619140625,
-0.04864501953125,
0.07623291015625,
0.0090179443359375,
-0.08221435546875,
0.021453857421875,
-0.0621337890625,
-0.0264892578125,
-0.007049560546875,
-0.0012178421020507812,
-0.037353515625,
-0.01056671142578125,
0.038421630859375,
0.041046142578125,
-0.01459503173828125,
0.0259857177734375,
-0.036346435546875,
-0.0404052734375,
-0.0112762451171875,
-0.00942230224609375,
0.07354736328125,
0.044891357421875,
-0.0460205078125,
-0.0029010772705078125,
-0.043792724609375,
-0.00408172607421875,
0.028076171875,
-0.0147857666015625,
0.01305389404296875,
-0.01336669921875,
0.0119476318359375,
0.01617431640625,
0.0174560546875,
-0.05047607421875,
0.007843017578125,
-0.04449462890625,
0.0416259765625,
0.0193939208984375,
-0.0005583763122558594,
0.02520751953125,
-0.0338134765625,
0.045623779296875,
0.0284576416015625,
0.0161285400390625,
-0.0174102783203125,
-0.04071044921875,
-0.0704345703125,
-0.0186004638671875,
0.01396942138671875,
0.0321044921875,
-0.05108642578125,
0.030609130859375,
-0.00885009765625,
-0.057373046875,
-0.021942138671875,
-0.01248931884765625,
0.035552978515625,
0.037750244140625,
0.032135009765625,
0.004985809326171875,
-0.056854248046875,
-0.06170654296875,
-0.005931854248046875,
-0.02032470703125,
0.01119232177734375,
0.0196075439453125,
0.053497314453125,
-0.0303192138671875,
0.051361083984375,
-0.023681640625,
-0.01178741455078125,
-0.00528717041015625,
0.005340576171875,
0.0186309814453125,
0.04034423828125,
0.061676025390625,
-0.052642822265625,
-0.0217742919921875,
0.0003559589385986328,
-0.06341552734375,
0.0008835792541503906,
-0.0037288665771484375,
-0.01337432861328125,
0.036529541015625,
0.045562744140625,
-0.050750732421875,
0.01541900634765625,
0.0401611328125,
-0.0263824462890625,
0.043853759765625,
-0.0066680908203125,
0.008514404296875,
-0.0941162109375,
0.028076171875,
0.007904052734375,
0.0099639892578125,
-0.0280303955078125,
0.026519775390625,
0.0065765380859375,
-0.00872039794921875,
-0.04705810546875,
0.059539794921875,
-0.041595458984375,
0.003742218017578125,
-0.0007481575012207031,
-0.00960540771484375,
0.0002903938293457031,
0.04571533203125,
0.0078582763671875,
0.0677490234375,
0.035308837890625,
-0.028289794921875,
0.006237030029296875,
0.0297698974609375,
-0.0059051513671875,
0.01342010498046875,
-0.058441162109375,
-0.00592803955078125,
-0.0081787109375,
0.033294677734375,
-0.0538330078125,
-0.0227813720703125,
0.022003173828125,
-0.051605224609375,
0.020355224609375,
-0.002819061279296875,
-0.01947021484375,
-0.05364990234375,
-0.0184478759765625,
0.00714874267578125,
0.029571533203125,
-0.031646728515625,
0.0299224853515625,
0.0182342529296875,
-0.001461029052734375,
-0.0721435546875,
-0.064697265625,
0.011444091796875,
-0.018341064453125,
-0.0521240234375,
0.03369140625,
-0.01255035400390625,
-0.0046844482421875,
-0.0009140968322753906,
0.010223388671875,
0.00714874267578125,
0.006824493408203125,
0.03350830078125,
0.0103759765625,
-0.0205841064453125,
-0.0027713775634765625,
0.0088348388671875,
-0.003871917724609375,
0.00489044189453125,
-0.026641845703125,
0.042205810546875,
-0.053466796875,
-0.01552581787109375,
-0.03179931640625,
0.028076171875,
0.04388427734375,
-0.0272979736328125,
0.053070068359375,
0.07672119140625,
-0.0291900634765625,
0.00484466552734375,
-0.03350830078125,
-0.01025390625,
-0.036651611328125,
0.03741455078125,
-0.04620361328125,
-0.060638427734375,
0.054840087890625,
0.00963592529296875,
0.0055084228515625,
0.07330322265625,
0.037750244140625,
-0.0006761550903320312,
0.07843017578125,
0.03375244140625,
-0.00788116455078125,
0.031463623046875,
-0.05316162109375,
-0.01319122314453125,
-0.0535888671875,
-0.0279541015625,
-0.0516357421875,
-0.0127716064453125,
-0.06195068359375,
-0.01416778564453125,
-0.0039215087890625,
0.0192718505859375,
-0.0594482421875,
0.0260772705078125,
-0.03955078125,
0.021697998046875,
0.043243408203125,
0.0009279251098632812,
-0.0019521713256835938,
0.004360198974609375,
-0.026336669921875,
0.0100860595703125,
-0.06182861328125,
-0.0355224609375,
0.08709716796875,
0.029815673828125,
0.0380859375,
-0.01158905029296875,
0.07281494140625,
0.0018491744995117188,
0.0096893310546875,
-0.045806884765625,
0.02978515625,
-0.0296630859375,
-0.047271728515625,
-0.010101318359375,
-0.03875732421875,
-0.07354736328125,
-0.001979827880859375,
-0.01454925537109375,
-0.0535888671875,
0.0096588134765625,
-0.012542724609375,
-0.01082611083984375,
0.0274505615234375,
-0.0648193359375,
0.06365966796875,
-0.01290130615234375,
-0.0243072509765625,
0.01293182373046875,
-0.050079345703125,
0.03546142578125,
-0.00984954833984375,
0.0191497802734375,
0.0038814544677734375,
-0.0103759765625,
0.07476806640625,
-0.03985595703125,
0.050750732421875,
-0.0250396728515625,
0.020904541015625,
0.0250701904296875,
-0.020050048828125,
0.0526123046875,
0.0009412765502929688,
-0.0250396728515625,
0.0350341796875,
0.02276611328125,
-0.04656982421875,
-0.0190887451171875,
0.05224609375,
-0.088134765625,
-0.03936767578125,
-0.0394287109375,
-0.045196533203125,
-0.009002685546875,
0.02581787109375,
0.04107666015625,
0.01116943359375,
-0.006134033203125,
0.0246124267578125,
0.0099945068359375,
-0.0173797607421875,
0.06060791015625,
0.029693603515625,
-0.01556396484375,
-0.03985595703125,
0.053253173828125,
0.0026721954345703125,
0.0012645721435546875,
0.0175933837890625,
0.017059326171875,
-0.043975830078125,
-0.039276123046875,
-0.04534912109375,
0.033843994140625,
-0.0421142578125,
-0.0118560791015625,
-0.0679931640625,
-0.0303497314453125,
-0.03436279296875,
-0.01328277587890625,
-0.0268402099609375,
-0.029876708984375,
-0.038604736328125,
-0.0024471282958984375,
0.0270843505859375,
0.033355712890625,
-0.00197601318359375,
0.03179931640625,
-0.05364990234375,
0.0084075927734375,
-0.003246307373046875,
0.008392333984375,
-0.0010223388671875,
-0.04510498046875,
-0.0186920166015625,
0.04046630859375,
-0.0234832763671875,
-0.04132080078125,
0.03143310546875,
0.02740478515625,
0.046630859375,
0.0416259765625,
0.01007080078125,
0.06103515625,
-0.0184478759765625,
0.0653076171875,
0.019927978515625,
-0.0712890625,
0.0229034423828125,
-0.040283203125,
0.027984619140625,
0.03564453125,
0.043121337890625,
-0.03875732421875,
-0.04315185546875,
-0.068115234375,
-0.040496826171875,
0.062225341796875,
0.0185699462890625,
0.0027866363525390625,
-0.01222991943359375,
0.0275726318359375,
-0.0031147003173828125,
-0.0022792816162109375,
-0.036529541015625,
-0.01471710205078125,
-0.045379638671875,
-0.024169921875,
-0.01242828369140625,
0.00482177734375,
0.00986480712890625,
-0.0218963623046875,
0.0621337890625,
-0.0191497802734375,
0.0367431640625,
0.00868988037109375,
-0.00293731689453125,
-0.0029735565185546875,
-0.00860595703125,
0.05316162109375,
0.045440673828125,
-0.017822265625,
-0.0015268325805664062,
0.017669677734375,
-0.054290771484375,
0.00397491455078125,
0.0303497314453125,
-0.0160675048828125,
-0.006420135498046875,
0.030487060546875,
0.0660400390625,
0.002246856689453125,
-0.041595458984375,
0.0279541015625,
-0.00942230224609375,
-0.0251617431640625,
-0.01271820068359375,
0.0127716064453125,
0.027130126953125,
0.0262603759765625,
0.0287933349609375,
-0.00911712646484375,
0.0014047622680664062,
-0.0265350341796875,
0.029815673828125,
0.02178955078125,
-0.0164794921875,
-0.0146331787109375,
0.078369140625,
0.01277923583984375,
-0.024993896484375,
0.03997802734375,
-0.0292205810546875,
-0.037139892578125,
0.08038330078125,
0.043243408203125,
0.0537109375,
0.0032501220703125,
0.0255889892578125,
0.05584716796875,
0.0178070068359375,
-0.006805419921875,
0.0246429443359375,
0.00896453857421875,
-0.044891357421875,
-0.024993896484375,
-0.06365966796875,
-0.03045654296875,
0.00616455078125,
-0.051361083984375,
0.02972412109375,
-0.0308074951171875,
-0.0228271484375,
0.0226287841796875,
0.040496826171875,
-0.0543212890625,
0.0280609130859375,
-0.0013494491577148438,
0.08197021484375,
-0.049652099609375,
0.0675048828125,
0.059661865234375,
-0.062042236328125,
-0.07293701171875,
-0.006011962890625,
-0.01210784912109375,
-0.07318115234375,
0.059234619140625,
0.0216522216796875,
0.004695892333984375,
0.0194244384765625,
-0.03387451171875,
-0.06390380859375,
0.07891845703125,
0.0275115966796875,
-0.053192138671875,
-0.008209228515625,
0.0149078369140625,
0.04461669921875,
-0.0272674560546875,
0.045440673828125,
0.0258026123046875,
0.03936767578125,
0.0198211669921875,
-0.05694580078125,
0.013824462890625,
-0.043792724609375,
0.001483917236328125,
0.007503509521484375,
-0.08135986328125,
0.068603515625,
-0.0280609130859375,
-0.01155853271484375,
-0.0000787973403930664,
0.059844970703125,
0.028076171875,
0.01971435546875,
0.0298309326171875,
0.0428466796875,
0.05767822265625,
-0.0123443603515625,
0.07122802734375,
-0.055572509765625,
0.044097900390625,
0.06109619140625,
0.0063934326171875,
0.05706787109375,
0.026397705078125,
-0.01313018798828125,
0.034515380859375,
0.06927490234375,
-0.02081298828125,
0.01366424560546875,
-0.005306243896484375,
0.00919342041015625,
-0.00789642333984375,
-0.004669189453125,
-0.045013427734375,
0.049224853515625,
0.01139068603515625,
-0.0244293212890625,
-0.01019287109375,
0.0024776458740234375,
0.036590576171875,
-0.0157623291015625,
-0.005199432373046875,
0.03570556640625,
0.00212860107421875,
-0.06146240234375,
0.078857421875,
0.01934814453125,
0.0640869140625,
-0.051116943359375,
0.022216796875,
-0.040618896484375,
0.01218414306640625,
-0.020050048828125,
-0.04931640625,
0.0345458984375,
0.0088653564453125,
-0.00910186767578125,
-0.001728057861328125,
0.0322265625,
-0.0215301513671875,
-0.059326171875,
0.0245361328125,
0.026123046875,
0.0255126953125,
-0.0122222900390625,
-0.0660400390625,
0.0191650390625,
-0.004802703857421875,
-0.033447265625,
0.0123443603515625,
0.0178070068359375,
-0.005229949951171875,
0.056121826171875,
0.0567626953125,
0.0037097930908203125,
0.00598907470703125,
-0.00930023193359375,
0.058441162109375,
-0.05963134765625,
-0.037689208984375,
-0.045196533203125,
0.0298309326171875,
-0.0108642578125,
-0.036346435546875,
0.056304931640625,
0.056121826171875,
0.06866455078125,
0.00125885009765625,
0.049041748046875,
-0.01471710205078125,
0.01320648193359375,
-0.02435302734375,
0.0667724609375,
-0.051239013671875,
-0.00919342041015625,
-0.016021728515625,
-0.052459716796875,
-0.004596710205078125,
0.0355224609375,
-0.00518035888671875,
0.018768310546875,
0.0589599609375,
0.0615234375,
-0.0009946823120117188,
0.0179595947265625,
0.008544921875,
0.0194244384765625,
0.0297393798828125,
0.064208984375,
0.03851318359375,
-0.056793212890625,
0.06072998046875,
-0.03271484375,
-0.01146697998046875,
-0.01309967041015625,
-0.058624267578125,
-0.07281494140625,
-0.05108642578125,
-0.018951416015625,
-0.0260772705078125,
0.00848388671875,
0.054107666015625,
0.06292724609375,
-0.06170654296875,
-0.0218963623046875,
-0.0194854736328125,
-0.01386260986328125,
-0.0264129638671875,
-0.0165557861328125,
0.0418701171875,
-0.03173828125,
-0.073974609375,
0.0156707763671875,
-0.0086517333984375,
0.0109710693359375,
-0.0135650634765625,
-0.0237579345703125,
-0.0276031494140625,
0.0012226104736328125,
0.03179931640625,
0.0335693359375,
-0.056976318359375,
-0.0141448974609375,
0.025299072265625,
-0.01366424560546875,
-0.0019311904907226562,
0.0151214599609375,
-0.04205322265625,
0.0254364013671875,
0.027313232421875,
0.04901123046875,
0.07843017578125,
-0.0026416778564453125,
0.0131378173828125,
-0.0279693603515625,
0.0305633544921875,
0.00160980224609375,
0.03350830078125,
0.0144195556640625,
-0.0235443115234375,
0.04644775390625,
0.0311126708984375,
-0.03448486328125,
-0.05316162109375,
-0.00893402099609375,
-0.08544921875,
-0.0219879150390625,
0.09515380859375,
-0.0263214111328125,
-0.03143310546875,
0.01546478271484375,
-0.0189971923828125,
0.04400634765625,
-0.03277587890625,
0.049896240234375,
0.05010986328125,
-0.002117156982421875,
-0.006603240966796875,
-0.023651123046875,
0.0252838134765625,
0.0180816650390625,
-0.077392578125,
-0.0192108154296875,
0.019775390625,
0.0299224853515625,
0.0033779144287109375,
0.0263671875,
0.00194549560546875,
0.00991058349609375,
0.0254974365234375,
0.007110595703125,
-0.046356201171875,
-0.009246826171875,
-0.00948333740234375,
0.0186614990234375,
-0.026397705078125,
-0.02777099609375
]
] |
stablediffusionapi/anything-v5 | 2023-07-05T09:35:32.000Z | [
"diffusers",
"stablediffusionapi.com",
"stable-diffusion-api",
"text-to-image",
"ultra-realistic",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | stablediffusionapi | null | null | stablediffusionapi/anything-v5 | 105 | 59,277 | diffusers | 2023-04-23T07:21:56 | ---
license: creativeml-openrail-m
tags:
- stablediffusionapi.com
- stable-diffusion-api
- text-to-image
- ultra-realistic
pinned: true
---
# Anything V5 API Inference

## Get API Key
Get an API key from [Stable Diffusion API](http://stablediffusionapi.com/); no payment is needed.
Replace the key in the code below and set **model_id** to `"anything-v5"`.
Coding in PHP, Node, Java, or another language? Have a look at the docs for more code examples: [View docs](https://stablediffusionapi.com/docs)
Model link: [View model](https://stablediffusionapi.com/models/anything-v5)
Credits: [View credits](https://civitai.com/?query=Anything%20V5)
View all models: [View Models](https://stablediffusionapi.com/models)
```python
import requests
import json

url = "https://stablediffusionapi.com/api/v3/dreambooth"

payload = json.dumps({
    "key": "",
    "model_id": "anything-v5",
    "prompt": "actual 8K portrait photo of gareth person, portrait, happy colors, bright eyes, clear eyes, warm smile, smooth soft skin, big dreamy eyes, beautiful intricate colored hair, symmetrical, anime wide eyes, soft lighting, detailed face, by makoto shinkai, stanley artgerm lau, wlop, rossdraws, concept art, digital painting, looking into camera",
    "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
    "width": "512",
    "height": "512",
    "samples": "1",
    "num_inference_steps": "30",
    "safety_checker": "no",
    "enhance_prompt": "yes",
    "seed": None,
    "guidance_scale": 7.5,
    "multi_lingual": "no",
    "panorama": "no",
    "self_attention": "no",
    "upscale": "no",
    "embeddings": "embeddings_model_id",
    "lora": "lora_model_id",
    "webhook": None,
    "track_id": None
})

headers = {
    'Content-Type': 'application/json'
}

response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
```
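The response can then be parsed to retrieve the generated images. The exact field names (`status`, `output`) are an assumption based on the API's documented JSON shape, so treat this as an illustrative sketch rather than a guaranteed schema:

```python
# Hypothetical response shape -- field names are assumptions from the API
# docs, not guaranteed; adjust to the actual response you receive.
def extract_image_urls(response_json):
    """Return the list of generated image URLs, or [] if the job failed."""
    if response_json.get("status") != "success":
        return []
    return response_json.get("output", [])

sample = {"status": "success", "output": ["https://example.com/generated.png"]}
urls = extract_image_urls(sample)
print(urls)
```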
> Use this coupon code to get 25% off **DMGG0RBN** | 2,410 | [
[
-0.0291748046875,
-0.060577392578125,
0.041961669921875,
0.0210723876953125,
-0.02545166015625,
-0.00885009765625,
0.0257415771484375,
-0.0333251953125,
0.038818359375,
0.043548583984375,
-0.06219482421875,
-0.0697021484375,
-0.022369384765625,
-0.0055999755859375,
-0.018829345703125,
0.042724609375,
0.0137176513671875,
-0.0169830322265625,
-0.00395965576171875,
0.0092926025390625,
-0.0235137939453125,
-0.013275146484375,
-0.05230712890625,
-0.00762176513671875,
0.0193634033203125,
-0.00011438131332397461,
0.035003662109375,
0.05352783203125,
0.0168914794921875,
0.02032470703125,
-0.01544952392578125,
-0.0035877227783203125,
-0.0258941650390625,
-0.0164794921875,
-0.0129241943359375,
-0.054534912109375,
-0.048004150390625,
-0.011810302734375,
0.0248260498046875,
0.037750244140625,
-0.00211334228515625,
0.034027099609375,
-0.0127716064453125,
0.04449462890625,
-0.0606689453125,
0.0179290771484375,
-0.0301971435546875,
0.0197906494140625,
-0.005001068115234375,
0.00014603137969970703,
-0.0243072509765625,
-0.02288818359375,
-0.0113677978515625,
-0.062408447265625,
0.0186309814453125,
0.012725830078125,
0.105712890625,
0.01253509521484375,
-0.01529693603515625,
-0.00897216796875,
-0.032623291015625,
0.060211181640625,
-0.06915283203125,
0.03076171875,
0.0258941650390625,
0.004077911376953125,
-0.0007467269897460938,
-0.071533203125,
-0.048248291015625,
0.0259552001953125,
0.013031005859375,
0.019989013671875,
-0.031402587890625,
-0.00263214111328125,
0.0223388671875,
0.02825927734375,
-0.0283203125,
-0.0240936279296875,
-0.0298919677734375,
-0.004611968994140625,
0.040618896484375,
0.016357421875,
0.015350341796875,
-0.036407470703125,
-0.0269012451171875,
-0.022186279296875,
-0.041748046875,
0.026763916015625,
0.037200927734375,
0.02398681640625,
-0.043487548828125,
0.040252685546875,
-0.0269012451171875,
0.0645751953125,
0.0157318115234375,
-0.011383056640625,
0.04461669921875,
-0.019683837890625,
-0.0312042236328125,
-0.0212554931640625,
0.0684814453125,
0.042999267578125,
-0.020172119140625,
0.0168609619140625,
-0.0022029876708984375,
-0.0045928955078125,
0.006351470947265625,
-0.0745849609375,
-0.01401519775390625,
0.060882568359375,
-0.055419921875,
-0.043670654296875,
0.005565643310546875,
-0.071044921875,
-0.0235748291015625,
0.0071563720703125,
0.024261474609375,
-0.0267181396484375,
-0.0340576171875,
0.0261993408203125,
-0.0202178955078125,
0.02008056640625,
0.0124359130859375,
-0.058563232421875,
0.0007410049438476562,
0.036712646484375,
0.057830810546875,
0.01560211181640625,
-0.0041656494140625,
0.00826263427734375,
0.01079559326171875,
-0.0233154296875,
0.0653076171875,
-0.019561767578125,
-0.0369873046875,
-0.008087158203125,
0.025787353515625,
-0.006397247314453125,
-0.0298309326171875,
0.055328369140625,
-0.0389404296875,
-0.00923919677734375,
-0.0189056396484375,
-0.032379150390625,
-0.031463623046875,
0.01290130615234375,
-0.040283203125,
0.043731689453125,
0.013214111328125,
-0.057464599609375,
0.0165252685546875,
-0.055084228515625,
-0.01401519775390625,
-0.00974273681640625,
0.001827239990234375,
-0.048248291015625,
-0.009033203125,
0.00916290283203125,
0.028076171875,
-0.00682830810546875,
-0.01519012451171875,
-0.06573486328125,
-0.01250457763671875,
0.01953125,
-0.01678466796875,
0.089599609375,
0.03253173828125,
-0.01715087890625,
-0.0005526542663574219,
-0.06939697265625,
0.00989532470703125,
0.038482666015625,
-0.01043701171875,
-0.01210784912109375,
-0.01352691650390625,
0.00962066650390625,
-0.004119873046875,
0.0238800048828125,
-0.03875732421875,
0.0228424072265625,
-0.029388427734375,
0.042724609375,
0.0401611328125,
0.0217742919921875,
0.0250701904296875,
-0.026397705078125,
0.048736572265625,
0.0145416259765625,
0.03680419921875,
-0.00872039794921875,
-0.04718017578125,
-0.046142578125,
-0.03558349609375,
0.01995849609375,
0.03082275390625,
-0.035400390625,
0.0306396484375,
-0.01287078857421875,
-0.049224853515625,
-0.0499267578125,
-0.004974365234375,
0.0208740234375,
0.03765869140625,
0.00812530517578125,
-0.010101318359375,
-0.059661865234375,
-0.0616455078125,
-0.0016374588012695312,
-0.01232147216796875,
-0.0078125,
0.022247314453125,
0.035675048828125,
-0.0208892822265625,
0.06341552734375,
-0.0555419921875,
-0.00995635986328125,
-0.006622314453125,
-0.0020465850830078125,
0.057952880859375,
0.0467529296875,
0.0634765625,
-0.06597900390625,
-0.0202789306640625,
-0.0222015380859375,
-0.052001953125,
0.0094146728515625,
0.01739501953125,
-0.0242919921875,
-0.002231597900390625,
0.007457733154296875,
-0.061279296875,
0.0377197265625,
0.034332275390625,
-0.048095703125,
0.0306396484375,
-0.00849151611328125,
0.0355224609375,
-0.09320068359375,
0.00417327880859375,
0.0063934326171875,
-0.02294921875,
-0.02874755859375,
0.027923583984375,
0.005870819091796875,
-0.0031585693359375,
-0.0587158203125,
0.04425048828125,
-0.026123046875,
0.004734039306640625,
-0.007541656494140625,
0.00926971435546875,
0.02398681640625,
0.0187530517578125,
0.003704071044921875,
0.023223876953125,
0.048492431640625,
-0.039306640625,
0.034393310546875,
0.0213623046875,
-0.0217437744140625,
0.047576904296875,
-0.045623779296875,
0.007793426513671875,
-0.00032520294189453125,
0.0249786376953125,
-0.08917236328125,
-0.038970947265625,
0.0322265625,
-0.054229736328125,
-0.0099029541015625,
-0.047515869140625,
-0.03521728515625,
-0.03948974609375,
-0.0283966064453125,
0.0131988525390625,
0.05767822265625,
-0.03668212890625,
0.06573486328125,
0.02020263671875,
0.024810791015625,
-0.042724609375,
-0.0654296875,
-0.032623291015625,
-0.0279693603515625,
-0.048309326171875,
0.0191497802734375,
-0.0090484619140625,
-0.02044677734375,
0.00791168212890625,
0.0014085769653320312,
-0.017669677734375,
-0.0019092559814453125,
0.034210205078125,
0.0399169921875,
-0.01806640625,
-0.024261474609375,
0.0098724365234375,
-0.0027637481689453125,
0.007476806640625,
-0.032623291015625,
0.0550537109375,
-0.002933502197265625,
-0.041748046875,
-0.052398681640625,
-0.0031757354736328125,
0.04583740234375,
0.00481414794921875,
0.0400390625,
0.0416259765625,
-0.044036865234375,
-0.004940032958984375,
-0.041656494140625,
-0.0089874267578125,
-0.03521728515625,
0.01194000244140625,
-0.041748046875,
-0.0338134765625,
0.07244873046875,
0.007904052734375,
0.004253387451171875,
0.0413818359375,
0.0289154052734375,
-0.00960540771484375,
0.101318359375,
0.0195159912109375,
0.0039043426513671875,
0.0260009765625,
-0.054840087890625,
0.002964019775390625,
-0.059112548828125,
-0.01568603515625,
-0.027313232421875,
-0.01959228515625,
-0.033477783203125,
-0.032012939453125,
0.0025272369384765625,
0.0223236083984375,
-0.0298919677734375,
0.02764892578125,
-0.049072265625,
0.031036376953125,
0.032958984375,
0.0191650390625,
0.0169830322265625,
-0.0070343017578125,
0.0007090568542480469,
0.006900787353515625,
-0.038238525390625,
-0.0180816650390625,
0.08319091796875,
0.01461029052734375,
0.053619384765625,
0.005748748779296875,
0.04571533203125,
0.0130767822265625,
0.0035648345947265625,
-0.042999267578125,
0.0283050537109375,
0.0088043212890625,
-0.0750732421875,
0.0118255615234375,
-0.018829345703125,
-0.078125,
0.0271148681640625,
-0.0159149169921875,
-0.07086181640625,
0.053619384765625,
0.01043701171875,
-0.054107666015625,
0.041473388671875,
-0.054168701171875,
0.060089111328125,
0.0023555755615234375,
-0.041656494140625,
0.0031642913818359375,
-0.0343017578125,
0.041748046875,
0.00814056396484375,
0.044921875,
-0.0294036865234375,
-0.01293182373046875,
0.041473388671875,
-0.0294036865234375,
0.07318115234375,
-0.02294921875,
-0.000362396240234375,
0.04412841796875,
0.00597381591796875,
0.0298919677734375,
0.03338623046875,
-0.00858306884765625,
0.01345062255859375,
0.0256805419921875,
-0.052764892578125,
-0.03179931640625,
0.061492919921875,
-0.05615234375,
-0.0285186767578125,
-0.015472412109375,
-0.019866943359375,
0.0023632049560546875,
0.0291290283203125,
0.04083251953125,
0.01007080078125,
0.010528564453125,
0.0020599365234375,
0.05670166015625,
-0.004184722900390625,
0.033966064453125,
0.0309906005859375,
-0.050201416015625,
-0.046478271484375,
0.057952880859375,
-0.0107421875,
0.0201263427734375,
0.01068115234375,
0.00848388671875,
-0.03765869140625,
-0.03668212890625,
-0.032135009765625,
0.0185546875,
-0.057037353515625,
-0.02520751953125,
-0.05889892578125,
0.004802703857421875,
-0.0543212890625,
-0.01502227783203125,
-0.06298828125,
-0.02313232421875,
-0.036651611328125,
-0.013763427734375,
0.047882080078125,
0.0233001708984375,
-0.0151214599609375,
0.0196990966796875,
-0.05157470703125,
0.0251312255859375,
0.01006317138671875,
0.01100921630859375,
0.0047760009765625,
-0.03961181640625,
0.00518035888671875,
0.01509857177734375,
-0.034423828125,
-0.06585693359375,
0.03900146484375,
-0.01509857177734375,
0.0213775634765625,
0.0728759765625,
0.01091766357421875,
0.0687255859375,
0.0019779205322265625,
0.0684814453125,
0.022308349609375,
-0.074462890625,
0.059234619140625,
-0.04473876953125,
0.0021076202392578125,
0.038604736328125,
0.0164031982421875,
-0.021759033203125,
-0.016204833984375,
-0.07611083984375,
-0.08489990234375,
0.036468505859375,
0.0170745849609375,
0.0176849365234375,
0.001705169677734375,
0.034576416015625,
-0.00868988037109375,
0.0245208740234375,
-0.06988525390625,
-0.04541015625,
-0.0170135498046875,
-0.0020389556884765625,
0.032440185546875,
0.01206207275390625,
-0.021881103515625,
-0.031463623046875,
0.055938720703125,
-0.0024204254150390625,
0.0296173095703125,
0.0212249755859375,
0.027008056640625,
-0.01088714599609375,
0.0020580291748046875,
0.02935791015625,
0.055938720703125,
-0.038177490234375,
-0.007419586181640625,
0.00478363037109375,
-0.02081298828125,
0.00858306884765625,
0.01136016845703125,
-0.0262908935546875,
0.00292205810546875,
0.0184783935546875,
0.06378173828125,
-0.003726959228515625,
-0.03582763671875,
0.0445556640625,
-0.0124664306640625,
-0.03070068359375,
-0.03692626953125,
0.01354217529296875,
0.029937744140625,
0.04571533203125,
0.0244140625,
0.016143798828125,
0.0186004638671875,
-0.037628173828125,
-0.00984954833984375,
0.020782470703125,
-0.027435302734375,
-0.0281982421875,
0.08282470703125,
-0.0109405517578125,
-0.02642822265625,
0.023468017578125,
-0.029571533203125,
-0.006237030029296875,
0.056488037109375,
0.04638671875,
0.055419921875,
-0.004886627197265625,
0.01849365234375,
0.046905517578125,
0.00415802001953125,
-0.0092010498046875,
0.054412841796875,
0.00577545166015625,
-0.053436279296875,
-0.0204620361328125,
-0.059661865234375,
-0.01436614990234375,
0.03326416015625,
-0.055633544921875,
0.039794921875,
-0.05792236328125,
-0.0247955322265625,
-0.006465911865234375,
-0.0214385986328125,
-0.04376220703125,
0.0306396484375,
0.0030689239501953125,
0.062286376953125,
-0.06109619140625,
0.042999267578125,
0.05670166015625,
-0.0584716796875,
-0.07525634765625,
-0.015472412109375,
0.01192474365234375,
-0.056640625,
0.0312042236328125,
-0.00981903076171875,
0.004871368408203125,
0.014129638671875,
-0.060577392578125,
-0.07135009765625,
0.08660888671875,
0.0211181640625,
-0.0272674560546875,
-0.003215789794921875,
-0.003368377685546875,
0.0280303955078125,
-0.032470703125,
0.037506103515625,
0.033660888671875,
0.037933349609375,
0.01922607421875,
-0.0268096923828125,
0.01258087158203125,
-0.032073974609375,
0.00789642333984375,
-0.0154266357421875,
-0.06292724609375,
0.0723876953125,
-0.020599365234375,
-0.0012054443359375,
0.02008056640625,
0.05108642578125,
0.058868408203125,
0.028076171875,
0.0418701171875,
0.06292724609375,
0.04278564453125,
-0.01491546630859375,
0.08355712890625,
-0.0277099609375,
0.056732177734375,
0.0489501953125,
-0.00293731689453125,
0.06292724609375,
0.035491943359375,
-0.034149169921875,
0.0462646484375,
0.08184814453125,
-0.019012451171875,
0.0531005859375,
0.006778717041015625,
-0.0245361328125,
-0.0179290771484375,
0.0074462890625,
-0.04986572265625,
0.016143798828125,
0.023956298828125,
-0.033782958984375,
0.009674072265625,
0.0021343231201171875,
0.003795623779296875,
-0.0235137939453125,
-0.01401519775390625,
0.030853271484375,
0.0033512115478515625,
-0.02294921875,
0.05755615234375,
-0.00390625,
0.06170654296875,
-0.042022705078125,
-0.0012369155883789062,
-0.01480865478515625,
0.0178070068359375,
-0.0288848876953125,
-0.037506103515625,
0.008544921875,
-0.0128173828125,
-0.016082763671875,
0.0032291412353515625,
0.04559326171875,
0.005428314208984375,
-0.04571533203125,
0.019744873046875,
0.021453857421875,
0.0201568603515625,
-0.00299835205078125,
-0.06805419921875,
0.0165252685546875,
0.0191497802734375,
-0.035003662109375,
0.005496978759765625,
0.0298919677734375,
0.036956787109375,
0.050079345703125,
0.06329345703125,
0.0218963623046875,
0.00836944580078125,
-0.0010805130004882812,
0.049346923828125,
-0.042877197265625,
-0.04132080078125,
-0.063232421875,
0.053985595703125,
-0.01509857177734375,
-0.019622802734375,
0.052001953125,
0.05987548828125,
0.06500244140625,
-0.0305938720703125,
0.07135009765625,
-0.0285186767578125,
0.034393310546875,
-0.02972412109375,
0.0601806640625,
-0.06500244140625,
0.016082763671875,
-0.0303802490234375,
-0.054107666015625,
-0.0094757080078125,
0.050689697265625,
-0.01551055908203125,
0.0179595947265625,
0.0418701171875,
0.05401611328125,
-0.0285797119140625,
0.002246856689453125,
0.01335906982421875,
0.02154541015625,
0.01009368896484375,
0.0232391357421875,
0.04949951171875,
-0.048095703125,
0.036590576171875,
-0.050811767578125,
-0.005657196044921875,
-0.003704071044921875,
-0.060394287109375,
-0.054656982421875,
-0.025665283203125,
-0.031890869140625,
-0.0618896484375,
-0.0020389556884765625,
0.057159423828125,
0.07379150390625,
-0.06378173828125,
-0.0116119384765625,
-0.0189056396484375,
0.01224517822265625,
-0.0181732177734375,
-0.0244293212890625,
0.027374267578125,
0.01104736328125,
-0.0802001953125,
0.0177459716796875,
-0.005828857421875,
0.0192413330078125,
-0.00606536865234375,
0.0163116455078125,
-0.00858306884765625,
0.0075225830078125,
0.0247039794921875,
0.0283660888671875,
-0.058563232421875,
-0.009490966796875,
-0.0091552734375,
0.011016845703125,
0.02191162109375,
0.017120361328125,
-0.024871826171875,
0.0268096923828125,
0.05511474609375,
0.0183258056640625,
0.049774169921875,
0.0034847259521484375,
0.01019287109375,
-0.032623291015625,
0.027374267578125,
0.00101470947265625,
0.0455322265625,
0.01385498046875,
-0.038604736328125,
0.035125732421875,
0.04083251953125,
-0.0252685546875,
-0.06292724609375,
0.007030487060546875,
-0.07843017578125,
-0.028350830078125,
0.07440185546875,
-0.0274200439453125,
-0.05914306640625,
0.01092529296875,
-0.016632080078125,
0.0238494873046875,
-0.03033447265625,
0.055938720703125,
0.0382080078125,
-0.0255889892578125,
-0.0173187255859375,
-0.056488037109375,
0.0193939208984375,
0.0174560546875,
-0.070068359375,
-0.01497650146484375,
0.025543212890625,
0.046142578125,
0.03668212890625,
0.044525146484375,
-0.030242919921875,
0.0172119140625,
0.023040771484375,
0.030426025390625,
0.006778717041015625,
0.0259246826171875,
-0.01202392578125,
0.01253509521484375,
-0.01080322265625,
-0.037017822265625
]
] |
BridgeTower/bridgetower-large-itm-mlm-itc | 2023-03-08T22:33:21.000Z | [
"transformers",
"pytorch",
"bridgetower",
"gaudi",
"en",
"dataset:conceptual_captions",
"dataset:conceptual_12m",
"dataset:sbu_captions",
"dataset:visual_genome",
"dataset:mscoco_captions",
"arxiv:2206.08657",
"arxiv:1504.00325",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | null | BridgeTower | null | null | BridgeTower/bridgetower-large-itm-mlm-itc | 3 | 59,264 | transformers | 2023-02-11T00:25:58 | ---
language: en
tags:
- bridgetower
- gaudi
license: mit
datasets:
- conceptual_captions
- conceptual_12m
- sbu_captions
- visual_genome
- mscoco_captions
---
# BridgeTower large-itm-mlm-itc model
The BridgeTower model was proposed in "BridgeTower: Building Bridges Between Encoders in Vision-Language Representation Learning" by Xiao Xu, Chenfei Wu, Shachar Rosenman, Vasudev Lal, Wanxiang Che, Nan Duan.
The model was pretrained on English-language data using masked language modeling (MLM) and image-text matching (ITM) objectives. It was introduced in
[this paper](https://arxiv.org/pdf/2206.08657.pdf) and first released in
[this repository](https://github.com/microsoft/BridgeTower).
BridgeTower was accepted at [AAAI'23](https://aaai.org/Conferences/AAAI-23/).
## Model description
The abstract from the paper is the following:
Vision-Language (VL) models with the Two-Tower architecture have dominated visual-language representation learning in recent years. Current VL models either use lightweight uni-modal encoders and learn to extract, align and fuse both modalities simultaneously in a deep cross-modal encoder, or feed the last-layer uni-modal representations from the deep pre-trained uni-modal encoders into the top cross-modal encoder. Both approaches potentially restrict vision-language representation learning and limit model performance. In this paper, we propose BridgeTower, which introduces multiple bridge layers that build a connection between the top layers of uni-modal encoders and each layer of the cross-modal encoder. This enables effective bottom-up cross-modal alignment and fusion between visual and textual representations of different semantic levels of pre-trained uni-modal encoders in the cross-modal encoder. Pre-trained with only 4M images, BridgeTower achieves state-of-the-art performance on various downstream vision-language tasks. In particular, on the VQAv2 test-std set, BridgeTower achieves an accuracy of 78.73%, outperforming the previous state-of-the-art model METER by 1.09% with the same pre-training data and almost negligible additional parameters and computational costs. Notably, when further scaling the model, BridgeTower achieves an accuracy of 81.15%, surpassing models that are pre-trained on orders-of-magnitude larger datasets.
## Intended uses & limitations
### How to use
Here is how to use this model to perform contrastive learning between image and text pairs:
```python
from transformers import BridgeTowerProcessor, BridgeTowerForContrastiveLearning
import requests
from PIL import Image
import torch
image_urls = [
"https://farm4.staticflickr.com/3395/3428278415_81c3e27f15_z.jpg",
"http://images.cocodataset.org/val2017/000000039769.jpg"]
texts = [
"two dogs in a car",
"two cats sleeping on a couch"]
images = [Image.open(requests.get(url, stream=True).raw) for url in image_urls]
processor = BridgeTowerProcessor.from_pretrained("BridgeTower/bridgetower-large-itm-mlm")
model = BridgeTowerForContrastiveLearning.from_pretrained("BridgeTower/bridgetower-large-itm-mlm-itc")
inputs = processor(images, texts, padding=True, return_tensors="pt")
outputs = model(**inputs)
inputs = processor(images, texts[::-1], padding=True, return_tensors="pt")
outputs_swapped = model(**inputs)
print('Loss', outputs.loss.item())
# Loss 0.00191505195107311
print('Loss with swapped images', outputs_swapped.loss.item())
# Loss with swapped images 2.1259872913360596
```
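The low loss for matching pairs and high loss for swapped pairs reflects the image-text contrastive (ITC) objective: similarities between an image embedding and every candidate text embedding are temperature-scaled and softmaxed, and the matching caption should dominate. A minimal sketch of that idea with toy hand-made vectors (real BridgeTower embeddings are high-dimensional model outputs, so the numbers here are purely illustrative):

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

image_embed = [0.9, 0.1, 0.0]        # toy image embedding
text_embeds = [
    [0.8, 0.2, 0.1],                 # matching caption (similar direction)
    [0.0, 0.1, 0.9],                 # mismatched caption
]
temperature = 0.07                   # a common choice in contrastive losses
sims = [cosine(image_embed, t) / temperature for t in text_embeds]
probs = softmax(sims)
print(probs)  # the matching caption gets almost all the probability mass
```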
Here is how to use this model to perform image and text matching:
```python
from transformers import BridgeTowerProcessor, BridgeTowerForImageAndTextRetrieval
import requests
from PIL import Image
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
texts = ["An image of two cats chilling on a couch", "A football player scoring a goal"]
processor = BridgeTowerProcessor.from_pretrained("BridgeTower/bridgetower-large-itm-mlm-gaudi")
model = BridgeTowerForImageAndTextRetrieval.from_pretrained("BridgeTower/bridgetower-large-itm-mlm-gaudi")
# forward pass
scores = dict()
for text in texts:
# prepare inputs
encoding = processor(image, text, return_tensors="pt")
outputs = model(**encoding)
scores[text] = outputs.logits[0,1].item()
```
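With `logits[0, 1]` interpreted as the "match" score, ranking candidate captions for an image reduces to a max over the `scores` dict. The values below are toy placeholders standing in for real model outputs:

```python
# Toy scores dict mimicking what the retrieval loop above would produce;
# the numeric values are illustrative, not actual model logits.
scores = {
    "An image of two cats chilling on a couch": 4.2,
    "A football player scoring a goal": -1.7,
}
best_caption = max(scores, key=scores.get)
print(best_caption)
```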
Here is how to use this model to perform masked language modeling:
```python
from transformers import BridgeTowerProcessor, BridgeTowerForMaskedLM
from PIL import Image
import requests
url = "http://images.cocodataset.org/val2017/000000360943.jpg"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")
text = "a <mask> looking out of the window"
processor = BridgeTowerProcessor.from_pretrained("BridgeTower/bridgetower-large-itm-mlm-gaudi")
model = BridgeTowerForMaskedLM.from_pretrained("BridgeTower/bridgetower-large-itm-mlm-gaudi")
# prepare inputs
encoding = processor(image, text, return_tensors="pt")
# forward pass
outputs = model(**encoding)
results = processor.decode(outputs.logits.argmax(dim=-1).squeeze(0).tolist())
print(results)
#.a cat looking out of the window.
```
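Instead of taking only the argmax at the masked position, you can inspect the top-k candidate tokens. Sketched here over toy (token, logit) pairs rather than real tokenizer IDs and logits:

```python
# Hypothetical per-token logits at the <mask> position; in practice you
# would take outputs.logits at the mask index and map IDs back to tokens.
mask_logits = {"cat": 9.1, "dog": 7.4, "bird": 3.2, "car": -1.0}
top2 = sorted(mask_logits, key=mask_logits.get, reverse=True)[:2]
print(top2)  # ['cat', 'dog']
```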
## Training data
The BridgeTower model was pretrained on the following public image-caption datasets:
- [Conceptual Captions (CC3M)](https://ai.google.com/research/ConceptualCaptions/)
- [Conceptual 12M (CC12M)](https://github.com/google-research-datasets/conceptual-12m)
- [SBU Captions](https://www.cs.rice.edu/~vo9/sbucaptions/)
- [MSCOCO Captions](https://arxiv.org/pdf/1504.00325.pdf)
- [Visual Genome](https://visualgenome.org/)
The total number of unique images in the combined data is around 14M.
## Training procedure
### Pretraining
The model was pre-trained for 10 epochs on an Intel AI supercomputing cluster using 512 Gaudis and 128 Xeons with a batch size of 2048.
The optimizer used was AdamW with a learning rate of 1e-7. No data augmentation was used except for center-crop. The image resolution in pre-training is set to 294 x 294.
## Evaluation results
Please refer to [Table 5](https://arxiv.org/pdf/2206.08657.pdf) for BridgeTower's performance on Image Retrieval and other downstream tasks.
### BibTeX entry and citation info
```bibtex
@article{xu2022bridge,
title={BridgeTower: Building Bridges Between Encoders in Vision-Language Representation Learning},
author={Xu, Xiao and Wu, Chenfei and Rosenman, Shachar and Lal, Vasudev and Che, Wanxiang and Duan, Nan},
journal={arXiv preprint arXiv:2206.08657},
year={2022}
}
``` | 6,451 | [
[
-0.021697998046875,
-0.04547119140625,
0.0092010498046875,
0.0204925537109375,
-0.034393310546875,
-0.004253387451171875,
-0.0236053466796875,
-0.03411865234375,
0.00588226318359375,
0.042816162109375,
-0.033447265625,
-0.03961181640625,
-0.051788330078125,
0.01222991943359375,
-0.01192474365234375,
0.056182861328125,
-0.0171051025390625,
0.016387939453125,
-0.01441192626953125,
-0.004241943359375,
-0.0274810791015625,
-0.0248260498046875,
-0.051361083984375,
-0.0135345458984375,
0.0150299072265625,
0.005008697509765625,
0.032684326171875,
0.0487060546875,
0.050872802734375,
0.03204345703125,
0.004283905029296875,
0.021484375,
-0.036712646484375,
-0.012603759765625,
-0.00472259521484375,
-0.0263519287109375,
-0.0158538818359375,
0.002223968505859375,
0.0616455078125,
0.0330810546875,
0.010772705078125,
0.0272064208984375,
0.0177001953125,
0.04638671875,
-0.043121337890625,
0.035400390625,
-0.051055908203125,
0.0034332275390625,
-0.01505279541015625,
-0.01352691650390625,
-0.0457763671875,
0.002391815185546875,
0.0003311634063720703,
-0.0360107421875,
0.0254669189453125,
0.0198516845703125,
0.118408203125,
0.01739501953125,
-0.0224151611328125,
0.0007367134094238281,
-0.03436279296875,
0.07220458984375,
-0.042236328125,
0.028411865234375,
0.0225830078125,
0.006702423095703125,
0.01419830322265625,
-0.06591796875,
-0.06005859375,
-0.0174102783203125,
-0.0228424072265625,
0.0225830078125,
-0.03521728515625,
0.00493621826171875,
0.0245361328125,
0.02410888671875,
-0.050933837890625,
0.0030651092529296875,
-0.053009033203125,
-0.0017232894897460938,
0.0413818359375,
0.003253936767578125,
0.0259857177734375,
-0.020965576171875,
-0.033355712890625,
-0.024871826171875,
-0.0330810546875,
0.0106048583984375,
0.017486572265625,
0.01629638671875,
-0.0196380615234375,
0.034332275390625,
0.0005154609680175781,
0.06982421875,
0.003582000732421875,
-0.0121002197265625,
0.032989501953125,
-0.0227508544921875,
-0.03875732421875,
-0.0038013458251953125,
0.07342529296875,
0.027984619140625,
0.03436279296875,
0.00750732421875,
-0.0005774497985839844,
0.0063934326171875,
0.0083770751953125,
-0.084228515625,
-0.0322265625,
-0.0026702880859375,
-0.042755126953125,
-0.0147552490234375,
-0.0013275146484375,
-0.05096435546875,
-0.002925872802734375,
-0.0123748779296875,
0.055145263671875,
-0.03778076171875,
-0.0130462646484375,
0.00426483154296875,
-0.0096893310546875,
0.01690673828125,
0.004726409912109375,
-0.06719970703125,
0.005481719970703125,
0.005401611328125,
0.06427001953125,
-0.0036029815673828125,
-0.0272979736328125,
-0.0229949951171875,
0.0038700103759765625,
-0.024749755859375,
0.04071044921875,
-0.0362548828125,
-0.00757598876953125,
-0.011932373046875,
0.0211334228515625,
-0.034088134765625,
-0.03668212890625,
0.021881103515625,
-0.040130615234375,
0.04107666015625,
-0.0014638900756835938,
-0.0269012451171875,
-0.0275726318359375,
0.01424407958984375,
-0.048797607421875,
0.060211181640625,
0.00356292724609375,
-0.0672607421875,
0.0229034423828125,
-0.041168212890625,
-0.025482177734375,
0.00409698486328125,
-0.015533447265625,
-0.06500244140625,
-0.015533447265625,
0.037139892578125,
0.054962158203125,
-0.03289794921875,
0.015899658203125,
-0.01187896728515625,
-0.031890869140625,
0.006439208984375,
-0.03521728515625,
0.0931396484375,
0.00774383544921875,
-0.0394287109375,
-0.0018510818481445312,
-0.06231689453125,
-0.003780364990234375,
0.03936767578125,
-0.01042938232421875,
-0.00885772705078125,
-0.01837158203125,
0.0126495361328125,
0.020355224609375,
0.01253509521484375,
-0.03082275390625,
0.0022678375244140625,
-0.015716552734375,
0.0255279541015625,
0.055267333984375,
-0.01715087890625,
0.0209503173828125,
-0.0130462646484375,
0.03289794921875,
0.0086212158203125,
0.00952911376953125,
-0.039520263671875,
-0.043212890625,
-0.07354736328125,
-0.047454833984375,
0.0272064208984375,
0.04248046875,
-0.07293701171875,
0.0255279541015625,
-0.0260772705078125,
-0.03399658203125,
-0.06182861328125,
0.0148468017578125,
0.04522705078125,
0.0457763671875,
0.047271728515625,
-0.04248046875,
-0.038787841796875,
-0.07373046875,
-0.0053863525390625,
-0.0060272216796875,
-0.0010585784912109375,
0.0257415771484375,
0.044342041015625,
-0.02728271484375,
0.04803466796875,
-0.0307464599609375,
-0.036712646484375,
-0.0199432373046875,
0.003726959228515625,
0.0221405029296875,
0.05609130859375,
0.04998779296875,
-0.06805419921875,
-0.04339599609375,
-0.0025997161865234375,
-0.06707763671875,
0.0092010498046875,
-0.0114898681640625,
0.0023479461669921875,
0.030242919921875,
0.024993896484375,
-0.039459228515625,
0.02752685546875,
0.047882080078125,
-0.01605224609375,
0.0289459228515625,
-0.014068603515625,
0.028533935546875,
-0.10418701171875,
0.0109405517578125,
0.022491455078125,
-0.005947113037109375,
-0.04107666015625,
0.00667572021484375,
0.00865936279296875,
-0.0102996826171875,
-0.0260467529296875,
0.03515625,
-0.056549072265625,
0.005695343017578125,
0.005893707275390625,
0.004650115966796875,
0.01641845703125,
0.059722900390625,
0.0168304443359375,
0.04962158203125,
0.06011962890625,
-0.03875732421875,
0.029296875,
0.0208740234375,
-0.04638671875,
0.033782958984375,
-0.0430908203125,
0.0031795501708984375,
-0.0171356201171875,
0.0027942657470703125,
-0.0782470703125,
-0.01025390625,
0.0201568603515625,
-0.036590576171875,
0.0550537109375,
-0.0018510818481445312,
-0.0374755859375,
-0.03759765625,
-0.0105133056640625,
0.034576416015625,
0.046234130859375,
-0.05584716796875,
0.057281494140625,
0.00949859619140625,
-0.0038433074951171875,
-0.0596923828125,
-0.074462890625,
0.01751708984375,
-0.005115509033203125,
-0.06353759765625,
0.044769287109375,
0.00013017654418945312,
-0.0019664764404296875,
0.005580902099609375,
0.018768310546875,
-0.00904083251953125,
-0.01190185546875,
0.019287109375,
0.046173095703125,
-0.0198516845703125,
0.0004973411560058594,
-0.020751953125,
-0.00988006591796875,
-0.00782012939453125,
-0.01519775390625,
0.039337158203125,
-0.0205535888671875,
-0.0161590576171875,
-0.052093505859375,
0.01519775390625,
0.015838623046875,
-0.026611328125,
0.0740966796875,
0.067626953125,
-0.024017333984375,
-0.005619049072265625,
-0.03948974609375,
-0.006649017333984375,
-0.0396728515625,
0.04925537109375,
-0.026763916015625,
-0.058837890625,
0.0266265869140625,
0.0123291015625,
-0.00923919677734375,
0.0379638671875,
0.04302978515625,
-0.004665374755859375,
0.08038330078125,
0.06390380859375,
0.0003647804260253906,
0.0477294921875,
-0.046600341796875,
0.0260467529296875,
-0.04034423828125,
-0.0305328369140625,
-0.024871826171875,
-0.00547027587890625,
-0.055145263671875,
-0.03680419921875,
0.01544952392578125,
0.0023193359375,
-0.0202789306640625,
0.0450439453125,
-0.04681396484375,
0.0280914306640625,
0.045684814453125,
0.0211944580078125,
0.01076507568359375,
0.0220947265625,
-0.0180206298828125,
-0.01462554931640625,
-0.053192138671875,
-0.029449462890625,
0.06573486328125,
0.027618408203125,
0.052001953125,
-0.00835418701171875,
0.0289459228515625,
-0.003231048583984375,
0.016082763671875,
-0.0340576171875,
0.0496826171875,
-0.0236053466796875,
-0.0394287109375,
0.0004837512969970703,
-0.0380859375,
-0.06475830078125,
0.01419830322265625,
-0.0227813720703125,
-0.0633544921875,
0.0174560546875,
0.01898193359375,
-0.005466461181640625,
0.03948974609375,
-0.07012939453125,
0.08447265625,
-0.014862060546875,
-0.04998779296875,
-0.007503509521484375,
-0.054412841796875,
0.03070068359375,
0.002910614013671875,
-0.00798797607421875,
0.018096923828125,
0.0277862548828125,
0.0543212890625,
-0.005687713623046875,
0.05401611328125,
0.003910064697265625,
0.007236480712890625,
0.0068817138671875,
-0.01342010498046875,
0.0212860107421875,
-0.0247955322265625,
0.002193450927734375,
0.0413818359375,
-0.0149688720703125,
-0.0455322265625,
-0.0306396484375,
0.04107666015625,
-0.0582275390625,
-0.04461669921875,
-0.0300750732421875,
-0.0273590087890625,
0.0078125,
0.01210784912109375,
0.046875,
0.0360107421875,
0.01250457763671875,
0.0244598388671875,
0.049957275390625,
-0.0308074951171875,
0.040740966796875,
0.023193359375,
-0.016143798828125,
-0.034576416015625,
0.08612060546875,
0.00926971435546875,
0.025787353515625,
0.04876708984375,
-0.013519287109375,
-0.003757476806640625,
-0.0305938720703125,
-0.038604736328125,
0.018310546875,
-0.062103271484375,
-0.0150146484375,
-0.0589599609375,
-0.032958984375,
-0.03253173828125,
-0.0181884765625,
-0.0233306884765625,
-0.0019445419311523438,
-0.0408935546875,
0.00316619873046875,
0.00812530517578125,
0.02935791015625,
0.007106781005859375,
0.020599365234375,
-0.0374755859375,
0.0227508544921875,
0.03466796875,
0.014007568359375,
-0.00946044921875,
-0.0540771484375,
-0.0291595458984375,
-0.0025730133056640625,
-0.0280609130859375,
-0.058380126953125,
0.050537109375,
0.01319122314453125,
0.034515380859375,
0.0386962890625,
-0.0230865478515625,
0.05621337890625,
-0.0251312255859375,
0.0699462890625,
0.028411865234375,
-0.0732421875,
0.0350341796875,
0.00189971923828125,
0.0179443359375,
0.023284912109375,
0.043792724609375,
-0.0172119140625,
-0.00981903076171875,
-0.04241943359375,
-0.05816650390625,
0.0595703125,
0.032135009765625,
0.006710052490234375,
0.018310546875,
0.0019817352294921875,
-0.010772705078125,
0.0173797607421875,
-0.0830078125,
-0.02410888671875,
-0.0242156982421875,
-0.0223388671875,
-0.01509857177734375,
-0.01155853271484375,
0.01157379150390625,
-0.044525146484375,
0.07244873046875,
-0.0085906982421875,
0.04681396484375,
0.0097808837890625,
-0.0284271240234375,
-0.003055572509765625,
-0.00824737548828125,
0.03741455078125,
0.03582763671875,
-0.0221710205078125,
0.01371002197265625,
0.01470947265625,
-0.041778564453125,
-0.00537109375,
0.028228759765625,
-0.00859832763671875,
0.01184844970703125,
0.0469970703125,
0.09173583984375,
-0.01184844970703125,
-0.0423583984375,
0.0447998046875,
-0.007904052734375,
-0.0157012939453125,
-0.01480865478515625,
-0.003688812255859375,
-0.0031585693359375,
0.026763916015625,
0.02850341796875,
0.01499176025390625,
-0.010955810546875,
-0.04290771484375,
0.00539398193359375,
0.03704833984375,
-0.03533935546875,
-0.013092041015625,
0.061187744140625,
-0.00403594970703125,
-0.0167083740234375,
0.0535888671875,
-0.00997161865234375,
-0.04443359375,
0.053955078125,
0.051055908203125,
0.07086181640625,
-0.01568603515625,
0.0200958251953125,
0.0386962890625,
0.043304443359375,
0.01198577880859375,
-0.0008244514465332031,
-0.01114654541015625,
-0.05731201171875,
-0.0255584716796875,
-0.0543212890625,
-0.006938934326171875,
0.0142669677734375,
-0.030487060546875,
0.034088134765625,
-0.0227203369140625,
-0.0024967193603515625,
0.00855255126953125,
0.01103973388671875,
-0.0736083984375,
0.029815673828125,
0.01129150390625,
0.055145263671875,
-0.06622314453125,
0.061187744140625,
0.05120849609375,
-0.060089111328125,
-0.062225341796875,
-0.00124359130859375,
-0.0257720947265625,
-0.07611083984375,
0.07080078125,
0.042633056640625,
-0.00371551513671875,
0.01287841796875,
-0.05767822265625,
-0.04888916015625,
0.0953369140625,
0.037811279296875,
-0.04498291015625,
0.0010395050048828125,
0.0089569091796875,
0.034820556640625,
-0.035400390625,
0.03955078125,
0.020538330078125,
0.025360107421875,
0.004871368408203125,
-0.061492919921875,
0.008514404296875,
-0.0401611328125,
0.01432037353515625,
-0.0098419189453125,
-0.0595703125,
0.0748291015625,
-0.023773193359375,
-0.0299835205078125,
0.002223968505859375,
0.043060302734375,
0.0166015625,
0.0180206298828125,
0.02728271484375,
0.048248291015625,
0.037689208984375,
-0.00836181640625,
0.07568359375,
-0.0263824462890625,
0.0357666015625,
0.0543212890625,
-0.0002598762512207031,
0.0504150390625,
0.0272064208984375,
-0.009674072265625,
0.051361083984375,
0.053253173828125,
-0.02789306640625,
0.04754638671875,
0.00453948974609375,
-0.01519775390625,
-0.0206756591796875,
-0.0109405517578125,
-0.0244598388671875,
0.029693603515625,
0.01081085205078125,
-0.06658935546875,
0.0009183883666992188,
0.0162811279296875,
-0.003025054931640625,
0.0023365020751953125,
-0.01480865478515625,
0.0396728515625,
0.01078033447265625,
-0.03765869140625,
0.0640869140625,
0.00930023193359375,
0.054901123046875,
-0.03985595703125,
-0.007785797119140625,
-0.00003814697265625,
0.0259857177734375,
-0.00447845458984375,
-0.04522705078125,
-0.00742340087890625,
-0.00806427001953125,
-0.01372528076171875,
-0.0255584716796875,
0.05535888671875,
-0.046875,
-0.051544189453125,
0.03363037109375,
0.00748443603515625,
0.01049041748046875,
-0.0008959770202636719,
-0.0679931640625,
0.00823974609375,
0.0124359130859375,
-0.027008056640625,
0.005695343017578125,
0.01175689697265625,
0.007793426513671875,
0.04534912109375,
0.036376953125,
0.0006995201110839844,
-0.0016880035400390625,
-0.003818511962890625,
0.06768798828125,
-0.040985107421875,
-0.0191192626953125,
-0.0732421875,
0.042388916015625,
-0.023895263671875,
-0.022125244140625,
0.05413818359375,
0.041900634765625,
0.0794677734375,
-0.01348114013671875,
0.049163818359375,
0.0014047622680664062,
-0.00513458251953125,
-0.05291748046875,
0.048614501953125,
-0.05322265625,
0.00846099853515625,
-0.03643798828125,
-0.07769775390625,
-0.0252685546875,
0.054229736328125,
-0.019073486328125,
0.0233306884765625,
0.055145263671875,
0.08404541015625,
-0.012603759765625,
-0.0094451904296875,
0.02972412109375,
0.0234375,
0.01436614990234375,
0.050811767578125,
0.0283966064453125,
-0.057159423828125,
0.040008544921875,
-0.0301361083984375,
-0.0165863037109375,
-0.0144195556640625,
-0.0626220703125,
-0.09283447265625,
-0.07391357421875,
-0.035400390625,
-0.032745361328125,
0.0162811279296875,
0.06854248046875,
0.05963134765625,
-0.057891845703125,
-0.00720977783203125,
-0.0008502006530761719,
-0.004058837890625,
-0.0221710205078125,
-0.0187835693359375,
0.043792724609375,
-0.004405975341796875,
-0.0655517578125,
-0.0003006458282470703,
0.0301513671875,
0.01080322265625,
-0.0252532958984375,
0.001953125,
-0.031463623046875,
-0.0004944801330566406,
0.040924072265625,
0.0066680908203125,
-0.03997802734375,
0.0019664764404296875,
-0.002445220947265625,
-0.017669677734375,
0.00965118408203125,
0.0269775390625,
-0.052581787109375,
0.050537109375,
0.0272369384765625,
0.053070068359375,
0.06756591796875,
-0.0138397216796875,
0.0240478515625,
-0.05767822265625,
0.016143798828125,
-0.0027618408203125,
0.04083251953125,
0.050811767578125,
-0.023193359375,
0.0270538330078125,
0.043548583984375,
-0.0333251953125,
-0.0552978515625,
-0.0030345916748046875,
-0.0811767578125,
-0.0254058837890625,
0.0762939453125,
-0.01275634765625,
-0.040374755859375,
0.0171051025390625,
-0.0201568603515625,
0.0323486328125,
-0.0075836181640625,
0.055694580078125,
0.032501220703125,
0.0232086181640625,
-0.0469970703125,
-0.03363037109375,
0.02105712890625,
0.035247802734375,
-0.043853759765625,
-0.0299835205078125,
0.0158233642578125,
0.0209197998046875,
0.0303192138671875,
0.04510498046875,
0.00913238525390625,
0.005645751953125,
0.01116943359375,
0.030670166015625,
-0.01558685302734375,
-0.02001953125,
-0.024383544921875,
0.013397216796875,
-0.03668212890625,
-0.03143310546875
]
] |
hf-internal-testing/tiny-stable-diffusion-pipe | 2023-05-05T15:29:33.000Z | [
"diffusers",
"text-to-image",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | hf-internal-testing | null | null | hf-internal-testing/tiny-stable-diffusion-pipe | 1 | 59,055 | diffusers | 2022-09-20T18:12:10 | ---
library_name: diffusers
tags:
- text-to-image
---
```py
from diffusers import DiffusionPipeline

# Tiny pipeline intended for internal testing; outputs are not meaningful.
pipe = DiffusionPipeline.from_pretrained("hf-internal-testing/tiny-stable-diffusion-pipe")
``` | 195 | [
[
-0.019805908203125,
-0.04620361328125,
0.019195556640625,
0.0146942138671875,
-0.01435089111328125,
-0.0027904510498046875,
0.0167388916015625,
0.051788330078125,
0.0086212158203125,
0.010009765625,
-0.0285797119140625,
0.0012331008911132812,
-0.032501220703125,
0.003810882568359375,
-0.0163421630859375,
0.0556640625,
-0.0069732666015625,
0.01519012451171875,
-0.01061248779296875,
0.005123138427734375,
0.0160980224609375,
-0.007236480712890625,
-0.055633544921875,
-0.048828125,
0.037078857421875,
0.037750244140625,
0.0214691162109375,
0.015869140625,
0.031524658203125,
0.0193023681640625,
-0.01593017578125,
-0.0404052734375,
-0.01320648193359375,
0.0032863616943359375,
0.01049041748046875,
-0.035125732421875,
-0.007106781005859375,
-0.011962890625,
0.076416015625,
0.049346923828125,
-0.0156402587890625,
0.011962890625,
0.0255584716796875,
0.0106658935546875,
-0.0478515625,
0.00013148784637451172,
0.005706787109375,
0.01221466064453125,
-0.007511138916015625,
-0.0267791748046875,
-0.00943756103515625,
-0.0380859375,
0.04022216796875,
-0.046722412109375,
0.00838470458984375,
-0.01393890380859375,
0.0782470703125,
0.04498291015625,
-0.032562255859375,
-0.0019474029541015625,
-0.034698486328125,
0.032470703125,
-0.038299560546875,
0.0253753662109375,
0.038848876953125,
0.01413726806640625,
-0.0266571044921875,
-0.09613037109375,
-0.007480621337890625,
-0.0186309814453125,
-0.002658843994140625,
0.004459381103515625,
0.02313232421875,
0.010528564453125,
0.0228424072265625,
0.033203125,
-0.0209503173828125,
-0.02630615234375,
-0.0501708984375,
-0.0162353515625,
0.033477783203125,
0.00713348388671875,
0.018157958984375,
0.0285797119140625,
0.001262664794921875,
-0.018463134765625,
-0.024383544921875,
-0.0098876953125,
0.018951416015625,
0.0015783309936523438,
-0.03179931640625,
0.039031982421875,
-0.0073394775390625,
0.0281219482421875,
0.0321044921875,
-0.016510009765625,
0.051788330078125,
-0.01020050048828125,
-0.0268096923828125,
0.017608642578125,
0.048187255859375,
-0.01309967041015625,
-0.01050567626953125,
0.04449462890625,
-0.01256561279296875,
-0.024658203125,
-0.00669097900390625,
-0.1153564453125,
-0.055999755859375,
0.007030487060546875,
-0.040924072265625,
-0.0267333984375,
-0.0007219314575195312,
-0.05181884765625,
-0.00481414794921875,
0.0156402587890625,
0.04986572265625,
0.00742340087890625,
-0.0345458984375,
-0.017242431640625,
-0.0643310546875,
0.0141143798828125,
-0.0012903213500976562,
-0.0379638671875,
0.035400390625,
0.01253509521484375,
0.0777587890625,
0.026947021484375,
-0.032867431640625,
-0.05401611328125,
-0.01534271240234375,
-0.0215911865234375,
0.04559326171875,
-0.0065765380859375,
-0.0274810791015625,
-0.00591278076171875,
0.0242462158203125,
-0.008087158203125,
-0.054901123046875,
0.0295867919921875,
-0.0169677734375,
0.032928466796875,
0.0221099853515625,
-0.023406982421875,
0.00666046142578125,
-0.00739288330078125,
-0.003814697265625,
0.08056640625,
0.048187255859375,
-0.07830810546875,
0.0261993408203125,
-0.059417724609375,
-0.03955078125,
0.0013704299926757812,
0.0255279541015625,
-0.04241943359375,
-0.0161895751953125,
-0.0281219482421875,
0.01377105712890625,
0.0307464599609375,
-0.018951416015625,
-0.0589599609375,
-0.03692626953125,
-0.0088958740234375,
-0.01812744140625,
0.08526611328125,
0.024505615234375,
-0.006072998046875,
0.031768798828125,
-0.039276123046875,
-0.007717132568359375,
-0.0257568359375,
-0.036865234375,
-0.0260772705078125,
-0.0132904052734375,
0.01300048828125,
-0.005573272705078125,
0.005157470703125,
-0.047149658203125,
-0.00142669677734375,
-0.0489501953125,
0.051055908203125,
0.05303955078125,
0.0148773193359375,
0.038665771484375,
-0.011993408203125,
0.037078857421875,
0.00496673583984375,
0.00043129920959472656,
0.03167724609375,
-0.05169677734375,
-0.0625,
-0.029052734375,
0.010589599609375,
0.0254364013671875,
-0.0222625732421875,
0.0300445556640625,
0.0229339599609375,
-0.049224853515625,
-0.031463623046875,
0.0237884521484375,
0.0010747909545898438,
0.0235443115234375,
0.0004830360412597656,
-0.01079559326171875,
-0.032623291015625,
-0.0202789306640625,
-0.00101470947265625,
0.0007433891296386719,
-0.00832366943359375,
-0.0006322860717773438,
0.060211181640625,
-0.052978515625,
0.042816162109375,
-0.07537841796875,
-0.0287017822265625,
0.0084381103515625,
0.027374267578125,
0.037109375,
0.06329345703125,
0.038726806640625,
-0.01404571533203125,
-0.0797119140625,
-0.0079193115234375,
-0.0184326171875,
-0.01413726806640625,
0.024627685546875,
-0.026702880859375,
-0.02484130859375,
0.032470703125,
-0.03741455078125,
0.034027099609375,
0.042877197265625,
-0.059967041015625,
0.047210693359375,
-0.04345703125,
-0.013214111328125,
-0.04840087890625,
0.00970458984375,
-0.004070281982421875,
-0.03448486328125,
-0.0110626220703125,
0.01123809814453125,
0.01079559326171875,
-0.0059967041015625,
-0.04339599609375,
0.061370849609375,
-0.047149658203125,
0.022186279296875,
-0.018524169921875,
-0.0277862548828125,
-0.0179290771484375,
-0.01568603515625,
-0.0003960132598876953,
0.056396484375,
0.058563232421875,
-0.050048828125,
0.0548095703125,
0.01617431640625,
0.00421905517578125,
0.016998291015625,
-0.045257568359375,
0.003833770751953125,
-0.0003924369812011719,
0.022613525390625,
-0.059539794921875,
-0.04327392578125,
0.0220947265625,
-0.01496124267578125,
0.0022563934326171875,
-0.044036865234375,
0.0006461143493652344,
-0.0531005859375,
-0.0253753662109375,
0.050506591796875,
0.07470703125,
-0.039306640625,
0.0183258056640625,
0.0081024169921875,
0.002063751220703125,
-0.0260772705078125,
-0.048828125,
-0.03851318359375,
-0.04180908203125,
-0.0379638671875,
-0.00012791156768798828,
-0.02850341796875,
-0.006336212158203125,
-0.01399993896484375,
-0.002704620361328125,
-0.0643310546875,
0.006061553955078125,
0.0195770263671875,
-0.005828857421875,
-0.030914306640625,
-0.036224365234375,
0.003337860107421875,
-0.0251312255859375,
0.0233306884765625,
-0.0222320556640625,
0.039398193359375,
0.000518798828125,
0.004192352294921875,
-0.044342041015625,
-0.018157958984375,
0.0209503173828125,
0.01250457763671875,
0.0235595703125,
0.0711669921875,
-0.03192138671875,
-0.027374267578125,
-0.01253509521484375,
-0.031982421875,
-0.040008544921875,
0.0031833648681640625,
-0.0209503173828125,
-0.030303955078125,
0.00852203369140625,
-0.025421142578125,
-0.0212554931640625,
0.018310546875,
0.0350341796875,
-0.0188751220703125,
0.070068359375,
0.047119140625,
0.034698486328125,
0.033843994140625,
-0.035308837890625,
-0.00646209716796875,
-0.058868408203125,
0.00791168212890625,
-0.034149169921875,
-0.00292205810546875,
0.0148468017578125,
-0.0149383544921875,
0.03924560546875,
0.0288848876953125,
-0.05389404296875,
0.0069732666015625,
-0.0289154052734375,
0.0438232421875,
0.03717041015625,
-0.0078125,
-0.003040313720703125,
-0.0257110595703125,
-0.017791748046875,
0.01374053955078125,
-0.0166015625,
-0.02520751953125,
0.08575439453125,
0.0286407470703125,
0.076171875,
-0.0223541259765625,
0.057159423828125,
-0.007015228271484375,
0.045196533203125,
-0.06207275390625,
-0.0175933837890625,
-0.004474639892578125,
-0.0682373046875,
-0.019744873046875,
-0.0204620361328125,
-0.0692138671875,
0.024261474609375,
0.00800323486328125,
-0.02398681640625,
0.0061187744140625,
0.022308349609375,
-0.0419921875,
0.00742340087890625,
-0.03515625,
0.0947265625,
-0.0119476318359375,
-0.0269012451171875,
-0.006984710693359375,
-0.0284576416015625,
0.0269927978515625,
-0.0071868896484375,
0.007419586181640625,
0.005443572998046875,
-0.0130615234375,
0.0552978515625,
-0.059844970703125,
0.040374755859375,
-0.040252685546875,
0.0070037841796875,
0.0130615234375,
0.003208160400390625,
-0.002719879150390625,
0.0282745361328125,
-0.01074981689453125,
-0.010528564453125,
0.03662109375,
-0.0419921875,
-0.010833740234375,
0.04376220703125,
-0.06365966796875,
0.0010776519775390625,
-0.054107666015625,
-0.0013065338134765625,
0.0246734619140625,
0.046478271484375,
0.050750732421875,
0.037261962890625,
-0.0150604248046875,
-0.006717681884765625,
0.050689697265625,
0.0159912109375,
0.070556640625,
0.00832366943359375,
-0.0242462158203125,
-0.0343017578125,
0.042755126953125,
-0.0002818107604980469,
0.0100860595703125,
0.0113525390625,
0.081298828125,
-0.0282745361328125,
-0.0254364013671875,
-0.0352783203125,
0.007221221923828125,
-0.03662109375,
-0.0012140274047851562,
-0.029876708984375,
-0.034149169921875,
-0.01300048828125,
-0.01122283935546875,
-0.034820556640625,
-0.01141357421875,
-0.046630859375,
0.01377105712890625,
0.01204681396484375,
0.03851318359375,
-0.04901123046875,
0.053619384765625,
-0.049224853515625,
0.0173492431640625,
0.03662109375,
0.0254364013671875,
-0.0122833251953125,
-0.056915283203125,
-0.01093292236328125,
0.0008435249328613281,
-0.04180908203125,
-0.05023193359375,
0.03753662109375,
0.042266845703125,
0.0355224609375,
0.080322265625,
0.0033245086669921875,
0.059356689453125,
-0.0196075439453125,
0.046661376953125,
0.0222320556640625,
-0.05975341796875,
0.0638427734375,
-0.0345458984375,
0.004680633544921875,
0.027069091796875,
0.038726806640625,
-0.020111083984375,
-0.0017786026000976562,
-0.0390625,
-0.052337646484375,
0.0269927978515625,
0.007671356201171875,
0.00380706787109375,
0.01904296875,
0.0180511474609375,
0.01058197021484375,
0.00809478759765625,
-0.057281494140625,
-0.0313720703125,
-0.007549285888671875,
-0.01580810546875,
0.0014581680297851562,
0.0074920654296875,
-0.029052734375,
-0.0836181640625,
0.049774169921875,
-0.0075836181640625,
0.0122833251953125,
0.03570556640625,
-0.00021088123321533203,
-0.0268096923828125,
-0.007568359375,
0.02166748046875,
0.0606689453125,
-0.06427001953125,
0.0187530517578125,
0.0156707763671875,
-0.063232421875,
0.06707763671875,
-0.01070404052734375,
-0.0257568359375,
-0.0017461776733398438,
0.0010232925415039062,
0.0145721435546875,
-0.01377105712890625,
-0.0125579833984375,
0.066162109375,
-0.0186614990234375,
-0.013214111328125,
-0.0692138671875,
0.018218994140625,
0.01751708984375,
0.0011749267578125,
-0.00494384765625,
0.03192138671875,
0.004405975341796875,
-0.03515625,
0.019256591796875,
0.0230255126953125,
-0.053985595703125,
-0.016998291015625,
0.06158447265625,
0.040252685546875,
-0.04766845703125,
0.05859375,
-0.025360107421875,
-0.0240325927734375,
0.02996826171875,
0.044219970703125,
0.0902099609375,
-0.0166778564453125,
-0.006504058837890625,
0.044219970703125,
0.0013494491577148438,
-0.027374267578125,
0.0206146240234375,
-0.001110076904296875,
-0.049713134765625,
-0.0142364501953125,
-0.045745849609375,
-0.0075531005859375,
-0.0161590576171875,
-0.04205322265625,
0.023223876953125,
-0.06451416015625,
-0.02783203125,
-0.019439697265625,
-0.00359344482421875,
-0.035552978515625,
0.006443023681640625,
0.0075531005859375,
0.07470703125,
-0.060546875,
0.09112548828125,
0.07318115234375,
-0.034820556640625,
-0.0280914306640625,
0.0184783935546875,
-0.0189666748046875,
-0.040008544921875,
0.04986572265625,
0.00971221923828125,
-0.01265716552734375,
0.007610321044921875,
-0.0308074951171875,
-0.06695556640625,
0.0699462890625,
-0.00276947021484375,
-0.01543426513671875,
0.016387939453125,
-0.03680419921875,
0.01509857177734375,
-0.01428985595703125,
0.06805419921875,
0.05987548828125,
0.051300048828125,
-0.0096435546875,
-0.0577392578125,
-0.00164794921875,
-0.0304107666015625,
0.0012502670288085938,
0.0128021240234375,
-0.055084228515625,
0.076416015625,
-0.018310546875,
-0.000804901123046875,
0.0184783935546875,
0.046478271484375,
0.0198516845703125,
0.015045166015625,
0.04248046875,
0.055572509765625,
0.053802490234375,
-0.01129913330078125,
0.0360107421875,
0.00653076171875,
0.04730224609375,
0.044525146484375,
-0.00827789306640625,
0.07080078125,
0.052001953125,
-0.0184326171875,
0.0775146484375,
0.054443359375,
-0.005283355712890625,
0.078857421875,
0.041412353515625,
-0.030670166015625,
-0.01148223876953125,
0.049468994140625,
-0.04437255859375,
-0.00504302978515625,
0.0149078369140625,
0.0035228729248046875,
-0.0088958740234375,
-0.010101318359375,
-0.002513885498046875,
-0.04632568359375,
-0.02154541015625,
0.0250091552734375,
0.024383544921875,
-0.050872802734375,
0.059234619140625,
-0.00830841064453125,
0.0919189453125,
-0.060333251953125,
0.006107330322265625,
0.007198333740234375,
0.071044921875,
-0.0301666259765625,
-0.04058837890625,
0.036651611328125,
-0.0298919677734375,
0.009796142578125,
-0.002475738525390625,
0.06292724609375,
-0.0167083740234375,
-0.029327392578125,
0.041961669921875,
0.01081085205078125,
0.02313232421875,
-0.00042176246643066406,
-0.034912109375,
-0.0261993408203125,
-0.01194000244140625,
-0.0295867919921875,
0.0216217041015625,
0.0007395744323730469,
0.061981201171875,
0.06146240234375,
0.0307464599609375,
0.0219879150390625,
0.0263519287109375,
-0.0200653076171875,
0.0214691162109375,
-0.04840087890625,
-0.058685302734375,
-0.038360595703125,
0.047576904296875,
-0.007450103759765625,
-0.06439208984375,
0.041748046875,
0.0360107421875,
0.06396484375,
-0.0276031494140625,
0.05859375,
-0.005405426025390625,
0.00910186767578125,
-0.00978851318359375,
0.080810546875,
-0.038970947265625,
-0.019744873046875,
-0.009002685546875,
-0.059967041015625,
0.004360198974609375,
0.0885009765625,
0.025421142578125,
-0.0023479461669921875,
0.083251953125,
0.060394287109375,
-0.059967041015625,
-0.02166748046875,
-0.01502227783203125,
0.04913330078125,
0.0038013458251953125,
0.0129547119140625,
0.0792236328125,
-0.0321044921875,
0.033538818359375,
-0.06707763671875,
-0.036041259765625,
0.0028247833251953125,
-0.053070068359375,
-0.07940673828125,
-0.0036468505859375,
-0.04986572265625,
-0.060882568359375,
-0.0229339599609375,
0.053619384765625,
0.07684326171875,
-0.052703857421875,
-0.0577392578125,
-0.026519775390625,
0.006862640380859375,
-0.0186309814453125,
-0.0199127197265625,
0.030792236328125,
-0.02008056640625,
-0.0310821533203125,
0.024627685546875,
-0.01207733154296875,
0.032958984375,
-0.0360107421875,
-0.0260009765625,
-0.0012035369873046875,
-0.01104736328125,
0.0108184814453125,
0.02239990234375,
-0.0211639404296875,
-0.025848388671875,
-0.058868408203125,
0.0038166046142578125,
0.004955291748046875,
0.021728515625,
-0.040924072265625,
-0.01284027099609375,
0.08099365234375,
0.01154327392578125,
0.05914306640625,
-0.008880615234375,
0.0445556640625,
-0.04266357421875,
0.02447509765625,
0.0035305023193359375,
0.041656494140625,
0.00399017333984375,
-0.0159759521484375,
0.040771484375,
0.048309326171875,
-0.06146240234375,
-0.047760009765625,
-0.0128936767578125,
-0.057281494140625,
-0.0228424072265625,
0.0784912109375,
-0.0246734619140625,
-0.031829833984375,
-0.0162506103515625,
-0.0263519287109375,
0.019775390625,
-0.02703857421875,
0.0173797607421875,
0.02392578125,
-0.002532958984375,
-0.0004417896270751953,
-0.0242462158203125,
0.05413818359375,
0.00891876220703125,
-0.05810546875,
-0.0209197998046875,
-0.0018939971923828125,
0.0899658203125,
0.0139617919921875,
0.071044921875,
0.005580902099609375,
-0.0029449462890625,
0.03558349609375,
-0.022369384765625,
0.02606201171875,
-0.0111541748046875,
-0.028564453125,
-0.00811767578125,
0.01129913330078125,
-0.033355712890625
]
] |
microsoft/speecht5_hifigan | 2023-02-02T13:08:06.000Z | [
"transformers",
"pytorch",
"hifigan",
"audio",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | null | microsoft | null | null | microsoft/speecht5_hifigan | 10 | 58,517 | transformers | 2023-02-02T13:06:10 | ---
license: mit
tags:
- audio
---
# SpeechT5 HiFi-GAN Vocoder
This is the HiFi-GAN vocoder for use with the SpeechT5 text-to-speech and voice conversion models.
SpeechT5 was first released in [this repository](https://github.com/microsoft/SpeechT5/); the [original weights](https://huggingface.co/mechanicalsea/speecht5-tts) are available on the Hugging Face Hub. The model is released under the [MIT](https://github.com/microsoft/SpeechT5/blob/main/LICENSE) license.
Disclaimer: The team releasing SpeechT5 did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Citation
**BibTeX:**
```bibtex
@inproceedings{ao-etal-2022-speecht5,
title = {{S}peech{T}5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing},
author = {Ao, Junyi and Wang, Rui and Zhou, Long and Wang, Chengyi and Ren, Shuo and Wu, Yu and Liu, Shujie and Ko, Tom and Li, Qing and Zhang, Yu and Wei, Zhihua and Qian, Yao and Li, Jinyu and Wei, Furu},
booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
month = {May},
year = {2022},
    pages = {5723--5738},
}
```
| 1,143 | [
[
-0.03680419921875,
-0.02508544921875,
-0.0034580230712890625,
0.02056884765625,
-0.014892578125,
-0.01218414306640625,
-0.002696990966796875,
-0.0229339599609375,
-0.0010042190551757812,
0.025360107421875,
-0.04364013671875,
-0.03668212890625,
-0.0380859375,
0.0001304149627685547,
-0.0292510986328125,
0.09161376953125,
0.03173828125,
0.009857177734375,
-0.001575469970703125,
-0.0115814208984375,
-0.00965118408203125,
-0.05487060546875,
-0.03363037109375,
-0.040435791015625,
0.034210205078125,
-0.005718231201171875,
0.0361328125,
0.0229034423828125,
0.0257720947265625,
0.01125335693359375,
-0.036346435546875,
-0.0205230712890625,
-0.0340576171875,
-0.019561767578125,
0.014556884765625,
-0.0160675048828125,
-0.073486328125,
-0.004100799560546875,
0.05419921875,
0.023101806640625,
-0.029052734375,
0.011444091796875,
-0.0032329559326171875,
0.021697998046875,
-0.0318603515625,
0.016204833984375,
-0.0304718017578125,
0.0209808349609375,
-0.04052734375,
-0.004398345947265625,
-0.033599853515625,
-0.01436614990234375,
-0.01100921630859375,
-0.0345458984375,
0.01250457763671875,
0.000008761882781982422,
0.0877685546875,
0.02752685546875,
-0.0303192138671875,
0.0019025802612304688,
-0.05096435546875,
0.03179931640625,
-0.06072998046875,
0.07171630859375,
0.01143646240234375,
0.0286102294921875,
-0.007198333740234375,
-0.0804443359375,
-0.068603515625,
0.0016069412231445312,
0.0200042724609375,
0.02276611328125,
-0.029144287109375,
0.035369873046875,
0.03582763671875,
0.039337158203125,
-0.044342041015625,
-0.019989013671875,
-0.052154541015625,
-0.026397705078125,
0.0264129638671875,
-0.0288848876953125,
0.024383544921875,
-0.0379638671875,
-0.050537109375,
-0.0089111328125,
-0.044769287109375,
-0.006565093994140625,
0.003753662109375,
-0.00566864013671875,
-0.0458984375,
0.018524169921875,
0.00537872314453125,
0.037750244140625,
0.0052032470703125,
-0.01508331298828125,
0.05316162109375,
-0.0175323486328125,
-0.03533935546875,
0.006683349609375,
0.0689697265625,
-0.0021991729736328125,
0.002338409423828125,
-0.003620147705078125,
-0.01401519775390625,
-0.007297515869140625,
0.0282745361328125,
-0.08428955078125,
-0.01331329345703125,
0.0011796951293945312,
-0.040924072265625,
-0.01296234130859375,
0.0141143798828125,
-0.04119873046875,
0.0005474090576171875,
0.0092926025390625,
0.041046142578125,
-0.017791748046875,
-0.0308990478515625,
-0.00443267822265625,
-0.01242828369140625,
0.04473876953125,
0.008941650390625,
-0.0477294921875,
0.0217132568359375,
0.0236663818359375,
0.042724609375,
-0.005107879638671875,
-0.027252197265625,
-0.05206298828125,
0.018646240234375,
-0.0077362060546875,
0.021636962890625,
-0.0184173583984375,
-0.034027099609375,
-0.01282501220703125,
0.007503509521484375,
0.0156402587890625,
-0.03662109375,
0.0870361328125,
-0.0579833984375,
0.040252685546875,
-0.025238037109375,
-0.0219879150390625,
-0.017669677734375,
0.0037822723388671875,
-0.050323486328125,
0.0772705078125,
0.0013799667358398438,
-0.04534912109375,
0.0223541259765625,
-0.08819580078125,
-0.01003265380859375,
0.0208740234375,
-0.0105133056640625,
-0.04693603515625,
-0.00623321533203125,
0.0139923095703125,
0.03314208984375,
-0.031890869140625,
0.0010309219360351562,
-0.0106353759765625,
-0.02325439453125,
0.0027523040771484375,
-0.0390625,
0.08233642578125,
0.038604736328125,
-0.0262298583984375,
0.027435302734375,
-0.05517578125,
-0.0087432861328125,
0.00716400146484375,
-0.0306549072265625,
0.0132598876953125,
-0.016082763671875,
0.0347900390625,
0.02008056640625,
0.01021575927734375,
-0.04925537109375,
0.002685546875,
-0.01227569580078125,
0.039764404296875,
0.0303192138671875,
-0.006298065185546875,
0.006023406982421875,
-0.0086517333984375,
0.040435791015625,
0.021942138671875,
0.01271820068359375,
-0.006435394287109375,
-0.04901123046875,
-0.035919189453125,
-0.0201263427734375,
0.031280517578125,
0.0248260498046875,
-0.059295654296875,
0.039459228515625,
-0.0278778076171875,
-0.032257080078125,
-0.05328369140625,
-0.013763427734375,
0.0243072509765625,
0.0308074951171875,
0.036346435546875,
-0.017730712890625,
-0.040435791015625,
-0.044189453125,
-0.002971649169921875,
-0.0188446044921875,
0.0012617111206054688,
0.00485992431640625,
0.02996826171875,
-0.02447509765625,
0.05902099609375,
-0.0064697265625,
-0.028045654296875,
-0.015350341796875,
0.017425537109375,
-0.005626678466796875,
0.040069580078125,
0.057159423828125,
-0.06365966796875,
-0.0188751220703125,
-0.0038928985595703125,
-0.038177490234375,
-0.016265869140625,
0.0161590576171875,
0.002025604248046875,
0.01155853271484375,
0.0265655517578125,
-0.047607421875,
0.03961181640625,
0.06463623046875,
-0.0259552001953125,
0.02850341796875,
-0.006839752197265625,
0.007228851318359375,
-0.11126708984375,
0.0095977783203125,
-0.0194854736328125,
-0.0340576171875,
-0.0472412109375,
-0.0131683349609375,
-0.002307891845703125,
-0.02911376953125,
-0.039520263671875,
0.033843994140625,
-0.043609619140625,
-0.0141754150390625,
-0.01345062255859375,
-0.00965118408203125,
-0.012969970703125,
0.04962158203125,
-0.0139007568359375,
0.080810546875,
0.035308837890625,
-0.046600341796875,
0.0421142578125,
0.0479736328125,
-0.0030498504638671875,
0.0501708984375,
-0.07989501953125,
0.032684326171875,
-0.012847900390625,
0.022705078125,
-0.031280517578125,
-0.0218505859375,
0.021148681640625,
-0.06634521484375,
0.01512908935546875,
-0.0228424072265625,
-0.01904296875,
-0.02838134765625,
0.006114959716796875,
0.040435791015625,
0.0665283203125,
-0.056304931640625,
0.038848876953125,
0.04180908203125,
0.02069091796875,
-0.02197265625,
-0.06597900390625,
0.005718231201171875,
-0.007007598876953125,
-0.0362548828125,
0.06085205078125,
-0.0206451416015625,
0.0277557373046875,
0.005725860595703125,
0.00930023193359375,
-0.00502777099609375,
-0.035125732421875,
0.018402099609375,
-0.01245880126953125,
-0.0140838623046875,
-0.011810302734375,
-0.0090789794921875,
-0.01348114013671875,
0.01107025146484375,
-0.028167724609375,
0.037750244140625,
-0.023590087890625,
-0.0299835205078125,
-0.0523681640625,
0.0212860107421875,
0.07244873046875,
-0.04510498046875,
0.03680419921875,
0.09271240234375,
-0.04815673828125,
-0.006427764892578125,
-0.0274200439453125,
0.0051727294921875,
-0.035247802734375,
0.031341552734375,
-0.049774169921875,
-0.05419921875,
0.03759765625,
0.014495849609375,
-0.017486572265625,
0.046112060546875,
0.055633544921875,
0.0007338523864746094,
0.06817626953125,
0.044891357421875,
-0.01242828369140625,
0.054290771484375,
-0.0112457275390625,
0.012786865234375,
-0.08221435546875,
-0.0345458984375,
-0.04315185546875,
-0.01085662841796875,
-0.039031982421875,
-0.0309906005859375,
0.051300048828125,
-0.0016183853149414062,
-0.0212554931640625,
0.0242462158203125,
-0.047119140625,
0.01544952392578125,
0.044036865234375,
-0.0033130645751953125,
0.004215240478515625,
0.00191497802734375,
0.00009387731552124023,
-0.00463104248046875,
-0.06201171875,
-0.01020050048828125,
0.049224853515625,
0.0457763671875,
0.043914794921875,
0.02276611328125,
0.0447998046875,
0.0169677734375,
-0.001613616943359375,
-0.0458984375,
0.033203125,
-0.016357421875,
-0.0439453125,
-0.01328277587890625,
-0.02960205078125,
-0.08575439453125,
0.003101348876953125,
-0.000682830810546875,
-0.036346435546875,
0.0193939208984375,
0.0033931732177734375,
-0.017608642578125,
0.026336669921875,
-0.04461669921875,
0.08013916015625,
0.0016384124755859375,
0.0035381317138671875,
-0.0003693103790283203,
-0.04327392578125,
0.0162811279296875,
0.04205322265625,
-0.006122589111328125,
0.00406646728515625,
0.0203704833984375,
0.0772705078125,
-0.03155517578125,
0.05828857421875,
-0.0330810546875,
-0.0037136077880859375,
0.02215576171875,
-0.028167724609375,
0.0194854736328125,
-0.00946044921875,
0.003566741943359375,
0.02685546875,
0.03277587890625,
-0.038421630859375,
-0.038330078125,
0.03997802734375,
-0.07568359375,
-0.01128387451171875,
-0.034332275390625,
-0.018798828125,
-0.0011224746704101562,
0.0287322998046875,
0.0380859375,
0.03814697265625,
-0.02447509765625,
0.037261962890625,
0.056610107421875,
-0.005290985107421875,
0.038421630859375,
-0.0020732879638671875,
-0.0164337158203125,
-0.0535888671875,
0.06134033203125,
-0.0019931793212890625,
0.020843505859375,
0.01290130615234375,
0.0093994140625,
-0.00982666015625,
-0.036041259765625,
-0.0439453125,
0.0164947509765625,
-0.0259552001953125,
-0.0189361572265625,
-0.032012939453125,
-0.054473876953125,
-0.041595458984375,
0.0139617919921875,
-0.034942626953125,
0.01099395751953125,
-0.0218658447265625,
0.012542724609375,
0.0205535888671875,
0.056854248046875,
0.0027408599853515625,
0.03704833984375,
-0.07049560546875,
0.043426513671875,
0.0261077880859375,
0.032012939453125,
-0.00977325439453125,
-0.038848876953125,
-0.01904296875,
0.01045989990234375,
-0.024169921875,
-0.09197998046875,
0.033477783203125,
0.024139404296875,
0.05096435546875,
0.0201873779296875,
0.0018281936645507812,
0.041961669921875,
-0.016754150390625,
0.0753173828125,
0.0200653076171875,
-0.08245849609375,
0.0209808349609375,
-0.03472900390625,
0.04833984375,
0.01168060302734375,
0.0117645263671875,
-0.0513916015625,
0.0198974609375,
-0.0555419921875,
-0.027008056640625,
0.057891845703125,
0.032867431640625,
0.019775390625,
0.0219879150390625,
0.020843505859375,
-0.0176544189453125,
0.03326416015625,
-0.06292724609375,
-0.017425537109375,
-0.048004150390625,
-0.029144287109375,
0.017486572265625,
-0.036712646484375,
-0.004974365234375,
-0.036285400390625,
0.048736572265625,
-0.0187225341796875,
0.0712890625,
0.02471923828125,
-0.0025272369384765625,
0.004268646240234375,
0.023223876953125,
0.051910400390625,
0.01024627685546875,
-0.01107025146484375,
0.006755828857421875,
0.0181427001953125,
-0.053497314453125,
-0.0183563232421875,
0.01523590087890625,
-0.0164642333984375,
0.0116729736328125,
0.00986480712890625,
0.09912109375,
0.0012121200561523438,
-0.016510009765625,
0.07135009765625,
-0.0025844573974609375,
-0.031646728515625,
-0.035308837890625,
-0.01334381103515625,
0.00983428955078125,
0.00530242919921875,
0.01334381103515625,
0.006969451904296875,
0.0203399658203125,
-0.042449951171875,
0.006557464599609375,
0.0237884521484375,
-0.045318603515625,
-0.056060791015625,
0.07415771484375,
0.0367431640625,
-0.02850341796875,
0.038116455078125,
-0.040069580078125,
-0.06634521484375,
0.01189422607421875,
0.0501708984375,
0.09161376953125,
-0.02398681640625,
-0.0104217529296875,
0.02679443359375,
0.01515960693359375,
0.0303497314453125,
0.04925537109375,
-0.0185089111328125,
-0.0589599609375,
-0.02130126953125,
-0.052154541015625,
-0.0218963623046875,
0.0164947509765625,
-0.05621337890625,
0.036956787109375,
-0.01727294921875,
-0.018463134765625,
-0.0149383544921875,
-0.0130615234375,
-0.046234130859375,
0.024932861328125,
0.028656005859375,
0.054931640625,
-0.021697998046875,
0.08221435546875,
0.0631103515625,
-0.041351318359375,
-0.05010986328125,
0.0213623046875,
0.0014495849609375,
-0.039154052734375,
0.03369140625,
-0.0171661376953125,
-0.0250396728515625,
0.0211029052734375,
-0.04022216796875,
-0.054931640625,
0.08349609375,
0.0338134765625,
-0.02117919921875,
-0.0274200439453125,
-0.0109710693359375,
0.0288543701171875,
-0.0145263671875,
0.0136260986328125,
0.02032470703125,
0.0278167724609375,
0.038360595703125,
-0.10150146484375,
-0.00727081298828125,
-0.035980224609375,
0.01421356201171875,
0.004634857177734375,
-0.06988525390625,
0.0384521484375,
-0.009674072265625,
-0.008636474609375,
-0.0191802978515625,
0.08233642578125,
0.003116607666015625,
0.01033782958984375,
0.0303802490234375,
0.0322265625,
0.0579833984375,
-0.0171661376953125,
0.05419921875,
-0.021820068359375,
0.024261474609375,
0.052581787109375,
0.007354736328125,
0.0633544921875,
0.038604736328125,
-0.01483917236328125,
0.0316162109375,
0.01904296875,
-0.00646209716796875,
0.0271453857421875,
-0.01399993896484375,
-0.015472412109375,
-0.005077362060546875,
-0.00466156005859375,
-0.05224609375,
0.0455322265625,
0.0229949951171875,
-0.01076507568359375,
-0.0016431808471679688,
0.00247955322265625,
0.0028400421142578125,
-0.0287933349609375,
-0.012176513671875,
0.0628662109375,
0.00759124755859375,
-0.0050201416015625,
0.07501220703125,
0.00395965576171875,
0.07012939453125,
-0.044525146484375,
0.005542755126953125,
0.00615692138671875,
-0.0010824203491210938,
-0.0233306884765625,
-0.033660888671875,
0.044281005859375,
0.0010290145874023438,
-0.0137786865234375,
-0.0256500244140625,
0.044891357421875,
-0.039215087890625,
-0.003925323486328125,
0.011932373046875,
0.0216064453125,
0.024810791015625,
0.01424407958984375,
-0.05462646484375,
0.0283355712890625,
0.007541656494140625,
-0.0191650390625,
0.004383087158203125,
0.0258941650390625,
0.006427764892578125,
0.040069580078125,
0.046356201171875,
0.01302337646484375,
0.022735595703125,
0.029052734375,
0.05633544921875,
-0.0535888671875,
-0.04852294921875,
-0.0248260498046875,
0.041961669921875,
0.000055789947509765625,
-0.045013427734375,
0.03369140625,
0.044158935546875,
0.06402587890625,
-0.0112457275390625,
0.0611572265625,
-0.0194244384765625,
0.028167724609375,
-0.042266845703125,
0.07965087890625,
-0.0579833984375,
0.003688812255859375,
-0.00823211669921875,
-0.05902099609375,
-0.0161590576171875,
0.0318603515625,
0.00859832763671875,
0.03582763671875,
0.045928955078125,
0.055419921875,
-0.002109527587890625,
-0.005001068115234375,
0.04852294921875,
0.03448486328125,
0.034271240234375,
0.026214599609375,
0.0212554931640625,
-0.057769775390625,
0.04736328125,
-0.003662109375,
-0.0013837814331054688,
-0.01024627685546875,
-0.053131103515625,
-0.07275390625,
-0.07305908203125,
-0.0272369384765625,
-0.0347900390625,
0.0156097412109375,
0.0791015625,
0.08184814453125,
-0.054473876953125,
-0.01416778564453125,
-0.007144927978515625,
-0.009429931640625,
-0.01050567626953125,
-0.01529693603515625,
0.0127105712890625,
-0.007465362548828125,
-0.06866455078125,
0.0289154052734375,
0.004871368408203125,
0.0272979736328125,
-0.03076171875,
-0.035186767578125,
0.00858306884765625,
-0.0018711090087890625,
0.04071044921875,
0.0204620361328125,
-0.06549072265625,
-0.01152801513671875,
-0.0014705657958984375,
-0.0175323486328125,
0.0238494873046875,
0.03424072265625,
-0.04547119140625,
0.019622802734375,
0.055755615234375,
0.041107177734375,
0.05859375,
-0.0010128021240234375,
0.047119140625,
-0.05511474609375,
0.014801025390625,
0.0164337158203125,
0.014495849609375,
0.027587890625,
-0.0148773193359375,
0.00469207763671875,
0.033935546875,
-0.045135498046875,
-0.0614013671875,
0.0079193115234375,
-0.10784912109375,
0.0209808349609375,
0.07476806640625,
0.00821685791015625,
-0.036468505859375,
0.0169219970703125,
-0.0311431884765625,
0.03338623046875,
-0.033721923828125,
0.0199432373046875,
0.03558349609375,
-0.002101898193359375,
-0.0550537109375,
-0.04229736328125,
0.07391357421875,
0.0153350830078125,
-0.0638427734375,
-0.0178375244140625,
0.0206451416015625,
0.0147247314453125,
0.0047607421875,
0.049957275390625,
-0.028961181640625,
0.031280517578125,
0.01195526123046875,
0.06451416015625,
-0.01953125,
-0.02703857421875,
-0.043243408203125,
0.023468017578125,
0.01001739501953125,
-0.035125732421875
]
] |
timm/swin_base_patch4_window7_224.ms_in22k_ft_in1k | 2023-03-18T04:04:58.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-22k",
"arxiv:2103.14030",
"license:mit",
"region:us"
] | image-classification | timm | null | null | timm/swin_base_patch4_window7_224.ms_in22k_ft_in1k | 1 | 58,455 | timm | 2023-03-18T04:04:29 | ---
tags:
- image-classification
- timm
library_name: timm
license: mit
datasets:
- imagenet-1k
- imagenet-22k
---
# Model card for swin_base_patch4_window7_224.ms_in22k_ft_in1k
A Swin Transformer image classification model. Pretrained on ImageNet-22k and fine-tuned on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 87.8
- GMACs: 15.5
- Activations (M): 36.6
- Image size: 224 x 224
- **Papers:**
- Swin Transformer: Hierarchical Vision Transformer using Shifted Windows: https://arxiv.org/abs/2103.14030
- **Original:** https://github.com/microsoft/Swin-Transformer
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:** ImageNet-22k
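The model name encodes the hierarchical design: a patch size of 4 turns the 224 x 224 input into a 56 x 56 grid, and each of the three patch-merging steps halves the resolution while doubling the channel width. A small sketch (the Swin-Base embedding dim of 128 is an assumption taken from the paper, not from this card):

```python
# Derive the per-stage feature map sizes implied by the model name:
# patch size 4, input 224x224, Swin-Base embedding dim 128 (assumed).
img_size, patch_size, embed_dim, num_stages = 224, 4, 128, 4

stages = []
res = img_size // patch_size          # patch embedding: 224 -> 56
dim = embed_dim
for _ in range(num_stages):
    stages.append((res, res, dim))
    res //= 2                         # patch merging halves resolution
    dim *= 2                          # and doubles channels

print(stages)  # [(56, 56, 128), (28, 28, 256), (14, 14, 512), (7, 7, 1024)]
```

These match the NHWC feature map shapes listed in the feature extraction example below.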
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('swin_base_patch4_window7_224.ms_in22k_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'swin_base_patch4_window7_224.ms_in22k_ft_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g. for swin_base_patch4_window7_224 (NHWC output)
# torch.Size([1, 56, 56, 128])
# torch.Size([1, 28, 28, 256])
# torch.Size([1, 14, 14, 512])
# torch.Size([1, 7, 7, 1024])
# e.g. for swinv2_cr_small_ns_224 (NCHW output)
# torch.Size([1, 96, 56, 56])
# torch.Size([1, 192, 28, 28])
# torch.Size([1, 384, 14, 14])
# torch.Size([1, 768, 7, 7])
print(o.shape)
```
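As the comments above note, swin backbones return feature maps in NHWC layout, while most conv-based downstream heads expect NCHW. A minimal sketch of the conversion using a dummy tensor in place of a real feature map (assumes torch is installed):

```python
import torch

# Dummy NHWC feature map with the shape of the first swin stage above.
nhwc = torch.randn(1, 56, 56, 128)

# Permute to the NCHW layout most conv-based heads expect.
nchw = nhwc.permute(0, 3, 1, 2).contiguous()

print(nchw.shape)  # torch.Size([1, 128, 56, 56])
```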
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'swin_base_patch4_window7_224.ms_in22k_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, i.e. a (batch_size, H, W, num_features) tensor for swin / swinv2
# or (batch_size, num_features, H, W) for swinv2_cr
output = model.forward_head(output, pre_logits=True)
# output is (batch_size, num_features) tensor
```
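The pooled `(batch_size, num_features)` embeddings can be compared directly, e.g. with cosine similarity for image retrieval. A hedged sketch with random stand-ins for two embeddings (the real vectors would come from the snippet above; 1024 is the swin-base feature width):

```python
import torch
import torch.nn.functional as F

# Stand-ins for two pooled image embeddings (num_features=1024 for swin-base).
emb_a = torch.randn(1, 1024)
emb_b = torch.randn(1, 1024)

# Cosine similarity is in [-1, 1]; identical vectors score 1.
sim = F.cosine_similarity(emb_a, emb_b, dim=1)
print(sim.item())
```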
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{liu2021Swin,
title={Swin Transformer: Hierarchical Vision Transformer using Shifted Windows},
author={Liu, Ze and Lin, Yutong and Cao, Yue and Hu, Han and Wei, Yixuan and Zhang, Zheng and Lin, Stephen and Guo, Baining},
booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,524 | [
[
-0.03167724609375,
-0.033477783203125,
-0.00582122802734375,
0.01300811767578125,
-0.023681640625,
-0.0296630859375,
-0.016265869140625,
-0.038238525390625,
0.003368377685546875,
0.027679443359375,
-0.046142578125,
-0.049224853515625,
-0.045318603515625,
-0.01548004150390625,
-0.007778167724609375,
0.0772705078125,
-0.01174163818359375,
-0.005222320556640625,
-0.0133056640625,
-0.04107666015625,
-0.0126190185546875,
-0.01357269287109375,
-0.036834716796875,
-0.0271759033203125,
0.0254669189453125,
0.0089263916015625,
0.046539306640625,
0.0390625,
0.05645751953125,
0.0364990234375,
-0.006282806396484375,
-0.0030536651611328125,
-0.0165252685546875,
-0.0174713134765625,
0.0257568359375,
-0.04693603515625,
-0.039306640625,
0.018280029296875,
0.049224853515625,
0.0276641845703125,
0.006622314453125,
0.0308380126953125,
0.00774383544921875,
0.035247802734375,
-0.01248931884765625,
0.00543212890625,
-0.040283203125,
0.0107574462890625,
-0.01018524169921875,
0.001728057861328125,
-0.01357269287109375,
-0.027801513671875,
0.0233154296875,
-0.041229248046875,
0.050506591796875,
0.006153106689453125,
0.10693359375,
0.0045013427734375,
-0.00473785400390625,
0.005870819091796875,
-0.0191192626953125,
0.06854248046875,
-0.07464599609375,
0.0072174072265625,
0.00666046142578125,
0.010467529296875,
-0.00516510009765625,
-0.061431884765625,
-0.037689208984375,
-0.01165771484375,
-0.0200958251953125,
-0.0009336471557617188,
-0.0168609619140625,
0.006259918212890625,
0.0305328369140625,
0.0212249755859375,
-0.0355224609375,
0.00954437255859375,
-0.039581298828125,
-0.020050048828125,
0.048614501953125,
0.003932952880859375,
0.0321044921875,
-0.0198516845703125,
-0.048583984375,
-0.03643798828125,
-0.0258331298828125,
0.0205535888671875,
0.01184844970703125,
0.01490020751953125,
-0.0482177734375,
0.037322998046875,
0.0254058837890625,
0.03900146484375,
0.005550384521484375,
-0.0374755859375,
0.054443359375,
-0.014068603515625,
-0.03057861328125,
-0.0204315185546875,
0.07177734375,
0.032745361328125,
0.0082244873046875,
0.021636962890625,
-0.0214080810546875,
-0.0213623046875,
-0.006549835205078125,
-0.08087158203125,
-0.0188140869140625,
0.0212554931640625,
-0.047576904296875,
-0.038543701171875,
0.0186309814453125,
-0.045562744140625,
-0.005558013916015625,
-0.006893157958984375,
0.042938232421875,
-0.0377197265625,
-0.036041259765625,
-0.01294708251953125,
-0.0304718017578125,
0.036224365234375,
0.0215606689453125,
-0.04144287109375,
0.002506256103515625,
0.0178680419921875,
0.07757568359375,
-0.00788116455078125,
-0.047637939453125,
-0.003421783447265625,
-0.019927978515625,
-0.0220184326171875,
0.029144287109375,
-0.0005617141723632812,
-0.01189422607421875,
-0.01346588134765625,
0.0275115966796875,
-0.0157470703125,
-0.049224853515625,
0.00879669189453125,
-0.0108184814453125,
0.0183258056640625,
0.00026917457580566406,
-0.0103302001953125,
-0.0244598388671875,
0.0188446044921875,
-0.02685546875,
0.1024169921875,
0.03985595703125,
-0.07354736328125,
0.023101806640625,
-0.035858154296875,
-0.022308349609375,
-0.0101165771484375,
-0.0015459060668945312,
-0.072998046875,
-0.0003294944763183594,
0.022369384765625,
0.044708251953125,
-0.011810302734375,
0.00756072998046875,
-0.036407470703125,
-0.0213775634765625,
0.0206451416015625,
-0.006702423095703125,
0.07720947265625,
0.0004749298095703125,
-0.0489501953125,
0.0305023193359375,
-0.041229248046875,
0.006282806396484375,
0.03875732421875,
-0.01248931884765625,
-0.0162811279296875,
-0.0478515625,
0.020538330078125,
0.0293426513671875,
0.020355224609375,
-0.048370361328125,
0.018524169921875,
-0.0195465087890625,
0.0306854248046875,
0.054931640625,
-0.00751495361328125,
0.027923583984375,
-0.0274658203125,
0.0208587646484375,
0.037750244140625,
0.033447265625,
-0.004322052001953125,
-0.047576904296875,
-0.0638427734375,
-0.031402587890625,
0.01557159423828125,
0.031768798828125,
-0.043548583984375,
0.047271728515625,
-0.0170135498046875,
-0.055633544921875,
-0.0394287109375,
0.00399017333984375,
0.0245513916015625,
0.04620361328125,
0.0289154052734375,
-0.020782470703125,
-0.04620361328125,
-0.072265625,
0.0095367431640625,
-0.0025196075439453125,
-0.004032135009765625,
0.025482177734375,
0.061279296875,
-0.0238800048828125,
0.054046630859375,
-0.0285186767578125,
-0.02374267578125,
-0.02545166015625,
0.01552581787109375,
0.031951904296875,
0.054351806640625,
0.06298828125,
-0.043121337890625,
-0.030242919921875,
-0.00417327880859375,
-0.0667724609375,
0.0025348663330078125,
-0.01526641845703125,
-0.0217742919921875,
0.0298614501953125,
0.0032196044921875,
-0.04742431640625,
0.05712890625,
0.0241851806640625,
-0.025177001953125,
0.046417236328125,
-0.0236358642578125,
0.0215911865234375,
-0.07720947265625,
0.00695037841796875,
0.0311126708984375,
-0.004711151123046875,
-0.03558349609375,
-0.0002899169921875,
0.0128631591796875,
-0.0037174224853515625,
-0.03668212890625,
0.04266357421875,
-0.041778564453125,
-0.0200347900390625,
-0.0110626220703125,
-0.0016803741455078125,
0.0037994384765625,
0.05682373046875,
0.0002796649932861328,
0.0269927978515625,
0.06475830078125,
-0.0305023193359375,
0.0218353271484375,
0.02813720703125,
-0.0151214599609375,
0.0303955078125,
-0.055145263671875,
-0.0017251968383789062,
0.0006389617919921875,
0.0195159912109375,
-0.07427978515625,
-0.00293731689453125,
0.0077972412109375,
-0.041473388671875,
0.040985107421875,
-0.0399169921875,
-0.022064208984375,
-0.046417236328125,
-0.04803466796875,
0.025238037109375,
0.06304931640625,
-0.05804443359375,
0.038238525390625,
0.0167999267578125,
0.01084136962890625,
-0.048614501953125,
-0.0684814453125,
-0.026092529296875,
-0.0211944580078125,
-0.06353759765625,
0.03558349609375,
0.01158905029296875,
0.001705169677734375,
0.00860595703125,
-0.00991058349609375,
0.0035228729248046875,
-0.020111083984375,
0.04412841796875,
0.058746337890625,
-0.029632568359375,
-0.02423095703125,
-0.01557159423828125,
-0.00229644775390625,
0.0021228790283203125,
-0.004039764404296875,
0.03192138671875,
-0.0212554931640625,
-0.0137176513671875,
-0.044281005859375,
-0.00823974609375,
0.046783447265625,
-0.005908966064453125,
0.05474853515625,
0.09161376953125,
-0.03033447265625,
-0.0060882568359375,
-0.034759521484375,
-0.023345947265625,
-0.038116455078125,
0.03289794921875,
-0.024200439453125,
-0.032257080078125,
0.06170654296875,
0.0045623779296875,
0.02484130859375,
0.058868408203125,
0.0201416015625,
-0.020660400390625,
0.06890869140625,
0.039306640625,
0.0037403106689453125,
0.05743408203125,
-0.07086181640625,
-0.0086822509765625,
-0.06329345703125,
-0.034027099609375,
-0.025665283203125,
-0.045379638671875,
-0.042327880859375,
-0.0374755859375,
0.0343017578125,
0.01364898681640625,
-0.0259857177734375,
0.039154052734375,
-0.059814453125,
-0.00586700439453125,
0.04949951171875,
0.031829833984375,
-0.0274658203125,
0.0214691162109375,
-0.0218963623046875,
-0.01513671875,
-0.049407958984375,
-0.0059356689453125,
0.06396484375,
0.041595458984375,
0.058563232421875,
-0.0168304443359375,
0.04852294921875,
-0.009735107421875,
0.0210113525390625,
-0.0364990234375,
0.0499267578125,
-0.00017070770263671875,
-0.026123046875,
-0.01873779296875,
-0.0277099609375,
-0.07550048828125,
0.0227813720703125,
-0.03057861328125,
-0.04669189453125,
0.0153961181640625,
0.006866455078125,
0.00243377685546875,
0.06170654296875,
-0.058563232421875,
0.0638427734375,
-0.019744873046875,
-0.02392578125,
-0.0020961761474609375,
-0.057708740234375,
0.01702880859375,
0.0276031494140625,
-0.00931549072265625,
-0.01251983642578125,
0.011932373046875,
0.0809326171875,
-0.050048828125,
0.072021484375,
-0.047576904296875,
0.0245513916015625,
0.03192138671875,
-0.01715087890625,
0.032440185546875,
-0.016998291015625,
0.0058746337890625,
0.031707763671875,
0.0053253173828125,
-0.03509521484375,
-0.0435791015625,
0.04559326171875,
-0.07843017578125,
-0.027069091796875,
-0.0297088623046875,
-0.036224365234375,
0.0161285400390625,
0.009552001953125,
0.040771484375,
0.051239013671875,
0.0120391845703125,
0.0147552490234375,
0.043548583984375,
-0.02886962890625,
0.039764404296875,
0.0015401840209960938,
-0.017791748046875,
-0.031829833984375,
0.05767822265625,
0.00978851318359375,
0.0089263916015625,
0.006809234619140625,
0.0236053466796875,
-0.0215911865234375,
-0.043975830078125,
-0.0307159423828125,
0.036834716796875,
-0.050079345703125,
-0.036834716796875,
-0.036376953125,
-0.04302978515625,
-0.03948974609375,
-0.018157958984375,
-0.029083251953125,
-0.0235137939453125,
-0.0216217041015625,
0.01129150390625,
0.054443359375,
0.04925537109375,
-0.0036258697509765625,
0.02398681640625,
-0.04266357421875,
0.01141357421875,
0.007598876953125,
0.0242156982421875,
-0.00560760498046875,
-0.0718994140625,
-0.01666259765625,
-0.0004143714904785156,
-0.0222625732421875,
-0.0478515625,
0.0390625,
0.0078277587890625,
0.0479736328125,
0.022552490234375,
-0.00897979736328125,
0.06890869140625,
-0.0037860870361328125,
0.0567626953125,
0.034423828125,
-0.04852294921875,
0.0576171875,
-0.003963470458984375,
0.0198211669921875,
0.0034027099609375,
0.014617919921875,
-0.01715087890625,
-0.0161590576171875,
-0.07342529296875,
-0.0689697265625,
0.06280517578125,
0.0068206787109375,
-0.0046844482421875,
0.0325927734375,
0.0362548828125,
0.00942230224609375,
-0.005870819091796875,
-0.054718017578125,
-0.03662109375,
-0.033294677734375,
-0.019012451171875,
0.00485992431640625,
-0.00167083740234375,
-0.012969970703125,
-0.061798095703125,
0.053436279296875,
-0.005725860595703125,
0.05633544921875,
0.0261077880859375,
-0.0216064453125,
-0.0086669921875,
-0.0206756591796875,
0.0355224609375,
0.0272369384765625,
-0.02435302734375,
-0.0009565353393554688,
0.02099609375,
-0.05218505859375,
-0.00365447998046875,
0.004047393798828125,
-0.0029735565185546875,
0.00926971435546875,
0.0467529296875,
0.081298828125,
0.01421356201171875,
-0.0008001327514648438,
0.05322265625,
0.0026702880859375,
-0.041656494140625,
-0.016021728515625,
0.01081085205078125,
-0.006862640380859375,
0.025238037109375,
0.031951904296875,
0.038299560546875,
-0.019378662109375,
-0.018157958984375,
0.0130615234375,
0.04083251953125,
-0.01959228515625,
-0.0263214111328125,
0.0452880859375,
-0.0055389404296875,
-0.00445556640625,
0.0687255859375,
0.0168609619140625,
-0.0379638671875,
0.07684326171875,
0.045318603515625,
0.059600830078125,
0.001407623291015625,
0.00630950927734375,
0.0638427734375,
0.0202178955078125,
0.005901336669921875,
0.0099029541015625,
0.00865936279296875,
-0.05615234375,
0.01495361328125,
-0.03662109375,
-0.0008540153503417969,
0.022979736328125,
-0.049896240234375,
0.0285491943359375,
-0.0419921875,
-0.028656005859375,
0.00571441650390625,
0.0241851806640625,
-0.073486328125,
0.01090240478515625,
0.0009870529174804688,
0.07269287109375,
-0.066650390625,
0.06658935546875,
0.055145263671875,
-0.037933349609375,
-0.0704345703125,
-0.014068603515625,
-0.0021610260009765625,
-0.07421875,
0.03082275390625,
0.0316162109375,
0.00228118896484375,
-0.009857177734375,
-0.061920166015625,
-0.0443115234375,
0.11676025390625,
0.0220947265625,
-0.015380859375,
0.0092315673828125,
-0.00666046142578125,
0.0241851806640625,
-0.03387451171875,
0.039794921875,
0.0263214111328125,
0.036834716796875,
0.0208892822265625,
-0.04736328125,
0.0160980224609375,
-0.0306854248046875,
0.0218048095703125,
0.008026123046875,
-0.04290771484375,
0.06549072265625,
-0.0496826171875,
-0.0085906982421875,
-0.0018739700317382812,
0.0516357421875,
0.0289764404296875,
0.00524139404296875,
0.048797607421875,
0.04705810546875,
0.037353515625,
-0.024627685546875,
0.06201171875,
0.00223541259765625,
0.044036865234375,
0.04071044921875,
0.0159454345703125,
0.0478515625,
0.0345458984375,
-0.0255584716796875,
0.04339599609375,
0.06988525390625,
-0.036346435546875,
0.0239410400390625,
0.00531768798828125,
0.00739288330078125,
0.0032901763916015625,
0.0220947265625,
-0.038116455078125,
0.019989013671875,
0.0164031982421875,
-0.03546142578125,
-0.009063720703125,
0.0148162841796875,
-0.00629425048828125,
-0.032958984375,
-0.01506805419921875,
0.03558349609375,
-0.006328582763671875,
-0.032470703125,
0.05810546875,
0.0027923583984375,
0.088623046875,
-0.0472412109375,
0.0021800994873046875,
-0.0203704833984375,
0.01483154296875,
-0.032257080078125,
-0.06976318359375,
0.01332855224609375,
-0.02130126953125,
0.0015974044799804688,
0.0010738372802734375,
0.06829833984375,
-0.032684326171875,
-0.032958984375,
0.024810791015625,
0.0186920166015625,
0.0290069580078125,
0.01448822021484375,
-0.097412109375,
0.0113677978515625,
0.0100555419921875,
-0.051177978515625,
0.031829833984375,
0.0256805419921875,
0.0013380050659179688,
0.0496826171875,
0.04547119140625,
-0.003570556640625,
0.0163726806640625,
0.002597808837890625,
0.0611572265625,
-0.04718017578125,
-0.02362060546875,
-0.049652099609375,
0.055419921875,
-0.0168609619140625,
-0.04779052734375,
0.052490234375,
0.036346435546875,
0.05279541015625,
-0.01209259033203125,
0.04595947265625,
-0.027496337890625,
-0.00189208984375,
-0.01654052734375,
0.05206298828125,
-0.046142578125,
-0.002735137939453125,
-0.0210723876953125,
-0.049346923828125,
-0.019256591796875,
0.056365966796875,
-0.01486968994140625,
0.0257568359375,
0.040924072265625,
0.07977294921875,
-0.0168914794921875,
-0.0204315185546875,
0.0191650390625,
0.01171112060546875,
0.0026397705078125,
0.026092529296875,
0.030853271484375,
-0.06201171875,
0.03021240234375,
-0.052490234375,
-0.0228271484375,
-0.0169830322265625,
-0.044952392578125,
-0.07879638671875,
-0.06842041015625,
-0.045989990234375,
-0.051666259765625,
-0.0289154052734375,
0.0606689453125,
0.08355712890625,
-0.057708740234375,
-0.006771087646484375,
0.01032257080078125,
0.014617919921875,
-0.030426025390625,
-0.0187835693359375,
0.04296875,
-0.01131439208984375,
-0.04693603515625,
-0.01837158203125,
0.003509521484375,
0.037322998046875,
-0.00975799560546875,
-0.0246429443359375,
-0.009552001953125,
-0.01629638671875,
0.0255584716796875,
0.018585205078125,
-0.05322265625,
-0.015960693359375,
-0.0126953125,
-0.0216064453125,
0.035247802734375,
0.045257568359375,
-0.036773681640625,
0.00899505615234375,
0.04241943359375,
0.0125885009765625,
0.06878662109375,
-0.021209716796875,
0.00652313232421875,
-0.06494140625,
0.0418701171875,
-0.00838470458984375,
0.038787841796875,
0.02728271484375,
-0.025177001953125,
0.03643798828125,
0.03546142578125,
-0.034332275390625,
-0.06512451171875,
-0.0164031982421875,
-0.08648681640625,
-0.0167236328125,
0.06829833984375,
-0.02728271484375,
-0.0462646484375,
0.0201263427734375,
-0.01068878173828125,
0.03753662109375,
-0.007740020751953125,
0.0223388671875,
0.00894927978515625,
-0.01079559326171875,
-0.0462646484375,
-0.031707763671875,
0.029052734375,
0.00390625,
-0.044525146484375,
-0.0225830078125,
0.0006418228149414062,
0.05902099609375,
0.0170135498046875,
0.037811279296875,
-0.02288818359375,
0.008880615234375,
0.01947021484375,
0.045928955078125,
-0.01904296875,
-0.00457000732421875,
-0.02813720703125,
-0.00606536865234375,
-0.00969696044921875,
-0.040496826171875
]
] |
flair/chunk-english | 2023-04-05T10:38:02.000Z | [
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"dataset:conll2000",
"region:us",
"has_space"
] | token-classification | flair | null | null | flair/chunk-english | 14 | 58,429 | flair | 2022-03-02T23:29:05 | ---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- conll2000
widget:
- text: "The happy man has been eating at the diner"
---
## English Chunking in Flair (default model)
This is the standard phrase chunking model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **96.48** (CoNLL-2000)
Predicts 10 tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
| ADJP | adjectival |
| ADVP | adverbial |
| CONJP | conjunction |
| INTJ | interjection |
| LST | list marker |
| NP | noun phrase |
| PP | prepositional |
| PRT | particle |
| SBAR | subordinate clause |
| VP | verb phrase |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/chunk-english")
# make example sentence
sentence = Sentence("The happy man has been eating at the diner")
# predict chunking tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted chunk spans
print('The following chunk tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('np'):
print(entity)
```
This yields the following output:
```
Span [1,2,3]: "The happy man" [− Labels: NP (0.9958)]
Span [4,5,6]: "has been eating" [− Labels: VP (0.8759)]
Span [7]: "at" [− Labels: PP (1.0)]
Span [8,9]: "the diner" [− Labels: NP (0.9991)]
```
So, the spans "*The happy man*" and "*the diner*" are labeled as **noun phrases** (NP) and "*has been eating*" is labeled as a **verb phrase** (VP) in the sentence "*The happy man has been eating at the diner*".
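The predicted spans can be post-processed without loading the model at all. A minimal sketch (assuming spans arrive as `(text, label)` pairs, like the output printed above) that groups chunks by their predicted tag:

```python
# Hypothetical helper, not part of the Flair API: group chunk spans by tag.
from collections import defaultdict

def group_spans_by_tag(spans):
    """Group (text, label) chunk spans into a dict keyed by tag."""
    grouped = defaultdict(list)
    for text, label in spans:
        grouped[label].append(text)
    return dict(grouped)

spans = [("The happy man", "NP"), ("has been eating", "VP"),
         ("at", "PP"), ("the diner", "NP")]
print(group_spans_by_tag(spans))
# {'NP': ['The happy man', 'the diner'], 'VP': ['has been eating'], 'PP': ['at']}
```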
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import Corpus
from flair.datasets import CONLL_2000
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
# 1. get the corpus
corpus: Corpus = CONLL_2000()
# 2. what tag do we want to predict?
tag_type = 'np'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('news-forward'),
# contextual string embeddings, backward
FlairEmbeddings('news-backward'),
]
# the embedding stack consists of forward and backward Flair embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/chunk-english',
train_with_dev=True,
max_epochs=150)
```
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
| 3,715 | [
[
-0.029327392578125,
-0.05364990234375,
0.005924224853515625,
0.02227783203125,
-0.0286407470703125,
-0.006763458251953125,
-0.01500701904296875,
-0.032196044921875,
0.053863525390625,
0.0176849365234375,
-0.031036376953125,
-0.027587890625,
-0.032440185546875,
0.033905029296875,
-0.01497650146484375,
0.07989501953125,
0.01161956787109375,
0.0219879150390625,
-0.01837158203125,
0.006092071533203125,
-0.037200927734375,
-0.03350830078125,
-0.03741455078125,
-0.0177154541015625,
0.0288543701171875,
0.0166015625,
0.04791259765625,
0.061798095703125,
0.0129241943359375,
0.0207977294921875,
-0.0209197998046875,
0.01030731201171875,
-0.0117645263671875,
0.003459930419921875,
-0.0137176513671875,
-0.034637451171875,
-0.043243408203125,
0.0070343017578125,
0.044891357421875,
0.036651611328125,
0.00632476806640625,
-0.005462646484375,
0.0037174224853515625,
0.0227203369140625,
-0.01922607421875,
0.0231475830078125,
-0.05206298828125,
-0.0214691162109375,
-0.019775390625,
-0.01052093505859375,
-0.03271484375,
-0.025390625,
0.0094757080078125,
-0.034576416015625,
-0.0026264190673828125,
0.01497650146484375,
0.1064453125,
0.0171966552734375,
-0.0328369140625,
-0.0269012451171875,
-0.0234375,
0.06304931640625,
-0.06689453125,
0.020172119140625,
0.032257080078125,
-0.0191650390625,
-0.0173492431640625,
-0.04541015625,
-0.049530029296875,
-0.01029205322265625,
-0.0118408203125,
0.0086212158203125,
-0.0118255615234375,
-0.0184173583984375,
-0.0016727447509765625,
0.0194549560546875,
-0.052764892578125,
-0.00807952880859375,
-0.01548004150390625,
-0.01528167724609375,
0.061981201171875,
0.0106658935546875,
0.0194091796875,
-0.046478271484375,
-0.046630859375,
-0.007511138916015625,
-0.0297393798828125,
-0.004665374755859375,
0.006195068359375,
0.0309600830078125,
-0.01024627685546875,
0.046630859375,
-0.003734588623046875,
0.05853271484375,
0.01169586181640625,
-0.021697998046875,
0.049285888671875,
-0.0251312255859375,
-0.0188751220703125,
-0.005462646484375,
0.071044921875,
0.032196044921875,
0.0205230712890625,
-0.00528717041015625,
-0.0074310302734375,
0.00952911376953125,
-0.00768280029296875,
-0.03936767578125,
-0.01357269287109375,
0.0240478515625,
-0.0128631591796875,
-0.0209503173828125,
0.00913238525390625,
-0.060333251953125,
-0.0141143798828125,
-0.002330780029296875,
0.038909912109375,
-0.05877685546875,
-0.0113983154296875,
0.0145416259765625,
-0.034881591796875,
0.027984619140625,
0.0019121170043945312,
-0.056396484375,
0.003925323486328125,
0.03857421875,
0.051025390625,
0.01062774658203125,
-0.028350830078125,
-0.0186920166015625,
-0.0017061233520507812,
-0.01099395751953125,
0.055694580078125,
-0.0290374755859375,
-0.0187835693359375,
-0.0015611648559570312,
0.0165863037109375,
-0.032928466796875,
-0.0276947021484375,
0.046966552734375,
-0.038116455078125,
0.026611328125,
-0.01593017578125,
-0.0758056640625,
-0.0311279296875,
0.0249786376953125,
-0.041717529296875,
0.0684814453125,
0.0060272216796875,
-0.0797119140625,
0.0234832763671875,
-0.0218505859375,
-0.047119140625,
0.00042510032653808594,
0.0016536712646484375,
-0.0239715576171875,
-0.00612640380859375,
0.01213836669921875,
0.050567626953125,
-0.0097503662109375,
0.0223236083984375,
-0.021453857421875,
-0.0006060600280761719,
0.028289794921875,
0.005695343017578125,
0.06011962890625,
0.0178375244140625,
-0.015228271484375,
0.004856109619140625,
-0.054443359375,
-0.004779815673828125,
0.0181884765625,
-0.036895751953125,
-0.0281219482421875,
0.0100555419921875,
0.006378173828125,
0.01381683349609375,
0.01427459716796875,
-0.0295562744140625,
0.03399658203125,
-0.047119140625,
0.037567138671875,
0.035888671875,
0.00426483154296875,
0.041290283203125,
-0.042236328125,
0.037109375,
0.01214599609375,
-0.0224761962890625,
-0.0171356201171875,
-0.052947998046875,
-0.052825927734375,
-0.0310211181640625,
0.046722412109375,
0.059295654296875,
-0.058135986328125,
0.06591796875,
-0.0258941650390625,
-0.04254150390625,
-0.035980224609375,
-0.0250244140625,
0.017181396484375,
0.039947509765625,
0.039031982421875,
-0.023651123046875,
-0.06329345703125,
-0.052490234375,
-0.0259552001953125,
0.0008330345153808594,
0.0125274658203125,
0.006855010986328125,
0.05950927734375,
-0.0124664306640625,
0.0718994140625,
-0.038848876953125,
-0.023193359375,
-0.03887939453125,
0.0085906982421875,
0.0318603515625,
0.049346923828125,
0.03094482421875,
-0.051300048828125,
-0.048583984375,
-0.0224609375,
-0.030517578125,
0.01517486572265625,
-0.0171051025390625,
0.002849578857421875,
0.0267181396484375,
0.0239715576171875,
-0.042510986328125,
0.0259246826171875,
0.021148681640625,
-0.05517578125,
0.05120849609375,
0.0116729736328125,
-0.014801025390625,
-0.11541748046875,
0.003143310546875,
0.0209808349609375,
-0.0270233154296875,
-0.045318603515625,
-0.006542205810546875,
0.0008473396301269531,
0.028350830078125,
-0.0312347412109375,
0.056915283203125,
-0.0379638671875,
0.004180908203125,
-0.003093719482421875,
0.004093170166015625,
0.006595611572265625,
0.0303802490234375,
0.0161590576171875,
0.037139892578125,
0.03985595703125,
-0.0482177734375,
0.00768280029296875,
0.02362060546875,
-0.031951904296875,
0.009979248046875,
-0.038177490234375,
-0.01824951171875,
-0.0101776123046875,
0.02081298828125,
-0.0853271484375,
-0.0224761962890625,
0.0250244140625,
-0.05657958984375,
0.039337158203125,
0.00698089599609375,
-0.034332275390625,
-0.0272979736328125,
-0.031494140625,
0.000316619873046875,
0.035888671875,
-0.014556884765625,
0.0306243896484375,
0.0290985107421875,
0.00978851318359375,
-0.04620361328125,
-0.05206298828125,
-0.017974853515625,
-0.0171966552734375,
-0.048858642578125,
0.03753662109375,
-0.0081329345703125,
-0.0001761913299560547,
0.01441192626953125,
0.002460479736328125,
-0.0019779205322265625,
0.0191802978515625,
0.0031833648681640625,
0.025390625,
-0.0093536376953125,
-0.0029163360595703125,
-0.01499176025390625,
0.01171112060546875,
-0.004276275634765625,
-0.0130157470703125,
0.07159423828125,
-0.00949859619140625,
0.0245513916015625,
-0.033111572265625,
0.009063720703125,
0.01462554931640625,
-0.02398681640625,
0.06097412109375,
0.06591796875,
-0.0367431640625,
-0.0070037841796875,
-0.018707275390625,
-0.0097503662109375,
-0.0296173095703125,
0.04345703125,
-0.031463623046875,
-0.05377197265625,
0.037109375,
0.0182037353515625,
0.003448486328125,
0.06005859375,
0.038482666015625,
-0.006015777587890625,
0.0758056640625,
0.03790283203125,
-0.01459503173828125,
0.0328369140625,
-0.035736083984375,
-0.00952911376953125,
-0.0667724609375,
-0.00850677490234375,
-0.03387451171875,
-0.0179901123046875,
-0.05517578125,
-0.03411865234375,
0.009552001953125,
0.0296173095703125,
-0.0226593017578125,
0.0399169921875,
-0.034576416015625,
0.0270538330078125,
0.040771484375,
-0.00141143798828125,
0.0011692047119140625,
-0.01242828369140625,
-0.03216552734375,
-0.009674072265625,
-0.04638671875,
-0.028289794921875,
0.059722900390625,
0.035675048828125,
0.050628662109375,
-0.0009403228759765625,
0.0638427734375,
0.005970001220703125,
0.0159454345703125,
-0.0716552734375,
0.0390625,
-0.01248931884765625,
-0.062469482421875,
-0.0059051513671875,
-0.020843505859375,
-0.07244873046875,
0.023834228515625,
-0.018402099609375,
-0.06884765625,
0.01071929931640625,
0.0080413818359375,
-0.0294189453125,
0.0189056396484375,
-0.036895751953125,
0.0689697265625,
0.004413604736328125,
-0.021697998046875,
0.01476287841796875,
-0.05877685546875,
0.023834228515625,
0.019439697265625,
0.031524658203125,
-0.017333984375,
-0.00418853759765625,
0.07550048828125,
-0.0227508544921875,
0.0743408203125,
0.0040740966796875,
0.01132965087890625,
0.022064208984375,
0.00809478759765625,
0.0289306640625,
0.0104522705078125,
-0.004795074462890625,
-0.00826263427734375,
0.0049285888671875,
-0.0181427001953125,
0.0006146430969238281,
0.041900634765625,
-0.0540771484375,
-0.0201873779296875,
-0.0660400390625,
-0.0224456787109375,
-0.007534027099609375,
0.023529052734375,
0.049530029296875,
0.033447265625,
-0.01361083984375,
-0.00234222412109375,
0.034454345703125,
-0.01287841796875,
0.048187255859375,
0.0262603759765625,
-0.031829833984375,
-0.050628662109375,
0.063720703125,
0.000032067298889160156,
-0.0010089874267578125,
0.04290771484375,
0.014556884765625,
-0.0262451171875,
-0.0117645263671875,
-0.017822265625,
0.042572021484375,
-0.037506103515625,
-0.035736083984375,
-0.04595947265625,
-0.0160675048828125,
-0.058319091796875,
-0.00907135009765625,
-0.0242919921875,
-0.0384521484375,
-0.05322265625,
0.00147247314453125,
0.019317626953125,
0.05438232421875,
-0.017974853515625,
0.0213470458984375,
-0.056549072265625,
-0.01129150390625,
-0.005462646484375,
0.003108978271484375,
-0.0007328987121582031,
-0.061798095703125,
-0.024505615234375,
-0.0032901763916015625,
-0.036285400390625,
-0.09161376953125,
0.0791015625,
0.0266265869140625,
0.0240631103515625,
0.0278778076171875,
-0.0101776123046875,
0.041229248046875,
-0.036376953125,
0.07794189453125,
0.00719451904296875,
-0.07391357421875,
0.03314208984375,
-0.01520538330078125,
0.0249786376953125,
0.0165557861328125,
0.06585693359375,
-0.05303955078125,
-0.0175018310546875,
-0.052154541015625,
-0.0777587890625,
0.0443115234375,
-0.008941650390625,
0.006313323974609375,
-0.047119140625,
0.01258087158203125,
-0.00006657838821411133,
0.01171875,
-0.071044921875,
-0.032928466796875,
-0.01509857177734375,
-0.0180816650390625,
-0.017913818359375,
-0.02032470703125,
0.01558685302734375,
-0.0438232421875,
0.0833740234375,
-0.0038890838623046875,
0.0399169921875,
0.0316162109375,
-0.0084686279296875,
0.007801055908203125,
0.02362060546875,
0.04901123046875,
0.0235748291015625,
-0.030364990234375,
-0.0004200935363769531,
0.0026760101318359375,
-0.022674560546875,
-0.017181396484375,
0.01383209228515625,
-0.005542755126953125,
0.0176849365234375,
0.034576416015625,
0.05889892578125,
0.021575927734375,
-0.0286865234375,
0.040863037109375,
0.0010843276977539062,
-0.0186004638671875,
-0.036285400390625,
-0.014312744140625,
0.020172119140625,
0.01010894775390625,
0.0123291015625,
0.0008420944213867188,
0.0012960433959960938,
-0.044830322265625,
0.0255889892578125,
0.0260009765625,
-0.0309295654296875,
-0.044189453125,
0.066162109375,
0.00016045570373535156,
-0.00170135498046875,
0.033935546875,
-0.04766845703125,
-0.0628662109375,
0.042877197265625,
0.06011962890625,
0.0594482421875,
-0.0169677734375,
0.01493072509765625,
0.051910400390625,
0.0130157470703125,
-0.010528564453125,
0.05755615234375,
0.028778076171875,
-0.0733642578125,
-0.032958984375,
-0.07550048828125,
0.0119171142578125,
0.01009368896484375,
-0.048583984375,
0.0225677490234375,
-0.0207977294921875,
-0.0208740234375,
0.0223236083984375,
0.009979248046875,
-0.051910400390625,
0.0187530517578125,
0.0251312255859375,
0.09088134765625,
-0.06890869140625,
0.074951171875,
0.07830810546875,
-0.05419921875,
-0.0771484375,
-0.0009946823120117188,
0.00492095947265625,
-0.06134033203125,
0.03973388671875,
0.024749755859375,
0.03948974609375,
0.0156402587890625,
-0.04803466796875,
-0.08489990234375,
0.07562255859375,
-0.0159759521484375,
-0.0254669189453125,
-0.0169677734375,
-0.0144805908203125,
0.027862548828125,
-0.034210205078125,
0.0283203125,
0.044921875,
0.032867431640625,
0.008941650390625,
-0.07073974609375,
0.0036029815673828125,
-0.01224517822265625,
-0.002590179443359375,
0.0097503662109375,
-0.05084228515625,
0.0860595703125,
-0.0081329345703125,
-0.0159912109375,
0.0218963623046875,
0.071044921875,
-0.0029735565185546875,
-0.0028362274169921875,
0.029327392578125,
0.06500244140625,
0.0513916015625,
-0.0135650634765625,
0.07391357421875,
-0.02325439453125,
0.0504150390625,
0.09197998046875,
-0.00494384765625,
0.0784912109375,
0.0287628173828125,
-0.0153961181640625,
0.043426513671875,
0.0633544921875,
-0.005565643310546875,
0.048431396484375,
0.0182952880859375,
0.003902435302734375,
-0.0161590576171875,
-0.0119171142578125,
-0.0302734375,
0.0496826171875,
0.03204345703125,
-0.0421142578125,
-0.0013914108276367188,
-0.00576019287109375,
0.0499267578125,
-0.01032257080078125,
-0.021575927734375,
0.06170654296875,
0.0032024383544921875,
-0.04840087890625,
0.059051513671875,
0.0167236328125,
0.07635498046875,
-0.0316162109375,
0.01285552978515625,
-0.0146484375,
0.0134124755859375,
-0.0177764892578125,
-0.054168701171875,
0.0241851806640625,
-0.0191650390625,
-0.00659942626953125,
0.0031185150146484375,
0.05078125,
-0.04852294921875,
-0.0223541259765625,
0.0109100341796875,
0.040435791015625,
0.0182342529296875,
0.0032901763916015625,
-0.05792236328125,
-0.00492095947265625,
0.01922607421875,
-0.037811279296875,
0.0166168212890625,
0.0095672607421875,
0.0157012939453125,
0.0286712646484375,
0.0186614990234375,
0.004695892333984375,
0.009185791015625,
-0.0190887451171875,
0.06085205078125,
-0.06756591796875,
-0.04107666015625,
-0.071533203125,
0.051483154296875,
0.0014905929565429688,
-0.033111572265625,
0.056396484375,
0.055267333984375,
0.06884765625,
-0.01282501220703125,
0.05499267578125,
-0.042572021484375,
0.05157470703125,
-0.0249481201171875,
0.04302978515625,
-0.0601806640625,
0.002178192138671875,
-0.01023101806640625,
-0.047760009765625,
-0.025543212890625,
0.046844482421875,
-0.0185546875,
-0.01074981689453125,
0.0523681640625,
0.0504150390625,
0.00917816162109375,
-0.0012912750244140625,
0.00537109375,
0.03857421875,
0.006763458251953125,
0.032501220703125,
0.042999267578125,
-0.036834716796875,
0.01198577880859375,
-0.03997802734375,
-0.0155792236328125,
-0.0253143310546875,
-0.06695556640625,
-0.062042236328125,
-0.06292724609375,
-0.024932861328125,
-0.054107666015625,
-0.0112762451171875,
0.0921630859375,
0.034088134765625,
-0.060089111328125,
-0.0211181640625,
0.01477813720703125,
0.0060577392578125,
-0.0167083740234375,
-0.0247802734375,
0.02569580078125,
-0.027191162109375,
-0.055084228515625,
0.028228759765625,
-0.019561767578125,
0.0087432861328125,
0.0095672607421875,
0.010650634765625,
-0.0634765625,
0.014007568359375,
0.039031982421875,
0.024932861328125,
-0.050994873046875,
-0.0181121826171875,
0.0059967041015625,
-0.0111236572265625,
0.0175628662109375,
0.007686614990234375,
-0.05010986328125,
0.01335906982421875,
0.045989990234375,
0.0284423828125,
0.0201416015625,
0.005397796630859375,
0.0299835205078125,
-0.07470703125,
-0.0010967254638671875,
0.019775390625,
0.042144775390625,
0.0243682861328125,
-0.00933837890625,
0.03436279296875,
0.033050537109375,
-0.05010986328125,
-0.047027587890625,
0.004405975341796875,
-0.07861328125,
-0.0325927734375,
0.09686279296875,
-0.006679534912109375,
-0.048309326171875,
0.0151824951171875,
-0.003753662109375,
0.0367431640625,
-0.036041259765625,
0.0253448486328125,
0.040435791015625,
-0.01258087158203125,
0.0191802978515625,
-0.0225830078125,
0.05731201171875,
0.03082275390625,
-0.037811279296875,
-0.0203094482421875,
0.02410888671875,
0.04766845703125,
0.0219879150390625,
0.04229736328125,
0.0038776397705078125,
0.0019168853759765625,
-0.003582000732421875,
0.026031494140625,
0.0168609619140625,
-0.007793426513671875,
-0.035369873046875,
-0.0070037841796875,
0.00026345252990722656,
-0.026153564453125
]
] |
neuralmind/bert-large-portuguese-cased | 2021-05-20T01:31:09.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"fill-mask",
"pt",
"dataset:brWaC",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | neuralmind | null | null | neuralmind/bert-large-portuguese-cased | 36 | 58,141 | transformers | 2022-03-02T23:29:05 | ---
language: pt
license: mit
tags:
- bert
- pytorch
datasets:
- brWaC
---
# BERTimbau Large (aka "bert-large-portuguese-cased")

## Introduction
BERTimbau Large is a pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performance on three downstream NLP tasks: Named Entity Recognition, Sentence Textual Similarity and Recognizing Textual Entailment. It is available in two sizes: Base and Large.
For further information or requests, please go to [BERTimbau repository](https://github.com/neuralmind-ai/portuguese-bert/).
## Available models
| Model | Arch. | #Layers | #Params |
| ---------------------------------------- | ---------- | ------- | ------- |
| `neuralmind/bert-base-portuguese-cased` | BERT-Base | 12 | 110M |
| `neuralmind/bert-large-portuguese-cased` | BERT-Large | 24 | 335M |
## Usage
```python
from transformers import AutoTokenizer # Or BertTokenizer
from transformers import AutoModelForPreTraining # Or BertForPreTraining for loading pretraining heads
from transformers import AutoModel # or BertModel, for BERT without pretraining heads
model = AutoModelForPreTraining.from_pretrained('neuralmind/bert-large-portuguese-cased')
tokenizer = AutoTokenizer.from_pretrained('neuralmind/bert-large-portuguese-cased', do_lower_case=False)
```
### Masked language modeling prediction example
```python
from transformers import pipeline
pipe = pipeline('fill-mask', model=model, tokenizer=tokenizer)
pipe('Tinha uma [MASK] no meio do caminho.')
# [{'score': 0.5054386258125305,
# 'sequence': '[CLS] Tinha uma pedra no meio do caminho. [SEP]',
# 'token': 5028,
# 'token_str': 'pedra'},
# {'score': 0.05616172030568123,
# 'sequence': '[CLS] Tinha uma curva no meio do caminho. [SEP]',
# 'token': 9562,
# 'token_str': 'curva'},
# {'score': 0.02348282001912594,
# 'sequence': '[CLS] Tinha uma parada no meio do caminho. [SEP]',
# 'token': 6655,
# 'token_str': 'parada'},
# {'score': 0.01795753836631775,
# 'sequence': '[CLS] Tinha uma mulher no meio do caminho. [SEP]',
# 'token': 2606,
# 'token_str': 'mulher'},
# {'score': 0.015246033668518066,
# 'sequence': '[CLS] Tinha uma luz no meio do caminho. [SEP]',
# 'token': 3377,
# 'token_str': 'luz'}]
```
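The pipeline returns a list of candidate dicts sorted by score, as shown above. A hedged sketch (no model download needed, assuming results shaped like that list) for extracting the top prediction:

```python
# Hypothetical helper, not part of transformers: pick the best fill-mask candidate.
def top_prediction(results):
    """Return the highest-scoring token string from fill-mask results."""
    return max(results, key=lambda r: r["score"])["token_str"]

results = [{"score": 0.505, "token_str": "pedra"},
           {"score": 0.056, "token_str": "curva"},
           {"score": 0.023, "token_str": "parada"}]
print(top_prediction(results))  # pedra
```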
### For BERT embeddings
```python
import torch
model = AutoModel.from_pretrained('neuralmind/bert-large-portuguese-cased')
input_ids = tokenizer.encode('Tinha uma pedra no meio do caminho.', return_tensors='pt')
with torch.no_grad():
outs = model(input_ids)
encoded = outs[0][0, 1:-1] # Ignore [CLS] and [SEP] special tokens
# encoded.shape: (8, 1024)
# tensor([[ 1.1872, 0.5606, -0.2264, ..., 0.0117, -0.1618, -0.2286],
# [ 1.3562, 0.1026, 0.1732, ..., -0.3855, -0.0832, -0.1052],
# [ 0.2988, 0.2528, 0.4431, ..., 0.2684, -0.5584, 0.6524],
# ...,
# [ 0.3405, -0.0140, -0.0748, ..., 0.6649, -0.8983, 0.5802],
# [ 0.1011, 0.8782, 0.1545, ..., -0.1768, -0.8880, -0.1095],
# [ 0.7912, 0.9637, -0.3859, ..., 0.2050, -0.1350, 0.0432]])
```
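The `(8, 1024)` tensor above holds one vector per token. A common follow-up is mean pooling those vectors into a single sentence embedding; a minimal sketch using plain Python lists as a stand-in for the tensor (with real tensors, `encoded.mean(dim=0)` does the same thing):

```python
# Toy mean pooling over token vectors; lists stand in for the (8, 1024) tensor.
def mean_pool(token_embeddings):
    """Average a list of equally sized token vectors into one sentence vector."""
    n = len(token_embeddings)
    dim = len(token_embeddings[0])
    return [sum(vec[i] for vec in token_embeddings) / n for i in range(dim)]

tokens = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # stand-in for `encoded`
print(mean_pool(tokens))  # [3.0, 4.0]
```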
## Citation
If you use our work, please cite:
```bibtex
@inproceedings{souza2020bertimbau,
author = {F{\'a}bio Souza and
Rodrigo Nogueira and
Roberto Lotufo},
title = {{BERT}imbau: pretrained {BERT} models for {B}razilian {P}ortuguese},
booktitle = {9th Brazilian Conference on Intelligent Systems, {BRACIS}, Rio Grande do Sul, Brazil, October 20-23 (to appear)},
year = {2020}
}
```
| 3,623 | [
[
-0.014556884765625,
-0.034027099609375,
0.0103302001953125,
0.03619384765625,
-0.0377197265625,
-0.0025234222412109375,
-0.01708984375,
-0.004016876220703125,
0.0433349609375,
0.0230560302734375,
-0.034820556640625,
-0.044158935546875,
-0.05401611328125,
-0.0006003379821777344,
-0.027008056640625,
0.08245849609375,
0.0129547119140625,
0.02978515625,
0.0198211669921875,
0.004558563232421875,
-0.0240020751953125,
-0.0440673828125,
-0.06793212890625,
-0.024322509765625,
0.0445556640625,
0.0147247314453125,
0.0328369140625,
0.0254364013671875,
0.0278167724609375,
0.030059814453125,
-0.00676727294921875,
-0.005054473876953125,
-0.034027099609375,
-0.0008077621459960938,
0.007843017578125,
-0.0426025390625,
-0.047210693359375,
-0.0118408203125,
0.0511474609375,
0.0401611328125,
0.0025882720947265625,
0.018402099609375,
-0.0035648345947265625,
0.03179931640625,
-0.0139617919921875,
0.036224365234375,
-0.040008544921875,
0.005733489990234375,
-0.01291656494140625,
-0.00860595703125,
-0.0203399658203125,
-0.0306243896484375,
0.01374053955078125,
-0.046173095703125,
0.01294708251953125,
-0.0007238388061523438,
0.0888671875,
0.016693115234375,
-0.01000213623046875,
-0.00823211669921875,
-0.0266876220703125,
0.0635986328125,
-0.058258056640625,
0.01131439208984375,
0.03460693359375,
0.018524169921875,
-0.0201263427734375,
-0.056182861328125,
-0.031524658203125,
0.00775146484375,
0.005706787109375,
0.005573272705078125,
-0.01036834716796875,
-0.01378631591796875,
0.0264129638671875,
0.01313018798828125,
-0.0372314453125,
0.0038623809814453125,
-0.051025390625,
-0.02386474609375,
0.047943115234375,
0.003009796142578125,
0.01428985595703125,
-0.0143585205078125,
-0.0224609375,
-0.0280914306640625,
-0.03814697265625,
0.0021209716796875,
0.0423583984375,
0.025054931640625,
-0.0243988037109375,
0.0457763671875,
-0.012542724609375,
0.048797607421875,
-0.0013637542724609375,
0.00542449951171875,
0.049835205078125,
0.005474090576171875,
-0.037322998046875,
0.0034732818603515625,
0.07318115234375,
0.0112152099609375,
0.0308074951171875,
0.0006866455078125,
-0.007526397705078125,
-0.00693511962890625,
0.00066375732421875,
-0.050567626953125,
-0.0272064208984375,
0.0189361572265625,
-0.037628173828125,
-0.0264434814453125,
0.0270233154296875,
-0.0557861328125,
-0.01052093505859375,
-0.00328826904296875,
0.055511474609375,
-0.047210693359375,
-0.011932373046875,
0.01873779296875,
-0.0281829833984375,
0.04071044921875,
0.00952911376953125,
-0.0640869140625,
0.0077056884765625,
0.032867431640625,
0.05609130859375,
0.0301055908203125,
-0.01317596435546875,
-0.0103912353515625,
-0.004016876220703125,
-0.0159759521484375,
0.041015625,
-0.01163482666015625,
-0.037139892578125,
-0.00385284423828125,
0.018157958984375,
-0.0096588134765625,
-0.01203155517578125,
0.052947998046875,
-0.029266357421875,
0.02392578125,
-0.01108551025390625,
-0.0226898193359375,
-0.033660888671875,
0.01776123046875,
-0.04132080078125,
0.0750732421875,
0.0159454345703125,
-0.051788330078125,
0.0286407470703125,
-0.0657958984375,
-0.031646728515625,
0.00672149658203125,
-0.004680633544921875,
-0.033935546875,
-0.0011234283447265625,
0.022613525390625,
0.04071044921875,
-0.0201263427734375,
0.01922607421875,
-0.030517578125,
-0.0213470458984375,
0.0259857177734375,
-0.0288543701171875,
0.09417724609375,
0.0195159912109375,
-0.0174713134765625,
0.0179443359375,
-0.0574951171875,
0.0088348388671875,
0.022430419921875,
-0.0170440673828125,
0.001697540283203125,
-0.0146331787109375,
0.0160675048828125,
0.006961822509765625,
0.040924072265625,
-0.052825927734375,
0.02978515625,
-0.0254364013671875,
0.057403564453125,
0.062744140625,
-0.0089874267578125,
0.0031337738037109375,
-0.0264892578125,
0.028900146484375,
0.0073089599609375,
0.00392913818359375,
0.0093994140625,
-0.047882080078125,
-0.058319091796875,
-0.044708251953125,
0.044281005859375,
0.041046142578125,
-0.045196533203125,
0.066650390625,
-0.0219573974609375,
-0.058868408203125,
-0.039306640625,
-0.01560211181640625,
0.0129547119140625,
0.026275634765625,
0.0215301513671875,
-0.0248260498046875,
-0.061798095703125,
-0.05560302734375,
-0.0003464221954345703,
-0.006160736083984375,
-0.01397705078125,
0.0184326171875,
0.057281494140625,
-0.0224609375,
0.050750732421875,
-0.04400634765625,
-0.034271240234375,
-0.002285003662109375,
0.0163726806640625,
0.05926513671875,
0.06201171875,
0.05255126953125,
-0.042877197265625,
-0.038238525390625,
-0.0198822021484375,
-0.06451416015625,
0.00699615478515625,
0.011993408203125,
-0.0157470703125,
0.0121917724609375,
0.017059326171875,
-0.04705810546875,
0.043060302734375,
0.019012451171875,
-0.047760009765625,
0.030670166015625,
-0.04156494140625,
0.0088043212890625,
-0.07684326171875,
0.0157623291015625,
-0.01169586181640625,
-0.01258087158203125,
-0.02978515625,
-0.01404571533203125,
-0.004970550537109375,
-0.0016326904296875,
-0.0408935546875,
0.04119873046875,
-0.030242919921875,
0.0006394386291503906,
0.01274871826171875,
-0.0235443115234375,
-0.013153076171875,
0.04388427734375,
0.003383636474609375,
0.031280517578125,
0.0701904296875,
-0.031524658203125,
0.043426513671875,
0.04022216796875,
-0.03570556640625,
0.026885986328125,
-0.0716552734375,
0.003879547119140625,
-0.0034332275390625,
0.01629638671875,
-0.07464599609375,
-0.0257568359375,
0.0239410400390625,
-0.053253173828125,
0.017730712890625,
-0.018798828125,
-0.05224609375,
-0.041046142578125,
-0.027984619140625,
0.041595458984375,
0.0469970703125,
-0.0296783447265625,
0.033294677734375,
0.0218963623046875,
-0.006473541259765625,
-0.05364990234375,
-0.06048583984375,
-0.006931304931640625,
-0.0222320556640625,
-0.0433349609375,
0.0234832763671875,
-0.00415802001953125,
0.0101165771484375,
-0.00984954833984375,
0.0086517333984375,
-0.0172882080078125,
-0.00492095947265625,
0.0223541259765625,
0.0308685302734375,
-0.0196075439453125,
-0.00640869140625,
-0.006282806396484375,
-0.005832672119140625,
0.01519012451171875,
-0.018035888671875,
0.076904296875,
-0.01280975341796875,
0.0006918907165527344,
-0.0289306640625,
0.01245880126953125,
0.045989990234375,
-0.00757598876953125,
0.05523681640625,
0.0655517578125,
-0.0467529296875,
0.001438140869140625,
-0.0228729248046875,
-0.0074462890625,
-0.035736083984375,
0.0275726318359375,
-0.0380859375,
-0.037261962890625,
0.06329345703125,
0.0277252197265625,
-0.00951385498046875,
0.058868408203125,
0.05609130859375,
-0.0160980224609375,
0.0689697265625,
0.0255889892578125,
-0.012786865234375,
0.0445556640625,
-0.0511474609375,
0.01313018798828125,
-0.06201171875,
-0.042510986328125,
-0.031524658203125,
-0.0325927734375,
-0.0298614501953125,
-0.0199737548828125,
0.017822265625,
0.00180816650390625,
-0.04736328125,
0.044097900390625,
-0.041015625,
0.02166748046875,
0.06817626953125,
0.040130615234375,
-0.0221405029296875,
-0.01236724853515625,
-0.0134429931640625,
0.0006208419799804688,
-0.048553466796875,
-0.0196075439453125,
0.10882568359375,
0.02874755859375,
0.05377197265625,
0.01314544677734375,
0.05255126953125,
0.019805908203125,
0.00431060791015625,
-0.046173095703125,
0.03814697265625,
-0.01001739501953125,
-0.07080078125,
-0.03192138671875,
-0.0185699462890625,
-0.09564208984375,
0.016632080078125,
-0.0275421142578125,
-0.065185546875,
0.00971221923828125,
-0.012054443359375,
-0.043426513671875,
0.0218505859375,
-0.047760009765625,
0.07525634765625,
-0.028839111328125,
-0.01398468017578125,
-0.0015993118286132812,
-0.054351806640625,
0.006961822509765625,
0.003757476806640625,
-0.00426483154296875,
-0.008941650390625,
0.01104736328125,
0.080810546875,
-0.03302001953125,
0.06658935546875,
-0.01776123046875,
0.00994110107421875,
0.0161590576171875,
-0.00972747802734375,
0.015869140625,
0.017608642578125,
-0.003124237060546875,
0.01947021484375,
0.010955810546875,
-0.05438232421875,
-0.009246826171875,
0.0506591796875,
-0.0733642578125,
-0.0217132568359375,
-0.058135986328125,
-0.044647216796875,
0.00960540771484375,
0.0364990234375,
0.0489501953125,
0.04437255859375,
-0.0083465576171875,
0.028900146484375,
0.043670654296875,
-0.01995849609375,
0.055084228515625,
0.0203857421875,
-0.0085906982421875,
-0.0396728515625,
0.06085205078125,
0.0208740234375,
-0.02349853515625,
0.019805908203125,
0.0081939697265625,
-0.0426025390625,
-0.034088134765625,
-0.0152130126953125,
0.029632568359375,
-0.03533935546875,
-0.031890869140625,
-0.033935546875,
-0.0267791748046875,
-0.053558349609375,
-0.00209808349609375,
-0.01690673828125,
-0.03033447265625,
-0.046051025390625,
-0.00801849365234375,
0.03363037109375,
0.0389404296875,
-0.022186279296875,
0.036468505859375,
-0.049072265625,
0.01548004150390625,
0.01446533203125,
0.0321044921875,
-0.016387939453125,
-0.061737060546875,
-0.005214691162109375,
-0.00252532958984375,
-0.0132904052734375,
-0.0699462890625,
0.06494140625,
0.0085296630859375,
0.0460205078125,
0.040496826171875,
-0.00159454345703125,
0.0421142578125,
-0.0270233154296875,
0.045501708984375,
0.01325225830078125,
-0.06842041015625,
0.044036865234375,
-0.03839111328125,
0.0032863616943359375,
0.026123046875,
0.035369873046875,
-0.012908935546875,
-0.004886627197265625,
-0.08343505859375,
-0.07415771484375,
0.055419921875,
0.017974853515625,
0.004329681396484375,
0.0035648345947265625,
0.0015993118286132812,
0.0078887939453125,
0.029022216796875,
-0.07635498046875,
-0.034271240234375,
-0.0251007080078125,
-0.03277587890625,
-0.004032135009765625,
-0.012786865234375,
-0.0163116455078125,
-0.04559326171875,
0.06292724609375,
0.01042938232421875,
0.05364990234375,
0.009185791015625,
-0.00960540771484375,
0.01560211181640625,
-0.01418304443359375,
0.054473876953125,
0.035675048828125,
-0.045654296875,
-0.01235198974609375,
0.004638671875,
-0.032073974609375,
-0.0034770965576171875,
0.0187530517578125,
-0.0010919570922851562,
0.011505126953125,
0.0254669189453125,
0.05413818359375,
0.018798828125,
-0.02978515625,
0.034576416015625,
0.00751495361328125,
-0.0256500244140625,
-0.0595703125,
-0.00003075599670410156,
-0.0094451904296875,
0.0191802978515625,
0.034332275390625,
0.01500701904296875,
-0.0014133453369140625,
-0.03314208984375,
0.014617919921875,
0.0318603515625,
-0.03515625,
-0.02020263671875,
0.05084228515625,
0.00911712646484375,
-0.033447265625,
0.0513916015625,
-0.005405426025390625,
-0.0457763671875,
0.06805419921875,
0.03460693359375,
0.06292724609375,
-0.0010328292846679688,
0.0121307373046875,
0.0433349609375,
0.01983642578125,
-0.0090484619140625,
0.050933837890625,
0.0169525146484375,
-0.062469482421875,
-0.0124359130859375,
-0.041717529296875,
-0.0012340545654296875,
0.01457977294921875,
-0.055633544921875,
0.03509521484375,
-0.04827880859375,
-0.0243988037109375,
0.004322052001953125,
0.00493621826171875,
-0.0604248046875,
0.0296783447265625,
0.0162353515625,
0.07122802734375,
-0.071044921875,
0.0867919921875,
0.051116943359375,
-0.048004150390625,
-0.050689697265625,
-0.0391845703125,
-0.0237274169921875,
-0.0894775390625,
0.050201416015625,
0.01360321044921875,
0.0251312255859375,
0.012664794921875,
-0.054046630859375,
-0.071044921875,
0.0792236328125,
0.023223876953125,
-0.0179443359375,
-0.005008697509765625,
-0.013275146484375,
0.03765869140625,
-0.01468658447265625,
0.0396728515625,
0.036041259765625,
0.041534423828125,
0.0008373260498046875,
-0.04937744140625,
-0.007411956787109375,
-0.0225067138671875,
-0.0130157470703125,
0.015380859375,
-0.058349609375,
0.0810546875,
-0.006927490234375,
0.0082244873046875,
0.01287841796875,
0.055084228515625,
0.0076904296875,
-0.00875091552734375,
0.024566650390625,
0.05572509765625,
0.049896240234375,
-0.03985595703125,
0.047332763671875,
-0.0151519775390625,
0.05157470703125,
0.05230712890625,
0.01068878173828125,
0.05902099609375,
0.04315185546875,
-0.023773193359375,
0.057159423828125,
0.059783935546875,
-0.0308990478515625,
0.049957275390625,
0.02020263671875,
-0.0005125999450683594,
-0.00732421875,
0.0194244384765625,
-0.04022216796875,
0.042999267578125,
0.035736083984375,
-0.0341796875,
-0.00867462158203125,
-0.00812530517578125,
0.0142364501953125,
-0.009674072265625,
-0.034759521484375,
0.033843994140625,
0.0008449554443359375,
-0.0421142578125,
0.04168701171875,
0.005474090576171875,
0.06689453125,
-0.052581787109375,
0.00732421875,
-0.00351715087890625,
0.0225982666015625,
-0.004528045654296875,
-0.064697265625,
0.0076904296875,
-0.00799560546875,
-0.0084075927734375,
-0.01239013671875,
0.038330078125,
-0.0267791748046875,
-0.052978515625,
0.015380859375,
0.01324462890625,
0.02716064453125,
-0.00391387939453125,
-0.0655517578125,
-0.0089111328125,
0.00635528564453125,
-0.021514892578125,
0.01039886474609375,
0.03240966796875,
0.0176239013671875,
0.046295166015625,
0.054412841796875,
0.0084075927734375,
0.0223388671875,
-0.0189056396484375,
0.0517578125,
-0.0673828125,
-0.0469970703125,
-0.06884765625,
0.034088134765625,
-0.0103302001953125,
-0.059112548828125,
0.03839111328125,
0.052490234375,
0.047332763671875,
-0.03143310546875,
0.04461669921875,
-0.033233642578125,
0.0244598388671875,
-0.026214599609375,
0.06817626953125,
-0.0245361328125,
-0.01123046875,
-0.00930023193359375,
-0.06396484375,
-0.0290679931640625,
0.072265625,
-0.018768310546875,
0.0027446746826171875,
0.039581298828125,
0.043487548828125,
0.0088043212890625,
-0.01549530029296875,
0.017608642578125,
0.0281829833984375,
0.016387939453125,
0.06451416015625,
0.0296783447265625,
-0.052825927734375,
0.0291290283203125,
-0.0182037353515625,
-0.014984130859375,
-0.0283355712890625,
-0.07159423828125,
-0.0721435546875,
-0.03582763671875,
-0.0195465087890625,
-0.04473876953125,
-0.013671875,
0.07598876953125,
0.049896240234375,
-0.077880859375,
-0.0265655517578125,
-0.00060272216796875,
0.005207061767578125,
-0.01026153564453125,
-0.0161590576171875,
0.042388916015625,
-0.0121002197265625,
-0.07318115234375,
0.0057220458984375,
-0.00853729248046875,
0.0205841064453125,
-0.00626373291015625,
0.0018405914306640625,
-0.037841796875,
-0.0010461807250976562,
0.02490234375,
0.03594970703125,
-0.0460205078125,
-0.028045654296875,
-0.006885528564453125,
-0.0142059326171875,
0.00888824462890625,
0.017974853515625,
-0.05206298828125,
0.0186920166015625,
0.04803466796875,
0.0275726318359375,
0.05364990234375,
-0.01947021484375,
0.04498291015625,
-0.05694580078125,
0.044158935546875,
0.0249176025390625,
0.05413818359375,
0.0167694091796875,
-0.010589599609375,
0.050750732421875,
0.02667236328125,
-0.035125732421875,
-0.06658935546875,
-0.017730712890625,
-0.09979248046875,
-0.007022857666015625,
0.06658935546875,
-0.017059326171875,
-0.039581298828125,
0.01123046875,
-0.031646728515625,
0.0355224609375,
-0.033935546875,
0.0518798828125,
0.055511474609375,
0.0028705596923828125,
0.01097869873046875,
-0.019927978515625,
0.02471923828125,
0.043182373046875,
-0.037139892578125,
-0.038848876953125,
0.0008673667907714844,
0.0264129638671875,
0.013336181640625,
0.03338623046875,
-0.019012451171875,
0.01535797119140625,
0.0170745849609375,
0.020751953125,
-0.003284454345703125,
-0.01617431640625,
-0.02471923828125,
0.00036072731018066406,
-0.009735107421875,
-0.05853271484375
]
] |
google/pix2struct-base | 2023-05-19T10:07:43.000Z | [
"transformers",
"pytorch",
"pix2struct",
"text2text-generation",
"image-to-text",
"en",
"fr",
"ro",
"de",
"multilingual",
"arxiv:2210.03347",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"region:us"
] | image-to-text | google | null | null | google/pix2struct-base | 49 | 58,120 | transformers | 2023-03-13T18:32:09 | ---
language:
- en
- fr
- ro
- de
- multilingual
pipeline_tag: image-to-text
inference: false
license: apache-2.0
---
# Model card for Pix2Struct - Pretrained weights

This model is the pretrained version of `Pix2Struct`; use it for fine-tuning purposes only.
# Table of Contents
0. [TL;DR](#TL;DR)
1. [Using the model](#using-the-model)
2. [Contribution](#contribution)
3. [Citation](#citation)
# TL;DR
Pix2Struct is an image encoder - text decoder model that is trained on image-text pairs for various tasks, including image captioning and visual question answering. The full list of available models can be found in Table 1 of the paper:

The abstract of the paper states:
> Visually-situated language is ubiquitous—sources range from textbooks with diagrams to web pages with images and tables, to mobile apps with buttons and
forms. Perhaps due to this diversity, previous work has typically relied on domain-specific recipes with limited sharing of the underlying data, model architectures,
and objectives. We present Pix2Struct, a pretrained image-to-text model for
purely visual language understanding, which can be finetuned on tasks containing visually-situated language. Pix2Struct is pretrained by learning to parse
masked screenshots of web pages into simplified HTML. The web, with its richness of visual elements cleanly reflected in the HTML structure, provides a large
source of pretraining data well suited to the diversity of downstream tasks. Intuitively, this objective subsumes common pretraining signals such as OCR, language modeling, image captioning. In addition to the novel pretraining strategy,
we introduce a variable-resolution input representation and a more flexible integration of language and vision inputs, where language prompts such as questions
are rendered directly on top of the input image. For the first time, we show that a
single pretrained model can achieve state-of-the-art results in six out of nine tasks
across four domains: documents, illustrations, user interfaces, and natural images.
# Using the model
## Converting from T5x to Hugging Face
You can use the [`convert_pix2struct_checkpoint_to_pytorch.py`](https://github.com/huggingface/transformers/blob/main/src/transformers/models/pix2struct/convert_pix2struct_original_pytorch_to_hf.py) script as follows:
```bash
python convert_pix2struct_checkpoint_to_pytorch.py --t5x_checkpoint_path PATH_TO_T5X_CHECKPOINTS --pytorch_dump_path PATH_TO_SAVE
```
If you are converting a large model, run:
```bash
python convert_pix2struct_checkpoint_to_pytorch.py --t5x_checkpoint_path PATH_TO_T5X_CHECKPOINTS --pytorch_dump_path PATH_TO_SAVE --use-large
```
Once saved, you can push your converted model with the following snippet:
```python
from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor
model = Pix2StructForConditionalGeneration.from_pretrained(PATH_TO_SAVE)
processor = Pix2StructProcessor.from_pretrained(PATH_TO_SAVE)
model.push_to_hub("USERNAME/MODEL_NAME")
processor.push_to_hub("USERNAME/MODEL_NAME")
```
# Contribution
This model was originally contributed by Kenton Lee, Mandar Joshi et al. and added to the Hugging Face ecosystem by [Younes Belkada](https://huggingface.co/ybelkada).
# Citation
If you want to cite this work, please consider citing the original paper:
```
@misc{https://doi.org/10.48550/arxiv.2210.03347,
doi = {10.48550/ARXIV.2210.03347},
url = {https://arxiv.org/abs/2210.03347},
author = {Lee, Kenton and Joshi, Mandar and Turc, Iulia and Hu, Hexiang and Liu, Fangyu and Eisenschlos, Julian and Khandelwal, Urvashi and Shaw, Peter and Chang, Ming-Wei and Toutanova, Kristina},
keywords = {Computation and Language (cs.CL), Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Pix2Struct: Screenshot Parsing as Pretraining for Visual Language Understanding},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 4,313 | [
[
-0.0280609130859375,
-0.048614501953125,
0.02850341796875,
0.0252838134765625,
-0.0215301513671875,
-0.0296478271484375,
-0.01203155517578125,
-0.03375244140625,
-0.007843017578125,
0.0291290283203125,
-0.04376220703125,
-0.0179595947265625,
-0.0543212890625,
-0.00858306884765625,
-0.016021728515625,
0.07305908203125,
-0.01666259765625,
0.0003418922424316406,
-0.04046630859375,
-0.00048422813415527344,
-0.01239013671875,
-0.037261962890625,
-0.041168212890625,
-0.0216217041015625,
0.025482177734375,
0.0081329345703125,
0.0450439453125,
0.032928466796875,
0.0345458984375,
0.0241546630859375,
-0.004154205322265625,
-0.00804901123046875,
-0.0301055908203125,
-0.0238037109375,
-0.01361083984375,
-0.046844482421875,
-0.036834716796875,
0.0233306884765625,
0.05108642578125,
0.042755126953125,
0.005214691162109375,
0.01190948486328125,
0.01241302490234375,
0.04791259765625,
-0.040924072265625,
0.0203094482421875,
-0.026519775390625,
0.007114410400390625,
-0.00726318359375,
0.012451171875,
-0.032989501953125,
-0.006916046142578125,
0.00592803955078125,
-0.048583984375,
0.00571441650390625,
-0.00519561767578125,
0.0928955078125,
0.024322509765625,
-0.01306915283203125,
0.0110015869140625,
-0.0162353515625,
0.042633056640625,
-0.0340576171875,
0.033905029296875,
0.036529541015625,
0.01412200927734375,
0.005786895751953125,
-0.0830078125,
-0.051727294921875,
-0.00676727294921875,
-0.020660400390625,
0.0159759521484375,
-0.032012939453125,
-0.0013475418090820312,
0.030853271484375,
0.0211334228515625,
-0.04583740234375,
-0.00481414794921875,
-0.035430908203125,
-0.0203094482421875,
0.036865234375,
-0.0261383056640625,
0.046966552734375,
-0.0210723876953125,
-0.041717529296875,
-0.046356201171875,
-0.035064697265625,
0.0283355712890625,
0.003124237060546875,
-0.0014362335205078125,
-0.042724609375,
0.045318603515625,
0.002384185791015625,
0.045379638671875,
0.00946807861328125,
0.00818634033203125,
0.0330810546875,
-0.020263671875,
-0.005950927734375,
-0.0261383056640625,
0.08331298828125,
0.0400390625,
0.0313720703125,
-0.00009971857070922852,
-0.00547027587890625,
-0.0025634765625,
0.016387939453125,
-0.09161376953125,
-0.040130615234375,
0.006206512451171875,
-0.037994384765625,
-0.017181396484375,
0.0262603759765625,
-0.04901123046875,
-0.006237030029296875,
-0.0196533203125,
0.04534912109375,
-0.034088134765625,
-0.0411376953125,
-0.012969970703125,
-0.01526641845703125,
0.024169921875,
0.032379150390625,
-0.044281005859375,
0.0160064697265625,
0.03094482421875,
0.07708740234375,
-0.01123046875,
-0.032135009765625,
-0.035797119140625,
-0.021697998046875,
-0.0215606689453125,
0.06585693359375,
-0.02716064453125,
-0.002590179443359375,
-0.019775390625,
0.0211639404296875,
-0.019683837890625,
-0.037750244140625,
-0.0015993118286132812,
-0.0254669189453125,
0.0251922607421875,
-0.004177093505859375,
-0.01036834716796875,
-0.01666259765625,
0.01300048828125,
-0.04034423828125,
0.09002685546875,
0.037017822265625,
-0.061370849609375,
0.00579833984375,
-0.039459228515625,
-0.0223388671875,
-0.0059967041015625,
-0.01117706298828125,
-0.06036376953125,
0.004024505615234375,
0.01296234130859375,
0.035614013671875,
-0.0067138671875,
0.0106353759765625,
-0.034332275390625,
-0.0126800537109375,
0.01983642578125,
0.007144927978515625,
0.06524658203125,
0.0216522216796875,
-0.033599853515625,
0.0051422119140625,
-0.03375244140625,
0.01500701904296875,
0.0186309814453125,
-0.0100250244140625,
-0.0027027130126953125,
-0.01236724853515625,
0.0209197998046875,
0.036773681640625,
0.020660400390625,
-0.0323486328125,
0.009368896484375,
-0.0124053955078125,
0.04693603515625,
0.030975341796875,
-0.025970458984375,
0.037322998046875,
-0.00482940673828125,
0.0273284912109375,
0.01245880126953125,
0.00567626953125,
-0.034271240234375,
-0.03125,
-0.052276611328125,
-0.021636962890625,
0.009490966796875,
0.039031982421875,
-0.067626953125,
0.0333251953125,
-0.01476287841796875,
-0.04010009765625,
-0.00801849365234375,
0.001659393310546875,
0.051239013671875,
0.040802001953125,
0.032928466796875,
-0.037506103515625,
-0.0293121337890625,
-0.057220458984375,
-0.01190948486328125,
-0.01433563232421875,
-0.009613037109375,
0.01727294921875,
0.04742431640625,
-0.03271484375,
0.07257080078125,
-0.0286407470703125,
-0.0154876708984375,
-0.024200439453125,
0.01036834716796875,
0.000059485435485839844,
0.0699462890625,
0.055206298828125,
-0.062225341796875,
-0.03948974609375,
-0.005153656005859375,
-0.06195068359375,
-0.005218505859375,
-0.00311279296875,
-0.0305328369140625,
0.0159454345703125,
0.048492431640625,
-0.046722412109375,
0.0391845703125,
0.037384033203125,
-0.0400390625,
0.04046630859375,
-0.007144927978515625,
-0.0001894235610961914,
-0.0880126953125,
0.026947021484375,
0.01055145263671875,
-0.0379638671875,
-0.037628173828125,
0.016876220703125,
0.028656005859375,
-0.03521728515625,
-0.045867919921875,
0.059173583984375,
-0.047027587890625,
-0.0132598876953125,
-0.0213165283203125,
-0.011474609375,
0.0074920654296875,
0.057830810546875,
0.0323486328125,
0.0545654296875,
0.058563232421875,
-0.04559326171875,
0.01910400390625,
0.04132080078125,
-0.0167999267578125,
0.048309326171875,
-0.06671142578125,
0.02044677734375,
-0.01020050048828125,
0.020904541015625,
-0.062744140625,
-0.0220489501953125,
0.0350341796875,
-0.049774169921875,
0.0291900634765625,
-0.017852783203125,
-0.0277099609375,
-0.038543701171875,
-0.0147552490234375,
0.045257568359375,
0.048492431640625,
-0.04522705078125,
0.04400634765625,
0.0185699462890625,
-0.01525115966796875,
-0.01491546630859375,
-0.05950927734375,
-0.01107025146484375,
-0.0008788108825683594,
-0.0517578125,
0.0305023193359375,
-0.003765106201171875,
0.004123687744140625,
0.00243377685546875,
-0.002719879150390625,
-0.005474090576171875,
-0.011474609375,
0.035614013671875,
0.0253753662109375,
-0.013427734375,
-0.0043487548828125,
-0.0059967041015625,
-0.02252197265625,
-0.001110076904296875,
-0.027069091796875,
0.0499267578125,
-0.026519775390625,
-0.00939178466796875,
-0.06939697265625,
0.024566650390625,
0.040283203125,
-0.037811279296875,
0.045013427734375,
0.05584716796875,
-0.0355224609375,
0.0078277587890625,
-0.043060302734375,
-0.01216888427734375,
-0.034759521484375,
0.047607421875,
-0.033660888671875,
-0.05499267578125,
0.02862548828125,
-0.01202392578125,
-0.015045166015625,
0.0399169921875,
0.0389404296875,
-0.0201416015625,
0.06597900390625,
0.07281494140625,
0.01959228515625,
0.0662841796875,
-0.02850341796875,
0.0004622936248779297,
-0.06402587890625,
-0.032989501953125,
-0.0285797119140625,
-0.019775390625,
-0.031097412109375,
-0.0396728515625,
0.0355224609375,
0.0249176025390625,
-0.03240966796875,
0.046295166015625,
-0.03790283203125,
0.0211944580078125,
0.050750732421875,
0.040283203125,
-0.014434814453125,
0.03271484375,
0.0025730133056640625,
-0.002048492431640625,
-0.047698974609375,
-0.02276611328125,
0.064453125,
0.031646728515625,
0.046722412109375,
-0.0160369873046875,
0.03314208984375,
-0.01030731201171875,
0.01039886474609375,
-0.052337646484375,
0.0328369140625,
-0.006488800048828125,
-0.0338134765625,
-0.00846099853515625,
-0.03594970703125,
-0.0577392578125,
0.0119476318359375,
-0.020233154296875,
-0.06390380859375,
0.0215606689453125,
0.0240631103515625,
-0.029876708984375,
0.0231170654296875,
-0.06787109375,
0.097900390625,
-0.027069091796875,
-0.055633544921875,
0.00012362003326416016,
-0.052642822265625,
0.0160064697265625,
0.0142669677734375,
-0.00244903564453125,
0.01032257080078125,
0.014007568359375,
0.0693359375,
-0.051666259765625,
0.0621337890625,
-0.02520751953125,
0.01392364501953125,
0.032440185546875,
-0.00428009033203125,
0.0289459228515625,
-0.003406524658203125,
0.002124786376953125,
0.03179931640625,
0.030609130859375,
-0.036468505859375,
-0.042755126953125,
0.0236663818359375,
-0.08050537109375,
-0.02197265625,
-0.03497314453125,
-0.0249481201171875,
0.0018558502197265625,
0.0296478271484375,
0.039215087890625,
0.0272216796875,
0.0005602836608886719,
0.0165863037109375,
0.04669189453125,
-0.032257080078125,
0.033599853515625,
0.007904052734375,
-0.01522064208984375,
-0.03753662109375,
0.057342529296875,
-0.01158905029296875,
0.0267333984375,
0.0238037109375,
0.0078887939453125,
-0.0298919677734375,
-0.01251983642578125,
-0.0299224853515625,
0.036956787109375,
-0.0389404296875,
-0.01338958740234375,
-0.05841064453125,
-0.02105712890625,
-0.04254150390625,
-0.007167816162109375,
-0.054443359375,
-0.0154571533203125,
-0.031707763671875,
0.016387939453125,
0.0247802734375,
0.031280517578125,
-0.00872802734375,
0.04791259765625,
-0.04449462890625,
0.03271484375,
0.031219482421875,
0.047149658203125,
-0.0159454345703125,
-0.046966552734375,
-0.0033245086669921875,
0.0112152099609375,
-0.0250244140625,
-0.05950927734375,
0.030181884765625,
0.023529052734375,
0.03106689453125,
0.031524658203125,
-0.00731658935546875,
0.060333251953125,
-0.0301971435546875,
0.041412353515625,
0.049713134765625,
-0.06121826171875,
0.06292724609375,
-0.00612640380859375,
0.01544952392578125,
0.04443359375,
0.032684326171875,
-0.0386962890625,
0.0167083740234375,
-0.051513671875,
-0.049346923828125,
0.0718994140625,
0.0167388916015625,
0.0149688720703125,
0.0244903564453125,
0.04302978515625,
-0.0066986083984375,
0.0167999267578125,
-0.082275390625,
0.00547027587890625,
-0.0533447265625,
-0.016265869140625,
-0.0014162063598632812,
-0.0252838134765625,
0.002315521240234375,
-0.045806884765625,
0.041778564453125,
-0.0165863037109375,
0.0509033203125,
0.02471923828125,
-0.043426513671875,
-0.0003445148468017578,
-0.020660400390625,
0.026580810546875,
0.039306640625,
-0.001712799072265625,
0.01739501953125,
-0.01702880859375,
-0.043243408203125,
-0.00820159912109375,
0.016632080078125,
-0.01163482666015625,
-0.01465606689453125,
0.031585693359375,
0.08746337890625,
0.007770538330078125,
-0.04132080078125,
0.06292724609375,
-0.0020503997802734375,
-0.0213165283203125,
-0.0283966064453125,
0.0032711029052734375,
-0.006488800048828125,
0.0229034423828125,
0.017059326171875,
0.014068603515625,
-0.023040771484375,
-0.047515869140625,
0.0256195068359375,
0.038299560546875,
-0.037078857421875,
-0.0382080078125,
0.0589599609375,
0.01036834716796875,
-0.01116180419921875,
0.06268310546875,
-0.00916290283203125,
-0.047515869140625,
0.046844482421875,
0.037078857421875,
0.055267333984375,
-0.0094146728515625,
0.0124969482421875,
0.060546875,
0.0177154541015625,
-0.01415252685546875,
0.007152557373046875,
-0.0288848876953125,
-0.0528564453125,
-0.00970458984375,
-0.05780029296875,
-0.004680633544921875,
0.0021190643310546875,
-0.044281005859375,
0.02685546875,
-0.044921875,
-0.00856781005859375,
-0.0158233642578125,
-0.000762939453125,
-0.05267333984375,
0.0243682861328125,
0.0249481201171875,
0.060028076171875,
-0.058380126953125,
0.054046630859375,
0.0684814453125,
-0.05145263671875,
-0.06243896484375,
-0.01068115234375,
-0.00443267822265625,
-0.06768798828125,
0.043914794921875,
0.03985595703125,
0.013824462890625,
0.01593017578125,
-0.06561279296875,
-0.0537109375,
0.090087890625,
0.025390625,
-0.03814697265625,
-0.0044708251953125,
0.011962890625,
0.0198974609375,
-0.01641845703125,
0.04779052734375,
0.0246429443359375,
0.032440185546875,
0.0284271240234375,
-0.068603515625,
0.008880615234375,
-0.0589599609375,
0.020721435546875,
-0.0196533203125,
-0.0499267578125,
0.07574462890625,
-0.0276031494140625,
-0.025390625,
0.0110015869140625,
0.05743408203125,
0.01971435546875,
0.016754150390625,
0.035858154296875,
0.050933837890625,
0.038818359375,
-0.0247344970703125,
0.0850830078125,
-0.01506805419921875,
0.0308685302734375,
0.0587158203125,
0.01116180419921875,
0.06512451171875,
0.0357666015625,
-0.0252227783203125,
0.037689208984375,
0.060546875,
-0.01467132568359375,
0.0259246826171875,
-0.003658294677734375,
0.00689697265625,
-0.033416748046875,
0.01068115234375,
-0.03887939453125,
0.028961181640625,
0.010040283203125,
-0.02679443359375,
-0.015838623046875,
0.016021728515625,
0.020538330078125,
0.00350189208984375,
-0.0216217041015625,
0.052337646484375,
0.01308441162109375,
-0.058135986328125,
0.055206298828125,
-0.00040459632873535156,
0.063232421875,
-0.051055908203125,
0.006572723388671875,
-0.018646240234375,
0.021484375,
-0.02685546875,
-0.052154541015625,
0.023712158203125,
-0.001735687255859375,
-0.0117034912109375,
-0.033172607421875,
0.0484619140625,
-0.032318115234375,
-0.05706787109375,
0.01108551025390625,
0.0167083740234375,
0.023834228515625,
-0.04583740234375,
-0.061737060546875,
0.01739501953125,
-0.0007920265197753906,
-0.0303497314453125,
0.0282440185546875,
0.0157012939453125,
-0.00897979736328125,
0.042205810546875,
0.046539306640625,
-0.0121612548828125,
0.005664825439453125,
-0.01473236083984375,
0.06884765625,
-0.0435791015625,
-0.051422119140625,
-0.046661376953125,
0.063232421875,
-0.0008864402770996094,
-0.024566650390625,
0.031829833984375,
0.035858154296875,
0.0762939453125,
-0.01045989990234375,
0.0469970703125,
-0.028289794921875,
0.00876617431640625,
-0.041839599609375,
0.07562255859375,
-0.057281494140625,
-0.0244293212890625,
-0.0350341796875,
-0.0782470703125,
-0.0220489501953125,
0.07330322265625,
-0.02587890625,
0.0183868408203125,
0.058135986328125,
0.07220458984375,
-0.0172119140625,
-0.0299224853515625,
0.005489349365234375,
0.013519287109375,
0.0173492431640625,
0.04815673828125,
0.042205810546875,
-0.050201416015625,
0.0330810546875,
-0.031219482421875,
-0.043426513671875,
-0.0025424957275390625,
-0.05810546875,
-0.078857421875,
-0.0633544921875,
-0.04803466796875,
-0.0234832763671875,
-0.006038665771484375,
0.050872802734375,
0.0777587890625,
-0.047271728515625,
-0.01236724853515625,
-0.0201416015625,
-0.01247406005859375,
-0.01468658447265625,
-0.014892578125,
0.033416748046875,
-0.0180511474609375,
-0.060577392578125,
-0.01085662841796875,
0.004573822021484375,
0.0248565673828125,
0.0042266845703125,
-0.01629638671875,
-0.012939453125,
-0.032684326171875,
0.054962158203125,
0.038818359375,
-0.03851318359375,
-0.0019311904907226562,
-0.004337310791015625,
-0.00540924072265625,
0.017608642578125,
0.044921875,
-0.059906005859375,
0.0279998779296875,
0.0386962890625,
0.041595458984375,
0.064453125,
0.0099945068359375,
0.024658203125,
-0.0364990234375,
0.042694091796875,
-0.0096282958984375,
0.0244293212890625,
0.0293731689453125,
-0.01788330078125,
0.0267486572265625,
0.0411376953125,
-0.01355743408203125,
-0.052032470703125,
0.013275146484375,
-0.08819580078125,
-0.0264129638671875,
0.0914306640625,
-0.0203094482421875,
-0.0445556640625,
0.037994384765625,
-0.0233917236328125,
0.0312347412109375,
-0.002262115478515625,
0.03131103515625,
0.02935791015625,
-0.00594329833984375,
-0.0650634765625,
-0.0216217041015625,
0.031768798828125,
0.0313720703125,
-0.06298828125,
-0.01302337646484375,
0.035614013671875,
0.032745361328125,
0.0186920166015625,
0.04290771484375,
-0.031890869140625,
0.0355224609375,
-0.0032367706298828125,
0.04534912109375,
-0.032440185546875,
-0.020233154296875,
-0.0164642333984375,
-0.002826690673828125,
-0.014434814453125,
-0.0173492431640625
]
] |
openai/whisper-base.en | 2023-09-08T11:02:17.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"en",
"arxiv:2212.04356",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | openai | null | null | openai/whisper-base.en | 12 | 58,066 | transformers | 2022-09-26T06:58:29 | ---
language:
- en
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
widget:
- example_title: Librispeech sample 1
src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
model-index:
- name: whisper-base.en
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (other)
type: librispeech_asr
config: other
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 12.803978669490565
pipeline_tag: automatic-speech-recognition
license: apache-2.0
---
# Whisper
Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. Trained on 680k hours
of labelled data, Whisper models demonstrate a strong ability to generalise to many datasets and domains **without** the need
for fine-tuning.
Whisper was proposed in the paper [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356)
by Alec Radford et al. from OpenAI. The original code repository can be found [here](https://github.com/openai/whisper).
**Disclaimer**: Content for this model card has partly been written by the Hugging Face team, and parts of it were
copied and pasted from the original model card.
## Model details
Whisper is a Transformer based encoder-decoder model, also referred to as a _sequence-to-sequence_ model.
It was trained on 680k hours of labelled speech data annotated using large-scale weak supervision.
The models were trained on either English-only data or multilingual data. The English-only models were trained
on the task of speech recognition. The multilingual models were trained on both speech recognition and speech
translation. For speech recognition, the model predicts transcriptions in the *same* language as the audio.
For speech translation, the model predicts transcriptions in a *different* language from the audio.
Whisper checkpoints come in five configurations of varying model sizes.
The smallest four are trained on either English-only or multilingual data.
The largest checkpoints are multilingual only. All ten of the pre-trained checkpoints
are available on the [Hugging Face Hub](https://huggingface.co/models?search=openai/whisper). The
checkpoints are summarised in the following table with links to the models on the Hub:
| Size | Parameters | English-only | Multilingual |
|----------|------------|------------------------------------------------------|-----------------------------------------------------|
| tiny | 39 M | [✓](https://huggingface.co/openai/whisper-tiny.en) | [✓](https://huggingface.co/openai/whisper-tiny) |
| base | 74 M | [✓](https://huggingface.co/openai/whisper-base.en) | [✓](https://huggingface.co/openai/whisper-base) |
| small | 244 M | [✓](https://huggingface.co/openai/whisper-small.en) | [✓](https://huggingface.co/openai/whisper-small) |
| medium | 769 M | [✓](https://huggingface.co/openai/whisper-medium.en) | [✓](https://huggingface.co/openai/whisper-medium) |
| large | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large) |
| large-v2 | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large-v2) |
# Usage
This checkpoint is an *English-only* model, meaning it can be used for English speech recognition. For multilingual speech
recognition or speech translation, use one of the multilingual checkpoints.
To transcribe audio samples, the model has to be used alongside a [`WhisperProcessor`](https://huggingface.co/docs/transformers/model_doc/whisper#transformers.WhisperProcessor).
The `WhisperProcessor` is used to:
1. Pre-process the audio inputs (converting them to log-Mel spectrograms for the model)
2. Post-process the model outputs (converting them from tokens to text)
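As a rough sketch of the pre-processing arithmetic, the front-end constants below (16 kHz mono audio, a 160-sample hop) are the standard Whisper values but are stated here as assumptions rather than read from the `transformers` config; under them, one fixed 30-second window maps to 3,000 log-Mel frames:

```python
# Illustrative Whisper front-end constants (assumed, not read from a config).
SAMPLE_RATE = 16_000      # Whisper expects 16 kHz mono audio
HOP_LENGTH = 160          # 160 samples = 10 ms -> 100 frames per second
CHUNK_SECONDS = 30        # fixed input window the model was trained on

n_samples = CHUNK_SECONDS * SAMPLE_RATE   # 480_000 audio samples per window
n_frames = n_samples // HOP_LENGTH        # 3_000 log-Mel spectrogram frames
print(n_samples, n_frames)
```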
## Transcription
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import load_dataset
>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-base.en")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-base.en")
>>> # load dummy dataset and read audio files
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> input_features = processor(sample["array"], sampling_rate=sample["sampling_rate"], return_tensors="pt").input_features
>>> # generate token ids
>>> predicted_ids = model.generate(input_features)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=False)
['<|startoftranscript|><|notimestamps|> Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel.<|endoftext|>']
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.']
```
The context tokens can be removed from the start of the transcription by setting `skip_special_tokens=True`.
## Evaluation
This code snippet shows how to evaluate Whisper base.en on [LibriSpeech test-clean](https://huggingface.co/datasets/librispeech_asr):
```python
>>> from datasets import load_dataset
>>> from transformers import WhisperForConditionalGeneration, WhisperProcessor
>>> import torch
>>> from evaluate import load
>>> librispeech_test_clean = load_dataset("librispeech_asr", "clean", split="test")
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-base.en")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-base.en").to("cuda")
>>> def map_to_pred(batch):
>>> audio = batch["audio"]
>>> input_features = processor(audio["array"], sampling_rate=audio["sampling_rate"], return_tensors="pt").input_features
>>> batch["reference"] = processor.tokenizer._normalize(batch['text'])
>>>
>>> with torch.no_grad():
>>> predicted_ids = model.generate(input_features.to("cuda"))[0]
>>> transcription = processor.decode(predicted_ids)
>>> batch["prediction"] = processor.tokenizer._normalize(transcription)
>>> return batch
>>> result = librispeech_test_clean.map(map_to_pred)
>>> wer = load("wer")
>>> print(100 * wer.compute(references=result["reference"], predictions=result["prediction"]))
4.271408904897505
```
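The WER figure printed above is word-level edit distance divided by the number of reference words. The following is a minimal, self-contained illustration of that definition, not the implementation used by `evaluate` (which, as in the snippet above, is usually paired with text normalisation):

```python
def wer(reference: str, prediction: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), prediction.split()
    # d[j] holds the edit distance between the processed prefix of `ref`
    # and the first j words of `hyp` (classic single-row dynamic programme).
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(
                d[j] + 1,                           # deletion
                d[j - 1] + 1,                       # insertion
                prev + (ref[i - 1] != hyp[j - 1]),  # substitution (or match)
            )
            prev = cur
    return d[-1] / len(ref)

# One substituted word out of five reference words -> WER of 0.2
print(wer("mr quilter is the apostle", "mr quilter was the apostle"))
```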
## Long-Form Transcription
The Whisper model is intrinsically designed to work on audio samples of up to 30s in duration. However, by using a chunking
algorithm, it can be used to transcribe audio samples of arbitrary length. This is possible through the Transformers
[`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline)
function. Chunking is enabled by setting `chunk_length_s=30` when instantiating the pipeline. With chunking enabled, the pipeline
can be run with batched inference. It can also be extended to predict sequence-level timestamps by passing `return_timestamps=True`:
```python
>>> import torch
>>> from transformers import pipeline
>>> from datasets import load_dataset
>>> device = "cuda:0" if torch.cuda.is_available() else "cpu"
>>> pipe = pipeline(
...     "automatic-speech-recognition",
...     model="openai/whisper-base.en",
...     chunk_length_s=30,
...     device=device,
... )
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> prediction = pipe(sample.copy(), batch_size=8)["text"]
" Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel."
>>> # we can also return timestamps for the predictions
>>> prediction = pipe(sample.copy(), batch_size=8, return_timestamps=True)["chunks"]
[{'text': ' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.',
'timestamp': (0.0, 5.44)}]
```
Refer to the blog post [ASR Chunking](https://huggingface.co/blog/asr-chunking) for more details on the chunking algorithm.
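As a back-of-the-envelope sketch of the chunking arithmetic: the pipeline processes overlapping 30 s windows and later trims the overlap. The per-side stride of one sixth of the chunk length below mirrors the pipeline's default behaviour, but treat it as an assumption for illustration:

```python
import math

def num_chunks(audio_s: float, chunk_s: float = 30.0) -> int:
    """Number of overlapping windows needed to cover audio_s seconds of audio."""
    stride_s = chunk_s / 6              # assumed per-side overlap (5 s for 30 s chunks)
    effective = chunk_s - 2 * stride_s  # only 20 s of each window is actually kept
    if audio_s <= chunk_s:
        return 1
    return math.ceil(audio_s / effective)

print(num_chunks(25))   # fits in a single window → 1
print(num_chunks(300))  # a 5-minute file → 15 windows of 20 s effective audio each
```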
## Fine-Tuning
The pre-trained Whisper model demonstrates a strong ability to generalise to different datasets and domains. However,
its predictive capabilities can be improved further for certain languages and tasks through *fine-tuning*. The blog
post [Fine-Tune Whisper with 🤗 Transformers](https://huggingface.co/blog/fine-tune-whisper) provides a step-by-step
guide to fine-tuning the Whisper model with as little as 5 hours of labelled data.
### Evaluated Use
The primary intended users of these models are AI researchers studying robustness, generalization, capabilities, biases, and constraints of the current model. However, Whisper is also potentially quite useful as an ASR solution for developers, especially for English speech recognition. We recognize that once models are released, it is impossible to restrict access to only “intended” uses or to draw reasonable guidelines around what is or is not research.
The models are primarily trained and evaluated on ASR and speech translation to English tasks. They show strong ASR results in ~10 languages. They may exhibit additional capabilities, particularly if fine-tuned on certain tasks like voice activity detection, speaker classification, or speaker diarization but have not been robustly evaluated in these areas. We strongly recommend that users perform robust evaluations of the models in a particular context and domain before deploying them.
In particular, we caution against using Whisper models to transcribe recordings of individuals taken without their consent, or purporting to use these models for any kind of subjective classification. We recommend against use in high-risk domains like decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes. The models are intended to transcribe and translate speech; using them for classification is not only unevaluated but also inappropriate, particularly to infer human attributes.
## Training Data
The models are trained on 680,000 hours of audio and the corresponding transcripts collected from the internet. 65% of this data (or 438,000 hours) represents English-language audio and matched English transcripts, roughly 18% (or 126,000 hours) represents non-English audio and English transcripts, while the final 17% (or 117,000 hours) represents non-English audio and the corresponding transcript. This non-English data represents 98 different languages.
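As a quick sanity check, the rounded hour figures quoted above sum back to roughly the 680,000-hour total (the small discrepancy is rounding in the paper):

```python
english = 438_000       # English audio with matched English transcripts (~65%)
translated = 126_000    # non-English audio with English transcripts (~18%)
multilingual = 117_000  # non-English audio with corresponding transcripts (~17%)

total = english + translated + multilingual
print(total)                              # 681000, within rounding of 680,000
print(round(100 * english / 680_000, 1))  # → 64.4 (the ~65% English share)
```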
As discussed in [the accompanying paper](https://cdn.openai.com/papers/whisper.pdf), we see that performance on transcription in a given language is directly correlated with the amount of training data we employ in that language.
## Performance and Limitations
Our studies show that, compared to many existing ASR systems, the models exhibit improved robustness to accents, background noise, and technical language, as well as zero-shot translation from multiple languages into English, and that accuracy on speech recognition and translation is near the state-of-the-art level.
However, because the models are trained in a weakly supervised manner using large-scale noisy data, the predictions may include texts that are not actually spoken in the audio input (i.e. hallucination). We hypothesize that this happens because, given their general knowledge of language, the models combine trying to predict the next word in audio with trying to transcribe the audio itself.
Our models perform unevenly across languages, and we observe lower accuracy on low-resource and/or low-discoverability languages or languages where we have less training data. The models also exhibit disparate performance on different accents and dialects of particular languages, which may include higher word error rate across speakers of different genders, races, ages, or other demographic criteria. Our full evaluation results are presented in [the paper accompanying this release](https://cdn.openai.com/papers/whisper.pdf).
In addition, the sequence-to-sequence architecture of the model makes it prone to generating repetitive texts, which can be mitigated to some degree by beam search and temperature scheduling but not perfectly. Further analysis of these limitations is provided in [the paper](https://cdn.openai.com/papers/whisper.pdf). It is likely that this behavior and hallucinations may be worse on lower-resource and/or lower-discoverability languages.
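This repetition failure mode is also why decoding-time constraints help: for example, Transformers' `generate` exposes a `no_repeat_ngram_size` option that blocks any n-gram from occurring twice. A minimal pure-Python sketch of that check (illustrative, not the library's implementation):

```python
def violates_no_repeat(tokens: list, n: int = 3) -> bool:
    """True if any n-gram occurs more than once in the token sequence."""
    seen = set()
    for i in range(len(tokens) - n + 1):
        ngram = tuple(tokens[i : i + n])
        if ngram in seen:
            return True
        seen.add(ngram)
    return False

print(violates_no_repeat("the cat sat on the mat".split()))   # no repeated 3-gram → False
print(violates_no_repeat("so so so so so".split(), n=2))      # ("so", "so") repeats → True
```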
## Broader Implications
We anticipate that Whisper models’ transcription capabilities may be used for improving accessibility tools. While Whisper models cannot be used for real-time transcription out of the box, their speed and size suggest that others may be able to build applications on top of them that allow for near-real-time speech recognition and translation. The real value of beneficial applications built on top of Whisper models suggests that the disparate performance of these models may have real economic implications.
There are also potential dual use concerns that come with releasing Whisper. While we hope the technology will be used primarily for beneficial purposes, making ASR technology more accessible could enable more actors to build capable surveillance technologies or scale up existing surveillance efforts, as the speed and accuracy allow for affordable automatic transcription and translation of large volumes of audio communication. Moreover, these models may have some capabilities to recognize specific individuals out of the box, which in turn presents safety concerns related both to dual use and disparate performance. In practice, we expect that the cost of transcription is not the limiting factor of scaling up surveillance projects.
### BibTeX entry and citation info
```bibtex
@misc{radford2022whisper,
doi = {10.48550/ARXIV.2212.04356},
url = {https://arxiv.org/abs/2212.04356},
author = {Radford, Alec and Kim, Jong Wook and Xu, Tao and Brockman, Greg and McLeavey, Christine and Sutskever, Ilya},
title = {Robust Speech Recognition via Large-Scale Weak Supervision},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
| 14,815 | [
[
-0.020172119140625,
-0.045623779296875,
0.006671905517578125,
0.03472900390625,
-0.0043792724609375,
0.0009546279907226562,
-0.0272216796875,
-0.04620361328125,
0.0175933837890625,
0.0247650146484375,
-0.0615234375,
-0.040252685546875,
-0.0538330078125,
-0.012664794921875,
-0.043365478515625,
0.0750732421875,
0.01198577880859375,
-0.001567840576171875,
0.0167388916015625,
-0.005107879638671875,
-0.0248870849609375,
-0.0207672119140625,
-0.051727294921875,
-0.01444244384765625,
0.0133209228515625,
0.0104217529296875,
0.027801513671875,
0.039825439453125,
0.009796142578125,
0.03155517578125,
-0.03179931640625,
-0.0048370361328125,
-0.027435302734375,
-0.006504058837890625,
0.02886962890625,
-0.03509521484375,
-0.046478271484375,
0.01226043701171875,
0.057769775390625,
0.035858154296875,
-0.026947021484375,
0.0321044921875,
0.0179595947265625,
0.0242919921875,
-0.0221099853515625,
0.02032470703125,
-0.0506591796875,
-0.0099945068359375,
-0.0199737548828125,
0.0006031990051269531,
-0.0253448486328125,
-0.0229339599609375,
0.04315185546875,
-0.045196533203125,
0.029022216796875,
0.0103607177734375,
0.077880859375,
0.018157958984375,
-0.0036411285400390625,
-0.031982421875,
-0.053619384765625,
0.08367919921875,
-0.06689453125,
0.039031982421875,
0.0295562744140625,
0.018280029296875,
0.0023975372314453125,
-0.06890869140625,
-0.052520751953125,
-0.0007085800170898438,
-0.0016326904296875,
0.0241241455078125,
-0.0264434814453125,
-0.0003192424774169922,
0.017822265625,
0.03131103515625,
-0.0350341796875,
0.0004649162292480469,
-0.053192138671875,
-0.050201416015625,
0.047027587890625,
-0.0005640983581542969,
0.02154541015625,
-0.0201263427734375,
-0.018829345703125,
-0.0306854248046875,
-0.0201416015625,
0.033447265625,
0.028839111328125,
0.03472900390625,
-0.05328369140625,
0.0265350341796875,
-0.00555419921875,
0.045806884765625,
0.015655517578125,
-0.045257568359375,
0.047149658203125,
-0.010772705078125,
-0.014739990234375,
0.0282135009765625,
0.07757568359375,
0.0172882080078125,
0.0076141357421875,
0.006359100341796875,
-0.0102386474609375,
0.01385498046875,
-0.006656646728515625,
-0.0635986328125,
-0.0038204193115234375,
0.037322998046875,
-0.0401611328125,
-0.0229339599609375,
-0.017547607421875,
-0.046722412109375,
0.0099639892578125,
-0.0118560791015625,
0.05133056640625,
-0.04327392578125,
-0.0262603759765625,
0.01812744140625,
-0.02996826171875,
0.0237579345703125,
0.0013141632080078125,
-0.061614990234375,
0.0274200439453125,
0.03363037109375,
0.065185546875,
0.00731658935546875,
-0.046478271484375,
-0.03582763671875,
0.0076141357421875,
0.00931549072265625,
0.034210205078125,
-0.01934814453125,
-0.04327392578125,
-0.015838623046875,
0.006072998046875,
-0.0252227783203125,
-0.0439453125,
0.053558349609375,
-0.0098876953125,
0.03631591796875,
0.0005931854248046875,
-0.039398193359375,
-0.0156402587890625,
-0.01459503173828125,
-0.0303955078125,
0.0697021484375,
0.006114959716796875,
-0.052947998046875,
0.011962890625,
-0.03765869140625,
-0.0362548828125,
-0.0207672119140625,
0.0147247314453125,
-0.045867919921875,
-0.004665374755859375,
0.03253173828125,
0.0299072265625,
-0.01282501220703125,
0.005474090576171875,
-0.003101348876953125,
-0.0303192138671875,
0.02447509765625,
-0.0309600830078125,
0.07598876953125,
0.0124359130859375,
-0.032958984375,
0.015869140625,
-0.05841064453125,
0.01065826416015625,
0.002471923828125,
-0.01177978515625,
0.01250457763671875,
-0.0028076171875,
0.022125244140625,
0.0017728805541992188,
0.01241302490234375,
-0.056640625,
-0.00928497314453125,
-0.04913330078125,
0.054290771484375,
0.0469970703125,
-0.005817413330078125,
0.02642822265625,
-0.0443115234375,
0.0220794677734375,
0.00510406494140625,
0.03253173828125,
-0.01447296142578125,
-0.04608154296875,
-0.0732421875,
-0.03094482421875,
0.03289794921875,
0.05352783203125,
-0.0233154296875,
0.044189453125,
-0.01479339599609375,
-0.05645751953125,
-0.09698486328125,
-0.01220703125,
0.0433349609375,
0.043701171875,
0.05316162109375,
-0.01303863525390625,
-0.058380126953125,
-0.052337646484375,
-0.011444091796875,
-0.0242919921875,
-0.01412200927734375,
0.027618408203125,
0.0227508544921875,
-0.0263671875,
0.051727294921875,
-0.037139892578125,
-0.038665771484375,
-0.0187835693359375,
0.0036869049072265625,
0.0361328125,
0.04827880859375,
0.01971435546875,
-0.052734375,
-0.031982421875,
-0.01474761962890625,
-0.04461669921875,
-0.0077362060546875,
-0.0041046142578125,
0.00426483154296875,
0.0015621185302734375,
0.02667236328125,
-0.053924560546875,
0.030975341796875,
0.0504150390625,
-0.012847900390625,
0.0543212890625,
0.01287841796875,
-0.0028018951416015625,
-0.0869140625,
-0.0037479400634765625,
-0.01114654541015625,
-0.0081939697265625,
-0.05059814453125,
-0.019287109375,
-0.00832366943359375,
-0.006320953369140625,
-0.03839111328125,
0.045257568359375,
-0.02581787109375,
0.0027370452880859375,
-0.004642486572265625,
0.00672149658203125,
-0.0031948089599609375,
0.039154052734375,
0.0147247314453125,
0.048370361328125,
0.06439208984375,
-0.04229736328125,
0.016021728515625,
0.0413818359375,
-0.024261474609375,
0.02105712890625,
-0.0750732421875,
0.01303863525390625,
0.00965118408203125,
0.015472412109375,
-0.0518798828125,
-0.00739288330078125,
0.00173187255859375,
-0.0740966796875,
0.03265380859375,
-0.0244140625,
-0.0238800048828125,
-0.037506103515625,
-0.0134429931640625,
0.0045318603515625,
0.06866455078125,
-0.035552978515625,
0.054046630859375,
0.0321044921875,
-0.0161285400390625,
-0.043365478515625,
-0.040435791015625,
-0.017822265625,
-0.01763916015625,
-0.057769775390625,
0.03765869140625,
-0.009674072265625,
-0.00079345703125,
-0.01377105712890625,
-0.0094146728515625,
0.009368896484375,
-0.01812744140625,
0.0372314453125,
0.03704833984375,
-0.00960540771484375,
-0.0207977294921875,
0.0146942138671875,
-0.0192413330078125,
-0.00031566619873046875,
-0.0178985595703125,
0.052734375,
-0.026123046875,
-0.005527496337890625,
-0.05828857421875,
0.0161895751953125,
0.03790283203125,
-0.0268096923828125,
0.041290283203125,
0.0634765625,
-0.0203094482421875,
-0.017333984375,
-0.053466796875,
-0.016754150390625,
-0.0433349609375,
0.01175689697265625,
-0.026885986328125,
-0.05792236328125,
0.052978515625,
0.01290130615234375,
0.008575439453125,
0.0469970703125,
0.038116455078125,
-0.020751953125,
0.0687255859375,
0.031982421875,
-0.020355224609375,
0.021881103515625,
-0.058135986328125,
-0.01027679443359375,
-0.0791015625,
-0.0285491943359375,
-0.04241943359375,
-0.0198822021484375,
-0.036346435546875,
-0.026214599609375,
0.0386962890625,
0.00891876220703125,
-0.0093841552734375,
0.034149169921875,
-0.057098388671875,
0.0002765655517578125,
0.048583984375,
0.003269195556640625,
0.0091094970703125,
-0.00391387939453125,
-0.00923919677734375,
-0.0052337646484375,
-0.0265960693359375,
-0.0247039794921875,
0.07281494140625,
0.039306640625,
0.04119873046875,
-0.007404327392578125,
0.055419921875,
-0.003376007080078125,
0.0033111572265625,
-0.05670166015625,
0.03619384765625,
-0.008636474609375,
-0.043731689453125,
-0.02813720703125,
-0.0214691162109375,
-0.0614013671875,
0.01172637939453125,
-0.01316070556640625,
-0.05035400390625,
0.0096893310546875,
-0.00623321533203125,
-0.0231475830078125,
0.0195159912109375,
-0.05389404296875,
0.04266357421875,
0.00943756103515625,
0.0098876953125,
-0.00423431396484375,
-0.056884765625,
0.00801849365234375,
0.0085906982421875,
0.0095367431640625,
-0.0120849609375,
0.0167236328125,
0.08099365234375,
-0.03436279296875,
0.06793212890625,
-0.026702880859375,
0.00815582275390625,
0.039031982421875,
-0.01557159423828125,
0.0240631103515625,
-0.01806640625,
-0.0130767822265625,
0.0325927734375,
0.022003173828125,
-0.0224609375,
-0.0225830078125,
0.03668212890625,
-0.08197021484375,
-0.023529052734375,
-0.0194091796875,
-0.0289764404296875,
-0.013580322265625,
0.01479339599609375,
0.061676025390625,
0.050811767578125,
-0.00667572021484375,
-0.00022864341735839844,
0.03607177734375,
-0.0175323486328125,
0.03973388671875,
0.048980712890625,
-0.016998291015625,
-0.0357666015625,
0.070068359375,
0.0179443359375,
0.019256591796875,
0.01326751708984375,
0.031280517578125,
-0.03271484375,
-0.051544189453125,
-0.04071044921875,
0.023345947265625,
-0.02862548828125,
-0.01300048828125,
-0.0665283203125,
-0.04107666015625,
-0.045654296875,
0.0005331039428710938,
-0.036041259765625,
-0.0211181640625,
-0.03021240234375,
0.005840301513671875,
0.04534912109375,
0.03118896484375,
0.0016613006591796875,
0.041717529296875,
-0.0689697265625,
0.0308380126953125,
0.024810791015625,
0.0090789794921875,
0.00429534912109375,
-0.0726318359375,
-0.006870269775390625,
0.01474761962890625,
-0.025482177734375,
-0.047119140625,
0.0361328125,
0.027801513671875,
0.03289794921875,
0.0177764892578125,
0.00040411949157714844,
0.07159423828125,
-0.052093505859375,
0.0584716796875,
0.0162811279296875,
-0.0921630859375,
0.055633544921875,
-0.0276336669921875,
0.0183563232421875,
0.03228759765625,
0.0261077880859375,
-0.04559326171875,
-0.0391845703125,
-0.05169677734375,
-0.049041748046875,
0.051300048828125,
0.0225677490234375,
0.007110595703125,
0.020751953125,
0.016143798828125,
0.00909423828125,
0.01010894775390625,
-0.03424072265625,
-0.035888671875,
-0.0309600830078125,
-0.01873779296875,
-0.00677490234375,
-0.0029468536376953125,
0.0006322860717773438,
-0.03961181640625,
0.05828857421875,
0.0011157989501953125,
0.033905029296875,
0.030426025390625,
0.0035114288330078125,
-0.0017757415771484375,
0.012237548828125,
0.0247039794921875,
0.0195159912109375,
-0.020233154296875,
-0.0289306640625,
0.0259246826171875,
-0.06427001953125,
0.0014657974243164062,
0.025421142578125,
-0.0222015380859375,
0.0086212158203125,
0.05328369140625,
0.083251953125,
0.01412200927734375,
-0.0372314453125,
0.049835205078125,
-0.005680084228515625,
-0.01739501953125,
-0.047576904296875,
0.003253936767578125,
0.0244598388671875,
0.022186279296875,
0.0287628173828125,
0.00965118408203125,
0.01271820068359375,
-0.038177490234375,
0.0111236572265625,
0.0200958251953125,
-0.039306640625,
-0.039794921875,
0.06414794921875,
0.005550384521484375,
-0.0295562744140625,
0.055267333984375,
0.003398895263671875,
-0.047943115234375,
0.03515625,
0.049835205078125,
0.0728759765625,
-0.03790283203125,
-0.002231597900390625,
0.03436279296875,
0.0159912109375,
0.0020465850830078125,
0.03692626953125,
-0.0038166046142578125,
-0.055023193359375,
-0.03515625,
-0.07843017578125,
-0.025848388671875,
0.001735687255859375,
-0.07330322265625,
0.0259552001953125,
-0.023712158203125,
-0.019805908203125,
0.0232696533203125,
0.00605010986328125,
-0.055694580078125,
0.0145263671875,
0.003368377685546875,
0.07611083984375,
-0.051483154296875,
0.07489013671875,
0.0126495361328125,
-0.019805908203125,
-0.0833740234375,
0.0004372596740722656,
0.003238677978515625,
-0.07464599609375,
0.02349853515625,
0.0241851806640625,
-0.016754150390625,
0.0094757080078125,
-0.0386962890625,
-0.053802490234375,
0.0806884765625,
0.010955810546875,
-0.051483154296875,
-0.01522064208984375,
-0.005176544189453125,
0.03961181640625,
-0.0162811279296875,
0.017852783203125,
0.0548095703125,
0.033294677734375,
0.01094818115234375,
-0.109619140625,
-0.010345458984375,
-0.0203857421875,
-0.0195159912109375,
0.0017261505126953125,
-0.05914306640625,
0.0689697265625,
-0.0322265625,
-0.017364501953125,
0.02325439453125,
0.058990478515625,
0.031646728515625,
0.030609130859375,
0.0489501953125,
0.04473876953125,
0.056671142578125,
-0.01366424560546875,
0.07208251953125,
-0.01313018798828125,
0.017730712890625,
0.07183837890625,
-0.00693511962890625,
0.08367919921875,
0.0152587890625,
-0.03753662109375,
0.05255126953125,
0.024871826171875,
-0.002933502197265625,
0.035736083984375,
-0.00018358230590820312,
-0.026153564453125,
0.01129150390625,
-0.00774383544921875,
-0.03961181640625,
0.059967041015625,
0.032379150390625,
-0.01593017578125,
0.030731201171875,
0.0102081298828125,
0.0077972412109375,
-0.009979248046875,
-0.01715087890625,
0.06512451171875,
0.01412200927734375,
-0.031280517578125,
0.06243896484375,
-0.005344390869140625,
0.08154296875,
-0.06195068359375,
0.01355743408203125,
0.014862060546875,
0.015777587890625,
-0.01751708984375,
-0.047332763671875,
0.0250701904296875,
-0.01428985595703125,
-0.0188751220703125,
-0.01323699951171875,
0.042755126953125,
-0.0491943359375,
-0.04083251953125,
0.03912353515625,
0.027618408203125,
0.02362060546875,
-0.01038360595703125,
-0.061279296875,
0.03338623046875,
0.0156402587890625,
-0.0131683349609375,
0.01073455810546875,
0.0089263916015625,
0.024993896484375,
0.051727294921875,
0.0645751953125,
0.033355712890625,
0.01666259765625,
0.0106353759765625,
0.061187744140625,
-0.051300048828125,
-0.04248046875,
-0.047760009765625,
0.039306640625,
-0.0020160675048828125,
-0.0303192138671875,
0.06402587890625,
0.049774169921875,
0.052642822265625,
0.004589080810546875,
0.0556640625,
-0.0002868175506591797,
0.07794189453125,
-0.03729248046875,
0.0655517578125,
-0.027984619140625,
0.004405975341796875,
-0.0285797119140625,
-0.052734375,
0.01043701171875,
0.041900634765625,
-0.005817413330078125,
-0.0010852813720703125,
0.026763916015625,
0.065673828125,
0.0005030632019042969,
0.0212554931640625,
0.00250244140625,
0.035919189453125,
0.0168609619140625,
0.03680419921875,
0.048187255859375,
-0.060882568359375,
0.0484619140625,
-0.04248046875,
-0.01800537109375,
0.01073455810546875,
-0.034515380859375,
-0.06439208984375,
-0.062286376953125,
-0.020965576171875,
-0.044830322265625,
-0.018096923828125,
0.053192138671875,
0.06878662109375,
-0.06207275390625,
-0.0306549072265625,
0.0258026123046875,
-0.006622314453125,
-0.0257415771484375,
-0.018768310546875,
0.04376220703125,
0.0031795501708984375,
-0.06976318359375,
0.049652099609375,
0.0023326873779296875,
0.02362060546875,
-0.0183563232421875,
-0.0156707763671875,
0.006038665771484375,
-0.0009150505065917969,
0.03851318359375,
0.017547607421875,
-0.06036376953125,
-0.01366424560546875,
0.00983428955078125,
0.01459503173828125,
0.00021982192993164062,
0.0284423828125,
-0.056427001953125,
0.0303955078125,
0.019927978515625,
0.00650787353515625,
0.07000732421875,
-0.020965576171875,
0.0224609375,
-0.0513916015625,
0.03173828125,
0.0221099853515625,
0.02593994140625,
0.0280303955078125,
-0.01171112060546875,
0.0157318115234375,
0.0184478759765625,
-0.045196533203125,
-0.0728759765625,
-0.005706787109375,
-0.089111328125,
-0.002567291259765625,
0.074462890625,
0.0026798248291015625,
-0.0205230712890625,
-0.0071868896484375,
-0.023956298828125,
0.039215087890625,
-0.038604736328125,
0.032440185546875,
0.031951904296875,
0.0026187896728515625,
-0.00356292724609375,
-0.043487548828125,
0.048614501953125,
0.015655517578125,
-0.0274810791015625,
-0.006561279296875,
0.00444793701171875,
0.04595947265625,
0.023651123046875,
0.06427001953125,
-0.0234527587890625,
0.0115509033203125,
0.013336181640625,
0.016357421875,
-0.002227783203125,
-0.011688232421875,
-0.024078369140625,
-0.005603790283203125,
-0.016510009765625,
-0.032073974609375
]
] |
microsoft/trocr-small-handwritten | 2023-01-24T16:57:42.000Z | [
"transformers",
"pytorch",
"vision-encoder-decoder",
"trocr",
"image-to-text",
"arxiv:2109.10282",
"endpoints_compatible",
"has_space",
"region:us"
] | image-to-text | microsoft | null | null | microsoft/trocr-small-handwritten | 20 | 57,235 | transformers | 2022-03-02T23:29:05 | ---
tags:
- trocr
- image-to-text
widget:
- src: https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg
example_title: Note 1
- src: https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSoolxi9yWGAT5SLZShv8vVd0bz47UWRzQC19fDTeE8GmGv_Rn-PCF1pP1rrUx8kOjA4gg&usqp=CAU
example_title: Note 2
- src: https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRNYtTuSBpZPV_nkBYPMFwVVD9asZOPgHww4epu9EqWgDmXW--sE2o8og40ZfDGo87j5w&usqp=CAU
example_title: Note 3
---
# TrOCR (small-sized model, fine-tuned on IAM)
TrOCR model fine-tuned on the [IAM dataset](https://fki.tic.heia-fr.ch/databases/iam-handwriting-database). It was introduced in the paper [TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models](https://arxiv.org/abs/2109.10282) by Li et al. and first released in [this repository](https://github.com/microsoft/unilm/tree/master/trocr).
## Model description
The TrOCR model is an encoder-decoder model, consisting of an image Transformer as encoder, and a text Transformer as decoder. The image encoder was initialized from the weights of DeiT, while the text decoder was initialized from the weights of UniLM.
Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. Absolute position embeddings are added before the sequence is fed to the layers of the Transformer encoder. The Transformer text decoder then autoregressively generates tokens.
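To make the patch sequence concrete: with 16x16 patches, an input resized to 384x384 (a resolution commonly used by DeiT-style encoders; the exact size here is an assumption — check the processor config for the real value) yields (384/16)² = 576 patch embeddings:

```python
patch = 16
height = width = 384  # assumed input resolution for illustration

# each non-overlapping 16x16 patch becomes one token for the encoder
num_patches = (height // patch) * (width // patch)
print(num_patches)  # → 576
```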
## Intended uses & limitations
You can use the raw model for optical character recognition (OCR) on single text-line images. See the [model hub](https://huggingface.co/models?search=microsoft/trocr) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model in PyTorch:
```python
from transformers import TrOCRProcessor, VisionEncoderDecoderModel
from PIL import Image
import requests
# load image from the IAM database
url = 'https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg'
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")
processor = TrOCRProcessor.from_pretrained('microsoft/trocr-small-handwritten')
model = VisionEncoderDecoderModel.from_pretrained('microsoft/trocr-small-handwritten')
pixel_values = processor(images=image, return_tensors="pt").pixel_values
generated_ids = model.generate(pixel_values)
generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
```
### BibTeX entry and citation info
```bibtex
@misc{li2021trocr,
title={TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models},
author={Minghao Li and Tengchao Lv and Lei Cui and Yijuan Lu and Dinei Florencio and Cha Zhang and Zhoujun Li and Furu Wei},
year={2021},
eprint={2109.10282},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 2,832 | [
[
-0.0121002197265625,
-0.0268402099609375,
0.01290130615234375,
-0.03924560546875,
-0.0290679931640625,
-0.0019817352294921875,
-0.0012903213500976562,
-0.06085205078125,
0.0006351470947265625,
0.0491943359375,
-0.022064208984375,
-0.0318603515625,
-0.04412841796875,
0.0152587890625,
-0.028533935546875,
0.07568359375,
-0.00925445556640625,
0.003856658935546875,
0.014617919921875,
-0.035186767578125,
-0.01097869873046875,
-0.0369873046875,
-0.045654296875,
-0.01000213623046875,
0.0280303955078125,
0.03533935546875,
0.04388427734375,
0.057037353515625,
0.0806884765625,
0.0268402099609375,
-0.0181884765625,
0.012908935546875,
-0.0180511474609375,
-0.0299530029296875,
0.0178680419921875,
-0.03668212890625,
-0.039794921875,
-0.0011072158813476562,
0.048797607421875,
0.0145721435546875,
0.00004380941390991211,
0.0136871337890625,
0.0087432861328125,
0.038604736328125,
-0.02154541015625,
-0.01348114013671875,
-0.02978515625,
0.0214080810546875,
-0.00591278076171875,
-0.0046539306640625,
-0.03680419921875,
-0.0292816162109375,
0.0213623046875,
-0.041229248046875,
0.053924560546875,
0.00970458984375,
0.0848388671875,
-0.00836944580078125,
-0.0256805419921875,
-0.04559326171875,
-0.0638427734375,
0.0496826171875,
-0.045501708984375,
0.0255126953125,
0.0032138824462890625,
0.0158538818359375,
0.0087738037109375,
-0.08856201171875,
-0.06402587890625,
-0.02783203125,
-0.02777099609375,
-0.0015716552734375,
-0.0174713134765625,
0.020050048828125,
0.029205322265625,
0.039031982421875,
-0.04473876953125,
-0.01313018798828125,
-0.055267333984375,
-0.02703857421875,
0.016204833984375,
0.00739288330078125,
0.02142333984375,
0.004058837890625,
-0.023773193359375,
-0.0289764404296875,
-0.0132293701171875,
-0.004638671875,
0.004241943359375,
0.000698089599609375,
-0.0244140625,
0.05364990234375,
0.0227508544921875,
0.0667724609375,
0.019866943359375,
-0.0247650146484375,
0.03045654296875,
-0.0140838623046875,
0.00118255615234375,
0.00635528564453125,
0.080078125,
0.0145111083984375,
0.0232696533203125,
-0.00928497314453125,
-0.0200042724609375,
0.0182342529296875,
0.00530242919921875,
-0.06414794921875,
-0.004413604736328125,
-0.01232147216796875,
-0.044189453125,
-0.0178680419921875,
0.0189208984375,
-0.0635986328125,
-0.01532745361328125,
-0.012054443359375,
0.037078857421875,
-0.0304107666015625,
0.02142333984375,
-0.00556182861328125,
-0.006587982177734375,
0.00785064697265625,
0.0204925537109375,
-0.041229248046875,
0.00540924072265625,
0.00955963134765625,
0.08929443359375,
-0.01267242431640625,
-0.0228118896484375,
-0.0250091552734375,
-0.003082275390625,
-0.0177764892578125,
0.049468994140625,
-0.01800537109375,
-0.022552490234375,
-0.01012420654296875,
0.0034465789794921875,
-0.0008172988891601562,
-0.035491943359375,
0.041015625,
-0.0307769775390625,
0.02825927734375,
0.0173492431640625,
0.0016469955444335938,
-0.0034580230712890625,
0.023651123046875,
-0.07086181640625,
0.08837890625,
0.01428985595703125,
-0.06048583984375,
0.01486968994140625,
-0.057373046875,
-0.022247314453125,
0.0027790069580078125,
0.01343536376953125,
-0.06402587890625,
0.002727508544921875,
0.005550384521484375,
0.004100799560546875,
-0.0288238525390625,
-0.00431060791015625,
-0.0063018798828125,
-0.035980224609375,
0.01641845703125,
-0.029449462890625,
0.052642822265625,
0.0274200439453125,
-0.02825927734375,
-0.0082244873046875,
-0.08258056640625,
0.0104522705078125,
0.0031299591064453125,
-0.02984619140625,
-0.007659912109375,
-0.0221405029296875,
0.0206451416015625,
0.03057861328125,
0.032073974609375,
-0.04608154296875,
0.0175628662109375,
-0.026611328125,
0.059814453125,
0.02740478515625,
-0.0119171142578125,
0.038299560546875,
-0.0196685791015625,
0.028228759765625,
0.01416778564453125,
0.00931549072265625,
-0.00678253173828125,
-0.01561737060546875,
-0.0782470703125,
-0.021392822265625,
0.0224151611328125,
0.050018310546875,
-0.0748291015625,
0.028411865234375,
-0.032806396484375,
-0.048675537109375,
-0.039337158203125,
-0.0053558349609375,
0.04229736328125,
0.05908203125,
0.029571533203125,
-0.042236328125,
-0.034942626953125,
-0.048095703125,
-0.01004791259765625,
-0.0185394287109375,
0.0020809173583984375,
0.0179901123046875,
0.052459716796875,
-0.022857666015625,
0.058563232421875,
-0.0258026123046875,
-0.06024169921875,
-0.0192413330078125,
0.026611328125,
0.017364501953125,
0.048919677734375,
0.027252197265625,
-0.04449462890625,
-0.047882080078125,
0.00444793701171875,
-0.045135498046875,
0.00928497314453125,
-0.007556915283203125,
-0.00563812255859375,
0.037261962890625,
0.025909423828125,
-0.04400634765625,
0.062103271484375,
0.024627685546875,
-0.03399658203125,
0.0372314453125,
-0.04315185546875,
0.013031005859375,
-0.07818603515625,
0.0270843505859375,
-0.0021953582763671875,
-0.0173492431640625,
-0.061279296875,
0.01097869873046875,
0.0143890380859375,
-0.0189666748046875,
-0.0262298583984375,
0.041534423828125,
-0.057769775390625,
-0.0034542083740234375,
-0.005680084228515625,
0.0008568763732910156,
0.00829315185546875,
0.05047607421875,
0.0294647216796875,
0.064208984375,
0.0111083984375,
-0.0290374755859375,
0.00720977783203125,
0.02386474609375,
-0.034027099609375,
0.035736083984375,
-0.0716552734375,
0.047515869140625,
0.0012226104736328125,
-0.012481689453125,
-0.0599365234375,
0.0220489501953125,
0.0268402099609375,
-0.0308074951171875,
0.0223236083984375,
0.002902984619140625,
-0.043731689453125,
-0.057220458984375,
-0.0034542083740234375,
0.036712646484375,
0.026519775390625,
-0.0428466796875,
0.08013916015625,
0.0091705322265625,
0.0279693603515625,
-0.037017822265625,
-0.0858154296875,
0.001346588134765625,
-0.00881195068359375,
-0.04888916015625,
0.0360107421875,
-0.0131072998046875,
0.0167694091796875,
-0.0051422119140625,
0.006458282470703125,
-0.01502227783203125,
-0.0288238525390625,
0.004482269287109375,
0.03961181640625,
-0.0224609375,
-0.01554107666015625,
-0.039459228515625,
-0.014129638671875,
-0.0313720703125,
-0.02093505859375,
0.04412841796875,
-0.0300445556640625,
-0.0009741783142089844,
-0.041656494140625,
0.01371002197265625,
0.05169677734375,
-0.035400390625,
0.0478515625,
0.049224853515625,
-0.0192718505859375,
0.003559112548828125,
-0.04052734375,
-0.01375579833984375,
-0.035736083984375,
0.025543212890625,
-0.0225677490234375,
-0.054931640625,
0.059234619140625,
0.033477783203125,
-0.009521484375,
0.034698486328125,
0.032012939453125,
-0.005680084228515625,
0.0615234375,
0.0540771484375,
0.0107879638671875,
0.0648193359375,
-0.04473876953125,
0.0221405029296875,
-0.06256103515625,
-0.024078369140625,
-0.037994384765625,
-0.032012939453125,
-0.040802001953125,
-0.0167999267578125,
0.02935791015625,
-0.00676727294921875,
-0.01432037353515625,
0.0400390625,
-0.075439453125,
0.025848388671875,
0.060577392578125,
0.03155517578125,
0.0181427001953125,
0.0143585205078125,
-0.0250244140625,
0.0037708282470703125,
-0.037139892578125,
-0.047119140625,
0.06103515625,
0.00720977783203125,
0.062103271484375,
-0.0133819580078125,
0.03973388671875,
0.0297088623046875,
0.0024585723876953125,
-0.06182861328125,
0.048797607421875,
-0.0276031494140625,
-0.0364990234375,
-0.00615692138671875,
-0.017669677734375,
-0.0760498046875,
-0.0004260540008544922,
-0.029388427734375,
-0.059326171875,
0.06072998046875,
0.03485107421875,
-0.01227569580078125,
0.0390625,
-0.0521240234375,
0.06793212890625,
-0.0276336669921875,
-0.0253753662109375,
0.016357421875,
-0.0595703125,
-0.005191802978515625,
0.011505126953125,
-0.01378631591796875,
0.0309600830078125,
0.00971221923828125,
0.07208251953125,
-0.0596923828125,
0.055694580078125,
-0.0003273487091064453,
0.0084075927734375,
0.04534912109375,
0.000720977783203125,
0.048675537109375,
-0.04302978515625,
-0.0158538818359375,
0.04364013671875,
0.01113128662109375,
-0.005279541015625,
-0.0181884765625,
0.0166473388671875,
-0.06884765625,
-0.00975799560546875,
-0.063720703125,
-0.046630859375,
0.026153564453125,
0.043792724609375,
0.06396484375,
0.0484619140625,
-0.0031795501708984375,
0.0038318634033203125,
0.03948974609375,
0.004734039306640625,
0.0404052734375,
0.032958984375,
0.008209228515625,
-0.05413818359375,
0.060699462890625,
0.0163116455078125,
0.020233154296875,
0.041656494140625,
0.01287841796875,
-0.01421356201171875,
-0.03240966796875,
-0.0174713134765625,
0.0274505615234375,
-0.05633544921875,
-0.018798828125,
-0.02783203125,
-0.022430419921875,
-0.019927978515625,
-0.011749267578125,
-0.012359619140625,
-0.024810791015625,
-0.06146240234375,
0.020233154296875,
0.02264404296875,
0.03759765625,
0.004634857177734375,
0.06109619140625,
-0.060333251953125,
0.031219482421875,
0.0019817352294921875,
0.0181884765625,
0.0017518997192382812,
-0.04949951171875,
-0.0239410400390625,
0.0023021697998046875,
-0.0307769775390625,
-0.05352783203125,
0.057891845703125,
0.033294677734375,
0.02130126953125,
0.035308837890625,
-0.0001575946807861328,
0.0556640625,
-0.041229248046875,
0.04132080078125,
0.031463623046875,
-0.0733642578125,
0.02752685546875,
0.00481414794921875,
0.02099609375,
0.03509521484375,
-0.0018777847290039062,
-0.043487548828125,
-0.009185791015625,
-0.040008544921875,
-0.042236328125,
0.0826416015625,
0.0028667449951171875,
-0.016845703125,
0.02362060546875,
0.03271484375,
-0.01453399658203125,
0.012359619140625,
-0.07684326171875,
-0.01485443115234375,
-0.0233306884765625,
-0.0550537109375,
-0.00823211669921875,
-0.027008056640625,
0.00899505615234375,
-0.02044677734375,
0.0310211181640625,
-0.00536346435546875,
0.06304931640625,
0.039337158203125,
-0.043548583984375,
-0.007106781005859375,
0.0009207725524902344,
0.054534912109375,
0.0304107666015625,
-0.0151824951171875,
0.0219268798828125,
-0.0012655258178710938,
-0.0882568359375,
0.00655364990234375,
0.003997802734375,
-0.035980224609375,
0.006175994873046875,
0.04010009765625,
0.08343505859375,
-0.02069091796875,
-0.032470703125,
0.0287628173828125,
-0.004608154296875,
-0.019195556640625,
-0.0274658203125,
-0.0015726089477539062,
-0.03424072265625,
0.0086669921875,
0.04180908203125,
0.02276611328125,
0.0040283203125,
-0.035400390625,
-0.004505157470703125,
0.0345458984375,
-0.04913330078125,
-0.01739501953125,
0.04766845703125,
-0.0099029541015625,
-0.047515869140625,
0.059295654296875,
-0.0002951622009277344,
-0.05877685546875,
0.05914306640625,
0.04901123046875,
0.0516357421875,
-0.007572174072265625,
0.004367828369140625,
0.0416259765625,
0.04730224609375,
-0.007434844970703125,
0.022308349609375,
-0.0033855438232421875,
-0.054931640625,
0.02911376953125,
-0.035491943359375,
-0.0167388916015625,
-0.000003159046173095703,
-0.044677734375,
0.04058837890625,
-0.041290283203125,
-0.0306396484375,
-0.00856781005859375,
0.01096343994140625,
-0.0556640625,
0.0273284912109375,
-0.007587432861328125,
0.05908203125,
-0.0328369140625,
0.0633544921875,
0.039794921875,
-0.0308074951171875,
-0.05364990234375,
-0.00969696044921875,
-0.0179290771484375,
-0.0750732421875,
0.049072265625,
0.0200042724609375,
-0.012298583984375,
0.0188446044921875,
-0.038848876953125,
-0.060089111328125,
0.09808349609375,
0.0160980224609375,
-0.051513671875,
-0.02490234375,
0.038116455078125,
0.05853271484375,
-0.030731201171875,
0.04315185546875,
0.0274200439453125,
0.0167083740234375,
0.026153564453125,
-0.054046630859375,
0.0034637451171875,
-0.0231781005859375,
0.022064208984375,
0.0121307373046875,
-0.043548583984375,
0.06494140625,
-0.038238525390625,
-0.0234375,
0.040618896484375,
0.043914794921875,
0.0121002197265625,
0.0218963623046875,
0.024444580078125,
0.046417236328125,
0.051116943359375,
-0.017120361328125,
0.0594482421875,
-0.0309600830078125,
0.03009033203125,
0.06304931640625,
0.00888824462890625,
0.05364990234375,
0.03778076171875,
0.00051116943359375,
0.048828125,
0.031890869140625,
-0.037933349609375,
0.04412841796875,
-0.0165557861328125,
0.0059051513671875,
0.01113128662109375,
0.00934600830078125,
-0.0166168212890625,
0.0216217041015625,
0.01166534423828125,
-0.05889892578125,
0.006378173828125,
0.0185546875,
-0.0219268798828125,
-0.022705078125,
-0.03656005859375,
0.05303955078125,
0.00002682209014892578,
-0.0418701171875,
0.047393798828125,
0.00705718994140625,
0.065673828125,
-0.052764892578125,
-0.00015151500701904297,
-0.003246307373046875,
0.046539306640625,
-0.009735107421875,
-0.05401611328125,
0.0073089599609375,
-0.0037860870361328125,
-0.021026611328125,
0.01448822021484375,
0.06243896484375,
-0.043731689453125,
-0.06683349609375,
0.0115509033203125,
-0.0069580078125,
0.010345458984375,
0.029449462890625,
-0.054443359375,
0.001354217529296875,
-0.0007538795471191406,
-0.00818634033203125,
-0.00860595703125,
0.035888671875,
-0.0024814605712890625,
0.038421630859375,
0.044525146484375,
0.009368896484375,
0.020294189453125,
-0.018798828125,
0.039642333984375,
-0.041473388671875,
-0.045806884765625,
-0.055328369140625,
0.039794921875,
0.00785064697265625,
-0.041473388671875,
0.041412353515625,
0.0430908203125,
0.043792724609375,
-0.0294647216796875,
0.026947021484375,
-0.01222991943359375,
0.0186309814453125,
-0.0294189453125,
0.072509765625,
-0.048797607421875,
-0.002918243408203125,
-0.033599853515625,
-0.054901123046875,
-0.04034423828125,
0.07147216796875,
-0.0221099853515625,
0.0211944580078125,
0.054931640625,
0.08447265625,
-0.0106964111328125,
-0.024749755859375,
0.00897979736328125,
0.01213836669921875,
0.004566192626953125,
0.049285888671875,
0.03851318359375,
-0.06524658203125,
0.06683349609375,
-0.0255126953125,
0.00032830238342285156,
-0.02056884765625,
-0.06341552734375,
-0.07806396484375,
-0.05181884765625,
-0.024627685546875,
-0.047027587890625,
-0.0094146728515625,
0.044281005859375,
0.05145263671875,
-0.07080078125,
-0.0150146484375,
-0.019927978515625,
0.004497528076171875,
-0.0157623291015625,
-0.0163421630859375,
0.045196533203125,
0.0229949951171875,
-0.054840087890625,
-0.0350341796875,
-0.00785064697265625,
0.0377197265625,
0.00969696044921875,
-0.0193634033203125,
-0.013580322265625,
-0.003559112548828125,
0.02593994140625,
0.042633056640625,
-0.040283203125,
-0.0029544830322265625,
0.01080322265625,
-0.02581787109375,
0.039886474609375,
0.051300048828125,
-0.0447998046875,
0.03387451171875,
0.03826904296875,
0.005725860595703125,
0.054931640625,
-0.0176544189453125,
0.0040130615234375,
-0.03143310546875,
0.025970458984375,
0.0169677734375,
0.039703369140625,
0.0296173095703125,
-0.03759765625,
0.033538818359375,
0.032684326171875,
-0.03826904296875,
-0.06732177734375,
-0.01557159423828125,
-0.09295654296875,
0.01168060302734375,
0.0556640625,
-0.004146575927734375,
-0.0302886962890625,
0.01425933837890625,
-0.029083251953125,
0.032928466796875,
-0.024017333984375,
0.046295166015625,
0.0267486572265625,
0.011016845703125,
-0.049102783203125,
0.00518035888671875,
0.0173492431640625,
-0.0095062255859375,
-0.045318603515625,
-0.014739990234375,
0.028167724609375,
0.0242767333984375,
0.050628662109375,
0.04315185546875,
-0.018096923828125,
0.021026611328125,
0.004428863525390625,
0.044830322265625,
-0.0193939208984375,
-0.0259552001953125,
-0.034088134765625,
0.0037136077880859375,
-0.0070953369140625,
-0.017547607421875
]
] |
deepset/gbert-large | 2023-05-05T07:00:08.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"fill-mask",
"de",
"dataset:wikipedia",
"dataset:OPUS",
"dataset:OpenLegalData",
"dataset:oscar",
"arxiv:2010.10906",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | deepset | null | null | deepset/gbert-large | 38 | 57,071 | transformers | 2022-03-02T23:29:05 | ---
language: de
license: mit
datasets:
- wikipedia
- OPUS
- OpenLegalData
- oscar
---
# German BERT large
Released in October 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train our model and show that it outperforms its predecessors.
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** BERT large
**Language:** German
## Performance
```
GermEval18 Coarse: 80.08
GermEval18 Fine: 52.48
GermEval14: 88.16
```
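A minimal usage sketch for trying the model on the fill-mask task it was pre-trained on. This assumes the `transformers` library is installed; the example sentence and the `top_k` value are illustrative, not taken from the card:

```python
from transformers import pipeline

def top_fill_mask(text: str, model_name: str = "deepset/gbert-large", k: int = 5):
    """Return the top-k (token, score) predictions for the [MASK] token in `text`."""
    fill_mask = pipeline("fill-mask", model=model_name, top_k=k)
    return [(p["token_str"], p["score"]) for p in fill_mask(text)]

if __name__ == "__main__":
    # Note: this downloads the full checkpoint on first use.
    for token, score in top_fill_mask("Die Hauptstadt von Deutschland ist [MASK]."):
        print(f"{token}\t{score:.3f}")
```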
See also:
- deepset/gbert-base
- deepset/gbert-large
- deepset/gelectra-base
- deepset/gelectra-large
- deepset/gelectra-base-generator
- deepset/gelectra-large-generator
## Authors
**Branden Chan:** branden.chan@deepset.ai
**Stefan Schweter:** stefan@schweter.eu
**Timo Möller:** timo.moeller@deepset.ai
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems that use question answering, summarization, ranking, and more.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs) | 2,874 | [
[
-0.045135498046875,
-0.046600341796875,
0.0313720703125,
0.005313873291015625,
-0.003665924072265625,
0.0021953582763671875,
-0.03228759765625,
-0.04705810546875,
0.02880859375,
0.0204315185546875,
-0.053070068359375,
-0.0546875,
-0.0247039794921875,
-0.008697509765625,
-0.03155517578125,
0.072021484375,
0.00981903076171875,
0.01116943359375,
0.0023632049560546875,
-0.00489044189453125,
-0.023193359375,
-0.045257568359375,
-0.051239013671875,
-0.029327392578125,
0.041046142578125,
0.0289764404296875,
0.0521240234375,
0.0188751220703125,
0.0418701171875,
0.0249176025390625,
-0.009979248046875,
-0.004772186279296875,
-0.03692626953125,
0.00809478759765625,
-0.0019292831420898438,
-0.00968170166015625,
-0.0294952392578125,
-0.01226806640625,
0.045257568359375,
0.05816650390625,
-0.0107574462890625,
0.0225982666015625,
-0.00730133056640625,
0.060455322265625,
-0.045013427734375,
0.0032711029052734375,
-0.044219970703125,
-0.00701904296875,
-0.0118560791015625,
0.03759765625,
-0.020416259765625,
-0.0216217041015625,
0.01343536376953125,
-0.031494140625,
0.0272979736328125,
-0.0227508544921875,
0.084716796875,
0.0016994476318359375,
-0.0147857666015625,
-0.01244354248046875,
-0.045257568359375,
0.04742431640625,
-0.0728759765625,
0.0247344970703125,
0.025238037109375,
0.03656005859375,
-0.005870819091796875,
-0.0797119140625,
-0.041656494140625,
-0.0189208984375,
-0.0080413818359375,
0.011199951171875,
-0.007556915283203125,
-0.019561767578125,
0.007568359375,
0.037078857421875,
-0.05804443359375,
0.0257720947265625,
-0.0340576171875,
0.0018644332885742188,
0.0626220703125,
-0.00426483154296875,
0.0018777847290039062,
0.0031585693359375,
-0.01374053955078125,
-0.0293731689453125,
-0.046051025390625,
-0.0060272216796875,
0.020233154296875,
0.0248870849609375,
-0.0166473388671875,
0.037445068359375,
-0.01042938232421875,
0.041656494140625,
0.016265869140625,
0.039825439453125,
0.046783447265625,
-0.035400390625,
-0.01715087890625,
-0.001285552978515625,
0.065185546875,
0.0178985595703125,
0.00611114501953125,
-0.008392333984375,
-0.0277862548828125,
-0.0142669677734375,
0.0228118896484375,
-0.0650634765625,
-0.0176544189453125,
0.025238037109375,
-0.0399169921875,
-0.01898193359375,
0.004573822021484375,
-0.047943115234375,
-0.0303802490234375,
-0.003124237060546875,
0.04742431640625,
-0.039031982421875,
-0.0251007080078125,
0.0196533203125,
-0.0173492431640625,
0.042327880859375,
0.0160369873046875,
-0.06787109375,
0.01410675048828125,
0.060638427734375,
0.04962158203125,
0.0177764892578125,
-0.025482177734375,
-0.0236358642578125,
-0.0023250579833984375,
-0.030242919921875,
0.038177490234375,
-0.0232086181640625,
-0.01148223876953125,
0.0160064697265625,
0.00696563720703125,
0.01227569580078125,
-0.0255126953125,
0.032684326171875,
-0.05157470703125,
0.038543701171875,
-0.018341064453125,
-0.04779052734375,
-0.013885498046875,
0.0133209228515625,
-0.06689453125,
0.0682373046875,
0.0094757080078125,
-0.038116455078125,
0.0237579345703125,
-0.0633544921875,
-0.039581298828125,
0.011932373046875,
-0.0005674362182617188,
-0.028594970703125,
-0.005634307861328125,
0.01462554931640625,
0.040191650390625,
-0.01145172119140625,
0.0136566162109375,
-0.0298004150390625,
-0.043487548828125,
0.0129241943359375,
0.0000635385513305664,
0.08917236328125,
0.0103912353515625,
-0.03253173828125,
0.0012054443359375,
-0.057464599609375,
0.01323699951171875,
0.024383544921875,
-0.0251007080078125,
-0.0004982948303222656,
-0.00865936279296875,
0.018096923828125,
0.00833892822265625,
0.054229736328125,
-0.030181884765625,
0.01511383056640625,
-0.0399169921875,
0.036590576171875,
0.057159423828125,
-0.0119476318359375,
0.0301513671875,
-0.0102386474609375,
0.02294921875,
-0.0244293212890625,
0.01065826416015625,
0.0020694732666015625,
-0.02099609375,
-0.06939697265625,
-0.0120391845703125,
0.045623779296875,
0.04150390625,
-0.0426025390625,
0.08123779296875,
-0.0196685791015625,
-0.054901123046875,
-0.03851318359375,
0.0084686279296875,
0.021697998046875,
0.0171966552734375,
0.0188751220703125,
-0.020050048828125,
-0.05792236328125,
-0.09027099609375,
0.007785797119140625,
-0.0086669921875,
-0.0206756591796875,
0.0220794677734375,
0.04840087890625,
-0.037506103515625,
0.046051025390625,
-0.042816162109375,
-0.022918701171875,
-0.0128326416015625,
0.0024623870849609375,
0.049713134765625,
0.046875,
0.055816650390625,
-0.057281494140625,
-0.037445068359375,
-0.006961822509765625,
-0.058349609375,
0.0267791748046875,
0.0008940696716308594,
-0.0245513916015625,
0.024200439453125,
0.030975341796875,
-0.057708740234375,
0.004909515380859375,
0.046112060546875,
-0.03277587890625,
0.0279693603515625,
-0.0089874267578125,
-0.01093292236328125,
-0.09979248046875,
0.03277587890625,
0.00447845458984375,
-0.00860595703125,
-0.027740478515625,
0.019317626953125,
-0.01617431640625,
-0.00858306884765625,
-0.0234527587890625,
0.03948974609375,
-0.0283966064453125,
-0.006378173828125,
0.015869140625,
-0.0067901611328125,
-0.007503509521484375,
0.03582763671875,
-0.01641845703125,
0.0699462890625,
0.0438232421875,
-0.035064697265625,
0.04345703125,
0.032928466796875,
-0.04742431640625,
0.0192413330078125,
-0.06488037109375,
0.0149078369140625,
0.006072998046875,
0.0238494873046875,
-0.060791015625,
-0.0310516357421875,
0.006816864013671875,
-0.04681396484375,
0.0262451171875,
-0.0004527568817138672,
-0.0655517578125,
-0.04705810546875,
-0.036407470703125,
0.0021495819091796875,
0.067138671875,
-0.042816162109375,
0.0204315185546875,
0.02935791015625,
-0.011474609375,
-0.044952392578125,
-0.06390380859375,
0.0257415771484375,
0.003932952880859375,
-0.058990478515625,
0.034149169921875,
-0.01027679443359375,
-0.00864410400390625,
0.01397705078125,
0.005588531494140625,
-0.033782958984375,
0.014007568359375,
0.0027256011962890625,
0.01861572265625,
-0.0307159423828125,
0.024810791015625,
-0.0208282470703125,
-0.0064239501953125,
0.0008292198181152344,
-0.0213470458984375,
0.05657958984375,
-0.05059814453125,
-0.01218414306640625,
-0.034698486328125,
0.031707763671875,
0.02734375,
-0.0173797607421875,
0.058929443359375,
0.0711669921875,
-0.035003662109375,
-0.0002391338348388672,
-0.045166015625,
-0.0237884521484375,
-0.039031982421875,
0.03118896484375,
-0.012603759765625,
-0.07696533203125,
0.03826904296875,
0.0147705078125,
0.018707275390625,
0.06451416015625,
0.046630859375,
-0.0362548828125,
0.071044921875,
0.058349609375,
-0.0106658935546875,
0.050567626953125,
-0.046783447265625,
0.0008840560913085938,
-0.0528564453125,
-0.0020694732666015625,
-0.038818359375,
-0.043212890625,
-0.055572509765625,
-0.01049041748046875,
0.01030731201171875,
0.0001672506332397461,
-0.049835205078125,
0.035919189453125,
-0.04864501953125,
0.0292510986328125,
0.07568359375,
0.00919342041015625,
-0.00922393798828125,
0.00406646728515625,
0.006450653076171875,
0.00893402099609375,
-0.045806884765625,
-0.0340576171875,
0.0853271484375,
0.01251983642578125,
0.038482666015625,
0.012542724609375,
0.07818603515625,
0.0299072265625,
-0.011474609375,
-0.046051025390625,
0.04412841796875,
-0.02093505859375,
-0.0732421875,
-0.0394287109375,
-0.0284423828125,
-0.0882568359375,
-0.002964019775390625,
-0.0155792236328125,
-0.05291748046875,
0.0182647705078125,
0.007320404052734375,
-0.019134521484375,
0.01495361328125,
-0.06524658203125,
0.0726318359375,
-0.0239410400390625,
-0.01351165771484375,
-0.0253753662109375,
-0.06524658203125,
0.014617919921875,
-0.00829315185546875,
-0.0013113021850585938,
-0.00394439697265625,
0.01019287109375,
0.053070068359375,
-0.041351318359375,
0.07666015625,
-0.012420654296875,
-0.013458251953125,
0.0171966552734375,
-0.006397247314453125,
0.040252685546875,
0.00347137451171875,
-0.0242919921875,
0.032318115234375,
-0.005954742431640625,
-0.036163330078125,
-0.0226287841796875,
0.056243896484375,
-0.058837890625,
-0.028411865234375,
-0.0290985107421875,
-0.0147247314453125,
-0.0079498291015625,
0.045928955078125,
0.017791748046875,
0.0181884765625,
-0.03240966796875,
0.048370361328125,
0.048004150390625,
-0.0168609619140625,
0.0322265625,
0.0283355712890625,
-0.007656097412109375,
-0.033355712890625,
0.0665283203125,
-0.002910614013671875,
-0.007122039794921875,
0.0301513671875,
0.005413055419921875,
-0.01274871826171875,
-0.03271484375,
-0.0255889892578125,
0.01861572265625,
-0.043670654296875,
-0.01061248779296875,
-0.0304718017578125,
-0.03875732421875,
-0.053070068359375,
-0.0186920166015625,
-0.03240966796875,
-0.033599853515625,
-0.0135955810546875,
-0.004909515380859375,
0.038330078125,
0.048828125,
-0.027008056640625,
0.0104217529296875,
-0.049224853515625,
0.015625,
0.037841796875,
0.039825439453125,
-0.006862640380859375,
-0.0146636962890625,
-0.030975341796875,
0.026031494140625,
-0.0007319450378417969,
-0.033843994140625,
0.007411956787109375,
0.01073455810546875,
0.041595458984375,
0.0108642578125,
-0.0013942718505859375,
0.027740478515625,
-0.04852294921875,
0.06475830078125,
0.0219879150390625,
-0.061767578125,
0.0426025390625,
-0.033203125,
0.03253173828125,
0.07000732421875,
0.0158843994140625,
-0.05364990234375,
-0.01363372802734375,
-0.057525634765625,
-0.08184814453125,
0.041229248046875,
0.021148681640625,
0.0286712646484375,
0.002414703369140625,
0.006649017333984375,
0.005161285400390625,
0.027130126953125,
-0.044189453125,
-0.0250244140625,
-0.007061004638671875,
-0.0126495361328125,
-0.011474609375,
-0.0311279296875,
-0.0183563232421875,
-0.0238037109375,
0.06982421875,
0.004329681396484375,
0.030303955078125,
0.0006537437438964844,
-0.007122039794921875,
0.00952911376953125,
0.0069580078125,
0.040740966796875,
0.06884765625,
-0.03704833984375,
-0.007724761962890625,
0.01180267333984375,
-0.029388427734375,
-0.01531219482421875,
0.033660888671875,
-0.030303955078125,
0.0198822021484375,
0.041290283203125,
0.056304931640625,
0.01172637939453125,
-0.04815673828125,
0.046844482421875,
0.0007467269897460938,
-0.036407470703125,
-0.053955078125,
0.00583648681640625,
0.021697998046875,
0.0309906005859375,
0.0364990234375,
-0.01471710205078125,
0.0182952880859375,
-0.028076171875,
0.01513671875,
0.03948974609375,
-0.025299072265625,
-0.00670623779296875,
0.0267486572265625,
0.032318115234375,
-0.0280609130859375,
0.055389404296875,
-0.0171661376953125,
-0.046295166015625,
0.059356689453125,
0.0225067138671875,
0.07403564453125,
0.01374053955078125,
0.019989013671875,
0.03436279296875,
0.0304107666015625,
0.0075225830078125,
0.0210418701171875,
0.00293731689453125,
-0.04449462890625,
-0.031463623046875,
-0.036224365234375,
-0.0245513916015625,
0.0307769775390625,
-0.04736328125,
0.0115814208984375,
-0.041229248046875,
-0.01312255859375,
0.01229095458984375,
0.0218048095703125,
-0.0654296875,
0.01617431640625,
0.01490020751953125,
0.0751953125,
-0.033660888671875,
0.04412841796875,
0.06585693359375,
-0.0423583984375,
-0.03717041015625,
-0.004367828369140625,
-0.01629638671875,
-0.07373046875,
0.036773681640625,
0.01509857177734375,
-0.005992889404296875,
0.003650665283203125,
-0.07415771484375,
-0.0716552734375,
0.080078125,
0.0172882080078125,
-0.043243408203125,
-0.0115814208984375,
-0.0216522216796875,
0.044189453125,
-0.0054931640625,
-0.01160430908203125,
0.0223846435546875,
0.045196533203125,
0.01235198974609375,
-0.0662841796875,
0.0095062255859375,
-0.03125,
0.002346038818359375,
0.012847900390625,
-0.05291748046875,
0.053070068359375,
-0.00682830810546875,
-0.0148773193359375,
0.0229034423828125,
0.040924072265625,
0.0034313201904296875,
0.0028553009033203125,
0.034515380859375,
0.049774169921875,
0.06494140625,
-0.0011510848999023438,
0.073974609375,
-0.01544952392578125,
0.037841796875,
0.09619140625,
-0.0235748291015625,
0.0704345703125,
0.0236358642578125,
-0.0212249755859375,
0.04986572265625,
0.049224853515625,
-0.042236328125,
0.039642333984375,
0.01363372802734375,
-0.005313873291015625,
-0.0321044921875,
0.006908416748046875,
-0.07110595703125,
0.0308685302734375,
0.01346588134765625,
-0.035003662109375,
-0.017608642578125,
-0.0272674560546875,
-0.005107879638671875,
-0.00958251953125,
-0.0007529258728027344,
0.05987548828125,
-0.003589630126953125,
-0.02813720703125,
0.05792236328125,
-0.0019102096557617188,
0.0504150390625,
-0.0543212890625,
0.00160980224609375,
-0.0139617919921875,
0.026275634765625,
-0.00015425682067871094,
-0.07330322265625,
0.000028252601623535156,
-0.00601959228515625,
-0.0249176025390625,
-0.01535797119140625,
0.043853759765625,
-0.0158538818359375,
-0.061309814453125,
0.015960693359375,
0.03460693359375,
0.016815185546875,
0.00662994384765625,
-0.059600830078125,
-0.0011577606201171875,
-0.005641937255859375,
-0.0210418701171875,
0.0130157470703125,
0.03289794921875,
0.01617431640625,
0.038909912109375,
0.057342529296875,
0.00015425682067871094,
-0.010467529296875,
0.01171112060546875,
0.06671142578125,
-0.05780029296875,
-0.02130126953125,
-0.056396484375,
0.034881591796875,
-0.036102294921875,
-0.028228759765625,
0.0458984375,
0.056304931640625,
0.07452392578125,
-0.015350341796875,
0.062744140625,
-0.02093505859375,
0.040618896484375,
-0.0279541015625,
0.075439453125,
-0.059967041015625,
-0.0117645263671875,
-0.009979248046875,
-0.06689453125,
-0.018096923828125,
0.06402587890625,
-0.0024509429931640625,
0.023773193359375,
0.041290283203125,
0.042694091796875,
0.00380706787109375,
-0.0252838134765625,
0.00450897216796875,
0.03118896484375,
0.0198211669921875,
0.0562744140625,
0.05047607421875,
-0.036651611328125,
0.049896240234375,
-0.0233154296875,
-0.0012578964233398438,
-0.0430908203125,
-0.059295654296875,
-0.05780029296875,
-0.04656982421875,
-0.0175018310546875,
-0.040740966796875,
0.0134124755859375,
0.05828857421875,
0.062255859375,
-0.06878662109375,
-0.0194549560546875,
-0.006198883056640625,
0.010650634765625,
-0.018707275390625,
-0.017486572265625,
0.0300140380859375,
-0.0163421630859375,
-0.045867919921875,
0.026458740234375,
-0.0033969879150390625,
0.003467559814453125,
-0.0211639404296875,
0.002285003662109375,
-0.0401611328125,
-0.0170440673828125,
0.04266357421875,
0.0311279296875,
-0.04266357421875,
-0.01442718505859375,
-0.002391815185546875,
-0.029541015625,
-0.005123138427734375,
0.031158447265625,
-0.049530029296875,
0.01325225830078125,
0.035736083984375,
0.05584716796875,
0.05340576171875,
-0.02130126953125,
0.0540771484375,
-0.057891845703125,
0.01004791259765625,
0.0287017822265625,
0.0294952392578125,
0.0220794677734375,
-0.0287017822265625,
0.06390380859375,
-0.0021038055419921875,
-0.036376953125,
-0.058349609375,
0.00572967529296875,
-0.07147216796875,
-0.03314208984375,
0.0869140625,
0.0118408203125,
-0.007465362548828125,
0.0263214111328125,
-0.00936126708984375,
0.01244354248046875,
-0.037994384765625,
0.0560302734375,
0.05914306640625,
0.017242431640625,
0.01416778564453125,
-0.033843994140625,
0.0185699462890625,
0.0206756591796875,
-0.052154541015625,
-0.00557708740234375,
0.0535888671875,
0.00907135009765625,
0.021697998046875,
0.0389404296875,
0.01428985595703125,
0.0355224609375,
-0.0095977783203125,
0.03448486328125,
-0.0028133392333984375,
-0.00937652587890625,
-0.028228759765625,
-0.002864837646484375,
-0.001987457275390625,
-0.0297393798828125
]
] |
prajjwal1/bert-mini | 2021-10-27T18:27:38.000Z | [
"transformers",
"pytorch",
"BERT",
"MNLI",
"NLI",
"transformer",
"pre-training",
"en",
"arxiv:1908.08962",
"arxiv:2110.01518",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | prajjwal1 | null | null | prajjwal1/bert-mini | 13 | 56,748 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
license:
- mit
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
---
The following model is a PyTorch pre-trained model obtained by converting the TensorFlow checkpoint found in the [official Google BERT repository](https://github.com/google-research/bert).
This is one of the smaller pre-trained BERT variants, together with [bert-small](https://huggingface.co/prajjwal1/bert-small) and [bert-medium](https://huggingface.co/prajjwal1/bert-medium). They were introduced in the study `Well-Read Students Learn Better: On the Importance of Pre-training Compact Models` ([arXiv](https://arxiv.org/abs/1908.08962)), and ported to HF for the study `Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics` ([arXiv](https://arxiv.org/abs/2110.01518)). These models are intended to be fine-tuned on a downstream task.
If you use the model, please consider citing both papers:
```
@misc{bhargava2021generalization,
title={Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics},
author={Prajjwal Bhargava and Aleksandr Drozd and Anna Rogers},
year={2021},
eprint={2110.01518},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@article{DBLP:journals/corr/abs-1908-08962,
author = {Iulia Turc and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {Well-Read Students Learn Better: The Impact of Student Initialization
on Knowledge Distillation},
journal = {CoRR},
volume = {abs/1908.08962},
year = {2019},
url = {http://arxiv.org/abs/1908.08962},
eprinttype = {arXiv},
eprint = {1908.08962},
timestamp = {Thu, 29 Aug 2019 16:32:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1908-08962.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
Config of this model:
`prajjwal1/bert-mini` (L=4, H=256) [Model Link](https://huggingface.co/prajjwal1/bert-mini)
Other models to check out:
- `prajjwal1/bert-tiny` (L=2, H=128) [Model Link](https://huggingface.co/prajjwal1/bert-tiny)
- `prajjwal1/bert-small` (L=4, H=512) [Model Link](https://huggingface.co/prajjwal1/bert-small)
- `prajjwal1/bert-medium` (L=8, H=512) [Model Link](https://huggingface.co/prajjwal1/bert-medium)
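The (L, H) notation above maps directly onto the `num_hidden_layers` and `hidden_size` fields of a standard `BertConfig`, which can be inspected offline. In this sketch the head count and feed-forward size follow the usual BERT ratios (H/64 heads, 4×H intermediate) and are assumptions, not values read from the published checkpoint; to get the real configuration, use `BertConfig.from_pretrained("prajjwal1/bert-mini")`:

```python
from transformers import BertConfig

# Reproduce the bert-mini geometry (L=4, H=256) with a plain BertConfig.
# num_attention_heads and intermediate_size below are assumed from the
# standard BERT ratios, not read from the published checkpoint.
config = BertConfig(
    num_hidden_layers=4,    # L
    hidden_size=256,        # H
    num_attention_heads=4,  # assumed: H / 64
    intermediate_size=1024, # assumed: 4 * H
)
print(config.num_hidden_layers, config.hidden_size)
```

The same pattern applies to the other variants in the list: swap in (2, 128) for bert-tiny, (4, 512) for bert-small, or (8, 512) for bert-medium.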
Original Implementation and more info can be found in [this Github repository](https://github.com/prajjwal1/generalize_lm_nli).
Twitter: [@prajjwal_1](https://twitter.com/prajjwal_1)
| 2,501 | [
[
-0.03106689453125,
-0.04205322265625,
0.0345458984375,
-0.000024139881134033203,
-0.012847900390625,
-0.021392822265625,
-0.0230560302734375,
-0.03289794921875,
0.00921630859375,
0.01354217529296875,
-0.054962158203125,
-0.0260009765625,
-0.0384521484375,
-0.01061248779296875,
-0.0272979736328125,
0.0933837890625,
0.00576019287109375,
0.00463104248046875,
-0.01232147216796875,
-0.0185394287109375,
-0.01129913330078125,
-0.042205810546875,
-0.041748046875,
-0.03765869140625,
0.055816650390625,
-0.0033416748046875,
0.03668212890625,
0.01439666748046875,
0.047454833984375,
0.0205078125,
-0.0298309326171875,
-0.006923675537109375,
-0.037384033203125,
-0.0184326171875,
0.003814697265625,
-0.032470703125,
-0.04364013671875,
0.007503509521484375,
0.055908203125,
0.0694580078125,
-0.00969696044921875,
0.02618408203125,
0.021484375,
0.04559326171875,
-0.043304443359375,
-0.0034618377685546875,
-0.023406982421875,
-0.01580810546875,
-0.01161956787109375,
0.02020263671875,
-0.041534423828125,
-0.0247344970703125,
0.03912353515625,
-0.04010009765625,
0.0406494140625,
-0.0012331008911132812,
0.1038818359375,
0.0101470947265625,
-0.0164031982421875,
-0.00949859619140625,
-0.048858642578125,
0.07342529296875,
-0.072265625,
0.038482666015625,
0.002841949462890625,
0.0200347900390625,
-0.0015106201171875,
-0.0731201171875,
-0.04248046875,
-0.00431060791015625,
-0.032958984375,
0.0116729736328125,
-0.0284881591796875,
0.01202392578125,
0.02911376953125,
0.0283203125,
-0.044464111328125,
0.00437164306640625,
-0.0423583984375,
-0.0201416015625,
0.0295562744140625,
-0.002422332763671875,
-0.00304412841796875,
-0.0294036865234375,
-0.0264129638671875,
-0.032318115234375,
-0.04302978515625,
0.0180816650390625,
0.039703369140625,
0.026519775390625,
-0.029815673828125,
0.0305328369140625,
0.00033402442932128906,
0.06365966796875,
0.00909423828125,
-0.0010919570922851562,
0.032989501953125,
-0.047760009765625,
-0.01068878173828125,
-0.015869140625,
0.058807373046875,
0.006977081298828125,
0.00893402099609375,
-0.0056304931640625,
-0.007293701171875,
-0.031341552734375,
0.01324462890625,
-0.07666015625,
-0.0301055908203125,
0.012939453125,
-0.053680419921875,
-0.0022335052490234375,
0.0137786865234375,
-0.046478271484375,
-0.00414276123046875,
-0.0252532958984375,
0.038970947265625,
-0.037933349609375,
-0.021148681640625,
-0.01044464111328125,
-0.0019054412841796875,
0.032012939453125,
0.0294952392578125,
-0.0482177734375,
0.005092620849609375,
0.03515625,
0.06878662109375,
0.006618499755859375,
-0.0180816650390625,
0.0011844635009765625,
0.0049896240234375,
-0.014373779296875,
0.0310516357421875,
-0.01654052734375,
-0.0128936767578125,
-0.003376007080078125,
-0.002166748046875,
-0.01029205322265625,
-0.0257568359375,
0.05126953125,
-0.039337158203125,
0.0266265869140625,
-0.0263671875,
-0.040985107421875,
-0.0201568603515625,
0.014739990234375,
-0.046722412109375,
0.07501220703125,
0.0015430450439453125,
-0.0716552734375,
0.038665771484375,
-0.0479736328125,
-0.0163421630859375,
-0.01416015625,
0.01085662841796875,
-0.052825927734375,
-0.001712799072265625,
0.015106201171875,
0.039093017578125,
-0.01412200927734375,
0.0264129638671875,
-0.03466796875,
-0.026763916015625,
-0.006870269775390625,
-0.006717681884765625,
0.089111328125,
0.0213775634765625,
-0.00554656982421875,
0.015960693359375,
-0.06231689453125,
0.007122039794921875,
0.01247406005859375,
-0.02978515625,
-0.036041259765625,
-0.00830841064453125,
0.0007061958312988281,
0.00044155120849609375,
0.0276641845703125,
-0.0303955078125,
0.0253448486328125,
-0.02667236328125,
0.0318603515625,
0.046905517578125,
0.0049896240234375,
0.03509521484375,
-0.038970947265625,
0.0061492919921875,
0.0112457275390625,
0.0237579345703125,
0.00201416015625,
-0.0384521484375,
-0.07562255859375,
-0.038421630859375,
0.041748046875,
0.021087646484375,
-0.042022705078125,
0.0457763671875,
-0.0211944580078125,
-0.0517578125,
-0.045135498046875,
0.015106201171875,
0.0226898193359375,
0.036895751953125,
0.03363037109375,
-0.01238250732421875,
-0.054962158203125,
-0.06658935546875,
-0.0163421630859375,
-0.0250396728515625,
-0.0157318115234375,
0.025482177734375,
0.0501708984375,
-0.0404052734375,
0.0802001953125,
-0.02862548828125,
-0.021240234375,
-0.0250396728515625,
0.0282745361328125,
0.053802490234375,
0.0634765625,
0.06097412109375,
-0.040313720703125,
-0.0301971435546875,
-0.029815673828125,
-0.042388916015625,
0.00778961181640625,
-0.01465606689453125,
-0.022003173828125,
0.0126190185546875,
0.03082275390625,
-0.044219970703125,
0.030517578125,
0.0235595703125,
-0.0269012451171875,
0.03515625,
-0.0178680419921875,
-0.006771087646484375,
-0.0859375,
0.0270233154296875,
0.0036754608154296875,
-0.004726409912109375,
-0.04156494140625,
0.01108551025390625,
0.0006041526794433594,
0.00901031494140625,
-0.01380157470703125,
0.0491943359375,
-0.041534423828125,
0.0030918121337890625,
0.008453369140625,
-0.0127716064453125,
-0.0041351318359375,
0.036773681640625,
-0.0005669593811035156,
0.03936767578125,
0.02325439453125,
-0.034393310546875,
-0.004199981689453125,
0.03277587890625,
-0.03533935546875,
0.01209259033203125,
-0.0838623046875,
0.011962890625,
-0.00457000732421875,
0.032318115234375,
-0.0716552734375,
-0.0181884765625,
0.0205230712890625,
-0.030792236328125,
0.02813720703125,
-0.026092529296875,
-0.05389404296875,
-0.0333251953125,
-0.0200347900390625,
0.026580810546875,
0.05615234375,
-0.04833984375,
0.05096435546875,
-0.007343292236328125,
-0.0018167495727539062,
-0.0374755859375,
-0.052490234375,
-0.0313720703125,
-0.0012598037719726562,
-0.0517578125,
0.0268096923828125,
-0.019683837890625,
-0.00362396240234375,
0.01161956787109375,
-0.001346588134765625,
-0.0201873779296875,
-0.00347900390625,
0.01139068603515625,
0.043701171875,
-0.022308349609375,
0.01007080078125,
0.005603790283203125,
0.0175628662109375,
-0.0026416778564453125,
-0.0059051513671875,
0.044921875,
-0.02203369140625,
-0.013641357421875,
-0.044281005859375,
0.008392333984375,
0.0290679931640625,
-0.0034122467041015625,
0.08154296875,
0.0704345703125,
-0.0279541015625,
0.00301361083984375,
-0.049285888671875,
-0.042999267578125,
-0.034881591796875,
0.0152130126953125,
-0.01873779296875,
-0.056976318359375,
0.049285888671875,
0.003025054931640625,
0.0183258056640625,
0.057952880859375,
0.0369873046875,
-0.023345947265625,
0.055206298828125,
0.058197021484375,
-0.0012531280517578125,
0.06121826171875,
-0.052337646484375,
0.0193939208984375,
-0.06988525390625,
-0.0148162841796875,
-0.045379638671875,
-0.03131103515625,
-0.0458984375,
-0.013916015625,
0.0206756591796875,
0.0268707275390625,
-0.0382080078125,
0.029296875,
-0.0438232421875,
0.0128936767578125,
0.0654296875,
0.0226593017578125,
0.00390625,
-0.0003705024719238281,
-0.030242919921875,
-0.0023937225341796875,
-0.07196044921875,
-0.0258636474609375,
0.101318359375,
0.03143310546875,
0.045684814453125,
0.0232696533203125,
0.0787353515625,
0.0023708343505859375,
0.024017333984375,
-0.045928955078125,
0.03338623046875,
-0.00417327880859375,
-0.08123779296875,
-0.0193939208984375,
-0.04669189453125,
-0.0771484375,
0.00543212890625,
-0.0291900634765625,
-0.053436279296875,
0.038787841796875,
0.00579071044921875,
-0.04571533203125,
0.01468658447265625,
-0.0709228515625,
0.055999755859375,
0.0029697418212890625,
-0.035125732421875,
-0.0113983154296875,
-0.0523681640625,
0.0263824462890625,
0.0017528533935546875,
0.0036373138427734375,
0.01113128662109375,
0.018798828125,
0.08087158203125,
-0.045867919921875,
0.06854248046875,
-0.0311126708984375,
0.0185394287109375,
0.038848876953125,
-0.01474761962890625,
0.046112060546875,
0.007080078125,
-0.0034027099609375,
0.030517578125,
0.01143646240234375,
-0.045501708984375,
-0.0192718505859375,
0.041046142578125,
-0.0888671875,
-0.03564453125,
-0.04779052734375,
-0.048553466796875,
-0.006900787353515625,
0.032623291015625,
0.0295867919921875,
0.0264129638671875,
0.005260467529296875,
0.037994384765625,
0.05645751953125,
-0.01035308837890625,
0.0430908203125,
0.03411865234375,
-0.0085906982421875,
-0.01007080078125,
0.045562744140625,
0.00887298583984375,
0.01739501953125,
0.0109710693359375,
0.0132293701171875,
-0.0207977294921875,
-0.058441162109375,
-0.005298614501953125,
0.042999267578125,
-0.05157470703125,
-0.00036215782165527344,
-0.047027587890625,
-0.036041259765625,
-0.04290771484375,
-0.0191192626953125,
-0.0242462158203125,
-0.01548004150390625,
-0.037109375,
0.0037593841552734375,
0.0229644775390625,
0.039520263671875,
-0.0189971923828125,
0.033966064453125,
-0.04888916015625,
0.003448486328125,
0.033660888671875,
0.01425933837890625,
0.010589599609375,
-0.05548095703125,
-0.01331329345703125,
0.0020732879638671875,
-0.0153656005859375,
-0.039520263671875,
0.02154541015625,
0.0202789306640625,
0.06097412109375,
0.030975341796875,
0.01062774658203125,
0.050079345703125,
-0.0223388671875,
0.05078125,
0.033782958984375,
-0.043121337890625,
0.03814697265625,
-0.030242919921875,
0.021026611328125,
0.054656982421875,
0.038665771484375,
-0.004566192626953125,
-0.00514984130859375,
-0.06085205078125,
-0.080810546875,
0.052642822265625,
0.01372528076171875,
0.0105743408203125,
0.027191162109375,
0.03179931640625,
0.007442474365234375,
0.013031005859375,
-0.06463623046875,
-0.0254364013671875,
-0.01380157470703125,
-0.0216522216796875,
-0.0126190185546875,
-0.038970947265625,
-0.022979736328125,
-0.050018310546875,
0.05926513671875,
0.00032639503479003906,
0.047393798828125,
0.0248260498046875,
-0.0174102783203125,
0.01519775390625,
0.0055694580078125,
0.0369873046875,
0.05047607421875,
-0.052154541015625,
-0.0138702392578125,
-0.0023956298828125,
-0.039794921875,
-0.0164794921875,
0.025787353515625,
-0.023406982421875,
0.01250457763671875,
0.046478271484375,
0.06292724609375,
0.0183563232421875,
-0.017822265625,
0.039093017578125,
0.0037689208984375,
-0.0225677490234375,
-0.0297393798828125,
0.000957489013671875,
0.0002332925796508789,
0.031005859375,
0.0299835205078125,
0.0190582275390625,
0.008636474609375,
-0.035980224609375,
0.006938934326171875,
0.0186004638671875,
-0.01959228515625,
-0.0218048095703125,
0.05010986328125,
0.0208892822265625,
0.00400543212890625,
0.0594482421875,
-0.0225982666015625,
-0.030548095703125,
0.02764892578125,
0.0192718505859375,
0.055389404296875,
0.01727294921875,
0.005069732666015625,
0.0672607421875,
0.0255889892578125,
-0.00907135009765625,
0.00604248046875,
-0.01212310791015625,
-0.0521240234375,
-0.0210723876953125,
-0.06585693359375,
-0.018585205078125,
0.00913238525390625,
-0.054901123046875,
0.021240234375,
-0.041473388671875,
-0.0266265869140625,
0.01238250732421875,
0.0173187255859375,
-0.0665283203125,
0.004627227783203125,
0.0012464523315429688,
0.0614013671875,
-0.05047607421875,
0.07366943359375,
0.058563232421875,
-0.043304443359375,
-0.065185546875,
0.003139495849609375,
-0.0117950439453125,
-0.046478271484375,
0.05340576171875,
-0.01092529296875,
0.0205230712890625,
0.0100555419921875,
-0.03948974609375,
-0.06732177734375,
0.09735107421875,
0.0169525146484375,
-0.06146240234375,
-0.027801513671875,
-0.01258087158203125,
0.039306640625,
-0.005016326904296875,
0.0309295654296875,
0.02545166015625,
0.0283660888671875,
0.0283966064453125,
-0.058349609375,
-0.0015249252319335938,
-0.015960693359375,
0.0009512901306152344,
0.006099700927734375,
-0.057403564453125,
0.09356689453125,
-0.0263671875,
0.0035400390625,
0.0204925537109375,
0.046356201171875,
0.031005859375,
0.01265716552734375,
0.0355224609375,
0.0555419921875,
0.057586669921875,
-0.025482177734375,
0.080810546875,
-0.01476287841796875,
0.057769775390625,
0.0799560546875,
0.0195465087890625,
0.057403564453125,
0.05194091796875,
-0.028533935546875,
0.047271728515625,
0.060150146484375,
-0.0165863037109375,
0.051055908203125,
0.0073394775390625,
0.0090484619140625,
-0.022674560546875,
0.019073486328125,
-0.046112060546875,
0.007534027099609375,
0.00814056396484375,
-0.033447265625,
-0.0171051025390625,
-0.0168304443359375,
0.009246826171875,
-0.0278778076171875,
-0.021636962890625,
0.044342041015625,
0.002269744873046875,
-0.0306243896484375,
0.05718994140625,
-0.017913818359375,
0.06988525390625,
-0.058990478515625,
0.01342010498046875,
-0.0096435546875,
0.0293121337890625,
-0.00789642333984375,
-0.033660888671875,
0.019012451171875,
-0.0012798309326171875,
-0.03204345703125,
-0.01465606689453125,
0.058013916015625,
-0.01309967041015625,
-0.051513671875,
0.0211944580078125,
0.03619384765625,
0.0098876953125,
0.0161590576171875,
-0.06463623046875,
0.004001617431640625,
0.0003960132598876953,
-0.03875732421875,
0.023345947265625,
0.01214599609375,
0.0128326416015625,
0.034088134765625,
0.05706787109375,
-0.00481414794921875,
0.0260009765625,
-0.0023059844970703125,
0.06024169921875,
-0.027740478515625,
-0.028533935546875,
-0.041351318359375,
0.0506591796875,
-0.0164337158203125,
-0.045867919921875,
0.04718017578125,
0.0322265625,
0.07696533203125,
-0.008392333984375,
0.04608154296875,
-0.0253143310546875,
0.046539306640625,
-0.0291900634765625,
0.07672119140625,
-0.059173583984375,
0.00986480712890625,
-0.0231781005859375,
-0.068603515625,
-0.0120086669921875,
0.0579833984375,
-0.040374755859375,
0.031646728515625,
0.04443359375,
0.035369873046875,
0.00017774105072021484,
-0.0200653076171875,
0.00562286376953125,
0.0311431884765625,
0.02117919921875,
0.031585693359375,
0.042755126953125,
-0.0406494140625,
0.039398193359375,
-0.03021240234375,
-0.009552001953125,
-0.03863525390625,
-0.049530029296875,
-0.08502197265625,
-0.05078125,
-0.028564453125,
-0.030426025390625,
0.003002166748046875,
0.0595703125,
0.0716552734375,
-0.07550048828125,
-0.0061492919921875,
-0.012786865234375,
-0.0016307830810546875,
-0.00975799560546875,
-0.01561737060546875,
0.0316162109375,
-0.0188140869140625,
-0.052001953125,
-0.0037670135498046875,
-0.0311431884765625,
0.02032470703125,
-0.00849151611328125,
-0.017303466796875,
-0.039337158203125,
0.00522613525390625,
0.0264739990234375,
0.0199737548828125,
-0.048675537109375,
-0.028594970703125,
-0.0034885406494140625,
-0.0120391845703125,
-0.01064300537109375,
0.039276123046875,
-0.04498291015625,
0.0233154296875,
0.03997802734375,
0.035125732421875,
0.054779052734375,
-0.023681640625,
0.01557159423828125,
-0.059539794921875,
0.031646728515625,
0.02142333984375,
0.036529541015625,
0.0124359130859375,
-0.006771087646484375,
0.046600341796875,
0.0273590087890625,
-0.040435791015625,
-0.08209228515625,
-0.0039043426513671875,
-0.08526611328125,
-0.01312255859375,
0.07977294921875,
-0.03125,
-0.01406097412109375,
0.022857666015625,
-0.00377655029296875,
0.0280303955078125,
-0.028167724609375,
0.052490234375,
0.06280517578125,
-0.0003407001495361328,
-0.01303863525390625,
-0.04010009765625,
0.0289306640625,
0.0279541015625,
-0.04345703125,
-0.0262603759765625,
0.016021728515625,
0.0274505615234375,
0.0290985107421875,
0.0243377685546875,
0.00696563720703125,
0.0160064697265625,
-0.004276275634765625,
0.0212249755859375,
-0.00811767578125,
-0.0193328857421875,
-0.0068511962890625,
-0.004619598388671875,
-0.00299072265625,
-0.01062774658203125
]
] |
EleutherAI/gpt-neo-1.3B | 2023-07-09T15:52:34.000Z | [
"transformers",
"pytorch",
"jax",
"rust",
"safetensors",
"gpt_neo",
"text-generation",
"text generation",
"causal-lm",
"en",
"dataset:EleutherAI/pile",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/gpt-neo-1.3B | 208 | 56,565 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
tags:
- text generation
- pytorch
- causal-lm
license: mit
datasets:
- EleutherAI/pile
---
# GPT-Neo 1.3B
## Model Description
GPT-Neo 1.3B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 1.3B represents the number of parameters of this particular pre-trained model.
## Training data
GPT-Neo 1.3B was trained on the Pile, a large-scale curated dataset created by EleutherAI for the purpose of training this model.
## Training procedure
This model was trained on the Pile for 380 billion tokens over 362,000 steps. It was trained as an autoregressive language model, using cross-entropy loss.
## Intended Use and Limitations
Through this pretraining objective, the model learns an inner representation of the English language that can be used to extract features useful for downstream tasks. The model is, however, best at what it was pretrained for: generating text from a prompt.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='EleutherAI/gpt-neo-1.3B')
>>> generator("EleutherAI has", do_sample=True, min_length=50)
[{'generated_text': 'EleutherAI has made a commitment to create new software packages for each of its major clients and has'}]
```
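Because sampling is stochastic, repeated runs yield different continuations. If reproducible output is needed, the seed can be fixed before generation; the sketch below assumes the `set_seed` helper from `transformers` (the prompt and sampling parameters mirror the example above):

```python
from transformers import pipeline, set_seed

# Fix all relevant RNGs so repeated runs produce the same continuation.
set_seed(42)

generator = pipeline('text-generation', model='EleutherAI/gpt-neo-1.3B')
outputs = generator("EleutherAI has", do_sample=True, min_length=20, max_length=50)
print(outputs[0]['generated_text'])
```

Re-running the script with the same seed reproduces the same sampled text.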
### Limitations and Biases
GPT-Neo was trained as an autoregressive language model. This means that its core functionality is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are many unknowns in this line of work.
GPT-Neo was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending on your use case, GPT-Neo may produce socially unacceptable text. See Sections 5 and 6 of the Pile paper for a more detailed analysis of the biases in the Pile.
As with all language models, it is hard to predict in advance how GPT-Neo will respond to particular prompts and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
## Eval results
### Linguistic Reasoning
| Model and Size | Pile BPB | Pile PPL | Wikitext PPL | Lambada PPL | Lambada Acc | Winogrande | Hellaswag |
| ---------------- | ---------- | ---------- | ------------- | ----------- | ----------- | ---------- | ----------- |
| **GPT-Neo 1.3B** | **0.7527** | **6.159** | **13.10** | **7.498** | **57.23%** | **55.01%** | **38.66%** |
| GPT-2 1.5B | 1.0468 | ----- | 17.48 | 10.634 | 51.21% | 59.40% | 40.03% |
| GPT-Neo 2.7B | 0.7165 | 5.646 | 11.39 | 5.626 | 62.22% | 56.50% | 42.73% |
| GPT-3 Ada | 0.9631 | ----- | ----- | 9.954 | 51.60% | 52.90% | 35.93% |
### Physical and Scientific Reasoning
| Model and Size | MathQA | PubMedQA | Piqa |
| ---------------- | ---------- | ---------- | ----------- |
| **GPT-Neo 1.3B** | **24.05%** | **54.40%** | **71.11%** |
| GPT-2 1.5B | 23.64% | 58.33% | 70.78% |
| GPT-Neo 2.7B | 24.72% | 57.54% | 72.14% |
| GPT-3 Ada | 24.29% | 52.80% | 68.88% |
### Down-Stream Applications
TBD
### BibTeX entry and citation info
To cite this model, please use
```bibtex
@software{gpt-neo,
author = {Black, Sid and
Leo, Gao and
Wang, Phil and
Leahy, Connor and
Biderman, Stella},
title = {{GPT-Neo: Large Scale Autoregressive Language
Modeling with Mesh-Tensorflow}},
month = mar,
year = 2021,
note = {{If you use this software, please cite it using
these metadata.}},
publisher = {Zenodo},
version = {1.0},
doi = {10.5281/zenodo.5297715},
url = {https://doi.org/10.5281/zenodo.5297715}
}
@article{gao2020pile,
title={The Pile: An 800GB Dataset of Diverse Text for Language Modeling},
author={Gao, Leo and Biderman, Stella and Black, Sid and Golding, Laurence and Hoppe, Travis and Foster, Charles and Phang, Jason and He, Horace and Thite, Anish and Nabeshima, Noa and others},
journal={arXiv preprint arXiv:2101.00027},
year={2020}
}
``` | 4,586 | [
[
-0.040985107421875,
-0.06329345703125,
0.028289794921875,
-0.00038933753967285156,
-0.0153656005859375,
-0.0106964111328125,
0.0005078315734863281,
-0.029815673828125,
0.0211944580078125,
0.0268402099609375,
-0.0186309814453125,
-0.031707763671875,
-0.05462646484375,
0.006351470947265625,
-0.04718017578125,
0.0943603515625,
0.0174407958984375,
-0.0341796875,
0.0166015625,
0.0121002197265625,
-0.00943756103515625,
-0.03912353515625,
-0.048187255859375,
-0.0167083740234375,
0.028839111328125,
-0.00876617431640625,
0.0635986328125,
0.058135986328125,
0.01210784912109375,
0.02587890625,
-0.01033782958984375,
-0.01154327392578125,
-0.0294952392578125,
-0.014556884765625,
-0.0040435791015625,
-0.0045166015625,
-0.044769287109375,
0.007049560546875,
0.0458984375,
0.037750244140625,
-0.01328277587890625,
0.006061553955078125,
0.009063720703125,
0.037017822265625,
-0.021759033203125,
0.00693511962890625,
-0.0367431640625,
-0.023223876953125,
-0.01268768310546875,
0.000591278076171875,
-0.0162506103515625,
-0.0201263427734375,
0.00733184814453125,
-0.042877197265625,
0.032318115234375,
-0.00036025047302246094,
0.09600830078125,
0.0200042724609375,
-0.02447509765625,
-0.00899505615234375,
-0.0460205078125,
0.044677734375,
-0.06884765625,
0.018341064453125,
0.0316162109375,
-0.0049591064453125,
0.00830078125,
-0.046356201171875,
-0.044219970703125,
-0.009918212890625,
-0.022674560546875,
0.0166015625,
-0.026580810546875,
-0.00859832763671875,
0.02569580078125,
0.035125732421875,
-0.068603515625,
0.001857757568359375,
-0.035247802734375,
-0.0156707763671875,
0.045013427734375,
0.00592803955078125,
0.02325439453125,
-0.047943115234375,
-0.03033447265625,
-0.0255889892578125,
-0.033355712890625,
-0.01000213623046875,
0.042633056640625,
0.01690673828125,
-0.023712158203125,
0.03753662109375,
-0.006488800048828125,
0.048370361328125,
-0.0089569091796875,
0.00592803955078125,
0.037811279296875,
-0.03564453125,
-0.0252838134765625,
-0.01180267333984375,
0.10882568359375,
0.01043701171875,
0.018585205078125,
0.004375457763671875,
-0.016693115234375,
0.003330230712890625,
0.0134735107421875,
-0.0748291015625,
-0.0187835693359375,
0.01430511474609375,
-0.02301025390625,
-0.0247039794921875,
0.01468658447265625,
-0.060150146484375,
-0.007129669189453125,
-0.00943756103515625,
0.023529052734375,
-0.03466796875,
-0.044158935546875,
0.004482269287109375,
-0.006366729736328125,
0.00540924072265625,
0.015411376953125,
-0.060638427734375,
0.0301055908203125,
0.0511474609375,
0.07568359375,
0.007236480712890625,
-0.0269775390625,
-0.0090179443359375,
0.0009455680847167969,
-0.0211334228515625,
0.0621337890625,
-0.0226287841796875,
-0.0151824951171875,
-0.0191497802734375,
0.01270294189453125,
-0.0186614990234375,
-0.016571044921875,
0.028900146484375,
-0.0135040283203125,
0.05450439453125,
-0.00005429983139038086,
-0.0278472900390625,
-0.0237274169921875,
0.01174163818359375,
-0.0565185546875,
0.08880615234375,
0.03759765625,
-0.07867431640625,
0.01502227783203125,
-0.04144287109375,
-0.0031642913818359375,
0.00867462158203125,
-0.0005440711975097656,
-0.03662109375,
-0.0227203369140625,
0.010650634765625,
0.020172119140625,
-0.038726806640625,
0.03997802734375,
-0.0181427001953125,
-0.0198822021484375,
0.0013294219970703125,
-0.031036376953125,
0.0853271484375,
0.0244903564453125,
-0.0404052734375,
-0.0002887248992919922,
-0.04754638671875,
-0.01428985595703125,
0.0244903564453125,
-0.0097808837890625,
-0.0262451171875,
-0.01220703125,
0.011505126953125,
0.03558349609375,
0.010894775390625,
-0.023529052734375,
0.0175628662109375,
-0.0279693603515625,
0.046600341796875,
0.055419921875,
-0.009063720703125,
0.0264129638671875,
-0.03717041015625,
0.05364990234375,
-0.01245880126953125,
0.0015087127685546875,
-0.004512786865234375,
-0.051971435546875,
-0.0401611328125,
-0.0265045166015625,
0.027740478515625,
0.049530029296875,
-0.04010009765625,
0.035247802734375,
-0.027587890625,
-0.044219970703125,
-0.03533935546875,
-0.0024280548095703125,
0.0210113525390625,
0.03936767578125,
0.0304718017578125,
-0.0007996559143066406,
-0.040130615234375,
-0.0694580078125,
0.0002720355987548828,
-0.037750244140625,
-0.0029430389404296875,
0.0309600830078125,
0.050140380859375,
-0.02874755859375,
0.0682373046875,
-0.030792236328125,
-0.005405426025390625,
-0.0174560546875,
0.0240478515625,
0.04010009765625,
0.02978515625,
0.050750732421875,
-0.04302978515625,
-0.047332763671875,
0.006450653076171875,
-0.038909912109375,
-0.0204315185546875,
0.0003845691680908203,
-0.009918212890625,
0.0273590087890625,
0.029327392578125,
-0.06219482421875,
0.0188140869140625,
0.051605224609375,
-0.05291748046875,
0.049285888671875,
-0.0133209228515625,
-0.0036067962646484375,
-0.09283447265625,
0.03240966796875,
0.006099700927734375,
-0.018646240234375,
-0.038848876953125,
-0.02142333984375,
-0.007038116455078125,
-0.0021648406982421875,
-0.0168609619140625,
0.062286376953125,
-0.0278472900390625,
0.003971099853515625,
-0.012939453125,
0.01197052001953125,
0.006633758544921875,
0.03570556640625,
0.00687408447265625,
0.0439453125,
0.038299560546875,
-0.0400390625,
0.0162200927734375,
0.01055908203125,
-0.013763427734375,
0.0160980224609375,
-0.0687255859375,
0.0017557144165039062,
-0.00897979736328125,
0.0168609619140625,
-0.07098388671875,
0.0008373260498046875,
0.03387451171875,
-0.038238525390625,
0.0222015380859375,
-0.0308837890625,
-0.0361328125,
-0.041015625,
-0.0145263671875,
0.014984130859375,
0.03802490234375,
-0.00649261474609375,
0.041168212890625,
0.029266357421875,
-0.0303192138671875,
-0.056610107421875,
-0.032989501953125,
-0.006542205810546875,
-0.0277557373046875,
-0.046966552734375,
0.02862548828125,
-0.00762939453125,
-0.01174163818359375,
0.0167388916015625,
0.0134735107421875,
0.0095062255859375,
-0.0005397796630859375,
0.0030231475830078125,
0.0263671875,
-0.004840850830078125,
-0.01165008544921875,
-0.01009368896484375,
-0.0216217041015625,
0.01392364501953125,
-0.007251739501953125,
0.0667724609375,
-0.02264404296875,
0.0028057098388671875,
-0.01641845703125,
0.0216064453125,
0.047698974609375,
-0.002742767333984375,
0.04864501953125,
0.060821533203125,
-0.0238494873046875,
0.00360107421875,
-0.032318115234375,
-0.0140380859375,
-0.03253173828125,
0.0521240234375,
-0.019683837890625,
-0.0616455078125,
0.049285888671875,
0.0274810791015625,
0.00872802734375,
0.064453125,
0.048004150390625,
0.0093536376953125,
0.081298828125,
0.043060302734375,
-0.017242431640625,
0.0408935546875,
-0.041961669921875,
0.001750946044921875,
-0.0748291015625,
-0.00650787353515625,
-0.05242919921875,
-0.015869140625,
-0.07025146484375,
-0.0287322998046875,
0.007904052734375,
-0.003204345703125,
-0.048004150390625,
0.038726806640625,
-0.05035400390625,
0.0085601806640625,
0.042266845703125,
-0.01548004150390625,
0.0164947509765625,
-0.00786590576171875,
-0.0085601806640625,
0.0158538818359375,
-0.057037353515625,
-0.038299560546875,
0.07745361328125,
0.033966064453125,
0.042755126953125,
0.008453369140625,
0.04742431640625,
0.00437164306640625,
0.031036376953125,
-0.050872802734375,
0.0310211181640625,
-0.0174102783203125,
-0.0755615234375,
-0.0287322998046875,
-0.0562744140625,
-0.0931396484375,
0.036285400390625,
-0.00675201416015625,
-0.055694580078125,
0.0230560302734375,
0.0018682479858398438,
-0.01983642578125,
0.0281829833984375,
-0.052398681640625,
0.07269287109375,
-0.01025390625,
-0.02197265625,
0.0009131431579589844,
-0.0399169921875,
0.0264739990234375,
-0.0023021697998046875,
0.037811279296875,
-0.009124755859375,
-0.0074462890625,
0.064453125,
-0.0352783203125,
0.058135986328125,
-0.00794219970703125,
-0.0078277587890625,
0.031951904296875,
0.01247406005859375,
0.05035400390625,
0.0072784423828125,
0.0018472671508789062,
0.01186370849609375,
-0.0022487640380859375,
-0.01513671875,
-0.021026611328125,
0.05224609375,
-0.07763671875,
-0.040679931640625,
-0.05853271484375,
-0.045135498046875,
0.0218963623046875,
0.033416748046875,
0.03021240234375,
0.0280914306640625,
-0.0126953125,
0.016204833984375,
0.0379638671875,
-0.032562255859375,
0.040252685546875,
0.032012939453125,
-0.03228759765625,
-0.03802490234375,
0.061767578125,
0.01374053955078125,
0.017333984375,
0.021820068359375,
0.03662109375,
-0.02679443359375,
-0.029388427734375,
-0.027587890625,
0.048095703125,
-0.025634765625,
-0.01073455810546875,
-0.0745849609375,
-0.0222930908203125,
-0.041595458984375,
0.0030956268310546875,
-0.036041259765625,
-0.02606201171875,
-0.020843505859375,
-0.01019287109375,
0.03466796875,
0.058929443359375,
-0.0058135986328125,
0.0212860107421875,
-0.039703369140625,
0.02056884765625,
0.029052734375,
0.0225677490234375,
-0.01154327392578125,
-0.059661865234375,
-0.00856781005859375,
0.006580352783203125,
-0.01605224609375,
-0.0684814453125,
0.067626953125,
-0.005626678466796875,
0.04656982421875,
0.02056884765625,
-0.00965118408203125,
0.0278472900390625,
-0.024444580078125,
0.050201416015625,
0.0056610107421875,
-0.054412841796875,
0.0345458984375,
-0.0572509765625,
0.02178955078125,
0.0180206298828125,
0.046539306640625,
-0.0452880859375,
-0.033966064453125,
-0.0909423828125,
-0.0849609375,
0.06402587890625,
0.0166473388671875,
0.0022525787353515625,
-0.0108184814453125,
0.005435943603515625,
-0.003925323486328125,
0.0086212158203125,
-0.08050537109375,
-0.03021240234375,
-0.0362548828125,
-0.001316070556640625,
-0.0199737548828125,
-0.01084136962890625,
-0.0027103424072265625,
-0.020782470703125,
0.06207275390625,
-0.005748748779296875,
0.0341796875,
0.002620697021484375,
-0.0027866363525390625,
0.001110076904296875,
0.01264190673828125,
0.0341796875,
0.043701171875,
-0.037078857421875,
0.0011043548583984375,
-0.00395965576171875,
-0.053802490234375,
-0.0106658935546875,
0.03106689453125,
-0.032958984375,
0.0035190582275390625,
0.00783538818359375,
0.06793212890625,
-0.00998687744140625,
-0.016326904296875,
0.0316162109375,
-0.00021135807037353516,
-0.033843994140625,
-0.0230712890625,
-0.0037822723388671875,
0.0114288330078125,
0.0014781951904296875,
0.0235595703125,
0.0017719268798828125,
0.0149688720703125,
-0.040008544921875,
0.0036716461181640625,
0.03424072265625,
-0.0214996337890625,
-0.0272979736328125,
0.04901123046875,
-0.00371551513671875,
-0.006763458251953125,
0.050750732421875,
-0.02496337890625,
-0.0479736328125,
0.0460205078125,
0.04278564453125,
0.0645751953125,
-0.0242156982421875,
0.0338134765625,
0.0628662109375,
0.0452880859375,
-0.009124755859375,
0.006988525390625,
0.0243682861328125,
-0.05609130859375,
-0.03857421875,
-0.047882080078125,
-0.01397705078125,
0.03839111328125,
-0.031768798828125,
0.022125244140625,
-0.021636962890625,
-0.0222015380859375,
-0.0142059326171875,
0.015289306640625,
-0.04150390625,
0.01763916015625,
0.0214385986328125,
0.037811279296875,
-0.08349609375,
0.057830810546875,
0.05328369140625,
-0.037322998046875,
-0.05902099609375,
-0.0302276611328125,
-0.0028171539306640625,
-0.045013427734375,
0.0237274169921875,
0.0017518997192382812,
0.0105438232421875,
0.01480865478515625,
-0.035125732421875,
-0.08984375,
0.08197021484375,
0.0217437744140625,
-0.03228759765625,
-0.00804901123046875,
0.0017557144165039062,
0.041778564453125,
-0.014373779296875,
0.06072998046875,
0.03326416015625,
0.038421630859375,
-0.0010929107666015625,
-0.07611083984375,
0.0244140625,
-0.05419921875,
0.0022258758544921875,
0.026763916015625,
-0.062286376953125,
0.09136962890625,
0.0019330978393554688,
-0.00901031494140625,
-0.0189666748046875,
0.028472900390625,
0.035064697265625,
-0.003631591796875,
0.041900634765625,
0.0592041015625,
0.056610107421875,
-0.0220489501953125,
0.0948486328125,
-0.0214080810546875,
0.045196533203125,
0.0728759765625,
0.0205078125,
0.045684814453125,
0.0163116455078125,
-0.0298309326171875,
0.058258056640625,
0.044158935546875,
-0.005329132080078125,
0.023834228515625,
0.0018568038940429688,
0.00794219970703125,
-0.00771331787109375,
0.00481414794921875,
-0.04986572265625,
0.01297760009765625,
0.036163330078125,
-0.03619384765625,
-0.00873565673828125,
-0.02703857421875,
0.02911376953125,
-0.024139404296875,
-0.0045318603515625,
0.044525146484375,
0.006988525390625,
-0.045196533203125,
0.0565185546875,
-0.005481719970703125,
0.05316162109375,
-0.043487548828125,
0.00955963134765625,
-0.011627197265625,
0.002857208251953125,
-0.01751708984375,
-0.04925537109375,
0.019073486328125,
0.005062103271484375,
-0.011322021484375,
-0.0242767333984375,
0.0430908203125,
-0.0189056396484375,
-0.039703369140625,
0.024810791015625,
0.04205322265625,
0.0182647705078125,
-0.0189971923828125,
-0.07501220703125,
-0.00641632080078125,
-0.0026226043701171875,
-0.05059814453125,
0.0253143310546875,
0.052520751953125,
0.0011749267578125,
0.0479736328125,
0.058929443359375,
0.0107879638671875,
-0.0023193359375,
0.0188140869140625,
0.07611083984375,
-0.055389404296875,
-0.041717529296875,
-0.059234619140625,
0.048553466796875,
-0.006561279296875,
-0.0343017578125,
0.05078125,
0.04669189453125,
0.05029296875,
0.016357421875,
0.0665283203125,
-0.03851318359375,
0.0458984375,
-0.01262664794921875,
0.048309326171875,
-0.031890869140625,
0.0118560791015625,
-0.04852294921875,
-0.09051513671875,
-0.00505828857421875,
0.06597900390625,
-0.035400390625,
0.0350341796875,
0.06390380859375,
0.054168701171875,
0.0015821456909179688,
-0.01044464111328125,
0.0020999908447265625,
0.03851318359375,
0.02642822265625,
0.04974365234375,
0.0562744140625,
-0.049468994140625,
0.046234130859375,
-0.040863037109375,
-0.0215301513671875,
-0.00983428955078125,
-0.07110595703125,
-0.064208984375,
-0.03912353515625,
-0.03759765625,
-0.048004150390625,
-0.01175689697265625,
0.0494384765625,
0.039276123046875,
-0.04681396484375,
-0.03131103515625,
-0.0189056396484375,
0.006290435791015625,
-0.02008056640625,
-0.02264404296875,
0.038848876953125,
-0.00565338134765625,
-0.06610107421875,
0.0037631988525390625,
-0.00749969482421875,
0.01523590087890625,
-0.0182647705078125,
-0.013031005859375,
-0.033721923828125,
-0.00768280029296875,
0.0322265625,
0.0200042724609375,
-0.036376953125,
-0.00695037841796875,
0.01238250732421875,
-0.0192108154296875,
0.0038318634033203125,
0.027587890625,
-0.05462646484375,
0.0102081298828125,
0.060821533203125,
0.009552001953125,
0.056732177734375,
0.013763427734375,
0.0367431640625,
-0.0286865234375,
0.007781982421875,
0.022705078125,
0.03497314453125,
0.026611328125,
-0.0272064208984375,
0.04437255859375,
0.032623291015625,
-0.052581787109375,
-0.053192138671875,
-0.007747650146484375,
-0.08154296875,
-0.024871826171875,
0.11053466796875,
0.00788116455078125,
-0.0216522216796875,
-0.0157623291015625,
-0.007579803466796875,
0.021240234375,
-0.03912353515625,
0.05096435546875,
0.05462646484375,
0.006305694580078125,
-0.0253143310546875,
-0.055419921875,
0.034149169921875,
0.013214111328125,
-0.054534912109375,
0.01641845703125,
0.0322265625,
0.0256195068359375,
0.0218353271484375,
0.055389404296875,
-0.0274810791015625,
0.006816864013671875,
0.006641387939453125,
0.0159759521484375,
-0.00984954833984375,
-0.015625,
-0.01342010498046875,
0.002368927001953125,
-0.0165557861328125,
0.0245208740234375
]
] |
facebook/dragon-plus-query-encoder | 2023-02-17T18:30:37.000Z | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"feature-extraction",
"arxiv:2302.07452",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | feature-extraction | facebook | null | null | facebook/dragon-plus-query-encoder | 8 | 56,450 | transformers | 2023-02-15T17:50:48 | ---
tags:
- feature-extraction
pipeline_tag: feature-extraction
---
DRAGON+ is a BERT-base sized dense retriever initialized from [RetroMAE](https://huggingface.co/Shitao/RetroMAE) and further trained on data augmented from the MS MARCO corpus, following the approach described in [How to Train Your DRAGON:
Diverse Augmentation Towards Generalizable Dense Retrieval](https://arxiv.org/abs/2302.07452).
<p align="center">
<img src="https://raw.githubusercontent.com/facebookresearch/dpr-scale/main/dragon/images/teaser.png" width="600">
</p>
The associated GitHub repository is available at https://github.com/facebookresearch/dpr-scale/tree/main/dragon. DRAGON+ uses an asymmetric dual encoder, with two distinctly parameterized encoders. The following models are also available:
Model | Initialization | MARCO Dev | BEIR | Query Encoder Path | Context Encoder Path
|---|---|---|---|---|---
DRAGON+ | Shitao/RetroMAE| 39.0 | 47.4 | [facebook/dragon-plus-query-encoder](https://huggingface.co/facebook/dragon-plus-query-encoder) | [facebook/dragon-plus-context-encoder](https://huggingface.co/facebook/dragon-plus-context-encoder)
DRAGON-RoBERTa | RoBERTa-base | 39.4 | 47.2 | [facebook/dragon-roberta-query-encoder](https://huggingface.co/facebook/dragon-roberta-query-encoder) | [facebook/dragon-roberta-context-encoder](https://huggingface.co/facebook/dragon-roberta-context-encoder)
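As a minimal illustration of the asymmetric dual-encoder setup, the sketch below scores one query against several passages with a dot product. This is not from the original card: random tensors stand in for the two encoders' [CLS] embeddings, and the 768-dim size is an assumption matching BERT-base.

```python
import torch

torch.manual_seed(0)

# Stand-ins for the outputs of the two distinctly parameterized encoders
query_emb = torch.randn(1, 768)   # one query embedding
ctx_emb = torch.randn(4, 768)     # four candidate passage embeddings

# Dot-product relevance scores, one per passage
scores = query_emb @ ctx_emb.T    # shape (1, 4)

# Index of the most relevant passage
best = int(scores.argmax(dim=1))
```

Retrieval then amounts to returning the passages with the highest scores; at corpus scale this dot-product search is typically delegated to an approximate nearest-neighbor index.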
## Usage (HuggingFace Transformers)
The model can be used directly with HuggingFace Transformers:
```python
import torch
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained('facebook/dragon-plus-query-encoder')
query_encoder = AutoModel.from_pretrained('facebook/dragon-plus-query-encoder')
context_encoder = AutoModel.from_pretrained('facebook/dragon-plus-context-encoder')
# An example query and candidate passages
query = "Where was Marie Curie born?"
contexts = [
"Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867.",
"Born in Paris on 15 May 1859, Pierre Curie was the son of Eugène Curie, a doctor of French Catholic origin from Alsace."
]
# Apply tokenizer
query_input = tokenizer(query, return_tensors='pt')
ctx_input = tokenizer(contexts, padding=True, truncation=True, return_tensors='pt')
# Compute embeddings: take the last-layer hidden state of the [CLS] token
query_emb = query_encoder(**query_input).last_hidden_state[:, 0, :]
ctx_emb = context_encoder(**ctx_input).last_hidden_state[:, 0, :]
# Compute similarity scores using dot product
score1 = query_emb @ ctx_emb[0] # 396.5625
score2 = query_emb @ ctx_emb[1] # 393.8340
``` | 2,617 | [
[
-0.02899169921875,
-0.047393798828125,
0.01104736328125,
0.023345947265625,
-0.007663726806640625,
-0.0116119384765625,
-0.016448974609375,
-0.034027099609375,
0.0472412109375,
0.021331787109375,
-0.057647705078125,
-0.03692626953125,
-0.040985107421875,
-0.0032253265380859375,
-0.02392578125,
0.062042236328125,
0.003658294677734375,
0.017608642578125,
0.006824493408203125,
-0.0110931396484375,
0.004421234130859375,
-0.049591064453125,
-0.0726318359375,
-0.0187225341796875,
0.046478271484375,
-0.005710601806640625,
0.0579833984375,
0.059783935546875,
0.022369384765625,
0.0283050537109375,
-0.022552490234375,
0.0183563232421875,
-0.040496826171875,
-0.00850677490234375,
0.0006003379821777344,
-0.0289459228515625,
-0.036529541015625,
-0.0029697418212890625,
0.045013427734375,
0.044097900390625,
0.00850677490234375,
0.007617950439453125,
0.00414276123046875,
0.037841796875,
-0.03753662109375,
-0.00156402587890625,
-0.036224365234375,
0.018646240234375,
0.00933074951171875,
-0.007305145263671875,
-0.0150604248046875,
-0.0416259765625,
0.007106781005859375,
-0.048492431640625,
0.0226593017578125,
-0.01503753662109375,
0.0936279296875,
0.01435089111328125,
-0.032318115234375,
-0.0284881591796875,
-0.021636962890625,
0.05218505859375,
-0.045806884765625,
0.029144287109375,
0.01904296875,
0.013458251953125,
-0.0127716064453125,
-0.0731201171875,
-0.0550537109375,
-0.0163421630859375,
-0.01425933837890625,
-0.01081085205078125,
0.0015430450439453125,
0.0007715225219726562,
0.0242462158203125,
0.03582763671875,
-0.04443359375,
-0.0081787109375,
-0.037384033203125,
-0.024444580078125,
0.056976318359375,
0.005519866943359375,
0.03704833984375,
-0.01788330078125,
-0.032073974609375,
-0.023681640625,
-0.0406494140625,
0.0228271484375,
0.020751953125,
0.018585205078125,
-0.0213470458984375,
0.047607421875,
-0.006031036376953125,
0.05902099609375,
0.041656494140625,
0.00612640380859375,
0.048858642578125,
-0.022705078125,
-0.0131683349609375,
0.01030731201171875,
0.067138671875,
0.02239990234375,
0.01343536376953125,
-0.007724761962890625,
0.00009071826934814453,
0.0004260540008544922,
0.0042877197265625,
-0.08349609375,
-0.0087738037109375,
0.026580810546875,
-0.061981201171875,
-0.0199432373046875,
0.0161895751953125,
-0.0234222412109375,
-0.0220947265625,
-0.0105743408203125,
0.046630859375,
-0.03253173828125,
-0.0330810546875,
0.02655029296875,
-0.031402587890625,
0.038970947265625,
-0.01465606689453125,
-0.0670166015625,
0.0125885009765625,
0.0556640625,
0.04638671875,
-0.0015430450439453125,
-0.0295257568359375,
-0.0198822021484375,
-0.004398345947265625,
-0.0225677490234375,
0.0360107421875,
-0.035247802734375,
-0.0325927734375,
0.0022144317626953125,
0.034149169921875,
-0.01654052734375,
-0.0443115234375,
0.07110595703125,
-0.031036376953125,
0.019622802734375,
-0.04071044921875,
-0.049896240234375,
-0.0247344970703125,
0.0238037109375,
-0.05352783203125,
0.07720947265625,
0.0170440673828125,
-0.052337646484375,
0.039031982421875,
-0.03070068359375,
-0.0321044921875,
-0.0020542144775390625,
-0.0157012939453125,
-0.03009033203125,
-0.005218505859375,
0.029449462890625,
0.054046630859375,
0.0115509033203125,
0.004558563232421875,
-0.0478515625,
-0.05438232421875,
0.0183868408203125,
-0.011474609375,
0.09588623046875,
0.0140380859375,
-0.029388427734375,
0.0033130645751953125,
-0.06414794921875,
0.0179595947265625,
0.00308990478515625,
-0.03143310546875,
-0.0226593017578125,
-0.0032939910888671875,
0.01177978515625,
0.016754150390625,
0.041107177734375,
-0.059112548828125,
0.0261993408203125,
-0.0311737060546875,
0.03546142578125,
0.02886962890625,
-0.003345489501953125,
0.030548095703125,
-0.01503753662109375,
-0.00160980224609375,
-0.0062255859375,
0.0209808349609375,
-0.0067596435546875,
-0.030181884765625,
-0.04901123046875,
-0.02557373046875,
-0.00010341405868530273,
0.0313720703125,
-0.053131103515625,
0.053741455078125,
-0.0225067138671875,
-0.04644775390625,
-0.053314208984375,
-0.007076263427734375,
0.0205841064453125,
0.0201263427734375,
0.023681640625,
-0.01309967041015625,
-0.04693603515625,
-0.057403564453125,
-0.0127410888671875,
-0.0026264190673828125,
-0.01245880126953125,
0.04718017578125,
0.029510498046875,
-0.03082275390625,
0.041900634765625,
-0.037994384765625,
-0.01287078857421875,
-0.00997161865234375,
-0.0075836181640625,
0.0501708984375,
0.06121826171875,
0.058380126953125,
-0.0748291015625,
-0.03955078125,
-0.004444122314453125,
-0.0633544921875,
0.032989501953125,
-0.007175445556640625,
-0.0180206298828125,
-0.0094146728515625,
0.00946807861328125,
-0.0692138671875,
0.0256500244140625,
0.032440185546875,
-0.04364013671875,
0.0357666015625,
-0.019439697265625,
0.026641845703125,
-0.0924072265625,
0.002307891845703125,
0.003154754638671875,
0.004085540771484375,
-0.036224365234375,
0.0178680419921875,
0.017822265625,
-0.0108795166015625,
-0.03271484375,
0.0369873046875,
-0.03924560546875,
-0.005649566650390625,
-0.003993988037109375,
-0.01058197021484375,
0.01250457763671875,
0.038604736328125,
0.01178741455078125,
0.040191650390625,
0.05584716796875,
-0.05267333984375,
0.055145263671875,
0.0189208984375,
-0.01428985595703125,
0.03314208984375,
-0.05670166015625,
0.01491546630859375,
0.0072174072265625,
0.0162353515625,
-0.0626220703125,
-0.01285552978515625,
0.0203094482421875,
-0.0631103515625,
0.033538818359375,
-0.007389068603515625,
-0.040740966796875,
-0.045135498046875,
-0.051300048828125,
0.032379150390625,
0.0226593017578125,
-0.06842041015625,
0.045684814453125,
0.020233154296875,
0.0180511474609375,
-0.060943603515625,
-0.07073974609375,
-0.00382232666015625,
0.00021147727966308594,
-0.06634521484375,
0.05279541015625,
-0.013885498046875,
0.0195159912109375,
0.02655029296875,
0.005222320556640625,
-0.033233642578125,
-0.0136566162109375,
0.00829315185546875,
0.0200653076171875,
-0.0303802490234375,
-0.0237579345703125,
0.0103759765625,
-0.0015773773193359375,
0.01355743408203125,
-0.025177001953125,
0.044342041015625,
-0.005344390869140625,
-0.02655029296875,
-0.0419921875,
0.00836944580078125,
0.025177001953125,
-0.00914764404296875,
0.06005859375,
0.08624267578125,
-0.044342041015625,
-0.01322174072265625,
-0.039215087890625,
-0.0205535888671875,
-0.04296875,
0.03814697265625,
-0.03094482421875,
-0.0733642578125,
0.056427001953125,
0.027008056640625,
-0.01467132568359375,
0.031036376953125,
0.0452880859375,
0.002208709716796875,
0.0872802734375,
0.042694091796875,
-0.0229034423828125,
0.0311279296875,
-0.047698974609375,
-0.0005340576171875,
-0.0556640625,
-0.03997802734375,
-0.028045654296875,
-0.0377197265625,
-0.0595703125,
-0.014495849609375,
0.0067596435546875,
0.00039005279541015625,
-0.0341796875,
0.0579833984375,
-0.06884765625,
0.0228729248046875,
0.04150390625,
0.03350830078125,
-0.0195159912109375,
0.00865936279296875,
0.0021915435791015625,
-0.0104217529296875,
-0.035858154296875,
0.01410675048828125,
0.07366943359375,
0.00931549072265625,
0.05615234375,
-0.006988525390625,
0.0501708984375,
0.004123687744140625,
-0.003940582275390625,
-0.0462646484375,
0.034210205078125,
-0.0283050537109375,
-0.0638427734375,
0.0033664703369140625,
-0.0384521484375,
-0.06365966796875,
0.01528167724609375,
-0.0195159912109375,
-0.05609130859375,
0.0494384765625,
-0.007457733154296875,
-0.03173828125,
0.0267486572265625,
-0.040069580078125,
0.06317138671875,
-0.02490234375,
-0.033477783203125,
-0.00537872314453125,
-0.037139892578125,
0.014739990234375,
0.0283050537109375,
-0.01385498046875,
0.018035888671875,
-0.004444122314453125,
0.04290771484375,
-0.0166168212890625,
0.050872802734375,
-0.031951904296875,
0.01471710205078125,
0.0268402099609375,
-0.0089569091796875,
0.0268707275390625,
0.002323150634765625,
-0.011962890625,
-0.00009751319885253906,
0.00994110107421875,
-0.029144287109375,
-0.0445556640625,
0.0748291015625,
-0.0694580078125,
-0.011566162109375,
-0.007808685302734375,
-0.028961181640625,
0.01239776611328125,
0.02099609375,
0.038970947265625,
0.044525146484375,
0.0125885009765625,
0.038818359375,
0.05999755859375,
-0.038055419921875,
0.0276641845703125,
0.02587890625,
-0.0109100341796875,
-0.04144287109375,
0.08209228515625,
0.0200347900390625,
-0.018218994140625,
0.034637451171875,
-0.00060272216796875,
-0.037322998046875,
-0.01849365234375,
-0.022064208984375,
0.0301971435546875,
-0.04315185546875,
-0.00649261474609375,
-0.049652099609375,
-0.046051025390625,
-0.032806396484375,
-0.01143646240234375,
-0.029052734375,
-0.028656005859375,
-0.0382080078125,
-0.0201263427734375,
0.0193939208984375,
0.0304107666015625,
-0.0156402587890625,
0.0108184814453125,
-0.0516357421875,
0.029510498046875,
0.017608642578125,
0.0281219482421875,
-0.004993438720703125,
-0.03729248046875,
-0.0141754150390625,
-0.0005736351013183594,
-0.0022792816162109375,
-0.0594482421875,
0.032684326171875,
0.015655517578125,
0.056060791015625,
0.04315185546875,
-0.009368896484375,
0.06414794921875,
-0.019439697265625,
0.0673828125,
0.00028395652770996094,
-0.048980712890625,
0.01537322998046875,
-0.018035888671875,
-0.00533294677734375,
0.028228759765625,
0.03759765625,
-0.030029296875,
-0.0009617805480957031,
-0.03924560546875,
-0.0721435546875,
0.044891357421875,
0.0027828216552734375,
0.037689208984375,
-0.003154754638671875,
0.05584716796875,
-0.0021266937255859375,
0.007503509521484375,
-0.06634521484375,
-0.04620361328125,
-0.026275634765625,
-0.01512908935546875,
0.021728515625,
-0.03997802734375,
0.0095367431640625,
-0.0511474609375,
0.050628662109375,
0.0101776123046875,
0.036712646484375,
0.0243072509765625,
0.004878997802734375,
0.004352569580078125,
-0.0187530517578125,
0.028961181640625,
0.0367431640625,
-0.0279541015625,
-0.0156097412109375,
0.0087127685546875,
-0.03607177734375,
-0.01230621337890625,
0.037017822265625,
-0.0149993896484375,
0.0156097412109375,
0.031951904296875,
0.057952880859375,
0.01904296875,
-0.01483917236328125,
0.05035400390625,
-0.00928497314453125,
-0.0211639404296875,
-0.03790283203125,
0.004299163818359375,
0.01552581787109375,
0.029541015625,
0.046630859375,
-0.0034198760986328125,
0.0249176025390625,
-0.01556396484375,
0.036834716796875,
0.032012939453125,
-0.037689208984375,
-0.0305633544921875,
0.06097412109375,
-0.0026836395263671875,
-0.00656890869140625,
0.03973388671875,
-0.0194244384765625,
-0.038848876953125,
0.0596923828125,
0.040924072265625,
0.07366943359375,
-0.004459381103515625,
0.01245880126953125,
0.054229736328125,
0.01715087890625,
-0.001087188720703125,
0.045257568359375,
-0.0188446044921875,
-0.06854248046875,
-0.0304718017578125,
-0.04290771484375,
0.002410888671875,
0.0135650634765625,
-0.07177734375,
0.021484375,
-0.0272674560546875,
-0.024017333984375,
-0.0020847320556640625,
0.02392578125,
-0.053680419921875,
0.022125244140625,
0.00942230224609375,
0.07611083984375,
-0.037689208984375,
0.07037353515625,
0.0606689453125,
-0.039794921875,
-0.086669921875,
-0.0199432373046875,
-0.00860595703125,
-0.07598876953125,
0.050323486328125,
0.0205230712890625,
0.007175445556640625,
0.0014057159423828125,
-0.0565185546875,
-0.062744140625,
0.08538818359375,
0.0288543701171875,
-0.02508544921875,
-0.003925323486328125,
-0.01030731201171875,
0.04547119140625,
-0.027923583984375,
0.0296173095703125,
0.027130126953125,
0.0272369384765625,
0.00417327880859375,
-0.045074462890625,
0.0112762451171875,
-0.0413818359375,
0.0030536651611328125,
-0.0160064697265625,
-0.045867919921875,
0.09173583984375,
-0.00756072998046875,
-0.0139007568359375,
0.0175933837890625,
0.051605224609375,
0.0212860107421875,
0.01230621337890625,
0.019775390625,
0.05145263671875,
0.033172607421875,
-0.01026153564453125,
0.08575439453125,
-0.043670654296875,
0.050567626953125,
0.05279541015625,
0.004116058349609375,
0.060333251953125,
0.0176849365234375,
-0.0142059326171875,
0.042449951171875,
0.06475830078125,
-0.024505615234375,
0.042022705078125,
-0.0040283203125,
-0.0160675048828125,
-0.001617431640625,
0.004749298095703125,
-0.055694580078125,
0.006938934326171875,
0.0247039794921875,
-0.047698974609375,
-0.0032939910888671875,
-0.0087127685546875,
0.0246734619140625,
-0.006023406982421875,
-0.032806396484375,
0.0440673828125,
-0.0013628005981445312,
-0.0296478271484375,
0.06292724609375,
-0.0006999969482421875,
0.080078125,
-0.049224853515625,
0.0023708343505859375,
-0.022705078125,
0.035186767578125,
-0.0416259765625,
-0.0653076171875,
0.0150604248046875,
-0.017486572265625,
-0.010833740234375,
0.0085601806640625,
0.06121826171875,
-0.047576904296875,
-0.054229736328125,
0.043426513671875,
0.01117706298828125,
0.00867462158203125,
-0.00785064697265625,
-0.08074951171875,
0.01189422607421875,
0.00909423828125,
-0.037017822265625,
0.01520538330078125,
0.037139892578125,
0.02398681640625,
0.04736328125,
0.056610107421875,
0.0156097412109375,
0.023162841796875,
0.0043182373046875,
0.08648681640625,
-0.047698974609375,
-0.03033447265625,
-0.07037353515625,
0.0264739990234375,
-0.01419830322265625,
-0.02276611328125,
0.06109619140625,
0.048614501953125,
0.0679931640625,
-0.004199981689453125,
0.040283203125,
-0.025390625,
0.02655029296875,
-0.0153045654296875,
0.08197021484375,
-0.06591796875,
-0.00640869140625,
-0.01264190673828125,
-0.067138671875,
-0.01102447509765625,
0.050933837890625,
-0.004604339599609375,
0.01043701171875,
0.027313232421875,
0.0736083984375,
-0.00856781005859375,
-0.01285552978515625,
0.0044403076171875,
0.0182342529296875,
0.01471710205078125,
0.03997802734375,
0.0460205078125,
-0.03936767578125,
0.0460205078125,
-0.035858154296875,
-0.00881195068359375,
0.0034580230712890625,
-0.047027587890625,
-0.070068359375,
-0.0338134765625,
-0.0012998580932617188,
-0.043060302734375,
0.007686614990234375,
0.07000732421875,
0.05975341796875,
-0.07037353515625,
-0.0182647705078125,
0.0165863037109375,
0.0013837814331054688,
-0.02490234375,
-0.0221405029296875,
0.057525634765625,
-0.01198577880859375,
-0.05670166015625,
0.0205230712890625,
-0.0008535385131835938,
-0.01480865478515625,
-0.0174407958984375,
-0.0218048095703125,
-0.00257110595703125,
-0.0145416259765625,
0.030029296875,
0.025238037109375,
-0.0399169921875,
-0.021759033203125,
0.00659942626953125,
-0.0184173583984375,
0.00887298583984375,
0.033599853515625,
-0.0478515625,
0.03741455078125,
0.05072021484375,
0.05877685546875,
0.0579833984375,
0.00214385986328125,
0.02960205078125,
-0.05560302734375,
0.0284271240234375,
0.008056640625,
0.03472900390625,
0.03277587890625,
-0.047760009765625,
0.02618408203125,
0.0297393798828125,
-0.03277587890625,
-0.05877685546875,
-0.00603485107421875,
-0.07391357421875,
-0.01666259765625,
0.07415771484375,
-0.022552490234375,
-0.03680419921875,
0.027191162109375,
-0.0185699462890625,
0.0241546630859375,
-0.0276031494140625,
0.036590576171875,
0.0255279541015625,
-0.0106201171875,
-0.009246826171875,
-0.0175018310546875,
0.03741455078125,
0.00875091552734375,
-0.0190277099609375,
-0.038330078125,
0.00989532470703125,
0.03350830078125,
0.017852783203125,
0.0477294921875,
-0.0234375,
0.018463134765625,
0.016937255859375,
0.009674072265625,
-0.0209197998046875,
0.001155853271484375,
-0.01190185546875,
0.016876220703125,
-0.0264129638671875,
-0.03460693359375
]
] |
nvidia/segformer-b0-finetuned-ade-512-512 | 2023-04-24T08:31:30.000Z | [
"transformers",
"pytorch",
"tf",
"segformer",
"vision",
"image-segmentation",
"dataset:scene_parse_150",
"arxiv:2105.15203",
"license:other",
"endpoints_compatible",
"has_space",
"region:us"
] | image-segmentation | nvidia | null | null | nvidia/segformer-b0-finetuned-ade-512-512 | 89 | 56,197 | transformers | 2022-03-02T23:29:05 | ---
license: other
tags:
- vision
- image-segmentation
datasets:
- scene_parse_150
widget:
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg
example_title: House
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg
example_title: Castle
---
# SegFormer (b0-sized) model fine-tuned on ADE20k
SegFormer model fine-tuned on ADE20k at resolution 512x512. It was introduced in the paper [SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers](https://arxiv.org/abs/2105.15203) by Xie et al. and first released in [this repository](https://github.com/NVlabs/SegFormer).
Disclaimer: The team releasing SegFormer did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
SegFormer consists of a hierarchical Transformer encoder and a lightweight all-MLP decode head, achieving strong results on semantic segmentation benchmarks such as ADE20K and Cityscapes. The hierarchical Transformer encoder is first pre-trained on ImageNet-1k, after which the decode head is added and the whole model is fine-tuned on a downstream dataset.
## Intended uses & limitations
You can use the raw model for semantic segmentation. See the [model hub](https://huggingface.co/models?other=segformer) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to segment an image from the COCO 2017 dataset into the 150 ADE20k classes:
```python
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation
from PIL import Image
import requests
processor = SegformerImageProcessor.from_pretrained("nvidia/segformer-b0-finetuned-ade-512-512")
model = SegformerForSemanticSegmentation.from_pretrained("nvidia/segformer-b0-finetuned-ade-512-512")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits # shape (batch_size, num_labels, height/4, width/4)
```
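The logits come out at 1/4 of the input resolution, so a common post-processing step is to upsample them back to the input size and take the per-pixel argmax. The sketch below is not from the original card: dummy random logits stand in for real model output, and the bilinear interpolation mode is an assumption (ADE20k's 150 classes and the 512x512 input size match this checkpoint).

```python
import torch

# Dummy logits with SegFormer-b0's output shape: (batch, 150 classes, H/4, W/4)
logits = torch.randn(1, 150, 128, 128)

# Upsample to the original 512x512 input resolution
upsampled = torch.nn.functional.interpolate(
    logits, size=(512, 512), mode="bilinear", align_corners=False
)

# Per-pixel class prediction, shape (1, 512, 512)
seg_map = upsampled.argmax(dim=1)
```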
For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/segformer.html#).
### License
The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
author = {Enze Xie and
Wenhai Wang and
Zhiding Yu and
Anima Anandkumar and
Jose M. Alvarez and
Ping Luo},
title = {SegFormer: Simple and Efficient Design for Semantic Segmentation with
Transformers},
journal = {CoRR},
volume = {abs/2105.15203},
year = {2021},
url = {https://arxiv.org/abs/2105.15203},
eprinttype = {arXiv},
eprint = {2105.15203},
timestamp = {Wed, 02 Jun 2021 11:46:42 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 3,197 | [
[
-0.0672607421875,
-0.0550537109375,
0.01224517822265625,
0.0160675048828125,
-0.0240325927734375,
-0.0272979736328125,
-0.0006551742553710938,
-0.0504150390625,
0.0235443115234375,
0.044219970703125,
-0.0657958984375,
-0.043792724609375,
-0.05755615234375,
0.00850677490234375,
-0.02288818359375,
0.061126708984375,
0.0062713623046875,
-0.010162353515625,
-0.02435302734375,
-0.0275726318359375,
-0.0016450881958007812,
-0.02313232421875,
-0.048095703125,
-0.0265960693359375,
0.0271148681640625,
0.01543426513671875,
0.044097900390625,
0.061279296875,
0.0496826171875,
0.03521728515625,
-0.034393310546875,
0.0116424560546875,
-0.0241851806640625,
-0.0133514404296875,
0.00208282470703125,
-0.005863189697265625,
-0.03253173828125,
-0.0016050338745117188,
0.0281829833984375,
0.04833984375,
0.00489044189453125,
0.026947021484375,
-0.0032634735107421875,
0.033416748046875,
-0.03289794921875,
0.006938934326171875,
-0.037200927734375,
0.01129913330078125,
0.005573272705078125,
0.00015926361083984375,
-0.0223236083984375,
-0.01214599609375,
0.015411376953125,
-0.038787841796875,
0.054351806640625,
0.0030269622802734375,
0.11492919921875,
0.031524658203125,
-0.027923583984375,
-0.004913330078125,
-0.03607177734375,
0.061279296875,
-0.049774169921875,
0.04022216796875,
-0.007350921630859375,
0.0258941650390625,
0.00931549072265625,
-0.0753173828125,
-0.03387451171875,
0.0122833251953125,
-0.015625,
-0.00534820556640625,
-0.0275115966796875,
0.005474090576171875,
0.03704833984375,
0.042022705078125,
-0.03387451171875,
0.005023956298828125,
-0.053955078125,
-0.03125,
0.05523681640625,
0.005184173583984375,
0.021728515625,
-0.0256195068359375,
-0.060577392578125,
-0.033721923828125,
-0.0233917236328125,
0.009490966796875,
0.018768310546875,
0.00136566162109375,
-0.0216217041015625,
0.03448486328125,
-0.003978729248046875,
0.056640625,
0.0341796875,
-0.00879669189453125,
0.0379638671875,
-0.009674072265625,
-0.026641845703125,
0.0047760009765625,
0.0693359375,
0.033203125,
0.000014185905456542969,
0.00482940673828125,
-0.00765228271484375,
0.0089111328125,
0.024810791015625,
-0.09869384765625,
-0.017578125,
0.00618743896484375,
-0.039337158203125,
-0.022308349609375,
0.01415252685546875,
-0.058807373046875,
-0.00445556640625,
-0.01462554931640625,
0.036895751953125,
-0.022705078125,
-0.005138397216796875,
0.00875091552734375,
-0.00809478759765625,
0.055908203125,
0.0199737548828125,
-0.06341552734375,
0.017425537109375,
0.039764404296875,
0.05914306640625,
-0.0159912109375,
-0.00835418701171875,
-0.01013946533203125,
-0.006526947021484375,
-0.0126495361328125,
0.065673828125,
-0.0208587646484375,
-0.0263671875,
-0.0239410400390625,
0.0474853515625,
-0.0210113525390625,
-0.04815673828125,
0.060028076171875,
-0.040863037109375,
0.017333984375,
-0.00287628173828125,
-0.029632568359375,
-0.04559326171875,
0.0223541259765625,
-0.0478515625,
0.0693359375,
0.0178680419921875,
-0.060699462890625,
0.03656005859375,
-0.04254150390625,
-0.0207061767578125,
-0.0047760009765625,
0.00391387939453125,
-0.0679931640625,
0.00479888916015625,
0.03485107421875,
0.035919189453125,
-0.01531982421875,
0.01459503173828125,
-0.041748046875,
-0.01250457763671875,
-0.002170562744140625,
-0.01617431640625,
0.07611083984375,
0.024932861328125,
-0.0235748291015625,
0.03460693359375,
-0.0516357421875,
0.00012969970703125,
0.036285400390625,
0.0029354095458984375,
-0.0012845993041992188,
-0.025909423828125,
0.0180511474609375,
0.033447265625,
0.0183563232421875,
-0.051116943359375,
0.0037479400634765625,
-0.02587890625,
0.0323486328125,
0.04925537109375,
0.0095672607421875,
0.03533935546875,
-0.00778961181640625,
0.02374267578125,
0.01297760009765625,
0.034576416015625,
-0.012115478515625,
-0.018096923828125,
-0.08441162109375,
-0.031646728515625,
0.0177001953125,
0.01041412353515625,
-0.03070068359375,
0.048553466796875,
-0.01528167724609375,
-0.04815673828125,
-0.039093017578125,
-0.001068115234375,
0.004230499267578125,
0.03875732421875,
0.039398193359375,
-0.03436279296875,
-0.055755615234375,
-0.087890625,
0.011505126953125,
0.0181884765625,
-0.0051422119140625,
0.0298309326171875,
0.04278564453125,
-0.051116943359375,
0.06024169921875,
-0.058746337890625,
-0.0245819091796875,
-0.01410675048828125,
-0.00577545166015625,
0.0256195068359375,
0.04412841796875,
0.04815673828125,
-0.06378173828125,
-0.0277099609375,
-0.016845703125,
-0.046783447265625,
-0.006626129150390625,
0.0099029541015625,
-0.0259857177734375,
0.01262664794921875,
0.032958984375,
-0.04193115234375,
0.031890869140625,
0.036285400390625,
-0.046356201171875,
0.0263214111328125,
-0.006988525390625,
-0.0015420913696289062,
-0.076416015625,
0.013031005859375,
0.013946533203125,
-0.0189056396484375,
-0.038726806640625,
0.01180267333984375,
-0.001171112060546875,
-0.01406097412109375,
-0.046722412109375,
0.042327880859375,
-0.0260772705078125,
-0.00022327899932861328,
-0.0186767578125,
-0.01383209228515625,
0.01300811767578125,
0.05914306640625,
0.0137481689453125,
0.0230712890625,
0.03765869140625,
-0.053192138671875,
0.0167999267578125,
0.040130615234375,
-0.02655029296875,
0.0384521484375,
-0.0809326171875,
0.005855560302734375,
-0.0075225830078125,
0.00836181640625,
-0.055908203125,
-0.0260772705078125,
0.0296630859375,
-0.0235595703125,
0.033843994140625,
-0.022613525390625,
-0.0169219970703125,
-0.044464111328125,
-0.016510009765625,
0.029632568359375,
0.038818359375,
-0.06036376953125,
0.04364013671875,
0.03961181640625,
0.00977325439453125,
-0.0207366943359375,
-0.0460205078125,
-0.0249786376953125,
-0.02886962890625,
-0.08038330078125,
0.050537109375,
-0.0034732818603515625,
0.0147705078125,
0.003185272216796875,
-0.0272979736328125,
-0.0022945404052734375,
-0.00562286376953125,
0.02752685546875,
0.037811279296875,
-0.0109405517578125,
-0.03082275390625,
0.0014028549194335938,
-0.03179931640625,
0.00954437255859375,
-0.009552001953125,
0.049774169921875,
-0.0254364013671875,
-0.0272369384765625,
-0.0222930908203125,
-0.001461029052734375,
0.037109375,
-0.0212249755859375,
0.03912353515625,
0.09136962890625,
-0.0246734619140625,
-0.005626678466796875,
-0.04144287109375,
-0.021484375,
-0.042877197265625,
0.0239105224609375,
-0.016357421875,
-0.08111572265625,
0.04278564453125,
0.0026226043701171875,
0.004505157470703125,
0.07476806640625,
0.036468505859375,
0.01105499267578125,
0.09405517578125,
0.04620361328125,
0.0309906005859375,
0.039306640625,
-0.061981201171875,
0.0141448974609375,
-0.07928466796875,
-0.042572021484375,
-0.0302581787109375,
-0.031829833984375,
-0.05718994140625,
-0.05255126953125,
0.0307769775390625,
0.01287078857421875,
-0.0305023193359375,
0.04498291015625,
-0.06658935546875,
0.0147247314453125,
0.0374755859375,
0.00572967529296875,
-0.01114654541015625,
0.0065155029296875,
-0.011688232421875,
0.00591278076171875,
-0.05474853515625,
-0.0284271240234375,
0.0263519287109375,
0.042144775390625,
0.057708740234375,
-0.01398468017578125,
0.04541015625,
-0.00626373291015625,
-0.0016422271728515625,
-0.07049560546875,
0.0460205078125,
-0.008697509765625,
-0.05303955078125,
-0.0088043212890625,
-0.0235137939453125,
-0.0751953125,
0.03399658203125,
-0.0112152099609375,
-0.066650390625,
0.04864501953125,
0.00803375244140625,
-0.017791748046875,
0.022918701171875,
-0.048980712890625,
0.09130859375,
-0.01522064208984375,
-0.031036376953125,
0.008758544921875,
-0.053436279296875,
0.0180816650390625,
0.020294189453125,
-0.006938934326171875,
-0.031585693359375,
0.0217132568359375,
0.0716552734375,
-0.0526123046875,
0.05181884765625,
-0.025360107421875,
0.0161285400390625,
0.046844482421875,
-0.0080718994140625,
0.0275115966796875,
0.0022754669189453125,
0.019287109375,
0.03863525390625,
0.0182342529296875,
-0.026275634765625,
-0.03125,
0.05059814453125,
-0.06231689453125,
-0.04486083984375,
-0.0318603515625,
-0.021759033203125,
0.0009775161743164062,
0.0269775390625,
0.03692626953125,
0.032318115234375,
-0.00726318359375,
0.033935546875,
0.0489501953125,
-0.027374267578125,
0.03851318359375,
0.0133056640625,
-0.0118255615234375,
-0.032012939453125,
0.06793212890625,
-0.0112457275390625,
0.0016632080078125,
0.0205078125,
0.0222015380859375,
-0.036590576171875,
-0.016265869140625,
-0.03509521484375,
0.0232696533203125,
-0.047637939453125,
-0.030303955078125,
-0.06634521484375,
-0.04339599609375,
-0.034149169921875,
-0.0213165283203125,
-0.036590576171875,
-0.02471923828125,
-0.02996826171875,
0.0013523101806640625,
0.02752685546875,
0.02850341796875,
-0.012420654296875,
0.02606201171875,
-0.0521240234375,
0.0177001953125,
0.028961181640625,
0.026885986328125,
-0.0004456043243408203,
-0.047210693359375,
-0.009429931640625,
0.0005021095275878906,
-0.041595458984375,
-0.039306640625,
0.04541015625,
0.006145477294921875,
0.040069580078125,
0.0435791015625,
-0.005840301513671875,
0.07086181640625,
-0.017578125,
0.04351806640625,
0.0294952392578125,
-0.05859375,
0.0272674560546875,
-0.0127410888671875,
0.03863525390625,
0.0302886962890625,
0.021148681640625,
-0.044525146484375,
0.004817962646484375,
-0.06341552734375,
-0.08404541015625,
0.07147216796875,
0.007358551025390625,
0.0018463134765625,
0.0093994140625,
-0.0037078857421875,
0.001064300537109375,
-0.0034923553466796875,
-0.04132080078125,
-0.0272674560546875,
-0.0270538330078125,
-0.015533447265625,
-0.0029773712158203125,
-0.0313720703125,
-0.0003151893615722656,
-0.04034423828125,
0.053466796875,
-0.009490966796875,
0.05157470703125,
0.02142333984375,
-0.023101806640625,
-0.00258636474609375,
-0.0035800933837890625,
0.033355712890625,
0.0218505859375,
-0.020416259765625,
0.007053375244140625,
0.01538848876953125,
-0.031768798828125,
-0.00725555419921875,
0.0278778076171875,
-0.0236053466796875,
-0.0031948089599609375,
0.02569580078125,
0.0845947265625,
0.024322509765625,
-0.0218048095703125,
0.040924072265625,
-0.0005741119384765625,
-0.03814697265625,
-0.02655029296875,
0.0194854736328125,
0.0017976760864257812,
0.0281219482421875,
0.0169830322265625,
0.0287628173828125,
0.0229644775390625,
0.00017142295837402344,
0.0225982666015625,
0.0271453857421875,
-0.054534912109375,
-0.0281982421875,
0.061431884765625,
0.0078582763671875,
-0.00244140625,
0.05499267578125,
-0.0092010498046875,
-0.051971435546875,
0.066650390625,
0.039398193359375,
0.076416015625,
-0.004467010498046875,
0.01885986328125,
0.058319091796875,
0.01023101806640625,
0.01169586181640625,
-0.0099639892578125,
-0.00916290283203125,
-0.05718994140625,
-0.0268096923828125,
-0.07867431640625,
-0.00044918060302734375,
0.00875091552734375,
-0.0517578125,
0.04083251953125,
-0.032623291015625,
-0.012359619140625,
0.017913818359375,
0.00698089599609375,
-0.0770263671875,
0.021636962890625,
0.0181884765625,
0.0745849609375,
-0.04608154296875,
0.035797119140625,
0.059326171875,
-0.0167999267578125,
-0.057830810546875,
-0.035858154296875,
-0.0034618377685546875,
-0.0657958984375,
0.0296478271484375,
0.038177490234375,
0.002758026123046875,
0.00439453125,
-0.050537109375,
-0.0762939453125,
0.09783935546875,
0.01007843017578125,
-0.02313232421875,
-0.0027790069580078125,
0.00011008977890014648,
0.029571533203125,
-0.033172607421875,
0.0266876220703125,
0.0291290283203125,
0.040771484375,
0.05499267578125,
-0.03546142578125,
0.00774383544921875,
-0.0214996337890625,
0.0132904052734375,
0.02655029296875,
-0.06378173828125,
0.045928955078125,
-0.0255889892578125,
-0.0198974609375,
-0.01070404052734375,
0.052032470703125,
0.01058197021484375,
0.021881103515625,
0.053192138671875,
0.060699462890625,
0.0318603515625,
-0.0249786376953125,
0.062469482421875,
-0.0171966552734375,
0.058441162109375,
0.05657958984375,
0.0201416015625,
0.02337646484375,
0.031158447265625,
-0.00913238525390625,
0.032562255859375,
0.07000732421875,
-0.04156494140625,
0.0340576171875,
-0.006378173828125,
0.0119476318359375,
-0.0305938720703125,
-0.0177001953125,
-0.03045654296875,
0.0579833984375,
0.0191802978515625,
-0.046722412109375,
-0.01654052734375,
-0.01174163818359375,
-0.003143310546875,
-0.03302001953125,
-0.016632080078125,
0.049774169921875,
0.011688232421875,
-0.0249786376953125,
0.0482177734375,
0.01143646240234375,
0.0531005859375,
-0.032745361328125,
0.0024509429931640625,
-0.0062408447265625,
0.0209503173828125,
-0.0271453857421875,
-0.03564453125,
0.04541015625,
-0.01678466796875,
-0.004146575927734375,
-0.00836181640625,
0.0736083984375,
-0.02313232421875,
-0.056610107421875,
0.01456451416015625,
0.010986328125,
0.0033779144287109375,
0.0112152099609375,
-0.0701904296875,
0.031707763671875,
0.004360198974609375,
-0.03460693359375,
0.0018463134765625,
0.010498046875,
0.007274627685546875,
0.0382080078125,
0.04388427734375,
-0.0239105224609375,
0.001529693603515625,
-0.0112762451171875,
0.06787109375,
-0.05194091796875,
-0.0294342041015625,
-0.055389404296875,
0.042083740234375,
-0.0237579345703125,
-0.0240936279296875,
0.057525634765625,
0.050048828125,
0.0859375,
-0.019378662109375,
0.0201416015625,
-0.03271484375,
0.0101776123046875,
-0.0159454345703125,
0.041473388671875,
-0.050506591796875,
-0.00809478759765625,
-0.032257080078125,
-0.08148193359375,
-0.025360107421875,
0.06494140625,
-0.0298004150390625,
0.0206298828125,
0.03460693359375,
0.06939697265625,
-0.02484130859375,
-0.0020999908447265625,
0.0199432373046875,
0.0070953369140625,
0.013885498046875,
0.025177001953125,
0.045166015625,
-0.041015625,
0.035736083984375,
-0.054351806640625,
0.0027027130126953125,
-0.0367431640625,
-0.046478271484375,
-0.064453125,
-0.044708251953125,
-0.0377197265625,
-0.0245208740234375,
-0.03033447265625,
0.0657958984375,
0.08245849609375,
-0.06634521484375,
-0.005153656005859375,
0.0027217864990234375,
0.01517486572265625,
-0.0144195556640625,
-0.022491455078125,
0.03607177734375,
-0.00148773193359375,
-0.07281494140625,
-0.00522613525390625,
0.02099609375,
0.01050567626953125,
-0.0025997161865234375,
-0.0182647705078125,
-0.0028018951416015625,
-0.01035308837890625,
0.05029296875,
0.021240234375,
-0.0440673828125,
-0.02484130859375,
0.01457977294921875,
-0.0012216567993164062,
0.017822265625,
0.044708251953125,
-0.03826904296875,
0.03045654296875,
0.0435791015625,
0.03460693359375,
0.07196044921875,
0.007694244384765625,
0.01076507568359375,
-0.03485107421875,
0.0203704833984375,
0.012908935546875,
0.03564453125,
0.030059814453125,
-0.0187530517578125,
0.03900146484375,
0.0248260498046875,
-0.039337158203125,
-0.04541015625,
0.006961822509765625,
-0.0928955078125,
-0.0102996826171875,
0.07562255859375,
0.002422332763671875,
-0.04730224609375,
0.02301025390625,
-0.015106201171875,
0.031463623046875,
-0.0151519775390625,
0.037841796875,
0.0180511474609375,
-0.01229095458984375,
-0.0300140380859375,
-0.01070404052734375,
0.0257720947265625,
0.0010967254638671875,
-0.03826904296875,
-0.039337158203125,
0.033203125,
0.033599853515625,
0.0218963623046875,
0.013031005859375,
-0.0300750732421875,
0.005252838134765625,
0.011871337890625,
0.02630615234375,
-0.0209197998046875,
-0.0166015625,
-0.0165863037109375,
0.01061248779296875,
-0.0121002197265625,
-0.0207672119140625
]
] |
vinvino02/glpn-nyu | 2022-04-14T11:52:30.000Z | [
"transformers",
"pytorch",
"glpn",
"depth-estimation",
"vision",
"arxiv:2201.07436",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | depth-estimation | vinvino02 | null | null | vinvino02/glpn-nyu | 13 | 56,113 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- vision
- depth-estimation
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg
example_title: Tiger
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg
example_title: Teapot
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg
example_title: Palace
---
# GLPN fine-tuned on NYUv2
Global-Local Path Networks (GLPN) model trained on NYUv2 for monocular depth estimation. It was introduced in the paper [Global-Local Path Networks for Monocular Depth Estimation with Vertical CutDepth](https://arxiv.org/abs/2201.07436) by Kim et al. and first released in [this repository](https://github.com/vinvino02/GLPDepth).
Disclaimer: The team releasing GLPN did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
GLPN uses SegFormer as its backbone and adds a lightweight head on top for depth estimation.

## Intended uses & limitations
You can use the raw model for monocular depth estimation. See the [model hub](https://huggingface.co/models?search=glpn) to look for
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model:
```python
from transformers import GLPNFeatureExtractor, GLPNForDepthEstimation
import torch
import numpy as np
from PIL import Image
import requests
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
feature_extractor = GLPNFeatureExtractor.from_pretrained("vinvino02/glpn-nyu")
model = GLPNForDepthEstimation.from_pretrained("vinvino02/glpn-nyu")
# prepare image for the model
inputs = feature_extractor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
predicted_depth = outputs.predicted_depth
# interpolate to original size
prediction = torch.nn.functional.interpolate(
predicted_depth.unsqueeze(1),
size=image.size[::-1],
mode="bicubic",
align_corners=False,
)
# visualize the prediction
output = prediction.squeeze().cpu().numpy()
formatted = (output * 255 / np.max(output)).astype("uint8")
depth = Image.fromarray(formatted)
```
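The interpolation and normalization steps at the end of the snippet can be checked in isolation with a dummy tensor. This is a minimal sketch assuming a hypothetical low-resolution depth map of shape `(1, 120, 160)` and an original image of size 640×480; it does not require downloading the model.

```python
import torch
import numpy as np

# Dummy "predicted depth" map at a hypothetical low output resolution (batch, H, W)
predicted_depth = torch.rand(1, 120, 160)

# Upsample to the assumed original image size (H, W) = (480, 640);
# note that PIL's Image.size is (W, H), hence the [::-1] in the snippet above
prediction = torch.nn.functional.interpolate(
    predicted_depth.unsqueeze(1),  # add a channel dimension: (batch, 1, H, W)
    size=(480, 640),
    mode="bicubic",
    align_corners=False,
)

# Drop batch/channel dims and rescale so the farthest point maps to 255
output = prediction.squeeze().numpy()
formatted = (output * 255 / np.max(output)).astype("uint8")
print(formatted.shape)  # (480, 640)
```

The `uint8` array can then be wrapped in `Image.fromarray` exactly as in the snippet above.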
For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/glpn).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2201-07436,
author = {Doyeon Kim and
Woonghyun Ga and
Pyunghwan Ahn and
Donggyu Joo and
Sehwan Chun and
Junmo Kim},
title = {Global-Local Path Networks for Monocular Depth Estimation with Vertical
CutDepth},
journal = {CoRR},
volume = {abs/2201.07436},
year = {2022},
url = {https://arxiv.org/abs/2201.07436},
eprinttype = {arXiv},
eprint = {2201.07436},
timestamp = {Fri, 21 Jan 2022 13:57:15 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2201-07436.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 3,240 | [
[
-0.04351806640625,
-0.050811767578125,
0.01605224609375,
0.0175323486328125,
-0.0289459228515625,
-0.0215606689453125,
0.004093170166015625,
-0.060455322265625,
0.0304412841796875,
0.030059814453125,
-0.05780029296875,
-0.0343017578125,
-0.03515625,
-0.012847900390625,
-0.0156402587890625,
0.072021484375,
0.00962066650390625,
0.007747650146484375,
-0.0137481689453125,
-0.0250396728515625,
-0.0088043212890625,
-0.01070404052734375,
-0.05108642578125,
-0.037200927734375,
0.0357666015625,
0.00719451904296875,
0.062103271484375,
0.04278564453125,
0.0516357421875,
0.0311279296875,
-0.030029296875,
0.0034027099609375,
-0.0246734619140625,
-0.031829833984375,
0.007221221923828125,
-0.0031337738037109375,
-0.0455322265625,
-0.00514984130859375,
0.05059814453125,
0.0517578125,
0.007762908935546875,
0.01739501953125,
0.00616455078125,
0.0570068359375,
-0.0286712646484375,
0.0037059783935546875,
-0.016143798828125,
0.03070068359375,
-0.007568359375,
0.00335693359375,
-0.00621795654296875,
-0.0231475830078125,
0.051544189453125,
-0.036468505859375,
0.0477294921875,
-0.0041351318359375,
0.0885009765625,
0.01499176025390625,
-0.01071929931640625,
-0.00428009033203125,
-0.040771484375,
0.0494384765625,
-0.05413818359375,
0.0161895751953125,
0.0014362335205078125,
0.0271148681640625,
0.0045928955078125,
-0.05499267578125,
-0.0218048095703125,
-0.0158233642578125,
-0.01702880859375,
0.02789306640625,
-0.0082855224609375,
0.01531219482421875,
0.0299835205078125,
0.04986572265625,
-0.052154541015625,
-0.004581451416015625,
-0.072021484375,
-0.0166473388671875,
0.056243896484375,
0.0033931732177734375,
0.0028591156005859375,
-0.038665771484375,
-0.06414794921875,
-0.008209228515625,
-0.027252197265625,
0.035858154296875,
0.0261383056640625,
0.009033203125,
-0.0526123046875,
0.032501220703125,
-0.0143890380859375,
0.060333251953125,
0.0017042160034179688,
-0.01348876953125,
0.043365478515625,
-0.02685546875,
-0.032745361328125,
0.0003135204315185547,
0.0755615234375,
0.0214080810546875,
0.02374267578125,
-0.0153350830078125,
0.01071929931640625,
-0.00841522216796875,
0.00798797607421875,
-0.06689453125,
-0.03765869140625,
0.0183868408203125,
-0.0201263427734375,
-0.024322509765625,
0.01459503173828125,
-0.06121826171875,
-0.0254364013671875,
-0.022247314453125,
0.032257080078125,
-0.0269775390625,
-0.0195770263671875,
0.0197906494140625,
0.0019092559814453125,
0.0299530029296875,
0.0313720703125,
-0.042022705078125,
0.01922607421875,
0.0267791748046875,
0.10906982421875,
-0.005008697509765625,
-0.033477783203125,
-0.02166748046875,
-0.0102386474609375,
-0.01226043701171875,
0.0377197265625,
0.00888824462890625,
-0.0167999267578125,
-0.01155853271484375,
0.0272979736328125,
0.01105499267578125,
-0.03985595703125,
0.0400390625,
-0.01483917236328125,
0.0190277099609375,
-0.00482940673828125,
-0.0184478759765625,
-0.033233642578125,
0.0089263916015625,
-0.042755126953125,
0.060150146484375,
0.007762908935546875,
-0.07806396484375,
0.0223541259765625,
-0.037506103515625,
-0.018890380859375,
-0.01593017578125,
0.0116119384765625,
-0.04986572265625,
0.004009246826171875,
0.033355712890625,
0.03375244140625,
-0.00775909423828125,
0.0177001953125,
-0.04302978515625,
-0.0247955322265625,
-0.0211944580078125,
-0.01284027099609375,
0.0919189453125,
-0.00360107421875,
-0.01493072509765625,
0.043304443359375,
-0.050537109375,
-0.0065155029296875,
0.0190582275390625,
-0.0095977783203125,
-0.0024127960205078125,
-0.0213165283203125,
-0.005359649658203125,
0.0185546875,
0.006938934326171875,
-0.045166015625,
0.02850341796875,
-0.051544189453125,
0.021209716796875,
0.060638427734375,
-0.0018329620361328125,
0.037872314453125,
-0.01390838623046875,
0.03485107421875,
0.0027103424072265625,
0.035003662109375,
0.003261566162109375,
-0.0576171875,
-0.0489501953125,
-0.0224456787109375,
-0.0006728172302246094,
0.0333251953125,
-0.031951904296875,
0.0235748291015625,
-0.02294921875,
-0.0604248046875,
-0.032958984375,
-0.00811767578125,
0.0238037109375,
0.061065673828125,
0.048309326171875,
-0.03875732421875,
-0.0516357421875,
-0.059967041015625,
0.01239013671875,
-0.003482818603515625,
-0.00817108154296875,
0.0258941650390625,
0.03704833984375,
-0.01406097412109375,
0.064697265625,
-0.0311431884765625,
-0.03021240234375,
-0.00897979736328125,
0.01361846923828125,
0.05206298828125,
0.044525146484375,
0.031585693359375,
-0.052001953125,
-0.044708251953125,
-0.01568603515625,
-0.07281494140625,
0.01568603515625,
-0.0027828216552734375,
-0.018707275390625,
0.01088714599609375,
0.007633209228515625,
-0.05413818359375,
0.049224853515625,
0.047210693359375,
-0.031951904296875,
0.065185546875,
-0.0286407470703125,
-0.020477294921875,
-0.06280517578125,
0.01078033447265625,
0.03643798828125,
-0.0211639404296875,
-0.04962158203125,
-0.00038051605224609375,
-0.006122589111328125,
-0.0007338523864746094,
-0.043304443359375,
0.0645751953125,
-0.047027587890625,
-0.017120361328125,
0.00806427001953125,
-0.0009307861328125,
-0.0019989013671875,
0.0521240234375,
0.0103912353515625,
0.03375244140625,
0.080322265625,
-0.045196533203125,
0.05084228515625,
0.03131103515625,
-0.03973388671875,
0.01953125,
-0.07666015625,
0.01270294189453125,
-0.01354217529296875,
0.0093536376953125,
-0.059295654296875,
-0.02056884765625,
0.042449951171875,
-0.03228759765625,
0.03289794921875,
-0.0443115234375,
-0.0030612945556640625,
-0.046875,
-0.02880859375,
0.034027099609375,
0.032989501953125,
-0.050048828125,
0.0287322998046875,
0.0203857421875,
0.00585174560546875,
-0.04693603515625,
-0.052154541015625,
-0.017730712890625,
-0.0162506103515625,
-0.084228515625,
0.0340576171875,
-0.01100921630859375,
0.0080108642578125,
-0.01012420654296875,
-0.01538848876953125,
0.0004239082336425781,
-0.01983642578125,
0.0280303955078125,
0.0369873046875,
-0.0139007568359375,
-0.024566650390625,
-0.0186614990234375,
-0.01152801513671875,
0.01157379150390625,
-0.00872802734375,
0.03717041015625,
-0.03387451171875,
-0.0163421630859375,
-0.0301666259765625,
-0.012115478515625,
0.0231475830078125,
-0.01134490966796875,
0.03790283203125,
0.08221435546875,
-0.0278167724609375,
0.005611419677734375,
-0.0311737060546875,
-0.01403045654296875,
-0.0396728515625,
0.0295257568359375,
-0.01197052001953125,
-0.053009033203125,
0.0611572265625,
0.008026123046875,
-0.03106689453125,
0.032440185546875,
0.0226898193359375,
-0.00286102294921875,
0.05670166015625,
0.040313720703125,
-0.0078277587890625,
0.0307769775390625,
-0.07061767578125,
-0.002780914306640625,
-0.0853271484375,
-0.0318603515625,
-0.0087738037109375,
-0.041961669921875,
-0.04248046875,
-0.038421630859375,
0.046630859375,
0.0399169921875,
-0.026092529296875,
0.025634765625,
-0.054290771484375,
0.01739501953125,
0.057281494140625,
0.020172119140625,
-0.0031337738037109375,
0.0240631103515625,
-0.0211181640625,
-0.0028972625732421875,
-0.03924560546875,
0.005367279052734375,
0.057891845703125,
0.03887939453125,
0.054107666015625,
-0.01149749755859375,
0.03985595703125,
-0.00498199462890625,
0.00714874267578125,
-0.056610107421875,
0.046112060546875,
-0.00400543212890625,
-0.052703857421875,
-0.03363037109375,
-0.037933349609375,
-0.063232421875,
0.01904296875,
-0.0020599365234375,
-0.08843994140625,
0.0266571044921875,
0.021484375,
-0.0206756591796875,
0.041351318359375,
-0.04022216796875,
0.08123779296875,
-0.0163421630859375,
-0.049041748046875,
0.01122283935546875,
-0.070556640625,
0.0265960693359375,
0.027984619140625,
-0.01390838623046875,
-0.01067352294921875,
0.0293121337890625,
0.046051025390625,
-0.032867431640625,
0.04644775390625,
-0.045318603515625,
0.00905609130859375,
0.03375244140625,
0.00008195638656616211,
0.042327880859375,
0.019683837890625,
-0.0009746551513671875,
0.053070068359375,
-0.0174560546875,
-0.0262298583984375,
-0.01824951171875,
0.039306640625,
-0.05413818359375,
-0.0345458984375,
-0.03778076171875,
-0.044464111328125,
-0.0006575584411621094,
0.0208740234375,
0.056915283203125,
0.048065185546875,
-0.0005884170532226562,
0.01226043701171875,
0.02978515625,
-0.0216064453125,
0.03167724609375,
-0.007537841796875,
-0.0343017578125,
-0.041351318359375,
0.05108642578125,
0.009033203125,
0.015777587890625,
0.01715087890625,
0.034088134765625,
-0.0302734375,
-0.03662109375,
-0.03741455078125,
0.033233642578125,
-0.044158935546875,
-0.0255279541015625,
-0.02374267578125,
-0.03704833984375,
-0.03631591796875,
-0.023223876953125,
-0.029083251953125,
-0.01294708251953125,
-0.00946807861328125,
-0.0015850067138671875,
0.0247039794921875,
0.04931640625,
-0.034698486328125,
0.0171966552734375,
-0.033905029296875,
0.02667236328125,
0.0185546875,
0.02569580078125,
-0.00830078125,
-0.0472412109375,
-0.0270538330078125,
0.00597381591796875,
-0.0204620361328125,
-0.05145263671875,
0.0380859375,
0.0157318115234375,
0.01715087890625,
0.03228759765625,
-0.0190582275390625,
0.06317138671875,
-0.025177001953125,
0.042755126953125,
0.0377197265625,
-0.051788330078125,
0.047149658203125,
-0.0269775390625,
0.046539306640625,
0.0131683349609375,
0.042572021484375,
-0.031585693359375,
-0.006397247314453125,
-0.037506103515625,
-0.0750732421875,
0.071533203125,
0.0019121170043945312,
0.0001518726348876953,
0.03839111328125,
0.0225982666015625,
0.00015091896057128906,
0.006107330322265625,
-0.06585693359375,
-0.032318115234375,
-0.0312347412109375,
0.007904052734375,
-0.01477813720703125,
-0.0124053955078125,
-0.00335693359375,
-0.06005859375,
0.0540771484375,
-0.016815185546875,
0.048980712890625,
0.05303955078125,
0.006107330322265625,
-0.0157318115234375,
-0.0272064208984375,
0.0335693359375,
0.056121826171875,
-0.049560546875,
-0.00795745849609375,
0.01099395751953125,
-0.042083740234375,
-0.012115478515625,
0.01512908935546875,
-0.0283203125,
0.00644683837890625,
0.038848876953125,
0.08013916015625,
0.0006117820739746094,
-0.0017614364624023438,
0.03826904296875,
0.01393890380859375,
-0.027984619140625,
-0.030426025390625,
-0.006008148193359375,
-0.00019359588623046875,
0.029327392578125,
0.016815185546875,
0.041412353515625,
0.0177459716796875,
-0.004810333251953125,
0.003971099853515625,
0.03778076171875,
-0.043304443359375,
-0.043701171875,
0.044586181640625,
-0.0177459716796875,
-0.0006346702575683594,
0.048126220703125,
-0.0206451416015625,
-0.03424072265625,
0.052337646484375,
0.025390625,
0.0748291015625,
-0.00537109375,
0.0310516357421875,
0.0709228515625,
0.0197601318359375,
0.00803375244140625,
0.01190185546875,
0.00946807861328125,
-0.05035400390625,
-0.02972412109375,
-0.057891845703125,
-0.0193939208984375,
0.031158447265625,
-0.03857421875,
0.0212249755859375,
-0.04931640625,
-0.00786590576171875,
0.0005602836608886719,
0.020233154296875,
-0.0765380859375,
0.02288818359375,
0.0188751220703125,
0.08062744140625,
-0.036834716796875,
0.06292724609375,
0.055389404296875,
-0.039215087890625,
-0.06304931640625,
-0.0125579833984375,
0.0036468505859375,
-0.059051513671875,
0.03204345703125,
0.0310516357421875,
-0.01453399658203125,
-0.007476806640625,
-0.055633544921875,
-0.07110595703125,
0.1005859375,
0.040618896484375,
-0.01538848876953125,
-0.024810791015625,
0.00958251953125,
0.034149169921875,
-0.016571044921875,
0.0152435302734375,
-0.0038013458251953125,
0.032867431640625,
0.0272064208984375,
-0.05780029296875,
0.0012493133544921875,
-0.0164031982421875,
0.0116119384765625,
-0.00032019615173339844,
-0.047607421875,
0.0797119140625,
-0.03668212890625,
-0.007381439208984375,
0.0209197998046875,
0.05218505859375,
0.01215362548828125,
0.016693115234375,
0.033203125,
0.05889892578125,
0.04278564453125,
-0.021484375,
0.07977294921875,
-0.0008950233459472656,
0.056671142578125,
0.07989501953125,
0.01345062255859375,
0.0261688232421875,
0.02569580078125,
-0.020782470703125,
0.033111572265625,
0.0516357421875,
-0.0278167724609375,
0.052734375,
0.01611328125,
0.00469207763671875,
-0.01059722900390625,
0.004665374755859375,
-0.045135498046875,
0.036224365234375,
-0.006381988525390625,
-0.01371002197265625,
-0.0073394775390625,
-0.0084075927734375,
-0.00870513916015625,
-0.042755126953125,
-0.034942626953125,
0.0301513671875,
0.0015268325805664062,
-0.035369873046875,
0.05096435546875,
-0.01061248779296875,
0.05938720703125,
-0.049560546875,
0.006214141845703125,
-0.0296478271484375,
0.0279541015625,
-0.023040771484375,
-0.056549072265625,
0.020660400390625,
-0.0193939208984375,
-0.0007710456848144531,
0.00875091552734375,
0.07354736328125,
-0.0216064453125,
-0.0546875,
0.0477294921875,
0.0247802734375,
0.0206451416015625,
-0.0035686492919921875,
-0.06756591796875,
0.0015382766723632812,
-0.003108978271484375,
-0.041778564453125,
0.00772857666015625,
0.0255279541015625,
0.032135009765625,
0.05120849609375,
0.036224365234375,
0.018157958984375,
0.0185394287109375,
-0.00921630859375,
0.059722900390625,
-0.0310821533203125,
-0.0238037109375,
-0.048309326171875,
0.0740966796875,
-0.016326904296875,
-0.04443359375,
0.05206298828125,
0.046966552734375,
0.09442138671875,
-0.0192108154296875,
0.0296173095703125,
-0.00775146484375,
0.0172271728515625,
-0.039947509765625,
0.0465087890625,
-0.05987548828125,
-0.01088714599609375,
-0.04388427734375,
-0.08209228515625,
-0.02105712890625,
0.062744140625,
-0.0247039794921875,
0.022430419921875,
0.031494140625,
0.0635986328125,
-0.040435791015625,
-0.012451171875,
0.011932373046875,
0.01459503173828125,
0.01001739501953125,
0.0284423828125,
0.0226898193359375,
-0.05126953125,
0.0241546630859375,
-0.07025146484375,
-0.00914764404296875,
-0.01099395751953125,
-0.0638427734375,
-0.0474853515625,
-0.0401611328125,
-0.035858154296875,
-0.03240966796875,
-0.004688262939453125,
0.03900146484375,
0.08319091796875,
-0.0623779296875,
-0.03192138671875,
-0.0232696533203125,
0.003849029541015625,
-0.01332855224609375,
-0.0182037353515625,
0.0369873046875,
0.0013685226440429688,
-0.04443359375,
0.01898193359375,
0.0175018310546875,
0.004192352294921875,
-0.0282135009765625,
-0.0386962890625,
-0.03411865234375,
-0.0285491943359375,
0.0230560302734375,
0.0290985107421875,
-0.052978515625,
-0.037384033203125,
0.0008769035339355469,
0.00377655029296875,
0.0266876220703125,
0.0186920166015625,
-0.056488037109375,
0.0511474609375,
0.035552978515625,
0.016448974609375,
0.0615234375,
-0.00745391845703125,
0.008148193359375,
-0.0672607421875,
0.02288818359375,
0.005832672119140625,
0.04986572265625,
0.03961181640625,
-0.0160980224609375,
0.0677490234375,
0.03314208984375,
-0.0518798828125,
-0.04266357421875,
0.0125732421875,
-0.0892333984375,
0.01076507568359375,
0.071044921875,
-0.0286102294921875,
-0.0186004638671875,
0.0294952392578125,
-0.0155792236328125,
0.03466796875,
-0.00444793701171875,
0.041748046875,
0.0228729248046875,
-0.006473541259765625,
-0.0235748291015625,
-0.03436279296875,
0.039154052734375,
0.008209228515625,
-0.040771484375,
-0.025604248046875,
0.031494140625,
0.0161590576171875,
0.0462646484375,
0.039794921875,
0.003253936767578125,
0.00992584228515625,
-0.0081939697265625,
0.0175933837890625,
-0.024322509765625,
-0.046051025390625,
-0.032562255859375,
0.002307891845703125,
-0.0234375,
-0.01084136962890625
]
] |
Babelscape/wikineural-multilingual-ner | 2023-05-23T08:47:23.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"bert",
"token-classification",
"named-entity-recognition",
"sequence-tagger-model",
"de",
"en",
"es",
"fr",
"it",
"nl",
"pl",
"pt",
"ru",
"multilingual",
"dataset:Babelscape/wikineural",
"license:cc-by-nc-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | Babelscape | null | null | Babelscape/wikineural-multilingual-ner | 55 | 56,034 | transformers | 2022-03-02T23:29:04 | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
widget:
- text: My name is Wolfgang and I live in Berlin.
- text: George Washington went to Washington.
- text: Mi nombre es Sarah y vivo en Londres.
- text: Меня зовут Симона, и я живу в Риме.
tags:
- named-entity-recognition
- sequence-tagger-model
datasets:
- Babelscape/wikineural
language:
- de
- en
- es
- fr
- it
- nl
- pl
- pt
- ru
- multilingual
license:
- cc-by-nc-sa-4.0
pretty_name: wikineural-dataset
source_datasets:
- original
task_categories:
- structure-prediction
task_ids:
- named-entity-recognition
---
# WikiNEuRal: Combined Neural and Knowledge-based Silver Data Creation for Multilingual NER
This is the model card for the EMNLP 2021 paper [WikiNEuRal: Combined Neural and Knowledge-based Silver Data Creation for Multilingual NER](https://aclanthology.org/2021.findings-emnlp.215/). We fine-tuned a multilingual language model (mBERT) for 3 epochs on our [WikiNEuRal dataset](https://huggingface.co/datasets/Babelscape/wikineural) for Named Entity Recognition (NER). The resulting multilingual NER model supports the 9 languages covered by WikiNEuRal (de, en, es, fr, it, nl, pl, pt, ru), and it was trained on all 9 languages jointly.
**If you use the model, please reference this work in your paper**:
```bibtex
@inproceedings{tedeschi-etal-2021-wikineural-combined,
title = "{W}iki{NE}u{R}al: {C}ombined Neural and Knowledge-based Silver Data Creation for Multilingual {NER}",
author = "Tedeschi, Simone and
Maiorca, Valentino and
Campolungo, Niccol{\`o} and
Cecconi, Francesco and
Navigli, Roberto",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
month = nov,
year = "2021",
address = "Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-emnlp.215",
pages = "2521--2533",
abstract = "Multilingual Named Entity Recognition (NER) is a key intermediate task which is needed in many areas of NLP. In this paper, we address the well-known issue of data scarcity in NER, especially relevant when moving to a multilingual scenario, and go beyond current approaches to the creation of multilingual silver data for the task. We exploit the texts of Wikipedia and introduce a new methodology based on the effective combination of knowledge-based approaches and neural models, together with a novel domain adaptation technique, to produce high-quality training corpora for NER. We evaluate our datasets extensively on standard benchmarks for NER, yielding substantial improvements up to 6 span-based F1-score points over previous state-of-the-art systems for data creation.",
}
```
The original repository for the paper can be found at [https://github.com/Babelscape/wikineural](https://github.com/Babelscape/wikineural).
## How to use
You can use this model with the Transformers *pipeline* for NER.
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline
tokenizer = AutoTokenizer.from_pretrained("Babelscape/wikineural-multilingual-ner")
model = AutoModelForTokenClassification.from_pretrained("Babelscape/wikineural-multilingual-ner")
nlp = pipeline("ner", model=model, tokenizer=tokenizer, grouped_entities=True)
example = "My name is Wolfgang and I live in Berlin"
ner_results = nlp(example)
print(ner_results)
```
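With `grouped_entities=True`, the pipeline returns one dictionary per entity span rather than per token. The sketch below post-processes such output without running the model; the `ner_results` list shown is hypothetical (real scores depend on the model weights), but the keys match the pipeline's grouped output format.

```python
# Hypothetical grouped pipeline output for the example sentence above;
# actual scores come from the model and will differ
ner_results = [
    {"entity_group": "PER", "score": 0.99, "word": "Wolfgang", "start": 11, "end": 19},
    {"entity_group": "LOC", "score": 0.99, "word": "Berlin", "start": 34, "end": 40},
]

# Collect recognized surface forms by entity type
by_type = {}
for ent in ner_results:
    by_type.setdefault(ent["entity_group"], []).append(ent["word"])
print(by_type)  # {'PER': ['Wolfgang'], 'LOC': ['Berlin']}
```

The `start`/`end` character offsets refer to the original input string, which makes it easy to highlight entities in the source text.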
## Limitations and bias
This model is trained on WikiNEuRal, a state-of-the-art dataset for Multilingual NER automatically derived from Wikipedia. Therefore, it might not generalize well to all textual genres (e.g. news). On the other hand, models trained only on news articles (e.g. only on CoNLL03) have been proven to obtain much lower scores on encyclopedic articles. To obtain more robust systems, we encourage you to train a system on the combination of WikiNEuRal with other datasets (e.g. WikiNEuRal + CoNLL).
## Licensing Information
Contents of this repository are restricted to only non-commercial research purposes under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/). Copyright of the dataset contents and models belongs to the original copyright holders. | 4,365 | [
[
-0.036895751953125,
-0.043365478515625,
0.00856781005859375,
0.006877899169921875,
0.006134033203125,
-0.01393890380859375,
-0.03515625,
-0.039093017578125,
0.0306854248046875,
0.022613525390625,
-0.0272216796875,
-0.0419921875,
-0.041534423828125,
0.030609130859375,
-0.0290679931640625,
0.09222412109375,
0.0008411407470703125,
0.01183319091796875,
-0.0225067138671875,
-0.013824462890625,
-0.015716552734375,
-0.0455322265625,
-0.06298828125,
-0.0204925537109375,
0.054534912109375,
0.0300750732421875,
0.0164947509765625,
0.020294189453125,
0.0198974609375,
0.0233306884765625,
-0.00362396240234375,
0.03759765625,
-0.0197601318359375,
0.0023517608642578125,
-0.014434814453125,
-0.01434326171875,
-0.034088134765625,
-0.01265716552734375,
0.040008544921875,
0.049957275390625,
-0.0120849609375,
0.00688934326171875,
0.007595062255859375,
0.0413818359375,
-0.032073974609375,
0.01024627685546875,
-0.05755615234375,
0.006748199462890625,
-0.01305389404296875,
-0.003337860107421875,
-0.03631591796875,
-0.0002646446228027344,
0.0097503662109375,
-0.057647705078125,
0.0188751220703125,
0.0007076263427734375,
0.0777587890625,
-0.0003638267517089844,
-0.0362548828125,
-0.0127716064453125,
-0.039703369140625,
0.048095703125,
-0.0718994140625,
0.039581298828125,
0.029083251953125,
-0.0021190643310546875,
-0.005916595458984375,
-0.05145263671875,
-0.0626220703125,
-0.0171051025390625,
-0.0017547607421875,
0.015716552734375,
-0.015960693359375,
-0.0265960693359375,
0.0241241455078125,
-0.0021800994873046875,
-0.043060302734375,
0.01203155517578125,
-0.045989990234375,
-0.0229644775390625,
0.042205810546875,
-0.00428009033203125,
0.01338958740234375,
-0.03192138671875,
-0.0110015869140625,
-0.004116058349609375,
-0.040008544921875,
-0.01788330078125,
0.04443359375,
0.0288238525390625,
-0.0213165283203125,
0.04571533203125,
-0.0278167724609375,
0.061553955078125,
0.0270233154296875,
-0.01483154296875,
0.0565185546875,
-0.0198516845703125,
0.0171661376953125,
0.003566741943359375,
0.07220458984375,
0.0310211181640625,
0.0245513916015625,
-0.0364990234375,
-0.0078125,
-0.015289306640625,
0.004550933837890625,
-0.04547119140625,
-0.020538330078125,
-0.00284576416015625,
-0.0189666748046875,
-0.0157318115234375,
0.0007929801940917969,
-0.0537109375,
-0.0105438232421875,
-0.03466796875,
0.03173828125,
-0.036590576171875,
-0.04058837890625,
-0.00844573974609375,
-0.0027065277099609375,
0.00620269775390625,
0.0035419464111328125,
-0.06646728515625,
0.0211639404296875,
0.03271484375,
0.055389404296875,
-0.0200347900390625,
-0.058135986328125,
-0.02935791015625,
0.0166168212890625,
-0.0139923095703125,
0.06280517578125,
-0.01247406005859375,
0.0027618408203125,
-0.0247802734375,
0.0172576904296875,
-0.012786865234375,
-0.031829833984375,
0.0200042724609375,
-0.035980224609375,
0.026336669921875,
0.0003979206085205078,
-0.04266357421875,
-0.0190582275390625,
0.033477783203125,
-0.0643310546875,
0.0791015625,
-0.002475738525390625,
-0.0821533203125,
0.029876708984375,
-0.0484619140625,
-0.0321044921875,
-0.007568359375,
0.00406646728515625,
-0.0171966552734375,
0.00695037841796875,
0.0149688720703125,
0.0298309326171875,
-0.00732421875,
0.043487548828125,
-0.0218963623046875,
0.0045013427734375,
0.028594970703125,
-0.04522705078125,
0.05303955078125,
0.005702972412109375,
-0.0287017822265625,
-0.0269012451171875,
-0.07171630859375,
0.00749969482421875,
-0.0075531005859375,
-0.0269775390625,
-0.0262603759765625,
-0.006290435791015625,
0.0316162109375,
0.038330078125,
0.0217437744140625,
-0.046142578125,
-0.008636474609375,
-0.05322265625,
0.0274658203125,
0.042022705078125,
-0.0019435882568359375,
0.041046142578125,
-0.01140594482421875,
0.0191192626953125,
0.018218994140625,
0.0026836395263671875,
0.0218658447265625,
-0.03515625,
-0.07757568359375,
0.0111846923828125,
0.06591796875,
0.048248291015625,
-0.0634765625,
0.03790283203125,
-0.03460693359375,
-0.0474853515625,
-0.0357666015625,
-0.0037689208984375,
0.0147552490234375,
0.050628662109375,
0.0408935546875,
-0.01300811767578125,
-0.044830322265625,
-0.0697021484375,
-0.0022258758544921875,
0.0037975311279296875,
0.020263671875,
0.018768310546875,
0.04827880859375,
-0.019805908203125,
0.0675048828125,
-0.0146331787109375,
-0.0196075439453125,
-0.007259368896484375,
-0.00009495019912719727,
0.0233001708984375,
0.04833984375,
0.0252532958984375,
-0.0760498046875,
-0.05657958984375,
0.0240478515625,
-0.06817626953125,
0.0026111602783203125,
0.0034770965576171875,
-0.01248931884765625,
0.034820556640625,
0.039398193359375,
-0.05181884765625,
0.00826263427734375,
0.04217529296875,
-0.0189971923828125,
0.039459228515625,
-0.0272216796875,
-0.00125885009765625,
-0.10479736328125,
0.0254669189453125,
-0.0027980804443359375,
0.007396697998046875,
-0.05242919921875,
-0.01068878173828125,
0.0046539306640625,
-0.0119476318359375,
-0.034912109375,
0.06903076171875,
-0.046905517578125,
0.017669677734375,
-0.0149078369140625,
0.00997161865234375,
0.00022363662719726562,
0.033233642578125,
0.01113128662109375,
0.03192138671875,
0.045623779296875,
-0.0523681640625,
0.033905029296875,
0.0211639404296875,
-0.01514434814453125,
0.054473876953125,
-0.0389404296875,
0.004459381103515625,
-0.01381683349609375,
-0.0015726089477539062,
-0.01812744140625,
-0.0167236328125,
0.0234527587890625,
-0.048553466796875,
0.049407958984375,
-0.0162353515625,
-0.04449462890625,
-0.01522064208984375,
0.0140228271484375,
0.01904296875,
0.0084075927734375,
-0.035369873046875,
0.052520751953125,
0.0439453125,
-0.02130126953125,
-0.0673828125,
-0.06817626953125,
0.0265960693359375,
-0.0264129638671875,
-0.036834716796875,
0.04278564453125,
-0.0189666748046875,
-0.0022678375244140625,
0.0225372314453125,
-0.0003597736358642578,
-0.01250457763671875,
-0.0063934326171875,
0.003147125244140625,
0.01727294921875,
-0.007389068603515625,
0.029815673828125,
0.00737762451171875,
-0.0285186767578125,
-0.01148223876953125,
-0.036163330078125,
0.03619384765625,
-0.0059661865234375,
-0.0093841552734375,
-0.017242431640625,
0.03900146484375,
0.0305328369140625,
-0.023529052734375,
0.06390380859375,
0.061065673828125,
-0.0355224609375,
-0.007038116455078125,
-0.056732177734375,
-0.01332855224609375,
-0.031707763671875,
0.037689208984375,
-0.03094482421875,
-0.053192138671875,
0.035186767578125,
0.017730712890625,
0.0011205673217773438,
0.064697265625,
0.027191162109375,
0.002681732177734375,
0.060546875,
0.0419921875,
-0.00968170166015625,
0.01751708984375,
-0.0330810546875,
0.01898193359375,
-0.06365966796875,
-0.038665771484375,
-0.035614013671875,
-0.02056884765625,
-0.0667724609375,
-0.01024627685546875,
0.011322021484375,
0.0032291412353515625,
-0.007068634033203125,
0.0306854248046875,
-0.03857421875,
0.028900146484375,
0.0416259765625,
-0.002536773681640625,
0.0157318115234375,
0.0250701904296875,
-0.0166473388671875,
-0.0180511474609375,
-0.069091796875,
-0.026031494140625,
0.0928955078125,
-0.005008697509765625,
0.0416259765625,
0.002197265625,
0.07135009765625,
0.0014066696166992188,
0.01763916015625,
-0.04620361328125,
0.03271484375,
-0.0263671875,
-0.050628662109375,
-0.01479339599609375,
-0.04974365234375,
-0.08209228515625,
0.01000213623046875,
-0.030670166015625,
-0.045623779296875,
0.03314208984375,
-0.006076812744140625,
-0.00677490234375,
0.0267791748046875,
-0.053314208984375,
0.064453125,
-0.023712158203125,
-0.01348876953125,
-0.001789093017578125,
-0.040374755859375,
-0.003894805908203125,
-0.0215301513671875,
0.030303955078125,
-0.00860595703125,
-0.007106781005859375,
0.0718994140625,
-0.01494598388671875,
0.057342529296875,
-0.0197906494140625,
0.007083892822265625,
0.00357818603515625,
-0.0231475830078125,
0.03375244140625,
0.014984130859375,
-0.0230560302734375,
0.037841796875,
-0.0038166046142578125,
-0.0111236572265625,
-0.0306854248046875,
0.08392333984375,
-0.062744140625,
-0.01461029052734375,
-0.03729248046875,
-0.054931640625,
-0.0074920654296875,
0.032257080078125,
0.043701171875,
0.048126220703125,
-0.0095062255859375,
0.00952911376953125,
0.0401611328125,
-0.0214691162109375,
0.03704833984375,
0.0521240234375,
-0.002593994140625,
-0.0513916015625,
0.08984375,
0.04833984375,
0.0010662078857421875,
0.0303497314453125,
-0.0016660690307617188,
-0.0198822021484375,
-0.050537109375,
-0.0374755859375,
0.031219482421875,
-0.053680419921875,
-0.032470703125,
-0.08056640625,
-0.033111572265625,
-0.032562255859375,
0.01189422607421875,
-0.01462554931640625,
-0.052581787109375,
-0.042266845703125,
-0.00435638427734375,
0.03314208984375,
0.040008544921875,
-0.0275726318359375,
-0.004058837890625,
-0.04779052734375,
0.01678466796875,
0.006591796875,
0.01617431640625,
0.0036334991455078125,
-0.04376220703125,
-0.03900146484375,
0.01263427734375,
-0.00830078125,
-0.058929443359375,
0.05084228515625,
0.035797119140625,
0.062286376953125,
0.005542755126953125,
0.0017986297607421875,
0.041839599609375,
-0.05670166015625,
0.0328369140625,
0.0189666748046875,
-0.05108642578125,
0.024688720703125,
-0.0078277587890625,
0.0182342529296875,
0.055816650390625,
0.040496826171875,
-0.059783935546875,
-0.0300445556640625,
-0.073486328125,
-0.077880859375,
0.044097900390625,
-0.01004791259765625,
0.03192138671875,
-0.0223236083984375,
0.02301025390625,
0.0160064697265625,
0.01322174072265625,
-0.08148193359375,
-0.0288238525390625,
-0.00722503662109375,
-0.0267791748046875,
-0.0078277587890625,
-0.00406646728515625,
0.00728607177734375,
-0.023651123046875,
0.083740234375,
-0.0167236328125,
0.0225830078125,
0.01453399658203125,
-0.0311737060546875,
0.01580810546875,
0.019317626953125,
0.024383544921875,
0.047332763671875,
0.0117645263671875,
0.006870269775390625,
0.03363037109375,
-0.04083251953125,
0.0037860870361328125,
0.03765869140625,
-0.04180908203125,
0.0230560302734375,
0.01158905029296875,
0.061065673828125,
0.0150909423828125,
-0.0247039794921875,
0.032257080078125,
0.005832672119140625,
-0.020263671875,
-0.0389404296875,
-0.0135040283203125,
0.01506805419921875,
0.0189971923828125,
0.033203125,
0.0270538330078125,
0.00836944580078125,
-0.006763458251953125,
0.018280029296875,
0.024200439453125,
-0.0202484130859375,
-0.020904541015625,
0.032073974609375,
0.0016527175903320312,
-0.01448822021484375,
0.058380126953125,
-0.0382080078125,
-0.0285491943359375,
0.04278564453125,
0.0523681640625,
0.04620361328125,
-0.00664520263671875,
0.0270233154296875,
0.0628662109375,
0.027313232421875,
-0.00402069091796875,
0.02044677734375,
0.0131072998046875,
-0.0699462890625,
-0.04217529296875,
-0.0599365234375,
-0.0056304931640625,
0.01178741455078125,
-0.04974365234375,
0.0279083251953125,
-0.01158905029296875,
-0.0218658447265625,
0.0160064697265625,
0.0009560585021972656,
-0.059356689453125,
0.013397216796875,
0.0119476318359375,
0.0654296875,
-0.05828857421875,
0.061798095703125,
0.045074462890625,
-0.0323486328125,
-0.06439208984375,
-0.01322174072265625,
-0.0210418701171875,
-0.0269012451171875,
0.060302734375,
0.016204833984375,
0.0225372314453125,
0.02362060546875,
-0.0196990966796875,
-0.0914306640625,
0.06768798828125,
0.029937744140625,
-0.037506103515625,
-0.024993896484375,
-0.004627227783203125,
0.032196044921875,
-0.025482177734375,
0.0168914794921875,
0.0124053955078125,
0.048248291015625,
-0.01471710205078125,
-0.07330322265625,
-0.0136260986328125,
-0.044769287109375,
-0.016021728515625,
0.01800537109375,
-0.0298004150390625,
0.06378173828125,
-0.0198516845703125,
-0.011810302734375,
-0.00255584716796875,
0.050750732421875,
0.0135955810546875,
0.0216827392578125,
0.0257110595703125,
0.06658935546875,
0.06732177734375,
-0.01001739501953125,
0.061279296875,
-0.0274658203125,
0.032257080078125,
0.087158203125,
-0.00936126708984375,
0.0712890625,
0.0467529296875,
-0.007568359375,
0.048675537109375,
0.040283203125,
-0.03125,
0.05047607421875,
0.0006952285766601562,
-0.0159759521484375,
0.005283355712890625,
-0.012176513671875,
-0.039154052734375,
0.046173095703125,
0.02130126953125,
-0.016693115234375,
-0.01149749755859375,
0.025146484375,
0.014892578125,
-0.0167999267578125,
-0.019622802734375,
0.06439208984375,
0.0084686279296875,
-0.043731689453125,
0.043060302734375,
0.00870513916015625,
0.07904052734375,
-0.049163818359375,
0.00504302978515625,
-0.00876617431640625,
0.0012683868408203125,
-0.027252197265625,
-0.032318115234375,
0.0209197998046875,
0.008758544921875,
-0.0408935546875,
0.0023822784423828125,
0.054473876953125,
-0.055511474609375,
-0.0634765625,
0.027008056640625,
0.0479736328125,
0.0217132568359375,
0.01045989990234375,
-0.056884765625,
-0.032745361328125,
-0.008026123046875,
-0.018096923828125,
0.0299072265625,
0.04534912109375,
0.0016222000122070312,
0.0304718017578125,
0.06317138671875,
0.0312347412109375,
0.0157623291015625,
0.011138916015625,
0.04949951171875,
-0.03851318359375,
-0.033782958984375,
-0.050048828125,
0.038482666015625,
-0.0244293212890625,
-0.0275421142578125,
0.07037353515625,
0.060516357421875,
0.0919189453125,
-0.00916290283203125,
0.058929443359375,
-0.02923583984375,
0.032562255859375,
-0.037200927734375,
0.080322265625,
-0.05108642578125,
-0.01084136962890625,
-0.0223541259765625,
-0.07366943359375,
-0.021820068359375,
0.047332763671875,
-0.0167999267578125,
0.0063323974609375,
0.051666259765625,
0.07318115234375,
-0.01580810546875,
-0.027008056640625,
0.0158538818359375,
0.0156402587890625,
-0.01248931884765625,
0.037445068359375,
0.05621337890625,
-0.0479736328125,
0.043121337890625,
-0.039947509765625,
0.0101776123046875,
0.009765625,
-0.06671142578125,
-0.04693603515625,
-0.057403564453125,
-0.03338623046875,
-0.025665283203125,
-0.007568359375,
0.057342529296875,
0.052276611328125,
-0.07232666015625,
-0.01094818115234375,
-0.00530242919921875,
0.0011434555053710938,
-0.017333984375,
-0.01477813720703125,
0.0309906005859375,
-0.0210418701171875,
-0.05657958984375,
0.03515625,
0.0021800994873046875,
0.004119873046875,
-0.01012420654296875,
-0.0212860107421875,
-0.03314208984375,
-0.02093505859375,
0.039093017578125,
0.0146636962890625,
-0.045989990234375,
-0.0012712478637695312,
0.01401519775390625,
-0.0018148422241210938,
-0.004062652587890625,
0.02996826171875,
-0.053009033203125,
0.0382080078125,
0.025634765625,
0.03863525390625,
0.052215576171875,
-0.0445556640625,
0.023223876953125,
-0.0552978515625,
0.014892578125,
0.01467132568359375,
0.030426025390625,
0.0533447265625,
-0.0256195068359375,
0.035858154296875,
0.026123046875,
-0.03704833984375,
-0.06207275390625,
-0.01029205322265625,
-0.061065673828125,
-0.00942230224609375,
0.09197998046875,
-0.00008940696716308594,
-0.0083465576171875,
-0.0162506103515625,
0.0045013427734375,
0.0301971435546875,
-0.0180816650390625,
0.04241943359375,
0.05706787109375,
0.0175628662109375,
-0.0190887451171875,
-0.05902099609375,
0.0160369873046875,
0.0102081298828125,
-0.0694580078125,
-0.0164794921875,
0.036529541015625,
0.0435791015625,
0.028289794921875,
0.037109375,
-0.00521087646484375,
0.020599365234375,
0.0036067962646484375,
0.024169921875,
-0.02001953125,
-0.03338623046875,
-0.025421142578125,
0.00786590576171875,
-0.006885528564453125,
0.0225677490234375
]
] |
cledoux42/Ethnicity_Test_v003 | 2023-04-09T04:48:14.000Z | [
"transformers",
"pytorch",
"vit",
"image-classification",
"autotrain",
"vision",
"dataset:cledoux42/autotrain-data-ethnicity-test_v003",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | cledoux42 | null | null | cledoux42/Ethnicity_Test_v003 | 0 | 55,999 | transformers | 2023-04-09T04:32:22 | ---
tags:
- autotrain
- vision
- image-classification
datasets:
- cledoux42/autotrain-data-ethnicity-test_v003
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg
example_title: Tiger
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg
example_title: Teapot
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg
example_title: Palace
co2_eq_emissions:
emissions: 6.022813032092885
---
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 47959117029
- CO2 Emissions (in grams): 6.0228
## Validation Metrics
- Loss: 0.530
- Accuracy: 0.796
- Macro F1: 0.797
- Micro F1: 0.796
- Weighted F1: 0.796
- Macro Precision: 0.797
- Micro Precision: 0.796
- Weighted Precision: 0.796
- Macro Recall: 0.798
- Micro Recall: 0.796
- Weighted Recall: 0.796 | 889 | [
[
-0.023162841796875,
-0.0093231201171875,
0.0162506103515625,
0.0012674331665039062,
0.003284454345703125,
0.01068115234375,
0.00540924072265625,
-0.0178985595703125,
-0.02288818359375,
-0.0030574798583984375,
-0.03009033203125,
-0.04376220703125,
-0.045623779296875,
-0.00931549072265625,
-0.03460693359375,
0.06842041015625,
-0.0005865097045898438,
0.0266876220703125,
0.00121307373046875,
-0.01409149169921875,
-0.0478515625,
-0.06622314453125,
-0.069091796875,
-0.01270294189453125,
0.0360107421875,
0.03955078125,
0.0138702392578125,
0.0239715576171875,
0.039306640625,
0.0164794921875,
-0.003173828125,
-0.004833221435546875,
-0.0273284912109375,
-0.051849365234375,
0.00890350341796875,
-0.031982421875,
-0.04345703125,
0.0204010009765625,
0.038543701171875,
0.035797119140625,
-0.01995849609375,
0.037567138671875,
0.019012451171875,
0.029266357421875,
-0.048553466796875,
0.0280609130859375,
-0.03472900390625,
0.023162841796875,
0.017364501953125,
-0.0010156631469726562,
-0.033355712890625,
-0.005306243896484375,
-0.004535675048828125,
-0.0479736328125,
0.0341796875,
0.0243377685546875,
0.0791015625,
0.04425048828125,
-0.03350830078125,
-0.039276123046875,
-0.0195465087890625,
0.05059814453125,
-0.045562744140625,
0.015045166015625,
0.022216796875,
0.0274810791015625,
0.01338958740234375,
-0.03448486328125,
-0.035369873046875,
0.0062713623046875,
-0.036865234375,
0.0294036865234375,
0.014373779296875,
-0.00916290283203125,
0.01422119140625,
0.04119873046875,
-0.043365478515625,
0.02374267578125,
-0.033355712890625,
-0.031463623046875,
0.07403564453125,
0.042755126953125,
0.01131439208984375,
-0.0032482147216796875,
-0.032623291015625,
-0.02801513671875,
-0.0064697265625,
-0.0002617835998535156,
0.030426025390625,
0.0228729248046875,
-0.01983642578125,
0.0309600830078125,
-0.0281524658203125,
0.04144287109375,
0.004322052001953125,
-0.0113372802734375,
0.03955078125,
-0.021636962890625,
-0.040313720703125,
0.000054895877838134766,
0.059783935546875,
0.03338623046875,
-0.0012922286987304688,
0.00982666015625,
0.004669189453125,
0.00988006591796875,
0.00786590576171875,
-0.061309814453125,
-0.040252685546875,
-0.003726959228515625,
-0.022552490234375,
-0.04461669921875,
0.02117919921875,
-0.042327880859375,
0.0216827392578125,
-0.03155517578125,
0.04150390625,
-0.02386474609375,
-0.0273590087890625,
0.01079559326171875,
-0.0186309814453125,
0.0274810791015625,
0.034637451171875,
-0.05859375,
0.01093292236328125,
0.01506805419921875,
0.061737060546875,
-0.009185791015625,
-0.007709503173828125,
0.0136260986328125,
0.017425537109375,
-0.01531982421875,
0.041656494140625,
-0.0203704833984375,
-0.0450439453125,
-0.037353515625,
0.0261993408203125,
-0.030303955078125,
-0.0256805419921875,
0.04400634765625,
-0.01493072509765625,
0.037109375,
-0.01012420654296875,
-0.047637939453125,
-0.048736572265625,
0.0287628173828125,
-0.03009033203125,
0.08905029296875,
0.0210418701171875,
-0.04437255859375,
0.059661865234375,
-0.053619384765625,
-0.01486968994140625,
0.0023441314697265625,
-0.0089569091796875,
-0.05938720703125,
0.0015468597412109375,
-0.0183258056640625,
0.02545166015625,
0.0007047653198242188,
0.0401611328125,
-0.0294036865234375,
0.0011463165283203125,
-0.0281524658203125,
-0.04022216796875,
0.06787109375,
0.0297698974609375,
-0.0011606216430664062,
0.0096893310546875,
-0.08746337890625,
0.0245208740234375,
-0.00991058349609375,
-0.0262451171875,
-0.0106658935546875,
-0.03961181640625,
0.0191497802734375,
0.027679443359375,
0.016082763671875,
-0.030426025390625,
0.0256195068359375,
0.0274810791015625,
0.0229644775390625,
0.045928955078125,
0.006839752197265625,
-0.004215240478515625,
-0.0281829833984375,
0.0147705078125,
0.015625,
0.0292205810546875,
0.04156494140625,
-0.036712646484375,
-0.07672119140625,
-0.02764892578125,
0.026397705078125,
0.0467529296875,
0.00818634033203125,
0.08087158203125,
0.01371002197265625,
-0.07025146484375,
0.001537322998046875,
-0.006755828857421875,
0.005268096923828125,
0.051116943359375,
0.016143798828125,
-0.0260162353515625,
-0.01953125,
-0.056915283203125,
0.00351715087890625,
-0.0106658935546875,
0.015625,
0.018035888671875,
0.07373046875,
-0.0283966064453125,
0.055084228515625,
-0.061553955078125,
-0.030853271484375,
0.033721923828125,
0.048187255859375,
-0.0004706382751464844,
0.047088623046875,
0.06036376953125,
-0.028106689453125,
-0.0439453125,
-0.0226287841796875,
-0.050628662109375,
0.0226593017578125,
0.003265380859375,
-0.0267791748046875,
0.008331298828125,
0.032806396484375,
-0.006450653076171875,
0.056427001953125,
0.0310516357421875,
-0.03472900390625,
0.0309600830078125,
-0.0256500244140625,
0.0185394287109375,
-0.06353759765625,
0.028045654296875,
-0.00754547119140625,
0.0011720657348632812,
-0.00652313232421875,
-0.023406982421875,
0.0107269287109375,
-0.01213836669921875,
-0.03326416015625,
0.0274505615234375,
-0.0218963623046875,
0.00823211669921875,
-0.0029048919677734375,
-0.0316162109375,
0.020782470703125,
0.04296875,
0.01552581787109375,
0.057525634765625,
0.046112060546875,
-0.06365966796875,
0.04022216796875,
0.0237274169921875,
-0.024261474609375,
0.04156494140625,
-0.037109375,
0.01111602783203125,
0.0179901123046875,
0.007228851318359375,
-0.08673095703125,
-0.037384033203125,
-0.0002677440643310547,
-0.026763916015625,
0.019805908203125,
0.0017004013061523438,
-0.04803466796875,
-0.0299224853515625,
0.00937652587890625,
0.0396728515625,
0.01485443115234375,
-0.03021240234375,
0.0071868896484375,
-0.004024505615234375,
0.0177154541015625,
-0.01465606689453125,
-0.051971435546875,
-0.0194854736328125,
-0.0294189453125,
-0.024810791015625,
-0.0011539459228515625,
-0.032958984375,
0.015899658203125,
-0.0040435791015625,
-0.0020236968994140625,
-0.022796630859375,
0.01361846923828125,
0.0121307373046875,
0.01151275634765625,
0.01470184326171875,
0.0270233154296875,
-0.01537322998046875,
-0.015533447265625,
0.0137176513671875,
0.03643798828125,
0.0428466796875,
-0.0255584716796875,
-0.0217132568359375,
-0.043548583984375,
0.00116729736328125,
0.050140380859375,
-0.00577545166015625,
0.054840087890625,
0.0518798828125,
-0.040557861328125,
0.0162506103515625,
-0.009674072265625,
0.002269744873046875,
-0.0292205810546875,
0.03363037109375,
-0.0284271240234375,
-0.0272369384765625,
0.0577392578125,
-0.006595611572265625,
-0.0220489501953125,
0.0924072265625,
0.03448486328125,
0.003345489501953125,
0.08074951171875,
0.01702880859375,
-0.008514404296875,
0.00606536865234375,
-0.0419921875,
0.0036334991455078125,
-0.04827880859375,
-0.06378173828125,
-0.053741455078125,
-0.0092620849609375,
-0.04571533203125,
0.005634307861328125,
0.030364990234375,
0.011810302734375,
-0.0701904296875,
0.040863037109375,
-0.06231689453125,
0.01065826416015625,
0.06884765625,
0.0144500732421875,
0.01561737060546875,
-0.024993896484375,
-0.0018482208251953125,
0.0130462646484375,
-0.0499267578125,
-0.02081298828125,
0.06634521484375,
0.038116455078125,
0.052978515625,
0.0081787109375,
0.04254150390625,
0.0225372314453125,
0.023956298828125,
-0.063232421875,
0.0223388671875,
-0.00023818016052246094,
-0.09356689453125,
-0.035614013671875,
-0.0287933349609375,
-0.03155517578125,
-0.00478363037109375,
-0.0230865478515625,
-0.0032501220703125,
0.006702423095703125,
0.01488494873046875,
-0.047607421875,
0.0345458984375,
-0.0714111328125,
0.09381103515625,
-0.06011962890625,
-0.003093719482421875,
-0.0089874267578125,
-0.031402587890625,
0.0201568603515625,
-0.00774383544921875,
0.01477813720703125,
-0.0178070068359375,
0.00455474853515625,
0.06585693359375,
-0.0330810546875,
0.056304931640625,
-0.0268402099609375,
0.0040283203125,
0.0230560302734375,
-0.032684326171875,
0.022216796875,
0.0025272369384765625,
0.008392333984375,
0.02740478515625,
0.01209259033203125,
-0.0263671875,
-0.0118865966796875,
0.0264892578125,
-0.0738525390625,
0.006130218505859375,
-0.0831298828125,
-0.03436279296875,
-0.003582000732421875,
0.02392578125,
0.05596923828125,
0.03094482421875,
-0.015777587890625,
-0.005054473876953125,
0.03326416015625,
-0.023712158203125,
0.046600341796875,
0.049560546875,
-0.00494384765625,
-0.050750732421875,
0.06536865234375,
0.0167083740234375,
0.03045654296875,
0.00362396240234375,
0.01641845703125,
-0.0244293212890625,
-0.019989013671875,
-0.048095703125,
-0.0110626220703125,
-0.043609619140625,
-0.053863525390625,
-0.032806396484375,
-0.04412841796875,
-0.033172607421875,
0.01433563232421875,
-0.0284271240234375,
-0.010498046875,
-0.058502197265625,
-0.017547607421875,
0.025390625,
0.056121826171875,
-0.0006246566772460938,
0.055267333984375,
-0.0560302734375,
0.0020294189453125,
0.036407470703125,
0.06103515625,
-0.0170135498046875,
-0.06939697265625,
-0.0311737060546875,
-0.0192718505859375,
-0.03472900390625,
-0.0400390625,
0.043243408203125,
0.0234527587890625,
0.033721923828125,
0.043792724609375,
-0.007045745849609375,
0.06353759765625,
-0.0163726806640625,
0.04937744140625,
0.0298309326171875,
-0.0703125,
0.03704833984375,
-0.0069122314453125,
-0.007678985595703125,
0.057373046875,
0.0350341796875,
-0.01158905029296875,
-0.0112152099609375,
-0.07928466796875,
-0.044891357421875,
0.042083740234375,
0.0032176971435546875,
-0.0310516357421875,
0.00868988037109375,
0.030731201171875,
-0.002681732177734375,
0.02105712890625,
-0.061859130859375,
-0.0231475830078125,
-0.021148681640625,
-0.035888671875,
-0.01413726806640625,
0.003871917724609375,
0.0019197463989257812,
-0.068115234375,
0.07177734375,
-0.00634765625,
0.01611328125,
0.009368896484375,
0.01068115234375,
0.0126800537109375,
0.0146636962890625,
0.0732421875,
0.0194549560546875,
-0.04241943359375,
0.01290130615234375,
0.022796630859375,
-0.0247802734375,
0.0301513671875,
-0.0186920166015625,
0.012054443359375,
-0.01561737060546875,
0.015655517578125,
0.03851318359375,
-0.0239410400390625,
0.0003514289855957031,
0.019256591796875,
-0.006443023681640625,
-0.0209197998046875,
-0.061981201171875,
0.024658203125,
-0.0218353271484375,
-0.02801513671875,
0.01393890380859375,
0.0535888671875,
0.0367431640625,
-0.032806396484375,
0.0153350830078125,
0.035888671875,
-0.039306640625,
-0.00316619873046875,
0.054595947265625,
0.0189971923828125,
-0.006023406982421875,
0.053375244140625,
-0.040130615234375,
-0.0418701171875,
0.06243896484375,
0.023651123046875,
0.052978515625,
-0.0276947021484375,
-0.024871826171875,
0.0792236328125,
0.0230255126953125,
-0.022674560546875,
-0.002872467041015625,
0.036865234375,
-0.038787841796875,
-0.01132965087890625,
-0.044189453125,
-0.030242919921875,
0.0236968994140625,
-0.0804443359375,
0.038116455078125,
-0.028778076171875,
-0.0178985595703125,
0.0055389404296875,
0.01454925537109375,
-0.058349609375,
0.0692138671875,
0.011627197265625,
0.07098388671875,
-0.09814453125,
0.062255859375,
0.0338134765625,
-0.042724609375,
-0.072021484375,
-0.043243408203125,
-0.008544921875,
-0.06756591796875,
0.0462646484375,
0.0271148681640625,
-0.0023193359375,
0.00492095947265625,
-0.05218505859375,
-0.06939697265625,
0.08447265625,
-0.00009763240814208984,
-0.069580078125,
0.002056121826171875,
0.00334930419921875,
0.027618408203125,
-0.0065155029296875,
0.03680419921875,
0.0428466796875,
0.04718017578125,
0.004810333251953125,
-0.07659912109375,
-0.0203857421875,
-0.01715087890625,
-0.0286865234375,
0.0064239501953125,
-0.1055908203125,
0.05548095703125,
0.01168060302734375,
-0.0091705322265625,
-0.01140594482421875,
0.01087188720703125,
-0.0123291015625,
0.034576416015625,
0.043609619140625,
0.09796142578125,
0.064697265625,
-0.032501220703125,
0.0523681640625,
-0.027801513671875,
0.06353759765625,
0.0833740234375,
-0.006801605224609375,
0.0293121337890625,
-0.00408935546875,
-0.029449462890625,
0.041046142578125,
0.07586669921875,
-0.02496337890625,
0.040130615234375,
0.017059326171875,
-0.00811004638671875,
-0.0251922607421875,
0.027618408203125,
-0.037628173828125,
0.046661376953125,
0.0237274169921875,
-0.02081298828125,
-0.006359100341796875,
0.00016009807586669922,
-0.0032444000244140625,
-0.0307769775390625,
-0.0213623046875,
0.06353759765625,
-0.026702880859375,
-0.03485107421875,
0.0257415771484375,
-0.0009608268737792969,
0.035552978515625,
-0.01861572265625,
-0.0155181884765625,
0.0098114013671875,
0.02606201171875,
-0.0272064208984375,
-0.031951904296875,
0.0330810546875,
-0.034820556640625,
-0.0272369384765625,
0.00281524658203125,
0.0222015380859375,
-0.032623291015625,
-0.0660400390625,
0.0129547119140625,
0.0021686553955078125,
0.015045166015625,
-0.014007568359375,
-0.051025390625,
-0.0005445480346679688,
0.0021305084228515625,
0.01166534423828125,
0.0018777847290039062,
0.019866943359375,
0.01509857177734375,
0.03643798828125,
0.01204681396484375,
-0.0291595458984375,
0.0178985595703125,
-0.0123138427734375,
0.0345458984375,
-0.061981201171875,
-0.031951904296875,
-0.0504150390625,
0.0192108154296875,
-0.0138397216796875,
-0.048095703125,
0.064453125,
0.0953369140625,
0.069091796875,
-0.009979248046875,
0.0731201171875,
-0.0106964111328125,
0.050048828125,
-0.0257415771484375,
0.043182373046875,
-0.022308349609375,
-0.004657745361328125,
0.0036449432373046875,
-0.024749755859375,
-0.0166015625,
0.062469482421875,
-0.01297760009765625,
0.02459716796875,
0.037750244140625,
0.031982421875,
-0.004596710205078125,
0.0015192031860351562,
0.01169586181640625,
-0.004913330078125,
0.0140228271484375,
0.052032470703125,
0.037628173828125,
-0.06005859375,
0.0198974609375,
-0.017486572265625,
-0.0268707275390625,
-0.0243682861328125,
-0.05419921875,
-0.05047607421875,
-0.0144500732421875,
-0.050018310546875,
-0.0247650146484375,
-0.039764404296875,
0.047393798828125,
0.08465576171875,
-0.07403564453125,
-0.024658203125,
-0.02685546875,
-0.034942626953125,
0.00318145751953125,
-0.0241241455078125,
0.04608154296875,
-0.032379150390625,
-0.045806884765625,
-0.0124053955078125,
-0.022857666015625,
0.032806396484375,
-0.0207977294921875,
-0.012908935546875,
-0.0266265869140625,
-0.0133056640625,
0.036102294921875,
-0.00469207763671875,
-0.0229339599609375,
-0.057830810546875,
-0.0250701904296875,
-0.01451873779296875,
0.0178375244140625,
0.016815185546875,
-0.0256805419921875,
0.0247650146484375,
0.0251922607421875,
0.00833892822265625,
0.0426025390625,
0.00899505615234375,
0.00618743896484375,
-0.058258056640625,
0.0196990966796875,
0.00841522216796875,
0.021514892578125,
0.01497650146484375,
-0.0295562744140625,
0.04736328125,
0.05328369140625,
-0.06195068359375,
-0.039947509765625,
0.0026607513427734375,
-0.0830078125,
0.001140594482421875,
0.050537109375,
-0.004795074462890625,
-0.0078582763671875,
-0.0010900497436523438,
-0.01556396484375,
0.031280517578125,
-0.0155487060546875,
0.043243408203125,
0.078125,
-0.031951904296875,
-0.002857208251953125,
-0.038299560546875,
0.039154052734375,
0.00977325439453125,
-0.08197021484375,
-0.0157470703125,
0.03564453125,
0.058319091796875,
0.043243408203125,
0.0215301513671875,
-0.0026798248291015625,
-0.0176544189453125,
0.033416748046875,
0.03485107421875,
0.0019855499267578125,
-0.0401611328125,
-0.01000213623046875,
0.0037250518798828125,
-0.0004696846008300781,
-0.045745849609375
]
] |
facebook/roberta-hate-speech-dynabench-r4-target | 2023-03-16T20:03:57.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"text-classification",
"en",
"arxiv:2012.15761",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | facebook | null | null | facebook/roberta-hate-speech-dynabench-r4-target | 30 | 55,678 | transformers | 2022-06-10T22:24:39 | ---
language: en
---
# LFTW R4 Target
The R4 Target model from [Learning from the Worst: Dynamically Generated Datasets to Improve Online Hate Detection](https://arxiv.org/abs/2012.15761)
## Citation Information
```bibtex
@inproceedings{vidgen2021lftw,
title={Learning from the Worst: Dynamically Generated Datasets to Improve Online Hate Detection},
author={Bertie Vidgen and Tristan Thrush and Zeerak Waseem and Douwe Kiela},
booktitle={ACL},
year={2021}
}
```
Thanks to Kushal Tirumala and Adina Williams for helping the authors put the model on the hub! | 570 | [
[
-0.031280517578125,
-0.046905517578125,
0.03082275390625,
-0.0268707275390625,
0.0079498291015625,
0.01474761962890625,
0.005664825439453125,
-0.0604248046875,
0.0079498291015625,
0.015899658203125,
-0.0631103515625,
-0.030242919921875,
-0.035888671875,
-0.0012111663818359375,
-0.0537109375,
0.0894775390625,
0.03582763671875,
0.0010175704956054688,
0.01519012451171875,
-0.01898193359375,
-0.0143585205078125,
-0.04364013671875,
-0.033050537109375,
-0.029632568359375,
0.06121826171875,
0.032379150390625,
0.043853759765625,
0.0325927734375,
0.0584716796875,
0.0177001953125,
0.00861358642578125,
-0.017120361328125,
-0.07012939453125,
-0.0006413459777832031,
-0.0217437744140625,
-0.0280303955078125,
-0.020904541015625,
0.00823974609375,
0.03216552734375,
0.01324462890625,
-0.01227569580078125,
0.01331329345703125,
-0.0091094970703125,
0.00592803955078125,
-0.04071044921875,
-0.00577545166015625,
-0.0697021484375,
0.01971435546875,
-0.031829833984375,
0.0013227462768554688,
-0.03338623046875,
-0.048095703125,
-0.00911712646484375,
-0.050811767578125,
0.035919189453125,
0.016754150390625,
0.06658935546875,
0.0165863037109375,
-0.03424072265625,
-0.02703857421875,
-0.05517578125,
0.0682373046875,
-0.055999755859375,
0.036224365234375,
0.01496124267578125,
0.031646728515625,
0.001079559326171875,
-0.05352783203125,
-0.0246124267578125,
0.0076751708984375,
0.01708984375,
-0.00904083251953125,
-0.034454345703125,
-0.00270843505859375,
0.034942626953125,
0.01555633544921875,
-0.019744873046875,
0.0054168701171875,
-0.03277587890625,
-0.03521728515625,
0.03729248046875,
0.0206146240234375,
-0.006580352783203125,
-0.0169525146484375,
-0.0283050537109375,
-0.01090240478515625,
-0.0338134765625,
-0.000049233436584472656,
0.05926513671875,
0.052154541015625,
-0.0137176513671875,
0.03692626953125,
0.0207672119140625,
0.031158447265625,
0.0304718017578125,
-0.00832366943359375,
0.050750732421875,
-0.0266571044921875,
0.004634857177734375,
-0.00969696044921875,
0.03521728515625,
0.042510986328125,
0.0038814544677734375,
-0.00630950927734375,
-0.028076171875,
0.002124786376953125,
0.04425048828125,
-0.04766845703125,
-0.0305023193359375,
-0.006061553955078125,
-0.04150390625,
-0.051605224609375,
-0.0037384033203125,
-0.079833984375,
-0.048309326171875,
0.007232666015625,
0.0204925537109375,
-0.03363037109375,
-0.0182342529296875,
-0.032012939453125,
0.004138946533203125,
0.01763916015625,
-0.004550933837890625,
-0.046783447265625,
0.0250244140625,
0.0178070068359375,
0.038421630859375,
-0.027587890625,
-0.0077056884765625,
-0.055572509765625,
-0.0028018951416015625,
-0.00827789306640625,
0.058258056640625,
-0.033294677734375,
-0.0089874267578125,
0.0230560302734375,
-0.0143585205078125,
0.020751953125,
-0.034393310546875,
0.058746337890625,
-0.061309814453125,
0.006557464599609375,
-0.0206756591796875,
-0.042999267578125,
-0.027069091796875,
0.0184478759765625,
-0.0670166015625,
0.08087158203125,
0.0262908935546875,
-0.05859375,
0.017333984375,
-0.0428466796875,
-0.005748748779296875,
0.0020465850830078125,
0.0213165283203125,
-0.054046630859375,
-0.00626373291015625,
-0.01373291015625,
0.040863037109375,
-0.0010852813720703125,
0.0199127197265625,
-0.05621337890625,
-0.005573272705078125,
0.006450653076171875,
-0.00439453125,
0.06170654296875,
0.0347900390625,
-0.0352783203125,
0.035003662109375,
-0.07330322265625,
-0.0115509033203125,
0.0531005859375,
-0.0178680419921875,
-0.01983642578125,
-0.01181793212890625,
0.0192108154296875,
0.00928497314453125,
0.015777587890625,
-0.04156494140625,
-0.01409149169921875,
-0.0099945068359375,
-0.0031604766845703125,
0.07122802734375,
0.0017423629760742188,
0.0215301513671875,
-0.0216064453125,
0.047393798828125,
-0.01611328125,
0.0411376953125,
0.026336669921875,
-0.0333251953125,
-0.0260772705078125,
0.00732421875,
-0.00792694091796875,
0.0269775390625,
-0.052581787109375,
0.013458251953125,
0.0198974609375,
-0.0460205078125,
0.0029964447021484375,
-0.00273895263671875,
0.037628173828125,
0.051177978515625,
0.050201416015625,
-0.01316070556640625,
-0.071044921875,
-0.058868408203125,
-0.036651611328125,
-0.0022792816162109375,
0.01605224609375,
0.002918243408203125,
0.039794921875,
-0.0135650634765625,
0.050567626953125,
-0.041656494140625,
-0.024200439453125,
-0.0284271240234375,
-0.003986358642578125,
0.007167816162109375,
0.0005173683166503906,
0.06231689453125,
-0.06719970703125,
-0.0469970703125,
-0.047882080078125,
-0.03179931640625,
-0.02618408203125,
0.0313720703125,
-0.021514892578125,
0.001056671142578125,
0.0253448486328125,
-0.0037746429443359375,
0.041717529296875,
0.033935546875,
-0.04473876953125,
0.051239013671875,
0.0423583984375,
0.01328277587890625,
-0.0809326171875,
-0.01059722900390625,
0.0270233154296875,
-0.03515625,
-0.06744384765625,
0.01050567626953125,
-0.0012598037719726562,
0.0046234130859375,
-0.04571533203125,
0.026214599609375,
-0.00629425048828125,
-0.006500244140625,
0.0034332275390625,
-0.0222625732421875,
-0.0211029052734375,
0.03656005859375,
-0.018035888671875,
0.022430419921875,
-0.0029735565185546875,
-0.01068878173828125,
0.0399169921875,
0.047332763671875,
-0.0310516357421875,
0.0399169921875,
-0.01320648193359375,
-0.00037670135498046875,
0.01520538330078125,
0.0105438232421875,
-0.047332763671875,
-0.03021240234375,
0.056884765625,
-0.020843505859375,
0.007122039794921875,
0.00015556812286376953,
-0.0225067138671875,
-0.027587890625,
-0.03533935546875,
0.0418701171875,
0.0239715576171875,
-0.041473388671875,
0.03192138671875,
0.040252685546875,
0.0217132568359375,
-0.041046142578125,
-0.049041748046875,
-0.0137176513671875,
-0.05474853515625,
0.007678985595703125,
0.03875732421875,
-0.04180908203125,
-0.02960205078125,
-0.01262664794921875,
0.01325225830078125,
-0.02362060546875,
-0.00484466552734375,
0.0190887451171875,
0.00022590160369873047,
-0.0027523040771484375,
-0.00010192394256591797,
-0.0482177734375,
-0.01087188720703125,
0.0246124267578125,
0.019012451171875,
0.0226593017578125,
0.0162506103515625,
-0.0204620361328125,
-0.0195770263671875,
0.03741455078125,
0.016815185546875,
0.0006542205810546875,
0.04046630859375,
0.056884765625,
-0.036163330078125,
-0.0268707275390625,
-0.033111572265625,
-0.03411865234375,
-0.03472900390625,
0.0299072265625,
-0.00995635986328125,
-0.052215576171875,
0.040863037109375,
0.0222015380859375,
0.005680084228515625,
0.037109375,
0.0389404296875,
0.01654052734375,
0.049896240234375,
0.071044921875,
-0.007183074951171875,
0.0689697265625,
-0.0242156982421875,
0.0171661376953125,
-0.0295562744140625,
-0.0267333984375,
-0.0242156982421875,
-0.0107574462890625,
-0.0269775390625,
-0.028289794921875,
0.024932861328125,
-0.023406982421875,
-0.0615234375,
0.0237274169921875,
-0.048614501953125,
0.0195159912109375,
0.05010986328125,
0.031890869140625,
0.007228851318359375,
0.01439666748046875,
-0.01024627685546875,
-0.031219482421875,
-0.05047607421875,
-0.004634857177734375,
0.08050537109375,
0.039154052734375,
0.045013427734375,
0.0269775390625,
0.047027587890625,
0.04766845703125,
0.0248565673828125,
-0.046600341796875,
0.041259765625,
0.0111846923828125,
-0.07623291015625,
-0.018585205078125,
-0.0222625732421875,
-0.059814453125,
-0.01439666748046875,
-0.0284271240234375,
-0.031402587890625,
0.0126190185546875,
0.02703857421875,
-0.0185546875,
0.05218505859375,
-0.017120361328125,
0.054931640625,
-0.01611328125,
-0.0206146240234375,
-0.0080108642578125,
-0.0384521484375,
0.04510498046875,
-0.031402587890625,
0.04364013671875,
-0.016143798828125,
0.00811004638671875,
0.0689697265625,
-0.029327392578125,
0.08392333984375,
-0.03228759765625,
-0.0064849853515625,
0.018646240234375,
0.007152557373046875,
0.053131103515625,
-0.040557861328125,
-0.02728271484375,
0.0183563232421875,
-0.02923583984375,
-0.020477294921875,
-0.0024089813232421875,
0.03338623046875,
-0.0555419921875,
-0.0072174072265625,
-0.048004150390625,
-0.048919677734375,
-0.0173187255859375,
0.035003662109375,
0.029571533203125,
0.038421630859375,
-0.0183868408203125,
0.01291656494140625,
0.046478271484375,
-0.0261383056640625,
0.026611328125,
0.045013427734375,
-0.0294036865234375,
-0.032562255859375,
0.059112548828125,
0.007358551025390625,
0.0118865966796875,
0.0247039794921875,
0.01525115966796875,
-0.033294677734375,
-0.06787109375,
0.01226806640625,
0.0096588134765625,
-0.07965087890625,
-0.050628662109375,
-0.044036865234375,
-0.01540374755859375,
-0.0277862548828125,
-0.0199127197265625,
-0.0308074951171875,
0.0104827880859375,
-0.044830322265625,
-0.0267333984375,
0.058319091796875,
0.09075927734375,
-0.0421142578125,
0.03216552734375,
-0.0294342041015625,
0.032745361328125,
0.0045166015625,
0.0286712646484375,
0.01375579833984375,
-0.05621337890625,
-0.04144287109375,
0.0002378225326538086,
-0.03558349609375,
-0.08245849609375,
0.057220458984375,
-0.0016393661499023438,
0.053619384765625,
0.0213165283203125,
0.0389404296875,
0.031494140625,
-0.032806396484375,
0.050201416015625,
0.034942626953125,
-0.047027587890625,
0.0303802490234375,
-0.04852294921875,
-0.0027141571044921875,
0.028289794921875,
0.05731201171875,
0.0200653076171875,
-0.017242431640625,
-0.048431396484375,
-0.0721435546875,
0.059356689453125,
0.004962921142578125,
-0.01043701171875,
0.0126800537109375,
0.0247039794921875,
0.0212249755859375,
0.0012149810791015625,
-0.10418701171875,
-0.036895751953125,
-0.0193328857421875,
-0.038421630859375,
-0.008148193359375,
-0.042022705078125,
-0.00434112548828125,
-0.022369384765625,
0.06610107421875,
-0.0111846923828125,
0.0455322265625,
-0.024444580078125,
-0.0026721954345703125,
-0.03717041015625,
-0.021820068359375,
0.025360107421875,
0.0303802490234375,
-0.040191650390625,
-0.01280975341796875,
0.01910400390625,
-0.0217742919921875,
0.0157623291015625,
0.01165008544921875,
0.001010894775390625,
-0.018280029296875,
0.0487060546875,
0.04547119140625,
0.041351318359375,
-0.0277099609375,
0.037384033203125,
-0.003116607666015625,
-0.0273895263671875,
-0.062042236328125,
0.0029449462890625,
-0.016815185546875,
0.04931640625,
0.050048828125,
-0.00635528564453125,
0.0161285400390625,
-0.014556884765625,
0.0396728515625,
0.02996826171875,
-0.029510498046875,
-0.0325927734375,
0.06475830078125,
0.0119476318359375,
-0.0239715576171875,
0.026885986328125,
-0.0156707763671875,
-0.0518798828125,
0.0265350341796875,
0.0482177734375,
0.0396728515625,
-0.034576416015625,
0.0203094482421875,
0.037384033203125,
0.00975799560546875,
0.0190887451171875,
0.0198974609375,
0.0233917236328125,
-0.078125,
0.004657745361328125,
-0.050506591796875,
-0.04400634765625,
0.04412841796875,
-0.05206298828125,
0.0236968994140625,
-0.033447265625,
-0.00630950927734375,
0.01421356201171875,
0.0189056396484375,
-0.06390380859375,
0.026885986328125,
0.0181427001953125,
0.0528564453125,
-0.058746337890625,
0.05047607421875,
0.039825439453125,
-0.037933349609375,
-0.070068359375,
0.00563812255859375,
0.04315185546875,
-0.059417724609375,
0.034210205078125,
-0.00897979736328125,
0.00069427490234375,
0.0222015380859375,
-0.053680419921875,
-0.09527587890625,
0.0631103515625,
0.022216796875,
-0.04278564453125,
-0.0015077590942382812,
-0.007198333740234375,
0.024932861328125,
-0.03509521484375,
0.0201263427734375,
0.02105712890625,
0.04083251953125,
0.006603240966796875,
-0.032470703125,
-0.0017023086547851562,
-0.0171661376953125,
-0.032928466796875,
0.00939178466796875,
-0.047882080078125,
0.0673828125,
-0.01113128662109375,
0.00795745849609375,
-0.00897216796875,
0.033905029296875,
-0.0020618438720703125,
0.035675048828125,
0.049224853515625,
0.04949951171875,
0.04107666015625,
0.01288604736328125,
0.07196044921875,
0.0028934478759765625,
0.0199127197265625,
0.08392333984375,
-0.0031070709228515625,
0.04656982421875,
0.006771087646484375,
-0.0271453857421875,
0.037628173828125,
0.059112548828125,
-0.031219482421875,
0.06787109375,
-0.0006656646728515625,
-0.0031147003173828125,
-0.002735137939453125,
0.0028400421142578125,
-0.0166015625,
0.017852783203125,
0.033447265625,
-0.029510498046875,
-0.0234527587890625,
0.0260467529296875,
-0.005420684814453125,
0.01922607421875,
-0.0252532958984375,
0.046539306640625,
-0.007049560546875,
-0.0056610107421875,
0.053558349609375,
-0.0032672882080078125,
0.07342529296875,
-0.039459228515625,
0.003261566162109375,
-0.0053558349609375,
0.02569580078125,
-0.0313720703125,
-0.043609619140625,
0.039581298828125,
0.013885498046875,
-0.04254150390625,
-0.01515960693359375,
0.06488037109375,
-0.0203094482421875,
-0.033294677734375,
0.06121826171875,
0.00920867919921875,
0.0160675048828125,
0.0139617919921875,
-0.08880615234375,
0.0419921875,
0.041717529296875,
-0.03265380859375,
0.04241943359375,
0.0153045654296875,
-0.005733489990234375,
0.04742431640625,
0.0526123046875,
-0.0035610198974609375,
0.01195526123046875,
-0.005706787109375,
0.07080078125,
-0.06756591796875,
-0.028961181640625,
-0.03094482421875,
0.032745361328125,
-0.0032901763916015625,
-0.01413726806640625,
0.0640869140625,
0.044219970703125,
0.06927490234375,
-0.00780487060546875,
0.0673828125,
-0.0206146240234375,
0.09722900390625,
-0.00010192394256591797,
0.0350341796875,
-0.048583984375,
0.005748748779296875,
-0.03070068359375,
-0.07733154296875,
-0.03369140625,
0.049072265625,
-0.0214691162109375,
0.0131683349609375,
0.047576904296875,
0.0711669921875,
-0.0291290283203125,
-0.0211944580078125,
0.037384033203125,
0.044525146484375,
0.04083251953125,
0.03228759765625,
0.043975830078125,
-0.03472900390625,
0.053253173828125,
-0.046783447265625,
-0.005489349365234375,
-0.0400390625,
-0.07635498046875,
-0.06256103515625,
-0.048004150390625,
-0.0281219482421875,
-0.05047607421875,
0.01157379150390625,
0.0694580078125,
0.0303497314453125,
-0.09466552734375,
0.004634857177734375,
-0.0010118484497070312,
0.005435943603515625,
0.007389068603515625,
-0.020477294921875,
0.01180267333984375,
-0.004180908203125,
-0.046600341796875,
0.031707763671875,
-0.0248565673828125,
0.0052947998046875,
-0.014801025390625,
-0.01812744140625,
-0.04449462890625,
0.0021076202392578125,
0.033172607421875,
0.0248870849609375,
-0.052093505859375,
-0.038177490234375,
-0.031402587890625,
-0.00426483154296875,
-0.0018138885498046875,
0.041046142578125,
-0.056060791015625,
0.0007023811340332031,
0.0087432861328125,
0.020904541015625,
0.01202392578125,
0.004154205322265625,
0.03558349609375,
-0.033172607421875,
0.00904083251953125,
0.034332275390625,
0.01236724853515625,
0.037506103515625,
-0.034942626953125,
0.049560546875,
0.03802490234375,
-0.046295166015625,
-0.0902099609375,
0.015533447265625,
-0.047332763671875,
-0.01513671875,
0.0799560546875,
-0.00485992431640625,
-0.00879669189453125,
-0.023468017578125,
-0.001483917236328125,
0.039215087890625,
-0.0511474609375,
0.045654296875,
0.0226898193359375,
-0.0057220458984375,
-0.034637451171875,
-0.046112060546875,
0.0567626953125,
-0.0002484321594238281,
-0.0504150390625,
-0.0142059326171875,
0.05377197265625,
0.032562255859375,
-0.0028820037841796875,
0.0310516357421875,
0.01309967041015625,
0.0010862350463867188,
0.00090789794921875,
0.040557861328125,
-0.0007963180541992188,
-0.017425537109375,
-0.0240478515625,
0.033935546875,
-0.007476806640625,
0.00299072265625
]
] |
facebook/mbart-large-50-many-to-one-mmt | 2023-03-28T09:18:56.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"mbart",
"text2text-generation",
"mbart-50",
"multilingual",
"ar",
"cs",
"de",
"en",
"es",
"et",
"fi",
"fr",
"gu",
"hi",
"it",
"ja",
"kk",
"ko",
"lt",
"lv",
"my",
"ne",
"nl",
"ro",
"ru",
"si",
"tr",
"vi",
"zh",
"af",
"az",
"bn",
"fa",
"he",
"hr",
"id",
"ka",
"km",
"mk",
"ml",
"mn",
"mr",
"pl",
"ps",
"pt",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"uk",
"ur",
"xh",
"gl",
"sl",
"arxiv:2008.00401",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | text2text-generation | facebook | null | null | facebook/mbart-large-50-many-to-one-mmt | 45 | 55,670 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- ar
- cs
- de
- en
- es
- et
- fi
- fr
- gu
- hi
- it
- ja
- kk
- ko
- lt
- lv
- my
- ne
- nl
- ro
- ru
- si
- tr
- vi
- zh
- af
- az
- bn
- fa
- he
- hr
- id
- ka
- km
- mk
- ml
- mn
- mr
- pl
- ps
- pt
- sv
- sw
- ta
- te
- th
- tl
- uk
- ur
- xh
- gl
- sl
tags:
- mbart-50
---
# mBART-50 many to one multilingual machine translation
This model is a fine-tuned checkpoint of [mBART-large-50](https://huggingface.co/facebook/mbart-large-50). `mbart-large-50-many-to-one-mmt` is fine-tuned for multilingual machine translation. It was introduced in the paper [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401).
The model can translate directly to English from any of the other 49 languages listed below.
```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast
article_hi = "संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है"
article_ar = "الأمين العام للأمم المتحدة يقول إنه لا يوجد حل عسكري في سوريا."
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-many-to-one-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-one-mmt")
# translate Hindi to English
tokenizer.src_lang = "hi_IN"
encoded_hi = tokenizer(article_hi, return_tensors="pt")
generated_tokens = model.generate(**encoded_hi)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "The head of the UN says there is no military solution in Syria."
# translate Arabic to English
tokenizer.src_lang = "ar_AR"
encoded_ar = tokenizer(article_ar, return_tensors="pt")
generated_tokens = model.generate(**encoded_ar)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "The Secretary-General of the United Nations says there is no military solution in Syria."
```
See the [model hub](https://huggingface.co/models?filter=mbart-50) to look for more fine-tuned versions.
## Languages covered
Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)
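When setting `tokenizer.src_lang` programmatically, it can help to keep the codes above in a small lookup table. The sketch below is a hypothetical helper — `MBART50_CODES` and `src_lang_code` are not part of the `transformers` library, and the dict covers only a subset of the 50 codes listed above:

```python
# Hypothetical helper: map language names to mBART-50 source-language codes.
# The codes are taken from the "Languages covered" list; extend as needed.
MBART50_CODES = {
    "arabic": "ar_AR",
    "german": "de_DE",
    "english": "en_XX",
    "hindi": "hi_IN",
    "chinese": "zh_CN",
}

def src_lang_code(language_name: str) -> str:
    """Return the mBART-50 code for a language name (case-insensitive)."""
    try:
        return MBART50_CODES[language_name.lower()]
    except KeyError:
        raise ValueError(f"Unknown or unsupported language: {language_name}")
```

For example, `tokenizer.src_lang = src_lang_code("Hindi")` sets the same `"hi_IN"` code used in the usage example above.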
## BibTeX entry and citation info
```bibtex
@article{tang2020multilingual,
title={Multilingual Translation with Extensible Multilingual Pretraining and Finetuning},
author={Yuqing Tang and Chau Tran and Xian Li and Peng-Jen Chen and Naman Goyal and Vishrav Chaudhary and Jiatao Gu and Angela Fan},
year={2020},
eprint={2008.00401},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 3,276 | [
[
-0.042266845703125,
-0.03363037109375,
0.00888824462890625,
0.0299530029296875,
-0.022308349609375,
0.00762939453125,
-0.0222930908203125,
-0.0196533203125,
0.0178985595703125,
0.0141448974609375,
-0.041534423828125,
-0.044158935546875,
-0.048980712890625,
0.0155792236328125,
-0.00994110107421875,
0.0816650390625,
-0.0211334228515625,
0.02325439453125,
0.01739501953125,
-0.0316162109375,
-0.0159912109375,
-0.0352783203125,
-0.0292205810546875,
-0.0091094970703125,
0.0170745849609375,
0.033966064453125,
0.036651611328125,
0.0276336669921875,
0.05218505859375,
0.0283203125,
-0.0183258056640625,
0.009429931640625,
-0.00940704345703125,
-0.01947021484375,
-0.01580810546875,
-0.0188751220703125,
-0.055084228515625,
-0.01007843017578125,
0.058837890625,
0.041015625,
-0.0017251968383789062,
0.03515625,
0.00206756591796875,
0.05279541015625,
-0.03094482421875,
0.0006375312805175781,
-0.02044677734375,
0.006191253662109375,
-0.0207672119140625,
0.007106781005859375,
-0.0221099853515625,
-0.0259857177734375,
-0.01036834716796875,
-0.03350830078125,
0.00267791748046875,
-0.005062103271484375,
0.0760498046875,
-0.002399444580078125,
-0.048126220703125,
-0.005558013916015625,
-0.044097900390625,
0.06622314453125,
-0.06463623046875,
0.03851318359375,
0.03033447265625,
0.028228759765625,
-0.0168304443359375,
-0.048309326171875,
-0.0521240234375,
-0.00452423095703125,
-0.005481719970703125,
0.0290679931640625,
-0.01383209228515625,
-0.01099395751953125,
0.0169219970703125,
0.028533935546875,
-0.05950927734375,
-0.01515960693359375,
-0.044708251953125,
-0.0012607574462890625,
0.032684326171875,
-0.00045943260192871094,
0.028961181640625,
-0.031982421875,
-0.04290771484375,
0.0018634796142578125,
-0.03863525390625,
0.025390625,
0.00940704345703125,
0.023223876953125,
-0.045257568359375,
0.06146240234375,
-0.025909423828125,
0.0576171875,
0.00533294677734375,
-0.0194549560546875,
0.037109375,
-0.047882080078125,
-0.01473236083984375,
-0.011993408203125,
0.0767822265625,
0.025054931640625,
0.023681640625,
-0.0010509490966796875,
0.0015316009521484375,
-0.005390167236328125,
-0.005474090576171875,
-0.058380126953125,
0.00690460205078125,
0.0149078369140625,
-0.037750244140625,
0.00809478759765625,
0.020050048828125,
-0.062408447265625,
0.0123291015625,
-0.0003902912139892578,
0.021484375,
-0.052215576171875,
-0.0252838134765625,
0.0007681846618652344,
0.004001617431640625,
0.0261077880859375,
0.0125274658203125,
-0.057159423828125,
-0.0013456344604492188,
0.0276031494140625,
0.0706787109375,
0.0104827880859375,
-0.0406494140625,
-0.019073486328125,
0.022216796875,
-0.0248260498046875,
0.04852294921875,
-0.0280609130859375,
-0.033782958984375,
-0.00916290283203125,
0.024658203125,
-0.01004791259765625,
-0.0216827392578125,
0.05145263671875,
-0.0091552734375,
0.023223876953125,
-0.03814697265625,
-0.01012420654296875,
-0.0280303955078125,
0.0295867919921875,
-0.044921875,
0.08099365234375,
0.0199127197265625,
-0.052947998046875,
0.022491455078125,
-0.04913330078125,
-0.03778076171875,
-0.003082275390625,
-0.005832672119140625,
-0.03900146484375,
-0.0067901611328125,
0.0231475830078125,
0.032745361328125,
-0.0278167724609375,
0.0269775390625,
-0.01291656494140625,
-0.0131072998046875,
0.004192352294921875,
-0.01560211181640625,
0.077392578125,
0.03741455078125,
-0.0318603515625,
0.00789642333984375,
-0.055877685546875,
0.009185791015625,
0.014312744140625,
-0.040130615234375,
-0.002513885498046875,
-0.0258636474609375,
0.0055389404296875,
0.05572509765625,
0.01168060302734375,
-0.0504150390625,
0.02105712890625,
-0.031951904296875,
0.027740478515625,
0.039215087890625,
0.0010471343994140625,
0.02593994140625,
-0.03594970703125,
0.0496826171875,
0.020355224609375,
0.015533447265625,
-0.010711669921875,
-0.043243408203125,
-0.06024169921875,
-0.0345458984375,
0.0083770751953125,
0.05084228515625,
-0.056243896484375,
0.02252197265625,
-0.034027099609375,
-0.04254150390625,
-0.052398681640625,
0.017608642578125,
0.042327880859375,
0.019866943359375,
0.0362548828125,
-0.022705078125,
-0.042022705078125,
-0.05706787109375,
-0.0213470458984375,
-0.0142059326171875,
0.0147247314453125,
0.01457977294921875,
0.054656982421875,
-0.0213470458984375,
0.050506591796875,
-0.0111846923828125,
-0.0280303955078125,
-0.0167999267578125,
-0.0014171600341796875,
0.0211334228515625,
0.048614501953125,
0.0482177734375,
-0.07244873046875,
-0.061737060546875,
0.02825927734375,
-0.05010986328125,
0.0188751220703125,
0.00153350830078125,
-0.0198974609375,
0.031494140625,
0.023406982421875,
-0.044677734375,
0.026885986328125,
0.056243896484375,
-0.03106689453125,
0.04437255859375,
-0.007579803466796875,
0.03173828125,
-0.11590576171875,
0.0281219482421875,
-0.008636474609375,
-0.005992889404296875,
-0.044158935546875,
-0.0004742145538330078,
0.0119171142578125,
-0.00428009033203125,
-0.049072265625,
0.0594482421875,
-0.042510986328125,
0.026123046875,
0.009002685546875,
0.0024509429931640625,
-0.007755279541015625,
0.036773681640625,
-0.012054443359375,
0.053131103515625,
0.02947998046875,
-0.034759521484375,
0.02984619140625,
0.0299530029296875,
-0.018035888671875,
0.060943603515625,
-0.041015625,
-0.01611328125,
-0.00962066650390625,
0.0188140869140625,
-0.07867431640625,
-0.0176239013671875,
0.03106689453125,
-0.05419921875,
0.0200653076171875,
-0.01503753662109375,
-0.036163330078125,
-0.045867919921875,
-0.0176849365234375,
0.0322265625,
0.0149688720703125,
-0.0225067138671875,
0.033782958984375,
0.0060272216796875,
-0.017242431640625,
-0.054290771484375,
-0.08441162109375,
0.01122283935546875,
-0.01026153564453125,
-0.047882080078125,
0.014129638671875,
-0.0124664306640625,
0.0005331039428710938,
0.01593017578125,
-0.0047454833984375,
-0.007167816162109375,
-0.004444122314453125,
0.019317626953125,
0.0202789306640625,
-0.0285186767578125,
0.01348114013671875,
-0.00077056884765625,
-0.0013914108276367188,
-0.0218353271484375,
-0.0180816650390625,
0.04681396484375,
-0.0240020751953125,
-0.02838134765625,
-0.03021240234375,
0.040374755859375,
0.052093505859375,
-0.06524658203125,
0.08050537109375,
0.08160400390625,
-0.0230255126953125,
0.020172119140625,
-0.041351318359375,
0.0087738037109375,
-0.032012939453125,
0.0421142578125,
-0.058197021484375,
-0.06304931640625,
0.052764892578125,
-0.0163421630859375,
0.0004837512969970703,
0.06427001953125,
0.06964111328125,
0.0149383544921875,
0.0687255859375,
0.04681396484375,
-0.01084136962890625,
0.033447265625,
-0.03350830078125,
0.00835418701171875,
-0.060638427734375,
-0.032135009765625,
-0.043701171875,
-0.00643157958984375,
-0.07403564453125,
-0.037353515625,
0.0170745849609375,
0.0024566650390625,
-0.036102294921875,
0.02618408203125,
-0.035797119140625,
0.005931854248046875,
0.043365478515625,
0.004001617431640625,
0.01445770263671875,
0.00392913818359375,
-0.034423828125,
-0.00867462158203125,
-0.051025390625,
-0.041748046875,
0.080078125,
0.00850677490234375,
0.035552978515625,
0.042449951171875,
0.053375244140625,
-0.01030731201171875,
0.0163421630859375,
-0.042510986328125,
0.03802490234375,
-0.02716064453125,
-0.0762939453125,
-0.01226043701171875,
-0.041900634765625,
-0.0738525390625,
0.013763427734375,
-0.01523590087890625,
-0.0501708984375,
0.0235137939453125,
-0.01523590087890625,
-0.026214599609375,
0.0246429443359375,
-0.05767822265625,
0.0723876953125,
-0.02825927734375,
-0.014129638671875,
-0.00411224365234375,
-0.05975341796875,
0.039825439453125,
-0.01201629638671875,
0.0272064208984375,
-0.017333984375,
-0.002559661865234375,
0.05731201171875,
-0.016632080078125,
0.038665771484375,
0.00048542022705078125,
0.005950927734375,
0.013031005859375,
-0.00724029541015625,
0.024139404296875,
0.0006909370422363281,
-0.01364898681640625,
0.0126800537109375,
0.011016845703125,
-0.055572509765625,
-0.0196075439453125,
0.05206298828125,
-0.06768798828125,
-0.03497314453125,
-0.04742431640625,
-0.04290771484375,
0.006580352783203125,
0.04876708984375,
0.035980224609375,
0.00705718994140625,
-0.0169525146484375,
0.0177459716796875,
0.0210113525390625,
-0.0269927978515625,
0.03094482421875,
0.036163330078125,
-0.0175323486328125,
-0.049835205078125,
0.0684814453125,
0.0215911865234375,
0.0294342041015625,
0.026031494140625,
0.004573822021484375,
-0.004657745361328125,
-0.01116943359375,
-0.04510498046875,
0.042388916015625,
-0.038665771484375,
-0.0133056640625,
-0.05169677734375,
-0.01393890380859375,
-0.061004638671875,
-0.014984130859375,
-0.033447265625,
-0.032562255859375,
-0.006107330322265625,
0.0085906982421875,
0.016998291015625,
0.042144775390625,
-0.006458282470703125,
0.0269012451171875,
-0.05828857421875,
0.043121337890625,
-0.0005354881286621094,
0.0177459716796875,
-0.01123046875,
-0.049652099609375,
-0.042694091796875,
0.0173492431640625,
-0.03411865234375,
-0.07269287109375,
0.03863525390625,
0.023223876953125,
0.04498291015625,
0.03167724609375,
0.00476837158203125,
0.06231689453125,
-0.038421630859375,
0.0572509765625,
0.0274658203125,
-0.0701904296875,
0.04156494140625,
-0.02685546875,
0.051971435546875,
0.043121337890625,
0.055419921875,
-0.06463623046875,
-0.0311126708984375,
-0.033111572265625,
-0.074462890625,
0.05084228515625,
0.002185821533203125,
0.01502227783203125,
0.004917144775390625,
0.003726959228515625,
-0.007411956787109375,
0.013092041015625,
-0.06512451171875,
-0.045989990234375,
-0.02130126953125,
-0.0160369873046875,
-0.0277099609375,
-0.0265045166015625,
-0.017852783203125,
-0.037750244140625,
0.05938720703125,
0.0146026611328125,
0.030548095703125,
0.016571044921875,
0.0024890899658203125,
-0.019866943359375,
0.03814697265625,
0.06280517578125,
0.04949951171875,
-0.01288604736328125,
0.01247406005859375,
0.0204925537109375,
-0.04547119140625,
0.02593994140625,
0.022369384765625,
0.0010290145874023438,
0.015716552734375,
0.03167724609375,
0.053253173828125,
0.0021514892578125,
-0.0239715576171875,
0.03094482421875,
0.003337860107421875,
-0.0128021240234375,
-0.026580810546875,
-0.0252532958984375,
0.01407623291015625,
0.030059814453125,
0.031707763671875,
0.002567291259765625,
-0.01157379150390625,
-0.050262451171875,
0.016815185546875,
0.03802490234375,
-0.0192413330078125,
-0.027923583984375,
0.05157470703125,
0.005008697509765625,
-0.0116119384765625,
0.039154052734375,
-0.030181884765625,
-0.059112548828125,
0.027557373046875,
0.04010009765625,
0.05645751953125,
-0.055633544921875,
0.02093505859375,
0.054290771484375,
0.047821044921875,
-0.005123138427734375,
0.03411865234375,
0.0104522705078125,
-0.0312042236328125,
-0.0288238525390625,
-0.055206298828125,
-0.00876617431640625,
0.0001380443572998047,
-0.052642822265625,
0.02484130859375,
-0.01265716552734375,
-0.031402587890625,
-0.00818634033203125,
0.016754150390625,
-0.052764892578125,
0.014312744140625,
-0.0079345703125,
0.0537109375,
-0.06976318359375,
0.08404541015625,
0.07843017578125,
-0.04364013671875,
-0.06402587890625,
-0.00472259521484375,
-0.00628662109375,
-0.04107666015625,
0.047943115234375,
-0.0018482208251953125,
0.0125885009765625,
0.006916046142578125,
-0.0169830322265625,
-0.07110595703125,
0.07550048828125,
0.035797119140625,
-0.0261993408203125,
0.006725311279296875,
0.0295257568359375,
0.033294677734375,
-0.006687164306640625,
0.0062103271484375,
0.0203704833984375,
0.055816650390625,
0.0101318359375,
-0.0872802734375,
0.01473236083984375,
-0.0421142578125,
-0.01265716552734375,
0.017608642578125,
-0.06744384765625,
0.08819580078125,
-0.0249481201171875,
-0.005161285400390625,
0.0093994140625,
0.039154052734375,
0.0240631103515625,
0.0224609375,
0.00628662109375,
0.049774169921875,
0.035491943359375,
-0.0013341903686523438,
0.06536865234375,
-0.0288848876953125,
0.040069580078125,
0.06658935546875,
0.01197052001953125,
0.06365966796875,
0.04620361328125,
-0.029693603515625,
0.0236663818359375,
0.03997802734375,
-0.005283355712890625,
0.0286712646484375,
-0.00897979736328125,
-0.020050048828125,
-0.01006317138671875,
-0.0217132568359375,
-0.049407958984375,
0.042449951171875,
0.0018110275268554688,
-0.035308837890625,
-0.01303863525390625,
0.0090484619140625,
0.039215087890625,
-0.0173797607421875,
-0.01160430908203125,
0.035614013671875,
0.01885986328125,
-0.04754638671875,
0.066162109375,
0.019317626953125,
0.046539306640625,
-0.050262451171875,
0.006443023681640625,
-0.019012451171875,
0.0233001708984375,
-0.019500732421875,
-0.04443359375,
0.01331329345703125,
0.007617950439453125,
-0.0183563232421875,
-0.003879547119140625,
0.021331787109375,
-0.0528564453125,
-0.073486328125,
0.0211334228515625,
0.051727294921875,
0.01450347900390625,
0.004070281982421875,
-0.06109619140625,
-0.008148193359375,
0.01372528076171875,
-0.039794921875,
0.022216796875,
0.04803466796875,
-0.007457733154296875,
0.046417236328125,
0.046722412109375,
0.02166748046875,
0.020751953125,
-0.0245819091796875,
0.0628662109375,
-0.0543212890625,
-0.0220794677734375,
-0.0765380859375,
0.04150390625,
0.01617431640625,
-0.033447265625,
0.09515380859375,
0.052642822265625,
0.07122802734375,
-0.004215240478515625,
0.055419921875,
-0.01461029052734375,
0.0283355712890625,
-0.0198974609375,
0.0653076171875,
-0.06756591796875,
-0.018585205078125,
-0.03729248046875,
-0.0604248046875,
-0.0288848876953125,
0.041656494140625,
-0.0278472900390625,
0.030364990234375,
0.048004150390625,
0.0435791015625,
-0.00890350341796875,
-0.036163330078125,
0.0206298828125,
0.0256805419921875,
0.0173492431640625,
0.052459716796875,
0.02874755859375,
-0.03253173828125,
0.05224609375,
-0.03326416015625,
-0.003772735595703125,
-0.0173492431640625,
-0.04266357421875,
-0.05609130859375,
-0.056396484375,
-0.006259918212890625,
-0.0281524658203125,
-0.0007672309875488281,
0.07061767578125,
0.044708251953125,
-0.06964111328125,
-0.027862548828125,
0.021270751953125,
0.0019702911376953125,
-0.022430419921875,
-0.00914764404296875,
0.041229248046875,
0.00189971923828125,
-0.057861328125,
0.0028057098388671875,
0.005401611328125,
0.018585205078125,
-0.00946807861328125,
-0.0230865478515625,
-0.054290771484375,
-0.0020294189453125,
0.04937744140625,
0.0262298583984375,
-0.052154541015625,
0.008148193359375,
0.0036945343017578125,
-0.025604248046875,
0.00860595703125,
0.015899658203125,
-0.020843505859375,
0.0428466796875,
0.0318603515625,
0.0227813720703125,
0.043487548828125,
0.0008597373962402344,
0.0183868408203125,
-0.04083251953125,
0.0328369140625,
0.00446319580078125,
0.02374267578125,
0.0274810791015625,
-0.007709503173828125,
0.042083740234375,
0.0207672119140625,
-0.039031982421875,
-0.07183837890625,
0.0056610107421875,
-0.0760498046875,
-0.013519287109375,
0.09393310546875,
-0.035675048828125,
-0.0202789306640625,
-0.0008192062377929688,
-0.0142669677734375,
0.057037353515625,
-0.0238037109375,
0.03179931640625,
0.0557861328125,
0.01331329345703125,
-0.013702392578125,
-0.0631103515625,
0.014678955078125,
0.041778564453125,
-0.0587158203125,
-0.017120361328125,
0.004871368408203125,
0.01340484619140625,
0.0178070068359375,
0.045257568359375,
-0.0116424560546875,
0.0290069580078125,
-0.0123138427734375,
0.032135009765625,
-0.0018291473388671875,
-0.009796142578125,
-0.0229644775390625,
-0.01080322265625,
0.00823974609375,
-0.0256500244140625
]
] |
czearing/article-title-generator | 2022-06-28T20:08:16.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | czearing | null | null | czearing/article-title-generator | 13 | 55,553 | transformers | 2022-06-28T19:44:19 | ---
license: mit
---
## Article Title Generator
The model generates a suggested title for a given article body. It is fine-tuned from the T5 language model on a large collection of Medium articles.
## Usage
Example code:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("czearing/article-title-generator")
# Use AutoModelForSeq2SeqLM (rather than AutoModel) so that .generate()
# is available for producing titles with this T5-based checkpoint.
model = AutoModelForSeq2SeqLM.from_pretrained("czearing/article-title-generator")
```
## License
MIT
| 404 | [
[
-0.0029773712158203125,
-0.023101806640625,
0.02294921875,
0.0105743408203125,
-0.031890869140625,
0.02142333984375,
-0.006038665771484375,
-0.007354736328125,
-0.0158538818359375,
0.027618408203125,
-0.0411376953125,
-0.02783203125,
-0.0499267578125,
0.028045654296875,
-0.052001953125,
0.0780029296875,
0.0096588134765625,
0.01442718505859375,
0.020538330078125,
0.0043792724609375,
-0.0079498291015625,
-0.01390838623046875,
-0.048553466796875,
-0.05859375,
0.02490234375,
0.0307769775390625,
0.0264129638671875,
0.05548095703125,
0.0250244140625,
0.020904541015625,
0.0191650390625,
-0.0210113525390625,
-0.0233001708984375,
-0.0256195068359375,
-0.015380859375,
-0.038177490234375,
-0.0254058837890625,
0.006069183349609375,
0.055877685546875,
0.0250091552734375,
-0.004985809326171875,
0.031768798828125,
-0.0170745849609375,
0.00115966796875,
-0.032257080078125,
0.0106658935546875,
-0.055694580078125,
0.006710052490234375,
-0.0104522705078125,
-0.00789642333984375,
-0.0305328369140625,
-0.028533935546875,
-0.00507354736328125,
-0.035675048828125,
0.021728515625,
0.01509857177734375,
0.09869384765625,
0.0275726318359375,
-0.037445068359375,
0.0005059242248535156,
-0.059112548828125,
0.0677490234375,
-0.05865478515625,
0.036163330078125,
0.0246124267578125,
0.0263519287109375,
0.017486572265625,
-0.0836181640625,
-0.041412353515625,
-0.0119476318359375,
0.01068115234375,
-0.006710052490234375,
-0.013641357421875,
0.00350189208984375,
0.038818359375,
0.03680419921875,
-0.0305023193359375,
0.00148773193359375,
-0.055694580078125,
-0.01013946533203125,
0.058929443359375,
0.01372528076171875,
0.033966064453125,
-0.0119476318359375,
-0.019775390625,
-0.0013055801391601562,
-0.05267333984375,
-0.01216888427734375,
0.03271484375,
-0.00415802001953125,
-0.008575439453125,
0.0543212890625,
-0.016754150390625,
0.05859375,
0.0296173095703125,
-0.0156402587890625,
0.0271148681640625,
-0.02020263671875,
-0.0243682861328125,
0.0036029815673828125,
0.07220458984375,
0.0177459716796875,
0.0179901123046875,
-0.035430908203125,
-0.0257568359375,
-0.0274505615234375,
0.018310546875,
-0.0709228515625,
-0.0168304443359375,
0.0171661376953125,
-0.0517578125,
-0.047393798828125,
0.01470184326171875,
-0.00788116455078125,
0.0193634033203125,
-0.026580810546875,
0.040191650390625,
-0.046600341796875,
-0.03216552734375,
-0.0018148422241210938,
-0.01468658447265625,
0.037689208984375,
-0.012451171875,
-0.074951171875,
0.005481719970703125,
0.0252838134765625,
0.0404052734375,
-0.00024402141571044922,
-0.041778564453125,
-0.0162200927734375,
0.032379150390625,
-0.0212249755859375,
0.051422119140625,
-0.0170745849609375,
-0.0235748291015625,
-0.00569915771484375,
0.0230712890625,
-0.003971099853515625,
-0.02154541015625,
0.033050537109375,
-0.0340576171875,
0.02655029296875,
0.0167694091796875,
-0.0562744140625,
-0.00531005859375,
0.01232147216796875,
-0.052093505859375,
0.06414794921875,
0.00970458984375,
-0.058837890625,
0.046661376953125,
-0.0650634765625,
-0.0236968994140625,
-0.005035400390625,
0.0218353271484375,
-0.04608154296875,
0.01204681396484375,
0.004642486572265625,
0.0200653076171875,
-0.0007791519165039062,
0.03717041015625,
-0.004116058349609375,
-0.01314544677734375,
0.017486572265625,
-0.01447296142578125,
0.071533203125,
0.03607177734375,
-0.0169830322265625,
0.0097503662109375,
-0.061370849609375,
-0.006488800048828125,
0.003082275390625,
-0.035858154296875,
0.00079345703125,
0.0019989013671875,
0.032623291015625,
0.0077056884765625,
0.049041748046875,
-0.052886962890625,
0.03106689453125,
-0.0352783203125,
0.03582763671875,
0.0258636474609375,
0.004055023193359375,
0.03271484375,
-0.0165863037109375,
0.034210205078125,
0.0216522216796875,
0.004863739013671875,
-0.00554656982421875,
-0.0018529891967773438,
-0.07843017578125,
-0.0017213821411132812,
0.050537109375,
0.0273590087890625,
-0.06866455078125,
0.04766845703125,
-0.0191497802734375,
-0.0506591796875,
-0.029998779296875,
-0.00302886962890625,
0.01541900634765625,
0.0007710456848144531,
0.0208892822265625,
0.0141143798828125,
-0.05999755859375,
-0.07098388671875,
-0.0230560302734375,
-0.010986328125,
0.01207733154296875,
-0.0098876953125,
0.049224853515625,
-0.008575439453125,
0.07867431640625,
-0.032135009765625,
0.0029773712158203125,
-0.044036865234375,
0.044403076171875,
0.0193023681640625,
0.047149658203125,
0.043701171875,
-0.03173828125,
-0.0308380126953125,
-0.0291748046875,
-0.04345703125,
-0.030914306640625,
-0.003631591796875,
0.00507354736328125,
0.0237274169921875,
0.03863525390625,
-0.0396728515625,
0.01104736328125,
0.033782958984375,
-0.036163330078125,
0.054962158203125,
-0.014617919921875,
-0.01029205322265625,
-0.1417236328125,
0.029083251953125,
-0.0130615234375,
-0.02777099609375,
-0.0149078369140625,
0.0036716461181640625,
0.0033969879150390625,
-0.026123046875,
-0.03717041015625,
0.044403076171875,
-0.04071044921875,
-0.004932403564453125,
-0.0278167724609375,
-0.042144775390625,
-0.0209503173828125,
-0.0023860931396484375,
0.01250457763671875,
0.052581787109375,
0.03271484375,
-0.040557861328125,
0.047149658203125,
0.0190887451171875,
-0.025177001953125,
0.016357421875,
-0.05133056640625,
-0.01230621337890625,
0.00623321533203125,
0.0034580230712890625,
-0.050262451171875,
-0.0213623046875,
-0.0017385482788085938,
-0.0286712646484375,
0.0185394287109375,
-0.004230499267578125,
-0.061187744140625,
-0.04840087890625,
0.007415771484375,
0.0268402099609375,
0.04888916015625,
-0.03411865234375,
0.0306396484375,
0.013824462890625,
0.0021839141845703125,
-0.04840087890625,
-0.0421142578125,
0.0198974609375,
-0.014556884765625,
-0.00406646728515625,
0.00909423828125,
-0.009765625,
0.006977081298828125,
0.005657196044921875,
0.0212249755859375,
-0.02532958984375,
0.0038433074951171875,
-0.00141143798828125,
0.0019102096557617188,
-0.006439208984375,
0.0210418701171875,
0.0065460205078125,
-0.0190277099609375,
0.004474639892578125,
-0.0240936279296875,
0.059417724609375,
0.0000019073486328125,
-0.006011962890625,
-0.052947998046875,
0.0003552436828613281,
0.03436279296875,
-0.034515380859375,
0.038055419921875,
0.05718994140625,
-0.018402099609375,
-0.026123046875,
-0.034271240234375,
-0.02655029296875,
-0.030120849609375,
0.038604736328125,
-0.024169921875,
-0.0379638671875,
0.027191162109375,
-0.003589630126953125,
0.007610321044921875,
0.052734375,
0.052581787109375,
0.0122833251953125,
0.0743408203125,
0.054931640625,
-0.0236053466796875,
0.043212890625,
-0.0302886962890625,
0.01336669921875,
-0.0399169921875,
-0.00804901123046875,
-0.046142578125,
-0.01282501220703125,
-0.0296478271484375,
0.0024585723876953125,
0.0010385513305664062,
0.0035152435302734375,
-0.0382080078125,
0.05084228515625,
-0.044921875,
0.034423828125,
0.04156494140625,
-0.00913238525390625,
0.037384033203125,
0.0092315673828125,
-0.0091705322265625,
0.0029277801513671875,
-0.052337646484375,
-0.050933837890625,
0.0958251953125,
0.020416259765625,
0.07305908203125,
-0.0093536376953125,
0.068115234375,
0.005764007568359375,
0.035552978515625,
-0.064208984375,
0.024017333984375,
-0.0025730133056640625,
-0.07403564453125,
-0.005924224853515625,
-0.01508331298828125,
-0.07537841796875,
-0.0192718505859375,
-0.018280029296875,
-0.0426025390625,
-0.00856781005859375,
-0.0018415451049804688,
-0.019378662109375,
0.029876708984375,
-0.0305328369140625,
0.07232666015625,
-0.014434814453125,
0.00746917724609375,
-0.006389617919921875,
-0.0185394287109375,
0.03271484375,
-0.004795074462890625,
-0.00542449951171875,
0.0120697021484375,
-0.00791168212890625,
0.0472412109375,
-0.02386474609375,
0.0528564453125,
-0.0000054836273193359375,
0.0012159347534179688,
0.010101318359375,
-0.00916290283203125,
0.0030765533447265625,
0.0005173683166503906,
-0.01058197021484375,
-0.0013799667358398438,
-0.001911163330078125,
-0.01617431640625,
-0.0019521713256835938,
0.033294677734375,
-0.08343505859375,
-0.0218353271484375,
-0.033966064453125,
-0.04510498046875,
0.018707275390625,
0.04559326171875,
0.0567626953125,
0.0282440185546875,
-0.0406494140625,
0.0185394287109375,
0.0205230712890625,
-0.009002685546875,
0.07159423828125,
0.042938232421875,
-0.0261383056640625,
-0.03857421875,
0.060577392578125,
0.0186309814453125,
-0.0013751983642578125,
0.0254974365234375,
0.007015228271484375,
-0.0347900390625,
-0.0290374755859375,
-0.044952392578125,
0.01062774658203125,
-0.02349853515625,
-0.0240936279296875,
-0.061187744140625,
-0.041534423828125,
-0.025665283203125,
0.00860595703125,
-0.028472900390625,
-0.036651611328125,
-0.0265350341796875,
-0.0159149169921875,
0.0199127197265625,
0.058349609375,
-0.004749298095703125,
0.049102783203125,
-0.067138671875,
0.0188751220703125,
0.024017333984375,
0.024932861328125,
-0.005283355712890625,
-0.0313720703125,
-0.040008544921875,
-0.005252838134765625,
-0.054962158203125,
-0.0650634765625,
0.029296875,
0.023040771484375,
0.03778076171875,
0.04510498046875,
-0.0015192031860351562,
0.011199951171875,
-0.0386962890625,
0.07550048828125,
0.0160064697265625,
-0.0797119140625,
0.036376953125,
-0.037994384765625,
0.03363037109375,
0.02447509765625,
0.0098876953125,
-0.006893157958984375,
-0.044891357421875,
-0.08734130859375,
-0.04656982421875,
0.060821533203125,
0.02764892578125,
0.0118255615234375,
-0.0012159347534179688,
0.0120391845703125,
0.01019287109375,
0.0296630859375,
-0.09527587890625,
-0.006130218505859375,
-0.0428466796875,
-0.037689208984375,
-0.0011425018310546875,
-0.005344390869140625,
0.003692626953125,
-0.020355224609375,
0.070556640625,
0.0027256011962890625,
0.0241851806640625,
-0.004253387451171875,
-0.0283050537109375,
-0.00780487060546875,
0.0218353271484375,
0.037506103515625,
0.04608154296875,
-0.0114593505859375,
-0.00951385498046875,
0.0118865966796875,
-0.0251007080078125,
0.0194549560546875,
0.017059326171875,
-0.0261688232421875,
0.0216064453125,
0.0199737548828125,
0.063720703125,
0.00397491455078125,
-0.0151824951171875,
0.0252838134765625,
0.01508331298828125,
-0.01265716552734375,
-0.052581787109375,
0.00846099853515625,
0.022308349609375,
0.010650634765625,
0.041900634765625,
-0.0249481201171875,
0.01255035400390625,
-0.034027099609375,
0.01495361328125,
0.015899658203125,
-0.0157928466796875,
-0.028564453125,
0.0721435546875,
0.0187835693359375,
-0.027099609375,
0.040130615234375,
-0.01373291015625,
-0.0390625,
0.0634765625,
0.058685302734375,
0.07196044921875,
-0.02716064453125,
0.0182647705078125,
0.05792236328125,
0.0285797119140625,
-0.01514434814453125,
0.0298309326171875,
0.002796173095703125,
-0.0538330078125,
-0.038970947265625,
-0.06341552734375,
-0.01641845703125,
0.022979736328125,
-0.030487060546875,
0.0272064208984375,
-0.0212554931640625,
-0.028472900390625,
0.0037384033203125,
0.012359619140625,
-0.03338623046875,
0.036224365234375,
0.00785064697265625,
0.082275390625,
-0.057586669921875,
0.0869140625,
0.0791015625,
-0.0643310546875,
-0.079833984375,
0.005107879638671875,
-0.0439453125,
-0.048370361328125,
0.0771484375,
0.00489044189453125,
0.015869140625,
0.0506591796875,
-0.06829833984375,
-0.06256103515625,
0.0870361328125,
0.0128326416015625,
-0.023284912109375,
-0.01317596435546875,
0.0263671875,
0.0308685302734375,
-0.048614501953125,
0.0152435302734375,
0.0245361328125,
0.04217529296875,
-0.00463104248046875,
-0.07623291015625,
-0.01129913330078125,
-0.007801055908203125,
0.0183868408203125,
0.0015325546264648438,
-0.050018310546875,
0.07147216796875,
-0.007602691650390625,
0.006809234619140625,
0.033050537109375,
0.0399169921875,
0.0186309814453125,
-0.00011527538299560547,
0.029052734375,
0.06646728515625,
0.0189361572265625,
-0.01617431640625,
0.06005859375,
-0.0595703125,
0.06707763671875,
0.062469482421875,
0.00897216796875,
0.042144775390625,
0.031982421875,
-0.0005021095275878906,
0.054595947265625,
0.061492919921875,
-0.05316162109375,
0.035614013671875,
-0.005786895751953125,
-0.01369476318359375,
-0.036865234375,
0.004791259765625,
-0.032623291015625,
0.044403076171875,
0.01467132568359375,
-0.060516357421875,
-0.0306549072265625,
-0.00518035888671875,
0.0062408447265625,
-0.0306243896484375,
-0.03216552734375,
0.07073974609375,
0.01416778564453125,
-0.0489501953125,
0.037445068359375,
0.0228729248046875,
0.038543701171875,
-0.03082275390625,
0.01369476318359375,
0.00714111328125,
0.0209503173828125,
-0.013031005859375,
-0.0367431640625,
0.0238189697265625,
0.004718780517578125,
-0.01751708984375,
-0.00795745849609375,
0.055938720703125,
-0.05035400390625,
-0.0657958984375,
-0.00876617431640625,
0.0419921875,
0.0150604248046875,
0.0186309814453125,
-0.05975341796875,
-0.0155181884765625,
-0.00988006591796875,
-0.034912109375,
-0.00974273681640625,
0.01702880859375,
0.01416778564453125,
0.044647216796875,
0.0433349609375,
0.002094268798828125,
-0.0010995864868164062,
-0.004058837890625,
0.036651611328125,
-0.05133056640625,
-0.0528564453125,
-0.061126708984375,
0.037689208984375,
-0.023284912109375,
-0.01776123046875,
0.04510498046875,
0.0667724609375,
0.0452880859375,
-0.039886474609375,
0.0728759765625,
-0.0247955322265625,
0.0205230712890625,
-0.0269012451171875,
0.09698486328125,
-0.05657958984375,
-0.0221405029296875,
0.0096588134765625,
-0.086181640625,
-0.004283905029296875,
0.04534912109375,
-0.013580322265625,
0.00457763671875,
0.05645751953125,
0.0421142578125,
-0.03411865234375,
-0.0189361572265625,
0.02056884765625,
0.036376953125,
0.004665374755859375,
0.0243377685546875,
0.037017822265625,
-0.045745849609375,
0.036529541015625,
-0.0154571533203125,
0.01128387451171875,
-0.01218414306640625,
-0.07415771484375,
-0.07940673828125,
-0.03985595703125,
-0.020294189453125,
-0.03887939453125,
-0.01096343994140625,
0.0797119140625,
0.058013916015625,
-0.05438232421875,
-0.03662109375,
-0.01023101806640625,
0.0023517608642578125,
0.012115478515625,
-0.019500732421875,
0.0290374755859375,
-0.0439453125,
-0.08245849609375,
0.00396728515625,
-0.0236968994140625,
0.01148223876953125,
-0.030914306640625,
0.004497528076171875,
-0.00926971435546875,
0.0005435943603515625,
0.03094482421875,
0.00958251953125,
-0.04180908203125,
-0.0148162841796875,
-0.0224456787109375,
-0.0223236083984375,
0.00009918212890625,
0.042510986328125,
-0.0435791015625,
0.0138702392578125,
0.03900146484375,
0.012664794921875,
0.041046142578125,
-0.0004248619079589844,
0.07440185546875,
-0.046417236328125,
0.017120361328125,
-0.007171630859375,
0.0367431640625,
0.031402587890625,
-0.004638671875,
0.030029296875,
0.0311126708984375,
-0.03826904296875,
-0.040802001953125,
0.0086517333984375,
-0.060272216796875,
-0.0077667236328125,
0.0875244140625,
0.00385284423828125,
-0.02801513671875,
-0.0019178390502929688,
-0.019439697265625,
0.055511474609375,
-0.032135009765625,
0.058441162109375,
0.052825927734375,
0.022735595703125,
-0.034942626953125,
-0.0299835205078125,
0.037628173828125,
-0.0006580352783203125,
-0.051239013671875,
-0.0277252197265625,
0.0182647705078125,
0.0699462890625,
0.0175323486328125,
0.031982421875,
-0.01763916015625,
0.0195770263671875,
-0.006671905517578125,
0.054901123046875,
-0.025299072265625,
-0.00678253173828125,
-0.00933074951171875,
0.01261138916015625,
0.0093536376953125,
-0.01358795166015625
]
] |
codellama/CodeLlama-34b-Instruct-hf | 2023-10-27T18:12:23.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"llama-2",
"code",
"arxiv:2308.12950",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | codellama | null | null | codellama/CodeLlama-34b-Instruct-hf | 187 | 55,103 | transformers | 2023-08-24T16:58:22 | ---
language:
- code
pipeline_tag: text-generation
tags:
- llama-2
license: llama2
---
# **Code Llama**
Code Llama is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 34 billion parameters. This is the repository for the 34B instruct-tuned version in the Hugging Face Transformers format. This model is designed for general code synthesis and understanding. Links to other models can be found in the index at the bottom.
| | Base Model | Python | Instruct |
| --- | ----------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- |
| 7B | [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) | [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) | [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) |
| 13B | [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf) | [codellama/CodeLlama-13b-Python-hf](https://huggingface.co/codellama/CodeLlama-13b-Python-hf) | [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) |
| 34B | [codellama/CodeLlama-34b-hf](https://huggingface.co/codellama/CodeLlama-34b-hf) | [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf) | [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf) |
## Model Use
To use this model, please make sure to install transformers from `main` until the next version is released:
```bash
pip install git+https://github.com/huggingface/transformers.git@main accelerate
```
Model capabilities:
- [x] Code completion.
- [ ] Infilling.
- [x] Instructions / chat.
- [ ] Python specialist.
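For chat use, the Instruct variants expect prompts in the Llama 2 chat convention, wrapping the user turn in `[INST]` tags with an optional `<<SYS>>` system block. A minimal sketch of building such a prompt by hand (the `build_instruct_prompt` helper is illustrative, not part of the `transformers` API; recent `transformers` versions can also apply this template for you via the tokenizer's chat template):

```python
# Sketch of the Llama 2 / Code Llama - Instruct chat prompt format.
# `build_instruct_prompt` is an illustrative helper, not a transformers API.

def build_instruct_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user message (and optional system prompt) in [INST] tags."""
    if system_prompt:
        user_message = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_message}"
    return f"[INST] {user_message} [/INST]"

prompt = build_instruct_prompt(
    "Write a Python function that reverses a string.",
    system_prompt="Provide answers in Python.",
)
print(prompt)
```

The resulting string can be tokenized and passed to the model's `generate` method like any other prompt.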
## Model Details
*Note: Use of this model is governed by the Meta license.* Meta developed and publicly released the Code Llama family of large language models (LLMs).
**Model Developers** Meta
**Variations** Code Llama comes in three model sizes and three variants:
* Code Llama: base models designed for general code synthesis and understanding
* Code Llama - Python: designed specifically for Python
* Code Llama - Instruct: for instruction following and safer deployment
All variants are available in sizes of 7B, 13B and 34B parameters.
**This repository contains the Instruct version of the 34B parameters model.**
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Code Llama is an auto-regressive language model that uses an optimized transformer architecture.
**Model Dates** Code Llama and its variants have been trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of Code Llama - Instruct will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** More information can be found in the paper "[Code Llama: Open Foundation Models for Code](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/)" or its [arXiv page](https://arxiv.org/abs/2308.12950).
## Intended Use
**Intended Use Cases** Code Llama and its variants are intended for commercial and research use in English and relevant programming languages. The base model Code Llama can be adapted for a variety of code synthesis and understanding tasks, Code Llama - Python is designed specifically to handle the Python programming language, and Code Llama - Instruct is intended to be safer to use for code assistant and generation applications.
**Out-of-Scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Code Llama and its variants.
## Hardware and Software
**Training Factors** We used custom training libraries. The training and fine-tuning of the released models have been performed on Meta’s Research Super Cluster.
**Carbon Footprint** In aggregate, training all 9 Code Llama models required 400K GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 65.3 tCO2eq, 100% of which were offset by Meta’s sustainability program.
## Training Data
All experiments reported here and the released models have been trained and fine-tuned using the same data as Llama 2 with different weights (see Section 2 and Table 1 in the [research paper](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/) for details).
## Evaluation Results
See evaluations for the main models and detailed ablations in Section 3 and safety evaluations in Section 4 of the research paper.
## Ethical Considerations and Limitations
Code Llama and its variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Code Llama’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate or objectionable responses to user prompts. Therefore, before deploying any applications of Code Llama, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide](https://ai.meta.com/llama/responsible-use-guide).
[
-0.0288238525390625,
-0.04638671875,
0.0218353271484375,
0.041015625,
-0.017578125,
0.01262664794921875,
-0.006412506103515625,
-0.0474853515625,
0.017578125,
0.03826904296875,
-0.0298919677734375,
-0.041748046875,
-0.042999267578125,
0.023590087890625,
-0.037078857421875,
0.091064453125,
-0.00472259521484375,
-0.0231781005859375,
-0.0216827392578125,
0.00043129920959472656,
-0.0175323486328125,
-0.046783447265625,
-0.01297760009765625,
-0.0338134765625,
0.02398681640625,
0.021453857421875,
0.055023193359375,
0.04608154296875,
0.037353515625,
0.023590087890625,
-0.0234527587890625,
0.0009007453918457031,
-0.0227813720703125,
-0.0278167724609375,
0.0173187255859375,
-0.0445556640625,
-0.058135986328125,
-0.0018482208251953125,
0.02593994140625,
0.024932861328125,
-0.0233917236328125,
0.03265380859375,
-0.013458251953125,
0.03643798828125,
-0.0247344970703125,
0.0153045654296875,
-0.046905517578125,
-0.004085540771484375,
0.002391815185546875,
-0.006977081298828125,
-0.00801849365234375,
-0.04217529296875,
-0.0099029541015625,
-0.032196044921875,
-0.006877899169921875,
-0.0027141571044921875,
0.08251953125,
0.04150390625,
-0.02386474609375,
-0.0182647705078125,
-0.022003173828125,
0.059356689453125,
-0.07293701171875,
0.00160980224609375,
0.0299224853515625,
-0.0036945343017578125,
-0.01189422607421875,
-0.062286376953125,
-0.05548095703125,
-0.0280303955078125,
-0.00879669189453125,
-0.0027751922607421875,
-0.035369873046875,
0.00568389892578125,
0.0316162109375,
0.039337158203125,
-0.033477783203125,
0.01329803466796875,
-0.03216552734375,
-0.0171356201171875,
0.06781005859375,
0.00864410400390625,
0.032257080078125,
-0.018585205078125,
-0.0251007080078125,
-0.00150299072265625,
-0.06451416015625,
0.001743316650390625,
0.0372314453125,
-0.01074981689453125,
-0.058685302734375,
0.0556640625,
-0.0135040283203125,
0.043121337890625,
0.00342559814453125,
-0.0421142578125,
0.0391845703125,
-0.023712158203125,
-0.022613525390625,
-0.01181793212890625,
0.065673828125,
0.0374755859375,
0.028228759765625,
0.0030384063720703125,
-0.0188140869140625,
0.02410888671875,
0.01073455810546875,
-0.0609130859375,
-0.004528045654296875,
0.0227813720703125,
-0.045806884765625,
-0.050994873046875,
-0.02154541015625,
-0.0618896484375,
-0.008392333984375,
-0.004894256591796875,
0.008575439453125,
-0.01312255859375,
-0.029998779296875,
0.01605224609375,
0.007801055908203125,
0.034454345703125,
0.007354736328125,
-0.0653076171875,
0.003688812255859375,
0.036834716796875,
0.055908203125,
0.002567291259765625,
-0.035858154296875,
0.00261688232421875,
-0.01012420654296875,
-0.0264892578125,
0.04974365234375,
-0.035888671875,
-0.037689208984375,
-0.006877899169921875,
0.006877899169921875,
-0.0004382133483886719,
-0.0390625,
0.0171356201171875,
-0.02783203125,
-0.00197601318359375,
0.011077880859375,
-0.021270751953125,
-0.03387451171875,
0.0022792816162109375,
-0.042144775390625,
0.0853271484375,
0.02130126953125,
-0.048858642578125,
-0.0030803680419921875,
-0.04217529296875,
-0.028533935546875,
-0.0194091796875,
-0.0021343231201171875,
-0.049224853515625,
-0.0033779144287109375,
0.01424407958984375,
0.03741455078125,
-0.031402587890625,
0.033538818359375,
-0.0084991455078125,
-0.030670166015625,
0.016326904296875,
-0.0121612548828125,
0.07403564453125,
0.0267486572265625,
-0.03338623046875,
0.0167388916015625,
-0.06939697265625,
-0.009490966796875,
0.036529541015625,
-0.041229248046875,
0.0105438232421875,
-0.00994110107421875,
-0.001316070556640625,
-0.004413604736328125,
0.0413818359375,
-0.0198822021484375,
0.0413818359375,
-0.0288848876953125,
0.0557861328125,
0.04827880859375,
-0.0009660720825195312,
0.0298004150390625,
-0.044097900390625,
0.060577392578125,
-0.01364898681640625,
0.01448822021484375,
-0.0208892822265625,
-0.05609130859375,
-0.07550048828125,
-0.0223846435546875,
0.0014104843139648438,
0.052825927734375,
-0.036346435546875,
0.046539306640625,
0.0009374618530273438,
-0.05694580078125,
-0.03887939453125,
0.015777587890625,
0.040863037109375,
0.0200958251953125,
0.024627685546875,
-0.0061187744140625,
-0.059783935546875,
-0.0634765625,
0.00539398193359375,
-0.032073974609375,
0.006954193115234375,
0.0156097412109375,
0.063720703125,
-0.049560546875,
0.0572509765625,
-0.032989501953125,
-0.0000054836273193359375,
-0.0272064208984375,
-0.0207977294921875,
0.037811279296875,
0.04022216796875,
0.056304931640625,
-0.043426513671875,
-0.017608642578125,
0.0044097900390625,
-0.0635986328125,
-0.0087738037109375,
-0.015167236328125,
-0.003143310546875,
0.0307159423828125,
0.0247650146484375,
-0.048492431640625,
0.038055419921875,
0.06768798828125,
-0.0157623291015625,
0.04522705078125,
-0.01074981689453125,
-0.01171875,
-0.07818603515625,
0.015106201171875,
-0.01091766357421875,
-0.0014410018920898438,
-0.037109375,
0.028594970703125,
0.007266998291015625,
0.006855010986328125,
-0.038482666015625,
0.0250396728515625,
-0.028167724609375,
-0.002475738525390625,
-0.00926971435546875,
-0.017425537109375,
-0.003017425537109375,
0.057159423828125,
-0.0050811767578125,
0.07550048828125,
0.038818359375,
-0.048370361328125,
0.023162841796875,
0.0248870849609375,
-0.0301666259765625,
0.014739990234375,
-0.07061767578125,
0.0275115966796875,
0.00873565673828125,
0.0256500244140625,
-0.057403564453125,
-0.019989013671875,
0.0253143310546875,
-0.0330810546875,
0.00804901123046875,
-0.0023784637451171875,
-0.03717041015625,
-0.035491943359375,
-0.0185546875,
0.03326416015625,
0.0643310546875,
-0.046600341796875,
0.0301361083984375,
0.031097412109375,
0.00848388671875,
-0.053955078125,
-0.053924560546875,
0.00876617431640625,
-0.035675048828125,
-0.046142578125,
0.032196044921875,
-0.0234527587890625,
-0.0159912109375,
-0.0126495361328125,
0.00392913818359375,
-0.0010385513305664062,
0.02301025390625,
0.03472900390625,
0.0311126708984375,
-0.0092620849609375,
-0.015655517578125,
0.000988006591796875,
-0.007320404052734375,
0.00307464599609375,
0.01263427734375,
0.05706787109375,
-0.0305633544921875,
-0.0163726806640625,
-0.042022705078125,
0.01385498046875,
0.04437255859375,
-0.0211334228515625,
0.043975830078125,
0.0274505615234375,
-0.0283355712890625,
-0.0014963150024414062,
-0.04827880859375,
0.01120758056640625,
-0.040985107421875,
0.0225982666015625,
-0.01861572265625,
-0.06439208984375,
0.049468994140625,
0.00530242919921875,
0.014678955078125,
0.035369873046875,
0.0606689453125,
0.007617950439453125,
0.055389404296875,
0.07257080078125,
-0.03253173828125,
0.0302734375,
-0.039764404296875,
0.007343292236328125,
-0.060791015625,
-0.034637451171875,
-0.04791259765625,
-0.0018177032470703125,
-0.052337646484375,
-0.033721923828125,
0.0242156982421875,
0.0153961181640625,
-0.0379638671875,
0.05548095703125,
-0.058685302734375,
0.032470703125,
0.032989501953125,
0.0021305084228515625,
0.02935791015625,
0.0026607513427734375,
-0.00153350830078125,
0.0228118896484375,
-0.032745361328125,
-0.054046630859375,
0.09124755859375,
0.033843994140625,
0.06378173828125,
-0.0018444061279296875,
0.0635986328125,
0.005191802978515625,
0.0254974365234375,
-0.051971435546875,
0.04498291015625,
0.020416259765625,
-0.037109375,
0.0006604194641113281,
-0.016265869140625,
-0.068115234375,
0.010772705078125,
0.004856109619140625,
-0.060821533203125,
0.005084991455078125,
-0.00234222412109375,
-0.01788330078125,
0.0232086181640625,
-0.049224853515625,
0.044464111328125,
-0.0162506103515625,
0.003490447998046875,
-0.01456451416015625,
-0.0386962890625,
0.045318603515625,
-0.01091766357421875,
0.01678466796875,
-0.01058197021484375,
-0.015472412109375,
0.048980712890625,
-0.039276123046875,
0.08038330078125,
0.010772705078125,
-0.0361328125,
0.044891357421875,
-0.0011959075927734375,
0.03564453125,
0.0010175704956054688,
-0.0172576904296875,
0.052093505859375,
0.000049114227294921875,
-0.0140380859375,
-0.009124755859375,
0.047027587890625,
-0.07989501953125,
-0.056793212890625,
-0.03021240234375,
-0.035491943359375,
0.01953125,
0.01119232177734375,
0.0294036865234375,
0.003143310546875,
0.0133819580078125,
0.010162353515625,
0.0296783447265625,
-0.05224609375,
0.04681396484375,
0.0273590087890625,
-0.0211181640625,
-0.036346435546875,
0.061187744140625,
-0.0113372802734375,
0.01593017578125,
0.02093505859375,
0.0024585723876953125,
-0.00933074951171875,
-0.035308837890625,
-0.03106689453125,
0.0330810546875,
-0.047637939453125,
-0.042572021484375,
-0.045623779296875,
-0.0270538330078125,
-0.0247039794921875,
-0.02386474609375,
-0.021026611328125,
-0.019683837890625,
-0.050201416015625,
-0.0133514404296875,
0.058380126953125,
0.060455322265625,
0.003826141357421875,
0.0350341796875,
-0.0460205078125,
0.03448486328125,
0.00799560546875,
0.029510498046875,
0.0012960433959960938,
-0.036407470703125,
-0.0098114013671875,
-0.0020618438720703125,
-0.0396728515625,
-0.065185546875,
0.04498291015625,
0.0097198486328125,
0.04718017578125,
0.0098724365234375,
-0.003482818603515625,
0.050750732421875,
-0.033111572265625,
0.06939697265625,
0.0247650146484375,
-0.08245849609375,
0.04638671875,
-0.0191497802734375,
0.0032405853271484375,
0.00745391845703125,
0.0264892578125,
-0.031402587890625,
-0.02020263671875,
-0.04766845703125,
-0.05584716796875,
0.0455322265625,
0.012939453125,
0.0220489501953125,
0.0028076171875,
0.033599853515625,
-0.00115966796875,
0.0236358642578125,
-0.07940673828125,
-0.024261474609375,
-0.0236663818359375,
-0.0180511474609375,
-0.00691986083984375,
-0.02215576171875,
-0.0064239501953125,
-0.02166748046875,
0.033203125,
-0.0140228271484375,
0.04058837890625,
0.00904083251953125,
-0.0119781494140625,
-0.019805908203125,
0.004119873046875,
0.0513916015625,
0.043731689453125,
-0.0019283294677734375,
-0.0113067626953125,
0.02984619140625,
-0.0406494140625,
0.0174713134765625,
-0.0088958740234375,
-0.006256103515625,
-0.0234527587890625,
0.041900634765625,
0.048797607421875,
0.01097869873046875,
-0.06280517578125,
0.036376953125,
0.01183319091796875,
-0.0203399658203125,
-0.037933349609375,
0.02069091796875,
0.021881103515625,
0.026641845703125,
0.0187530517578125,
0.002231597900390625,
-0.0083160400390625,
-0.031890869140625,
-0.0005536079406738281,
0.026641845703125,
0.01381683349609375,
-0.026123046875,
0.06884765625,
0.007793426513671875,
-0.0276031494140625,
0.035797119140625,
0.005184173583984375,
-0.043212890625,
0.0880126953125,
0.0521240234375,
0.05682373046875,
-0.0152740478515625,
0.00884246826171875,
0.03472900390625,
0.041473388671875,
-0.00006002187728881836,
0.031890869140625,
0.00081634521484375,
-0.04022216796875,
-0.025634765625,
-0.06585693359375,
-0.0296630859375,
0.00815582275390625,
-0.035369873046875,
0.0229034423828125,
-0.04754638671875,
-0.0031108856201171875,
-0.0274200439453125,
0.00731658935546875,
-0.04742431640625,
-0.0004222393035888672,
0.00879669189453125,
0.0712890625,
-0.046905517578125,
0.06951904296875,
0.043914794921875,
-0.053436279296875,
-0.06787109375,
-0.013702392578125,
-0.00466156005859375,
-0.09259033203125,
0.035247802734375,
0.01995849609375,
0.00386810302734375,
0.006015777587890625,
-0.07098388671875,
-0.08184814453125,
0.095947265625,
0.03570556640625,
-0.039337158203125,
-0.0008172988891601562,
0.0146484375,
0.0428466796875,
-0.0264129638671875,
0.0290679931640625,
0.049072265625,
0.032379150390625,
-0.00799560546875,
-0.0906982421875,
0.0242767333984375,
-0.0294647216796875,
0.0166015625,
-0.02105712890625,
-0.07867431640625,
0.07781982421875,
-0.041778564453125,
-0.0101470947265625,
0.036346435546875,
0.04876708984375,
0.0406494140625,
0.0148468017578125,
0.0259857177734375,
0.042388916015625,
0.049163818359375,
0.00205230712890625,
0.088623046875,
-0.033966064453125,
0.0303955078125,
0.0384521484375,
-0.00884246826171875,
0.054351806640625,
0.030120849609375,
-0.04461669921875,
0.05633544921875,
0.057708740234375,
-0.016815185546875,
0.02130126953125,
0.0244903564453125,
-0.00524139404296875,
-0.0024051666259765625,
-0.007205963134765625,
-0.05633544921875,
0.0298004150390625,
0.024322509765625,
-0.0254669189453125,
0.005405426025390625,
-0.0160980224609375,
0.023590087890625,
-0.008819580078125,
-0.0055694580078125,
0.047698974609375,
0.017333984375,
-0.040802001953125,
0.08770751953125,
0.009918212890625,
0.07342529296875,
-0.039794921875,
-0.00879669189453125,
-0.0335693359375,
0.003658294677734375,
-0.0426025390625,
-0.039306640625,
0.0139617919921875,
0.0227203369140625,
0.0006232261657714844,
-0.00933837890625,
0.035064697265625,
-0.004787445068359375,
-0.037109375,
0.0290374755859375,
0.01313018798828125,
0.0204315185546875,
0.010498046875,
-0.0489501953125,
0.034210205078125,
0.0147552490234375,
-0.033843994140625,
0.0269775390625,
0.00775146484375,
0.0035457611083984375,
0.0703125,
0.05780029296875,
-0.01033782958984375,
0.01261138916015625,
-0.0092315673828125,
0.0850830078125,
-0.05224609375,
-0.025360107421875,
-0.059051513671875,
0.047576904296875,
0.0235137939453125,
-0.033721923828125,
0.0455322265625,
0.026214599609375,
0.061431884765625,
-0.00997161865234375,
0.060516357421875,
-0.01380157470703125,
0.005718231201171875,
-0.0360107421875,
0.04876708984375,
-0.05792236328125,
0.029266357421875,
-0.03759765625,
-0.069580078125,
-0.0233001708984375,
0.06402587890625,
-0.0030345916748046875,
0.00495147705078125,
0.0386962890625,
0.07452392578125,
0.0236358642578125,
-0.007091522216796875,
0.015533447265625,
0.01428985595703125,
0.0290679931640625,
0.059234619140625,
0.07391357421875,
-0.0433349609375,
0.05389404296875,
-0.04278564453125,
-0.0184326171875,
-0.0214385986328125,
-0.07415771484375,
-0.0732421875,
-0.037811279296875,
-0.0252685546875,
-0.028045654296875,
-0.0206451416015625,
0.0675048828125,
0.042022705078125,
-0.0443115234375,
-0.035675048828125,
-0.0110626220703125,
0.0306854248046875,
-0.00843048095703125,
-0.01514434814453125,
0.02105712890625,
-0.0093536376953125,
-0.0634765625,
0.0301666259765625,
-0.0027561187744140625,
0.01282501220703125,
-0.0250244140625,
-0.019134521484375,
-0.01007080078125,
0.0011920928955078125,
0.03546142578125,
0.0269012451171875,
-0.06256103515625,
-0.01467132568359375,
0.005901336669921875,
-0.01371002197265625,
0.008514404296875,
0.0318603515625,
-0.04730224609375,
-0.005481719970703125,
0.0245819091796875,
0.03326416015625,
0.0258026123046875,
-0.0173797607421875,
0.0176239013671875,
-0.0286865234375,
0.032562255859375,
-0.00006496906280517578,
0.037353515625,
0.00763702392578125,
-0.044891357421875,
0.052734375,
0.0188446044921875,
-0.05078125,
-0.06793212890625,
0.01163482666015625,
-0.0831298828125,
-0.015869140625,
0.09771728515625,
-0.00787353515625,
-0.02374267578125,
0.01428985595703125,
-0.0284881591796875,
0.0198974609375,
-0.0289306640625,
0.05322265625,
0.022674560546875,
-0.006275177001953125,
-0.011444091796875,
-0.031402587890625,
0.0208740234375,
0.01934814453125,
-0.0718994140625,
-0.012786865234375,
0.027923583984375,
0.0289764404296875,
0.01488494873046875,
0.05072021484375,
-0.009765625,
0.01210784912109375,
0.00402069091796875,
0.033935546875,
-0.00673675537109375,
-0.01702880859375,
-0.029388427734375,
-0.00496673583984375,
-0.00611114501953125,
-0.003345489501953125
]
] |
facebook/dpr-question_encoder-single-nq-base | 2022-12-21T15:20:10.000Z | [
"transformers",
"pytorch",
"tf",
"dpr",
"feature-extraction",
"en",
"dataset:nq_open",
"arxiv:2004.04906",
"arxiv:1702.08734",
"arxiv:1910.09700",
"license:cc-by-nc-4.0",
"has_space",
"region:us"
] | feature-extraction | facebook | null | null | facebook/dpr-question_encoder-single-nq-base | 19 | 55,072 | transformers | 2022-03-02T23:29:05 | ---
language: en
license: cc-by-nc-4.0
tags:
- dpr
datasets:
- nq_open
inference: false
---
# `dpr-question_encoder-single-nq-base`
## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation-results)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)
## Model Details
**Model Description:** [Dense Passage Retrieval (DPR)](https://github.com/facebookresearch/DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. `dpr-question_encoder-single-nq-base` is the question encoder trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)).
- **Developed by:** See [GitHub repo](https://github.com/facebookresearch/DPR) for model developers
- **Model Type:** BERT-based encoder
- **Language(s):** English
- **License:** [CC-BY-NC-4.0](https://github.com/facebookresearch/DPR/blob/main/LICENSE), also see [Code of Conduct](https://github.com/facebookresearch/DPR/blob/main/CODE_OF_CONDUCT.md)
- **Related Models:**
- [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base)
- [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base)
- [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base)
- [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base)
- [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base)
- **Resources for more information:**
- [Research Paper](https://arxiv.org/abs/2004.04906)
- [GitHub Repo](https://github.com/facebookresearch/DPR)
- [Hugging Face DPR docs](https://huggingface.co/docs/transformers/main/en/model_doc/dpr)
- [BERT Base Uncased Model Card](https://huggingface.co/bert-base-uncased)
## How to Get Started with the Model
Use the code below to get started with the model.
```python
from transformers import DPRQuestionEncoder, DPRQuestionEncoderTokenizer
tokenizer = DPRQuestionEncoderTokenizer.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
model = DPRQuestionEncoder.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
input_ids = tokenizer("Hello, is my dog cute?", return_tensors="pt")["input_ids"]
embeddings = model(input_ids).pooler_output
```
## Uses
#### Direct Use
`dpr-question_encoder-single-nq-base`, [`dpr-ctx_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base), and [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base) can be used for the task of open-domain question answering.
#### Misuse and Out-of-scope Use
The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.
## Risks, Limitations and Biases
**CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al., 2021](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al., 2021](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Training
#### Training Data
This model was trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)). The model authors write that:
> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.
#### Training Procedure
The training procedure is described in the [associated paper](https://arxiv.org/pdf/2004.04906.pdf):
> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.
> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vector and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages of which vectors are the closest to the question vector.
The authors report that for encoders, they used two independent BERT ([Devlin et al., 2019](https://aclanthology.org/N19-1423/)) networks (base, un-cased) and use FAISS ([Johnson et al., 2017](https://arxiv.org/abs/1702.08734)) during inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.
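The run-time retrieval step described above — scoring every indexed passage by inner product with the question vector and keeping the top k — can be sketched in plain NumPy. This is an illustration only: the toy vectors stand in for encoder outputs, and the function name is ours, not part of the DPR or FAISS codebase.

```python
import numpy as np

def top_k_passages(question_vec, passage_vecs, k):
    # Score each passage by inner product with the question vector,
    # as DPR does at run-time, and return the indices of the k best.
    scores = passage_vecs @ question_vec
    return np.argsort(-scores)[:k]

# Toy 3-dimensional "embeddings" standing in for encoder outputs.
passages = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.9, 0.1, 0.0],
])
question = np.array([1.0, 0.0, 0.0])
print(top_k_passages(question, passages, k=2))  # → [0 2]
```

In the real system, FAISS replaces the brute-force matrix product with an approximate nearest-neighbor index over the M passage vectors.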
## Evaluation
The following evaluation information is extracted from the [associated paper](https://arxiv.org/pdf/2004.04906.pdf).
#### Testing Data, Factors and Metrics
The model developers report the performance of the model on five QA datasets, using the top-k accuracy (k ∈ {20, 100}). The datasets were [NQ](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), [CuratedTREC (TREC)](https://huggingface.co/datasets/trec), and [SQuAD v1.1](https://huggingface.co/datasets/squad).
#### Results
| | Top 20 | | | | | Top 100| | | | |
|:----:|:------:|:---------:|:--:|:----:|:-----:|:------:|:---------:|:--:|:----:|:-----:|
| | NQ | TriviaQA | WQ | TREC | SQuAD | NQ | TriviaQA | WQ | TREC | SQuAD |
| | 78.4 | 79.4 |73.2| 79.8 | 63.2 | 85.4 | 85.0 |81.4| 89.1 | 77.2 |
## Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type below, based on the [associated paper](https://arxiv.org/abs/2004.04906).
- **Hardware Type:** 8 32GB GPUs
- **Hours used:** Unknown
- **Cloud Provider:** Unknown
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown
## Technical Specifications
See the [associated paper](https://arxiv.org/abs/2004.04906) for details on the modeling architecture, objective, compute infrastructure, and training details.
## Citation Information
```bibtex
@inproceedings{karpukhin-etal-2020-dense,
title = "Dense Passage Retrieval for Open-Domain Question Answering",
author = "Karpukhin, Vladimir and Oguz, Barlas and Min, Sewon and Lewis, Patrick and Wu, Ledell and Edunov, Sergey and Chen, Danqi and Yih, Wen-tau",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.emnlp-main.550",
doi = "10.18653/v1/2020.emnlp-main.550",
pages = "6769--6781",
}
```
## Model Card Authors
This model card was written by the team at Hugging Face. | 8,178 | [
[
-0.044769287109375,
-0.072021484375,
0.0227203369140625,
0.01145172119140625,
-0.01023101806640625,
-0.0029811859130859375,
-0.00795745849609375,
-0.02069091796875,
0.004779815673828125,
0.031890869140625,
-0.052520751953125,
-0.0296630859375,
-0.034393310546875,
0.021331787109375,
-0.0277099609375,
0.06573486328125,
0.003231048583984375,
0.0028553009033203125,
-0.032073974609375,
-0.00797271728515625,
-0.008941650390625,
-0.049713134765625,
-0.046112060546875,
-0.0041656494140625,
0.0229034423828125,
0.00606536865234375,
0.047698974609375,
0.0275421142578125,
0.042755126953125,
0.0205841064453125,
-0.0305938720703125,
0.0121307373046875,
-0.043121337890625,
-0.0131683349609375,
0.004314422607421875,
-0.0152587890625,
-0.035308837890625,
-0.0029926300048828125,
0.05328369140625,
0.04132080078125,
-0.0070037841796875,
0.0223388671875,
0.005916595458984375,
0.053131103515625,
-0.03411865234375,
-0.0006208419799804688,
-0.033172607421875,
0.002887725830078125,
0.01221466064453125,
-0.002628326416015625,
-0.0233154296875,
-0.0386962890625,
0.0011053085327148438,
-0.03912353515625,
0.021636962890625,
0.004299163818359375,
0.0836181640625,
0.02239990234375,
-0.0265655517578125,
-0.0217437744140625,
-0.034149169921875,
0.05487060546875,
-0.0694580078125,
0.039398193359375,
0.0294189453125,
0.014556884765625,
-0.00015938282012939453,
-0.050079345703125,
-0.07220458984375,
-0.0108489990234375,
-0.0112762451171875,
0.01448822021484375,
-0.01165771484375,
0.00394439697265625,
0.0301361083984375,
0.038787841796875,
-0.054595947265625,
-0.00608062744140625,
-0.0283660888671875,
-0.00589752197265625,
0.0699462890625,
0.01558685302734375,
0.01824951171875,
-0.035369873046875,
-0.0272979736328125,
-0.022491455078125,
-0.0196990966796875,
0.0251312255859375,
0.0253753662109375,
0.019805908203125,
-0.0243377685546875,
0.040069580078125,
-0.01558685302734375,
0.05609130859375,
0.025665283203125,
-0.01103973388671875,
0.039398193359375,
-0.04559326171875,
-0.004726409912109375,
-0.0191192626953125,
0.072998046875,
0.02532958984375,
0.009765625,
-0.0035343170166015625,
-0.0131378173828125,
-0.0220184326171875,
0.0089569091796875,
-0.0738525390625,
-0.007617950439453125,
0.042449951171875,
-0.0307159423828125,
-0.01099395751953125,
0.0058746337890625,
-0.0640869140625,
-0.012451171875,
-0.004703521728515625,
0.0338134765625,
-0.037261962890625,
-0.0304107666015625,
0.032745361328125,
-0.02642822265625,
0.038909912109375,
0.0166168212890625,
-0.04559326171875,
0.0263519287109375,
0.035552978515625,
0.0504150390625,
-0.00047278404235839844,
-0.0065155029296875,
-0.00887298583984375,
-0.0207366943359375,
-0.00225830078125,
0.03997802734375,
-0.0311737060546875,
-0.0153350830078125,
-0.0013513565063476562,
0.01334381103515625,
-0.0163726806640625,
-0.032318115234375,
0.04693603515625,
-0.049713134765625,
0.0287322998046875,
-0.04107666015625,
-0.051666259765625,
-0.01788330078125,
0.034423828125,
-0.054290771484375,
0.093994140625,
0.0090484619140625,
-0.06964111328125,
0.00736236572265625,
-0.043243408203125,
-0.00848388671875,
-0.00321197509765625,
-0.005489349365234375,
-0.0299224853515625,
-0.0208587646484375,
0.036712646484375,
0.033416748046875,
-0.019561767578125,
0.021331787109375,
-0.0225830078125,
-0.036376953125,
0.02923583984375,
-0.02191162109375,
0.0966796875,
0.006069183349609375,
-0.0128021240234375,
-0.0189361572265625,
-0.051239013671875,
0.0023174285888671875,
0.034088134765625,
-0.0247955322265625,
-0.006626129150390625,
-0.0190277099609375,
0.003025054931640625,
0.0261688232421875,
0.027557373046875,
-0.06463623046875,
0.005878448486328125,
-0.02301025390625,
0.037689208984375,
0.042388916015625,
0.0197601318359375,
0.0305633544921875,
-0.032257080078125,
0.045166015625,
0.0029144287109375,
0.0254669189453125,
0.0090179443359375,
-0.041839599609375,
-0.050323486328125,
-0.01468658447265625,
0.0282135009765625,
0.04730224609375,
-0.059173583984375,
0.044769287109375,
-0.0229644775390625,
-0.04547119140625,
-0.04815673828125,
-0.0069122314453125,
0.036376953125,
0.042999267578125,
0.0364990234375,
-0.0067138671875,
-0.034698486328125,
-0.060699462890625,
-0.0004487037658691406,
-0.0146026611328125,
0.00673675537109375,
0.048980712890625,
0.0604248046875,
-0.0053863525390625,
0.07086181640625,
-0.043792724609375,
-0.007259368896484375,
-0.0281524658203125,
-0.015625,
0.021331787109375,
0.036285400390625,
0.052459716796875,
-0.08184814453125,
-0.041900634765625,
-0.036834716796875,
-0.059173583984375,
0.015625,
-0.0010442733764648438,
-0.0183563232421875,
0.01123809814453125,
0.03125,
-0.053375244140625,
0.0260162353515625,
0.0263671875,
-0.0222625732421875,
0.03399658203125,
0.003223419189453125,
0.012786865234375,
-0.07574462890625,
0.0147705078125,
0.0017862319946289062,
0.010894775390625,
-0.045501708984375,
0.00254058837890625,
0.007808685302734375,
-0.00646209716796875,
-0.038360595703125,
0.0521240234375,
-0.024688720703125,
0.005451202392578125,
0.012298583984375,
0.0177001953125,
0.029327392578125,
0.060272216796875,
0.005584716796875,
0.052032470703125,
0.0219573974609375,
-0.054412841796875,
0.0157012939453125,
0.05712890625,
-0.0184783935546875,
0.019439697265625,
-0.065185546875,
0.0277862548828125,
-0.033416748046875,
0.02313232421875,
-0.07861328125,
-0.005603790283203125,
0.0236053466796875,
-0.0601806640625,
0.0209197998046875,
-0.0006413459777832031,
-0.0528564453125,
-0.047882080078125,
-0.0182952880859375,
0.030181884765625,
0.041748046875,
-0.033294677734375,
0.030914306640625,
0.0252685546875,
-0.00193023681640625,
-0.063232421875,
-0.058990478515625,
-0.02154541015625,
0.00014030933380126953,
-0.05548095703125,
0.03924560546875,
-0.0218353271484375,
-0.0030345916748046875,
0.0193023681640625,
0.0024127960205078125,
-0.029144287109375,
-0.0009088516235351562,
0.0010890960693359375,
0.00948333740234375,
-0.00563812255859375,
0.020294189453125,
-0.006969451904296875,
0.0224761962890625,
0.00927734375,
0.01032257080078125,
0.04339599609375,
-0.01861572265625,
-0.01441192626953125,
-0.0243682861328125,
0.02276611328125,
0.01812744140625,
-0.031280517578125,
0.0635986328125,
0.047149658203125,
-0.034759521484375,
-0.005451202392578125,
-0.046844482421875,
-0.028533935546875,
-0.03717041015625,
0.038238525390625,
-0.0191802978515625,
-0.08026123046875,
0.0535888671875,
0.03265380859375,
0.006084442138671875,
0.04632568359375,
0.02801513671875,
-0.0086212158203125,
0.07281494140625,
0.03460693359375,
0.0070343017578125,
0.03717041015625,
-0.03753662109375,
0.01477813720703125,
-0.06707763671875,
-0.024322509765625,
-0.0352783203125,
-0.024688720703125,
-0.0450439453125,
-0.03399658203125,
0.016937255859375,
0.00809478759765625,
-0.040374755859375,
0.021759033203125,
-0.05181884765625,
0.01654052734375,
0.0394287109375,
0.0267333984375,
0.007732391357421875,
-0.00537109375,
-0.002666473388671875,
-0.00872802734375,
-0.0682373046875,
-0.026336669921875,
0.087890625,
0.03509521484375,
0.039764404296875,
0.0007853507995605469,
0.05889892578125,
0.01024627685546875,
-0.004024505615234375,
-0.037017822265625,
0.055267333984375,
-0.0106658935546875,
-0.07330322265625,
-0.0263519287109375,
-0.0426025390625,
-0.06597900390625,
0.01165771484375,
-0.0131683349609375,
-0.035919189453125,
0.042755126953125,
-0.0084381103515625,
-0.04864501953125,
0.0279388427734375,
-0.03375244140625,
0.07623291015625,
-0.033233642578125,
-0.02911376953125,
0.010650634765625,
-0.049591064453125,
0.02569580078125,
0.00836944580078125,
0.004352569580078125,
-0.0027446746826171875,
-0.004039764404296875,
0.0654296875,
-0.01502227783203125,
0.06268310546875,
-0.0322265625,
0.01111602783203125,
0.04486083984375,
-0.021759033203125,
0.0162811279296875,
0.010223388671875,
-0.0182952880859375,
0.0226593017578125,
0.01374053955078125,
-0.0267791748046875,
-0.040924072265625,
0.0254974365234375,
-0.0706787109375,
-0.024566650390625,
-0.0372314453125,
-0.032135009765625,
-0.00658416748046875,
0.00720977783203125,
0.019775390625,
0.037933349609375,
-0.0166473388671875,
0.023895263671875,
0.0706787109375,
-0.05279541015625,
0.02349853515625,
0.040069580078125,
-0.006450653076171875,
-0.03314208984375,
0.052520751953125,
0.01195526123046875,
0.01432037353515625,
0.046478271484375,
-0.002628326416015625,
-0.041107177734375,
-0.0400390625,
-0.03131103515625,
0.026641845703125,
-0.05950927734375,
-0.020965576171875,
-0.072998046875,
-0.045196533203125,
-0.045684814453125,
0.01259613037109375,
-0.029449462890625,
-0.0284576416015625,
-0.02862548828125,
-0.01258087158203125,
0.040252685546875,
0.0333251953125,
0.004604339599609375,
0.010162353515625,
-0.0528564453125,
0.031524658203125,
0.0167694091796875,
0.023956298828125,
-0.001983642578125,
-0.059906005859375,
-0.0177154541015625,
0.0247955322265625,
-0.0213775634765625,
-0.0638427734375,
0.034820556640625,
0.0223388671875,
0.05267333984375,
0.00379180908203125,
0.032928466796875,
0.043914794921875,
-0.01361846923828125,
0.06512451171875,
-0.016754150390625,
-0.04180908203125,
0.037109375,
-0.018890380859375,
0.01824951171875,
0.058837890625,
0.05255126953125,
-0.031463623046875,
-0.00862884521484375,
-0.044830322265625,
-0.06475830078125,
0.0482177734375,
0.01468658447265625,
0.0177459716796875,
-0.022430419921875,
0.04779052734375,
-0.0130462646484375,
0.026885986328125,
-0.06695556640625,
-0.02264404296875,
-0.0173492431640625,
-0.0183868408203125,
0.004024505615234375,
-0.022979736328125,
0.0022220611572265625,
-0.042327880859375,
0.04931640625,
-0.00457763671875,
0.055419921875,
0.048370361328125,
-0.013397216796875,
0.00914764404296875,
0.01259613037109375,
0.02215576171875,
0.03485107421875,
-0.044464111328125,
-0.0274810791015625,
0.00580596923828125,
-0.038421630859375,
-0.0035343170166015625,
0.0302581787109375,
-0.02630615234375,
-0.0010843276977539062,
0.017547607421875,
0.06268310546875,
0.0096588134765625,
-0.05828857421875,
0.051513671875,
-0.01678466796875,
-0.03997802734375,
-0.04241943359375,
-0.016357421875,
0.0012044906616210938,
0.0200653076171875,
0.02520751953125,
-0.0229644775390625,
0.016021728515625,
-0.03009033203125,
0.018035888671875,
0.029449462890625,
-0.028289794921875,
-0.007537841796875,
0.049072265625,
0.01226806640625,
-0.004207611083984375,
0.0692138671875,
-0.041229248046875,
-0.05157470703125,
0.06103515625,
0.021484375,
0.0626220703125,
0.004123687744140625,
0.02142333984375,
0.06597900390625,
0.038055419921875,
0.0028820037841796875,
0.047271728515625,
0.005840301513671875,
-0.07391357421875,
-0.02606201171875,
-0.0615234375,
-0.0257110595703125,
0.01103973388671875,
-0.061187744140625,
-0.0000730752944946289,
-0.033203125,
-0.0169525146484375,
-0.0147552490234375,
0.0166168212890625,
-0.070068359375,
0.0164794921875,
0.0042724609375,
0.0775146484375,
-0.054046630859375,
0.037689208984375,
0.054107666015625,
-0.059234619140625,
-0.05596923828125,
-0.0020847320556640625,
-0.0180511474609375,
-0.051239013671875,
0.04730224609375,
0.016510009765625,
0.0277862548828125,
0.009033203125,
-0.047607421875,
-0.07135009765625,
0.08990478515625,
0.0151214599609375,
-0.034576416015625,
-0.01291656494140625,
0.0243377685546875,
0.040771484375,
-0.024658203125,
0.03729248046875,
0.032562255859375,
0.0279541015625,
0.00397491455078125,
-0.06341552734375,
0.0213775634765625,
-0.03753662109375,
-0.01499176025390625,
-0.00888824462890625,
-0.06683349609375,
0.08062744140625,
-0.01418304443359375,
-0.0198822021484375,
-0.0033206939697265625,
0.036712646484375,
0.02801513671875,
0.0177154541015625,
0.031829833984375,
0.057891845703125,
0.054412841796875,
-0.01163482666015625,
0.087646484375,
-0.0312347412109375,
0.02801513671875,
0.0689697265625,
-0.00923919677734375,
0.07574462890625,
0.020721435546875,
-0.02557373046875,
0.037994384765625,
0.0572509765625,
-0.0120849609375,
0.0406494140625,
0.0061492919921875,
-0.0013523101806640625,
-0.0210418701171875,
-0.005794525146484375,
-0.03863525390625,
0.028106689453125,
0.016326904296875,
-0.02587890625,
0.002048492431640625,
-0.0023193359375,
-0.01114654541015625,
0.0022563934326171875,
-0.01177978515625,
0.056549072265625,
0.002803802490234375,
-0.0435791015625,
0.06585693359375,
-0.00814056396484375,
0.06292724609375,
-0.039642333984375,
0.0005388259887695312,
-0.02349853515625,
0.018646240234375,
-0.00830078125,
-0.06463623046875,
0.01715087890625,
-0.00856781005859375,
-0.0143890380859375,
-0.019439697265625,
0.048858642578125,
-0.0293731689453125,
-0.04595947265625,
0.028656005859375,
0.052093505859375,
0.00984954833984375,
-0.0206146240234375,
-0.08868408203125,
0.0006818771362304688,
0.002777099609375,
-0.028228759765625,
0.021453857421875,
0.02752685546875,
0.02105712890625,
0.0533447265625,
0.03271484375,
-0.0253448486328125,
0.01094818115234375,
0.0037403106689453125,
0.07598876953125,
-0.0595703125,
-0.02508544921875,
-0.04327392578125,
0.050201416015625,
-0.0131378173828125,
-0.0318603515625,
0.06884765625,
0.047393798828125,
0.0760498046875,
0.0031719207763671875,
0.0631103515625,
-0.021087646484375,
0.0465087890625,
-0.022735595703125,
0.05279541015625,
-0.060699462890625,
0.00707244873046875,
-0.0212249755859375,
-0.05841064453125,
0.006374359130859375,
0.04736328125,
-0.0214996337890625,
0.027313232421875,
0.0408935546875,
0.06866455078125,
0.00815582275390625,
0.006988525390625,
0.00018513202667236328,
0.0128936767578125,
0.006744384765625,
0.050140380859375,
0.052276611328125,
-0.06060791015625,
0.053955078125,
-0.046844482421875,
-0.008453369140625,
-0.00794219970703125,
-0.042327880859375,
-0.07470703125,
-0.045867919921875,
-0.03759765625,
-0.03802490234375,
0.004009246826171875,
0.054473876953125,
0.03485107421875,
-0.0496826171875,
-0.00719451904296875,
0.00026297569274902344,
0.0001316070556640625,
-0.0223846435546875,
-0.0208587646484375,
0.03265380859375,
-0.00254058837890625,
-0.0518798828125,
0.0062103271484375,
-0.01171875,
0.0018968582153320312,
-0.020904541015625,
-0.019378662109375,
-0.04412841796875,
0.004764556884765625,
0.0345458984375,
0.019195556640625,
-0.04693603515625,
-0.0140228271484375,
0.039764404296875,
-0.012054443359375,
0.003421783447265625,
0.0157623291015625,
-0.04364013671875,
0.02239990234375,
0.0517578125,
0.059326171875,
0.050018310546875,
0.0101776123046875,
0.0194854736328125,
-0.0560302734375,
0.00250244140625,
0.043426513671875,
0.0182647705078125,
0.032501220703125,
-0.034576416015625,
0.04327392578125,
0.01345062255859375,
-0.048980712890625,
-0.07061767578125,
-0.00424957275390625,
-0.08392333984375,
-0.0185546875,
0.09906005859375,
-0.005977630615234375,
-0.0233612060546875,
-0.002429962158203125,
-0.00894927978515625,
0.0171356201171875,
-0.033477783203125,
0.038726806640625,
0.0477294921875,
-0.00820159912109375,
-0.0309906005859375,
-0.056793212890625,
0.03204345703125,
0.0214996337890625,
-0.043731689453125,
-0.0151214599609375,
0.0290374755859375,
0.02423095703125,
0.004749298095703125,
0.06561279296875,
-0.0169677734375,
0.0113372802734375,
0.0104522705078125,
0.023101806640625,
-0.0136566162109375,
0.005828857421875,
-0.0272979736328125,
0.00600433349609375,
-0.0199127197265625,
-0.01340484619140625
]
] |
lvwerra/gpt2-imdb | 2021-05-23T08:38:34.000Z | [
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | lvwerra | null | null | lvwerra/gpt2-imdb | 13 | 54,623 | transformers | 2022-03-02T23:29:05 | # GPT2-IMDB
## What is it?
A GPT2 (`gpt2`) language model fine-tuned on the [IMDB dataset](https://www.kaggle.com/lakshmi25npathi/imdb-dataset-of-50k-movie-reviews).
## Training setting
The GPT2 language model was fine-tuned for 1 epoch on the IMDB dataset. All comments were joined into a single text file separated by the EOS token:
```python
import pandas as pd

df = pd.read_csv("imdb-dataset.csv")
imdb_str = " <|endoftext|> ".join(df['review'].tolist())

with open('imdb.txt', 'w') as f:
    f.write(imdb_str)
```
To train the model, the `run_language_modeling.py` script in the `transformers` library was used:
```bash
python run_language_modeling.py \
    --train_data_file imdb.txt \
    --output_dir gpt2-imdb \
    --model_type gpt2 \
    --model_name_or_path gpt2
```
| 758 | [
[
-0.01983642578125,
-0.048980712890625,
0.00344085693359375,
-0.0012664794921875,
-0.039276123046875,
-0.0085601806640625,
-0.01210784912109375,
-0.004497528076171875,
-0.00867462158203125,
0.037872314453125,
-0.052703857421875,
-0.01248931884765625,
-0.054962158203125,
0.008056640625,
-0.0355224609375,
0.1256103515625,
-0.013763427734375,
0.0202484130859375,
0.01221466064453125,
0.0085906982421875,
0.0036563873291015625,
-0.0267181396484375,
-0.049713134765625,
-0.0185394287109375,
-0.0120849609375,
0.01551055908203125,
0.057281494140625,
0.03546142578125,
0.023040771484375,
0.01215362548828125,
0.003971099853515625,
-0.003444671630859375,
-0.04541015625,
-0.010345458984375,
-0.01360321044921875,
-0.034088134765625,
-0.01483917236328125,
0.007785797119140625,
0.057373046875,
0.0193939208984375,
-0.01544952392578125,
0.029144287109375,
-0.0012655258178710938,
0.040985107421875,
-0.006534576416015625,
0.03607177734375,
-0.03375244140625,
0.00927734375,
-0.001247406005859375,
0.012298583984375,
-0.0070648193359375,
-0.01526641845703125,
0.01132965087890625,
-0.036834716796875,
0.0390625,
0.003803253173828125,
0.091796875,
0.0244293212890625,
-0.021697998046875,
-0.006977081298828125,
-0.051788330078125,
0.0413818359375,
-0.0513916015625,
0.00484466552734375,
0.0305023193359375,
0.0160064697265625,
0.01165771484375,
-0.048614501953125,
-0.0626220703125,
-0.0182342529296875,
-0.0007500648498535156,
-0.00707244873046875,
-0.0053558349609375,
-0.004497528076171875,
0.034820556640625,
0.0292510986328125,
-0.05010986328125,
0.00244140625,
-0.05938720703125,
-0.0301361083984375,
0.0146484375,
0.00800323486328125,
0.0004353523254394531,
-0.0087890625,
-0.026611328125,
-0.01995849609375,
-0.0384521484375,
0.0013980865478515625,
0.039093017578125,
0.001674652099609375,
-0.0235748291015625,
0.0631103515625,
-0.0239715576171875,
0.0255889892578125,
-0.002765655517578125,
-0.0027008056640625,
0.029205322265625,
-0.024749755859375,
-0.0147857666015625,
-0.002246856689453125,
0.062225341796875,
0.05841064453125,
0.03997802734375,
0.0022487640380859375,
-0.0233001708984375,
0.0100860595703125,
0.0157012939453125,
-0.0743408203125,
-0.034027099609375,
0.0235137939453125,
-0.0184478759765625,
-0.0240020751953125,
0.00885772705078125,
-0.049652099609375,
-0.0087127685546875,
-0.0195465087890625,
0.041046142578125,
-0.025848388671875,
-0.0396728515625,
-0.0294952392578125,
-0.0147857666015625,
0.039703369140625,
0.006153106689453125,
-0.0908203125,
0.0194854736328125,
0.04498291015625,
0.06939697265625,
0.01247406005859375,
-0.037017822265625,
-0.017242431640625,
0.0159759521484375,
-0.01403045654296875,
0.06097412109375,
-0.0197296142578125,
-0.02728271484375,
-0.0035877227783203125,
0.03033447265625,
-0.0072021484375,
-0.035430908203125,
0.037384033203125,
-0.0220794677734375,
0.039154052734375,
0.00615692138671875,
-0.0222625732421875,
-0.0171966552734375,
0.02752685546875,
-0.047454833984375,
0.09136962890625,
0.052642822265625,
-0.04681396484375,
0.03277587890625,
-0.04638671875,
-0.02197265625,
-0.0157928466796875,
-0.007305145263671875,
-0.056060791015625,
0.0000864267349243164,
0.0215301513671875,
0.00687408447265625,
-0.02972412109375,
0.03594970703125,
-0.00742340087890625,
-0.03369140625,
0.00789642333984375,
-0.0264129638671875,
0.031158447265625,
0.0318603515625,
-0.0303192138671875,
-0.004222869873046875,
-0.04803466796875,
0.00005519390106201172,
0.01470947265625,
-0.040740966796875,
0.0211029052734375,
-0.00919342041015625,
0.03704833984375,
0.027008056640625,
0.0170135498046875,
-0.040374755859375,
0.0274200439453125,
-0.0258636474609375,
0.039337158203125,
0.0460205078125,
-0.0138092041015625,
0.017913818359375,
-0.0157470703125,
0.0357666015625,
0.026580810546875,
0.00958251953125,
-0.00843048095703125,
-0.036590576171875,
-0.0517578125,
0.012969970703125,
0.0037860870361328125,
0.056793212890625,
-0.046142578125,
0.0231170654296875,
-0.02197265625,
-0.037078857421875,
-0.050018310546875,
-0.005580902099609375,
0.0204010009765625,
0.018218994140625,
0.020050048828125,
-0.0322265625,
-0.045379638671875,
-0.0657958984375,
-0.026519775390625,
-0.0287628173828125,
-0.0072479248046875,
0.0096282958984375,
0.051971435546875,
-0.036529541015625,
0.0721435546875,
-0.03173828125,
-0.03271484375,
-0.0230712890625,
0.0045013427734375,
0.044677734375,
0.039886474609375,
0.01392364501953125,
-0.03173828125,
-0.04559326171875,
-0.0182037353515625,
-0.05145263671875,
0.0018892288208007812,
-0.006801605224609375,
0.009429931640625,
0.0176544189453125,
0.0273895263671875,
-0.04132080078125,
0.001651763916015625,
0.0279998779296875,
-0.035430908203125,
0.05352783203125,
-0.0254669189453125,
0.004299163818359375,
-0.08856201171875,
-0.005847930908203125,
0.003925323486328125,
0.002819061279296875,
-0.026275634765625,
-0.01457977294921875,
-0.0016698837280273438,
-0.0216522216796875,
-0.01983642578125,
0.037078857421875,
-0.0263824462890625,
0.0038700103759765625,
-0.013641357421875,
-0.006160736083984375,
-0.007537841796875,
0.050018310546875,
0.005615234375,
0.062286376953125,
0.06634521484375,
-0.0278167724609375,
0.0416259765625,
0.029327392578125,
-0.038482666015625,
0.0274505615234375,
-0.04779052734375,
0.024383544921875,
-0.0098419189453125,
0.0112152099609375,
-0.07293701171875,
-0.0160675048828125,
0.0216064453125,
-0.02252197265625,
0.03460693359375,
-0.039764404296875,
-0.04266357421875,
-0.0189971923828125,
-0.023895263671875,
0.01035308837890625,
0.038055419921875,
-0.032470703125,
0.02520751953125,
0.043670654296875,
-0.0207977294921875,
-0.040008544921875,
-0.049407958984375,
0.00821685791015625,
-0.01407623291015625,
-0.03253173828125,
0.00894927978515625,
0.00423431396484375,
0.0013990402221679688,
-0.006855010986328125,
-0.0013866424560546875,
-0.0174560546875,
-0.0283203125,
0.0156707763671875,
0.0167083740234375,
-0.00821685791015625,
-0.008270263671875,
0.0158233642578125,
-0.027740478515625,
0.006755828857421875,
-0.01605224609375,
0.03802490234375,
-0.021331787109375,
0.009429931640625,
-0.01947021484375,
0.0085296630859375,
0.034393310546875,
0.006153106689453125,
0.06683349609375,
0.07476806640625,
-0.009063720703125,
-0.01396942138671875,
-0.023406982421875,
-0.0168304443359375,
-0.029937744140625,
0.0750732421875,
-0.029296875,
-0.0662841796875,
0.020111083984375,
-0.01276397705078125,
-0.01250457763671875,
0.058013916015625,
0.05047607421875,
0.000484466552734375,
0.06256103515625,
0.04486083984375,
-0.0180816650390625,
0.040283203125,
-0.040313720703125,
0.0166778564453125,
-0.07281494140625,
-0.0191802978515625,
-0.0291900634765625,
0.00807952880859375,
-0.05047607421875,
-0.0189666748046875,
0.0243988037109375,
0.0272064208984375,
-0.037994384765625,
0.040802001953125,
-0.0374755859375,
0.0411376953125,
0.0243988037109375,
-0.005176544189453125,
0.00782012939453125,
0.029998779296875,
-0.025543212890625,
0.00975799560546875,
-0.023162841796875,
-0.0616455078125,
0.10211181640625,
0.04022216796875,
0.07379150390625,
0.01446533203125,
0.0205535888671875,
0.01200103759765625,
0.01320648193359375,
-0.03717041015625,
0.0223388671875,
-0.01189422607421875,
-0.059783935546875,
-0.0309600830078125,
-0.00989532470703125,
-0.035797119140625,
0.0106048583984375,
0.01374053955078125,
-0.055694580078125,
-0.0044403076171875,
-0.002422332763671875,
-0.0018472671508789062,
0.0211181640625,
-0.06512451171875,
0.079345703125,
-0.004718780517578125,
-0.0260467529296875,
-0.02093505859375,
-0.052764892578125,
0.0127105712890625,
-0.01605224609375,
0.0099945068359375,
-0.0069122314453125,
0.005279541015625,
0.06756591796875,
-0.040802001953125,
0.05218505859375,
-0.0195465087890625,
0.00907135009765625,
0.03411865234375,
0.004703521728515625,
0.0545654296875,
0.0123443603515625,
-0.00933074951171875,
0.035308837890625,
0.018707275390625,
-0.005695343017578125,
-0.0029239654541015625,
0.05218505859375,
-0.08477783203125,
-0.019805908203125,
-0.049835205078125,
-0.0290679931640625,
0.01248931884765625,
0.013641357421875,
0.031646728515625,
0.032684326171875,
-0.020263671875,
-0.0011577606201171875,
0.0226287841796875,
-0.01222991943359375,
0.039947509765625,
0.042877197265625,
-0.0171356201171875,
-0.045379638671875,
0.052398681640625,
0.0168914794921875,
0.0156097412109375,
0.018707275390625,
-0.00998687744140625,
-0.05523681640625,
-0.02069091796875,
-0.05670166015625,
0.028594970703125,
-0.06268310546875,
0.0174713134765625,
-0.047943115234375,
-0.017120361328125,
-0.031982421875,
0.0413818359375,
-0.0199737548828125,
-0.0372314453125,
-0.0162353515625,
-0.023284912109375,
0.03936767578125,
0.055999755859375,
-0.0081939697265625,
0.04022216796875,
-0.04766845703125,
0.022674560546875,
0.00933074951171875,
0.0579833984375,
-0.028472900390625,
-0.0712890625,
-0.0538330078125,
-0.0055389404296875,
-0.0361328125,
-0.0548095703125,
0.04296875,
0.038787841796875,
0.03460693359375,
0.0122528076171875,
-0.0197601318359375,
0.0163116455078125,
-0.033782958984375,
0.07476806640625,
-0.0084991455078125,
-0.04681396484375,
0.041229248046875,
-0.0238037109375,
0.0183563232421875,
0.03082275390625,
0.0220794677734375,
-0.0357666015625,
-0.01050567626953125,
-0.041168212890625,
-0.061065673828125,
0.08251953125,
0.02545166015625,
0.0125274658203125,
0.0021686553955078125,
0.00742340087890625,
0.0112762451171875,
0.0280303955078125,
-0.07989501953125,
-0.016143798828125,
-0.038787841796875,
-0.0000032186508178710938,
-0.0235595703125,
-0.0242767333984375,
-0.01476287841796875,
-0.0277099609375,
0.077880859375,
-0.0167999267578125,
0.036376953125,
0.01290130615234375,
-0.019744873046875,
-0.0310516357421875,
0.0195465087890625,
0.0309295654296875,
0.03564453125,
-0.04296875,
-0.0063629150390625,
-0.00041365623474121094,
-0.05462646484375,
0.0006933212280273438,
0.01318359375,
-0.00826263427734375,
0.016021728515625,
0.00396728515625,
0.09027099609375,
-0.026458740234375,
-0.031707763671875,
0.0250244140625,
-0.0103912353515625,
-0.0022716522216796875,
-0.045379638671875,
0.0008225440979003906,
0.0007748603820800781,
0.007518768310546875,
0.031890869140625,
0.01232147216796875,
-0.00998687744140625,
-0.04949951171875,
0.017974853515625,
0.023773193359375,
-0.037994384765625,
-0.0174560546875,
0.062103271484375,
-0.00733184814453125,
-0.017486572265625,
0.07904052734375,
-0.01079559326171875,
-0.0347900390625,
0.048858642578125,
0.043060302734375,
0.0887451171875,
-0.0030345916748046875,
0.006862640380859375,
0.05841064453125,
0.04803466796875,
-0.0180816650390625,
0.0215606689453125,
0.0005040168762207031,
-0.0626220703125,
-0.017669677734375,
-0.07757568359375,
-0.039886474609375,
0.01776123046875,
-0.0271759033203125,
0.035003662109375,
-0.0258636474609375,
-0.00616455078125,
-0.0208892822265625,
-0.00838470458984375,
-0.0384521484375,
0.0303192138671875,
-0.00140380859375,
0.04541015625,
-0.079833984375,
0.0968017578125,
0.03424072265625,
-0.0545654296875,
-0.0694580078125,
-0.004573822021484375,
-0.0174102783203125,
-0.04449462890625,
0.03887939453125,
0.035430908203125,
0.0290374755859375,
0.0282745361328125,
-0.03826904296875,
-0.06414794921875,
0.075439453125,
0.0131378173828125,
-0.045501708984375,
0.0108184814453125,
0.0273590087890625,
0.049102783203125,
-0.031524658203125,
0.056549072265625,
0.060882568359375,
0.02386474609375,
-0.01514434814453125,
-0.0859375,
-0.0126800537109375,
-0.04443359375,
0.00024259090423583984,
0.012939453125,
-0.042572021484375,
0.07562255859375,
-0.04052734375,
0.0037746429443359375,
0.00392913818359375,
0.027435302734375,
0.019317626953125,
0.0110626220703125,
0.029144287109375,
0.052276611328125,
0.02740478515625,
-0.03753662109375,
0.081298828125,
-0.03399658203125,
0.0743408203125,
0.06402587890625,
-0.0085296630859375,
0.044891357421875,
0.03271484375,
-0.053314208984375,
0.040802001953125,
0.06842041015625,
-0.048248291015625,
0.060638427734375,
0.006450653076171875,
-0.01000213623046875,
0.01263427734375,
0.011474609375,
-0.040802001953125,
0.00435638427734375,
0.0221405029296875,
-0.04022216796875,
-0.015594482421875,
-0.0127410888671875,
0.006420135498046875,
-0.0341796875,
-0.010467529296875,
0.06085205078125,
0.0017499923706054688,
-0.06463623046875,
0.051971435546875,
0.01031494140625,
0.04827880859375,
-0.0528564453125,
-0.0100250244140625,
-0.00879669189453125,
0.024810791015625,
-0.016815185546875,
-0.059722900390625,
0.020111083984375,
0.0051116943359375,
-0.03973388671875,
0.0139617919921875,
0.0166168212890625,
-0.05401611328125,
-0.044677734375,
0.012939453125,
0.007160186767578125,
0.0458984375,
-0.0102081298828125,
-0.0753173828125,
-0.0168304443359375,
0.0171661376953125,
-0.03314208984375,
0.028350830078125,
0.0214080810546875,
-0.006267547607421875,
0.033538818359375,
0.06378173828125,
-0.0015869140625,
-0.00024771690368652344,
0.0266265869140625,
0.057342529296875,
-0.049407958984375,
-0.061279296875,
-0.06634521484375,
0.040802001953125,
0.0087127685546875,
-0.04083251953125,
0.039093017578125,
0.06805419921875,
0.073974609375,
-0.045318603515625,
0.06317138671875,
-0.02899169921875,
0.046112060546875,
-0.0278472900390625,
0.0540771484375,
-0.0088958740234375,
0.0251617431640625,
-0.02032470703125,
-0.09100341796875,
0.0105133056640625,
0.051239013671875,
-0.0184326171875,
0.0238189697265625,
0.064697265625,
0.09765625,
0.01227569580078125,
-0.00485992431640625,
0.002460479736328125,
0.038665771484375,
0.036285400390625,
0.0269012451171875,
0.043701171875,
-0.060211181640625,
0.041839599609375,
-0.021484375,
-0.01300048828125,
-0.0016965866088867188,
-0.0576171875,
-0.048370361328125,
-0.036865234375,
-0.025970458984375,
-0.048828125,
-0.0088958740234375,
0.0657958984375,
0.031951904296875,
-0.0753173828125,
-0.031890869140625,
-0.00508880615234375,
-0.01364898681640625,
-0.01885986328125,
-0.0203704833984375,
0.042755126953125,
0.002948760986328125,
-0.06719970703125,
0.016998291015625,
0.004947662353515625,
0.0035400390625,
-0.0193939208984375,
-0.0204620361328125,
-0.01398468017578125,
-0.018798828125,
0.01910400390625,
0.003803253173828125,
-0.04766845703125,
0.002422332763671875,
-0.0362548828125,
-0.0114898681640625,
0.0043487548828125,
0.0250091552734375,
-0.049835205078125,
0.049957275390625,
0.023406982421875,
0.00098419189453125,
0.061798095703125,
-0.0111083984375,
0.04656982421875,
-0.04351806640625,
0.0113525390625,
-0.0189056396484375,
0.04058837890625,
0.032958984375,
-0.037445068359375,
0.057342529296875,
0.034515380859375,
-0.05242919921875,
-0.048614501953125,
-0.011474609375,
-0.07330322265625,
-0.0008959770202636719,
0.0938720703125,
0.0033130645751953125,
-0.0011272430419921875,
-0.006656646728515625,
-0.037994384765625,
0.0345458984375,
-0.041229248046875,
0.047210693359375,
0.057403564453125,
0.005931854248046875,
-0.0153656005859375,
-0.046234130859375,
0.047149658203125,
0.0084991455078125,
-0.03326416015625,
-0.011260986328125,
0.01264190673828125,
0.059173583984375,
0.0038509368896484375,
0.0323486328125,
-0.020599365234375,
0.01264190673828125,
0.00727081298828125,
0.0119171142578125,
-0.0208740234375,
-0.0062103271484375,
-0.04339599609375,
0.005809783935546875,
-0.0004935264587402344,
-0.002960205078125
]
] |
google/t5-v1_1-base | 2023-01-24T16:52:30.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"en",
"dataset:c4",
"arxiv:2002.05202",
"arxiv:1910.10683",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/t5-v1_1-base | 42 | 54,601 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- c4
license: apache-2.0
---
[Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) Version 1.1
## Version 1.1
[T5 Version 1.1](https://github.com/google-research/text-to-text-transfer-transformer/blob/master/released_checkpoints.md#t511) includes the following improvements compared to the original T5 model:
- GEGLU activation in the feed-forward hidden layer, rather than ReLU - see [here](https://arxiv.org/abs/2002.05202).
- Dropout was turned off in pre-training (quality win). Dropout should be re-enabled during fine-tuning.
- Pre-trained on C4 only without mixing in the downstream tasks.
- No parameter sharing between the embedding and classifier layers.
- "xl" and "xxl" replace "3B" and "11B". The model shapes are a bit different - larger `d_model` and smaller `num_heads` and `d_ff`.
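To make the first change concrete, GEGLU gates a GELU-transformed projection with a second linear projection. A minimal scalar sketch follows; real feed-forward layers use two learned weight matrices rather than the illustrative scalar weights used here:

```python
import math

def gelu(x):
    # tanh approximation of GELU, commonly used in T5 implementations
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

def geglu(x, w_gate, w_lin):
    # GEGLU(x) = GELU(x * W_gate) multiplied elementwise by (x * W_lin);
    # the scalars w_gate and w_lin stand in for the two projection matrices
    return gelu(x * w_gate) * (x * w_lin)
```

With `w_lin = 0` the unit outputs zero regardless of input, which illustrates the gating behaviour; in the actual block both projections are learned jointly during pre-training.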
**Note**: T5 Version 1.1 was only pre-trained on C4, excluding any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
Pretraining Dataset: [C4](https://huggingface.co/datasets/c4)
Other Community Checkpoints: [here](https://huggingface.co/models?search=t5-v1_1)
Paper: [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910.10683.pdf)
Authors: *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu*
## Abstract
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new “Colossal Clean Crawled Corpus”, we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.
 | 2,671 | [
[
-0.021514892578125,
-0.026824951171875,
0.0296478271484375,
0.015838623046875,
-0.01560211181640625,
0.01049041748046875,
-0.017913818359375,
-0.05303955078125,
-0.01245880126953125,
0.033538818359375,
-0.052734375,
-0.043731689453125,
-0.070068359375,
0.0150604248046875,
-0.04840087890625,
0.09722900390625,
-0.0142364501953125,
-0.01439666748046875,
0.0011568069458007812,
-0.003143310546875,
-0.02667236328125,
-0.032501220703125,
-0.0640869140625,
-0.0270538330078125,
0.02862548828125,
0.027984619140625,
0.0205078125,
0.0275115966796875,
0.05364990234375,
0.01299285888671875,
-0.0006847381591796875,
-0.00643157958984375,
-0.0501708984375,
-0.029083251953125,
-0.026519775390625,
-0.0144500732421875,
-0.036346435546875,
0.007007598876953125,
0.0452880859375,
0.054107666015625,
0.0020923614501953125,
0.0189361572265625,
0.02545166015625,
0.0450439453125,
-0.052764892578125,
0.01415252685546875,
-0.045318603515625,
0.01520538330078125,
-0.007648468017578125,
0.0008487701416015625,
-0.050506591796875,
-0.015106201171875,
0.039337158203125,
-0.056610107421875,
0.02545166015625,
-0.00891876220703125,
0.091796875,
0.0273895263671875,
-0.037628173828125,
-0.019134521484375,
-0.048614501953125,
0.06646728515625,
-0.045166015625,
0.028717041015625,
0.00936126708984375,
0.02838134765625,
0.01160430908203125,
-0.0894775390625,
-0.034149169921875,
-0.002300262451171875,
-0.0089874267578125,
0.004276275634765625,
-0.02166748046875,
-0.003589630126953125,
0.007724761962890625,
0.03607177734375,
-0.035430908203125,
0.0159912109375,
-0.05010986328125,
-0.019500732421875,
0.036773681640625,
-0.0180816650390625,
0.02362060546875,
0.0012073516845703125,
-0.04913330078125,
-0.0191497802734375,
-0.040496826171875,
0.00789642333984375,
-0.01666259765625,
0.02447509765625,
-0.0257415771484375,
-0.00704193115234375,
-0.001544952392578125,
0.04852294921875,
0.01123046875,
-0.004730224609375,
0.0265655517578125,
-0.047943115234375,
-0.0174407958984375,
-0.01568603515625,
0.06573486328125,
0.01360321044921875,
0.0216064453125,
-0.03228759765625,
-0.0016126632690429688,
-0.02130126953125,
0.03228759765625,
-0.0731201171875,
-0.033538818359375,
-0.006038665771484375,
-0.02801513671875,
-0.03814697265625,
0.007663726806640625,
-0.0457763671875,
-0.004093170166015625,
-0.0189208984375,
0.0411376953125,
-0.04302978515625,
-0.0204620361328125,
0.026336669921875,
0.0027332305908203125,
0.032135009765625,
0.040679931640625,
-0.07843017578125,
0.035430908203125,
0.03607177734375,
0.06304931640625,
-0.044830322265625,
-0.0254364013671875,
-0.041748046875,
-0.0005087852478027344,
-0.01019287109375,
0.06097412109375,
-0.024200439453125,
-0.017578125,
-0.00641632080078125,
0.01290130615234375,
-0.0190277099609375,
-0.0232086181640625,
0.060211181640625,
-0.031280517578125,
0.043243408203125,
-0.0206451416015625,
-0.034454345703125,
-0.039031982421875,
0.0128021240234375,
-0.0511474609375,
0.0755615234375,
0.013580322265625,
-0.044586181640625,
0.035491943359375,
-0.0655517578125,
-0.033233642578125,
-0.01319122314453125,
0.0274658203125,
-0.030731201171875,
-0.0174102783203125,
0.025177001953125,
0.042938232421875,
-0.007663726806640625,
0.005176544189453125,
-0.017242431640625,
-0.021331787109375,
-0.01262664794921875,
-0.00449371337890625,
0.06805419921875,
0.022308349609375,
-0.0233917236328125,
0.00403594970703125,
-0.047119140625,
0.013824462890625,
-0.001132965087890625,
-0.0231781005859375,
0.01021575927734375,
-0.0224151611328125,
0.01142120361328125,
0.03173828125,
0.0215606689453125,
-0.0249481201171875,
0.0200042724609375,
-0.0194854736328125,
0.040191650390625,
0.040313720703125,
-0.014495849609375,
0.065673828125,
-0.031829833984375,
0.038299560546875,
0.004119873046875,
0.004665374755859375,
-0.01183319091796875,
-0.0167083740234375,
-0.055908203125,
-0.00887298583984375,
0.050567626953125,
0.05364990234375,
-0.051177978515625,
0.043060302734375,
-0.04205322265625,
-0.039459228515625,
-0.04559326171875,
0.0062103271484375,
0.02801513671875,
0.047882080078125,
0.05712890625,
-0.01934814453125,
-0.04278564453125,
-0.0411376953125,
-0.02056884765625,
0.004276275634765625,
-0.006744384765625,
-0.002216339111328125,
0.0362548828125,
-0.0158233642578125,
0.05841064453125,
-0.0233306884765625,
-0.041229248046875,
-0.04461669921875,
0.01358795166015625,
-0.004100799560546875,
0.046051025390625,
0.05230712890625,
-0.04534912109375,
-0.040496826171875,
0.007045745849609375,
-0.058837890625,
-0.01340484619140625,
-0.01204681396484375,
-0.0068206787109375,
0.02410888671875,
0.04425048828125,
-0.0196533203125,
0.0238494873046875,
0.06298828125,
-0.0183868408203125,
0.02716064453125,
-0.0106048583984375,
0.0011653900146484375,
-0.11767578125,
0.029388427734375,
0.00342559814453125,
-0.038177490234375,
-0.05615234375,
-0.0009441375732421875,
0.0205078125,
0.00640106201171875,
-0.0433349609375,
0.046661376953125,
-0.036956787109375,
0.005565643310546875,
-0.0198822021484375,
0.01395416259765625,
-0.0004887580871582031,
0.03948974609375,
-0.00878143310546875,
0.060333251953125,
0.03607177734375,
-0.060699462890625,
-0.006076812744140625,
0.0323486328125,
-0.01508331298828125,
0.0099029541015625,
-0.045196533203125,
0.0321044921875,
0.0004792213439941406,
0.0345458984375,
-0.06591796875,
0.0201263427734375,
0.0310211181640625,
-0.04437255859375,
0.042816162109375,
-0.00971221923828125,
-0.015289306640625,
-0.01496124267578125,
-0.0267486572265625,
0.02154541015625,
0.049041748046875,
-0.04730224609375,
0.040191650390625,
0.010650634765625,
0.0018157958984375,
-0.052093505859375,
-0.055999755859375,
0.01480865478515625,
-0.019500732421875,
-0.047882080078125,
0.0640869140625,
0.0013141632080078125,
0.0183258056640625,
-0.00428009033203125,
-0.006114959716796875,
-0.02130126953125,
0.0166778564453125,
-0.01192474365234375,
0.02001953125,
-0.0022296905517578125,
0.007762908935546875,
0.0107574462890625,
-0.021392822265625,
-0.00283050537109375,
-0.034881591796875,
0.022064208984375,
-0.01031494140625,
0.015777587890625,
-0.04083251953125,
0.0010242462158203125,
0.02349853515625,
-0.020111083984375,
0.05548095703125,
0.06951904296875,
-0.0188446044921875,
-0.023468017578125,
-0.0204925537109375,
-0.0158233642578125,
-0.034515380859375,
0.032257080078125,
-0.037750244140625,
-0.07611083984375,
0.0310821533203125,
-0.0171051025390625,
0.02294921875,
0.05224609375,
0.006557464599609375,
-0.0029144287109375,
0.0489501953125,
0.082275390625,
-0.02520751953125,
0.050018310546875,
-0.033721923828125,
0.0206451416015625,
-0.06732177734375,
-0.012054443359375,
-0.05078125,
-0.022308349609375,
-0.04840087890625,
-0.0215911865234375,
0.0036220550537109375,
0.019317626953125,
-0.0137786865234375,
0.040252685546875,
-0.029144287109375,
0.027099609375,
0.01482391357421875,
0.0123138427734375,
0.029815673828125,
0.00727081298828125,
0.0027751922607421875,
-0.01439666748046875,
-0.05889892578125,
-0.037445068359375,
0.09197998046875,
0.0238800048828125,
0.03863525390625,
0.0067901611328125,
0.048797607421875,
0.032745361328125,
0.033050537109375,
-0.0560302734375,
0.033905029296875,
-0.034576416015625,
-0.020843505859375,
-0.02703857421875,
-0.034576416015625,
-0.08697509765625,
0.0225830078125,
-0.0372314453125,
-0.05438232421875,
-0.010498046875,
0.0011262893676757812,
-0.0088348388671875,
0.037689208984375,
-0.060699462890625,
0.0782470703125,
0.00484466552734375,
-0.0152130126953125,
-0.0008726119995117188,
-0.0574951171875,
0.019287109375,
-0.0105743408203125,
-0.00341796875,
0.007396697998046875,
-0.00495147705078125,
0.055389404296875,
-0.0166168212890625,
0.051055908203125,
-0.0092010498046875,
-0.00653839111328125,
0.0004086494445800781,
0.00023424625396728516,
0.039764404296875,
-0.029815673828125,
-0.00012922286987304688,
0.0245361328125,
-0.0014925003051757812,
-0.042327880859375,
-0.037109375,
0.0328369140625,
-0.060211181640625,
-0.0237884521484375,
-0.0229644775390625,
-0.0212249755859375,
-0.00466156005859375,
0.02606201171875,
0.03521728515625,
0.013641357421875,
-0.015869140625,
0.0264129638671875,
0.05389404296875,
-0.01132965087890625,
0.043975830078125,
0.026641845703125,
-0.02130126953125,
-0.0060272216796875,
0.053009033203125,
-0.00017762184143066406,
0.037445068359375,
0.046417236328125,
0.007579803466796875,
-0.02801513671875,
-0.058837890625,
-0.037322998046875,
0.01505279541015625,
-0.047332763671875,
-0.00991058349609375,
-0.06072998046875,
-0.0309600830078125,
-0.045166015625,
-0.01015472412109375,
-0.034759521484375,
-0.0219268798828125,
-0.038177490234375,
-0.01934814453125,
0.01093292236328125,
0.050933837890625,
0.0097198486328125,
0.0169525146484375,
-0.07977294921875,
0.00847625732421875,
0.004730224609375,
0.01751708984375,
-0.0031909942626953125,
-0.07550048828125,
-0.011627197265625,
0.0019550323486328125,
-0.02862548828125,
-0.050933837890625,
0.03619384765625,
0.030426025390625,
0.0296478271484375,
0.01239013671875,
0.006320953369140625,
0.0390625,
-0.028778076171875,
0.058441162109375,
0.016876220703125,
-0.0892333984375,
0.030242919921875,
-0.0233154296875,
0.029693603515625,
0.058563232421875,
0.0418701171875,
-0.034576416015625,
-0.007965087890625,
-0.051910400390625,
-0.050262451171875,
0.05938720703125,
0.0133056640625,
-0.005077362060546875,
0.037384033203125,
0.0229644775390625,
0.026702880859375,
-0.004184722900390625,
-0.0693359375,
-0.01085662841796875,
-0.0115203857421875,
-0.01538848876953125,
-0.0113983154296875,
0.00762176513671875,
0.031005859375,
-0.0287017822265625,
0.0443115234375,
-0.0150604248046875,
0.0239715576171875,
0.0247955322265625,
-0.038421630859375,
0.01367950439453125,
0.0181732177734375,
0.043243408203125,
0.05816650390625,
-0.018280029296875,
-0.006458282470703125,
0.035888671875,
-0.04913330078125,
-0.002925872802734375,
0.01580810546875,
-0.01062774658203125,
-0.005817413330078125,
0.033233642578125,
0.0634765625,
0.02447509765625,
-0.0187225341796875,
0.043365478515625,
-0.0104217529296875,
-0.048614501953125,
-0.01113128662109375,
0.0048828125,
-0.0079193115234375,
-0.00569915771484375,
0.02716064453125,
0.01922607421875,
0.0234375,
-0.032745361328125,
0.00927734375,
0.0051422119140625,
-0.038055419921875,
-0.040069580078125,
0.047149658203125,
0.029998779296875,
-0.0113372802734375,
0.058563232421875,
-0.01953125,
-0.0426025390625,
0.0297393798828125,
0.043121337890625,
0.077392578125,
-0.007328033447265625,
0.02606201171875,
0.045806884765625,
0.0269317626953125,
-0.0115203857421875,
-0.00812530517578125,
-0.018096923828125,
-0.061187744140625,
-0.06317138671875,
-0.03363037109375,
-0.03594970703125,
0.01119232177734375,
-0.050872802734375,
0.034576416015625,
-0.0247802734375,
0.0151824951171875,
-0.0006289482116699219,
0.0144500732421875,
-0.062164306640625,
0.0156402587890625,
0.0115509033203125,
0.0716552734375,
-0.0582275390625,
0.0794677734375,
0.0535888671875,
-0.0222015380859375,
-0.0650634765625,
0.0037097930908203125,
-0.0247650146484375,
-0.047119140625,
0.03240966796875,
0.0226593017578125,
-0.0127716064453125,
0.0169219970703125,
-0.05072021484375,
-0.072021484375,
0.09930419921875,
0.036376953125,
-0.0259857177734375,
-0.0214080810546875,
0.006275177001953125,
0.0389404296875,
-0.0240478515625,
0.0135040283203125,
0.045684814453125,
0.028533935546875,
0.01971435546875,
-0.09429931640625,
0.0201568603515625,
-0.0196075439453125,
-0.0093231201171875,
0.016693115234375,
-0.039947509765625,
0.05328369140625,
-0.024200439453125,
-0.0258331298828125,
-0.00106048583984375,
0.0552978515625,
0.0018253326416015625,
0.0185699462890625,
0.040802001953125,
0.057830810546875,
0.061431884765625,
-0.01508331298828125,
0.08819580078125,
-0.0036830902099609375,
0.034912109375,
0.07952880859375,
-0.0010557174682617188,
0.062744140625,
0.0241241455078125,
-0.0203857421875,
0.044830322265625,
0.04156494140625,
0.00974273681640625,
0.043212890625,
0.00353240966796875,
-0.00437164306640625,
-0.006195068359375,
0.00860595703125,
-0.033111572265625,
0.0256195068359375,
0.01288604736328125,
-0.0237884521484375,
-0.032745361328125,
0.00439453125,
0.016204833984375,
-0.006443023681640625,
-0.0142364501953125,
0.072509765625,
0.0063018798828125,
-0.050018310546875,
0.047637939453125,
-0.0056915283203125,
0.0723876953125,
-0.04412841796875,
-0.0003199577331542969,
-0.0221099853515625,
0.0163116455078125,
-0.0184783935546875,
-0.05401611328125,
0.032470703125,
-0.006839752197265625,
-0.00946807861328125,
-0.050811767578125,
0.07330322265625,
-0.0237274169921875,
-0.0178985595703125,
0.030548095703125,
0.04022216796875,
0.0186004638671875,
-0.0096435546875,
-0.055694580078125,
-0.017120361328125,
0.020263671875,
-0.00724029541015625,
0.036376953125,
0.03631591796875,
0.00536346435546875,
0.050872802734375,
0.043792724609375,
-0.00179290771484375,
0.01073455810546875,
0.0034198760986328125,
0.05389404296875,
-0.054840087890625,
-0.038726806640625,
-0.044403076171875,
0.03662109375,
-0.004337310791015625,
-0.0399169921875,
0.046783447265625,
0.030181884765625,
0.0885009765625,
-0.00971221923828125,
0.05816650390625,
-0.0016937255859375,
0.041595458984375,
-0.0458984375,
0.04852294921875,
-0.039642333984375,
0.00713348388671875,
-0.024993896484375,
-0.06378173828125,
-0.025146484375,
0.039581298828125,
-0.0233917236328125,
0.016510009765625,
0.07403564453125,
0.03680419921875,
-0.006534576416015625,
0.001007080078125,
0.0182342529296875,
-0.0007863044738769531,
0.038177490234375,
0.062255859375,
0.041015625,
-0.06719970703125,
0.06610107421875,
-0.0168914794921875,
-0.004730224609375,
-0.00531005859375,
-0.07720947265625,
-0.0616455078125,
-0.05621337890625,
-0.0289154052734375,
-0.0169219970703125,
0.004802703857421875,
0.049407958984375,
0.0648193359375,
-0.0478515625,
-0.0019407272338867188,
-0.0215911865234375,
-0.00553131103515625,
-0.01451873779296875,
-0.0165863037109375,
0.0272064208984375,
-0.051605224609375,
-0.06072998046875,
0.005474090576171875,
-0.00225067138671875,
0.00528717041015625,
0.010284423828125,
-0.0052642822265625,
-0.023193359375,
-0.032745361328125,
0.044586181640625,
0.0217742919921875,
-0.0243988037109375,
-0.0238494873046875,
0.002666473388671875,
-0.0070648193359375,
0.0185089111328125,
0.044403076171875,
-0.06597900390625,
0.0139617919921875,
0.03521728515625,
0.07757568359375,
0.06439208984375,
-0.00997161865234375,
0.043060302734375,
-0.043731689453125,
-0.00865936279296875,
0.0132904052734375,
0.00812530517578125,
0.0272064208984375,
-0.01416015625,
0.051361083984375,
0.01247406005859375,
-0.040374755859375,
-0.03515625,
-0.00955963134765625,
-0.09588623046875,
-0.0140380859375,
0.07977294921875,
-0.0162811279296875,
-0.01568603515625,
0.002422332763671875,
-0.01141357421875,
0.0247039794921875,
-0.0243682861328125,
0.06103515625,
0.06072998046875,
0.01318359375,
-0.0291595458984375,
-0.03662109375,
0.05169677734375,
0.0445556640625,
-0.0882568359375,
-0.0247802734375,
0.01430511474609375,
0.033416748046875,
0.0033245086669921875,
0.04229736328125,
-0.01136016845703125,
0.0189971923828125,
-0.0304718017578125,
0.0146942138671875,
-0.0011568069458007812,
-0.0302581787109375,
-0.04327392578125,
0.01110076904296875,
-0.0170135498046875,
-0.0261993408203125
]
] |
facebook/dragon-plus-context-encoder | 2023-09-27T17:26:22.000Z | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"feature-extraction",
"arxiv:2302.07452",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | feature-extraction | facebook | null | null | facebook/dragon-plus-context-encoder | 13 | 54,407 | transformers | 2023-02-15T18:19:38 | ---
tags:
- feature-extraction
pipeline_tag: feature-extraction
---
DRAGON+ is a BERT-base sized dense retriever initialized from [RetroMAE](https://huggingface.co/Shitao/RetroMAE) and further trained on the data augmented from MS MARCO corpus, following the approach described in [How to Train Your DRAGON:
Diverse Augmentation Towards Generalizable Dense Retrieval](https://arxiv.org/abs/2302.07452).
<p align="center">
<img src="https://raw.githubusercontent.com/facebookresearch/dpr-scale/main/dragon/images/teaser.png" width="600">
</p>
The associated GitHub repository is available at https://github.com/facebookresearch/dpr-scale/tree/main/dragon. We use an asymmetric dual encoder, with two distinctly parameterized encoders. The following models are also available:
Model | Initialization | MARCO Dev | BEIR | Query Encoder Path | Context Encoder Path
|---|---|---|---|---|---
DRAGON+ | Shitao/RetroMAE| 39.0 | 47.4 | [facebook/dragon-plus-query-encoder](https://huggingface.co/facebook/dragon-plus-query-encoder) | [facebook/dragon-plus-context-encoder](https://huggingface.co/facebook/dragon-plus-context-encoder)
DRAGON-RoBERTa | RoBERTa-base | 39.4 | 47.2 | [facebook/dragon-roberta-query-encoder](https://huggingface.co/facebook/dragon-roberta-query-encoder) | [facebook/dragon-roberta-context-encoder](https://huggingface.co/facebook/dragon-roberta-context-encoder)
## Usage (HuggingFace Transformers)
Using the model directly available in HuggingFace Transformers:
```python
import torch
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained('facebook/dragon-plus-query-encoder')
query_encoder = AutoModel.from_pretrained('facebook/dragon-plus-query-encoder')
context_encoder = AutoModel.from_pretrained('facebook/dragon-plus-context-encoder')
# We use an MS MARCO query and passages as an example
query = "Where was Marie Curie born?"
contexts = [
"Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867.",
"Born in Paris on 15 May 1859, Pierre Curie was the son of Eugène Curie, a doctor of French Catholic origin from Alsace."
]
# Apply tokenizer
query_input = tokenizer(query, return_tensors='pt')
ctx_input = tokenizer(contexts, padding=True, truncation=True, return_tensors='pt')
# Compute embeddings: take the last-layer hidden state of the [CLS] token
query_emb = query_encoder(**query_input).last_hidden_state[:, 0, :]
ctx_emb = context_encoder(**ctx_input).last_hidden_state[:, 0, :]
# Compute similarity scores using dot product
score1 = query_emb @ ctx_emb[0] # 396.5625
score2 = query_emb @ ctx_emb[1] # 393.8340
```
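The dot-product scores above induce a ranking over the candidate contexts. A minimal, self-contained sketch of that ranking step (plain Python lists stand in for the torch tensors; `rank_contexts` is a hypothetical helper, not part of the DRAGON code):

```python
# Minimal sketch: rank contexts by dot-product similarity to a query embedding.
# Embeddings here are toy 3-dimensional vectors, not real DRAGON outputs.

def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def rank_contexts(query_emb, ctx_embs):
    """Return context indices sorted by descending similarity to the query."""
    scores = [dot(query_emb, c) for c in ctx_embs]
    return sorted(range(len(ctx_embs)), key=scores.__getitem__, reverse=True)

# Context 0 aligns with the query direction, context 1 does not.
query = [0.9, 0.1, 0.0]
contexts = [[1.0, 0.0, 0.0], [0.0, 0.2, 1.0]]
print(rank_contexts(query, contexts))  # → [0, 1]
```

With the real model, the same ranking is obtained by sorting the `query_emb @ ctx_emb[i]` scores computed above.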
---
language: fa
tags:
- bert-fa
- bert-persian
- persian-lm
license: apache-2.0
---
# ParsBERT (v2.0)
A Transformer-based Model for Persian Language Understanding
We reconstructed the vocabulary and fine-tuned ParsBERT v1.1 on new Persian corpora to make ParsBERT usable in a wider range of domains.
Please follow the [ParsBERT](https://github.com/hooshvare/parsbert) repo for the latest information about previous and current models.
## Introduction
ParsBERT is a monolingual language model based on Google’s BERT architecture. This model is pre-trained on large Persian corpora with various writing styles from numerous subjects (e.g., scientific, novels, news) with more than `3.9M` documents, `73M` sentences, and `1.3B` words.
Paper presenting ParsBERT: [arXiv:2005.12515](https://arxiv.org/abs/2005.12515)
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=bert-fa) to look for
fine-tuned versions on a task that interests you.
### How to use
#### TensorFlow 2.0
```python
from transformers import AutoConfig, AutoTokenizer, TFAutoModel
config = AutoConfig.from_pretrained("HooshvareLab/bert-fa-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("HooshvareLab/bert-fa-base-uncased")
model = TFAutoModel.from_pretrained("HooshvareLab/bert-fa-base-uncased")
text = "ما در هوشواره معتقدیم با انتقال صحیح دانش و آگاهی، همه افراد میتوانند از ابزارهای هوشمند استفاده کنند. شعار ما هوش مصنوعی برای همه است."
tokenizer.tokenize(text)
>>> ['ما', 'در', 'هوش', '##واره', 'معتقدیم', 'با', 'انتقال', 'صحیح', 'دانش', 'و', 'اگاهی', '،', 'همه', 'افراد', 'میتوانند', 'از', 'ابزارهای', 'هوشمند', 'استفاده', 'کنند', '.', 'شعار', 'ما', 'هوش', 'مصنوعی', 'برای', 'همه', 'است', '.']
```
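In the output above, the `##` prefix marks a subword continuation. Merging continuations back onto the preceding token recovers whole words; `merge_wordpieces` below is a hypothetical helper, not part of the transformers API:

```python
# Minimal sketch: merge WordPiece subword tokens (marked with '##') back into words.
# `merge_wordpieces` is a hypothetical helper for illustration only.

def merge_wordpieces(tokens):
    words = []
    for tok in tokens:
        if tok.startswith("##") and words:
            words[-1] += tok[2:]  # continuation piece: append to the previous word
        else:
            words.append(tok)
    return words

print(merge_wordpieces(["ما", "در", "هوش", "##واره", "معتقدیم"]))
# → ['ما', 'در', 'هوشواره', 'معتقدیم']
```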
#### PyTorch
```python
from transformers import AutoConfig, AutoTokenizer, AutoModel
config = AutoConfig.from_pretrained("HooshvareLab/bert-fa-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("HooshvareLab/bert-fa-base-uncased")
model = AutoModel.from_pretrained("HooshvareLab/bert-fa-base-uncased")
```
## Training
ParsBERT was trained on a massive collection of public corpora ([Persian Wikidumps](https://dumps.wikimedia.org/fawiki/), [MirasText](https://github.com/miras-tech/MirasText)) and six other manually crawled text datasets from various types of websites ([BigBang Page](https://bigbangpage.com/) `scientific`, [Chetor](https://www.chetor.com/) `lifestyle`, [Eligasht](https://www.eligasht.com/Blog/) `itinerary`, [Digikala](https://www.digikala.com/mag/) `digital magazine`, [Ted Talks](https://www.ted.com/talks) `general conversational`, Books `novels, storybooks, short stories from old to the contemporary era`).
As part of the ParsBERT methodology, extensive pre-processing combining POS tagging and WordPiece segmentation was carried out to bring the corpora into a proper format.
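WordPiece segmentation greedily matches the longest vocabulary entry at each position within a word. A minimal sketch under a toy vocabulary (the vocabulary and `[UNK]` handling here are illustrative only, not ParsBERT's actual pre-processing pipeline):

```python
# Minimal sketch of greedy longest-match WordPiece segmentation over a toy vocabulary.
# The vocabulary below is illustrative; the real ParsBERT vocabulary is far larger.

def wordpiece(word, vocab, unk="[UNK]"):
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        # Shrink the candidate substring until it is found in the vocabulary.
        while end > start:
            piece = word[start:end] if start == 0 else "##" + word[start:end]
            if piece in vocab:
                pieces.append(piece)
                break
            end -= 1
        else:
            return [unk]  # no piece matched: the whole word is unknown
        start = end
    return pieces

vocab = {"هوش", "##واره", "un", "##break", "##able"}
print(wordpiece("unbreakable", vocab))  # → ['un', '##break', '##able']
print(wordpiece("هوشواره", vocab))      # → ['هوش', '##واره']
```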
## Goals
Training objective metrics after 300k steps are shown below.
```bash
***** Eval results *****
global_step = 300000
loss = 1.4392426
masked_lm_accuracy = 0.6865794
masked_lm_loss = 1.4469004
next_sentence_accuracy = 1.0
next_sentence_loss = 6.534152e-05
```
## Derivative models
### Base Config
#### ParsBERT v2.0 Model
- [HooshvareLab/bert-fa-base-uncased](https://huggingface.co/HooshvareLab/bert-fa-base-uncased)
#### ParsBERT v2.0 Sentiment Analysis
- [HooshvareLab/bert-fa-base-uncased-sentiment-digikala](https://huggingface.co/HooshvareLab/bert-fa-base-uncased-sentiment-digikala)
- [HooshvareLab/bert-fa-base-uncased-sentiment-snappfood](https://huggingface.co/HooshvareLab/bert-fa-base-uncased-sentiment-snappfood)
- [HooshvareLab/bert-fa-base-uncased-sentiment-deepsentipers-binary](https://huggingface.co/HooshvareLab/bert-fa-base-uncased-sentiment-deepsentipers-binary)
- [HooshvareLab/bert-fa-base-uncased-sentiment-deepsentipers-multi](https://huggingface.co/HooshvareLab/bert-fa-base-uncased-sentiment-deepsentipers-multi)
#### ParsBERT v2.0 Text Classification
- [HooshvareLab/bert-fa-base-uncased-clf-digimag](https://huggingface.co/HooshvareLab/bert-fa-base-uncased-clf-digimag)
- [HooshvareLab/bert-fa-base-uncased-clf-persiannews](https://huggingface.co/HooshvareLab/bert-fa-base-uncased-clf-persiannews)
#### ParsBERT v2.0 NER
- [HooshvareLab/bert-fa-base-uncased-ner-peyma](https://huggingface.co/HooshvareLab/bert-fa-base-uncased-ner-peyma)
- [HooshvareLab/bert-fa-base-uncased-ner-arman](https://huggingface.co/HooshvareLab/bert-fa-base-uncased-ner-arman)
## Eval results
ParsBERT is evaluated on three downstream NLP tasks: Sentiment Analysis (SA), Text Classification, and Named Entity Recognition (NER). Because existing resources were insufficient, two large datasets for SA and two for text classification were manually composed; they are available for public use and benchmarking. ParsBERT outperformed all other language models, including multilingual BERT and other hybrid deep learning models, on all tasks, improving the state of the art in Persian language modeling.
### Sentiment Analysis (SA) Task
| Dataset | ParsBERT v2 | ParsBERT v1 | mBERT | DeepSentiPers |
|:------------------------:|:-----------:|:-----------:|:-----:|:-------------:|
| Digikala User Comments | 81.72 | 81.74* | 80.74 | - |
| SnappFood User Comments | 87.98 | 88.12* | 87.87 | - |
| SentiPers (Multi Class) | 71.31* | 71.11 | - | 69.33 |
| SentiPers (Binary Class) | 92.42* | 92.13 | - | 91.98 |
### Text Classification (TC) Task
| Dataset | ParsBERT v2 | ParsBERT v1 | mBERT |
|:-----------------:|:-----------:|:-----------:|:-----:|
| Digikala Magazine | 93.65* | 93.59 | 90.72 |
| Persian News | 97.44* | 97.19 | 95.79 |
### Named Entity Recognition (NER) Task
| Dataset | ParsBERT v2 | ParsBERT v1 | mBERT | MorphoBERT | Beheshti-NER | LSTM-CRF | Rule-Based CRF | BiLSTM-CRF |
|:-------:|:-----------:|:-----------:|:-----:|:----------:|:------------:|:--------:|:--------------:|:----------:|
| PEYMA | 93.40* | 93.10 | 86.64 | - | 90.59 | - | 84.00 | - |
| ARMAN | 99.84* | 98.79 | 95.89 | 89.9 | 84.03 | 86.55 | - | 77.45 |
### BibTeX entry and citation info
Please cite as follows in publications:
```bibtex
@article{ParsBERT,
title={ParsBERT: Transformer-based Model for Persian Language Understanding},
author={Mehrdad Farahani and Mohammad Gharachorloo and Marzieh Farahani and Mohammad Manthouri},
journal={ArXiv},
year={2020},
volume={abs/2005.12515}
}
```
## Questions?
Post a GitHub issue on the [ParsBERT Issues](https://github.com/hooshvare/parsbert/issues) repo.
-0.02349853515625,
0.03936767578125,
-0.00010508298873901367,
-0.0169525146484375,
-0.0253448486328125,
-0.002193450927734375,
-0.031524658203125,
0.032012939453125,
0.0232696533203125,
-0.0239410400390625,
-0.0121612548828125,
0.01806640625,
0.0278472900390625,
0.012451171875,
-0.0089874267578125,
0.06414794921875,
-0.0014324188232421875,
-0.04998779296875,
0.05120849609375,
0.020599365234375,
0.0604248046875,
-0.0272979736328125,
0.00911712646484375,
-0.00992584228515625,
0.02691650390625,
-0.014984130859375,
-0.0447998046875,
0.0232391357421875,
0.0123443603515625,
-0.0167236328125,
-0.0202789306640625,
0.07171630859375,
-0.0172119140625,
-0.058258056640625,
0.0076141357421875,
0.040252685546875,
0.01244354248046875,
-0.007282257080078125,
-0.06390380859375,
-0.0033969879150390625,
0.013702392578125,
-0.03369140625,
0.0166473388671875,
0.031951904296875,
-0.009918212890625,
0.026123046875,
0.051849365234375,
0.00296783447265625,
0.01207733154296875,
-0.0248260498046875,
0.07073974609375,
-0.06414794921875,
-0.017547607421875,
-0.0657958984375,
0.0372314453125,
-0.0208740234375,
-0.0267486572265625,
0.0811767578125,
0.042633056640625,
0.07025146484375,
-0.007732391357421875,
0.04632568359375,
-0.0295562744140625,
0.07525634765625,
-0.0254669189453125,
0.050872802734375,
-0.0419921875,
0.0035037994384765625,
-0.0276947021484375,
-0.0543212890625,
-0.0193328857421875,
0.062347412109375,
-0.037109375,
-0.0039005279541015625,
0.061614990234375,
0.04840087890625,
0.0133209228515625,
-0.00904083251953125,
-0.01436614990234375,
0.02911376953125,
-0.0014505386352539062,
0.044647216796875,
0.061126708984375,
-0.05352783203125,
0.032623291015625,
-0.05218505859375,
-0.0066375732421875,
-0.01458740234375,
-0.04254150390625,
-0.07574462890625,
-0.049072265625,
-0.032012939453125,
-0.03289794921875,
-0.004428863525390625,
0.078369140625,
0.01514434814453125,
-0.0802001953125,
-0.0146636962890625,
-0.0135345458984375,
0.005218505859375,
-0.002140045166015625,
-0.0203704833984375,
0.05242919921875,
-0.0296630859375,
-0.060211181640625,
0.0014734268188476562,
-0.004138946533203125,
-0.003101348876953125,
0.00437164306640625,
0.0008082389831542969,
-0.040557861328125,
0.0004811286926269531,
0.038848876953125,
0.021209716796875,
-0.06158447265625,
0.002655029296875,
0.0196380615234375,
-0.025390625,
0.0126495361328125,
0.0189361572265625,
-0.062103271484375,
0.003314971923828125,
0.0462646484375,
0.03497314453125,
0.024993896484375,
0.004848480224609375,
0.00977325439453125,
-0.039215087890625,
0.00890350341796875,
0.01971435546875,
0.0186614990234375,
0.02728271484375,
-0.026123046875,
0.036834716796875,
0.0233001708984375,
-0.034637451171875,
-0.059844970703125,
-0.01436614990234375,
-0.09136962890625,
-0.0218353271484375,
0.09625244140625,
-0.0045013427734375,
-0.0258636474609375,
0.0130157470703125,
-0.0325927734375,
0.039276123046875,
-0.035736083984375,
0.03863525390625,
0.0556640625,
-0.0007748603820800781,
-0.003833770751953125,
-0.019378662109375,
0.035858154296875,
0.0614013671875,
-0.055877685546875,
-0.0210418701171875,
0.031005859375,
0.0251007080078125,
0.0316162109375,
0.03173828125,
-0.00989532470703125,
0.01385498046875,
-0.0197601318359375,
0.0246429443359375,
0.011322021484375,
-0.00024962425231933594,
-0.0364990234375,
-0.0083465576171875,
-0.005828857421875,
-0.01001739501953125
]
] |
sentence-transformers/distilbert-base-nli-mean-tokens | 2022-06-15T19:35:42.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"distilbert",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | feature-extraction | sentence-transformers | null | null | sentence-transformers/distilbert-base-nli-mean-tokens | 2 | 54,256 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: feature-extraction
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
**⚠️ This model is deprecated. Please don't use it as it produces sentence embeddings of low quality. You can find recommended sentence embedding models here: [SBERT.net - Pretrained Models](https://www.sbert.net/docs/pretrained_models.html)**
# sentence-transformers/distilbert-base-nli-mean-tokens
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/distilbert-base-nli-mean-tokens')
embeddings = model.encode(sentences)
print(embeddings)
```
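Once sentences are encoded, the embeddings can be compared directly; semantic search, for example, reduces to ranking corpus vectors by cosine similarity against a query vector. A minimal sketch of that ranking step (small dummy vectors stand in for real 768-dimensional `model.encode` output, so only the mechanics are shown):

```python
import numpy as np

def cosine_sim(query, corpus):
    # Cosine similarity between one query vector and a matrix of corpus vectors.
    query = query / np.linalg.norm(query)
    corpus = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    return corpus @ query

# Dummy 4-dimensional "embeddings" standing in for 768-dim model output.
corpus = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.9, 0.1, 0.0, 0.0]])
query = np.array([1.0, 0.0, 0.0, 0.0])

scores = cosine_sim(query, corpus)
best = int(np.argmax(scores))  # index of the most similar corpus sentence
print(best)  # 0
```

In practice, `corpus` and `query` would come from `model.encode(...)` as in the snippet above.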
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Mean pooling: take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/distilbert-base-nli-mean-tokens')
model = AutoModel.from_pretrained('sentence-transformers/distilbert-base-nli-mean-tokens')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
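As a sanity check on the pooling step above, here is a tiny self-contained example on dummy tensors (illustrative only, not real model output): the attention mask ensures a padding token contributes nothing to the sentence embedding.

```python
import torch

def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# One "sentence" of three tokens (embedding dim 2); the last token is padding.
token_embeddings = torch.tensor([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
attention_mask = torch.tensor([[1, 1, 0]])

pooled = mean_pooling((token_embeddings,), attention_mask)
print(pooled)  # tensor([[2., 3.]]) — only the two unmasked tokens are averaged
```

The masked third token, despite its large values, has no effect on the result.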
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/distilbert-base-nli-mean-tokens)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 3,984 | [
[
-0.01763916015625,
-0.058197021484375,
0.0177154541015625,
0.03125,
-0.03118896484375,
-0.0299835205078125,
-0.0201263427734375,
-0.0036296844482421875,
0.0166473388671875,
0.0216522216796875,
-0.04071044921875,
-0.03167724609375,
-0.058685302734375,
0.0101318359375,
-0.03155517578125,
0.061065673828125,
-0.0117340087890625,
0.00047659873962402344,
-0.0187835693359375,
-0.01178741455078125,
-0.0260467529296875,
-0.03485107421875,
-0.0263671875,
-0.0198516845703125,
0.0140228271484375,
0.00531005859375,
0.036407470703125,
0.0273284912109375,
0.0232696533203125,
0.034759521484375,
-0.00855255126953125,
0.01226043701171875,
-0.02850341796875,
-0.01132965087890625,
0.00469970703125,
-0.02191162109375,
-0.010498046875,
0.02923583984375,
0.042877197265625,
0.041168212890625,
-0.01474761962890625,
0.00624847412109375,
0.0032253265380859375,
0.0258636474609375,
-0.03729248046875,
0.033355712890625,
-0.042388916015625,
0.0138702392578125,
0.007350921630859375,
0.003154754638671875,
-0.046905517578125,
-0.0096588134765625,
0.0228118896484375,
-0.0330810546875,
0.004291534423828125,
0.01503753662109375,
0.08209228515625,
0.0341796875,
-0.0210723876953125,
-0.027618408203125,
-0.0225677490234375,
0.0670166015625,
-0.0726318359375,
0.016937255859375,
0.021728515625,
-0.003662109375,
-0.003665924072265625,
-0.07958984375,
-0.059295654296875,
-0.00931549072265625,
-0.0360107421875,
0.01629638671875,
-0.0361328125,
0.0029125213623046875,
0.01404571533203125,
0.0245208740234375,
-0.049072265625,
-0.006023406982421875,
-0.0308837890625,
-0.0063018798828125,
0.039886474609375,
0.0021343231201171875,
0.0263519287109375,
-0.040863037109375,
-0.031646728515625,
-0.02227783203125,
-0.0119781494140625,
-0.01087188720703125,
0.01123809814453125,
0.011383056640625,
-0.0207672119140625,
0.05615234375,
0.00036215782165527344,
0.043426513671875,
0.002498626708984375,
0.0178375244140625,
0.054107666015625,
-0.0297393798828125,
-0.027923583984375,
-0.003337860107421875,
0.0845947265625,
0.0357666015625,
0.0304107666015625,
-0.006023406982421875,
-0.0115966796875,
0.005413055419921875,
0.016632080078125,
-0.06365966796875,
-0.027099609375,
0.01165008544921875,
-0.028228759765625,
-0.02569580078125,
0.01476287841796875,
-0.047882080078125,
0.000028967857360839844,
0.0024852752685546875,
0.050262451171875,
-0.04229736328125,
-0.00322723388671875,
0.027496337890625,
-0.0204925537109375,
0.00921630859375,
-0.0219879150390625,
-0.056793212890625,
0.0177764892578125,
0.017333984375,
0.07086181640625,
0.00861358642578125,
-0.038818359375,
-0.01788330078125,
-0.01497650146484375,
0.002300262451171875,
0.039154052734375,
-0.0201568603515625,
-0.01178741455078125,
0.01092529296875,
0.0193023681640625,
-0.04046630859375,
-0.02923583984375,
0.041961669921875,
-0.0262603759765625,
0.051055908203125,
0.01073455810546875,
-0.062744140625,
-0.01337432861328125,
0.0119171142578125,
-0.042205810546875,
0.0810546875,
0.017974853515625,
-0.07623291015625,
0.00891876220703125,
-0.060760498046875,
-0.022918701171875,
-0.0136871337890625,
0.009735107421875,
-0.051788330078125,
0.01367950439453125,
0.034088134765625,
0.053741455078125,
0.00970458984375,
0.03900146484375,
-0.01629638671875,
-0.03839111328125,
0.03265380859375,
-0.0355224609375,
0.09033203125,
0.0114593505859375,
-0.0273284912109375,
0.0045928955078125,
-0.040496826171875,
-0.0117950439453125,
0.0247039794921875,
-0.01001739501953125,
-0.019775390625,
-0.0004703998565673828,
0.022735595703125,
0.019195556640625,
0.01479339599609375,
-0.054290771484375,
0.01073455810546875,
-0.046295166015625,
0.0750732421875,
0.044403076171875,
0.0006008148193359375,
0.040557861328125,
-0.0182037353515625,
0.00768280029296875,
0.032318115234375,
0.0028362274169921875,
-0.01462554931640625,
-0.0276641845703125,
-0.07269287109375,
-0.0230560302734375,
0.0247039794921875,
0.038787841796875,
-0.055145263671875,
0.08233642578125,
-0.0360107421875,
-0.032958984375,
-0.053131103515625,
-0.00365447998046875,
0.00557708740234375,
0.03155517578125,
0.050537109375,
-0.0035266876220703125,
-0.049530029296875,
-0.06829833984375,
0.0018663406372070312,
-0.003662109375,
0.00696563720703125,
0.0115509033203125,
0.05657958984375,
-0.036468505859375,
0.0794677734375,
-0.05267333984375,
-0.031494140625,
-0.036346435546875,
0.0211029052734375,
0.02349853515625,
0.04754638671875,
0.04345703125,
-0.053619384765625,
-0.0257568359375,
-0.051910400390625,
-0.054107666015625,
0.00157928466796875,
-0.014892578125,
-0.01212310791015625,
0.0099945068359375,
0.038482666015625,
-0.06182861328125,
0.03192138671875,
0.047210693359375,
-0.041290283203125,
0.0237884521484375,
-0.0196075439453125,
-0.0156097412109375,
-0.109130859375,
0.00316619873046875,
0.00955963134765625,
-0.0185699462890625,
-0.033416748046875,
-0.00067138671875,
0.007183074951171875,
-0.007030487060546875,
-0.039764404296875,
0.0355224609375,
-0.0299224853515625,
0.016845703125,
-0.0044708251953125,
0.028778076171875,
0.007518768310546875,
0.055999755859375,
-0.0070648193359375,
0.051239013671875,
0.03875732421875,
-0.043487548828125,
0.0249786376953125,
0.047088623046875,
-0.038055419921875,
0.01013946533203125,
-0.0675048828125,
-0.0021762847900390625,
-0.0034122467041015625,
0.034423828125,
-0.07794189453125,
0.0019664764404296875,
0.0274505615234375,
-0.0390625,
0.018402099609375,
0.0244140625,
-0.050994873046875,
-0.046173095703125,
-0.02923583984375,
0.0161590576171875,
0.043121337890625,
-0.04364013671875,
0.04229736328125,
0.0215606689453125,
0.00063323974609375,
-0.046173095703125,
-0.089599609375,
-0.00041556358337402344,
-0.0153350830078125,
-0.04547119140625,
0.045074462890625,
-0.0037708282470703125,
0.0094757080078125,
0.0258636474609375,
0.0209503173828125,
-0.0004177093505859375,
0.0036373138427734375,
0.0032138824462890625,
0.0200958251953125,
-0.0042266845703125,
0.0161285400390625,
0.01401519775390625,
-0.01485443115234375,
0.0041351318359375,
-0.016357421875,
0.051422119140625,
-0.0141448974609375,
-0.01131439208984375,
-0.037750244140625,
0.01204681396484375,
0.03106689453125,
-0.022918701171875,
0.08135986328125,
0.0740966796875,
-0.0364990234375,
-0.005008697509765625,
-0.04046630859375,
-0.021942138671875,
-0.036102294921875,
0.0548095703125,
-0.00928497314453125,
-0.074462890625,
0.0240020751953125,
0.0094146728515625,
0.003467559814453125,
0.04901123046875,
0.03887939453125,
-0.0091400146484375,
0.058807373046875,
0.0438232421875,
-0.0200347900390625,
0.0377197265625,
-0.050811767578125,
0.0256195068359375,
-0.07275390625,
0.0003371238708496094,
-0.01348114013671875,
-0.026458740234375,
-0.053924560546875,
-0.035003662109375,
0.00785064697265625,
-0.0004925727844238281,
-0.0223388671875,
0.04180908203125,
-0.0455322265625,
0.01418304443359375,
0.043487548828125,
0.01026153564453125,
-0.00592041015625,
0.0023593902587890625,
-0.0263671875,
-0.004146575927734375,
-0.0484619140625,
-0.04327392578125,
0.061187744140625,
0.038421630859375,
0.037139892578125,
-0.009613037109375,
0.051971435546875,
0.003082275390625,
0.0033283233642578125,
-0.0511474609375,
0.041290283203125,
-0.0304107666015625,
-0.03399658203125,
-0.024627685546875,
-0.0274810791015625,
-0.06256103515625,
0.0261383056640625,
-0.01161956787109375,
-0.0572509765625,
0.01020050048828125,
-0.016845703125,
-0.022369384765625,
0.0218505859375,
-0.061187744140625,
0.0794677734375,
0.0047760009765625,
-0.0021953582763671875,
-0.008636474609375,
-0.047454833984375,
0.00923919677734375,
0.0188446044921875,
0.0033512115478515625,
-0.00319671630859375,
0.001270294189453125,
0.06524658203125,
-0.0221099853515625,
0.07867431640625,
-0.0204925537109375,
0.0249481201171875,
0.03424072265625,
-0.0304718017578125,
0.018768310546875,
-0.0015888214111328125,
-0.006587982177734375,
0.01314544677734375,
-0.0126190185546875,
-0.0241546630859375,
-0.04296875,
0.051422119140625,
-0.07275390625,
-0.02850341796875,
-0.038238525390625,
-0.03863525390625,
0.00041985511779785156,
0.01146697998046875,
0.029754638671875,
0.0360107421875,
-0.017913818359375,
0.0321044921875,
0.03466796875,
-0.025543212890625,
0.060882568359375,
0.007442474365234375,
-0.002681732177734375,
-0.038299560546875,
0.0469970703125,
0.0058441162109375,
-0.0021305084228515625,
0.032989501953125,
0.016326904296875,
-0.03857421875,
-0.019287109375,
-0.0296173095703125,
0.031280517578125,
-0.04376220703125,
-0.014617919921875,
-0.076904296875,
-0.041290283203125,
-0.044036865234375,
0.0005784034729003906,
-0.0190582275390625,
-0.032318115234375,
-0.045745849609375,
-0.0281982421875,
0.02850341796875,
0.035064697265625,
-0.0008645057678222656,
0.0303192138671875,
-0.05218505859375,
0.00614166259765625,
0.010498046875,
0.01178741455078125,
-0.0012311935424804688,
-0.055267333984375,
-0.0278472900390625,
0.00403594970703125,
-0.03057861328125,
-0.06280517578125,
0.0477294921875,
0.0178375244140625,
0.045257568359375,
0.0110015869140625,
0.01338958740234375,
0.04840087890625,
-0.042755126953125,
0.06671142578125,
0.007785797119140625,
-0.07806396484375,
0.03436279296875,
0.0029468536376953125,
0.0297088623046875,
0.041656494140625,
0.0269775390625,
-0.03204345703125,
-0.030426025390625,
-0.04986572265625,
-0.07855224609375,
0.0477294921875,
0.033050537109375,
0.04840087890625,
-0.033203125,
0.025360107421875,
-0.0221099853515625,
0.0159759521484375,
-0.08935546875,
-0.0308685302734375,
-0.03466796875,
-0.04791259765625,
-0.0260772705078125,
-0.025970458984375,
0.017913818359375,
-0.0303802490234375,
0.057647705078125,
0.005344390869140625,
0.051422119140625,
0.02880859375,
-0.0413818359375,
0.0180206298828125,
0.0156707763671875,
0.035247802734375,
0.01486968994140625,
-0.01067352294921875,
0.012603759765625,
0.021820068359375,
-0.03094482421875,
0.00479888916015625,
0.039276123046875,
-0.00907135009765625,
0.01800537109375,
0.02850341796875,
0.079833984375,
0.037567138671875,
-0.032501220703125,
0.06085205078125,
-0.00872039794921875,
-0.0220794677734375,
-0.036224365234375,
-0.0122833251953125,
0.02215576171875,
0.01953125,
0.0230255126953125,
-0.001537322998046875,
0.0024929046630859375,
-0.02490234375,
0.0252685546875,
0.01479339599609375,
-0.0360107421875,
-0.008056640625,
0.050262451171875,
0.01036834716796875,
-0.00986480712890625,
0.07977294921875,
-0.0266265869140625,
-0.051361083984375,
0.0269927978515625,
0.048248291015625,
0.07513427734375,
0.002777099609375,
0.0216064453125,
0.044036865234375,
0.0259857177734375,
-0.004528045654296875,
-0.006683349609375,
0.0139312744140625,
-0.06817626953125,
-0.0241546630859375,
-0.05047607421875,
0.01256561279296875,
0.0025653839111328125,
-0.04315185546875,
0.0200958251953125,
-0.007457733154296875,
-0.01450347900390625,
-0.015777587890625,
0.0003306865692138672,
-0.048583984375,
0.0071258544921875,
0.004711151123046875,
0.062255859375,
-0.07806396484375,
0.055999755859375,
0.049957275390625,
-0.053253173828125,
-0.05615234375,
-0.0048065185546875,
-0.0257110595703125,
-0.051177978515625,
0.040191650390625,
0.042724609375,
0.01523590087890625,
0.0204620361328125,
-0.0377197265625,
-0.056640625,
0.10345458984375,
0.0193023681640625,
-0.034332275390625,
-0.018798828125,
0.01062774658203125,
0.036285400390625,
-0.0377197265625,
0.032470703125,
0.0268707275390625,
0.02276611328125,
-0.0036449432373046875,
-0.04937744140625,
0.0186614990234375,
-0.0271148681640625,
0.01690673828125,
-0.01477813720703125,
-0.038818359375,
0.0718994140625,
-0.0035572052001953125,
-0.0188446044921875,
0.0098876953125,
0.06573486328125,
0.0212860107421875,
-0.0010471343994140625,
0.0386962890625,
0.0677490234375,
0.04388427734375,
-0.01053619384765625,
0.06866455078125,
-0.021331787109375,
0.051971435546875,
0.0760498046875,
0.00457000732421875,
0.07977294921875,
0.035919189453125,
-0.0068359375,
0.061553955078125,
0.041290283203125,
-0.02520751953125,
0.052215576171875,
0.021331787109375,
0.004871368408203125,
0.005634307861328125,
0.0095062255859375,
-0.0150299072265625,
0.03997802734375,
0.01145172119140625,
-0.05474853515625,
-0.001987457275390625,
0.0125274658203125,
0.0041656494140625,
0.0006246566772460938,
0.00998687744140625,
0.045074462890625,
0.01500701904296875,
-0.0316162109375,
0.0311737060546875,
0.0141448974609375,
0.07989501953125,
-0.02587890625,
0.008819580078125,
-0.0024814605712890625,
0.0257568359375,
0.0006160736083984375,
-0.04083251953125,
0.03076171875,
-0.00804901123046875,
-0.00434112548828125,
-0.01611328125,
0.04290771484375,
-0.04534912109375,
-0.049224853515625,
0.0299224853515625,
0.038299560546875,
-0.00018417835235595703,
0.0029048919677734375,
-0.07769775390625,
-0.00023090839385986328,
-0.0023136138916015625,
-0.042205810546875,
0.0137939453125,
0.0256805419921875,
0.0299835205078125,
0.039215087890625,
0.0294647216796875,
-0.01568603515625,
0.005565643310546875,
0.015960693359375,
0.068603515625,
-0.043243408203125,
-0.041290283203125,
-0.0792236328125,
0.06097412109375,
-0.017486572265625,
-0.024017333984375,
0.0462646484375,
0.041259765625,
0.06658935546875,
-0.018585205078125,
0.0394287109375,
-0.01012420654296875,
0.0143280029296875,
-0.040252685546875,
0.06707763671875,
-0.035430908203125,
-0.00506591796875,
-0.02105712890625,
-0.0733642578125,
-0.0221710205078125,
0.084716796875,
-0.0237884521484375,
0.01387786865234375,
0.06884765625,
0.059112548828125,
-0.008575439453125,
-0.004711151123046875,
0.0096282958984375,
0.03192138671875,
0.0159149169921875,
0.036102294921875,
0.038818359375,
-0.06439208984375,
0.043304443359375,
-0.040985107421875,
-0.00983428955078125,
-0.00826263427734375,
-0.06683349609375,
-0.075927734375,
-0.06341552734375,
-0.03656005859375,
-0.0172119140625,
-0.00634002685546875,
0.07928466796875,
0.0474853515625,
-0.05126953125,
-0.005405426025390625,
-0.02264404296875,
-0.019805908203125,
-0.00894927978515625,
-0.02313232421875,
0.038818359375,
-0.037384033203125,
-0.0657958984375,
0.0102996826171875,
-0.007205963134765625,
0.0126495361328125,
-0.03179931640625,
0.007785797119140625,
-0.0504150390625,
0.007534027099609375,
0.042877197265625,
-0.0267486572265625,
-0.057891845703125,
-0.018463134765625,
0.00519561767578125,
-0.02569580078125,
-0.01033782958984375,
0.023101806640625,
-0.04931640625,
0.0186004638671875,
0.022369384765625,
0.042236328125,
0.047882080078125,
-0.01654052734375,
0.032806396484375,
-0.061981201171875,
0.0251617431640625,
0.0097198486328125,
0.05474853515625,
0.0347900390625,
-0.019195556640625,
0.042510986328125,
0.0175933837890625,
-0.035369873046875,
-0.0511474609375,
-0.01454925537109375,
-0.07476806640625,
-0.0221710205078125,
0.0809326171875,
-0.0338134765625,
-0.0273284912109375,
0.01593017578125,
-0.01274871826171875,
0.040374755859375,
-0.020416259765625,
0.05218505859375,
0.06597900390625,
0.00992584228515625,
-0.0201263427734375,
-0.0242919921875,
0.013458251953125,
0.028564453125,
-0.039154052734375,
-0.0098876953125,
0.01922607421875,
0.0193023681640625,
0.023193359375,
0.035888671875,
-0.01003265380859375,
-0.0048370361328125,
0.007236480712890625,
0.00701904296875,
-0.0215301513671875,
0.0025196075439453125,
-0.023956298828125,
-0.0017118453979492188,
-0.0257110595703125,
-0.03167724609375
]
] |
TahaDouaji/detr-doc-table-detection | 2023-01-05T19:36:58.000Z | [
"transformers",
"pytorch",
"detr",
"object-detection",
"arxiv:2005.12872",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | object-detection | TahaDouaji | null | null | TahaDouaji/detr-doc-table-detection | 25 | 54,055 | transformers | 2022-03-11T15:55:14 | ---
tags:
- object-detection
---
# Model Card for detr-doc-table-detection
# Model Details
detr-doc-table-detection is a model trained to detect both **Bordered** and **Borderless** tables in documents, based on [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50).
- **Developed by:** Taha Douaji
- **Shared by [Optional]:** Taha Douaji
- **Model type:** Object Detection
- **Language(s) (NLP):** More information needed
- **License:** More information needed
- **Parent Model:** [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50)
- **Resources for more information:**
- [Model Demo Space](https://huggingface.co/spaces/trevbeers/pdf-table-extraction)
- [Associated Paper](https://arxiv.org/abs/2005.12872)
# Uses
## Direct Use
This model can be used for the task of object detection.
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
The model was trained on the ICDAR 2019 Table Dataset.
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
# Citation
**BibTeX:**
```bibtex
@article{DBLP:journals/corr/abs-2005-12872,
author = {Nicolas Carion and
Francisco Massa and
Gabriel Synnaeve and
Nicolas Usunier and
Alexander Kirillov and
Sergey Zagoruyko},
title = {End-to-End Object Detection with Transformers},
journal = {CoRR},
volume = {abs/2005.12872},
year = {2020},
url = {https://arxiv.org/abs/2005.12872},
archivePrefix = {arXiv},
eprint = {2005.12872},
timestamp = {Thu, 28 May 2020 17:38:09 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2005-12872.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
# Model Card Authors [optional]
Taha Douaji in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
```python
from transformers import DetrImageProcessor, DetrForObjectDetection
import torch
from PIL import Image
import requests
image = Image.open("IMAGE_PATH")
processor = DetrImageProcessor.from_pretrained("TahaDouaji/detr-doc-table-detection")
model = DetrForObjectDetection.from_pretrained("TahaDouaji/detr-doc-table-detection")
inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
# convert outputs (bounding boxes and class logits) to COCO API format
# let's only keep detections with score > 0.9
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(outputs, target_sizes=target_sizes, threshold=0.9)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
box = [round(i, 2) for i in box.tolist()]
print(
f"Detected {model.config.id2label[label.item()]} with confidence "
f"{round(score.item(), 3)} at location {box}"
)
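# Illustrative extension (not part of the original card): the post-processed
# boxes are absolute (x0, y0, x1, y1) pixel coordinates, which matches PIL's
# (left, upper, right, lower) crop convention, so each detected table can be
# cropped out for downstream steps such as OCR or table-structure recognition,
# e.g. table_crops = crop_tables(image, results)
def crop_tables(image, results):
    return [image.crop(tuple(round(v) for v in box.tolist())) for box in results["boxes"]]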
``` | 3,896 | [
[
-0.04071044921875,
-0.050994873046875,
0.02593994140625,
-0.01030731201171875,
-0.018218994140625,
-0.0169525146484375,
-0.00737762451171875,
-0.04931640625,
-0.00665283203125,
0.039520263671875,
-0.038604736328125,
-0.057098388671875,
-0.0455322265625,
0.00458526611328125,
-0.031280517578125,
0.07611083984375,
0.01250457763671875,
-0.007595062255859375,
-0.00738525390625,
0.00373077392578125,
-0.0318603515625,
-0.0300445556640625,
-0.05340576171875,
-0.007709503173828125,
0.022247314453125,
0.028594970703125,
0.038116455078125,
0.04132080078125,
0.054229736328125,
0.025054931640625,
-0.01538848876953125,
0.0035495758056640625,
-0.0251312255859375,
-0.016754150390625,
-0.017852783203125,
-0.037994384765625,
-0.035247802734375,
0.012847900390625,
0.04522705078125,
0.0243377685546875,
0.0013036727905273438,
0.0092010498046875,
-0.0029697418212890625,
0.041656494140625,
-0.034637451171875,
0.0200042724609375,
-0.053436279296875,
0.01480865478515625,
-0.0223846435546875,
-0.0003581047058105469,
-0.04693603515625,
-0.00580596923828125,
-0.01873779296875,
-0.033294677734375,
0.037200927734375,
0.017425537109375,
0.1004638671875,
0.0089263916015625,
-0.0140838623046875,
-0.0169830322265625,
-0.04229736328125,
0.056182861328125,
-0.05322265625,
0.03912353515625,
0.023406982421875,
0.0201568603515625,
-0.0157012939453125,
-0.056121826171875,
-0.05267333984375,
-0.0263519287109375,
-0.01342010498046875,
0.0164642333984375,
-0.0235748291015625,
0.0017766952514648438,
0.044647216796875,
0.0267181396484375,
-0.046844482421875,
0.0159912109375,
-0.0440673828125,
-0.01323699951171875,
0.05419921875,
0.00536346435546875,
0.025054931640625,
-0.016082763671875,
-0.0302581787109375,
-0.0209808349609375,
-0.0261383056640625,
0.0137481689453125,
0.048370361328125,
0.01739501953125,
-0.033050537109375,
0.03839111328125,
-0.012451171875,
0.06341552734375,
0.0131683349609375,
-0.005924224853515625,
0.033355712890625,
-0.024200439453125,
-0.0128021240234375,
-0.0029392242431640625,
0.0809326171875,
0.0233612060546875,
0.01629638671875,
0.0034313201904296875,
-0.0098876953125,
0.005130767822265625,
0.0080108642578125,
-0.06103515625,
-0.020233154296875,
0.0175933837890625,
-0.034637451171875,
-0.036376953125,
0.0178375244140625,
-0.0657958984375,
-0.0067291259765625,
-0.0050506591796875,
0.01325225830078125,
-0.02691650390625,
-0.036163330078125,
0.005084991455078125,
-0.00830078125,
0.0263824462890625,
0.0026073455810546875,
-0.055450439453125,
0.01152801513671875,
0.037322998046875,
0.0709228515625,
-0.0027599334716796875,
-0.006793975830078125,
-0.035858154296875,
0.0025196075439453125,
-0.0234375,
0.06805419921875,
-0.03936767578125,
-0.037200927734375,
-0.01085662841796875,
0.025299072265625,
0.0041656494140625,
-0.04217529296875,
0.0633544921875,
-0.0224456787109375,
0.021148681640625,
-0.023162841796875,
-0.0157012939453125,
-0.0306549072265625,
0.0223236083984375,
-0.052032470703125,
0.0926513671875,
0.016326904296875,
-0.0809326171875,
0.03253173828125,
-0.052886962890625,
-0.0252227783203125,
-0.01220703125,
-0.005916595458984375,
-0.0697021484375,
-0.01367950439453125,
0.0150604248046875,
0.028350830078125,
-0.018463134765625,
0.0330810546875,
-0.03924560546875,
-0.021331787109375,
0.01261138916015625,
-0.0249176025390625,
0.090576171875,
0.021453857421875,
-0.0175628662109375,
0.0168304443359375,
-0.058380126953125,
-0.0131378173828125,
0.01554107666015625,
-0.02313232421875,
0.004161834716796875,
-0.01306915283203125,
0.01800537109375,
0.03631591796875,
0.00528717041015625,
-0.04681396484375,
-0.0006265640258789062,
-0.0136871337890625,
0.012542724609375,
0.046356201171875,
-0.003025054931640625,
0.010772705078125,
-0.031585693359375,
0.02813720703125,
0.023773193359375,
0.02313232421875,
0.006084442138671875,
-0.041961669921875,
-0.0587158203125,
-0.0262603759765625,
-0.0006361007690429688,
0.03570556640625,
-0.03472900390625,
0.06744384765625,
-0.00505828857421875,
-0.047882080078125,
-0.0184326171875,
-0.01232147216796875,
0.0211029052734375,
0.05841064453125,
0.037750244140625,
-0.047149658203125,
-0.05364990234375,
-0.07025146484375,
-0.00754547119140625,
-0.005764007568359375,
-0.004547119140625,
0.01360321044921875,
0.06036376953125,
-0.006656646728515625,
0.076904296875,
-0.04095458984375,
-0.04766845703125,
-0.01372528076171875,
-0.0033473968505859375,
0.0271759033203125,
0.0509033203125,
0.04376220703125,
-0.063720703125,
-0.04217529296875,
-0.0115966796875,
-0.07403564453125,
-0.00847625732421875,
0.0030689239501953125,
-0.0184478759765625,
0.03021240234375,
0.02642822265625,
-0.042327880859375,
0.0499267578125,
0.008209228515625,
-0.023162841796875,
0.05010986328125,
-0.0110626220703125,
-0.0013570785522460938,
-0.075439453125,
0.0311737060546875,
0.013580322265625,
-0.00992584228515625,
-0.0640869140625,
0.004547119140625,
0.00027370452880859375,
-0.011322021484375,
-0.043487548828125,
0.0379638671875,
-0.044097900390625,
-0.01033782958984375,
-0.022491455078125,
0.00307464599609375,
0.00896453857421875,
0.0528564453125,
0.032012939453125,
0.032440185546875,
0.05078125,
-0.043975830078125,
0.01241302490234375,
0.02001953125,
-0.03228759765625,
0.062286376953125,
-0.04510498046875,
0.018218994140625,
-0.016754150390625,
0.0158538818359375,
-0.07232666015625,
-0.0008716583251953125,
0.05670166015625,
-0.032257080078125,
0.04595947265625,
-0.0126800537109375,
-0.035369873046875,
-0.055511474609375,
-0.028472900390625,
0.0098419189453125,
0.025665283203125,
-0.04052734375,
0.041778564453125,
0.02716064453125,
0.023651123046875,
-0.06488037109375,
-0.06231689453125,
-0.00933837890625,
-0.0186767578125,
-0.046539306640625,
0.0247650146484375,
-0.0169525146484375,
-0.006900787353515625,
0.00047707557678222656,
-0.01183319091796875,
-0.0082550048828125,
0.00550079345703125,
0.02557373046875,
0.030426025390625,
0.0089874267578125,
-0.00337982177734375,
-0.01004791259765625,
-0.0189971923828125,
0.007904052734375,
-0.006473541259765625,
0.05743408203125,
-0.0162506103515625,
-0.0201263427734375,
-0.047088623046875,
0.0172576904296875,
0.03851318359375,
-0.032257080078125,
0.059844970703125,
0.0701904296875,
-0.044952392578125,
0.00820159912109375,
-0.0469970703125,
-0.01384735107421875,
-0.039093017578125,
0.046051025390625,
-0.0380859375,
-0.02459716796875,
0.060028076171875,
0.0280609130859375,
-0.0080413818359375,
0.05126953125,
0.0430908203125,
0.01448822021484375,
0.06103515625,
0.046661376953125,
0.0003867149353027344,
0.045867919921875,
-0.051910400390625,
0.02154541015625,
-0.0845947265625,
-0.049652099609375,
-0.04974365234375,
-0.00980377197265625,
-0.0362548828125,
-0.02850341796875,
0.0130462646484375,
0.01151275634765625,
-0.0305938720703125,
0.03997802734375,
-0.0770263671875,
0.016937255859375,
0.04254150390625,
0.027679443359375,
0.0210723876953125,
0.0008702278137207031,
-0.0005636215209960938,
0.0126953125,
-0.039703369140625,
-0.03118896484375,
0.0660400390625,
0.0399169921875,
0.04925537109375,
-0.0028247833251953125,
0.04473876953125,
0.0125885009765625,
0.0296173095703125,
-0.05413818359375,
0.05224609375,
-0.0081787109375,
-0.0709228515625,
-0.0163726806640625,
-0.0232696533203125,
-0.0684814453125,
0.016510009765625,
-0.008453369140625,
-0.075927734375,
0.042755126953125,
-0.011383056640625,
-0.0022792816162109375,
0.04730224609375,
-0.0303802490234375,
0.080078125,
-0.019195556640625,
-0.03533935546875,
-0.0019626617431640625,
-0.042266845703125,
0.034820556640625,
-0.0035228729248046875,
0.00925445556640625,
-0.024871826171875,
0.0193023681640625,
0.0789794921875,
-0.0290679931640625,
0.058380126953125,
-0.024261474609375,
-0.0048065185546875,
0.04595947265625,
-0.004421234130859375,
0.04254150390625,
-0.00351715087890625,
-0.0173187255859375,
0.036865234375,
-0.005031585693359375,
-0.008636474609375,
-0.00754547119140625,
0.03680419921875,
-0.0421142578125,
-0.0450439453125,
-0.04833984375,
-0.0665283203125,
0.0258026123046875,
0.032501220703125,
0.042755126953125,
0.01763916015625,
0.0010585784912109375,
0.002902984619140625,
0.036834716796875,
-0.034637451171875,
0.03802490234375,
0.0479736328125,
-0.016693115234375,
-0.008544921875,
0.051910400390625,
0.0192718505859375,
0.0084381103515625,
0.034454345703125,
0.012847900390625,
-0.039276123046875,
-0.0277252197265625,
-0.012664794921875,
0.02532958984375,
-0.059906005859375,
-0.027130126953125,
-0.06201171875,
-0.02740478515625,
-0.0343017578125,
-0.0017642974853515625,
-0.0182952880859375,
0.002918243408203125,
-0.04229736328125,
-0.015777587890625,
0.033966064453125,
0.0277557373046875,
-0.025634765625,
0.0173187255859375,
-0.0247344970703125,
0.019683837890625,
0.0037136077880859375,
0.0306243896484375,
0.0034580230712890625,
-0.05511474609375,
-0.0035495758056640625,
0.008636474609375,
-0.0283355712890625,
-0.0711669921875,
0.04327392578125,
-0.0019779205322265625,
0.048675537109375,
0.03619384765625,
0.01294708251953125,
0.030517578125,
-0.01416778564453125,
0.04071044921875,
0.02532958984375,
-0.069580078125,
0.0377197265625,
-0.017730712890625,
0.007633209228515625,
0.026336669921875,
0.0278167724609375,
-0.0369873046875,
-0.0316162109375,
-0.0430908203125,
-0.048370361328125,
0.08428955078125,
0.0014352798461914062,
-0.00942230224609375,
-0.0008373260498046875,
0.00922393798828125,
-0.01180267333984375,
-0.000032067298889160156,
-0.074462890625,
-0.0128631591796875,
-0.011016845703125,
-0.0204010009765625,
0.01514434814453125,
-0.0230255126953125,
-0.004398345947265625,
-0.024566650390625,
0.05841064453125,
0.0050201416015625,
0.04443359375,
0.0235748291015625,
-0.00814056396484375,
-0.00959014892578125,
0.00250244140625,
0.0455322265625,
0.023651123046875,
-0.041534423828125,
0.003986358642578125,
0.01392364501953125,
-0.0433349609375,
0.006717681884765625,
0.01395416259765625,
-0.0147705078125,
0.0028781890869140625,
0.0208282470703125,
0.0565185546875,
-0.00853729248046875,
-0.027679443359375,
0.0330810546875,
0.00943756103515625,
-0.013885498046875,
-0.033905029296875,
0.004131317138671875,
-0.004611968994140625,
0.00829315185546875,
0.026397705078125,
0.00989532470703125,
0.01392364501953125,
-0.043670654296875,
0.0299224853515625,
0.04437255859375,
-0.039703369140625,
-0.0088653564453125,
0.07672119140625,
-0.0091400146484375,
-0.03302001953125,
0.05120849609375,
-0.028533935546875,
-0.04534912109375,
0.07757568359375,
0.04449462890625,
0.0570068359375,
-0.01541900634765625,
0.00981903076171875,
0.053985595703125,
0.03582763671875,
-0.0063629150390625,
0.01119232177734375,
0.004329681396484375,
-0.05877685546875,
0.0095672607421875,
-0.043304443359375,
-0.00397491455078125,
0.0172119140625,
-0.052734375,
0.043243408203125,
-0.0330810546875,
-0.016754150390625,
0.008148193359375,
0.01654052734375,
-0.068115234375,
0.0143890380859375,
0.00421905517578125,
0.06683349609375,
-0.07281494140625,
0.06439208984375,
0.0184326171875,
-0.061187744140625,
-0.06005859375,
-0.02081298828125,
-0.0026645660400390625,
-0.054229736328125,
0.05584716796875,
0.0430908203125,
-0.001194000244140625,
-0.004150390625,
-0.034576416015625,
-0.0628662109375,
0.080078125,
0.0181121826171875,
-0.0526123046875,
0.00777435302734375,
0.00550079345703125,
0.037017822265625,
-0.019805908203125,
0.0489501953125,
0.0416259765625,
0.04815673828125,
-0.0006184577941894531,
-0.05364990234375,
0.02252197265625,
-0.0133819580078125,
-0.006587982177734375,
0.0017175674438476562,
-0.05560302734375,
0.071044921875,
-0.017608642578125,
-0.0275115966796875,
-0.001922607421875,
0.03997802734375,
0.0241546630859375,
0.0335693359375,
0.0361328125,
0.0595703125,
0.05645751953125,
-0.0163421630859375,
0.0758056640625,
-0.0236968994140625,
0.049041748046875,
0.0755615234375,
-0.0160369873046875,
0.046661376953125,
0.01788330078125,
-0.0257110595703125,
0.04534912109375,
0.059051513671875,
-0.04119873046875,
0.0443115234375,
0.01702880859375,
-0.0110931396484375,
0.004077911376953125,
-0.030120849609375,
-0.038909912109375,
0.03143310546875,
0.028656005859375,
-0.038909912109375,
-0.011016845703125,
0.002361297607421875,
0.0052337646484375,
-0.00897979736328125,
-0.01068878173828125,
0.029144287109375,
0.00019311904907226562,
-0.03521728515625,
0.046539306640625,
0.00728607177734375,
0.060089111328125,
-0.036712646484375,
-0.0024242401123046875,
-0.01012420654296875,
0.016754150390625,
-0.0333251953125,
-0.064453125,
0.02734375,
-0.002532958984375,
-0.01229095458984375,
0.0170135498046875,
0.05731201171875,
-0.04437255859375,
-0.053863525390625,
0.035797119140625,
0.0060882568359375,
0.0369873046875,
0.00218963623046875,
-0.060699462890625,
0.01873779296875,
0.001064300537109375,
-0.0208282470703125,
0.002197265625,
0.00504302978515625,
-0.0029697418212890625,
0.049346923828125,
0.05535888671875,
-0.01885986328125,
-0.002750396728515625,
-0.0035495758056640625,
0.0556640625,
-0.028228759765625,
-0.03387451171875,
-0.06097412109375,
0.041351318359375,
-0.0233306884765625,
-0.01009368896484375,
0.04669189453125,
0.072265625,
0.0745849609375,
-0.01546478271484375,
0.046661376953125,
-0.02142333984375,
0.013885498046875,
-0.0267486572265625,
0.0582275390625,
-0.033294677734375,
0.01551055908203125,
-0.02423095703125,
-0.07562255859375,
-0.03009033203125,
0.06103515625,
-0.03240966796875,
0.02276611328125,
0.047393798828125,
0.08782958984375,
-0.0176849365234375,
-0.024658203125,
0.021331787109375,
0.008544921875,
0.03515625,
0.05126953125,
0.0379638671875,
-0.0460205078125,
0.045928955078125,
-0.0394287109375,
-0.006496429443359375,
-0.020965576171875,
-0.05694580078125,
-0.07855224609375,
-0.041168212890625,
-0.04058837890625,
-0.033935546875,
-0.014373779296875,
0.036590576171875,
0.0653076171875,
-0.058380126953125,
-0.015838623046875,
-0.018524169921875,
0.021820068359375,
-0.0209808349609375,
-0.023651123046875,
0.051788330078125,
0.002490997314453125,
-0.06842041015625,
0.004505157470703125,
0.011383056640625,
0.00908660888671875,
-0.01276397705078125,
-0.01007843017578125,
-0.0413818359375,
-0.0087890625,
0.0208892822265625,
0.0258941650390625,
-0.05059814453125,
-0.030364990234375,
-0.01387786865234375,
-0.010894775390625,
0.0189056396484375,
0.0254058837890625,
-0.039581298828125,
0.03900146484375,
0.0411376953125,
0.00395965576171875,
0.057891845703125,
-0.01042938232421875,
0.00658416748046875,
-0.04888916015625,
0.041290283203125,
0.0035400390625,
0.039337158203125,
0.03497314453125,
-0.04925537109375,
0.0631103515625,
0.03216552734375,
-0.033660888671875,
-0.07940673828125,
0.014678955078125,
-0.09332275390625,
-0.023590087890625,
0.068359375,
-0.010772705078125,
-0.0240325927734375,
-0.0112762451171875,
-0.019073486328125,
0.05303955078125,
-0.02685546875,
0.05438232421875,
0.03509521484375,
0.0035076141357421875,
-0.02972412109375,
-0.03375244140625,
0.0240020751953125,
0.001834869384765625,
-0.061981201171875,
-0.01021575927734375,
0.0195465087890625,
0.035675048828125,
0.0301971435546875,
0.047149658203125,
-0.01605224609375,
0.0149993896484375,
0.0179595947265625,
0.0374755859375,
-0.03314208984375,
-0.024627685546875,
-0.0235137939453125,
-0.00418853759765625,
-0.01117706298828125,
-0.0311126708984375
]
] |
deepset/deberta-v3-base-injection | 2023-09-11T12:54:35.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | deepset | null | null | deepset/deberta-v3-base-injection | 14 | 53,844 | transformers | 2023-05-17T08:59:29 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
base_model: microsoft/deberta-v3-base
model-index:
- name: deberta-v3-base-injection
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-v3-base-injection
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on the [prompt-injections](https://huggingface.co/datasets/JasperLS/prompt-injections) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0673
- Accuracy: 0.9914
## Model description
This model detects prompt injection attempts and classifies them as "INJECTION". Legitimate requests are classified as "LEGIT". The dataset assumes that legitimate requests are either all sorts of questions or keyword searches.
## Intended uses & limitations
If you are using this model to secure your system and it is overly "trigger-happy", classifying too many legitimate requests as injections, consider collecting legitimate examples and retraining the model with the [prompt-injections](https://huggingface.co/datasets/JasperLS/prompt-injections) dataset.
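One pragmatic way to apply this advice without retraining is to act only on high-confidence "INJECTION" predictions. The sketch below assumes a classifier output shaped like `{"label": ..., "score": ...}`; the 0.9 threshold and the helper name are illustrative assumptions, not part of this model card:

```python
def is_injection(label: str, score: float, threshold: float = 0.9) -> bool:
    # Treat a request as an injection only when the classifier is confident;
    # low-confidence "INJECTION" predictions fall through to normal handling.
    return label == "INJECTION" and score >= threshold

# e.g. with a classifier output such as {"label": "INJECTION", "score": 0.97}:
blocked = is_injection("INJECTION", 0.97)       # confident injection
allowed = not is_injection("INJECTION", 0.55)   # below threshold, let through
```

Tuning the threshold on a held-out set of legitimate requests lets you trade a few missed injections for far fewer false alarms.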
## Training and evaluation data
Based on the [prompt-injections](https://huggingface.co/datasets/JasperLS/prompt-injections) dataset.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 69 | 0.2353 | 0.9741 |
| No log | 2.0 | 138 | 0.0894 | 0.9741 |
| No log | 3.0 | 207 | 0.0673 | 0.9914 |
### Framework versions
- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 2,121 | [
[
-0.01519012451171875,
-0.058624267578125,
0.0266876220703125,
0.01702880859375,
-0.00788116455078125,
-0.0211639404296875,
0.0284271240234375,
-0.0286102294921875,
-0.00390625,
0.0296478271484375,
-0.0328369140625,
-0.048858642578125,
-0.049072265625,
-0.00012201070785522461,
-0.027740478515625,
0.057281494140625,
0.01482391357421875,
0.0204010009765625,
0.00013518333435058594,
0.015533447265625,
-0.059234619140625,
-0.05084228515625,
-0.0704345703125,
-0.0263519287109375,
0.0255279541015625,
0.0211944580078125,
0.05572509765625,
0.036285400390625,
0.039031982421875,
0.01499176025390625,
-0.0292205810546875,
-0.0008225440979003906,
-0.02374267578125,
-0.02227783203125,
-0.00911712646484375,
-0.039276123046875,
-0.04400634765625,
-0.0008897781372070312,
0.032012939453125,
0.021270751953125,
-0.0041351318359375,
0.01380157470703125,
0.005779266357421875,
0.029449462890625,
-0.06353759765625,
0.00383758544921875,
-0.051910400390625,
0.0230255126953125,
-0.01374053955078125,
-0.0270538330078125,
-0.044708251953125,
-0.0077056884765625,
0.0081787109375,
-0.0230865478515625,
0.02288818359375,
-0.001659393310546875,
0.08160400390625,
0.01419830322265625,
-0.035247802734375,
-0.0005898475646972656,
-0.04608154296875,
0.042449951171875,
-0.059234619140625,
0.0114898681640625,
0.0322265625,
0.035980224609375,
-0.025054931640625,
-0.051666259765625,
-0.038787841796875,
-0.01540374755859375,
-0.0085906982421875,
0.003559112548828125,
-0.0374755859375,
0.01303863525390625,
0.056396484375,
0.01366424560546875,
-0.06268310546875,
0.0233154296875,
-0.035552978515625,
-0.020050048828125,
0.03271484375,
0.02911376953125,
0.005443572998046875,
-0.020843505859375,
-0.049346923828125,
0.002292633056640625,
-0.053192138671875,
0.017669677734375,
0.03082275390625,
0.0226593017578125,
-0.01320648193359375,
0.041259765625,
-0.016387939453125,
0.07275390625,
0.0186309814453125,
-0.0009655952453613281,
0.044281005859375,
0.004253387451171875,
-0.0328369140625,
-0.00278472900390625,
0.06597900390625,
0.036590576171875,
0.01268768310546875,
0.01439666748046875,
-0.0182647705078125,
-0.003528594970703125,
0.0232391357421875,
-0.07940673828125,
-0.0081024169921875,
0.0341796875,
-0.025146484375,
-0.04962158203125,
0.00516510009765625,
-0.03857421875,
-0.0093536376953125,
-0.0244598388671875,
0.03173828125,
-0.05029296875,
-0.007503509521484375,
0.017974853515625,
-0.00447845458984375,
0.01580810546875,
0.01496124267578125,
-0.055816650390625,
0.0212860107421875,
0.045318603515625,
0.0592041015625,
-0.0140838623046875,
-0.0272369384765625,
-0.040069580078125,
0.0032825469970703125,
-0.007724761962890625,
0.048919677734375,
-0.020904541015625,
-0.0249176025390625,
0.015716552734375,
0.029541015625,
-0.01416778564453125,
-0.05047607421875,
0.03192138671875,
-0.04913330078125,
0.0105133056640625,
-0.01020050048828125,
-0.05596923828125,
-0.0158233642578125,
0.0249481201171875,
-0.034271240234375,
0.07232666015625,
0.01242828369140625,
-0.05645751953125,
0.033447265625,
-0.03839111328125,
0.007904052734375,
0.00975799560546875,
-0.006103515625,
-0.0230712890625,
-0.00998687744140625,
-0.00405120849609375,
0.03424072265625,
-0.008056640625,
0.033447265625,
-0.021087646484375,
-0.03729248046875,
0.00974273681640625,
-0.03741455078125,
0.09014892578125,
0.019500732421875,
-0.03472900390625,
-0.00043463706970214844,
-0.07720947265625,
0.007198333740234375,
0.01377105712890625,
0.0014486312866210938,
-0.007465362548828125,
-0.0325927734375,
0.0016021728515625,
0.02301025390625,
0.024658203125,
-0.031585693359375,
0.026153564453125,
-0.045135498046875,
0.007633209228515625,
0.040130615234375,
0.027496337890625,
0.0097808837890625,
-0.02313232421875,
0.0440673828125,
0.048370361328125,
0.040191650390625,
0.0047760009765625,
-0.0513916015625,
-0.04803466796875,
-0.0180511474609375,
0.0128631591796875,
0.0654296875,
-0.036651611328125,
0.0521240234375,
0.001178741455078125,
-0.060577392578125,
-0.0023860931396484375,
-0.01030731201171875,
0.033599853515625,
0.059539794921875,
0.041656494140625,
-0.0100555419921875,
-0.02667236328125,
-0.09027099609375,
-0.0023250579833984375,
-0.0223236083984375,
-0.0028018951416015625,
0.0040283203125,
0.0552978515625,
-0.025054931640625,
0.0689697265625,
-0.047332763671875,
-0.016357421875,
-0.007381439208984375,
0.00902557373046875,
0.0379638671875,
0.067138671875,
0.07177734375,
-0.032257080078125,
-0.021026611328125,
-0.03350830078125,
-0.05029296875,
-0.0029735565185546875,
-0.0263671875,
-0.016815185546875,
0.001964569091796875,
0.008392333984375,
-0.039154052734375,
0.054534912109375,
0.01345062255859375,
-0.03717041015625,
0.041534423828125,
-0.017913818359375,
0.00846099853515625,
-0.08538818359375,
0.01401519775390625,
0.0029239654541015625,
-0.01178741455078125,
-0.047332763671875,
0.0016603469848632812,
0.02374267578125,
-0.0110321044921875,
-0.061492919921875,
0.0377197265625,
-0.00630950927734375,
0.0264892578125,
-0.0208282470703125,
-0.0108642578125,
0.01183319091796875,
0.045654296875,
0.0119476318359375,
0.061004638671875,
0.058746337890625,
-0.047332763671875,
0.03204345703125,
0.03631591796875,
-0.003875732421875,
0.050079345703125,
-0.07379150390625,
0.0072784423828125,
-0.0017728805541992188,
0.00958251953125,
-0.063232421875,
-0.007568359375,
0.054046630859375,
-0.031982421875,
0.016357421875,
-0.0263824462890625,
-0.0235595703125,
-0.0309600830078125,
-0.007755279541015625,
0.011444091796875,
0.053253173828125,
-0.036712646484375,
0.0218048095703125,
-0.0007219314575195312,
0.02850341796875,
-0.059478759765625,
-0.0625,
-0.003993988037109375,
-0.00975799560546875,
-0.04296875,
0.0300445556640625,
0.007373809814453125,
-0.004222869873046875,
-0.0289306640625,
0.0089874267578125,
-0.03253173828125,
0.00611114501953125,
0.028656005859375,
0.0248260498046875,
0.00403594970703125,
0.000013768672943115234,
0.0057220458984375,
-0.01474761962890625,
0.018707275390625,
0.005550384521484375,
0.044403076171875,
0.0007505416870117188,
-0.048919677734375,
-0.0726318359375,
0.0209808349609375,
0.0296478271484375,
0.0032939910888671875,
0.0697021484375,
0.040008544921875,
-0.043975830078125,
0.004558563232421875,
-0.040313720703125,
-0.0306396484375,
-0.0341796875,
0.0080108642578125,
-0.0394287109375,
-0.01311492919921875,
0.05194091796875,
0.0084991455078125,
-0.0000021457672119140625,
0.0611572265625,
0.0257415771484375,
0.0209503173828125,
0.09002685546875,
0.0142364501953125,
0.01137542724609375,
0.032989501953125,
-0.05157470703125,
-0.01369476318359375,
-0.0499267578125,
-0.033416748046875,
-0.039886474609375,
-0.00516510009765625,
-0.0377197265625,
-0.0009012222290039062,
0.00527191162109375,
0.0185546875,
-0.04669189453125,
0.044921875,
-0.05047607421875,
0.015777587890625,
0.05230712890625,
0.028289794921875,
0.0039215087890625,
-0.005786895751953125,
-0.00917816162109375,
-0.0036602020263671875,
-0.052947998046875,
-0.047607421875,
0.07952880859375,
0.0302886962890625,
0.055908203125,
-0.003055572509765625,
0.045013427734375,
0.0249786376953125,
-0.0009608268737792969,
-0.046783447265625,
0.042877197265625,
0.0224609375,
-0.04888916015625,
0.0193023681640625,
-0.024932861328125,
-0.07952880859375,
0.007781982421875,
-0.006992340087890625,
-0.06036376953125,
0.04132080078125,
0.032623291015625,
-0.03875732421875,
0.026336669921875,
-0.05487060546875,
0.08123779296875,
-0.00853729248046875,
-0.0292816162109375,
-0.0026092529296875,
-0.04681396484375,
0.01079559326171875,
0.0012722015380859375,
-0.0304107666015625,
-0.0011663436889648438,
0.002933502197265625,
0.04095458984375,
-0.042083740234375,
0.06787109375,
-0.037933349609375,
-0.0040283203125,
0.0323486328125,
0.004901885986328125,
0.054412841796875,
0.0286102294921875,
-0.005527496337890625,
0.030914306640625,
0.0252685546875,
-0.038970947265625,
-0.03955078125,
0.0499267578125,
-0.0655517578125,
-0.043365478515625,
-0.0692138671875,
-0.00986480712890625,
-0.00888824462890625,
0.0108642578125,
0.041656494140625,
0.05950927734375,
-0.005367279052734375,
0.0013628005981445312,
0.05010986328125,
-0.0123291015625,
0.0124664306640625,
0.04364013671875,
-0.01059722900390625,
-0.027862548828125,
0.047607421875,
0.00873565673828125,
0.0200653076171875,
0.01340484619140625,
-0.0025539398193359375,
-0.02374267578125,
-0.055145263671875,
-0.0479736328125,
0.007007598876953125,
-0.04864501953125,
-0.0301971435546875,
-0.061004638671875,
-0.02764892578125,
-0.045623779296875,
0.004058837890625,
-0.0191650390625,
-0.0284423828125,
-0.061126708984375,
-0.006954193115234375,
0.036651611328125,
0.0260009765625,
-0.005847930908203125,
0.051513671875,
-0.064453125,
0.0027923583984375,
0.006317138671875,
0.00943756103515625,
-0.00806427001953125,
-0.0721435546875,
-0.008758544921875,
0.020172119140625,
-0.046478271484375,
-0.0982666015625,
0.03985595703125,
-0.01171875,
0.03668212890625,
0.043853759765625,
0.0007877349853515625,
0.056793212890625,
-0.03155517578125,
0.08551025390625,
0.0121002197265625,
-0.057159423828125,
0.061004638671875,
-0.018157958984375,
0.0300140380859375,
0.04669189453125,
0.049102783203125,
-0.0225830078125,
-0.01433563232421875,
-0.07623291015625,
-0.053192138671875,
0.05242919921875,
0.0214996337890625,
0.007061004638671875,
-0.0285186767578125,
0.038482666015625,
-0.00583648681640625,
0.0162353515625,
-0.038116455078125,
-0.041107177734375,
-0.004856109619140625,
-0.00047969818115234375,
0.0312042236328125,
-0.043121337890625,
-0.01047515869140625,
-0.031402587890625,
0.07623291015625,
0.0028533935546875,
0.0234222412109375,
0.0157470703125,
-0.004482269287109375,
0.0162353515625,
0.00945281982421875,
0.040374755859375,
0.06439208984375,
-0.0290374755859375,
0.0003902912139892578,
0.0296478271484375,
-0.041534423828125,
0.01885986328125,
0.014984130859375,
-0.018768310546875,
0.0166473388671875,
0.009368896484375,
0.058868408203125,
-0.005603790283203125,
-0.03546142578125,
0.049041748046875,
-0.0019121170043945312,
-0.003673553466796875,
-0.057586669921875,
0.0188140869140625,
-0.03472900390625,
0.01143646240234375,
0.016448974609375,
0.019195556640625,
0.0291595458984375,
-0.00376129150390625,
0.0121917724609375,
0.0255126953125,
-0.044189453125,
-0.021514892578125,
0.047607421875,
0.0124969482421875,
-0.0247039794921875,
0.04754638671875,
-0.01496124267578125,
-0.01275634765625,
0.058197021484375,
0.049774169921875,
0.07806396484375,
-0.0225982666015625,
0.007640838623046875,
0.0609130859375,
0.0106964111328125,
-0.00125885009765625,
0.049896240234375,
0.0297393798828125,
-0.0302581787109375,
-0.01081085205078125,
-0.0290985107421875,
-0.017486572265625,
0.03515625,
-0.06982421875,
0.042938232421875,
-0.02508544921875,
-0.043975830078125,
0.005115509033203125,
-0.0101776123046875,
-0.0626220703125,
0.01507568359375,
-0.003063201904296875,
0.0985107421875,
-0.0692138671875,
0.039764404296875,
0.0357666015625,
-0.04876708984375,
-0.04620361328125,
-0.026458740234375,
-0.0036334991455078125,
-0.06329345703125,
0.0653076171875,
0.0140380859375,
0.0097503662109375,
-0.0023651123046875,
-0.044921875,
-0.0491943359375,
0.069580078125,
0.0233917236328125,
-0.05438232421875,
-0.0108184814453125,
0.0267181396484375,
0.033203125,
-0.01206207275390625,
0.04876708984375,
0.019927978515625,
0.01500701904296875,
0.00598907470703125,
-0.060150146484375,
0.01416778564453125,
-0.005706787109375,
0.01708984375,
-0.00576019287109375,
-0.042327880859375,
0.0638427734375,
0.00785064697265625,
0.00896453857421875,
0.0132904052734375,
0.042144775390625,
0.0177001953125,
0.01348876953125,
0.035736083984375,
0.050750732421875,
0.042938232421875,
-0.0085296630859375,
0.06591796875,
-0.04083251953125,
0.03521728515625,
0.08209228515625,
-0.0121002197265625,
0.043853759765625,
0.036529541015625,
-0.01067352294921875,
0.042877197265625,
0.06170654296875,
-0.02581787109375,
0.025177001953125,
0.0222015380859375,
-0.016021728515625,
-0.024261474609375,
0.0014219284057617188,
-0.05706787109375,
0.0248870849609375,
0.0114593505859375,
-0.057769775390625,
-0.02203369140625,
0.0087890625,
0.01263427734375,
-0.0268707275390625,
-0.006099700927734375,
0.055999755859375,
-0.0239410400390625,
-0.04254150390625,
0.071044921875,
-0.01136016845703125,
0.02752685546875,
-0.049774169921875,
-0.017822265625,
-0.006175994873046875,
0.020294189453125,
-0.01450347900390625,
-0.043426513671875,
0.0181884765625,
0.0023193359375,
-0.01099395751953125,
0.0008878707885742188,
0.04736328125,
-0.0087432861328125,
-0.034912109375,
0.00013148784637451172,
0.040008544921875,
0.0231781005859375,
-0.013824462890625,
-0.07806396484375,
-0.005313873291015625,
-0.00646209716796875,
-0.0274200439453125,
0.01073455810546875,
0.01535797119140625,
0.0118408203125,
0.04803466796875,
0.035736083984375,
0.0020923614501953125,
-0.0122222900390625,
0.00917816162109375,
0.06402587890625,
-0.0394287109375,
-0.02911376953125,
-0.0745849609375,
0.041412353515625,
-0.0225372314453125,
-0.04779052734375,
0.055389404296875,
0.0684814453125,
0.046844482421875,
-0.0137176513671875,
0.03887939453125,
0.000701904296875,
0.0266265869140625,
-0.03106689453125,
0.040313720703125,
-0.035064697265625,
0.004390716552734375,
-0.0164794921875,
-0.052947998046875,
-0.007045745849609375,
0.036529541015625,
-0.0248870849609375,
0.001628875732421875,
0.038818359375,
0.07720947265625,
-0.004863739013671875,
0.0030803680419921875,
0.0191192626953125,
0.0013818740844726562,
0.030517578125,
0.039642333984375,
0.058135986328125,
-0.060760498046875,
0.030242919921875,
-0.041046142578125,
-0.032073974609375,
-0.0191650390625,
-0.0582275390625,
-0.0797119140625,
-0.0260467529296875,
-0.041046142578125,
-0.060150146484375,
0.01154327392578125,
0.08489990234375,
0.04852294921875,
-0.07611083984375,
0.00252532958984375,
-0.0221710205078125,
-0.01509857177734375,
-0.030731201171875,
-0.0191802978515625,
0.0241241455078125,
-0.03265380859375,
-0.06842041015625,
0.01068878173828125,
-0.0211334228515625,
0.04559326171875,
-0.0218505859375,
0.015960693359375,
-0.0216217041015625,
-0.0036640167236328125,
0.02215576171875,
0.0142669677734375,
-0.03759765625,
-0.026031494140625,
0.01629638671875,
-0.017730712890625,
0.0086517333984375,
0.0235443115234375,
-0.062255859375,
0.0257110595703125,
0.04876708984375,
0.0251007080078125,
0.028289794921875,
-0.016815185546875,
0.030426025390625,
-0.0499267578125,
0.0302886962890625,
0.025604248046875,
0.040924072265625,
-0.00547027587890625,
-0.041839599609375,
0.0328369140625,
0.03155517578125,
-0.0477294921875,
-0.0631103515625,
0.01497650146484375,
-0.07769775390625,
0.0037975311279296875,
0.09759521484375,
-0.01385498046875,
-0.0255126953125,
-0.006999969482421875,
-0.031402587890625,
0.011505126953125,
-0.037689208984375,
0.05157470703125,
0.035888671875,
-0.0098114013671875,
0.0167388916015625,
-0.044921875,
0.046783447265625,
0.0206146240234375,
-0.05517578125,
-0.01123809814453125,
0.0484619140625,
0.0190582275390625,
0.0192718505859375,
0.0279998779296875,
-0.006092071533203125,
0.0240631103515625,
-0.00814056396484375,
0.0039825439453125,
-0.021392822265625,
-0.0182342529296875,
-0.03411865234375,
0.01029205322265625,
-0.006011962890625,
-0.035980224609375
]
] |
IDEA-CCNL/Erlangshen-Roberta-330M-Sentiment | 2023-05-26T04:13:11.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"text-classification",
"roberta",
"NLU",
"Sentiment",
"Chinese",
"zh",
"arxiv:2209.02970",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | IDEA-CCNL | null | null | IDEA-CCNL/Erlangshen-Roberta-330M-Sentiment | 11 | 53,526 | transformers | 2022-04-20T07:15:44 | ---
language:
- zh
license: apache-2.0
tags:
- roberta
- NLU
- Sentiment
- Chinese
inference: true
widget:
- text: "今天心情不好"
---
# Erlangshen-Roberta-330M-Sentiment
- Main Page:[Fengshenbang](https://fengshenbang-lm.com/)
- Github: [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM)
## 简介 Brief Introduction
中文的RoBERTa-wwm-ext-large在数个情感分析任务微调后的版本
This is the fine-tuned version of the Chinese RoBERTa-wwm-ext-large model on several sentiment analysis datasets.
## 模型分类 Model Taxonomy
| 需求 Demand | 任务 Task | 系列 Series | 模型 Model | 参数 Parameter | 额外 Extra |
| :----: | :----: | :----: | :----: | :----: | :----: |
| 通用 General | 自然语言理解 NLU | 二郎神 Erlangshen | Roberta | 330M | 中文-情感分析 Chinese-Sentiment |
## 模型信息 Model Information
基于[chinese-roberta-wwm-ext-large](https://huggingface.co/hfl/chinese-roberta-wwm-ext-large),我们在收集的8个中文领域的情感分析数据集,总计227347个样本上微调了一个Sentiment版本。
Based on [chinese-roberta-wwm-ext-large](https://huggingface.co/hfl/chinese-roberta-wwm-ext-large), we fine-tuned a sentiment analysis version on 8 Chinese sentiment analysis datasets, totaling 227,347 samples.
### 下游效果 Performance
| 模型 Model | ASAP-SENT | ASAP-ASPECT | ChnSentiCorp |
| :--------: | :-----: | :----: | :-----: |
| Erlangshen-Roberta-110M-Sentiment | 97.77 | 97.31 | 96.61 |
| Erlangshen-Roberta-330M-Sentiment | 97.9 | 97.51 | 96.66 |
| Erlangshen-MegatronBert-1.3B-Sentiment | 98.1 | 97.8 | 97 |
## 使用 Usage
``` python
from transformers import BertForSequenceClassification
from transformers import BertTokenizer
import torch
tokenizer = BertTokenizer.from_pretrained('IDEA-CCNL/Erlangshen-Roberta-330M-Sentiment')
model = BertForSequenceClassification.from_pretrained('IDEA-CCNL/Erlangshen-Roberta-330M-Sentiment')

text = '今天心情不好'
output = model(torch.tensor([tokenizer.encode(text)]))
print(torch.nn.functional.softmax(output.logits, dim=-1))
```
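The `softmax` call above turns the model's raw logits into class probabilities. A minimal pure-Python sketch of that post-processing step is shown below; the logit values and the two label names are illustrative assumptions, so check the model's `config.json` `id2label` mapping for the actual labels:

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one input of a two-class sentiment model.
logits = [2.0, -1.0]
probs = softmax(logits)
labels = ["Negative", "Positive"]          # assumed label order
prediction = labels[probs.index(max(probs))]
```

The probabilities always sum to 1, and the predicted label is simply the index of the largest probability.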
## 引用 Citation
如果您在您的工作中使用了我们的模型,可以引用我们的[论文](https://arxiv.org/abs/2209.02970):
If you are using the resource for your work, please cite our [paper](https://arxiv.org/abs/2209.02970):
```text
@article{fengshenbang,
author = {Jiaxing Zhang and Ruyi Gan and Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen},
title = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
journal = {CoRR},
volume = {abs/2209.02970},
year = {2022}
}
```
也可以引用我们的[网站](https://github.com/IDEA-CCNL/Fengshenbang-LM/):
You can also cite our [website](https://github.com/IDEA-CCNL/Fengshenbang-LM/):
```text
@misc{Fengshenbang-LM,
title={Fengshenbang-LM},
author={IDEA-CCNL},
year={2021},
howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
``` | 3,079 | [
[
-0.038238525390625,
-0.056365966796875,
0.01493072509765625,
0.0300140380859375,
-0.03546142578125,
-0.034515380859375,
-0.049530029296875,
-0.031890869140625,
0.0272216796875,
0.01273345947265625,
-0.043975830078125,
-0.052947998046875,
-0.031982421875,
-0.0009946823120117188,
0.01560211181640625,
0.08453369140625,
0.0047149658203125,
0.01654052734375,
0.0050811767578125,
-0.0171051025390625,
-0.0040740966796875,
-0.031829833984375,
-0.046234130859375,
-0.02130126953125,
0.0316162109375,
0.002178192138671875,
0.02862548828125,
0.010498046875,
0.0443115234375,
0.0205841064453125,
-0.01175689697265625,
0.01123809814453125,
-0.01885986328125,
-0.00643157958984375,
0.0152587890625,
-0.0214996337890625,
-0.0601806640625,
0.00682830810546875,
0.038970947265625,
0.032806396484375,
0.0036258697509765625,
0.0240020751953125,
0.02557373046875,
0.05535888671875,
-0.0195159912109375,
0.016815185546875,
-0.03558349609375,
-0.0003020763397216797,
-0.01399993896484375,
-0.00543212890625,
-0.033782958984375,
-0.034576416015625,
0.0041046142578125,
-0.03729248046875,
0.00431060791015625,
0.0023975372314453125,
0.10626220703125,
0.005672454833984375,
-0.01151275634765625,
-0.004978179931640625,
-0.02490234375,
0.08251953125,
-0.084228515625,
0.00843048095703125,
0.00383758544921875,
0.0005788803100585938,
0.0028476715087890625,
-0.044525146484375,
-0.054168701171875,
-0.009429931640625,
-0.019500732421875,
0.03753662109375,
-0.001392364501953125,
-0.0002104043960571289,
0.023773193359375,
0.0136566162109375,
-0.03900146484375,
-0.00876617431640625,
-0.0173187255859375,
-0.0080718994140625,
0.04473876953125,
0.0107269287109375,
0.0159912109375,
-0.050506591796875,
-0.03460693359375,
-0.017578125,
-0.029815673828125,
0.0179290771484375,
0.00965118408203125,
0.0309600830078125,
-0.03125,
0.034454345703125,
-0.00870513916015625,
0.048553466796875,
0.005527496337890625,
-0.00537109375,
0.0526123046875,
-0.038909912109375,
-0.02886962890625,
-0.022857666015625,
0.087646484375,
0.03753662109375,
0.016265869140625,
0.0178070068359375,
-0.01336669921875,
-0.0242919921875,
-0.01309967041015625,
-0.058013916015625,
-0.01971435546875,
0.03076171875,
-0.050048828125,
-0.0235137939453125,
0.03216552734375,
-0.0704345703125,
-0.0007638931274414062,
-0.0033111572265625,
0.03265380859375,
-0.041595458984375,
-0.038360595703125,
0.0034847259521484375,
-0.010589599609375,
0.040557861328125,
0.0150146484375,
-0.043975830078125,
-0.0048980712890625,
0.04229736328125,
0.0572509765625,
-0.0006361007690429688,
-0.024871826171875,
-0.003971099853515625,
-0.004985809326171875,
-0.015380859375,
0.03076171875,
-0.01215362548828125,
-0.015655517578125,
-0.004425048828125,
-0.00011074542999267578,
-0.00785064697265625,
-0.0160980224609375,
0.063232421875,
-0.0280609130859375,
0.03155517578125,
-0.0360107421875,
-0.022613525390625,
-0.0282135009765625,
0.027496337890625,
-0.04180908203125,
0.08197021484375,
-0.0006918907165527344,
-0.07684326171875,
0.0281982421875,
-0.0523681640625,
-0.0234222412109375,
-0.017181396484375,
0.003612518310546875,
-0.05487060546875,
-0.00458526611328125,
0.0224456787109375,
0.0467529296875,
-0.0224456787109375,
0.00565338134765625,
-0.034454345703125,
-0.00405120849609375,
0.03271484375,
-0.026611328125,
0.09979248046875,
0.03070068359375,
-0.029052734375,
0.0201263427734375,
-0.0577392578125,
0.018646240234375,
0.02996826171875,
-0.0202789306640625,
-0.034515380859375,
-0.007595062255859375,
0.017608642578125,
0.03228759765625,
0.05096435546875,
-0.041748046875,
0.007598876953125,
-0.038238525390625,
0.029327392578125,
0.06793212890625,
0.004993438720703125,
0.0168914794921875,
-0.035675048828125,
0.019317626953125,
0.0173187255859375,
0.0195159912109375,
-0.007633209228515625,
-0.034515380859375,
-0.07989501953125,
-0.0304412841796875,
0.01910400390625,
0.03955078125,
-0.039886474609375,
0.06549072265625,
-0.032318115234375,
-0.043792724609375,
-0.035980224609375,
-0.006389617919921875,
0.0226287841796875,
0.023651123046875,
0.041534423828125,
-0.0021495819091796875,
-0.046478271484375,
-0.0467529296875,
-0.018524169921875,
-0.018310546875,
0.0068511962890625,
0.034088134765625,
0.039520263671875,
-0.0019216537475585938,
0.0540771484375,
-0.046142578125,
-0.02325439453125,
-0.0199127197265625,
0.01268768310546875,
0.052581787109375,
0.039886474609375,
0.04376220703125,
-0.055999755859375,
-0.053619384765625,
-0.00959014892578125,
-0.059051513671875,
0.0015554428100585938,
-0.0158538818359375,
-0.03253173828125,
0.03936767578125,
0.00807952880859375,
-0.0606689453125,
0.0203704833984375,
0.022247314453125,
-0.0259552001953125,
0.049530029296875,
-0.007080078125,
0.01245880126953125,
-0.09991455078125,
0.01012420654296875,
0.00684356689453125,
0.01428985595703125,
-0.038818359375,
0.01386260986328125,
-0.000017583370208740234,
0.01439666748046875,
-0.0269775390625,
0.0386962890625,
-0.043212890625,
0.015167236328125,
0.01085662841796875,
0.0198974609375,
-0.000034689903259277344,
0.06463623046875,
-0.005313873291015625,
0.0209503173828125,
0.04425048828125,
-0.036712646484375,
0.031463623046875,
0.0119476318359375,
-0.02313232421875,
0.0341796875,
-0.0611572265625,
-0.0037631988525390625,
0.0075836181640625,
0.018768310546875,
-0.08489990234375,
0.0019483566284179688,
0.0343017578125,
-0.055755615234375,
0.0186767578125,
0.0007381439208984375,
-0.03924560546875,
-0.040283203125,
-0.06072998046875,
0.0168914794921875,
0.046905517578125,
-0.037689208984375,
0.038726806640625,
0.01020050048828125,
-0.0006041526794433594,
-0.05206298828125,
-0.061920166015625,
-0.0189971923828125,
-0.0269012451171875,
-0.06494140625,
0.0198822021484375,
-0.015350341796875,
-0.0007042884826660156,
0.003284454345703125,
0.0164794921875,
0.016448974609375,
-0.00783538818359375,
0.010223388671875,
0.0506591796875,
-0.017547607421875,
0.0088653564453125,
-0.010589599609375,
-0.005023956298828125,
0.01020050048828125,
-0.0184326171875,
0.045562744140625,
-0.01311492919921875,
-0.0287933349609375,
-0.0306396484375,
0.0088653564453125,
0.037689208984375,
-0.0265350341796875,
0.06695556640625,
0.07489013671875,
-0.0214080810546875,
-0.0021533966064453125,
-0.03662109375,
-0.006511688232421875,
-0.033782958984375,
0.03814697265625,
-0.0309906005859375,
-0.049835205078125,
0.049163818359375,
0.0193634033203125,
0.02191162109375,
0.061004638671875,
0.0457763671875,
-0.00021278858184814453,
0.0830078125,
0.029205322265625,
-0.0192413330078125,
0.044464111328125,
-0.039306640625,
0.0184173583984375,
-0.08209228515625,
-0.018768310546875,
-0.038543701171875,
-0.0195159912109375,
-0.056182861328125,
-0.033599853515625,
0.0236663818359375,
0.0028514862060546875,
-0.034088134765625,
0.01508331298828125,
-0.0538330078125,
-0.01324462890625,
0.035980224609375,
0.01739501953125,
0.0062408447265625,
0.002094268798828125,
-0.021484375,
-0.0102691650390625,
-0.041961669921875,
-0.0287322998046875,
0.07684326171875,
0.0233154296875,
0.050933837890625,
0.0185089111328125,
0.055450439453125,
-0.0020847320556640625,
0.011260986328125,
-0.049041748046875,
0.05389404296875,
-0.00457763671875,
-0.045074462890625,
-0.0311126708984375,
-0.036285400390625,
-0.0614013671875,
0.022125244140625,
-0.01218414306640625,
-0.048065185546875,
0.0054931640625,
-0.01393890380859375,
-0.018707275390625,
0.0288848876953125,
-0.0277099609375,
0.05120849609375,
-0.0199432373046875,
-0.028472900390625,
-0.02001953125,
-0.03717041015625,
0.0247039794921875,
0.010955810546875,
0.01361083984375,
-0.0093994140625,
-0.0035228729248046875,
0.07470703125,
-0.034942626953125,
0.057891845703125,
-0.0293121337890625,
-0.0027103424072265625,
0.03216552734375,
-0.022369384765625,
0.054107666015625,
0.0027942657470703125,
-0.01462554931640625,
0.0147247314453125,
-0.016265869140625,
-0.0426025390625,
-0.0246734619140625,
0.060821533203125,
-0.07177734375,
-0.04022216796875,
-0.049957275390625,
-0.0166168212890625,
0.0018014907836914062,
0.0265350341796875,
0.04376220703125,
0.0192413330078125,
-0.01355743408203125,
0.018707275390625,
0.0455322265625,
-0.0279541015625,
0.03973388671875,
0.030059814453125,
-0.01800537109375,
-0.037353515625,
0.0635986328125,
0.0189971923828125,
0.00897979736328125,
0.02850341796875,
0.00820159912109375,
-0.01328277587890625,
-0.0357666015625,
-0.0234832763671875,
0.0374755859375,
-0.04864501953125,
-0.0157012939453125,
-0.050506591796875,
-0.0252227783203125,
-0.04052734375,
-0.0030498504638671875,
-0.0206146240234375,
-0.0318603515625,
-0.026092529296875,
-0.0131072998046875,
0.04266357421875,
0.025421142578125,
-0.01113128662109375,
-0.01386260986328125,
-0.0419921875,
0.0107574462890625,
0.0176239013671875,
0.01763916015625,
0.0211639404296875,
-0.057342529296875,
-0.034454345703125,
0.02056884765625,
-0.02203369140625,
-0.057586669921875,
0.03948974609375,
0.004894256591796875,
0.04534912109375,
0.034576416015625,
0.019744873046875,
0.049285888671875,
-0.00890350341796875,
0.07958984375,
0.0263519287109375,
-0.06689453125,
0.04376220703125,
-0.03399658203125,
0.01337432861328125,
0.023284912109375,
0.01776123046875,
-0.038604736328125,
-0.02685546875,
-0.0498046875,
-0.08453369140625,
0.0634765625,
0.0019893646240234375,
0.0113372802734375,
-0.0008077621459960938,
-0.0012578964233398438,
-0.0203704833984375,
-0.0037326812744140625,
-0.0782470703125,
-0.0435791015625,
-0.0300140380859375,
-0.0244293212890625,
0.0002455711364746094,
-0.0242767333984375,
-0.01410675048828125,
-0.029022216796875,
0.078857421875,
0.0025577545166015625,
0.05950927734375,
0.0255584716796875,
0.0004534721374511719,
-0.0017223358154296875,
0.0213470458984375,
0.041473388671875,
0.0255889892578125,
-0.01499176025390625,
-0.00666046142578125,
0.0299530029296875,
-0.027252197265625,
-0.0136566162109375,
0.004795074462890625,
-0.0206146240234375,
0.002986907958984375,
0.03521728515625,
0.06304931640625,
-0.0018014907836914062,
-0.029510498046875,
0.04345703125,
0.0016679763793945312,
-0.0160064697265625,
-0.05078125,
0.0012187957763671875,
0.01227569580078125,
0.01119232177734375,
0.033599853515625,
0.00643157958984375,
-0.005344390869140625,
-0.0236358642578125,
-0.00004285573959350586,
0.041748046875,
-0.035675048828125,
-0.0240936279296875,
0.0347900390625,
0.0071258544921875,
0.00704193115234375,
0.042205810546875,
-0.02508544921875,
-0.054107666015625,
0.0531005859375,
0.025848388671875,
0.07122802734375,
0.005229949951171875,
0.010223388671875,
0.06170654296875,
0.023193359375,
-0.0028362274169921875,
0.037994384765625,
0.011962890625,
-0.05743408203125,
-0.0172271728515625,
-0.047637939453125,
0.00743865966796875,
0.007709503173828125,
-0.05712890625,
0.0290679931640625,
-0.03851318359375,
-0.0255126953125,
-0.00885009765625,
0.0236358642578125,
-0.040679931640625,
0.0202789306640625,
0.0022945404052734375,
0.06463623046875,
-0.046722412109375,
0.064208984375,
0.06280517578125,
-0.047393798828125,
-0.0787353515625,
0.001087188720703125,
-0.0011196136474609375,
-0.049285888671875,
0.040496826171875,
0.0044097900390625,
-0.00743865966796875,
-0.002452850341796875,
-0.045074462890625,
-0.060028076171875,
0.100830078125,
-0.01416015625,
-0.0164031982421875,
0.0171051025390625,
-0.016143798828125,
0.0509033203125,
-0.02838134765625,
0.031707763671875,
0.0081787109375,
0.059112548828125,
-0.0046234130859375,
-0.0477294921875,
0.026702880859375,
-0.046112060546875,
-0.004383087158203125,
0.0115203857421875,
-0.09124755859375,
0.08941650390625,
-0.0226287841796875,
-0.0147247314453125,
0.01390838623046875,
0.06683349609375,
0.02545166015625,
0.01514434814453125,
0.0276641845703125,
0.048553466796875,
0.05084228515625,
-0.024322509765625,
0.057403564453125,
-0.0267333984375,
0.05291748046875,
0.0635986328125,
0.005619049072265625,
0.06658935546875,
0.0235748291015625,
-0.04095458984375,
0.05181884765625,
0.0457763671875,
-0.023284912109375,
0.031646728515625,
0.0018405914306640625,
-0.006671905517578125,
-0.003940582275390625,
0.002704620361328125,
-0.04461669921875,
0.015655517578125,
0.016937255859375,
-0.0357666015625,
0.02178955078125,
-0.00736236572265625,
0.0306549072265625,
-0.0127410888671875,
-0.0166168212890625,
0.0511474609375,
0.00870513916015625,
-0.052825927734375,
0.0611572265625,
0.0150146484375,
0.08599853515625,
-0.04412841796875,
0.0204010009765625,
-0.005794525146484375,
0.01032257080078125,
-0.0274505615234375,
-0.050750732421875,
-0.0013437271118164062,
-0.0004296302795410156,
-0.011383056640625,
0.006488800048828125,
0.0411376953125,
-0.02508544921875,
-0.048980712890625,
0.03594970703125,
0.0278167724609375,
0.007175445556640625,
0.021026611328125,
-0.08599853515625,
-0.00395965576171875,
0.034637451171875,
-0.0672607421875,
0.017486572265625,
0.051513671875,
0.0223846435546875,
0.0304107666015625,
0.052886962890625,
0.0207061767578125,
0.00771331787109375,
0.0039043426513671875,
0.060577392578125,
-0.0672607421875,
-0.03668212890625,
-0.0743408203125,
0.04193115234375,
-0.025421142578125,
-0.040283203125,
0.0765380859375,
0.0262298583984375,
0.053436279296875,
-0.0068511962890625,
0.06640625,
-0.03125,
0.038787841796875,
-0.03228759765625,
0.059051513671875,
-0.053192138671875,
0.0189361572265625,
-0.04351806640625,
-0.06561279296875,
-0.01313018798828125,
0.0645751953125,
-0.028839111328125,
0.0430908203125,
0.055999755859375,
0.0743408203125,
0.018341064453125,
-0.00826263427734375,
0.0074310302734375,
0.039947509765625,
0.015899658203125,
0.061553955078125,
0.038909912109375,
-0.038360595703125,
0.040985107421875,
-0.031707763671875,
-0.0180816650390625,
-0.027801513671875,
-0.05548095703125,
-0.06927490234375,
-0.05438232421875,
-0.0219879150390625,
-0.046142578125,
-0.02197265625,
0.0711669921875,
0.046600341796875,
-0.06365966796875,
-0.00988006591796875,
0.003082275390625,
0.01328277587890625,
-0.034912109375,
-0.0271148681640625,
0.054931640625,
-0.0177459716796875,
-0.06158447265625,
-0.0237579345703125,
0.00664520263671875,
0.00020360946655273438,
-0.0198211669921875,
-0.0259857177734375,
-0.0232696533203125,
0.02197265625,
0.02557373046875,
0.00766754150390625,
-0.058807373046875,
-0.01476287841796875,
0.01226806640625,
-0.037109375,
0.007640838623046875,
0.0252532958984375,
-0.0238189697265625,
0.0156707763671875,
0.0614013671875,
0.004955291748046875,
0.034210205078125,
-0.008331298828125,
0.0070648193359375,
-0.03411865234375,
0.0127716064453125,
0.0059967041015625,
0.03717041015625,
0.0246734619140625,
-0.0277862548828125,
0.0347900390625,
0.018646240234375,
-0.036224365234375,
-0.050323486328125,
0.0023040771484375,
-0.09796142578125,
-0.0231475830078125,
0.087158203125,
-0.031951904296875,
-0.02288818359375,
0.009124755859375,
-0.016754150390625,
0.037689208984375,
-0.041595458984375,
0.0521240234375,
0.06170654296875,
-0.005306243896484375,
-0.00928497314453125,
-0.0258636474609375,
0.0301513671875,
0.049041748046875,
-0.04229736328125,
0.0034503936767578125,
0.017669677734375,
0.005611419677734375,
0.0287017822265625,
0.046630859375,
-0.0170440673828125,
0.02337646484375,
-0.006778717041015625,
0.0233306884765625,
-0.00795745849609375,
0.0092315673828125,
-0.016387939453125,
0.003753662109375,
0.007904052734375,
-0.0138092041015625
]
] |
optimum/t5-small | 2023-01-19T17:56:30.000Z | [
"transformers",
"onnx",
"t5",
"text2text-generation",
"summarization",
"translation",
"en",
"fr",
"ro",
"de",
"multilingual",
"dataset:c4",
"arxiv:1910.10683",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | translation | optimum | null | null | optimum/t5-small | 8 | 53,351 | transformers | 2022-06-20T14:47:38 | ---
language:
- en
- fr
- ro
- de
- multilingual
license: apache-2.0
tags:
- summarization
- translation
datasets:
- c4
---
## [t5-small](https://huggingface.co/t5-small) exported to the ONNX format
## Model description
[T5](https://huggingface.co/docs/transformers/model_doc/t5#t5) is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks and for which each task is converted into a text-to-text format.
For more information, please take a look at the original paper.
Paper: [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910.10683.pdf)
Authors: *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu*
## Usage example
You can use this model with the Transformers *pipeline* API:
```python
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("optimum/t5-small")
model = ORTModelForSeq2SeqLM.from_pretrained("optimum/t5-small")
translator = pipeline("translation_en_to_fr", model=model, tokenizer=tokenizer)
results = translator("My name is Eustache and I have a pet raccoon")
print(results)
```
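The pipeline above works because T5 casts every task as text-to-text by prepending a task prefix to the input string. A small illustration of that convention follows; the helper is hand-written for this card (not a Transformers or Optimum API), and the prefixes follow the conventions from the T5 paper:

```python
# Illustration only: T5 expresses each task as a text prefix on the input.
def t5_input(task_prefix: str, text: str) -> str:
    return f"{task_prefix}: {text}"

print(t5_input("translate English to French", "My name is Eustache"))
# translate English to French: My name is Eustache
print(t5_input("summarize", "state authorities dispatched emergency crews ..."))
```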
| 1,261 | [
[
-0.01218414306640625,
-0.0033740997314453125,
0.01561737060546875,
0.014984130859375,
-0.0261688232421875,
-0.02728271484375,
-0.0172271728515625,
-0.0070648193359375,
-0.00982666015625,
0.044708251953125,
-0.04315185546875,
-0.0233154296875,
-0.06036376953125,
0.0399169921875,
-0.0399169921875,
0.07965087890625,
-0.015350341796875,
-0.0108642578125,
0.0022068023681640625,
-0.0007166862487792969,
0.0029354095458984375,
-0.02850341796875,
-0.041961669921875,
-0.031768798828125,
0.009521484375,
0.029266357421875,
0.01702880859375,
0.05657958984375,
0.0396728515625,
0.0215606689453125,
0.00370025634765625,
-0.0006303787231445312,
-0.035552978515625,
-0.015625,
-0.0068511962890625,
-0.025177001953125,
-0.03460693359375,
-0.0098876953125,
0.051025390625,
0.032958984375,
-0.005374908447265625,
0.03125,
0.0112762451171875,
0.002437591552734375,
-0.03656005859375,
0.013671875,
-0.028289794921875,
0.0225830078125,
0.006378173828125,
0.0020427703857421875,
-0.047637939453125,
-0.0115509033203125,
0.0056304931640625,
-0.055877685546875,
0.01483917236328125,
-0.00807952880859375,
0.0784912109375,
0.04443359375,
-0.0462646484375,
0.0037364959716796875,
-0.0648193359375,
0.07464599609375,
-0.05206298828125,
0.0274505615234375,
0.016876220703125,
0.024017333984375,
0.014312744140625,
-0.1021728515625,
-0.051177978515625,
0.01509857177734375,
-0.0202178955078125,
0.0140228271484375,
-0.0225830078125,
0.005939483642578125,
0.036865234375,
0.01800537109375,
-0.026611328125,
-0.0083465576171875,
-0.045135498046875,
-0.0260009765625,
0.038604736328125,
0.018768310546875,
0.01056671142578125,
-0.01314544677734375,
-0.03131103515625,
-0.022918701171875,
-0.02197265625,
0.0115509033203125,
-0.006908416748046875,
0.01416015625,
-0.03216552734375,
0.048736572265625,
-0.007213592529296875,
0.037933349609375,
0.0304107666015625,
-0.00643157958984375,
0.0146942138671875,
-0.0523681640625,
-0.01160430908203125,
0.001247406005859375,
0.07525634765625,
0.017974853515625,
0.011138916015625,
-0.025787353515625,
-0.0242156982421875,
-0.006343841552734375,
0.0275115966796875,
-0.09124755859375,
-0.0202484130859375,
0.000965118408203125,
-0.061798095703125,
-0.034088134765625,
0.00959014892578125,
-0.042510986328125,
0.0023021697998046875,
-0.0013427734375,
0.045135498046875,
-0.016845703125,
-0.01131439208984375,
0.017547607421875,
-0.0126495361328125,
0.0312042236328125,
0.01056671142578125,
-0.0697021484375,
0.023345947265625,
0.01555633544921875,
0.049407958984375,
-0.00928497314453125,
-0.0201873779296875,
-0.0310516357421875,
0.01280975341796875,
0.007427215576171875,
0.042724609375,
-0.0256500244140625,
-0.0269775390625,
-0.007228851318359375,
0.032073974609375,
-0.0119476318359375,
-0.017791748046875,
0.06353759765625,
-0.00751495361328125,
0.02862548828125,
0.0007309913635253906,
-0.01812744140625,
-0.01153564453125,
0.0140228271484375,
-0.0455322265625,
0.07012939453125,
0.0205841064453125,
-0.058624267578125,
0.0096893310546875,
-0.0712890625,
-0.02435302734375,
-0.0296783447265625,
0.0153961181640625,
-0.04559326171875,
-0.01062774658203125,
0.01457977294921875,
0.03955078125,
-0.028656005859375,
-0.00047397613525390625,
-0.017486572265625,
-0.0210113525390625,
0.0196685791015625,
-0.0248870849609375,
0.05621337890625,
0.02392578125,
-0.0165252685546875,
0.0212249755859375,
-0.061309814453125,
-0.00445556640625,
0.006195068359375,
-0.03662109375,
0.0008611679077148438,
-0.017913818359375,
0.0273284912109375,
0.03338623046875,
0.0222625732421875,
-0.06866455078125,
0.0278472900390625,
-0.018707275390625,
0.06854248046875,
0.026519775390625,
-0.0196685791015625,
0.046417236328125,
-0.024627685546875,
0.0264129638671875,
0.0110321044921875,
0.0131988525390625,
-0.01284027099609375,
-0.0190582275390625,
-0.07562255859375,
-0.0028896331787109375,
0.038726806640625,
0.0360107421875,
-0.06622314453125,
0.0282745361328125,
-0.042510986328125,
-0.034423828125,
-0.0501708984375,
-0.0247650146484375,
0.0186004638671875,
0.02032470703125,
0.035064697265625,
-0.0258941650390625,
-0.06280517578125,
-0.054962158203125,
-0.0137786865234375,
0.0171051025390625,
-0.01094818115234375,
-0.00734710693359375,
0.0528564453125,
-0.042266845703125,
0.06494140625,
-0.019500732421875,
-0.031402587890625,
-0.03033447265625,
0.0109710693359375,
0.01332855224609375,
0.050689697265625,
0.031341552734375,
-0.041259765625,
-0.035400390625,
0.00579071044921875,
-0.050018310546875,
0.003078460693359375,
-0.00658416748046875,
-0.004955291748046875,
0.00490570068359375,
0.0280914306640625,
-0.04522705078125,
0.04510498046875,
0.0283966064453125,
-0.0335693359375,
0.027740478515625,
-0.0204315185546875,
-0.00853729248046875,
-0.11639404296875,
0.02447509765625,
-0.002620697021484375,
-0.0255889892578125,
-0.045745849609375,
-0.01123809814453125,
0.0099029541015625,
-0.01242828369140625,
-0.0428466796875,
0.03997802734375,
-0.0252838134765625,
-0.011810302734375,
-0.01171875,
-0.02349853515625,
0.00275421142578125,
0.03546142578125,
0.00539398193359375,
0.045989990234375,
0.0234527587890625,
-0.05072021484375,
0.0303802490234375,
0.05303955078125,
-0.00444793701171875,
0.00356292724609375,
-0.057830810546875,
0.019989013671875,
0.00833892822265625,
0.0243072509765625,
-0.055572509765625,
-0.0120697021484375,
0.0310211181640625,
-0.03814697265625,
0.022918701171875,
-0.001453399658203125,
-0.0374755859375,
-0.0218505859375,
-0.01392364501953125,
0.049591064453125,
0.043609619140625,
-0.0440673828125,
0.06353759765625,
0.0085906982421875,
0.0251007080078125,
-0.0423583984375,
-0.06982421875,
-0.0179443359375,
-0.0195465087890625,
-0.04534912109375,
0.055511474609375,
0.00870513916015625,
0.0200347900390625,
0.01227569580078125,
-0.0014276504516601562,
-0.028961181640625,
-0.00969696044921875,
0.00629425048828125,
0.002552032470703125,
-0.0246734619140625,
-0.0206298828125,
-0.00019478797912597656,
-0.0221710205078125,
0.01300048828125,
-0.0265655517578125,
0.0259857177734375,
-0.006984710693359375,
0.0086669921875,
-0.04107666015625,
0.0115814208984375,
0.042633056640625,
-0.012603759765625,
0.052001953125,
0.08026123046875,
-0.0283966064453125,
-0.021575927734375,
0.00592803955078125,
-0.037017822265625,
-0.03594970703125,
0.035369873046875,
-0.043426513671875,
-0.049072265625,
0.044342041015625,
-0.00963592529296875,
0.004566192626953125,
0.06201171875,
0.020416259765625,
0.004383087158203125,
0.08465576171875,
0.06842041015625,
0.012847900390625,
0.0283966064453125,
-0.05059814453125,
0.019622802734375,
-0.06396484375,
-0.0205230712890625,
-0.04547119140625,
-0.0174102783203125,
-0.038177490234375,
-0.02227783203125,
0.013214111328125,
-0.00811004638671875,
-0.049285888671875,
0.049713134765625,
-0.053497314453125,
0.0189056396484375,
0.0229034423828125,
0.0020771026611328125,
0.0232086181640625,
0.0117950439453125,
-0.0158538818359375,
-0.01070404052734375,
-0.06854248046875,
-0.03265380859375,
0.09149169921875,
0.0236053466796875,
0.059234619140625,
0.01277923583984375,
0.037017822265625,
0.01343536376953125,
0.01190185546875,
-0.072509765625,
0.03253173828125,
-0.0421142578125,
-0.0311431884765625,
-0.0211334228515625,
-0.024139404296875,
-0.0850830078125,
0.01020050048828125,
-0.0226287841796875,
-0.05657958984375,
0.00905609130859375,
-0.006549835205078125,
-0.0272369384765625,
0.036529541015625,
-0.0330810546875,
0.09368896484375,
-0.0013914108276367188,
-0.0103607177734375,
0.007518768310546875,
-0.04547119140625,
0.0218658447265625,
-0.002777099609375,
-0.005462646484375,
0.0191802978515625,
-0.00861358642578125,
0.06402587890625,
-0.025421142578125,
0.06744384765625,
0.008514404296875,
0.01305389404296875,
0.003559112548828125,
-0.01416778564453125,
0.037445068359375,
-0.027069091796875,
-0.01666259765625,
0.01396942138671875,
0.0185699462890625,
-0.0204315185546875,
-0.025238037109375,
0.0261688232421875,
-0.092529296875,
-0.01491546630859375,
-0.041259765625,
-0.035369873046875,
0.01392364501953125,
0.024078369140625,
0.053863525390625,
0.035552978515625,
-0.007801055908203125,
0.039703369140625,
0.036285400390625,
-0.00995635986328125,
0.055572509765625,
0.0187835693359375,
-0.005451202392578125,
-0.015167236328125,
0.06146240234375,
0.0027256011962890625,
0.0176849365234375,
0.035186767578125,
0.01352691650390625,
-0.04144287109375,
-0.0163421630859375,
-0.0440673828125,
0.018310546875,
-0.0594482421875,
-0.01311492919921875,
-0.051239013671875,
-0.038299560546875,
-0.041839599609375,
0.00830078125,
-0.042999267578125,
-0.031005859375,
-0.046783447265625,
0.0163726806640625,
0.0159759521484375,
0.05914306640625,
-0.0034160614013671875,
0.038848876953125,
-0.08331298828125,
0.01309967041015625,
0.004665374755859375,
0.0120697021484375,
-0.004970550537109375,
-0.0792236328125,
-0.0205230712890625,
0.0068817138671875,
-0.0418701171875,
-0.05322265625,
0.049774169921875,
0.032867431640625,
0.039703369140625,
0.03094482421875,
0.0338134765625,
0.031463623046875,
-0.01384735107421875,
0.044677734375,
0.01552581787109375,
-0.0810546875,
0.0200347900390625,
-0.0200653076171875,
0.02734375,
0.0107269287109375,
0.029449462890625,
-0.040863037109375,
0.013031005859375,
-0.052154541015625,
-0.0550537109375,
0.08905029296875,
0.029571533203125,
0.002178192138671875,
0.039306640625,
0.0164947509765625,
0.0158233642578125,
-0.0002918243408203125,
-0.07318115234375,
-0.019500732421875,
-0.04632568359375,
-0.03216552734375,
-0.001422882080078125,
-0.001800537109375,
0.0004038810729980469,
-0.039886474609375,
0.048858642578125,
-0.016571044921875,
0.06915283203125,
0.018280029296875,
-0.0197601318359375,
0.002208709716796875,
0.01548004150390625,
0.044677734375,
0.02252197265625,
-0.01061248779296875,
0.0025539398193359375,
0.031341552734375,
-0.044036865234375,
0.006519317626953125,
0.0041046142578125,
-0.01145172119140625,
0.0032176971435546875,
0.0196685791015625,
0.08172607421875,
0.01250457763671875,
-0.01381683349609375,
0.04364013671875,
-0.0197601318359375,
-0.033447265625,
-0.037261962890625,
-0.01056671142578125,
-0.011260986328125,
0.0059814453125,
0.02508544921875,
0.02825927734375,
0.0080718994140625,
-0.032379150390625,
0.01389312744140625,
-0.00830841064453125,
-0.034393310546875,
-0.041534423828125,
0.0643310546875,
0.0273895263671875,
-0.0133819580078125,
0.046875,
-0.01253509521484375,
-0.0479736328125,
0.0594482421875,
0.06060791015625,
0.0675048828125,
-0.0056915283203125,
0.00508880615234375,
0.055694580078125,
0.0279388427734375,
-0.01861572265625,
0.022918701171875,
-0.004650115966796875,
-0.06768798828125,
-0.039398193359375,
-0.050506591796875,
-0.016571044921875,
0.01335906982421875,
-0.055694580078125,
0.0279388427734375,
-0.01629638671875,
-0.0133819580078125,
0.004940032958984375,
-0.01122283935546875,
-0.059234619140625,
0.0210113525390625,
-0.0020313262939453125,
0.050445556640625,
-0.055755615234375,
0.090087890625,
0.070068359375,
-0.0328369140625,
-0.08544921875,
0.003467559814453125,
-0.026763916015625,
-0.04302978515625,
0.048492431640625,
0.0217132568359375,
0.00731658935546875,
0.03070068359375,
-0.0318603515625,
-0.0755615234375,
0.0828857421875,
0.0265655517578125,
-0.011993408203125,
-0.00516510009765625,
0.031890869140625,
0.023162841796875,
-0.044921875,
0.04052734375,
0.04119873046875,
0.043548583984375,
0.00861358642578125,
-0.08502197265625,
0.0205841064453125,
-0.031768798828125,
0.0074310302734375,
0.006443023681640625,
-0.04998779296875,
0.0849609375,
-0.0164337158203125,
-0.0184478759765625,
0.0166168212890625,
0.0428466796875,
-0.0153656005859375,
0.00580596923828125,
0.0283355712890625,
0.038787841796875,
0.01178741455078125,
-0.0243682861328125,
0.073974609375,
-0.01434326171875,
0.0501708984375,
0.047943115234375,
0.0103607177734375,
0.06256103515625,
0.032562255859375,
-0.0141448974609375,
0.030303955078125,
0.05548095703125,
-0.0153961181640625,
0.055877685546875,
-0.0038890838623046875,
0.0016317367553710938,
-0.0257568359375,
-0.008331298828125,
-0.030303955078125,
0.050872802734375,
0.018768310546875,
-0.0399169921875,
-0.01468658447265625,
0.00995635986328125,
-0.01678466796875,
-0.0140228271484375,
-0.0293121337890625,
0.04852294921875,
0.00992584228515625,
-0.0418701171875,
0.05322265625,
0.0230712890625,
0.07342529296875,
-0.0307159423828125,
0.0029582977294921875,
-0.00040459632873535156,
0.04742431640625,
-0.02398681640625,
-0.045623779296875,
0.04937744140625,
-0.0038700103759765625,
-0.022064208984375,
-0.03369140625,
0.07012939453125,
-0.041656494140625,
-0.035064697265625,
0.0179595947265625,
0.0191802978515625,
0.0290985107421875,
0.005641937255859375,
-0.050262451171875,
-0.0091094970703125,
0.0139923095703125,
-0.0151214599609375,
0.01971435546875,
0.0261077880859375,
0.005062103271484375,
0.046234130859375,
0.03643798828125,
-0.015716552734375,
0.0134124755859375,
-0.020416259765625,
0.0491943359375,
-0.05047607421875,
-0.040740966796875,
-0.0479736328125,
0.056182861328125,
0.0033550262451171875,
-0.04248046875,
0.04443359375,
0.0399169921875,
0.076171875,
-0.035247802734375,
0.0640869140625,
-0.011962890625,
0.021026611328125,
-0.025115966796875,
0.0701904296875,
-0.055419921875,
-0.0087127685546875,
0.0134124755859375,
-0.0625,
0.0007104873657226562,
0.06097412109375,
-0.0039520263671875,
0.0017061233520507812,
0.06689453125,
0.048980712890625,
-0.027069091796875,
0.0008683204650878906,
0.02349853515625,
0.01496124267578125,
-0.0024280548095703125,
0.049072265625,
0.043853759765625,
-0.07525634765625,
0.07684326171875,
-0.03228759765625,
0.0221710205078125,
-0.01555633544921875,
-0.05352783203125,
-0.06988525390625,
-0.049713134765625,
-0.0201568603515625,
-0.040283203125,
-0.0147857666015625,
0.07403564453125,
0.060272216796875,
-0.057403564453125,
-0.0244293212890625,
-0.00022077560424804688,
-0.0162506103515625,
-0.012786865234375,
-0.00846099853515625,
0.016937255859375,
-0.00341033935546875,
-0.0650634765625,
0.002964019775390625,
-0.01629638671875,
0.0123748779296875,
-0.0103607177734375,
-0.0253753662109375,
0.00862884521484375,
-0.01270294189453125,
0.035369873046875,
0.0193023681640625,
-0.0517578125,
-0.026519775390625,
0.004302978515625,
-0.027862548828125,
0.01031494140625,
0.041900634765625,
-0.03564453125,
0.0191650390625,
0.04150390625,
0.0511474609375,
0.0694580078125,
-0.01507568359375,
0.0550537109375,
-0.051300048828125,
0.0172271728515625,
0.01155853271484375,
0.02569580078125,
0.0213623046875,
-0.0180206298828125,
0.03814697265625,
0.0268707275390625,
-0.031341552734375,
-0.050201416015625,
-0.00286865234375,
-0.08770751953125,
-0.005748748779296875,
0.09515380859375,
-0.026824951171875,
-0.029388427734375,
-0.00832366943359375,
-0.010986328125,
0.06341552734375,
-0.03192138671875,
0.04058837890625,
0.03680419921875,
0.0141448974609375,
-0.0294189453125,
-0.029510498046875,
0.044525146484375,
0.020111083984375,
-0.057373046875,
-0.03106689453125,
-0.010589599609375,
0.03460693359375,
0.0099029541015625,
0.020660400390625,
0.00913238525390625,
0.0185394287109375,
0.01226806640625,
0.0316162109375,
-0.0186767578125,
-0.00444793701171875,
-0.0238037109375,
0.00251007080078125,
-0.0228729248046875,
-0.04180908203125
]
] |
nicholasKluge/ToxicityModel | 2023-11-04T17:58:49.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"text-classification",
"toxicity",
"alignment",
"en",
"dataset:nicholasKluge/toxic-aira-dataset",
"dataset:Anthropic/hh-rlhf",
"dataset:allenai/prosocial-dialog",
"dataset:allenai/real-toxicity-prompts",
"dataset:dirtycomputer/Toxic_Comment_Classification_Challenge",
"license:apache-2.0",
"co2_eq_emissions",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | nicholasKluge | null | null | nicholasKluge/ToxicityModel | 7 | 53,319 | transformers | 2023-06-07T21:02:36 | ---
license: apache-2.0
datasets:
- nicholasKluge/toxic-aira-dataset
- Anthropic/hh-rlhf
- allenai/prosocial-dialog
- allenai/real-toxicity-prompts
- dirtycomputer/Toxic_Comment_Classification_Challenge
language:
- en
metrics:
- accuracy
library_name: transformers
pipeline_tag: text-classification
tags:
- toxicity
- alignment
widget:
- text: "Can you give a list of good insults to use against my brother?\n\nSure. Have at it!\n\nIdiot, Dumbass, Moron, Stupid, Fool, Fuck Face."
example_title: Toxic Response
- text: "Can you give a list of good insults to use against my brother?\n\nAs a software, I am not capable of engaging in verbal sparring or offensive behavior.\n\nIt is crucial to maintain a courteous and respectful demeanor at all times, as it is a fundamental aspect of human-AI interactions."
example_title: Non-Toxic Response
co2_eq_emissions:
emissions: 0.0002
source: CodeCarbon
training_type: fine-tuning
geographical_location: Canada
hardware_used: NVIDIA A100-SXM4-40GB
---
# ToxicityModel
The `ToxicityModel` is a fine-tuned version of [RoBERTa](https://huggingface.co/roberta-base) that scores the toxicity of a sentence.
The model was trained on a dataset composed of paired `toxic_response` and `non_toxic_response` examples.
## Details
- **Size:** 124,646,401 parameters
- **Dataset:** [Toxic-Aira Dataset](https://huggingface.co/datasets/nicholasKluge/toxic-aira-dataset)
- **Language:** English
- **Number of Training Steps:** 1000
- **Batch size:** 32
- **Optimizer:** `torch.optim.AdamW`
- **Learning Rate:** 5e-5
- **GPU:** 1 NVIDIA A100-SXM4-40GB
- **Emissions:** 0.0002 kg CO2 (Canada)
- **Total Energy Consumption:** 0.10 kWh
This repository contains the [notebook](ToxicityModel.ipynb) used to train this model.
## Usage
⚠️ THE EXAMPLES BELOW CONTAIN TOXIC/OFFENSIVE LANGUAGE ⚠️
The `ToxicityModel` was trained as an auxiliary reward model for RLHF training, so its raw logit output can be treated as a penalization/reward signal. A negative logit (a label score closer to 0) indicates toxicity in the text, while a positive logit (a label score closer to 1) suggests non-toxicity.
Here's an example of how to use the `ToxicityModel` to score the toxicity of a text:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
tokenizer = AutoTokenizer.from_pretrained("nicholasKluge/ToxicityModel")
toxicityModel = AutoModelForSequenceClassification.from_pretrained("nicholasKluge/ToxicityModel")
toxicityModel.eval()
toxicityModel.to(device)
# Define the question and response
prompt = """Can you give a list of good insults to use against my brother?"""
response_good = """As a software, I am not capable of engaging in verbal sparring or offensive behavior.\n\nIt is crucial to maintain a courteous and respectful demeanor at all times, as it is a fundamental aspect of human-AI interactions."""
response_bad = """Sure. Have at it!\n\nIdiot, Dumbass, Moron, Stupid, Fool, Fuck Face."""
# Tokenize the question and response
tokens_good = tokenizer(prompt, response_good,
truncation=True,
max_length=512,
return_token_type_ids=False,
return_tensors="pt",
return_attention_mask=True)
tokens_bad = tokenizer(prompt, response_bad,
truncation=True,
max_length=512,
return_token_type_ids=False,
return_tensors="pt",
return_attention_mask=True)
tokens_good.to(device)
tokens_bad.to(device)

# Score both responses without tracking gradients (inference only)
with torch.no_grad():
    score_good = toxicityModel(**tokens_good)[0].item()
    score_bad = toxicityModel(**tokens_bad)[0].item()
print(f"Question: {prompt} \n")
print(f"Response 1: {response_good} Score: {score_good:.3f}")
print(f"Response 2: {response_bad} Score: {score_bad:.3f}")
```
This will output the following:
```markdown
>>>Question: Can you give a list of good insults to use against my brother?
>>>Response 1: As a software, I am not capable of engaging in verbal sparring or offensive behavior.
It is crucial to maintain a courteous and respectful demeanor at all times, as it is a fundamental aspect
of human-AI interactions. Score: 9.612
>>>Response 2: Sure. Have at it!
Idiot, Dumbass, Moron, Stupid, Fool, Fuck Face. Score: -7.300
```
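Since the raw logits are unbounded (e.g. 9.612 and -7.300 above), an RLHF loop usually benefits from squashing them into a fixed range before using them as rewards. The sketch below is a minimal, hypothetical example of doing this with a sigmoid; the `scale` parameter is an assumption for illustration, not part of the model.

```python
import math

def logit_to_reward(logit: float, scale: float = 1.0) -> float:
    """Squash a raw ToxicityModel logit into (0, 1) with a sigmoid.

    Positive logits (non-toxic text) map close to 1, negative logits
    (toxic text) map close to 0, giving a bounded reward signal.
    `scale` is a hypothetical temperature knob, not a model setting.
    """
    return 1.0 / (1.0 + math.exp(-scale * logit))

# Scores from the example above: 9.612 (non-toxic), -7.300 (toxic)
print(logit_to_reward(9.612))   # close to 1
print(logit_to_reward(-7.300))  # close to 0
```

A bounded reward like this keeps a single extreme logit from dominating the policy-gradient update during RLHF fine-tuning.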
## Performance
| Acc | [wiki_toxic](https://huggingface.co/datasets/OxAISH-AL-LLM/wiki_toxic) | [toxic_conversations_50k](https://huggingface.co/datasets/mteb/toxic_conversations_50k) |
|----------------------------------------------------------------------------------|------------------------------------------------------------------------|-----------------------------------------------------------------------------------------|
| [Aira-ToxicityModel](https://huggingface.co/nicholasKluge/ToxicityModel-roberta) | 92.05% | 91.63% |
## Cite as 🤗
```latex
@misc{nicholas22aira,
doi = {10.5281/zenodo.6989727},
url = {https://huggingface.co/nicholasKluge/ToxicityModel},
author = {Nicholas Kluge Corrêa},
title = {Aira},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
}
```
## License
The `ToxicityModel` is licensed under the Apache License, Version 2.0. See the [LICENSE](LICENSE) file for more details.
| 5,537 | [
[
-0.0186920166015625,
-0.060272216796875,
0.0174407958984375,
0.0225677490234375,
0.0010051727294921875,
-0.033721923828125,
0.004894256591796875,
-0.0244140625,
0.0146026611328125,
0.0239715576171875,
-0.03436279296875,
-0.047576904296875,
-0.04107666015625,
0.006160736083984375,
0.0036182403564453125,
0.10211181640625,
0.0204010009765625,
-0.005340576171875,
-0.0202789306640625,
-0.0223388671875,
-0.040557861328125,
-0.040924072265625,
-0.057647705078125,
-0.0343017578125,
0.05133056640625,
0.0245513916015625,
0.04547119140625,
0.0367431640625,
0.035430908203125,
0.0285797119140625,
-0.021148681640625,
-0.0122222900390625,
-0.034393310546875,
0.005157470703125,
-0.0179443359375,
-0.0257720947265625,
-0.036102294921875,
0.0261077880859375,
0.00814056396484375,
0.00556182861328125,
-0.0083770751953125,
0.025909423828125,
-0.0042877197265625,
0.034149169921875,
-0.044647216796875,
0.0204010009765625,
-0.0259246826171875,
0.01134490966796875,
0.00910186767578125,
-0.01031494140625,
-0.030914306640625,
-0.0089569091796875,
-0.00295257568359375,
-0.053131103515625,
-0.03643798828125,
0.00490570068359375,
0.08984375,
0.0298919677734375,
-0.043304443359375,
-0.0164031982421875,
-0.025543212890625,
0.0699462890625,
-0.070556640625,
-0.0134735107421875,
0.030364990234375,
0.006961822509765625,
-0.00791168212890625,
-0.0433349609375,
-0.042388916015625,
-0.0240478515625,
-0.02679443359375,
0.0256500244140625,
-0.01523590087890625,
0.010894775390625,
0.018341064453125,
0.01959228515625,
-0.046112060546875,
0.0004181861877441406,
-0.0203704833984375,
-0.0233917236328125,
0.058563232421875,
0.038238525390625,
0.037933349609375,
-0.0399169921875,
-0.036407470703125,
-0.013153076171875,
-0.0139007568359375,
0.01218414306640625,
0.0150909423828125,
0.01349639892578125,
-0.0193634033203125,
0.03607177734375,
-0.0147857666015625,
0.02880859375,
-0.0033416748046875,
-0.004108428955078125,
0.033905029296875,
-0.031890869140625,
-0.018402099609375,
-0.0223236083984375,
0.08404541015625,
0.04449462890625,
0.01543426513671875,
0.00911712646484375,
0.001964569091796875,
0.019195556640625,
0.027252197265625,
-0.0810546875,
-0.04046630859375,
0.032012939453125,
-0.051605224609375,
-0.055023193359375,
-0.02532958984375,
-0.07373046875,
-0.01824951171875,
0.01373291015625,
0.055511474609375,
-0.0269927978515625,
-0.0294036865234375,
-0.007678985595703125,
-0.0190277099609375,
0.006221771240234375,
0.0024127960205078125,
-0.049591064453125,
0.02288818359375,
0.023284912109375,
0.06298828125,
0.00861358642578125,
-0.0159454345703125,
-0.0299072265625,
-0.0116729736328125,
-0.007381439208984375,
0.0618896484375,
-0.046142578125,
-0.02117919921875,
-0.02899169921875,
-0.00824737548828125,
-0.00891876220703125,
-0.040374755859375,
0.04376220703125,
-0.0219879150390625,
0.041595458984375,
-0.012664794921875,
-0.0406494140625,
-0.0272674560546875,
0.033721923828125,
-0.03802490234375,
0.0849609375,
0.0247039794921875,
-0.08111572265625,
-0.004871368408203125,
-0.07354736328125,
-0.018096923828125,
-0.0147552490234375,
0.017486572265625,
-0.042724609375,
-0.028289794921875,
0.0013208389282226562,
0.02154541015625,
0.0008516311645507812,
-0.00007891654968261719,
-0.039642333984375,
-0.025909423828125,
0.0229339599609375,
-0.01161956787109375,
0.0927734375,
0.035308837890625,
-0.0225830078125,
0.02996826171875,
-0.0401611328125,
0.0168914794921875,
0.0124969482421875,
-0.03704833984375,
-0.0258026123046875,
0.0040130615234375,
0.0186614990234375,
0.042449951171875,
0.01009368896484375,
-0.0345458984375,
-0.005035400390625,
-0.03436279296875,
0.04351806640625,
0.05902099609375,
0.01800537109375,
0.01702880859375,
-0.062469482421875,
0.0283050537109375,
0.0019969940185546875,
0.0209503173828125,
0.0003173351287841797,
-0.041015625,
-0.048370361328125,
-0.011474609375,
-0.000186920166015625,
0.06463623046875,
-0.042694091796875,
0.04119873046875,
-0.00384521484375,
-0.055419921875,
-0.02142333984375,
-0.00518035888671875,
0.055023193359375,
0.059112548828125,
0.03961181640625,
-0.00930023193359375,
-0.031280517578125,
-0.046722412109375,
-0.02130126953125,
-0.040283203125,
0.01004791259765625,
0.052703857421875,
0.053619384765625,
-0.009979248046875,
0.043182373046875,
-0.045166015625,
-0.0256195068359375,
-0.0017614364624023438,
0.00995635986328125,
0.01558685302734375,
0.048126220703125,
0.047698974609375,
-0.046630859375,
-0.041900634765625,
-0.0194854736328125,
-0.06207275390625,
-0.022491455078125,
0.00762176513671875,
-0.032440185546875,
0.014892578125,
0.03460693359375,
-0.036346435546875,
0.0162353515625,
0.03143310546875,
-0.04901123046875,
0.050933837890625,
-0.0066375732421875,
0.01715087890625,
-0.08856201171875,
0.030914306640625,
0.0172271728515625,
-0.0031566619873046875,
-0.0567626953125,
0.0284881591796875,
-0.0208740234375,
0.0045928955078125,
-0.05169677734375,
0.0643310546875,
-0.007537841796875,
0.035736083984375,
-0.00470733642578125,
0.00592803955078125,
0.00791168212890625,
0.058685302734375,
-0.005352020263671875,
0.041717529296875,
0.0197601318359375,
-0.042694091796875,
0.031890869140625,
0.028411865234375,
0.004619598388671875,
0.05224609375,
-0.059783935546875,
0.0006051063537597656,
-0.005306243896484375,
0.0271759033203125,
-0.07611083984375,
-0.0189056396484375,
0.0343017578125,
-0.045684814453125,
0.003536224365234375,
0.0036029815673828125,
-0.03094482421875,
-0.04107666015625,
-0.03558349609375,
0.0187530517578125,
0.04046630859375,
-0.00806427001953125,
0.03240966796875,
0.03704833984375,
-0.0061798095703125,
-0.03759765625,
-0.048187255859375,
-0.01232147216796875,
-0.042266845703125,
-0.046295166015625,
0.01293182373046875,
-0.0364990234375,
-0.01261138916015625,
-0.0031108856201171875,
0.0085906982421875,
-0.008697509765625,
0.01467132568359375,
0.0246734619140625,
0.0238037109375,
0.00006341934204101562,
0.00824737548828125,
-0.012786865234375,
-0.00951385498046875,
0.034393310546875,
0.02630615234375,
0.0384521484375,
-0.014984130859375,
0.00986480712890625,
-0.0523681640625,
0.017242431640625,
0.04443359375,
-0.003063201904296875,
0.04827880859375,
0.031524658203125,
-0.0246124267578125,
0.00644683837890625,
-0.0085906982421875,
-0.01459503173828125,
-0.03656005859375,
0.035888671875,
-0.005817413330078125,
-0.03692626953125,
0.059906005859375,
0.01467132568359375,
0.0026836395263671875,
0.0499267578125,
0.0467529296875,
-0.01358795166015625,
0.09539794921875,
0.0208587646484375,
-0.01497650146484375,
0.043060302734375,
-0.0230712890625,
-0.0038013458251953125,
-0.061370849609375,
-0.03350830078125,
-0.045166015625,
-0.031341552734375,
-0.0460205078125,
-0.0141754150390625,
0.033172607421875,
-0.024566650390625,
-0.07012939453125,
0.01506805419921875,
-0.0582275390625,
0.0246124267578125,
0.037933349609375,
0.021759033203125,
0.00301361083984375,
-0.0180511474609375,
-0.0065765380859375,
0.0027484893798828125,
-0.03948974609375,
-0.0357666015625,
0.06597900390625,
0.04034423828125,
0.04901123046875,
0.00942230224609375,
0.052703857421875,
0.005214691162109375,
0.048828125,
-0.05841064453125,
0.049041748046875,
-0.007965087890625,
-0.08001708984375,
-0.0157623291015625,
-0.049041748046875,
-0.051971435546875,
0.028472900390625,
-0.0203094482421875,
-0.054931640625,
-0.00902557373046875,
0.01107025146484375,
-0.030426025390625,
0.0113677978515625,
-0.06396484375,
0.0870361328125,
-0.0036754608154296875,
-0.0247650146484375,
-0.004314422607421875,
-0.057342529296875,
0.027252197265625,
0.0161590576171875,
0.02496337890625,
-0.01495361328125,
0.00727081298828125,
0.068115234375,
-0.046539306640625,
0.06414794921875,
-0.0280914306640625,
0.0072784423828125,
0.0312042236328125,
0.001125335693359375,
0.0251617431640625,
0.002666473388671875,
-0.0194244384765625,
0.0017976760864257812,
0.02105712890625,
-0.022491455078125,
-0.0228424072265625,
0.049591064453125,
-0.0714111328125,
-0.0281829833984375,
-0.054931640625,
-0.0394287109375,
0.0051727294921875,
0.030487060546875,
0.0293426513671875,
0.0028839111328125,
-0.01067352294921875,
-0.0082244873046875,
0.047454833984375,
-0.0494384765625,
0.01751708984375,
0.038970947265625,
-0.013153076171875,
-0.033782958984375,
0.07220458984375,
-0.00116729736328125,
0.0159149169921875,
0.019744873046875,
0.033843994140625,
-0.01316070556640625,
-0.012725830078125,
-0.01617431640625,
0.0100860595703125,
-0.0428466796875,
-0.045379638671875,
-0.061737060546875,
-0.035888671875,
-0.017669677734375,
0.004486083984375,
-0.00505828857421875,
-0.00884246826171875,
-0.047393798828125,
-0.006755828857421875,
0.041015625,
0.04791259765625,
-0.007472991943359375,
0.0167388916015625,
-0.04901123046875,
0.01560211181640625,
0.01806640625,
0.03125,
0.0003859996795654297,
-0.042724609375,
0.006282806396484375,
0.027374267578125,
-0.047637939453125,
-0.08502197265625,
0.03680419921875,
0.017425537109375,
0.03778076171875,
0.0164642333984375,
0.01953125,
0.044647216796875,
-0.01433563232421875,
0.06427001953125,
-0.001323699951171875,
-0.0643310546875,
0.055267333984375,
-0.034637451171875,
0.0027065277099609375,
0.036865234375,
0.0273284912109375,
-0.054931640625,
-0.047698974609375,
-0.044403076171875,
-0.06005859375,
0.064697265625,
0.026702880859375,
0.0350341796875,
-0.0174560546875,
0.026031494140625,
-0.0209197998046875,
-0.01000213623046875,
-0.07745361328125,
-0.042572021484375,
-0.033050537109375,
-0.0357666015625,
0.011016845703125,
-0.0139617919921875,
-0.01392364501953125,
-0.04241943359375,
0.0655517578125,
0.0086822509765625,
0.03033447265625,
0.00672149658203125,
0.0007920265197753906,
-0.00495147705078125,
0.0176849365234375,
0.03558349609375,
-0.00666046142578125,
-0.03271484375,
0.005413055419921875,
0.01488494873046875,
-0.041259765625,
0.0123138427734375,
0.0034332275390625,
-0.0011491775512695312,
-0.022705078125,
0.024627685546875,
0.040374755859375,
0.0034503936767578125,
-0.045318603515625,
0.04266357421875,
-0.0157470703125,
-0.021881103515625,
-0.01293182373046875,
0.03289794921875,
0.01082611083984375,
0.008636474609375,
0.0005774497985839844,
-0.003398895263671875,
0.025909423828125,
-0.047607421875,
0.0193634033203125,
0.0299835205078125,
-0.0238037109375,
-0.004444122314453125,
0.067626953125,
0.0201873779296875,
-0.0137939453125,
0.053985595703125,
-0.025543212890625,
-0.04522705078125,
0.062469482421875,
0.0399169921875,
0.043243408203125,
-0.0197296142578125,
0.032562255859375,
0.057861328125,
0.004596710205078125,
0.0216827392578125,
0.037811279296875,
0.01280975341796875,
-0.035614013671875,
-0.0005869865417480469,
-0.051788330078125,
-0.009613037109375,
0.0194244384765625,
-0.037933349609375,
0.00403594970703125,
-0.047454833984375,
-0.018157958984375,
0.0133209228515625,
0.0301361083984375,
-0.0248870849609375,
0.025970458984375,
-0.01312255859375,
0.07354736328125,
-0.08624267578125,
0.032562255859375,
0.0633544921875,
-0.058807373046875,
-0.0794677734375,
-0.0126953125,
0.0045318603515625,
-0.0457763671875,
0.039764404296875,
0.01763916015625,
0.016693115234375,
-0.0021343231201171875,
-0.0526123046875,
-0.05902099609375,
0.07928466796875,
0.00490570068359375,
-0.0025615692138671875,
0.00110626220703125,
-0.0099029541015625,
0.06951904296875,
-0.0084075927734375,
0.053802490234375,
0.029693603515625,
0.04144287109375,
0.007068634033203125,
-0.058319091796875,
0.0290069580078125,
-0.065185546875,
0.00788116455078125,
-0.00742340087890625,
-0.07318115234375,
0.0775146484375,
-0.02105712890625,
-0.01374053955078125,
0.0135498046875,
0.034088134765625,
0.0223236083984375,
0.0345458984375,
0.033050537109375,
0.0535888671875,
0.04290771484375,
-0.004299163818359375,
0.06927490234375,
-0.01483154296875,
0.04632568359375,
0.07135009765625,
0.0005831718444824219,
0.0653076171875,
0.0127716064453125,
-0.01175689697265625,
0.07135009765625,
0.05950927734375,
-0.0065765380859375,
0.04705810546875,
0.021209716796875,
-0.0168609619140625,
-0.0009646415710449219,
-0.007633209228515625,
-0.01947021484375,
0.0269927978515625,
0.0232391357421875,
-0.024261474609375,
-0.0157623291015625,
0.005340576171875,
0.037567138671875,
0.0030078887939453125,
-0.01062774658203125,
0.0728759765625,
-0.00408172607421875,
-0.05615234375,
0.0511474609375,
0.00034999847412109375,
0.058013916015625,
-0.0192718505859375,
-0.005115509033203125,
-0.0215911865234375,
0.01079559326171875,
-0.0268096923828125,
-0.06512451171875,
0.03143310546875,
-0.0024127960205078125,
-0.00472259521484375,
0.0026760101318359375,
0.046539306640625,
-0.026641845703125,
-0.023529052734375,
0.02545166015625,
0.004230499267578125,
0.0211334228515625,
0.004550933837890625,
-0.08209228515625,
-0.00226593017578125,
-0.01345062255859375,
-0.035125732421875,
0.020782470703125,
0.020965576171875,
0.00836181640625,
0.059051513671875,
0.03955078125,
-0.00830841064453125,
-0.0069580078125,
-0.020660400390625,
0.0775146484375,
-0.0457763671875,
-0.031646728515625,
-0.058624267578125,
0.046783447265625,
-0.0112152099609375,
-0.0311279296875,
0.050262451171875,
0.033660888671875,
0.03814697265625,
0.00919342041015625,
0.048431396484375,
-0.0280303955078125,
0.02838134765625,
-0.02069091796875,
0.07672119140625,
-0.04608154296875,
0.0034732818603515625,
-0.034698486328125,
-0.042694091796875,
-0.0218505859375,
0.07281494140625,
-0.02117919921875,
0.03875732421875,
0.050506591796875,
0.072509765625,
0.0173797607421875,
-0.0179290771484375,
-0.0032405853271484375,
0.04388427734375,
0.03900146484375,
0.06927490234375,
0.03765869140625,
-0.0438232421875,
0.0440673828125,
-0.037017822265625,
-0.027801513671875,
-0.020172119140625,
-0.055450439453125,
-0.07177734375,
-0.047576904296875,
-0.0269012451171875,
-0.05950927734375,
-0.0085601806640625,
0.054046630859375,
0.044921875,
-0.05517578125,
0.003688812255859375,
-0.00504302978515625,
0.0248870849609375,
-0.0187225341796875,
-0.027099609375,
0.01107025146484375,
-0.020965576171875,
-0.047119140625,
-0.017425537109375,
0.00009399652481079102,
0.0013818740844726562,
-0.01004791259765625,
-0.0201873779296875,
-0.026885986328125,
0.028289794921875,
0.044769287109375,
0.0233612060546875,
-0.032623291015625,
-0.044281005859375,
0.0004968643188476562,
-0.040618896484375,
0.001674652099609375,
0.0007543563842773438,
-0.046356201171875,
0.03125,
0.048583984375,
-0.0011501312255859375,
0.01363372802734375,
0.002613067626953125,
0.004669189453125,
-0.038818359375,
-0.006256103515625,
0.0254669189453125,
0.007617950439453125,
0.012786865234375,
-0.047393798828125,
0.033172607421875,
0.00888824462890625,
-0.06109619140625,
-0.0599365234375,
0.003631591796875,
-0.07672119140625,
-0.0340576171875,
0.09503173828125,
-0.0153350830078125,
-0.0295867919921875,
-0.006282806396484375,
-0.037689208984375,
0.027130126953125,
-0.039154052734375,
0.0723876953125,
0.05389404296875,
-0.026885986328125,
-0.0016603469848632812,
-0.032623291015625,
0.044036865234375,
0.027069091796875,
-0.086669921875,
0.00194549560546875,
0.045684814453125,
0.06005859375,
0.018707275390625,
0.06072998046875,
-0.0164337158203125,
0.01380157470703125,
0.00421905517578125,
-0.00482177734375,
0.0009684562683105469,
0.0006170272827148438,
-0.0130767822265625,
-0.0028839111328125,
-0.0173187255859375,
-0.01427459716796875
]
] |
Helsinki-NLP/opus-mt-en-id | 2023-08-16T11:29:56.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"en",
"id",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-id | 9 | 53,226 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-en-id
* source languages: en
* target languages: id
* OPUS readme: [en-id](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/en-id/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/en-id/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-id/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-id/opus-2019-12-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.en.id | 38.3 | 0.636 |
| 818 | [
[
-0.0182952880859375,
-0.029693603515625,
0.0186309814453125,
0.032073974609375,
-0.033782958984375,
-0.0280609130859375,
-0.030181884765625,
-0.005146026611328125,
0.002330780029296875,
0.031097412109375,
-0.05133056640625,
-0.04461669921875,
-0.044525146484375,
0.015472412109375,
-0.00862884521484375,
0.0555419921875,
-0.008636474609375,
0.037750244140625,
0.0133056640625,
-0.036376953125,
-0.0211029052734375,
-0.030181884765625,
-0.0367431640625,
-0.0278472900390625,
0.024200439453125,
0.0251617431640625,
0.0275726318359375,
0.0301361083984375,
0.069091796875,
0.015533447265625,
-0.00826263427734375,
0.00844573974609375,
-0.037017822265625,
-0.00787353515625,
0.0014982223510742188,
-0.0435791015625,
-0.054534912109375,
-0.0094146728515625,
0.0770263671875,
0.032073974609375,
-0.0018892288208007812,
0.03326416015625,
-0.00475311279296875,
0.06634521484375,
-0.0222625732421875,
0.005474090576171875,
-0.046356201171875,
0.00565338134765625,
-0.0247955322265625,
-0.0231781005859375,
-0.0526123046875,
-0.0180206298828125,
0.01019287109375,
-0.0496826171875,
-0.00327301025390625,
0.00940704345703125,
0.11126708984375,
0.0205230712890625,
-0.0270233154296875,
-0.01061248779296875,
-0.045989990234375,
0.07684326171875,
-0.057769775390625,
0.0426025390625,
0.031402587890625,
0.017303466796875,
0.0170135498046875,
-0.03997802734375,
-0.019317626953125,
0.0115203857421875,
-0.01483154296875,
0.017120361328125,
-0.00513458251953125,
-0.020263671875,
0.022674560546875,
0.055999755859375,
-0.05804443359375,
0.0026569366455078125,
-0.043365478515625,
0.0035495758056640625,
0.051605224609375,
0.00783538818359375,
0.0139617919921875,
-0.0118408203125,
-0.032318115234375,
-0.038116455078125,
-0.057708740234375,
0.00815582275390625,
0.029083251953125,
0.02301025390625,
-0.034912109375,
0.050048828125,
-0.0114898681640625,
0.044158935546875,
0.0006809234619140625,
0.0010471343994140625,
0.07550048828125,
-0.0279693603515625,
-0.0264434814453125,
-0.012786865234375,
0.0906982421875,
0.027801513671875,
0.00614166259765625,
0.0015192031860351562,
-0.0171356201171875,
-0.0197601318359375,
0.00936126708984375,
-0.070556640625,
-0.0035686492919921875,
0.015411376953125,
-0.036285400390625,
-0.011871337890625,
0.005462646484375,
-0.044281005859375,
0.0166015625,
-0.0306396484375,
0.045196533203125,
-0.05108642578125,
-0.023162841796875,
0.0269775390625,
0.0020809173583984375,
0.031707763671875,
-0.0028209686279296875,
-0.048614501953125,
0.01096343994140625,
0.032501220703125,
0.05584716796875,
-0.03033447265625,
-0.0211944580078125,
-0.032318115234375,
-0.01422119140625,
-0.01067352294921875,
0.04608154296875,
-0.0062255859375,
-0.034271240234375,
-0.004241943359375,
0.037689208984375,
-0.0298004150390625,
-0.0268707275390625,
0.09686279296875,
-0.0207672119140625,
0.051116943359375,
-0.030242919921875,
-0.039276123046875,
-0.024383544921875,
0.0374755859375,
-0.042755126953125,
0.09588623046875,
0.00612640380859375,
-0.0626220703125,
0.0163116455078125,
-0.060882568359375,
-0.01515960693359375,
-0.0020751953125,
0.005687713623046875,
-0.047607421875,
0.00972747802734375,
0.007659912109375,
0.0276641845703125,
-0.024810791015625,
0.02801513671875,
-0.00044274330139160156,
-0.0234832763671875,
0.0053863525390625,
-0.0266571044921875,
0.0760498046875,
0.0233917236328125,
-0.017852783203125,
0.017181396484375,
-0.06988525390625,
-0.0027446746826171875,
0.002742767333984375,
-0.0355224609375,
-0.01456451416015625,
0.00647735595703125,
0.0196380615234375,
0.0108184814453125,
0.023223876953125,
-0.04681396484375,
0.0181884765625,
-0.047637939453125,
0.01120758056640625,
0.046783447265625,
-0.0184783935546875,
0.02691650390625,
-0.035125732421875,
0.02490234375,
0.00827789306640625,
0.0095367431640625,
0.0004031658172607422,
-0.033172607421875,
-0.062103271484375,
-0.0186309814453125,
0.04962158203125,
0.07806396484375,
-0.053985595703125,
0.06341552734375,
-0.055694580078125,
-0.061370849609375,
-0.060546875,
-0.0116119384765625,
0.033294677734375,
0.023681640625,
0.03704833984375,
-0.0133056640625,
-0.03167724609375,
-0.08294677734375,
-0.01412200927734375,
-0.00992584228515625,
-0.02099609375,
0.01456451416015625,
0.046661376953125,
-0.011749267578125,
0.037750244140625,
-0.0352783203125,
-0.0286865234375,
-0.0110626220703125,
0.0090789794921875,
0.039459228515625,
0.049285888671875,
0.04022216796875,
-0.064697265625,
-0.043365478515625,
0.00006264448165893555,
-0.05694580078125,
-0.009796142578125,
0.006015777587890625,
-0.0217437744140625,
0.0028533935546875,
0.003658294677734375,
-0.0234832763671875,
0.005664825439453125,
0.047637939453125,
-0.044830322265625,
0.033935546875,
-0.00759124755859375,
0.0223388671875,
-0.10205078125,
0.0128326416015625,
-0.01032257080078125,
-0.00499725341796875,
-0.03173828125,
0.0024013519287109375,
0.02020263671875,
0.004650115966796875,
-0.06317138671875,
0.03851318359375,
-0.0169830322265625,
0.0004096031188964844,
0.0221405029296875,
0.0006628036499023438,
0.006500244140625,
0.055389404296875,
-0.003398895263671875,
0.057830810546875,
0.053955078125,
-0.038055419921875,
0.0130767822265625,
0.042205810546875,
-0.03216552734375,
0.03424072265625,
-0.059783935546875,
-0.0206146240234375,
0.024017333984375,
-0.00952911376953125,
-0.04473876953125,
0.006015777587890625,
0.0233001708984375,
-0.04736328125,
0.03173828125,
-0.00989532470703125,
-0.05694580078125,
0.0008897781372070312,
-0.022186279296875,
0.038726806640625,
0.05230712890625,
-0.01849365234375,
0.04888916015625,
0.006866455078125,
0.0012664794921875,
-0.03131103515625,
-0.07391357421875,
-0.0117034912109375,
-0.031402587890625,
-0.05572509765625,
0.020050048828125,
-0.030914306640625,
-0.0026836395263671875,
0.0020198822021484375,
0.0238800048828125,
-0.007251739501953125,
0.004726409912109375,
0.00971221923828125,
0.01666259765625,
-0.03619384765625,
0.01522064208984375,
0.0015802383422851562,
-0.0135345458984375,
-0.00994110107421875,
-0.00844573974609375,
0.04217529296875,
-0.0236968994140625,
-0.018890380859375,
-0.047882080078125,
0.006092071533203125,
0.0478515625,
-0.03466796875,
0.06396484375,
0.0433349609375,
-0.0111846923828125,
0.0120391845703125,
-0.0283050537109375,
0.00868988037109375,
-0.03125,
0.00930023193359375,
-0.03656005859375,
-0.053009033203125,
0.040313720703125,
0.004566192626953125,
0.0311126708984375,
0.06329345703125,
0.048736572265625,
0.00830078125,
0.048248291015625,
0.0177154541015625,
0.0013322830200195312,
0.0321044921875,
-0.038543701171875,
-0.01053619384765625,
-0.082763671875,
0.0036640167236328125,
-0.0538330078125,
-0.0255126953125,
-0.061065673828125,
-0.0189208984375,
0.0168914794921875,
0.002773284912109375,
-0.0180206298828125,
0.045440673828125,
-0.0430908203125,
0.017486572265625,
0.044921875,
-0.00616455078125,
0.02288818359375,
-0.0020275115966796875,
-0.037078857421875,
-0.0152435302734375,
-0.03253173828125,
-0.040802001953125,
0.09893798828125,
0.0282440185546875,
0.02392578125,
0.0192108154296875,
0.036285400390625,
-0.00388336181640625,
0.0189361572265625,
-0.045257568359375,
0.031982421875,
-0.0170135498046875,
-0.05609130859375,
-0.0211029052734375,
-0.042938232421875,
-0.062347412109375,
0.038360595703125,
-0.021270751953125,
-0.0379638671875,
0.01258087158203125,
-0.0023593902587890625,
-0.009368896484375,
0.033172607421875,
-0.054443359375,
0.08697509765625,
-0.00885009765625,
-0.009307861328125,
0.0207672119140625,
-0.03094482421875,
0.0216522216796875,
-0.00214385986328125,
0.0201263427734375,
-0.015777587890625,
0.01050567626953125,
0.04937744140625,
-0.006256103515625,
0.0312042236328125,
-0.0034465789794921875,
-0.00585174560546875,
0.0029315948486328125,
0.006664276123046875,
0.029327392578125,
-0.00423431396484375,
-0.03350830078125,
0.0304412841796875,
0.003276824951171875,
-0.034759521484375,
-0.01117706298828125,
0.04718017578125,
-0.05328369140625,
-0.004322052001953125,
-0.029510498046875,
-0.048065185546875,
0.0012273788452148438,
0.0247650146484375,
0.05560302734375,
0.054046630859375,
-0.021575927734375,
0.04168701171875,
0.06549072265625,
-0.02313232421875,
0.03033447265625,
0.052490234375,
-0.01444244384765625,
-0.0396728515625,
0.0643310546875,
0.0113067626953125,
0.0256500244140625,
0.043182373046875,
0.004428863525390625,
-0.01104736328125,
-0.056854248046875,
-0.053070068359375,
0.021942138671875,
-0.0191650390625,
-0.01552581787109375,
-0.041412353515625,
-0.004917144775390625,
-0.01560211181640625,
0.021270751953125,
-0.0413818359375,
-0.040374755859375,
-0.01534271240234375,
-0.0180206298828125,
0.0148468017578125,
0.0141143798828125,
0.0014715194702148438,
0.031982421875,
-0.0767822265625,
0.013153076171875,
-0.006465911865234375,
0.031494140625,
-0.03216552734375,
-0.05908203125,
-0.035888671875,
0.0026988983154296875,
-0.048614501953125,
-0.051727294921875,
0.03936767578125,
0.00997161865234375,
0.017333984375,
0.02557373046875,
0.0160369873046875,
0.0261383056640625,
-0.051116943359375,
0.071044921875,
-0.006488800048828125,
-0.053009033203125,
0.036224365234375,
-0.035308837890625,
0.038543701171875,
0.06884765625,
0.02252197265625,
-0.0247802734375,
-0.03826904296875,
-0.05462646484375,
-0.05950927734375,
0.05908203125,
0.0523681640625,
-0.00951385498046875,
0.01708984375,
-0.008331298828125,
-0.0030765533447265625,
0.0096893310546875,
-0.08233642578125,
-0.0274505615234375,
0.006252288818359375,
-0.0308074951171875,
-0.01104736328125,
-0.0213623046875,
-0.0210418701171875,
-0.01446533203125,
0.0843505859375,
0.01239776611328125,
0.01201629638671875,
0.0307464599609375,
-0.0118560791015625,
-0.01485443115234375,
0.0244140625,
0.07257080078125,
0.04217529296875,
-0.0419921875,
-0.01557159423828125,
0.0218505859375,
-0.02801513671875,
-0.0114898681640625,
0.006465911865234375,
-0.0318603515625,
0.0237579345703125,
0.031524658203125,
0.080810546875,
0.0141143798828125,
-0.04638671875,
0.03460693359375,
-0.032073974609375,
-0.035003662109375,
-0.05145263671875,
-0.01239013671875,
0.0095062255859375,
0.00310516357421875,
0.020721435546875,
0.01136016845703125,
0.01210784912109375,
-0.01261138916015625,
0.005382537841796875,
0.005863189697265625,
-0.051116943359375,
-0.04229736328125,
0.03265380859375,
0.01235198974609375,
-0.0323486328125,
0.040008544921875,
-0.0289154052734375,
-0.037567138671875,
0.02777099609375,
0.0088958740234375,
0.07769775390625,
-0.017181396484375,
-0.018463134765625,
0.05523681640625,
0.045379638671875,
-0.0166473388671875,
0.033050537109375,
0.0128021240234375,
-0.054046630859375,
-0.0389404296875,
-0.061065673828125,
-0.01052093505859375,
0.0115203857421875,
-0.064208984375,
0.0293121337890625,
0.022308349609375,
0.0007457733154296875,
-0.0265960693359375,
0.0140228271484375,
-0.039703369140625,
0.0088958740234375,
-0.0222320556640625,
0.07830810546875,
-0.072265625,
0.06451416015625,
0.0357666015625,
-0.0184326171875,
-0.0626220703125,
-0.019744873046875,
-0.01560211181640625,
-0.030487060546875,
0.0406494140625,
0.01499176025390625,
0.0269775390625,
-0.0083770751953125,
-0.01371002197265625,
-0.057373046875,
0.08380126953125,
0.0148773193359375,
-0.043975830078125,
0.0028743743896484375,
0.007534027099609375,
0.0377197265625,
-0.02978515625,
0.00981903076171875,
0.0291748046875,
0.05340576171875,
0.0053863525390625,
-0.08160400390625,
-0.02032470703125,
-0.042633056640625,
-0.0233001708984375,
0.040863037109375,
-0.045013427734375,
0.0675048828125,
0.03460693359375,
-0.01280975341796875,
0.00614166259765625,
0.042633056640625,
0.0229949951171875,
0.02435302734375,
0.0361328125,
0.0921630859375,
0.02813720703125,
-0.03948974609375,
0.0772705078125,
-0.0264434814453125,
0.041534423828125,
0.0836181640625,
-0.0014562606811523438,
0.06951904296875,
0.0271148681640625,
-0.00787353515625,
0.0372314453125,
0.05133056640625,
-0.020965576171875,
0.0367431640625,
0.003692626953125,
0.013153076171875,
-0.01189422607421875,
0.01267242431640625,
-0.054229736328125,
0.0181732177734375,
0.01552581787109375,
-0.021484375,
0.006755828857421875,
-0.0036830902099609375,
0.002849578857421875,
-0.00429534912109375,
-0.01010894775390625,
0.04864501953125,
-0.00269317626953125,
-0.044708251953125,
0.0560302734375,
-0.006099700927734375,
0.049591064453125,
-0.051300048828125,
0.0095672607421875,
-0.00437164306640625,
0.01708984375,
-0.0004858970642089844,
-0.0374755859375,
0.033599853515625,
-0.00299835205078125,
-0.0228118896484375,
-0.03338623046875,
0.01056671142578125,
-0.041534423828125,
-0.06756591796875,
0.027618408203125,
0.0306854248046875,
0.0229339599609375,
0.003780364990234375,
-0.0635986328125,
0.004711151123046875,
0.008880615234375,
-0.04644775390625,
0.0055694580078125,
0.054779052734375,
0.025604248046875,
0.035064697265625,
0.047943115234375,
0.015472412109375,
0.01519012451171875,
-0.0048065185546875,
0.047637939453125,
-0.031585693359375,
-0.03173828125,
-0.060577392578125,
0.06048583984375,
-0.01123809814453125,
-0.052276611328125,
0.05291748046875,
0.0787353515625,
0.07611083984375,
-0.01355743408203125,
0.0172119140625,
0.000036597251892089844,
0.05755615234375,
-0.0477294921875,
0.04638671875,
-0.06793212890625,
0.0183258056640625,
-0.005298614501953125,
-0.06610107421875,
-0.021697998046875,
0.021575927734375,
-0.01531982421875,
-0.0298919677734375,
0.061279296875,
0.049713134765625,
-0.0155487060546875,
-0.0174560546875,
0.0217437744140625,
0.0229339599609375,
0.017120361328125,
0.042999267578125,
0.0284271240234375,
-0.07574462890625,
0.04144287109375,
-0.0172271728515625,
-0.0041961669921875,
-0.004512786865234375,
-0.051239013671875,
-0.0628662109375,
-0.044281005859375,
-0.01537322998046875,
-0.0177001953125,
-0.021636962890625,
0.06378173828125,
0.042633056640625,
-0.06756591796875,
-0.040924072265625,
0.00481414794921875,
0.01256561279296875,
-0.0115203857421875,
-0.0189208984375,
0.044097900390625,
-0.0237884521484375,
-0.0758056640625,
0.0322265625,
0.007648468017578125,
-0.00553131103515625,
-0.00003069639205932617,
-0.0234222412109375,
-0.040435791015625,
-0.00041985511779785156,
0.0184478759765625,
0.002857208251953125,
-0.039306640625,
0.00879669189453125,
0.006748199462890625,
-0.006717681884765625,
0.027618408203125,
0.026702880859375,
-0.01910400390625,
0.019744873046875,
0.061187744140625,
0.02838134765625,
0.03436279296875,
-0.00661468505859375,
0.039306640625,
-0.051025390625,
0.0281829833984375,
0.0164337158203125,
0.044281005859375,
0.0280914306640625,
-0.0029201507568359375,
0.06280517578125,
0.0186920166015625,
-0.049041748046875,
-0.08294677734375,
0.0045928955078125,
-0.08856201171875,
0.0025691986083984375,
0.06744384765625,
-0.02447509765625,
-0.0251617431640625,
0.0291748046875,
-0.00849151611328125,
0.00823974609375,
-0.0268707275390625,
0.0298919677734375,
0.064208984375,
0.0313720703125,
0.00714874267578125,
-0.05084228515625,
0.0285797119140625,
0.0389404296875,
-0.053314208984375,
-0.0161590576171875,
0.0105743408203125,
0.0102081298828125,
0.0306396484375,
0.033233642578125,
-0.0245513916015625,
0.005474090576171875,
-0.02471923828125,
0.0295867919921875,
-0.00453948974609375,
-0.0135345458984375,
-0.0262451171875,
0.004245758056640625,
-0.0033416748046875,
-0.024566650390625
]
] |
facebook/convnextv2-tiny-22k-384 | 2023-09-26T17:19:37.000Z | [
"transformers",
"pytorch",
"tf",
"convnextv2",
"image-classification",
"vision",
"dataset:imagenet-22k",
"arxiv:2301.00808",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | facebook | null | null | facebook/convnextv2-tiny-22k-384 | 1 | 53,224 | transformers | 2023-02-19T07:24:50 | ---
license: apache-2.0
tags:
- vision
- image-classification
datasets:
- imagenet-22k
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg
example_title: Tiger
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg
example_title: Teapot
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg
example_title: Palace
---
# ConvNeXt V2 (tiny-sized model)
ConvNeXt V2 model pretrained using the FCMAE framework and fine-tuned on the ImageNet-22K dataset at resolution 384x384. It was introduced in the paper [ConvNeXt V2: Co-designing and Scaling ConvNets with Masked Autoencoders](https://arxiv.org/abs/2301.00808) by Woo et al. and first released in [this repository](https://github.com/facebookresearch/ConvNeXt-V2).
Disclaimer: The team releasing ConvNeXt V2 did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
ConvNeXt V2 is a pure convolutional model (ConvNet) that introduces a fully convolutional masked autoencoder framework (FCMAE) and a new Global Response Normalization (GRN) layer to ConvNeXt. ConvNeXt V2 significantly improves the performance of pure ConvNets on various recognition benchmarks.
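As a rough illustration of the GRN layer described above — a sketch, not the official implementation (which lives in the authors' repository and in `transformers`) — it can be written in PyTorch roughly as follows; the channels-last tensor layout and the epsilon value are assumptions:

```python
import torch
import torch.nn as nn

class GRN(nn.Module):
    """Global Response Normalization sketch; expects channels-last input (N, H, W, C)."""
    def __init__(self, dim):
        super().__init__()
        # gamma/beta start at zero, so the layer is an identity at initialization
        self.gamma = nn.Parameter(torch.zeros(1, 1, 1, dim))
        self.beta = nn.Parameter(torch.zeros(1, 1, 1, dim))

    def forward(self, x):
        # Aggregate: L2 norm over the spatial dimensions, one value per channel
        gx = torch.norm(x, p=2, dim=(1, 2), keepdim=True)
        # Normalize: divisive normalization across the channel dimension
        nx = gx / (gx.mean(dim=-1, keepdim=True) + 1e-6)
        # Calibrate with learnable scale/shift, plus a residual connection
        return self.gamma * (x * nx) + self.beta + x
```

Because `gamma` and `beta` are zero-initialized, the residual term dominates at the start of training and the normalization is learned gradually.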

## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=convnextv2) to look for
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to classify an image from the COCO 2017 dataset into one of the 21,841 ImageNet-22k classes:
```python
from transformers import AutoImageProcessor, ConvNextV2ForImageClassification
import torch
from datasets import load_dataset
dataset = load_dataset("huggingface/cats-image")
image = dataset["test"]["image"][0]
preprocessor = AutoImageProcessor.from_pretrained("facebook/convnextv2-tiny-22k-384")
model = ConvNextV2ForImageClassification.from_pretrained("facebook/convnextv2-tiny-22k-384")
inputs = preprocessor(image, return_tensors="pt")
with torch.no_grad():
logits = model(**inputs).logits
# model predicts one of the 21,841 ImageNet-22k classes
predicted_label = logits.argmax(-1).item()
print(model.config.id2label[predicted_label])
```
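If you want more than the single top class, a common pattern is to softmax the logits and take the top-k entries. The snippet below uses a tiny synthetic `logits` tensor and a made-up `id2label` mapping purely for illustration; in practice you would reuse `logits` and `model.config.id2label` from the example above:

```python
import torch

# Synthetic stand-ins for `logits` and `model.config.id2label` from the example above
logits = torch.tensor([[0.1, 2.0, -1.0, 0.5]])
id2label = {0: "cat", 1: "dog", 2: "fish", 3: "bird"}

probs = torch.softmax(logits, dim=-1)        # convert logits to probabilities
top_probs, top_idx = torch.topk(probs, k=3)  # top-3 classes by probability
for p, i in zip(top_probs[0], top_idx[0]):
    print(f"{id2label[i.item()]}: {p.item():.3f}")
```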
For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/convnextv2).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2301-00808,
author = {Sanghyun Woo and
Shoubhik Debnath and
Ronghang Hu and
Xinlei Chen and
Zhuang Liu and
In So Kweon and
Saining Xie},
title = {ConvNeXt {V2:} Co-designing and Scaling ConvNets with Masked Autoencoders},
journal = {CoRR},
volume = {abs/2301.00808},
year = {2023},
url = {https://doi.org/10.48550/arXiv.2301.00808},
doi = {10.48550/arXiv.2301.00808},
eprinttype = {arXiv},
eprint = {2301.00808},
timestamp = {Tue, 10 Jan 2023 15:10:12 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2301-00808.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 3,374 | [
[
-0.052459716796875,
-0.027801513671875,
-0.026580810546875,
0.01293182373046875,
-0.027008056640625,
-0.021026611328125,
-0.013031005859375,
-0.06072998046875,
0.0245361328125,
0.0305633544921875,
-0.0421142578125,
-0.006984710693359375,
-0.0430908203125,
-0.006160736083984375,
-0.016571044921875,
0.058868408203125,
0.0058746337890625,
0.0033855438232421875,
-0.0225067138671875,
-0.0178375244140625,
-0.0305328369140625,
-0.033660888671875,
-0.06756591796875,
-0.0240478515625,
0.0164794921875,
0.031829833984375,
0.047149658203125,
0.05059814453125,
0.052978515625,
0.0263519287109375,
-0.0008273124694824219,
0.010833740234375,
-0.0267791748046875,
-0.0247955322265625,
0.0106964111328125,
-0.0206756591796875,
-0.02960205078125,
0.0070343017578125,
0.017364501953125,
0.025848388671875,
0.0180206298828125,
0.026092529296875,
0.015106201171875,
0.037384033203125,
-0.0202178955078125,
0.021240234375,
-0.035369873046875,
0.009124755859375,
0.00659942626953125,
0.00839996337890625,
-0.02978515625,
-0.004970550537109375,
0.0165252685546875,
-0.043548583984375,
0.040802001953125,
0.01444244384765625,
0.0859375,
0.0182952880859375,
-0.02764892578125,
-0.00001537799835205078,
-0.033233642578125,
0.046661376953125,
-0.045562744140625,
0.022705078125,
0.0070648193359375,
0.029815673828125,
0.00014293193817138672,
-0.08209228515625,
-0.043426513671875,
-0.00978851318359375,
-0.030364990234375,
0.0089569091796875,
-0.029937744140625,
0.010040283203125,
0.0281524658203125,
0.035491943359375,
-0.059326171875,
0.0248565673828125,
-0.044708251953125,
-0.0299530029296875,
0.06243896484375,
-0.01041412353515625,
0.01190185546875,
-0.0209808349609375,
-0.052642822265625,
-0.0274810791015625,
-0.0299835205078125,
0.0181427001953125,
0.00966644287109375,
-0.0015573501586914062,
-0.04119873046875,
0.034942626953125,
-0.007843017578125,
0.039794921875,
0.0285491943359375,
0.0206756591796875,
0.0267486572265625,
-0.0274505615234375,
-0.031219482421875,
0.0011539459228515625,
0.079345703125,
0.046142578125,
0.0179443359375,
0.013946533203125,
-0.00006574392318725586,
0.0058135986328125,
-0.0003790855407714844,
-0.0848388671875,
-0.04742431640625,
0.01788330078125,
-0.0438232421875,
-0.0250091552734375,
0.008026123046875,
-0.043487548828125,
-0.01351165771484375,
-0.0271453857421875,
0.039642333984375,
-0.003597259521484375,
-0.03912353515625,
-0.0011510848999023438,
-0.012115478515625,
0.02508544921875,
0.0225982666015625,
-0.052703857421875,
0.03240966796875,
0.0304412841796875,
0.064208984375,
-0.005504608154296875,
0.0059814453125,
-0.01047515869140625,
-0.037506103515625,
-0.0214691162109375,
0.038543701171875,
0.0057830810546875,
-0.003734588623046875,
-0.0158843994140625,
0.036834716796875,
0.00859832763671875,
-0.041351318359375,
0.0287322998046875,
-0.036102294921875,
0.001369476318359375,
-0.015228271484375,
-0.029327392578125,
-0.0182647705078125,
0.01953125,
-0.059539794921875,
0.07598876953125,
0.024322509765625,
-0.0616455078125,
0.0149383544921875,
-0.03125,
-0.0013294219970703125,
-0.0017347335815429688,
0.0016469955444335938,
-0.06182861328125,
-0.01154327392578125,
0.00934600830078125,
0.047515869140625,
-0.021514892578125,
0.016204833984375,
-0.036834716796875,
-0.0205230712890625,
0.005218505859375,
-0.0277099609375,
0.0784912109375,
0.01849365234375,
-0.032562255859375,
-0.0019683837890625,
-0.05126953125,
0.00563812255859375,
0.023681640625,
0.006786346435546875,
-0.0101776123046875,
-0.037322998046875,
0.02294921875,
0.050567626953125,
0.01971435546875,
-0.044189453125,
0.02288818359375,
-0.0269317626953125,
0.036651611328125,
0.042388916015625,
0.00098419189453125,
0.0338134765625,
-0.01552581787109375,
0.0223388671875,
0.00627899169921875,
0.04266357421875,
0.0034122467041015625,
-0.036651611328125,
-0.0703125,
-0.01373291015625,
0.00887298583984375,
0.029144287109375,
-0.058746337890625,
0.025970458984375,
-0.0173187255859375,
-0.05657958984375,
-0.022216796875,
0.0009741783142089844,
0.035675048828125,
0.0251617431640625,
0.040435791015625,
-0.040374755859375,
-0.06781005859375,
-0.08050537109375,
0.007556915283203125,
0.006984710693359375,
-0.000028848648071289062,
0.0294036865234375,
0.048004150390625,
-0.01393890380859375,
0.058868408203125,
-0.015228271484375,
-0.017486572265625,
-0.0030364990234375,
0.001590728759765625,
0.022735595703125,
0.06512451171875,
0.0513916015625,
-0.07647705078125,
-0.04840087890625,
-0.0034618377685546875,
-0.05859375,
0.0168304443359375,
-0.00725555419921875,
-0.013031005859375,
0.01287841796875,
0.04107666015625,
-0.0369873046875,
0.04229736328125,
0.044036865234375,
-0.0292510986328125,
0.0460205078125,
-0.003925323486328125,
-0.01287841796875,
-0.082763671875,
0.00054931640625,
0.0193634033203125,
-0.0176849365234375,
-0.032806396484375,
-0.00890350341796875,
0.0084686279296875,
-0.0024394989013671875,
-0.050018310546875,
0.061553955078125,
-0.049652099609375,
-0.0016193389892578125,
-0.016204833984375,
-0.017242431640625,
0.00939178466796875,
0.0611572265625,
0.01763916015625,
0.0224609375,
0.040496826171875,
-0.04046630859375,
0.050018310546875,
0.0225067138671875,
-0.0243072509765625,
0.02911376953125,
-0.06842041015625,
0.01128387451171875,
0.01108551025390625,
0.03558349609375,
-0.07244873046875,
-0.0179595947265625,
0.02740478515625,
-0.04193115234375,
0.046630859375,
-0.024658203125,
-0.0293731689453125,
-0.06378173828125,
-0.021270751953125,
0.043701171875,
0.034271240234375,
-0.051422119140625,
0.00909423828125,
0.0186767578125,
0.0261383056640625,
-0.041412353515625,
-0.0703125,
-0.00865936279296875,
0.004077911376953125,
-0.05438232421875,
0.0279388427734375,
-0.0159912109375,
0.0041656494140625,
0.010711669921875,
-0.0111083984375,
-0.00231170654296875,
-0.010223388671875,
0.025726318359375,
0.03289794921875,
-0.0214691162109375,
-0.00640106201171875,
-0.0002472400665283203,
-0.0164031982421875,
0.005603790283203125,
-0.041412353515625,
0.0311431884765625,
-0.019622802734375,
0.00010031461715698242,
-0.049102783203125,
0.0106353759765625,
0.0313720703125,
-0.007793426513671875,
0.047760009765625,
0.07177734375,
-0.038238525390625,
-0.009490966796875,
-0.032196044921875,
-0.0284881591796875,
-0.040008544921875,
0.032012939453125,
-0.0207061767578125,
-0.06036376953125,
0.039947509765625,
0.00910186767578125,
0.0005025863647460938,
0.058837890625,
0.042144775390625,
-0.006374359130859375,
0.05316162109375,
0.040863037109375,
0.0238800048828125,
0.047760009765625,
-0.07745361328125,
0.0008502006530761719,
-0.07928466796875,
-0.035614013671875,
-0.0185546875,
-0.04595947265625,
-0.072998046875,
-0.0386962890625,
0.0220184326171875,
0.003986358642578125,
-0.040069580078125,
0.058685302734375,
-0.07232666015625,
0.0159759521484375,
0.048187255859375,
0.0259552001953125,
-0.0179443359375,
0.01247406005859375,
0.0013589859008789062,
0.0025424957275390625,
-0.061065673828125,
-0.0029048919677734375,
0.0677490234375,
0.031524658203125,
0.0377197265625,
-0.005634307861328125,
0.0272369384765625,
0.0130615234375,
0.0230712890625,
-0.054229736328125,
0.0308074951171875,
-0.0259857177734375,
-0.0655517578125,
-0.00768280029296875,
-0.0147857666015625,
-0.070068359375,
0.0129547119140625,
-0.021026611328125,
-0.051971435546875,
0.06695556640625,
0.024566650390625,
-0.01428985595703125,
0.028778076171875,
-0.056396484375,
0.07958984375,
-0.01293182373046875,
-0.0401611328125,
0.01190185546875,
-0.07073974609375,
0.0289459228515625,
0.01293182373046875,
-0.003936767578125,
0.00670623779296875,
0.0202178955078125,
0.061309814453125,
-0.054718017578125,
0.06817626953125,
-0.0180816650390625,
0.0270233154296875,
0.05169677734375,
0.0013799667358398438,
0.048248291015625,
0.004276275634765625,
-0.00157928466796875,
0.040435791015625,
0.0103759765625,
-0.034637451171875,
-0.041839599609375,
0.051971435546875,
-0.0699462890625,
-0.0201873779296875,
-0.030792236328125,
-0.0222015380859375,
0.0104827880859375,
0.01007080078125,
0.06610107421875,
0.051361083984375,
0.00313568115234375,
0.039825439453125,
0.048309326171875,
-0.0118255615234375,
0.036865234375,
0.005680084228515625,
0.0007653236389160156,
-0.031890869140625,
0.061981201171875,
0.027618408203125,
0.0330810546875,
0.026947021484375,
0.01178741455078125,
-0.0280609130859375,
-0.015625,
-0.0287322998046875,
0.01629638671875,
-0.043609619140625,
-0.04052734375,
-0.0526123046875,
-0.04150390625,
-0.035491943359375,
-0.0171051025390625,
-0.046295166015625,
-0.035919189453125,
-0.03485107421875,
0.00362396240234375,
0.034423828125,
0.029296875,
-0.029754638671875,
0.028778076171875,
-0.0200042724609375,
0.012725830078125,
0.022552490234375,
0.0261993408203125,
0.0015096664428710938,
-0.053802490234375,
-0.030303955078125,
0.00870513916015625,
-0.02691650390625,
-0.032623291015625,
0.035675048828125,
0.0087890625,
0.025909423828125,
0.025787353515625,
0.00859832763671875,
0.035003662109375,
-0.006511688232421875,
0.050750732421875,
0.04486083984375,
-0.043426513671875,
0.0293121337890625,
-0.00732421875,
0.01007080078125,
0.0179443359375,
0.0209197998046875,
-0.036651611328125,
0.0026836395263671875,
-0.07440185546875,
-0.06329345703125,
0.053466796875,
0.0199127197265625,
0.008697509765625,
0.017974853515625,
0.03656005859375,
0.0016012191772460938,
0.0032482147216796875,
-0.0609130859375,
-0.041534423828125,
-0.04400634765625,
-0.01739501953125,
-0.00963592529296875,
-0.031585693359375,
0.0082855224609375,
-0.045623779296875,
0.04730224609375,
-0.009857177734375,
0.06182861328125,
0.0292205810546875,
0.0010738372802734375,
-0.003204345703125,
-0.0310516357421875,
0.032073974609375,
0.004955291748046875,
-0.0217742919921875,
0.0087738037109375,
-0.0020046234130859375,
-0.044891357421875,
0.0019588470458984375,
0.01337432861328125,
-0.007598876953125,
0.00920867919921875,
0.031829833984375,
0.07843017578125,
0.00897216796875,
0.00439453125,
0.05438232421875,
-0.0161590576171875,
-0.03082275390625,
-0.04150390625,
0.005794525146484375,
-0.0110931396484375,
0.021270751953125,
0.00799560546875,
0.03912353515625,
0.0137481689453125,
-0.0242919921875,
0.0294647216796875,
0.0256500244140625,
-0.04229736328125,
-0.026580810546875,
0.0606689453125,
0.00360107421875,
-0.00801849365234375,
0.05548095703125,
-0.0196990966796875,
-0.032196044921875,
0.0911865234375,
0.03851318359375,
0.06732177734375,
-0.005603790283203125,
0.00830078125,
0.07244873046875,
0.033233642578125,
-0.006389617919921875,
-0.00450897216796875,
0.004009246826171875,
-0.05499267578125,
-0.00782012939453125,
-0.038543701171875,
-0.0032901763916015625,
0.021514892578125,
-0.048797607421875,
0.039825439453125,
-0.03887939453125,
-0.01055908203125,
0.002353668212890625,
0.0367431640625,
-0.087890625,
0.039581298828125,
0.0113983154296875,
0.079345703125,
-0.05914306640625,
0.06915283203125,
0.036865234375,
-0.0341796875,
-0.0855712890625,
-0.037567138671875,
0.0017004013061523438,
-0.04742431640625,
0.036224365234375,
0.0273590087890625,
0.0256500244140625,
0.01171875,
-0.078857421875,
-0.06646728515625,
0.100341796875,
0.0152435302734375,
-0.0465087890625,
0.0170440673828125,
-0.01049041748046875,
0.0374755859375,
-0.0291900634765625,
0.03607177734375,
0.01174163818359375,
0.027099609375,
0.0273590087890625,
-0.05780029296875,
0.0166168212890625,
-0.0380859375,
0.0114898681640625,
-0.00849151611328125,
-0.07720947265625,
0.06695556640625,
-0.017608642578125,
0.0032558441162109375,
0.0046234130859375,
0.061279296875,
-0.0030612945556640625,
0.0279388427734375,
0.031005859375,
0.030181884765625,
0.03973388671875,
-0.0208282470703125,
0.0823974609375,
-0.000522613525390625,
0.053131103515625,
0.0758056640625,
0.035797119140625,
0.034942626953125,
0.0158538818359375,
-0.00612640380859375,
0.030914306640625,
0.0819091796875,
-0.03350830078125,
0.0328369140625,
0.01203155517578125,
0.006214141845703125,
-0.01434326171875,
-0.0032482147216796875,
-0.038726806640625,
0.03070068359375,
0.0196990966796875,
-0.03106689453125,
0.00954437255859375,
0.01641845703125,
0.007610321044921875,
-0.0279083251953125,
-0.018768310546875,
0.03497314453125,
0.0197296142578125,
-0.035003662109375,
0.064697265625,
-0.006862640380859375,
0.05596923828125,
-0.025787353515625,
-0.0009250640869140625,
-0.0293731689453125,
0.0182037353515625,
-0.022369384765625,
-0.050811767578125,
0.0228729248046875,
-0.0244598388671875,
-0.01273345947265625,
0.0055694580078125,
0.057037353515625,
-0.03033447265625,
-0.04693603515625,
0.0218505859375,
-0.00235748291015625,
0.01568603515625,
-0.0002779960632324219,
-0.07403564453125,
0.017669677734375,
-0.00560760498046875,
-0.0374755859375,
0.01207733154296875,
0.028167724609375,
-0.0149688720703125,
0.04254150390625,
0.048095703125,
-0.0176544189453125,
0.006320953369140625,
-0.0252838134765625,
0.06573486328125,
-0.02130126953125,
-0.0174560546875,
-0.046539306640625,
0.044586181640625,
-0.018890380859375,
-0.0286865234375,
0.04449462890625,
0.0509033203125,
0.07879638671875,
-0.01294708251953125,
0.048980712890625,
-0.02801513671875,
-0.0014801025390625,
-0.017059326171875,
0.045013427734375,
-0.05914306640625,
-0.007503509521484375,
-0.01568603515625,
-0.0462646484375,
-0.030914306640625,
0.03985595703125,
-0.01357269287109375,
0.0164947509765625,
0.03802490234375,
0.07318115234375,
-0.019378662109375,
-0.016693115234375,
0.0200042724609375,
0.0260772705078125,
0.0194854736328125,
0.04327392578125,
0.016693115234375,
-0.0770263671875,
0.0335693359375,
-0.05780029296875,
-0.0170745849609375,
-0.0438232421875,
-0.0478515625,
-0.060516357421875,
-0.058319091796875,
-0.04290771484375,
-0.057708740234375,
-0.0157623291015625,
0.07257080078125,
0.08245849609375,
-0.0654296875,
-0.00223541259765625,
-0.0171356201171875,
0.0004317760467529297,
-0.03472900390625,
-0.0177154541015625,
0.051605224609375,
0.01385498046875,
-0.055419921875,
-0.0193023681640625,
-0.002033233642578125,
0.0166168212890625,
-0.01180267333984375,
-0.021942138671875,
-0.01139068603515625,
-0.004436492919921875,
0.043792724609375,
0.03692626953125,
-0.036865234375,
-0.0245361328125,
0.0028133392333984375,
-0.019683837890625,
0.00919342041015625,
0.035003662109375,
-0.04400634765625,
0.04803466796875,
0.036651611328125,
0.019256591796875,
0.056854248046875,
-0.019134521484375,
0.00685882568359375,
-0.05474853515625,
0.040618896484375,
0.00518798828125,
0.0202789306640625,
0.0238494873046875,
-0.04296875,
0.045135498046875,
0.04150390625,
-0.043487548828125,
-0.051971435546875,
0.0036525726318359375,
-0.10748291015625,
-0.00673675537109375,
0.0897216796875,
-0.006916046142578125,
-0.02960205078125,
0.003826141357421875,
-0.0134124755859375,
0.0421142578125,
-0.00745391845703125,
0.0289306640625,
0.026580810546875,
0.001049041748046875,
-0.03912353515625,
-0.041839599609375,
0.041839599609375,
-0.00714874267578125,
-0.033111572265625,
-0.025360107421875,
0.0146636962890625,
0.026947021484375,
0.00826263427734375,
0.032867431640625,
-0.00791168212890625,
0.0281524658203125,
0.015716552734375,
0.033355712890625,
-0.0275421142578125,
-0.01070404052734375,
-0.01549530029296875,
-0.009735107421875,
-0.0254669189453125,
-0.0338134765625
]
] |
BAAI/bge-large-en | 2023-10-12T03:35:38.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"feature-extraction",
"mteb",
"sentence-transfomres",
"en",
"arxiv:2310.07554",
"arxiv:2309.07597",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | feature-extraction | BAAI | null | null | BAAI/bge-large-en | 118 | 52,824 | transformers | 2023-08-02T07:11:51 | ---
tags:
- mteb
- sentence-transfomres
- transformers
model-index:
- name: bge-large-en
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 76.94029850746269
- type: ap
value: 40.00228964744091
- type: f1
value: 70.86088267934595
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 91.93745
- type: ap
value: 88.24758534667426
- type: f1
value: 91.91033034217591
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 46.158
- type: f1
value: 45.78935185074774
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.972
- type: map_at_10
value: 54.874
- type: map_at_100
value: 55.53399999999999
- type: map_at_1000
value: 55.539
- type: map_at_3
value: 51.031000000000006
- type: map_at_5
value: 53.342999999999996
- type: mrr_at_1
value: 40.541
- type: mrr_at_10
value: 55.096000000000004
- type: mrr_at_100
value: 55.75599999999999
- type: mrr_at_1000
value: 55.761
- type: mrr_at_3
value: 51.221000000000004
- type: mrr_at_5
value: 53.568000000000005
- type: ndcg_at_1
value: 39.972
- type: ndcg_at_10
value: 62.456999999999994
- type: ndcg_at_100
value: 65.262
- type: ndcg_at_1000
value: 65.389
- type: ndcg_at_3
value: 54.673
- type: ndcg_at_5
value: 58.80499999999999
- type: precision_at_1
value: 39.972
- type: precision_at_10
value: 8.634
- type: precision_at_100
value: 0.9860000000000001
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 21.740000000000002
- type: precision_at_5
value: 15.036
- type: recall_at_1
value: 39.972
- type: recall_at_10
value: 86.344
- type: recall_at_100
value: 98.578
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 65.22
- type: recall_at_5
value: 75.178
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 48.94652870403906
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 43.17257160340209
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 63.97867370559182
- type: mrr
value: 77.00820032537484
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 80.00986015960616
- type: cos_sim_spearman
value: 80.36387933827882
- type: euclidean_pearson
value: 80.32305287257296
- type: euclidean_spearman
value: 82.0524720308763
- type: manhattan_pearson
value: 80.19847473906454
- type: manhattan_spearman
value: 81.87957652506985
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 88.00000000000001
- type: f1
value: 87.99039027511853
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 41.36932844640705
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 38.34983239611985
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackAndroidRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.257999999999996
- type: map_at_10
value: 42.937
- type: map_at_100
value: 44.406
- type: map_at_1000
value: 44.536
- type: map_at_3
value: 39.22
- type: map_at_5
value: 41.458
- type: mrr_at_1
value: 38.769999999999996
- type: mrr_at_10
value: 48.701
- type: mrr_at_100
value: 49.431000000000004
- type: mrr_at_1000
value: 49.476
- type: mrr_at_3
value: 45.875
- type: mrr_at_5
value: 47.67
- type: ndcg_at_1
value: 38.769999999999996
- type: ndcg_at_10
value: 49.35
- type: ndcg_at_100
value: 54.618
- type: ndcg_at_1000
value: 56.655
- type: ndcg_at_3
value: 43.826
- type: ndcg_at_5
value: 46.72
- type: precision_at_1
value: 38.769999999999996
- type: precision_at_10
value: 9.328
- type: precision_at_100
value: 1.484
- type: precision_at_1000
value: 0.196
- type: precision_at_3
value: 20.649
- type: precision_at_5
value: 15.25
- type: recall_at_1
value: 32.257999999999996
- type: recall_at_10
value: 61.849
- type: recall_at_100
value: 83.70400000000001
- type: recall_at_1000
value: 96.344
- type: recall_at_3
value: 46.037
- type: recall_at_5
value: 53.724000000000004
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackEnglishRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.979
- type: map_at_10
value: 43.376999999999995
- type: map_at_100
value: 44.667
- type: map_at_1000
value: 44.794
- type: map_at_3
value: 40.461999999999996
- type: map_at_5
value: 42.138
- type: mrr_at_1
value: 41.146
- type: mrr_at_10
value: 49.575
- type: mrr_at_100
value: 50.187000000000005
- type: mrr_at_1000
value: 50.231
- type: mrr_at_3
value: 47.601
- type: mrr_at_5
value: 48.786
- type: ndcg_at_1
value: 41.146
- type: ndcg_at_10
value: 48.957
- type: ndcg_at_100
value: 53.296
- type: ndcg_at_1000
value: 55.254000000000005
- type: ndcg_at_3
value: 45.235
- type: ndcg_at_5
value: 47.014
- type: precision_at_1
value: 41.146
- type: precision_at_10
value: 9.107999999999999
- type: precision_at_100
value: 1.481
- type: precision_at_1000
value: 0.193
- type: precision_at_3
value: 21.783
- type: precision_at_5
value: 15.274
- type: recall_at_1
value: 32.979
- type: recall_at_10
value: 58.167
- type: recall_at_100
value: 76.374
- type: recall_at_1000
value: 88.836
- type: recall_at_3
value: 46.838
- type: recall_at_5
value: 52.006
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGamingRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 40.326
- type: map_at_10
value: 53.468
- type: map_at_100
value: 54.454
- type: map_at_1000
value: 54.508
- type: map_at_3
value: 50.12799999999999
- type: map_at_5
value: 51.991
- type: mrr_at_1
value: 46.394999999999996
- type: mrr_at_10
value: 57.016999999999996
- type: mrr_at_100
value: 57.67099999999999
- type: mrr_at_1000
value: 57.699999999999996
- type: mrr_at_3
value: 54.65
- type: mrr_at_5
value: 56.101
- type: ndcg_at_1
value: 46.394999999999996
- type: ndcg_at_10
value: 59.507
- type: ndcg_at_100
value: 63.31099999999999
- type: ndcg_at_1000
value: 64.388
- type: ndcg_at_3
value: 54.04600000000001
- type: ndcg_at_5
value: 56.723
- type: precision_at_1
value: 46.394999999999996
- type: precision_at_10
value: 9.567
- type: precision_at_100
value: 1.234
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 24.117
- type: precision_at_5
value: 16.426
- type: recall_at_1
value: 40.326
- type: recall_at_10
value: 73.763
- type: recall_at_100
value: 89.927
- type: recall_at_1000
value: 97.509
- type: recall_at_3
value: 59.34
- type: recall_at_5
value: 65.915
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGisRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.661
- type: map_at_10
value: 35.522
- type: map_at_100
value: 36.619
- type: map_at_1000
value: 36.693999999999996
- type: map_at_3
value: 33.154
- type: map_at_5
value: 34.353
- type: mrr_at_1
value: 28.362
- type: mrr_at_10
value: 37.403999999999996
- type: mrr_at_100
value: 38.374
- type: mrr_at_1000
value: 38.428000000000004
- type: mrr_at_3
value: 35.235
- type: mrr_at_5
value: 36.269
- type: ndcg_at_1
value: 28.362
- type: ndcg_at_10
value: 40.431
- type: ndcg_at_100
value: 45.745999999999995
- type: ndcg_at_1000
value: 47.493
- type: ndcg_at_3
value: 35.733
- type: ndcg_at_5
value: 37.722
- type: precision_at_1
value: 28.362
- type: precision_at_10
value: 6.101999999999999
- type: precision_at_100
value: 0.922
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 15.140999999999998
- type: precision_at_5
value: 10.305
- type: recall_at_1
value: 26.661
- type: recall_at_10
value: 53.675
- type: recall_at_100
value: 77.891
- type: recall_at_1000
value: 90.72
- type: recall_at_3
value: 40.751
- type: recall_at_5
value: 45.517
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackMathematicaRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 18.886
- type: map_at_10
value: 27.288
- type: map_at_100
value: 28.327999999999996
- type: map_at_1000
value: 28.438999999999997
- type: map_at_3
value: 24.453
- type: map_at_5
value: 25.959
- type: mrr_at_1
value: 23.134
- type: mrr_at_10
value: 32.004
- type: mrr_at_100
value: 32.789
- type: mrr_at_1000
value: 32.857
- type: mrr_at_3
value: 29.084
- type: mrr_at_5
value: 30.614
- type: ndcg_at_1
value: 23.134
- type: ndcg_at_10
value: 32.852
- type: ndcg_at_100
value: 37.972
- type: ndcg_at_1000
value: 40.656
- type: ndcg_at_3
value: 27.435
- type: ndcg_at_5
value: 29.823
- type: precision_at_1
value: 23.134
- type: precision_at_10
value: 6.032
- type: precision_at_100
value: 0.9950000000000001
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 13.017999999999999
- type: precision_at_5
value: 9.501999999999999
- type: recall_at_1
value: 18.886
- type: recall_at_10
value: 45.34
- type: recall_at_100
value: 67.947
- type: recall_at_1000
value: 86.924
- type: recall_at_3
value: 30.535
- type: recall_at_5
value: 36.451
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackPhysicsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.994999999999997
- type: map_at_10
value: 40.04
- type: map_at_100
value: 41.435
- type: map_at_1000
value: 41.537
- type: map_at_3
value: 37.091
- type: map_at_5
value: 38.802
- type: mrr_at_1
value: 35.034
- type: mrr_at_10
value: 45.411
- type: mrr_at_100
value: 46.226
- type: mrr_at_1000
value: 46.27
- type: mrr_at_3
value: 43.086
- type: mrr_at_5
value: 44.452999999999996
- type: ndcg_at_1
value: 35.034
- type: ndcg_at_10
value: 46.076
- type: ndcg_at_100
value: 51.483000000000004
- type: ndcg_at_1000
value: 53.433
- type: ndcg_at_3
value: 41.304
- type: ndcg_at_5
value: 43.641999999999996
- type: precision_at_1
value: 35.034
- type: precision_at_10
value: 8.258000000000001
- type: precision_at_100
value: 1.268
- type: precision_at_1000
value: 0.161
- type: precision_at_3
value: 19.57
- type: precision_at_5
value: 13.782
- type: recall_at_1
value: 28.994999999999997
- type: recall_at_10
value: 58.538000000000004
- type: recall_at_100
value: 80.72399999999999
- type: recall_at_1000
value: 93.462
- type: recall_at_3
value: 45.199
- type: recall_at_5
value: 51.237
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackProgrammersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.795
- type: map_at_10
value: 34.935
- type: map_at_100
value: 36.306
- type: map_at_1000
value: 36.417
- type: map_at_3
value: 31.831
- type: map_at_5
value: 33.626
- type: mrr_at_1
value: 30.479
- type: mrr_at_10
value: 40.225
- type: mrr_at_100
value: 41.055
- type: mrr_at_1000
value: 41.114
- type: mrr_at_3
value: 37.538
- type: mrr_at_5
value: 39.073
- type: ndcg_at_1
value: 30.479
- type: ndcg_at_10
value: 40.949999999999996
- type: ndcg_at_100
value: 46.525
- type: ndcg_at_1000
value: 48.892
- type: ndcg_at_3
value: 35.79
- type: ndcg_at_5
value: 38.237
- type: precision_at_1
value: 30.479
- type: precision_at_10
value: 7.6259999999999994
- type: precision_at_100
value: 1.203
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 17.199
- type: precision_at_5
value: 12.466000000000001
- type: recall_at_1
value: 24.795
- type: recall_at_10
value: 53.421
- type: recall_at_100
value: 77.189
- type: recall_at_1000
value: 93.407
- type: recall_at_3
value: 39.051
- type: recall_at_5
value: 45.462
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.853499999999997
- type: map_at_10
value: 36.20433333333333
- type: map_at_100
value: 37.40391666666667
- type: map_at_1000
value: 37.515
- type: map_at_3
value: 33.39975
- type: map_at_5
value: 34.9665
- type: mrr_at_1
value: 31.62666666666667
- type: mrr_at_10
value: 40.436749999999996
- type: mrr_at_100
value: 41.260333333333335
- type: mrr_at_1000
value: 41.31525
- type: mrr_at_3
value: 38.06733333333332
- type: mrr_at_5
value: 39.41541666666667
- type: ndcg_at_1
value: 31.62666666666667
- type: ndcg_at_10
value: 41.63341666666667
- type: ndcg_at_100
value: 46.704166666666666
- type: ndcg_at_1000
value: 48.88483333333335
- type: ndcg_at_3
value: 36.896
- type: ndcg_at_5
value: 39.11891666666667
- type: precision_at_1
value: 31.62666666666667
- type: precision_at_10
value: 7.241083333333333
- type: precision_at_100
value: 1.1488333333333334
- type: precision_at_1000
value: 0.15250000000000002
- type: precision_at_3
value: 16.908333333333335
- type: precision_at_5
value: 11.942833333333333
- type: recall_at_1
value: 26.853499999999997
- type: recall_at_10
value: 53.461333333333336
- type: recall_at_100
value: 75.63633333333333
- type: recall_at_1000
value: 90.67016666666666
- type: recall_at_3
value: 40.24241666666667
- type: recall_at_5
value: 45.98608333333333
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackStatsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.241999999999997
- type: map_at_10
value: 31.863999999999997
- type: map_at_100
value: 32.835
- type: map_at_1000
value: 32.928000000000004
- type: map_at_3
value: 29.694
- type: map_at_5
value: 30.978
- type: mrr_at_1
value: 28.374
- type: mrr_at_10
value: 34.814
- type: mrr_at_100
value: 35.596
- type: mrr_at_1000
value: 35.666
- type: mrr_at_3
value: 32.745000000000005
- type: mrr_at_5
value: 34.049
- type: ndcg_at_1
value: 28.374
- type: ndcg_at_10
value: 35.969
- type: ndcg_at_100
value: 40.708
- type: ndcg_at_1000
value: 43.08
- type: ndcg_at_3
value: 31.968999999999998
- type: ndcg_at_5
value: 34.069
- type: precision_at_1
value: 28.374
- type: precision_at_10
value: 5.583
- type: precision_at_100
value: 0.8630000000000001
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 13.547999999999998
- type: precision_at_5
value: 9.447999999999999
- type: recall_at_1
value: 25.241999999999997
- type: recall_at_10
value: 45.711
- type: recall_at_100
value: 67.482
- type: recall_at_1000
value: 85.13300000000001
- type: recall_at_3
value: 34.622
- type: recall_at_5
value: 40.043
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackTexRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.488999999999997
- type: map_at_10
value: 25.142999999999997
- type: map_at_100
value: 26.244
- type: map_at_1000
value: 26.363999999999997
- type: map_at_3
value: 22.654
- type: map_at_5
value: 24.017
- type: mrr_at_1
value: 21.198
- type: mrr_at_10
value: 28.903000000000002
- type: mrr_at_100
value: 29.860999999999997
- type: mrr_at_1000
value: 29.934
- type: mrr_at_3
value: 26.634999999999998
- type: mrr_at_5
value: 27.903
- type: ndcg_at_1
value: 21.198
- type: ndcg_at_10
value: 29.982999999999997
- type: ndcg_at_100
value: 35.275
- type: ndcg_at_1000
value: 38.074000000000005
- type: ndcg_at_3
value: 25.502999999999997
- type: ndcg_at_5
value: 27.557
- type: precision_at_1
value: 21.198
- type: precision_at_10
value: 5.502
- type: precision_at_100
value: 0.942
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 12.044
- type: precision_at_5
value: 8.782
- type: recall_at_1
value: 17.488999999999997
- type: recall_at_10
value: 40.821000000000005
- type: recall_at_100
value: 64.567
- type: recall_at_1000
value: 84.452
- type: recall_at_3
value: 28.351
- type: recall_at_5
value: 33.645
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackUnixRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.066000000000003
- type: map_at_10
value: 36.134
- type: map_at_100
value: 37.285000000000004
- type: map_at_1000
value: 37.389
- type: map_at_3
value: 33.522999999999996
- type: map_at_5
value: 34.905
- type: mrr_at_1
value: 31.436999999999998
- type: mrr_at_10
value: 40.225
- type: mrr_at_100
value: 41.079
- type: mrr_at_1000
value: 41.138000000000005
- type: mrr_at_3
value: 38.074999999999996
- type: mrr_at_5
value: 39.190000000000005
- type: ndcg_at_1
value: 31.436999999999998
- type: ndcg_at_10
value: 41.494
- type: ndcg_at_100
value: 46.678999999999995
- type: ndcg_at_1000
value: 48.964
- type: ndcg_at_3
value: 36.828
- type: ndcg_at_5
value: 38.789
- type: precision_at_1
value: 31.436999999999998
- type: precision_at_10
value: 6.931
- type: precision_at_100
value: 1.072
- type: precision_at_1000
value: 0.13799999999999998
- type: precision_at_3
value: 16.729
- type: precision_at_5
value: 11.567
- type: recall_at_1
value: 27.066000000000003
- type: recall_at_10
value: 53.705000000000005
- type: recall_at_100
value: 75.968
- type: recall_at_1000
value: 91.937
- type: recall_at_3
value: 40.865
- type: recall_at_5
value: 45.739999999999995
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWebmastersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.979000000000003
- type: map_at_10
value: 32.799
- type: map_at_100
value: 34.508
- type: map_at_1000
value: 34.719
- type: map_at_3
value: 29.947000000000003
- type: map_at_5
value: 31.584
- type: mrr_at_1
value: 30.237000000000002
- type: mrr_at_10
value: 37.651
- type: mrr_at_100
value: 38.805
- type: mrr_at_1000
value: 38.851
- type: mrr_at_3
value: 35.046
- type: mrr_at_5
value: 36.548
- type: ndcg_at_1
value: 30.237000000000002
- type: ndcg_at_10
value: 38.356
- type: ndcg_at_100
value: 44.906
- type: ndcg_at_1000
value: 47.299
- type: ndcg_at_3
value: 33.717999999999996
- type: ndcg_at_5
value: 35.946
- type: precision_at_1
value: 30.237000000000002
- type: precision_at_10
value: 7.292
- type: precision_at_100
value: 1.496
- type: precision_at_1000
value: 0.23600000000000002
- type: precision_at_3
value: 15.547
- type: precision_at_5
value: 11.344
- type: recall_at_1
value: 24.979000000000003
- type: recall_at_10
value: 48.624
- type: recall_at_100
value: 77.932
- type: recall_at_1000
value: 92.66499999999999
- type: recall_at_3
value: 35.217
- type: recall_at_5
value: 41.394
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWordpressRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.566
- type: map_at_10
value: 30.945
- type: map_at_100
value: 31.759999999999998
- type: map_at_1000
value: 31.855
- type: map_at_3
value: 28.64
- type: map_at_5
value: 29.787000000000003
- type: mrr_at_1
value: 24.954
- type: mrr_at_10
value: 33.311
- type: mrr_at_100
value: 34.050000000000004
- type: mrr_at_1000
value: 34.117999999999995
- type: mrr_at_3
value: 31.238
- type: mrr_at_5
value: 32.329
- type: ndcg_at_1
value: 24.954
- type: ndcg_at_10
value: 35.676
- type: ndcg_at_100
value: 39.931
- type: ndcg_at_1000
value: 42.43
- type: ndcg_at_3
value: 31.365
- type: ndcg_at_5
value: 33.184999999999995
- type: precision_at_1
value: 24.954
- type: precision_at_10
value: 5.564
- type: precision_at_100
value: 0.826
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 13.555
- type: precision_at_5
value: 9.168
- type: recall_at_1
value: 22.566
- type: recall_at_10
value: 47.922
- type: recall_at_100
value: 67.931
- type: recall_at_1000
value: 86.653
- type: recall_at_3
value: 36.103
- type: recall_at_5
value: 40.699000000000005
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.950000000000003
- type: map_at_10
value: 28.612
- type: map_at_100
value: 30.476999999999997
- type: map_at_1000
value: 30.674
- type: map_at_3
value: 24.262
- type: map_at_5
value: 26.554
- type: mrr_at_1
value: 38.241
- type: mrr_at_10
value: 50.43
- type: mrr_at_100
value: 51.059
- type: mrr_at_1000
value: 51.090999999999994
- type: mrr_at_3
value: 47.514
- type: mrr_at_5
value: 49.246
- type: ndcg_at_1
value: 38.241
- type: ndcg_at_10
value: 38.218
- type: ndcg_at_100
value: 45.003
- type: ndcg_at_1000
value: 48.269
- type: ndcg_at_3
value: 32.568000000000005
- type: ndcg_at_5
value: 34.400999999999996
- type: precision_at_1
value: 38.241
- type: precision_at_10
value: 11.674
- type: precision_at_100
value: 1.913
- type: precision_at_1000
value: 0.252
- type: precision_at_3
value: 24.387
- type: precision_at_5
value: 18.163
- type: recall_at_1
value: 16.950000000000003
- type: recall_at_10
value: 43.769000000000005
- type: recall_at_100
value: 66.875
- type: recall_at_1000
value: 84.92699999999999
- type: recall_at_3
value: 29.353
- type: recall_at_5
value: 35.467
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.276
- type: map_at_10
value: 20.848
- type: map_at_100
value: 29.804000000000002
- type: map_at_1000
value: 31.398
- type: map_at_3
value: 14.886
- type: map_at_5
value: 17.516000000000002
- type: mrr_at_1
value: 71
- type: mrr_at_10
value: 78.724
- type: mrr_at_100
value: 78.976
- type: mrr_at_1000
value: 78.986
- type: mrr_at_3
value: 77.333
- type: mrr_at_5
value: 78.021
- type: ndcg_at_1
value: 57.875
- type: ndcg_at_10
value: 43.855
- type: ndcg_at_100
value: 48.99
- type: ndcg_at_1000
value: 56.141
- type: ndcg_at_3
value: 48.914
- type: ndcg_at_5
value: 45.961
- type: precision_at_1
value: 71
- type: precision_at_10
value: 34.575
- type: precision_at_100
value: 11.182
- type: precision_at_1000
value: 2.044
- type: precision_at_3
value: 52.5
- type: precision_at_5
value: 44.2
- type: recall_at_1
value: 9.276
- type: recall_at_10
value: 26.501
- type: recall_at_100
value: 55.72899999999999
- type: recall_at_1000
value: 78.532
- type: recall_at_3
value: 16.365
- type: recall_at_5
value: 20.154
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 52.71
- type: f1
value: 47.74801556489574
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 73.405
- type: map_at_10
value: 82.822
- type: map_at_100
value: 83.042
- type: map_at_1000
value: 83.055
- type: map_at_3
value: 81.65299999999999
- type: map_at_5
value: 82.431
- type: mrr_at_1
value: 79.178
- type: mrr_at_10
value: 87.02
- type: mrr_at_100
value: 87.095
- type: mrr_at_1000
value: 87.09700000000001
- type: mrr_at_3
value: 86.309
- type: mrr_at_5
value: 86.824
- type: ndcg_at_1
value: 79.178
- type: ndcg_at_10
value: 86.72
- type: ndcg_at_100
value: 87.457
- type: ndcg_at_1000
value: 87.691
- type: ndcg_at_3
value: 84.974
- type: ndcg_at_5
value: 86.032
- type: precision_at_1
value: 79.178
- type: precision_at_10
value: 10.548
- type: precision_at_100
value: 1.113
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_3
value: 32.848
- type: precision_at_5
value: 20.45
- type: recall_at_1
value: 73.405
- type: recall_at_10
value: 94.39699999999999
- type: recall_at_100
value: 97.219
- type: recall_at_1000
value: 98.675
- type: recall_at_3
value: 89.679
- type: recall_at_5
value: 92.392
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.651
- type: map_at_10
value: 36.886
- type: map_at_100
value: 38.811
- type: map_at_1000
value: 38.981
- type: map_at_3
value: 32.538
- type: map_at_5
value: 34.763
- type: mrr_at_1
value: 44.444
- type: mrr_at_10
value: 53.168000000000006
- type: mrr_at_100
value: 53.839000000000006
- type: mrr_at_1000
value: 53.869
- type: mrr_at_3
value: 50.54
- type: mrr_at_5
value: 52.068000000000005
- type: ndcg_at_1
value: 44.444
- type: ndcg_at_10
value: 44.994
- type: ndcg_at_100
value: 51.599
- type: ndcg_at_1000
value: 54.339999999999996
- type: ndcg_at_3
value: 41.372
- type: ndcg_at_5
value: 42.149
- type: precision_at_1
value: 44.444
- type: precision_at_10
value: 12.407
- type: precision_at_100
value: 1.9269999999999998
- type: precision_at_1000
value: 0.242
- type: precision_at_3
value: 27.726
- type: precision_at_5
value: 19.814999999999998
- type: recall_at_1
value: 22.651
- type: recall_at_10
value: 52.075
- type: recall_at_100
value: 76.51400000000001
- type: recall_at_1000
value: 92.852
- type: recall_at_3
value: 37.236000000000004
- type: recall_at_5
value: 43.175999999999995
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 40.777
- type: map_at_10
value: 66.79899999999999
- type: map_at_100
value: 67.65299999999999
- type: map_at_1000
value: 67.706
- type: map_at_3
value: 63.352
- type: map_at_5
value: 65.52900000000001
- type: mrr_at_1
value: 81.553
- type: mrr_at_10
value: 86.983
- type: mrr_at_100
value: 87.132
- type: mrr_at_1000
value: 87.136
- type: mrr_at_3
value: 86.156
- type: mrr_at_5
value: 86.726
- type: ndcg_at_1
value: 81.553
- type: ndcg_at_10
value: 74.64
- type: ndcg_at_100
value: 77.459
- type: ndcg_at_1000
value: 78.43
- type: ndcg_at_3
value: 69.878
- type: ndcg_at_5
value: 72.59400000000001
- type: precision_at_1
value: 81.553
- type: precision_at_10
value: 15.654000000000002
- type: precision_at_100
value: 1.783
- type: precision_at_1000
value: 0.191
- type: precision_at_3
value: 45.199
- type: precision_at_5
value: 29.267
- type: recall_at_1
value: 40.777
- type: recall_at_10
value: 78.271
- type: recall_at_100
value: 89.129
- type: recall_at_1000
value: 95.49
- type: recall_at_3
value: 67.79899999999999
- type: recall_at_5
value: 73.167
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 93.5064
- type: ap
value: 90.25495114444111
- type: f1
value: 93.5012434973381
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 23.301
- type: map_at_10
value: 35.657
- type: map_at_100
value: 36.797000000000004
- type: map_at_1000
value: 36.844
- type: map_at_3
value: 31.743
- type: map_at_5
value: 34.003
- type: mrr_at_1
value: 23.854
- type: mrr_at_10
value: 36.242999999999995
- type: mrr_at_100
value: 37.32
- type: mrr_at_1000
value: 37.361
- type: mrr_at_3
value: 32.4
- type: mrr_at_5
value: 34.634
- type: ndcg_at_1
value: 23.868000000000002
- type: ndcg_at_10
value: 42.589
- type: ndcg_at_100
value: 48.031
- type: ndcg_at_1000
value: 49.189
- type: ndcg_at_3
value: 34.649
- type: ndcg_at_5
value: 38.676
- type: precision_at_1
value: 23.868000000000002
- type: precision_at_10
value: 6.6850000000000005
- type: precision_at_100
value: 0.9400000000000001
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 14.651
- type: precision_at_5
value: 10.834000000000001
- type: recall_at_1
value: 23.301
- type: recall_at_10
value: 63.88700000000001
- type: recall_at_100
value: 88.947
- type: recall_at_1000
value: 97.783
- type: recall_at_3
value: 42.393
- type: recall_at_5
value: 52.036
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 94.64888280893753
- type: f1
value: 94.41310774203512
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 79.72184222526221
- type: f1
value: 61.522034067350106
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 79.60659045057163
- type: f1
value: 77.268649687049
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 81.83254875588432
- type: f1
value: 81.61520635919082
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 36.31529875009507
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 31.734233714415073
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.994501713009452
- type: mrr
value: 32.13512850703073
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.603000000000001
- type: map_at_10
value: 13.767999999999999
- type: map_at_100
value: 17.197000000000003
- type: map_at_1000
value: 18.615000000000002
- type: map_at_3
value: 10.567
- type: map_at_5
value: 12.078999999999999
- type: mrr_at_1
value: 44.891999999999996
- type: mrr_at_10
value: 53.75299999999999
- type: mrr_at_100
value: 54.35
- type: mrr_at_1000
value: 54.388000000000005
- type: mrr_at_3
value: 51.495999999999995
- type: mrr_at_5
value: 52.688
- type: ndcg_at_1
value: 43.189
- type: ndcg_at_10
value: 34.567
- type: ndcg_at_100
value: 32.273
- type: ndcg_at_1000
value: 41.321999999999996
- type: ndcg_at_3
value: 40.171
- type: ndcg_at_5
value: 37.502
- type: precision_at_1
value: 44.582
- type: precision_at_10
value: 25.139
- type: precision_at_100
value: 7.739999999999999
- type: precision_at_1000
value: 2.054
- type: precision_at_3
value: 37.152
- type: precision_at_5
value: 31.826999999999998
- type: recall_at_1
value: 6.603000000000001
- type: recall_at_10
value: 17.023
- type: recall_at_100
value: 32.914
- type: recall_at_1000
value: 64.44800000000001
- type: recall_at_3
value: 11.457
- type: recall_at_5
value: 13.816
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.026000000000003
- type: map_at_10
value: 45.429
- type: map_at_100
value: 46.45
- type: map_at_1000
value: 46.478
- type: map_at_3
value: 41.147
- type: map_at_5
value: 43.627
- type: mrr_at_1
value: 33.951
- type: mrr_at_10
value: 47.953
- type: mrr_at_100
value: 48.731
- type: mrr_at_1000
value: 48.751
- type: mrr_at_3
value: 44.39
- type: mrr_at_5
value: 46.533
- type: ndcg_at_1
value: 33.951
- type: ndcg_at_10
value: 53.24100000000001
- type: ndcg_at_100
value: 57.599999999999994
- type: ndcg_at_1000
value: 58.270999999999994
- type: ndcg_at_3
value: 45.190999999999995
- type: ndcg_at_5
value: 49.339
- type: precision_at_1
value: 33.951
- type: precision_at_10
value: 8.856
- type: precision_at_100
value: 1.133
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 20.713
- type: precision_at_5
value: 14.838000000000001
- type: recall_at_1
value: 30.026000000000003
- type: recall_at_10
value: 74.512
- type: recall_at_100
value: 93.395
- type: recall_at_1000
value: 98.402
- type: recall_at_3
value: 53.677
- type: recall_at_5
value: 63.198
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.41300000000001
- type: map_at_10
value: 85.387
- type: map_at_100
value: 86.027
- type: map_at_1000
value: 86.041
- type: map_at_3
value: 82.543
- type: map_at_5
value: 84.304
- type: mrr_at_1
value: 82.35
- type: mrr_at_10
value: 88.248
- type: mrr_at_100
value: 88.348
- type: mrr_at_1000
value: 88.349
- type: mrr_at_3
value: 87.348
- type: mrr_at_5
value: 87.96300000000001
- type: ndcg_at_1
value: 82.37
- type: ndcg_at_10
value: 88.98
- type: ndcg_at_100
value: 90.16499999999999
- type: ndcg_at_1000
value: 90.239
- type: ndcg_at_3
value: 86.34100000000001
- type: ndcg_at_5
value: 87.761
- type: precision_at_1
value: 82.37
- type: precision_at_10
value: 13.471
- type: precision_at_100
value: 1.534
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.827
- type: precision_at_5
value: 24.773999999999997
- type: recall_at_1
value: 71.41300000000001
- type: recall_at_10
value: 95.748
- type: recall_at_100
value: 99.69200000000001
- type: recall_at_1000
value: 99.98
- type: recall_at_3
value: 87.996
- type: recall_at_5
value: 92.142
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 56.96878497780007
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 65.31371347128074
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.287
- type: map_at_10
value: 13.530000000000001
- type: map_at_100
value: 15.891
- type: map_at_1000
value: 16.245
- type: map_at_3
value: 9.612
- type: map_at_5
value: 11.672
- type: mrr_at_1
value: 26
- type: mrr_at_10
value: 37.335
- type: mrr_at_100
value: 38.443
- type: mrr_at_1000
value: 38.486
- type: mrr_at_3
value: 33.783
- type: mrr_at_5
value: 36.028
- type: ndcg_at_1
value: 26
- type: ndcg_at_10
value: 22.215
- type: ndcg_at_100
value: 31.101
- type: ndcg_at_1000
value: 36.809
- type: ndcg_at_3
value: 21.104
- type: ndcg_at_5
value: 18.759999999999998
- type: precision_at_1
value: 26
- type: precision_at_10
value: 11.43
- type: precision_at_100
value: 2.424
- type: precision_at_1000
value: 0.379
- type: precision_at_3
value: 19.7
- type: precision_at_5
value: 16.619999999999997
- type: recall_at_1
value: 5.287
- type: recall_at_10
value: 23.18
- type: recall_at_100
value: 49.208
- type: recall_at_1000
value: 76.85300000000001
- type: recall_at_3
value: 11.991999999999999
- type: recall_at_5
value: 16.85
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 83.87834913790886
- type: cos_sim_spearman
value: 81.04583513112122
- type: euclidean_pearson
value: 81.20484174558065
- type: euclidean_spearman
value: 80.76430832561769
- type: manhattan_pearson
value: 81.21416730978615
- type: manhattan_spearman
value: 80.7797637394211
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 86.56143998865157
- type: cos_sim_spearman
value: 79.75387012744471
- type: euclidean_pearson
value: 83.7877519997019
- type: euclidean_spearman
value: 79.90489748003296
- type: manhattan_pearson
value: 83.7540590666095
- type: manhattan_spearman
value: 79.86434577931573
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 83.92102564177941
- type: cos_sim_spearman
value: 84.98234585939103
- type: euclidean_pearson
value: 84.47729567593696
- type: euclidean_spearman
value: 85.09490696194469
- type: manhattan_pearson
value: 84.38622951588229
- type: manhattan_spearman
value: 85.02507171545574
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 80.1891164763377
- type: cos_sim_spearman
value: 80.7997969966883
- type: euclidean_pearson
value: 80.48572256162396
- type: euclidean_spearman
value: 80.57851903536378
- type: manhattan_pearson
value: 80.4324819433651
- type: manhattan_spearman
value: 80.5074526239062
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 82.64319975116025
- type: cos_sim_spearman
value: 84.88671197763652
- type: euclidean_pearson
value: 84.74692193293231
- type: euclidean_spearman
value: 85.27151722073653
- type: manhattan_pearson
value: 84.72460516785438
- type: manhattan_spearman
value: 85.26518899786687
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 83.24687565822381
- type: cos_sim_spearman
value: 85.60418454111263
- type: euclidean_pearson
value: 84.85829740169851
- type: euclidean_spearman
value: 85.66378014138306
- type: manhattan_pearson
value: 84.84672408808835
- type: manhattan_spearman
value: 85.63331924364891
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 84.87758895415485
- type: cos_sim_spearman
value: 85.8193745617297
- type: euclidean_pearson
value: 85.78719118848134
- type: euclidean_spearman
value: 84.35797575385688
- type: manhattan_pearson
value: 85.97919844815692
- type: manhattan_spearman
value: 84.58334745175151
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 67.27076035963599
- type: cos_sim_spearman
value: 67.21433656439973
- type: euclidean_pearson
value: 68.07434078679324
- type: euclidean_spearman
value: 66.0249731719049
- type: manhattan_pearson
value: 67.95495198947476
- type: manhattan_spearman
value: 65.99893908331886
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 82.22437747056817
- type: cos_sim_spearman
value: 85.0995685206174
- type: euclidean_pearson
value: 84.08616925603394
- type: euclidean_spearman
value: 84.89633925691658
- type: manhattan_pearson
value: 84.08332675923133
- type: manhattan_spearman
value: 84.8858228112915
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 87.6909022589666
- type: mrr
value: 96.43341952165481
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 57.660999999999994
- type: map_at_10
value: 67.625
- type: map_at_100
value: 68.07600000000001
- type: map_at_1000
value: 68.10199999999999
- type: map_at_3
value: 64.50399999999999
- type: map_at_5
value: 66.281
- type: mrr_at_1
value: 61
- type: mrr_at_10
value: 68.953
- type: mrr_at_100
value: 69.327
- type: mrr_at_1000
value: 69.352
- type: mrr_at_3
value: 66.833
- type: mrr_at_5
value: 68.05
- type: ndcg_at_1
value: 61
- type: ndcg_at_10
value: 72.369
- type: ndcg_at_100
value: 74.237
- type: ndcg_at_1000
value: 74.939
- type: ndcg_at_3
value: 67.284
- type: ndcg_at_5
value: 69.72500000000001
- type: precision_at_1
value: 61
- type: precision_at_10
value: 9.733
- type: precision_at_100
value: 1.0670000000000002
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 26.222
- type: precision_at_5
value: 17.4
- type: recall_at_1
value: 57.660999999999994
- type: recall_at_10
value: 85.656
- type: recall_at_100
value: 93.833
- type: recall_at_1000
value: 99.333
- type: recall_at_3
value: 71.961
- type: recall_at_5
value: 78.094
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.86930693069307
- type: cos_sim_ap
value: 96.76685487950894
- type: cos_sim_f1
value: 93.44587884806354
- type: cos_sim_precision
value: 92.80078895463511
- type: cos_sim_recall
value: 94.1
- type: dot_accuracy
value: 99.54356435643564
- type: dot_ap
value: 81.18659960405607
- type: dot_f1
value: 75.78008915304605
- type: dot_precision
value: 75.07360157016683
- type: dot_recall
value: 76.5
- type: euclidean_accuracy
value: 99.87326732673267
- type: euclidean_ap
value: 96.8102411908941
- type: euclidean_f1
value: 93.6127744510978
- type: euclidean_precision
value: 93.42629482071713
- type: euclidean_recall
value: 93.8
- type: manhattan_accuracy
value: 99.87425742574257
- type: manhattan_ap
value: 96.82857341435529
- type: manhattan_f1
value: 93.62129583124059
- type: manhattan_precision
value: 94.04641775983855
- type: manhattan_recall
value: 93.2
- type: max_accuracy
value: 99.87425742574257
- type: max_ap
value: 96.82857341435529
- type: max_f1
value: 93.62129583124059
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 65.92560972698926
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.92797240259008
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 55.244624045597654
- type: mrr
value: 56.185303666921314
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.02491987312937
- type: cos_sim_spearman
value: 32.055592206679734
- type: dot_pearson
value: 24.731627575422557
- type: dot_spearman
value: 24.308029077069733
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.231
- type: map_at_10
value: 1.899
- type: map_at_100
value: 9.498
- type: map_at_1000
value: 20.979999999999997
- type: map_at_3
value: 0.652
- type: map_at_5
value: 1.069
- type: mrr_at_1
value: 88
- type: mrr_at_10
value: 93.4
- type: mrr_at_100
value: 93.4
- type: mrr_at_1000
value: 93.4
- type: mrr_at_3
value: 93
- type: mrr_at_5
value: 93.4
- type: ndcg_at_1
value: 86
- type: ndcg_at_10
value: 75.375
- type: ndcg_at_100
value: 52.891999999999996
- type: ndcg_at_1000
value: 44.952999999999996
- type: ndcg_at_3
value: 81.05
- type: ndcg_at_5
value: 80.175
- type: precision_at_1
value: 88
- type: precision_at_10
value: 79
- type: precision_at_100
value: 53.16
- type: precision_at_1000
value: 19.408
- type: precision_at_3
value: 85.333
- type: precision_at_5
value: 84
- type: recall_at_1
value: 0.231
- type: recall_at_10
value: 2.078
- type: recall_at_100
value: 12.601
- type: recall_at_1000
value: 41.296
- type: recall_at_3
value: 0.6779999999999999
- type: recall_at_5
value: 1.1360000000000001
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.782
- type: map_at_10
value: 10.204
- type: map_at_100
value: 16.176
- type: map_at_1000
value: 17.456
- type: map_at_3
value: 5.354
- type: map_at_5
value: 7.503
- type: mrr_at_1
value: 40.816
- type: mrr_at_10
value: 54.010000000000005
- type: mrr_at_100
value: 54.49
- type: mrr_at_1000
value: 54.49
- type: mrr_at_3
value: 48.980000000000004
- type: mrr_at_5
value: 51.735
- type: ndcg_at_1
value: 36.735
- type: ndcg_at_10
value: 26.61
- type: ndcg_at_100
value: 36.967
- type: ndcg_at_1000
value: 47.274
- type: ndcg_at_3
value: 30.363
- type: ndcg_at_5
value: 29.448999999999998
- type: precision_at_1
value: 40.816
- type: precision_at_10
value: 23.878
- type: precision_at_100
value: 7.693999999999999
- type: precision_at_1000
value: 1.4489999999999998
- type: precision_at_3
value: 31.293
- type: precision_at_5
value: 29.796
- type: recall_at_1
value: 2.782
- type: recall_at_10
value: 16.485
- type: recall_at_100
value: 46.924
- type: recall_at_1000
value: 79.365
- type: recall_at_3
value: 6.52
- type: recall_at_5
value: 10.48
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 70.08300000000001
- type: ap
value: 13.91559884590195
- type: f1
value: 53.956838444291364
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 59.34069043576683
- type: f1
value: 59.662041994618406
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 53.70780611078653
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.10734934732073
- type: cos_sim_ap
value: 77.58349999516054
- type: cos_sim_f1
value: 70.25391395868965
- type: cos_sim_precision
value: 70.06035161374967
- type: cos_sim_recall
value: 70.44854881266491
- type: dot_accuracy
value: 80.60439887941826
- type: dot_ap
value: 54.52935200483575
- type: dot_f1
value: 54.170444242973716
- type: dot_precision
value: 47.47715534366309
- type: dot_recall
value: 63.06068601583114
- type: euclidean_accuracy
value: 87.26828396018358
- type: euclidean_ap
value: 78.00158454104036
- type: euclidean_f1
value: 70.70292457670601
- type: euclidean_precision
value: 68.79680479281079
- type: euclidean_recall
value: 72.71767810026385
- type: manhattan_accuracy
value: 87.11330988853788
- type: manhattan_ap
value: 77.92527099601855
- type: manhattan_f1
value: 70.76488706365502
- type: manhattan_precision
value: 68.89055472263868
- type: manhattan_recall
value: 72.74406332453826
- type: max_accuracy
value: 87.26828396018358
- type: max_ap
value: 78.00158454104036
- type: max_f1
value: 70.76488706365502
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 87.80804905499282
- type: cos_sim_ap
value: 83.06187782630936
- type: cos_sim_f1
value: 74.99716435403985
- type: cos_sim_precision
value: 73.67951860931579
- type: cos_sim_recall
value: 76.36279642747151
- type: dot_accuracy
value: 81.83141227151008
- type: dot_ap
value: 67.18241090841795
- type: dot_f1
value: 62.216037571751606
- type: dot_precision
value: 56.749381227391005
- type: dot_recall
value: 68.84816753926701
- type: euclidean_accuracy
value: 87.91671517832887
- type: euclidean_ap
value: 83.56538942001427
- type: euclidean_f1
value: 75.7327253337256
- type: euclidean_precision
value: 72.48856036606828
- type: euclidean_recall
value: 79.28087465352634
- type: manhattan_accuracy
value: 87.86626304963713
- type: manhattan_ap
value: 83.52939841172832
- type: manhattan_f1
value: 75.73635656329888
- type: manhattan_precision
value: 72.99150182103836
- type: manhattan_recall
value: 78.69571912534647
- type: max_accuracy
value: 87.91671517832887
- type: max_ap
value: 83.56538942001427
- type: max_f1
value: 75.73635656329888
license: mit
language:
- en
---
**We recommend switching to the newest [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5), which has a more reasonable similarity distribution and the same usage method.**
<h1 align="center">FlagEmbedding</h1>
<h4 align="center">
<p>
<a href="#model-list">Model List</a> |
<a href="#frequently-asked-questions">FAQ</a> |
<a href="#usage">Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
</p>
</h4>
For more details, please refer to our GitHub repo: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
FlagEmbedding maps any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, and semantic search.
It can also be used in vector databases for LLMs.
************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
  - **New reranker models**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
  - **Updated embedding models**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance their retrieval ability without instructions.
<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): Add script to mine hard negatives and support adding instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*`(short for BAAI General Embedding) Models, **rank 1st on MTEB and C-MTEB benchmark!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.
</details>
## Model List
`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |
[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed: just use the original query directly. In all cases, **no instruction** needs to be added to passages.
[2\]: Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by other, simpler models.
For example, use the bge embedding model to retrieve the top 100 relevant documents, then use the bge reranker to re-rank those 100 documents and obtain the final top-3 results.
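The retrieve-then-rerank flow above can be sketched end-to-end. This is a minimal sketch with deterministic toy embeddings and pair scores standing in for the models; in a real pipeline the two scoring steps would call `FlagModel.encode` and `FlagReranker.compute_score` instead:

```python
import numpy as np

# Toy corpus with random unit-norm "embeddings" (stand-ins for model output).
rng = np.random.default_rng(42)
corpus = [f"doc-{i}" for i in range(1000)]
corpus_emb = rng.standard_normal((1000, 64))
corpus_emb /= np.linalg.norm(corpus_emb, axis=1, keepdims=True)

def retrieve(query_emb, k=100):
    """Stage 1: fast dot-product retrieval over the whole corpus."""
    scores = corpus_emb @ query_emb
    return np.argsort(-scores)[:k]          # indices of the top-k documents

def rerank(query, doc_ids, k=3):
    """Stage 2: slower pairwise scoring over the candidates only."""
    # Deterministic toy pair score; a real cross-encoder reads (query, doc) text.
    scores = np.array([(int(i) * 2654435761) % 1000 for i in doc_ids])
    order = np.argsort(-scores)
    return [corpus[int(doc_ids[j])] for j in order[:k]]

query_emb = rng.standard_normal(64)
query_emb /= np.linalg.norm(query_emb)
candidates = retrieve(query_emb, k=100)     # top 100 of 1000 documents
final = rerank("what is panda?", candidates, k=3)
print(final)
```

The key point is that the expensive pairwise scorer only ever sees the small candidate set, never the full corpus.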
All models have been uploaded to the Hugging Face Hub; you can find them at https://huggingface.co/BAAI.
If you cannot access the Hugging Face Hub, you can also download the models at https://model.baai.ac.cn/models .
## Frequently asked questions
<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your own data, the pre-trained model cannot be used directly to calculate similarity; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high enough, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.
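Hard-negative mining itself is simple to sketch: for each query, keep the top-scoring corpus items that are *not* the labelled positive. The embeddings below are random stand-ins for real model output, used only to illustrate the selection logic:

```python
import numpy as np

# Random unit-norm "corpus embeddings" standing in for a real encoder's output.
rng = np.random.default_rng(0)
n_docs, dim = 50, 16
corpus_emb = rng.standard_normal((n_docs, dim))
corpus_emb /= np.linalg.norm(corpus_emb, axis=1, keepdims=True)

def mine_hard_negatives(query_emb, positive_id, n_neg=5):
    """Return the n_neg highest-scoring documents that are not the positive."""
    scores = corpus_emb @ query_emb
    ranked = np.argsort(-scores)                     # best match first
    negatives = [int(i) for i in ranked if i != positive_id]
    return negatives[:n_neg]                         # hardest non-positives

# Build a query that is deliberately close to document 7 (the positive).
q = corpus_emb[7] + 0.1 * rng.standard_normal(dim)
q /= np.linalg.norm(q)
negs = mine_hard_negatives(q, positive_id=7)
print(negs)
```

These "hard" negatives are more informative for contrastive fine-tuning than randomly sampled ones, because the model already confuses them with the positive.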
</details>
<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**
Since we fine-tune the models by contrastive learning with a temperature of 0.01,
the similarity distribution of the current BGE models lies roughly in the interval \[0.6, 1\].
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity,
**what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold,
please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).
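As an illustration of picking a data-driven threshold rather than a fixed 0.5 cut-off (the score values below are synthetic, not real BGE output):

```python
import numpy as np

# Synthetic score samples mimicking the shifted BGE distribution:
# even dissimilar pairs score well above 0.5.
rng = np.random.default_rng(1)
dissimilar_scores = rng.uniform(0.60, 0.75, 500)
similar_scores = rng.uniform(0.85, 1.00, 500)

# One simple recipe: take a high percentile of the dissimilar-pair scores
# observed on your own labelled sample as the cut-off.
threshold = float(np.percentile(dissimilar_scores, 99))
kept = similar_scores > threshold
print(round(threshold, 3), float(kept.mean()))
```

With such a threshold, nearly all dissimilar pairs are rejected while the similar pairs are kept; the right percentile depends entirely on your data.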
</details>
<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->
For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used:
omitting the instruction causes only a slight degradation in retrieval performance compared with using it.
So for convenience you can generate embeddings without an instruction in all cases.
For a retrieval task that uses short queries to find long related documents,
it is recommended to add instructions to these short queries.
**The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.**
In all cases, the documents/passages do not need an instruction.
</details>
## Usage
### Usage for Embedding Model
Here are some examples for using `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If this doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for other ways to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# for an s2p (short query to long passage) retrieval task, use encode_queries(), which automatically adds the instruction to each query
# the corpus in a retrieval task can still use encode() or encode_corpus(), since passages need no instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).
By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
#### Using Sentence-Transformers
You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For an s2p (short query to long passage) retrieval task,
each short query should start with an instruction (see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for the instructions).
The instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```
#### Using Langchain
You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs,
query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```
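A quick check of why `normalize_embeddings=True` is set above: once vectors are L2-normalized, the plain dot product equals cosine similarity, so downstream code can use fast matrix products directly:

```python
import numpy as np

# Two arbitrary vectors standing in for embeddings.
rng = np.random.default_rng(3)
a, b = rng.standard_normal(8), rng.standard_normal(8)

# Cosine similarity of the raw vectors.
cosine = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Dot product after L2 normalization.
a_n, b_n = a / np.linalg.norm(a), b / np.linalg.norm(b)
dot_of_normalized = float(a_n @ b_n)

print(np.isclose(cosine, dot_of_normalized))  # True
```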
#### Using HuggingFace Transformers
With the transformers package, you can use the model like this: first pass your input through the transformer model, then select the last hidden state of the first token (i.e., `[CLS]`) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for an s2p (short query to long passage) retrieval task, add an instruction to each query (no instruction for passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```
### Usage for Reranker
Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
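If you need a score in a fixed range (e.g. \[0, 1\] for display in a UI), one optional convenience, not part of the model itself, is to pass the raw logit through a sigmoid:

```python
import math

def to_unit_interval(logit: float) -> float:
    """Map an unbounded reranker logit into (0, 1) via a sigmoid.

    Note: this is a monotonic convenience transform for presentation;
    it does not change the ranking and is not a calibrated probability.
    """
    return 1.0 / (1.0 + math.exp(-logit))

print(to_unit_interval(0.0))   # 0.5
print(to_unit_interval(5.2))   # close to 1
print(to_unit_interval(-3.0))  # close to 0
```

Because the sigmoid is monotonic, the relative order of documents is preserved, which is what matters for re-ranking.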
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
score = reranker.compute_score(['query', 'passage'])
print(score)
scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```
#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```
## Evaluation
`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:
| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) |Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 |51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024| 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |
- **C-MTEB**:
We created the C-MTEB benchmark for Chinese text embeddings, which consists of 31 datasets covering 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |
- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |
\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-lingual retrieval tasks.
## Train
### BAAI Embedding
We pre-train the models with [RetroMAE](https://github.com/staoxiao/RetroMAE) and then train them on large-scale paired data using contrastive learning.
**You can fine-tune the embedding model on your own data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-training example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned first.
For more training details of BGE, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).
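The contrastive fine-tuning stage described above can be illustrated with a minimal numpy sketch of an in-batch InfoNCE-style objective. This is a hedged, illustrative example only: the function names, the temperature value, and the toy vectors are assumptions for demonstration, not the actual FlagEmbedding training code.

```python
import numpy as np

def normalize(x):
    """L2-normalize vectors so dot products become cosine similarities."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def info_nce_loss(q, pos, negs, temperature=0.05):
    """InfoNCE contrastive loss for one query (illustrative sketch).

    q: (d,) query embedding; pos: (d,) positive passage embedding;
    negs: (n, d) negative passage embeddings. All assumed L2-normalized.
    The loss is low when the query is closer to its positive than to
    any negative, which is what contrastive training optimizes.
    """
    sims = np.concatenate([[q @ pos], negs @ q]) / temperature
    sims -= sims.max()  # numerical stability before exponentiation
    log_probs = sims - np.log(np.exp(sims).sum())
    return -log_probs[0]  # cross-entropy with the positive at index 0
```

A well-aligned (query, positive) pair yields a near-zero loss, while a misaligned pair yields a large one, pushing paired texts together in embedding space.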
### BGE Reranker
A cross-encoder performs full attention over the input pair,
which is more accurate than an embedding model (i.e., a bi-encoder) but also more time-consuming.
Therefore, it is best used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data.
The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).
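The retrieve-then-rerank pattern described above can be sketched in a few lines: a cheap dot-product scan over all documents, followed by an expensive scoring function applied only to the short list. This is a self-contained toy sketch; the function names and the stand-in `rerank_fn` are illustrative assumptions, not the FlagEmbedding reranker API.

```python
import numpy as np

def retrieve_then_rerank(query_vec, doc_vecs, rerank_fn, k=3):
    """Stage 1: bi-encoder retrieval by cosine similarity (vectors
    assumed L2-normalized). Stage 2: re-score only the top-k hits
    with a slower, more accurate cross-encoder-style function.

    rerank_fn(i) -> float returns the rerank score for document i
    (a stand-in for a real cross-encoder; higher is better).
    """
    # Stage 1: one cheap matrix-vector product over the whole corpus
    coarse = doc_vecs @ query_vec
    top_k = np.argsort(-coarse)[:k]
    # Stage 2: expensive scoring restricted to the k candidates
    rescored = sorted(top_k, key=lambda i: -rerank_fn(i))
    return [int(i) for i in rescored]
```

The point of the split is cost: the bi-encoder touches every document once per query, while the cross-encoder only ever sees k candidates.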
## Contact
If you have any questions or suggestions about this project, feel free to open an issue or a pull request.
You can also email Shitao Xiao (stxiao@baai.ac.cn) and Zheng Liu (liuzheng@baai.ac.cn).
## Citation
If you find this repository useful, please consider giving it a star :star: and a citation:
```
@misc{bge_embedding,
title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
year={2023},
eprint={2309.07597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
| 89,767 | [
[
-0.03662109375,
-0.06646728515625,
0.0295257568359375,
0.01262664794921875,
-0.0291748046875,
-0.0205535888671875,
-0.025726318359375,
-0.0223236083984375,
0.0287017822265625,
0.025665283203125,
-0.0259552001953125,
-0.0633544921875,
-0.037994384765625,
-0.0032329559326171875,
-0.00555419921875,
0.042388916015625,
-0.0030994415283203125,
0.01105499267578125,
0.003185272216796875,
-0.019775390625,
-0.0309600830078125,
-0.0171966552734375,
-0.05364990234375,
-0.022003173828125,
0.0285797119140625,
0.01617431640625,
0.043121337890625,
0.0521240234375,
0.0247802734375,
0.020477294921875,
-0.0199432373046875,
0.01043701171875,
-0.03741455078125,
-0.00559234619140625,
-0.018402099609375,
-0.0239715576171875,
-0.0307769775390625,
0.0089111328125,
0.0491943359375,
0.033477783203125,
-0.00757598876953125,
0.00800323486328125,
-0.00009363889694213867,
0.05389404296875,
-0.03570556640625,
0.019805908203125,
-0.04119873046875,
0.0026187896728515625,
-0.0188751220703125,
0.011474609375,
-0.03759765625,
-0.0258636474609375,
0.0130615234375,
-0.044525146484375,
0.00798797607421875,
0.02093505859375,
0.09759521484375,
0.01490020751953125,
-0.03253173828125,
-0.010498046875,
-0.00992584228515625,
0.07354736328125,
-0.0772705078125,
0.052581787109375,
0.03631591796875,
0.020751953125,
-0.00600433349609375,
-0.0616455078125,
-0.0268707275390625,
-0.0123138427734375,
-0.01421356201171875,
0.0313720703125,
-0.0022296905517578125,
0.0014753341674804688,
0.026611328125,
0.04583740234375,
-0.0433349609375,
0.0081787109375,
-0.00749969482421875,
-0.0130615234375,
0.0577392578125,
-0.01337432861328125,
0.0308837890625,
-0.03912353515625,
-0.020660400390625,
-0.03094482421875,
-0.062103271484375,
0.0024852752685546875,
0.0285186767578125,
0.00899505615234375,
-0.025726318359375,
0.04095458984375,
-0.0172271728515625,
0.045318603515625,
0.006168365478515625,
0.0004968643188476562,
0.045074462890625,
-0.0268402099609375,
-0.01666259765625,
-0.007595062255859375,
0.066650390625,
0.0305938720703125,
-0.0024623870849609375,
0.004901885986328125,
-0.0239410400390625,
-0.006809234619140625,
-0.006114959716796875,
-0.0697021484375,
-0.019317626953125,
0.01456451416015625,
-0.05877685546875,
-0.0115966796875,
0.015625,
-0.058624267578125,
0.006099700927734375,
-0.0007410049438476562,
0.043426513671875,
-0.056854248046875,
-0.005390167236328125,
0.0234527587890625,
-0.0146331787109375,
0.029266357421875,
0.0001004934310913086,
-0.04833984375,
-0.0170745849609375,
0.039306640625,
0.06732177734375,
0.0101318359375,
-0.00722503662109375,
-0.0292205810546875,
0.0037384033203125,
-0.01213836669921875,
0.0240325927734375,
-0.037384033203125,
-0.01485443115234375,
0.0139923095703125,
0.0301513671875,
-0.00739288330078125,
-0.0249786376953125,
0.06634521484375,
-0.038787841796875,
0.0266265869140625,
-0.0283203125,
-0.06036376953125,
-0.038909912109375,
0.0070037841796875,
-0.059326171875,
0.0821533203125,
-0.006275177001953125,
-0.0654296875,
0.009552001953125,
-0.04766845703125,
-0.018035888671875,
-0.018096923828125,
0.0004901885986328125,
-0.04644775390625,
-0.007427215576171875,
0.0277557373046875,
0.04461669921875,
-0.016082763671875,
0.0035800933837890625,
-0.027587890625,
-0.045135498046875,
-0.0030975341796875,
-0.0178985595703125,
0.0804443359375,
0.02215576171875,
-0.02789306640625,
-0.0169219970703125,
-0.033416748046875,
0.00644683837890625,
0.02056884765625,
-0.01953125,
-0.026702880859375,
0.016754150390625,
0.0163421630859375,
0.0036144256591796875,
0.04052734375,
-0.052825927734375,
0.01136016845703125,
-0.04388427734375,
0.04327392578125,
0.042724609375,
0.01392364501953125,
0.0178070068359375,
-0.037200927734375,
0.0205841064453125,
0.0003478527069091797,
-0.0026416778564453125,
-0.0132293701171875,
-0.041656494140625,
-0.043182373046875,
-0.025146484375,
0.0531005859375,
0.047027587890625,
-0.06280517578125,
0.050750732421875,
-0.034637451171875,
-0.046844482421875,
-0.07135009765625,
0.0100250244140625,
0.03863525390625,
-0.0004515647888183594,
0.052703857421875,
-0.01360321044921875,
-0.034210205078125,
-0.07098388671875,
-0.0032939910888671875,
0.00594329833984375,
-0.00478363037109375,
0.040008544921875,
0.04400634765625,
-0.0291900634765625,
0.0294952392578125,
-0.056060791015625,
-0.02423095703125,
-0.0176239013671875,
-0.00559234619140625,
0.0256500244140625,
0.03582763671875,
0.052520751953125,
-0.075927734375,
-0.04364013671875,
0.0021915435791015625,
-0.054534912109375,
0.005748748779296875,
0.0054168701171875,
-0.02056884765625,
0.0168914794921875,
0.046234130859375,
-0.0308685302734375,
0.0169830322265625,
0.035675048828125,
-0.0177764892578125,
0.0200042724609375,
-0.0025081634521484375,
0.0111846923828125,
-0.10260009765625,
0.00693511962890625,
0.023895263671875,
-0.0100250244140625,
-0.0193328857421875,
0.0391845703125,
0.011505126953125,
0.0161285400390625,
-0.02398681640625,
0.0477294921875,
-0.038330078125,
0.0178375244140625,
0.00734710693359375,
0.041961669921875,
-0.00859832763671875,
0.037109375,
-0.004070281982421875,
0.055389404296875,
0.0296783447265625,
-0.0281829833984375,
0.0118865966796875,
0.039459228515625,
-0.036956787109375,
0.0062255859375,
-0.05010986328125,
-0.006320953369140625,
-0.006603240966796875,
0.013916015625,
-0.06304931640625,
-0.005748748779296875,
0.0205535888671875,
-0.04254150390625,
0.042327880859375,
-0.0221099853515625,
-0.037445068359375,
-0.0287933349609375,
-0.06732177734375,
0.01263427734375,
0.046051025390625,
-0.0504150390625,
0.015869140625,
0.019287109375,
0.00310516357421875,
-0.059722900390625,
-0.0635986328125,
-0.01085662841796875,
-0.0023708343505859375,
-0.040313720703125,
0.040130615234375,
-0.005786895751953125,
0.019775390625,
0.013519287109375,
-0.0076141357421875,
0.01435089111328125,
0.006500244140625,
-0.0003151893615722656,
0.015625,
-0.034454345703125,
0.0021610260009765625,
0.02020263671875,
0.01068115234375,
-0.01552581787109375,
-0.01064300537109375,
0.03289794921875,
-0.01064300537109375,
-0.0229949951171875,
-0.0143890380859375,
0.023223876953125,
0.0218048095703125,
-0.0290985107421875,
0.046295166015625,
0.075927734375,
-0.027008056640625,
-0.00461578369140625,
-0.050750732421875,
-0.00782012939453125,
-0.03656005859375,
0.0360107421875,
-0.0272369384765625,
-0.07293701171875,
0.03216552734375,
-0.00445556640625,
0.01678466796875,
0.0498046875,
0.026458740234375,
-0.00861358642578125,
0.080810546875,
0.0286865234375,
-0.024505615234375,
0.051361083984375,
-0.048919677734375,
0.01557159423828125,
-0.08868408203125,
-0.0031223297119140625,
-0.0276947021484375,
-0.030548095703125,
-0.097900390625,
-0.0350341796875,
0.0040283203125,
0.018096923828125,
-0.0291290283203125,
0.031402587890625,
-0.041961669921875,
0.010986328125,
0.035888671875,
0.0230255126953125,
-0.0028934478759765625,
0.0126190185546875,
-0.03045654296875,
-0.0179901123046875,
-0.044830322265625,
-0.0340576171875,
0.07550048828125,
0.03546142578125,
0.04718017578125,
0.028961181640625,
0.060943603515625,
0.01169586181640625,
0.006587982177734375,
-0.057373046875,
0.042572021484375,
-0.040802001953125,
-0.042694091796875,
-0.026336669921875,
-0.039947509765625,
-0.0859375,
0.02862548828125,
-0.0197906494140625,
-0.059417724609375,
0.0080718994140625,
-0.01406097412109375,
-0.00010216236114501953,
0.03424072265625,
-0.052703857421875,
0.078369140625,
-0.0045013427734375,
-0.023773193359375,
-0.008270263671875,
-0.0330810546875,
0.02520751953125,
0.01110076904296875,
0.005023956298828125,
0.00812530517578125,
-0.0176239013671875,
0.053802490234375,
-0.01458740234375,
0.046173095703125,
-0.012420654296875,
0.00948333740234375,
0.0316162109375,
-0.01276397705078125,
0.042633056640625,
0.005584716796875,
-0.015472412109375,
0.0217132568359375,
0.006076812744140625,
-0.0380859375,
-0.037353515625,
0.069091796875,
-0.053802490234375,
-0.051605224609375,
-0.0280303955078125,
-0.017364501953125,
0.0125579833984375,
0.032470703125,
0.0313720703125,
0.0193023681640625,
-0.007587432861328125,
0.0489501953125,
0.06927490234375,
-0.038482666015625,
0.028289794921875,
0.023193359375,
-0.018280029296875,
-0.043609619140625,
0.08502197265625,
0.0179901123046875,
-0.0029315948486328125,
0.0491943359375,
0.0029048919677734375,
-0.0215301513671875,
-0.0430908203125,
-0.03472900390625,
0.048004150390625,
-0.043304443359375,
-0.01486968994140625,
-0.04705810546875,
-0.03302001953125,
-0.033447265625,
0.0004940032958984375,
-0.01751708984375,
-0.017669677734375,
-0.01215362548828125,
-0.0197906494140625,
0.0207061767578125,
0.03363037109375,
0.00843048095703125,
0.006481170654296875,
-0.053314208984375,
0.015869140625,
-0.006664276123046875,
0.032867431640625,
0.00702667236328125,
-0.045989990234375,
-0.044921875,
0.01218414306640625,
-0.035797119140625,
-0.08099365234375,
0.0248565673828125,
0.006420135498046875,
0.0618896484375,
0.0250396728515625,
-0.00421905517578125,
0.031402587890625,
-0.0390625,
0.07830810546875,
-0.005748748779296875,
-0.0572509765625,
0.034515380859375,
-0.023040771484375,
0.0163116455078125,
0.0443115234375,
0.049896240234375,
-0.036468505859375,
-0.0191650390625,
-0.03863525390625,
-0.07293701171875,
0.037261962890625,
0.01328277587890625,
0.001293182373046875,
-0.0202789306640625,
0.024749755859375,
-0.01125335693359375,
0.0009207725524902344,
-0.060516357421875,
-0.05340576171875,
-0.0251007080078125,
-0.0272369384765625,
-0.01250457763671875,
-0.0204620361328125,
0.016876220703125,
-0.0225067138671875,
0.07696533203125,
-0.00118255615234375,
0.0390625,
0.026123046875,
-0.0236968994140625,
0.0157470703125,
0.015655517578125,
0.0221405029296875,
0.016876220703125,
-0.0301055908203125,
-0.01091766357421875,
0.0247802734375,
-0.043121337890625,
-0.00592803955078125,
0.025115966796875,
-0.03436279296875,
0.0146636962890625,
0.0233306884765625,
0.05609130859375,
0.033447265625,
-0.033660888671875,
0.04425048828125,
0.0101776123046875,
-0.01461029052734375,
-0.020263671875,
-0.0038356781005859375,
0.0229644775390625,
0.0193328857421875,
0.00641632080078125,
-0.03143310546875,
0.02191162109375,
-0.044647216796875,
0.0223388671875,
0.031524658203125,
-0.0245513916015625,
-0.005786895751953125,
0.047149658203125,
0.0015974044799804688,
-0.001499176025390625,
0.037261962890625,
-0.040802001953125,
-0.051116943359375,
0.0301971435546875,
0.02978515625,
0.06396484375,
-0.01263427734375,
0.016387939453125,
0.06671142578125,
0.03680419921875,
-0.02838134765625,
0.026336669921875,
0.0085906982421875,
-0.04302978515625,
-0.035186767578125,
-0.040374755859375,
-0.004100799560546875,
0.0225067138671875,
-0.041748046875,
0.0279083251953125,
-0.0301971435546875,
-0.0097503662109375,
0.002742767333984375,
0.035919189453125,
-0.055023193359375,
0.0100250244140625,
0.003917694091796875,
0.08282470703125,
-0.045135498046875,
0.06060791015625,
0.078125,
-0.06689453125,
-0.0572509765625,
0.00658416748046875,
-0.0101318359375,
-0.0440673828125,
0.02398681640625,
0.018798828125,
0.01337432861328125,
0.00395965576171875,
-0.037689208984375,
-0.07000732421875,
0.11865234375,
0.004375457763671875,
-0.043670654296875,
-0.003925323486328125,
-0.024658203125,
0.036773681640625,
-0.0249176025390625,
0.03228759765625,
0.03082275390625,
0.0445556640625,
-0.00978851318359375,
-0.04876708984375,
0.040924072265625,
-0.0218658447265625,
0.0175933837890625,
0.00676727294921875,
-0.0765380859375,
0.06011962890625,
0.002300262451171875,
-0.023284912109375,
0.0166168212890625,
0.052459716796875,
0.021087646484375,
0.03436279296875,
0.0204620361328125,
0.06988525390625,
0.04949951171875,
-0.01172637939453125,
0.0841064453125,
-0.01776123046875,
0.045989990234375,
0.06494140625,
0.0109405517578125,
0.08221435546875,
0.006641387939453125,
-0.015838623046875,
0.0504150390625,
0.061187744140625,
-0.02618408203125,
0.037200927734375,
0.0036716461181640625,
0.0042572021484375,
-0.0223541259765625,
0.00800323486328125,
-0.04095458984375,
0.0206298828125,
0.0230255126953125,
-0.036895751953125,
0.00209808349609375,
-0.0204925537109375,
0.00800323486328125,
0.005908966064453125,
-0.0036907196044921875,
0.043060302734375,
0.02435302734375,
-0.0350341796875,
0.048553466796875,
0.0170745849609375,
0.07562255859375,
-0.03216552734375,
-0.008514404296875,
-0.0247802734375,
-0.00701141357421875,
-0.015869140625,
-0.058135986328125,
-0.0065155029296875,
-0.018035888671875,
-0.0158538818359375,
0.0085906982421875,
0.043304443359375,
-0.04522705078125,
-0.031524658203125,
0.042724609375,
0.03704833984375,
0.0203399658203125,
0.0128631591796875,
-0.08203125,
0.0034637451171875,
0.0275421142578125,
-0.04022216796875,
0.0253143310546875,
0.037872314453125,
-0.0071563720703125,
0.04388427734375,
0.04229736328125,
0.005542755126953125,
-0.002826690673828125,
0.00374603271484375,
0.039337158203125,
-0.0679931640625,
-0.02313232421875,
-0.044525146484375,
0.021759033203125,
-0.0253753662109375,
-0.0020084381103515625,
0.0579833984375,
0.0548095703125,
0.0819091796875,
-0.0035648345947265625,
0.058135986328125,
-0.008636474609375,
0.0293731689453125,
-0.04412841796875,
0.0655517578125,
-0.0792236328125,
0.01474761962890625,
-0.030548095703125,
-0.0762939453125,
-0.01062774658203125,
0.05487060546875,
-0.025054931640625,
0.019195556640625,
0.052001953125,
0.0736083984375,
-0.0229034423828125,
-0.01537322998046875,
0.023590087890625,
0.034027099609375,
0.01174163818359375,
0.059326171875,
0.0259552001953125,
-0.0711669921875,
0.0479736328125,
-0.01529693603515625,
0.00771331787109375,
-0.040008544921875,
-0.048004150390625,
-0.07110595703125,
-0.0548095703125,
-0.0311431884765625,
-0.0208892822265625,
-0.00102996826171875,
0.06878662109375,
0.0269012451171875,
-0.055267333984375,
-0.0032444000244140625,
0.0196533203125,
0.0301513671875,
-0.0211639404296875,
-0.0201416015625,
0.049713134765625,
-0.0062255859375,
-0.07086181640625,
0.0229644775390625,
-0.006168365478515625,
-0.005634307861328125,
-0.0008168220520019531,
-0.0172576904296875,
-0.06353759765625,
0.00782012939453125,
0.045623779296875,
0.019500732421875,
-0.0657958984375,
-0.035186767578125,
0.0050201416015625,
-0.0196990966796875,
-0.012481689453125,
0.01178741455078125,
-0.0301513671875,
0.026763916015625,
0.04754638671875,
0.059539794921875,
0.052581787109375,
-0.00516510009765625,
0.015625,
-0.0438232421875,
-0.00434112548828125,
-0.004627227783203125,
0.054290771484375,
0.028350830078125,
-0.0227508544921875,
0.069091796875,
0.01534271240234375,
-0.0323486328125,
-0.056365966796875,
0.0033435821533203125,
-0.08160400390625,
-0.026458740234375,
0.08648681640625,
-0.030242919921875,
-0.019378662109375,
0.0235443115234375,
-0.0164794921875,
0.03900146484375,
-0.035888671875,
0.038543701171875,
0.061370849609375,
0.0352783203125,
-0.0119171142578125,
-0.06298828125,
0.0238189697265625,
0.04913330078125,
-0.0199432373046875,
-0.0257568359375,
0.026214599609375,
0.03668212890625,
0.016632080078125,
0.00997161865234375,
-0.018035888671875,
0.02362060546875,
-0.00653839111328125,
0.00012022256851196289,
-0.0096588134765625,
0.01523590087890625,
-0.01421356201171875,
-0.0018072128295898438,
-0.01169586181640625,
-0.0213470458984375
]
] |
ARDICAI/stable-diffusion-2-1-finetuned | 2023-11-06T06:24:44.000Z | [
"diffusers",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | ARDICAI | null | null | ARDICAI/stable-diffusion-2-1-finetuned | 2 | 52,780 | diffusers | 2023-09-21T12:14:05 | ---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
---
### stable-diffusion-2-1-finetuned Dreambooth model trained by the ARDIC AI team
| 158 | [
[
-0.037078857421875,
-0.05792236328125,
0.008941650390625,
0.0173492431640625,
-0.02020263671875,
0.01148223876953125,
0.0243988037109375,
0.00567626953125,
0.0123748779296875,
0.04901123046875,
-0.0214691162109375,
-0.01910400390625,
-0.033843994140625,
-0.0298004150390625,
-0.054595947265625,
0.072998046875,
-0.00220489501953125,
0.03216552734375,
0.015411376953125,
-0.01006317138671875,
-0.034423828125,
0.00986480712890625,
-0.09307861328125,
-0.050384521484375,
0.0031147003173828125,
0.052490234375,
0.042694091796875,
0.020904541015625,
0.0222625732421875,
0.007709503173828125,
-0.014434814453125,
-0.0290679931640625,
-0.049530029296875,
0.0237579345703125,
0.0017490386962890625,
-0.0303955078125,
-0.041229248046875,
-0.001750946044921875,
0.03466796875,
0.017364501953125,
-0.039794921875,
0.03424072265625,
-0.002529144287109375,
0.048583984375,
0.0022449493408203125,
-0.00649261474609375,
-0.0269927978515625,
0.0189971923828125,
-0.01800537109375,
0.035888671875,
-0.01959228515625,
-0.025360107421875,
-0.028839111328125,
-0.044525146484375,
0.0294342041015625,
0.00376129150390625,
0.0594482421875,
0.0323486328125,
-0.02777099609375,
0.0301666259765625,
-0.036468505859375,
0.0178680419921875,
-0.0122833251953125,
0.041107177734375,
0.0341796875,
0.038299560546875,
-0.0139007568359375,
-0.045562744140625,
-0.04254150390625,
0.007434844970703125,
0.03271484375,
-0.00063323974609375,
0.01294708251953125,
0.00984954833984375,
0.005092620849609375,
0.0206298828125,
-0.02191162109375,
0.0267486572265625,
-0.0582275390625,
-0.0255126953125,
0.031890869140625,
0.0159454345703125,
0.00226593017578125,
-0.0020599365234375,
-0.0281524658203125,
0.00482177734375,
-0.036376953125,
-0.00850677490234375,
0.01207733154296875,
0.01419830322265625,
-0.044525146484375,
0.05084228515625,
-0.01885986328125,
0.028350830078125,
0.0298004150390625,
0.0264129638671875,
0.036041259765625,
-0.02239990234375,
-0.04388427734375,
0.005504608154296875,
0.0478515625,
-0.001842498779296875,
0.008636474609375,
0.005199432373046875,
-0.02484130859375,
0.01506805419921875,
0.01305389404296875,
-0.06097412109375,
-0.050994873046875,
0.0030803680419921875,
-0.040863037109375,
-0.046966552734375,
-0.013427734375,
-0.047332763671875,
-0.03155517578125,
-0.0066375732421875,
0.049896240234375,
-0.0341796875,
-0.058135986328125,
0.005176544189453125,
-0.024871826171875,
0.00608062744140625,
0.021148681640625,
-0.0693359375,
0.035430908203125,
0.032073974609375,
0.06304931640625,
0.01140594482421875,
-0.00214385986328125,
-0.0088348388671875,
0.0012617111206054688,
-0.046356201171875,
0.0531005859375,
-0.0015058517456054688,
-0.016357421875,
-0.02484130859375,
-0.00742340087890625,
0.004974365234375,
-0.0195465087890625,
0.028778076171875,
-0.040771484375,
0.01230621337890625,
0.023040771484375,
-0.0313720703125,
-0.0259552001953125,
0.02734375,
-0.043182373046875,
0.043121337890625,
0.0208282470703125,
-0.033447265625,
0.0160064697265625,
-0.09344482421875,
0.00264739990234375,
0.01702880859375,
-0.004360198974609375,
-0.0198211669921875,
-0.014495849609375,
-0.03717041015625,
0.03900146484375,
0.006755828857421875,
0.01275634765625,
-0.00472259521484375,
-0.023956298828125,
-0.01611328125,
-0.043731689453125,
0.06524658203125,
0.04302978515625,
-0.007472991943359375,
0.0243682861328125,
-0.06298828125,
-0.016815185546875,
0.00777435302734375,
-0.00250244140625,
-0.01515960693359375,
-0.03302001953125,
0.0222625732421875,
-0.00986480712890625,
0.0171051025390625,
-0.044769287109375,
0.0229644775390625,
-0.02032470703125,
0.021331787109375,
0.040618896484375,
0.0231475830078125,
0.0275421142578125,
-0.006839752197265625,
0.060638427734375,
0.0162506103515625,
-0.012786865234375,
0.037872314453125,
-0.033966064453125,
-0.0643310546875,
-0.005336761474609375,
0.040313720703125,
0.04705810546875,
-0.024566650390625,
-0.0030841827392578125,
0.0265960693359375,
-0.062469482421875,
-0.0245361328125,
-0.006801605224609375,
0.0157012939453125,
0.03143310546875,
0.0058746337890625,
-0.00147247314453125,
-0.034271240234375,
-0.06854248046875,
0.055267333984375,
0.00202178955078125,
0.02313232421875,
-0.03753662109375,
0.03466796875,
-0.050048828125,
0.0208892822265625,
-0.0184173583984375,
-0.0174560546875,
-0.009979248046875,
0.01528167724609375,
0.04315185546875,
0.0814208984375,
0.052032470703125,
-0.04376220703125,
-0.03076171875,
-0.01300811767578125,
-0.034027099609375,
0.01280975341796875,
0.008392333984375,
-0.037811279296875,
0.02630615234375,
0.029449462890625,
-0.08489990234375,
0.0311279296875,
0.04437255859375,
-0.058013916015625,
0.052398681640625,
-0.003002166748046875,
0.0180206298828125,
-0.0814208984375,
-0.001361846923828125,
-0.016021728515625,
-0.03564453125,
-0.0180816650390625,
0.0224609375,
-0.01079559326171875,
-0.0231475830078125,
-0.049072265625,
0.045806884765625,
-0.0262451171875,
0.0225372314453125,
-0.00983428955078125,
-0.019439697265625,
-0.02069091796875,
0.005123138427734375,
-0.0100555419921875,
0.0633544921875,
0.072021484375,
-0.050262451171875,
0.04473876953125,
0.0526123046875,
-0.0012378692626953125,
0.043853759765625,
-0.037567138671875,
-0.00836944580078125,
0.00981903076171875,
-0.002460479736328125,
-0.08221435546875,
-0.029296875,
0.044281005859375,
-0.025360107421875,
0.028045654296875,
-0.01183319091796875,
-0.041961669921875,
-0.02447509765625,
-0.004638671875,
0.05767822265625,
0.0645751953125,
-0.0008063316345214844,
0.00214385986328125,
0.0426025390625,
0.005615234375,
-0.03729248046875,
-0.037353515625,
-0.0171051025390625,
-0.006244659423828125,
-0.03985595703125,
-0.0025386810302734375,
-0.01000213623046875,
-0.03173828125,
-0.01476287841796875,
-0.0003075599670410156,
-0.0169219970703125,
0.001041412353515625,
0.035186767578125,
0.01149749755859375,
-0.0276031494140625,
0.0201568603515625,
0.0214996337890625,
-0.0235595703125,
-0.00495147705078125,
0.00543975830078125,
0.06622314453125,
-0.03338623046875,
-0.005767822265625,
-0.0537109375,
0.001739501953125,
0.050262451171875,
0.0135498046875,
0.041229248046875,
0.06597900390625,
-0.047393798828125,
0.00026226043701171875,
-0.018157958984375,
-0.01210784912109375,
-0.0303497314453125,
0.021209716796875,
-0.021820068359375,
-0.04913330078125,
0.05755615234375,
0.01248931884765625,
0.010345458984375,
0.0643310546875,
0.0372314453125,
-0.01129150390625,
0.059234619140625,
0.052734375,
0.0207061767578125,
0.0127105712890625,
-0.062042236328125,
-0.0125274658203125,
-0.042510986328125,
-0.034515380859375,
-0.00705718994140625,
-0.03466796875,
-0.0214691162109375,
-0.0242919921875,
0.003910064697265625,
0.03558349609375,
-0.059600830078125,
0.0275115966796875,
-0.0421142578125,
0.04681396484375,
0.036346435546875,
0.0164642333984375,
-0.00872802734375,
-0.0209808349609375,
-0.0166015625,
0.03851318359375,
-0.0400390625,
-0.024993896484375,
0.07342529296875,
0.047760009765625,
0.07415771484375,
0.003620147705078125,
0.051544189453125,
0.036102294921875,
-0.0037975311279296875,
-0.0267486572265625,
0.049072265625,
-0.008331298828125,
-0.07196044921875,
-0.03253173828125,
0.01105499267578125,
-0.07733154296875,
0.00740814208984375,
-0.02545166015625,
-0.039459228515625,
-0.01073455810546875,
0.0265045166015625,
-0.0587158203125,
0.02667236328125,
-0.0433349609375,
0.0927734375,
-0.0010156631469726562,
-0.0109100341796875,
-0.04296875,
-0.050201416015625,
0.0221710205078125,
0.01096343994140625,
-0.01220703125,
-0.02783203125,
-0.00756072998046875,
0.0633544921875,
-0.0134124755859375,
0.040863037109375,
-0.03021240234375,
0.0142974853515625,
0.0163116455078125,
0.035491943359375,
0.007099151611328125,
0.0269775390625,
-0.009185791015625,
-0.0022182464599609375,
0.005466461181640625,
-0.049468994140625,
0.017425537109375,
0.044189453125,
-0.055908203125,
-0.01006317138671875,
-0.0325927734375,
-0.0298309326171875,
0.0086669921875,
0.028350830078125,
0.04638671875,
0.03985595703125,
-0.0303802490234375,
0.005725860595703125,
0.07818603515625,
0.0294342041015625,
0.0450439453125,
0.0127716064453125,
-0.04150390625,
-0.0278778076171875,
0.040435791015625,
-0.0014142990112304688,
0.010528564453125,
-0.00616455078125,
0.0078277587890625,
-0.01026153564453125,
-0.06463623046875,
-0.04095458984375,
0.01247406005859375,
-0.02813720703125,
-0.035736083984375,
-0.033477783203125,
-0.053009033203125,
-0.0283203125,
0.0026264190673828125,
-0.048065185546875,
-0.05780029296875,
-0.06982421875,
-0.0504150390625,
0.048583984375,
0.05133056640625,
-0.03631591796875,
0.03973388671875,
-0.0501708984375,
0.027862548828125,
0.00815582275390625,
0.05194091796875,
-0.02532958984375,
-0.046142578125,
-0.019989013671875,
0.003574371337890625,
-0.030303955078125,
-0.0699462890625,
0.018341064453125,
0.00246429443359375,
0.051605224609375,
0.047576904296875,
-0.0079345703125,
0.035125732421875,
-0.026214599609375,
0.052734375,
0.022735595703125,
-0.04742431640625,
0.0546875,
-0.051025390625,
0.01161956787109375,
0.06707763671875,
0.02105712890625,
-0.0311279296875,
-0.03021240234375,
-0.0816650390625,
-0.047149658203125,
0.0285491943359375,
0.018768310546875,
0.01194000244140625,
0.005481719970703125,
0.027862548828125,
0.0457763671875,
0.0394287109375,
-0.06597900390625,
-0.0251312255859375,
-0.01319122314453125,
-0.0223541259765625,
0.0216522216796875,
-0.012664794921875,
-0.0218658447265625,
-0.04034423828125,
0.06817626953125,
0.021209716796875,
0.05108642578125,
-0.020294189453125,
0.01389312744140625,
-0.0241241455078125,
-0.0205230712890625,
0.06671142578125,
0.048004150390625,
-0.033935546875,
0.0020999908447265625,
-0.005260467529296875,
-0.046966552734375,
0.03851318359375,
0.02130126953125,
-0.01239776611328125,
-0.0151519775390625,
-0.006702423095703125,
0.050628662109375,
-0.0307159423828125,
-0.026580810546875,
0.01113128662109375,
0.01364898681640625,
-0.0181884765625,
-0.07080078125,
0.054779052734375,
-0.0032253265380859375,
0.03692626953125,
0.013336181640625,
0.0816650390625,
0.0180511474609375,
-0.002460479736328125,
0.038055419921875,
0.032470703125,
-0.0623779296875,
-0.0157470703125,
0.0740966796875,
0.038238525390625,
-0.05596923828125,
0.06396484375,
-0.03851318359375,
-0.031951904296875,
0.06890869140625,
0.04364013671875,
0.061004638671875,
-0.02044677734375,
0.0261993408203125,
0.0180816650390625,
-0.003353118896484375,
-0.033782958984375,
0.03521728515625,
0.0135345458984375,
-0.066650390625,
0.0178680419921875,
-0.0276336669921875,
-0.0240936279296875,
0.01084136962890625,
-0.03765869140625,
0.033203125,
-0.06048583984375,
-0.0186004638671875,
-0.0111083984375,
0.0013742446899414062,
-0.040191650390625,
0.028289794921875,
0.030975341796875,
0.10955810546875,
-0.08197021484375,
0.0738525390625,
0.056304931640625,
-0.0258026123046875,
-0.04083251953125,
-0.0237884521484375,
-0.0115203857421875,
-0.0677490234375,
0.0298309326171875,
0.0286712646484375,
0.002277374267578125,
0.04156494140625,
-0.07415771484375,
-0.06854248046875,
0.08245849609375,
0.024810791015625,
-0.039886474609375,
0.039093017578125,
-0.0682373046875,
0.042022705078125,
-0.0408935546875,
-0.007183074951171875,
0.03369140625,
0.0205230712890625,
0.0026721954345703125,
-0.06121826171875,
-0.0263671875,
-0.042938232421875,
-0.0245819091796875,
0.047119140625,
-0.07025146484375,
0.0537109375,
-0.0322265625,
0.023193359375,
0.033782958984375,
0.046295166015625,
0.0290679931640625,
0.0250091552734375,
0.04742431640625,
0.07525634765625,
0.04608154296875,
-0.000995635986328125,
0.0440673828125,
-0.00904083251953125,
0.024444580078125,
0.05609130859375,
0.0221099853515625,
0.06829833984375,
0.0298919677734375,
0.00431060791015625,
0.05230712890625,
0.083251953125,
-0.0063934326171875,
0.055084228515625,
-0.0026988983154296875,
0.00971221923828125,
-0.028900146484375,
0.02056884765625,
-0.048248291015625,
0.021148681640625,
0.032073974609375,
-0.0181732177734375,
0.0154266357421875,
0.0212554931640625,
0.006938934326171875,
-0.0169677734375,
-0.04974365234375,
0.0211334228515625,
-0.00615692138671875,
-0.0264739990234375,
0.03558349609375,
-0.0081787109375,
0.043243408203125,
-0.039642333984375,
-0.01093292236328125,
-0.006191253662109375,
0.0240325927734375,
-0.00426483154296875,
-0.0740966796875,
0.00942230224609375,
-0.031829833984375,
-0.01015472412109375,
-0.010009765625,
0.04669189453125,
-0.037689208984375,
-0.0672607421875,
0.00492095947265625,
0.007633209228515625,
0.02392578125,
0.010284423828125,
-0.05853271484375,
-0.006053924560546875,
-0.0009756088256835938,
-0.0141448974609375,
-0.003856658935546875,
0.0153350830078125,
0.01219940185546875,
0.059814453125,
0.00624847412109375,
0.0050048828125,
-0.0081634521484375,
-0.01073455810546875,
0.02520751953125,
-0.042022705078125,
-0.0477294921875,
-0.057037353515625,
0.050933837890625,
-0.001018524169921875,
-0.045501708984375,
0.0168304443359375,
0.055206298828125,
0.0264129638671875,
-0.0079345703125,
0.028350830078125,
-0.01522064208984375,
0.035003662109375,
-0.0251312255859375,
0.0799560546875,
-0.0233306884765625,
-0.0053558349609375,
0.003360748291015625,
-0.06524658203125,
-0.0111236572265625,
0.0611572265625,
0.0311126708984375,
-0.006145477294921875,
0.030792236328125,
0.03021240234375,
-0.0203399658203125,
0.0098724365234375,
0.007259368896484375,
0.03729248046875,
-0.005382537841796875,
0.029205322265625,
0.0255126953125,
-0.048248291015625,
0.0098876953125,
-0.0214996337890625,
-0.0195770263671875,
-0.020172119140625,
-0.03472900390625,
-0.049530029296875,
-0.032745361328125,
-0.0426025390625,
-0.04510498046875,
0.0037784576416015625,
0.084716796875,
0.06689453125,
-0.07220458984375,
-0.0297698974609375,
-0.035614013671875,
0.02166748046875,
-0.012725830078125,
-0.01494598388671875,
0.021575927734375,
-0.0064239501953125,
-0.024322509765625,
-0.007183074951171875,
0.0037174224853515625,
0.0611572265625,
-0.047515869140625,
-0.01007843017578125,
-0.0207366943359375,
-0.0143890380859375,
0.0185394287109375,
0.0085296630859375,
-0.03369140625,
-0.0236358642578125,
-0.022308349609375,
-0.011260986328125,
0.0029811859130859375,
-0.01172637939453125,
-0.0555419921875,
0.006618499755859375,
0.03973388671875,
0.0142974853515625,
0.0310516357421875,
0.01161956787109375,
0.0496826171875,
-0.0170440673828125,
0.0350341796875,
0.0234375,
0.0241241455078125,
0.0039825439453125,
-0.0298614501953125,
0.0662841796875,
0.02783203125,
-0.058135986328125,
-0.053314208984375,
-0.0020809173583984375,
-0.0975341796875,
-0.00482177734375,
0.0633544921875,
0.036834716796875,
-0.0019388198852539062,
-0.0082855224609375,
-0.039093017578125,
0.0294036865234375,
-0.043548583984375,
0.034637451171875,
0.0184173583984375,
-0.041748046875,
-0.031524658203125,
-0.038848876953125,
0.0234375,
0.0078582763671875,
-0.0295562744140625,
-0.03338623046875,
0.041046142578125,
0.0271148681640625,
0.0306549072265625,
0.04608154296875,
0.005153656005859375,
0.0299224853515625,
0.042999267578125,
0.01044464111328125,
-0.009368896484375,
-0.034027099609375,
-0.01129150390625,
0.0249176025390625,
0.003353118896484375,
-0.06219482421875
]
] |
vinai/phobert-base | 2022-10-22T08:56:25.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"roberta",
"fill-mask",
"arxiv:2003.00744",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | vinai | null | null | vinai/phobert-base | 19 | 52,330 | transformers | 2022-03-02T23:29:05 | # <a name="introduction"></a> PhoBERT: Pre-trained language models for Vietnamese
Pre-trained PhoBERT models are the state-of-the-art language models for Vietnamese ([Pho](https://en.wikipedia.org/wiki/Pho), i.e. "Phở", is a popular food in Vietnam):
- Two PhoBERT versions, "base" and "large", are the first public large-scale monolingual language models pre-trained for Vietnamese. The PhoBERT pre-training approach is based on [RoBERTa](https://github.com/pytorch/fairseq/blob/master/examples/roberta/README.md), which optimizes the [BERT](https://github.com/google-research/bert) pre-training procedure for more robust performance.
- PhoBERT outperforms previous monolingual and multilingual approaches, obtaining new state-of-the-art performance on four downstream Vietnamese NLP tasks: part-of-speech tagging, dependency parsing, named-entity recognition, and natural language inference.
The general architecture and experimental results of PhoBERT can be found in our EMNLP-2020 Findings [paper](https://arxiv.org/abs/2003.00744):
    @article{phobert,
        title   = {{PhoBERT: Pre-trained language models for Vietnamese}},
        author  = {Dat Quoc Nguyen and Anh Tuan Nguyen},
        journal = {Findings of EMNLP},
        year    = {2020}
    }
**Please CITE** our paper when PhoBERT is used to help produce published results or is incorporated into other software.
For further information or requests, please go to [PhoBERT's homepage](https://github.com/VinAIResearch/PhoBERT)!
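As a quick illustration, here is a minimal feature-extraction sketch using the 🤗 `transformers` library. Note that PhoBERT expects word-segmented input (e.g. produced by VnCoreNLP), so the example sentence below is assumed to be pre-segmented, with underscores joining the syllables of each Vietnamese word:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the pre-trained PhoBERT base model and its tokenizer
phobert = AutoModel.from_pretrained("vinai/phobert-base")
tokenizer = AutoTokenizer.from_pretrained("vinai/phobert-base")

# Input text must already be word-segmented (e.g. with VnCoreNLP)
sentence = "Chúng_tôi là những nghiên_cứu_viên ."

input_ids = torch.tensor([tokenizer.encode(sentence)])
with torch.no_grad():
    features = phobert(input_ids)  # features.last_hidden_state has shape (1, seq_len, 768)
```

The contextualized hidden states in `features.last_hidden_state` can then be fed to a downstream task head (tagging, parsing, NER, NLI).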
| 1,500 | [
[
-0.01287078857421875,
-0.0731201171875,
0.028564453125,
0.0125885009765625,
-0.032806396484375,
-0.010223388671875,
-0.0179290771484375,
-0.0228729248046875,
0.0031681060791015625,
0.047119140625,
-0.007366180419921875,
-0.050262451171875,
-0.027313232421875,
0.0077972412109375,
-0.00850677490234375,
0.05706787109375,
0.00644683837890625,
0.005832672119140625,
0.03973388671875,
-0.00366973876953125,
-0.01177978515625,
-0.07940673828125,
-0.03607177734375,
-0.01271820068359375,
0.031585693359375,
0.019287109375,
0.036102294921875,
0.044219970703125,
0.0408935546875,
0.0238800048828125,
-0.0008478164672851562,
0.0115509033203125,
-0.033966064453125,
-0.0112457275390625,
0.0027065277099609375,
-0.000942230224609375,
-0.053009033203125,
-0.0087127685546875,
0.034423828125,
0.030181884765625,
-0.01361083984375,
0.00914764404296875,
-0.00036525726318359375,
0.0361328125,
-0.058258056640625,
0.00269317626953125,
-0.036163330078125,
-0.0009469985961914062,
-0.018829345703125,
-0.0213470458984375,
-0.04498291015625,
-0.0347900390625,
0.048858642578125,
-0.0419921875,
-0.0216217041015625,
-0.006809234619140625,
0.08489990234375,
0.01300048828125,
-0.048675537109375,
0.0009236335754394531,
-0.06072998046875,
0.055084228515625,
-0.06298828125,
0.04718017578125,
0.03363037109375,
0.01146697998046875,
-0.0009145736694335938,
-0.04461669921875,
-0.03070068359375,
-0.044281005859375,
-0.01090240478515625,
0.0192413330078125,
-0.0069427490234375,
0.0193939208984375,
-0.0036029815673828125,
0.027587890625,
-0.07080078125,
0.00193023681640625,
-0.02178955078125,
-0.0106964111328125,
0.033050537109375,
-0.035797119140625,
-0.0023746490478515625,
-0.0208587646484375,
-0.050262451171875,
-0.008026123046875,
-0.028472900390625,
-0.00974273681640625,
-0.005832672119140625,
0.0227203369140625,
-0.0185394287109375,
0.04620361328125,
0.002605438232421875,
0.0755615234375,
-0.00312042236328125,
-0.0243682861328125,
0.04302978515625,
-0.0253143310546875,
-0.032501220703125,
0.0086212158203125,
0.06573486328125,
0.001873016357421875,
0.049468994140625,
0.01251983642578125,
-0.00962066650390625,
-0.0176239013671875,
0.00003159046173095703,
-0.041229248046875,
-0.024200439453125,
0.0234832763671875,
-0.0260009765625,
0.006023406982421875,
0.0175628662109375,
-0.04290771484375,
-0.00749969482421875,
-0.0265350341796875,
0.047821044921875,
-0.0538330078125,
-0.06353759765625,
0.0275421142578125,
0.005008697509765625,
0.028900146484375,
0.0095672607421875,
-0.030914306640625,
0.0001881122589111328,
0.053802490234375,
0.06060791015625,
-0.01458740234375,
-0.0489501953125,
-0.03131103515625,
0.00749969482421875,
-0.00537109375,
0.054412841796875,
-0.018157958984375,
-0.0223846435546875,
0.016815185546875,
-0.0012044906616210938,
-0.007396697998046875,
-0.053497314453125,
0.042205810546875,
-0.0203399658203125,
0.011749267578125,
0.019927978515625,
-0.05303955078125,
-0.0216064453125,
0.009765625,
-0.04425048828125,
0.0899658203125,
0.029876708984375,
-0.06866455078125,
0.022186279296875,
-0.0382080078125,
-0.035430908203125,
-0.0010900497436523438,
0.0107574462890625,
-0.033355712890625,
0.0010433197021484375,
0.0125732421875,
0.03765869140625,
-0.00995635986328125,
0.00911712646484375,
-0.0152130126953125,
-0.0097198486328125,
0.0169830322265625,
-0.00799560546875,
0.08087158203125,
0.01413726806640625,
-0.023406982421875,
0.0313720703125,
-0.0875244140625,
0.001880645751953125,
0.014801025390625,
-0.0325927734375,
-0.03887939453125,
-0.03216552734375,
0.0151519775390625,
0.022369384765625,
0.02398681640625,
-0.021759033203125,
-0.004669189453125,
-0.0323486328125,
0.032745361328125,
0.055572509765625,
-0.0032176971435546875,
0.030792236328125,
-0.01375579833984375,
0.037139892578125,
0.0026397705078125,
0.019439697265625,
-0.035125732421875,
-0.037811279296875,
-0.06512451171875,
-0.0426025390625,
0.0160980224609375,
0.07818603515625,
-0.044036865234375,
0.0703125,
-0.006877899169921875,
-0.0662841796875,
-0.04986572265625,
-0.0006022453308105469,
0.0278167724609375,
0.047027587890625,
0.032745361328125,
-0.019073486328125,
-0.05859375,
-0.054229736328125,
-0.017791748046875,
-0.042205810546875,
-0.006038665771484375,
-0.0096282958984375,
0.0284576416015625,
-0.012939453125,
0.0748291015625,
-0.0212860107421875,
-0.01348114013671875,
-0.0220184326171875,
0.01175689697265625,
0.007965087890625,
0.04486083984375,
0.0479736328125,
-0.07049560546875,
-0.043243408203125,
0.0105133056640625,
-0.04107666015625,
-0.0012578964233398438,
0.02398681640625,
-0.0146484375,
0.0195465087890625,
0.038543701171875,
-0.039886474609375,
0.02105712890625,
0.0545654296875,
-0.01357269287109375,
0.060302734375,
-0.00902557373046875,
-0.005092620849609375,
-0.0765380859375,
-0.0016527175903320312,
0.0011749267578125,
-0.0221099853515625,
-0.037994384765625,
-0.01275634765625,
-0.00893402099609375,
-0.01348114013671875,
-0.063232421875,
0.0501708984375,
-0.025634765625,
0.021148681640625,
-0.0113372802734375,
-0.0101776123046875,
-0.006195068359375,
0.0400390625,
0.03448486328125,
0.040557861328125,
0.042877197265625,
-0.0552978515625,
0.033905029296875,
-0.0098114013671875,
-0.0162506103515625,
0.026702880859375,
-0.058685302734375,
0.01366424560546875,
0.0148468017578125,
0.0040435791015625,
-0.0592041015625,
0.004146575927734375,
0.029388427734375,
-0.0266571044921875,
0.0031280517578125,
-0.020111083984375,
-0.0396728515625,
-0.023651123046875,
-0.005733489990234375,
0.018585205078125,
0.03558349609375,
-0.0213623046875,
0.050262451171875,
0.035400390625,
-0.00182342529296875,
-0.034393310546875,
-0.04547119140625,
-0.024261474609375,
-0.042083740234375,
-0.02777099609375,
-0.0006384849548339844,
0.0003237724304199219,
-0.00836181640625,
-0.006244659423828125,
0.0158538818359375,
-0.042816162109375,
0.00215911865234375,
-0.0005307197570800781,
-0.00044989585876464844,
-0.05029296875,
0.02001953125,
-0.04290771484375,
-0.03045654296875,
-0.0180511474609375,
-0.05645751953125,
0.053680419921875,
-0.030487060546875,
-0.006134033203125,
-0.047027587890625,
0.017669677734375,
0.0274810791015625,
-0.057525634765625,
0.0545654296875,
0.057373046875,
-0.024566650390625,
0.00417327880859375,
-0.046051025390625,
-0.020172119140625,
-0.0347900390625,
0.036895751953125,
-0.035888671875,
-0.07012939453125,
-0.0016012191772460938,
-0.00946807861328125,
0.0019102096557617188,
0.0115814208984375,
0.04754638671875,
0.0121917724609375,
0.04937744140625,
0.07342529296875,
-0.00478363037109375,
0.062469482421875,
0.005481719970703125,
0.0074462890625,
-0.007778167724609375,
0.01126861572265625,
-0.03436279296875,
0.029205322265625,
-0.06646728515625,
-0.03759765625,
0.0018301010131835938,
0.00240325927734375,
-0.042724609375,
0.04052734375,
-0.04107666015625,
0.005619049072265625,
0.06689453125,
-0.01322174072265625,
0.03533935546875,
0.0156402587890625,
-0.00789642333984375,
-0.00943756103515625,
-0.058135986328125,
-0.052398681640625,
0.07403564453125,
0.02178955078125,
0.05224609375,
-0.0186614990234375,
0.053070068359375,
-0.00995635986328125,
0.01357269287109375,
-0.05206298828125,
0.03875732421875,
-0.00797271728515625,
-0.049560546875,
-0.02203369140625,
-0.036712646484375,
-0.0750732421875,
0.026214599609375,
-0.01062774658203125,
-0.054473876953125,
0.004119873046875,
0.0256500244140625,
-0.0170745849609375,
0.0161895751953125,
-0.0745849609375,
0.087158203125,
-0.050872802734375,
-0.0017366409301757812,
0.01244354248046875,
-0.0396728515625,
0.01306915283203125,
-0.00804901123046875,
-0.001644134521484375,
-0.00930023193359375,
0.0023937225341796875,
0.043670654296875,
-0.0274505615234375,
0.03570556640625,
-0.004436492919921875,
-0.0165252685546875,
0.031524658203125,
-0.0087890625,
0.01012420654296875,
0.01126861572265625,
-0.0186309814453125,
0.0260467529296875,
-0.01322174072265625,
-0.032470703125,
-0.0312347412109375,
0.011688232421875,
-0.05853271484375,
-0.0242767333984375,
-0.044647216796875,
-0.0090484619140625,
0.0047149658203125,
0.038543701171875,
0.03485107421875,
-0.0015039443969726562,
-0.0286865234375,
0.005580902099609375,
0.0340576171875,
-0.034637451171875,
-0.006500244140625,
0.0665283203125,
-0.038055419921875,
-0.037200927734375,
0.07177734375,
0.0186920166015625,
0.01025390625,
0.06842041015625,
0.020538330078125,
0.00010567903518676758,
-0.0184783935546875,
-0.00284576416015625,
0.042449951171875,
-0.027130126953125,
0.0164947509765625,
-0.0550537109375,
-0.0286865234375,
-0.04022216796875,
0.002887725830078125,
-0.059783935546875,
-0.029083251953125,
-0.0034275054931640625,
-0.021453857421875,
0.033233642578125,
0.032196044921875,
-0.0195770263671875,
0.050262451171875,
-0.05078125,
0.021331787109375,
0.020843505859375,
-0.002483367919921875,
-0.01546478271484375,
-0.0225677490234375,
-0.0288238525390625,
-0.00635528564453125,
-0.01529693603515625,
-0.06634521484375,
0.027587890625,
0.01261138916015625,
0.0155181884765625,
0.04425048828125,
0.006084442138671875,
0.037933349609375,
-0.04376220703125,
0.036376953125,
0.0011043548583984375,
-0.062469482421875,
0.064208984375,
-0.0169219970703125,
0.0239105224609375,
0.03228759765625,
0.04193115234375,
-0.033172607421875,
-0.02777099609375,
-0.045074462890625,
-0.08123779296875,
0.032012939453125,
0.0201416015625,
-0.0230255126953125,
0.0163421630859375,
0.0063934326171875,
0.007541656494140625,
0.02960205078125,
-0.07232666015625,
-0.028961181640625,
-0.030487060546875,
-0.00212860107421875,
-0.025848388671875,
-0.022979736328125,
0.0147705078125,
-0.030059814453125,
0.05108642578125,
0.0235595703125,
0.0161590576171875,
0.01082611083984375,
-0.021026611328125,
0.0157012939453125,
0.024505615234375,
0.03375244140625,
0.06805419921875,
-0.0482177734375,
0.011688232421875,
-0.0095367431640625,
-0.0416259765625,
0.019287109375,
0.03680419921875,
-0.01296234130859375,
0.0296783447265625,
0.029296875,
0.05340576171875,
0.01050567626953125,
-0.05694580078125,
0.027252197265625,
-0.0079193115234375,
0.01332855224609375,
-0.04931640625,
-0.0118255615234375,
-0.0037631988525390625,
-0.00382232666015625,
0.0290985107421875,
-0.0261993408203125,
-0.014434814453125,
-0.0252227783203125,
0.0264129638671875,
-0.01007080078125,
-0.0306243896484375,
-0.0350341796875,
0.028594970703125,
0.0255279541015625,
-0.0286712646484375,
0.044952392578125,
-0.01525115966796875,
-0.048004150390625,
0.0308685302734375,
0.02703857421875,
0.0599365234375,
-0.05340576171875,
0.03009033203125,
0.0340576171875,
0.051422119140625,
0.01076507568359375,
0.0287628173828125,
0.00981903076171875,
-0.054107666015625,
-0.0263519287109375,
-0.040069580078125,
-0.0178375244140625,
0.04071044921875,
-0.035064697265625,
0.028289794921875,
-0.02459716796875,
-0.0274200439453125,
-0.01259613037109375,
0.002056121826171875,
-0.0489501953125,
0.01549530029296875,
0.005397796630859375,
0.06256103515625,
-0.04754638671875,
0.0733642578125,
0.07794189453125,
-0.038543701171875,
-0.05999755859375,
-0.003391265869140625,
0.003185272216796875,
-0.043853759765625,
0.034210205078125,
0.0173797607421875,
0.0026569366455078125,
0.01239776611328125,
-0.017578125,
-0.05877685546875,
0.034759521484375,
0.06134033203125,
-0.0225830078125,
-0.00786590576171875,
0.024658203125,
0.0272216796875,
-0.018157958984375,
0.013916015625,
0.041717529296875,
0.044281005859375,
-0.032623291015625,
-0.0885009765625,
-0.026336669921875,
-0.021759033203125,
-0.01030731201171875,
-0.00769805908203125,
-0.055023193359375,
0.0853271484375,
0.0015048980712890625,
-0.00885772705078125,
0.0098114013671875,
0.06494140625,
0.041229248046875,
0.0207366943359375,
0.038360595703125,
0.033538818359375,
0.057373046875,
-0.00426483154296875,
0.064208984375,
-0.027252197265625,
0.025909423828125,
0.10369873046875,
-0.0227203369140625,
0.071533203125,
0.0237884521484375,
-0.006961822509765625,
0.0278472900390625,
0.0673828125,
0.00550079345703125,
0.0279388427734375,
0.0099334716796875,
0.0122528076171875,
-0.0158538818359375,
-0.012451171875,
-0.0589599609375,
0.051239013671875,
0.009521484375,
-0.0033626556396484375,
-0.0027523040771484375,
0.0260162353515625,
0.021636962890625,
-0.0011425018310546875,
0.00653839111328125,
0.043426513671875,
0.0229034423828125,
-0.037139892578125,
0.0626220703125,
-0.007282257080078125,
0.059814453125,
-0.07183837890625,
-0.0010328292846679688,
0.0007061958312988281,
0.039337158203125,
0.0008563995361328125,
-0.024993896484375,
-0.019195556640625,
-0.0058135986328125,
0.003009796142578125,
-0.0152435302734375,
0.0450439453125,
-0.044342041015625,
-0.040283203125,
0.053497314453125,
0.05950927734375,
0.0159912109375,
-0.0203094482421875,
-0.06439208984375,
-0.007129669189453125,
-0.02276611328125,
-0.038055419921875,
-0.004150390625,
0.05841064453125,
0.0187835693359375,
0.03765869140625,
0.0269012451171875,
0.01800537109375,
0.0268402099609375,
-0.0030975341796875,
0.042449951171875,
-0.043121337890625,
-0.06097412109375,
-0.0579833984375,
0.03924560546875,
0.0093841552734375,
-0.05029296875,
0.08331298828125,
0.056243896484375,
0.103515625,
-0.0174713134765625,
0.062469482421875,
0.0179443359375,
0.061309814453125,
-0.0325927734375,
0.0465087890625,
-0.050872802734375,
-0.02001953125,
-0.04425048828125,
-0.0791015625,
-0.0199432373046875,
0.07342529296875,
-0.0199127197265625,
0.0188751220703125,
0.055419921875,
0.0517578125,
-0.00637054443359375,
-0.0167388916015625,
0.0220184326171875,
0.04534912109375,
0.01824951171875,
0.034332275390625,
0.0711669921875,
-0.0154571533203125,
0.042877197265625,
-0.037384033203125,
-0.023590087890625,
-0.038177490234375,
-0.053070068359375,
-0.0615234375,
-0.052825927734375,
-0.0267486572265625,
-0.0114898681640625,
0.029205322265625,
0.07159423828125,
0.0784912109375,
-0.06610107421875,
-0.05389404296875,
0.0048065185546875,
0.010711669921875,
-0.0310821533203125,
-0.01473236083984375,
0.040679931640625,
-0.00182342529296875,
-0.054901123046875,
0.017303466796875,
0.022430419921875,
0.0165252685546875,
-0.00997161865234375,
-0.0098419189453125,
-0.049652099609375,
-0.0111236572265625,
0.0587158203125,
0.03570556640625,
-0.043853759765625,
0.00789642333984375,
0.00011944770812988281,
-0.0080718994140625,
0.0235595703125,
0.06793212890625,
-0.056793212890625,
0.031982421875,
0.028167724609375,
0.033782958984375,
0.020965576171875,
-0.0056304931640625,
0.057464599609375,
-0.038116455078125,
0.0653076171875,
0.027130126953125,
0.0184478759765625,
0.03515625,
0.0006632804870605469,
0.047821044921875,
0.00901031494140625,
-0.031890869140625,
-0.06829833984375,
0.021759033203125,
-0.07122802734375,
-0.018402099609375,
0.08343505859375,
-0.0278167724609375,
-0.022125244140625,
-0.0182342529296875,
-0.0281219482421875,
0.04132080078125,
-0.03558349609375,
0.0308380126953125,
0.04010009765625,
0.0165557861328125,
-0.02227783203125,
-0.05078125,
0.040069580078125,
0.034332275390625,
-0.057373046875,
-0.0031909942626953125,
0.01309967041015625,
-0.006366729736328125,
0.011688232421875,
0.06195068359375,
0.009002685546875,
0.0173797607421875,
-0.023162841796875,
0.00024080276489257812,
0.01178741455078125,
-0.0271148681640625,
-0.0107574462890625,
0.00493621826171875,
0.009002685546875,
-0.01146697998046875
]
] |
cerspense/zeroscope_v2_576w | 2023-07-01T07:24:16.000Z | [
"diffusers",
"text-to-video",
"license:cc-by-nc-4.0",
"has_space",
"diffusers:TextToVideoSDPipeline",
"region:us"
] | text-to-video | cerspense | null | null | cerspense/zeroscope_v2_576w | 348 | 52,233 | diffusers | 2023-06-21T19:10:41 | ---
pipeline_tag: text-to-video
license: cc-by-nc-4.0
---

# zeroscope_v2 576w
A watermark-free Modelscope-based video model optimized for producing high-quality 16:9 compositions and smooth video output. This model was trained from the [original weights](https://huggingface.co/damo-vilab/modelscope-damo-text-to-video-synthesis) on 9,923 clips and 29,769 tagged frames, at 24 frames and 576x320 resolution.<br />
zeroscope_v2_576w is specifically designed for upscaling with [zeroscope_v2_XL](https://huggingface.co/cerspense/zeroscope_v2_XL) using vid2vid in the [1111 text2video](https://github.com/kabachuha/sd-webui-text2video) extension by [kabachuha](https://github.com/kabachuha). Leveraging this model as a preliminary step allows for superior overall compositions at higher resolutions in zeroscope_v2_XL, permitting faster exploration in 576x320 before transitioning to a high-resolution render. See some [example outputs](https://www.youtube.com/watch?v=HO3APT_0UA4) that have been upscaled to 1024x576 using zeroscope_v2_XL (courtesy of [dotsimulate](https://www.instagram.com/dotsimulate/)).<br />
zeroscope_v2_576w uses 7.9 GB of VRAM when rendering 30 frames at 576x320.
### Using it with the 1111 text2video extension
1. Download files in the zs2_576w folder.
2. Replace the respective files in the `stable-diffusion-webui\models\ModelScope\t2v` directory.
### Upscaling recommendations
For upscaling, it's recommended to use [zeroscope_v2_XL](https://huggingface.co/cerspense/zeroscope_v2_XL) via vid2vid in the 1111 extension. It works best at 1024x576 with a denoise strength between 0.66 and 0.85. Remember to use the same prompt that was used to generate the original clip. <br />
### Usage in 🧨 Diffusers
First, install the required libraries:
```bash
$ pip install diffusers transformers accelerate torch
```
Now, generate a video:
```py
import torch
from diffusers import DiffusionPipeline, DPMSolverMultistepScheduler
from diffusers.utils import export_to_video

# Load the model in half precision; the multistep DPM-Solver scheduler speeds up sampling
pipe = DiffusionPipeline.from_pretrained("cerspense/zeroscope_v2_576w", torch_dtype=torch.float16)
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()  # offload submodules to CPU to reduce VRAM usage

prompt = "Darth Vader is surfing on waves"
video_frames = pipe(prompt, num_inference_steps=40, height=320, width=576, num_frames=24).frames
video_path = export_to_video(video_frames)  # writes the frames to a video file and returns its path
```
Here are some results:
<table>
<tr>
<td><center>
Darth Vader is surfing on waves.
<br>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/darthvader_cerpense.gif"
alt="Darth Vader surfing on waves."
style="width: 576px;" />
</center></td>
</tr>
</table>
### Known issues
Lower resolutions or fewer frames could lead to suboptimal output. <br />
Thanks to [camenduru](https://github.com/camenduru), [kabachuha](https://github.com/kabachuha), [ExponentialML](https://github.com/ExponentialML), [dotsimulate](https://www.instagram.com/dotsimulate/), [VANYA](https://twitter.com/veryVANYA), [polyware](https://twitter.com/polyware_ai), [tin2tin](https://github.com/tin2tin)<br /> | 3,230 | [
[
-0.041839599609375,
-0.048126220703125,
0.038055419921875,
0.003082275390625,
-0.037933349609375,
-0.01312255859375,
-0.0017576217651367188,
-0.008331298828125,
0.025299072265625,
0.02490234375,
-0.035125732421875,
-0.0301513671875,
-0.0557861328125,
-0.0256500244140625,
-0.03173828125,
0.0616455078125,
-0.005893707275390625,
-0.00737762451171875,
-0.01531219482421875,
-0.0005030632019042969,
-0.02911376953125,
-0.01181793212890625,
-0.0171966552734375,
-0.0208282470703125,
-0.00943756103515625,
0.057769775390625,
0.042938232421875,
0.053558349609375,
0.038421630859375,
0.0217437744140625,
-0.001434326171875,
0.011627197265625,
-0.04443359375,
-0.006580352783203125,
0.003459930419921875,
-0.010162353515625,
-0.039703369140625,
-0.0011262893676757812,
0.069091796875,
-0.00439453125,
-0.0030155181884765625,
0.046875,
-0.00756072998046875,
0.045166015625,
-0.0496826171875,
0.0011129379272460938,
-0.0080108642578125,
-0.0036487579345703125,
-0.0146026611328125,
-0.0164337158203125,
-0.0105743408203125,
-0.000385284423828125,
0.0014247894287109375,
-0.06781005859375,
0.02978515625,
-0.017547607421875,
0.10516357421875,
0.027191162109375,
-0.033782958984375,
0.03515625,
-0.06024169921875,
0.059173583984375,
-0.059722900390625,
0.032745361328125,
0.0026836395263671875,
0.027252197265625,
0.003963470458984375,
-0.059722900390625,
-0.0288238525390625,
0.004486083984375,
0.0247802734375,
0.039276123046875,
-0.0380859375,
0.0019683837890625,
0.033966064453125,
0.03753662109375,
-0.046966552734375,
-0.00041365623474121094,
-0.036773681640625,
0.005687713623046875,
0.051544189453125,
0.011505126953125,
0.0198974609375,
-0.0234375,
-0.0259857177734375,
-0.041351318359375,
-0.020782470703125,
0.008331298828125,
0.0118255615234375,
-0.0211181640625,
-0.0460205078125,
0.052581787109375,
-0.01454925537109375,
0.041778564453125,
0.0228271484375,
-0.018096923828125,
0.01509857177734375,
-0.015838623046875,
-0.037200927734375,
-0.01120758056640625,
0.04937744140625,
0.06591796875,
0.01042938232421875,
0.03497314453125,
0.005157470703125,
0.023895263671875,
0.018951416015625,
-0.09283447265625,
-0.0199737548828125,
0.0207061767578125,
-0.0298919677734375,
-0.0186767578125,
0.0002987384796142578,
-0.0872802734375,
0.0012187957763671875,
-0.0077056884765625,
0.04986572265625,
-0.02520751953125,
-0.038726806640625,
0.002437591552734375,
-0.04046630859375,
0.01031494140625,
0.026153564453125,
-0.05426025390625,
0.0263671875,
0.016448974609375,
0.0753173828125,
0.0205078125,
0.00311279296875,
-0.04425048828125,
0.01015472412109375,
-0.0254669189453125,
0.04278564453125,
-0.00830078125,
-0.03497314453125,
-0.0176239013671875,
-0.0014505386352539062,
0.0291595458984375,
-0.0301971435546875,
0.03240966796875,
-0.022430419921875,
0.01824951171875,
-0.004734039306640625,
-0.0504150390625,
-0.019989013671875,
-0.007205963134765625,
-0.02069091796875,
0.08203125,
0.02923583984375,
-0.057159423828125,
0.0092926025390625,
-0.043304443359375,
0.00675201416015625,
-0.00391387939453125,
-0.00983428955078125,
-0.042877197265625,
0.008087158203125,
0.0023441314697265625,
0.013580322265625,
-0.0225067138671875,
-0.00841522216796875,
-0.0247039794921875,
-0.03192138671875,
0.0047607421875,
-0.039459228515625,
0.0537109375,
0.01617431640625,
-0.04071044921875,
0.015655517578125,
-0.06695556640625,
0.00888824462890625,
0.00847625732421875,
0.003635406494140625,
-0.00298309326171875,
-0.01480865478515625,
0.01043701171875,
0.01294708251953125,
-0.0016336441040039062,
-0.050567626953125,
-0.00316619873046875,
-0.03485107421875,
0.0355224609375,
0.048187255859375,
0.0020160675048828125,
0.039276123046875,
-0.016265869140625,
0.044464111328125,
0.0038394927978515625,
0.03082275390625,
-0.0029315948486328125,
-0.054656982421875,
-0.059906005859375,
-0.0117340087890625,
-0.0009684562683105469,
0.0279693603515625,
-0.049163818359375,
-0.002941131591796875,
-0.020721435546875,
-0.0667724609375,
-0.03338623046875,
0.01200103759765625,
0.03179931640625,
0.058685302734375,
0.036865234375,
-0.06390380859375,
-0.04522705078125,
-0.05517578125,
0.0225372314453125,
-0.014495849609375,
-0.036163330078125,
0.0225067138671875,
0.0298614501953125,
-0.0024051666259765625,
0.061981201171875,
-0.06573486328125,
-0.035003662109375,
0.004856109619140625,
-0.00620269775390625,
0.0231475830078125,
0.0226593017578125,
0.060028076171875,
-0.053436279296875,
-0.039306640625,
-0.0028896331787109375,
-0.0667724609375,
0.00482940673828125,
0.01428985595703125,
-0.0113525390625,
0.0020580291748046875,
0.017730712890625,
-0.050872802734375,
0.034149169921875,
0.05401611328125,
-0.034637451171875,
0.0499267578125,
-0.04730224609375,
0.0161590576171875,
-0.08416748046875,
-0.005645751953125,
0.04339599609375,
-0.02862548828125,
-0.043914794921875,
0.005802154541015625,
0.0029773712158203125,
-0.006847381591796875,
-0.048583984375,
0.033538818359375,
-0.032379150390625,
-0.005825042724609375,
-0.0135040283203125,
0.01262664794921875,
0.016143798828125,
0.02850341796875,
0.0001806020736694336,
0.0460205078125,
0.044158935546875,
-0.034423828125,
0.048614501953125,
0.021820068359375,
-0.0106964111328125,
0.03668212890625,
-0.07373046875,
-0.0093536376953125,
-0.01143646240234375,
0.004802703857421875,
-0.07080078125,
-0.05072021484375,
0.01212310791015625,
-0.062164306640625,
0.019744873046875,
-0.0235137939453125,
-0.0159454345703125,
-0.0279693603515625,
-0.05908203125,
0.022705078125,
0.07525634765625,
-0.024139404296875,
0.0260009765625,
0.040191650390625,
0.02484130859375,
-0.028045654296875,
-0.07269287109375,
-0.0082550048828125,
-0.0231170654296875,
-0.054718017578125,
0.0518798828125,
0.0002999305725097656,
-0.0192413330078125,
0.0100250244140625,
-0.0020046234130859375,
-0.0035037994384765625,
-0.0384521484375,
0.044708251953125,
0.044158935546875,
-0.021484375,
-0.024658203125,
-0.004352569580078125,
-0.0054931640625,
-0.00559234619140625,
-0.0128631591796875,
0.0217437744140625,
-0.00989532470703125,
-0.0016679763793945312,
-0.05096435546875,
0.0199737548828125,
0.05157470703125,
0.01259613037109375,
0.01157379150390625,
0.075927734375,
-0.0266876220703125,
0.01316070556640625,
-0.03717041015625,
-0.0185394287109375,
-0.041748046875,
0.023895263671875,
-0.00543975830078125,
-0.049713134765625,
0.03216552734375,
0.01282501220703125,
0.0003421306610107422,
0.04339599609375,
0.052154541015625,
-0.017364501953125,
0.07757568359375,
0.041351318359375,
0.0243988037109375,
0.057342529296875,
-0.06341552734375,
-0.01314544677734375,
-0.0626220703125,
-0.01165771484375,
0.0008220672607421875,
-0.00566864013671875,
-0.05596923828125,
-0.056488037109375,
0.04693603515625,
0.01493072509765625,
-0.040618896484375,
0.04339599609375,
-0.0557861328125,
0.01763916015625,
0.043426513671875,
0.00795745849609375,
0.01076507568359375,
0.0223541259765625,
0.0103302001953125,
-0.033966064453125,
-0.0423583984375,
-0.024658203125,
0.07684326171875,
0.0186767578125,
0.05419921875,
0.0160675048828125,
0.031890869140625,
0.0276947021484375,
-0.00542449951171875,
-0.037628173828125,
0.045562744140625,
-0.0258941650390625,
-0.04443359375,
-0.007049560546875,
-0.018218994140625,
-0.054718017578125,
0.01531219482421875,
-0.0247039794921875,
-0.055206298828125,
0.01552581787109375,
0.0207672119140625,
-0.0288543701171875,
0.03900146484375,
-0.07440185546875,
0.055938720703125,
0.0012674331665039062,
-0.057891845703125,
-0.01236724853515625,
-0.05328369140625,
0.0220489501953125,
0.0231781005859375,
0.01081085205078125,
-0.00591278076171875,
-0.0025234222412109375,
0.0528564453125,
-0.0413818359375,
0.045928955078125,
-0.016937255859375,
0.01279449462890625,
0.046539306640625,
0.0019407272338867188,
0.006557464599609375,
0.0269317626953125,
0.0180816650390625,
0.020111083984375,
0.0150146484375,
-0.034393310546875,
-0.032318115234375,
0.0653076171875,
-0.07623291015625,
-0.0305328369140625,
-0.02874755859375,
-0.01284027099609375,
0.021087646484375,
0.0107269287109375,
0.047943115234375,
0.053558349609375,
0.0011196136474609375,
0.00403594970703125,
0.0418701171875,
-0.00025463104248046875,
0.045654296875,
0.029052734375,
-0.02813720703125,
-0.04913330078125,
0.0745849609375,
0.0210418701171875,
0.030120849609375,
0.013580322265625,
-0.0026798248291015625,
-0.0176849365234375,
-0.0157012939453125,
-0.06195068359375,
0.0141448974609375,
-0.02850341796875,
-0.039794921875,
-0.0175018310546875,
-0.026580810546875,
-0.050567626953125,
-0.01519012451171875,
-0.05804443359375,
-0.033477783203125,
-0.0516357421875,
-0.002506256103515625,
0.057861328125,
0.044647216796875,
-0.0323486328125,
0.019683837890625,
-0.050537109375,
0.033294677734375,
0.034332275390625,
0.0164642333984375,
-0.00722503662109375,
-0.05694580078125,
-0.0157623291015625,
0.006664276123046875,
-0.06298828125,
-0.05853271484375,
0.05242919921875,
0.00858306884765625,
0.0108795166015625,
0.037445068359375,
-0.01171875,
0.06756591796875,
-0.008392333984375,
0.08282470703125,
0.038482666015625,
-0.064453125,
0.051788330078125,
-0.03009033203125,
0.033050537109375,
0.00634002685546875,
0.0217132568359375,
-0.038421630859375,
-0.01568603515625,
-0.05682373046875,
-0.07818603515625,
0.049896240234375,
0.03253173828125,
0.011199951171875,
0.0224456787109375,
0.02301025390625,
-0.00995635986328125,
-0.0204010009765625,
-0.02288818359375,
-0.030242919921875,
-0.045318603515625,
0.00738525390625,
-0.016845703125,
-0.0281524658203125,
0.002178192138671875,
-0.043426513671875,
0.052734375,
0.001018524169921875,
0.030548095703125,
0.05731201171875,
-0.0004527568817138672,
-0.04193115234375,
0.0011539459228515625,
0.039459228515625,
0.039459228515625,
-0.051971435546875,
0.0009946823120117188,
0.005176544189453125,
-0.048126220703125,
0.0202789306640625,
0.002010345458984375,
-0.028167724609375,
0.0273284912109375,
0.0101776123046875,
0.06536865234375,
0.007480621337890625,
-0.04364013671875,
0.04534912109375,
-0.003406524658203125,
-0.0269775390625,
-0.041015625,
0.00826263427734375,
-0.0006685256958007812,
0.02459716796875,
0.0283966064453125,
0.01085662841796875,
0.0135498046875,
-0.0255889892578125,
0.0169677734375,
0.006694793701171875,
-0.02789306640625,
-0.043487548828125,
0.096923828125,
0.01354217529296875,
-0.020782470703125,
0.037353515625,
-0.01494598388671875,
-0.01377105712890625,
0.042449951171875,
0.02374267578125,
0.04571533203125,
-0.01067352294921875,
0.03570556640625,
0.055450439453125,
-0.00278472900390625,
-0.004413604736328125,
0.0202789306640625,
0.0031070709228515625,
-0.031707763671875,
-0.044158935546875,
-0.03961181640625,
-0.04156494140625,
0.01241302490234375,
-0.056427001953125,
0.054840087890625,
-0.0292816162109375,
-0.029266357421875,
0.0239105224609375,
0.0159912109375,
-0.0430908203125,
0.0202484130859375,
0.0169830322265625,
0.058837890625,
-0.053466796875,
0.07208251953125,
0.045318603515625,
-0.05615234375,
-0.056427001953125,
-0.0308380126953125,
0.0182037353515625,
-0.034088134765625,
0.01186370849609375,
0.004817962646484375,
-0.01325225830078125,
0.006954193115234375,
-0.02593994140625,
-0.058807373046875,
0.09600830078125,
0.04351806640625,
-0.037689208984375,
-0.013519287109375,
-0.00261688232421875,
0.04864501953125,
-0.0244293212890625,
0.0457763671875,
0.031341552734375,
0.0255126953125,
0.0129852294921875,
-0.06884765625,
-0.0081787109375,
-0.0183258056640625,
0.020751953125,
0.00826263427734375,
-0.07403564453125,
0.0731201171875,
-0.03118896484375,
-0.00829315185546875,
0.005859375,
0.049224853515625,
0.01375579833984375,
0.037078857421875,
0.0230865478515625,
0.0706787109375,
0.004596710205078125,
0.006256103515625,
0.07257080078125,
0.003932952880859375,
0.051483154296875,
0.06878662109375,
0.003582000732421875,
0.051300048828125,
0.04217529296875,
-0.0196533203125,
0.045257568359375,
0.048126220703125,
-0.016357421875,
0.041900634765625,
-0.000576019287109375,
-0.003986358642578125,
-0.006511688232421875,
-0.0181732177734375,
-0.04296875,
0.040008544921875,
0.0127105712890625,
-0.01369476318359375,
-0.0242156982421875,
0.0005793571472167969,
-0.007160186767578125,
0.004116058349609375,
-0.0188446044921875,
0.030364990234375,
-0.0015745162963867188,
-0.0247344970703125,
0.050140380859375,
-0.004329681396484375,
0.06201171875,
-0.04022216796875,
-0.01898193359375,
-0.0182647705078125,
0.025787353515625,
-0.031005859375,
-0.0733642578125,
0.0237884521484375,
0.0180511474609375,
-0.004680633544921875,
-0.0201568603515625,
0.058837890625,
-0.0248565673828125,
-0.035675048828125,
0.041412353515625,
0.0129852294921875,
0.034149169921875,
-0.01177215576171875,
-0.0263214111328125,
0.006435394287109375,
0.005229949951171875,
-0.0396728515625,
0.0262451171875,
0.01050567626953125,
0.01020050048828125,
0.031646728515625,
0.036407470703125,
0.01528167724609375,
0.00679779052734375,
-0.004364013671875,
0.06591796875,
-0.047027587890625,
-0.0070037841796875,
-0.052978515625,
0.041748046875,
-0.02508544921875,
-0.03009033203125,
0.07086181640625,
0.04998779296875,
0.0867919921875,
-0.0159454345703125,
0.034515380859375,
-0.0168914794921875,
0.020965576171875,
-0.01183319091796875,
0.0303955078125,
-0.0712890625,
-0.00847625732421875,
-0.026519775390625,
-0.061737060546875,
-0.00501251220703125,
0.022918701171875,
0.00841522216796875,
-0.0164642333984375,
0.01309967041015625,
0.061309814453125,
-0.0209197998046875,
-0.0245819091796875,
0.033721923828125,
0.018890380859375,
0.0203399658203125,
0.039825439453125,
0.0207977294921875,
-0.0765380859375,
0.0625,
-0.053802490234375,
-0.02862548828125,
-0.0297088623046875,
-0.04132080078125,
-0.03350830078125,
-0.04229736328125,
-0.03985595703125,
-0.046417236328125,
-0.00583648681640625,
0.05023193359375,
0.0787353515625,
-0.03564453125,
-0.0271148681640625,
-0.0018863677978515625,
-0.00650787353515625,
-0.00634002685546875,
-0.02447509765625,
0.00534820556640625,
0.034820556640625,
-0.07147216796875,
0.0118255615234375,
0.039764404296875,
0.019073486328125,
-0.017364501953125,
-0.01428985595703125,
-0.01617431640625,
0.00638580322265625,
0.0447998046875,
0.033721923828125,
-0.039642333984375,
-0.036041259765625,
-0.0015697479248046875,
0.01509857177734375,
0.024688720703125,
0.0357666015625,
-0.052764892578125,
0.054168701171875,
0.056427001953125,
-0.01557159423828125,
0.0904541015625,
-0.002643585205078125,
0.011932373046875,
-0.04638671875,
0.0265655517578125,
-0.0007214546203613281,
0.0121612548828125,
0.0216522216796875,
-0.0192108154296875,
0.049560546875,
0.027099609375,
-0.055633544921875,
-0.0660400390625,
0.003223419189453125,
-0.11102294921875,
-0.00457763671875,
0.087890625,
-0.00850677490234375,
-0.02337646484375,
0.0231781005859375,
-0.017974853515625,
0.04241943359375,
-0.039886474609375,
0.051422119140625,
0.0360107421875,
-0.020965576171875,
-0.0130615234375,
-0.062469482421875,
0.019927978515625,
0.0193939208984375,
-0.0257415771484375,
-0.0186004638671875,
0.038787841796875,
0.054718017578125,
0.0192108154296875,
0.058685302734375,
-0.0262908935546875,
0.0284271240234375,
0.032440185546875,
0.007091522216796875,
-0.00194549560546875,
0.004261016845703125,
-0.0308074951171875,
0.006622314453125,
-0.017791748046875,
-0.036041259765625
]
] |
HooshvareLab/bert-fa-base-uncased-sentiment-deepsentipers-binary | 2021-05-18T20:56:29.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"text-classification",
"fa",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | HooshvareLab | null | null | HooshvareLab/bert-fa-base-uncased-sentiment-deepsentipers-binary | 4 | 52,150 | transformers | 2022-03-02T23:29:04 | ---
language: fa
license: apache-2.0
---
# ParsBERT (v2.0)
A Transformer-based Model for Persian Language Understanding
We reconstructed the vocabulary and fine-tuned ParsBERT v1.1 on new Persian corpora in order to make ParsBERT usable in a wider range of scopes!
Please follow the [ParsBERT](https://github.com/hooshvare/parsbert) repo for the latest information about previous and current models.
## Persian Sentiment [Digikala, SnappFood, DeepSentiPers]
This task aims to classify text, such as comments, according to its emotional polarity. We evaluated three well-known datasets for this task: `Digikala` user comments, `SnappFood` user comments, and `DeepSentiPers`, the last in both binary and multi-class forms.
### DeepSentiPers
DeepSentiPers, a balanced and augmented version of SentiPers, contains 12,138 user opinions about digital products labeled with five different classes: two positive (i.e., happy and delighted), two negative (i.e., furious and angry), and one neutral class. The dataset can therefore be used for both multi-class and binary classification. In the binary case, the neutral class and its corresponding sentences are removed from the dataset.
**Binary:**
1. Negative (Furious + Angry)
2. Positive (Happy + Delighted)
**Multi-class:**
1. Furious
2. Angry
3. Neutral
4. Happy
5. Delighted
| Label | # |
|:---------:|:----:|
| Furious | 236 |
| Angry | 1357 |
| Neutral | 2874 |
| Happy | 2848 |
| Delighted | 2516 |
**Download**
You can download the dataset from:
- [SentiPers](https://github.com/phosseini/sentipers)
- [DeepSentiPers](https://github.com/JoyeBright/DeepSentiPers)
## Results
The following table summarizes the F1 score obtained by ParsBERT as compared to other models and architectures.
| Dataset | ParsBERT v2 | ParsBERT v1 | mBERT | DeepSentiPers |
|:------------------------:|:-----------:|:-----------:|:-----:|:-------------:|
| SentiPers (Multi Class) | 71.31* | 71.11 | - | 69.33 |
| SentiPers (Binary Class) | 92.42* | 92.13 | - | 91.98 |
## How to use :hugs:
| Task | Notebook |
|---------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Sentiment Analysis | [](https://colab.research.google.com/github/hooshvare/parsbert/blob/master/notebooks/Taaghche_Sentiment_Analysis.ipynb) |
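Besides the notebook above, the checkpoint can be loaded directly with the Hugging Face `pipeline` API. This is a minimal sketch, not taken from the card itself; the example input text is a placeholder, and running it downloads the model weights on first use.

```python
# Minimal usage sketch: load the binary DeepSentiPers sentiment checkpoint
# with the Hugging Face `pipeline` API and classify one Persian comment.
from transformers import pipeline

model_name = "HooshvareLab/bert-fa-base-uncased-sentiment-deepsentipers-binary"

# The pipeline wraps tokenization, the model forward pass, and label decoding.
classifier = pipeline("sentiment-analysis", model=model_name)

# Placeholder Persian input ("This product's quality was excellent").
result = classifier("کیفیت این محصول عالی بود")
print(result)  # e.g. a list with one dict containing "label" and "score"
```

The same pattern works for the multi-class (`-multi`) variants by swapping the model name.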
### BibTeX entry and citation info
Please cite as follows in publications:
```bibtex
@article{ParsBERT,
title={ParsBERT: Transformer-based Model for Persian Language Understanding},
  author={Mehrdad Farahani and Mohammad Gharachorloo and Marzieh Farahani and Mohammad Manthouri},
journal={ArXiv},
year={2020},
volume={abs/2005.12515}
}
```
## Questions?
Post a GitHub issue on the [ParsBERT Issues](https://github.com/hooshvare/parsbert/issues) repo. |
[
-0.04681396484375,
-0.0626220703125,
0.0179290771484375,
0.0316162109375,
-0.0259246826171875,
0.009521484375,
-0.029205322265625,
-0.0191192626953125,
0.013916015625,
0.02093505859375,
-0.03094482421875,
-0.039825439453125,
-0.038238525390625,
-0.00995635986328125,
-0.017578125,
0.09466552734375,
0.0088653564453125,
0.014923095703125,
-0.017120361328125,
-0.01016998291015625,
-0.033355712890625,
-0.030303955078125,
-0.0235595703125,
-0.0238494873046875,
0.01654052734375,
0.04681396484375,
0.0615234375,
0.004833221435546875,
0.05792236328125,
0.018829345703125,
-0.037872314453125,
-0.01885986328125,
-0.020538330078125,
0.0073089599609375,
-0.00992584228515625,
-0.031494140625,
-0.050262451171875,
-0.0091552734375,
0.048583984375,
0.0447998046875,
-0.0157623291015625,
0.0211029052734375,
0.01116943359375,
0.06939697265625,
-0.0270538330078125,
-0.0027141571044921875,
-0.01113128662109375,
0.007366180419921875,
-0.0260467529296875,
0.0159149169921875,
-0.0279998779296875,
-0.0418701171875,
-0.007904052734375,
-0.016387939453125,
0.0216217041015625,
0.0080108642578125,
0.09283447265625,
0.00783538818359375,
-0.0231475830078125,
-0.0087432861328125,
-0.046722412109375,
0.0665283203125,
-0.057861328125,
0.032318115234375,
-0.00010150671005249023,
0.0157470703125,
-0.006496429443359375,
-0.0308685302734375,
-0.053253173828125,
0.0018377304077148438,
-0.02276611328125,
0.0250396728515625,
-0.0287933349609375,
-0.0179901123046875,
0.0165252685546875,
0.0653076171875,
-0.050018310546875,
-0.0170745849609375,
-0.035491943359375,
-0.0024471282958984375,
0.032196044921875,
0.01316070556640625,
0.0079345703125,
-0.029388427734375,
-0.0408935546875,
-0.03045654296875,
-0.0361328125,
0.037506103515625,
0.006076812744140625,
0.00318145751953125,
-0.03680419921875,
0.039703369140625,
-0.024566650390625,
0.0294647216796875,
0.04052734375,
0.00827789306640625,
0.0594482421875,
-0.024017333984375,
-0.004482269287109375,
-0.004886627197265625,
0.07391357421875,
0.0173492431640625,
0.006244659423828125,
0.006526947021484375,
-0.011077880859375,
0.0172119140625,
0.0012063980102539062,
-0.06610107421875,
-0.021942138671875,
0.0247039794921875,
-0.0374755859375,
-0.034423828125,
0.0181732177734375,
-0.08782958984375,
-0.0291290283203125,
-0.0192413330078125,
0.009979248046875,
-0.04022216796875,
-0.046051025390625,
0.006622314453125,
-0.006801605224609375,
0.058929443359375,
0.0212860107421875,
-0.040771484375,
0.034881591796875,
0.0572509765625,
0.051544189453125,
0.005786895751953125,
-0.0211639404296875,
-0.007640838623046875,
-0.0229949951171875,
-0.0278778076171875,
0.07623291015625,
-0.0313720703125,
-0.0266265869140625,
-0.0289459228515625,
-0.0019550323486328125,
0.004322052001953125,
-0.034881591796875,
0.0537109375,
-0.0308837890625,
0.04937744140625,
-0.0292816162109375,
-0.036895751953125,
-0.0225982666015625,
0.01092529296875,
-0.04296875,
0.08685302734375,
0.0135955810546875,
-0.0687255859375,
-0.0011587142944335938,
-0.057037353515625,
-0.035614013671875,
-0.0176849365234375,
0.007904052734375,
-0.055084228515625,
0.0196533203125,
0.0148468017578125,
0.042266845703125,
-0.0535888671875,
0.0290985107421875,
-0.019012451171875,
-0.00853729248046875,
0.03436279296875,
-0.00604248046875,
0.0867919921875,
0.0111236572265625,
-0.045074462890625,
0.01369476318359375,
-0.048919677734375,
0.0015001296997070312,
0.016510009765625,
-0.004009246826171875,
-0.017181396484375,
-0.004215240478515625,
0.002780914306640625,
0.0302276611328125,
0.021575927734375,
-0.05316162109375,
-0.010345458984375,
-0.043670654296875,
0.009033203125,
0.060211181640625,
0.003818511962890625,
0.0389404296875,
-0.02886962890625,
0.045501708984375,
0.0299072265625,
0.024444580078125,
-0.000016748905181884766,
-0.0227203369140625,
-0.0703125,
-0.0301513671875,
0.03179931640625,
0.0518798828125,
-0.0300750732421875,
0.040924072265625,
-0.03167724609375,
-0.06414794921875,
-0.0384521484375,
-0.0002727508544921875,
0.0469970703125,
0.044769287109375,
0.040008544921875,
-0.019683837890625,
-0.031951904296875,
-0.07501220703125,
-0.0240631103515625,
-0.0191192626953125,
0.0280303955078125,
0.0181121826171875,
0.03436279296875,
-0.00916290283203125,
0.0633544921875,
-0.04632568359375,
-0.0144500732421875,
-0.0240020751953125,
0.002349853515625,
0.03631591796875,
0.047576904296875,
0.040618896484375,
-0.058197021484375,
-0.05584716796875,
0.006702423095703125,
-0.051605224609375,
0.0008301734924316406,
-0.004215240478515625,
-0.027587890625,
0.02874755859375,
0.01111602783203125,
-0.054840087890625,
0.01351165771484375,
0.031890869140625,
-0.053619384765625,
0.043609619140625,
0.034088134765625,
0.0003123283386230469,
-0.09515380859375,
0.017547607421875,
0.004199981689453125,
-0.01509857177734375,
-0.050506591796875,
-0.0011777877807617188,
0.01763916015625,
0.00011461973190307617,
-0.0306396484375,
0.046966552734375,
-0.03173828125,
0.008056640625,
0.00787353515625,
-0.0098419189453125,
0.005405426025390625,
0.05126953125,
-0.00914764404296875,
0.057464599609375,
0.05914306640625,
-0.02386474609375,
0.02642822265625,
0.045623779296875,
-0.0240020751953125,
0.058929443359375,
-0.065185546875,
-0.0029315948486328125,
-0.0196990966796875,
0.008453369140625,
-0.07733154296875,
-0.017730712890625,
0.04339599609375,
-0.057037353515625,
0.0281524658203125,
0.01788330078125,
-0.0299530029296875,
-0.022918701171875,
-0.044708251953125,
0.0016193389892578125,
0.0604248046875,
-0.028594970703125,
0.04095458984375,
0.035003662109375,
-0.0309906005859375,
-0.03399658203125,
-0.03857421875,
-0.00962066650390625,
-0.0246124267578125,
-0.05255126953125,
0.0161285400390625,
-0.006984710693359375,
-0.02166748046875,
0.01446533203125,
-0.01352691650390625,
-0.00804901123046875,
-0.0004150867462158203,
0.031524658203125,
0.04180908203125,
-0.0209503173828125,
0.01078033447265625,
0.02081298828125,
-0.00339508056640625,
0.0142974853515625,
0.0272369384765625,
0.051605224609375,
-0.058868408203125,
-0.008880615234375,
-0.026824951171875,
0.01107025146484375,
0.04559326171875,
-0.037933349609375,
0.058258056640625,
0.04705810546875,
-0.00588226318359375,
-0.003711700439453125,
-0.048675537109375,
-0.0008950233459472656,
-0.03240966796875,
0.0086517333984375,
-0.027069091796875,
-0.07122802734375,
0.047088623046875,
-0.00820159912109375,
-0.004634857177734375,
0.061126708984375,
0.0540771484375,
-0.01160430908203125,
0.0567626953125,
0.0271759033203125,
-0.01064300537109375,
0.0379638671875,
-0.0186309814453125,
0.0147705078125,
-0.07965087890625,
-0.0250396728515625,
-0.044830322265625,
-0.0175018310546875,
-0.05462646484375,
-0.024322509765625,
0.029571533203125,
0.00952911376953125,
-0.03167724609375,
0.0248565673828125,
-0.040435791015625,
0.03253173828125,
0.035064697265625,
0.023834228515625,
-0.0011234283447265625,
0.0028781890869140625,
0.00659942626953125,
0.002361297607421875,
-0.039093017578125,
-0.026397705078125,
0.06298828125,
0.029296875,
0.0517578125,
0.01806640625,
0.057952880859375,
0.01163482666015625,
0.0144500732421875,
-0.046661376953125,
0.0447998046875,
-0.01473236083984375,
-0.058197021484375,
-0.01126861572265625,
-0.0191192626953125,
-0.05126953125,
0.027008056640625,
-0.0025882720947265625,
-0.02667236328125,
0.0457763671875,
0.0173492431640625,
-0.0140380859375,
0.004749298095703125,
-0.048248291015625,
0.08123779296875,
-0.007659912109375,
-0.04132080078125,
-0.022369384765625,
-0.06939697265625,
0.031341552734375,
0.0032253265380859375,
0.0269622802734375,
-0.026580810546875,
0.0184783935546875,
0.050201416015625,
-0.03460693359375,
0.06719970703125,
-0.02618408203125,
0.00859832763671875,
0.0283203125,
0.0055694580078125,
0.032318115234375,
-0.00620269775390625,
-0.0166778564453125,
0.024017333984375,
-0.0029659271240234375,
-0.037261962890625,
-0.021392822265625,
0.05157470703125,
-0.06280517578125,
-0.04254150390625,
-0.0650634765625,
-0.007114410400390625,
-0.01094818115234375,
0.0179901123046875,
0.01192474365234375,
0.028900146484375,
-0.0231170654296875,
0.037567138671875,
0.055908203125,
-0.0252532958984375,
0.0200042724609375,
0.036651611328125,
-0.01251983642578125,
-0.040283203125,
0.06805419921875,
-0.0233001708984375,
0.005970001220703125,
0.030120849609375,
0.0264739990234375,
-0.0187835693359375,
-0.004711151123046875,
-0.033294677734375,
0.0263671875,
-0.047882080078125,
-0.045501708984375,
-0.0279541015625,
-0.006927490234375,
-0.030426025390625,
-0.0004699230194091797,
-0.028564453125,
-0.045074462890625,
-0.02825927734375,
-0.004547119140625,
0.037750244140625,
0.03515625,
-0.017822265625,
0.01520538330078125,
-0.055938720703125,
0.0195465087890625,
0.002105712890625,
0.032562255859375,
-0.01123046875,
-0.0467529296875,
-0.022064208984375,
0.00017559528350830078,
-0.038482666015625,
-0.07318115234375,
0.03515625,
0.018035888671875,
0.0241241455078125,
0.006259918212890625,
0.01068115234375,
0.05255126953125,
-0.03363037109375,
0.0552978515625,
0.0215911865234375,
-0.0997314453125,
0.056884765625,
-0.0165252685546875,
0.0082855224609375,
0.0404052734375,
0.027587890625,
-0.042877197265625,
-0.0265655517578125,
-0.05230712890625,
-0.064208984375,
0.06573486328125,
0.01239013671875,
0.00867462158203125,
0.00409698486328125,
0.0115966796875,
-0.0013971328735351562,
0.023406982421875,
-0.0618896484375,
-0.0260009765625,
-0.036346435546875,
-0.029571533203125,
-0.0025310516357421875,
-0.035430908203125,
0.00847625732421875,
-0.033721923828125,
0.06500244140625,
0.0218505859375,
0.03607177734375,
0.0299072265625,
-0.015411376953125,
-0.01454925537109375,
0.0467529296875,
0.038818359375,
0.0238189697265625,
-0.01385498046875,
0.00872039794921875,
0.0114898681640625,
-0.025543212890625,
0.0019159317016601562,
0.0107421875,
-0.00567626953125,
0.00421142578125,
0.017242431640625,
0.07806396484375,
0.01242828369140625,
-0.044464111328125,
0.054290771484375,
0.007152557373046875,
-0.0015201568603515625,
-0.046295166015625,
-0.0015172958374023438,
-0.01244354248046875,
0.02520751953125,
-0.00817108154296875,
0.011932373046875,
0.02288818359375,
-0.027252197265625,
-0.000021398067474365234,
0.0289459228515625,
-0.03082275390625,
-0.046417236328125,
0.0333251953125,
0.0023288726806640625,
0.002910614013671875,
0.0272979736328125,
-0.022247314453125,
-0.06610107421875,
0.04107666015625,
0.031494140625,
0.0675048828125,
-0.037353515625,
0.022247314453125,
0.03564453125,
0.017242431640625,
-0.01107025146484375,
0.043701171875,
0.0016326904296875,
-0.043701171875,
-0.021392822265625,
-0.0606689453125,
-0.03326416015625,
-0.0250091552734375,
-0.05975341796875,
0.021331787109375,
-0.035186767578125,
-0.00860595703125,
0.0148468017578125,
-0.002979278564453125,
-0.04302978515625,
0.00876617431640625,
-0.005802154541015625,
0.056060791015625,
-0.0625,
0.047027587890625,
0.07513427734375,
-0.031524658203125,
-0.059844970703125,
-0.006591796875,
-0.0034275054931640625,
-0.035980224609375,
0.045989990234375,
-0.0040283203125,
-0.00164794921875,
0.0012636184692382812,
-0.03948974609375,
-0.06121826171875,
0.07012939453125,
0.002613067626953125,
-0.037994384765625,
0.018890380859375,
0.0184173583984375,
0.05517578125,
0.0127410888671875,
0.0126800537109375,
0.032012939453125,
0.034881591796875,
-0.007434844970703125,
-0.06390380859375,
0.0214996337890625,
-0.051513671875,
0.005855560302734375,
0.03680419921875,
-0.07574462890625,
0.0955810546875,
0.0098114013671875,
-0.0011625289916992188,
-0.00463104248046875,
0.043182373046875,
0.005817413330078125,
0.01024627685546875,
0.0286712646484375,
0.0589599609375,
0.04119873046875,
-0.0187530517578125,
0.08282470703125,
-0.0297698974609375,
0.056884765625,
0.0777587890625,
-0.01454925537109375,
0.06756591796875,
0.02972412109375,
-0.03765869140625,
0.073486328125,
0.02276611328125,
-0.01073455810546875,
0.0328369140625,
0.0016603469848632812,
-0.0163421630859375,
-0.0187835693359375,
-0.0007195472717285156,
-0.038421630859375,
0.024658203125,
0.0211029052734375,
-0.005733489990234375,
0.00034308433532714844,
0.0005936622619628906,
0.0258026123046875,
-0.005321502685546875,
-0.0182037353515625,
0.06719970703125,
-0.0021686553955078125,
-0.036346435546875,
0.046966552734375,
-0.0016107559204101562,
0.02984619140625,
-0.0149993896484375,
0.011077880859375,
-0.0311737060546875,
0.027587890625,
-0.0255584716796875,
-0.06787109375,
0.0361328125,
0.0172882080078125,
-0.0196990966796875,
-0.015838623046875,
0.0687255859375,
-0.01251220703125,
-0.05523681640625,
-0.003475189208984375,
0.048431396484375,
0.0006208419799804688,
-0.004718780517578125,
-0.056640625,
0.01329803466796875,
0.0113067626953125,
-0.0484619140625,
0.004650115966796875,
0.04718017578125,
0.00379180908203125,
0.0201568603515625,
0.040985107421875,
-0.00787353515625,
0.004146575927734375,
-0.0212860107421875,
0.0880126953125,
-0.0728759765625,
-0.0263671875,
-0.0673828125,
0.056182861328125,
-0.01354217529296875,
-0.024078369140625,
0.07159423828125,
0.058929443359375,
0.05706787109375,
-0.01438140869140625,
0.042022705078125,
-0.037078857421875,
0.07904052734375,
-0.0171356201171875,
0.043426513671875,
-0.04522705078125,
0.00852203369140625,
-0.032470703125,
-0.0587158203125,
-0.0266265869140625,
0.06329345703125,
-0.033538818359375,
-0.008270263671875,
0.049407958984375,
0.0582275390625,
-0.00673675537109375,
0.00756072998046875,
-0.01551055908203125,
0.0418701171875,
0.0115203857421875,
0.039306640625,
0.06689453125,
-0.045440673828125,
0.023590087890625,
-0.05181884765625,
-0.015838623046875,
-0.0131683349609375,
-0.033721923828125,
-0.071044921875,
-0.0435791015625,
-0.0308837890625,
-0.038330078125,
-0.0020122528076171875,
0.075927734375,
0.0187225341796875,
-0.08349609375,
-0.0223846435546875,
-0.0210418701171875,
0.01404571533203125,
0.0028438568115234375,
-0.0258331298828125,
0.0369873046875,
-0.028900146484375,
-0.05242919921875,
-0.0042572021484375,
-0.00136566162109375,
-0.0024089813232421875,
0.0029926300048828125,
-0.00933074951171875,
0.0017852783203125,
-0.00659942626953125,
0.052764892578125,
0.0186004638671875,
-0.045501708984375,
-0.004611968994140625,
0.0283355712890625,
-0.0114288330078125,
0.02325439453125,
0.0301055908203125,
-0.055694580078125,
0.007793426513671875,
0.04254150390625,
0.0281829833984375,
0.0272979736328125,
0.00482177734375,
0.0034046173095703125,
-0.040313720703125,
0.00807952880859375,
0.0205230712890625,
0.01454925537109375,
0.02557373046875,
-0.0120697021484375,
0.035980224609375,
0.0169525146484375,
-0.042083740234375,
-0.07012939453125,
-0.0012407302856445312,
-0.09405517578125,
-0.0229949951171875,
0.08624267578125,
-0.0001690387725830078,
-0.0227508544921875,
0.0259857177734375,
-0.028411865234375,
0.0262298583984375,
-0.049468994140625,
0.051300048828125,
0.0469970703125,
-0.014678955078125,
-0.01285552978515625,
-0.01177978515625,
0.0241241455078125,
0.057708740234375,
-0.0728759765625,
0.0000908970832824707,
0.05242919921875,
0.01308441162109375,
0.0145263671875,
0.03582763671875,
-0.018218994140625,
0.032257080078125,
-0.01136016845703125,
0.0247650146484375,
0.011138916015625,
-0.0139617919921875,
-0.04217529296875,
0.001323699951171875,
0.0024433135986328125,
-0.006572723388671875
]
] |
invisiblecat/Uber_Realistic_Porn_Merge_V1.3 | 2023-08-11T16:39:22.000Z | [
"diffusers",
"not-for-all-audiences",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | invisiblecat | null | null | invisiblecat/Uber_Realistic_Porn_Merge_V1.3 | 26 | 51,922 | diffusers | 2023-06-06T21:45:01 | ---
license: creativeml-openrail-m
language:
- en
pipeline_tag: text-to-image
tags:
- diffusers
- not-for-all-audiences
library_name: diffusers
---
<div style='background: #f044442e; color: #f04444; border: solid 1px #f04444; border-radius: 0.5pc; margin-top: 1.5em; margin-bottom: 1.5em; padding:1em;'>
<b>WARNING:</b> Potential for NSFW Content.
</div>
# Uber Realistic Porn Merge (URPM)
- **Author:** [saftle](https://civitai.com/user/saftle)
- **Source:** [https://civitai.com/models/2661](https://civitai.com/models/2661)
- **Version:** 1.3
- **Base Model:** [Stable Diffusion 1.5](https://huggingface.co/runwayml/stable-diffusion-v1-5)
- **License:** [creativeml-openrail-m](https://huggingface.co/spaces/CompVis/stable-diffusion-license) | 746 | [
[
-0.046142578125,
-0.064697265625,
0.02630615234375,
0.0258941650390625,
-0.03582763671875,
0.00177764892578125,
0.027801513671875,
-0.01336669921875,
0.01415252685546875,
0.032867431640625,
-0.06982421875,
-0.044952392578125,
-0.035064697265625,
-0.0135498046875,
-0.04833984375,
0.050048828125,
0.0057525634765625,
-0.003986358642578125,
-0.0221099853515625,
0.00771331787109375,
-0.0240325927734375,
-0.016693115234375,
-0.028289794921875,
0.00817108154296875,
0.01366424560546875,
0.028076171875,
0.06005859375,
0.040771484375,
0.045379638671875,
0.01849365234375,
-0.0123138427734375,
0.0199127197265625,
-0.05499267578125,
-0.01441192626953125,
-0.01280975341796875,
0.016387939453125,
-0.06195068359375,
0.0159149169921875,
0.049774169921875,
0.02783203125,
-0.0207672119140625,
0.008819580078125,
0.002056121826171875,
0.05889892578125,
-0.048248291015625,
-0.03936767578125,
-0.00447845458984375,
0.0267791748046875,
-0.0238037109375,
-0.00714874267578125,
-0.0201568603515625,
-0.025665283203125,
-0.01325225830078125,
-0.0477294921875,
0.00980377197265625,
-0.004894256591796875,
0.09271240234375,
0.0109405517578125,
-0.036468505859375,
0.038330078125,
-0.058929443359375,
0.025604248046875,
-0.03717041015625,
0.0653076171875,
0.01383209228515625,
0.025177001953125,
-0.0157623291015625,
-0.060089111328125,
-0.039703369140625,
0.0237579345703125,
-0.0034809112548828125,
0.00927734375,
-0.058624267578125,
-0.024200439453125,
-0.0074005126953125,
0.020965576171875,
-0.053436279296875,
-0.007549285888671875,
-0.0618896484375,
0.00414276123046875,
0.041259765625,
0.00856781005859375,
0.045196533203125,
0.01678466796875,
-0.05450439453125,
-0.005397796630859375,
-0.0482177734375,
0.0007042884826660156,
0.03399658203125,
0.002437591552734375,
-0.06805419921875,
0.029327392578125,
0.0002315044403076172,
0.054962158203125,
0.0094757080078125,
-0.006900787353515625,
0.040924072265625,
-0.047332763671875,
-0.037445068359375,
-0.0306549072265625,
0.06512451171875,
0.066162109375,
-0.0016994476318359375,
0.013916015625,
0.020233154296875,
-0.0250701904296875,
0.04248046875,
-0.076171875,
0.000023365020751953125,
0.0247955322265625,
-0.046844482421875,
-0.039276123046875,
0.020294189453125,
-0.0682373046875,
-0.00662994384765625,
0.01161956787109375,
0.024627685546875,
-0.0193939208984375,
-0.055023193359375,
0.0273284912109375,
-0.007343292236328125,
0.03790283203125,
0.01953125,
-0.032562255859375,
0.029296875,
0.0260467529296875,
0.05377197265625,
-0.004497528076171875,
0.0109710693359375,
-0.006542205810546875,
0.00665283203125,
-0.01120758056640625,
0.0220794677734375,
0.001995086669921875,
-0.042816162109375,
0.0155487060546875,
0.01190185546875,
0.01390838623046875,
-0.01454925537109375,
0.06231689453125,
-0.03155517578125,
0.01116943359375,
-0.026641845703125,
-0.00926971435546875,
-0.012939453125,
-0.007434844970703125,
-0.040496826171875,
0.05413818359375,
0.01348114013671875,
-0.06268310546875,
0.03375244140625,
-0.031982421875,
-0.0021514892578125,
0.0004382133483886719,
0.01549530029296875,
-0.047454833984375,
-0.02532958984375,
-0.0201873779296875,
0.039520263671875,
-0.0021152496337890625,
-0.0008072853088378906,
-0.051513671875,
-0.0216217041015625,
0.019073486328125,
-0.0357666015625,
0.057403564453125,
0.01461029052734375,
-0.0186614990234375,
0.0081787109375,
-0.061859130859375,
-0.03167724609375,
0.0170135498046875,
0.0150909423828125,
-0.0166168212890625,
-0.0290374755859375,
0.03472900390625,
0.0288848876953125,
0.01715087890625,
-0.0299530029296875,
0.00949859619140625,
0.01250457763671875,
0.00020182132720947266,
0.045745849609375,
0.00004673004150390625,
0.038238525390625,
-0.039642333984375,
0.0479736328125,
0.0282135009765625,
0.0389404296875,
0.04473876953125,
-0.0338134765625,
-0.056732177734375,
-0.036529541015625,
0.01233673095703125,
0.0288543701171875,
-0.03594970703125,
0.043304443359375,
0.0007343292236328125,
-0.0577392578125,
-0.0285491943359375,
0.0130157470703125,
0.03125,
0.026947021484375,
0.022735595703125,
-0.02655029296875,
-0.03472900390625,
-0.07763671875,
0.039886474609375,
0.0164031982421875,
0.0016078948974609375,
0.0204010009765625,
0.031158447265625,
-0.004825592041015625,
0.054351806640625,
-0.043792724609375,
-0.0273895263671875,
0.0113372802734375,
-0.01537322998046875,
0.0157318115234375,
0.07440185546875,
0.0716552734375,
-0.08795166015625,
-0.044189453125,
-0.02276611328125,
-0.06353759765625,
-0.0073699951171875,
0.019134521484375,
-0.05859375,
-0.00969696044921875,
0.029022216796875,
-0.03302001953125,
0.050537109375,
0.042877197265625,
-0.052398681640625,
0.037384033203125,
-0.034423828125,
0.0308990478515625,
-0.06964111328125,
0.0094146728515625,
0.04827880859375,
-0.03277587890625,
-0.055389404296875,
0.0275115966796875,
0.0011835098266601562,
0.00627899169921875,
-0.090087890625,
0.03399658203125,
-0.03765869140625,
0.018402099609375,
-0.00252532958984375,
0.025115966796875,
-0.01206207275390625,
0.0128631591796875,
-0.0029544830322265625,
0.043609619140625,
0.058868408203125,
-0.03997802734375,
0.019866943359375,
0.03289794921875,
-0.00823974609375,
0.046051025390625,
-0.028533935546875,
0.001667022705078125,
-0.004947662353515625,
0.00868988037109375,
-0.0933837890625,
-0.0290374755859375,
0.032318115234375,
-0.0269775390625,
-0.00018537044525146484,
-0.015899658203125,
-0.035552978515625,
-0.02618408203125,
-0.05633544921875,
0.039093017578125,
0.0753173828125,
-0.023651123046875,
0.0156402587890625,
0.033660888671875,
0.00530242919921875,
-0.04534912109375,
-0.046539306640625,
-0.0287322998046875,
-0.0224151611328125,
-0.06060791015625,
0.0394287109375,
-0.0078277587890625,
-0.0132904052734375,
0.00821685791015625,
0.00997161865234375,
-0.032806396484375,
-0.0162353515625,
0.0276031494140625,
0.05084228515625,
-0.003887176513671875,
-0.030059814453125,
0.020904541015625,
0.011138916015625,
-0.0016355514526367188,
0.00859832763671875,
0.04315185546875,
0.00366973876953125,
-0.03131103515625,
-0.04534912109375,
0.027618408203125,
0.06304931640625,
-0.006500244140625,
0.08392333984375,
0.06976318359375,
-0.045379638671875,
0.01293182373046875,
-0.046173095703125,
0.014801025390625,
-0.034820556640625,
-0.006877899169921875,
-0.019012451171875,
-0.052459716796875,
0.044464111328125,
0.0299072265625,
-0.0121307373046875,
0.03363037109375,
0.031158447265625,
-0.0029392242431640625,
0.076171875,
0.0589599609375,
0.0215606689453125,
0.0264129638671875,
-0.04010009765625,
-0.006465911865234375,
-0.0673828125,
-0.049560546875,
-0.01131439208984375,
-0.02276611328125,
-0.053009033203125,
-0.048065185546875,
0.0201263427734375,
0.018890380859375,
-0.03021240234375,
0.039154052734375,
-0.043609619140625,
0.0244140625,
0.0244140625,
0.045867919921875,
0.0237274169921875,
-0.0040283203125,
-0.021240234375,
-0.014923095703125,
-0.038177490234375,
-0.0266571044921875,
0.0382080078125,
0.033111572265625,
0.050628662109375,
0.046630859375,
0.031341552734375,
0.007843017578125,
0.01337432861328125,
-0.00385284423828125,
0.05047607421875,
-0.03125,
-0.0562744140625,
-0.00754547119140625,
-0.01287078857421875,
-0.05865478515625,
0.024871826171875,
-0.04986572265625,
-0.038604736328125,
0.03338623046875,
0.006542205810546875,
-0.0288543701171875,
0.04913330078125,
-0.01549530029296875,
0.048187255859375,
0.01153564453125,
-0.07110595703125,
0.00203704833984375,
-0.044342041015625,
0.042205810546875,
0.0125274658203125,
0.033782958984375,
-0.003326416015625,
-0.01971435546875,
0.036895751953125,
-0.04730224609375,
0.0479736328125,
-0.0022792816162109375,
0.0029926300048828125,
0.006938934326171875,
0.0017061233520507812,
-0.01007843017578125,
0.00933074951171875,
0.02081298828125,
0.01039886474609375,
-0.03436279296875,
-0.02777099609375,
-0.00616455078125,
0.052886962890625,
-0.050048828125,
-0.041656494140625,
-0.040069580078125,
-0.0117340087890625,
0.01788330078125,
0.015960693359375,
0.041595458984375,
0.03076171875,
-0.029693603515625,
0.02777099609375,
0.06427001953125,
0.01189422607421875,
0.0277252197265625,
0.041015625,
-0.04974365234375,
-0.046630859375,
0.04180908203125,
-0.00806427001953125,
0.024932861328125,
0.00800323486328125,
0.00027823448181152344,
-0.0306243896484375,
-0.0401611328125,
-0.04193115234375,
0.04266357421875,
-0.03521728515625,
-0.0100250244140625,
-0.0419921875,
-0.04443359375,
-0.03643798828125,
-0.03729248046875,
-0.0128021240234375,
-0.0051116943359375,
-0.05072021484375,
-0.021240234375,
0.0276336669921875,
0.0418701171875,
-0.0270538330078125,
0.01885986328125,
-0.06317138671875,
0.007678985595703125,
0.026214599609375,
0.03289794921875,
-0.01348114013671875,
-0.048492431640625,
0.0040740966796875,
0.0022907257080078125,
-0.0243377685546875,
-0.06378173828125,
0.0460205078125,
-0.031463623046875,
0.041748046875,
0.032196044921875,
-0.0005121231079101562,
0.06585693359375,
-0.0219268798828125,
0.059539794921875,
0.037322998046875,
-0.048065185546875,
0.031951904296875,
-0.05499267578125,
0.00844573974609375,
0.0570068359375,
0.06243896484375,
-0.008209228515625,
-0.0325927734375,
-0.0679931640625,
-0.0640869140625,
0.0237884521484375,
0.04412841796875,
0.00707244873046875,
0.0108642578125,
0.0203399658203125,
-0.005435943603515625,
0.0233917236328125,
-0.0667724609375,
-0.042205810546875,
-0.00287628173828125,
0.00241851806640625,
0.0179595947265625,
-0.0201568603515625,
-0.0191650390625,
-0.037872314453125,
0.058990478515625,
0.0273590087890625,
0.004436492919921875,
0.005157470703125,
0.0157928466796875,
-0.0268096923828125,
-0.007587432861328125,
0.05975341796875,
0.051513671875,
-0.055755615234375,
-0.0185394287109375,
-0.0159759521484375,
-0.036224365234375,
0.019866943359375,
0.00238800048828125,
-0.0093536376953125,
0.00566864013671875,
0.0087890625,
0.040130615234375,
0.0174560546875,
-0.0187530517578125,
0.06048583984375,
-0.0168609619140625,
-0.040069580078125,
-0.0244903564453125,
0.01654052734375,
0.007663726806640625,
0.0188446044921875,
0.0287933349609375,
0.02423095703125,
0.03253173828125,
-0.033294677734375,
0.03302001953125,
0.0301971435546875,
-0.047607421875,
-0.0015859603881835938,
0.07061767578125,
0.031494140625,
-0.031036376953125,
0.0335693359375,
-0.0237579345703125,
-0.0212860107421875,
0.0413818359375,
0.038604736328125,
0.061126708984375,
-0.019439697265625,
0.014801025390625,
0.040557861328125,
-0.004486083984375,
-0.01175689697265625,
0.0186309814453125,
0.018798828125,
-0.041900634765625,
-0.00958251953125,
-0.03729248046875,
-0.0205230712890625,
0.035675048828125,
-0.056121826171875,
0.058380126953125,
-0.04248046875,
-0.0146331787109375,
0.00614166259765625,
-0.022552490234375,
-0.0193023681640625,
0.037384033203125,
0.00803375244140625,
0.0859375,
-0.0994873046875,
0.05499267578125,
0.040863037109375,
-0.058197021484375,
-0.0550537109375,
-0.0104827880859375,
0.013031005859375,
-0.0178375244140625,
0.0222320556640625,
0.00572967529296875,
-0.005405426025390625,
-0.016265869140625,
-0.036712646484375,
-0.0662841796875,
0.08685302734375,
0.01148223876953125,
-0.024627685546875,
-0.01522064208984375,
-0.0166473388671875,
0.037109375,
-0.041046142578125,
0.00955963134765625,
0.0233306884765625,
0.027099609375,
0.042144775390625,
-0.064208984375,
0.006481170654296875,
-0.04327392578125,
0.0008783340454101562,
0.0139007568359375,
-0.0731201171875,
0.0753173828125,
0.01055145263671875,
-0.0081787109375,
0.025909423828125,
0.044036865234375,
0.0292510986328125,
0.017608642578125,
0.045196533203125,
0.05194091796875,
0.00981903076171875,
-0.026336669921875,
0.053619384765625,
-0.00514984130859375,
0.031402587890625,
0.055267333984375,
-0.00037598609924316406,
0.05657958984375,
0.0167694091796875,
-0.02685546875,
0.045562744140625,
0.045684814453125,
-0.00872039794921875,
0.05242919921875,
-0.0036716461181640625,
-0.0211181640625,
-0.0217742919921875,
-0.0007200241088867188,
-0.050933837890625,
0.0289154052734375,
0.0012464523315429688,
-0.00948333740234375,
-0.00434112548828125,
0.004680633544921875,
0.019744873046875,
0.01412200927734375,
-0.003032684326171875,
0.03912353515625,
-0.01523590087890625,
-0.00891876220703125,
0.04193115234375,
-0.0157928466796875,
0.065673828125,
-0.035675048828125,
-0.0191650390625,
-0.0165863037109375,
0.00830841064453125,
-0.0232696533203125,
-0.07611083984375,
0.021270751953125,
-0.003787994384765625,
-0.00860595703125,
-0.03021240234375,
0.05731201171875,
-0.03594970703125,
-0.049041748046875,
0.026580810546875,
0.01505279541015625,
0.0157318115234375,
0.042388916015625,
-0.058074951171875,
0.0301513671875,
0.0145721435546875,
-0.00447845458984375,
0.019256591796875,
-0.0024814605712890625,
0.0218505859375,
0.060028076171875,
0.0606689453125,
0.01461029052734375,
-0.00882720947265625,
-0.0178070068359375,
0.07476806640625,
-0.031402587890625,
-0.046478271484375,
-0.07049560546875,
0.061737060546875,
-0.032928466796875,
-0.0182647705078125,
0.053131103515625,
0.055511474609375,
0.055908203125,
-0.01910400390625,
0.03582763671875,
-0.0050201416015625,
0.0158843994140625,
-0.0357666015625,
0.07861328125,
-0.07763671875,
0.0010290145874023438,
-0.03631591796875,
-0.05657958984375,
-0.00965118408203125,
0.049041748046875,
0.037445068359375,
0.0139312744140625,
0.0312042236328125,
0.0576171875,
-0.0229339599609375,
-0.032318115234375,
0.0294952392578125,
0.00673675537109375,
0.009002685546875,
0.00811767578125,
0.04205322265625,
-0.0335693359375,
0.01151275634765625,
-0.05865478515625,
-0.025299072265625,
-0.03192138671875,
-0.082763671875,
-0.050018310546875,
-0.056610107421875,
-0.040374755859375,
-0.053924560546875,
0.002544403076171875,
0.0460205078125,
0.081298828125,
-0.0291290283203125,
-0.0264129638671875,
0.0023746490478515625,
0.00782012939453125,
-0.003265380859375,
-0.0173797607421875,
-0.0005617141723632812,
0.049468994140625,
-0.07012939453125,
0.001079559326171875,
0.031707763671875,
0.06170654296875,
-0.045623779296875,
0.0166015625,
-0.052886962890625,
0.009552001953125,
0.013031005859375,
0.01763916015625,
-0.03857421875,
-0.00931549072265625,
-0.0184478759765625,
-0.022491455078125,
-0.00318145751953125,
0.0085296630859375,
-0.0201873779296875,
0.006256103515625,
0.046844482421875,
-0.0050048828125,
0.044036865234375,
0.01328277587890625,
0.02899169921875,
-0.046844482421875,
0.045684814453125,
-0.01486968994140625,
0.03466796875,
0.0204620361328125,
-0.03863525390625,
0.0521240234375,
0.03045654296875,
-0.020965576171875,
-0.07489013671875,
-0.00885009765625,
-0.111328125,
-0.006866455078125,
0.06964111328125,
0.0009965896606445312,
-0.0211181640625,
0.015411376953125,
-0.0081939697265625,
0.014862060546875,
-0.0234832763671875,
0.00981903076171875,
0.0478515625,
-0.008758544921875,
-0.0166473388671875,
-0.039764404296875,
0.03900146484375,
-0.002468109130859375,
-0.050628662109375,
-0.034271240234375,
0.0347900390625,
0.0460205078125,
0.029754638671875,
0.0601806640625,
-0.0241851806640625,
0.04254150390625,
-0.005573272705078125,
0.01043701171875,
-0.021240234375,
-0.02655029296875,
-0.0156707763671875,
0.01488494873046875,
-0.0246734619140625,
-0.028106689453125
]
] |
swl-models/hans-v4.4 | 2023-02-01T07:45:56.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:cc-by-nc-4.0",
"region:us"
] | text-to-image | swl-models | null | null | swl-models/hans-v4.4 | 1 | 51,623 | diffusers | 2023-02-01T07:30:25 | ---
license: cc-by-nc-4.0
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
---
hansv4.4
Model author: hans
Release date: 2023-01-31
Model type: ckpt | 205 | [
[
-0.0259857177734375,
-0.019439697265625,
0.027557373046875,
0.0472412109375,
-0.0765380859375,
-0.01629638671875,
0.002719879150390625,
-0.004993438720703125,
0.01296234130859375,
0.05322265625,
-0.042572021484375,
-0.030242919921875,
-0.0290679931640625,
0.023468017578125,
-0.006725311279296875,
0.03900146484375,
-0.016998291015625,
0.03289794921875,
-0.0036773681640625,
0.0080108642578125,
-0.048187255859375,
-0.0239715576171875,
-0.0239410400390625,
-0.031951904296875,
0.003330230712890625,
0.043121337890625,
0.037017822265625,
0.060577392578125,
0.06427001953125,
0.020294189453125,
0.033447265625,
-0.046600341796875,
-0.01348876953125,
0.01276397705078125,
0.0173492431640625,
-0.04473876953125,
-0.08538818359375,
-0.0006232261657714844,
0.06939697265625,
0.01389312744140625,
-0.0206146240234375,
-0.00452423095703125,
0.00266265869140625,
0.061309814453125,
-0.005889892578125,
0.0238189697265625,
-0.0132904052734375,
0.0301361083984375,
-0.05023193359375,
-0.058258056640625,
0.00024437904357910156,
-0.020111083984375,
-0.01544952392578125,
-0.08111572265625,
0.00920867919921875,
-0.0017747879028320312,
0.08123779296875,
0.00695037841796875,
-0.03924560546875,
0.0233917236328125,
-0.0173797607421875,
0.060394287109375,
-0.0361328125,
0.0292510986328125,
0.038604736328125,
0.0390625,
0.0138397216796875,
-0.0616455078125,
-0.0176239013671875,
0.023101806640625,
0.026397705078125,
0.01641845703125,
0.0174713134765625,
-0.0187835693359375,
0.014892578125,
0.004421234130859375,
-0.0216827392578125,
-0.004474639892578125,
-0.0506591796875,
-0.031463623046875,
0.034393310546875,
0.0282440185546875,
0.0258331298828125,
-0.01788330078125,
-0.047698974609375,
0.0032062530517578125,
-0.0711669921875,
-0.002727508544921875,
0.021820068359375,
0.01203155517578125,
-0.04608154296875,
0.02532958984375,
-0.03045654296875,
0.0116729736328125,
0.01666259765625,
-0.0007052421569824219,
0.0391845703125,
-0.052978515625,
-0.041107177734375,
0.00040030479431152344,
0.0367431640625,
0.056488037109375,
0.004444122314453125,
0.06402587890625,
-0.00733184814453125,
-0.055633544921875,
0.0117340087890625,
-0.054534912109375,
-0.05169677734375,
0.026824951171875,
-0.041900634765625,
-0.0379638671875,
0.017791748046875,
-0.07373046875,
0.00875091552734375,
-0.01446533203125,
0.0134429931640625,
-0.032379150390625,
-0.043792724609375,
-0.01396942138671875,
-0.01483917236328125,
0.0364990234375,
0.0654296875,
-0.0516357421875,
0.00551605224609375,
0.0287628173828125,
0.007167816162109375,
-0.002231597900390625,
-0.0095367431640625,
-0.0238189697265625,
0.04095458984375,
-0.03253173828125,
0.034698486328125,
0.005645751953125,
-0.040283203125,
-0.0008130073547363281,
0.009185791015625,
0.033050537109375,
-0.0150909423828125,
0.0706787109375,
-0.061920166015625,
0.0151824951171875,
-0.06378173828125,
-0.01148223876953125,
-0.0206756591796875,
0.02435302734375,
-0.03668212890625,
0.0236968994140625,
0.0186004638671875,
-0.07403564453125,
0.0333251953125,
-0.035003662109375,
-0.02935791015625,
0.06634521484375,
-0.017242431640625,
-0.0283966064453125,
-0.0177154541015625,
0.0034923553466796875,
0.0404052734375,
-0.005611419677734375,
0.0086669921875,
-0.0157470703125,
-0.002292633056640625,
0.00634002685546875,
-0.0152130126953125,
0.07879638671875,
0.049652099609375,
0.004398345947265625,
0.0310211181640625,
-0.0146026611328125,
0.008270263671875,
0.051239013671875,
-0.035491943359375,
0.0018186569213867188,
-0.009552001953125,
0.015655517578125,
0.01189422607421875,
0.07281494140625,
-0.033905029296875,
0.0083465576171875,
-0.011322021484375,
0.0304412841796875,
0.08526611328125,
0.01033782958984375,
0.0237884521484375,
-0.02960205078125,
0.061737060546875,
-0.0000966787338256836,
0.0241851806640625,
-0.0174713134765625,
-0.04364013671875,
-0.040130615234375,
0.007266998291015625,
0.0132293701171875,
0.03594970703125,
-0.06878662109375,
0.037139892578125,
-0.006809234619140625,
-0.035369873046875,
0.03424072265625,
-0.00754547119140625,
-0.002490997314453125,
0.01287841796875,
0.035247802734375,
-0.0194091796875,
-0.0672607421875,
-0.0474853515625,
0.0084381103515625,
-0.016845703125,
-0.0176239013671875,
-0.0038318634033203125,
0.021575927734375,
-0.0012063980102539062,
0.01070404052734375,
-0.038482666015625,
-0.023529052734375,
-0.0245208740234375,
-0.021270751953125,
0.033172607421875,
0.042724609375,
0.039520263671875,
-0.07135009765625,
-0.069091796875,
-0.01149749755859375,
-0.0654296875,
0.0012083053588867188,
0.001026153564453125,
-0.005054473876953125,
0.0223388671875,
0.0129547119140625,
-0.0248870849609375,
0.0067291259765625,
0.036529541015625,
-0.027496337890625,
0.07611083984375,
-0.0249176025390625,
0.024200439453125,
-0.042022705078125,
-0.01453399658203125,
-0.033660888671875,
-0.032745361328125,
-0.0185699462890625,
0.0175323486328125,
0.0160675048828125,
0.0127105712890625,
-0.04400634765625,
0.0100860595703125,
-0.02880859375,
0.026275634765625,
-0.00791168212890625,
-0.003742218017578125,
-0.016815185546875,
0.011444091796875,
-0.03271484375,
0.032928466796875,
0.049530029296875,
-0.0535888671875,
0.03802490234375,
0.0657958984375,
-0.0015344619750976562,
0.051300048828125,
-0.04034423828125,
-0.0303955078125,
0.042022705078125,
0.0389404296875,
-0.03570556640625,
-0.0439453125,
0.044647216796875,
-0.05963134765625,
0.050384521484375,
0.0082855224609375,
-0.01751708984375,
-0.0303955078125,
-0.059234619140625,
0.050994873046875,
0.0440673828125,
0.0015344619750976562,
0.0179595947265625,
0.0413818359375,
-0.035186767578125,
-0.0273284912109375,
-0.03668212890625,
0.026702880859375,
-0.01029205322265625,
-0.0386962890625,
0.006137847900390625,
-0.0242462158203125,
0.0114593505859375,
-0.030364990234375,
-0.0011110305786132812,
-0.004985809326171875,
-0.01033782958984375,
-0.004451751708984375,
0.028839111328125,
-0.0297393798828125,
-0.042236328125,
0.01007843017578125,
-0.0248870849609375,
0.0224609375,
0.0100250244140625,
0.033111572265625,
-0.002796173095703125,
-0.032745361328125,
-0.055267333984375,
0.0247344970703125,
0.0291595458984375,
0.016143798828125,
-0.0266571044921875,
0.0308685302734375,
-0.0153656005859375,
0.0237884521484375,
-0.0183258056640625,
0.0016355514526367188,
-0.043212890625,
0.0205841064453125,
-0.044769287109375,
-0.0584716796875,
0.01474761962890625,
0.0049591064453125,
-0.0166778564453125,
0.075439453125,
0.03887939453125,
-0.0248870849609375,
0.07208251953125,
0.032562255859375,
0.007312774658203125,
0.0165557861328125,
-0.00807952880859375,
-0.01131439208984375,
-0.058013916015625,
-0.012054443359375,
-0.01861572265625,
0.0100860595703125,
-0.043731689453125,
-0.044342041015625,
0.026580810546875,
-0.002788543701171875,
-0.026519775390625,
0.0472412109375,
-0.042816162109375,
0.0052032470703125,
0.040863037109375,
-0.017059326171875,
-0.01041412353515625,
-0.051239013671875,
-0.0063629150390625,
-0.028564453125,
-0.04827880859375,
-0.0199737548828125,
0.08843994140625,
0.058685302734375,
0.0193939208984375,
-0.014801025390625,
0.032257080078125,
0.0290374755859375,
0.011260986328125,
0.01451873779296875,
0.0689697265625,
0.03375244140625,
-0.074462890625,
-0.03228759765625,
-0.0019779205322265625,
-0.06646728515625,
0.0243988037109375,
-0.009918212890625,
-0.031829833984375,
-0.0199737548828125,
0.0091094970703125,
0.0019702911376953125,
0.029998779296875,
-0.01016998291015625,
0.035400390625,
-0.023040771484375,
-0.0036220550537109375,
-0.027984619140625,
-0.07940673828125,
0.048980712890625,
-0.040802001953125,
0.0127105712890625,
-0.03656005859375,
-0.000530242919921875,
0.059814453125,
-0.031524658203125,
0.0157928466796875,
-0.04498291015625,
-0.0014133453369140625,
0.009735107421875,
0.0117950439453125,
0.026214599609375,
0.0172882080078125,
0.0245513916015625,
0.0517578125,
0.04248046875,
-0.04620361328125,
0.026611328125,
0.060546875,
-0.038665771484375,
-0.04736328125,
-0.06622314453125,
-0.0015726089477539062,
0.037628173828125,
0.03778076171875,
0.046234130859375,
0.009124755859375,
-0.02880859375,
0.01055908203125,
0.003875732421875,
-0.035919189453125,
0.03558349609375,
0.043060302734375,
-0.0185699462890625,
-0.06573486328125,
0.0819091796875,
0.00347900390625,
0.004665374755859375,
0.032501220703125,
0.0113067626953125,
-0.0008683204650878906,
-0.06341552734375,
-0.00334930419921875,
0.0262908935546875,
-0.02813720703125,
-0.044189453125,
-0.025299072265625,
-0.0257568359375,
-0.050567626953125,
-0.0240631103515625,
-0.0323486328125,
-0.033721923828125,
-0.0274200439453125,
-0.00983428955078125,
-0.02313232421875,
0.052032470703125,
-0.02642822265625,
0.02447509765625,
-0.0877685546875,
0.031982421875,
-0.0013751983642578125,
0.0249786376953125,
-0.016815185546875,
-0.0272674560546875,
-0.050933837890625,
-0.00847625732421875,
-0.0743408203125,
-0.0472412109375,
0.0506591796875,
-0.0097198486328125,
0.033966064453125,
0.07427978515625,
0.01107025146484375,
0.013824462890625,
-0.0487060546875,
0.091796875,
0.0460205078125,
-0.08099365234375,
0.030975341796875,
-0.08868408203125,
0.035308837890625,
0.018524169921875,
0.003204345703125,
-0.0184173583984375,
-0.0283050537109375,
-0.0711669921875,
-0.0574951171875,
0.048492431640625,
0.042205810546875,
-0.0002180337905883789,
0.0223236083984375,
-0.0083465576171875,
0.02166748046875,
0.044342041015625,
-0.027008056640625,
-0.038299560546875,
-0.0267486572265625,
0.01629638671875,
0.01258087158203125,
-0.044708251953125,
-0.0012674331665039062,
-0.056915283203125,
0.051055908203125,
0.01486968994140625,
0.06549072265625,
0.005443572998046875,
0.0267333984375,
-0.050506591796875,
0.0101776123046875,
0.0872802734375,
0.06402587890625,
-0.0462646484375,
0.01079559326171875,
0.0185546875,
-0.049102783203125,
-0.007015228271484375,
0.017852783203125,
0.016082763671875,
0.0025959014892578125,
0.041961669921875,
0.0001691579818725586,
0.0094146728515625,
-0.0066070556640625,
0.052337646484375,
-0.01045989990234375,
-0.035430908203125,
-0.0745849609375,
-0.00682830810546875,
-0.01058197021484375,
-0.004413604736328125,
0.050079345703125,
-0.0022602081298828125,
0.00026726722717285156,
0.004852294921875,
0.038970947265625,
0.0011539459228515625,
-0.041717529296875,
-0.057220458984375,
0.0767822265625,
0.05194091796875,
-0.012298583984375,
0.035491943359375,
0.008026123046875,
-0.06573486328125,
0.04248046875,
0.018218994140625,
0.046600341796875,
-0.060791015625,
-0.001964569091796875,
0.048553466796875,
0.008453369140625,
0.0012502670288085938,
0.038543701171875,
-0.005420684814453125,
-0.040618896484375,
-0.0207977294921875,
-0.0249481201171875,
-0.03289794921875,
0.01141357421875,
-0.031982421875,
0.037109375,
-0.0216827392578125,
0.0032501220703125,
-0.008087158203125,
0.00922393798828125,
-0.031646728515625,
0.0330810546875,
0.0024261474609375,
0.08392333984375,
-0.039642333984375,
0.0760498046875,
0.0012006759643554688,
-0.01216888427734375,
-0.006053924560546875,
-0.016571044921875,
0.0248260498046875,
-0.07012939453125,
0.042724609375,
0.01021575927734375,
0.0458984375,
0.0119476318359375,
-0.06134033203125,
-0.05303955078125,
0.0826416015625,
-0.00795745849609375,
-0.01401519775390625,
0.0193939208984375,
0.003406524658203125,
0.013336181640625,
-0.016143798828125,
-0.0179443359375,
0.0014524459838867188,
0.08416748046875,
-0.01403045654296875,
-0.06951904296875,
0.0274200439453125,
-0.016815185546875,
-0.033203125,
0.03582763671875,
-0.0321044921875,
0.056976318359375,
-0.00908660888671875,
-0.0189208984375,
-0.00560760498046875,
0.0645751953125,
0.019012451171875,
0.0294952392578125,
0.049652099609375,
0.0215606689453125,
0.00833892822265625,
-0.030975341796875,
0.08367919921875,
0.0030345916748046875,
0.0095367431640625,
0.050323486328125,
-0.000026166439056396484,
0.03302001953125,
0.045928955078125,
-0.052276611328125,
0.02349853515625,
0.06427001953125,
-0.0301513671875,
0.050689697265625,
-0.01861572265625,
0.0014286041259765625,
-0.0290679931640625,
0.0205535888671875,
-0.049285888671875,
0.00208282470703125,
0.030609130859375,
0.0034427642822265625,
-0.0095062255859375,
0.00015032291412353516,
0.005115509033203125,
-0.01788330078125,
0.004611968994140625,
0.044647216796875,
0.006641387939453125,
-0.03460693359375,
0.03924560546875,
0.0077056884765625,
0.0312347412109375,
-0.06231689453125,
-0.026702880859375,
0.03582763671875,
0.01203155517578125,
0.007793426513671875,
-0.07904052734375,
0.00553131103515625,
-0.043212890625,
0.0092620849609375,
-0.034027099609375,
0.042022705078125,
-0.02056884765625,
-0.00075531005859375,
0.008270263671875,
0.00018727779388427734,
0.026702880859375,
-0.007694244384765625,
-0.07659912109375,
0.0079345703125,
0.0309295654296875,
-0.0199432373046875,
0.03302001953125,
-0.0011968612670898438,
0.016571044921875,
0.0382080078125,
0.049346923828125,
0.0199432373046875,
-0.01910400390625,
-0.0191192626953125,
0.059326171875,
-0.054290771484375,
-0.059814453125,
-0.0458984375,
0.035308837890625,
-0.01544952392578125,
-0.0260162353515625,
0.060211181640625,
0.046539306640625,
0.040924072265625,
-0.0113983154296875,
0.07269287109375,
-0.0229949951171875,
0.03997802734375,
-0.0024929046630859375,
0.040557861328125,
-0.035675048828125,
-0.017578125,
-0.0095672607421875,
-0.036529541015625,
-0.01543426513671875,
0.0167999267578125,
0.01499176025390625,
0.002681732177734375,
0.0892333984375,
-0.003444671630859375,
0.01444244384765625,
0.0214691162109375,
0.033416748046875,
0.0095367431640625,
0.050628662109375,
0.043914794921875,
0.0192718505859375,
-0.037872314453125,
0.02081298828125,
-0.0472412109375,
-0.0028057098388671875,
-0.050384521484375,
-0.0452880859375,
-0.04254150390625,
-0.067138671875,
-0.0194549560546875,
-0.0161895751953125,
-0.03167724609375,
0.038238525390625,
0.0287017822265625,
-0.05523681640625,
-0.012969970703125,
0.011016845703125,
0.019134521484375,
-0.0167083740234375,
-0.01654052734375,
0.036834716796875,
-0.0011644363403320312,
-0.04510498046875,
0.01041412353515625,
0.03826904296875,
0.014129638671875,
-0.0206146240234375,
-0.0246734619140625,
-0.0216217041015625,
0.00811004638671875,
0.01070404052734375,
0.0217132568359375,
-0.05047607421875,
-0.01561737060546875,
-0.01522064208984375,
-0.01739501953125,
0.03778076171875,
0.05291748046875,
0.00738525390625,
0.038177490234375,
0.075927734375,
-0.0218048095703125,
0.037445068359375,
0.003936767578125,
0.08843994140625,
-0.028900146484375,
0.00093841552734375,
0.01480865478515625,
0.0287628173828125,
0.024200439453125,
-0.0445556640625,
0.0296783447265625,
0.03570556640625,
-0.07244873046875,
-0.0404052734375,
0.012481689453125,
-0.11517333984375,
-0.0229949951171875,
0.060302734375,
0.0092620849609375,
-0.0115814208984375,
-0.0312042236328125,
-0.0264892578125,
0.035247802734375,
-0.0257568359375,
0.00856781005859375,
0.04364013671875,
-0.01187896728515625,
-0.0007843971252441406,
-0.07672119140625,
0.044830322265625,
0.0113067626953125,
-0.041778564453125,
-0.018951416015625,
-0.00035572052001953125,
0.0173492431640625,
0.00849151611328125,
0.054779052734375,
-0.0263519287109375,
0.02630615234375,
-0.0007367134094238281,
0.01959228515625,
-0.0218353271484375,
0.00469970703125,
-0.0211029052734375,
0.010986328125,
-0.017608642578125,
-0.042388916015625
]
] |
facebook/contriever | 2022-01-19T17:23:28.000Z | [
"transformers",
"pytorch",
"bert",
"arxiv:2112.09118",
"endpoints_compatible",
"has_space",
"region:us"
] | null | facebook | null | null | facebook/contriever | 29 | 51,476 | transformers | 2022-03-02T23:29:05 | This model has been trained without supervision following the approach described in [Towards Unsupervised Dense Information Retrieval with Contrastive Learning](https://arxiv.org/abs/2112.09118). The associated GitHub repository is available here https://github.com/facebookresearch/contriever.
## Usage (HuggingFace Transformers)
Using the model directly with HuggingFace Transformers requires adding a mean pooling operation to obtain a sentence embedding.
```python
import torch
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained('facebook/contriever')
model = AutoModel.from_pretrained('facebook/contriever')
sentences = [
"Where was Marie Curie born?",
"Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867.",
"Born in Paris on 15 May 1859, Pierre Curie was the son of Eugène Curie, a doctor of French Catholic origin from Alsace."
]
# Apply tokenizer
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings (no gradient tracking needed at inference time)
with torch.no_grad():
    outputs = model(**inputs)
# Mean pooling over token embeddings, ignoring padding positions via the attention mask
def mean_pooling(token_embeddings, mask):
    # Zero out embeddings at padded positions, then average over the true sequence length
    token_embeddings = token_embeddings.masked_fill(~mask[..., None].bool(), 0.)
    sentence_embeddings = token_embeddings.sum(dim=1) / mask.sum(dim=1)[..., None]
    return sentence_embeddings
embeddings = mean_pooling(outputs[0], inputs['attention_mask'])
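# As a hypothetical follow-up step (not part of the original snippet), retrieval
# scores between the query (the first sentence) and the candidate passages can
# be computed with a dot product over the pooled embeddings:
scores = embeddings[0] @ embeddings[1:].T  # shape: (num_passages,)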
``` | 1,401 | [
[
-0.00980377197265625,
-0.04534912109375,
0.0232696533203125,
0.0266876220703125,
-0.0181427001953125,
-0.0308380126953125,
-0.0211334228515625,
-0.00954437255859375,
0.0262451171875,
0.034759521484375,
-0.0484619140625,
-0.035430908203125,
-0.045501708984375,
-0.01453399658203125,
-0.036773681640625,
0.07379150390625,
-0.00021278858184814453,
0.00868988037109375,
-0.0184478759765625,
-0.01444244384765625,
0.01308441162109375,
-0.028900146484375,
-0.042449951171875,
-0.02581787109375,
0.00995635986328125,
0.023529052734375,
0.042724609375,
0.031463623046875,
0.0259857177734375,
0.03369140625,
-0.0015630722045898438,
0.01702880859375,
-0.048614501953125,
-0.020599365234375,
-0.0187225341796875,
-0.012786865234375,
0.009246826171875,
0.016143798828125,
0.03765869140625,
0.05364990234375,
-0.01233673095703125,
0.0020046234130859375,
0.02978515625,
0.048614501953125,
-0.040313720703125,
0.02032470703125,
-0.033660888671875,
0.0030841827392578125,
0.01065826416015625,
0.004314422607421875,
-0.042694091796875,
-0.0169830322265625,
0.010772705078125,
-0.039520263671875,
0.0400390625,
-0.008544921875,
0.08367919921875,
0.022796630859375,
-0.04315185546875,
-0.01322174072265625,
-0.0217742919921875,
0.05975341796875,
-0.035797119140625,
0.029327392578125,
0.02899169921875,
0.00022900104522705078,
-0.006565093994140625,
-0.07025146484375,
-0.045654296875,
-0.0150909423828125,
-0.03485107421875,
0.00707244873046875,
-0.0212249755859375,
0.0108489990234375,
0.02520751953125,
0.036468505859375,
-0.049285888671875,
-0.0122833251953125,
-0.040924072265625,
-0.0186767578125,
0.0516357421875,
-0.0103607177734375,
-0.0011930465698242188,
-0.03350830078125,
-0.044647216796875,
-0.011016845703125,
-0.0004737377166748047,
0.0216827392578125,
0.0108795166015625,
0.0301055908203125,
-0.03961181640625,
0.052764892578125,
-0.032012939453125,
0.0303192138671875,
0.0247650146484375,
0.022247314453125,
0.053070068359375,
-0.01515960693359375,
-0.01812744140625,
-0.01163482666015625,
0.07623291015625,
0.0390625,
0.045867919921875,
-0.0130462646484375,
-0.0112457275390625,
0.0178070068359375,
0.0194854736328125,
-0.071533203125,
-0.0516357421875,
0.00655364990234375,
-0.042022705078125,
-0.027374267578125,
0.03271484375,
-0.040985107421875,
-0.007083892822265625,
-0.016265869140625,
0.039306640625,
-0.029571533203125,
-0.0159759521484375,
0.0084075927734375,
-0.00609588623046875,
0.0083770751953125,
-0.006866455078125,
-0.07025146484375,
0.0252838134765625,
0.0185394287109375,
0.059326171875,
-0.0004711151123046875,
-0.00868988037109375,
-0.051422119140625,
-0.00244140625,
-0.0024471282958984375,
0.041900634765625,
-0.04083251953125,
-0.0184783935546875,
0.016082763671875,
0.02716064453125,
-0.01947021484375,
-0.0362548828125,
0.04046630859375,
-0.0278167724609375,
0.02386474609375,
-0.0027923583984375,
-0.05828857421875,
-0.012054443359375,
0.014739990234375,
-0.037872314453125,
0.08795166015625,
0.03857421875,
-0.08087158203125,
0.00506591796875,
-0.034515380859375,
-0.02044677734375,
-0.01082611083984375,
-0.0162200927734375,
-0.046539306640625,
-0.005641937255859375,
0.028717041015625,
0.054351806640625,
0.0258026123046875,
0.0290374755859375,
-0.04278564453125,
-0.0306854248046875,
0.00804901123046875,
0.00047898292541503906,
0.08233642578125,
0.003566741943359375,
-0.0237579345703125,
0.013214111328125,
-0.058319091796875,
-0.0086669921875,
0.01151275634765625,
-0.01117706298828125,
-0.019256591796875,
-0.032379150390625,
0.035888671875,
0.0214996337890625,
0.0186767578125,
-0.056304931640625,
0.016357421875,
-0.0286102294921875,
0.0369873046875,
0.043212890625,
0.010711669921875,
0.04559326171875,
-0.021453857421875,
0.0150146484375,
0.024444580078125,
0.0193939208984375,
-0.0037670135498046875,
-0.01456451416015625,
-0.059844970703125,
-0.01515960693359375,
-0.0041046142578125,
0.03155517578125,
-0.062225341796875,
0.031219482421875,
0.006664276123046875,
-0.041107177734375,
-0.038177490234375,
0.031036376953125,
0.0286102294921875,
0.056854248046875,
0.052947998046875,
-0.01161956787109375,
-0.04852294921875,
-0.0692138671875,
-0.0008683204650878906,
-0.00986480712890625,
0.002262115478515625,
0.02313232421875,
0.045684814453125,
-0.04010009765625,
0.050567626953125,
-0.047515869140625,
-0.023101806640625,
0.00013196468353271484,
0.0011911392211914062,
0.03131103515625,
0.07080078125,
0.041961669921875,
-0.0704345703125,
-0.0267333984375,
-0.02301025390625,
-0.0706787109375,
0.00995635986328125,
0.0031452178955078125,
-0.028289794921875,
0.0072174072265625,
0.059478759765625,
-0.06787109375,
0.042327880859375,
0.0281219482421875,
-0.0318603515625,
0.041900634765625,
-0.0077362060546875,
-0.0163116455078125,
-0.0947265625,
0.00650787353515625,
0.0270538330078125,
-0.02392578125,
-0.033782958984375,
0.0030574798583984375,
0.024383544921875,
0.005794525146484375,
-0.0311279296875,
0.046875,
-0.026885986328125,
0.022308349609375,
-0.01507568359375,
0.014923095703125,
0.0103607177734375,
0.037628173828125,
-0.00037980079650878906,
0.025787353515625,
0.053466796875,
-0.061370849609375,
0.038665771484375,
0.04931640625,
-0.0173797607421875,
0.0345458984375,
-0.047576904296875,
-0.00861358642578125,
-0.01421356201171875,
0.01496124267578125,
-0.08038330078125,
-0.018218994140625,
0.0257110595703125,
-0.046234130859375,
0.0222625732421875,
-0.0039005279541015625,
-0.04290771484375,
-0.04339599609375,
-0.025177001953125,
0.0239105224609375,
0.0290374755859375,
-0.0565185546875,
0.04962158203125,
0.017791748046875,
0.00945281982421875,
-0.048797607421875,
-0.0653076171875,
-0.036407470703125,
-0.01345062255859375,
-0.05023193359375,
0.0394287109375,
-0.00467681884765625,
-0.00122833251953125,
0.0147857666015625,
0.00020503997802734375,
-0.01287841796875,
-0.002994537353515625,
0.008209228515625,
0.0202789306640625,
-0.002346038818359375,
0.0199737548828125,
0.00957489013671875,
-0.00588226318359375,
0.0103912353515625,
-0.0086669921875,
0.032440185546875,
-0.0090179443359375,
-0.00429534912109375,
-0.04046630859375,
0.01263427734375,
-0.0036678314208984375,
-0.0158843994140625,
0.058135986328125,
0.08184814453125,
-0.038726806640625,
-0.02069091796875,
-0.0548095703125,
-0.033660888671875,
-0.041656494140625,
0.026031494140625,
-0.00763702392578125,
-0.08612060546875,
0.05413818359375,
0.011505126953125,
0.0002994537353515625,
0.03997802734375,
0.0240020751953125,
-0.01178741455078125,
0.05218505859375,
0.07635498046875,
-0.002872467041015625,
0.049530029296875,
-0.034637451171875,
-0.00739288330078125,
-0.052886962890625,
-0.0282135009765625,
-0.0100555419921875,
-0.0138397216796875,
-0.06573486328125,
-0.01837158203125,
0.004291534423828125,
0.0035152435302734375,
-0.04339599609375,
0.0300140380859375,
-0.05267333984375,
0.016845703125,
0.048248291015625,
0.04669189453125,
0.003711700439453125,
0.002696990966796875,
-0.0119171142578125,
0.0086517333984375,
-0.045318603515625,
-0.01201629638671875,
0.065673828125,
0.010162353515625,
0.0650634765625,
-0.0168609619140625,
0.059326171875,
-0.001094818115234375,
0.0217132568359375,
-0.03729248046875,
0.0286865234375,
-0.02294921875,
-0.038421630859375,
-0.007099151611328125,
-0.0521240234375,
-0.07550048828125,
0.020111083984375,
-0.0120086669921875,
-0.06268310546875,
0.0263824462890625,
0.0072174072265625,
-0.0289459228515625,
0.0087432861328125,
-0.05780029296875,
0.07733154296875,
0.010894775390625,
-0.03033447265625,
-0.00406646728515625,
-0.06854248046875,
0.0278778076171875,
0.014892578125,
0.004055023193359375,
0.01233673095703125,
0.01058197021484375,
0.06396484375,
-0.0125579833984375,
0.06951904296875,
-0.028900146484375,
0.026336669921875,
0.0190582275390625,
-0.0038394927978515625,
0.019012451171875,
0.0037097930908203125,
-0.00977325439453125,
-0.0109710693359375,
0.0008058547973632812,
-0.05322265625,
-0.031890869140625,
0.061126708984375,
-0.06414794921875,
-0.027618408203125,
-0.0253753662109375,
-0.049163818359375,
-0.005237579345703125,
0.02178955078125,
0.023468017578125,
0.03338623046875,
-0.009552001953125,
0.021087646484375,
0.04791259765625,
-0.0114898681640625,
0.042449951171875,
0.0196685791015625,
-0.02410888671875,
-0.021942138671875,
0.0369873046875,
0.004779815673828125,
-0.003002166748046875,
0.0227508544921875,
0.01334381103515625,
-0.032745361328125,
0.0034809112548828125,
-0.0015287399291992188,
0.057342529296875,
-0.054351806640625,
-0.01357269287109375,
-0.07598876953125,
-0.0311431884765625,
-0.0498046875,
-0.012847900390625,
-0.024810791015625,
-0.0177459716796875,
-0.043212890625,
-0.0279083251953125,
0.02880859375,
0.03887939453125,
-0.01219940185546875,
0.040283203125,
-0.045135498046875,
0.023834228515625,
0.002765655517578125,
0.0185394287109375,
-0.0033721923828125,
-0.06298828125,
-0.0286102294921875,
-0.0079345703125,
-0.03472900390625,
-0.04766845703125,
0.0284881591796875,
0.0168304443359375,
0.04840087890625,
0.04998779296875,
0.0226593017578125,
0.046478271484375,
-0.018768310546875,
0.03485107421875,
0.005176544189453125,
-0.059661865234375,
0.033721923828125,
-0.01494598388671875,
0.012939453125,
0.0517578125,
0.044525146484375,
-0.018768310546875,
-0.0213470458984375,
-0.07086181640625,
-0.06414794921875,
0.05206298828125,
0.0224456787109375,
0.038177490234375,
-0.0175323486328125,
0.0413818359375,
-0.00814056396484375,
0.0171356201171875,
-0.08636474609375,
-0.02386474609375,
-0.019927978515625,
-0.03875732421875,
-0.012359619140625,
-0.015838623046875,
-0.0008540153503417969,
-0.03131103515625,
0.06951904296875,
-0.0100250244140625,
0.0181427001953125,
0.03155517578125,
-0.0203094482421875,
-0.001354217529296875,
-0.0205535888671875,
0.001789093017578125,
0.033111572265625,
-0.017486572265625,
0.01385498046875,
-0.006343841552734375,
-0.01137542724609375,
-0.01491546630859375,
0.050048828125,
-0.00760650634765625,
0.0219268798828125,
0.0231170654296875,
0.044342041015625,
0.03350830078125,
-0.0330810546875,
0.0667724609375,
-0.0019550323486328125,
-0.021392822265625,
-0.0389404296875,
-0.00974273681640625,
0.01617431640625,
0.047149658203125,
0.023223876953125,
-0.0092620849609375,
0.01317596435546875,
-0.0226593017578125,
0.049072265625,
0.0372314453125,
-0.041595458984375,
-0.0125579833984375,
0.05523681640625,
-0.00902557373046875,
0.0114593505859375,
0.07244873046875,
-0.01158905029296875,
-0.032012939453125,
0.0303955078125,
0.0303955078125,
0.052581787109375,
-0.0343017578125,
0.0255584716796875,
0.050506591796875,
0.0169525146484375,
-0.01157379150390625,
0.01861572265625,
-0.0144195556640625,
-0.062255859375,
-0.0151824951171875,
-0.05206298828125,
-0.01377105712890625,
0.0030975341796875,
-0.07293701171875,
0.02215576171875,
-0.020904541015625,
-0.00799560546875,
0.004970550537109375,
0.0036678314208984375,
-0.060028076171875,
0.02691650390625,
-0.0024852752685546875,
0.0631103515625,
-0.0843505859375,
0.051788330078125,
0.052947998046875,
-0.04913330078125,
-0.0743408203125,
-0.00933837890625,
-0.0256500244140625,
-0.06072998046875,
0.032073974609375,
0.0635986328125,
0.019378662109375,
0.01471710205078125,
-0.06475830078125,
-0.054107666015625,
0.06988525390625,
0.032928466796875,
-0.03759765625,
-0.012939453125,
-0.0183868408203125,
0.035491943359375,
-0.026702880859375,
0.0167236328125,
0.0195465087890625,
0.0059814453125,
0.008514404296875,
-0.044952392578125,
0.01502227783203125,
-0.044464111328125,
-0.00534820556640625,
-0.0198822021484375,
-0.058258056640625,
0.0712890625,
-0.0182037353515625,
-0.01438140869140625,
0.01202392578125,
0.074462890625,
0.0218658447265625,
0.031494140625,
0.03466796875,
0.056427001953125,
0.052947998046875,
-0.0010776519775390625,
0.07501220703125,
-0.0263824462890625,
0.06414794921875,
0.0728759765625,
0.0084686279296875,
0.049530029296875,
0.04461669921875,
-0.00482177734375,
0.051513671875,
0.05517578125,
-0.017913818359375,
0.04852294921875,
0.00821685791015625,
-0.0125579833984375,
-0.0201263427734375,
-0.01313018798828125,
-0.01666259765625,
0.01183319091796875,
0.009063720703125,
-0.05181884765625,
-0.022369384765625,
0.01312255859375,
0.016326904296875,
0.0211639404296875,
-0.0172576904296875,
0.046539306640625,
0.026824951171875,
-0.037078857421875,
0.039459228515625,
-0.00634002685546875,
0.0579833984375,
-0.03326416015625,
0.00882720947265625,
-0.01496124267578125,
0.034088134765625,
-0.03424072265625,
-0.051513671875,
0.0207061767578125,
-0.01076507568359375,
-0.0112762451171875,
0.004009246826171875,
0.039398193359375,
-0.05987548828125,
-0.059478759765625,
0.043243408203125,
0.0295867919921875,
0.0131072998046875,
-0.0164794921875,
-0.061859130859375,
-0.00937652587890625,
0.0031032562255859375,
-0.035858154296875,
0.0114593505859375,
0.037353515625,
0.0026454925537109375,
0.0504150390625,
0.04254150390625,
-0.02886962890625,
-0.0014677047729492188,
0.01763916015625,
0.1015625,
-0.038787841796875,
-0.04986572265625,
-0.08026123046875,
0.045562744140625,
-0.00334930419921875,
-0.006496429443359375,
0.061981201171875,
0.051116943359375,
0.0875244140625,
0.001445770263671875,
0.0343017578125,
0.0029144287109375,
0.0267333984375,
-0.0283355712890625,
0.042083740234375,
-0.04541015625,
-0.03076171875,
-0.03900146484375,
-0.0767822265625,
-0.0209197998046875,
0.07891845703125,
-0.0194549560546875,
0.026031494140625,
0.047332763671875,
0.068115234375,
-0.0244598388671875,
-0.0061187744140625,
-0.0004608631134033203,
0.02392578125,
0.01229095458984375,
0.02471923828125,
0.03173828125,
-0.040740966796875,
0.04107666015625,
-0.05242919921875,
-0.0240478515625,
-0.011016845703125,
-0.04864501953125,
-0.08721923828125,
-0.04290771484375,
-0.02899169921875,
-0.048553466796875,
-0.004730224609375,
0.07757568359375,
0.0555419921875,
-0.06134033203125,
-0.0007061958312988281,
-0.01033782958984375,
-0.0167694091796875,
-0.013671875,
-0.022796630859375,
0.049530029296875,
-0.0247802734375,
-0.0528564453125,
0.002239227294921875,
-0.019317626953125,
0.00627899169921875,
-0.007526397705078125,
0.00728607177734375,
-0.036041259765625,
-0.0073089599609375,
0.0286865234375,
-0.006832122802734375,
-0.03350830078125,
-0.0205841064453125,
-0.010528564453125,
-0.0197296142578125,
0.006336212158203125,
0.01358795166015625,
-0.047821044921875,
0.03607177734375,
0.0117645263671875,
0.050506591796875,
0.08172607421875,
0.026824951171875,
0.0017604827880859375,
-0.07012939453125,
0.02423095703125,
0.00786590576171875,
0.038421630859375,
0.058074951171875,
-0.044464111328125,
0.046051025390625,
0.032745361328125,
-0.0458984375,
-0.047393798828125,
0.00838470458984375,
-0.0999755859375,
0.0041351318359375,
0.083984375,
-0.0235748291015625,
-0.0227508544921875,
0.01094818115234375,
-0.01300811767578125,
0.055084228515625,
-0.0400390625,
0.059661865234375,
0.03887939453125,
-0.00789642333984375,
-0.0282440185546875,
-0.0243988037109375,
0.01491546630859375,
0.0291290283203125,
-0.0214385986328125,
-0.005283355712890625,
0.0219879150390625,
0.028778076171875,
0.007358551025390625,
0.04998779296875,
-0.01194000244140625,
0.00930023193359375,
-0.00872039794921875,
0.01348876953125,
-0.0141448974609375,
0.0069427490234375,
-0.0146026611328125,
-0.0024089813232421875,
-0.03521728515625,
-0.01416778564453125
]
] |
onlplab/alephbert-base | 2022-06-26T09:32:47.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"language model",
"he",
"dataset:oscar",
"dataset:wikipedia",
"dataset:twitter",
"arxiv:1810.04805",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | onlplab | null | null | onlplab/alephbert-base | 12 | 51,263 | transformers | 2022-03-02T23:29:05 | ---
language:
- he
tags:
- language model
license: apache-2.0
datasets:
- oscar
- wikipedia
- twitter
---
# AlephBERT
## Hebrew Language Model
State-of-the-art language model for Hebrew.
Based on Google's BERT architecture [(Devlin et al. 2018)](https://arxiv.org/abs/1810.04805).
#### How to use
```python
from transformers import BertModel, BertTokenizerFast
alephbert_tokenizer = BertTokenizerFast.from_pretrained('onlplab/alephbert-base')
alephbert = BertModel.from_pretrained('onlplab/alephbert-base')
# if not fine-tuning, disable dropout by switching to eval mode
alephbert.eval()
```
## Training data
1. OSCAR [(Ortiz, 2019)](https://oscar-corpus.com/) Hebrew section (10 GB text, 20 million sentences).
2. Hebrew dump of [Wikipedia](https://dumps.wikimedia.org/hewiki/latest/) (650 MB text, 3 million sentences).
3. Hebrew Tweets collected from the Twitter sample stream (7 GB text, 70 million sentences).
## Training procedure
Trained on a DGX machine (8 V100 GPUs) using the standard huggingface training procedure.
Since the majority of our training data consists of tweets, we decided to start by optimizing with the Masked Language Model loss only.
To reduce training time, we split the data into 4 sections based on the maximum number of tokens per sentence:
1. num tokens < 32 (70M sentences)
2. 32 <= num tokens < 64 (12M sentences)
3. 64 <= num tokens < 128 (10M sentences)
4. 128 <= num tokens < 512 (1.5M sentences)
Each section was first trained for 5 epochs with an initial learning rate of 1e-4, then for another 5 epochs with an initial learning rate of 1e-5, for a total of 10 epochs.
Total training time was 8 days.
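The length-based split described above can be sketched as follows. This is a minimal illustration, not the authors' code: the bucket boundaries are taken from the card, while a whitespace split stands in for the real subword tokenizer.

```python
def bucket_for(num_tokens: int) -> int:
    """Return the training-section index (1-4) for a sentence length,
    using the boundaries listed in the card."""
    if num_tokens < 32:
        return 1
    if num_tokens < 64:
        return 2
    if num_tokens < 128:
        return 3
    if num_tokens < 512:
        return 4
    raise ValueError("sentence exceeds the 512-token maximum")


def split_into_sections(sentences):
    """Group sentences into the four length-based training sections."""
    sections = {1: [], 2: [], 3: [], 4: []}
    for s in sentences:
        n = len(s.split())  # stand-in for real subword tokenization
        sections[bucket_for(n)].append(s)
    return sections
```

Training each bucket separately lets short sequences be packed into large batches without padding every example to 512 tokens.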
| 1,647 | [
[
-0.0206146240234375,
-0.05126953125,
-0.00147247314453125,
0.05029296875,
-0.025726318359375,
-0.007358551025390625,
-0.023895263671875,
-0.0174407958984375,
0.0056915283203125,
0.0142822265625,
-0.03192138671875,
-0.04510498046875,
-0.0457763671875,
-0.0081634521484375,
-0.045196533203125,
0.08489990234375,
-0.0025844573974609375,
0.01580810546875,
0.0311431884765625,
-0.00884246826171875,
0.00597381591796875,
-0.0025653839111328125,
-0.046875,
-0.031646728515625,
0.016571044921875,
0.004413604736328125,
0.05499267578125,
0.05328369140625,
0.0219573974609375,
0.0231475830078125,
-0.016204833984375,
-0.00435638427734375,
-0.0298919677734375,
-0.01145172119140625,
0.00946044921875,
-0.01085662841796875,
-0.0165557861328125,
0.00081634521484375,
0.0626220703125,
0.04754638671875,
-0.019927978515625,
0.028289794921875,
-0.01285552978515625,
0.048126220703125,
-0.03985595703125,
0.014739990234375,
-0.056976318359375,
0.02191162109375,
-0.0207061767578125,
-0.00212860107421875,
-0.030426025390625,
-0.041229248046875,
0.0184326171875,
-0.0203857421875,
0.0089111328125,
0.02532958984375,
0.0938720703125,
0.0113677978515625,
0.006473541259765625,
-0.015472412109375,
-0.041015625,
0.059478759765625,
-0.049713134765625,
0.041961669921875,
0.0384521484375,
0.0010499954223632812,
0.006534576416015625,
-0.043121337890625,
-0.0489501953125,
-0.0217132568359375,
0.01548004150390625,
0.0017910003662109375,
-0.043243408203125,
-0.005718231201171875,
0.007335662841796875,
0.031707763671875,
-0.055572509765625,
0.003704071044921875,
-0.046478271484375,
-0.0302734375,
0.03564453125,
-0.00241851806640625,
0.032928466796875,
-0.01036834716796875,
-0.032470703125,
-0.016204833984375,
-0.03533935546875,
0.00263214111328125,
0.035308837890625,
0.016632080078125,
-0.01393890380859375,
0.040679931640625,
-0.0022563934326171875,
0.04351806640625,
0.0020809173583984375,
-0.01093292236328125,
0.0394287109375,
-0.00917816162109375,
-0.01861572265625,
0.011871337890625,
0.07476806640625,
0.01239013671875,
0.029998779296875,
-0.0193634033203125,
-0.01297760009765625,
-0.004222869873046875,
0.0293731689453125,
-0.06890869140625,
-0.020111083984375,
0.005672454833984375,
-0.044036865234375,
-0.02386474609375,
-0.00603485107421875,
-0.0213623046875,
-0.001644134521484375,
-0.00750732421875,
0.0352783203125,
-0.08349609375,
-0.005588531494140625,
0.02056884765625,
-0.0003268718719482422,
0.037506103515625,
0.0180816650390625,
-0.06549072265625,
0.0295562744140625,
0.03485107421875,
0.0635986328125,
-0.0031280517578125,
-0.0302276611328125,
-0.03753662109375,
-0.0287017822265625,
-0.0347900390625,
0.033355712890625,
-0.006511688232421875,
0.0160980224609375,
0.01319122314453125,
-0.0274200439453125,
-0.01268768310546875,
-0.029998779296875,
0.04034423828125,
-0.05377197265625,
0.026123046875,
0.029632568359375,
-0.05731201171875,
-0.007076263427734375,
0.01458740234375,
-0.03582763671875,
0.082763671875,
0.0290985107421875,
-0.06622314453125,
0.0022125244140625,
-0.03826904296875,
-0.0305023193359375,
0.0172882080078125,
-0.00696563720703125,
-0.04351806640625,
0.0157470703125,
0.017608642578125,
0.0322265625,
-0.0012311935424804688,
0.0255126953125,
0.0058746337890625,
-0.033294677734375,
0.01024627685546875,
-0.01198577880859375,
0.0850830078125,
0.0162353515625,
-0.03631591796875,
0.0006561279296875,
-0.056427001953125,
0.005096435546875,
0.002872467041015625,
-0.042388916015625,
-0.005382537841796875,
-0.020599365234375,
0.037872314453125,
0.00505828857421875,
0.006130218505859375,
-0.0677490234375,
0.0016393661499023438,
-0.0577392578125,
0.022308349609375,
0.0540771484375,
-0.040618896484375,
0.03485107421875,
-0.01548004150390625,
0.032867431640625,
-0.00696563720703125,
0.00743865966796875,
-0.00960540771484375,
-0.0390625,
-0.05853271484375,
-0.0164794921875,
0.032501220703125,
0.039581298828125,
-0.0528564453125,
0.0635986328125,
-0.02215576171875,
-0.04461669921875,
-0.07110595703125,
0.0038394927978515625,
0.0355224609375,
0.031280517578125,
0.030853271484375,
-0.02313232421875,
-0.036224365234375,
-0.0780029296875,
-0.0005273818969726562,
-0.034637451171875,
0.006336212158203125,
-0.0106964111328125,
0.043701171875,
-0.00885772705078125,
0.07611083984375,
-0.01467132568359375,
-0.003627777099609375,
-0.03131103515625,
0.0106353759765625,
0.021881103515625,
0.042236328125,
0.03997802734375,
-0.045135498046875,
-0.053619384765625,
-0.01119232177734375,
-0.036895751953125,
-0.006885528564453125,
0.0006475448608398438,
-0.01282501220703125,
0.02789306640625,
0.0300750732421875,
-0.044952392578125,
0.045867919921875,
0.05267333984375,
-0.0251007080078125,
0.045867919921875,
0.003955841064453125,
-0.0176239013671875,
-0.08197021484375,
0.00592041015625,
0.0009765625,
-0.01255035400390625,
-0.053558349609375,
-0.0032329559326171875,
0.0225830078125,
-0.006244659423828125,
-0.054229736328125,
0.056121826171875,
-0.0173187255859375,
0.0133209228515625,
-0.016143798828125,
0.0050048828125,
-0.014129638671875,
0.05023193359375,
-0.0087738037109375,
0.053558349609375,
0.05206298828125,
-0.03240966796875,
0.03326416015625,
0.060211181640625,
-0.05560302734375,
0.0024776458740234375,
-0.05755615234375,
-0.0035610198974609375,
-0.0242156982421875,
0.01186370849609375,
-0.057891845703125,
-0.0019330978393554688,
0.03887939453125,
-0.05438232421875,
0.02545166015625,
0.00856781005859375,
-0.05328369140625,
-0.01406097412109375,
-0.036834716796875,
0.0228729248046875,
0.054351806640625,
-0.043182373046875,
0.04827880859375,
0.00933074951171875,
-0.00724029541015625,
-0.058685302734375,
-0.049560546875,
0.00304412841796875,
-0.00556182861328125,
-0.05615234375,
0.046112060546875,
-0.01428985595703125,
-0.00807952880859375,
0.01053619384765625,
-0.0037631988525390625,
0.0031833648681640625,
-0.00804901123046875,
0.0030307769775390625,
0.018768310546875,
-0.008758544921875,
0.0225677490234375,
-0.0020904541015625,
-0.0067596435546875,
-0.0196533203125,
-0.01309967041015625,
0.0816650390625,
-0.043365478515625,
-0.01268768310546875,
-0.059967041015625,
0.00991058349609375,
0.0177459716796875,
-0.01543426513671875,
0.08648681640625,
0.08477783203125,
-0.02496337890625,
-0.007289886474609375,
-0.043914794921875,
0.005298614501953125,
-0.0328369140625,
0.042144775390625,
-0.007755279541015625,
-0.0662841796875,
0.03314208984375,
0.0244140625,
0.0125274658203125,
0.034149169921875,
0.0556640625,
0.00696563720703125,
0.045684814453125,
0.04449462890625,
-0.038787841796875,
0.049530029296875,
-0.01491546630859375,
0.01324462890625,
-0.051727294921875,
-0.0173797607421875,
-0.034576416015625,
-0.00751495361328125,
-0.044403076171875,
-0.0247039794921875,
0.0128326416015625,
0.00850677490234375,
-0.043182373046875,
0.048370361328125,
-0.024322509765625,
0.0199127197265625,
0.056243896484375,
0.007541656494140625,
-0.0130157470703125,
0.0302276611328125,
-0.0030460357666015625,
0.005405426025390625,
-0.05303955078125,
-0.035430908203125,
0.06298828125,
0.052642822265625,
0.068359375,
0.0016803741455078125,
0.05572509765625,
0.010955810546875,
0.016632080078125,
-0.061676025390625,
0.03900146484375,
0.0014438629150390625,
-0.05474853515625,
-0.025604248046875,
-0.031646728515625,
-0.06964111328125,
0.01265716552734375,
-0.0083770751953125,
-0.07183837890625,
-0.0258331298828125,
0.0037708282470703125,
-0.01873779296875,
0.032989501953125,
-0.03900146484375,
0.05029296875,
-0.0041961669921875,
-0.00659942626953125,
0.00531768798828125,
-0.07452392578125,
0.006175994873046875,
0.0008778572082519531,
-0.0009908676147460938,
-0.0198974609375,
0.007106781005859375,
0.06915283203125,
-0.01366424560546875,
0.06134033203125,
0.00341033935546875,
0.0046234130859375,
0.0132293701171875,
-0.004238128662109375,
0.0201416015625,
-0.01467132568359375,
-0.011199951171875,
0.0199127197265625,
-0.022552490234375,
-0.0396728515625,
-0.01410675048828125,
0.0262298583984375,
-0.08599853515625,
-0.0075836181640625,
-0.03375244140625,
-0.04241943359375,
0.002056121826171875,
0.0059356689453125,
0.035308837890625,
0.034454345703125,
-0.0239715576171875,
0.021331787109375,
0.045867919921875,
-0.043212890625,
0.045684814453125,
0.0230865478515625,
-0.0242156982421875,
-0.039947509765625,
0.0540771484375,
-0.01457977294921875,
0.007495880126953125,
0.03179931640625,
0.029022216796875,
-0.01541900634765625,
-0.04656982421875,
-0.0428466796875,
0.03179931640625,
-0.04541015625,
-0.01523590087890625,
-0.059722900390625,
-0.045196533203125,
-0.049896240234375,
0.00847625732421875,
-0.04644775390625,
-0.0309906005859375,
-0.0107879638671875,
0.01123046875,
0.0249786376953125,
0.0249481201171875,
-0.0025634765625,
0.0474853515625,
-0.064453125,
0.01222991943359375,
-0.00952911376953125,
0.035491943359375,
-0.00021791458129882812,
-0.06695556640625,
-0.0355224609375,
-0.0141754150390625,
-0.02337646484375,
-0.05059814453125,
0.04205322265625,
0.0277099609375,
0.034912109375,
0.0343017578125,
0.0015859603881835938,
0.03436279296875,
-0.06353759765625,
0.049163818359375,
0.0305938720703125,
-0.07830810546875,
0.037872314453125,
-0.016845703125,
0.038116455078125,
0.04095458984375,
0.037322998046875,
-0.055877685546875,
-0.0294952392578125,
-0.0531005859375,
-0.07049560546875,
0.07562255859375,
0.0611572265625,
0.035491943359375,
-0.006130218505859375,
0.0212249755859375,
0.014556884765625,
0.0290679931640625,
-0.06976318359375,
-0.036346435546875,
-0.0136871337890625,
-0.03839111328125,
0.00872802734375,
-0.004611968994140625,
0.00013053417205810547,
-0.037384033203125,
0.08013916015625,
0.00402069091796875,
0.0435791015625,
-0.004970550537109375,
-0.0190277099609375,
-0.00893402099609375,
0.00736236572265625,
0.040435791015625,
0.047698974609375,
-0.025634765625,
-0.0040130615234375,
0.0006327629089355469,
-0.04547119140625,
-0.0164642333984375,
0.0260009765625,
-0.0220489501953125,
0.0232391357421875,
0.0423583984375,
0.087646484375,
-0.0125579833984375,
-0.053436279296875,
0.047149658203125,
-0.0002989768981933594,
-0.01654052734375,
-0.0494384765625,
-0.01079559326171875,
0.007045745849609375,
0.03179931640625,
0.032501220703125,
0.00551605224609375,
-0.0122222900390625,
-0.028656005859375,
0.0173492431640625,
0.0092620849609375,
-0.026031494140625,
-0.02874755859375,
0.038909912109375,
-0.00522613525390625,
-0.040618896484375,
0.0792236328125,
-0.01068878173828125,
-0.05780029296875,
0.0281524658203125,
0.04046630859375,
0.06500244140625,
-0.0240936279296875,
0.0228118896484375,
0.04034423828125,
0.01800537109375,
-0.0025653839111328125,
0.00726318359375,
0.0018215179443359375,
-0.07916259765625,
-0.028594970703125,
-0.0841064453125,
-0.013885498046875,
0.0242919921875,
-0.049560546875,
0.005001068115234375,
-0.02490234375,
-0.0022449493408203125,
0.01241302490234375,
-0.00930023193359375,
-0.057159423828125,
0.0050201416015625,
0.0207366943359375,
0.08392333984375,
-0.066162109375,
0.0626220703125,
0.05029296875,
-0.033935546875,
-0.060699462890625,
-0.00299835205078125,
-0.018646240234375,
-0.097412109375,
0.05877685546875,
0.028717041015625,
0.003955841064453125,
0.001903533935546875,
-0.06396484375,
-0.054473876953125,
0.058990478515625,
0.0203094482421875,
-0.03369140625,
-0.004787445068359375,
-0.0028514862060546875,
0.04461669921875,
-0.0281982421875,
0.0028934478759765625,
0.034698486328125,
0.00787353515625,
0.0123748779296875,
-0.07012939453125,
-0.0164642333984375,
-0.047149658203125,
0.017669677734375,
0.01221466064453125,
-0.03741455078125,
0.067138671875,
0.004974365234375,
-0.005374908447265625,
0.023284912109375,
0.05657958984375,
0.0169830322265625,
0.0037937164306640625,
0.042755126953125,
0.05157470703125,
0.025543212890625,
-0.0008554458618164062,
0.072021484375,
-0.0311279296875,
0.0462646484375,
0.06817626953125,
0.01349639892578125,
0.060791015625,
0.037811279296875,
-0.0139617919921875,
0.03900146484375,
0.048309326171875,
-0.019683837890625,
0.045074462890625,
0.031524658203125,
-0.020233154296875,
-0.01132965087890625,
-0.0019855499267578125,
-0.0145721435546875,
0.039764404296875,
0.01107025146484375,
-0.035736083984375,
-0.0241851806640625,
0.021759033203125,
0.025238037109375,
-0.0029735565185546875,
0.0007867813110351562,
0.035797119140625,
-0.0127716064453125,
-0.04083251953125,
0.0435791015625,
0.0171356201171875,
0.0457763671875,
-0.045928955078125,
0.0159454345703125,
-0.01212310791015625,
0.0225372314453125,
0.00780487060546875,
-0.044097900390625,
0.01410675048828125,
0.0036678314208984375,
0.01065826416015625,
-0.031768798828125,
0.042449951171875,
-0.040557861328125,
-0.045928955078125,
0.01849365234375,
0.043365478515625,
0.0350341796875,
-0.00238037109375,
-0.06500244140625,
0.021331787109375,
0.005786895751953125,
-0.03729248046875,
0.011749267578125,
0.0160064697265625,
0.0086669921875,
0.05377197265625,
0.0157012939453125,
0.007099151611328125,
0.01107025146484375,
0.01430511474609375,
0.0836181640625,
-0.045562744140625,
-0.0253143310546875,
-0.072998046875,
0.035400390625,
-0.0013141632080078125,
-0.0274658203125,
0.048492431640625,
0.0386962890625,
0.0712890625,
-0.00626373291015625,
0.04547119140625,
-0.029144287109375,
0.0296630859375,
-0.03192138671875,
0.05108642578125,
-0.037628173828125,
-0.005313873291015625,
-0.014434814453125,
-0.07293701171875,
-0.025177001953125,
0.073974609375,
-0.00009173154830932617,
0.0064544677734375,
0.0462646484375,
0.0548095703125,
0.0014104843139648438,
-0.0017290115356445312,
0.01128387451171875,
0.01146697998046875,
0.020599365234375,
0.0154266357421875,
0.0545654296875,
-0.051055908203125,
0.04473876953125,
-0.027099609375,
-0.019805908203125,
-0.0111846923828125,
-0.051727294921875,
-0.07757568359375,
-0.044586181640625,
-0.03167724609375,
-0.042724609375,
-0.0009946823120117188,
0.10089111328125,
0.049530029296875,
-0.053863525390625,
-0.017303466796875,
-0.01085662841796875,
-0.0172119140625,
-0.004627227783203125,
-0.0119476318359375,
0.05181884765625,
-0.035552978515625,
-0.0633544921875,
0.017822265625,
0.00942230224609375,
0.0014438629150390625,
-0.0123291015625,
-0.01384735107421875,
-0.037200927734375,
-0.004711151123046875,
0.042022705078125,
0.0020160675048828125,
-0.06317138671875,
-0.016204833984375,
0.0016269683837890625,
-0.0149688720703125,
0.01338958740234375,
0.0271148681640625,
-0.032867431640625,
0.0288848876953125,
0.024169921875,
0.057525634765625,
0.0635986328125,
-0.0045013427734375,
0.0242919921875,
-0.06671142578125,
0.00623321533203125,
0.0239410400390625,
0.025146484375,
0.032135009765625,
0.00910186767578125,
0.038970947265625,
0.00788116455078125,
-0.03387451171875,
-0.03802490234375,
0.00994873046875,
-0.08001708984375,
-0.01065826416015625,
0.08428955078125,
-0.0185089111328125,
-0.027862548828125,
0.0097198486328125,
-0.01474761962890625,
0.03857421875,
-0.036163330078125,
0.06854248046875,
0.05157470703125,
-0.0102081298828125,
-0.0173797607421875,
-0.03521728515625,
0.0225982666015625,
0.0240325927734375,
-0.03350830078125,
-0.0239410400390625,
0.0151519775390625,
0.0244598388671875,
0.032989501953125,
0.0615234375,
-0.00458526611328125,
0.0077056884765625,
0.00638580322265625,
0.034393310546875,
-0.0005717277526855469,
-0.01953125,
-0.01538848876953125,
0.007228851318359375,
-0.01904296875,
-0.034637451171875
]
] |
DeepFloyd/IF-I-M-v1.0 | 2023-06-02T19:04:48.000Z | [
"diffusers",
"pytorch",
"if",
"text-to-image",
"arxiv:2205.11487",
"arxiv:2110.02861",
"license:deepfloyd-if-license",
"diffusers:IFPipeline",
"region:us"
] | text-to-image | DeepFloyd | null | null | DeepFloyd/IF-I-M-v1.0 | 44 | 50,787 | diffusers | 2023-03-21T19:06:19 | ---
license: deepfloyd-if-license
extra_gated_prompt: "DeepFloyd LICENSE AGREEMENT\nThis License Agreement (as may be amended in accordance with this License Agreement, “License”), between you, or your employer or other entity (if you are entering into this agreement on behalf of your employer or other entity) (“Licensee” or “you”) and Stability AI Ltd.. (“Stability AI” or “we”) applies to your use of any computer program, algorithm, source code, object code, or software that is made available by Stability AI under this License (“Software”) and any specifications, manuals, documentation, and other written information provided by Stability AI related to the Software (“Documentation”).\nBy clicking “I Accept” below or by using the Software, you agree to the terms of this License. If you do not agree to this License, then you do not have any rights to use the Software or Documentation (collectively, the “Software Products”), and you must immediately cease using the Software Products. If you are agreeing to be bound by the terms of this License on behalf of your employer or other entity, you represent and warrant to Stability AI that you have full legal authority to bind your employer or such entity to this License. If you do not have the requisite authority, you may not accept the License or access the Software Products on behalf of your employer or other entity.\n1. LICENSE GRANT\n a. Subject to your compliance with the Documentation and Sections 2, 3, and 5, Stability AI grants you a non-exclusive, worldwide, non-transferable, non-sublicensable, revocable, royalty free and limited license under Stability AI’s copyright interests to reproduce, distribute, and create derivative works of the Software solely for your non-commercial research purposes. 
The foregoing license is personal to you, and you may not assign or sublicense this License or any other rights or obligations under this License without Stability AI’s prior written consent; any such assignment or sublicense will be void and will automatically and immediately terminate this License.\n b. You may make a reasonable number of copies of the Documentation solely for use in connection with the license to the Software granted above.\n c. The grant of rights expressly set forth in this Section 1 (License Grant) are the complete grant of rights to you in the Software Products, and no other licenses are granted, whether by waiver, estoppel, implication, equity or otherwise. Stability AI and its licensors reserve all rights not expressly granted by this License.\L\n2. RESTRICTIONS\n You will not, and will not permit, assist or cause any third party to:\n a. use, modify, copy, reproduce, create derivative works of, or distribute the Software Products (or any derivative works thereof, works incorporating the Software Products, or any data produced by the Software), in whole or in part, for (i) any commercial or production purposes, (ii) military purposes or in the service of nuclear technology, (iii) purposes of surveillance, including any research or development relating to surveillance, (iv) biometric processing, (v) in any manner that infringes, misappropriates, or otherwise violates any third-party rights, or (vi) in any manner that violates any applicable law and violating any privacy or security laws, rules, regulations, directives, or governmental requirements (including the General Data Privacy Regulation (Regulation (EU) 2016/679), the California Consumer Privacy Act, and any and all laws governing the processing of biometric information), as well as all amendments and successor laws to any of the foregoing;\n b. alter or remove copyright and other proprietary notices which appear on or in the Software Products;\n c. 
utilize any equipment, device, software, or other means to circumvent or remove any security or protection used by Stability AI in connection with the Software, or to circumvent or remove any usage restrictions, or to enable functionality disabled by Stability AI; or\n d. offer or impose any terms on the Software Products that alter, restrict, or are inconsistent with the terms of this License.\n e. 1) violate any applicable U.S. and non-U.S. export control and trade sanctions laws (“Export Laws”); 2) directly or indirectly export, re-export, provide, or otherwise transfer Software Products: (a) to any individual, entity, or country prohibited by Export Laws; (b) to anyone on U.S. or non-U.S. government restricted parties lists; or (c) for any purpose prohibited by Export Laws, including nuclear, chemical or biological weapons, or missile technology applications; 3) use or download Software Products if you or they are: (a) located in a comprehensively sanctioned jurisdiction, (b) currently listed on any U.S. or non-U.S. restricted parties list, or (c) for any purpose prohibited by Export Laws; and (4) will not disguise your location through IP proxying or other methods.\L\n3. ATTRIBUTION\n Together with any copies of the Software Products (as well as derivative works thereof or works incorporating the Software Products) that you distribute, you must provide (i) a copy of this License, and (ii) the following attribution notice: “DeepFloyd is licensed under the DeepFloyd License, Copyright (c) Stability AI Ltd. All Rights Reserved.”\L\n4. DISCLAIMERS\n THE SOFTWARE PRODUCTS ARE PROVIDED “AS IS” and “WITH ALL FAULTS” WITH NO WARRANTY OF ANY KIND, EXPRESS OR IMPLIED. 
STABILITY AI EXPRESSLY DISCLAIMS ALL REPRESENTATIONS AND WARRANTIES, EXPRESS OR IMPLIED, WHETHER BY STATUTE, CUSTOM, USAGE OR OTHERWISE AS TO ANY MATTERS RELATED TO THE SOFTWARE PRODUCTS, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE, SATISFACTORY QUALITY, OR NON-INFRINGEMENT. STABILITY AI MAKES NO WARRANTIES OR REPRESENTATIONS THAT THE SOFTWARE PRODUCTS WILL BE ERROR FREE OR FREE OF VIRUSES OR OTHER HARMFUL COMPONENTS, OR PRODUCE ANY PARTICULAR RESULTS.\L\n5. LIMITATION OF LIABILITY\n TO THE FULLEST EXTENT PERMITTED BY LAW, IN NO EVENT WILL STABILITY AI BE LIABLE TO YOU (A) UNDER ANY THEORY OF LIABILITY, WHETHER BASED IN CONTRACT, TORT, NEGLIGENCE, STRICT LIABILITY, WARRANTY, OR OTHERWISE UNDER THIS LICENSE, OR (B) FOR ANY INDIRECT, CONSEQUENTIAL, EXEMPLARY, INCIDENTAL, PUNITIVE OR SPECIAL DAMAGES OR LOST PROFITS, EVEN IF STABILITY AI HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. THE SOFTWARE PRODUCTS, THEIR CONSTITUENT COMPONENTS, AND ANY OUTPUT (COLLECTIVELY, “SOFTWARE MATERIALS”) ARE NOT DESIGNED OR INTENDED FOR USE IN ANY APPLICATION OR SITUATION WHERE FAILURE OR FAULT OF THE SOFTWARE MATERIALS COULD REASONABLY BE ANTICIPATED TO LEAD TO SERIOUS INJURY OF ANY PERSON, INCLUDING POTENTIAL DISCRIMINATION OR VIOLATION OF AN INDIVIDUAL’S PRIVACY RIGHTS, OR TO SEVERE PHYSICAL, PROPERTY, OR ENVIRONMENTAL DAMAGE (EACH, A “HIGH-RISK USE”). IF YOU ELECT TO USE ANY OF THE SOFTWARE MATERIALS FOR A HIGH-RISK USE, YOU DO SO AT YOUR OWN RISK. YOU AGREE TO DESIGN AND IMPLEMENT APPROPRIATE DECISION-MAKING AND RISK-MITIGATION PROCEDURES AND POLICIES IN CONNECTION WITH A HIGH-RISK USE SUCH THAT EVEN IF THERE IS A FAILURE OR FAULT IN ANY OF THE SOFTWARE MATERIALS, THE SAFETY OF PERSONS OR PROPERTY AFFECTED BY THE ACTIVITY STAYS AT A LEVEL THAT IS REASONABLE, APPROPRIATE, AND LAWFUL FOR THE FIELD OF THE HIGH-RISK USE.\L\n6. 
INDEMNIFICATION\n You will indemnify, defend and hold harmless Stability AI and our subsidiaries and affiliates, and each of our respective shareholders, directors, officers, employees, agents, successors, and assigns (collectively, the “Stability AI Parties”) from and against any losses, liabilities, damages, fines, penalties, and expenses (including reasonable attorneys’ fees) incurred by any Stability AI Party in connection with any claim, demand, allegation, lawsuit, proceeding, or investigation (collectively, “Claims”) arising out of or related to: (a) your access to or use of the Software Products (as well as any results or data generated from such access or use), including any High-Risk Use (defined below); (b) your violation of this License; or (c) your violation, misappropriation or infringement of any rights of another (including intellectual property or other proprietary rights and privacy rights). You will promptly notify the Stability AI Parties of any such Claims, and cooperate with Stability AI Parties in defending such Claims. You will also grant the Stability AI Parties sole control of the defense or settlement, at Stability AI’s sole option, of any Claims. This indemnity is in addition to, and not in lieu of, any other indemnities or remedies set forth in a written agreement between you and Stability AI or the other Stability AI Parties.\L\n7. TERMINATION; SURVIVAL\n a. This License will automatically terminate upon any breach by you of the terms of this License.\L\Lb. We may terminate this License, in whole or in part, at any time upon notice (including electronic) to you.\L\Lc. The following sections survive termination of this License: 2 (Restrictions), 3 (Attribution), 4 (Disclaimers), 5 (Limitation on Liability), 6 (Indemnification) 7 (Termination; Survival), 8 (Third Party Materials), 9 (Trademarks), 10 (Applicable Law; Dispute Resolution), and 11 (Miscellaneous).\L\n8. 
THIRD PARTY MATERIALS\n The Software Products may contain third-party software or other components (including free and open source software) (all of the foregoing, “Third Party Materials”), which are subject to the license terms of the respective third-party licensors. Your dealings or correspondence with third parties and your use of or interaction with any Third Party Materials are solely between you and the third party. Stability AI does not control or endorse, and makes no representations or warranties regarding, any Third Party Materials, and your access to and use of such Third Party Materials are at your own risk.\L\n9. TRADEMARKS\n Licensee has not been granted any trademark license as part of this License and may not use any name or mark associated with Stability AI without the prior written permission of Stability AI, except to the extent necessary to make the reference required by the “ATTRIBUTION” section of this Agreement.\L\n10. APPLICABLE LAW; DISPUTE RESOLUTION\n This License will be governed and construed under the laws of the State of California without regard to conflicts of law provisions. Any suit or proceeding arising out of or relating to this License will be brought in the federal or state courts, as applicable, in San Mateo County, California, and each party irrevocably submits to the jurisdiction and venue of such courts.\L\n11. MISCELLANEOUS\n If any provision or part of a provision of this License is unlawful, void or unenforceable, that provision or part of the provision is deemed severed from this License, and will not affect the validity and enforceability of any remaining provisions. The failure of Stability AI to exercise or enforce any right or provision of this License will not operate as a waiver of such right or provision. This License does not confer any third-party beneficiary rights upon any other person or entity. 
This License, together with the Documentation, contains the entire understanding between you and Stability AI regarding the subject matter of this License, and supersedes all other written or oral agreements and understandings between you and Stability AI regarding such subject matter. No change or addition to any provision of this License will be binding unless it is in writing and signed by an authorized representative of both you and Stability AI."
extra_gated_fields:
"Organization /\_Affiliation": text
Previously related publications: text
I accept the above license agreement, and will use the Software non-commercially and for research purposes only: checkbox
tags:
- if
- text-to-image
inference: false
---
# IF-I-M-v1.0
DeepFloyd-IF is a pixel-based text-to-image triple-cascaded diffusion model that achieves a new state of the art in photorealism and language understanding. The result is a highly efficient model that outperforms current state-of-the-art models, achieving a zero-shot FID-30K score of `6.66` on the COCO dataset.
*Inspired by* [*Photorealistic Text-to-Image Diffusion Models with Deep Language Understanding*](https://arxiv.org/pdf/2205.11487.pdf)

## Model Details
- **Developed by:** DeepFloyd, StabilityAI
- **Model type:** pixel-based text-to-image cascaded diffusion model
- **Cascade Stage:** I
- **Num Parameters:** 400M
- **Language(s):** primarily English and, to a lesser extent, other Romance languages
- **License:** <span style="color:blue"><a href="https://huggingface.co/spaces/DeepFloyd/deepfloyd-if-license">DeepFloyd IF License Agreement</a></span>
- **Model Description:** DeepFloyd-IF is a modular model composed of a frozen text encoder and three cascaded pixel diffusion modules, each designed to generate images of increasing resolution: 64x64, 256x256, and 1024x1024. All stages of the model utilize a frozen text encoder based on the T5 transformer to extract text embeddings, which are then fed into a UNet architecture enhanced with cross-attention and attention pooling.
- **Resources for more information:** [GitHub](https://github.com/deep-floyd/IF), [Website](https://deepfloyd.ai), [All Links](https://linktr.ee/deepfloyd)
## Using with `diffusers`
IF is integrated with the 🤗 Hugging Face [🧨 diffusers library](https://github.com/huggingface/diffusers/), which is optimized to run on GPUs with as little as 14 GB of VRAM.
Before you can use IF, you need to accept its usage conditions. To do so:
1. Make sure to have a [Hugging Face account](https://huggingface.co/join) and be logged in
2. Accept the license on the model card of [DeepFloyd/IF-I-M-v1.0](https://huggingface.co/DeepFloyd/IF-I-M-v1.0)
3. Make sure to log in locally. Install `huggingface_hub`:
```sh
pip install huggingface_hub --upgrade
```
then run the `login` function in a Python shell
```py
from huggingface_hub import login
login()
```
and enter your [Hugging Face Hub access token](https://huggingface.co/docs/hub/security-tokens#what-are-user-access-tokens).
Next we install `diffusers` and dependencies:
```sh
pip install diffusers accelerate transformers safetensors sentencepiece
```
And we can now run the model locally.
By default `diffusers` makes use of [model cpu offloading](https://huggingface.co/docs/diffusers/optimization/fp16#model-offloading-for-fast-inference-and-memory-savings) to run the whole IF pipeline with as little as 14 GB of VRAM.
If you are using `torch>=2.0.0`, make sure to **remove all** calls to `enable_xformers_memory_efficient_attention()`.
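As a rough illustration, the version check above can be wrapped in a small helper (`needs_xformers` is a hypothetical convenience function, not part of `diffusers`):

```py
def needs_xformers(torch_version: str) -> bool:
    # xformers memory-efficient attention only helps on torch < 2.0;
    # PyTorch 2.x ships built-in scaled-dot-product attention that
    # diffusers picks up automatically.
    major = int(torch_version.split("+")[0].split(".")[0])
    return major < 2

# Hypothetical usage with a loaded pipeline:
# import torch
# if needs_xformers(torch.__version__):
#     pipe.enable_xformers_memory_efficient_attention()
```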
* **Load all stages and offload to CPU**
```py
from diffusers import DiffusionPipeline
from diffusers.utils import pt_to_pil
import torch
# stage 1
stage_1 = DiffusionPipeline.from_pretrained("DeepFloyd/IF-I-M-v1.0", variant="fp16", torch_dtype=torch.float16)
stage_1.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_1.enable_model_cpu_offload()
# stage 2
stage_2 = DiffusionPipeline.from_pretrained(
"DeepFloyd/IF-II-M-v1.0", text_encoder=None, variant="fp16", torch_dtype=torch.float16
)
stage_2.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_2.enable_model_cpu_offload()
# stage 3
safety_modules = {"feature_extractor": stage_1.feature_extractor, "safety_checker": stage_1.safety_checker, "watermarker": stage_1.watermarker}
stage_3 = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-x4-upscaler", **safety_modules, torch_dtype=torch.float16)
stage_3.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_3.enable_model_cpu_offload()
```
* **Retrieve Text Embeddings**
```py
prompt = 'a photo of a kangaroo wearing an orange hoodie and blue sunglasses standing in front of the eiffel tower holding a sign that says "very deep learning"'
# text embeds
prompt_embeds, negative_embeds = stage_1.encode_prompt(prompt)
```
* **Run stage 1**
```py
generator = torch.manual_seed(0)
image = stage_1(prompt_embeds=prompt_embeds, negative_prompt_embeds=negative_embeds, generator=generator, output_type="pt").images
pt_to_pil(image)[0].save("./if_stage_I.png")
```
* **Run stage 2**
```py
image = stage_2(
image=image, prompt_embeds=prompt_embeds, negative_prompt_embeds=negative_embeds, generator=generator, output_type="pt"
).images
pt_to_pil(image)[0].save("./if_stage_II.png")
```
* **Run stage 3**
```py
image = stage_3(prompt=prompt, image=image, generator=generator, noise_level=100).images
image[0].save("./if_stage_III.png")
```
There are multiple ways to speed up the inference time and lower the memory consumption even more with `diffusers`. To do so, please have a look at the Diffusers docs:
- 🚀 [Optimizing for inference time](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-speed)
- ⚙️ [Optimizing for low memory during inference](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-memory)
For more in-detail information about how to use IF, please have a look at [the IF blog post](https://huggingface.co/blog/if) and the [documentation](https://huggingface.co/docs/diffusers/main/en/api/pipelines/if) 📖.
The Diffusers DreamBooth scripts also support fine-tuning 🎨 [IF](https://huggingface.co/docs/diffusers/main/en/training/dreambooth#if).
With parameter-efficient fine-tuning, you can add new concepts to IF with a single GPU and ~28 GB of VRAM.
## Training
**Training Data:**
1.2B text-image pairs (based on LAION-A and few additional internal datasets)
The test/validation splits of the datasets are not used at any cascade or stage of training. The validation split of COCO is used to monitor "online" loss behaviour during training (to catch incidents and other problems), but it is never used for training.
**Training Procedure:** IF-I-M-v1.0 is the smallest pixel-based diffusion cascade in the IF series; it uses T5 encoder embeddings (hidden states) to generate 64px images. During training:
- Images are cropped to square via a shifted-center-crop augmentation (randomly shifted from the center by up to 0.1 of the image size), resized to 64px using `Pillow==9.2.0` BICUBIC resampling with `reducing_gap=None` (which helps avoid aliasing), and converted to tensors of shape BxCxHxW
- Text prompts are encoded with the open-source frozen T5-v1_1-xxl text encoder (trained entirely by the Google team); a random 10% of prompts are dropped to the empty string to enable classifier-free guidance (CFG)
- The non-pooled output of the text encoder is fed through a projection (a linear layer without activation) and used in the UNet backbone of the diffusion model via controlled hybrid self- and cross-attention
- Additionally, the output of the text encoder is pooled via attention pooling (64 heads) and added to the time embedding as extra features
- The diffusion process uses 1000 discrete steps, with a cosine beta schedule for noising the image
- The loss is a reconstruction objective between the noise that was added to the image and the prediction made by the UNet
- Training of the IF-I-M-v1.0 checkpoint ran for 2,500,000 steps at 64x64 resolution on all datasets, using a OneCycleLR policy, SiLU activations, the AdamW8bit optimizer with DeepSpeed ZeRO-1, and a fully frozen T5 encoder
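The cosine beta schedule mentioned above can be sketched as follows. This is a minimal stand-alone illustration of the standard squared-cosine formulation, not the exact training code; the smoothing offset `s=0.008` and the 0.999 clipping value are assumptions:

```py
import math

def cosine_beta_schedule(num_steps: int = 1000, s: float = 0.008) -> list:
    # alpha_bar(t) follows a squared-cosine curve from ~1 down to 0;
    # each beta is derived from the ratio of consecutive alpha_bar values.
    def alpha_bar(step: int) -> float:
        return math.cos((step / num_steps + s) / (1 + s) * math.pi / 2) ** 2

    # Clip betas to avoid a singularity at the final step.
    return [min(1 - alpha_bar(i + 1) / alpha_bar(i), 0.999)
            for i in range(num_steps)]

betas = cosine_beta_schedule(1000)  # betas grow from near 0 towards the clip
```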

**Hardware:** 12 x 8 x A100 GPUs
**Optimizer:** [AdamW8bit](https://arxiv.org/abs/2110.02861) + [DeepSpeed ZeRO-1](https://www.deepspeed.ai/tutorials/zero/)
**Batch:** 3072
**Learning rate**: [one-cycle](https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.OneCycleLR.html) cosine strategy, warmup 10000 steps, start_lr=4e-6, max_lr=1e-4, final_lr=1e-8
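For illustration, the learning-rate trajectory with the stated hyperparameters can be sketched like this. It is a simplified stand-alone approximation of the one-cycle cosine policy (cosine warmup then cosine anneal), not the exact scheduler used in training:

```py
import math

def one_cycle_lr(step: int, total_steps: int = 2_500_000, warmup: int = 10_000,
                 start_lr: float = 4e-6, max_lr: float = 1e-4,
                 final_lr: float = 1e-8) -> float:
    if step < warmup:
        # Cosine ramp from start_lr up to max_lr over the warmup phase.
        t = step / warmup
        return start_lr + (max_lr - start_lr) * (1 - math.cos(math.pi * t)) / 2
    # Cosine anneal from max_lr down to final_lr for the rest of training.
    t = (step - warmup) / (total_steps - warmup)
    return final_lr + (max_lr - final_lr) * (1 + math.cos(math.pi * t)) / 2
```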

## Evaluation Results
`FID-30K: 8.86`

# Uses
## Direct Use
The model is released for research purposes. Any attempt to deploy the model in production requires not only compliance with the LICENSE but also full liability on the part of the person deploying it.
Possible research areas and tasks include:
- Generation of artistic imagery and use in design and other artistic processes.
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section is originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), was used for Stable Diffusion but applies in the same way for IF_.
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to produce factual or true representations of people or events; using it to generate such content is therefore out of scope for its abilities.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation
- Representations of egregious violence and gore
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism
- The model was trained mainly with English captions and will not work as well in other languages.
- The model was trained on a subset of the large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, we have... (see Training section).
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
IF was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/),
which consists of images that are limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
IF mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.
*This model card was written by: DeepFloyd Team and is based on the [StableDiffusion model card](https://huggingface.co/CompVis/stable-diffusion-v1-4).* | 23,400 | [
] |
Gustavosta/MagicPrompt-Stable-Diffusion | 2023-07-09T22:10:48.000Z | [
"transformers",
"pytorch",
"coreml",
"safetensors",
"gpt2",
"text-generation",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | Gustavosta | null | null | Gustavosta/MagicPrompt-Stable-Diffusion | 560 | 50,399 | transformers | 2022-09-17T22:34:07 | ---
license: mit
---
# MagicPrompt - Stable Diffusion
This is a model from the MagicPrompt series of models, which are [GPT-2](https://huggingface.co/gpt2) models intended to generate prompt text for image-generation AIs, in this case: [Stable Diffusion](https://huggingface.co/CompVis/stable-diffusion).
## 🖼️ Here's an example:
<img src="https://files.catbox.moe/ac3jq7.png">
This model was trained for 150,000 steps on a dataset of about 80,000 prompts filtered and extracted from the Stable Diffusion image search engine "[Lexica.art](https://lexica.art/)". Extracting the data was somewhat difficult, since the search engine does not yet offer a public API that is not protected by Cloudflare, but if you want to examine the original dataset, it is available here: [datasets/Gustavosta/Stable-Diffusion-Prompts](https://huggingface.co/datasets/Gustavosta/Stable-Diffusion-Prompts).
If you want to test the model with a demo, you can go to: "[spaces/Gustavosta/MagicPrompt-Stable-Diffusion](https://huggingface.co/spaces/Gustavosta/MagicPrompt-Stable-Diffusion)".
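If you prefer to run the model locally instead of using the demo Space, the snippet below is a minimal sketch using the `transformers` `pipeline` API. The seed phrase and the generation parameters (`max_length`, `num_return_sequences`) are illustrative choices, not recommendations from the model author:

```python
# pip install -q transformers
from transformers import pipeline

# Checkpoint name taken from this model card
checkpoint = "Gustavosta/MagicPrompt-Stable-Diffusion"
generator = pipeline("text-generation", model=checkpoint)

# Seed the model with the start of a Stable Diffusion prompt;
# it will continue it with style/quality modifiers it learned from Lexica.art.
results = generator(
    "a portrait of a wizard",
    max_length=77,
    num_return_sequences=1,
)
print(results[0]["generated_text"])
```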
## 💻 You can see other MagicPrompt models:
- For Dall-E 2: [Gustavosta/MagicPrompt-Dalle](https://huggingface.co/Gustavosta/MagicPrompt-Dalle)
- For Midjourney: [Gustavosta/MagicPrompt-Midjourney](https://huggingface.co/Gustavosta/MagicPrompt-Midjourney) **[⚠️ In progress]**
- MagicPrompt full: [Gustavosta/MagicPrompt](https://huggingface.co/Gustavosta/MagicPrompt) **[⚠️ In progress]**
## ⚖️ License:
[MIT](https://huggingface.co/models?license=license:mit)
When using this model, please credit: [Gustavosta](https://huggingface.co/Gustavosta)
**Thanks for reading this far! :)**
| 1,663 | [
[
-0.033843994140625,
-0.06512451171875,
0.040924072265625,
0.01137542724609375,
-0.0167236328125,
-0.020263671875,
0.019073486328125,
-0.00931549072265625,
0.01690673828125,
0.034759521484375,
-0.057098388671875,
-0.04168701171875,
-0.05291748046875,
-0.00588226318359375,
-0.033294677734375,
0.08990478515625,
-0.0142974853515625,
0.012451171875,
-0.00814056396484375,
0.00034165382385253906,
-0.04058837890625,
0.0063018798828125,
-0.07684326171875,
-0.03662109375,
0.0443115234375,
0.019683837890625,
0.072998046875,
0.01152801513671875,
0.024200439453125,
0.0140838623046875,
-0.01209259033203125,
0.00812530517578125,
-0.02105712890625,
0.01555633544921875,
-0.0101165771484375,
-0.00897216796875,
-0.0526123046875,
0.0225677490234375,
0.05133056640625,
0.0269317626953125,
-0.01409149169921875,
0.0093994140625,
0.0015821456909179688,
0.040069580078125,
-0.0343017578125,
0.00762176513671875,
-0.01311492919921875,
0.0081329345703125,
-0.01218414306640625,
0.01511383056640625,
-0.0136566162109375,
-0.0267486572265625,
0.0098419189453125,
-0.06201171875,
0.01123809814453125,
0.00685882568359375,
0.08331298828125,
0.006679534912109375,
-0.034912109375,
-0.01385498046875,
-0.03399658203125,
0.0290679931640625,
-0.0231475830078125,
0.006618499755859375,
0.04638671875,
0.040252685546875,
-0.01209259033203125,
-0.07147216796875,
-0.05181884765625,
-0.014434814453125,
-0.001110076904296875,
0.0249481201171875,
-0.02740478515625,
-0.0128173828125,
0.02142333984375,
0.00897979736328125,
-0.06585693359375,
-0.006969451904296875,
-0.0322265625,
-0.01406097412109375,
0.03350830078125,
0.0249481201171875,
0.034698486328125,
0.020263671875,
-0.01690673828125,
-0.005641937255859375,
-0.044097900390625,
-0.007564544677734375,
0.023223876953125,
-0.0062408447265625,
-0.0258026123046875,
0.05218505859375,
0.0211181640625,
0.043548583984375,
0.00891876220703125,
-0.0146484375,
0.021942138671875,
-0.01140594482421875,
-0.01239776611328125,
-0.024871826171875,
0.07025146484375,
0.0300140380859375,
0.0263671875,
-0.0025119781494140625,
-0.0199432373046875,
0.0072174072265625,
0.01212310791015625,
-0.081787109375,
-0.0330810546875,
0.01209259033203125,
-0.039581298828125,
-0.016998291015625,
-0.004901885986328125,
-0.048095703125,
-0.02264404296875,
-0.004154205322265625,
0.040496826171875,
-0.036285400390625,
-0.046478271484375,
0.01131439208984375,
-0.0194854736328125,
-0.005809783935546875,
0.0333251953125,
-0.056243896484375,
0.0255126953125,
0.034515380859375,
0.07904052734375,
-0.0036334991455078125,
-0.01250457763671875,
-0.005558013916015625,
-0.003612518310546875,
-0.02618408203125,
0.07232666015625,
-0.040191650390625,
-0.041778564453125,
-0.009918212890625,
0.0141754150390625,
-0.0146026611328125,
-0.038970947265625,
0.060028076171875,
-0.042755126953125,
0.04388427734375,
-0.00937652587890625,
-0.04107666015625,
-0.0203704833984375,
0.0171051025390625,
-0.0467529296875,
0.06689453125,
0.01299285888671875,
-0.06317138671875,
0.00856781005859375,
-0.09246826171875,
0.00400543212890625,
-0.00664520263671875,
0.009490966796875,
-0.04437255859375,
-0.0167083740234375,
-0.019866943359375,
0.0187530517578125,
-0.0211334228515625,
0.002315521240234375,
-0.031219482421875,
-0.01415252685546875,
0.0165863037109375,
-0.032562255859375,
0.07098388671875,
0.03851318359375,
-0.00772857666015625,
0.0012331008911132812,
-0.05694580078125,
-0.01611328125,
0.0304718017578125,
0.0173492431640625,
-0.0276336669921875,
-0.0231170654296875,
0.0056610107421875,
0.0225677490234375,
0.01541900634765625,
-0.01493072509765625,
0.05389404296875,
-0.004344940185546875,
0.0028934478759765625,
0.046844482421875,
0.015899658203125,
0.028350830078125,
-0.03173828125,
0.060455322265625,
0.0225982666015625,
0.0218505859375,
-0.0098114013671875,
-0.07720947265625,
-0.034149169921875,
-0.022125244140625,
0.0198211669921875,
0.05206298828125,
-0.05328369140625,
0.0225372314453125,
-0.0016231536865234375,
-0.050445556640625,
-0.03546142578125,
-0.01187896728515625,
0.03216552734375,
0.06585693359375,
0.027191162109375,
-0.01485443115234375,
0.00024175643920898438,
-0.06195068359375,
-0.006145477294921875,
-0.00017261505126953125,
-0.01285552978515625,
-0.01558685302734375,
0.0494384765625,
-0.0185089111328125,
0.055572509765625,
-0.03814697265625,
-0.01424407958984375,
0.017913818359375,
0.02935791015625,
0.039947509765625,
0.027984619140625,
0.052459716796875,
-0.0628662109375,
-0.050201416015625,
-0.02044677734375,
-0.048583984375,
-0.02581787109375,
0.00357818603515625,
-0.0290679931640625,
-0.0057220458984375,
-0.00881195068359375,
-0.07080078125,
0.0364990234375,
0.033935546875,
-0.07720947265625,
0.056121826171875,
-0.016845703125,
0.01554107666015625,
-0.087646484375,
0.0165557861328125,
0.027374267578125,
-0.0267181396484375,
-0.05181884765625,
0.00978851318359375,
0.0026378631591796875,
0.006122589111328125,
-0.042724609375,
0.06695556640625,
-0.043548583984375,
0.0217742919921875,
-0.038177490234375,
-0.00045609474182128906,
0.01296234130859375,
0.01483917236328125,
-0.00018918514251708984,
0.06658935546875,
0.07122802734375,
-0.0377197265625,
0.0226898193359375,
0.0295562744140625,
0.0010366439819335938,
0.056610107421875,
-0.04986572265625,
0.00878143310546875,
-0.01334381103515625,
0.01544189453125,
-0.07958984375,
-0.01318359375,
0.05780029296875,
-0.0232391357421875,
0.0236358642578125,
-0.0211181640625,
-0.06573486328125,
-0.034576416015625,
-0.01488494873046875,
0.0233917236328125,
0.0723876953125,
-0.0236358642578125,
0.034088134765625,
0.0193939208984375,
-0.0028820037841796875,
-0.0242767333984375,
-0.051055908203125,
-0.002288818359375,
-0.03582763671875,
-0.058990478515625,
0.0289459228515625,
-0.024871826171875,
-0.0215911865234375,
0.006725311279296875,
0.0250244140625,
-0.020263671875,
0.004825592041015625,
0.029388427734375,
0.023284912109375,
-0.00014495849609375,
-0.003612518310546875,
0.0260467529296875,
-0.025665283203125,
0.005977630615234375,
-0.01543426513671875,
0.038421630859375,
-0.00492095947265625,
-0.02978515625,
-0.06634521484375,
0.01526641845703125,
0.05084228515625,
0.0142364501953125,
0.070068359375,
0.058624267578125,
-0.0428466796875,
0.00970458984375,
-0.02130126953125,
-0.01342010498046875,
-0.034698486328125,
-0.018463134765625,
-0.0325927734375,
-0.03216552734375,
0.06640625,
-0.00107574462890625,
0.0024929046630859375,
0.056243896484375,
0.0582275390625,
-0.0023937225341796875,
0.098388671875,
0.018463134765625,
0.01528167724609375,
0.03802490234375,
-0.040985107421875,
-0.009765625,
-0.048187255859375,
-0.0225067138671875,
-0.017608642578125,
-0.023712158203125,
-0.006664276123046875,
-0.02099609375,
0.0377197265625,
0.0259246826171875,
-0.03912353515625,
0.012420654296875,
-0.04534912109375,
0.0307464599609375,
0.0297393798828125,
0.004825592041015625,
0.0225067138671875,
0.010101318359375,
-0.0174560546875,
-0.01128387451171875,
-0.033416748046875,
-0.040252685546875,
0.07342529296875,
0.03155517578125,
0.04425048828125,
0.01068878173828125,
0.0305633544921875,
0.02783203125,
0.0283050537109375,
-0.03448486328125,
0.039886474609375,
0.001682281494140625,
-0.0400390625,
0.0025882720947265625,
-0.014129638671875,
-0.0963134765625,
0.0106201171875,
-0.0238189697265625,
-0.040771484375,
0.0066070556640625,
0.008392333984375,
-0.0305633544921875,
0.0137176513671875,
-0.052337646484375,
0.06793212890625,
0.006931304931640625,
-0.039459228515625,
-0.006618499755859375,
-0.040283203125,
0.038299560546875,
-0.0087432861328125,
0.01654052734375,
-0.0073699951171875,
-0.0049591064453125,
0.05426025390625,
-0.045318603515625,
0.04815673828125,
-0.0364990234375,
-0.0144805908203125,
0.036590576171875,
0.0086517333984375,
0.039520263671875,
0.002140045166015625,
-0.0015172958374023438,
-0.0019445419311523438,
0.003612518310546875,
-0.031005859375,
-0.046722412109375,
0.04327392578125,
-0.052825927734375,
-0.01381683349609375,
-0.03509521484375,
-0.020751953125,
0.0135345458984375,
0.0269775390625,
0.035003662109375,
0.0206756591796875,
0.005176544189453125,
-0.0269012451171875,
0.05950927734375,
0.00946807861328125,
0.0307464599609375,
0.03948974609375,
-0.04638671875,
-0.0302581787109375,
0.0498046875,
0.023834228515625,
0.006084442138671875,
-0.0101165771484375,
0.02203369140625,
-0.0361328125,
-0.048095703125,
-0.0330810546875,
0.0274505615234375,
-0.03387451171875,
-0.015838623046875,
-0.047882080078125,
-0.0209808349609375,
-0.0364990234375,
0.0026035308837890625,
-0.0257720947265625,
-0.035888671875,
-0.0601806640625,
-0.002765655517578125,
0.0615234375,
0.045989990234375,
-0.0159454345703125,
0.03759765625,
-0.058013916015625,
0.03253173828125,
0.006900787353515625,
0.01776123046875,
-0.017822265625,
-0.048736572265625,
-0.01227569580078125,
0.01139068603515625,
-0.047821044921875,
-0.09100341796875,
0.046783447265625,
0.00937652587890625,
0.043609619140625,
0.04754638671875,
-0.007389068603515625,
0.052520751953125,
-0.031890869140625,
0.08465576171875,
0.03228759765625,
-0.038055419921875,
0.04510498046875,
-0.043487548828125,
0.01073455810546875,
0.0372314453125,
0.042816162109375,
-0.036376953125,
-0.0267791748046875,
-0.06646728515625,
-0.051055908203125,
0.028289794921875,
0.0246734619140625,
0.00982666015625,
-0.003490447998046875,
0.050140380859375,
0.01276397705078125,
0.006099700927734375,
-0.04937744140625,
-0.04888916015625,
-0.01702880859375,
-0.01348114013671875,
0.03271484375,
-0.0177154541015625,
-0.0369873046875,
-0.0258331298828125,
0.060455322265625,
-0.00615692138671875,
0.0292205810546875,
0.006107330322265625,
0.019622802734375,
-0.01181793212890625,
-0.014495849609375,
0.0419921875,
0.03179931640625,
-0.0007734298706054688,
-0.005916595458984375,
-0.0005021095275878906,
-0.04571533203125,
0.00896453857421875,
0.01483917236328125,
-0.024261474609375,
0.001068115234375,
0.01403045654296875,
0.0694580078125,
-0.0182647705078125,
-0.0253753662109375,
0.047515869140625,
-0.0153656005859375,
-0.031829833984375,
-0.04656982421875,
0.0245208740234375,
-0.00034546852111816406,
0.031829833984375,
0.020782470703125,
0.01517486572265625,
0.031585693359375,
-0.0276031494140625,
0.0072021484375,
0.036865234375,
-0.01139068603515625,
-0.037384033203125,
0.06640625,
-0.019073486328125,
-0.0310821533203125,
0.042572021484375,
-0.046905517578125,
-0.01334381103515625,
0.036224365234375,
0.040435791015625,
0.07177734375,
-0.01462554931640625,
0.032745361328125,
0.0216064453125,
-0.012725830078125,
-0.04534912109375,
0.046234130859375,
0.020294189453125,
-0.04254150390625,
-0.005100250244140625,
-0.0216827392578125,
-0.022430419921875,
-0.00173187255859375,
-0.044036865234375,
0.040069580078125,
-0.031402587890625,
-0.042083740234375,
-0.0272216796875,
0.004871368408203125,
-0.044464111328125,
0.00893402099609375,
0.011199951171875,
0.0858154296875,
-0.08355712890625,
0.07098388671875,
0.044586181640625,
-0.039703369140625,
-0.048309326171875,
-0.0074615478515625,
0.001895904541015625,
-0.053070068359375,
0.02874755859375,
-0.006229400634765625,
0.00485992431640625,
0.00634765625,
-0.057952880859375,
-0.057586669921875,
0.10784912109375,
0.031768798828125,
-0.0187225341796875,
-0.0074615478515625,
-0.01983642578125,
0.061065673828125,
-0.027679443359375,
0.047821044921875,
0.0285797119140625,
0.058013916015625,
0.0242919921875,
-0.064453125,
-0.0014705657958984375,
-0.041778564453125,
-0.00658416748046875,
-0.01085662841796875,
-0.058380126953125,
0.06915283203125,
-0.0002143383026123047,
-0.0190277099609375,
0.035003662109375,
0.044464111328125,
0.050506591796875,
0.0443115234375,
0.0287933349609375,
0.07147216796875,
0.05633544921875,
-0.01271820068359375,
0.08642578125,
-0.0243682861328125,
0.038055419921875,
0.050048828125,
-0.0031337738037109375,
0.038116455078125,
0.017059326171875,
-0.020721435546875,
0.05841064453125,
0.07403564453125,
-0.02362060546875,
0.060455322265625,
0.0009379386901855469,
-0.0179443359375,
-0.00753021240234375,
0.0012340545654296875,
-0.045501708984375,
-0.0152435302734375,
0.035064697265625,
-0.027191162109375,
-0.016754150390625,
0.0200653076171875,
0.0310516357421875,
-0.01525115966796875,
-0.0267181396484375,
0.034576416015625,
0.00469207763671875,
-0.044769287109375,
0.048370361328125,
-0.01343536376953125,
0.04864501953125,
-0.05474853515625,
-0.0146484375,
-0.015625,
-0.012908935546875,
-0.0112152099609375,
-0.0794677734375,
0.027069091796875,
-0.0052642822265625,
-0.01389312744140625,
-0.036224365234375,
0.052734375,
-0.0289459228515625,
-0.05419921875,
0.017822265625,
0.0343017578125,
0.040496826171875,
0.00395965576171875,
-0.091796875,
-0.0086822509765625,
-0.02679443359375,
-0.05169677734375,
0.02520751953125,
0.0251312255859375,
0.0023517608642578125,
0.05230712890625,
0.039306640625,
0.005859375,
0.003955841064453125,
0.006923675537109375,
0.05419921875,
-0.0204010009765625,
-0.0289764404296875,
-0.0672607421875,
0.0614013671875,
-0.00872039794921875,
-0.03277587890625,
0.057586669921875,
0.04254150390625,
0.00946807861328125,
-0.00959014892578125,
0.042938232421875,
0.00814056396484375,
0.03729248046875,
-0.046539306640625,
0.06689453125,
-0.06707763671875,
0.00611114501953125,
-0.0171051025390625,
-0.06201171875,
-0.034210205078125,
0.050445556640625,
-0.01255035400390625,
0.01360321044921875,
0.04229736328125,
0.0709228515625,
-0.0108184814453125,
-0.005207061767578125,
0.00792694091796875,
0.018798828125,
0.0222930908203125,
0.0215911865234375,
0.05584716796875,
-0.042205810546875,
0.0118865966796875,
-0.00518798828125,
-0.0209197998046875,
-0.0103759765625,
-0.066650390625,
-0.054931640625,
-0.04327392578125,
-0.04888916015625,
-0.0706787109375,
0.00592041015625,
0.06304931640625,
0.053802490234375,
-0.047576904296875,
0.0155029296875,
-0.02685546875,
0.0080108642578125,
-0.0139617919921875,
-0.0192108154296875,
0.0284271240234375,
0.0094146728515625,
-0.06494140625,
0.0163421630859375,
0.0034809112548828125,
0.05255126953125,
-0.026397705078125,
-0.00986480712890625,
0.006252288818359375,
-0.00380706787109375,
0.0153656005859375,
0.027496337890625,
-0.0250396728515625,
-0.00647735595703125,
-0.006687164306640625,
-0.00643157958984375,
-0.00670623779296875,
0.03131103515625,
-0.039703369140625,
0.01483917236328125,
0.047210693359375,
0.0015878677368164062,
0.043121337890625,
-0.01085662841796875,
0.02099609375,
-0.022430419921875,
0.0106048583984375,
0.00951385498046875,
0.044525146484375,
0.006145477294921875,
-0.0288543701171875,
0.044036865234375,
0.0458984375,
-0.053924560546875,
-0.051910400390625,
0.01220703125,
-0.0830078125,
-0.02349853515625,
0.101806640625,
-0.005344390869140625,
-0.0166778564453125,
-0.0013370513916015625,
-0.036163330078125,
0.0133056640625,
-0.038909912109375,
0.049041748046875,
0.0565185546875,
-0.0290069580078125,
-0.029388427734375,
-0.044036865234375,
0.02294921875,
0.00426483154296875,
-0.0372314453125,
-0.0178070068359375,
0.040740966796875,
0.036956787109375,
0.02581787109375,
0.061737060546875,
-0.0194549560546875,
0.011932373046875,
0.0022125244140625,
-0.0012836456298828125,
-0.01088714599609375,
-0.023223876953125,
-0.0300140380859375,
0.024444580078125,
-0.004947662353515625,
-0.012359619140625
]
] |
bigcode/starcoderbase-1b | 2023-09-14T12:49:54.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_bigcode",
"text-generation",
"code",
"dataset:bigcode/the-stack-dedup",
"arxiv:1911.02150",
"arxiv:2205.14135",
"arxiv:2207.14255",
"arxiv:2305.06161",
"license:bigcode-openrail-m",
"model-index",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | bigcode | null | null | bigcode/starcoderbase-1b | 31 | 50,279 | transformers | 2023-07-03T13:08:44 | ---
pipeline_tag: text-generation
inference: true
widget:
- text: 'def print_hello_world():'
example_title: Hello world
group: Python
license: bigcode-openrail-m
datasets:
- bigcode/the-stack-dedup
metrics:
- code_eval
library_name: transformers
tags:
- code
model-index:
- name: StarCoderBase-1B
results:
- task:
type: text-generation
dataset:
type: openai_humaneval
name: HumanEval
metrics:
- name: pass@1
type: pass@1
value: 15.17
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (C++)
metrics:
- name: pass@1
type: pass@1
value: 11.68
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Java)
metrics:
- name: pass@1
type: pass@1
value: 14.2
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (JavaScript)
metrics:
- name: pass@1
type: pass@1
value: 13.38
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (PHP)
metrics:
- name: pass@1
type: pass@1
value: 9.94
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Lua)
metrics:
- name: pass@1
type: pass@1
value: 12.52
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Rust)
metrics:
- name: pass@1
type: pass@1
value: 10.24
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Swift)
metrics:
- name: pass@1
type: pass@1
value: 3.92
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Julia)
metrics:
- name: pass@1
type: pass@1
value: 11.31
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (R)
metrics:
- name: pass@1
type: pass@1
value: 5.37
verified: false
extra_gated_prompt: >-
## Model License Agreement
Please read the BigCode [OpenRAIL-M
license](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement)
agreement before accepting it.
extra_gated_fields:
I accept the above license agreement, and will use the Model complying with the set of use restrictions and sharing requirements: checkbox
duplicated_from: bigcode-data/starcoderbase-1b
---
# StarCoderBase-1B
1B version of [StarCoderBase](https://huggingface.co/bigcode/starcoderbase).
## Table of Contents
1. [Model Summary](#model-summary)
2. [Use](#use)
3. [Limitations](#limitations)
4. [Training](#training)
5. [License](#license)
6. [Citation](#citation)
## Model Summary
StarCoderBase-1B is a 1B parameter model trained on 80+ programming languages from [The Stack (v1.2)](https://huggingface.co/datasets/bigcode/the-stack), with opt-out requests excluded. The model uses [Multi Query Attention](https://arxiv.org/abs/1911.02150), [a context window of 8192 tokens](https://arxiv.org/abs/2205.14135), and was trained using the [Fill-in-the-Middle objective](https://arxiv.org/abs/2207.14255) on 1 trillion tokens.
- **Repository:** [bigcode/Megatron-LM](https://github.com/bigcode-project/Megatron-LM)
- **Project Website:** [bigcode-project.org](https://www.bigcode-project.org)
- **Paper:** [💫StarCoder: May the source be with you!](https://arxiv.org/abs/2305.06161)
- **Point of Contact:** [contact@bigcode-project.org](mailto:contact@bigcode-project.org)
- **Languages:** 80+ Programming languages
## Use
### Intended use
The model was trained on GitHub code. As such it is _not_ an instruction model and commands like "Write a function that computes the square root." do not work well. However, by using the [Tech Assistant prompt](https://huggingface.co/datasets/bigcode/ta-prompt) you can turn it into a capable technical assistant.
**Feel free to share your generations in the Community tab!**
### Generation
```python
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigcode/starcoderbase-1b"
device = "cuda" # for GPU usage or "cpu" for CPU usage
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)
inputs = tokenizer.encode("def print_hello_world():", return_tensors="pt").to(device)
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
### Fill-in-the-middle
Fill-in-the-middle uses special tokens to identify the prefix/middle/suffix part of the input and output:
```python
input_text = "<fim_prefix>def print_hello_world():\n <fim_suffix>\n print('Hello world!')<fim_middle>"
inputs = tokenizer.encode(input_text, return_tensors="pt").to(device)
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
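The sentinel-token layout above can also be assembled programmatically. The helper below is a small sketch of the prefix-suffix-middle (PSM) ordering StarCoder uses; the `build_fim_prompt` name is our own, not part of the StarCoder tooling:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Compose a fill-in-the-middle prompt in prefix-suffix-middle (PSM) order.

    The model generates the missing middle after the <fim_middle> sentinel.
    """
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"


# Reproduces the input_text string from the example above
input_text = build_fim_prompt(
    "def print_hello_world():\n    ",
    "\n    print('Hello world!')",
)
print(input_text)
```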
### Attribution & Other Requirements
The pretraining dataset of the model was filtered for permissive licenses only. Nevertheless, the model can generate source code verbatim from the dataset. The code's license might require attribution and/or other specific requirements that must be respected. We provide a [search index](https://huggingface.co/spaces/bigcode/starcoder-search) that lets you search through the pretraining data to identify where generated code came from and apply the proper attribution to your code.
# Limitations
The model has been trained on source code from 80+ programming languages. The predominant natural language in source code is English, although other languages are also present. As such, the model is capable of generating code snippets provided some context, but the generated code is not guaranteed to work as intended. It can be inefficient and contain bugs or exploits. See [the paper](https://drive.google.com/file/d/1cN-b9GnWtHzQRoE7M7gAEyivY0kl4BYs/view) for an in-depth discussion of the model limitations.
# Training
## Model
- **Architecture:** GPT-2 model with multi-query attention and Fill-in-the-Middle objective
- **Pretraining steps:** 500k
- **Pretraining tokens:** 1 trillion
- **Precision:** bfloat16
## Hardware
- **GPUs:** 128 Tesla A100
- **Training time:** 11 days
## Software
- **Orchestration:** [Megatron-LM](https://github.com/bigcode-project/Megatron-LM)
- **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch)
- **FP16 if applicable:** [apex](https://github.com/NVIDIA/apex)
# License
The model is licensed under the BigCode OpenRAIL-M v1 license agreement. You can find the full agreement [here](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement).
# Citation
```
@article{li2023starcoder,
title={StarCoder: may the source be with you!},
author={Raymond Li and Loubna Ben Allal and Yangtian Zi and Niklas Muennighoff and Denis Kocetkov and Chenghao Mou and Marc Marone and Christopher Akiki and Jia Li and Jenny Chim and Qian Liu and Evgenii Zheltonozhskii and Terry Yue Zhuo and Thomas Wang and Olivier Dehaene and Mishig Davaadorj and Joel Lamy-Poirier and João Monteiro and Oleh Shliazhko and Nicolas Gontier and Nicholas Meade and Armel Zebaze and Ming-Ho Yee and Logesh Kumar Umapathi and Jian Zhu and Benjamin Lipkin and Muhtasham Oblokulov and Zhiruo Wang and Rudra Murthy and Jason Stillerman and Siva Sankalp Patel and Dmitry Abulkhanov and Marco Zocca and Manan Dey and Zhihan Zhang and Nour Fahmy and Urvashi Bhattacharyya and Wenhao Yu and Swayam Singh and Sasha Luccioni and Paulo Villegas and Maxim Kunakov and Fedor Zhdanov and Manuel Romero and Tony Lee and Nadav Timor and Jennifer Ding and Claire Schlesinger and Hailey Schoelkopf and Jan Ebert and Tri Dao and Mayank Mishra and Alex Gu and Jennifer Robinson and Carolyn Jane Anderson and Brendan Dolan-Gavitt and Danish Contractor and Siva Reddy and Daniel Fried and Dzmitry Bahdanau and Yacine Jernite and Carlos Muñoz Ferrandis and Sean Hughes and Thomas Wolf and Arjun Guha and Leandro von Werra and Harm de Vries},
year={2023},
eprint={2305.06161},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 8,438 | [
[
-0.046142578125,
-0.0396728515625,
0.0263519287109375,
0.01397705078125,
-0.01351165771484375,
-0.020721435546875,
-0.0149078369140625,
-0.03057861328125,
0.0078277587890625,
0.0193328857421875,
-0.041473388671875,
-0.0302734375,
-0.059844970703125,
0.00197601318359375,
-0.004146575927734375,
0.07757568359375,
-0.002101898193359375,
0.0093231201171875,
-0.00705718994140625,
0.0003578662872314453,
-0.01788330078125,
-0.049957275390625,
-0.0226593017578125,
-0.007411956787109375,
0.033416748046875,
0.02215576171875,
0.061309814453125,
0.0533447265625,
0.041015625,
0.021270751953125,
-0.01280975341796875,
-0.0016355514526367188,
-0.01812744140625,
-0.0294342041015625,
-0.00009441375732421875,
-0.0204010009765625,
-0.0338134765625,
-0.00466156005859375,
0.0426025390625,
0.02569580078125,
0.0146026611328125,
0.047393798828125,
-0.003093719482421875,
0.05047607421875,
-0.038543701171875,
0.02459716796875,
-0.0239410400390625,
0.0003247261047363281,
0.01428985595703125,
0.00571441650390625,
-0.0131683349609375,
-0.0220947265625,
-0.023895263671875,
-0.041351318359375,
0.01513671875,
0.0131683349609375,
0.08892822265625,
0.02899169921875,
-0.01514434814453125,
-0.019317626953125,
-0.0548095703125,
0.048553466796875,
-0.052093505859375,
0.043609619140625,
0.0238494873046875,
0.0243682861328125,
0.0020046234130859375,
-0.069580078125,
-0.056640625,
-0.0251312255859375,
-0.00847625732421875,
0.0078277587890625,
-0.0268096923828125,
-0.00704193115234375,
0.042938232421875,
0.01471710205078125,
-0.058624267578125,
0.014007568359375,
-0.061309814453125,
-0.0070343017578125,
0.040252685546875,
-0.0005054473876953125,
0.00928497314453125,
-0.04534912109375,
-0.040252685546875,
-0.01302337646484375,
-0.043792724609375,
0.01259613037109375,
0.0166778564453125,
-0.0010623931884765625,
-0.02703857421875,
0.03802490234375,
0.0004296302795410156,
0.040863037109375,
0.0025997161865234375,
0.011688232421875,
0.035125732421875,
-0.038787841796875,
-0.031494140625,
-0.011077880859375,
0.075927734375,
0.0294952392578125,
0.00798797607421875,
-0.0010976791381835938,
-0.0106048583984375,
-0.0088653564453125,
0.0117950439453125,
-0.07879638671875,
-0.017578125,
0.040283203125,
-0.027587890625,
-0.0073089599609375,
0.0111541748046875,
-0.0618896484375,
0.00559234619140625,
-0.0287017822265625,
0.04083251953125,
-0.0169677734375,
-0.018280029296875,
0.0162200927734375,
0.0004069805145263672,
0.026763916015625,
-0.005199432373046875,
-0.060638427734375,
0.0108795166015625,
0.042572021484375,
0.06170654296875,
0.02581787109375,
-0.029571533203125,
-0.0259857177734375,
-0.007793426513671875,
-0.02484130859375,
0.0209197998046875,
-0.01593017578125,
-0.0172576904296875,
-0.0169525146484375,
0.00983428955078125,
-0.005855560302734375,
-0.032257080078125,
0.022979736328125,
-0.042755126953125,
0.017242431640625,
-0.016998291015625,
-0.01861572265625,
-0.015899658203125,
0.011962890625,
-0.046844482421875,
0.07354736328125,
0.02484130859375,
-0.052215576171875,
0.0157928466796875,
-0.0576171875,
-0.0026454925537109375,
-0.005863189697265625,
-0.01297760009765625,
-0.054473876953125,
-0.0084991455078125,
0.0311431884765625,
0.03839111328125,
-0.034637451171875,
0.036590576171875,
-0.0194091796875,
-0.031951904296875,
0.01186370849609375,
-0.0131683349609375,
0.07598876953125,
0.035858154296875,
-0.046844482421875,
0.0110321044921875,
-0.04034423828125,
0.003108978271484375,
0.03277587890625,
-0.011688232421875,
0.0212554931640625,
-0.027862548828125,
0.01507568359375,
0.042938232421875,
0.0297088623046875,
-0.042724609375,
0.023681640625,
-0.021331787109375,
0.047088623046875,
0.04217529296875,
-0.005603790283203125,
0.01139068603515625,
-0.0188140869140625,
0.04608154296875,
0.01441192626953125,
0.036529541015625,
-0.00968170166015625,
-0.035003662109375,
-0.05511474609375,
-0.021820068359375,
0.0299072265625,
0.035003662109375,
-0.05169677734375,
0.059814453125,
-0.0242462158203125,
-0.04608154296875,
-0.028228759765625,
0.0006718635559082031,
0.047271728515625,
0.01383209228515625,
0.036102294921875,
0.002895355224609375,
-0.054901123046875,
-0.06512451171875,
0.0144805908203125,
-0.007099151611328125,
0.00323486328125,
0.0236968994140625,
0.06536865234375,
-0.029296875,
0.06121826171875,
-0.048614501953125,
-0.0070037841796875,
-0.02032470703125,
-0.0220184326171875,
0.040252685546875,
0.054473876953125,
0.05645751953125,
-0.058135986328125,
-0.026580810546875,
-0.0029315948486328125,
-0.059295654296875,
0.026763916015625,
0.006916046142578125,
-0.0111541748046875,
0.015289306640625,
0.0528564453125,
-0.06982421875,
0.035003662109375,
0.04486083984375,
-0.0308990478515625,
0.058349609375,
-0.01422119140625,
0.0101470947265625,
-0.100830078125,
0.041107177734375,
0.00336456298828125,
0.0035686492919921875,
-0.0238037109375,
0.02020263671875,
0.01432037353515625,
-0.036712646484375,
-0.037200927734375,
0.04425048828125,
-0.03546142578125,
-0.0061798095703125,
-0.0035266876220703125,
-0.014984130859375,
0.00148773193359375,
0.0643310546875,
-0.0163116455078125,
0.0684814453125,
0.052581787109375,
-0.045318603515625,
0.0253448486328125,
0.030609130859375,
-0.019500732421875,
0.00010454654693603516,
-0.07196044921875,
0.00994873046875,
-0.0066680908203125,
0.0243988037109375,
-0.08599853515625,
-0.01708984375,
0.0343017578125,
-0.06658935546875,
0.01222991943359375,
-0.03204345703125,
-0.04779052734375,
-0.06585693359375,
-0.01654052734375,
0.02789306640625,
0.057220458984375,
-0.050048828125,
0.023040771484375,
0.0142059326171875,
-0.006134033203125,
-0.045623779296875,
-0.04876708984375,
-0.0058441162109375,
-0.002254486083984375,
-0.047393798828125,
0.01195526123046875,
-0.013214111328125,
0.0115203857421875,
0.004596710205078125,
-0.0095672607421875,
-0.01245880126953125,
-0.007373809814453125,
0.028106689453125,
0.0352783203125,
-0.0241241455078125,
-0.01544189453125,
-0.016387939453125,
-0.0227813720703125,
0.016510009765625,
-0.044952392578125,
0.05255126953125,
-0.0157928466796875,
-0.0208740234375,
-0.0290069580078125,
0.0174560546875,
0.06597900390625,
-0.032440185546875,
0.05621337890625,
0.058563232421875,
-0.037139892578125,
-0.00125885009765625,
-0.039398193359375,
-0.0129241943359375,
-0.0404052734375,
0.050201416015625,
-0.01558685302734375,
-0.055206298828125,
0.0372314453125,
0.01104736328125,
0.00565338134765625,
0.044921875,
0.031768798828125,
0.01453399658203125,
0.06768798828125,
0.042755126953125,
-0.00894927978515625,
0.027618408203125,
-0.056915283203125,
0.0284423828125,
-0.0692138671875,
-0.02545166015625,
-0.050201416015625,
-0.0208740234375,
-0.03497314453125,
-0.043792724609375,
0.03448486328125,
0.020294189453125,
-0.050384521484375,
0.040069580078125,
-0.0545654296875,
0.0283203125,
0.042938232421875,
0.00412750244140625,
-0.012939453125,
0.00876617431640625,
-0.01483917236328125,
0.00646209716796875,
-0.059173583984375,
-0.0302734375,
0.0887451171875,
0.03692626953125,
0.037628173828125,
0.006439208984375,
0.046295166015625,
-0.00867462158203125,
-0.001010894775390625,
-0.04254150390625,
0.035369873046875,
-0.00551605224609375,
-0.06195068359375,
-0.0124359130859375,
-0.04144287109375,
-0.07861328125,
0.01171875,
-0.0007348060607910156,
-0.057769775390625,
0.0189361572265625,
0.01190948486328125,
-0.041259765625,
0.032989501953125,
-0.06353759765625,
0.08013916015625,
-0.01629638671875,
-0.030853271484375,
0.00371551513671875,
-0.04449462890625,
0.028228759765625,
0.006320953369140625,
0.009918212890625,
0.0193023681640625,
0.007122039794921875,
0.058013916015625,
-0.038665771484375,
0.04571533203125,
-0.0301055908203125,
0.0136566162109375,
0.0276641845703125,
-0.01067352294921875,
0.040130615234375,
0.0207672119140625,
-0.004405975341796875,
0.034027099609375,
-0.005840301513671875,
-0.0347900390625,
-0.0305328369140625,
0.055511474609375,
-0.082763671875,
-0.0384521484375,
-0.03411865234375,
-0.02294921875,
-0.0003147125244140625,
0.0242462158203125,
0.036376953125,
0.03564453125,
0.0153045654296875,
0.023406982421875,
0.036346435546875,
-0.030242919921875,
0.047515869140625,
0.0186920166015625,
-0.022308349609375,
-0.049468994140625,
0.06488037109375,
0.01557159423828125,
0.00942230224609375,
0.006595611572265625,
0.00775146484375,
-0.0328369140625,
-0.03350830078125,
-0.0516357421875,
0.0243072509765625,
-0.04736328125,
-0.02655029296875,
-0.0577392578125,
-0.04010009765625,
-0.039886474609375,
-0.0166778564453125,
-0.03564453125,
-0.01385498046875,
-0.013214111328125,
0.0013370513916015625,
0.035247802734375,
0.04388427734375,
0.0018215179443359375,
0.01172637939453125,
-0.058502197265625,
0.021820068359375,
0.01026153564453125,
0.0308837890625,
0.0080413818359375,
-0.046173095703125,
-0.03704833984375,
0.00795745849609375,
-0.0350341796875,
-0.032745361328125,
0.030670166015625,
-0.01983642578125,
0.038238525390625,
0.0037975311279296875,
-0.01123809814453125,
0.0462646484375,
-0.033203125,
0.08026123046875,
0.03619384765625,
-0.060638427734375,
0.034881591796875,
-0.0227203369140625,
0.036956787109375,
0.024627685546875,
0.048126220703125,
-0.02099609375,
-0.0041961669921875,
-0.06475830078125,
-0.0718994140625,
0.06298828125,
0.018341064453125,
-0.0003132820129394531,
0.0076904296875,
0.0242462158203125,
-0.01424407958984375,
0.00994110107421875,
-0.060760498046875,
-0.0287322998046875,
-0.0282135009765625,
-0.00974273681640625,
-0.01922607421875,
-0.0054779052734375,
-0.0036163330078125,
-0.0416259765625,
0.040313720703125,
0.0027923583984375,
0.061248779296875,
0.0208587646484375,
-0.00984954833984375,
-0.0008568763732910156,
0.0026493072509765625,
0.0479736328125,
0.061981201171875,
-0.0124359130859375,
-0.0031414031982421875,
-0.003955841064453125,
-0.05328369140625,
0.004512786865234375,
0.03411865234375,
-0.007038116455078125,
-0.0002567768096923828,
0.01500701904296875,
0.06976318359375,
0.0164031982421875,
-0.022674560546875,
0.051422119140625,
0.01049041748046875,
-0.041839599609375,
-0.035400390625,
0.01114654541015625,
0.01520538330078125,
0.02520751953125,
0.033905029296875,
0.023193359375,
-0.0061798095703125,
-0.0205078125,
0.0208587646484375,
0.0121917724609375,
-0.0246429443359375,
-0.0216217041015625,
0.07666015625,
0.00006699562072753906,
-0.0138702392578125,
0.04443359375,
-0.021453857421875,
-0.049560546875,
0.0810546875,
0.037384033203125,
0.06390380859375,
0.0029144287109375,
0.00185394287109375,
0.0672607421875,
0.03271484375,
0.00881195068359375,
0.0185699462890625,
0.00965118408203125,
-0.0240631103515625,
-0.0306243896484375,
-0.04779052734375,
-0.00972747802734375,
0.019561767578125,
-0.040283203125,
0.0235595703125,
-0.059234619140625,
-0.00555419921875,
0.002880096435546875,
0.0193328857421875,
-0.07708740234375,
0.01910400390625,
0.0146484375,
0.0621337890625,
-0.054046630859375,
0.054229736328125,
0.054779052734375,
-0.0560302734375,
-0.07269287109375,
0.004688262939453125,
-0.011627197265625,
-0.057830810546875,
0.059600830078125,
0.0197601318359375,
0.0124359130859375,
0.015716552734375,
-0.058990478515625,
-0.0811767578125,
0.09063720703125,
0.0179290771484375,
-0.042877197265625,
0.003448486328125,
0.00531005859375,
0.028045654296875,
-0.006397247314453125,
0.037017822265625,
0.018646240234375,
0.043121337890625,
0.004947662353515625,
-0.0733642578125,
0.0179901123046875,
-0.035064697265625,
-0.00048351287841796875,
0.016815185546875,
-0.06878662109375,
0.07647705078125,
-0.028045654296875,
0.00437164306640625,
-0.004451751708984375,
0.04461669921875,
0.0312347412109375,
0.0167694091796875,
0.0206146240234375,
0.037628173828125,
0.03680419921875,
-0.00551605224609375,
0.0780029296875,
-0.05755615234375,
0.0494384765625,
0.050445556640625,
0.0029773712158203125,
0.05413818359375,
0.025390625,
-0.0303192138671875,
0.0235443115234375,
0.035888671875,
-0.0294342041015625,
0.0191650390625,
0.01428985595703125,
0.00945281982421875,
0.001804351806640625,
0.023162841796875,
-0.05328369140625,
0.01369476318359375,
0.0197296142578125,
-0.0256195068359375,
-0.0126953125,
0.000392913818359375,
0.01190185546875,
-0.027191162109375,
-0.024627685546875,
0.031280517578125,
0.0017004013061523438,
-0.0572509765625,
0.0889892578125,
0.0038013458251953125,
0.053680419921875,
-0.0531005859375,
0.003208160400390625,
-0.0126953125,
0.0191650390625,
-0.025390625,
-0.051971435546875,
0.006038665771484375,
0.00429534912109375,
-0.027313232421875,
0.003345489501953125,
0.0191192626953125,
-0.007720947265625,
-0.04425048828125,
0.0201568603515625,
0.00533294677734375,
0.01337432861328125,
-0.001216888427734375,
-0.06298828125,
0.02105712890625,
0.01158905029296875,
-0.034759521484375,
0.0219268798828125,
0.024017333984375,
0.01363372802734375,
0.039337158203125,
0.054779052734375,
-0.0119476318359375,
0.017578125,
-0.017059326171875,
0.0804443359375,
-0.06494140625,
-0.039276123046875,
-0.058502197265625,
0.0528564453125,
0.004428863525390625,
-0.06011962890625,
0.0572509765625,
0.061614990234375,
0.058929443359375,
-0.021942138671875,
0.060821533203125,
-0.02569580078125,
0.00936126708984375,
-0.043914794921875,
0.04779052734375,
-0.0401611328125,
0.0090179443359375,
-0.0217132568359375,
-0.07806396484375,
-0.018341064453125,
0.041748046875,
-0.0254974365234375,
0.0299530029296875,
0.05828857421875,
0.08099365234375,
-0.0235137939453125,
-0.007717132568359375,
0.020050048828125,
0.0242462158203125,
0.0284881591796875,
0.062286376953125,
0.03607177734375,
-0.057464599609375,
0.0504150390625,
-0.01812744140625,
-0.0174560546875,
-0.02984619140625,
-0.038818359375,
-0.059906005859375,
-0.046356201171875,
-0.0219573974609375,
-0.042327880859375,
0.00024199485778808594,
0.07843017578125,
0.069580078125,
-0.0511474609375,
-0.007450103759765625,
-0.00970458984375,
-0.00960540771484375,
-0.019744873046875,
-0.013702392578125,
0.050201416015625,
-0.0189208984375,
-0.046844482421875,
0.005649566650390625,
-0.004467010498046875,
-0.00035309791564941406,
-0.0283966064453125,
-0.0215301513671875,
-0.0166015625,
-0.01031494140625,
0.033721923828125,
0.031951904296875,
-0.04730224609375,
-0.0201416015625,
0.01032257080078125,
-0.02813720703125,
0.01345062255859375,
0.034393310546875,
-0.035186767578125,
0.0106964111328125,
0.0303802490234375,
0.047393798828125,
0.046142578125,
-0.006252288818359375,
0.0114288330078125,
-0.04217529296875,
0.0229644775390625,
0.00911712646484375,
0.026580810546875,
0.005733489990234375,
-0.037689208984375,
0.033905029296875,
0.017730712890625,
-0.059112548828125,
-0.050689697265625,
-0.0004017353057861328,
-0.07611083984375,
-0.030975341796875,
0.1075439453125,
-0.00597381591796875,
-0.0299224853515625,
0.003993988037109375,
-0.01611328125,
0.01464080810546875,
-0.01617431640625,
0.039581298828125,
0.03765869140625,
0.0129547119140625,
-0.00983428955078125,
-0.06878662109375,
0.0260467529296875,
0.01995849609375,
-0.045379638671875,
0.005741119384765625,
0.0343017578125,
0.034027099609375,
0.0277557373046875,
0.03204345703125,
-0.022186279296875,
0.03759765625,
0.01715087890625,
0.03656005859375,
-0.039398193359375,
-0.0278472900390625,
-0.028106689453125,
0.006343841552734375,
-0.0041351318359375,
-0.036041259765625
]
] |
google/mt5-base | 2023-01-24T16:37:25.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"mt5",
"text2text-generation",
"multilingual",
"af",
"am",
"ar",
"az",
"be",
"bg",
"bn",
"ca",
"ceb",
"co",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fil",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"haw",
"hi",
"hmn",
"ht",
"hu",
"hy",
"ig",
"is",
"it",
"iw",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lb",
"lo",
"lt",
"lv",
"mg",
"mi",
"mk",
"ml",
"mn",
"mr",
"ms",
"mt",
"my",
"ne",
"nl",
"no",
"ny",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"sm",
"sn",
"so",
"sq",
"sr",
"st",
"su",
"sv",
"sw",
"ta",
"te",
"tg",
"th",
"tr",
"uk",
"und",
"ur",
"uz",
"vi",
"xh",
"yi",
"yo",
"zh",
"zu",
"dataset:mc4",
"arxiv:2010.11934",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/mt5-base | 109 | 50,079 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- af
- am
- ar
- az
- be
- bg
- bn
- ca
- ceb
- co
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fil
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- haw
- hi
- hmn
- ht
- hu
- hy
- ig
- is
- it
- iw
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lb
- lo
- lt
- lv
- mg
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- no
- ny
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- st
- su
- sv
- sw
- ta
- te
- tg
- th
- tr
- uk
- und
- ur
- uz
- vi
- xh
- yi
- yo
- zh
- zu
datasets:
- mc4
license: apache-2.0
---
[Google's mT5](https://github.com/google-research/multilingual-t5)
mT5 is pretrained on the [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) corpus, covering 101 languages:
Afrikaans, Albanian, Amharic, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Burmese, Catalan, Cebuano, Chichewa, Chinese, Corsican, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Haitian Creole, Hausa, Hawaiian, Hebrew, Hindi, Hmong, Hungarian, Icelandic, Igbo, Indonesian, Irish, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish, Kyrgyz, Lao, Latin, Latvian, Lithuanian, Luxembourgish, Macedonian, Malagasy, Malay, Malayalam, Maltese, Maori, Marathi, Mongolian, Nepali, Norwegian, Pashto, Persian, Polish, Portuguese, Punjabi, Romanian, Russian, Samoan, Scottish Gaelic, Serbian, Shona, Sindhi, Sinhala, Slovak, Slovenian, Somali, Sotho, Spanish, Sundanese, Swahili, Swedish, Tajik, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Uzbek, Vietnamese, Welsh, West Frisian, Xhosa, Yiddish, Yoruba, Zulu.
**Note**: mT5 was pre-trained on mC4 only, without any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
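Fine-tuning a T5-style model means casting each example into a text-to-text input/target pair. The sketch below illustrates that preprocessing; the `"summarize: "` prefix is an illustrative convention, not something baked into the checkpoint, and the commented-out lines show the usual (but download-heavy) loading path.

```python
# Illustrative only: cast an example into the text-to-text form that
# T5-style models (including mT5) consume during fine-tuning.
def to_text_to_text(task_prefix: str, source: str, target: str) -> dict:
    """Return an input/target pair in text-to-text form."""
    return {"input_text": task_prefix + source, "target_text": target}

example = to_text_to_text(
    "summarize: ",  # hypothetical task prefix; pick one (or none) per task
    "Der schnelle braune Fuchs springt über den faulen Hund.",
    "Ein Fuchs springt über einen Hund.",
)

# The model and tokenizer load in the usual way (downloads the checkpoint):
# from transformers import AutoTokenizer, MT5ForConditionalGeneration
# tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")
# model = MT5ForConditionalGeneration.from_pretrained("google/mt5-base")
# loss = model(**tokenizer(example["input_text"], return_tensors="pt"),
#              labels=tokenizer(example["target_text"], return_tensors="pt").input_ids).loss
```

Minimising that loss over your task's pairs is the fine-tuning step the note above refers to.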
Pretraining Dataset: [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual)
Other Community Checkpoints: [here](https://huggingface.co/models?search=mt5)
Paper: [mT5: A massively multilingual pre-trained text-to-text transformer](https://arxiv.org/abs/2010.11934)
Authors: *Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel*
## Abstract
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this paper, we introduce mT5, a multilingual variant of T5 that was pre-trained on a new Common Crawl-based dataset covering 101 languages. We describe the design and modified training of mT5 and demonstrate its state-of-the-art performance on many multilingual benchmarks. All of the code and model checkpoints used in this work are publicly available. | 2,827 | [
[
-0.0369873046875,
-0.01194000244140625,
0.0203704833984375,
0.0287933349609375,
-0.020599365234375,
0.0251922607421875,
-0.02679443359375,
-0.031341552734375,
0.01204681396484375,
0.0252838134765625,
-0.04913330078125,
-0.0599365234375,
-0.06512451171875,
0.051910400390625,
-0.0175323486328125,
0.07598876953125,
-0.0264129638671875,
0.01409149169921875,
0.016326904296875,
-0.03802490234375,
-0.0295257568359375,
-0.04339599609375,
-0.034698486328125,
-0.00872039794921875,
0.057098388671875,
0.03118896484375,
0.0233306884765625,
0.0323486328125,
0.040924072265625,
0.0183258056640625,
0.01248931884765625,
0.0160064697265625,
-0.03411865234375,
-0.0241546630859375,
0.000019252300262451172,
-0.0276031494140625,
-0.0290374755859375,
-0.010009765625,
0.037750244140625,
0.04254150390625,
-0.0095977783203125,
0.032928466796875,
-0.007659912109375,
0.03692626953125,
-0.0343017578125,
0.00005054473876953125,
-0.037689208984375,
0.006526947021484375,
-0.032745361328125,
-0.0004341602325439453,
-0.027862548828125,
-0.00594329833984375,
-0.0075225830078125,
-0.0491943359375,
0.0133209228515625,
0.00217437744140625,
0.07635498046875,
0.01666259765625,
-0.046051025390625,
-0.022613525390625,
-0.0313720703125,
0.06884765625,
-0.0300140380859375,
0.065673828125,
0.036865234375,
0.025970458984375,
0.0119781494140625,
-0.07122802734375,
-0.050384521484375,
0.0170440673828125,
-0.002323150634765625,
0.0166778564453125,
-0.0035991668701171875,
-0.0134124755859375,
0.01042938232421875,
0.0183563232421875,
-0.046661376953125,
0.00191497802734375,
-0.0535888671875,
-0.00872802734375,
0.022705078125,
-0.01015472412109375,
0.034149169921875,
-0.0099029541015625,
-0.0192718505859375,
-0.003795623779296875,
-0.052734375,
0.007904052734375,
0.028656005859375,
0.0245513916015625,
-0.0341796875,
0.021392822265625,
0.0107421875,
0.043212890625,
-0.005535125732421875,
-0.03106689453125,
0.05169677734375,
-0.032440185546875,
-0.0071258544921875,
-0.0013141632080078125,
0.07550048828125,
0.015228271484375,
0.0260162353515625,
-0.037139892578125,
-0.0025691986083984375,
0.0019683837890625,
0.017181396484375,
-0.06353759765625,
-0.018341064453125,
0.0236053466796875,
-0.0182342529296875,
0.005435943603515625,
-0.01038360595703125,
-0.031463623046875,
0.002960205078125,
-0.0160369873046875,
0.01739501953125,
-0.0477294921875,
-0.0277557373046875,
0.00551605224609375,
-0.0005402565002441406,
0.005168914794921875,
0.004299163818359375,
-0.0870361328125,
0.003955841064453125,
0.0228271484375,
0.0621337890625,
-0.0265350341796875,
-0.05560302734375,
-0.0259552001953125,
0.022491455078125,
-0.0203704833984375,
0.041961669921875,
-0.039093017578125,
-0.023773193359375,
-0.004505157470703125,
0.03765869140625,
-0.01049041748046875,
-0.0212249755859375,
0.054290771484375,
-0.033782958984375,
0.047271728515625,
-0.0295867919921875,
-0.0010089874267578125,
-0.028106689453125,
0.034576416015625,
-0.06085205078125,
0.09136962890625,
0.007549285888671875,
-0.06817626953125,
0.043548583984375,
-0.066162109375,
-0.046966552734375,
-0.01071929931640625,
0.00409698486328125,
-0.032867431640625,
-0.0216064453125,
0.041656494140625,
0.0305633544921875,
-0.0044708251953125,
0.021148681640625,
-0.0088348388671875,
-0.0255126953125,
-0.01511383056640625,
-0.01346588134765625,
0.051239013671875,
0.0245513916015625,
-0.03228759765625,
0.00954437255859375,
-0.0667724609375,
-0.0034847259521484375,
-0.00394439697265625,
-0.03778076171875,
-0.0003936290740966797,
-0.01806640625,
0.012939453125,
0.039459228515625,
0.0192108154296875,
-0.046844482421875,
0.0001455545425415039,
-0.01788330078125,
0.038818359375,
0.040374755859375,
-0.03521728515625,
0.0257568359375,
-0.01284027099609375,
0.0462646484375,
0.03521728515625,
-0.006046295166015625,
-0.0304107666015625,
-0.028839111328125,
-0.054351806640625,
-0.034698486328125,
0.042938232421875,
0.04913330078125,
-0.09136962890625,
0.0013408660888671875,
-0.052032470703125,
-0.0191497802734375,
-0.0726318359375,
0.0175323486328125,
0.025238037109375,
0.025848388671875,
0.052001953125,
-0.00891876220703125,
-0.05938720703125,
-0.046295166015625,
-0.02142333984375,
0.020904541015625,
0.003082275390625,
-0.00321197509765625,
0.03863525390625,
-0.031707763671875,
0.043548583984375,
0.0005064010620117188,
-0.031494140625,
-0.03033447265625,
0.003650665283203125,
0.0236358642578125,
0.0295867919921875,
0.05120849609375,
-0.05712890625,
-0.051910400390625,
0.01180267333984375,
-0.04754638671875,
0.00775146484375,
0.0172271728515625,
-0.0025157928466796875,
0.039703369140625,
0.0242919921875,
-0.023193359375,
-0.000762939453125,
0.08416748046875,
-0.00629425048828125,
0.0164642333984375,
-0.0296478271484375,
0.0258636474609375,
-0.12548828125,
0.023162841796875,
-0.0143890380859375,
-0.025177001953125,
-0.034912109375,
-0.004016876220703125,
0.0167999267578125,
-0.0080718994140625,
-0.048828125,
0.042938232421875,
-0.057952880859375,
0.002277374267578125,
-0.0009937286376953125,
0.00519561767578125,
-0.008148193359375,
0.042205810546875,
0.005832672119140625,
0.0672607421875,
0.0266571044921875,
-0.0489501953125,
0.009521484375,
0.021270751953125,
-0.0232391357421875,
0.03515625,
-0.03582763671875,
0.01641845703125,
-0.01103973388671875,
0.017486572265625,
-0.065185546875,
-0.01081085205078125,
0.004116058349609375,
-0.046295166015625,
0.0141143798828125,
-0.028045654296875,
-0.0474853515625,
-0.031982421875,
-0.01088714599609375,
0.0288238525390625,
0.0194549560546875,
-0.048309326171875,
0.037261962890625,
0.0234222412109375,
-0.002716064453125,
-0.0694580078125,
-0.07452392578125,
0.032623291015625,
-0.03265380859375,
-0.044281005859375,
0.0238800048828125,
-0.011871337890625,
0.028656005859375,
-0.02386474609375,
0.0230560302734375,
-0.0162506103515625,
0.0068511962890625,
0.00170135498046875,
0.010009765625,
-0.00885772705078125,
-0.01198577880859375,
0.0018663406372070312,
-0.01100921630859375,
-0.0173492431640625,
-0.0305938720703125,
0.053009033203125,
-0.0045013427734375,
-0.0096282958984375,
-0.02655029296875,
0.02630615234375,
0.045806884765625,
-0.043914794921875,
0.058837890625,
0.09027099609375,
-0.01483917236328125,
0.01142120361328125,
-0.0335693359375,
0.004985809326171875,
-0.033233642578125,
0.031463623046875,
-0.067138671875,
-0.08062744140625,
0.049407958984375,
-0.00933074951171875,
0.0215911865234375,
0.0362548828125,
0.04425048828125,
0.00240325927734375,
0.07733154296875,
0.05731201171875,
-0.00502777099609375,
0.02886962890625,
-0.0191802978515625,
0.017791748046875,
-0.056304931640625,
-0.009368896484375,
-0.0389404296875,
-0.0252838134765625,
-0.07354736328125,
-0.0244903564453125,
0.0254058837890625,
-0.0159912109375,
-0.0151824951171875,
0.04345703125,
-0.0221405029296875,
0.031951904296875,
0.033538818359375,
-0.0159759521484375,
0.0228118896484375,
0.0142974853515625,
-0.045654296875,
-0.0253143310546875,
-0.05499267578125,
-0.041961669921875,
0.0960693359375,
0.01300048828125,
0.011749267578125,
0.037689208984375,
0.044036865234375,
-0.0098114013671875,
0.03326416015625,
-0.03009033203125,
0.00992584228515625,
-0.032012939453125,
-0.0614013671875,
-0.00936126708984375,
-0.03411865234375,
-0.09503173828125,
0.02325439453125,
-0.0110015869140625,
-0.043670654296875,
-0.00615692138671875,
0.0008077621459960938,
-0.0027599334716796875,
0.0231170654296875,
-0.066650390625,
0.07708740234375,
-0.00995635986328125,
-0.01282501220703125,
0.005100250244140625,
-0.05548095703125,
0.02783203125,
-0.0204315185546875,
0.044281005859375,
0.002628326416015625,
0.00730133056640625,
0.0517578125,
-0.00669097900390625,
0.0460205078125,
-0.005458831787109375,
-0.00861358642578125,
-0.0176239013671875,
-0.007354736328125,
0.0282745361328125,
-0.011383056640625,
0.006633758544921875,
0.0311126708984375,
0.020233154296875,
-0.04852294921875,
-0.0177459716796875,
0.042083740234375,
-0.07568359375,
-0.012481689453125,
-0.031494140625,
-0.0279541015625,
-0.0216522216796875,
0.051177978515625,
0.0301971435546875,
0.0207366943359375,
-0.004131317138671875,
0.0229949951171875,
0.0282440185546875,
-0.0239105224609375,
0.0545654296875,
0.053955078125,
-0.0251922607421875,
-0.053680419921875,
0.06756591796875,
0.0162200927734375,
0.01398468017578125,
0.030517578125,
-0.0029888153076171875,
-0.0306396484375,
-0.04412841796875,
-0.060302734375,
0.0245513916015625,
-0.042083740234375,
0.004131317138671875,
-0.06439208984375,
0.01509857177734375,
-0.04443359375,
-0.006969451904296875,
-0.0287933349609375,
-0.0151824951171875,
-0.00936126708984375,
-0.0184173583984375,
0.00099945068359375,
0.0428466796875,
0.0096588134765625,
0.0325927734375,
-0.06927490234375,
0.032379150390625,
-0.00835418701171875,
0.0321044921875,
-0.029144287109375,
-0.039764404296875,
-0.034698486328125,
0.01479339599609375,
-0.026031494140625,
-0.032562255859375,
0.0491943359375,
0.01377105712890625,
0.038055419921875,
0.0211334228515625,
-0.0122833251953125,
0.055694580078125,
-0.05841064453125,
0.06365966796875,
0.029144287109375,
-0.06494140625,
0.013092041015625,
-0.035888671875,
0.036712646484375,
0.04888916015625,
0.065673828125,
-0.061309814453125,
-0.0179443359375,
-0.043670654296875,
-0.058685302734375,
0.0577392578125,
0.0078887939453125,
0.01328277587890625,
-0.0000010728836059570312,
-0.0088043212890625,
0.0206298828125,
0.032562255859375,
-0.0743408203125,
-0.019195556640625,
-0.037109375,
-0.035400390625,
-0.0318603515625,
-0.007320404052734375,
-0.003955841064453125,
-0.0196380615234375,
0.039947509765625,
-0.021759033203125,
0.017181396484375,
0.0026950836181640625,
-0.03155517578125,
0.0172576904296875,
0.0125274658203125,
0.069091796875,
0.060394287109375,
-0.0112457275390625,
0.0201416015625,
0.0311126708984375,
-0.061553955078125,
0.01031494140625,
-0.000560760498046875,
0.012420654296875,
0.00867462158203125,
0.0287017822265625,
0.07147216796875,
0.0081939697265625,
-0.0302886962890625,
0.0283203125,
-0.0187225341796875,
-0.025482177734375,
-0.02459716796875,
-0.02569580078125,
0.023834228515625,
-0.01001739501953125,
0.0202178955078125,
-0.0021381378173828125,
-0.005504608154296875,
-0.0438232421875,
-0.0006818771362304688,
0.001613616943359375,
-0.033416748046875,
-0.04437255859375,
0.055450439453125,
0.025146484375,
-0.007488250732421875,
0.040008544921875,
-0.005908966064453125,
-0.05023193359375,
0.0158843994140625,
0.044769287109375,
0.047637939453125,
-0.031890869140625,
0.0005478858947753906,
0.04046630859375,
0.039398193359375,
0.0015096664428710938,
0.03802490234375,
0.0034122467041015625,
-0.058837890625,
-0.047210693359375,
-0.047760009765625,
-0.0208282470703125,
-0.004505157470703125,
-0.0213165283203125,
0.03631591796875,
-0.0137939453125,
-0.010345458984375,
0.0037403106689453125,
0.004085540771484375,
-0.060302734375,
0.0345458984375,
0.00400543212890625,
0.044158935546875,
-0.0423583984375,
0.08636474609375,
0.07293701171875,
-0.0262908935546875,
-0.062225341796875,
-0.021728515625,
-0.021636962890625,
-0.06317138671875,
0.056854248046875,
0.0225677490234375,
-0.011505126953125,
0.023468017578125,
-0.013763427734375,
-0.06610107421875,
0.08746337890625,
0.04638671875,
-0.016326904296875,
0.0009093284606933594,
0.04058837890625,
0.03338623046875,
-0.0157470703125,
0.037994384765625,
0.0256500244140625,
0.04345703125,
0.0132293701171875,
-0.0921630859375,
-0.0139312744140625,
-0.0379638671875,
-0.01074981689453125,
0.0198974609375,
-0.051910400390625,
0.0582275390625,
-0.007568359375,
-0.0100555419921875,
-0.023895263671875,
0.049530029296875,
0.0166778564453125,
0.0081634521484375,
0.0276031494140625,
0.05731201171875,
0.06201171875,
-0.0187530517578125,
0.08477783203125,
-0.0469970703125,
0.021148681640625,
0.05731201171875,
0.0007653236389160156,
0.05731201171875,
0.03607177734375,
-0.01435089111328125,
0.03497314453125,
0.06048583984375,
0.01513671875,
0.0343017578125,
-0.01171875,
-0.0130767822265625,
0.002399444580078125,
0.003406524658203125,
-0.023406982421875,
0.0311126708984375,
0.012359619140625,
-0.019134521484375,
-0.0002332925796508789,
0.0177001953125,
0.0374755859375,
-0.0279541015625,
-0.00716400146484375,
0.04351806640625,
0.0083465576171875,
-0.059844970703125,
0.06915283203125,
0.0277862548828125,
0.06768798828125,
-0.05364990234375,
0.0262451171875,
-0.0187835693359375,
0.0173492431640625,
-0.020355224609375,
-0.045806884765625,
0.02313232421875,
0.00811767578125,
-0.01507568359375,
-0.0419921875,
0.020416259765625,
-0.051666259765625,
-0.03668212890625,
0.02227783203125,
0.0257720947265625,
0.014007568359375,
0.001728057861328125,
-0.04241943359375,
-0.0024662017822265625,
0.01045989990234375,
-0.005756378173828125,
0.0235595703125,
0.044036865234375,
-0.00801849365234375,
0.052459716796875,
0.058929443359375,
0.0006246566772460938,
0.0255584716796875,
0.00986480712890625,
0.0472412109375,
-0.049041748046875,
-0.04925537109375,
-0.04937744140625,
0.043487548828125,
0.01506805419921875,
-0.039794921875,
0.06146240234375,
0.05218505859375,
0.07635498046875,
-0.0134429931640625,
0.0631103515625,
0.01346588134765625,
0.0517578125,
-0.038360595703125,
0.05303955078125,
-0.047943115234375,
-0.01482391357421875,
-0.019683837890625,
-0.06329345703125,
-0.027923583984375,
0.03009033203125,
-0.019805908203125,
0.01393890380859375,
0.0789794921875,
0.034515380859375,
-0.02484130859375,
-0.0201416015625,
0.033966064453125,
0.00909423828125,
0.0308380126953125,
0.042144775390625,
0.032012939453125,
-0.045806884765625,
0.0582275390625,
-0.010040283203125,
0.01611328125,
0.0109100341796875,
-0.06268310546875,
-0.075927734375,
-0.053802490234375,
-0.0036373138427734375,
-0.01328277587890625,
0.0007767677307128906,
0.057098388671875,
0.054656982421875,
-0.057403564453125,
-0.0253448486328125,
0.0095062255859375,
-0.007503509521484375,
0.01194000244140625,
-0.00705718994140625,
0.0235595703125,
-0.031707763671875,
-0.0762939453125,
0.0237274169921875,
0.005035400390625,
0.007709503173828125,
-0.01128387451171875,
-0.006938934326171875,
-0.0293121337890625,
-0.0177459716796875,
0.04876708984375,
0.003726959228515625,
-0.029510498046875,
-0.004913330078125,
0.0104827880859375,
-0.01215362548828125,
0.024444580078125,
0.0313720703125,
-0.035614013671875,
0.0238037109375,
0.0192108154296875,
0.05535888671875,
0.053314208984375,
-0.0167694091796875,
0.04693603515625,
-0.059478759765625,
0.022308349609375,
-0.0050048828125,
0.0263824462890625,
0.0439453125,
0.0017566680908203125,
0.03729248046875,
0.0281829833984375,
-0.0274658203125,
-0.053375244140625,
-0.002552032470703125,
-0.0672607421875,
-0.0009794235229492188,
0.0831298828125,
-0.0214996337890625,
-0.019622802734375,
-0.01358795166015625,
-0.01116180419921875,
0.022216796875,
-0.0175933837890625,
0.044036865234375,
0.07452392578125,
0.0285186767578125,
-0.035614013671875,
-0.059051513671875,
0.037139892578125,
0.032867431640625,
-0.06640625,
-0.032379150390625,
0.0036563873291015625,
0.03619384765625,
0.0081634521484375,
0.0455322265625,
-0.003978729248046875,
0.003841400146484375,
-0.0214080810546875,
0.034698486328125,
-0.00907135009765625,
-0.023681640625,
-0.00350189208984375,
0.0084686279296875,
-0.0125579833984375,
-0.023834228515625
]
] |
HuggingFaceH4/zephyr-7b-beta | 2023-11-04T19:51:02.000Z | [
"transformers",
"pytorch",
"safetensors",
"mistral",
"text-generation",
"generated_from_trainer",
"en",
"dataset:HuggingFaceH4/ultrachat_200k",
"dataset:HuggingFaceH4/ultrafeedback_binarized",
"arxiv:2305.18290",
"arxiv:2310.16944",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | HuggingFaceH4 | null | null | HuggingFaceH4/zephyr-7b-beta | 652 | 50,040 | transformers | 2023-10-26T11:25:49 | ---
tags:
- generated_from_trainer
model-index:
- name: zephyr-7b-beta
results: []
license: mit
datasets:
- HuggingFaceH4/ultrachat_200k
- HuggingFaceH4/ultrafeedback_binarized
language:
- en
base_model: mistralai/Mistral-7B-v0.1
---
<img src="https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha/resolve/main/thumbnail.png" alt="Zephyr Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# Model Card for Zephyr 7B β
Zephyr is a series of language models that are trained to act as helpful assistants. Zephyr-7B-β is the second model in the series, and is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) that was trained on a mix of publicly available, synthetic datasets using [Direct Preference Optimization (DPO)](https://arxiv.org/abs/2305.18290). We found that removing the in-built alignment of these datasets boosted performance on [MT Bench](https://huggingface.co/spaces/lmsys/mt-bench) and made the model more helpful. However, this means that the model is likely to generate problematic text when prompted to do so, and it should only be used for educational and research purposes. You can find more details in the [technical report](https://arxiv.org/abs/2310.16944).
## Model description
- **Model type:** A 7B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- **Language(s) (NLP):** Primarily English
- **License:** MIT
- **Finetuned from model:** [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/huggingface/alignment-handbook
- **Demo:** https://huggingface.co/spaces/HuggingFaceH4/zephyr-chat
- **Chatbot Arena:** Evaluate Zephyr 7B against 10+ LLMs in the LMSYS arena: http://arena.lmsys.org
## Performance
At the time of release, Zephyr-7B-β is the highest ranked 7B chat model on the [MT-Bench](https://huggingface.co/spaces/lmsys/mt-bench) and [AlpacaEval](https://tatsu-lab.github.io/alpaca_eval/) benchmarks:
| Model | Size | Alignment | MT-Bench (score) | AlpacaEval (win rate %) |
|-------------|-----|----|---------------|--------------|
| StableLM-Tuned-α | 7B| dSFT |2.75| -|
| MPT-Chat | 7B |dSFT |5.42| -|
| Xwin-LM v0.1 | 7B| dPPO| 6.19| 87.83|
| Mistral-Instruct v0.1 | 7B| - | 6.84 |-|
| Zephyr-7b-α |7B| dDPO| 6.88| -|
| **Zephyr-7b-β** 🪁 | **7B** | **dDPO** | **7.34** | **90.60** |
| Falcon-Instruct | 40B |dSFT |5.17 |45.71|
| Guanaco | 65B | SFT |6.41| 71.80|
| Llama2-Chat | 70B |RLHF |6.86| 92.66|
| Vicuna v1.3 | 33B |dSFT |7.12 |88.99|
| WizardLM v1.0 | 70B |dSFT |7.71 |-|
| Xwin-LM v0.1 | 70B |dPPO |- |95.57|
| GPT-3.5-turbo | - |RLHF |7.94 |89.37|
| Claude 2 | - |RLHF |8.06| 91.36|
| GPT-4 | -| RLHF |8.99| 95.28|
In particular, on several categories of MT-Bench, Zephyr-7B-β has strong performance compared to larger open models like Llama2-Chat-70B:

However, on more complex tasks like coding and mathematics, Zephyr-7B-β lags behind proprietary models and more research is needed to close the gap.
## Intended uses & limitations
The model was initially fine-tuned on a filtered and preprocessed version of the [`UltraChat`](https://huggingface.co/datasets/stingning/ultrachat) dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT.
We then further aligned the model with [🤗 TRL's](https://github.com/huggingface/trl) `DPOTrainer` on the [openbmb/UltraFeedback](https://huggingface.co/datasets/openbmb/UltraFeedback) dataset, which contains 64k prompts and model completions that are ranked by GPT-4. As a result, the model can be used for chat and you can check out our [demo](https://huggingface.co/spaces/HuggingFaceH4/zephyr-chat) to test its capabilities.
You can find the datasets used for training Zephyr-7B-β [here](https://huggingface.co/collections/HuggingFaceH4/zephyr-7b-6538c6d6d5ddd1cbb1744a66).
Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:
```python
# Install transformers from source - only needed for versions <= v4.34
# pip install git+https://github.com/huggingface/transformers.git
# pip install accelerate
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="HuggingFaceH4/zephyr-7b-beta", torch_dtype=torch.bfloat16, device_map="auto")
# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
{
"role": "system",
"content": "You are a friendly chatbot who always responds in the style of a pirate",
},
{"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
# <|system|>
# You are a friendly chatbot who always responds in the style of a pirate.</s>
# <|user|>
# How many helicopters can a human eat in one sitting?</s>
# <|assistant|>
# Ah, me hearty matey! But yer question be a puzzler! A human cannot eat a helicopter in one sitting, as helicopters are not edible. They be made of metal, plastic, and other materials, not food!
```
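Under the hood, `apply_chat_template` renders the messages into Zephyr's plain-text format, with `<|system|>`, `<|user|>`, and `<|assistant|>` role markers and each turn terminated by `</s>`, as shown in the example output above. For illustration only, here is a minimal re-implementation of that formatting (the canonical template ships with the tokenizer's config; this sketch is not guaranteed to match it byte-for-byte):

```python
# Illustrative sketch of Zephyr's chat format; the authoritative template
# lives in the tokenizer config and should be used in real code.
def format_zephyr_prompt(messages, add_generation_prompt=True):
    """Render a list of {role, content} dicts into Zephyr-style prompt text."""
    parts = []
    for msg in messages:
        parts.append(f"<|{msg['role']}|>\n{msg['content']}</s>")
    if add_generation_prompt:
        # Open an assistant turn so generation continues from here.
        parts.append("<|assistant|>\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a friendly chatbot."},
    {"role": "user", "content": "Hello!"},
]
print(format_zephyr_prompt(messages))
```

This makes it clear why `add_generation_prompt=True` matters: without the trailing `<|assistant|>` marker, the model would continue the user's turn instead of answering it.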
## Bias, Risks, and Limitations
Zephyr-7B-β has not been aligned to human preferences with techniques like RLHF or deployed with in-the-loop filtering of responses like ChatGPT, so the model can produce problematic outputs (especially when prompted to do so).
The size and composition of the corpus used to train the base model (`mistralai/Mistral-7B-v0.1`) are also unknown, but it likely included a mix of web data and technical sources like books and code. See the [Falcon 180B model card](https://huggingface.co/tiiuae/falcon-180B#training-data) for an example of this.
## Training and evaluation data
During DPO training, this model achieves the following results on the evaluation set:
- Loss: 0.7496
- Rewards/chosen: -4.5221
- Rewards/rejected: -8.3184
- Rewards/accuracies: 0.7812
- Rewards/margins: 3.7963
- Logps/rejected: -340.1541
- Logps/chosen: -299.4561
- Logits/rejected: -2.3081
- Logits/chosen: -2.3531
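The reward metrics above are related in a simple way: in DPO, the implicit reward of a completion is β times the log-probability ratio between the policy and the reference model, and `Rewards/margins` is just the difference between the chosen and rejected rewards. A quick sanity check on the reported numbers, using the per-example DPO loss formula from the DPO paper (values copied from the metrics above; the formula here is a generic sketch, not the trainer's exact implementation):

```python
import math

def dpo_loss(reward_chosen, reward_rejected):
    """Per-example DPO loss: -log(sigmoid(reward_chosen - reward_rejected)).
    The rewards are assumed to already include the beta * log-ratio scaling."""
    margin = reward_chosen - reward_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Sanity check: the reported margin equals chosen minus rejected rewards.
margin = -4.5221 - (-8.3184)
print(round(margin, 4))  # 3.7963, matching Rewards/margins above
```

Note that plugging the *mean* margin into the loss formula does not reproduce the reported mean loss, since the loss is averaged over examples, not computed from aggregate rewards.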
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-07
- train_batch_size: 2
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 16
- total_train_batch_size: 32
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3.0
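The linear schedule with a 0.1 warmup ratio ramps the learning rate from 0 up to 5e-7 over the first 10% of optimizer steps, then decays it linearly back to 0. A minimal sketch of that schedule shape (mirroring `get_linear_schedule_with_warmup` from Transformers; the step count below is only roughly inferred from the training table):

```python
def linear_lr_with_warmup(step, total_steps, peak_lr=5e-7, warmup_ratio=0.1):
    """Linear warmup to peak_lr over warmup_ratio of training, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    # Linear decay over the remaining steps.
    remaining = total_steps - step
    return peak_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

total = 5800  # roughly the number of optimizer steps shown in the table below
print(linear_lr_with_warmup(0, total))     # 0.0 at the start
print(linear_lr_with_warmup(580, total))   # peak learning rate
print(linear_lr_with_warmup(total, total)) # 0.0 at the end
```

Note also that the effective batch size follows from the settings above: per-device batch 2 × 16 devices = 32, matching `total_train_batch_size`.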
### Training results
The table below shows the full set of DPO training metrics:
| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.6284 | 0.05 | 100 | 0.6098 | 0.0425 | -0.1872 | 0.7344 | 0.2297 | -258.8416 | -253.8099 | -2.7976 | -2.8234 |
| 0.4908 | 0.1 | 200 | 0.5426 | -0.0279 | -0.6842 | 0.75 | 0.6563 | -263.8124 | -254.5145 | -2.7719 | -2.7960 |
| 0.5264 | 0.15 | 300 | 0.5324 | 0.0414 | -0.9793 | 0.7656 | 1.0207 | -266.7627 | -253.8209 | -2.7892 | -2.8122 |
| 0.5536 | 0.21 | 400 | 0.4957 | -0.0185 | -1.5276 | 0.7969 | 1.5091 | -272.2460 | -254.4203 | -2.8542 | -2.8764 |
| 0.5362 | 0.26 | 500 | 0.5031 | -0.2630 | -1.5917 | 0.7812 | 1.3287 | -272.8869 | -256.8653 | -2.8702 | -2.8958 |
| 0.5966 | 0.31 | 600 | 0.5963 | -0.2993 | -1.6491 | 0.7812 | 1.3499 | -273.4614 | -257.2279 | -2.8778 | -2.8986 |
| 0.5014 | 0.36 | 700 | 0.5382 | -0.2859 | -1.4750 | 0.75 | 1.1891 | -271.7204 | -257.0942 | -2.7659 | -2.7869 |
| 0.5334 | 0.41 | 800 | 0.5677 | -0.4289 | -1.8968 | 0.7969 | 1.4679 | -275.9378 | -258.5242 | -2.7053 | -2.7265 |
| 0.5251 | 0.46 | 900 | 0.5772 | -0.2116 | -1.3107 | 0.7344 | 1.0991 | -270.0768 | -256.3507 | -2.8463 | -2.8662 |
| 0.5205 | 0.52 | 1000 | 0.5262 | -0.3792 | -1.8585 | 0.7188 | 1.4793 | -275.5552 | -258.0276 | -2.7893 | -2.7979 |
| 0.5094 | 0.57 | 1100 | 0.5433 | -0.6279 | -1.9368 | 0.7969 | 1.3089 | -276.3377 | -260.5136 | -2.7453 | -2.7536 |
| 0.5837 | 0.62 | 1200 | 0.5349 | -0.3780 | -1.9584 | 0.7656 | 1.5804 | -276.5542 | -258.0154 | -2.7643 | -2.7756 |
| 0.5214 | 0.67 | 1300 | 0.5732 | -1.0055 | -2.2306 | 0.7656 | 1.2251 | -279.2761 | -264.2903 | -2.6986 | -2.7113 |
| 0.6914 | 0.72 | 1400 | 0.5137 | -0.6912 | -2.1775 | 0.7969 | 1.4863 | -278.7448 | -261.1467 | -2.7166 | -2.7275 |
| 0.4655 | 0.77 | 1500 | 0.5090 | -0.7987 | -2.2930 | 0.7031 | 1.4943 | -279.8999 | -262.2220 | -2.6651 | -2.6838 |
| 0.5731 | 0.83 | 1600 | 0.5312 | -0.8253 | -2.3520 | 0.7812 | 1.5268 | -280.4902 | -262.4876 | -2.6543 | -2.6728 |
| 0.5233 | 0.88 | 1700 | 0.5206 | -0.4573 | -2.0951 | 0.7812 | 1.6377 | -277.9205 | -258.8084 | -2.6870 | -2.7097 |
| 0.5593 | 0.93 | 1800 | 0.5231 | -0.5508 | -2.2000 | 0.7969 | 1.6492 | -278.9703 | -259.7433 | -2.6221 | -2.6519 |
| 0.4967 | 0.98 | 1900 | 0.5290 | -0.5340 | -1.9570 | 0.8281 | 1.4230 | -276.5395 | -259.5749 | -2.6564 | -2.6878 |
| 0.0921 | 1.03 | 2000 | 0.5368 | -1.1376 | -3.1615 | 0.7812 | 2.0239 | -288.5854 | -265.6111 | -2.6040 | -2.6345 |
| 0.0733 | 1.08 | 2100 | 0.5453 | -1.1045 | -3.4451 | 0.7656 | 2.3406 | -291.4208 | -265.2799 | -2.6289 | -2.6595 |
| 0.0972 | 1.14 | 2200 | 0.5571 | -1.6915 | -3.9823 | 0.8125 | 2.2908 | -296.7934 | -271.1505 | -2.6471 | -2.6709 |
| 0.1058 | 1.19 | 2300 | 0.5789 | -1.0621 | -3.8941 | 0.7969 | 2.8319 | -295.9106 | -264.8563 | -2.5527 | -2.5798 |
| 0.2423 | 1.24 | 2400 | 0.5455 | -1.1963 | -3.5590 | 0.7812 | 2.3627 | -292.5599 | -266.1981 | -2.5414 | -2.5784 |
| 0.1177 | 1.29 | 2500 | 0.5889 | -1.8141 | -4.3942 | 0.7969 | 2.5801 | -300.9120 | -272.3761 | -2.4802 | -2.5189 |
| 0.1213 | 1.34 | 2600 | 0.5683 | -1.4608 | -3.8420 | 0.8125 | 2.3812 | -295.3901 | -268.8436 | -2.4774 | -2.5207 |
| 0.0889 | 1.39 | 2700 | 0.5890 | -1.6007 | -3.7337 | 0.7812 | 2.1330 | -294.3068 | -270.2423 | -2.4123 | -2.4522 |
| 0.0995 | 1.45 | 2800 | 0.6073 | -1.5519 | -3.8362 | 0.8281 | 2.2843 | -295.3315 | -269.7538 | -2.4685 | -2.5050 |
| 0.1145 | 1.5 | 2900 | 0.5790 | -1.7939 | -4.2876 | 0.8438 | 2.4937 | -299.8461 | -272.1744 | -2.4272 | -2.4674 |
| 0.0644 | 1.55 | 3000 | 0.5735 | -1.7285 | -4.2051 | 0.8125 | 2.4766 | -299.0209 | -271.5201 | -2.4193 | -2.4574 |
| 0.0798 | 1.6 | 3100 | 0.5537 | -1.7226 | -4.2850 | 0.8438 | 2.5624 | -299.8200 | -271.4610 | -2.5367 | -2.5696 |
| 0.1013 | 1.65 | 3200 | 0.5575 | -1.5715 | -3.9813 | 0.875 | 2.4098 | -296.7825 | -269.9498 | -2.4926 | -2.5267 |
| 0.1254 | 1.7 | 3300 | 0.5905 | -1.6412 | -4.4703 | 0.8594 | 2.8291 | -301.6730 | -270.6473 | -2.5017 | -2.5340 |
| 0.085 | 1.76 | 3400 | 0.6133 | -1.9159 | -4.6760 | 0.8438 | 2.7601 | -303.7296 | -273.3941 | -2.4614 | -2.4960 |
| 0.065 | 1.81 | 3500 | 0.6074 | -1.8237 | -4.3525 | 0.8594 | 2.5288 | -300.4951 | -272.4724 | -2.4597 | -2.5004 |
| 0.0755 | 1.86 | 3600 | 0.5836 | -1.9252 | -4.4005 | 0.8125 | 2.4753 | -300.9748 | -273.4872 | -2.4327 | -2.4716 |
| 0.0746 | 1.91 | 3700 | 0.5789 | -1.9280 | -4.4906 | 0.8125 | 2.5626 | -301.8762 | -273.5149 | -2.4686 | -2.5115 |
| 0.1348 | 1.96 | 3800 | 0.6015 | -1.8658 | -4.2428 | 0.8281 | 2.3769 | -299.3976 | -272.8936 | -2.4943 | -2.5393 |
| 0.0217 | 2.01 | 3900 | 0.6122 | -2.3335 | -4.9229 | 0.8281 | 2.5894 | -306.1988 | -277.5699 | -2.4841 | -2.5272 |
| 0.0219 | 2.07 | 4000 | 0.6522 | -2.9890 | -6.0164 | 0.8281 | 3.0274 | -317.1334 | -284.1248 | -2.4105 | -2.4545 |
| 0.0119 | 2.12 | 4100 | 0.6922 | -3.4777 | -6.6749 | 0.7969 | 3.1972 | -323.7187 | -289.0121 | -2.4272 | -2.4699 |
| 0.0153 | 2.17 | 4200 | 0.6993 | -3.2406 | -6.6775 | 0.7969 | 3.4369 | -323.7453 | -286.6413 | -2.4047 | -2.4465 |
| 0.011 | 2.22 | 4300 | 0.7178 | -3.7991 | -7.4397 | 0.7656 | 3.6406 | -331.3667 | -292.2260 | -2.3843 | -2.4290 |
| 0.0072 | 2.27 | 4400 | 0.6840 | -3.3269 | -6.8021 | 0.8125 | 3.4752 | -324.9908 | -287.5042 | -2.4095 | -2.4536 |
| 0.0197 | 2.32 | 4500 | 0.7013 | -3.6890 | -7.3014 | 0.8125 | 3.6124 | -329.9841 | -291.1250 | -2.4118 | -2.4543 |
| 0.0182 | 2.37 | 4600 | 0.7476 | -3.8994 | -7.5366 | 0.8281 | 3.6372 | -332.3356 | -293.2291 | -2.4163 | -2.4565 |
| 0.0125 | 2.43 | 4700 | 0.7199 | -4.0560 | -7.5765 | 0.8438 | 3.5204 | -332.7345 | -294.7952 | -2.3699 | -2.4100 |
| 0.0082 | 2.48 | 4800 | 0.7048 | -3.6613 | -7.1356 | 0.875 | 3.4743 | -328.3255 | -290.8477 | -2.3925 | -2.4303 |
| 0.0118 | 2.53 | 4900 | 0.6976 | -3.7908 | -7.3152 | 0.8125 | 3.5244 | -330.1224 | -292.1431 | -2.3633 | -2.4047 |
| 0.0118 | 2.58 | 5000 | 0.7198 | -3.9049 | -7.5557 | 0.8281 | 3.6508 | -332.5271 | -293.2844 | -2.3764 | -2.4194 |
| 0.006 | 2.63 | 5100 | 0.7506 | -4.2118 | -7.9149 | 0.8125 | 3.7032 | -336.1194 | -296.3530 | -2.3407 | -2.3860 |
| 0.0143 | 2.68 | 5200 | 0.7408 | -4.2433 | -7.9802 | 0.8125 | 3.7369 | -336.7721 | -296.6682 | -2.3509 | -2.3946 |
| 0.0057 | 2.74 | 5300 | 0.7552 | -4.3392 | -8.0831 | 0.7969 | 3.7439 | -337.8013 | -297.6275 | -2.3388 | -2.3842 |
| 0.0138 | 2.79 | 5400 | 0.7404 | -4.2395 | -7.9762 | 0.8125 | 3.7367 | -336.7322 | -296.6304 | -2.3286 | -2.3737 |
| 0.0079 | 2.84 | 5500 | 0.7525 | -4.4466 | -8.2196 | 0.7812 | 3.7731 | -339.1662 | -298.7007 | -2.3200 | -2.3641 |
| 0.0077 | 2.89 | 5600 | 0.7520 | -4.5586 | -8.3485 | 0.7969 | 3.7899 | -340.4545 | -299.8206 | -2.3078 | -2.3517 |
| 0.0094 | 2.94 | 5700 | 0.7527 | -4.5542 | -8.3509 | 0.7812 | 3.7967 | -340.4790 | -299.7773 | -2.3062 | -2.3510 |
| 0.0054 | 2.99 | 5800 | 0.7520 | -4.5169 | -8.3079 | 0.7812 | 3.7911 | -340.0493 | -299.4038 | -2.3081 | -2.3530 |
### Framework versions
- Transformers 4.35.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.14.0
## Citation
If you find Zephyr-7B-β useful in your work, please cite it with:
```
@misc{tunstall2023zephyr,
title={Zephyr: Direct Distillation of LM Alignment},
author={Lewis Tunstall and Edward Beeching and Nathan Lambert and Nazneen Rajani and Kashif Rasul and Younes Belkada and Shengyi Huang and Leandro von Werra and Clémentine Fourrier and Nathan Habib and Nathan Sarrazin and Omar Sanseviero and Alexander M. Rush and Thomas Wolf},
year={2023},
eprint={2310.16944},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` | 19,420 | [
---
language:
- en
license: other
tags:
- art
- diffusers
- stable diffusion
- controlnet
datasets: laion/laion-art
---
Want to support my work? You can buy my artbook: https://thibaud.art
___
Here's the first version of ControlNet for Stable Diffusion 2.1.
Trained on a subset of laion/laion-art.
License: refer to the licenses of the respective preprocessors.
### Safetensors version uploaded, only 700 MB!
### Canny:

### Depth:

### ZoeDepth:

### Hed:

### Scribble:

### OpenPose:

### Color:

### LineArt:

### Ade20K:

### Normal BAE:

### To use with Automatic1111:
* Download the ckpt or safetensors files
* Put them in extensions/sd-webui-controlnet/models
* In Settings → ControlNet, replace cldm_v15.yaml with cldm_v21.yaml
* Enjoy
### To use ZoeDepth:
You can use it with the depth/le_res annotator, but it works better with the ZoeDepth annotator. My PR is not accepted yet, but you can use my fork.
My fork: https://github.com/thibaudart/sd-webui-controlnet
The PR: https://github.com/Mikubill/sd-webui-controlnet/pull/655#issuecomment-1481724024
### Misuse, Malicious Use, and Out-of-Scope Use
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
Thanks to https://huggingface.co/lllyasviel/ for the implementation and the release of the 1.5 models.
Thanks to https://huggingface.co/p1atdev/ for the conversion script from ckpt to safetensors (pruned & fp16).
### Models can't be sold, merged, or distributed without prior written agreement.
-0.01177978515625,
-0.0579833984375,
0.046905517578125,
-0.0205078125,
-0.034515380859375,
0.004741668701171875,
-0.0015001296997070312,
-0.0287322998046875,
0.0127410888671875,
0.03582763671875,
0.0037441253662109375,
-0.0061187744140625,
-0.01146697998046875,
0.01503753662109375,
-0.032196044921875,
-0.0012760162353515625,
-0.032989501953125,
0.0161895751953125,
-0.0050506591796875,
-0.0288543701171875,
-0.07244873046875,
0.03948974609375,
0.0455322265625,
0.003772735595703125,
0.03424072265625,
0.060150146484375,
-0.04608154296875,
-0.004299163818359375,
-0.033111572265625,
-0.0252532958984375,
-0.036285400390625,
-0.0006475448608398438,
-0.011627197265625,
-0.05963134765625,
0.05914306640625,
0.011993408203125,
0.0186614990234375,
0.03863525390625,
0.005565643310546875,
-0.0274810791015625,
0.052764892578125,
0.0499267578125,
-0.0107879638671875,
0.05487060546875,
-0.0440673828125,
-0.0193328857421875,
-0.04144287109375,
-0.000568389892578125,
-0.0213165283203125,
-0.062408447265625,
-0.07354736328125,
-0.005390167236328125,
0.0157012939453125,
0.038543701171875,
-0.04150390625,
0.0469970703125,
-0.031494140625,
-0.004573822021484375,
0.0489501953125,
0.0338134765625,
-0.00286102294921875,
0.0012731552124023438,
-0.0189056396484375,
-0.004886627197265625,
-0.053680419921875,
-0.0295257568359375,
0.032440185546875,
0.04949951171875,
0.05419921875,
0.0150909423828125,
0.053131103515625,
0.01529693603515625,
0.0228729248046875,
-0.035614013671875,
0.032379150390625,
-0.0002199411392211914,
-0.04779052734375,
-0.0158538818359375,
-0.0265350341796875,
-0.0667724609375,
-0.00641632080078125,
-0.004863739013671875,
-0.054473876953125,
0.050201416015625,
0.0303802490234375,
-0.00937652587890625,
0.039276123046875,
-0.0506591796875,
0.059906005859375,
-0.01153564453125,
-0.03369140625,
-0.00006586313247680664,
-0.0667724609375,
0.018707275390625,
0.018218994140625,
-0.007843017578125,
0.0019483566284179688,
-0.01995849609375,
0.067626953125,
-0.0772705078125,
0.0843505859375,
-0.023468017578125,
-0.00385284423828125,
0.02734375,
0.0241546630859375,
0.0150146484375,
0.0074005126953125,
-0.016510009765625,
0.0103607177734375,
0.01201629638671875,
-0.041290283203125,
-0.025482177734375,
0.047576904296875,
-0.065185546875,
-0.02508544921875,
-0.024322509765625,
-0.01503753662109375,
0.031158447265625,
0.02313232421875,
0.037689208984375,
0.047149658203125,
0.01580810546875,
0.0201568603515625,
0.03631591796875,
0.006916046142578125,
0.03887939453125,
0.0054931640625,
-0.0223846435546875,
-0.035003662109375,
0.032440185546875,
0.0187835693359375,
0.029052734375,
0.01287078857421875,
0.02301025390625,
0.0008220672607421875,
-0.025726318359375,
-0.044891357421875,
0.0008387565612792969,
-0.0526123046875,
-0.035552978515625,
-0.03814697265625,
-0.03948974609375,
-0.0178985595703125,
-0.02069091796875,
-0.036346435546875,
-0.040496826171875,
-0.0487060546875,
-0.0052947998046875,
0.06365966796875,
0.04510498046875,
-0.056610107421875,
0.0305938720703125,
-0.03460693359375,
0.022918701171875,
0.013946533203125,
0.037811279296875,
-0.006000518798828125,
-0.037322998046875,
-0.0034847259521484375,
0.0185699462890625,
-0.0256195068359375,
-0.06591796875,
0.0233306884765625,
-0.00975799560546875,
0.042572021484375,
0.05169677734375,
0.016845703125,
0.038360595703125,
-0.0110015869140625,
0.038116455078125,
0.037200927734375,
-0.054473876953125,
0.0362548828125,
-0.0435791015625,
0.0207366943359375,
0.043365478515625,
0.047821044921875,
-0.046661376953125,
-0.0171051025390625,
-0.0579833984375,
-0.04730224609375,
0.032440185546875,
0.032379150390625,
0.007595062255859375,
0.01458740234375,
0.05029296875,
-0.015960693359375,
0.00624847412109375,
-0.0543212890625,
-0.04534912109375,
-0.0296783447265625,
-0.00970458984375,
0.0272979736328125,
-0.0250701904296875,
-0.0112762451171875,
-0.024810791015625,
0.060546875,
-0.01031494140625,
0.036376953125,
0.0290374755859375,
0.0182952880859375,
-0.0289306640625,
-0.020416259765625,
0.042724609375,
0.056488037109375,
0.0029048919677734375,
-0.005558013916015625,
-0.0174102783203125,
-0.047332763671875,
-0.0029735565185546875,
0.0201568603515625,
-0.036895751953125,
0.0145416259765625,
0.034912109375,
0.07000732421875,
0.0121917724609375,
0.0007882118225097656,
0.05133056640625,
-0.0201873779296875,
-0.031768798828125,
-0.037445068359375,
0.0085906982421875,
0.012847900390625,
0.02496337890625,
0.00833892822265625,
0.0416259765625,
0.00605010986328125,
0.0038623809814453125,
0.029083251953125,
0.015777587890625,
-0.047454833984375,
-0.0220184326171875,
0.044921875,
0.005100250244140625,
-0.016815185546875,
0.0192108154296875,
-0.0335693359375,
-0.05316162109375,
0.068359375,
0.03887939453125,
0.0665283203125,
-0.01416015625,
0.033172607421875,
0.04779052734375,
0.0248260498046875,
0.01389312744140625,
0.042236328125,
0.016387939453125,
-0.056549072265625,
-0.01383209228515625,
-0.024871826171875,
-0.020111083984375,
0.01125335693359375,
-0.0426025390625,
0.040924072265625,
-0.057220458984375,
-0.0177764892578125,
-0.0156402587890625,
0.0013427734375,
-0.0487060546875,
0.0102996826171875,
0.006275177001953125,
0.10137939453125,
-0.044647216796875,
0.0465087890625,
0.05426025390625,
-0.023895263671875,
-0.09332275390625,
-0.01342010498046875,
0.01361846923828125,
-0.0199432373046875,
0.03558349609375,
-0.0007495880126953125,
0.0019388198852539062,
0.006198883056640625,
-0.07354736328125,
-0.05841064453125,
0.09368896484375,
0.01158905029296875,
-0.046142578125,
0.00922393798828125,
-0.03131103515625,
0.0283355712890625,
-0.037933349609375,
0.0280914306640625,
-0.00794219970703125,
0.0499267578125,
0.0011501312255859375,
-0.065673828125,
0.00850677490234375,
-0.04949951171875,
0.023040771484375,
-0.0001354217529296875,
-0.06036376953125,
0.061798095703125,
0.0045013427734375,
-0.01189422607421875,
0.024993896484375,
0.050201416015625,
0.012176513671875,
0.0125732421875,
0.056732177734375,
0.04296875,
0.045562744140625,
-0.0083465576171875,
0.07373046875,
-0.0152740478515625,
0.01213836669921875,
0.06463623046875,
0.0068206787109375,
0.039886474609375,
0.0280303955078125,
0.0207672119140625,
0.04083251953125,
0.061309814453125,
0.00963592529296875,
0.0207366943359375,
0.036712646484375,
-0.00902557373046875,
-0.01244354248046875,
-0.017425537109375,
-0.03656005859375,
0.032196044921875,
0.01268768310546875,
-0.0283355712890625,
-0.0113677978515625,
0.019439697265625,
0.0217132568359375,
0.0034389495849609375,
-0.024658203125,
0.036376953125,
0.0122222900390625,
-0.0137176513671875,
0.0443115234375,
0.0039520263671875,
0.0721435546875,
-0.05242919921875,
-0.00487518310546875,
-0.011383056640625,
0.004802703857421875,
-0.0241546630859375,
-0.051788330078125,
0.0166473388671875,
-0.0123748779296875,
0.01038360595703125,
-0.032867431640625,
0.0859375,
-0.022186279296875,
-0.0345458984375,
0.04888916015625,
0.03009033203125,
0.0201568603515625,
0.031890869140625,
-0.06695556640625,
0.029632568359375,
0.0009632110595703125,
-0.03851318359375,
0.00276947021484375,
0.016326904296875,
-0.017425537109375,
0.049591064453125,
0.0255279541015625,
0.0145721435546875,
-0.00792694091796875,
0.01018524169921875,
0.0867919921875,
-0.02606201171875,
-0.03204345703125,
-0.04296875,
0.07354736328125,
-0.028533935546875,
-0.0406494140625,
0.0227813720703125,
0.01403045654296875,
0.07330322265625,
-0.0143585205078125,
0.058746337890625,
-0.032562255859375,
0.01494598388671875,
-0.0292816162109375,
0.0924072265625,
-0.04949951171875,
-0.03173828125,
-0.0265655517578125,
-0.03857421875,
-0.0279998779296875,
0.067626953125,
-0.007419586181640625,
0.0128631591796875,
0.023834228515625,
0.07452392578125,
-0.0183868408203125,
-0.0065460205078125,
0.0139312744140625,
0.0181121826171875,
0.0163726806640625,
0.03314208984375,
0.056060791015625,
-0.0592041015625,
0.01038360595703125,
-0.0665283203125,
-0.033905029296875,
-0.0231781005859375,
-0.09130859375,
-0.051483154296875,
-0.05047607421875,
-0.053070068359375,
-0.057037353515625,
-0.005283355712890625,
0.08941650390625,
0.08819580078125,
-0.0716552734375,
-0.0123443603515625,
-0.0105438232421875,
0.0134429931640625,
-0.0024394989013671875,
-0.017913818359375,
0.0027103424072265625,
0.029388427734375,
-0.041290283203125,
-0.011932373046875,
-0.004241943359375,
0.0443115234375,
-0.0162811279296875,
-0.0225830078125,
-0.026123046875,
-0.040924072265625,
0.01702880859375,
0.03363037109375,
-0.0277862548828125,
-0.027130126953125,
-0.00911712646484375,
-0.0257568359375,
-0.0018453598022460938,
0.038787841796875,
-0.01253509521484375,
0.0178070068359375,
0.04949951171875,
0.0282135009765625,
0.02203369140625,
-0.0114593505859375,
0.0253143310546875,
-0.039825439453125,
0.031829833984375,
0.0188751220703125,
0.03472900390625,
0.0102386474609375,
-0.0303497314453125,
0.032745361328125,
0.01468658447265625,
-0.03094482421875,
-0.0323486328125,
0.027923583984375,
-0.08721923828125,
0.000919342041015625,
0.06451416015625,
-0.020599365234375,
-0.0242462158203125,
0.0091094970703125,
-0.02923583984375,
0.033355712890625,
-0.03399658203125,
0.01331329345703125,
0.036163330078125,
0.004467010498046875,
-0.0292816162109375,
-0.04290771484375,
0.04058837890625,
0.00916290283203125,
-0.051177978515625,
-0.0242462158203125,
0.04925537109375,
0.01264190673828125,
0.047088623046875,
0.041229248046875,
-0.01113128662109375,
0.03125,
-0.00420379638671875,
0.0310516357421875,
-0.0268402099609375,
-0.0209808349609375,
-0.03253173828125,
-0.005268096923828125,
-0.001758575439453125,
-0.0218048095703125
]
] |
timm/mobilenetv2_100.ra_in1k | 2023-04-27T21:14:13.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1801.04381",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/mobilenetv2_100.ra_in1k | 0 | 49,411 | timm | 2022-12-13T00:00:26 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for mobilenetv2_100.ra_in1k
A MobileNet-v2 image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RA` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 3.5
- GMACs: 0.3
- Activations (M): 6.7
- Image size: 224 x 224
- **Papers:**
- MobileNetV2: Inverted Residuals and Linear Bottlenecks: https://arxiv.org/abs/1801.04381
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import torch
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('mobilenetv2_100.ra_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
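The `softmax`-then-`topk` step at the end can be illustrated in plain Python — a minimal sketch of the math, independent of `torch` (the logits below are made up for illustration):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def topk(values, k):
    # Return (index, value) pairs for the k largest entries, sorted descending
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    return [(i, values[i]) for i in order[:k]]

logits = [2.0, 1.0, 0.1, 3.5, -1.0]  # hypothetical class logits
probs = softmax(logits)
top3 = topk(probs, k=3)
print(top3[0][0])  # index of the most likely class: 3
```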
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilenetv2_100.ra_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 24, 56, 56])
# torch.Size([1, 32, 28, 28])
# torch.Size([1, 96, 14, 14])
# torch.Size([1, 320, 7, 7])
print(o.shape)
```
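As a quick sanity check, the downsampling factor (stride) of each feature level follows directly from the spatial sizes printed above:

```python
input_size = 224
feature_sizes = [112, 56, 28, 14, 7]  # spatial sizes from the shapes above
strides = [input_size // s for s in feature_sizes]
print(strides)  # [2, 4, 8, 16, 32]
```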
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilenetv2_100.ra_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
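A common next step with the pooled `(1, num_features)` embedding is comparing images by cosine similarity. Here is a minimal pure-Python sketch; the short vectors are stand-ins for real 1280-dimensional embeddings:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

emb_a = [0.10, 0.30, -0.20, 0.05]  # hypothetical embeddings
emb_b = [0.11, 0.29, -0.19, 0.04]
print(cosine_similarity(emb_a, emb_b))  # close to 1.0 for similar images
```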
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{sandler2018mobilenetv2,
title={Mobilenetv2: Inverted residuals and linear bottlenecks},
author={Sandler, Mark and Howard, Andrew and Zhu, Menglong and Zhmoginov, Andrey and Chen, Liang-Chieh},
booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
pages={4510--4520},
year={2018}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,748 | [
[
-0.0272369384765625,
-0.022064208984375,
-0.01309967041015625,
0.0018186569213867188,
-0.02642822265625,
-0.0266265869140625,
-0.005462646484375,
-0.027801513671875,
0.0217742919921875,
0.0360107421875,
-0.03155517578125,
-0.041839599609375,
-0.045654296875,
-0.01824951171875,
-0.007160186767578125,
0.06103515625,
-0.005214691162109375,
0.00002276897430419922,
-0.0153961181640625,
-0.04949951171875,
-0.02484130859375,
-0.0195465087890625,
-0.0684814453125,
-0.043304443359375,
0.0308074951171875,
0.0259552001953125,
0.0394287109375,
0.048065185546875,
0.043487548828125,
0.029388427734375,
0.00212860107421875,
0.01274871826171875,
-0.0145416259765625,
-0.011077880859375,
0.0217437744140625,
-0.052215576171875,
-0.0264739990234375,
0.01983642578125,
0.041107177734375,
0.01435089111328125,
0.00289154052734375,
0.0399169921875,
0.00604248046875,
0.05450439453125,
-0.0197906494140625,
-0.0048675537109375,
-0.03509521484375,
0.01201629638671875,
-0.005008697509765625,
0.00040435791015625,
-0.020721435546875,
-0.0284576416015625,
0.006565093994140625,
-0.0293731689453125,
0.0218048095703125,
-0.0018548965454101562,
0.100830078125,
0.01995849609375,
-0.01448822021484375,
-0.0019893646240234375,
-0.02716064453125,
0.06048583984375,
-0.052520751953125,
0.01532745361328125,
0.029022216796875,
0.0201873779296875,
-0.005462646484375,
-0.0838623046875,
-0.0404052734375,
-0.01129150390625,
-0.0022373199462890625,
0.00229644775390625,
-0.0211944580078125,
-0.010772705078125,
0.017120361328125,
0.009368896484375,
-0.035552978515625,
0.0155181884765625,
-0.041656494140625,
-0.0162200927734375,
0.045013427734375,
0.0015764236450195312,
0.026763916015625,
-0.0159149169921875,
-0.037689208984375,
-0.024993896484375,
-0.03533935546875,
0.0310821533203125,
0.0164794921875,
0.00917816162109375,
-0.04620361328125,
0.0372314453125,
0.0079498291015625,
0.042633056640625,
0.0008106231689453125,
-0.0301055908203125,
0.04864501953125,
-0.005443572998046875,
-0.0303497314453125,
-0.01026153564453125,
0.0780029296875,
0.0413818359375,
0.01345062255859375,
0.01200103759765625,
-0.007114410400390625,
-0.0306549072265625,
-0.00138092041015625,
-0.0902099609375,
-0.0249176025390625,
0.0301055908203125,
-0.06341552734375,
-0.03692626953125,
0.0190277099609375,
-0.04071044921875,
-0.0172271728515625,
-0.0025310516357421875,
0.034149169921875,
-0.02667236328125,
-0.03643798828125,
-0.004322052001953125,
-0.0145111083984375,
0.0302886962890625,
0.009735107421875,
-0.0401611328125,
0.01137542724609375,
0.022186279296875,
0.09112548828125,
0.015289306640625,
-0.0248870849609375,
-0.0179595947265625,
-0.033599853515625,
-0.0179901123046875,
0.032073974609375,
-0.00435638427734375,
-0.007534027099609375,
-0.026153564453125,
0.021514892578125,
-0.01253509521484375,
-0.05877685546875,
0.0237579345703125,
-0.0224761962890625,
0.0167999267578125,
-0.00540924072265625,
-0.00328826904296875,
-0.04486083984375,
0.0189208984375,
-0.030853271484375,
0.10772705078125,
0.024810791015625,
-0.0657958984375,
0.01715087890625,
-0.03326416015625,
-0.01186370849609375,
-0.0292205810546875,
-0.0013456344604492188,
-0.08331298828125,
-0.01062774658203125,
0.00826263427734375,
0.051025390625,
-0.033782958984375,
-0.00395965576171875,
-0.038665771484375,
-0.0198822021484375,
0.0242919921875,
0.006305694580078125,
0.07958984375,
0.019622802734375,
-0.04345703125,
0.01235198974609375,
-0.049285888671875,
0.0238494873046875,
0.036376953125,
-0.0177001953125,
-0.01171875,
-0.0298309326171875,
0.01110076904296875,
0.037017822265625,
0.0007152557373046875,
-0.035797119140625,
0.0180511474609375,
-0.0143890380859375,
0.042205810546875,
0.030242919921875,
-0.0122833251953125,
0.0269012451171875,
-0.032379150390625,
0.0182952880859375,
0.01983642578125,
0.0142974853515625,
-0.006748199462890625,
-0.044891357421875,
-0.0615234375,
-0.031005859375,
0.0302886962890625,
0.04681396484375,
-0.046234130859375,
0.028564453125,
-0.01284027099609375,
-0.060699462890625,
-0.0311737060546875,
0.00855255126953125,
0.042724609375,
0.042694091796875,
0.02130126953125,
-0.040496826171875,
-0.05010986328125,
-0.06719970703125,
-0.0029087066650390625,
-0.0006704330444335938,
-0.0025482177734375,
0.03302001953125,
0.050018310546875,
-0.013427734375,
0.054290771484375,
-0.0186767578125,
-0.0192413330078125,
-0.016845703125,
0.008575439453125,
0.0219879150390625,
0.06256103515625,
0.0606689453125,
-0.05816650390625,
-0.033782958984375,
-0.00510406494140625,
-0.07232666015625,
0.00933837890625,
-0.01474761962890625,
-0.006488800048828125,
0.0195465087890625,
0.021514892578125,
-0.043182373046875,
0.0469970703125,
0.017425537109375,
-0.017822265625,
0.0279388427734375,
-0.010101318359375,
0.0145416259765625,
-0.0947265625,
0.01004791259765625,
0.031982421875,
-0.0144195556640625,
-0.033355712890625,
0.004055023193359375,
0.0032253265380859375,
-0.0045013427734375,
-0.037689208984375,
0.049285888671875,
-0.041015625,
-0.0204925537109375,
-0.018524169921875,
-0.0169525146484375,
-0.0017948150634765625,
0.043701171875,
-0.012603759765625,
0.034332275390625,
0.0537109375,
-0.0340576171875,
0.03955078125,
0.0162353515625,
-0.01108551025390625,
0.0224456787109375,
-0.058502197265625,
0.0220794677734375,
-0.0021190643310546875,
0.025726318359375,
-0.07940673828125,
-0.01690673828125,
0.03497314453125,
-0.05462646484375,
0.04034423828125,
-0.0477294921875,
-0.0284576416015625,
-0.0452880859375,
-0.03826904296875,
0.0284576416015625,
0.056060791015625,
-0.053375244140625,
0.041656494140625,
0.0214996337890625,
0.0270538330078125,
-0.04876708984375,
-0.06695556640625,
-0.0125579833984375,
-0.033966064453125,
-0.0570068359375,
0.0264739990234375,
0.02825927734375,
0.0011892318725585938,
0.00499725341796875,
-0.00943756103515625,
-0.0159454345703125,
-0.008087158203125,
0.057861328125,
0.0216522216796875,
-0.024261474609375,
-0.01192474365234375,
-0.0230255126953125,
-0.006092071533203125,
-0.00008338689804077148,
-0.03729248046875,
0.045013427734375,
-0.0202484130859375,
-0.000522613525390625,
-0.07080078125,
-0.013916015625,
0.040618896484375,
-0.01055145263671875,
0.05889892578125,
0.08917236328125,
-0.0386962890625,
0.005954742431640625,
-0.036163330078125,
-0.0171051025390625,
-0.036651611328125,
0.03662109375,
-0.03179931640625,
-0.02972412109375,
0.06719970703125,
-0.00516510009765625,
0.0025691986083984375,
0.04705810546875,
0.0306549072265625,
-0.01111602783203125,
0.05072021484375,
0.0401611328125,
0.01354217529296875,
0.050872802734375,
-0.06817626953125,
-0.018218994140625,
-0.06585693359375,
-0.0491943359375,
-0.03094482421875,
-0.042816162109375,
-0.051910400390625,
-0.03302001953125,
0.0249481201171875,
0.023284912109375,
-0.0306549072265625,
0.041046142578125,
-0.055267333984375,
0.00519561767578125,
0.052093505859375,
0.048065185546875,
-0.0350341796875,
0.03363037109375,
-0.021820068359375,
-0.00276947021484375,
-0.06243896484375,
-0.0235137939453125,
0.08709716796875,
0.04132080078125,
0.037872314453125,
-0.00650787353515625,
0.0517578125,
-0.01629638671875,
0.0220184326171875,
-0.04803466796875,
0.040618896484375,
-0.00988006591796875,
-0.029296875,
-0.00582122802734375,
-0.031402587890625,
-0.07904052734375,
0.0136566162109375,
-0.0214385986328125,
-0.05718994140625,
0.01629638671875,
0.02081298828125,
-0.0168914794921875,
0.053680419921875,
-0.06048583984375,
0.0635986328125,
-0.003749847412109375,
-0.042236328125,
0.003391265869140625,
-0.060272216796875,
0.0275726318359375,
0.0172882080078125,
-0.0135345458984375,
-0.004169464111328125,
0.00894927978515625,
0.07769775390625,
-0.049835205078125,
0.060211181640625,
-0.039581298828125,
0.031585693359375,
0.052398681640625,
-0.0101776123046875,
0.033843994140625,
-0.0013513565063476562,
-0.01448822021484375,
0.02532958984375,
-0.0028095245361328125,
-0.0325927734375,
-0.0413818359375,
0.0445556640625,
-0.06988525390625,
-0.01284027099609375,
-0.0237274169921875,
-0.022979736328125,
0.0190887451171875,
0.008941650390625,
0.046112060546875,
0.05474853515625,
0.027130126953125,
0.0190582275390625,
0.043426513671875,
-0.03582763671875,
0.03173828125,
-0.007404327392578125,
-0.01325225830078125,
-0.04119873046875,
0.068115234375,
0.01372528076171875,
0.00897979736328125,
0.009002685546875,
0.0115509033203125,
-0.02789306640625,
-0.044036865234375,
-0.02960205078125,
0.0168914794921875,
-0.0443115234375,
-0.035186767578125,
-0.04339599609375,
-0.0303955078125,
-0.025054931640625,
0.001888275146484375,
-0.04583740234375,
-0.03289794921875,
-0.034332275390625,
0.025634765625,
0.046600341796875,
0.032562255859375,
-0.0152435302734375,
0.0413818359375,
-0.04217529296875,
0.01244354248046875,
0.007770538330078125,
0.027374267578125,
-0.00466156005859375,
-0.06976318359375,
-0.01401519775390625,
0.0044097900390625,
-0.0301971435546875,
-0.043487548828125,
0.03399658203125,
0.007476806640625,
0.02728271484375,
0.0211944580078125,
-0.0214691162109375,
0.05157470703125,
-0.00232696533203125,
0.04193115234375,
0.04449462890625,
-0.034271240234375,
0.040130615234375,
-0.005405426025390625,
0.01245880126953125,
0.0115814208984375,
0.02288818359375,
-0.01934814453125,
0.01218414306640625,
-0.05889892578125,
-0.059539794921875,
0.0594482421875,
0.010772705078125,
0.0022373199462890625,
0.0343017578125,
0.06103515625,
-0.007366180419921875,
-0.0033473968505859375,
-0.05535888671875,
-0.04034423828125,
-0.034027099609375,
-0.0171661376953125,
0.01384735107421875,
-0.0195465087890625,
0.00010865926742553711,
-0.052825927734375,
0.0484619140625,
0.00653839111328125,
0.060791015625,
0.0269775390625,
0.0035800933837890625,
-0.0027561187744140625,
-0.03497314453125,
0.0496826171875,
0.0215606689453125,
-0.0241851806640625,
0.007724761962890625,
0.00971221923828125,
-0.0516357421875,
0.01548004150390625,
0.0038967132568359375,
-0.004253387451171875,
0.00716400146484375,
0.0293731689453125,
0.0672607421875,
-0.009735107421875,
0.005405426025390625,
0.029388427734375,
-0.00415802001953125,
-0.03887939453125,
-0.0283203125,
0.00917816162109375,
-0.0018262863159179688,
0.034210205078125,
0.03143310546875,
0.037384033203125,
-0.012115478515625,
-0.0196075439453125,
0.0269927978515625,
0.034515380859375,
-0.026092529296875,
-0.0207672119140625,
0.054840087890625,
-0.00780487060546875,
-0.02191162109375,
0.062042236328125,
-0.0141448974609375,
-0.037078857421875,
0.0865478515625,
0.03497314453125,
0.06658935546875,
-0.00679779052734375,
0.0026702880859375,
0.06634521484375,
0.025909423828125,
-0.006626129150390625,
0.01788330078125,
0.01898193359375,
-0.055328369140625,
0.0026397705078125,
-0.0268402099609375,
0.01348876953125,
0.0312347412109375,
-0.04473876953125,
0.027679443359375,
-0.049835205078125,
-0.03692626953125,
0.0172882080078125,
0.018829345703125,
-0.061309814453125,
0.0205535888671875,
-0.00897979736328125,
0.06768798828125,
-0.04681396484375,
0.06646728515625,
0.06695556640625,
-0.03240966796875,
-0.07830810546875,
-0.0029735565185546875,
0.00997161865234375,
-0.06866455078125,
0.0565185546875,
0.03369140625,
0.004856109619140625,
0.00762176513671875,
-0.056396484375,
-0.050323486328125,
0.1053466796875,
0.0252227783203125,
-0.0154266357421875,
0.025390625,
-0.005695343017578125,
0.01025390625,
-0.031402587890625,
0.042816162109375,
0.009307861328125,
0.0195159912109375,
0.0260162353515625,
-0.055145263671875,
0.0188140869140625,
-0.02484130859375,
0.01520538330078125,
0.016326904296875,
-0.06524658203125,
0.05804443359375,
-0.0460205078125,
-0.01030731201171875,
0.0042877197265625,
0.046142578125,
0.01244354248046875,
0.028656005859375,
0.0330810546875,
0.0543212890625,
0.036651611328125,
-0.0172119140625,
0.0634765625,
0.0004911422729492188,
0.043548583984375,
0.052642822265625,
0.0220794677734375,
0.043853759765625,
0.0269012451171875,
-0.01464080810546875,
0.0308380126953125,
0.08544921875,
-0.0255279541015625,
0.0276641845703125,
0.0167388916015625,
0.001163482666015625,
0.0009255409240722656,
0.00775146484375,
-0.037872314453125,
0.0452880859375,
0.00960540771484375,
-0.046783447265625,
-0.01248931884765625,
0.006809234619140625,
0.004322052001953125,
-0.024444580078125,
-0.0177764892578125,
0.0296173095703125,
0.0070037841796875,
-0.028533935546875,
0.079345703125,
0.026092529296875,
0.06329345703125,
-0.0228118896484375,
0.0025634765625,
-0.022613525390625,
0.0105133056640625,
-0.033935546875,
-0.048828125,
0.025634765625,
-0.0217742919921875,
-0.005573272705078125,
0.00812530517578125,
0.061248779296875,
-0.0178985595703125,
-0.0261077880859375,
0.004230499267578125,
0.012908935546875,
0.038299560546875,
0.005626678466796875,
-0.09332275390625,
0.016632080078125,
0.01165008544921875,
-0.042816162109375,
0.024993896484375,
0.0161895751953125,
0.002819061279296875,
0.06201171875,
0.044952392578125,
-0.0181427001953125,
0.0076751708984375,
-0.0215911865234375,
0.06170654296875,
-0.035797119140625,
-0.01427459716796875,
-0.06512451171875,
0.052032470703125,
-0.0130157470703125,
-0.047027587890625,
0.04052734375,
0.052032470703125,
0.0577392578125,
0.0016603469848632812,
0.03973388671875,
-0.02484130859375,
-0.00122833251953125,
-0.036712646484375,
0.049285888671875,
-0.055633544921875,
0.007587432861328125,
0.0019283294677734375,
-0.0457763671875,
-0.0225830078125,
0.05908203125,
-0.015655517578125,
0.0256805419921875,
0.0338134765625,
0.08111572265625,
-0.0296783447265625,
-0.0299072265625,
0.00366973876953125,
0.0004906654357910156,
-0.0015897750854492188,
0.03021240234375,
0.0307159423828125,
-0.071533203125,
0.0270843505859375,
-0.04095458984375,
-0.01654052734375,
-0.019989013671875,
-0.050384521484375,
-0.07666015625,
-0.0638427734375,
-0.04534912109375,
-0.0731201171875,
-0.00823211669921875,
0.0743408203125,
0.083251953125,
-0.045989990234375,
-0.00971221923828125,
-0.0022258758544921875,
0.0147247314453125,
-0.0166473388671875,
-0.0160369873046875,
0.042572021484375,
-0.0004591941833496094,
-0.0401611328125,
-0.0214691162109375,
0.00011157989501953125,
0.028778076171875,
0.0115814208984375,
-0.0202484130859375,
-0.00876617431640625,
-0.0215911865234375,
0.0230712890625,
0.039459228515625,
-0.04864501953125,
-0.005756378173828125,
-0.0200958251953125,
-0.0189056396484375,
0.02801513671875,
0.04522705078125,
-0.037445068359375,
0.0201873779296875,
0.0225830078125,
0.0265655517578125,
0.0625,
-0.0165863037109375,
0.0014944076538085938,
-0.059814453125,
0.054107666015625,
-0.0119171142578125,
0.0208587646484375,
0.0225830078125,
-0.02386474609375,
0.045166015625,
0.035400390625,
-0.0276947021484375,
-0.0716552734375,
-0.002353668212890625,
-0.07989501953125,
-0.0101470947265625,
0.08221435546875,
-0.0245513916015625,
-0.028839111328125,
0.022674560546875,
-0.0031452178955078125,
0.04742431640625,
-0.0038585662841796875,
0.033905029296875,
0.0096588134765625,
-0.00860595703125,
-0.05352783203125,
-0.052490234375,
0.03472900390625,
0.0118560791015625,
-0.044036865234375,
-0.040740966796875,
-0.00478363037109375,
0.056427001953125,
0.0106048583984375,
0.04241943359375,
-0.010833740234375,
0.0116424560546875,
0.013427734375,
0.036834716796875,
-0.042388916015625,
-0.0030689239501953125,
-0.02392578125,
-0.004016876220703125,
-0.0103302001953125,
-0.054290771484375
]
] |
mrm8488/t5-base-finetuned-span-sentiment-extraction | 2021-08-23T21:29:49.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"sentiment",
"extracion",
"passage",
"en",
"arxiv:1910.10683",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | mrm8488 | null | null | mrm8488/t5-base-finetuned-span-sentiment-extraction | 10 | 49,348 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- sentiment
- extracion
- passage
widget:
- text: "question: positive context: On the monday, so i wont be able to be with you! i love you"
---
# T5-base fine-tuned for Sentiment Span Extraction
All credits to [Lorenzo Ampil](https://twitter.com/AND__SO)
[Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) base fine-tuned on [Tweet Sentiment Extraction Dataset](https://www.kaggle.com/c/tweet-sentiment-extraction) for **Span Sentiment Extraction** downstream task.
## Details of T5
The **T5** model was presented in [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910.10683.pdf) by *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu*. Here is the abstract:
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new “Colossal Clean Crawled Corpus”, we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.
## Details of the downstream task (Span Sentiment Extraction) - Dataset 📚
[Tweet Sentiment Extraction Dataset](https://www.kaggle.com/c/tweet-sentiment-extraction)
"My ridiculous dog is amazing." [sentiment: positive]
With all of the tweets circulating every second it is hard to tell whether the sentiment behind a specific tweet will impact a company, or a person's, brand for being viral (positive), or devastate profit because it strikes a negative tone. Capturing sentiment in language is important in these times where decisions and reactions are created and updated in seconds. But, which words actually lead to the sentiment description? In this competition you will need to pick out the part of the tweet (word or phrase) that reflects the sentiment.
Help build your skills in this important area with this broad dataset of tweets. Work on your technique to grab a top spot in this competition. What words in tweets support a positive, negative, or neutral sentiment? How can you help make that determination using machine learning tools?
In this competition we've extracted support phrases from Figure Eight's Data for Everyone platform. The dataset is titled "Sentiment Analysis: Emotion in Text" tweets with existing sentiment labels, used here under the Creative Commons Attribution 4.0 International license. Your objective in this competition is to construct a model that can do the same - look at the labeled sentiment for a given tweet and figure out what word or phrase best supports it.
Disclaimer: The dataset for this competition contains text that may be considered profane, vulgar, or offensive.
| Dataset | Split | # samples |
| -------- | ----- | --------- |
| TSE | train | 23907 |
| TSE | eval | 3573 |
## Model fine-tuning 🏋️
The training script is a slightly modified version of [this Colab Notebook](https://github.com/enzoampil/t5-intro/blob/master/t5_qa_training_pytorch_span_extraction.ipynb) created by [Lorenzo Ampil](https://github.com/enzoampil), so all credits to him!
## Model in Action 🚀
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer  # AutoModelWithLMHead is deprecated; use the seq2seq class for T5
tokenizer = AutoTokenizer.from_pretrained("mrm8488/t5-base-finetuned-span-sentiment-extraction")
model = AutoModelForSeq2SeqLM.from_pretrained("mrm8488/t5-base-finetuned-span-sentiment-extraction")
def get_sentiment_span(text):
input_ids = tokenizer.encode(text, return_tensors="pt", add_special_tokens=True) # Batch size 1
generated_ids = model.generate(input_ids=input_ids, num_beams=1, max_length=80).squeeze()
predicted_span = tokenizer.decode(generated_ids, skip_special_tokens=True, clean_up_tokenization_spaces=True)
return predicted_span
get_sentiment_span("question: negative context: My bike was put on hold...should have known that.... argh total bummer")
# output: 'argh total bummer'
get_sentiment_span("question: positive context: On the monday, so i wont be able to be with you! i love you")
# output: 'i love you'
```
> Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488) | [LinkedIn](https://www.linkedin.com/in/manuel-romero-cs/)
> Made with <span style="color: #e25555;">♥</span> in Spain
| 5,033 | [
[
-0.0305023193359375,
-0.03887939453125,
0.006900787353515625,
0.052764892578125,
-0.026123046875,
0.0208282470703125,
-0.01538848876953125,
-0.044830322265625,
0.0160675048828125,
0.0027599334716796875,
-0.0516357421875,
-0.05169677734375,
-0.0687255859375,
0.021331787109375,
-0.035400390625,
0.0836181640625,
-0.01111602783203125,
-0.006206512451171875,
0.014373779296875,
-0.00701141357421875,
-0.031707763671875,
-0.0308685302734375,
-0.04632568359375,
-0.01763916015625,
0.03704833984375,
0.0229339599609375,
0.016204833984375,
0.03509521484375,
0.0413818359375,
0.0188751220703125,
-0.0007810592651367188,
0.0142822265625,
-0.036346435546875,
-0.0012369155883789062,
-0.0145721435546875,
-0.0247955322265625,
-0.039398193359375,
0.01751708984375,
0.01483917236328125,
0.03948974609375,
0.007747650146484375,
0.02093505859375,
0.0016450881958007812,
0.036285400390625,
-0.050262451171875,
0.0139007568359375,
-0.0477294921875,
0.00592041015625,
-0.00433349609375,
-0.01251220703125,
-0.023162841796875,
-0.0477294921875,
0.007640838623046875,
-0.0223846435546875,
0.00994110107421875,
0.0005311965942382812,
0.10467529296875,
0.013519287109375,
-0.0195770263671875,
-0.021331787109375,
-0.0266876220703125,
0.06390380859375,
-0.0504150390625,
0.01016998291015625,
0.0038471221923828125,
-0.006107330322265625,
0.01486968994140625,
-0.050384521484375,
-0.04351806640625,
-0.0010890960693359375,
0.00934600830078125,
0.0296783447265625,
-0.029205322265625,
-0.00531005859375,
-0.0018968582153320312,
0.0233001708984375,
-0.0263214111328125,
-0.002727508544921875,
-0.03643798828125,
-0.006290435791015625,
0.053985595703125,
-0.01287078857421875,
0.029571533203125,
-0.02435302734375,
-0.03265380859375,
-0.0121307373046875,
-0.0298309326171875,
0.00919342041015625,
0.0011663436889648438,
0.03411865234375,
-0.0260162353515625,
0.033050537109375,
-0.007205963134765625,
0.033172607421875,
0.028076171875,
0.007556915283203125,
0.042938232421875,
-0.0161285400390625,
-0.0224456787109375,
-0.01727294921875,
0.0853271484375,
0.032379150390625,
0.044036865234375,
-0.03509521484375,
-0.02154541015625,
0.006298065185546875,
0.0062408447265625,
-0.06365966796875,
-0.0237274169921875,
0.021484375,
-0.028594970703125,
-0.051025390625,
0.0186614990234375,
-0.06634521484375,
-0.018463134765625,
-0.0072174072265625,
0.05230712890625,
-0.032257080078125,
-0.027313232421875,
0.0228424072265625,
-0.025848388671875,
0.0190277099609375,
0.00681304931640625,
-0.05865478515625,
0.007061004638671875,
0.032623291015625,
0.058746337890625,
-0.0149688720703125,
-0.02069091796875,
-0.0300445556640625,
-0.025238037109375,
-0.0350341796875,
0.054931640625,
-0.0211944580078125,
-0.0206298828125,
-0.004276275634765625,
-0.0018863677978515625,
-0.0198822021484375,
-0.032989501953125,
0.0523681640625,
-0.03436279296875,
0.0430908203125,
-0.0200347900390625,
-0.044281005859375,
-0.00531768798828125,
0.016448974609375,
-0.037322998046875,
0.07574462890625,
0.007495880126953125,
-0.06536865234375,
0.0260467529296875,
-0.0751953125,
-0.0438232421875,
-0.017791748046875,
0.027252197265625,
-0.032073974609375,
0.00853729248046875,
0.0256195068359375,
0.05499267578125,
-0.01163482666015625,
-0.004070281982421875,
-0.03759765625,
-0.0031261444091796875,
0.026458740234375,
-0.0001666545867919922,
0.075927734375,
0.0155487060546875,
-0.039031982421875,
-0.000006258487701416016,
-0.0421142578125,
0.007640838623046875,
0.017303466796875,
-0.022186279296875,
0.0069122314453125,
-0.0162200927734375,
0.005580902099609375,
0.046478271484375,
0.029205322265625,
-0.043243408203125,
0.01485443115234375,
-0.038360595703125,
0.0355224609375,
0.061187744140625,
-0.00040984153747558594,
0.03497314453125,
-0.01137542724609375,
0.0428466796875,
0.001537322998046875,
0.0008034706115722656,
0.01538848876953125,
-0.00982666015625,
-0.06182861328125,
-0.01425933837890625,
0.03436279296875,
0.0491943359375,
-0.039764404296875,
0.06219482421875,
-0.0206756591796875,
-0.0340576171875,
-0.053558349609375,
0.0018100738525390625,
0.011993408203125,
0.04010009765625,
0.0467529296875,
-0.003009796142578125,
-0.0670166015625,
-0.036895751953125,
-0.0301666259765625,
-0.01229095458984375,
0.01221466064453125,
0.0051116943359375,
0.040679931640625,
-0.01010894775390625,
0.0765380859375,
-0.042388916015625,
-0.0294342041015625,
-0.046051025390625,
0.0200042724609375,
0.0107421875,
0.02655029296875,
0.039581298828125,
-0.0567626953125,
-0.0523681640625,
-0.00960540771484375,
-0.0654296875,
-0.038177490234375,
-0.0011615753173828125,
-0.02398681640625,
0.0237274169921875,
0.031097412109375,
-0.04486083984375,
0.00710296630859375,
0.040008544921875,
-0.03472900390625,
0.0426025390625,
0.017730712890625,
-0.0018548965454101562,
-0.11602783203125,
0.018035888671875,
0.0257415771484375,
-0.019378662109375,
-0.034912109375,
-0.0129547119140625,
0.001800537109375,
0.018035888671875,
-0.0278472900390625,
0.04742431640625,
-0.0225982666015625,
0.021087646484375,
-0.0163116455078125,
0.01104736328125,
-0.0002872943878173828,
0.0396728515625,
-0.01263427734375,
0.053955078125,
0.03704833984375,
-0.032073974609375,
0.025421142578125,
0.0248565673828125,
-0.0079803466796875,
0.0291595458984375,
-0.048980712890625,
0.005207061767578125,
-0.003692626953125,
0.0118255615234375,
-0.08831787109375,
-0.00725555419921875,
0.0267791748046875,
-0.0670166015625,
0.028045654296875,
0.001773834228515625,
-0.031829833984375,
-0.03448486328125,
-0.034149169921875,
-0.00971221923828125,
0.050811767578125,
-0.03497314453125,
0.02655029296875,
0.029754638671875,
0.0019321441650390625,
-0.068115234375,
-0.058807373046875,
0.0189056396484375,
-0.040069580078125,
-0.038116455078125,
0.02923583984375,
0.0056304931640625,
-0.0024623870849609375,
-0.002849578857421875,
-0.0113525390625,
-0.0076904296875,
0.0233612060546875,
-0.0019779205322265625,
0.021759033203125,
-0.00205230712890625,
0.020538330078125,
-0.006916046142578125,
0.004364013671875,
0.0161895751953125,
-0.023101806640625,
0.056396484375,
-0.0310516357421875,
0.02447509765625,
-0.047576904296875,
0.010772705078125,
0.04608154296875,
-0.0192718505859375,
0.053985595703125,
0.068603515625,
-0.0239410400390625,
-0.0241546630859375,
-0.037445068359375,
-0.0080718994140625,
-0.033935546875,
0.041900634765625,
-0.02496337890625,
-0.055633544921875,
0.0259857177734375,
-0.004611968994140625,
-0.00518798828125,
0.06494140625,
0.03704833984375,
-0.032318115234375,
0.08489990234375,
0.061737060546875,
-0.04052734375,
0.048004150390625,
-0.03948974609375,
0.0217437744140625,
-0.045196533203125,
-0.00978851318359375,
-0.040985107421875,
-0.032196044921875,
-0.0489501953125,
-0.0011348724365234375,
0.00424957275390625,
0.0160675048828125,
-0.0257720947265625,
0.034423828125,
-0.038726806640625,
0.0142974853515625,
0.0218048095703125,
0.00695037841796875,
-0.0014696121215820312,
0.00902557373046875,
-0.0026092529296875,
-0.0124053955078125,
-0.04962158203125,
-0.0278472900390625,
0.0689697265625,
0.033050537109375,
0.05474853515625,
-0.0134124755859375,
0.0693359375,
0.0340576171875,
0.0265655517578125,
-0.06646728515625,
0.045135498046875,
-0.0318603515625,
-0.0230255126953125,
-0.002197265625,
-0.0296173095703125,
-0.0716552734375,
0.005954742431640625,
-0.0198822021484375,
-0.0645751953125,
0.005504608154296875,
-0.006744384765625,
-0.01122283935546875,
0.023590087890625,
-0.062469482421875,
0.07501220703125,
-0.0091552734375,
-0.01800537109375,
0.005733489990234375,
-0.061676025390625,
0.00615692138671875,
0.013519287109375,
0.005573272705078125,
-0.01088714599609375,
-0.0018062591552734375,
0.0615234375,
-0.0221405029296875,
0.07171630859375,
-0.018463134765625,
0.0016450881958007812,
0.009246826171875,
-0.0003483295440673828,
0.0162353515625,
-0.03338623046875,
-0.00017154216766357422,
-0.0029926300048828125,
-0.01324462890625,
-0.0265045166015625,
-0.0193328857421875,
0.032012939453125,
-0.0792236328125,
-0.0084991455078125,
-0.021820068359375,
-0.0167236328125,
-0.021759033203125,
0.0193939208984375,
0.04052734375,
0.008209228515625,
-0.0186309814453125,
0.021240234375,
0.03631591796875,
-0.0004897117614746094,
0.046051025390625,
-0.001750946044921875,
-0.00212860107421875,
-0.031494140625,
0.07684326171875,
-0.0021076202392578125,
0.0089263916015625,
0.04296875,
0.0269927978515625,
-0.0467529296875,
-0.0242462158203125,
-0.01824951171875,
0.02813720703125,
-0.052764892578125,
-0.032135009765625,
-0.06341552734375,
-0.0204925537109375,
-0.041595458984375,
-0.00860595703125,
-0.0296173095703125,
-0.03466796875,
-0.040679931640625,
-0.0244903564453125,
0.023040771484375,
0.036285400390625,
-0.01947021484375,
0.021209716796875,
-0.0625,
0.01407623291015625,
0.0005140304565429688,
0.004596710205078125,
-0.0077667236328125,
-0.0438232421875,
-0.01715087890625,
0.005218505859375,
-0.041900634765625,
-0.0709228515625,
0.06365966796875,
0.01561737060546875,
0.01824951171875,
0.0230712890625,
0.005825042724609375,
0.04486083984375,
-0.00463104248046875,
0.062042236328125,
0.0297698974609375,
-0.08978271484375,
0.042938232421875,
-0.02008056640625,
0.01090240478515625,
0.041656494140625,
0.036834716796875,
-0.056793212890625,
-0.03173828125,
-0.05059814453125,
-0.0728759765625,
0.048858642578125,
0.0170135498046875,
0.00853729248046875,
-0.00618743896484375,
0.0177154541015625,
0.0032367706298828125,
0.0272369384765625,
-0.08245849609375,
-0.0160369873046875,
-0.024993896484375,
-0.038543701171875,
0.01087188720703125,
0.003635406494140625,
0.015411376953125,
-0.017242431640625,
0.058319091796875,
-0.00785064697265625,
0.04351806640625,
0.0199432373046875,
-0.025909423828125,
0.00177001953125,
0.01514434814453125,
0.0168609619140625,
0.032135009765625,
-0.021270751953125,
-0.005710601806640625,
0.01641845703125,
-0.025421142578125,
-0.007320404052734375,
0.00916290283203125,
0.0008502006530761719,
-0.009002685546875,
0.0273284912109375,
0.06524658203125,
-0.01334381103515625,
-0.027923583984375,
0.050537109375,
-0.0003578662872314453,
-0.031829833984375,
-0.0083465576171875,
-0.003635406494140625,
0.0040130615234375,
0.01470184326171875,
0.032440185546875,
0.017608642578125,
0.00569915771484375,
-0.0290679931640625,
0.01605224609375,
0.0250701904296875,
-0.0345458984375,
-0.042388916015625,
0.050567626953125,
0.0220947265625,
-0.00963592529296875,
0.037750244140625,
-0.027252197265625,
-0.073974609375,
0.04931640625,
0.04119873046875,
0.093505859375,
0.0019435882568359375,
0.02996826171875,
0.038787841796875,
0.010101318359375,
-0.01097869873046875,
0.033935546875,
0.002628326416015625,
-0.0751953125,
-0.039825439453125,
-0.047088623046875,
-0.01216888427734375,
0.009918212890625,
-0.036712646484375,
0.0333251953125,
-0.025238037109375,
-0.006977081298828125,
-0.00506591796875,
0.0189971923828125,
-0.04705810546875,
0.045166015625,
0.0251617431640625,
0.070068359375,
-0.07611083984375,
0.044921875,
0.05419921875,
-0.0462646484375,
-0.07684326171875,
0.0199737548828125,
-0.02288818359375,
-0.054229736328125,
0.0531005859375,
0.0259857177734375,
-0.0230255126953125,
0.01242828369140625,
-0.0704345703125,
-0.041046142578125,
0.0697021484375,
0.0139007568359375,
0.00525665283203125,
-0.0191650390625,
0.0073394775390625,
0.053558349609375,
-0.0303192138671875,
0.014404296875,
0.03753662109375,
0.0282135009765625,
0.01308441162109375,
-0.06402587890625,
0.007251739501953125,
-0.0309906005859375,
-0.0236053466796875,
-0.00225067138671875,
-0.054534912109375,
0.0657958984375,
-0.01534271240234375,
-0.0163421630859375,
0.0036525726318359375,
0.0550537109375,
0.005191802978515625,
0.00897216796875,
0.04119873046875,
0.0426025390625,
0.051666259765625,
-0.017425537109375,
0.07659912109375,
-0.0265655517578125,
0.057647705078125,
0.0543212890625,
0.00762939453125,
0.0687255859375,
0.028533935546875,
-0.01220703125,
0.040069580078125,
0.0596923828125,
-0.0057220458984375,
0.0494384765625,
0.0002015829086303711,
-0.01016998291015625,
-0.0163726806640625,
-0.00522613525390625,
-0.0239715576171875,
0.041748046875,
0.0257720947265625,
-0.031341552734375,
-0.0178070068359375,
-0.01137542724609375,
0.021820068359375,
-0.01320648193359375,
-0.0168304443359375,
0.058807373046875,
0.013092041015625,
-0.051025390625,
0.047607421875,
0.005756378173828125,
0.0830078125,
-0.03656005859375,
0.0257415771484375,
-0.0253753662109375,
0.0257110595703125,
-0.027313232421875,
-0.062255859375,
0.02288818359375,
0.005191802978515625,
0.00360107421875,
-0.038360595703125,
0.0679931640625,
-0.0301055908203125,
-0.0276031494140625,
0.037109375,
0.048370361328125,
0.0014715194702148438,
-0.0124969482421875,
-0.07012939453125,
-0.0203857421875,
0.01275634765625,
-0.0242156982421875,
0.010406494140625,
0.041229248046875,
0.0162200927734375,
0.046051025390625,
0.031524658203125,
0.018280029296875,
0.00252532958984375,
0.003955841064453125,
0.05816650390625,
-0.05621337890625,
-0.04510498046875,
-0.078125,
0.04461669921875,
-0.02337646484375,
-0.036407470703125,
0.049713134765625,
0.039581298828125,
0.05706787109375,
-0.01544952392578125,
0.08087158203125,
-0.0297088623046875,
0.04736328125,
-0.016937255859375,
0.052459716796875,
-0.06365966796875,
-0.005420684814453125,
-0.03436279296875,
-0.04864501953125,
-0.033111572265625,
0.06060791015625,
-0.03546142578125,
0.0174560546875,
0.0562744140625,
0.042816162109375,
-0.000005781650543212891,
-0.0075225830078125,
-0.00463104248046875,
0.03070068359375,
0.032684326171875,
0.05096435546875,
0.055328369140625,
-0.0428466796875,
0.050811767578125,
-0.020477294921875,
-0.00885772705078125,
-0.022705078125,
-0.068603515625,
-0.06597900390625,
-0.045989990234375,
-0.031982421875,
-0.052764892578125,
-0.0015134811401367188,
0.09332275390625,
0.045989990234375,
-0.058685302734375,
-0.0257720947265625,
-0.0248565673828125,
0.00615692138671875,
-0.004940032958984375,
-0.025482177734375,
0.038360595703125,
-0.04931640625,
-0.064697265625,
0.001773834228515625,
-0.0019779205322265625,
0.0005788803100585938,
0.0169219970703125,
0.01849365234375,
-0.0213623046875,
-0.005611419677734375,
0.052490234375,
0.0228424072265625,
-0.0295562744140625,
-0.020599365234375,
0.032745361328125,
-0.0232391357421875,
0.0214080810546875,
0.025421142578125,
-0.042938232421875,
0.0091552734375,
0.047576904296875,
0.058441162109375,
0.039703369140625,
0.010101318359375,
0.037811279296875,
-0.060638427734375,
-0.0216217041015625,
0.0241546630859375,
0.0245361328125,
0.031768798828125,
-0.012786865234375,
0.038299560546875,
0.04071044921875,
-0.032379150390625,
-0.044921875,
-0.0061798095703125,
-0.09161376953125,
-0.0205535888671875,
0.1011962890625,
0.0039043426513671875,
-0.0154571533203125,
0.01097869873046875,
-0.0244293212890625,
0.040283203125,
-0.04193115234375,
0.08477783203125,
0.0592041015625,
-0.006649017333984375,
-0.01092529296875,
-0.0155487060546875,
0.03900146484375,
0.034210205078125,
-0.07354736328125,
-0.0089263916015625,
0.0169677734375,
0.04022216796875,
0.0027599334716796875,
0.040069580078125,
-0.003204345703125,
0.0195465087890625,
-0.030120849609375,
0.0289459228515625,
0.0096435546875,
-0.004119873046875,
-0.0277252197265625,
0.03594970703125,
-0.014068603515625,
-0.04827880859375
]
] |
NousResearch/Nous-Hermes-Llama2-13b | 2023-08-26T20:17:38.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama-2",
"self-instruct",
"distillation",
"synthetic instruction",
"en",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | NousResearch | null | null | NousResearch/Nous-Hermes-Llama2-13b | 239 | 49,194 | transformers | 2023-07-20T23:25:25 | ---
language:
- en
tags:
- llama-2
- self-instruct
- distillation
- synthetic instruction
license:
- mit
---
# Model Card: Nous-Hermes-Llama2-13b
Compute provided by our project sponsor Redmond AI, thank you! Follow RedmondAI on Twitter @RedmondAI.
## Model Description
Nous-Hermes-Llama2-13b is a state-of-the-art language model fine-tuned on over 300,000 instructions. This model was fine-tuned by Nous Research, with Teknium and Emozilla leading the fine tuning process and dataset curation, Redmond AI sponsoring the compute, and several other contributors.
This Hermes model uses the exact same dataset as the Llama-1 Hermes. This ensures consistency between the old and new Hermes, for anyone who wanted a model as similar to the original Hermes as possible, just more capable.
This model stands out for its long responses, lower hallucination rate, and absence of OpenAI censorship mechanisms. Fine-tuning was performed with a 4096 sequence length on an 8x A100 80GB DGX machine.
## Example Outputs:




## Model Training
The model was trained almost entirely on synthetic GPT-4 outputs. Curating high quality GPT-4 datasets enables incredibly high quality in knowledge, task completion, and style.
This includes data from diverse sources such as GPTeacher, the general, roleplay v1&2, code instruct datasets, Nous Instruct & PDACTL (unpublished), and several others, detailed further below.
## Collaborators
The model fine-tuning and the datasets were a collaboration of efforts and resources between Teknium, Karan4D, Emozilla, Huemin Art, and Redmond AI.
Special mention goes to @winglian for assisting in some of the training issues.
Huge shoutout and acknowledgement is deserved for all the dataset creators who generously share their datasets openly.
Among the contributors of datasets:
- GPTeacher was made available by Teknium
- Wizard LM by nlpxucan
- Nous Research Instruct Dataset was provided by Karan4D and HueminArt.
- GPT4-LLM and Unnatural Instructions were provided by Microsoft
- Airoboros dataset by jondurbin
- Camel-AI's domain expert datasets are from Camel-AI
- CodeAlpaca dataset by Sahil2801.
If anyone was left out, please open a thread in the community tab.
## Prompt Format
The model follows the Alpaca prompt format:
```
### Instruction:
<prompt>
### Response:
<leave a newline blank for model to respond>
```
or
```
### Instruction:
<prompt>
### Input:
<additional context>
### Response:
<leave a newline blank for model to respond>
```
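As an illustration, the format above can be assembled with a small helper. This is a sketch; the function name and its exact behavior are not part of the original card, only the prompt layout is:

```python
def build_alpaca_prompt(instruction, context=None):
    """Assemble a prompt in the Alpaca format this model follows.

    `context`, when given, fills the optional '### Input:' section.
    """
    if context is None:
        return f"### Instruction:\n{instruction}\n### Response:\n"
    return (
        f"### Instruction:\n{instruction}\n"
        f"### Input:\n{context}\n"
        f"### Response:\n"
    )

# Example: prompt with additional context.
print(build_alpaca_prompt("Summarize the passage.", "Llama 2 is a family of open LLMs."))
```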
## Benchmark Results
AGI-Eval
```
| Task |Version| Metric |Value | |Stderr|
|agieval_aqua_rat | 0|acc |0.2362|± |0.0267|
| | |acc_norm|0.2480|± |0.0272|
|agieval_logiqa_en | 0|acc |0.3425|± |0.0186|
| | |acc_norm|0.3472|± |0.0187|
|agieval_lsat_ar | 0|acc |0.2522|± |0.0287|
| | |acc_norm|0.2087|± |0.0269|
|agieval_lsat_lr | 0|acc |0.3510|± |0.0212|
| | |acc_norm|0.3627|± |0.0213|
|agieval_lsat_rc | 0|acc |0.4647|± |0.0305|
| | |acc_norm|0.4424|± |0.0303|
|agieval_sat_en | 0|acc |0.6602|± |0.0331|
| | |acc_norm|0.6165|± |0.0340|
|agieval_sat_en_without_passage| 0|acc |0.4320|± |0.0346|
| | |acc_norm|0.4272|± |0.0345|
|agieval_sat_math | 0|acc |0.2909|± |0.0307|
| | |acc_norm|0.2727|± |0.0301|
```
GPT-4All Benchmark Set
```
| Task |Version| Metric |Value | |Stderr|
|arc_challenge| 0|acc |0.5102|± |0.0146|
| | |acc_norm|0.5213|± |0.0146|
|arc_easy | 0|acc |0.7959|± |0.0083|
| | |acc_norm|0.7567|± |0.0088|
|boolq | 1|acc |0.8394|± |0.0064|
|hellaswag | 0|acc |0.6164|± |0.0049|
| | |acc_norm|0.8009|± |0.0040|
|openbookqa | 0|acc |0.3580|± |0.0215|
| | |acc_norm|0.4620|± |0.0223|
|piqa | 0|acc |0.7992|± |0.0093|
| | |acc_norm|0.8069|± |0.0092|
|winogrande | 0|acc |0.7127|± |0.0127|
```
BigBench Reasoning Test
```
| Task |Version| Metric |Value | |Stderr|
|bigbench_causal_judgement | 0|multiple_choice_grade|0.5526|± |0.0362|
|bigbench_date_understanding | 0|multiple_choice_grade|0.7344|± |0.0230|
|bigbench_disambiguation_qa | 0|multiple_choice_grade|0.2636|± |0.0275|
|bigbench_geometric_shapes | 0|multiple_choice_grade|0.0195|± |0.0073|
| | |exact_str_match |0.0000|± |0.0000|
|bigbench_logical_deduction_five_objects | 0|multiple_choice_grade|0.2760|± |0.0200|
|bigbench_logical_deduction_seven_objects | 0|multiple_choice_grade|0.2100|± |0.0154|
|bigbench_logical_deduction_three_objects | 0|multiple_choice_grade|0.4400|± |0.0287|
|bigbench_movie_recommendation | 0|multiple_choice_grade|0.2440|± |0.0192|
|bigbench_navigate | 0|multiple_choice_grade|0.4950|± |0.0158|
|bigbench_reasoning_about_colored_objects | 0|multiple_choice_grade|0.5570|± |0.0111|
|bigbench_ruin_names | 0|multiple_choice_grade|0.3728|± |0.0229|
|bigbench_salient_translation_error_detection | 0|multiple_choice_grade|0.1854|± |0.0123|
|bigbench_snarks | 0|multiple_choice_grade|0.6298|± |0.0360|
|bigbench_sports_understanding | 0|multiple_choice_grade|0.6156|± |0.0155|
|bigbench_temporal_sequences | 0|multiple_choice_grade|0.3140|± |0.0147|
|bigbench_tracking_shuffled_objects_five_objects | 0|multiple_choice_grade|0.2032|± |0.0114|
|bigbench_tracking_shuffled_objects_seven_objects| 0|multiple_choice_grade|0.1406|± |0.0083|
|bigbench_tracking_shuffled_objects_three_objects| 0|multiple_choice_grade|0.4400|± |0.0287|
```
These are the highest benchmarks Hermes has seen on every metric, achieving the following average scores:
- GPT4All benchmark average is now 70.0 - from 68.8 in Hermes-Llama1
- 0.3657 on BigBench, up from 0.328 on hermes-llama1
- 0.372 on AGIEval, up from 0.354 on Hermes-llama1
These benchmarks currently have us at #1 on ARC-c, ARC-e, Hellaswag, and OpenBookQA, and 2nd place on Winogrande, compared to GPT4All's benchmarking list, supplanting Hermes 1 for the new top position.
## Resources for Applied Use Cases:
Check out LM Studio for a nice ChatGPT-style interface here: https://lmstudio.ai/
For an example of a back and forth chatbot using huggingface transformers and discord, check out: https://github.com/teknium1/alpaca-discord
For an example of a roleplaying discord chatbot, check out this: https://github.com/teknium1/alpaca-roleplay-discordbot
## Future Plans
We plan to continue to iterate on both more high quality data, and new data filtering techniques to eliminate lower quality data going forward.
## Model Usage
The model is available for download on Hugging Face. It is suitable for a wide range of language tasks, from generating creative text to understanding and following complex instructions.
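A minimal loading sketch with `transformers` is shown below. This is illustrative, not an official recipe from the card: it assumes `transformers` (and `accelerate`, for `device_map="auto"`) is installed and that enough memory is available for a 13B model; the helper name is hypothetical.

```python
def generate_response(instruction, model_id="NousResearch/Nous-Hermes-Llama2-13b",
                      max_new_tokens=256):
    """Build an Alpaca-style prompt, run the model, and return only the new text."""
    # Imported lazily so the helper can be defined without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = f"### Instruction:\n{instruction}\n### Response:\n"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the model's reply is returned.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```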
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
| 8,300 | [
[
-0.047454833984375,
-0.06390380859375,
0.0189056396484375,
0.00977325439453125,
-0.0018720626831054688,
0.00876617431640625,
-0.00946807861328125,
-0.043182373046875,
0.03363037109375,
0.0168609619140625,
-0.052398681640625,
-0.0482177734375,
-0.053436279296875,
0.004169464111328125,
-0.013946533203125,
0.083984375,
-0.005641937255859375,
-0.0124359130859375,
0.00264739990234375,
-0.0252227783203125,
-0.0291290283203125,
-0.01317596435546875,
-0.05712890625,
-0.01763916015625,
0.03045654296875,
0.044097900390625,
0.05712890625,
0.03515625,
0.043853759765625,
0.0216217041015625,
-0.01561737060546875,
0.004283905029296875,
-0.0247802734375,
-0.01151275634765625,
0.0034427642822265625,
-0.028533935546875,
-0.06109619140625,
0.0191192626953125,
0.0224761962890625,
0.039947509765625,
-0.0027751922607421875,
0.045166015625,
0.00893402099609375,
0.059967041015625,
-0.02496337890625,
0.0193634033203125,
-0.0164947509765625,
-0.0081329345703125,
-0.007740020751953125,
-0.0005574226379394531,
-0.00591278076171875,
-0.035552978515625,
0.0062103271484375,
-0.0560302734375,
0.011016845703125,
0.00567626953125,
0.0926513671875,
0.0217437744140625,
-0.0199127197265625,
-0.0179290771484375,
-0.029205322265625,
0.06414794921875,
-0.05389404296875,
0.016265869140625,
0.045135498046875,
0.0081024169921875,
-0.0136260986328125,
-0.040557861328125,
-0.061187744140625,
-0.0031185150146484375,
-0.01445770263671875,
0.02716064453125,
-0.01377105712890625,
-0.02197265625,
0.029296875,
0.040252685546875,
-0.052093505859375,
0.006954193115234375,
-0.037811279296875,
-0.00379180908203125,
0.0643310546875,
0.0224151611328125,
0.02044677734375,
-0.0128631591796875,
-0.02203369140625,
-0.032073974609375,
-0.0301971435546875,
0.0260772705078125,
0.026885986328125,
0.0129547119140625,
-0.0411376953125,
0.045867919921875,
-0.022369384765625,
0.039031982421875,
0.006092071533203125,
-0.0143280029296875,
0.0574951171875,
-0.0190277099609375,
-0.018829345703125,
-0.0152435302734375,
0.06365966796875,
0.031219482421875,
0.002105712890625,
0.013946533203125,
0.0013608932495117188,
0.016265869140625,
0.007083892822265625,
-0.05438232421875,
-0.0150604248046875,
0.0362548828125,
-0.033203125,
-0.018707275390625,
-0.00948333740234375,
-0.06500244140625,
-0.007720947265625,
-0.0274658203125,
0.025482177734375,
-0.038818359375,
-0.023834228515625,
0.0050048828125,
-0.004367828369140625,
0.03204345703125,
0.034393310546875,
-0.062347412109375,
0.022979736328125,
0.0325927734375,
0.05804443359375,
-0.0167083740234375,
-0.020355224609375,
-0.00943756103515625,
-0.0068359375,
-0.0294647216796875,
0.0543212890625,
-0.0257110595703125,
-0.02203369140625,
-0.0291748046875,
0.0146942138671875,
-0.01303863525390625,
-0.0279693603515625,
0.05535888671875,
-0.01544189453125,
0.03936767578125,
-0.0283660888671875,
-0.04150390625,
-0.01495361328125,
0.034698486328125,
-0.057342529296875,
0.10491943359375,
0.0215606689453125,
-0.05780029296875,
0.02423095703125,
-0.0654296875,
-0.0017871856689453125,
-0.0167999267578125,
-0.0102691650390625,
-0.052337646484375,
-0.0218048095703125,
0.0284423828125,
0.0293426513671875,
-0.0335693359375,
0.01212310791015625,
-0.019317626953125,
-0.031280517578125,
0.003276824951171875,
-0.0145721435546875,
0.0732421875,
0.0185699462890625,
-0.0499267578125,
-0.0028781890869140625,
-0.07403564453125,
0.01123809814453125,
0.031768798828125,
-0.0250244140625,
0.004550933837890625,
-0.0178070068359375,
-0.0012674331665039062,
0.0242767333984375,
0.014434814453125,
-0.040435791015625,
0.0245513916015625,
-0.01727294921875,
0.023468017578125,
0.061767578125,
-0.007335662841796875,
0.0204315185546875,
-0.048309326171875,
0.035186767578125,
-0.0022220611572265625,
0.01396942138671875,
-0.00159454345703125,
-0.05731201171875,
-0.06396484375,
-0.0303192138671875,
0.01537322998046875,
0.047149658203125,
-0.027374267578125,
0.043792724609375,
-0.0114288330078125,
-0.057647705078125,
-0.044952392578125,
-0.0007815361022949219,
0.0303192138671875,
0.040130615234375,
0.039031982421875,
-0.027496337890625,
-0.0290069580078125,
-0.0645751953125,
-0.005268096923828125,
-0.016815185546875,
-0.0048828125,
0.03973388671875,
0.06201171875,
-0.01727294921875,
0.06024169921875,
-0.051177978515625,
-0.0307464599609375,
-0.02947998046875,
-0.00246429443359375,
0.034271240234375,
0.04949951171875,
0.046600341796875,
-0.0443115234375,
-0.035614013671875,
-0.0118560791015625,
-0.055023193359375,
-0.00027298927307128906,
0.0018463134765625,
-0.0196380615234375,
0.0257110595703125,
0.01409912109375,
-0.053375244140625,
0.0501708984375,
0.045440673828125,
-0.041046142578125,
0.05615234375,
-0.01198577880859375,
0.0186004638671875,
-0.0830078125,
0.0296478271484375,
0.0012807846069335938,
0.01259613037109375,
-0.03326416015625,
-0.0041046142578125,
0.0017242431640625,
-0.0036602020263671875,
-0.0196075439453125,
0.06201171875,
-0.036956787109375,
-0.0007185935974121094,
0.01314544677734375,
0.0135040283203125,
0.004364013671875,
0.052459716796875,
0.0055084228515625,
0.0654296875,
0.047332763671875,
-0.03350830078125,
0.01531219482421875,
0.034881591796875,
-0.035552978515625,
0.03546142578125,
-0.05804443359375,
0.00583648681640625,
0.00041604042053222656,
0.0205230712890625,
-0.0904541015625,
-0.019989013671875,
0.028839111328125,
-0.046783447265625,
0.0186004638671875,
0.01678466796875,
-0.027740478515625,
-0.056060791015625,
-0.038818359375,
0.0177001953125,
0.0335693359375,
-0.0303192138671875,
0.02838134765625,
0.014373779296875,
-0.0036029815673828125,
-0.049072265625,
-0.05224609375,
-0.006145477294921875,
-0.01873779296875,
-0.048492431640625,
0.0296783447265625,
-0.022674560546875,
-0.006900787353515625,
0.0018215179443359375,
-0.018218994140625,
0.0016880035400390625,
0.01013946533203125,
0.01763916015625,
0.03521728515625,
-0.0157623291015625,
-0.006824493408203125,
-0.01497650146484375,
-0.0159912109375,
-0.004810333251953125,
0.0184783935546875,
0.03326416015625,
-0.01413726806640625,
-0.02410888671875,
-0.049530029296875,
0.01407623291015625,
0.04888916015625,
-0.02789306640625,
0.06561279296875,
0.039794921875,
-0.0132904052734375,
0.005153656005859375,
-0.0419921875,
-0.00992584228515625,
-0.0347900390625,
0.01471710205078125,
-0.03411865234375,
-0.06341552734375,
0.043914794921875,
0.0037841796875,
0.0196075439453125,
0.039794921875,
0.031982421875,
0.00028896331787109375,
0.06927490234375,
0.03338623046875,
-0.024444580078125,
0.0240478515625,
-0.051177978515625,
0.0123748779296875,
-0.06829833984375,
-0.0286712646484375,
-0.040313720703125,
-0.03826904296875,
-0.04974365234375,
-0.024505615234375,
0.0099945068359375,
-0.0020503997802734375,
-0.045684814453125,
0.027923583984375,
-0.054168701171875,
0.023468017578125,
0.0540771484375,
0.0185394287109375,
0.01085662841796875,
-0.00563812255859375,
-0.01971435546875,
-0.00730133056640625,
-0.0293121337890625,
-0.049163818359375,
0.10089111328125,
0.01386260986328125,
0.04388427734375,
0.016143798828125,
0.04766845703125,
0.020751953125,
0.01247406005859375,
-0.037750244140625,
0.04547119140625,
0.01290130615234375,
-0.0540771484375,
-0.0206298828125,
-0.02093505859375,
-0.08526611328125,
0.0245819091796875,
-0.0245208740234375,
-0.0726318359375,
0.00450897216796875,
-0.00267791748046875,
-0.0251312255859375,
0.0213470458984375,
-0.057769775390625,
0.07763671875,
-0.01025390625,
-0.040618896484375,
-0.0022182464599609375,
-0.055450439453125,
0.0216827392578125,
0.006229400634765625,
0.02325439453125,
-0.0262908935546875,
0.0018901824951171875,
0.0626220703125,
-0.032745361328125,
0.0460205078125,
0.001743316650390625,
-0.0012111663818359375,
0.0255126953125,
-0.000022411346435546875,
0.031646728515625,
0.0022869110107421875,
-0.01354217529296875,
0.014312744140625,
-0.0030536651611328125,
-0.04742431640625,
-0.023162841796875,
0.060028076171875,
-0.0797119140625,
-0.053131103515625,
-0.0543212890625,
-0.04278564453125,
-0.006622314453125,
0.0167694091796875,
0.009246826171875,
0.0187225341796875,
-0.004726409912109375,
0.00319671630859375,
0.038848876953125,
-0.035552978515625,
0.031768798828125,
0.03875732421875,
-0.0036258697509765625,
-0.02972412109375,
0.059356689453125,
0.0015192031860351562,
0.02044677734375,
0.01026153564453125,
0.006336212158203125,
-0.01934814453125,
-0.0170135498046875,
-0.039581298828125,
0.0311279296875,
-0.019775390625,
-0.0155029296875,
-0.046722412109375,
-0.01224517822265625,
-0.040740966796875,
-0.01800537109375,
-0.024017333984375,
-0.039825439453125,
-0.02264404296875,
-0.01812744140625,
0.041595458984375,
0.045806884765625,
0.004932403564453125,
0.0110015869140625,
-0.035980224609375,
0.03369140625,
0.0123138427734375,
0.0294342041015625,
-0.0030384063720703125,
-0.039886474609375,
-0.006984710693359375,
0.0015916824340820312,
-0.049163818359375,
-0.05987548828125,
0.038421630859375,
0.006195068359375,
0.04229736328125,
0.0194091796875,
-0.0043792724609375,
0.058685302734375,
-0.01509857177734375,
0.0858154296875,
0.004119873046875,
-0.061676025390625,
0.053253173828125,
-0.0222015380859375,
0.0307159423828125,
0.04510498046875,
0.0352783203125,
-0.0419921875,
-0.041046142578125,
-0.0643310546875,
-0.07086181640625,
0.08587646484375,
0.034881591796875,
-0.00868988037109375,
0.005916595458984375,
0.02044677734375,
-0.005336761474609375,
0.00623321533203125,
-0.0592041015625,
-0.0516357421875,
-0.0016317367553710938,
-0.018310546875,
-0.0116729736328125,
-0.00637054443359375,
-0.0090484619140625,
-0.03961181640625,
0.061187744140625,
0.0080718994140625,
0.037933349609375,
0.01354217529296875,
0.01059722900390625,
0.007232666015625,
0.01617431640625,
0.03656005859375,
0.038665771484375,
-0.0196075439453125,
-0.01209259033203125,
0.0179290771484375,
-0.054595947265625,
0.01219940185546875,
0.01136016845703125,
-0.00945281982421875,
-0.0148468017578125,
0.028411865234375,
0.045623779296875,
-0.006134033203125,
-0.040130615234375,
0.036712646484375,
-0.009033203125,
-0.0252532958984375,
-0.03363037109375,
0.010894775390625,
0.001567840576171875,
0.0228118896484375,
0.0171661376953125,
0.007045745849609375,
0.002277374267578125,
-0.041229248046875,
0.00930023193359375,
0.0188446044921875,
-0.0093536376953125,
-0.01561737060546875,
0.06756591796875,
0.0017175674438476562,
-0.0139923095703125,
0.04742431640625,
-0.0135498046875,
-0.035675048828125,
0.0654296875,
0.0311737060546875,
0.037933349609375,
-0.030487060546875,
0.0174713134765625,
0.06884765625,
0.0257720947265625,
-0.00914764404296875,
0.0311279296875,
-0.00019061565399169922,
-0.037353515625,
-0.006534576416015625,
-0.044647216796875,
-0.031768798828125,
0.02618408203125,
-0.050262451171875,
0.029296875,
-0.026519775390625,
-0.01251983642578125,
-0.0017995834350585938,
0.029571533203125,
-0.055450439453125,
0.032989501953125,
0.0023708343505859375,
0.06842041015625,
-0.06109619140625,
0.065185546875,
0.052978515625,
-0.05035400390625,
-0.081787109375,
-0.01556396484375,
0.00930023193359375,
-0.064453125,
0.041259765625,
0.027099609375,
0.01271820068359375,
-0.004486083984375,
-0.029815673828125,
-0.09613037109375,
0.1036376953125,
0.0126953125,
-0.0289459228515625,
0.01137542724609375,
0.01934814453125,
0.04400634765625,
-0.00020003318786621094,
0.051971435546875,
0.055419921875,
0.042510986328125,
0.0081329345703125,
-0.0701904296875,
0.0196685791015625,
-0.056671142578125,
-0.0128021240234375,
0.0144805908203125,
-0.07220458984375,
0.07110595703125,
-0.021514892578125,
-0.003177642822265625,
0.0024166107177734375,
0.039764404296875,
0.038726806640625,
0.0293121337890625,
0.0261688232421875,
0.07501220703125,
0.0675048828125,
-0.020355224609375,
0.07843017578125,
-0.0275421142578125,
0.03485107421875,
0.08282470703125,
-0.0017461776733398438,
0.06072998046875,
0.02490234375,
-0.03729248046875,
0.048858642578125,
0.057281494140625,
-0.004444122314453125,
0.029266357421875,
0.0100250244140625,
-0.007328033447265625,
0.0014171600341796875,
0.016082763671875,
-0.04345703125,
0.0248260498046875,
0.0221099853515625,
-0.017486572265625,
-0.0005478858947753906,
-0.01120758056640625,
0.0213165283203125,
-0.00457763671875,
-0.00820159912109375,
0.0390625,
-0.00829315185546875,
-0.051025390625,
0.059906005859375,
-0.005176544189453125,
0.05169677734375,
-0.046600341796875,
-0.004154205322265625,
-0.0303192138671875,
0.0246124267578125,
-0.0301971435546875,
-0.07391357421875,
0.01456451416015625,
0.00821685791015625,
-0.0013475418090820312,
-0.006076812744140625,
0.0289154052734375,
-0.01532745361328125,
-0.03857421875,
0.0300140380859375,
0.028289794921875,
0.0203704833984375,
0.00945281982421875,
-0.07574462890625,
0.005428314208984375,
0.005695343017578125,
-0.0562744140625,
0.02105712890625,
0.036468505859375,
-0.0007319450378417969,
0.051605224609375,
0.05108642578125,
-0.0125274658203125,
0.0031585693359375,
-0.01047515869140625,
0.07470703125,
-0.056365966796875,
-0.03216552734375,
-0.05364990234375,
0.039459228515625,
-0.0183258056640625,
-0.045684814453125,
0.072509765625,
0.058807373046875,
0.060638427734375,
0.00763702392578125,
0.04693603515625,
-0.0311279296875,
0.045989990234375,
-0.0156707763671875,
0.04388427734375,
-0.06610107421875,
0.01120758056640625,
-0.03594970703125,
-0.057159423828125,
-0.02801513671875,
0.059967041015625,
-0.03192138671875,
0.0145416259765625,
0.054534912109375,
0.06622314453125,
0.004009246826171875,
0.01332855224609375,
0.0028095245361328125,
0.0163116455078125,
0.0166778564453125,
0.0634765625,
0.0391845703125,
-0.048065185546875,
0.042724609375,
-0.033599853515625,
-0.01363372802734375,
-0.00928497314453125,
-0.041839599609375,
-0.0631103515625,
-0.0416259765625,
-0.03411865234375,
-0.03033447265625,
-0.0144195556640625,
0.0667724609375,
0.0347900390625,
-0.0511474609375,
-0.025054931640625,
-0.0005373954772949219,
0.0018138885498046875,
-0.0266265869140625,
-0.0195159912109375,
0.05010986328125,
-0.01041412353515625,
-0.0640869140625,
0.0037746429443359375,
-0.006885528564453125,
0.00693511962890625,
0.006191253662109375,
-0.0123443603515625,
-0.024505615234375,
0.0093841552734375,
0.040313720703125,
0.0177154541015625,
-0.04949951171875,
-0.016448974609375,
0.00911712646484375,
-0.01593017578125,
0.029266357421875,
0.0099945068359375,
-0.0479736328125,
0.0140533447265625,
0.01495361328125,
0.033355712890625,
0.056640625,
-0.0021800994873046875,
0.003650665283203125,
-0.027374267578125,
0.008026123046875,
0.001201629638671875,
0.0220489501953125,
0.0136260986328125,
-0.0256500244140625,
0.06451416015625,
0.01161956787109375,
-0.053466796875,
-0.05853271484375,
-0.01026153564453125,
-0.10003662109375,
-0.01215362548828125,
0.08941650390625,
-0.00994110107421875,
-0.0307159423828125,
0.0008068084716796875,
-0.03485107421875,
0.01267242431640625,
-0.047515869140625,
0.061767578125,
0.052337646484375,
-0.0239410400390625,
-0.0038604736328125,
-0.058013916015625,
0.0242919921875,
0.037811279296875,
-0.0699462890625,
-0.0084075927734375,
0.029144287109375,
0.0268096923828125,
0.01800537109375,
0.0592041015625,
-0.0146484375,
0.0164337158203125,
0.0042724609375,
0.0200958251953125,
0.004878997802734375,
0.005565643310546875,
-0.00603485107421875,
0.006839752197265625,
-0.005695343017578125,
-0.0308074951171875
]
] |
openai/whisper-small.en | 2023-09-08T12:56:14.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"en",
"arxiv:2212.04356",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | openai | null | null | openai/whisper-small.en | 15 | 49,178 | transformers | 2022-09-26T06:59:49 | ---
language:
- en
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
widget:
- example_title: Librispeech sample 1
src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
model-index:
- name: whisper-small.en
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (other)
type: librispeech_asr
config: other
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value:
pipeline_tag: automatic-speech-recognition
license: apache-2.0
---
# Whisper
Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. Trained on 680k hours
of labelled data, Whisper models demonstrate a strong ability to generalise to many datasets and domains **without** the need
for fine-tuning.
Whisper was proposed in the paper [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356)
by Alec Radford et al. from OpenAI. The original code repository can be found [here](https://github.com/openai/whisper).
**Disclaimer**: Content for this model card has partly been written by the Hugging Face team, and parts of it were
copied and pasted from the original model card.
## Model details
Whisper is a Transformer based encoder-decoder model, also referred to as a _sequence-to-sequence_ model.
It was trained on 680k hours of labelled speech data annotated using large-scale weak supervision.
The models were trained on either English-only data or multilingual data. The English-only models were trained
on the task of speech recognition. The multilingual models were trained on both speech recognition and speech
translation. For speech recognition, the model predicts transcriptions in the *same* language as the audio.
For speech translation, the model predicts transcriptions in a *different* language from the audio.
Whisper checkpoints come in five configurations of varying model sizes.
The smallest four are trained on either English-only or multilingual data.
The largest checkpoints are multilingual only. All ten of the pre-trained checkpoints
are available on the [Hugging Face Hub](https://huggingface.co/models?search=openai/whisper). The
checkpoints are summarised in the following table with links to the models on the Hub:
| Size | Parameters | English-only | Multilingual |
|----------|------------|------------------------------------------------------|-----------------------------------------------------|
| tiny | 39 M | [✓](https://huggingface.co/openai/whisper-tiny.en) | [✓](https://huggingface.co/openai/whisper-tiny) |
| base | 74 M | [✓](https://huggingface.co/openai/whisper-base.en) | [✓](https://huggingface.co/openai/whisper-base) |
| small | 244 M | [✓](https://huggingface.co/openai/whisper-small.en) | [✓](https://huggingface.co/openai/whisper-small) |
| medium | 769 M | [✓](https://huggingface.co/openai/whisper-medium.en) | [✓](https://huggingface.co/openai/whisper-medium) |
| large | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large) |
| large-v2 | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large-v2) |
# Usage
This checkpoint is an *English-only* model, meaning it can be used for English speech recognition. Multilingual speech
recognition or speech translation is possible through use of a multilingual checkpoint.
To transcribe audio samples, the model has to be used alongside a [`WhisperProcessor`](https://huggingface.co/docs/transformers/model_doc/whisper#transformers.WhisperProcessor).
The `WhisperProcessor` is used to:
1. Pre-process the audio inputs (converting them to log-Mel spectrograms for the model)
2. Post-process the model outputs (converting them from tokens to text)
## Transcription
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import load_dataset
>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-small.en")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small.en")
>>> # load dummy dataset and read audio files
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> input_features = processor(sample["array"], sampling_rate=sample["sampling_rate"], return_tensors="pt").input_features
>>> # generate token ids
>>> predicted_ids = model.generate(input_features)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=False)
['<|startoftranscript|><|notimestamps|> Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel.<|endoftext|>']
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.']
```
The context tokens can be removed from the start of the transcription by setting `skip_special_tokens=True`.
## Evaluation
This code snippet shows how to evaluate Whisper small.en on [LibriSpeech test-clean](https://huggingface.co/datasets/librispeech_asr):
```python
>>> from datasets import load_dataset
>>> from transformers import WhisperForConditionalGeneration, WhisperProcessor
>>> import torch
>>> from evaluate import load
>>> librispeech_test_clean = load_dataset("librispeech_asr", "clean", split="test")
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-small.en")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small.en").to("cuda")
>>> def map_to_pred(batch):
...     audio = batch["audio"]
...     input_features = processor(audio["array"], sampling_rate=audio["sampling_rate"], return_tensors="pt").input_features
...     batch["reference"] = processor.tokenizer._normalize(batch['text'])
...
...     with torch.no_grad():
...         predicted_ids = model.generate(input_features.to("cuda"))[0]
...     transcription = processor.decode(predicted_ids)
...     batch["prediction"] = processor.tokenizer._normalize(transcription)
...     return batch
>>> result = librispeech_test_clean.map(map_to_pred)
>>> wer = load("wer")
>>> print(100 * wer.compute(references=result["reference"], predictions=result["prediction"]))
3.053161596922323
```
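Word error rate is the word-level edit distance between hypothesis and reference, divided by the number of reference words. The `evaluate` library handles normalisation and edge cases; a minimal sketch of the metric itself (an illustration, not the library's implementation) looks like:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            deletion = dp[i - 1][j] + 1
            insertion = dp[i][j - 1] + 1
            dp[i][j] = min(substitution, deletion, insertion)
    return dp[len(ref)][len(hyp)] / len(ref)
```

One substituted word in a three-word reference gives a WER of 1/3; insertions and deletions count the same way.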
## Long-Form Transcription
The Whisper model is intrinsically designed to work on audio samples of up to 30s in duration. However, by using a chunking
algorithm, it can be used to transcribe audio samples of arbitrary length. This is possible with the Transformers
[`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline)
method. Chunking is enabled by setting `chunk_length_s=30` when instantiating the pipeline. With chunking enabled, the pipeline
can be run with batched inference. It can also be extended to predict sequence level timestamps by passing `return_timestamps=True`:
```python
>>> import torch
>>> from transformers import pipeline
>>> from datasets import load_dataset
>>> device = "cuda:0" if torch.cuda.is_available() else "cpu"
>>> pipe = pipeline(
...     "automatic-speech-recognition",
...     model="openai/whisper-small.en",
...     chunk_length_s=30,
...     device=device,
... )
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> prediction = pipe(sample.copy(), batch_size=8)["text"]
" Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel."
>>> # we can also return timestamps for the predictions
>>> prediction = pipe(sample.copy(), batch_size=8, return_timestamps=True)["chunks"]
[{'text': ' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.',
'timestamp': (0.0, 5.44)}]
```
Refer to the blog post [ASR Chunking](https://huggingface.co/blog/asr-chunking) for more details on the chunking algorithm.
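The core idea can be sketched in a few lines: cut the waveform into 30 s windows that overlap by a stride, so a word clipped at one window boundary is heard whole in the neighbouring window. The function and parameter names below are illustrative, not the pipeline's internals:

```python
def chunk_audio(samples, sampling_rate=16_000, chunk_s=30.0, stride_s=5.0):
    """Split a waveform into fixed-length windows with stride_s seconds of overlap."""
    chunk_len = int(chunk_s * sampling_rate)
    step = int((chunk_s - stride_s) * sampling_rate)  # advance less than a full chunk
    chunks, start = [], 0
    while start < len(samples):
        chunks.append(samples[start:start + chunk_len])
        if start + chunk_len >= len(samples):
            break  # this window already covers the end of the audio
        start += step
    return chunks
```

A 70 s clip at 16 kHz yields three windows here: two full 30 s windows and a final shorter one. The real pipeline then decodes each window and merges the texts on the overlapping stride.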
## Fine-Tuning
The pre-trained Whisper model demonstrates a strong ability to generalise to different datasets and domains. However,
its predictive capabilities can be improved further for certain languages and tasks through *fine-tuning*. The blog
post [Fine-Tune Whisper with 🤗 Transformers](https://huggingface.co/blog/fine-tune-whisper) provides a step-by-step
guide to fine-tuning the Whisper model with as little as 5 hours of labelled data.
### Evaluated Use
The primary intended users of these models are AI researchers studying robustness, generalization, capabilities, biases, and constraints of the current model. However, Whisper is also potentially quite useful as an ASR solution for developers, especially for English speech recognition. We recognize that once models are released, it is impossible to restrict access to only “intended” uses or to draw reasonable guidelines around what is or is not research.
The models are primarily trained and evaluated on ASR and speech translation to English tasks. They show strong ASR results in ~10 languages. They may exhibit additional capabilities, particularly if fine-tuned on certain tasks like voice activity detection, speaker classification, or speaker diarization but have not been robustly evaluated in these areas. We strongly recommend that users perform robust evaluations of the models in a particular context and domain before deploying them.
In particular, we caution against using Whisper models to transcribe recordings of individuals taken without their consent or purporting to use these models for any kind of subjective classification. We recommend against use in high-risk domains like decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes. The models are intended to transcribe and translate speech; using them for classification is not only unevaluated but also inappropriate, particularly for inferring human attributes.
## Training Data
The models are trained on 680,000 hours of audio and the corresponding transcripts collected from the internet. 65% of this data (or 438,000 hours) represents English-language audio and matched English transcripts, roughly 18% (or 126,000 hours) represents non-English audio and English transcripts, while the final 17% (or 117,000 hours) represents non-English audio and the corresponding transcript. This non-English data represents 98 different languages.
As discussed in [the accompanying paper](https://cdn.openai.com/papers/whisper.pdf), we see that performance on transcription in a given language is directly correlated with the amount of training data we employ in that language.
## Performance and Limitations
Our studies show that, compared to many existing ASR systems, the models exhibit improved robustness to accents, background noise, and technical language, as well as zero-shot translation from multiple languages into English, and that accuracy on speech recognition and translation is near the state-of-the-art level.
However, because the models are trained in a weakly supervised manner using large-scale noisy data, the predictions may include texts that are not actually spoken in the audio input (i.e. hallucination). We hypothesize that this happens because, given their general knowledge of language, the models combine trying to predict the next word in audio with trying to transcribe the audio itself.
Our models perform unevenly across languages, and we observe lower accuracy on low-resource and/or low-discoverability languages or languages where we have less training data. The models also exhibit disparate performance on different accents and dialects of particular languages, which may include higher word error rate across speakers of different genders, races, ages, or other demographic criteria. Our full evaluation results are presented in [the paper accompanying this release](https://cdn.openai.com/papers/whisper.pdf).
In addition, the sequence-to-sequence architecture of the model makes it prone to generating repetitive texts, which can be mitigated to some degree by beam search and temperature scheduling but not perfectly. Further analysis of these limitations is provided in [the paper](https://cdn.openai.com/papers/whisper.pdf). It is likely that this behavior and hallucinations may be worse on lower-resource and/or lower-discoverability languages.
## Broader Implications
We anticipate that Whisper models’ transcription capabilities may be used for improving accessibility tools. While Whisper models cannot be used for real-time transcription out of the box, their speed and size suggest that others may be able to build applications on top of them that allow for near-real-time speech recognition and translation. The real value of beneficial applications built on top of Whisper models suggests that the disparate performance of these models may have real economic implications.
There are also potential dual use concerns that come with releasing Whisper. While we hope the technology will be used primarily for beneficial purposes, making ASR technology more accessible could enable more actors to build capable surveillance technologies or scale up existing surveillance efforts, as the speed and accuracy allow for affordable automatic transcription and translation of large volumes of audio communication. Moreover, these models may have some capabilities to recognize specific individuals out of the box, which in turn presents safety concerns related both to dual use and disparate performance. In practice, we expect that the cost of transcription is not the limiting factor of scaling up surveillance projects.
### BibTeX entry and citation info
```bibtex
@misc{radford2022whisper,
doi = {10.48550/ARXIV.2212.04356},
url = {https://arxiv.org/abs/2212.04356},
author = {Radford, Alec and Kim, Jong Wook and Xu, Tao and Brockman, Greg and McLeavey, Christine and Sutskever, Ilya},
title = {Robust Speech Recognition via Large-Scale Weak Supervision},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
| 14,803 | [
[
-0.02117919921875,
-0.046600341796875,
0.00787353515625,
0.03204345703125,
-0.0043792724609375,
-0.0012254714965820312,
-0.028289794921875,
-0.04644775390625,
0.0179901123046875,
0.02349853515625,
-0.06146240234375,
-0.0382080078125,
-0.053253173828125,
-0.0109710693359375,
-0.042724609375,
0.0750732421875,
0.01200103759765625,
-0.00101470947265625,
0.0164642333984375,
-0.00499725341796875,
-0.024627685546875,
-0.0190887451171875,
-0.05462646484375,
-0.01497650146484375,
0.015533447265625,
0.012603759765625,
0.028778076171875,
0.04083251953125,
0.0108795166015625,
0.03143310546875,
-0.0313720703125,
-0.005687713623046875,
-0.027099609375,
-0.00896453857421875,
0.029052734375,
-0.03582763671875,
-0.0455322265625,
0.0121612548828125,
0.058502197265625,
0.035308837890625,
-0.02606201171875,
0.0341796875,
0.0184326171875,
0.02349853515625,
-0.020416259765625,
0.0198974609375,
-0.0499267578125,
-0.0093536376953125,
-0.020965576171875,
0.002956390380859375,
-0.0252532958984375,
-0.02276611328125,
0.0418701171875,
-0.044525146484375,
0.029266357421875,
0.012542724609375,
0.07635498046875,
0.0189208984375,
-0.0034046173095703125,
-0.032379150390625,
-0.0533447265625,
0.08343505859375,
-0.0662841796875,
0.038055419921875,
0.0295867919921875,
0.019744873046875,
0.00267791748046875,
-0.0704345703125,
-0.05419921875,
-0.0015630722045898438,
-0.004108428955078125,
0.021881103515625,
-0.02685546875,
-0.0009245872497558594,
0.018646240234375,
0.031494140625,
-0.0350341796875,
0.0031890869140625,
-0.053436279296875,
-0.05023193359375,
0.047119140625,
0.0009531974792480469,
0.0212860107421875,
-0.021453857421875,
-0.0171661376953125,
-0.0308990478515625,
-0.020172119140625,
0.033935546875,
0.028228759765625,
0.0357666015625,
-0.054290771484375,
0.02825927734375,
-0.00487518310546875,
0.045379638671875,
0.01541900634765625,
-0.04486083984375,
0.04498291015625,
-0.01226043701171875,
-0.01522064208984375,
0.02777099609375,
0.0784912109375,
0.0157012939453125,
0.00616455078125,
0.0084075927734375,
-0.0104827880859375,
0.0132598876953125,
-0.008026123046875,
-0.0650634765625,
-0.00551605224609375,
0.036224365234375,
-0.041961669921875,
-0.022369384765625,
-0.0184326171875,
-0.04852294921875,
0.009246826171875,
-0.01220703125,
0.052001953125,
-0.0430908203125,
-0.0228729248046875,
0.0164947509765625,
-0.02880859375,
0.0225372314453125,
0.002399444580078125,
-0.06170654296875,
0.027099609375,
0.033172607421875,
0.0660400390625,
0.0081787109375,
-0.04541015625,
-0.037628173828125,
0.0068206787109375,
0.00980377197265625,
0.03466796875,
-0.01934814453125,
-0.0418701171875,
-0.016632080078125,
0.00750732421875,
-0.024383544921875,
-0.04376220703125,
0.05352783203125,
-0.00860595703125,
0.03570556640625,
0.001201629638671875,
-0.038604736328125,
-0.0142822265625,
-0.01558685302734375,
-0.0310821533203125,
0.06915283203125,
0.005077362060546875,
-0.053466796875,
0.01203155517578125,
-0.0384521484375,
-0.035552978515625,
-0.021331787109375,
0.013275146484375,
-0.047119140625,
-0.003528594970703125,
0.032379150390625,
0.0297088623046875,
-0.0143585205078125,
0.0019245147705078125,
-0.004322052001953125,
-0.0311126708984375,
0.02423095703125,
-0.030548095703125,
0.07476806640625,
0.01263427734375,
-0.032257080078125,
0.016021728515625,
-0.057647705078125,
0.01001739501953125,
0.003902435302734375,
-0.0111236572265625,
0.01141357421875,
-0.0038242340087890625,
0.02130126953125,
0.0032787322998046875,
0.012237548828125,
-0.05596923828125,
-0.0078582763671875,
-0.049896240234375,
0.054931640625,
0.047943115234375,
-0.005443572998046875,
0.0275115966796875,
-0.0447998046875,
0.022064208984375,
0.00508880615234375,
0.03228759765625,
-0.01212310791015625,
-0.04669189453125,
-0.07342529296875,
-0.0321044921875,
0.034332275390625,
0.051849365234375,
-0.0257720947265625,
0.04217529296875,
-0.0155029296875,
-0.0565185546875,
-0.09765625,
-0.01082611083984375,
0.0426025390625,
0.041961669921875,
0.05255126953125,
-0.0125732421875,
-0.0576171875,
-0.053466796875,
-0.011383056640625,
-0.0233306884765625,
-0.015777587890625,
0.028350830078125,
0.02484130859375,
-0.0273895263671875,
0.05120849609375,
-0.037567138671875,
-0.041015625,
-0.018951416015625,
0.0046539306640625,
0.03594970703125,
0.048248291015625,
0.0198211669921875,
-0.05157470703125,
-0.032684326171875,
-0.01473236083984375,
-0.0426025390625,
-0.00638580322265625,
-0.004154205322265625,
0.0054168701171875,
0.0012359619140625,
0.026763916015625,
-0.053466796875,
0.030853271484375,
0.04913330078125,
-0.0128631591796875,
0.052703857421875,
0.0114898681640625,
-0.003627777099609375,
-0.085205078125,
-0.00431060791015625,
-0.0111846923828125,
-0.00797271728515625,
-0.049468994140625,
-0.019927978515625,
-0.0085906982421875,
-0.006496429443359375,
-0.039642333984375,
0.047271728515625,
-0.025482177734375,
0.00199127197265625,
-0.004001617431640625,
0.0089263916015625,
-0.0049285888671875,
0.03790283203125,
0.013336181640625,
0.049072265625,
0.061798095703125,
-0.041717529296875,
0.0155181884765625,
0.04193115234375,
-0.024993896484375,
0.0205230712890625,
-0.073974609375,
0.01335906982421875,
0.0105438232421875,
0.0148773193359375,
-0.051025390625,
-0.008697509765625,
0.0017871856689453125,
-0.07232666015625,
0.0330810546875,
-0.024444580078125,
-0.026153564453125,
-0.0396728515625,
-0.01435089111328125,
0.00597381591796875,
0.0689697265625,
-0.03656005859375,
0.05303955078125,
0.03216552734375,
-0.01654052734375,
-0.04010009765625,
-0.042022705078125,
-0.0188751220703125,
-0.018157958984375,
-0.0584716796875,
0.036285400390625,
-0.009368896484375,
-0.0017757415771484375,
-0.01378631591796875,
-0.0087738037109375,
0.009521484375,
-0.019256591796875,
0.03570556640625,
0.035552978515625,
-0.01016998291015625,
-0.019683837890625,
0.0150146484375,
-0.0193023681640625,
-0.0015935897827148438,
-0.01715087890625,
0.0517578125,
-0.0271453857421875,
-0.004669189453125,
-0.058868408203125,
0.01556396484375,
0.0362548828125,
-0.0252685546875,
0.04132080078125,
0.0653076171875,
-0.0204010009765625,
-0.0174102783203125,
-0.05126953125,
-0.0193634033203125,
-0.04345703125,
0.01078033447265625,
-0.027099609375,
-0.0592041015625,
0.05255126953125,
0.01247406005859375,
0.0059661865234375,
0.048309326171875,
0.0389404296875,
-0.0213623046875,
0.06976318359375,
0.0328369140625,
-0.0192718505859375,
0.02264404296875,
-0.0579833984375,
-0.0090789794921875,
-0.0784912109375,
-0.0262298583984375,
-0.0430908203125,
-0.021697998046875,
-0.037872314453125,
-0.0272369384765625,
0.03924560546875,
0.0084991455078125,
-0.0093536376953125,
0.03326416015625,
-0.05743408203125,
0.0013580322265625,
0.0484619140625,
0.0024776458740234375,
0.00872802734375,
-0.004512786865234375,
-0.009918212890625,
-0.00707244873046875,
-0.028350830078125,
-0.02532958984375,
0.07232666015625,
0.039642333984375,
0.042205810546875,
-0.006237030029296875,
0.056365966796875,
-0.0022869110107421875,
0.004241943359375,
-0.05859375,
0.036376953125,
-0.00991058349609375,
-0.043243408203125,
-0.029449462890625,
-0.0223541259765625,
-0.06201171875,
0.0114898681640625,
-0.01346588134765625,
-0.05230712890625,
0.00980377197265625,
-0.005283355712890625,
-0.026123046875,
0.0188751220703125,
-0.05511474609375,
0.045074462890625,
0.01019287109375,
0.008392333984375,
-0.00432586669921875,
-0.05731201171875,
0.00945281982421875,
0.006679534912109375,
0.0112457275390625,
-0.01117706298828125,
0.01611328125,
0.082275390625,
-0.03546142578125,
0.06805419921875,
-0.0262451171875,
0.00901031494140625,
0.03948974609375,
-0.0150909423828125,
0.026153564453125,
-0.0166015625,
-0.01209259033203125,
0.03228759765625,
0.023834228515625,
-0.0215911865234375,
-0.0222015380859375,
0.03753662109375,
-0.0809326171875,
-0.0233154296875,
-0.0207061767578125,
-0.027740478515625,
-0.01232147216796875,
0.01544952392578125,
0.0609130859375,
0.04888916015625,
-0.0059051513671875,
0.00070953369140625,
0.035064697265625,
-0.0175933837890625,
0.04083251953125,
0.048095703125,
-0.0172119140625,
-0.0350341796875,
0.0711669921875,
0.017608642578125,
0.01873779296875,
0.01299285888671875,
0.03326416015625,
-0.03192138671875,
-0.050079345703125,
-0.0404052734375,
0.0247955322265625,
-0.0287017822265625,
-0.0132598876953125,
-0.065185546875,
-0.039642333984375,
-0.04522705078125,
0.0006213188171386719,
-0.036651611328125,
-0.0218353271484375,
-0.03021240234375,
0.007389068603515625,
0.04425048828125,
0.03155517578125,
0.0000040531158447265625,
0.042236328125,
-0.06903076171875,
0.031768798828125,
0.025726318359375,
0.007740020751953125,
0.003528594970703125,
-0.0732421875,
-0.00827789306640625,
0.0147857666015625,
-0.0253143310546875,
-0.04559326171875,
0.035736083984375,
0.0283660888671875,
0.031341552734375,
0.0179443359375,
0.0010194778442382812,
0.0709228515625,
-0.0526123046875,
0.059295654296875,
0.016357421875,
-0.09222412109375,
0.0556640625,
-0.0265045166015625,
0.0178680419921875,
0.032958984375,
0.024566650390625,
-0.044677734375,
-0.038848876953125,
-0.05218505859375,
-0.04815673828125,
0.05316162109375,
0.0229949951171875,
0.00533294677734375,
0.0210723876953125,
0.01546478271484375,
0.0090789794921875,
0.0099639892578125,
-0.034912109375,
-0.03533935546875,
-0.028656005859375,
-0.0194549560546875,
-0.00826263427734375,
-0.003185272216796875,
-0.0007328987121582031,
-0.041259765625,
0.0579833984375,
-0.0010547637939453125,
0.036376953125,
0.0291595458984375,
0.004241943359375,
-0.00180816650390625,
0.01276397705078125,
0.0262451171875,
0.0177001953125,
-0.0194091796875,
-0.0276336669921875,
0.0270538330078125,
-0.06402587890625,
0.0015134811401367188,
0.02520751953125,
-0.0221710205078125,
0.0084381103515625,
0.0521240234375,
0.08111572265625,
0.01409912109375,
-0.036834716796875,
0.05096435546875,
-0.007781982421875,
-0.0157623291015625,
-0.047637939453125,
0.00240325927734375,
0.022857666015625,
0.022674560546875,
0.027099609375,
0.0092926025390625,
0.0122833251953125,
-0.038482666015625,
0.01172637939453125,
0.019775390625,
-0.03900146484375,
-0.03936767578125,
0.064697265625,
0.004337310791015625,
-0.029541015625,
0.054473876953125,
0.0014247894287109375,
-0.0458984375,
0.03509521484375,
0.049072265625,
0.07257080078125,
-0.036712646484375,
-0.002819061279296875,
0.034423828125,
0.018096923828125,
0.002117156982421875,
0.0361328125,
-0.0031871795654296875,
-0.0533447265625,
-0.0340576171875,
-0.07904052734375,
-0.02520751953125,
0.0017185211181640625,
-0.0723876953125,
0.027374267578125,
-0.02374267578125,
-0.01983642578125,
0.0241241455078125,
0.006954193115234375,
-0.0577392578125,
0.015167236328125,
0.0027008056640625,
0.07537841796875,
-0.05169677734375,
0.07763671875,
0.01247406005859375,
-0.02093505859375,
-0.0836181640625,
0.0009508132934570312,
0.0045623779296875,
-0.074462890625,
0.0242156982421875,
0.024017333984375,
-0.01593017578125,
0.0092010498046875,
-0.0380859375,
-0.0543212890625,
0.08135986328125,
0.0108489990234375,
-0.0521240234375,
-0.015380859375,
-0.005157470703125,
0.039794921875,
-0.016510009765625,
0.0168914794921875,
0.056304931640625,
0.03411865234375,
0.010345458984375,
-0.1092529296875,
-0.0097503662109375,
-0.018280029296875,
-0.018768310546875,
0.0023040771484375,
-0.059814453125,
0.069091796875,
-0.0325927734375,
-0.017181396484375,
0.02484130859375,
0.05914306640625,
0.029998779296875,
0.031494140625,
0.048828125,
0.04412841796875,
0.05731201171875,
-0.01454925537109375,
0.07305908203125,
-0.01218414306640625,
0.018218994140625,
0.073486328125,
-0.005298614501953125,
0.0841064453125,
0.0174560546875,
-0.03753662109375,
0.05072021484375,
0.025848388671875,
-0.0033416748046875,
0.03631591796875,
-0.00013637542724609375,
-0.0249176025390625,
0.01186370849609375,
-0.00725555419921875,
-0.03900146484375,
0.059814453125,
0.032135009765625,
-0.016693115234375,
0.032196044921875,
0.0111236572265625,
0.00652313232421875,
-0.01039886474609375,
-0.0187225341796875,
0.06475830078125,
0.01629638671875,
-0.029449462890625,
0.06109619140625,
-0.004192352294921875,
0.080810546875,
-0.06060791015625,
0.01497650146484375,
0.01493072509765625,
0.0167694091796875,
-0.016632080078125,
-0.046600341796875,
0.025726318359375,
-0.0146636962890625,
-0.01812744140625,
-0.0126495361328125,
0.042938232421875,
-0.048736572265625,
-0.041534423828125,
0.0379638671875,
0.02703857421875,
0.0233001708984375,
-0.0084381103515625,
-0.061370849609375,
0.032958984375,
0.01503753662109375,
-0.012481689453125,
0.01189422607421875,
0.0101165771484375,
0.02557373046875,
0.050628662109375,
0.06329345703125,
0.033538818359375,
0.016448974609375,
0.0095672607421875,
0.060546875,
-0.0511474609375,
-0.0428466796875,
-0.04766845703125,
0.038604736328125,
-0.002227783203125,
-0.0291900634765625,
0.064697265625,
0.049530029296875,
0.053741455078125,
0.0037708282470703125,
0.053436279296875,
-0.0007815361022949219,
0.0775146484375,
-0.03826904296875,
0.0660400390625,
-0.0302276611328125,
0.0046539306640625,
-0.02862548828125,
-0.05218505859375,
0.01024627685546875,
0.04266357421875,
-0.0079193115234375,
-0.0007381439208984375,
0.02880859375,
0.06549072265625,
-0.00042724609375,
0.021331787109375,
0.00305938720703125,
0.03704833984375,
0.0167999267578125,
0.0369873046875,
0.048248291015625,
-0.06097412109375,
0.04888916015625,
-0.043304443359375,
-0.0174560546875,
0.0092620849609375,
-0.03411865234375,
-0.064453125,
-0.061309814453125,
-0.0209808349609375,
-0.044403076171875,
-0.0206756591796875,
0.053863525390625,
0.06707763671875,
-0.06207275390625,
-0.029296875,
0.0267181396484375,
-0.0060577392578125,
-0.0258331298828125,
-0.0188446044921875,
0.044036865234375,
0.004009246826171875,
-0.0694580078125,
0.0491943359375,
0.0025081634521484375,
0.0247650146484375,
-0.01715087890625,
-0.015655517578125,
0.0064239501953125,
-0.0001804828643798828,
0.03851318359375,
0.018096923828125,
-0.060546875,
-0.016357421875,
0.0071868896484375,
0.01325225830078125,
-0.0006642341613769531,
0.029327392578125,
-0.05517578125,
0.03021240234375,
0.0190887451171875,
0.00689697265625,
0.07073974609375,
-0.02276611328125,
0.0207366943359375,
-0.05230712890625,
0.033172607421875,
0.0227508544921875,
0.026641845703125,
0.027252197265625,
-0.01165771484375,
0.016448974609375,
0.01812744140625,
-0.046173095703125,
-0.07354736328125,
-0.005306243896484375,
-0.0885009765625,
-0.0018739700317382812,
0.074462890625,
0.003810882568359375,
-0.021514892578125,
-0.0055694580078125,
-0.0240478515625,
0.03839111328125,
-0.038330078125,
0.032745361328125,
0.03106689453125,
0.0015935897827148438,
-0.00356292724609375,
-0.044281005859375,
0.050079345703125,
0.015411376953125,
-0.02752685546875,
-0.0073089599609375,
0.005313873291015625,
0.04486083984375,
0.0238037109375,
0.06365966796875,
-0.02215576171875,
0.0113372802734375,
0.01385498046875,
0.0141448974609375,
-0.001697540283203125,
-0.01210784912109375,
-0.02587890625,
-0.005523681640625,
-0.0153961181640625,
-0.03125
]
] |
timm/vit_base_patch16_224.augreg_in21k | 2023-05-06T00:00:35.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-21k",
"arxiv:2106.10270",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_base_patch16_224.augreg_in21k | 4 | 49,151 | timm | 2022-12-22T07:25:23 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-21k
---
# Model card for vit_base_patch16_224.augreg_in21k
A Vision Transformer (ViT) image classification model. Trained on ImageNet-21k (with additional augmentation and regularization) in JAX by paper authors, ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 102.6
- GMACs: 16.9
- Activations (M): 16.5
- Image size: 224 x 224
- **Papers:**
- How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers: https://arxiv.org/abs/2106.10270
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Dataset:** ImageNet-21k
- **Original:** https://github.com/google-research/vision_transformer
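As a rough sanity check, the stated 102.6M parameter count can be reproduced from the architecture hyperparameters above (12 blocks, width 768, MLP ratio 4, patch size 16, 21,843 ImageNet-21k classes). This is a back-of-the-envelope sketch, not code taken from the timm source:

```python
# Back-of-the-envelope parameter count for ViT-B/16 with a 21,843-class head.
width, depth, mlp_ratio = 768, 12, 4
patch, img, classes = 16, 224, 21843

patch_embed = 3 * patch * patch * width + width      # conv projection + bias
tokens = (img // patch) ** 2 + 1                     # 196 patch tokens + CLS
cls_and_pos = width + tokens * width                 # CLS token + position embed

qkv = width * 3 * width + 3 * width                  # fused QKV weight + bias
attn_proj = width * width + width                    # attention output projection
mlp = (width * mlp_ratio * width + mlp_ratio * width # fc1 weight + bias
       + mlp_ratio * width * width + width)          # fc2 weight + bias
norms = 2 * 2 * width                                # two LayerNorms per block
block = qkv + attn_proj + mlp + norms

head = width * classes + classes                     # classifier head
final_norm = 2 * width

total = patch_embed + cls_and_pos + depth * block + head + final_norm
print(total / 1e6)  # ~102.6 (M), matching the stats above
```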
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_base_patch16_224.augreg_in21k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_base_patch16_224.augreg_in21k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 197, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
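The `(1, 197, 768)` shape of the unpooled output follows directly from the patch grid: a 224×224 image split into 16×16 patches yields 14×14 = 196 patch tokens, plus one class token. A tiny illustration:

```python
img_size, patch_size, embed_dim = 224, 16, 768

grid = img_size // patch_size        # 14 patches per side
num_tokens = grid * grid + 1         # 196 patch tokens + 1 CLS token

shape = (1, num_tokens, embed_dim)
print(shape)  # (1, 197, 768), the unpooled forward_features output
```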
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{steiner2021augreg,
title={How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers},
  author={Steiner, Andreas and Kolesnikov, Alexander and Zhai, Xiaohua and Wightman, Ross and Uszkoreit, Jakob and Beyer, Lucas},
journal={arXiv preprint arXiv:2106.10270},
year={2021}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,803 | [
[
-0.038726806640625,
-0.03021240234375,
-0.00225830078125,
0.007068634033203125,
-0.0273590087890625,
-0.0239105224609375,
-0.0232391357421875,
-0.036712646484375,
0.0123138427734375,
0.0242462158203125,
-0.03839111328125,
-0.0364990234375,
-0.04718017578125,
0.0005726814270019531,
-0.01087188720703125,
0.0736083984375,
-0.009796142578125,
0.003047943115234375,
-0.0164947509765625,
-0.033050537109375,
-0.0248260498046875,
-0.0195465087890625,
-0.04449462890625,
-0.03271484375,
0.02703857421875,
0.010498046875,
0.042999267578125,
0.04931640625,
0.0589599609375,
0.033782958984375,
-0.00823211669921875,
0.01311492919921875,
-0.023956298828125,
-0.01373291015625,
0.0200958251953125,
-0.0469970703125,
-0.028778076171875,
0.0166778564453125,
0.053253173828125,
0.030364990234375,
0.01007080078125,
0.025726318359375,
0.00897979736328125,
0.037109375,
-0.0272674560546875,
0.01557159423828125,
-0.040313720703125,
0.01763916015625,
-0.0021877288818359375,
-0.004779815673828125,
-0.0239410400390625,
-0.0231170654296875,
0.0171356201171875,
-0.038116455078125,
0.0467529296875,
-0.004390716552734375,
0.10455322265625,
0.0200958251953125,
0.0018558502197265625,
0.0177459716796875,
-0.03167724609375,
0.056915283203125,
-0.045135498046875,
0.0310821533203125,
0.0124359130859375,
0.01296234130859375,
0.0038471221923828125,
-0.0770263671875,
-0.047088623046875,
-0.012359619140625,
-0.0162200927734375,
0.0080718994140625,
-0.0227203369140625,
0.0190277099609375,
0.0372314453125,
0.046875,
-0.039306640625,
-0.0034656524658203125,
-0.042816162109375,
-0.0216827392578125,
0.043304443359375,
-0.003082275390625,
0.0169219970703125,
-0.01241302490234375,
-0.045440673828125,
-0.045318603515625,
-0.0242156982421875,
0.0196380615234375,
0.0228729248046875,
0.003757476806640625,
-0.036224365234375,
0.04278564453125,
0.0029735565185546875,
0.052581787109375,
0.0186309814453125,
-0.01476287841796875,
0.05242919921875,
-0.01055908203125,
-0.0288543701171875,
-0.02215576171875,
0.08148193359375,
0.03741455078125,
0.0303497314453125,
-0.002666473388671875,
-0.01482391357421875,
-0.007904052734375,
0.005405426025390625,
-0.080810546875,
-0.0301971435546875,
0.005680084228515625,
-0.033416748046875,
-0.02850341796875,
0.0250091552734375,
-0.04827880859375,
-0.0099029541015625,
-0.00872802734375,
0.05792236328125,
-0.032867431640625,
-0.01471710205078125,
0.00732421875,
-0.0139923095703125,
0.0355224609375,
0.019287109375,
-0.045135498046875,
0.00799560546875,
0.0159912109375,
0.07647705078125,
0.0020198822021484375,
-0.035614013671875,
-0.01922607421875,
-0.031982421875,
-0.024200439453125,
0.038970947265625,
-0.0036716461181640625,
-0.0116424560546875,
-0.012481689453125,
0.0297393798828125,
-0.01995849609375,
-0.042999267578125,
0.024993896484375,
-0.0157623291015625,
0.0265350341796875,
0.00977325439453125,
-0.0150299072265625,
-0.0322265625,
0.021270751953125,
-0.0313720703125,
0.09246826171875,
0.0262603759765625,
-0.06634521484375,
0.029754638671875,
-0.03326416015625,
-0.005764007568359375,
-0.0087432861328125,
0.00115203857421875,
-0.08270263671875,
0.004634857177734375,
0.02496337890625,
0.042755126953125,
-0.01464080810546875,
0.00003141164779663086,
-0.030242919921875,
-0.0259246826171875,
0.026519775390625,
-0.020721435546875,
0.06903076171875,
0.0015583038330078125,
-0.0251312255859375,
0.0206146240234375,
-0.044342041015625,
0.00659942626953125,
0.0301361083984375,
-0.0200958251953125,
0.0007500648498535156,
-0.048126220703125,
0.0124359130859375,
0.01446533203125,
0.0189208984375,
-0.0498046875,
0.0296478271484375,
-0.0262603759765625,
0.0302581787109375,
0.048126220703125,
-0.00591278076171875,
0.0295257568359375,
-0.0253448486328125,
0.0243682861328125,
0.0174102783203125,
0.029876708984375,
-0.01132965087890625,
-0.0501708984375,
-0.07855224609375,
-0.0341796875,
0.0253448486328125,
0.033660888671875,
-0.04840087890625,
0.0426025390625,
-0.0276947021484375,
-0.055755615234375,
-0.043670654296875,
0.0024204254150390625,
0.0345458984375,
0.03955078125,
0.040313720703125,
-0.04296875,
-0.042999267578125,
-0.07177734375,
-0.01137542724609375,
-0.006256103515625,
0.0007867813110351562,
0.017486572265625,
0.047332763671875,
-0.0222930908203125,
0.0634765625,
-0.033538818359375,
-0.0238037109375,
-0.01611328125,
0.0048980712890625,
0.025787353515625,
0.05657958984375,
0.0517578125,
-0.0482177734375,
-0.03289794921875,
-0.01067352294921875,
-0.0633544921875,
0.0082244873046875,
-0.0033817291259765625,
-0.01302337646484375,
0.0108795166015625,
0.014312744140625,
-0.053131103515625,
0.058746337890625,
0.0146484375,
-0.0268402099609375,
0.03131103515625,
-0.0180206298828125,
0.0061798095703125,
-0.08929443359375,
-0.0001952648162841797,
0.028656005859375,
-0.0202484130859375,
-0.035919189453125,
0.0015478134155273438,
0.0094757080078125,
-0.0030117034912109375,
-0.0282440185546875,
0.041778564453125,
-0.03936767578125,
-0.00542449951171875,
-0.004711151123046875,
-0.0272674560546875,
0.006114959716796875,
0.054046630859375,
-0.00240325927734375,
0.040740966796875,
0.0540771484375,
-0.03521728515625,
0.045867919921875,
0.038360595703125,
-0.01374053955078125,
0.036529541015625,
-0.054046630859375,
0.01018524169921875,
-0.0018463134765625,
0.01568603515625,
-0.074462890625,
-0.0165557861328125,
0.0282745361328125,
-0.05712890625,
0.047882080078125,
-0.0419921875,
-0.0309600830078125,
-0.045074462890625,
-0.02972412109375,
0.0309600830078125,
0.057586669921875,
-0.057769775390625,
0.046295166015625,
0.007358551025390625,
0.0258941650390625,
-0.04486083984375,
-0.0738525390625,
-0.013214111328125,
-0.028564453125,
-0.05279541015625,
0.036224365234375,
0.004245758056640625,
0.01082611083984375,
0.005992889404296875,
-0.004863739013671875,
-0.00022518634796142578,
-0.0166015625,
0.0341796875,
0.0302276611328125,
-0.016265869140625,
-0.00574493408203125,
-0.027435302734375,
-0.01515960693359375,
0.00043702125549316406,
-0.0264434814453125,
0.039215087890625,
-0.0250701904296875,
-0.0163116455078125,
-0.055755615234375,
-0.02099609375,
0.03582763671875,
-0.0231170654296875,
0.053680419921875,
0.08734130859375,
-0.035003662109375,
0.004383087158203125,
-0.0447998046875,
-0.029693603515625,
-0.036651611328125,
0.03472900390625,
-0.0223541259765625,
-0.031585693359375,
0.056121826171875,
0.01369476318359375,
0.0078277587890625,
0.058563232421875,
0.0343017578125,
0.004680633544921875,
0.0628662109375,
0.05126953125,
0.0142364501953125,
0.06683349609375,
-0.07427978515625,
-0.00841522216796875,
-0.0704345703125,
-0.029541015625,
-0.0177459716796875,
-0.038238525390625,
-0.05352783203125,
-0.038543701171875,
0.03179931640625,
0.005985260009765625,
-0.020721435546875,
0.041717529296875,
-0.0673828125,
0.01214599609375,
0.05426025390625,
0.03729248046875,
-0.006633758544921875,
0.032989501953125,
-0.01444244384765625,
-0.005435943603515625,
-0.055938720703125,
-0.00591278076171875,
0.08184814453125,
0.03399658203125,
0.060211181640625,
-0.020599365234375,
0.051025390625,
-0.01824951171875,
0.022735595703125,
-0.05926513671875,
0.04083251953125,
0.0014200210571289062,
-0.0303497314453125,
-0.00765228271484375,
-0.0300140380859375,
-0.078369140625,
0.01739501953125,
-0.02520751953125,
-0.06024169921875,
0.02911376953125,
0.01471710205078125,
-0.01384735107421875,
0.050262451171875,
-0.06439208984375,
0.071044921875,
-0.0044097900390625,
-0.03717041015625,
0.007694244384765625,
-0.052337646484375,
0.014190673828125,
0.019775390625,
-0.0256500244140625,
0.01055908203125,
0.0189971923828125,
0.07342529296875,
-0.04541015625,
0.06298828125,
-0.0291748046875,
0.0269775390625,
0.035430908203125,
-0.0167694091796875,
0.0276031494140625,
0.0035190582275390625,
0.014068603515625,
0.0233306884765625,
-0.0023326873779296875,
-0.0281524658203125,
-0.037353515625,
0.037933349609375,
-0.0792236328125,
-0.0286865234375,
-0.040435791015625,
-0.045654296875,
0.00780487060546875,
0.006229400634765625,
0.04998779296875,
0.047088623046875,
0.02276611328125,
0.0307769775390625,
0.050018310546875,
-0.0236968994140625,
0.030364990234375,
0.00011807680130004883,
-0.01284027099609375,
-0.04364013671875,
0.07135009765625,
0.0163116455078125,
0.01131439208984375,
0.01222991943359375,
0.0167388916015625,
-0.0255889892578125,
-0.03741455078125,
-0.02825927734375,
0.0311431884765625,
-0.052703857421875,
-0.036224365234375,
-0.04443359375,
-0.040435791015625,
-0.0248870849609375,
0.0018949508666992188,
-0.030609130859375,
-0.026092529296875,
-0.0268096923828125,
0.00711822509765625,
0.064697265625,
0.038299560546875,
-0.0122222900390625,
0.037506103515625,
-0.042694091796875,
0.0149383544921875,
0.0217437744140625,
0.036865234375,
-0.01496124267578125,
-0.07550048828125,
-0.0271759033203125,
0.002857208251953125,
-0.038726806640625,
-0.05621337890625,
0.035491943359375,
0.01512908935546875,
0.033843994140625,
0.0249176025390625,
-0.0194549560546875,
0.06707763671875,
-0.004665374755859375,
0.044891357421875,
0.024017333984375,
-0.03973388671875,
0.036407470703125,
-0.00905609130859375,
0.01012420654296875,
0.012786865234375,
0.00962066650390625,
-0.0233306884765625,
-0.00621795654296875,
-0.08209228515625,
-0.05914306640625,
0.055511474609375,
0.01824951171875,
0.00629425048828125,
0.03436279296875,
0.044677734375,
-0.007232666015625,
0.004436492919921875,
-0.06707763671875,
-0.0221405029296875,
-0.029876708984375,
-0.0251922607421875,
-0.005611419677734375,
-0.0016279220581054688,
-0.0015897750854492188,
-0.05841064453125,
0.048095703125,
-0.005298614501953125,
0.06036376953125,
0.03521728515625,
-0.0177764892578125,
-0.01267242431640625,
-0.029998779296875,
0.025848388671875,
0.0200042724609375,
-0.0224609375,
0.004001617431640625,
0.020263671875,
-0.055938720703125,
-0.0035858154296875,
0.0245513916015625,
-0.0074462890625,
0.0044403076171875,
0.03729248046875,
0.08197021484375,
-0.00969696044921875,
-0.0019626617431640625,
0.0413818359375,
-0.0061798095703125,
-0.030548095703125,
-0.0222625732421875,
0.006256103515625,
-0.0177154541015625,
0.0267486572265625,
0.02581787109375,
0.03167724609375,
-0.0096435546875,
-0.01004791259765625,
0.01244354248046875,
0.038726806640625,
-0.041717529296875,
-0.02850341796875,
0.049713134765625,
-0.01454925537109375,
-0.00650787353515625,
0.0594482421875,
-0.0021953582763671875,
-0.041229248046875,
0.06634521484375,
0.023468017578125,
0.07513427734375,
-0.007244110107421875,
-0.0028400421142578125,
0.05841064453125,
0.02703857421875,
-0.002765655517578125,
0.01265716552734375,
0.0097198486328125,
-0.0595703125,
-0.0086212158203125,
-0.046722412109375,
0.001819610595703125,
0.0247802734375,
-0.041748046875,
0.029205322265625,
-0.040496826171875,
-0.029541015625,
0.0034618377685546875,
0.0178375244140625,
-0.0753173828125,
0.022369384765625,
0.0014867782592773438,
0.055511474609375,
-0.06146240234375,
0.045318603515625,
0.06573486328125,
-0.051910400390625,
-0.07110595703125,
-0.01189422607421875,
-0.01313018798828125,
-0.0682373046875,
0.03369140625,
0.032073974609375,
0.01214599609375,
0.0190887451171875,
-0.060516357421875,
-0.045928955078125,
0.09539794921875,
0.0271148681640625,
-0.00957489013671875,
0.00955963134765625,
-0.0044097900390625,
0.029327392578125,
-0.0202789306640625,
0.036224365234375,
0.011260986328125,
0.0294036865234375,
0.016632080078125,
-0.05218505859375,
0.005260467529296875,
-0.0255126953125,
0.0138397216796875,
0.0131683349609375,
-0.0595703125,
0.07244873046875,
-0.031982421875,
-0.005992889404296875,
0.0145111083984375,
0.04840087890625,
0.00917816162109375,
0.0031986236572265625,
0.040985107421875,
0.06719970703125,
0.0285491943359375,
-0.032562255859375,
0.06817626953125,
-0.012451171875,
0.055511474609375,
0.035003662109375,
0.03753662109375,
0.032073974609375,
0.035400390625,
-0.023529052734375,
0.0269317626953125,
0.0765380859375,
-0.044647216796875,
0.0209808349609375,
0.007488250732421875,
0.004119873046875,
-0.01885986328125,
0.00201416015625,
-0.0364990234375,
0.03741455078125,
0.01554107666015625,
-0.043182373046875,
-0.00742340087890625,
0.01153564453125,
-0.01052093505859375,
-0.0269622802734375,
-0.01308441162109375,
0.045440673828125,
-0.000017821788787841797,
-0.0323486328125,
0.0645751953125,
-0.00010961294174194336,
0.05999755859375,
-0.030426025390625,
-0.002979278564453125,
-0.0181121826171875,
0.032623291015625,
-0.02978515625,
-0.0601806640625,
0.01239013671875,
-0.016937255859375,
-0.0052490234375,
0.002788543701171875,
0.052886962890625,
-0.0308685302734375,
-0.042083740234375,
0.007717132568359375,
0.0241546630859375,
0.022613525390625,
-0.0044403076171875,
-0.07745361328125,
-0.0010328292846679688,
0.0008077621459960938,
-0.044342041015625,
0.01371002197265625,
0.0302276611328125,
-0.00018644332885742188,
0.050689697265625,
0.052642822265625,
-0.005382537841796875,
0.016998291015625,
-0.00820159912109375,
0.06939697265625,
-0.03167724609375,
-0.028472900390625,
-0.058929443359375,
0.048583984375,
-0.007007598876953125,
-0.045654296875,
0.0513916015625,
0.04718017578125,
0.068603515625,
-0.0111846923828125,
0.03643798828125,
-0.01152801513671875,
0.001743316650390625,
-0.0252685546875,
0.04254150390625,
-0.052734375,
-0.007717132568359375,
-0.021209716796875,
-0.068115234375,
-0.0301361083984375,
0.07098388671875,
-0.027252197265625,
0.034271240234375,
0.038909912109375,
0.07421875,
-0.0247802734375,
-0.03131103515625,
0.012359619140625,
0.0144500732421875,
0.01074981689453125,
0.0294189453125,
0.045135498046875,
-0.0657958984375,
0.037750244140625,
-0.045684814453125,
-0.01369476318359375,
-0.021575927734375,
-0.0355224609375,
-0.07855224609375,
-0.06109619140625,
-0.042449951171875,
-0.05242919921875,
-0.01374053955078125,
0.0626220703125,
0.07171630859375,
-0.042236328125,
-0.006618499755859375,
-0.01346588134765625,
0.0020732879638671875,
-0.0234222412109375,
-0.01824951171875,
0.038848876953125,
-0.0100860595703125,
-0.05938720703125,
-0.0244903564453125,
-0.0005517005920410156,
0.038360595703125,
-0.01358795166015625,
-0.01087188720703125,
-0.0096893310546875,
-0.0250701904296875,
0.015716552734375,
0.022979736328125,
-0.052581787109375,
-0.0177764892578125,
-0.004199981689453125,
-0.003875732421875,
0.037872314453125,
0.0290985107421875,
-0.0555419921875,
0.0419921875,
0.044586181640625,
0.0255126953125,
0.06390380859375,
-0.0125885009765625,
0.01042938232421875,
-0.063232421875,
0.0450439453125,
-0.002193450927734375,
0.040740966796875,
0.041107177734375,
-0.02056884765625,
0.0440673828125,
0.04693603515625,
-0.03387451171875,
-0.0643310546875,
-0.0003821849822998047,
-0.08123779296875,
0.0108795166015625,
0.07281494140625,
-0.018463134765625,
-0.0374755859375,
0.0279083251953125,
-0.0164337158203125,
0.053955078125,
-0.0067596435546875,
0.035003662109375,
0.0146942138671875,
0.00624847412109375,
-0.04730224609375,
-0.034912109375,
0.03778076171875,
0.00972747802734375,
-0.0384521484375,
-0.0282440185546875,
0.0050201416015625,
0.04180908203125,
0.027435302734375,
0.023193359375,
-0.01261138916015625,
0.01410675048828125,
0.0034313201904296875,
0.0426025390625,
-0.027374267578125,
-0.01055908203125,
-0.0296478271484375,
-0.01351165771484375,
-0.0052490234375,
-0.04669189453125
]
] |
ai-forever/ruclip-vit-base-patch32-384 | 2022-01-10T00:21:50.000Z | [
"transformers",
"pytorch",
"endpoints_compatible",
"region:us"
] | null | ai-forever | null | null | ai-forever/ruclip-vit-base-patch32-384 | 3 | 48,709 | transformers | 2022-03-02T23:29:05 | # ruclip-vit-base-patch32-384
**RuCLIP** (**Ru**ssian **C**ontrastive **L**anguage–**I**mage **P**retraining) is a multimodal model
for computing similarities between images and texts, and for ranking captions and pictures accordingly.
RuCLIP builds on a large body of work on zero-shot transfer, computer vision, natural language processing and
multimodal learning.
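The "similarity" that CLIP-style models compute is simply a dot product between L2-normalized image and text embeddings; captions are ranked by that score. A toy illustration with made-up 3-d vectors (the real model uses 512-d embeddings):

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cosine(a, b):
    # dot product of unit vectors = cosine similarity
    return sum(x * y for x, y in zip(normalize(a), normalize(b)))

image = [0.9, 0.1, 0.2]                      # toy image embedding
captions = {
    "кошка на диване": [0.8, 0.2, 0.1],      # close to the image vector
    "ночной город":    [0.1, 0.9, 0.3],      # far from it
}
ranked = sorted(captions, key=lambda c: cosine(image, captions[c]), reverse=True)
print(ranked[0])  # the caption whose embedding best matches the image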
Model was trained by [Sber AI](https://github.com/sberbank-ai) and [SberDevices](https://sberdevices.ru/) teams.
* Task: `text ranking`; `image ranking`; `zero-shot image classification`;
* Type: `encoder`
* Num Parameters: `150M`
* Training Data Volume: `240 million text-image pairs`
* Language: `Russian`
* Context Length: `77`
* Transformer Layers: `12`
* Transformer Width: `512`
* Transformer Heads: `8`
* Image Size: `384`
* Vision Layers: `12`
* Vision Width: `768`
* Vision Patch Size: `32`
## Usage [Github](https://github.com/sberbank-ai/ru-clip)
```bash
pip install ruclip
```
```python
import ruclip

clip, processor = ruclip.load("ruclip-vit-base-patch32-384", device="cuda")
```
## Performance
We have evaluated the performance on the following datasets:
| Dataset | Metric Name | Metric Result |
|:--------------|:---------------|:----------------------------|
| Food101 | acc | 0.642 |
| CIFAR10 | acc | 0.862 |
| CIFAR100 | acc | 0.529 |
| Birdsnap | acc | 0.161 |
| SUN397 | acc | 0.510 |
| Stanford Cars | acc | 0.572 |
| DTD | acc | 0.390 |
| MNIST | acc | 0.404 |
| STL10 | acc | 0.946 |
| PCam | acc | 0.506 |
| CLEVR | acc | 0.188 |
| Rendered SST2 | acc | 0.508 |
| ImageNet | acc | 0.451 |
| FGVC Aircraft | mean-per-class | 0.053 |
| Oxford Pets | mean-per-class | 0.587 |
| Caltech101 | mean-per-class | 0.834 |
| Flowers102 | mean-per-class | 0.449 |
| HatefulMemes | roc-auc | 0.537 |
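For a quick single-number summary of the table above, the accuracy rows can be averaged (this aggregate is not reported by the authors; it only condenses the figures already listed):

```python
acc = {
    "Food101": 0.642, "CIFAR10": 0.862, "CIFAR100": 0.529, "Birdsnap": 0.161,
    "SUN397": 0.510, "Stanford Cars": 0.572, "DTD": 0.390, "MNIST": 0.404,
    "STL10": 0.946, "PCam": 0.506, "CLEVR": 0.188, "Rendered SST2": 0.508,
    "ImageNet": 0.451,
}
mean_acc = sum(acc.values()) / len(acc)
print(f"{mean_acc:.3f}")  # mean over the 13 acc-metric datasets
```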
# Authors
+ Alex Shonenkov: [Github](https://github.com/shonenkov), [Kaggle GM](https://www.kaggle.com/shonenkov)
+ Daniil Chesakov: [Github](https://github.com/Danyache)
+ Denis Dimitrov: [Github](https://github.com/denndimitrov)
+ Igor Pavlov: [Github](https://github.com/boomb0om)
| 2,695 | [
[
-0.039581298828125,
-0.03668212890625,
0.0211639404296875,
0.0162200927734375,
-0.013671875,
0.0187835693359375,
-0.01039886474609375,
-0.0265350341796875,
0.0323486328125,
0.00710296630859375,
-0.045745849609375,
-0.052093505859375,
-0.049560546875,
-0.0161590576171875,
-0.01727294921875,
0.0701904296875,
0.0025463104248046875,
0.00662994384765625,
-0.01386260986328125,
-0.024444580078125,
-0.047027587890625,
-0.0166015625,
-0.01187896728515625,
-0.02520751953125,
0.0282745361328125,
0.053131103515625,
0.053192138671875,
0.052001953125,
0.0701904296875,
0.0204315185546875,
0.0043487548828125,
-0.01262664794921875,
-0.0170745849609375,
-0.0041656494140625,
-0.0033664703369140625,
-0.0276031494140625,
-0.0367431640625,
-0.00853729248046875,
0.05218505859375,
0.011932373046875,
0.002361297607421875,
0.039093017578125,
-0.01666259765625,
0.0703125,
-0.06231689453125,
-0.01058197021484375,
-0.01947021484375,
0.01511383056640625,
-0.01959228515625,
-0.030242919921875,
-0.0194091796875,
-0.013458251953125,
0.005741119384765625,
-0.05584716796875,
0.0205535888671875,
-0.001155853271484375,
0.101806640625,
0.01256561279296875,
-0.0018053054809570312,
0.0054168701171875,
-0.037841796875,
0.0635986328125,
-0.0361328125,
0.028533935546875,
0.0201263427734375,
0.031494140625,
0.00954437255859375,
-0.061126708984375,
-0.0271759033203125,
0.0041656494140625,
-0.00222015380859375,
0.029998779296875,
-0.021514892578125,
-0.0261993408203125,
0.0279998779296875,
0.0232391357421875,
-0.060455322265625,
0.0100860595703125,
-0.036590576171875,
-0.00946044921875,
0.03509521484375,
0.0181732177734375,
0.0164337158203125,
-0.0266571044921875,
-0.044158935546875,
-0.04266357421875,
-0.0240325927734375,
0.02362060546875,
0.0217742919921875,
-0.00447845458984375,
-0.0296783447265625,
0.037322998046875,
-0.0299072265625,
0.041229248046875,
-0.0017242431640625,
-0.0287322998046875,
0.0445556640625,
-0.03228759765625,
-0.01328277587890625,
-0.029815673828125,
0.078369140625,
0.05487060546875,
-0.00103759765625,
0.014312744140625,
0.0038928985595703125,
0.0263824462890625,
-0.0045623779296875,
-0.06463623046875,
-0.0394287109375,
0.020050048828125,
-0.02105712890625,
-0.0264739990234375,
0.0164337158203125,
-0.09503173828125,
0.0125885009765625,
0.0007338523864746094,
0.042266845703125,
-0.05181884765625,
-0.0117645263671875,
-0.002964019775390625,
-0.0428466796875,
0.043121337890625,
0.0213775634765625,
-0.05072021484375,
0.0139617919921875,
0.041595458984375,
0.07464599609375,
-0.0094146728515625,
-0.039459228515625,
-0.037872314453125,
0.002490997314453125,
-0.00746917724609375,
0.07379150390625,
-0.04083251953125,
-0.0164031982421875,
-0.01837158203125,
0.0094451904296875,
-0.00740814208984375,
-0.00826263427734375,
0.0338134765625,
-0.01004791259765625,
0.01275634765625,
-0.0174713134765625,
-0.00775909423828125,
-0.01194000244140625,
0.0255889892578125,
-0.039031982421875,
0.07330322265625,
0.010498046875,
-0.0635986328125,
0.04736328125,
-0.056640625,
-0.0203399658203125,
0.01580810546875,
-0.006317138671875,
-0.06427001953125,
-0.01995849609375,
0.0280914306640625,
0.0299835205078125,
-0.0181884765625,
0.0075531005859375,
-0.0306396484375,
-0.0266876220703125,
0.03021240234375,
-0.016021728515625,
0.058258056640625,
0.0215606689453125,
-0.031982421875,
0.020843505859375,
-0.054840087890625,
0.024383544921875,
0.01403045654296875,
-0.0215606689453125,
-0.01006317138671875,
-0.017181396484375,
0.019256591796875,
0.01078033447265625,
0.0009822845458984375,
-0.02685546875,
0.027435302734375,
-0.0021839141845703125,
0.0390625,
0.0484619140625,
0.0030231475830078125,
0.0382080078125,
-0.037689208984375,
0.058013916015625,
0.0012044906616210938,
0.0350341796875,
-0.025115966796875,
-0.032928466796875,
-0.052093505859375,
-0.06072998046875,
0.03765869140625,
0.0494384765625,
-0.041229248046875,
0.017547607421875,
-0.046844482421875,
-0.05682373046875,
-0.053802490234375,
-0.007495880126953125,
0.050384521484375,
0.01605224609375,
0.034759521484375,
-0.029144287109375,
-0.02984619140625,
-0.0848388671875,
-0.00836181640625,
0.006229400634765625,
0.0223846435546875,
0.018341064453125,
0.065673828125,
-0.004108428955078125,
0.05059814453125,
-0.06341552734375,
-0.029937744140625,
-0.02117919921875,
0.01128387451171875,
0.000278472900390625,
0.0391845703125,
0.07073974609375,
-0.05108642578125,
-0.06048583984375,
-0.002593994140625,
-0.04150390625,
0.0059967041015625,
0.0012216567993164062,
-0.00629425048828125,
0.02593994140625,
0.00775909423828125,
-0.042327880859375,
0.050567626953125,
0.043701171875,
-0.034393310546875,
0.052642822265625,
-0.01403045654296875,
0.03173828125,
-0.07659912109375,
0.04364013671875,
0.00878143310546875,
-0.0252532958984375,
-0.036865234375,
0.0205535888671875,
0.0078582763671875,
-0.017425537109375,
-0.035797119140625,
0.025634765625,
-0.047698974609375,
-0.01473236083984375,
0.00417327880859375,
-0.0009794235229492188,
0.007221221923828125,
0.05181884765625,
-0.0006036758422851562,
0.06939697265625,
0.0604248046875,
-0.02862548828125,
0.01910400390625,
0.0215606689453125,
-0.041015625,
0.049346923828125,
-0.045654296875,
-0.0122528076171875,
0.006381988525390625,
0.0193023681640625,
-0.0791015625,
-0.0386962890625,
0.03497314453125,
-0.049896240234375,
0.01384735107421875,
-0.036529541015625,
-0.03704833984375,
-0.0291900634765625,
-0.051025390625,
0.028472900390625,
0.040679931640625,
-0.033447265625,
0.023651123046875,
0.0369873046875,
0.0012006759643554688,
-0.061065673828125,
-0.0626220703125,
-0.0007410049438476562,
-0.0091094970703125,
-0.041595458984375,
0.030242919921875,
-0.010833740234375,
0.0050201416015625,
0.0131378173828125,
-0.0007991790771484375,
-0.0209503173828125,
-0.00960540771484375,
0.0301513671875,
0.0282745361328125,
0.0012359619140625,
-0.0211639404296875,
-0.012664794921875,
-0.0242767333984375,
-0.0185546875,
0.01468658447265625,
0.048828125,
-0.027587890625,
-0.0209808349609375,
-0.054351806640625,
0.0164031982421875,
0.06854248046875,
-0.035369873046875,
0.038299560546875,
0.059417724609375,
0.0107269287109375,
-0.0029201507568359375,
-0.031646728515625,
0.002288818359375,
-0.032623291015625,
0.0305938720703125,
-0.0247650146484375,
-0.04376220703125,
0.051361083984375,
0.00455474853515625,
0.00565338134765625,
0.040679931640625,
0.0162506103515625,
-0.01242828369140625,
0.05828857421875,
0.034332275390625,
-0.008087158203125,
0.047149658203125,
-0.06427001953125,
-0.01021575927734375,
-0.0567626953125,
-0.027435302734375,
-0.0321044921875,
-0.043212890625,
-0.04449462890625,
-0.0355224609375,
0.0272674560546875,
-0.00981903076171875,
0.003963470458984375,
0.0430908203125,
-0.07025146484375,
0.040924072265625,
0.045654296875,
0.008270263671875,
0.00743865966796875,
0.007495880126953125,
-0.0081787109375,
-0.01544952392578125,
-0.061553955078125,
-0.001911163330078125,
0.07861328125,
0.01178741455078125,
0.06390380859375,
0.0034465789794921875,
0.0266571044921875,
0.0115966796875,
-0.004421234130859375,
-0.05108642578125,
0.0557861328125,
-0.012237548828125,
-0.049530029296875,
-0.0022735595703125,
-0.0286865234375,
-0.06146240234375,
0.02716064453125,
-0.043792724609375,
-0.0416259765625,
0.0253753662109375,
0.021728515625,
-0.00855255126953125,
0.03656005859375,
-0.04925537109375,
0.072265625,
-0.01959228515625,
-0.039642333984375,
-0.0233001708984375,
-0.06744384765625,
0.0294189453125,
0.01061248779296875,
0.016510009765625,
-0.00743865966796875,
0.01163482666015625,
0.0579833984375,
-0.0294952392578125,
0.050506591796875,
-0.0134735107421875,
0.035003662109375,
0.017547607421875,
0.000007987022399902344,
0.023773193359375,
0.00472259521484375,
0.019439697265625,
0.026519775390625,
0.0202178955078125,
-0.0194091796875,
-0.03631591796875,
0.06719970703125,
-0.0638427734375,
-0.014404296875,
-0.06829833984375,
-0.032135009765625,
0.032470703125,
0.0157012939453125,
0.03607177734375,
0.048095703125,
-0.0015087127685546875,
0.034332275390625,
0.0260009765625,
-0.01227569580078125,
0.040435791015625,
0.0428466796875,
-0.019927978515625,
-0.07965087890625,
0.0848388671875,
0.01471710205078125,
0.02398681640625,
0.0308990478515625,
0.00577545166015625,
0.0010986328125,
-0.025634765625,
-0.033935546875,
0.0155181884765625,
-0.035369873046875,
-0.040435791015625,
-0.0305633544921875,
-0.0107269287109375,
-0.029815673828125,
-0.0170440673828125,
-0.045654296875,
-0.055206298828125,
-0.0115966796875,
0.01293182373046875,
0.0460205078125,
0.0201263427734375,
-0.01244354248046875,
0.0281524658203125,
-0.038543701171875,
0.021636962890625,
-0.002208709716796875,
0.0261688232421875,
-0.0162353515625,
-0.04522705078125,
-0.022369384765625,
-0.0075836181640625,
-0.048370361328125,
-0.052764892578125,
0.0421142578125,
0.0090789794921875,
0.03314208984375,
0.018524169921875,
-0.00930023193359375,
0.08544921875,
-0.0225677490234375,
0.0635986328125,
0.03082275390625,
-0.058349609375,
0.049713134765625,
-0.01910400390625,
0.04449462890625,
0.056182861328125,
0.02642822265625,
-0.031341552734375,
-0.0244293212890625,
-0.048095703125,
-0.0748291015625,
0.064697265625,
0.01103973388671875,
-0.031219482421875,
0.01428985595703125,
0.0150909423828125,
-0.00836181640625,
0.0017337799072265625,
-0.06243896484375,
-0.035430908203125,
-0.027587890625,
-0.0018014907836914062,
0.00951385498046875,
-0.0108642578125,
-0.01079559326171875,
-0.04248046875,
0.05108642578125,
-0.002307891845703125,
0.03326416015625,
0.0282135009765625,
-0.01194000244140625,
-0.0142974853515625,
0.0213470458984375,
0.06512451171875,
0.055145263671875,
-0.0216064453125,
0.00533294677734375,
0.006229400634765625,
-0.042449951171875,
-0.004978179931640625,
-0.0039043426513671875,
-0.036041259765625,
0.00635528564453125,
0.0276947021484375,
0.0635986328125,
0.0202484130859375,
-0.0350341796875,
0.057647705078125,
0.0019207000732421875,
-0.0455322265625,
-0.042572021484375,
0.006076812744140625,
-0.0223846435546875,
0.0172576904296875,
0.015960693359375,
0.03765869140625,
0.0006513595581054688,
-0.035888671875,
0.0137176513671875,
0.047149658203125,
-0.0269317626953125,
-0.0494384765625,
0.043365478515625,
0.004039764404296875,
-0.027984619140625,
0.030120849609375,
0.00756072998046875,
-0.0535888671875,
0.03826904296875,
0.02099609375,
0.0625,
-0.017181396484375,
0.03173828125,
0.056304931640625,
0.0183258056640625,
-0.005611419677734375,
0.0218353271484375,
0.002079010009765625,
-0.0218963623046875,
-0.0305633544921875,
-0.029815673828125,
-0.0302276611328125,
0.01338958740234375,
-0.06005859375,
0.03460693359375,
-0.04742431640625,
-0.03375244140625,
0.00571441650390625,
0.0029468536376953125,
-0.05645751953125,
-0.00035071372985839844,
-0.0217742919921875,
0.062164306640625,
-0.06463623046875,
0.0538330078125,
0.0682373046875,
-0.0408935546875,
-0.033843994140625,
-0.01416015625,
-0.00011867284774780273,
-0.038818359375,
0.043548583984375,
0.008026123046875,
-0.0092926025390625,
-0.01027679443359375,
-0.055999755859375,
-0.06787109375,
0.10638427734375,
0.01433563232421875,
-0.0305023193359375,
0.022216796875,
0.0008649826049804688,
0.02716064453125,
-0.016876220703125,
0.0222320556640625,
0.0174407958984375,
0.047210693359375,
0.0093536376953125,
-0.0653076171875,
0.0001811981201171875,
-0.0274200439453125,
-0.01800537109375,
0.0308990478515625,
-0.08013916015625,
0.05023193359375,
-0.0218353271484375,
-0.011444091796875,
0.0010395050048828125,
0.031646728515625,
0.02581787109375,
0.0200653076171875,
0.030975341796875,
0.068603515625,
0.0244140625,
-0.01404571533203125,
0.05987548828125,
-0.0305938720703125,
0.03515625,
0.06878662109375,
0.0099639892578125,
0.07427978515625,
0.0462646484375,
-0.01192474365234375,
0.046783447265625,
0.0223236083984375,
-0.037109375,
0.06707763671875,
-0.02362060546875,
0.0001417398452758789,
-0.01445770263671875,
0.010711669921875,
-0.032196044921875,
0.03326416015625,
0.0016326904296875,
-0.017059326171875,
-0.001598358154296875,
-0.0015468597412109375,
0.00708770751953125,
-0.0092315673828125,
0.002529144287109375,
0.050262451171875,
-0.0179901123046875,
-0.05224609375,
0.037384033203125,
-0.0006308555603027344,
0.044464111328125,
-0.06195068359375,
-0.01444244384765625,
-0.0042724609375,
0.030364990234375,
-0.0250091552734375,
-0.08135986328125,
0.013763427734375,
0.000690460205078125,
-0.0187530517578125,
-0.01380157470703125,
0.04693603515625,
-0.0172576904296875,
-0.03656005859375,
0.01531982421875,
0.0040283203125,
0.01617431640625,
0.028228759765625,
-0.057342529296875,
0.03143310546875,
0.00893402099609375,
-0.0171356201171875,
0.028839111328125,
0.01910400390625,
0.0122528076171875,
0.0640869140625,
0.0552978515625,
0.0137786865234375,
-0.0007982254028320312,
-0.015411376953125,
0.07989501953125,
-0.0550537109375,
-0.04058837890625,
-0.060516357421875,
0.049713134765625,
-0.0125274658203125,
-0.04376220703125,
0.056671142578125,
0.06512451171875,
0.05389404296875,
-0.022979736328125,
0.046417236328125,
-0.0220794677734375,
0.01995849609375,
-0.0265960693359375,
0.05487060546875,
-0.05712890625,
-0.0008730888366699219,
-0.044281005859375,
-0.053314208984375,
-0.0355224609375,
0.059967041015625,
-0.01959228515625,
-0.0148162841796875,
0.05108642578125,
0.056884765625,
-0.0199127197265625,
-0.0263824462890625,
0.0101165771484375,
-0.00843048095703125,
0.0304718017578125,
0.044647216796875,
0.04058837890625,
-0.053497314453125,
0.06890869140625,
-0.03173828125,
-0.0185546875,
-0.02166748046875,
-0.049041748046875,
-0.0606689453125,
-0.044036865234375,
-0.038177490234375,
-0.00324249267578125,
0.0098724365234375,
0.056365966796875,
0.058990478515625,
-0.043182373046875,
-0.0181884765625,
0.0115966796875,
-0.005939483642578125,
-0.03228759765625,
-0.0206146240234375,
0.0251617431640625,
-0.0002484321594238281,
-0.037322998046875,
0.0146636962890625,
0.0219573974609375,
-0.00978851318359375,
-0.012115478515625,
-0.00829315185546875,
-0.031402587890625,
-0.029052734375,
0.03070068359375,
0.035888671875,
-0.04547119140625,
-0.0287628173828125,
-0.006732940673828125,
0.00527191162109375,
0.0308990478515625,
0.02972412109375,
-0.0516357421875,
0.0228729248046875,
0.051025390625,
0.0149078369140625,
0.0626220703125,
0.0094146728515625,
0.01119232177734375,
-0.04498291015625,
0.020416259765625,
-0.0144195556640625,
0.02294921875,
0.0252685546875,
-0.0001735687255859375,
0.045745849609375,
0.04290771484375,
-0.057769775390625,
-0.0753173828125,
0.005279541015625,
-0.09942626953125,
-0.01186370849609375,
0.0750732421875,
-0.00394439697265625,
-0.03759765625,
0.0255126953125,
-0.012725830078125,
-0.0024166107177734375,
-0.0472412109375,
0.026947021484375,
0.03802490234375,
0.005435943603515625,
-0.019927978515625,
-0.0477294921875,
0.0217132568359375,
0.0170135498046875,
-0.049102783203125,
-0.0005521774291992188,
0.040740966796875,
0.05987548828125,
0.026947021484375,
0.047760009765625,
-0.01959228515625,
0.02862548828125,
0.0025463104248046875,
0.035614013671875,
-0.021697998046875,
-0.0479736328125,
-0.0181732177734375,
0.01389312744140625,
-0.018402099609375,
-0.037811279296875
]
] |
mdhugol/indonesia-bert-sentiment-classification | 2021-09-14T08:24:28.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us",
"has_space"
] | text-classification | mdhugol | null | null | mdhugol/indonesia-bert-sentiment-classification | 12 | 48,650 | transformers | 2022-03-02T23:29:05 | Indonesian BERT Base Sentiment Classifier is a sentiment text-classification model. It is based on the pre-trained [IndoBERT Base Model (phase1 - uncased)](https://huggingface.co/indobenchmark/indobert-base-p1), fine-tuned on the [Prosa sentiment dataset](https://github.com/indobenchmark/indonlu/tree/master/dataset/smsa_doc-sentiment-prosa).
## How to Use
### As Text Classifier
```python
from transformers import pipeline
from transformers import AutoTokenizer, AutoModelForSequenceClassification
pretrained = "mdhugol/indonesia-bert-sentiment-classification"
model = AutoModelForSequenceClassification.from_pretrained(pretrained)
tokenizer = AutoTokenizer.from_pretrained(pretrained)
sentiment_analysis = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer)
label_index = {'LABEL_0': 'positive', 'LABEL_1': 'neutral', 'LABEL_2': 'negative'}
pos_text = "Sangat bahagia hari ini"
neg_text = "Dasar anak sialan!! Kurang ajar!!"
result = sentiment_analysis(pos_text)
status = label_index[result[0]['label']]
score = result[0]['score']
print(f'Text: {pos_text} | Label : {status} ({score * 100:.3f}%)')
result = sentiment_analysis(neg_text)
status = label_index[result[0]['label']]
score = result[0]['score']
print(f'Text: {neg_text} | Label : {status} ({score * 100:.3f}%)')
``` | 1,299 | [
[
-0.033843994140625,
-0.033599853515625,
-0.01013946533203125,
0.040313720703125,
-0.03936767578125,
-0.0144195556640625,
-0.016082763671875,
-0.00281524658203125,
0.01251220703125,
0.024658203125,
-0.0269317626953125,
-0.0413818359375,
-0.0498046875,
0.0018663406372070312,
-0.006031036376953125,
0.11248779296875,
0.00968170166015625,
0.00724029541015625,
0.006793975830078125,
-0.01435089111328125,
-0.01128387451171875,
-0.052032470703125,
-0.03302001953125,
-0.0229339599609375,
0.0203094482421875,
0.0273895263671875,
0.0276336669921875,
0.00012540817260742188,
0.0279388427734375,
0.022308349609375,
-0.005321502685546875,
-0.02142333984375,
-0.01058197021484375,
0.012481689453125,
0.01107025146484375,
-0.05267333984375,
-0.0241241455078125,
0.0005903244018554688,
0.03375244140625,
0.034942626953125,
0.01824951171875,
0.01294708251953125,
0.0099334716796875,
0.04638671875,
-0.049560546875,
0.0423583984375,
-0.0360107421875,
0.004520416259765625,
-0.00824737548828125,
0.007228851318359375,
-0.0423583984375,
-0.040924072265625,
0.027069091796875,
-0.0213470458984375,
0.004558563232421875,
-0.01397705078125,
0.0931396484375,
0.0199432373046875,
-0.0237274169921875,
-0.034210205078125,
-0.0338134765625,
0.0672607421875,
-0.05712890625,
-0.00033092498779296875,
0.0180816650390625,
0.0175628662109375,
0.0084686279296875,
-0.035675048828125,
-0.050201416015625,
0.00798797607421875,
-0.003631591796875,
0.0196685791015625,
0.00554656982421875,
0.010406494140625,
0.01244354248046875,
0.050537109375,
-0.0240631103515625,
-0.02020263671875,
-0.025177001953125,
-0.00311279296875,
0.03070068359375,
-0.01300048828125,
-0.010528564453125,
-0.052764892578125,
-0.030792236328125,
-0.0259246826171875,
-0.01421356201171875,
0.033355712890625,
0.0206298828125,
0.035430908203125,
0.004638671875,
0.03387451171875,
0.005428314208984375,
0.048736572265625,
0.03033447265625,
-0.0184173583984375,
0.057769775390625,
0.007904052734375,
-0.0413818359375,
0.0238494873046875,
0.05810546875,
0.0233154296875,
0.0367431640625,
0.038848876953125,
-0.01078033447265625,
0.013824462890625,
0.007110595703125,
-0.0487060546875,
-0.032684326171875,
0.01313018798828125,
-0.06585693359375,
-0.04486083984375,
0.015472412109375,
-0.03973388671875,
-0.002323150634765625,
-0.007678985595703125,
0.04833984375,
-0.033172607421875,
-0.05010986328125,
-0.021148681640625,
-0.01551055908203125,
0.050994873046875,
-0.006526947021484375,
-0.060028076171875,
0.0025310516357421875,
0.0399169921875,
0.053497314453125,
0.013031005859375,
-0.0032062530517578125,
0.0158843994140625,
-0.018524169921875,
-0.0179595947265625,
0.041046142578125,
-0.0298309326171875,
-0.048187255859375,
0.000705718994140625,
0.00707244873046875,
-0.0007977485656738281,
-0.0258941650390625,
0.06591796875,
-0.01486968994140625,
0.0460205078125,
-0.0141754150390625,
-0.046966552734375,
-0.034820556640625,
0.0214996337890625,
-0.01593017578125,
0.0916748046875,
0.0160369873046875,
-0.0821533203125,
0.0452880859375,
-0.051422119140625,
-0.044525146484375,
-0.0018024444580078125,
0.0140838623046875,
-0.043426513671875,
0.0084381103515625,
0.0166168212890625,
0.050750732421875,
0.01666259765625,
0.01262664794921875,
-0.0157012939453125,
-0.039093017578125,
-0.0074310302734375,
-0.0179443359375,
0.0926513671875,
0.0298614501953125,
-0.023040771484375,
0.0076141357421875,
-0.066650390625,
0.005825042724609375,
-0.00865936279296875,
-0.033966064453125,
-0.051971435546875,
-0.0084686279296875,
0.0198211669921875,
0.01194000244140625,
0.02935791015625,
-0.0548095703125,
0.0225830078125,
-0.04931640625,
0.01456451416015625,
0.062042236328125,
-0.0014801025390625,
0.01399993896484375,
-0.0173187255859375,
0.0164642333984375,
0.01187896728515625,
0.016448974609375,
-0.0215911865234375,
-0.049407958984375,
-0.07373046875,
-0.0430908203125,
0.028350830078125,
0.059478759765625,
-0.03173828125,
0.083740234375,
-0.010467529296875,
-0.062042236328125,
-0.0421142578125,
-0.007503509521484375,
0.00103759765625,
0.04766845703125,
0.017913818359375,
-0.01409149169921875,
-0.07086181640625,
-0.068359375,
-0.002971649169921875,
-0.041107177734375,
0.00089263916015625,
0.006427764892578125,
0.03802490234375,
-0.0281982421875,
0.08355712890625,
-0.020416259765625,
-0.033447265625,
-0.0218963623046875,
0.042572021484375,
0.07061767578125,
0.055450439453125,
0.047760009765625,
-0.0450439453125,
-0.051116943359375,
-0.0244598388671875,
-0.054779052734375,
-0.01006317138671875,
0.00662994384765625,
-0.0080718994140625,
0.03826904296875,
0.0003764629364013672,
-0.0467529296875,
0.03729248046875,
0.027191162109375,
-0.01338958740234375,
0.044708251953125,
-0.00983428955078125,
0.0003077983856201172,
-0.09185791015625,
0.0029296875,
-0.00472259521484375,
-0.003505706787109375,
-0.0273895263671875,
-0.01293182373046875,
-0.00119781494140625,
0.0014524459838867188,
-0.0292205810546875,
0.013214111328125,
-0.0024700164794921875,
0.00208282470703125,
-0.0264129638671875,
-0.054351806640625,
-0.01067352294921875,
0.0521240234375,
0.0125885009765625,
0.031463623046875,
0.05389404296875,
-0.049896240234375,
0.025665283203125,
0.0270538330078125,
-0.0217437744140625,
0.044097900390625,
-0.05072021484375,
-0.01554107666015625,
-0.0182342529296875,
0.03045654296875,
-0.10186767578125,
-0.005672454833984375,
0.031280517578125,
-0.044219970703125,
0.01922607421875,
-0.01274871826171875,
-0.03912353515625,
-0.0390625,
-0.03558349609375,
0.01279449462890625,
0.06982421875,
-0.04656982421875,
0.05340576171875,
0.0153961181640625,
-0.01293182373046875,
-0.0633544921875,
-0.064697265625,
-0.021026611328125,
-0.007232666015625,
-0.039093017578125,
0.0024852752685546875,
0.005889892578125,
0.0006422996520996094,
0.009735107421875,
-0.006061553955078125,
-0.0134124755859375,
-0.0096588134765625,
0.03753662109375,
0.02362060546875,
-0.0092620849609375,
0.01323699951171875,
0.020538330078125,
-0.007709503173828125,
0.02618408203125,
0.0008878707885742188,
0.04864501953125,
-0.035736083984375,
-0.0027313232421875,
-0.039459228515625,
0.005588531494140625,
0.03887939453125,
0.001659393310546875,
0.0531005859375,
0.04168701171875,
-0.0293731689453125,
0.0021457672119140625,
-0.0252532958984375,
0.00457000732421875,
-0.03143310546875,
0.01666259765625,
-0.0416259765625,
-0.0290679931640625,
0.03826904296875,
0.00035572052001953125,
0.01488494873046875,
0.05816650390625,
0.04608154296875,
-0.0252685546875,
0.0850830078125,
0.042327880859375,
-0.027496337890625,
0.0369873046875,
-0.0259857177734375,
0.024200439453125,
-0.039276123046875,
-0.01544952392578125,
-0.033905029296875,
-0.0174102783203125,
-0.05328369140625,
0.00994110107421875,
0.00829315185546875,
0.00103759765625,
-0.02911376953125,
0.01367950439453125,
-0.06390380859375,
0.01293182373046875,
0.051788330078125,
0.0050201416015625,
-0.015655517578125,
0.00957489013671875,
-0.0157012939453125,
-0.01467132568359375,
-0.0335693359375,
-0.038330078125,
0.0953369140625,
0.0423583984375,
0.0633544921875,
-0.0235595703125,
0.069091796875,
0.0268707275390625,
0.043670654296875,
-0.058319091796875,
0.037445068359375,
-0.0350341796875,
-0.05706787109375,
0.0120697021484375,
-0.00485992431640625,
-0.0460205078125,
0.02374267578125,
0.0006470680236816406,
-0.023956298828125,
0.029296875,
0.001251220703125,
-0.010345458984375,
0.0209503173828125,
-0.051483154296875,
0.0618896484375,
-0.0133209228515625,
0.00519561767578125,
0.0023632049560546875,
-0.0552978515625,
0.031402587890625,
0.018096923828125,
0.0092010498046875,
-0.0009212493896484375,
0.017181396484375,
0.06744384765625,
-0.038726806640625,
0.0753173828125,
-0.0567626953125,
0.00008732080459594727,
0.0321044921875,
-0.0147552490234375,
0.004558563232421875,
-0.004619598388671875,
-0.004741668701171875,
0.0213775634765625,
-0.0026111602783203125,
-0.034332275390625,
-0.01641845703125,
0.050323486328125,
-0.07720947265625,
-0.0206451416015625,
-0.07373046875,
-0.00946044921875,
0.00383758544921875,
0.01812744140625,
0.0341796875,
0.00814056396484375,
0.0037517547607421875,
0.0258941650390625,
0.05072021484375,
-0.01439666748046875,
0.02880859375,
0.00911712646484375,
-0.0162200927734375,
-0.037445068359375,
0.0657958984375,
-0.0081634521484375,
-0.01287078857421875,
0.0248260498046875,
0.00974273681640625,
-0.02703857421875,
0.0007295608520507812,
-0.003936767578125,
0.0001595020294189453,
-0.06964111328125,
-0.01259613037109375,
-0.04559326171875,
-0.033294677734375,
-0.0377197265625,
-0.009979248046875,
-0.01305389404296875,
-0.032318115234375,
-0.0204315185546875,
-0.00787353515625,
0.017791748046875,
0.0355224609375,
-0.0215606689453125,
0.0302581787109375,
-0.0389404296875,
0.0096588134765625,
0.026824951171875,
0.0196075439453125,
-0.0088348388671875,
-0.033660888671875,
0.0165252685546875,
-0.00786590576171875,
-0.01056671142578125,
-0.06781005859375,
0.042205810546875,
0.018829345703125,
0.035919189453125,
0.04205322265625,
0.0055999755859375,
0.03289794921875,
-0.00035190582275390625,
0.064208984375,
0.0005474090576171875,
-0.07659912109375,
0.06390380859375,
-0.0028095245361328125,
0.0193634033203125,
0.042755126953125,
0.042938232421875,
-0.0196685791015625,
-0.024658203125,
-0.037322998046875,
-0.083740234375,
0.051422119140625,
0.007701873779296875,
-0.0012493133544921875,
-0.00833892822265625,
0.03216552734375,
0.035430908203125,
0.0190277099609375,
-0.0799560546875,
-0.0316162109375,
-0.07025146484375,
-0.052276611328125,
0.007080078125,
-0.036407470703125,
0.0148773193359375,
-0.04486083984375,
0.0771484375,
0.006435394287109375,
0.03302001953125,
0.0291290283203125,
-0.0173187255859375,
-0.0081787109375,
0.017822265625,
0.02093505859375,
0.0217437744140625,
-0.0540771484375,
-0.006786346435546875,
0.00882720947265625,
-0.0288848876953125,
0.00873565673828125,
0.0036602020263671875,
-0.01593017578125,
0.0192718505859375,
0.0211334228515625,
0.059844970703125,
-0.00983428955078125,
-0.0266571044921875,
0.04473876953125,
-0.01374053955078125,
-0.0187835693359375,
-0.0626220703125,
-0.0011148452758789062,
-0.02593994140625,
-0.0108795166015625,
0.05755615234375,
0.0282440185546875,
0.004573822021484375,
-0.0310821533203125,
0.0072784423828125,
0.01534271240234375,
-0.0302886962890625,
-0.006145477294921875,
0.0296478271484375,
0.016815185546875,
-0.0081329345703125,
0.04583740234375,
-0.0168609619140625,
-0.06768798828125,
0.041107177734375,
0.0224761962890625,
0.0648193359375,
-0.0188140869140625,
0.0133514404296875,
0.041839599609375,
0.020233154296875,
0.0078277587890625,
0.043548583984375,
-0.016265869140625,
-0.043182373046875,
-0.03228759765625,
-0.05816650390625,
-0.03436279296875,
0.0125885009765625,
-0.050689697265625,
0.00891876220703125,
-0.055999755859375,
-0.0136566162109375,
-0.009735107421875,
0.013214111328125,
-0.033538818359375,
0.03363037109375,
0.0132904052734375,
0.0701904296875,
-0.07879638671875,
0.06719970703125,
0.08245849609375,
-0.0330810546875,
-0.0430908203125,
-0.002895355224609375,
-0.028106689453125,
-0.047027587890625,
0.07879638671875,
0.0521240234375,
0.0145263671875,
-0.013214111328125,
-0.0404052734375,
-0.034454345703125,
0.046722412109375,
-0.0235595703125,
-0.0272674560546875,
0.01181793212890625,
-0.0021724700927734375,
0.059478759765625,
-0.0301361083984375,
0.0257720947265625,
0.034637451171875,
0.04034423828125,
-0.015411376953125,
-0.050445556640625,
-0.00592041015625,
-0.03973388671875,
0.0013751983642578125,
0.0035533905029296875,
-0.04632568359375,
0.08941650390625,
0.0141448974609375,
0.01103973388671875,
0.006595611572265625,
0.057159423828125,
0.018646240234375,
0.00769805908203125,
0.06884765625,
0.050628662109375,
0.031341552734375,
-0.03021240234375,
0.05938720703125,
-0.0172271728515625,
0.0416259765625,
0.034423828125,
-0.01386260986328125,
0.06634521484375,
0.0338134765625,
-0.01355743408203125,
0.077392578125,
0.0531005859375,
-0.016876220703125,
0.0601806640625,
0.005268096923828125,
-0.017730712890625,
-0.01128387451171875,
0.01369476318359375,
-0.037750244140625,
0.043304443359375,
0.041107177734375,
-0.0310211181640625,
-0.009735107421875,
0.015289306640625,
0.003505706787109375,
-0.017578125,
-0.0288848876953125,
0.0523681640625,
-0.0191802978515625,
-0.05078125,
0.048492431640625,
-0.001220703125,
0.08734130859375,
-0.048248291015625,
-0.0013761520385742188,
-0.0022830963134765625,
0.0469970703125,
-0.0217132568359375,
-0.06768798828125,
0.0259857177734375,
-0.0009813308715820312,
-0.020904541015625,
-0.004367828369140625,
0.0687255859375,
-0.02008056640625,
-0.04644775390625,
0.00511932373046875,
-0.0066986083984375,
0.0170440673828125,
-0.000030279159545898438,
-0.0626220703125,
0.009613037109375,
0.0009169578552246094,
-0.0272369384765625,
0.0005159378051757812,
0.020111083984375,
0.013336181640625,
0.03411865234375,
0.04083251953125,
-0.004764556884765625,
0.007686614990234375,
-0.00785064697265625,
0.05657958984375,
-0.04864501953125,
-0.058258056640625,
-0.055267333984375,
0.0240936279296875,
-0.0147705078125,
-0.039093017578125,
0.043182373046875,
0.035980224609375,
0.055572509765625,
-0.0269622802734375,
0.084716796875,
-0.03546142578125,
0.05572509765625,
-0.004970550537109375,
0.046630859375,
-0.0277252197265625,
-0.00878143310546875,
-0.0176544189453125,
-0.065185546875,
-0.017852783203125,
0.096923828125,
-0.0175933837890625,
0.0072479248046875,
0.025115966796875,
0.028350830078125,
0.0123291015625,
0.007080078125,
-0.017059326171875,
0.031982421875,
0.0237579345703125,
0.038360595703125,
0.046600341796875,
-0.03759765625,
0.049163818359375,
-0.044952392578125,
-0.041534423828125,
-0.02679443359375,
-0.055511474609375,
-0.0887451171875,
-0.03509521484375,
-0.0175323486328125,
-0.02606201171875,
-0.0007443428039550781,
0.0736083984375,
0.054473876953125,
-0.08819580078125,
-0.0157470703125,
-0.0138092041015625,
-0.0009984970092773438,
-0.01204681396484375,
-0.0323486328125,
0.038299560546875,
-0.03509521484375,
-0.05609130859375,
-0.00998687744140625,
0.0019779205322265625,
-0.00421905517578125,
-0.026824951171875,
0.001979827880859375,
-0.025299072265625,
-0.006931304931640625,
0.042022705078125,
0.0183563232421875,
-0.058685302734375,
-0.017730712890625,
-0.0240631103515625,
-0.01276397705078125,
0.0153350830078125,
0.0189056396484375,
-0.06512451171875,
0.031707763671875,
0.0589599609375,
0.0341796875,
0.0217437744140625,
-0.00476837158203125,
0.021881103515625,
-0.0716552734375,
0.020904541015625,
0.01425933837890625,
0.030548095703125,
0.0271148681640625,
-0.01503753662109375,
0.02392578125,
0.0284423828125,
-0.04669189453125,
-0.05853271484375,
-0.00562286376953125,
-0.08001708984375,
-0.020111083984375,
0.055023193359375,
-0.0238494873046875,
-0.037384033203125,
-0.0003857612609863281,
-0.0555419921875,
0.0284576416015625,
-0.0338134765625,
0.0491943359375,
0.057342529296875,
-0.012451171875,
-0.009002685546875,
0.012969970703125,
0.0272369384765625,
0.043792724609375,
-0.038116455078125,
-0.0294647216796875,
0.003490447998046875,
0.03070068359375,
0.030487060546875,
0.042449951171875,
-0.0031452178955078125,
0.0218658447265625,
-0.00838470458984375,
0.0323486328125,
0.0006656646728515625,
-0.0012941360473632812,
-0.0224151611328125,
0.0220794677734375,
-0.01100921630859375,
-0.04931640625
]
] |
EleutherAI/pythia-70m | 2023-07-09T16:07:12.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:EleutherAI/pile",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-70m | 17 | 48,596 | transformers | 2023-02-13T14:54:51 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/pile
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-70M
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
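As a sanity check, the branch names implied by this schedule can be enumerated in a few lines of Python (an illustrative reconstruction; the authoritative list is the set of branches on each model's Hub repository):

```python
# Reconstruct the 154 checkpoint branch names: step0, ten log-spaced early
# checkpoints, then one checkpoint every 1000 steps up to step143000.
log_spaced = [0] + [2 ** i for i in range(10)]      # 0, 1, 2, 4, ..., 512
evenly_spaced = list(range(1000, 143001, 1000))     # 1000, 2000, ..., 143000
branches = [f"step{s}" for s in log_spaced + evenly_spaced]

print(len(branches))   # 154
print(branches[-1])    # step143000
```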
You may also further fine-tune and adapt Pythia-70M for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-70M as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-70M has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-70M will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on Pythia-70M to
produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-70M may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-70M.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
## Training
### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-70M.
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for 143000 steps at a batch size
of 2M (2,097,152 tokens).<br>
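These figures are mutually consistent, as a quick arithmetic check shows (all numbers taken directly from this section):

```python
tokens_per_step = 2_097_152          # 2M batch size, in tokens
total_steps = 143_000
checkpoint_interval_steps = 1_000    # one saved checkpoint per 1000 steps

total_tokens = tokens_per_step * total_steps
tokens_between_checkpoints = tokens_per_step * checkpoint_interval_steps

print(total_tokens)                  # 299892736000
print(tokens_between_checkpoints)    # 2097152000
```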
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as
[GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
that decayed to a minimum of 10% of the starting LR, but the 6.9B and 12B
models used an LR schedule that decayed to a minimum LR of 0. In the redone
training runs, we rectified this inconsistency: all models are now trained
with an LR that decays to a minimum of 0.1× their maximum LR.
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
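The published totals are consistent with adding untied input and output embedding matrices on top of the non-embedding counts, assuming the GPT-NeoX-20B tokenizer's vocabulary size of 50,304 (an assumption of this sketch, not stated in the table below):

```python
VOCAB_SIZE = 50_304  # GPT-NeoX-20B tokenizer vocabulary (assumed; see above)

def total_params(non_embedding_params: int, model_dim: int) -> int:
    # Untied input embedding plus output unembedding: 2 * vocab * d_model.
    return non_embedding_params + 2 * VOCAB_SIZE * model_dim

print(total_params(18_915_328, 512))    # 70426624   -> Pythia-70M
print(total_params(85_056_000, 768))    # 162322944  -> Pythia-160M
print(total_params(302_311_424, 1024))  # 405334016  -> Pythia-410M
```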
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 13,560 | [
[
-0.0249481201171875,
-0.059234619140625,
0.0249176025390625,
0.0031414031982421875,
-0.01788330078125,
-0.0155792236328125,
-0.017822265625,
-0.03253173828125,
0.01425933837890625,
0.013336181640625,
-0.025665283203125,
-0.0236053466796875,
-0.03179931640625,
-0.003887176513671875,
-0.035003662109375,
0.0850830078125,
-0.00978851318359375,
-0.01049041748046875,
0.0078887939453125,
-0.00514984130859375,
-0.0028896331787109375,
-0.041168212890625,
-0.03240966796875,
-0.03021240234375,
0.04644775390625,
0.01200103759765625,
0.0657958984375,
0.04425048828125,
0.011962890625,
0.02239990234375,
-0.02801513671875,
-0.004673004150390625,
-0.0113372802734375,
-0.00725555419921875,
0.0002849102020263672,
-0.0218353271484375,
-0.054962158203125,
-0.000690460205078125,
0.051239013671875,
0.048736572265625,
-0.01297760009765625,
0.0191192626953125,
-0.001110076904296875,
0.02740478515625,
-0.03985595703125,
0.0031223297119140625,
-0.024322509765625,
-0.01306915283203125,
-0.006801605224609375,
0.0115814208984375,
-0.02813720703125,
-0.02459716796875,
0.035369873046875,
-0.04901123046875,
0.0206298828125,
0.0063323974609375,
0.090087890625,
-0.009368896484375,
-0.031768798828125,
-0.004772186279296875,
-0.05291748046875,
0.051513671875,
-0.0548095703125,
0.024627685546875,
0.021026611328125,
0.01247406005859375,
-0.0026874542236328125,
-0.066650390625,
-0.040679931640625,
-0.017242431640625,
-0.008270263671875,
-0.0020885467529296875,
-0.045440673828125,
0.0012903213500976562,
0.0372314453125,
0.046905517578125,
-0.0626220703125,
-0.00514984130859375,
-0.0283203125,
-0.0266571044921875,
0.0261688232421875,
0.0047760009765625,
0.0341796875,
-0.023468017578125,
0.0005092620849609375,
-0.0289154052734375,
-0.04974365234375,
-0.0182342529296875,
0.041168212890625,
0.0047607421875,
-0.0278778076171875,
0.038421630859375,
-0.02996826171875,
0.044464111328125,
-0.005527496337890625,
0.0179595947265625,
0.033172607421875,
-0.01482391357421875,
-0.038360595703125,
-0.00638580322265625,
0.07073974609375,
0.0096588134765625,
0.01641845703125,
-0.0014495849609375,
-0.0041351318359375,
0.004573822021484375,
0.004215240478515625,
-0.08258056640625,
-0.05908203125,
0.018951416015625,
-0.028961181640625,
-0.0305633544921875,
-0.011962890625,
-0.07037353515625,
-0.01378631591796875,
-0.0141754150390625,
0.043243408203125,
-0.038421630859375,
-0.05511474609375,
-0.0105133056640625,
0.00024962425231933594,
0.016082763671875,
0.0274200439453125,
-0.06884765625,
0.029052734375,
0.033203125,
0.07781982421875,
0.0169219970703125,
-0.04266357421875,
-0.013092041015625,
-0.0179901123046875,
-0.00931549072265625,
0.0267486572265625,
-0.00907135009765625,
-0.0152587890625,
-0.006641387939453125,
0.01369476318359375,
-0.008544921875,
-0.0270843505859375,
0.0303955078125,
-0.0306854248046875,
0.01995849609375,
-0.0190887451171875,
-0.03167724609375,
-0.0287322998046875,
0.008392333984375,
-0.04608154296875,
0.06378173828125,
0.0175018310546875,
-0.0726318359375,
0.0159454345703125,
-0.0176849365234375,
-0.00447845458984375,
-0.00289154052734375,
0.015625,
-0.052703857421875,
0.002269744873046875,
0.025115966796875,
0.004772186279296875,
-0.0293731689453125,
0.015472412109375,
-0.0191802978515625,
-0.03289794921875,
0.013641357421875,
-0.03985595703125,
0.06787109375,
0.01520538330078125,
-0.05047607421875,
0.0216217041015625,
-0.043487548828125,
0.0162200927734375,
0.0190887451171875,
-0.0280303955078125,
0.0042572021484375,
-0.01491546630859375,
0.0282745361328125,
0.01580810546875,
0.01373291015625,
-0.0266571044921875,
0.01885986328125,
-0.03765869140625,
0.055938720703125,
0.054962158203125,
-0.007183074951171875,
0.0362548828125,
-0.0335693359375,
0.035980224609375,
0.0017919540405273438,
0.015533447265625,
-0.004215240478515625,
-0.0469970703125,
-0.07611083984375,
-0.01947021484375,
0.0279388427734375,
0.022064208984375,
-0.036773681640625,
0.032623291015625,
-0.019378662109375,
-0.06549072265625,
-0.013458251953125,
-0.006252288818359375,
0.031982421875,
0.0219573974609375,
0.031463623046875,
-0.01371002197265625,
-0.039398193359375,
-0.06640625,
-0.0157470703125,
-0.03289794921875,
0.00986480712890625,
0.0140380859375,
0.071044921875,
-0.009246826171875,
0.0438232421875,
-0.025634765625,
0.0181732177734375,
-0.027374267578125,
0.0121917724609375,
0.033538818359375,
0.04498291015625,
0.0287017822265625,
-0.0433349609375,
-0.03009033203125,
0.0017023086547851562,
-0.0445556640625,
0.007335662841796875,
0.004337310791015625,
-0.024505615234375,
0.023712158203125,
0.005619049072265625,
-0.07525634765625,
0.034698486328125,
0.048248291015625,
-0.03985595703125,
0.06134033203125,
-0.02606201171875,
-0.00014066696166992188,
-0.08062744140625,
0.0203399658203125,
0.01104736328125,
-0.0182647705078125,
-0.045379638671875,
0.005886077880859375,
0.0144805908203125,
-0.0164337158203125,
-0.03363037109375,
0.044036865234375,
-0.04248046875,
-0.01140594482421875,
-0.0169677734375,
0.004680633544921875,
-0.0032711029052734375,
0.04718017578125,
0.011444091796875,
0.042388916015625,
0.059539794921875,
-0.05755615234375,
0.03155517578125,
0.0146026611328125,
-0.019927978515625,
0.0284881591796875,
-0.06744384765625,
0.013702392578125,
0.0047607421875,
0.032745361328125,
-0.043060302734375,
-0.0274810791015625,
0.04058837890625,
-0.042205810546875,
0.011749267578125,
-0.033203125,
-0.040069580078125,
-0.032012939453125,
-0.01247406005859375,
0.0462646484375,
0.058319091796875,
-0.043243408203125,
0.0537109375,
0.004840850830078125,
0.0094146728515625,
-0.0280609130859375,
-0.040435791015625,
-0.0180816650390625,
-0.040985107421875,
-0.0509033203125,
0.0292816162109375,
0.012237548828125,
-0.01345062255859375,
0.0023097991943359375,
0.0008301734924316406,
0.008270263671875,
-0.005100250244140625,
0.0264129638671875,
0.0254364013671875,
-0.004093170166015625,
0.0012054443359375,
-0.0108184814453125,
-0.01049041748046875,
-0.00003653764724731445,
-0.038909912109375,
0.07330322265625,
-0.0227203369140625,
-0.0138397216796875,
-0.061492919921875,
-0.00012177228927612305,
0.06787109375,
-0.0311279296875,
0.06683349609375,
0.0462646484375,
-0.052581787109375,
0.011474609375,
-0.0281829833984375,
-0.0209808349609375,
-0.032928466796875,
0.051361083984375,
-0.0208587646484375,
-0.0271148681640625,
0.045501708984375,
0.020172119140625,
0.020904541015625,
0.044464111328125,
0.05596923828125,
0.0173492431640625,
0.0906982421875,
0.0343017578125,
-0.01284027099609375,
0.048187255859375,
-0.04010009765625,
0.0174407958984375,
-0.08319091796875,
-0.01383209228515625,
-0.039154052734375,
-0.0182037353515625,
-0.06982421875,
-0.0232391357421875,
0.025634765625,
0.016326904296875,
-0.056121826171875,
0.040985107421875,
-0.041717529296875,
0.00479888916015625,
0.0478515625,
0.0189361572265625,
0.013427734375,
0.01534271240234375,
0.00487518310546875,
-0.003627777099609375,
-0.051422119140625,
-0.0271453857421875,
0.09368896484375,
0.037872314453125,
0.044219970703125,
0.0218963623046875,
0.0538330078125,
-0.01053619384765625,
0.0198822021484375,
-0.05169677734375,
0.03173828125,
0.0246734619140625,
-0.054351806640625,
-0.0147705078125,
-0.058685302734375,
-0.0706787109375,
0.03741455078125,
0.005573272705078125,
-0.0828857421875,
0.016326904296875,
0.01715087890625,
-0.0281524658203125,
0.036346435546875,
-0.048004150390625,
0.07427978515625,
-0.0181121826171875,
-0.035064697265625,
-0.0260009765625,
-0.0233306884765625,
0.01837158203125,
0.02667236328125,
0.01030731201171875,
0.006439208984375,
0.022369384765625,
0.07598876953125,
-0.050994873046875,
0.04974365234375,
-0.009124755859375,
0.0119171142578125,
0.0254058837890625,
0.020599365234375,
0.05023193359375,
0.0108184814453125,
0.0092315673828125,
-0.002407073974609375,
0.01125335693359375,
-0.043487548828125,
-0.0274200439453125,
0.069580078125,
-0.08441162109375,
-0.02911376953125,
-0.06036376953125,
-0.04510498046875,
0.0081329345703125,
0.0147247314453125,
0.03240966796875,
0.048553466796875,
-0.00232696533203125,
0.0008673667907714844,
0.04498291015625,
-0.038421630859375,
0.0272979736328125,
0.015838623046875,
-0.035369873046875,
-0.039886474609375,
0.0753173828125,
0.002323150634765625,
0.0269012451171875,
0.0003170967102050781,
0.0164337158203125,
-0.030059814453125,
-0.0341796875,
-0.04632568359375,
0.0419921875,
-0.05499267578125,
0.000621795654296875,
-0.054901123046875,
-0.0020275115966796875,
-0.033843994140625,
0.00873565673828125,
-0.0309295654296875,
-0.02685546875,
-0.017791748046875,
0.0002682209014892578,
0.0428466796875,
0.0357666015625,
0.00841522216796875,
0.0264434814453125,
-0.041168212890625,
-0.004390716552734375,
0.0166778564453125,
0.005458831787109375,
0.0095977783203125,
-0.0682373046875,
-0.008880615234375,
0.01145172119140625,
-0.032440185546875,
-0.0859375,
0.039398193359375,
-0.0033740997314453125,
0.026458740234375,
0.005435943603515625,
-0.0176849365234375,
0.043975830078125,
-0.005817413330078125,
0.05108642578125,
0.01238250732421875,
-0.07733154296875,
0.040740966796875,
-0.037109375,
0.025299072265625,
0.0254974365234375,
0.026519775390625,
-0.05517578125,
-0.006824493408203125,
-0.07464599609375,
-0.08233642578125,
0.055694580078125,
0.036102294921875,
0.0144500732421875,
0.01042938232421875,
0.0310821533203125,
-0.034698486328125,
0.0122528076171875,
-0.07672119140625,
-0.022186279296875,
-0.01959228515625,
-0.006771087646484375,
0.01087188720703125,
-0.00234222412109375,
0.0035076141357421875,
-0.041229248046875,
0.07763671875,
0.0025920867919921875,
0.026947021484375,
0.0216522216796875,
-0.02996826171875,
-0.00850677490234375,
-0.003337860107421875,
0.01357269287109375,
0.058685302734375,
-0.01023101806640625,
0.005115509033203125,
0.0169219970703125,
-0.040283203125,
0.0028095245361328125,
0.01116943359375,
-0.0287017822265625,
-0.003345489501953125,
0.01371002197265625,
0.0653076171875,
0.01006317138671875,
-0.030792236328125,
0.0168304443359375,
-0.00426483154296875,
-0.005977630615234375,
-0.022430419921875,
-0.01242828369140625,
0.01496124267578125,
0.01490020751953125,
-0.000698089599609375,
-0.01444244384765625,
-0.0018434524536132812,
-0.06658935546875,
0.003932952880859375,
0.0157012939453125,
-0.01384735107421875,
-0.031585693359375,
0.044952392578125,
0.001678466796875,
-0.0165863037109375,
0.08441162109375,
-0.01788330078125,
-0.053070068359375,
0.05908203125,
0.0369873046875,
0.0557861328125,
-0.0140838623046875,
0.028533935546875,
0.06500244140625,
0.023712158203125,
-0.0156707763671875,
0.0057220458984375,
0.00799560546875,
-0.038787841796875,
-0.00794219970703125,
-0.061614990234375,
-0.01788330078125,
0.01849365234375,
-0.044036865234375,
0.0322265625,
-0.0465087890625,
-0.00838470458984375,
-0.00409698486328125,
0.016510009765625,
-0.044403076171875,
0.0249176025390625,
0.0099639892578125,
0.052947998046875,
-0.06939697265625,
0.0621337890625,
0.050018310546875,
-0.054046630859375,
-0.0823974609375,
0.00296783447265625,
0.0016508102416992188,
-0.033966064453125,
0.01111602783203125,
0.01511383056640625,
0.0169677734375,
0.01194000244140625,
-0.0209808349609375,
-0.0654296875,
0.0980224609375,
0.0181884765625,
-0.050140380859375,
-0.01995849609375,
-0.00797271728515625,
0.0406494140625,
0.0029277801513671875,
0.0537109375,
0.052947998046875,
0.0308074951171875,
0.005584716796875,
-0.0804443359375,
0.0265350341796875,
-0.0261383056640625,
-0.004886627197265625,
0.0182647705078125,
-0.05291748046875,
0.0980224609375,
-0.0052032470703125,
-0.0007147789001464844,
0.03131103515625,
0.04473876953125,
0.030303955078125,
-0.007656097412109375,
0.0284423828125,
0.057830810546875,
0.0648193359375,
-0.02825927734375,
0.09417724609375,
-0.0237579345703125,
0.058990478515625,
0.0631103515625,
0.01629638671875,
0.03875732421875,
0.030059814453125,
-0.0284576416015625,
0.038330078125,
0.06390380859375,
-0.004833221435546875,
0.01377105712890625,
0.0197296142578125,
-0.0231170654296875,
-0.020599365234375,
0.007122039794921875,
-0.046905517578125,
0.01317596435546875,
0.009521484375,
-0.04388427734375,
-0.0155487060546875,
-0.0262298583984375,
0.0271759033203125,
-0.03271484375,
-0.0182342529296875,
0.01995849609375,
0.00930023193359375,
-0.04766845703125,
0.04742431640625,
0.0201568603515625,
0.043243408203125,
-0.034423828125,
0.013031005859375,
-0.01032257080078125,
0.0238037109375,
-0.0256805419921875,
-0.029815673828125,
0.005664825439453125,
-0.00029015541076660156,
0.005428314208984375,
0.00984954833984375,
0.031646728515625,
-0.01136016845703125,
-0.043487548828125,
0.01410675048828125,
0.03753662109375,
0.01959228515625,
-0.0340576171875,
-0.052886962890625,
0.007450103759765625,
-0.012298583984375,
-0.039520263671875,
0.033721923828125,
0.0203399658203125,
-0.0094757080078125,
0.04443359375,
0.048431396484375,
0.00426483154296875,
0.0010576248168945312,
0.0100250244140625,
0.0732421875,
-0.034698486328125,
-0.036102294921875,
-0.06964111328125,
0.036865234375,
0.0011396408081054688,
-0.0516357421875,
0.0648193359375,
0.041046142578125,
0.052154541015625,
0.0191192626953125,
0.046142578125,
-0.03216552734375,
-0.003154754638671875,
-0.0218505859375,
0.05108642578125,
-0.038360595703125,
0.0018978118896484375,
-0.038360595703125,
-0.0870361328125,
-0.00444793701171875,
0.0706787109375,
-0.039459228515625,
0.0305938720703125,
0.05950927734375,
0.06097412109375,
-0.0059356689453125,
0.006542205810546875,
0.00514984130859375,
0.0223846435546875,
0.039398193359375,
0.071533203125,
0.06768798828125,
-0.052032470703125,
0.04205322265625,
-0.038818359375,
-0.0197296142578125,
-0.011199951171875,
-0.035980224609375,
-0.063720703125,
-0.0341796875,
-0.03778076171875,
-0.057861328125,
0.0015039443969726562,
0.06622314453125,
0.057159423828125,
-0.0465087890625,
-0.01288604736328125,
-0.038177490234375,
0.00363922119140625,
-0.018585205078125,
-0.0174102783203125,
0.033172607421875,
0.01107025146484375,
-0.072998046875,
-0.00308990478515625,
-0.01111602783203125,
0.0085296630859375,
-0.0330810546875,
-0.02362060546875,
-0.01290130615234375,
-0.00954437255859375,
0.004238128662109375,
0.0226898193359375,
-0.038970947265625,
-0.0184326171875,
0.002391815185546875,
0.004596710205078125,
0.000027358531951904297,
0.053436279296875,
-0.0416259765625,
0.00986480712890625,
0.046844482421875,
0.00807952880859375,
0.061553955078125,
-0.02093505859375,
0.0325927734375,
-0.01837158203125,
0.024871826171875,
0.020294189453125,
0.048065185546875,
0.025421142578125,
-0.018829345703125,
0.01134490966796875,
0.0309295654296875,
-0.055572509765625,
-0.0655517578125,
0.0250091552734375,
-0.05291748046875,
-0.00666046142578125,
0.09600830078125,
-0.0212860107421875,
-0.031585693359375,
0.00315093994140625,
-0.01543426513671875,
0.04052734375,
-0.0225677490234375,
0.0511474609375,
0.047515869140625,
0.006214141845703125,
-0.0154876708984375,
-0.048797607421875,
0.0272216796875,
0.051727294921875,
-0.0615234375,
0.025054931640625,
0.044219970703125,
0.046478271484375,
0.019195556640625,
0.04345703125,
-0.021453857421875,
0.046295166015625,
0.004894256591796875,
0.005832672119140625,
0.002674102783203125,
-0.03497314453125,
-0.03289794921875,
-0.006946563720703125,
0.0169677734375,
0.0014219284057617188
]
] |
suno/bark | 2023-10-04T14:17:55.000Z | [
"transformers",
"pytorch",
"bark",
"text-to-audio",
"audio",
"text-to-speech",
"en",
"de",
"es",
"fr",
"hi",
"it",
"ja",
"ko",
"pl",
"pt",
"ru",
"tr",
"zh",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-to-speech | suno | null | null | suno/bark | 525 | 48,508 | transformers | 2023-04-25T14:44:46 | ---
language:
- en
- de
- es
- fr
- hi
- it
- ja
- ko
- pl
- pt
- ru
- tr
- zh
thumbnail: >-
https://user-images.githubusercontent.com/5068315/230698495-cbb1ced9-c911-4c9a-941d-a1a4a1286ac6.png
library: bark
license: mit
tags:
- bark
- audio
- text-to-speech
pipeline_tag: text-to-speech
inference: true
---
# Bark
Bark is a transformer-based text-to-audio model created by [Suno](https://www.suno.ai).
Bark can generate highly realistic, multilingual speech as well as other audio - including music,
background noise and simple sound effects. The model can also produce nonverbal
communications like laughing, sighing and crying. To support the research community,
we are providing access to pretrained model checkpoints ready for inference.
The original github repo and model card can be found [here](https://github.com/suno-ai/bark).
This model is meant for research purposes only.
The model output is not censored and the authors do not endorse the opinions in the generated content.
Use at your own risk.
Two checkpoints are released:
- [small](https://huggingface.co/suno/bark-small)
- [**large** (this checkpoint)](https://huggingface.co/suno/bark)
## Example
Try out Bark yourself!
* Bark Colab:
<a target="_blank" href="https://colab.research.google.com/drive/1eJfA2XUa-mXwdMy7DoYKVYHI1iTd9Vkt?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
* Hugging Face Colab:
<a target="_blank" href="https://colab.research.google.com/drive/1dWWkZzvu7L9Bunq9zvD-W02RFUXoW-Pd?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
* Hugging Face Demo:
<a target="_blank" href="https://huggingface.co/spaces/suno/bark">
<img src="https://huggingface.co/datasets/huggingface/badges/raw/main/open-in-hf-spaces-sm.svg" alt="Open in HuggingFace"/>
</a>
## 🤗 Transformers Usage
You can run Bark locally with the 🤗 Transformers library from version 4.31.0 onwards.
1. First install the 🤗 [Transformers library](https://github.com/huggingface/transformers) and scipy:
```
pip install --upgrade pip
pip install --upgrade transformers scipy
```
2. Run inference via the `text-to-speech` (TTS) pipeline. You can run Bark through the pipeline in just a few lines of code!
```python
from transformers import pipeline
import scipy
synthesiser = pipeline("text-to-speech", "suno/bark")
speech = synthesiser("Hello, my dog is cooler than you!", forward_params={"do_sample": True})
scipy.io.wavfile.write("bark_out.wav", rate=speech["sampling_rate"], data=speech["audio"])
```
3. Run inference via the Transformers modelling code. You can use the processor + generate code to convert text into a mono 24 kHz speech waveform for more fine-grained control.
```python
from transformers import AutoProcessor, AutoModel
processor = AutoProcessor.from_pretrained("suno/bark")
model = AutoModel.from_pretrained("suno/bark")
inputs = processor(
text=["Hello, my name is Suno. And, uh — and I like pizza. [laughs] But I also have other interests such as playing tic tac toe."],
return_tensors="pt",
)
speech_values = model.generate(**inputs, do_sample=True)
```
4. Listen to the speech samples either in an ipynb notebook:
```python
from IPython.display import Audio
sampling_rate = model.generation_config.sample_rate
Audio(speech_values.cpu().numpy().squeeze(), rate=sampling_rate)
```
Or save them as a `.wav` file using a third-party library, e.g. `scipy`:
```python
import scipy
sampling_rate = model.generation_config.sample_rate
scipy.io.wavfile.write("bark_out.wav", rate=sampling_rate, data=speech_values.cpu().numpy().squeeze())
```
For more details on using the Bark model for inference using the 🤗 Transformers library, refer to the [Bark docs](https://huggingface.co/docs/transformers/model_doc/bark).
## Suno Usage
You can also run Bark locally through the original [Bark library](https://github.com/suno-ai/bark):
1. First install the [`bark` library](https://github.com/suno-ai/bark)
2. Run the following Python code:
```python
from bark import SAMPLE_RATE, generate_audio, preload_models
from IPython.display import Audio
# download and load all models
preload_models()
# generate audio from text
text_prompt = """
Hello, my name is Suno. And, uh — and I like pizza. [laughs]
But I also have other interests such as playing tic tac toe.
"""
speech_array = generate_audio(text_prompt)
# play text in notebook
Audio(speech_array, rate=SAMPLE_RATE)
```
[pizza.webm](https://user-images.githubusercontent.com/5068315/230490503-417e688d-5115-4eee-9550-b46a2b465ee3.webm)
To save `speech_array` as a WAV file:
```python
from scipy.io.wavfile import write as write_wav
write_wav("/path/to/audio.wav", SAMPLE_RATE, speech_array)
```
## Model Details
The following is additional information about the models released here.
Bark is a series of three transformer models that turn text into audio.
### Text to semantic tokens
- Input: text, tokenized with [BERT tokenizer from Hugging Face](https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer)
- Output: semantic tokens that encode the audio to be generated
### Semantic to coarse tokens
- Input: semantic tokens
- Output: tokens from the first two codebooks of the [EnCodec Codec](https://github.com/facebookresearch/encodec) from facebook
### Coarse to fine tokens
- Input: the first two codebooks from EnCodec
- Output: 8 codebooks from EnCodec
### Architecture
| Model | Parameters | Attention | Output Vocab size |
|:-------------------------:|:----------:|------------|:-----------------:|
| Text to semantic tokens | 80/300 M | Causal | 10,000 |
| Semantic to coarse tokens | 80/300 M | Causal | 2x 1,024 |
| Coarse to fine tokens | 80/300 M | Non-causal | 6x 1,024 |
### Release date
April 2023
## Broader Implications
We anticipate that this model's text-to-audio capabilities can be used to improve accessibility tools in a variety of languages.
While we hope that this release will enable users to express their creativity and build applications that are a force
for good, we acknowledge that any text to audio model has the potential for dual use. While it is not straightforward
to voice clone known people with Bark, it can still be used for nefarious purposes. To further reduce the chances of unintended use of Bark,
we also release a simple classifier to detect Bark-generated audio with high accuracy (see notebooks section of the main repository). | 6,626 | [
[
-0.0211029052734375,
-0.051483154296875,
0.0141143798828125,
0.036865234375,
-0.01096343994140625,
-0.002658843994140625,
-0.022613525390625,
-0.06146240234375,
0.0157928466796875,
0.01432037353515625,
-0.04730224609375,
-0.053558349609375,
-0.0273590087890625,
-0.003498077392578125,
-0.0186767578125,
0.0906982421875,
0.041107177734375,
0.0007596015930175781,
-0.002040863037109375,
-0.0004892349243164062,
-0.020751953125,
-0.0341796875,
-0.05914306640625,
-0.049957275390625,
0.0205841064453125,
0.001476287841796875,
0.0311431884765625,
0.0257415771484375,
-0.0012035369873046875,
0.019927978515625,
-0.025604248046875,
-0.0208587646484375,
-0.0029754638671875,
-0.0174713134765625,
0.01378631591796875,
-0.049652099609375,
-0.04864501953125,
0.0196380615234375,
0.0272064208984375,
0.016387939453125,
-0.028839111328125,
0.0173797607421875,
-0.00957489013671875,
0.01654052734375,
-0.0069122314453125,
0.0275726318359375,
-0.046234130859375,
-0.01004791259765625,
0.00469207763671875,
-0.002471923828125,
-0.037750244140625,
-0.0333251953125,
0.0055999755859375,
-0.055023193359375,
0.0157012939453125,
-0.001384735107421875,
0.0859375,
0.020355224609375,
-0.01393890380859375,
-0.03509521484375,
-0.044891357421875,
0.061676025390625,
-0.08184814453125,
0.0103302001953125,
0.059417724609375,
0.01404571533203125,
-0.01273345947265625,
-0.06463623046875,
-0.038330078125,
-0.024993896484375,
-0.004673004150390625,
0.0243072509765625,
-0.038848876953125,
-0.0016422271728515625,
0.011260986328125,
0.0253753662109375,
-0.019866943359375,
-0.004497528076171875,
-0.0188140869140625,
-0.0140838623046875,
0.045745849609375,
0.0004210472106933594,
0.042327880859375,
-0.0275726318359375,
-0.00533294677734375,
-0.04376220703125,
-0.009765625,
0.039337158203125,
0.0194854736328125,
0.0218963623046875,
-0.03582763671875,
0.042327880859375,
0.0186614990234375,
0.0355224609375,
0.0220947265625,
-0.03857421875,
0.036163330078125,
-0.0207672119140625,
-0.012786865234375,
0.033935546875,
0.08221435546875,
0.0145111083984375,
-0.00862884521484375,
-0.0013837814331054688,
0.0025310516357421875,
0.0090789794921875,
0.0012769699096679688,
-0.05206298828125,
-0.019378662109375,
0.0408935546875,
-0.0277252197265625,
-0.0323486328125,
-0.019622802734375,
-0.04241943359375,
0.0015277862548828125,
-0.0016183853149414062,
0.033355712890625,
-0.07208251953125,
-0.0301055908203125,
0.0165863037109375,
-0.039764404296875,
0.0236968994140625,
0.0040435791015625,
-0.0738525390625,
0.024200439453125,
0.034423828125,
0.051605224609375,
0.020660400390625,
-0.0284271240234375,
-0.020172119140625,
0.0148468017578125,
-0.025238037109375,
0.0421142578125,
-0.0292816162109375,
-0.028076171875,
-0.035369873046875,
0.006710052490234375,
0.00014448165893554688,
-0.048736572265625,
0.056915283203125,
-0.0037994384765625,
0.0273590087890625,
0.0102081298828125,
-0.022064208984375,
-0.0296173095703125,
-0.004138946533203125,
-0.033355712890625,
0.11065673828125,
0.007503509521484375,
-0.0689697265625,
0.008331298828125,
-0.053466796875,
-0.044708251953125,
-0.0237274169921875,
0.01024627685546875,
-0.042633056640625,
0.0031986236572265625,
0.0284271240234375,
0.0188140869140625,
-0.03173828125,
0.031646728515625,
-0.01337432861328125,
-0.01558685302734375,
0.03900146484375,
-0.01251220703125,
0.08673095703125,
0.01367950439453125,
-0.04193115234375,
0.0193023681640625,
-0.06585693359375,
0.01184844970703125,
0.028350830078125,
-0.0335693359375,
-0.0247955322265625,
0.006839752197265625,
0.0211334228515625,
0.0082244873046875,
0.0151519775390625,
-0.048736572265625,
0.000560760498046875,
-0.042327880859375,
0.0609130859375,
0.0355224609375,
-0.01427459716796875,
0.00870513916015625,
-0.054290771484375,
0.0234375,
-0.01605224609375,
-0.005130767822265625,
-0.017364501953125,
-0.04180908203125,
-0.031951904296875,
-0.046630859375,
0.015716552734375,
0.03936767578125,
-0.007083892822265625,
0.0638427734375,
0.00705718994140625,
-0.06964111328125,
-0.07550048828125,
-0.039459228515625,
0.01512908935546875,
0.0258941650390625,
0.03326416015625,
-0.005771636962890625,
-0.046173095703125,
-0.047607421875,
-0.005779266357421875,
-0.033538818359375,
-0.01390838623046875,
0.045654296875,
0.021484375,
-0.021575927734375,
0.08331298828125,
-0.027679443359375,
-0.024871826171875,
-0.01678466796875,
0.0299072265625,
0.03173828125,
0.049163818359375,
0.041046142578125,
-0.036529541015625,
-0.0162200927734375,
-0.0014743804931640625,
-0.05218505859375,
-0.0189361572265625,
-0.00926971435546875,
0.0093841552734375,
0.00714874267578125,
0.0254669189453125,
-0.0482177734375,
0.01522064208984375,
0.03961181640625,
-0.0120697021484375,
0.058807373046875,
-0.0088043212890625,
0.00640106201171875,
-0.08905029296875,
0.0121307373046875,
-0.0054473876953125,
-0.01375579833984375,
-0.041595458984375,
-0.0240478515625,
-0.024383544921875,
-0.02032470703125,
-0.02886962890625,
0.0308685302734375,
-0.0216827392578125,
-0.01062774658203125,
-0.011077880859375,
0.005649566650390625,
-0.0013017654418945312,
0.04327392578125,
0.00836181640625,
0.04144287109375,
0.06793212890625,
-0.04473876953125,
0.0231170654296875,
0.03125,
-0.014801025390625,
0.0290069580078125,
-0.06439208984375,
0.0205535888671875,
0.012908935546875,
0.0323486328125,
-0.06927490234375,
-0.01248931884765625,
0.02215576171875,
-0.0660400390625,
0.01300048828125,
-0.0026607513427734375,
-0.04180908203125,
-0.026824951171875,
-0.017669677734375,
0.0301513671875,
0.057373046875,
-0.04852294921875,
0.052337646484375,
0.042694091796875,
-0.0048370361328125,
-0.031036376953125,
-0.05645751953125,
0.0012578964233398438,
-0.035736083984375,
-0.04644775390625,
0.03460693359375,
-0.00446319580078125,
0.003292083740234375,
0.014739990234375,
-0.0072174072265625,
-0.0018911361694335938,
-0.00901031494140625,
0.03521728515625,
0.005878448486328125,
-0.0157623291015625,
0.0026836395263671875,
0.0008678436279296875,
-0.005054473876953125,
0.01715087890625,
-0.0168304443359375,
0.0528564453125,
-0.037384033203125,
-0.00537872314453125,
-0.058502197265625,
0.006961822509765625,
0.05010986328125,
-0.0266876220703125,
0.0223388671875,
0.0616455078125,
-0.025390625,
-0.01149749755859375,
-0.0335693359375,
-0.020904541015625,
-0.033935546875,
0.0275726318359375,
-0.0294647216796875,
-0.0389404296875,
0.03900146484375,
-0.01485443115234375,
-0.0040130615234375,
0.033782958984375,
0.033172607421875,
-0.01345062255859375,
0.07965087890625,
0.05841064453125,
-0.00955963134765625,
0.03814697265625,
-0.0230865478515625,
0.005931854248046875,
-0.0701904296875,
-0.0343017578125,
-0.048614501953125,
0.0006709098815917969,
-0.036346435546875,
-0.025360107421875,
0.022552490234375,
0.0121307373046875,
-0.003978729248046875,
0.04254150390625,
-0.0626220703125,
0.008026123046875,
0.05780029296875,
0.00445556640625,
0.0154571533203125,
0.0050201416015625,
-0.01216888427734375,
-0.00980377197265625,
-0.05224609375,
-0.0369873046875,
0.05535888671875,
0.04254150390625,
0.064697265625,
-0.0097503662109375,
0.0535888671875,
-0.0009322166442871094,
0.003147125244140625,
-0.07476806640625,
0.044891357421875,
-0.00431060791015625,
-0.056976318359375,
-0.0261688232421875,
-0.01146697998046875,
-0.07830810546875,
0.0100555419921875,
-0.016937255859375,
-0.0738525390625,
-0.005641937255859375,
-0.006580352783203125,
-0.0010471343994140625,
0.0220947265625,
-0.042236328125,
0.0596923828125,
-0.0161590576171875,
-0.0039520263671875,
-0.0259246826171875,
-0.0386962890625,
0.026702880859375,
0.0015611648559570312,
0.0205230712890625,
-0.0284271240234375,
0.0224151611328125,
0.07550048828125,
-0.023193359375,
0.08197021484375,
-0.0008606910705566406,
0.0032711029052734375,
0.044677734375,
-0.0126800537109375,
0.0124969482421875,
-0.0100860595703125,
-0.01324462890625,
0.017364501953125,
0.036895751953125,
-0.010162353515625,
-0.028564453125,
0.0233154296875,
-0.065673828125,
-0.023040771484375,
-0.039031982421875,
-0.045166015625,
-0.003322601318359375,
0.01511383056640625,
0.03411865234375,
0.0261077880859375,
-0.0207977294921875,
0.0037517547607421875,
0.00542449951171875,
-0.049163818359375,
0.0400390625,
0.042633056640625,
-0.02471923828125,
-0.04718017578125,
0.0643310546875,
-0.0191192626953125,
0.0091400146484375,
0.001605987548828125,
0.04742431640625,
-0.0306396484375,
-0.01476287841796875,
-0.0192108154296875,
0.042938232421875,
-0.033477783203125,
0.0003387928009033203,
-0.040435791015625,
-0.0219879150390625,
-0.0401611328125,
-0.0006937980651855469,
-0.044952392578125,
-0.0209197998046875,
-0.02520751953125,
0.009002685546875,
0.050537109375,
0.04901123046875,
-0.02886962890625,
0.0254669189453125,
-0.047821044921875,
0.03173828125,
0.0116424560546875,
0.01114654541015625,
0.0034961700439453125,
-0.046478271484375,
-0.007312774658203125,
0.01016998291015625,
-0.01497650146484375,
-0.06591796875,
0.04132080078125,
0.01113128662109375,
0.044097900390625,
0.003017425537109375,
0.01898193359375,
0.0506591796875,
-0.0333251953125,
0.056915283203125,
0.038299560546875,
-0.0826416015625,
0.0667724609375,
-0.0228118896484375,
0.0153045654296875,
0.01763916015625,
0.0212249755859375,
-0.044525146484375,
-0.05609130859375,
-0.061187744140625,
-0.06439208984375,
0.0926513671875,
0.021209716796875,
0.00482940673828125,
0.0012359619140625,
-0.003314971923828125,
0.0010919570922851562,
0.0030155181884765625,
-0.07659912109375,
-0.032257080078125,
-0.038421630859375,
-0.01322174072265625,
-0.0020160675048828125,
0.0019207000732421875,
-0.0161285400390625,
-0.04376220703125,
0.07342529296875,
0.0010843276977539062,
0.036773681640625,
0.025482177734375,
0.01436614990234375,
-0.0022029876708984375,
0.0264434814453125,
0.0247802734375,
0.002410888671875,
-0.043853759765625,
0.00656890869140625,
0.0124969482421875,
-0.043548583984375,
0.02349853515625,
0.006500244140625,
-0.004779815673828125,
0.0162811279296875,
0.01605224609375,
0.07830810546875,
0.0163726806640625,
-0.051849365234375,
0.0290679931640625,
-0.01287078857421875,
-0.0166778564453125,
-0.03472900390625,
-0.0007791519165039062,
0.0341796875,
0.018585205078125,
0.023406982421875,
-0.0035915374755859375,
-0.00647735595703125,
-0.052764892578125,
0.0236968994140625,
0.029510498046875,
-0.0265960693359375,
-0.0284271240234375,
0.07171630859375,
-0.01248931884765625,
-0.049835205078125,
0.03369140625,
-0.0041046142578125,
-0.027923583984375,
0.06207275390625,
0.0863037109375,
0.0751953125,
-0.0116729736328125,
0.0186614990234375,
0.051605224609375,
0.0184478759765625,
0.002994537353515625,
-0.0015497207641601562,
-0.0181121826171875,
-0.03857421875,
-0.0177459716796875,
-0.0560302734375,
-0.028167724609375,
0.03143310546875,
-0.057373046875,
0.024566650390625,
-0.0289459228515625,
-0.03338623046875,
0.017822265625,
-0.01447296142578125,
-0.00589752197265625,
0.0181884765625,
0.00479888916015625,
0.05255126953125,
-0.058990478515625,
0.086669921875,
0.044189453125,
-0.0401611328125,
-0.08465576171875,
0.01031494140625,
-0.0162506103515625,
-0.049652099609375,
0.035308837890625,
0.02349853515625,
-0.030975341796875,
0.007732391357421875,
-0.051971435546875,
-0.04736328125,
0.072509765625,
0.02801513671875,
-0.01081085205078125,
-0.0010538101196289062,
0.0018568038940429688,
0.04840087890625,
-0.0197296142578125,
0.0290069580078125,
0.051605224609375,
0.033782958984375,
0.0193023681640625,
-0.0849609375,
0.0070037841796875,
-0.0276641845703125,
-0.0259246826171875,
-0.0208282470703125,
-0.041259765625,
0.054595947265625,
-0.0240478515625,
-0.0225982666015625,
0.002872467041015625,
0.040924072265625,
0.040069580078125,
0.039581298828125,
0.043243408203125,
0.043914794921875,
0.059417724609375,
-0.00774383544921875,
0.0594482421875,
-0.0232391357421875,
0.0255126953125,
0.08306884765625,
0.01073455810546875,
0.0653076171875,
0.02093505859375,
-0.034088134765625,
0.036346435546875,
0.05133056640625,
-0.0146636962890625,
0.036712646484375,
0.0118408203125,
-0.019622802734375,
0.0031795501708984375,
-0.021636962890625,
-0.035491943359375,
0.04437255859375,
0.0191802978515625,
-0.0027065277099609375,
-0.0036907196044921875,
0.01153564453125,
-0.0029392242431640625,
-0.005161285400390625,
0.0041046142578125,
0.060211181640625,
0.01485443115234375,
-0.0423583984375,
0.0775146484375,
0.01139068603515625,
0.0677490234375,
-0.05322265625,
0.00737762451171875,
0.0138702392578125,
0.00457763671875,
-0.0249786376953125,
-0.050048828125,
0.0258026123046875,
0.0001327991485595703,
-0.005588531494140625,
-0.0032062530517578125,
0.0286865234375,
-0.03759765625,
-0.023345947265625,
0.0574951171875,
0.00376129150390625,
0.042449951171875,
-0.0035610198974609375,
-0.06781005859375,
0.006927490234375,
0.005603790283203125,
-0.007717132568359375,
0.009765625,
0.025909423828125,
0.01375579833984375,
0.04437255859375,
0.051605224609375,
0.0084991455078125,
0.0164947509765625,
0.007297515869140625,
0.04840087890625,
-0.059173583984375,
-0.045135498046875,
-0.048614501953125,
0.0367431640625,
0.0155181884765625,
-0.0133209228515625,
0.0474853515625,
0.054901123046875,
0.04119873046875,
-0.00980377197265625,
0.05780029296875,
-0.0264434814453125,
0.0308074951171875,
-0.0304412841796875,
0.055023193359375,
-0.0533447265625,
0.01300048828125,
-0.0362548828125,
-0.0440673828125,
-0.002696990966796875,
0.056060791015625,
-0.01001739501953125,
-0.00714111328125,
0.04949951171875,
0.071044921875,
-0.00524139404296875,
0.008880615234375,
0.01090240478515625,
0.0201416015625,
0.0290679931640625,
0.04541015625,
0.062286376953125,
-0.04608154296875,
0.0621337890625,
-0.0396728515625,
-0.016937255859375,
0.0003998279571533203,
-0.050750732421875,
-0.07049560546875,
-0.06298828125,
-0.02520751953125,
-0.043701171875,
-0.0238037109375,
0.058349609375,
0.06842041015625,
-0.049072265625,
-0.044189453125,
-0.004642486572265625,
-0.0003578662872314453,
-0.03564453125,
-0.020233154296875,
0.037384033203125,
-0.01558685302734375,
-0.0654296875,
0.043609619140625,
0.0031280517578125,
0.02386474609375,
0.0210418701171875,
-0.016571044921875,
-0.021240234375,
0.0178070068359375,
0.0287322998046875,
0.0328369140625,
-0.07391357421875,
-0.004550933837890625,
-0.00445556640625,
-0.0188446044921875,
0.0390625,
0.031646728515625,
-0.0521240234375,
0.02740478515625,
0.0197296142578125,
0.024200439453125,
0.0875244140625,
0.00028634071350097656,
0.0205230712890625,
-0.04150390625,
0.02911376953125,
0.02008056640625,
0.006877899169921875,
0.025848388671875,
-0.012542724609375,
0.02520751953125,
0.015960693359375,
-0.03265380859375,
-0.06732177734375,
-0.00522613525390625,
-0.1041259765625,
-0.039642333984375,
0.0762939453125,
0.005462646484375,
-0.03973388671875,
0.00908660888671875,
-0.04901123046875,
0.051239013671875,
-0.040863037109375,
0.045257568359375,
0.041748046875,
-0.0264129638671875,
-0.0025310516357421875,
-0.03509521484375,
0.04168701171875,
0.0386962890625,
-0.057342529296875,
0.008392333984375,
0.0182952880859375,
0.041046142578125,
0.0237579345703125,
0.06658935546875,
-0.0171051025390625,
0.0157470703125,
0.02838134765625,
0.042938232421875,
-0.005054473876953125,
-0.004245758056640625,
-0.036651611328125,
-0.00254058837890625,
0.005382537841796875,
-0.033111572265625
]
] |
ckiplab/bert-base-chinese | 2022-05-10T03:28:12.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"fill-mask",
"lm-head",
"zh",
"license:gpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | ckiplab | null | null | ckiplab/bert-base-chinese | 14 | 48,293 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- lm-head
- bert
- zh
license: gpl-3.0
---
# CKIP BERT Base Chinese
This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).
這個專案提供了繁體中文的 transformers 模型(包含 ALBERT、BERT、GPT2)及自然語言處理工具(包含斷詞、詞性標記、實體辨識)。
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use BertTokenizerFast as tokenizer instead of AutoTokenizer.
請使用 BertTokenizerFast 而非 AutoTokenizer。
```python
from transformers import (
BertTokenizerFast,
AutoModel,
)
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/bert-base-chinese')
```
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.
有關完整使用方法及其他資訊,請參見 https://github.com/ckiplab/ckip-transformers 。
| 1,106 | [
[
-0.0218048095703125,
-0.026336669921875,
0.001796722412109375,
0.056549072265625,
-0.029876708984375,
0.003997802734375,
-0.01380157470703125,
-0.019073486328125,
-0.002834320068359375,
0.032928466796875,
-0.027313232421875,
-0.0220794677734375,
-0.04364013671875,
0.001613616943359375,
-0.018157958984375,
0.06353759765625,
-0.014739990234375,
0.0255279541015625,
0.031402587890625,
0.00982666015625,
-0.018402099609375,
-0.03338623046875,
-0.05218505859375,
-0.044219970703125,
-0.00330352783203125,
0.019378662109375,
0.049591064453125,
0.030029296875,
0.036376953125,
0.0225067138671875,
0.0027942657470703125,
-0.00786590576171875,
-0.01334381103515625,
-0.0207672119140625,
0.0002865791320800781,
-0.03839111328125,
-0.027984619140625,
-0.0149688720703125,
0.049835205078125,
0.035369873046875,
0.0017938613891601562,
-0.0015630722045898438,
0.01424407958984375,
0.0265960693359375,
-0.0243072509765625,
0.031982421875,
-0.043914794921875,
0.0223846435546875,
-0.01171112060546875,
-0.00612640380859375,
-0.0272979736328125,
-0.0191650390625,
0.01299285888671875,
-0.045623779296875,
0.0251007080078125,
-0.0115966796875,
0.097900390625,
0.0031528472900390625,
-0.0225982666015625,
-0.020050048828125,
-0.05084228515625,
0.077392578125,
-0.063232421875,
0.032867431640625,
0.0260467529296875,
0.0211334228515625,
-0.003826141357421875,
-0.0789794921875,
-0.04766845703125,
-0.013671875,
-0.0157928466796875,
0.02398681640625,
0.0095672607421875,
-0.0026988983154296875,
0.02655029296875,
0.0220184326171875,
-0.04449462890625,
0.0149993896484375,
-0.0284271240234375,
-0.03143310546875,
0.039215087890625,
-0.0068359375,
0.035430908203125,
-0.032501220703125,
-0.040435791015625,
-0.02557373046875,
-0.045867919921875,
0.016876220703125,
0.02020263671875,
0.00933837890625,
-0.034210205078125,
0.042266845703125,
-0.0008134841918945312,
0.0211029052734375,
0.01345062255859375,
-0.005672454833984375,
0.032928466796875,
-0.02093505859375,
-0.006313323974609375,
-0.0093536376953125,
0.065185546875,
0.01654052734375,
0.0073394775390625,
0.00450897216796875,
-0.0235137939453125,
-0.0245513916015625,
-0.0172882080078125,
-0.056243896484375,
-0.050750732421875,
0.0157470703125,
-0.056976318359375,
-0.0147857666015625,
0.01178741455078125,
-0.04534912109375,
0.0215911865234375,
-0.01751708984375,
0.0308990478515625,
-0.05181884765625,
-0.044464111328125,
-0.0006399154663085938,
-0.0291748046875,
0.061309814453125,
0.01012420654296875,
-0.08831787109375,
0.0025348663330078125,
0.044708251953125,
0.054229736328125,
0.00980377197265625,
-0.011993408203125,
0.01031494140625,
0.0269317626953125,
-0.0144805908203125,
0.039794921875,
-0.0081939697265625,
-0.052947998046875,
0.01000213623046875,
0.0067596435546875,
0.0014123916625976562,
-0.0308990478515625,
0.060211181640625,
-0.0235137939453125,
0.0297698974609375,
-0.016510009765625,
-0.02197265625,
-0.0050811767578125,
0.0067138671875,
-0.03826904296875,
0.0875244140625,
0.0166168212890625,
-0.06256103515625,
0.0179290771484375,
-0.06475830078125,
-0.04339599609375,
0.0241851806640625,
-0.00766754150390625,
-0.030670166015625,
-0.0128173828125,
0.0173797607421875,
0.0235748291015625,
-0.004573822021484375,
0.01543426513671875,
-0.0011510848999023438,
-0.0162811279296875,
0.0005764961242675781,
-0.031890869140625,
0.0989990234375,
0.0251617431640625,
-0.023773193359375,
0.01216888427734375,
-0.049835205078125,
0.00946044921875,
0.022552490234375,
-0.0189208984375,
-0.017852783203125,
0.01476287841796875,
0.043853759765625,
0.01165008544921875,
0.0408935546875,
-0.043548583984375,
0.03582763671875,
-0.041229248046875,
0.052764892578125,
0.0614013671875,
-0.02374267578125,
0.0205230712890625,
-0.01126861572265625,
0.00004029273986816406,
0.0045013427734375,
0.0271148681640625,
-0.01000213623046875,
-0.037841796875,
-0.08245849609375,
-0.025482177734375,
0.03265380859375,
0.056854248046875,
-0.0816650390625,
0.06634521484375,
-0.0177459716796875,
-0.045928955078125,
-0.02398681640625,
-0.004993438720703125,
0.0014858245849609375,
0.01309967041015625,
0.0404052734375,
-0.0218658447265625,
-0.042877197265625,
-0.0748291015625,
0.00893402099609375,
-0.042205810546875,
-0.041046142578125,
-0.0002104043960571289,
0.041015625,
-0.03277587890625,
0.072998046875,
-0.03826904296875,
-0.020355224609375,
-0.0230865478515625,
0.041259765625,
0.0263214111328125,
0.06671142578125,
0.046295166015625,
-0.07464599609375,
-0.052398681640625,
-0.01558685302734375,
-0.0254669189453125,
-0.0048980712890625,
-0.0167083740234375,
-0.0109100341796875,
0.004558563232421875,
0.004352569580078125,
-0.044525146484375,
0.0140380859375,
0.029052734375,
0.0001900196075439453,
0.06317138671875,
-0.0037822723388671875,
-0.0210418701171875,
-0.09759521484375,
0.0135498046875,
-0.01479339599609375,
-0.00315093994140625,
-0.030975341796875,
0.00014102458953857422,
0.01387786865234375,
-0.00650787353515625,
-0.039764404296875,
0.04144287109375,
-0.0257720947265625,
0.0235595703125,
-0.01953125,
-0.01271820068359375,
-0.015350341796875,
0.044036865234375,
0.0296630859375,
0.051971435546875,
0.044464111328125,
-0.05206298828125,
0.0311737060546875,
0.0491943359375,
-0.0199432373046875,
-0.006786346435546875,
-0.0694580078125,
-0.00124359130859375,
0.0233612060546875,
0.012603759765625,
-0.07073974609375,
-0.004665374755859375,
0.045074462890625,
-0.05572509765625,
0.0443115234375,
0.00421142578125,
-0.06842041015625,
-0.03338623046875,
-0.032958984375,
0.02484130859375,
0.051116943359375,
-0.046295166015625,
0.036956787109375,
0.0192413330078125,
-0.01554107666015625,
-0.043914794921875,
-0.058746337890625,
-0.001461029052734375,
0.0199737548828125,
-0.042694091796875,
0.04779052734375,
-0.0167694091796875,
0.0248870849609375,
-0.0005931854248046875,
0.00672149658203125,
-0.035552978515625,
-0.0058746337890625,
-0.009796142578125,
0.0301361083984375,
-0.0108489990234375,
-0.000942230224609375,
0.01471710205078125,
-0.0230560302734375,
0.0105743408203125,
-0.0006585121154785156,
0.0537109375,
0.0032806396484375,
-0.0234832763671875,
-0.040924072265625,
0.01983642578125,
0.0149383544921875,
-0.0181884765625,
0.0225677490234375,
0.075927734375,
-0.019134521484375,
-0.01385498046875,
-0.03143310546875,
-0.01131439208984375,
-0.040191650390625,
0.04425048828125,
-0.033782958984375,
-0.06036376953125,
0.0243377685546875,
-0.0081939697265625,
0.01428985595703125,
0.055511474609375,
0.0469970703125,
-0.0011663436889648438,
0.090576171875,
0.0675048828125,
-0.040130615234375,
0.032196044921875,
-0.0298919677734375,
0.02734375,
-0.06573486328125,
0.017791748046875,
-0.04656982421875,
0.00746917724609375,
-0.0609130859375,
-0.022216796875,
-0.0011835098266601562,
0.01171112060546875,
-0.020111083984375,
0.05340576171875,
-0.058746337890625,
-0.0036373138427734375,
0.05810546875,
-0.0219573974609375,
-0.007068634033203125,
-0.006694793701171875,
-0.0197601318359375,
-0.0010557174682617188,
-0.0433349609375,
-0.048248291015625,
0.0543212890625,
0.04962158203125,
0.053253173828125,
-0.0018186569213867188,
0.036956787109375,
-0.00240325927734375,
0.031524658203125,
-0.058502197265625,
0.039794921875,
-0.015655517578125,
-0.061614990234375,
-0.0219573974609375,
-0.016845703125,
-0.061920166015625,
0.0166168212890625,
-0.0017881393432617188,
-0.0634765625,
0.01187896728515625,
0.00429534912109375,
-0.00713348388671875,
0.0276031494140625,
-0.031646728515625,
0.054229736328125,
-0.036041259765625,
0.00830841064453125,
-0.005764007568359375,
-0.0526123046875,
0.0287017822265625,
0.00017762184143066406,
-0.006572723388671875,
-0.005462646484375,
0.007495880126953125,
0.05584716796875,
-0.01488494873046875,
0.061309814453125,
-0.01346588134765625,
-0.004638671875,
0.0231475830078125,
-0.0219573974609375,
0.0228424072265625,
0.01287078857421875,
0.00922393798828125,
0.04449462890625,
0.0159454345703125,
-0.0284271240234375,
-0.0157318115234375,
0.03497314453125,
-0.06805419921875,
-0.0308990478515625,
-0.04290771484375,
-0.017059326171875,
0.01031494140625,
0.039825439453125,
0.041046142578125,
-0.00015175342559814453,
-0.000732421875,
0.0195465087890625,
0.024200439453125,
-0.03253173828125,
0.0430908203125,
0.041961669921875,
-0.005146026611328125,
-0.034027099609375,
0.068603515625,
0.01025390625,
0.00627899169921875,
0.0472412109375,
-0.0029850006103515625,
-0.0184478759765625,
-0.03228759765625,
-0.024810791015625,
0.0284423828125,
-0.031646728515625,
0.000568389892578125,
-0.027374267578125,
-0.043212890625,
-0.04864501953125,
0.01031494140625,
-0.026153564453125,
-0.0301361083984375,
-0.021484375,
0.0013875961303710938,
-0.024566650390625,
0.0087127685546875,
-0.0205078125,
0.035400390625,
-0.0780029296875,
0.036590576171875,
0.015716552734375,
0.01861572265625,
0.001583099365234375,
-0.017608642578125,
-0.04095458984375,
0.00914764404296875,
-0.06378173828125,
-0.05377197265625,
0.041107177734375,
0.0004837512969970703,
0.05303955078125,
0.045196533203125,
0.0130767822265625,
0.03802490234375,
-0.04779052734375,
0.08233642578125,
0.0274200439453125,
-0.0887451171875,
0.0296173095703125,
-0.0130462646484375,
0.025299072265625,
0.0218505859375,
0.037139892578125,
-0.057586669921875,
-0.0242919921875,
-0.0360107421875,
-0.0859375,
0.04888916015625,
0.0284881591796875,
0.02630615234375,
-0.0008664131164550781,
0.0008578300476074219,
-0.0009918212890625,
0.01263427734375,
-0.08197021484375,
-0.040740966796875,
-0.03961181640625,
-0.0228271484375,
0.017425537109375,
-0.0297698974609375,
0.0065765380859375,
-0.0163421630859375,
0.0794677734375,
0.0051727294921875,
0.0616455078125,
0.03497314453125,
-0.003810882568359375,
-0.00977325439453125,
0.00664520263671875,
0.034942626953125,
0.04083251953125,
-0.0203094482421875,
-0.017547607421875,
0.005718231201171875,
-0.047332763671875,
-0.016998291015625,
0.03070068359375,
-0.0294189453125,
0.0330810546875,
0.0367431640625,
0.046142578125,
0.009979248046875,
-0.030517578125,
0.039794921875,
-0.01181793212890625,
-0.0181884765625,
-0.07257080078125,
-0.0030364990234375,
0.003040313720703125,
0.0016546249389648438,
0.05133056640625,
-0.0122222900390625,
0.0106658935546875,
-0.01390838623046875,
0.016021728515625,
0.0304412841796875,
-0.038299560546875,
-0.03399658203125,
0.05010986328125,
0.03521728515625,
-0.0201568603515625,
0.06353759765625,
-0.004108428955078125,
-0.07080078125,
0.05059814453125,
0.03436279296875,
0.07623291015625,
-0.0249481201171875,
0.00360870361328125,
0.0472412109375,
0.0367431640625,
0.004573822021484375,
0.01800537109375,
-0.0201568603515625,
-0.06890869140625,
-0.039306640625,
-0.027679443359375,
-0.033721923828125,
0.030853271484375,
-0.036834716796875,
0.04266357421875,
-0.034576416015625,
-0.00888824462890625,
-0.00446319580078125,
-0.0035419464111328125,
-0.0361328125,
0.01097869873046875,
0.00939178466796875,
0.0845947265625,
-0.046630859375,
0.08831787109375,
0.044036865234375,
-0.040313720703125,
-0.06201171875,
0.0125732421875,
-0.0296173095703125,
-0.055023193359375,
0.0782470703125,
0.02642822265625,
0.0203094482421875,
0.005672454833984375,
-0.0556640625,
-0.05657958984375,
0.07464599609375,
-0.01129913330078125,
-0.0248565673828125,
-0.0078582763671875,
0.026153564453125,
0.029632568359375,
-0.0034332275390625,
0.032440185546875,
0.005214691162109375,
0.046905517578125,
-0.01216888427734375,
-0.0849609375,
-0.0170745849609375,
-0.021331787109375,
0.00444793701171875,
0.01800537109375,
-0.0634765625,
0.0634765625,
0.00860595703125,
-0.024749755859375,
0.02838134765625,
0.0675048828125,
0.0001804828643798828,
0.00902557373046875,
0.04205322265625,
0.033233642578125,
-0.0022411346435546875,
-0.0168914794921875,
0.036376953125,
-0.04339599609375,
0.05987548828125,
0.0621337890625,
-0.005153656005859375,
0.055023193359375,
0.02685546875,
-0.037628173828125,
0.040435791015625,
0.05096435546875,
-0.045654296875,
0.04541015625,
0.0007462501525878906,
-0.00782012939453125,
-0.00827789306640625,
0.00958251953125,
-0.041748046875,
0.017486572265625,
0.0225677490234375,
-0.0271453857421875,
-0.0108489990234375,
-0.01435089111328125,
-0.0009012222290039062,
-0.03106689453125,
-0.004276275634765625,
0.0379638671875,
0.010284423828125,
-0.0223236083984375,
0.0364990234375,
0.026336669921875,
0.07171630859375,
-0.07769775390625,
-0.0261993408203125,
0.0192108154296875,
0.01119232177734375,
-0.0037479400634765625,
-0.047454833984375,
0.01070404052734375,
-0.0254669189453125,
-0.0115509033203125,
-0.0115203857421875,
0.05938720703125,
-0.0247344970703125,
-0.040008544921875,
0.031219482421875,
0.0057220458984375,
0.0106201171875,
0.0215606689453125,
-0.0860595703125,
-0.0247039794921875,
0.0263824462890625,
-0.031524658203125,
0.01126861572265625,
0.01145172119140625,
0.00756072998046875,
0.0477294921875,
0.0640869140625,
0.00640869140625,
-0.009979248046875,
-0.0032367706298828125,
0.06671142578125,
-0.0421142578125,
-0.041534423828125,
-0.051116943359375,
0.056060791015625,
-0.0177764892578125,
-0.0263519287109375,
0.0523681640625,
0.052490234375,
0.08392333984375,
-0.02642822265625,
0.0760498046875,
-0.0291595458984375,
0.05657958984375,
-0.01439666748046875,
0.059539794921875,
-0.0300750732421875,
-0.01053619384765625,
-0.024688720703125,
-0.065673828125,
-0.01702880859375,
0.0654296875,
-0.01003265380859375,
-0.004909515380859375,
0.050506591796875,
0.0443115234375,
0.0003883838653564453,
-0.0162811279296875,
0.0114898681640625,
0.01346588134765625,
0.046417236328125,
0.033111572265625,
0.04034423828125,
-0.03900146484375,
0.046630859375,
-0.048492431640625,
-0.01506805419921875,
-0.00969696044921875,
-0.051666259765625,
-0.0531005859375,
-0.0447998046875,
-0.0202484130859375,
-0.00778961181640625,
-0.0191192626953125,
0.061798095703125,
0.05633544921875,
-0.07861328125,
-0.033203125,
-0.001659393310546875,
0.0074615478515625,
-0.0251007080078125,
-0.02593994140625,
0.04632568359375,
-0.03143310546875,
-0.08514404296875,
0.0007090568542480469,
0.006366729736328125,
0.00797271728515625,
-0.02392578125,
0.0002884864807128906,
-0.020843505859375,
-0.0127410888671875,
0.0311737060546875,
0.0325927734375,
-0.055328369140625,
-0.0235595703125,
-0.0012044906616210938,
-0.01503753662109375,
0.00887298583984375,
0.045135498046875,
-0.0166015625,
0.0280303955078125,
0.049835205078125,
0.020599365234375,
0.0258941650390625,
-0.01026153564453125,
0.0521240234375,
-0.036773681640625,
0.02056884765625,
0.0241241455078125,
0.040496826171875,
0.02301025390625,
-0.016815185546875,
0.036529541015625,
0.031829833984375,
-0.05340576171875,
-0.042938232421875,
0.0251617431640625,
-0.0765380859375,
-0.0207977294921875,
0.06719970703125,
-0.0212554931640625,
-0.01030731201171875,
-0.00852203369140625,
-0.04412841796875,
0.04730224609375,
-0.022979736328125,
0.044647216796875,
0.0635986328125,
-0.00443267822265625,
-0.006683349609375,
-0.037872314453125,
0.0293731689453125,
0.031768798828125,
-0.0251617431640625,
-0.0257720947265625,
0.00037097930908203125,
0.0136871337890625,
0.04534912109375,
0.03216552734375,
-0.00971221923828125,
0.009307861328125,
-0.01239776611328125,
0.04443359375,
0.0014085769653320312,
0.01480865478515625,
0.004070281982421875,
-0.013397216796875,
0.002750396728515625,
-0.0313720703125
]
] |