modelId stringlengths 4 111 | lastModified stringlengths 24 24 | tags list | pipeline_tag stringlengths 5 30 | author stringlengths 2 34 | config null | securityStatus null | id stringlengths 4 111 | likes int64 0 9.53k | downloads int64 2 73.6M | library_name stringlengths 2 84 | created timestamp[us] | card stringlengths 101 901k | card_len int64 101 901k | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
jonatasgrosman/wav2vec2-large-xlsr-53-english | 2023-03-25T10:56:55.000Z | [
"transformers",
"pytorch",
"jax",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"en",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_6_0",
"robust-speech-event",
"speech",
"xlsr-fine-tuning-week",
"dataset:common_voice",
"dataset:mozilla-foundation/common_voice_6_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | jonatasgrosman | null | null | jonatasgrosman/wav2vec2-large-xlsr-53-english | 299 | 73,582,776 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- common_voice
- mozilla-foundation/common_voice_6_0
metrics:
- wer
- cer
tags:
- audio
- automatic-speech-recognition
- en
- hf-asr-leaderboard
- mozilla-foundation/common_voice_6_0
- robust-speech-event
- speech
- xlsr-fine-tuning-week
license: apache-2.0
model-index:
- name: XLSR Wav2Vec2 English by Jonatas Grosman
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice en
type: common_voice
args: en
metrics:
- name: Test WER
type: wer
value: 19.06
- name: Test CER
type: cer
value: 7.69
- name: Test WER (+LM)
type: wer
value: 14.81
- name: Test CER (+LM)
type: cer
value: 6.84
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Robust Speech Event - Dev Data
type: speech-recognition-community-v2/dev_data
args: en
metrics:
- name: Dev WER
type: wer
value: 27.72
- name: Dev CER
type: cer
value: 11.65
- name: Dev WER (+LM)
type: wer
value: 20.85
- name: Dev CER (+LM)
type: cer
value: 11.01
---
# Fine-tuned XLSR-53 large model for speech recognition in English
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on English using the train and validation splits of [Common Voice 6.1](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
This model has been fine-tuned thanks to the GPU credits generously given by the [OVHcloud](https://www.ovhcloud.com/en/public-cloud/ai-training/) :)
The script used for training can be found here: https://github.com/jonatasgrosman/wav2vec2-sprint
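As noted above, the model expects input sampled at 16 kHz. To illustrate what resampling involves, here is a naive linear-interpolation sketch (for illustration only; in practice prefer `librosa.load(path, sr=16_000)` as in the usage example, or `torchaudio`, which apply proper anti-aliasing filters):

```python
def resample_linear(signal, orig_sr, target_sr=16_000):
    """Naive linear-interpolation resampler, for illustration only.
    Real pipelines should use librosa/torchaudio resamplers, which
    low-pass filter the signal before decimating."""
    n_target = int(len(signal) * target_sr / orig_sr)
    out = []
    for i in range(n_target):
        pos = i * orig_sr / target_sr          # fractional position in the source
        lo = int(pos)
        hi = min(lo + 1, len(signal) - 1)
        frac = pos - lo
        out.append(signal[lo] * (1 - frac) + signal[hi] * frac)
    return out

# one second of 44.1 kHz audio becomes 16,000 samples
one_second = [0.0] * 44_100
print(len(resample_linear(one_second, orig_sr=44_100)))  # 16000
```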
## Usage
The model can be used directly (without a language model) as follows...
Using the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) library:
```python
from huggingsound import SpeechRecognitionModel
model = SpeechRecognitionModel("jonatasgrosman/wav2vec2-large-xlsr-53-english")
audio_paths = ["/path/to/file.mp3", "/path/to/another_file.wav"]
transcriptions = model.transcribe(audio_paths)
```
Writing your own inference script:
```python
import torch
import librosa
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
LANG_ID = "en"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-english"
SAMPLES = 10
test_dataset = load_dataset("common_voice", LANG_ID, split=f"test[:{SAMPLES}]")
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
    batch["speech"] = speech_array
    batch["sentence"] = batch["sentence"].upper()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
predicted_sentences = processor.batch_decode(predicted_ids)
for i, predicted_sentence in enumerate(predicted_sentences):
    print("-" * 100)
    print("Reference:", test_dataset[i]["sentence"])
    print("Prediction:", predicted_sentence)
```
| Reference | Prediction |
| ------------- | ------------- |
| "SHE'LL BE ALL RIGHT." | SHE'LL BE ALL RIGHT |
| SIX | SIX |
| "ALL'S WELL THAT ENDS WELL." | ALL AS WELL THAT ENDS WELL |
| DO YOU MEAN IT? | DO YOU MEAN IT |
| THE NEW PATCH IS LESS INVASIVE THAN THE OLD ONE, BUT STILL CAUSES REGRESSIONS. | THE NEW PATCH IS LESS INVASIVE THAN THE OLD ONE BUT STILL CAUSES REGRESSION |
| HOW IS MOZILLA GOING TO HANDLE AMBIGUITIES LIKE QUEUE AND CUE? | HOW IS MOSLILLAR GOING TO HANDLE ANDBEWOOTH HIS LIKE Q AND Q |
| "I GUESS YOU MUST THINK I'M KINDA BATTY." | RUSTIAN WASTIN PAN ONTE BATTLY |
| NO ONE NEAR THE REMOTE MACHINE YOU COULD RING? | NO ONE NEAR THE REMOTE MACHINE YOU COULD RING |
| SAUCE FOR THE GOOSE IS SAUCE FOR THE GANDER. | SAUCE FOR THE GUICE IS SAUCE FOR THE GONDER |
| GROVES STARTED WRITING SONGS WHEN SHE WAS FOUR YEARS OLD. | GRAFS STARTED WRITING SONGS WHEN SHE WAS FOUR YEARS OLD |
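Reference/prediction pairs like those above are scored with word error rate. The reported figures were produced by the standard `wer`/`cer` metrics via `eval.py`; the following is only a minimal, dependency-free sketch of the computation (edit distance over words, divided by reference length):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via word-level Levenshtein distance (sketch;
    the official scores used the standard wer/cer metrics)."""
    r, h = reference.split(), hypothesis.split()
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i                       # deletions
    for j in range(len(h) + 1):
        d[0][j] = j                       # insertions
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(r)][len(h)] / len(r)

# the '?' counts as a word difference before text normalization
print(wer("DO YOU MEAN IT?", "DO YOU MEAN IT"))  # 0.25
```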
## Evaluation
1. To evaluate on `mozilla-foundation/common_voice_6_0` with split `test`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-english --dataset mozilla-foundation/common_voice_6_0 --config en --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-english --dataset speech-recognition-community-v2/dev_data --config en --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
## Citation
If you want to cite this model, you can use:
```bibtex
@misc{grosman2021xlsr53-large-english,
title={Fine-tuned {XLSR}-53 large model for speech recognition in {E}nglish},
author={Grosman, Jonatas},
howpublished={\url{https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-english}},
year={2021}
}
``` | 5,327 | [
    [ … 768 embedding values elided … ]
] |
timm/mobilenetv3_large_100.ra_in1k | 2023-04-27T22:49:21.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1905.02244",
"license:apache-2.0",
"region:us",
"has_space"
] | image-classification | timm | null | null | timm/mobilenetv3_large_100.ra_in1k | 9 | 61,880,982 | timm | 2022-12-16T05:38:07 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for mobilenetv3_large_100.ra_in1k
A MobileNet-v3 image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RA` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 5.5
- GMACs: 0.2
- Activations (M): 4.4
- Image size: 224 x 224
- **Papers:**
- Searching for MobileNetV3: https://arxiv.org/abs/1905.02244
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
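As a quick sanity check on the stats above, the parameter count translates into an approximate checkpoint size (assuming fp32 weights at 4 bytes per parameter, ignoring buffers):

```python
# rough fp32 weight footprint from the model stats above
# (assumption: 4 bytes per parameter, no extra buffers)
params_m = 5.5                               # Params (M)
size_mb = params_m * 1e6 * 4 / (1024 ** 2)
print(f"~{size_mb:.1f} MB")                  # ~21.0 MB
```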
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('mobilenetv3_large_100.ra_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
    'mobilenetv3_large_100.ra_in1k',
    pretrained=True,
    features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
    # print shape of each feature map in output
    # e.g.:
    # torch.Size([1, 16, 112, 112])
    # torch.Size([1, 24, 56, 56])
    # torch.Size([1, 40, 28, 28])
    # torch.Size([1, 112, 14, 14])
    # torch.Size([1, 960, 7, 7])
    print(o.shape)
```
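The five shapes printed above form the usual stride-2/4/8/16/32 feature pyramid, which can be verified directly from the 224 x 224 input size and the spatial sizes of each map:

```python
# spatial reduction factors implied by the feature-map shapes above
input_size = 224
feature_sizes = [112, 56, 28, 14, 7]
strides = [input_size // s for s in feature_sizes]
print(strides)  # [2, 4, 8, 16, 32]
```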
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
    'mobilenetv3_large_100.ra_in1k',
    pretrained=True,
    num_classes=0,  # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 960, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
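The pooled `(batch_size, num_features)` embeddings produced above are typically compared with cosine similarity, e.g. for image retrieval or de-duplication. A dependency-free sketch (in practice you would compute this on the torch tensors directly):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (illustrative;
    with torch you would use torch.nn.functional.cosine_similarity)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
```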
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{howard2019searching,
title={Searching for mobilenetv3},
author={Howard, Andrew and Sandler, Mark and Chu, Grace and Chen, Liang-Chieh and Chen, Bo and Tan, Mingxing and Wang, Weijun and Zhu, Yukun and Pang, Ruoming and Vasudevan, Vijay and others},
booktitle={Proceedings of the IEEE/CVF international conference on computer vision},
pages={1314--1324},
year={2019}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,793 | [
    [ … embedding values elided (list truncated in source) … ]
] |
-0.0682373046875,
-0.0190887451171875,
-0.0266876220703125,
-0.02532958984375,
0.01502227783203125,
0.01464080810546875,
0.041107177734375,
0.0526123046875,
0.026824951171875,
0.0242767333984375,
0.04156494140625,
-0.038726806640625,
0.03717041015625,
-0.00699615478515625,
-0.019439697265625,
-0.041107177734375,
0.06951904296875,
0.00945281982421875,
0.0014753341674804688,
0.01018524169921875,
0.0126495361328125,
-0.02471923828125,
-0.04681396484375,
-0.023162841796875,
0.020263671875,
-0.04522705078125,
-0.037445068359375,
-0.046905517578125,
-0.0302886962890625,
-0.023193359375,
-0.003429412841796875,
-0.04156494140625,
-0.0242156982421875,
-0.031158447265625,
0.0236663818359375,
0.053619384765625,
0.0362548828125,
-0.01171112060546875,
0.0455322265625,
-0.049346923828125,
0.01413726806640625,
0.005275726318359375,
0.0340576171875,
-0.005184173583984375,
-0.060150146484375,
-0.0199127197265625,
-0.001285552978515625,
-0.033203125,
-0.047576904296875,
0.03759765625,
0.005706787109375,
0.0291290283203125,
0.0178680419921875,
-0.0198516845703125,
0.055145263671875,
-0.0010881423950195312,
0.045654296875,
0.041778564453125,
-0.03985595703125,
0.046051025390625,
-0.0144805908203125,
0.0187530517578125,
0.01050567626953125,
0.0294647216796875,
-0.017913818359375,
0.0095672607421875,
-0.06591796875,
-0.059326171875,
0.061187744140625,
0.00975799560546875,
-0.0012216567993164062,
0.0249786376953125,
0.056549072265625,
-0.00975799560546875,
-0.005298614501953125,
-0.0628662109375,
-0.03338623046875,
-0.0302276611328125,
-0.018280029296875,
0.013946533203125,
-0.008087158203125,
-0.0022983551025390625,
-0.052703857421875,
0.05029296875,
0.00028061866760253906,
0.057891845703125,
0.0288848876953125,
0.0038585662841796875,
0.00409698486328125,
-0.034820556640625,
0.043792724609375,
0.0190887451171875,
-0.0288848876953125,
0.0057830810546875,
0.0104827880859375,
-0.053466796875,
0.0118408203125,
0.0086517333984375,
0.0022602081298828125,
0.0023403167724609375,
0.0264129638671875,
0.06549072265625,
-0.0038318634033203125,
0.00685882568359375,
0.03326416015625,
-0.00959014892578125,
-0.040618896484375,
-0.0224761962890625,
0.0102386474609375,
-0.0029125213623046875,
0.0309295654296875,
0.032135009765625,
0.034149169921875,
-0.00897979736328125,
-0.01727294921875,
0.0206756591796875,
0.03021240234375,
-0.0215606689453125,
-0.0187530517578125,
0.05291748046875,
-0.0073089599609375,
-0.0166473388671875,
0.057281494140625,
-0.0097198486328125,
-0.0355224609375,
0.07855224609375,
0.033782958984375,
0.06781005859375,
-0.006023406982421875,
0.004734039306640625,
0.065185546875,
0.0229339599609375,
-0.007781982421875,
0.0191192626953125,
0.0156402587890625,
-0.05718994140625,
0.001251220703125,
-0.03302001953125,
0.00803375244140625,
0.033294677734375,
-0.04205322265625,
0.026275634765625,
-0.05023193359375,
-0.037811279296875,
0.0180816650390625,
0.021636962890625,
-0.06536865234375,
0.0218963623046875,
-0.01166534423828125,
0.0693359375,
-0.042694091796875,
0.059295654296875,
0.0667724609375,
-0.036651611328125,
-0.0816650390625,
0.0009984970092773438,
0.006893157958984375,
-0.0684814453125,
0.05426025390625,
0.037567138671875,
0.0008287429809570312,
0.00783538818359375,
-0.0616455078125,
-0.05157470703125,
0.10357666015625,
0.0309600830078125,
-0.00643157958984375,
0.0250244140625,
-0.0121612548828125,
0.0047607421875,
-0.035308837890625,
0.0404052734375,
0.01076507568359375,
0.0211944580078125,
0.022796630859375,
-0.05657958984375,
0.01849365234375,
-0.028717041015625,
0.013824462890625,
0.015838623046875,
-0.06365966796875,
0.0570068359375,
-0.0426025390625,
-0.00992584228515625,
0.0008807182312011719,
0.04473876953125,
0.0187530517578125,
0.0219573974609375,
0.033843994140625,
0.055145263671875,
0.037139892578125,
-0.0176849365234375,
0.0692138671875,
0.0012378692626953125,
0.03631591796875,
0.047119140625,
0.0180816650390625,
0.0460205078125,
0.026580810546875,
-0.01351165771484375,
0.0302276611328125,
0.08966064453125,
-0.0205535888671875,
0.02239990234375,
0.0142669677734375,
-0.005367279052734375,
-0.0003795623779296875,
0.007160186767578125,
-0.03466796875,
0.047515869140625,
0.008544921875,
-0.045745849609375,
-0.0103912353515625,
0.007205963134765625,
0.003932952880859375,
-0.0254364013671875,
-0.01959228515625,
0.026641845703125,
0.004669189453125,
-0.026641845703125,
0.0794677734375,
0.01947021484375,
0.06500244140625,
-0.0196533203125,
0.00443267822265625,
-0.02288818359375,
0.00757598876953125,
-0.035400390625,
-0.051666259765625,
0.021026611328125,
-0.0216522216796875,
-0.0025463104248046875,
0.007511138916015625,
0.054229736328125,
-0.00841522216796875,
-0.0229034423828125,
0.006206512451171875,
0.01525115966796875,
0.03759765625,
0.0028247833251953125,
-0.09033203125,
0.0215301513671875,
0.010345458984375,
-0.0433349609375,
0.02374267578125,
0.023040771484375,
0.005863189697265625,
0.06585693359375,
0.046112060546875,
-0.0157928466796875,
0.00998687744140625,
-0.0177154541015625,
0.061920166015625,
-0.04681396484375,
-0.015899658203125,
-0.06304931640625,
0.044403076171875,
-0.0147705078125,
-0.0450439453125,
0.04290771484375,
0.050750732421875,
0.059844970703125,
0.003101348876953125,
0.0389404296875,
-0.0236053466796875,
-0.0022735595703125,
-0.03778076171875,
0.04913330078125,
-0.0592041015625,
0.006649017333984375,
-0.005970001220703125,
-0.04937744140625,
-0.03265380859375,
0.061370849609375,
-0.0201416015625,
0.03216552734375,
0.039764404296875,
0.07977294921875,
-0.032958984375,
-0.0191802978515625,
0.004970550537109375,
-0.00022518634796142578,
-0.002838134765625,
0.024871826171875,
0.031951904296875,
-0.06768798828125,
0.0294952392578125,
-0.04052734375,
-0.013397216796875,
-0.01934814453125,
-0.056427001953125,
-0.07421875,
-0.06658935546875,
-0.040496826171875,
-0.06561279296875,
-0.01277923583984375,
0.07177734375,
0.08502197265625,
-0.04052734375,
-0.01168060302734375,
-0.0001786947250366211,
0.0172882080078125,
-0.01352691650390625,
-0.0163726806640625,
0.042694091796875,
-0.002899169921875,
-0.04595947265625,
-0.0159759521484375,
-0.0005826950073242188,
0.0313720703125,
0.01325225830078125,
-0.0173797607421875,
-0.012603759765625,
-0.026885986328125,
0.0225677490234375,
0.03643798828125,
-0.045745849609375,
-0.006343841552734375,
-0.0153656005859375,
-0.0166015625,
0.0293731689453125,
0.04119873046875,
-0.036895751953125,
0.019439697265625,
0.017547607421875,
0.0265045166015625,
0.0670166015625,
-0.0249786376953125,
0.0087738037109375,
-0.06207275390625,
0.047271728515625,
-0.01102447509765625,
0.02899169921875,
0.0301666259765625,
-0.0208892822265625,
0.0478515625,
0.0310516357421875,
-0.0304412841796875,
-0.069091796875,
-0.005889892578125,
-0.08013916015625,
-0.003696441650390625,
0.07562255859375,
-0.0204925537109375,
-0.03863525390625,
0.025726318359375,
-0.0009217262268066406,
0.0455322265625,
-0.00858306884765625,
0.033233642578125,
0.0128326416015625,
-0.01202392578125,
-0.05517578125,
-0.054962158203125,
0.035064697265625,
0.0097808837890625,
-0.044921875,
-0.039215087890625,
-0.0034084320068359375,
0.050445556640625,
0.015960693359375,
0.04669189453125,
-0.01629638671875,
0.0122833251953125,
0.00569915771484375,
0.04119873046875,
-0.0322265625,
-0.002735137939453125,
-0.0188751220703125,
-0.0017261505126953125,
-0.00922393798828125,
-0.053375244140625
]
] |
bert-base-uncased | 2023-06-30T01:42:19.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"coreml",
"onnx",
"safetensors",
"bert",
"fill-mask",
"exbert",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1810.04805",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | bert-base-uncased | 1,182 | 52,250,055 | transformers | 2022-03-02T23:29:04 | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# BERT base model (uncased)
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference
between english and English.
Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by
the Hugging Face team.
## Model description
BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it
was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
GPT which internally masks the future tokens. It allows the model to learn a bidirectional representation of the
sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
## Model variations
BERT was originally released in base and large variations, for cased and uncased input text. The uncased models also strip accent markers.
Chinese and multilingual uncased and cased versions followed shortly after.
A later release replaced subpiece masking with whole word masking in the preprocessing, adding two models.
Twenty-four smaller models were released afterward.
The detailed release history can be found on the [google-research/bert readme](https://github.com/google-research/bert/blob/master/README.md) on github.
| Model | #params | Language |
|------------------------|--------------------------------|-------|
| [`bert-base-uncased`](https://huggingface.co/bert-base-uncased) | 110M | English |
| [`bert-large-uncased`](https://huggingface.co/bert-large-uncased) | 340M | English |
| [`bert-base-cased`](https://huggingface.co/bert-base-cased) | 110M | English |
| [`bert-large-cased`](https://huggingface.co/bert-large-cased) | 340M | English |
| [`bert-base-chinese`](https://huggingface.co/bert-base-chinese) | 110M | Chinese |
| [`bert-base-multilingual-cased`](https://huggingface.co/bert-base-multilingual-cased) | 110M | Multiple |
| [`bert-large-uncased-whole-word-masking`](https://huggingface.co/bert-large-uncased-whole-word-masking) | 340M | English |
| [`bert-large-cased-whole-word-masking`](https://huggingface.co/bert-large-cased-whole-word-masking) | 340M | English |
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
fine-tuned versions of a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at a model like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'sequence': "[CLS] hello i'm a fashion model. [SEP]",
'score': 0.1073106899857521,
'token': 4827,
'token_str': 'fashion'},
{'sequence': "[CLS] hello i'm a role model. [SEP]",
'score': 0.08774490654468536,
'token': 2535,
'token_str': 'role'},
{'sequence': "[CLS] hello i'm a new model. [SEP]",
'score': 0.05338378623127937,
'token': 2047,
'token_str': 'new'},
{'sequence': "[CLS] hello i'm a super model. [SEP]",
'score': 0.04667217284440994,
'token': 3565,
'token_str': 'super'},
{'sequence': "[CLS] hello i'm a fine model. [SEP]",
'score': 0.027095865458250046,
'token': 2986,
'token_str': 'fine'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained("bert-base-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = TFBertModel.from_pretrained("bert-base-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
>>> unmasker("The man worked as a [MASK].")
[{'sequence': '[CLS] the man worked as a carpenter. [SEP]',
'score': 0.09747550636529922,
'token': 10533,
'token_str': 'carpenter'},
{'sequence': '[CLS] the man worked as a waiter. [SEP]',
'score': 0.0523831807076931,
'token': 15610,
'token_str': 'waiter'},
{'sequence': '[CLS] the man worked as a barber. [SEP]',
'score': 0.04962705448269844,
'token': 13362,
'token_str': 'barber'},
{'sequence': '[CLS] the man worked as a mechanic. [SEP]',
'score': 0.03788609802722931,
'token': 15893,
'token_str': 'mechanic'},
{'sequence': '[CLS] the man worked as a salesman. [SEP]',
'score': 0.037680890411138535,
'token': 18968,
'token_str': 'salesman'}]
>>> unmasker("The woman worked as a [MASK].")
[{'sequence': '[CLS] the woman worked as a nurse. [SEP]',
'score': 0.21981462836265564,
'token': 6821,
'token_str': 'nurse'},
{'sequence': '[CLS] the woman worked as a waitress. [SEP]',
'score': 0.1597415804862976,
'token': 13877,
'token_str': 'waitress'},
{'sequence': '[CLS] the woman worked as a maid. [SEP]',
'score': 0.1154729500412941,
'token': 10850,
'token_str': 'maid'},
{'sequence': '[CLS] the woman worked as a prostitute. [SEP]',
'score': 0.037968918681144714,
'token': 19215,
'token_str': 'prostitute'},
{'sequence': '[CLS] the woman worked as a cook. [SEP]',
'score': 0.03042375110089779,
'token': 5660,
'token_str': 'cook'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text, usually longer than a single sentence. The only constraint is that the combined length of the two
"sentences" is less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
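The 80/10/10 split above can be sketched in plain Python (illustrative only; `mask_tokens` and its explicit vocabulary argument are assumptions, not the actual pretraining code):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style masking: select ~mask_prob of positions; of those,
    80% become [MASK], 10% a random different vocab token, and 10% are
    left unchanged. Returns corrupted tokens plus per-position targets."""
    rng = random.Random(seed)
    out = list(tokens)
    labels = [None] * len(tokens)  # None = position is not a prediction target
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue
        labels[i] = tok  # the model must predict the original token here
        r = rng.random()
        if r < 0.8:
            out[i] = "[MASK]"
        elif r < 0.9:
            out[i] = rng.choice([v for v in vocab if v != tok])
        # else: leave the token as is
    return out, labels
```

Note that even the unchanged 10% still carry a prediction target, which is why the model cannot simply copy its input.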
### Pretraining
The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size
of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
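The schedule described above (linear warmup for 10,000 steps, then linear decay) can be written as a small sketch; the function name and the exact decay endpoint (reaching 0 at step one million) are assumptions:

```python
def learning_rate(step, peak_lr=1e-4, warmup_steps=10_000, total_steps=1_000_000):
    """Linear warmup from 0 to peak_lr over warmup_steps,
    then linear decay back to 0 at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)
```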
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
GLUE test results:
| Task | MNLI-(m/mm) | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | Average |
|:----:|:-----------:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:-------:|
| | 84.6/83.4 | 71.2 | 90.5 | 93.5 | 52.1 | 85.8 | 88.9 | 66.4 | 79.6 |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1810-04805,
author = {Jacob Devlin and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
Understanding},
journal = {CoRR},
volume = {abs/1810.04805},
year = {2018},
url = {http://arxiv.org/abs/1810.04805},
archivePrefix = {arXiv},
eprint = {1810.04805},
timestamp = {Tue, 30 Oct 2018 20:39:56 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=bert-base-uncased">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 10,517 | [
[
-0.010284423828125,
-0.046142578125,
0.0119476318359375,
0.023162841796875,
-0.0394287109375,
0.0003082752227783203,
-0.00923919677734375,
-0.0169677734375,
0.033599853515625,
0.041656494140625,
-0.04144287109375,
-0.03338623046875,
-0.0570068359375,
0.01056671142578125,
-0.035247802734375,
0.08563232421875,
0.020050048828125,
0.0227508544921875,
0.004886627197265625,
0.01316070556640625,
-0.030303955078125,
-0.0643310546875,
-0.05291748046875,
-0.0250091552734375,
0.03326416015625,
0.0224456787109375,
0.04949951171875,
0.04010009765625,
0.031280517578125,
0.0310821533203125,
-0.00392913818359375,
-0.005260467529296875,
-0.0206451416015625,
0.0047149658203125,
-0.000025093555450439453,
-0.0374755859375,
-0.03314208984375,
0.01495361328125,
0.0391845703125,
0.06378173828125,
-0.0007462501525878906,
0.0184478759765625,
-0.0075225830078125,
0.0416259765625,
-0.0123748779296875,
0.0263824462890625,
-0.039642333984375,
0.004497528076171875,
-0.0200042724609375,
0.0180816650390625,
-0.0297088623046875,
-0.0150604248046875,
0.01398468017578125,
-0.039093017578125,
0.0155181884765625,
0.008270263671875,
0.085693359375,
0.0057525634765625,
-0.01311492919921875,
-0.00921630859375,
-0.032073974609375,
0.061737060546875,
-0.0635986328125,
0.0176239013671875,
0.0362548828125,
0.01265716552734375,
-0.0182647705078125,
-0.07720947265625,
-0.0308074951171875,
-0.006862640380859375,
-0.0142974853515625,
0.00397491455078125,
-0.00476837158203125,
-0.004734039306640625,
0.020294189453125,
0.0338134765625,
-0.02801513671875,
-0.0003311634063720703,
-0.050628662109375,
-0.02813720703125,
0.055450439453125,
0.0029697418212890625,
0.0226593017578125,
-0.0323486328125,
-0.0209503173828125,
-0.019866943359375,
-0.0225677490234375,
0.00791168212890625,
0.04205322265625,
0.036773681640625,
-0.0186309814453125,
0.0567626953125,
-0.01336669921875,
0.046173095703125,
0.0012607574462890625,
0.0035305023193359375,
0.036224365234375,
-0.00335693359375,
-0.0292510986328125,
-0.005840301513671875,
0.07159423828125,
0.0211334228515625,
0.031341552734375,
-0.004062652587890625,
-0.025787353515625,
0.003299713134765625,
0.01971435546875,
-0.04193115234375,
-0.0300140380859375,
0.00847625732421875,
-0.040985107421875,
-0.0288848876953125,
0.031524658203125,
-0.049224853515625,
-0.005687713623046875,
-0.0044708251953125,
0.041595458984375,
-0.0290985107421875,
-0.01265716552734375,
0.013916015625,
-0.03436279296875,
0.0148773193359375,
0.0031147003173828125,
-0.0694580078125,
0.014862060546875,
0.0457763671875,
0.059844970703125,
0.02294921875,
-0.0107574462890625,
-0.0216827392578125,
-0.02008056640625,
-0.02435302734375,
0.03533935546875,
-0.025787353515625,
-0.0325927734375,
0.004444122314453125,
0.0263519287109375,
-0.00792694091796875,
-0.0198211669921875,
0.05035400390625,
-0.0333251953125,
0.045654296875,
-0.004665374755859375,
-0.0390625,
-0.0241546630859375,
0.006137847900390625,
-0.05584716796875,
0.08953857421875,
0.02386474609375,
-0.049072265625,
0.01715087890625,
-0.066650390625,
-0.047515869140625,
0.0151824951171875,
0.014007568359375,
-0.030364990234375,
0.0153045654296875,
0.013702392578125,
0.0330810546875,
-0.005146026611328125,
0.0223388671875,
-0.01172637939453125,
-0.0286712646484375,
0.0252532958984375,
-0.0180816650390625,
0.0809326171875,
0.01416015625,
-0.0231170654296875,
0.01033782958984375,
-0.058746337890625,
0.007526397705078125,
0.01922607421875,
-0.024932861328125,
-0.00971221923828125,
-0.00647735595703125,
0.024688720703125,
0.01235198974609375,
0.034637451171875,
-0.048065185546875,
0.0182952880859375,
-0.04638671875,
0.0543212890625,
0.059417724609375,
-0.012939453125,
0.01953125,
-0.0274810791015625,
0.03594970703125,
-0.0037746429443359375,
-0.0005397796630859375,
-0.01097869873046875,
-0.0596923828125,
-0.06341552734375,
-0.0265350341796875,
0.051177978515625,
0.054168701171875,
-0.04034423828125,
0.058868408203125,
-0.0004706382751464844,
-0.04443359375,
-0.0496826171875,
-0.0083160400390625,
0.0276336669921875,
0.0282440185546875,
0.023101806640625,
-0.03973388671875,
-0.06805419921875,
-0.065185546875,
-0.0186614990234375,
-0.017669677734375,
-0.0186767578125,
0.0089569091796875,
0.046783447265625,
-0.0271759033203125,
0.0577392578125,
-0.052520751953125,
-0.0311737060546875,
-0.0198516845703125,
0.0180511474609375,
0.0479736328125,
0.05291748046875,
0.0283050537109375,
-0.0428466796875,
-0.03106689453125,
-0.0241546630859375,
-0.041534423828125,
0.0019969940185546875,
0.0020847320556640625,
-0.0133209228515625,
0.017669677734375,
0.036865234375,
-0.05609130859375,
0.035247802734375,
0.027557373046875,
-0.036773681640625,
0.050048828125,
-0.027008056640625,
-0.00846099853515625,
-0.09320068359375,
0.00966644287109375,
-0.00977325439453125,
-0.0223388671875,
-0.057769775390625,
0.00142669677734375,
-0.00970458984375,
-0.0031337738037109375,
-0.04034423828125,
0.039947509765625,
-0.032958984375,
0.0014314651489257812,
0.003566741943359375,
-0.01080322265625,
0.001750946044921875,
0.03509521484375,
0.00629425048828125,
0.042755126953125,
0.040496826171875,
-0.037872314453125,
0.041107177734375,
0.031707763671875,
-0.045013427734375,
0.0032196044921875,
-0.06329345703125,
0.0163726806640625,
0.0033969879150390625,
0.004467010498046875,
-0.08306884765625,
-0.0246124267578125,
0.0207672119140625,
-0.04644775390625,
0.0155181884765625,
-0.005344390869140625,
-0.054412841796875,
-0.04620361328125,
-0.0194244384765625,
0.02593994140625,
0.0443115234375,
-0.0194091796875,
0.0302886962890625,
0.027252197265625,
-0.01367950439453125,
-0.045654296875,
-0.05731201171875,
0.01189422607421875,
-0.00872802734375,
-0.043731689453125,
0.030120849609375,
-0.004535675048828125,
-0.00710296630859375,
-0.01291656494140625,
0.007572174072265625,
-0.015380859375,
0.0020751953125,
0.01319122314453125,
0.036529541015625,
-0.015380859375,
-0.00789642333984375,
-0.01467132568359375,
-0.0081939697265625,
0.0173187255859375,
-0.0120849609375,
0.06243896484375,
-0.005706787109375,
-0.0031261444091796875,
-0.0180206298828125,
0.0275421142578125,
0.04510498046875,
-0.01013946533203125,
0.0518798828125,
0.06195068359375,
-0.0457763671875,
0.0024662017822265625,
-0.0297698974609375,
-0.013702392578125,
-0.038848876953125,
0.036865234375,
-0.0323486328125,
-0.06243896484375,
0.05841064453125,
0.0259552001953125,
-0.006069183349609375,
0.056243896484375,
0.044921875,
-0.0159912109375,
0.07904052734375,
0.042938232421875,
-0.0112152099609375,
0.03594970703125,
-0.0115814208984375,
0.0285186767578125,
-0.055511474609375,
-0.03375244140625,
-0.028228759765625,
-0.0206451416015625,
-0.0406494140625,
-0.01424407958984375,
0.0137786865234375,
0.0178070068359375,
-0.026824951171875,
0.05023193359375,
-0.049774169921875,
0.027496337890625,
0.07403564453125,
0.022003173828125,
-0.01415252685546875,
-0.0159454345703125,
-0.0198211669921875,
-0.002536773681640625,
-0.032318115234375,
-0.032073974609375,
0.0811767578125,
0.0391845703125,
0.05126953125,
0.006702423095703125,
0.043670654296875,
0.0269927978515625,
-0.0018939971923828125,
-0.05279541015625,
0.0474853515625,
-0.0325927734375,
-0.0694580078125,
-0.0284881591796875,
-0.00785064697265625,
-0.07598876953125,
0.0162506103515625,
-0.020416259765625,
-0.06500244140625,
-0.0036716461181640625,
-0.01265716552734375,
-0.0236663818359375,
0.01398468017578125,
-0.060791015625,
0.0792236328125,
-0.0229339599609375,
-0.0032806396484375,
0.01336669921875,
-0.07073974609375,
0.022369384765625,
0.0018529891967773438,
0.007770538330078125,
-0.01142120361328125,
0.019439697265625,
0.07647705078125,
-0.040008544921875,
0.078857421875,
-0.01155853271484375,
0.01031494140625,
0.00420379638671875,
-0.00550079345703125,
0.0243377685546875,
0.002399444580078125,
0.0038433074951171875,
0.0275726318359375,
0.006145477294921875,
-0.03472900390625,
-0.00922393798828125,
0.0269927978515625,
-0.05718994140625,
-0.0386962890625,
-0.04754638671875,
-0.048828125,
0.006610870361328125,
0.035247802734375,
0.046783447265625,
0.036956787109375,
-0.00913238525390625,
0.019500732421875,
0.032958984375,
-0.0242462158203125,
0.055267333984375,
0.0251922607421875,
-0.0167236328125,
-0.037261962890625,
0.04693603515625,
0.0019474029541015625,
0.00009721517562866211,
0.034393310546875,
0.01486968994140625,
-0.04376220703125,
-0.0150146484375,
-0.027191162109375,
0.0124969482421875,
-0.04278564453125,
-0.022552490234375,
-0.042388916015625,
-0.040740966796875,
-0.05364990234375,
-0.005405426025390625,
-0.01212310791015625,
-0.03851318359375,
-0.04595947265625,
-0.01071929931640625,
0.03265380859375,
0.053802490234375,
-0.01288604736328125,
0.035400390625,
-0.056396484375,
0.019073486328125,
0.022705078125,
0.0338134765625,
-0.020172119140625,
-0.057342529296875,
-0.025390625,
0.005985260009765625,
-0.01071929931640625,
-0.061798095703125,
0.054107666015625,
0.0185394287109375,
0.04046630859375,
0.036956787109375,
0.001834869384765625,
0.04449462890625,
-0.05023193359375,
0.07855224609375,
0.01788330078125,
-0.08599853515625,
0.03973388671875,
-0.0243072509765625,
0.0162506103515625,
0.0224456787109375,
0.0167236328125,
-0.04901123046875,
-0.0299835205078125,
-0.060211181640625,
-0.076416015625,
0.0625,
0.011993408203125,
0.031219482421875,
-0.00836181640625,
0.012420654296875,
0.01103973388671875,
0.029388427734375,
-0.07763671875,
-0.04071044921875,
-0.0384521484375,
-0.0250701904296875,
-0.0151214599609375,
-0.02349853515625,
-0.0014667510986328125,
-0.04693603515625,
0.05419921875,
0.011138916015625,
0.04266357421875,
0.007526397705078125,
-0.0138702392578125,
0.0096435546875,
0.01395416259765625,
0.06158447265625,
0.0341796875,
-0.0384521484375,
-0.0005054473876953125,
0.0015201568603515625,
-0.0467529296875,
-0.00127410888671875,
0.01611328125,
0.00107574462890625,
0.02264404296875,
0.044586181640625,
0.061431884765625,
0.017578125,
-0.041259765625,
0.04443359375,
0.01099395751953125,
-0.0266876220703125,
-0.041595458984375,
0.0020084381103515625,
-0.0018157958984375,
0.01030731201171875,
0.033935546875,
0.0088348388671875,
0.004398345947265625,
-0.04278564453125,
0.0301055908203125,
0.0306549072265625,
-0.0380859375,
-0.022247314453125,
0.0626220703125,
0.007106781005859375,
-0.048126220703125,
0.060150146484375,
-0.0106048583984375,
-0.06378173828125,
0.0531005859375,
0.052459716796875,
0.0694580078125,
-0.016632080078125,
0.0205535888671875,
0.032318115234375,
0.03369140625,
-0.01506805419921875,
0.031219482421875,
0.0258331298828125,
-0.065185546875,
-0.0260467529296875,
-0.054931640625,
-0.01363372802734375,
0.0206298828125,
-0.056640625,
0.01947021484375,
-0.03924560546875,
-0.017669677734375,
0.013427734375,
0.001781463623046875,
-0.049224853515625,
0.033447265625,
0.006496429443359375,
0.07720947265625,
-0.0726318359375,
0.076904296875,
0.061309814453125,
-0.04779052734375,
-0.0635986328125,
-0.0285797119140625,
-0.02301025390625,
-0.084228515625,
0.051361083984375,
0.02435302734375,
0.0265045166015625,
-0.00171661376953125,
-0.04718017578125,
-0.056427001953125,
0.055938720703125,
0.0142822265625,
-0.0300140380859375,
-0.0098876953125,
0.010040283203125,
0.043243408203125,
-0.03955078125,
0.034515380859375,
0.042999267578125,
0.03387451171875,
-0.00510406494140625,
-0.06207275390625,
0.0016393661499023438,
-0.0355224609375,
0.0017709732055664062,
0.006702423095703125,
-0.034454345703125,
0.09063720703125,
-0.00852203369140625,
0.0013093948364257812,
0.0178070068359375,
0.039794921875,
-0.0051422119140625,
0.0008530616760253906,
0.03656005859375,
0.04559326171875,
0.0518798828125,
-0.0287933349609375,
0.057464599609375,
-0.0167236328125,
0.038330078125,
0.061187744140625,
0.003803253173828125,
0.06243896484375,
0.026641845703125,
-0.0218658447265625,
0.0706787109375,
0.06561279296875,
-0.025177001953125,
0.056121826171875,
0.0173492431640625,
-0.004886627197265625,
-0.00823211669921875,
0.00879669189453125,
-0.0257568359375,
0.040679931640625,
0.0190277099609375,
-0.041259765625,
0.0017652511596679688,
-0.00794219970703125,
0.01409912109375,
-0.011322021484375,
-0.03131103515625,
0.051544189453125,
0.01316070556640625,
-0.0498046875,
0.0271453857421875,
0.0208892822265625,
0.059295654296875,
-0.045318603515625,
0.0030879974365234375,
-0.01103973388671875,
0.016632080078125,
-0.007015228271484375,
-0.064453125,
0.015167236328125,
-0.01052093505859375,
-0.032958984375,
-0.0202789306640625,
0.055938720703125,
-0.0390625,
-0.053924560546875,
0.00608062744140625,
0.0222930908203125,
0.0251312255859375,
-0.00765228271484375,
-0.0614013671875,
-0.0160369873046875,
0.00620269775390625,
-0.00971221923828125,
0.00951385498046875,
0.021575927734375,
0.0065765380859375,
0.04083251953125,
0.060821533203125,
-0.007720947265625,
0.0106201171875,
0.0031681060791015625,
0.052032470703125,
-0.072265625,
-0.057403564453125,
-0.0694580078125,
0.04779052734375,
-0.006908416748046875,
-0.041656494140625,
0.052154541015625,
0.050445556640625,
0.056640625,
-0.03472900390625,
0.042572021484375,
-0.0159759521484375,
0.042022705078125,
-0.02777099609375,
0.0621337890625,
-0.031158447265625,
-0.0033931732177734375,
-0.03179931640625,
-0.06097412109375,
-0.02630615234375,
0.0648193359375,
-0.006908416748046875,
0.00308990478515625,
0.04986572265625,
0.04339599609375,
0.007640838623046875,
-0.00942230224609375,
0.0159759521484375,
0.01175689697265625,
0.00551605224609375,
0.0291595458984375,
0.04193115234375,
-0.049102783203125,
0.0301971435546875,
-0.016357421875,
-0.004375457763671875,
-0.025634765625,
-0.0672607421875,
-0.07666015625,
-0.046661376953125,
-0.015716552734375,
-0.042205810546875,
-0.016448974609375,
0.06927490234375,
0.060028076171875,
-0.07452392578125,
-0.0243377685546875,
-0.0009102821350097656,
0.00807952880859375,
-0.0209503173828125,
-0.021697998046875,
0.03350830078125,
-0.0174102783203125,
-0.06085205078125,
0.0216827392578125,
-0.0038471221923828125,
0.00731658935546875,
-0.0112152099609375,
0.004535675048828125,
-0.0313720703125,
0.009735107421875,
0.046417236328125,
0.012603759765625,
-0.061981201171875,
-0.036468505859375,
0.00724029541015625,
-0.013153076171875,
0.006099700927734375,
0.0325927734375,
-0.04058837890625,
0.028472900390625,
0.031158447265625,
0.030670166015625,
0.043731689453125,
0.00276947021484375,
0.0537109375,
-0.0853271484375,
0.022003173828125,
0.015899658203125,
0.04022216796875,
0.0296783447265625,
-0.033294677734375,
0.03955078125,
0.03253173828125,
-0.03314208984375,
-0.06610107421875,
-0.0004992485046386719,
-0.07330322265625,
-0.0216217041015625,
0.064697265625,
-0.01198577880859375,
-0.0217132568359375,
-0.006145477294921875,
-0.0224609375,
0.0279541015625,
-0.0282440185546875,
0.052978515625,
0.069091796875,
0.00710296630859375,
-0.01279449462890625,
-0.0250244140625,
0.0297393798828125,
0.037322998046875,
-0.034423828125,
-0.027862548828125,
0.010284423828125,
0.031341552734375,
0.017578125,
0.04071044921875,
-0.0064849853515625,
0.008056640625,
0.00913238525390625,
0.0228271484375,
-0.0037994384765625,
-0.01013946533203125,
-0.0192108154296875,
0.0014715194702148438,
-0.01175689697265625,
-0.05242919921875
]
] |
distilbert-base-uncased-finetuned-sst-2-english | 2023-10-26T16:14:11.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"onnx",
"safetensors",
"distilbert",
"text-classification",
"en",
"dataset:sst2",
"dataset:glue",
"arxiv:1910.01108",
"doi:10.57967/hf/0181",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | null | null | null | distilbert-base-uncased-finetuned-sst-2-english | 331 | 41,670,892 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: apache-2.0
datasets:
- sst2
- glue
model-index:
- name: distilbert-base-uncased-finetuned-sst-2-english
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: glue
type: glue
config: sst2
split: validation
metrics:
- type: accuracy
value: 0.9105504587155964
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2YyOGMxYjY2Y2JhMjkxNjIzN2FmMjNiNmM2ZWViNGY3MTNmNWI2YzhiYjYxZTY0ZGUyN2M1NGIxZjRiMjQwZiIsInZlcnNpb24iOjF9.uui0srxV5ZHRhxbYN6082EZdwpnBgubPJ5R2-Wk8HTWqmxYE3QHidevR9LLAhidqGw6Ih93fK0goAXncld_gBg
- type: precision
value: 0.8978260869565218
name: Precision
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzgwYTYwYjA2MmM0ZTYwNDk0M2NmNTBkZmM2NGNhYzQ1OGEyN2NkNDQ3Mzc2NTQyMmZiNDJiNzBhNGVhZGUyOSIsInZlcnNpb24iOjF9.eHjLmw3K02OU69R2Au8eyuSqT3aBDHgZCn8jSzE3_urD6EUSSsLxUpiAYR4BGLD_U6-ZKcdxVo_A2rdXqvUJDA
- type: recall
value: 0.9301801801801802
name: Recall
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMGIzM2E3MTI2Mzc2MDYwNmU3ZTVjYmZmZDBkNjY4ZTc5MGY0Y2FkNDU3NjY1MmVkNmE3Y2QzMzAwZDZhOWY1NiIsInZlcnNpb24iOjF9.PUZlqmct13-rJWBXdHm5tdkXgETL9F82GNbbSR4hI8MB-v39KrK59cqzFC2Ac7kJe_DtOeUyosj34O_mFt_1DQ
- type: auc
value: 0.9716626673402374
name: AUC
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDM0YWIwZmQ4YjUwOGZmMWU2MjI1YjIxZGQ2MzNjMzRmZmYxMzZkNGFjODhlMDcyZDM1Y2RkMWZlOWQ0MWYwNSIsInZlcnNpb24iOjF9.E7GRlAXmmpEkTHlXheVkuL1W4WNjv4JO3qY_WCVsTVKiO7bUu0UVjPIyQ6g-J1OxsfqZmW3Leli1wY8vPBNNCQ
- type: f1
value: 0.9137168141592922
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMGU4MjNmOGYwZjZjMDQ1ZTkyZTA4YTc1MWYwOTM0NDM4ZWY1ZGVkNDY5MzNhYTQyZGFlNzIyZmUwMDg3NDU0NyIsInZlcnNpb24iOjF9.mW5ftkq50Se58M-jm6a2Pu93QeKa3MfV7xcBwvG3PSB_KNJxZWTCpfMQp-Cmx_EMlmI2siKOyd8akYjJUrzJCA
- type: loss
value: 0.39013850688934326
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTZiNzAyZDc0MzUzMmE1MGJiN2JlYzFiODE5ZTNlNGE4MmI4YzRiMTc2ODEzMTUwZmEzOTgxNzc4YjJjZTRmNiIsInZlcnNpb24iOjF9.VqIC7uYC-ZZ8ss9zQOlRV39YVOOLc5R36sIzCcVz8lolh61ux_5djm2XjpP6ARc6KqEnXC4ZtfNXsX2HZfrtCQ
- task:
type: text-classification
name: Text Classification
dataset:
name: sst2
type: sst2
config: default
split: train
metrics:
- type: accuracy
value: 0.9885521685548412
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2I3NzU3YzhmMDkxZTViY2M3OTY1NmI0ZTdmMDQxNjNjYzJiZmQxNzczM2E4YmExYTY5ODY0NDBkY2I4ZjNkOCIsInZlcnNpb24iOjF9.4Gtk3FeVc9sPWSqZIaeUXJ9oVlPzm-NmujnWpK2y5s1Vhp1l6Y1pK5_78wW0-NxSvQqV6qd5KQf_OAEpVAkQDA
- type: precision
value: 0.9881965062029833
name: Precision Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDdlZDMzY2I3MTAwYTljNmM4MGMyMzU2YjAzZDg1NDYwN2ZmM2Y5OWZhMjUyMGJiNjY1YmZiMzFhMDI2ODFhNyIsInZlcnNpb24iOjF9.cqmv6yBxu4St2mykRWrZ07tDsiSLdtLTz2hbqQ7Gm1rMzq9tdlkZ8MyJRxtME_Y8UaOG9rs68pV-gKVUs8wABw
- type: precision
value: 0.9885521685548412
name: Precision Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjFlYzAzNmE1YjljNjUwNzBjZjEzZDY0ZDQyMmY5ZWM2OTBhNzNjYjYzYTk1YWE1NjU3YTMxZDQwOTE1Y2FkNyIsInZlcnNpb24iOjF9.jnCHOkUHuAOZZ_ZMVOnetx__OVJCS6LOno4caWECAmfrUaIPnPNV9iJ6izRO3sqkHRmxYpWBb-27GJ4N3LU-BQ
- type: precision
value: 0.9885639626373408
name: Precision Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZGUyODFjNjBlNTE2MTY3ZDAxOGU1N2U0YjUyY2NiZjhkOGVmYThjYjBkNGU3NTRkYzkzNDQ2MmMwMjkwMWNiMyIsInZlcnNpb24iOjF9.zTNabMwApiZyXdr76QUn7WgGB7D7lP-iqS3bn35piqVTNsv3wnKjZOaKFVLIUvtBXq4gKw7N2oWxvWc4OcSNDg
- type: recall
value: 0.9886145346602994
name: Recall Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTU1YjlhODU3YTkyNTdiZDcwZGFlZDBiYjY0N2NjMGM2NTRiNjQ3MDNjNGMxOWY2ZGQ4NWU1YmMzY2UwZTI3YSIsInZlcnNpb24iOjF9.xaLPY7U-wHsJ3DDui1yyyM-xWjL0Jz5puRThy7fczal9x05eKEQ9s0a_WD-iLmapvJs0caXpV70hDe2NLcs-DA
- type: recall
value: 0.9885521685548412
name: Recall Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODE0YTU0MDBlOGY4YzU0MjY5MzA3OTk2OGNhOGVkMmU5OGRjZmFiZWI2ZjY5ODEzZTQzMTI0N2NiOTVkNDliYiIsInZlcnNpb24iOjF9.SOt1baTBbuZRrsvGcak2sUwoTrQzmNCbyV2m1_yjGsU48SBH0NcKXicidNBSnJ6ihM5jf_Lv_B5_eOBkLfNWDQ
- type: recall
value: 0.9885521685548412
name: Recall Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZWNkNmM0ZGRlNmYxYzIwNDk4OTI5MzIwZWU1NzZjZDVhMDcyNDFlMjBhNDQxODU5OWMwMWNhNGEzNjY3ZGUyOSIsInZlcnNpb24iOjF9.b15Fh70GwtlG3cSqPW-8VEZT2oy0CtgvgEOtWiYonOovjkIQ4RSLFVzVG-YfslaIyfg9RzMWzjhLnMY7Bpn2Aw
- type: f1
value: 0.9884019815052447
name: F1 Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYmM4NjQ5Yjk5ODRhYTU1MTY3MmRhZDBmODM1NTg3OTFiNWM4NDRmYjI0MzZkNmQ1MzE3MzcxODZlYzBkYTMyYSIsInZlcnNpb24iOjF9.74RaDK8nBVuGRl2Se_-hwQvP6c4lvVxGHpcCWB4uZUCf2_HoC9NT9u7P3pMJfH_tK2cpV7U3VWGgSDhQDi-UBQ
- type: f1
value: 0.9885521685548412
name: F1 Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDRmYWRmMmQ0YjViZmQxMzhhYTUyOTE1MTc0ZDU1ZjQyZjFhMDYzYzMzZDE0NzZlYzQyOTBhMTBhNmM5NTlkMiIsInZlcnNpb24iOjF9.VMn_psdAHIZTlW6GbjERZDe8MHhwzJ0rbjV_VJyuMrsdOh5QDmko-wEvaBWNEdT0cEKsbggm-6jd3Gh81PfHAQ
- type: f1
value: 0.9885546181087554
name: F1 Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjUyZWFhZDZhMGQ3MzBmYmRiNDVmN2FkZDBjMjk3ODk0OTAxNGZkMWE0NzU5ZjI0NzE0NGZiNzM0N2Y2NDYyOSIsInZlcnNpb24iOjF9.YsXBhnzEEFEW6jw3mQlFUuIrW7Gabad2Ils-iunYJr-myg0heF8NEnEWABKFE1SnvCWt-69jkLza6SupeyLVCA
- type: loss
value: 0.040652573108673096
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTc3YjU3MjdjMzkxODA5MjU5NGUyY2NkMGVhZDg3ZWEzMmU1YWVjMmI0NmU2OWEyZTkzMTVjNDZiYTc0YjIyNCIsInZlcnNpb24iOjF9.lA90qXZVYiILHMFlr6t6H81Oe8a-4KmeX-vyCC1BDia2ofudegv6Vb46-4RzmbtuKeV6yy6YNNXxXxqVak1pAg
---
# DistilBERT base uncased finetuned SST-2
## Table of Contents
- [Model Details](#model-details)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
## Model Details
**Model Description:** This model is a fine-tuned checkpoint of [DistilBERT-base-uncased](https://huggingface.co/distilbert-base-uncased), fine-tuned on SST-2.
This model reaches an accuracy of 91.3 on the dev set (for comparison, the BERT bert-base-uncased version reaches an accuracy of 92.7).
- **Developed by:** Hugging Face
- **Model Type:** Text Classification
- **Language(s):** English
- **License:** Apache-2.0
- **Parent Model:** For more details about DistilBERT, we encourage users to check out [this model card](https://huggingface.co/distilbert-base-uncased).
- **Resources for more information:**
- [Model Documentation](https://huggingface.co/docs/transformers/main/en/model_doc/distilbert#transformers.DistilBertForSequenceClassification)
- [DistilBERT paper](https://arxiv.org/abs/1910.01108)
## How to Get Started With the Model
Example of single-label classification:
```python
import torch
from transformers import DistilBertTokenizer, DistilBertForSequenceClassification
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
model = DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
with torch.no_grad():
logits = model(**inputs).logits
predicted_class_id = logits.argmax().item()
model.config.id2label[predicted_class_id]
```
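The `argmax` at the end of the snippet collapses the two class logits into a hard label. As a minimal pure-Python sketch of the same post-processing with an explicit softmax (the logit values below are hypothetical, not outputs of the real model; the label map is assumed to match `model.config.id2label`):

```python
import math

# Hypothetical raw logits for the two SST-2 classes, and the label map
# assumed to match this checkpoint's model.config.id2label.
logits = [-2.1, 3.4]
id2label = {0: "NEGATIVE", 1: "POSITIVE"}

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)
predicted_class_id = max(range(len(probs)), key=probs.__getitem__)
label = id2label[predicted_class_id]
print(label, round(probs[predicted_class_id], 4))
```

The probabilities are what the `pipeline("text-classification")` wrapper would report as the label score.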
## Uses
#### Direct Use
This model can be used for single-label text classification, in particular sentiment analysis of English text. If you need a checkpoint to fine-tune on a different downstream task, start from the parent [DistilBERT-base-uncased](https://huggingface.co/distilbert-base-uncased) model instead. See the model hub to look for fine-tuned versions on a task that interests you.
#### Misuse and Out-of-scope Use
The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
## Risks, Limitations and Biases
Based on a few experiments, we observed that this model could produce biased predictions that target underrepresented populations.
For instance, for sentences like `This film was filmed in COUNTRY`, this binary classification model will give radically different probabilities for the positive label depending on the country (0.89 if the country is France, but 0.08 if the country is Afghanistan) when nothing in the input indicates such a strong semantic shift. In this [colab](https://colab.research.google.com/gist/ageron/fb2f64fb145b4bc7c49efc97e5f114d3/biasmap.ipynb), [Aurรฉlien Gรฉron](https://twitter.com/aureliengeron) made an interesting map plotting these probabilities for each country.
<img src="https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/map.jpeg" alt="Map of positive probabilities per country." width="500"/>
We strongly advise users to thoroughly probe these aspects on their use-cases in order to evaluate the risks of this model. We recommend looking at the following bias evaluation datasets as a place to start: [WinoBias](https://huggingface.co/datasets/wino_bias), [WinoGender](https://huggingface.co/datasets/super_glue), [Stereoset](https://huggingface.co/datasets/stereoset).
## Training
#### Training Data
The authors fine-tuned the model on the Stanford Sentiment Treebank ([sst2](https://huggingface.co/datasets/sst2)) corpus.
#### Training Procedure
###### Fine-tuning hyper-parameters
- learning_rate = 1e-5
- batch_size = 32
- warmup = 600
- max_seq_length = 128
- num_train_epochs = 3.0
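The `warmup` value above is a number of optimizer steps. A sketch of the resulting learning-rate schedule, assuming the linear-warmup-then-linear-decay policy commonly used for BERT-style fine-tuning (the `total_steps` value is an illustrative estimate of ~67k SST-2 training examples / batch size 32 × 3 epochs, not a value from the card):

```python
def lr_at_step(step, peak_lr=1e-5, warmup_steps=600, total_steps=6300):
    """Linear warmup to peak_lr over warmup_steps, then linear decay to zero.

    total_steps is an illustrative estimate, not taken from the card.
    """
    if step < warmup_steps:
        # Warmup phase: ramp linearly from 0 up to peak_lr.
        return peak_lr * step / warmup_steps
    # Decay phase: ramp linearly from peak_lr down to 0 at total_steps.
    remaining = max(0.0, (total_steps - step) / (total_steps - warmup_steps))
    return peak_lr * remaining

print(lr_at_step(0))     # start of warmup
print(lr_at_step(600))   # peak learning rate
print(lr_at_step(6300))  # fully decayed
```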
| 10,458 | [
[
-0.0304412841796875,
-0.05908203125,
0.0137481689453125,
0.012725830078125,
-0.032501220703125,
-0.0002455711364746094,
-0.01410675048828125,
-0.0252838134765625,
0.007808685302734375,
0.032745361328125,
-0.04638671875,
-0.04730224609375,
-0.0693359375,
-0.01554107666015625,
-0.0091705322265625,
0.110595703125,
0.0013370513916015625,
0.01512908935546875,
-0.0000023245811462402344,
0.0007128715515136719,
-0.03240966796875,
-0.0452880859375,
-0.0287322998046875,
-0.0243072509765625,
0.0083465576171875,
0.0231475830078125,
0.045166015625,
0.01399993896484375,
0.0333251953125,
0.0234527587890625,
-0.035552978515625,
-0.01025390625,
-0.04461669921875,
-0.01134490966796875,
-0.01177215576171875,
-0.033233642578125,
-0.039031982421875,
0.0266265869140625,
0.0209503173828125,
0.048187255859375,
-0.0013904571533203125,
0.0230865478515625,
0.008209228515625,
0.043212890625,
-0.02227783203125,
0.0230255126953125,
-0.05328369140625,
-0.004344940185546875,
-0.01861572265625,
0.013153076171875,
-0.036590576171875,
-0.0226287841796875,
0.0309600830078125,
-0.0281219482421875,
0.0301971435546875,
-0.0013256072998046875,
0.08154296875,
0.023590087890625,
-0.036224365234375,
-0.0160980224609375,
-0.0360107421875,
0.05755615234375,
-0.041534423828125,
0.01537322998046875,
0.036102294921875,
0.00759124755859375,
-0.01067352294921875,
-0.05364990234375,
-0.03924560546875,
-0.01116943359375,
-0.0044097900390625,
0.024139404296875,
-0.020233154296875,
-0.003704071044921875,
0.03192138671875,
0.03045654296875,
-0.033447265625,
-0.00772857666015625,
-0.0439453125,
-0.026123046875,
0.043792724609375,
-0.006710052490234375,
0.01412200927734375,
-0.02569580078125,
-0.050262451171875,
-0.01145172119140625,
-0.030792236328125,
0.0003490447998046875,
0.034515380859375,
0.02618408203125,
-0.020721435546875,
0.047821044921875,
-0.01100921630859375,
0.03662109375,
0.030609130859375,
-0.006763458251953125,
0.043670654296875,
-0.0177459716796875,
-0.0171661376953125,
0.01084136962890625,
0.053375244140625,
0.036834716796875,
0.017303466796875,
0.00308990478515625,
0.00411224365234375,
0.01971435546875,
0.00476837158203125,
-0.08001708984375,
-0.0286865234375,
0.01617431640625,
-0.029449462890625,
-0.0404052734375,
0.01032257080078125,
-0.05352783203125,
-0.01064300537109375,
-0.017181396484375,
0.033294677734375,
-0.036407470703125,
-0.03057861328125,
0.0103912353515625,
-0.0252685546875,
0.0018939971923828125,
0.012176513671875,
-0.051025390625,
0.00907135009765625,
0.023590087890625,
0.05853271484375,
-0.0120391845703125,
-0.01264190673828125,
-0.010467529296875,
-0.0164947509765625,
-0.0111236572265625,
0.034759521484375,
-0.011627197265625,
-0.0202789306640625,
-0.0169830322265625,
0.0150299072265625,
0.0089111328125,
-0.0244903564453125,
0.0567626953125,
-0.0298919677734375,
0.0382080078125,
-0.01412200927734375,
-0.033050537109375,
-0.01287078857421875,
0.017059326171875,
-0.047210693359375,
0.08892822265625,
0.02752685546875,
-0.08697509765625,
0.035400390625,
-0.035064697265625,
-0.0281219482421875,
-0.0113677978515625,
0.00911712646484375,
-0.045379638671875,
-0.0032939910888671875,
0.005695343017578125,
0.032989501953125,
-0.00850677490234375,
0.045257568359375,
-0.0203094482421875,
-0.02337646484375,
0.01485443115234375,
-0.0328369140625,
0.09515380859375,
0.0157470703125,
-0.038818359375,
-0.00780487060546875,
-0.055908203125,
-0.0093231201171875,
0.00582122802734375,
-0.0177764892578125,
-0.031768798828125,
-0.0255889892578125,
0.03143310546875,
0.034210205078125,
0.0154266357421875,
-0.048828125,
0.01393890380859375,
-0.0271148681640625,
0.03466796875,
0.051177978515625,
-0.00472259521484375,
0.0323486328125,
-0.005401611328125,
0.0222930908203125,
0.028289794921875,
0.015167236328125,
0.014007568359375,
-0.043853759765625,
-0.0576171875,
-0.0257415771484375,
0.041656494140625,
0.046234130859375,
-0.050567626953125,
0.05218505859375,
-0.0026416778564453125,
-0.05230712890625,
-0.02777099609375,
-0.00029015541076660156,
0.035980224609375,
0.05023193359375,
0.0316162109375,
-0.026763916015625,
-0.0428466796875,
-0.060211181640625,
0.006702423095703125,
-0.0208587646484375,
0.0011844635009765625,
-0.00555419921875,
0.051513671875,
-0.029022216796875,
0.06500244140625,
-0.0517578125,
-0.0286102294921875,
-0.0167999267578125,
0.0185089111328125,
0.0224609375,
0.039581298828125,
0.046234130859375,
-0.06689453125,
-0.036102294921875,
-0.03338623046875,
-0.05487060546875,
0.0048370361328125,
0.0103607177734375,
-0.025177001953125,
0.026336669921875,
0.024261474609375,
-0.04779052734375,
0.033050537109375,
0.031646728515625,
-0.041534423828125,
0.03350830078125,
-0.00241851806640625,
-0.00780487060546875,
-0.10382080078125,
-0.0032501220703125,
0.0218353271484375,
-0.004909515380859375,
-0.049102783203125,
-0.01110076904296875,
0.0018987655639648438,
0.005313873291015625,
-0.0458984375,
0.0401611328125,
-0.028564453125,
0.0224456787109375,
-0.0191497802734375,
-0.013885498046875,
0.006175994873046875,
0.046173095703125,
0.0230560302734375,
0.0419921875,
0.04998779296875,
-0.03265380859375,
0.01343536376953125,
0.03790283203125,
-0.03558349609375,
0.04376220703125,
-0.052337646484375,
-0.0020198822021484375,
-0.0201568603515625,
0.0294952392578125,
-0.064208984375,
-0.01102447509765625,
0.0219573974609375,
-0.040008544921875,
0.042694091796875,
-0.0160980224609375,
-0.03131103515625,
-0.03662109375,
-0.02392578125,
0.016082763671875,
0.0477294921875,
-0.030181884765625,
0.022308349609375,
0.03509521484375,
-0.0091094970703125,
-0.051239013671875,
-0.07342529296875,
-0.0174713134765625,
-0.036224365234375,
-0.036224365234375,
0.0361328125,
-0.007488250732421875,
-0.00966644287109375,
-0.00982666015625,
-0.00688934326171875,
-0.002117156982421875,
-0.00005835294723510742,
0.036712646484375,
0.039459228515625,
0.0052490234375,
0.0166015625,
0.0060577392578125,
-0.01548004150390625,
-0.0026950836181640625,
-0.0162200927734375,
0.03424072265625,
-0.0265045166015625,
0.003032684326171875,
-0.0289306640625,
0.0052490234375,
0.0261688232421875,
0.000583648681640625,
0.053802490234375,
0.060882568359375,
-0.04010009765625,
0.01013946533203125,
-0.039947509765625,
-0.0203857421875,
-0.032012939453125,
0.043121337890625,
-0.0241546630859375,
-0.05706787109375,
0.032257080078125,
-0.0006074905395507812,
-0.0150604248046875,
0.058502197265625,
0.057220458984375,
-0.014190673828125,
0.0662841796875,
0.054473876953125,
-0.01311492919921875,
0.032562255859375,
-0.037322998046875,
0.0011148452758789062,
-0.0638427734375,
-0.0270538330078125,
-0.0255279541015625,
-0.0150146484375,
-0.0693359375,
-0.034759521484375,
0.0170135498046875,
0.024322509765625,
-0.035797119140625,
0.046112060546875,
-0.05279541015625,
0.0242919921875,
0.059661865234375,
0.019317626953125,
0.01050567626953125,
0.01345062255859375,
-0.005603790283203125,
-0.0147705078125,
-0.044769287109375,
-0.0439453125,
0.08917236328125,
0.053802490234375,
0.06817626953125,
-0.00920867919921875,
0.048370361328125,
0.0273895263671875,
0.0227813720703125,
-0.04156494140625,
0.02117919921875,
-0.0192108154296875,
-0.0775146484375,
-0.0248260498046875,
-0.0219879150390625,
-0.052978515625,
0.01062774658203125,
-0.0133819580078125,
-0.054229736328125,
0.02294921875,
0.00423431396484375,
-0.017333984375,
0.0155487060546875,
-0.0589599609375,
0.07366943359375,
-0.0296630859375,
-0.0304412841796875,
0.0164031982421875,
-0.0640869140625,
0.0268402099609375,
0.0006670951843261719,
0.0028896331787109375,
-0.0192108154296875,
0.0233306884765625,
0.06365966796875,
-0.0244598388671875,
0.08111572265625,
-0.0260162353515625,
0.007785797119140625,
0.04095458984375,
-0.0124969482421875,
0.029327392578125,
0.01168060302734375,
-0.01410675048828125,
0.04345703125,
0.0026950836181640625,
-0.0268096923828125,
-0.01534271240234375,
0.04443359375,
-0.0792236328125,
-0.021636962890625,
-0.0611572265625,
-0.029022216796875,
-0.00716400146484375,
0.0161285400390625,
0.052978515625,
0.017608642578125,
-0.0279388427734375,
0.006481170654296875,
0.059906005859375,
-0.0184173583984375,
0.004993438720703125,
0.0254364013671875,
-0.0016355514526367188,
-0.029388427734375,
0.056121826171875,
0.006015777587890625,
0.0179595947265625,
0.0178985595703125,
0.0210113525390625,
-0.0428466796875,
-0.0195770263671875,
-0.03887939453125,
0.0085906982421875,
-0.056549072265625,
-0.02545166015625,
-0.051849365234375,
-0.0269927978515625,
-0.041534423828125,
0.006046295166015625,
-0.0277557373046875,
-0.03662109375,
-0.033782958984375,
-0.0302734375,
0.041656494140625,
0.032073974609375,
-0.005077362060546875,
0.03521728515625,
-0.0217437744140625,
0.0146484375,
0.00865936279296875,
0.025482177734375,
-0.0298004150390625,
-0.06243896484375,
0.0011987686157226562,
0.020843505859375,
-0.03961181640625,
-0.0726318359375,
0.0196533203125,
0.00585174560546875,
0.034820556640625,
0.02764892578125,
0.0187835693359375,
0.032684326171875,
-0.02569580078125,
0.05010986328125,
0.0248260498046875,
-0.0657958984375,
0.057952880859375,
-0.0208892822265625,
0.017547607421875,
0.06439208984375,
0.054595947265625,
-0.0253753662109375,
-0.028350830078125,
-0.0606689453125,
-0.07147216796875,
0.061737060546875,
0.0325927734375,
0.01424407958984375,
0.003993988037109375,
0.017578125,
0.0142822265625,
0.02392578125,
-0.0814208984375,
-0.033172607421875,
-0.037353515625,
-0.0188446044921875,
-0.01522064208984375,
-0.033203125,
-0.007617950439453125,
-0.0372314453125,
0.074462890625,
0.0008111000061035156,
0.024139404296875,
0.0137481689453125,
-0.01001739501953125,
0.0012769699096679688,
0.00891876220703125,
0.032196044921875,
0.0306243896484375,
-0.04931640625,
0.007343292236328125,
0.0155181884765625,
-0.047332763671875,
0.00722503662109375,
0.02850341796875,
-0.0291595458984375,
0.01177215576171875,
0.01216888427734375,
0.07635498046875,
-0.00356292724609375,
-0.032196044921875,
0.042816162109375,
-0.00018918514251708984,
-0.0286712646484375,
-0.0316162109375,
-0.010345458984375,
0.0118255615234375,
0.019256591796875,
0.01334381103515625,
0.008392333984375,
0.01323699951171875,
-0.0540771484375,
0.0184173583984375,
0.0269622802734375,
-0.049285888671875,
-0.007709503173828125,
0.055877685546875,
0.0171661376953125,
-0.00295257568359375,
0.054443359375,
-0.029876708984375,
-0.055267333984375,
0.055267333984375,
0.037139892578125,
0.060699462890625,
-0.006511688232421875,
0.033233642578125,
0.049652099609375,
0.03662109375,
-0.01227569580078125,
0.00835418701171875,
0.01139068603515625,
-0.053802490234375,
-0.006404876708984375,
-0.0614013671875,
-0.01100921630859375,
0.0181427001953125,
-0.05230712890625,
0.034759521484375,
-0.0200347900390625,
-0.032989501953125,
0.00379180908203125,
0.0111236572265625,
-0.060760498046875,
0.0289764404296875,
0.0163421630859375,
0.068359375,
-0.091796875,
0.06689453125,
0.049835205078125,
-0.049652099609375,
-0.046875,
0.0017576217651367188,
0.0037689208984375,
-0.044036865234375,
0.053466796875,
0.033050537109375,
0.018707275390625,
-0.01690673828125,
-0.040313720703125,
-0.06182861328125,
0.08673095703125,
0.01335906982421875,
-0.0443115234375,
0.000010728836059570312,
0.0134429931640625,
0.055999755859375,
-0.0183563232421875,
0.0419921875,
0.0386962890625,
0.0206451416015625,
0.0240936279296875,
-0.06243896484375,
0.008270263671875,
-0.01446533203125,
0.00994110107421875,
-0.0008153915405273438,
-0.0599365234375,
0.06976318359375,
-0.0159759521484375,
-0.00174713134765625,
-0.0101165771484375,
0.048492431640625,
0.024505615234375,
0.032012939453125,
0.035369873046875,
0.053741455078125,
0.0518798828125,
-0.0216522216796875,
0.054443359375,
-0.007328033447265625,
0.045623779296875,
0.09637451171875,
-0.01329803466796875,
0.050628662109375,
0.03338623046875,
-0.0255279541015625,
0.0458984375,
0.0732421875,
-0.0177459716796875,
0.057525634765625,
0.0216064453125,
-0.0021076202392578125,
-0.004894256591796875,
0.003753662109375,
-0.044586181640625,
0.037628173828125,
0.0220184326171875,
-0.036834716796875,
-0.01267242431640625,
0.01259613037109375,
0.01256561279296875,
-0.01318359375,
-0.007266998291015625,
0.044586181640625,
0.0024890899658203125,
-0.05010986328125,
0.0345458984375,
0.00958251953125,
0.07470703125,
-0.038299560546875,
0.00894927978515625,
-0.021087646484375,
0.019866943359375,
-0.00870513916015625,
-0.057037353515625,
0.0212860107421875,
0.00847625732421875,
-0.01922607421875,
-0.01561737060546875,
0.06427001953125,
-0.040374755859375,
-0.06805419921875,
0.0174102783203125,
0.0231170654296875,
0.02392578125,
-0.0185089111328125,
-0.07366943359375,
0.00014388561248779297,
0.00864410400390625,
-0.030364990234375,
0.0216064453125,
0.0306549072265625,
-0.0111083984375,
0.032684326171875,
0.033233642578125,
-0.0095062255859375,
-0.00487518310546875,
-0.0024890899658203125,
0.06512451171875,
-0.032684326171875,
-0.02923583984375,
-0.055389404296875,
0.047088623046875,
-0.015350341796875,
-0.035064697265625,
0.0552978515625,
0.057464599609375,
0.08740234375,
-0.01511383056640625,
0.06683349609375,
-0.02679443359375,
0.0284423828125,
-0.030670166015625,
0.056732177734375,
-0.039093017578125,
-0.00547027587890625,
-0.03533935546875,
-0.0667724609375,
0.0011796951293945312,
0.05523681640625,
-0.021484375,
0.01229095458984375,
0.04522705078125,
0.0584716796875,
-0.008209228515625,
-0.004291534423828125,
0.002288818359375,
0.0205230712890625,
-0.003406524658203125,
0.035919189453125,
0.0455322265625,
-0.0560302734375,
0.03277587890625,
-0.048126220703125,
-0.0374755859375,
-0.018585205078125,
-0.06597900390625,
-0.083984375,
-0.053741455078125,
-0.0467529296875,
-0.05364990234375,
-0.0017614364624023438,
0.06304931640625,
0.057373046875,
-0.06561279296875,
-0.00139617919921875,
-0.004474639892578125,
-0.0022449493408203125,
-0.0035457611083984375,
-0.01812744140625,
0.038055419921875,
0.0008063316345214844,
-0.0692138671875,
-0.0137786865234375,
-0.00855255126953125,
0.0219573974609375,
-0.018829345703125,
-0.003627777099609375,
-0.0206451416015625,
-0.0188140869140625,
0.050537109375,
0.0030879974365234375,
-0.055206298828125,
-0.00493621826171875,
-0.003444671630859375,
-0.018951416015625,
-0.01006317138671875,
0.026580810546875,
-0.04022216796875,
0.032562255859375,
0.031036376953125,
0.0203704833984375,
0.0594482421875,
0.00209808349609375,
0.01154327392578125,
-0.05975341796875,
0.03045654296875,
0.0113067626953125,
0.0228729248046875,
0.024017333984375,
-0.040496826171875,
0.051483154296875,
0.0282440185546875,
-0.033935546875,
-0.0543212890625,
0.00565338134765625,
-0.09527587890625,
-0.0277557373046875,
0.10125732421875,
-0.0024051666259765625,
-0.01116943359375,
0.007293701171875,
-0.0308380126953125,
0.04144287109375,
-0.0261688232421875,
0.06439208984375,
0.0709228515625,
0.0007390975952148438,
0.01092529296875,
-0.042236328125,
0.04046630859375,
0.02276611328125,
-0.04693603515625,
-0.01183319091796875,
0.033447265625,
0.04925537109375,
0.0154571533203125,
0.0498046875,
-0.014129638671875,
0.0005726814270019531,
0.004970550537109375,
0.022705078125,
-0.00225067138671875,
-0.0223846435546875,
-0.0099029541015625,
-0.00858306884765625,
-0.004856109619140625,
-0.0250244140625
]
] |
openai/clip-vit-large-patch14 | 2023-09-15T15:49:35.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"clip",
"zero-shot-image-classification",
"vision",
"arxiv:2103.00020",
"arxiv:1908.04913",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | openai | null | null | openai/clip-vit-large-patch14 | 676 | 26,212,915 | transformers | 2022-03-02T23:29:05 | ---
tags:
- vision
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
---
# Model Card: CLIP
Disclaimer: This model card is taken and modified from the official CLIP repository; it can be found [here](https://github.com/openai/CLIP/blob/main/model-card.md).
## Model Details
The CLIP model was developed by researchers at OpenAI to learn about what contributes to robustness in computer vision tasks. The model was also developed to test the ability of models to generalize to arbitrary image classification tasks in a zero-shot manner. It was not developed for general model deployment - to deploy models like CLIP, researchers will first need to carefully study their capabilities in relation to the specific context they're being deployed within.
### Model Date
January 2021
### Model Type
The base model uses a ViT-L/14 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
The original implementation had two variants: one using a ResNet image encoder and the other using a Vision Transformer. This repository has the variant with the Vision Transformer.
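The contrastive objective mentioned above can be written as a symmetric cross-entropy over the image-text similarity matrix. A minimal pure-Python sketch with toy 2-D embeddings (all values illustrative; real CLIP embeddings are much higher-dimensional and the temperature is a learned parameter):

```python
import math

def normalize(v):
    """L2-normalize a vector so dot products become cosine similarities."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def clip_loss(image_embs, text_embs, temperature=0.07):
    """Symmetric cross-entropy (InfoNCE): pair (image i, text i) is the positive."""
    imgs = [normalize(v) for v in image_embs]
    txts = [normalize(v) for v in text_embs]
    n = len(imgs)
    # Cosine-similarity logits, scaled by the temperature.
    logits = [[sum(a * b for a, b in zip(imgs[i], txts[j])) / temperature
               for j in range(n)] for i in range(n)]

    def xent(row, target):
        # Cross-entropy of one row of logits against the target index.
        m = max(row)
        log_z = m + math.log(sum(math.exp(x - m) for x in row))
        return log_z - row[target]

    loss_i2t = sum(xent(logits[i], i) for i in range(n)) / n  # image -> text
    loss_t2i = sum(xent([logits[i][j] for i in range(n)], j)  # text -> image
                   for j in range(n)) / n
    return (loss_i2t + loss_t2i) / 2

# Toy batch: each image embedding is most similar to its own caption.
imgs = [[1.0, 0.1], [0.1, 1.0]]
txts = [[1.0, 0.0], [0.0, 1.0]]
matched = clip_loss(imgs, txts)
shuffled = clip_loss(imgs, [txts[1], txts[0]])
print(matched, shuffled)
```

As expected for a contrastive loss, mismatched image-caption pairings yield a higher loss than correctly matched ones.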
### Documents
- [Blog Post](https://openai.com/blog/clip/)
- [CLIP Paper](https://arxiv.org/abs/2103.00020)
### Use with Transformers
```python
from PIL import Image
import requests
from transformers import CLIPProcessor, CLIPModel
model = CLIPModel.from_pretrained("openai/clip-vit-large-patch14")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-large-patch14")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(text=["a photo of a cat", "a photo of a dog"], images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image # this is the image-text similarity score
probs = logits_per_image.softmax(dim=1) # we can take the softmax to get the label probabilities
```
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
### Out-of-Scope Use Cases
**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of performance of the model. This is because the use of artificial intelligence for tasks such as these can be premature currently given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
## Data
The model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet which tend to skew towards more developed nations, and younger, male users.
### Data Mission Statement
Our goal with building this dataset was to test out robustness and generalizability in computer vision tasks. As a result, the focus was on gathering large quantities of data from different publicly-available internet data sources. The data was gathered in a mostly non-interventionist manner. However, we only crawled websites that had policies against excessively violent and adult images and allowed us to filter out such content. We do not intend for this dataset to be used as the basis for any commercial or deployed model and will not be releasing the dataset.
## Performance and Limitations
### Performance
We have evaluated the performance of CLIP on a wide range of benchmarks across a variety of computer vision datasets, spanning tasks from OCR to texture recognition to fine-grained classification. The paper describes model performance on the following datasets:
- Food101
- CIFAR10
- CIFAR100
- Birdsnap
- SUN397
- Stanford Cars
- FGVC Aircraft
- VOC2007
- DTD
- Oxford-IIIT Pet dataset
- Caltech101
- Flowers102
- MNIST
- SVHN
- IIIT5K
- Hateful Memes
- SST-2
- UCF101
- Kinetics700
- Country211
- CLEVR Counting
- KITTI Distance
- STL-10
- RareAct
- Flickr30
- MSCOCO
- ImageNet
- ImageNet-A
- ImageNet-R
- ImageNet Sketch
- ObjectNet (ImageNet Overlap)
- Youtube-BB
- ImageNet-Vid
## Limitations
CLIP and our analysis of it have a number of limitations. CLIP currently struggles with certain tasks such as fine-grained classification and counting objects. CLIP also poses issues with regards to fairness and bias, which we discuss in the paper and briefly in the next section. Additionally, our approach to testing CLIP has an important limitation: in many cases we have used linear probes to evaluate the performance of CLIP, and there is evidence suggesting that linear probes can underestimate model performance.
### Bias and Fairness
We find that the performance of CLIP - and the specific biases it exhibits - can depend significantly on class design and the choices one makes for categories to include and exclude. We tested the risk of certain kinds of denigration with CLIP by classifying images of people from [Fairface](https://arxiv.org/abs/1908.04913) into crime-related and non-human animal categories. We found significant disparities with respect to race and gender. Additionally, we found that these disparities could shift based on how the classes were constructed. (Details captured in the Broader Impacts Section in the paper).
We also tested the performance of CLIP on gender, race and age classification using the Fairface dataset (we default to using the race categories as they are constructed in Fairface) in order to assess quality of performance across different demographics. We found accuracy >96% across all races for gender classification, with "Middle Eastern" having the highest accuracy (98.4%) and "White" the lowest (96.5%). Additionally, CLIP averaged ~93% for racial classification and ~63% for age classification. Our use of evaluations to test for gender, race and age classification as well as denigration harms is simply to evaluate performance of the model across people and surface potential risks, not to demonstrate endorsement of or enthusiasm for such tasks.
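The demographic breakdown described above boils down to computing classification accuracy separately per subgroup. A minimal sketch with made-up records (the group names and labels here are hypothetical, not Fairface results):

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Compute classification accuracy separately for each demographic group.

    records: iterable of (group, true_label, predicted_label) tuples.
    Returns {group: accuracy}.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, true_label, predicted in records:
        total[group] += 1
        if predicted == true_label:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical evaluation records: (group, ground truth, model prediction).
records = [
    ("group_a", "female", "female"),
    ("group_a", "male", "male"),
    ("group_a", "male", "female"),  # one error in group_a
    ("group_b", "female", "female"),
    ("group_b", "male", "male"),
]
per_group = accuracy_by_group(records)
```

Comparing the resulting per-group accuracies (rather than a single aggregate number) is what surfaces the disparities discussed above.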
## Feedback
### Where to send questions or comments about the model
Please use [this Google Form](https://forms.gle/Uv7afRH5dvY34ZEs9).
---
language: en
tags:
- exbert
license: mit
---
# GPT-2
You can test the model's full generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large
Pretrained model on English language using a causal language modeling (CLM) objective. It was introduced in
[this paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf)
and first released at [this page](https://openai.com/blog/better-language-models/).
Disclaimer: The team releasing GPT-2 also wrote a
[model card](https://github.com/openai/gpt-2/blob/master/model_card.md) for their model. Content from this model card
has been written by the Hugging Face team to complete the information they provided and give specific examples of bias.
## Model description
GPT-2 is a transformer model pretrained on a very large corpus of English data in a self-supervised fashion. This
means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots
of publicly available data), with an automatic process generating inputs and labels from those texts. More precisely,
it was trained to guess the next word in sentences.
Specifically, inputs are sequences of continuous text of a certain length, and the targets are the same sequence
shifted one token (word or piece of word) to the right. The model internally uses a masking mechanism to make sure the
prediction for token `i` only uses the inputs from `1` to `i` and never the future tokens.
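The masking described above can be pictured as a lower-triangular matrix: row `i` marks the positions token `i` is allowed to attend to (itself and earlier tokens only). A small illustrative sketch in plain Python, not the actual implementation inside the model:

```python
def causal_mask(seq_len):
    """Build a lower-triangular attention mask: entry [i][j] is 1 if
    token i may attend to position j (j <= i), and 0 otherwise."""
    return [[1 if j <= i else 0 for j in range(seq_len)]
            for i in range(seq_len)]

mask = causal_mask(4)
# Row 0 sees only position 0; row 3 sees positions 0 through 3.
```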
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks. The model is, however, best at what it was pretrained for: generating text from a
prompt.
This is the **smallest** version of GPT-2, with 124M parameters.
**Related Models:** [GPT-2 Large](https://huggingface.co/gpt2-large), [GPT-2 Medium](https://huggingface.co/gpt2-medium) and [GPT-2 XL](https://huggingface.co/gpt2-xl)
## Intended uses & limitations
You can use the raw model for text generation or fine-tune it to a downstream task. See the
[model hub](https://huggingface.co/models?filter=gpt2) to look for fine-tuned versions on a task that interests you.
### How to use
You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we
set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='gpt2')
>>> set_seed(42)
>>> generator("Hello, I'm a language model,", max_length=30, num_return_sequences=5)
[{'generated_text': "Hello, I'm a language model, a language for thinking, a language for expressing thoughts."},
{'generated_text': "Hello, I'm a language model, a compiler, a compiler library, I just want to know how I build this kind of stuff. I don"},
{'generated_text': "Hello, I'm a language model, and also have more than a few of your own, but I understand that they're going to need some help"},
{'generated_text': "Hello, I'm a language model, a system model. I want to know my language so that it might be more interesting, more user-friendly"},
{'generated_text': 'Hello, I\'m a language model, not a language model"\n\nThe concept of "no-tricks" comes in handy later with new'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import GPT2Tokenizer, GPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2Model.from_pretrained('gpt2')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import GPT2Tokenizer, TFGPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = TFGPT2Model.from_pretrained('gpt2')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
The training data used for this model has not been released as a dataset one can browse. We know it contains a lot of
unfiltered content from the internet, which is far from neutral. As the openAI team themselves point out in their
[model card](https://github.com/openai/gpt-2/blob/master/model_card.md#out-of-scope-use-cases):
> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don't support use-cases
> that require the generated text to be true.
>
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we do
> not recommend that they be deployed into systems that interact with humans unless the deployers first carry out a
> study of biases relevant to the intended use-case. We found no statistically significant difference in gender, race,
> and religious bias probes between 774M and 1.5B, implying all versions of GPT-2 should be approached with similar
> levels of caution around use cases that are sensitive to biases around human attributes.
Here's an example of how the model can have biased predictions:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='gpt2')
>>> set_seed(42)
>>> generator("The White man worked as a", max_length=10, num_return_sequences=5)
[{'generated_text': 'The White man worked as a mannequin for'},
{'generated_text': 'The White man worked as a maniser of the'},
{'generated_text': 'The White man worked as a bus conductor by day'},
{'generated_text': 'The White man worked as a plumber at the'},
{'generated_text': 'The White man worked as a journalist. He had'}]
>>> set_seed(42)
>>> generator("The Black man worked as a", max_length=10, num_return_sequences=5)
[{'generated_text': 'The Black man worked as a man at a restaurant'},
{'generated_text': 'The Black man worked as a car salesman in a'},
{'generated_text': 'The Black man worked as a police sergeant at the'},
{'generated_text': 'The Black man worked as a man-eating monster'},
{'generated_text': 'The Black man worked as a slave, and was'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The OpenAI team wanted to train this model on a corpus as large as possible. To build it, they scraped all the web
pages from outbound links on Reddit which received at least 3 karma. Note that all Wikipedia pages were removed from
this dataset, so the model was not trained on any part of Wikipedia. The resulting dataset (called WebText) weighs
40GB of text but has not been publicly released. You can find a list of the top 1,000 domains present in WebText
[here](https://github.com/openai/gpt-2/blob/master/domains.txt).
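The collection heuristic described above (keep pages linked from Reddit posts that earned at least 3 karma) is easy to sketch. The post structure below is hypothetical and purely illustrative, not OpenAI's actual pipeline:

```python
def select_links(posts, min_karma=3):
    """Keep outbound URLs from posts whose karma meets the threshold."""
    return [post["url"] for post in posts if post["karma"] >= min_karma]

# Hypothetical Reddit posts with outbound links and karma scores.
posts = [
    {"url": "https://example.com/a", "karma": 5},
    {"url": "https://example.com/b", "karma": 1},   # filtered out
    {"url": "https://example.com/c", "karma": 3},
]
urls = select_links(posts)
```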
## Training procedure
### Preprocessing
The texts are tokenized using a byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a
vocabulary size of 50,257. The inputs are sequences of 1024 consecutive tokens.
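To make the BPE idea concrete, here is a toy sketch of a single merge step: count adjacent symbol pairs across a corpus and merge the most frequent pair into one symbol. Real GPT-2 tokenization operates on bytes with 50,257 learned vocabulary entries; this character-level example is only an illustration:

```python
from collections import Counter

def most_frequent_pair(words):
    """Return the most frequent adjacent symbol pair across all words."""
    pairs = Counter()
    for symbols, freq in words:
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with the merged symbol."""
    merged = []
    for symbols, freq in words:
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged.append((out, freq))
    return merged

# Toy corpus: each entry is (list of symbols, word frequency).
words = [(list("lower"), 2), (list("lowest"), 1), (list("low"), 3)]
pair = most_frequent_pair(words)  # ('l','o') and ('o','w') tie at count 6
words = merge_pair(words, pair)
```

Training BPE repeats this count-and-merge loop until the vocabulary reaches the target size.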
The larger model was trained on 256 cloud TPU v3 cores. The training duration was not disclosed, nor were the exact
details of training.
## Evaluation results
The model achieves the following results without any fine-tuning (zero-shot):
| Dataset | LAMBADA | LAMBADA | CBT-CN | CBT-NE | WikiText2 | PTB | enwiki8 | text8 | WikiText103 | 1BW |
|:--------:|:-------:|:-------:|:------:|:------:|:---------:|:------:|:-------:|:------:|:-----------:|:-----:|
| (metric) | (PPL) | (ACC) | (ACC) | (ACC) | (PPL) | (PPL) | (BPB) | (BPC) | (PPL) | (PPL) |
|          | 35.13   | 45.99   | 87.65  | 83.4   | 29.41     | 65.85  | 1.16    | 1.17   | 37.50       | 75.20 |
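For reference, perplexity (PPL) is simply the exponential of the average per-token cross-entropy loss, so the table values can be read back into a loss scale. A minimal sketch:

```python
import math

def perplexity(per_token_losses):
    """Perplexity = exp(mean cross-entropy loss, in nats per token)."""
    return math.exp(sum(per_token_losses) / len(per_token_losses))

# A uniform loss of ln(35.13) nats per token corresponds to PPL 35.13,
# the zero-shot LAMBADA number reported above.
losses = [math.log(35.13)] * 4
ppl = perplexity(losses)
```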
### BibTeX entry and citation info
```bibtex
@article{radford2019language,
title={Language Models are Unsupervised Multitask Learners},
author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
year={2019}
}
```
<a href="https://huggingface.co/exbert/?model=gpt2">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
0.00014698505401611328,
-0.033538818359375,
-0.022369384765625,
-0.0309295654296875,
0.05316162109375,
-0.0264129638671875,
-0.068359375,
0.031280517578125,
0.0232391357421875,
0.010284423828125,
0.062042236328125,
0.058135986328125,
0.01235198974609375,
0.0797119140625,
0.03936767578125,
-0.0092010498046875,
0.0316162109375,
-0.03118896484375,
0.01305389404296875,
-0.06951904296875,
-0.01526641845703125,
-0.03277587890625,
-0.01007843017578125,
-0.05926513671875,
-0.0308685302734375,
0.018035888671875,
0.017333984375,
-0.035400390625,
0.037109375,
-0.047607421875,
0.02606201171875,
0.057708740234375,
0.005985260009765625,
-0.0017347335815429688,
0.00853729248046875,
-0.01529693603515625,
-0.0002799034118652344,
-0.040985107421875,
-0.03985595703125,
0.09881591796875,
0.033660888671875,
0.032012939453125,
0.0021800994873046875,
0.032867431640625,
0.002437591552734375,
0.025177001953125,
-0.039154052734375,
0.0267486572265625,
-0.0213165283203125,
-0.063720703125,
-0.0244140625,
-0.03851318359375,
-0.06903076171875,
0.0177459716796875,
-0.002300262451171875,
-0.06890869140625,
-0.004901885986328125,
0.01434326171875,
-0.01422119140625,
0.0301666259765625,
-0.0625,
0.07781982421875,
-0.0145721435546875,
-0.0272064208984375,
0.00337982177734375,
-0.05352783203125,
0.0345458984375,
-0.0011358261108398438,
0.00579833984375,
0.010345458984375,
0.0069732666015625,
0.0694580078125,
-0.047943115234375,
0.07037353515625,
-0.0227203369140625,
-0.0004489421844482422,
0.038848876953125,
-0.006763458251953125,
0.041015625,
-0.0034027099609375,
0.0026340484619140625,
0.029144287109375,
-0.0170440673828125,
-0.031768798828125,
-0.0206756591796875,
0.046600341796875,
-0.0909423828125,
-0.0278778076171875,
-0.033966064453125,
-0.036224365234375,
0.0036945343017578125,
0.0300445556640625,
0.056060791015625,
0.03045654296875,
-0.006622314453125,
-0.00734710693359375,
0.03173828125,
-0.0187225341796875,
0.03448486328125,
0.0118560791015625,
-0.009552001953125,
-0.034088134765625,
0.0660400390625,
0.01242828369140625,
0.0206451416015625,
0.0272979736328125,
0.0162506103515625,
-0.038116455078125,
-0.0276336669921875,
-0.0467529296875,
0.033660888671875,
-0.04229736328125,
-0.00782012939453125,
-0.06231689453125,
-0.02655029296875,
-0.048675537109375,
0.017822265625,
-0.0226287841796875,
-0.034515380859375,
-0.0304412841796875,
-0.01229095458984375,
0.02239990234375,
0.065185546875,
0.0032024383544921875,
0.0303802490234375,
-0.03564453125,
0.0169219970703125,
0.031005859375,
0.0284881591796875,
0.0007696151733398438,
-0.056243896484375,
-0.01425933837890625,
0.018585205078125,
-0.037017822265625,
-0.0635986328125,
0.0263519287109375,
0.006893157958984375,
0.0270538330078125,
0.0221405029296875,
-0.01470947265625,
0.033203125,
-0.03436279296875,
0.0809326171875,
0.01435089111328125,
-0.0631103515625,
0.03851318359375,
-0.0301666259765625,
0.0220947265625,
0.02569580078125,
0.019775390625,
-0.039581298828125,
-0.0216522216796875,
-0.050537109375,
-0.066162109375,
0.07012939453125,
0.0343017578125,
0.0135650634765625,
-0.005016326904296875,
0.032135009765625,
0.004787445068359375,
0.0086822509765625,
-0.08001708984375,
-0.02996826171875,
-0.040863037109375,
-0.0238037109375,
-0.015625,
-0.03387451171875,
0.0050811767578125,
-0.0159912109375,
0.061187744140625,
-0.0033359527587890625,
0.049652099609375,
0.00977325439453125,
-0.01934814453125,
0.003559112548828125,
0.012237548828125,
0.050262451171875,
0.04022216796875,
-0.006237030029296875,
0.00897216796875,
0.007465362548828125,
-0.05419921875,
0.0005121231079101562,
0.0174713134765625,
-0.035797119140625,
0.0010805130004882812,
0.0278778076171875,
0.08660888671875,
-0.013153076171875,
-0.0283203125,
0.047882080078125,
0.00905609130859375,
-0.021759033203125,
-0.028961181640625,
-0.0006647109985351562,
0.00553131103515625,
0.01026153564453125,
0.017913818359375,
-0.0037593841552734375,
-0.0063018798828125,
-0.040191650390625,
0.01206207275390625,
0.02490234375,
-0.02978515625,
-0.039215087890625,
0.07501220703125,
0.006557464599609375,
-0.02197265625,
0.0623779296875,
-0.037017822265625,
-0.049713134765625,
0.04296875,
0.0546875,
0.07305908203125,
-0.00983428955078125,
0.02197265625,
0.049530029296875,
0.032379150390625,
-0.017333984375,
0.01483154296875,
0.015533447265625,
-0.054595947265625,
-0.0390625,
-0.048126220703125,
0.0031757354736328125,
0.0357666015625,
-0.0287017822265625,
0.022735595703125,
-0.0279693603515625,
-0.0236358642578125,
-0.005878448486328125,
0.005031585693359375,
-0.0606689453125,
0.018646240234375,
0.006175994873046875,
0.05767822265625,
-0.07000732421875,
0.07080078125,
0.0521240234375,
-0.05267333984375,
-0.07177734375,
0.00965118408203125,
-0.0017976760864257812,
-0.06536865234375,
0.04901123046875,
0.0238037109375,
0.03106689453125,
0.004093170166015625,
-0.03955078125,
-0.0662841796875,
0.087890625,
0.0225830078125,
-0.0282440185546875,
-0.0204010009765625,
0.0265655517578125,
0.04644775390625,
-0.00917816162109375,
0.053863525390625,
0.04302978515625,
0.037353515625,
-0.01335906982421875,
-0.0789794921875,
0.021728515625,
-0.0254058837890625,
0.0142822265625,
0.0168914794921875,
-0.058746337890625,
0.08966064453125,
-0.0184783935546875,
-0.01197052001953125,
0.00658416748046875,
0.042633056640625,
0.008331298828125,
0.0028705596923828125,
0.031280517578125,
0.0504150390625,
0.052215576171875,
-0.023101806640625,
0.097412109375,
-0.02490234375,
0.052978515625,
0.08135986328125,
0.0054473876953125,
0.048614501953125,
0.023193359375,
-0.0216522216796875,
0.0362548828125,
0.047088623046875,
-0.01110076904296875,
0.042205810546875,
0.005023956298828125,
0.0028228759765625,
0.001972198486328125,
-0.0016756057739257812,
-0.031158447265625,
0.0296630859375,
0.00765228271484375,
-0.04022216796875,
-0.0089263916015625,
-0.00250244140625,
0.03302001953125,
-0.0220489501953125,
-0.00409698486328125,
0.055328369140625,
0.007049560546875,
-0.06597900390625,
0.0511474609375,
0.023681640625,
0.05902099609375,
-0.044281005859375,
0.01258087158203125,
-0.01058197021484375,
0.0197906494140625,
-0.01107025146484375,
-0.059661865234375,
0.01342010498046875,
0.00861358642578125,
-0.0251312255859375,
-0.01215362548828125,
0.0552978515625,
-0.042755126953125,
-0.032958984375,
0.020721435546875,
0.0333251953125,
0.023681640625,
-0.0171966552734375,
-0.056549072265625,
-0.01434326171875,
0.01403045654296875,
-0.034912109375,
0.0294952392578125,
0.02099609375,
-0.005584716796875,
0.028656005859375,
0.049713134765625,
0.006587982177734375,
0.0019588470458984375,
0.0077056884765625,
0.056610107421875,
-0.044403076171875,
-0.03466796875,
-0.07073974609375,
0.04498291015625,
-0.0069732666015625,
-0.0430908203125,
0.050506591796875,
0.04949951171875,
0.07843017578125,
-0.01134490966796875,
0.07781982421875,
-0.0280609130859375,
0.03387451171875,
-0.03143310546875,
0.0582275390625,
-0.03948974609375,
-0.0054473876953125,
-0.0185546875,
-0.0771484375,
-0.0038623809814453125,
0.048431396484375,
-0.0253448486328125,
0.0269927978515625,
0.05682373046875,
0.066650390625,
-0.00882720947265625,
-0.006984710693359375,
0.001216888427734375,
0.0290985107421875,
0.0251312255859375,
0.0511474609375,
0.034820556640625,
-0.0496826171875,
0.053253173828125,
-0.0267181396484375,
-0.0245361328125,
-0.004924774169921875,
-0.045318603515625,
-0.0787353515625,
-0.050079345703125,
-0.0148773193359375,
-0.044403076171875,
0.0020351409912109375,
0.06475830078125,
0.051483154296875,
-0.06646728515625,
-0.017364501953125,
-0.0182647705078125,
-0.006923675537109375,
-0.0117950439453125,
-0.021881103515625,
0.037567138671875,
-0.0181427001953125,
-0.05859375,
-0.0011777877807617188,
-0.00978851318359375,
0.0152587890625,
-0.0199127197265625,
-0.01337432861328125,
-0.016571044921875,
-0.01430511474609375,
0.0411376953125,
0.01568603515625,
-0.05206298828125,
-0.027740478515625,
-0.002899169921875,
-0.01580810546875,
-0.00455474853515625,
0.0511474609375,
-0.04278564453125,
0.0190582275390625,
0.039154052734375,
0.032196044921875,
0.045379638671875,
-0.00853729248046875,
0.0333251953125,
-0.055816650390625,
0.016357421875,
-0.000751495361328125,
0.0252532958984375,
0.0281524658203125,
-0.0308990478515625,
0.04632568359375,
0.036895751953125,
-0.04754638671875,
-0.05328369140625,
0.01349639892578125,
-0.05731201171875,
-0.0213165283203125,
0.10992431640625,
-0.0090179443359375,
-0.01275634765625,
-0.0022792816162109375,
-0.01425933837890625,
0.048583984375,
-0.028839111328125,
0.051361083984375,
0.05181884765625,
0.01250457763671875,
-0.006809234619140625,
-0.053466796875,
0.043365478515625,
0.0236053466796875,
-0.052001953125,
0.00379180908203125,
0.016326904296875,
0.045257568359375,
0.01727294921875,
0.05682373046875,
-0.01038360595703125,
0.00321197509765625,
0.0033664703369140625,
0.0164337158203125,
-0.007709503173828125,
-0.012603759765625,
-0.01180267333984375,
0.0037403106689453125,
-0.00577545166015625,
-0.0106964111328125
]
] |
tiiuae/falcon-7b-instruct | 2023-09-29T14:32:23.000Z | [
"transformers",
"pytorch",
"coreml",
"falcon",
"text-generation",
"custom_code",
"en",
"dataset:tiiuae/falcon-refinedweb",
"arxiv:2205.14135",
"arxiv:1911.02150",
"arxiv:2005.14165",
"arxiv:2104.09864",
"arxiv:2306.01116",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | tiiuae | null | null | tiiuae/falcon-7b-instruct | 710 | 15,487,847 | transformers | 2023-04-25T06:21:01 | ---
datasets:
- tiiuae/falcon-refinedweb
language:
- en
inference: true
widget:
- text: "Hey Falcon! Any recommendations for my holidays in Abu Dhabi?"
example_title: "Abu Dhabi Trip"
- text: "What's the Everett interpretation of quantum mechanics?"
example_title: "Q/A: Quantum & Answers"
- text: "Give me a list of the top 10 dive sites you would recommend around the world."
example_title: "Diving Top 10"
- text: "Can you tell me more about deep-water soloing?"
example_title: "Extreme sports"
- text: "Can you write a short tweet about the Apache 2.0 release of our latest AI model, Falcon LLM?"
example_title: "Twitter Helper"
- text: "What are the responsibilities of a Chief Llama Officer?"
example_title: "Trendy Jobs"
license: apache-2.0
---
# Falcon-7B-Instruct
**Falcon-7B-Instruct is a 7B parameters causal decoder-only model built by [TII](https://www.tii.ae) based on [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b) and finetuned on a mixture of chat/instruct datasets. It is made available under the Apache 2.0 license.**
*Paper coming soon.*
To get started with Falcon (inference, finetuning, quantization, etc.), we recommend reading [this great blog post from HF](https://huggingface.co/blog/falcon)!
## Why use Falcon-7B-Instruct?
* **You are looking for a ready-to-use chat/instruct model based on [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).**
* **Falcon-7B is a strong base model, outperforming comparable open-source models** (e.g., [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1) etc.), thanks to being trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
* **It features an architecture optimized for inference**, with FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135)) and multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)).
**This is an instruct model, which may not be ideal for further finetuning.** If you are interested in building your own instruct/chat model, we recommend starting from [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).
**Looking for an even more powerful model?** [Falcon-40B-Instruct](https://huggingface.co/tiiuae/falcon-40b-instruct) is Falcon-7B-Instruct's big brother!
```python
from transformers import AutoTokenizer
import transformers
import torch

model = "tiiuae/falcon-7b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Girafatron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```
**Falcon LLMs require PyTorch 2.0 for use with `transformers`!**
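A small guard along these lines can catch an outdated install early. The helper below is an illustrative sketch, not part of the Falcon tooling:

```python
# Minimal check that the installed PyTorch meets Falcon's 2.0 requirement.
# Pass torch.__version__ to it before loading the model.
def meets_requirement(version: str, minimum: tuple = (2, 0)) -> bool:
    # Strip local-version suffixes like "+cu118" before comparing major/minor.
    parts = tuple(int(p) for p in version.split("+")[0].split(".")[:2])
    return parts >= minimum

print(meets_requirement("2.1.0+cu118"), meets_requirement("1.13.1"))
```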
For fast inference with Falcon, check out [Text Generation Inference](https://github.com/huggingface/text-generation-inference)! Read more in this [blog post](https://huggingface.co/blog/falcon).
You will need **at least 16GB of memory** to swiftly run inference with Falcon-7B-Instruct.
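The 16GB figure can be sanity-checked with a back-of-the-envelope estimate: ~7e9 parameters at 2 bytes each in bfloat16, plus an allowance for activations and the KV cache. The 20% overhead below is an assumed, illustrative figure, not a measured one:

```python
# Rough inference-memory estimate for a model held in bfloat16 (2 bytes/param),
# with an assumed 20% overhead for activations and the KV cache.
def inference_memory_gib(n_params: float, bytes_per_param: int = 2, overhead: float = 0.2) -> float:
    return n_params * bytes_per_param * (1 + overhead) / 1024**3

print(f"~{inference_memory_gib(7e9):.1f} GiB")  # in line with the 16GB guidance
```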
# Model Card for Falcon-7B-Instruct
## Model Details
### Model Description
- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English and French;
- **License:** Apache 2.0;
- **Finetuned from model:** [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).
### Model Source
- **Paper:** *coming soon*.
## Uses
### Direct Use
Falcon-7B-Instruct has been finetuned on a mixture of instruct and chat datasets.
### Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
Falcon-7B-Instruct is mostly trained on English data, and will not generalize appropriately to other languages. Furthermore, as it was trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
### Recommendations
We recommend that users of Falcon-7B-Instruct develop guardrails and take appropriate precautions for any production use.
## How to Get Started with the Model
```python
from transformers import AutoTokenizer
import transformers
import torch

model = "tiiuae/falcon-7b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Girafatron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```
## Training Details
### Training Data
Falcon-7B-Instruct was finetuned on a 250M tokens mixture of instruct/chat datasets.
| **Data source** | **Fraction** | **Tokens** | **Description** |
|--------------------|--------------|------------|-----------------------------------|
| [Baize](https://github.com/project-baize/baize-chatbot) | 65% | 164M | chat |
| [GPT4All](https://github.com/nomic-ai/gpt4all) | 25% | 62M | instruct |
| [GPTeacher](https://github.com/teknium1/GPTeacher) | 5% | 11M | instruct |
| [RefinedWeb-English](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) | 5% | 13M | massive web crawl |
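The table's fractions can be cross-checked against the token counts. The script below simply re-derives the percentages from the per-source counts; small rounding differences against the stated fractions are expected:

```python
# Sanity-check the data-mixture table: per-source token counts (in millions)
# should sum to the stated 250M and roughly match the listed fractions.
sources = {"Baize": 164, "GPT4All": 62, "GPTeacher": 11, "RefinedWeb-English": 13}
total = sum(sources.values())
print(f"total: {total}M")
for name, tokens in sources.items():
    print(f"{name}: {tokens / total:.1%}")
```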
The data was tokenized with the Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b) tokenizer.
## Evaluation
*Paper coming soon.*
See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results.
Note that this model variant is not optimized for NLP benchmarks.
## Technical Specifications
For more information about pretraining, see [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).
### Model Architecture and Objective
Falcon-7B is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
The architecture is broadly adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), with the following differences:
* **Positional embeddings:** rotary ([Su et al., 2021](https://arxiv.org/abs/2104.09864));
* **Attention:** multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135));
* **Decoder-block:** parallel attention/MLP with a single layer norm.
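As a rough illustration of the rotary scheme, here is a minimal NumPy sketch applying position embeddings to a single head. It is a sketch only: real implementations, including Falcon's, operate on batched multi-head tensors and cache the cos/sin tables across forward passes:

```python
import numpy as np

def rotary_embed(x: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Apply rotary position embeddings to queries/keys of shape (seq_len, head_dim)."""
    seq_len, dim = x.shape
    half = dim // 2
    # One rotation frequency per channel pair, decaying geometrically.
    inv_freq = 1.0 / base ** (np.arange(half) / half)
    angles = np.outer(np.arange(seq_len), inv_freq)  # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # Rotate each (x1_i, x2_i) pair by its position-dependent angle.
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)

q = rotary_embed(np.random.randn(8, 64))  # head_dim = 64, as in Falcon-7B
```

Because each channel pair is rotated, the per-position norm of a query or key is preserved; only relative phase between positions changes.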
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|-----------|----------------------------------------|
| Layers | 32 | |
| `d_model` | 4544 | Increased to compensate for multiquery |
| `head_dim` | 64 | Reduced to optimise for FlashAttention |
| Vocabulary | 65024 | |
| Sequence length | 2048 | |
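These hyperparameters are enough for a rough parameter count, assuming multiquery attention (a single shared K/V head) and a 4x`d_model` MLP, and ignoring biases and layer norms. The breakdown below is an illustrative approximation, not Falcon's exact accounting:

```python
# Rough parameter count from the hyperparameters above.
d_model, n_layers, vocab, head_dim = 4544, 32, 65024, 64
attn = d_model * d_model + 2 * head_dim * d_model + d_model * d_model  # Q, shared K/V, output proj
mlp = 2 * d_model * (4 * d_model)                                      # up- and down-projections
total = n_layers * (attn + mlp) + vocab * d_model                      # + token embeddings
print(f"~{total / 1e9:.1f}B parameters")
```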
### Compute Infrastructure
#### Hardware
Falcon-7B-Instruct was trained on AWS SageMaker, on 32 A100 40GB GPUs in P4d instances.
#### Software
Falcon-7B-Instruct was trained on a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.).
## Citation
*Paper coming soon*. In the meantime, you can use the following information to cite:
```
@article{falcon40b,
title={{Falcon-40B}: an open large language model with state-of-the-art performance},
author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme},
year={2023}
}
```
To learn more about the pretraining dataset, see the [RefinedWeb paper](https://arxiv.org/abs/2306.01116).
```
@article{refinedweb,
title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
journal={arXiv preprint arXiv:2306.01116},
eprint={2306.01116},
eprinttype = {arXiv},
url={https://arxiv.org/abs/2306.01116},
year={2023}
}
```
## License
Falcon-7B-Instruct is made available under the Apache 2.0 license.
## Contact
falconllm@tii.ae | 9,798 | [
[
-0.035675048828125,
-0.07257080078125,
0.005641937255859375,
0.02783203125,
-0.00731658935546875,
-0.007244110107421875,
-0.00921630859375,
-0.034698486328125,
0.01654052734375,
0.0285797119140625,
-0.0340576171875,
-0.036224365234375,
-0.056793212890625,
0.005584716796875,
-0.029632568359375,
0.07391357421875,
0.0185546875,
-0.01132965087890625,
0.01439666748046875,
0.0033054351806640625,
-0.0215606689453125,
-0.040740966796875,
-0.0723876953125,
-0.0012559890747070312,
0.02923583984375,
0.0176544189453125,
0.044677734375,
0.06719970703125,
0.050567626953125,
0.0301361083984375,
-0.019012451171875,
0.018310546875,
-0.043182373046875,
-0.01453399658203125,
-0.0011606216430664062,
-0.0210113525390625,
-0.0201873779296875,
-0.002376556396484375,
0.056304931640625,
0.035980224609375,
0.0012912750244140625,
0.0177459716796875,
-0.0020427703857421875,
0.037139892578125,
-0.0462646484375,
0.041900634765625,
-0.042388916015625,
-0.0084228515625,
-0.01849365234375,
0.01047515869140625,
-0.03887939453125,
0.006755828857421875,
-0.02191162109375,
-0.060791015625,
0.0172271728515625,
0.0190277099609375,
0.09228515625,
0.0252685546875,
-0.0286712646484375,
-0.019683837890625,
-0.031280517578125,
0.0516357421875,
-0.06573486328125,
0.0311737060546875,
0.01358795166015625,
0.0288848876953125,
-0.0307464599609375,
-0.081298828125,
-0.039825439453125,
-0.0136260986328125,
-0.0034885406494140625,
0.0240631103515625,
-0.01026153564453125,
0.00844573974609375,
0.035003662109375,
0.01308441162109375,
-0.0306243896484375,
0.00055694580078125,
-0.037689208984375,
-0.01514434814453125,
0.043548583984375,
-0.0008268356323242188,
0.018218994140625,
-0.0235748291015625,
-0.0252227783203125,
-0.0226287841796875,
-0.0274810791015625,
0.018310546875,
0.0296783447265625,
0.027923583984375,
-0.0215606689453125,
0.03570556640625,
-0.022186279296875,
0.043060302734375,
0.035491943359375,
-0.005428314208984375,
0.03033447265625,
-0.0252227783203125,
-0.03240966796875,
0.0027713775634765625,
0.08831787109375,
0.014678955078125,
0.006198883056640625,
-0.0086669921875,
0.0016908645629882812,
0.002864837646484375,
0.0086669921875,
-0.072998046875,
0.01103973388671875,
0.0172576904296875,
-0.040191650390625,
-0.023712158203125,
0.0269317626953125,
-0.052642822265625,
-0.005260467529296875,
0.0091094970703125,
0.01480865478515625,
-0.037384033203125,
-0.0297088623046875,
0.0190582275390625,
-0.01189422607421875,
0.01439666748046875,
-0.004276275634765625,
-0.06085205078125,
0.0141754150390625,
0.04730224609375,
0.06640625,
0.01045989990234375,
-0.048614501953125,
-0.05609130859375,
0.003551483154296875,
-0.0200653076171875,
0.043060302734375,
-0.03656005859375,
-0.0239715576171875,
-0.007511138916015625,
0.023681640625,
-0.02777099609375,
-0.01201629638671875,
0.0628662109375,
-0.0278778076171875,
0.020751953125,
-0.0263214111328125,
-0.048248291015625,
-0.030487060546875,
-0.0022220611572265625,
-0.04266357421875,
0.07257080078125,
-0.0035610198974609375,
-0.08270263671875,
0.01532745361328125,
-0.067138671875,
-0.0206756591796875,
-0.0160980224609375,
-0.0005135536193847656,
-0.03369140625,
-0.01169586181640625,
0.034454345703125,
0.04852294921875,
-0.0267791748046875,
0.037933349609375,
-0.04949951171875,
-0.044647216796875,
-0.0006413459777832031,
-0.0174407958984375,
0.0665283203125,
0.041046142578125,
-0.042388916015625,
0.011138916015625,
-0.042816162109375,
-0.018280029296875,
0.015655517578125,
0.0002598762512207031,
0.01230621337890625,
-0.0013065338134765625,
0.0026988983154296875,
0.0211944580078125,
0.004810333251953125,
-0.043609619140625,
0.004596710205078125,
-0.046661376953125,
0.0445556640625,
0.0294342041015625,
-0.0017299652099609375,
0.027740478515625,
-0.037933349609375,
0.02752685546875,
0.036224365234375,
0.0259246826171875,
-0.0191650390625,
-0.045867919921875,
-0.07440185546875,
-0.0222625732421875,
0.008941650390625,
0.03021240234375,
-0.054046630859375,
0.035064697265625,
-0.010986328125,
-0.04864501953125,
-0.03509521484375,
-0.01666259765625,
0.037841796875,
0.04986572265625,
0.037628173828125,
0.010650634765625,
-0.049224853515625,
-0.060089111328125,
-0.005825042724609375,
-0.020751953125,
0.0218353271484375,
0.01041412353515625,
0.04400634765625,
-0.0250701904296875,
0.0477294921875,
-0.020538330078125,
-0.019012451171875,
-0.018096923828125,
0.005260467529296875,
0.026031494140625,
0.040374755859375,
0.058929443359375,
-0.039093017578125,
-0.022552490234375,
-0.004833221435546875,
-0.07080078125,
-0.005161285400390625,
-0.01468658447265625,
-0.026397705078125,
0.034393310546875,
0.044677734375,
-0.058837890625,
0.027801513671875,
0.0240325927734375,
-0.0262451171875,
0.02777099609375,
0.00159454345703125,
0.014801025390625,
-0.0970458984375,
0.0160369873046875,
0.011566162109375,
0.00780487060546875,
-0.03564453125,
0.0146636962890625,
-0.00020802021026611328,
-0.00234222412109375,
-0.0482177734375,
0.059844970703125,
-0.040191650390625,
-0.00006866455078125,
-0.007175445556640625,
-0.006175994873046875,
-0.0118560791015625,
0.050262451171875,
0.0058746337890625,
0.06207275390625,
0.044647216796875,
-0.03021240234375,
0.0020694732666015625,
0.029327392578125,
-0.002063751220703125,
0.0085601806640625,
-0.063232421875,
0.0018815994262695312,
-0.0085296630859375,
0.0290679931640625,
-0.065185546875,
-0.0198516845703125,
0.03997802734375,
-0.053009033203125,
0.024017333984375,
-0.01824951171875,
-0.0306854248046875,
-0.0421142578125,
-0.016387939453125,
0.0019092559814453125,
0.038299560546875,
-0.0421142578125,
0.035491943359375,
0.0201416015625,
0.00835418701171875,
-0.072998046875,
-0.046630859375,
0.00370025634765625,
-0.022003173828125,
-0.0621337890625,
0.02197265625,
-0.00128936767578125,
0.00499725341796875,
-0.0046844482421875,
0.0118560791015625,
0.006603240966796875,
0.004299163818359375,
0.042388916015625,
0.01398468017578125,
-0.0220489501953125,
-0.00563812255859375,
0.0100555419921875,
-0.00787353515625,
0.0052032470703125,
-0.0226593017578125,
0.03643798828125,
-0.047149658203125,
-0.0214080810546875,
-0.03411865234375,
0.02734375,
0.041534423828125,
-0.015899658203125,
0.06500244140625,
0.07916259765625,
-0.02484130859375,
0.006832122802734375,
-0.04986572265625,
-0.0088043212890625,
-0.03863525390625,
0.0333251953125,
-0.035247802734375,
-0.065673828125,
0.052215576171875,
0.0177764892578125,
0.004119873046875,
0.06634521484375,
0.035858154296875,
0.00894927978515625,
0.08367919921875,
0.0247802734375,
-0.0103759765625,
0.03533935546875,
-0.0399169921875,
0.0010805130004882812,
-0.05670166015625,
-0.0175323486328125,
-0.0517578125,
-0.006343841552734375,
-0.050079345703125,
-0.01497650146484375,
-0.0004413127899169922,
0.02490234375,
-0.06683349609375,
0.0190277099609375,
-0.04718017578125,
0.0153961181640625,
0.04541015625,
-0.0005278587341308594,
-0.0009937286376953125,
-0.004177093505859375,
-0.014617919921875,
0.019012451171875,
-0.0673828125,
-0.04193115234375,
0.0799560546875,
0.0290679931640625,
0.0474853515625,
-0.004787445068359375,
0.0645751953125,
-0.0018291473388671875,
0.0232391357421875,
-0.03717041015625,
0.038421630859375,
-0.00937652587890625,
-0.0390625,
-0.008026123046875,
-0.0406494140625,
-0.07525634765625,
0.0080718994140625,
-0.01232147216796875,
-0.0626220703125,
0.00380706787109375,
-0.004119873046875,
-0.007465362548828125,
0.0228424072265625,
-0.07611083984375,
0.07159423828125,
-0.000980377197265625,
-0.0250396728515625,
0.0124053955078125,
-0.05712890625,
0.0440673828125,
0.0032329559326171875,
0.01715087890625,
0.0020923614501953125,
0.00640869140625,
0.072265625,
-0.0440673828125,
0.0635986328125,
-0.0279541015625,
0.03497314453125,
0.036956787109375,
-0.0207061767578125,
0.049041748046875,
0.01052093505859375,
-0.0167083740234375,
0.0284423828125,
0.0213623046875,
-0.029449462890625,
-0.03594970703125,
0.063720703125,
-0.09161376953125,
-0.0474853515625,
-0.04278564453125,
-0.037322998046875,
-0.007549285888671875,
0.0245361328125,
0.030548095703125,
0.02606201171875,
0.00457763671875,
0.027252197265625,
0.01390838623046875,
-0.02630615234375,
0.054473876953125,
0.026641845703125,
-0.0186767578125,
-0.03765869140625,
0.0560302734375,
0.005359649658203125,
0.0012187957763671875,
0.025665283203125,
0.0181121826171875,
-0.0509033203125,
-0.035400390625,
-0.03851318359375,
0.034698486328125,
-0.04937744140625,
-0.0232086181640625,
-0.07147216796875,
-0.043609619140625,
-0.04638671875,
-0.00799560546875,
-0.027923583984375,
-0.018951416015625,
-0.046417236328125,
-0.00023245811462402344,
0.0340576171875,
0.040496826171875,
0.0023899078369140625,
0.037506103515625,
-0.0650634765625,
0.00859832763671875,
-0.00960540771484375,
0.0140533447265625,
0.0085296630859375,
-0.0501708984375,
-0.0177459716796875,
0.036041259765625,
-0.0291290283203125,
-0.0499267578125,
0.03717041015625,
0.0196380615234375,
0.053924560546875,
0.03057861328125,
0.01092529296875,
0.058380126953125,
-0.0134429931640625,
0.06060791015625,
0.0183563232421875,
-0.06768798828125,
0.02569580078125,
-0.0390625,
0.0191497802734375,
0.02490234375,
0.0286102294921875,
-0.03131103515625,
-0.03955078125,
-0.06927490234375,
-0.034912109375,
0.069091796875,
0.0301513671875,
-0.00445556640625,
-0.0224151611328125,
0.0306854248046875,
-0.012451171875,
-0.0003235340118408203,
-0.036041259765625,
-0.0156097412109375,
-0.0546875,
-0.0291595458984375,
-0.01277923583984375,
-0.0034198760986328125,
0.01824951171875,
-0.0197601318359375,
0.062042236328125,
-0.01116943359375,
0.05206298828125,
0.01316070556640625,
-0.01453399658203125,
0.0102386474609375,
-0.006771087646484375,
0.052490234375,
0.0291900634765625,
-0.0202789306640625,
-0.003582000732421875,
0.00457000732421875,
-0.04742431640625,
0.0033931732177734375,
0.0301361083984375,
-0.013336181640625,
-0.0102386474609375,
0.030975341796875,
0.07952880859375,
0.009735107421875,
-0.027099609375,
0.0325927734375,
-0.00835418701171875,
-0.021697998046875,
-0.004901885986328125,
0.020233154296875,
0.0201416015625,
0.026611328125,
0.016876220703125,
-0.00714111328125,
0.00957489013671875,
-0.01763916015625,
0.01320648193359375,
0.0149078369140625,
-0.0197296142578125,
-0.01605224609375,
0.07818603515625,
0.0142822265625,
-0.0171966552734375,
0.040679931640625,
-0.0267181396484375,
-0.0308685302734375,
0.06658935546875,
0.04986572265625,
0.06683349609375,
0.00582122802734375,
0.021484375,
0.052276611328125,
0.0187530517578125,
-0.015716552734375,
0.0161285400390625,
0.018829345703125,
-0.049224853515625,
-0.033782958984375,
-0.054107666015625,
-0.0178680419921875,
0.00867462158203125,
-0.0382080078125,
0.0290374755859375,
-0.03521728515625,
-0.019256591796875,
0.01849365234375,
0.024871826171875,
-0.05126953125,
0.0122222900390625,
-0.00893402099609375,
0.068603515625,
-0.039276123046875,
0.06329345703125,
0.05206298828125,
-0.061737060546875,
-0.08380126953125,
-0.0194091796875,
-0.0063629150390625,
-0.065185546875,
0.054168701171875,
0.0292816162109375,
0.0024662017822265625,
0.0202484130859375,
-0.037261962890625,
-0.06463623046875,
0.07757568359375,
0.0308380126953125,
-0.04071044921875,
-0.004703521728515625,
0.01464080810546875,
0.033111572265625,
-0.0300140380859375,
0.060211181640625,
0.026123046875,
0.035797119140625,
0.030364990234375,
-0.058349609375,
0.0162200927734375,
-0.0421142578125,
0.005466461181640625,
0.00756072998046875,
-0.07525634765625,
0.064453125,
-0.0174102783203125,
-0.01198577880859375,
-0.0028438568115234375,
0.06439208984375,
0.0251007080078125,
0.016448974609375,
0.02764892578125,
0.035888671875,
0.048126220703125,
-0.00917816162109375,
0.0732421875,
-0.044158935546875,
0.04510498046875,
0.0699462890625,
0.00203704833984375,
0.053985595703125,
0.018524169921875,
-0.00003427267074584961,
0.0171966552734375,
0.06658935546875,
-0.0025577545166015625,
0.017486572265625,
-0.00827789306640625,
0.0124969482421875,
-0.010406494140625,
-0.0022411346435546875,
-0.047882080078125,
0.03631591796875,
0.01995849609375,
-0.02459716796875,
-0.0115509033203125,
-0.0031585693359375,
0.0289764404296875,
-0.0250396728515625,
-0.005229949951171875,
0.04144287109375,
0.002475738525390625,
-0.0589599609375,
0.07391357421875,
0.0109405517578125,
0.0596923828125,
-0.044586181640625,
0.0085601806640625,
-0.033660888671875,
0.0160980224609375,
-0.0122528076171875,
-0.045806884765625,
0.033172607421875,
-0.004978179931640625,
-0.00208282470703125,
0.0035114288330078125,
0.051177978515625,
-0.0220489501953125,
-0.054351806640625,
0.0180206298828125,
0.0201873779296875,
0.0181427001953125,
-0.017730712890625,
-0.0655517578125,
0.0288238525390625,
-0.01019287109375,
-0.026397705078125,
0.0158538818359375,
0.02008056640625,
-0.004291534423828125,
0.0565185546875,
0.0572509765625,
-0.0098419189453125,
0.01467132568359375,
-0.00012171268463134766,
0.057525634765625,
-0.0570068359375,
-0.035247802734375,
-0.050384521484375,
0.03338623046875,
-0.01311492919921875,
-0.029998779296875,
0.05584716796875,
0.04718017578125,
0.057952880859375,
-0.004364013671875,
0.05078125,
-0.00909423828125,
0.0213623046875,
-0.03497314453125,
0.059844970703125,
-0.038665771484375,
0.005992889404296875,
-0.0296173095703125,
-0.054290771484375,
-0.015167236328125,
0.045623779296875,
-0.013275146484375,
0.0181884765625,
0.058746337890625,
0.07952880859375,
-0.00848388671875,
0.0224609375,
0.01320648193359375,
0.0305023193359375,
0.03875732421875,
0.055633544921875,
0.05712890625,
-0.058135986328125,
0.05096435546875,
-0.0204620361328125,
-0.0124359130859375,
-0.0209197998046875,
-0.061553955078125,
-0.08990478515625,
-0.05145263671875,
-0.020751953125,
-0.0310211181640625,
0.010711669921875,
0.0653076171875,
0.058258056640625,
-0.04571533203125,
-0.0189971923828125,
-0.0172271728515625,
0.003116607666015625,
-0.0210418701171875,
-0.0159454345703125,
0.039093017578125,
-0.043548583984375,
-0.05682373046875,
0.01084136962890625,
0.002685546875,
0.007526397705078125,
-0.004665374755859375,
-0.01873779296875,
-0.031951904296875,
0.00199127197265625,
0.04315185546875,
0.02374267578125,
-0.061553955078125,
-0.0297088623046875,
0.0189208984375,
-0.01052093505859375,
-0.000621795654296875,
0.018280029296875,
-0.03997802734375,
0.0208282470703125,
0.032196044921875,
0.053375244140625,
0.06591796875,
-0.00589752197265625,
0.0174407958984375,
-0.018829345703125,
0.0306854248046875,
-0.01068115234375,
0.03582763671875,
0.01219940185546875,
-0.0292816162109375,
0.043060302734375,
0.0316162109375,
-0.0401611328125,
-0.05438232421875,
-0.0177764892578125,
-0.0943603515625,
-0.008514404296875,
0.0994873046875,
-0.0162200927734375,
-0.03350830078125,
0.009735107421875,
-0.0335693359375,
0.041107177734375,
-0.046539306640625,
0.04510498046875,
0.0428466796875,
0.0050048828125,
-0.01334381103515625,
-0.0251007080078125,
0.0275726318359375,
0.0057373046875,
-0.0723876953125,
-0.0177001953125,
0.026885986328125,
0.022369384765625,
-0.006137847900390625,
0.040863037109375,
0.006320953369140625,
0.008575439453125,
0.0191497802734375,
-0.00325775146484375,
-0.045684814453125,
-0.01885986328125,
0.002288818359375,
0.0139007568359375,
-0.02117919921875,
-0.030975341796875
]
] |
xlm-roberta-base | 2023-04-07T12:46:17.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"onnx",
"safetensors",
"xlm-roberta",
"fill-mask",
"exbert",
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh",
"arxiv:1911.02116",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | xlm-roberta-base | 406 | 12,048,443 | transformers | 2022-03-02T23:29:04 | ---
tags:
- exbert
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
license: mit
---
# XLM-RoBERTa (base-sized model)
XLM-RoBERTa model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper [Unsupervised Cross-lingual Representation Learning at Scale](https://arxiv.org/abs/1911.02116) by Conneau et al. and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/xlmr).
Disclaimer: The team releasing XLM-RoBERTa did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
XLM-RoBERTa is a multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process generating inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa model as inputs.
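As an illustration of the last point, once sentence features have been extracted, any standard classifier can be trained on top of them. The sketch below is only illustrative: the random 768-dimensional vectors stand in for real pooled XLM-RoBERTa outputs (the actual feature-extraction step is shown in the Usage section below), and the from-scratch logistic regression is an assumption, not part of the model itself.

```python
import numpy as np

# Random 768-dim vectors standing in for pooled XLM-RoBERTa sentence features.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 768))       # 200 "sentences"
labels = (features[:, 0] > 0).astype(float)  # toy binary labels

# A from-scratch logistic-regression classifier trained with gradient descent.
w, b = np.zeros(768), 0.0
for _ in range(500):
    logits = features @ w + b
    probs = 1.0 / (1.0 + np.exp(-np.clip(logits, -30, 30)))
    w -= 0.1 * features.T @ (probs - labels) / len(labels)
    b -= 0.1 * float(np.mean(probs - labels))

accuracy = float(np.mean(((features @ w + b) > 0) == (labels > 0.5)))
```

In practice you would replace the random `features` with the model's pooled hidden states and use a library classifier, but the training loop is conceptually the same.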
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlm-roberta) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
## Usage
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='xlm-roberta-base')
>>> unmasker("Hello I'm a <mask> model.")
[{'score': 0.10563907772302628,
'sequence': "Hello I'm a fashion model.",
'token': 54543,
'token_str': 'fashion'},
{'score': 0.08015287667512894,
'sequence': "Hello I'm a new model.",
'token': 3525,
'token_str': 'new'},
{'score': 0.033413201570510864,
'sequence': "Hello I'm a model model.",
'token': 3299,
'token_str': 'model'},
{'score': 0.030217764899134636,
'sequence': "Hello I'm a French model.",
'token': 92265,
'token_str': 'French'},
{'score': 0.026436051353812218,
'sequence': "Hello I'm a sexy model.",
'token': 17473,
'token_str': 'sexy'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('xlm-roberta-base')
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")
# prepare input
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
# forward pass
output = model(**encoded_input)
```
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1911-02116,
author = {Alexis Conneau and
Kartikay Khandelwal and
Naman Goyal and
Vishrav Chaudhary and
Guillaume Wenzek and
Francisco Guzm{\'{a}}n and
Edouard Grave and
Myle Ott and
Luke Zettlemoyer and
Veselin Stoyanov},
title = {Unsupervised Cross-lingual Representation Learning at Scale},
journal = {CoRR},
volume = {abs/1911.02116},
year = {2019},
url = {http://arxiv.org/abs/1911.02116},
eprinttype = {arXiv},
eprint = {1911.02116},
timestamp = {Mon, 11 Nov 2019 18:38:09 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1911-02116.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=xlm-roberta-base">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 5,238 | [
[
-0.03326416015625,
-0.056610107421875,
0.01509857177734375,
0.005535125732421875,
-0.015625,
-0.0003008842468261719,
-0.0286407470703125,
-0.029022216796875,
0.01404571533203125,
0.044036865234375,
-0.033782958984375,
-0.04351806640625,
-0.05340576171875,
0.0161285400390625,
-0.0308837890625,
0.08746337890625,
-0.0021820068359375,
0.004909515380859375,
0.0024776458740234375,
-0.0160980224609375,
-0.016448974609375,
-0.061309814453125,
-0.03533935546875,
-0.025726318359375,
0.029876708984375,
0.009765625,
0.041534423828125,
0.04595947265625,
0.0161285400390625,
0.031524658203125,
-0.0153961181640625,
0.01290130615234375,
-0.020111083984375,
-0.0008029937744140625,
0.00270843505859375,
-0.04583740234375,
-0.03643798828125,
0.0177154541015625,
0.0487060546875,
0.05511474609375,
0.00916290283203125,
0.02178955078125,
0.0092315673828125,
0.0269775390625,
-0.0136260986328125,
0.0220489501953125,
-0.04052734375,
0.012786865234375,
-0.0163726806640625,
0.0069427490234375,
-0.032928466796875,
-0.007694244384765625,
0.01163482666015625,
-0.0223846435546875,
0.0163726806640625,
0.01317596435546875,
0.09234619140625,
-0.000644683837890625,
-0.0247344970703125,
-0.013885498046875,
-0.043914794921875,
0.08099365234375,
-0.050567626953125,
0.032745361328125,
0.0172576904296875,
0.003055572509765625,
0.006046295166015625,
-0.0679931640625,
-0.040679931640625,
-0.0197601318359375,
-0.031463623046875,
0.00725555419921875,
-0.036956787109375,
-0.018402099609375,
0.023406982421875,
0.03131103515625,
-0.060760498046875,
-0.0016679763793945312,
-0.032928466796875,
-0.0177459716796875,
0.04083251953125,
0.000942230224609375,
0.03131103515625,
-0.0380859375,
-0.031524658203125,
-0.033050537109375,
-0.0364990234375,
0.00991058349609375,
0.0257110595703125,
0.03240966796875,
-0.025909423828125,
0.03802490234375,
0.007694244384765625,
0.056610107421875,
0.01213836669921875,
0.0025920867919921875,
0.04156494140625,
-0.0208892822265625,
-0.022979736328125,
-0.017974853515625,
0.09234619140625,
-0.004116058349609375,
0.0186920166015625,
-0.007965087890625,
-0.01153564453125,
-0.007366180419921875,
0.0024871826171875,
-0.054229736328125,
-0.020965576171875,
0.015228271484375,
-0.04229736328125,
-0.0156097412109375,
0.01457977294921875,
-0.0518798828125,
0.01263427734375,
-0.02392578125,
0.047882080078125,
-0.037109375,
-0.0211181640625,
-0.007720947265625,
-0.0015153884887695312,
0.0019350051879882812,
-0.00299835205078125,
-0.057830810546875,
0.01456451416015625,
0.023406982421875,
0.0638427734375,
-0.00592803955078125,
-0.0231170654296875,
-0.032989501953125,
-0.0208587646484375,
-0.0175933837890625,
0.035614013671875,
-0.02935791015625,
-0.01097869873046875,
-0.00905609130859375,
0.02496337890625,
-0.01320648193359375,
-0.0379638671875,
0.02874755859375,
-0.0250091552734375,
0.03839111328125,
0.00865936279296875,
-0.0251922607421875,
-0.029296875,
0.0088348388671875,
-0.049041748046875,
0.09307861328125,
0.020477294921875,
-0.050445556640625,
0.015960693359375,
-0.04302978515625,
-0.023712158203125,
-0.0129547119140625,
-0.0008401870727539062,
-0.056182861328125,
-0.00476837158203125,
0.031097412109375,
0.03997802734375,
-0.0209197998046875,
0.0112762451171875,
-0.01123809814453125,
-0.004978179931640625,
0.02935791015625,
-0.0198974609375,
0.0880126953125,
0.024810791015625,
-0.036590576171875,
0.0125885009765625,
-0.06304931640625,
0.01556396484375,
0.0140228271484375,
-0.0159454345703125,
-0.0189666748046875,
-0.029296875,
0.0261993408203125,
0.023193359375,
0.016265869140625,
-0.0296630859375,
0.0041656494140625,
-0.04052734375,
0.039459228515625,
0.037261962890625,
-0.0204925537109375,
0.037933349609375,
-0.0207672119140625,
0.044281005859375,
0.01374053955078125,
0.006587982177734375,
-0.027587890625,
-0.042083740234375,
-0.06292724609375,
-0.0234527587890625,
0.049468994140625,
0.0426025390625,
-0.038726806640625,
0.0506591796875,
-0.0118255615234375,
-0.045166015625,
-0.0521240234375,
0.01515960693359375,
0.041717529296875,
0.0270233154296875,
0.0377197265625,
-0.031097412109375,
-0.05426025390625,
-0.053955078125,
-0.01373291015625,
0.001590728759765625,
-0.006290435791015625,
0.0277252197265625,
0.04388427734375,
-0.0219268798828125,
0.06768798828125,
-0.03350830078125,
-0.033660888671875,
-0.044647216796875,
0.0264129638671875,
0.0281982421875,
0.0457763671875,
0.050689697265625,
-0.05859375,
-0.057281494140625,
-0.0021915435791015625,
-0.048126220703125,
-0.00727081298828125,
-0.0019702911376953125,
-0.00749969482421875,
0.042510986328125,
0.033782958984375,
-0.045074462890625,
0.031463623046875,
0.045867919921875,
-0.020294189453125,
0.0201263427734375,
-0.02447509765625,
-0.0017757415771484375,
-0.09820556640625,
0.0114898681640625,
0.002475738525390625,
-0.0252685546875,
-0.048065185546875,
0.0017328262329101562,
0.006420135498046875,
-0.01316070556640625,
-0.024139404296875,
0.04766845703125,
-0.061431884765625,
-0.00019598007202148438,
-0.006999969482421875,
0.028411865234375,
0.00762939453125,
0.0521240234375,
0.0163116455078125,
0.0306549072265625,
0.050811767578125,
-0.033203125,
0.0214996337890625,
0.0240631103515625,
-0.0248260498046875,
0.0204315185546875,
-0.0472412109375,
0.01209259033203125,
0.0024433135986328125,
0.01285552978515625,
-0.06671142578125,
0.007228851318359375,
0.0235595703125,
-0.046417236328125,
0.03814697265625,
-0.0272979736328125,
-0.03802490234375,
-0.031951904296875,
-0.0066070556640625,
0.0295257568359375,
0.055328369140625,
-0.037261962890625,
0.0548095703125,
0.032562255859375,
-0.01035308837890625,
-0.042205810546875,
-0.06011962890625,
0.008880615234375,
-0.0183258056640625,
-0.04742431640625,
0.037109375,
-0.00576019287109375,
0.0001614093780517578,
-0.0017004013061523438,
0.0157012939453125,
0.005062103271484375,
-0.0070343017578125,
0.017852783203125,
0.0243377685546875,
-0.01406097412109375,
-0.004673004150390625,
-0.0167999267578125,
-0.022796630859375,
-0.0038738250732421875,
-0.0289764404296875,
0.06707763671875,
-0.00673675537109375,
-0.005191802978515625,
-0.025238037109375,
0.029541015625,
0.025146484375,
-0.03692626953125,
0.0509033203125,
0.07476806640625,
-0.0245208740234375,
-0.01328277587890625,
-0.0278778076171875,
-0.0152130126953125,
-0.0318603515625,
0.04193115234375,
-0.0275726318359375,
-0.0606689453125,
0.050018310546875,
0.0177459716796875,
-0.00865936279296875,
0.049041748046875,
0.051025390625,
0.010955810546875,
0.08642578125,
0.053192138671875,
-0.0026874542236328125,
0.03704833984375,
-0.049468994140625,
0.026214599609375,
-0.0731201171875,
-0.0224151611328125,
-0.0455322265625,
-0.0147705078125,
-0.06365966796875,
-0.043487548828125,
0.0206756591796875,
0.00872039794921875,
-0.01045989990234375,
0.053436279296875,
-0.045013427734375,
0.0017232894897460938,
0.058929443359375,
0.01151275634765625,
0.008880615234375,
0.005565643310546875,
-0.025665283203125,
-0.005523681640625,
-0.05340576171875,
-0.0256195068359375,
0.0885009765625,
0.0262908935546875,
0.053436279296875,
0.0006847381591796875,
0.056610107421875,
-0.0025424957275390625,
0.01322174072265625,
-0.04730224609375,
0.037261962890625,
-0.019989013671875,
-0.054473876953125,
-0.0200653076171875,
-0.0394287109375,
-0.08245849609375,
0.0174560546875,
-0.022552490234375,
-0.064697265625,
0.0159149169921875,
0.00029659271240234375,
-0.019805908203125,
0.0261993408203125,
-0.04168701171875,
0.0687255859375,
-0.0230865478515625,
-0.0199432373046875,
0.0032024383544921875,
-0.052154541015625,
0.012908935546875,
-0.006443023681640625,
0.01226806640625,
0.0107421875,
0.015838623046875,
0.0594482421875,
-0.038787841796875,
0.0697021484375,
0.003017425537109375,
0.0006284713745117188,
0.018524169921875,
-0.0045013427734375,
0.031463623046875,
-0.0060882568359375,
0.00972747802734375,
0.035400390625,
-0.00525665283203125,
-0.017578125,
-0.037445068359375,
0.046783447265625,
-0.07330322265625,
-0.045074462890625,
-0.045196533203125,
-0.047027587890625,
0.009429931640625,
0.0221099853515625,
0.0350341796875,
0.043548583984375,
-0.0014829635620117188,
0.01593017578125,
0.043212890625,
-0.032989501953125,
0.038848876953125,
0.03350830078125,
-0.031097412109375,
-0.038543701171875,
0.052764892578125,
0.0225067138671875,
0.01544189453125,
0.046142578125,
0.01535797119140625,
-0.033721923828125,
-0.034576416015625,
-0.032196044921875,
0.0212249755859375,
-0.0478515625,
-0.020263671875,
-0.0777587890625,
-0.036773681640625,
-0.051971435546875,
0.00701141357421875,
-0.017425537109375,
-0.038909912109375,
-0.0305633544921875,
0.0019092559814453125,
0.040985107421875,
0.0543212890625,
-0.020782470703125,
0.0146484375,
-0.053558349609375,
0.0196380615234375,
0.01885986328125,
0.00494384765625,
-0.004085540771484375,
-0.06915283203125,
-0.030548095703125,
0.006801605224609375,
-0.0272979736328125,
-0.051971435546875,
0.0653076171875,
0.0118255615234375,
0.04400634765625,
0.021453857421875,
-0.001605987548828125,
0.052459716796875,
-0.0291900634765625,
0.055755615234375,
0.0134735107421875,
-0.0738525390625,
0.0406494140625,
-0.00545501708984375,
0.01800537109375,
0.0022125244140625,
0.036773681640625,
-0.042999267578125,
-0.0390625,
-0.058807373046875,
-0.0770263671875,
0.0697021484375,
0.022003173828125,
0.0216064453125,
0.00047898292541503906,
0.014678955078125,
0.001735687255859375,
0.006072998046875,
-0.088134765625,
-0.046875,
-0.032562255859375,
-0.0294647216796875,
-0.0205078125,
-0.0110321044921875,
-0.0018911361694335938,
-0.0302734375,
0.051971435546875,
-0.0029773712158203125,
0.033966064453125,
0.0198211669921875,
-0.0323486328125,
-0.0012836456298828125,
0.00746917724609375,
0.03594970703125,
0.0328369140625,
-0.01617431640625,
0.006183624267578125,
0.01287841796875,
-0.03436279296875,
-0.00487518310546875,
0.028533935546875,
-0.015167236328125,
0.015869140625,
0.0260772705078125,
0.06951904296875,
0.021728515625,
-0.0311279296875,
0.034271240234375,
0.00943756103515625,
-0.01236724853515625,
-0.032073974609375,
0.0047607421875,
0.00534820556640625,
0.0227203369140625,
0.033447265625,
0.0031642913818359375,
-0.00962066650390625,
-0.0574951171875,
0.0274810791015625,
0.03814697265625,
-0.035125732421875,
-0.02197265625,
0.063720703125,
-0.01444244384765625,
-0.0279998779296875,
0.039398193359375,
-0.0092620849609375,
-0.0565185546875,
0.04962158203125,
0.048492431640625,
0.068359375,
-0.010772705078125,
0.0181884765625,
0.0472412109375,
0.020751953125,
0.0040435791015625,
0.0025501251220703125,
0.005626678466796875,
-0.054595947265625,
-0.017181396484375,
-0.0576171875,
-0.0038394927978515625,
0.01593017578125,
-0.0462646484375,
0.0248260498046875,
-0.024200439453125,
-0.0179443359375,
0.0023040771484375,
0.01788330078125,
-0.056396484375,
0.0229949951171875,
0.004817962646484375,
0.053955078125,
-0.0633544921875,
0.0679931640625,
0.052703857421875,
-0.05987548828125,
-0.0762939453125,
-0.018768310546875,
-0.00963592529296875,
-0.07049560546875,
0.06988525390625,
0.01065826416015625,
0.0233154296875,
0.004291534423828125,
-0.030059814453125,
-0.07818603515625,
0.08477783203125,
0.00952911376953125,
-0.038970947265625,
0.001220703125,
0.0271148681640625,
0.042327880859375,
-0.0491943359375,
0.04779052734375,
0.0228271484375,
0.033721923828125,
-0.0017232894897460938,
-0.06536865234375,
0.015625,
-0.0283050537109375,
0.0095977783203125,
0.005687713623046875,
-0.057769775390625,
0.0953369140625,
-0.01297760009765625,
-0.005290985107421875,
0.0189666748046875,
0.043853759765625,
0.01056671142578125,
-0.0007882118225097656,
0.0310516357421875,
0.051055908203125,
0.046539306640625,
-0.02471923828125,
0.0697021484375,
-0.02862548828125,
0.048828125,
0.071044921875,
0.00487518310546875,
0.056121826171875,
0.0175323486328125,
-0.0178070068359375,
0.0565185546875,
0.049224853515625,
-0.0245513916015625,
0.0323486328125,
0.0070648193359375,
0.00688934326171875,
-0.01419830322265625,
0.016693115234375,
-0.023895263671875,
0.041839599609375,
0.007232666015625,
-0.0516357421875,
-0.00966644287109375,
0.00989532470703125,
0.0268402099609375,
-0.0013370513916015625,
-0.0103302001953125,
0.045654296875,
0.0193634033203125,
-0.04730224609375,
0.056610107421875,
0.0092620849609375,
0.052764892578125,
-0.04443359375,
0.00525665283203125,
-0.023529052734375,
0.018035888671875,
-0.00995635986328125,
-0.044403076171875,
0.00861358642578125,
0.005157470703125,
-0.019683837890625,
-0.0235595703125,
0.03424072265625,
-0.057769775390625,
-0.0576171875,
0.034454345703125,
0.03363037109375,
0.0166015625,
-0.00013506412506103516,
-0.07275390625,
0.004520416259765625,
0.007144927978515625,
-0.033721923828125,
0.0301055908203125,
0.0430908203125,
-0.00449371337890625,
0.047119140625,
0.0526123046875,
0.00867462158203125,
0.0084991455078125,
0.004817962646484375,
0.05511474609375,
-0.05914306640625,
-0.031402587890625,
-0.058258056640625,
0.0513916015625,
-0.003753662109375,
-0.0236968994140625,
0.06732177734375,
0.04620361328125,
0.06317138671875,
-0.007740020751953125,
0.054412841796875,
-0.0178070068359375,
0.03753662109375,
-0.03814697265625,
0.0699462890625,
-0.053863525390625,
0.01445770263671875,
-0.0267181396484375,
-0.06689453125,
-0.0267486572265625,
0.05926513671875,
-0.01175689697265625,
0.028472900390625,
0.0555419921875,
0.0704345703125,
-0.00852203369140625,
-0.0306854248046875,
0.02642822265625,
0.04052734375,
0.01175689697265625,
0.039642333984375,
0.034515380859375,
-0.056182861328125,
0.0556640625,
-0.027008056640625,
-0.0160675048828125,
-0.0177001953125,
-0.06231689453125,
-0.08343505859375,
-0.06494140625,
-0.032562255859375,
-0.034515380859375,
-0.013214111328125,
0.07135009765625,
0.06317138671875,
-0.06689453125,
-0.0211639404296875,
0.0018186569213867188,
0.01038360595703125,
-0.0183563232421875,
-0.023345947265625,
0.045318603515625,
-0.030426025390625,
-0.0816650390625,
0.006877899169921875,
0.007678985595703125,
0.01444244384765625,
-0.027740478515625,
-0.0034046173095703125,
-0.0210418701171875,
0.00135040283203125,
0.0380859375,
0.015411376953125,
-0.04913330078125,
-0.01690673828125,
0.00669097900390625,
-0.00948333740234375,
0.0197296142578125,
0.035797119140625,
-0.06317138671875,
0.0225982666015625,
0.03253173828125,
0.0157318115234375,
0.054412841796875,
-0.0229339599609375,
0.044036865234375,
-0.05487060546875,
0.0218505859375,
0.005031585693359375,
0.040985107421875,
0.031463623046875,
-0.016265869140625,
0.0280609130859375,
0.021759033203125,
-0.035491943359375,
-0.065185546875,
0.004291534423828125,
-0.07904052734375,
-0.0200653076171875,
0.07794189453125,
-0.027008056640625,
-0.02783203125,
-0.004302978515625,
-0.01203155517578125,
0.03460693359375,
-0.01288604736328125,
0.05511474609375,
0.03814697265625,
0.00658416748046875,
-0.0380859375,
-0.024139404296875,
0.0372314453125,
0.0237579345703125,
-0.0426025390625,
-0.0031414031982421875,
0.0023365020751953125,
0.03900146484375,
0.0292510986328125,
0.027252197265625,
-0.024078369140625,
-0.0031070709228515625,
-0.01299285888671875,
0.0194549560546875,
0.000965118408203125,
-0.01177978515625,
-0.0210723876953125,
0.008453369140625,
-0.019775390625,
-0.0038604736328125
]
] |
distilbert-base-uncased | 2023-08-18T14:59:41.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"safetensors",
"distilbert",
"fill-mask",
"exbert",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1910.01108",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | distilbert-base-uncased | 292 | 11,014,465 | transformers | 2022-03-02T23:29:04 | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# DistilBERT base model (uncased)
This model is a distilled version of the [BERT base model](https://huggingface.co/bert-base-uncased). It was
introduced in [this paper](https://arxiv.org/abs/1910.01108). The code for the distillation process can be found
[here](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation). This model is uncased: it does not distinguish between english and English.
## Model description
DistilBERT is a transformers model, smaller and faster than BERT, which was pretrained on the same corpus in a
self-supervised fashion, using the BERT base model as a teacher. This means it was pretrained on the raw texts only,
with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic
process generating inputs and labels from those texts using the BERT base model. More precisely, it was pretrained
with three objectives:
- Distillation loss: the model was trained to return the same probabilities as the BERT base model.
- Masked language modeling (MLM): this is part of the original training loss of the BERT base model. When taking a
sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the
model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that
usually see the words one after the other, or from autoregressive models like GPT which internally mask the future
tokens. It allows the model to learn a bidirectional representation of the sentence.
- Cosine embedding loss: the model was also trained to generate hidden states as close as possible to those of the
BERT base model.
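Numerically, the distillation and cosine objectives can be sketched with toy arrays. Everything below is a stand-in: the random logits and hidden states replace real model outputs, and the temperature value and equal weighting are assumptions, not the paper's exact hyperparameters.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax, numerically stabilized.
    z = np.exp((logits - logits.max(axis=-1, keepdims=True)) / T)
    return z / z.sum(axis=-1, keepdims=True)

# Random stand-ins for real outputs (4 tokens, 30k vocab, 768-dim hidden states).
rng = np.random.default_rng(0)
teacher_logits = rng.normal(size=(4, 30000))
student_logits = teacher_logits + 0.1 * rng.normal(size=(4, 30000))
teacher_hidden = rng.normal(size=(4, 768))
student_hidden = teacher_hidden + 0.1 * rng.normal(size=(4, 768))

T = 2.0  # softening temperature (an assumed value)
p_t, p_s = softmax(teacher_logits, T), softmax(student_logits, T)

# Distillation loss: KL divergence between soft teacher and student distributions.
distill_loss = float(np.mean(np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1)))

# Cosine embedding loss: pull student hidden states toward the teacher's.
cos = np.sum(teacher_hidden * student_hidden, axis=-1) / (
    np.linalg.norm(teacher_hidden, axis=-1) * np.linalg.norm(student_hidden, axis=-1)
)
cosine_loss = float(np.mean(1.0 - cos))

# The full training loss also adds the usual MLM cross-entropy (omitted here).
total = distill_loss + cosine_loss
```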
This way, the model learns the same inner representation of the English language as its teacher model, while being
faster for inference or downstream tasks.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=distilbert) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation, you should look at models like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='distilbert-base-uncased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'sequence': "[CLS] hello i'm a role model. [SEP]",
'score': 0.05292855575680733,
'token': 2535,
'token_str': 'role'},
{'sequence': "[CLS] hello i'm a fashion model. [SEP]",
'score': 0.03968575969338417,
'token': 4827,
'token_str': 'fashion'},
{'sequence': "[CLS] hello i'm a business model. [SEP]",
'score': 0.034743521362543106,
'token': 2449,
'token_str': 'business'},
{'sequence': "[CLS] hello i'm a model model. [SEP]",
'score': 0.03462274372577667,
'token': 2944,
'token_str': 'model'},
{'sequence': "[CLS] hello i'm a modeling model. [SEP]",
'score': 0.018145186826586723,
'token': 11643,
'token_str': 'modeling'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import DistilBertTokenizer, DistilBertModel
tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
model = DistilBertModel.from_pretrained("distilbert-base-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import DistilBertTokenizer, TFDistilBertModel
tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
model = TFDistilBertModel.from_pretrained("distilbert-base-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions. It also inherits some of
[the bias of its teacher model](https://huggingface.co/bert-base-uncased#limitations-and-bias).
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='distilbert-base-uncased')
>>> unmasker("The White man worked as a [MASK].")
[{'sequence': '[CLS] the white man worked as a blacksmith. [SEP]',
'score': 0.1235365942120552,
'token': 20987,
'token_str': 'blacksmith'},
{'sequence': '[CLS] the white man worked as a carpenter. [SEP]',
'score': 0.10142576694488525,
'token': 10533,
'token_str': 'carpenter'},
{'sequence': '[CLS] the white man worked as a farmer. [SEP]',
'score': 0.04985016956925392,
'token': 7500,
'token_str': 'farmer'},
{'sequence': '[CLS] the white man worked as a miner. [SEP]',
'score': 0.03932540491223335,
'token': 18594,
'token_str': 'miner'},
{'sequence': '[CLS] the white man worked as a butcher. [SEP]',
'score': 0.03351764753460884,
'token': 14998,
'token_str': 'butcher'}]
>>> unmasker("The Black woman worked as a [MASK].")
[{'sequence': '[CLS] the black woman worked as a waitress. [SEP]',
'score': 0.13283951580524445,
'token': 13877,
'token_str': 'waitress'},
{'sequence': '[CLS] the black woman worked as a nurse. [SEP]',
'score': 0.12586183845996857,
'token': 6821,
'token_str': 'nurse'},
{'sequence': '[CLS] the black woman worked as a maid. [SEP]',
'score': 0.11708822101354599,
'token': 10850,
'token_str': 'maid'},
{'sequence': '[CLS] the black woman worked as a prostitute. [SEP]',
'score': 0.11499975621700287,
'token': 19215,
'token_str': 'prostitute'},
{'sequence': '[CLS] the black woman worked as a housekeeper. [SEP]',
'score': 0.04722772538661957,
'token': 22583,
'token_str': 'housekeeper'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
DistilBERT was pretrained on the same data as BERT, which is [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset
consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia)
(excluding lists, tables and headers).
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the
other cases, sentence B is another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two
"sentences" has a combined length of less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
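The masking steps above can be sketched in plain Python. This is an illustrative re-implementation, not the actual training code; `mask_tokens` and its arguments are hypothetical names:

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """BERT-style masking: ~15% of tokens are selected; of those,
    80% become [MASK], 10% become a random token, 10% stay unchanged."""
    rng = rng or random.Random(0)
    out = list(tokens)
    labels = [None] * len(tokens)  # non-None entries are prediction targets
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model must recover the original token
            r = rng.random()
            if r < 0.8:
                out[i] = "[MASK]"
            elif r < 0.9:
                out[i] = rng.choice(vocab)
            # else: leave the token as is
    return out, labels
```

Note that the loss is computed only at the selected positions (the non-`None` labels), regardless of whether the surface token was replaced.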
### Pretraining
The model was trained on 8 16 GB V100 GPUs for 90 hours. See the
[training code](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) for details of all
hyperparameters.
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
GLUE test results:
| Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE |
|:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|
| | 82.2 | 88.5 | 89.2 | 91.3 | 51.3 | 85.8 | 87.5 | 59.9 |
### BibTeX entry and citation info
```bibtex
@article{Sanh2019DistilBERTAD,
title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
journal={ArXiv},
year={2019},
volume={abs/1910.01108}
}
```
<a href="https://huggingface.co/exbert/?model=distilbert-base-uncased">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 8,577 | [
[
-0.004299163818359375,
-0.049346923828125,
0.018951416015625,
0.0210113525390625,
-0.041534423828125,
0.003765106201171875,
-0.0017490386962890625,
-0.00771331787109375,
0.0274810791015625,
0.02911376953125,
-0.0396728515625,
-0.032623291015625,
-0.0697021484375,
0.0120849609375,
-0.040985107421875,
0.0908203125,
0.016021728515625,
0.0166168212890625,
0.00496673583984375,
0.00986480712890625,
-0.0286102294921875,
-0.05682373046875,
-0.049468994140625,
-0.021270751953125,
0.034027099609375,
0.0202178955078125,
0.0517578125,
0.049530029296875,
0.031951904296875,
0.03167724609375,
-0.01218414306640625,
-0.0096435546875,
-0.034210205078125,
0.00618743896484375,
-0.007297515869140625,
-0.039398193359375,
-0.0236358642578125,
0.0167694091796875,
0.0283966064453125,
0.06353759765625,
-0.0031452178955078125,
0.035186767578125,
0.00015103816986083984,
0.044830322265625,
-0.0274810791015625,
0.02142333984375,
-0.043060302734375,
0.00647735595703125,
-0.019989013671875,
0.011871337890625,
-0.030120849609375,
-0.01430511474609375,
0.007232666015625,
-0.0404052734375,
0.0236053466796875,
0.01397705078125,
0.08013916015625,
0.0170745849609375,
-0.02154541015625,
-0.00435638427734375,
-0.043365478515625,
0.058929443359375,
-0.052642822265625,
0.00611114501953125,
0.03533935546875,
0.0182342529296875,
-0.01471710205078125,
-0.08428955078125,
-0.03570556640625,
-0.0010023117065429688,
-0.01849365234375,
0.0078277587890625,
-0.006195068359375,
-0.007053375244140625,
0.034881591796875,
0.0433349609375,
-0.024078369140625,
-0.00475311279296875,
-0.06134033203125,
-0.0238494873046875,
0.04730224609375,
0.0157318115234375,
0.00823211669921875,
-0.0247802734375,
-0.0264129638671875,
-0.0209197998046875,
-0.01213836669921875,
0.004425048828125,
0.045562744140625,
0.0308837890625,
-0.0150146484375,
0.059539794921875,
-0.018798828125,
0.048065185546875,
0.006267547607421875,
-0.005970001220703125,
0.035614013671875,
-0.009735107421875,
-0.033843994140625,
0.002655029296875,
0.0709228515625,
0.023895263671875,
0.0272064208984375,
0.0031566619873046875,
-0.018951416015625,
0.01384735107421875,
0.0232391357421875,
-0.05426025390625,
-0.031463623046875,
0.00777435302734375,
-0.041351318359375,
-0.03619384765625,
0.03375244140625,
-0.046661376953125,
-0.0082244873046875,
-0.0087432861328125,
0.042388916015625,
-0.016693115234375,
-0.013427734375,
0.01158905029296875,
-0.047698974609375,
0.00476837158203125,
0.0020618438720703125,
-0.07281494140625,
0.0218505859375,
0.04986572265625,
0.0697021484375,
0.0182647705078125,
-0.0081634521484375,
-0.029876708984375,
-0.0175323486328125,
-0.01171875,
0.021759033203125,
-0.019500732421875,
-0.040496826171875,
-0.0066680908203125,
0.0260467529296875,
-0.0026226043701171875,
-0.0228424072265625,
0.047607421875,
-0.033660888671875,
0.036773681640625,
-0.00162506103515625,
-0.035614013671875,
-0.022216796875,
0.00487518310546875,
-0.057769775390625,
0.0926513671875,
0.032501220703125,
-0.06524658203125,
0.0254974365234375,
-0.05999755859375,
-0.039794921875,
0.00978851318359375,
0.017303466796875,
-0.035980224609375,
0.0207672119140625,
0.0006742477416992188,
0.026885986328125,
-0.00804901123046875,
0.0207061767578125,
-0.02337646484375,
-0.037261962890625,
0.0253143310546875,
-0.0263214111328125,
0.08740234375,
0.0159759521484375,
-0.026611328125,
0.0019989013671875,
-0.0654296875,
-0.00806427001953125,
0.0208740234375,
-0.018798828125,
-0.0219879150390625,
-0.01493072509765625,
0.0224609375,
0.01311492919921875,
0.027496337890625,
-0.053009033203125,
0.0201568603515625,
-0.0345458984375,
0.049102783203125,
0.059478759765625,
-0.006206512451171875,
0.0194091796875,
-0.02569580078125,
0.03509521484375,
0.00653839111328125,
0.0015306472778320312,
-0.0064239501953125,
-0.047760009765625,
-0.058685302734375,
-0.0286712646484375,
0.045684814453125,
0.05291748046875,
-0.0309295654296875,
0.050079345703125,
0.0032711029052734375,
-0.048187255859375,
-0.049530029296875,
-0.00997161865234375,
0.017303466796875,
0.04644775390625,
0.0274658203125,
-0.02911376953125,
-0.06353759765625,
-0.06231689453125,
-0.01300811767578125,
-0.0142059326171875,
-0.0133056640625,
-0.004077911376953125,
0.0560302734375,
-0.021148681640625,
0.06256103515625,
-0.06646728515625,
-0.027130126953125,
-0.005298614501953125,
0.0156707763671875,
0.054412841796875,
0.050018310546875,
0.02874755859375,
-0.050750732421875,
-0.037445068359375,
-0.0279693603515625,
-0.0430908203125,
0.0010509490966796875,
0.0106964111328125,
-0.0161895751953125,
0.0019168853759765625,
0.03863525390625,
-0.056304931640625,
0.05157470703125,
0.0263519287109375,
-0.041412353515625,
0.053070068359375,
-0.0233612060546875,
-0.002437591552734375,
-0.09991455078125,
0.0141143798828125,
-0.008148193359375,
-0.0271453857421875,
-0.055419921875,
-0.00565338134765625,
-0.01105499267578125,
-0.0033721923828125,
-0.043975830078125,
0.033660888671875,
-0.032318115234375,
0.0034008026123046875,
-0.002422332763671875,
-0.0140380859375,
0.0139312744140625,
0.036590576171875,
-0.004940032958984375,
0.04180908203125,
0.042999267578125,
-0.036529541015625,
0.046356201171875,
0.032623291015625,
-0.037689208984375,
0.0139312744140625,
-0.06689453125,
0.0133056640625,
0.0008282661437988281,
0.0028553009033203125,
-0.08306884765625,
-0.0129547119140625,
0.0174560546875,
-0.03802490234375,
0.01384735107421875,
-0.0177001953125,
-0.052642822265625,
-0.047271728515625,
-0.018280029296875,
0.038848876953125,
0.04974365234375,
-0.0162811279296875,
0.030609130859375,
0.0212860107421875,
-0.0084075927734375,
-0.0531005859375,
-0.0552978515625,
-0.0015535354614257812,
-0.0236358642578125,
-0.035980224609375,
0.033172607421875,
0.0009264945983886719,
-0.01727294921875,
-0.0101776123046875,
0.0018892288208007812,
-0.00922393798828125,
0.0121002197265625,
0.0247650146484375,
0.037200927734375,
-0.00890350341796875,
-0.01218414306640625,
-0.01006317138671875,
-0.011627197265625,
0.0196685791015625,
-0.0160369873046875,
0.0574951171875,
0.00359344482421875,
-0.0074920654296875,
-0.0257568359375,
0.0204010009765625,
0.049957275390625,
-0.00443267822265625,
0.05841064453125,
0.055084228515625,
-0.039337158203125,
0.00432586669921875,
-0.0201263427734375,
-0.01325225830078125,
-0.038818359375,
0.03656005859375,
-0.031890869140625,
-0.060455322265625,
0.058563232421875,
0.0189056396484375,
-0.01020050048828125,
0.06201171875,
0.04840087890625,
-0.01357269287109375,
0.07403564453125,
0.036163330078125,
-0.00766754150390625,
0.0286712646484375,
-0.0182037353515625,
0.0215911865234375,
-0.0545654296875,
-0.033203125,
-0.0343017578125,
-0.0294647216796875,
-0.041748046875,
-0.01497650146484375,
0.0187530517578125,
0.0227508544921875,
-0.0298309326171875,
0.046478271484375,
-0.051239013671875,
0.0261993408203125,
0.069580078125,
0.0184326171875,
-0.00351715087890625,
-0.01448822021484375,
-0.0204315185546875,
0.0031681060791015625,
-0.0294952392578125,
-0.0292510986328125,
0.07769775390625,
0.0416259765625,
0.055328369140625,
0.00047326087951660156,
0.050506591796875,
0.028228759765625,
0.00316619873046875,
-0.048828125,
0.034698486328125,
-0.0283660888671875,
-0.06622314453125,
-0.0294342041015625,
-0.00904083251953125,
-0.07489013671875,
0.0174102783203125,
-0.02008056640625,
-0.061767578125,
-0.0014171600341796875,
-0.003997802734375,
-0.0270233154296875,
0.0151519775390625,
-0.052215576171875,
0.08587646484375,
-0.018798828125,
-0.0117034912109375,
0.0089263916015625,
-0.06634521484375,
0.021331787109375,
0.0020847320556640625,
0.0041351318359375,
-0.01279449462890625,
0.0231781005859375,
0.070068359375,
-0.049713134765625,
0.0701904296875,
-0.022308349609375,
0.0169219970703125,
0.01357269287109375,
-0.001987457275390625,
0.0251922607421875,
0.00508880615234375,
0.0005431175231933594,
0.026641845703125,
0.0072021484375,
-0.033905029296875,
-0.0177001953125,
0.0254974365234375,
-0.053802490234375,
-0.04345703125,
-0.052032470703125,
-0.04278564453125,
0.0181427001953125,
0.02496337890625,
0.045257568359375,
0.036651611328125,
-0.0123748779296875,
0.021270751953125,
0.0299224853515625,
-0.0105438232421875,
0.0513916015625,
0.019073486328125,
-0.01491546630859375,
-0.0345458984375,
0.036163330078125,
0.00208282470703125,
0.00513458251953125,
0.032257080078125,
0.0165863037109375,
-0.048309326171875,
-0.0176544189453125,
-0.033660888671875,
0.005420684814453125,
-0.04461669921875,
-0.0286712646484375,
-0.04913330078125,
-0.037445068359375,
-0.044403076171875,
0.00055694580078125,
-0.0087738037109375,
-0.042999267578125,
-0.053985595703125,
-0.023406982421875,
0.0401611328125,
0.04974365234375,
-0.00673675537109375,
0.0423583984375,
-0.05718994140625,
0.0192108154296875,
0.02410888671875,
0.0277862548828125,
-0.0158843994140625,
-0.06439208984375,
-0.027923583984375,
0.00923919677734375,
-0.018096923828125,
-0.06689453125,
0.046051025390625,
0.0101776123046875,
0.033935546875,
0.03497314453125,
0.00621795654296875,
0.0537109375,
-0.04833984375,
0.068603515625,
0.018646240234375,
-0.07977294921875,
0.037689208984375,
-0.0178680419921875,
0.0162811279296875,
0.038604736328125,
0.0234375,
-0.036773681640625,
-0.03155517578125,
-0.06329345703125,
-0.07366943359375,
0.059844970703125,
0.0191192626953125,
0.0222930908203125,
-0.0048980712890625,
0.013519287109375,
0.01439666748046875,
0.0263824462890625,
-0.072509765625,
-0.04638671875,
-0.042144775390625,
-0.028900146484375,
-0.00945281982421875,
-0.0224609375,
-0.005157470703125,
-0.040924072265625,
0.04840087890625,
0.0147857666015625,
0.0228118896484375,
0.00801849365234375,
-0.0165557861328125,
0.011871337890625,
0.00777435302734375,
0.051513671875,
0.03570556640625,
-0.038238525390625,
0.0062103271484375,
0.004642486572265625,
-0.0478515625,
0.01476287841796875,
0.019744873046875,
-0.0020961761474609375,
0.019287109375,
0.036102294921875,
0.06573486328125,
0.0050506591796875,
-0.0260162353515625,
0.042205810546875,
0.00830841064453125,
-0.0235443115234375,
-0.04437255859375,
0.00875091552734375,
0.002201080322265625,
0.010162353515625,
0.041229248046875,
0.01377105712890625,
0.01548004150390625,
-0.043182373046875,
0.027801513671875,
0.0215301513671875,
-0.040802001953125,
-0.020294189453125,
0.0697021484375,
0.00434112548828125,
-0.04766845703125,
0.062347412109375,
-0.0159454345703125,
-0.0518798828125,
0.053009033203125,
0.047882080078125,
0.06829833984375,
-0.0105743408203125,
0.0164947509765625,
0.037689208984375,
0.01959228515625,
-0.022308349609375,
0.0205078125,
0.022125244140625,
-0.053802490234375,
-0.0257568359375,
-0.06707763671875,
-0.0044097900390625,
0.0146942138671875,
-0.06231689453125,
0.026092529296875,
-0.0343017578125,
-0.0285491943359375,
0.0195159912109375,
-0.004619598388671875,
-0.050323486328125,
0.03216552734375,
0.00004416704177856445,
0.0791015625,
-0.08245849609375,
0.06768798828125,
0.05255126953125,
-0.047454833984375,
-0.0621337890625,
-0.030975341796875,
-0.0205841064453125,
-0.06878662109375,
0.06280517578125,
0.0268096923828125,
0.0233306884765625,
-0.0032520294189453125,
-0.0380859375,
-0.054595947265625,
0.0712890625,
0.01513671875,
-0.041595458984375,
-0.00826263427734375,
0.00832366943359375,
0.044219970703125,
-0.033447265625,
0.03631591796875,
0.04193115234375,
0.029632568359375,
-0.0015544891357421875,
-0.06414794921875,
0.005672454833984375,
-0.029144287109375,
0.00225067138671875,
0.01207733154296875,
-0.036956787109375,
0.0858154296875,
-0.00634765625,
-0.0018701553344726562,
0.009552001953125,
0.044219970703125,
0.005062103271484375,
0.0166473388671875,
0.042724609375,
0.054840087890625,
0.050506591796875,
-0.02874755859375,
0.057037353515625,
-0.015655517578125,
0.041534423828125,
0.06787109375,
-0.0010890960693359375,
0.04864501953125,
0.03106689453125,
-0.0256195068359375,
0.07354736328125,
0.060882568359375,
-0.0285797119140625,
0.056121826171875,
0.024688720703125,
-0.007717132568359375,
0.002231597900390625,
0.00749969482421875,
-0.0204925537109375,
0.045806884765625,
0.01300811767578125,
-0.041107177734375,
0.004703521728515625,
-0.0101165771484375,
0.0133209228515625,
-0.004364013671875,
-0.033050537109375,
0.0533447265625,
0.0165252685546875,
-0.050201416015625,
0.02496337890625,
0.018707275390625,
0.050262451171875,
-0.03997802734375,
-0.0031528472900390625,
-0.006038665771484375,
0.0175628662109375,
-0.0106201171875,
-0.06195068359375,
0.024505615234375,
-0.011871337890625,
-0.03814697265625,
-0.01436614990234375,
0.05303955078125,
-0.040069580078125,
-0.05419921875,
0.00583648681640625,
0.0188446044921875,
0.017181396484375,
-0.011138916015625,
-0.055389404296875,
-0.0164337158203125,
0.0010194778442382812,
-0.00876617431640625,
0.0101776123046875,
0.0305328369140625,
0.0022792816162109375,
0.031402587890625,
0.062744140625,
-0.007381439208984375,
0.004558563232421875,
0.004802703857421875,
0.056640625,
-0.07257080078125,
-0.057952880859375,
-0.08624267578125,
0.05230712890625,
-0.0162200927734375,
-0.03717041015625,
0.05377197265625,
0.05999755859375,
0.057159423828125,
-0.034027099609375,
0.03741455078125,
-0.0143585205078125,
0.034454345703125,
-0.024566650390625,
0.0579833984375,
-0.0242919921875,
-0.0095367431640625,
-0.02850341796875,
-0.068603515625,
-0.0124359130859375,
0.055877685546875,
0.0009360313415527344,
0.0015134811401367188,
0.05291748046875,
0.045623779296875,
-0.004451751708984375,
-0.0132904052734375,
0.0139617919921875,
0.0140533447265625,
-0.0005807876586914062,
0.024078369140625,
0.046356201171875,
-0.04937744140625,
0.02862548828125,
-0.0203094482421875,
-0.006916046142578125,
-0.02593994140625,
-0.0704345703125,
-0.07305908203125,
-0.04345703125,
-0.02191162109375,
-0.0474853515625,
-0.0184478759765625,
0.06536865234375,
0.0565185546875,
-0.06903076171875,
-0.020599365234375,
-0.0000699162483215332,
0.00582122802734375,
-0.0216827392578125,
-0.020965576171875,
0.031768798828125,
-0.006137847900390625,
-0.0633544921875,
0.0120391845703125,
0.0023136138916015625,
0.014312744140625,
-0.011749267578125,
0.008148193359375,
-0.0278472900390625,
-0.00289154052734375,
0.040740966796875,
0.000732421875,
-0.04901123046875,
-0.030364990234375,
0.0026111602783203125,
-0.01380157470703125,
0.004909515380859375,
0.038970947265625,
-0.04248046875,
0.0263519287109375,
0.037078857421875,
0.02392578125,
0.05804443359375,
0.0178985595703125,
0.047088623046875,
-0.08428955078125,
0.031463623046875,
0.01435089111328125,
0.042022705078125,
0.0294036865234375,
-0.03778076171875,
0.03948974609375,
0.04034423828125,
-0.032562255859375,
-0.06488037109375,
-0.0005736351013183594,
-0.076416015625,
-0.0174102783203125,
0.0657958984375,
-0.01378631591796875,
-0.0267181396484375,
-0.00794219970703125,
-0.0262451171875,
0.03863525390625,
-0.0262451171875,
0.056976318359375,
0.06390380859375,
0.0142669677734375,
-0.0032978057861328125,
-0.022125244140625,
0.032073974609375,
0.0249786376953125,
-0.0249786376953125,
-0.0230712890625,
0.013824462890625,
0.03741455078125,
0.0184783935546875,
0.038726806640625,
-0.009735107421875,
0.005950927734375,
0.0181427001953125,
0.008880615234375,
-0.0182342529296875,
-0.00818634033203125,
-0.01947021484375,
0.0108184814453125,
-0.00910186767578125,
-0.052947998046875
]
] |
sentence-transformers/all-mpnet-base-v2 | 2023-11-02T09:35:52.000Z | [
"sentence-transformers",
"pytorch",
"mpnet",
"feature-extraction",
"sentence-similarity",
"en",
"dataset:s2orc",
"dataset:flax-sentence-embeddings/stackexchange_xml",
"dataset:ms_marco",
"dataset:gooaq",
"dataset:yahoo_answers_topics",
"dataset:code_search_net",
"dataset:search_qa",
"dataset:eli5",
"dataset:snli",
"dataset:multi_nli",
"dataset:wikihow",
"dataset:natural_questions",
"dataset:trivia_qa",
"dataset:embedding-data/sentence-compression",
"dataset:embedding-data/flickr30k-captions",
"dataset:embedding-data/altlex",
"dataset:embedding-data/simple-wiki",
"dataset:embedding-data/QQP",
"dataset:embedding-data/SPECTER",
"dataset:embedding-data/PAQ_pairs",
"dataset:embedding-data/WikiAnswers",
"arxiv:1904.06472",
"arxiv:2102.07033",
"arxiv:2104.08727",
"arxiv:1704.05179",
"arxiv:1810.09305",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/all-mpnet-base-v2 | 452 | 10,816,338 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
language: en
license: apache-2.0
datasets:
- s2orc
- flax-sentence-embeddings/stackexchange_xml
- ms_marco
- gooaq
- yahoo_answers_topics
- code_search_net
- search_qa
- eli5
- snli
- multi_nli
- wikihow
- natural_questions
- trivia_qa
- embedding-data/sentence-compression
- embedding-data/flickr30k-captions
- embedding-data/altlex
- embedding-data/simple-wiki
- embedding-data/QQP
- embedding-data/SPECTER
- embedding-data/PAQ_pairs
- embedding-data/WikiAnswers
---
# all-mpnet-base-v2
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/all-mpnet-base-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
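The returned `embeddings` are plain NumPy arrays, so they can be compared directly. A minimal cosine-similarity helper in pure NumPy (a sketch shown without the model call; `cos_sim` is a hypothetical helper here, though sentence-transformers also ships `util.cos_sim` for the same purpose):

```python
import numpy as np

def cos_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# e.g. with embeddings = model.encode(sentences) from the snippet above:
# score = cos_sim(embeddings[0], embeddings[1])
```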
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
import torch.nn.functional as F
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/all-mpnet-base-v2')
model = AutoModel.from_pretrained('sentence-transformers/all-mpnet-base-v2')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
# Normalize embeddings
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/all-mpnet-base-v2)
------
## Background
The project aims to train sentence embedding models on very large sentence-level datasets using a self-supervised
contrastive learning objective. We used the pretrained [`microsoft/mpnet-base`](https://huggingface.co/microsoft/mpnet-base) model and fine-tuned it on a
dataset of 1B sentence pairs. We use a contrastive learning objective: given a sentence from the pair, the model should predict which out of a set of randomly sampled other sentences was actually paired with it in our dataset.
We developed this model during the
[Community week using JAX/Flax for NLP & CV](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104),
organized by Hugging Face, as part of the project:
[Train the Best Sentence Embedding Model Ever with 1B Training Pairs](https://discuss.huggingface.co/t/train-the-best-sentence-embedding-model-ever-with-1b-training-pairs/7354). We benefited from efficient hardware infrastructure to run the project: 7 TPU v3-8s, as well as guidance from Google's Flax, JAX, and Cloud team members on efficient deep learning frameworks.
## Intended uses
Our model is intended to be used as a sentence and short paragraph encoder. Given an input text, it outputs a vector which captures
the semantic information. The sentence vector may be used for information retrieval, clustering or sentence similarity tasks.
By default, input text longer than 384 word pieces is truncated.
## Training procedure
### Pre-training
We use the pretrained [`microsoft/mpnet-base`](https://huggingface.co/microsoft/mpnet-base) model. Please refer to the model card for more detailed information about the pre-training procedure.
### Fine-tuning
We fine-tune the model using a contrastive objective. Formally, we compute the cosine similarity for each possible sentence pair in the batch.
We then apply the cross-entropy loss by comparing with the true pairs.
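The objective described above (cosine similarities across the batch, cross-entropy against the true pairs) can be sketched in NumPy. This is an illustration, not the actual training code; the `scale` temperature factor is an assumption:

```python
import numpy as np

def in_batch_contrastive_loss(anchors, positives, scale=20.0):
    """Cross-entropy over the batch similarity matrix: for each anchor i,
    the matching positive i is the correct 'class' among all positives."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = scale * (a @ p.T)                    # (batch, batch) scaled cosine similarities
    sims -= sims.max(axis=1, keepdims=True)     # for numerical stability
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))  # true pairs sit on the diagonal
```

Shuffling the positives relative to their anchors should raise the loss, since the diagonal no longer holds the matching pairs.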
#### Hyperparameters
We trained our model on a TPU v3-8 for 100k steps using a batch size of 1024 (128 per TPU core).
We used a learning rate warm-up of 500 steps. The sequence length was limited to 128 tokens. We used the AdamW optimizer with
a 2e-5 learning rate. The full training script is accessible in this current repository: `train_script.py`.
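The warm-up can be illustrated as a simple learning-rate schedule. This is a hand-rolled sketch; whether the rate stays constant after warm-up is an assumption here, as the exact schedule is defined in `train_script.py`:

```python
def lr_with_warmup(step, base_lr=2e-5, warmup_steps=500):
    """Linear warm-up from 0 to base_lr over warmup_steps, then
    constant (the post-warm-up behavior is assumed, not documented)."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr
```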
#### Training data
We use a concatenation of multiple datasets to fine-tune our model. The total number of sentence pairs is above 1 billion.
We sampled each dataset with a weighted probability; the configuration is detailed in the `data_config.json` file.
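The weighted sampling can be sketched like this (illustrative names and weights; the real configuration lives in `data_config.json`):

```python
import random

def sample_dataset(names, weights, rng=None):
    """Pick the dataset to draw the next training batch from,
    proportionally to its configured weight."""
    rng = rng or random.Random(0)
    return rng.choices(names, weights=weights, k=1)[0]
```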
| Dataset | Paper | Number of training tuples |
|--------------------------------------------------------|:----------------------------------------:|:--------------------------:|
| [Reddit comments (2015-2018)](https://github.com/PolyAI-LDN/conversational-datasets/tree/master/reddit) | [paper](https://arxiv.org/abs/1904.06472) | 726,484,430 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Abstracts) | [paper](https://aclanthology.org/2020.acl-main.447/) | 116,288,806 |
| [WikiAnswers](https://github.com/afader/oqa#wikianswers-corpus) Duplicate question pairs | [paper](https://doi.org/10.1145/2623330.2623677) | 77,427,422 |
| [PAQ](https://github.com/facebookresearch/PAQ) (Question, Answer) pairs | [paper](https://arxiv.org/abs/2102.07033) | 64,371,441 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Titles) | [paper](https://aclanthology.org/2020.acl-main.447/) | 52,603,982 |
| [S2ORC](https://github.com/allenai/s2orc) (Title, Abstract) | [paper](https://aclanthology.org/2020.acl-main.447/) | 41,769,185 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Body) pairs | - | 25,316,456 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title+Body, Answer) pairs | - | 21,396,559 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Answer) pairs | - | 21,396,559 |
| [MS MARCO](https://microsoft.github.io/msmarco/) triplets | [paper](https://doi.org/10.1145/3404835.3462804) | 9,144,553 |
| [GOOAQ: Open Question Answering with Diverse Answer Types](https://github.com/allenai/gooaq) | [paper](https://arxiv.org/pdf/2104.08727.pdf) | 3,012,496 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 1,198,260 |
| [Code Search](https://huggingface.co/datasets/code_search_net) | - | 1,151,414 |
| [COCO](https://cocodataset.org/#home) Image captions | [paper](https://link.springer.com/chapter/10.1007%2F978-3-319-10602-1_48) | 828,395|
| [SPECTER](https://github.com/allenai/specter) citation triplets | [paper](https://doi.org/10.18653/v1/2020.acl-main.207) | 684,100 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Question, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 681,164 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Question) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 659,896 |
| [SearchQA](https://huggingface.co/datasets/search_qa) | [paper](https://arxiv.org/abs/1704.05179) | 582,261 |
| [Eli5](https://huggingface.co/datasets/eli5) | [paper](https://doi.org/10.18653/v1/p19-1346) | 325,475 |
| [Flickr 30k](https://shannon.cs.illinois.edu/DenotationGraph/) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/229/33) | 317,695 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles) | | 304,525 |
| AllNLI ([SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/)) | [paper SNLI](https://doi.org/10.18653/v1/d15-1075), [paper MultiNLI](https://doi.org/10.18653/v1/n18-1101) | 277,230 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (bodies) | | 250,519 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles+bodies) | | 250,460 |
| [Sentence Compression](https://github.com/google-research-datasets/sentence-compression) | [paper](https://www.aclweb.org/anthology/D13-1155/) | 180,000 |
| [Wikihow](https://github.com/pvl/wikihow_pairs_dataset) | [paper](https://arxiv.org/abs/1810.09305) | 128,542 |
| [Altlex](https://github.com/chridey/altlex/) | [paper](https://aclanthology.org/P16-1135.pdf) | 112,696 |
| [Quora Question Triplets](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs) | - | 103,663 |
| [Simple Wikipedia](https://cs.pomona.edu/~dkauchak/simplification/) | [paper](https://www.aclweb.org/anthology/P11-2117/) | 102,225 |
| [Natural Questions (NQ)](https://ai.google.com/research/NaturalQuestions) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/1455) | 100,231 |
| [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/) | [paper](https://aclanthology.org/P18-2124.pdf) | 87,599 |
| [TriviaQA](https://huggingface.co/datasets/trivia_qa) | - | 73,346 |
| **Total** | | **1,170,060,424** | | 10,571 | [
[
-0.0270233154296875,
-0.0555419921875,
0.0252685546875,
0.01505279541015625,
-0.00969696044921875,
-0.0243377685546875,
-0.0179595947265625,
-0.014678955078125,
0.02606201171875,
0.0147552490234375,
-0.03167724609375,
-0.0374755859375,
-0.05572509765625,
0.006427764892578125,
-0.0208587646484375,
0.07733154296875,
-0.005016326904296875,
-0.006061553955078125,
-0.0301666259765625,
-0.020172119140625,
-0.002712249755859375,
-0.037384033203125,
-0.039276123046875,
-0.0029468536376953125,
0.03948974609375,
0.025054931640625,
0.030609130859375,
0.043243408203125,
0.031585693359375,
0.0194091796875,
-0.00978851318359375,
0.024261474609375,
-0.042816162109375,
-0.01201629638671875,
0.0062713623046875,
-0.0253448486328125,
-0.01364898681640625,
0.00945281982421875,
0.0384521484375,
0.04620361328125,
-0.0033283233642578125,
0.018157958984375,
0.01454925537109375,
0.0380859375,
-0.040679931640625,
0.0217132568359375,
-0.041107177734375,
0.00670623779296875,
-0.01140594482421875,
-0.004718780517578125,
-0.02935791015625,
-0.020050048828125,
0.019622802734375,
-0.03594970703125,
0.0057830810546875,
0.00966644287109375,
0.0792236328125,
0.01593017578125,
-0.0335693359375,
-0.0252532958984375,
-0.01397705078125,
0.054931640625,
-0.060546875,
0.014678955078125,
0.039154052734375,
-0.00748443603515625,
-0.0000591278076171875,
-0.05975341796875,
-0.0555419921875,
-0.007694244384765625,
-0.0300750732421875,
0.0203399658203125,
-0.0245819091796875,
-0.0089263916015625,
0.0167236328125,
0.0287322998046875,
-0.0567626953125,
0.0034427642822265625,
-0.03106689453125,
-0.00606536865234375,
0.0545654296875,
0.005146026611328125,
0.0160980224609375,
-0.04345703125,
-0.0229644775390625,
-0.01995849609375,
-0.02349853515625,
0.0104827880859375,
0.0300140380859375,
0.013336181640625,
-0.033447265625,
0.05926513671875,
-0.006359100341796875,
0.047637939453125,
-0.00022399425506591797,
0.0052032470703125,
0.044464111328125,
-0.04132080078125,
… (embedding vector values omitted) …
]
] |
stabilityai/stable-diffusion-xl-base-1.0 | 2023-10-30T16:03:47.000Z | [
"diffusers",
"onnx",
"text-to-image",
"stable-diffusion",
"arxiv:2307.01952",
"arxiv:2211.01324",
"arxiv:2108.01073",
"arxiv:2112.10752",
"license:openrail++",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | text-to-image | stabilityai | null | null | stabilityai/stable-diffusion-xl-base-1.0 | 3,405 | 9,097,397 | diffusers | 2023-07-25T13:25:51 | ---
license: openrail++
tags:
- text-to-image
- stable-diffusion
---
# SD-XL 1.0-base Model Card

## Model

[SDXL](https://arxiv.org/abs/2307.01952) consists of an [ensemble of experts](https://arxiv.org/abs/2211.01324) pipeline for latent diffusion:
In the first step, the base model is used to generate (noisy) latents,
which are then further processed with a refinement model (available here: https://huggingface.co/stabilityai/stable-diffusion-xl-refiner-1.0/) specialized for the final denoising steps.
Note that the base model can also be used as a standalone module.
Alternatively, we can use a two-stage pipeline as follows:
First, the base model is used to generate latents of the desired output size.
In the second step, we use a specialized high-resolution model and apply a technique called SDEdit (https://arxiv.org/abs/2108.01073, also known as "img2img")
to the latents generated in the first step, using the same prompt. This technique is slightly slower than the first one, as it requires more function evaluations.
Source code is available at https://github.com/Stability-AI/generative-models .
### Model Description
- **Developed by:** Stability AI
- **Model type:** Diffusion-based text-to-image generative model
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0/blob/main/LICENSE.md)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses two fixed, pretrained text encoders ([OpenCLIP-ViT/G](https://github.com/mlfoundations/open_clip) and [CLIP-ViT/L](https://github.com/openai/CLIP/tree/main)).
- **Resources for more information:** Check out our [GitHub Repository](https://github.com/Stability-AI/generative-models) and the [SDXL report on arXiv](https://arxiv.org/abs/2307.01952).
### Model Sources
For research purposes, we recommend our `generative-models` GitHub repository (https://github.com/Stability-AI/generative-models), which implements the most popular diffusion frameworks (both training and inference) and to which new functionalities like distillation will be added over time.
[Clipdrop](https://clipdrop.co/stable-diffusion) provides free SDXL inference.
- **Repository:** https://github.com/Stability-AI/generative-models
- **Demo:** https://clipdrop.co/stable-diffusion
## Evaluation

The chart above evaluates user preference for SDXL (with and without refinement) over SDXL 0.9 and Stable Diffusion 1.5 and 2.1.
The SDXL base model performs significantly better than the previous variants, and the model combined with the refinement module achieves the best overall performance.
### 🧨 Diffusers
Make sure to upgrade diffusers to >= 0.19.0:
```
pip install diffusers --upgrade
```
In addition, make sure to install `transformers`, `safetensors`, `accelerate`, and the invisible watermark library:
```
pip install invisible_watermark transformers accelerate safetensors
```
To just use the base model, you can run:
```py
from diffusers import DiffusionPipeline
import torch
pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")
pipe.to("cuda")
# if using torch < 2.0
# pipe.enable_xformers_memory_efficient_attention()
prompt = "An astronaut riding a green horse"
image = pipe(prompt=prompt).images[0]
```
To use the whole base + refiner pipeline as an ensemble of experts you can run:
```py
from diffusers import DiffusionPipeline
import torch
# load both base & refiner
base = DiffusionPipeline.from_pretrained(
"stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16, variant="fp16", use_safetensors=True
)
base.to("cuda")
refiner = DiffusionPipeline.from_pretrained(
"stabilityai/stable-diffusion-xl-refiner-1.0",
text_encoder_2=base.text_encoder_2,
vae=base.vae,
torch_dtype=torch.float16,
use_safetensors=True,
variant="fp16",
)
refiner.to("cuda")
# Define how many steps and what fraction of steps to run on each expert (80/20) here
n_steps = 40
high_noise_frac = 0.8
prompt = "A majestic lion jumping from a big stone at night"
# run both experts
image = base(
prompt=prompt,
num_inference_steps=n_steps,
denoising_end=high_noise_frac,
output_type="latent",
).images
image = refiner(
prompt=prompt,
num_inference_steps=n_steps,
denoising_start=high_noise_frac,
image=image,
).images[0]
```
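With `denoising_end` and `denoising_start` set to the same fraction, the 40 scheduler steps above are split roughly 80/20 between the two experts. A minimal sketch of that split (the helper name is ours, not part of `diffusers`):

```python
# Hypothetical helper illustrating how denoising_end / denoising_start
# partition the total scheduler steps between the base and refiner experts.
def split_steps(n_steps: int, high_noise_frac: float) -> tuple[int, int]:
    base_steps = int(n_steps * high_noise_frac)  # high-noise steps run by the base model
    refiner_steps = n_steps - base_steps         # remaining low-noise steps run by the refiner
    return base_steps, refiner_steps

print(split_steps(40, 0.8))  # (32, 8)
```

So with `n_steps = 40` and `high_noise_frac = 0.8`, the base model denoises the first 32 steps and the refiner handles the final 8.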
When using `torch >= 2.0`, you can improve the inference speed by 20-30% with `torch.compile`. Simply wrap the UNet with `torch.compile` before running the pipeline:
```py
pipe.unet = torch.compile(pipe.unet, mode="reduce-overhead", fullgraph=True)
```
If you are limited by GPU VRAM, you can enable *CPU offloading* by calling `pipe.enable_model_cpu_offload()`
instead of `.to("cuda")`:
```diff
- pipe.to("cuda")
+ pipe.enable_model_cpu_offload()
```
For more information on how to use Stable Diffusion XL with `diffusers`, please have a look at [the Stable Diffusion XL Docs](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl).
### Optimum
[Optimum](https://github.com/huggingface/optimum) provides a Stable Diffusion pipeline compatible with both [OpenVINO](https://docs.openvino.ai/latest/index.html) and [ONNX Runtime](https://onnxruntime.ai/).
#### OpenVINO
To install Optimum with the dependencies required for OpenVINO:
```bash
pip install optimum[openvino]
```
To load an OpenVINO model and run inference with OpenVINO Runtime, you need to replace `StableDiffusionXLPipeline` with the Optimum `OVStableDiffusionXLPipeline`. If you want to load a PyTorch model and convert it to the OpenVINO format on the fly, set `export=True`.
```diff
- from diffusers import StableDiffusionXLPipeline
+ from optimum.intel import OVStableDiffusionXLPipeline
model_id = "stabilityai/stable-diffusion-xl-base-1.0"
- pipeline = StableDiffusionXLPipeline.from_pretrained(model_id)
+ pipeline = OVStableDiffusionXLPipeline.from_pretrained(model_id)
prompt = "A majestic lion jumping from a big stone at night"
image = pipeline(prompt).images[0]
```
You can find more examples (such as static reshaping and model compilation) in the Optimum [documentation](https://huggingface.co/docs/optimum/main/en/intel/inference#stable-diffusion-xl).
#### ONNX
To install Optimum with the dependencies required for ONNX Runtime inference:
```bash
pip install optimum[onnxruntime]
```
To load an ONNX model and run inference with ONNX Runtime, you need to replace `StableDiffusionXLPipeline` with the Optimum `ORTStableDiffusionXLPipeline`. If you want to load a PyTorch model and convert it to the ONNX format on the fly, set `export=True`.
```diff
- from diffusers import StableDiffusionXLPipeline
+ from optimum.onnxruntime import ORTStableDiffusionXLPipeline
model_id = "stabilityai/stable-diffusion-xl-base-1.0"
- pipeline = StableDiffusionXLPipeline.from_pretrained(model_id)
+ pipeline = ORTStableDiffusionXLPipeline.from_pretrained(model_id)
prompt = "A majestic lion jumping from a big stone at night"
image = pipeline(prompt).images[0]
```
You can find more examples in the Optimum [documentation](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models#stable-diffusion-xl).
## Uses
### Direct Use
The model is intended for research purposes only. Possible research areas and tasks include:
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
Excluded uses are described below.
### Out-of-Scope Use
The model was not trained to produce factual or true representations of people or events; using the model to generate such content is therefore out of scope for its abilities.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model struggles with more difficult tasks involving compositionality, such as rendering an image corresponding to "A red cube on top of a blue sphere".
- Faces and people in general may not be generated properly.
- The autoencoding part of the model is lossy.
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
| 8,661 | [
[
… (embedding vector values omitted) …
-0.02398681640625,
-0.0085296630859375,
0.0311737060546875,
0.0278778076171875,
-0.05352783203125,
0.006870269775390625,
-0.0146331787109375,
0.0122528076171875,
0.0214385986328125,
0.033905029296875,
-0.054443359375,
0.0460205078125,
0.057403564453125,
0.0175628662109375,
0.0728759765625,
-0.0079498291015625,
0.0229339599609375,
-0.025970458984375,
0.0191497802734375,
0.01157379150390625,
0.0300445556640625,
0.01934814453125,
-0.02996826171875,
0.04608154296875,
0.036346435546875,
-0.052764892578125,
-0.05645751953125,
-0.006145477294921875,
-0.08538818359375,
-0.0154266357421875,
0.0809326171875,
-0.0364990234375,
-0.0273895263671875,
-0.01296234130859375,
-0.029693603515625,
0.0237579345703125,
-0.043426513671875,
0.05322265625,
0.03973388671875,
-0.0218048095703125,
-0.034210205078125,
-0.043304443359375,
0.03143310546875,
0.0171051025390625,
-0.04833984375,
-0.017578125,
0.024627685546875,
0.05419921875,
0.036346435546875,
0.051025390625,
-0.003326416015625,
0.00534820556640625,
0.0220184326171875,
0.004489898681640625,
0.00043320655822753906,
0.0032253265380859375,
-0.0207061767578125,
0.008575439453125,
-0.0174560546875,
-0.00850677490234375
]
] |
cardiffnlp/twitter-roberta-base-sentiment | 2023-01-20T09:52:13.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"roberta",
"text-classification",
"en",
"dataset:tweet_eval",
"arxiv:2010.12421",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cardiffnlp | null | null | cardiffnlp/twitter-roberta-base-sentiment | 219 | 8,846,250 | transformers | 2022-03-02T23:29:05 | ---
datasets:
- tweet_eval
language:
- en
---
# Twitter-roBERTa-base for Sentiment Analysis
This is a roBERTa-base model trained on ~58M tweets and fine-tuned for sentiment analysis with the TweetEval benchmark. This model is suitable for English (for a similar multilingual model, see [XLM-T](https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base-sentiment)).
- Reference Paper: [_TweetEval_ (Findings of EMNLP 2020)](https://arxiv.org/pdf/2010.12421.pdf).
- Git Repo: [Tweeteval official repository](https://github.com/cardiffnlp/tweeteval).
<b>Labels</b>:
0 -> Negative;
1 -> Neutral;
2 -> Positive
<b>New!</b> We just released a new sentiment analysis model trained on a larger quantity of more recent tweets.
See [twitter-roberta-base-sentiment-latest](https://huggingface.co/cardiffnlp/twitter-roberta-base-sentiment-latest) and [TweetNLP](https://tweetnlp.org) for more details.
## Example of classification
```python
from transformers import AutoModelForSequenceClassification
from transformers import TFAutoModelForSequenceClassification
from transformers import AutoTokenizer
import numpy as np
from scipy.special import softmax
import csv
import urllib.request
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
# Tasks:
# emoji, emotion, hate, irony, offensive, sentiment
# stance/abortion, stance/atheism, stance/climate, stance/feminist, stance/hillary
task='sentiment'
MODEL = f"cardiffnlp/twitter-roberta-base-{task}"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
# download label mapping
labels=[]
mapping_link = f"https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/{task}/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
# PT
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.save_pretrained(MODEL)
text = "Good night 😊"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
scores = softmax(scores)
# # TF
# model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
# model.save_pretrained(MODEL)
# text = "Good night 😊"
# encoded_input = tokenizer(text, return_tensors='tf')
# output = model(encoded_input)
# scores = output[0][0].numpy()
# scores = softmax(scores)
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(scores.shape[0]):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
Output:
```
1) positive 0.8466
2) neutral 0.1458
3) negative 0.0076
```
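The placeholder preprocessing and the softmax ranking used above can be exercised offline, without downloading the model (the tweet and the logits below are illustrative values only, not real model output):

```python
import numpy as np

def preprocess(text):
    # Same username/link placeholder logic as in the snippet above
    new_text = []
    for t in text.split(" "):
        t = '@user' if t.startswith('@') and len(t) > 1 else t
        t = 'http' if t.startswith('http') else t
        new_text.append(t)
    return " ".join(new_text)

print(preprocess("@john loved it https://t.co/abc"))  # -> @user loved it http

labels = ["negative", "neutral", "positive"]
logits = np.array([0.1, 1.2, 3.4])        # illustrative logits, not model output
scores = np.exp(logits - logits.max())    # subtract max for numerical stability
scores = scores / scores.sum()
ranking = np.argsort(scores)[::-1]
for i, idx in enumerate(ranking):
    print(f"{i+1}) {labels[idx]} {np.round(float(scores[idx]), 4)}")
```

Subtracting the maximum logit before exponentiating mirrors what `scipy.special.softmax` does internally and avoids overflow for large logits.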
### BibTeX entry and citation info
Please cite the [reference paper](https://aclanthology.org/2020.findings-emnlp.148/) if you use this model.
```bibtex
@inproceedings{barbieri-etal-2020-tweeteval,
title = "{T}weet{E}val: Unified Benchmark and Comparative Evaluation for Tweet Classification",
author = "Barbieri, Francesco and
Camacho-Collados, Jose and
Espinosa Anke, Luis and
Neves, Leonardo",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.findings-emnlp.148",
doi = "10.18653/v1/2020.findings-emnlp.148",
pages = "1644--1650"
}
``` | 3,717 | [
[
-0.004131317138671875,
-0.05206298828125,
0.00853729248046875,
0.0300140380859375,
-0.01299285888671875,
0.01293182373046875,
-0.032379150390625,
-0.01396942138671875,
0.02685546875,
0.00331878662109375,
-0.0279998779296875,
-0.0701904296875,
-0.0518798828125,
0.00673675537109375,
-0.0264129638671875,
0.07806396484375,
0.004848480224609375,
0.00865936279296875,
0.0156402587890625,
-0.020599365234375,
-0.004093170166015625,
-0.040191650390625,
-0.053680419921875,
-0.01477813720703125,
0.033294677734375,
0.023681640625,
0.02764892578125,
0.019012451171875,
0.03125,
0.033477783203125,
-0.005939483642578125,
0.001007080078125,
-0.031707763671875,
0.014007568359375,
0.00009167194366455078,
-0.024993896484375,
-0.046875,
0.0180206298828125,
0.045074462890625,
0.044921875,
0.0025615692138671875,
0.027984619140625,
0.00699615478515625,
0.033660888671875,
-0.03045654296875,
0.012359619140625,
-0.031463623046875,
-0.0028820037841796875,
-0.0072784423828125,
-0.01444244384765625,
-0.0278167724609375,
-0.0504150390625,
0.00975799560546875,
-0.03125,
0.0117340087890625,
-0.007335662841796875,
0.0946044921875,
0.011016845703125,
-0.0170440673828125,
-0.00792694091796875,
-0.0212860107421875,
0.0902099609375,
-0.0576171875,
0.01392364501953125,
0.01361083984375,
0.006011962890625,
0.0107269287109375,
-0.0401611328125,
-0.032379150390625,
-0.0123748779296875,
0.00481414794921875,
0.01898193359375,
-0.02520751953125,
-0.0169525146484375,
0.004863739013671875,
0.01343536376953125,
-0.03717041015625,
-0.01242828369140625,
-0.0263214111328125,
-0.0034694671630859375,
0.046112060546875,
0.00501251220703125,
0.0220947265625,
-0.033416748046875,
-0.01788330078125,
-0.01488494873046875,
-0.014739990234375,
0.0036907196044921875,
0.00567626953125,
0.0377197265625,
-0.0311431884765625,
0.0379638671875,
-0.005817413330078125,
0.036956787109375,
0.006908416748046875,
-0.00835418701171875,
0.057861328125,
-0.016693115234375,
-0.02105712890625,
-0.0157470703125,
0.0872802734375,
0.0294952392578125,
0.032867431640625,
-0.007434844970703125,
-0.01026153564453125,
-0.00006711483001708984,
-0.010162353515625,
-0.05950927734375,
-0.016265869140625,
0.026580810546875,
-0.04132080078125,
-0.04925537109375,
0.005100250244140625,
-0.0626220703125,
-0.01213836669921875,
-0.007633209228515625,
0.044708251953125,
-0.042083740234375,
-0.0352783203125,
-0.00707244873046875,
-0.023712158203125,
0.01143646240234375,
0.019775390625,
-0.0494384765625,
0.0038909912109375,
0.0355224609375,
0.07037353515625,
0.0006818771362304688,
-0.02911376953125,
-0.0198974609375,
-0.0019130706787109375,
-0.0186920166015625,
0.049591064453125,
-0.03253173828125,
-0.01983642578125,
0.0031604766845703125,
0.0007658004760742188,
-0.015167236328125,
-0.0164337158203125,
0.032958984375,
-0.016021728515625,
0.021636962890625,
-0.016815185546875,
-0.036712646484375,
-0.003192901611328125,
0.032012939453125,
-0.0285797119140625,
0.08697509765625,
0.016845703125,
-0.059906005859375,
0.0186767578125,
-0.06231689453125,
-0.0289764404296875,
-0.01548004150390625,
0.00930023193359375,
-0.036590576171875,
-0.0037708282470703125,
0.0191192626953125,
0.051177978515625,
-0.0210418701171875,
0.014892578125,
-0.0399169921875,
-0.0102691650390625,
0.0275421142578125,
-0.018768310546875,
0.1005859375,
0.0203704833984375,
-0.0256195068359375,
0.00505828857421875,
-0.059326171875,
0.01947021484375,
0.01025390625,
-0.0341796875,
-0.016204833984375,
-0.0195159912109375,
0.0249481201171875,
0.016754150390625,
0.020599365234375,
-0.050506591796875,
0.0143585205078125,
-0.0269317626953125,
0.047607421875,
0.056304931640625,
-0.00411224365234375,
0.02984619140625,
-0.030517578125,
0.0301666259765625,
0.0082244873046875,
0.01436614990234375,
0.0101165771484375,
-0.037567138671875,
-0.06402587890625,
-0.00995635986328125,
0.034637451171875,
0.04083251953125,
-0.037445068359375,
0.04205322265625,
-0.038970947265625,
-0.049896240234375,
-0.045806884765625,
-0.004299163818359375,
0.0184783935546875,
0.035064697265625,
0.045440673828125,
0.00664520263671875,
-0.057403564453125,
-0.042724609375,
-0.03765869140625,
-0.0195159912109375,
0.01158905029296875,
0.0184478759765625,
0.052276611328125,
-0.01904296875,
0.054046630859375,
-0.032012939453125,
-0.0269012451171875,
-0.024566650390625,
0.028961181640625,
0.033050537109375,
0.058319091796875,
0.0550537109375,
-0.048919677734375,
-0.05828857421875,
-0.0179290771484375,
-0.06298828125,
-0.019256591796875,
0.01041412353515625,
-0.0170135498046875,
0.037628173828125,
0.016845703125,
-0.036590576171875,
0.026702880859375,
0.026702880859375,
-0.0322265625,
0.00865936279296875,
0.0100250244140625,
0.025115966796875,
-0.10577392578125,
0.004131317138671875,
0.0205841064453125,
-0.00009739398956298828,
-0.04693603515625,
-0.01552581787109375,
-0.0003726482391357422,
0.01251220703125,
-0.0289306640625,
0.04071044921875,
-0.031005859375,
0.01499176025390625,
0.00901031494140625,
0.004791259765625,
0.0013971328735351562,
0.031097412109375,
-0.0099029541015625,
0.0289459228515625,
0.04266357421875,
-0.0254058837890625,
0.0217742919921875,
0.0218505859375,
-0.0019664764404296875,
0.036376953125,
-0.051605224609375,
-0.006488800048828125,
0.0062255859375,
0.0165252685546875,
-0.08807373046875,
-0.0103302001953125,
0.0162353515625,
-0.0750732421875,
0.03204345703125,
-0.028411865234375,
-0.037078857421875,
-0.027984619140625,
-0.039306640625,
0.02154541015625,
0.048004150390625,
-0.03656005859375,
0.053466796875,
0.033050537109375,
0.01279449462890625,
-0.0545654296875,
-0.0782470703125,
0.0115203857421875,
-0.018310546875,
-0.059661865234375,
0.0211944580078125,
-0.005229949951171875,
-0.0173797607421875,
0.00910186767578125,
0.01092529296875,
-0.01467132568359375,
0.005352020263671875,
0.0140533447265625,
0.0194244384765625,
-0.0166168212890625,
0.0146636962890625,
-0.0157318115234375,
-0.00258636474609375,
0.0043182373046875,
-0.023773193359375,
0.049163818359375,
-0.030517578125,
0.016143798828125,
-0.03765869140625,
0.016357421875,
0.0281829833984375,
-0.006561279296875,
0.07305908203125,
0.07366943359375,
-0.029052734375,
-0.01242828369140625,
-0.047119140625,
0.0013666152954101562,
-0.03546142578125,
0.02532958984375,
-0.0225372314453125,
-0.04931640625,
0.04296875,
0.016387939453125,
0.0010118484497070312,
0.06573486328125,
0.049346923828125,
-0.0103912353515625,
0.0718994140625,
0.03253173828125,
-0.01084136962890625,
0.044158935546875,
-0.062744140625,
0.003696441650390625,
-0.04876708984375,
-0.0225372314453125,
-0.0526123046875,
-0.007335662841796875,
-0.06427001953125,
-0.0197906494140625,
0.0101776123046875,
-0.00485992431640625,
-0.040374755859375,
0.0209503173828125,
-0.048248291015625,
0.008697509765625,
0.03546142578125,
0.01194000244140625,
-0.00955963134765625,
0.003025054931640625,
-0.003849029541015625,
-0.00896453857421875,
-0.043731689453125,
-0.03277587890625,
0.0970458984375,
0.0238494873046875,
0.05084228515625,
0.01202392578125,
0.07281494140625,
0.01320648193359375,
0.03814697265625,
-0.048919677734375,
0.0443115234375,
-0.02655029296875,
-0.042449951171875,
-0.01499176025390625,
-0.043182373046875,
-0.060760498046875,
0.0103607177734375,
-0.0247650146484375,
-0.050506591796875,
0.00878143310546875,
-0.006317138671875,
-0.01442718505859375,
0.027496337890625,
-0.054046630859375,
0.0631103515625,
-0.006580352783203125,
-0.03399658203125,
0.0022525787353515625,
-0.03570556640625,
0.01519775390625,
0.0067596435546875,
0.0221099853515625,
-0.020355224609375,
-0.0104217529296875,
0.0789794921875,
-0.0433349609375,
0.065185546875,
-0.020599365234375,
0.01568603515625,
0.0168914794921875,
-0.006984710693359375,
0.01184844970703125,
-0.0024566650390625,
-0.0284881591796875,
0.0203704833984375,
-0.00589752197265625,
-0.042572021484375,
-0.0172271728515625,
0.06939697265625,
-0.07843017578125,
-0.028594970703125,
-0.05859375,
-0.028106689453125,
-0.00434112548828125,
0.0187835693359375,
0.03338623046875,
0.03179931640625,
-0.0103302001953125,
0.03143310546875,
0.0287628173828125,
-0.0206298828125,
0.049652099609375,
0.0258636474609375,
-0.0012035369873046875,
-0.040313720703125,
0.059967041015625,
0.0267333984375,
-0.0013523101806640625,
0.034637451171875,
0.02801513671875,
-0.0223236083984375,
-0.032684326171875,
-0.0092926025390625,
0.018890380859375,
-0.05657958984375,
-0.0301055908203125,
-0.06298828125,
-0.0235595703125,
-0.060516357421875,
-0.00855255126953125,
-0.01776123046875,
-0.05206298828125,
-0.04010009765625,
-0.0033855438232421875,
0.048187255859375,
0.05841064453125,
-0.0272979736328125,
0.0153961181640625,
-0.050140380859375,
0.01293182373046875,
-0.00027871131896972656,
0.020050048828125,
0.0024433135986328125,
-0.0609130859375,
-0.0181732177734375,
0.00739288330078125,
-0.0252532958984375,
-0.06890869140625,
0.057098388671875,
0.021148681640625,
0.0302886962890625,
0.0164337158203125,
0.0048370361328125,
0.051422119140625,
-0.01202392578125,
0.0673828125,
0.0193328857421875,
-0.07666015625,
0.051177978515625,
-0.037445068359375,
0.01352691650390625,
0.036834716796875,
0.0285186767578125,
-0.0350341796875,
-0.03338623046875,
-0.052276611328125,
-0.07171630859375,
0.058868408203125,
0.018890380859375,
0.0120697021484375,
-0.0123291015625,
0.01227569580078125,
-0.01294708251953125,
0.00664520263671875,
-0.069091796875,
-0.039276123046875,
-0.038726806640625,
-0.039794921875,
-0.01003265380859375,
-0.0131988525390625,
-0.0006456375122070312,
-0.046234130859375,
0.0653076171875,
0.009063720703125,
0.03448486328125,
0.00913238525390625,
-0.018035888671875,
-0.007572174072265625,
0.01200103759765625,
0.02471923828125,
0.04327392578125,
-0.036895751953125,
-0.00792694091796875,
0.0237579345703125,
-0.03533935546875,
0.0017099380493164062,
0.00891876220703125,
-0.00762939453125,
0.01337432861328125,
0.034149169921875,
0.036285400390625,
0.019744873046875,
-0.004566192626953125,
0.0528564453125,
-0.00844573974609375,
-0.026123046875,
-0.04266357421875,
-0.0021343231201171875,
0.0035419464111328125,
0.01055908203125,
0.048095703125,
0.01751708984375,
-0.0006103515625,
-0.041015625,
0.00739288330078125,
0.016815185546875,
-0.027557373046875,
-0.033966064453125,
0.05316162109375,
0.00646209716796875,
-0.023040771484375,
0.03277587890625,
-0.00833892822265625,
-0.059478759765625,
0.049346923828125,
0.0256500244140625,
0.0858154296875,
-0.01422119140625,
0.0230560302734375,
0.057220458984375,
0.0144805908203125,
-0.003223419189453125,
0.043853759765625,
0.01385498046875,
-0.054412841796875,
-0.005542755126953125,
-0.058563232421875,
-0.004932403564453125,
0.0008454322814941406,
-0.043304443359375,
0.0188751220703125,
-0.04010009765625,
-0.034271240234375,
0.0195159912109375,
0.0216827392578125,
-0.057403564453125,
0.031646728515625,
-0.0009908676147460938,
0.054779052734375,
-0.0697021484375,
0.0625,
0.0537109375,
-0.0623779296875,
-0.07275390625,
-0.00006717443466186523,
-0.00580596923828125,
-0.04962158203125,
0.06781005859375,
0.01629638671875,
0.00484466552734375,
0.0038604736328125,
-0.059356689453125,
-0.07275390625,
0.08056640625,
0.01055145263671875,
-0.0038928985595703125,
0.00006783008575439453,
0.00786590576171875,
0.05474853515625,
-0.033935546875,
0.046356201171875,
0.033233642578125,
0.040496826171875,
-0.0052337646484375,
-0.046142578125,
0.01316070556640625,
-0.0408935546875,
-0.0056304931640625,
0.00037407875061035156,
-0.07269287109375,
0.09429931640625,
-0.00875091552734375,
0.00146484375,
0.00476837158203125,
0.0487060546875,
0.01332855224609375,
0.015655517578125,
0.0296630859375,
0.046112060546875,
0.049346923828125,
-0.030914306640625,
0.06591796875,
-0.026763916015625,
0.0582275390625,
0.054046630859375,
0.020721435546875,
0.071533203125,
0.031341552734375,
-0.0218658447265625,
0.050689697265625,
0.053497314453125,
-0.017181396484375,
0.0289764404296875,
0.009979248046875,
0.00450897216796875,
-0.01390838623046875,
-0.006916046142578125,
-0.033782958984375,
0.029205322265625,
0.024993896484375,
-0.036865234375,
-0.01229095458984375,
-0.00920867919921875,
0.021636962890625,
-0.004730224609375,
-0.0159759521484375,
0.0386962890625,
0.01117706298828125,
-0.04315185546875,
0.06719970703125,
-0.0010576248168945312,
0.06939697265625,
-0.02069091796875,
0.00884246826171875,
-0.007732391357421875,
0.0255584716796875,
-0.031280517578125,
-0.06085205078125,
0.00525665283203125,
0.0178375244140625,
-0.0023250579833984375,
-0.0224151611328125,
0.033355712890625,
-0.03509521484375,
-0.05169677734375,
0.040283203125,
0.0294036865234375,
0.0185089111328125,
0.0171966552734375,
-0.08599853515625,
0.005786895751953125,
0.0038700103759765625,
-0.053192138671875,
0.00261688232421875,
0.0374755859375,
0.018524169921875,
0.04254150390625,
0.04583740234375,
0.0047607421875,
0.0181121826171875,
0.025238037109375,
0.0654296875,
-0.0596923828125,
-0.0272979736328125,
-0.07177734375,
0.03179931640625,
-0.0194244384765625,
-0.041900634765625,
0.06829833984375,
0.04669189453125,
0.05035400390625,
0.0024662017822265625,
0.07196044921875,
-0.0257568359375,
0.04931640625,
-0.01995849609375,
0.06793212890625,
-0.0631103515625,
0.0084991455078125,
-0.031341552734375,
-0.0557861328125,
-0.0302734375,
0.05657958984375,
-0.034271240234375,
0.03155517578125,
0.0528564453125,
0.05389404296875,
0.003131866455078125,
-0.01227569580078125,
0.007476806640625,
0.049072265625,
0.023681640625,
0.04913330078125,
0.046112060546875,
-0.059844970703125,
0.05206298828125,
-0.036529541015625,
-0.01206207275390625,
-0.0263519287109375,
-0.06201171875,
-0.08984375,
-0.0501708984375,
-0.0305633544921875,
-0.072998046875,
0.0007343292236328125,
0.08441162109375,
0.03753662109375,
-0.08135986328125,
-0.021453857421875,
0.0027828216552734375,
0.000804901123046875,
0.00853729248046875,
-0.0218505859375,
0.050994873046875,
-0.032745361328125,
-0.06329345703125,
-0.0005030632019042969,
-0.00926971435546875,
0.0096893310546875,
-0.0013360977172851562,
-0.003566741943359375,
-0.036285400390625,
0.00560760498046875,
0.03167724609375,
0.0090789794921875,
-0.049346923828125,
-0.0182037353515625,
0.006290435791015625,
-0.031219482421875,
0.01387786865234375,
0.0140380859375,
-0.03607177734375,
0.01250457763671875,
0.05560302734375,
0.0206298828125,
0.04150390625,
0.0005168914794921875,
0.02642822265625,
-0.040130615234375,
0.0037784576416015625,
0.0180816650390625,
0.03717041015625,
0.039764404296875,
-0.00939178466796875,
0.048095703125,
0.0270538330078125,
-0.03857421875,
-0.07208251953125,
-0.028167724609375,
-0.0845947265625,
-0.0170135498046875,
0.099365234375,
-0.0159912109375,
-0.039093017578125,
0.0030689239501953125,
0.0012807846069335938,
0.045013427734375,
-0.055328369140625,
0.056396484375,
0.040435791015625,
0.0138702392578125,
0.0036678314208984375,
-0.0289306640625,
0.0377197265625,
0.023223876953125,
-0.043701171875,
-0.0182952880859375,
0.0026645660400390625,
0.045257568359375,
0.0119171142578125,
0.052490234375,
0.0010356903076171875,
0.0205078125,
-0.002376556396484375,
0.0090179443359375,
-0.006786346435546875,
-0.0011396408081054688,
-0.0284423828125,
0.01369476318359375,
-0.026031494140625,
-0.0170440673828125
]
] |
openai/clip-vit-base-patch32 | 2022-10-04T09:42:04.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"clip",
"zero-shot-image-classification",
"vision",
"arxiv:2103.00020",
"arxiv:1908.04913",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | openai | null | null | openai/clip-vit-base-patch32 | 262 | 8,682,996 | transformers | 2022-03-02T23:29:05 | ---
tags:
- vision
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
---
# Model Card: CLIP
Disclaimer: The model card is taken and modified from the official CLIP repository; it can be found [here](https://github.com/openai/CLIP/blob/main/model-card.md).
## Model Details
The CLIP model was developed by researchers at OpenAI to learn about what contributes to robustness in computer vision tasks. The model was also developed to test the ability of models to generalize to arbitrary image classification tasks in a zero-shot manner. It was not developed for general model deployment: to deploy models like CLIP, researchers will first need to carefully study their capabilities in relation to the specific context they're being deployed within.
### Model Date
January 2021
### Model Type
The model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
The original implementation had two variants: one using a ResNet image encoder and the other using a Vision Transformer. This repository has the variant with the Vision Transformer.
### Documents
- [Blog Post](https://openai.com/blog/clip/)
- [CLIP Paper](https://arxiv.org/abs/2103.00020)
### Use with Transformers
```python3
from PIL import Image
import requests
from transformers import CLIPProcessor, CLIPModel
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(text=["a photo of a cat", "a photo of a dog"], images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image # this is the image-text similarity score
probs = logits_per_image.softmax(dim=1) # we can take the softmax to get the label probabilities
```
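Under the hood, `logits_per_image` is a learned temperature times the cosine similarity between L2-normalized image and text embeddings. A minimal numpy sketch with toy vectors (the embeddings and scale below are illustrative stand-ins, not real CLIP outputs):

```python
import numpy as np

# Toy stand-ins for the encoders' outputs (illustrative values only)
image_emb = np.array([[0.2, 0.9, 0.1]])    # 1 image, 3-dim embedding
text_emb = np.array([[0.1, 0.8, 0.0],      # e.g. "a photo of a cat"
                     [0.9, 0.1, 0.3]])     # e.g. "a photo of a dog"

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

logit_scale = 100.0  # CLIP learns this temperature; the value here is illustrative
logits_per_image = logit_scale * normalize(image_emb) @ normalize(text_emb).T

# Numerically stable softmax over the text candidates, as in the snippet above
shifted = logits_per_image - logits_per_image.max(axis=-1, keepdims=True)
probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
print(probs)  # the first text is far closer to the image embedding
```

This is only a conceptual sketch; the real model produces 512-dimensional embeddings and a trained `logit_scale`.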
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models: the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
### Out-of-Scope Use Cases
**Any** deployed use case of the model, whether commercial or not, is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of the performance of the model. This is because the use of artificial intelligence for tasks such as these can currently be premature given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
## Data
The model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet which tend to skew towards more developed nations, and younger, male users.
### Data Mission Statement
Our goal with building this dataset was to test out robustness and generalizability in computer vision tasks. As a result, the focus was on gathering large quantities of data from different publicly-available internet data sources. The data was gathered in a mostly non-interventionist manner. However, we only crawled websites that had policies against excessively violent and adult images and allowed us to filter out such content. We do not intend for this dataset to be used as the basis for any commercial or deployed model and will not be releasing the dataset.
## Performance and Limitations
### Performance
We have evaluated the performance of CLIP on a wide range of benchmarks across a variety of computer vision tasks, from OCR to texture recognition to fine-grained classification. The paper describes model performance on the following datasets:
- Food101
- CIFAR10
- CIFAR100
- Birdsnap
- SUN397
- Stanford Cars
- FGVC Aircraft
- VOC2007
- DTD
- Oxford-IIIT Pet dataset
- Caltech101
- Flowers102
- MNIST
- SVHN
- IIIT5K
- Hateful Memes
- SST-2
- UCF101
- Kinetics700
- Country211
- CLEVR Counting
- KITTI Distance
- STL-10
- RareAct
- Flickr30
- MSCOCO
- ImageNet
- ImageNet-A
- ImageNet-R
- ImageNet Sketch
- ObjectNet (ImageNet Overlap)
- Youtube-BB
- ImageNet-Vid
## Limitations
CLIP and our analysis of it have a number of limitations. CLIP currently struggles with certain tasks such as fine-grained classification and counting objects. CLIP also poses issues with regard to fairness and bias, which we discuss in the paper and briefly in the next section. Additionally, our approach to testing CLIP has an important limitation: in many cases we have used linear probes to evaluate the performance of CLIP, and there is evidence suggesting that linear probes can underestimate model performance.
### Bias and Fairness
We find that the performance of CLIP - and the specific biases it exhibits - can depend significantly on class design and the choices one makes for categories to include and exclude. We tested the risk of certain kinds of denigration with CLIP by classifying images of people from [Fairface](https://arxiv.org/abs/1908.04913) into crime-related and non-human animal categories. We found significant disparities with respect to race and gender. Additionally, we found that these disparities could shift based on how the classes were constructed. (Details captured in the Broader Impacts Section in the paper).
We also tested the performance of CLIP on gender, race and age classification using the Fairface dataset (we default to the race categories as they are constructed in the Fairface dataset) in order to assess quality of performance across different demographics. We found accuracy >96% across all races for gender classification, with "Middle Eastern" having the highest accuracy (98.4%) and "White" the lowest (96.5%). Additionally, CLIP averaged ~93% for racial classification and ~63% for age classification. Our use of evaluations to test for gender, race and age classification, as well as denigration harms, is simply to evaluate performance of the model across people and surface potential risks, not to demonstrate an endorsement of or enthusiasm for such tasks.
## Feedback
### Where to send questions or comments about the model
Please use [this Google Form](https://forms.gle/Uv7afRH5dvY34ZEs9) | 7,930 | [
[
-0.03851318359375,
-0.044281005859375,
0.012908935546875,
-0.0017910003662109375,
-0.01258087158203125,
-0.0192108154296875,
0.00249481201171875,
-0.055084228515625,
0.009490966796875,
0.0294189453125,
-0.0214080810546875,
-0.03143310546875,
-0.048919677734375,
0.00919342041015625,
-0.0482177734375,
0.055145263671875,
-0.005157470703125,
0.00510406494140625,
-0.0239715576171875,
-0.0253753662109375,
-0.0390625,
-0.052734375,
-0.0186004638671875,
0.0125579833984375,
0.006031036376953125,
0.01123809814453125,
0.05108642578125,
0.06536865234375,
0.06158447265625,
0.0167999267578125,
-0.0240478515625,
-0.00896453857421875,
-0.03863525390625,
-0.047760009765625,
-0.029083251953125,
-0.0307159423828125,
-0.0300750732421875,
0.0161895751953125,
0.040283203125,
0.028106689453125,
0.0021991729736328125,
0.0226593017578125,
0.005496978759765625,
0.0284423828125,
-0.07147216796875,
-0.003429412841796875,
-0.0433349609375,
0.00507354736328125,
-0.022003173828125,
0.0110931396484375,
-0.01334381103515625,
-0.01515960693359375,
0.0240631103515625,
-0.038818359375,
0.037689208984375,
-0.004352569580078125,
0.10107421875,
0.01351165771484375,
-0.0122528076171875,
-0.00261688232421875,
-0.0447998046875,
0.057220458984375,
-0.044219970703125,
0.018341064453125,
0.018035888671875,
0.0297393798828125,
0.01174163818359375,
-0.0648193359375,
-0.048736572265625,
-0.0039825439453125,
0.02301025390625,
0.001422882080078125,
-0.017913818359375,
-0.004642486572265625,
0.031768798828125,
0.037933349609375,
-0.01218414306640625,
-0.004913330078125,
-0.055419921875,
-0.0169525146484375,
0.051666259765625,
0.0229644775390625,
0.0260772705078125,
-0.0181884765625,
-0.0482177734375,
-0.035888671875,
-0.03472900390625,
0.04156494140625,
0.0298919677734375,
0.007232666015625,
-0.0121307373046875,
0.049530029296875,
-0.003307342529296875,
0.033203125,
0.0005087852478027344,
-0.0264892578125,
0.026397705078125,
-0.036285400390625,
-0.01404571533203125,
-0.0207061767578125,
0.058380126953125,
0.064208984375,
0.01351165771484375,
0.0159912109375,
-0.006671905517578125,
0.016387939453125,
0.026641845703125,
-0.0716552734375,
-0.0125274658203125,
-0.0155792236328125,
-0.04833984375,
-0.0284271240234375,
0.021697998046875,
-0.0711669921875,
0.006351470947265625,
-0.0086517333984375,
0.056549072265625,
-0.03472900390625,
-0.00548553466796875,
0.0147705078125,
-0.02484130859375,
0.0247802734375,
0.0249786376953125,
-0.052154541015625,
0.0294189453125,
0.0244140625,
0.08477783203125,
-0.036376953125,
-0.0239410400390625,
0.004428863525390625,
-0.00487518310546875,
-0.0088043212890625,
0.055145263671875,
-0.02899169921875,
-0.036468505859375,
-0.01500701904296875,
0.03338623046875,
-0.009490966796875,
-0.047271728515625,
0.0443115234375,
-0.01593017578125,
0.0017423629760742188,
-0.0216217041015625,
-0.0295867919921875,
-0.047943115234375,
0.024200439453125,
-0.054718017578125,
0.06884765625,
0.0115966796875,
-0.060028076171875,
0.0293121337890625,
-0.05450439453125,
-0.004108428955078125,
-0.00946807861328125,
-0.007671356201171875,
-0.04608154296875,
-0.0217742919921875,
0.03131103515625,
0.0247650146484375,
-0.017486572265625,
0.028106689453125,
-0.04656982421875,
-0.037933349609375,
0.014190673828125,
-0.033538818359375,
0.0684814453125,
0.0016345977783203125,
-0.0254058837890625,
-0.0005846023559570312,
-0.0350341796875,
-0.01334381103515625,
0.0270843505859375,
0.0008502006530761719,
-0.01224517822265625,
-0.0081787109375,
0.01496124267578125,
0.007122039794921875,
-0.0028514862060546875,
-0.052947998046875,
0.00955963134765625,
-0.006443023681640625,
0.041046142578125,
0.0518798828125,
0.007503509521484375,
0.020965576171875,
-0.0333251953125,
0.040252685546875,
-0.001556396484375,
0.05047607421875,
-0.0189666748046875,
-0.040130615234375,
-0.037689208984375,
-0.03570556640625,
0.044525146484375,
0.049957275390625,
-0.03326416015625,
0.0122222900390625,
-0.01055908203125,
-0.0261077880859375,
-0.01433563232421875,
-0.0170135498046875,
0.0264739990234375,
0.05047607421875,
0.0267791748046875,
-0.0753173828125,
-0.031158447265625,
-0.08050537109375,
0.0145721435546875,
0.00478363037109375,
-0.004123687744140625,
0.05328369140625,
0.0692138671875,
-0.01824951171875,
0.08331298828125,
-0.057830810546875,
-0.031768798828125,
-0.01042938232421875,
-0.0101165771484375,
-0.0017528533935546875,
0.0382080078125,
0.07275390625,
-0.07135009765625,
-0.020050048828125,
-0.040740966796875,
-0.0621337890625,
0.01061248779296875,
0.01535797119140625,
-0.00685882568359375,
0.0035305023193359375,
0.0166473388671875,
-0.0188446044921875,
0.078857421875,
0.01971435546875,
-0.00388336181640625,
0.056182861328125,
0.00677490234375,
0.0218658447265625,
-0.045013427734375,
0.0278778076171875,
0.012969970703125,
-0.011505126953125,
-0.0372314453125,
0.003589630126953125,
-0.00016200542449951172,
-0.032684326171875,
-0.07147216796875,
0.0286407470703125,
-0.010955810546875,
-0.00948333740234375,
-0.012115478515625,
-0.01436614990234375,
0.024749755859375,
0.05487060546875,
0.0104522705078125,
0.08251953125,
0.038482666015625,
-0.058380126953125,
-0.0022430419921875,
0.04180908203125,
-0.036224365234375,
0.041656494140625,
-0.0728759765625,
-0.0032024383544921875,
-0.004657745361328125,
0.00749969482421875,
-0.043060302734375,
-0.0256805419921875,
0.0237579345703125,
-0.0274658203125,
0.016357421875,
-0.01010894775390625,
-0.0243377685546875,
-0.0460205078125,
-0.04193115234375,
0.05767822265625,
0.038818359375,
-0.0341796875,
0.0279541015625,
0.055145263671875,
0.01433563232421875,
-0.04083251953125,
-0.05926513671875,
-0.00655364990234375,
-0.0156707763671875,
-0.0552978515625,
0.04229736328125,
-0.00012063980102539062,
0.005992889404296875,
0.01056671142578125,
0.006191253662109375,
-0.0243377685546875,
0.0021457672119140625,
0.0352783203125,
0.03973388671875,
-0.005657196044921875,
-0.00970458984375,
-0.0225067138671875,
0.0278167724609375,
-0.00569915771484375,
0.0098419189453125,
0.02056884765625,
-0.01126861572265625,
-0.02606201171875,
-0.039031982421875,
0.02490234375,
0.03448486328125,
-0.0204315185546875,
0.037384033203125,
0.037200927734375,
-0.02154541015625,
0.00878143310546875,
-0.040985107421875,
-0.0025920867919921875,
-0.034027099609375,
0.03802490234375,
-0.00966644287109375,
-0.051727294921875,
0.056182861328125,
0.01126861572265625,
-0.011444091796875,
0.048065185546875,
0.0235443115234375,
0.000316619873046875,
0.0650634765625,
0.0721435546875,
0.00304412841796875,
0.04901123046875,
-0.06243896484375,
-0.0009617805480957031,
-0.07745361328125,
-0.0267333984375,
-0.0195770263671875,
-0.016021728515625,
-0.0335693359375,
-0.042724609375,
0.044677734375,
0.013671875,
-0.007747650146484375,
0.032470703125,
-0.05078125,
0.034423828125,
0.047119140625,
0.03466796875,
0.0007224082946777344,
-0.00707244873046875,
-0.0001647472381591797,
-0.0124053955078125,
-0.0518798828125,
-0.03851318359375,
0.08563232421875,
0.050811767578125,
0.053924560546875,
-0.0168304443359375,
0.0169830322265625,
0.03228759765625,
-0.006103515625,
-0.05731201171875,
0.041168212890625,
-0.034759521484375,
-0.055419921875,
-0.01378631591796875,
-0.004375457763671875,
-0.058441162109375,
0.0115966796875,
-0.0106048583984375,
-0.057586669921875,
0.046966552734375,
0.01031494140625,
-0.025909423828125,
0.051483154296875,
-0.0455322265625,
0.0755615234375,
-0.0225067138671875,
-0.033416748046875,
0.005970001220703125,
-0.0496826171875,
0.044158935546875,
0.00548553466796875,
0.0023174285888671875,
-0.0164947509765625,
0.00786590576171875,
0.08294677734375,
-0.044189453125,
0.071044921875,
-0.00905609130859375,
0.032684326171875,
0.057281494140625,
-0.01343536376953125,
0.003582000732421875,
-0.01551055908203125,
0.0147705078125,
0.05450439453125,
0.0214080810546875,
-0.009002685546875,
-0.02838134765625,
0.011077880859375,
-0.055755615234375,
-0.0302886962890625,
-0.0282745361328125,
-0.034332275390625,
0.0170745849609375,
0.0157318115234375,
0.042144775390625,
0.058380126953125,
-0.0034770965576171875,
0.01233673095703125,
0.047332763671875,
-0.038055419921875,
0.0296630859375,
0.01546478271484375,
-0.0209808349609375,
-0.040130615234375,
0.06988525390625,
0.0211944580078125,
0.0161895751953125,
0.003307342529296875,
0.00638580322265625,
-0.0178680419921875,
-0.037628173828125,
-0.033721923828125,
0.00560760498046875,
-0.0562744140625,
-0.033203125,
-0.0421142578125,
-0.027984619140625,
-0.033966064453125,
-0.001369476318359375,
-0.036865234375,
-0.025634765625,
-0.048553466796875,
0.01558685302734375,
0.01343536376953125,
0.049285888671875,
-0.0077667236328125,
0.0228424072265625,
-0.04730224609375,
0.0194091796875,
0.02947998046875,
0.04071044921875,
0.0050048828125,
-0.05267333984375,
-0.0110931396484375,
-0.00025081634521484375,
-0.06756591796875,
-0.061370849609375,
0.034423828125,
0.0249786376953125,
0.0452880859375,
0.0274810791015625,
0.007190704345703125,
0.053253173828125,
-0.032470703125,
0.08282470703125,
0.0172271728515625,
-0.07330322265625,
0.04217529296875,
-0.0235443115234375,
0.01666259765625,
0.05291748046875,
0.03753662109375,
-0.015350341796875,
-0.01042938232421875,
-0.0419921875,
-0.0679931640625,
0.060821533203125,
0.01055908203125,
0.0034656524658203125,
0.00482940673828125,
0.025604248046875,
0.0018396377563476562,
0.0068206787109375,
-0.05401611328125,
-0.01251983642578125,
-0.0391845703125,
0.004253387451171875,
0.022613525390625,
-0.03277587890625,
0.00247955322265625,
-0.03228759765625,
0.0310821533203125,
-0.003864288330078125,
0.043060302734375,
0.04132080078125,
-0.01369476318359375,
0.0108489990234375,
-0.00798797607421875,
0.050140380859375,
0.0465087890625,
-0.03045654296875,
-0.017364501953125,
0.019683837890625,
-0.06396484375,
0.0010051727294921875,
-0.01360321044921875,
-0.03887939453125,
-0.003429412841796875,
0.023834228515625,
0.07159423828125,
0.0155181884765625,
-0.056396484375,
0.07666015625,
-0.007648468017578125,
-0.042022705078125,
-0.0189666748046875,
0.00585174560546875,
-0.0419921875,
0.009796142578125,
0.024810791015625,
0.01739501953125,
0.035125732421875,
-0.039154052734375,
0.0302886962890625,
0.03253173828125,
-0.02691650390625,
-0.028961181640625,
0.058624267578125,
0.01116180419921875,
-0.0158843994140625,
0.038116455078125,
-0.01354217529296875,
-0.0733642578125,
0.0623779296875,
0.03118896484375,
0.05047607421875,
-0.0007371902465820312,
0.01322174072265625,
0.05108642578125,
0.01197052001953125,
-0.0257720947265625,
-0.003467559814453125,
0.0003867149353027344,
-0.043426513671875,
-0.0160980224609375,
-0.032012939453125,
-0.04510498046875,
0.0113983154296875,
-0.07080078125,
0.03204345703125,
-0.03887939453125,
-0.0386962890625,
-0.00830078125,
-0.020660400390625,
-0.055816650390625,
0.0107421875,
0.011749267578125,
0.09344482421875,
-0.06427001953125,
0.0372314453125,
0.032562255859375,
-0.045623779296875,
-0.06170654296875,
-0.01105499267578125,
-0.00803375244140625,
-0.048828125,
0.05059814453125,
0.041473388671875,
-0.0006299018859863281,
-0.03558349609375,
-0.0721435546875,
-0.0753173828125,
0.08648681640625,
0.025115966796875,
-0.0305938720703125,
-0.006565093994140625,
-0.0016536712646484375,
0.0260772705078125,
-0.025238037109375,
0.0288848876953125,
0.0254669189453125,
-0.0016651153564453125,
0.0253753662109375,
-0.08917236328125,
-0.0143280029296875,
-0.0133209228515625,
0.020233154296875,
0.0012540817260742188,
-0.0643310546875,
0.0799560546875,
-0.02105712890625,
-0.033966064453125,
0.0041961669921875,
0.0338134765625,
-0.00424957275390625,
0.028839111328125,
0.039337158203125,
0.053466796875,
0.032318115234375,
0.004734039306640625,
0.082275390625,
-0.004695892333984375,
0.03509521484375,
0.0711669921875,
-0.01116180419921875,
0.06744384765625,
0.0230560302734375,
-0.02703857421875,
0.028656005859375,
0.033660888671875,
-0.05242919921875,
0.0589599609375,
-0.00015485286712646484,
0.011993408203125,
-0.002643585205078125,
-0.033905029296875,
-0.0223236083984375,
0.054412841796875,
0.0026149749755859375,
-0.034912109375,
-0.00508880615234375,
0.0307769775390625,
-0.018402099609375,
-0.0042724609375,
-0.0343017578125,
0.03387451171875,
-0.01238250732421875,
-0.026611328125,
0.033538818359375,
0.00527191162109375,
0.07275390625,
-0.0274200439453125,
-0.01166534423828125,
0.006679534912109375,
0.01464080810546875,
-0.006679534912109375,
-0.07244873046875,
0.0426025390625,
0.004421234130859375,
-0.0171966552734375,
0.0065460205078125,
0.0562744140625,
-0.0023212432861328125,
-0.043701171875,
0.0163421630859375,
-0.0107574462890625,
0.027130126953125,
-0.007472991943359375,
-0.0540771484375,
0.0255889892578125,
0.0045013427734375,
0.0027027130126953125,
0.02191162109375,
-0.0016536712646484375,
-0.00843048095703125,
0.05126953125,
0.0293121337890625,
-0.00359344482421875,
0.00884246826171875,
-0.026336669921875,
0.080078125,
-0.042144775390625,
-0.0309295654296875,
-0.05267333984375,
0.0268096923828125,
-0.00751495361328125,
-0.026336669921875,
0.047332763671875,
0.04742431640625,
0.0858154296875,
-0.00928497314453125,
0.042816162109375,
-0.0167694091796875,
0.038482666015625,
-0.0284881591796875,
0.03460693359375,
-0.0400390625,
-0.002410888671875,
-0.033050537109375,
-0.04840087890625,
-0.014404296875,
0.04669189453125,
-0.0306396484375,
-0.00543975830078125,
0.03814697265625,
0.056182861328125,
-0.0186004638671875,
-0.0024051666259765625,
0.019805908203125,
-0.0259552001953125,
0.01971435546875,
0.046661376953125,
0.046112060546875,
-0.060821533203125,
0.053131103515625,
-0.053131103515625,
-0.017333984375,
-0.01505279541015625,
-0.06390380859375,
-0.0792236328125,
-0.038665771484375,
-0.032867431640625,
-0.0102691650390625,
-0.00440216064453125,
0.043121337890625,
0.07421875,
-0.05426025390625,
-0.007328033447265625,
0.025115966796875,
-0.005069732666015625,
-0.0006260871887207031,
-0.0187530517578125,
0.028839111328125,
0.0159454345703125,
-0.043182373046875,
-0.01459503173828125,
0.00978851318359375,
0.027618408203125,
-0.013458251953125,
0.009552001953125,
-0.01514434814453125,
-0.0046539306640625,
0.033660888671875,
0.04052734375,
-0.049774169921875,
-0.0244598388671875,
0.01183319091796875,
0.00325775146484375,
0.0264434814453125,
0.049072265625,
-0.04888916015625,
0.033782958984375,
0.021209716796875,
0.04180908203125,
0.05120849609375,
0.020050048828125,
0.01548004150390625,
-0.033294677734375,
0.016143798828125,
0.0164947509765625,
0.0258636474609375,
0.0270843505859375,
-0.0305938720703125,
0.04498291015625,
0.037384033203125,
-0.04949951171875,
-0.074951171875,
-0.0025272369384765625,
-0.08233642578125,
-0.01503753662109375,
0.06744384765625,
-0.03143310546875,
-0.052154541015625,
0.01148223876953125,
-0.0158233642578125,
0.013427734375,
-0.0274810791015625,
0.05047607421875,
0.0300140380859375,
-0.002910614013671875,
-0.0279388427734375,
-0.0455322265625,
0.0150909423828125,
0.004459381103515625,
-0.040069580078125,
-0.0301361083984375,
0.027740478515625,
0.044677734375,
0.026031494140625,
0.036529541015625,
-0.0267791748046875,
0.0297088623046875,
0.003307342529296875,
0.02301025390625,
-0.0250091552734375,
-0.0298614501953125,
-0.036407470703125,
0.023040771484375,
-0.0217132568359375,
-0.047210693359375
]
] |
stabilityai/stable-diffusion-xl-refiner-1.0 | 2023-09-25T13:42:56.000Z | [
"diffusers",
"stable-diffusion",
"image-to-image",
"arxiv:2307.01952",
"arxiv:2211.01324",
"arxiv:2108.01073",
"arxiv:2112.10752",
"license:openrail++",
"has_space",
"diffusers:StableDiffusionXLImg2ImgPipeline",
"region:us"
] | image-to-image | stabilityai | null | null | stabilityai/stable-diffusion-xl-refiner-1.0 | 1,032 | 8,432,740 | diffusers | 2023-07-26T07:38:01 | ---
license: openrail++
tags:
- stable-diffusion
- image-to-image
---
# SD-XL 1.0-refiner Model Card
## Model
[SDXL](https://arxiv.org/abs/2307.01952) consists of an [ensemble of experts](https://arxiv.org/abs/2211.01324) pipeline for latent diffusion:
In a first step, the base model (available here: https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) is used to generate (noisy) latents,
which are then further processed with a refinement model specialized for the final denoising steps.
Note that the base model can be used as a standalone module.
Alternatively, we can use a two-stage pipeline as follows:
First, the base model is used to generate latents of the desired output size.
In the second step, we use a specialized high-resolution model and apply a technique called SDEdit (https://arxiv.org/abs/2108.01073, also known as "img2img")
to the latents generated in the first step, using the same prompt. This technique is slightly slower than the first one, as it requires more function evaluations.
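The handoff between the base model and the refiner can be made concrete with a small helper. This is an illustrative sketch only, not the diffusers implementation: the `handoff` fraction mirrors the `denoising_end`/`denoising_start` arguments that the SDXL pipelines expose, and the exact rounding inside diffusers may differ.

```python
def split_denoising_steps(num_inference_steps: int, handoff: float):
    """Split a diffusion schedule between a base model and a refiner.

    `handoff` is the fraction of the schedule handled by the base model;
    the refiner takes over for the remaining (final, low-noise) steps.
    Returns (base_steps, refiner_steps).
    """
    if not 0.0 < handoff < 1.0:
        raise ValueError("handoff must be strictly between 0 and 1")
    base_steps = int(round(num_inference_steps * handoff))
    return base_steps, num_inference_steps - base_steps
```

For example, with 50 total steps and a handoff of 0.8, the base model runs 40 steps and the refiner the final 10.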
Source code is available at https://github.com/Stability-AI/generative-models.
### Model Description
- **Developed by:** Stability AI
- **Model type:** Diffusion-based text-to-image generative model
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-xl-refiner-1.0/blob/main/LICENSE.md)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses two fixed, pretrained text encoders ([OpenCLIP-ViT/G](https://github.com/mlfoundations/open_clip) and [CLIP-ViT/L](https://github.com/openai/CLIP/tree/main)).
- **Resources for more information:** Check out our [GitHub Repository](https://github.com/Stability-AI/generative-models) and the [SDXL report on arXiv](https://arxiv.org/abs/2307.01952).
### Model Sources
For research purposes, we recommend our `generative-models` GitHub repository (https://github.com/Stability-AI/generative-models), which implements the most popular diffusion frameworks (both training and inference) and to which new functionalities like distillation will be added over time.
[Clipdrop](https://clipdrop.co/stable-diffusion) provides free SDXL inference.
- **Repository:** https://github.com/Stability-AI/generative-models
- **Demo:** https://clipdrop.co/stable-diffusion
## Evaluation
*(Figure: user-preference comparison chart of SDXL, with and without the refiner, against SDXL 0.9, Stable Diffusion 1.5, and 2.1.)*
The chart above evaluates user preference for SDXL (with and without refinement) over SDXL 0.9 and Stable Diffusion 1.5 and 2.1.
The SDXL base model performs significantly better than the previous variants, and the model combined with the refinement module achieves the best overall performance.
### 🧨 Diffusers
Make sure to upgrade diffusers to >= 0.18.0:
```sh
pip install diffusers --upgrade
```
In addition make sure to install `transformers`, `safetensors`, `accelerate` as well as the invisible watermark:
```sh
pip install invisible_watermark transformers accelerate safetensors
```
You can then use the refiner to improve images.
```py
import torch
from diffusers import StableDiffusionXLImg2ImgPipeline
from diffusers.utils import load_image
pipe = StableDiffusionXLImg2ImgPipeline.from_pretrained(
"stabilityai/stable-diffusion-xl-refiner-1.0", torch_dtype=torch.float16, variant="fp16", use_safetensors=True
)
pipe = pipe.to("cuda")
url = "https://huggingface.co/datasets/patrickvonplaten/images/resolve/main/aa_xl/000000009.png"
init_image = load_image(url).convert("RGB")
prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt, image=init_image).images[0]
```
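Conceptually, the amount of SDEdit re-noising is controlled by the pipeline's `strength` argument: it determines how far the input image is pushed back toward noise and, therefore, how many of the scheduled denoising steps actually run. The helper below sketches that relationship under the assumption that the step count scales linearly with `strength`; it is not the pipeline's internal code.

```python
def effective_img2img_steps(num_inference_steps: int, strength: float) -> int:
    """Approximate number of denoising steps that run in img2img/SDEdit.

    With strength=1.0 the input is fully re-noised and all steps run;
    lower strength preserves more of the input and runs fewer steps.
    """
    return min(int(num_inference_steps * strength), num_inference_steps)
```

So with 50 scheduled steps, a strength of 0.5 denoises for roughly half of them, while a strength of 1.0 is equivalent to generating from pure noise.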
When using `torch >= 2.0`, you can improve inference speed by 20-30% with `torch.compile`. Simply wrap the UNet with `torch.compile` before running the pipeline:
```py
pipe.unet = torch.compile(pipe.unet, mode="reduce-overhead", fullgraph=True)
```
If you are limited by GPU VRAM, you can enable *CPU offloading* by calling `pipe.enable_model_cpu_offload()`
instead of `.to("cuda")`:
```diff
- pipe.to("cuda")
+ pipe.enable_model_cpu_offload()
```
For more advanced use cases, please have a look at [the docs](https://huggingface.co/docs/diffusers/main/en/api/pipelines/stable_diffusion/stable_diffusion_xl).
## Uses
### Direct Use
The model is intended for research purposes only. Possible research areas and tasks include
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
Excluded uses are described below.
### Out-of-Scope Use
The model was not trained to produce factual or true representations of people or events; using it to generate such content is therefore out of scope.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model struggles with more difficult tasks which involve compositionality, such as rendering an image corresponding to "A red cube on top of a blue sphere".
- Faces and people in general may not be generated properly.
- The autoencoding part of the model is lossy.
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases. | 5,535 | [
[
-0.039093017578125,
-0.060943603515625,
0.035675048828125,
0.00960540771484375,
-0.0181884765625,
-0.0196990966796875,
-0.0052337646484375,
-0.0210418701171875,
0.004833221435546875,
0.032196044921875,
-0.03515625,
-0.037872314453125,
-0.0516357421875,
-0.00341033935546875,
-0.032989501953125,
0.077392578125,
-0.01201629638671875,
-0.0086822509765625,
-0.02032470703125,
-0.00868988037109375,
-0.0064544677734375,
-0.00820159912109375,
-0.078125,
-0.0152740478515625,
0.02899169921875,
-0.00699615478515625,
0.047088623046875,
0.029083251953125,
0.0217437744140625,
0.0276031494140625,
-0.02459716796875,
-0.006664276123046875,
-0.041961669921875,
0.006092071533203125,
0.003948211669921875,
-0.03302001953125,
-0.0198974609375,
0.00817108154296875,
0.041778564453125,
0.029449462890625,
-0.01348114013671875,
-0.0012760162353515625,
-0.0032958984375,
0.049896240234375,
-0.040985107421875,
0.0006895065307617188,
-0.021484375,
0.0024871826171875,
-0.0064544677734375,
0.0231475830078125,
-0.02276611328125,
-0.02459716796875,
0.01123046875,
-0.064453125,
0.04315185546875,
-0.010650634765625,
0.08966064453125,
0.04095458984375,
-0.0064697265625,
-0.007663726806640625,
-0.043060302734375,
0.052154541015625,
-0.053253173828125,
0.024261474609375,
0.018524169921875,
-0.0023784637451171875,
0.0100250244140625,
-0.08416748046875,
-0.0535888671875,
-0.00908660888671875,
0.0007376670837402344,
0.0301513671875,
-0.025634765625,
0.00789642333984375,
0.035186767578125,
0.03753662109375,
-0.0369873046875,
-0.0046539306640625,
-0.045806884765625,
-0.00533294677734375,
0.053497314453125,
0.00572967529296875,
0.024993896484375,
-0.003894805908203125,
-0.0269927978515625,
-0.01114654541015625,
-0.03948974609375,
-0.009063720703125,
0.02386474609375,
-0.01207733154296875,
-0.038909912109375,
0.037109375,
0.0070343017578125,
0.03363037109375,
0.0275726318359375,
-0.0003616809844970703,
0.0235137939453125,
-0.0247344970703125,
-0.025726318359375,
-0.031402587890625,
0.07269287109375,
0.032989501953125,
-0.01004791259765625,
0.005859375,
-0.0137786865234375,
0.0145416259765625,
0.006969451904296875,
-0.09173583984375,
-0.040802001953125,
0.030517578125,
-0.047576904296875,
-0.0297698974609375,
-0.0107879638671875,
-0.0716552734375,
-0.01325225830078125,
-0.0006384849548339844,
0.031402587890625,
-0.0273284912109375,
-0.04046630859375,
-0.003498077392578125,
-0.031585693359375,
0.0023517608642578125,
0.04052734375,
-0.057769775390625,
0.007579803466796875,
0.01039886474609375,
0.09234619140625,
-0.022491455078125,
0.001506805419921875,
-0.02215576171875,
-0.01097869873046875,
-0.0145416259765625,
0.052154541015625,
-0.0250091552734375,
-0.044158935546875,
-0.021453857421875,
0.0204315185546875,
0.00032520294189453125,
-0.04742431640625,
0.047027587890625,
-0.037139892578125,
0.0233917236328125,
-0.0099029541015625,
-0.041412353515625,
-0.01528167724609375,
-0.00798797607421875,
-0.0565185546875,
0.085205078125,
0.03228759765625,
-0.06939697265625,
0.007793426513671875,
-0.06390380859375,
-0.01244354248046875,
0.0011148452758789062,
-0.00666046142578125,
-0.059326171875,
-0.0028896331787109375,
0.006099700927734375,
0.03375244140625,
-0.0161590576171875,
0.0141754150390625,
-0.0185394287109375,
-0.01505279541015625,
-0.00795745849609375,
-0.0234375,
0.09375,
0.03729248046875,
-0.027069091796875,
0.023193359375,
-0.051849365234375,
-0.0148468017578125,
0.0226593017578125,
-0.016998291015625,
-0.00653076171875,
-0.0250701904296875,
0.0276031494140625,
0.01044464111328125,
0.004276275634765625,
-0.04974365234375,
0.0031986236572265625,
-0.0167388916015625,
0.049774169921875,
0.059967041015625,
0.0052947998046875,
0.039337158203125,
-0.0157928466796875,
0.03973388671875,
0.023773193359375,
0.006229400634765625,
-0.0190277099609375,
-0.06256103515625,
-0.06121826171875,
-0.027557373046875,
0.01953125,
0.0350341796875,
-0.06744384765625,
0.0260162353515625,
0.006786346435546875,
-0.050872802734375,
-0.042083740234375,
0.007518768310546875,
0.0153656005859375,
0.0491943359375,
0.0177154541015625,
-0.04595947265625,
-0.026885986328125,
-0.047271728515625,
0.03619384765625,
-0.010162353515625,
-0.0009250640869140625,
0.01471710205078125,
0.05047607421875,
-0.029998779296875,
0.05596923828125,
-0.061859130859375,
-0.0115966796875,
0.0036792755126953125,
0.01873779296875,
0.00923919677734375,
0.04888916015625,
0.061859130859375,
-0.072265625,
-0.049560546875,
-0.0099639892578125,
-0.061981201171875,
-0.0052337646484375,
-0.00039577484130859375,
-0.0129852294921875,
0.03289794921875,
0.031036376953125,
-0.0692138671875,
0.0457763671875,
0.053253173828125,
-0.03515625,
0.05609130859375,
-0.0298309326171875,
-0.0010433197021484375,
-0.073974609375,
0.0231170654296875,
0.0255126953125,
-0.0167083740234375,
-0.0440673828125,
0.017242431640625,
-0.0033931732177734375,
-0.0196533203125,
-0.0350341796875,
0.05670166015625,
-0.0205841064453125,
0.0284423828125,
-0.022918701171875,
-0.00540924072265625,
0.01812744140625,
0.033111572265625,
0.022705078125,
0.059051513671875,
0.06121826171875,
-0.0460205078125,
0.031890869140625,
0.0204925537109375,
-0.027496337890625,
0.037200927734375,
-0.06890869140625,
0.007793426513671875,
-0.0300750732421875,
0.0193023681640625,
-0.087158203125,
-0.00249481201171875,
0.0307159423828125,
-0.0268096923828125,
0.036376953125,
-0.0190887451171875,
-0.0255126953125,
-0.03106689453125,
-0.0142974853515625,
0.025482177734375,
0.0634765625,
-0.036956787109375,
0.042144775390625,
0.017333984375,
-0.0031585693359375,
-0.033172607421875,
-0.046600341796875,
-0.01548004150390625,
-0.02117919921875,
-0.05902099609375,
0.039794921875,
-0.03680419921875,
-0.0179901123046875,
0.011138916015625,
0.01303863525390625,
-0.0003371238708496094,
-0.00213623046875,
0.035736083984375,
0.032623291015625,
-0.00894927978515625,
-0.0189361572265625,
0.01189422607421875,
-0.0194549560546875,
0.0036716461181640625,
-0.007419586181640625,
0.028594970703125,
0.005001068115234375,
-0.006195068359375,
-0.054290771484375,
0.025726318359375,
0.046661376953125,
0.0083465576171875,
0.06414794921875,
0.0810546875,
-0.0246429443359375,
0.001964569091796875,
-0.03485107421875,
-0.016387939453125,
-0.038299560546875,
0.0212249755859375,
-0.01479339599609375,
-0.04736328125,
0.0478515625,
-0.0026912689208984375,
0.01438140869140625,
0.0494384765625,
0.053009033203125,
-0.01195526123046875,
0.0745849609375,
0.044769287109375,
0.015899658203125,
0.042510986328125,
-0.06982421875,
0.0023479461669921875,
-0.0712890625,
-0.0151824951171875,
-0.0267181396484375,
-0.00510406494140625,
-0.030548095703125,
-0.044464111328125,
0.0266265869140625,
0.0110931396484375,
-0.0182647705078125,
0.0157012939453125,
-0.049407958984375,
0.0125885009765625,
0.032196044921875,
0.01198577880859375,
0.007049560546875,
0.0101776123046875,
-0.00942230224609375,
-0.0081939697265625,
-0.0386962890625,
-0.030731201171875,
0.06964111328125,
0.0295257568359375,
0.0699462890625,
-0.00372314453125,
0.037841796875,
0.027252197265625,
0.040374755859375,
-0.031341552734375,
0.0306854248046875,
-0.017181396484375,
-0.04644775390625,
-0.01161956787109375,
-0.0244598388671875,
-0.06317138671875,
0.0156097412109375,
-0.0146484375,
-0.0341796875,
0.03302001953125,
0.00942230224609375,
-0.033935546875,
0.038848876953125,
-0.07080078125,
0.066650390625,
-0.0035839080810546875,
-0.059661865234375,
-0.0018558502197265625,
-0.041473388671875,
0.019561767578125,
0.00982666015625,
-0.0051116943359375,
-0.0011472702026367188,
-0.00708770751953125,
0.0550537109375,
-0.034515380859375,
0.06463623046875,
-0.034271240234375,
-0.0134124755859375,
0.0328369140625,
-0.0159454345703125,
0.027923583984375,
-0.000005424022674560547,
-0.0270538330078125,
0.03131103515625,
0.016571044921875,
-0.0286102294921875,
-0.036712646484375,
0.0611572265625,
-0.07476806640625,
-0.04132080078125,
-0.0228424072265625,
-0.0272064208984375,
0.04083251953125,
0.0166473388671875,
0.05511474609375,
0.0128326416015625,
-0.01255035400390625,
-0.007396697998046875,
0.06573486328125,
-0.0300140380859375,
0.03411865234375,
0.004375457763671875,
-0.0207977294921875,
-0.03662109375,
0.06475830078125,
0.00846099853515625,
0.034912109375,
0.00737762451171875,
0.00798797607421875,
-0.0267333984375,
-0.042022705078125,
-0.05499267578125,
0.0234375,
-0.05999755859375,
-0.0172576904296875,
-0.0634765625,
-0.03057861328125,
-0.030364990234375,
-0.01438140869140625,
-0.02899169921875,
-0.0273590087890625,
-0.06182861328125,
0.004974365234375,
0.034515380859375,
0.044281005859375,
-0.0141754150390625,
0.0261993408203125,
-0.0276947021484375,
0.0311737060546875,
0.0159759521484375,
0.0301055908203125,
0.01197052001953125,
-0.0382080078125,
-0.006130218505859375,
-0.0026912689208984375,
-0.048492431640625,
-0.05413818359375,
0.0390625,
0.00514984130859375,
0.031890869140625,
0.05047607421875,
0.0010747909545898438,
0.04473876953125,
-0.0170135498046875,
0.07257080078125,
0.0279541015625,
-0.05511474609375,
0.041778564453125,
-0.018280029296875,
0.007015228271484375,
0.013946533203125,
0.038177490234375,
-0.023590087890625,
-0.01812744140625,
-0.0574951171875,
-0.06317138671875,
0.0474853515625,
0.031829833984375,
0.0101165771484375,
0.0004570484161376953,
0.05377197265625,
0.00885772705078125,
0.0007042884826660156,
-0.059326171875,
-0.046173095703125,
-0.027740478515625,
-0.0022068023681640625,
0.005649566650390625,
-0.01372528076171875,
-0.00672149658203125,
-0.040863037109375,
0.0657958984375,
0.004917144775390625,
0.038787841796875,
0.0279693603515625,
0.01091766357421875,
-0.0236968994140625,
-0.0174560546875,
0.032867431640625,
0.033294677734375,
-0.0191192626953125,
-0.008544921875,
-0.005107879638671875,
-0.043304443359375,
0.01180267333984375,
0.0176544189453125,
-0.048797607421875,
0.0029449462890625,
-0.004608154296875,
0.072509765625,
-0.0228271484375,
-0.04144287109375,
0.03216552734375,
-0.017181396484375,
-0.023162841796875,
-0.031951904296875,
0.017974853515625,
0.020416259765625,
0.014678955078125,
0.0112762451171875,
0.034881591796875,
0.005550384521484375,
-0.0267486572265625,
0.00043082237243652344,
0.038909912109375,
-0.0180511474609375,
-0.02471923828125,
0.09112548828125,
0.0169677734375,
-0.0105743408203125,
0.058746337890625,
-0.0267486572265625,
-0.018646240234375,
0.05706787109375,
0.04193115234375,
0.0645751953125,
-0.00986480712890625,
0.0267486572265625,
0.05609130859375,
0.006786346435546875,
-0.01532745361328125,
0.0103759765625,
-0.00289154052734375,
-0.055633544921875,
-0.01183319091796875,
-0.04132080078125,
-0.00917816162109375,
0.0180511474609375,
-0.0330810546875,
0.0281982421875,
-0.040008544921875,
-0.0290374755859375,
0.00865936279296875,
0.004428863525390625,
-0.049530029296875,
0.01349639892578125,
0.00951385498046875,
0.06390380859375,
-0.07196044921875,
0.05706787109375,
0.04498291015625,
-0.053741455078125,
-0.029052734375,
-0.017181396484375,
-0.0140533447265625,
-0.037200927734375,
0.03851318359375,
0.0079345703125,
0.0029315948486328125,
0.0075225830078125,
-0.0673828125,
-0.059234619140625,
0.100341796875,
0.03228759765625,
-0.0294342041015625,
-0.00007641315460205078,
-0.0246734619140625,
0.042510986328125,
-0.031951904296875,
0.032135009765625,
0.029327392578125,
0.0297393798828125,
0.034027099609375,
-0.04010009765625,
0.0160369873046875,
-0.03680419921875,
0.02020263671875,
-0.0017871856689453125,
-0.0662841796875,
0.0706787109375,
-0.0433349609375,
-0.03125,
0.038970947265625,
0.053985595703125,
0.0241851806640625,
0.0309906005859375,
0.03411865234375,
0.08599853515625,
0.0489501953125,
-0.0144195556640625,
0.0848388671875,
-0.00342559814453125,
0.04241943359375,
0.048309326171875,
-0.01020050048828125,
0.047943115234375,
0.026123046875,
-0.0268707275390625,
0.05078125,
0.0570068359375,
-0.028106689453125,
0.04736328125,
0.0059051513671875,
-0.031219482421875,
0.0031375885009765625,
-0.0025539398193359375,
-0.031951904296875,
-0.01122283935546875,
0.030792236328125,
-0.045928955078125,
-0.01097869873046875,
0.01094818115234375,
0.00494384765625,
-0.0115966796875,
-0.00623321533203125,
0.03582763671875,
-0.0005950927734375,
-0.046234130859375,
0.049407958984375,
0.0060272216796875,
0.07232666015625,
-0.03778076171875,
-0.0114288330078125,
-0.012481689453125,
0.0164337158203125,
-0.0267486572265625,
-0.065185546875,
0.03076171875,
-0.00867462158203125,
-0.022125244140625,
-0.0153045654296875,
0.049346923828125,
-0.0301055908203125,
-0.043487548828125,
0.022735595703125,
0.0148468017578125,
0.027984619140625,
0.0024318695068359375,
-0.074951171875,
0.035491943359375,
0.00490570068359375,
-0.019012451171875,
0.0174713134765625,
0.0102081298828125,
0.0200347900390625,
0.0472412109375,
0.048065185546875,
0.0012664794921875,
-0.00026345252990722656,
-0.002910614013671875,
0.06646728515625,
-0.027923583984375,
-0.01532745361328125,
-0.051025390625,
0.04998779296875,
-0.00940704345703125,
-0.0174102783203125,
0.0445556640625,
0.04541015625,
0.060333251953125,
-0.01435089111328125,
0.064453125,
-0.03192138671875,
-0.0004191398620605469,
-0.03302001953125,
0.0701904296875,
-0.0479736328125,
0.01062774658203125,
-0.0274810791015625,
-0.05438232421875,
-0.00909423828125,
0.061370849609375,
-0.00814056396484375,
0.021820068359375,
0.0302276611328125,
0.07550048828125,
-0.0107421875,
-0.0026092529296875,
0.019256591796875,
0.023712158203125,
0.0272979736328125,
0.0165557861328125,
0.03802490234375,
-0.05316162109375,
0.0247344970703125,
-0.046600341796875,
-0.01751708984375,
0.00641632080078125,
-0.06097412109375,
-0.0635986328125,
-0.06854248046875,
-0.06396484375,
-0.0570068359375,
-0.01444244384765625,
0.04229736328125,
0.07684326171875,
-0.04962158203125,
-0.002933502197265625,
-0.0170745849609375,
0.00946807861328125,
-0.010955810546875,
-0.0233917236328125,
0.04071044921875,
0.00031065940856933594,
-0.077880859375,
-0.0117034912109375,
0.0221099853515625,
0.0278472900390625,
-0.04144287109375,
-0.01505279541015625,
-0.0212860107421875,
-0.011444091796875,
0.04962158203125,
0.02825927734375,
-0.053253173828125,
0.005847930908203125,
-0.01503753662109375,
0.010284423828125,
0.0164031982421875,
0.035491943359375,
-0.051177978515625,
0.042205810546875,
0.03924560546875,
0.021453857421875,
0.06524658203125,
-0.002765655517578125,
0.01451873779296875,
-0.0384521484375,
0.0178680419921875,
-0.00055694580078125,
0.030242919921875,
0.033966064453125,
-0.044921875,
0.0477294921875,
0.045806884765625,
-0.0472412109375,
-0.05133056640625,
0.007293701171875,
-0.08599853515625,
-0.015533447265625,
0.0814208984375,
-0.028106689453125,
-0.0236053466796875,
-0.0024566650390625,
-0.03155517578125,
0.02044677734375,
-0.034423828125,
0.05865478515625,
0.044830322265625,
-0.019927978515625,
-0.049407958984375,
-0.034881591796875,
0.03607177734375,
0.012664794921875,
-0.053741455078125,
-0.007293701171875,
0.03680419921875,
0.055206298828125,
0.0364990234375,
0.05889892578125,
-0.020599365234375,
0.0110626220703125,
0.008056640625,
0.0026340484619140625,
0.007843017578125,
0.00502777099609375,
-0.0237579345703125,
0.0037326812744140625,
-0.0157318115234375,
-0.001361846923828125
]
] |
roberta-base | 2023-03-06T15:14:53.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"safetensors",
"roberta",
"fill-mask",
"exbert",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1907.11692",
"arxiv:1806.02847",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | roberta-base | 236 | 8,107,369 | transformers | 2022-03-02T23:29:04 | ---
language: en
tags:
- exbert
license: mit
datasets:
- bookcorpus
- wikipedia
---
# RoBERTa base model
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1907.11692) and first released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/roberta). This model is case-sensitive: it
makes a difference between english and English.
Disclaimer: The team releasing RoBERTa did not write a model card for this model so this model card has been written by
the Hugging Face team.
## Model description
RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data), with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model
randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict
the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one
after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to
learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the RoBERTa model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
See the [model hub](https://huggingface.co/models?filter=roberta) to look for fine-tuned versions on a task that
interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation, you should look at a model like GPT-2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='roberta-base')
>>> unmasker("Hello I'm a <mask> model.")
[{'sequence': "<s>Hello I'm a male model.</s>",
'score': 0.3306540250778198,
'token': 2943,
'token_str': 'ฤ male'},
{'sequence': "<s>Hello I'm a female model.</s>",
'score': 0.04655390977859497,
'token': 2182,
'token_str': 'ฤ female'},
{'sequence': "<s>Hello I'm a professional model.</s>",
'score': 0.04232972860336304,
'token': 2038,
'token_str': 'ฤ professional'},
{'sequence': "<s>Hello I'm a fashion model.</s>",
'score': 0.037216778844594955,
'token': 2734,
'token_str': 'ฤ fashion'},
{'sequence': "<s>Hello I'm a Russian model.</s>",
'score': 0.03253649175167084,
'token': 1083,
'token_str': 'ฤ Russian'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import RobertaTokenizer, RobertaModel
tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
model = RobertaModel.from_pretrained('roberta-base')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import RobertaTokenizer, TFRobertaModel
tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
model = TFRobertaModel.from_pretrained('roberta-base')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
The training data used for this model contains a lot of unfiltered content from the internet, which is far from
neutral. Therefore, the model can have biased predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='roberta-base')
>>> unmasker("The man worked as a <mask>.")
[{'sequence': '<s>The man worked as a mechanic.</s>',
'score': 0.08702439814805984,
'token': 25682,
'token_str': 'ฤ mechanic'},
{'sequence': '<s>The man worked as a waiter.</s>',
'score': 0.0819653645157814,
'token': 38233,
'token_str': 'ฤ waiter'},
{'sequence': '<s>The man worked as a butcher.</s>',
'score': 0.073323555290699,
'token': 32364,
'token_str': 'ฤ butcher'},
{'sequence': '<s>The man worked as a miner.</s>',
'score': 0.046322137117385864,
'token': 18678,
'token_str': 'ฤ miner'},
{'sequence': '<s>The man worked as a guard.</s>',
'score': 0.040150221437215805,
'token': 2510,
'token_str': 'ฤ guard'}]
>>> unmasker("The Black woman worked as a <mask>.")
[{'sequence': '<s>The Black woman worked as a waitress.</s>',
'score': 0.22177888453006744,
'token': 35698,
'token_str': 'ฤ waitress'},
{'sequence': '<s>The Black woman worked as a prostitute.</s>',
'score': 0.19288744032382965,
'token': 36289,
'token_str': 'ฤ prostitute'},
{'sequence': '<s>The Black woman worked as a maid.</s>',
'score': 0.06498628109693527,
'token': 29754,
'token_str': 'ฤ maid'},
{'sequence': '<s>The Black woman worked as a secretary.</s>',
'score': 0.05375480651855469,
'token': 2971,
'token_str': 'ฤ secretary'},
{'sequence': '<s>The Black woman worked as a nurse.</s>',
'score': 0.05245552211999893,
'token': 9008,
'token_str': 'ฤ nurse'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The RoBERTa model was pretrained on the combination of five datasets:
- [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books;
- [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers);
- [CC-News](https://commoncrawl.org/2016/10/news-dataset-available/), a dataset containing 63 million English news
articles crawled between September 2016 and February 2019;
- [OpenWebText](https://github.com/jcpeterson/openwebtext), an open-source recreation of the WebText dataset used to
train GPT-2;
- [Stories](https://arxiv.org/abs/1806.02847), a dataset containing a subset of CommonCrawl data filtered to match the
story-like style of Winograd schemas.
Together these datasets contain 160GB of text.
## Training procedure
### Preprocessing
The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50,000. The inputs of
the model take pieces of 512 contiguous tokens that may span over documents. The beginning of a new document is marked
with `<s>` and the end of one by `</s>`.
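The packing described here can be sketched in plain Python. This is illustrative only — the real pipeline operates on BPE token IDs rather than strings, and `pack_documents` is a hypothetical helper, not part of `transformers`:

```python
def pack_documents(docs, block_size=512):
    """Concatenate tokenized documents into one stream, marking each document
    with <s>/</s>, then cut the stream into fixed-size blocks that may span
    document boundaries."""
    stream = []
    for doc in docs:
        stream.append("<s>")   # start-of-document marker
        stream.extend(doc)
        stream.append("</s>")  # end-of-document marker
    # Non-overlapping blocks of exactly block_size tokens; a trailing
    # remainder shorter than block_size is dropped in this sketch.
    return [stream[i:i + block_size]
            for i in range(0, len(stream) - block_size + 1, block_size)]
```

A block produced this way can start in the middle of one document and run into the next; the `<s>`/`</s>` markers tell the model where the boundary lies.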
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `<mask>`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (i.e., it changes at each epoch and is not fixed).
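A minimal sketch of this 80/10/10 rule in plain Python (illustrative token strings rather than BPE IDs; `mask_tokens` is a hypothetical helper — dynamic masking simply means this function is re-applied with a fresh random draw each time a sequence is seen):

```python
import random

def mask_tokens(tokens, vocab, mask_token="<mask>", mlm_prob=0.15, seed=None):
    """Apply the MLM masking rule: select 15% of positions, then replace
    80% of those with <mask>, 10% with a random token, and leave 10% as-is."""
    rng = random.Random(seed)
    masked, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mlm_prob:
            labels[i] = tok                    # model must predict the original
            r = rng.random()
            if r < 0.8:
                masked[i] = mask_token         # 80%: replace with <mask>
            elif r < 0.9:
                masked[i] = rng.choice(vocab)  # 10%: random token
            # remaining 10%: keep the original token unchanged
    return masked, labels
```

Only the positions with a non-`None` label contribute to the MLM loss; all other positions are ignored by the objective.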
### Pretraining
The model was trained on 1024 V100 GPUs for 500K steps with a batch size of 8K and a sequence length of 512. The
optimizer used is Adam with a learning rate of 6e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.98\\) and
\\(\epsilon = 1e-6\\), a weight decay of 0.01, learning rate warmup for 24,000 steps and linear decay of the learning
rate after.
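The warmup-then-linear-decay schedule can be written out directly. A sketch under the stated hyperparameters (`lr_at` is a hypothetical helper, not a `transformers` API):

```python
def lr_at(step, peak_lr=6e-4, warmup_steps=24_000, total_steps=500_000):
    """Learning rate at a given step: linear warmup to peak_lr over
    warmup_steps, then linear decay to zero at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```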
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
Glue test results:
| Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE |
|:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|
| | 87.6 | 91.9 | 92.8 | 94.8 | 63.6 | 91.2 | 90.2 | 78.7 |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1907-11692,
author = {Yinhan Liu and
Myle Ott and
Naman Goyal and
Jingfei Du and
Mandar Joshi and
Danqi Chen and
Omer Levy and
Mike Lewis and
Luke Zettlemoyer and
Veselin Stoyanov},
title = {RoBERTa: {A} Robustly Optimized {BERT} Pretraining Approach},
journal = {CoRR},
volume = {abs/1907.11692},
year = {2019},
url = {http://arxiv.org/abs/1907.11692},
archivePrefix = {arXiv},
eprint = {1907.11692},
timestamp = {Thu, 01 Aug 2019 08:59:33 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1907-11692.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=roberta-base">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 9,064 | [
[
-0.0119171142578125,
-0.059051513671875,
0.0160980224609375,
-0.0010471343994140625,
-0.0270843505859375,
-0.005008697509765625,
-0.0267791748046875,
-0.0281829833984375,
0.0208282470703125,
0.0306243896484375,
-0.042816162109375,
-0.0440673828125,
-0.06805419921875,
0.0012311935424804688,
-0.026611328125,
0.10125732421875,
0.0117340087890625,
0.0150909423828125,
0.005092620849609375,
0.013885498046875,
-0.0228424072265625,
-0.044403076171875,
-0.0447998046875,
-0.023162841796875,
0.018310546875,
-0.005420684814453125,
0.04156494140625,
0.0394287109375,
0.0231170654296875,
0.0260772705078125,
-0.01934814453125,
0.00547027587890625,
-0.031341552734375,
0.0009279251098632812,
-0.0049896240234375,
-0.035675048828125,
-0.0233001708984375,
0.020660400390625,
0.0240325927734375,
0.042449951171875,
-0.004871368408203125,
0.030364990234375,
0.01313018798828125,
0.03131103515625,
-0.019012451171875,
0.01558685302734375,
-0.04766845703125,
-0.0018339157104492188,
-0.02490234375,
0.012115478515625,
-0.028228759765625,
-0.01177978515625,
0.01153564453125,
-0.029693603515625,
0.03192138671875,
-0.002521514892578125,
0.102783203125,
0.0120849609375,
-0.019805908203125,
-0.020660400390625,
-0.042572021484375,
0.072265625,
-0.06475830078125,
0.01468658447265625,
0.0328369140625,
0.0077972412109375,
-0.0094757080078125,
-0.06915283203125,
-0.04376220703125,
-0.00769805908203125,
-0.018768310546875,
0.0121612548828125,
-0.026336669921875,
-0.01390838623046875,
0.0220794677734375,
0.031829833984375,
-0.05023193359375,
-0.01085662841796875,
-0.048309326171875,
-0.022430419921875,
0.041290283203125,
0.00018417835235595703,
0.0197601318359375,
-0.031463623046875,
-0.0272979736328125,
-0.01561737060546875,
-0.018798828125,
0.01088714599609375,
0.04052734375,
0.02655029296875,
-0.0181121826171875,
0.0386962890625,
-0.005401611328125,
0.055572509765625,
0.004833221435546875,
-0.0214996337890625,
0.040252685546875,
-0.0158233642578125,
-0.0223846435546875,
-0.0177154541015625,
0.0726318359375,
0.01995849609375,
0.02667236328125,
-0.0046539306640625,
-0.0132293701171875,
0.0167694091796875,
0.0157470703125,
-0.05322265625,
-0.02203369140625,
0.0189208984375,
-0.03875732421875,
-0.035736083984375,
0.01285552978515625,
-0.0599365234375,
0.0011510848999023438,
-0.00908660888671875,
0.043060302734375,
-0.0293426513671875,
-0.01445770263671875,
0.01290130615234375,
-0.0280303955078125,
0.01526641845703125,
0.0047760009765625,
-0.064453125,
0.00957489013671875,
0.035491943359375,
0.0677490234375,
0.004741668701171875,
-0.01404571533203125,
-0.0185089111328125,
-0.00811004638671875,
0.0004589557647705078,
0.033599853515625,
-0.0235595703125,
-0.004734039306640625,
-0.00786590576171875,
0.0214080810546875,
-0.0174102783203125,
-0.0181732177734375,
0.036102294921875,
-0.0225677490234375,
0.05023193359375,
0.0152435302734375,
-0.0292816162109375,
-0.0240325927734375,
0.0131072998046875,
-0.042633056640625,
0.08941650390625,
0.0190277099609375,
-0.0677490234375,
0.0209808349609375,
-0.04925537109375,
-0.031402587890625,
-0.0139007568359375,
0.010986328125,
-0.051971435546875,
-0.0038280487060546875,
0.023468017578125,
0.0355224609375,
-0.02435302734375,
0.0276336669921875,
-0.0036640167236328125,
-0.025848388671875,
0.0220794677734375,
-0.028717041015625,
0.1036376953125,
0.01474761962890625,
-0.0465087890625,
0.00007253885269165039,
-0.059356689453125,
-0.002460479736328125,
0.0341796875,
-0.0284576416015625,
-0.007633209228515625,
-0.0156402587890625,
0.01361846923828125,
0.0216827392578125,
0.01436614990234375,
-0.04083251953125,
0.01120758056640625,
-0.0364990234375,
0.052276611328125,
0.0555419921875,
-0.00902557373046875,
0.0184173583984375,
-0.032501220703125,
0.041961669921875,
-0.0031642913818359375,
0.0143890380859375,
-0.015533447265625,
-0.052001953125,
-0.0550537109375,
-0.0355224609375,
0.05059814453125,
0.052001953125,
-0.03619384765625,
0.041015625,
-0.00690460205078125,
-0.043487548828125,
-0.06536865234375,
-0.004764556884765625,
0.03521728515625,
0.04351806640625,
0.03411865234375,
-0.033721923828125,
-0.04913330078125,
-0.056640625,
-0.021392822265625,
0.0070648193359375,
-0.02105712890625,
0.02105712890625,
0.046661376953125,
-0.0211181640625,
0.051971435546875,
-0.045440673828125,
-0.039154052734375,
-0.02545166015625,
0.006500244140625,
0.0418701171875,
0.0509033203125,
0.0355224609375,
-0.04736328125,
-0.03753662109375,
-0.0183868408203125,
-0.054443359375,
0.01015472412109375,
-0.0004444122314453125,
-0.01084136962890625,
0.0290374755859375,
0.0266265869140625,
-0.05975341796875,
0.043060302734375,
0.04193115234375,
-0.0262603759765625,
0.046234130859375,
-0.0169677734375,
-0.005184173583984375,
-0.1014404296875,
0.016357421875,
0.0006680488586425781,
-0.0204925537109375,
-0.05975341796875,
0.00409698486328125,
-0.0149688720703125,
-0.01424407958984375,
-0.034637451171875,
0.04022216796875,
-0.043548583984375,
-0.001163482666015625,
0.0024356842041015625,
0.01149749755859375,
0.0106658935546875,
0.060455322265625,
0.000797271728515625,
0.050079345703125,
0.0439453125,
-0.024871826171875,
0.01261138916015625,
0.026885986328125,
-0.035675048828125,
0.01390838623046875,
-0.056243896484375,
0.021881103515625,
-0.0086212158203125,
0.00713348388671875,
-0.0748291015625,
-0.00799560546875,
0.02435302734375,
-0.05718994140625,
0.0301361083984375,
-0.032623291015625,
-0.03668212890625,
-0.041717529296875,
-0.01351165771484375,
0.0122833251953125,
0.05316162109375,
-0.024993896484375,
0.049774169921875,
0.030120849609375,
-0.005771636962890625,
-0.057464599609375,
-0.05865478515625,
-0.0020580291748046875,
-0.017547607421875,
-0.054168701171875,
0.034698486328125,
0.003932952880859375,
-0.0059661865234375,
-0.007671356201171875,
0.0030975341796875,
-0.00974273681640625,
0.009490966796875,
0.020263671875,
0.034149169921875,
-0.00348663330078125,
-0.009979248046875,
-0.01506805419921875,
-0.01064300537109375,
0.0025653839111328125,
-0.03448486328125,
0.07037353515625,
-0.003292083740234375,
-0.0035495758056640625,
-0.0318603515625,
0.0162200927734375,
0.02825927734375,
-0.0272369384765625,
0.06591796875,
0.0784912109375,
-0.02825927734375,
0.0014190673828125,
-0.031646728515625,
-0.0184326171875,
-0.033538818359375,
0.032684326171875,
-0.02386474609375,
-0.0634765625,
0.050048828125,
0.0239410400390625,
-0.0104827880859375,
0.055145263671875,
0.0433349609375,
-0.009918212890625,
0.075439453125,
0.03424072265625,
-0.009307861328125,
0.03692626953125,
-0.045501708984375,
0.0130767822265625,
-0.0621337890625,
-0.0251312255859375,
-0.03765869140625,
-0.0196075439453125,
-0.051361083984375,
-0.031524658203125,
0.0219573974609375,
0.0093994140625,
-0.0167999267578125,
0.03753662109375,
-0.05609130859375,
0.019683837890625,
0.0638427734375,
0.0296783447265625,
-0.0018606185913085938,
0.006317138671875,
-0.01763916015625,
-0.002788543701171875,
-0.046234130859375,
-0.0313720703125,
0.097900390625,
0.0350341796875,
0.0396728515625,
0.002193450927734375,
0.049072265625,
0.0208587646484375,
0.0019588470458984375,
-0.032501220703125,
0.0301666259765625,
-0.0212554931640625,
-0.0653076171875,
-0.0211334228515625,
-0.0221710205078125,
-0.07940673828125,
0.0202178955078125,
-0.02545166015625,
-0.06512451171875,
0.00412750244140625,
-0.004276275634765625,
-0.0123138427734375,
0.02984619140625,
-0.0521240234375,
0.07513427734375,
-0.01155853271484375,
-0.024749755859375,
-0.00225830078125,
-0.0618896484375,
0.02325439453125,
0.0102081298828125,
0.006908416748046875,
0.002223968505859375,
0.028076171875,
0.072998046875,
-0.036834716796875,
0.07513427734375,
-0.018463134765625,
0.007534027099609375,
0.0176849365234375,
-0.00325775146484375,
0.043212890625,
-0.011077880859375,
0.0003285408020019531,
0.045440673828125,
-0.0139312744140625,
-0.037567138671875,
-0.0223388671875,
0.0291290283203125,
-0.0660400390625,
-0.0479736328125,
-0.050140380859375,
-0.043914794921875,
0.019439697265625,
0.0278778076171875,
0.0433349609375,
0.0399169921875,
0.007221221923828125,
0.005252838134765625,
0.034942626953125,
-0.016876220703125,
0.03466796875,
0.021026611328125,
-0.004627227783203125,
-0.034149169921875,
0.05255126953125,
0.00594329833984375,
0.0159149169921875,
0.0202178955078125,
0.00560760498046875,
-0.02606201171875,
-0.04266357421875,
-0.0278472900390625,
0.0254058837890625,
-0.04132080078125,
-0.0221710205078125,
-0.061431884765625,
-0.033050537109375,
-0.042022705078125,
-0.0048370361328125,
-0.0106658935546875,
-0.036834716796875,
-0.041656494140625,
-0.00431060791015625,
0.029144287109375,
0.051025390625,
-0.0030364990234375,
0.0226287841796875,
-0.03912353515625,
0.01464080810546875,
0.0221405029296875,
0.0108795166015625,
-0.00787353515625,
-0.0731201171875,
-0.0247650146484375,
0.012786865234375,
-0.025146484375,
-0.06280517578125,
0.0577392578125,
0.0004475116729736328,
0.036773681640625,
0.0247802734375,
-0.0103759765625,
0.048614501953125,
-0.0278778076171875,
0.07000732421875,
0.010467529296875,
-0.071044921875,
0.043060302734375,
-0.03173828125,
0.01229095458984375,
0.022857666015625,
0.027984619140625,
-0.03192138671875,
-0.0404052734375,
-0.0699462890625,
-0.07373046875,
0.06976318359375,
0.026580810546875,
0.00861358642578125,
0.00295257568359375,
0.0171661376953125,
-0.002201080322265625,
0.0228729248046875,
-0.0860595703125,
-0.0369873046875,
-0.0313720703125,
-0.0244293212890625,
-0.01251983642578125,
-0.01666259765625,
-0.00690460205078125,
-0.02716064453125,
0.05938720703125,
0.00815582275390625,
0.04754638671875,
0.01708984375,
-0.0290374755859375,
0.01120758056640625,
0.01114654541015625,
0.053955078125,
0.041473388671875,
-0.035003662109375,
0.005641937255859375,
0.01270294189453125,
-0.047515869140625,
0.0004398822784423828,
0.0239410400390625,
-0.0249786376953125,
0.010101318359375,
0.03302001953125,
0.0711669921875,
0.0008511543273925781,
-0.0361328125,
0.04815673828125,
0.0065765380859375,
-0.021148681640625,
-0.034423828125,
0.0048980712890625,
0.005245208740234375,
0.0210723876953125,
0.0333251953125,
0.012969970703125,
-0.00591278076171875,
-0.042510986328125,
0.0169219970703125,
0.036773681640625,
-0.030029296875,
-0.019866943359375,
0.0682373046875,
-0.0086212158203125,
-0.03692626953125,
0.05035400390625,
-0.02313232421875,
-0.05853271484375,
0.0501708984375,
0.05230712890625,
0.065673828125,
-0.014862060546875,
0.020965576171875,
0.042327880859375,
0.034576416015625,
0.00044465065002441406,
0.00759124755859375,
0.01453399658203125,
-0.049530029296875,
-0.031036376953125,
-0.059356689453125,
0.00998687744140625,
0.02581787109375,
-0.04705810546875,
0.014862060546875,
-0.03387451171875,
-0.0232696533203125,
0.00644683837890625,
0.012298583984375,
-0.059112548828125,
0.0185546875,
-0.003002166748046875,
0.06048583984375,
-0.0792236328125,
0.06500244140625,
0.042449951171875,
-0.05413818359375,
-0.06036376953125,
-0.0034637451171875,
-0.00861358642578125,
-0.0748291015625,
0.055877685546875,
0.0207672119140625,
0.0242919921875,
0.0006613731384277344,
-0.0367431640625,
-0.0667724609375,
0.093994140625,
0.020599365234375,
-0.031585693359375,
-0.0140838623046875,
0.00798797607421875,
0.0439453125,
-0.03729248046875,
0.04962158203125,
0.0360107421875,
0.028411865234375,
-0.01529693603515625,
-0.06805419921875,
0.0108795166015625,
-0.025482177734375,
0.006198883056640625,
0.00836944580078125,
-0.051239013671875,
0.09295654296875,
-0.01036834716796875,
-0.006816864013671875,
0.0006966590881347656,
0.032470703125,
0.0051422119140625,
0.0096282958984375,
0.034881591796875,
0.05743408203125,
0.0577392578125,
-0.02203369140625,
0.07958984375,
-0.0238494873046875,
0.045135498046875,
0.0634765625,
0.0132904052734375,
0.05206298828125,
0.0181121826171875,
-0.0284576416015625,
0.062255859375,
0.04376220703125,
-0.024627685546875,
0.03826904296875,
0.01114654541015625,
-0.00516510009765625,
-0.0012693405151367188,
0.0048828125,
-0.01995849609375,
0.039337158203125,
0.004730224609375,
-0.041656494140625,
-0.0008821487426757812,
0.0018978118896484375,
0.0279388427734375,
-0.00389862060546875,
-0.01094818115234375,
0.053253173828125,
-0.003208160400390625,
-0.045440673828125,
0.05194091796875,
0.01316070556640625,
0.06072998046875,
-0.045806884765625,
0.00444793701171875,
-0.016265869140625,
0.00899505615234375,
-0.00812530517578125,
-0.04962158203125,
0.01059722900390625,
0.0017576217651367188,
-0.032958984375,
-0.013336181640625,
0.052154541015625,
-0.04693603515625,
-0.044525146484375,
0.019317626953125,
0.020172119140625,
0.0251312255859375,
-0.0070648193359375,
-0.06707763671875,
-0.005466461181640625,
0.0264434814453125,
-0.0165557861328125,
0.02777099609375,
0.016143798828125,
0.0095367431640625,
0.045806884765625,
0.06829833984375,
0.01180267333984375,
0.006053924560546875,
-0.003978729248046875,
0.06329345703125,
-0.057098388671875,
-0.0421142578125,
-0.0623779296875,
0.05718994140625,
-0.00858306884765625,
-0.0249786376953125,
0.0606689453125,
0.0465087890625,
0.06866455078125,
-0.019012451171875,
0.052825927734375,
-0.0128936767578125,
0.042266845703125,
-0.04541015625,
0.057891845703125,
-0.036651611328125,
0.005847930908203125,
-0.0272369384765625,
-0.06475830078125,
-0.0110626220703125,
0.0634765625,
-0.0153656005859375,
0.01526641845703125,
0.04541015625,
0.069091796875,
-0.00868988037109375,
-0.02386474609375,
0.006671905517578125,
0.027069091796875,
0.0115814208984375,
0.0465087890625,
0.034698486328125,
-0.061279296875,
0.050079345703125,
-0.020111083984375,
-0.01169586181640625,
-0.020294189453125,
-0.06005859375,
-0.08294677734375,
-0.049163818359375,
-0.02178955078125,
-0.05181884765625,
0.006870269775390625,
0.0633544921875,
0.0584716796875,
-0.062286376953125,
-0.01654052734375,
-0.00240325927734375,
0.00843048095703125,
-0.026611328125,
-0.023773193359375,
0.04034423828125,
-0.018768310546875,
-0.06463623046875,
0.01183319091796875,
-0.00337982177734375,
0.0150146484375,
-0.0123443603515625,
-0.0055389404296875,
-0.034423828125,
-0.0009179115295410156,
0.032073974609375,
0.00650787353515625,
-0.054412841796875,
-0.019134521484375,
-0.001251220703125,
-0.00502777099609375,
0.00875091552734375,
0.032684326171875,
-0.048583984375,
0.02606201171875,
0.020294189453125,
0.023590087890625,
0.0726318359375,
-0.00101470947265625,
0.0277862548828125,
-0.06463623046875,
0.0273284912109375,
0.00617218017578125,
0.0303192138671875,
0.0273895263671875,
-0.0333251953125,
0.042266845703125,
0.0355224609375,
-0.043609619140625,
-0.066650390625,
-0.003082275390625,
-0.0731201171875,
-0.0245819091796875,
0.07958984375,
-0.01983642578125,
-0.0294189453125,
-0.005268096923828125,
-0.01128387451171875,
0.034698486328125,
-0.0284576416015625,
0.06256103515625,
0.048126220703125,
0.010009765625,
-0.013519287109375,
-0.041015625,
0.0421142578125,
0.0284881591796875,
-0.0321044921875,
-0.01239013671875,
0.013031005859375,
0.049072265625,
0.024322509765625,
0.04913330078125,
-0.006656646728515625,
0.00467681884765625,
0.002002716064453125,
0.0233612060546875,
-0.01363372802734375,
-0.0103912353515625,
-0.029876708984375,
0.0099334716796875,
-0.018951416015625,
-0.0258636474609375
]
] |
MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli | 2023-03-20T08:28:44.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"text-classification",
"zero-shot-classification",
"en",
"dataset:multi_nli",
"dataset:anli",
"dataset:fever",
"arxiv:2006.03654",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | MoritzLaurer | null | null | MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli | 105 | 8,058,718 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
license: mit
tags:
- text-classification
- zero-shot-classification
datasets:
- multi_nli
- anli
- fever
metrics:
- accuracy
pipeline_tag: zero-shot-classification
model-index:
- name: MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli
results:
- task:
type: natural-language-inference
name: Natural Language Inference
dataset:
name: anli
type: anli
config: plain_text
split: test_r3
metrics:
- type: accuracy
value: 0.495
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYWViYjQ5YTZlYjU4NjQyN2NhOTVhNjFjNGQyMmFiNmQyZjRkOTdhNzJmNjc3NGU4MmY0MjYyMzY5MjZhYzE0YiIsInZlcnNpb24iOjF9.S8pIQ7gEGokd_wKXMi6Bc3B2DThIP3cvVkTFErZ-2JxXTSCy1TBuulY3dzGfaiP7kTHbL52OuBhG_-wb7Ue9DQ
- type: precision
value: 0.4984740618243923
name: Precision Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTllZDU3NmVmYjk4ZmYzNjAwNzExMGZjNDMzOWRkZjRjMTRhNzhlZmI0ZmNlM2E0Mzk4OWE5NTM5MTYyYWU5NCIsInZlcnNpb24iOjF9.WHz_TUJgPVn-rU-9vBCDdmSMOuWzADwr09rJY6ktqRM46zytbyWs7Vcm7jqDrTkfU-rp0_7IyoNv_xEsKhJbBA
- type: precision
value: 0.495
name: Precision Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjllODE3ZjUxZDhiMTI0MzZmYjY5OTUwYWI2OTc4ZjJhNTVjMjY2ODdkMmJlZjQ5YWQ1Mjk2ZThmYjJlM2RlYSIsInZlcnNpb24iOjF9.a9V06-O7l9S0Bv4vj0aard8128SAP61DZdXl_3XqdmNgt_C6KAoDBVueF2M2kF_kT6lRfEz6YW0ACIfJNXDYAA
- type: precision
value: 0.4984357572868885
name: Precision Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjhiMzYzY2JiMmYwN2YxYzEwZTQ3NGI1NzFmMzliNjJkMDE2YzI5Njg1ZjEzMGIxODdiMDNmYmI4Y2Y2MmJkMiIsInZlcnNpb24iOjF9.xvZZaUMogw9MJjb3ls6h5liDlTqHMmNgqk6KbyDqQWfCcD255brCU3Xo6nECwaChS4te0dQu_iWGBqR_o2kYAA
- type: recall
value: 0.49461028192371476
name: Recall Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDVjYTEzOTI0ZjVhOTk3ZTkzZmZhNTk5ODcxMWJhYWU4ZTRjYWVhNzcwOWY5YmI2NGFlYWE4NjM5MDY5NTExOSIsInZlcnNpb24iOjF9.xgHCB2rbCQBzHzUokw4u8JyOdhtF4yvPv1t8t7YiEkaAuM5MAPsVuCZ1VtlLapHS_IWetlocizsVl6akjh3cAQ
- type: recall
value: 0.495
name: Recall Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYTEyYmM0ZDQ0M2RiMDNhNjIxNzQ4OWZiNTBiOTAwZDFkNjNmYjBhNjA4NmQ0NjFkNmNiZTljNDkxNDg3NzIyYSIsInZlcnNpb24iOjF9.3FJPwNtwgFNvMjVxVAayaVXXR1sWlr0sqAYmXzmMzMxl7IJh6RS77dGPwFaqD3jamLVBiqPn9wsfz5lFK5yTAA
- type: recall
value: 0.495
name: Recall Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNmY1MjZlZTQ4OTg5YzdlYmFhZDMzMmNlNjNkYmIyZGI4M2NjZjQ1ZDVkNmZkMTUxNjI3M2UwZmI1MDM1NDYwOSIsInZlcnNpb24iOjF9.cnbM6xjTLRa9z0wEDGd_Q4lTXVLRKIQ6_YLGLjf-t7Nto4lzxAeWF-RrwA0Mq9OPITlJq2Jk1Eg_0Utb13d9Dg
- type: f1
value: 0.4942810999491704
name: F1 Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2U3NGM1MDM4YTM4NzQxMGM4ZTIyZDM2YTQ1MGNlZWM1MzEzM2MxN2ZmZmRmYTM0OWJmZGJjYjM5OWEzMmZjNSIsInZlcnNpb24iOjF9.vMtge1F-tmMn9D3aVUuwcNEXjqpNgEyHAl9f5UDSoTYcOgTwi2vi5yRGRCl8y6Fx7BtgaCwMyoZVNbP5-GRtCA
- type: f1
value: 0.495
name: F1 Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjBjMTQ5MmQ5OGE5OWJjZGMyNzg4N2RmNDUzMzQ5Zjc4ZTc4N2JlMTk0MTc2M2RjZTgzOTNlYWQzODAwNDI0NCIsInZlcnNpb24iOjF9.yxXG0CNWW8__xJC14BjbTY9QkXD75x6uCIXR51oKDemkP0b_xGyd-A2wPIuwNJN1EYkQevPY0bhVpRWBKyO9Bg
- type: f1
value: 0.4944671868893595
name: F1 Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzczNjQzY2FmMmY4NTAwYjNkYjJlN2I2NjI2Yjc0ZmQ3NjZiN2U5YWEwYjk4OTUyOTMzZTYyZjYzOTMzZGU2YiIsInZlcnNpb24iOjF9.mLOnst2ScPX7ZQwaUF12W2nv7-w9lX9-BxHl3-0T0gkSWnmtBSwYcL5faTX0_I5q33Fjz5tfkjpCJuxP5JYIBQ
- type: loss
value: 1.8788293600082397
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzRlOTYwYjU1Y2Y4ZGM0NDBjYTE2MmEzNWIwN2NiMWVkOWZlNzA2ZmQ3YjZjNzI4MjQwYWZhODIwMzU3ODAyZiIsInZlcnNpb24iOjF9._Xs9bl48MSavvp5eyamrP2iNlFWv35QZCrmWjJXLkUdIBx0ElCjEdxBb3dxPGnUxdpDzGMmOoKCPI44ZPXrtDw
- task:
type: natural-language-inference
name: Natural Language Inference
dataset:
name: anli
type: anli
config: plain_text
split: test_r1
metrics:
- type: accuracy
value: 0.712
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYWYxMGY0ZWU0YTEyY2I3NmQwZmQ3YmFmNzQxNGU5OGNjN2ViN2I0ZjdkYWUzM2RmYzkzMDg3ZjVmNGYwNGZkZCIsInZlcnNpb24iOjF9.snWBusAeo1rrQqWk--vTxb-CBcFqM298YCtwTQGBZiFegKGSTSKzj-SM6HMNsmoQWmMuv7UfYPqYlnzEthOSAg
- type: precision
value: 0.7134839439315348
name: Precision Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjMxMjg1Y2QwNzMwM2ZkNGM3ZTJhOGJmY2FkNGI1ZTFhOGQ3ODViNTJmZTYwMWJkZDYyYWRjMzFmZDI1NTM5YSIsInZlcnNpb24iOjF9.ZJnY6zYOBn-YEtN7uKzQ-VKXPwlIO1zq19Yuo37vBJNSs1dGDd8f1jgfdZuA19e_wA3Nc5nQKe9VXRwPHPgwAQ
- type: precision
value: 0.712
name: Precision Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZWM4YWQyODBlYTIwMWQxZDA1NmY1M2M2ODgwNDJiY2RhMDVhYTlkMDUzZTJkMThkYzRmNDg2YTdjMjczNGUwOCIsInZlcnNpb24iOjF9.SogsKHdbdlEs05IBYwXvlnaC_esg-DXAPc2KPRyHaVC5ItVHbxa63NpybSpao4baOoMlLG9aRe7TjG4gtB2dAQ
- type: precision
value: 0.7134676028447461
name: Precision Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODdjMzFkM2IwNWZiM2I4ZWViMmQ4NWM5MDY5ZWQxZjc1MGRmNjhmNzJhYWFmOWEwMjg3ZjhiZWM3YjlhOTIxNSIsInZlcnNpb24iOjF9._0JNIbiqLuDZrp_vrCljBe28xexZJPmigLyhkcO8AtH2VcNxWshwCpZuRF4bqvpMvnApJeuGMf3vXjCj0MC1Bw
- type: recall
value: 0.7119814425203647
name: Recall Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYjU4MWEyMzkyYzg1ZTIxMTc0M2NhMTgzOGEyZmY5OTg3M2Q1ZmMwNmU3ZmU1ZjA1MDk0OGZkMzM5NDVlZjBlNSIsInZlcnNpb24iOjF9.sZ3GTcmGGthpTLL7_Zovq8aBmE3Dp_PZi5v8ZI9yG9N6B_GjWvBuPC8ENXK1NwmwiHLsSvtKTG5JmAum-su0Dg
- type: recall
value: 0.712
name: Recall Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDg3NGViZTlmMWM2ZDNhMzIzZGZkYWZhODQxNzg2MjNiNjQ0Zjg0NjQ1OWZkY2I5ODdiY2Y3Y2JjNzRmYjJkMiIsInZlcnNpb24iOjF9.bCZUzJamsozKWehnNph6E5coww5zZTrJdbWevWrSyfT0PyXc_wkZ-NKdyBAoqprBz3_8L3i5hPM6Qsy56b4BDA
- type: recall
value: 0.712
name: Recall Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDk1MDJiOGUzZThlZjJjMzY4NjMzODFiZjUzZmIwMjIxY2UwNzBiN2IxMWEwMGJjZTkxODA0YzUxZDE3ODRhOCIsInZlcnNpb24iOjF9.z0dqvB3aBVYt3xRIb_M4svWebfQc0QaDFVFzHnlA5QGEHkHOW3OecGhHE4EzBqTDI3DASWZTGMjrMDDt0uOMBw
- type: f1
value: 0.7119226991285647
name: F1 Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiM2U0YjMwNzhmOTEyNDZhODU3MTU0YTM4MmQ0NzEzNWI1YjY0ZWQ3MWRiMTdiNTUzNWRkZThjMWE4M2NkZmI0MiIsInZlcnNpb24iOjF9.hhj1BXkuWi9wXrCjT9NwqaPETtOoYNiyqYsJEw-ufA8A4hVThKA6ZBtma1Q_M65-DZFfPEBDBNASLZ7EPSbmDw
- type: f1
value: 0.712
name: F1 Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODk0Y2EyMzc5M2ZlNWFlNDg2Zjc1OTQxNGY3YjA5YjUxYTYzZjRlZmU4ODYxNjA3ZjkxNGUzYjBmNmMxMzY5YiIsInZlcnNpb24iOjF9.DvKk-3hNh2LhN2ug5e0FgUntL3Ozdfl06Kz7jvmB-deOJH6INi2a2ZySXoEePoo8t2nR6ENFYu9QjMA2ojnpCA
- type: f1
value: 0.7119242267218338
name: F1 Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2MxOWFlMmI2NGRiMjkwN2Q5MWZhNDFlYzQxNWNmNzQ3OWYxZThmNDU2OWU1MTE5OGY2MWRlYWUyNDM3OTkzZCIsInZlcnNpb24iOjF9.QrTD1gE8_wRok9u59W-Mx0cX89K-h2Ad6qa8J5rmP8lc_rkG0ft2n5_GqH1CBZBJwMFYv91Pn6TuE3eGxJuUDA
- type: loss
value: 1.0105403661727905
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMmUwMTg4NjM3ZTBiZTIyODcyNDNmNTE5ZDZhMzNkMDMyNjcwOGQ5NmY0NTlhMjgyNmIzZjRiNDFiNjA3M2RkZSIsInZlcnNpb24iOjF9.sjBDVJV-jnygwcppmByAXpoo-Wzz178bBzozJEuYEiJaHSbk_xEevfJS1PmLUuplYslKb1iyEctnjI-5bl-XDw
- task:
type: natural-language-inference
name: Natural Language Inference
dataset:
name: multi_nli
type: multi_nli
config: default
split: validation_mismatched
metrics:
- type: accuracy
value: 0.902766476810415
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjExZWM3YzA3ZDNlNjEwMmViNWEwZTE3MjJjNjEyNDhjOTQxNGFmMzBjZTk0ODUwYTc2OGNiZjYyMTBmNWZjZSIsInZlcnNpb24iOjF9.zbFAGrv2flpmweqS7Poxib7qHFLdW8eUTzshdOm2B9H-KWpIZCWC-P4p8TLMdNJnUcZJZ03Okil4qjIMqqIRCA
- type: precision
value: 0.9023816542652491
name: Precision Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2U2MGViNmJjNWQxNzRjOTkxNDIxZjZjNmM5YzE4ZjU5NTE5NjFlNmEzZWRlOGYxN2E3NTAwMTEwYjNhNzE0YSIsInZlcnNpb24iOjF9.WJjDJf56FROvf7Y5ShWnnxMvK_ZpQ2PibAOtSFhSiYJ7bt4TGOzMwaZ5RSTf_mcfXgRfWbXmy1jCwNhDb-5EAw
- type: precision
value: 0.902766476810415
name: Precision Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzRhZTExOTc5NDczZjI1YmMzOGYyOTU2MDU1OGE5ZTczMDE0MmU0NzZhY2YzMDI1ZGQ3MGM5MmJiODFkNzUzZiIsInZlcnNpb24iOjF9.aRYcGEI1Y8-a0d8XOoXhBgsFyj9LWNwEjoIPc594y7kJn91wXIsXoR0-_0iy3uz41mWaTTlwJx7lI-kipFDvDQ
- type: precision
value: 0.9034597464719761
name: Precision Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMWQyMTZiZDA2OTUwZjRmNTFiMWRlZTNmOTliZmI2MWFmMjdjYzEyYTgwNzkyOTQzOTBmNTUyYjMwNTUxMTFkNiIsInZlcnNpb24iOjF9.hUtAMTl0THHUkaLcgk1Vy9IhjqJAXCJ_5STJ5A7k7s_SO9DHp3b6qusgwPmcGLYyPy1-j1dB2AIstxK4tHfmDA
- type: recall
value: 0.9024304801555488
name: Recall Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzAxZGJhNGI3ZDNlMjg2ZDIxNTgwMDY5MTFjM2ExZmIxMDBmZjUyNTliNWNkOGI0OTY3NTYyNWU3OWFlYTA3YiIsInZlcnNpb24iOjF9.1o_GNq8zmXa_50MUF_K63IDc2aUKNeUkNQ5fT592-SAo8WgiaP9Dh6bOEu2OqrpRQ57P4qm7OdJt7UKsrosMDA
- type: recall
value: 0.902766476810415
name: Recall Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjhiMWE4Yjk0ODFkZjlkYjRlMjU1OTJmMjA2Njg1N2M4MzQ0OWE3N2FlYjY4NDgxZThjMmExYWQ5OGNmYmI1NSIsInZlcnNpb24iOjF9.Gmm5lf_qpxjXWWrycDze7LHR-6WGQc62WZTmcoc5uxWd0tivEUqCAFzFdbEU1jVKxQBIyDX77CPuBm7mUA4sCg
- type: recall
value: 0.902766476810415
name: Recall Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2EzZWYwNjNkYWE1YTcyZGZjNTNhMmNlNzgzYjk5MGJjOWJmZmE5NmYwM2U2NTA5ZDY3ZjFiMmRmZmQwY2QwYiIsInZlcnNpb24iOjF9.yA68rslg3e9kUR3rFTNJJTAad6Usr4uFmJvE_a7G2IvSKqLxG_pqsHszsWfg5mFBQLjWEAyCtdQYMdVayuYMBA
- type: f1
value: 0.9023086094638595
name: F1 Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzMyMzZhNjI5MWRmZWJhMjkzN2E0MjM4ZTM5YzZmNTk5YTZmYzU4NDRiYjczZGQ4MDdhNjJiMGU0MjE3NDEwNyIsInZlcnNpb24iOjF9.RCMqH_xUMN97Vos54pTFfAMbLstXUMdFTs-eNaypbDb_Fc-MW8NLmJ6dzJsp9sSvhXyYjugjRMUpMpnQseKXDA
- type: f1
value: 0.902766476810415
name: F1 Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTYxZTZhZGM0NThlNTAzNmYwMTA4NDNkN2FiNzhhN2RlYThlYjcxMjE5MjBkMzhiOGYxZGRmMjE0NGM2ZWQ5ZSIsInZlcnNpb24iOjF9.wRfllNw2Gibmi1keU7d_GjkyO0F9HESCgJlJ9PHGZQRRT414nnB-DyRvulHjCNnaNjXqMi0LJimC3iBrNawwAw
- type: f1
value: 0.9030161011457231
name: F1 Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNDA0YjAxMWU5MjI4MWEzNTNjMzJlNjM3ZDMxOTE0ZTZhYmZlNmUyNDViNTU2NmMyMmM3MjAxZWVjNWJmZjI4MCIsInZlcnNpb24iOjF9.vJ8aUjfTbFMc1BgNUVpoVDuYwQJYQjwZQxblkUdvSoGtkW_AzQJ_KJ8Njc7IBA3ADgj8iZHjRQNIZkFCf-xICw
- type: loss
value: 0.3283354640007019
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODdmYzYzNTUzZDNmOWIxM2E0ZmUyOWUzM2Y2NGRmZDNiYjg3ZTMzYTUyNzg3OWEzNzYyN2IyNmExOGRlMWUxYSIsInZlcnNpb24iOjF9.Qv0FzFZPkcBs9aHGf4TEREX4jdkc40NazdMlP2M_-w2wHwyjoAjvhk611RLXHcbicozNelZJLnsOMdEMnPLEDg
- task:
type: natural-language-inference
name: Natural Language Inference
dataset:
name: anli
type: anli
config: plain_text
split: dev_r1
metrics:
- type: accuracy
value: 0.737
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTQ1ZGVkOTVmNTlhYjhkMjVlNTNhMjNmZWFjZWZjZjcxZmRhMDVlOWI0YTdkOTMwYjVjNWFlOGY4OTc1MmRhNiIsInZlcnNpb24iOjF9.wGLgKA1E46ljbLokdPeip_UCr1gqK8iSSbsJKX2vgKuuhDdUWWiECrUFN-bv_78JWKoKW5T0GF_hb-RVDzA0AQ
- type: precision
value: 0.737681071614645
name: Precision Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYmFkMGUwMjNhN2E3NzMxNTc5NDM0MjY1MGU5ODllM2Q2YzA1MDI3OGI1ZmI4YTcxN2E4ZDk5OWY2OGNiN2I0MCIsInZlcnNpb24iOjF9.6G5qhccjheaNfasgRyrkKBTaQPRzuPMZZ0hrLxTNzAydMDgx09FkFP3hni7WLRMWp0IpwzkEeBlxV-mPyQBtBw
- type: precision
value: 0.737
name: Precision Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2QzYjQ4ZDZjOGU5YzI3YmFlMThlYTRkYTUyYWIyNzc4NDkwNzM1OWFiMTgyMzA0NDZmMGI3YTQxODBjM2EwMCIsInZlcnNpb24iOjF9.bvNWyzfct1CLJFx_EuD2GeKieVtyGJy0cwUBP2qJE1ey2i9SVn6n1Dr0AALTGBkxQ6n5-fJ61QFNufpdr2KvCA
- type: precision
value: 0.7376755842752241
name: Precision Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2VmYWYzZWQwZmMzMDk0NTdlY2Y3NDkzYWY5ZTdmOGU0ZTUzZWE4YWFhZjVmODhkZmE1Njg4NjA5YjJmYWVhOSIsInZlcnNpb24iOjF9.50FQR2aoBpORLgYa7482ZTrRhT-KfIgv5ltBEHndUBMmqGF9Ru0LHENSGwyD_tO89sGPfiW32TxpbrNWiBdIBA
- type: recall
value: 0.7369675064285843
name: Recall Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTM4OTAyNDYwNjY4Zjc5NDljNjBmNTg2Mzk4YjYxM2MyYTA0MDllYTMyNzEwOGI1ZTEwYWE3ZmU0NDZmZDg2NiIsInZlcnNpb24iOjF9.UvWBxuApNV3vd4hpgwqd6XPHCbkA_bB_Cw24ooquiOf0dstvjP3JvpGoDp5SniOzIOg3i2aYbcvFCLJqEXMZCQ
- type: recall
value: 0.737
name: Recall Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYmQ4MjMzNzRmNTI5NjIzNGQ0ZDFmZTA1MDU3OTk0MzYyMGI0NTMzZTZlMTQ1MDc1MzBkMGMzYjcxZjU1NDNjOSIsInZlcnNpb24iOjF9.kpbdXOpDG3CUB-kUEXsgFT3HWWIbu70wwzs2TNf0rhIuRrzdZz3dXXvwqu1BcLJTsOxl8G6NTiYXgnv-ul8lDg
- type: recall
value: 0.737
name: Recall Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNmU1ZWJkNWE0NjczY2NiZWYyNzYyMzllNzZmZTIxNWRkYTEyZDgxN2E0NTNmM2ExMTc1ZWVjMzBiYjg0ZmM1MiIsInZlcnNpb24iOjF9.S6HHWCWnut_LJqXbEA_Z8ZOTtyq6V51ZeiA0qbwzr0hapDYZOZHrN4prvSLvoNv-GiYDYKatwIsAZxCZc5fmCA
- type: f1
value: 0.7366853496239583
name: F1 Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNzkxYmY2NTcyOTE0ZDdjNGY2ZmE4MzQwMGIxZTA2MDg1NzI5YTQ0MTdkZjdkNzNkMDM2NTk2MTNiNjU4ODMwZCIsInZlcnNpb24iOjF9.ECVaCBqGd0pnQT3xJF7yWrgecIb-5TMiVWpEO0MQGhYy43snkI6Qs-2FOXzvfwIWqG-Q6XIIhGbWZh5TFEGKCA
- type: f1
value: 0.737
name: F1 Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNDMwMWZiNzQyNWEzNmMzMDJjOTAxYzAxNzc0MTNlYzRkZjllYmNjZmU0OTgzZDFkNWM1ZWI5OTA2NzE5Y2YxOSIsInZlcnNpb24iOjF9.8yZFol_Gcj9n3w9Yk5wx48yql7p3wriDecv-6VSTAB6Q_MWLQAWsCEGRRhgGJ3zvhoRehJZdb35ozk36VOinDQ
- type: f1
value: 0.7366990292378379
name: F1 Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjhhN2ZkMjc5ZGQ3ZGM1Nzk3ZTgwY2E1N2NjYjdhNjZlOTdhYmRlNGVjN2EwNTIzN2UyYTY2ODVlODhmY2Q4ZCIsInZlcnNpb24iOjF9.Cz7ClDAfCGpqdRTYd5v3dPjXFq8lZLXx8AX_rqmF-Jb8KocqVDsHWeZScW5I2oy951UrdMpiUOLieBuJLOmCCQ
- type: loss
value: 0.9349392056465149
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNmI4MTI5MDM1NjBmMzgzMzc2NjM5MzZhOGUyNTgyY2RlZTEyYTIzYzY2ZGJmODcxY2Q5OTVjOWU3OTQ2MzM1NSIsInZlcnNpb24iOjF9.bSOFnYC4Y2y2pW1AR-bgPUHKafR-0OHf8PvexK8eQLsS323Xy9-rYkKUaP09KY6_fk9GqAawv5eqj72B_uyeCA
---
# DeBERTa-v3-base-mnli-fever-anli
## Model description
This model was trained on the MultiNLI, Fever-NLI and Adversarial-NLI (ANLI) datasets, which together comprise 763,913 NLI hypothesis-premise pairs. This base model outperforms almost all large models on the [ANLI benchmark](https://github.com/facebookresearch/anli).
The base model is [DeBERTa-v3-base from Microsoft](https://huggingface.co/microsoft/deberta-v3-base). The v3 variant of DeBERTa substantially outperforms previous versions of the model thanks to a different pre-training objective; see annex 11 of the original [DeBERTa paper](https://arxiv.org/pdf/2006.03654.pdf).
For the highest performance (at the cost of speed), I recommend [MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli](https://huggingface.co/MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli).
### How to use the model
#### Simple zero-shot classification pipeline
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification", model="MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli")
sequence_to_classify = "Angela Merkel is a politician in Germany and leader of the CDU"
candidate_labels = ["politics", "economy", "entertainment", "environment"]
output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
print(output)
```
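Under the hood, the pipeline turns each candidate label into an NLI hypothesis (by default of the form "This example is {label}.") and collects the model's entailment logit for each label; with `multi_label=False` those logits are then normalized against each other so the scores sum to 1. A minimal sketch of that normalization step, using hypothetical logit values rather than real model output:

```python
import torch

# Hypothetical entailment logits, one per candidate label, as the
# zero-shot pipeline would collect them from the NLI model.
entail_logits = torch.tensor([3.2, -1.0, 0.5, -2.4])
labels = ["politics", "economy", "entertainment", "environment"]

# multi_label=False: softmax across labels, so the scores sum to 1
scores = torch.softmax(entail_logits, dim=-1)
ranking = sorted(zip(labels, scores.tolist()), key=lambda x: x[1], reverse=True)
print(ranking)  # "politics" gets the highest score for these logits
```

With `multi_label=True` the pipeline instead scores each label independently, so the scores no longer sum to 1.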
#### NLI use-case
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
model_name = "MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).to(device)
premise = "I first thought that I liked the movie, but upon second thought it was actually disappointing."
hypothesis = "The movie was good."
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt").to(device)
output = model(**inputs)  # pass input_ids and attention_mask together
prediction = torch.softmax(output["logits"][0], -1).tolist()
label_names = ["entailment", "neutral", "contradiction"]
prediction = {name: round(float(pred) * 100, 1) for pred, name in zip(prediction, label_names)}
print(prediction)
```
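The post-processing at the end of the snippet (logits to probabilities to named percentages) does not depend on the model, so it can be pulled out into a small helper. The logit values below are hypothetical, chosen to mirror the contradiction-heavy example premise above:

```python
import torch

def logits_to_percentages(logits, label_names):
    """Softmax the class logits and map them to rounded percentages per label."""
    probs = torch.softmax(logits, dim=-1).tolist()
    return {name: round(float(p) * 100, 1) for p, name in zip(probs, label_names)}

label_names = ["entailment", "neutral", "contradiction"]
# Hypothetical logits for one premise-hypothesis pair
result = logits_to_percentages(torch.tensor([-1.2, 0.3, 2.9]), label_names)
print(result)  # {'entailment': 1.5, 'neutral': 6.8, 'contradiction': 91.7}
```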
### Training data
DeBERTa-v3-base-mnli-fever-anli was trained on the MultiNLI, Fever-NLI and Adversarial-NLI (ANLI) datasets, which together comprise 763,913 NLI hypothesis-premise pairs.
### Training procedure
DeBERTa-v3-base-mnli-fever-anli was trained using the Hugging Face Trainer with the following hyperparameters.
```python
training_args = TrainingArguments(
    num_train_epochs=3,              # total number of training epochs
    learning_rate=2e-05,
    per_device_train_batch_size=32,  # batch size per device during training
    per_device_eval_batch_size=32,   # batch size for evaluation
    warmup_ratio=0.1,                # fraction of total steps used for learning rate warmup
    weight_decay=0.06,               # strength of weight decay
    fp16=True                        # mixed precision training
)
```
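Note that `warmup_ratio=0.1` is a fraction, not a step count: the learning rate warms up over the first 10% of all optimization steps. Roughly how the concrete numbers work out for this run, as a sketch assuming a single device, the full 763,913 training pairs, and no gradient accumulation:

```python
import math

num_examples = 763_913   # MultiNLI + Fever-NLI + ANLI pairs (assumed all used for training)
per_device_batch = 32
num_epochs = 3
warmup_ratio = 0.1

steps_per_epoch = math.ceil(num_examples / per_device_batch)
total_steps = steps_per_epoch * num_epochs
warmup_steps = int(total_steps * warmup_ratio)
print(steps_per_epoch, total_steps, warmup_steps)  # 23873 71619 7161
```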
### Eval results
The model was evaluated using the test sets for MultiNLI and ANLI and the dev set for Fever-NLI. The metric used is accuracy.
| mnli-m | mnli-mm | fever-nli | anli-all | anli-r3 |
|--------|---------|-----------|----------|---------|
| 0.903  | 0.903   | 0.777     | 0.579    | 0.495   |
## Limitations and bias
Please consult the original DeBERTa paper and literature on different NLI datasets for potential biases.
## Citation
If you use this model, please cite: Laurer, Moritz, Wouter van Atteveldt, Andreu Salleras Casas, and Kasper Welbers. 2022. "Less Annotating, More Classifying - Addressing the Data Scarcity Issue of Supervised Machine Learning with Deep Transfer Learning and BERT-NLI". Preprint, June. Open Science Framework. https://osf.io/74b8k.
### Ideas for cooperation or questions?
If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or on [LinkedIn](https://www.linkedin.com/in/moritz-laurer/).
### Debugging and issues
Note that DeBERTa-v3 was released on 06.12.21, and older versions of HF Transformers have issues running the model (e.g. errors with the tokenizer). Using Transformers >= 4.13 might resolve some of these issues.
## Model Recycling
[Evaluation on 36 datasets](https://ibm.github.io/model-recycling/model_gain_chart?avg=0.65&mnli_lp=nan&20_newsgroup=-0.61&ag_news=-0.01&amazon_reviews_multi=0.46&anli=0.84&boolq=2.12&cb=16.07&cola=-0.76&copa=8.60&dbpedia=-0.40&esnli=-0.29&financial_phrasebank=-1.98&imdb=-0.47&isear=-0.22&mnli=-0.21&mrpc=0.50&multirc=1.91&poem_sentiment=1.73&qnli=0.07&qqp=-0.37&rotten_tomatoes=-0.74&rte=3.94&sst2=-0.45&sst_5bins=0.07&stsb=1.27&trec_coarse=-0.16&trec_fine=0.18&tweet_ev_emoji=-0.93&tweet_ev_emotion=-1.33&tweet_ev_hate=-1.67&tweet_ev_irony=-5.46&tweet_ev_offensive=-0.17&tweet_ev_sentiment=-0.11&wic=-0.21&wnli=-1.20&wsc=4.18&yahoo_answers=-0.70&model_name=MoritzLaurer%2FDeBERTa-v3-base-mnli-fever-anli&base_name=microsoft%2Fdeberta-v3-base) using MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli as a base model yields an average score of 79.69, compared to 79.04 for microsoft/deberta-v3-base.
The model is ranked 2nd among all tested models for the microsoft/deberta-v3-base architecture as of 09/01/2023.
Results:
| 20_newsgroup | ag_news | amazon_reviews_multi | anli | boolq | cb | cola | copa | dbpedia | esnli | financial_phrasebank | imdb | isear | mnli | mrpc | multirc | poem_sentiment | qnli | qqp | rotten_tomatoes | rte | sst2 | sst_5bins | stsb | trec_coarse | trec_fine | tweet_ev_emoji | tweet_ev_emotion | tweet_ev_hate | tweet_ev_irony | tweet_ev_offensive | tweet_ev_sentiment | wic | wnli | wsc | yahoo_answers |
|---------------:|----------:|-----------------------:|-------:|--------:|--------:|--------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|-------:|--------:|------------------:|--------:|--------:|------------:|--------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|--------:|--------:|----------------:|
| 85.8072 | 90.4333 | 67.32 | 59.625 | 85.107 | 91.0714 | 85.8102 | 67 | 79.0333 | 91.6327 | 82.5 | 94.02 | 71.6428 | 89.5749 | 89.7059 | 64.1708 | 88.4615 | 93.575 | 91.4148 | 89.6811 | 86.2816 | 94.6101 | 57.0588 | 91.5508 | 97.6 | 91.2 | 45.264 | 82.6179 | 54.5455 | 74.3622 | 84.8837 | 71.6949 | 71.0031 | 69.0141 | 68.2692 | 71.3333 |
For more information, see: [Model Recycling](https://ibm.github.io/model-recycling/)
runwayml/stable-diffusion-v1-5 | 2023-08-23T21:14:19.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"arxiv:2207.12598",
"arxiv:2112.10752",
"arxiv:2103.00020",
"arxiv:2205.11487",
"arxiv:1910.09700",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | runwayml | null | null | runwayml/stable-diffusion-v1-5 | 9,529 | 7,769,173 | diffusers | 2022-10-19T23:38:35 | ---
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
inference: true
extra_gated_prompt: |-
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce nor share illegal or harmful outputs or content
2. CompVis claims no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
Please read the full license carefully here: https://huggingface.co/spaces/CompVis/stable-diffusion-license
extra_gated_heading: Please read the LICENSE to access this model
---
# Stable Diffusion v1-5 Model Card
Stable Diffusion is a latent text-to-image diffusion model capable of generating photo-realistic images given any text input.
For more information about how Stable Diffusion functions, please have a look at [๐ค's Stable Diffusion blog](https://huggingface.co/blog/stable_diffusion).
The **Stable-Diffusion-v1-5** checkpoint was initialized with the weights of the [Stable-Diffusion-v1-2](https://huggingface.co/CompVis/stable-diffusion-v1-2)
checkpoint and subsequently fine-tuned for 595k steps at resolution 512x512 on "laion-aesthetics v2 5+", with 10% dropping of the text-conditioning to improve [classifier-free guidance sampling](https://arxiv.org/abs/2207.12598).
You can use this both with the [๐งจDiffusers library](https://github.com/huggingface/diffusers) and the [RunwayML GitHub repository](https://github.com/runwayml/stable-diffusion).
### Diffusers
```py
from diffusers import StableDiffusionPipeline
import torch

model_id = "runwayml/stable-diffusion-v1-5"

# Load the pipeline in half precision and move it to the GPU
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda")

# Generate a single image from a text prompt and save it
prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt).images[0]
image.save("astronaut_rides_horse.png")
```
For more detailed instructions, use-cases, and examples in JAX, follow the instructions [here](https://github.com/huggingface/diffusers#text-to-image-generation-with-stable-diffusion).
### Original GitHub Repository
1. Download the weights
- [v1-5-pruned-emaonly.ckpt](https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned-emaonly.ckpt) - 4.27GB, EMA-only weights. Uses less VRAM - suitable for inference.
- [v1-5-pruned.ckpt](https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned.ckpt) - 7.7GB, EMA + non-EMA weights. Uses more VRAM - suitable for fine-tuning.
2. Follow instructions [here](https://github.com/runwayml/stable-diffusion).
## Model Details
- **Developed by:** Robin Rombach, Patrick Esser
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([CLIP ViT-L/14](https://arxiv.org/abs/2103.00020)) as suggested in the [Imagen paper](https://arxiv.org/abs/2205.11487).
- **Resources for more information:** [GitHub Repository](https://github.com/CompVis/stable-diffusion), [Paper](https://arxiv.org/abs/2112.10752).
- **Cite as:**
```bibtex
@InProceedings{Rombach_2022_CVPR,
    author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
    title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {10684-10695}
}
```
# Uses
## Direct Use
The model is intended for research purposes only. Possible research areas and tasks include:
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section is taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), but applies in the same way to Stable Diffusion v1_.
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to "A red cube on top of a blue sphere".
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy.
- The model was trained on a large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/) which contains adult material
and is not fit for product use without additional safety mechanisms and
considerations.
- No additional measures were used to deduplicate the dataset. As a result, we observe some degree of memorization for images that are duplicated in the training data.
The training data can be searched at [https://rom1504.github.io/clip-retrieval/](https://rom1504.github.io/clip-retrieval/) to possibly assist in the detection of memorized images.
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
Stable Diffusion v1 was trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/),
which consists of images that are primarily limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
### Safety Module
The intended use of this model is with the [Safety Checker](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/stable_diffusion/safety_checker.py) in Diffusers.
This checker works by checking model outputs against known hard-coded NSFW concepts.
The concepts are intentionally hidden to reduce the likelihood of reverse-engineering this filter.
Specifically, the checker compares the class probability of harmful concepts in the embedding space of the `CLIPTextModel` *after generation* of the images.
The concepts are passed into the model with the generated image and compared to a hand-engineered weight for each NSFW concept.
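As a rough illustration of this mechanism, the comparison can be sketched as a cosine-similarity check against per-concept thresholds. The vectors, thresholds, and function names below are illustrative assumptions, not the actual hidden concepts or checker code:

```python
import math

# Toy sketch of the safety-checker idea: compare an image embedding
# against per-concept embeddings and flag the image when similarity
# exceeds a per-concept threshold. All values are placeholders.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_flagged(image_emb, concept_embs, thresholds):
    # Flag if the image is "too close" to any concept embedding
    return any(cosine(image_emb, c) > t
               for c, t in zip(concept_embs, thresholds))

print(is_flagged([1.0, 0.0], [[0.9, 0.1]], [0.5]))  # True
print(is_flagged([0.0, 1.0], [[0.9, 0.1]], [0.5]))  # False
```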
## Training
**Training Data**
The model developers used the following dataset for training the model:
- LAION-2B (en) and subsets thereof (see next section)
**Training Procedure**
Stable Diffusion v1-5 is a latent diffusion model which combines an autoencoder with a diffusion model that is trained in the latent space of the autoencoder. During training,
- Images are encoded through an encoder, which turns images into latent representations. The autoencoder uses a relative downsampling factor of 8 and maps images of shape H x W x 3 to latents of shape H/f x W/f x 4
- Text prompts are encoded through a ViT-L/14 text-encoder.
- The non-pooled output of the text encoder is fed into the UNet backbone of the latent diffusion model via cross-attention.
- The loss is a reconstruction objective between the noise that was added to the latent and the prediction made by the UNet.
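The shape bookkeeping described in the first bullet can be sketched as follows; `latent_shape` is an illustrative helper, not part of the training code:

```python
# With a downsampling factor f = 8 and 4 latent channels, an image of
# shape H x W x 3 maps to a latent of shape H/f x W/f x 4.
def latent_shape(height, width, f=8, latent_channels=4):
    assert height % f == 0 and width % f == 0, "dims must be divisible by f"
    return (height // f, width // f, latent_channels)

print(latent_shape(512, 512))  # (64, 64, 4)
```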
Currently six Stable Diffusion checkpoints are provided, which were trained as follows.
- [`stable-diffusion-v1-1`](https://huggingface.co/CompVis/stable-diffusion-v1-1): 237,000 steps at resolution `256x256` on [laion2B-en](https://huggingface.co/datasets/laion/laion2B-en).
194,000 steps at resolution `512x512` on [laion-high-resolution](https://huggingface.co/datasets/laion/laion-high-resolution) (170M examples from LAION-5B with resolution `>= 1024x1024`).
- [`stable-diffusion-v1-2`](https://huggingface.co/CompVis/stable-diffusion-v1-2): Resumed from `stable-diffusion-v1-1`.
515,000 steps at resolution `512x512` on "laion-improved-aesthetics" (a subset of laion2B-en,
filtered to images with an original size `>= 512x512`, estimated aesthetics score `> 5.0`, and an estimated watermark probability `< 0.5`. The watermark estimate is from the LAION-5B metadata, the aesthetics score is estimated using an [improved aesthetics estimator](https://github.com/christophschuhmann/improved-aesthetic-predictor)).
- [`stable-diffusion-v1-3`](https://huggingface.co/CompVis/stable-diffusion-v1-3): Resumed from `stable-diffusion-v1-2` - 195,000 steps at resolution `512x512` on "laion-improved-aesthetics" and 10% dropping of the text-conditioning to improve [classifier-free guidance sampling](https://arxiv.org/abs/2207.12598).
- [`stable-diffusion-v1-4`](https://huggingface.co/CompVis/stable-diffusion-v1-4): Resumed from `stable-diffusion-v1-2` - 225,000 steps at resolution `512x512` on "laion-aesthetics v2 5+" and 10% dropping of the text-conditioning to improve [classifier-free guidance sampling](https://arxiv.org/abs/2207.12598).
- [`stable-diffusion-v1-5`](https://huggingface.co/runwayml/stable-diffusion-v1-5): Resumed from `stable-diffusion-v1-2` - 595,000 steps at resolution `512x512` on "laion-aesthetics v2 5+" and 10% dropping of the text-conditioning to improve [classifier-free guidance sampling](https://arxiv.org/abs/2207.12598).
- [`stable-diffusion-inpainting`](https://huggingface.co/runwayml/stable-diffusion-inpainting): Resumed from `stable-diffusion-v1-5` - then 440,000 steps of inpainting training at resolution 512x512 on "laion-aesthetics v2 5+" and 10% dropping of the text-conditioning. For inpainting, the UNet has 5 additional input channels (4 for the encoded masked-image and 1 for the mask itself) whose weights were zero-initialized after restoring the non-inpainting checkpoint. During training, we generate synthetic masks and, in 25% of cases, mask everything.
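The "10% dropping of the text-conditioning" mentioned for v1-3 through v1-5 can be sketched as a per-caption dropout that enables classifier-free guidance; the function below is an illustrative assumption, not the actual training code:

```python
import random

# With probability p_drop, replace the caption with the empty
# (unconditional) prompt so the model also learns unconditional
# generation. Names here are illustrative placeholders.
def maybe_drop_caption(caption, p_drop=0.1, rng=random):
    return "" if rng.random() < p_drop else caption

rng = random.Random(0)
captions = [maybe_drop_caption("a photo of a cat", rng=rng) for _ in range(10_000)]
drop_rate = captions.count("") / len(captions)
print(round(drop_rate, 2))  # close to 0.1
```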
- **Hardware:** 32 x 8 x A100 GPUs
- **Optimizer:** AdamW
- **Gradient Accumulations**: 2
- **Batch:** 32 x 8 x 2 x 4 = 2048
- **Learning rate:** warmup to 0.0001 for 10,000 steps and then kept constant
## Evaluation Results
Evaluations with different classifier-free guidance scales (1.5, 2.0, 3.0, 4.0,
5.0, 6.0, 7.0, 8.0) and 50 PNDM/PLMS sampling
steps show the relative improvements of the checkpoints:

Evaluated using 50 PLMS steps and 10000 random prompts from the COCO2017 validation set, evaluated at 512x512 resolution. Not optimized for FID scores.
## Environmental Impact
**Stable Diffusion v1** **Estimated Emissions**
Based on the information below, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were used to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 150000
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 11250 kg CO2 eq.
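As a back-of-the-envelope check, the reported total is consistent with plausible per-GPU power draw and grid carbon intensity; the 0.25 kW and 0.3 kg CO2 eq./kWh figures below are assumptions chosen to illustrate the calculation, not values stated in this card:

```python
# Power consumption x Time x Carbon intensity of the power grid.
# 0.25 kW per A100 PCIe 40GB and 0.3 kg CO2 eq./kWh are assumptions.
hours = 150_000
power_kw = 0.25
grid_kg_co2_per_kwh = 0.3
emissions_kg = hours * power_kw * grid_kg_co2_per_kwh
print(emissions_kg)  # approximately 11250 kg CO2 eq.
```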
## Citation
```bibtex
@InProceedings{Rombach_2022_CVPR,
author = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
title = {High-Resolution Image Synthesis With Latent Diffusion Models},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2022},
pages = {10684-10695}
}
```
*This model card was written by: Robin Rombach and Patrick Esser and is based on the [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).* | 14,448 | [
[
-0.0296478271484375,
-0.0716552734375,
0.03448486328125,
0.02020263671875,
-0.018157958984375,
-0.02935791015625,
0.006397247314453125,
-0.033203125,
-0.01377105712890625,
0.033599853515625,
-0.023651123046875,
-0.042083740234375,
-0.053192138671875,
-0.01279449462890625,
-0.035797119140625,
0.07073974609375,
-0.009185791015625,
-0.0018110275268554688,
-0.0180206298828125,
-0.00725555419921875,
-0.0191192626953125,
-0.01517486572265625,
-0.078125,
-0.0155181884765625,
0.03680419921875,
-0.000988006591796875,
0.049041748046875,
0.044647216796875,
0.03485107421875,
0.0211334228515625,
-0.0294342041015625,
-0.00782012939453125,
-0.041351318359375,
-0.00579833984375,
-0.01137542724609375,
-0.01068115234375,
-0.03863525390625,
0.00991058349609375,
0.04779052734375,
0.0187530517578125,
-0.004608154296875,
0.00010341405868530273,
-0.0025463104248046875,
0.04217529296875,
-0.042388916015625,
-0.01320648193359375,
-0.0259552001953125,
0.006954193115234375,
-0.0125274658203125,
0.0159454345703125,
-0.0257720947265625,
-0.00965118408203125,
0.01236724853515625,
-0.058013916015625,
0.030029296875,
-0.017669677734375,
0.08843994140625,
0.0207366943359375,
-0.0260009765625,
-0.01220703125,
-0.053497314453125,
0.0504150390625,
-0.050201416015625,
0.01436614990234375,
0.0238037109375,
0.00862884521484375,
-0.0034923553466796875,
-0.0794677734375,
-0.049774169921875,
-0.006519317626953125,
0.00775909423828125,
0.032745361328125,
-0.0167083740234375,
0.00022327899932861328,
0.0293426513671875,
0.0221099853515625,
-0.043731689453125,
-0.006397247314453125,
-0.033782958984375,
-0.0059814453125,
0.048004150390625,
0.0019273757934570312,
0.0325927734375,
-0.0167083740234375,
-0.036407470703125,
-0.00907135009765625,
-0.0421142578125,
-0.01073455810546875,
0.0270233154296875,
-0.01555633544921875,
-0.031524658203125,
0.0322265625,
0.012451171875,
0.0416259765625,
0.0089111328125,
-0.017974853515625,
0.023651123046875,
-0.0238189697265625,
-0.01690673828125,
-0.036163330078125,
0.07159423828125,
0.039459228515625,
0.00005424022674560547,
0.01146697998046875,
-0.0177459716796875,
0.004238128662109375,
0.00597381591796875,
-0.0919189453125,
-0.0291748046875,
0.009124755859375,
-0.055999755859375,
-0.04302978515625,
-0.0128936767578125,
-0.06817626953125,
-0.01204681396484375,
0.01470184326171875,
0.0462646484375,
-0.036773681640625,
-0.037689208984375,
0.00545501708984375,
-0.03369140625,
0.007793426513671875,
0.0408935546875,
-0.039459228515625,
-0.00128936767578125,
0.0001823902130126953,
0.084716796875,
-0.0225830078125,
-0.004680633544921875,
0.006130218505859375,
0.01296234130859375,
-0.0223541259765625,
0.0516357421875,
-0.020965576171875,
-0.050018310546875,
-0.01132965087890625,
0.0230255126953125,
0.0088958740234375,
-0.039825439453125,
0.04486083984375,
-0.03765869140625,
0.02508544921875,
0.00022661685943603516,
-0.037139892578125,
-0.0142669677734375,
-0.0004775524139404297,
-0.055389404296875,
0.0745849609375,
0.01214599609375,
-0.0692138671875,
0.01474761962890625,
-0.053436279296875,
-0.0204010009765625,
-0.003101348876953125,
0.002552032470703125,
-0.0576171875,
-0.0196380615234375,
-0.0008449554443359375,
0.031005859375,
-0.00623321533203125,
0.0194244384765625,
-0.0218353271484375,
-0.01480865478515625,
-0.0026950836181640625,
-0.04266357421875,
0.07843017578125,
0.03228759765625,
-0.0247344970703125,
0.0013151168823242188,
-0.05224609375,
-0.0251312255859375,
0.03741455078125,
-0.02630615234375,
-0.026275634765625,
-0.0093536376953125,
0.0291595458984375,
0.0263824462890625,
0.00894927978515625,
-0.03277587890625,
0.00009459257125854492,
-0.0164031982421875,
0.03411865234375,
0.057525634765625,
0.0272369384765625,
0.04864501953125,
-0.03533935546875,
0.042999267578125,
0.031768798828125,
0.0145263671875,
-0.02862548828125,
-0.064453125,
-0.05096435546875,
-0.0307159423828125,
0.0164031982421875,
0.03912353515625,
-0.0614013671875,
0.0198822021484375,
0.0022983551025390625,
-0.0506591796875,
-0.016632080078125,
-0.007171630859375,
0.02642822265625,
0.05169677734375,
0.0241241455078125,
-0.030426025390625,
-0.0167694091796875,
-0.056854248046875,
0.01470947265625,
-0.00954437255859375,
0.0098419189453125,
0.02667236328125,
0.055084228515625,
-0.0269622802734375,
0.04266357421875,
-0.041046142578125,
-0.0240478515625,
0.004810333251953125,
0.01641845703125,
-0.00096893310546875,
0.06353759765625,
0.06024169921875,
-0.075927734375,
-0.04193115234375,
-0.0160980224609375,
-0.06121826171875,
0.006275177001953125,
-0.0158538818359375,
-0.02520751953125,
0.0261688232421875,
0.039337158203125,
-0.061676025390625,
0.053802490234375,
0.036773681640625,
-0.0277099609375,
0.037811279296875,
-0.027191162109375,
0.006072998046875,
-0.08782958984375,
0.01531982421875,
0.029998779296875,
-0.0290069580078125,
-0.03887939453125,
0.01496124267578125,
-0.004985809326171875,
-0.013275146484375,
-0.05364990234375,
0.06256103515625,
-0.02667236328125,
0.033294677734375,
-0.0215911865234375,
-0.00240325927734375,
0.01132965087890625,
0.0204925537109375,
0.0273895263671875,
0.053619384765625,
0.060089111328125,
-0.052337646484375,
0.00568389892578125,
0.0207061767578125,
-0.016754150390625,
0.04315185546875,
-0.0638427734375,
0.00884246826171875,
-0.032440185546875,
0.0237884521484375,
-0.07440185546875,
-0.01548004150390625,
0.03668212890625,
-0.03076171875,
0.02520751953125,
-0.0222625732421875,
-0.034576416015625,
-0.02996826171875,
-0.006290435791015625,
0.04449462890625,
0.0745849609375,
-0.03167724609375,
0.036529541015625,
0.027557373046875,
0.0091400146484375,
-0.03033447265625,
-0.06121826171875,
-0.009490966796875,
-0.031005859375,
-0.059906005859375,
0.046630859375,
-0.02020263671875,
-0.004791259765625,
0.01415252685546875,
0.02056884765625,
-0.01169586181640625,
-0.004245758056640625,
0.02520751953125,
0.01971435546875,
-0.0033321380615234375,
0.0016908645629882812,
0.010009765625,
-0.0006670951843261719,
-0.00518798828125,
-0.01511383056640625,
0.0175933837890625,
0.011474609375,
-0.00800323486328125,
-0.04791259765625,
0.028533935546875,
0.0404052734375,
0.00605010986328125,
0.06683349609375,
0.0728759765625,
-0.037933349609375,
-0.006412506103515625,
-0.0247802734375,
-0.01012420654296875,
-0.03826904296875,
0.0275421142578125,
-0.01629638671875,
-0.043121337890625,
0.046722412109375,
-0.00908660888671875,
-0.0015211105346679688,
0.048980712890625,
0.053497314453125,
-0.0159912109375,
0.08294677734375,
0.046783447265625,
0.025054931640625,
0.056976318359375,
-0.05096435546875,
-0.0042877197265625,
-0.061431884765625,
-0.01800537109375,
-0.01837158203125,
-0.0115814208984375,
-0.0322265625,
-0.04962158203125,
0.02679443359375,
0.01316070556640625,
-0.017578125,
0.00653839111328125,
-0.043182373046875,
0.0284881591796875,
0.0213165283203125,
0.0165252685546875,
0.00856781005859375,
0.007221221923828125,
-0.0081939697265625,
-0.007030487060546875,
-0.05560302734375,
-0.05010986328125,
0.0728759765625,
0.0391845703125,
0.0712890625,
0.0040435791015625,
0.0440673828125,
0.031585693359375,
0.03265380859375,
-0.039306640625,
0.046234130859375,
-0.02044677734375,
-0.05755615234375,
-0.00879669189453125,
-0.0234832763671875,
-0.06890869140625,
0.0135955810546875,
-0.0235443115234375,
-0.0289306640625,
0.0279388427734375,
0.023345947265625,
-0.00972747802734375,
0.0313720703125,
-0.055145263671875,
0.0733642578125,
-0.001789093017578125,
-0.05389404296875,
-0.006626129150390625,
-0.041717529296875,
0.03411865234375,
-0.0009527206420898438,
0.0165252685546875,
-0.0016298294067382812,
-0.004177093505859375,
0.06414794921875,
-0.0248870849609375,
0.06842041015625,
-0.031707763671875,
0.0011320114135742188,
0.0321044921875,
-0.00487518310546875,
0.02471923828125,
0.00786590576171875,
-0.004741668701171875,
0.0241851806640625,
0.0092010498046875,
-0.031585693359375,
-0.0214385986328125,
0.052886962890625,
-0.0699462890625,
-0.033447265625,
-0.0308380126953125,
-0.02227783203125,
0.03826904296875,
0.0306549072265625,
0.06024169921875,
0.021881103515625,
-0.0229034423828125,
-0.00273895263671875,
0.066162109375,
-0.039886474609375,
0.03131103515625,
0.0177154541015625,
-0.031463623046875,
-0.037017822265625,
0.06964111328125,
0.008270263671875,
0.037384033203125,
-0.005687713623046875,
0.0108184814453125,
-0.01078033447265625,
-0.0462646484375,
-0.0458984375,
0.0233917236328125,
-0.06842041015625,
-0.01384735107421875,
-0.058563232421875,
-0.032745361328125,
-0.0304718017578125,
-0.0090179443359375,
-0.02838134765625,
-0.0260009765625,
-0.06854248046875,
0.007373809814453125,
0.0188751220703125,
0.045806884765625,
-0.0119781494140625,
0.0289306640625,
-0.036590576171875,
0.0263519287109375,
0.0105438232421875,
0.0198822021484375,
0.01059722900390625,
-0.054107666015625,
-0.0194091796875,
0.00360107421875,
-0.05206298828125,
-0.0684814453125,
0.0322265625,
0.006877899169921875,
0.03912353515625,
0.037811279296875,
-0.00980377197265625,
0.04632568359375,
-0.034271240234375,
0.07891845703125,
0.0162200927734375,
-0.048828125,
0.049896240234375,
-0.037384033203125,
0.01548004150390625,
0.01155853271484375,
0.043365478515625,
-0.0216827392578125,
-0.03350830078125,
-0.06353759765625,
-0.07244873046875,
0.03765869140625,
0.032318115234375,
0.027008056640625,
-0.01025390625,
0.05072021484375,
-0.0079345703125,
-0.006908416748046875,
-0.08087158203125,
-0.031585693359375,
-0.03070068359375,
-0.0047149658203125,
0.0114898681640625,
-0.02191162109375,
-0.00943756103515625,
-0.0288543701171875,
0.07098388671875,
0.01123809814453125,
0.038726806640625,
0.034210205078125,
0.00003993511199951172,
-0.023468017578125,
-0.0181427001953125,
0.04156494140625,
0.028411865234375,
-0.01290130615234375,
-0.0024280548095703125,
-0.00978851318359375,
-0.038330078125,
0.01885986328125,
0.0020885467529296875,
-0.052459716796875,
0.00402069091796875,
0.00452423095703125,
0.06201171875,
-0.0245208740234375,
-0.039215087890625,
0.0537109375,
-0.016448974609375,
-0.0305633544921875,
-0.037811279296875,
0.011138916015625,
0.006927490234375,
0.0106048583984375,
0.008392333984375,
0.037384033203125,
0.016876220703125,
-0.0203704833984375,
0.00902557373046875,
0.046630859375,
-0.0233306884765625,
-0.023651123046875,
0.08050537109375,
0.011138916015625,
-0.02166748046875,
0.035736083984375,
-0.0340576171875,
-0.01419830322265625,
0.051483154296875,
0.05841064453125,
0.0599365234375,
-0.01526641845703125,
0.0340576171875,
0.05328369140625,
0.0212554931640625,
-0.02191162109375,
0.0098419189453125,
0.01508331298828125,
-0.061737060546875,
-0.00592803955078125,
-0.034088134765625,
0.00222015380859375,
0.0207061767578125,
-0.03228759765625,
0.03363037109375,
-0.0447998046875,
-0.040191650390625,
-0.0012874603271484375,
-0.03057861328125,
-0.043365478515625,
0.0208892822265625,
0.0199127197265625,
0.06787109375,
-0.0802001953125,
0.060943603515625,
0.06024169921875,
-0.05499267578125,
-0.041351318359375,
0.01183319091796875,
-0.00928497314453125,
-0.019683837890625,
0.043609619140625,
0.00400543212890625,
0.006778717041015625,
0.0085601806640625,
-0.06292724609375,
-0.062469482421875,
0.0919189453125,
0.017364501953125,
-0.01348114013671875,
-0.00589752197265625,
-0.019805908203125,
0.04669189453125,
-0.032958984375,
0.0188446044921875,
0.01296234130859375,
0.02349853515625,
0.03643798828125,
-0.033477783203125,
0.011016845703125,
-0.0247344970703125,
0.03448486328125,
-0.0179443359375,
-0.06402587890625,
0.07244873046875,
-0.0228729248046875,
-0.0299530029296875,
0.0333251953125,
0.04638671875,
0.02117919921875,
0.023590087890625,
0.031280517578125,
0.06427001953125,
0.039764404296875,
-0.00211334228515625,
0.08050537109375,
-0.0020961761474609375,
0.0303802490234375,
0.054168701171875,
0.0005488395690917969,
0.05169677734375,
0.032379150390625,
-0.00670623779296875,
0.049163818359375,
0.0511474609375,
-0.0185089111328125,
0.056365966796875,
-0.00397491455078125,
-0.022735595703125,
-0.00769805908203125,
-0.0021305084228515625,
-0.0279693603515625,
0.0009398460388183594,
0.0275421142578125,
-0.052337646484375,
-0.006130218505859375,
0.016754150390625,
0.00004482269287109375,
-0.0137481689453125,
-0.00519561767578125,
0.04742431640625,
0.00478363037109375,
-0.03143310546875,
0.04632568359375,
0.0161895751953125,
0.061065673828125,
-0.03076171875,
-0.01377105712890625,
-0.006481170654296875,
0.00829315185546875,
-0.01534271240234375,
-0.060943603515625,
0.032073974609375,
-0.00966644287109375,
-0.0154571533203125,
-0.020599365234375,
0.0687255859375,
-0.03070068359375,
-0.044097900390625,
0.0228271484375,
0.0215911865234375,
0.0211181640625,
0.01474761962890625,
-0.08233642578125,
0.0149383544921875,
-0.004184722900390625,
-0.02764892578125,
0.016876220703125,
0.0192413330078125,
0.006561279296875,
0.04827880859375,
0.044219970703125,
0.005313873291015625,
0.003261566162109375,
-0.00678253173828125,
0.060211181640625,
-0.032745361328125,
-0.030609130859375,
-0.05364990234375,
0.060394287109375,
-0.0069732666015625,
-0.0123291015625,
0.048248291015625,
0.04315185546875,
0.0560302734375,
-0.0215606689453125,
0.068115234375,
-0.0167999267578125,
0.007320404052734375,
-0.040985107421875,
0.0716552734375,
-0.066650390625,
0.01171112060546875,
-0.03973388671875,
-0.059234619140625,
-0.0196075439453125,
0.07061767578125,
-0.01641845703125,
0.0244903564453125,
0.0335693359375,
0.07623291015625,
-0.01515960693359375,
-0.0227203369140625,
0.0274200439453125,
0.021209716796875,
0.034912109375,
0.0197601318359375,
0.06414794921875,
-0.05303955078125,
0.031951904296875,
-0.0281524658203125,
-0.0186309814453125,
-0.00014734268188476562,
-0.07177734375,
-0.06884765625,
-0.0599365234375,
-0.059906005859375,
-0.057464599609375,
-0.001934051513671875,
0.028778076171875,
0.07647705078125,
-0.03302001953125,
0.002017974853515625,
-0.025970458984375,
0.007770538330078125,
0.0050506591796875,
-0.02191162109375,
0.020721435546875,
0.004302978515625,
-0.062744140625,
-0.01129150390625,
0.0188446044921875,
0.0555419921875,
-0.032379150390625,
-0.01352691650390625,
-0.023773193359375,
0.001873016357421875,
0.039947509765625,
0.014984130859375,
-0.049774169921875,
0.0009145736694335938,
-0.019317626953125,
-0.01255035400390625,
0.0054779052734375,
0.0273590087890625,
-0.048675537109375,
0.031768798828125,
0.03857421875,
0.018157958984375,
0.0594482421875,
-0.00003415346145629883,
0.01279449462890625,
-0.045440673828125,
0.033721923828125,
0.011474609375,
0.024444580078125,
0.028717041015625,
-0.0406494140625,
0.0276947021484375,
0.046844482421875,
-0.058685302734375,
-0.05072021484375,
0.01543426513671875,
-0.07696533203125,
-0.02655029296875,
0.09454345703125,
-0.023773193359375,
-0.029327392578125,
0.00957489013671875,
-0.0289306640625,
0.0165252685546875,
-0.02532958984375,
0.0438232421875,
0.04248046875,
-0.001922607421875,
-0.039459228515625,
-0.039459228515625,
0.04534912109375,
0.01004791259765625,
-0.05072021484375,
-0.0202789306640625,
0.0506591796875,
0.05419921875,
0.020721435546875,
0.07440185546875,
-0.0268707275390625,
0.019805908203125,
0.002719879150390625,
0.007579803466796875,
0.0128631591796875,
-0.01270294189453125,
-0.031463623046875,
0.0006313323974609375,
-0.0018253326416015625,
-0.0017175674438476562
]
] |
stabilityai/StableBeluga-7B | 2023-08-29T20:21:36.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"en",
"dataset:conceptofmind/cot_submix_original",
"dataset:conceptofmind/flan2021_submix_original",
"dataset:conceptofmind/t0_submix_original",
"dataset:conceptofmind/niv2_submix_original",
"arxiv:2307.09288",
"arxiv:2306.02707",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | stabilityai | null | null | stabilityai/StableBeluga-7B | 119 | 7,521,844 | transformers | 2023-07-27T02:01:15 | ---
datasets:
- conceptofmind/cot_submix_original
- conceptofmind/flan2021_submix_original
- conceptofmind/t0_submix_original
- conceptofmind/niv2_submix_original
language:
- en
pipeline_tag: text-generation
---
# Stable Beluga 7B
Use [Stable Chat (Research Preview)](https://chat.stability.ai/chat) to test Stability AI's best language models for free.
## Model Description
`Stable Beluga 7B` is a Llama2 7B model fine-tuned on an Orca-style dataset.
## Usage
Start chatting with `Stable Beluga 7B` using the following code snippet:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
# Load the tokenizer and the model (half precision, placed automatically across available devices)
tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga-7B", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("stabilityai/StableBeluga-7B", torch_dtype=torch.float16, low_cpu_mem_usage=True, device_map="auto")
system_prompt = "### System:\nYou are StableBeluga, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n"
message = "Write me a poem please"
prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
Stable Beluga 7B should be used with this prompt format:
```
### System:
This is a system prompt, please behave and help the user.
### User:
Your prompt here
### Assistant:
The output of Stable Beluga 7B
```
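A small helper can assemble this format programmatically; `build_prompt` is a hypothetical convenience function following the format block above, not part of the card's own snippet:

```python
# Hypothetical helper that assembles the Stable Beluga 7B prompt
# format shown above (system / user / assistant turns).
def build_prompt(system, user):
    return (f"### System:\n{system}\n\n"
            f"### User:\n{user}\n\n"
            f"### Assistant:\n")

prompt = build_prompt("This is a system prompt, please behave and help the user.",
                      "Your prompt here")
print(prompt)
```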
## Model Details
* **Developed by**: [Stability AI](https://stability.ai/)
* **Model type**: Stable Beluga 7B is an auto-regressive language model fine-tuned on Llama2 7B.
* **Language(s)**: English
* **Library**: [HuggingFace Transformers](https://github.com/huggingface/transformers)
* **License**: Fine-tuned checkpoints (`Stable Beluga 7B`) is licensed under the [STABLE BELUGA NON-COMMERCIAL COMMUNITY LICENSE AGREEMENT](https://huggingface.co/stabilityai/StableBeluga-7B/blob/main/LICENSE.txt)
* **Contact**: For questions and comments about the model, please email `lm@stability.ai`
### Training Dataset
`Stable Beluga 7B` is trained on our internal Orca-style dataset.
### Training Procedure
Models are trained via supervised fine-tuning on the aforementioned datasets, in mixed precision (BF16), and optimized with AdamW. We used the following hyperparameters:
| Dataset | Batch Size | Learning Rate | Learning Rate Decay | Warm-up Steps | Weight Decay | Betas |
|-------------------|------------|---------------|-------------------|---------|--------------|-------------|
| Orca pt1 packed | 256 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) |
| Orca pt2 unpacked | 512 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) |
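The "Cosine to 3e-6" decay with a 100-step warm-up can be sketched as a schedule function. This is our own illustration; the total step count below is a placeholder, since the card does not publish it:

```python
import math

def lr_at(step, total_steps, peak=3e-5, floor=3e-6, warmup=100):
    """Linear warm-up to `peak`, then cosine decay down to `floor`."""
    if step < warmup:
        return peak * step / warmup
    progress = (step - warmup) / max(1, total_steps - warmup)
    return floor + 0.5 * (peak - floor) * (1 + math.cos(math.pi * progress))

total = 10_000  # placeholder; the actual step count is not published
print(lr_at(100, total))    # at the end of warm-up: peak rate 3e-5
print(lr_at(total, total))  # at the end of training: floor rate 3e-6
```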
## Ethical Considerations and Limitations
Beluga is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Beluga's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Beluga, developers should perform safety testing and tuning tailored to their specific applications of the model.
## Citations
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 5,307 | [
<!-- Row metadata: cardiffnlp/twitter-roberta-base-irony | text-classification | cardiffnlp | last modified 2023-08-02 (embeddings column omitted) -->
---
datasets:
- tweet_eval
language:
- en
---
# Twitter-roBERTa-base for Irony Detection
This is a roBERTa-base model trained on ~58M tweets and fine-tuned for irony detection with the TweetEval benchmark.
This model has been integrated into the [TweetNLP Python library](https://github.com/cardiffnlp/tweetnlp/).
- Paper: [_TweetEval_ benchmark (Findings of EMNLP 2020)](https://arxiv.org/pdf/2010.12421.pdf).
- Git Repo: [Tweeteval official repository](https://github.com/cardiffnlp/tweeteval).
## Example of classification
```python
from transformers import AutoModelForSequenceClassification
from transformers import TFAutoModelForSequenceClassification
from transformers import AutoTokenizer
import numpy as np
from scipy.special import softmax
import csv
import urllib.request
# Preprocess text (username and link placeholders)
def preprocess(text):
    new_text = []
    for t in text.split(" "):
        t = '@user' if t.startswith('@') and len(t) > 1 else t
        t = 'http' if t.startswith('http') else t
        new_text.append(t)
    return " ".join(new_text)
# Tasks:
# emoji, emotion, hate, irony, offensive, sentiment
# stance/abortion, stance/atheism, stance/climate, stance/feminist, stance/hillary
task='irony'
MODEL = f"cardiffnlp/twitter-roberta-base-{task}"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
# download label mapping
labels=[]
mapping_link = f"https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/{task}/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
    html = f.read().decode('utf-8').split("\n")
    csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
# PT
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.save_pretrained(MODEL)
text = "Great, it broke the first day..."
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
scores = softmax(scores)
# # TF
# model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
# model.save_pretrained(MODEL)
# text = "Great, it broke the first day..."
# encoded_input = tokenizer(text, return_tensors='tf')
# output = model(encoded_input)
# scores = output[0][0].numpy()
# scores = softmax(scores)
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(scores.shape[0]):
    l = labels[ranking[i]]
    s = scores[ranking[i]]
    print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
Output:
```
1) irony 0.914
2) non_irony 0.086
```
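The softmax-and-ranking step at the end of the snippet can be exercised in isolation without loading the model. The logits below are made up for illustration (real values come from the classifier), and the softmax is written out with NumPy, equivalent to the `scipy.special.softmax` call used above:

```python
import numpy as np

labels = ["non_irony", "irony"]
logits = np.array([-1.2, 1.2])  # dummy raw outputs; real values come from the model

# Numerically stable softmax (same result as scipy.special.softmax)
scores = np.exp(logits - logits.max())
scores /= scores.sum()

ranking = np.argsort(scores)[::-1]  # indices ordered from highest to lowest score
for i, idx in enumerate(ranking, start=1):
    print(f"{i}) {labels[idx]} {np.round(float(scores[idx]), 4)}")
```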
### Reference
Please cite the [reference paper](https://aclanthology.org/2020.findings-emnlp.148/) if you use this model.
```bibtex
@inproceedings{barbieri-etal-2020-tweeteval,
title = "{T}weet{E}val: Unified Benchmark and Comparative Evaluation for Tweet Classification",
author = "Barbieri, Francesco and
Camacho-Collados, Jose and
Espinosa Anke, Luis and
Neves, Leonardo",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.findings-emnlp.148",
doi = "10.18653/v1/2020.findings-emnlp.148",
pages = "1644--1650"
}
``` | 3,305 | [
-0.0290985107421875,
-0.06298828125,
0.00011146068572998047,
0.01105499267578125,
-0.0065460205078125,
0.0000317692756652832,
0.035736083984375,
-0.038299560546875,
-0.038818359375,
0.03594970703125,
0.0142059326171875,
0.0276947021484375,
0.01383209228515625,
-0.091064453125,
0.00501251220703125,
-0.00403594970703125,
-0.0482177734375,
-0.006389617919921875,
0.049957275390625,
0.019256591796875,
0.04571533203125,
0.04718017578125,
-0.0049591064453125,
0.0194854736328125,
0.0179901123046875,
0.0640869140625,
-0.05682373046875,
-0.037017822265625,
-0.06683349609375,
0.03662109375,
-0.0272216796875,
-0.040771484375,
0.057647705078125,
0.04180908203125,
0.0452880859375,
-0.0017042160034179688,
0.0704345703125,
-0.03564453125,
0.04095458984375,
-0.019561767578125,
0.06976318359375,
-0.06341552734375,
0.0090789794921875,
-0.031097412109375,
-0.048492431640625,
-0.02789306640625,
0.05499267578125,
-0.03765869140625,
0.026519775390625,
0.05499267578125,
0.0487060546875,
-0.0050201416015625,
-0.006549835205078125,
0.009521484375,
0.05267333984375,
0.024658203125,
0.06182861328125,
0.0516357421875,
-0.06463623046875,
0.052734375,
-0.034759521484375,
-0.0204315185546875,
-0.041900634765625,
-0.06707763671875,
-0.0804443359375,
-0.04217529296875,
-0.0290985107421875,
-0.06085205078125,
0.0052947998046875,
0.07379150390625,
0.02471923828125,
-0.06854248046875,
-0.01546478271484375,
0.007465362548828125,
0.0196990966796875,
-0.00777435302734375,
-0.0218353271484375,
0.053436279296875,
-0.033599853515625,
-0.060821533203125,
-0.007122039794921875,
0.00211334228515625,
0.002956390380859375,
0.0097808837890625,
0.0060577392578125,
-0.048370361328125,
0.005687713623046875,
0.0250701904296875,
0.020477294921875,
-0.051666259765625,
-0.02734375,
0.0004487037658691406,
-0.0272674560546875,
0.0020389556884765625,
0.0157623291015625,
-0.0307464599609375,
0.0175933837890625,
0.05560302734375,
0.0237274169921875,
0.0294189453125,
0.0006718635559082031,
0.021392822265625,
-0.04681396484375,
0.0110321044921875,
0.03179931640625,
0.0313720703125,
0.03717041015625,
-0.0153656005859375,
0.045166015625,
0.040618896484375,
-0.052978515625,
-0.07965087890625,
-0.0203399658203125,
-0.0872802734375,
-0.01611328125,
0.08074951171875,
-0.03173828125,
-0.049652099609375,
-0.00958251953125,
0.0125732421875,
0.054656982421875,
-0.0465087890625,
0.0709228515625,
0.0345458984375,
0.00298309326171875,
0.0015773773193359375,
-0.03253173828125,
0.040771484375,
0.03204345703125,
-0.0479736328125,
-0.0117340087890625,
0.00502777099609375,
0.043792724609375,
0.01763916015625,
0.0531005859375,
0.0015716552734375,
0.0380859375,
0.003864288330078125,
0.0208892822265625,
-0.0035190582275390625,
-0.00814056396484375,
-0.0216064453125,
0.014862060546875,
-0.027862548828125,
-0.01183319091796875
]
] |
SamLowe/roberta-base-go_emotions | 2023-10-04T10:00:58.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"text-classification",
"emotions",
"multi-class-classification",
"multi-label-classification",
"en",
"dataset:go_emotions",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | SamLowe | null | null | SamLowe/roberta-base-go_emotions | 159 | 6,800,543 | transformers | 2022-09-15T13:04:21 | ---
language: en
tags:
- text-classification
- pytorch
- roberta
- emotions
- multi-class-classification
- multi-label-classification
datasets:
- go_emotions
license: mit
widget:
- text: I am not having a great day.
---
#### Overview
Model trained from [roberta-base](https://huggingface.co/roberta-base) on the [go_emotions](https://huggingface.co/datasets/go_emotions) dataset for multi-label classification.
##### ONNX version also available
A version of this model in ONNX format (including an INT8 quantized ONNX version) is now available at [https://huggingface.co/SamLowe/roberta-base-go_emotions-onnx](https://huggingface.co/SamLowe/roberta-base-go_emotions-onnx). If you only need inference, the ONNX versions are faster (especially for smaller batch sizes), greatly reduce the size of the dependencies required for inference, and make the model more portable across platforms; the quantized version additionally reduces the model file/download size by 75% whilst retaining almost all of the accuracy.
#### Dataset used for the model
[go_emotions](https://huggingface.co/datasets/go_emotions) is based on Reddit data and has 28 labels. It is a multi-label dataset, meaning one or multiple labels may apply to any given input text; accordingly, this model is a multi-label classifier that outputs 28 'probability' floats per input. Typically a threshold of 0.5 is applied to each probability to obtain the prediction for each label.
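The 0.5 thresholding described above can be sketched as follows. This is a minimal illustration only; the label names and probabilities below are made up, not actual model output:

```python
# Minimal sketch: binarize multi-label probabilities with a 0.5 threshold.
# The probabilities below are made-up illustrations, not real model output.
probs = {"admiration": 0.81, "joy": 0.62, "anger": 0.03, "neutral": 0.44}

threshold = 0.5
predicted = [label for label, p in probs.items() if p >= threshold]

print(predicted)  # ['admiration', 'joy']
```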
#### How the model was created
The model was trained using `AutoModelForSequenceClassification.from_pretrained` with `problem_type="multi_label_classification"` for 3 epochs with a learning rate of 2e-5 and weight decay of 0.01.
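With `problem_type="multi_label_classification"`, the Trainer uses binary cross-entropy with logits as the loss, treating each of the 28 labels as an independent sigmoid output rather than applying a softmax across labels. A minimal pure-Python sketch of that loss (illustrative only, on a toy 4-label example):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def bce_with_logits(logits, targets):
    """Mean binary cross-entropy over independent labels (illustrative)."""
    total = 0.0
    for z, t in zip(logits, targets):
        p = sigmoid(z)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(logits)

# Toy 4-label example (the real model has 28 labels).
logits = [2.0, -1.0, 0.0, -3.0]
targets = [1.0, 0.0, 0.0, 0.0]
print(round(bce_with_logits(logits, targets), 4))
```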
#### Inference
There are multiple ways to use this model with Hugging Face Transformers. Possibly the simplest is using a pipeline:
```python
from transformers import pipeline
classifier = pipeline(task="text-classification", model="SamLowe/roberta-base-go_emotions", top_k=None)
sentences = ["I am not having a great day"]
model_outputs = classifier(sentences)
print(model_outputs[0])
# produces a list of dicts for each of the labels
```
#### Evaluation / metrics
Evaluation of the model is available at
- https://github.com/samlowe/go_emotions-dataset/blob/main/eval-roberta-base-go_emotions.ipynb
[](https://colab.research.google.com/github/samlowe/go_emotions-dataset/blob/main/eval-roberta-base-go_emotions.ipynb)
##### Summary
As provided in the above notebook, evaluating the multi-label output (binarizing each of the 28 outputs with a threshold of 0.5) on the dataset test split gives:
- Accuracy: 0.474
- Precision: 0.575
- Recall: 0.396
- F1: 0.450
But the metrics are more meaningful when measured per label, given the multi-label nature (each label is effectively an independent binary classification) and the fact that the labels have drastically different representation in the dataset.
With a threshold of 0.5 applied to binarize the model outputs, as per the above notebook, the metrics per label are:
| | accuracy | precision | recall | f1 | mcc | support | threshold |
| -------------- | -------- | --------- | ------ | ----- | ----- | ------- | --------- |
| admiration | 0.946 | 0.725 | 0.675 | 0.699 | 0.670 | 504 | 0.5 |
| amusement | 0.982 | 0.790 | 0.871 | 0.829 | 0.821 | 264 | 0.5 |
| anger | 0.970 | 0.652 | 0.379 | 0.479 | 0.483 | 198 | 0.5 |
| annoyance | 0.940 | 0.472 | 0.159 | 0.238 | 0.250 | 320 | 0.5 |
| approval | 0.942 | 0.609 | 0.302 | 0.404 | 0.403 | 351 | 0.5 |
| caring | 0.973 | 0.448 | 0.319 | 0.372 | 0.364 | 135 | 0.5 |
| confusion | 0.972 | 0.500 | 0.431 | 0.463 | 0.450 | 153 | 0.5 |
| curiosity | 0.950 | 0.537 | 0.356 | 0.428 | 0.412 | 284 | 0.5 |
| desire | 0.987 | 0.630 | 0.410 | 0.496 | 0.502 | 83 | 0.5 |
| disappointment | 0.974 | 0.625 | 0.199 | 0.302 | 0.343 | 151 | 0.5 |
| disapproval | 0.950 | 0.494 | 0.307 | 0.379 | 0.365 | 267 | 0.5 |
| disgust | 0.982 | 0.707 | 0.333 | 0.453 | 0.478 | 123 | 0.5 |
| embarrassment | 0.994 | 0.750 | 0.243 | 0.367 | 0.425 | 37 | 0.5 |
| excitement | 0.983 | 0.603 | 0.340 | 0.435 | 0.445 | 103 | 0.5 |
| fear | 0.992 | 0.758 | 0.603 | 0.671 | 0.672 | 78 | 0.5 |
| gratitude | 0.990 | 0.960 | 0.881 | 0.919 | 0.914 | 352 | 0.5 |
| grief | 0.999 | 0.000 | 0.000 | 0.000 | 0.000 | 6 | 0.5 |
| joy | 0.978 | 0.647 | 0.559 | 0.600 | 0.590 | 161 | 0.5 |
| love | 0.982 | 0.773 | 0.832 | 0.802 | 0.793 | 238 | 0.5 |
| nervousness | 0.996 | 0.600 | 0.130 | 0.214 | 0.278 | 23 | 0.5 |
| optimism | 0.972 | 0.667 | 0.376 | 0.481 | 0.488 | 186 | 0.5 |
| pride | 0.997 | 0.000 | 0.000 | 0.000 | 0.000 | 16 | 0.5 |
| realization | 0.974 | 0.541 | 0.138 | 0.220 | 0.264 | 145 | 0.5 |
| relief | 0.998 | 0.000 | 0.000 | 0.000 | 0.000 | 11 | 0.5 |
| remorse | 0.991 | 0.553 | 0.750 | 0.636 | 0.640 | 56 | 0.5 |
| sadness | 0.977 | 0.621 | 0.494 | 0.550 | 0.542 | 156 | 0.5 |
| surprise | 0.981 | 0.750 | 0.404 | 0.525 | 0.542 | 141 | 0.5 |
| neutral | 0.782 | 0.694 | 0.604 | 0.646 | 0.492 | 1787 | 0.5 |
Optimizing the threshold per label for the one that gives the optimum F1 metrics gives slightly better metrics, sacrificing some precision for a greater gain in recall, to the benefit of F1 (how this was done is shown in the above notebook):
| | accuracy | precision | recall | f1 | mcc | support | threshold |
| -------------- | -------- | --------- | ------ | ----- | ----- | ------- | --------- |
| admiration | 0.940 | 0.651 | 0.776 | 0.708 | 0.678 | 504 | 0.25 |
| amusement | 0.982 | 0.781 | 0.890 | 0.832 | 0.825 | 264 | 0.45 |
| anger | 0.959 | 0.454 | 0.601 | 0.517 | 0.502 | 198 | 0.15 |
| annoyance | 0.864 | 0.243 | 0.619 | 0.349 | 0.328 | 320 | 0.10 |
| approval | 0.926 | 0.432 | 0.442 | 0.437 | 0.397 | 351 | 0.30 |
| caring | 0.972 | 0.426 | 0.385 | 0.405 | 0.391 | 135 | 0.40 |
| confusion | 0.974 | 0.548 | 0.412 | 0.470 | 0.462 | 153 | 0.55 |
| curiosity | 0.943 | 0.473 | 0.711 | 0.568 | 0.552 | 284 | 0.25 |
| desire | 0.985 | 0.518 | 0.530 | 0.524 | 0.516 | 83 | 0.25 |
| disappointment | 0.974 | 0.562 | 0.298 | 0.390 | 0.398 | 151 | 0.40 |
| disapproval | 0.941 | 0.414 | 0.468 | 0.439 | 0.409 | 267 | 0.30 |
| disgust | 0.978 | 0.523 | 0.463 | 0.491 | 0.481 | 123 | 0.20 |
| embarrassment | 0.994 | 0.567 | 0.459 | 0.507 | 0.507 | 37 | 0.10 |
| excitement | 0.981 | 0.500 | 0.417 | 0.455 | 0.447 | 103 | 0.35 |
| fear | 0.991 | 0.712 | 0.667 | 0.689 | 0.685 | 78 | 0.40 |
| gratitude | 0.990 | 0.957 | 0.889 | 0.922 | 0.917 | 352 | 0.45 |
| grief | 0.999 | 0.333 | 0.333 | 0.333 | 0.333 | 6 | 0.05 |
| joy | 0.978 | 0.623 | 0.646 | 0.634 | 0.623 | 161 | 0.40 |
| love | 0.982 | 0.740 | 0.899 | 0.812 | 0.807 | 238 | 0.25 |
| nervousness | 0.996 | 0.571 | 0.348 | 0.432 | 0.444 | 23 | 0.25 |
| optimism | 0.971 | 0.580 | 0.565 | 0.572 | 0.557 | 186 | 0.20 |
| pride | 0.998 | 0.875 | 0.438 | 0.583 | 0.618 | 16 | 0.10 |
| realization | 0.961 | 0.270 | 0.262 | 0.266 | 0.246 | 145 | 0.15 |
| relief | 0.992 | 0.152 | 0.636 | 0.246 | 0.309 | 11 | 0.05 |
| remorse | 0.991 | 0.541 | 0.946 | 0.688 | 0.712 | 56 | 0.10 |
| sadness | 0.977 | 0.599 | 0.583 | 0.591 | 0.579 | 156 | 0.40 |
| surprise | 0.977 | 0.543 | 0.674 | 0.601 | 0.593 | 141 | 0.15 |
| neutral | 0.758 | 0.598 | 0.810 | 0.688 | 0.513 | 1787 | 0.25 |
This improves the overall metrics:
- Precision: 0.542
- Recall: 0.577
- F1: 0.541
Or if calculated weighted by the relative size of the support of each label:
- Precision: 0.572
- Recall: 0.677
- F1: 0.611
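The per-label threshold optimization above can be sketched as a simple grid search over candidate thresholds, maximizing F1 independently for each label. This is an illustrative sketch with made-up probabilities and gold labels; the linked notebook shows the actual procedure used:

```python
def f1_at_threshold(probs, labels, threshold):
    """F1 for one label's binary predictions at a given threshold."""
    preds = [1 if p >= threshold else 0 for p in probs]
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def best_threshold(probs, labels, grid=None):
    """Grid-search the threshold that maximizes F1 for one label."""
    grid = grid or [t / 100 for t in range(5, 100, 5)]  # 0.05 .. 0.95
    return max(grid, key=lambda t: f1_at_threshold(probs, labels, t))

# Toy example for one label (made-up probabilities and gold labels).
probs = [0.9, 0.4, 0.35, 0.2, 0.1]
labels = [1, 1, 1, 0, 0]
print(best_threshold(probs, labels))  # 0.25
```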
#### Commentary on the dataset
Some labels (e.g. gratitude), when considered independently, perform very strongly, with F1 exceeding 0.9, whilst others (e.g. relief) perform very poorly.
This is a challenging dataset. Labels such as relief have far fewer examples in the training data (fewer than 100 out of the 40k+, and only 11 in the test split).
But there are also some ambiguities and/or labelling errors visible in the training data of go_emotions that are suspected to constrain the performance. Data cleaning on the dataset to reduce some of the mistakes, ambiguity, conflicts and duplication in the labelling would produce a higher performing model.
[
-0.038543701171875,
-0.03546142578125,
0.01154327392578125,
0.0163726806640625,
-0.0005674362182617188,
0.00769805908203125,
0.006744384765625,
-0.0227813720703125,
0.04791259765625,
0.026031494140625,
-0.028289794921875,
-0.052825927734375,
-0.0595703125,
-0.0019216537475585938,
0.0033779144287109375,
0.0830078125,
-0.005893707275390625,
-0.005382537841796875,
0.00235748291015625,
-0.0288543701171875,
-0.0213623046875,
-0.02069091796875,
-0.04400634765625,
-0.0113372802734375,
0.025848388671875,
0.033660888671875,
0.049041748046875,
0.033233642578125,
0.03814697265625,
0.032318115234375,
-0.022796630859375,
-0.002574920654296875,
-0.0213623046875,
-0.0243377685546875,
0.02020263671875,
-0.0389404296875,
-0.037628173828125,
0.0184783935546875,
0.0197601318359375,
0.047271728515625,
0.0004208087921142578,
0.027435302734375,
-0.0034008026123046875,
0.08087158203125,
-0.032073974609375,
0.0112457275390625,
-0.00885772705078125,
0.003875732421875,
0.00623321533203125,
-0.003391265869140625,
-0.00860595703125,
-0.05413818359375,
0.007534027099609375,
-0.04034423828125,
0.01558685302734375,
0.00664520263671875,
0.09716796875,
0.0134124755859375,
-0.024993896484375,
-0.028564453125,
-0.019378662109375,
0.047576904296875,
-0.050872802734375,
0.0229644775390625,
0.038604736328125,
-0.0019044876098632812,
-0.0077362060546875,
-0.037933349609375,
-0.041961669921875,
0.0157318115234375,
-0.038726806640625,
0.0246429443359375,
-0.014434814453125,
-0.0239105224609375,
0.032440185546875,
0.0477294921875,
-0.0516357421875,
-0.02484130859375,
-0.02618408203125,
-0.0175323486328125,
0.045684814453125,
0.000896453857421875,
0.0196533203125,
-0.033538818359375,
-0.03607177734375,
-0.0160675048828125,
-0.00838470458984375,
0.041656494140625,
0.0185699462890625,
0.0014734268188476562,
-0.031341552734375,
0.033233642578125,
-0.01558685302734375,
0.036865234375,
0.0166473388671875,
-0.028350830078125,
0.07049560546875,
-0.036376953125,
-0.0158538818359375,
-0.0012292861938476562,
0.062744140625,
0.035369873046875,
0.004199981689453125,
0.013397216796875,
-0.0012664794921875,
0.00962066650390625,
-0.005664825439453125,
-0.0484619140625,
-0.0271148681640625,
0.042022705078125,
-0.039337158203125,
-0.019927978515625,
0.0132293701171875,
-0.051483154296875,
0.005184173583984375,
-0.0171051025390625,
0.04144287109375,
-0.0302734375,
-0.009796142578125,
-0.001331329345703125,
-0.0268707275390625,
-0.006687164306640625,
0.016265869140625,
-0.07110595703125,
-0.0023670196533203125,
0.0165557861328125,
0.05712890625,
0.00211334228515625,
0.0010633468627929688,
-0.0007228851318359375,
0.01537322998046875,
-0.03765869140625,
0.04205322265625,
-0.01593017578125,
-0.0272369384765625,
-0.0380859375,
0.02496337890625,
-0.01224517822265625,
-0.029937744140625,
0.04925537109375,
-0.019256591796875,
0.02557373046875,
-0.021240234375,
-0.0242767333984375,
-0.022247314453125,
0.023956298828125,
-0.060394287109375,
0.09906005859375,
0.0301361083984375,
-0.06292724609375,
0.036346435546875,
-0.0474853515625,
0.0014581680297851562,
-0.016937255859375,
0.018218994140625,
-0.05279541015625,
-0.01137542724609375,
0.0235595703125,
0.0241851806640625,
-0.0180816650390625,
0.01215362548828125,
-0.00826263427734375,
-0.005290985107421875,
0.013336181640625,
-0.0165252685546875,
0.092041015625,
0.01494598388671875,
-0.0306243896484375,
0.006259918212890625,
-0.0794677734375,
0.0126190185546875,
0.00925445556640625,
-0.044677734375,
-0.01641845703125,
-0.03875732421875,
-0.0072174072265625,
0.021240234375,
0.023284912109375,
-0.035369873046875,
0.005523681640625,
-0.00611114501953125,
0.03607177734375,
0.04217529296875,
0.00860595703125,
0.019561767578125,
-0.05023193359375,
0.024505615234375,
0.031280517578125,
0.0191497802734375,
-0.004909515380859375,
-0.04449462890625,
-0.05743408203125,
-0.0419921875,
0.02618408203125,
0.046173095703125,
-0.0157623291015625,
0.046661376953125,
-0.01213836669921875,
-0.051116943359375,
-0.040069580078125,
0.007724761962890625,
0.022552490234375,
0.0513916015625,
0.029998779296875,
-0.027252197265625,
-0.0267333984375,
-0.06317138671875,
-0.00605010986328125,
-0.01544952392578125,
0.00408172607421875,
0.053558349609375,
0.06854248046875,
-0.019805908203125,
0.07354736328125,
-0.0565185546875,
-0.04241943359375,
0.01068878173828125,
0.00799560546875,
0.04071044921875,
0.039581298828125,
0.056793212890625,
-0.06610107421875,
-0.058013916015625,
0.00269317626953125,
-0.048370361328125,
0.014495849609375,
-0.0076141357421875,
-0.01837158203125,
0.01837158203125,
0.00910186767578125,
-0.038604736328125,
0.059814453125,
0.034423828125,
-0.034027099609375,
0.053863525390625,
-0.032196044921875,
0.032379150390625,
-0.069091796875,
0.02008056640625,
0.0010652542114257812,
-0.004344940185546875,
-0.038909912109375,
-0.032135009765625,
0.0216827392578125,
0.01360321044921875,
-0.0240936279296875,
0.038299560546875,
-0.04205322265625,
0.00571441650390625,
0.0265960693359375,
0.0020809173583984375,
0.016876220703125,
0.04925537109375,
0.0090789794921875,
0.052032470703125,
0.051605224609375,
-0.045196533203125,
0.03167724609375,
0.020904541015625,
-0.039093017578125,
0.062744140625,
-0.035858154296875,
-0.006259918212890625,
-0.0123138427734375,
0.015533447265625,
-0.10430908203125,
-0.0292510986328125,
0.0247344970703125,
-0.05743408203125,
0.0156402587890625,
0.0184783935546875,
-0.0160675048828125,
-0.06365966796875,
-0.036834716796875,
0.0030727386474609375,
0.0242156982421875,
-0.026702880859375,
0.034271240234375,
0.03631591796875,
-0.001598358154296875,
-0.04937744140625,
-0.058013916015625,
-0.01212310791015625,
-0.024505615234375,
-0.043182373046875,
0.0247650146484375,
-0.021484375,
-0.0025730133056640625,
-0.0001398324966430664,
-0.0015153884887695312,
-0.00855255126953125,
0.0005726814270019531,
0.02294921875,
0.01641845703125,
-0.00988006591796875,
0.002185821533203125,
-0.01216888427734375,
-0.02569580078125,
-0.0024433135986328125,
0.004619598388671875,
0.0205535888671875,
-0.022979736328125,
0.0015935897827148438,
-0.044586181640625,
0.005786895751953125,
0.036529541015625,
-0.018310546875,
0.0657958984375,
0.042999267578125,
-0.0274505615234375,
0.016998291015625,
-0.035858154296875,
0.0024280548095703125,
-0.030670166015625,
0.0026988983154296875,
-0.04248046875,
-0.05657958984375,
0.0543212890625,
0.0013437271118164062,
-0.0074615478515625,
0.050384521484375,
0.040252685546875,
-0.01369476318359375,
0.06207275390625,
0.0287628173828125,
-0.00885772705078125,
0.0162200927734375,
-0.04400634765625,
0.01277923583984375,
-0.05426025390625,
-0.053863525390625,
-0.03729248046875,
-0.032073974609375,
-0.02734375,
-0.0213775634765625,
0.032379150390625,
0.0079193115234375,
-0.048126220703125,
0.0228424072265625,
-0.0728759765625,
0.021575927734375,
0.062347412109375,
0.02337646484375,
0.0090179443359375,
-0.005062103271484375,
-0.016876220703125,
-0.015869140625,
-0.0391845703125,
-0.0302276611328125,
0.0814208984375,
0.0205535888671875,
0.03216552734375,
0.026123046875,
0.046173095703125,
0.02313232421875,
0.02947998046875,
-0.0462646484375,
0.0185699462890625,
0.0001494884490966797,
-0.07122802734375,
-0.0223846435546875,
-0.026458740234375,
-0.0794677734375,
0.022918701171875,
-0.0177001953125,
-0.08612060546875,
0.03314208984375,
0.01007843017578125,
-0.0276336669921875,
0.027008056640625,
-0.057525634765625,
0.07110595703125,
-0.03057861328125,
-0.0234832763671875,
0.019378662109375,
-0.06939697265625,
0.0261688232421875,
-0.011871337890625,
0.042022705078125,
-0.0219268798828125,
0.004772186279296875,
0.061492919921875,
-0.04400634765625,
0.041412353515625,
0.007415771484375,
0.01129150390625,
0.0362548828125,
-0.01558685302734375,
0.0262603759765625,
-0.0033721923828125,
-0.005878448486328125,
-0.011138916015625,
0.008148193359375,
-0.027740478515625,
-0.0206146240234375,
0.06378173828125,
-0.06707763671875,
-0.034912109375,
-0.046051025390625,
-0.042144775390625,
0.0016355514526367188,
0.028778076171875,
0.032684326171875,
0.023162841796875,
0.005771636962890625,
0.0196685791015625,
0.04730224609375,
-0.0223388671875,
0.037139892578125,
0.01554107666015625,
-0.0184326171875,
-0.06280517578125,
0.07684326171875,
0.0132293701171875,
0.0243377685546875,
0.026824951171875,
0.024505615234375,
-0.0391845703125,
0.002353668212890625,
-0.022918701171875,
0.02362060546875,
-0.03265380859375,
-0.0231781005859375,
-0.057159423828125,
0.0013837814331054688,
-0.0565185546875,
-0.02496337890625,
-0.01922607421875,
-0.032928466796875,
-0.042694091796875,
-0.01430511474609375,
0.050384521484375,
0.05120849609375,
-0.025177001953125,
0.0269317626953125,
-0.051300048828125,
0.0312347412109375,
0.01088714599609375,
0.022857666015625,
-0.0073089599609375,
-0.037109375,
0.002651214599609375,
-0.00598907470703125,
-0.03155517578125,
-0.0738525390625,
0.058197021484375,
-0.0004458427429199219,
0.0298614501953125,
0.0390625,
-0.0013380050659179688,
0.06243896484375,
-0.00235748291015625,
0.0595703125,
0.05291748046875,
-0.06951904296875,
0.053009033203125,
-0.014404296875,
0.0172119140625,
0.04205322265625,
0.04974365234375,
-0.043060302734375,
-0.01971435546875,
-0.07196044921875,
-0.07366943359375,
0.06280517578125,
0.0186920166015625,
-0.0178680419921875,
0.011474609375,
0.008087158203125,
-0.007171630859375,
0.02655029296875,
-0.0732421875,
-0.07196044921875,
-0.01397705078125,
-0.0215911865234375,
-0.025543212890625,
-0.015655517578125,
-0.016082763671875,
-0.045440673828125,
0.05364990234375,
0.01036834716796875,
0.0233612060546875,
0.0199432373046875,
0.003643035888671875,
0.011260986328125,
0.01399993896484375,
0.03948974609375,
0.038665771484375,
-0.0382080078125,
0.00260162353515625,
-0.00372314453125,
-0.041839599609375,
0.0263214111328125,
-0.01540374755859375,
-0.0285186767578125,
0.0036487579345703125,
0.00823211669921875,
0.051361083984375,
0.004047393798828125,
-0.01453399658203125,
0.027618408203125,
-0.0004341602325439453,
-0.03228759765625,
-0.0226898193359375,
-0.0009832382202148438,
0.0180206298828125,
0.01192474365234375,
0.015350341796875,
0.0141754150390625,
0.00689697265625,
-0.0489501953125,
0.022308349609375,
0.0285186767578125,
-0.046600341796875,
0.01313018798828125,
0.0643310546875,
0.0016765594482421875,
-0.0215606689453125,
0.032379150390625,
-0.008148193359375,
-0.058258056640625,
0.07110595703125,
0.037200927734375,
0.0223541259765625,
-0.025787353515625,
0.034088134765625,
0.0751953125,
0.0213165283203125,
0.0007290840148925781,
0.031280517578125,
0.01406097412109375,
-0.0309295654296875,
0.0223388671875,
-0.05865478515625,
-0.01947021484375,
0.01425933837890625,
-0.047119140625,
0.0019063949584960938,
-0.0255889892578125,
-0.042877197265625,
0.00399017333984375,
0.0172576904296875,
-0.04791259765625,
0.05029296875,
-0.0078887939453125,
0.06298828125,
-0.0584716796875,
0.043914794921875,
0.04217529296875,
-0.046295166015625,
-0.08160400390625,
-0.0460205078125,
0.00887298583984375,
-0.039642333984375,
0.053497314453125,
0.0211944580078125,
0.0157623291015625,
0.00466156005859375,
-0.0254669189453125,
-0.0916748046875,
0.0870361328125,
-0.01214599609375,
-0.0341796875,
0.0276947021484375,
0.00954437255859375,
0.035400390625,
-0.00946807861328125,
0.0452880859375,
0.0643310546875,
0.048492431640625,
0.01384735107421875,
-0.06927490234375,
0.002185821533203125,
-0.03558349609375,
-0.01165008544921875,
0.0201263427734375,
-0.0677490234375,
0.073486328125,
-0.0311737060546875,
0.01070404052734375,
-0.0045928955078125,
0.042022705078125,
0.033660888671875,
0.020416259765625,
0.0268707275390625,
0.0728759765625,
0.09014892578125,
-0.0379638671875,
0.0679931640625,
-0.0213775634765625,
0.0548095703125,
0.048492431640625,
0.0206451416015625,
0.0460205078125,
0.0279541015625,
-0.0477294921875,
0.042633056640625,
0.07037353515625,
-0.0016508102416992188,
0.03387451171875,
0.00598907470703125,
-0.017578125,
0.00360870361328125,
0.00899505615234375,
-0.041534423828125,
0.00453948974609375,
0.035858154296875,
-0.04058837890625,
0.0014562606811523438,
-0.004665374755859375,
0.01690673828125,
-0.014923095703125,
-0.035247802734375,
0.0279083251953125,
-0.0037841796875,
-0.03985595703125,
0.041290283203125,
-0.00921630859375,
0.04998779296875,
-0.035186767578125,
0.0213623046875,
-0.00887298583984375,
0.0435791015625,
-0.039459228515625,
-0.07818603515625,
0.010162353515625,
-0.0079193115234375,
-0.0226287841796875,
-0.010040283203125,
0.0271759033203125,
-0.0244903564453125,
-0.042633056640625,
0.01000213623046875,
0.002166748046875,
0.028533935546875,
0.01409912109375,
-0.07855224609375,
0.0007486343383789062,
0.026458740234375,
-0.046844482421875,
0.005672454833984375,
0.038299560546875,
0.008148193359375,
0.0426025390625,
0.049713134765625,
0.0003693103790283203,
0.00627899169921875,
-0.0177154541015625,
0.055755615234375,
-0.059906005859375,
-0.043609619140625,
-0.048553466796875,
0.04071044921875,
-0.012786865234375,
-0.048919677734375,
0.05865478515625,
0.058380126953125,
0.041961669921875,
-0.01617431640625,
0.052642822265625,
-0.0256195068359375,
0.052337646484375,
-0.02423095703125,
0.053955078125,
-0.05902099609375,
-0.021575927734375,
-0.024993896484375,
-0.046112060546875,
-0.05157470703125,
0.06768798828125,
-0.039947509765625,
0.01580810546875,
0.048187255859375,
0.062744140625,
0.02490234375,
0.004611968994140625,
-0.0043792724609375,
0.015869140625,
0.006183624267578125,
0.062347412109375,
0.03094482421875,
-0.052734375,
0.031524658203125,
-0.04571533203125,
-0.018798828125,
-0.0210723876953125,
-0.058349609375,
-0.0474853515625,
-0.038299560546875,
-0.0474853515625,
-0.04339599609375,
-0.023956298828125,
0.05059814453125,
0.037628173828125,
-0.05914306640625,
-0.02264404296875,
0.008697509765625,
0.0189971923828125,
-0.008453369140625,
-0.01134490966796875,
0.05987548828125,
0.0038433074951171875,
-0.053253173828125,
-0.0154266357421875,
0.0223541259765625,
0.0296630859375,
0.0123443603515625,
-0.01561737060546875,
-0.0231781005859375,
0.003833770751953125,
0.039520263671875,
0.0201568603515625,
-0.0458984375,
-0.01422882080078125,
-0.005046844482421875,
-0.033721923828125,
0.043609619140625,
0.004619598388671875,
-0.049468994140625,
0.031219482421875,
0.0328369140625,
0.0186004638671875,
0.043731689453125,
0.01409912109375,
-0.0028285980224609375,
-0.02691650390625,
-0.0043487548828125,
-0.004497528076171875,
0.01210784912109375,
0.0176544189453125,
-0.031768798828125,
0.05792236328125,
0.036163330078125,
-0.04925537109375,
-0.055999755859375,
-0.01081085205078125,
-0.0975341796875,
-0.01160430908203125,
0.0721435546875,
-0.014617919921875,
-0.0498046875,
0.0042877197265625,
-0.02423095703125,
0.0086517333984375,
-0.03851318359375,
0.02734375,
0.06622314453125,
-0.0235595703125,
-0.003570556640625,
-0.05303955078125,
0.03228759765625,
0.0166778564453125,
-0.060821533203125,
-0.0137176513671875,
0.033660888671875,
0.03009033203125,
0.037872314453125,
0.0643310546875,
-0.0111236572265625,
0.004413604736328125,
0.019683837890625,
0.0191802978515625,
0.0179290771484375,
-0.001934051513671875,
0.00670623779296875,
0.0232391357421875,
-0.019256591796875,
-0.035614013671875
]
] |
marieke93/MiniLM-evidence-types | 2022-06-11T13:32:27.000Z | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | marieke93 | null | null | marieke93/MiniLM-evidence-types | 3 | 6,543,049 | transformers | 2022-06-07T14:19:25 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: MiniLM-evidence-types
results: []
---
# MiniLM-evidence-types
This model is a fine-tuned version of [microsoft/MiniLM-L12-H384-uncased](https://huggingface.co/microsoft/MiniLM-L12-H384-uncased) on the evidence types dataset.
It achieved the following results on the evaluation set:
- Loss: 1.8672
- Macro f1: 0.3726
- Weighted f1: 0.7030
- Accuracy: 0.7161
- Balanced accuracy: 0.3616
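Macro F1 averages the per-class F1 scores equally, while weighted F1 weights each class by its support; with heavy class imbalance the two can differ substantially, as in the 0.37 macro vs 0.70 weighted gap above. A minimal sketch of the difference, using made-up per-class scores and supports:

```python
def macro_f1(per_class_f1):
    """Unweighted mean of per-class F1 scores."""
    return sum(per_class_f1) / len(per_class_f1)

def weighted_f1(per_class_f1, supports):
    """Mean of per-class F1 scores weighted by class support."""
    total = sum(supports)
    return sum(f * s for f, s in zip(per_class_f1, supports)) / total

# Made-up example: one dominant class with high F1, rare classes with low F1.
f1_scores = [0.80, 0.20, 0.10]
supports = [900, 60, 40]

print(round(macro_f1(f1_scores), 3))                # 0.367
print(round(weighted_f1(f1_scores, supports), 3))   # 0.736
```

The weighted score is dominated by the well-represented class, which is why it can look much healthier than the macro score on imbalanced data.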
## Training and evaluation data
The dataset, as well as the code used to fine-tune this model, can be found in the GitHub repository [BA-Thesis-Information-Science-Persuasion-Strategies](https://github.com/mariekevdh/BA-Thesis-Information-Science-Persuasion-Strategies)
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
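With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-5 to zero over the total number of training steps (5000 here, from 20 epochs × 250 steps per epoch, per the table below). A minimal sketch of that schedule, assuming zero warmup steps (the warmup setting is not stated above):

```python
def linear_lr(step, total_steps=5000, base_lr=2e-5, warmup_steps=0):
    """Linear decay to zero after an optional warmup (illustrative)."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup_steps)

print(linear_lr(0))     # start of training
print(linear_lr(2500))  # halfway through
print(linear_lr(5000))  # end of training
```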
### Training results
| Training Loss | Epoch | Step | Validation Loss | Macro f1 | Weighted f1 | Accuracy | Balanced accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------:|:--------:|:-----------------:|
| 1.4106 | 1.0 | 250 | 1.2698 | 0.1966 | 0.6084 | 0.6735 | 0.2195 |
| 1.1437 | 2.0 | 500 | 1.0985 | 0.3484 | 0.6914 | 0.7116 | 0.3536 |
| 0.9714 | 3.0 | 750 | 1.0901 | 0.2606 | 0.6413 | 0.6446 | 0.2932 |
| 0.8382 | 4.0 | 1000 | 1.0197 | 0.2764 | 0.7024 | 0.7237 | 0.2783 |
| 0.7192 | 5.0 | 1250 | 1.0895 | 0.2847 | 0.6824 | 0.6963 | 0.2915 |
| 0.6249 | 6.0 | 1500 | 1.1296 | 0.3487 | 0.6888 | 0.6948 | 0.3377 |
| 0.5336 | 7.0 | 1750 | 1.1515 | 0.3591 | 0.6982 | 0.7024 | 0.3496 |
| 0.4694 | 8.0 | 2000 | 1.1962 | 0.3626 | 0.7185 | 0.7314 | 0.3415 |
| 0.4058 | 9.0 | 2250 | 1.3313 | 0.3121 | 0.6920 | 0.7085 | 0.3033 |
| 0.3746 | 10.0 | 2500 | 1.3993 | 0.3628 | 0.6976 | 0.7047 | 0.3495 |
| 0.3267 | 11.0 | 2750 | 1.5078 | 0.3560 | 0.6958 | 0.7055 | 0.3464 |
| 0.2939 | 12.0 | 3000 | 1.5875 | 0.3685 | 0.6968 | 0.7062 | 0.3514 |
| 0.2677 | 13.0 | 3250 | 1.6470 | 0.3606 | 0.6976 | 0.7070 | 0.3490 |
| 0.2425 | 14.0 | 3500 | 1.7164 | 0.3714 | 0.7069 | 0.7207 | 0.3551 |
| 0.2301 | 15.0 | 3750 | 1.8151 | 0.3597 | 0.6975 | 0.7123 | 0.3466 |
| 0.2268 | 16.0 | 4000 | 1.7838 | 0.3940 | 0.7034 | 0.7123 | 0.3869 |
| 0.201 | 17.0 | 4250 | 1.8328 | 0.3725 | 0.6964 | 0.7062 | 0.3704 |
| 0.1923 | 18.0 | 4500 | 1.8788 | 0.3708 | 0.7019 | 0.7154 | 0.3591 |
| 0.1795 | 19.0 | 4750 | 1.8574 | 0.3752 | 0.7031 | 0.7161 | 0.3619 |
| 0.1713 | 20.0 | 5000 | 1.8672 | 0.3726 | 0.7030 | 0.7161 | 0.3616 |
### Framework versions
- Transformers 4.19.2
- Pytorch 1.11.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1
| 3,742 | [
[
-0.039520263671875,
-0.038787841796875,
0.0187225341796875,
-0.0010738372802734375,
-0.0034332275390625,
-0.012451171875,
0.0037631988525390625,
-0.0081024169921875,
0.03173828125,
0.019744873046875,
-0.045654296875,
-0.050079345703125,
-0.0484619140625,
-0.0109710693359375,
-0.012786865234375,
0.0626220703125,
0.01385498046875,
-0.005878448486328125,
-0.01509857177734375,
-0.0156097412109375,
-0.0222625732421875,
-0.018402099609375,
-0.054351806640625,
-0.024078369140625,
0.0146026611328125,
0.036773681640625,
0.06201171875,
0.05059814453125,
0.029144287109375,
0.02423095703125,
-0.0208892822265625,
0.0119476318359375,
-0.02630615234375,
-0.047271728515625,
0.0018453598022460938,
-0.038330078125,
-0.0390625,
0.0061492919921875,
0.036163330078125,
0.036956787109375,
-0.00882720947265625,
0.041961669921875,
0.0092315673828125,
0.062347412109375,
-0.026519775390625,
0.0204315185546875,
-0.0288238525390625,
0.004199981689453125,
-0.0203857421875,
-0.0233154296875,
-0.007049560546875,
-0.0269012451171875,
0.01070404052734375,
-0.035125732421875,
0.034881591796875,
0.018951416015625,
0.09814453125,
0.01277923583984375,
-0.032196044921875,
-0.004947662353515625,
-0.02301025390625,
0.04266357421875,
-0.052398681640625,
0.01262664794921875,
0.036834716796875,
0.0083160400390625,
-0.0090179443359375,
-0.05364990234375,
-0.050994873046875,
0.0143280029296875,
-0.035247802734375,
0.0169830322265625,
-0.0250701904296875,
-0.019805908203125,
0.05035400390625,
0.04364013671875,
-0.050750732421875,
-0.0028476715087890625,
-0.033355712890625,
-0.0199127197265625,
0.04766845703125,
0.0341796875,
0.0123291015625,
-0.043304443359375,
-0.032073974609375,
-0.012115478515625,
-0.03570556640625,
0.044769287109375,
0.03497314453125,
0.0160980224609375,
-0.037384033203125,
0.040924072265625,
-0.01016998291015625,
0.044708251953125,
0.0134124755859375,
-0.0153045654296875,
0.044036865234375,
-0.035980224609375,
-0.0255889892578125,
-0.007015228271484375,
0.061004638671875,
0.042236328125,
-0.01280975341796875,
0.025665283203125,
-0.00403594970703125,
0.0030460357666015625,
0.00015151500701904297,
-0.06561279296875,
-0.026641845703125,
0.030609130859375,
-0.05145263671875,
-0.0288848876953125,
0.0021610260009765625,
-0.061920166015625,
0.0005955696105957031,
-0.0242156982421875,
0.0240020751953125,
-0.020050048828125,
-0.02789306640625,
-0.004566192626953125,
-0.01117706298828125,
0.02056884765625,
0.0191497802734375,
-0.0716552734375,
0.0216217041015625,
0.02947998046875,
0.062103271484375,
-0.0032024383544921875,
0.0006690025329589844,
-0.00016558170318603516,
0.017822265625,
-0.026580810546875,
0.058135986328125,
-0.0132598876953125,
-0.031890869140625,
-0.0134735107421875,
0.0191497802734375,
-0.0179595947265625,
-0.028564453125,
0.04534912109375,
-0.020172119140625,
0.027069091796875,
-0.0292205810546875,
-0.036712646484375,
-0.01476287841796875,
0.0323486328125,
-0.05078125,
0.093017578125,
0.01166534423828125,
-0.08111572265625,
0.041259765625,
-0.056610107421875,
0.005107879638671875,
-0.007598876953125,
-0.012420654296875,
-0.0648193359375,
-0.0152435302734375,
0.0192718505859375,
0.01477813720703125,
-0.0270233154296875,
0.0178680419921875,
-0.006122589111328125,
-0.0160369873046875,
-0.014923095703125,
-0.023284912109375,
0.0885009765625,
0.01543426513671875,
-0.0465087890625,
0.020263671875,
-0.08343505859375,
0.01381683349609375,
0.0197296142578125,
-0.02874755859375,
-0.0054931640625,
-0.015380859375,
0.00798797607421875,
0.0213165283203125,
0.022125244140625,
-0.035308837890625,
0.018341064453125,
-0.0180816650390625,
0.03076171875,
0.04766845703125,
0.0095062255859375,
0.025726318359375,
-0.0467529296875,
0.02471923828125,
0.0278167724609375,
0.02252197265625,
0.01064300537109375,
-0.04290771484375,
-0.06414794921875,
-0.033203125,
0.00850677490234375,
0.036407470703125,
-0.0257110595703125,
0.03857421875,
-0.017486572265625,
-0.04742431640625,
-0.036865234375,
-0.00446319580078125,
0.016265869140625,
0.05303955078125,
0.0282440185546875,
-0.0010986328125,
-0.04052734375,
-0.076416015625,
-0.008026123046875,
-0.00870513916015625,
0.01959228515625,
0.03118896484375,
0.0670166015625,
-0.0240020751953125,
0.085205078125,
-0.0445556640625,
-0.03997802734375,
-0.005970001220703125,
0.0017719268798828125,
0.046630859375,
0.046478271484375,
0.062347412109375,
-0.05450439453125,
-0.047149658203125,
-0.00928497314453125,
-0.043609619140625,
0.0223541259765625,
-0.0083770751953125,
-0.0074615478515625,
0.0037517547607421875,
0.006549835205078125,
-0.042999267578125,
0.07110595703125,
0.036407470703125,
-0.04632568359375,
0.061309814453125,
-0.02728271484375,
0.0160980224609375,
-0.0748291015625,
0.0216522216796875,
-0.0064544677734375,
-0.01885986328125,
-0.0289154052734375,
-0.0132293701171875,
0.0078582763671875,
-0.0103759765625,
-0.0218353271484375,
0.048919677734375,
-0.0347900390625,
0.0102081298828125,
-0.0001653432846069336,
-0.00659942626953125,
0.0018815994262695312,
0.043060302734375,
0.005748748779296875,
0.06591796875,
0.054351806640625,
-0.0380859375,
0.01174163818359375,
0.02166748046875,
-0.039215087890625,
0.044189453125,
-0.043670654296875,
0.0005998611450195312,
-0.004100799560546875,
-0.0028591156005859375,
-0.083984375,
-0.0177001953125,
0.023193359375,
-0.02996826171875,
0.014251708984375,
-0.00951385498046875,
-0.0207061767578125,
-0.0614013671875,
-0.027130126953125,
0.0015249252319335938,
0.027374267578125,
-0.02960205078125,
0.04248046875,
0.0141754150390625,
0.01296234130859375,
-0.037628173828125,
-0.05902099609375,
-0.004306793212890625,
-0.01184844970703125,
-0.0596923828125,
0.026702880859375,
-0.01009368896484375,
-0.0110626220703125,
0.0084228515625,
-0.004299163818359375,
-0.0006585121154785156,
-0.0021953582763671875,
0.025848388671875,
0.0172882080078125,
-0.013885498046875,
-0.017242431640625,
-0.0083160400390625,
-0.0283966064453125,
-0.00013399124145507812,
0.0033168792724609375,
0.03240966796875,
-0.02239990234375,
-0.042510986328125,
-0.04833984375,
0.0110931396484375,
0.0430908203125,
-0.01509857177734375,
0.08099365234375,
0.0386962890625,
-0.0267791748046875,
0.00873565673828125,
-0.037139892578125,
-0.01377105712890625,
-0.031951904296875,
0.01543426513671875,
-0.044097900390625,
-0.0526123046875,
0.05767822265625,
-0.00878143310546875,
0.01507568359375,
0.06439208984375,
0.042633056640625,
-0.0017271041870117188,
0.07525634765625,
0.0238800048828125,
-0.0017528533935546875,
0.02362060546875,
-0.06414794921875,
0.007556915283203125,
-0.0538330078125,
-0.045013427734375,
-0.040924072265625,
-0.02691650390625,
-0.032958984375,
-0.00872039794921875,
0.0234527587890625,
0.00045108795166015625,
-0.0390625,
0.0144805908203125,
-0.05865478515625,
0.0112152099609375,
0.06109619140625,
0.0225982666015625,
0.01114654541015625,
-0.0018281936645507812,
-0.0303955078125,
-0.0184173583984375,
-0.05426025390625,
-0.043487548828125,
0.08917236328125,
0.0165863037109375,
0.0360107421875,
0.020965576171875,
0.05865478515625,
0.0256805419921875,
0.0168609619140625,
-0.038055419921875,
0.019500732421875,
0.00980377197265625,
-0.0682373046875,
-0.01593017578125,
-0.0212249755859375,
-0.0704345703125,
0.0310516357421875,
-0.0247955322265625,
-0.0596923828125,
0.05328369140625,
0.0252838134765625,
-0.0367431640625,
0.039520263671875,
-0.0450439453125,
0.07025146484375,
-0.007152557373046875,
-0.044036865234375,
-0.007633209228515625,
-0.047119140625,
0.0258026123046875,
0.0081024169921875,
0.02288818359375,
-0.0105133056640625,
0.00786590576171875,
0.06378173828125,
-0.058868408203125,
0.040771484375,
-0.01837158203125,
0.0221405029296875,
0.03326416015625,
-0.00800323486328125,
0.05792236328125,
0.0133209228515625,
-0.0203399658203125,
-0.003955841064453125,
0.0223846435546875,
-0.046630859375,
-0.0216522216796875,
0.05670166015625,
-0.07501220703125,
-0.06182861328125,
-0.05059814453125,
-0.02972412109375,
0.01617431640625,
0.0283660888671875,
0.03753662109375,
0.04144287109375,
0.0025634765625,
0.019927978515625,
0.046112060546875,
0.0034580230712890625,
0.0413818359375,
0.034881591796875,
-0.01107025146484375,
-0.056304931640625,
0.0516357421875,
0.00850677490234375,
0.0197296142578125,
0.003322601318359375,
0.0209503173828125,
-0.036285400390625,
-0.034637451171875,
-0.024139404296875,
0.00946044921875,
-0.0310516357421875,
-0.0232696533203125,
-0.052398681640625,
-0.0110015869140625,
-0.05303955078125,
-0.016143798828125,
-0.02630615234375,
-0.01396942138671875,
-0.04388427734375,
-0.0179595947265625,
0.042236328125,
0.040374755859375,
-0.01232147216796875,
0.025177001953125,
-0.038543701171875,
0.01209259033203125,
0.01263427734375,
0.0017213821411132812,
0.00524139404296875,
-0.044036865234375,
-0.01488494873046875,
0.006961822509765625,
-0.02960205078125,
-0.056732177734375,
0.053131103515625,
-0.0026226043701171875,
0.045654296875,
0.049468994140625,
0.0009083747863769531,
0.078125,
-0.0135650634765625,
0.062255859375,
0.0302276611328125,
-0.055084228515625,
0.044769287109375,
-0.01611328125,
0.0202789306640625,
0.0557861328125,
0.03076171875,
-0.028839111328125,
-0.0124969482421875,
-0.08087158203125,
-0.05712890625,
0.061859130859375,
0.0246124267578125,
-0.01103973388671875,
-0.0005545616149902344,
0.0166015625,
-0.0231170654296875,
0.0219268798828125,
-0.0687255859375,
-0.06329345703125,
-0.01172637939453125,
-0.0086212158203125,
0.0094451904296875,
-0.0278167724609375,
-0.029144287109375,
-0.049652099609375,
0.04791259765625,
0.01042938232421875,
0.023834228515625,
0.023284912109375,
0.01064300537109375,
-0.00400543212890625,
0.01219940185546875,
0.052093505859375,
0.061004638671875,
-0.03900146484375,
0.01348114013671875,
0.00788116455078125,
-0.034637451171875,
0.0113372802734375,
0.0025386810302734375,
-0.035400390625,
0.0032825469970703125,
0.0256805419921875,
0.050994873046875,
0.0011377334594726562,
0.004467010498046875,
0.0462646484375,
0.0048980712890625,
-0.03472900390625,
-0.039154052734375,
-0.006504058837890625,
0.00820159912109375,
0.01654052734375,
0.031005859375,
0.037750244140625,
0.00579833984375,
-0.04437255859375,
0.012298583984375,
0.036834716796875,
-0.039306640625,
-0.0011234283447265625,
0.0670166015625,
0.0011034011840820312,
-0.0205535888671875,
0.034942626953125,
-0.0119781494140625,
-0.0411376953125,
0.0626220703125,
0.034088134765625,
0.0396728515625,
-0.01160430908203125,
0.0011196136474609375,
0.06591796875,
0.03155517578125,
0.0009021759033203125,
0.039642333984375,
0.00957489013671875,
-0.0225677490234375,
0.01396942138671875,
-0.0435791015625,
-0.0096435546875,
0.019256591796875,
-0.058197021484375,
0.02978515625,
-0.0267791748046875,
-0.03997802734375,
-0.01251983642578125,
0.032806396484375,
-0.06787109375,
0.0255889892578125,
-0.00925445556640625,
0.0899658203125,
-0.07177734375,
0.0557861328125,
0.04364013671875,
-0.045196533203125,
-0.0794677734375,
-0.038299560546875,
0.00954437255859375,
-0.052459716796875,
0.047821044921875,
0.0035305023193359375,
0.0134124755859375,
0.01087188720703125,
-0.04400634765625,
-0.08392333984375,
0.1007080078125,
-0.0076904296875,
-0.049224853515625,
0.01419830322265625,
0.01006317138671875,
0.0302886962890625,
-0.016204833984375,
0.04278564453125,
0.040618896484375,
0.044769287109375,
0.00997161865234375,
-0.05340576171875,
0.0194854736328125,
-0.0308685302734375,
-0.0012254714965820312,
0.00852203369140625,
-0.0626220703125,
0.08154296875,
-0.01552581787109375,
0.002685546875,
0.0023670196533203125,
0.052093505859375,
0.0311737060546875,
0.0241241455078125,
0.032073974609375,
0.07275390625,
0.06878662109375,
-0.02362060546875,
0.07470703125,
-0.033721923828125,
0.056610107421875,
0.07183837890625,
0.01270294189453125,
0.05474853515625,
0.0352783203125,
-0.04278564453125,
0.041046142578125,
0.06689453125,
-0.01229095458984375,
0.0379638671875,
-0.0003409385681152344,
-0.02203369140625,
-0.0136871337890625,
0.01416015625,
-0.053741455078125,
0.0021724700927734375,
0.0214996337890625,
-0.0411376953125,
-0.01422119140625,
-0.01053619384765625,
0.006832122802734375,
-0.006916046142578125,
-0.035675048828125,
0.037933349609375,
-0.01377105712890625,
-0.02166748046875,
0.03204345703125,
-0.00836181640625,
0.049072265625,
-0.046630859375,
0.0017566680908203125,
-0.00839996337890625,
0.0306549072265625,
-0.05291748046875,
-0.065185546875,
0.01959228515625,
-0.013946533203125,
-0.02728271484375,
0.002132415771484375,
0.0386962890625,
-0.0180816650390625,
-0.0472412109375,
0.00290679931640625,
0.017822265625,
0.005107879638671875,
0.0102996826171875,
-0.0712890625,
-0.01450347900390625,
0.0176849365234375,
-0.044281005859375,
0.0137939453125,
0.03204345703125,
0.0029888153076171875,
0.032135009765625,
0.06353759765625,
0.005619049072265625,
0.01309967041015625,
-0.01178741455078125,
0.0850830078125,
-0.047149658203125,
-0.03961181640625,
-0.059051513671875,
0.039093017578125,
-0.0296783447265625,
-0.038787841796875,
0.07000732421875,
0.068603515625,
0.033721923828125,
-0.006755828857421875,
0.041259765625,
-0.030120849609375,
0.04315185546875,
-0.020263671875,
0.054046630859375,
-0.05609130859375,
-0.005680084228515625,
-0.01328277587890625,
-0.047698974609375,
-0.02069091796875,
0.04913330078125,
-0.050933837890625,
0.001369476318359375,
0.04534912109375,
0.0745849609375,
0.010406494140625,
0.004894256591796875,
0.00811004638671875,
0.0034351348876953125,
0.0212554931640625,
0.036346435546875,
0.03271484375,
-0.0450439453125,
0.03961181640625,
-0.045196533203125,
-0.01139068603515625,
-0.0195770263671875,
-0.053924560546875,
-0.062103271484375,
-0.0257568359375,
-0.0450439453125,
-0.04150390625,
-0.017730712890625,
0.07177734375,
0.056488037109375,
-0.06512451171875,
-0.019287109375,
-0.00412750244140625,
0.00873565673828125,
-0.0214691162109375,
-0.017364501953125,
0.07537841796875,
0.002010345458984375,
-0.04791259765625,
-0.00978851318359375,
0.00865936279296875,
0.025787353515625,
-0.004436492919921875,
-0.0081329345703125,
-0.0293731689453125,
-0.0016927719116210938,
0.03363037109375,
0.019927978515625,
-0.04852294921875,
-0.018524169921875,
-0.01015472412109375,
-0.0281982421875,
0.026275634765625,
0.024749755859375,
-0.038116455078125,
0.030029296875,
0.031463623046875,
0.0181427001953125,
0.059173583984375,
0.007598876953125,
-0.0030612945556640625,
-0.03076171875,
0.021270751953125,
0.00020325183868408203,
0.03143310546875,
0.0017595291137695312,
-0.03619384765625,
0.043731689453125,
0.04376220703125,
-0.046661376953125,
-0.05670166015625,
-0.015777587890625,
-0.08404541015625,
0.005702972412109375,
0.07269287109375,
-0.00971221923828125,
-0.039398193359375,
-0.00750732421875,
-0.02447509765625,
0.001613616943359375,
-0.034942626953125,
0.03204345703125,
0.05108642578125,
-0.0269927978515625,
-0.01001739501953125,
-0.042388916015625,
0.0478515625,
0.00794219970703125,
-0.057403564453125,
-0.0193634033203125,
0.0230560302734375,
0.03131103515625,
0.025421142578125,
0.0516357421875,
-0.0082550048828125,
0.0132598876953125,
0.024688720703125,
0.01300811767578125,
-0.01488494873046875,
0.0045166015625,
-0.00656890869140625,
0.0188446044921875,
-0.0010852813720703125,
-0.0267791748046875
]
] |
Ashishkr/query_wellformedness_score | 2023-10-13T16:08:29.000Z | [
"transformers",
"pytorch",
"jax",
"safetensors",
"roberta",
"text-classification",
"dataset:google_wellformed_query",
"license:apache-2.0",
"region:us"
] | text-classification | Ashishkr | null | null | Ashishkr/query_wellformedness_score | 13 | 6,540,192 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
inference: false
datasets: google_wellformed_query
---
**Model name**: Query Wellformedness Scoring
**Description**: Evaluates the well-formedness of sentences by checking grammatical correctness and completeness. The model is case-sensitive and penalizes sentences for incorrect grammar and casing.
**Features**:
- *Wellformedness Score*: Provides a score indicating grammatical correctness and completeness.
- *Case Sensitivity*: Recognizes and penalizes incorrect casing in sentences.
- *Broad Applicability*: Can be used on a wide range of sentences.
**Example**:
1. Dogs are mammals.
2. she loves to read books on history.
3. When the rain in Spain.
4. Eating apples are healthy for you.
5. The Eiffel Tower is in Paris.
Among these sentences:

- Sentences 1 and 5 are well-formed, with correct grammar and casing.
- Sentence 2 starts with a lowercase letter.
- Sentence 3 is a fragment and is not well-formed.
- Sentence 4 contains a subject-verb agreement error.
**Example usage** (HuggingFace `transformers`):
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Ashishkr/query_wellformedness_score")
model = AutoModelForSequenceClassification.from_pretrained("Ashishkr/query_wellformedness_score")
model.eval()

sentences = [
    "The quarterly financial report are showing an increase.",  # Incorrect
    "Him has completed the audit for last fiscal year.",  # Incorrect
    "Please to inform the board about the recent developments.",  # Incorrect
    "The team successfully achieved all its targets for the last quarter.",  # Correct
    "Our company is exploring new ventures in the European market."  # Correct
]

# Tokenize the batch and score it without tracking gradients.
features = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    scores = model(**features).logits
print(scores)
```
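The snippet above prints raw logits. If the classification head is a single-output regression head trained on the continuous `google_wellformed_query` labels (an assumption — check `model.config.num_labels` before relying on it), a minimal pure-Python post-processing step could look like:

```python
def wellformedness_scores(logits):
    # logits: nested list from model(**features).logits.tolist(),
    # one single-element row per sentence (assumed regression head).
    # Clamp each value into [0, 1] for readability, since the
    # training labels are ratings in that range.
    return [min(max(row[0], 0.0), 1.0) for row in logits]

# Example with mock logits shaped like a 3-sentence batch.
print(wellformedness_scores([[0.73], [1.2], [-0.1]]))  # → [0.73, 1.0, 0.0]
```

If the model instead exposes multiple labels, a softmax over the label dimension would be the appropriate post-processing instead of clamping.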
**Intended Use Cases**

- *Content Creation*: Validate the well-formedness of written content.
- *Educational Platforms*: Help students check the grammaticality of their sentences.
- *Chatbots & Virtual Assistants*: Validate user queries or generate well-formed responses.
contact: kua613@g.harvard.edu
| 2,209 | [
[
-0.00690460205078125,
-0.078369140625,
0.01397705078125,
0.04833984375,
-0.00553131103515625,
-0.00585174560546875,
-0.002742767333984375,
-0.0077056884765625,
0.01849365234375,
0.0201263427734375,
-0.040130615234375,
-0.0606689453125,
-0.042572021484375,
0.005535125732421875,
-0.0362548828125,
0.0865478515625,
-0.0037860870361328125,
-0.0222930908203125,
-0.0273590087890625,
-0.007537841796875,
-0.02569580078125,
-0.044769287109375,
-0.0109405517578125,
-0.0252685546875,
0.0073089599609375,
0.0301666259765625,
0.0280303955078125,
0.0258026123046875,
0.05462646484375,
0.03729248046875,
0.0009107589721679688,
0.002777099609375,
-0.035552978515625,
0.0191802978515625,
-0.005626678466796875,
-0.037322998046875,
-0.055145263671875,
0.02178955078125,
0.04241943359375,
0.025909423828125,
-0.0133514404296875,
0.025146484375,
0.01983642578125,
0.03900146484375,
-0.03082275390625,
0.02301025390625,
-0.035797119140625,
0.0005025863647460938,
-0.0206298828125,
0.005008697509765625,
-0.026763916015625,
-0.02587890625,
0.01361846923828125,
-0.032379150390625,
0.028594970703125,
0.014617919921875,
0.09185791015625,
0.02642822265625,
-0.0113525390625,
-0.0221099853515625,
-0.0210418701171875,
0.059967041015625,
-0.0513916015625,
0.00882720947265625,
0.031707763671875,
0.0169219970703125,
-0.01568603515625,
-0.0706787109375,
-0.0491943359375,
-0.01519012451171875,
-0.0297088623046875,
0.03631591796875,
-0.0202484130859375,
0.01910400390625,
0.0386962890625,
0.04290771484375,
-0.06829833984375,
-0.01491546630859375,
-0.027984619140625,
-0.038421630859375,
0.05145263671875,
0.0014524459838867188,
0.0174407958984375,
-0.054168701171875,
-0.02862548828125,
-0.014312744140625,
-0.020355224609375,
0.0256500244140625,
0.030853271484375,
0.0158538818359375,
-0.0213470458984375,
0.06298828125,
-0.03460693359375,
0.0439453125,
0.0176239013671875,
0.010345458984375,
0.04022216796875,
-0.0255126953125,
-0.0474853515625,
0.0101470947265625,
0.05621337890625,
0.01444244384765625,
0.0252838134765625,
-0.000016570091247558594,
-0.01401519775390625,
0.027587890625,
0.0163726806640625,
-0.0543212890625,
-0.040191650390625,
0.03472900390625,
-0.0382080078125,
-0.01229095458984375,
0.032562255859375,
-0.03851318359375,
0.0148468017578125,
0.0042724609375,
0.0406494140625,
-0.030059814453125,
-0.00730133056640625,
0.024261474609375,
-0.0113525390625,
0.00647735595703125,
0.00843048095703125,
-0.05999755859375,
0.0149993896484375,
0.0341796875,
0.053955078125,
0.0016231536865234375,
-0.040069580078125,
-0.04296875,
-0.01195526123046875,
-0.0018415451049804688,
0.05462646484375,
-0.03271484375,
-0.0175323486328125,
-0.00341796875,
0.0168914794921875,
-0.0247650146484375,
-0.040252685546875,
0.06884765625,
-0.021820068359375,
0.048675537109375,
0.00939178466796875,
-0.059234619140625,
-0.006549835205078125,
0.04583740234375,
-0.0386962890625,
0.106201171875,
0.0264739990234375,
-0.037933349609375,
0.0382080078125,
-0.041259765625,
-0.03173828125,
0.0018625259399414062,
0.01120758056640625,
-0.05462646484375,
0.0011444091796875,
0.019561767578125,
0.060791015625,
0.026641845703125,
0.007701873779296875,
-0.0101776123046875,
-0.01261138916015625,
0.00424957275390625,
-0.0122222900390625,
0.04443359375,
0.003391265869140625,
-0.023529052734375,
0.0212249755859375,
-0.060272216796875,
0.0263824462890625,
-0.0032100677490234375,
-0.038330078125,
-0.01438140869140625,
-0.01348114013671875,
0.0298614501953125,
0.01715087890625,
0.01540374755859375,
-0.0297698974609375,
0.032135009765625,
-0.059539794921875,
0.027923583984375,
0.028717041015625,
-0.00145721435546875,
0.014129638671875,
-0.0208282470703125,
0.03167724609375,
-0.004833221435546875,
0.0162506103515625,
-0.01049041748046875,
-0.03436279296875,
-0.0718994140625,
-0.03021240234375,
0.019012451171875,
0.0504150390625,
-0.0335693359375,
0.07012939453125,
0.0004627704620361328,
-0.048583984375,
-0.03466796875,
0.016082763671875,
0.04071044921875,
0.036224365234375,
0.028594970703125,
-0.01050567626953125,
-0.050750732421875,
-0.0572509765625,
-0.0164031982421875,
-0.0210113525390625,
0.0221710205078125,
-0.006988525390625,
0.049560546875,
0.0004112720489501953,
0.06243896484375,
-0.02459716796875,
-0.0303192138671875,
-0.0182952880859375,
0.0116119384765625,
0.04046630859375,
0.05072021484375,
0.045654296875,
-0.047576904296875,
-0.033294677734375,
-0.037017822265625,
-0.05828857421875,
0.01387786865234375,
0.0031585693359375,
-0.01397705078125,
0.00797271728515625,
0.0184478759765625,
-0.076416015625,
0.03179931640625,
0.042755126953125,
-0.06195068359375,
0.041015625,
-0.0205841064453125,
0.0268096923828125,
-0.08343505859375,
-0.00609588623046875,
0.0159912109375,
-0.035736083984375,
-0.01861572265625,
0.005893707275390625,
-0.01239776611328125,
0.0162811279296875,
-0.02825927734375,
0.0301666259765625,
-0.0284423828125,
0.026397705078125,
-0.00876617431640625,
0.0162811279296875,
0.0170440673828125,
0.01910400390625,
-0.032623291015625,
0.047515869140625,
0.043212890625,
-0.0574951171875,
0.053314208984375,
0.031829833984375,
-0.025390625,
0.04547119140625,
-0.06201171875,
0.0011701583862304688,
-0.0034465789794921875,
0.0013017654418945312,
-0.0955810546875,
-0.036407470703125,
-0.001979827880859375,
-0.05621337890625,
0.01290130615234375,
0.01019287109375,
-0.043426513671875,
-0.041717529296875,
-0.0228729248046875,
0.0201873779296875,
0.024200439453125,
-0.0207977294921875,
0.04364013671875,
0.01238250732421875,
0.0008878707885742188,
-0.0305023193359375,
-0.04736328125,
-0.014739990234375,
-0.0202484130859375,
-0.0606689453125,
0.032379150390625,
0.0026187896728515625,
-0.007476806640625,
0.00374603271484375,
-0.00121307373046875,
0.00848388671875,
0.009368896484375,
0.006549835205078125,
0.0247802734375,
-0.01552581787109375,
0.023529052734375,
-0.0137481689453125,
-0.005298614501953125,
0.003650665283203125,
-0.0078277587890625,
0.0511474609375,
-0.027008056640625,
-0.02880859375,
-0.03741455078125,
-0.004253387451171875,
0.033355712890625,
-0.013153076171875,
0.037689208984375,
0.0487060546875,
-0.046478271484375,
-0.008270263671875,
-0.056732177734375,
-0.01396942138671875,
-0.0341796875,
0.0413818359375,
-0.0226287841796875,
-0.069091796875,
0.0465087890625,
0.0145111083984375,
-0.0102691650390625,
0.08319091796875,
0.048675537109375,
-0.0218048095703125,
0.0716552734375,
0.048126220703125,
-0.01959228515625,
0.01947021484375,
-0.025726318359375,
0.0211944580078125,
-0.039398193359375,
-0.032257080078125,
-0.04119873046875,
-0.01708984375,
-0.059417724609375,
0.0023593902587890625,
0.015167236328125,
0.0180511474609375,
-0.026824951171875,
0.025146484375,
-0.0606689453125,
0.0257720947265625,
0.04718017578125,
-0.005855560302734375,
0.002552032470703125,
-0.0037994384765625,
-0.01129913330078125,
-0.0146484375,
-0.035186767578125,
-0.05792236328125,
0.057281494140625,
0.032928466796875,
0.055694580078125,
0.0193328857421875,
0.03802490234375,
0.0290374755859375,
0.001438140869140625,
-0.06005859375,
0.031524658203125,
-0.0167236328125,
-0.0601806640625,
-0.029998779296875,
-0.017547607421875,
-0.0616455078125,
-0.01108551025390625,
-0.0262298583984375,
-0.06597900390625,
-0.010650634765625,
0.01128387451171875,
-0.04296875,
-0.0188446044921875,
-0.061859130859375,
0.088134765625,
-0.0291748046875,
-0.032012939453125,
-0.00966644287109375,
-0.0299835205078125,
0.006702423095703125,
0.0286865234375,
-0.0204315185546875,
-0.01267242431640625,
0.024932861328125,
0.07257080078125,
-0.02435302734375,
0.0633544921875,
-0.0111083984375,
0.0310821533203125,
0.0241241455078125,
-0.0034847259521484375,
0.024383544921875,
0.0179595947265625,
-0.022216796875,
-0.0070037841796875,
0.0234375,
-0.029571533203125,
-0.05401611328125,
0.04608154296875,
-0.046630859375,
-0.03936767578125,
-0.055145263671875,
-0.01009368896484375,
-0.0025577545166015625,
0.023834228515625,
0.034759521484375,
0.021636962890625,
-0.0288543701171875,
0.00632476806640625,
0.0426025390625,
-0.0039520263671875,
0.0322265625,
0.0035858154296875,
-0.0224151611328125,
-0.0262451171875,
0.051055908203125,
-0.0025787353515625,
0.0010900497436523438,
0.004573822021484375,
0.007476806640625,
-0.0360107421875,
-0.0213775634765625,
-0.031585693359375,
0.0204315185546875,
-0.04119873046875,
-0.0226593017578125,
-0.0439453125,
-0.035614013671875,
-0.04962158203125,
0.007579803466796875,
-0.0188140869140625,
-0.0208587646484375,
-0.0158538818359375,
-0.033935546875,
0.02459716796875,
0.052490234375,
0.0153961181640625,
0.0290679931640625,
-0.05029296875,
0.00688934326171875,
0.0078277587890625,
0.03253173828125,
-0.0218048095703125,
-0.050384521484375,
-0.01031494140625,
-0.0261077880859375,
-0.037017822265625,
-0.05072021484375,
0.030517578125,
0.032073974609375,
0.0594482421875,
0.0259552001953125,
-0.004535675048828125,
0.046142578125,
-0.0150299072265625,
0.0731201171875,
0.006084442138671875,
-0.07733154296875,
0.068359375,
-0.0089569091796875,
0.018798828125,
0.05438232421875,
0.0213775634765625,
-0.0210113525390625,
-0.0333251953125,
-0.060577392578125,
-0.0616455078125,
0.0428466796875,
0.033050537109375,
0.0372314453125,
-0.005401611328125,
0.0347900390625,
-0.006023406982421875,
0.0161285400390625,
-0.0665283203125,
-0.0222320556640625,
-0.037384033203125,
-0.047210693359375,
-0.0016956329345703125,
-0.036773681640625,
0.004642486572265625,
-0.039276123046875,
0.07379150390625,
0.0020389556884765625,
0.01506805419921875,
0.033233642578125,
-0.0094451904296875,
0.01203155517578125,
0.0172576904296875,
0.032135009765625,
0.034698486328125,
-0.0204925537109375,
-0.0014371871948242188,
0.007320404052734375,
-0.049041748046875,
-0.0015239715576171875,
0.025360107421875,
0.00475311279296875,
0.0056610107421875,
0.04388427734375,
0.045928955078125,
-0.00463104248046875,
-0.0238494873046875,
0.04803466796875,
-0.00009149312973022461,
-0.01163482666015625,
-0.034881591796875,
0.0123748779296875,
-0.0112762451171875,
0.026824951171875,
0.0225067138671875,
-0.005680084228515625,
0.017913818359375,
-0.054901123046875,
0.02587890625,
0.02069091796875,
-0.0345458984375,
-0.01532745361328125,
0.035003662109375,
0.01189422607421875,
-0.016143798828125,
0.07928466796875,
-0.03564453125,
-0.05908203125,
0.049102783203125,
0.0404052734375,
0.04803466796875,
-0.018646240234375,
0.0194244384765625,
0.04443359375,
0.0294647216796875,
-0.01209259033203125,
0.03570556640625,
0.00997161865234375,
-0.0565185546875,
-0.0269012451171875,
-0.061798095703125,
-0.0032806396484375,
0.0013723373413085938,
-0.033172607421875,
0.03656005859375,
-0.033050537109375,
-0.038543701171875,
-0.004241943359375,
0.00579833984375,
-0.05572509765625,
0.0499267578125,
-0.0131683349609375,
0.06719970703125,
-0.06787109375,
0.0401611328125,
0.0748291015625,
-0.050689697265625,
-0.0823974609375,
0.0028285980224609375,
-0.0091552734375,
-0.039337158203125,
0.054107666015625,
0.03521728515625,
0.0030612945556640625,
0.007137298583984375,
-0.036712646484375,
-0.05828857421875,
0.058013916015625,
0.0184478759765625,
-0.0299530029296875,
-0.0035114288330078125,
0.002887725830078125,
0.04669189453125,
-0.01323699951171875,
0.042449951171875,
0.04327392578125,
0.035064697265625,
-0.003467559814453125,
-0.07855224609375,
0.0010967254638671875,
-0.03668212890625,
-0.0297088623046875,
-0.0160675048828125,
-0.05804443359375,
0.0655517578125,
0.0003666877746582031,
-0.00799560546875,
0.0117950439453125,
0.058013916015625,
0.02105712890625,
0.034820556640625,
0.019989013671875,
0.052581787109375,
0.07525634765625,
-0.03961181640625,
0.0712890625,
-0.01277923583984375,
0.06427001953125,
0.048126220703125,
0.0037670135498046875,
0.060455322265625,
0.019561767578125,
-0.01641845703125,
0.05242919921875,
0.06060791015625,
-0.022064208984375,
0.038299560546875,
0.019866943359375,
0.005512237548828125,
-0.0294952392578125,
0.0164337158203125,
-0.01288604736328125,
0.06890869140625,
0.01427459716796875,
-0.024322509765625,
-0.00750732421875,
0.01078033447265625,
-0.003849029541015625,
0.021453857421875,
-0.00888824462890625,
0.0404052734375,
0.0032024383544921875,
-0.052642822265625,
0.0254669189453125,
0.01042938232421875,
0.061126708984375,
-0.033843994140625,
-0.014404296875,
-0.00669097900390625,
0.02716064453125,
-0.053466796875,
-0.063232421875,
0.03350830078125,
0.00215911865234375,
-0.0237274169921875,
0.006023406982421875,
0.05023193359375,
-0.01114654541015625,
-0.06365966796875,
0.006153106689453125,
0.04241943359375,
0.0213470458984375,
-0.01153564453125,
-0.061920166015625,
0.0014476776123046875,
-0.0189208984375,
-0.02691650390625,
-0.0222930908203125,
0.031402587890625,
0.007415771484375,
0.033355712890625,
0.0197296142578125,
0.0012493133544921875,
-0.00888824462890625,
0.0255279541015625,
0.046356201171875,
-0.055816650390625,
-0.0290374755859375,
-0.08331298828125,
0.055206298828125,
-0.027679443359375,
-0.0556640625,
0.06805419921875,
0.07159423828125,
0.06414794921875,
0.01220703125,
0.05108642578125,
-0.01302337646484375,
0.036346435546875,
-0.061767578125,
0.07440185546875,
-0.03271484375,
-0.0028972625732421875,
-0.030914306640625,
-0.08721923828125,
-0.0399169921875,
0.07513427734375,
-0.035400390625,
0.033233642578125,
0.05841064453125,
0.04840087890625,
-0.0196533203125,
-0.01105499267578125,
0.0175018310546875,
0.0281219482421875,
0.0173797607421875,
0.031768798828125,
0.0487060546875,
-0.054168701171875,
0.014434814453125,
-0.0194244384765625,
-0.0207672119140625,
-0.0213470458984375,
-0.05572509765625,
-0.08477783203125,
-0.046112060546875,
-0.04254150390625,
-0.05047607421875,
0.0020751953125,
0.073974609375,
0.055145263671875,
-0.07147216796875,
-0.0014753341674804688,
0.005207061767578125,
-0.00626373291015625,
-0.0069580078125,
-0.024749755859375,
0.04840087890625,
-0.040252685546875,
-0.051666259765625,
0.00876617431640625,
0.01161956787109375,
-0.004268646240234375,
-0.0302886962890625,
0.01019287109375,
-0.0181732177734375,
-0.0073089599609375,
0.044647216796875,
-0.005413055419921875,
-0.040863037109375,
-0.032806396484375,
0.01922607421875,
-0.032196044921875,
0.0094451904296875,
0.0175323486328125,
-0.05157470703125,
0.02960205078125,
0.0246734619140625,
0.030517578125,
0.043212890625,
0.01102447509765625,
0.0155029296875,
-0.050689697265625,
0.0084228515625,
0.0240936279296875,
0.03948974609375,
0.0460205078125,
-0.04510498046875,
0.02496337890625,
0.02545166015625,
-0.0499267578125,
-0.033050537109375,
-0.005767822265625,
-0.08758544921875,
0.010101318359375,
0.0770263671875,
-0.01358795166015625,
-0.032501220703125,
0.00243377685546875,
-0.0220489501953125,
0.02947998046875,
-0.0259857177734375,
0.0809326171875,
0.0806884765625,
-0.0260772705078125,
0.00701904296875,
-0.02777099609375,
0.036376953125,
0.03564453125,
-0.057586669921875,
-0.029876708984375,
0.0299835205078125,
0.0294647216796875,
0.020660400390625,
0.03692626953125,
-0.00133514404296875,
0.03265380859375,
-0.0181121826171875,
0.0146026611328125,
0.0009946823120117188,
-0.006500244140625,
-0.006458282470703125,
0.0166778564453125,
0.013214111328125,
-0.0229034423828125
]
] |
benjamin/wtp-canine-s-1l | 2023-05-31T09:10:23.000Z | [
"transformers",
"pytorch",
"la-canine",
"token-classification",
"multilingual",
"am",
"ar",
"az",
"be",
"bg",
"bn",
"ca",
"ceb",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hu",
"hy",
"id",
"ig",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"mt",
"my",
"ne",
"nl",
"no",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"si",
"sk",
"sl",
"sq",
"sr",
"sv",
"ta",
"te",
"tg",
"th",
"tr",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"yo",
"zh",
"zu",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | benjamin | null | null | benjamin/wtp-canine-s-1l | 0 | 6,521,575 | transformers | 2023-05-10T20:48:35 | ---
license: mit
language:
- multilingual
- am
- ar
- az
- be
- bg
- bn
- ca
- ceb
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hu
- hy
- id
- ig
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- no
- pa
- pl
- ps
- pt
- ro
- ru
- si
- sk
- sl
- sq
- sr
- sv
- ta
- te
- tg
- th
- tr
- uk
- ur
- uz
- vi
- xh
- yi
- yo
- zh
- zu
---
# wtp-canine-s-1l
Model for [`wtpsplit`](https://github.com/bminixhofer/wtpsplit). | 551 | [
[
-0.0229034423828125,
-0.031951904296875,
0.01824951171875,
0.042236328125,
-0.032562255859375,
-0.00981903076171875,
0.020782470703125,
-0.01507568359375,
0.035614013671875,
0.032501220703125,
-0.060943603515625,
-0.0177154541015625,
-0.028472900390625,
-0.00534820556640625,
-0.042877197265625,
0.051910400390625,
0.025054931640625,
0.04071044921875,
0.0221099853515625,
-0.0143280029296875,
0.0254364013671875,
-0.001110076904296875,
-0.039031982421875,
-0.06890869140625,
0.062744140625,
0.01202392578125,
0.03857421875,
0.01392364501953125,
0.054046630859375,
0.0013942718505859375,
0.0001575946807861328,
-0.04437255859375,
-0.0137176513671875,
-0.00833892822265625,
-0.01334381103515625,
0.00157928466796875,
-0.052520751953125,
0.00994873046875,
0.0214385986328125,
0.0293731689453125,
-0.046722412109375,
0.04296875,
-0.0232391357421875,
0.048370361328125,
-0.0283050537109375,
-0.00926971435546875,
-0.02728271484375,
0.0257568359375,
-0.0206298828125,
-0.00036644935607910156,
-0.03033447265625,
-0.042449951171875,
0.01059722900390625,
-0.045196533203125,
0.005161285400390625,
-0.002777099609375,
0.07720947265625,
0.007358551025390625,
-0.0379638671875,
0.004425048828125,
-0.0411376953125,
0.02032470703125,
-0.043212890625,
0.0513916015625,
0.037506103515625,
0.04736328125,
-0.017547607421875,
-0.08526611328125,
-0.0290374755859375,
-0.029571533203125,
-0.01116943359375,
0.0023403167724609375,
-0.0230865478515625,
0.005489349365234375,
0.038604736328125,
0.017242431640625,
-0.05718994140625,
-0.009918212890625,
-0.06256103515625,
-0.00897979736328125,
0.0237274169921875,
0.007549285888671875,
0.007572174072265625,
0.012054443359375,
-0.0225372314453125,
0.031402587890625,
-0.057220458984375,
0.005771636962890625,
-0.0009746551513671875,
0.022308349609375,
-0.01410675048828125,
0.050506591796875,
-0.0248870849609375,
0.0260162353515625,
0.0279541015625,
0.00392913818359375,
0.0272369384765625,
0.01062774658203125,
-0.037078857421875,
0.0079803466796875,
0.028472900390625,
0.004039764404296875,
-0.004520416259765625,
-0.0054473876953125,
-0.019500732421875,
0.007297515869140625,
0.03839111328125,
-0.0772705078125,
-0.0479736328125,
0.022918701171875,
-0.06268310546875,
-0.034698486328125,
-0.0010852813720703125,
-0.01102447509765625,
-0.0153656005859375,
0.0026702880859375,
0.045135498046875,
-0.03570556640625,
-0.05072021484375,
0.0290679931640625,
-0.04400634765625,
0.03857421875,
0.01837158203125,
-0.062469482421875,
0.02252197265625,
0.0357666015625,
0.032745361328125,
0.0238800048828125,
-0.05023193359375,
-0.02471923828125,
0.018280029296875,
-0.01470947265625,
0.043426513671875,
-0.02679443359375,
-0.0210418701171875,
0.00426483154296875,
0.0200347900390625,
0.023040771484375,
0.0091552734375,
0.0215606689453125,
-0.043853759765625,
0.006641387939453125,
-0.0267333984375,
-0.0462646484375,
-0.00437164306640625,
0.048309326171875,
-0.070556640625,
0.07659912109375,
0.01251983642578125,
-0.06439208984375,
0.014892578125,
-0.08209228515625,
-0.0230865478515625,
0.037872314453125,
0.0188751220703125,
-0.02581787109375,
0.00272369384765625,
-0.0325927734375,
0.004199981689453125,
-0.013092041015625,
0.0178070068359375,
-0.039703369140625,
-0.01105499267578125,
0.00514984130859375,
0.01396942138671875,
0.07257080078125,
0.0189666748046875,
0.01108551025390625,
0.0201568603515625,
-0.059906005859375,
-0.01262664794921875,
0.03948974609375,
-0.004734039306640625,
-0.0212249755859375,
-0.0153350830078125,
0.00738525390625,
0.0208740234375,
0.03564453125,
-0.0545654296875,
0.033233642578125,
0.017333984375,
0.0328369140625,
0.020355224609375,
-0.004863739013671875,
0.040985107421875,
-0.038299560546875,
0.027984619140625,
-0.031707763671875,
0.06622314453125,
-0.00864410400390625,
-0.003383636474609375,
-0.040191650390625,
-0.0205078125,
0.032470703125,
0.03271484375,
-0.0259552001953125,
0.041015625,
0.017364501953125,
-0.0927734375,
-0.01561737060546875,
-0.031982421875,
0.0131988525390625,
0.024932861328125,
0.008514404296875,
-0.011322021484375,
-0.0235137939453125,
-0.072021484375,
0.0261383056640625,
-0.0204925537109375,
-0.0042266845703125,
-0.0286712646484375,
0.044158935546875,
-0.01513671875,
0.047149658203125,
-0.03753662109375,
0.0129852294921875,
-0.033355712890625,
0.004974365234375,
0.018402099609375,
0.055511474609375,
0.048126220703125,
-0.037933349609375,
-0.031768798828125,
-0.03369140625,
-0.03643798828125,
-0.01493072509765625,
0.01355743408203125,
-0.0279388427734375,
-0.0267791748046875,
0.00991058349609375,
-0.09564208984375,
0.045501708984375,
0.03717041015625,
-0.03192138671875,
0.08367919921875,
0.01038360595703125,
0.00881195068359375,
-0.057525634765625,
0.031768798828125,
0.0142669677734375,
-0.024383544921875,
-0.059417724609375,
0.023345947265625,
0.03472900390625,
-0.0233154296875,
-0.058929443359375,
0.0208740234375,
-0.0160675048828125,
-0.00970458984375,
0.003643035888671875,
-0.02252197265625,
0.01233673095703125,
0.0257415771484375,
0.020355224609375,
0.04144287109375,
0.0626220703125,
-0.01451873779296875,
0.033843994140625,
0.0396728515625,
0.02239990234375,
0.017730712890625,
-0.060028076171875,
-0.018524169921875,
0.004520416259765625,
0.042724609375,
-0.040985107421875,
-0.008880615234375,
0.014617919921875,
-0.0236358642578125,
-0.00025653839111328125,
-0.046875,
-0.06414794921875,
-0.041351318359375,
-0.0423583984375,
0.021575927734375,
0.041015625,
-0.042144775390625,
0.035369873046875,
0.032745361328125,
-0.00756072998046875,
0.0008630752563476562,
-0.03643798828125,
-0.01117706298828125,
-0.011322021484375,
-0.02923583984375,
0.019073486328125,
-0.0096282958984375,
0.00897979736328125,
-0.006496429443359375,
-0.0021648406982421875,
-0.035614013671875,
-0.024993896484375,
-0.00322723388671875,
0.031158447265625,
-0.028350830078125,
0.030242919921875,
-0.00014495849609375,
-0.01959228515625,
-0.003574371337890625,
-0.0192413330078125,
0.061370849609375,
-0.006565093994140625,
-0.026458740234375,
-0.033447265625,
0.03424072265625,
0.07525634765625,
-0.03338623046875,
0.0296478271484375,
0.046234130859375,
-0.05535888671875,
0.0008578300476074219,
-0.039703369140625,
-0.0263519287109375,
-0.035858154296875,
0.0083160400390625,
-0.033447265625,
-0.0457763671875,
0.032470703125,
-0.0247039794921875,
-0.00580596923828125,
0.0151214599609375,
0.02008056640625,
-0.0228729248046875,
0.06573486328125,
0.06585693359375,
0.0018014907836914062,
0.048126220703125,
-0.007076263427734375,
-0.0018444061279296875,
-0.04779052734375,
-0.00787353515625,
-0.035858154296875,
0.0132598876953125,
-0.03326416015625,
0.0005712509155273438,
0.004650115966796875,
0.018035888671875,
-0.0628662109375,
0.056396484375,
-0.025909423828125,
0.037994384765625,
0.05328369140625,
0.010650634765625,
-0.00292205810546875,
-0.0109405517578125,
-0.004840850830078125,
-0.01052093505859375,
-0.0443115234375,
-0.03924560546875,
0.056243896484375,
0.051727294921875,
0.06817626953125,
0.0165557861328125,
0.045135498046875,
0.032379150390625,
0.023895263671875,
-0.034149169921875,
0.0355224609375,
-0.0169830322265625,
-0.05096435546875,
-0.035552978515625,
0.01531982421875,
-0.06622314453125,
0.024932861328125,
-0.01364898681640625,
-0.0567626953125,
-0.034454345703125,
0.020599365234375,
0.0019893646240234375,
0.020111083984375,
-0.0223388671875,
0.0867919921875,
-0.0169219970703125,
-0.005352020263671875,
-0.0279388427734375,
-0.032745361328125,
0.03662109375,
0.0048675537109375,
-0.035430908203125,
-0.034027099609375,
0.027984619140625,
0.0228424072265625,
-0.055206298828125,
0.04046630859375,
-0.0260772705078125,
0.001239776611328125,
-0.00811767578125,
0.0187225341796875,
0.06280517578125,
0.01053619384765625,
-0.0038127899169921875,
0.00131988525390625,
0.0129547119140625,
-0.052734375,
-0.0188751220703125,
0.0180511474609375,
-0.021240234375,
0.002040863037109375,
-0.0322265625,
-0.00890350341796875,
0.00968170166015625,
0.042205810546875,
0.0186309814453125,
0.038482666015625,
-0.031494140625,
-0.01433563232421875,
0.04425048828125,
-0.0187530517578125,
0.025115966796875,
0.0911865234375,
-0.047149658203125,
-0.01134490966796875,
0.04669189453125,
-0.0167388916015625,
0.007297515869140625,
0.01531219482421875,
0.022857666015625,
-0.04351806640625,
-0.04046630859375,
-0.04351806640625,
0.039398193359375,
-0.027984619140625,
-0.02630615234375,
-0.030914306640625,
-0.05279541015625,
-0.0239410400390625,
-0.0206451416015625,
-0.054473876953125,
-0.049957275390625,
-0.0201568603515625,
-0.0267791748046875,
0.0217132568359375,
0.036956787109375,
-0.01103973388671875,
0.050384521484375,
-0.0792236328125,
0.0225677490234375,
0.021759033203125,
0.06707763671875,
-0.0235137939453125,
-0.023956298828125,
-0.0037746429443359375,
-0.0165557861328125,
-0.037628173828125,
-0.0538330078125,
0.0267333984375,
-0.0031890869140625,
0.056304931640625,
0.023101806640625,
0.0030841827392578125,
0.043487548828125,
-0.0528564453125,
0.0595703125,
0.0631103515625,
-0.07244873046875,
0.07171630859375,
-0.0455322265625,
0.03375244140625,
0.0523681640625,
0.028717041015625,
-0.055572509765625,
-0.054779052734375,
-0.05670166015625,
-0.05126953125,
0.03857421875,
0.0226593017578125,
-0.0200653076171875,
0.016845703125,
0.0003197193145751953,
0.0300140380859375,
0.0296478271484375,
-0.08734130859375,
-0.0325927734375,
-0.028350830078125,
0.002170562744140625,
0.03369140625,
-0.0247039794921875,
-0.01065826416015625,
-0.02899169921875,
0.051422119140625,
0.006877899169921875,
0.027069091796875,
0.004932403564453125,
-0.023284912109375,
-0.0275115966796875,
-0.0036563873291015625,
0.056121826171875,
0.0504150390625,
-0.05706787109375,
0.01206207275390625,
-0.005138397216796875,
-0.026092529296875,
0.00560760498046875,
0.0303955078125,
-0.007396697998046875,
0.005298614501953125,
0.035491943359375,
0.0295867919921875,
-0.0198822021484375,
-0.03375244140625,
0.044647216796875,
-0.03033447265625,
-0.0198211669921875,
-0.048126220703125,
-0.0010318756103515625,
0.0200653076171875,
0.028350830078125,
0.03753662109375,
-0.00708770751953125,
0.036651611328125,
-0.009796142578125,
0.041351318359375,
0.00974273681640625,
-0.043975830078125,
-0.037994384765625,
0.032135009765625,
0.03387451171875,
-0.035919189453125,
0.040130615234375,
-0.051361083984375,
-0.042327880859375,
0.048248291015625,
0.06549072265625,
0.06591796875,
-0.0225830078125,
0.0298309326171875,
0.032073974609375,
0.041534423828125,
-0.0249786376953125,
0.0400390625,
-0.00336456298828125,
-0.03741455078125,
0.00862884521484375,
-0.04205322265625,
-0.02825927734375,
0.024444580078125,
-0.0576171875,
0.01500701904296875,
-0.033447265625,
-0.0084228515625,
0.01143646240234375,
-0.001117706298828125,
-0.032745361328125,
0.0075836181640625,
0.00905609130859375,
0.1436767578125,
-0.0758056640625,
0.1102294921875,
0.06005859375,
-0.043853759765625,
-0.033233642578125,
0.00482177734375,
-0.00817108154296875,
-0.029266357421875,
0.030731201171875,
0.0085906982421875,
0.0038623809814453125,
-0.00482940673828125,
-0.042205810546875,
-0.06732177734375,
0.09649658203125,
0.00638580322265625,
-0.057342529296875,
0.032745361328125,
-0.0024394989013671875,
0.035736083984375,
-0.02886962890625,
0.005855560302734375,
0.03143310546875,
0.03521728515625,
0.0037689208984375,
-0.08270263671875,
0.01318359375,
-0.04095458984375,
-0.01192474365234375,
0.022705078125,
-0.06646728515625,
0.040252685546875,
0.01385498046875,
0.00608062744140625,
-0.0061798095703125,
0.0166778564453125,
0.032196044921875,
0.0194854736328125,
0.04571533203125,
0.0416259765625,
0.026641845703125,
0.007503509521484375,
0.03472900390625,
-0.01031494140625,
0.049224853515625,
0.0880126953125,
-0.032867431640625,
0.03314208984375,
0.044281005859375,
-0.035369873046875,
0.0031585693359375,
0.0543212890625,
-0.0307159423828125,
0.072998046875,
0.01678466796875,
-0.017181396484375,
-0.01192474365234375,
0.00016736984252929688,
-0.043121337890625,
0.0187225341796875,
0.0306396484375,
-0.007720947265625,
-0.0218048095703125,
-0.019561767578125,
-0.01971435546875,
-0.04931640625,
-0.0193023681640625,
0.038787841796875,
-0.01313018798828125,
-0.01361083984375,
0.01389312744140625,
0.01171875,
0.03485107421875,
-0.07958984375,
-0.006603240966796875,
-0.0230712890625,
-0.01071929931640625,
0.005229949951171875,
-0.051361083984375,
0.043548583984375,
-0.00751495361328125,
-0.0225677490234375,
-0.004871368408203125,
0.0765380859375,
-0.032806396484375,
-0.05181884765625,
0.03851318359375,
0.0174560546875,
-0.00489044189453125,
0.0014171600341796875,
-0.08056640625,
0.01551055908203125,
-0.0091094970703125,
-0.01177978515625,
0.01181793212890625,
0.006679534912109375,
0.0072021484375,
0.07537841796875,
0.0271148681640625,
0.0008845329284667969,
0.0036907196044921875,
0.0240478515625,
0.0723876953125,
-0.0584716796875,
-0.04522705078125,
-0.044158935546875,
0.055419921875,
-0.01873779296875,
-0.0213165283203125,
0.020782470703125,
0.0667724609375,
0.033599853515625,
-0.037933349609375,
0.0633544921875,
-0.0252532958984375,
0.01678466796875,
-0.00811767578125,
0.060455322265625,
-0.041107177734375,
0.0011348724365234375,
-0.017181396484375,
-0.02166748046875,
-0.0282745361328125,
0.044891357421875,
0.0297393798828125,
-0.035919189453125,
0.053375244140625,
0.07763671875,
-0.0450439453125,
0.00836944580078125,
0.03314208984375,
-0.00571441650390625,
0.0197296142578125,
0.00638580322265625,
0.0714111328125,
-0.031402587890625,
0.033935546875,
-0.0293731689453125,
-0.0167083740234375,
-0.0189056396484375,
-0.06500244140625,
-0.056304931640625,
-0.04461669921875,
-0.0240478515625,
-0.02569580078125,
0.00608062744140625,
0.08514404296875,
0.088134765625,
-0.0367431640625,
-0.0137786865234375,
-0.011444091796875,
-0.0008549690246582031,
-0.004840850830078125,
-0.009918212890625,
0.01983642578125,
-0.01010894775390625,
-0.005580902099609375,
-0.01024627685546875,
0.040252685546875,
0.0259857177734375,
-0.029876708984375,
-0.0279998779296875,
-0.00858306884765625,
0.0242919921875,
0.046783447265625,
0.0146942138671875,
-0.051177978515625,
-0.015716552734375,
-0.021240234375,
-0.044281005859375,
0.0000750422477722168,
0.06982421875,
0.01041412353515625,
-0.01336669921875,
0.028228759765625,
-0.00072479248046875,
0.0653076171875,
-0.00762939453125,
0.03424072265625,
-0.04766845703125,
0.025970458984375,
-0.002964019775390625,
0.03948974609375,
0.00640869140625,
-0.00482177734375,
0.043487548828125,
0.0145721435546875,
-0.024169921875,
-0.0479736328125,
0.0300140380859375,
-0.096923828125,
-0.021759033203125,
0.043060302734375,
-0.0171051025390625,
-0.00890350341796875,
0.0223541259765625,
-0.017578125,
0.0208892822265625,
-0.0548095703125,
0.06158447265625,
0.037628173828125,
-0.005962371826171875,
-0.051727294921875,
-0.034759521484375,
0.0299072265625,
-0.0225830078125,
-0.060791015625,
-0.003711700439453125,
0.0234375,
0.037445068359375,
-0.002071380615234375,
0.08343505859375,
-0.0087127685546875,
0.043731689453125,
0.031341552734375,
0.038238525390625,
-0.006603240966796875,
-0.01165771484375,
-0.00653076171875,
-0.000545501708984375,
0.0186309814453125,
-0.04449462890625
]
] |
stabilityai/stable-diffusion-2-1 | 2023-07-05T16:19:17.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"arxiv:2112.10752",
"arxiv:2202.00512",
"arxiv:1910.09700",
"license:openrail++",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | stabilityai | null | null | stabilityai/stable-diffusion-2-1 | 3,284 | 6,158,960 | diffusers | 2022-12-06T17:24:51 | ---
license: openrail++
tags:
- stable-diffusion
- text-to-image
pinned: true
---
# Stable Diffusion v2-1 Model Card
This model card focuses on the model associated with the Stable Diffusion v2-1 model, codebase available [here](https://github.com/Stability-AI/stablediffusion).
This `stable-diffusion-2-1` model is fine-tuned from [stable-diffusion-2](https://huggingface.co/stabilityai/stable-diffusion-2) (`768-v-ema.ckpt`) with an additional 55k steps on the same dataset (with `punsafe=0.1`), and then fine-tuned for another 155k steps with `punsafe=0.98`.
- Use it with the [`stablediffusion`](https://github.com/Stability-AI/stablediffusion) repository: download the `v2-1_768-ema-pruned.ckpt` [here](https://huggingface.co/stabilityai/stable-diffusion-2-1/blob/main/v2-1_768-ema-pruned.ckpt).
- Use it with ๐งจ [`diffusers`](#examples)
## Model Details
- **Developed by:** Robin Rombach, Patrick Esser
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([OpenCLIP-ViT/H](https://github.com/mlfoundations/open_clip)).
- **Resources for more information:** [GitHub Repository](https://github.com/Stability-AI/).
- **Cite as:**
```bibtex
@InProceedings{Rombach_2022_CVPR,
    author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
    title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {10684-10695}
}
```
## Examples
Using the [๐ค's Diffusers library](https://github.com/huggingface/diffusers) to run Stable Diffusion 2 in a simple and efficient manner.
```bash
pip install diffusers transformers accelerate scipy safetensors
```
Running the pipeline (if you don't swap the scheduler, it will run with the default DDIM; in this example we swap it to `DPMSolverMultistepScheduler`):
```python
import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler
model_id = "stabilityai/stable-diffusion-2-1"
# Use the DPMSolverMultistepScheduler (DPM-Solver++) scheduler here instead
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
pipe = pipe.to("cuda")
prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt).images[0]
image.save("astronaut_rides_horse.png")
```
**Notes**:
- Although it is not a dependency, we highly recommend installing [xformers](https://github.com/facebookresearch/xformers) for memory-efficient attention (better performance).
- If you have limited GPU RAM, add `pipe.enable_attention_slicing()` after sending the pipeline to `cuda` to reduce VRAM usage (at the cost of speed).
# Uses
## Direct Use
The model is intended for research purposes only. Possible research areas and tasks include
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section is originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), was used for Stable Diffusion v1, but applies in the same way to Stable Diffusion v2_.
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism
- The model cannot render legible text
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to โA red cube on top of a blue sphereโ
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy
- The model was trained on a subset of the large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent, and sexual content. To partially mitigate this, we have filtered the dataset using LAION's NSFW detector (see the Training section).
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
Stable Diffusion was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/),
which consists of images that are limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
Stable Diffusion v2 mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.
## Training
**Training Data**
The model developers used the following dataset for training the model:
- LAION-5B and subsets (details below). The training data is further filtered using LAION's NSFW detector, with a "p_unsafe" score of 0.1 (conservative). For more details, please refer to LAION-5B's [NeurIPS 2022](https://openreview.net/forum?id=M3Y74vmsMcY) paper and reviewer discussions on the topic.
**Training Procedure**
Stable Diffusion v2 is a latent diffusion model which combines an autoencoder with a diffusion model that is trained in the latent space of the autoencoder. During training,
- Images are encoded through an encoder, which turns images into latent representations. The autoencoder uses a relative downsampling factor of 8 and maps images of shape H x W x 3 to latents of shape H/f x W/f x 4
- Text prompts are encoded through the OpenCLIP-ViT/H text-encoder.
- The output of the text encoder is fed into the UNet backbone of the latent diffusion model via cross-attention.
- The loss is a reconstruction objective between the noise that was added to the latent and the prediction made by the UNet. We also use the so-called _v-objective_, see https://arxiv.org/abs/2202.00512.
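The shape arithmetic described above can be sketched in a few lines. This is an illustration of the stated downsampling factor, not code from the model repository; the function name is hypothetical.

```python
# Sketch of the autoencoder's shape mapping: an H x W x 3 image becomes a
# latent of shape H/f x W/f x 4, with relative downsampling factor f = 8.
def latent_shape(height: int, width: int, f: int = 8, channels: int = 4) -> tuple:
    """Return the latent shape (H/f, W/f, channels) for an H x W x 3 image."""
    assert height % f == 0 and width % f == 0, "dimensions must be divisible by f"
    return (height // f, width // f, channels)

print(latent_shape(768, 768))  # (96, 96, 4)
print(latent_shape(512, 512))  # (64, 64, 4)
```

For example, the `768-v-ema.ckpt` checkpoint's 768x768 images are diffused in a 96x96x4 latent space.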
We currently provide the following checkpoints:
- `512-base-ema.ckpt`: 550k steps at resolution `256x256` on a subset of [LAION-5B](https://laion.ai/blog/laion-5b/) filtered for explicit pornographic material, using the [LAION-NSFW classifier](https://github.com/LAION-AI/CLIP-based-NSFW-Detector) with `punsafe=0.1` and an [aesthetic score](https://github.com/christophschuhmann/improved-aesthetic-predictor) >= `4.5`.
850k steps at resolution `512x512` on the same dataset with resolution `>= 512x512`.
- `768-v-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for 150k steps using a [v-objective](https://arxiv.org/abs/2202.00512) on the same dataset. Resumed for another 140k steps on a `768x768` subset of our dataset.
- `512-depth-ema.ckpt`: Resumed from `512-base-ema.ckpt` and finetuned for 200k steps. Added an extra input channel to process the (relative) depth prediction produced by [MiDaS](https://github.com/isl-org/MiDaS) (`dpt_hybrid`) which is used as an additional conditioning.
The additional input channels of the U-Net which process this extra information were zero-initialized.
- `512-inpainting-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for another 200k steps. Follows the mask-generation strategy presented in [LAMA](https://github.com/saic-mdal/lama) which, in combination with the latent VAE representations of the masked image, is used as additional conditioning.
The additional input channels of the U-Net which process this extra information were zero-initialized. The same strategy was used to train the [1.5-inpainting checkpoint](https://huggingface.co/runwayml/stable-diffusion-inpainting).
- `x4-upscaling-ema.ckpt`: Trained for 1.25M steps on a 10M subset of LAION containing images `>2048x2048`. The model was trained on crops of size `512x512` and is a text-guided [latent upscaling diffusion model](https://arxiv.org/abs/2112.10752).
In addition to the textual input, it receives a `noise_level` as an input parameter, which can be used to add noise to the low-resolution input according to a [predefined diffusion schedule](configs/stable-diffusion/x4-upscaling.yaml).
- **Hardware:** 32 x 8 x A100 GPUs
- **Optimizer:** AdamW
- **Gradient Accumulations**: 1
- **Batch:** 32 x 8 x 2 x 4 = 2048
- **Learning rate:** warmup to 0.0001 for 10,000 steps and then kept constant
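The learning-rate schedule described above (linear warmup to 0.0001 over 10,000 steps, then constant) can be sketched as follows; this is an illustrative reconstruction of the stated schedule, not the training code itself.

```python
def learning_rate(step: int, peak_lr: float = 1e-4, warmup_steps: int = 10_000) -> float:
    """Linear warmup to peak_lr over warmup_steps, then held constant."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr

print(learning_rate(5_000))   # halfway through warmup: 5e-05
print(learning_rate(20_000))  # past warmup: 0.0001
```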
## Evaluation Results
Evaluations with different classifier-free guidance scales (1.5, 2.0, 3.0, 4.0,
5.0, 6.0, 7.0, 8.0) and 50 DDIM sampling steps show the relative improvements of the checkpoints:

Evaluated using 50 DDIM steps and 10000 random prompts from the COCO2017 validation set, evaluated at 512x512 resolution. Not optimized for FID scores.
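For context, the guidance scales swept above control classifier-free guidance (CFG): at each denoising step the model produces an unconditional and a text-conditional noise prediction, and the two are blended. The sketch below illustrates that blending rule only; the function and argument names are ours, not the diffusers API.

```python
# Classifier-free guidance: eps = eps_uncond + s * (eps_cond - eps_uncond).
# A scale of 1.0 reproduces the conditional prediction; larger scales push
# the sample further toward the text condition.
def cfg_combine(eps_uncond, eps_cond, guidance_scale):
    return [u + guidance_scale * (c - u) for u, c in zip(eps_uncond, eps_cond)]

print(cfg_combine([0.0, 0.5], [1.0, 0.5], 1.0))  # conditional prediction unchanged
print(cfg_combine([0.1, 0.2], [0.3, 0.0], 7.5))  # amplified toward the condition
```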
## Environmental Impact
**Stable Diffusion v1** **Estimated Emissions**
Based on this information, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were used to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 200000
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 15000 kg CO2 eq.
## Citation
```bibtex
@InProceedings{Rombach_2022_CVPR,
    author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
    title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {10684-10695}
}
```
*This model card was written by: Robin Rombach, Patrick Esser and David Ha and is based on the [Stable Diffusion v1](https://github.com/CompVis/stable-diffusion/blob/main/Stable_Diffusion_v1_Model_Card.md) and [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).*
| 12,209 | [
[
-0.02923583984375,
-0.0653076171875,
0.02716064453125,
0.014923095703125,
-0.0188751220703125,
-0.0278472900390625,
0.007843017578125,
-0.0301361083984375,
-0.00852203369140625,
0.0291595458984375,
-0.034027099609375,
-0.029205322265625,
-0.057952880859375,
-0.00811004638671875,
-0.029296875,
0.06951904296875,
-0.00984954833984375,
0.002422332763671875,
-0.01207733154296875,
0.00007361173629760742,
-0.02587890625,
-0.00867462158203125,
-0.075927734375,
-0.0213775634765625,
0.037628173828125,
0.006809234619140625,
0.0548095703125,
0.042999267578125,
0.03857421875,
0.020538330078125,
-0.02166748046875,
0.0004892349243164062,
-0.055511474609375,
-0.003570556640625,
-0.003185272216796875,
-0.021514892578125,
-0.03912353515625,
0.01312255859375,
0.048370361328125,
0.0189056396484375,
-0.0078125,
0.0030517578125,
0.0013895034790039062,
0.04327392578125,
-0.04229736328125,
-0.007843017578125,
-0.0254364013671875,
0.01384735107421875,
-0.013336181640625,
0.0156707763671875,
-0.028350830078125,
-0.00984954833984375,
0.00649261474609375,
-0.05853271484375,
0.0255126953125,
-0.021453857421875,
0.078369140625,
0.03619384765625,
-0.025238037109375,
-0.006961822509765625,
-0.05767822265625,
0.0435791015625,
-0.046478271484375,
0.0199127197265625,
0.030242919921875,
0.00624847412109375,
-0.002079010009765625,
-0.07354736328125,
-0.049163818359375,
-0.00473785400390625,
0.0005717277526855469,
0.036529541015625,
-0.032928466796875,
-0.002231597900390625,
0.035308837890625,
0.01496124267578125,
-0.0465087890625,
0.00305938720703125,
-0.042388916015625,
-0.0035343170166015625,
0.048858642578125,
0.01276397705078125,
0.0179290771484375,
-0.01470947265625,
-0.031646728515625,
-0.003910064697265625,
-0.03851318359375,
-0.0029468536376953125,
0.0306549072265625,
-0.025238037109375,
-0.031524658203125,
0.0316162109375,
0.0094146728515625,
0.03802490234375,
0.0226593017578125,
-0.01337432861328125,
0.02313232421875,
-0.0184173583984375,
-0.015594482421875,
-0.035400390625,
0.06475830078125,
0.048095703125,
-0.005893707275390625,
0.01025390625,
-0.008819580078125,
0.0170135498046875,
0.004016876220703125,
-0.0924072265625,
-0.037109375,
0.0105743408203125,
-0.051116943359375,
-0.0435791015625,
-0.01232147216796875,
-0.07635498046875,
-0.01470947265625,
0.010711669921875,
0.036346435546875,
-0.025787353515625,
-0.035736083984375,
-0.0018177032470703125,
-0.028900146484375,
0.01074981689453125,
0.034210205078125,
-0.054046630859375,
0.0126190185546875,
0.006832122802734375,
0.08538818359375,
-0.0265960693359375,
-0.0034942626953125,
-0.01131439208984375,
0.0084075927734375,
-0.0210723876953125,
0.0504150390625,
-0.027313232421875,
-0.037811279296875,
-0.01885986328125,
0.022613525390625,
0.008026123046875,
-0.03717041015625,
0.0455322265625,
-0.035125732421875,
0.0274505615234375,
-0.0029449462890625,
-0.031524658203125,
-0.01568603515625,
-0.0006070137023925781,
-0.0543212890625,
0.08428955078125,
0.0192718505859375,
-0.06866455078125,
0.00896453857421875,
-0.056304931640625,
-0.017913818359375,
-0.0072174072265625,
0.00214385986328125,
-0.04974365234375,
-0.01100921630859375,
0.0036220550537109375,
0.031097412109375,
-0.01100921630859375,
0.01953125,
-0.0203704833984375,
-0.020538330078125,
-0.0012807846069335938,
-0.04925537109375,
0.0751953125,
0.029205322265625,
-0.032073974609375,
-0.00032782554626464844,
-0.049102783203125,
-0.028076171875,
0.040130615234375,
-0.0167694091796875,
-0.0131988525390625,
-0.01119232177734375,
0.0228729248046875,
0.0274505615234375,
0.00522613525390625,
-0.034912109375,
0.002655029296875,
-0.0224761962890625,
0.0458984375,
0.057708740234375,
0.0156707763671875,
0.051025390625,
-0.031280517578125,
0.042388916015625,
0.0255279541015625,
0.02294921875,
-0.01100921630859375,
-0.0638427734375,
-0.0509033203125,
-0.0171356201171875,
0.01409149169921875,
0.042327880859375,
-0.058258056640625,
0.013336181640625,
0.005306243896484375,
-0.054534912109375,
-0.0186309814453125,
-0.0066680908203125,
0.020111083984375,
0.054229736328125,
0.0228729248046875,
-0.0255584716796875,
-0.0257720947265625,
-0.056396484375,
0.02935791015625,
-0.007205963134765625,
0.013763427734375,
0.01910400390625,
0.050201416015625,
-0.02899169921875,
0.043914794921875,
-0.051025390625,
-0.0217437744140625,
0.00894927978515625,
0.009307861328125,
0.000682830810546875,
0.051025390625,
0.061309814453125,
-0.0797119140625,
-0.048553466796875,
-0.02398681640625,
-0.0611572265625,
-0.002895355224609375,
0.00038695335388183594,
-0.025238037109375,
0.0306243896484375,
0.036529541015625,
-0.056732177734375,
0.0469970703125,
0.048126220703125,
-0.0253448486328125,
0.034881591796875,
-0.025726318359375,
0.0003380775451660156,
-0.08087158203125,
0.01076507568359375,
0.0223846435546875,
-0.0230255126953125,
-0.044952392578125,
-0.0008177757263183594,
-0.00411224365234375,
-0.01561737060546875,
-0.045928955078125,
0.06182861328125,
-0.027679443359375,
0.033843994140625,
-0.033660888671875,
0.0004353523254394531,
0.01372528076171875,
0.02410888671875,
0.0242462158203125,
0.049468994140625,
0.06500244140625,
-0.04620361328125,
0.015228271484375,
0.020416259765625,
-0.005428314208984375,
0.03839111328125,
-0.064453125,
0.0133056640625,
-0.0340576171875,
0.024139404296875,
-0.07940673828125,
-0.012176513671875,
0.04998779296875,
-0.0300750732421875,
0.0295562744140625,
-0.01447296142578125,
-0.0318603515625,
-0.034576416015625,
-0.01209259033203125,
0.040130615234375,
0.07525634765625,
-0.0289154052734375,
0.0364990234375,
0.033599853515625,
0.01079559326171875,
-0.0352783203125,
-0.05804443359375,
-0.0097198486328125,
-0.027557373046875,
-0.06256103515625,
0.048004150390625,
-0.0204010009765625,
-0.01218414306640625,
0.0123443603515625,
0.01171112060546875,
-0.0035800933837890625,
-0.003940582275390625,
0.032318115234375,
0.017791748046875,
0.0032176971435546875,
-0.004909515380859375,
0.0171051025390625,
-0.0204925537109375,
-0.0008702278137207031,
-0.00960540771484375,
0.02960205078125,
0.017578125,
-0.00879669189453125,
-0.050628662109375,
0.033721923828125,
0.039031982421875,
-0.000598907470703125,
0.055938720703125,
0.08074951171875,
-0.04376220703125,
0.0009546279907226562,
-0.0284423828125,
-0.0206298828125,
-0.037139892578125,
0.03253173828125,
-0.01227569580078125,
-0.04595947265625,
0.04437255859375,
0.0019483566284179688,
0.0022106170654296875,
0.051422119140625,
0.060638427734375,
-0.0154876708984375,
0.08551025390625,
0.049957275390625,
0.0211944580078125,
0.054534912109375,
-0.05523681640625,
-0.002197265625,
-0.06402587890625,
-0.0213470458984375,
-0.0126190185546875,
-0.0221099853515625,
-0.034210205078125,
-0.051055908203125,
0.025665283203125,
0.012420654296875,
-0.010833740234375,
0.01409149169921875,
-0.045806884765625,
0.023712158203125,
0.023101806640625,
0.016845703125,
0.0026226043701171875,
0.01079559326171875,
0.00823974609375,
-0.015716552734375,
-0.06298828125,
-0.047119140625,
0.07598876953125,
0.04052734375,
0.07135009765625,
0.0012521743774414062,
0.038787841796875,
0.03228759765625,
0.02813720703125,
-0.034942626953125,
0.03875732421875,
-0.0285797119140625,
-0.05047607421875,
-0.0088043212890625,
-0.017974853515625,
-0.07061767578125,
0.01465606689453125,
-0.0160064697265625,
-0.032379150390625,
0.035919189453125,
0.015655517578125,
-0.0235443115234375,
0.0260772705078125,
-0.054656982421875,
0.074462890625,
-0.00818634033203125,
-0.056732177734375,
-0.01297760009765625,
-0.049163818359375,
0.0247039794921875,
0.000014841556549072266,
0.0103302001953125,
-0.0107879638671875,
-0.009185791015625,
0.06610107421875,
-0.0207977294921875,
0.07196044921875,
-0.031341552734375,
-0.0007586479187011719,
0.029510498046875,
-0.007381439208984375,
0.0287933349609375,
0.020782470703125,
-0.009765625,
0.028350830078125,
0.002777099609375,
-0.0281219482421875,
-0.027008056640625,
0.056243896484375,
-0.07391357421875,
-0.03497314453125,
-0.03466796875,
-0.030548095703125,
0.0439453125,
0.013763427734375,
0.06072998046875,
0.0305328369140625,
-0.01666259765625,
-0.007434844970703125,
0.06317138671875,
-0.019195556640625,
0.0355224609375,
0.01910400390625,
-0.0211029052734375,
-0.038360595703125,
0.0556640625,
0.017333984375,
0.036956787109375,
0.0005259513854980469,
0.01229095458984375,
-0.0162200927734375,
-0.041168212890625,
-0.0455322265625,
0.0215911865234375,
-0.06463623046875,
-0.0164337158203125,
-0.06195068359375,
-0.025604248046875,
-0.034515380859375,
-0.0098724365234375,
-0.025390625,
-0.0218658447265625,
-0.0667724609375,
0.007038116455078125,
0.0244293212890625,
0.042388916015625,
-0.02471923828125,
0.029052734375,
-0.032318115234375,
0.031524658203125,
0.01221466064453125,
0.01447296142578125,
0.0027408599853515625,
-0.060211181640625,
-0.0117034912109375,
0.0082550048828125,
-0.050872802734375,
-0.0772705078125,
0.0283660888671875,
0.0082244873046875,
0.04559326171875,
0.040985107421875,
-0.002193450927734375,
0.0418701171875,
-0.031951904296875,
0.0723876953125,
0.0142669677734375,
-0.04669189453125,
0.05029296875,
-0.03179931640625,
0.0113372802734375,
0.0154571533203125,
0.04443359375,
-0.0230560302734375,
-0.023162841796875,
-0.05841064453125,
-0.0640869140625,
0.051544189453125,
0.033966064453125,
0.0271148681640625,
-0.00994873046875,
0.0521240234375,
0.00003814697265625,
-0.0068511962890625,
-0.08197021484375,
-0.0421142578125,
-0.0252838134765625,
0.0034732818603515625,
0.007480621337890625,
-0.033050537109375,
-0.01409149169921875,
-0.037506103515625,
0.06964111328125,
0.00699615478515625,
0.041290283203125,
0.0310211181640625,
0.000008940696716308594,
-0.02996826171875,
-0.0268096923828125,
0.041534423828125,
0.03125,
-0.01059722900390625,
-0.00286865234375,
-0.00293731689453125,
-0.042327880859375,
0.02203369140625,
0.01605224609375,
-0.05120849609375,
0.003124237060546875,
-0.00370025634765625,
0.071044921875,
-0.0179595947265625,
-0.033538818359375,
0.046630859375,
-0.0132598876953125,
-0.0276336669921875,
-0.034271240234375,
0.0115509033203125,
0.005680084228515625,
0.02581787109375,
0.01053619384765625,
0.036529541015625,
0.0145416259765625,
-0.0226287841796875,
0.0082550048828125,
0.03460693359375,
-0.0278167724609375,
-0.0250244140625,
0.0819091796875,
0.01279449462890625,
-0.0267181396484375,
0.04278564453125,
-0.038665771484375,
-0.0188446044921875,
0.051605224609375,
0.059295654296875,
0.060791015625,
-0.01355743408203125,
0.03643798828125,
0.05426025390625,
0.024993896484375,
-0.01837158203125,
0.01323699951171875,
0.0173797607421875,
-0.05389404296875,
-0.0091552734375,
-0.032867431640625,
-0.0033321380615234375,
0.013092041015625,
-0.03515625,
0.039031982421875,
-0.034942626953125,
-0.033843994140625,
-0.0005125999450683594,
-0.021087646484375,
-0.04559326171875,
0.01136016845703125,
0.0283966064453125,
0.060882568359375,
-0.08538818359375,
0.06134033203125,
0.055908203125,
-0.047332763671875,
-0.03607177734375,
0.00347900390625,
-0.007572174072265625,
-0.0279998779296875,
0.039398193359375,
0.00998687744140625,
0.004680633544921875,
0.01079559326171875,
-0.056884765625,
-0.072998046875,
0.097412109375,
0.0293731689453125,
-0.025482177734375,
-0.0036258697509765625,
-0.0187530517578125,
0.0440673828125,
-0.033966064453125,
0.023468017578125,
0.0232391357421875,
0.0308380126953125,
0.0272674560546875,
-0.039306640625,
0.01105499267578125,
-0.0278472900390625,
0.0236663818359375,
-0.0054168701171875,
-0.07086181640625,
0.0750732421875,
-0.026611328125,
-0.025146484375,
0.018798828125,
0.049163818359375,
0.0156707763671875,
0.0266265869140625,
0.031494140625,
0.064208984375,
0.04229736328125,
-0.01055145263671875,
0.07373046875,
-0.0057830810546875,
0.031280517578125,
0.057403564453125,
-0.0069427490234375,
0.049163818359375,
0.033416748046875,
-0.01407623291015625,
0.043487548828125,
0.055389404296875,
-0.027984619140625,
0.06048583984375,
-0.000919342041015625,
-0.01268768310546875,
-0.004322052001953125,
0.0007810592651367188,
-0.0369873046875,
0.01355743408203125,
0.0229034423828125,
-0.0428466796875,
-0.016632080078125,
0.0217437744140625,
0.0025272369384765625,
-0.0130767822265625,
-0.007572174072265625,
0.043731689453125,
0.004192352294921875,
-0.034149169921875,
0.044891357421875,
0.01560211181640625,
0.06707763671875,
-0.036224365234375,
-0.01366424560546875,
-0.00797271728515625,
0.0117340087890625,
-0.0187225341796875,
-0.058013916015625,
0.03680419921875,
-0.0087738037109375,
-0.0242156982421875,
-0.0182952880859375,
0.0693359375,
-0.0279693603515625,
-0.049896240234375,
0.029144287109375,
0.0220947265625,
0.0230560302734375,
0.002658843994140625,
-0.0802001953125,
0.01558685302734375,
-0.0041961669921875,
-0.026580810546875,
0.019073486328125,
0.0153656005859375,
0.008453369140625,
0.039947509765625,
0.041595458984375,
-0.005725860595703125,
0.006008148193359375,
-0.00397491455078125,
0.0653076171875,
-0.0227203369140625,
-0.025634765625,
-0.06402587890625,
0.05462646484375,
-0.00481414794921875,
-0.021514892578125,
0.05047607421875,
0.04461669921875,
0.05914306640625,
-0.00982666015625,
0.057525634765625,
-0.0239105224609375,
0.00015056133270263672,
-0.037750244140625,
0.0645751953125,
-0.057159423828125,
0.003887176513671875,
-0.0292205810546875,
-0.06787109375,
-0.01174163818359375,
0.06927490234375,
-0.020538330078125,
0.018096923828125,
0.035919189453125,
0.075927734375,
-0.00974273681640625,
-0.0162200927734375,
0.0257720947265625,
0.01873779296875,
0.027130126953125,
0.0255126953125,
0.0592041015625,
-0.05889892578125,
0.032135009765625,
-0.0416259765625,
-0.0226287841796875,
-0.0013341903686523438,
-0.06585693359375,
-0.064208984375,
-0.05316162109375,
-0.06121826171875,
-0.050323486328125,
-0.00429534912109375,
0.03363037109375,
0.0728759765625,
-0.03460693359375,
-0.0005474090576171875,
-0.0150299072265625,
0.00023221969604492188,
-0.00665283203125,
-0.0211029052734375,
0.0213775634765625,
0.007289886474609375,
-0.07061767578125,
-0.0059661865234375,
0.0222320556640625,
0.041168212890625,
-0.041717529296875,
-0.01522064208984375,
-0.0220794677734375,
-0.009674072265625,
0.042724609375,
0.0052337646484375,
-0.050689697265625,
0.0010738372802734375,
-0.003009796142578125,
-0.005863189697265625,
0.0113067626953125,
0.026336669921875,
-0.047332763671875,
0.0299530029296875,
0.040679931640625,
0.013702392578125,
0.06402587890625,
-0.008514404296875,
0.012237548828125,
-0.036163330078125,
0.0263671875,
0.01259613037109375,
0.02923583984375,
0.0240478515625,
-0.045501708984375,
0.0379638671875,
0.046783447265625,
-0.056488037109375,
-0.05902099609375,
0.0159149169921875,
-0.080810546875,
-0.01971435546875,
0.09967041015625,
-0.0123443603515625,
-0.0222320556640625,
0.0032634735107421875,
-0.031494140625,
0.0236358642578125,
-0.0278472900390625,
0.044036865234375,
0.04327392578125,
-0.01087188720703125,
-0.038482666015625,
-0.047515869140625,
0.03948974609375,
0.01116180419921875,
-0.04638671875,
-0.0168914794921875,
0.04736328125,
0.051544189453125,
0.0170135498046875,
0.07464599609375,
-0.022613525390625,
0.02069091796875,
0.0098724365234375,
-0.0014162063598632812,
0.0009236335754394531,
-0.0161590576171875,
-0.03448486328125,
0.0006566047668457031,
-0.01380157470703125,
-0.004138946533203125
]
] |
albert-base-v2 | 2023-05-30T07:52:10.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"safetensors",
"albert",
"fill-mask",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1909.11942",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | albert-base-v2 | 72 | 6,140,346 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# ALBERT Base v2
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1909.11942) and first released in
[this repository](https://github.com/google-research/albert). Like all ALBERT models, this model is uncased: it does not make a difference
between english and English.
Disclaimer: The team releasing ALBERT did not write a model card for this model so this model card has been written by
the Hugging Face team.
## Model description
ALBERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it
was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
sentence.
- Sentence Ordering Prediction (SOP): ALBERT uses a pretraining loss based on predicting the ordering of two consecutive segments of text.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the ALBERT model as inputs.
ALBERT is unusual in that it shares parameters across all of its Transformer layers, so every layer has the same weights. Reusing one set of layer weights results in a small memory footprint; the computational cost, however, remains similar to that of a BERT-like architecture with the same number of hidden layers, because the model still has to iterate through the same number of (repeating) layers.
This is the second version of the base model. Version 2 is different from version 1 due to different dropout rates, additional training data, and longer training. It has better results in nearly all downstream tasks.
This model has the following configuration:
- 12 repeating layers
- 128 embedding dimension
- 768 hidden dimension
- 12 attention heads
- 11M parameters
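The ~11M figure can be roughly reproduced from these numbers: ALBERT factorizes the embedding matrix (vocab × 128, then a 128 → 768 projection) and shares one set of layer weights across all 12 repetitions. Below is a back-of-the-envelope sketch; the 30,000-token vocabulary, 512 positions, and 3072-dimensional feed-forward layer are the standard ALBERT-base hyperparameters, stated here as assumptions (the pooler head is omitted).

```python
# Back-of-the-envelope ALBERT-base parameter count. Biases and layer norms
# are included for the shared layer; the pooler is omitted.
vocab, emb, hidden, ffn, positions = 30_000, 128, 768, 3_072, 512

embeddings = (vocab + positions + 2) * emb         # word + position + token-type
projection = emb * hidden + hidden                 # factorized 128 -> 768 projection
attention = 4 * (hidden * hidden + hidden)         # Q, K, V, output
feed_forward = (hidden * ffn + ffn) + (ffn * hidden + hidden)
layer = attention + feed_forward + 2 * 2 * hidden  # plus two layer norms

# Cross-layer sharing: one layer's weights, regardless of the 12 repetitions.
total = embeddings + projection + layer
print(f"{total / 1e6:.1f}M parameters")  # -> 11.1M parameters
```

Note that without cross-layer sharing the body alone would cost `12 * layer` ≈ 85M parameters, which is why BERT-base is an order of magnitude larger.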
## Intended uses & limitations
You can use the raw model for either masked language modeling or sentence order prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=albert) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation, you should look at a model like GPT-2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='albert-base-v2')
>>> unmasker("Hello I'm a [MASK] model.")
[
{
"sequence":"[CLS] hello i'm a modeling model.[SEP]",
"score":0.05816134437918663,
"token":12807,
"token_str":"โmodeling"
},
{
"sequence":"[CLS] hello i'm a modelling model.[SEP]",
"score":0.03748830780386925,
"token":23089,
"token_str":"โmodelling"
},
{
"sequence":"[CLS] hello i'm a model model.[SEP]",
"score":0.033725276589393616,
"token":1061,
"token_str":"โmodel"
},
{
"sequence":"[CLS] hello i'm a runway model.[SEP]",
"score":0.017313428223133087,
"token":8014,
"token_str":"โrunway"
},
{
"sequence":"[CLS] hello i'm a lingerie model.[SEP]",
"score":0.014405295252799988,
"token":29104,
"token_str":"โlingerie"
}
]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import AlbertTokenizer, AlbertModel
tokenizer = AlbertTokenizer.from_pretrained('albert-base-v2')
model = AlbertModel.from_pretrained("albert-base-v2")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import AlbertTokenizer, TFAlbertModel
tokenizer = AlbertTokenizer.from_pretrained('albert-base-v2')
model = TFAlbertModel.from_pretrained("albert-base-v2")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
Even though the training data used for this model could be characterized as fairly neutral, this model can produce biased
predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='albert-base-v2')
>>> unmasker("The man worked as a [MASK].")
[
{
"sequence":"[CLS] the man worked as a chauffeur.[SEP]",
"score":0.029577180743217468,
"token":28744,
"token_str":"โchauffeur"
},
{
"sequence":"[CLS] the man worked as a janitor.[SEP]",
"score":0.028865724802017212,
"token":29477,
"token_str":"โjanitor"
},
{
"sequence":"[CLS] the man worked as a shoemaker.[SEP]",
"score":0.02581118606030941,
"token":29024,
"token_str":"โshoemaker"
},
{
"sequence":"[CLS] the man worked as a blacksmith.[SEP]",
"score":0.01849772222340107,
"token":21238,
"token_str":"โblacksmith"
},
{
"sequence":"[CLS] the man worked as a lawyer.[SEP]",
"score":0.01820771023631096,
"token":3672,
"token_str":"โlawyer"
}
]
>>> unmasker("The woman worked as a [MASK].")
[
{
"sequence":"[CLS] the woman worked as a receptionist.[SEP]",
"score":0.04604868218302727,
"token":25331,
"token_str":"โreceptionist"
},
{
"sequence":"[CLS] the woman worked as a janitor.[SEP]",
"score":0.028220869600772858,
"token":29477,
"token_str":"โjanitor"
},
{
"sequence":"[CLS] the woman worked as a paramedic.[SEP]",
"score":0.0261906236410141,
"token":23386,
"token_str":"โparamedic"
},
{
"sequence":"[CLS] the woman worked as a chauffeur.[SEP]",
"score":0.024797942489385605,
"token":28744,
"token_str":"โchauffeur"
},
{
"sequence":"[CLS] the woman worked as a waitress.[SEP]",
"score":0.024124596267938614,
"token":13678,
"token_str":"โwaitress"
}
]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The ALBERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
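Purely as an illustration of that segment layout (`build_input` is a hypothetical helper, not part of the library — in practice the SentencePiece tokenizer does this via `tokenizer(text_a, text_b)`), the assembly can be sketched as:

```python
def build_input(tokens_a, tokens_b=None):
    """Assemble the [CLS] A [SEP] (B [SEP]) layout used during pretraining.

    Illustrative helper only: returns the token sequence together with the
    segment ids (0 for sentence A and its separators, 1 for sentence B).
    """
    tokens = ["[CLS]", *tokens_a, "[SEP]"]
    segment_ids = [0] * len(tokens)
    if tokens_b is not None:
        tokens += [*tokens_b, "[SEP]"]
        segment_ids += [1] * (len(tokens_b) + 1)
    return tokens, segment_ids

tokens, segments = build_input(["hello", "world"], ["how", "are", "you"])
# tokens   -> ['[CLS]', 'hello', 'world', '[SEP]', 'how', 'are', 'you', '[SEP]']
# segments -> [0, 0, 0, 0, 1, 1, 1, 1]
```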
### Training
The ALBERT procedure follows the BERT setup.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
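The corruption rules above can be sketched in a few lines of Python. This is a hedged illustration of dynamic masking (`mask_tokens` is a hypothetical helper operating on token strings; the actual pretraining code works on SentencePiece ids):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """Apply BERT/ALBERT-style masking to a token list.

    15% of positions are selected; of those, 80% become [MASK], 10% become
    a random vocabulary token, and 10% are left unchanged. Returns the
    corrupted tokens plus the prediction targets (None = not selected).
    """
    rng = rng or random.Random()
    corrupted, targets = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue
        targets[i] = tok  # the model must predict the original token here
        r = rng.random()
        if r < 0.8:
            corrupted[i] = "[MASK]"         # 80% of selected positions
        elif r < 0.9:
            corrupted[i] = rng.choice(vocab)  # 10%: random replacement
        # else: leave the token as is        # remaining 10%
    return corrupted, targets
```

Because the selection is resampled on each call, repeated epochs see different maskings of the same sentence.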
## Evaluation results
When fine-tuned on downstream tasks, the ALBERT models achieve the following results:
| | Average | SQuAD1.1 | SQuAD2.0 | MNLI | SST-2 | RACE |
|----------------|---------|-----------|-----------|------|-------|------|
| **V2** | | | | | | |
| ALBERT-base | 82.3 | 90.2/83.2 | 82.1/79.3 | 84.6 | 92.9 | 66.8 |
| ALBERT-large | 85.7 | 91.8/85.2 | 84.9/81.8 | 86.5 | 94.9 | 75.2 |
| ALBERT-xlarge | 87.9 | 92.9/86.4 | 87.9/84.1 | 87.9 | 95.4 | 80.7 |
| ALBERT-xxlarge | 90.9 | 94.6/89.1 | 89.8/86.9 | 90.6 | 96.8 | 86.8 |
| **V1** | | | | | | |
| ALBERT-base | 80.1 | 89.3/82.3 | 80.0/77.1 | 81.6 | 90.3 | 64.0 |
| ALBERT-large | 82.4 | 90.6/83.9 | 82.3/79.4 | 83.5 | 91.7 | 68.5 |
| ALBERT-xlarge | 85.5 | 92.5/86.1 | 86.1/83.1 | 86.4 | 92.4 | 74.8 |
| ALBERT-xxlarge | 91.0 | 94.8/89.3 | 90.2/87.4 | 90.8 | 96.9 | 86.5 |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1909-11942,
author = {Zhenzhong Lan and
Mingda Chen and
Sebastian Goodman and
Kevin Gimpel and
Piyush Sharma and
Radu Soricut},
title = {{ALBERT:} {A} Lite {BERT} for Self-supervised Learning of Language
Representations},
journal = {CoRR},
volume = {abs/1909.11942},
year = {2019},
url = {http://arxiv.org/abs/1909.11942},
archivePrefix = {arXiv},
eprint = {1909.11942},
timestamp = {Fri, 27 Sep 2019 13:04:21 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1909-11942.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 9,719 | [
[
-0.006855010986328125,
-0.03765869140625,
0.0141448974609375,
0.025909423828125,
-0.035369873046875,
0.001735687255859375,
0.006717681884765625,
-0.01329803466796875,
0.0234832763671875,
0.045806884765625,
-0.03759765625,
-0.035125732421875,
-0.061187744140625,
0.00803375244140625,
-0.035919189453125,
0.08770751953125,
0.00470733642578125,
0.0269317626953125,
-0.000995635986328125,
0.00963592529296875,
-0.0293426513671875,
-0.046417236328125,
-0.060302734375,
-0.022918701171875,
0.033050537109375,
0.0259857177734375,
0.0447998046875,
0.052703857421875,
0.04046630859375,
0.0291290283203125,
-0.0010671615600585938,
-0.01337432861328125,
-0.0237579345703125,
0.0036869049072265625,
-0.00310516357421875,
-0.0506591796875,
-0.034454345703125,
0.00653076171875,
0.046295166015625,
0.06109619140625,
-0.002899169921875,
0.024261474609375,
-0.007587432861328125,
0.04278564453125,
-0.0266876220703125,
0.0224609375,
-0.0288848876953125,
0.0069427490234375,
-0.019500732421875,
0.006542205810546875,
-0.0231781005859375,
-0.00897979736328125,
0.0076904296875,
-0.0474853515625,
0.01641845703125,
0.0249176025390625,
0.08477783203125,
0.0069427490234375,
-0.01556396484375,
-0.0114593505859375,
-0.04119873046875,
0.06304931640625,
-0.048828125,
0.014862060546875,
0.0399169921875,
0.02032470703125,
-0.0015411376953125,
-0.07440185546875,
-0.0236663818359375,
-0.00514984130859375,
-0.0174560546875,
-0.0005488395690917969,
-0.0017194747924804688,
-0.010223388671875,
0.032440185546875,
0.0305023193359375,
-0.034759521484375,
0.00018644332885742188,
-0.05572509765625,
-0.0237579345703125,
0.050079345703125,
0.0174560546875,
0.0198822021484375,
-0.020477294921875,
-0.02532958984375,
-0.0237579345703125,
-0.0282135009765625,
0.008209228515625,
0.039337158203125,
0.0250091552734375,
-0.0186004638671875,
0.052581787109375,
-0.0270843505859375,
0.0350341796875,
-0.002704620361328125,
-0.0007233619689941406,
0.0367431640625,
-0.0044708251953125,
-0.032135009765625,
-0.00047016143798828125,
0.08050537109375,
0.0206756591796875,
0.0266876220703125,
-0.0058746337890625,
-0.033233642578125,
-0.0034275054931640625,
0.0189666748046875,
-0.054443359375,
-0.027069091796875,
0.011444091796875,
-0.0357666015625,
-0.03167724609375,
0.0362548828125,
-0.049407958984375,
-0.00408172607421875,
-0.006175994873046875,
0.035858154296875,
-0.0207977294921875,
-0.016815185546875,
0.01256561279296875,
-0.032012939453125,
0.00847625732421875,
0.0074615478515625,
-0.0712890625,
0.016265869140625,
0.045562744140625,
0.0643310546875,
0.02703857421875,
-0.01421356201171875,
-0.029205322265625,
-0.0086212158203125,
-0.0284271240234375,
0.037567138671875,
-0.024200439453125,
-0.039337158203125,
0.0038356781005859375,
0.0162811279296875,
-0.00167083740234375,
-0.0289764404296875,
0.04473876953125,
-0.046112060546875,
0.03839111328125,
-0.003009796142578125,
-0.0304718017578125,
-0.0151824951171875,
0.002506256103515625,
-0.055328369140625,
0.08306884765625,
0.03082275390625,
-0.050079345703125,
0.020904541015625,
-0.0673828125,
-0.040771484375,
0.0182647705078125,
0.00714874267578125,
-0.037200927734375,
0.0098876953125,
0.00852203369140625,
0.0275115966796875,
-0.00968170166015625,
0.01558685302734375,
-0.01461029052734375,
-0.031494140625,
0.02398681640625,
-0.01349639892578125,
0.0810546875,
0.0166168212890625,
-0.0196533203125,
0.0039825439453125,
-0.06353759765625,
-0.0032501220703125,
0.021087646484375,
-0.0224761962890625,
-0.0181884765625,
-0.01953125,
0.02423095703125,
0.00982666015625,
0.03125,
-0.039093017578125,
0.0187835693359375,
-0.0452880859375,
0.036956787109375,
0.0552978515625,
-0.006038665771484375,
0.0306549072265625,
-0.03094482421875,
0.04248046875,
0.0031147003173828125,
-0.007389068603515625,
-0.019073486328125,
-0.046875,
-0.06585693359375,
-0.0221099853515625,
0.0386962890625,
0.05718994140625,
-0.03411865234375,
0.0478515625,
-0.00841522216796875,
-0.0465087890625,
-0.048614501953125,
-0.002819061279296875,
0.031524658203125,
0.03118896484375,
0.025054931640625,
-0.032073974609375,
-0.065185546875,
-0.06463623046875,
-0.0225677490234375,
-0.01537322998046875,
-0.0253143310546875,
-0.0007944107055664062,
0.0599365234375,
-0.0241241455078125,
0.056060791015625,
-0.05078125,
-0.0279388427734375,
-0.0094757080078125,
0.0253448486328125,
0.036529541015625,
0.0582275390625,
0.0271453857421875,
-0.04620361328125,
-0.033660888671875,
-0.0214385986328125,
-0.05670166015625,
-0.0022449493408203125,
-0.0040435791015625,
-0.0193939208984375,
-0.00038051605224609375,
0.03594970703125,
-0.056427001953125,
0.03955078125,
0.0159454345703125,
-0.03912353515625,
0.04852294921875,
-0.018463134765625,
0.00392913818359375,
-0.0902099609375,
0.01323699951171875,
-0.006977081298828125,
-0.0228118896484375,
-0.05474853515625,
-0.0017652511596679688,
-0.00774383544921875,
-0.0004558563232421875,
-0.0439453125,
0.042449951171875,
-0.039703369140625,
0.0017747879028320312,
-0.005252838134765625,
-0.01123046875,
0.0131988525390625,
0.031494140625,
0.0006475448608398438,
0.04345703125,
0.049530029296875,
-0.043670654296875,
0.042694091796875,
0.034820556640625,
-0.046905517578125,
0.0196075439453125,
-0.0643310546875,
0.0208587646484375,
-0.0024566650390625,
-0.005619049072265625,
-0.07861328125,
-0.024566650390625,
0.0284576416015625,
-0.03948974609375,
0.0270843505859375,
-0.00433349609375,
-0.05157470703125,
-0.0372314453125,
-0.0165863037109375,
0.038787841796875,
0.040252685546875,
-0.0165557861328125,
0.03216552734375,
0.0249176025390625,
-0.0100860595703125,
-0.047698974609375,
-0.056671142578125,
0.00801849365234375,
-0.0186614990234375,
-0.039642333984375,
0.024322509765625,
-0.0008530616760253906,
-0.0196990966796875,
-0.019775390625,
0.0070037841796875,
-0.00402069091796875,
0.00652313232421875,
0.021636962890625,
0.0362548828125,
-0.02099609375,
-0.0196990966796875,
-0.006504058837890625,
-0.0151519775390625,
0.0246429443359375,
-0.0031299591064453125,
0.053741455078125,
-0.0016794204711914062,
-0.00868988037109375,
-0.034912109375,
0.033447265625,
0.047119140625,
-0.0055694580078125,
0.060394287109375,
0.05975341796875,
-0.0430908203125,
0.0063018798828125,
-0.020050048828125,
-0.014129638671875,
-0.03948974609375,
0.047637939453125,
-0.04180908203125,
-0.0626220703125,
0.054473876953125,
0.02081298828125,
-0.01250457763671875,
0.053375244140625,
0.047515869140625,
-0.0132598876953125,
0.08892822265625,
0.03326416015625,
-0.004962921142578125,
0.0350341796875,
-0.018463134765625,
0.0243072509765625,
-0.06744384765625,
-0.0390625,
-0.035491943359375,
-0.01477813720703125,
-0.03277587890625,
-0.007965087890625,
0.014801025390625,
0.0306396484375,
-0.039154052734375,
0.04931640625,
-0.045196533203125,
0.028289794921875,
0.06842041015625,
0.01189422607421875,
-0.01035308837890625,
-0.0175323486328125,
-0.0043792724609375,
0.0018415451049804688,
-0.0294647216796875,
-0.03875732421875,
0.07818603515625,
0.0443115234375,
0.05279541015625,
0.00707244873046875,
0.0452880859375,
0.0186614990234375,
0.01267242431640625,
-0.041748046875,
0.04339599609375,
-0.006465911865234375,
-0.06671142578125,
-0.02398681640625,
-0.011932373046875,
-0.0753173828125,
0.01235198974609375,
-0.0193939208984375,
-0.0704345703125,
-0.00836181640625,
-0.0100250244140625,
-0.02972412109375,
0.008056640625,
-0.052734375,
0.08245849609375,
-0.024322509765625,
-0.01031494140625,
0.0139617919921875,
-0.0665283203125,
0.0226593017578125,
0.00514984130859375,
0.0140533447265625,
-0.0113525390625,
0.01160430908203125,
0.087646484375,
-0.034576416015625,
0.06146240234375,
-0.010894775390625,
0.01629638671875,
0.00415802001953125,
0.005096435546875,
0.025421142578125,
0.01080322265625,
0.00984954833984375,
0.03155517578125,
0.00698089599609375,
-0.03240966796875,
-0.019012451171875,
0.032928466796875,
-0.065673828125,
-0.04119873046875,
-0.04656982421875,
-0.041259765625,
0.0157623291015625,
0.034759521484375,
0.046722412109375,
0.037109375,
-0.012786865234375,
0.01407623291015625,
0.0278472900390625,
-0.012603759765625,
0.0479736328125,
0.03009033203125,
-0.0246124267578125,
-0.042510986328125,
0.047088623046875,
0.0018205642700195312,
0.0002123117446899414,
0.042755126953125,
0.0074310302734375,
-0.04266357421875,
-0.017364501953125,
-0.03497314453125,
0.01580810546875,
-0.047576904296875,
-0.0263519287109375,
-0.04852294921875,
-0.0287322998046875,
-0.0521240234375,
-0.00630950927734375,
-0.0172882080078125,
-0.02679443359375,
-0.0537109375,
-0.01351165771484375,
0.02398681640625,
0.05108642578125,
-0.0074005126953125,
0.0460205078125,
-0.0589599609375,
0.0191650390625,
0.0181884765625,
0.03167724609375,
-0.0219268798828125,
-0.0599365234375,
-0.0382080078125,
0.0026378631591796875,
-0.01415252685546875,
-0.06170654296875,
0.050628662109375,
0.00836181640625,
0.030853271484375,
0.047821044921875,
-0.0062713623046875,
0.04522705078125,
-0.044647216796875,
0.0699462890625,
0.0214080810546875,
-0.07818603515625,
0.043304443359375,
-0.023040771484375,
0.016265869140625,
0.0299224853515625,
0.0186004638671875,
-0.0396728515625,
-0.0272979736328125,
-0.060516357421875,
-0.073974609375,
0.0704345703125,
0.02001953125,
0.022674560546875,
0.0025787353515625,
0.0164642333984375,
0.005870819091796875,
0.0325927734375,
-0.065185546875,
-0.04901123046875,
-0.034210205078125,
-0.0215911865234375,
-0.00913238525390625,
-0.0247039794921875,
-0.0091552734375,
-0.0352783203125,
0.0579833984375,
0.0182952880859375,
0.0411376953125,
-0.0012464523315429688,
-0.0088043212890625,
-0.005382537841796875,
0.0119781494140625,
0.0596923828125,
0.036956787109375,
-0.0369873046875,
0.00431060791015625,
0.0015268325805664062,
-0.04180908203125,
0.00511932373046875,
0.010650634765625,
-0.003711700439453125,
0.017608642578125,
0.0428466796875,
0.0753173828125,
0.01062774658203125,
-0.03753662109375,
0.042694091796875,
0.0092620849609375,
-0.018463134765625,
-0.046844482421875,
0.01366424560546875,
-0.006717681884765625,
0.00885009765625,
0.034393310546875,
0.015625,
0.0114593505859375,
-0.033782958984375,
0.0271759033203125,
0.0281524658203125,
-0.0362548828125,
-0.0218963623046875,
0.07598876953125,
0.00141143798828125,
-0.06451416015625,
0.055633544921875,
-0.0154876708984375,
-0.049835205078125,
0.051361083984375,
0.051605224609375,
0.06903076171875,
-0.0199737548828125,
0.01197052001953125,
0.0430908203125,
0.023529052734375,
-0.0261688232421875,
0.01666259765625,
0.021148681640625,
-0.062255859375,
-0.0227508544921875,
-0.058013916015625,
-0.0169677734375,
0.01947021484375,
-0.06475830078125,
0.0285186767578125,
-0.037200927734375,
-0.01001739501953125,
0.0094146728515625,
-0.006011962890625,
-0.056427001953125,
0.038482666015625,
0.007740020751953125,
0.0728759765625,
-0.07696533203125,
0.0655517578125,
0.059356689453125,
-0.051239013671875,
-0.06512451171875,
-0.03271484375,
-0.0225067138671875,
-0.0830078125,
0.05279541015625,
0.032684326171875,
0.0232696533203125,
0.003612518310546875,
-0.0465087890625,
-0.05743408203125,
0.06842041015625,
0.01346588134765625,
-0.037933349609375,
-0.01320648193359375,
0.01557159423828125,
0.038299560546875,
-0.0439453125,
0.047821044921875,
0.042510986328125,
0.02960205078125,
-0.0037841796875,
-0.05657958984375,
0.0003180503845214844,
-0.036865234375,
0.0065155029296875,
0.0060882568359375,
-0.033599853515625,
0.08172607421875,
-0.005840301513671875,
-0.0006303787231445312,
0.0169677734375,
0.050689697265625,
0.00522613525390625,
0.0180206298828125,
0.03448486328125,
0.048828125,
0.046966552734375,
-0.021331787109375,
0.06304931640625,
-0.0162811279296875,
0.045013427734375,
0.0601806640625,
0.00251007080078125,
0.052520751953125,
0.0303955078125,
-0.0243682861328125,
0.0728759765625,
0.060150146484375,
-0.021514892578125,
0.05841064453125,
0.00843048095703125,
-0.007671356201171875,
-0.005451202392578125,
0.007476806640625,
-0.02032470703125,
0.03594970703125,
0.01568603515625,
-0.04217529296875,
0.00487518310546875,
-0.0007824897766113281,
0.0161895751953125,
-0.01500701904296875,
-0.035308837890625,
0.0521240234375,
0.017608642578125,
-0.052001953125,
0.02520751953125,
0.018463134765625,
0.04345703125,
-0.03900146484375,
-0.00106048583984375,
-0.00566864013671875,
0.015411376953125,
-0.00992584228515625,
-0.061859130859375,
0.0170135498046875,
-0.01033782958984375,
-0.031158447265625,
-0.0239105224609375,
0.043212890625,
-0.037841796875,
-0.053680419921875,
-0.0008134841918945312,
0.0196685791015625,
0.0257415771484375,
-0.01222991943359375,
-0.060791015625,
-0.0120697021484375,
0.0023021697998046875,
-0.023651123046875,
0.0167694091796875,
0.02362060546875,
0.01043701171875,
0.0400390625,
0.054443359375,
-0.007904052734375,
-0.0016193389892578125,
0.004177093505859375,
0.051849365234375,
-0.0675048828125,
-0.061553955078125,
-0.0787353515625,
0.057830810546875,
-0.008087158203125,
-0.042236328125,
0.04937744140625,
0.06146240234375,
0.062469482421875,
-0.0300750732421875,
0.04473876953125,
-0.01108551025390625,
0.044708251953125,
-0.02734375,
0.0579833984375,
-0.032135009765625,
0.005840301513671875,
-0.0248565673828125,
-0.06256103515625,
-0.0268096923828125,
0.06646728515625,
-0.00789642333984375,
0.00714874267578125,
0.05303955078125,
0.060394287109375,
0.00470733642578125,
-0.01033782958984375,
0.019012451171875,
0.0064239501953125,
0.00701141357421875,
0.031036376953125,
0.0467529296875,
-0.060791015625,
0.0294036865234375,
-0.020660400390625,
-0.006175994873046875,
-0.029327392578125,
-0.055633544921875,
-0.0799560546875,
-0.045013427734375,
-0.0246429443359375,
-0.05419921875,
-0.0193328857421875,
0.068603515625,
0.055023193359375,
-0.076171875,
-0.0202484130859375,
-0.00778961181640625,
0.006336212158203125,
-0.013519287109375,
-0.0211639404296875,
0.031890869140625,
-0.00733184814453125,
-0.0638427734375,
0.0163726806640625,
0.0035533905029296875,
0.005344390869140625,
-0.01523590087890625,
-0.005664825439453125,
-0.01910400390625,
-0.002292633056640625,
0.026092529296875,
0.00980377197265625,
-0.04962158203125,
-0.033843994140625,
0.006134033203125,
-0.022857666015625,
0.0138702392578125,
0.038238525390625,
-0.038330078125,
0.026885986328125,
0.0299224853515625,
0.020111083984375,
0.047332763671875,
0.0072021484375,
0.045379638671875,
-0.07269287109375,
0.021331787109375,
0.0191497802734375,
0.039398193359375,
0.033721923828125,
-0.03167724609375,
0.03216552734375,
0.04010009765625,
-0.041107177734375,
-0.06854248046875,
0.000732421875,
-0.07330322265625,
-0.006206512451171875,
0.0760498046875,
-0.008544921875,
-0.0258636474609375,
-0.006702423095703125,
-0.0269012451171875,
0.037322998046875,
-0.0296783447265625,
0.05181884765625,
0.057037353515625,
0.00655364990234375,
-0.0191192626953125,
-0.029693603515625,
0.03228759765625,
0.0291748046875,
-0.03143310546875,
-0.034637451171875,
0.002635955810546875,
0.033721923828125,
0.0235443115234375,
0.047271728515625,
-0.0000947713851928711,
0.011993408203125,
0.019775390625,
0.01953125,
-0.007335662841796875,
-0.0131988525390625,
-0.0179901123046875,
0.01099395751953125,
-0.007099151611328125,
-0.05126953125
]
] |
microsoft/deberta-base | 2022-09-26T08:50:43.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"deberta",
"deberta-v1",
"fill-mask",
"en",
"arxiv:2006.03654",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | microsoft | null | null | microsoft/deberta-base | 53 | 5,097,824 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- deberta-v1
- fill-mask
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
---
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention
[DeBERTa](https://arxiv.org/abs/2006.03654) improves on the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. Trained on 80GB of data, it outperforms BERT and RoBERTa on the majority of NLU tasks.
Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.
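As a rough illustration of the disentangled-attention idea (a toy NumPy sketch with random matrices and made-up sizes, not the actual implementation), the attention score combines content-to-content, content-to-position, and position-to-content terms:

```python
import numpy as np

rng = np.random.default_rng(0)
seq, d = 4, 8  # toy sequence length and head dimension

# Toy stand-ins: content hidden states and relative-position embeddings
H = rng.standard_normal((seq, d))
P = rng.standard_normal((2 * seq, d))

# Separate projections for content and relative-position representations
Wq_c, Wk_c = rng.standard_normal((2, d, d))
Wq_r, Wk_r = rng.standard_normal((2, d, d))

Qc, Kc = H @ Wq_c, H @ Wk_c
Qr, Kr = P @ Wq_r, P @ Wk_r

# Relative distance delta(i, j) = i - j, shifted and clipped into [0, 2*seq)
idx = np.clip(np.arange(seq)[:, None] - np.arange(seq)[None, :] + seq,
              0, 2 * seq - 1)

c2c = Qc @ Kc.T                                       # content-to-content
c2p = np.take_along_axis(Qc @ Kr.T, idx, axis=1)      # content-to-position
p2c = np.take_along_axis(Kc @ Qr.T, idx, axis=1).T    # position-to-content

# Sum of the three terms, scaled by sqrt(3d) as in the paper, then softmax
scores = (c2c + c2p + p2c) / np.sqrt(3 * d)
attn = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
```

Each row of `attn` is a probability distribution over the sequence positions; the key departure from vanilla self-attention is that position information enters through its own projections rather than being added into the content embeddings.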
#### Fine-tuning on NLU tasks
We present the dev results on the SQuAD 1.1/2.0 and MNLI tasks.
| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m |
|-------------------|-----------|-----------|--------|
| RoBERTa-base | 91.5/84.6 | 83.7/80.5 | 87.6 |
| XLNet-Large | -/- | -/80.2 | 86.8 |
| **DeBERTa-base** | 93.1/87.2 | 86.2/83.1 | 88.8 |
### Citation
If you find DeBERTa useful for your work, please cite the following paper:
```latex
@inproceedings{
he2021deberta,
title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
| 1,303 | [
[
-0.0241851806640625,
-0.04290771484375,
0.01727294921875,
0.03973388671875,
-0.0186767578125,
0.0206298828125,
-0.00824737548828125,
-0.04779052734375,
0.0112152099609375,
0.004024505615234375,
-0.043731689453125,
-0.02947998046875,
-0.0721435546875,
0.004657745361328125,
-0.0173797607421875,
0.047607421875,
0.0119171142578125,
0.021759033203125,
0.0020923614501953125,
-0.01209259033203125,
-0.036346435546875,
-0.04522705078125,
-0.053741455078125,
-0.02947998046875,
0.0239715576171875,
-0.00830841064453125,
0.0279998779296875,
-0.00011366605758666992,
0.049041748046875,
0.0236358642578125,
-0.03826904296875,
0.0271148681640625,
-0.04718017578125,
-0.0016450881958007812,
0.00395965576171875,
-0.01155853271484375,
-0.054931640625,
-0.0088348388671875,
0.04779052734375,
0.025970458984375,
0.0080718994140625,
0.01485443115234375,
0.0213623046875,
0.0780029296875,
-0.04071044921875,
0.0126800537109375,
-0.051483154296875,
-0.00333404541015625,
0.0146942138671875,
-0.000583648681640625,
-0.046600341796875,
-0.00738525390625,
0.01493072509765625,
-0.025909423828125,
0.0071258544921875,
-0.025299072265625,
0.08746337890625,
0.051116943359375,
-0.019073486328125,
-0.005764007568359375,
-0.041015625,
0.0797119140625,
-0.06640625,
0.035003662109375,
0.027313232421875,
0.0150299072265625,
-0.0240325927734375,
-0.03594970703125,
-0.031158447265625,
-0.01360321044921875,
-0.004863739013671875,
0.0295562744140625,
-0.05133056640625,
-0.0035533905029296875,
0.0252227783203125,
0.00981903076171875,
-0.0384521484375,
0.00861358642578125,
-0.0292205810546875,
0.0088348388671875,
0.0589599609375,
-0.0027179718017578125,
-0.0036106109619140625,
0.00634765625,
-0.036346435546875,
-0.0216522216796875,
-0.047088623046875,
0.0012865066528320312,
0.028411865234375,
-0.0086822509765625,
-0.00682830810546875,
0.001575469970703125,
-0.00986480712890625,
0.07171630859375,
0.006561279296875,
0.0264129638671875,
0.03851318359375,
-0.005222320556640625,
-0.026763916015625,
0.0067291259765625,
0.036346435546875,
0.01490020751953125,
-0.00934600830078125,
-0.02191162109375,
-0.0010042190551757812,
-0.00366973876953125,
0.015716552734375,
-0.05194091796875,
-0.03472900390625,
0.0290374755859375,
-0.04815673828125,
-0.00928497314453125,
0.00894927978515625,
-0.04962158203125,
-0.0034637451171875,
-0.033966064453125,
0.0221710205078125,
-0.042022705078125,
-0.0268707275390625,
0.02008056640625,
-0.0140228271484375,
0.0271759033203125,
0.038665771484375,
-0.07421875,
0.001739501953125,
0.0360107421875,
0.0462646484375,
-0.00966644287109375,
-0.007232666015625,
-0.032989501953125,
-0.00946044921875,
-0.0005936622619628906,
0.0252227783203125,
-0.00201416015625,
0.0174560546875,
-0.01491546630859375,
0.0023136138916015625,
-0.0123443603515625,
-0.022247314453125,
0.0279998779296875,
-0.0631103515625,
-0.01117706298828125,
-0.018951416015625,
-0.033599853515625,
-0.037841796875,
0.0191192626953125,
-0.057708740234375,
0.060699462890625,
0.01401519775390625,
-0.045684814453125,
0.0147705078125,
-0.05340576171875,
-0.0035495758056640625,
-0.007587432861328125,
0.0079803466796875,
-0.027374267578125,
-0.0010051727294921875,
0.0279998779296875,
0.03143310546875,
-0.0005702972412109375,
0.02130126953125,
-0.01195526123046875,
-0.038970947265625,
0.02783203125,
-0.032470703125,
0.1077880859375,
0.022674560546875,
-0.042083740234375,
-0.001251220703125,
-0.07269287109375,
0.00487518310546875,
0.01629638671875,
-0.031982421875,
-0.0150146484375,
0.0089111328125,
0.005077362060546875,
0.004215240478515625,
0.03338623046875,
-0.046600341796875,
0.0155487060546875,
-0.02423095703125,
0.051605224609375,
0.051910400390625,
-0.0240020751953125,
0.0197601318359375,
-0.0011587142944335938,
0.011505126953125,
0.0162811279296875,
0.0258636474609375,
0.0186309814453125,
-0.04937744140625,
-0.053558349609375,
-0.0484619140625,
0.049774169921875,
0.03961181640625,
-0.043121337890625,
0.057891845703125,
0.002025604248046875,
-0.03594970703125,
-0.06231689453125,
0.00591278076171875,
0.0236968994140625,
0.0174407958984375,
0.04925537109375,
0.0028400421142578125,
-0.0594482421875,
-0.0594482421875,
0.01018524169921875,
-0.0008358955383300781,
-0.00628662109375,
-0.0026645660400390625,
0.035400390625,
-0.0301513671875,
0.05877685546875,
-0.029693603515625,
-0.03857421875,
-0.0216522216796875,
0.01055908203125,
0.032470703125,
0.048309326171875,
0.068359375,
-0.061309814453125,
-0.03607177734375,
-0.0299835205078125,
-0.05230712890625,
0.0257415771484375,
0.0008602142333984375,
-0.0187225341796875,
0.037841796875,
0.0149383544921875,
-0.01947021484375,
0.036102294921875,
0.0552978515625,
-0.0227508544921875,
-0.0006833076477050781,
-0.021331787109375,
0.01000213623046875,
-0.08258056640625,
0.00632476806640625,
-0.00949859619140625,
-0.0159759521484375,
-0.0390625,
0.01299285888671875,
0.01381683349609375,
0.0236663818359375,
-0.0298004150390625,
0.00738525390625,
-0.044158935546875,
0.00360870361328125,
-0.01739501953125,
0.01303863525390625,
0.00725555419921875,
0.05938720703125,
0.01004791259765625,
0.0421142578125,
0.042724609375,
-0.031707763671875,
0.0247802734375,
0.047637939453125,
-0.0274200439453125,
-0.00445556640625,
-0.072998046875,
0.022857666015625,
-0.02197265625,
0.032684326171875,
-0.082275390625,
0.013336181640625,
0.01861572265625,
-0.0438232421875,
0.032257080078125,
-0.00907135009765625,
-0.04925537109375,
-0.0217742919921875,
-0.03155517578125,
0.0125732421875,
0.055450439453125,
-0.05963134765625,
0.01268768310546875,
0.0240325927734375,
0.03424072265625,
-0.060882568359375,
-0.0679931640625,
-0.005893707275390625,
-0.00656890869140625,
-0.035491943359375,
0.045440673828125,
-0.026947021484375,
-0.00807952880859375,
0.007129669189453125,
0.01361083984375,
-0.02532958984375,
0.01947021484375,
0.0078887939453125,
0.03125,
-0.0103607177734375,
0.00982666015625,
0.0074005126953125,
-0.0014553070068359375,
-0.0063629150390625,
0.0020999908447265625,
0.03277587890625,
-0.0231475830078125,
-0.005702972412109375,
-0.031036376953125,
0.0308074951171875,
0.0218658447265625,
-0.0283355712890625,
0.056640625,
0.076416015625,
-0.0274200439453125,
0.0012464523315429688,
-0.030670166015625,
-0.0270233154296875,
-0.032135009765625,
0.0185546875,
-0.010711669921875,
-0.0535888671875,
0.046051025390625,
0.0228729248046875,
0.0101165771484375,
0.043548583984375,
0.032470703125,
-0.0225372314453125,
0.0762939453125,
0.042724609375,
-0.0203094482421875,
0.045440673828125,
-0.05963134765625,
0.01377105712890625,
-0.09417724609375,
-0.01007843017578125,
-0.03167724609375,
-0.055267333984375,
-0.031341552734375,
-0.019256591796875,
0.01148223876953125,
0.0401611328125,
-0.0048828125,
0.052337646484375,
-0.08135986328125,
0.0249176025390625,
0.051910400390625,
0.038055419921875,
0.019744873046875,
0.00684356689453125,
0.035491943359375,
-0.00981903076171875,
-0.0556640625,
-0.0284881591796875,
0.072021484375,
0.0379638671875,
0.056549072265625,
0.040069580078125,
0.06268310546875,
0.0224761962890625,
-0.0098114013671875,
-0.03338623046875,
0.0390625,
-0.017303466796875,
-0.05377197265625,
-0.021728515625,
-0.018280029296875,
-0.08880615234375,
0.0167694091796875,
-0.01476287841796875,
-0.0804443359375,
0.039215087890625,
0.01042938232421875,
-0.029541015625,
0.0178375244140625,
-0.040008544921875,
0.04241943359375,
-0.0011196136474609375,
-0.0054931640625,
-0.020355224609375,
-0.047882080078125,
0.02008056640625,
0.0145111083984375,
-0.03936767578125,
-0.009490966796875,
0.01110076904296875,
0.058837890625,
0.0018310546875,
0.06671142578125,
-0.0182342529296875,
-0.0203857421875,
0.02783203125,
-0.01885986328125,
0.04083251953125,
0.029388427734375,
-0.00977325439453125,
0.04266357421875,
0.005039215087890625,
-0.0313720703125,
-0.0343017578125,
0.06097412109375,
-0.07147216796875,
-0.0268402099609375,
-0.04022216796875,
-0.049957275390625,
-0.0045318603515625,
-0.0007843971252441406,
0.031463623046875,
0.042938232421875,
-0.0017528533935546875,
0.0287628173828125,
0.082763671875,
-0.0012178421020507812,
0.05303955078125,
0.050018310546875,
0.021484375,
-0.016632080078125,
0.05712890625,
0.0052337646484375,
0.0018014907836914062,
0.05078125,
-0.02459716796875,
-0.0325927734375,
-0.059783935546875,
-0.04925537109375,
0.007213592529296875,
-0.0562744140625,
-0.03460693359375,
-0.06298828125,
-0.0285797119140625,
-0.03302001953125,
0.01274871826171875,
-0.0190582275390625,
-0.041534423828125,
-0.053070068359375,
0.01033782958984375,
0.05126953125,
0.04388427734375,
-0.00910186767578125,
0.00757598876953125,
-0.073974609375,
0.0157318115234375,
0.01183319091796875,
0.00429534912109375,
0.01538848876953125,
-0.03741455078125,
-0.023223876953125,
0.01251220703125,
-0.03448486328125,
-0.06805419921875,
0.0304412841796875,
0.005687713623046875,
0.060699462890625,
0.00007009506225585938,
0.0223388671875,
0.0421142578125,
-0.0234222412109375,
0.05584716796875,
0.006221771240234375,
-0.06915283203125,
0.0439453125,
-0.0156097412109375,
0.0298614501953125,
0.048370361328125,
0.0250244140625,
0.0220947265625,
-0.0274200439453125,
-0.049072265625,
-0.06866455078125,
0.067626953125,
0.03289794921875,
0.00408172607421875,
0.0008559226989746094,
-0.0007290840148925781,
-0.00731658935546875,
0.010101318359375,
-0.052490234375,
-0.0301971435546875,
-0.010833740234375,
-0.021728515625,
-0.0015316009521484375,
-0.02972412109375,
-0.00771331787109375,
-0.03472900390625,
0.0640869140625,
0.0047454833984375,
0.057037353515625,
0.04278564453125,
-0.02618408203125,
0.006885528564453125,
0.002216339111328125,
0.0733642578125,
0.059417724609375,
-0.050201416015625,
-0.00978851318359375,
0.021636962890625,
-0.032318115234375,
0.0004878044128417969,
0.0240020751953125,
-0.0014638900756835938,
0.0262451171875,
0.034698486328125,
0.0657958984375,
0.0013151168823242188,
-0.042388916015625,
0.0207977294921875,
-0.01282501220703125,
-0.0278167724609375,
-0.01861572265625,
-0.007068634033203125,
-0.004505157470703125,
0.04510498046875,
0.038299560546875,
0.0088348388671875,
0.027252197265625,
-0.0220184326171875,
0.007579803466796875,
0.0283050537109375,
-0.0305938720703125,
-0.018218994140625,
0.04278564453125,
0.020782470703125,
0.0036773681640625,
0.0439453125,
-0.0302276611328125,
-0.037841796875,
0.056793212890625,
0.043548583984375,
0.0640869140625,
-0.0077056884765625,
0.003528594970703125,
0.04473876953125,
0.02850341796875,
0.00936126708984375,
0.045928955078125,
-0.0070648193359375,
-0.03570556640625,
-0.029388427734375,
-0.0362548828125,
-0.01143646240234375,
0.024505615234375,
-0.059967041015625,
-0.00559234619140625,
-0.006931304931640625,
-0.012603759765625,
0.004058837890625,
0.007602691650390625,
-0.04827880859375,
-0.0059356689453125,
0.0005450248718261719,
0.06805419921875,
-0.0270843505859375,
0.0782470703125,
0.051513671875,
-0.029510498046875,
-0.039520263671875,
-0.006500244140625,
-0.027374267578125,
-0.043731689453125,
0.07904052734375,
0.0011377334594726562,
-0.005802154541015625,
0.012481689453125,
-0.01751708984375,
-0.0645751953125,
0.09149169921875,
0.036102294921875,
-0.07037353515625,
0.0076904296875,
-0.017913818359375,
0.040740966796875,
-0.01235198974609375,
0.01016998291015625,
0.0267486572265625,
0.0288543701171875,
-0.00653839111328125,
-0.052520751953125,
0.00225067138671875,
-0.0119476318359375,
0.0111236572265625,
0.00521087646484375,
-0.057342529296875,
0.06787109375,
-0.01248931884765625,
0.0018663406372070312,
0.01215362548828125,
0.052947998046875,
0.003879547119140625,
0.017364501953125,
0.030487060546875,
0.048187255859375,
0.03961181640625,
-0.0200042724609375,
0.048797607421875,
-0.0142364501953125,
0.048004150390625,
0.086181640625,
0.00370025634765625,
0.07275390625,
0.035491943359375,
-0.026275634765625,
0.034271240234375,
0.04193115234375,
-0.017242431640625,
0.06475830078125,
0.0158843994140625,
0.0014476776123046875,
-0.00628662109375,
0.026885986328125,
-0.048553466796875,
0.03564453125,
0.0091552734375,
-0.037109375,
-0.0209503173828125,
0.0154266357421875,
-0.0160369873046875,
-0.003101348876953125,
-0.01250457763671875,
0.06744384765625,
-0.00626373291015625,
-0.037933349609375,
0.0872802734375,
-0.0160675048828125,
0.054595947265625,
-0.042999267578125,
-0.021942138671875,
-0.00939178466796875,
0.037109375,
-0.00952911376953125,
-0.0460205078125,
0.021484375,
0.0036373138427734375,
-0.031524658203125,
-0.0002884864807128906,
0.05072021484375,
-0.034912109375,
-0.032989501953125,
0.038177490234375,
0.00958251953125,
0.01157379150390625,
-0.0006318092346191406,
-0.0731201171875,
0.0347900390625,
0.0096435546875,
-0.0183868408203125,
0.0256500244140625,
0.005046844482421875,
0.0210418701171875,
0.02935791015625,
0.03033447265625,
-0.0283203125,
0.0125274658203125,
0.00017404556274414062,
0.06683349609375,
-0.0260772705078125,
-0.021697998046875,
-0.0628662109375,
0.04205322265625,
-0.0229644775390625,
-0.0208587646484375,
0.06341552734375,
0.0236358642578125,
0.05316162109375,
-0.03094482421875,
0.025909423828125,
-0.01293182373046875,
0.01273345947265625,
-0.04156494140625,
0.057830810546875,
-0.040435791015625,
0.005939483642578125,
-0.018402099609375,
-0.069091796875,
-0.0187835693359375,
0.05419921875,
0.00945281982421875,
-0.0065155029296875,
0.029205322265625,
0.0416259765625,
-0.0010328292846679688,
-0.011016845703125,
0.0214996337890625,
0.004611968994140625,
0.02679443359375,
0.06793212890625,
0.051910400390625,
-0.07470703125,
0.03179931640625,
-0.0168609619140625,
-0.0256195068359375,
-0.036590576171875,
-0.0640869140625,
-0.08624267578125,
-0.060821533203125,
-0.045013427734375,
-0.027740478515625,
0.01338958740234375,
0.05474853515625,
0.06494140625,
-0.06689453125,
0.019287109375,
-0.017425537109375,
0.0099334716796875,
-0.054229736328125,
-0.01287078857421875,
0.043548583984375,
-0.034393310546875,
-0.08978271484375,
0.042236328125,
-0.001953125,
0.0180511474609375,
-0.0233306884765625,
-0.0189971923828125,
-0.0406494140625,
0.0013608932495117188,
0.05316162109375,
0.0037326812744140625,
-0.05712890625,
0.004985809326171875,
0.002239227294921875,
-0.00844573974609375,
0.0001569986343383789,
0.0204010009765625,
-0.05474853515625,
0.0237884521484375,
0.05389404296875,
0.03704833984375,
0.0345458984375,
-0.0202178955078125,
0.018280029296875,
-0.05181884765625,
0.0347900390625,
0.020660400390625,
0.026702880859375,
0.0098876953125,
-0.044036865234375,
0.052032470703125,
-0.0204620361328125,
-0.04620361328125,
-0.0731201171875,
0.003803253173828125,
-0.11199951171875,
-0.0217437744140625,
0.066162109375,
-0.03851318359375,
-0.014404296875,
0.002910614013671875,
-0.02337646484375,
0.01727294921875,
-0.04205322265625,
0.06793212890625,
0.0400390625,
0.00104522705078125,
-0.00389862060546875,
-0.035888671875,
0.0472412109375,
0.038970947265625,
-0.043670654296875,
-0.0040740966796875,
0.0106658935546875,
-0.00418853759765625,
0.0330810546875,
0.02691650390625,
-0.0029926300048828125,
0.027374267578125,
-0.0129547119140625,
-0.01018524169921875,
-0.0284423828125,
-0.0294189453125,
-0.033843994140625,
-0.0187530517578125,
-0.0042266845703125,
-0.0611572265625
]
] |
distilbert-base-multilingual-cased | 2023-04-06T13:40:24.000Z | [
"transformers",
"pytorch",
"tf",
"onnx",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"af",
"sq",
"ar",
"an",
"hy",
"ast",
"az",
"ba",
"eu",
"bar",
"be",
"bn",
"inc",
"bs",
"br",
"bg",
"my",
"ca",
"ceb",
"ce",
"zh",
"cv",
"hr",
"cs",
"da",
"nl",
"en",
"et",
"fi",
"fr",
"gl",
"ka",
"de",
"el",
"gu",
"ht",
"he",
"hi",
"hu",
"is",
"io",
"id",
"ga",
"it",
"ja",
"jv",
"kn",
"kk",
"ky",
"ko",
"la",
"lv",
"lt",
"roa",
"nds",
"lm",
"mk",
"mg",
"ms",
"ml",
"mr",
"mn",
"min",
"ne",
"new",
"nb",
"nn",
"oc",
"fa",
"pms",
"pl",
"pt",
"pa",
"ro",
"ru",
"sco",
"sr",
"scn",
"sk",
"sl",
"aze",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"th",
"ta",
"tt",
"te",
"tr",
"uk",
"ud",
"uz",
"vi",
"vo",
"war",
"cy",
"fry",
"pnb",
"yo",
"dataset:wikipedia",
"arxiv:1910.01108",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | distilbert-base-multilingual-cased | 78 | 5,091,482 | transformers | 2022-03-02T23:29:04 | ---
language:
- multilingual
- af
- sq
- ar
- an
- hy
- ast
- az
- ba
- eu
- bar
- be
- bn
- inc
- bs
- br
- bg
- my
- ca
- ceb
- ce
- zh
- cv
- hr
- cs
- da
- nl
- en
- et
- fi
- fr
- gl
- ka
- de
- el
- gu
- ht
- he
- hi
- hu
- is
- io
- id
- ga
- it
- ja
- jv
- kn
- kk
- ky
- ko
- la
- lv
- lt
- roa
- nds
- lm
- mk
- mg
- ms
- ml
- mr
- mn
- min
- ne
- new
- nb
- nn
- oc
- fa
- pms
- pl
- pt
- pa
- ro
- ru
- sco
- sr
- scn
- sk
- sl
- aze
- es
- su
- sw
- sv
- tl
- tg
- th
- ta
- tt
- te
- tr
- uk
- ud
- uz
- vi
- vo
- war
- cy
- fry
- pnb
- yo
license: apache-2.0
datasets:
- wikipedia
---
# Model Card for DistilBERT base multilingual (cased)
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training Details](#training-details)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Citation](#citation)
8. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
This model is a distilled version of the [BERT base multilingual model](https://huggingface.co/bert-base-multilingual-cased/). The code for the distillation process can be found [here](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation). This model is cased: it does make a difference between english and English.
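The distillation objective combines several loss terms; as a minimal plain-Python sketch of just the soft-target component (illustrative logits and temperature, not the actual training code), the student is trained to match the teacher's temperature-softened output distribution:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target loss: cross-entropy of the teacher's softened
    distribution under the student, scaled by T^2 (one component of
    the distillation objective; values here are purely illustrative)."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -T * T * sum(pt * math.log(ps)
                        for pt, ps in zip(p_teacher, p_student))

loss = distillation_loss([1.2, 0.3, -0.8], [1.5, 0.1, -1.0])
```

The temperature `T > 1` flattens both distributions so the student also learns from the relative probabilities the teacher assigns to non-argmax classes.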
The model is trained on the concatenation of Wikipedia in 104 different languages listed [here](https://github.com/google-research/bert/blob/master/multilingual.md#list-of-languages).
The model has 6 layers, a hidden size of 768 and 12 attention heads, for a total of 134M parameters (compared to 177M parameters for mBERT-base).
On average, this model, referred to as DistilmBERT, is twice as fast as mBERT-base.
We encourage potential users of this model to check out the [BERT base multilingual model card](https://huggingface.co/bert-base-multilingual-cased) to learn more about usage, limitations and potential biases.
- **Developed by:** Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf (Hugging Face)
- **Model type:** Transformer-based language model
- **Language(s) (NLP):** 104 languages; see full list [here](https://github.com/google-research/bert/blob/master/multilingual.md#list-of-languages)
- **License:** Apache 2.0
- **Related Models:** [BERT base multilingual model](https://huggingface.co/bert-base-multilingual-cased)
- **Resources for more information:**
- [GitHub Repository](https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md)
- [Associated Paper](https://arxiv.org/abs/1910.01108)
# Uses
## Direct Use and Downstream Use
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT-2.
## Out of Scope Use
The model should not be used to intentionally create hostile or alienating environments for people. The model was not trained to be a factual or true representation of people or events, so using it to generate such content is out of scope for this model.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
# Training Details
- The model was pretrained with the supervision of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on the concatenation of Wikipedia in 104 different languages
- The model has 6 layers, a hidden size of 768 and 12 attention heads, for a total of 134M parameters.
- Further information about the training procedure and data is included in the [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) model card.
# Evaluation
The model developers report the following accuracy results for DistilmBERT (see [GitHub Repo](https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md)):
> Here are the results on the test sets for 6 of the languages available in XNLI. The results are computed in the zero shot setting (trained on the English portion and evaluated on the target language portion):
| Model | English | Spanish | Chinese | German | Arabic | Urdu |
| :---: | :---: | :---: | :---: | :---: | :---: | :---:|
| mBERT base cased (computed) | 82.1 | 74.6 | 69.1 | 72.3 | 66.4 | 58.5 |
| mBERT base uncased (reported)| 81.4 | 74.3 | 63.8 | 70.5 | 62.1 | 58.3 |
| DistilmBERT | 78.2 | 69.1 | 64.0 | 66.3 | 59.1 | 54.7 |
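Averaging the table above gives a quick sense of how much zero-shot XNLI accuracy the distilled model retains relative to the cased mBERT baseline (a simple arithmetic check on the reported numbers, nothing more):

```python
# XNLI zero-shot test accuracies from the table above
# (English, Spanish, Chinese, German, Arabic, Urdu)
mbert_cased = [82.1, 74.6, 69.1, 72.3, 66.4, 58.5]
distilmbert = [78.2, 69.1, 64.0, 66.3, 59.1, 54.7]

avg_mbert = sum(mbert_cased) / len(mbert_cased)
avg_distil = sum(distilmbert) / len(distilmbert)
retention = avg_distil / avg_mbert

print(f"mBERT avg: {avg_mbert:.1f}, DistilmBERT avg: {avg_distil:.1f}, "
      f"retention: {retention:.1%}")
# → mBERT avg: 70.5, DistilmBERT avg: 65.2, retention: 92.5%
```

So DistilmBERT keeps roughly 92-93% of the baseline's average zero-shot accuracy on these six languages while being about twice as fast.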
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Citation
```bibtex
@article{Sanh2019DistilBERTAD,
title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
journal={ArXiv},
year={2019},
volume={abs/1910.01108}
}
```
APA
- Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
# How to Get Started With the Model
You can use the model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='distilbert-base-multilingual-cased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'score': 0.040800247341394424,
'sequence': "Hello I'm a virtual model.",
'token': 37859,
'token_str': 'virtual'},
{'score': 0.020015988498926163,
'sequence': "Hello I'm a big model.",
'token': 22185,
'token_str': 'big'},
{'score': 0.018680453300476074,
'sequence': "Hello I'm a Hello model.",
'token': 31178,
'token_str': 'Hello'},
{'score': 0.017396586015820503,
'sequence': "Hello I'm a model model.",
'token': 13192,
'token_str': 'model'},
{'score': 0.014229810796678066,
'sequence': "Hello I'm a perfect model.",
'token': 43477,
'token_str': 'perfect'}]
```
| 7,316 | [
[
-0.0290985107421875,
-0.05462646484375,
0.0196075439453125,
0.0210723876953125,
-0.01438140869140625,
0.0031719207763671875,
-0.030853271484375,
-0.0269622802734375,
0.00433349609375,
0.0261383056640625,
-0.042877197265625,
-0.033599853515625,
-0.055633544921875,
0.0025615692138671875,
-0.0241546630859375,
0.0902099609375,
0.01119232177734375,
0.01085662841796875,
0.0071563720703125,
0.006381988525390625,
-0.0255279541015625,
-0.062347412109375,
-0.04345703125,
-0.0225982666015625,
0.0307159423828125,
0.009979248046875,
0.035430908203125,
0.02288818359375,
0.0217437744140625,
0.0263671875,
-0.0218658447265625,
-0.00832366943359375,
-0.0288543701171875,
-0.00943756103515625,
0.00885009765625,
-0.0227508544921875,
-0.0291290283203125,
-0.0017557144165039062,
0.042694091796875,
0.062347412109375,
-0.003360748291015625,
0.023101806640625,
0.0079345703125,
0.048553466796875,
-0.0132293701171875,
0.033294677734375,
-0.056488037109375,
-0.007190704345703125,
-0.01526641845703125,
0.03192138671875,
-0.040374755859375,
0.0006403923034667969,
0.013336181640625,
-0.0181427001953125,
0.01953125,
0.0004284381866455078,
0.0865478515625,
0.0117950439453125,
-0.022491455078125,
-0.0158538818359375,
-0.041473388671875,
0.07244873046875,
-0.07470703125,
0.033782958984375,
0.021942138671875,
0.0146331787109375,
-0.006389617919921875,
-0.056549072265625,
-0.0555419921875,
-0.017242431640625,
-0.022369384765625,
0.0128173828125,
-0.0186614990234375,
-0.00391387939453125,
0.0264129638671875,
0.030303955078125,
-0.043792724609375,
0.0022487640380859375,
-0.037322998046875,
-0.01447296142578125,
0.05413818359375,
-0.01221466064453125,
0.01506805419921875,
-0.031463623046875,
-0.0202789306640625,
-0.0260009765625,
-0.023712158203125,
0.006603240966796875,
0.040283203125,
0.047698974609375,
-0.02166748046875,
0.0380859375,
-0.002178192138671875,
0.047637939453125,
0.01120758056640625,
-0.0156402587890625,
0.0433349609375,
-0.02294921875,
-0.0153045654296875,
0.0052032470703125,
0.06707763671875,
0.009124755859375,
0.018157958984375,
0.0036525726318359375,
-0.00604248046875,
-0.00457000732421875,
-0.0005993843078613281,
-0.06378173828125,
-0.0239410400390625,
0.02655029296875,
-0.0302886962890625,
-0.0167083740234375,
0.000354766845703125,
-0.048309326171875,
0.004558563232421875,
-0.019439697265625,
0.020263671875,
-0.03985595703125,
-0.03607177734375,
0.00978851318359375,
-0.0111541748046875,
0.006954193115234375,
0.0005707740783691406,
-0.06561279296875,
0.01134490966796875,
0.032623291015625,
0.0579833984375,
-0.00162506103515625,
-0.020263671875,
-0.006908416748046875,
-0.0214385986328125,
0.0017137527465820312,
0.021942138671875,
-0.0150299072265625,
-0.0192108154296875,
-0.004123687744140625,
0.019775390625,
0.00525665283203125,
-0.0228424072265625,
0.05328369140625,
-0.02752685546875,
0.029876708984375,
-0.01544952392578125,
-0.0343017578125,
-0.029632568359375,
0.01050567626953125,
-0.059295654296875,
0.09564208984375,
0.0111846923828125,
-0.0614013671875,
0.024810791015625,
-0.047332763671875,
-0.045745849609375,
-0.00020885467529296875,
0.0106353759765625,
-0.044036865234375,
-0.0032672882080078125,
0.0177154541015625,
0.0380859375,
-0.004238128662109375,
0.047454833984375,
-0.00787353515625,
-0.006008148193359375,
-0.00485992431640625,
-0.040130615234375,
0.106201171875,
0.02899169921875,
-0.035858154296875,
-0.01097869873046875,
-0.06689453125,
0.00377655029296875,
0.016448974609375,
-0.032318115234375,
-0.0176544189453125,
-0.000583648681640625,
0.026275634765625,
0.0281982421875,
0.0223541259765625,
-0.03704833984375,
0.007625579833984375,
-0.024810791015625,
0.0419921875,
0.047821044921875,
-0.0239715576171875,
0.016815185546875,
-0.01390838623046875,
0.019927978515625,
0.0141143798828125,
0.01018524169921875,
-0.0148773193359375,
-0.056915283203125,
-0.07476806640625,
-0.0268707275390625,
0.042327880859375,
0.0491943359375,
-0.05413818359375,
0.056915283203125,
-0.024322509765625,
-0.050994873046875,
-0.047637939453125,
0.00850677490234375,
0.036956787109375,
0.037506103515625,
0.0264129638671875,
-0.0144500732421875,
-0.056121826171875,
-0.07037353515625,
0.01381683349609375,
-0.023284912109375,
0.0007472038269042969,
0.0099945068359375,
0.04815673828125,
-0.0206756591796875,
0.0635986328125,
-0.032440185546875,
-0.017364501953125,
-0.02874755859375,
0.01142120361328125,
0.04864501953125,
0.032318115234375,
0.05706787109375,
-0.05780029296875,
-0.0594482421875,
-0.0031452178955078125,
-0.046478271484375,
-0.00406646728515625,
-0.0014791488647460938,
-0.00696563720703125,
0.033935546875,
0.024627685546875,
-0.042205810546875,
0.00536346435546875,
0.06005859375,
-0.0110626220703125,
0.03485107421875,
-0.0217132568359375,
-0.00501251220703125,
-0.096923828125,
0.01464080810546875,
0.00698089599609375,
-0.0116729736328125,
-0.06201171875,
0.002872467041015625,
-0.000675201416015625,
-0.0016374588012695312,
-0.051666259765625,
0.040985107421875,
-0.037750244140625,
0.017730712890625,
-0.0014448165893554688,
-0.0067596435546875,
0.00988006591796875,
0.0611572265625,
0.0193023681640625,
0.041168212890625,
0.041351318359375,
-0.037445068359375,
0.01751708984375,
0.016815185546875,
-0.042694091796875,
0.0096893310546875,
-0.050628662109375,
0.0057373046875,
-0.01120758056640625,
0.01340484619140625,
-0.06890869140625,
0.0017032623291015625,
0.00001043081283569336,
-0.034088134765625,
0.044097900390625,
-0.0164642333984375,
-0.047607421875,
-0.041656494140625,
-0.01270294189453125,
0.00936126708984375,
0.054779052734375,
-0.03839111328125,
0.04046630859375,
0.027313232421875,
-0.0166168212890625,
-0.05548095703125,
-0.07330322265625,
0.00017189979553222656,
-0.0209503173828125,
-0.05303955078125,
0.036224365234375,
-0.015625,
-0.0157470703125,
-0.00811004638671875,
0.019134521484375,
-0.017059326171875,
0.004364013671875,
0.006343841552734375,
0.0258636474609375,
0.0008654594421386719,
-0.00014925003051757812,
0.00681304931640625,
0.00201416015625,
-0.00616455078125,
-0.01287841796875,
0.0526123046875,
-0.0190582275390625,
-0.0014858245849609375,
-0.0160064697265625,
0.0300750732421875,
0.038543701171875,
-0.005275726318359375,
0.06121826171875,
0.05133056640625,
-0.03717041015625,
0.0029239654541015625,
-0.041961669921875,
-0.01398468017578125,
-0.035003662109375,
0.045867919921875,
-0.034393310546875,
-0.057891845703125,
0.052642822265625,
0.0205230712890625,
0.01318359375,
0.0531005859375,
0.0594482421875,
-0.00614166259765625,
0.076416015625,
0.049835205078125,
-0.026580810546875,
0.032684326171875,
-0.02972412109375,
0.030487060546875,
-0.044097900390625,
-0.014984130859375,
-0.03936767578125,
-0.01245880126953125,
-0.060791015625,
-0.0213470458984375,
0.015838623046875,
0.01849365234375,
-0.018829345703125,
0.052825927734375,
-0.046600341796875,
0.01328277587890625,
0.061126708984375,
-0.0012731552124023438,
0.015899658203125,
0.00777435302734375,
-0.021575927734375,
-0.013031005859375,
-0.056640625,
-0.03802490234375,
0.0758056640625,
0.04052734375,
0.038818359375,
0.01329803466796875,
0.04815673828125,
0.01837158203125,
0.01152801513671875,
-0.039398193359375,
0.0296783447265625,
-0.029022216796875,
-0.080322265625,
-0.0234222412109375,
-0.0245208740234375,
-0.0623779296875,
0.0172119140625,
-0.005855560302734375,
-0.05206298828125,
0.0080413818359375,
-0.0007328987121582031,
-0.016998291015625,
0.0264129638671875,
-0.07684326171875,
0.06488037109375,
-0.040618896484375,
-0.01557159423828125,
0.00782012939453125,
-0.059906005859375,
0.029449462890625,
-0.01922607421875,
0.0261077880859375,
-0.01190185546875,
0.0364990234375,
0.044189453125,
-0.0369873046875,
0.076416015625,
-0.019439697265625,
-0.007572174072265625,
0.0222320556640625,
-0.020965576171875,
0.0303955078125,
-0.004650115966796875,
-0.01430511474609375,
0.06024169921875,
0.002887725830078125,
-0.01904296875,
-0.0162811279296875,
0.04925537109375,
-0.0592041015625,
-0.040374755859375,
-0.043304443359375,
-0.0380859375,
0.006053924560546875,
0.0245513916015625,
0.025482177734375,
0.005092620849609375,
-0.007236480712890625,
0.0086822509765625,
0.040863037109375,
-0.038543701171875,
0.041412353515625,
0.03985595703125,
-0.0193023681640625,
-0.01320648193359375,
0.059661865234375,
0.0146026611328125,
0.0189056396484375,
0.031951904296875,
0.01438140869140625,
-0.035369873046875,
-0.0294952392578125,
-0.040618896484375,
0.0145111083984375,
-0.049041748046875,
-0.01175689697265625,
-0.06158447265625,
-0.03753662109375,
-0.047576904296875,
0.01276397705078125,
-0.0287628173828125,
-0.033294677734375,
-0.0213775634765625,
-0.0162506103515625,
0.038604736328125,
0.0240631103515625,
-0.0082244873046875,
0.01549530029296875,
-0.04791259765625,
0.01397705078125,
0.0203704833984375,
0.021240234375,
-0.007587432861328125,
-0.0546875,
-0.0189971923828125,
0.0284576416015625,
-0.02362060546875,
-0.04827880859375,
0.041839599609375,
0.0268707275390625,
0.04595947265625,
0.02099609375,
0.00494384765625,
0.050140380859375,
-0.057861328125,
0.07293701171875,
0.0186920166015625,
-0.07623291015625,
0.041046142578125,
-0.00896453857421875,
0.0121612548828125,
0.035003662109375,
0.038787841796875,
-0.040069580078125,
-0.031890869140625,
-0.04827880859375,
-0.07489013671875,
0.061004638671875,
0.0265960693359375,
0.032958984375,
-0.01412200927734375,
0.01221466064453125,
0.01044464111328125,
0.0147705078125,
-0.08734130859375,
-0.046966552734375,
-0.0206451416015625,
-0.01389312744140625,
-0.00959014892578125,
-0.02276611328125,
0.01015472412109375,
-0.04638671875,
0.07373046875,
0.00473785400390625,
0.019927978515625,
0.004741668701171875,
-0.01319122314453125,
0.0206756591796875,
0.01055908203125,
0.041229248046875,
0.0181121826171875,
-0.037933349609375,
-0.0021381378173828125,
0.02105712890625,
-0.039703369140625,
0.0045166015625,
0.023529052734375,
-0.01236724853515625,
0.0269317626953125,
0.0233306884765625,
0.07452392578125,
-0.006389617919921875,
-0.053619384765625,
0.032867431640625,
0.0023345947265625,
-0.029449462890625,
-0.03240966796875,
-0.0136566162109375,
0.0160980224609375,
0.01389312744140625,
0.0225372314453125,
-0.01160430908203125,
0.01067352294921875,
-0.047576904296875,
0.01549530029296875,
0.0257720947265625,
-0.0273590087890625,
-0.0189056396484375,
0.0633544921875,
0.0145111083984375,
-0.010528564453125,
0.058990478515625,
-0.0267333984375,
-0.045928955078125,
0.050018310546875,
0.03131103515625,
0.05902099609375,
-0.008392333984375,
0.00911712646484375,
0.050018310546875,
0.03875732421875,
-0.00202178955078125,
0.01456451416015625,
0.01285552978515625,
-0.059173583984375,
-0.040313720703125,
-0.0623779296875,
-0.0019683837890625,
0.0283203125,
-0.04083251953125,
0.036163330078125,
-0.0157318115234375,
-0.02447509765625,
0.017547607421875,
0.01245880126953125,
-0.053131103515625,
0.01355743408203125,
0.023040771484375,
0.0621337890625,
-0.07470703125,
0.0885009765625,
0.04046630859375,
-0.050048828125,
-0.057220458984375,
-0.026214599609375,
-0.01506805419921875,
-0.054229736328125,
0.059478759765625,
0.01497650146484375,
0.023345947265625,
-0.0077667236328125,
-0.0252227783203125,
-0.06207275390625,
0.0709228515625,
0.034912109375,
-0.06488037109375,
0.0010509490966796875,
0.0224609375,
0.057159423828125,
-0.015289306640625,
0.035308837890625,
0.041778564453125,
0.041168212890625,
0.003902435302734375,
-0.0792236328125,
-0.0049591064453125,
-0.039520263671875,
0.01049041748046875,
0.0021190643310546875,
-0.0509033203125,
0.0775146484375,
-0.01398468017578125,
-0.01537322998046875,
-0.006999969482421875,
0.0305023193359375,
0.0159149169921875,
-0.00006753206253051758,
0.037933349609375,
0.04931640625,
0.046051025390625,
-0.0282440185546875,
0.07470703125,
-0.038543701171875,
0.044830322265625,
0.08538818359375,
-0.0223541259765625,
0.056884765625,
0.037567138671875,
-0.030517578125,
0.048431396484375,
0.053192138671875,
-0.016021728515625,
0.053558349609375,
0.0164337158203125,
-0.01053619384765625,
0.004108428955078125,
-0.0011997222900390625,
-0.0305023193359375,
0.0263214111328125,
0.013946533203125,
-0.03765869140625,
-0.005695343017578125,
0.00008147954940795898,
0.021484375,
-0.005519866943359375,
0.01007080078125,
0.0416259765625,
-0.0016908645629882812,
-0.05133056640625,
0.051239013671875,
0.0211944580078125,
0.0721435546875,
-0.045196533203125,
-0.00037479400634765625,
-0.01171112060546875,
0.013580322265625,
-0.005596160888671875,
-0.054656982421875,
0.022125244140625,
0.00827789306640625,
-0.035614013671875,
-0.0301055908203125,
0.03717041015625,
-0.049102783203125,
-0.07244873046875,
0.037017822265625,
0.035797119140625,
0.0205535888671875,
-0.00485992431640625,
-0.0732421875,
0.004306793212890625,
0.0203704833984375,
-0.019927978515625,
0.01058197021484375,
0.0181884765625,
-0.005718231201171875,
0.03521728515625,
0.054718017578125,
-0.006500244140625,
0.01441192626953125,
0.0217132568359375,
0.05914306640625,
-0.0305328369140625,
-0.0155181884765625,
-0.06427001953125,
0.061248779296875,
-0.0108489990234375,
-0.0172576904296875,
0.05511474609375,
0.056365966796875,
0.08123779296875,
-0.0155792236328125,
0.07611083984375,
-0.029541015625,
0.0322265625,
-0.02630615234375,
0.05743408203125,
-0.0426025390625,
0.01190948486328125,
-0.029693603515625,
-0.07476806640625,
-0.01739501953125,
0.051605224609375,
-0.005764007568359375,
0.01751708984375,
0.041900634765625,
0.052001953125,
0.001331329345703125,
-0.026641845703125,
0.0184478759765625,
0.0234527587890625,
0.025909423828125,
0.039703369140625,
0.0283203125,
-0.0521240234375,
0.03582763671875,
-0.049041748046875,
-0.0179290771484375,
-0.007045745849609375,
-0.08050537109375,
-0.07269287109375,
-0.055206298828125,
-0.0304718017578125,
-0.024993896484375,
-0.0096893310546875,
0.05279541015625,
0.063720703125,
-0.076904296875,
-0.0250244140625,
-0.005916595458984375,
-0.0012226104736328125,
-0.023284912109375,
-0.0194091796875,
0.0300140380859375,
-0.0191497802734375,
-0.0870361328125,
0.01274871826171875,
0.00104522705078125,
0.00878143310546875,
-0.027557373046875,
-0.013336181640625,
-0.039306640625,
-0.004241943359375,
0.0582275390625,
0.0038280487060546875,
-0.061492919921875,
-0.0123291015625,
0.010650634765625,
-0.0066680908203125,
0.0012063980102539062,
0.0208587646484375,
-0.0330810546875,
0.040679931640625,
0.0310516357421875,
0.02166748046875,
0.056640625,
-0.027679443359375,
0.0271148681640625,
-0.07427978515625,
0.0338134765625,
0.00604248046875,
0.046905517578125,
0.026885986328125,
-0.0300445556640625,
0.04156494140625,
0.01080322265625,
-0.026519775390625,
-0.0596923828125,
-0.0004413127899169922,
-0.08184814453125,
-0.0287017822265625,
0.08892822265625,
-0.0180511474609375,
-0.0105438232421875,
-0.003948211669921875,
-0.022979736328125,
0.02508544921875,
-0.026611328125,
0.058441162109375,
0.0767822265625,
0.01235198974609375,
-0.0093231201171875,
-0.043243408203125,
0.0201568603515625,
0.0257110595703125,
-0.0440673828125,
-0.006641387939453125,
0.022705078125,
0.0307769775390625,
0.0281829833984375,
0.03472900390625,
-0.00926971435546875,
-0.0148162841796875,
0.0005269050598144531,
0.036285400390625,
0.0014362335205078125,
-0.01128387451171875,
-0.016143798828125,
-0.0200347900390625,
-0.005340576171875,
-0.0057525634765625
]
] |
roberta-large | 2023-03-22T09:25:01.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"onnx",
"safetensors",
"roberta",
"fill-mask",
"exbert",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1907.11692",
"arxiv:1806.02847",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | roberta-large | 133 | 5,073,263 | transformers | 2022-03-02T23:29:04 | ---
language: en
tags:
- exbert
license: mit
datasets:
- bookcorpus
- wikipedia
---
# RoBERTa large model
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1907.11692) and first released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/roberta). This model is case-sensitive: it
makes a difference between english and English.
Disclaimer: The team releasing RoBERTa did not write a model card for this model, so this model card has been written by
the Hugging Face team.
## Model description
RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data), using an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the masked language modeling (MLM) objective. Taking a sentence, the model
randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict
the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one
after the other, and from autoregressive models like GPT, which internally mask the future tokens. This allows the model to
learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard
classifier using the features produced by the RoBERTa model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
See the [model hub](https://huggingface.co/models?filter=roberta) to look for fine-tuned versions on a task that
interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at a model like GPT-2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='roberta-large')
>>> unmasker("Hello I'm a <mask> model.")
[{'sequence': "<s>Hello I'm a male model.</s>",
'score': 0.3317350447177887,
'token': 2943,
'token_str': 'ฤ male'},
{'sequence': "<s>Hello I'm a fashion model.</s>",
'score': 0.14171843230724335,
'token': 2734,
'token_str': 'ฤ fashion'},
{'sequence': "<s>Hello I'm a professional model.</s>",
'score': 0.04291723668575287,
'token': 2038,
'token_str': 'ฤ professional'},
{'sequence': "<s>Hello I'm a freelance model.</s>",
'score': 0.02134818211197853,
'token': 18150,
'token_str': 'ฤ freelance'},
{'sequence': "<s>Hello I'm a young model.</s>",
'score': 0.021098261699080467,
'token': 664,
'token_str': 'ฤ young'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import RobertaTokenizer, RobertaModel
tokenizer = RobertaTokenizer.from_pretrained('roberta-large')
model = RobertaModel.from_pretrained('roberta-large')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import RobertaTokenizer, TFRobertaModel
tokenizer = RobertaTokenizer.from_pretrained('roberta-large')
model = TFRobertaModel.from_pretrained('roberta-large')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
The training data used for this model contains a lot of unfiltered content from the internet, which is far from
neutral. Therefore, the model can have biased predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='roberta-large')
>>> unmasker("The man worked as a <mask>.")
[{'sequence': '<s>The man worked as a mechanic.</s>',
'score': 0.08260300755500793,
'token': 25682,
'token_str': 'ฤ mechanic'},
{'sequence': '<s>The man worked as a driver.</s>',
'score': 0.05736079439520836,
'token': 1393,
'token_str': 'ฤ driver'},
{'sequence': '<s>The man worked as a teacher.</s>',
'score': 0.04709019884467125,
'token': 3254,
'token_str': 'ฤ teacher'},
{'sequence': '<s>The man worked as a bartender.</s>',
'score': 0.04641604796051979,
'token': 33080,
'token_str': 'ฤ bartender'},
{'sequence': '<s>The man worked as a waiter.</s>',
'score': 0.04239227622747421,
'token': 38233,
'token_str': 'ฤ waiter'}]
>>> unmasker("The woman worked as a <mask>.")
[{'sequence': '<s>The woman worked as a nurse.</s>',
'score': 0.2667474150657654,
'token': 9008,
'token_str': 'ฤ nurse'},
{'sequence': '<s>The woman worked as a waitress.</s>',
'score': 0.12280137836933136,
'token': 35698,
'token_str': 'ฤ waitress'},
{'sequence': '<s>The woman worked as a teacher.</s>',
'score': 0.09747499972581863,
'token': 3254,
'token_str': 'ฤ teacher'},
{'sequence': '<s>The woman worked as a secretary.</s>',
'score': 0.05783602222800255,
'token': 2971,
'token_str': 'ฤ secretary'},
{'sequence': '<s>The woman worked as a cleaner.</s>',
'score': 0.05576248839497566,
'token': 16126,
'token_str': 'ฤ cleaner'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The RoBERTa model was pretrained on the union of five datasets:
- [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books;
- [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers);
- [CC-News](https://commoncrawl.org/2016/10/news-dataset-available/), a dataset containing 63 million English news
articles crawled between September 2016 and February 2019;
- [OpenWebText](https://github.com/jcpeterson/openwebtext), an open-source recreation of the WebText dataset used to
train GPT-2;
- [Stories](https://arxiv.org/abs/1806.02847), a dataset containing a subset of CommonCrawl data filtered to match the
story-like style of Winograd schemas.
Together, these datasets weigh 160GB of text.
## Training procedure
### Preprocessing
The texts are tokenized using a byte-level version of Byte-Pair Encoding (BPE) and a vocabulary size of 50,000. The
model's inputs are pieces of 512 contiguous tokens that may span documents. The beginning of a new document is marked
with `<s>` and the end of one with `</s>`.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `<mask>`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (i.e., it changes at each epoch rather than being fixed).
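The 80/10/10 masking recipe above can be sketched in plain Python. This is an illustrative sketch, not the actual fairseq implementation; the `mask_id`, `vocab_size` parameters and the `-100` ignore-label convention are assumptions following common MLM practice.

```python
import random

def mask_tokens(token_ids, mask_id, vocab_size, mlm_prob=0.15, rng=None):
    """Apply the dynamic MLM masking rule: select ~15% of positions as
    prediction targets; of those, 80% become <mask>, 10% a random token,
    and 10% stay unchanged. Non-selected positions get label -100."""
    rng = rng or random.Random(0)
    inputs, labels = list(token_ids), []
    for i, tok in enumerate(token_ids):
        if rng.random() < mlm_prob:
            labels.append(tok)  # the model must predict the original token here
            r = rng.random()
            if r < 0.8:
                inputs[i] = mask_id                    # 80%: replace with <mask>
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)  # 10%: replace with a random token
            # remaining 10%: leave the token as is
        else:
            labels.append(-100)  # ignored by the loss
    return inputs, labels
```

Because the masking is resampled every time `mask_tokens` is called, the same sentence yields different masked variants across epochs — the dynamic behaviour the card describes.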
### Pretraining
The model was trained on 1024 V100 GPUs for 500K steps with a batch size of 8K and a sequence length of 512. The
optimizer used is Adam with a learning rate of 4e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.98\\) and
\\(\epsilon = 1e-6\\), a weight decay of 0.01, learning rate warmup for 30,000 steps and linear decay of the learning
rate after.
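The warmup-then-decay schedule can be written as a small function. This is a hedged sketch that assumes the learning rate decays linearly to zero by step 500K (the total number of pretraining steps); the exact end point of the decay is not stated in the card.

```python
def learning_rate(step, peak_lr=4e-4, warmup_steps=30_000, total_steps=500_000):
    """Linear warmup to peak_lr over warmup_steps, then linear decay to 0."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, the rate reaches 2e-4 halfway through warmup (step 15,000), peaks at 4e-4 at step 30,000, and falls back to 0 at step 500,000.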
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
GLUE test results:
| Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE |
|:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|
| | 90.2 | 92.2 | 94.7 | 96.4 | 68.0 | 96.4 | 90.9 | 86.6 |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1907-11692,
author = {Yinhan Liu and
Myle Ott and
Naman Goyal and
Jingfei Du and
Mandar Joshi and
Danqi Chen and
Omer Levy and
Mike Lewis and
Luke Zettlemoyer and
Veselin Stoyanov},
title = {RoBERTa: {A} Robustly Optimized {BERT} Pretraining Approach},
journal = {CoRR},
volume = {abs/1907.11692},
year = {2019},
url = {http://arxiv.org/abs/1907.11692},
archivePrefix = {arXiv},
eprint = {1907.11692},
timestamp = {Thu, 01 Aug 2019 08:59:33 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1907-11692.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=roberta-base">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 9,280 | [
[
-0.01348114013671875,
-0.058319091796875,
0.0187530517578125,
-0.0002536773681640625,
-0.0238037109375,
-0.0056610107421875,
-0.029541015625,
-0.027099609375,
0.0188140869140625,
0.033447265625,
-0.04193115234375,
-0.04119873046875,
-0.06573486328125,
0.006496429443359375,
-0.03240966796875,
0.09771728515625,
0.01018524169921875,
0.02001953125,
0.00014221668243408203,
0.0162200927734375,
-0.02105712890625,
-0.03961181640625,
-0.048553466796875,
-0.02459716796875,
0.0194244384765625,
-0.001995086669921875,
0.035125732421875,
0.0367431640625,
0.0258941650390625,
0.02752685546875,
-0.0216217041015625,
0.0003533363342285156,
-0.02777099609375,
0.0026092529296875,
-0.007320404052734375,
-0.03631591796875,
-0.0216827392578125,
0.0202789306640625,
0.03173828125,
0.043853759765625,
-0.00405120849609375,
0.0262908935546875,
0.01502227783203125,
0.03546142578125,
-0.007049560546875,
0.021240234375,
-0.043243408203125,
0.0014314651489257812,
-0.023468017578125,
0.0137481689453125,
-0.0257110595703125,
-0.016143798828125,
0.0143585205078125,
-0.032440185546875,
0.0290985107421875,
-0.003437042236328125,
0.10076904296875,
0.01273345947265625,
-0.0196075439453125,
-0.024505615234375,
-0.039947509765625,
0.071533203125,
-0.0634765625,
0.0136260986328125,
0.03173828125,
0.0102691650390625,
-0.0118865966796875,
-0.06781005859375,
-0.0445556640625,
-0.01025390625,
-0.011749267578125,
0.01107025146484375,
-0.0245513916015625,
-0.0133514404296875,
0.0214080810546875,
0.03271484375,
-0.049102783203125,
-0.006103515625,
-0.04534912109375,
-0.0192413330078125,
0.038665771484375,
-0.001232147216796875,
0.0188751220703125,
-0.0338134765625,
-0.03289794921875,
-0.0189056396484375,
-0.0235443115234375,
0.01418304443359375,
0.0400390625,
0.0262298583984375,
-0.0162506103515625,
0.04193115234375,
-0.0066680908203125,
0.054168701171875,
0.003063201904296875,
-0.0196380615234375,
0.03997802734375,
-0.01715087890625,
-0.0187530517578125,
-0.01422882080078125,
0.07489013671875,
0.0189971923828125,
0.0338134765625,
-0.0018682479858398438,
-0.015869140625,
0.01461029052734375,
0.0080413818359375,
-0.05322265625,
-0.0207977294921875,
0.0187835693359375,
-0.03509521484375,
-0.035675048828125,
0.01357269287109375,
-0.0599365234375,
-0.003711700439453125,
-0.01317596435546875,
0.040283203125,
-0.0295867919921875,
-0.01087188720703125,
0.0113983154296875,
-0.024017333984375,
0.0090789794921875,
0.0063629150390625,
-0.058990478515625,
0.00431060791015625,
0.035308837890625,
0.0662841796875,
0.0088653564453125,
-0.0157318115234375,
-0.0182647705078125,
-0.0069580078125,
-0.0025959014892578125,
0.033203125,
-0.0252532958984375,
-0.00255584716796875,
-0.005184173583984375,
0.019989013671875,
-0.01422882080078125,
-0.0209197998046875,
0.037841796875,
-0.0254669189453125,
0.05487060546875,
0.01593017578125,
-0.0340576171875,
-0.019866943359375,
0.01284027099609375,
-0.045654296875,
0.08892822265625,
0.0189971923828125,
-0.06488037109375,
0.0190582275390625,
-0.0555419921875,
-0.0322265625,
-0.01435089111328125,
0.0099945068359375,
-0.04803466796875,
-0.006153106689453125,
0.027313232421875,
0.037261962890625,
-0.02899169921875,
0.035675048828125,
0.0009374618530273438,
-0.0233154296875,
0.0267333984375,
-0.035186767578125,
0.10565185546875,
0.01329803466796875,
-0.041290283203125,
0.002933502197265625,
-0.056793212890625,
-0.0034618377685546875,
0.0291748046875,
-0.037384033203125,
-0.0052642822265625,
-0.0162811279296875,
0.017547607421875,
0.0217742919921875,
0.0185089111328125,
-0.040924072265625,
0.016021728515625,
-0.03765869140625,
0.056121826171875,
0.056549072265625,
-0.00934600830078125,
0.018341064453125,
-0.032257080078125,
0.040557861328125,
-0.003711700439453125,
0.01248931884765625,
-0.019805908203125,
-0.052703857421875,
-0.050079345703125,
-0.035919189453125,
0.0482177734375,
0.049285888671875,
-0.04754638671875,
0.04229736328125,
-0.013214111328125,
-0.040924072265625,
-0.061737060546875,
-0.0027332305908203125,
0.039581298828125,
0.04156494140625,
0.03314208984375,
-0.030914306640625,
-0.050506591796875,
-0.053955078125,
-0.02606201171875,
0.00586700439453125,
-0.0252227783203125,
0.0205078125,
0.047119140625,
-0.0170745849609375,
0.053985595703125,
-0.04412841796875,
-0.047119140625,
-0.0251922607421875,
0.00559234619140625,
0.04498291015625,
0.054168701171875,
0.0360107421875,
-0.044921875,
-0.036407470703125,
-0.02288818359375,
-0.05413818359375,
0.01143646240234375,
-0.0034027099609375,
-0.0031604766845703125,
0.029327392578125,
0.031463623046875,
-0.0611572265625,
0.03582763671875,
0.036590576171875,
-0.0264892578125,
0.045074462890625,
-0.020111083984375,
-0.0049591064453125,
-0.09967041015625,
0.01324462890625,
0.0032253265380859375,
-0.022430419921875,
-0.05322265625,
0.0038051605224609375,
-0.0163726806640625,
-0.01468658447265625,
-0.035980224609375,
0.03912353515625,
-0.045166015625,
0.0038394927978515625,
0.00662994384765625,
0.014892578125,
0.005184173583984375,
0.054168701171875,
0.00028586387634277344,
0.050201416015625,
0.044677734375,
-0.0243682861328125,
0.0181732177734375,
0.0241851806640625,
-0.042083740234375,
0.017822265625,
-0.05474853515625,
0.0176849365234375,
-0.007579803466796875,
0.012725830078125,
-0.078125,
-0.016021728515625,
0.0211639404296875,
-0.056732177734375,
0.0264129638671875,
-0.0242462158203125,
-0.04119873046875,
-0.048004150390625,
-0.0166168212890625,
0.01143646240234375,
0.049957275390625,
-0.03265380859375,
0.046661376953125,
0.0291900634765625,
-0.0045318603515625,
-0.05218505859375,
-0.05889892578125,
0.00688934326171875,
-0.01763916015625,
-0.05126953125,
0.03375244140625,
0.006023406982421875,
-0.00653839111328125,
-0.0057373046875,
0.005382537841796875,
-0.01348876953125,
0.005535125732421875,
0.015869140625,
0.030487060546875,
-0.00926971435546875,
-0.00791168212890625,
-0.0172119140625,
-0.01438140869140625,
0.006988525390625,
-0.036895751953125,
0.0670166015625,
-0.0085296630859375,
-0.0023708343505859375,
-0.039276123046875,
0.01268768310546875,
0.03369140625,
-0.022064208984375,
0.0618896484375,
0.07843017578125,
-0.026947021484375,
0.0010995864868164062,
-0.0301971435546875,
-0.0210723876953125,
-0.0341796875,
0.03875732421875,
-0.0246124267578125,
-0.06158447265625,
0.044830322265625,
0.0210113525390625,
-0.015960693359375,
0.054473876953125,
0.0401611328125,
-0.0100250244140625,
0.07220458984375,
0.02996826171875,
-0.01230621337890625,
0.040283203125,
-0.041412353515625,
0.0157928466796875,
-0.0592041015625,
-0.026275634765625,
-0.04046630859375,
-0.0176544189453125,
-0.04669189453125,
-0.030609130859375,
0.022735595703125,
0.00855255126953125,
-0.021148681640625,
0.03741455078125,
-0.056488037109375,
0.026275634765625,
0.06591796875,
0.026580810546875,
0.001392364501953125,
0.0034847259521484375,
-0.01233673095703125,
-0.0033359527587890625,
-0.045379638671875,
-0.034423828125,
0.09490966796875,
0.03704833984375,
0.0416259765625,
0.006000518798828125,
0.047454833984375,
0.017364501953125,
0.0008625984191894531,
-0.035308837890625,
0.033905029296875,
-0.02423095703125,
-0.06658935546875,
-0.027099609375,
-0.02325439453125,
-0.0787353515625,
0.0171051025390625,
-0.023834228515625,
-0.064208984375,
-0.0024662017822265625,
-0.00675201416015625,
-0.00946044921875,
0.029144287109375,
-0.049652099609375,
0.07232666015625,
-0.01216888427734375,
-0.027587890625,
-0.00012803077697753906,
-0.06549072265625,
0.0251312255859375,
0.006351470947265625,
0.0119781494140625,
0.00473785400390625,
0.0278167724609375,
0.077880859375,
-0.03759765625,
0.07745361328125,
-0.02020263671875,
0.0030994415283203125,
0.0181121826171875,
-0.00817108154296875,
0.03997802734375,
-0.0102691650390625,
0.0018749237060546875,
0.039947509765625,
-0.006744384765625,
-0.03387451171875,
-0.0209197998046875,
0.032440185546875,
-0.0675048828125,
-0.047088623046875,
-0.049896240234375,
-0.045379638671875,
0.0187225341796875,
0.030517578125,
0.040771484375,
0.041015625,
0.007152557373046875,
0.010589599609375,
0.034393310546875,
-0.018707275390625,
0.040557861328125,
0.0222625732421875,
-0.01074981689453125,
-0.03369140625,
0.055023193359375,
0.006130218505859375,
0.01462554931640625,
0.0219879150390625,
0.0119781494140625,
-0.030975341796875,
-0.03387451171875,
-0.029541015625,
0.0237274169921875,
-0.04266357421875,
-0.0213470458984375,
-0.054718017578125,
-0.0299072265625,
-0.04248046875,
-0.00493621826171875,
-0.01279449462890625,
-0.037384033203125,
-0.041290283203125,
0.0009946823120117188,
0.031829833984375,
0.056793212890625,
-0.0035877227783203125,
0.0208282470703125,
-0.038726806640625,
0.0120086669921875,
0.017059326171875,
0.0106658935546875,
-0.0056915283203125,
-0.07000732421875,
-0.0198822021484375,
0.01129913330078125,
-0.02032470703125,
-0.058837890625,
0.053985595703125,
0.005107879638671875,
0.0323486328125,
0.0297698974609375,
-0.00908660888671875,
0.04400634765625,
-0.028289794921875,
0.07379150390625,
0.013916015625,
-0.073486328125,
0.03948974609375,
-0.033172607421875,
0.01256561279296875,
0.0185699462890625,
0.023468017578125,
-0.03570556640625,
-0.04083251953125,
-0.064697265625,
-0.0772705078125,
0.069091796875,
0.0211029052734375,
0.01255035400390625,
0.0033473968505859375,
0.0156402587890625,
-0.002349853515625,
0.024078369140625,
-0.08416748046875,
-0.0292510986328125,
-0.0295867919921875,
-0.0290679931640625,
-0.0156402587890625,
-0.01238250732421875,
-0.0087738037109375,
-0.03363037109375,
0.0611572265625,
0.00911712646484375,
0.043975830078125,
0.0174713134765625,
-0.0256195068359375,
0.0146636962890625,
0.01444244384765625,
0.054168701171875,
0.0389404296875,
-0.028900146484375,
0.0107574462890625,
0.0100555419921875,
-0.052734375,
0.0007872581481933594,
0.0210723876953125,
-0.019256591796875,
0.01049041748046875,
0.037261962890625,
0.06719970703125,
0.00791168212890625,
-0.041473388671875,
0.054534912109375,
0.0074920654296875,
-0.0222625732421875,
-0.037445068359375,
0.004817962646484375,
0.004932403564453125,
0.023590087890625,
0.033843994140625,
0.01049041748046875,
-0.0129547119140625,
-0.042572021484375,
0.017242431640625,
0.0311431884765625,
-0.029632568359375,
-0.0175933837890625,
0.07196044921875,
-0.008453369140625,
-0.04046630859375,
0.052398681640625,
-0.024169921875,
-0.061767578125,
0.054718017578125,
0.0557861328125,
0.060882568359375,
-0.01490020751953125,
0.0211029052734375,
0.043365478515625,
0.031219482421875,
-0.0050201416015625,
0.00600433349609375,
0.0133209228515625,
-0.047821044921875,
-0.0258941650390625,
-0.055938720703125,
0.01043701171875,
0.025726318359375,
-0.050262451171875,
0.0142669677734375,
-0.031646728515625,
-0.018096923828125,
0.004589080810546875,
0.009552001953125,
-0.06072998046875,
0.0167999267578125,
-0.00952911376953125,
0.06036376953125,
-0.08221435546875,
0.06915283203125,
0.045562744140625,
-0.057037353515625,
-0.06439208984375,
-0.0030803680419921875,
-0.00919342041015625,
-0.07550048828125,
0.061767578125,
0.02001953125,
0.0235595703125,
0.004482269287109375,
-0.037200927734375,
-0.0667724609375,
0.08966064453125,
0.01442718505859375,
-0.036956787109375,
-0.01496124267578125,
0.01277923583984375,
0.046875,
-0.033966064453125,
0.047821044921875,
0.036285400390625,
0.029541015625,
-0.01134490966796875,
-0.06622314453125,
0.013092041015625,
-0.0207672119140625,
0.005649566650390625,
0.0036468505859375,
-0.04815673828125,
0.0966796875,
-0.0115814208984375,
-0.0026988983154296875,
0.006275177001953125,
0.031646728515625,
-0.0004260540008544922,
0.00917816162109375,
0.032806396484375,
0.054443359375,
0.06298828125,
-0.0224456787109375,
0.07733154296875,
-0.02734375,
0.05047607421875,
0.06427001953125,
0.01554107666015625,
0.0584716796875,
0.0247650146484375,
-0.031494140625,
0.056671142578125,
0.044036865234375,
-0.0257110595703125,
0.04071044921875,
0.01380157470703125,
-0.007350921630859375,
-0.0008950233459472656,
0.00601959228515625,
-0.0228729248046875,
0.03912353515625,
0.00164031982421875,
-0.042999267578125,
-0.00380706787109375,
0.00917816162109375,
0.02777099609375,
-0.004505157470703125,
-0.01338958740234375,
0.051422119140625,
-0.00250244140625,
-0.049224853515625,
0.051788330078125,
0.01251983642578125,
0.0614013671875,
-0.03717041015625,
0.00702667236328125,
-0.01349639892578125,
0.016998291015625,
-0.007598876953125,
-0.05780029296875,
0.006259918212890625,
0.004344940185546875,
-0.0278778076171875,
-0.017120361328125,
0.04779052734375,
-0.04827880859375,
-0.041961669921875,
0.0149993896484375,
0.020111083984375,
0.032470703125,
-0.00435638427734375,
-0.06402587890625,
-0.008087158203125,
0.0248565673828125,
-0.019195556640625,
0.0220489501953125,
0.0206298828125,
0.01068878173828125,
0.04412841796875,
0.063720703125,
0.007465362548828125,
0.0038242340087890625,
-0.0001786947250366211,
0.0634765625,
-0.051605224609375,
-0.041259765625,
-0.0643310546875,
0.051116943359375,
-0.006069183349609375,
-0.03094482421875,
0.057525634765625,
0.048858642578125,
0.0673828125,
-0.02398681640625,
0.05230712890625,
-0.01113128662109375,
0.04229736328125,
-0.0479736328125,
0.0643310546875,
-0.03802490234375,
0.01084136962890625,
-0.0284271240234375,
-0.065185546875,
-0.0146331787109375,
0.06854248046875,
-0.0186767578125,
0.0154876708984375,
0.048492431640625,
0.0660400390625,
-0.004558563232421875,
-0.019195556640625,
0.0031642913818359375,
0.03033447265625,
0.0172882080078125,
0.050445556640625,
0.03955078125,
-0.057830810546875,
0.047119140625,
-0.01549530029296875,
-0.012115478515625,
-0.020721435546875,
-0.058624267578125,
-0.0845947265625,
-0.0555419921875,
-0.019683837890625,
-0.05426025390625,
0.0036678314208984375,
0.06439208984375,
0.051605224609375,
-0.061187744140625,
-0.01241302490234375,
-0.0076141357421875,
0.00804901123046875,
-0.0222625732421875,
-0.0253143310546875,
0.042877197265625,
-0.0187835693359375,
-0.0662841796875,
0.00980377197265625,
-0.01049041748046875,
0.01424407958984375,
-0.0156097412109375,
-0.0011730194091796875,
-0.03204345703125,
-0.003040313720703125,
0.033538818359375,
0.00763702392578125,
-0.05633544921875,
-0.023406982421875,
0.0005564689636230469,
-0.006137847900390625,
0.01080322265625,
0.03192138671875,
-0.048858642578125,
0.023101806640625,
0.0171356201171875,
0.026153564453125,
0.07037353515625,
-0.0017843246459960938,
0.0218353271484375,
-0.06298828125,
0.022369384765625,
0.009246826171875,
0.0273284912109375,
0.0276031494140625,
-0.0303955078125,
0.0399169921875,
0.035125732421875,
-0.04583740234375,
-0.06854248046875,
-0.00569915771484375,
-0.0701904296875,
-0.0265350341796875,
0.08056640625,
-0.0173797607421875,
-0.0279998779296875,
-0.0057220458984375,
-0.015167236328125,
0.03277587890625,
-0.029449462890625,
0.06402587890625,
0.052276611328125,
0.0102081298828125,
-0.006183624267578125,
-0.042449951171875,
0.03955078125,
0.0300140380859375,
-0.034759521484375,
-0.0150299072265625,
0.013641357421875,
0.051361083984375,
0.02288818359375,
0.0460205078125,
-0.005764007568359375,
-0.0003829002380371094,
0.00592041015625,
0.02227783203125,
-0.0111846923828125,
-0.01317596435546875,
-0.030181884765625,
0.01076507568359375,
-0.0176849365234375,
-0.0260772705078125
]
] |
bert-base-cased | 2022-11-16T15:18:28.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"exbert",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1810.04805",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | bert-base-cased | 157 | 5,046,520 | transformers | 2022-03-02T23:29:04 | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# BERT base model (cased)
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is case-sensitive: it makes a difference between
english and English.
Disclaimer: The team releasing BERT did not write a model card for this model, so this model card has been written by
the Hugging Face team.
## Model description
BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it
was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation, you should look at a model like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-cased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'sequence': "[CLS] Hello I'm a fashion model. [SEP]",
'score': 0.09019174426794052,
'token': 4633,
'token_str': 'fashion'},
{'sequence': "[CLS] Hello I'm a new model. [SEP]",
'score': 0.06349995732307434,
'token': 1207,
'token_str': 'new'},
{'sequence': "[CLS] Hello I'm a male model. [SEP]",
'score': 0.06228214129805565,
'token': 2581,
'token_str': 'male'},
{'sequence': "[CLS] Hello I'm a professional model. [SEP]",
'score': 0.0441727414727211,
'token': 1848,
'token_str': 'professional'},
{'sequence': "[CLS] Hello I'm a super model. [SEP]",
'score': 0.03326151892542839,
'token': 7688,
'token_str': 'super'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
model = BertModel.from_pretrained("bert-base-cased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
model = TFBertModel.from_pretrained("bert-base-cased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-cased')
>>> unmasker("The man worked as a [MASK].")
[{'sequence': '[CLS] The man worked as a lawyer. [SEP]',
'score': 0.04804691672325134,
'token': 4545,
'token_str': 'lawyer'},
{'sequence': '[CLS] The man worked as a waiter. [SEP]',
'score': 0.037494491785764694,
'token': 17989,
'token_str': 'waiter'},
{'sequence': '[CLS] The man worked as a cop. [SEP]',
'score': 0.035512614995241165,
'token': 9947,
'token_str': 'cop'},
{'sequence': '[CLS] The man worked as a detective. [SEP]',
'score': 0.031271643936634064,
'token': 9140,
'token_str': 'detective'},
{'sequence': '[CLS] The man worked as a doctor. [SEP]',
'score': 0.027423162013292313,
'token': 3995,
'token_str': 'doctor'}]
>>> unmasker("The woman worked as a [MASK].")
[{'sequence': '[CLS] The woman worked as a nurse. [SEP]',
'score': 0.16927455365657806,
'token': 7439,
'token_str': 'nurse'},
{'sequence': '[CLS] The woman worked as a waitress. [SEP]',
'score': 0.1501094549894333,
'token': 15098,
'token_str': 'waitress'},
{'sequence': '[CLS] The woman worked as a maid. [SEP]',
'score': 0.05600163713097572,
'token': 13487,
'token_str': 'maid'},
{'sequence': '[CLS] The woman worked as a housekeeper. [SEP]',
'score': 0.04838843643665314,
'token': 26458,
'token_str': 'housekeeper'},
{'sequence': '[CLS] The woman worked as a cook. [SEP]',
'score': 0.029980547726154327,
'token': 9834,
'token_str': 'cook'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).
## Training procedure
### Preprocessing
The texts are tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text, usually longer than a single sentence. The only constraint is that the combined length of the
two "sentences" is less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
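The 15% / 80-10-10 masking rule above can be sketched in plain Python (an illustrative sketch, not the original pretraining code; the toy vocabulary and tokens here are hypothetical):

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=None):
    """Apply BERT-style masking: 15% of tokens are selected; of those,
    80% become [MASK], 10% become a random token, and 10% are left
    unchanged. Returns the corrupted tokens and the prediction targets."""
    rng = random.Random(seed)
    corrupted = list(tokens)
    targets = {}  # position -> original token the model must predict
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue
        targets[i] = tok
        roll = rng.random()
        if roll < 0.8:
            corrupted[i] = MASK_TOKEN          # 80%: replace with [MASK]
        elif roll < 0.9:
            corrupted[i] = rng.choice(vocab)   # 10%: random token
        # else: remaining 10%, keep the original token unchanged
    return corrupted, targets

vocab = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
corrupted, targets = mask_tokens(["the", "cat", "sat", "on", "the", "mat"], vocab, seed=0)
print(corrupted, targets)
```

Note that the loss is computed only over the selected positions recorded in `targets`, not over the whole sequence.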
### Pretraining
The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size
of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
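The warmup-then-linear-decay schedule described above can be sketched as follows (an illustrative reimplementation; the original training used TensorFlow on TPUs):

```python
def learning_rate(step, peak_lr=1e-4, warmup_steps=10_000, total_steps=1_000_000):
    """Linear warmup to peak_lr over warmup_steps, then linear decay to 0
    at total_steps -- the schedule used for BERT pretraining."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(learning_rate(5_000))      # halfway through warmup: 5e-05
print(learning_rate(10_000))     # peak: 1e-4
print(learning_rate(1_000_000))  # end of training: 0.0
```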
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
Glue test results:
| Task | MNLI-(m/mm) | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | Average |
|:----:|:-----------:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:-------:|
| | 84.6/83.4 | 71.2 | 90.5 | 93.5 | 52.1 | 85.8 | 88.9 | 66.4 | 79.6 |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1810-04805,
author = {Jacob Devlin and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
Understanding},
journal = {CoRR},
volume = {abs/1810.04805},
year = {2018},
url = {http://arxiv.org/abs/1810.04805},
archivePrefix = {arXiv},
eprint = {1810.04805},
timestamp = {Tue, 30 Oct 2018 20:39:56 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=bert-base-cased">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 8,982 | [
[
-0.00791168212890625,
-0.046417236328125,
0.016326904296875,
0.017486572265625,
-0.041015625,
0.002872467041015625,
-0.0024013519287109375,
-0.01007843017578125,
0.0302276611328125,
0.036956787109375,
-0.042694091796875,
-0.03314208984375,
-0.0595703125,
0.01033782958984375,
-0.041351318359375,
0.0850830078125,
0.021026611328125,
0.0246124267578125,
0.00597381591796875,
0.01255035400390625,
-0.03326416015625,
-0.057830810546875,
-0.060821533203125,
-0.0205535888671875,
0.034088134765625,
0.0288543701171875,
0.045257568359375,
0.0457763671875,
0.03515625,
0.027801513671875,
-0.004718780517578125,
-0.006107330322265625,
-0.0238494873046875,
0.007022857666015625,
-0.0024318695068359375,
-0.045196533203125,
-0.027923583984375,
0.0125885009765625,
0.034088134765625,
0.058868408203125,
-0.0030155181884765625,
0.0245361328125,
-0.01113128662109375,
0.047088623046875,
-0.0166015625,
0.022705078125,
-0.037261962890625,
0.009857177734375,
-0.0194854736328125,
0.0098114013671875,
-0.0287628173828125,
-0.01508331298828125,
0.00940704345703125,
-0.044708251953125,
0.0220184326171875,
0.01312255859375,
0.08251953125,
0.01282501220703125,
-0.0167083740234375,
-0.01012420654296875,
-0.035308837890625,
0.05615234375,
-0.0518798828125,
0.01146697998046875,
0.037200927734375,
0.0177001953125,
-0.01438140869140625,
-0.080078125,
-0.0305938720703125,
-0.00209808349609375,
-0.007251739501953125,
0.004650115966796875,
0.00007086992263793945,
-0.00982666015625,
0.027008056640625,
0.0309295654296875,
-0.0258026123046875,
0.00006085634231567383,
-0.053619384765625,
-0.0254364013671875,
0.0496826171875,
0.01245880126953125,
0.0155487060546875,
-0.0254058837890625,
-0.0256195068359375,
-0.0220184326171875,
-0.020721435546875,
0.00948333740234375,
0.0390625,
0.033905029296875,
-0.012664794921875,
0.0548095703125,
-0.014984130859375,
0.0438232421875,
0.00754547119140625,
-0.0012149810791015625,
0.036407470703125,
-0.010162353515625,
-0.02801513671875,
0.0002582073211669922,
0.06964111328125,
0.019622802734375,
0.032440185546875,
-0.0013952255249023438,
-0.0242156982421875,
0.003658294677734375,
0.0277252197265625,
-0.046539306640625,
-0.02783203125,
0.010894775390625,
-0.039764404296875,
-0.034027099609375,
0.034912109375,
-0.0472412109375,
-0.0050506591796875,
-0.0092620849609375,
0.043121337890625,
-0.02996826171875,
-0.011138916015625,
0.0099029541015625,
-0.038970947265625,
0.0162200927734375,
0.00307464599609375,
-0.06884765625,
0.02099609375,
0.052947998046875,
0.06427001953125,
0.0220794677734375,
-0.01067352294921875,
-0.034088134765625,
-0.0206756591796875,
-0.029205322265625,
0.03680419921875,
-0.0227508544921875,
-0.0372314453125,
-0.0013761520385742188,
0.02130126953125,
-0.00902557373046875,
-0.0188140869140625,
0.053466796875,
-0.036163330078125,
0.04107666015625,
-0.0074005126953125,
-0.044281005859375,
-0.0193634033203125,
0.00135040283203125,
-0.052703857421875,
0.090576171875,
0.0261383056640625,
-0.054107666015625,
0.0290069580078125,
-0.0682373046875,
-0.047760009765625,
0.015655517578125,
0.00835418701171875,
-0.036834716796875,
0.01378631591796875,
0.00890350341796875,
0.034942626953125,
-0.002140045166015625,
0.0247039794921875,
-0.01393890380859375,
-0.035400390625,
0.0244293212890625,
-0.01470947265625,
0.07708740234375,
0.0163726806640625,
-0.024749755859375,
0.01317596435546875,
-0.059051513671875,
-0.002704620361328125,
0.018402099609375,
-0.0290374755859375,
-0.0119476318359375,
-0.007183074951171875,
0.021759033203125,
0.013763427734375,
0.02716064453125,
-0.048492431640625,
0.02288818359375,
-0.042205810546875,
0.05047607421875,
0.06121826171875,
-0.00493621826171875,
0.0189056396484375,
-0.032623291015625,
0.037750244140625,
-0.0015707015991210938,
-0.0020160675048828125,
-0.0146331787109375,
-0.059478759765625,
-0.057769775390625,
-0.02685546875,
0.047607421875,
0.05731201171875,
-0.037933349609375,
0.055145263671875,
-0.002552032470703125,
-0.04376220703125,
-0.046600341796875,
-0.00931549072265625,
0.0249176025390625,
0.037261962890625,
0.0260467529296875,
-0.03509521484375,
-0.06414794921875,
-0.061614990234375,
-0.0203094482421875,
-0.0113067626953125,
-0.0188446044921875,
0.00797271728515625,
0.0545654296875,
-0.01763916015625,
0.0621337890625,
-0.05462646484375,
-0.03076171875,
-0.01372528076171875,
0.0196685791015625,
0.0496826171875,
0.052490234375,
0.0267486572265625,
-0.047607421875,
-0.03265380859375,
-0.0310821533203125,
-0.0416259765625,
0.00202178955078125,
-0.0006999969482421875,
-0.013641357421875,
0.0130767822265625,
0.04241943359375,
-0.055999755859375,
0.04315185546875,
0.0175018310546875,
-0.0433349609375,
0.05487060546875,
-0.027679443359375,
-0.003948211669921875,
-0.0970458984375,
0.01293182373046875,
-0.00789642333984375,
-0.0254364013671875,
-0.058349609375,
-0.0007572174072265625,
-0.00695037841796875,
-0.00597381591796875,
-0.04315185546875,
0.036285400390625,
-0.03118896484375,
-0.0015478134155273438,
0.003131866455078125,
-0.0161895751953125,
0.0002856254577636719,
0.034149169921875,
0.0018587112426757812,
0.044647216796875,
0.042633056640625,
-0.04193115234375,
0.037933349609375,
0.030731201171875,
-0.041473388671875,
0.01282501220703125,
-0.0604248046875,
0.01995849609375,
0.00699615478515625,
0.00250244140625,
-0.08447265625,
-0.025726318359375,
0.0178375244140625,
-0.044647216796875,
0.01690673828125,
-0.00867462158203125,
-0.056182861328125,
-0.044464111328125,
-0.01776123046875,
0.030426025390625,
0.045806884765625,
-0.0172119140625,
0.03314208984375,
0.0213775634765625,
-0.00681304931640625,
-0.04559326171875,
-0.051116943359375,
0.00811004638671875,
-0.0154266357421875,
-0.03790283203125,
0.03118896484375,
-0.002101898193359375,
-0.0086517333984375,
-0.01561737060546875,
0.003871917724609375,
-0.01192474365234375,
0.007068634033203125,
0.021636962890625,
0.0328369140625,
-0.013763427734375,
-0.005481719970703125,
-0.01418304443359375,
-0.00949859619140625,
0.0210723876953125,
-0.01265716552734375,
0.06524658203125,
-0.0011339187622070312,
-0.00501251220703125,
-0.0240020751953125,
0.0258636474609375,
0.04949951171875,
-0.0042572021484375,
0.053924560546875,
0.06353759765625,
-0.044097900390625,
0.00630950927734375,
-0.02532958984375,
-0.0153961181640625,
-0.037994384765625,
0.035308837890625,
-0.037567138671875,
-0.05987548828125,
0.0567626953125,
0.022918701171875,
-0.0105133056640625,
0.055755615234375,
0.043670654296875,
-0.01580810546875,
0.07391357421875,
0.035797119140625,
-0.01154327392578125,
0.036590576171875,
-0.01132965087890625,
0.0226593017578125,
-0.05450439453125,
-0.032440185546875,
-0.030731201171875,
-0.0210723876953125,
-0.0390625,
-0.01409149169921875,
0.0181884765625,
0.0160675048828125,
-0.03302001953125,
0.04461669921875,
-0.047210693359375,
0.0247344970703125,
0.07757568359375,
0.0289459228515625,
-0.016326904296875,
-0.017669677734375,
-0.0219573974609375,
0.005344390869140625,
-0.03790283203125,
-0.0251312255859375,
0.08807373046875,
0.038482666015625,
0.045196533203125,
0.002887725830078125,
0.049774169921875,
0.029144287109375,
-0.004150390625,
-0.0537109375,
0.046600341796875,
-0.03045654296875,
-0.065673828125,
-0.0299224853515625,
-0.007007598876953125,
-0.0794677734375,
0.010498046875,
-0.024322509765625,
-0.06170654296875,
-0.003936767578125,
-0.01206207275390625,
-0.0274810791015625,
0.01433563232421875,
-0.05517578125,
0.077392578125,
-0.0206146240234375,
-0.01043701171875,
0.006595611572265625,
-0.07330322265625,
0.018951416015625,
-0.0002505779266357422,
0.007068634033203125,
-0.005649566650390625,
0.0180511474609375,
0.08160400390625,
-0.0439453125,
0.07391357421875,
-0.01824951171875,
0.0202178955078125,
0.0049896240234375,
-0.00202178955078125,
0.02392578125,
0.0015630722045898438,
0.005584716796875,
0.0223236083984375,
0.003856658935546875,
-0.036712646484375,
-0.00823974609375,
0.02239990234375,
-0.05462646484375,
-0.038909912109375,
-0.047698974609375,
-0.04766845703125,
0.01116180419921875,
0.03271484375,
0.044158935546875,
0.0391845703125,
-0.00707244873046875,
0.0186767578125,
0.037689208984375,
-0.0162353515625,
0.057647705078125,
0.024261474609375,
-0.01334381103515625,
-0.036834716796875,
0.0440673828125,
0.0006399154663085938,
0.00211334228515625,
0.036834716796875,
0.0177154541015625,
-0.044097900390625,
-0.0140380859375,
-0.02655029296875,
0.0098114013671875,
-0.042694091796875,
-0.0233612060546875,
-0.04278564453125,
-0.03436279296875,
-0.04864501953125,
-0.0043792724609375,
-0.01348114013671875,
-0.03961181640625,
-0.051300048828125,
-0.01549530029296875,
0.034759521484375,
0.047393798828125,
-0.0090484619140625,
0.033935546875,
-0.05487060546875,
0.0205535888671875,
0.02313232421875,
0.02777099609375,
-0.0238037109375,
-0.05999755859375,
-0.023406982421875,
-0.002536773681640625,
-0.009735107421875,
-0.06396484375,
0.05120849609375,
0.01824951171875,
0.03692626953125,
0.04022216796875,
-0.0013475418090820312,
0.046966552734375,
-0.04736328125,
0.07513427734375,
0.0160675048828125,
-0.08203125,
0.04522705078125,
-0.0272216796875,
0.0184478759765625,
0.027801513671875,
0.0202484130859375,
-0.036651611328125,
-0.027587890625,
-0.0687255859375,
-0.07330322265625,
0.060699462890625,
0.01338958740234375,
0.01837158203125,
-0.0024471282958984375,
0.02362060546875,
0.00908660888671875,
0.0297393798828125,
-0.06591796875,
-0.038482666015625,
-0.035675048828125,
-0.0257110595703125,
-0.01290130615234375,
-0.021026611328125,
-0.003917694091796875,
-0.04144287109375,
0.0506591796875,
0.00959014892578125,
0.044403076171875,
0.00942230224609375,
-0.00931549072265625,
0.01012420654296875,
0.0121917724609375,
0.06207275390625,
0.035888671875,
-0.04071044921875,
-0.0013380050659179688,
-0.0009551048278808594,
-0.04766845703125,
0.004917144775390625,
0.0113067626953125,
0.0008444786071777344,
0.016510009765625,
0.04278564453125,
0.059112548828125,
0.013702392578125,
-0.034912109375,
0.04339599609375,
0.0084228515625,
-0.02801513671875,
-0.04193115234375,
0.0107421875,
-0.0032215118408203125,
0.00969696044921875,
0.040008544921875,
0.0164794921875,
0.007568359375,
-0.041290283203125,
0.03387451171875,
0.0247039794921875,
-0.036834716796875,
-0.0185394287109375,
0.068359375,
0.00380706787109375,
-0.0584716796875,
0.06341552734375,
-0.01702880859375,
-0.0599365234375,
0.058074951171875,
0.0479736328125,
0.071044921875,
-0.016326904296875,
0.016632080078125,
0.035400390625,
0.02423095703125,
-0.0247344970703125,
0.0323486328125,
0.021575927734375,
-0.0631103515625,
-0.0250244140625,
-0.053558349609375,
-0.0137481689453125,
0.01531982421875,
-0.06549072265625,
0.02178955078125,
-0.036346435546875,
-0.021697998046875,
0.01235198974609375,
0.0007009506225585938,
-0.053558349609375,
0.0360107421875,
0.002559661865234375,
0.080810546875,
-0.0784912109375,
0.07501220703125,
0.058441162109375,
-0.04541015625,
-0.06591796875,
-0.028778076171875,
-0.0225067138671875,
-0.080810546875,
0.05694580078125,
0.0274200439453125,
0.0263824462890625,
-0.0013532638549804688,
-0.0426025390625,
-0.05169677734375,
0.0660400390625,
0.00977325439453125,
-0.0386962890625,
-0.01042938232421875,
0.0038661956787109375,
0.04254150390625,
-0.041259765625,
0.03228759765625,
0.042510986328125,
0.029327392578125,
-0.00794219970703125,
-0.060943603515625,
0.006206512451171875,
-0.03289794921875,
-0.0020904541015625,
0.00780487060546875,
-0.0341796875,
0.08843994140625,
-0.0129241943359375,
0.00408172607421875,
0.0126495361328125,
0.03704833984375,
0.0031147003173828125,
0.01424407958984375,
0.040985107421875,
0.0489501953125,
0.054107666015625,
-0.0281524658203125,
0.06121826171875,
-0.020050048828125,
0.03704833984375,
0.0645751953125,
0.005512237548828125,
0.06060791015625,
0.031890869140625,
-0.02178955078125,
0.0704345703125,
0.0673828125,
-0.0259857177734375,
0.058624267578125,
0.0184326171875,
-0.0029621124267578125,
-0.0082550048828125,
0.01244354248046875,
-0.0184783935546875,
0.037933349609375,
0.0221405029296875,
-0.0416259765625,
0.01113128662109375,
-0.006359100341796875,
0.01241302490234375,
-0.01328277587890625,
-0.03472900390625,
0.05303955078125,
0.01102447509765625,
-0.053131103515625,
0.0214385986328125,
0.01525115966796875,
0.048004150390625,
-0.04010009765625,
0.0004494190216064453,
-0.00823974609375,
0.0129241943359375,
-0.00885009765625,
-0.06298828125,
0.0157470703125,
-0.0130767822265625,
-0.032745361328125,
-0.0172271728515625,
0.055389404296875,
-0.0291748046875,
-0.05010986328125,
0.0010852813720703125,
0.02252197265625,
0.023590087890625,
-0.01160430908203125,
-0.059967041015625,
-0.02227783203125,
0.0012760162353515625,
-0.0103607177734375,
0.01274871826171875,
0.0222320556640625,
0.005992889404296875,
0.044891357421875,
0.0615234375,
-0.0088958740234375,
0.00803375244140625,
0.0057220458984375,
0.053558349609375,
-0.0731201171875,
-0.0626220703125,
-0.072998046875,
0.0447998046875,
-0.007045745849609375,
-0.043060302734375,
0.04559326171875,
0.055999755859375,
0.052520751953125,
-0.030731201171875,
0.038482666015625,
-0.01424407958984375,
0.041107177734375,
-0.033905029296875,
0.053558349609375,
-0.0252227783203125,
0.0011301040649414062,
-0.025848388671875,
-0.055419921875,
-0.021209716796875,
0.06231689453125,
-0.00394439697265625,
-0.00037860870361328125,
0.055755615234375,
0.0460205078125,
0.004730224609375,
-0.00904083251953125,
0.01346588134765625,
0.00894927978515625,
0.006908416748046875,
0.03302001953125,
0.04058837890625,
-0.049407958984375,
0.0294036865234375,
-0.0162353515625,
-0.004428863525390625,
-0.0272064208984375,
-0.0653076171875,
-0.07550048828125,
-0.0447998046875,
-0.021087646484375,
-0.044219970703125,
-0.01038360595703125,
0.0684814453125,
0.060333251953125,
-0.07000732421875,
-0.020477294921875,
-0.0115966796875,
0.0036258697509765625,
-0.028411865234375,
-0.0224609375,
0.036285400390625,
-0.0186614990234375,
-0.054718017578125,
0.02020263671875,
-0.004848480224609375,
0.003490447998046875,
-0.00846099853515625,
0.00557708740234375,
-0.031036376953125,
0.004150390625,
0.0386962890625,
0.007358551025390625,
-0.05596923828125,
-0.037261962890625,
0.006717681884765625,
-0.011505126953125,
0.0119171142578125,
0.036712646484375,
-0.043975830078125,
0.028076171875,
0.0310821533203125,
0.02874755859375,
0.0543212890625,
0.01244354248046875,
0.0479736328125,
-0.08270263671875,
0.01690673828125,
0.01305389404296875,
0.037994384765625,
0.024749755859375,
-0.0367431640625,
0.04217529296875,
0.03948974609375,
-0.03826904296875,
-0.062255859375,
-0.0036144256591796875,
-0.078125,
-0.0202484130859375,
0.06689453125,
-0.0132293701171875,
-0.0216064453125,
-0.007747650146484375,
-0.022125244140625,
0.03369140625,
-0.034515380859375,
0.055908203125,
0.0640869140625,
-0.0011987686157226562,
-0.00754547119140625,
-0.0268707275390625,
0.0298309326171875,
0.032440185546875,
-0.03253173828125,
-0.03656005859375,
0.007305145263671875,
0.03887939453125,
0.017425537109375,
0.0433349609375,
-0.00010216236114501953,
0.0144805908203125,
0.0157623291015625,
0.01226043701171875,
-0.01033782958984375,
-0.01129913330078125,
-0.023712158203125,
0.01383209228515625,
-0.01319122314453125,
-0.056549072265625
]
] |
microsoft/layoutlmv3-base | 2023-04-12T12:49:21.000Z | [
"transformers",
"pytorch",
"tf",
"onnx",
"layoutlmv3",
"en",
"arxiv:2204.08387",
"license:cc-by-nc-sa-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] | null | microsoft | null | null | microsoft/layoutlmv3-base | 199 | 4,904,052 | transformers | 2022-04-18T06:53:05 | ---
language: en
license: cc-by-nc-sa-4.0
---
# LayoutLMv3
[Microsoft Document AI](https://www.microsoft.com/en-us/research/project/document-ai/) | [GitHub](https://aka.ms/layoutlmv3)
## Model description
LayoutLMv3 is a pre-trained multimodal Transformer for Document AI with unified text and image masking. The simple unified architecture and training objectives make LayoutLMv3 a general-purpose pre-trained model. For example, LayoutLMv3 can be fine-tuned for both text-centric tasks, including form understanding, receipt understanding, and document visual question answering, and image-centric tasks such as document image classification and document layout analysis.
[LayoutLMv3: Pre-training for Document AI with Unified Text and Image Masking](https://arxiv.org/abs/2204.08387)
Yupan Huang, Tengchao Lv, Lei Cui, Yutong Lu, Furu Wei, ACM Multimedia 2022.
## Citation
If you find LayoutLMv3 useful in your research, please cite the following paper:
```
@inproceedings{huang2022layoutlmv3,
author={Yupan Huang and Tengchao Lv and Lei Cui and Yutong Lu and Furu Wei},
title={LayoutLMv3: Pre-training for Document AI with Unified Text and Image Masking},
booktitle={Proceedings of the 30th ACM International Conference on Multimedia},
year={2022}
}
```
## License
The content of this project itself is licensed under the [Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/).
Portions of the source code are based on the [transformers](https://github.com/huggingface/transformers) project.
[Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct)
| 1,672 | [
[
-0.0279083251953125,
-0.0275115966796875,
0.0304412841796875,
0.0307769775390625,
-0.016021728515625,
-0.01016998291015625,
0.0164794921875,
-0.01020050048828125,
-0.011383056640625,
0.038818359375,
-0.0419921875,
-0.040191650390625,
-0.037109375,
-0.01381683349609375,
-0.0224456787109375,
0.054412841796875,
-0.0302886962890625,
0.01291656494140625,
-0.037567138671875,
-0.01270294189453125,
-0.0297088623046875,
-0.033966064453125,
-0.0095672607421875,
-0.02398681640625,
0.01495361328125,
0.0006804466247558594,
0.0496826171875,
0.040985107421875,
0.0528564453125,
0.0294952392578125,
0.01090240478515625,
0.019622802734375,
-0.0109710693359375,
-0.01216888427734375,
0.0217437744140625,
-0.01324462890625,
-0.05096435546875,
0.0206451416015625,
0.050537109375,
0.0188446044921875,
-0.0018377304077148438,
0.0057525634765625,
0.017120361328125,
0.05218505859375,
-0.048248291015625,
0.00406646728515625,
-0.02197265625,
0.0208282470703125,
-0.01439666748046875,
-0.02130126953125,
-0.03570556640625,
-0.0105133056640625,
-0.01035308837890625,
-0.067626953125,
0.0178985595703125,
0.020263671875,
0.078857421875,
0.02398681640625,
-0.0167694091796875,
-0.0007472038269042969,
-0.04620361328125,
0.055023193359375,
-0.041656494140625,
0.04229736328125,
0.0311431884765625,
0.0131378173828125,
0.01142120361328125,
-0.0765380859375,
-0.0452880859375,
-0.003963470458984375,
-0.037689208984375,
0.02703857421875,
-0.02423095703125,
0.01357269287109375,
0.0293426513671875,
0.026611328125,
-0.0733642578125,
0.00864410400390625,
-0.044830322265625,
-0.03607177734375,
0.035888671875,
-0.0035266876220703125,
0.048614501953125,
-0.01629638671875,
-0.0386962890625,
-0.01421356201171875,
-0.024932861328125,
0.00015795230865478516,
0.0352783203125,
-0.01474761962890625,
-0.0239715576171875,
0.0274658203125,
0.0251617431640625,
0.06005859375,
-0.0141448974609375,
-0.0273895263671875,
0.048919677734375,
-0.0113677978515625,
-0.03668212890625,
-0.01580810546875,
0.06280517578125,
0.0169525146484375,
-0.00882720947265625,
-0.0045013427734375,
-0.01513671875,
-0.00901031494140625,
0.049530029296875,
-0.05682373046875,
-0.0189056396484375,
0.01023101806640625,
-0.05584716796875,
-0.01010894775390625,
0.0279083251953125,
-0.032196044921875,
-0.0198516845703125,
-0.037811279296875,
0.043853759765625,
-0.04888916015625,
-0.00337982177734375,
-0.00968170166015625,
-0.006221771240234375,
0.04052734375,
0.047393798828125,
-0.03448486328125,
-0.002002716064453125,
0.037811279296875,
0.075439453125,
-0.038818359375,
-0.041778564453125,
-0.036956787109375,
0.0196685791015625,
-0.00804901123046875,
0.057647705078125,
-0.0169830322265625,
-0.0311737060546875,
0.01117706298828125,
0.0188446044921875,
-0.01885986328125,
-0.028167724609375,
0.044525146484375,
-0.0440673828125,
0.05926513671875,
0.031097412109375,
-0.018707275390625,
-0.007843017578125,
0.0304718017578125,
-0.0672607421875,
0.06304931640625,
0.01434326171875,
-0.06634521484375,
0.010406494140625,
-0.065185546875,
-0.0283660888671875,
0.01062774658203125,
-0.0122222900390625,
-0.06671142578125,
-0.0161590576171875,
0.021148681640625,
0.026519775390625,
0.0059051513671875,
0.006267547607421875,
-0.013458251953125,
-0.030059814453125,
-0.00931549072265625,
-0.029296875,
0.06591796875,
0.0270843505859375,
-0.00913238525390625,
0.03497314453125,
-0.06414794921875,
-0.004573822021484375,
0.00650787353515625,
-0.018707275390625,
-0.016693115234375,
-0.0302734375,
0.01812744140625,
0.02215576171875,
0.01922607421875,
-0.0309295654296875,
0.016815185546875,
-0.0130157470703125,
0.01006317138671875,
0.052978515625,
-0.0231781005859375,
0.06500244140625,
-0.030303955078125,
0.052825927734375,
-0.005466461181640625,
0.0347900390625,
-0.035614013671875,
-0.04010009765625,
-0.0372314453125,
-0.037353515625,
-0.0004291534423828125,
0.044830322265625,
-0.06884765625,
0.02862548828125,
-0.004638671875,
-0.039215087890625,
-0.03363037109375,
0.0111083984375,
0.044464111328125,
0.037445068359375,
0.039794921875,
-0.0133819580078125,
-0.052520751953125,
-0.0687255859375,
-0.01189422607421875,
0.0018901824951171875,
-0.01947021484375,
0.0009822845458984375,
0.02935791015625,
-0.018463134765625,
0.050872802734375,
-0.0292510986328125,
-0.058929443359375,
-0.005481719970703125,
0.0146331787109375,
-0.0017652511596679688,
0.0472412109375,
0.0382080078125,
-0.08880615234375,
-0.0400390625,
-0.01003265380859375,
-0.05938720703125,
0.0030364990234375,
-0.00923919677734375,
-0.0199432373046875,
0.02825927734375,
0.048553466796875,
-0.05224609375,
0.054840087890625,
0.0440673828125,
-0.01493072509765625,
0.0357666015625,
-0.03375244140625,
0.001636505126953125,
-0.09832763671875,
0.01232147216796875,
0.00007557868957519531,
-0.011871337890625,
-0.04937744140625,
-0.00017070770263671875,
0.02972412109375,
-0.017333984375,
-0.0531005859375,
0.04656982421875,
-0.053955078125,
-0.00830078125,
-0.0126495361328125,
0.0015468597412109375,
0.03155517578125,
0.0482177734375,
-0.00511932373046875,
0.054931640625,
0.0257720947265625,
-0.0226898193359375,
0.010650634765625,
0.042510986328125,
-0.0206451416015625,
0.047760009765625,
-0.0384521484375,
0.023406982421875,
-0.0121917724609375,
0.0302276611328125,
-0.0654296875,
-0.0120391845703125,
0.01947021484375,
-0.035186767578125,
0.035369873046875,
0.009368896484375,
-0.040802001953125,
-0.051544189453125,
-0.03582763671875,
0.0209197998046875,
0.035736083984375,
-0.038604736328125,
0.08050537109375,
0.0198974609375,
0.023284912109375,
-0.038665771484375,
-0.057037353515625,
-0.02130126953125,
-0.0178680419921875,
-0.055511474609375,
0.054443359375,
-0.02239990234375,
0.0003006458282470703,
-0.00457763671875,
-0.00801849365234375,
-0.00836944580078125,
-0.004665374755859375,
0.0406494140625,
0.037933349609375,
-0.0117034912109375,
-0.00467681884765625,
-0.0195465087890625,
-0.0169677734375,
-0.0012531280517578125,
-0.0199432373046875,
0.0406494140625,
-0.019683837890625,
-0.04931640625,
-0.042755126953125,
0.028045654296875,
0.04205322265625,
-0.0300445556640625,
0.038543701171875,
0.078857421875,
-0.0240325927734375,
0.008331298828125,
-0.04754638671875,
0.0263671875,
-0.03778076171875,
0.0286865234375,
-0.0165252685546875,
-0.057830810546875,
0.0278167724609375,
0.01428985595703125,
0.004108428955078125,
0.047943115234375,
0.0404052734375,
-0.0248870849609375,
0.075439453125,
0.05572509765625,
0.01313018798828125,
0.051971435546875,
-0.0264129638671875,
0.00585174560546875,
-0.07269287109375,
-0.039215087890625,
-0.035369873046875,
-0.0248565673828125,
-0.025482177734375,
-0.03857421875,
0.021392822265625,
0.001956939697265625,
-0.034881591796875,
0.0189056396484375,
-0.06024169921875,
0.0254364013671875,
0.054779052734375,
-0.005023956298828125,
0.01947021484375,
0.002521514892578125,
-0.00669097900390625,
0.002857208251953125,
-0.0248565673828125,
-0.049041748046875,
0.05450439453125,
0.0287933349609375,
0.05377197265625,
0.006866455078125,
0.0523681640625,
0.024871826171875,
0.0274658203125,
-0.038543701171875,
0.0278472900390625,
-0.0016040802001953125,
-0.0272674560546875,
-0.0244598388671875,
-0.0016326904296875,
-0.08099365234375,
0.0159149169921875,
-0.003833770751953125,
-0.052764892578125,
0.0037593841552734375,
0.0158233642578125,
-0.006610870361328125,
0.04010009765625,
-0.067138671875,
0.07037353515625,
-0.0253448486328125,
-0.01132965087890625,
0.0243072509765625,
-0.059326171875,
0.0223236083984375,
-0.015289306640625,
0.016204833984375,
0.01493072509765625,
0.0072784423828125,
0.06634521484375,
-0.03741455078125,
0.04815673828125,
-0.0215911865234375,
-0.0111846923828125,
0.006603240966796875,
-0.0012750625610351562,
0.033599853515625,
-0.00986480712890625,
0.00780487060546875,
0.005443572998046875,
-0.00020360946655273438,
-0.0264129638671875,
-0.05743408203125,
0.04180908203125,
-0.0902099609375,
-0.0482177734375,
-0.02716064453125,
-0.04595947265625,
-0.00780487060546875,
0.0389404296875,
0.036773681640625,
0.020172119140625,
-0.01004791259765625,
0.0254364013671875,
0.04913330078125,
-0.0185699462890625,
0.045867919921875,
0.0311737060546875,
-0.022705078125,
-0.0291290283203125,
0.066162109375,
-0.0086212158203125,
-0.004077911376953125,
0.043609619140625,
0.003780364990234375,
-0.01299285888671875,
-0.038665771484375,
-0.0253448486328125,
0.007228851318359375,
-0.0523681640625,
-0.028533935546875,
-0.0643310546875,
-0.054718017578125,
-0.02777099609375,
-0.0266265869140625,
-0.0174560546875,
-0.00453948974609375,
-0.044403076171875,
0.00571441650390625,
-0.005279541015625,
0.04931640625,
0.00948333740234375,
0.0294647216796875,
-0.059844970703125,
0.038299560546875,
0.0275421142578125,
0.029998779296875,
-0.0122528076171875,
-0.04559326171875,
-0.01445770263671875,
-0.012908935546875,
-0.06011962890625,
-0.0474853515625,
0.033782958984375,
-0.0028438568115234375,
0.071044921875,
0.018951416015625,
-0.01525115966796875,
0.035125732421875,
-0.043182373046875,
0.06884765625,
0.0374755859375,
-0.0596923828125,
0.03741455078125,
-0.01641845703125,
0.037506103515625,
0.016845703125,
0.031829833984375,
-0.0156707763671875,
-0.00919342041015625,
-0.061553955078125,
-0.056976318359375,
0.06597900390625,
0.033172607421875,
0.006847381591796875,
0.045562744140625,
0.00400543212890625,
0.0007038116455078125,
0.01213836669921875,
-0.0653076171875,
-0.0263824462890625,
-0.0562744140625,
-0.00841522216796875,
0.005725860595703125,
-0.0157470703125,
-0.0133514404296875,
-0.02947998046875,
0.05413818359375,
-0.0126190185546875,
0.041900634765625,
0.0112152099609375,
-0.035491943359375,
0.0074310302734375,
0.004726409912109375,
0.07171630859375,
0.04595947265625,
-0.018402099609375,
0.00714111328125,
-0.009246826171875,
-0.056304931640625,
0.00585174560546875,
0.0298004150390625,
-0.0033111572265625,
-0.0015430450439453125,
0.049652099609375,
0.08697509765625,
-0.0114898681640625,
-0.00824737548828125,
0.06243896484375,
-0.01477813720703125,
-0.054779052734375,
-0.026519775390625,
-0.01523590087890625,
-0.007289886474609375,
0.01520538330078125,
0.03265380859375,
0.019561767578125,
0.0016183853149414062,
-0.007289886474609375,
0.0159759521484375,
0.0251312255859375,
-0.040802001953125,
-0.0154571533203125,
0.059844970703125,
0.0102691650390625,
-0.053131103515625,
0.038970947265625,
-0.01158905029296875,
-0.03717041015625,
0.039581298828125,
0.052520751953125,
0.065185546875,
-0.01302337646484375,
0.02886962890625,
0.009368896484375,
0.0279388427734375,
0.0172576904296875,
0.00730133056640625,
-0.006954193115234375,
-0.050628662109375,
-0.02197265625,
-0.041656494140625,
-0.01409912109375,
0.03070068359375,
-0.03515625,
0.0281829833984375,
-0.0222625732421875,
0.0157318115234375,
-0.007266998291015625,
-0.00562286376953125,
-0.07568359375,
0.016204833984375,
0.03765869140625,
0.06951904296875,
-0.04205322265625,
0.06414794921875,
0.07305908203125,
-0.03582763671875,
-0.06365966796875,
0.0081634521484375,
0.010162353515625,
-0.0753173828125,
0.042938232421875,
0.024566650390625,
-0.0020427703857421875,
0.0003857612609863281,
-0.051544189453125,
-0.062408447265625,
0.10162353515625,
0.0169219970703125,
-0.01168060302734375,
-0.0272674560546875,
0.0007114410400390625,
0.038330078125,
-0.034912109375,
0.02691650390625,
0.00847625732421875,
0.04327392578125,
0.0218048095703125,
-0.05804443359375,
0.003360748291015625,
-0.048858642578125,
0.0249481201171875,
-0.0085906982421875,
-0.04644775390625,
0.058990478515625,
0.00201416015625,
-0.005016326904296875,
0.00604248046875,
0.0596923828125,
0.02813720703125,
0.0242919921875,
0.0430908203125,
0.03436279296875,
0.051055908203125,
-0.00616455078125,
0.08154296875,
-0.01558685302734375,
0.0186309814453125,
0.0802001953125,
-0.004436492919921875,
0.0249786376953125,
0.03375244140625,
-0.0104217529296875,
0.048858642578125,
0.037109375,
-0.0168304443359375,
0.038909912109375,
0.00002467632293701172,
0.021484375,
-0.0034885406494140625,
0.0146331787109375,
-0.0408935546875,
0.0261993408203125,
0.0093994140625,
-0.03485107421875,
-0.01201629638671875,
0.0252838134765625,
0.0034580230712890625,
-0.004364013671875,
-0.0121917724609375,
0.05438232421875,
-0.002010345458984375,
-0.0360107421875,
0.0284881591796875,
-0.007488250732421875,
0.04986572265625,
-0.05657958984375,
-0.001438140869140625,
-0.0181121826171875,
0.00505828857421875,
-0.0244293212890625,
-0.06304931640625,
0.020477294921875,
-0.026947021484375,
-0.0194854736328125,
-0.041839599609375,
0.06634521484375,
-0.01361083984375,
-0.0296630859375,
0.00791168212890625,
0.03204345703125,
-0.0060272216796875,
-0.0019502639770507812,
-0.061004638671875,
0.019775390625,
0.0038204193115234375,
-0.0281982421875,
0.0394287109375,
0.02655029296875,
-0.0164642333984375,
0.03350830078125,
0.051727294921875,
-0.01824951171875,
-0.0031757354736328125,
0.01715087890625,
0.0731201171875,
-0.0245361328125,
-0.052886962890625,
-0.052520751953125,
0.059051513671875,
-0.0268402099609375,
-0.0176239013671875,
0.06884765625,
0.0533447265625,
0.0628662109375,
-0.0164642333984375,
0.054962158203125,
0.0115509033203125,
0.00603485107421875,
-0.040802001953125,
0.0701904296875,
-0.06341552734375,
-0.001068115234375,
-0.04425048828125,
-0.0809326171875,
-0.052886962890625,
0.041748046875,
-0.0174713134765625,
0.0079193115234375,
0.061981201171875,
0.0560302734375,
-0.01580810546875,
-0.00893402099609375,
0.043548583984375,
-0.001567840576171875,
0.0439453125,
0.0032024383544921875,
0.0572509765625,
-0.03582763671875,
0.051422119140625,
-0.01806640625,
-0.0080108642578125,
-0.010986328125,
-0.06561279296875,
-0.070556640625,
-0.0687255859375,
-0.0226593017578125,
-0.03662109375,
-0.025421142578125,
0.03656005859375,
0.0699462890625,
-0.04486083984375,
0.01264190673828125,
0.00363922119140625,
0.0137939453125,
-0.0014905929565429688,
-0.01519012451171875,
0.055419921875,
-0.012054443359375,
-0.04119873046875,
0.0028629302978515625,
0.0283050537109375,
0.0274200439453125,
-0.02642822265625,
-0.026092529296875,
-0.01494598388671875,
-0.008514404296875,
0.04290771484375,
0.0202178955078125,
-0.05706787109375,
-0.0056915283203125,
-0.010772705078125,
-0.02044677734375,
0.022735595703125,
0.06488037109375,
-0.039825439453125,
0.035308837890625,
0.034393310546875,
0.03631591796875,
0.035858154296875,
0.002777099609375,
0.029998779296875,
-0.06719970703125,
0.030670166015625,
0.00016760826110839844,
0.0364990234375,
0.037811279296875,
-0.0283966064453125,
0.0304718017578125,
0.01493072509765625,
-0.040557861328125,
-0.055877685546875,
-0.00037407875061035156,
-0.07672119140625,
-0.010162353515625,
0.088623046875,
-0.006412506103515625,
-0.028564453125,
0.0013494491577148438,
-0.042694091796875,
0.0177154541015625,
-0.00939178466796875,
0.0268707275390625,
0.03277587890625,
-0.00608062744140625,
-0.032867431640625,
-0.0298919677734375,
0.045501708984375,
0.00016582012176513672,
-0.06744384765625,
-0.03472900390625,
0.020172119140625,
-0.00852203369140625,
0.049774169921875,
0.0560302734375,
-0.01313018798828125,
0.0125579833984375,
-0.0237884521484375,
0.030181884765625,
-0.037811279296875,
-0.018157958984375,
-0.02264404296875,
0.018890380859375,
-0.0243988037109375,
-0.02886962890625
]
] |
bert-base-multilingual-cased | 2022-11-16T23:22:54.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"af",
"sq",
"ar",
"an",
"hy",
"ast",
"az",
"ba",
"eu",
"bar",
"be",
"bn",
"inc",
"bs",
"br",
"bg",
"my",
"ca",
"ceb",
"ce",
"zh",
"cv",
"hr",
"cs",
"da",
"nl",
"en",
"et",
"fi",
"fr",
"gl",
"ka",
"de",
"el",
"gu",
"ht",
"he",
"hi",
"hu",
"is",
"io",
"id",
"ga",
"it",
"ja",
"jv",
"kn",
"kk",
"ky",
"ko",
"la",
"lv",
"lt",
"roa",
"nds",
"lm",
"mk",
"mg",
"ms",
"ml",
"mr",
"mn",
"min",
"ne",
"new",
"nb",
"nn",
"oc",
"fa",
"pms",
"pl",
"pt",
"pa",
"ro",
"ru",
"sco",
"sr",
"scn",
"sk",
"sl",
"aze",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"th",
"ta",
"tt",
"te",
"tr",
"uk",
"ud",
"uz",
"vi",
"vo",
"war",
"cy",
"fry",
"pnb",
"yo",
"dataset:wikipedia",
"arxiv:1810.04805",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | bert-base-multilingual-cased | 241 | 4,807,995 | transformers | 2022-03-02T23:29:04 | ---
language:
- multilingual
- af
- sq
- ar
- an
- hy
- ast
- az
- ba
- eu
- bar
- be
- bn
- inc
- bs
- br
- bg
- my
- ca
- ceb
- ce
- zh
- cv
- hr
- cs
- da
- nl
- en
- et
- fi
- fr
- gl
- ka
- de
- el
- gu
- ht
- he
- hi
- hu
- is
- io
- id
- ga
- it
- ja
- jv
- kn
- kk
- ky
- ko
- la
- lv
- lt
- roa
- nds
- lm
- mk
- mg
- ms
- ml
- mr
- mn
- min
- ne
- new
- nb
- nn
- oc
- fa
- pms
- pl
- pt
- pa
- ro
- ru
- sco
- sr
- hr
- scn
- sk
- sl
- aze
- es
- su
- sw
- sv
- tl
- tg
- th
- ta
- tt
- te
- tr
- uk
- ud
- uz
- vi
- vo
- war
- cy
- fry
- pnb
- yo
license: apache-2.0
datasets:
- wikipedia
---
# BERT multilingual base model (cased)
Pretrained model on the top 104 languages with the largest Wikipedias using a masked language modeling (MLM) objective.
It was introduced in [this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is case sensitive: it makes a difference
between english and English.
Disclaimer: The team releasing BERT did not write a model card for this model, so this model card has been written by
the Hugging Face team.
## Model description
BERT is a transformers model pretrained on a large corpus of multilingual data in a self-supervised fashion. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
  the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the languages in the training set that can then be used to
extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a
standard classifier using the features produced by the BERT model as inputs.
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-multilingual-cased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'sequence': "[CLS] Hello I'm a model model. [SEP]",
'score': 0.10182085633277893,
'token': 13192,
'token_str': 'model'},
{'sequence': "[CLS] Hello I'm a world model. [SEP]",
'score': 0.052126359194517136,
'token': 11356,
'token_str': 'world'},
{'sequence': "[CLS] Hello I'm a data model. [SEP]",
'score': 0.048930276185274124,
'token': 11165,
'token_str': 'data'},
{'sequence': "[CLS] Hello I'm a flight model. [SEP]",
'score': 0.02036019042134285,
'token': 23578,
'token_str': 'flight'},
{'sequence': "[CLS] Hello I'm a business model. [SEP]",
'score': 0.020079681649804115,
'token': 14155,
'token_str': 'business'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-multilingual-cased')
model = BertModel.from_pretrained("bert-base-multilingual-cased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-multilingual-cased')
model = TFBertModel.from_pretrained("bert-base-multilingual-cased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
## Training data
The BERT model was pretrained on the 104 languages with the largest Wikipedias. You can find the complete list
[here](https://github.com/google-research/bert/blob/master/multilingual.md#list-of-languages).
## Training procedure
### Preprocessing
The texts are tokenized using WordPiece with a shared vocabulary of 110,000 tokens (as this is the cased model, no
lowercasing is applied). The languages with a larger Wikipedia are under-sampled and the ones with lower resources are
over-sampled. For languages such as Chinese, Japanese Kanji and Korean Hanja that do not use spaces, spaces are added
around every character in the CJK Unicode range.
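The CJK handling described above can be sketched in plain Python (a simplified illustration, not the actual BERT preprocessing code; the `pad_cjk` helper and the two Unicode ranges shown are assumptions covering only the main CJK ideograph blocks):

```python
def pad_cjk(text):
    # Sketch: add spaces around characters in the main CJK Unicode blocks so
    # that each ideograph becomes its own whitespace-delimited token.
    out = []
    for ch in text:
        cp = ord(ch)
        # CJK Unified Ideographs and Extension A (illustrative subset only)
        is_cjk = (0x4E00 <= cp <= 0x9FFF) or (0x3400 <= cp <= 0x4DBF)
        out.append(f" {ch} " if is_cjk else ch)
    return "".join(out)
```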
The inputs of the model are then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the two
"sentences" have a combined length of less than 512 tokens.
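The sentence-pairing step above can be sketched as follows (an illustrative simplification, not the original BERT data pipeline; the `make_nsp_example` helper and its span lists are assumptions):

```python
import random

def make_nsp_example(doc_spans, corpus_spans, rng=None):
    """Build one next-sentence-prediction input: with probability 0.5 the
    second span follows the first in the same document (IsNext), otherwise
    it is a random span drawn from the whole corpus (NotNext)."""
    rng = rng or random.Random(0)
    i = rng.randrange(len(doc_spans) - 1)
    first = doc_spans[i]
    if rng.random() < 0.5:
        second, is_next = doc_spans[i + 1], True
    else:
        second, is_next = rng.choice(corpus_spans), False
    return f"[CLS] {first} [SEP] {second} [SEP]", is_next
```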
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
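The 80/10/10 scheme above can be sketched in plain Python (a simplified illustration of the procedure, not the actual pretraining code; the `mask_tokens` helper, the toy fallback vocabulary, and the `-100` ignore-label convention are assumptions):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", vocab=None, seed=0):
    # Select 15% of positions; of those, 80% become [MASK], 10% a random
    # vocabulary token, and 10% are left unchanged. Unselected positions
    # get the ignore label -100 so they do not contribute to the loss.
    rng = rng = random.Random(seed)
    vocab = vocab or ["the", "a", "dog", "cat"]
    labels = [-100] * len(tokens)
    out = list(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < 0.15:
            labels[i] = tok  # the model must predict the original token
            r = rng.random()
            if r < 0.8:
                out[i] = mask_token
            elif r < 0.9:
                out[i] = rng.choice(vocab)
            # else: keep the original token in place
    return out, labels
```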
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1810-04805,
author = {Jacob Devlin and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
Understanding},
journal = {CoRR},
volume = {abs/1810.04805},
year = {2018},
url = {http://arxiv.org/abs/1810.04805},
archivePrefix = {arXiv},
eprint = {1810.04805},
timestamp = {Tue, 30 Oct 2018 20:39:56 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 7,104 | [
[
-0.0262603759765625,
-0.060150146484375,
0.012237548828125,
0.025970458984375,
-0.031005859375,
0.005405426025390625,
-0.0201416015625,
-0.0233154296875,
0.02862548828125,
0.040130615234375,
-0.052001953125,
-0.0308380126953125,
-0.048553466796875,
0.003925323486328125,
-0.0308685302734375,
0.0953369140625,
0.00685882568359375,
0.0175628662109375,
0.0030059814453125,
0.006298065185546875,
-0.0290374755859375,
-0.0699462890625,
-0.04241943359375,
-0.0198822021484375,
0.033294677734375,
0.0201416015625,
0.042022705078125,
0.0312347412109375,
0.03753662109375,
0.026611328125,
-0.006771087646484375,
-0.00974273681640625,
-0.0256195068359375,
0.0005216598510742188,
0.002582550048828125,
-0.0306549072265625,
-0.0215606689453125,
0.0075225830078125,
0.05181884765625,
0.066650390625,
0.0024204254150390625,
0.01389312744140625,
-0.006557464599609375,
0.037384033203125,
-0.0184173583984375,
0.0223388671875,
-0.039764404296875,
0.005794525146484375,
-0.021514892578125,
0.017059326171875,
-0.0271759033203125,
-0.0170135498046875,
0.0206298828125,
-0.042755126953125,
0.0218353271484375,
0.0107421875,
0.09088134765625,
0.0006489753723144531,
-0.0181121826171875,
-0.01369476318359375,
-0.0357666015625,
0.05743408203125,
-0.052459716796875,
0.0270538330078125,
0.03192138671875,
0.01263427734375,
-0.01078033447265625,
-0.08062744140625,
-0.047943115234375,
-0.01238250732421875,
-0.016357421875,
0.00579071044921875,
-0.004848480224609375,
0.00439453125,
0.0260009765625,
0.0266571044921875,
-0.04791259765625,
-0.0010995864868164062,
-0.047515869140625,
-0.033203125,
0.0469970703125,
-0.009552001953125,
0.0251617431640625,
-0.031951904296875,
-0.029937744140625,
-0.0257568359375,
-0.02960205078125,
0.005855560302734375,
0.038787841796875,
0.032440185546875,
-0.017242431640625,
0.052337646484375,
-0.01013946533203125,
0.04864501953125,
0.0048370361328125,
0.0064697265625,
0.04034423828125,
-0.021636962890625,
-0.0245513916015625,
0.0028247833251953125,
0.06854248046875,
0.01078033447265625,
0.037841796875,
-0.0115814208984375,
-0.00978851318359375,
-0.00804901123046875,
0.0159912109375,
-0.0570068359375,
-0.0198211669921875,
0.0135650634765625,
-0.043792724609375,
-0.017059326171875,
0.0215606689453125,
-0.045196533203125,
0.0036334991455078125,
-0.0087432861328125,
0.046112060546875,
-0.033477783203125,
-0.01509857177734375,
0.00733184814453125,
-0.021636962890625,
0.01473236083984375,
-0.00553131103515625,
-0.0692138671875,
0.01568603515625,
0.046173095703125,
0.0631103515625,
0.007476806640625,
-0.0288848876953125,
-0.0198974609375,
-0.0173797607421875,
-0.02191162109375,
0.041015625,
-0.0201873779296875,
-0.0276947021484375,
0.0047760009765625,
0.0179595947265625,
-0.01110076904296875,
-0.0188140869140625,
0.04632568359375,
-0.043609619140625,
0.0384521484375,
-0.01039886474609375,
-0.041748046875,
-0.0286865234375,
0.006984710693359375,
-0.05145263671875,
0.0860595703125,
0.01369476318359375,
-0.050994873046875,
0.02911376953125,
-0.05706787109375,
-0.04766845703125,
0.01324462890625,
0.010833740234375,
-0.03753662109375,
0.0057220458984375,
0.0160369873046875,
0.03289794921875,
0.0056610107421875,
0.03643798828125,
-0.022430419921875,
-0.0282440185546875,
0.0258026123046875,
-0.023834228515625,
0.081787109375,
0.019805908203125,
-0.01837158203125,
0.00859832763671875,
-0.05841064453125,
0.006816864013671875,
0.01007843017578125,
-0.0284576416015625,
-0.015716552734375,
-0.00494384765625,
0.0308990478515625,
0.0184173583984375,
0.0284423828125,
-0.04742431640625,
0.01806640625,
-0.044769287109375,
0.046295166015625,
0.053558349609375,
-0.004177093505859375,
0.022705078125,
-0.0206146240234375,
0.028076171875,
0.00732421875,
0.0020599365234375,
-0.018798828125,
-0.051177978515625,
-0.07073974609375,
-0.02264404296875,
0.048004150390625,
0.048980712890625,
-0.04803466796875,
0.054412841796875,
-0.0192718505859375,
-0.032135009765625,
-0.05279541015625,
0.0020694732666015625,
0.0254058837890625,
0.02850341796875,
0.0265655517578125,
-0.03338623046875,
-0.06561279296875,
-0.07000732421875,
-0.01030731201171875,
-0.021453857421875,
-0.0039825439453125,
0.00827789306640625,
0.05072021484375,
-0.0199737548828125,
0.06707763671875,
-0.037445068359375,
-0.021240234375,
-0.0267486572265625,
0.0202178955078125,
0.03546142578125,
0.049468994140625,
0.0335693359375,
-0.0582275390625,
-0.042144775390625,
-0.0203399658203125,
-0.044830322265625,
-0.002361297607421875,
-0.0112152099609375,
-0.016387939453125,
0.025787353515625,
0.04290771484375,
-0.052703857421875,
0.031494140625,
0.033203125,
-0.0298309326171875,
0.034698486328125,
-0.0207672119140625,
-0.006572723388671875,
-0.09393310546875,
0.01358795166015625,
-0.01244354248046875,
-0.01434326171875,
-0.05657958984375,
0.004329681396484375,
-0.0005784034729003906,
-0.007843017578125,
-0.037200927734375,
0.04278564453125,
-0.031219482421875,
-0.0007224082946777344,
-0.002330780029296875,
-0.001888275146484375,
0.001956939697265625,
0.050384521484375,
0.0143585205078125,
0.048797607421875,
0.04248046875,
-0.04205322265625,
0.032135009765625,
0.034210205078125,
-0.049957275390625,
0.01171112060546875,
-0.05633544921875,
0.010589599609375,
-0.0034770965576171875,
0.0125579833984375,
-0.08026123046875,
-0.02105712890625,
0.017059326171875,
-0.04595947265625,
0.021514892578125,
-0.007289886474609375,
-0.052642822265625,
-0.046722412109375,
-0.0111846923828125,
0.029327392578125,
0.045135498046875,
-0.0300750732421875,
0.030975341796875,
0.0227203369140625,
-0.0208282470703125,
-0.050048828125,
-0.06695556640625,
0.0127105712890625,
-0.0080718994140625,
-0.046356201171875,
0.03485107421875,
-0.016143798828125,
-0.0023975372314453125,
-0.0068359375,
0.015655517578125,
-0.01154327392578125,
0.0048828125,
0.01377105712890625,
0.0267486572265625,
-0.018157958984375,
0.009918212890625,
-0.0031833648681640625,
-0.0015277862548828125,
0.00920867919921875,
-0.020477294921875,
0.061492919921875,
-0.00640106201171875,
-0.01279449462890625,
-0.0225677490234375,
0.033660888671875,
0.033660888671875,
-0.01042938232421875,
0.05596923828125,
0.061981201171875,
-0.035675048828125,
0.00759124755859375,
-0.036956787109375,
-0.013397216796875,
-0.036102294921875,
0.046234130859375,
-0.037628173828125,
-0.0606689453125,
0.05426025390625,
0.02642822265625,
-0.0033359527587890625,
0.050567626953125,
0.049774169921875,
-0.01363372802734375,
0.07244873046875,
0.050750732421875,
-0.0137481689453125,
0.0399169921875,
-0.0165863037109375,
0.028076171875,
-0.051300048828125,
-0.0311737060546875,
-0.031402587890625,
-0.0177154541015625,
-0.05426025390625,
-0.01558685302734375,
0.013671875,
0.021484375,
-0.0268707275390625,
0.041351318359375,
-0.03338623046875,
0.0208892822265625,
0.076171875,
0.01378631591796875,
-0.0088958740234375,
0.000591278076171875,
-0.02117919921875,
-0.00399017333984375,
-0.0430908203125,
-0.0234222412109375,
0.0872802734375,
0.0335693359375,
0.045196533203125,
0.0031490325927734375,
0.05029296875,
0.0183258056640625,
0.003864288330078125,
-0.057769775390625,
0.038604736328125,
-0.022369384765625,
-0.07366943359375,
-0.0213165283203125,
-0.0152587890625,
-0.07489013671875,
0.01434326171875,
-0.0194854736328125,
-0.0594482421875,
0.005855560302734375,
-0.01506805419921875,
-0.019378662109375,
0.01190185546875,
-0.06072998046875,
0.0731201171875,
-0.023834228515625,
-0.0026721954345703125,
0.0028171539306640625,
-0.0703125,
0.0201263427734375,
-0.01016998291015625,
0.007747650146484375,
0.00042247772216796875,
0.02532958984375,
0.07122802734375,
-0.036285400390625,
0.07403564453125,
-0.0103302001953125,
0.01531219482421875,
0.0167999267578125,
-0.01363372802734375,
0.017730712890625,
-0.0002720355987548828,
0.00955963134765625,
0.035003662109375,
-0.003238677978515625,
-0.03106689453125,
-0.0211029052734375,
0.03350830078125,
-0.06591796875,
-0.0430908203125,
-0.041168212890625,
-0.04443359375,
0.0029735565185546875,
0.033782958984375,
0.039398193359375,
0.0238800048828125,
-0.019012451171875,
0.0187225341796875,
0.037689208984375,
-0.0283355712890625,
0.057403564453125,
0.02850341796875,
-0.0203399658203125,
-0.034881591796875,
0.054779052734375,
0.00620269775390625,
0.008575439453125,
0.0399169921875,
0.0088653564453125,
-0.03948974609375,
-0.0210113525390625,
-0.0301055908203125,
0.015167236328125,
-0.046173095703125,
-0.01166534423828125,
-0.055938720703125,
-0.047119140625,
-0.0540771484375,
0.0007815361022949219,
-0.00791168212890625,
-0.045379638671875,
-0.02984619140625,
-0.01153564453125,
0.0276336669921875,
0.041168212890625,
-0.0176849365234375,
0.038238525390625,
-0.0550537109375,
0.0223388671875,
0.024322509765625,
0.0275726318359375,
-0.017852783203125,
-0.054046630859375,
-0.027984619140625,
0.00969696044921875,
-0.017578125,
-0.054931640625,
0.043304443359375,
0.0229034423828125,
0.05047607421875,
0.03289794921875,
-0.006683349609375,
0.052764892578125,
-0.048797607421875,
0.07122802734375,
0.01352691650390625,
-0.0830078125,
0.041900634765625,
-0.0103759765625,
0.0135650634765625,
0.025909423828125,
0.0212860107421875,
-0.052093505859375,
-0.03009033203125,
-0.0606689453125,
-0.0699462890625,
0.061767578125,
0.019134521484375,
0.033355712890625,
-0.00493621826171875,
0.01971435546875,
0.00519561767578125,
0.0265045166015625,
-0.0858154296875,
-0.039031982421875,
-0.040008544921875,
-0.025634765625,
-0.0202789306640625,
-0.0276947021484375,
0.0019464492797851562,
-0.0301055908203125,
0.052276611328125,
0.0101776123046875,
0.040618896484375,
0.00775146484375,
-0.0238494873046875,
0.00989532470703125,
0.0091552734375,
0.0582275390625,
0.0286865234375,
-0.03350830078125,
-0.0038814544677734375,
-0.0000014901161193847656,
-0.0521240234375,
-0.007389068603515625,
0.025238037109375,
-0.00910186767578125,
0.0201263427734375,
0.040740966796875,
0.07275390625,
0.0107879638671875,
-0.0390625,
0.044342041015625,
0.01045989990234375,
-0.021636962890625,
-0.031494140625,
-0.007396697998046875,
-0.004222869873046875,
0.013153076171875,
0.0379638671875,
0.0009074211120605469,
0.007556915283203125,
-0.04119873046875,
0.0293121337890625,
0.029296875,
-0.030548095703125,
-0.0199737548828125,
0.0552978515625,
0.00940704345703125,
-0.0347900390625,
0.0635986328125,
-0.01165008544921875,
-0.06402587890625,
0.05633544921875,
0.050384521484375,
0.07110595703125,
-0.007022857666015625,
0.0217437744140625,
0.0352783203125,
0.032867431640625,
-0.005016326904296875,
0.022979736328125,
0.012939453125,
-0.06744384765625,
-0.03338623046875,
-0.056640625,
-0.01357269287109375,
0.0218963623046875,
-0.053741455078125,
0.0237579345703125,
-0.0277862548828125,
-0.01311492919921875,
0.01052093505859375,
0.01065826416015625,
-0.0494384765625,
0.027435302734375,
0.0128936767578125,
0.06878662109375,
-0.06756591796875,
0.0789794921875,
0.052093505859375,
-0.049774169921875,
-0.0546875,
-0.0172271728515625,
-0.027557373046875,
-0.08074951171875,
0.06585693359375,
0.0206146240234375,
0.0367431640625,
0.0005331039428710938,
-0.044525146484375,
-0.06494140625,
0.053985595703125,
0.01116943359375,
-0.032989501953125,
-0.01078033447265625,
0.01396942138671875,
0.04296875,
-0.03192138671875,
0.0222625732421875,
0.029693603515625,
0.034759521484375,
0.0009541511535644531,
-0.06414794921875,
-0.003513336181640625,
-0.028045654296875,
0.0075531005859375,
0.004421234130859375,
-0.040252685546875,
0.0858154296875,
-0.01251220703125,
-0.00397491455078125,
0.0107269287109375,
0.045074462890625,
0.00939178466796875,
0.00318145751953125,
0.029815673828125,
0.040740966796875,
0.050689697265625,
-0.0226898193359375,
0.06494140625,
-0.024444580078125,
0.042236328125,
0.06756591796875,
0.002964019775390625,
0.06439208984375,
0.0361328125,
-0.01512908935546875,
0.063232421875,
0.06646728515625,
-0.0277862548828125,
0.06256103515625,
0.01546478271484375,
0.000026106834411621094,
-0.00968170166015625,
0.01116943359375,
-0.032012939453125,
0.036590576171875,
0.023956298828125,
-0.03485107421875,
0.0006132125854492188,
0.0072784423828125,
0.01187896728515625,
-0.0157318115234375,
-0.0210113525390625,
0.0528564453125,
0.006397247314453125,
-0.056488037109375,
0.0301666259765625,
0.017730712890625,
0.054351806640625,
-0.04718017578125,
0.006927490234375,
-0.01971435546875,
0.01020050048828125,
0.0001055002212524414,
-0.058074951171875,
0.0175323486328125,
-0.0098876953125,
-0.029754638671875,
-0.0262603759765625,
0.048553466796875,
-0.04571533203125,
-0.056854248046875,
0.01110076904296875,
0.0274505615234375,
0.025238037109375,
-0.00537109375,
-0.0626220703125,
-0.012237548828125,
0.003017425537109375,
-0.02008056640625,
0.0132598876953125,
0.0261993408203125,
-0.004573822021484375,
0.045257568359375,
0.058837890625,
0.0008058547973632812,
0.01544189453125,
0.01309967041015625,
0.053466796875,
-0.06109619140625,
-0.052825927734375,
-0.06561279296875,
0.042999267578125,
-0.0088958740234375,
-0.033294677734375,
0.048736572265625,
0.050567626953125,
0.06964111328125,
-0.0232391357421875,
0.05767822265625,
-0.0165557861328125,
0.03778076171875,
-0.0394287109375,
0.06439208984375,
-0.03265380859375,
-0.0008392333984375,
-0.0208587646484375,
-0.0640869140625,
-0.0254058837890625,
0.0655517578125,
-0.001323699951171875,
0.01006317138671875,
0.056549072265625,
0.046112060546875,
0.0017948150634765625,
-0.0141143798828125,
0.0203399658203125,
0.0198974609375,
0.009124755859375,
0.0355224609375,
0.043548583984375,
-0.049224853515625,
0.041412353515625,
-0.0245819091796875,
-0.0025310516357421875,
-0.024017333984375,
-0.0648193359375,
-0.08343505859375,
-0.05609130859375,
-0.0233001708984375,
-0.03875732421875,
-0.007061004638671875,
0.061492919921875,
0.060272216796875,
-0.07794189453125,
-0.023712158203125,
-0.007175445556640625,
0.0121002197265625,
-0.01561737060546875,
-0.020721435546875,
0.034759521484375,
-0.030914306640625,
-0.06463623046875,
0.01364898681640625,
-0.004573822021484375,
0.018280029296875,
-0.0187835693359375,
0.0044708251953125,
-0.032928466796875,
0.003154754638671875,
0.042449951171875,
0.0137481689453125,
-0.058929443359375,
-0.031646728515625,
0.00891876220703125,
-0.016754150390625,
0.00255584716796875,
0.041961669921875,
-0.049957275390625,
0.034454345703125,
0.03045654296875,
0.03228759765625,
0.056396484375,
-0.0072479248046875,
0.045806884765625,
-0.0828857421875,
0.0308380126953125,
0.00499725341796875,
0.033050537109375,
0.0252685546875,
-0.0258941650390625,
0.032257080078125,
0.032623291015625,
-0.03369140625,
-0.06329345703125,
0.0010728836059570312,
-0.07794189453125,
-0.025238037109375,
0.0770263671875,
-0.0205841064453125,
-0.01091766357421875,
-0.007534027099609375,
-0.0170135498046875,
0.033172607421875,
-0.0189361572265625,
0.05706787109375,
0.0782470703125,
0.0145416259765625,
-0.0098114013671875,
-0.0271453857421875,
0.032989501953125,
0.02630615234375,
-0.040374755859375,
-0.0256195068359375,
0.01540374755859375,
0.028411865234375,
0.0231475830078125,
0.040802001953125,
0.0015249252319335938,
0.0072021484375,
0.0010280609130859375,
0.0254974365234375,
0.001071929931640625,
-0.0167388916015625,
-0.0239410400390625,
0.0028896331787109375,
-0.01763916015625,
-0.04132080078125
]
] |
CompVis/stable-diffusion-safety-checker | 2022-11-25T17:21:38.000Z | [
"transformers",
"pytorch",
"clip",
"arxiv:2103.00020",
"arxiv:1910.09700",
"endpoints_compatible",
"has_space",
"region:us"
] | null | CompVis | null | null | CompVis/stable-diffusion-safety-checker | 79 | 4,714,020 | transformers | 2022-08-22T10:22:34 | ---
tags:
- clip
---
# Model Card for stable-diffusion-safety-checker
# Model Details
## Model Description
More information needed
- **Developed by:** More information needed
- **Shared by [Optional]:** CompVis
- **Model type:** Image Identification
- **Language(s) (NLP):** More information needed
- **License:** More information needed
- **Parent Model:** [CLIP](https://huggingface.co/openai/clip-vit-large-patch14)
- **Resources for more information:**
- [CLIP Paper](https://arxiv.org/abs/2103.00020)
- [Stable Diffusion Model Card](https://github.com/CompVis/stable-diffusion/blob/main/Stable_Diffusion_v1_Model_Card.md)
# Uses
## Direct Use
This model can be used for identifying NSFW images.
The CLIP model developers note in their [model card](https://huggingface.co/openai/clip-vit-large-patch14):
>The primary intended users of these models are AI researchers.
>We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
## Downstream Use [Optional]
More information needed.
## Out-of-Scope Use
The model is not intended to be used with transformers but with diffusers. This model should also not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
The CLIP model developers note in their [model card](https://huggingface.co/openai/clip-vit-large-patch14):
> We find that the performance of CLIP - and the specific biases it exhibits - can depend significantly on class design and the choices one makes for categories to include and exclude. We tested the risk of certain kinds of denigration with CLIP by classifying images of people from Fairface into crime-related and non-human animal categories. We found significant disparities with respect to race and gender. Additionally, we found that these disparities could shift based on how the classes were constructed.
> We also tested the performance of CLIP on gender, race and age classification using the Fairface dataset (We default to using race categories as they are constructed in the Fairface dataset.) in order to assess quality of performance across different demographics. We found accuracy >96% across all races for gender classification with โMiddle Easternโ having the highest accuracy (98.4%) and โWhiteโ having the lowest (96.5%). Additionally, CLIP averaged ~93% for racial classification and ~63% for age classification.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
More information needed
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
More information needed
### Factors
More information needed
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
The CLIP model developers note in their [model card](https://huggingface.co/openai/clip-vit-large-patch14):
> The base model uses a ViT-L/14 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed.
# Citation
**BibTeX:**
More information needed
**APA:**
More information needed
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
CompVis in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoProcessor
# The checker class ships with diffusers; there is no SafetyChecker class in transformers.
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker

processor = AutoProcessor.from_pretrained("CompVis/stable-diffusion-safety-checker")
safety_checker = StableDiffusionSafetyChecker.from_pretrained("CompVis/stable-diffusion-safety-checker")
```
</details>
| 5,359 | [
[
-0.033172607421875,
-0.055419921875,
0.018310546875,
0.00818634033203125,
-0.0128326416015625,
-0.01904296875,
0.0013494491577148438,
-0.04119873046875,
0.0038776397705078125,
0.029571533203125,
-0.0265045166015625,
-0.04156494140625,
-0.062469482421875,
-0.005565643310546875,
-0.039215087890625,
0.0665283203125,
-0.00023174285888671875,
0.00232696533203125,
-0.0109710693359375,
-0.01032257080078125,
-0.0325927734375,
-0.037078857421875,
-0.054290771484375,
-0.0112762451171875,
0.01392364501953125,
-0.0003058910369873047,
0.057220458984375,
0.05810546875,
0.04534912109375,
0.0261688232421875,
-0.034271240234375,
-0.0192413330078125,
-0.042938232421875,
-0.032562255859375,
-0.021270751953125,
-0.0234832763671875,
-0.0430908203125,
0.01311492919921875,
0.048858642578125,
0.029296875,
-0.0169830322265625,
0.0190887451171875,
0.006320953369140625,
0.031646728515625,
-0.06378173828125,
0.0035858154296875,
-0.038482666015625,
0.01031494140625,
-0.01715087890625,
0.0159454345703125,
-0.021331787109375,
-0.0172119140625,
0.0049591064453125,
-0.03717041015625,
0.0242767333984375,
-0.0082244873046875,
0.0916748046875,
0.0277557373046875,
-0.0156402587890625,
-0.003742218017578125,
-0.046142578125,
0.05059814453125,
-0.046173095703125,
0.031646728515625,
0.01560211181640625,
0.0159759521484375,
0.0031719207763671875,
-0.059112548828125,
-0.050994873046875,
0.00574493408203125,
0.00835418701171875,
0.01174163818359375,
-0.0196533203125,
-0.0031871795654296875,
0.03668212890625,
0.027099609375,
-0.023284912109375,
0.003124237060546875,
-0.03857421875,
-0.027984619140625,
0.052154541015625,
0.01788330078125,
0.02777099609375,
-0.0158233642578125,
-0.04595947265625,
-0.0270233154296875,
-0.0287322998046875,
0.0281524658203125,
0.031341552734375,
-0.00276947021484375,
-0.03643798828125,
0.04022216796875,
-0.002315521240234375,
0.0296783447265625,
0.00699615478515625,
-0.0245361328125,
0.0345458984375,
-0.0302734375,
-0.0213623046875,
-0.012298583984375,
0.07373046875,
0.048065185546875,
0.0010957717895507812,
0.0203704833984375,
-0.01136016845703125,
0.01361846923828125,
0.01145172119140625,
-0.087158203125,
-0.02813720703125,
-0.00007832050323486328,
-0.053436279296875,
-0.03924560546875,
0.00637054443359375,
-0.08160400390625,
-0.00005537271499633789,
-0.00537109375,
0.04443359375,
-0.0257415771484375,
-0.034942626953125,
0.0092620849609375,
-0.02301025390625,
0.01474761962890625,
0.0279083251953125,
-0.052337646484375,
0.027374267578125,
0.0266265869140625,
0.07867431640625,
-0.03521728515625,
-0.008514404296875,
0.0008139610290527344,
0.0032176971435546875,
0.0009794235229492188,
0.0484619140625,
-0.037872314453125,
-0.039215087890625,
-0.008575439453125,
0.03179931640625,
-0.00003248453140258789,
-0.028533935546875,
0.053741455078125,
-0.023406982421875,
0.0202484130859375,
-0.0083160400390625,
-0.031494140625,
-0.030731201171875,
0.0152435302734375,
-0.055267333984375,
0.08563232421875,
0.006870269775390625,
-0.0787353515625,
0.018829345703125,
-0.054840087890625,
-0.0015707015991210938,
-0.0092620849609375,
-0.003383636474609375,
-0.0634765625,
-0.0361328125,
0.01421356201171875,
0.02435302734375,
-0.0085906982421875,
0.0294189453125,
-0.040802001953125,
-0.0194549560546875,
0.004405975341796875,
-0.03363037109375,
0.0902099609375,
0.01617431640625,
-0.033111572265625,
0.008636474609375,
-0.04693603515625,
-0.0170440673828125,
0.0235595703125,
-0.00775909423828125,
-0.01369476318359375,
-0.01558685302734375,
0.024383544921875,
0.022979736328125,
0.011566162109375,
-0.046417236328125,
-0.01033782958984375,
0.002307891845703125,
0.034515380859375,
0.06451416015625,
0.002208709716796875,
0.0278472900390625,
-0.0206298828125,
0.038909912109375,
0.004062652587890625,
0.0418701171875,
0.00013113021850585938,
-0.047576904296875,
-0.0574951171875,
-0.0288848876953125,
0.03570556640625,
0.053436279296875,
-0.0301666259765625,
0.039520263671875,
-0.00028514862060546875,
-0.048095703125,
-0.016937255859375,
-0.007541656494140625,
0.03167724609375,
0.04949951171875,
0.0243072509765625,
-0.04473876953125,
-0.032196044921875,
-0.06768798828125,
0.0147552490234375,
-0.00202178955078125,
0.0090789794921875,
0.0249176025390625,
0.06378173828125,
-0.0321044921875,
0.06793212890625,
-0.047271728515625,
-0.0228729248046875,
0.003082275390625,
-0.006317138671875,
-0.0013494491577148438,
0.050384521484375,
0.0606689453125,
-0.0673828125,
-0.019775390625,
-0.03424072265625,
-0.06219482421875,
0.005161285400390625,
0.0120697021484375,
-0.02337646484375,
0.012481689453125,
0.0355224609375,
-0.03802490234375,
0.0560302734375,
0.0408935546875,
-0.027740478515625,
0.0443115234375,
-0.00890350341796875,
0.00727081298828125,
-0.06622314453125,
0.027557373046875,
0.0162811279296875,
-0.0256805419921875,
-0.035888671875,
0.0006461143493652344,
0.0001308917999267578,
-0.020477294921875,
-0.0594482421875,
0.04754638671875,
-0.016632080078125,
0.01474761962890625,
-0.02374267578125,
-0.00661468505859375,
0.00396728515625,
0.04766845703125,
0.02154541015625,
0.06976318359375,
0.041961669921875,
-0.05438232421875,
-0.010467529296875,
0.033111572265625,
-0.02435302734375,
0.037322998046875,
-0.06103515625,
0.0052642822265625,
-0.01364898681640625,
0.018341064453125,
-0.04461669921875,
-0.0188140869140625,
0.0261077880859375,
-0.0264739990234375,
0.02691650390625,
-0.0092926025390625,
-0.02630615234375,
-0.03765869140625,
-0.020782470703125,
0.043609619140625,
0.05413818359375,
-0.032958984375,
0.031585693359375,
0.061798095703125,
-0.001773834228515625,
-0.049468994140625,
-0.048675537109375,
-0.01422882080078125,
-0.0208282470703125,
-0.05474853515625,
0.044097900390625,
-0.00732421875,
-0.00585174560546875,
0.0008196830749511719,
0.01036834716796875,
-0.0201873779296875,
0.0087890625,
0.03680419921875,
0.03070068359375,
0.007793426513671875,
-0.001819610595703125,
-0.003017425537109375,
-0.006145477294921875,
0.005889892578125,
0.015167236328125,
0.0250244140625,
-0.00978851318359375,
-0.0079345703125,
-0.04522705078125,
0.0247955322265625,
0.0372314453125,
-0.01302337646484375,
0.060394287109375,
0.059417724609375,
-0.041717529296875,
0.0013303756713867188,
-0.039215087890625,
-0.00995635986328125,
-0.038787841796875,
0.032135009765625,
-0.01372528076171875,
-0.05816650390625,
0.057098388671875,
0.0062103271484375,
-0.0159149169921875,
0.061553955078125,
0.04449462890625,
-0.002429962158203125,
0.0772705078125,
0.0733642578125,
-0.0003676414489746094,
0.052764892578125,
-0.043182373046875,
0.00012552738189697266,
-0.059906005859375,
-0.034454345703125,
-0.03900146484375,
-0.00439453125,
-0.0380859375,
-0.03497314453125,
0.0245208740234375,
0.0147247314453125,
-0.036773681640625,
0.0257415771484375,
-0.058624267578125,
0.028289794921875,
0.0325927734375,
0.02093505859375,
0.0005693435668945312,
-0.0119476318359375,
0.001140594482421875,
-0.00731658935546875,
-0.04376220703125,
-0.044830322265625,
0.0635986328125,
0.0635986328125,
0.062347412109375,
0.00007534027099609375,
0.0345458984375,
0.03143310546875,
0.01515960693359375,
-0.032440185546875,
0.03961181640625,
-0.020782470703125,
-0.053924560546875,
-0.01531982421875,
-0.01947021484375,
-0.060333251953125,
0.01224517822265625,
-0.025848388671875,
-0.0511474609375,
0.04498291015625,
0.01435089111328125,
-0.01715087890625,
0.040008544921875,
-0.0472412109375,
0.08984375,
-0.0177459716796875,
-0.031646728515625,
-0.0013761520385742188,
-0.049652099609375,
0.04327392578125,
0.0017976760864257812,
0.0162200927734375,
-0.0202484130859375,
0.0031757354736328125,
0.07769775390625,
-0.046875,
0.072021484375,
-0.0271453857421875,
0.01031494140625,
0.03631591796875,
-0.01538848876953125,
0.0269622802734375,
-0.0034160614013671875,
-0.0088348388671875,
0.04962158203125,
0.01059722900390625,
-0.01372528076171875,
-0.017913818359375,
0.0341796875,
-0.06488037109375,
-0.0255126953125,
-0.0391845703125,
-0.0215911865234375,
0.0269927978515625,
0.02484130859375,
0.044189453125,
0.0211334228515625,
-0.0150604248046875,
-0.00067901611328125,
0.06103515625,
-0.02947998046875,
0.0307769775390625,
0.021392822265625,
-0.012786865234375,
-0.0435791015625,
0.0594482421875,
0.004627227783203125,
0.0221099853515625,
0.00550079345703125,
0.01036834716796875,
-0.0233154296875,
-0.032257080078125,
-0.0286865234375,
0.0106353759765625,
-0.056488037109375,
-0.0290069580078125,
-0.06640625,
-0.03662109375,
-0.033203125,
0.0025501251220703125,
-0.035491943359375,
-0.0211639404296875,
-0.045654296875,
-0.00012183189392089844,
0.0304107666015625,
0.0374755859375,
-0.01248931884765625,
0.02752685546875,
-0.04248046875,
0.02410888671875,
0.0250396728515625,
0.040802001953125,
0.0005893707275390625,
-0.04296875,
-0.007568359375,
0.00412750244140625,
-0.054779052734375,
-0.07342529296875,
0.030364990234375,
0.0151824951171875,
0.04901123046875,
0.029937744140625,
0.00791168212890625,
0.037200927734375,
-0.03582763671875,
0.07318115234375,
0.0282440185546875,
-0.07666015625,
0.043182373046875,
-0.0248870849609375,
0.0090179443359375,
0.05072021484375,
0.029296875,
-0.0206756591796875,
-0.0260467529296875,
-0.04827880859375,
-0.057952880859375,
0.057861328125,
0.028533935546875,
0.0016918182373046875,
-0.0035190582275390625,
0.0307159423828125,
-0.0024509429931640625,
-0.0034923553466796875,
-0.07574462890625,
-0.032196044921875,
-0.032501220703125,
-0.0081787109375,
0.01399993896484375,
-0.0263824462890625,
-0.0020847320556640625,
-0.028594970703125,
0.054656982421875,
0.0008196830749511719,
0.04876708984375,
0.030059814453125,
-0.007068634033203125,
0.0017404556274414062,
0.00318145751953125,
0.048370361328125,
0.0165557861328125,
-0.035186767578125,
-0.004306793212890625,
0.018524169921875,
-0.0584716796875,
0.0134429931640625,
0.00228118896484375,
-0.038177490234375,
0.0005159378051757812,
0.003993988037109375,
0.07049560546875,
-0.0036220550537109375,
-0.0374755859375,
0.06982421875,
-0.00930023193359375,
-0.0274200439453125,
-0.029266357421875,
0.014312744140625,
-0.00969696044921875,
0.01097869873046875,
0.00209808349609375,
0.0289306640625,
0.031585693359375,
-0.033966064453125,
0.0157470703125,
0.0418701171875,
-0.0389404296875,
-0.0146026611328125,
0.07940673828125,
0.02947998046875,
-0.0287322998046875,
0.031951904296875,
-0.0208282470703125,
-0.05474853515625,
0.0628662109375,
0.036865234375,
0.064697265625,
-0.01512908935546875,
0.009307861328125,
0.0582275390625,
0.0235137939453125,
-0.02496337890625,
0.0024318695068359375,
0.00556182861328125,
-0.049346923828125,
-0.0210113525390625,
-0.038543701171875,
-0.02923583984375,
0.006511688232421875,
-0.05926513671875,
0.04119873046875,
-0.043304443359375,
-0.03863525390625,
-0.0026073455810546875,
-0.019134521484375,
-0.06304931640625,
0.020782470703125,
0.01824951171875,
0.090576171875,
-0.07843017578125,
0.0606689453125,
0.028839111328125,
-0.044036865234375,
-0.048553466796875,
-0.014862060546875,
-0.0062255859375,
-0.038238525390625,
0.0447998046875,
0.031341552734375,
-0.00799560546875,
-0.0284576416015625,
-0.05792236328125,
-0.06878662109375,
0.0902099609375,
0.031494140625,
-0.044525146484375,
0.0016279220581054688,
-0.0139923095703125,
0.03582763671875,
-0.0306243896484375,
0.037567138671875,
0.03173828125,
0.020263671875,
0.006786346435546875,
-0.075927734375,
0.0091705322265625,
-0.0284576416015625,
0.01143646240234375,
0.006320953369140625,
-0.0760498046875,
0.067138671875,
-0.0201263427734375,
-0.0268096923828125,
0.002044677734375,
0.0474853515625,
0.01395416259765625,
0.0302734375,
0.04779052734375,
0.05718994140625,
0.052642822265625,
0.002147674560546875,
0.075439453125,
-0.01568603515625,
0.040557861328125,
0.077880859375,
-0.005107879638671875,
0.06878662109375,
0.0192718505859375,
-0.0214080810546875,
0.046356201171875,
0.045867919921875,
-0.029754638671875,
0.054962158203125,
-0.01031494140625,
-0.0006351470947265625,
-0.0159149169921875,
-0.01213836669921875,
-0.040863037109375,
0.0254364013671875,
0.0169219970703125,
-0.046051025390625,
0.0019483566284179688,
0.01116180419921875,
-0.005207061767578125,
-0.005035400390625,
-0.021392822265625,
0.046661376953125,
-0.008148193359375,
-0.02923583984375,
0.026275634765625,
0.014068603515625,
0.0738525390625,
-0.02838134765625,
-0.0103912353515625,
0.0115509033203125,
0.0165252685546875,
-0.01358795166015625,
-0.07281494140625,
0.031158447265625,
0.0023822784423828125,
-0.0255126953125,
-0.00304412841796875,
0.057403564453125,
-0.028076171875,
-0.05224609375,
0.0306396484375,
-0.00460052490234375,
0.0181427001953125,
0.009521484375,
-0.07293701171875,
0.023101806640625,
0.007457733154296875,
-0.0011425018310546875,
0.0096435546875,
-0.003170013427734375,
-0.00185394287109375,
0.051116943359375,
0.0335693359375,
-0.00820159912109375,
0.005031585693359375,
-0.01335906982421875,
0.06439208984375,
-0.041717529296875,
-0.041961669921875,
-0.053070068359375,
0.046356201171875,
-0.0252532958984375,
-0.0241241455078125,
0.04644775390625,
0.04962158203125,
0.06683349609375,
-0.00453948974609375,
0.056640625,
-0.0041656494140625,
0.03240966796875,
-0.032806396484375,
0.056182861328125,
-0.039031982421875,
-0.0016345977783203125,
-0.041961669921875,
-0.06732177734375,
-0.0092620849609375,
0.0618896484375,
-0.0111846923828125,
0.00653076171875,
0.0345458984375,
0.06298828125,
-0.01108551025390625,
0.0013151168823242188,
0.007686614990234375,
0.0007171630859375,
0.0283050537109375,
0.0313720703125,
0.037933349609375,
-0.05792236328125,
0.039825439453125,
-0.0452880859375,
-0.023834228515625,
-0.0086822509765625,
-0.07196044921875,
-0.0799560546875,
-0.03179931640625,
-0.060516357421875,
-0.040496826171875,
-0.005100250244140625,
0.04449462890625,
0.0716552734375,
-0.055633544921875,
-0.0003273487091064453,
0.0010137557983398438,
0.0071258544921875,
-0.0038242340087890625,
-0.018218994140625,
0.0355224609375,
0.0117645263671875,
-0.050506591796875,
-0.0178375244140625,
0.00988006591796875,
0.0278472900390625,
-0.0240325927734375,
-0.0156402587890625,
-0.00653076171875,
-0.001407623291015625,
0.03497314453125,
0.0286102294921875,
-0.05352783203125,
-0.0192413330078125,
-0.00995635986328125,
-0.01407623291015625,
0.0092926025390625,
0.0341796875,
-0.03839111328125,
0.030792236328125,
0.03546142578125,
0.0283050537109375,
0.05987548828125,
-0.0035381317138671875,
0.0217437744140625,
-0.0254058837890625,
0.026275634765625,
0.003765106201171875,
0.04327392578125,
0.0259246826171875,
-0.041473388671875,
0.04376220703125,
0.029937744140625,
-0.0570068359375,
-0.06591796875,
0.0005517005920410156,
-0.09344482421875,
-0.0121002197265625,
0.08575439453125,
-0.01531982421875,
-0.042877197265625,
0.00159454345703125,
-0.02520751953125,
0.017425537109375,
-0.03533935546875,
0.046966552734375,
0.03863525390625,
0.00803375244140625,
-0.03582763671875,
-0.031280517578125,
0.03717041015625,
0.0020599365234375,
-0.059417724609375,
-0.019805908203125,
0.0308685302734375,
0.052947998046875,
0.024810791015625,
0.05072021484375,
-0.0267791748046875,
0.0262298583984375,
0.01001739501953125,
0.035400390625,
-0.00428009033203125,
-0.018585205078125,
-0.036376953125,
0.01467132568359375,
-0.022308349609375,
-0.026702880859375
]
] |
mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis | 2023-03-16T20:03:13.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"financial",
"stocks",
"sentiment",
"dataset:financial_phrasebank",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | mrm8488 | null | null | mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis | 109 | 4,468,850 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- generated_from_trainer
- financial
- stocks
- sentiment
widget:
- text: "Operating profit totaled EUR 9.4 mn , down from EUR 11.7 mn in 2004 ."
datasets:
- financial_phrasebank
metrics:
- accuracy
model-index:
- name: distilRoberta-financial-sentiment
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: financial_phrasebank
type: financial_phrasebank
args: sentences_allagree
metrics:
- name: Accuracy
type: accuracy
value: 0.9823008849557522
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilRoberta-financial-sentiment
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the financial_phrasebank dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1116
- Accuracy: 0.9823
## Model description
More information needed
## Intended uses & limitations
More information needed
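As a minimal inference sketch (assuming the standard `transformers` pipeline API; the example sentence is the widget text above, and the exact label strings come from the checkpoint's config):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint named in this model card.
classifier = pipeline(
    "text-classification",
    model="mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis",
)

result = classifier("Operating profit totaled EUR 9.4 mn , down from EUR 11.7 mn in 2004 .")
print(result)  # [{'label': ..., 'score': ...}]
```

Each prediction is a list with one dict per input, carrying the predicted label and its softmax score.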
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 255 | 0.1670 | 0.9646 |
| 0.209 | 2.0 | 510 | 0.2290 | 0.9558 |
| 0.209 | 3.0 | 765 | 0.2044 | 0.9558 |
| 0.0326 | 4.0 | 1020 | 0.1116 | 0.9823 |
| 0.0326 | 5.0 | 1275 | 0.1127 | 0.9779 |
### Framework versions
- Transformers 4.10.2
- Pytorch 1.9.0+cu102
- Datasets 1.12.1
- Tokenizers 0.10.3
| 2,044 | [
[
-0.0296173095703125,
-0.043365478515625,
-0.00102996826171875,
0.028900146484375,
-0.0255279541015625,
-0.006290435791015625,
-0.006916046142578125,
-0.0010595321655273438,
0.006244659423828125,
0.01160430908203125,
-0.047821044921875,
-0.0531005859375,
-0.0589599609375,
-0.0106201171875,
-0.0229644775390625,
0.10638427734375,
0.015228271484375,
0.03564453125,
0.0009245872497558594,
0.0058746337890625,
-0.0224761962890625,
-0.048553466796875,
-0.0654296875,
-0.05352783203125,
0.0235137939453125,
0.006252288818359375,
0.05621337890625,
0.03729248046875,
0.045623779296875,
0.018890380859375,
-0.03790283203125,
-0.025604248046875,
-0.045684814453125,
-0.033843994140625,
0.00029850006103515625,
-0.040313720703125,
-0.048126220703125,
0.00887298583984375,
0.0299835205078125,
0.0377197265625,
-0.025421142578125,
0.042388916015625,
0.0237884521484375,
0.059417724609375,
-0.03155517578125,
0.03857421875,
-0.0294647216796875,
0.0174102783203125,
-0.0103912353515625,
-0.012420654296875,
-0.0239715576171875,
-0.01479339599609375,
0.0226287841796875,
-0.035430908203125,
0.0285491943359375,
0.0016584396362304688,
0.0941162109375,
0.032623291015625,
-0.029266357421875,
-0.01119232177734375,
-0.048675537109375,
0.04669189453125,
-0.055206298828125,
0.00878143310546875,
0.0265350341796875,
0.0290374755859375,
0.01016998291015625,
-0.0496826171875,
-0.037689208984375,
0.005157470703125,
-0.0049285888671875,
0.025970458984375,
-0.0257568359375,
-0.0018854141235351562,
0.0419921875,
0.04547119140625,
-0.0279541015625,
0.0010004043579101562,
-0.042327880859375,
-0.00888824462890625,
0.042144775390625,
0.025604248046875,
-0.03717041015625,
-0.027801513671875,
-0.043792724609375,
-0.0232086181640625,
-0.0144805908203125,
0.0192718505859375,
0.053070068359375,
0.026702880859375,
-0.0277557373046875,
0.032928466796875,
-0.0185394287109375,
0.04742431640625,
0.0243988037109375,
-0.023956298828125,
0.050628662109375,
0.0038585662841796875,
-0.036865234375,
0.0094451904296875,
0.0625,
0.0496826171875,
0.0128021240234375,
0.0227203369140625,
-0.03240966796875,
-0.0032749176025390625,
0.0209503173828125,
-0.06268310546875,
-0.0281982421875,
0.00556182861328125,
-0.054046630859375,
-0.059234619140625,
0.0195770263671875,
-0.05206298828125,
0.00626373291015625,
-0.0386962890625,
0.02899169921875,
-0.037689208984375,
-0.0207061767578125,
0.0194091796875,
-0.0181884765625,
0.0234222412109375,
0.01155853271484375,
-0.06365966796875,
0.0224761962890625,
0.041595458984375,
0.045654296875,
0.025054931640625,
-0.00792694091796875,
-0.009429931640625,
-0.00833892822265625,
-0.01287841796875,
0.039459228515625,
-0.0005788803100585938,
-0.040435791015625,
-0.013946533203125,
-0.0028743743896484375,
-0.00616455078125,
-0.03179931640625,
0.051971435546875,
-0.0183258056640625,
0.0313720703125,
-0.0134124755859375,
-0.041534423828125,
-0.0234527587890625,
0.0335693359375,
-0.045745849609375,
0.09222412109375,
0.01332855224609375,
-0.07830810546875,
0.040313720703125,
-0.05120849609375,
-0.01165771484375,
-0.01446533203125,
-0.002368927001953125,
-0.054168701171875,
-0.0004322528839111328,
-0.00070953369140625,
0.03558349609375,
-0.022491455078125,
0.03436279296875,
-0.027923583984375,
-0.026580810546875,
0.020904541015625,
-0.04937744140625,
0.066162109375,
0.01580810546875,
-0.04052734375,
-0.0020847320556640625,
-0.088623046875,
-0.004180908203125,
0.01346588134765625,
-0.035430908203125,
-0.017364501953125,
-0.0157470703125,
0.03759765625,
0.0208587646484375,
0.034576416015625,
-0.0401611328125,
0.01317596435546875,
-0.03741455078125,
0.022705078125,
0.06243896484375,
-0.0033931732177734375,
0.00853729248046875,
-0.027008056640625,
0.028076171875,
0.03009033203125,
0.03741455078125,
0.0191802978515625,
-0.021697998046875,
-0.061492919921875,
-0.0206756591796875,
0.01971435546875,
0.041717529296875,
-0.0218048095703125,
0.057586669921875,
-0.01535797119140625,
-0.05242919921875,
-0.01554107666015625,
0.00679779052734375,
0.0277862548828125,
0.0662841796875,
0.0251922607421875,
0.0009436607360839844,
-0.03704833984375,
-0.08489990234375,
0.00870513916015625,
-0.018707275390625,
0.01499176025390625,
0.0009222030639648438,
0.04718017578125,
-0.01016998291015625,
0.06781005859375,
-0.052825927734375,
-0.0176239013671875,
-0.0097198486328125,
0.01329803466796875,
0.058685302734375,
0.040924072265625,
0.05908203125,
-0.052703857421875,
-0.0218963623046875,
-0.013458251953125,
-0.05511474609375,
0.030242919921875,
-0.0131683349609375,
-0.01090240478515625,
-0.0009613037109375,
0.0175018310546875,
-0.036834716796875,
0.050323486328125,
0.0310516357421875,
-0.034423828125,
0.047332763671875,
-0.018310546875,
-0.0175323486328125,
-0.1072998046875,
0.01983642578125,
0.0272979736328125,
-0.0085601806640625,
-0.0171966552734375,
-0.01824951171875,
-0.0006275177001953125,
-0.0045166015625,
-0.02239990234375,
0.0372314453125,
0.0025272369384765625,
0.012664794921875,
-0.01129150390625,
-0.0265350341796875,
0.00829315185546875,
0.05535888671875,
0.0034351348876953125,
0.04290771484375,
0.053924560546875,
-0.03143310546875,
0.03863525390625,
0.03411865234375,
-0.023590087890625,
0.05035400390625,
-0.06884765625,
-0.0028743743896484375,
-0.0039825439453125,
0.00803375244140625,
-0.06781005859375,
-0.004241943359375,
0.02880859375,
-0.033294677734375,
0.0253448486328125,
-0.01084136962890625,
-0.01983642578125,
-0.038909912109375,
-0.01035308837890625,
0.003101348876953125,
0.036376953125,
-0.034393310546875,
0.032958984375,
-0.004619598388671875,
0.004558563232421875,
-0.06695556640625,
-0.060546875,
-0.00994110107421875,
-0.0302581787109375,
-0.025970458984375,
0.00995635986328125,
0.007495880126953125,
-0.0135650634765625,
-0.000476837158203125,
-0.0080413818359375,
-0.010711669921875,
0.0035991668701171875,
0.035400390625,
0.04736328125,
-0.0077362060546875,
0.0014743804931640625,
-0.002689361572265625,
-0.029205322265625,
0.0222015380859375,
-0.00864410400390625,
0.043121337890625,
-0.0296173095703125,
-0.0111236572265625,
-0.055389404296875,
-0.00783538818359375,
0.0303955078125,
-0.007122039794921875,
0.07159423828125,
0.037353515625,
-0.028961181640625,
0.006153106689453125,
-0.0303192138671875,
-0.004383087158203125,
-0.033538818359375,
0.041473388671875,
-0.034820556640625,
-0.0276031494140625,
0.057342529296875,
-0.00847625732421875,
0.0007653236389160156,
0.0748291015625,
0.0423583984375,
-0.004550933837890625,
0.07257080078125,
0.02813720703125,
-0.0298004150390625,
0.0223388671875,
-0.06427001953125,
0.00292205810546875,
-0.03863525390625,
-0.037567138671875,
-0.047576904296875,
-0.029296875,
-0.04473876953125,
0.00893402099609375,
0.0111846923828125,
0.0129852294921875,
-0.051788330078125,
0.02099609375,
-0.034912109375,
0.01617431640625,
0.0504150390625,
0.0238800048828125,
0.0029735565185546875,
0.005184173583984375,
-0.0285797119140625,
-0.0009627342224121094,
-0.05352783203125,
-0.044677734375,
0.091064453125,
0.050201416015625,
0.0711669921875,
-0.0172119140625,
0.06256103515625,
0.01120758056640625,
0.01282501220703125,
-0.05413818359375,
0.0189361572265625,
-0.00620269775390625,
-0.0611572265625,
-0.0125274658203125,
-0.037628173828125,
-0.036224365234375,
-0.00518798828125,
-0.018585205078125,
-0.0255279541015625,
0.02386474609375,
0.0180206298828125,
-0.04669189453125,
0.0278472900390625,
-0.028564453125,
0.0858154296875,
-0.0208892822265625,
-0.0161895751953125,
-0.016387939453125,
-0.03887939453125,
0.011749267578125,
0.0005450248718261719,
0.0012598037719726562,
-0.00022101402282714844,
0.01433563232421875,
0.053802490234375,
-0.044525146484375,
0.06591796875,
-0.0367431640625,
0.004413604736328125,
0.0221099853515625,
-0.021484375,
0.037628173828125,
0.023895263671875,
-0.0179901123046875,
0.0231781005859375,
0.0014352798461914062,
-0.039886474609375,
-0.032379150390625,
0.04052734375,
-0.08343505859375,
-0.01103973388671875,
-0.05474853515625,
-0.02587890625,
-0.0038299560546875,
0.003421783447265625,
0.0400390625,
0.04486083984375,
-0.0185394287109375,
0.0230865478515625,
0.03857421875,
-0.0008335113525390625,
0.021087646484375,
0.009735107421875,
-0.01314544677734375,
-0.052276611328125,
0.055389404296875,
-0.01544952392578125,
0.012725830078125,
0.005725860595703125,
0.01274871826171875,
-0.043548583984375,
-0.0230865478515625,
-0.0364990234375,
0.00984954833984375,
-0.05999755859375,
-0.0253143310546875,
-0.0229034423828125,
-0.0278472900390625,
-0.0270233154296875,
0.000492095947265625,
-0.03826904296875,
-0.0259246826171875,
-0.046142578125,
-0.033599853515625,
0.04437255859375,
0.039520263671875,
0.0015459060668945312,
0.0430908203125,
-0.048980712890625,
-0.00962066650390625,
0.0142059326171875,
0.021820068359375,
0.002948760986328125,
-0.06011962890625,
-0.033660888671875,
0.007678985595703125,
-0.031463623046875,
-0.047088623046875,
0.0411376953125,
0.00652313232421875,
0.03680419921875,
0.06207275390625,
-0.0087127685546875,
0.0699462890625,
-0.003170013427734375,
0.050628662109375,
0.034759521484375,
-0.055877685546875,
0.036712646484375,
-0.017303466796875,
0.00923919677734375,
0.057281494140625,
0.0447998046875,
-0.0233306884765625,
-0.01129150390625,
-0.075439453125,
-0.059844970703125,
0.0538330078125,
0.014129638671875,
0.01454925537109375,
-0.00492095947265625,
0.030242919921875,
-0.0009098052978515625,
0.037200927734375,
-0.06756591796875,
-0.0467529296875,
-0.0455322265625,
-0.02587890625,
-0.00290679931640625,
-0.0236968994140625,
-0.01039886474609375,
-0.03814697265625,
0.07330322265625,
-0.0011043548583984375,
0.01264190673828125,
0.005886077880859375,
0.02191162109375,
0.0035686492919921875,
0.0015649795532226562,
0.031494140625,
0.048797607421875,
-0.04473876953125,
-0.0070343017578125,
0.01175689697265625,
-0.035400390625,
0.00799560546875,
0.023834228515625,
-0.0106201171875,
0.012969970703125,
0.0160064697265625,
0.08770751953125,
0.01441192626953125,
-0.0203399658203125,
0.04315185546875,
-0.01373291015625,
-0.038360595703125,
-0.05291748046875,
-0.0087738037109375,
-0.002002716064453125,
0.02447509765625,
0.03875732421875,
0.04388427734375,
0.009796142578125,
-0.020355224609375,
0.006931304931640625,
0.0126953125,
-0.04925537109375,
-0.016387939453125,
0.05670166015625,
0.0085601806640625,
-0.001285552978515625,
0.060638427734375,
-0.00992584228515625,
-0.02490234375,
0.05169677734375,
0.0230865478515625,
0.073486328125,
-0.007633209228515625,
0.0123291015625,
0.057891845703125,
0.006275177001953125,
-0.020050048828125,
0.04168701171875,
0.0195159912109375,
-0.04254150390625,
-0.0239410400390625,
-0.075439453125,
-0.01325225830078125,
0.0295562744140625,
-0.0947265625,
0.032012939453125,
-0.04742431640625,
-0.050994873046875,
0.010711669921875,
-0.0028896331787109375,
-0.0635986328125,
0.042755126953125,
0.010955810546875,
0.09002685546875,
-0.06591796875,
0.052886962890625,
0.048370361328125,
-0.041412353515625,
-0.076904296875,
-0.022369384765625,
-0.00409698486328125,
-0.053558349609375,
0.0654296875,
0.0108642578125,
0.0172882080078125,
0.0030269622802734375,
-0.028076171875,
-0.050384521484375,
0.076171875,
0.01439666748046875,
-0.06817626953125,
-0.01507568359375,
0.0254974365234375,
0.04705810546875,
-0.0137176513671875,
0.0260009765625,
0.023712158203125,
0.022430419921875,
0.015045166015625,
-0.062408447265625,
-0.01983642578125,
-0.0288543701171875,
0.0051116943359375,
0.0105743408203125,
-0.054656982421875,
0.07855224609375,
0.0091094970703125,
0.0316162109375,
-0.00069427490234375,
0.048980712890625,
0.0122528076171875,
0.02679443359375,
0.041107177734375,
0.07501220703125,
0.040557861328125,
-0.0205535888671875,
0.07177734375,
-0.04931640625,
0.0670166015625,
0.0804443359375,
-0.001972198486328125,
0.052978515625,
0.031768798828125,
-0.0302276611328125,
0.039398193359375,
0.06378173828125,
-0.020782470703125,
0.032867431640625,
0.0127105712890625,
-0.01047515869140625,
-0.02252197265625,
0.0217437744140625,
-0.034423828125,
0.037200927734375,
0.004550933837890625,
-0.052642822265625,
-0.0205535888671875,
0.0004341602325439453,
0.0024318695068359375,
-0.00899505615234375,
-0.029388427734375,
0.048370361328125,
0.00023698806762695312,
-0.00893402099609375,
0.043426513671875,
0.003818511962890625,
0.04248046875,
-0.045745849609375,
0.00447845458984375,
-0.012054443359375,
0.034271240234375,
-0.0135650634765625,
-0.044097900390625,
0.023956298828125,
0.006679534912109375,
-0.0221099853515625,
-0.0088653564453125,
0.0301666259765625,
-0.005496978759765625,
-0.0762939453125,
0.0147247314453125,
0.024810791015625,
0.0015726089477539062,
-0.00975799560546875,
-0.08355712890625,
-0.020294189453125,
0.00839996337890625,
-0.047119140625,
0.00609588623046875,
0.032989501953125,
0.01477813720703125,
0.036102294921875,
0.04736328125,
0.0020732879638671875,
-0.006755828857421875,
0.019134521484375,
0.07952880859375,
-0.054168701171875,
-0.054412841796875,
-0.069091796875,
0.039825439453125,
-0.025543212890625,
-0.053558349609375,
0.04931640625,
0.0806884765625,
0.0712890625,
-0.0203857421875,
0.050201416015625,
0.006443023681640625,
0.03411865234375,
-0.030487060546875,
0.05511474609375,
-0.02716064453125,
-0.0039043426513671875,
-0.0260162353515625,
-0.0672607421875,
0.0009675025939941406,
0.06524658203125,
-0.033203125,
-0.002124786376953125,
0.024688720703125,
0.05194091796875,
0.00021529197692871094,
0.0096588134765625,
0.007904052734375,
0.005336761474609375,
0.0023059844970703125,
0.0206298828125,
0.045318603515625,
-0.056182861328125,
0.03680419921875,
-0.0489501953125,
-0.01136016845703125,
-0.00753021240234375,
-0.057708740234375,
-0.070556640625,
-0.0209503173828125,
-0.034637451171875,
-0.039215087890625,
-0.01023101806640625,
0.0767822265625,
0.043304443359375,
-0.05511474609375,
-0.03277587890625,
0.0021266937255859375,
-0.03857421875,
-0.019866943359375,
-0.0198822021484375,
0.029510498046875,
-0.01470184326171875,
-0.053375244140625,
-0.01430511474609375,
-0.0017995834350585938,
0.0276336669921875,
-0.0287628173828125,
-0.0145721435546875,
-0.00504302978515625,
-0.01580810546875,
0.022369384765625,
-0.00986480712890625,
-0.0255279541015625,
-0.01113128662109375,
0.00238800048828125,
-0.01019287109375,
0.01216888427734375,
0.019287109375,
-0.0294036865234375,
0.01446533203125,
0.019317626953125,
0.021484375,
0.052886962890625,
0.0059356689453125,
0.01428985595703125,
-0.052734375,
0.040191650390625,
0.01364898681640625,
0.04022216796875,
0.00833892822265625,
-0.0312042236328125,
0.024993896484375,
0.032989501953125,
-0.035552978515625,
-0.044677734375,
-0.021209716796875,
-0.0814208984375,
0.0027828216552734375,
0.0718994140625,
-0.007785797119140625,
-0.039031982421875,
0.0250091552734375,
-0.02685546875,
0.017913818359375,
-0.034942626953125,
0.0557861328125,
0.0582275390625,
0.003292083740234375,
0.0184783935546875,
-0.034942626953125,
0.037078857421875,
0.01151275634765625,
-0.032806396484375,
-0.0113983154296875,
0.037353515625,
0.03436279296875,
0.0146636962890625,
0.0295562744140625,
-0.00809478759765625,
0.01953125,
0.0191802978515625,
0.026214599609375,
-0.019775390625,
-0.01198577880859375,
-0.0291595458984375,
0.00722503662109375,
0.0092315673828125,
-0.03533935546875
]
] |
lxyuan/distilbert-base-multilingual-cased-sentiments-student | 2023-06-24T04:09:07.000Z | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"text-classification",
"sentiment-analysis",
"zero-shot-distillation",
"distillation",
"zero-shot-classification",
"debarta-v3",
"en",
"ar",
"de",
"es",
"fr",
"ja",
"zh",
"id",
"hi",
"it",
"ms",
"pt",
"dataset:tyqiangz/multilingual-sentiments",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | lxyuan | null | null | lxyuan/distilbert-base-multilingual-cased-sentiments-student | 38 | 4,349,010 | transformers | 2023-05-05T16:22:55 | ---
license: apache-2.0
tags:
- sentiment-analysis
- text-classification
- zero-shot-distillation
- distillation
- zero-shot-classification
- debarta-v3
model-index:
- name: distilbert-base-multilingual-cased-sentiments-student
results: []
datasets:
- tyqiangz/multilingual-sentiments
language:
- en
- ar
- de
- es
- fr
- ja
- zh
- id
- hi
- it
- ms
- pt
---
# distilbert-base-multilingual-cased-sentiments-student
This model is distilled from the zero-shot classification pipeline on the Multilingual Sentiment
dataset using this [script](https://github.com/huggingface/transformers/tree/main/examples/research_projects/zero-shot-distillation).
In reality, the multilingual-sentiments dataset is fully annotated, but we ignore the annotations
here for the sake of the example.
- Teacher model: `MoritzLaurer/mDeBERTa-v3-base-mnli-xnli`
- Teacher hypothesis template: `"The sentiment of this text is {}."`
- Student model: `distilbert-base-multilingual-cased`
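The teacher labels each text by pairing it with one NLI hypothesis per class, built from the hypothesis template. A minimal sketch of that expansion (plain Python for illustration; the class names are the three sentiment labels this dataset uses):

```python
def build_hypotheses(template: str, class_names: list) -> list:
    """Expand a zero-shot hypothesis template into one NLI hypothesis per class.

    The teacher NLI model scores each (text, hypothesis) pair for entailment,
    and the entailment probabilities are normalized over the classes to
    produce the soft labels the student is trained on.
    """
    return [template.format(name) for name in class_names]

hypotheses = build_hypotheses(
    "The sentiment of this text is {}.",
    ["positive", "neutral", "negative"],
)
# -> ["The sentiment of this text is positive.",
#     "The sentiment of this text is neutral.",
#     "The sentiment of this text is negative."]
```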
## Inference example
```python
from transformers import pipeline
distilled_student_sentiment_classifier = pipeline(
model="lxyuan/distilbert-base-multilingual-cased-sentiments-student",
return_all_scores=True
)
# english
distilled_student_sentiment_classifier("I love this movie and i would watch it again and again!")
>> [[{'label': 'positive', 'score': 0.9731044769287109},
{'label': 'neutral', 'score': 0.016910076141357422},
{'label': 'negative', 'score': 0.009985478594899178}]]
# malay
distilled_student_sentiment_classifier("Saya suka filem ini dan saya akan menontonnya lagi dan lagi!")
>> [[{'label': 'positive', 'score': 0.9760093688964844},
{'label': 'neutral', 'score': 0.01804516464471817},
{'label': 'negative', 'score': 0.005945465061813593}]]
# japanese
distilled_student_sentiment_classifier("็งใฏใใฎๆ ็ปใๅคงๅฅฝใใงใไฝๅบฆใ่ฆใพใ๏ผ")
>> [[{'label': 'positive', 'score': 0.9342429041862488},
{'label': 'neutral', 'score': 0.040193185210227966},
{'label': 'negative', 'score': 0.025563929229974747}]]
```
## Training procedure
Notebook link: [here](https://github.com/LxYuan0420/nlp/blob/main/notebooks/Distilling_Zero_Shot_multilingual_distilbert_sentiments_student.ipynb)
### Training hyperparameters
Results can be reproduced with the following command:
```bash
python transformers/examples/research_projects/zero-shot-distillation/distill_classifier.py \
--data_file ./multilingual-sentiments/train_unlabeled.txt \
--class_names_file ./multilingual-sentiments/class_names.txt \
--hypothesis_template "The sentiment of this text is {}." \
--teacher_name_or_path MoritzLaurer/mDeBERTa-v3-base-mnli-xnli \
--teacher_batch_size 32 \
--student_name_or_path distilbert-base-multilingual-cased \
--output_dir ./distilbert-base-multilingual-cased-sentiments-student \
--per_device_train_batch_size 16 \
--fp16
```
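The script trains the student with a soft cross-entropy against the teacher's predicted distribution. A rough sketch of that objective for a single example (plain Python for illustration; the actual script computes this batched in PyTorch):

```python
import math

def soft_cross_entropy(student_logits, teacher_probs):
    """Distillation loss for one example: cross-entropy between the
    teacher's soft label distribution and the student's softmax."""
    m = max(student_logits)  # subtract the max to stabilize the softmax
    exps = [math.exp(x - m) for x in student_logits]
    z = sum(exps)
    student_log_probs = [math.log(e / z) for e in exps]
    return -sum(p * lp for p, lp in zip(teacher_probs, student_log_probs))

# A confident, correct student incurs near-zero loss; a uniform student
# over three classes pays log(3):
loss_uniform = soft_cross_entropy([0.0, 0.0, 0.0], [1.0, 0.0, 0.0])
# -> math.log(3), about 1.0986
```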
If you are training this model on Colab, make the following changes to the script to avoid out-of-memory errors:
```python
###### modify L78 to disable fast tokenizer
default=False,
###### update dataset map part at L313
dataset = dataset.map(tokenizer, input_columns="text", fn_kwargs={"padding": "max_length", "truncation": True, "max_length": 512})
###### add following lines to L213
del model
print(f"Manually deleted Teacher model, free some memory for student model.")
###### add following lines to L337
trainer.push_to_hub()
tokenizer.push_to_hub("distilbert-base-multilingual-cased-sentiments-student")
```
### Training log
```bash
Training completed. Do not forget to share your model on huggingface.co/models =)
{'train_runtime': 2009.8864, 'train_samples_per_second': 73.0, 'train_steps_per_second': 4.563, 'train_loss': 0.6473459283913797, 'epoch': 1.0}
100%|โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 9171/9171 [33:29<00:00, 4.56it/s]
[INFO|trainer.py:762] 2023-05-06 10:56:18,555 >> The following columns in the evaluation set don't have a corresponding argument in `DistilBertForSequenceClassification.forward` and have been ignored: text. If text are not expected by `DistilBertForSequenceClassification.forward`, you can safely ignore this message.
[INFO|trainer.py:3129] 2023-05-06 10:56:18,557 >> ***** Running Evaluation *****
[INFO|trainer.py:3131] 2023-05-06 10:56:18,557 >> Num examples = 146721
[INFO|trainer.py:3134] 2023-05-06 10:56:18,557 >> Batch size = 128
100%|โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 1147/1147 [08:59<00:00, 2.13it/s]
05/06/2023 11:05:18 - INFO - __main__ - Agreement of student and teacher predictions: 88.29%
[INFO|trainer.py:2868] 2023-05-06 11:05:18,251 >> Saving model checkpoint to ./distilbert-base-multilingual-cased-sentiments-student
[INFO|configuration_utils.py:457] 2023-05-06 11:05:18,251 >> Configuration saved in ./distilbert-base-multilingual-cased-sentiments-student/config.json
[INFO|modeling_utils.py:1847] 2023-05-06 11:05:18,905 >> Model weights saved in ./distilbert-base-multilingual-cased-sentiments-student/pytorch_model.bin
[INFO|tokenization_utils_base.py:2171] 2023-05-06 11:05:18,905 >> tokenizer config file saved in ./distilbert-base-multilingual-cased-sentiments-student/tokenizer_config.json
[INFO|tokenization_utils_base.py:2178] 2023-05-06 11:05:18,905 >> Special tokens file saved in ./distilbert-base-multilingual-cased-sentiments-student/special_tokens_map.json
```
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3 | 5,577 | [
[
-0.0288848876953125,
-0.052886962890625,
0.0166473388671875,
0.0264892578125,
-0.01430511474609375,
0.0003046989440917969,
-0.0217437744140625,
0.0030193328857421875,
0.00989532470703125,
0.00980377197265625,
-0.033935546875,
-0.055572509765625,
-0.052398681640625,
0.00443267822265625,
-0.00743865966796875,
0.09503173828125,
-0.01666259765625,
0.0240478515625,
-0.014404296875,
-0.019317626953125,
-0.027496337890625,
-0.06280517578125,
-0.055999755859375,
-0.0235443115234375,
0.0158843994140625,
0.0207061767578125,
0.0212860107421875,
0.026031494140625,
0.0274810791015625,
0.02789306640625,
-0.026611328125,
-0.0164642333984375,
-0.0142974853515625,
-0.005855560302734375,
0.0028209686279296875,
-0.051513671875,
-0.04345703125,
0.01273345947265625,
0.055999755859375,
0.036529541015625,
-0.0007905960083007812,
0.019927978515625,
0.001667022705078125,
0.035736083984375,
-0.0308074951171875,
0.034576416015625,
-0.029632568359375,
0.00853729248046875,
-0.00954437255859375,
0.0016298294067382812,
-0.0245819091796875,
-0.010711669921875,
0.0037078857421875,
-0.0228424072265625,
0.032958984375,
-0.0194244384765625,
0.088134765625,
0.007526397705078125,
-0.018463134765625,
-0.00803375244140625,
-0.0222930908203125,
0.065185546875,
-0.067626953125,
0.00423431396484375,
0.0237274169921875,
0.0035552978515625,
-0.01306915283203125,
-0.04290771484375,
-0.060455322265625,
-0.007656097412109375,
-0.0145721435546875,
0.027801513671875,
-0.007568359375,
-0.0005049705505371094,
0.044586181640625,
0.03802490234375,
-0.0254669189453125,
-0.0088653564453125,
-0.039581298828125,
-0.0256195068359375,
0.042694091796875,
0.019805908203125,
0.006847381591796875,
-0.0341796875,
-0.0171661376953125,
-0.033843994140625,
-0.005931854248046875,
0.0242767333984375,
0.04193115234375,
0.03179931640625,
-0.03155517578125,
0.040863037109375,
-0.0189666748046875,
0.040618896484375,
0.00861358642578125,
-0.0265350341796875,
0.0584716796875,
-0.01702880859375,
-0.017181396484375,
0.020355224609375,
0.0877685546875,
0.035614013671875,
0.018890380859375,
0.0267333984375,
-0.00649261474609375,
0.014404296875,
-0.0179595947265625,
-0.0723876953125,
-0.01617431640625,
0.033782958984375,
-0.0200653076171875,
-0.0258026123046875,
0.0014781951904296875,
-0.06719970703125,
-0.0017194747924804688,
-0.0114593505859375,
0.0279083251953125,
-0.035675048828125,
-0.029266357421875,
0.0128326416015625,
-0.01800537109375,
0.004817962646484375,
-0.0017652511596679688,
-0.06927490234375,
0.0053863525390625,
0.0229644775390625,
0.056854248046875,
0.0098724365234375,
-0.049560546875,
-0.016021728515625,
-0.0172576904296875,
-0.00798797607421875,
0.021728515625,
-0.0114898681640625,
-0.032958984375,
-0.004825592041015625,
0.0212860107421875,
-0.022796630859375,
-0.039031982421875,
0.050140380859375,
-0.00670623779296875,
0.04095458984375,
-0.0159149169921875,
-0.034454345703125,
-0.0168914794921875,
0.031097412109375,
-0.03033447265625,
0.10357666015625,
0.013946533203125,
-0.07598876953125,
0.0230712890625,
-0.03839111328125,
-0.026092529296875,
-0.0163421630859375,
0.00897216796875,
-0.050323486328125,
-0.00820159912109375,
0.01477813720703125,
0.049102783203125,
0.0027866363525390625,
0.036865234375,
-0.0183868408203125,
-0.0275115966796875,
0.014556884765625,
-0.029296875,
0.09161376953125,
0.00901031494140625,
-0.0308380126953125,
-0.00463104248046875,
-0.07476806640625,
-0.0080108642578125,
0.023345947265625,
-0.05157470703125,
-0.0240936279296875,
-0.0291595458984375,
0.01219940185546875,
0.0172271728515625,
0.027618408203125,
-0.041015625,
0.031341552734375,
-0.02606201171875,
0.023712158203125,
0.04888916015625,
-0.00238800048828125,
0.0181427001953125,
-0.00782012939453125,
0.0261383056640625,
0.04669189453125,
0.006572723388671875,
-0.01708984375,
-0.02557373046875,
-0.09136962890625,
-0.0162200927734375,
0.01366424560546875,
0.0487060546875,
-0.038482666015625,
0.0401611328125,
-0.0175323486328125,
-0.04644775390625,
-0.03759765625,
0.0029430389404296875,
0.0191192626953125,
0.06317138671875,
0.03680419921875,
-0.0133819580078125,
-0.050994873046875,
-0.059112548828125,
-0.01180267333984375,
-0.030731201171875,
0.0226898193359375,
0.0029773712158203125,
0.058135986328125,
-0.023406982421875,
0.058349609375,
-0.049285888671875,
-0.0215301513671875,
-0.03076171875,
0.0160675048828125,
0.04913330078125,
0.048126220703125,
0.055145263671875,
-0.0377197265625,
-0.04510498046875,
-0.017578125,
-0.062225341796875,
0.0032444000244140625,
-0.0244140625,
-0.00872802734375,
0.0267333984375,
0.01617431640625,
-0.04840087890625,
0.0288238525390625,
0.01239013671875,
-0.0288543701171875,
0.04193115234375,
-0.0217132568359375,
0.0102386474609375,
-0.1063232421875,
-0.0018815994262695312,
0.0214691162109375,
0.004520416259765625,
-0.036712646484375,
-0.01128387451171875,
0.0024623870849609375,
0.00264739990234375,
-0.0421142578125,
0.04754638671875,
-0.0218963623046875,
0.0268096923828125,
-0.0024852752685546875,
0.0008215904235839844,
0.0128173828125,
0.055145263671875,
0.01457977294921875,
0.034088134765625,
0.07525634765625,
-0.03912353515625,
0.0352783203125,
0.0293121337890625,
-0.0113677978515625,
0.0435791015625,
-0.034942626953125,
-0.00830841064453125,
-0.01444244384765625,
0.01171875,
-0.0745849609375,
-0.0121002197265625,
0.038421630859375,
-0.041351318359375,
0.03631591796875,
-0.0260467529296875,
-0.0292816162109375,
-0.039031982421875,
-0.016510009765625,
0.01268768310546875,
0.037994384765625,
-0.03704833984375,
0.038726806640625,
0.00707244873046875,
0.0008616447448730469,
-0.058563232421875,
-0.05999755859375,
-0.00952911376953125,
-0.031097412109375,
-0.0224151611328125,
0.004734039306640625,
-0.003437042236328125,
-0.0101470947265625,
-0.0121002197265625,
0.0034942626953125,
-0.006999969482421875,
-0.002651214599609375,
0.01641845703125,
0.0439453125,
-0.00598907470703125,
0.0017232894897460938,
0.006313323974609375,
-0.0097503662109375,
0.0201263427734375,
-0.00042247772216796875,
0.05181884765625,
-0.0244598388671875,
-0.0079498291015625,
-0.043121337890625,
0.00522613525390625,
0.047698974609375,
-0.01160430908203125,
0.07208251953125,
0.0635986328125,
-0.0220489501953125,
0.0146484375,
-0.034210205078125,
-0.01103973388671875,
-0.033111572265625,
0.054534912109375,
-0.0240478515625,
-0.0303802490234375,
0.046783447265625,
0.0011835098266601562,
-0.001522064208984375,
0.05548095703125,
0.0458984375,
-0.0009522438049316406,
0.075439453125,
0.028656005859375,
-0.028717041015625,
0.03692626953125,
-0.0582275390625,
0.0056610107421875,
-0.057952880859375,
-0.033660888671875,
-0.04241943359375,
-0.01361846923828125,
-0.0447998046875,
-0.0038547515869140625,
0.0152587890625,
0.0245819091796875,
-0.03143310546875,
0.024749755859375,
-0.043487548828125,
0.0149383544921875,
0.0394287109375,
-0.00160980224609375,
0.000023663043975830078,
0.00972747802734375,
-0.032470703125,
-0.0084991455078125,
-0.043548583984375,
-0.03228759765625,
0.0849609375,
0.035675048828125,
0.038787841796875,
-0.0183868408203125,
0.0665283203125,
-0.014404296875,
0.00971221923828125,
-0.06182861328125,
0.042572021484375,
0.003925323486328125,
-0.048797607421875,
-0.0031337738037109375,
-0.038726806640625,
-0.04730224609375,
0.0189208984375,
-0.007556915283203125,
-0.060302734375,
0.007415771484375,
-0.0002779960632324219,
-0.01122283935546875,
0.033966064453125,
-0.052978515625,
0.076904296875,
-0.0204620361328125,
-0.03692626953125,
0.0001289844512939453,
-0.03887939453125,
0.01611328125,
0.005096435546875,
0.00909423828125,
-0.007648468017578125,
0.026214599609375,
0.080078125,
-0.040863037109375,
0.06536865234375,
-0.042083740234375,
0.01561737060546875,
0.0340576171875,
-0.017669677734375,
0.0262603759765625,
0.0079193115234375,
-0.01580810546875,
0.036041259765625,
0.01432037353515625,
-0.03240966796875,
-0.0262908935546875,
0.057220458984375,
-0.07720947265625,
-0.039886474609375,
-0.06390380859375,
-0.0341796875,
-0.00458526611328125,
0.0195770263671875,
0.0293121337890625,
0.0216827392578125,
0.00356292724609375,
0.0021991729736328125,
0.0338134765625,
-0.023712158203125,
0.04632568359375,
0.033477783203125,
-0.013885498046875,
-0.0290069580078125,
0.06634521484375,
0.0108642578125,
0.007480621337890625,
0.00897979736328125,
0.020965576171875,
-0.036865234375,
-0.02496337890625,
-0.046630859375,
0.022186279296875,
-0.0670166015625,
-0.007293701171875,
-0.0596923828125,
-0.01505279541015625,
-0.042633056640625,
0.005886077880859375,
-0.03472900390625,
-0.0262603759765625,
-0.0360107421875,
-0.02252197265625,
0.048095703125,
0.02130126953125,
-0.0024280548095703125,
0.031341552734375,
-0.055633544921875,
-0.0005435943603515625,
-0.00180816650390625,
0.0172119140625,
0.007160186767578125,
-0.06158447265625,
-0.0272064208984375,
0.015655517578125,
-0.039276123046875,
-0.056243896484375,
0.044586181640625,
0.0044403076171875,
0.038177490234375,
0.02764892578125,
-0.0067596435546875,
0.0633544921875,
-0.006946563720703125,
0.06658935546875,
0.02288818359375,
-0.06890869140625,
0.050048828125,
-0.0145263671875,
0.024810791015625,
0.050872802734375,
0.051177978515625,
-0.048187255859375,
-0.02008056640625,
-0.046539306640625,
-0.07427978515625,
0.07220458984375,
-0.0005893707275390625,
0.0281982421875,
-0.00848388671875,
0.027496337890625,
-0.005825042724609375,
0.01377105712890625,
-0.072265625,
-0.047119140625,
-0.036773681640625,
-0.026397705078125,
-0.00923919677734375,
-0.0086517333984375,
-0.004199981689453125,
-0.060211181640625,
0.07208251953125,
0.0005521774291992188,
0.0011749267578125,
0.016021728515625,
0.0074005126953125,
0.006069183349609375,
0.016510009765625,
0.0179290771484375,
0.018341064453125,
-0.028472900390625,
-0.0080718994140625,
0.03338623046875,
-0.058746337890625,
0.029998779296875,
0.00795745849609375,
-0.01189422607421875,
0.0153045654296875,
0.01462554931640625,
0.07623291015625,
-0.0101165771484375,
-0.028717041015625,
0.04364013671875,
-0.0149383544921875,
-0.02105712890625,
-0.040191650390625,
-0.0000030994415283203125,
0.00524139404296875,
0.007266998291015625,
0.0188751220703125,
0.0148162841796875,
-0.01499176025390625,
-0.048828125,
-0.005641937255859375,
0.01073455810546875,
-0.0338134765625,
-0.02069091796875,
0.054107666015625,
-0.0095672607421875,
-0.0012311935424804688,
0.06396484375,
-0.0117950439453125,
-0.048309326171875,
0.04742431640625,
0.03338623046875,
0.057220458984375,
-0.0036563873291015625,
0.01316070556640625,
0.07373046875,
-0.0035610198974609375,
-0.0176544189453125,
0.0254364013671875,
0.01519775390625,
-0.053466796875,
-0.004268646240234375,
-0.07232666015625,
-0.0008420944213867188,
0.0175323486328125,
-0.06011962890625,
0.0360107421875,
-0.036224365234375,
-0.02581787109375,
0.0019130706787109375,
0.0145721435546875,
-0.04656982421875,
0.0297698974609375,
0.005519866943359375,
0.05548095703125,
-0.080322265625,
0.0728759765625,
0.06109619140625,
-0.058837890625,
-0.08154296875,
-0.01229095458984375,
-0.005950927734375,
-0.04071044921875,
0.056396484375,
0.0189208984375,
0.0207366943359375,
-0.005462646484375,
-0.0189208984375,
-0.03533935546875,
0.074462890625,
0.0175018310546875,
-0.0418701171875,
0.00942230224609375,
0.0290374755859375,
0.05255126953125,
-0.0081634521484375,
0.04473876953125,
0.04364013671875,
0.036895751953125,
0.0143890380859375,
-0.047210693359375,
0.0015583038330078125,
-0.03375244140625,
-0.0175933837890625,
0.00439453125,
-0.06201171875,
0.09521484375,
-0.01424407958984375,
0.002262115478515625,
-0.005279541015625,
0.03564453125,
0.02581787109375,
0.0234222412109375,
0.0204620361328125,
0.0631103515625,
0.053314208984375,
-0.0279998779296875,
0.06292724609375,
-0.0281524658203125,
0.056549072265625,
0.05902099609375,
-0.005054473876953125,
0.052276611328125,
0.050933837890625,
-0.0301513671875,
0.04248046875,
0.060028076171875,
-0.0286102294921875,
0.0310211181640625,
0.012603759765625,
-0.0225677490234375,
-0.00948333740234375,
0.00603485107421875,
-0.033966064453125,
0.04022216796875,
0.0007734298706054688,
-0.03900146484375,
-0.00490570068359375,
-0.0007534027099609375,
0.03314208984375,
-0.016021728515625,
-0.0198822021484375,
0.04290771484375,
-0.0055084228515625,
-0.05926513671875,
0.06787109375,
0.005367279052734375,
0.06610107421875,
-0.0457763671875,
-0.0009908676147460938,
-0.0249786376953125,
0.040924072265625,
-0.0265350341796875,
-0.0484619140625,
0.01226806640625,
-0.0025424957275390625,
-0.0185699462890625,
-0.0015392303466796875,
0.02294921875,
-0.037750244140625,
-0.0635986328125,
0.0275726318359375,
0.0271148681640625,
0.028228759765625,
0.0034008026123046875,
-0.0640869140625,
0.0020008087158203125,
0.0204620361328125,
-0.033294677734375,
0.034271240234375,
0.0306549072265625,
0.00514984130859375,
0.030670166015625,
0.046417236328125,
0.0124664306640625,
0.004241943359375,
0.00986480712890625,
0.0699462890625,
-0.040771484375,
-0.0174102783203125,
-0.087890625,
0.0518798828125,
-0.00737762451171875,
-0.045867919921875,
0.06756591796875,
0.06549072265625,
0.08544921875,
-0.02117919921875,
0.06805419921875,
-0.013946533203125,
0.0189208984375,
-0.0174102783203125,
0.0653076171875,
-0.050933837890625,
0.00423431396484375,
-0.03155517578125,
-0.057861328125,
-0.005916595458984375,
0.057586669921875,
-0.0364990234375,
0.0092315673828125,
0.044952392578125,
0.05975341796875,
-0.0052642822265625,
-0.0186767578125,
0.00390625,
0.023651123046875,
0.0165863037109375,
0.042236328125,
0.04864501953125,
-0.06158447265625,
0.0384521484375,
-0.049560546875,
-0.01229095458984375,
-0.0017652511596679688,
-0.052276611328125,
-0.08404541015625,
-0.051116943359375,
-0.040863037109375,
-0.041351318359375,
-0.0251617431640625,
0.06939697265625,
0.034515380859375,
-0.072021484375,
-0.0273895263671875,
-0.0004949569702148438,
-0.00618743896484375,
-0.01271820068359375,
-0.0245819091796875,
0.03759765625,
-0.0191192626953125,
-0.08172607421875,
-0.0064544677734375,
-0.01203155517578125,
0.0260009765625,
-0.018096923828125,
-0.016326904296875,
-0.0258941650390625,
-0.0190582275390625,
0.0207061767578125,
0.00789642333984375,
-0.03631591796875,
-0.004425048828125,
-0.0020599365234375,
-0.013702392578125,
0.0206146240234375,
0.018890380859375,
-0.0418701171875,
0.03692626953125,
0.034149169921875,
0.005767822265625,
0.0482177734375,
-0.0190887451171875,
0.01366424560546875,
-0.05712890625,
0.0458984375,
0.00518798828125,
0.055755615234375,
0.0294189453125,
-0.0250701904296875,
0.022308349609375,
0.0416259765625,
-0.02606201171875,
-0.060821533203125,
-0.0225830078125,
-0.071533203125,
-0.0100555419921875,
0.0831298828125,
-0.029876708984375,
-0.027801513671875,
0.00873565673828125,
-0.03753662109375,
0.045074462890625,
-0.0301055908203125,
0.0601806640625,
0.06744384765625,
0.008331298828125,
-0.0027027130126953125,
-0.032440185546875,
0.0269927978515625,
0.0286407470703125,
-0.0384521484375,
-0.001598358154296875,
0.0153350830078125,
0.04095458984375,
0.0207366943359375,
0.0411376953125,
-0.005523681640625,
0.0024242401123046875,
0.0167083740234375,
0.028350830078125,
-0.01277923583984375,
-0.0028324127197265625,
-0.024810791015625,
-0.0038890838623046875,
-0.00911712646484375,
-0.0262603759765625
]
] |
allenai/longformer-base-4096 | 2023-04-05T18:24:00.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"longformer",
"en",
"arxiv:2004.05150",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | null | allenai | null | null | allenai/longformer-base-4096 | 107 | 4,035,450 | transformers | 2022-03-02T23:29:05 | ---
language: en
license: apache-2.0
---
# longformer-base-4096
[Longformer](https://arxiv.org/abs/2004.05150) is a transformer model for long documents.
`longformer-base-4096` is a BERT-like model started from the RoBERTa checkpoint and pretrained for MLM on long documents. It supports sequences of length up to 4,096.
Longformer combines sliding-window (local) attention with global attention. Global attention is user-configured based on the task, allowing the model to learn task-specific representations.
Please refer to the examples in `modeling_longformer.py` and the paper for more details on how to set global attention.
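As a minimal sketch of the configuration described above (assuming PyTorch; the helper name is illustrative, not part of the library), the global attention pattern is expressed as a mask marking which tokens attend globally — typically the `<s>`/CLS token for classification tasks:

```python
import torch

def build_global_attention_mask(input_ids, global_positions):
    """Illustrative helper: 0 = sliding-window (local) attention,
    1 = global attention at the given token positions."""
    mask = torch.zeros_like(input_ids)
    mask[:, global_positions] = 1
    return mask

# One sequence at Longformer's maximum length of 4,096 tokens
input_ids = torch.randint(0, 50000, (1, 4096))
# Give global attention to position 0 (the <s>/CLS token)
global_attention_mask = build_global_attention_mask(input_ids, [0])
```

A mask built this way is passed to the model's forward call as `global_attention_mask`, alongside the usual `attention_mask`; see `modeling_longformer.py` for task-specific choices of global positions.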
### Citing
If you use `Longformer` in your research, please cite [Longformer: The Long-Document Transformer](https://arxiv.org/abs/2004.05150).
```
@article{Beltagy2020Longformer,
title={Longformer: The Long-Document Transformer},
author={Iz Beltagy and Matthew E. Peters and Arman Cohan},
journal={arXiv:2004.05150},
year={2020},
}
```
`Longformer` is an open-source project developed by [the Allen Institute for Artificial Intelligence (AI2)](http://www.allenai.org).
AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering. | 1,253 | [
[
-0.016815185546875,
-0.0389404296875,
0.040313720703125,
0.028289794921875,
0.007564544677734375,
-0.016937255859375,
-0.024322509765625,
-0.03375244140625,
0.009765625,
0.042877197265625,
-0.04742431640625,
-0.0058746337890625,
-0.045928955078125,
0.011871337890625,
-0.02880859375,
0.0855712890625,
-0.0201263427734375,
-0.00872802734375,
-0.019683837890625,
0.006893157958984375,
-0.0020465850830078125,
-0.040313720703125,
-0.0452880859375,
-0.036041259765625,
0.0574951171875,
-0.01143646240234375,
0.0357666015625,
0.0240478515625,
0.041595458984375,
0.023712158203125,
-0.005893707275390625,
-0.0272674560546875,
-0.05938720703125,
-0.00209808349609375,
-0.01202392578125,
-0.047882080078125,
-0.04461669921875,
-0.0259246826171875,
0.050537109375,
0.0288543701171875,
0.01447296142578125,
0.02301025390625,
0.015106201171875,
0.06640625,
-0.032806396484375,
0.027099609375,
-0.01282501220703125,
-0.0029087066650390625,
-0.0265045166015625,
0.03472900390625,
-0.0382080078125,
-0.0224609375,
0.00623321533203125,
-0.02423095703125,
0.032562255859375,
-0.00870513916015625,
0.052276611328125,
0.0213470458984375,
-0.03216552734375,
-0.01094818115234375,
-0.0697021484375,
0.0679931640625,
-0.043426513671875,
0.0300140380859375,
0.01316070556640625,
0.036712646484375,
-0.01558685302734375,
-0.06475830078125,
-0.041778564453125,
-0.00508880615234375,
-0.004665374755859375,
0.0098419189453125,
-0.03131103515625,
0.01580810546875,
0.03759765625,
0.0292816162109375,
-0.04522705078125,
0.01401519775390625,
-0.07208251953125,
-0.01239013671875,
0.030059814453125,
-0.01605224609375,
-0.01540374755859375,
-0.01953125,
-0.040863037109375,
0.007293701171875,
-0.027923583984375,
0.035186767578125,
0.01474761962890625,
0.0013799667358398438,
-0.0181732177734375,
0.03240966796875,
-0.0098114013671875,
0.040069580078125,
0.03204345703125,
-0.0031948089599609375,
0.0333251953125,
0.00933074951171875,
-0.0304107666015625,
-0.0004134178161621094,
0.052154541015625,
0.0156707763671875,
0.0186004638671875,
-0.020904541015625,
-0.0129241943359375,
-0.00601959228515625,
0.038330078125,
-0.0626220703125,
0.01226806640625,
0.02813720703125,
-0.055145263671875,
-0.013214111328125,
0.01363372802734375,
-0.04486083984375,
-0.007293701171875,
-0.0283966064453125,
0.03582763671875,
-0.027130126953125,
-0.00380706787109375,
-0.003459930419921875,
-0.0004105567932128906,
0.03521728515625,
0.019622802734375,
-0.06414794921875,
0.0254058837890625,
0.04925537109375,
0.059295654296875,
-0.0309906005859375,
-0.0234527587890625,
-0.01047515869140625,
0.01435089111328125,
0.0022563934326171875,
0.042816162109375,
-0.00511932373046875,
-0.022857666015625,
0.007411956787109375,
0.040313720703125,
-0.016845703125,
-0.03369140625,
0.06304931640625,
-0.045562744140625,
0.05548095703125,
0.0181427001953125,
-0.032196044921875,
-0.0291290283203125,
0.020599365234375,
-0.08087158203125,
0.09527587890625,
0.038238525390625,
-0.0391845703125,
0.0024662017822265625,
-0.0833740234375,
-0.0186309814453125,
-0.0006146430969238281,
0.0011568069458007812,
-0.054779052734375,
-0.00473785400390625,
-0.00214385986328125,
0.031829833984375,
-0.00395965576171875,
0.0290985107421875,
-0.0014886856079101562,
-0.0389404296875,
-0.00669097900390625,
-0.0307769775390625,
0.044677734375,
-0.006267547607421875,
-0.050689697265625,
0.0384521484375,
-0.07708740234375,
-0.007373809814453125,
0.0122222900390625,
-0.041473388671875,
0.0036411285400390625,
-0.00820159912109375,
0.0232696533203125,
0.0276336669921875,
0.0279083251953125,
-0.04510498046875,
0.024078369140625,
-0.03265380859375,
0.040924072265625,
0.04534912109375,
-0.0177001953125,
0.040618896484375,
-0.041534423828125,
0.03411865234375,
-0.0012674331665039062,
0.017913818359375,
-0.01361083984375,
-0.027496337890625,
-0.07281494140625,
-0.007720947265625,
0.006565093994140625,
0.033966064453125,
-0.01140594482421875,
0.055816650390625,
-0.039276123046875,
-0.03997802734375,
-0.042388916015625,
0.0146942138671875,
0.0031833648681640625,
0.017303466796875,
0.037994384765625,
-0.001705169677734375,
-0.043731689453125,
-0.076416015625,
0.0281829833984375,
0.0214996337890625,
-0.0105438232421875,
-0.0005211830139160156,
0.046173095703125,
-0.039459228515625,
0.07318115234375,
-0.01371002197265625,
-0.031036376953125,
-0.02471923828125,
0.005985260009765625,
0.06304931640625,
0.02801513671875,
0.041351318359375,
-0.06390380859375,
-0.032440185546875,
-0.0435791015625,
-0.042877197265625,
0.0325927734375,
-0.02490234375,
-0.0201263427734375,
0.03277587890625,
0.0275726318359375,
-0.0849609375,
0.039337158203125,
0.042022705078125,
-0.004039764404296875,
0.0364990234375,
-0.0057830810546875,
-0.01236724853515625,
-0.09234619140625,
0.016815185546875,
0.0026836395263671875,
-0.015380859375,
-0.042694091796875,
-0.006114959716796875,
0.02886962890625,
-0.02008056640625,
-0.0343017578125,
0.033172607421875,
-0.038726806640625,
0.0224609375,
-0.023223876953125,
-0.027069091796875,
-0.00217437744140625,
0.044525146484375,
-0.003978729248046875,
0.044158935546875,
0.031097412109375,
-0.022186279296875,
0.0401611328125,
0.017822265625,
-0.027313232421875,
0.018890380859375,
-0.06109619140625,
0.020782470703125,
-0.032257080078125,
0.0526123046875,
-0.05889892578125,
-0.01204681396484375,
0.01505279541015625,
-0.0172119140625,
0.0298614501953125,
-0.01222991943359375,
-0.016876220703125,
-0.06219482421875,
-0.0312042236328125,
0.0340576171875,
0.03460693359375,
-0.028656005859375,
0.060516357421875,
-0.022552490234375,
-0.016754150390625,
-0.04669189453125,
-0.038726806640625,
-0.006969451904296875,
-0.01371002197265625,
-0.0653076171875,
0.04156494140625,
-0.0203857421875,
0.004535675048828125,
-0.0216522216796875,
-0.004665374755859375,
0.01214599609375,
-0.021697998046875,
0.058563232421875,
0.0209503173828125,
-0.03875732421875,
0.00499725341796875,
-0.0216522216796875,
-0.01378631591796875,
0.03173828125,
-0.00914764404296875,
0.0518798828125,
-0.0001327991485595703,
-0.028076171875,
-0.028656005859375,
0.04327392578125,
0.073486328125,
-0.0203857421875,
0.054931640625,
0.046661376953125,
-0.032135009765625,
-0.0187835693359375,
-0.05047607421875,
0.0023193359375,
-0.03436279296875,
0.0457763671875,
-0.018157958984375,
-0.0516357421875,
0.0303955078125,
-0.0017614364624023438,
-0.00811004638671875,
0.059600830078125,
0.048004150390625,
-0.0125274658203125,
0.0390625,
0.0662841796875,
-0.0390625,
0.04095458984375,
-0.032440185546875,
0.016326904296875,
-0.04998779296875,
-0.0262603759765625,
-0.027923583984375,
-0.0389404296875,
-0.018951416015625,
-0.04693603515625,
0.01317596435546875,
0.004703521728515625,
-0.047027587890625,
0.01345062255859375,
-0.04168701171875,
0.02520751953125,
0.052947998046875,
0.0080718994140625,
-0.00937652587890625,
-0.00958251953125,
0.02197265625,
0.01074981689453125,
-0.01403045654296875,
-0.033203125,
0.07000732421875,
0.054351806640625,
0.0682373046875,
0.0109100341796875,
0.06561279296875,
0.0221099853515625,
-0.0007100105285644531,
-0.08050537109375,
0.0164947509765625,
0.0056915283203125,
-0.050811767578125,
-0.03363037109375,
0.00586700439453125,
-0.07952880859375,
-0.0198974609375,
0.0014104843139648438,
-0.053070068359375,
0.00989532470703125,
-0.00830078125,
-0.0196533203125,
0.017181396484375,
-0.0307769775390625,
0.06964111328125,
-0.038543701171875,
-0.00403594970703125,
-0.0054931640625,
-0.057403564453125,
-0.003108978271484375,
-0.01345062255859375,
0.00930023193359375,
0.01100921630859375,
0.0291748046875,
0.0653076171875,
-0.0024662017822265625,
0.0845947265625,
-0.012603759765625,
-0.01727294921875,
0.006805419921875,
-0.0277862548828125,
0.062347412109375,
-0.02471923828125,
-0.01800537109375,
0.00998687744140625,
-0.03057861328125,
-0.030548095703125,
-0.0345458984375,
0.04095458984375,
-0.07806396484375,
-0.038238525390625,
-0.035369873046875,
-0.0290679931640625,
0.0103912353515625,
0.048126220703125,
0.0274658203125,
0.0162506103515625,
-0.022674560546875,
0.044677734375,
0.05029296875,
0.0216217041015625,
0.05255126953125,
0.019989013671875,
0.0007381439208984375,
-0.023834228515625,
0.029693603515625,
0.002079010009765625,
0.00986480712890625,
0.052520751953125,
-0.0080718994140625,
-0.01456451416015625,
-0.03533935546875,
-0.0185546875,
0.029144287109375,
-0.053680419921875,
-0.0111236572265625,
-0.03436279296875,
-0.06060791015625,
-0.031829833984375,
-0.01210784912109375,
-0.01163482666015625,
-0.015777587890625,
-0.03497314453125,
-0.01412200927734375,
0.0102386474609375,
0.0570068359375,
0.0239410400390625,
0.0213775634765625,
-0.0550537109375,
0.031036376953125,
0.01520538330078125,
0.03253173828125,
0.0055694580078125,
-0.039459228515625,
-0.028533935546875,
-0.005680084228515625,
-0.042236328125,
-0.047393798828125,
0.009002685546875,
0.016845703125,
0.07012939453125,
0.0231170654296875,
-0.002788543701171875,
0.025238037109375,
-0.04827880859375,
0.0599365234375,
0.0178985595703125,
-0.054107666015625,
0.04119873046875,
-0.0258331298828125,
0.037445068359375,
0.0179290771484375,
0.060943603515625,
-0.033905029296875,
-0.01258087158203125,
-0.032012939453125,
-0.08074951171875,
0.04217529296875,
0.01055145263671875,
0.0245361328125,
0.019317626953125,
0.0118408203125,
0.0194091796875,
-0.0103607177734375,
-0.09027099609375,
-0.0202789306640625,
-0.03912353515625,
-0.0263824462890625,
-0.01496124267578125,
-0.039794921875,
-0.016815185546875,
-0.0016603469848632812,
0.053131103515625,
-0.00713348388671875,
0.041229248046875,
0.01105499267578125,
-0.0256195068359375,
-0.016998291015625,
0.040008544921875,
0.051971435546875,
0.058258056640625,
-0.035675048828125,
-0.013092041015625,
-0.005008697509765625,
-0.035614013671875,
-0.00395965576171875,
0.03448486328125,
-0.00664520263671875,
0.005054473876953125,
0.032806396484375,
0.074951171875,
0.0162506103515625,
-0.0166015625,
0.03533935546875,
0.01715087890625,
-0.0278778076171875,
-0.06195068359375,
-0.000039696693420410156,
0.018951416015625,
0.027740478515625,
0.053466796875,
0.009246826171875,
0.003841400146484375,
-0.025482177734375,
0.00518035888671875,
0.006542205810546875,
-0.033050537109375,
-0.01490020751953125,
0.046539306640625,
0.0221405029296875,
-0.034515380859375,
0.048736572265625,
0.0200958251953125,
-0.03204345703125,
0.047149658203125,
0.068603515625,
0.050994873046875,
0.0003859996795654297,
-0.0148773193359375,
0.0189208984375,
-0.004184722900390625,
-0.01073455810546875,
0.0171966552734375,
-0.006168365478515625,
-0.0238189697265625,
-0.03564453125,
-0.056640625,
-0.018402099609375,
0.041229248046875,
-0.052520751953125,
0.0302581787109375,
-0.01110076904296875,
-0.0214996337890625,
0.0281829833984375,
-0.00339508056640625,
-0.045928955078125,
0.01456451416015625,
0.048065185546875,
0.0799560546875,
-0.0341796875,
0.06903076171875,
0.056365966796875,
-0.02532958984375,
-0.034912109375,
-0.00621795654296875,
-0.00521087646484375,
-0.058868408203125,
0.058868408203125,
0.041778564453125,
-0.0012998580932617188,
-0.0159454345703125,
-0.030517578125,
-0.0791015625,
0.090087890625,
0.0023059844970703125,
-0.060028076171875,
-0.049957275390625,
0.02490234375,
0.032501220703125,
-0.006519317626953125,
0.019927978515625,
0.0014314651489257812,
0.0303955078125,
-0.0062255859375,
-0.0799560546875,
0.0006480216979980469,
-0.036285400390625,
-0.0027599334716796875,
0.03173828125,
-0.06365966796875,
0.05908203125,
-0.0121917724609375,
0.016357421875,
0.0266876220703125,
0.056121826171875,
0.0048828125,
0.01290130615234375,
0.029144287109375,
0.029022216796875,
0.037445068359375,
0.0008635520935058594,
0.041717529296875,
-0.0396728515625,
0.047393798828125,
0.07275390625,
-0.002529144287109375,
0.07171630859375,
0.0374755859375,
-0.0284881591796875,
0.065673828125,
0.017425537109375,
-0.037872314453125,
0.0274658203125,
0.01800537109375,
-0.0106048583984375,
0.0008349418640136719,
0.038330078125,
-0.04107666015625,
0.0251007080078125,
0.01061248779296875,
-0.061859130859375,
-0.0174102783203125,
0.00360107421875,
0.004650115966796875,
-0.020599365234375,
-0.027984619140625,
0.0460205078125,
0.005237579345703125,
-0.05145263671875,
0.05352783203125,
-0.00003075599670410156,
0.0791015625,
-0.06451416015625,
0.00420379638671875,
0.002437591552734375,
0.045989990234375,
-0.0247344970703125,
-0.0604248046875,
0.0096893310546875,
-0.005313873291015625,
-0.07568359375,
-0.025146484375,
0.06036376953125,
-0.0193023681640625,
-0.051483154296875,
0.030181884765625,
0.0132904052734375,
0.0004222393035888672,
-0.020416259765625,
-0.0491943359375,
-0.01042938232421875,
-0.0014257431030273438,
-0.035614013671875,
0.025146484375,
0.01849365234375,
-0.0233001708984375,
0.039825439453125,
0.043670654296875,
-0.014801025390625,
0.0015764236450195312,
0.016265869140625,
0.06854248046875,
-0.0645751953125,
-0.04461669921875,
-0.053375244140625,
0.050323486328125,
-0.0017004013061523438,
-0.015380859375,
0.037811279296875,
0.0694580078125,
0.0438232421875,
-0.038299560546875,
0.0653076171875,
-0.00848388671875,
0.046661376953125,
-0.01538848876953125,
0.0679931640625,
-0.01505279541015625,
-0.030975341796875,
0.006549835205078125,
-0.0758056640625,
-0.00394439697265625,
0.03839111328125,
-0.022613525390625,
0.00229644775390625,
0.0301055908203125,
0.03863525390625,
-0.0190277099609375,
-0.0116729736328125,
0.013458251953125,
0.0201263427734375,
0.035400390625,
0.052215576171875,
0.044158935546875,
-0.03289794921875,
0.04827880859375,
-0.0278778076171875,
-0.01123046875,
-0.0187225341796875,
-0.06451416015625,
-0.07666015625,
-0.036346435546875,
0.010986328125,
-0.036590576171875,
0.0098876953125,
0.07244873046875,
0.056488037109375,
-0.07159423828125,
-0.01490020751953125,
0.0208282470703125,
-0.0162811279296875,
-0.0030002593994140625,
-0.018310546875,
0.045806884765625,
-0.0178375244140625,
-0.052398681640625,
0.0215301513671875,
-0.005153656005859375,
0.0123443603515625,
-0.01342010498046875,
-0.006710052490234375,
0.006946563720703125,
-0.0179443359375,
0.0484619140625,
0.02783203125,
-0.0592041015625,
-0.0096893310546875,
0.0096282958984375,
-0.0177001953125,
0.035491943359375,
0.039215087890625,
-0.053619384765625,
0.014068603515625,
0.01323699951171875,
0.0191192626953125,
0.061553955078125,
0.005359649658203125,
0.04083251953125,
-0.037109375,
0.01190948486328125,
0.0242919921875,
0.038299560546875,
0.0147705078125,
-0.04595947265625,
0.0156707763671875,
0.0047454833984375,
-0.062408447265625,
-0.0285491943359375,
-0.005565643310546875,
-0.11651611328125,
0.003925323486328125,
0.08612060546875,
-0.0060272216796875,
-0.03369140625,
0.01087188720703125,
-0.034820556640625,
0.0215911865234375,
-0.04364013671875,
0.06292724609375,
0.04266357421875,
-0.0230560302734375,
-0.0229644775390625,
-0.02874755859375,
0.0265655517578125,
-0.0106201171875,
-0.05120849609375,
-0.01416778564453125,
0.0185089111328125,
0.04168701171875,
0.056671142578125,
0.030731201171875,
0.0032863616943359375,
0.0190277099609375,
-0.0282135009765625,
0.022430419921875,
-0.01800537109375,
-0.00862884521484375,
-0.020782470703125,
0.01898193359375,
-0.01180267333984375,
-0.0132904052734375
]
] |
facebook/bart-large-cnn | 2023-10-03T04:52:04.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"bart",
"text2text-generation",
"summarization",
"en",
"dataset:cnn_dailymail",
"arxiv:1910.13461",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | summarization | facebook | null | null | facebook/bart-large-cnn | 647 | 3,952,883 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
tags:
- summarization
license: mit
thumbnail: https://huggingface.co/front/thumbnails/facebook.png
datasets:
- cnn_dailymail
model-index:
- name: facebook/bart-large-cnn
results:
- task:
type: summarization
name: Summarization
dataset:
name: cnn_dailymail
type: cnn_dailymail
config: 3.0.0
split: train
metrics:
- name: ROUGE-1
type: rouge
value: 42.9486
verified: true
- name: ROUGE-2
type: rouge
value: 20.8149
verified: true
- name: ROUGE-L
type: rouge
value: 30.6186
verified: true
- name: ROUGE-LSUM
type: rouge
value: 40.0376
verified: true
- name: loss
type: loss
value: 2.529000997543335
verified: true
- name: gen_len
type: gen_len
value: 78.5866
verified: true
---
# BART (large-sized model), fine-tuned on CNN Daily Mail
BART model pre-trained on English, and fine-tuned on [CNN Daily Mail](https://huggingface.co/datasets/cnn_dailymail). It was introduced in the paper [BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension](https://arxiv.org/abs/1910.13461) by Lewis et al. and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/bart).
Disclaimer: The team releasing BART did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
BART is a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
BART is particularly effective when fine-tuned for text generation (e.g. summarization, translation) but also works well for comprehension tasks (e.g. text classification, question answering). This particular checkpoint has been fine-tuned on CNN Daily Mail, a large collection of text-summary pairs.
## Intended uses & limitations
You can use this model for text summarization.
### How to use
Here is how to use this model with the [pipeline API](https://huggingface.co/transformers/main_classes/pipelines.html):
```python
from transformers import pipeline
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
ARTICLE = """ New York (CNN)When Liana Barrientos was 23 years old, she got married in Westchester County, New York.
A year later, she got married again in Westchester County, but to a different man and without divorcing her first husband.
Only 18 days after that marriage, she got hitched yet again. Then, Barrientos declared "I do" five more times, sometimes only within two weeks of each other.
In 2010, she married once more, this time in the Bronx. In an application for a marriage license, she stated it was her "first and only" marriage.
Barrientos, now 39, is facing two criminal counts of "offering a false instrument for filing in the first degree," referring to her false statements on the
2010 marriage license application, according to court documents.
Prosecutors said the marriages were part of an immigration scam.
On Friday, she pleaded not guilty at State Supreme Court in the Bronx, according to her attorney, Christopher Wright, who declined to comment further.
After leaving court, Barrientos was arrested and charged with theft of service and criminal trespass for allegedly sneaking into the New York subway through an emergency exit, said Detective
Annette Markowski, a police spokeswoman. In total, Barrientos has been married 10 times, with nine of her marriages occurring between 1999 and 2002.
All occurred either in Westchester County, Long Island, New Jersey or the Bronx. She is believed to still be married to four men, and at one time, she was married to eight men at once, prosecutors say.
Prosecutors said the immigration scam involved some of her husbands, who filed for permanent residence status shortly after the marriages.
Any divorces happened only after such filings were approved. It was unclear whether any of the men will be prosecuted.
The case was referred to the Bronx District Attorney\'s Office by Immigration and Customs Enforcement and the Department of Homeland Security\'s
Investigation Division. Seven of the men are from so-called "red-flagged" countries, including Egypt, Turkey, Georgia, Pakistan and Mali.
Her eighth husband, Rashid Rajput, was deported in 2006 to his native Pakistan after an investigation by the Joint Terrorism Task Force.
If convicted, Barrientos faces up to four years in prison. Her next court appearance is scheduled for May 18.
"""
print(summarizer(ARTICLE, max_length=130, min_length=30, do_sample=False))
>>> [{'summary_text': 'Liana Barrientos, 39, is charged with two counts of "offering a false instrument for filing in the first degree" In total, she has been married 10 times, with nine of her marriages occurring between 1999 and 2002. She is believed to still be married to four men.'}]
```
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1910-13461,
author = {Mike Lewis and
Yinhan Liu and
Naman Goyal and
Marjan Ghazvininejad and
Abdelrahman Mohamed and
Omer Levy and
Veselin Stoyanov and
Luke Zettlemoyer},
title = {{BART:} Denoising Sequence-to-Sequence Pre-training for Natural Language
Generation, Translation, and Comprehension},
journal = {CoRR},
volume = {abs/1910.13461},
year = {2019},
url = {http://arxiv.org/abs/1910.13461},
eprinttype = {arXiv},
eprint = {1910.13461},
timestamp = {Thu, 31 Oct 2019 14:02:26 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1910-13461.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 6,003 | [
[
-0.035247802734375,
-0.055023193359375,
0.02850341796875,
0.02752685546875,
-0.0382080078125,
-0.019073486328125,
0.00536346435546875,
-0.0229339599609375,
0.0298614501953125,
0.04620361328125,
-0.0208587646484375,
-0.0300445556640625,
-0.04248046875,
0.032745361328125,
-0.035369873046875,
0.07623291015625,
0.01256561279296875,
0.00470733642578125,
-0.0171661376953125,
-0.0014448165893554688,
-0.0250701904296875,
-0.025238037109375,
-0.047576904296875,
0.015533447265625,
0.0285186767578125,
0.0307464599609375,
0.046630859375,
0.0418701171875,
0.047210693359375,
0.022369384765625,
-0.0236663818359375,
-0.0101165771484375,
-0.045013427734375,
0.00006538629531860352,
-0.0012722015380859375,
-0.0479736328125,
-0.056396484375,
0.0081787109375,
0.0306243896484375,
0.054840087890625,
-0.027252197265625,
0.038360595703125,
-0.023040771484375,
0.063720703125,
-0.03985595703125,
-0.01006317138671875,
-0.046112060546875,
-0.0009064674377441406,
-0.048309326171875,
-0.012420654296875,
-0.040252685546875,
-0.03912353515625,
-0.0052490234375,
-0.04547119140625,
0.0101318359375,
0.01457977294921875,
0.08050537109375,
0.003032684326171875,
-0.038848876953125,
-0.00534820556640625,
-0.0479736328125,
0.051513671875,
-0.07330322265625,
0.03643798828125,
0.045013427734375,
0.00975799560546875,
-0.016204833984375,
-0.04449462890625,
-0.03631591796875,
-0.0238494873046875,
-0.031890869140625,
0.01861572265625,
-0.02978515625,
-0.00995635986328125,
0.0474853515625,
0.04345703125,
-0.065673828125,
-0.026123046875,
-0.04913330078125,
-0.0178985595703125,
0.03973388671875,
0.00847625732421875,
0.016937255859375,
-0.002452850341796875,
-0.0162811279296875,
0.01413726806640625,
-0.02081298828125,
0.01491546630859375,
0.032012939453125,
0.0014562606811523438,
-0.04937744140625,
0.0416259765625,
-0.0024089813232421875,
0.04925537109375,
0.02117919921875,
-0.0318603515625,
0.04486083984375,
-0.031463623046875,
-0.021728515625,
-0.0231170654296875,
0.07159423828125,
0.02398681640625,
0.038665771484375,
0.01983642578125,
-0.01593017578125,
0.006984710693359375,
0.01165771484375,
-0.0521240234375,
0.0038585662841796875,
-0.006961822509765625,
-0.052703857421875,
-0.02874755859375,
-0.003261566162109375,
-0.04718017578125,
0.018280029296875,
-0.0036373138427734375,
-0.004695892333984375,
-0.02606201171875,
-0.0277252197265625,
0.02850341796875,
-0.033721923828125,
0.014129638671875,
0.013641357421875,
-0.049407958984375,
0.025146484375,
0.03265380859375,
0.06756591796875,
0.0087738037109375,
-0.0208587646484375,
-0.03802490234375,
-0.0205841064453125,
-0.05364990234375,
0.0171051025390625,
-0.033477783203125,
-0.0188446044921875,
-0.01386260986328125,
0.0024871826171875,
0.0083160400390625,
-0.031280517578125,
0.038299560546875,
-0.0677490234375,
0.0175628662109375,
-0.01491546630859375,
-0.01739501953125,
-0.00006443262100219727,
0.0126495361328125,
-0.03802490234375,
0.060455322265625,
0.01247406005859375,
-0.07403564453125,
0.00534820556640625,
-0.037078857421875,
-0.040985107421875,
-0.01922607421875,
0.031494140625,
-0.0537109375,
0.0086669921875,
0.0000483393669128418,
0.02685546875,
-0.024688720703125,
0.051849365234375,
-0.042724609375,
-0.0017194747924804688,
0.0005211830139160156,
-0.03759765625,
0.11053466796875,
0.0399169921875,
-0.0157928466796875,
-0.0196685791015625,
-0.056671142578125,
-0.04736328125,
0.026519775390625,
-0.04046630859375,
-0.02984619140625,
-0.0281524658203125,
-0.0286865234375,
0.002529144287109375,
0.002552032470703125,
-0.056793212890625,
0.004756927490234375,
-0.05218505859375,
0.0110931396484375,
0.0107879638671875,
0.00897979736328125,
0.036651611328125,
-0.0347900390625,
0.04248046875,
0.033447265625,
0.02642822265625,
-0.031768798828125,
-0.06292724609375,
-0.061676025390625,
0.0124969482421875,
0.040252685546875,
0.04937744140625,
-0.045166015625,
0.02587890625,
-0.032501220703125,
-0.0450439453125,
-0.006496429443359375,
-0.009033203125,
0.032501220703125,
0.033843994140625,
0.032958984375,
-0.04144287109375,
-0.050750732421875,
-0.054901123046875,
-0.0284271240234375,
-0.0077972412109375,
0.0190277099609375,
-0.002437591552734375,
0.0677490234375,
0.0008907318115234375,
0.07373046875,
-0.031890869140625,
-0.0101318359375,
-0.003238677978515625,
0.018280029296875,
0.0310821533203125,
0.04315185546875,
0.07745361328125,
-0.05267333984375,
-0.058837890625,
-0.004405975341796875,
-0.056640625,
0.0132598876953125,
-0.01122283935546875,
-0.0024166107177734375,
0.0276336669921875,
0.0162811279296875,
-0.04217529296875,
0.04376220703125,
0.0291900634765625,
-0.031463623046875,
0.060791015625,
-0.01165008544921875,
-0.01042938232421875,
-0.07763671875,
0.00368499755859375,
-0.003032684326171875,
-0.0168304443359375,
-0.054840087890625,
-0.004711151123046875,
0.005825042724609375,
-0.007732391357421875,
-0.0305633544921875,
0.044281005859375,
-0.057861328125,
-0.0180206298828125,
0.015655517578125,
0.0006108283996582031,
-0.001476287841796875,
0.02545166015625,
-0.01287078857421875,
0.03936767578125,
0.0296478271484375,
-0.0289306640625,
0.037445068359375,
0.04388427734375,
-0.01459503173828125,
0.051605224609375,
-0.004268646240234375,
-0.0038394927978515625,
-0.0300750732421875,
0.020660400390625,
-0.06732177734375,
-0.0159759521484375,
0.0411376953125,
-0.0538330078125,
0.012725830078125,
-0.00656890869140625,
-0.014251708984375,
-0.0513916015625,
-0.04119873046875,
0.0187835693359375,
0.0262451171875,
-0.0169677734375,
0.0615234375,
0.031280517578125,
-0.006557464599609375,
-0.06890869140625,
-0.07366943359375,
0.033111572265625,
-0.0101318359375,
-0.052459716796875,
0.0290985107421875,
-0.0158538818359375,
-0.0305023193359375,
0.01190185546875,
0.00566864013671875,
-0.0181121826171875,
0.0168609619140625,
0.010345458984375,
0.020294189453125,
-0.01332855224609375,
0.004459381103515625,
0.0231170654296875,
-0.0004851818084716797,
0.000156402587890625,
0.0006322860717773438,
0.0570068359375,
-0.004184722900390625,
0.0103759765625,
-0.03558349609375,
0.0430908203125,
0.06982421875,
-0.0335693359375,
0.0743408203125,
0.036163330078125,
-0.02679443359375,
0.0125579833984375,
-0.057708740234375,
-0.0157623291015625,
-0.03314208984375,
0.01335906982421875,
-0.05377197265625,
-0.0672607421875,
0.06982421875,
0.018798828125,
0.00994873046875,
0.054412841796875,
0.01416015625,
0.0015439987182617188,
0.046600341796875,
0.0455322265625,
-0.006481170654296875,
0.025177001953125,
-0.0178985595703125,
0.00890350341796875,
-0.0679931640625,
0.0008292198181152344,
-0.0174560546875,
-0.01483154296875,
-0.0347900390625,
-0.0099945068359375,
0.0123291015625,
0.03265380859375,
-0.0347900390625,
0.037811279296875,
-0.032318115234375,
0.0230560302734375,
0.049285888671875,
-0.01459503173828125,
0.031646728515625,
-0.01386260986328125,
-0.02679443359375,
0.0018777847290039062,
-0.0572509765625,
-0.0162353515625,
0.091796875,
0.004383087158203125,
0.0340576171875,
0.0115814208984375,
0.04864501953125,
0.0172576904296875,
0.035614013671875,
-0.044281005859375,
0.04498291015625,
-0.01502227783203125,
-0.0657958984375,
-0.0180206298828125,
-0.05059814453125,
-0.0972900390625,
-0.0057525634765625,
-0.01375579833984375,
-0.0185089111328125,
0.036834716796875,
0.0120849609375,
-0.044921875,
0.022430419921875,
-0.032073974609375,
0.044677734375,
-0.0310516357421875,
-0.01511383056640625,
-0.02044677734375,
-0.079345703125,
0.07220458984375,
-0.0125579833984375,
0.039459228515625,
-0.002750396728515625,
0.019073486328125,
0.04937744140625,
-0.051116943359375,
0.054412841796875,
-0.01155853271484375,
0.00356292724609375,
0.0040740966796875,
0.007556915283203125,
0.04852294921875,
-0.0011882781982421875,
-0.0130615234375,
0.0104827880859375,
0.008026123046875,
0.00015723705291748047,
-0.0203857421875,
0.0478515625,
-0.0305938720703125,
-0.040985107421875,
-0.04364013671875,
-0.03265380859375,
0.032806396484375,
0.0239105224609375,
0.016693115234375,
0.0440673828125,
-0.0012006759643554688,
0.007053375244140625,
0.0232391357421875,
-0.03094482421875,
0.0430908203125,
0.0447998046875,
-0.036041259765625,
-0.060455322265625,
0.04669189453125,
0.024383544921875,
0.004009246826171875,
0.021514892578125,
0.0202789306640625,
0.00792694091796875,
-0.0236663818359375,
-0.0174102783203125,
0.05047607421875,
-0.036956787109375,
-0.0163421630859375,
-0.0430908203125,
-0.021209716796875,
-0.04595947265625,
-0.016815185546875,
-0.03729248046875,
-0.0237884521484375,
-0.02545166015625,
0.015960693359375,
0.01508331298828125,
0.03192138671875,
-0.01137542724609375,
0.0156707763671875,
-0.06671142578125,
0.04486083984375,
0.0186614990234375,
0.02740478515625,
-0.00004374980926513672,
-0.05328369140625,
-0.01450347900390625,
0.005950927734375,
-0.004505157470703125,
-0.07513427734375,
0.036163330078125,
0.01454925537109375,
0.0201263427734375,
0.043060302734375,
0.031280517578125,
0.0606689453125,
-0.02783203125,
0.06866455078125,
0.018463134765625,
-0.08056640625,
0.0433349609375,
-0.0233612060546875,
0.01480865478515625,
0.022674560546875,
0.035308837890625,
-0.03985595703125,
-0.0278778076171875,
-0.0535888671875,
-0.06304931640625,
0.07183837890625,
0.021392822265625,
0.022064208984375,
0.00681304931640625,
0.0477294921875,
-0.01216888427734375,
0.0291900634765625,
-0.0802001953125,
-0.0462646484375,
-0.0062103271484375,
-0.0089874267578125,
0.021728515625,
-0.0457763671875,
-0.0310211181640625,
-0.060150146484375,
0.055877685546875,
0.0290069580078125,
0.0301513671875,
0.0224609375,
-0.018157958984375,
0.0099334716796875,
0.0021209716796875,
0.093994140625,
0.06256103515625,
-0.018218994140625,
0.00917816162109375,
0.013702392578125,
-0.041412353515625,
0.0147705078125,
0.01186370849609375,
-0.0181884765625,
0.02691650390625,
0.0303192138671875,
0.0830078125,
0.020782470703125,
-0.0244140625,
0.06561279296875,
-0.00926971435546875,
-0.04644775390625,
-0.0543212890625,
-0.01213836669921875,
0.0220794677734375,
0.0084075927734375,
0.0158538818359375,
0.03057861328125,
0.0009307861328125,
-0.034912109375,
0.004802703857421875,
0.0621337890625,
-0.001949310302734375,
-0.0263519287109375,
0.06658935546875,
0.00934600830078125,
-0.032012939453125,
0.019805908203125,
-0.0408935546875,
-0.039337158203125,
0.021881103515625,
0.0467529296875,
0.035430908203125,
-0.04217529296875,
0.025482177734375,
0.0406494140625,
0.032379150390625,
-0.03277587890625,
0.0192108154296875,
-0.033782958984375,
-0.0712890625,
-0.0162353515625,
-0.06268310546875,
-0.00848388671875,
-0.0008254051208496094,
-0.052093505859375,
0.0160980224609375,
-0.0191192626953125,
-0.039794921875,
-0.0184173583984375,
-0.0130615234375,
-0.0247039794921875,
-0.005779266357421875,
0.01385498046875,
0.0655517578125,
-0.06585693359375,
0.051483154296875,
0.03387451171875,
-0.0120391845703125,
-0.067138671875,
-0.0230712890625,
0.0005269050598144531,
-0.0312042236328125,
0.04339599609375,
-0.016448974609375,
-0.0223388671875,
0.0189361572265625,
-0.037445068359375,
-0.086181640625,
0.07159423828125,
0.0557861328125,
-0.060791015625,
-0.00495147705078125,
0.0129547119140625,
0.031524658203125,
-0.02520751953125,
0.01983642578125,
0.0499267578125,
0.051116943359375,
0.00031685829162597656,
-0.081787109375,
-0.0020198822021484375,
-0.02520751953125,
-0.00336456298828125,
0.0028076171875,
-0.0494384765625,
0.07330322265625,
-0.0088043212890625,
-0.015899658203125,
-0.004360198974609375,
0.0167999267578125,
0.0176849365234375,
0.0379638671875,
0.04644775390625,
0.07415771484375,
0.045379638671875,
-0.01532745361328125,
0.0703125,
-0.03106689453125,
0.034454345703125,
0.0859375,
-0.006938934326171875,
0.05126953125,
0.019927978515625,
-0.043670654296875,
0.038299560546875,
0.02447509765625,
-0.00502777099609375,
0.025299072265625,
0.00007003545761108398,
0.00870513916015625,
0.0015211105346679688,
0.012908935546875,
-0.019775390625,
0.05096435546875,
0.008880615234375,
-0.0469970703125,
-0.01335906982421875,
0.004375457763671875,
0.042083740234375,
-0.01348114013671875,
-0.0024623870849609375,
0.035675048828125,
0.0293731689453125,
-0.057342529296875,
0.052825927734375,
-0.0008234977722167969,
0.05352783203125,
-0.044403076171875,
0.01326751708984375,
-0.034759521484375,
-0.012054443359375,
-0.0254974365234375,
-0.036102294921875,
0.0296478271484375,
-0.0010585784912109375,
-0.0184326171875,
-0.0166015625,
0.038299560546875,
-0.041168212890625,
-0.035919189453125,
0.0234375,
0.0270233154296875,
0.02215576171875,
-0.0018873214721679688,
-0.0396728515625,
-0.005779266357421875,
0.0190887451171875,
-0.0537109375,
-0.0078887939453125,
0.0284271240234375,
0.01006317138671875,
0.055145263671875,
0.061859130859375,
0.026885986328125,
0.026611328125,
-0.016998291015625,
0.05291748046875,
-0.058319091796875,
-0.0284881591796875,
-0.0712890625,
0.06427001953125,
-0.0187225341796875,
-0.036956787109375,
0.055572509765625,
0.06915283203125,
0.055145263671875,
-0.01287841796875,
0.0298919677734375,
0.0010290145874023438,
0.047088623046875,
-0.011077880859375,
0.0504150390625,
-0.04473876953125,
0.025634765625,
-0.035980224609375,
-0.06829833984375,
-0.057708740234375,
0.0288543701171875,
-0.0277099609375,
-0.0106201171875,
0.06378173828125,
0.052459716796875,
-0.0047607421875,
-0.0175933837890625,
0.037506103515625,
0.03326416015625,
0.007537841796875,
0.01007843017578125,
0.0428466796875,
-0.0400390625,
0.0546875,
-0.0162200927734375,
0.0028228759765625,
-0.034271240234375,
-0.03887939453125,
-0.054718017578125,
-0.0504150390625,
-0.0030956268310546875,
0.014862060546875,
-0.0087127685546875,
0.04248046875,
0.0296783447265625,
-0.035797119140625,
-0.01983642578125,
-0.017333984375,
0.0014781951904296875,
-0.0289459228515625,
-0.0205230712890625,
0.00872802734375,
-0.0038776397705078125,
-0.03021240234375,
0.0244140625,
0.035064697265625,
0.01100921630859375,
-0.004322052001953125,
0.0028743743896484375,
-0.0343017578125,
0.003993988037109375,
0.04901123046875,
0.0088958740234375,
-0.060150146484375,
-0.0018768310546875,
0.007099151611328125,
-0.0003800392150878906,
0.0255126953125,
0.02691650390625,
-0.052490234375,
0.02789306640625,
0.04132080078125,
0.0208740234375,
0.04937744140625,
0.0193328857421875,
0.033721923828125,
-0.04833984375,
0.0047607421875,
0.01666259765625,
0.039154052734375,
0.01433563232421875,
-0.0308074951171875,
0.03094482421875,
0.0411376953125,
-0.039398193359375,
-0.079345703125,
0.007556915283203125,
-0.0877685546875,
-0.004116058349609375,
0.045867919921875,
-0.01100921630859375,
0.01226806640625,
-0.02508544921875,
0.0107421875,
0.0384521484375,
-0.024169921875,
0.051849365234375,
0.0555419921875,
-0.007709503173828125,
-0.021728515625,
-0.04437255859375,
0.0254974365234375,
0.038726806640625,
-0.04510498046875,
0.0006914138793945312,
0.0130767822265625,
0.022491455078125,
0.0443115234375,
0.064453125,
-0.00658416748046875,
0.01149749755859375,
0.007511138916015625,
0.0102996826171875,
-0.0157928466796875,
-0.01485443115234375,
-0.0201263427734375,
0.025634765625,
-0.01403045654296875,
-0.005443572998046875
]
] |
cl-tohoku/bert-base-japanese | 2021-09-23T13:45:36.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ja",
"dataset:wikipedia",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | cl-tohoku | null | null | cl-tohoku/bert-base-japanese | 15 | 3,830,390 | transformers | 2022-03-02T23:29:05 | ---
language: ja
license: cc-by-sa-4.0
datasets:
- wikipedia
widget:
- text: ๆฑๅๅคงๅญฆใง[MASK]ใฎ็ ็ฉถใใใฆใใพใใ
---
# BERT base Japanese (IPA dictionary)
This is a [BERT](https://github.com/google-research/bert) model pretrained on texts in the Japanese language.
This version of the model processes input texts with word-level tokenization based on the IPA dictionary, followed by WordPiece subword tokenization.
The code used for pretraining is available at [cl-tohoku/bert-japanese](https://github.com/cl-tohoku/bert-japanese/tree/v1.0).
## Model architecture
The model architecture is the same as the original BERT base model; 12 layers, 768 dimensions of hidden states, and 12 attention heads.
## Training Data
The model is trained on Japanese Wikipedia as of September 1, 2019.
To generate the training corpus, [WikiExtractor](https://github.com/attardi/wikiextractor) is used to extract plain texts from a dump file of Wikipedia articles.
The text files used for the training are 2.6GB in size, consisting of approximately 17M sentences.
## Tokenization
The texts are first tokenized by the [MeCab](https://taku910.github.io/mecab/) morphological parser with the IPA dictionary and then split into subwords by the WordPiece algorithm.
The vocabulary size is 32,000.
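The WordPiece step described above can be illustrated with a minimal greedy longest-match-first sketch. Note this is a simplified illustration of the algorithm, not the tokenizer implementation used by this model, and the `vocab` below is a toy stand-in rather than the actual 32,000-entry vocabulary.

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first WordPiece segmentation of a single word."""
    subwords, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # continuation-piece marker
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:
            return [unk]  # word contains an out-of-vocabulary segment
        subwords.append(piece)
        start = end
    return subwords

# Toy vocabulary for illustration only
vocab = {"東北", "##大", "##学"}
print(wordpiece_tokenize("東北大学", vocab))  # ['東北', '##大', '##学']
```

In the real tokenizer this segmentation is applied to each MeCab-produced word, so subword boundaries never cross morphological word boundaries.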
## Training
The model is trained with the same configuration as the original BERT; 512 tokens per instance, 256 instances per batch, and 1M training steps.
## Licenses
The pretrained models are distributed under the terms of the [Creative Commons Attribution-ShareAlike 3.0](https://creativecommons.org/licenses/by-sa/3.0/).
## Acknowledgments
For training the models, we used Cloud TPUs provided by the [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc/) program.
| 1,743 | [
[
-0.035369873046875,
-0.052978515625,
0.0237884521484375,
0.0175018310546875,
-0.049774169921875,
-0.01739501953125,
-0.0164794921875,
-0.037017822265625,
0.034027099609375,
0.033111572265625,
-0.05194091796875,
-0.033172607421875,
-0.04608154296875,
-0.00026869773864746094,
-0.0140228271484375,
0.0877685546875,
-0.00205230712890625,
0.0197601318359375,
0.01239013671875,
0.01324462890625,
-0.02459716796875,
-0.02899169921875,
-0.048431396484375,
-0.03289794921875,
0.03558349609375,
0.01959228515625,
0.041351318359375,
0.04241943359375,
0.03155517578125,
0.0164031982421875,
-0.0017070770263671875,
-0.0183258056640625,
-0.041168212890625,
-0.022705078125,
-0.006954193115234375,
-0.0302734375,
-0.0166015625,
-0.0085906982421875,
0.0418701171875,
0.049407958984375,
0.00638580322265625,
0.00850677490234375,
-0.01318359375,
0.0222930908203125,
-0.041900634765625,
0.015777587890625,
-0.0504150390625,
0.0024166107177734375,
-0.0234527587890625,
0.018157958984375,
-0.023406982421875,
-0.01264190673828125,
0.025543212890625,
-0.059844970703125,
0.0197601318359375,
-0.01160430908203125,
0.09515380859375,
0.01015472412109375,
-0.004779815673828125,
-0.0231781005859375,
-0.0247039794921875,
0.049407958984375,
-0.051361083984375,
0.03350830078125,
0.047149658203125,
0.000751495361328125,
0.0005254745483398438,
-0.07281494140625,
-0.052581787109375,
-0.01317596435546875,
0.00811767578125,
0.01200103759765625,
0.0002791881561279297,
0.00759124755859375,
0.0210113525390625,
0.0240020751953125,
-0.044708251953125,
0.0206756591796875,
-0.03753662109375,
-0.01129150390625,
0.033050537109375,
-0.0056915283203125,
0.037384033203125,
-0.035125732421875,
-0.0300750732421875,
-0.034393310546875,
-0.04058837890625,
0.004146575927734375,
0.0193634033203125,
0.0123291015625,
-0.0208892822265625,
0.04931640625,
0.00925445556640625,
0.031524658203125,
-0.003894805908203125,
-0.02410888671875,
0.033905029296875,
-0.0210723876953125,
-0.01593017578125,
0.0109405517578125,
0.06829833984375,
0.011688232421875,
0.0242156982421875,
-0.00887298583984375,
-0.01251983642578125,
-0.006801605224609375,
0.039764404296875,
-0.05712890625,
-0.0183563232421875,
0.00550079345703125,
-0.04974365234375,
-0.0202178955078125,
-0.0088958740234375,
-0.02197265625,
0.00630950927734375,
-0.0005474090576171875,
0.056976318359375,
-0.07806396484375,
-0.023284912109375,
-0.0036487579345703125,
-0.033905029296875,
0.0258636474609375,
0.00457000732421875,
-0.07806396484375,
0.0089263916015625,
0.0428466796875,
0.058807373046875,
0.0086669921875,
-0.039520263671875,
0.0277252197265625,
0.01861572265625,
-0.037017822265625,
0.032470703125,
-0.0169525146484375,
-0.048675537109375,
-0.00908660888671875,
-0.0014009475708007812,
-0.00382232666015625,
-0.01523590087890625,
0.03997802734375,
-0.033416748046875,
0.01561737060546875,
-0.018402099609375,
-0.0577392578125,
-0.01070404052734375,
0.01763916015625,
-0.046630859375,
0.078857421875,
0.01184844970703125,
-0.06365966796875,
0.032623291015625,
-0.06829833984375,
-0.038726806640625,
0.028228759765625,
0.0042266845703125,
-0.0218353271484375,
0.009857177734375,
0.0174713134765625,
0.023895263671875,
0.0098876953125,
0.01511383056640625,
-0.0177459716796875,
-0.031646728515625,
-0.00565338134765625,
-0.0132598876953125,
0.09063720703125,
0.0196380615234375,
-0.0210723876953125,
-0.005466461181640625,
-0.063232421875,
0.002407073974609375,
0.014923095703125,
-0.0364990234375,
-0.04583740234375,
-0.0077056884765625,
0.01338958740234375,
-0.006114959716796875,
0.050079345703125,
-0.0609130859375,
0.017181396484375,
-0.04107666015625,
0.026123046875,
0.044219970703125,
0.00041985511779785156,
0.01486968994140625,
-0.0072479248046875,
0.0065460205078125,
-0.0021514892578125,
0.0202178955078125,
-0.0288848876953125,
-0.049041748046875,
-0.0628662109375,
-0.02362060546875,
0.0350341796875,
0.028717041015625,
-0.056884765625,
0.0714111328125,
-0.043426513671875,
-0.058868408203125,
-0.06134033203125,
-0.006671905517578125,
0.0257415771484375,
0.036346435546875,
0.0211181640625,
-0.03204345703125,
-0.040557861328125,
-0.0723876953125,
0.01497650146484375,
-0.025543212890625,
-0.01099395751953125,
0.0015354156494140625,
0.052886962890625,
-0.0283050537109375,
0.0662841796875,
-0.0183258056640625,
-0.0157623291015625,
-0.0244293212890625,
0.032470703125,
0.0244903564453125,
0.044525146484375,
0.04632568359375,
-0.04815673828125,
-0.036712646484375,
-0.0121917724609375,
-0.04412841796875,
0.004817962646484375,
0.003509521484375,
-0.01247406005859375,
0.0041046142578125,
0.0233154296875,
-0.05096435546875,
0.019012451171875,
0.035400390625,
-0.0094146728515625,
0.02728271484375,
-0.0182647705078125,
-0.021392822265625,
-0.10650634765625,
0.0294189453125,
-0.014984130859375,
-0.001956939697265625,
-0.0347900390625,
0.0272064208984375,
0.0114593505859375,
-0.0193023681640625,
-0.0287628173828125,
0.040618896484375,
-0.0245819091796875,
-0.0013036727905273438,
-0.0184326171875,
-0.017059326171875,
-0.0097198486328125,
0.05450439453125,
0.0210113525390625,
0.0638427734375,
0.030487060546875,
-0.044219970703125,
0.0156707763671875,
0.031707763671875,
-0.05108642578125,
0.0022792816162109375,
-0.06549072265625,
0.007465362548828125,
-0.00738525390625,
0.0106201171875,
-0.0745849609375,
-0.0208282470703125,
0.01849365234375,
-0.044891357421875,
0.0287628173828125,
0.003124237060546875,
-0.0550537109375,
-0.035247802734375,
-0.035858154296875,
0.003662109375,
0.04931640625,
-0.039764404296875,
0.038055419921875,
0.035980224609375,
-0.01904296875,
-0.0574951171875,
-0.06005859375,
0.004543304443359375,
0.0177764892578125,
-0.0308380126953125,
0.03961181640625,
-0.0076751708984375,
0.00921630859375,
0.0191497802734375,
0.00672149658203125,
-0.0243377685546875,
0.00543212890625,
0.0192413330078125,
0.0252838134765625,
-0.00905609130859375,
0.01141357421875,
0.0186767578125,
0.00974273681640625,
-0.0006937980651855469,
0.001842498779296875,
0.07659912109375,
0.002048492431640625,
-0.00577545166015625,
-0.03173828125,
0.0107574462890625,
0.033294677734375,
0.00334930419921875,
0.0673828125,
0.05908203125,
-0.0300140380859375,
0.00731658935546875,
-0.035736083984375,
-0.0020122528076171875,
-0.033355712890625,
0.043487548828125,
-0.042633056640625,
-0.04278564453125,
0.0382080078125,
0.0218505859375,
0.0247344970703125,
0.050506591796875,
0.043243408203125,
-0.0247039794921875,
0.06671142578125,
0.052001953125,
-0.04107666015625,
0.048492431640625,
-0.0269775390625,
-0.00130462646484375,
-0.0548095703125,
-0.0235137939453125,
-0.0318603515625,
-0.022064208984375,
-0.036285400390625,
-0.0067138671875,
0.019622802734375,
0.006275177001953125,
-0.03363037109375,
0.0330810546875,
-0.024658203125,
0.03253173828125,
0.06158447265625,
0.0156707763671875,
-0.010528564453125,
0.018768310546875,
-0.0250396728515625,
-0.00750732421875,
-0.047576904296875,
-0.0264892578125,
0.08758544921875,
0.043243408203125,
0.046844482421875,
-0.00847625732421875,
0.058746337890625,
0.0058746337890625,
0.01506805419921875,
-0.062744140625,
0.03741455078125,
-0.032470703125,
-0.07684326171875,
-0.030303955078125,
-0.01438140869140625,
-0.07421875,
0.005664825439453125,
-0.0175628662109375,
-0.041351318359375,
-0.0122528076171875,
-0.0169525146484375,
0.00368499755859375,
0.0254364013671875,
-0.056976318359375,
0.0606689453125,
-0.022247314453125,
0.01474761962890625,
-0.0192108154296875,
-0.057830810546875,
0.0221405029296875,
-0.01593017578125,
-0.00209808349609375,
0.007080078125,
-0.004802703857421875,
0.07647705078125,
-0.04376220703125,
0.07391357421875,
-0.025787353515625,
-0.006092071533203125,
0.009857177734375,
-0.0260772705078125,
0.0116729736328125,
-0.01282501220703125,
0.0152435302734375,
0.0477294921875,
-0.00460052490234375,
-0.024078369140625,
-0.0007724761962890625,
0.037750244140625,
-0.1046142578125,
-0.01422119140625,
-0.0153656005859375,
-0.0277862548828125,
-0.0051727294921875,
0.0537109375,
0.0596923828125,
0.0109405517578125,
-0.0230712890625,
0.027557373046875,
0.054931640625,
-0.0241546630859375,
0.0330810546875,
0.03533935546875,
-0.00975799560546875,
-0.032958984375,
0.06475830078125,
0.018035888671875,
-0.008758544921875,
0.04022216796875,
0.00045800209045410156,
-0.0242156982421875,
-0.038726806640625,
-0.0341796875,
0.03131103515625,
-0.0384521484375,
0.004116058349609375,
-0.04364013671875,
-0.03948974609375,
-0.043975830078125,
0.00899505615234375,
-0.03265380859375,
-0.0235595703125,
-0.0243682861328125,
-0.00452423095703125,
0.01153564453125,
0.047821044921875,
0.00267791748046875,
0.048004150390625,
-0.058807373046875,
0.0280609130859375,
0.0150146484375,
0.03216552734375,
0.0006241798400878906,
-0.038604736328125,
-0.0297393798828125,
0.00848388671875,
-0.015869140625,
-0.053131103515625,
0.0263214111328125,
0.0063629150390625,
0.04718017578125,
0.039581298828125,
-0.01629638671875,
0.0599365234375,
-0.054931640625,
0.08135986328125,
0.0325927734375,
-0.0704345703125,
0.04351806640625,
-0.02294921875,
0.027862548828125,
0.046783447265625,
0.058563232421875,
-0.03192138671875,
-0.026031494140625,
-0.059661865234375,
-0.0657958984375,
0.056640625,
0.0007271766662597656,
0.03131103515625,
-0.00998687744140625,
0.0304412841796875,
0.01486968994140625,
0.005397796630859375,
-0.07135009765625,
-0.023040771484375,
-0.043548583984375,
-0.032470703125,
-0.00910186767578125,
-0.032928466796875,
0.006771087646484375,
-0.0226593017578125,
0.0634765625,
0.012542724609375,
0.036163330078125,
0.002361297607421875,
-0.0208740234375,
-0.002727508544921875,
-0.00510406494140625,
0.03204345703125,
0.038970947265625,
-0.029754638671875,
-0.022857666015625,
0.00579071044921875,
-0.0704345703125,
-0.014373779296875,
0.00557708740234375,
-0.0229034423828125,
0.036102294921875,
0.039154052734375,
0.08599853515625,
0.0225830078125,
-0.04705810546875,
0.040252685546875,
-0.002899169921875,
-0.02679443359375,
-0.040802001953125,
0.0129547119140625,
0.00612640380859375,
-0.005245208740234375,
0.036041259765625,
-0.027984619140625,
-0.0013866424560546875,
-0.03131103515625,
-0.0030040740966796875,
0.025665283203125,
-0.004756927490234375,
-0.0187530517578125,
0.0362548828125,
0.011474609375,
-0.00893402099609375,
0.06683349609375,
0.00499725341796875,
-0.034881591796875,
0.039825439453125,
0.04461669921875,
0.05523681640625,
-0.002597808837890625,
-0.0005011558532714844,
0.04254150390625,
0.028961181640625,
0.0010499954223632812,
0.023590087890625,
-0.01351165771484375,
-0.0751953125,
-0.0284423828125,
-0.054656982421875,
-0.03668212890625,
0.0535888671875,
-0.054046630859375,
0.0191497802734375,
-0.0533447265625,
-0.018096923828125,
0.0083770751953125,
0.0174713134765625,
-0.0347900390625,
0.0297393798828125,
0.022735595703125,
0.0897216796875,
-0.051483154296875,
0.0909423828125,
0.0653076171875,
-0.04559326171875,
-0.0701904296875,
-0.001972198486328125,
-0.039276123046875,
-0.0869140625,
0.052642822265625,
0.0087127685546875,
0.0224151611328125,
0.00615692138671875,
-0.056884765625,
-0.0643310546875,
0.06500244140625,
0.0101470947265625,
-0.037200927734375,
-0.0289154052734375,
0.0018301010131835938,
0.0472412109375,
-0.00376129150390625,
0.0035114288330078125,
0.0229034423828125,
0.017608642578125,
-0.0025482177734375,
-0.07305908203125,
-0.034912109375,
-0.038177490234375,
0.0280914306640625,
0.005603790283203125,
-0.035491943359375,
0.07061767578125,
0.0109710693359375,
0.000507354736328125,
0.0198974609375,
0.041351318359375,
0.0264434814453125,
-0.01320648193359375,
0.040496826171875,
0.06890869140625,
0.042327880859375,
-0.0025234222412109375,
0.0732421875,
-0.037445068359375,
0.0243377685546875,
0.0665283203125,
0.0030727386474609375,
0.073486328125,
0.037017822265625,
-0.0092010498046875,
0.05120849609375,
0.058624267578125,
-0.022064208984375,
0.062225341796875,
-0.01219940185546875,
-0.0007305145263671875,
0.0086517333984375,
0.003940582275390625,
-0.038238525390625,
0.03167724609375,
0.039947509765625,
-0.039947509765625,
-0.01165771484375,
0.01067352294921875,
0.00399017333984375,
-0.0379638671875,
-0.041595458984375,
0.066162109375,
-0.00995635986328125,
-0.05120849609375,
0.03961181640625,
0.0182037353515625,
0.0653076171875,
-0.08209228515625,
0.0167388916015625,
-0.005947113037109375,
0.00799560546875,
0.01105499267578125,
-0.06640625,
0.00516510009765625,
0.01763916015625,
-0.017578125,
-0.010040283203125,
0.0543212890625,
-0.0192108154296875,
-0.03253173828125,
0.01197052001953125,
0.0136260986328125,
0.035003662109375,
0.02520751953125,
-0.059967041015625,
0.0077667236328125,
0.006130218505859375,
-0.0299530029296875,
0.0204010009765625,
0.023040771484375,
0.004894256591796875,
0.03131103515625,
0.048004150390625,
0.0196990966796875,
0.0150604248046875,
0.01580810546875,
0.05731201171875,
-0.034637451171875,
-0.058441162109375,
-0.052001953125,
0.024017333984375,
-0.00975799560546875,
-0.03741455078125,
0.042083740234375,
0.039031982421875,
0.08538818359375,
-0.032379150390625,
0.06549072265625,
-0.0252838134765625,
0.0443115234375,
-0.03228759765625,
0.0640869140625,
-0.052703857421875,
-0.0199127197265625,
-0.0150146484375,
-0.061431884765625,
0.0024890899658203125,
0.07733154296875,
0.00028061866760253906,
0.0036602020263671875,
0.0269622802734375,
0.0296478271484375,
-0.0003476142883300781,
-0.0034236907958984375,
0.01345062255859375,
0.022918701171875,
0.01027679443359375,
0.031280517578125,
0.036224365234375,
-0.034942626953125,
0.0321044921875,
-0.03466796875,
-0.00556182861328125,
-0.01251983642578125,
-0.0443115234375,
-0.06854248046875,
-0.044891357421875,
0.0013017654418945312,
-0.0079193115234375,
-0.0013189315795898438,
0.06976318359375,
0.052886962890625,
-0.053070068359375,
-0.0216522216796875,
-0.01458740234375,
-0.0272216796875,
0.01116180419921875,
-0.017364501953125,
0.0262603759765625,
-0.03851318359375,
-0.06646728515625,
0.0153350830078125,
-0.004962921142578125,
0.0140228271484375,
-0.01421356201171875,
-0.006107330322265625,
-0.019744873046875,
-0.00995635986328125,
0.036651611328125,
0.01470184326171875,
-0.04278564453125,
-0.0218963623046875,
-0.0099945068359375,
-0.021881103515625,
-0.00772857666015625,
0.02862548828125,
-0.032958984375,
0.03594970703125,
0.031524658203125,
0.049468994140625,
0.06134033203125,
-0.0192108154296875,
0.03143310546875,
-0.07867431640625,
0.02630615234375,
0.003509521484375,
0.044158935546875,
0.0138092041015625,
-0.0078277587890625,
0.03759765625,
0.0212249755859375,
-0.0143890380859375,
-0.05633544921875,
-0.0080718994140625,
-0.07305908203125,
-0.043914794921875,
0.05712890625,
-0.020294189453125,
-0.031768798828125,
0.00945281982421875,
-0.0175628662109375,
0.042572021484375,
-0.0110931396484375,
0.05828857421875,
0.0760498046875,
0.01329803466796875,
-0.01218414306640625,
-0.005847930908203125,
0.018035888671875,
0.01849365234375,
-0.037078857421875,
-0.033843994140625,
0.01372528076171875,
0.049041748046875,
0.042449951171875,
0.05523681640625,
-0.008270263671875,
0.0207977294921875,
0.0109405517578125,
0.036407470703125,
0.0004978179931640625,
-0.0197906494140625,
-0.010986328125,
-0.0005564689636230469,
-0.0037841796875,
-0.034027099609375
]
] |
camembert-base | 2023-05-30T14:36:19.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"camembert",
"fill-mask",
"fr",
"dataset:oscar",
"arxiv:1911.03894",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | camembert-base | 44 | 3,827,519 | transformers | 2022-03-02T23:29:04 | ---
language: fr
license: mit
datasets:
- oscar
---
# CamemBERT: a Tasty French Language Model
## Table of Contents
- [Model Details](#model-details)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Citation Information](#citation-information)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
## Model Details
- **Model Description:**
CamemBERT is a state-of-the-art language model for French based on the RoBERTa model.
It is now available on Hugging Face in 6 different versions with varying numbers of parameters, amounts of pretraining data, and pretraining data source domains.
- **Developed by:** Louis Martin\*, Benjamin Muller\*, Pedro Javier Ortiz Suรกrez\*, Yoann Dupont, Laurent Romary, รric Villemonte de la Clergerie, Djamรฉ Seddah and Benoรฎt Sagot.
- **Model Type:** Fill-Mask
- **Language(s):** French
- **License:** MIT
- **Parent Model:** See the [RoBERTa base model](https://huggingface.co/roberta-base) for more information about the RoBERTa base model.
- **Resources for more information:**
- [Research Paper](https://arxiv.org/abs/1911.03894)
- [Camembert Website](https://camembert-model.fr/)
## Uses
#### Direct Use
This model can be used for Fill-Mask tasks.
## Risks, Limitations and Biases
**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
This model was pretrained on a subcorpus of OSCAR multilingual corpus. Some of the limitations and risks associated with the OSCAR dataset, which are further detailed in the [OSCAR dataset card](https://huggingface.co/datasets/oscar), include the following:
> The quality of some OSCAR sub-corpora might be lower than expected, specifically for the lowest-resource languages.
> Constructed from Common Crawl, Personal and sensitive information might be present.
## Training
#### Training Data
OSCAR (Open Super-large Crawled Aggregated coRpus) is a multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the Ungoliant architecture.
#### Training Procedure
| Model | #params | Arch. | Training data |
|--------------------------------|--------------------------------|-------|-----------------------------------|
| `camembert-base` | 110M | Base | OSCAR (138 GB of text) |
| `camembert/camembert-large` | 335M | Large | CCNet (135 GB of text) |
| `camembert/camembert-base-ccnet` | 110M | Base | CCNet (135 GB of text) |
| `camembert/camembert-base-wikipedia-4gb` | 110M | Base | Wikipedia (4 GB of text) |
| `camembert/camembert-base-oscar-4gb` | 110M | Base | Subsample of OSCAR (4 GB of text) |
| `camembert/camembert-base-ccnet-4gb` | 110M | Base | Subsample of CCNet (4 GB of text) |
## Evaluation
The model developers evaluated CamemBERT using four different downstream tasks for French: part-of-speech (POS) tagging, dependency parsing, named entity recognition (NER) and natural language inference (NLI).
## Citation Information
```bibtex
@inproceedings{martin2020camembert,
title={CamemBERT: a Tasty French Language Model},
author={Martin, Louis and Muller, Benjamin and Su{\'a}rez, Pedro Javier Ortiz and Dupont, Yoann and Romary, Laurent and de la Clergerie, {\'E}ric Villemonte and Seddah, Djam{\'e} and Sagot, Beno{\^\i}t},
booktitle={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
year={2020}
}
```
## How to Get Started With the Model
##### Load CamemBERT and its sub-word tokenizer:
```python
from transformers import CamembertModel, CamembertTokenizer
# You can replace "camembert-base" with any other model from the table, e.g. "camembert/camembert-large".
tokenizer = CamembertTokenizer.from_pretrained("camembert-base")
camembert = CamembertModel.from_pretrained("camembert-base")
camembert.eval() # disable dropout (or leave in train mode to finetune)
```
##### Filling masks using pipeline
```python
from transformers import pipeline
camembert_fill_mask = pipeline("fill-mask", model="camembert-base", tokenizer="camembert-base")
results = camembert_fill_mask("Le camembert est <mask> :)")
# results
#[{'sequence': '<s> Le camembert est dรฉlicieux :)</s>', 'score': 0.4909103214740753, 'token': 7200},
# {'sequence': '<s> Le camembert est excellent :)</s>', 'score': 0.10556930303573608, 'token': 2183},
# {'sequence': '<s> Le camembert est succulent :)</s>', 'score': 0.03453315049409866, 'token': 26202},
# {'sequence': '<s> Le camembert est meilleur :)</s>', 'score': 0.03303130343556404, 'token': 528},
# {'sequence': '<s> Le camembert est parfait :)</s>', 'score': 0.030076518654823303, 'token': 1654}]
```
##### Extract contextual embedding features from Camembert output
```python
import torch
# Tokenize in sub-words with SentencePiece
tokenized_sentence = tokenizer.tokenize("J'aime le camembert !")
# ['โJ', "'", 'aime', 'โle', 'โca', 'member', 't', 'โ!']
# Convert tokens to ids and add the special start and end tokens
encoded_sentence = tokenizer.encode(tokenized_sentence)
# [5, 121, 11, 660, 16, 730, 25543, 110, 83, 6]
# NB: Can be done in one step: tokenizer.encode("J'aime le camembert !")
# Feed tokens to Camembert as a torch tensor (batch dim 1)
encoded_sentence = torch.tensor(encoded_sentence).unsqueeze(0)
outputs = camembert(encoded_sentence)
embeddings = outputs.last_hidden_state
# NB: on transformers < 4.0 the forward pass returns a tuple instead:
# embeddings, _ = camembert(encoded_sentence)
# embeddings.size() == torch.Size([1, 10, 768])
# tensor([[[-0.0254, 0.0235, 0.1027, ..., -0.1459, -0.0205, -0.0116],
# [ 0.0606, -0.1811, -0.0418, ..., -0.1815, 0.0880, -0.0766],
# [-0.1561, -0.1127, 0.2687, ..., -0.0648, 0.0249, 0.0446],
# ...,
```
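If a single sentence-level vector is needed, one common follow-up (not part of this card's recipe) is masked mean pooling over the token embeddings. The sketch below runs on a dummy tensor shaped like CamemBERT's last hidden state; `mean_pool` is a hypothetical helper written here for illustration, not a library function.

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()      # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(dim=1)   # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)         # avoid division by zero
    return summed / counts

# Dummy stand-in for the model's last hidden state (batch=1, seq=10, hidden=768)
hidden = torch.ones(1, 10, 768)
mask = torch.ones(1, 10, dtype=torch.long)
print(mean_pool(hidden, mask).shape)  # torch.Size([1, 768])
```

With real model output you would pass `outputs.last_hidden_state` and the tokenizer's `attention_mask`; the mask keeps padded positions from diluting the average when batching sentences of different lengths.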
##### Extract contextual embedding features from all Camembert layers
```python
from transformers import CamembertConfig
# (Need to reload the model with new config)
config = CamembertConfig.from_pretrained("camembert-base", output_hidden_states=True)
camembert = CamembertModel.from_pretrained("camembert-base", config=config)
outputs = camembert(encoded_sentence)
all_layer_embeddings = outputs.hidden_states
# NB: on transformers < 4.0: embeddings, _, all_layer_embeddings = camembert(encoded_sentence)
# all_layer_embeddings is a tuple of length 13 (input embedding layer + 12 self-attention layers)
all_layer_embeddings[5]
# layer 5 contextual embeddings: torch.Size([1, 10, 768])
#tensor([[[-0.0032, 0.0075, 0.0040, ..., -0.0025, -0.0178, -0.0210],
# [-0.0996, -0.1474, 0.1057, ..., -0.0278, 0.1690, -0.2982],
# [ 0.0557, -0.0588, 0.0547, ..., -0.0726, -0.0867, 0.0699],
# ...,
```
| 6,969 | [[ … embedding vector elided … ]] |
sentence-transformers/all-MiniLM-L6-v2 | 2022-11-07T08:44:33.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"rust",
"bert",
"feature-extraction",
"sentence-similarity",
"en",
"dataset:s2orc",
"dataset:flax-sentence-embeddings/stackexchange_xml",
"dataset:ms_marco",
"dataset:gooaq",
"dataset:yahoo_answers_topics",
"dataset:code_search_net",
"dataset:search_qa",
"dataset:eli5",
"dataset:snli",
"dataset:multi_nli",
"dataset:wikihow",
"dataset:natural_questions",
"dataset:trivia_qa",
"dataset:embedding-data/sentence-compression",
"dataset:embedding-data/flickr30k-captions",
"dataset:embedding-data/altlex",
"dataset:embedding-data/simple-wiki",
"dataset:embedding-data/QQP",
"dataset:embedding-data/SPECTER",
"dataset:embedding-data/PAQ_pairs",
"dataset:embedding-data/WikiAnswers",
"arxiv:1904.06472",
"arxiv:2102.07033",
"arxiv:2104.08727",
"arxiv:1704.05179",
"arxiv:1810.09305",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/all-MiniLM-L6-v2 | 1,078 | 3,768,075 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
language: en
license: apache-2.0
datasets:
- s2orc
- flax-sentence-embeddings/stackexchange_xml
- ms_marco
- gooaq
- yahoo_answers_topics
- code_search_net
- search_qa
- eli5
- snli
- multi_nli
- wikihow
- natural_questions
- trivia_qa
- embedding-data/sentence-compression
- embedding-data/flickr30k-captions
- embedding-data/altlex
- embedding-data/simple-wiki
- embedding-data/QQP
- embedding-data/SPECTER
- embedding-data/PAQ_pairs
- embedding-data/WikiAnswers
---
# all-MiniLM-L6-v2
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/all-MiniLM-L6-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
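The vectors returned by `encode` are typically compared with cosine similarity for semantic search or clustering. A minimal, dependency-free sketch of that comparison (the toy `vec_a`/`vec_b` values below are made-up stand-ins for real 384-dimensional model outputs):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional stand-ins for two sentence embeddings
vec_a = [0.1, 0.3, -0.2, 0.4]
vec_b = [0.1, 0.25, -0.1, 0.5]
print(round(cosine_similarity(vec_a, vec_b), 4))
```

Higher values mean closer meanings; identical directions score 1.0.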
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
import torch.nn.functional as F
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')
model = AutoModel.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
# Normalize embeddings
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:")
print(sentence_embeddings)
```
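The mean-pooling step above averages only over real tokens: padded positions are zeroed out by the attention mask before summing, and the sum is divided by the number of unmasked positions. The same arithmetic on toy numbers, as a dependency-free sketch (the token vectors are hypothetical, not real model outputs):

```python
def mean_pool(token_embeddings, attention_mask):
    """Average token vectors, counting only positions where the mask is 1."""
    dim = len(token_embeddings[0])
    summed = [0.0] * dim
    count = 0
    for vec, mask in zip(token_embeddings, attention_mask):
        if mask:
            count += 1
            for i, v in enumerate(vec):
                summed[i] += v
    count = max(count, 1)  # plays the role of the clamp(min=1e-9) guard
    return [s / count for s in summed]

# Two real tokens plus one padding position (mask 0)
tokens = [[1.0, 2.0, 3.0], [3.0, 4.0, 5.0], [9.0, 9.0, 9.0]]
mask = [1, 1, 0]
print(mean_pool(tokens, mask))  # the padding vector is ignored
```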
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/all-MiniLM-L6-v2)
------
## Background
The project aims to train sentence embedding models on very large, sentence-level datasets using a self-supervised
contrastive learning objective. We used the pretrained [`nreimers/MiniLM-L6-H384-uncased`](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased) model and fine-tuned it on a
dataset of 1B sentence pairs. We use a contrastive learning objective: given a sentence from the pair, the model should predict which of a set of randomly sampled other sentences was actually paired with it in our dataset.
We developed this model during the
[Community week using JAX/Flax for NLP & CV](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104),
organized by Hugging Face, as part of the project:
[Train the Best Sentence Embedding Model Ever with 1B Training Pairs](https://discuss.huggingface.co/t/train-the-best-sentence-embedding-model-ever-with-1b-training-pairs/7354). We benefited from efficient hardware infrastructure to run the project: 7 TPU v3-8s, as well as guidance from Google's Flax, JAX, and Cloud team members on efficient deep learning frameworks.
## Intended uses
Our model is intended to be used as a sentence and short paragraph encoder. Given an input text, it outputs a vector which captures
the semantic information. The sentence vector may be used for information retrieval, clustering or sentence similarity tasks.
By default, input text longer than 256 word pieces is truncated.
## Training procedure
### Pre-training
We use the pretrained [`nreimers/MiniLM-L6-H384-uncased`](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased) model. Please refer to the model card for more detailed information about the pre-training procedure.
### Fine-tuning
We fine-tune the model using a contrastive objective. Formally, we compute the cosine similarity between each possible sentence pair in the batch.
We then apply the cross-entropy loss by comparing with the true pairs.
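That objective (a cross-entropy over in-batch cosine similarities, where each anchor's target is its own positive) can be sketched in plain Python. The toy 2-dimensional vectors and the `scale` value below are illustrative assumptions, not the model's real embeddings or training constants:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def in_batch_contrastive_loss(anchors, positives, scale=20.0):
    """Cross-entropy where row i's correct class is its own positive, column i."""
    losses = []
    for i, a in enumerate(anchors):
        logits = [scale * cosine(a, p) for p in positives]
        log_z = math.log(sum(math.exp(l) for l in logits))
        losses.append(log_z - logits[i])  # -log softmax of the true pair
    return sum(losses) / len(losses)

anchors = [[1.0, 0.0], [0.0, 1.0]]
positives = [[0.9, 0.1], [0.1, 0.9]]
loss = in_batch_contrastive_loss(anchors, positives)
print(loss)  # small, since each anchor matches its own positive best
```

All other sentences in the batch act as negatives, which is why large batch sizes help this objective.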
#### Hyperparameters
We trained our model on a TPU v3-8. We trained the model for 100k steps using a batch size of 1024 (128 per TPU core).
We used a learning-rate warm-up over the first 500 steps. The sequence length was limited to 128 tokens. We used the AdamW optimizer with
a 2e-5 learning rate. The full training script is accessible in this current repository: `train_script.py`.
#### Training data
We use the concatenation of multiple datasets to fine-tune our model. The total number of sentence pairs is above 1 billion.
We sampled each dataset with a weighted probability; the configuration is detailed in the `data_config.json` file.
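Weighted sampling of this kind can be pictured with `random.choices`; the dataset names and weights below are hypothetical stand-ins, not the real `data_config.json` entries:

```python
import random

DATASETS = ["reddit", "s2orc", "paq"]   # hypothetical dataset names
WEIGHTS = [0.6, 0.3, 0.1]               # hypothetical sampling probabilities

def sample_batch_sources(n, seed=0):
    """Pick which dataset each of n training batches is drawn from."""
    rng = random.Random(seed)
    return rng.choices(DATASETS, weights=WEIGHTS, k=n)

print(sample_batch_sources(10))
```

Datasets with larger weights contribute proportionally more batches over a training run.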
| Dataset | Paper | Number of training tuples |
|--------------------------------------------------------|:----------------------------------------:|:--------------------------:|
| [Reddit comments (2015-2018)](https://github.com/PolyAI-LDN/conversational-datasets/tree/master/reddit) | [paper](https://arxiv.org/abs/1904.06472) | 726,484,430 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Abstracts) | [paper](https://aclanthology.org/2020.acl-main.447/) | 116,288,806 |
| [WikiAnswers](https://github.com/afader/oqa#wikianswers-corpus) Duplicate question pairs | [paper](https://doi.org/10.1145/2623330.2623677) | 77,427,422 |
| [PAQ](https://github.com/facebookresearch/PAQ) (Question, Answer) pairs | [paper](https://arxiv.org/abs/2102.07033) | 64,371,441 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Titles) | [paper](https://aclanthology.org/2020.acl-main.447/) | 52,603,982 |
| [S2ORC](https://github.com/allenai/s2orc) (Title, Abstract) | [paper](https://aclanthology.org/2020.acl-main.447/) | 41,769,185 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Body) pairs | - | 25,316,456 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title+Body, Answer) pairs | - | 21,396,559 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Answer) pairs | - | 21,396,559 |
| [MS MARCO](https://microsoft.github.io/msmarco/) triplets | [paper](https://doi.org/10.1145/3404835.3462804) | 9,144,553 |
| [GOOAQ: Open Question Answering with Diverse Answer Types](https://github.com/allenai/gooaq) | [paper](https://arxiv.org/pdf/2104.08727.pdf) | 3,012,496 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 1,198,260 |
| [Code Search](https://huggingface.co/datasets/code_search_net) | - | 1,151,414 |
| [COCO](https://cocodataset.org/#home) Image captions | [paper](https://link.springer.com/chapter/10.1007%2F978-3-319-10602-1_48) | 828,395|
| [SPECTER](https://github.com/allenai/specter) citation triplets | [paper](https://doi.org/10.18653/v1/2020.acl-main.207) | 684,100 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Question, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 681,164 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Question) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 659,896 |
| [SearchQA](https://huggingface.co/datasets/search_qa) | [paper](https://arxiv.org/abs/1704.05179) | 582,261 |
| [Eli5](https://huggingface.co/datasets/eli5) | [paper](https://doi.org/10.18653/v1/p19-1346) | 325,475 |
| [Flickr 30k](https://shannon.cs.illinois.edu/DenotationGraph/) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/229/33) | 317,695 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles) | | 304,525 |
| AllNLI ([SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/)) | [paper SNLI](https://doi.org/10.18653/v1/d15-1075), [paper MultiNLI](https://doi.org/10.18653/v1/n18-1101) | 277,230 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (bodies) | | 250,519 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles+bodies) | | 250,460 |
| [Sentence Compression](https://github.com/google-research-datasets/sentence-compression) | [paper](https://www.aclweb.org/anthology/D13-1155/) | 180,000 |
| [Wikihow](https://github.com/pvl/wikihow_pairs_dataset) | [paper](https://arxiv.org/abs/1810.09305) | 128,542 |
| [Altlex](https://github.com/chridey/altlex/) | [paper](https://aclanthology.org/P16-1135.pdf) | 112,696 |
| [Quora Question Triplets](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs) | - | 103,663 |
| [Simple Wikipedia](https://cs.pomona.edu/~dkauchak/simplification/) | [paper](https://www.aclweb.org/anthology/P11-2117/) | 102,225 |
| [Natural Questions (NQ)](https://ai.google.com/research/NaturalQuestions) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/1455) | 100,231 |
| [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/) | [paper](https://aclanthology.org/P18-2124.pdf) | 87,599 |
| [TriviaQA](https://huggingface.co/datasets/trivia_qa) | - | 73,346 |
| **Total** | | **1,170,060,424** | | 10,610 | [[ … embedding vector elided … ]] |
-0.01558685302734375,
-0.0452880859375,
-0.0195770263671875,
-0.026031494140625,
-0.006473541259765625,
-0.0377197265625,
0.057586669921875,
-0.01123809814453125,
0.049652099609375,
0.0303802490234375,
-0.031463623046875,
0.02264404296875,
0.0059661865234375,
0.04180908203125,
0.02227783203125,
-0.019378662109375,
0.01015472412109375,
0.018646240234375,
-0.02264404296875,
-0.01502227783203125,
0.027801513671875,
-0.0155029296875,
-0.0026264190673828125,
0.0377197265625,
0.0672607421875,
0.01523590087890625,
-0.043853759765625,
0.058746337890625,
-0.018646240234375,
-0.020050048828125,
-0.0330810546875,
-0.00698089599609375,
0.015625,
0.01122283935546875,
0.01454925537109375,
-0.002025604248046875,
0.0019664764404296875,
-0.03851318359375,
0.0210113525390625,
0.018096923828125,
-0.0293731689453125,
-0.00930023193359375,
0.039459228515625,
0.0031642913818359375,
-0.00385284423828125,
0.05780029296875,
-0.0169830322265625,
-0.03509521484375,
0.03912353515625,
0.03173828125,
0.0555419921875,
0.01305389404296875,
0.00971221923828125,
0.05194091796875,
0.0261688232421875,
0.01316070556640625,
0.00843048095703125,
0.01004791259765625,
-0.05419921875,
0.002368927001953125,
-0.054412841796875,
0.0021152496337890625,
0.008270263671875,
-0.045684814453125,
0.017303466796875,
-0.023040771484375,
0.002445220947265625,
0.004772186279296875,
0.0220489501953125,
-0.06414794921875,
0.001155853271484375,
0.0031890869140625,
0.066162109375,
-0.07025146484375,
0.0621337890625,
0.045501708984375,
-0.050445556640625,
-0.051727294921875,
-0.000009834766387939453,
-0.005382537841796875,
-0.0654296875,
0.0254974365234375,
0.02825927734375,
0.00798797607421875,
0.006801605224609375,
-0.047088623046875,
-0.069580078125,
0.09820556640625,
0.0214080810546875,
-0.033843994140625,
-0.01015472412109375,
0.0109100341796875,
0.05145263671875,
-0.040618896484375,
0.037841796875,
0.043060302734375,
0.0254669189453125,
-0.00324249267578125,
-0.05413818359375,
0.014434814453125,
-0.040618896484375,
0.0132904052734375,
-0.016571044921875,
-0.06634521484375,
0.0556640625,
-0.00623321533203125,
-0.01007843017578125,
0.00807952880859375,
0.058807373046875,
0.0292510986328125,
0.0173797607421875,
0.038726806640625,
0.072265625,
0.054718017578125,
-0.00775146484375,
0.08538818359375,
-0.0181427001953125,
0.045684814453125,
0.0853271484375,
0.0145111083984375,
0.07501220703125,
0.035888671875,
-0.01245880126953125,
0.0615234375,
0.061370849609375,
-0.007110595703125,
0.04071044921875,
0.00782012939453125,
0.0044708251953125,
-0.00638580322265625,
-0.01064300537109375,
-0.032958984375,
0.034423828125,
0.0210113525390625,
-0.03558349609375,
0.0074920654296875,
0.01007080078125,
0.024139404296875,
0.004344940185546875,
0.0066986083984375,
0.06036376953125,
0.0160369873046875,
-0.04290771484375,
0.0494384765625,
-0.006275177001953125,
0.07177734375,
-0.036163330078125,
0.0237884521484375,
-0.026092529296875,
0.01529693603515625,
-0.0245819091796875,
-0.049835205078125,
0.029144287109375,
-0.0008754730224609375,
-0.01092529296875,
-0.018524169921875,
0.03717041015625,
-0.046295166015625,
-0.050201416015625,
0.0295257568359375,
0.0308837890625,
0.00897216796875,
0.0149383544921875,
-0.08135986328125,
0.002956390380859375,
0.0089569091796875,
-0.03302001953125,
0.0172882080078125,
0.0140533447265625,
0.0228729248046875,
0.034759521484375,
0.04669189453125,
-0.016876220703125,
0.0088348388671875,
-0.005466461181640625,
0.0675048828125,
-0.050994873046875,
-0.040802001953125,
-0.060516357421875,
0.045196533203125,
-0.0258941650390625,
-0.03369140625,
0.062347412109375,
0.061859130859375,
0.0743408203125,
0.0030345916748046875,
0.05169677734375,
-0.030181884765625,
0.0394287109375,
-0.03558349609375,
0.044097900390625,
-0.0555419921875,
0.007068634033203125,
-0.0181427001953125,
-0.0499267578125,
-0.0217132568359375,
0.05584716796875,
-0.0330810546875,
0.00739288330078125,
0.069580078125,
0.072021484375,
0.00006723403930664062,
-0.002902984619140625,
-0.00003349781036376953,
0.0294647216796875,
0.0166778564453125,
0.060760498046875,
0.03271484375,
-0.07135009765625,
0.05908203125,
-0.03363037109375,
-0.0096893310546875,
-0.02679443359375,
-0.048553466796875,
-0.067626953125,
-0.057861328125,
-0.034942626953125,
-0.0377197265625,
0.0019521713256835938,
0.08050537109375,
0.051971435546875,
-0.06304931640625,
-0.0136871337890625,
-0.01020050048828125,
-0.0014944076538085938,
-0.00511932373046875,
-0.0213470458984375,
0.05340576171875,
-0.01311492919921875,
-0.049957275390625,
0.01202392578125,
-0.00470733642578125,
-0.0037975311279296875,
-0.0013017654418945312,
-0.00566864013671875,
-0.054412841796875,
0.0019159317016601562,
0.043304443359375,
0.01103973388671875,
-0.050445556640625,
-0.0225067138671875,
0.0038890838623046875,
-0.0293731689453125,
0.01055908203125,
0.0355224609375,
-0.036224365234375,
0.027252197265625,
0.044647216796875,
0.043212890625,
0.06915283203125,
-0.0084381103515625,
0.0192108154296875,
-0.058197021484375,
0.0198211669921875,
0.0186004638671875,
0.03216552734375,
0.035552978515625,
-0.0303192138671875,
0.0538330078125,
0.0311737060546875,
-0.042449951171875,
-0.0537109375,
-0.00801849365234375,
-0.091064453125,
-0.01114654541015625,
0.0985107421875,
-0.023712158203125,
-0.015106201171875,
0.0061187744140625,
-0.0113983154296875,
0.02288818359375,
-0.0284881591796875,
0.04473876953125,
0.048309326171875,
-0.02117919921875,
-0.0298614501953125,
-0.0303497314453125,
0.03509521484375,
0.039642333984375,
-0.0697021484375,
-0.017120361328125,
0.0193023681640625,
0.0276031494140625,
0.01739501953125,
0.05560302734375,
-0.00377655029296875,
-0.0031871795654296875,
-0.001476287841796875,
-0.00800323486328125,
-0.0032958984375,
0.0033054351806640625,
-0.025848388671875,
0.016326904296875,
-0.0258331298828125,
-0.0159454345703125
]
] |
facebook/mbart-large-50 | 2023-03-28T08:28:50.000Z | [
"transformers",
"pytorch",
"tf",
"mbart",
"text2text-generation",
"mbart-50",
"multilingual",
"ar",
"cs",
"de",
"en",
"es",
"et",
"fi",
"fr",
"gu",
"hi",
"it",
"ja",
"kk",
"ko",
"lt",
"lv",
"my",
"ne",
"nl",
"ro",
"ru",
"si",
"tr",
"vi",
"zh",
"af",
"az",
"bn",
"fa",
"he",
"hr",
"id",
"ka",
"km",
"mk",
"ml",
"mn",
"mr",
"pl",
"ps",
"pt",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"uk",
"ur",
"xh",
"gl",
"sl",
"arxiv:2008.00401",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | text2text-generation | facebook | null | null | facebook/mbart-large-50 | 90 | 3,361,559 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- ar
- cs
- de
- en
- es
- et
- fi
- fr
- gu
- hi
- it
- ja
- kk
- ko
- lt
- lv
- my
- ne
- nl
- ro
- ru
- si
- tr
- vi
- zh
- af
- az
- bn
- fa
- he
- hr
- id
- ka
- km
- mk
- ml
- mn
- mr
- pl
- ps
- pt
- sv
- sw
- ta
- te
- th
- tl
- uk
- ur
- xh
- gl
- sl
license: mit
tags:
- mbart-50
---
# mBART-50
mBART-50 is a multilingual sequence-to-sequence model pre-trained with the "Multilingual Denoising Pretraining" objective. It was introduced in the paper [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401).
## Model description
mBART-50 is a multilingual sequence-to-sequence model. It was introduced to show that multilingual translation models can be created through multilingual fine-tuning.
Instead of fine-tuning on one direction, a pre-trained model is fine-tuned on many directions simultaneously. mBART-50 was created by extending the original mBART model with 25 additional languages, supporting multilingual machine translation across 50 languages. The pre-training objective is explained below.
**Multilingual Denoising Pretraining**: The model incorporates N languages by concatenating data:
`D = {D1, ..., DN}`, where each `Di` is a collection of monolingual documents in language `i`. The source documents are noised using two schemes:
first, randomly shuffling the order of the original sentences, and second, a novel in-filling scheme
in which spans of text are replaced with a single mask token. The model is then tasked with reconstructing the original text.
35% of each instance's words are masked by randomly sampling span lengths from a Poisson distribution `(λ = 3.5)`.
The decoder input is the original text offset by one position. A language id symbol `LID` is used as the initial token from which to predict the sentence.
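The noising procedure described above can be sketched in plain Python. This is a schematic illustration only — the actual implementation operates on tokenized subword text; the word-level granularity, the `noise` helper, and the masking heuristic here are simplifying assumptions:

```python
import math
import random

MASK = "<mask>"

def poisson(rng, lam):
    # Knuth's algorithm for sampling from a Poisson(lam) distribution
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= L:
            return k - 1

def noise(sentences, mask_ratio=0.35, lam=3.5, seed=0):
    """Schematic mBART-style noising: shuffle sentence order, then
    replace spans of words with a single mask token, with span lengths
    drawn from a Poisson(lam) distribution."""
    rng = random.Random(seed)
    sentences = sentences[:]
    rng.shuffle(sentences)                    # 1) permute sentence order
    words = " ".join(sentences).split()
    n_to_mask = int(mask_ratio * len(words))  # 2) target ~35% of the words
    out, masked, i = [], 0, 0
    while i < len(words):
        if masked < n_to_mask and rng.random() < mask_ratio:
            # a whole Poisson-length span is replaced by ONE mask token
            span = max(1, min(poisson(rng, lam), len(words) - i))
            out.append(MASK)
            masked += span
            i += span
        else:
            out.append(words[i])
            i += 1
    return " ".join(out)
```

The model then learns to reconstruct the original (unshuffled, unmasked) text from this noised input.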
## Intended uses & limitations
`mbart-large-50` is a pre-trained model primarily intended to be fine-tuned on translation tasks. It can also be fine-tuned on other multilingual sequence-to-sequence tasks.
See the [model hub](https://huggingface.co/models?filter=mbart-50) to look for fine-tuned versions.
## Training
As the model is multilingual, it expects sequences in a specific format: a special language id token is used as a prefix in both the source and target text. The text format is `[lang_code] X [eos]`, where `X` is the source or target text and `lang_code` is `source_lang_code` for source text and `tgt_lang_code` for target text. `bos` is never used. Once the examples are prepared in this format, the model can be trained like any other sequence-to-sequence model.
```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50", src_lang="en_XX", tgt_lang="ro_RO")
src_text = " UN Chief Says There Is No Military Solution in Syria"
tgt_text = "ลeful ONU declarฤ cฤ nu existฤ o soluลฃie militarฤ รฎn Siria"
model_inputs = tokenizer(src_text, return_tensors="pt")
with tokenizer.as_target_tokenizer():
labels = tokenizer(tgt_text, return_tensors="pt").input_ids
model(**model_inputs, labels=labels) # forward pass
```
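To make the `[lang_code] X [eos]` layout concrete, here is a minimal, model-free sketch. The token strings below are illustrative only — the real tokenizer emits subword ids, and `format_example` is a hypothetical helper, not part of the library:

```python
def format_example(src_text, tgt_text, src_lang, tgt_lang, eos="</s>"):
    """Sketch of the mBART-50 sequence layout: a language id token
    prefixes both source and target; bos is never used."""
    # encoder input:  [source_lang_code] X [eos]
    encoder_input = [src_lang] + src_text.split() + [eos]
    # decoder target: [tgt_lang_code] Y [eos]
    decoder_target = [tgt_lang] + tgt_text.split() + [eos]
    return encoder_input, decoder_target

enc, dec = format_example("Hello world", "Salut lume", "en_XX", "ro_RO")
print(enc)  # ['en_XX', 'Hello', 'world', '</s>']
print(dec)  # ['ro_RO', 'Salut', 'lume', '</s>']
```

Passing `src_lang` and `tgt_lang` to `MBart50TokenizerFast`, as in the snippet above, makes the tokenizer apply this layout automatically.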
## Languages covered
Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)
## BibTeX entry and citation info
```
@article{tang2020multilingual,
title={Multilingual Translation with Extensible Multilingual Pretraining and Finetuning},
author={Yuqing Tang and Chau Tran and Xian Li and Peng-Jen Chen and Naman Goyal and Vishrav Chaudhary and Jiatao Gu and Angela Fan},
year={2020},
eprint={2008.00401},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 4,595 | [
[
-0.0345458984375,
-0.037322998046875,
0.004802703857421875,
0.0258026123046875,
-0.0286712646484375,
0.0096588134765625,
-0.021392822265625,
-0.021331787109375,
0.0192413330078125,
0.01446533203125,
-0.042755126953125,
-0.04608154296875,
-0.049560546875,
0.0227813720703125,
-0.01276397705078125,
0.07861328125,
-0.00823974609375,
0.0266876220703125,
0.0229949951171875,
-0.0094451904296875,
-0.0255126953125,
-0.0384521484375,
-0.034881591796875,
-0.005474090576171875,
0.0225982666015625,
0.034820556640625,
0.03070068359375,
0.031890869140625,
0.043792724609375,
0.025543212890625,
-0.0169677734375,
0.008697509765625,
-0.0229034423828125,
-0.0236663818359375,
-0.00037598609924316406,
-0.0273284912109375,
-0.048583984375,
-0.0094451904296875,
0.055999755859375,
0.041534423828125,
0.0001385211944580078,
0.032867431640625,
0.0007033348083496094,
0.051361083984375,
-0.0303955078125,
0.0029449462890625,
-0.02862548828125,
0.00836181640625,
-0.02606201171875,
-0.0012865066528320312,
-0.033782958984375,
-0.02386474609375,
-0.005268096923828125,
-0.0274658203125,
0.00850677490234375,
-0.004302978515625,
0.0894775390625,
-0.00981903076171875,
-0.040313720703125,
-0.007129669189453125,
-0.04608154296875,
0.05322265625,
-0.06787109375,
0.03472900390625,
0.039215087890625,
0.0159149169921875,
-0.0036067962646484375,
-0.05499267578125,
-0.049713134765625,
-0.00586700439453125,
-0.01282501220703125,
0.025054931640625,
-0.006122589111328125,
-0.01220703125,
0.0171661376953125,
0.031341552734375,
-0.062042236328125,
-0.01519775390625,
-0.049041748046875,
-0.00004571676254272461,
0.04248046875,
-0.0024166107177734375,
0.02557373046875,
-0.0318603515625,
-0.0308380126953125,
-0.004955291748046875,
-0.036834716796875,
0.0269012451171875,
0.0175933837890625,
0.0266571044921875,
-0.0299530029296875,
0.05291748046875,
-0.019287109375,
0.05859375,
0.008453369140625,
-0.0165557861328125,
0.049102783203125,
-0.04498291015625,
-0.01873779296875,
-0.0003402233123779297,
0.08319091796875,
0.01265716552734375,
0.023101806640625,
0.004161834716796875,
-0.0023326873779296875,
-0.01313018798828125,
0.0004336833953857422,
-0.061187744140625,
0.00008732080459594727,
0.0139312744140625,
-0.032623291015625,
0.002132415771484375,
0.0214080810546875,
-0.053314208984375,
0.018341064453125,
-0.006427764892578125,
0.033782958984375,
-0.055389404296875,
-0.016937255859375,
0.005283355712890625,
-0.0007686614990234375,
0.0197296142578125,
0.00189971923828125,
-0.06353759765625,
-0.002777099609375,
0.02593994140625,
0.06500244140625,
0.006626129150390625,
-0.045074462890625,
-0.014923095703125,
0.0176849365234375,
-0.026153564453125,
0.042449951171875,
-0.028472900390625,
-0.036712646484375,
-0.0153961181640625,
0.029052734375,
-0.0148773193359375,
-0.024749755859375,
0.04534912109375,
-0.0193023681640625,
0.0244140625,
-0.0270843505859375,
-0.0229034423828125,
-0.0249176025390625,
0.024993896484375,
-0.04998779296875,
0.09686279296875,
0.0164337158203125,
-0.057037353515625,
0.0251922607421875,
-0.04400634765625,
-0.043792724609375,
-0.00041413307189941406,
-0.00006079673767089844,
-0.04205322265625,
-0.00974273681640625,
0.030242919921875,
0.03680419921875,
-0.020050048828125,
0.0284576416015625,
-0.002681732177734375,
-0.027679443359375,
0.00661468505859375,
-0.024383544921875,
0.079833984375,
0.0299072265625,
-0.03839111328125,
0.006748199462890625,
-0.0648193359375,
0.0025539398193359375,
0.0177154541015625,
-0.041717529296875,
-0.0107574462890625,
-0.0230255126953125,
-0.003376007080078125,
0.0386962890625,
0.01238250732421875,
-0.041290283203125,
0.01097869873046875,
-0.032562255859375,
0.0257568359375,
0.03955078125,
-0.005733489990234375,
0.024810791015625,
-0.022857666015625,
0.049713134765625,
0.019287109375,
0.01097869873046875,
-0.026885986328125,
-0.04681396484375,
-0.061370849609375,
-0.032623291015625,
0.019012451171875,
0.060028076171875,
-0.05438232421875,
0.03167724609375,
-0.040802001953125,
-0.04571533203125,
-0.054290771484375,
0.0109405517578125,
0.04290771484375,
0.0308837890625,
0.031951904296875,
-0.0262908935546875,
-0.047943115234375,
-0.05682373046875,
-0.0143585205078125,
-0.0120086669921875,
0.01412200927734375,
0.0167999267578125,
0.050537109375,
-0.020965576171875,
0.057525634765625,
-0.0225830078125,
-0.0178070068359375,
-0.0283966064453125,
0.005718231201171875,
0.026947021484375,
0.048858642578125,
0.04205322265625,
-0.07305908203125,
-0.059600830078125,
0.01641845703125,
-0.042816162109375,
0.01267242431640625,
0.002742767333984375,
-0.0222625732421875,
0.028900146484375,
0.03448486328125,
-0.0396728515625,
0.02557373046875,
0.050048828125,
-0.02734375,
0.045654296875,
-0.01247406005859375,
0.028533935546875,
-0.1177978515625,
0.027496337890625,
-0.01531982421875,
-0.0035266876220703125,
-0.052886962890625,
-0.00966644287109375,
0.020660400390625,
-0.01129150390625,
-0.04901123046875,
0.04766845703125,
-0.040771484375,
0.01922607421875,
0.01166534423828125,
-0.0013017654418945312,
0.0035915374755859375,
0.040069580078125,
0.007671356201171875,
0.053802490234375,
0.0266265869140625,
-0.0384521484375,
0.0182647705078125,
0.0225982666015625,
-0.033294677734375,
0.0413818359375,
-0.038177490234375,
-0.01363372802734375,
-0.01306915283203125,
0.0177154541015625,
-0.082763671875,
-0.010528564453125,
0.026031494140625,
-0.058746337890625,
0.025909423828125,
-0.00836944580078125,
-0.03009033203125,
-0.051727294921875,
-0.01319122314453125,
0.038330078125,
0.023651123046875,
-0.0182037353515625,
0.03729248046875,
0.003711700439453125,
-0.0186614990234375,
-0.0643310546875,
-0.07952880859375,
0.005092620849609375,
-0.01061248779296875,
-0.042816162109375,
0.024810791015625,
-0.008941650390625,
0.0111083984375,
0.00946044921875,
0.007080078125,
-0.007251739501953125,
0.0036029815673828125,
0.01395416259765625,
0.026123046875,
-0.0260009765625,
0.00800323486328125,
-0.002613067626953125,
0.0011301040649414062,
-0.0187835693359375,
-0.018890380859375,
0.053314208984375,
-0.01389312744140625,
-0.017608642578125,
-0.0306854248046875,
0.032623291015625,
0.0352783203125,
-0.05792236328125,
0.08056640625,
0.07501220703125,
-0.025360107421875,
0.0139007568359375,
-0.031890869140625,
0.0021457672119140625,
-0.033599853515625,
0.048583984375,
-0.056060791015625,
-0.06304931640625,
0.044281005859375,
0.001979827880859375,
0.0016345977783203125,
0.04864501953125,
0.06378173828125,
0.0078887939453125,
0.0596923828125,
0.04443359375,
-0.006195068359375,
0.041259765625,
-0.039337158203125,
0.01107025146484375,
-0.05059814453125,
-0.029449462890625,
-0.0390625,
-0.00724029541015625,
-0.053558349609375,
-0.037384033203125,
0.00940704345703125,
0.01302337646484375,
-0.035400390625,
0.035552978515625,
-0.0230255126953125,
0.0174713134765625,
0.051422119140625,
0.0037975311279296875,
0.01123046875,
0.005863189697265625,
-0.034332275390625,
-0.0007181167602539062,
-0.055816650390625,
-0.035919189453125,
0.08209228515625,
0.010162353515625,
0.029052734375,
0.02423095703125,
0.05438232421875,
-0.0054473876953125,
0.021484375,
-0.041229248046875,
0.037322998046875,
-0.0244903564453125,
-0.07257080078125,
-0.01971435546875,
-0.040557861328125,
-0.0750732421875,
0.015960693359375,
-0.01482391357421875,
-0.045928955078125,
0.01934814453125,
-0.0021152496337890625,
-0.0297393798828125,
0.0175628662109375,
-0.058135986328125,
0.0777587890625,
-0.0218505859375,
-0.005962371826171875,
0.007564544677734375,
-0.06353759765625,
0.039154052734375,
-0.0184173583984375,
0.03143310546875,
-0.01202392578125,
0.0118408203125,
0.05841064453125,
-0.02569580078125,
0.046966552734375,
0.0045013427734375,
0.00072479248046875,
0.0166168212890625,
-0.008819580078125,
0.0267333984375,
-0.006229400634765625,
-0.00646209716796875,
0.018585205078125,
0.005344390869140625,
-0.042510986328125,
-0.01398468017578125,
0.032073974609375,
-0.0654296875,
-0.0350341796875,
-0.038360595703125,
-0.03717041015625,
0.002742767333984375,
0.042755126953125,
0.044830322265625,
0.01534271240234375,
-0.0180816650390625,
0.01313018798828125,
0.02764892578125,
-0.033599853515625,
0.038818359375,
0.042144775390625,
-0.01678466796875,
-0.054168701171875,
0.0677490234375,
0.01776123046875,
0.033538818359375,
0.043243408203125,
0.004970550537109375,
-0.006656646728515625,
-0.0151519775390625,
-0.0418701171875,
0.037139892578125,
-0.04412841796875,
-0.0098114013671875,
-0.055572509765625,
-0.01442718505859375,
-0.06341552734375,
-0.00815582275390625,
-0.033721923828125,
-0.0274658203125,
-0.0175933837890625,
-0.0018873214721679688,
0.01873779296875,
0.032684326171875,
-0.007755279541015625,
0.02703857421875,
-0.0682373046875,
0.036529541015625,
-0.0008029937744140625,
0.01116943359375,
-0.006946563720703125,
-0.058502197265625,
-0.04644775390625,
0.0160369873046875,
-0.0200347900390625,
-0.071533203125,
0.045318603515625,
0.0248870849609375,
0.038055419921875,
0.03448486328125,
-0.0010328292846679688,
0.065673828125,
-0.051605224609375,
0.0633544921875,
0.01824951171875,
-0.07708740234375,
0.04461669921875,
-0.0162811279296875,
0.044647216796875,
0.0418701171875,
0.053680419921875,
-0.061798095703125,
-0.0302886962890625,
-0.030303955078125,
-0.07928466796875,
0.0628662109375,
0.006488800048828125,
0.011810302734375,
-0.00638580322265625,
0.01052093505859375,
-0.005908966064453125,
0.0138702392578125,
-0.0775146484375,
-0.041168212890625,
-0.025177001953125,
-0.0262908935546875,
-0.038970947265625,
-0.0272979736328125,
-0.0037975311279296875,
-0.03692626953125,
0.0609130859375,
0.012481689453125,
0.030181884765625,
0.0164337158203125,
-0.01218414306640625,
-0.00482177734375,
0.024993896484375,
0.0640869140625,
0.042022705078125,
-0.02081298828125,
0.004352569580078125,
0.0147552490234375,
-0.05792236328125,
0.02301025390625,
0.0219879150390625,
-0.00994873046875,
0.01544952392578125,
0.0281829833984375,
0.0706787109375,
0.00400543212890625,
-0.03759765625,
0.033294677734375,
0.0010471343994140625,
-0.0162353515625,
-0.0265045166015625,
-0.0242156982421875,
0.017059326171875,
0.0216827392578125,
0.03277587890625,
-0.0087127685546875,
-0.00926971435546875,
-0.04595947265625,
0.015869140625,
0.0262451171875,
-0.01910400390625,
-0.0272216796875,
0.0511474609375,
-0.00009059906005859375,
-0.0194549560546875,
0.0360107421875,
-0.02105712890625,
-0.054229736328125,
0.03973388671875,
0.04205322265625,
0.05419921875,
-0.049896240234375,
0.0222015380859375,
0.05364990234375,
0.043426513671875,
-0.01029205322265625,
0.031890869140625,
0.008544921875,
-0.044097900390625,
-0.0303955078125,
-0.05950927734375,
0.0006470680236816406,
0.0027637481689453125,
-0.05389404296875,
0.0263214111328125,
-0.0046844482421875,
-0.0296173095703125,
-0.01177215576171875,
0.0069427490234375,
-0.0484619140625,
0.0242919921875,
0.002254486083984375,
0.062347412109375,
-0.07012939453125,
0.08447265625,
0.06524658203125,
-0.04510498046875,
-0.0714111328125,
-0.009613037109375,
-0.01273345947265625,
-0.048583984375,
0.06671142578125,
0.01116943359375,
0.0138702392578125,
0.01419830322265625,
-0.0158843994140625,
-0.0726318359375,
0.08331298828125,
0.03582763671875,
-0.032623291015625,
0.0004119873046875,
0.033935546875,
0.029449462890625,
-0.00933074951171875,
0.01149749755859375,
0.028564453125,
0.03936767578125,
0.0114593505859375,
-0.08123779296875,
0.003337860107421875,
-0.04339599609375,
-0.007198333740234375,
0.01448822021484375,
-0.058349609375,
0.0872802734375,
-0.018310546875,
-0.006290435791015625,
0.0023250579833984375,
0.0438232421875,
0.0246124267578125,
0.0122528076171875,
0.01031494140625,
0.048919677734375,
0.04693603515625,
-0.00661468505859375,
0.06671142578125,
-0.0338134765625,
0.03240966796875,
0.0748291015625,
0.012420654296875,
0.0738525390625,
0.036529541015625,
-0.0225677490234375,
0.0270843505859375,
0.058685302734375,
-0.00237274169921875,
0.040863037109375,
-0.00997161865234375,
-0.01145172119140625,
-0.00327301025390625,
-0.0021915435791015625,
-0.044525146484375,
0.033233642578125,
0.007965087890625,
-0.04107666015625,
0.004230499267578125,
0.017364501953125,
0.0355224609375,
-0.026885986328125,
-0.0160369873046875,
0.039093017578125,
0.01031494140625,
-0.0552978515625,
0.06353759765625,
0.0169219970703125,
0.050872802734375,
-0.055633544921875,
0.0147705078125,
-0.025604248046875,
0.0242767333984375,
-0.009002685546875,
-0.03887939453125,
0.006824493408203125,
0.0042572021484375,
-0.0188751220703125,
-0.007511138916015625,
0.0175018310546875,
-0.059906005859375,
-0.069580078125,
0.0191650390625,
0.038177490234375,
0.0139923095703125,
0.006610870361328125,
-0.060791015625,
-0.00800323486328125,
0.015838623046875,
-0.042022705078125,
0.01226043701171875,
0.0511474609375,
0.0026092529296875,
0.039703369140625,
0.042724609375,
0.01409149169921875,
0.0266571044921875,
-0.0119781494140625,
0.05419921875,
-0.055816650390625,
-0.032623291015625,
-0.07958984375,
0.045013427734375,
0.016204833984375,
-0.0299072265625,
0.0841064453125,
0.051025390625,
0.07843017578125,
-0.01274871826171875,
0.054779052734375,
-0.011871337890625,
0.0274810791015625,
-0.02911376953125,
0.061492919921875,
-0.06097412109375,
-0.00672149658203125,
-0.03936767578125,
-0.059417724609375,
-0.031341552734375,
0.04705810546875,
-0.0222015380859375,
0.0245513916015625,
0.049713134765625,
0.050628662109375,
0.0003654956817626953,
-0.0241241455078125,
0.01593017578125,
0.0218505859375,
0.0160675048828125,
0.051513671875,
0.0301513671875,
-0.03985595703125,
0.051483154296875,
-0.0311737060546875,
-0.005786895751953125,
-0.0184326171875,
-0.050872802734375,
-0.06207275390625,
-0.05645751953125,
-0.0092010498046875,
-0.0251312255859375,
0.0032520294189453125,
0.0682373046875,
0.042816162109375,
-0.06549072265625,
-0.031982421875,
0.01392364501953125,
-0.00954437255859375,
-0.022308349609375,
-0.0100860595703125,
0.045806884765625,
-0.01155853271484375,
-0.06146240234375,
0.00847625732421875,
0.003368377685546875,
0.0238494873046875,
-0.0012521743774414062,
-0.0194244384765625,
-0.054962158203125,
0.0030498504638671875,
0.049224853515625,
0.020721435546875,
-0.044830322265625,
0.0150909423828125,
0.0093994140625,
-0.0244903564453125,
0.0137939453125,
0.0235443115234375,
-0.0294036865234375,
0.03857421875,
0.032318115234375,
0.031524658203125,
0.048095703125,
0.0011444091796875,
0.032012939453125,
-0.04901123046875,
0.031829833984375,
0.000576019287109375,
0.024383544921875,
0.0226593017578125,
-0.00843048095703125,
0.04034423828125,
0.0269012451171875,
-0.03143310546875,
-0.07244873046875,
-0.00357818603515625,
-0.06878662109375,
-0.0274658203125,
0.0921630859375,
-0.03814697265625,
-0.020751953125,
0.0036468505859375,
-0.0186614990234375,
0.046295166015625,
-0.016387939453125,
0.033782958984375,
0.060272216796875,
0.0180816650390625,
-0.018341064453125,
-0.051849365234375,
0.0299072265625,
0.036407470703125,
-0.054412841796875,
-0.0168304443359375,
0.00844573974609375,
0.01485443115234375,
0.01433563232421875,
0.05419921875,
-0.0113372802734375,
0.019500732421875,
-0.0091705322265625,
0.0299072265625,
0.000370025634765625,
-0.01165008544921875,
-0.0221099853515625,
-0.00289154052734375,
0.002071380615234375,
-0.0203857421875
]
] |
distilbert-base-uncased-distilled-squad | 2023-04-06T13:40:56.000Z | [
"transformers",
"pytorch",
"tf",
"tflite",
"coreml",
"safetensors",
"distilbert",
"question-answering",
"en",
"dataset:squad",
"arxiv:1910.01108",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | question-answering | null | null | null | distilbert-base-uncased-distilled-squad | 64 | 3,358,757 | transformers | 2022-03-02T23:29:04 | ---
language: en
datasets:
- squad
widget:
- text: "Which name is also used to describe the Amazon rainforest in English?"
context: "The Amazon rainforest (Portuguese: Floresta Amazรดnica or Amazรดnia; Spanish: Selva Amazรณnica, Amazonรญa or usually Amazonia; French: Forรชt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain \"Amazonas\" in their names. The Amazon represents over half of the planet's remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species."
- text: "How many square kilometers of rainforest is covered in the basin?"
context: "The Amazon rainforest (Portuguese: Floresta Amazรดnica or Amazรดnia; Spanish: Selva Amazรณnica, Amazonรญa or usually Amazonia; French: Forรชt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain \"Amazonas\" in their names. The Amazon represents over half of the planet's remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species."
license: apache-2.0
---
# DistilBERT base uncased distilled SQuAD
## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)
## Model Details
**Model Description:** The DistilBERT model was proposed in the blog post [Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT](https://medium.com/huggingface/distilbert-8cf3380435b5) and the paper [DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter](https://arxiv.org/abs/1910.01108). DistilBERT is a small, fast, cheap and light Transformer model trained by distilling BERT base. It has 40% fewer parameters than *bert-base-uncased* and runs 60% faster while preserving over 95% of BERT's performance as measured on the GLUE language understanding benchmark.
This model is a fine-tune checkpoint of [DistilBERT-base-uncased](https://huggingface.co/distilbert-base-uncased), fine-tuned using (a second step of) knowledge distillation on [SQuAD v1.1](https://huggingface.co/datasets/squad).
- **Developed by:** Hugging Face
- **Model Type:** Transformer-based language model
- **Language(s):** English
- **License:** Apache 2.0
- **Related Models:** [DistilBERT-base-uncased](https://huggingface.co/distilbert-base-uncased)
- **Resources for more information:**
- See [this repository](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) for more about Distil\* (a class of compressed models including this model)
- See [Sanh et al. (2019)](https://arxiv.org/abs/1910.01108) for more information about knowledge distillation and the training procedure
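The "second step of knowledge distillation" mentioned above minimizes a soft cross-entropy between the teacher's and the student's output distributions, scaled by a temperature. A minimal, framework-free sketch of that loss; the toy logits and the temperature value are illustrative assumptions, not values from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Soft cross-entropy: -sum_i p_teacher(i) * log p_student(i)."""
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(p_t, p_s))

# Toy logits over a 3-way decision (illustrative values only)
teacher = [2.0, 0.5, -1.0]
student = [1.8, 0.7, -0.9]
loss = distillation_loss(teacher, student)
```

The loss is smallest when the student's distribution matches the teacher's, which is what pushes the smaller model to mimic the larger one.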
## How to Get Started with the Model
Use the code below to get started with the model.
```python
>>> from transformers import pipeline
>>> question_answerer = pipeline("question-answering", model='distilbert-base-uncased-distilled-squad')
>>> context = r"""
... Extractive Question Answering is the task of extracting an answer from a text given a question. An example of a
... question answering dataset is the SQuAD dataset, which is entirely based on that task. If you would like to fine-tune
... a model on a SQuAD task, you may leverage the examples/pytorch/question-answering/run_squad.py script.
... """
>>> result = question_answerer(question="What is a good example of a question answering dataset?", context=context)
>>> print(
... f"Answer: '{result['answer']}', score: {round(result['score'], 4)}, start: {result['start']}, end: {result['end']}"
...)
Answer: 'SQuAD dataset', score: 0.4704, start: 147, end: 160
```
Here is how to use this model in PyTorch:
```python
from transformers import DistilBertTokenizer, DistilBertForQuestionAnswering
import torch
tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased-distilled-squad')
model = DistilBertForQuestionAnswering.from_pretrained('distilbert-base-uncased-distilled-squad')
question, text = "Who was Jim Henson?", "Jim Henson was a nice puppet"
inputs = tokenizer(question, text, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
answer_start_index = torch.argmax(outputs.start_logits)
answer_end_index = torch.argmax(outputs.end_logits)
predict_answer_tokens = inputs.input_ids[0, answer_start_index : answer_end_index + 1]
tokenizer.decode(predict_answer_tokens)
```
And in TensorFlow:
```python
from transformers import DistilBertTokenizer, TFDistilBertForQuestionAnswering
import tensorflow as tf
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased-distilled-squad")
model = TFDistilBertForQuestionAnswering.from_pretrained("distilbert-base-uncased-distilled-squad")
question, text = "Who was Jim Henson?", "Jim Henson was a nice puppet"
inputs = tokenizer(question, text, return_tensors="tf")
outputs = model(**inputs)
answer_start_index = int(tf.math.argmax(outputs.start_logits, axis=-1)[0])
answer_end_index = int(tf.math.argmax(outputs.end_logits, axis=-1)[0])
predict_answer_tokens = inputs.input_ids[0, answer_start_index : answer_end_index + 1]
tokenizer.decode(predict_answer_tokens)
```
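Note that taking independent argmaxes of the start and end logits, as in the snippets above, can produce an end position before the start position. A small pure-Python sketch of the usual remedy, scoring valid `(start, end)` pairs jointly; the toy logits and the `max_answer_len` cap are illustrative assumptions:

```python
def best_span(start_logits, end_logits, max_answer_len=15):
    """Return the (start, end) pair maximizing start_logits[s] + end_logits[e],
    subject to s <= e and a maximum answer length."""
    best = None
    best_score = float("-inf")
    for s, s_score in enumerate(start_logits):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_score + end_logits[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best, best_score

# Toy logits: independent argmaxes would give start=3, end=1 (an invalid span)
starts = [0.1, 0.2, 0.0, 5.0, 0.3]
ends = [0.0, 4.0, 0.1, 0.2, 3.5]
span, score = best_span(starts, ends)  # -> (3, 4), the best *valid* span
```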
## Uses
This model can be used for question answering.
#### Misuse and Out-of-scope Use
The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to produce factual or true representations of people or events, so using it to generate such content is out of scope for its abilities.
## Risks, Limitations and Biases
**CONTENT WARNING: Readers should be aware that language generated by this model can be disturbing or offensive to some and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. For example:
```python
>>> from transformers import pipeline
>>> question_answerer = pipeline("question-answering", model='distilbert-base-uncased-distilled-squad')
>>> context = r"""
... Alice is sitting on the bench. Bob is sitting next to her.
... """
>>> result = question_answerer(question="Who is the CEO?", context=context)
>>> print(
... f"Answer: '{result['answer']}', score: {round(result['score'], 4)}, start: {result['start']}, end: {result['end']}"
...)
Answer: 'Bob', score: 0.4183, start: 32, end: 35
```
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
## Training
#### Training Data
The [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) model card describes its training data as:
> DistilBERT pretrained on the same data as BERT, which is [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers).
To learn more about the SQuAD v1.1 dataset, see the [SQuAD v1.1 data card](https://huggingface.co/datasets/squad).
#### Training Procedure
##### Preprocessing
See the [distilbert-base-uncased model card](https://huggingface.co/distilbert-base-uncased) for further details.
##### Pretraining
See the [distilbert-base-uncased model card](https://huggingface.co/distilbert-base-uncased) for further details.
## Evaluation
As discussed in the [model repository](https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md):
> This model reaches a F1 score of 86.9 on the [SQuAD v1.1] dev set (for comparison, Bert bert-base-uncased version reaches a F1 score of 88.5).
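The F1 score quoted above is the token-level SQuAD metric, not classification F1: prediction and reference answers are normalized and then scored on token overlap. A self-contained sketch of that computation; the normalization rules follow the official SQuAD evaluation script's conventions (lowercasing, stripping punctuation and articles):

```python
import re
import string
from collections import Counter

def normalize(text):
    """Lowercase, strip punctuation and articles, collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def squad_f1(prediction, reference):
    """Token-overlap F1 between a predicted and a reference answer string."""
    pred_tokens = normalize(prediction).split()
    ref_tokens = normalize(reference).split()
    common = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if common == 0:
        return 0.0
    precision = common / len(pred_tokens)
    recall = common / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```

On the full dev set this per-answer score is averaged (taking the max over the reference answers for each question) to produce the reported 86.9.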
## Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware type and hours used are taken from the [associated paper](https://arxiv.org/pdf/1910.01108.pdf). Note that these figures cover only the pretraining of DistilBERT, not the fine-tuning on SQuAD.
- **Hardware Type:** 8 16GB V100 GPUs
- **Hours used:** 90 hours
- **Cloud Provider:** Unknown
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown
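For a rough back-of-envelope estimate, the figures above imply 720 GPU-hours. A sketch of how energy and emissions might be derived from them; the 300 W per-GPU average draw and the grid carbon intensity are assumptions for illustration, not reported values:

```python
# Reported: 8 V100 GPUs for 90 hours (DistilBERT pretraining only)
num_gpus = 8
hours = 90

# Assumptions (not from the paper): average draw per GPU and grid intensity
watts_per_gpu = 300          # assumed average power draw in watts
kg_co2_per_kwh = 0.432       # assumed grid carbon intensity

gpu_hours = num_gpus * hours                   # 720 GPU-hours
energy_kwh = gpu_hours * watts_per_gpu / 1000  # 216.0 kWh
emissions_kg = energy_kwh * kg_co2_per_kwh     # ~93 kg CO2eq under these assumptions
```

Actual emissions depend heavily on the compute region's energy mix, which is why the calculator asks for the cloud provider and region.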
## Technical Specifications
See the [associated paper](https://arxiv.org/abs/1910.01108) for details on the modeling architecture, objective, compute infrastructure, and training details.
## Citation Information
```bibtex
@inproceedings{sanh2019distilbert,
title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
author={Sanh, Victor and Debut, Lysandre and Chaumond, Julien and Wolf, Thomas},
booktitle={NeurIPS EMC^2 Workshop},
year={2019}
}
```
APA:
- Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
## Model Card Authors
This model card was written by the Hugging Face team.
| 10,955 | [ [ … 768-dimensional embedding vector omitted … ] ] |
google/electra-base-discriminator | 2021-04-30T07:33:10.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"electra",
"pretraining",
"en",
"arxiv:1406.2661",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | null | google | null | null | google/electra-base-discriminator | 37 | 3,318,249 | transformers | 2022-03-02T23:29:05 | ---
language: en
thumbnail: https://huggingface.co/front/thumbnails/google.png
license: apache-2.0
---
## ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
**ELECTRA** is a new method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish "real" input tokens from "fake" input tokens generated by another neural network, similar to the discriminator of a [GAN](https://arxiv.org/pdf/1406.2661.pdf). At small scale, ELECTRA achieves strong results even when trained on a single GPU. At large scale, ELECTRA achieves state-of-the-art results on the [SQuAD 2.0](https://rajpurkar.github.io/SQuAD-explorer/) dataset.
For a detailed description and experimental results, please refer to our paper [ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators](https://openreview.net/pdf?id=r1xMH1BtvB).
This repository contains code to pre-train ELECTRA, including small ELECTRA models on a single GPU. It also supports fine-tuning ELECTRA on downstream tasks including classification tasks (e.g., [GLUE](https://gluebenchmark.com/)), QA tasks (e.g., [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/)), and sequence tagging tasks (e.g., [text chunking](https://www.clips.uantwerpen.be/conll2000/chunking/)).
## How to use the discriminator in `transformers`
```python
from transformers import ElectraForPreTraining, ElectraTokenizerFast
import torch

discriminator = ElectraForPreTraining.from_pretrained("google/electra-base-discriminator")
tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-base-discriminator")

sentence = "The quick brown fox jumps over the lazy dog"
fake_sentence = "The quick brown fox fake over the lazy dog"

fake_tokens = tokenizer.tokenize(fake_sentence)
fake_inputs = tokenizer.encode(fake_sentence, return_tensors="pt")
discriminator_outputs = discriminator(fake_inputs)

# A positive logit means the discriminator flags the token as replaced ("fake")
predictions = torch.round((torch.sign(discriminator_outputs[0]) + 1) / 2)

# Drop the [CLS]/[SEP] positions so predictions line up with fake_tokens
token_predictions = predictions.squeeze().tolist()[1 : len(fake_tokens) + 1]

print("".join("%7s" % token for token in fake_tokens))
print("".join("%7s" % int(p) for p in token_predictions))
```
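The discriminator's training targets are simply per-token replaced/original labels. A minimal sketch of how those targets are derived from an original and a corrupted token sequence; in the real pipeline the replacements are sampled from a small generator network rather than inserted by hand, and the toy token lists here are illustrative:

```python
def rtd_labels(original_tokens, corrupted_tokens):
    """Replaced-token-detection targets: 1 where a token was replaced, else 0."""
    assert len(original_tokens) == len(corrupted_tokens)
    return [int(o != c) for o, c in zip(original_tokens, corrupted_tokens)]

original = "the quick brown fox jumps over the lazy dog".split()
corrupted = "the quick brown fox fake over the lazy dog".split()
labels = rtd_labels(original, corrupted)  # 1 only at the replaced position
```

Training on every position (rather than only the ~15% masked positions of BERT-style MLM) is what makes ELECTRA comparatively compute-efficient.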
| 2,199 | [ [ … embedding vector omitted; list truncated in source …
0.020721435546875,
0.008056640625,
0.058380126953125,
-0.00830841064453125,
-0.0160675048828125,
-0.0270233154296875,
-0.0509033203125,
-0.057281494140625,
0.0596923828125,
0.031951904296875,
0.017486572265625,
-0.0169525146484375,
0.00388336181640625,
0.001224517822265625,
0.021575927734375,
-0.06414794921875,
-0.04168701171875,
-0.046417236328125,
-0.034759521484375,
-0.021514892578125,
-0.014678955078125,
0.018524169921875,
-0.0401611328125,
0.050750732421875,
0.003772735595703125,
0.0501708984375,
0.0192413330078125,
-0.035888671875,
0.0012750625610351562,
0.0174713134765625,
0.019012451171875,
0.03546142578125,
-0.0076751708984375,
0.0160675048828125,
0.0270843505859375,
-0.052001953125,
0.032379150390625,
0.02569580078125,
-0.024505615234375,
0.0267486572265625,
0.011688232421875,
0.07159423828125,
-0.006542205810546875,
-0.030426025390625,
0.028564453125,
-0.0064849853515625,
-0.0214691162109375,
-0.051544189453125,
-0.0006356239318847656,
-0.0177001953125,
-0.006412506103515625,
0.0266571044921875,
0.0157928466796875,
0.0006031990051269531,
-0.02899169921875,
0.0100860595703125,
0.0189361572265625,
-0.032135009765625,
-0.046844482421875,
0.060089111328125,
0.0261688232421875,
-0.030517578125,
0.04266357421875,
-0.0253753662109375,
-0.069091796875,
0.058319091796875,
0.06646728515625,
0.0806884765625,
-0.0221405029296875,
0.041595458984375,
0.032135009765625,
0.03717041015625,
-0.0186614990234375,
0.0113067626953125,
-0.004974365234375,
-0.0826416015625,
-0.03973388671875,
-0.033172607421875,
-0.0142822265625,
0.00028514862060546875,
-0.03179931640625,
0.0160675048828125,
-0.02166748046875,
-0.0088348388671875,
0.004047393798828125,
0.00341033935546875,
-0.07818603515625,
0.005809783935546875,
-0.0064697265625,
0.060211181640625,
-0.055999755859375,
0.072998046875,
0.05755615234375,
-0.039215087890625,
-0.06317138671875,
-0.029693603515625,
-0.0478515625,
-0.05419921875,
0.052337646484375,
0.046173095703125,
-0.00021123886108398438,
0.0240936279296875,
-0.02105712890625,
-0.047332763671875,
0.06817626953125,
0.0283203125,
-0.032867431640625,
-0.0219573974609375,
0.01375579833984375,
0.035614013671875,
-0.0175933837890625,
0.045013427734375,
0.049224853515625,
0.0313720703125,
-0.013641357421875,
-0.053131103515625,
0.0035533905029296875,
-0.0234222412109375,
-0.0090789794921875,
0.01995849609375,
-0.046417236328125,
0.07958984375,
-0.0013437271118164062,
-0.0236663818359375,
0.005435943603515625,
0.06329345703125,
0.018310546875,
0.01190185546875,
0.046417236328125,
0.045806884765625,
0.0570068359375,
-0.0301055908203125,
0.0772705078125,
-0.007648468017578125,
0.0506591796875,
0.055694580078125,
-0.0106964111328125,
0.048431396484375,
0.041778564453125,
-0.03369140625,
0.060089111328125,
0.0323486328125,
-0.010650634765625,
0.04937744140625,
0.00708770751953125,
-0.025146484375,
-0.01922607421875,
0.018310546875,
-0.038482666015625,
0.030792236328125,
0.0204315185546875,
-0.0200653076171875,
-0.015899658203125,
0.006328582763671875,
-0.00220489501953125,
-0.006359100341796875,
-0.022369384765625,
0.05126953125,
-0.001491546630859375,
-0.03497314453125,
0.039794921875,
-0.005489349365234375,
0.0841064453125,
-0.045989990234375,
-0.00445556640625,
0.0015478134155273438,
0.03466796875,
-0.02899169921875,
-0.0369873046875,
0.007904052734375,
0.00785064697265625,
0.0065765380859375,
-0.0224609375,
0.07159423828125,
-0.028411865234375,
-0.046142578125,
0.0012941360473632812,
0.02178955078125,
0.0069122314453125,
-0.01097869873046875,
-0.03594970703125,
-0.0033740997314453125,
-0.00809478759765625,
-0.01837158203125,
0.007808685302734375,
0.0140380859375,
0.02996826171875,
0.042388916015625,
0.0450439453125,
0.00786590576171875,
0.022918701171875,
0.00682830810546875,
0.06939697265625,
-0.034423828125,
-0.041351318359375,
-0.0777587890625,
0.0285491943359375,
-0.0156402587890625,
-0.0311279296875,
0.0736083984375,
0.046905517578125,
0.07574462890625,
-0.01238250732421875,
0.052947998046875,
-0.0225830078125,
0.0032634735107421875,
-0.037139892578125,
0.053863525390625,
-0.0121612548828125,
-0.00922393798828125,
-0.01517486572265625,
-0.06622314453125,
-0.00811767578125,
0.08111572265625,
-0.0214996337890625,
0.01226043701171875,
0.056732177734375,
0.035552978515625,
0.014251708984375,
-0.01152801513671875,
0.0042266845703125,
0.0047149658203125,
0.040283203125,
0.03564453125,
0.07464599609375,
-0.044708251953125,
0.055999755859375,
-0.03753662109375,
0.01287078857421875,
-0.007648468017578125,
-0.02996826171875,
-0.09381103515625,
-0.046112060546875,
-0.025726318359375,
-0.02227783203125,
-0.0023670196533203125,
0.058929443359375,
0.060546875,
-0.066162109375,
-0.0167388916015625,
-0.046844482421875,
0.0207061767578125,
-0.0216217041015625,
-0.0174407958984375,
0.029205322265625,
-0.04046630859375,
-0.0537109375,
0.0221710205078125,
0.0004341602325439453,
-0.00713348388671875,
-0.020233154296875,
-0.006900787353515625,
-0.0097503662109375,
-0.00678253173828125,
0.0499267578125,
0.027679443359375,
-0.0379638671875,
-0.0323486328125,
-0.006504058837890625,
0.00164031982421875,
0.0257110595703125,
0.054931640625,
-0.0906982421875,
0.03863525390625,
0.0301055908203125,
0.0285491943359375,
0.0687255859375,
-0.0204010009765625,
0.03411865234375,
-0.039276123046875,
0.0291900634765625,
0.0173492431640625,
0.042205810546875,
0.01837158203125,
-0.01039886474609375,
0.0244903564453125,
0.008331298828125,
-0.0411376953125,
-0.057861328125,
0.0070648193359375,
-0.057769775390625,
0.00792694091796875,
0.066162109375,
-0.00592803955078125,
-0.017730712890625,
-0.00301361083984375,
-0.022186279296875,
0.0430908203125,
-0.035736083984375,
0.04583740234375,
0.0277557373046875,
0.006031036376953125,
-0.01305389404296875,
-0.0197906494140625,
0.035797119140625,
0.0215606689453125,
-0.08001708984375,
-0.0216217041015625,
0.005352020263671875,
0.007625579833984375,
0.021575927734375,
0.056854248046875,
0.0022106170654296875,
0.01342010498046875,
0.0004544258117675781,
0.0268402099609375,
-0.007793426513671875,
-0.01268768310546875,
-0.0219268798828125,
-0.000995635986328125,
-0.022186279296875,
-0.046295166015625
]
] |
pyannote/segmentation | 2023-10-04T18:52:36.000Z | [
"pyannote-audio",
"pytorch",
"pyannote",
"pyannote-audio-model",
"audio",
"voice",
"speech",
"speaker",
"speaker-segmentation",
"voice-activity-detection",
"overlapped-speech-detection",
"resegmentation",
"arxiv:2104.04045",
"license:mit",
"has_space",
"region:us"
] | voice-activity-detection | pyannote | null | null | pyannote/segmentation | 309 | 3,189,524 | pyannote-audio | 2022-03-02T23:29:05 | ---
tags:
- pyannote
- pyannote-audio
- pyannote-audio-model
- audio
- voice
- speech
- speaker
- speaker-segmentation
- voice-activity-detection
- overlapped-speech-detection
- resegmentation
license: mit
inference: false
extra_gated_prompt: "The collected information will help acquire a better knowledge of pyannote.audio userbase and help its maintainers apply for grants to improve it further. If you are an academic researcher, please cite the relevant papers in your own publications using the model. If you work for a company, please consider contributing back to pyannote.audio development (e.g. through unrestricted gifts). We also provide scientific consulting services around speaker diarization and machine listening."
extra_gated_fields:
Company/university: text
Website: text
I plan to use this model for (task, type of audio data, etc): text
---
Using this open-source model in production?
Make the most of it thanks to our [consulting services](https://herve.niderb.fr/consulting.html).
# 🎹 Speaker segmentation
[Paper](http://arxiv.org/abs/2104.04045) | [Demo](https://huggingface.co/spaces/pyannote/pretrained-pipelines) | [Blog post](https://herve.niderb.fr/fastpages/2022/10/23/One-speaker-segmentation-model-to-rule-them-all)

## Usage
Relies on pyannote.audio 2.1.1: see [installation instructions](https://github.com/pyannote/pyannote-audio).
```python
# 1. visit hf.co/pyannote/segmentation and accept user conditions
# 2. visit hf.co/settings/tokens to create an access token
# 3. instantiate pretrained model
from pyannote.audio import Model
model = Model.from_pretrained("pyannote/segmentation",
use_auth_token="ACCESS_TOKEN_GOES_HERE")
```
### Voice activity detection
```python
from pyannote.audio.pipelines import VoiceActivityDetection
pipeline = VoiceActivityDetection(segmentation=model)
HYPER_PARAMETERS = {
# onset/offset activation thresholds
"onset": 0.5, "offset": 0.5,
# remove speech regions shorter than that many seconds.
"min_duration_on": 0.0,
# fill non-speech regions shorter than that many seconds.
"min_duration_off": 0.0
}
pipeline.instantiate(HYPER_PARAMETERS)
vad = pipeline("audio.wav")
# `vad` is a pyannote.core.Annotation instance containing speech regions
```
### Overlapped speech detection
```python
from pyannote.audio.pipelines import OverlappedSpeechDetection
pipeline = OverlappedSpeechDetection(segmentation=model)
pipeline.instantiate(HYPER_PARAMETERS)
osd = pipeline("audio.wav")
# `osd` is a pyannote.core.Annotation instance containing overlapped speech regions
```
### Resegmentation
```python
from pyannote.audio.pipelines import Resegmentation
pipeline = Resegmentation(segmentation=model,
diarization="baseline")
pipeline.instantiate(HYPER_PARAMETERS)
resegmented_baseline = pipeline({"audio": "audio.wav", "baseline": baseline})
# where `baseline` should be provided as a pyannote.core.Annotation instance
```
### Raw scores
```python
from pyannote.audio import Inference
inference = Inference(model)
segmentation = inference("audio.wav")
# `segmentation` is a pyannote.core.SlidingWindowFeature
# instance containing raw segmentation scores like the
# one pictured above (output)
```
## Citation
```bibtex
@inproceedings{Bredin2021,
Title = {{End-to-end speaker segmentation for overlap-aware resegmentation}},
Author = {{Bredin}, Herv{\'e} and {Laurent}, Antoine},
Booktitle = {Proc. Interspeech 2021},
Address = {Brno, Czech Republic},
Month = {August},
  Year = {2021},
}
```
```bibtex
@inproceedings{Bredin2020,
Title = {{pyannote.audio: neural building blocks for speaker diarization}},
Author = {{Bredin}, Herv{\'e} and {Yin}, Ruiqing and {Coria}, Juan Manuel and {Gelly}, Gregory and {Korshunov}, Pavel and {Lavechin}, Marvin and {Fustes}, Diego and {Titeux}, Hadrien and {Bouaziz}, Wassim and {Gill}, Marie-Philippe},
Booktitle = {ICASSP 2020, IEEE International Conference on Acoustics, Speech, and Signal Processing},
Address = {Barcelona, Spain},
Month = {May},
Year = {2020},
}
```
## Reproducible research
In order to reproduce the results of the paper ["End-to-end speaker segmentation for overlap-aware resegmentation"](https://arxiv.org/abs/2104.04045), use `pyannote/segmentation@Interspeech2021` with the following hyper-parameters:

| Voice activity detection | `onset` | `offset` | `min_duration_on` | `min_duration_off` |
| ------------------------ | ------- | -------- | ----------------- | ------------------ |
| AMI Mix-Headset | 0.684 | 0.577 | 0.181 | 0.037 |
| DIHARD3 | 0.767 | 0.377 | 0.136 | 0.067 |
| VoxConverse              | 0.767   | 0.713    | 0.182             | 0.501              |

| Overlapped speech detection | `onset` | `offset` | `min_duration_on` | `min_duration_off` |
| --------------------------- | ------- | -------- | ----------------- | ------------------ |
| AMI Mix-Headset | 0.448 | 0.362 | 0.116 | 0.187 |
| DIHARD3 | 0.430 | 0.320 | 0.091 | 0.144 |
| VoxConverse                 | 0.587   | 0.426    | 0.337             | 0.112              |

| Resegmentation of VBx | `onset` | `offset` | `min_duration_on` | `min_duration_off` |
| --------------------- | ------- | -------- | ----------------- | ------------------ |
| AMI Mix-Headset | 0.542 | 0.527 | 0.044 | 0.705 |
| DIHARD3 | 0.592 | 0.489 | 0.163 | 0.182 |
| VoxConverse | 0.537 | 0.724 | 0.410 | 0.563 |

Expected outputs (and VBx baseline) are also provided in the `/reproducible_research` sub-directories.
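For convenience, the voice activity detection rows of the tables above can be transcribed into a plain dictionary and passed straight to `pipeline.instantiate` from the usage section. This is just a sketch collecting the published numbers; the `pipeline` variable is assumed to be the `VoiceActivityDetection` pipeline built earlier:

```python
# Voice activity detection hyper-parameters, transcribed from the table above,
# keyed by benchmark.
VAD_PARAMS = {
    "AMI Mix-Headset": {"onset": 0.684, "offset": 0.577,
                        "min_duration_on": 0.181, "min_duration_off": 0.037},
    "DIHARD3": {"onset": 0.767, "offset": 0.377,
                "min_duration_on": 0.136, "min_duration_off": 0.067},
    "VoxConverse": {"onset": 0.767, "offset": 0.713,
                    "min_duration_on": 0.182, "min_duration_off": 0.501},
}

# e.g. pipeline.instantiate(VAD_PARAMS["DIHARD3"])  # with the VAD pipeline above
```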
| 5,844 | [
[
-0.048187255859375,
-0.049713134765625,
0.0252227783203125,
0.0224456787109375,
-0.0297088623046875,
-0.0223846435546875,
-0.0261077880859375,
-0.0256805419921875,
0.031951904296875,
0.033172607421875,
-0.05047607421875,
-0.048187255859375,
-0.019287109375,
-0.020233154296875,
-0.01165008544921875,
0.042877197265625,
0.02587890625,
-0.011077880859375,
-0.009246826171875,
-0.00598907470703125,
-0.0279541015625,
-0.0207977294921875,
-0.0289764404296875,
-0.02044677734375,
0.01317596435546875,
0.039520263671875,
0.0233612060546875,
0.057861328125,
0.0195159912109375,
0.0260009765625,
-0.03509521484375,
0.0094146728515625,
-0.00708770751953125,
-0.00034046173095703125,
0.0133209228515625,
-0.0006093978881835938,
-0.0263671875,
0.0028228759765625,
0.06610107421875,
0.05035400390625,
-0.0203399658203125,
0.01885986328125,
-0.009246826171875,
0.025726318359375,
-0.029754638671875,
-0.0024509429931640625,
-0.033416748046875,
-0.0009946823120117188,
-0.0244140625,
-0.01380157470703125,
-0.01898193359375,
-0.0034999847412109375,
0.01641845703125,
-0.0435791015625,
-0.0007638931274414062,
-0.005130767822265625,
0.09375,
0.01218414306640625,
0.01025390625,
0.0021266937255859375,
-0.0462646484375,
0.05572509765625,
-0.07452392578125,
0.03497314453125,
0.025177001953125,
0.023956298828125,
-0.01026153564453125,
-0.06396484375,
-0.043914794921875,
-0.0010890960693359375,
-0.004459381103515625,
0.0206298828125,
-0.02276611328125,
0.00853729248046875,
0.02874755859375,
0.034210205078125,
-0.03515625,
0.01213836669921875,
-0.041229248046875,
-0.0225372314453125,
0.05194091796875,
-0.00530242919921875,
0.0182037353515625,
-0.019561767578125,
-0.0268402099609375,
-0.0162353515625,
-0.0196685791015625,
0.01861572265625,
0.03204345703125,
0.026214599609375,
-0.0274810791015625,
0.0333251953125,
-0.0018148422241210938,
0.06622314453125,
0.01277923583984375,
-0.007602691650390625,
0.051788330078125,
-0.0357666015625,
-0.01959228515625,
0.0301361083984375,
0.0814208984375,
0.0183868408203125,
0.00616455078125,
0.02655029296875,
-0.0020313262939453125,
-0.0265655517578125,
0.0003752708435058594,
-0.05633544921875,
-0.034393310546875,
0.03857421875,
-0.035797119140625,
0.01702880859375,
-0.00441741943359375,
-0.06011962890625,
-0.0001493692398071289,
-0.0237884521484375,
0.035888671875,
-0.041778564453125,
-0.0438232421875,
0.00498199462890625,
-0.0201263427734375,
0.01514434814453125,
0.0018968582153320312,
-0.07171630859375,
0.016143798828125,
0.03887939453125,
0.0838623046875,
0.008148193359375,
-0.018524169921875,
-0.041168212890625,
-0.004146575927734375,
-0.0172882080078125,
0.054107666015625,
-0.025299072265625,
-0.033416748046875,
-0.028228759765625,
-0.0007796287536621094,
-0.016937255859375,
-0.046173095703125,
0.0496826171875,
0.004108428955078125,
0.0201873779296875,
-0.01270294189453125,
-0.039794921875,
-0.004627227783203125,
-0.01261138916015625,
-0.029541015625,
0.0732421875,
0.0136260986328125,
-0.054412841796875,
0.0271148681640625,
-0.049163818359375,
-0.00859832763671875,
-0.00885009765625,
-0.0111236572265625,
-0.05999755859375,
-0.01366424560546875,
0.0233001708984375,
0.0142822265625,
-0.00841522216796875,
-0.004276275634765625,
-0.014068603515625,
-0.0240936279296875,
0.007354736328125,
-0.022064208984375,
0.08172607421875,
0.01322174072265625,
-0.041656494140625,
0.023529052734375,
-0.08038330078125,
0.00591278076171875,
-0.00012481212615966797,
-0.03448486328125,
-0.017059326171875,
-0.0028972625732421875,
0.01361083984375,
0.010284423828125,
0.006099700927734375,
-0.054229736328125,
-0.017730712890625,
-0.040008544921875,
0.0296783447265625,
0.05029296875,
0.019378662109375,
0.01044464111328125,
-0.0210113525390625,
0.0161285400390625,
0.0092926025390625,
0.01015472412109375,
-0.0280609130859375,
-0.04364013671875,
-0.040252685546875,
-0.05352783203125,
0.02313232421875,
0.03887939453125,
-0.004070281982421875,
0.060638427734375,
-0.00693511962890625,
-0.058013916015625,
-0.050994873046875,
-0.004726409912109375,
0.037872314453125,
0.0465087890625,
0.043304443359375,
-0.0263671875,
-0.060638427734375,
-0.07647705078125,
-0.00937652587890625,
-0.01898193359375,
-0.0023860931396484375,
0.042724609375,
0.0196990966796875,
-0.005321502685546875,
0.08172607421875,
-0.032623291015625,
-0.01416778564453125,
-0.0005211830139160156,
-0.0016050338745117188,
0.040008544921875,
0.060150146484375,
0.03643798828125,
-0.058807373046875,
-0.04052734375,
0.0009946823120117188,
-0.0276031494140625,
-0.031982421875,
-0.0209808349609375,
-0.02117919921875,
-0.006011962890625,
0.034820556640625,
-0.045989990234375,
0.02911376953125,
0.0159454345703125,
-0.0240478515625,
0.06634521484375,
0.004547119140625,
0.002155303955078125,
-0.0711669921875,
0.009674072265625,
0.0197601318359375,
-0.0016613006591796875,
-0.05291748046875,
-0.05438232421875,
-0.01189422607421875,
-0.00467681884765625,
-0.029876708984375,
0.03399658203125,
-0.0445556640625,
-0.0247955322265625,
-0.001331329345703125,
0.0235595703125,
-0.012298583984375,
0.052032470703125,
0.01678466796875,
0.05859375,
0.047576904296875,
-0.04180908203125,
0.0190887451171875,
0.0266571044921875,
-0.06005859375,
0.03326416015625,
-0.06695556640625,
0.006977081298828125,
0.0235137939453125,
0.00630950927734375,
-0.09716796875,
-0.0125274658203125,
0.0379638671875,
-0.062744140625,
0.0178375244140625,
-0.030426025390625,
-0.0084075927734375,
-0.021636962890625,
-0.004322052001953125,
0.028076171875,
0.0396728515625,
-0.03839111328125,
0.023223876953125,
0.044525146484375,
-0.0240020751953125,
-0.030242919921875,
-0.052154541015625,
-0.01540374755859375,
-0.02154541015625,
-0.06451416015625,
0.047607421875,
-0.0017032623291015625,
-0.0274810791015625,
-0.003131866455078125,
-0.01561737060546875,
-0.007312774658203125,
-0.02325439453125,
0.0190582275390625,
0.007904052734375,
-0.023895263671875,
-0.01071929931640625,
-0.00969696044921875,
-0.00237274169921875,
-0.0028934478759765625,
-0.02655029296875,
0.0445556640625,
0.008392333984375,
-0.0325927734375,
-0.059417724609375,
0.01763916015625,
0.0477294921875,
-0.03936767578125,
0.0290374755859375,
0.06671142578125,
-0.0231781005859375,
0.0016012191772460938,
-0.0384521484375,
0.00010323524475097656,
-0.0350341796875,
0.0484619140625,
-0.0138702392578125,
-0.058349609375,
0.053009033203125,
0.0086822509765625,
0.023834228515625,
0.0389404296875,
0.04345703125,
-0.00115203857421875,
0.0618896484375,
0.0133209228515625,
0.01445770263671875,
0.06866455078125,
-0.0287933349609375,
0.01641845703125,
-0.09149169921875,
-0.02984619140625,
-0.04949951171875,
-0.01318359375,
-0.034576416015625,
-0.039215087890625,
0.0237274169921875,
0.007415771484375,
-0.01279449462890625,
0.0272064208984375,
-0.056365966796875,
0.024444580078125,
0.047393798828125,
-0.00695037841796875,
-0.029571533203125,
0.01497650146484375,
-0.0204620361328125,
-0.008575439453125,
-0.04498291015625,
-0.017974853515625,
0.0587158203125,
0.0245208740234375,
0.0239105224609375,
-0.0006732940673828125,
0.057586669921875,
0.008544921875,
-0.0195465087890625,
-0.056732177734375,
0.03631591796875,
0.00007218122482299805,
-0.040740966796875,
-0.045562744140625,
-0.0345458984375,
-0.0694580078125,
0.04486083984375,
0.00907135009765625,
-0.08807373046875,
0.05047607421875,
0.004238128662109375,
-0.0440673828125,
0.0275726318359375,
-0.06536865234375,
0.07537841796875,
-0.01226043701171875,
-0.0298309326171875,
-0.003322601318359375,
-0.045440673828125,
0.0169830322265625,
0.020263671875,
0.02069091796875,
-0.019989013671875,
0.0240020751953125,
0.08905029296875,
-0.038482666015625,
0.04888916015625,
-0.0396728515625,
0.005458831787109375,
0.053009033203125,
-0.020660400390625,
0.02203369140625,
0.004734039306640625,
0.0006399154663085938,
0.0095672607421875,
0.0021152496337890625,
-0.0221710205078125,
-0.0190887451171875,
0.04888916015625,
-0.060516357421875,
-0.045135498046875,
-0.0154266357421875,
-0.0284423828125,
-0.006412506103515625,
0.022064208984375,
0.0161590576171875,
0.044586181640625,
-0.002147674560546875,
0.0269927978515625,
0.048553466796875,
-0.0261993408203125,
0.046875,
0.0235137939453125,
0.004222869873046875,
-0.0714111328125,
0.0634765625,
0.017486572265625,
0.0240936279296875,
0.0167999267578125,
0.0224609375,
-0.036224365234375,
-0.0462646484375,
-0.0213623046875,
0.02032470703125,
-0.038360595703125,
0.01006317138671875,
-0.045013427734375,
-0.0190887451171875,
-0.0517578125,
0.00782012939453125,
-0.053131103515625,
-0.033721923828125,
-0.025726318359375,
0.0003514289855957031,
0.01474761962890625,
0.00923919677734375,
-0.03289794921875,
0.0325927734375,
-0.051605224609375,
0.010894775390625,
0.019378662109375,
0.0168914794921875,
-0.016021728515625,
-0.0616455078125,
-0.039520263671875,
0.0037746429443359375,
-0.01517486572265625,
-0.061553955078125,
0.0229034423828125,
0.0087432861328125,
0.07037353515625,
0.028533935546875,
-0.0106353759765625,
0.06573486328125,
-0.01397705078125,
0.0750732421875,
0.0306854248046875,
-0.07171630859375,
0.045379638671875,
-0.0302886962890625,
0.035888671875,
0.039581298828125,
0.006641387939453125,
-0.0574951171875,
0.003192901611328125,
-0.05120849609375,
-0.09747314453125,
0.088134765625,
0.032379150390625,
-0.024749755859375,
0.0017061233520507812,
0.004451751708984375,
-0.0186309814453125,
-0.003406524658203125,
-0.042816162109375,
-0.028045654296875,
-0.02630615234375,
0.0014028549194335938,
-0.0029754638671875,
-0.025054931640625,
-0.0036640167236328125,
-0.039520263671875,
0.061248779296875,
0.018341064453125,
0.0377197265625,
0.051788330078125,
0.0048065185546875,
-0.0092620849609375,
0.0085906982421875,
0.0634765625,
0.052398681640625,
-0.039031982421875,
-0.0122528076171875,
-0.005279541015625,
-0.034515380859375,
0.003082275390625,
0.01044464111328125,
0.0027256011962890625,
0.0301361083984375,
0.035400390625,
0.0816650390625,
0.004985809326171875,
-0.0372314453125,
0.0275726318359375,
0.00235748291015625,
-0.030242919921875,
-0.04449462890625,
-0.0015535354614257812,
0.0171356201171875,
0.0220489501953125,
0.025787353515625,
-0.0033626556396484375,
-0.0027923583984375,
-0.0195770263671875,
0.031524658203125,
0.00787353515625,
-0.037109375,
-0.0090179443359375,
0.036590576171875,
0.0159759521484375,
-0.05023193359375,
0.04644775390625,
-0.004627227783203125,
-0.039154052734375,
0.065673828125,
0.036224365234375,
0.08319091796875,
-0.03741455078125,
-0.003742218017578125,
0.052734375,
0.022613525390625,
0.010223388671875,
0.016387939453125,
-0.033447265625,
-0.0377197265625,
-0.02276611328125,
-0.05572509765625,
-0.018951416015625,
0.025054931640625,
-0.043304443359375,
0.01708984375,
-0.038787841796875,
-0.0225372314453125,
0.031280517578125,
-0.0035991668701171875,
-0.01496124267578125,
0.020751953125,
0.01512908935546875,
0.067626953125,
-0.045196533203125,
0.051239013671875,
0.051422119140625,
-0.0240478515625,
-0.0599365234375,
0.0003058910369873047,
-0.0030002593994140625,
-0.028472900390625,
0.0160675048828125,
0.00722503662109375,
-0.00337982177734375,
-0.00876617431640625,
-0.022247314453125,
-0.0679931640625,
0.08038330078125,
0.013427734375,
-0.06683349609375,
0.0181427001953125,
-0.00705718994140625,
0.0238494873046875,
-0.0305023193359375,
0.0242156982421875,
0.053436279296875,
0.049041748046875,
0.006938934326171875,
-0.08441162109375,
-0.0037326812744140625,
-0.037628173828125,
-0.0197296142578125,
0.0193634033203125,
-0.06854248046875,
0.081298828125,
-0.00482177734375,
-0.0052490234375,
-0.01029205322265625,
0.0472412109375,
0.041107177734375,
0.0243072509765625,
0.063232421875,
0.048370361328125,
0.04412841796875,
-0.01438140869140625,
0.0386962890625,
-0.0207672119140625,
0.016845703125,
0.08905029296875,
0.01033782958984375,
0.049285888671875,
0.0338134765625,
-0.034088134765625,
0.0316162109375,
0.06890869140625,
-0.0124664306640625,
0.045745849609375,
0.023651123046875,
-0.0074462890625,
-0.032562255859375,
-0.0025272369384765625,
-0.047332763671875,
0.045989990234375,
0.03326416015625,
-0.02716064453125,
0.01309967041015625,
-0.026153564453125,
-0.0011835098266601562,
-0.01000213623046875,
-0.006000518798828125,
0.044769287109375,
0.00510406494140625,
-0.046661376953125,
0.05975341796875,
-0.00463104248046875,
0.04443359375,
-0.0509033203125,
-0.0007967948913574219,
-0.0101470947265625,
0.011566162109375,
-0.030853271484375,
-0.04803466796875,
0.0161285400390625,
-0.01136016845703125,
-0.01328277587890625,
-0.0012493133544921875,
0.036651611328125,
-0.044952392578125,
-0.01514434814453125,
0.0255584716796875,
0.005046844482421875,
0.02325439453125,
0.0009465217590332031,
-0.054107666015625,
0.013580322265625,
0.0218505859375,
-0.03155517578125,
0.0094146728515625,
0.03570556640625,
0.016510009765625,
0.0183868408203125,
0.055145263671875,
0.017730712890625,
0.0190887451171875,
0.0142822265625,
0.057769775390625,
-0.04534912109375,
-0.07110595703125,
-0.0604248046875,
0.046295166015625,
-0.035369873046875,
-0.0394287109375,
0.06097412109375,
0.0662841796875,
0.07037353515625,
0.0193634033203125,
0.051116943359375,
-0.0090179443359375,
0.039947509765625,
-0.0294952392578125,
0.05316162109375,
-0.04779052734375,
0.0257110595703125,
-0.041778564453125,
-0.063720703125,
-0.0062103271484375,
0.051849365234375,
-0.025970458984375,
0.01477813720703125,
0.03472900390625,
0.06451416015625,
-0.0142974853515625,
-0.00855255126953125,
0.0173797607421875,
0.0126953125,
0.0350341796875,
0.0306854248046875,
0.051544189453125,
-0.0192413330078125,
0.052581787109375,
-0.04083251953125,
-0.01458740234375,
-0.015960693359375,
-0.0301513671875,
-0.05279541015625,
-0.055908203125,
-0.0406494140625,
-0.0164947509765625,
0.0015497207641601562,
0.07861328125,
0.077880859375,
-0.052703857421875,
-0.044342041015625,
0.00003600120544433594,
0.01084136962890625,
-0.033355712890625,
-0.017547607421875,
0.0465087890625,
-0.002490997314453125,
-0.051361083984375,
0.047119140625,
0.0265045166015625,
0.00516510009765625,
-0.00017523765563964844,
-0.021636962890625,
-0.052093505859375,
0.0057830810546875,
0.01006317138671875,
0.03399658203125,
-0.043853759765625,
-0.01129150390625,
-0.0286102294921875,
0.0162506103515625,
0.0260009765625,
0.045989990234375,
-0.0245513916015625,
0.045745849609375,
0.04888916015625,
0.0170745849609375,
0.054840087890625,
0.004848480224609375,
0.019012451171875,
-0.054779052734375,
0.0170745849609375,
0.02178955078125,
0.0305328369140625,
0.029571533203125,
-0.00970458984375,
0.0208892822265625,
0.0389404296875,
-0.046844482421875,
-0.0809326171875,
-0.007061004638671875,
-0.07293701171875,
-0.019256591796875,
0.08331298828125,
-0.0182037353515625,
-0.03717041015625,
-0.007274627685546875,
-0.0260772705078125,
0.038299560546875,
-0.044677734375,
0.041229248046875,
0.045166015625,
-0.023193359375,
-0.0095062255859375,
-0.02630615234375,
0.044647216796875,
0.0290069580078125,
-0.041168212890625,
0.0167694091796875,
0.039794921875,
0.0177001953125,
0.037445068359375,
0.0655517578125,
-0.013275146484375,
0.03155517578125,
0.03570556640625,
0.0180511474609375,
-0.030609130859375,
-0.018829345703125,
-0.0196533203125,
0.00708770751953125,
-0.00803375244140625,
-0.056915283203125
]
] |
distilroberta-base | 2022-11-16T23:22:40.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"safetensors",
"roberta",
"fill-mask",
"exbert",
"en",
"dataset:openwebtext",
"arxiv:1910.01108",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | distilroberta-base | 86 | 3,140,871 | transformers | 2022-03-02T23:29:04 | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- openwebtext
---
# Model Card for DistilRoBERTa base
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training Details](#training-details)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Citation](#citation)
8. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
This model is a distilled version of the [RoBERTa-base model](https://huggingface.co/roberta-base). It follows the same training procedure as [DistilBERT](https://huggingface.co/distilbert-base-uncased).
The code for the distillation process can be found [here](https://github.com/huggingface/transformers/tree/master/examples/distillation).
This model is case-sensitive: it makes a difference between english and English.
The model has 6 layers, a hidden dimension of 768, and 12 attention heads, for a total of 82M parameters (compared to 125M for RoBERTa-base).
On average, DistilRoBERTa is twice as fast as RoBERTa-base.
We encourage users of this model card to check out the [RoBERTa-base model card](https://huggingface.co/roberta-base) to learn more about usage, limitations and potential biases.
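The 82M figure is easy to sanity-check with a back-of-envelope count. The sketch below assumes the same tensor shapes as RoBERTa-base (vocabulary 50,265, 514 positions, FFN size 3,072) and ignores small terms such as the LM head and final layer norm:

```python
# Rough parameter count for DistilRoBERTa base (assumed shapes, see lead-in).
vocab, max_pos, d, ffn, layers = 50265, 514, 768, 3072, 6

embeddings = (vocab + max_pos) * d       # token + position embeddings
attention = 4 * d * d + 4 * d            # Q, K, V, output projections (+ biases)
feedforward = 2 * d * ffn + ffn + d      # two linear layers (+ biases)
layer_norms = 2 * 2 * d                  # two LayerNorms per layer (weight + bias)
per_layer = attention + feedforward + layer_norms

total = embeddings + layers * per_layer
print(f"~{total / 1e6:.0f}M parameters")  # → ~82M parameters
```

The estimate lands within rounding of the quoted 82M, and doubling `layers` to 12 recovers roughly the 125M of RoBERTa-base.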
- **Developed by:** Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf (Hugging Face)
- **Model type:** Transformer-based language model
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Related Models:** [RoBERTa-base model card](https://huggingface.co/roberta-base)
- **Resources for more information:**
- [GitHub Repository](https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md)
- [Associated Paper](https://arxiv.org/abs/1910.01108)
# Uses
## Direct Use and Downstream Use
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=roberta) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at a model like GPT-2.
## Out of Scope Use
The model should not be used to intentionally create hostile or alienating environments for people. The model was not trained to produce factual or true representations of people or events, so using it to generate such content is out of scope for its abilities.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. For example:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='distilroberta-base')
>>> unmasker("The man worked as a <mask>.")
[{'score': 0.1237526461482048,
'sequence': 'The man worked as a waiter.',
'token': 38233,
'token_str': ' waiter'},
{'score': 0.08968018740415573,
'sequence': 'The man worked as a waitress.',
'token': 35698,
'token_str': ' waitress'},
{'score': 0.08387645334005356,
'sequence': 'The man worked as a bartender.',
'token': 33080,
'token_str': ' bartender'},
{'score': 0.061059024184942245,
'sequence': 'The man worked as a mechanic.',
'token': 25682,
'token_str': ' mechanic'},
{'score': 0.03804653510451317,
'sequence': 'The man worked as a courier.',
'token': 37171,
'token_str': ' courier'}]
>>> unmasker("The woman worked as a <mask>.")
[{'score': 0.23149248957633972,
'sequence': 'The woman worked as a waitress.',
'token': 35698,
'token_str': ' waitress'},
{'score': 0.07563332468271255,
'sequence': 'The woman worked as a waiter.',
'token': 38233,
'token_str': ' waiter'},
{'score': 0.06983394920825958,
'sequence': 'The woman worked as a bartender.',
'token': 33080,
'token_str': ' bartender'},
{'score': 0.05411609262228012,
'sequence': 'The woman worked as a nurse.',
'token': 9008,
'token_str': ' nurse'},
{'score': 0.04995106905698776,
'sequence': 'The woman worked as a maid.',
'token': 29754,
'token_str': ' maid'}]
```
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
# Training Details
DistilRoBERTa was pre-trained on [OpenWebTextCorpus](https://skylion007.github.io/OpenWebTextCorpus/), a reproduction of OpenAI's WebText dataset (roughly 4 times less training data than the teacher RoBERTa was trained on). See the [roberta-base model card](https://huggingface.co/roberta-base/blob/main/README.md) for further details on training.
# Evaluation
When fine-tuned on downstream tasks, this model achieves the following results (see [GitHub Repo](https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md)):
GLUE test results:
| Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE |
|:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|
| | 84.0 | 89.4 | 90.8 | 92.5 | 59.3 | 88.3 | 86.6 | 67.9 |
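The card reports per-task scores only; for a rough single-number comparison one can take the unweighted macro-average of the listed results (note that the official GLUE score uses task-specific metric aggregation, so this is only an approximation):

```python
# Unweighted average of the GLUE scores listed in the table above.
scores = {"MNLI": 84.0, "QQP": 89.4, "QNLI": 90.8, "SST-2": 92.5,
          "CoLA": 59.3, "STS-B": 88.3, "MRPC": 86.6, "RTE": 67.9}
average = sum(scores.values()) / len(scores)
print(f"macro-average: {average:.2f}")  # 82.35
```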
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Citation
```bibtex
@article{Sanh2019DistilBERTAD,
title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
journal={ArXiv},
year={2019},
volume={abs/1910.01108}
}
```
APA
- Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
# How to Get Started With the Model
You can use the model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='distilroberta-base')
>>> unmasker("Hello I'm a <mask> model.")
[{'score': 0.04673689603805542,
'sequence': "Hello I'm a business model.",
'token': 265,
'token_str': ' business'},
{'score': 0.03846118599176407,
'sequence': "Hello I'm a freelance model.",
'token': 18150,
'token_str': ' freelance'},
{'score': 0.03308931365609169,
'sequence': "Hello I'm a fashion model.",
'token': 2734,
'token_str': ' fashion'},
{'score': 0.03018997237086296,
'sequence': "Hello I'm a role model.",
'token': 774,
'token_str': ' role'},
{'score': 0.02111748233437538,
'sequence': "Hello I'm a Playboy model.",
'token': 24526,
'token_str': ' Playboy'}]
```
<a href="https://huggingface.co/exbert/?model=distilroberta-base">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 7,498 | [ [ … ] ] |
pyannote/speaker-diarization | 2023-10-04T18:53:17.000Z | [
"pyannote-audio",
"pyannote",
"pyannote-audio-pipeline",
"audio",
"voice",
"speech",
"speaker",
"speaker-diarization",
"speaker-change-detection",
"voice-activity-detection",
"overlapped-speech-detection",
"automatic-speech-recognition",
"dataset:ami",
"dataset:dihard",
"dataset:voxconverse",
"dataset:aishell",
"dataset:repere",
"dataset:voxceleb",
"arxiv:2012.01477",
"arxiv:2110.07058",
"arxiv:2005.08072",
"license:mit",
"has_space",
"region:us"
] | automatic-speech-recognition | pyannote | null | null | pyannote/speaker-diarization | 537 | 2,981,173 | pyannote-audio | 2022-03-02T23:29:05 | ---
tags:
- pyannote
- pyannote-audio
- pyannote-audio-pipeline
- audio
- voice
- speech
- speaker
- speaker-diarization
- speaker-change-detection
- voice-activity-detection
- overlapped-speech-detection
- automatic-speech-recognition
datasets:
- ami
- dihard
- voxconverse
- aishell
- repere
- voxceleb
license: mit
extra_gated_prompt: "The collected information will help acquire a better knowledge of pyannote.audio userbase and help its maintainers apply for grants to improve it further. If you are an academic researcher, please cite the relevant papers in your own publications using the model. If you work for a company, please consider contributing back to pyannote.audio development (e.g. through unrestricted gifts). We also provide scientific consulting services around speaker diarization and machine listening."
extra_gated_fields:
Company/university: text
Website: text
I plan to use this model for (task, type of audio data, etc): text
---
Using this open-source pipeline in production?
Make the most of it thanks to our [consulting services](https://herve.niderb.fr/consulting.html).
# ๐น Speaker diarization
Relies on pyannote.audio 2.1.1: see [installation instructions](https://github.com/pyannote/pyannote-audio#installation).
## TL;DR
```python
# 1. visit hf.co/pyannote/speaker-diarization and accept user conditions
# 2. visit hf.co/pyannote/segmentation and accept user conditions
# 3. visit hf.co/settings/tokens to create an access token
# 4. instantiate pretrained speaker diarization pipeline
from pyannote.audio import Pipeline
pipeline = Pipeline.from_pretrained("pyannote/speaker-diarization@2.1",
use_auth_token="ACCESS_TOKEN_GOES_HERE")
# apply the pipeline to an audio file
diarization = pipeline("audio.wav")
# dump the diarization output to disk using RTTM format
with open("audio.rttm", "w") as rttm:
diarization.write_rttm(rttm)
```
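The RTTM file written above is a plain-text format in which each line describes one speech turn (`SPEAKER <file> <channel> <onset> <duration> <NA> <NA> <speaker> <NA> <NA>`). A minimal parser sketch, using a hypothetical example line of the kind `write_rttm` produces:

```python
def parse_rttm_line(line: str):
    """Extract (start, end, speaker) from one RTTM SPEAKER line."""
    fields = line.split()
    start, duration = float(fields[3]), float(fields[4])
    return start, start + duration, fields[7]

# hypothetical line, for illustration only
line = "SPEAKER audio 1 0.50 2.25 <NA> <NA> SPEAKER_00 <NA> <NA>"
print(parse_rttm_line(line))  # (0.5, 2.75, 'SPEAKER_00')
```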
## Advanced usage
In case the number of speakers is known in advance, one can use the `num_speakers` option:
```python
diarization = pipeline("audio.wav", num_speakers=2)
```
One can also provide lower and/or upper bounds on the number of speakers using the `min_speakers` and `max_speakers` options:
```python
diarization = pipeline("audio.wav", min_speakers=2, max_speakers=5)
```
## Benchmark
### Real-time factor
Real-time factor is around 2.5% using one Nvidia Tesla V100 SXM2 GPU (for the neural inference part) and one Intel Cascade Lake 6248 CPU (for the clustering part).
In other words, it takes approximately 1.5 minutes to process a one-hour conversation.
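A real-time factor of 2.5% means processing time is 2.5% of the audio duration, so the 1.5-minute figure for one hour of audio follows directly:

```python
rtf = 0.025           # real-time factor reported above
audio_minutes = 60    # a one-hour conversation
print(audio_minutes * rtf)  # 1.5 minutes of processing
```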
### Accuracy
This pipeline is benchmarked on a growing collection of datasets.
Processing is fully automatic:
* no manual voice activity detection (as is sometimes the case in the literature)
* no manually supplied number of speakers (though it is possible to provide it to the pipeline)
* no fine-tuning of the internal models nor tuning of the pipeline hyper-parameters to each dataset
... with the least forgiving diarization error rate (DER) setup (named *"Full"* in [this paper](https://doi.org/10.1016/j.csl.2021.101254)):
* no forgiveness collar
* evaluation of overlapped speech
| Benchmark | [DER%](. "Diarization error rate") | [FA%](. "False alarm rate") | [Miss%](. "Missed detection rate") | [Conf%](. "Speaker confusion rate") | Expected output | File-level evaluation |
| ------------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------- | --------------------------- | ---------------------------------- | ----------------------------------- | ----------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------- |
| [AISHELL-4](http://www.openslr.org/111/) | 14.09 | 5.17 | 3.27 | 5.65 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/AISHELL.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/AISHELL.test.eval) |
| [Albayzin (*RTVE 2022*)](http://catedrartve.unizar.es/albayzindatabases.html) | 25.60 | 5.58 | 6.84 | 13.18 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/Albayzin2022.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/Albayzin2022.test.eval) |
| [AliMeeting (*channel 1*)](https://www.openslr.org/119/) | 27.42 | 4.84 | 14.00 | 8.58 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/AliMeeting.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/AliMeeting.test.eval) |
| [AMI (*headset mix,*](https://groups.inf.ed.ac.uk/ami/corpus/) [*only_words*)](https://github.com/BUTSpeechFIT/AMI-diarization-setup) | 18.91 | 4.48 | 9.51 | 4.91 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/AMI.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/AMI.test.eval) |
| [AMI (*array1, channel 1,*](https://groups.inf.ed.ac.uk/ami/corpus/) [*only_words)*](https://github.com/BUTSpeechFIT/AMI-diarization-setup) | 27.12 | 4.11 | 17.78 | 5.23 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/AMI-SDM.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/AMI-SDM.test.eval) |
| [CALLHOME](https://catalog.ldc.upenn.edu/LDC2001S97) [(*part2*)](https://github.com/BUTSpeechFIT/CALLHOME_sublists/issues/1) | 32.37 | 6.30 | 13.72 | 12.35 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/CALLHOME.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/CALLHOME.test.eval) |
| [DIHARD 3 (*Full*)](https://arxiv.org/abs/2012.01477) | 26.94 | 10.50 | 8.41 | 8.03 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/DIHARD.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/DIHARD.test.eval) |
| [Ego4D *v1 (validation)*](https://arxiv.org/abs/2110.07058) | 63.99 | 3.91 | 44.42 | 15.67 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/Ego4D.development.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/Ego4D.development.eval) |
| [REPERE (*phase 2*)](https://islrn.org/resources/360-758-359-485-0/) | 8.17 | 2.23 | 2.49 | 3.45 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/REPERE.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/REPERE.test.eval) |
| [This American Life](https://arxiv.org/abs/2005.08072) | 20.82 | 2.03 | 11.89 | 6.90 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/ThisAmericanLife.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/2.1.1/reproducible_research/2.1.1/ThisAmericanLife.test.eval) |
| [VoxConverse (*v0.3*)](https://github.com/joonson/voxconverse) | 11.24 | 4.42 | 2.88 | 3.94 | [RTTM](https://huggingface.co/pyannote/speaker-diarization/blob/main/reproducible_research/2.1.1/VoxConverse.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization/blob/main/reproducible_research/2.1.1/VoxConverse.test.eval) |
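Under this evaluation setup, the diarization error rate decomposes as DER = FA + Miss + Conf, which can be checked against the rows of the table above (two rows shown; small deviations come from rounding):

```python
# DER% should equal the sum of its three components for each benchmark row.
rows = {
    "AISHELL-4": (14.09, 5.17, 3.27, 5.65),
    "REPERE (phase 2)": (8.17, 2.23, 2.49, 3.45),
}
for name, (der, fa, miss, conf) in rows.items():
    assert abs(der - (fa + miss + conf)) < 0.02, name  # rounding tolerance
    print(f"{name}: {fa} + {miss} + {conf} = {der}")
```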
## Technical report
This [report](technical_report_2.1.pdf) describes the main principles behind version `2.1` of pyannote.audio speaker diarization pipeline.
It also provides recipes explaining how to adapt the pipeline to your own set of annotated data. In particular, those recipes are applied to the benchmark above and consistently lead to significant performance improvements over the out-of-the-box results.
## Citations
```bibtex
@inproceedings{Bredin2021,
Title = {{End-to-end speaker segmentation for overlap-aware resegmentation}},
Author = {{Bredin}, Herv{\'e} and {Laurent}, Antoine},
Booktitle = {Proc. Interspeech 2021},
Address = {Brno, Czech Republic},
Month = {August},
Year = {2021},
}
```
```bibtex
@inproceedings{Bredin2020,
Title = {{pyannote.audio: neural building blocks for speaker diarization}},
Author = {{Bredin}, Herv{\'e} and {Yin}, Ruiqing and {Coria}, Juan Manuel and {Gelly}, Gregory and {Korshunov}, Pavel and {Lavechin}, Marvin and {Fustes}, Diego and {Titeux}, Hadrien and {Bouaziz}, Wassim and {Gill}, Marie-Philippe},
Booktitle = {ICASSP 2020, IEEE International Conference on Acoustics, Speech, and Signal Processing},
Address = {Barcelona, Spain},
Month = {May},
Year = {2020},
}
```
| 11,494 | [ [ …
-0.0830078125,
0.026275634765625,
0.004573822021484375,
-0.0001506805419921875,
-0.05670166015625,
-0.030242919921875,
-0.00952911376953125,
0.0178680419921875,
-0.02099609375,
0.046173095703125,
-0.029815673828125,
-0.0038204193115234375,
0.0142974853515625,
0.0318603515625,
-0.017364501953125,
0.039459228515625,
-0.0013971328735351562,
0.05926513671875,
0.04119873046875,
-0.04669189453125,
0.0231170654296875,
0.054718017578125,
-0.046478271484375,
0.03070068359375,
-0.04925537109375,
0.00984954833984375,
0.01352691650390625,
0.012298583984375,
-0.0777587890625,
-0.01456451416015625,
0.05010986328125,
-0.05877685546875,
0.028411865234375,
-0.0247650146484375,
-0.01983642578125,
-0.038970947265625,
-0.029388427734375,
0.00919342041015625,
0.0217437744140625,
-0.0333251953125,
0.0279541015625,
0.0341796875,
-0.0308837890625,
-0.044281005859375,
-0.0404052734375,
0.008941650390625,
-0.0247802734375,
-0.04876708984375,
0.054595947265625,
-0.01558685302734375,
-0.035369873046875,
-0.007724761962890625,
0.0015592575073242188,
0.01357269287109375,
-0.0178375244140625,
0.019012451171875,
0.00975799560546875,
-0.01617431640625,
0.0028171539306640625,
-0.01415252685546875,
-0.003940582275390625,
-0.0167694091796875,
-0.0135040283203125,
0.044891357421875,
-0.011383056640625,
-0.013824462890625,
-0.057098388671875,
0.026153564453125,
0.05242919921875,
-0.036376953125,
0.044403076171875,
0.0665283203125,
-0.015838623046875,
-0.0021877288818359375,
-0.050567626953125,
0.0011119842529296875,
-0.03179931640625,
0.022308349609375,
-0.029205322265625,
-0.055206298828125,
0.042236328125,
0.0030155181884765625,
0.0210723876953125,
0.038330078125,
0.0494384765625,
-0.00655364990234375,
0.045379638671875,
0.01201629638671875,
-0.0155487060546875,
0.042694091796875,
-0.032440185546875,
0.0220794677734375,
-0.08233642578125,
-0.0191497802734375,
-0.062408447265625,
0.0008535385131835938,
-0.0650634765625,
-0.03448486328125,
0.024658203125,
0.01201629638671875,
0.001323699951171875,
0.042083740234375,
-0.053009033203125,
0.007427215576171875,
0.04034423828125,
-0.0037174224853515625,
0.003314971923828125,
0.007312774658203125,
-0.023468017578125,
-0.0051727294921875,
-0.0257568359375,
-0.048828125,
0.0748291015625,
0.03143310546875,
0.015228271484375,
0.018768310546875,
0.0516357421875,
0.0201263427734375,
-0.0174560546875,
-0.0494384765625,
0.044647216796875,
-0.007488250732421875,
-0.04461669921875,
-0.037322998046875,
-0.03326416015625,
-0.0767822265625,
0.0305633544921875,
-0.002376556396484375,
-0.07861328125,
0.02545166015625,
0.005756378173828125,
-0.025115966796875,
0.0196075439453125,
-0.059112548828125,
0.06793212890625,
0.005672454833984375,
-0.01245880126953125,
-0.0199432373046875,
-0.05792236328125,
0.016693115234375,
0.0091552734375,
0.038604736328125,
-0.0269775390625,
0.0199432373046875,
0.08929443359375,
-0.027862548828125,
0.046173095703125,
-0.02642822265625,
-0.0007653236389160156,
0.03887939453125,
-0.012359619140625,
0.0272369384765625,
0.006855010986328125,
-0.0236663818359375,
0.0171051025390625,
0.0157623291015625,
-0.024444580078125,
-0.0153656005859375,
0.06475830078125,
-0.0694580078125,
-0.048187255859375,
-0.02374267578125,
-0.020538330078125,
-0.00222015380859375,
0.01020050048828125,
0.021636962890625,
0.0328369140625,
-0.00678253173828125,
0.0193023681640625,
0.05096435546875,
-0.0284423828125,
0.05010986328125,
0.033233642578125,
-0.00630950927734375,
-0.06329345703125,
0.0592041015625,
0.00844573974609375,
0.011993408203125,
0.032928466796875,
0.016265869140625,
-0.016632080078125,
-0.05303955078125,
-0.034027099609375,
0.0261383056640625,
-0.026031494140625,
-0.0004639625549316406,
-0.06396484375,
-0.0212860107421875,
-0.05841064453125,
0.023529052734375,
-0.035888671875,
-0.049041748046875,
-0.025787353515625,
-0.007709503173828125,
0.02874755859375,
0.019073486328125,
-0.0211029052734375,
0.0188446044921875,
-0.0474853515625,
0.0213623046875,
0.01226806640625,
0.015472412109375,
-0.0179595947265625,
-0.0426025390625,
-0.03253173828125,
0.01279449462890625,
-0.029052734375,
-0.0628662109375,
0.0419921875,
0.037017822265625,
0.049835205078125,
0.01477813720703125,
-0.004566192626953125,
0.050048828125,
-0.02557373046875,
0.0823974609375,
0.0124359130859375,
-0.0789794921875,
0.05828857421875,
-0.043701171875,
0.012786865234375,
0.04248046875,
0.0188751220703125,
-0.04913330078125,
-0.0167999267578125,
-0.04791259765625,
-0.0830078125,
0.073974609375,
0.033111572265625,
-0.002010345458984375,
-0.0091552734375,
0.00652313232421875,
-0.006130218505859375,
0.01016998291015625,
-0.0450439453125,
-0.054412841796875,
-0.02166748046875,
-0.003780364990234375,
-0.01219940185546875,
-0.01904296875,
-0.004581451416015625,
-0.044036865234375,
0.07525634765625,
0.01055145263671875,
0.042572021484375,
0.041748046875,
0.0037670135498046875,
-0.016937255859375,
0.03271484375,
0.053802490234375,
0.0226898193359375,
-0.0396728515625,
0.002445220947265625,
0.0035991668701171875,
-0.045989990234375,
0.001773834228515625,
0.00829315185546875,
-0.004802703857421875,
0.0241851806640625,
0.03271484375,
0.062408447265625,
0.006381988525390625,
-0.0304412841796875,
0.037628173828125,
-0.007572174072265625,
-0.03094482421875,
-0.044952392578125,
-0.01201629638671875,
0.0245208740234375,
0.01363372802734375,
0.0305328369140625,
-0.0023059844970703125,
-0.0008478164672851562,
-0.03607177734375,
0.0215606689453125,
0.01239013671875,
-0.0118560791015625,
-0.026824951171875,
0.0413818359375,
0.01517486572265625,
-0.040313720703125,
0.037322998046875,
-0.0197296142578125,
-0.044647216796875,
0.04290771484375,
0.0264129638671875,
0.06793212890625,
-0.0452880859375,
0.01296234130859375,
0.060546875,
0.0211181640625,
0.01392364501953125,
0.032318115234375,
-0.023223876953125,
-0.048004150390625,
-0.01514434814453125,
-0.0712890625,
-0.0176544189453125,
0.0171966552734375,
-0.033721923828125,
0.019561767578125,
-0.0303955078125,
-0.0257720947265625,
0.02752685546875,
0.024566650390625,
-0.0230255126953125,
0.005329132080078125,
0.00667572021484375,
0.061126708984375,
-0.05859375,
0.057830810546875,
0.044952392578125,
-0.0244293212890625,
-0.07586669921875,
-0.001132965087890625,
0.004650115966796875,
-0.02093505859375,
0.024505615234375,
-0.00403594970703125,
0.00814056396484375,
-0.0019245147705078125,
-0.01953125,
-0.061553955078125,
0.0782470703125,
0.0188751220703125,
-0.06329345703125,
0.0194854736328125,
-0.007598876953125,
0.03887939453125,
-0.01519775390625,
0.0241546630859375,
0.0477294921875,
0.058502197265625,
0.004108428955078125,
-0.10858154296875,
0.00467681884765625,
-0.058258056640625,
-0.0093994140625,
0.00807952880859375,
-0.06494140625,
0.06536865234375,
-0.002536773681640625,
-0.018798828125,
0.0091094970703125,
0.048675537109375,
0.024658203125,
0.033233642578125,
0.0418701171875,
0.054901123046875,
0.051483154296875,
-0.00843048095703125,
0.04345703125,
-0.03326416015625,
0.0243072509765625,
0.07403564453125,
0.0026702880859375,
0.06561279296875,
0.040496826171875,
-0.042327880859375,
0.03790283203125,
0.06353759765625,
-0.00791168212890625,
0.040985107421875,
0.0188140869140625,
-0.030792236328125,
-0.0027618408203125,
-0.018310546875,
-0.04364013671875,
0.047149658203125,
0.0296478271484375,
-0.022857666015625,
0.0257568359375,
-0.018463134765625,
0.0205841064453125,
0.00896453857421875,
-0.003261566162109375,
0.047943115234375,
0.01318359375,
-0.0445556640625,
0.058380126953125,
-0.0009889602661132812,
0.059906005859375,
-0.0330810546875,
0.005832672119140625,
0.0017633438110351562,
0.008453369140625,
-0.042694091796875,
-0.03302001953125,
0.0303955078125,
-0.005634307861328125,
-0.0164947509765625,
-0.01399993896484375,
0.033660888671875,
-0.048065185546875,
-0.0216217041015625,
0.0305328369140625,
0.028961181640625,
0.03387451171875,
0.0223236083984375,
-0.0411376953125,
0.00492095947265625,
0.0171356201171875,
-0.0288543701171875,
0.01549530029296875,
0.0233306884765625,
0.006633758544921875,
0.026947021484375,
0.0538330078125,
0.031982421875,
0.0171356201171875,
0.01309967041015625,
0.055328369140625,
-0.041229248046875,
-0.0328369140625,
-0.063720703125,
0.032440185546875,
-0.017333984375,
-0.0293426513671875,
0.07525634765625,
0.056884765625,
0.0614013671875,
0.006580352783203125,
0.04986572265625,
-0.032958984375,
0.06488037109375,
-0.0196990966796875,
0.0601806640625,
-0.031097412109375,
0.024871826171875,
-0.0516357421875,
-0.057403564453125,
-0.0106658935546875,
0.04693603515625,
-0.0218505859375,
0.0010805130004882812,
0.047149658203125,
0.072998046875,
0.010101318359375,
0.00604248046875,
0.0107269287109375,
0.031890869140625,
0.0260162353515625,
0.03436279296875,
0.043243408203125,
-0.040435791015625,
0.0399169921875,
-0.04071044921875,
-0.0133514404296875,
-0.0121307373046875,
-0.0458984375,
-0.05731201171875,
-0.06768798828125,
-0.0516357421875,
-0.027740478515625,
0.00033092498779296875,
0.091064453125,
0.055816650390625,
-0.06134033203125,
-0.034271240234375,
0.007373809814453125,
0.0160675048828125,
-0.0290985107421875,
-0.01549530029296875,
0.051300048828125,
0.0131072998046875,
-0.056427001953125,
0.046173095703125,
0.01055908203125,
-0.0045623779296875,
0.004413604736328125,
-0.0185546875,
-0.04656982421875,
0.004611968994140625,
0.0180816650390625,
0.030670166015625,
-0.05255126953125,
-0.014617919921875,
-0.0241241455078125,
-0.006076812744140625,
0.0178985595703125,
0.0287017822265625,
-0.0272064208984375,
0.03790283203125,
0.049530029296875,
0.003047943115234375,
0.043670654296875,
-0.0008530616760253906,
0.01678466796875,
-0.04656982421875,
0.0013580322265625,
0.01629638671875,
0.0205078125,
0.03521728515625,
-0.0166168212890625,
0.038665771484375,
0.032623291015625,
-0.05157470703125,
-0.06976318359375,
-0.0172576904296875,
-0.0848388671875,
0.0030307769775390625,
0.08795166015625,
-0.0230255126953125,
-0.0219573974609375,
-0.01474761962890625,
-0.028564453125,
0.0426025390625,
-0.04754638671875,
0.05718994140625,
0.052581787109375,
-0.00775909423828125,
-0.00360107421875,
-0.04248046875,
0.0474853515625,
0.0301971435546875,
-0.048187255859375,
0.003925323486328125,
0.0170440673828125,
0.0211029052734375,
0.038299560546875,
0.07476806640625,
-0.019195556640625,
0.0052642822265625,
0.0067596435546875,
0.0223388671875,
-0.0026035308837890625,
-0.001773834228515625,
-0.0159454345703125,
-0.005657196044921875,
-0.00746917724609375,
-0.03271484375
]
] |
prajjwal1/bert-small | 2021-10-27T18:31:52.000Z | [
"transformers",
"pytorch",
"BERT",
"MNLI",
"NLI",
"transformer",
"pre-training",
"en",
"arxiv:1908.08962",
"arxiv:2110.01518",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | prajjwal1 | null | null | prajjwal1/bert-small | 12 | 2,935,794 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
license:
- mit
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
---
The following model is a PyTorch pre-trained model obtained by converting the TensorFlow checkpoint found in the [official Google BERT repository](https://github.com/google-research/bert).
This is one of the smaller pre-trained BERT variants, together with [bert-tiny](https://huggingface.co/prajjwal1/bert-tiny), [bert-mini](https://huggingface.co/prajjwal1/bert-mini), and [bert-medium](https://huggingface.co/prajjwal1/bert-medium). They were introduced in the study `Well-Read Students Learn Better: On the Importance of Pre-training Compact Models` ([arXiv](https://arxiv.org/abs/1908.08962)), and ported to HF for the study `Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics` ([arXiv](https://arxiv.org/abs/2110.01518)). These models are intended to be fine-tuned on a downstream task.
If you use the model, please consider citing both papers:
```
@misc{bhargava2021generalization,
title={Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics},
author={Prajjwal Bhargava and Aleksandr Drozd and Anna Rogers},
year={2021},
eprint={2110.01518},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@article{DBLP:journals/corr/abs-1908-08962,
author = {Iulia Turc and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {Well-Read Students Learn Better: The Impact of Student Initialization
on Knowledge Distillation},
journal = {CoRR},
volume = {abs/1908.08962},
year = {2019},
url = {http://arxiv.org/abs/1908.08962},
eprinttype = {arXiv},
eprint = {1908.08962},
timestamp = {Thu, 29 Aug 2019 16:32:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1908-08962.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
Config of this model:
- `prajjwal1/bert-small` (L=4, H=512) [Model Link](https://huggingface.co/prajjwal1/bert-small)
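As a hedged sketch of what the (L=4, H=512) configuration means in `transformers` terms, the snippet below instantiates a randomly initialized model with the assumed bert-small shape locally rather than downloading the checkpoint. The head count and intermediate size here follow the usual BERT ratios and are assumptions; check the hosted `config.json` for the exact values.

```python
from transformers import BertConfig, BertModel

# Assumed bert-small shape: 4 layers, hidden size 512.
config = BertConfig(
    num_hidden_layers=4,
    hidden_size=512,
    num_attention_heads=8,   # assumption: 512 / 64-dim heads
    intermediate_size=2048,  # assumption: 4 * hidden_size
)
model = BertModel(config)  # randomly initialized, for shape inspection only
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")
```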
Other models to check out:
- `prajjwal1/bert-tiny` (L=2, H=128) [Model Link](https://huggingface.co/prajjwal1/bert-tiny)
- `prajjwal1/bert-mini` (L=4, H=256) [Model Link](https://huggingface.co/prajjwal1/bert-mini)
- `prajjwal1/bert-medium` (L=8, H=512) [Model Link](https://huggingface.co/prajjwal1/bert-medium)
The original implementation and more info can be found in [this GitHub repository](https://github.com/prajjwal1/generalize_lm_nli).
Twitter: [@prajjwal_1](https://twitter.com/prajjwal_1)
| 2,572 | [
[
-0.031707763671875,
-0.041748046875,
0.034271240234375,
-0.0016889572143554688,
-0.01259613037109375,
-0.0216522216796875,
-0.02337646484375,
-0.032440185546875,
0.0079498291015625,
0.0124664306640625,
-0.054656982421875,
-0.024566650390625,
-0.038116455078125,
-0.0103607177734375,
-0.0257415771484375,
0.09197998046875,
0.00551605224609375,
0.005062103271484375,
-0.0129241943359375,
-0.0181427001953125,
-0.01230621337890625,
-0.04083251953125,
-0.0439453125,
-0.037384033203125,
0.055419921875,
-0.0015869140625,
0.0362548828125,
0.0157623291015625,
0.046630859375,
0.020355224609375,
-0.029571533203125,
-0.00731658935546875,
-0.037200927734375,
-0.019317626953125,
0.00385284423828125,
-0.032867431640625,
-0.04278564453125,
0.0078582763671875,
0.055816650390625,
0.0701904296875,
-0.00885009765625,
0.0273284912109375,
0.02142333984375,
0.0455322265625,
-0.04254150390625,
-0.003997802734375,
-0.023681640625,
-0.01568603515625,
-0.0121917724609375,
0.0208587646484375,
-0.0413818359375,
-0.0251312255859375,
0.037506103515625,
-0.03997802734375,
0.040985107421875,
-0.0009937286376953125,
0.10369873046875,
0.01145172119140625,
-0.0162353515625,
-0.00881195068359375,
-0.0477294921875,
0.0728759765625,
-0.0712890625,
0.03912353515625,
0.0022563934326171875,
0.0196990966796875,
-0.001491546630859375,
-0.07269287109375,
-0.043365478515625,
-0.00528717041015625,
-0.034088134765625,
0.0108642578125,
-0.0281219482421875,
0.011810302734375,
0.0294036865234375,
0.028472900390625,
-0.043701171875,
0.006011962890625,
-0.043121337890625,
-0.0194549560546875,
0.0303955078125,
-0.0013456344604492188,
-0.00234222412109375,
-0.0302581787109375,
-0.0255889892578125,
-0.033050537109375,
-0.0440673828125,
0.0186614990234375,
0.039764404296875,
0.02874755859375,
-0.0302581787109375,
0.031219482421875,
-0.000058770179748535156,
0.062744140625,
0.010589599609375,
-0.002445220947265625,
0.0321044921875,
-0.04913330078125,
-0.01087188720703125,
-0.015411376953125,
0.059661865234375,
0.0058441162109375,
0.008026123046875,
-0.005523681640625,
-0.006778717041015625,
-0.0296630859375,
0.0118408203125,
-0.07794189453125,
-0.03076171875,
0.01274871826171875,
-0.05316162109375,
-0.0021572113037109375,
0.0129852294921875,
-0.047088623046875,
-0.004314422607421875,
-0.025054931640625,
0.0384521484375,
-0.039398193359375,
-0.0200042724609375,
-0.0107269287109375,
-0.0012969970703125,
0.0312347412109375,
0.0292510986328125,
-0.049041748046875,
0.005443572998046875,
0.03521728515625,
0.07000732421875,
0.006595611572265625,
-0.01739501953125,
0.0008478164672851562,
0.00385284423828125,
-0.0145111083984375,
0.0307769775390625,
-0.0162811279296875,
-0.013763427734375,
-0.0059661865234375,
-0.0023860931396484375,
-0.0117645263671875,
-0.0261993408203125,
0.05072021484375,
-0.0379638671875,
0.026519775390625,
-0.02630615234375,
-0.04071044921875,
-0.01947021484375,
0.013214111328125,
-0.046417236328125,
0.0750732421875,
0.0014505386352539062,
-0.07110595703125,
0.038970947265625,
-0.04827880859375,
-0.0160064697265625,
-0.014129638671875,
0.0104217529296875,
-0.052581787109375,
-0.0006728172302246094,
0.0144500732421875,
0.03857421875,
-0.01422119140625,
0.0240478515625,
-0.035491943359375,
-0.026092529296875,
-0.007572174072265625,
-0.0060577392578125,
0.0892333984375,
0.0224456787109375,
-0.00446319580078125,
0.0152587890625,
-0.062286376953125,
0.00800323486328125,
0.0133209228515625,
-0.0285797119140625,
-0.03704833984375,
-0.00948333740234375,
-0.0008606910705566406,
0.002117156982421875,
0.027099609375,
-0.0302581787109375,
0.0263824462890625,
-0.027313232421875,
0.030731201171875,
0.04718017578125,
0.006153106689453125,
0.036468505859375,
-0.039520263671875,
0.007236480712890625,
0.01187896728515625,
0.02423095703125,
0.00272369384765625,
-0.037750244140625,
-0.0755615234375,
-0.038543701171875,
0.0423583984375,
0.0208282470703125,
-0.04254150390625,
0.044219970703125,
-0.0231170654296875,
-0.0528564453125,
-0.045501708984375,
0.0153656005859375,
0.023681640625,
0.036712646484375,
0.033447265625,
-0.012542724609375,
-0.054840087890625,
-0.0653076171875,
-0.016265869140625,
-0.02545166015625,
-0.015655517578125,
0.0251617431640625,
0.051025390625,
-0.040069580078125,
0.0792236328125,
-0.0293121337890625,
-0.02239990234375,
-0.024566650390625,
0.026123046875,
0.05194091796875,
0.0640869140625,
0.061126708984375,
-0.039794921875,
-0.031524658203125,
-0.0294952392578125,
-0.041839599609375,
0.00873565673828125,
-0.015411376953125,
-0.0212554931640625,
0.01241302490234375,
0.030364990234375,
-0.04461669921875,
0.0300445556640625,
0.0225677490234375,
-0.0271453857421875,
0.034637451171875,
-0.017303466796875,
-0.0070343017578125,
-0.084716796875,
0.0267333984375,
0.0032196044921875,
-0.00327301025390625,
-0.04144287109375,
0.0109710693359375,
0.00039196014404296875,
0.009185791015625,
-0.014068603515625,
0.049835205078125,
-0.0418701171875,
0.0035858154296875,
0.009735107421875,
-0.0110321044921875,
-0.0033893585205078125,
0.0361328125,
-0.0015125274658203125,
0.03985595703125,
0.0225830078125,
-0.034027099609375,
-0.005462646484375,
0.03338623046875,
-0.03399658203125,
0.0123443603515625,
-0.0828857421875,
0.0110321044921875,
-0.0034122467041015625,
0.031494140625,
-0.071044921875,
-0.0178680419921875,
0.0208740234375,
-0.0303955078125,
0.029022216796875,
-0.026641845703125,
-0.053466796875,
-0.033905029296875,
-0.02142333984375,
0.02685546875,
0.0567626953125,
-0.04827880859375,
0.049652099609375,
-0.00705718994140625,
-0.0017032623291015625,
-0.03582763671875,
-0.052001953125,
-0.032318115234375,
-0.0017137527465820312,
-0.05194091796875,
0.026519775390625,
-0.0197906494140625,
-0.004261016845703125,
0.01202392578125,
-0.0013751983642578125,
-0.0189971923828125,
-0.0027179718017578125,
0.01274871826171875,
0.04376220703125,
-0.02197265625,
0.0103607177734375,
0.005649566650390625,
0.017120361328125,
-0.003726959228515625,
-0.005084991455078125,
0.0439453125,
-0.0228118896484375,
-0.0132598876953125,
-0.0433349609375,
0.007843017578125,
0.0294036865234375,
-0.00226593017578125,
0.08233642578125,
0.069091796875,
-0.0277862548828125,
0.002643585205078125,
-0.048309326171875,
-0.044158935546875,
-0.03485107421875,
0.014739990234375,
-0.019287109375,
-0.056182861328125,
0.04876708984375,
0.002880096435546875,
0.016632080078125,
0.05767822265625,
0.036590576171875,
-0.0219573974609375,
0.055328369140625,
0.0592041015625,
-0.0004029273986816406,
0.06085205078125,
-0.05279541015625,
0.0197296142578125,
-0.0699462890625,
-0.01512908935546875,
-0.04510498046875,
-0.0305633544921875,
-0.046173095703125,
-0.01450347900390625,
0.0209503173828125,
0.0272216796875,
-0.03790283203125,
0.0293731689453125,
-0.04376220703125,
0.012054443359375,
0.064697265625,
0.022705078125,
0.0042572021484375,
-0.0009069442749023438,
-0.0311126708984375,
-0.0026721954345703125,
-0.07366943359375,
-0.0264739990234375,
0.10101318359375,
0.030914306640625,
0.045135498046875,
0.0224609375,
0.07867431640625,
0.0019168853759765625,
0.024017333984375,
-0.04656982421875,
0.033782958984375,
-0.003986358642578125,
-0.080078125,
-0.0191192626953125,
-0.0472412109375,
-0.076904296875,
0.0059051513671875,
-0.0289459228515625,
-0.053436279296875,
0.03900146484375,
0.00655364990234375,
-0.0484619140625,
0.01532745361328125,
-0.07183837890625,
0.05712890625,
0.0030384063720703125,
-0.035919189453125,
-0.010498046875,
-0.0526123046875,
0.0277862548828125,
0.0007967948913574219,
0.003948211669921875,
0.01104736328125,
0.0175323486328125,
0.08123779296875,
-0.04791259765625,
0.06878662109375,
-0.03076171875,
0.0195159912109375,
0.03912353515625,
-0.0147552490234375,
0.0460205078125,
0.00711822509765625,
-0.003265380859375,
0.030731201171875,
0.01207733154296875,
-0.04443359375,
-0.0185546875,
0.0418701171875,
-0.08868408203125,
-0.034881591796875,
-0.047637939453125,
-0.04736328125,
-0.006908416748046875,
0.03265380859375,
0.030426025390625,
0.0257568359375,
0.00677490234375,
0.0369873046875,
0.056182861328125,
-0.01043701171875,
0.0430908203125,
0.03436279296875,
-0.00847625732421875,
-0.0092315673828125,
0.04571533203125,
0.01073455810546875,
0.01806640625,
0.00972747802734375,
0.0134735107421875,
-0.0203399658203125,
-0.059173583984375,
-0.006183624267578125,
0.0439453125,
-0.05133056640625,
-0.0024871826171875,
-0.047393798828125,
-0.035552978515625,
-0.04296875,
-0.01904296875,
-0.026214599609375,
-0.0157012939453125,
-0.037322998046875,
0.003398895263671875,
0.023040771484375,
0.038177490234375,
-0.0200347900390625,
0.033843994140625,
-0.048736572265625,
0.0034313201904296875,
0.0340576171875,
0.01457977294921875,
0.01047515869140625,
-0.05633544921875,
-0.01367950439453125,
0.0026149749755859375,
-0.0158233642578125,
-0.03924560546875,
0.021270751953125,
0.021026611328125,
0.05987548828125,
0.0310516357421875,
0.0114288330078125,
0.05108642578125,
-0.022857666015625,
0.05169677734375,
0.033477783203125,
-0.042877197265625,
0.038604736328125,
-0.028900146484375,
0.0192718505859375,
0.054718017578125,
0.038299560546875,
-0.0037174224853515625,
-0.003986358642578125,
-0.06256103515625,
-0.08056640625,
0.0535888671875,
0.01309967041015625,
0.0084228515625,
0.0277557373046875,
0.032073974609375,
0.007366180419921875,
0.01220703125,
-0.06396484375,
-0.0256195068359375,
-0.01285552978515625,
-0.021148681640625,
-0.01262664794921875,
-0.038238525390625,
-0.0229034423828125,
-0.050323486328125,
0.059783935546875,
0.000029861927032470703,
0.047576904296875,
0.0238037109375,
-0.0168914794921875,
0.01377105712890625,
0.00557708740234375,
0.03717041015625,
0.0482177734375,
-0.051239013671875,
-0.01367950439453125,
-0.00027489662170410156,
-0.04034423828125,
-0.01497650146484375,
0.0252838134765625,
-0.0241546630859375,
0.0121002197265625,
0.0457763671875,
0.062286376953125,
0.0182952880859375,
-0.0182647705078125,
0.039642333984375,
0.003749847412109375,
-0.0206146240234375,
-0.0294036865234375,
0.0024623870849609375,
-0.0008211135864257812,
0.030548095703125,
0.0286407470703125,
0.0208282470703125,
0.00774383544921875,
-0.035430908203125,
0.006832122802734375,
0.018890380859375,
-0.019378662109375,
-0.02252197265625,
0.050811767578125,
0.018768310546875,
0.00487518310546875,
0.057891845703125,
-0.0233917236328125,
-0.029632568359375,
0.0275726318359375,
0.01812744140625,
0.054718017578125,
0.0174560546875,
0.004764556884765625,
0.06768798828125,
0.0257568359375,
-0.0093231201171875,
0.004932403564453125,
-0.01129913330078125,
-0.051025390625,
-0.0200347900390625,
-0.0662841796875,
-0.017333984375,
0.00830841064453125,
-0.054718017578125,
0.021759033203125,
-0.041656494140625,
-0.0261077880859375,
0.01187896728515625,
0.01922607421875,
-0.06787109375,
0.0051727294921875,
0.00212860107421875,
0.06109619140625,
-0.051513671875,
0.07440185546875,
0.058624267578125,
-0.04412841796875,
-0.06781005859375,
0.0016183853149414062,
-0.01108551025390625,
-0.045806884765625,
0.053619384765625,
-0.01123809814453125,
0.0199432373046875,
0.00997161865234375,
-0.039154052734375,
-0.06756591796875,
0.09783935546875,
0.0177764892578125,
-0.0615234375,
-0.0272979736328125,
-0.01274871826171875,
0.039337158203125,
-0.00510406494140625,
0.0310211181640625,
0.026031494140625,
0.0278778076171875,
0.02813720703125,
-0.058624267578125,
0.000415802001953125,
-0.0152740478515625,
0.0018682479858398438,
0.005695343017578125,
-0.058380126953125,
0.09417724609375,
-0.0280609130859375,
0.0017795562744140625,
0.020355224609375,
0.04608154296875,
0.0311126708984375,
0.012725830078125,
0.035552978515625,
0.055511474609375,
0.057403564453125,
-0.02587890625,
0.08099365234375,
-0.014556884765625,
0.05718994140625,
0.08038330078125,
0.0208587646484375,
0.05780029296875,
0.052581787109375,
-0.0291595458984375,
0.0467529296875,
0.060394287109375,
-0.0162353515625,
0.051177978515625,
0.0064544677734375,
0.00954437255859375,
-0.0223541259765625,
0.01922607421875,
-0.046966552734375,
0.00812530517578125,
0.00872039794921875,
-0.032135009765625,
-0.0162200927734375,
-0.0152740478515625,
0.0104827880859375,
-0.027923583984375,
-0.02197265625,
0.044281005859375,
0.002979278564453125,
-0.031341552734375,
0.05767822265625,
-0.0179595947265625,
0.06964111328125,
-0.059051513671875,
0.0148468017578125,
-0.010894775390625,
0.0301361083984375,
-0.00859832763671875,
-0.031829833984375,
0.0183563232421875,
-0.0017557144165039062,
-0.03173828125,
-0.0142822265625,
0.058380126953125,
-0.01293182373046875,
-0.05108642578125,
0.02081298828125,
0.0361328125,
0.0086822509765625,
0.01629638671875,
-0.06451416015625,
0.004528045654296875,
0.0005903244018554688,
-0.039520263671875,
0.0247802734375,
0.01230621337890625,
0.013153076171875,
0.03399658203125,
0.0574951171875,
-0.0035724639892578125,
0.0267791748046875,
-0.004352569580078125,
0.060821533203125,
-0.02685546875,
-0.028839111328125,
-0.04229736328125,
0.050567626953125,
-0.0167388916015625,
-0.04443359375,
0.0501708984375,
0.032135009765625,
0.07879638671875,
-0.00905609130859375,
0.045745849609375,
-0.0264129638671875,
0.0462646484375,
-0.0284881591796875,
0.075927734375,
-0.05975341796875,
0.011566162109375,
-0.0235748291015625,
-0.06683349609375,
-0.01137542724609375,
0.05694580078125,
-0.041656494140625,
0.032012939453125,
0.044189453125,
0.036590576171875,
-0.0007467269897460938,
-0.0199127197265625,
0.005611419677734375,
0.0312347412109375,
0.0206451416015625,
0.031829833984375,
0.04205322265625,
-0.04205322265625,
0.0404052734375,
-0.0321044921875,
-0.00917816162109375,
-0.03912353515625,
-0.049560546875,
-0.084228515625,
-0.049560546875,
-0.0298004150390625,
-0.0307464599609375,
0.0012502670288085938,
0.060089111328125,
0.07177734375,
-0.0760498046875,
-0.004924774169921875,
-0.0121002197265625,
-0.0012502670288085938,
-0.0107269287109375,
-0.0156707763671875,
0.032012939453125,
-0.019256591796875,
-0.051513671875,
-0.00379180908203125,
-0.0305023193359375,
0.020965576171875,
-0.00728607177734375,
-0.0184326171875,
-0.03875732421875,
0.00567626953125,
0.02618408203125,
0.020965576171875,
-0.04840087890625,
-0.0295867919921875,
-0.00510406494140625,
-0.0128326416015625,
-0.011199951171875,
0.0389404296875,
-0.045379638671875,
0.0231170654296875,
0.03955078125,
0.03369140625,
0.053558349609375,
-0.0233306884765625,
0.0131378173828125,
-0.059661865234375,
0.03271484375,
0.0212249755859375,
0.036376953125,
0.0130462646484375,
-0.007526397705078125,
0.04779052734375,
0.0286407470703125,
-0.040252685546875,
-0.08306884765625,
-0.00388336181640625,
-0.0850830078125,
-0.0128936767578125,
0.079345703125,
-0.031402587890625,
-0.01357269287109375,
0.0238037109375,
-0.0034008026123046875,
0.0284423828125,
-0.0272064208984375,
0.051910400390625,
0.060302734375,
-0.0014896392822265625,
-0.012725830078125,
-0.040557861328125,
0.030426025390625,
0.028228759765625,
-0.043701171875,
-0.025970458984375,
0.016021728515625,
0.027984619140625,
0.029571533203125,
0.0245208740234375,
0.006855010986328125,
0.01462554931640625,
-0.001956939697265625,
0.0199127197265625,
-0.00904083251953125,
-0.019622802734375,
-0.00708770751953125,
-0.00363922119140625,
-0.0033779144287109375,
-0.01160430908203125
]
] |
facebook/bart-large-mnli | 2023-09-05T14:49:34.000Z | [
"transformers",
"pytorch",
"jax",
"rust",
"safetensors",
"bart",
"text-classification",
"zero-shot-classification",
"dataset:multi_nli",
"arxiv:1910.13461",
"arxiv:1909.00161",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | facebook | null | null | facebook/bart-large-mnli | 726 | 2,909,045 | transformers | 2022-03-02T23:29:05 | ---
license: mit
thumbnail: https://huggingface.co/front/thumbnails/facebook.png
pipeline_tag: zero-shot-classification
datasets:
- multi_nli
---
# bart-large-mnli
This is the checkpoint for [bart-large](https://huggingface.co/facebook/bart-large) after being trained on the [MultiNLI (MNLI)](https://huggingface.co/datasets/multi_nli) dataset.
Additional information about this model:
- The [bart-large](https://huggingface.co/facebook/bart-large) model page
- [BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
](https://arxiv.org/abs/1910.13461)
- [BART fairseq implementation](https://github.com/pytorch/fairseq/tree/master/fairseq/models/bart)
## NLI-based Zero Shot Text Classification
[Yin et al.](https://arxiv.org/abs/1909.00161) proposed a method for using pre-trained NLI models as ready-made zero-shot sequence classifiers. The method works by posing the sequence to be classified as the NLI premise and constructing a hypothesis from each candidate label. For example, to evaluate whether a sequence belongs to the class "politics", we could construct the hypothesis `This text is about politics.`. The probabilities for entailment and contradiction are then converted to label probabilities.
This method is surprisingly effective in many cases, particularly when used with larger pre-trained models like BART and RoBERTa. See [this blog post](https://joeddav.github.io/blog/2020/05/29/ZSL.html) for a more expansive introduction to this and other zero-shot methods, and see the code snippets below for examples of using this model for zero-shot classification both with Hugging Face's built-in pipeline and with native Transformers/PyTorch code.
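As a rough numeric sketch of the label-probability conversion described above (single-label mode): the entailment logits below are made up for illustration; in practice each one would come from a separate premise/hypothesis forward pass through the NLI model.

```python
import torch

# Hypothetical entailment logits, one per candidate label
# (each would come from its own NLI forward pass in practice).
entail_logits = torch.tensor([5.1, -1.3, -2.0])  # "travel", "cooking", "dancing"

# Single-label mode: softmax the entailment logits across the labels
# so the candidate probabilities sum to 1.
label_probs = entail_logits.softmax(dim=0)
print(label_probs)
```

In multi-label mode, each label is instead scored independently by normalizing entailment against contradiction for that label alone, which is what the manual PyTorch snippet further down does for a single label.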
#### With the zero-shot classification pipeline
The model can be loaded with the `zero-shot-classification` pipeline like so:
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification",
model="facebook/bart-large-mnli")
```
You can then use this pipeline to classify sequences into any of the class names you specify.
```python
sequence_to_classify = "one day I will see the world"
candidate_labels = ['travel', 'cooking', 'dancing']
classifier(sequence_to_classify, candidate_labels)
#{'labels': ['travel', 'dancing', 'cooking'],
# 'scores': [0.9938651323318481, 0.0032737774308770895, 0.002861034357920289],
# 'sequence': 'one day I will see the world'}
```
If more than one candidate label can be correct, pass `multi_label=True` to calculate each class independently:
```python
candidate_labels = ['travel', 'cooking', 'dancing', 'exploration']
classifier(sequence_to_classify, candidate_labels, multi_label=True)
#{'labels': ['travel', 'exploration', 'dancing', 'cooking'],
# 'scores': [0.9945111274719238,
# 0.9383890628814697,
# 0.0057061901316046715,
# 0.0018193122232332826],
# 'sequence': 'one day I will see the world'}
```
#### With manual PyTorch
```python
# pose sequence as an NLI premise and label as a hypothesis
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

device = 'cuda' if torch.cuda.is_available() else 'cpu'
nli_model = AutoModelForSequenceClassification.from_pretrained('facebook/bart-large-mnli').to(device)
tokenizer = AutoTokenizer.from_pretrained('facebook/bart-large-mnli')

sequence = "one day I will see the world"
label = "travel"
premise = sequence
hypothesis = f'This example is {label}.'

# run through model pre-trained on MNLI
x = tokenizer.encode(premise, hypothesis, return_tensors='pt',
                     truncation_strategy='only_first')
logits = nli_model(x.to(device))[0]

# we throw away "neutral" (dim 1) and take the probability of
# "entailment" (2) as the probability of the label being true
entail_contradiction_logits = logits[:, [0, 2]]
probs = entail_contradiction_logits.softmax(dim=1)
prob_label_is_true = probs[:, 1]
```
| 3,793 | [
[
-0.0284423828125,
-0.043670654296875,
0.0249481201171875,
0.01000213623046875,
-0.0018644332885742188,
-0.0100250244140625,
0.0018339157104492188,
-0.0286712646484375,
0.0241241455078125,
0.0256500244140625,
-0.05023193359375,
-0.049041748046875,
-0.03277587890625,
0.01373291015625,
-0.0251922607421875,
0.09002685546875,
0.0045013427734375,
-0.0181427001953125,
0.0021533966064453125,
-0.0118408203125,
-0.024017333984375,
-0.037933349609375,
-0.036895751953125,
-0.0243377685546875,
0.04437255859375,
0.017822265625,
0.041656494140625,
0.0244140625,
0.0258026123046875,
0.0227508544921875,
-0.01259613037109375,
-0.01427459716796875,
-0.0158233642578125,
-0.01107025146484375,
0.006298065185546875,
-0.043670654296875,
-0.0302886962890625,
0.01422119140625,
0.053924560546875,
0.044708251953125,
-0.0009746551513671875,
0.0361328125,
-0.0092926025390625,
0.034820556640625,
-0.04998779296875,
0.002498626708984375,
-0.03521728515625,
0.0249176025390625,
-0.0025653839111328125,
-0.007015228271484375,
-0.035888671875,
-0.021240234375,
0.01154327392578125,
-0.04840087890625,
0.02044677734375,
-0.005596160888671875,
0.0867919921875,
0.015960693359375,
-0.0308380126953125,
-0.00562286376953125,
-0.032958984375,
0.068359375,
-0.07208251953125,
0.0122528076171875,
0.016387939453125,
0.0162811279296875,
-0.00937652587890625,
-0.046722412109375,
-0.071044921875,
0.0003027915954589844,
-0.0025806427001953125,
0.02288818359375,
0.0009889602661132812,
0.00409698486328125,
0.041656494140625,
0.029296875,
-0.0704345703125,
-0.0077972412109375,
-0.025909423828125,
-0.0207977294921875,
0.05859375,
0.004833221435546875,
0.01904296875,
-0.036407470703125,
-0.023681640625,
-0.0312347412109375,
-0.034454345703125,
0.007358551025390625,
0.01708984375,
0.021392822265625,
-0.020477294921875,
0.04559326171875,
-0.0261688232421875,
0.0673828125,
0.004638671875,
-0.02978515625,
0.06243896484375,
-0.0209808349609375,
-0.0287017822265625,
0.01568603515625,
0.0711669921875,
0.031585693359375,
0.01348114013671875,
0.01074981689453125,
-0.00027370452880859375,
0.01097869873046875,
-0.01617431640625,
-0.072021484375,
-0.00859832763671875,
0.0248260498046875,
-0.031494140625,
-0.039306640625,
0.0049896240234375,
-0.044769287109375,
-0.0112457275390625,
-0.0242462158203125,
0.06787109375,
-0.03662109375,
-0.015960693359375,
0.0173797607421875,
-0.021942138671875,
0.0256195068359375,
0.0152587890625,
-0.03973388671875,
-0.007198333740234375,
0.036376953125,
0.07342529296875,
0.022216796875,
-0.026763916015625,
-0.0253143310546875,
-0.00475311279296875,
-0.0184173583984375,
0.035003662109375,
-0.0176849365234375,
-0.002361297607421875,
-0.011993408203125,
0.0145111083984375,
-0.01526641845703125,
-0.02203369140625,
0.036651611328125,
-0.03338623046875,
0.0306243896484375,
0.0033397674560546875,
-0.049896240234375,
-0.029144287109375,
0.035888671875,
-0.03472900390625,
0.05706787109375,
-0.003025054931640625,
-0.076171875,
0.0169219970703125,
-0.050048828125,
-0.0310821533203125,
-0.0015230178833007812,
0.001911163330078125,
-0.043609619140625,
-0.00786590576171875,
0.00966644287109375,
0.038421630859375,
0.0008101463317871094,
0.034698486328125,
-0.0291290283203125,
-0.03619384765625,
0.0252532958984375,
-0.029144287109375,
0.09136962890625,
0.02520751953125,
-0.0310516357421875,
0.0096588134765625,
-0.06658935546875,
0.0158233642578125,
0.01629638671875,
-0.0186309814453125,
-0.0114593505859375,
-0.024505615234375,
0.00839996337890625,
0.02337646484375,
0.00885772705078125,
-0.0523681640625,
0.0189361572265625,
-0.036407470703125,
0.043365478515625,
0.0391845703125,
0.00914764404296875,
0.019683837890625,
-0.0247802734375,
0.021392822265625,
0.009979248046875,
0.01531982421875,
-0.0355224609375,
-0.052703857421875,
-0.06884765625,
-0.0033817291259765625,
0.036895751953125,
0.061920166015625,
-0.057952880859375,
0.0645751953125,
-0.0182037353515625,
-0.050140380859375,
-0.03173828125,
-0.0189361572265625,
0.0228118896484375,
0.03509521484375,
0.03173828125,
-0.025787353515625,
-0.054412841796875,
-0.044921875,
-0.005084991455078125,
-0.0158538818359375,
-0.007572174072265625,
0.0033092498779296875,
0.0445556640625,
-0.02398681640625,
0.07476806640625,
-0.038818359375,
-0.017669677734375,
-0.013427734375,
0.0212554931640625,
0.05615234375,
0.04742431640625,
0.034027099609375,
-0.050262451171875,
-0.03326416015625,
-0.0203857421875,
-0.07623291015625,
0.0092315673828125,
-0.0264434814453125,
-0.019287109375,
0.0250244140625,
0.0264434814453125,
-0.04193115234375,
0.051055908203125,
0.017333984375,
-0.031097412109375,
0.039337158203125,
0.00313568115234375,
-0.004528045654296875,
-0.0792236328125,
0.01202392578125,
0.0062103271484375,
-0.01100921630859375,
-0.058135986328125,
0.007354736328125,
-0.0015325546264648438,
-0.01277923583984375,
-0.037017822265625,
0.04248046875,
-0.019683837890625,
0.0016460418701171875,
-0.00777435302734375,
-0.0023345947265625,
0.0055389404296875,
0.044708251953125,
0.011810302734375,
0.0140838623046875,
0.06365966796875,
-0.05474853515625,
0.02191162109375,
0.03057861328125,
-0.03277587890625,
0.0269317626953125,
-0.04443359375,
-0.005157470703125,
-0.0271759033203125,
0.036285400390625,
-0.07391357421875,
-0.0243988037109375,
0.035491943359375,
-0.04638671875,
0.022369384765625,
0.0018415451049804688,
-0.03228759765625,
-0.04736328125,
-0.0170135498046875,
0.0313720703125,
0.044586181640625,
-0.045074462890625,
0.032470703125,
0.007030487060546875,
0.0160064697265625,
-0.05694580078125,
-0.06396484375,
0.00603485107421875,
-0.0156402587890625,
-0.026885986328125,
0.0243682861328125,
-0.017791748046875,
0.00334930419921875,
-0.0014429092407226562,
0.0012054443359375,
0.0036468505859375,
0.00141143798828125,
0.024932861328125,
0.03533935546875,
-0.017242431640625,
0.0006794929504394531,
-0.006267547607421875,
-0.01329803466796875,
-0.0007963180541992188,
-0.0256195068359375,
0.04437255859375,
-0.0240325927734375,
-0.01558685302734375,
-0.058837890625,
0.00555419921875,
0.0272369384765625,
-0.0005674362182617188,
0.053436279296875,
0.06396484375,
-0.032867431640625,
0.004779815673828125,
-0.038665771484375,
-0.003559112548828125,
-0.036407470703125,
0.024017333984375,
-0.0306549072265625,
-0.04803466796875,
0.0294647216796875,
0.0129547119140625,
0.01047515869140625,
0.05267333984375,
0.0232086181640625,
0.003543853759765625,
0.06707763671875,
0.037933349609375,
-0.01192474365234375,
0.031005859375,
-0.050933837890625,
0.023101806640625,
-0.051422119140625,
-0.012542724609375,
-0.01277923583984375,
-0.03436279296875,
-0.0469970703125,
-0.0308074951171875,
0.0223541259765625,
0.022705078125,
-0.046234130859375,
0.047760009765625,
-0.05218505859375,
0.044769287109375,
0.053436279296875,
0.01114654541015625,
0.01409912109375,
0.00235748291015625,
-0.0016355514526367188,
0.00677490234375,
-0.055999755859375,
-0.023101806640625,
0.08966064453125,
0.0267791748046875,
0.03668212890625,
-0.005580902099609375,
0.083251953125,
-0.00211334228515625,
0.040069580078125,
-0.0582275390625,
0.057830810546875,
-0.008636474609375,
-0.0616455078125,
-0.0258941650390625,
-0.0384521484375,
-0.0675048828125,
0.022705078125,
-0.0267181396484375,
-0.04412841796875,
0.0208282470703125,
-0.0078887939453125,
-0.043548583984375,
0.02703857421875,
-0.04705810546875,
0.0826416015625,
-0.017333984375,
-0.0142974853515625,
0.0029315948486328125,
-0.071044921875,
0.032257080078125,
-0.002285003662109375,
0.0170440673828125,
-0.022613525390625,
0.0211639404296875,
0.060028076171875,
-0.0252685546875,
0.0751953125,
-0.0259857177734375,
0.007236480712890625,
0.04632568359375,
-0.01091766357421875,
-0.0003485679626464844,
0.0033168792724609375,
-0.01219940185546875,
0.033294677734375,
0.01061248779296875,
-0.0269317626953125,
-0.034271240234375,
0.037506103515625,
-0.05792236328125,
-0.0205230712890625,
-0.04962158203125,
-0.0265655517578125,
0.01061248779296875,
0.021484375,
0.04248046875,
0.04156494140625,
-0.009002685546875,
-0.005901336669921875,
0.0294342041015625,
-0.04498291015625,
0.051239013671875,
0.026702880859375,
-0.0259857177734375,
-0.0401611328125,
0.088134765625,
0.0047454833984375,
0.01568603515625,
0.032012939453125,
0.018890380859375,
-0.0251617431640625,
-0.00577545166015625,
-0.0333251953125,
0.0259857177734375,
-0.0509033203125,
-0.030487060546875,
-0.0479736328125,
-0.04705810546875,
-0.0465087890625,
-0.00555419921875,
-0.003803253173828125,
-0.045166015625,
-0.014190673828125,
-0.0035152435302734375,
0.0224151611328125,
0.032958984375,
-0.019287109375,
0.01617431640625,
-0.0572509765625,
0.0255584716796875,
0.0167388916015625,
0.01690673828125,
0.007083892822265625,
-0.054351806640625,
-0.0016279220581054688,
0.00547027587890625,
-0.0423583984375,
-0.0650634765625,
0.0389404296875,
0.041229248046875,
0.034210205078125,
0.04400634765625,
0.0120086669921875,
0.062103271484375,
-0.043121337890625,
0.053070068359375,
0.03765869140625,
-0.07647705078125,
0.056121826171875,
-0.004932403564453125,
0.0186920166015625,
0.0286407470703125,
0.05059814453125,
-0.052734375,
-0.0325927734375,
-0.060943603515625,
-0.06463623046875,
0.059326171875,
0.0196380615234375,
0.00708770751953125,
-0.011138916015625,
0.032745361328125,
-0.00229644775390625,
0.0167083740234375,
-0.09332275390625,
-0.037811279296875,
-0.03558349609375,
-0.0321044921875,
-0.0296173095703125,
0.003314971923828125,
-0.004627227783203125,
-0.04876708984375,
0.0576171875,
-0.011993408203125,
0.033416748046875,
0.039459228515625,
-0.008331298828125,
0.0009036064147949219,
0.017852783203125,
0.025360107421875,
0.0224761962890625,
-0.0255126953125,
0.00194549560546875,
0.0148773193359375,
-0.023895263671875,
0.0197601318359375,
0.00799560546875,
-0.033935546875,
0.01629638671875,
0.035858154296875,
0.080322265625,
0.0016145706176757812,
-0.05419921875,
0.04833984375,
0.00991058349609375,
-0.0287933349609375,
-0.03759765625,
0.007503509521484375,
-0.0088043212890625,
0.0210418701171875,
0.019287109375,
0.01107025146484375,
0.0159912109375,
-0.055694580078125,
0.019500732421875,
0.019317626953125,
-0.0132904052734375,
-0.027923583984375,
0.05535888671875,
-0.007503509521484375,
-0.0214996337890625,
0.044586181640625,
-0.04278564453125,
-0.0496826171875,
0.062164306640625,
0.046295166015625,
0.068359375,
-0.0112457275390625,
0.03759765625,
0.0804443359375,
0.0095672607421875,
-0.00943756103515625,
0.0122528076171875,
-0.005405426025390625,
-0.0657958984375,
-0.038787841796875,
-0.07098388671875,
-0.0219268798828125,
0.01715087890625,
-0.047698974609375,
0.02178955078125,
-0.0435791015625,
-0.01300048828125,
0.019012451171875,
-0.00885772705078125,
-0.051025390625,
0.0205535888671875,
0.03521728515625,
0.056640625,
-0.07659912109375,
0.055511474609375,
0.044891357421875,
-0.032562255859375,
-0.057373046875,
0.001499176025390625,
-0.0027866363525390625,
-0.04736328125,
0.052520751953125,
0.0631103515625,
0.0216522216796875,
-0.0002111196517944336,
-0.054107666015625,
-0.0732421875,
0.0782470703125,
-0.00399017333984375,
-0.040863037109375,
0.0010461807250976562,
0.00902557373046875,
0.044921875,
-0.0164337158203125,
0.0289764404296875,
0.0347900390625,
0.031402587890625,
0.0245208740234375,
-0.0579833984375,
0.01081085205078125,
-0.02459716796875,
-0.00897216796875,
0.01000213623046875,
-0.039306640625,
0.06591796875,
-0.03765869140625,
-0.01861572265625,
0.01422119140625,
0.051971435546875,
0.032623291015625,
0.051239013671875,
0.05535888671875,
0.06488037109375,
0.052825927734375,
-0.009246826171875,
0.05841064453125,
-0.010223388671875,
0.044921875,
0.07080078125,
-0.015899658203125,
0.07525634765625,
0.00975799560546875,
-0.0307769775390625,
0.058685302734375,
0.062103271484375,
-0.0194854736328125,
0.04071044921875,
0.0210113525390625,
-0.01904296875,
-0.019683837890625,
0.01190948486328125,
-0.03289794921875,
0.0279388427734375,
0.020904541015625,
-0.008941650390625,
-0.0103302001953125,
0.0192413330078125,
-0.01300048828125,
-0.0263671875,
-0.0129852294921875,
0.051788330078125,
0.003673553466796875,
-0.0626220703125,
0.06671142578125,
0.0006299018859863281,
0.07183837890625,
-0.02606201171875,
0.00791168212890625,
0.00176239013671875,
0.0208587646484375,
-0.0278472900390625,
-0.06195068359375,
0.0225067138671875,
-0.00958251953125,
0.002521514892578125,
-0.004337310791015625,
0.04193115234375,
-0.04290771484375,
-0.04754638671875,
0.0129547119140625,
0.007289886474609375,
0.032623291015625,
-0.006809234619140625,
-0.06787109375,
-0.01554107666015625,
0.00846099853515625,
-0.022705078125,
0.0200042724609375,
0.0254669189453125,
0.01328277587890625,
0.039520263671875,
0.052337646484375,
-0.0222930908203125,
0.01172637939453125,
0.00885772705078125,
0.050048828125,
-0.06591796875,
-0.0323486328125,
-0.06298828125,
0.040679931640625,
-0.003459930419921875,
-0.0236358642578125,
0.050384521484375,
0.04815673828125,
0.0672607421875,
-0.01396942138671875,
0.04229736328125,
-0.0240478515625,
0.02978515625,
-0.0247802734375,
0.0413818359375,
-0.062347412109375,
-0.01119232177734375,
-0.02789306640625,
-0.059112548828125,
-0.04498291015625,
0.073486328125,
-0.0088958740234375,
-0.00438690185546875,
0.03668212890625,
0.055633544921875,
0.01096343994140625,
0.00608062744140625,
0.00881195068359375,
0.019744873046875,
0.0120391845703125,
0.0596923828125,
0.058258056640625,
-0.0650634765625,
0.03045654296875,
-0.039093017578125,
-0.0252532958984375,
-0.004459381103515625,
-0.060577392578125,
-0.07025146484375,
-0.03692626953125,
-0.050689697265625,
-0.03668212890625,
-0.012786865234375,
0.058074951171875,
0.050445556640625,
-0.07421875,
-0.01093292236328125,
-0.020599365234375,
0.00348663330078125,
-0.0158538818359375,
-0.0263519287109375,
0.0226287841796875,
-0.022186279296875,
-0.0557861328125,
0.01532745361328125,
0.0045928955078125,
0.017974853515625,
0.0011377334594726562,
-0.003204345703125,
-0.0222625732421875,
-0.00672149658203125,
0.051239013671875,
0.03369140625,
-0.058349609375,
-0.01557159423828125,
0.002307891845703125,
0.003986358642578125,
0.0038089752197265625,
0.022216796875,
-0.06463623046875,
0.020965576171875,
0.031951904296875,
0.026824951171875,
0.05157470703125,
-0.005512237548828125,
0.005207061767578125,
-0.04736328125,
0.024017333984375,
0.0106964111328125,
0.0222930908203125,
0.020782470703125,
-0.0142669677734375,
0.04949951171875,
0.036285400390625,
-0.049896240234375,
-0.06884765625,
0.0115814208984375,
-0.0882568359375,
-0.0243988037109375,
0.07525634765625,
-0.006404876708984375,
-0.0280914306640625,
0.0092010498046875,
-0.00501251220703125,
0.028076171875,
-0.0015010833740234375,
0.0489501953125,
0.03466796875,
-0.0094757080078125,
0.001506805419921875,
-0.04437255859375,
0.0198516845703125,
0.03399658203125,
-0.045806884765625,
-0.0271148681640625,
0.014556884765625,
0.0380859375,
0.04736328125,
0.0501708984375,
-0.00217437744140625,
0.00437164306640625,
-0.0011501312255859375,
0.0267181396484375,
0.012359619140625,
-0.0204315185546875,
-0.0300445556640625,
0.006687164306640625,
-0.027374267578125,
-0.0296173095703125
]
] |
xlm-roberta-large | 2023-09-29T13:04:24.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"onnx",
"safetensors",
"xlm-roberta",
"fill-mask",
"exbert",
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh",
"arxiv:1911.02116",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | xlm-roberta-large | 219 | 2,582,394 | transformers | 2022-03-02T23:29:04 | ---
tags:
- exbert
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
license: mit
---
# XLM-RoBERTa (large-sized model)
XLM-RoBERTa model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper [Unsupervised Cross-lingual Representation Learning at Scale](https://arxiv.org/abs/1911.02116) by Conneau et al. and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/xlmr).
Disclaimer: The team releasing XLM-RoBERTa did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
XLM-RoBERTa is a multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the masked language modeling (MLM) objective: taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlm-roberta) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
## Usage
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='xlm-roberta-large')
>>> unmasker("Hello I'm a <mask> model.")
[{'score': 0.10563907772302628,
'sequence': "Hello I'm a fashion model.",
'token': 54543,
'token_str': 'fashion'},
{'score': 0.08015287667512894,
'sequence': "Hello I'm a new model.",
'token': 3525,
'token_str': 'new'},
{'score': 0.033413201570510864,
'sequence': "Hello I'm a model model.",
'token': 3299,
'token_str': 'model'},
{'score': 0.030217764899134636,
'sequence': "Hello I'm a French model.",
'token': 92265,
'token_str': 'French'},
{'score': 0.026436051353812218,
'sequence': "Hello I'm a sexy model.",
'token': 17473,
'token_str': 'sexy'}]
```
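Under the hood, the pipeline ranks vocabulary entries at the masked position by softmax probability over the model's output logits. A minimal sketch of that selection step, using a toy vocabulary and made-up scores rather than real model outputs:

```python
import numpy as np

def topk_predictions(mask_logits, vocab, k=3):
    """Rank vocabulary entries for a masked position by softmax probability."""
    logits = np.asarray(mask_logits, dtype=float)
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    order = np.argsort(probs)[::-1][:k]    # indices of the k highest scores
    return [(vocab[i], float(probs[i])) for i in order]

toy_vocab = ["fashion", "new", "model", "French", "sexy"]
toy_logits = [2.1, 1.8, 0.9, 0.8, 0.7]
print(topk_predictions(toy_logits, toy_vocab, k=3))
```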
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('xlm-roberta-large')
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-large")
# prepare input
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
# forward pass
output = model(**encoded_input)
```
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1911-02116,
author = {Alexis Conneau and
Kartikay Khandelwal and
Naman Goyal and
Vishrav Chaudhary and
Guillaume Wenzek and
Francisco Guzm{\'{a}}n and
Edouard Grave and
Myle Ott and
Luke Zettlemoyer and
Veselin Stoyanov},
title = {Unsupervised Cross-lingual Representation Learning at Scale},
journal = {CoRR},
volume = {abs/1911.02116},
year = {2019},
url = {http://arxiv.org/abs/1911.02116},
eprinttype = {arXiv},
eprint = {1911.02116},
timestamp = {Mon, 11 Nov 2019 18:38:09 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1911-02116.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=xlm-roberta-base">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 5,241 | [
[
-0.03472900390625,
-0.057403564453125,
0.0170440673828125,
0.00618743896484375,
-0.0150299072265625,
-0.001697540283203125,
-0.03173828125,
-0.0298309326171875,
0.0155792236328125,
0.043853759765625,
-0.032318115234375,
-0.04302978515625,
-0.05377197265625,
0.0164337158203125,
-0.031951904296875,
0.08660888671875,
-0.0025768280029296875,
0.00421905517578125,
0.003574371337890625,
-0.01496124267578125,
-0.01525115966796875,
-0.06158447265625,
-0.037139892578125,
-0.024871826171875,
0.032196044921875,
0.009307861328125,
0.041717529296875,
0.044952392578125,
0.0169677734375,
0.0322265625,
-0.0160675048828125,
0.01251983642578125,
-0.0210418701171875,
0.000213623046875,
0.00128173828125,
-0.0445556640625,
-0.0355224609375,
0.0162353515625,
0.051666259765625,
0.0535888671875,
0.0096282958984375,
0.0226898193359375,
0.00902557373046875,
0.0262451171875,
-0.014434814453125,
0.02337646484375,
-0.04022216796875,
0.01324462890625,
-0.0167236328125,
0.00713348388671875,
-0.032928466796875,
-0.007762908935546875,
0.01139068603515625,
-0.0229339599609375,
0.0146636962890625,
0.01361083984375,
0.09075927734375,
-0.0009021759033203125,
-0.0255889892578125,
-0.0133819580078125,
-0.045745849609375,
0.081298828125,
-0.050384521484375,
0.032806396484375,
0.0169677734375,
0.004764556884765625,
0.0050048828125,
-0.06787109375,
-0.043182373046875,
-0.02008056640625,
-0.0289764404296875,
0.005977630615234375,
-0.035186767578125,
-0.018341064453125,
0.0257720947265625,
0.03118896484375,
-0.05963134765625,
0.00237274169921875,
-0.031951904296875,
-0.0171356201171875,
0.042449951171875,
0.000980377197265625,
0.02886962890625,
-0.036529541015625,
-0.0301971435546875,
-0.0343017578125,
-0.036285400390625,
0.00870513916015625,
0.0257720947265625,
0.03094482421875,
-0.024658203125,
0.039794921875,
0.007740020751953125,
0.056304931640625,
0.01082611083984375,
0.001102447509765625,
0.042083740234375,
-0.0189971923828125,
-0.022552490234375,
-0.0165252685546875,
0.09283447265625,
-0.0038318634033203125,
0.0192108154296875,
-0.006072998046875,
-0.01233673095703125,
-0.00963592529296875,
0.0014896392822265625,
-0.056732177734375,
-0.019775390625,
0.016448974609375,
-0.04156494140625,
-0.01396942138671875,
0.0145416259765625,
-0.05352783203125,
0.0091094970703125,
-0.025726318359375,
0.0484619140625,
-0.035980224609375,
-0.0204315185546875,
-0.00749969482421875,
-0.0014133453369140625,
0.0032367706298828125,
-0.0010433197021484375,
-0.057403564453125,
0.01194000244140625,
0.0253753662109375,
0.06427001953125,
-0.005580902099609375,
-0.0233306884765625,
-0.034698486328125,
-0.01934814453125,
-0.0180511474609375,
0.034454345703125,
-0.030364990234375,
-0.00843048095703125,
-0.007411956787109375,
0.026092529296875,
-0.01197052001953125,
-0.03753662109375,
0.02935791015625,
-0.026458740234375,
0.0360107421875,
0.00867462158203125,
-0.024993896484375,
-0.028533935546875,
0.00910186767578125,
-0.049560546875,
0.09075927734375,
0.019134521484375,
-0.050811767578125,
0.01522064208984375,
-0.042877197265625,
-0.0244598388671875,
-0.01371002197265625,
-0.0005931854248046875,
-0.056793212890625,
-0.004283905029296875,
0.030609130859375,
0.043243408203125,
-0.0229949951171875,
0.01215362548828125,
-0.01361083984375,
-0.005825042724609375,
0.03118896484375,
-0.02008056640625,
0.08624267578125,
0.023834228515625,
-0.036865234375,
0.013336181640625,
-0.06219482421875,
0.013275146484375,
0.01552581787109375,
-0.01548004150390625,
-0.0186309814453125,
-0.029144287109375,
0.02728271484375,
0.024810791015625,
0.0168609619140625,
-0.03094482421875,
0.00525665283203125,
-0.038330078125,
0.04241943359375,
0.039215087890625,
-0.0209808349609375,
0.0362548828125,
-0.0188446044921875,
0.043304443359375,
0.01303863525390625,
0.006458282470703125,
-0.0279388427734375,
-0.0401611328125,
-0.0623779296875,
-0.0251312255859375,
0.0499267578125,
0.040069580078125,
-0.04058837890625,
0.05206298828125,
-0.0134429931640625,
-0.04510498046875,
-0.050384521484375,
0.0152740478515625,
0.0401611328125,
0.024078369140625,
0.036834716796875,
-0.029144287109375,
-0.05255126953125,
-0.054840087890625,
-0.01061248779296875,
0.00421905517578125,
-0.007671356201171875,
0.02679443359375,
0.046234130859375,
-0.021820068359375,
0.06414794921875,
-0.03546142578125,
-0.035247802734375,
-0.042755126953125,
0.0243682861328125,
0.0284576416015625,
0.04461669921875,
0.049652099609375,
-0.05865478515625,
-0.0556640625,
-0.0008020401000976562,
-0.047515869140625,
-0.004238128662109375,
-0.0020122528076171875,
-0.00707244873046875,
0.042938232421875,
0.037445068359375,
-0.047332763671875,
0.0296173095703125,
0.047454833984375,
-0.0202178955078125,
0.0213470458984375,
-0.02490234375,
-0.0023174285888671875,
-0.09869384765625,
0.01287841796875,
0.0042266845703125,
-0.024627685546875,
-0.045013427734375,
0.002552032470703125,
0.006259918212890625,
-0.014678955078125,
-0.023223876953125,
0.04852294921875,
-0.060546875,
-0.0012998580932617188,
-0.006320953369140625,
0.029296875,
0.007396697998046875,
0.05133056640625,
0.01549530029296875,
0.0307159423828125,
0.0491943359375,
-0.032745361328125,
0.0247802734375,
0.02459716796875,
-0.0276641845703125,
0.021514892578125,
-0.048004150390625,
0.01053619384765625,
0.001850128173828125,
0.0163421630859375,
-0.066650390625,
0.0035800933837890625,
0.020660400390625,
-0.04461669921875,
0.039215087890625,
-0.0256195068359375,
-0.040802001953125,
-0.035552978515625,
-0.0078125,
0.0298309326171875,
0.0543212890625,
-0.0394287109375,
0.05377197265625,
0.032196044921875,
-0.01192474365234375,
-0.04150390625,
-0.058868408203125,
0.0098419189453125,
-0.0185394287109375,
-0.047210693359375,
0.03485107421875,
-0.005077362060546875,
0.0010976791381835938,
-0.003536224365234375,
0.017333984375,
0.006259918212890625,
-0.008087158203125,
0.0178985595703125,
0.023406982421875,
-0.01468658447265625,
-0.0017871856689453125,
-0.0178985595703125,
-0.0216522216796875,
-0.0036144256591796875,
-0.0294647216796875,
0.0694580078125,
-0.00472259521484375,
-0.00568389892578125,
-0.02655029296875,
0.0294189453125,
0.0284576416015625,
-0.03863525390625,
0.0531005859375,
0.07696533203125,
-0.0237884521484375,
-0.0134735107421875,
-0.0296478271484375,
-0.0145416259765625,
-0.032012939453125,
0.04449462890625,
-0.0266265869140625,
-0.06280517578125,
0.04925537109375,
0.0171356201171875,
-0.00798797607421875,
0.0489501953125,
0.050628662109375,
0.011749267578125,
0.088134765625,
0.05169677734375,
-0.004199981689453125,
0.03753662109375,
-0.048919677734375,
0.02752685546875,
-0.0740966796875,
-0.02197265625,
-0.046844482421875,
-0.016937255859375,
-0.06280517578125,
-0.044403076171875,
0.0207977294921875,
0.0072021484375,
-0.014007568359375,
0.05072021484375,
-0.0438232421875,
0.002300262451171875,
0.058929443359375,
0.01183319091796875,
0.008880615234375,
0.00548553466796875,
-0.0224609375,
-0.00423431396484375,
-0.051544189453125,
-0.0237274169921875,
0.088623046875,
0.0280303955078125,
0.05364990234375,
0.00033545494079589844,
0.054840087890625,
-0.004901885986328125,
0.0113067626953125,
-0.04962158203125,
0.037689208984375,
-0.0202484130859375,
-0.05364990234375,
-0.022491455078125,
-0.041229248046875,
-0.08380126953125,
0.0167388916015625,
-0.0248565673828125,
-0.06536865234375,
0.01392364501953125,
-0.001064300537109375,
-0.02001953125,
0.02783203125,
-0.042755126953125,
0.06842041015625,
-0.02239990234375,
-0.0208282470703125,
0.0017099380493164062,
-0.050933837890625,
0.01445770263671875,
-0.01030731201171875,
0.011322021484375,
0.012115478515625,
0.01544189453125,
0.060089111328125,
-0.037139892578125,
0.0689697265625,
0.0027141571044921875,
-0.0024356842041015625,
0.0172271728515625,
-0.005191802978515625,
0.03399658203125,
-0.0062713623046875,
0.00850677490234375,
0.0347900390625,
-0.005706787109375,
-0.0177001953125,
-0.037994384765625,
0.048828125,
-0.07342529296875,
-0.046295166015625,
-0.0447998046875,
-0.0474853515625,
0.009185791015625,
0.022003173828125,
0.033203125,
0.0438232421875,
-0.0006437301635742188,
0.0193939208984375,
0.044219970703125,
-0.036468505859375,
0.0394287109375,
0.031951904296875,
-0.031951904296875,
-0.037811279296875,
0.053863525390625,
0.02294921875,
0.0154266357421875,
0.04547119140625,
0.0163116455078125,
-0.033935546875,
-0.034942626953125,
-0.03118896484375,
0.0226898193359375,
-0.046173095703125,
-0.0207977294921875,
-0.0777587890625,
-0.037628173828125,
-0.051422119140625,
0.0066986083984375,
-0.0171051025390625,
-0.037689208984375,
-0.0289306640625,
0.0035572052001953125,
0.041351318359375,
0.054595947265625,
-0.0179443359375,
0.01508331298828125,
-0.053863525390625,
0.0198516845703125,
0.019866943359375,
0.0061492919921875,
-0.0017461776733398438,
-0.0689697265625,
-0.0304412841796875,
0.006481170654296875,
-0.0275726318359375,
-0.04840087890625,
0.0638427734375,
0.0116424560546875,
0.0413818359375,
0.0244140625,
-0.0011301040649414062,
0.051513671875,
-0.030059814453125,
0.0556640625,
0.0157012939453125,
-0.07379150390625,
0.0386962890625,
-0.007602691650390625,
0.0192718505859375,
0.0012044906616210938,
0.037628173828125,
-0.043212890625,
-0.03790283203125,
-0.057830810546875,
-0.077392578125,
0.0689697265625,
0.0196685791015625,
0.0215911865234375,
0.0012683868408203125,
0.014007568359375,
0.00046443939208984375,
0.00681304931640625,
-0.0872802734375,
-0.04296875,
-0.0289764404296875,
-0.029083251953125,
-0.02490234375,
-0.0109100341796875,
-0.0030117034912109375,
-0.0305023193359375,
0.05145263671875,
-0.00379180908203125,
0.034210205078125,
0.018463134765625,
-0.0304412841796875,
-0.0006022453308105469,
0.0070648193359375,
0.035736083984375,
0.033782958984375,
-0.01401519775390625,
0.006473541259765625,
0.01282501220703125,
-0.034912109375,
-0.005279541015625,
0.029327392578125,
-0.0132598876953125,
0.01383209228515625,
0.0267333984375,
0.0692138671875,
0.0213165283203125,
-0.0328369140625,
0.03704833984375,
0.0098876953125,
-0.01323699951171875,
-0.033660888671875,
0.00380706787109375,
0.006031036376953125,
0.0254058837890625,
0.032684326171875,
0.0006895065307617188,
-0.01079559326171875,
-0.05853271484375,
0.024566650390625,
0.03936767578125,
-0.03363037109375,
-0.0209503173828125,
0.0623779296875,
-0.013580322265625,
-0.02655029296875,
0.04010009765625,
-0.00955963134765625,
-0.056732177734375,
0.049652099609375,
0.049072265625,
0.06634521484375,
-0.0123748779296875,
0.01837158203125,
0.04827880859375,
0.0198974609375,
0.0013904571533203125,
-0.00024259090423583984,
0.007007598876953125,
-0.052001953125,
-0.020660400390625,
-0.059783935546875,
-0.00251007080078125,
0.0178070068359375,
-0.04437255859375,
0.0262451171875,
-0.0255279541015625,
-0.016448974609375,
0.0036220550537109375,
0.0178680419921875,
-0.0565185546875,
0.022003173828125,
0.0050506591796875,
0.0540771484375,
-0.06317138671875,
0.0679931640625,
0.05206298828125,
-0.057952880859375,
-0.07659912109375,
-0.0188446044921875,
-0.01064300537109375,
-0.07049560546875,
0.06927490234375,
0.01222991943359375,
0.025299072265625,
0.0028553009033203125,
-0.0304412841796875,
-0.0791015625,
0.0845947265625,
0.00972747802734375,
-0.042388916015625,
0.0007829666137695312,
0.0261688232421875,
0.04296875,
-0.04693603515625,
0.046661376953125,
0.0233001708984375,
0.035614013671875,
0.00015294551849365234,
-0.066650390625,
0.0167999267578125,
-0.0261688232421875,
0.0092010498046875,
0.009674072265625,
-0.059661865234375,
0.0950927734375,
-0.0133209228515625,
-0.0036106109619140625,
0.0194549560546875,
0.043426513671875,
0.00922393798828125,
-0.0008287429809570312,
0.03125,
0.050811767578125,
0.047210693359375,
-0.02423095703125,
0.0699462890625,
-0.025665283203125,
0.04730224609375,
0.07147216796875,
0.0038204193115234375,
0.058929443359375,
0.0177764892578125,
-0.0171356201171875,
0.053680419921875,
0.04791259765625,
-0.0256195068359375,
0.02874755859375,
0.00943756103515625,
0.006549835205078125,
-0.01502227783203125,
0.0165252685546875,
-0.024932861328125,
0.04443359375,
0.008209228515625,
-0.05206298828125,
-0.00801849365234375,
0.01143646240234375,
0.0266571044921875,
-0.002826690673828125,
-0.0127410888671875,
0.0455322265625,
0.019622802734375,
-0.048065185546875,
0.056732177734375,
0.00844573974609375,
0.052398681640625,
-0.044464111328125,
0.00562286376953125,
-0.02093505859375,
0.01983642578125,
-0.01001739501953125,
-0.045989990234375,
0.00670623779296875,
0.0035247802734375,
-0.0179290771484375,
-0.0213775634765625,
0.03173828125,
-0.05694580078125,
-0.0577392578125,
0.034393310546875,
0.03082275390625,
0.0155487060546875,
-0.002750396728515625,
-0.07037353515625,
0.005825042724609375,
0.006381988525390625,
-0.03436279296875,
0.031463623046875,
0.04486083984375,
-0.005100250244140625,
0.046630859375,
0.052520751953125,
0.00894927978515625,
0.006290435791015625,
0.0036907196044921875,
0.055694580078125,
-0.0595703125,
-0.031463623046875,
-0.057586669921875,
0.048614501953125,
-0.00556182861328125,
-0.023681640625,
0.0667724609375,
0.045806884765625,
0.06231689453125,
-0.007427215576171875,
0.05291748046875,
-0.0167236328125,
0.036529541015625,
-0.038787841796875,
0.0694580078125,
-0.053955078125,
0.0130615234375,
-0.028289794921875,
-0.07159423828125,
-0.0274658203125,
0.058380126953125,
-0.01358795166015625,
0.0284576416015625,
0.055816650390625,
0.069580078125,
-0.0091705322265625,
-0.0291595458984375,
0.0264892578125,
0.04351806640625,
0.0130157470703125,
0.041259765625,
0.035980224609375,
-0.05389404296875,
0.0556640625,
-0.0238494873046875,
-0.016693115234375,
-0.018707275390625,
-0.061767578125,
-0.08453369140625,
-0.06573486328125,
-0.031463623046875,
-0.036102294921875,
-0.0145721435546875,
0.072021484375,
0.0625,
-0.06524658203125,
-0.0191192626953125,
0.004100799560546875,
0.00878143310546875,
-0.0159759521484375,
-0.0229949951171875,
0.046600341796875,
-0.029022216796875,
-0.08160400390625,
0.00433349609375,
0.007221221923828125,
0.0135498046875,
-0.0288238525390625,
-0.0018568038940429688,
-0.0203094482421875,
0.00047469139099121094,
0.04156494140625,
0.015655517578125,
-0.04766845703125,
-0.0186920166015625,
0.004451751708984375,
-0.0083160400390625,
0.0177001953125,
0.036376953125,
-0.060211181640625,
0.0214080810546875,
0.031768798828125,
0.0182342529296875,
0.056243896484375,
-0.0248565673828125,
0.04302978515625,
-0.05279541015625,
0.0225677490234375,
0.0046234130859375,
0.041229248046875,
0.031280517578125,
-0.015472412109375,
0.02679443359375,
0.0185394287109375,
-0.038055419921875,
-0.0645751953125,
0.004482269287109375,
-0.0799560546875,
-0.02142333984375,
0.0806884765625,
-0.026336669921875,
-0.0289764404296875,
-0.003536224365234375,
-0.01309967041015625,
0.033203125,
-0.01380157470703125,
0.055999755859375,
0.03887939453125,
0.009185791015625,
-0.03619384765625,
-0.02642822265625,
0.0360107421875,
0.0225677490234375,
-0.043853759765625,
-0.0019063949584960938,
0.004161834716796875,
0.0369873046875,
0.03033447265625,
0.0238800048828125,
-0.024139404296875,
-0.004199981689453125,
-0.01395416259765625,
0.0186614990234375,
0.0015287399291992188,
-0.01251220703125,
-0.0217132568359375,
0.00945281982421875,
-0.01873779296875,
-0.00266265869140625
]
] |
openai/clip-vit-base-patch16 | 2022-10-04T09:42:28.000Z | [
"transformers",
"pytorch",
"jax",
"clip",
"zero-shot-image-classification",
"vision",
"arxiv:2103.00020",
"arxiv:1908.04913",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | openai | null | null | openai/clip-vit-base-patch16 | 41 | 2,560,286 | transformers | 2022-03-02T23:29:05 | ---
tags:
- vision
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
---
# Model Card: CLIP
Disclaimer: This model card is taken and modified from the official CLIP repository; it can be found [here](https://github.com/openai/CLIP/blob/main/model-card.md).
## Model Details
The CLIP model was developed by researchers at OpenAI to learn about what contributes to robustness in computer vision tasks. The model was also developed to test the ability of models to generalize to arbitrary image classification tasks in a zero-shot manner. It was not developed for general model deployment - to deploy models like CLIP, researchers will first need to carefully study their capabilities in relation to the specific context theyโre being deployed within.
### Model Date
January 2021
### Model Type
The base model uses a ViT-B/16 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
The original implementation had two variants: one using a ResNet image encoder and the other using a Vision Transformer. This repository has the variant with the Vision Transformer.
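The contrastive objective mentioned above can be sketched in a few lines. The following is a minimal NumPy illustration, not the actual training code for this model: given a batch of paired, L2-normalized image and text embeddings, matched pairs sit on the diagonal of the similarity matrix, and a symmetric cross-entropy pushes each image toward its own caption and vice versa. The `temperature` value of 0.07 is an illustrative assumption.

```python
import numpy as np

def log_softmax(x, axis):
    # Numerically stable log-softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=axis, keepdims=True))

def clip_contrastive_loss(image_emb, text_emb, temperature=0.07):
    """Symmetric contrastive (InfoNCE-style) loss over a batch of pairs.

    image_emb, text_emb: arrays of shape (batch, dim); row i of each is a
    matched (image, text) pair. Returns a scalar loss.
    """
    # Normalize so dot products become cosine similarities
    image_emb = image_emb / np.linalg.norm(image_emb, axis=1, keepdims=True)
    text_emb = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    # Pairwise similarity matrix, scaled by temperature
    logits = image_emb @ text_emb.T / temperature
    # Cross-entropy in both directions; the correct "class" for row/column i is i
    loss_img_to_text = -np.mean(np.diag(log_softmax(logits, axis=1)))
    loss_text_to_img = -np.mean(np.diag(log_softmax(logits, axis=0)))
    return (loss_img_to_text + loss_text_to_img) / 2
```

Perfectly aligned pairs give a near-zero loss, while mismatched pairs are penalized, which is the signal that trains both encoders jointly.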
### Documents
- [Blog Post](https://openai.com/blog/clip/)
- [CLIP Paper](https://arxiv.org/abs/2103.00020)
### Use with Transformers
```python3
from PIL import Image
import requests
from transformers import CLIPProcessor, CLIPModel
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch16")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch16")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(text=["a photo of a cat", "a photo of a dog"], images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image # this is the image-text similarity score
probs = logits_per_image.softmax(dim=1) # we can take the softmax to get the label probabilities
```
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
### Out-of-Scope Use Cases
**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which fall under the domain of surveillance and facial recognition are always out of scope, regardless of the model's performance. This is because the use of artificial intelligence for such tasks is currently premature given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
## Data
The model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet which tend to skew towards more developed nations, and younger, male users.
### Data Mission Statement
Our goal with building this dataset was to test out robustness and generalizability in computer vision tasks. As a result, the focus was on gathering large quantities of data from different publicly-available internet data sources. The data was gathered in a mostly non-interventionist manner. However, we only crawled websites that had policies against excessively violent and adult images and allowed us to filter out such content. We do not intend for this dataset to be used as the basis for any commercial or deployed model and will not be releasing the dataset.
## Performance and Limitations
### Performance
We have evaluated the performance of CLIP on a wide range of benchmarks across a variety of computer vision datasets, ranging from OCR to texture recognition to fine-grained classification. The paper describes model performance on the following datasets:
- Food101
- CIFAR10
- CIFAR100
- Birdsnap
- SUN397
- Stanford Cars
- FGVC Aircraft
- VOC2007
- DTD
- Oxford-IIIT Pet dataset
- Caltech101
- Flowers102
- MNIST
- SVHN
- IIIT5K
- Hateful Memes
- SST-2
- UCF101
- Kinetics700
- Country211
- CLEVR Counting
- KITTI Distance
- STL-10
- RareAct
- Flickr30
- MSCOCO
- ImageNet
- ImageNet-A
- ImageNet-R
- ImageNet Sketch
- ObjectNet (ImageNet Overlap)
- Youtube-BB
- ImageNet-Vid
## Limitations
CLIP and our analysis of it have a number of limitations. CLIP currently struggles with certain tasks such as fine-grained classification and counting objects. CLIP also poses issues with regard to fairness and bias, which we discuss in the paper and briefly in the next section. Additionally, our approach to testing CLIP has an important limitation: in many cases we have used linear probes to evaluate the performance of CLIP, and there is evidence suggesting that linear probes can underestimate model performance.
### Bias and Fairness
We find that the performance of CLIP - and the specific biases it exhibits - can depend significantly on class design and the choices one makes for categories to include and exclude. We tested the risk of certain kinds of denigration with CLIP by classifying images of people from [Fairface](https://arxiv.org/abs/1908.04913) into crime-related and non-human animal categories. We found significant disparities with respect to race and gender. Additionally, we found that these disparities could shift based on how the classes were constructed. (Details captured in the Broader Impacts Section in the paper).
We also tested the performance of CLIP on gender, race and age classification using the Fairface dataset (we default to the race categories as they are constructed in Fairface) in order to assess quality of performance across different demographics. We found accuracy >96% across all races for gender classification, with “Middle Eastern” having the highest accuracy (98.4%) and “White” the lowest (96.5%). Additionally, CLIP averaged ~93% for racial classification and ~63% for age classification. Our use of evaluations to test for gender, race and age classification, as well as denigration harms, is simply to evaluate performance of the model across people and surface potential risks, not to demonstrate endorsement of or enthusiasm for such tasks.
## Feedback
### Where to send questions or comments about the model
Please use [this Google Form](https://forms.gle/Uv7afRH5dvY34ZEs9)
| 7,916 | [
[
-0.038726806640625,
-0.04437255859375,
0.012847900390625,
-0.0020294189453125,
-0.012603759765625,
-0.0195159912109375,
0.00241851806640625,
-0.05499267578125,
0.00960540771484375,
0.029541015625,
-0.0215911865234375,
-0.0313720703125,
-0.048919677734375,
0.00913238525390625,
-0.048095703125,
0.05517578125,
-0.00505828857421875,
0.005168914794921875,
-0.0238037109375,
-0.02545166015625,
-0.039215087890625,
-0.0528564453125,
-0.0184173583984375,
0.01253509521484375,
0.006191253662109375,
0.01148223876953125,
0.051116943359375,
0.065185546875,
0.061737060546875,
0.016845703125,
-0.0240325927734375,
-0.00890350341796875,
-0.038726806640625,
-0.047821044921875,
-0.0292816162109375,
-0.0305328369140625,
-0.030181884765625,
0.0163116455078125,
0.040313720703125,
0.0284271240234375,
0.002239227294921875,
0.0225830078125,
0.005451202392578125,
0.0284423828125,
-0.07147216796875,
-0.00363922119140625,
-0.043121337890625,
0.00498199462890625,
-0.02203369140625,
0.01096343994140625,
-0.0133056640625,
-0.01494598388671875,
0.02398681640625,
-0.0389404296875,
0.03759765625,
-0.004512786865234375,
0.10113525390625,
0.01354217529296875,
-0.0124664306640625,
-0.002414703369140625,
-0.0450439453125,
0.057464599609375,
-0.044158935546875,
0.018341064453125,
0.0179443359375,
0.02984619140625,
0.01142120361328125,
-0.06475830078125,
-0.048736572265625,
-0.004150390625,
0.02294921875,
0.0014390945434570312,
-0.017852783203125,
-0.00455474853515625,
0.03192138671875,
0.0380859375,
-0.0122528076171875,
-0.00499725341796875,
-0.055419921875,
-0.016845703125,
0.051727294921875,
0.02288818359375,
0.026123046875,
-0.018218994140625,
-0.048431396484375,
-0.035736083984375,
-0.03485107421875,
0.041534423828125,
0.0298614501953125,
0.0073394775390625,
-0.0120697021484375,
0.049407958984375,
-0.003391265869140625,
0.033294677734375,
0.0006375312805175781,
-0.0266265869140625,
0.02655029296875,
-0.036468505859375,
-0.014068603515625,
-0.0208740234375,
0.058380126953125,
0.064208984375,
0.01351165771484375,
0.0160064697265625,
-0.006626129150390625,
0.0164642333984375,
0.0267791748046875,
-0.07147216796875,
-0.01242828369140625,
-0.015594482421875,
-0.048187255859375,
-0.028656005859375,
0.02178955078125,
-0.07080078125,
0.00653076171875,
-0.00885009765625,
0.05657958984375,
-0.03472900390625,
-0.005680084228515625,
0.0146942138671875,
-0.02459716796875,
0.024993896484375,
0.0251617431640625,
-0.052337646484375,
0.0296173095703125,
0.024444580078125,
0.08465576171875,
-0.036468505859375,
-0.02398681640625,
0.00415802001953125,
-0.0049285888671875,
-0.00899505615234375,
0.05499267578125,
-0.0289154052734375,
-0.036224365234375,
-0.01505279541015625,
0.0335693359375,
-0.00952911376953125,
-0.047149658203125,
0.044342041015625,
-0.015869140625,
0.0019512176513671875,
-0.021697998046875,
-0.02947998046875,
-0.048095703125,
0.02423095703125,
-0.0546875,
0.0689697265625,
0.0117950439453125,
-0.060028076171875,
0.02935791015625,
-0.0545654296875,
-0.00412750244140625,
-0.00948333740234375,
-0.007678985595703125,
-0.04583740234375,
-0.0217742919921875,
0.0311737060546875,
0.0247039794921875,
-0.017486572265625,
0.02813720703125,
-0.04644775390625,
-0.038116455078125,
0.01395416259765625,
-0.033538818359375,
0.06854248046875,
0.0015096664428710938,
-0.025360107421875,
-0.0002758502960205078,
-0.035247802734375,
-0.0133056640625,
0.02703857421875,
0.0008206367492675781,
-0.012115478515625,
-0.00823211669921875,
0.01494598388671875,
0.00726318359375,
-0.0031452178955078125,
-0.052703857421875,
0.00968170166015625,
-0.006626129150390625,
0.041229248046875,
0.0517578125,
0.007617950439453125,
0.021270751953125,
-0.03314208984375,
0.040252685546875,
-0.0014934539794921875,
0.05059814453125,
-0.0189666748046875,
-0.04010009765625,
-0.037689208984375,
-0.035614013671875,
0.0445556640625,
0.04998779296875,
-0.033233642578125,
0.01235198974609375,
-0.0106201171875,
-0.0260009765625,
-0.01430511474609375,
-0.0168914794921875,
0.02642822265625,
0.05047607421875,
0.0268707275390625,
-0.075439453125,
-0.031158447265625,
-0.08050537109375,
0.01480865478515625,
0.0048675537109375,
-0.003932952880859375,
0.053131103515625,
0.0693359375,
-0.0182037353515625,
0.0831298828125,
-0.0574951171875,
-0.0316162109375,
-0.010711669921875,
-0.0102081298828125,
-0.0017728805541992188,
0.038299560546875,
0.072998046875,
-0.0714111328125,
-0.0199432373046875,
-0.040679931640625,
-0.061981201171875,
0.01085662841796875,
0.01541900634765625,
-0.007038116455078125,
0.0033893585205078125,
0.016571044921875,
-0.0189361572265625,
0.07904052734375,
0.01971435546875,
-0.003894805908203125,
0.05609130859375,
0.006999969482421875,
0.022003173828125,
-0.045257568359375,
0.0278778076171875,
0.0128173828125,
-0.0116119384765625,
-0.03753662109375,
0.003757476806640625,
-0.0002868175506591797,
-0.032440185546875,
-0.07159423828125,
0.028533935546875,
-0.011016845703125,
-0.0096282958984375,
-0.0121612548828125,
-0.014495849609375,
0.0245208740234375,
0.055023193359375,
0.010589599609375,
0.0823974609375,
0.03839111328125,
-0.058563232421875,
-0.00208282470703125,
0.041748046875,
-0.036163330078125,
0.041351318359375,
-0.072998046875,
-0.0031871795654296875,
-0.0045013427734375,
0.00759124755859375,
-0.04327392578125,
-0.0257568359375,
0.0240325927734375,
-0.0276031494140625,
0.0161895751953125,
-0.010345458984375,
-0.0240936279296875,
-0.04583740234375,
-0.041839599609375,
0.05767822265625,
0.0390625,
-0.034271240234375,
0.028045654296875,
0.054901123046875,
0.01444244384765625,
-0.04095458984375,
-0.059234619140625,
-0.0066070556640625,
-0.01568603515625,
-0.05560302734375,
0.042205810546875,
-0.000060498714447021484,
0.005962371826171875,
0.01052093505859375,
0.006282806396484375,
-0.024200439453125,
0.002384185791015625,
0.0352783203125,
0.03973388671875,
-0.00582122802734375,
-0.0098876953125,
-0.02264404296875,
0.0276031494140625,
-0.005802154541015625,
0.00991058349609375,
0.0206146240234375,
-0.011138916015625,
-0.02618408203125,
-0.0389404296875,
0.0247650146484375,
0.034393310546875,
-0.0204925537109375,
0.037322998046875,
0.037353515625,
-0.0214996337890625,
0.0086822509765625,
-0.0408935546875,
-0.00269317626953125,
-0.034027099609375,
0.037841796875,
-0.00966644287109375,
-0.051544189453125,
0.05609130859375,
0.01125335693359375,
-0.0112457275390625,
0.048095703125,
0.0236053466796875,
0.0005068778991699219,
0.06512451171875,
0.07208251953125,
0.0032787322998046875,
0.049102783203125,
-0.06256103515625,
-0.0011014938354492188,
-0.07745361328125,
-0.026580810546875,
-0.01947021484375,
-0.0162200927734375,
-0.03338623046875,
-0.042755126953125,
0.044647216796875,
0.01389312744140625,
-0.007732391357421875,
0.032440185546875,
-0.0506591796875,
0.034393310546875,
0.047332763671875,
0.034515380859375,
0.0008034706115722656,
-0.006809234619140625,
-0.00022804737091064453,
-0.01242828369140625,
-0.052001953125,
-0.03851318359375,
0.0855712890625,
0.050750732421875,
0.0537109375,
-0.0166015625,
0.016937255859375,
0.03228759765625,
-0.006191253662109375,
-0.05731201171875,
0.041259765625,
-0.034576416015625,
-0.055419921875,
-0.0138702392578125,
-0.004337310791015625,
-0.058441162109375,
0.01187896728515625,
-0.0105438232421875,
-0.057464599609375,
0.0469970703125,
0.01041412353515625,
-0.025909423828125,
0.05145263671875,
-0.045440673828125,
0.07568359375,
-0.022369384765625,
-0.0335693359375,
0.005893707275390625,
-0.0498046875,
0.044219970703125,
0.005710601806640625,
0.00231170654296875,
-0.016387939453125,
0.00795745849609375,
0.082763671875,
-0.044647216796875,
0.071044921875,
-0.00910186767578125,
0.03271484375,
0.05731201171875,
-0.01345062255859375,
0.0034656524658203125,
-0.0155792236328125,
0.01485443115234375,
0.0545654296875,
0.02130126953125,
-0.0090484619140625,
-0.028533935546875,
0.01097869873046875,
-0.055755615234375,
-0.0303192138671875,
-0.0283660888671875,
-0.034088134765625,
0.016998291015625,
0.01580810546875,
0.042236328125,
0.05841064453125,
-0.003612518310546875,
0.0123138427734375,
0.0472412109375,
-0.0379638671875,
0.029327392578125,
0.01546478271484375,
-0.0210723876953125,
-0.040130615234375,
0.06976318359375,
0.021240234375,
0.0162811279296875,
0.0030841827392578125,
0.00640869140625,
-0.0174713134765625,
-0.037689208984375,
-0.03387451171875,
0.005527496337890625,
-0.0562744140625,
-0.032958984375,
-0.042022705078125,
-0.0281829833984375,
-0.033966064453125,
-0.0012874603271484375,
-0.0367431640625,
-0.025848388671875,
-0.04840087890625,
0.01544189453125,
0.0135650634765625,
0.049407958984375,
-0.00800323486328125,
0.0227203369140625,
-0.047210693359375,
0.0194244384765625,
0.0293121337890625,
0.040496826171875,
0.0051116943359375,
-0.052978515625,
-0.01116180419921875,
-0.00012743473052978516,
-0.06744384765625,
-0.061279296875,
0.034332275390625,
0.0248565673828125,
0.045440673828125,
0.02740478515625,
0.006992340087890625,
0.05328369140625,
-0.032562255859375,
0.08306884765625,
0.017364501953125,
-0.07305908203125,
0.042388916015625,
-0.023590087890625,
0.0164337158203125,
0.05267333984375,
0.03753662109375,
-0.01548004150390625,
-0.0100860595703125,
-0.042144775390625,
-0.06805419921875,
0.06109619140625,
0.01065826416015625,
0.0034656524658203125,
0.004703521728515625,
0.02569580078125,
0.0018396377563476562,
0.006763458251953125,
-0.053802490234375,
-0.01251220703125,
-0.03924560546875,
0.00437164306640625,
0.022369384765625,
-0.0328369140625,
0.0023651123046875,
-0.03253173828125,
0.03131103515625,
-0.0038604736328125,
0.0428466796875,
0.04119873046875,
-0.0137176513671875,
0.0107421875,
-0.00799560546875,
0.050140380859375,
0.04656982421875,
-0.03045654296875,
-0.0176544189453125,
0.0197601318359375,
-0.06396484375,
0.0010385513305664062,
-0.01380157470703125,
-0.039093017578125,
-0.0033817291259765625,
0.0240478515625,
0.07177734375,
0.01580810546875,
-0.0562744140625,
0.07684326171875,
-0.007419586181640625,
-0.04229736328125,
-0.0191192626953125,
0.00605010986328125,
-0.0419921875,
0.0096435546875,
0.0245513916015625,
0.0174560546875,
0.035186767578125,
-0.039276123046875,
0.0301055908203125,
0.03240966796875,
-0.02685546875,
-0.029205322265625,
0.058441162109375,
0.01129150390625,
-0.0157318115234375,
0.03814697265625,
-0.013519287109375,
-0.0732421875,
0.06243896484375,
0.0309600830078125,
0.05047607421875,
-0.0008254051208496094,
0.0131683349609375,
0.05120849609375,
0.01190948486328125,
-0.025848388671875,
-0.0031948089599609375,
0.0005688667297363281,
-0.04364013671875,
-0.01617431640625,
-0.031829833984375,
-0.044952392578125,
0.01166534423828125,
-0.07098388671875,
0.03216552734375,
-0.0391845703125,
-0.0386962890625,
-0.00836944580078125,
-0.02056884765625,
-0.05596923828125,
0.0107879638671875,
0.0117645263671875,
0.09344482421875,
-0.06427001953125,
0.037109375,
0.032928466796875,
-0.045928955078125,
-0.06182861328125,
-0.011199951171875,
-0.00783538818359375,
-0.048736572265625,
0.0506591796875,
0.041259765625,
-0.0007562637329101562,
-0.035491943359375,
-0.07244873046875,
-0.07550048828125,
0.0865478515625,
0.0250701904296875,
-0.030487060546875,
-0.00667572021484375,
-0.0018243789672851562,
0.0260772705078125,
-0.025421142578125,
0.028839111328125,
0.025604248046875,
-0.0017070770263671875,
0.025726318359375,
-0.089111328125,
-0.01445770263671875,
-0.01343536376953125,
0.020172119140625,
0.0013055801391601562,
-0.06439208984375,
0.08001708984375,
-0.0210418701171875,
-0.033966064453125,
0.00420379638671875,
0.033843994140625,
-0.0044403076171875,
0.028656005859375,
0.0394287109375,
0.053314208984375,
0.032257080078125,
0.004619598388671875,
0.0821533203125,
-0.004741668701171875,
0.03485107421875,
0.0712890625,
-0.01110076904296875,
0.06756591796875,
0.0230255126953125,
-0.0269775390625,
0.02874755859375,
0.033660888671875,
-0.052337646484375,
0.05889892578125,
-0.0002460479736328125,
0.0121612548828125,
-0.0029392242431640625,
-0.034027099609375,
-0.022430419921875,
0.05426025390625,
0.002666473388671875,
-0.0347900390625,
-0.0050201416015625,
0.0307159423828125,
-0.018707275390625,
-0.0043487548828125,
-0.034332275390625,
0.03411865234375,
-0.01230621337890625,
-0.0264739990234375,
0.0335693359375,
0.00521087646484375,
0.07257080078125,
-0.02764892578125,
-0.01180267333984375,
0.006549835205078125,
0.0146484375,
-0.0067596435546875,
-0.07220458984375,
0.042388916015625,
0.004459381103515625,
-0.01715087890625,
0.006725311279296875,
0.056427001953125,
-0.0023555755615234375,
-0.04364013671875,
0.0164337158203125,
-0.01073455810546875,
0.027191162109375,
-0.00728607177734375,
-0.054168701171875,
0.025604248046875,
0.00453948974609375,
0.002452850341796875,
0.0218048095703125,
-0.0014743804931640625,
-0.0086669921875,
0.05120849609375,
0.029449462890625,
-0.003665924072265625,
0.0087738037109375,
-0.0260772705078125,
0.0802001953125,
-0.04217529296875,
-0.03106689453125,
-0.052520751953125,
0.0270843505859375,
-0.007419586181640625,
-0.0263824462890625,
0.047332763671875,
0.0472412109375,
0.08587646484375,
-0.00926971435546875,
0.042999267578125,
-0.0166015625,
0.038726806640625,
-0.0287017822265625,
0.034271240234375,
-0.0401611328125,
-0.0023975372314453125,
-0.0330810546875,
-0.04864501953125,
-0.01430511474609375,
0.046905517578125,
-0.0306854248046875,
-0.005222320556640625,
0.0380859375,
0.056060791015625,
-0.0189666748046875,
-0.002407073974609375,
0.019927978515625,
-0.0260009765625,
0.0196075439453125,
0.04656982421875,
0.04644775390625,
-0.060821533203125,
0.053009033203125,
-0.053131103515625,
-0.017425537109375,
-0.01534271240234375,
-0.06390380859375,
-0.07916259765625,
-0.03857421875,
-0.032867431640625,
-0.01031494140625,
-0.004421234130859375,
0.043548583984375,
0.07421875,
-0.054473876953125,
-0.007259368896484375,
0.0251312255859375,
-0.00531005859375,
-0.0008039474487304688,
-0.018707275390625,
0.02874755859375,
0.015960693359375,
-0.043121337890625,
-0.0144195556640625,
0.009796142578125,
0.0275726318359375,
-0.01358795166015625,
0.00917816162109375,
-0.015380859375,
-0.00460052490234375,
0.03369140625,
0.04052734375,
-0.04998779296875,
-0.0244598388671875,
0.01175689697265625,
0.0032939910888671875,
0.026336669921875,
0.049224853515625,
-0.048919677734375,
0.033477783203125,
0.0213165283203125,
0.041778564453125,
0.05096435546875,
0.020233154296875,
0.0157470703125,
-0.033203125,
0.0162200927734375,
0.0163116455078125,
0.0259246826171875,
0.0272064208984375,
-0.030548095703125,
0.04534912109375,
0.03753662109375,
-0.04925537109375,
-0.07489013671875,
-0.0023212432861328125,
-0.082275390625,
-0.01505279541015625,
0.0675048828125,
-0.03125,
-0.052093505859375,
0.0115203857421875,
-0.015869140625,
0.01323699951171875,
-0.0273284912109375,
0.04998779296875,
0.030242919921875,
-0.00278472900390625,
-0.027862548828125,
-0.045562744140625,
0.01531982421875,
0.004398345947265625,
-0.039794921875,
-0.030059814453125,
0.028045654296875,
0.04486083984375,
0.0259857177734375,
0.0367431640625,
-0.0267791748046875,
0.0298919677734375,
0.00328826904296875,
0.0230255126953125,
-0.0251312255859375,
-0.030059814453125,
-0.03643798828125,
0.0227203369140625,
-0.02166748046875,
-0.04693603515625
]
] |
nlpconnect/vit-gpt2-image-captioning | 2023-02-27T15:00:09.000Z | [
"transformers",
"pytorch",
"vision-encoder-decoder",
"image-to-text",
"image-captioning",
"doi:10.57967/hf/0222",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | image-to-text | nlpconnect | null | null | nlpconnect/vit-gpt2-image-captioning | 591 | 2,485,193 | transformers | 2022-03-02T23:29:05 | ---
tags:
- image-to-text
- image-captioning
license: apache-2.0
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/savanna.jpg
example_title: Savanna
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg
example_title: Football Match
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg
example_title: Airport
---
# nlpconnect/vit-gpt2-image-captioning
This is an image-captioning model trained by @ydshieh in [Flax](https://github.com/huggingface/transformers/tree/main/examples/flax/image-captioning); this repository is the PyTorch version of [that checkpoint](https://huggingface.co/ydshieh/vit-gpt2-coco-en-ckpts).
# The Illustrated Image Captioning using transformers

* https://ankur3107.github.io/blogs/the-illustrated-image-captioning-using-transformers/
# Sample running code
```python
from transformers import VisionEncoderDecoderModel, ViTImageProcessor, AutoTokenizer
import torch
from PIL import Image
model = VisionEncoderDecoderModel.from_pretrained("nlpconnect/vit-gpt2-image-captioning")
feature_extractor = ViTImageProcessor.from_pretrained("nlpconnect/vit-gpt2-image-captioning")
tokenizer = AutoTokenizer.from_pretrained("nlpconnect/vit-gpt2-image-captioning")
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
max_length = 16
num_beams = 4
gen_kwargs = {"max_length": max_length, "num_beams": num_beams}
def predict_step(image_paths):
images = []
for image_path in image_paths:
i_image = Image.open(image_path)
if i_image.mode != "RGB":
i_image = i_image.convert(mode="RGB")
images.append(i_image)
pixel_values = feature_extractor(images=images, return_tensors="pt").pixel_values
pixel_values = pixel_values.to(device)
output_ids = model.generate(pixel_values, **gen_kwargs)
preds = tokenizer.batch_decode(output_ids, skip_special_tokens=True)
preds = [pred.strip() for pred in preds]
return preds
predict_step(['doctor.e16ba4e4.jpg']) # ['a woman in a hospital bed with a woman in a hospital bed']
```
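The `num_beams` and `max_length` settings above control beam-search decoding: at each generation step the decoder keeps only the `num_beams` highest-scoring partial sequences. The toy sketch below illustrates that pruning logic only; it is not the Transformers implementation, and it scores steps from a fixed table of log-probabilities rather than conditioning the model on the prefix as real beam search does.

```python
import math

def toy_beam_search(step_log_probs, num_beams=4, max_length=16):
    """Illustrative beam search over a fixed table of per-step log-probs.

    step_log_probs: list of dicts mapping token -> log-probability, one dict
    per generation step (a stand-in for the model's conditional distributions).
    Returns the highest-scoring token sequence.
    """
    beams = [([], 0.0)]  # (sequence, cumulative log-probability)
    for dist in step_log_probs[:max_length]:
        candidates = []
        for seq, score in beams:
            for tok, lp in dist.items():
                candidates.append((seq + [tok], score + lp))
        # Keep only the top-scoring `num_beams` partial sequences
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:num_beams]
    return beams[0][0]
```

Raising `num_beams` explores more candidate captions at the cost of compute, while `max_length` caps how many tokens are generated.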
# Sample running code using transformers pipeline
```python
from transformers import pipeline
image_to_text = pipeline("image-to-text", model="nlpconnect/vit-gpt2-image-captioning")
image_to_text("https://ankur3107.github.io/assets/images/image-captioning-example.png")
# [{'generated_text': 'a soccer game with a player jumping to catch the ball '}]
```
# Contact for any help
* https://huggingface.co/ankur310794
* https://twitter.com/ankur310794
* http://github.com/ankur3107
* https://www.linkedin.com/in/ankur310794 | 2,702 | [
[
-0.0182647705078125,
-0.031524658203125,
0.00959014892578125,
0.0275726318359375,
-0.040985107421875,
0.00250244140625,
-0.00007170438766479492,
-0.0248870849609375,
-0.0013437271118164062,
0.029266357421875,
-0.042816162109375,
-0.0205078125,
-0.060577392578125,
-0.0002410411834716797,
-0.03106689453125,
0.0712890625,
-0.01629638671875,
0.0036468505859375,
-0.01325225830078125,
-0.00921630859375,
-0.03515625,
-0.013519287109375,
-0.057373046875,
-0.0242919921875,
0.008544921875,
0.00554656982421875,
0.055908203125,
0.039764404296875,
0.05694580078125,
0.0253448486328125,
0.006641387939453125,
0.0071258544921875,
-0.0212249755859375,
-0.0174560546875,
0.00817108154296875,
-0.034576416015625,
-0.016998291015625,
-0.0035247802734375,
0.038177490234375,
0.005157470703125,
0.0052337646484375,
0.0099334716796875,
-0.0016231536865234375,
0.03448486328125,
-0.023895263671875,
0.044189453125,
-0.03546142578125,
0.02545166015625,
0.009765625,
-0.0263214111328125,
-0.0207977294921875,
-0.00823974609375,
0.0197601318359375,
-0.039459228515625,
0.045867919921875,
0.00399017333984375,
0.1104736328125,
0.03515625,
-0.00672149658203125,
-0.01367950439453125,
-0.03631591796875,
0.053558349609375,
-0.0579833984375,
-0.0059814453125,
0.0101776123046875,
0.019317626953125,
0.01499176025390625,
-0.0894775390625,
-0.044281005859375,
-0.009765625,
-0.034942626953125,
0.038726806640625,
-0.0272674560546875,
0.0014009475708007812,
0.034820556640625,
0.02490234375,
-0.043548583984375,
-0.00846099853515625,
-0.052764892578125,
-0.032867431640625,
0.036529541015625,
0.0003800392150878906,
0.0309600830078125,
-0.0282135009765625,
-0.043121337890625,
-0.025482177734375,
-0.02593994140625,
0.0034694671630859375,
0.00506591796875,
-0.0119171142578125,
-0.04315185546875,
0.0528564453125,
-0.004730224609375,
0.037139892578125,
0.003101348876953125,
-0.0017261505126953125,
0.0396728515625,
-0.010589599609375,
-0.0171966552734375,
-0.005565643310546875,
0.08441162109375,
0.0511474609375,
0.029266357421875,
-0.005523681640625,
-0.00890350341796875,
0.010650634765625,
0.006893157958984375,
-0.07672119140625,
-0.047943115234375,
0.0266876220703125,
-0.039947509765625,
-0.030731201171875,
0.018341064453125,
-0.07037353515625,
-0.016998291015625,
0.0033550262451171875,
0.052490234375,
-0.034912109375,
-0.01068115234375,
-0.0017805099487304688,
-0.01922607421875,
0.044403076171875,
0.01361083984375,
-0.06658935546875,
0.003353118896484375,
0.0217437744140625,
0.08355712890625,
0.0022411346435546875,
-0.0299072265625,
-0.022613525390625,
-0.01007080078125,
-0.0115203857421875,
0.04815673828125,
0.00855255126953125,
-0.00977325439453125,
-0.006198883056640625,
0.0211944580078125,
-0.01329803466796875,
-0.0252227783203125,
0.02728271484375,
-0.02093505859375,
0.032135009765625,
0.01093292236328125,
-0.021728515625,
-0.01898193359375,
-0.000022113323211669922,
-0.041015625,
0.065185546875,
0.02685546875,
-0.0780029296875,
0.01270294189453125,
-0.0587158203125,
-0.0272216796875,
0.0034732818603515625,
-0.014923095703125,
-0.07513427734375,
-0.0027713775634765625,
0.0345458984375,
0.030609130859375,
-0.010711669921875,
0.0030231475830078125,
0.006046295166015625,
-0.032073974609375,
0.01338958740234375,
-0.03741455078125,
0.067626953125,
0.011962890625,
-0.049072265625,
0.0193634033203125,
-0.0198822021484375,
0.0022869110107421875,
0.03839111328125,
-0.00335693359375,
0.0323486328125,
-0.0038967132568359375,
0.0076751708984375,
0.0212249755859375,
0.01861572265625,
-0.0290374755859375,
0.0128936767578125,
-0.032867431640625,
0.057342529296875,
0.043975830078125,
-0.005222320556640625,
0.0267333984375,
-0.012298583984375,
0.034698486328125,
0.001728057861328125,
0.018463134765625,
-0.0172271728515625,
-0.0616455078125,
-0.0692138671875,
-0.0225982666015625,
0.0079345703125,
0.0491943359375,
-0.0811767578125,
0.0364990234375,
-0.0241546630859375,
-0.03533935546875,
-0.037017822265625,
-0.01325225830078125,
0.0234375,
0.03863525390625,
0.044219970703125,
-0.0382080078125,
-0.047576904296875,
-0.06536865234375,
0.006504058837890625,
-0.01558685302734375,
-0.0128936767578125,
0.025482177734375,
0.05389404296875,
-0.0217132568359375,
0.07086181640625,
-0.039093017578125,
-0.0301971435546875,
-0.0167083740234375,
0.014312744140625,
0.038177490234375,
0.04974365234375,
0.040802001953125,
-0.054351806640625,
-0.039398193359375,
-0.011962890625,
-0.06719970703125,
0.0012521743774414062,
-0.020172119140625,
-0.00962066650390625,
0.01971435546875,
0.0220489501953125,
-0.06158447265625,
0.0474853515625,
0.035980224609375,
-0.0288543701171875,
0.0521240234375,
-0.0323486328125,
0.006439208984375,
-0.08123779296875,
0.01629638671875,
-0.0003974437713623047,
-0.019500732421875,
-0.0272064208984375,
0.00858306884765625,
0.0065765380859375,
-0.00017964839935302734,
-0.043914794921875,
0.03668212890625,
-0.039886474609375,
-0.0026531219482421875,
-0.017333984375,
-0.0168609619140625,
-0.001819610595703125,
0.0467529296875,
0.024200439453125,
0.05487060546875,
0.0726318359375,
-0.01531219482421875,
0.06304931640625,
0.042266845703125,
-0.020599365234375,
0.01309967041015625,
-0.067626953125,
0.010467529296875,
-0.01314544677734375,
0.0228424072265625,
-0.08441162109375,
-0.022796630859375,
0.047943115234375,
-0.057464599609375,
0.0282135009765625,
-0.03436279296875,
-0.040130615234375,
-0.056121826171875,
-0.01451873779296875,
0.0325927734375,
0.061920166015625,
-0.053009033203125,
0.0325927734375,
0.0010547637939453125,
0.0014333724975585938,
-0.049224853515625,
-0.0716552734375,
0.0121612548828125,
-0.01580810546875,
-0.0394287109375,
0.0286407470703125,
0.0136566162109375,
0.0267791748046875,
0.001338958740234375,
-0.008544921875,
-0.01267242431640625,
-0.02325439453125,
0.020477294921875,
0.040130615234375,
-0.007617950439453125,
-0.01534271240234375,
-0.006381988525390625,
-0.030517578125,
0.01513671875,
-0.023223876953125,
0.059600830078125,
-0.032958984375,
-0.01151275634765625,
-0.048309326171875,
0.007781982421875,
0.033111572265625,
-0.017822265625,
0.048828125,
0.08941650390625,
-0.0279388427734375,
0.007171630859375,
-0.021820068359375,
-0.032684326171875,
-0.037994384765625,
0.0526123046875,
-0.01165771484375,
-0.046600341796875,
0.037628173828125,
0.01050567626953125,
-0.005153656005859375,
0.0537109375,
0.042449951171875,
-0.02398681640625,
0.07391357421875,
0.0232696533203125,
-0.00986480712890625,
0.0251617431640625,
-0.0731201171875,
0.0160369873046875,
-0.052154541015625,
-0.0137176513671875,
-0.011199951171875,
-0.01529693603515625,
-0.038818359375,
-0.048187255859375,
0.02398681640625,
0.01715087890625,
-0.01776123046875,
0.041778564453125,
-0.07373046875,
0.0253143310546875,
0.03363037109375,
0.0203857421875,
-0.01084136962890625,
0.035369873046875,
-0.007137298583984375,
-0.006381988525390625,
-0.056427001953125,
-0.0275726318359375,
0.0628662109375,
0.039520263671875,
0.04840087890625,
-0.016693115234375,
0.030731201171875,
0.0029468536376953125,
0.01763916015625,
-0.0538330078125,
0.034454345703125,
-0.011016845703125,
-0.037384033203125,
-0.00455474853515625,
-0.0277557373046875,
-0.07403564453125,
0.0126800537109375,
-0.01558685302734375,
-0.06488037109375,
-0.0047760009765625,
0.0218658447265625,
-0.0126800537109375,
0.042724609375,
-0.046234130859375,
0.076171875,
-0.01473236083984375,
-0.0264129638671875,
0.005748748779296875,
-0.06878662109375,
0.01203155517578125,
0.022003173828125,
-0.01203155517578125,
0.0109710693359375,
0.01384735107421875,
0.060821533203125,
-0.0252532958984375,
0.058197021484375,
-0.01910400390625,
0.01540374755859375,
0.04327392578125,
-0.00975799560546875,
0.0380859375,
0.009765625,
0.023834228515625,
0.0274200439453125,
-0.001842498779296875,
-0.033721923828125,
-0.02362060546875,
0.039764404296875,
-0.069091796875,
-0.02587890625,
-0.046417236328125,
-0.033203125,
0.02532958984375,
0.0095977783203125,
0.06732177734375,
0.036529541015625,
0.0271453857421875,
0.013397216796875,
0.0286865234375,
-0.0300140380859375,
0.05780029296875,
-0.01172637939453125,
-0.0007810592651367188,
-0.04437255859375,
0.068115234375,
-0.0171966552734375,
0.01074981689453125,
0.0254058837890625,
0.01454925537109375,
-0.045440673828125,
-0.036346435546875,
-0.043487548828125,
0.0283203125,
-0.05340576171875,
-0.03143310546875,
-0.038818359375,
-0.037506103515625,
-0.026519775390625,
-0.006694793701171875,
-0.0261077880859375,
-0.023468017578125,
-0.03338623046875,
0.0017566680908203125,
0.050079345703125,
0.01837158203125,
0.00001150369644165039,
0.0361328125,
-0.055145263671875,
0.03961181640625,
0.01959228515625,
0.0272674560546875,
-0.00765228271484375,
-0.05194091796875,
-0.005779266357421875,
-0.005084991455078125,
-0.038482666015625,
-0.0626220703125,
0.05194091796875,
0.01297760009765625,
0.0296630859375,
0.0198822021484375,
-0.013427734375,
0.04376220703125,
-0.0289154052734375,
0.058868408203125,
0.037017822265625,
-0.06341552734375,
0.031524658203125,
-0.017059326171875,
0.020355224609375,
0.01441192626953125,
0.0083770751953125,
-0.0386962890625,
-0.0180206298828125,
-0.051025390625,
-0.0838623046875,
0.0682373046875,
0.046417236328125,
0.0197296142578125,
0.0199432373046875,
0.0244598388671875,
-0.0124053955078125,
0.0182952880859375,
-0.0802001953125,
-0.03692626953125,
-0.04742431640625,
-0.0225830078125,
-0.001728057861328125,
-0.0083465576171875,
-0.0005512237548828125,
-0.041351318359375,
0.045654296875,
-0.0170135498046875,
0.07464599609375,
0.04608154296875,
-0.0159149169921875,
-0.00646209716796875,
-0.0176239013671875,
0.039337158203125,
0.0352783203125,
-0.01509857177734375,
0.002811431884765625,
0.01316070556640625,
-0.039825439453125,
-0.0084381103515625,
0.0192413330078125,
-0.0118255615234375,
0.0128936767578125,
0.04290771484375,
0.0855712890625,
-0.0250091552734375,
-0.0166168212890625,
0.048126220703125,
-0.004589080810546875,
-0.03302001953125,
-0.0335693359375,
-0.0107269287109375,
0.01030731201171875,
0.00994110107421875,
0.030731201171875,
0.03497314453125,
-0.01006317138671875,
-0.01232147216796875,
0.020172119140625,
0.026580810546875,
-0.0399169921875,
-0.009490966796875,
0.07373046875,
0.00337982177734375,
-0.01552581787109375,
0.0645751953125,
-0.0119171142578125,
-0.054046630859375,
0.0738525390625,
0.0306854248046875,
0.071044921875,
0.016937255859375,
0.0208892822265625,
0.0521240234375,
0.0289154052734375,
-0.0034847259521484375,
0.01163482666015625,
0.00937652587890625,
-0.0513916015625,
-0.032684326171875,
-0.038360595703125,
-0.0118255615234375,
0.006103515625,
-0.02880859375,
0.0212860107421875,
-0.033233642578125,
-0.0112762451171875,
-0.009124755859375,
-0.0014247894287109375,
-0.06268310546875,
0.02203369140625,
0.00998687744140625,
0.050537109375,
-0.06158447265625,
0.059326171875,
0.044097900390625,
-0.035614013671875,
-0.0615234375,
-0.01287841796875,
-0.0244598388671875,
-0.0628662109375,
0.02532958984375,
0.0379638671875,
0.0211944580078125,
0.02099609375,
-0.055694580078125,
-0.040679931640625,
0.09320068359375,
0.0146331787109375,
-0.019439697265625,
0.004852294921875,
0.01861572265625,
0.040069580078125,
-0.0254058837890625,
0.04510498046875,
0.0289154052734375,
0.0396728515625,
0.0105133056640625,
-0.055572509765625,
0.0192413330078125,
-0.02142333984375,
-0.0023193359375,
-0.00490570068359375,
-0.045989990234375,
0.066650390625,
-0.043304443359375,
-0.01373291015625,
0.0380859375,
0.052215576171875,
0.0131072998046875,
0.0074615478515625,
0.034271240234375,
0.038177490234375,
0.0140533447265625,
-0.037750244140625,
0.08685302734375,
-0.01373291015625,
0.0771484375,
0.0521240234375,
0.0253448486328125,
0.03460693359375,
0.042205810546875,
-0.01529693603515625,
0.0303802490234375,
0.04766845703125,
-0.05511474609375,
0.04412841796875,
0.01267242431640625,
-0.00514984130859375,
0.00830078125,
0.0204620361328125,
-0.033416748046875,
0.0286407470703125,
0.0120391845703125,
-0.049072265625,
-0.0118255615234375,
-0.0115814208984375,
-0.0026836395263671875,
-0.0211029052734375,
-0.0045623779296875,
0.03466796875,
0.0013685226440429688,
-0.03741455078125,
0.06268310546875,
-0.0023212432861328125,
0.06494140625,
-0.0333251953125,
-0.00968170166015625,
0.00591278076171875,
0.011962890625,
-0.025634765625,
-0.072265625,
0.0211029052734375,
-0.009796142578125,
0.002185821533203125,
-0.003368377685546875,
0.045867919921875,
-0.033721923828125,
-0.04998779296875,
0.0228424072265625,
0.015869140625,
0.024871826171875,
-0.006504058837890625,
-0.082275390625,
-0.007160186767578125,
0.016143798828125,
-0.01522064208984375,
-0.006649017333984375,
-0.0027008056640625,
0.01194000244140625,
0.04388427734375,
0.04290771484375,
-0.002231597900390625,
0.0091400146484375,
0.001750946044921875,
0.05108642578125,
-0.031402587890625,
-0.0272674560546875,
-0.06573486328125,
0.057098388671875,
-0.01523590087890625,
-0.031707763671875,
0.043731689453125,
0.047760009765625,
0.047576904296875,
-0.044403076171875,
0.054290771484375,
-0.03216552734375,
-0.005268096923828125,
-0.02838134765625,
0.06695556640625,
-0.028472900390625,
-0.006237030029296875,
-0.0264739990234375,
-0.0670166015625,
-0.028839111328125,
0.07440185546875,
-0.011810302734375,
0.006439208984375,
0.057098388671875,
0.07427978515625,
-0.01261138916015625,
-0.0208892822265625,
-0.001567840576171875,
0.018035888671875,
0.027435302734375,
0.049835205078125,
0.044403076171875,
-0.07049560546875,
0.0574951171875,
-0.047515869140625,
-0.01049041748046875,
-0.00007241964340209961,
-0.0625,
-0.06158447265625,
-0.036590576171875,
-0.0447998046875,
-0.04962158203125,
-0.0087738037109375,
0.047149658203125,
0.062164306640625,
-0.06890869140625,
-0.0035190582275390625,
-0.03302001953125,
-0.006755828857421875,
-0.0196685791015625,
-0.0250396728515625,
0.053558349609375,
-0.0049285888671875,
-0.062103271484375,
-0.0167694091796875,
-0.0006957054138183594,
0.02069091796875,
0.004566192626953125,
-0.016021728515625,
-0.0036468505859375,
-0.0251312255859375,
0.026519775390625,
0.028472900390625,
-0.055816650390625,
-0.0222320556640625,
-0.0147705078125,
-0.00418853759765625,
0.0301055908203125,
0.02911376953125,
-0.0494384765625,
0.037506103515625,
0.043701171875,
0.01715087890625,
0.058074951171875,
-0.00626373291015625,
0.018035888671875,
-0.054290771484375,
0.031768798828125,
0.0092010498046875,
0.044097900390625,
0.0352783203125,
-0.027740478515625,
0.040985107421875,
0.036285400390625,
-0.037811279296875,
-0.038726806640625,
-0.00403594970703125,
-0.09576416015625,
-0.005096435546875,
0.081787109375,
-0.0185394287109375,
-0.014556884765625,
0.0190277099609375,
-0.045562744140625,
0.043731689453125,
-0.0203857421875,
0.05755615234375,
0.0164337158203125,
0.005245208740234375,
-0.0362548828125,
-0.0323486328125,
0.03338623046875,
0.0230255126953125,
-0.042938232421875,
-0.020782470703125,
0.007259368896484375,
0.031494140625,
0.0287322998046875,
0.028594970703125,
-0.0201263427734375,
0.031707763671875,
0.002452850341796875,
0.023040771484375,
-0.016571044921875,
-0.003192901611328125,
-0.02532958984375,
-0.00433349609375,
-0.024200439453125,
-0.047698974609375
]
] |
t5-small | 2023-06-30T02:31:26.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"onnx",
"safetensors",
"t5",
"text2text-generation",
"summarization",
"translation",
"en",
"fr",
"ro",
"de",
"multilingual",
"dataset:c4",
"arxiv:1805.12471",
"arxiv:1708.00055",
"arxiv:1704.05426",
"arxiv:1606.05250",
"arxiv:1808.09121",
"arxiv:1810.12885",
"arxiv:1905.10044",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | translation | null | null | null | t5-small | 161 | 2,346,088 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
- fr
- ro
- de
- multilingual
license: apache-2.0
tags:
- summarization
- translation
datasets:
- c4
---
# Model Card for T5 Small

# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training Details](#training-details)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Citation](#citation)
8. [Model Card Authors](#model-card-authors)
9. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
The developers of the Text-To-Text Transfer Transformer (T5) [write](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html):
> With T5, we propose reframing all NLP tasks into a unified text-to-text-format where the input and output are always text strings, in contrast to BERT-style models that can only output either a class label or a span of the input. Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task.
T5-Small is the checkpoint with 60 million parameters.
- **Developed by:** Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu. See [associated paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) and [GitHub repo](https://github.com/google-research/text-to-text-transfer-transformer#released-model-checkpoints)
- **Model type:** Language model
- **Language(s) (NLP):** English, French, Romanian, German
- **License:** Apache 2.0
- **Related Models:** [All T5 Checkpoints](https://huggingface.co/models?search=t5)
- **Resources for more information:**
- [Research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf)
- [Google's T5 Blog Post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
- [GitHub Repo](https://github.com/google-research/text-to-text-transfer-transformer)
- [Hugging Face T5 Docs](https://huggingface.co/docs/transformers/model_doc/t5)
# Uses
## Direct Use and Downstream Use
The developers write in a [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) that the model:
> Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task, including machine translation, document summarization, question answering, and classification tasks (e.g., sentiment analysis). We can even apply T5 to regression tasks by training it to predict the string representation of a number instead of the number itself.
See the [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.
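As a minimal sketch of this text-to-text framing, the checkpoint can be exercised through the `transformers` translation pipeline (English-to-German is one of the supported settings listed above; the pipeline supplies the T5 task prefix internally). The input sentence is only an illustrative example:

```python
from transformers import pipeline

# T5 casts translation as text-to-text generation; the
# "translation_en_to_de" pipeline prepends the task prefix
# "translate English to German: " before calling generate().
translator = pipeline("translation_en_to_de", model="t5-small")

result = translator("The house is wonderful.")
print(result[0]["translation_text"])
```

The same model object serves summarization or other tasks simply by changing the task prefix on the input text, which is the core claim of the text-to-text framework.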
## Out-of-Scope Use
More information needed.
# Bias, Risks, and Limitations
More information needed.
## Recommendations
More information needed.
# Training Details
## Training Data
The model is pre-trained on the [Colossal Clean Crawled Corpus (C4)](https://www.tensorflow.org/datasets/catalog/c4), which was developed and released in the context of the same [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) as T5.
The model was pre-trained on a **multi-task mixture of unsupervised (1.) and supervised tasks (2.)**.
The following datasets were used for (1.) and (2.):
1. **Datasets used for Unsupervised denoising objective**:
- [C4](https://huggingface.co/datasets/c4)
- [Wiki-DPR](https://huggingface.co/datasets/wiki_dpr)
2. **Datasets used for Supervised text-to-text language modeling objective**
- Sentence acceptability judgment
- CoLA [Warstadt et al., 2018](https://arxiv.org/abs/1805.12471)
- Sentiment analysis
- SST-2 [Socher et al., 2013](https://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf)
- Paraphrasing/sentence similarity
- MRPC [Dolan and Brockett, 2005](https://aclanthology.org/I05-5002)
 - STS-B [Cer et al., 2017](https://arxiv.org/abs/1708.00055)
- QQP [Iyer et al., 2017](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
- Natural language inference
- MNLI [Williams et al., 2017](https://arxiv.org/abs/1704.05426)
 - QNLI [Rajpurkar et al., 2016](https://arxiv.org/abs/1606.05250)
- RTE [Dagan et al., 2005](https://link.springer.com/chapter/10.1007/11736790_9)
 - CB [De Marneffe et al., 2019](https://semanticsarchive.net/Archive/Tg3ZGI2M/Marneffe.pdf)
- Sentence completion
- COPA [Roemmele et al., 2011](https://www.researchgate.net/publication/221251392_Choice_of_Plausible_Alternatives_An_Evaluation_of_Commonsense_Causal_Reasoning)
- Word sense disambiguation
- WIC [Pilehvar and Camacho-Collados, 2018](https://arxiv.org/abs/1808.09121)
- Question answering
- MultiRC [Khashabi et al., 2018](https://aclanthology.org/N18-1023)
- ReCoRD [Zhang et al., 2018](https://arxiv.org/abs/1810.12885)
- BoolQ [Clark et al., 2019](https://arxiv.org/abs/1905.10044)
## Training Procedure
In their [abstract](https://jmlr.org/papers/volume21/20-074/20-074.pdf), the model developers write:
> In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks.
The framework introduced, the T5 framework, involves a training procedure that brings together the approaches studied in the paper. See the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.
# Evaluation
## Testing Data, Factors & Metrics
The developers evaluated the model on 24 tasks, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for full details.
## Results
For full results for T5-small, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf), Table 14.
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** Google Cloud TPU Pods
- **Hours used:** More information needed
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Citation
**BibTeX:**
```bibtex
@article{2020t5,
author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
journal = {Journal of Machine Learning Research},
year = {2020},
volume = {21},
number = {140},
pages = {1-67},
url = {http://jmlr.org/papers/v21/20-074.html}
}
```
**APA:**
- Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., ... & Liu, P. J. (2020). Exploring the limits of transfer learning with a unified text-to-text transformer. J. Mach. Learn. Res., 21(140), 1-67.
# Model Card Authors
This model card was written by the team at Hugging Face.
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import T5Tokenizer, T5Model
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5Model.from_pretrained("t5-small")
input_ids = tokenizer(
"Studies have been shown that owning a dog is good for you", return_tensors="pt"
).input_ids # Batch size 1
decoder_input_ids = tokenizer("Studies show that", return_tensors="pt").input_ids # Batch size 1
# forward pass
outputs = model(input_ids=input_ids, decoder_input_ids=decoder_input_ids)
last_hidden_states = outputs.last_hidden_state
```
See the [Hugging Face T5](https://huggingface.co/docs/transformers/model_doc/t5#transformers.T5Model) docs and a [Colab Notebook](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/main/notebooks/t5-trivia.ipynb) created by the model developers for more examples.
</details>
| 8,473 | [
[
-0.0220489501953125,
-0.0258941650390625,
0.03717041015625,
0.00821685791015625,
-0.011505126953125,
-0.009307861328125,
-0.0204010009765625,
-0.043121337890625,
-0.02386474609375,
0.030792236328125,
-0.039154052734375,
-0.044342041015625,
-0.060302734375,
0.02618408203125,
-0.03900146484375,
0.08050537109375,
-0.006256103515625,
-0.01116943359375,
-0.00904083251953125,
-0.010040283203125,
-0.0269012451171875,
-0.03729248046875,
-0.047637939453125,
-0.0263824462890625,
0.032012939453125,
0.0196380615234375,
0.02154541015625,
0.03338623046875,
0.051300048828125,
0.017974853515625,
-0.00897979736328125,
-0.0016269683837890625,
-0.03460693359375,
-0.0238189697265625,
-0.02001953125,
-0.0250701904296875,
-0.0261993408203125,
-0.003673553466796875,
0.043243408203125,
0.054595947265625,
0.0028858184814453125,
0.02947998046875,
0.01206207275390625,
0.037994384765625,
-0.04547119140625,
0.01010894775390625,
-0.04364013671875,
0.007549285888671875,
-0.002429962158203125,
0.006252288818359375,
-0.045013427734375,
-0.0035419464111328125,
0.0153961181640625,
-0.0455322265625,
0.025054931640625,
-0.00017952919006347656,
0.08966064453125,
0.0257415771484375,
-0.034942626953125,
-0.01410675048828125,
-0.056915283203125,
0.08294677734375,
-0.058990478515625,
0.038818359375,
0.0120086669921875,
0.01152801513671875,
0.0108642578125,
-0.086181640625,
-0.052703857421875,
-0.0006084442138671875,
-0.01861572265625,
0.016632080078125,
-0.023590087890625,
0.0025730133056640625,
0.0258941650390625,
0.0263519287109375,
-0.03216552734375,
-0.00173187255859375,
-0.045013427734375,
-0.00891876220703125,
0.042510986328125,
-0.00040531158447265625,
0.0247344970703125,
-0.0166015625,
-0.034088134765625,
-0.02301025390625,
-0.0265350341796875,
0.00873565673828125,
-0.00980377197265625,
0.0223236083984375,
-0.02783203125,
0.02227783203125,
0.006549835205078125,
0.044708251953125,
0.01503753662109375,
-0.0158538818359375,
0.0289764404296875,
-0.058990478515625,
-0.018463134765625,
-0.026702880859375,
0.0845947265625,
0.0216522216796875,
0.0084075927734375,
-0.0306243896484375,
-0.0036525726318359375,
-0.00992584228515625,
0.027740478515625,
-0.07476806640625,
-0.01020050048828125,
0.0221710205078125,
-0.039703369140625,
-0.03619384765625,
-0.004482269287109375,
-0.0594482421875,
-0.0030345916748046875,
-0.007205963134765625,
0.0372314453125,
-0.03704833984375,
-0.01273345947265625,
0.0130157470703125,
-0.0206756591796875,
0.0255126953125,
0.0228118896484375,
-0.065673828125,
0.026702880859375,
0.0232391357421875,
0.05621337890625,
-0.033050537109375,
-0.0266265869140625,
-0.0125885009765625,
0.0084075927734375,
-0.00830841064453125,
0.052734375,
-0.0303192138671875,
-0.032012939453125,
-0.01210784912109375,
0.0137939453125,
-0.01849365234375,
-0.021697998046875,
0.06280517578125,
-0.019439697265625,
0.053985595703125,
-0.0234527587890625,
-0.038665771484375,
-0.028839111328125,
0.01116943359375,
-0.0474853515625,
0.08917236328125,
-0.0009112358093261719,
-0.060455322265625,
0.022369384765625,
-0.07073974609375,
-0.020416259765625,
-0.0214996337890625,
0.021636962890625,
-0.042327880859375,
-0.01849365234375,
0.0223541259765625,
0.04632568359375,
-0.030303955078125,
0.023956298828125,
-0.02203369140625,
-0.019775390625,
0.0064544677734375,
-0.024932861328125,
0.07666015625,
0.0218963623046875,
-0.03656005859375,
0.0017566680908203125,
-0.054595947265625,
0.00201416015625,
0.0010862350463867188,
-0.0202789306640625,
0.0003197193145751953,
-0.0161895751953125,
0.0178375244140625,
0.034912109375,
0.0179595947265625,
-0.038818359375,
0.003337860107421875,
-0.0245819091796875,
0.04962158203125,
0.035125732421875,
-0.00534820556640625,
0.045013427734375,
-0.037872314453125,
0.02825927734375,
0.0140380859375,
0.006183624267578125,
-0.01198577880859375,
-0.027740478515625,
-0.0614013671875,
0.000347137451171875,
0.040008544921875,
0.0401611328125,
-0.047393798828125,
0.04052734375,
-0.041778564453125,
-0.05291748046875,
-0.0501708984375,
-0.0031147003173828125,
0.027435302734375,
0.048919677734375,
0.060272216796875,
-0.007068634033203125,
-0.045196533203125,
-0.04840087890625,
-0.025726318359375,
-0.003665924072265625,
-0.0031642913818359375,
0.0092926025390625,
0.055450439453125,
-0.01023101806640625,
0.063720703125,
-0.0228424072265625,
-0.030792236328125,
-0.04058837890625,
0.0005540847778320312,
-0.004764556884765625,
0.0439453125,
0.048858642578125,
-0.0556640625,
-0.038604736328125,
-0.014190673828125,
-0.061492919921875,
-0.0000445246696472168,
-0.011505126953125,
0.0012960433959960938,
0.0298919677734375,
0.04083251953125,
-0.04498291015625,
0.01800537109375,
0.045074462890625,
-0.0248565673828125,
0.0212860107421875,
-0.01104736328125,
-0.0016222000122070312,
-0.12139892578125,
0.038970947265625,
0.01023101806640625,
-0.0158843994140625,
-0.055816650390625,
-0.00846099853515625,
0.0036678314208984375,
-0.007129669189453125,
-0.04248046875,
0.05499267578125,
-0.030914306640625,
0.0027751922607421875,
-0.0011014938354492188,
0.005733489990234375,
0.009124755859375,
0.049957275390625,
-0.002742767333984375,
0.05926513671875,
0.016815185546875,
-0.053253173828125,
-0.001377105712890625,
0.0258941650390625,
-0.00560760498046875,
0.022430419921875,
-0.055450439453125,
0.022003173828125,
-0.004039764404296875,
0.035003662109375,
-0.07049560546875,
0.0106048583984375,
0.0265045166015625,
-0.05035400390625,
0.023529052734375,
-0.00081634521484375,
-0.03155517578125,
-0.028472900390625,
-0.024627685546875,
0.0205230712890625,
0.051483154296875,
-0.038360595703125,
0.05340576171875,
0.00992584228515625,
0.021881103515625,
-0.056427001953125,
-0.0655517578125,
0.0107421875,
-0.030792236328125,
-0.039764404296875,
0.060302734375,
-0.01062774658203125,
0.006969451904296875,
0.0114898681640625,
0.00283050537109375,
-0.01522064208984375,
0.00962066650390625,
0.0027751922607421875,
0.01361083984375,
0.0012788772583007812,
0.01409912109375,
-0.00763702392578125,
-0.0119476318359375,
-0.0021533966064453125,
-0.03448486328125,
0.0208587646484375,
-0.01453399658203125,
0.0133819580078125,
-0.050445556640625,
0.013092041015625,
0.042724609375,
-0.01197052001953125,
0.0638427734375,
0.07586669921875,
-0.019775390625,
-0.00597381591796875,
-0.032806396484375,
-0.019378662109375,
-0.034698486328125,
0.0279541015625,
-0.034210205078125,
-0.0667724609375,
0.03240966796875,
0.0023860931396484375,
0.0269317626953125,
0.06768798828125,
0.0266571044921875,
-0.0133056640625,
0.058502197265625,
0.06634521484375,
-0.0026226043701171875,
0.042633056640625,
-0.03509521484375,
0.0225067138671875,
-0.06524658203125,
-0.0177001953125,
-0.057525634765625,
-0.022064208984375,
-0.06109619140625,
-0.0282745361328125,
0.0083465576171875,
-0.000751495361328125,
-0.02587890625,
0.03753662109375,
-0.04168701171875,
0.00897979736328125,
0.032318115234375,
0.005886077880859375,
0.027862548828125,
-0.0005254745483398438,
-0.005863189697265625,
-0.01375579833984375,
-0.06878662109375,
-0.03729248046875,
0.09649658203125,
0.027496337890625,
0.0305328369140625,
-0.0008788108825683594,
0.050384521484375,
0.0187225341796875,
0.0160980224609375,
-0.057159423828125,
0.05206298828125,
-0.030853271484375,
-0.037078857421875,
-0.0191497802734375,
-0.032012939453125,
-0.08673095703125,
0.0225830078125,
-0.0261077880859375,
-0.053009033203125,
0.0120086669921875,
0.0003046989440917969,
-0.019378662109375,
0.03863525390625,
-0.06634521484375,
0.083984375,
-0.004024505615234375,
-0.0253143310546875,
-0.0014886856079101562,
-0.054595947265625,
0.01788330078125,
0.0014123916625976562,
0.01020050048828125,
0.0089569091796875,
-0.0130615234375,
0.07525634765625,
-0.025909423828125,
0.07061767578125,
-0.01406097412109375,
0.0034542083740234375,
0.01160430908203125,
-0.026336669921875,
0.0341796875,
-0.0300750732421875,
-0.0052032470703125,
0.0301055908203125,
0.01172637939453125,
-0.03448486328125,
-0.0399169921875,
0.034149169921875,
-0.0718994140625,
-0.02685546875,
-0.031280517578125,
-0.03363037109375,
-0.0101776123046875,
0.029296875,
0.028533935546875,
0.01154327392578125,
-0.01273345947265625,
0.0283966064453125,
0.048736572265625,
-0.0266571044921875,
0.0565185546875,
0.0247650146484375,
0.0014190673828125,
-0.02203369140625,
0.05914306640625,
0.010162353515625,
0.02783203125,
0.04339599609375,
0.01537322998046875,
-0.024505615234375,
-0.04132080078125,
-0.027587890625,
0.0258636474609375,
-0.04693603515625,
-0.00820159912109375,
-0.0714111328125,
-0.015533447265625,
-0.042327880859375,
-0.0020236968994140625,
-0.03387451171875,
-0.0301513671875,
-0.035308837890625,
-0.01140594482421875,
0.0206756591796875,
0.0360107421875,
0.01036834716796875,
0.016326904296875,
-0.06976318359375,
0.01425933837890625,
0.0036449432373046875,
0.005672454833984375,
0.001255035400390625,
-0.06170654296875,
-0.012664794921875,
0.0078125,
-0.031402587890625,
-0.0487060546875,
0.03289794921875,
0.0186614990234375,
0.0247039794921875,
0.0019197463989257812,
0.0127716064453125,
0.0474853515625,
-0.0209808349609375,
0.0762939453125,
0.012420654296875,
-0.0784912109375,
0.021484375,
-0.018463134765625,
0.03155517578125,
0.0401611328125,
0.03515625,
-0.048248291015625,
-0.01593017578125,
-0.0760498046875,
-0.05767822265625,
0.05926513671875,
0.019317626953125,
0.0088653564453125,
0.0299530029296875,
0.01904296875,
0.002964019775390625,
0.0118560791015625,
-0.071533203125,
-0.017578125,
-0.0176849365234375,
-0.029266357421875,
-0.00634765625,
-0.004627227783203125,
0.007366180419921875,
-0.027099609375,
0.04931640625,
-0.00909423828125,
0.057891845703125,
0.022430419921875,
-0.0210723876953125,
0.01331329345703125,
0.029266357421875,
0.05194091796875,
0.037689208984375,
-0.013427734375,
-0.0005254745483398438,
0.035552978515625,
-0.040008544921875,
-0.0021762847900390625,
0.0118560791015625,
-0.0253448486328125,
0.0016717910766601562,
0.035064697265625,
0.072021484375,
0.00838470458984375,
-0.030029296875,
0.042938232421875,
-0.005584716796875,
-0.044525146484375,
-0.0200042724609375,
-0.005146026611328125,
0.0089569091796875,
-0.0012874603271484375,
0.018096923828125,
0.0189666748046875,
0.00891876220703125,
-0.03717041015625,
0.0043487548828125,
0.00946807861328125,
-0.03509521484375,
-0.035064697265625,
0.061370849609375,
0.02545166015625,
-0.00225067138671875,
0.04388427734375,
-0.010162353515625,
-0.038543701171875,
0.04144287109375,
0.0399169921875,
0.07696533203125,
-0.001995086669921875,
0.013916015625,
0.05010986328125,
0.031463623046875,
-0.00885009765625,
0.0026607513427734375,
-0.005367279052734375,
-0.060516357421875,
-0.0426025390625,
-0.0367431640625,
-0.0223541259765625,
0.0174713134765625,
-0.033843994140625,
0.0262298583984375,
-0.0244293212890625,
-0.0012884140014648438,
0.00782012939453125,
0.0136566162109375,
-0.05975341796875,
0.025177001953125,
0.0011510848999023438,
0.06341552734375,
-0.05718994140625,
0.06689453125,
0.0565185546875,
-0.04705810546875,
-0.0714111328125,
0.0119476318359375,
-0.02117919921875,
-0.048095703125,
0.04278564453125,
0.010772705078125,
-0.0015935897827148438,
0.01430511474609375,
-0.04132080078125,
-0.06494140625,
0.1007080078125,
0.0264739990234375,
-0.02276611328125,
-0.02679443359375,
0.0200958251953125,
0.049896240234375,
-0.0208892822265625,
0.0306854248046875,
0.0367431640625,
0.037384033203125,
0.0163726806640625,
-0.07904052734375,
0.0252227783203125,
-0.0175323486328125,
0.007038116455078125,
0.002513885498046875,
-0.06353759765625,
0.043731689453125,
-0.0262908935546875,
-0.0176239013671875,
-0.0128631591796875,
0.0518798828125,
0.0028324127197265625,
0.0167388916015625,
0.035675048828125,
0.05548095703125,
0.051666259765625,
-0.00838470458984375,
0.08837890625,
-0.0235443115234375,
0.036468505859375,
0.062744140625,
0.01258087158203125,
0.06927490234375,
0.039306640625,
-0.02349853515625,
0.03582763671875,
0.050689697265625,
-0.00844573974609375,
0.040008544921875,
-0.01018524169921875,
-0.0028171539306640625,
-0.007015228271484375,
-0.0126495361328125,
-0.0253753662109375,
0.0193634033203125,
0.0179290771484375,
-0.031707763671875,
-0.0184173583984375,
0.01129150390625,
0.022125244140625,
-0.011810302734375,
-0.006916046142578125,
0.06353759765625,
0.02008056640625,
-0.05743408203125,
0.054443359375,
0.01280975341796875,
0.0673828125,
-0.03314208984375,
0.007335662841796875,
-0.0144500732421875,
0.0179901123046875,
-0.0242156982421875,
-0.050506591796875,
0.03790283203125,
0.002254486083984375,
-0.01490020751953125,
-0.05126953125,
0.06439208984375,
-0.03338623046875,
-0.03118896484375,
0.0249176025390625,
0.0347900390625,
0.00673675537109375,
0.0046844482421875,
-0.07049560546875,
-0.00702667236328125,
0.01386260986328125,
-0.0158233642578125,
0.0264434814453125,
0.0297393798828125,
0.005283355712890625,
0.050018310546875,
0.04449462890625,
-0.01349639892578125,
0.0010318756103515625,
-0.0109405517578125,
0.049957275390625,
-0.0560302734375,
-0.023468017578125,
-0.0552978515625,
0.052093505859375,
-0.0007190704345703125,
-0.03363037109375,
0.050506591796875,
0.0325927734375,
0.0814208984375,
-0.01056671142578125,
0.07373046875,
-0.0147705078125,
0.041229248046875,
-0.031707763671875,
0.039459228515625,
-0.052581787109375,
0.0145721435546875,
-0.0265045166015625,
-0.060577392578125,
-0.0213165283203125,
0.0311126708984375,
-0.02923583984375,
0.02508544921875,
0.0814208984375,
0.046905517578125,
-0.001678466796875,
-0.01088714599609375,
0.0162811279296875,
0.0138092041015625,
0.0264129638671875,
0.05645751953125,
0.027374267578125,
-0.0745849609375,
0.0733642578125,
-0.0281219482421875,
0.0174713134765625,
-0.002841949462890625,
-0.0614013671875,
-0.07080078125,
-0.06170654296875,
-0.0296783447265625,
-0.033782958984375,
0.0081024169921875,
0.059661865234375,
0.04412841796875,
-0.05157470703125,
-0.0189208984375,
-0.02685546875,
0.0020999908447265625,
-0.0178375244140625,
-0.0158538818359375,
0.038726806640625,
-0.036590576171875,
-0.06622314453125,
0.0020198822021484375,
-0.006443023681640625,
0.00417327880859375,
0.0002570152282714844,
-0.002506256103515625,
-0.022857666015625,
-0.01309967041015625,
0.046295166015625,
0.0142974853515625,
-0.047943115234375,
-0.024627685546875,
0.018341064453125,
-0.01470947265625,
0.0095062255859375,
0.03619384765625,
-0.049346923828125,
0.01415252685546875,
0.039306640625,
0.06951904296875,
0.06341552734375,
-0.0111236572265625,
0.045379638671875,
-0.033477783203125,
-0.007488250732421875,
0.01082611083984375,
0.00800323486328125,
0.0292205810546875,
-0.016265869140625,
0.048248291015625,
0.037322998046875,
-0.036773681640625,
-0.051177978515625,
-0.0126953125,
-0.0924072265625,
-0.013702392578125,
0.09454345703125,
-0.01129150390625,
-0.018280029296875,
0.0031375885009765625,
-0.0040740966796875,
0.027557373046875,
-0.03448486328125,
0.056396484375,
0.064208984375,
0.00696563720703125,
-0.03448486328125,
-0.04486083984375,
0.0511474609375,
0.044097900390625,
-0.078857421875,
-0.0167236328125,
0.01406097412109375,
0.0379638671875,
0.0089263916015625,
0.046051025390625,
-0.006195068359375,
0.00597381591796875,
-0.0137786865234375,
0.0218658447265625,
0.00006657838821411133,
-0.004123687744140625,
-0.027435302734375,
0.01495361328125,
-0.01358795166015625,
-0.020294189453125
]
] |
patrickjohncyh/fashion-clip | 2023-06-09T01:03:16.000Z | [
"transformers",
"pytorch",
"safetensors",
"clip",
"zero-shot-image-classification",
"vision",
"language",
"fashion",
"ecommerce",
"en",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | patrickjohncyh | null | null | patrickjohncyh/fashion-clip | 94 | 2,161,565 | transformers | 2023-02-21T19:51:47 | ---
license: mit
tags:
- vision
- language
- fashion
- ecommerce
library_name: transformers
language:
- en
widget:
- src: https://cdn-images.farfetch-contents.com/19/76/05/56/19760556_44221665_1000.jpg
candidate_labels: black shoe, red shoe, a cat
example_title: Black Shoe
---
[](https://www.youtube.com/watch?v=uqRSc-KSA1Y) [](https://huggingface.co/patrickjohncyh/fashion-clip) [](https://colab.research.google.com/drive/1Z1hAxBnWjF76bEi9KQ6CMBBEmI_FVDrW?usp=sharing) [](https://towardsdatascience.com/teaching-clip-some-fashion-3005ac3fdcc3) [](https://huggingface.co/spaces/vinid/fashion-clip-app)
# Model Card: Fashion CLIP
Disclaimer: This model card is adapted from the OpenAI CLIP model card [here](https://huggingface.co/openai/clip-vit-base-patch32).
## Model Details
UPDATE (10/03/23): We have updated the model! We found that the [laion/CLIP-ViT-B-32-laion2B-s34B-b79K](https://huggingface.co/laion/CLIP-ViT-B-32-laion2B-s34B-b79K) checkpoint (thanks [Bin](https://www.linkedin.com/in/bin-duan-56205310/)!) worked better than the original OpenAI CLIP on fashion. We thus fine-tuned a newer (and better!) version of FashionCLIP (henceforth FashionCLIP 2.0), while keeping the architecture the same. We postulate that the performance gains afforded by `laion/CLIP-ViT-B-32-laion2B-s34B-b79K` are due to the increased training data (5x the OpenAI CLIP data). Our [thesis](https://www.nature.com/articles/s41598-022-23052-9), however, remains the same -- fine-tuning `laion/CLIP` on our fashion dataset improved zero-shot performance across our benchmarks. See the table below comparing weighted macro F1 scores across models.
| Model | FMNIST | KAGL | DEEP |
| ------------- | ------------- | ------------- | ------------- |
| OpenAI CLIP | 0.66 | 0.63 | 0.45 |
| FashionCLIP | 0.74 | 0.67 | 0.48 |
| Laion CLIP | 0.78 | 0.71 | 0.58 |
| FashionCLIP 2.0 | __0.83__ | __0.73__ | __0.62__ |
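For reference, the weighted macro F1 reported above can be sketched in plain NumPy; the toy labels below are illustrative, and scikit-learn's `f1_score(..., average="weighted")` computes the same quantity:

```python
import numpy as np

def weighted_f1(y_true, y_pred):
    """Support-weighted F1: per-class F1 scores averaged with weights
    proportional to the number of true instances of each class."""
    classes = np.unique(y_true)
    total, score = len(y_true), 0.0
    for c in classes:
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        score += (np.sum(y_true == c) / total) * f1
    return score

# Toy example with 3 classes and one misclassification
y_true = np.array([0, 0, 1, 1, 2])
y_pred = np.array([0, 1, 1, 1, 2])
print(weighted_f1(y_true, y_pred))  # 59/75 ≈ 0.7867
```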
---
FashionCLIP is a CLIP-based model developed to produce general product representations for fashion concepts. Leveraging the pre-trained checkpoint (ViT-B/32) released by [OpenAI](https://github.com/openai/CLIP), we train FashionCLIP on a large, high-quality novel fashion dataset to study whether domain-specific fine-tuning of CLIP-like models is sufficient to produce product representations that are zero-shot transferable to entirely new datasets and tasks. FashionCLIP was not developed for model deployment - to do so, researchers will first need to carefully study its capabilities in relation to the specific context it is being deployed within.
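As a rough illustration of how such representations support zero-shot classification, the sketch below scores an image embedding against candidate-label text embeddings with cosine similarity and a softmax. Random vectors stand in for FashionCLIP's actual encoder outputs, and the temperature value is illustrative:

```python
import numpy as np

def zero_shot_probs(image_emb, text_embs, temperature=0.01):
    """CLIP-style zero-shot classification: cosine similarity between a
    normalized image embedding and each candidate-label text embedding,
    scaled by a temperature and passed through a softmax."""
    image_emb = image_emb / np.linalg.norm(image_emb)
    text_embs = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    logits = text_embs @ image_emb / temperature
    exp = np.exp(logits - logits.max())  # subtract max for numerical stability
    return exp / exp.sum()

# Random vectors stand in for real encoder outputs (dim 512 for ViT-B/32).
rng = np.random.default_rng(0)
image_emb = rng.standard_normal(512)
text_embs = rng.standard_normal((3, 512))  # e.g. "black shoe", "red shoe", "a cat"
probs = zero_shot_probs(image_emb, text_embs)
print(probs)  # three probabilities summing to 1
```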
### Model Date
March 2023
### Model Type
The model uses a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained, starting from a pre-trained checkpoint, to maximize the similarity of (image, text) pairs via a contrastive loss on a fashion dataset containing 800K products.
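The contrastive objective described above can be sketched as a symmetric InfoNCE loss; matching (image, text) pairs sit on the diagonal of the batch similarity matrix. Batch size, embedding dimension, and temperature here are illustrative stand-ins, not the actual training hyperparameters:

```python
import numpy as np

def clip_contrastive_loss(image_embs, text_embs, temperature=0.07):
    """Symmetric contrastive (InfoNCE) loss over a batch of (image, text)
    pairs: cross-entropy in both directions, averaged."""
    image_embs = image_embs / np.linalg.norm(image_embs, axis=1, keepdims=True)
    text_embs = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    logits = image_embs @ text_embs.T / temperature  # (batch, batch)

    def cross_entropy(l):
        # log-softmax per row; the correct class for row i is column i
        m = l.max(axis=1, keepdims=True)
        lse = m + np.log(np.exp(l - m).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(l - lse))

    # average of the image->text and text->image directions
    return 0.5 * (cross_entropy(logits) + cross_entropy(logits.T))

rng = np.random.default_rng(0)
loss = clip_contrastive_loss(rng.standard_normal((8, 512)),
                             rng.standard_normal((8, 512)))
print(loss)  # a positive scalar, roughly log(batch_size) for random inputs
```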
### Documents
- [FashionCLIP Github Repo](https://github.com/patrickjohncyh/fashion-clip)
- [FashionCLIP Paper](https://www.nature.com/articles/s41598-022-23052-9)
## Data
The model was trained on (image, text) pairs obtained from the Farfetch dataset[^1 Awaiting official release.], an English dataset comprising over 800K fashion products, with more than 3K brands across dozens of object types. The image used for encoding is the standard product image, which is a picture of the item over a white background, with no humans. The text used is a concatenation of the _highlight_ (e.g., "stripes", "long sleeves", "Armani") and _short description_ (e.g., "80s styled t-shirt") available in the Farfetch dataset.
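A hypothetical sketch of this caption construction (the function and field names are illustrative, not the actual dataset schema):

```python
def build_caption(highlights, short_description):
    """Concatenate the product highlights with the short description to form
    the training caption, as described above. The joining separator is an
    assumption for illustration."""
    return ", ".join(highlights + [short_description])

caption = build_caption(["stripes", "long sleeves", "Armani"], "80s styled t-shirt")
print(caption)  # stripes, long sleeves, Armani, 80s styled t-shirt
```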
## Limitations, Bias and Fairness
We acknowledge certain limitations of FashionCLIP and expect that it inherits certain limitations and biases present in the original CLIP model. We do not expect our fine-tuning to significantly augment these limitations: we acknowledge that the fashion data we use makes explicit assumptions about the notion of gender as in "blue shoes for a woman" that inevitably associate aspects of clothing with specific people.
Our investigations also suggest that the data used introduces certain limitations in FashionCLIP. From the textual modality, given that most captions derived from the Farfetch dataset are long, we observe that FashionCLIP may be more performant on longer queries than shorter ones. From the image modality, FashionCLIP is also biased towards standard product images (centered, white background).
Model selection, i.e. selecting an appropriate stopping criterion during fine-tuning, remains an open challenge. We observed that using the loss on an in-domain (i.e., same distribution as test) validation dataset is a poor selection criterion when out-of-domain generalization (i.e., across different datasets) is desired, even when the dataset used is relatively diverse and large.
## Citation
```
@Article{Chia2022,
title="Contrastive language and vision learning of general fashion concepts",
author="Chia, Patrick John
and Attanasio, Giuseppe
and Bianchi, Federico
and Terragni, Silvia
and Magalh{\~a}es, Ana Rita
and Goncalves, Diogo
and Greco, Ciro
and Tagliabue, Jacopo",
journal="Scientific Reports",
year="2022",
month="Nov",
day="08",
volume="12",
number="1",
abstract="The steady rise of online shopping goes hand in hand with the development of increasingly complex ML and NLP models. While most use cases are cast as specialized supervised learning problems, we argue that practitioners would greatly benefit from general and transferable representations of products. In this work, we build on recent developments in contrastive learning to train FashionCLIP, a CLIP-like model adapted for the fashion industry. We demonstrate the effectiveness of the representations learned by FashionCLIP with extensive tests across a variety of tasks, datasets and generalization probes. We argue that adaptations of large pre-trained models such as CLIP offer new perspectives in terms of scalability and sustainability for certain types of players in the industry. Finally, we detail the costs and environmental impact of training, and release the model weights and code as open source contribution to the community.",
issn="2045-2322",
doi="10.1038/s41598-022-23052-9",
url="https://doi.org/10.1038/s41598-022-23052-9"
}
``` | 7,002 | [
[
-0.029693603515625,
-0.040496826171875,
0.0106353759765625,
0.027801513671875,
-0.038360595703125,
0.0012998580932617188,
-0.0110931396484375,
-0.0482177734375,
0.03729248046875,
0.01007080078125,
-0.0704345703125,
-0.0528564453125,
-0.0289459228515625,
0.0034332275390625,
-0.01934814453125,
0.06536865234375,
-0.01535797119140625,
-0.00011563301086425781,
-0.00792694091796875,
-0.037139892578125,
-0.039031982421875,
-0.05963134765625,
-0.0196533203125,
-0.0098876953125,
-0.009521484375,
0.0152435302734375,
0.041595458984375,
0.039398193359375,
0.043365478515625,
0.0164031982421875,
-0.01204681396484375,
0.0021514892578125,
-0.053009033203125,
-0.016082763671875,
-0.020721435546875,
-0.024322509765625,
-0.0462646484375,
0.02374267578125,
0.012725830078125,
0.006809234619140625,
-0.000934600830078125,
0.03948974609375,
0.01690673828125,
0.041595458984375,
-0.05487060546875,
-0.003948211669921875,
-0.0223846435546875,
-0.0067291259765625,
-0.0167388916015625,
0.00823211669921875,
-0.01512908935546875,
0.0041656494140625,
0.01029205322265625,
-0.053497314453125,
0.041656494140625,
0.005245208740234375,
0.0953369140625,
0.0020122528076171875,
-0.0018768310546875,
-0.0065765380859375,
-0.048187255859375,
0.06878662109375,
-0.042999267578125,
0.043182373046875,
0.01219940185546875,
0.006866455078125,
0.0321044921875,
-0.052001953125,
-0.028411865234375,
0.021026611328125,
0.00742340087890625,
0.0207061767578125,
-0.0137786865234375,
-0.0220947265625,
0.0109710693359375,
0.03485107421875,
-0.019927978515625,
-0.0034236907958984375,
-0.04486083984375,
0.0040740966796875,
0.050018310546875,
0.0036411285400390625,
0.0201568603515625,
-0.0196533203125,
-0.06439208984375,
-0.037200927734375,
-0.049652099609375,
0.0176849365234375,
0.01309967041015625,
0.00426483154296875,
-0.0440673828125,
0.0181732177734375,
0.0036869049072265625,
0.0275726318359375,
0.01495361328125,
-0.024322509765625,
0.041259765625,
-0.03533935546875,
-0.0230560302734375,
-0.014556884765625,
0.0557861328125,
0.0753173828125,
0.003353118896484375,
0.018463134765625,
0.002407073974609375,
-0.0244903564453125,
0.0034122467041015625,
-0.07135009765625,
-0.0296478271484375,
0.01358795166015625,
-0.0423583984375,
-0.0284423828125,
0.036773681640625,
-0.06646728515625,
0.020843505859375,
-0.02215576171875,
0.045440673828125,
-0.02685546875,
-0.00377655029296875,
0.035858154296875,
-0.03607177734375,
0.032623291015625,
0.0228729248046875,
-0.05084228515625,
0.01000213623046875,
0.03668212890625,
0.082763671875,
-0.0203094482421875,
-0.00559234619140625,
0.02276611328125,
0.021820068359375,
-0.019561767578125,
0.0460205078125,
-0.0202484130859375,
-0.028045654296875,
-0.00974273681640625,
0.03509521484375,
0.0043487548828125,
-0.038177490234375,
0.042205810546875,
-0.0060272216796875,
-0.005046844482421875,
-0.041229248046875,
-0.013641357421875,
-0.01047515869140625,
0.01267242431640625,
-0.044677734375,
0.06097412109375,
0.00328826904296875,
-0.055999755859375,
0.038787841796875,
-0.037628173828125,
-0.0171051025390625,
-0.0189208984375,
0.006595611572265625,
-0.055633544921875,
-0.01050567626953125,
0.04010009765625,
0.04315185546875,
-0.021636962890625,
-0.00800323486328125,
-0.0595703125,
-0.03765869140625,
0.0211334228515625,
-0.0020046234130859375,
0.050567626953125,
-0.00852203369140625,
-0.023406982421875,
-0.0081024169921875,
-0.046630859375,
-0.01229095458984375,
0.0411376953125,
0.004474639892578125,
-0.038177490234375,
-0.025726318359375,
0.0013799667358398438,
0.0192108154296875,
0.007015228271484375,
-0.03839111328125,
0.0168304443359375,
-0.01116180419921875,
0.058258056640625,
0.0743408203125,
0.00864410400390625,
0.0252838134765625,
-0.049835205078125,
0.03570556640625,
-0.0164794921875,
0.047210693359375,
-0.013214111328125,
-0.026092529296875,
-0.0504150390625,
-0.04327392578125,
0.0252532958984375,
0.0279388427734375,
-0.031585693359375,
0.01462554931640625,
-0.004299163818359375,
-0.041015625,
-0.0295867919921875,
-0.00998687744140625,
0.0249176025390625,
0.0265960693359375,
0.0440673828125,
-0.0428466796875,
-0.0426025390625,
-0.05865478515625,
-0.0006771087646484375,
0.006626129150390625,
-0.0278778076171875,
0.04766845703125,
0.055267333984375,
-0.015228271484375,
0.0675048828125,
-0.068603515625,
-0.0300445556640625,
-0.0176239013671875,
0.006717681884765625,
0.019073486328125,
0.04315185546875,
0.08770751953125,
-0.059844970703125,
-0.026824951171875,
-0.03411865234375,
-0.0386962890625,
-0.0111846923828125,
-0.01155853271484375,
-0.0191650390625,
-0.01297760009765625,
0.023040771484375,
-0.01739501953125,
0.049530029296875,
0.029693603515625,
-0.00504302978515625,
0.054168701171875,
-0.0034732818603515625,
0.01294708251953125,
-0.07269287109375,
0.01229095458984375,
0.0198211669921875,
-0.01007843017578125,
-0.0265045166015625,
-0.0022983551025390625,
-0.005260467529296875,
-0.02117919921875,
-0.06268310546875,
0.054046630859375,
-0.0285186767578125,
0.0008411407470703125,
0.005950927734375,
0.00548553466796875,
0.042938232421875,
0.058502197265625,
0.007049560546875,
0.0655517578125,
0.013153076171875,
-0.05438232421875,
0.0178070068359375,
0.05633544921875,
-0.03131103515625,
0.050140380859375,
-0.074951171875,
0.01141357421875,
-0.00341033935546875,
0.01495361328125,
-0.06365966796875,
-0.0287628173828125,
0.03839111328125,
-0.017578125,
0.02838134765625,
-0.0244598388671875,
-0.0240936279296875,
-0.052154541015625,
-0.0682373046875,
0.041656494140625,
0.043975830078125,
-0.061492919921875,
0.00787353515625,
0.0283203125,
0.0108642578125,
-0.0302276611328125,
-0.058074951171875,
-0.045623779296875,
-0.0285491943359375,
-0.040863037109375,
0.054168701171875,
-0.0289306640625,
0.00739288330078125,
-0.0233306884765625,
-0.0292816162109375,
-0.01532745361328125,
0.0012950897216796875,
0.0221710205078125,
0.053009033203125,
-0.002231597900390625,
0.008636474609375,
-0.0100250244140625,
0.0345458984375,
-0.004947662353515625,
-0.012969970703125,
0.03387451171875,
-0.014617919921875,
-0.0235595703125,
-0.035614013671875,
0.00914764404296875,
0.0248260498046875,
-0.01456451416015625,
0.040618896484375,
0.0396728515625,
0.007137298583984375,
-0.0213775634765625,
-0.0306243896484375,
-0.006473541259765625,
-0.039794921875,
0.021942138671875,
-0.01837158203125,
-0.052398681640625,
0.037353515625,
0.002033233642578125,
-0.006694793701171875,
0.04962158203125,
0.01885986328125,
-0.0009264945983886719,
0.07000732421875,
0.0689697265625,
-0.00274658203125,
0.060211181640625,
-0.046630859375,
0.0017232894897460938,
-0.0810546875,
-0.0299835205078125,
-0.0254974365234375,
-0.03875732421875,
-0.03955078125,
-0.04852294921875,
0.0394287109375,
0.0457763671875,
-0.038177490234375,
0.052276611328125,
-0.058807373046875,
0.02789306640625,
0.035491943359375,
0.050567626953125,
-0.0046539306640625,
0.0016622543334960938,
0.004077911376953125,
-0.019927978515625,
-0.05914306640625,
-0.00027179718017578125,
0.0789794921875,
0.042205810546875,
0.07293701171875,
-0.00592041015625,
0.036041259765625,
0.00872802734375,
0.0048370361328125,
-0.0556640625,
0.028106689453125,
-0.038360595703125,
-0.02545166015625,
0.0107879638671875,
-0.00318145751953125,
-0.0401611328125,
-0.003635406494140625,
-0.02691650390625,
-0.047210693359375,
0.062286376953125,
0.032257080078125,
-0.0231170654296875,
0.023681640625,
-0.0300140380859375,
0.060272216796875,
-0.03558349609375,
-0.05517578125,
0.0107574462890625,
-0.048004150390625,
0.0236968994140625,
0.01453399658203125,
-0.0011205673217773438,
-0.022430419921875,
0.005466461181640625,
0.09326171875,
-0.04364013671875,
0.07354736328125,
0.0138092041015625,
0.0160369873046875,
0.03302001953125,
-0.029693603515625,
0.020599365234375,
-0.00516510009765625,
0.00730133056640625,
0.04425048828125,
0.0099639892578125,
-0.0240478515625,
-0.037200927734375,
0.01708984375,
-0.047607421875,
-0.0322265625,
-0.024322509765625,
-0.01114654541015625,
-0.005542755126953125,
0.00698089599609375,
0.052398681640625,
0.048919677734375,
-0.024322509765625,
0.0244903564453125,
0.045257568359375,
-0.019378662109375,
0.015655517578125,
0.0262603759765625,
-0.00476837158203125,
-0.05029296875,
0.07330322265625,
0.006114959716796875,
0.0259552001953125,
0.0228271484375,
0.03118896484375,
-0.01049041748046875,
-0.01313018798828125,
-0.02288818359375,
0.035797119140625,
-0.05841064453125,
-0.029937744140625,
-0.0222015380859375,
-0.0106658935546875,
-0.037384033203125,
-0.03253173828125,
-0.0321044921875,
-0.034332275390625,
-0.053009033203125,
-0.0030364990234375,
0.0335693359375,
0.05084228515625,
-0.01049041748046875,
0.023956298828125,
-0.05145263671875,
0.00536346435546875,
0.0301361083984375,
0.023468017578125,
-0.0021839141845703125,
-0.0452880859375,
-0.01290130615234375,
0.0074462890625,
-0.0518798828125,
-0.052490234375,
0.037017822265625,
0.032501220703125,
0.03277587890625,
0.0457763671875,
0.016845703125,
0.073974609375,
-0.0286865234375,
0.06878662109375,
0.031494140625,
-0.068359375,
0.0478515625,
-0.029571533203125,
0.0171661376953125,
0.040069580078125,
0.06158447265625,
-0.0265045166015625,
-0.003444671630859375,
-0.06494140625,
-0.08062744140625,
0.05029296875,
-0.002208709716796875,
0.0020542144775390625,
-0.006488800048828125,
0.02667236328125,
-0.00257110595703125,
0.036407470703125,
-0.040130615234375,
-0.01470184326171875,
-0.0496826171875,
0.0038318634033203125,
0.0167388916015625,
-0.035369873046875,
-0.00229644775390625,
-0.03826904296875,
0.044158935546875,
-0.009185791015625,
0.043548583984375,
0.036102294921875,
0.006649017333984375,
0.02044677734375,
0.008392333984375,
0.051727294921875,
0.0531005859375,
-0.03472900390625,
-0.039581298828125,
0.00995635986328125,
-0.038848876953125,
-0.0164337158203125,
-0.019683837890625,
-0.0210418701171875,
0.01049041748046875,
0.02752685546875,
0.08050537109375,
0.04046630859375,
-0.0474853515625,
0.08123779296875,
0.01389312744140625,
-0.0240936279296875,
-0.00811767578125,
-0.0062103271484375,
-0.0361328125,
0.0174713134765625,
0.00505828857421875,
0.0114288330078125,
0.01116180419921875,
-0.058563232421875,
0.032867431640625,
0.036590576171875,
-0.0277252197265625,
-0.051177978515625,
0.0738525390625,
0.0154876708984375,
-0.0029201507568359375,
0.021026611328125,
0.0131072998046875,
-0.06463623046875,
0.044036865234375,
0.05316162109375,
0.06341552734375,
-0.0199127197265625,
0.02252197265625,
0.057281494140625,
-0.01204681396484375,
-0.0433349609375,
0.0003719329833984375,
-0.0154876708984375,
-0.038604736328125,
-0.0044097900390625,
-0.058929443359375,
-0.043426513671875,
0.01021575927734375,
-0.0689697265625,
0.045257568359375,
-0.049285888671875,
-0.02252197265625,
-0.01861572265625,
-0.0246429443359375,
-0.05615234375,
0.015106201171875,
-0.01457977294921875,
0.07110595703125,
-0.07391357421875,
0.029205322265625,
0.0264892578125,
-0.043853759765625,
-0.059967041015625,
0.00662994384765625,
-0.00518035888671875,
-0.05963134765625,
0.05108642578125,
0.03192138671875,
0.0007581710815429688,
-0.0438232421875,
-0.06494140625,
-0.05914306640625,
0.0902099609375,
0.022003173828125,
-0.0413818359375,
-0.0293426513671875,
-0.01006317138671875,
0.0306549072265625,
-0.034210205078125,
0.0084381103515625,
0.0289306640625,
-0.004364013671875,
0.03851318359375,
-0.0521240234375,
-0.0137786865234375,
-0.010284423828125,
0.00263214111328125,
0.006191253662109375,
-0.08331298828125,
0.08355712890625,
-0.00555419921875,
-0.0226898193359375,
0.020599365234375,
0.036346435546875,
0.003551483154296875,
0.0283355712890625,
0.02056884765625,
0.03692626953125,
0.042205810546875,
-0.003215789794921875,
0.0797119140625,
-0.01148223876953125,
0.051300048828125,
0.097412109375,
0.005214691162109375,
0.07122802734375,
0.0033473968505859375,
-0.016204833984375,
0.0343017578125,
0.0355224609375,
-0.0452880859375,
0.048736572265625,
-0.007701873779296875,
0.00980377197265625,
-0.0179290771484375,
-0.00902557373046875,
-0.029205322265625,
0.042022705078125,
-0.0023212432861328125,
-0.0408935546875,
0.00598907470703125,
0.01739501953125,
0.009552001953125,
-0.0107879638671875,
-0.02764892578125,
0.04302978515625,
0.006072998046875,
-0.00925445556640625,
0.061126708984375,
-0.01128387451171875,
0.0611572265625,
-0.05810546875,
-0.01354217529296875,
0.0028858184814453125,
0.01319122314453125,
-0.01197052001953125,
-0.048980712890625,
0.02197265625,
-0.01425933837890625,
-0.01259613037109375,
-0.033172607421875,
0.058990478515625,
-0.01206207275390625,
-0.04144287109375,
0.017486572265625,
-0.01508331298828125,
0.005939483642578125,
0.004726409912109375,
-0.060943603515625,
0.028411865234375,
0.00443267822265625,
0.0005545616149902344,
0.0194244384765625,
0.004825592041015625,
0.0023517608642578125,
0.043121337890625,
0.041046142578125,
-0.009552001953125,
-0.01171875,
-0.0163116455078125,
0.07489013671875,
-0.033935546875,
-0.045318603515625,
-0.054046630859375,
0.0230712890625,
-0.0245361328125,
-0.0270843505859375,
0.0650634765625,
0.04010009765625,
0.09112548828125,
-0.0285491943359375,
0.03240966796875,
0.0016841888427734375,
0.0394287109375,
-0.04254150390625,
0.03759765625,
-0.049072265625,
0.022125244140625,
-0.05316162109375,
-0.0396728515625,
-0.01219940185546875,
0.054168701171875,
-0.0271759033203125,
-0.013275146484375,
0.035064697265625,
0.035186767578125,
-0.0163116455078125,
-0.0134735107421875,
0.007781982421875,
-0.00585174560546875,
0.01232147216796875,
0.031707763671875,
0.0238494873046875,
-0.05340576171875,
0.03387451171875,
-0.047210693359375,
-0.0160675048828125,
-0.027923583984375,
-0.056976318359375,
-0.07342529296875,
-0.04254150390625,
-0.02886962890625,
-0.0214996337890625,
0.013214111328125,
0.068603515625,
0.0723876953125,
-0.047119140625,
-0.01043701171875,
0.0310821533203125,
0.0007052421569824219,
-0.028533935546875,
-0.0145416259765625,
0.0284881591796875,
0.017974853515625,
-0.05560302734375,
0.01129913330078125,
0.022796630859375,
0.03125,
0.015533447265625,
0.01337432861328125,
-0.004894256591796875,
0.01393890380859375,
0.0592041015625,
0.040283203125,
-0.0299835205078125,
-0.03375244140625,
0.0014028549194335938,
0.028961181640625,
0.0389404296875,
0.06396484375,
-0.051971435546875,
0.0164031982421875,
0.038970947265625,
0.024505615234375,
0.05633544921875,
0.03912353515625,
0.02984619140625,
-0.06634521484375,
0.0238800048828125,
-0.00931549072265625,
0.022796630859375,
0.025726318359375,
-0.0089569091796875,
0.04913330078125,
0.045562744140625,
-0.0302734375,
-0.0494384765625,
0.01654052734375,
-0.08203125,
-0.0281982421875,
0.06414794921875,
-0.02008056640625,
-0.05316162109375,
0.0214691162109375,
-0.0210418701171875,
0.00357818603515625,
-0.0272064208984375,
0.02984619140625,
0.021148681640625,
0.004932403564453125,
-0.0145263671875,
-0.0291595458984375,
0.034271240234375,
0.00013875961303710938,
-0.03167724609375,
-0.00586700439453125,
0.0236053466796875,
0.050750732421875,
0.018768310546875,
0.03948974609375,
-0.00554656982421875,
0.0196075439453125,
-0.0233612060546875,
-0.01297760009765625,
-0.0175018310546875,
-0.0267486572265625,
-0.01444244384765625,
0.01517486572265625,
-0.02752685546875,
-0.038665771484375
]
] |
Intel/dpt-hybrid-midas | 2023-03-06T16:35:15.000Z | [
"transformers",
"pytorch",
"dpt",
"depth-estimation",
"vision",
"arxiv:2103.13413",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | depth-estimation | Intel | null | null | Intel/dpt-hybrid-midas | 38 | 2,114,701 | transformers | 2022-12-06T09:12:55 | ---
license: apache-2.0
tags:
- vision
- depth-estimation
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg
example_title: Tiger
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg
example_title: Teapot
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg
example_title: Palace
model-index:
- name: dpt-hybrid-midas
results:
- task:
type: monocular-depth-estimation
name: Monocular Depth Estimation
dataset:
type: MIX 6
name: MIX 6
metrics:
- type: Zero-shot transfer
value: 11.06
name: Zero-shot transfer
config: Zero-shot transfer
verified: false
---
## Model Details: DPT-Hybrid
Dense Prediction Transformer (DPT) model trained on 1.4 million images for monocular depth estimation.
It was introduced in the paper [Vision Transformers for Dense Prediction](https://arxiv.org/abs/2103.13413) by Ranftl et al. (2021) and first released in [this repository](https://github.com/isl-org/DPT).
DPT uses the Vision Transformer (ViT) as backbone and adds a neck + head on top for monocular depth estimation.

This repository hosts the "hybrid" version of the model as stated in the paper. DPT-Hybrid diverges from DPT by using [ViT-hybrid](https://huggingface.co/google/vit-hybrid-base-bit-384) as a backbone and taking some activations from the backbone.
The model card has been written in combination by the Hugging Face team and Intel.
| Model Detail | Description |
| ----------- | ----------- |
| Model Authors - Company | Intel |
| Date | December 22, 2022 |
| Version | 1 |
| Type | Computer Vision - Monocular Depth Estimation |
| Paper or Other Resources | [Vision Transformers for Dense Prediction](https://arxiv.org/abs/2103.13413) and [GitHub Repo](https://github.com/isl-org/DPT) |
| License | Apache 2.0 |
| Questions or Comments | [Community Tab](https://huggingface.co/Intel/dpt-hybrid-midas/discussions) and [Intel Developers Discord](https://discord.gg/rv2Gp55UJQ)|
| Intended Use | Description |
| ----------- | ----------- |
| Primary intended uses | You can use the raw model for zero-shot monocular depth estimation. See the [model hub](https://huggingface.co/models?search=dpt) to look for fine-tuned versions on a task that interests you. |
| Primary intended users | Anyone doing monocular depth estimation |
| Out-of-scope uses | This model in most cases will need to be fine-tuned for your particular task. The model should not be used to intentionally create hostile or alienating environments for people.|
### How to use
Here is how to use this model for zero-shot depth estimation on an image:
```python
from PIL import Image
import numpy as np
import requests
import torch
from transformers import DPTForDepthEstimation, DPTFeatureExtractor
model = DPTForDepthEstimation.from_pretrained("Intel/dpt-hybrid-midas", low_cpu_mem_usage=True)
feature_extractor = DPTFeatureExtractor.from_pretrained("Intel/dpt-hybrid-midas")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
# prepare image for the model
inputs = feature_extractor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
predicted_depth = outputs.predicted_depth
# interpolate to original size
prediction = torch.nn.functional.interpolate(
predicted_depth.unsqueeze(1),
size=image.size[::-1],
mode="bicubic",
align_corners=False,
)
# visualize the prediction
output = prediction.squeeze().cpu().numpy()
formatted = (output * 255 / np.max(output)).astype("uint8")
depth = Image.fromarray(formatted)
depth.show()
```
For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/dpt).
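The post-processing steps in the snippet above (bicubic upsampling back to the input resolution, then rescaling into 8-bit range) can be wrapped in a small reusable helper. A minimal sketch — the helper name is our own, not part of the library:

```python
import numpy as np
import torch


def depth_to_uint8(predicted_depth: torch.Tensor, size: tuple) -> np.ndarray:
    """Upsample a (batch, H, W) depth map to `size` (height, width)
    and rescale it into displayable 8-bit range."""
    prediction = torch.nn.functional.interpolate(
        predicted_depth.unsqueeze(1),  # add a channel dimension
        size=size,
        mode="bicubic",
        align_corners=False,
    )
    output = prediction.squeeze().cpu().numpy()
    return (output * 255 / np.max(output)).astype("uint8")
```

With the snippet above, `depth_to_uint8(outputs.predicted_depth, image.size[::-1])` yields the array passed to `Image.fromarray`.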

| Factors | Description |
| ----------- | ----------- |
| Groups | Multiple datasets compiled together |
| Instrumentation | - |
| Environment | Inference completed on Intel Xeon Platinum 8280 CPU @ 2.70GHz with 8 physical cores and an NVIDIA RTX 2080 GPU. |
| Card Prompts | Model deployment on alternate hardware and software will change model performance |

| Metrics | Description |
| ----------- | ----------- |
| Model performance measures | Zero-shot Transfer |
| Decision thresholds | - |
| Approaches to uncertainty and variability | - |

| Training and Evaluation Data | Description |
| ----------- | ----------- |
| Datasets | The dataset is called MIX 6, and contains around 1.4M images. The model was initialized with ImageNet-pretrained weights.|
| Motivation | To build a robust monocular depth prediction network |
| Preprocessing | "We resize the image such that the longer side is 384 pixels and train on random square crops of size 384. ... We perform random horizontal flips for data augmentation." See [Ranftl et al. (2021)](https://arxiv.org/abs/2103.13413) for more details. |
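That augmentation recipe can be sketched with plain Pillow. This is our reading of the paper's description (resize the longer side to 384, random horizontal flip, then a random square crop whose side is the shorter resized dimension), not the authors' training code:

```python
import random
from PIL import Image


def augment(image: Image.Image, target: int = 384) -> Image.Image:
    # Resize so the longer side is `target` pixels.
    w, h = image.size
    scale = target / max(w, h)
    image = image.resize((round(w * scale), round(h * scale)), Image.BICUBIC)
    # Random horizontal flip.
    if random.random() < 0.5:
        image = image.transpose(Image.FLIP_LEFT_RIGHT)
    # Random square crop with side equal to the shorter dimension.
    w, h = image.size
    side = min(w, h)
    left = random.randint(0, w - side)
    top = random.randint(0, h - side)
    return image.crop((left, top, left + side, top + side))
```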
## Quantitative Analyses
| Model | Training set | DIW WHDR | ETH3D AbsRel | Sintel AbsRel | KITTI ฮด>1.25 | NYU ฮด>1.25 | TUM ฮด>1.25 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| DPT - Large | MIX 6 | 10.82 (-13.2%) | 0.089 (-31.2%) | 0.270 (-17.5%) | 8.46 (-64.6%) | 8.32 (-12.9%) | 9.97 (-30.3%) |
| DPT - Hybrid | MIX 6 | 11.06 (-11.2%) | 0.093 (-27.6%) | 0.274 (-16.2%) | 11.56 (-51.6%) | 8.69 (-9.0%) | 10.89 (-23.2%) |
| MiDaS | MIX 6 | 12.95 (+3.9%) | 0.116 (-10.5%) | 0.329 (+0.5%) | 16.08 (-32.7%) | 8.71 (-8.8%) | 12.51 (-12.5%) |
| MiDaS [30] | MIX 5 | 12.46 | 0.129 | 0.327 | 23.90 | 9.55 | 14.29 |
| Li [22] | MD [22] | 23.15 | 0.181 | 0.385 | 36.29 | 27.52 | 29.54 |
| Li [21] | MC [21] | 26.52 | 0.183 | 0.405 | 47.94 | 18.57 | 17.71 |
| Wang [40] | WS [40] | 19.09 | 0.205 | 0.390 | 31.92 | 29.57 | 20.18 |
| Xian [45] | RW [45] | 14.59 | 0.186 | 0.422 | 34.08 | 27.00 | 25.02 |
| Casser [5] | CS [8] | 32.80 | 0.235 | 0.422 | 21.15 | 39.58 | 37.18 |
Table 1. Comparison to the state of the art on monocular depth estimation. We evaluate zero-shot cross-dataset transfer according to the
protocol defined in [30]. Relative performance is computed with respect to the original MiDaS model [30]. Lower is better for all metrics. ([Ranftl et al., 2021](https://arxiv.org/abs/2103.13413))
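For reference, the AbsRel and δ>1.25 columns above can be computed from a predicted and a ground-truth depth map as follows. A minimal numpy sketch (note the paper reports the *percentage of pixels with δ > 1.25*, so lower is better for that column too):

```python
import numpy as np


def abs_rel(pred: np.ndarray, gt: np.ndarray) -> float:
    # Mean absolute relative error over valid (gt > 0) pixels.
    mask = gt > 0
    return float(np.mean(np.abs(pred[mask] - gt[mask]) / gt[mask]))


def delta_outlier_rate(pred: np.ndarray, gt: np.ndarray, thresh: float = 1.25) -> float:
    # Fraction of valid pixels where max(pred/gt, gt/pred) exceeds `thresh`.
    mask = gt > 0
    ratio = np.maximum(pred[mask] / gt[mask], gt[mask] / pred[mask])
    return float(np.mean(ratio > thresh))
```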

| Ethical Considerations | Description |
| ----------- | ----------- |
| Data | The training data come from multiple image datasets compiled together. |
| Human life | The model is not intended to inform decisions central to human life or flourishing. It is an aggregated set of monocular depth image datasets. |
| Mitigations | No additional risk mitigation strategies were considered during model development. |
| Risks and harms | The extent of the risks involved in using the model remains unknown. |
| Use cases | - |

| Caveats and Recommendations |
| ----------- |
| Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. There are no additional caveats or recommendations for this model. |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2103-13413,
author = {Ren{\'{e}} Ranftl and
Alexey Bochkovskiy and
Vladlen Koltun},
title = {Vision Transformers for Dense Prediction},
journal = {CoRR},
volume = {abs/2103.13413},
year = {2021},
url = {https://arxiv.org/abs/2103.13413},
eprinttype = {arXiv},
eprint = {2103.13413},
timestamp = {Wed, 07 Apr 2021 15:31:46 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2103-13413.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
---
language: pt
license: apache-2.0
datasets:
- common_voice
- mozilla-foundation/common_voice_6_0
metrics:
- wer
- cer
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
- mozilla-foundation/common_voice_6_0
- pt
- robust-speech-event
- speech
- xlsr-fine-tuning-week
model-index:
- name: XLSR Wav2Vec2 Portuguese by Jonatas Grosman
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice pt
type: common_voice
args: pt
metrics:
- name: Test WER
type: wer
value: 11.31
- name: Test CER
type: cer
value: 3.74
- name: Test WER (+LM)
type: wer
value: 9.01
- name: Test CER (+LM)
type: cer
value: 3.21
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Robust Speech Event - Dev Data
type: speech-recognition-community-v2/dev_data
args: pt
metrics:
- name: Dev WER
type: wer
value: 42.1
- name: Dev CER
type: cer
value: 17.93
- name: Dev WER (+LM)
type: wer
value: 36.92
- name: Dev CER (+LM)
type: cer
value: 16.88
---
# Fine-tuned XLSR-53 large model for speech recognition in Portuguese
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Portuguese using the train and validation splits of [Common Voice 6.1](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
This model has been fine-tuned thanks to the GPU credits generously given by the [OVHcloud](https://www.ovhcloud.com/en/public-cloud/ai-training/) :)
The script used for training can be found here: https://github.com/jonatasgrosman/wav2vec2-sprint
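Since the model expects 16 kHz input, audio recorded at other rates has to be resampled first. In practice `librosa.load(path, sr=16_000)`, as used in the inference script in this card, takes care of it; purely for illustration, linear-interpolation resampling can be sketched in plain numpy:

```python
import numpy as np


def resample_linear(signal: np.ndarray, orig_sr: int, target_sr: int = 16_000) -> np.ndarray:
    # Naive linear-interpolation resampling; fine for illustration,
    # prefer librosa/torchaudio for proper anti-aliased resampling.
    duration = len(signal) / orig_sr
    n_target = int(round(duration * target_sr))
    t_orig = np.linspace(0.0, duration, num=len(signal), endpoint=False)
    t_new = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(t_new, t_orig, signal)
```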
## Usage
The model can be used directly (without a language model) as follows...
Using the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) library:
```python
from huggingsound import SpeechRecognitionModel
model = SpeechRecognitionModel("jonatasgrosman/wav2vec2-large-xlsr-53-portuguese")
audio_paths = ["/path/to/file.mp3", "/path/to/another_file.wav"]
transcriptions = model.transcribe(audio_paths)
```
Writing your own inference script:
```python
import torch
import librosa
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
LANG_ID = "pt"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-portuguese"
SAMPLES = 10
test_dataset = load_dataset("common_voice", LANG_ID, split=f"test[:{SAMPLES}]")
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
batch["speech"] = speech_array
batch["sentence"] = batch["sentence"].upper()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
predicted_sentences = processor.batch_decode(predicted_ids)
for i, predicted_sentence in enumerate(predicted_sentences):
print("-" * 100)
print("Reference:", test_dataset[i]["sentence"])
print("Prediction:", predicted_sentence)
```
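The `processor.batch_decode(predicted_ids)` call above performs greedy CTC decoding: consecutive repeated tokens are collapsed and the blank token is dropped before mapping ids back to characters. The core of that logic, sketched standalone (a blank id of 0 matches the usual wav2vec2 vocabulary but is an assumption here):

```python
def ctc_greedy_collapse(token_ids, blank_id=0):
    # Collapse consecutive duplicates, then drop blanks --
    # a blank between two identical tokens keeps them distinct.
    out, prev = [], None
    for t in token_ids:
        if t != prev and t != blank_id:
            out.append(t)
        prev = t
    return out
```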

| Reference | Prediction |
| ------------- | ------------- |
| NEM O RADAR NEM OS OUTROS INSTRUMENTOS DETECTARAM O BOMBARDEIRO STEALTH. | NEMHUM VADAN OS OLTWES INSTRUMENTOS DE TTรรN UM BOMBERDEIRO OSTER |
| PEDIR DINHEIRO EMPRESTADO ÀS PESSOAS DA ALDEIA | E DIR ENGINHEIRO EMPRESTAR AS PESSOAS DA ALDEIA |
| OITO | OITO |
| TRANCÁ-LOS | TRANCAUVOS |
| REALIZAR UMA INVESTIGAÇÃO PARA RESOLVER O PROBLEMA | REALIZAR UMA INVESTIGAÇÃO PARA RESOLVER O PROBLEMA |
| O YOUTUBE AINDA É A MELHOR PLATAFORMA DE VÍDEOS. | YOUTUBE AINDA É A MELHOR PLATAFOMA DE VÍDEOS |
| MENINA E MENINO BEIJANDO NAS SOMBRAS | MENINA E MENINO BEIJANDO NAS SOMBRAS |
| EU SOU O SENHOR | EU SOU O SENHOR |
| DUAS MULHERES QUE SENTAM-SE PARA BAIXO LENDO JORNAIS. | DUAS MIERES QUE SENTAM-SE PARA BAICLANE JODNÁI |
| EU ORIGINALMENTE ESPERAVA | EU ORIGINALMENTE ESPERAVA |
## Evaluation
1. To evaluate on `mozilla-foundation/common_voice_6_0` with split `test`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-portuguese --dataset mozilla-foundation/common_voice_6_0 --config pt --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-portuguese --dataset speech-recognition-community-v2/dev_data --config pt --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
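The WER figures reported in this card are word-level edit distances divided by the reference length; `eval.py` relies on an evaluation library for this, but the metric itself fits in a few lines (a sketch of the metric, not the exact evaluation script):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    prev_row = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur_row = [i]
        for j, h in enumerate(hyp, 1):
            cur_row.append(min(
                prev_row[j] + 1,              # deletion
                cur_row[j - 1] + 1,           # insertion
                prev_row[j - 1] + (r != h),   # substitution
            ))
        prev_row = cur_row
    return prev_row[-1] / max(len(ref), 1)
```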
## Citation
If you want to cite this model you can use this:
```bibtex
@misc{grosman2021xlsr53-large-portuguese,
title={Fine-tuned {XLSR}-53 large model for speech recognition in {P}ortuguese},
author={Grosman, Jonatas},
howpublished={\url{https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-portuguese}},
year={2021}
}
```
-0.0311737060546875,
0.026947021484375,
-0.01549530029296875,
-0.059844970703125,
0.046966552734375,
0.01139068603515625,
0.038665771484375,
0.026702880859375,
-0.007038116455078125,
0.053863525390625,
-0.0258941650390625,
0.05511474609375,
0.01678466796875,
-0.08001708984375,
0.054290771484375,
-0.0257568359375,
0.0239715576171875,
0.02056884765625,
0.0302581787109375,
-0.03271484375,
-0.0181732177734375,
-0.059600830078125,
-0.06402587890625,
0.07415771484375,
0.0250701904296875,
-0.001708984375,
0.01139068603515625,
0.007068634033203125,
-0.0104522705078125,
0.00435638427734375,
-0.050384521484375,
-0.036712646484375,
-0.013397216796875,
-0.01543426513671875,
-0.0259857177734375,
-0.01215362548828125,
-0.00955963134765625,
-0.036224365234375,
0.0767822265625,
0.01184844970703125,
0.029205322265625,
0.02630615234375,
0.007808685302734375,
0.003894805908203125,
0.0188140869140625,
0.052032470703125,
0.017547607421875,
-0.02947998046875,
-0.004459381103515625,
0.01549530029296875,
-0.050140380859375,
0.007534027099609375,
0.0203704833984375,
-0.01027679443359375,
0.017120361328125,
0.02581787109375,
0.09100341796875,
0.015045166015625,
-0.05133056640625,
0.0306854248046875,
0.001605987548828125,
-0.02508544921875,
-0.049346923828125,
0.014739990234375,
0.01904296875,
0.0260009765625,
0.032440185546875,
0.01708984375,
0.001163482666015625,
-0.0285797119140625,
0.004302978515625,
0.0260467529296875,
-0.022735595703125,
-0.02203369140625,
0.050048828125,
0.007068634033203125,
-0.0282745361328125,
0.035919189453125,
-0.00269317626953125,
-0.03424072265625,
0.066162109375,
0.0433349609375,
0.0654296875,
-0.030914306640625,
-0.001483917236328125,
0.048736572265625,
0.01490020751953125,
-0.02178955078125,
0.036224365234375,
0.0007166862487792969,
-0.062225341796875,
-0.018890380859375,
-0.046173095703125,
-0.017669677734375,
0.022918701171875,
-0.06402587890625,
0.038543701171875,
-0.0209808349609375,
-0.0166015625,
0.021636962890625,
0.00848388671875,
-0.047088623046875,
0.026397705078125,
0.023834228515625,
0.0721435546875,
-0.0684814453125,
0.08233642578125,
0.03497314453125,
-0.031524658203125,
-0.08831787109375,
-0.0211181640625,
-0.0109405517578125,
-0.052490234375,
0.0386962890625,
0.00992584228515625,
-0.0164947509765625,
0.0036029815673828125,
-0.046966552734375,
-0.07379150390625,
0.08087158203125,
0.04388427734375,
-0.06488037109375,
-0.006839752197265625,
-0.0066375732421875,
0.04052734375,
-0.02587890625,
0.030792236328125,
0.050994873046875,
0.03729248046875,
0.0159759521484375,
-0.08880615234375,
-0.01019287109375,
-0.019287109375,
-0.022735595703125,
-0.00974273681640625,
-0.046356201171875,
0.0753173828125,
-0.0274200439453125,
-0.0029582977294921875,
0.00864410400390625,
0.051910400390625,
0.035736083984375,
0.0256500244140625,
0.03851318359375,
0.047607421875,
0.06475830078125,
-0.0110626220703125,
0.054443359375,
-0.00942230224609375,
0.037200927734375,
0.0931396484375,
-0.004360198974609375,
0.080078125,
0.02960205078125,
-0.027435302734375,
0.0379638671875,
0.040008544921875,
-0.0253448486328125,
0.04742431640625,
0.004489898681640625,
-0.0102691650390625,
-0.0061187744140625,
-0.00018095970153808594,
-0.04754638671875,
0.0653076171875,
0.019378662109375,
-0.040283203125,
0.0174560546875,
0.01605224609375,
0.01401519775390625,
-0.016082763671875,
-0.01558685302734375,
0.041839599609375,
0.003631591796875,
-0.043365478515625,
0.072021484375,
-0.003047943115234375,
0.066162109375,
-0.06103515625,
0.0150146484375,
0.01654052734375,
0.016845703125,
-0.023040771484375,
-0.04754638671875,
0.0136871337890625,
0.01290130615234375,
-0.0145263671875,
0.00257110595703125,
0.026885986328125,
-0.04095458984375,
-0.04388427734375,
0.0416259765625,
0.01392364501953125,
0.035125732421875,
-0.006961822509765625,
-0.06536865234375,
0.028106689453125,
0.025848388671875,
-0.0279998779296875,
0.0161590576171875,
0.01953125,
0.0253448486328125,
0.04296875,
0.055816650390625,
0.0246734619140625,
-0.001865386962890625,
-0.0009021759033203125,
0.054779052734375,
-0.0426025390625,
-0.050384521484375,
-0.057861328125,
0.032928466796875,
0.0018224716186523438,
-0.02825927734375,
0.055694580078125,
0.06109619140625,
0.07177734375,
-0.005096435546875,
0.0706787109375,
-0.0133514404296875,
0.058319091796875,
-0.036102294921875,
0.054290771484375,
-0.03411865234375,
0.015106201171875,
-0.031158447265625,
-0.0516357421875,
-0.00479888916015625,
0.06512451171875,
-0.0278472900390625,
0.00673675537109375,
0.032562255859375,
0.07781982421875,
-0.0035800933837890625,
-0.00800323486328125,
0.019744873046875,
0.0443115234375,
0.0130462646484375,
0.053131103515625,
0.033966064453125,
-0.0516357421875,
0.062225341796875,
-0.032562255859375,
-0.006412506103515625,
-0.0110015869140625,
-0.042388916015625,
-0.058990478515625,
-0.055816650390625,
-0.036041259765625,
-0.047393798828125,
-0.0030689239501953125,
0.08795166015625,
0.055023193359375,
-0.0692138671875,
-0.0301055908203125,
0.0123443603515625,
-0.005764007568359375,
-0.021240234375,
-0.0161590576171875,
0.04107666015625,
0.0214691162109375,
-0.0775146484375,
0.03704833984375,
-0.00978851318359375,
0.019073486328125,
-0.01021575927734375,
-0.0199127197265625,
-0.0196990966796875,
0.007312774658203125,
0.0206451416015625,
0.033477783203125,
-0.059112548828125,
-0.01203155517578125,
-0.0033435821533203125,
-0.01488494873046875,
0.00949859619140625,
0.018157958984375,
-0.0665283203125,
0.0193328857421875,
0.045318603515625,
0.00905609130859375,
0.049285888671875,
-0.0183563232421875,
0.01351165771484375,
-0.037200927734375,
0.039459228515625,
0.023651123046875,
0.047760009765625,
0.02496337890625,
-0.0107269287109375,
0.0305023193359375,
0.018951416015625,
-0.04248046875,
-0.0760498046875,
-0.01189422607421875,
-0.10858154296875,
-0.00949859619140625,
0.098876953125,
-0.014251708984375,
-0.0177154541015625,
0.01068115234375,
-0.03955078125,
0.050384521484375,
-0.0521240234375,
0.03680419921875,
0.035064697265625,
-0.004413604736328125,
0.0027790069580078125,
-0.035125732421875,
0.02655029296875,
0.0226898193359375,
-0.0379638671875,
-0.012603759765625,
0.033447265625,
0.0433349609375,
0.0177459716796875,
0.045684814453125,
0.0022335052490234375,
0.0224151611328125,
0.00464630126953125,
0.0180206298828125,
-0.0109405517578125,
-0.01110076904296875,
-0.04486083984375,
0.00730133056640625,
-0.0253448486328125,
-0.0426025390625
]
] |
distilgpt2 | 2023-04-29T12:24:21.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"tflite",
"rust",
"coreml",
"safetensors",
"gpt2",
"text-generation",
"exbert",
"en",
"dataset:openwebtext",
"arxiv:1910.01108",
"arxiv:2201.08542",
"arxiv:2203.12574",
"arxiv:1910.09700",
"arxiv:1503.02531",
"license:apache-2.0",
"model-index",
"co2_eq_emissions",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | null | null | null | distilgpt2 | 267 | 1,992,862 | transformers | 2022-03-02T23:29:04 | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- openwebtext
model-index:
- name: distilgpt2
results:
- task:
type: text-generation
name: Text Generation
dataset:
type: wikitext
name: WikiText-103
metrics:
- type: perplexity
name: Perplexity
value: 21.1
co2_eq_emissions: 149200
---
# DistilGPT2
DistilGPT2 (short for Distilled-GPT2) is an English-language model pre-trained with the supervision of the smallest version of Generative Pre-trained Transformer 2 (GPT-2). Like GPT-2, DistilGPT2 can be used to generate text. Users of this model should also review the information about the design, training, and limitations of [GPT-2](https://huggingface.co/gpt2).
## Model Details
- **Developed by:** Hugging Face
- **Model type:** Transformer-based Language Model
- **Language:** English
- **License:** Apache 2.0
- **Model Description:** DistilGPT2 is an English-language model pre-trained with the supervision of the 124 million parameter version of GPT-2. DistilGPT2, which has 82 million parameters, was developed using [knowledge distillation](#knowledge-distillation) and was designed to be a faster, lighter version of GPT-2.
- **Resources for more information:** See [this repository](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) for more about Distil\* (a class of compressed models including Distilled-GPT2), [Sanh et al. (2019)](https://arxiv.org/abs/1910.01108) for more information about knowledge distillation and the training procedure, and this page for more about [GPT-2](https://openai.com/blog/better-language-models/).
## Uses, Limitations and Risks
#### Limitations and Risks
<details>
<summary>Click to expand</summary>
**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
As the developers of GPT-2 (OpenAI) note in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md), โlanguage models like GPT-2 reflect the biases inherent to the systems they were trained on.โ Significant research has explored bias and fairness issues with models for language generation including GPT-2 (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
DistilGPT2 also exhibits persistent bias issues, as the illustrative examples below highlight. Note that these examples are not a comprehensive stress test of the model; readers considering using it should carry out more rigorous evaluations appropriate to their use case and context.
The impact of model compression techniques โ such as knowledge distillation โ on bias and fairness issues associated with language models is an active area of research. For example:
- [Silva, Tambwekar and Gombolay (2021)](https://aclanthology.org/2021.naacl-main.189.pdf) find that distilled versions of BERT and RoBERTa consistently exhibit statistically significant bias (with regard to gender and race) with effect sizes larger than the teacher models.
- [Xu and Hu (2022)](https://arxiv.org/pdf/2201.08542.pdf) find that distilled versions of GPT-2 showed consistent reductions in toxicity and bias compared to the teacher model (see the paper for more detail on metrics used to define/measure toxicity and bias).
- [Gupta et al. (2022)](https://arxiv.org/pdf/2203.12574.pdf) find that DistilGPT2 exhibits greater gender disparities than GPT-2 and propose a technique for mitigating gender bias in distilled language models like DistilGPT2.
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='distilgpt2')
>>> set_seed(48)
>>> generator("The White man worked as a", max_length=20, num_return_sequences=3)
[{'generated_text': "The White man worked as a salesman at a McDonald's restaurant called Kia at the time of the"},
{'generated_text': 'The White man worked as a contractor in the Army in the late 1990s. He became a "'},
{'generated_text': 'The White man worked as a police spokesman to the US Navy in the 1930s.'}]
>>> set_seed(48)
>>> generator("The Black man worked as a", max_length=20, num_return_sequences=3)
[{'generated_text': 'The Black man worked as a shop assistant for an hour at Wal-Mart at Wal-Mart in'},
{'generated_text': 'The Black man worked as a waiter in the hotel when he was assaulted when he got out of a'},
{'generated_text': 'The Black man worked as a police spokesman four months ago...'}]
```
</details>
#### Potential Uses
Since DistilGPT2 is a distilled version of GPT-2, it is intended for similar use cases, with the added benefit of being smaller and easier to run than the base model.
The developers of GPT-2 state in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md) that they envisioned GPT-2 would be used by researchers to better understand large-scale generative language models, with possible secondary use cases including:
> - *Writing assistance: Grammar assistance, autocompletion (for normal prose or code)*
> - *Creative writing and art: exploring the generation of creative, fictional texts; aiding creation of poetry and other literary art.*
> - *Entertainment: Creation of games, chat bots, and amusing generations.*
Using DistilGPT2, the Hugging Face team built the [Write With Transformers](https://transformer.huggingface.co/doc/distil-gpt2) web app, which allows users to play with the model to generate text directly from their browser.
#### Out-of-scope Uses
OpenAI states in the GPT-2 [model card](https://github.com/openai/gpt-2/blob/master/model_card.md):
> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we donโt support use-cases that require the generated text to be true.
>
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we do not recommend that they be deployed into systems that interact with humans unless the deployers first carry out a study of biases relevant to the intended use-case.
### How to Get Started with the Model
<details>
<summary>Click to expand</summary>
*Be sure to read the sections on in-scope and out-of-scope uses and limitations of the model for further information on how to use the model.*
Using DistilGPT2 is similar to using GPT-2. DistilGPT2 can be used directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='distilgpt2')
>>> set_seed(42)
>>> generator("Hello, Iโm a language model", max_length=20, num_return_sequences=5)
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
[{'generated_text': "Hello, I'm a language model, I'm a language model. In my previous post I've"},
{'generated_text': "Hello, I'm a language model, and I'd love to hear what you think about it."},
{'generated_text': "Hello, I'm a language model, but I don't get much of a connection anymore, so"},
{'generated_text': "Hello, I'm a language model, a functional language... It's not an example, and that"},
{'generated_text': "Hello, I'm a language model, not an object model.\n\nIn a nutshell, I"}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import GPT2Tokenizer, GPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('distilgpt2')
model = GPT2Model.from_pretrained('distilgpt2')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
And in TensorFlow:
```python
from transformers import GPT2Tokenizer, TFGPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('distilgpt2')
model = TFGPT2Model.from_pretrained('distilgpt2')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
</details>
## Training Data
DistilGPT2 was trained using [OpenWebTextCorpus](https://skylion007.github.io/OpenWebTextCorpus/), an open-source reproduction of OpenAIโs WebText dataset, which was used to train GPT-2. See the [OpenWebTextCorpus Dataset Card](https://huggingface.co/datasets/openwebtext) for additional information about OpenWebTextCorpus and [Radford et al. (2019)](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf) for additional information about WebText.
## Training Procedure
The texts were tokenized using the same tokenizer as GPT-2, a byte-level version of Byte Pair Encoding (BPE). DistilGPT2 was trained using knowledge distillation, following a procedure similar to the training procedure for DistilBERT, described in more detail in [Sanh et al. (2019)](https://arxiv.org/abs/1910.01108).
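The distillation objective combines the usual language-modeling loss with a "soft-target" term that pushes the student's output distribution toward the teacher's. A minimal sketch of that combined loss, using toy logits rather than real GPT-2 outputs (the `temperature` and `alpha` values are illustrative assumptions, and the actual procedure in Sanh et al. (2019) additionally uses a cosine embedding loss):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Weighted sum of the soft-target KL term and the hard-label
    cross-entropy. Hyperparameters here are illustrative, not the
    values used to train DistilGPT2."""
    # Soft targets: KL divergence between temperature-softened
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft_kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: standard next-token cross-entropy against the labels.
    hard_ce = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)), labels.view(-1)
    )
    return alpha * soft_kl + (1 - alpha) * hard_ce

# Toy example: batch of 1, sequence of 4 tokens, vocabulary of 10.
student = torch.randn(1, 4, 10)
teacher = torch.randn(1, 4, 10)
labels = torch.randint(0, 10, (1, 4))
loss = distillation_loss(student, teacher, labels)
```

In training, `teacher_logits` would come from a frozen GPT-2 forward pass and `student_logits` from DistilGPT2, with only the student's parameters updated.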
## Evaluation Results
The creators of DistilGPT2 [report](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) that, on the [WikiText-103](https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/) benchmark, GPT-2 reaches a perplexity on the test set of 16.3 compared to 21.1 for DistilGPT2 (after fine-tuning on the train set).
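Perplexity is the exponential of the mean next-token cross-entropy, with logits shifted so each position predicts the following token. A self-contained sketch of that computation using toy logits (the actual benchmark numbers above come from the full WikiText-103 evaluation pipeline, not this snippet):

```python
import math
import torch
import torch.nn.functional as F

def perplexity_from_logits(logits, labels):
    """exp(mean cross-entropy), with the shift used to score causal LMs:
    position t's logits are compared against token t+1."""
    shift_logits = logits[:, :-1, :]
    shift_labels = labels[:, 1:]
    ce = F.cross_entropy(
        shift_logits.reshape(-1, shift_logits.size(-1)),
        shift_labels.reshape(-1),
    )
    return math.exp(ce.item())

# Sanity check: uniform logits over a vocabulary of size V give perplexity V.
V = 50
logits = torch.zeros(1, 8, V)        # uniform distribution at every step
labels = torch.randint(0, V, (1, 8))
ppl = perplexity_from_logits(logits, labels)  # ≈ 50.0
```

With a real model, the same quantity is available directly: passing `labels=input_ids` to `GPT2LMHeadModel` returns this shifted cross-entropy as `output.loss`.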
## Environmental Impact
*Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.*
- **Hardware Type:** 8 16GB V100
- **Hours used:** 168 (1 week)
- **Cloud Provider:** Azure
- **Compute Region:** unavailable, assumed East US for calculations
- **Carbon Emitted** *(Power consumption x Time x Carbon produced based on location of power grid)*: 149.2 kg CO2 eq.
## Citation
```bibtex
@inproceedings{sanh2019distilbert,
title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
author={Sanh, Victor and Debut, Lysandre and Chaumond, Julien and Wolf, Thomas},
booktitle={NeurIPS EMC^2 Workshop},
year={2019}
}
```
## Glossary
- <a name="knowledge-distillation">**Knowledge Distillation**</a>: As described in [Sanh et al. (2019)](https://arxiv.org/pdf/1910.01108.pdf), โknowledge distillation is a compression technique in which a compact model โ the student โ is trained to reproduce the behavior of a larger model โ the teacher โ or an ensemble of models.โ Also see [Bucila et al. (2006)](https://www.cs.cornell.edu/~caruana/compression.kdd06.pdf) and [Hinton et al. (2015)](https://arxiv.org/abs/1503.02531).
<a href="https://huggingface.co/exbert/?model=distilgpt2">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 10,966 | [
[
0.02203369140625,
0.0204010009765625,
-0.027069091796875,
0.052093505859375,
0.029327392578125,
-0.03857421875,
-0.059173583984375,
0.0189361572265625,
-0.051605224609375,
-0.023040771484375,
0.11431884765625,
-0.01470184326171875,
0.006336212158203125,
-0.0069122314453125,
-0.032958984375,
0.048095703125,
-0.0263671875,
0.056793212890625,
0.054840087890625,
0.0222320556640625,
-0.0009522438049316406,
-0.064697265625,
0.053070068359375,
0.0170745849609375,
-0.05157470703125,
0.01227569580078125,
0.0201416015625,
0.045501708984375,
-0.0007710456848144531,
0.05108642578125,
-0.025787353515625,
-0.0014505386352539062,
0.00653839111328125,
0.01181793212890625,
-0.021331787109375,
-0.00644683837890625,
-0.0181121826171875,
-0.0094451904296875,
0.005977630615234375,
-0.0025653839111328125
]
] |
nateraw/vit-age-classifier | 2023-09-19T15:53:10.000Z | [
"transformers",
"pytorch",
"vit",
"image-classification",
"dataset:fairface",
"doi:10.57967/hf/1259",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | nateraw | null | null | nateraw/vit-age-classifier | 58 | 1,954,882 | transformers | 2022-03-02T23:29:05 | ---
tags:
- image-classification
- pytorch
datasets:
- fairface
---
A vision transformer (ViT) fine-tuned to classify the age of a given person's face.
```python
import requests
from PIL import Image
from io import BytesIO
from transformers import ViTFeatureExtractor, ViTForImageClassification
# Get example image from official fairface repo + read it in as an image
r = requests.get('https://github.com/dchen236/FairFace/blob/master/detected_faces/race_Asian_face0.jpg?raw=true')
im = Image.open(BytesIO(r.content))
# Init model, transforms
model = ViTForImageClassification.from_pretrained('nateraw/vit-age-classifier')
transforms = ViTFeatureExtractor.from_pretrained('nateraw/vit-age-classifier')
# Transform our image and pass it through the model
inputs = transforms(im, return_tensors='pt')
output = model(**inputs)
# Predicted Class probabilities
proba = output.logits.softmax(1)
# Predicted Classes
preds = proba.argmax(1)
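# Map the highest-probability index to its human-readable age-range label.
# (Illustrative addition: this assumes the checkpoint's config carries an
#  id2label mapping with the FairFace age buckets, and that preds holds a
#  single prediction.)
print(model.config.id2label[preds.item()])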
``` | 943 | [
[
-0.02899169921875,
-0.0242767333984375,
0.0189666748046875,
0.01367950439453125,
-0.00441741943359375,
-0.02325439453125,
0.0138397216796875,
-0.0310821533203125,
-0.020965576171875,
0.0307464599609375,
-0.043670654296875,
-0.01123809814453125,
-0.0389404296875,
-0.0005421638488769531,
-0.0209197998046875,
0.06500244140625,
0.006793975830078125,
0.00879669189453125,
-0.0023593902587890625,
-0.011627197265625,
-0.0298614501953125,
0.01141357421875,
-0.05279541015625,
-0.034149169921875,
0.00664520263671875,
0.03192138671875,
0.053131103515625,
0.0205078125,
0.0301971435546875,
0.041107177734375,
-0.01428985595703125,
-0.011566162109375,
-0.0199737548828125,
-0.01136016845703125,
-0.0195465087890625,
-0.0677490234375,
-0.0215606689453125,
0.019775390625,
0.02886962890625,
0.0465087890625,
-0.0013723373413085938,
0.0272674560546875,
-0.0040130615234375,
0.054534912109375,
-0.041961669921875,
0.023345947265625,
-0.0328369140625,
0.0220489501953125,
-0.027587890625,
-0.006641387939453125,
-0.03515625,
-0.022491455078125,
0.021728515625,
-0.0443115234375,
0.037811279296875,
0.004878997802734375,
0.08026123046875,
0.020477294921875,
-0.00948333740234375,
0.0256195068359375,
-0.03997802734375,
0.054718017578125,
-0.025543212890625,
0.0260772705078125,
0.0006155967712402344,
0.0704345703125,
0.004848480224609375,
-0.07806396484375,
-0.05938720703125,
0.0022373199462890625,
0.0020465850830078125,
-0.009246826171875,
-0.031280517578125,
0.0024166107177734375,
0.036834716796875,
0.0416259765625,
-0.037628173828125,
-0.0130767822265625,
-0.0693359375,
-0.02716064453125,
0.04156494140625,
0.0107879638671875,
0.027008056640625,
-0.0081329345703125,
-0.045318603515625,
-0.0303497314453125,
-0.0020313262939453125,
0.03887939453125,
0.014251708984375,
0.00928497314453125,
-0.0222320556640625,
0.049041748046875,
-0.04754638671875,
0.05242919921875,
0.051513671875,
-0.0271759033203125,
0.06549072265625,
0.0153045654296875,
-0.0307159423828125,
-0.00794219970703125,
0.037017822265625,
0.053863525390625,
0.056640625,
0.01244354248046875,
-0.02545166015625,
0.01454925537109375,
0.007476806640625,
-0.08331298828125,
-0.027587890625,
-0.00173187255859375,
-0.04833984375,
-0.034454345703125,
0.022308349609375,
-0.0653076171875,
-0.01528167724609375,
-0.013336181640625,
0.066650390625,
-0.024566650390625,
-0.01404571533203125,
0.00011843442916870117,
-0.0240631103515625,
0.028106689453125,
0.03662109375,
-0.04583740234375,
0.018707275390625,
0.00164031982421875,
0.0467529296875,
0.0018663406372070312,
-0.01221466064453125,
-0.0350341796875,
-0.026275634765625,
0.0013713836669921875,
0.042144775390625,
-0.010986328125,
-0.0011281967163085938,
-0.004421234130859375,
0.046966552734375,
-0.01166534423828125,
-0.05841064453125,
0.032867431640625,
-0.0399169921875,
0.0352783203125,
0.032135009765625,
-0.0004162788391113281,
-0.0294647216796875,
0.0024280548095703125,
-0.042236328125,
0.050048828125,
0.044921875,
-0.051788330078125,
0.027618408203125,
-0.0277862548828125,
-0.0173187255859375,
0.038299560546875,
-0.01277923583984375,
-0.06201171875,
0.0171966552734375,
0.01470947265625,
0.052093505859375,
0.0097198486328125,
0.035736083984375,
-0.0251617431640625,
-0.03021240234375,
0.02325439453125,
-0.052490234375,
0.08160400390625,
0.03521728515625,
-0.042938232421875,
0.004436492919921875,
-0.056182861328125,
-0.005092620849609375,
0.0282135009765625,
-0.011444091796875,
0.0115203857421875,
-0.0285797119140625,
0.03265380859375,
0.0211334228515625,
0.017059326171875,
-0.0601806640625,
0.0031948089599609375,
-0.02362060546875,
0.0268096923828125,
0.051666259765625,
-0.0220489501953125,
0.04058837890625,
-0.0419921875,
0.01313018798828125,
0.030853271484375,
0.039337158203125,
0.0124359130859375,
-0.04656982421875,
-0.07220458984375,
0.0011777877807617188,
-0.01154327392578125,
0.045562744140625,
-0.0748291015625,
0.0113525390625,
-0.030853271484375,
-0.043670654296875,
-0.01312255859375,
-0.023223876953125,
0.005260467529296875,
0.0400390625,
0.0323486328125,
-0.03839111328125,
-0.03729248046875,
-0.07611083984375,
-0.005023956298828125,
-0.013671875,
-0.00835418701171875,
0.0029582977294921875,
0.0310821533203125,
-0.03826904296875,
0.059295654296875,
-0.04248046875,
-0.04150390625,
0.005268096923828125,
-0.0043792724609375,
0.042572021484375,
0.059722900390625,
0.054656982421875,
-0.08453369140625,
-0.026123046875,
-0.01554107666015625,
-0.038330078125,
-0.0051116943359375,
0.004962921142578125,
-0.00783538818359375,
0.0019130706787109375,
0.0244293212890625,
-0.026336669921875,
0.06640625,
0.01476287841796875,
-0.044677734375,
0.045562744140625,
0.0020542144775390625,
0.0098724365234375,
-0.0408935546875,
0.0110931396484375,
0.0179443359375,
-0.03485107421875,
-0.040191650390625,
-0.007843017578125,
0.00884246826171875,
-0.0223846435546875,
-0.0633544921875,
0.051025390625,
-0.0012645721435546875,
0.01372528076171875,
-0.005138397216796875,
-0.04400634765625,
-0.00991058349609375,
0.034149169921875,
0.0007777214050292969,
0.043243408203125,
0.03582763671875,
-0.04486083984375,
0.041015625,
0.0225982666015625,
-0.02996826171875,
0.059722900390625,
-0.03564453125,
-0.01555633544921875,
-0.0223388671875,
0.0026340484619140625,
-0.0662841796875,
-0.034912109375,
0.0274200439453125,
-0.041839599609375,
0.0283660888671875,
-0.0201568603515625,
0.005069732666015625,
-0.042633056640625,
-0.019439697265625,
0.04638671875,
0.04986572265625,
-0.0548095703125,
0.0650634765625,
0.01337432861328125,
0.0010223388671875,
-0.0419921875,
-0.079833984375,
-0.0343017578125,
-0.0170440673828125,
-0.06353759765625,
0.053131103515625,
-0.0004363059997558594,
-0.01371002197265625,
0.006595611572265625,
-0.012908935546875,
-0.0218963623046875,
-0.006687164306640625,
0.029693603515625,
0.031463623046875,
-0.017608642578125,
-0.016998291015625,
-0.00995635986328125,
-0.0235137939453125,
0.0106964111328125,
-0.007602691650390625,
0.045501708984375,
-0.04827880859375,
-0.0006823539733886719,
-0.06500244140625,
-0.01165008544921875,
0.048828125,
-0.004505157470703125,
0.038604736328125,
0.06634521484375,
-0.04058837890625,
-0.018280029296875,
-0.044769287109375,
0.0017337799072265625,
-0.04327392578125,
0.032440185546875,
-0.0408935546875,
-0.03375244140625,
0.048492431640625,
0.002239227294921875,
-0.01123046875,
0.07440185546875,
0.034088134765625,
0.0017728805541992188,
0.0706787109375,
0.0430908203125,
0.032135009765625,
0.059783935546875,
-0.040557861328125,
-0.0169677734375,
-0.04425048828125,
-0.04400634765625,
-0.00939178466796875,
-0.0355224609375,
-0.05255126953125,
-0.0157928466796875,
0.01163482666015625,
-0.0028591156005859375,
-0.0531005859375,
0.0213165283203125,
-0.06365966796875,
0.0244293212890625,
0.05072021484375,
0.047576904296875,
0.003826141357421875,
0.0006155967712402344,
0.01081085205078125,
-0.008331298828125,
-0.02069091796875,
-0.01751708984375,
0.067138671875,
0.032958984375,
0.07171630859375,
-0.0073089599609375,
0.04937744140625,
0.006732940673828125,
0.04071044921875,
-0.0509033203125,
0.040863037109375,
-0.0443115234375,
-0.06005859375,
-0.00775909423828125,
-0.034149169921875,
-0.06414794921875,
0.00970458984375,
-0.0211334228515625,
-0.052398681640625,
0.046905517578125,
0.00511932373046875,
-0.0364990234375,
0.0258026123046875,
-0.05230712890625,
0.07525634765625,
-0.0163421630859375,
-0.0219573974609375,
0.0059356689453125,
-0.062286376953125,
0.034912109375,
0.01097869873046875,
0.00128173828125,
-0.0273284912109375,
0.042877197265625,
0.08074951171875,
-0.022247314453125,
0.0570068359375,
-0.05584716796875,
0.041412353515625,
0.0222320556640625,
-0.00890350341796875,
-0.0129852294921875,
0.0066680908203125,
0.0172271728515625,
0.009979248046875,
0.01531982421875,
-0.0193939208984375,
-0.0228271484375,
0.038421630859375,
-0.0699462890625,
-0.0396728515625,
-0.0509033203125,
-0.027435302734375,
0.015228271484375,
0.023406982421875,
0.040130615234375,
0.043365478515625,
0.00981903076171875,
0.020538330078125,
0.0305633544921875,
-0.021240234375,
0.0298919677734375,
-0.004070281982421875,
-0.037200927734375,
-0.022125244140625,
0.059112548828125,
0.0101776123046875,
0.0251617431640625,
0.00920867919921875,
0.025970458984375,
-0.010589599609375,
-0.01172637939453125,
-0.020721435546875,
-0.0018310546875,
-0.06451416015625,
-0.036895751953125,
-0.0142059326171875,
-0.053070068359375,
-0.035736083984375,
-0.00217437744140625,
-0.013427734375,
-0.026824951171875,
-0.02337646484375,
0.0305328369140625,
0.0139923095703125,
0.0169677734375,
-0.0102386474609375,
0.045654296875,
-0.03277587890625,
0.035003662109375,
0.056121826171875,
0.0264434814453125,
-0.015960693359375,
-0.0557861328125,
0.02313232421875,
-0.006488800048828125,
-0.042938232421875,
-0.0576171875,
0.042572021484375,
0.0283660888671875,
0.033111572265625,
0.041839599609375,
-0.001010894775390625,
0.045318603515625,
-0.035369873046875,
0.03179931640625,
0.03704833984375,
-0.068603515625,
0.037109375,
0.007965087890625,
0.0205078125,
0.045318603515625,
0.002994537353515625,
-0.038726806640625,
0.0024509429931640625,
-0.058258056640625,
-0.03436279296875,
0.06597900390625,
0.00679779052734375,
0.009063720703125,
0.02606201171875,
0.0217437744140625,
0.0186767578125,
-0.012725830078125,
-0.0762939453125,
-0.025970458984375,
-0.03271484375,
-0.03997802734375,
0.01276397705078125,
0.0081787109375,
0.006946563720703125,
-0.048736572265625,
0.038848876953125,
-0.0208587646484375,
0.04571533203125,
0.0188140869140625,
-0.0247955322265625,
-0.0175933837890625,
-0.004283905029296875,
0.0288848876953125,
0.038604736328125,
-0.031494140625,
0.02923583984375,
0.01258087158203125,
-0.047760009765625,
0.00677490234375,
0.0338134765625,
-0.006381988525390625,
-0.006641387939453125,
0.0157470703125,
0.05267333984375,
-0.0135955810546875,
-0.0178985595703125,
0.07763671875,
-0.024871826171875,
-0.0175628662109375,
-0.040130615234375,
0.0231475830078125,
-0.024688720703125,
0.051300048828125,
0.0548095703125,
0.06591796875,
0.01528167724609375,
-0.028533935546875,
0.0286102294921875,
0.004314422607421875,
-0.035186767578125,
-0.01515960693359375,
0.0660400390625,
0.0125274658203125,
-0.0350341796875,
0.06256103515625,
-0.01389312744140625,
-0.060821533203125,
0.057098388671875,
0.0195465087890625,
0.060394287109375,
-0.029571533203125,
0.00983428955078125,
0.034271240234375,
0.01123809814453125,
-0.000507354736328125,
0.01427459716796875,
0.01195526123046875,
-0.066162109375,
-0.025115966796875,
-0.05926513671875,
0.00807952880859375,
0.0109405517578125,
-0.052947998046875,
0.037109375,
-0.031646728515625,
-0.0279541015625,
0.0310821533203125,
0.0045623779296875,
-0.1044921875,
0.045440673828125,
0.03564453125,
0.0546875,
-0.08087158203125,
0.04217529296875,
0.03582763671875,
-0.0299530029296875,
-0.0411376953125,
-0.0303192138671875,
-0.0060577392578125,
-0.0904541015625,
0.07421875,
0.064453125,
0.007110595703125,
0.00603485107421875,
-0.06829833984375,
-0.041229248046875,
0.0791015625,
0.0223541259765625,
-0.033935546875,
-0.00984954833984375,
-0.002178192138671875,
0.0079498291015625,
-0.0134429931640625,
0.0160980224609375,
0.027130126953125,
0.0350341796875,
0.005008697509765625,
-0.067626953125,
-0.006359100341796875,
-0.02691650390625,
0.0135650634765625,
0.003330230712890625,
-0.0279541015625,
0.0897216796875,
-0.027801513671875,
-0.032958984375,
0.01143646240234375,
0.054718017578125,
0.00974273681640625,
0.035980224609375,
0.076416015625,
0.05126953125,
0.0223236083984375,
-0.025787353515625,
0.056610107421875,
0.0188140869140625,
0.047821044921875,
0.02667236328125,
0.033172607421875,
0.037750244140625,
0.03033447265625,
-0.031036376953125,
0.039337158203125,
0.049896240234375,
-0.0286102294921875,
0.055084228515625,
0.01102447509765625,
-0.0010137557983398438,
-0.0162506103515625,
0.0193328857421875,
-0.028167724609375,
0.045501708984375,
0.0178985595703125,
-0.0215911865234375,
-0.01148223876953125,
0.03564453125,
-0.0221099853515625,
-0.01849365234375,
-0.039031982421875,
0.026336669921875,
0.001934051513671875,
-0.0285491943359375,
0.0380859375,
0.006099700927734375,
0.0732421875,
-0.0065460205078125,
-0.00959014892578125,
0.0016126632690429688,
0.06439208984375,
-0.0275726318359375,
-0.081298828125,
0.0211334228515625,
-0.0294647216796875,
-0.00992584228515625,
0.01128387451171875,
0.055145263671875,
-0.037933349609375,
-0.041839599609375,
0.001033782958984375,
0.01328277587890625,
0.01149749755859375,
-0.005344390869140625,
-0.05401611328125,
-0.0159454345703125,
-0.0011005401611328125,
-0.0260162353515625,
-0.005550384521484375,
0.02313232421875,
-0.005977630615234375,
0.048126220703125,
0.029388427734375,
-0.0186920166015625,
0.01444244384765625,
-0.00458526611328125,
0.05181884765625,
-0.0302886962890625,
-0.0251312255859375,
-0.047271728515625,
0.044708251953125,
-0.005702972412109375,
-0.05364990234375,
0.01512908935546875,
0.029388427734375,
0.08319091796875,
-0.015655517578125,
0.01456451416015625,
-0.0175628662109375,
0.00875091552734375,
-0.027008056640625,
0.05877685546875,
-0.046173095703125,
-0.0298309326171875,
-0.00959014892578125,
-0.083984375,
-0.0272216796875,
0.0823974609375,
-0.0172576904296875,
0.0188751220703125,
0.036834716796875,
0.0439453125,
-0.017425537109375,
-0.005008697509765625,
0.0150299072265625,
-0.00946044921875,
0.020355224609375,
0.05340576171875,
0.056488037109375,
-0.044097900390625,
0.0239105224609375,
-0.0567626953125,
-0.035888671875,
-0.0259246826171875,
-0.025390625,
-0.054351806640625,
-0.04486083984375,
-0.0178375244140625,
-0.041748046875,
-0.0234527587890625,
0.04461669921875,
0.0711669921875,
-0.062286376953125,
0.00444793701171875,
0.003055572509765625,
-0.0209503173828125,
-0.01226806640625,
-0.0263824462890625,
0.0256195068359375,
-0.0080718994140625,
-0.0692138671875,
-0.0170440673828125,
-0.00878143310546875,
0.032012939453125,
-0.018218994140625,
0.0017147064208984375,
0.006687164306640625,
-0.040802001953125,
0.021728515625,
0.01506805419921875,
-0.049530029296875,
-0.035125732421875,
-0.00560760498046875,
-0.0080108642578125,
0.0233001708984375,
0.01174163818359375,
-0.06817626953125,
0.04345703125,
0.0335693359375,
0.0299835205078125,
0.06304931640625,
-0.01303863525390625,
0.01203155517578125,
-0.033843994140625,
0.0286865234375,
0.0290679931640625,
0.0457763671875,
0.029083251953125,
-0.0211181640625,
0.034759521484375,
0.03961181640625,
-0.03826904296875,
-0.049774169921875,
-0.006084442138671875,
-0.08514404296875,
-0.00897216796875,
0.05224609375,
-0.0004744529724121094,
-0.049285888671875,
0.004302978515625,
-0.0120849609375,
0.035491943359375,
-0.00794219970703125,
0.029571533203125,
0.005329132080078125,
0.004150390625,
-0.01995849609375,
-0.0265655517578125,
0.0205078125,
0.01605224609375,
-0.05230712890625,
-0.04644775390625,
0.006744384765625,
0.05010986328125,
0.0472412109375,
0.0211334228515625,
-0.0197906494140625,
0.042205810546875,
0.01125335693359375,
0.0217437744140625,
-0.0098876953125,
-0.023956298828125,
-0.0274810791015625,
0.01511383056640625,
-0.018341064453125,
-0.0399169921875
]
] |
jonatasgrosman/wav2vec2-large-xlsr-53-russian | 2022-12-14T01:58:43.000Z | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_6_0",
"robust-speech-event",
"ru",
"speech",
"xlsr-fine-tuning-week",
"dataset:common_voice",
"dataset:mozilla-foundation/common_voice_6_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | jonatasgrosman | null | null | jonatasgrosman/wav2vec2-large-xlsr-53-russian | 28 | 1,952,331 | transformers | 2022-03-02T23:29:05 | ---
language: ru
license: apache-2.0
datasets:
- common_voice
- mozilla-foundation/common_voice_6_0
metrics:
- wer
- cer
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
- mozilla-foundation/common_voice_6_0
- robust-speech-event
- ru
- speech
- xlsr-fine-tuning-week
model-index:
- name: XLSR Wav2Vec2 Russian by Jonatas Grosman
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice ru
type: common_voice
args: ru
metrics:
- name: Test WER
type: wer
value: 13.3
- name: Test CER
type: cer
value: 2.88
- name: Test WER (+LM)
type: wer
value: 9.57
- name: Test CER (+LM)
type: cer
value: 2.24
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Robust Speech Event - Dev Data
type: speech-recognition-community-v2/dev_data
args: ru
metrics:
- name: Dev WER
type: wer
value: 40.22
- name: Dev CER
type: cer
value: 14.8
- name: Dev WER (+LM)
type: wer
value: 33.61
- name: Dev CER (+LM)
type: cer
value: 13.5
---
# Fine-tuned XLSR-53 large model for speech recognition in Russian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Russian using the train and validation splits of [Common Voice 6.1](https://huggingface.co/datasets/common_voice) and [CSS10](https://github.com/Kyubyong/css10).
When using this model, make sure that your speech input is sampled at 16 kHz.
This model has been fine-tuned thanks to the GPU credits generously given by the [OVHcloud](https://www.ovhcloud.com/en/public-cloud/ai-training/) :)
The script used for training can be found here: https://github.com/jonatasgrosman/wav2vec2-sprint
## Usage
The model can be used directly (without a language model) as follows...
Using the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) library:
```python
from huggingsound import SpeechRecognitionModel
model = SpeechRecognitionModel("jonatasgrosman/wav2vec2-large-xlsr-53-russian")
audio_paths = ["/path/to/file.mp3", "/path/to/another_file.wav"]
transcriptions = model.transcribe(audio_paths)
```
Writing your own inference script:
```python
import torch
import librosa
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
LANG_ID = "ru"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-russian"
SAMPLES = 5
test_dataset = load_dataset("common_voice", LANG_ID, split=f"test[:{SAMPLES}]")
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
batch["speech"] = speech_array
batch["sentence"] = batch["sentence"].upper()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
predicted_sentences = processor.batch_decode(predicted_ids)
for i, predicted_sentence in enumerate(predicted_sentences):
print("-" * 100)
print("Reference:", test_dataset[i]["sentence"])
print("Prediction:", predicted_sentence)
```
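The `batch_decode` call above performs greedy CTC decoding: consecutive repeated token ids are collapsed and blank tokens dropped before ids are mapped back to characters. A minimal pure-Python sketch of that collapsing rule (the blank id and toy vocabulary below are illustrative, not the model's actual tokenizer):

```python
def ctc_greedy_collapse(ids, blank_id=0):
    """Collapse consecutive repeats, then drop CTC blank tokens."""
    out, prev = [], None
    for i in ids:
        if i != prev and i != blank_id:
            out.append(i)
        prev = i
    return out

# Toy vocabulary: 0 = blank, 1 = "Д", 2 = "А".
# The blank between the two 2s is what lets a doubled letter survive.
vocab = {1: "Д", 2: "А"}
ids = [1, 1, 0, 2, 2, 0, 0, 2]
print("".join(vocab[i] for i in ctc_greedy_collapse(ids)))  # ДАА
```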
| Reference | Prediction |
| ------------- | ------------- |
| ะะ ะ ะะะะขะะขะฌ, ะ ะะ ะะ ะฃะะะ ะะะขะฌ ะะะะะ โ ะะะะะะข ะะ ะะะะจะะ ะะะะะะะ ะะฃะะฌะะะ ะะะะ. | ะะ ะ ะะะะขะะขะฌ ะ ะะ ะะ ะฃะะะ ะะะข ะะะะะ ะะะะะะข ะะ ะะะะจะะ ะะะะะะะ ะะฃะะฌะะะ ะะะะ |
| ะะกะะ ะะ ะะฃะะะข ะะะะ ะะะะะะ, ะฏ ะะฃะะฃ ะกะงะะขะะขะฌ, ะงะขะ ะะกะกะะะะะะฏ ะกะะะะะกะะ ะก ะญะขะะ ะะ ะะะะะะะะะะ. | ะะกะะ ะะ ะะฃะะะข ะะะะ ะะะะะะ ะฏ ะะฃะะฃ ะกะงะะขะะขะฌ ะงะขะ ะะกะกะะะะะะฏ ะกะะะะะกะะ ะก ะญะขะะ ะะ ะะะะะะะะะะ |
| ะะะะะกะขะะะฆะะ ะะะะะฅะะะะะ ะกะะะงะะะ ะฃะกะขะะะะะะขะฌ ะะะ ะก ะะะ ะะะะะ, ะ ะะะขะะ ะะะะะะะขะฌะกะฏ ะะ ะะะะะะะฏ ะะะกะฃะะะ ะกะขะะะะะะกะขะ. | ะะะะะกะขะะะฆะะ ะะะะะฅะะะะะ ะกะะะงะะะ ะฃะกะขะะะะะะขะฌ ะก ะะ ะะะ ะคะะะ ะะะะ ะ ะะะขะะ ะะะะะะะขะฌะกะฏ ะะ ะะะะะะะฏ ะะะกะฃะะะ ะกะขะะะะกะะ |
| ะฃ ะะะะฏ ะะซะะ ะขะะะะ ะงะฃะะกะขะะ, ะงะขะ ะงะขะ-ะขะ ะขะะะะ ะะงะะะฌ ะะะะะะ ะฏ ะะ ะะะะะะฏะฎ. | ะฃ ะะะะฏ ะะซะะ ะขะะะะ ะงะฃะะกะขะะ ะงะขะ ะงะขะ-ะขะ ะขะะะะ ะะงะะะฌ ะะะะะะ ะฏ ะะ ะะะะะะะฏะะข |
| ะขะะะฌะะ ะะ ะฏะ ะะ ะะะะะะข. | ะขะะะฌะะ ะะ ะฏะ ะะ ะะะะะะข |
| ะะ ะะะกะะะ, ะกะะฃะจะะฏ ะะะะะ ะฃะฅะะ, ะะะ ะะะะะะ ะะะะะะะฌ ะก ะะะะฃะะ ะ ะะ ะะะะฌ-ะญะขะะ ะ ะะะะฏะะซะะะ ะะะะ. | ะะะะะะ ะกะะฃะจะะฎ ะะข ะะะะะ ะฃะฅะะ ะขะซ ะะะขะ ะ ะะะะะะะข ะกะะะะ ะะ ะะะ ะะขะะง ะ ะะะะฏะะซะะะ ะะะกะฃ |
| ะ ะกะะะะะะะะฎ, ะกะะขะฃะะฆะะฏ ะะ ะะะะะะะะข ะฃะฅะฃะะจะะขะฌะกะฏ. | ะ ะกะะะะะะะะฎ ะกะะขะฃะะฆะะ ะะ ะะะะะะะะข ะฃะฅะฃะะะขะฌะกะฏ |
| ะะกะ ะะะะะะะะะ ะฃะฅะะะะะ ะะ ะะะะะจะะะ ะ ะะกะฅะะะซ ะ ะะ ะฃะะะะขะฃ ะะะะะะฅ ะะะะะ ะะะะะะะจะะฅะกะฏ ะะะะะะ. | ะะกะ ะะะะะะะะะ ะฃะฅะะะะะ ะะ ะะะะะจะะะ ะ ะะกะฅะะะซ ะ ะะ ะฃะะะะขะฃ ะะะะะะฅ ะะ ะะะ ะะะะะะะจะะฅะกะฏ ะะะะะะ |
| ะขะะะะ ะฌ ะะะะ, ะะะะะงะะ, ะะ ะขะะ, ะงะขะะะซ ะะ ะะะ ะะขะะขะฌ ะกะะะะ ะ ะะะะ. | ะขะะะะ ะฌ ะะะะะฎ ะะะะะงะะ ะะะขะะ ะงะขะะะซ ะะ ะะะ ะะขะะขะฌ ะกะะะะ ะ ะะะะ |
| ะะะะฏะขะฌ | ะะะะะขะฌ |
## Evaluation
1. To evaluate on `mozilla-foundation/common_voice_6_0` with split `test`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-russian --dataset mozilla-foundation/common_voice_6_0 --config ru --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-russian --dataset speech-recognition-community-v2/dev_data --config ru --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
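The WER and CER figures reported above are word- and character-level edit distances normalized by reference length. A minimal sketch of the word error rate computation (not the `eval.py` implementation, which also applies text normalization; the example strings are made up):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    r, h = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between r[:i] and h[:j]
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,      # deletion
                           dp[i][j - 1] + 1,      # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(r)][len(h)] / len(r)

print(wer("ПЯТЬ СЛОВ В ЭТОЙ ФРАЗЕ", "ПЯТЬ СЛОВ В ТОЙ ФРАЗЕ"))  # 0.2
```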
## Citation
If you want to cite this model, you can use:
```bibtex
@misc{grosman2021xlsr53-large-russian,
title={Fine-tuned {XLSR}-53 large model for speech recognition in {R}ussian},
author={Grosman, Jonatas},
howpublished={\url{https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-russian}},
year={2021}
}
``` | 5,855 | [
[
-0.035858154296875,
-0.043670654296875,
0.01512908935546875,
0.0102691650390625,
-0.02203369140625,
-0.0086212158203125,
-0.0180511474609375,
-0.0300750732421875,
0.022491455078125,
0.0113677978515625,
-0.04315185546875,
-0.039520263671875,
-0.029449462890625,
-0.00832366943359375,
-0.0322265625,
0.068603515625,
0.01654052734375,
0.0073394775390625,
-0.01142120361328125,
-0.00848388671875,
-0.028045654296875,
-0.026397705078125,
-0.040191650390625,
-0.0299072265625,
0.027191162109375,
0.01523590087890625,
0.037384033203125,
0.027252197265625,
0.034698486328125,
0.0330810546875,
-0.00865936279296875,
0.0050506591796875,
-0.0125885009765625,
0.01239013671875,
0.016143798828125,
-0.0294036865234375,
-0.0223236083984375,
-0.0047760009765625,
0.0531005859375,
0.03253173828125,
-0.0240936279296875,
0.027313232421875,
-0.0027370452880859375,
0.038482666015625,
-0.0238800048828125,
-0.0004935264587402344,
-0.0223846435546875,
-0.0103302001953125,
-0.008026123046875,
-0.0077056884765625,
-0.0104217529296875,
-0.0306854248046875,
0.01178741455078125,
-0.0295562744140625,
0.008941650390625,
0.01049041748046875,
0.08636474609375,
-0.00278472900390625,
-0.007808685302734375,
-0.0104827880859375,
-0.05438232421875,
0.0782470703125,
-0.060760498046875,
0.03009033203125,
0.038726806640625,
0.0017251968383789062,
-0.0082855224609375,
-0.05084228515625,
-0.047821044921875,
-0.0094146728515625,
-0.0086212158203125,
0.0249481201171875,
-0.0306243896484375,
-0.0123291015625,
0.027862548828125,
0.01181793212890625,
-0.059234619140625,
-0.0008440017700195312,
-0.0462646484375,
-0.0264434814453125,
0.049285888671875,
0.008148193359375,
0.02239990234375,
-0.0265350341796875,
-0.00592041015625,
-0.0182037353515625,
-0.01328277587890625,
0.023468017578125,
0.03131103515625,
0.01053619384765625,
-0.0318603515625,
0.038818359375,
-0.0249481201171875,
0.057037353515625,
0.005657196044921875,
-0.0231475830078125,
0.05584716796875,
-0.0318603515625,
-0.03131103515625,
0.0001417398452758789,
0.08551025390625,
0.0196685791015625,
0.0135955810546875,
0.005260467529296875,
-0.0082550048828125,
0.0143585205078125,
-0.0100555419921875,
-0.06292724609375,
-0.007625579833984375,
0.0296478271484375,
-0.0209197998046875,
-0.0227813720703125,
0.00342559814453125,
-0.063232421875,
0.0028514862060546875,
-0.01904296875,
0.038055419921875,
-0.05120849609375,
-0.01105499267578125,
0.01000213623046875,
-0.004947662353515625,
0.01120758056640625,
0.00867462158203125,
-0.06597900390625,
0.032379150390625,
0.032928466796875,
0.07012939453125,
0.0186309814453125,
-0.023681640625,
-0.03076171875,
-0.01284027099609375,
-0.01543426513671875,
0.048919677734375,
-0.0157012939453125,
-0.0149993896484375,
-0.0093994140625,
0.0004940032958984375,
-0.016326904296875,
-0.0216827392578125,
0.032257080078125,
-0.0163421630859375,
0.040191650390625,
-0.0018205642700195312,
-0.036407470703125,
-0.0131988525390625,
0.0038585662841796875,
-0.035064697265625,
0.0855712890625,
-0.007099151611328125,
-0.0645751953125,
0.0081329345703125,
-0.046142578125,
-0.03271484375,
-0.00943756103515625,
-0.0016994476318359375,
-0.0438232421875,
-0.00865936279296875,
0.0141754150390625,
0.044708251953125,
-0.023101806640625,
0.005401611328125,
-0.022674560546875,
-0.028167724609375,
0.03472900390625,
-0.016387939453125,
0.07135009765625,
0.0232696533203125,
-0.03387451171875,
0.00626373291015625,
-0.0721435546875,
0.031646728515625,
0.011688232421875,
-0.0289459228515625,
-0.007476806640625,
-0.01519775390625,
0.01021575927734375,
0.0208740234375,
0.009521484375,
-0.043121337890625,
-0.01154327392578125,
-0.0555419921875,
0.04486083984375,
0.05694580078125,
-0.0018644332885742188,
0.0287017822265625,
-0.0399169921875,
0.0287017822265625,
-0.0037212371826171875,
-0.004673004150390625,
-0.009796142578125,
-0.0243377685546875,
-0.0648193359375,
-0.027313232421875,
0.0205841064453125,
0.04595947265625,
-0.0283966064453125,
0.058319091796875,
-0.0124359130859375,
-0.06195068359375,
-0.05322265625,
-0.009185791015625,
0.031402587890625,
0.028167724609375,
0.03387451171875,
0.00563812255859375,
-0.06298828125,
-0.0595703125,
-0.010162353515625,
-0.01556396484375,
-0.0018548965454101562,
0.02294921875,
0.0562744140625,
-0.01512908935546875,
0.0576171875,
-0.04193115234375,
-0.0151824951171875,
-0.0162353515625,
0.0029239654541015625,
0.030548095703125,
0.0611572265625,
0.038055419921875,
-0.062286376953125,
-0.04669189453125,
0.0048980712890625,
-0.042327880859375,
-0.0021152496337890625,
0.0008025169372558594,
0.000408172607421875,
0.03314208984375,
0.019317626953125,
-0.057098388671875,
0.032379150390625,
0.03509521484375,
-0.03302001953125,
0.06182861328125,
-0.006473541259765625,
0.00891876220703125,
-0.09893798828125,
0.02880859375,
0.00763702392578125,
-0.018310546875,
-0.060516357421875,
-0.021484375,
-0.01207733154296875,
0.0112762451171875,
-0.0269622802734375,
0.048187255859375,
-0.03466796875,
0.0029811859130859375,
0.0010776519775390625,
0.01165008544921875,
-0.0063934326171875,
0.036407470703125,
0.0017137527465820312,
0.044708251953125,
0.060302734375,
-0.036163330078125,
0.047760009765625,
0.0169219970703125,
-0.048919677734375,
0.0244140625,
-0.06573486328125,
0.01337432861328125,
-0.00617218017578125,
0.0037689208984375,
-0.0850830078125,
-0.0259246826171875,
0.03778076171875,
-0.06976318359375,
0.01445770263671875,
0.0007529258728027344,
-0.0311431884765625,
-0.035003662109375,
-0.0192108154296875,
0.01194000244140625,
0.04583740234375,
-0.017364501953125,
0.038177490234375,
0.01788330078125,
-0.0125579833984375,
-0.060150146484375,
-0.06689453125,
-0.017822265625,
-0.01302337646484375,
-0.063720703125,
0.0162353515625,
-0.020294189453125,
-0.016571044921875,
-0.0012311935424804688,
-0.005344390869140625,
-0.005298614501953125,
-0.00557708740234375,
0.026763916015625,
0.033843994140625,
-0.0104217529296875,
0.0034389495849609375,
-0.019012451171875,
-0.0009307861328125,
0.007274627685546875,
-0.00572967529296875,
0.062042236328125,
-0.0209197998046875,
-0.016143798828125,
-0.06146240234375,
0.02069091796875,
0.048858642578125,
-0.0167236328125,
0.05126953125,
0.06634521484375,
-0.0258941650390625,
-0.004894256591796875,
-0.038421630859375,
-0.0020751953125,
-0.03466796875,
0.048553466796875,
-0.0145263671875,
-0.055938720703125,
0.04925537109375,
0.0179901123046875,
-0.00910186767578125,
0.0379638671875,
0.045989990234375,
-0.01666259765625,
0.0784912109375,
0.016021728515625,
0.0018644332885742188,
0.04937744140625,
-0.053497314453125,
0.0004019737243652344,
-0.052093505859375,
-0.041412353515625,
-0.048828125,
-0.02105712890625,
-0.039459228515625,
-0.0251617431640625,
0.01169586181640625,
-0.0107879638671875,
-0.0218048095703125,
0.0208587646484375,
-0.04949951171875,
0.019989013671875,
0.0401611328125,
0.01158905029296875,
-0.01337432861328125,
0.006439208984375,
-0.0182952880859375,
0.0027294158935546875,
-0.045623779296875,
-0.0266876220703125,
0.07763671875,
0.0149383544921875,
0.05523681640625,
0.01105499267578125,
0.0418701171875,
0.00943756103515625,
-0.02008056640625,
-0.07171630859375,
0.050628662109375,
0.00042629241943359375,
-0.050445556640625,
-0.03759765625,
-0.02752685546875,
-0.06842041015625,
0.0272216796875,
-0.02447509765625,
-0.0850830078125,
0.01216888427734375,
0.0025119781494140625,
-0.037078857421875,
0.0147247314453125,
-0.043426513671875,
0.060546875,
-0.0017032623291015625,
-0.042022705078125,
-0.0135955810546875,
-0.05926513671875,
0.01837158203125,
0.0021419525146484375,
0.0086669921875,
-0.00759124755859375,
0.0215911865234375,
0.09124755859375,
-0.030426025390625,
0.044464111328125,
-0.006076812744140625,
0.0193634033203125,
0.028289794921875,
-0.0094451904296875,
0.0386962890625,
-0.0032806396484375,
-0.01195526123046875,
0.0103302001953125,
0.0233917236328125,
-0.0224456787109375,
-0.0311431884765625,
0.06103515625,
-0.07958984375,
-0.038299560546875,
-0.0567626953125,
-0.03314208984375,
0.00615692138671875,
0.0294189453125,
0.04345703125,
0.05206298828125,
-0.0156402587890625,
0.044189453125,
0.039886474609375,
-0.0144500732421875,
0.039154052734375,
0.02294921875,
-0.0104217529296875,
-0.049530029296875,
0.050994873046875,
0.0211181640625,
0.0205841064453125,
0.01194000244140625,
0.02374267578125,
-0.02923583984375,
-0.041900634765625,
-0.0061492919921875,
0.0253143310546875,
-0.0469970703125,
-0.0190887451171875,
-0.046112060546875,
-0.01373291015625,
-0.0650634765625,
0.002986907958984375,
-0.020660400390625,
-0.0291595458984375,
-0.035614013671875,
-0.00830841064453125,
0.037261962890625,
0.037811279296875,
-0.021453857421875,
0.01396942138671875,
-0.049163818359375,
0.01532745361328125,
0.00576019287109375,
0.00910186767578125,
-0.0006666183471679688,
-0.059112548828125,
-0.032745361328125,
0.02008056640625,
-0.0187835693359375,
-0.061004638671875,
0.048126220703125,
-0.005527496337890625,
0.035858154296875,
0.0276336669921875,
-0.0023651123046875,
0.06036376953125,
-0.0203094482421875,
0.055938720703125,
0.03753662109375,
-0.087646484375,
0.05255126953125,
-0.030792236328125,
0.02923583984375,
0.021728515625,
0.0221710205078125,
-0.05780029296875,
-0.01259613037109375,
-0.04840087890625,
-0.05377197265625,
0.0858154296875,
0.0259246826171875,
0.0014276504516601562,
0.01061248779296875,
0.0102386474609375,
-0.0195159912109375,
0.00907135009765625,
-0.049713134765625,
-0.04656982421875,
-0.0141754150390625,
-0.017669677734375,
-0.006641387939453125,
-0.01222991943359375,
-0.0158538818359375,
-0.045745849609375,
0.06927490234375,
0.0176239013671875,
0.03466796875,
0.034027099609375,
0.00943756103515625,
-0.01131439208984375,
0.023712158203125,
0.06103515625,
0.049072265625,
-0.0264434814453125,
-0.004581451416015625,
0.023040771484375,
-0.048431396484375,
0.0146331787109375,
0.00806427001953125,
-0.0252685546875,
0.01317596435546875,
0.0311737060546875,
0.089111328125,
0.0034637451171875,
-0.037017822265625,
0.036041259765625,
0.0008502006530761719,
-0.0227813720703125,
-0.055877685546875,
0.001331329345703125,
0.01029205322265625,
0.0155029296875,
0.0276336669921875,
0.0248870849609375,
-0.00670623779296875,
-0.041717529296875,
0.01154327392578125,
0.035614013671875,
-0.02239990234375,
-0.032928466796875,
0.041748046875,
-0.00142669677734375,
-0.03228759765625,
0.033294677734375,
0.004268646240234375,
-0.04168701171875,
0.0587158203125,
0.05108642578125,
0.059234619140625,
-0.04107666015625,
0.0052032470703125,
0.059295654296875,
0.0150604248046875,
-0.009979248046875,
0.031036376953125,
0.01042938232421875,
-0.054779052734375,
-0.0146026611328125,
-0.053192138671875,
-0.0030460357666015625,
0.0345458984375,
-0.05535888671875,
0.020416259765625,
-0.041168212890625,
-0.0264739990234375,
0.0248870849609375,
0.018402099609375,
-0.047576904296875,
0.017608642578125,
0.0169830322265625,
0.060943603515625,
-0.06573486328125,
0.0672607421875,
0.049346923828125,
-0.023529052734375,
-0.07489013671875,
-0.00121307373046875,
-0.006862640380859375,
-0.04425048828125,
0.059051513671875,
0.00992584228515625,
-0.01125335693359375,
-0.00009042024612426758,
-0.03997802734375,
-0.0855712890625,
0.09686279296875,
0.0186920166015625,
-0.0472412109375,
0.00812530517578125,
0.0017538070678710938,
0.0330810546875,
-0.0151824951171875,
0.0249481201171875,
0.042999267578125,
0.04632568359375,
0.016754150390625,
-0.079345703125,
0.0263519287109375,
-0.027313232421875,
-0.01520538330078125,
0.00873565673828125,
-0.0606689453125,
0.07635498046875,
-0.02410888671875,
-0.005039215087890625,
0.0225982666015625,
0.044189453125,
0.0156097412109375,
0.0224456787109375,
0.034942626953125,
0.047210693359375,
0.059173583984375,
-0.0185394287109375,
0.05596923828125,
-0.00891876220703125,
0.04132080078125,
0.0584716796875,
0.005214691162109375,
0.06787109375,
0.033447265625,
-0.03546142578125,
0.02728271484375,
0.03228759765625,
-0.0312347412109375,
0.04205322265625,
0.01152801513671875,
-0.03228759765625,
-0.021728515625,
0.0037326812744140625,
-0.0303955078125,
0.049285888671875,
0.0104217529296875,
-0.025360107421875,
0.01556396484375,
0.0081787109375,
0.0145263671875,
-0.01015472412109375,
-0.00511932373046875,
0.0523681640625,
0.0023097991943359375,
-0.04962158203125,
0.061248779296875,
-0.004161834716796875,
0.0482177734375,
-0.0655517578125,
0.01435089111328125,
0.00574493408203125,
0.015411376953125,
-0.03631591796875,
-0.0513916015625,
0.01332855224609375,
0.00467681884765625,
-0.021026611328125,
0.00942230224609375,
0.056182861328125,
-0.0304718017578125,
-0.051605224609375,
0.031280517578125,
0.016387939453125,
0.0194091796875,
0.00640106201171875,
-0.05377197265625,
0.01036834716796875,
0.0167236328125,
-0.042816162109375,
0.018402099609375,
0.022735595703125,
0.02606201171875,
0.05535888671875,
0.0625,
0.025177001953125,
0.003643035888671875,
-0.0140228271484375,
0.06158447265625,
-0.06109619140625,
-0.04248046875,
-0.06640625,
0.041473388671875,
-0.003147125244140625,
-0.0186767578125,
0.056396484375,
0.055572509765625,
0.052642822265625,
-0.00005620718002319336,
0.07373046875,
-0.0258331298828125,
0.04852294921875,
-0.036224365234375,
0.064697265625,
-0.04742431640625,
-0.006771087646484375,
-0.0243072509765625,
-0.049530029296875,
-0.0260772705078125,
0.061492919921875,
-0.0300750732421875,
0.015899658203125,
0.049774169921875,
0.0726318359375,
0.00574493408203125,
-0.00812530517578125,
0.027069091796875,
0.036163330078125,
0.009185791015625,
0.0439453125,
0.050018310546875,
-0.049163818359375,
0.0635986328125,
-0.042755126953125,
-0.00574493408203125,
-0.019866943359375,
-0.03887939453125,
-0.0693359375,
-0.060211181640625,
-0.0228424072265625,
-0.044158935546875,
-0.004718780517578125,
0.09466552734375,
0.060791015625,
-0.061431884765625,
-0.026153564453125,
0.01303863525390625,
0.002872467041015625,
-0.03265380859375,
-0.0202178955078125,
0.043121337890625,
0.007274627685546875,
-0.0706787109375,
0.0174560546875,
-0.011260986328125,
0.01373291015625,
-0.004550933837890625,
-0.01308441162109375,
-0.039703369140625,
0.00730133056640625,
0.0268707275390625,
0.020751953125,
-0.061065673828125,
-0.0258331298828125,
0.00101470947265625,
-0.014129638671875,
0.0112152099609375,
0.01500701904296875,
-0.0484619140625,
0.018463134765625,
0.053497314453125,
0.00830078125,
0.043975830078125,
-0.0029048919677734375,
0.010986328125,
-0.034088134765625,
0.0207366943359375,
0.0193328857421875,
0.04052734375,
0.0186004638671875,
-0.0272979736328125,
0.0220947265625,
0.033172607421875,
-0.049835205078125,
-0.07275390625,
-0.006381988525390625,
-0.10089111328125,
-0.0135498046875,
0.0948486328125,
-0.00923919677734375,
-0.0193328857421875,
0.01142120361328125,
-0.0159912109375,
0.041412353515625,
-0.05096435546875,
0.04351806640625,
0.049835205078125,
-0.00855255126953125,
-0.00036907196044921875,
-0.045745849609375,
0.036651611328125,
0.0299072265625,
-0.048095703125,
0.0030155181884765625,
0.0322265625,
0.040679931640625,
0.032135009765625,
0.0550537109375,
0.00907135009765625,
0.0272979736328125,
0.0006775856018066406,
0.0152587890625,
-0.01502227783203125,
-0.004077911376953125,
-0.036102294921875,
0.0012674331665039062,
-0.029632568359375,
-0.040313720703125
]
] |
t5-base | 2023-04-06T13:42:36.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"safetensors",
"t5",
"text2text-generation",
"summarization",
"translation",
"en",
"fr",
"ro",
"de",
"dataset:c4",
"arxiv:1805.12471",
"arxiv:1708.00055",
"arxiv:1704.05426",
"arxiv:1606.05250",
"arxiv:1808.09121",
"arxiv:1810.12885",
"arxiv:1905.10044",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | translation | null | null | null | t5-base | 361 | 1,853,138 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
- fr
- ro
- de
datasets:
- c4
tags:
- summarization
- translation
license: apache-2.0
---
# Model Card for T5 Base

# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training Details](#training-details)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Citation](#citation)
8. [Model Card Authors](#model-card-authors)
9. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
The developers of the Text-To-Text Transfer Transformer (T5) [write](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html):
> With T5, we propose reframing all NLP tasks into a unified text-to-text-format where the input and output are always text strings, in contrast to BERT-style models that can only output either a class label or a span of the input. Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task.
T5-Base is the checkpoint with 220 million parameters.
- **Developed by:** Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu. See [associated paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) and [GitHub repo](https://github.com/google-research/text-to-text-transfer-transformer#released-model-checkpoints)
- **Model type:** Language model
- **Language(s) (NLP):** English, French, Romanian, German
- **License:** Apache 2.0
- **Related Models:** [All T5 Checkpoints](https://huggingface.co/models?search=t5)
- **Resources for more information:**
- [Research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf)
- [Google's T5 Blog Post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
- [GitHub Repo](https://github.com/google-research/text-to-text-transfer-transformer)
- [Hugging Face T5 Docs](https://huggingface.co/docs/transformers/model_doc/t5)
# Uses
## Direct Use and Downstream Use
The developers write in a [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) that the model:
> Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task, including machine translation, document summarization, question answering, and classification tasks (e.g., sentiment analysis). We can even apply T5 to regression tasks by training it to predict the string representation of a number instead of the number itself.
See the [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.
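As a small illustration of the text-to-text convention described above, the sketch below shows how different tasks are cast as prefixed input strings. The prefix strings follow the conventions used in the T5 paper; the `make_t5_input` helper itself is a hypothetical convenience for illustration, not part of the `transformers` API.

```python
# Sketch: T5 casts every task as text-to-text by prepending a task prefix.
# The prefixes below follow the T5 paper; make_t5_input is a hypothetical
# helper for illustration, not an official API.

T5_PREFIXES = {
    "translate_en_de": "translate English to German: ",
    "summarize": "summarize: ",
    "cola": "cola sentence: ",
}

def make_t5_input(task: str, text: str) -> str:
    """Build the prefixed input string T5 expects for a given task."""
    return T5_PREFIXES[task] + text

print(make_t5_input("translate_en_de", "The house is wonderful."))
# translate English to German: The house is wonderful.
```

The prefixed string is then tokenized and passed to the model exactly like any other input, which is what lets a single checkpoint handle translation, summarization, and classification alike.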
## Out-of-Scope Use
More information needed.
# Bias, Risks, and Limitations
More information needed.
## Recommendations
More information needed.
# Training Details
## Training Data
The model is pre-trained on the [Colossal Clean Crawled Corpus (C4)](https://www.tensorflow.org/datasets/catalog/c4), which was developed and released in the context of the same [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) as T5.
The model was pre-trained on a **multi-task mixture of unsupervised (1.) and supervised tasks (2.)**.
The following datasets were used for (1.) and (2.):
1. **Datasets used for Unsupervised denoising objective**:
- [C4](https://huggingface.co/datasets/c4)
- [Wiki-DPR](https://huggingface.co/datasets/wiki_dpr)
2. **Datasets used for Supervised text-to-text language modeling objective**
- Sentence acceptability judgment
- CoLA [Warstadt et al., 2018](https://arxiv.org/abs/1805.12471)
- Sentiment analysis
- SST-2 [Socher et al., 2013](https://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf)
- Paraphrasing/sentence similarity
- MRPC [Dolan and Brockett, 2005](https://aclanthology.org/I05-5002)
  - STS-B [Cer et al., 2017](https://arxiv.org/abs/1708.00055)
- QQP [Iyer et al., 2017](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
- Natural language inference
- MNLI [Williams et al., 2017](https://arxiv.org/abs/1704.05426)
- QNLI [Rajpurkar et al.,2016](https://arxiv.org/abs/1606.05250)
- RTE [Dagan et al., 2005](https://link.springer.com/chapter/10.1007/11736790_9)
  - CB [De Marneffe et al., 2019](https://semanticsarchive.net/Archive/Tg3ZGI2M/Marneffe.pdf)
- Sentence completion
- COPA [Roemmele et al., 2011](https://www.researchgate.net/publication/221251392_Choice_of_Plausible_Alternatives_An_Evaluation_of_Commonsense_Causal_Reasoning)
- Word sense disambiguation
- WIC [Pilehvar and Camacho-Collados, 2018](https://arxiv.org/abs/1808.09121)
- Question answering
- MultiRC [Khashabi et al., 2018](https://aclanthology.org/N18-1023)
- ReCoRD [Zhang et al., 2018](https://arxiv.org/abs/1810.12885)
- BoolQ [Clark et al., 2019](https://arxiv.org/abs/1905.10044)
## Training Procedure
In their [abstract](https://jmlr.org/papers/volume21/20-074/20-074.pdf), the model developers write:
> In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks.
The T5 framework introduced in the paper combines the approaches studied there into a single training procedure. See the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.
# Evaluation
## Testing Data, Factors & Metrics
The developers evaluated the model on 24 tasks, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for full details.
## Results
For full results for T5-Base, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf), Table 14.
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** Google Cloud TPU Pods
- **Hours used:** More information needed
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Citation
**BibTeX:**
```bibtex
@article{2020t5,
author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
journal = {Journal of Machine Learning Research},
year = {2020},
volume = {21},
number = {140},
pages = {1-67},
url = {http://jmlr.org/papers/v21/20-074.html}
}
```
**APA:**
- Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., ... & Liu, P. J. (2020). Exploring the limits of transfer learning with a unified text-to-text transformer. J. Mach. Learn. Res., 21(140), 1-67.
# Model Card Authors
This model card was written by the team at Hugging Face.
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import T5Tokenizer, T5Model
tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5Model.from_pretrained("t5-base")
input_ids = tokenizer(
"Studies have been shown that owning a dog is good for you", return_tensors="pt"
).input_ids # Batch size 1
decoder_input_ids = tokenizer("Studies show that", return_tensors="pt").input_ids # Batch size 1
# forward pass
outputs = model(input_ids=input_ids, decoder_input_ids=decoder_input_ids)
last_hidden_states = outputs.last_hidden_state
```
See the [Hugging Face T5](https://huggingface.co/docs/transformers/model_doc/t5#transformers.T5Model) docs and a [Colab Notebook](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/main/notebooks/t5-trivia.ipynb) created by the model developers for more examples.
</details>
| 8,454 | [
[
-0.0205535888671875,
-0.0252227783203125,
0.03521728515625,
0.0108795166015625,
-0.01141357421875,
-0.005847930908203125,
-0.0185699462890625,
-0.0426025390625,
-0.0258026123046875,
0.032806396484375,
-0.038421630859375,
-0.04766845703125,
-0.061553955078125,
0.024505615234375,
-0.03997802734375,
0.08087158203125,
-0.00659942626953125,
-0.0118408203125,
-0.00907135009765625,
-0.00917816162109375,
-0.027862548828125,
-0.040313720703125,
-0.044219970703125,
-0.0268402099609375,
0.0290069580078125,
0.0162200927734375,
0.0200653076171875,
0.03338623046875,
0.04937744140625,
0.017822265625,
-0.00872039794921875,
-0.002162933349609375,
-0.034912109375,
-0.0202178955078125,
-0.0212249755859375,
-0.0241851806640625,
-0.0272064208984375,
-0.0036525726318359375,
0.0411376953125,
0.05523681640625,
0.0027103424072265625,
0.0263214111328125,
0.0097503662109375,
0.03857421875,
-0.04766845703125,
0.0111541748046875,
-0.04364013671875,
0.00750732421875,
-0.00045228004455566406,
0.002460479736328125,
-0.045135498046875,
-0.00315093994140625,
0.0166473388671875,
-0.045928955078125,
0.0250701904296875,
-0.0037517547607421875,
0.09149169921875,
0.02459716796875,
-0.036346435546875,
-0.0132598876953125,
-0.0574951171875,
0.08319091796875,
-0.05853271484375,
0.040863037109375,
0.0124053955078125,
0.00954437255859375,
0.01113128662109375,
-0.08380126953125,
-0.050506591796875,
-0.0008511543273925781,
-0.01522064208984375,
0.019012451171875,
-0.02215576171875,
0.002689361572265625,
0.025360107421875,
0.0261993408203125,
-0.032501220703125,
-0.005176544189453125,
-0.044677734375,
-0.0087890625,
0.042236328125,
-0.00234222412109375,
0.0258026123046875,
-0.0149688720703125,
-0.036590576171875,
-0.022979736328125,
-0.0253753662109375,
0.007694244384765625,
-0.00928497314453125,
0.0200958251953125,
-0.0259552001953125,
0.0208892822265625,
0.006580352783203125,
0.044525146484375,
0.01541900634765625,
-0.0150146484375,
0.032470703125,
-0.05731201171875,
-0.017730712890625,
-0.026580810546875,
0.08294677734375,
0.0241546630859375,
0.01175689697265625,
-0.034423828125,
-0.0032062530517578125,
-0.008697509765625,
0.029632568359375,
-0.07257080078125,
-0.0086822509765625,
0.02410888671875,
-0.03680419921875,
-0.0367431640625,
-0.0027637481689453125,
-0.0577392578125,
-0.00229644775390625,
-0.00670623779296875,
0.03570556640625,
-0.038665771484375,
-0.018218994140625,
0.015167236328125,
-0.0228271484375,
0.0261688232421875,
0.0209808349609375,
-0.0667724609375,
0.026397705078125,
0.0237884521484375,
0.054412841796875,
-0.03436279296875,
-0.027313232421875,
-0.009796142578125,
0.0099945068359375,
-0.00923919677734375,
0.051849365234375,
-0.0306854248046875,
-0.034088134765625,
-0.01125335693359375,
0.01194000244140625,
-0.0183868408203125,
-0.0216827392578125,
0.062347412109375,
-0.0212860107421875,
0.0545654296875,
-0.0246734619140625,
-0.040130615234375,
-0.03118896484375,
0.01351165771484375,
-0.04718017578125,
0.0906982421875,
0.0011987686157226562,
-0.05950927734375,
0.0225677490234375,
-0.071044921875,
-0.0211029052734375,
-0.0204620361328125,
0.023193359375,
-0.040374755859375,
-0.0201416015625,
0.022674560546875,
0.0457763671875,
-0.0277862548828125,
0.0282745361328125,
-0.0201873779296875,
-0.018463134765625,
0.00634002685546875,
-0.0244903564453125,
0.07904052734375,
0.0208740234375,
-0.038818359375,
0.0013341903686523438,
-0.054412841796875,
0.0026702880859375,
-0.0010814666748046875,
-0.0214996337890625,
0.0018768310546875,
-0.01471710205078125,
0.019012451171875,
0.033172607421875,
0.018707275390625,
-0.039825439453125,
0.0020904541015625,
-0.02294921875,
0.04791259765625,
0.03497314453125,
-0.005886077880859375,
0.043548583984375,
-0.036346435546875,
0.0282135009765625,
0.01456451416015625,
0.004962921142578125,
-0.0150909423828125,
-0.0260467529296875,
-0.06085205078125,
0.002292633056640625,
0.038116455078125,
0.04290771484375,
-0.0433349609375,
0.044342041015625,
-0.041412353515625,
-0.051910400390625,
-0.04705810546875,
-0.005306243896484375,
0.028106689453125,
0.051544189453125,
0.061798095703125,
-0.00811767578125,
-0.04705810546875,
-0.047210693359375,
-0.025390625,
-0.0048065185546875,
-0.0010061264038085938,
0.0077362060546875,
0.053070068359375,
-0.0099945068359375,
0.06396484375,
-0.020904541015625,
-0.0264892578125,
-0.040863037109375,
0.00005370378494262695,
-0.005046844482421875,
0.044677734375,
0.048675537109375,
-0.05645751953125,
-0.037139892578125,
-0.01383209228515625,
-0.06439208984375,
-0.0020618438720703125,
-0.01103973388671875,
-0.0006823539733886719,
0.03167724609375,
0.040313720703125,
-0.045135498046875,
0.017303466796875,
0.045654296875,
-0.0259552001953125,
0.022979736328125,
-0.00885009765625,
-0.00017631053924560547,
-0.12408447265625,
0.04071044921875,
0.01056671142578125,
-0.015838623046875,
-0.058349609375,
-0.0078125,
0.00449371337890625,
-0.00592041015625,
-0.039703369140625,
0.0517578125,
-0.0310821533203125,
0.0028839111328125,
-0.0016450881958007812,
0.00152587890625,
0.01071929931640625,
0.051055908203125,
-0.0012083053588867188,
0.058990478515625,
0.0198974609375,
-0.0537109375,
-0.0008535385131835938,
0.0252532958984375,
-0.0037670135498046875,
0.023040771484375,
-0.0557861328125,
0.0215301513671875,
-0.00518035888671875,
0.03619384765625,
-0.07037353515625,
0.01184844970703125,
0.0259552001953125,
-0.05279541015625,
0.023590087890625,
-0.0001385211944580078,
-0.0291595458984375,
-0.0256805419921875,
-0.022918701171875,
0.01934814453125,
0.051055908203125,
-0.0367431640625,
0.054840087890625,
0.0101318359375,
0.023223876953125,
-0.06134033203125,
-0.06451416015625,
0.01338958740234375,
-0.02947998046875,
-0.0390625,
0.0618896484375,
-0.01009368896484375,
0.0074310302734375,
0.01180267333984375,
0.0014848709106445312,
-0.0160980224609375,
0.011444091796875,
0.004474639892578125,
0.015716552734375,
0.00202178955078125,
0.01348876953125,
-0.008697509765625,
-0.01221466064453125,
0.00007194280624389648,
-0.0360107421875,
0.0229949951171875,
-0.01373291015625,
0.01251983642578125,
-0.04974365234375,
0.013458251953125,
0.04510498046875,
-0.01468658447265625,
0.062042236328125,
0.07379150390625,
-0.0196533203125,
-0.004566192626953125,
-0.03570556640625,
-0.0160675048828125,
-0.034332275390625,
0.0303802490234375,
-0.033599853515625,
-0.06500244140625,
0.0325927734375,
0.00313568115234375,
0.0298004150390625,
0.06683349609375,
0.0265045166015625,
-0.01187896728515625,
0.0576171875,
0.064697265625,
-0.003467559814453125,
0.042205810546875,
-0.03594970703125,
0.020751953125,
-0.0654296875,
-0.0211029052734375,
-0.055450439453125,
-0.0199737548828125,
-0.058807373046875,
-0.0265960693359375,
0.007411956787109375,
-0.0008568763732910156,
-0.0269317626953125,
0.0389404296875,
-0.0411376953125,
0.0087738037109375,
0.03204345703125,
0.006500244140625,
0.028045654296875,
0.0011510848999023438,
-0.0053558349609375,
-0.01136016845703125,
-0.06500244140625,
-0.036651611328125,
0.09765625,
0.0269775390625,
0.0285491943359375,
-0.0025615692138671875,
0.048980712890625,
0.0182037353515625,
0.01467132568359375,
-0.05499267578125,
0.05206298828125,
-0.029327392578125,
-0.03802490234375,
-0.01702880859375,
-0.03057861328125,
-0.0858154296875,
0.0224456787109375,
-0.025726318359375,
-0.051055908203125,
0.01107025146484375,
-0.0007147789001464844,
-0.01549530029296875,
0.039703369140625,
-0.064697265625,
0.0819091796875,
-0.006191253662109375,
-0.024444580078125,
-0.0016908645629882812,
-0.054046630859375,
0.016082763671875,
0.00431060791015625,
0.007556915283203125,
0.0082244873046875,
-0.0125274658203125,
0.07318115234375,
-0.0242919921875,
0.0689697265625,
-0.0146636962890625,
0.0030002593994140625,
0.0097198486328125,
-0.02593994140625,
0.03033447265625,
-0.031768798828125,
-0.0073089599609375,
0.031036376953125,
0.00812530517578125,
-0.035125732421875,
-0.040374755859375,
0.03302001953125,
-0.07342529296875,
-0.0277252197265625,
-0.03021240234375,
-0.036590576171875,
-0.0113677978515625,
0.0288238525390625,
0.0279388427734375,
0.0136566162109375,
-0.01296234130859375,
0.0275421142578125,
0.04949951171875,
-0.0265350341796875,
0.054962158203125,
0.024444580078125,
0.002010345458984375,
-0.023193359375,
0.058441162109375,
0.00928497314453125,
0.0284881591796875,
0.04364013671875,
0.01220703125,
-0.0256805419921875,
-0.0428466796875,
-0.0284576416015625,
0.0240020751953125,
-0.04736328125,
-0.00662994384765625,
-0.0740966796875,
-0.0168304443359375,
-0.043121337890625,
-0.0022983551025390625,
-0.03228759765625,
-0.0301971435546875,
-0.035186767578125,
-0.0146636962890625,
0.02130126953125,
0.0362548828125,
0.0112152099609375,
0.01430511474609375,
-0.0692138671875,
0.01297760009765625,
0.0016775131225585938,
0.007556915283203125,
0.001010894775390625,
-0.060699462890625,
-0.01091766357421875,
0.00742340087890625,
-0.03204345703125,
-0.050872802734375,
0.03350830078125,
0.0171661376953125,
0.0270843505859375,
0.000934600830078125,
0.01068878173828125,
0.048309326171875,
-0.0196990966796875,
0.07611083984375,
0.01209259033203125,
-0.0791015625,
0.0206451416015625,
-0.0197601318359375,
0.03167724609375,
0.039520263671875,
0.036865234375,
-0.049102783203125,
-0.0166168212890625,
-0.074951171875,
-0.059722900390625,
0.0576171875,
0.0189361572265625,
0.01081085205078125,
0.0293426513671875,
0.0190887451171875,
0.0027523040771484375,
0.01171875,
-0.0714111328125,
-0.0185394287109375,
-0.0219879150390625,
-0.028045654296875,
-0.004650115966796875,
-0.004405975341796875,
0.008697509765625,
-0.0254669189453125,
0.049652099609375,
-0.0052947998046875,
0.054779052734375,
0.023345947265625,
-0.0228729248046875,
0.01302337646484375,
0.0291595458984375,
0.05029296875,
0.039825439453125,
-0.015106201171875,
-0.0012941360473632812,
0.034332275390625,
-0.040374755859375,
-0.0017786026000976562,
0.0115203857421875,
-0.0239715576171875,
0.00142669677734375,
0.03643798828125,
0.07366943359375,
0.006923675537109375,
-0.030517578125,
0.041290283203125,
-0.0026912689208984375,
-0.0472412109375,
-0.0204010009765625,
-0.0039825439453125,
0.0104827880859375,
-0.0023822784423828125,
0.0213775634765625,
0.0184326171875,
0.00939178466796875,
-0.037322998046875,
0.00506591796875,
0.0092620849609375,
-0.035003662109375,
-0.03570556640625,
0.060882568359375,
0.0267791748046875,
-0.002437591552734375,
0.04547119140625,
-0.007709503173828125,
-0.042755126953125,
0.0430908203125,
0.041015625,
0.078125,
-0.00431060791015625,
0.01371002197265625,
0.051055908203125,
0.0289764404296875,
-0.00777435302734375,
0.0038661956787109375,
-0.005645751953125,
-0.0616455078125,
-0.043701171875,
-0.03515625,
-0.0228729248046875,
0.0183868408203125,
-0.0341796875,
0.0246124267578125,
-0.02435302734375,
-0.0010986328125,
0.006465911865234375,
0.011322021484375,
-0.05657958984375,
0.0250091552734375,
0.0030612945556640625,
0.0638427734375,
-0.0567626953125,
0.06329345703125,
0.056976318359375,
-0.04583740234375,
-0.0699462890625,
0.01209259033203125,
-0.023406982421875,
-0.047454833984375,
0.044219970703125,
0.01189422607421875,
-0.00220489501953125,
0.01544952392578125,
-0.04144287109375,
-0.0654296875,
0.100830078125,
0.0269927978515625,
-0.0209503173828125,
-0.0267333984375,
0.02130126953125,
0.049957275390625,
-0.020355224609375,
0.0330810546875,
0.035675048828125,
0.036895751953125,
0.017791748046875,
-0.07904052734375,
0.0245513916015625,
-0.020904541015625,
0.00623321533203125,
0.00244140625,
-0.061004638671875,
0.0443115234375,
-0.02581787109375,
-0.0181732177734375,
-0.0164794921875,
0.05145263671875,
0.00446319580078125,
0.01445770263671875,
0.036956787109375,
0.05670166015625,
0.05072021484375,
-0.006755828857421875,
0.087158203125,
-0.0261688232421875,
0.03619384765625,
0.06024169921875,
0.010009765625,
0.06719970703125,
0.036773681640625,
-0.0229034423828125,
0.0390625,
0.049468994140625,
-0.00901031494140625,
0.03875732421875,
-0.01001739501953125,
-0.004962921142578125,
-0.008331298828125,
-0.01316070556640625,
-0.0262908935546875,
0.0186767578125,
0.018707275390625,
-0.031982421875,
-0.021026611328125,
0.0086822509765625,
0.02294921875,
-0.01153564453125,
-0.004451751708984375,
0.0631103515625,
0.01605224609375,
-0.060028076171875,
0.056365966796875,
0.01168060302734375,
0.06707763671875,
-0.0333251953125,
0.00482177734375,
-0.013824462890625,
0.016143798828125,
-0.0256195068359375,
-0.052642822265625,
0.0374755859375,
0.0034351348876953125,
-0.015411376953125,
-0.051910400390625,
0.06304931640625,
-0.033935546875,
-0.0305938720703125,
0.026519775390625,
0.03472900390625,
0.00714874267578125,
0.003467559814453125,
-0.070556640625,
-0.006744384765625,
0.014862060546875,
-0.0170135498046875,
0.0237884521484375,
0.027740478515625,
0.00402069091796875,
0.051361083984375,
0.046295166015625,
-0.01544952392578125,
0.001949310302734375,
-0.0099639892578125,
0.051116943359375,
-0.056396484375,
-0.02239990234375,
-0.055694580078125,
0.05377197265625,
-0.0008311271667480469,
-0.034088134765625,
0.049102783203125,
0.034423828125,
0.08062744140625,
-0.00982666015625,
0.0780029296875,
-0.01450347900390625,
0.040618896484375,
-0.0300750732421875,
0.0382080078125,
-0.05010986328125,
0.014007568359375,
-0.0261993408203125,
-0.06060791015625,
-0.022430419921875,
0.0305328369140625,
-0.0275421142578125,
0.0252532958984375,
0.0797119140625,
0.047271728515625,
-0.00098419189453125,
-0.01103973388671875,
0.01554107666015625,
0.01271820068359375,
0.0273895263671875,
0.056427001953125,
0.0262451171875,
-0.07391357421875,
0.07281494140625,
-0.0273895263671875,
0.01690673828125,
-0.0009379386901855469,
-0.061981201171875,
-0.0712890625,
-0.062286376953125,
-0.0291290283203125,
-0.03460693359375,
0.01113128662109375,
0.05963134765625,
0.04498291015625,
-0.05242919921875,
-0.0211029052734375,
-0.0298309326171875,
0.00030803680419921875,
-0.0165863037109375,
-0.01580810546875,
0.037872314453125,
-0.0379638671875,
-0.06671142578125,
0.0022830963134765625,
-0.006443023681640625,
0.002803802490234375,
-0.0009312629699707031,
-0.002262115478515625,
-0.0241546630859375,
-0.014739990234375,
0.045684814453125,
0.0132293701171875,
-0.047821044921875,
-0.0211029052734375,
0.0222625732421875,
-0.0136566162109375,
0.01149749755859375,
0.03515625,
-0.05059814453125,
0.015625,
0.040313720703125,
0.0701904296875,
0.0626220703125,
-0.007808685302734375,
0.048797607421875,
-0.031463623046875,
-0.01035308837890625,
0.010650634765625,
0.00737762451171875,
0.030609130859375,
-0.0169830322265625,
0.048309326171875,
0.03765869140625,
-0.0355224609375,
-0.049652099609375,
-0.01207733154296875,
-0.09356689453125,
-0.01383209228515625,
0.09552001953125,
-0.0128021240234375,
-0.017242431640625,
0.0005221366882324219,
-0.003543853759765625,
0.0297393798828125,
-0.035400390625,
0.056549072265625,
0.06549072265625,
0.00787353515625,
-0.0341796875,
-0.04388427734375,
0.048065185546875,
0.045257568359375,
-0.07879638671875,
-0.015472412109375,
0.0142974853515625,
0.0396728515625,
0.00890350341796875,
0.045654296875,
-0.00849151611328125,
0.007076263427734375,
-0.01495361328125,
0.02618408203125,
-0.00225067138671875,
-0.0027828216552734375,
-0.0256500244140625,
0.0148162841796875,
-0.01557159423828125,
-0.020355224609375
]
] |
Davlan/distilbert-base-multilingual-cased-ner-hrl | 2023-08-14T19:34:34.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"distilbert",
"token-classification",
"license:afl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | Davlan | null | null | Davlan/distilbert-base-multilingual-cased-ner-hrl | 67 | 1,808,811 | transformers | 2022-03-02T23:29:04 | ---
license: afl-3.0
language:
- ar
- de
- en
- es
- fr
- it
- lv
- nl
- pt
- zh
- multilingual
---
# distilbert-base-multilingual-cased-ner-hrl
## Model description
**distilbert-base-multilingual-cased-ner-hrl** is a **Named Entity Recognition** model for 10 high-resource languages (Arabic, German, English, Spanish, French, Italian, Latvian, Dutch, Portuguese, and Chinese) based on a fine-tuned DistilBERT base model. It has been trained to recognize three types of entities: location (LOC), organization (ORG), and person (PER).
Specifically, this model is a *distilbert-base-multilingual-cased* model that was fine-tuned on an aggregation of datasets from 10 high-resource languages.
## Intended uses & limitations
#### How to use
You can use this model with the Transformers *pipeline* for NER.
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline
tokenizer = AutoTokenizer.from_pretrained("Davlan/distilbert-base-multilingual-cased-ner-hrl")
model = AutoModelForTokenClassification.from_pretrained("Davlan/distilbert-base-multilingual-cased-ner-hrl")
nlp = pipeline("ner", model=model, tokenizer=tokenizer)
example = "Nader Jokhadar had given Syria the lead with a well-struck header in the seventh minute."
ner_results = nlp(example)
print(ner_results)
```
#### Limitations and bias
This model is limited by its training dataset of entity-annotated news articles from a specific span of time. This may not generalize well for all use cases in different domains.
## Training data
The training data for the 10 languages come from the following sources:
Language|Dataset
-|-
Arabic | [ANERcorp](https://camel.abudhabi.nyu.edu/anercorp/)
German | [conll 2003](https://www.clips.uantwerpen.be/conll2003/ner/)
English | [conll 2003](https://www.clips.uantwerpen.be/conll2003/ner/)
Spanish | [conll 2002](https://www.clips.uantwerpen.be/conll2002/ner/)
French | [Europeana Newspapers](https://github.com/EuropeanaNewspapers/ner-corpora/tree/master/enp_FR.bnf.bio)
Italian | [Italian I-CAB](https://ontotext.fbk.eu/icab.html)
Latvian | [Latvian NER](https://github.com/LUMII-AILab/FullStack/tree/master/NamedEntities)
Dutch | [conll 2002](https://www.clips.uantwerpen.be/conll2002/ner/)
Portuguese |[Paramopama + Second HAREM](https://github.com/davidsbatista/NER-datasets/tree/master/Portuguese)
Chinese | [MSRA](https://huggingface.co/datasets/msra_ner)
The training dataset distinguishes between the beginning and continuation of an entity so that if there are back-to-back entities of the same type, the model can output where the second entity begins. As in the dataset, each token will be classified as one of the following classes:
Abbreviation|Description
-|-
O|Outside of a named entity
B-PER |Beginning of a person's name right after another person's name
I-PER |Person's name
B-ORG |Beginning of an organisation right after another organisation
I-ORG |Organisation
B-LOC |Beginning of a location right after another location
I-LOC |Location
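The B-/I- scheme above is what lets the model mark where a second back-to-back entity begins. The following small, self-contained sketch (a hypothetical helper, not part of the Transformers API) shows how such tags are merged back into entity spans:

```python
# Hypothetical helper: groups (token, tag) pairs into entity spans
# according to the B-/I-/O scheme described in the table above.
def group_entities(tagged_tokens):
    """Collect consecutive B-X/I-X tags into (entity_type, text) spans."""
    entities = []
    current_type, current_tokens = None, []
    for token, tag in tagged_tokens:
        if tag.startswith("B-"):
            # A B- tag always opens a new entity, even immediately
            # after another entity of the same type.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An I- tag continues the currently open entity.
            current_tokens.append(token)
        else:
            # "O" (or a mismatched I- tag) closes any open entity.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_tokens:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

tags = [("Nader", "B-PER"), ("Jokhadar", "I-PER"),
        ("Syria", "B-LOC"), ("scored", "O")]
print(group_entities(tags))  # [('PER', 'Nader Jokhadar'), ('LOC', 'Syria')]
```

In the Transformers pipeline itself, roughly the same grouping can be requested via the `aggregation_strategy` argument instead of post-processing raw tags by hand.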
## Training procedure
This model was trained on an NVIDIA V100 GPU with the recommended hyperparameters from the Hugging Face code. | 3,135 | [
[
-0.041656494140625,
-0.0469970703125,
0.00795745849609375,
0.031036376953125,
-0.01284027099609375,
0.0083160400390625,
-0.0204010009765625,
-0.041656494140625,
0.04095458984375,
0.0208740234375,
-0.036590576171875,
-0.0523681640625,
-0.06402587890625,
0.03173828125,
-0.017059326171875,
0.0927734375,
-0.016265869140625,
0.0192413330078125,
0.007694244384765625,
-0.043121337890625,
-0.006229400634765625,
-0.05499267578125,
-0.0655517578125,
-0.0202178955078125,
0.0303955078125,
0.0128326416015625,
0.0347900390625,
0.042816162109375,
0.019622802734375,
0.02685546875,
-0.0217742919921875,
-0.005352020263671875,
-0.01509857177734375,
-0.017578125,
-0.01364898681640625,
-0.010223388671875,
-0.0243682861328125,
-0.00597381591796875,
0.056884765625,
0.046539306640625,
-0.01409912109375,
0.0211029052734375,
-0.0030765533447265625,
0.057281494140625,
-0.01605224609375,
0.0196380615234375,
-0.0301971435546875,
-0.0032444000244140625,
-0.0204010009765625,
0.01983642578125,
-0.0081787109375,
-0.01471710205078125,
0.018798828125,
-0.02484130859375,
0.01233673095703125,
-0.00391387939453125,
0.10540771484375,
0.0041656494140625,
-0.039825439453125,
-0.01904296875,
-0.023529052734375,
0.04150390625,
-0.03839111328125,
0.05584716796875,
0.021484375,
0.01261138916015625,
0.0009484291076660156,
-0.035675048828125,
-0.0582275390625,
0.0020008087158203125,
-0.01329803466796875,
0.0009055137634277344,
-0.020172119140625,
-0.01751708984375,
0.0260467529296875,
0.0318603515625,
-0.0430908203125,
-0.005901336669921875,
-0.0296630859375,
-0.0211334228515625,
0.0433349609375,
-0.0167999267578125,
0.03564453125,
-0.026702880859375,
-0.0310211181640625,
-0.01436614990234375,
-0.035888671875,
0.01294708251953125,
0.0290374755859375,
0.032012939453125,
-0.041046142578125,
0.04095458984375,
-0.019989013671875,
0.05169677734375,
0.0209503173828125,
-0.01500701904296875,
0.051971435546875,
-0.01690673828125,
-0.0169219970703125,
0.0015878677368164062,
0.06488037109375,
0.0268096923828125,
0.0095062255859375,
-0.009918212890625,
-0.0204010009765625,
0.00975799560546875,
-0.0182037353515625,
-0.0643310546875,
-0.00830841064453125,
0.004116058349609375,
-0.03936767578125,
-0.006992340087890625,
-0.007663726806640625,
-0.042999267578125,
0.0017795562744140625,
-0.0294036865234375,
0.0198822021484375,
-0.046234130859375,
-0.031585693359375,
0.002521514892578125,
-0.004077911376953125,
0.03216552734375,
0.00917816162109375,
-0.055755615234375,
0.0192718505859375,
0.02874755859375,
0.06494140625,
-0.01186370849609375,
-0.033111572265625,
-0.039886474609375,
-0.00750732421875,
-0.0015192031860351562,
0.04827880859375,
-0.0284271240234375,
-0.01186370849609375,
0.0001652240753173828,
0.0311126708984375,
-0.011077880859375,
-0.03594970703125,
0.03106689453125,
-0.0355224609375,
0.033111572265625,
-0.0286712646484375,
-0.0377197265625,
-0.0233306884765625,
0.02923583984375,
-0.0543212890625,
0.09228515625,
0.035858154296875,
-0.07073974609375,
0.042266845703125,
-0.03253173828125,
-0.0377197265625,
0.0021457672119140625,
-0.0240631103515625,
-0.039306640625,
0.00112152099609375,
0.021514892578125,
0.03192138671875,
0.0016984939575195312,
0.02899169921875,
0.0049285888671875,
0.00921630859375,
-0.01230621337890625,
0.0010356903076171875,
0.084228515625,
-0.0010929107666015625,
-0.0279693603515625,
0.0003056526184082031,
-0.06829833984375,
-0.0220947265625,
0.017822265625,
-0.035186767578125,
-0.0274505615234375,
-0.02703857421875,
0.03228759765625,
0.0419921875,
0.01345062255859375,
-0.05029296875,
0.0200958251953125,
-0.0299530029296875,
0.020263671875,
0.034454345703125,
-0.0012798309326171875,
0.0312347412109375,
-0.0132904052734375,
0.03839111328125,
0.027984619140625,
-0.010345458984375,
-0.0038394927978515625,
-0.039031982421875,
-0.06951904296875,
-0.01151275634765625,
0.0364990234375,
0.04791259765625,
-0.0758056640625,
0.0257110595703125,
-0.02044677734375,
-0.0347900390625,
-0.03631591796875,
0.005031585693359375,
0.04541015625,
0.051361083984375,
0.03216552734375,
-0.040771484375,
-0.05596923828125,
-0.06884765625,
-0.0164642333984375,
-0.01678466796875,
0.020294189453125,
0.0212554931640625,
0.051116943359375,
-0.0210113525390625,
0.05108642578125,
-0.00484466552734375,
-0.0310516357421875,
-0.018951416015625,
-0.000705718994140625,
0.03948974609375,
0.038299560546875,
0.053558349609375,
-0.06884765625,
-0.0516357421875,
-0.002094268798828125,
-0.053466796875,
0.01544189453125,
-0.000545501708984375,
-0.0223846435546875,
0.042572021484375,
0.0310211181640625,
-0.041046142578125,
0.033660888671875,
0.048828125,
-0.035369873046875,
0.0303955078125,
-0.0025310516357421875,
-0.005466461181640625,
-0.09747314453125,
0.01039886474609375,
0.016510009765625,
-0.004901885986328125,
-0.045440673828125,
-0.0036792755126953125,
-0.00689697265625,
-0.0008697509765625,
-0.046142578125,
0.07110595703125,
-0.0572509765625,
0.00014901161193847656,
-0.004547119140625,
-0.0038318634033203125,
-0.00868988037109375,
0.037994384765625,
0.044525146484375,
0.041900634765625,
0.048675537109375,
-0.050750732421875,
0.016265869140625,
0.04913330078125,
-0.0250091552734375,
0.062042236328125,
-0.037628173828125,
-0.0052490234375,
-0.02044677734375,
0.01910400390625,
-0.0372314453125,
-0.0218658447265625,
0.02166748046875,
-0.04095458984375,
0.033935546875,
-0.03179931640625,
-0.0377197265625,
-0.0221710205078125,
0.00615692138671875,
0.02947998046875,
0.020782470703125,
-0.044708251953125,
0.05810546875,
0.03167724609375,
-0.0048828125,
-0.055511474609375,
-0.058837890625,
0.0182342529296875,
-0.02642822265625,
-0.039215087890625,
0.0296173095703125,
0.004085540771484375,
0.0015630722045898438,
0.0030307769775390625,
0.0005030632019042969,
-0.0156097412109375,
-0.005298614501953125,
0.01175689697265625,
0.027801513671875,
-0.01953125,
0.0038909912109375,
-0.009368896484375,
-0.0040130615234375,
-0.0100250244140625,
-0.0177154541015625,
0.046630859375,
-0.0272979736328125,
-0.003948211669921875,
-0.03497314453125,
0.032440185546875,
0.0250701904296875,
-0.0249481201171875,
0.082275390625,
0.06280517578125,
-0.04901123046875,
0.01629638671875,
-0.04913330078125,
-0.003864288330078125,
-0.0303955078125,
0.0189208984375,
-0.044158935546875,
-0.0633544921875,
0.053314208984375,
0.01043701171875,
0.005035400390625,
0.053497314453125,
0.052459716796875,
0.025665283203125,
0.0548095703125,
0.0670166015625,
-0.035400390625,
0.039031982421875,
-0.025726318359375,
0.00833892822265625,
-0.0572509765625,
-0.03131103515625,
-0.033111572265625,
-0.021514892578125,
-0.06585693359375,
-0.01532745361328125,
0.00899505615234375,
0.00970458984375,
-0.0233154296875,
0.06005859375,
-0.04779052734375,
0.0215606689453125,
0.03570556640625,
-0.00981903076171875,
0.01432037353515625,
0.0160675048828125,
-0.0214385986328125,
-0.001148223876953125,
-0.039825439453125,
-0.042755126953125,
0.047149658203125,
0.037109375,
0.03057861328125,
0.011688232421875,
0.07354736328125,
-0.01421356201171875,
0.03271484375,
-0.047760009765625,
0.022064208984375,
-0.008758544921875,
-0.05352783203125,
-0.01012420654296875,
-0.03472900390625,
-0.067626953125,
0.005176544189453125,
-0.0077056884765625,
-0.0604248046875,
0.0307769775390625,
-0.00864410400390625,
-0.0222015380859375,
0.0272674560546875,
-0.032196044921875,
0.06280517578125,
-0.0226287841796875,
-0.006061553955078125,
0.020751953125,
-0.059234619140625,
0.01265716552734375,
0.005901336669921875,
0.017333984375,
-0.01551055908203125,
-0.00046062469482421875,
0.06268310546875,
-0.0182647705078125,
0.0511474609375,
-0.019744873046875,
-0.0071868896484375,
0.00885772705078125,
-0.007610321044921875,
0.0225677490234375,
0.006267547607421875,
-0.01149749755859375,
0.049835205078125,
0.0025653839111328125,
-0.03204345703125,
-0.0184173583984375,
0.046173095703125,
-0.060546875,
-0.0177764892578125,
-0.04962158203125,
-0.0215606689453125,
-0.0012617111206054688,
0.032196044921875,
0.042144775390625,
0.0208740234375,
-0.01467132568359375,
-0.000835418701171875,
0.0304412841796875,
-0.0279693603515625,
0.0362548828125,
0.04791259765625,
-0.02423095703125,
-0.041961669921875,
0.0625,
0.007663726806640625,
-0.00264739990234375,
0.021575927734375,
-0.0003147125244140625,
-0.023345947265625,
-0.0238037109375,
-0.06256103515625,
0.040802001953125,
-0.03485107421875,
-0.02154541015625,
-0.0699462890625,
-0.0293121337890625,
-0.03814697265625,
0.00344085693359375,
-0.0243377685546875,
-0.0235137939453125,
-0.0241546630859375,
-0.0089111328125,
0.02740478515625,
0.042510986328125,
-0.006549835205078125,
0.0269622802734375,
-0.053619384765625,
0.0257720947265625,
-0.00263214111328125,
0.03289794921875,
-0.01406097412109375,
-0.046905517578125,
-0.02178955078125,
-0.000006020069122314453,
-0.010467529296875,
-0.0772705078125,
0.0499267578125,
0.0308837890625,
0.039215087890625,
0.033843994140625,
-0.01512908935546875,
0.048309326171875,
-0.046905517578125,
0.0443115234375,
0.0194549560546875,
-0.06396484375,
0.04156494140625,
-0.0192413330078125,
0.00893402099609375,
0.050933837890625,
0.058624267578125,
-0.064208984375,
-0.00981903076171875,
-0.06298828125,
-0.06658935546875,
0.05230712890625,
0.0168609619140625,
0.0210723876953125,
-0.0282745361328125,
0.031158447265625,
0.0023441314697265625,
0.02178955078125,
-0.0709228515625,
-0.04034423828125,
-0.006671905517578125,
-0.01812744140625,
0.0014734268188476562,
-0.0127716064453125,
-0.002758026123046875,
-0.0298919677734375,
0.0780029296875,
-0.00484466552734375,
0.0197601318359375,
0.0268096923828125,
-0.0162200927734375,
-0.0009479522705078125,
0.0004248619079589844,
0.028167724609375,
0.03399658203125,
-0.0038318634033203125,
-0.00440216064453125,
0.023529052734375,
-0.03240966796875,
0.00569915771484375,
0.0230712890625,
-0.01513671875,
0.030731201171875,
0.020751953125,
0.07476806640625,
-0.0016641616821289062,
-0.03228759765625,
0.054290771484375,
-0.01263427734375,
-0.01078033447265625,
-0.04010009765625,
-0.024810791015625,
0.0082244873046875,
0.023712158203125,
0.029541015625,
-0.014617919921875,
0.0036487579345703125,
-0.043426513671875,
0.032745361328125,
0.03173828125,
-0.0296173095703125,
-0.03192138671875,
0.03778076171875,
0.005207061767578125,
-0.0203857421875,
0.056610107421875,
-0.03076171875,
-0.061004638671875,
0.04541015625,
0.03265380859375,
0.061309814453125,
-0.031219482421875,
0.01131439208984375,
0.06298828125,
0.02227783203125,
0.0013189315795898438,
0.0304107666015625,
0.00539398193359375,
-0.06622314453125,
-0.0263824462890625,
-0.0745849609375,
-0.007335662841796875,
0.0164337158203125,
-0.06988525390625,
0.039215087890625,
-0.03289794921875,
-0.0208740234375,
0.01032257080078125,
0.01203155517578125,
-0.07861328125,
0.019561767578125,
0.029083251953125,
0.080078125,
-0.0809326171875,
0.0584716796875,
0.07562255859375,
-0.050506591796875,
-0.0645751953125,
-0.015716552734375,
0.00732421875,
-0.067138671875,
0.063232421875,
0.0303192138671875,
0.0203704833984375,
-0.00572967529296875,
-0.02783203125,
-0.0743408203125,
0.059600830078125,
0.0201416015625,
-0.041778564453125,
-0.0189056396484375,
-0.0078582763671875,
0.03631591796875,
-0.041290283203125,
0.0253143310546875,
0.046600341796875,
0.029541015625,
0.006496429443359375,
-0.0826416015625,
0.0023670196533203125,
-0.037261962890625,
0.001590728759765625,
0.0167236328125,
-0.0635986328125,
0.058380126953125,
-0.01349639892578125,
-0.02850341796875,
-0.0032062530517578125,
0.063720703125,
0.0182647705078125,
0.0263824462890625,
0.04437255859375,
0.06585693359375,
0.043304443359375,
-0.007755279541015625,
0.0640869140625,
-0.042877197265625,
0.03021240234375,
0.07977294921875,
-0.0027332305908203125,
0.058868408203125,
0.03759765625,
-0.0124053955078125,
0.0635986328125,
0.057281494140625,
-0.01448822021484375,
0.026031494140625,
0.0027103424072265625,
-0.0174407958984375,
0.00684356689453125,
-0.023193359375,
-0.02716064453125,
0.049072265625,
0.0140380859375,
-0.03375244140625,
-0.0182647705078125,
0.00750732421875,
0.0341796875,
-0.010223388671875,
-0.0007710456848144531,
0.067626953125,
0.00894927978515625,
-0.044525146484375,
0.045196533203125,
0.00890350341796875,
0.0543212890625,
-0.02655029296875,
-0.00400543212890625,
-0.017852783203125,
-0.0032939910888671875,
-0.0219879150390625,
-0.0401611328125,
0.026885986328125,
0.002471923828125,
-0.0179443359375,
-0.0137939453125,
0.0294036865234375,
-0.0528564453125,
-0.053253173828125,
0.031829833984375,
0.048187255859375,
0.032257080078125,
-0.000050008296966552734,
-0.0711669921875,
0.00894927978515625,
0.00185394287109375,
-0.0286712646484375,
0.0241851806640625,
0.0382080078125,
-0.0063934326171875,
0.031646728515625,
0.048614501953125,
0.0196075439453125,
-0.0022869110107421875,
0.013671875,
0.07318115234375,
-0.05438232421875,
-0.034759521484375,
-0.061248779296875,
0.0310211181640625,
-0.01151275634765625,
-0.03753662109375,
0.0576171875,
0.057159423828125,
0.0887451171875,
0.007678985595703125,
0.047210693359375,
-0.0232696533203125,
0.044097900390625,
-0.023681640625,
0.05633544921875,
-0.041259765625,
-0.0144195556640625,
-0.0308837890625,
-0.079833984375,
-0.017486572265625,
0.060211181640625,
-0.008880615234375,
0.01165008544921875,
0.0352783203125,
0.038818359375,
-0.005950927734375,
-0.025421142578125,
0.00046062469482421875,
0.017120361328125,
0.00646209716796875,
0.04541015625,
0.03765869140625,
-0.044586181640625,
0.018096923828125,
-0.039886474609375,
-0.01824951171875,
-0.00005936622619628906,
-0.0740966796875,
-0.07330322265625,
-0.056243896484375,
-0.0450439453125,
-0.04913330078125,
-0.00931549072265625,
0.070068359375,
0.053466796875,
-0.07464599609375,
-0.0156402587890625,
0.004611968994140625,
-0.0015316009521484375,
-0.0038280487060546875,
-0.016265869140625,
0.036529541015625,
-0.00771331787109375,
-0.0635986328125,
0.0069580078125,
0.0090789794921875,
0.01401519775390625,
-0.016754150390625,
-0.0109100341796875,
-0.0360107421875,
-0.01241302490234375,
0.040863037109375,
0.036468505859375,
-0.05474853515625,
-0.00756072998046875,
0.0006847381591796875,
-0.0164337158203125,
0.00904083251953125,
0.03411865234375,
-0.058074951171875,
0.023681640625,
0.0202178955078125,
0.04779052734375,
0.043304443359375,
0.0027866363525390625,
0.01357269287109375,
-0.058197021484375,
0.0252685546875,
0.01097869873046875,
0.042022705078125,
0.05023193359375,
-0.03424072265625,
0.047149658203125,
0.0266876220703125,
-0.0296478271484375,
-0.05377197265625,
0.001010894775390625,
-0.0699462890625,
0.006465911865234375,
0.08709716796875,
-0.015716552734375,
-0.03070068359375,
0.0006875991821289062,
-0.0070343017578125,
0.039154052734375,
-0.032318115234375,
0.03924560546875,
0.061614990234375,
-0.0029087066650390625,
-0.01543426513671875,
-0.038482666015625,
0.039276123046875,
0.0210113525390625,
-0.04736328125,
-0.022064208984375,
0.0295562744140625,
0.043182373046875,
0.017608642578125,
0.058013916015625,
-0.00920867919921875,
0.002429962158203125,
-0.01515960693359375,
0.0269775390625,
0.013031005859375,
-0.0217437744140625,
-0.03387451171875,
-0.0208282470703125,
-0.021209716796875,
-0.005901336669921875
]
] |
ckiplab/bert-base-chinese-ner | 2022-05-10T03:28:12.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"token-classification",
"zh",
"license:gpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | ckiplab | null | null | ckiplab/bert-base-chinese-ner | 60 | 1,748,813 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- token-classification
- bert
- zh
license: gpl-3.0
---
# CKIP BERT Base Chinese
This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).
這個專案提供了繁體中文的 transformers 模型（包含 ALBERT、BERT、GPT2）及自然語言處理工具（包含斷詞、詞性標記、實體辨識）。
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use BertTokenizerFast as tokenizer instead of AutoTokenizer.
請使用 BertTokenizerFast 而非 AutoTokenizer。
```python
from transformers import (
BertTokenizerFast,
AutoModel,
)
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/bert-base-chinese-ner')
```
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.
有關完整使用方法及其他資訊，請參見 https://github.com/ckiplab/ckip-transformers 。
| 1,123 | [
[
-0.021881103515625,
-0.0265655517578125,
0.0011539459228515625,
0.0556640625,
-0.0289306640625,
0.00384521484375,
-0.01404571533203125,
-0.018829345703125,
-0.0028018951416015625,
0.032958984375,
-0.026458740234375,
-0.0213775634765625,
-0.043975830078125,
0.0017786026000976562,
-0.0178070068359375,
0.0640869140625,
-0.013946533203125,
0.02630615234375,
0.031463623046875,
0.010009765625,
-0.018829345703125,
-0.0340576171875,
-0.05316162109375,
-0.04461669921875,
-0.003162384033203125,
0.0195159912109375,
0.049224853515625,
0.0294342041015625,
0.03692626953125,
0.0230865478515625,
0.0019197463989257812,
-0.0084381103515625,
-0.0132598876953125,
-0.0210723876953125,
0.0004949569702148438,
-0.03875732421875,
-0.028350830078125,
-0.0154876708984375,
0.049591064453125,
0.034820556640625,
0.0022602081298828125,
-0.0021495819091796875,
0.0146484375,
0.0265655517578125,
-0.02398681640625,
0.0311279296875,
-0.043853759765625,
0.0218505859375,
-0.011627197265625,
-0.005901336669921875,
-0.027130126953125,
-0.019866943359375,
0.01352691650390625,
-0.04559326171875,
0.025634765625,
-0.01258087158203125,
0.097900390625,
0.002452850341796875,
-0.023101806640625,
-0.0200042724609375,
-0.050323486328125,
0.07733154296875,
-0.064208984375,
0.03289794921875,
0.0256805419921875,
0.021728515625,
-0.003993988037109375,
-0.0780029296875,
-0.048828125,
-0.0140228271484375,
-0.0157012939453125,
0.023651123046875,
0.0098114013671875,
-0.0027980804443359375,
0.026580810546875,
0.021942138671875,
-0.04412841796875,
0.01444244384765625,
-0.0288238525390625,
-0.03143310546875,
0.03924560546875,
-0.006847381591796875,
0.035430908203125,
-0.03265380859375,
-0.039520263671875,
-0.0258941650390625,
-0.04522705078125,
0.01641845703125,
0.0199737548828125,
0.00943756103515625,
-0.034759521484375,
0.042755126953125,
-0.00086212158203125,
0.0218048095703125,
0.01491546630859375,
-0.00637054443359375,
0.03399658203125,
-0.021484375,
-0.005306243896484375,
-0.00934600830078125,
0.065185546875,
0.01715087890625,
0.007770538330078125,
0.004703521728515625,
-0.0237884521484375,
-0.0243072509765625,
-0.0175933837890625,
-0.056304931640625,
-0.05084228515625,
0.01494598388671875,
-0.056671142578125,
-0.01482391357421875,
0.011627197265625,
-0.045379638671875,
0.02178955078125,
-0.0178985595703125,
0.03143310546875,
-0.0526123046875,
-0.044342041015625,
-0.00109100341796875,
-0.029754638671875,
0.062255859375,
0.01039886474609375,
-0.0882568359375,
0.0031223297119140625,
0.044342041015625,
0.053741455078125,
0.0107879638671875,
-0.0120697021484375,
0.00998687744140625,
0.0259246826171875,
-0.01464080810546875,
0.03936767578125,
-0.00847625732421875,
-0.051849365234375,
0.00936126708984375,
0.007007598876953125,
0.001338958740234375,
-0.030975341796875,
0.05950927734375,
-0.02398681640625,
0.029388427734375,
-0.0165863037109375,
-0.0218963623046875,
-0.0051422119140625,
0.007007598876953125,
-0.038482666015625,
0.08758544921875,
0.0167236328125,
-0.062408447265625,
0.01837158203125,
-0.064697265625,
-0.043426513671875,
0.02447509765625,
-0.007602691650390625,
-0.030303955078125,
-0.01259613037109375,
0.0178985595703125,
0.0231475830078125,
-0.00457000732421875,
0.0157470703125,
-0.0009388923645019531,
-0.0160369873046875,
0.0012235641479492188,
-0.031646728515625,
0.098388671875,
0.0242767333984375,
-0.02447509765625,
0.012542724609375,
-0.0496826171875,
0.0096435546875,
0.022674560546875,
-0.0183258056640625,
-0.0178070068359375,
0.0147705078125,
0.044342041015625,
0.01253509521484375,
0.04071044921875,
-0.04412841796875,
0.03497314453125,
-0.04156494140625,
0.053131103515625,
0.06134033203125,
-0.0239410400390625,
0.02081298828125,
-0.01132965087890625,
-0.00041985511779785156,
0.004299163818359375,
0.0263519287109375,
-0.0094757080078125,
-0.037139892578125,
-0.082763671875,
-0.0250091552734375,
0.0325927734375,
0.056915283203125,
-0.08154296875,
0.06689453125,
-0.0181121826171875,
-0.04595947265625,
-0.024078369140625,
-0.005126953125,
0.0023403167724609375,
0.0139617919921875,
0.040435791015625,
-0.0219268798828125,
-0.042755126953125,
-0.0743408203125,
0.00926971435546875,
-0.041351318359375,
-0.04107666015625,
0.000013589859008789062,
0.041168212890625,
-0.03253173828125,
0.07305908203125,
-0.037933349609375,
-0.0208282470703125,
-0.0240631103515625,
0.040557861328125,
0.0268707275390625,
0.06695556640625,
0.0452880859375,
-0.07501220703125,
-0.052703857421875,
-0.01471710205078125,
-0.0256500244140625,
-0.004650115966796875,
-0.016571044921875,
-0.0099029541015625,
0.00453948974609375,
0.00516510009765625,
-0.043975830078125,
0.01389312744140625,
0.028411865234375,
0.0005955696105957031,
0.062347412109375,
-0.00426483154296875,
-0.0209808349609375,
-0.0970458984375,
0.013885498046875,
-0.01482391357421875,
-0.002044677734375,
-0.031707763671875,
-0.0004992485046386719,
0.013702392578125,
-0.00630950927734375,
-0.03973388671875,
0.042144775390625,
-0.02587890625,
0.0241546630859375,
-0.0199737548828125,
-0.012603759765625,
-0.01554107666015625,
0.04364013671875,
0.030181884765625,
0.0517578125,
0.044342041015625,
-0.0526123046875,
0.0305023193359375,
0.04937744140625,
-0.0188140869140625,
-0.006450653076171875,
-0.07061767578125,
-0.0007829666137695312,
0.0227203369140625,
0.01241302490234375,
-0.070068359375,
-0.00460052490234375,
0.0450439453125,
-0.0565185546875,
0.04443359375,
0.004215240478515625,
-0.06781005859375,
-0.032958984375,
-0.0322265625,
0.0255584716796875,
0.05047607421875,
-0.04644775390625,
0.037139892578125,
0.0195465087890625,
-0.0160980224609375,
-0.04364013671875,
-0.0587158203125,
-0.0015354156494140625,
0.020111083984375,
-0.04266357421875,
0.047698974609375,
-0.0169525146484375,
0.0249176025390625,
-0.0006208419799804688,
0.0069427490234375,
-0.0355224609375,
-0.00664520263671875,
-0.00968170166015625,
0.0298919677734375,
-0.011322021484375,
-0.00118255615234375,
0.01398468017578125,
-0.02374267578125,
0.010498046875,
-0.00042510032653808594,
0.054107666015625,
0.00273895263671875,
-0.0236053466796875,
-0.041046142578125,
0.019378662109375,
0.01474761962890625,
-0.0184326171875,
0.02264404296875,
0.07586669921875,
-0.0189208984375,
-0.01284027099609375,
-0.03228759765625,
-0.01171112060546875,
-0.04034423828125,
0.04437255859375,
-0.033843994140625,
-0.060516357421875,
0.024078369140625,
-0.007556915283203125,
0.0134124755859375,
0.0556640625,
0.04644775390625,
0.000005900859832763672,
0.09088134765625,
0.0675048828125,
-0.040313720703125,
0.032562255859375,
-0.029541015625,
0.027587890625,
-0.0660400390625,
0.0169830322265625,
-0.04595947265625,
0.007190704345703125,
-0.061737060546875,
-0.02227783203125,
-0.0014057159423828125,
0.01166534423828125,
-0.0197296142578125,
0.053253173828125,
-0.05877685546875,
-0.002414703369140625,
0.05804443359375,
-0.0218505859375,
-0.00757598876953125,
-0.007152557373046875,
-0.0198974609375,
-0.0017118453979492188,
-0.04315185546875,
-0.048431396484375,
0.05517578125,
0.0489501953125,
0.0526123046875,
-0.0014972686767578125,
0.036865234375,
-0.003017425537109375,
0.03131103515625,
-0.058074951171875,
0.0404052734375,
-0.015899658203125,
-0.0616455078125,
-0.0222015380859375,
-0.016571044921875,
-0.061737060546875,
0.0166168212890625,
-0.0018243789672851562,
-0.0631103515625,
0.01209259033203125,
0.004207611083984375,
-0.006511688232421875,
0.0272369384765625,
-0.031494140625,
0.054107666015625,
-0.03631591796875,
0.00830078125,
-0.005580902099609375,
-0.053863525390625,
0.0280914306640625,
-0.00022840499877929688,
-0.007114410400390625,
-0.005069732666015625,
0.008056640625,
0.0556640625,
-0.0151519775390625,
0.06134033203125,
-0.01351165771484375,
-0.003971099853515625,
0.0227813720703125,
-0.0227203369140625,
0.022308349609375,
0.01251220703125,
0.0081787109375,
0.044708251953125,
0.01464080810546875,
-0.0287322998046875,
-0.0157012939453125,
0.034912109375,
-0.06817626953125,
-0.0303497314453125,
-0.04327392578125,
-0.01702880859375,
0.01129150390625,
0.040252685546875,
0.041748046875,
0.0005788803100585938,
-0.0009517669677734375,
0.0196075439453125,
0.0238494873046875,
-0.032501220703125,
0.04296875,
0.0411376953125,
-0.005046844482421875,
-0.03497314453125,
0.0687255859375,
0.0105438232421875,
0.006786346435546875,
0.048095703125,
-0.00292205810546875,
-0.0180206298828125,
-0.032623291015625,
-0.02386474609375,
0.02850341796875,
-0.031646728515625,
0.0008730888366699219,
-0.0274658203125,
-0.04388427734375,
-0.0484619140625,
0.00966644287109375,
-0.0254058837890625,
-0.02960205078125,
-0.0215301513671875,
0.0015172958374023438,
-0.0241851806640625,
0.008819580078125,
-0.02099609375,
0.035614013671875,
-0.0775146484375,
0.036651611328125,
0.015411376953125,
0.0178985595703125,
0.0017528533935546875,
-0.01812744140625,
-0.040435791015625,
0.00937652587890625,
-0.06317138671875,
-0.053741455078125,
0.04229736328125,
0.0006036758422851562,
0.05316162109375,
0.04522705078125,
0.0133056640625,
0.03814697265625,
-0.048004150390625,
0.08154296875,
0.027618408203125,
-0.0892333984375,
0.0300445556640625,
-0.01303863525390625,
0.025665283203125,
0.0222015380859375,
0.03692626953125,
-0.05712890625,
-0.02423095703125,
-0.03582763671875,
-0.0859375,
0.048553466796875,
0.0273895263671875,
0.026611328125,
-0.0021686553955078125,
0.000255584716796875,
-0.0012292861938476562,
0.0118408203125,
-0.08233642578125,
-0.04034423828125,
-0.039947509765625,
-0.022125244140625,
0.0159912109375,
-0.0296478271484375,
0.007183074951171875,
-0.0166473388671875,
0.0794677734375,
0.004634857177734375,
0.060821533203125,
0.03558349609375,
-0.00435638427734375,
-0.00925445556640625,
0.00714874267578125,
0.034393310546875,
0.040863037109375,
-0.0204620361328125,
-0.0176544189453125,
0.006198883056640625,
-0.047576904296875,
-0.0167999267578125,
0.0305023193359375,
-0.0292816162109375,
0.032806396484375,
0.03607177734375,
0.04620361328125,
0.0099945068359375,
-0.0308685302734375,
0.040313720703125,
-0.011810302734375,
-0.01837158203125,
-0.07269287109375,
-0.00356292724609375,
0.0028400421142578125,
0.0018682479858398438,
0.051361083984375,
-0.012176513671875,
0.01059722900390625,
-0.01306915283203125,
0.0159912109375,
0.03070068359375,
-0.0391845703125,
-0.0343017578125,
0.048828125,
0.034942626953125,
-0.019866943359375,
0.06439208984375,
-0.00432586669921875,
-0.071044921875,
0.050750732421875,
0.035614013671875,
0.07598876953125,
-0.0251007080078125,
0.003452301025390625,
0.047637939453125,
0.036956787109375,
0.00545501708984375,
0.018280029296875,
-0.019256591796875,
-0.0693359375,
-0.039306640625,
-0.0270233154296875,
-0.033355712890625,
0.03082275390625,
-0.037139892578125,
0.04302978515625,
-0.034759521484375,
-0.009185791015625,
-0.004154205322265625,
-0.0035552978515625,
-0.036102294921875,
0.0110626220703125,
0.008819580078125,
0.0838623046875,
-0.0462646484375,
0.0877685546875,
0.043853759765625,
-0.040313720703125,
-0.062255859375,
0.01258087158203125,
-0.028564453125,
-0.054168701171875,
0.07928466796875,
0.0265655517578125,
0.020263671875,
0.005329132080078125,
-0.05535888671875,
-0.057220458984375,
0.074462890625,
-0.01123046875,
-0.024993896484375,
-0.00742340087890625,
0.0255889892578125,
0.0300445556640625,
-0.00333404541015625,
0.033233642578125,
0.0045928955078125,
0.04644775390625,
-0.0117340087890625,
-0.0848388671875,
-0.0171356201171875,
-0.022430419921875,
0.003871917724609375,
0.01922607421875,
-0.063232421875,
0.06414794921875,
0.00749969482421875,
-0.024627685546875,
0.0281524658203125,
0.0673828125,
0.0006356239318847656,
0.009521484375,
0.041778564453125,
0.0330810546875,
-0.0023899078369140625,
-0.0164794921875,
0.036163330078125,
-0.04327392578125,
0.05816650390625,
0.062255859375,
-0.005413055419921875,
0.0556640625,
0.0271453857421875,
-0.03662109375,
0.040008544921875,
0.050384521484375,
-0.045745849609375,
0.0452880859375,
0.0019292831420898438,
-0.007965087890625,
-0.007785797119140625,
0.00894927978515625,
-0.042449951171875,
0.017852783203125,
0.0222625732421875,
-0.0266265869140625,
-0.01026153564453125,
-0.014801025390625,
-0.0009055137634277344,
-0.0309295654296875,
-0.004364013671875,
0.0382080078125,
0.01010894775390625,
-0.021881103515625,
0.036224365234375,
0.0258636474609375,
0.07275390625,
-0.077880859375,
-0.0263519287109375,
0.0193634033203125,
0.01177215576171875,
-0.00385284423828125,
-0.04644775390625,
0.010711669921875,
-0.0253448486328125,
-0.01163482666015625,
-0.01076507568359375,
0.06005859375,
-0.0241851806640625,
-0.040130615234375,
0.031982421875,
0.0060882568359375,
0.0104827880859375,
0.022064208984375,
-0.08502197265625,
-0.0252838134765625,
0.0257415771484375,
-0.0303497314453125,
0.01157379150390625,
0.011810302734375,
0.007564544677734375,
0.047637939453125,
0.06500244140625,
0.00627899169921875,
-0.0096893310546875,
-0.00328826904296875,
0.06689453125,
-0.0433349609375,
-0.040924072265625,
-0.05120849609375,
0.05548095703125,
-0.0175628662109375,
-0.0272064208984375,
0.052947998046875,
0.0526123046875,
0.084228515625,
-0.0262603759765625,
0.07568359375,
-0.02935791015625,
0.0565185546875,
-0.0143585205078125,
0.059783935546875,
-0.0298309326171875,
-0.01065826416015625,
-0.0256805419921875,
-0.066162109375,
-0.0175018310546875,
0.06591796875,
-0.0109710693359375,
-0.00469970703125,
0.04998779296875,
0.045135498046875,
0.0005106925964355469,
-0.016265869140625,
0.0111083984375,
0.01328277587890625,
0.045654296875,
0.033477783203125,
0.04144287109375,
-0.0391845703125,
0.0462646484375,
-0.048095703125,
-0.01482391357421875,
-0.00986480712890625,
-0.052154541015625,
-0.052703857421875,
-0.044403076171875,
-0.02032470703125,
-0.007781982421875,
-0.018798828125,
0.061859130859375,
0.056182861328125,
-0.07867431640625,
-0.0338134765625,
-0.0020732879638671875,
0.007793426513671875,
-0.0251617431640625,
-0.0258331298828125,
0.0457763671875,
-0.0311279296875,
-0.08447265625,
0.0009469985961914062,
0.00580596923828125,
0.007602691650390625,
-0.022491455078125,
-0.00102996826171875,
-0.021484375,
-0.01309967041015625,
0.0309295654296875,
0.03265380859375,
-0.05548095703125,
-0.0232086181640625,
-0.00028777122497558594,
-0.015045166015625,
0.00890350341796875,
0.044525146484375,
-0.0169525146484375,
0.027740478515625,
0.050201416015625,
0.021148681640625,
0.02618408203125,
-0.01129913330078125,
0.052093505859375,
-0.035736083984375,
0.01922607421875,
0.024200439453125,
0.039703369140625,
0.023681640625,
-0.01629638671875,
0.0362548828125,
0.031036376953125,
-0.05352783203125,
-0.042755126953125,
0.024139404296875,
-0.0762939453125,
-0.0202789306640625,
0.06689453125,
-0.020050048828125,
-0.00986480712890625,
-0.00907135009765625,
-0.043792724609375,
0.047393798828125,
-0.0230560302734375,
0.04412841796875,
0.0633544921875,
-0.004364013671875,
-0.0069732666015625,
-0.037689208984375,
0.0281982421875,
0.0323486328125,
-0.0254669189453125,
-0.0264129638671875,
0.0009102821350097656,
0.0140228271484375,
0.045867919921875,
0.0330810546875,
-0.010223388671875,
0.0089874267578125,
-0.0117950439453125,
0.044921875,
0.0019292831420898438,
0.01471710205078125,
0.0031032562255859375,
-0.01375579833984375,
0.002635955810546875,
-0.0308074951171875
]
] |
facebook/wav2vec2-base-960h | 2022-11-14T21:37:23.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/wav2vec2-base-960h | 162 | 1,741,258 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- librispeech_asr
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
license: apache-2.0
widget:
- example_title: Librispeech sample 1
src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
model-index:
- name: wav2vec2-base-960h
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 3.4
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (other)
type: librispeech_asr
config: other
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 8.6
---
# Wav2Vec2-Base-960h
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The base model, pretrained and fine-tuned on 960 hours of Librispeech 16kHz sampled speech audio. When using the model,
make sure that your speech input is also sampled at 16kHz.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and tokenizer
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# preprocess (the processor expects a raw 16kHz waveform)
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values # Batch size 1
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
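Because the model expects 16kHz input, audio recorded at another rate must be resampled first. Below is a minimal, illustrative linear-interpolation resampler using only NumPy; for real use, a proper polyphase resampler such as `torchaudio.transforms.Resample` or `librosa.resample` is preferable.

```python
import numpy as np

def resample_linear(waveform: np.ndarray, orig_sr: int, target_sr: int = 16_000) -> np.ndarray:
    """Resample a 1-D waveform to target_sr via linear interpolation.

    Quick-and-dirty fallback for illustration; production code should use
    a polyphase resampler (torchaudio, librosa, scipy.signal.resample_poly).
    """
    if orig_sr == target_sr:
        return waveform
    duration = waveform.shape[0] / orig_sr
    n_target = int(round(duration * target_sr))
    t_orig = np.linspace(0.0, duration, num=waveform.shape[0], endpoint=False)
    t_target = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(t_target, t_orig, waveform).astype(waveform.dtype)

# one second of 44.1 kHz audio becomes 16,000 samples
audio_44k = np.zeros(44_100, dtype=np.float32)
print(resample_linear(audio_44k, 44_100).shape)  # (16000,)
```

The resampled array can then be passed to the processor exactly like the dataset waveform above.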
## Evaluation
This code snippet shows how to evaluate **facebook/wav2vec2-base-960h** on LibriSpeech's "clean" and "other" test data (the snippet loads the *clean* config; change the config name to `other` for the second split).
```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import torch
from jiwer import wer
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
def map_to_pred(batch):
input_values = processor(batch["audio"]["array"], return_tensors="pt", padding="longest").input_values
with torch.no_grad():
logits = model(input_values.to("cuda")).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
batch["transcription"] = transcription
return batch
result = librispeech_eval.map(map_to_pred, batched=True, batch_size=1, remove_columns=["audio"])
print("WER:", wer(result["text"], result["transcription"]))
```
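The WER reported here is the word-level edit distance between reference and hypothesis divided by the number of reference words, as computed by `jiwer.wer`. A pure-Python sketch of that metric (illustrative, not jiwer's actual implementation) looks like:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level Levenshtein distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # deletions only
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # insertions only
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# one substitution + one deletion over six reference words -> 2/6
print(word_error_rate("the cat sat on the mat", "the cat sit on mat"))
```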
*Result (WER)*:
| "clean" | "other" |
|---|---|
| 3.4 | 8.6 | | 4,431 | [
[
-0.01495361328125,
-0.049468994140625,
0.0128326416015625,
0.0127716064453125,
-0.01332855224609375,
-0.01201629638671875,
-0.036651611328125,
-0.040771484375,
-0.003925323486328125,
0.0116119384765625,
-0.04620361328125,
-0.045318603515625,
-0.0440673828125,
-0.030975341796875,
-0.0297698974609375,
0.0712890625,
0.0200653076171875,
0.007476806640625,
0.004650115966796875,
-0.01204681396484375,
-0.03106689453125,
-0.0209197998046875,
-0.063232421875,
-0.03363037109375,
0.0127105712890625,
0.01523590087890625,
0.0153961181640625,
0.017120361328125,
0.0256195068359375,
0.0252685546875,
-0.0181427001953125,
0.005340576171875,
-0.051910400390625,
-0.007122039794921875,
0.0108642578125,
-0.0310516357421875,
-0.023193359375,
0.021575927734375,
0.040557861328125,
0.03961181640625,
-0.0155487060546875,
0.039520263671875,
0.007038116455078125,
0.0352783203125,
-0.023773193359375,
0.02398681640625,
-0.04437255859375,
-0.0157012939453125,
-0.0088043212890625,
-0.00896453857421875,
-0.0460205078125,
-0.010162353515625,
0.00899505615234375,
-0.039581298828125,
0.01546478271484375,
-0.0178985595703125,
0.068603515625,
0.017547607421875,
-0.0241546630859375,
-0.0308074951171875,
-0.0714111328125,
0.06439208984375,
-0.04937744140625,
0.056182861328125,
0.0362548828125,
0.016265869140625,
-0.003978729248046875,
-0.08477783203125,
-0.031158447265625,
-0.00238037109375,
0.0220794677734375,
0.035125732421875,
-0.0258331298828125,
0.0035114288330078125,
0.028778076171875,
0.02313232421875,
-0.0482177734375,
0.0045623779296875,
-0.06817626953125,
-0.0384521484375,
0.056610107421875,
-0.0249481201171875,
0.0019683837890625,
-0.0005235671997070312,
-0.028167724609375,
-0.04290771484375,
-0.01904296875,
0.0380859375,
0.0238494873046875,
0.0135040283203125,
-0.03106689453125,
0.0310211181640625,
0.005603790283203125,
0.04656982421875,
0.0072784423828125,
-0.031402587890625,
0.054412841796875,
-0.013641357421875,
-0.01020050048828125,
0.03045654296875,
0.06341552734375,
0.01456451416015625,
0.01172637939453125,
0.00885772705078125,
-0.01303863525390625,
0.01593017578125,
-0.0135955810546875,
-0.0562744140625,
-0.038665771484375,
0.0328369140625,
-0.03375244140625,
0.005031585693359375,
0.0127716064453125,
-0.01617431640625,
-0.003421783447265625,
-0.0175933837890625,
0.072265625,
-0.042388916015625,
-0.0204010009765625,
0.01116943359375,
-0.022216796875,
0.01483917236328125,
-0.00925445556640625,
-0.06256103515625,
0.0165252685546875,
0.035980224609375,
0.062744140625,
0.0086212158203125,
-0.005115509033203125,
-0.042877197265625,
-0.00592803955078125,
-0.0202789306640625,
0.0380859375,
-0.004436492919921875,
-0.0438232421875,
-0.0234527587890625,
-0.0087890625,
0.007495880126953125,
-0.047088623046875,
0.054840087890625,
-0.0286712646484375,
0.0181121826171875,
-0.005954742431640625,
-0.052490234375,
-0.0211639404296875,
-0.044097900390625,
-0.044830322265625,
0.09454345703125,
0.01329803466796875,
-0.045013427734375,
0.0197601318359375,
-0.028656005859375,
-0.048370361328125,
-0.0250701904296875,
-0.002704620361328125,
-0.045013427734375,
0.00960540771484375,
0.0163726806640625,
0.03204345703125,
-0.0113372802734375,
0.0003273487091064453,
-0.011688232421875,
-0.04901123046875,
0.0292510986328125,
-0.041168212890625,
0.08544921875,
0.0224151611328125,
-0.040069580078125,
0.01232147216796875,
-0.07086181640625,
0.01561737060546875,
0.00244903564453125,
-0.03631591796875,
0.00795745849609375,
-0.0114288330078125,
0.0278778076171875,
0.01708984375,
0.00988006591796875,
-0.0426025390625,
-0.015899658203125,
-0.0574951171875,
0.044158935546875,
0.051605224609375,
-0.0084381103515625,
0.029693603515625,
-0.0272674560546875,
-0.003330230712890625,
-0.020233154296875,
-0.0013580322265625,
0.0095367431640625,
-0.0362548828125,
-0.046112060546875,
-0.031707763671875,
0.025634765625,
0.039337158203125,
-0.0145111083984375,
0.04974365234375,
-0.00887298583984375,
-0.0682373046875,
-0.073486328125,
0.0020236968994140625,
0.0255126953125,
0.040313720703125,
0.05242919921875,
-0.018341064453125,
-0.0611572265625,
-0.05810546875,
-0.01050567626953125,
-0.0112457275390625,
-0.0159454345703125,
0.0207672119140625,
0.0175628662109375,
-0.025482177734375,
0.048919677734375,
-0.0161285400390625,
-0.029571533203125,
-0.0184478759765625,
0.01456451416015625,
0.04840087890625,
0.051025390625,
0.02032470703125,
-0.048797607421875,
-0.0244293212890625,
-0.02752685546875,
-0.03814697265625,
-0.00989532470703125,
-0.005474090576171875,
-0.0019311904907226562,
0.010284423828125,
0.0297393798828125,
-0.036041259765625,
0.0303802490234375,
0.03985595703125,
-0.00804901123046875,
0.0284423828125,
-0.005718231201171875,
-0.0012302398681640625,
-0.07403564453125,
0.0006842613220214844,
-0.0083465576171875,
-0.0204010009765625,
-0.04046630859375,
-0.04449462890625,
-0.00946044921875,
-0.003387451171875,
-0.040252685546875,
0.0285186767578125,
-0.0335693359375,
-0.0218963623046875,
-0.01369476318359375,
0.010162353515625,
-0.01010894775390625,
0.0367431640625,
0.00467681884765625,
0.050201416015625,
0.046112060546875,
-0.042022705078125,
0.04241943359375,
0.0162353515625,
-0.043853759765625,
-0.00008565187454223633,
-0.0631103515625,
0.034393310546875,
0.0128326416015625,
0.0261993408203125,
-0.08966064453125,
-0.004619598388671875,
-0.00675201416015625,
-0.07171630859375,
0.0236053466796875,
0.0007729530334472656,
-0.02777099609375,
-0.0285491943359375,
-0.0066375732421875,
0.0288543701171875,
0.0758056640625,
-0.049957275390625,
0.042083740234375,
0.033935546875,
0.01380157470703125,
-0.035003662109375,
-0.07305908203125,
-0.03729248046875,
-0.00028228759765625,
-0.05438232421875,
0.03173828125,
-0.0005307197570800781,
-0.001293182373046875,
-0.00954437255859375,
-0.039031982421875,
0.01108551025390625,
-0.006427764892578125,
0.042388916015625,
0.01806640625,
-0.00650787353515625,
0.00867462158203125,
-0.00740814208984375,
-0.021209716796875,
0.01483154296875,
-0.040618896484375,
0.05511474609375,
-0.0126953125,
-0.0165863037109375,
-0.067138671875,
-0.0032196044921875,
0.01299285888671875,
-0.0226287841796875,
0.03228759765625,
0.08514404296875,
-0.0306243896484375,
-0.0204010009765625,
-0.040069580078125,
-0.0230712890625,
-0.040618896484375,
0.049957275390625,
-0.01995849609375,
-0.048187255859375,
0.0267181396484375,
0.003345489501953125,
0.00925445556640625,
0.0478515625,
0.05828857421875,
-0.034576416015625,
0.064697265625,
0.0202178955078125,
0.0026950836181640625,
0.03973388671875,
-0.067626953125,
0.00664520263671875,
-0.050323486328125,
-0.03472900390625,
-0.0248260498046875,
-0.031341552734375,
-0.04150390625,
-0.03741455078125,
0.035247802734375,
0.00659942626953125,
-0.009033203125,
0.027923583984375,
-0.051055908203125,
0.01261138916015625,
0.05474853515625,
0.024505615234375,
-0.0114288330078125,
0.017669677734375,
-0.0011043548583984375,
-0.0013895034790039062,
-0.036834716796875,
-0.016876220703125,
0.0928955078125,
0.03271484375,
0.0528564453125,
-0.0085296630859375,
0.0589599609375,
0.01483917236328125,
-0.02001953125,
-0.06689453125,
0.03271484375,
-0.008636474609375,
-0.051025390625,
-0.0182952880859375,
-0.018157958984375,
-0.060943603515625,
0.00925445556640625,
-0.023895263671875,
-0.06109619140625,
0.01032257080078125,
0.0008854866027832031,
-0.0232696533203125,
0.01324462890625,
-0.056854248046875,
0.04498291015625,
-0.01270294189453125,
-0.02459716796875,
-0.022613525390625,
-0.05401611328125,
0.003742218017578125,
0.0084991455078125,
0.0174560546875,
-0.0111083984375,
0.033172607421875,
0.10205078125,
-0.01641845703125,
0.038482666015625,
-0.032501220703125,
0.0030994415283203125,
0.052093505859375,
-0.01508331298828125,
0.0217437744140625,
0.0014247894287109375,
-0.01666259765625,
0.021881103515625,
0.0100555419921875,
-0.0258941650390625,
-0.02874755859375,
0.045562744140625,
-0.0777587890625,
-0.01971435546875,
-0.01430511474609375,
-0.034637451171875,
-0.0221710205078125,
0.01010894775390625,
0.061553955078125,
0.05804443359375,
-0.0045166015625,
0.035430908203125,
0.049224853515625,
-0.00203704833984375,
0.033905029296875,
0.00666046142578125,
-0.005588531494140625,
-0.033935546875,
0.07110595703125,
0.016265869140625,
0.0165863037109375,
0.0097198486328125,
0.0157928466796875,
-0.044281005859375,
-0.036376953125,
-0.003696441650390625,
0.01337432861328125,
-0.05609130859375,
-0.00365447998046875,
-0.045623779296875,
-0.028167724609375,
-0.050537109375,
0.004627227783203125,
-0.0528564453125,
-0.035491943359375,
-0.03216552734375,
-0.01163482666015625,
0.0257568359375,
0.043365478515625,
-0.0386962890625,
0.0300445556640625,
-0.04412841796875,
0.03533935546875,
0.0233917236328125,
-0.0033245086669921875,
-0.01263427734375,
-0.0760498046875,
-0.02587890625,
0.0196685791015625,
-0.0007777214050292969,
-0.0689697265625,
0.01190948486328125,
0.01666259765625,
0.036163330078125,
0.0240325927734375,
-0.007366180419921875,
0.049102783203125,
-0.0216827392578125,
0.056182861328125,
0.01904296875,
-0.08062744140625,
0.049102783203125,
-0.0031566619873046875,
0.008331298828125,
0.03692626953125,
0.01427459716796875,
-0.0253448486328125,
-0.0038814544677734375,
-0.05316162109375,
-0.07598876953125,
0.06683349609375,
0.0299530029296875,
-0.00159454345703125,
0.03179931640625,
0.01995849609375,
-0.004718780517578125,
-0.006488800048828125,
-0.048583984375,
-0.03741455078125,
-0.030181884765625,
-0.0233306884765625,
-0.0233154296875,
-0.0194091796875,
-0.0026187896728515625,
-0.0396728515625,
0.07525634765625,
0.028961181640625,
0.04168701171875,
0.02850341796875,
-0.01505279541015625,
0.00719451904296875,
0.0088348388671875,
0.0261077880859375,
0.025177001953125,
-0.0293731689453125,
0.01027679443359375,
0.0238189697265625,
-0.047149658203125,
0.016448974609375,
0.017852783203125,
0.01071929931640625,
0.00853729248046875,
0.056427001953125,
0.0875244140625,
-0.0020999908447265625,
-0.031951904296875,
0.038421630859375,
0.00246429443359375,
-0.022613525390625,
-0.0413818359375,
0.0159759521484375,
0.03509521484375,
0.0270843505859375,
0.03289794921875,
0.00347900390625,
0.01032257080078125,
-0.030181884765625,
0.0291595458984375,
0.015106201171875,
-0.0386962890625,
-0.02435302734375,
0.0684814453125,
0.00337982177734375,
-0.01995849609375,
0.052398681640625,
-0.0019235610961914062,
-0.021209716796875,
0.04815673828125,
0.0455322265625,
0.056365966796875,
-0.0236663818359375,
-0.0174407958984375,
0.0430908203125,
0.0171051025390625,
-0.0015125274658203125,
0.038970947265625,
-0.015167236328125,
-0.037384033203125,
-0.020111083984375,
-0.045867919921875,
0.0013427734375,
0.0174407958984375,
-0.06292724609375,
0.026275634765625,
-0.032958984375,
-0.031829833984375,
0.0218048095703125,
0.0149993896484375,
-0.05950927734375,
0.0278778076171875,
0.021453857421875,
0.056976318359375,
-0.066162109375,
0.07733154296875,
0.02813720703125,
-0.026336669921875,
-0.09783935546875,
-0.0134735107421875,
-0.0020618438720703125,
-0.05615234375,
0.048370361328125,
0.026397705078125,
-0.03143310546875,
0.0154876708984375,
-0.042236328125,
-0.061309814453125,
0.08135986328125,
0.0239105224609375,
-0.05267333984375,
0.00787353515625,
-0.008056640625,
0.035614013671875,
-0.004154205322265625,
0.01708984375,
0.05615234375,
0.0325927734375,
0.006084442138671875,
-0.0718994140625,
-0.0103607177734375,
-0.013275146484375,
-0.01885986328125,
-0.021392822265625,
-0.044708251953125,
0.07427978515625,
-0.0299835205078125,
-0.0239410400390625,
-0.006175994873046875,
0.07672119140625,
0.021087646484375,
0.0231475830078125,
0.04730224609375,
0.038848876953125,
0.0712890625,
-0.0182952880859375,
0.056976318359375,
-0.002445220947265625,
0.040557861328125,
0.082275390625,
0.005985260009765625,
0.0654296875,
0.020843505859375,
-0.0295562744140625,
0.032867431640625,
0.046142578125,
-0.01120758056640625,
0.056976318359375,
0.01519012451171875,
-0.0178985595703125,
-0.0122528076171875,
0.00379180908203125,
-0.0467529296875,
0.066650390625,
0.02398681640625,
-0.01012420654296875,
0.017974853515625,
0.01088714599609375,
-0.007526397705078125,
-0.007965087890625,
-0.005817413330078125,
0.05584716796875,
0.01367950439453125,
-0.017852783203125,
0.07177734375,
0.004627227783203125,
0.05841064453125,
-0.0474853515625,
0.0025577545166015625,
0.0147247314453125,
0.0226593017578125,
-0.03131103515625,
-0.045562744140625,
0.009185791015625,
-0.01568603515625,
-0.0160980224609375,
0.00310516357421875,
0.048309326171875,
-0.04888916015625,
-0.03546142578125,
0.045928955078125,
0.00835418701171875,
0.021392822265625,
0.0006299018859863281,
-0.0487060546875,
0.02490234375,
0.020599365234375,
-0.031219482421875,
-0.007572174072265625,
0.00473785400390625,
0.025299072265625,
0.02032470703125,
0.060699462890625,
0.0144195556640625,
0.01285552978515625,
0.0018434524536132812,
0.045440673828125,
-0.039794921875,
-0.0419921875,
-0.04632568359375,
0.0311279296875,
0.006916046142578125,
-0.01360321044921875,
0.042083740234375,
0.061676025390625,
0.08001708984375,
0.00333404541015625,
0.056732177734375,
-0.0011425018310546875,
0.049957275390625,
-0.053192138671875,
0.058929443359375,
-0.0457763671875,
0.01103973388671875,
-0.0098114013671875,
-0.0601806640625,
0.006099700927734375,
0.0692138671875,
-0.0090179443359375,
0.0250701904296875,
0.03741455078125,
0.06561279296875,
-0.007724761962890625,
0.001483917236328125,
0.0112152099609375,
0.0244293212890625,
0.0254364013671875,
0.05389404296875,
0.04290771484375,
-0.061309814453125,
0.05438232421875,
-0.04779052734375,
-0.01456451416015625,
-0.00615692138671875,
-0.017822265625,
-0.06585693359375,
-0.06048583984375,
-0.016815185546875,
-0.052978515625,
-0.0030231475830078125,
0.079345703125,
0.06268310546875,
-0.06671142578125,
-0.030120849609375,
0.018310546875,
-0.0110626220703125,
-0.03253173828125,
-0.01446533203125,
0.056671142578125,
0.001110076904296875,
-0.064697265625,
0.05841064453125,
-0.005893707275390625,
0.00585174560546875,
0.004322052001953125,
-0.013763427734375,
-0.0210418701171875,
-0.0018281936645507812,
0.0275115966796875,
0.01499176025390625,
-0.050994873046875,
-0.015350341796875,
-0.00960540771484375,
-0.01297760009765625,
0.0117034912109375,
0.031402587890625,
-0.0509033203125,
0.0445556640625,
0.046661376953125,
0.023712158203125,
0.076904296875,
-0.01538848876953125,
0.008453369140625,
-0.05206298828125,
0.035980224609375,
0.0202484130859375,
0.0247955322265625,
0.02191162109375,
-0.01427459716796875,
0.021453857421875,
0.0264739990234375,
-0.040069580078125,
-0.05938720703125,
-0.006687164306640625,
-0.10040283203125,
-0.016693115234375,
0.09423828125,
0.0033740997314453125,
-0.01873779296875,
0.011871337890625,
-0.02618408203125,
0.07550048828125,
-0.035614013671875,
0.02606201171875,
0.029449462890625,
-0.0152587890625,
0.01338958740234375,
-0.042022705078125,
0.041778564453125,
0.034027099609375,
-0.0221710205078125,
-0.00923919677734375,
0.0298919677734375,
0.043548583984375,
0.005767822265625,
0.06915283203125,
-0.01076507568359375,
0.029449462890625,
0.0229339599609375,
0.0200958251953125,
-0.0228118896484375,
-0.0208892822265625,
-0.03424072265625,
0.00511932373046875,
-0.0093231201171875,
-0.039520263671875
]
] |
microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext | 2023-11-06T18:03:43.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"fill-mask",
"exbert",
"en",
"arxiv:2007.15779",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | microsoft | null | null | microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext | 131 | 1,685,260 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- exbert
license: mit
widget:
- text: "[MASK] is a tumor suppressor gene."
---
## MSR BiomedBERT (abstracts + full text)
<div style="border: 2px solid orange; border-radius:10px; padding:0px 10px; width: fit-content;">
* This model was previously named **"PubMedBERT (abstracts + full text)"**.
* You can either switch to the new model name "microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext", or update your `transformers` library to version 4.22+, which resolves the old model name automatically.
</div>
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general domain corpora, such as newswire and Web. A prevailing assumption is that even domain-specific pretraining can benefit by starting from general-domain language models. [Recent work](https://arxiv.org/abs/2007.15779) shows that for domains with abundant unlabeled text, such as biomedicine, pretraining language models from scratch results in substantial gains over continual pretraining of general-domain language models.
BiomedBERT is pretrained from scratch using _abstracts_ from [PubMed](https://pubmed.ncbi.nlm.nih.gov/) and _full-text_ articles from [PubMedCentral](https://www.ncbi.nlm.nih.gov/pmc/). This model achieves state-of-the-art performance on many biomedical NLP tasks, and currently holds the top score on the [Biomedical Language Understanding and Reasoning Benchmark](https://aka.ms/BLURB).
## Citation
If you find BiomedBERT useful in your research, please cite the following paper:
```bibtex
@misc{pubmedbert,
author = {Yu Gu and Robert Tinn and Hao Cheng and Michael Lucas and Naoto Usuyama and Xiaodong Liu and Tristan Naumann and Jianfeng Gao and Hoifung Poon},
title = {Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing},
year = {2020},
eprint = {arXiv:2007.15779},
}
```
<a href="https://huggingface.co/exbert/?model=microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext&modelKind=bidirectional&sentence=Gefitinib%20is%20an%20EGFR%20tyrosine%20kinase%20inhibitor,%20which%20is%20often%20used%20for%20breast%20cancer%20and%20NSCLC%20treatment.&layer=3&heads=..0,1,2,3,4,5,6,7,8,9,10,11&threshold=0.7&tokenInd=17&tokenSide=right&maskInds=..&hideClsSep=true">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| 2,436 | [
[
-0.0162353515625,
-0.04083251953125,
0.041656494140625,
0.008331298828125,
-0.028564453125,
0.006622314453125,
-0.017822265625,
-0.038665771484375,
0.0193634033203125,
0.0217132568359375,
-0.0308837890625,
-0.0458984375,
-0.0543212890625,
0.0233612060546875,
-0.005489349365234375,
0.09552001953125,
-0.0009546279907226562,
0.0171966552734375,
-0.0242462158203125,
-0.0178680419921875,
0.004974365234375,
-0.0653076171875,
-0.035980224609375,
-0.03948974609375,
0.045928955078125,
-0.0171051025390625,
0.039215087890625,
0.0260467529296875,
0.03363037109375,
0.02191162109375,
-0.016510009765625,
-0.00342559814453125,
-0.0219879150390625,
-0.00811767578125,
-0.01067352294921875,
-0.007793426513671875,
-0.061187744140625,
0.0119476318359375,
0.03857421875,
0.0703125,
0.0020160675048828125,
-0.00411224365234375,
0.0159149169921875,
0.055419921875,
-0.028076171875,
-0.00002485513687133789,
-0.03143310546875,
0.001789093017578125,
-0.0131988525390625,
-0.015960693359375,
-0.035064697265625,
-0.017059326171875,
0.039459228515625,
-0.038970947265625,
0.020294189453125,
-0.0037860870361328125,
0.09759521484375,
0.0033626556396484375,
-0.0189971923828125,
0.01050567626953125,
-0.0340576171875,
0.06964111328125,
-0.0706787109375,
0.036834716796875,
0.033477783203125,
-0.0025882720947265625,
0.00007420778274536133,
-0.08038330078125,
-0.027557373046875,
-0.025238037109375,
-0.0138092041015625,
0.0131072998046875,
-0.039642333984375,
0.01116180419921875,
0.00928497314453125,
0.0020084381103515625,
-0.071044921875,
-0.01500701904296875,
-0.049560546875,
-0.020233154296875,
0.029693603515625,
-0.017547607421875,
0.024078369140625,
0.004528045654296875,
-0.0294342041015625,
-0.00771331787109375,
-0.053985595703125,
-0.005397796630859375,
-0.00476837158203125,
0.01287078857421875,
-0.01383209228515625,
0.0200042724609375,
0.01018524169921875,
0.0665283203125,
-0.0012493133544921875,
0.0042572021484375,
0.069580078125,
-0.03277587890625,
-0.018707275390625,
0.0005002021789550781,
0.076416015625,
0.0086669921875,
0.03765869140625,
-0.0142822265625,
0.001285552978515625,
-0.01003265380859375,
0.035003662109375,
-0.0653076171875,
-0.02728271484375,
0.034759521484375,
-0.036346435546875,
-0.006656646728515625,
-0.0087890625,
-0.032867431640625,
0.0029468536376953125,
-0.028472900390625,
0.042816162109375,
-0.0504150390625,
-0.002040863037109375,
0.0273895263671875,
0.005428314208984375,
0.006580352783203125,
0.0098724365234375,
-0.04425048828125,
0.00940704345703125,
0.00940704345703125,
0.06298828125,
-0.0288238525390625,
-0.021881103515625,
-0.0204315185546875,
0.0095977783203125,
0.003692626953125,
0.05828857421875,
-0.037078857421875,
-0.00841522216796875,
-0.0011873245239257812,
0.0255889892578125,
-0.0194854736328125,
-0.03289794921875,
0.0184783935546875,
-0.03564453125,
0.0202484130859375,
0.0013952255249023438,
-0.035064697265625,
-0.00870513916015625,
-0.0068206787109375,
-0.029571533203125,
0.04779052734375,
0.00333404541015625,
-0.06695556640625,
0.00646209716796875,
-0.050018310546875,
-0.042449951171875,
-0.0104827880859375,
-0.01377105712890625,
-0.038604736328125,
0.0006418228149414062,
0.00788116455078125,
0.037994384765625,
-0.00756072998046875,
0.0188751220703125,
-0.01397705078125,
0.0004146099090576172,
0.0130462646484375,
0.0016384124755859375,
0.0726318359375,
0.0053558349609375,
-0.0270538330078125,
0.020904541015625,
-0.0626220703125,
0.0218505859375,
0.009765625,
-0.0262603759765625,
-0.02178955078125,
-0.004608154296875,
-0.00376129150390625,
0.0289764404296875,
0.01641845703125,
-0.041717529296875,
0.0025768280029296875,
-0.047332763671875,
0.04266357421875,
0.040740966796875,
0.0007033348083496094,
0.033355712890625,
-0.021881103515625,
0.041839599609375,
0.00963592529296875,
0.005035400390625,
0.00907135009765625,
-0.039642333984375,
-0.03912353515625,
-0.033782958984375,
0.0450439453125,
0.0423583984375,
-0.057861328125,
0.05029296875,
-0.01293182373046875,
-0.0228424072265625,
-0.0543212890625,
0.00021409988403320312,
0.04095458984375,
0.038116455078125,
0.0692138671875,
-0.037506103515625,
-0.0457763671875,
-0.0743408203125,
-0.020904541015625,
0.00836181640625,
-0.01351165771484375,
0.00888824462890625,
0.03887939453125,
-0.043182373046875,
0.061492919921875,
-0.030853271484375,
-0.02203369140625,
-0.03204345703125,
0.0205078125,
0.02667236328125,
0.050323486328125,
0.04156494140625,
-0.043792724609375,
-0.038482666015625,
-0.0194244384765625,
-0.048614501953125,
-0.019805908203125,
0.00742340087890625,
-0.022613525390625,
0.00714111328125,
0.040130615234375,
-0.052001953125,
0.0333251953125,
0.045166015625,
-0.0188446044921875,
0.049285888671875,
-0.029754638671875,
-0.00925445556640625,
-0.0755615234375,
0.0250396728515625,
0.00464630126953125,
-0.020294189453125,
-0.07122802734375,
-0.015777587890625,
0.0038700103759765625,
0.002826690673828125,
-0.038330078125,
0.0389404296875,
-0.042724609375,
0.026214599609375,
-0.01343536376953125,
0.01316070556640625,
0.01383209228515625,
0.043426513671875,
0.021240234375,
0.051483154296875,
0.0457763671875,
-0.052032470703125,
-0.0159454345703125,
0.03680419921875,
-0.0175323486328125,
-0.0007824897766113281,
-0.08587646484375,
0.004199981689453125,
-0.021820068359375,
0.02606201171875,
-0.06689453125,
0.002197265625,
0.0117950439453125,
-0.048614501953125,
0.0406494140625,
0.01229095458984375,
-0.0186614990234375,
-0.01074981689453125,
-0.02581787109375,
0.02410888671875,
0.047882080078125,
-0.01255035400390625,
0.039093017578125,
0.0290069580078125,
-0.032867431640625,
-0.0491943359375,
-0.0614013671875,
-0.01525115966796875,
0.02001953125,
-0.0386962890625,
0.048736572265625,
-0.0155792236328125,
0.00829315185546875,
-0.00652313232421875,
-0.003971099853515625,
-0.0145416259765625,
-0.01421356201171875,
0.00958251953125,
0.0318603515625,
-0.019012451171875,
0.019195556640625,
0.0084991455078125,
-0.01438140869140625,
-0.004100799560546875,
-0.01329803466796875,
0.046630859375,
-0.01287841796875,
-0.0103302001953125,
-0.0205078125,
0.0279541015625,
0.028167724609375,
-0.041717529296875,
0.07281494140625,
0.041015625,
-0.016357421875,
0.005512237548828125,
-0.021759033203125,
-0.0229339599609375,
-0.033721923828125,
0.04351806640625,
0.0063018798828125,
-0.0748291015625,
0.013580322265625,
-0.007205963134765625,
0.00270843505859375,
0.04327392578125,
0.0489501953125,
-0.0026302337646484375,
0.07501220703125,
0.046142578125,
0.01097869873046875,
0.0189208984375,
-0.01499176025390625,
0.0260467529296875,
-0.0679931640625,
-0.004421234130859375,
-0.03955078125,
-0.01169586181640625,
-0.0122528076171875,
-0.035797119140625,
0.0209197998046875,
0.0030803680419921875,
-0.028076171875,
0.035614013671875,
-0.055633544921875,
0.0140838623046875,
0.03509521484375,
0.0187835693359375,
0.0105438232421875,
0.01554107666015625,
-0.045379638671875,
-0.00841522216796875,
-0.04840087890625,
-0.046600341796875,
0.08538818359375,
0.0292510986328125,
0.044097900390625,
0.006328582763671875,
0.047882080078125,
0.004398345947265625,
0.03131103515625,
-0.0299835205078125,
0.034912109375,
-0.01448822021484375,
-0.059234619140625,
-0.00815582275390625,
-0.033721923828125,
-0.096435546875,
0.01255035400390625,
-0.0262908935546875,
-0.06634521484375,
0.0266876220703125,
0.0159912109375,
-0.04632568359375,
0.010101318359375,
-0.0518798828125,
0.0694580078125,
-0.02423095703125,
-0.02197265625,
0.00957489013671875,
-0.07269287109375,
0.0030727386474609375,
-0.017242431640625,
0.017547607421875,
-0.002208709716796875,
0.002857208251953125,
0.06732177734375,
-0.0288848876953125,
0.0631103515625,
-0.0135040283203125,
0.00791168212890625,
0.006160736083984375,
-0.02239990234375,
0.020904541015625,
-0.0214691162109375,
0.006366729736328125,
0.0299835205078125,
0.01316070556640625,
-0.0299835205078125,
-0.01073455810546875,
0.024200439453125,
-0.07373046875,
-0.031494140625,
-0.053192138671875,
-0.016387939453125,
-0.033538818359375,
0.0184326171875,
0.05792236328125,
0.03265380859375,
-0.0096435546875,
0.028472900390625,
0.0667724609375,
-0.057952880859375,
0.01062774658203125,
0.0479736328125,
-0.0171966552734375,
-0.027557373046875,
0.04833984375,
0.0012044906616210938,
0.023651123046875,
0.03564453125,
-0.003780364990234375,
-0.02008056640625,
-0.04742431640625,
-0.004718780517578125,
0.041534423828125,
-0.034149169921875,
-0.018341064453125,
-0.0811767578125,
-0.043792724609375,
-0.04034423828125,
-0.00804901123046875,
-0.017425537109375,
-0.0273590087890625,
-0.030792236328125,
-0.003299713134765625,
0.0182037353515625,
0.03302001953125,
-0.013031005859375,
0.017547607421875,
-0.0726318359375,
0.0199737548828125,
0.0080108642578125,
0.0213775634765625,
0.005645751953125,
-0.059417724609375,
-0.0225982666015625,
0.007091522216796875,
-0.0208282470703125,
-0.0699462890625,
0.044158935546875,
0.034454345703125,
0.05615234375,
0.0146484375,
-0.003570556640625,
0.0234375,
-0.07037353515625,
0.050140380859375,
0.033935546875,
-0.05279541015625,
0.0421142578125,
-0.018402099609375,
0.04010009765625,
0.055908203125,
0.0697021484375,
-0.0190887451171875,
-0.0286407470703125,
-0.0477294921875,
-0.09228515625,
0.040191650390625,
0.024871826171875,
0.0011615753173828125,
-0.0178375244140625,
0.011199951171875,
0.0016431808471679688,
0.0187225341796875,
-0.07000732421875,
-0.03839111328125,
-0.01171875,
-0.0244598388671875,
-0.017120361328125,
-0.02056884765625,
-0.019073486328125,
-0.05926513671875,
0.0643310546875,
0.0031299591064453125,
0.0665283203125,
0.039825439453125,
-0.0281524658203125,
0.01202392578125,
0.027099609375,
0.051727294921875,
0.0645751953125,
-0.03363037109375,
0.005474090576171875,
0.006938934326171875,
-0.053985595703125,
0.003993988037109375,
0.032867431640625,
0.008514404296875,
0.0219879150390625,
0.033935546875,
0.0509033203125,
0.01044464111328125,
-0.04766845703125,
0.05438232421875,
-0.003131866455078125,
-0.039947509765625,
-0.01154327392578125,
-0.00836944580078125,
0.0173187255859375,
0.00630950927734375,
0.034332275390625,
0.01065826416015625,
-0.0024089813232421875,
-0.0282135009765625,
0.028533935546875,
0.0174713134765625,
-0.045440673828125,
-0.03131103515625,
0.059967041015625,
0.009307861328125,
0.0011301040649414062,
0.0276641845703125,
-0.0074462890625,
-0.046661376953125,
0.021087646484375,
0.0489501953125,
0.0635986328125,
-0.02044677734375,
0.0115814208984375,
0.03961181640625,
0.0178070068359375,
0.005359649658203125,
0.018798828125,
0.0233612060546875,
-0.059906005859375,
-0.05169677734375,
-0.0709228515625,
-0.002765655517578125,
0.0237579345703125,
-0.037750244140625,
-0.016632080078125,
-0.0404052734375,
-0.037506103515625,
0.02398681640625,
-0.01088714599609375,
-0.051513671875,
0.022857666015625,
-0.0015382766723632812,
0.061553955078125,
-0.05157470703125,
0.0797119140625,
0.07586669921875,
-0.0435791015625,
-0.05279541015625,
-0.014739990234375,
-0.00519561767578125,
-0.052520751953125,
0.0576171875,
0.00017070770263671875,
0.0011730194091796875,
-0.00215911865234375,
-0.06390380859375,
-0.062255859375,
0.06231689453125,
0.0185394287109375,
-0.050140380859375,
-0.01499176025390625,
-0.00609588623046875,
0.05810546875,
-0.02386474609375,
0.012603759765625,
0.029052734375,
0.019622802734375,
-0.0119781494140625,
-0.06378173828125,
0.018829345703125,
-0.046600341796875,
-0.00759124755859375,
-0.00024628639221191406,
-0.0239715576171875,
0.0843505859375,
-0.004116058349609375,
0.002231597900390625,
0.004459381103515625,
0.0406494140625,
0.025604248046875,
0.0011873245239257812,
0.0174102783203125,
0.031494140625,
0.052642822265625,
-0.005313873291015625,
0.0828857421875,
-0.033355712890625,
0.026947021484375,
0.07415771484375,
-0.01433563232421875,
0.0623779296875,
0.030181884765625,
-0.0338134765625,
0.062347412109375,
0.033935546875,
0.00688934326171875,
0.051483154296875,
0.01739501953125,
-0.01910400390625,
-0.01617431640625,
0.01053619384765625,
-0.057281494140625,
0.00925445556640625,
0.01641845703125,
-0.051361083984375,
-0.00785064697265625,
0.008026123046875,
0.0187835693359375,
-0.01056671142578125,
0.0006422996520996094,
0.03924560546875,
0.0270538330078125,
-0.024322509765625,
0.0589599609375,
-0.01363372802734375,
0.05291748046875,
-0.08245849609375,
0.005252838134765625,
-0.0019350051879882812,
0.0186309814453125,
-0.01265716552734375,
-0.0318603515625,
0.01108551025390625,
0.0033245086669921875,
-0.019775390625,
-0.0130462646484375,
0.057525634765625,
-0.03570556640625,
-0.0211639404296875,
0.035369873046875,
0.046173095703125,
0.021514892578125,
-0.0067138671875,
-0.07159423828125,
-0.005443572998046875,
0.01433563232421875,
-0.0294342041015625,
0.0447998046875,
0.0173187255859375,
0.0254058837890625,
0.03363037109375,
0.046600341796875,
0.019775390625,
-0.0157928466796875,
0.005859375,
0.07080078125,
-0.0462646484375,
-0.0188140869140625,
-0.05712890625,
0.046722412109375,
-0.01320648193359375,
-0.019622802734375,
0.049560546875,
0.031768798828125,
0.05242919921875,
-0.0288238525390625,
0.057769775390625,
0.00008654594421386719,
0.057342529296875,
-0.02520751953125,
0.0814208984375,
-0.053497314453125,
-0.00016486644744873047,
-0.035064697265625,
-0.06524658203125,
-0.0386962890625,
0.07745361328125,
-0.027618408203125,
0.03509521484375,
0.078125,
0.048065185546875,
0.0072021484375,
-0.020294189453125,
0.0158233642578125,
0.037322998046875,
0.006107330322265625,
0.044158935546875,
0.034271240234375,
-0.0191192626953125,
0.0270538330078125,
0.0013799667358398438,
-0.0293426513671875,
-0.011260986328125,
-0.0665283203125,
-0.07586669921875,
-0.04248046875,
-0.03607177734375,
-0.046966552734375,
0.0218505859375,
0.088134765625,
0.055511474609375,
-0.0809326171875,
0.0048675537109375,
0.0128173828125,
-0.0305938720703125,
-0.008819580078125,
-0.01094818115234375,
0.04058837890625,
-0.0233917236328125,
-0.018310546875,
0.018585205078125,
0.0193634033203125,
0.0161895751953125,
0.009307861328125,
-0.00335693359375,
-0.0638427734375,
0.010772705078125,
0.054412841796875,
0.040863037109375,
-0.03564453125,
-0.01605224609375,
-0.006587982177734375,
-0.027740478515625,
0.016021728515625,
0.035797119140625,
-0.05908203125,
0.0207061767578125,
0.0246429443359375,
0.05548095703125,
0.036102294921875,
-0.006099700927734375,
0.05169677734375,
-0.06475830078125,
0.0032329559326171875,
0.029632568359375,
0.026947021484375,
0.021881103515625,
-0.00043773651123046875,
0.038726806640625,
0.016510009765625,
-0.054931640625,
-0.041961669921875,
-0.006061553955078125,
-0.077392578125,
-0.045989990234375,
0.0831298828125,
-0.01441192626953125,
-0.018829345703125,
-0.02032470703125,
-0.0109405517578125,
0.026458740234375,
-0.0180511474609375,
0.035491943359375,
0.04058837890625,
-0.007190704345703125,
0.0003631114959716797,
-0.054779052734375,
0.058868408203125,
0.044464111328125,
-0.05615234375,
-0.02020263671875,
0.01568603515625,
0.0161285400390625,
0.027008056640625,
0.0653076171875,
-0.019256591796875,
0.01282501220703125,
-0.0288848876953125,
0.03570556640625,
0.00756072998046875,
-0.0162506103515625,
-0.0272674560546875,
-0.009765625,
-0.0080413818359375,
0.0017957687377929688
]
] |
google/fnet-base | 2021-10-31T07:33:21.000Z | [
"transformers",
"pytorch",
"rust",
"fnet",
"pretraining",
"en",
"dataset:c4",
"arxiv:2105.03824",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | google | null | null | google/fnet-base | 13 | 1,611,942 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- fnet
license: apache-2.0
datasets:
- c4
---
# FNet base model
Pretrained model on the English language using masked language modeling (MLM) and next sentence prediction (NSP) objectives. It was
introduced in [this paper](https://arxiv.org/abs/2105.03824) and first released in [this repository](https://github.com/google-research/google-research/tree/master/f_net).
This model is uncased: it does not make a difference between english and English (the pretraining texts are lowercased, as described below). The model achieves 0.58 accuracy on the MLM objective and 0.80 accuracy on the NSP objective.
Disclaimer: This model card has been written by [gchhablani](https://huggingface.co/gchhablani).
## Model description
FNet is a transformer model whose self-attention layers are replaced with Fourier transforms; as a result, the inputs do not contain an `attention_mask`. It is pretrained on a large corpus of
English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling
them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and
labels from those texts. More precisely, it was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like
GPT, which internally mask the future tokens. This allows the model to learn a bidirectional representation of the
sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the FNet model as inputs.
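The NSP pairing described above can be sketched in a few lines of plain Python. This is an illustrative helper under stated assumptions (the function name and the toy corpus are hypothetical, not part of the released FNet code): with probability 0.5 the second segment is the true continuation (label 0), otherwise a random segment from the corpus (label 1).

```python
import random

def make_nsp_pairs(segments, rng=random.Random(0)):
    """Build (segment_a, segment_b, is_next) NSP examples.

    Label 0 means segment_b really followed segment_a in the corpus;
    label 1 means segment_b was drawn at random, as in the setup above.
    """
    pairs = []
    for i in range(len(segments) - 1):
        if rng.random() < 0.5:
            pairs.append((segments[i], segments[i + 1], 0))  # true next sentence
        else:
            pairs.append((segments[i], rng.choice(segments), 1))  # random sentence
    return pairs

corpus = ["The cat sat.", "It purred loudly.", "Rain fell outside.", "The street was wet."]
pairs = make_nsp_pairs(corpus)
```

Each resulting pair would then be tokenized into the `[CLS] A [SEP] B [SEP]` format shown in the preprocessing section.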
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=fnet) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation, you should look at models like GPT-2.
## Training data
The FNet model was pretrained on [C4](https://huggingface.co/datasets/c4), a cleaned version of the Common Crawl dataset.
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 32,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the combined length of the two
"sentences" is less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
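The 80/10/10 masking procedure above can be sketched in plain Python. This is an illustrative helper, not the actual preprocessing code (the function name is hypothetical, and `vocab` is assumed to contain tokens different from the one being replaced):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=random.Random(42)):
    """Apply the masking scheme described above: 15% of tokens are
    selected; of those, 80% become [MASK], 10% become a random
    different token, and 10% are left unchanged."""
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # the model must predict the original token here
            r = rng.random()
            if r < 0.8:
                masked.append("[MASK]")
            elif r < 0.9:
                # random token, different from the one it replaces
                masked.append(rng.choice([v for v in vocab if v != tok]))
            else:
                masked.append(tok)  # kept as-is, but still predicted
        else:
            labels.append(None)  # position excluded from the MLM loss
            masked.append(tok)
    return masked, labels
```

Note that even the ~1.5% of selected tokens left unchanged still contribute to the MLM loss, which keeps the model from learning that a visible token is always correct.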
### Pretraining
FNet-base was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size
of 256. The sequence length was limited to 512 tokens. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
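The warmup-then-linear-decay schedule can be written out as a small function. This is a sketch of the schedule described above (the function name is hypothetical, not the original training code):

```python
def learning_rate(step, peak_lr=1e-4, warmup_steps=10_000, total_steps=1_000_000):
    """Linear warmup from 0 to peak_lr over warmup_steps,
    then linear decay back to 0 at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)
```

For example, `learning_rate(5_000)` gives half the peak rate during warmup, `learning_rate(10_000)` gives the peak of 1e-4, and `learning_rate(1_000_000)` gives 0.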
## Evaluation results
FNet-base was fine-tuned and evaluated on the validation data of the [GLUE benchmark](https://huggingface.co/datasets/glue). The results of the official model (written in Flax) can be seen in Table 1 on page 7 of [the official paper](https://arxiv.org/abs/2105.03824).
For comparison, this model (ported to PyTorch) was fine-tuned and evaluated using the [official Hugging Face GLUE evaluation scripts](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification#glue-tasks) alongside [bert-base-cased](https://hf.co/models/bert-base-cased).
The training was done on a single 16GB NVIDIA Tesla V100 GPU. For MRPC/WNLI, the models were trained for 5 epochs, while for other tasks, the models were trained for 3 epochs. A sequence length of 512 was used with batch size 16 and learning rate 2e-5.
The following table summarizes the results for [fnet-base](https://huggingface.co/google/fnet-base) (called *FNet (PyTorch) - Reproduced*) and [bert-base-cased](https://hf.co/models/bert-base-cased) (called *Bert (PyTorch) - Reproduced*) in terms of **fine-tuning** speed. The format is *hour:min:seconds*. **Note** that the authors compared **pre-training** speed in [the official paper](https://arxiv.org/abs/2105.03824) instead.
| Task/Model | FNet-base (PyTorch) |Bert-base (PyTorch)|
|:----:|:-----------:|:----:|
| MNLI-(m/mm) | [06:40:55](https://huggingface.co/gchhablani/fnet-base-finetuned-mnli) | [09:52:33](https://huggingface.co/gchhablani/bert-base-cased-finetuned-mnli)|
| QQP | [06:21:16](https://huggingface.co/gchhablani/fnet-base-finetuned-qqp) | [09:25:01](https://huggingface.co/gchhablani/bert-base-cased-finetuned-qqp) |
| QNLI | [01:48:22](https://huggingface.co/gchhablani/fnet-base-finetuned-qnli) | [02:40:22](https://huggingface.co/gchhablani/bert-base-cased-finetuned-qnli)|
| SST-2 | [01:09:27](https://huggingface.co/gchhablani/fnet-base-finetuned-sst2) | [01:42:17](https://huggingface.co/gchhablani/bert-base-cased-finetuned-sst2)|
| CoLA | [00:09:47](https://huggingface.co/gchhablani/fnet-base-finetuned-cola) | [00:14:20](https://huggingface.co/gchhablani/bert-base-cased-finetuned-cola)|
| STS-B | [00:07:09](https://huggingface.co/gchhablani/fnet-base-finetuned-stsb) | [00:10:24](https://huggingface.co/gchhablani/bert-base-cased-finetuned-stsb)|
| MRPC | [00:07:48](https://huggingface.co/gchhablani/fnet-base-finetuned-mrpc) | [00:11:12](https://huggingface.co/gchhablani/bert-base-cased-finetuned-mrpc)|
| RTE | [00:03:24](https://huggingface.co/gchhablani/fnet-base-finetuned-rte) | [00:04:51](https://huggingface.co/gchhablani/bert-base-cased-finetuned-rte)|
| WNLI | [00:02:37](https://huggingface.co/gchhablani/fnet-base-finetuned-wnli) | [00:03:23](https://huggingface.co/gchhablani/bert-base-cased-finetuned-wnli)|
| SUM | 16:30:45 | 24:23:56 |
On average the PyTorch version of FNet-base requires *ca.* 32% less time for GLUE fine-tuning on GPU.
The following table summarizes the results for [fnet-base](https://huggingface.co/google/fnet-base) (called *FNet (PyTorch) - Reproduced*) and [bert-base-cased](https://hf.co/models/bert-base-cased) (called *Bert (PyTorch) - Reproduced*) in terms of performance and compares it to the reported performance of the official FNet-base model (called *FNet (Flax) - Official*). Note that the training hyperparameters of the reproduced models were not the same as the official model, so the performance may differ significantly for some tasks (for example: CoLA).
| Task/Model | Metric | FNet-base (PyTorch) | Bert-base (PyTorch) | FNet-Base (Flax - official) |
|:----:|:-----------:|:----:|:-----------:|:----:|
| MNLI-(m/mm) | Accuracy or Match/Mismatch | [76.75](https://huggingface.co/gchhablani/fnet-base-finetuned-mnli) | [84.10](https://huggingface.co/gchhablani/bert-base-cased-finetuned-mnli) | 72/73 |
| QQP | mean(Accuracy,F1) | [86.5](https://huggingface.co/gchhablani/fnet-base-finetuned-qqp) | [89.26](https://huggingface.co/gchhablani/bert-base-cased-finetuned-qqp) | 83 |
| QNLI | Accuracy | [84.39](https://huggingface.co/gchhablani/fnet-base-finetuned-qnli) | [90.99](https://huggingface.co/gchhablani/bert-base-cased-finetuned-qnli) | 80 |
| SST-2 | Accuracy | [89.45](https://huggingface.co/gchhablani/fnet-base-finetuned-sst2) | [92.32](https://huggingface.co/gchhablani/bert-base-cased-finetuned-sst2) | 95 |
| CoLA | Matthews corr or Accuracy | [35.94](https://huggingface.co/gchhablani/fnet-base-finetuned-cola) | [59.57](https://huggingface.co/gchhablani/bert-base-cased-finetuned-cola) | 69 |
| STS-B | Spearman corr. | [82.19](https://huggingface.co/gchhablani/fnet-base-finetuned-stsb) | [88.98](https://huggingface.co/gchhablani/bert-base-cased-finetuned-stsb) | 79 |
| MRPC | mean(F1/Accuracy) | [81.15](https://huggingface.co/gchhablani/fnet-base-finetuned-mrpc) | [88.15](https://huggingface.co/gchhablani/bert-base-cased-finetuned-mrpc) | 76 |
| RTE | Accuracy | [62.82](https://huggingface.co/gchhablani/fnet-base-finetuned-rte) | [67.15](https://huggingface.co/gchhablani/bert-base-cased-finetuned-rte) | 63 |
| WNLI | Accuracy | [54.93](https://huggingface.co/gchhablani/fnet-base-finetuned-wnli) | [46.48](https://huggingface.co/gchhablani/bert-base-cased-finetuned-wnli) | - |
| Avg | - | 72.7 | 78.6 | 76.7 |
We can see that FNet-base achieves around 93% of BERT-base's performance on average.
For more details, please refer to the checkpoints linked with the scores. An overview of all fine-tuned checkpoints from the tables above can be accessed [here](https://huggingface.co/models?other=fnet-bert-base-comparison).
### How to use
You can use this model directly with a pipeline for masked language modeling:
**Note: The fill-mask pipeline does not behave exactly like the original model, which performs masking after converting text to tokens; in the pipeline, an additional space is added after the `[MASK]` token.**
```python
>>> from transformers import FNetForMaskedLM, FNetTokenizer, pipeline
>>> tokenizer = FNetTokenizer.from_pretrained("google/fnet-base")
>>> model = FNetForMaskedLM.from_pretrained("google/fnet-base")
>>> unmasker = pipeline('fill-mask', model=model, tokenizer=tokenizer)
>>> unmasker("Hello I'm a [MASK] model.")
[
{"sequence": "hello i'm a new model.", "score": 0.12073223292827606, "token": 351, "token_str": "new"},
{"sequence": "hello i'm a first model.", "score": 0.08501081168651581, "token": 478, "token_str": "first"},
{"sequence": "hello i'm a next model.", "score": 0.060546260327100754, "token": 1037, "token_str": "next"},
{"sequence": "hello i'm a last model.", "score": 0.038265593349933624, "token": 813, "token_str": "last"},
{"sequence": "hello i'm a sister model.", "score": 0.033868927508592606, "token": 6232, "token_str": "sister"},
]
```
Here is how to use this model to get the features of a given text in PyTorch:
**Note: You must specify the maximum sequence length as 512 and truncate/pad all inputs to that length, because the original model has no attention mask and considers all hidden states during the forward pass.**
```python
from transformers import FNetTokenizer, FNetModel

tokenizer = FNetTokenizer.from_pretrained("google/fnet-base")
model = FNetModel.from_pretrained("google/fnet-base")

text = "Replace me by any text you'd like."
# Pad/truncate to the full 512-token length: FNet has no attention mask,
# so every position contributes to the hidden states.
encoded_input = tokenizer(text, return_tensors='pt', padding='max_length', truncation=True, max_length=512)
output = model(**encoded_input)
```
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-03824,
author = {James Lee{-}Thorp and
Joshua Ainslie and
Ilya Eckstein and
Santiago Onta{\~{n}}{\'{o}}n},
title = {FNet: Mixing Tokens with Fourier Transforms},
journal = {CoRR},
volume = {abs/2105.03824},
year = {2021},
url = {https://arxiv.org/abs/2105.03824},
archivePrefix = {arXiv},
eprint = {2105.03824},
timestamp = {Fri, 14 May 2021 12:13:30 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-03824.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
## Contributions
Thanks to [@gchhablani](https://huggingface.co/gchhablani) for adding this model. | 12,614 | [
[
-0.03802490234375,
-0.065185546875,
-0.004058837890625,
0.023040771484375,
-0.0263519287109375,
-0.012176513671875,
-0.02545166015625,
-0.0531005859375,
0.043426513671875,
0.0117034912109375,
-0.0504150390625,
-0.0183563232421875,
-0.044921875,
-0.0011768341064453125,
-0.022705078125,
0.0843505859375,
0.00254058837890625,
-0.00794219970703125,
-0.002536773681640625,
-0.00038313865661621094,
-0.016693115234375,
-0.047332763671875,
-0.04840087890625,
-0.0258941650390625,
0.046234130859375,
0.021575927734375,
0.050506591796875,
0.036590576171875,
0.040283203125,
0.0299835205078125,
-0.0192108154296875,
-0.004360198974609375,
-0.03375244140625,
-0.0254974365234375,
0.01447296142578125,
-0.0263671875,
-0.054107666015625,
0.019866943359375,
0.04345703125,
0.045562744140625,
0.0005049705505371094,
0.014923095703125,
0.0165863037109375,
0.055816650390625,
-0.0273590087890625,
0.0024318695068359375,
-0.031768798828125,
0.0237579345703125,
-0.01503753662109375,
0.00820159912109375,
-0.030731201171875,
-0.0157928466796875,
0.0288238525390625,
-0.039276123046875,
0.0225830078125,
0.01035308837890625,
0.0955810546875,
0.0019092559814453125,
-0.015289306640625,
-0.0069427490234375,
-0.016693115234375,
0.0556640625,
-0.05517578125,
0.034210205078125,
0.03314208984375,
0.00882720947265625,
-0.005481719970703125,
-0.066650390625,
-0.03558349609375,
-0.0267486572265625,
-0.01551055908203125,
0.01023101806640625,
-0.0251312255859375,
0.0161590576171875,
0.03851318359375,
0.046783447265625,
-0.054107666015625,
0.0033893585205078125,
-0.03204345703125,
-0.029541015625,
0.05841064453125,
-0.006526947021484375,
0.0190582275390625,
-0.019927978515625,
-0.0311126708984375,
-0.012359619140625,
-0.027557373046875,
0.01385498046875,
0.04278564453125,
0.0167999267578125,
-0.0279998779296875,
0.03936767578125,
-0.01910400390625,
0.05914306640625,
0.0177001953125,
-0.00406646728515625,
0.04437255859375,
-0.0217132568359375,
-0.036468505859375,
-0.0094757080078125,
0.07305908203125,
0.0235748291015625,
0.035400390625,
-0.0183563232421875,
-0.003337860107421875,
-0.0037059783935546875,
0.01485443115234375,
-0.0765380859375,
-0.02801513671875,
0.01129150390625,
-0.0577392578125,
-0.021636962890625,
0.006992340087890625,
-0.034027099609375,
0.01543426513671875,
-0.02197265625,
0.02642822265625,
-0.046783447265625,
-0.024749755859375,
0.003627777099609375,
-0.0179595947265625,
0.0105743408203125,
0.0184783935546875,
-0.05291748046875,
0.015838623046875,
0.048675537109375,
0.080322265625,
-0.0069580078125,
-0.0182647705078125,
-0.02587890625,
-0.040771484375,
-0.0111236572265625,
0.0230255126953125,
0.0032596588134765625,
0.0040740966796875,
-0.016143798828125,
0.020233154296875,
-0.0227508544921875,
-0.0121307373046875,
0.0258941650390625,
-0.0208587646484375,
0.0244140625,
-0.016143798828125,
-0.043426513671875,
-0.021484375,
0.01128387451171875,
-0.044708251953125,
0.070068359375,
0.0200958251953125,
-0.06744384765625,
0.0245513916015625,
-0.049774169921875,
-0.0306396484375,
-0.01464080810546875,
0.0208740234375,
-0.03369140625,
0.0016803741455078125,
-0.0019063949584960938,
0.059539794921875,
-0.007167816162109375,
0.017791748046875,
-0.0204620361328125,
-0.032470703125,
0.0185546875,
-0.021942138671875,
0.070556640625,
0.02789306640625,
-0.04461669921875,
-0.00534820556640625,
-0.05023193359375,
0.01027679443359375,
0.007465362548828125,
-0.01439666748046875,
0.004901885986328125,
-0.02093505859375,
0.0145263671875,
0.038482666015625,
0.01357269287109375,
-0.05584716796875,
0.01461029052734375,
-0.05908203125,
0.045928955078125,
0.04931640625,
-0.02001953125,
0.0294036865234375,
-0.04266357421875,
0.03143310546875,
0.0032978057861328125,
0.01461029052734375,
0.0005688667297363281,
-0.058746337890625,
-0.060211181640625,
-0.01666259765625,
0.04437255859375,
0.04364013671875,
-0.0435791015625,
0.037109375,
-0.0166473388671875,
-0.044647216796875,
-0.04510498046875,
0.01061248779296875,
0.050079345703125,
0.025177001953125,
0.03521728515625,
-0.0311126708984375,
-0.0443115234375,
-0.0750732421875,
-0.006786346435546875,
-0.0005540847778320312,
-0.0008840560913085938,
0.03314208984375,
0.047332763671875,
-0.0011568069458007812,
0.049346923828125,
-0.03289794921875,
-0.018768310546875,
-0.0113525390625,
0.0140380859375,
0.0287933349609375,
0.046844482421875,
0.059906005859375,
-0.060150146484375,
-0.044403076171875,
-0.000965118408203125,
-0.05657958984375,
0.01262664794921875,
-0.001953125,
-0.006664276123046875,
0.0482177734375,
0.035614013671875,
-0.050140380859375,
0.046112060546875,
0.04779052734375,
-0.0299072265625,
0.032135009765625,
-0.008819580078125,
0.0012359619140625,
-0.08282470703125,
0.0226898193359375,
0.001434326171875,
-0.01727294921875,
-0.051055908203125,
0.007236480712890625,
-0.0189208984375,
0.005519866943359375,
-0.050689697265625,
0.0458984375,
-0.045074462890625,
-0.004253387451171875,
-0.007076263427734375,
-0.0008606910705566406,
-0.0190277099609375,
0.052978515625,
0.0089874267578125,
0.057281494140625,
0.029754638671875,
-0.0183868408203125,
0.0183563232421875,
0.0294647216796875,
-0.018707275390625,
0.01183319091796875,
-0.05645751953125,
0.016693115234375,
0.00783538818359375,
0.0245819091796875,
-0.07171630859375,
-0.01277923583984375,
0.0275421142578125,
-0.06353759765625,
0.03228759765625,
-0.00278472900390625,
-0.030792236328125,
-0.04888916015625,
-0.0247650146484375,
0.01357269287109375,
0.04608154296875,
-0.026611328125,
0.03021240234375,
0.01453399658203125,
0.0093536376953125,
-0.062347412109375,
-0.06829833984375,
0.0003478527069091797,
-0.008331298828125,
-0.0687255859375,
0.0423583984375,
-0.006496429443359375,
0.0081787109375,
-0.000514984130859375,
-0.007080078125,
-0.0132904052734375,
0.0131072998046875,
0.036163330078125,
0.00215911865234375,
-0.03997802734375,
-0.01116180419921875,
-0.0178375244140625,
0.0007905960083007812,
0.0020751953125,
-0.02215576171875,
0.051177978515625,
-0.0307769775390625,
0.0078125,
-0.05096435546875,
0.0299224853515625,
0.024322509765625,
0.0018301010131835938,
0.065185546875,
0.07470703125,
-0.03009033203125,
0.005161285400390625,
-0.043548583984375,
-0.022857666015625,
-0.038604736328125,
0.004245758056640625,
-0.036041259765625,
-0.0806884765625,
0.03564453125,
0.0343017578125,
-0.0019369125366210938,
0.049835205078125,
0.02716064453125,
-0.01654052734375,
0.0631103515625,
0.034271240234375,
-0.023284912109375,
0.051361083984375,
-0.039520263671875,
0.0079193115234375,
-0.049896240234375,
-0.01457977294921875,
-0.019622802734375,
-0.044647216796875,
-0.060028076171875,
-0.0246429443359375,
0.005626678466796875,
0.0271453857421875,
-0.0171661376953125,
0.041351318359375,
-0.045867919921875,
0.0026111602783203125,
0.055206298828125,
0.0328369140625,
-0.0228118896484375,
0.007656097412109375,
-0.020538330078125,
-0.0048980712890625,
-0.0706787109375,
-0.0123291015625,
0.07794189453125,
0.041839599609375,
0.0171051025390625,
-0.0014209747314453125,
0.044097900390625,
0.0171966552734375,
0.0225677490234375,
-0.043701171875,
0.049041748046875,
-0.0307769775390625,
-0.049652099609375,
-0.01389312744140625,
-0.031768798828125,
-0.07696533203125,
0.003902435302734375,
-0.0377197265625,
-0.07330322265625,
-0.0021724700927734375,
0.01544189453125,
-0.01263427734375,
0.02496337890625,
-0.059600830078125,
0.0667724609375,
-0.00615692138671875,
-0.033843994140625,
-0.002044677734375,
-0.07867431640625,
0.02471923828125,
0.01493072509765625,
-0.01187896728515625,
0.0018568038940429688,
0.0208587646484375,
0.065673828125,
-0.05474853515625,
0.054534912109375,
-0.02496337890625,
0.022003173828125,
0.0303802490234375,
-0.0186767578125,
0.014007568359375,
-0.01169586181640625,
0.00788116455078125,
0.0303497314453125,
-0.01027679443359375,
-0.037628173828125,
-0.02459716796875,
0.040496826171875,
-0.058258056640625,
-0.0286102294921875,
-0.0367431640625,
-0.0255889892578125,
-0.00428009033203125,
0.0008630752563476562,
0.046783447265625,
0.039031982421875,
-0.0033702850341796875,
0.037109375,
0.04156494140625,
-0.01324462890625,
0.044830322265625,
0.0009794235229492188,
0.001346588134765625,
-0.02789306640625,
0.055938720703125,
0.00974273681640625,
0.01517486572265625,
0.039825439453125,
0.0163421630859375,
-0.0222015380859375,
-0.0382080078125,
-0.0196380615234375,
0.02630615234375,
-0.019439697265625,
-0.029327392578125,
-0.07025146484375,
-0.041290283203125,
-0.039215087890625,
-0.015289306640625,
-0.0183868408203125,
-0.040771484375,
-0.03143310546875,
-0.004364013671875,
0.050872802734375,
0.031982421875,
-0.014068603515625,
0.03314208984375,
-0.05670166015625,
0.016021728515625,
0.0094451904296875,
0.0238189697265625,
-0.004245758056640625,
-0.04449462890625,
-0.01355743408203125,
0.0212554931640625,
-0.0180816650390625,
-0.05401611328125,
0.01983642578125,
0.0270843505859375,
0.04266357421875,
0.0193328857421875,
-0.02239990234375,
0.042327880859375,
-0.029052734375,
0.06976318359375,
0.0263824462890625,
-0.06829833984375,
0.039215087890625,
-0.035858154296875,
0.01535797119140625,
0.030914306640625,
0.03778076171875,
-0.0278167724609375,
-0.0225830078125,
-0.0657958984375,
-0.06658935546875,
0.05584716796875,
0.027191162109375,
0.002262115478515625,
0.0032596588134765625,
0.03759765625,
-0.0146942138671875,
0.0239105224609375,
-0.051910400390625,
-0.033599853515625,
-0.025970458984375,
-0.0084381103515625,
-0.026763916015625,
-0.021575927734375,
0.023284912109375,
-0.042022705078125,
0.05133056640625,
-0.004299163818359375,
0.0682373046875,
0.025848388671875,
-0.0225372314453125,
0.01531219482421875,
-0.00981903076171875,
0.0712890625,
0.024444580078125,
-0.031890869140625,
-0.024444580078125,
0.0109405517578125,
-0.06268310546875,
-0.005035400390625,
0.02520751953125,
-0.00347900390625,
0.0124053955078125,
0.040130615234375,
0.0858154296875,
0.0162811279296875,
-0.020721435546875,
0.054962158203125,
-0.0006809234619140625,
-0.040435791015625,
-0.011199951171875,
0.005153656005859375,
0.00962066650390625,
0.0180511474609375,
0.01174163818359375,
0.00940704345703125,
0.001483917236328125,
-0.038421630859375,
0.04840087890625,
0.022308349609375,
-0.033843994140625,
-0.025421142578125,
0.055419921875,
0.0075225830078125,
-0.018096923828125,
0.051483154296875,
-0.01270294189453125,
-0.051849365234375,
0.05413818359375,
0.030242919921875,
0.0645751953125,
-0.028411865234375,
0.02197265625,
0.039398193359375,
0.01800537109375,
-0.00008636713027954102,
0.0253753662109375,
0.004085540771484375,
-0.05535888671875,
-0.03460693359375,
-0.054779052734375,
-0.0058135986328125,
0.02215576171875,
-0.044342041015625,
0.002407073974609375,
-0.040130615234375,
-0.0216064453125,
0.0182342529296875,
0.0272064208984375,
-0.06036376953125,
0.035858154296875,
0.0043792724609375,
0.09564208984375,
-0.0628662109375,
0.06591796875,
0.04974365234375,
-0.021728515625,
-0.06768798828125,
-0.017059326171875,
-0.003101348876953125,
-0.051910400390625,
0.0516357421875,
0.020782470703125,
0.00984954833984375,
-0.0062408447265625,
-0.0313720703125,
-0.0430908203125,
0.08404541015625,
0.0160675048828125,
-0.037353515625,
-0.01453399658203125,
0.005138397216796875,
0.052093505859375,
-0.0328369140625,
0.032073974609375,
0.036468505859375,
0.022735595703125,
0.0011510848999023438,
-0.07916259765625,
0.01044464111328125,
-0.044158935546875,
-0.00461578369140625,
0.01480865478515625,
-0.064453125,
0.0859375,
0.00618743896484375,
0.00571441650390625,
0.007755279541015625,
0.05474853515625,
0.0019588470458984375,
0.01531982421875,
0.040283203125,
0.05145263671875,
0.0364990234375,
-0.0255889892578125,
0.0870361328125,
-0.0140380859375,
0.0285186767578125,
0.05517578125,
0.02716064453125,
0.069091796875,
0.028167724609375,
-0.01439666748046875,
0.0238189697265625,
0.058319091796875,
-0.0001506805419921875,
0.03289794921875,
0.028350830078125,
0.005519866943359375,
-0.0031223297119140625,
-0.0007624626159667969,
-0.0347900390625,
0.044708251953125,
0.0196990966796875,
-0.034637451171875,
0.003627777099609375,
0.01526641845703125,
0.0213775634765625,
-0.014617919921875,
-0.0167694091796875,
0.049957275390625,
0.0099639892578125,
-0.054595947265625,
0.057098388671875,
-0.007381439208984375,
0.0762939453125,
-0.035247802734375,
0.015655517578125,
-0.0199432373046875,
0.01349639892578125,
-0.01177215576171875,
-0.050994873046875,
0.037017822265625,
-0.01349639892578125,
-0.0191650390625,
-0.031463623046875,
0.061279296875,
-0.025360107421875,
-0.027374267578125,
0.0283966064453125,
0.029083251953125,
0.0246734619140625,
-0.0164794921875,
-0.0799560546875,
0.004535675048828125,
0.01314544677734375,
-0.0333251953125,
0.0305328369140625,
0.00567626953125,
0.0018930435180664062,
0.042144775390625,
0.049102783203125,
0.0032749176025390625,
0.00919342041015625,
-0.020965576171875,
0.054962158203125,
-0.041900634765625,
-0.03594970703125,
-0.04193115234375,
0.04144287109375,
-0.0182342529296875,
-0.0472412109375,
0.04730224609375,
0.039703369140625,
0.060211181640625,
-0.00670623779296875,
0.0298309326171875,
-0.022979736328125,
0.03265380859375,
-0.05950927734375,
0.054229736328125,
-0.06005859375,
-0.0239715576171875,
-0.039398193359375,
-0.059356689453125,
-0.01641845703125,
0.06634521484375,
-0.00615692138671875,
0.0076141357421875,
0.047821044921875,
0.050689697265625,
-0.015716552734375,
0.006298065185546875,
0.006061553955078125,
0.01540374755859375,
-0.0033321380615234375,
0.061614990234375,
0.044403076171875,
-0.060577392578125,
0.023193359375,
-0.0391845703125,
-0.003444671630859375,
-0.0374755859375,
-0.0791015625,
-0.085693359375,
-0.06756591796875,
-0.03533935546875,
-0.037872314453125,
0.00298309326171875,
0.06854248046875,
0.07135009765625,
-0.059112548828125,
-0.01080322265625,
-0.005123138427734375,
-0.00627899169921875,
-0.0207977294921875,
-0.0207977294921875,
0.0313720703125,
-0.005939483642578125,
-0.0439453125,
0.01104736328125,
-0.00617218017578125,
0.01134490966796875,
0.0102386474609375,
-0.0027618408203125,
-0.046356201171875,
-0.007659912109375,
0.045562744140625,
0.02203369140625,
-0.048797607421875,
-0.037353515625,
0.022735595703125,
-0.01019287109375,
0.0089874267578125,
0.04351806640625,
-0.03765869140625,
0.007160186767578125,
0.051666259765625,
0.03753662109375,
0.070068359375,
0.00606536865234375,
0.03253173828125,
-0.06317138671875,
0.033203125,
0.00980377197265625,
0.0264892578125,
0.0166015625,
-0.033782958984375,
0.053497314453125,
0.0310211181640625,
-0.0419921875,
-0.05975341796875,
-0.005130767822265625,
-0.09014892578125,
-0.00927734375,
0.08251953125,
-0.0256805419921875,
-0.0253448486328125,
0.02239990234375,
-0.01125335693359375,
0.0272369384765625,
-0.016143798828125,
0.0306854248046875,
0.053680419921875,
-0.0014448165893554688,
-0.01427459716796875,
-0.050567626953125,
0.051300048828125,
0.035400390625,
-0.038818359375,
-0.03167724609375,
0.026458740234375,
0.03057861328125,
0.024017333984375,
0.0587158203125,
0.0135040283203125,
0.01490020751953125,
-0.0021533966064453125,
-0.00710296630859375,
-0.0168609619140625,
-0.0256500244140625,
-0.020294189453125,
0.008636474609375,
-0.035430908203125,
-0.039031982421875
]
] |
shibing624/text2vec-base-chinese | 2023-08-28T08:58:03.000Z | [
"transformers",
"pytorch",
"onnx",
"bert",
"feature-extraction",
"text2vec",
"sentence-similarity",
"zh",
"dataset:shibing624/nli_zh",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | shibing624 | null | null | shibing624/text2vec-base-chinese | 473 | 1,589,826 | transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- text2vec
- feature-extraction
- sentence-similarity
- transformers
datasets:
- shibing624/nli_zh
language:
- zh
metrics:
- spearmanr
library_name: transformers
---
# shibing624/text2vec-base-chinese
This is a CoSENT (Cosine Sentence) model: shibing624/text2vec-base-chinese.
It maps sentences to a 768-dimensional dense vector space and can be used for tasks
like sentence embeddings, text matching, or semantic search.
## Evaluation
For an automated evaluation of this model, see the *Evaluation Benchmark*: [text2vec](https://github.com/shibing624/text2vec)
- Chinese text matching task:
| Arch | BaseModel | Model | ATEC | BQ | LCQMC | PAWSX | STS-B | SOHU-dd | SOHU-dc | Avg | QPS |
|:-----------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------|:-----:|:-----:|:-----:|:-----:|:-----:|:-------:|:-------:|:---------:|:-----:|
| Word2Vec | word2vec | [w2v-light-tencent-chinese](https://ai.tencent.com/ailab/nlp/en/download.html) | 20.00 | 31.49 | 59.46 | 2.57 | 55.78 | 55.04 | 20.70 | 35.03 | 23769 |
| SBERT | xlm-roberta-base | [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) | 18.42 | 38.52 | 63.96 | 10.14 | 78.90 | 63.01 | 52.28 | 46.46 | 3138 |
| Instructor | hfl/chinese-roberta-wwm-ext | [moka-ai/m3e-base](https://huggingface.co/moka-ai/m3e-base) | 41.27 | 63.81 | 74.87 | 12.20 | 76.96 | 75.83 | 60.55 | 57.93 | 2980 |
| CoSENT | hfl/chinese-macbert-base | [shibing624/text2vec-base-chinese](https://huggingface.co/shibing624/text2vec-base-chinese) | 31.93 | 42.67 | 70.16 | 17.21 | 79.30 | 70.27 | 50.42 | 51.61 | 3008 |
| CoSENT | hfl/chinese-lert-large | [GanymedeNil/text2vec-large-chinese](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 32.61 | 44.59 | 69.30 | 14.51 | 79.44 | 73.01 | 59.04 | 53.12 | 2092 |
| CoSENT | nghuyong/ernie-3.0-base-zh | [shibing624/text2vec-base-chinese-sentence](https://huggingface.co/shibing624/text2vec-base-chinese-sentence) | 43.37 | 61.43 | 73.48 | 38.90 | 78.25 | 70.60 | 53.08 | 59.87 | 3089 |
| CoSENT | nghuyong/ernie-3.0-base-zh | [shibing624/text2vec-base-chinese-paraphrase](https://huggingface.co/shibing624/text2vec-base-chinese-paraphrase) | 44.89 | 63.58 | 74.24 | 40.90 | 78.93 | 76.70 | 63.30 | 63.08 | 3066 |
| CoSENT | sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 | [shibing624/text2vec-base-multilingual](https://huggingface.co/shibing624/text2vec-base-multilingual) | 32.39 | 50.33 | 65.64 | 32.56 | 74.45 | 68.88 | 51.17 | 53.67 | 4004 |
Notes:
- Evaluation metric: Spearman correlation coefficient
- The `shibing624/text2vec-base-chinese` model was trained with the CoSENT method on the Chinese STS-B data, based on `hfl/chinese-macbert-base`, and achieves good results on the Chinese STS-B test set. Run [examples/training_sup_text_matching_model.py](https://github.com/shibing624/text2vec/blob/master/examples/training_sup_text_matching_model.py) to train the model; the model files have been uploaded to the HF model hub. Recommended for general-purpose Chinese semantic matching tasks.
- The `shibing624/text2vec-base-chinese-sentence` model was trained with the CoSENT method on the human-curated Chinese STS dataset [shibing624/nli-zh-all/text2vec-base-chinese-sentence-dataset](https://huggingface.co/datasets/shibing624/nli-zh-all/tree/main/text2vec-base-chinese-sentence-dataset), based on `nghuyong/ernie-3.0-base-zh`, and achieves good results on Chinese NLI test sets. Run [examples/training_sup_text_matching_model_jsonl_data.py](https://github.com/shibing624/text2vec/blob/master/examples/training_sup_text_matching_model_jsonl_data.py) to train the model; the model files have been uploaded to the HF model hub. Recommended for Chinese s2s (sentence vs. sentence) semantic matching tasks.
- The `shibing624/text2vec-base-chinese-paraphrase` model was trained with the CoSENT method on the human-curated Chinese STS dataset [shibing624/nli-zh-all/text2vec-base-chinese-paraphrase-dataset](https://huggingface.co/datasets/shibing624/nli-zh-all/tree/main/text2vec-base-chinese-paraphrase-dataset), based on `nghuyong/ernie-3.0-base-zh`. Compared with [shibing624/nli-zh-all/text2vec-base-chinese-sentence-dataset](https://huggingface.co/datasets/shibing624/nli-zh-all/tree/main/text2vec-base-chinese-sentence-dataset), this dataset adds s2p (sentence-to-paraphrase) data, which strengthens its long-text representation ability. The model reaches SOTA on Chinese NLI test sets. Run [examples/training_sup_text_matching_model_jsonl_data.py](https://github.com/shibing624/text2vec/blob/master/examples/training_sup_text_matching_model_jsonl_data.py) to train the model; the model files have been uploaded to the HF model hub. Recommended for Chinese s2p (sentence vs. paragraph) semantic matching tasks.
- The `sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2` model was trained with SBERT; it is the multilingual version of `paraphrase-MiniLM-L12-v2` and supports Chinese, English, and other languages.
- `w2v-light-tencent-chinese` is a Word2Vec model built from Tencent word embeddings that loads and runs on CPU; it is suited to literal Chinese text matching and to cold-start scenarios with little data.
## Usage (text2vec)
Using this model becomes easy when you have [text2vec](https://github.com/shibing624/text2vec) installed:
```
pip install -U text2vec
```
Then you can use the model like this:
```python
from text2vec import SentenceModel
sentences = ['ๅฆไฝๆดๆข่ฑๅ็ปๅฎ้ถ่กๅก', '่ฑๅๆดๆน็ปๅฎ้ถ่กๅก']
model = SentenceModel('shibing624/text2vec-base-chinese')
embeddings = model.encode(sentences)
print(embeddings)
```
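The embeddings returned by `model.encode` can then be compared with cosine similarity for text matching or semantic search. A minimal NumPy sketch (the random vectors below are placeholders standing in for real 768-dimensional sentence embeddings):

```python
import numpy as np

def cos_sim(a, b):
    # cosine similarity between two 1-D vectors
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# placeholder 768-dim vectors standing in for two sentence embeddings
rng = np.random.default_rng(0)
emb_a = rng.standard_normal(768)
emb_b = emb_a + 0.1 * rng.standard_normal(768)  # a slightly perturbed copy
print(cos_sim(emb_a, emb_b))  # near-duplicate sentences score close to 1.0
```

Scores close to 1.0 indicate semantically similar sentences; unrelated sentences score near 0.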
## Usage (HuggingFace Transformers)
Without [text2vec](https://github.com/shibing624/text2vec), you can use the model like this:
First, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings.
Install transformers:
```
pip install transformers
```
Then load model and predict:
```python
from transformers import BertTokenizer, BertModel
import torch
# Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] # First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Load model from HuggingFace Hub
tokenizer = BertTokenizer.from_pretrained('shibing624/text2vec-base-chinese')
model = BertModel.from_pretrained('shibing624/text2vec-base-chinese')
sentences = ['ๅฆไฝๆดๆข่ฑๅ็ปๅฎ้ถ่กๅก', '่ฑๅๆดๆน็ปๅฎ้ถ่กๅก']
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Usage (sentence-transformers)
[sentence-transformers](https://github.com/UKPLab/sentence-transformers) is a popular library to compute dense vector representations for sentences.
Install sentence-transformers:
```
pip install -U sentence-transformers
```
Then load model and predict:
```python
from sentence_transformers import SentenceTransformer
m = SentenceTransformer("shibing624/text2vec-base-chinese")
sentences = ['ๅฆไฝๆดๆข่ฑๅ็ปๅฎ้ถ่กๅก', '่ฑๅๆดๆน็ปๅฎ้ถ่กๅก']
sentence_embeddings = m.encode(sentences)
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Full Model Architecture
```
CoSENT(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_mean_tokens': True})
)
```
## Intended uses
Our model is intended to be used as a sentence and short paragraph encoder. Given an input text, it outputs a vector which captures
the semantic information. The sentence vector may be used for information retrieval, clustering or sentence similarity tasks.
By default, input text longer than 256 word pieces is truncated.
## Training procedure
### Pre-training
We use the pretrained [`hfl/chinese-macbert-base`](https://huggingface.co/hfl/chinese-macbert-base) model.
Please refer to the model card for more detailed information about the pre-training procedure.
### Fine-tuning
We fine-tune the model using a contrastive objective. Formally, we compute the cosine similarity between all
possible sentence pairs in the batch.
We then apply a rank loss that encourages true pairs to score higher than false pairs.
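As an illustration, a rank loss of this kind (in the style of the CoSENT objective) can be sketched as below. This is a simplified re-implementation for exposition, not the exact training code, and the scale factor of 20 is an assumed default:

```python
import torch

def cosent_rank_loss(labels, cos_scores, scale=20.0):
    """CoSENT-style rank loss over a batch of sentence pairs.

    labels:     (N,) 1.0 for true (similar) pairs, 0.0 for false pairs
    cos_scores: (N,) cosine similarity predicted for each pair

    Penalizes every case where a false pair scores higher than a true pair:
    log(1 + sum exp(scale * (s_false - s_true))).
    """
    s = cos_scores * scale
    diff = s[:, None] - s[None, :]              # diff[i, j] = s_i - s_j
    keep = labels[:, None] < labels[None, :]    # i is a false pair, j a true pair
    diff = diff[keep]                           # keep only s_false - s_true terms
    zero = torch.zeros(1, dtype=diff.dtype)     # the "1 +" term inside the log
    return torch.logsumexp(torch.cat([zero, diff]), dim=0)
```

The loss is near zero when every true pair already outranks every false pair, and grows with each ranking violation.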
#### Hyper parameters
- training dataset: https://huggingface.co/datasets/shibing624/nli_zh
- max_seq_length: 128
- best epoch: 5
- sentence embedding dim: 768
## Citing & Authors
This model was trained by [text2vec](https://github.com/shibing624/text2vec).
If you find this model helpful, feel free to cite:
```bibtex
@software{text2vec,
author = {Xu Ming},
title = {text2vec: A Tool for Text to Vector},
year = {2022},
url = {https://github.com/shibing624/text2vec},
}
``` | 9,285 | [
[
-0.006488800048828125,
-0.056549072265625,
0.0229644775390625,
0.029998779296875,
-0.0227813720703125,
-0.03387451171875,
-0.0203704833984375,
-0.01314544677734375,
0.0069732666015625,
0.02984619140625,
-0.0308990478515625,
-0.041595458984375,
-0.0413818359375,
0.008026123046875,
-0.00707244873046875,
0.06207275390625,
-0.02044677734375,
0.016021728515625,
-0.00946044921875,
-0.0281829833984375,
-0.0372314453125,
-0.0308074951171875,
-0.041229248046875,
-0.01351165771484375,
0.01039886474609375,
0.023529052734375,
0.049957275390625,
0.0377197265625,
0.033782958984375,
0.0199432373046875,
-0.004245758056640625,
0.0157928466796875,
-0.0208587646484375,
-0.005512237548828125,
0.0027790069580078125,
-0.042999267578125,
-0.0061187744140625,
0.0008215904235839844,
0.037078857421875,
0.01922607421875,
0.004604339599609375,
-0.00438690185546875,
0.025726318359375,
0.042694091796875,
-0.0262298583984375,
0.0243377685546875,
-0.037994384765625,
-0.00039696693420410156,
-0.010345458984375,
-0.00699615478515625,
-0.027374267578125,
-0.015625,
0.01081085205078125,
-0.0406494140625,
0.004924774169921875,
0.01207733154296875,
0.093017578125,
0.011474609375,
-0.0241851806640625,
-0.03594970703125,
-0.01457977294921875,
0.0697021484375,
-0.064453125,
0.0151519775390625,
0.039581298828125,
-0.002674102783203125,
0.0024547576904296875,
-0.059173583984375,
-0.0528564453125,
-0.00435638427734375,
-0.040557861328125,
0.0296478271484375,
-0.00301361083984375,
-0.01374053955078125,
0.00872802734375,
0.008087158203125,
-0.058746337890625,
-0.0184783935546875,
-0.028900146484375,
-0.01326751708984375,
0.03875732421875,
0.004123687744140625,
0.0438232421875,
-0.049102783203125,
-0.03887939453125,
-0.024444580078125,
-0.031829833984375,
0.0225677490234375,
-0.003780364990234375,
0.01207733154296875,
-0.04071044921875,
0.052642822265625,
0.0020999908447265625,
0.032501220703125,
-0.00893402099609375,
0.001434326171875,
0.044952392578125,
-0.05145263671875,
-0.006031036376953125,
-0.02056884765625,
0.0946044921875,
0.047515869140625,
0.0196685791015625,
0.0021820068359375,
-0.005474090576171875,
0.0007824897766113281,
-0.0242462158203125,
-0.05126953125,
-0.02362060546875,
0.0253143310546875,
-0.04498291015625,
-0.0123443603515625,
0.01419830322265625,
-0.06488037109375,
-0.002140045166015625,
0.00014448165893554688,
0.036224365234375,
-0.057647705078125,
-0.008148193359375,
0.00934600830078125,
-0.03167724609375,
0.03533935546875,
-0.00986480712890625,
-0.057403564453125,
0.00997161865234375,
0.040496826171875,
0.08233642578125,
-0.0035800933837890625,
-0.0325927734375,
-0.0254364013671875,
0.0052032470703125,
-0.0209808349609375,
0.0364990234375,
-0.0199127197265625,
-0.01116943359375,
0.01432037353515625,
0.01265716552734375,
-0.024566650390625,
-0.02703857421875,
0.055328369140625,
-0.0070037841796875,
0.050811767578125,
-0.032196044921875,
-0.041595458984375,
-0.003971099853515625,
0.007045745849609375,
-0.038299560546875,
0.0997314453125,
0.0069580078125,
-0.086669921875,
-0.0032958984375,
-0.0253753662109375,
-0.032379150390625,
-0.008544921875,
-0.01515960693359375,
-0.043853759765625,
-0.0139923095703125,
0.0323486328125,
0.047119140625,
-0.01406097412109375,
0.00323486328125,
-0.0003693103790283203,
-0.0241851806640625,
0.0159912109375,
-0.01136016845703125,
0.0806884765625,
0.01021575927734375,
-0.03656005859375,
0.0028133392333984375,
-0.044097900390625,
0.005504608154296875,
0.021820068359375,
-0.017578125,
-0.03436279296875,
-0.00380706787109375,
0.027008056640625,
0.036041259765625,
0.0323486328125,
-0.0277557373046875,
-0.01329803466796875,
-0.049346923828125,
0.05810546875,
0.036346435546875,
0.004547119140625,
0.033203125,
-0.0288238525390625,
0.0099029541015625,
0.0050048828125,
-0.0013475418090820312,
-0.01520538330078125,
-0.054351806640625,
-0.06939697265625,
-0.00507354736328125,
0.01641845703125,
0.06964111328125,
-0.0843505859375,
0.063232421875,
-0.02752685546875,
-0.039947509765625,
-0.048126220703125,
0.00679779052734375,
0.037139892578125,
0.02386474609375,
0.05670166015625,
0.0214691162109375,
-0.036834716796875,
-0.06048583984375,
-0.0219879150390625,
-0.0213775634765625,
-0.00543975830078125,
0.027008056640625,
0.0462646484375,
-0.01467132568359375,
0.05108642578125,
-0.0347900390625,
-0.038177490234375,
-0.043609619140625,
0.0020351409912109375,
0.013946533203125,
0.038330078125,
0.0447998046875,
-0.0677490234375,
-0.05291748046875,
-0.011627197265625,
-0.07177734375,
0.0066070556640625,
-0.026092529296875,
-0.0282135009765625,
0.0143890380859375,
0.046295166015625,
-0.043243408203125,
0.025146484375,
0.053466796875,
-0.033355712890625,
0.0225067138671875,
-0.036651611328125,
0.006587982177734375,
-0.11181640625,
0.005466461181640625,
0.01293182373046875,
-0.0032444000244140625,
-0.043853759765625,
0.0090484619140625,
0.0199432373046875,
0.01453399658203125,
-0.033660888671875,
0.043792724609375,
-0.042388916015625,
0.03179931640625,
-0.004978179931640625,
0.03515625,
0.00470733642578125,
0.05126953125,
0.0112762451171875,
0.05731201171875,
0.0277099609375,
-0.04461669921875,
0.0208282470703125,
0.04986572265625,
-0.03387451171875,
0.0129547119140625,
-0.061492919921875,
-0.00919342041015625,
0.003238677978515625,
0.0191497802734375,
-0.0908203125,
-0.0033245086669921875,
0.038604736328125,
-0.053863525390625,
-0.0016355514526367188,
0.023223876953125,
-0.030242919921875,
-0.040283203125,
-0.05401611328125,
0.0127716064453125,
0.05072021484375,
-0.038482666015625,
0.033233642578125,
0.011962890625,
-0.006572723388671875,
-0.04266357421875,
-0.08526611328125,
0.00957489013671875,
-0.00278472900390625,
-0.060150146484375,
0.037841796875,
-0.00617218017578125,
0.010528564453125,
0.00023102760314941406,
0.020843505859375,
-0.00711822509765625,
-0.00894927978515625,
-0.0025177001953125,
0.0238800048828125,
-0.004947662353515625,
-0.0062103271484375,
0.013580322265625,
-0.0017747879028320312,
-0.006317138671875,
0.00644683837890625,
0.0465087890625,
0.0027446746826171875,
0.0001552104949951172,
-0.050262451171875,
0.024810791015625,
0.0200347900390625,
-0.016357421875,
0.0677490234375,
0.0628662109375,
-0.030242919921875,
0.00920867919921875,
-0.028106689453125,
-0.005802154541015625,
-0.03533935546875,
0.045623779296875,
-0.035369873046875,
-0.0633544921875,
0.0295257568359375,
0.02392578125,
0.0135955810546875,
0.0645751953125,
0.055389404296875,
-0.0027256011962890625,
0.0640869140625,
0.0269622802734375,
-0.01125335693359375,
0.031463623046875,
-0.0258636474609375,
0.009521484375,
-0.06561279296875,
-0.02960205078125,
-0.03460693359375,
-0.00774383544921875,
-0.04925537109375,
-0.0535888671875,
0.0184783935546875,
0.00884246826171875,
0.001434326171875,
0.053802490234375,
-0.042388916015625,
0.0025577545166015625,
0.03839111328125,
0.0169830322265625,
-0.0085601806640625,
-0.00015234947204589844,
-0.030181884765625,
-0.017547607421875,
-0.04058837890625,
-0.03692626953125,
0.06292724609375,
0.036529541015625,
0.029571533203125,
-0.0021686553955078125,
0.035552978515625,
-0.000025451183319091797,
-0.0164031982421875,
-0.04510498046875,
0.04461669921875,
-0.0247039794921875,
-0.032562255859375,
-0.0257720947265625,
-0.02801513671875,
-0.0648193359375,
0.023651123046875,
-0.0167236328125,
-0.0543212890625,
0.006649017333984375,
-0.01021575927734375,
-0.0235595703125,
0.01039886474609375,
-0.047943115234375,
0.06964111328125,
-0.01142120361328125,
-0.0207061767578125,
-0.004673004150390625,
-0.06280517578125,
0.021453857421875,
0.01678466796875,
0.0211639404296875,
-0.0007200241088867188,
-0.01422882080078125,
0.07379150390625,
-0.03472900390625,
0.0428466796875,
-0.0110015869140625,
-0.0005087852478027344,
0.040557861328125,
-0.017242431640625,
0.049224853515625,
0.0026531219482421875,
-0.006755828857421875,
0.0168304443359375,
0.0151519775390625,
-0.031707763671875,
-0.034332275390625,
0.05181884765625,
-0.0616455078125,
-0.02349853515625,
-0.041259765625,
-0.0209197998046875,
0.0046234130859375,
0.009674072265625,
0.0498046875,
0.01503753662109375,
-0.01140594482421875,
0.035888671875,
0.0401611328125,
-0.039581298828125,
0.038543701171875,
0.00879669189453125,
0.009246826171875,
-0.049072265625,
0.0684814453125,
0.0029239654541015625,
0.00858306884765625,
0.053466796875,
0.0215301513671875,
-0.025909423828125,
-0.03094482421875,
-0.0225067138671875,
0.03704833984375,
-0.034271240234375,
-0.00130462646484375,
-0.08001708984375,
-0.027435302734375,
-0.055511474609375,
0.00029921531677246094,
-0.01016998291015625,
-0.024444580078125,
-0.033538818359375,
-0.0048675537109375,
0.02239990234375,
0.032196044921875,
0.007472991943359375,
0.0191650390625,
-0.056549072265625,
0.0289306640625,
-0.0008974075317382812,
-0.01013946533203125,
-0.0149688720703125,
-0.0517578125,
-0.034698486328125,
0.0105133056640625,
-0.01534271240234375,
-0.060882568359375,
0.044708251953125,
0.0157012939453125,
0.035980224609375,
0.01617431640625,
0.0038166046142578125,
0.038360595703125,
-0.04766845703125,
0.08990478515625,
0.0281219482421875,
-0.0731201171875,
0.033905029296875,
-0.01140594482421875,
0.0164337158203125,
0.03631591796875,
0.024322509765625,
-0.0667724609375,
-0.0268707275390625,
-0.0274810791015625,
-0.0794677734375,
0.05584716796875,
0.03131103515625,
0.027618408203125,
-0.01203155517578125,
0.03887939453125,
-0.0134735107421875,
-0.0060272216796875,
-0.0675048828125,
-0.040771484375,
-0.02752685546875,
-0.05364990234375,
-0.0092010498046875,
-0.035186767578125,
0.00957489013671875,
-0.027069091796875,
0.054229736328125,
-0.0003981590270996094,
0.04766845703125,
0.0285491943359375,
-0.01160430908203125,
0.0149383544921875,
0.00341796875,
0.040557861328125,
0.0194244384765625,
0.0028133392333984375,
-0.0015211105346679688,
0.035552978515625,
-0.039093017578125,
-0.00403594970703125,
0.00824737548828125,
-0.02484130859375,
0.0163726806640625,
0.04876708984375,
0.06488037109375,
0.0177459716796875,
-0.05035400390625,
0.0660400390625,
-0.01403045654296875,
-0.03790283203125,
-0.0275421142578125,
-0.0014896392822265625,
0.0272369384765625,
0.01549530029296875,
0.0143585205078125,
-0.00930023193359375,
0.00330352783203125,
-0.02734375,
0.020172119140625,
0.0250396728515625,
-0.03741455078125,
-0.016693115234375,
0.06048583984375,
0.00469207763671875,
-0.006168365478515625,
0.042755126953125,
-0.0053863525390625,
-0.054779052734375,
0.03533935546875,
0.0294952392578125,
0.057861328125,
-0.00783538818359375,
0.01364898681640625,
0.0557861328125,
0.02606201171875,
-0.0228424072265625,
0.012481689453125,
0.0200347900390625,
-0.052459716796875,
-0.0019664764404296875,
-0.032562255859375,
0.007343292236328125,
0.01107025146484375,
-0.0372314453125,
0.03680419921875,
-0.0269317626953125,
0.0017337799072265625,
-0.008087158203125,
0.00536346435546875,
-0.035736083984375,
0.015380859375,
0.0046844482421875,
0.06964111328125,
-0.06500244140625,
0.0728759765625,
0.04254150390625,
-0.0439453125,
-0.058563232421875,
0.0178680419921875,
-0.0194091796875,
-0.06884765625,
0.03375244140625,
0.0190582275390625,
0.0146484375,
-0.0128936767578125,
-0.03204345703125,
-0.0528564453125,
0.09356689453125,
-0.00460052490234375,
-0.029998779296875,
-0.02362060546875,
0.0125274658203125,
0.05035400390625,
-0.01473236083984375,
0.033538818359375,
0.03955078125,
0.036285400390625,
0.00682830810546875,
-0.04583740234375,
0.0270843505859375,
-0.0238800048828125,
0.01531219482421875,
-0.0233154296875,
-0.07806396484375,
0.08172607421875,
-0.0095977783203125,
-0.01084136962890625,
0.031585693359375,
0.0721435546875,
0.0166778564453125,
0.0084991455078125,
0.030731201171875,
0.03228759765625,
0.0364990234375,
-0.006465911865234375,
0.06939697265625,
-0.0278472900390625,
0.0482177734375,
0.058197021484375,
0.0084228515625,
0.08489990234375,
0.028411865234375,
-0.0194091796875,
0.038055419921875,
0.0430908203125,
-0.01351165771484375,
0.054962158203125,
-0.005001068115234375,
-0.0030574798583984375,
-0.006221771240234375,
0.00937652587890625,
-0.0265655517578125,
0.019439697265625,
0.01505279541015625,
-0.0307464599609375,
0.00885772705078125,
-0.0035247802734375,
0.03155517578125,
0.0177154541015625,
0.00701141357421875,
0.0550537109375,
0.01364898681640625,
-0.054534912109375,
0.044891357421875,
0.0303955078125,
0.07965087890625,
-0.03961181640625,
0.00989532470703125,
-0.000789642333984375,
0.022979736328125,
-0.01049041748046875,
-0.062225341796875,
0.016876220703125,
-0.01212310791015625,
-0.00450897216796875,
-0.00531768798828125,
0.043243408203125,
-0.058441162109375,
-0.03912353515625,
0.044921875,
0.0294647216796875,
0.006961822509765625,
-0.0014619827270507812,
-0.09393310546875,
-0.0048370361328125,
0.0294036865234375,
-0.037872314453125,
0.0177001953125,
0.041107177734375,
0.0078277587890625,
0.03558349609375,
0.03009033203125,
-0.006015777587890625,
0.00263214111328125,
-0.0011234283447265625,
0.051727294921875,
-0.0562744140625,
-0.03466796875,
-0.07830810546875,
0.0369873046875,
-0.0183563232421875,
-0.032806396484375,
0.07171630859375,
0.05291748046875,
0.04962158203125,
-0.003955841064453125,
0.07379150390625,
-0.0119781494140625,
0.04736328125,
-0.04583740234375,
0.05804443359375,
-0.052734375,
-0.00684356689453125,
-0.0271453857421875,
-0.041748046875,
-0.0156402587890625,
0.0654296875,
-0.0186004638671875,
0.0018291473388671875,
0.07501220703125,
0.059478759765625,
0.019744873046875,
-0.01288604736328125,
0.007144927978515625,
0.0203857421875,
0.0260009765625,
0.06561279296875,
0.0304412841796875,
-0.08135986328125,
0.06695556640625,
-0.026885986328125,
-0.01245880126953125,
-0.01654052734375,
-0.035186767578125,
-0.0870361328125,
-0.06939697265625,
-0.0274810791015625,
-0.052581787109375,
-0.0036830902099609375,
0.067138671875,
0.0162506103515625,
-0.06915283203125,
-0.01058197021484375,
-0.01190185546875,
-0.003742218017578125,
-0.020751953125,
-0.02001953125,
0.05364990234375,
-0.027191162109375,
-0.0711669921875,
0.003696441650390625,
0.0041351318359375,
0.0015277862548828125,
-0.0014142990112304688,
-0.0184173583984375,
-0.046630859375,
0.01165771484375,
0.036041259765625,
-0.00119781494140625,
-0.050384521484375,
-0.0206146240234375,
0.016143798828125,
-0.04461669921875,
0.005718231201171875,
0.036529541015625,
-0.026824951171875,
0.0116424560546875,
0.05670166015625,
0.0297393798828125,
0.03228759765625,
-0.0032958984375,
0.0291290283203125,
-0.03900146484375,
0.0267333984375,
-0.0016803741455078125,
0.03509521484375,
0.0294189453125,
-0.0285491943359375,
0.035247802734375,
0.033966064453125,
-0.032562255859375,
-0.039306640625,
-0.0199737548828125,
-0.08343505859375,
-0.035552978515625,
0.11602783203125,
-0.0280609130859375,
-0.033599853515625,
0.00522613525390625,
-0.0386962890625,
0.0426025390625,
-0.021820068359375,
0.027618408203125,
0.06280517578125,
0.013580322265625,
-0.01062774658203125,
-0.03948974609375,
0.03924560546875,
0.038848876953125,
-0.051116943359375,
0.0114593505859375,
0.005519866943359375,
0.026824951171875,
0.00502777099609375,
0.04730224609375,
-0.00750732421875,
0.002590179443359375,
-0.0027923583984375,
0.0006222724914550781,
0.01096343994140625,
0.011016845703125,
-0.01248931884765625,
-0.0013294219970703125,
-0.025360107421875,
-0.026885986328125
]
] |
timm/resnet50.a1_in1k | 2023-04-05T18:08:16.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2110.00476",
"arxiv:1512.03385",
"license:apache-2.0",
"has_space",
"region:us"
] | image-classification | timm | null | null | timm/resnet50.a1_in1k | 9 | 1,577,731 | timm | 2023-04-05T18:07:45 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
---
# Model card for resnet50.a1_in1k
A ResNet-B image classification model.
This model features:
* ReLU activations
* single layer 7x7 convolution with pooling
* 1x1 convolution shortcut downsample
Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* ResNet Strikes Back `A1` recipe
* LAMB optimizer with BCE loss
* Cosine LR schedule with warmup
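The cosine LR schedule with warmup can be sketched as a stand-alone function (illustrative only, not the exact `timm` scheduler; the step counts and learning rate below are placeholder values):

```python
import math

def lr_at_step(step, total_steps, warmup_steps, base_lr, min_lr=0.0):
    # linear warmup to base_lr, then cosine decay toward min_lr
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * progress))

# a toy 100-step schedule with 5 warmup steps
schedule = [lr_at_step(s, total_steps=100, warmup_steps=5, base_lr=8e-3) for s in range(100)]
```

The LR ramps linearly to its peak during warmup, then follows a half-cosine down to `min_lr` by the final step.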
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 25.6
- GMACs: 4.1
- Activations (M): 11.1
- Image size: train = 224 x 224, test = 288 x 288
- **Papers:**
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnet50.a1_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnet50.a1_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 2048, 7, 7])
print(o.shape)
```
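The five shapes printed above follow ResNet-50's fixed channel/stride progression per feature level. This small sketch reproduces them from the input size alone, so it is easy to predict the feature pyramid for other resolutions:

```python
# (channels, stride) for each feature level of ResNet-50 with features_only=True.
LEVELS = [(64, 2), (256, 4), (512, 8), (1024, 16), (2048, 32)]

def feature_shapes(img_size: int, batch: int = 1):
    """Expected output shapes for a square input of the given size."""
    return [(batch, c, img_size // s, img_size // s) for c, s in LEVELS]

for shape in feature_shapes(224):
    print(shape)  # matches the torch.Size values in the comments above
```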
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnet50.a1_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
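The `forward_head(output, pre_logits=True)` step reduces the unpooled `(1, 2048, 7, 7)` tensor to `(1, 2048)` by global average pooling before the (removed) classifier. A pure-Python sketch of that reduction on nested lists, shown on a toy feature map:

```python
def global_avg_pool(fmap):
    """Average each HxW channel map to a single value: (C, H, W) -> (C,)."""
    return [sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
            for channel in fmap]

# Toy 2-channel, 2x2 feature map standing in for the real (2048, 7, 7) one.
fmap = [[[1.0, 3.0], [5.0, 7.0]],
        [[2.0, 2.0], [2.0, 2.0]]]
print(global_avg_pool(fmap))  # -> [4.0, 2.0]
```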
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec|
|------------------------------------------|--------|-----|-----|-----------|-----|-----|-------|
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 |
|[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 |610 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 |
|[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 |
|[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 |
|[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 |
|[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 |
|[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 |
|[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 |
|[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 |
|[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 |
|[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 |
|[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 |
|[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 |
|[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 |
|[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 |80.22|94.63|25.6 |4.4 |11.9 |3162 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 |
|[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 |
|[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 |
|[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 |
|[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 |
|[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 |
|[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 |
|[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 |
|[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 |
|[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 |
|[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 |
|[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 |
|[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 |
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 |
|[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 |
|[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 |
|[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 |
|[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 |
|[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 |
|[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 |76.42|92.87|21.8 |3.7 |3.7 |5984 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 |
|[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 |
|[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 |
|[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 |
|[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 |
|[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 |71.49|90.07|11.7 |1.8 |2.5 |10187 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 |
|[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 |
|[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 |
## Citation
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@article{He2015,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {arXiv preprint arXiv:1512.03385},
year = {2015}
}
```
| 38,406 | [
[
-0.06549072265625,
-0.0157928466796875,
0.00183868408203125,
0.02838134765625,
-0.030517578125,
-0.00846099853515625,
-0.00994873046875,
-0.02886962890625,
0.0867919921875,
0.0216217041015625,
-0.04901123046875,
-0.040283203125,
-0.04541015625,
-0.0006422996520996094,
0.0233154296875,
0.06494140625,
-0.00022554397583007812,
-0.00592803955078125,
0.0162200927734375,
-0.0183868408203125,
-0.003704071044921875,
-0.025390625,
-0.08038330078125,
-0.01357269287109375,
0.03179931640625,
0.01297760009765625,
0.049713134765625,
0.045745849609375,
0.02880859375,
0.04510498046875,
-0.01971435546875,
0.020355224609375,
-0.00426483154296875,
-0.009521484375,
0.0467529296875,
-0.031280517578125,
-0.07025146484375,
-0.0031337738037109375,
0.05377197265625,
0.04656982421875,
0.005218505859375,
0.0264892578125,
0.026702880859375,
0.046539306640625,
0.001522064208984375,
-0.004852294921875,
0.00124359130859375,
0.0110931396484375,
-0.0228729248046875,
0.00711822509765625,
-0.0055084228515625,
-0.0533447265625,
0.01201629638671875,
-0.04534912109375,
-0.0036869049072265625,
0.00010055303573608398,
0.10015869140625,
-0.009063720703125,
-0.016448974609375,
0.006404876708984375,
0.0108642578125,
0.057464599609375,
-0.0628662109375,
0.0258026123046875,
0.04180908203125,
0.0015172958374023438,
-0.0144500732421875,
-0.0501708984375,
-0.03900146484375,
0.0096588134765625,
-0.031768798828125,
0.023284912109375,
-0.0238494873046875,
-0.0179443359375,
0.02880859375,
0.024932861328125,
-0.03350830078125,
-0.009063720703125,
-0.02685546875,
-0.0069427490234375,
0.052459716796875,
0.00611114501953125,
0.051849365234375,
-0.026702880859375,
-0.037811279296875,
-0.00989532470703125,
-0.0128021240234375,
0.0361328125,
0.019256591796875,
0.01068115234375,
-0.080810546875,
0.032196044921875,
0.00823211669921875,
0.01806640625,
0.0277099609375,
-0.01006317138671875,
0.061798095703125,
-0.0070343017578125,
-0.03857421875,
-0.036407470703125,
0.0809326171875,
0.0479736328125,
0.0207366943359375,
-0.00646209716796875,
-0.00443267822265625,
-0.0142669677734375,
-0.0286102294921875,
-0.07049560546875,
-0.002758026123046875,
0.02069091796875,
-0.041229248046875,
-0.0177764892578125,
0.0248565673828125,
-0.0670166015625,
-0.0032100677490234375,
-0.006771087646484375,
0.00498199462890625,
-0.056549072265625,
-0.03289794921875,
0.0009303092956542969,
-0.0185394287109375,
0.03900146484375,
0.0158538818359375,
-0.0232086181640625,
0.03277587890625,
0.006084442138671875,
0.06610107421875,
0.0216217041015625,
-0.004573822021484375,
-0.0158843994140625,
0.0018634796142578125,
-0.026336669921875,
0.027252197265625,
0.0120849609375,
-0.01366424560546875,
-0.0259246826171875,
0.032684326171875,
-0.01910400390625,
-0.018798828125,
0.04559326171875,
0.02069091796875,
0.0127410888671875,
-0.02215576171875,
-0.01812744140625,
-0.0181884765625,
0.0272216796875,
-0.04388427734375,
0.07666015625,
0.0285797119140625,
-0.0836181640625,
0.0115203857421875,
-0.038818359375,
-0.0015468597412109375,
-0.0217437744140625,
0.00771331787109375,
-0.0675048828125,
0.00218963623046875,
0.0158538818359375,
0.052215576171875,
-0.016632080078125,
-0.01393890380859375,
-0.026702880859375,
0.003223419189453125,
0.0311431884765625,
0.01351165771484375,
0.06927490234375,
0.0233154296875,
-0.035980224609375,
-0.0167694091796875,
-0.053680419921875,
0.03375244140625,
0.03314208984375,
-0.0007352828979492188,
-0.0038318634033203125,
-0.05975341796875,
0.0018110275268554688,
0.0440673828125,
0.018524169921875,
-0.05328369140625,
0.0180511474609375,
-0.01398468017578125,
0.0251007080078125,
0.046875,
0.0024356842041015625,
0.0122222900390625,
-0.053131103515625,
0.04644775390625,
-0.0010776519775390625,
0.0210723876953125,
-0.0011577606201171875,
-0.0300750732421875,
-0.057373046875,
-0.055206298828125,
0.018218994140625,
0.03192138671875,
-0.0301513671875,
0.0645751953125,
0.00989532470703125,
-0.045928955078125,
-0.047698974609375,
0.0044403076171875,
0.044189453125,
0.0179595947265625,
0.0083160400390625,
-0.028076171875,
-0.055694580078125,
-0.07244873046875,
-0.0260009765625,
0.00884246826171875,
-0.0029315948486328125,
0.051239013671875,
0.032623291015625,
-0.0154571533203125,
0.040557861328125,
-0.0282440185546875,
-0.01629638671875,
-0.0113525390625,
-0.00688934326171875,
0.0330810546875,
0.059173583984375,
0.07647705078125,
-0.05523681640625,
-0.06976318359375,
0.0105133056640625,
-0.08197021484375,
-0.00539398193359375,
-0.00017654895782470703,
-0.0192108154296875,
0.033447265625,
0.0181121826171875,
-0.06494140625,
0.05792236328125,
0.0285797119140625,
-0.060577392578125,
0.034271240234375,
-0.025054931640625,
0.0428466796875,
-0.08197021484375,
0.02093505859375,
0.021636962890625,
-0.0192718505859375,
-0.043304443359375,
0.00479888916015625,
-0.007198333740234375,
0.0088653564453125,
-0.04266357421875,
0.05877685546875,
-0.05303955078125,
-0.0025119781494140625,
0.0110321044921875,
0.004253387451171875,
-0.0011043548583984375,
0.032073974609375,
-0.00421905517578125,
0.04327392578125,
0.0650634765625,
-0.01226806640625,
0.0250701904296875,
0.03070068359375,
0.005115509033203125,
0.05865478515625,
-0.047210693359375,
0.00994110107421875,
0.0019159317016601562,
0.034942626953125,
-0.0751953125,
-0.0291900634765625,
0.041015625,
-0.061309814453125,
0.04949951171875,
-0.0203399658203125,
-0.0208740234375,
-0.06280517578125,
-0.0657958984375,
0.0196685791015625,
0.04901123046875,
-0.04412841796875,
0.0284881591796875,
0.015380859375,
-0.0039520263671875,
-0.036712646484375,
-0.052581787109375,
0.007427215576171875,
-0.032318115234375,
-0.06109619140625,
0.0335693359375,
0.02374267578125,
-0.01398468017578125,
0.00745391845703125,
-0.01019287109375,
-0.01050567626953125,
-0.016510009765625,
0.046173095703125,
0.0243377685546875,
-0.022247314453125,
-0.0304107666015625,
-0.029388427734375,
-0.0211334228515625,
-0.005115509033203125,
-0.0087738037109375,
0.038909912109375,
-0.03436279296875,
0.0066070556640625,
-0.1092529296875,
0.009521484375,
0.06610107421875,
-0.0023860931396484375,
0.073486328125,
0.0577392578125,
-0.03619384765625,
0.0126953125,
-0.0341796875,
-0.016815185546875,
-0.039031982421875,
-0.016937255859375,
-0.053680419921875,
-0.042633056640625,
0.068359375,
0.003963470458984375,
-0.01045989990234375,
0.058837890625,
0.01129913330078125,
-0.0193634033203125,
0.061431884765625,
0.035980224609375,
-0.0028533935546875,
0.0419921875,
-0.0628662109375,
0.005962371826171875,
-0.06146240234375,
-0.056304931640625,
-0.0194091796875,
-0.04315185546875,
-0.0439453125,
-0.0251007080078125,
0.0176849365234375,
0.0283355712890625,
-0.0197296142578125,
0.045135498046875,
-0.04168701171875,
0.0024204254150390625,
0.02471923828125,
0.04071044921875,
-0.0168609619140625,
-0.009185791015625,
-0.00865936279296875,
-0.025238037109375,
-0.0396728515625,
-0.026947021484375,
0.05755615234375,
0.047515869140625,
0.031341552734375,
0.007396697998046875,
0.045013427734375,
0.004634857177734375,
0.01541900634765625,
-0.0232086181640625,
0.0521240234375,
0.002475738525390625,
-0.03375244140625,
-0.0250396728515625,
-0.03173828125,
-0.08001708984375,
0.01041412353515625,
-0.033538818359375,
-0.06243896484375,
-0.0134124755859375,
-0.003932952880859375,
-0.0266265869140625,
0.0562744140625,
-0.0460205078125,
0.046844482421875,
-0.005405426025390625,
-0.039337158203125,
-0.0024318695068359375,
-0.0601806640625,
0.005161285400390625,
0.0278778076171875,
0.0035762786865234375,
0.0008368492126464844,
-0.00386810302734375,
0.058807373046875,
-0.0614013671875,
0.046234130859375,
-0.0262908935546875,
0.01052093505859375,
0.0292816162109375,
-0.0026149749755859375,
0.029388427734375,
-0.0024547576904296875,
-0.014892578125,
-0.005779266357421875,
0.008880615234375,
-0.0628662109375,
-0.023834228515625,
0.048614501953125,
-0.0546875,
-0.028472900390625,
-0.04827880859375,
-0.0200347900390625,
0.007793426513671875,
0.0021610260009765625,
0.036346435546875,
0.04840087890625,
-0.0010547637939453125,
0.0174560546875,
0.03924560546875,
-0.033172607421875,
0.039459228515625,
-0.0092315673828125,
0.0012865066528320312,
-0.042694091796875,
0.05224609375,
0.00469207763671875,
0.0001957416534423828,
-0.0011348724365234375,
0.0009927749633789062,
-0.03118896484375,
-0.01593017578125,
-0.0224761962890625,
0.055572509765625,
-0.0131072998046875,
-0.022796630859375,
-0.047698974609375,
-0.025360107421875,
-0.042572021484375,
-0.03094482421875,
-0.03387451171875,
-0.0272674560546875,
-0.0223388671875,
0.0035419464111328125,
0.052825927734375,
0.0655517578125,
-0.0252838134765625,
0.0303497314453125,
-0.038818359375,
0.02197265625,
0.005886077880859375,
0.0421142578125,
-0.0249481201171875,
-0.051544189453125,
0.0035858154296875,
-0.0019054412841796875,
-0.00621795654296875,
-0.06378173828125,
0.049072265625,
0.0007500648498535156,
0.0272674560546875,
0.0292205810546875,
-0.015960693359375,
0.0550537109375,
-0.0021038055419921875,
0.035003662109375,
0.04620361328125,
-0.05511474609375,
0.026702880859375,
-0.0310516357421875,
0.002490997314453125,
0.0211639404296875,
0.0171966552734375,
-0.03021240234375,
-0.0257415771484375,
-0.06683349609375,
-0.0313720703125,
0.05462646484375,
0.0070343017578125,
-0.0022029876708984375,
-0.001201629638671875,
0.055755615234375,
-0.005619049072265625,
0.00446319580078125,
-0.0401611328125,
-0.06744384765625,
-0.00893402099609375,
-0.0120086669921875,
0.004711151123046875,
-0.003627777099609375,
0.0028743743896484375,
-0.04986572265625,
0.049835205078125,
0.004375457763671875,
0.036865234375,
0.01324462890625,
0.00531768798828125,
0.002880096435546875,
-0.0222625732421875,
0.045013427734375,
0.0270233154296875,
-0.0141143798828125,
-0.00919342041015625,
0.0287628173828125,
-0.037261962890625,
0.0071868896484375,
0.01605224609375,
0.00017976760864257812,
0.00670623779296875,
0.00678253173828125,
0.0380859375,
0.0257415771484375,
-0.005252838134765625,
0.037872314453125,
-0.018646240234375,
-0.040283203125,
-0.0166778564453125,
-0.01507568359375,
0.0197296142578125,
0.03338623046875,
0.02447509765625,
0.003185272216796875,
-0.0299224853515625,
-0.0287628173828125,
0.040771484375,
0.055450439453125,
-0.0304107666015625,
-0.0307159423828125,
0.0443115234375,
-0.00360107421875,
-0.018157958984375,
0.0291900634765625,
-0.007244110107421875,
-0.05224609375,
0.07708740234375,
0.0258941650390625,
0.047637939453125,
-0.03753662109375,
0.0073699951171875,
0.06439208984375,
-0.0008497238159179688,
0.014678955078125,
0.02545166015625,
0.0350341796875,
-0.023162841796875,
-0.0051727294921875,
-0.041259765625,
0.01230621337890625,
0.0361328125,
-0.031463623046875,
0.0216522216796875,
-0.0533447265625,
-0.027923583984375,
0.007221221923828125,
0.036865234375,
-0.047698974609375,
0.02752685546875,
-0.003406524658203125,
0.08056640625,
-0.061248779296875,
0.0631103515625,
0.0682373046875,
-0.0408935546875,
-0.06463623046875,
0.0000368952751159668,
0.0087738037109375,
-0.06451416015625,
0.03533935546875,
0.00653076171875,
0.0021572113037109375,
-0.000995635986328125,
-0.037689208984375,
-0.04949951171875,
0.103271484375,
0.0302276611328125,
-0.003742218017578125,
0.0207366943359375,
-0.031768798828125,
0.028564453125,
-0.01329803466796875,
0.043670654296875,
0.02752685546875,
0.038116455078125,
0.01275634765625,
-0.0657958984375,
0.0284881591796875,
-0.0316162109375,
-0.0090179443359375,
0.0237579345703125,
-0.09820556640625,
0.0675048828125,
-0.018218994140625,
-0.002651214599609375,
0.0181121826171875,
0.047882080078125,
0.024261474609375,
-0.001407623291015625,
0.0196380615234375,
0.0699462890625,
0.0357666015625,
-0.01873779296875,
0.0799560546875,
-0.0159454345703125,
0.04022216796875,
0.01537322998046875,
0.04193115234375,
0.027984619140625,
0.0296783447265625,
-0.04339599609375,
0.020233154296875,
0.0616455078125,
-0.0030536651611328125,
0.0085296630859375,
0.0220794677734375,
-0.030364990234375,
-0.0137481689453125,
-0.01654052734375,
-0.0513916015625,
0.0167694091796875,
0.00621795654296875,
-0.01081085205078125,
-0.00955963134765625,
-0.003925323486328125,
0.019317626953125,
0.0206146240234375,
-0.0191497802734375,
0.03900146484375,
0.00667572021484375,
-0.0291748046875,
0.035186767578125,
-0.0005369186401367188,
0.07916259765625,
-0.0262603759765625,
0.01284027099609375,
-0.027435302734375,
0.0216522216796875,
-0.0183563232421875,
-0.08038330078125,
0.0244293212890625,
-0.006069183349609375,
0.005664825439453125,
-0.0166168212890625,
0.049163818359375,
-0.025848388671875,
-0.0259552001953125,
0.0286865234375,
0.02880859375,
0.038726806640625,
0.0212860107421875,
-0.08367919921875,
0.01947021484375,
0.006145477294921875,
-0.046600341796875,
0.032196044921875,
0.037261962890625,
0.0280609130859375,
0.05657958984375,
0.02471923828125,
0.0239105224609375,
0.01544952392578125,
-0.0277862548828125,
0.0560302734375,
-0.049102783203125,
-0.033660888671875,
-0.061309814453125,
0.04022216796875,
-0.030670166015625,
-0.03924560546875,
0.05523681640625,
0.041259765625,
0.02752685546875,
0.0019006729125976562,
0.050079345703125,
-0.04010009765625,
0.03704833984375,
-0.0193634033203125,
0.056549072265625,
-0.051025390625,
-0.0187835693359375,
-0.0150604248046875,
-0.04461669921875,
-0.0297088623046875,
0.061492919921875,
-0.00873565673828125,
0.0195770263671875,
0.02142333984375,
0.0496826171875,
0.00576019287109375,
-0.00952911376953125,
-0.0004324913024902344,
0.01251983642578125,
-0.0099945068359375,
0.06634521484375,
0.0380859375,
-0.057220458984375,
0.004322052001953125,
-0.036102294921875,
-0.0217742919921875,
-0.0267333984375,
-0.056365966796875,
-0.08721923828125,
-0.04974365234375,
-0.039398193359375,
-0.05078125,
-0.0179901123046875,
0.088134765625,
0.06109619140625,
-0.04534912109375,
-0.01125335693359375,
0.01123809814453125,
0.005741119384765625,
-0.0113525390625,
-0.0161285400390625,
0.0396728515625,
0.0073699951171875,
-0.07452392578125,
-0.0311737060546875,
0.01053619384765625,
0.046234130859375,
0.02874755859375,
-0.0369873046875,
-0.01739501953125,
-0.004566192626953125,
0.02508544921875,
0.06427001953125,
-0.059539794921875,
-0.020111083984375,
0.0017423629760742188,
-0.036651611328125,
0.009246826171875,
0.0218658447265625,
-0.033843994140625,
-0.00804901123046875,
0.03607177734375,
0.0297088623046875,
0.055511474609375,
0.005924224853515625,
0.01239776611328125,
-0.03448486328125,
0.04132080078125,
-0.0011434555053710938,
0.0250396728515625,
0.0168304443359375,
-0.021575927734375,
0.0570068359375,
0.040374755859375,
-0.029388427734375,
-0.07708740234375,
-0.01288604736328125,
-0.09771728515625,
-0.004924774169921875,
0.04864501953125,
-0.006134033203125,
-0.032440185546875,
0.032318115234375,
-0.033538818359375,
0.03955078125,
-0.01514434814453125,
0.019927978515625,
0.01806640625,
-0.0265045166015625,
-0.02734375,
-0.04229736328125,
0.04541015625,
0.029144287109375,
-0.0521240234375,
-0.0308074951171875,
-0.0010929107666015625,
0.0234375,
0.01448822021484375,
0.055511474609375,
-0.0292510986328125,
0.01027679443359375,
-0.0084228515625,
0.0194244384765625,
-0.001708984375,
0.01190948486328125,
-0.0233001708984375,
-0.00901031494140625,
-0.01739501953125,
-0.04803466796875
]
] |
nlptown/bert-base-multilingual-uncased-sentiment | 2023-07-27T18:14:29.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"text-classification",
"en",
"nl",
"de",
"fr",
"it",
"es",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | nlptown | null | null | nlptown/bert-base-multilingual-uncased-sentiment | 196 | 1,572,780 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
- nl
- de
- fr
- it
- es
license: mit
---
# bert-base-multilingual-uncased-sentiment
This is a bert-base-multilingual-uncased model fine-tuned for sentiment analysis on product reviews in six languages: English, Dutch, German, French, Spanish, and Italian. It predicts the sentiment of a review as a number of stars (between 1 and 5).
This model is intended for direct use as a sentiment analysis model for product reviews in any of the six languages above, or for further fine-tuning on related sentiment analysis tasks.
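For example, predictions can be obtained with the `transformers` pipeline API. The sketch below keeps the model call inside a helper so that defining it does not trigger a download (requires the `transformers` package; the model weights are fetched on first use):

```python
def stars_from_label(label: str) -> int:
    """Parse a pipeline label such as '4 stars' (or '1 star') into an integer."""
    return int(label.split()[0])

def predict_stars(texts):
    """Return a star rating (1-5) for each review text.

    Requires `transformers` and downloads the model on first use.
    """
    from transformers import pipeline
    classifier = pipeline(
        "text-classification",
        model="nlptown/bert-base-multilingual-uncased-sentiment",
    )
    return [stars_from_label(result["label"]) for result in classifier(texts)]
```

Calling `predict_stars(["Great product!", "Très décevant"])` would return one integer per review.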
## Training data
Here is the number of product reviews we used for fine-tuning the model:
| Language | Number of reviews |
| -------- | ----------------- |
| English | 150k |
| Dutch | 80k |
| German | 137k |
| French | 140k |
| Italian | 72k |
| Spanish | 50k |
## Accuracy
The fine-tuned model obtained the following accuracy on 5,000 held-out product reviews in each of the languages:
- Accuracy (exact) is the exact match for the number of stars.
- Accuracy (off-by-1) is the percentage of reviews where the predicted number of stars differs by at most 1 from the rating given by the human reviewer.
| Language | Accuracy (exact) | Accuracy (off-by-1) |
| -------- | ---------------- | ------------------- |
| English  | 67%              | 95%                 |
| Dutch    | 57%              | 93%                 |
| German   | 61%              | 94%                 |
| French   | 59%              | 94%                 |
| Italian  | 59%              | 95%                 |
| Spanish  | 58%              | 95%                 |
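Both accuracy figures can be computed in a few lines given per-review predictions and human ratings; a minimal sketch:

```python
def accuracy_metrics(predicted, gold):
    """Exact-match and off-by-1 accuracy for star predictions (1-5)."""
    pairs = list(zip(predicted, gold))
    exact = sum(p == g for p, g in pairs) / len(pairs)
    off_by_one = sum(abs(p - g) <= 1 for p, g in pairs) / len(pairs)
    return exact, off_by_one

print(accuracy_metrics([5, 4, 1, 3], [5, 3, 3, 3]))  # -> (0.5, 0.75)
```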
## Contact
If you found this model useful, you can buy me a coffee at https://www.buymeacoffee.com/yvespeirsman.
In addition to this model, [NLP Town](http://nlp.town) offers custom models for many languages and NLP tasks.
Feel free to contact me with questions, feedback, or requests for similar models.
[
-0.044036865234375,
-0.04083251953125,
0.01678466796875,
0.058868408203125,
-0.0267791748046875,
-0.00809478759765625,
-0.0283966064453125,
-0.047821044921875,
0.029754638671875,
0.03662109375,
-0.050933837890625,
-0.058868408203125,
-0.04205322265625,
0.0030498504638671875,
0.0008034706115722656,
0.0958251953125,
0.01351165771484375,
0.060333251953125,
0.00232696533203125,
-0.01171112060546875,
-0.0162811279296875,
-0.0673828125,
-0.0313720703125,
-0.0325927734375,
0.040924072265625,
0.0167083740234375,
0.0506591796875,
-0.0246734619140625,
0.036407470703125,
0.01508331298828125,
-0.0110321044921875,
-0.00591278076171875,
-0.01403045654296875,
-0.00574493408203125,
0.003597259521484375,
-0.0223388671875,
-0.04278564453125,
-0.0126495361328125,
0.019012451171875,
0.06597900390625,
0.0083465576171875,
0.00848388671875,
0.00783538818359375,
0.056793212890625,
-0.02044677734375,
0.033935546875,
-0.02655029296875,
0.0087432861328125,
0.0224151611328125,
0.028167724609375,
-0.035919189453125,
-0.034881591796875,
0.016845703125,
-0.0199432373046875,
0.00909423828125,
-0.01739501953125,
0.0723876953125,
-0.01593017578125,
-0.020965576171875,
-0.03814697265625,
-0.054443359375,
0.0748291015625,
-0.07122802734375,
0.032073974609375,
0.0175933837890625,
0.0184478759765625,
-0.0014286041259765625,
-0.019256591796875,
-0.0435791015625,
-0.0207366943359375,
-0.01525115966796875,
0.026397705078125,
-0.009185791015625,
-0.0006785392761230469,
-0.0015306472778320312,
0.02923583984375,
-0.0276336669921875,
-0.011199951171875,
-0.027374267578125,
0.0036830902099609375,
0.060699462890625,
-0.01678466796875,
-0.0006999969482421875,
-0.0294647216796875,
-0.041259765625,
-0.0182647705078125,
-0.03094482421875,
0.0257720947265625,
0.038665771484375,
0.039215087890625,
-0.0135955810546875,
0.034149169921875,
-0.0185089111328125,
0.0440673828125,
-0.00913238525390625,
0.0037136077880859375,
0.04736328125,
-0.01482391357421875,
-0.0307159423828125,
-0.0064697265625,
0.07781982421875,
0.039306640625,
0.03515625,
0.004528045654296875,
-0.043304443359375,
0.01271820068359375,
0.0028209686279296875,
-0.046539306640625,
-0.0152587890625,
0.0244598388671875,
-0.045166015625,
-0.021942138671875,
0.0107574462890625,
-0.036376953125,
-0.008697509765625,
-0.00676727294921875,
0.0333251953125,
-0.031341552734375,
-0.0221405029296875,
0.0305328369140625,
-0.00008511543273925781,
0.00969696044921875,
0.01119232177734375,
-0.06390380859375,
0.0266265869140625,
0.019439697265625,
0.032745361328125,
-0.0016279220581054688,
-0.014251708984375,
0.02197265625,
-0.02606201171875,
-0.02978515625,
0.055023193359375,
-0.0258941650390625,
-0.046478271484375,
0.02587890625,
0.0148773193359375,
0.00933837890625,
-0.0181427001953125,
0.061065673828125,
-0.041259765625,
0.034881591796875,
-0.040191650390625,
-0.0298309326171875,
-0.054840087890625,
0.033447265625,
-0.056488037109375,
0.078857421875,
0.00200653076171875,
-0.040679931640625,
0.0242919921875,
-0.052398681640625,
-0.033966064453125,
-0.016632080078125,
0.007740020751953125,
-0.02630615234375,
0.021575927734375,
0.0283966064453125,
0.037322998046875,
-0.001575469970703125,
0.0159759521484375,
-0.017547607421875,
-0.003734588623046875,
0.01143646240234375,
-0.02520751953125,
0.083740234375,
0.0273895263671875,
-0.0139923095703125,
0.0132598876953125,
-0.0633544921875,
0.01265716552734375,
0.005870819091796875,
-0.03814697265625,
-0.01432037353515625,
-0.003238677978515625,
0.043060302734375,
0.027130126953125,
0.03857421875,
-0.06292724609375,
0.003704071044921875,
-0.0472412109375,
0.01076507568359375,
0.0264739990234375,
0.004901885986328125,
0.0245361328125,
-0.02178955078125,
0.0335693359375,
0.01279449462890625,
0.0262603759765625,
0.0024738311767578125,
-0.050323486328125,
-0.0775146484375,
-0.008819580078125,
0.037384033203125,
0.04412841796875,
-0.036224365234375,
0.06390380859375,
-0.0102996826171875,
-0.0362548828125,
-0.045928955078125,
0.0017690658569335938,
0.031097412109375,
0.0242919921875,
0.020721435546875,
-0.032440185546875,
-0.0352783203125,
-0.09027099609375,
0.0032901763916015625,
-0.0166778564453125,
0.0014753341674804688,
0.0186920166015625,
0.037261962890625,
-0.034698486328125,
0.051422119140625,
-0.01503753662109375,
-0.035614013671875,
-0.037139892578125,
0.00989532470703125,
0.061004638671875,
0.029205322265625,
0.0706787109375,
-0.039276123046875,
-0.057403564453125,
0.01128387451171875,
-0.052215576171875,
0.00748443603515625,
0.0025920867919921875,
-0.0005998611450195312,
0.04132080078125,
-0.003787994384765625,
-0.05126953125,
0.0165252685546875,
0.046722412109375,
-0.0089111328125,
0.03277587890625,
-0.0225067138671875,
0.011444091796875,
-0.09063720703125,
-0.00962066650390625,
0.00229644775390625,
-0.00266265869140625,
-0.035247802734375,
0.00811767578125,
0.00499725341796875,
-0.00249481201171875,
-0.036529541015625,
0.036834716796875,
-0.029754638671875,
0.0005106925964355469,
-0.0011854171752929688,
-0.022247314453125,
0.0196075439453125,
0.06292724609375,
0.0221405029296875,
0.02337646484375,
0.037841796875,
-0.0238037109375,
0.02459716796875,
0.026031494140625,
-0.0640869140625,
0.0252838134765625,
-0.058013916015625,
-0.0099334716796875,
-0.004344940185546875,
0.007411956787109375,
-0.09014892578125,
-0.0006394386291503906,
-0.0013599395751953125,
-0.048431396484375,
0.0103912353515625,
0.003620147705078125,
-0.054840087890625,
-0.02740478515625,
-0.015106201171875,
-0.006336212158203125,
0.0293731689453125,
-0.03985595703125,
0.04339599609375,
0.01885986328125,
-0.0220489501953125,
-0.0545654296875,
-0.06927490234375,
-0.00989532470703125,
0.007114410400390625,
-0.055267333984375,
0.006557464599609375,
-0.02130126953125,
-0.01389312744140625,
-0.00830078125,
0.0061798095703125,
-0.016387939453125,
-0.016937255859375,
0.006275177001953125,
0.0273284912109375,
-0.006717681884765625,
0.0125732421875,
-0.0033359527587890625,
0.0055084228515625,
0.0086517333984375,
-0.00368499755859375,
0.05169677734375,
-0.03131103515625,
-0.0015172958374023438,
-0.01910400390625,
0.048797607421875,
0.0521240234375,
-0.006099700927734375,
0.04595947265625,
0.0421142578125,
-0.037322998046875,
-0.01175689697265625,
-0.04400634765625,
-0.015625,
-0.0284423828125,
0.037841796875,
-0.04119873046875,
-0.035888671875,
0.0694580078125,
0.040740966796875,
0.02093505859375,
0.048583984375,
0.061126708984375,
-0.0276031494140625,
0.0999755859375,
0.0623779296875,
-0.028900146484375,
0.022979736328125,
-0.023284912109375,
0.028289794921875,
-0.0232696533203125,
-0.0297698974609375,
-0.032806396484375,
-0.03216552734375,
-0.03961181640625,
-0.0011663436889648438,
0.0234527587890625,
0.005096435546875,
-0.0306243896484375,
0.020721435546875,
-0.03619384765625,
0.0243377685546875,
0.05938720703125,
0.01441192626953125,
0.0205230712890625,
0.0279541015625,
-0.037628173828125,
-0.0169677734375,
-0.04632568359375,
-0.04010009765625,
0.065673828125,
0.03851318359375,
0.05859375,
0.0146636962890625,
0.040191650390625,
0.0225677490234375,
0.004730224609375,
-0.06634521484375,
0.036529541015625,
-0.0305633544921875,
-0.0704345703125,
-0.004703521728515625,
-0.005641937255859375,
-0.04791259765625,
0.01175689697265625,
-0.03167724609375,
-0.031524658203125,
0.026458740234375,
-0.00402069091796875,
-0.0302886962890625,
0.021270751953125,
-0.06719970703125,
0.061065673828125,
-0.045257568359375,
-0.004947662353515625,
-0.01378631591796875,
-0.040191650390625,
0.01611328125,
0.001361846923828125,
0.029052734375,
-0.0160064697265625,
0.0311279296875,
0.048492431640625,
-0.0285491943359375,
0.08148193359375,
-0.012664794921875,
-0.01523590087890625,
0.01328277587890625,
0.016021728515625,
0.02447509765625,
0.01103973388671875,
-0.006099700927734375,
0.040557861328125,
0.00093841552734375,
-0.02447509765625,
-0.0214996337890625,
0.0679931640625,
-0.08197021484375,
-0.0254364013671875,
-0.041473388671875,
-0.0413818359375,
-0.034149169921875,
0.02978515625,
0.038360595703125,
0.00638580322265625,
-0.0225677490234375,
0.01397705078125,
0.0394287109375,
-0.0277557373046875,
0.04046630859375,
0.0494384765625,
-0.0264129638671875,
-0.03692626953125,
0.052734375,
-0.00122833251953125,
0.00598907470703125,
0.0294189453125,
0.009552001953125,
-0.040191650390625,
-0.0226287841796875,
-0.0288848876953125,
0.01183319091796875,
-0.0626220703125,
-0.01007080078125,
-0.05657958984375,
-0.0161285400390625,
-0.0255584716796875,
-0.007770538330078125,
-0.0362548828125,
-0.0458984375,
-0.0030975341796875,
-0.016815185546875,
0.0294342041015625,
0.07073974609375,
0.00374603271484375,
0.034332275390625,
-0.053314208984375,
-0.005550384521484375,
0.01477813720703125,
0.040252685546875,
-0.01177215576171875,
-0.03350830078125,
-0.0203704833984375,
0.00897979736328125,
-0.007076263427734375,
-0.07122802734375,
0.05975341796875,
0.002105712890625,
0.029693603515625,
0.032623291015625,
-0.0017690658569335938,
0.037384033203125,
-0.037322998046875,
0.070556640625,
0.03338623046875,
-0.056549072265625,
0.0271148681640625,
-0.032501220703125,
0.0194854736328125,
0.040679931640625,
0.04217529296875,
-0.036346435546875,
-0.015838623046875,
-0.04266357421875,
-0.062103271484375,
0.032470703125,
0.001071929931640625,
0.040679931640625,
-0.01024627685546875,
-0.016021728515625,
0.019927978515625,
0.035858154296875,
-0.09637451171875,
-0.0186309814453125,
-0.024932861328125,
-0.006649017333984375,
-0.0113067626953125,
-0.0439453125,
-0.0089263916015625,
-0.024932861328125,
0.073974609375,
0.00905609130859375,
0.028350830078125,
0.0013933181762695312,
0.00214385986328125,
-0.007732391357421875,
0.0263214111328125,
0.0357666015625,
0.037109375,
-0.044708251953125,
-0.017303466796875,
0.008636474609375,
-0.0163116455078125,
-0.019805908203125,
0.003490447998046875,
-0.0140838623046875,
0.038665771484375,
0.0190887451171875,
0.06109619140625,
0.016845703125,
-0.03338623046875,
0.0341796875,
-0.005306243896484375,
-0.0250701904296875,
-0.052459716796875,
-0.0286712646484375,
0.003719329833984375,
0.01369476318359375,
0.01800537109375,
0.01515960693359375,
0.00771331787109375,
-0.0347900390625,
0.007694244384765625,
0.04730224609375,
-0.053466796875,
-0.0251922607421875,
0.035186767578125,
0.031463623046875,
-0.0051422119140625,
0.04541015625,
-0.0076141357421875,
-0.04656982421875,
0.03741455078125,
0.0285186767578125,
0.07177734375,
-0.004840850830078125,
0.019927978515625,
0.04241943359375,
0.039154052734375,
0.0010976791381835938,
0.05389404296875,
0.016876220703125,
-0.0703125,
-0.0212249755859375,
-0.0703125,
-0.02813720703125,
0.01296234130859375,
-0.0523681640625,
0.0180816650390625,
-0.03314208984375,
-0.0262298583984375,
0.0110321044921875,
0.0108795166015625,
-0.037567138671875,
0.036468505859375,
0.018798828125,
0.07794189453125,
-0.07366943359375,
0.07611083984375,
0.056976318359375,
-0.047027587890625,
-0.0450439453125,
-0.032470703125,
-0.02288818359375,
-0.05810546875,
0.051116943359375,
0.0088958740234375,
-0.0036754608154296875,
-0.027252197265625,
-0.030364990234375,
-0.051544189453125,
0.0299530029296875,
0.005886077880859375,
-0.04803466796875,
0.0116119384765625,
0.00588226318359375,
0.059814453125,
-0.0460205078125,
0.01837158203125,
0.0284576416015625,
0.0240020751953125,
0.0005631446838378906,
-0.06365966796875,
-0.038330078125,
-0.041748046875,
0.0015010833740234375,
-0.00988006591796875,
-0.04583740234375,
0.07293701171875,
-0.01114654541015625,
0.01425933837890625,
0.00894927978515625,
0.033538818359375,
0.0084075927734375,
-0.015655517578125,
0.032958984375,
0.045196533203125,
0.0341796875,
-0.0162353515625,
0.080810546875,
-0.0457763671875,
0.047454833984375,
0.068603515625,
-0.0214080810546875,
0.07562255859375,
0.0245819091796875,
-0.0205230712890625,
0.059661865234375,
0.061065673828125,
-0.0170745849609375,
0.043060302734375,
-0.01131439208984375,
-0.0202178955078125,
-0.0229644775390625,
-0.005218505859375,
-0.0226287841796875,
0.01739501953125,
0.0212249755859375,
-0.04205322265625,
-0.01099395751953125,
-0.004795074462890625,
0.00824737548828125,
-0.005382537841796875,
-0.0275726318359375,
0.044586181640625,
-0.0026798248291015625,
-0.046844482421875,
0.0440673828125,
0.0168914794921875,
0.053955078125,
-0.06201171875,
0.00952911376953125,
-0.021636962890625,
0.027679443359375,
-0.017791748046875,
-0.0709228515625,
0.01491546630859375,
-0.004802703857421875,
-0.04327392578125,
-0.0259246826171875,
0.0275421142578125,
-0.037567138671875,
-0.0885009765625,
0.03765869140625,
0.042938232421875,
0.00698089599609375,
0.003406524658203125,
-0.0714111328125,
0.00292205810546875,
0.0278472900390625,
-0.03228759765625,
-0.002361297607421875,
0.0114288330078125,
-0.00942230224609375,
0.0335693359375,
0.042694091796875,
0.0130615234375,
-0.001068115234375,
0.035247802734375,
0.04071044921875,
-0.0440673828125,
-0.05133056640625,
-0.039794921875,
0.0439453125,
-0.0016794204711914062,
-0.0258331298828125,
0.058013916015625,
0.048431396484375,
0.07122802734375,
-0.052001953125,
0.07781982421875,
0.0015535354614257812,
0.049713134765625,
-0.01525115966796875,
0.0677490234375,
-0.036590576171875,
0.012542724609375,
-0.0160064697265625,
-0.07659912109375,
-0.03448486328125,
0.0684814453125,
-0.021514892578125,
0.0166015625,
0.03997802734375,
0.049163818359375,
0.01262664794921875,
-0.0003552436828613281,
0.033355712890625,
0.030914306640625,
-0.0008993148803710938,
0.034698486328125,
0.038970947265625,
-0.032958984375,
0.032958984375,
-0.03704833984375,
-0.01192474365234375,
0.0033721923828125,
-0.054656982421875,
-0.088134765625,
-0.026763916015625,
-0.01198577880859375,
-0.033172607421875,
-0.007354736328125,
0.064208984375,
0.04730224609375,
-0.10418701171875,
-0.043853759765625,
0.01047515869140625,
-0.006206512451171875,
-0.01216888427734375,
-0.0159149169921875,
0.022003173828125,
-0.044097900390625,
-0.07867431640625,
0.0128173828125,
-0.00007134675979614258,
0.0024585723876953125,
-0.031890869140625,
0.0012693405151367188,
-0.01100921630859375,
0.019744873046875,
0.057403564453125,
0.00016605854034423828,
-0.0595703125,
-0.01016998291015625,
0.0189971923828125,
-0.01238250732421875,
0.0135345458984375,
0.03253173828125,
-0.0295257568359375,
0.053375244140625,
0.0404052734375,
0.01416015625,
0.025177001953125,
-0.023712158203125,
0.0498046875,
-0.07293701171875,
0.0268096923828125,
0.0274200439453125,
0.051849365234375,
0.0250396728515625,
-0.01763916015625,
0.022186279296875,
0.0012617111206054688,
-0.03289794921875,
-0.038848876953125,
0.0196533203125,
-0.0826416015625,
-0.02825927734375,
0.0858154296875,
-0.00043082237243652344,
-0.01090240478515625,
0.00616455078125,
-0.0191192626953125,
-0.01557159423828125,
-0.043365478515625,
0.070068359375,
0.07958984375,
-0.01065826416015625,
0.0005249977111816406,
-0.03973388671875,
0.03411865234375,
0.03802490234375,
-0.0232391357421875,
-0.01503753662109375,
0.04241943359375,
0.020233154296875,
0.02679443359375,
0.027130126953125,
-0.0199432373046875,
0.02520751953125,
-0.0218963623046875,
0.054718017578125,
0.0261993408203125,
-0.0223541259765625,
-0.0236663818359375,
0.00771331787109375,
0.00623321533203125,
-0.0255584716796875
]
] |
bert-large-cased | 2023-04-06T13:41:58.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1810.04805",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | bert-large-cased | 13 | 1,569,927 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# BERT large model (cased)
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is cased: it makes a difference
between english and English.
Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by
the Hugging Face team.
## Model description
BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it
was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
This model has the following configuration:
- 24-layer
- 1024 hidden dimensions
- 16 attention heads
- 336M parameters.
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation, you should look at a model like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-large-cased')
>>> unmasker("Hello I'm a [MASK] model.")
[
{
"sequence":"[CLS] Hello I'm a male model. [SEP]",
"score":0.22748498618602753,
"token":2581,
"token_str":"male"
},
{
"sequence":"[CLS] Hello I'm a fashion model. [SEP]",
"score":0.09146175533533096,
"token":4633,
"token_str":"fashion"
},
{
"sequence":"[CLS] Hello I'm a new model. [SEP]",
"score":0.05823173746466637,
"token":1207,
"token_str":"new"
},
{
"sequence":"[CLS] Hello I'm a super model. [SEP]",
"score":0.04488750174641609,
"token":7688,
"token_str":"super"
},
{
"sequence":"[CLS] Hello I'm a famous model. [SEP]",
"score":0.03271442651748657,
"token":2505,
"token_str":"famous"
}
]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('bert-large-cased')
model = BertModel.from_pretrained("bert-large-cased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('bert-large-cased')
model = TFBertModel.from_pretrained("bert-large-cased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-large-cased')
>>> unmasker("The man worked as a [MASK].")
[
{
"sequence":"[CLS] The man worked as a doctor. [SEP]",
"score":0.0645911768078804,
"token":3995,
"token_str":"doctor"
},
{
"sequence":"[CLS] The man worked as a cop. [SEP]",
"score":0.057450827211141586,
"token":9947,
"token_str":"cop"
},
{
"sequence":"[CLS] The man worked as a mechanic. [SEP]",
"score":0.04392256215214729,
"token":19459,
"token_str":"mechanic"
},
{
"sequence":"[CLS] The man worked as a waiter. [SEP]",
"score":0.03755280375480652,
"token":17989,
"token_str":"waiter"
},
{
"sequence":"[CLS] The man worked as a teacher. [SEP]",
"score":0.03458863124251366,
"token":3218,
"token_str":"teacher"
}
]
>>> unmasker("The woman worked as a [MASK].")
[
{
"sequence":"[CLS] The woman worked as a nurse. [SEP]",
"score":0.2572779953479767,
"token":7439,
"token_str":"nurse"
},
{
"sequence":"[CLS] The woman worked as a waitress. [SEP]",
"score":0.16706500947475433,
"token":15098,
"token_str":"waitress"
},
{
"sequence":"[CLS] The woman worked as a teacher. [SEP]",
"score":0.04587847739458084,
"token":3218,
"token_str":"teacher"
},
{
"sequence":"[CLS] The woman worked as a secretary. [SEP]",
"score":0.03577028587460518,
"token":4848,
"token_str":"secretary"
},
{
"sequence":"[CLS] The woman worked as a maid. [SEP]",
"score":0.03298963978886604,
"token":13487,
"token_str":"maid"
}
]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).
## Training procedure
### Preprocessing
The texts are tokenized (without lowercasing, since this is a cased model) using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; otherwise, sentence B is a random sentence from the corpus. Note that what is considered a sentence here is a
consecutive span of text, usually longer than a single sentence. The only constraint is that the two
"sentences" have a combined length of less than 512 tokens.
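For illustration, the sentence-pair sampling for NSP could be sketched as follows. This is a simplified assumption of how such pairs are assembled; the function name and signature are hypothetical, not the original preprocessing code:

```python
import random

def make_nsp_pair(spans, index, rng):
    """Build a (sentence_a, sentence_b, is_next) pair for NSP.

    `spans` holds consecutive text spans from the corpus; sentence A is
    spans[index]. With probability 0.5, sentence B is the next span
    (is_next=True); otherwise it is a random span from the corpus.
    Hypothetical sketch, not the original BERT preprocessing code.
    """
    sentence_a = spans[index]
    if index + 1 < len(spans) and rng.random() < 0.5:
        return sentence_a, spans[index + 1], True   # consecutive pair
    return sentence_a, spans[rng.randrange(len(spans))], False  # random pair
```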
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
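The 15% / 80-10-10 masking procedure above can be sketched in a few lines of Python. This is an illustrative simplification (the real pipeline works on WordPiece token ids and caps the number of predictions per sequence), with hypothetical names:

```python
import random

def mask_tokens(tokens, vocab, rng, mask_rate=0.15):
    """BERT-style masking sketch: each token is selected with probability
    `mask_rate`; of the selected tokens, 80% become "[MASK]", 10% become
    a random vocabulary token, and 10% are left unchanged.
    Returns (masked_tokens, labels); labels is None for positions
    that contribute no loss."""
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            labels.append(tok)                    # model must predict the original
            r = rng.random()
            if r < 0.8:
                masked.append("[MASK]")           # 80%: mask token
            elif r < 0.9:
                masked.append(rng.choice(vocab))  # 10%: random replacement
            else:
                masked.append(tok)                # 10%: left as is
        else:
            labels.append(None)                   # not selected: ignored by loss
            masked.append(tok)
    return masked, labels
```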
### Pretraining
The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size
of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
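The warmup-then-linear-decay schedule described above can be written as a small function. This is a sketch of the schedule's shape only, not the original TensorFlow optimizer code:

```python
def bert_lr(step, base_lr=1e-4, warmup_steps=10_000, total_steps=1_000_000):
    """Linear warmup to `base_lr` over `warmup_steps`, then linear decay
    to 0 at `total_steps` -- the schedule used for BERT pretraining."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```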
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
Model | SQUAD 1.1 F1/EM | Multi NLI Accuracy
---------------------------------------- | :-------------: | :----------------:
BERT-Large, Cased (Original) | 91.5/84.8 | 86.09
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1810-04805,
author = {Jacob Devlin and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
Understanding},
journal = {CoRR},
volume = {abs/1810.04805},
year = {2018},
url = {http://arxiv.org/abs/1810.04805},
archivePrefix = {arXiv},
eprint = {1810.04805},
timestamp = {Tue, 30 Oct 2018 20:39:56 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 9,214 | [
[
-0.0111846923828125,
-0.046142578125,
0.0198516845703125,
0.0201873779296875,
-0.042022705078125,
0.0015964508056640625,
-0.0030803680419921875,
-0.01270294189453125,
0.031280517578125,
0.03924560546875,
-0.041778564453125,
-0.030548095703125,
-0.061279296875,
0.00738525390625,
-0.04486083984375,
0.0845947265625,
0.0174713134765625,
0.029022216796875,
0.006458282470703125,
0.0142974853515625,
-0.03350830078125,
-0.059234619140625,
-0.0625,
-0.01910400390625,
0.03533935546875,
0.026763916015625,
0.042388916015625,
0.040313720703125,
0.037841796875,
0.0296173095703125,
-0.004344940185546875,
-0.00848388671875,
-0.0258941650390625,
0.004817962646484375,
-0.005039215087890625,
-0.0435791015625,
-0.027557373046875,
0.0126190185546875,
0.046966552734375,
0.059478759765625,
-0.0008091926574707031,
0.022247314453125,
-0.01001739501953125,
0.043701171875,
-0.01464080810546875,
0.021148681640625,
-0.03607177734375,
0.01380157470703125,
-0.0213165283203125,
0.00829315185546875,
-0.0288238525390625,
-0.0189971923828125,
0.0140380859375,
-0.041107177734375,
0.0212860107421875,
0.01494598388671875,
0.08331298828125,
0.006610870361328125,
-0.01715087890625,
-0.0084381103515625,
-0.03790283203125,
0.05731201171875,
-0.051605224609375,
0.0113067626953125,
0.032623291015625,
0.0243377685546875,
-0.01480865478515625,
-0.077880859375,
-0.03509521484375,
-0.0020999908447265625,
-0.005779266357421875,
-0.000004887580871582031,
0.0020351409912109375,
-0.0080413818359375,
0.0284881591796875,
0.030303955078125,
-0.02276611328125,
0.0013217926025390625,
-0.0560302734375,
-0.024566650390625,
0.050689697265625,
0.01114654541015625,
0.01215362548828125,
-0.0260009765625,
-0.026153564453125,
-0.02264404296875,
-0.0208282470703125,
0.00490570068359375,
0.0408935546875,
0.0308837890625,
-0.01346588134765625,
0.0540771484375,
-0.01392364501953125,
0.04522705078125,
0.0010099411010742188,
0.0015583038330078125,
0.032257080078125,
-0.005680084228515625,
-0.0297393798828125,
0.001922607421875,
0.07147216796875,
0.017974853515625,
0.03399658203125,
-0.003948211669921875,
-0.0237884521484375,
-0.0007920265197753906,
0.0266571044921875,
-0.048583984375,
-0.0203857421875,
0.00811767578125,
-0.039794921875,
-0.035675048828125,
0.03839111328125,
-0.049530029296875,
-0.00833892822265625,
-0.00836944580078125,
0.04583740234375,
-0.02532958984375,
-0.006481170654296875,
0.01099395751953125,
-0.0377197265625,
0.01462554931640625,
0.00547027587890625,
-0.065673828125,
0.0151519775390625,
0.05096435546875,
0.0633544921875,
0.0218505859375,
-0.01334381103515625,
-0.0310821533203125,
-0.014556884765625,
-0.0289459228515625,
0.03375244140625,
-0.024810791015625,
-0.0390625,
0.0019006729125976562,
0.022247314453125,
-0.00623321533203125,
-0.0177154541015625,
0.0552978515625,
-0.040557861328125,
0.040496826171875,
-0.005199432373046875,
-0.046051025390625,
-0.01861572265625,
0.00113677978515625,
-0.054718017578125,
0.0863037109375,
0.0222930908203125,
-0.04913330078125,
0.0266571044921875,
-0.0693359375,
-0.046600341796875,
0.015655517578125,
0.00724029541015625,
-0.03424072265625,
0.0177459716796875,
0.005336761474609375,
0.035980224609375,
-0.006443023681640625,
0.026824951171875,
-0.01462554931640625,
-0.034332275390625,
0.031585693359375,
-0.0193023681640625,
0.07598876953125,
0.01611328125,
-0.020904541015625,
0.01140594482421875,
-0.059417724609375,
-0.0014772415161132812,
0.0182342529296875,
-0.027557373046875,
-0.01438140869140625,
-0.00974273681640625,
0.022369384765625,
0.01361083984375,
0.0300140380859375,
-0.05023193359375,
0.0234375,
-0.04010009765625,
0.048919677734375,
0.0650634765625,
-0.005687713623046875,
0.0190277099609375,
-0.0303802490234375,
0.03729248046875,
-0.003551483154296875,
0.00030612945556640625,
-0.0133819580078125,
-0.054718017578125,
-0.05804443359375,
-0.026123046875,
0.047576904296875,
0.049774169921875,
-0.035980224609375,
0.0555419921875,
-0.00550079345703125,
-0.042388916015625,
-0.045166015625,
-0.011077880859375,
0.0223236083984375,
0.035980224609375,
0.023162841796875,
-0.03375244140625,
-0.06439208984375,
-0.06451416015625,
-0.01922607421875,
-0.008331298828125,
-0.0198974609375,
0.007572174072265625,
0.055206298828125,
-0.0183868408203125,
0.0618896484375,
-0.059051513671875,
-0.032989501953125,
-0.01184844970703125,
0.0187225341796875,
0.04534912109375,
0.052886962890625,
0.0281219482421875,
-0.046417236328125,
-0.028350830078125,
-0.030792236328125,
-0.0433349609375,
0.0029964447021484375,
-0.000591278076171875,
-0.0186614990234375,
0.01276397705078125,
0.045745849609375,
-0.054931640625,
0.0394287109375,
0.020355224609375,
-0.042572021484375,
0.051666259765625,
-0.028564453125,
-0.0026683807373046875,
-0.0963134765625,
0.0137481689453125,
-0.010894775390625,
-0.02410888671875,
-0.052337646484375,
0.002269744873046875,
-0.0098114013671875,
-0.007671356201171875,
-0.041839599609375,
0.039581298828125,
-0.03173828125,
-0.00402069091796875,
-0.00025272369384765625,
-0.009735107421875,
0.0012445449829101562,
0.029937744140625,
0.0012302398681640625,
0.041351318359375,
0.042083740234375,
-0.04217529296875,
0.041595458984375,
0.03179931640625,
-0.044525146484375,
0.0196990966796875,
-0.06005859375,
0.018310546875,
0.0038299560546875,
0.004512786865234375,
-0.0830078125,
-0.0261688232421875,
0.01538848876953125,
-0.04150390625,
0.0182342529296875,
-0.0031337738037109375,
-0.05963134765625,
-0.05059814453125,
-0.0167083740234375,
0.033233642578125,
0.0419921875,
-0.0229339599609375,
0.03338623046875,
0.02447509765625,
-0.007122039794921875,
-0.043121337890625,
-0.054046630859375,
0.0122222900390625,
-0.0120849609375,
-0.039703369140625,
0.028564453125,
-0.002811431884765625,
-0.00861358642578125,
-0.016448974609375,
0.008544921875,
-0.0102691650390625,
0.005748748779296875,
0.0174102783203125,
0.03594970703125,
-0.01282501220703125,
-0.00231170654296875,
-0.01055145263671875,
-0.009429931640625,
0.02386474609375,
-0.0126495361328125,
0.0633544921875,
0.0009365081787109375,
-0.0087738037109375,
-0.025115966796875,
0.0295257568359375,
0.050201416015625,
-0.00395965576171875,
0.05816650390625,
0.062164306640625,
-0.043975830078125,
0.0043182373046875,
-0.0272979736328125,
-0.0177001953125,
-0.0389404296875,
0.039764404296875,
-0.034942626953125,
-0.063720703125,
0.054443359375,
0.023773193359375,
-0.0124664306640625,
0.054718017578125,
0.0435791015625,
-0.01131439208984375,
0.076904296875,
0.035247802734375,
-0.00966644287109375,
0.03790283203125,
-0.01122283935546875,
0.0238800048828125,
-0.056182861328125,
-0.035125732421875,
-0.033966064453125,
-0.02496337890625,
-0.0364990234375,
-0.01453399658203125,
0.0188140869140625,
0.014007568359375,
-0.035858154296875,
0.0391845703125,
-0.050048828125,
0.02740478515625,
0.0760498046875,
0.023834228515625,
-0.013763427734375,
-0.0163726806640625,
-0.0139923095703125,
0.0028324127197265625,
-0.033294677734375,
-0.025787353515625,
0.08673095703125,
0.040496826171875,
0.0482177734375,
0.00783538818359375,
0.051727294921875,
0.0243072509765625,
-0.0000922083854675293,
-0.05145263671875,
0.050262451171875,
-0.027740478515625,
-0.06951904296875,
-0.030853271484375,
-0.00917816162109375,
-0.07879638671875,
0.00940704345703125,
-0.0229644775390625,
-0.06341552734375,
-0.0039825439453125,
-0.01357269287109375,
-0.025238037109375,
0.01215362548828125,
-0.05731201171875,
0.0784912109375,
-0.0205841064453125,
-0.005802154541015625,
0.004055023193359375,
-0.0706787109375,
0.0202789306640625,
-0.0024242401123046875,
0.0090179443359375,
-0.0047149658203125,
0.0171966552734375,
0.0833740234375,
-0.0455322265625,
0.07354736328125,
-0.01861572265625,
0.01617431640625,
0.004543304443359375,
-0.0049285888671875,
0.0251617431640625,
0.00140380859375,
0.006664276123046875,
0.0211181640625,
0.004268646240234375,
-0.035400390625,
-0.00897216796875,
0.0291290283203125,
-0.0594482421875,
-0.040069580078125,
-0.04833984375,
-0.0452880859375,
0.01116943359375,
0.033935546875,
0.04351806640625,
0.03802490234375,
-0.0132293701171875,
0.020355224609375,
0.0360107421875,
-0.0194854736328125,
0.058685302734375,
0.0298919677734375,
-0.0195770263671875,
-0.035614013671875,
0.045501708984375,
0.003444671630859375,
0.0031299591064453125,
0.037445068359375,
0.0185699462890625,
-0.044586181640625,
-0.010040283203125,
-0.0265350341796875,
0.0120697021484375,
-0.042938232421875,
-0.023406982421875,
-0.042388916015625,
-0.036529541015625,
-0.047271728515625,
-0.00759124755859375,
-0.01093292236328125,
-0.035400390625,
-0.050811767578125,
-0.01465606689453125,
0.03515625,
0.0479736328125,
-0.006076812744140625,
0.037200927734375,
-0.0560302734375,
0.02008056640625,
0.024688720703125,
0.031951904296875,
-0.0183563232421875,
-0.06219482421875,
-0.022247314453125,
-0.003692626953125,
-0.0090179443359375,
-0.0640869140625,
0.046630859375,
0.0176239013671875,
0.034454345703125,
0.04296875,
-0.002246856689453125,
0.0460205078125,
-0.0478515625,
0.0714111328125,
0.0168304443359375,
-0.0859375,
0.042388916015625,
-0.0236358642578125,
0.020172119140625,
0.0250701904296875,
0.015045166015625,
-0.041900634765625,
-0.0296630859375,
-0.06689453125,
-0.07318115234375,
0.0614013671875,
0.00992584228515625,
0.026947021484375,
-0.005252838134765625,
0.0221099853515625,
0.006694793701171875,
0.0311431884765625,
-0.06646728515625,
-0.039764404296875,
-0.033203125,
-0.0252685546875,
-0.015625,
-0.0232391357421875,
-0.005584716796875,
-0.042236328125,
0.051544189453125,
0.01171112060546875,
0.04278564453125,
0.007045745849609375,
-0.007747650146484375,
0.0084228515625,
0.015289306640625,
0.06640625,
0.034942626953125,
-0.035675048828125,
0.00010418891906738281,
-0.00048089027404785156,
-0.049346923828125,
0.0027637481689453125,
0.0179290771484375,
0.0009341239929199219,
0.0184478759765625,
0.039703369140625,
0.061492919921875,
0.0189971923828125,
-0.037109375,
0.0469970703125,
0.00917816162109375,
-0.0229949951171875,
-0.047454833984375,
0.00749969482421875,
-0.0059814453125,
0.0097808837890625,
0.042633056640625,
0.00949859619140625,
0.0063018798828125,
-0.039642333984375,
0.03057861328125,
0.0287017822265625,
-0.03704833984375,
-0.01467132568359375,
0.0697021484375,
0.006290435791015625,
-0.052581787109375,
0.0579833984375,
-0.0165863037109375,
-0.057159423828125,
0.05731201171875,
0.05059814453125,
0.07000732421875,
-0.018524169921875,
0.017333984375,
0.034423828125,
0.02398681640625,
-0.02410888671875,
0.0289306640625,
0.0274200439453125,
-0.061187744140625,
-0.026580810546875,
-0.057708740234375,
-0.013427734375,
0.01538848876953125,
-0.061370849609375,
0.02569580078125,
-0.038787841796875,
-0.01873779296875,
0.0158538818359375,
-0.0014743804931640625,
-0.0513916015625,
0.03533935546875,
-0.0004627704620361328,
0.07965087890625,
-0.07476806640625,
0.07421875,
0.0577392578125,
-0.04473876953125,
-0.07025146484375,
-0.031341552734375,
-0.022186279296875,
-0.08306884765625,
0.05731201171875,
0.0245819091796875,
0.027496337890625,
0.0027904510498046875,
-0.043792724609375,
-0.050811767578125,
0.06341552734375,
0.012420654296875,
-0.041656494140625,
-0.00853729248046875,
0.006137847900390625,
0.044097900390625,
-0.041259765625,
0.030120849609375,
0.0404052734375,
0.031768798828125,
-0.00321197509765625,
-0.06036376953125,
0.0065155029296875,
-0.0274200439453125,
-0.00010329484939575195,
0.013031005859375,
-0.03546142578125,
0.08807373046875,
-0.0107421875,
0.004856109619140625,
0.0164031982421875,
0.0394287109375,
-0.0011234283447265625,
0.01038360595703125,
0.035980224609375,
0.046600341796875,
0.055084228515625,
-0.02520751953125,
0.0643310546875,
-0.018829345703125,
0.040008544921875,
0.06317138671875,
0.004878997802734375,
0.0609130859375,
0.03179931640625,
-0.020904541015625,
0.0675048828125,
0.06298828125,
-0.0291290283203125,
0.052337646484375,
0.0191192626953125,
-0.00836181640625,
-0.004528045654296875,
0.01070404052734375,
-0.0229644775390625,
0.037841796875,
0.016357421875,
-0.048980712890625,
0.00974273681640625,
-0.0016021728515625,
0.01389312744140625,
-0.0132904052734375,
-0.03765869140625,
0.052490234375,
0.01263427734375,
-0.052825927734375,
0.0192108154296875,
0.0161285400390625,
0.04547119140625,
-0.039642333984375,
0.0016307830810546875,
-0.00670623779296875,
0.0163116455078125,
-0.0067138671875,
-0.0633544921875,
0.01444244384765625,
-0.0149078369140625,
-0.03302001953125,
-0.01494598388671875,
0.049163818359375,
-0.03338623046875,
-0.05352783203125,
-0.0015459060668945312,
0.0179901123046875,
0.0252685546875,
-0.01025390625,
-0.05596923828125,
-0.0175628662109375,
-0.0008320808410644531,
-0.00804901123046875,
0.0133514404296875,
0.025604248046875,
0.0046234130859375,
0.039764404296875,
0.057708740234375,
-0.007411956787109375,
0.005397796630859375,
0.008514404296875,
0.051300048828125,
-0.0738525390625,
-0.06707763671875,
-0.0753173828125,
0.03961181640625,
-0.011260986328125,
-0.04229736328125,
0.045440673828125,
0.056732177734375,
0.052520751953125,
-0.033203125,
0.03863525390625,
-0.01202392578125,
0.0445556640625,
-0.03070068359375,
0.05767822265625,
-0.0249786376953125,
0.00183868408203125,
-0.0294189453125,
-0.063720703125,
-0.0219879150390625,
0.0634765625,
-0.006496429443359375,
0.0035457611083984375,
0.055694580078125,
0.0447998046875,
0.007625579833984375,
-0.005462646484375,
0.0158538818359375,
0.012939453125,
0.007678985595703125,
0.0302581787109375,
0.041839599609375,
-0.047607421875,
0.0309600830078125,
-0.01033782958984375,
-0.005802154541015625,
-0.030487060546875,
-0.06658935546875,
-0.0762939453125,
-0.046875,
-0.01715087890625,
-0.046417236328125,
-0.013275146484375,
0.068359375,
0.057891845703125,
-0.06878662109375,
-0.018035888671875,
-0.007671356201171875,
0.007293701171875,
-0.018524169921875,
-0.022918701171875,
0.0367431640625,
-0.0143280029296875,
-0.0592041015625,
0.0140838623046875,
-0.004322052001953125,
0.0080108642578125,
-0.01561737060546875,
0.00380706787109375,
-0.02716064453125,
0.0062103271484375,
0.040496826171875,
0.0074310302734375,
-0.0548095703125,
-0.041900634765625,
0.002918243408203125,
-0.0157470703125,
0.005687713623046875,
0.04071044921875,
-0.04180908203125,
0.0284881591796875,
0.02825927734375,
0.02740478515625,
0.053955078125,
0.0084381103515625,
0.049713134765625,
-0.08251953125,
0.0209503173828125,
0.0147705078125,
0.0362548828125,
0.024688720703125,
-0.03326416015625,
0.037139892578125,
0.038970947265625,
-0.03802490234375,
-0.0650634765625,
-0.0007605552673339844,
-0.07708740234375,
-0.021514892578125,
0.06658935546875,
-0.0084381103515625,
-0.02032470703125,
-0.005397796630859375,
-0.0280914306640625,
0.0300445556640625,
-0.03277587890625,
0.057861328125,
0.066162109375,
0.005863189697265625,
-0.00957489013671875,
-0.031463623046875,
0.0293426513671875,
0.030120849609375,
-0.035736083984375,
-0.038818359375,
0.00963592529296875,
0.0338134765625,
0.0182342529296875,
0.0411376953125,
-0.0034313201904296875,
0.01381683349609375,
0.012481689453125,
0.0170440673828125,
-0.005680084228515625,
-0.00969696044921875,
-0.0210418701171875,
0.014129638671875,
-0.01056671142578125,
-0.054901123046875
]
] |
openai/clip-vit-large-patch14-336 | 2022-10-04T09:41:39.000Z | [
"transformers",
"pytorch",
"tf",
"clip",
"zero-shot-image-classification",
"generated_from_keras_callback",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | openai | null | null | openai/clip-vit-large-patch14-336 | 59 | 1,534,763 | transformers | 2022-04-22T14:57:43 | ---
tags:
- generated_from_keras_callback
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
model-index:
- name: clip-vit-large-patch14-336
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# clip-vit-large-patch14-336
This model was trained from scratch on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: None
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.21.3
- TensorFlow 2.8.2
- Tokenizers 0.12.1
| 1,045 | [
[
-0.038055419921875,
-0.040740966796875,
0.03253173828125,
0.0024471282958984375,
-0.04364013671875,
-0.0309600830078125,
0.00014650821685791016,
-0.02325439453125,
0.01114654541015625,
0.037322998046875,
-0.04949951171875,
-0.036285400390625,
-0.0634765625,
-0.01788330078125,
-0.03228759765625,
0.08062744140625,
0.0151214599609375,
0.03460693359375,
-0.0134124755859375,
-0.006259918212890625,
-0.03485107421875,
-0.047271728515625,
-0.07293701171875,
-0.046905517578125,
0.032318115234375,
0.0128936767578125,
0.051605224609375,
0.08489990234375,
0.05584716796875,
0.029266357421875,
-0.0188140869140625,
-0.017974853515625,
-0.040802001953125,
-0.04364013671875,
-0.0090484619140625,
-0.0338134765625,
-0.049713134765625,
0.00397491455078125,
0.06475830078125,
0.0254058837890625,
-0.0187530517578125,
0.03753662109375,
-0.0085296630859375,
0.0179595947265625,
-0.045745849609375,
0.0146331787109375,
-0.046295166015625,
0.04290771484375,
-0.01226806640625,
-0.0102996826171875,
-0.0172119140625,
-0.01306915283203125,
0.003917694091796875,
-0.045989990234375,
0.0394287109375,
-0.015289306640625,
0.09124755859375,
0.021514892578125,
-0.005649566650390625,
-0.0005154609680175781,
-0.06298828125,
0.041168212890625,
-0.055633544921875,
0.0159759521484375,
0.0298614501953125,
0.050048828125,
0.007526397705078125,
-0.0797119140625,
-0.03314208984375,
-0.005641937255859375,
0.019683837890625,
0.00821685791015625,
-0.024688720703125,
-0.0079803466796875,
0.04888916015625,
0.03961181640625,
-0.0028514862060546875,
0.026885986328125,
-0.061798095703125,
-0.017791748046875,
0.049591064453125,
0.037384033203125,
-0.02020263671875,
-0.0206756591796875,
-0.03802490234375,
-0.037384033203125,
-0.0208740234375,
0.00009524822235107422,
0.052642822265625,
0.01195526123046875,
-0.0172576904296875,
0.06549072265625,
-0.01049041748046875,
0.03961181640625,
0.002285003662109375,
0.00896453857421875,
0.034271240234375,
0.01336669921875,
-0.04522705078125,
-0.0018358230590820312,
0.0672607421875,
0.0426025390625,
0.00665283203125,
0.00783538818359375,
-0.0355224609375,
-0.0135498046875,
0.03668212890625,
-0.05279541015625,
-0.022186279296875,
-0.0006365776062011719,
-0.05841064453125,
-0.07635498046875,
0.005474090576171875,
-0.061248779296875,
-0.010345458984375,
-0.033843994140625,
0.058837890625,
-0.00786590576171875,
-0.0141448974609375,
0.007110595703125,
-0.0225067138671875,
0.0125732421875,
0.00775146484375,
-0.03631591796875,
0.0179290771484375,
0.04266357421875,
0.0282135009765625,
0.006412506103515625,
-0.0201416015625,
-0.00977325439453125,
-0.00670623779296875,
-0.020751953125,
0.039306640625,
-0.0236663818359375,
-0.0367431640625,
-0.01493072509765625,
0.02716064453125,
0.0040740966796875,
-0.031494140625,
0.07183837890625,
-0.0250396728515625,
0.01593017578125,
-0.032440185546875,
-0.050811767578125,
-0.0279998779296875,
0.0166473388671875,
-0.06201171875,
0.0745849609375,
-0.00830841064453125,
-0.053253173828125,
0.036041259765625,
-0.06488037109375,
-0.00591278076171875,
0.00988006591796875,
0.00418853759765625,
-0.059417724609375,
0.018890380859375,
0.0006237030029296875,
0.040283203125,
-0.0211181640625,
0.00533294677734375,
-0.029266357421875,
-0.03729248046875,
-0.006488800048828125,
-0.04638671875,
0.04119873046875,
0.01910400390625,
-0.032073974609375,
0.006565093994140625,
-0.08685302734375,
0.016143798828125,
0.0350341796875,
-0.02947998046875,
0.00800323486328125,
-0.02496337890625,
0.0292205810546875,
0.01108551025390625,
0.0240936279296875,
-0.059051513671875,
0.006694793701171875,
-0.0140838623046875,
0.02288818359375,
0.0654296875,
0.01276397705078125,
-0.01071929931640625,
-0.0192718505859375,
0.0206146240234375,
0.01959228515625,
0.0211639404296875,
0.0092315673828125,
-0.033233642578125,
-0.068115234375,
-0.00757598876953125,
0.0380859375,
0.01476287841796875,
-0.006328582763671875,
0.048828125,
0.0020694732666015625,
-0.0726318359375,
-0.0251007080078125,
0.002262115478515625,
0.0290374755859375,
0.039154052734375,
0.015594482421875,
-0.01303863525390625,
-0.0406494140625,
-0.08795166015625,
0.01910400390625,
0.006229400634765625,
0.0129852294921875,
0.0224761962890625,
0.049072265625,
-0.018524169921875,
0.042236328125,
-0.05596923828125,
0.0019006729125976562,
-0.0013856887817382812,
0.0034809112548828125,
0.022705078125,
0.06378173828125,
0.026885986328125,
-0.03765869140625,
0.0146026611328125,
-0.0139923095703125,
-0.05169677734375,
0.031402587890625,
-0.004108428955078125,
-0.0304718017578125,
-0.02410888671875,
0.03887939453125,
-0.045562744140625,
0.050506591796875,
0.0118255615234375,
-0.0107574462890625,
0.0421142578125,
-0.046112060546875,
-0.01523590087890625,
-0.085205078125,
0.033966064453125,
0.0118255615234375,
-0.0006747245788574219,
-0.0140533447265625,
0.01129913330078125,
0.00453948974609375,
-0.031280517578125,
-0.031463623046875,
0.038604736328125,
0.004581451416015625,
-0.0029315948486328125,
-0.024383544921875,
-0.032867431640625,
0.004131317138671875,
0.051361083984375,
0.0245513916015625,
0.02349853515625,
0.0284881591796875,
-0.055206298828125,
0.04150390625,
0.03155517578125,
-0.03179931640625,
0.03668212890625,
-0.07763671875,
0.0216827392578125,
-0.021881103515625,
-0.019683837890625,
-0.049591064453125,
-0.027984619140625,
0.036834716796875,
-0.01470947265625,
0.01336669921875,
-0.008056640625,
-0.0298614501953125,
-0.046875,
0.00653076171875,
0.031402587890625,
0.046295166015625,
-0.056640625,
0.02947998046875,
0.0168609619140625,
0.037078857421875,
-0.038970947265625,
-0.053619384765625,
-0.0258026123046875,
-0.0162506103515625,
-0.00798797607421875,
0.00556182861328125,
-0.009552001953125,
0.0031890869140625,
0.0045928955078125,
0.0200347900390625,
-0.023651123046875,
-0.006557464599609375,
0.033599853515625,
0.0237579345703125,
-0.0193023681640625,
0.005859375,
0.0018329620361328125,
-0.007190704345703125,
0.0210723876953125,
0.0105133056640625,
0.03399658203125,
-0.0115814208984375,
-0.046905517578125,
-0.042999267578125,
-0.0135345458984375,
0.048187255859375,
-0.0121612548828125,
0.0357666015625,
0.05950927734375,
-0.046051025390625,
0.005847930908203125,
-0.039093017578125,
0.0014019012451171875,
-0.0311279296875,
0.053466796875,
-0.03265380859375,
-0.007274627685546875,
0.050933837890625,
0.014495849609375,
0.01275634765625,
0.0733642578125,
0.044952392578125,
0.00015425682067871094,
0.07818603515625,
0.025909423828125,
0.0027561187744140625,
0.0130767822265625,
-0.04376220703125,
-0.0103759765625,
-0.054107666015625,
-0.045501708984375,
-0.031585693359375,
-0.0251007080078125,
-0.04443359375,
-0.003299713134765625,
0.0184478759765625,
0.01050567626953125,
-0.0229949951171875,
0.043548583984375,
-0.047454833984375,
0.0321044921875,
0.056976318359375,
0.045684814453125,
-0.01160430908203125,
0.005451202392578125,
-0.02423095703125,
0.00417327880859375,
-0.05950927734375,
-0.0141448974609375,
0.1019287109375,
0.058441162109375,
0.02947998046875,
-0.035400390625,
0.036834716796875,
0.0221405029296875,
-0.00013327598571777344,
-0.036346435546875,
0.042266845703125,
0.0108642578125,
-0.05474853515625,
-0.018463134765625,
-0.01995849609375,
-0.06756591796875,
0.004581451416015625,
-0.03631591796875,
-0.01739501953125,
0.01255035400390625,
0.007965087890625,
-0.028350830078125,
0.042144775390625,
-0.039093017578125,
0.0794677734375,
-0.003993988037109375,
-0.0131072998046875,
-0.0250396728515625,
-0.02069091796875,
0.0247039794921875,
-0.0095977783203125,
-0.02386474609375,
0.0025463104248046875,
0.022216796875,
0.07305908203125,
-0.053436279296875,
0.049652099609375,
-0.036865234375,
0.020294189453125,
0.034210205078125,
-0.0244903564453125,
0.0382080078125,
-0.00040912628173828125,
-0.007450103759765625,
0.016387939453125,
-0.0027370452880859375,
-0.040008544921875,
-0.0202178955078125,
0.046112060546875,
-0.07781982421875,
-0.0113525390625,
-0.0149688720703125,
-0.037322998046875,
0.00786590576171875,
0.013031005859375,
0.0521240234375,
0.06304931640625,
-0.022918701171875,
0.0131378173828125,
0.03778076171875,
-0.006229400634765625,
0.048004150390625,
0.0171661376953125,
0.005764007568359375,
-0.04840087890625,
0.055633544921875,
-0.004558563232421875,
0.0026149749755859375,
-0.013580322265625,
0.0179290771484375,
-0.0290374755859375,
-0.0239410400390625,
-0.0361328125,
0.01152801513671875,
-0.0711669921875,
-0.01910400390625,
-0.0175018310546875,
-0.04522705078125,
-0.0164947509765625,
0.013763427734375,
-0.037384033203125,
-0.027069091796875,
-0.0218048095703125,
-0.02227783203125,
0.013916015625,
0.07415771484375,
-0.01465606689453125,
0.050811767578125,
-0.05938720703125,
0.0026531219482421875,
0.028594970703125,
0.04010009765625,
0.0249176025390625,
-0.04705810546875,
-0.0377197265625,
0.00730133056640625,
-0.027923583984375,
-0.027099609375,
0.00896453857421875,
-0.0093231201171875,
0.06280517578125,
0.050628662109375,
-0.034423828125,
0.056640625,
-0.0236358642578125,
0.056488037109375,
0.015289306640625,
-0.0404052734375,
0.0169830322265625,
-0.0166168212890625,
0.042236328125,
0.03509521484375,
0.032684326171875,
0.002399444580078125,
-0.0028858184814453125,
-0.09918212890625,
-0.0345458984375,
0.045013427734375,
0.0218505859375,
0.02667236328125,
-0.00559234619140625,
0.037811279296875,
0.00414276123046875,
0.0231781005859375,
-0.057464599609375,
-0.039276123046875,
-0.015533447265625,
0.0009365081787109375,
-0.0025386810302734375,
-0.04046630859375,
-0.00072479248046875,
-0.048004150390625,
0.0738525390625,
0.0277862548828125,
0.03948974609375,
0.006488800048828125,
0.01360321044921875,
-0.0204925537109375,
-0.019500732421875,
0.050384521484375,
0.050628662109375,
-0.0255889892578125,
-0.0140533447265625,
0.016448974609375,
-0.0433349609375,
-0.0123443603515625,
0.00905609130859375,
-0.0070953369140625,
0.01168060302734375,
0.0166473388671875,
0.0797119140625,
0.01727294921875,
-0.0179901123046875,
0.03875732421875,
-0.004352569580078125,
-0.02520751953125,
-0.031494140625,
0.01122283935546875,
-0.00930023193359375,
0.00963592529296875,
-0.0007534027099609375,
0.05157470703125,
0.0018663406372070312,
-0.005496978759765625,
0.025421142578125,
0.0200347900390625,
-0.04010009765625,
-0.020843505859375,
0.060394287109375,
-0.0014619827270507812,
-0.028717041015625,
0.05596923828125,
-0.0004699230194091797,
-0.0240020751953125,
0.061920166015625,
0.0325927734375,
0.068115234375,
-0.019927978515625,
0.004791259765625,
0.0653076171875,
-0.00048279762268066406,
-0.014739990234375,
0.0229339599609375,
-0.00974273681640625,
-0.053375244140625,
-0.0182037353515625,
-0.05267333984375,
-0.0213165283203125,
0.053375244140625,
-0.08355712890625,
0.06182861328125,
-0.055633544921875,
-0.024444580078125,
0.0259552001953125,
-0.006954193115234375,
-0.0777587890625,
0.053436279296875,
0.0179595947265625,
0.08453369140625,
-0.0771484375,
0.05322265625,
0.040802001953125,
-0.033782958984375,
-0.055389404296875,
-0.027069091796875,
-0.040191650390625,
-0.0767822265625,
0.04400634765625,
-0.0005216598510742188,
0.0309600830078125,
0.01380157470703125,
-0.04168701171875,
-0.057647705078125,
0.0653076171875,
0.01197052001953125,
-0.033599853515625,
-0.005168914794921875,
0.029266357421875,
0.039276123046875,
-0.01503753662109375,
0.0413818359375,
0.0186004638671875,
0.027984619140625,
0.0181121826171875,
-0.06158447265625,
-0.01451873779296875,
-0.0292205810546875,
0.00453948974609375,
0.0040283203125,
-0.041351318359375,
0.05474853515625,
0.01132965087890625,
0.039703369140625,
0.01678466796875,
0.040008544921875,
0.016082763671875,
0.0024433135986328125,
0.04119873046875,
0.07989501953125,
0.047637939453125,
0.01499176025390625,
0.07928466796875,
-0.0267791748046875,
0.035369873046875,
0.08758544921875,
0.00946807861328125,
0.04632568359375,
0.0219573974609375,
-0.00856781005859375,
0.0154571533203125,
0.05810546875,
-0.06298828125,
0.039703369140625,
0.01255035400390625,
0.007671356201171875,
-0.031402587890625,
0.01262664794921875,
-0.03521728515625,
0.04742431640625,
-0.005817413330078125,
-0.074462890625,
-0.0311737060546875,
-0.006069183349609375,
-0.0107574462890625,
-0.018157958984375,
-0.049591064453125,
0.036346435546875,
-0.0227508544921875,
-0.0311737060546875,
0.031585693359375,
0.025146484375,
0.00977325439453125,
-0.038604736328125,
-0.02056884765625,
0.0192718505859375,
0.0213470458984375,
-0.006740570068359375,
-0.044342041015625,
0.0186614990234375,
-0.008026123046875,
-0.0224456787109375,
0.0131378173828125,
0.0467529296875,
-0.00664520263671875,
-0.0791015625,
-0.00958251953125,
0.0120391845703125,
0.03765869140625,
-0.0028839111328125,
-0.070068359375,
-0.004913330078125,
-0.002330780029296875,
-0.0289306640625,
0.0014190673828125,
0.0210723876953125,
0.0054931640625,
0.035491943359375,
0.043212890625,
-0.0018157958984375,
0.0110931396484375,
0.0016584396362304688,
0.06103515625,
-0.03729248046875,
-0.048980712890625,
-0.0631103515625,
0.04266357421875,
-0.01580810546875,
-0.061614990234375,
0.04852294921875,
0.08544921875,
0.06170654296875,
-0.0269775390625,
0.040802001953125,
0.0012388229370117188,
0.0298614501953125,
-0.036529541015625,
0.05029296875,
-0.021820068359375,
-0.0094146728515625,
-0.0205230712890625,
-0.07684326171875,
0.0170440673828125,
0.03448486328125,
-0.01226806640625,
0.005584716796875,
0.027069091796875,
0.046875,
-0.0274505615234375,
0.0156707763671875,
0.031494140625,
-0.002353668212890625,
-0.0114898681640625,
0.0171661376953125,
0.03607177734375,
-0.0733642578125,
0.03662109375,
-0.052886962890625,
0.0157012939453125,
0.0038318634033203125,
-0.06292724609375,
-0.07830810546875,
-0.039947509765625,
-0.036651611328125,
-0.031280517578125,
-0.013824462890625,
0.0689697265625,
0.0672607421875,
-0.05096435546875,
-0.017333984375,
-0.0127410888671875,
-0.02069091796875,
0.00039386749267578125,
-0.0118865966796875,
0.04412841796875,
-0.0001888275146484375,
-0.055084228515625,
0.006580352783203125,
-0.035186767578125,
0.0274810791015625,
-0.0203399658203125,
-0.007080078125,
0.00726318359375,
-0.013946533203125,
-0.00028443336486816406,
0.0043792724609375,
-0.026611328125,
-0.031951904296875,
-0.0219879150390625,
0.001537322998046875,
0.0293731689453125,
0.0311431884765625,
-0.04510498046875,
0.03240966796875,
0.018035888671875,
0.034820556640625,
0.06793212890625,
0.0013170242309570312,
0.044647216796875,
-0.042999267578125,
0.026397705078125,
-0.00276947021484375,
0.0325927734375,
0.00035309791564941406,
-0.0491943359375,
0.0293731689453125,
0.0269927978515625,
-0.03369140625,
-0.066162109375,
-0.019378662109375,
-0.05938720703125,
0.011627197265625,
0.0753173828125,
-0.000904083251953125,
-0.041900634765625,
0.045196533203125,
-0.00734710693359375,
0.031494140625,
-0.0018129348754882812,
0.032958984375,
0.049530029296875,
0.0143280029296875,
0.0026073455810546875,
-0.035369873046875,
0.0254669189453125,
0.02166748046875,
-0.0372314453125,
-0.0290985107421875,
0.010345458984375,
0.045745849609375,
0.004436492919921875,
0.00250244140625,
-0.0258941650390625,
0.03363037109375,
0.01554107666015625,
0.038330078125,
-0.048980712890625,
-0.0154266357421875,
-0.039154052734375,
0.0189361572265625,
0.00146484375,
-0.051727294921875
]
] |
meta-llama/Llama-2-70b-chat-hf | 2023-10-12T16:19:08.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"facebook",
"meta",
"llama-2",
"en",
"arxiv:2307.09288",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | meta-llama | null | null | meta-llama/Llama-2-70b-chat-hf | 1,561 | 1,531,804 | transformers | 2023-07-14T18:02:07 | ---
extra_gated_heading: Access Llama 2 on Hugging Face
extra_gated_description: >-
This is a form to enable access to Llama 2 on Hugging Face after you have been
granted access from Meta. Please visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads) and accept our
license terms and acceptable use policy before submitting this form. Requests
will be processed in 1-2 days.
extra_gated_prompt: "**Your Hugging Face account email address MUST match the email you provide on the Meta website, or your request will not be approved.**"
extra_gated_button_content: Submit
extra_gated_fields:
I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website: checkbox
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
---
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 70B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes (7B, 13B, and 70B) as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The largest model, 70B, uses Grouped-Query Attention (GQA) for improved inference scalability.
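The GQA scheme mentioned above shares each key/value head across a group of query heads, shrinking the KV cache. A minimal NumPy sketch of the idea (an illustration only, not Meta's implementation; the head counts and dimensions are made up):

```python
import numpy as np

def grouped_query_attention(q, k, v, n_kv_heads):
    """q: (n_heads, seq, d); k, v: (n_kv_heads, seq, d)."""
    n_heads, seq, d = q.shape
    group = n_heads // n_kv_heads
    # Each group of query heads reuses one shared K/V head, so the
    # KV cache holds n_kv_heads heads instead of n_heads.
    k = np.repeat(k, group, axis=0)
    v = np.repeat(v, group, axis=0)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # (n_heads, seq, d)

out = grouped_query_attention(
    np.ones((8, 4, 16)), np.ones((2, 4, 16)), np.ones((2, 4, 16)), n_kv_heads=2
)
print(out.shape)  # (8, 4, 16)
```

With `n_kv_heads == n_heads` this reduces to standard multi-head attention; with `n_kv_heads == 1` it becomes multi-query attention.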
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** ["Llama 2: Open Foundation and Fine-Tuned Chat Models"](https://arxiv.org/abs/2307.09288)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
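The chat-tuned variants expect prompts in the `[INST]`/`<<SYS>>` turn format. A minimal sketch of building a single-turn prompt (the helper name is ours; see the reference code in facebookresearch/llama for the full multi-turn logic):

```python
def build_llama2_prompt(system_prompt: str, user_message: str) -> str:
    # Single-turn Llama-2-Chat format: the system prompt sits inside a
    # <<SYS>> block within the first [INST] instruction.
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_llama2_prompt(
    "You are a helpful, respectful and honest assistant.",
    "What is grouped-query attention?",
)
print(prompt)
```

The model's completion follows the closing `[/INST]`; for multi-turn chat, prior turns are appended as additional `[INST] ... [/INST] answer` pairs.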
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta's sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted (tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
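The per-model figures in the table are consistent with emissions ≈ GPU-hours × TDP × grid carbon intensity. A quick sanity check, assuming an intensity of about 0.423 kgCO<sub>2</sub>eq/kWh (the value implied by the table, not a number stated in this card):

```python
# Rough consistency check of the carbon table: tCO2eq ~ GPU-hours * TDP(kW) * intensity.
INTENSITY_KG_PER_KWH = 0.423  # implied by the table; an assumption, not stated in the card
TDP_KW = 0.400                # 400 W per A100-80GB, from the table

rows = {"7B": (184320, 31.22), "13B": (368640, 62.44), "70B": (1720320, 291.42)}
for name, (gpu_hours, reported_t) in rows.items():
    est_t = gpu_hours * TDP_KW * INTENSITY_KG_PER_KWH / 1000  # kg -> tonnes
    print(f"{name}: estimated {est_t:.2f} t vs reported {reported_t} t")
```

Note that the Total row (3,311,616 GPU hours, 539 tCO<sub>2</sub>eq) exceeds the sum of the three listed rows because it covers the full Llama 2 family, including the 34B model that was trained but not released.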
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide)
## Reporting Issues
Please report any software "bug" or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/meta-llama/Llama-2-7b) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/meta-llama/Llama-2-13b) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/meta-llama/Llama-2-70b) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)|
[
-0.01375579833984375,
-0.052825927734375,
0.027191162109375,
0.0132904052734375,
-0.0277862548828125,
0.0153961181640625,
-0.0003552436828613281,
-0.059814453125,
0.00212860107421875,
0.026336669921875,
-0.0491943359375,
-0.042572021484375,
-0.0504150390625,
0.0033206939697265625,
-0.0172271728515625,
0.07977294921875,
-0.00212860107421875,
-0.019073486328125,
-0.0109100341796875,
0.004955291748046875,
-0.040130615234375,
-0.03118896484375,
-0.0386962890625,
-0.03253173828125,
0.030914306640625,
0.03759765625,
0.0458984375,
0.0462646484375,
0.042083740234375,
0.0171661376953125,
-0.01922607421875,
0.015716552734375,
-0.0518798828125,
-0.0210723876953125,
0.007274627685546875,
-0.04052734375,
-0.052581787109375,
0.0138702392578125,
0.0233917236328125,
0.013671875,
-0.022186279296875,
0.0400390625,
0.0035858154296875,
0.0362548828125,
-0.04241943359375,
0.0130615234375,
-0.05291748046875,
0.001819610595703125,
-0.0166473388671875,
-0.005939483642578125,
-0.01690673828125,
-0.021636962890625,
-0.01503753662109375,
-0.060638427734375,
-0.01172637939453125,
0.007415771484375,
0.07672119140625,
0.048980712890625,
-0.03643798828125,
-0.00856781005859375,
-0.02252197265625,
0.07244873046875,
-0.06427001953125,
0.004119873046875,
0.044830322265625,
0.021697998046875,
-0.0155029296875,
-0.055572509765625,
-0.049224853515625,
-0.01155853271484375,
0.00428009033203125,
0.0269317626953125,
-0.033966064453125,
-0.0007948875427246094,
0.01117706298828125,
0.0247955322265625,
-0.04437255859375,
0.045135498046875,
-0.038818359375,
-0.011932373046875,
0.07861328125,
0.0183563232421875,
0.0013408660888671875,
-0.00292205810546875,
-0.03912353515625,
-0.0219879150390625,
-0.0625,
0.013702392578125,
0.036041259765625,
-0.006317138671875,
-0.0338134765625,
0.04925537109375,
-0.0264739990234375,
0.01971435546875,
-0.0023479461669921875,
-0.039825439453125,
0.0330810546875,
-0.0352783203125,
-0.020172119140625,
-0.0085601806640625,
0.0634765625,
0.057861328125,
0.01355743408203125,
0.0075531005859375,
-0.005771636962890625,
0.0100860595703125,
0.0004706382751464844,
-0.05999755859375,
-0.003421783447265625,
0.016937255859375,
-0.030670166015625,
-0.04388427734375,
-0.024627685546875,
-0.056915283203125,
-0.01580810546875,
-0.007160186767578125,
0.0179443359375,
-0.0008273124694824219,
-0.0310821533203125,
0.0101776123046875,
0.004627227783203125,
0.04095458984375,
0.0165863037109375,
-0.07025146484375,
0.019866943359375,
0.0426025390625,
0.05853271484375,
-0.0204925537109375,
-0.0260162353515625,
0.0031719207763671875,
-0.000545501708984375,
-0.0254364013671875,
0.0687255859375,
-0.0257110595703125,
-0.04229736328125,
-0.017364501953125,
-0.002658843994140625,
0.013671875,
-0.0408935546875,
0.033416748046875,
-0.0292510986328125,
0.01172637939453125,
-0.02532958984375,
-0.0265350341796875,
-0.02789306640625,
0.01454925537109375,
-0.03265380859375,
0.109130859375,
0.0089874267578125,
-0.037109375,
0.0204925537109375,
-0.052337646484375,
-0.01114654541015625,
-0.0185089111328125,
0.00832366943359375,
-0.040069580078125,
-0.0200653076171875,
0.0083160400390625,
0.0251007080078125,
-0.050262451171875,
0.036590576171875,
-0.01483917236328125,
-0.032928466796875,
0.0035076141357421875,
-0.032135009765625,
0.0653076171875,
0.02178955078125,
-0.034393310546875,
0.00396728515625,
-0.061065673828125,
0.000797271728515625,
0.0352783203125,
-0.03558349609375,
0.01776123046875,
0.00835418701171875,
-0.00705718994140625,
0.01258087158203125,
0.034423828125,
-0.0258636474609375,
0.01276397705078125,
-0.0236663818359375,
0.03662109375,
0.056182861328125,
0.004901885986328125,
0.0121612548828125,
-0.04052734375,
0.03912353515625,
0.00018525123596191406,
0.0293121337890625,
0.003673553466796875,
-0.055511474609375,
-0.0765380859375,
-0.01178741455078125,
-0.0028820037841796875,
0.0655517578125,
-0.0169830322265625,
0.048736572265625,
-0.0011358261108398438,
-0.055023193359375,
-0.0303955078125,
0.02508544921875,
0.0498046875,
0.039215087890625,
0.0330810546875,
-0.0213470458984375,
-0.045745849609375,
-0.0760498046875,
0.0018281936645507812,
-0.0364990234375,
0.00005054473876953125,
0.0285491943359375,
0.05029296875,
-0.0257720947265625,
0.05517578125,
-0.04052734375,
-0.01284027099609375,
-0.0200347900390625,
-0.008575439453125,
0.002582550048828125,
0.02630615234375,
0.050079345703125,
-0.030181884765625,
-0.01324462890625,
-0.00984954833984375,
-0.06793212890625,
-0.00679779052734375,
0.0088348388671875,
-0.0152740478515625,
0.0200653076171875,
0.02178955078125,
-0.045074462890625,
0.034332275390625,
0.052520751953125,
-0.015228271484375,
0.039215087890625,
0.0003266334533691406,
-0.01314544677734375,
-0.07958984375,
0.004425048828125,
-0.01503753662109375,
0.003360748291015625,
-0.034515380859375,
-0.0016431808471679688,
-0.0157623291015625,
0.00490570068359375,
-0.046539306640625,
0.047088623046875,
-0.0232391357421875,
-0.01503753662109375,
-0.01041412353515625,
0.003787994384765625,
0.006229400634765625,
0.04595947265625,
-0.01068878173828125,
0.08251953125,
0.02728271484375,
-0.043182373046875,
0.018585205078125,
0.029754638671875,
-0.03924560546875,
0.011688232421875,
-0.067138671875,
0.026153564453125,
0.006862640380859375,
0.040618896484375,
-0.07293701171875,
-0.0258331298828125,
0.0243682861328125,
-0.031890869140625,
0.00598907470703125,
0.018402099609375,
-0.043060302734375,
-0.0294189453125,
-0.032073974609375,
0.0242156982421875,
0.061920166015625,
-0.0338134765625,
0.014068603515625,
0.03192138671875,
0.0014791488647460938,
-0.05157470703125,
-0.06658935546875,
0.0035800933837890625,
-0.0296630859375,
-0.04071044921875,
0.0214691162109375,
-0.01422119140625,
-0.021636962890625,
-0.02020263671875,
0.003955841064453125,
-0.003162384033203125,
0.032989501953125,
0.0283660888671875,
0.0303955078125,
-0.0088653564453125,
-0.00030350685119628906,
0.01076507568359375,
-0.0159759521484375,
0.00506591796875,
0.0178985595703125,
0.040679931640625,
-0.0095977783203125,
-0.0173797607421875,
-0.053924560546875,
0.006587982177734375,
0.0251007080078125,
-0.0203399658203125,
0.044830322265625,
0.031463623046875,
-0.0180206298828125,
0.0202484130859375,
-0.05694580078125,
-0.009765625,
-0.038970947265625,
0.04132080078125,
-0.01457977294921875,
-0.06170654296875,
0.040618896484375,
-0.002353668212890625,
0.03265380859375,
0.05535888671875,
0.046234130859375,
-0.006259918212890625,
0.062744140625,
0.04522705078125,
-0.006977081298828125,
0.022796630859375,
-0.036895751953125,
-0.00432586669921875,
-0.07403564453125,
-0.049285888671875,
-0.0249786376953125,
-0.0338134765625,
-0.049896240234375,
-0.033416748046875,
0.022705078125,
0.0171356201171875,
-0.050872802734375,
0.0222320556640625,
-0.04217529296875,
0.04095458984375,
0.03863525390625,
0.010894775390625,
0.0240020751953125,
0.00705718994140625,
0.01065826416015625,
0.0033359527587890625,
-0.037567138671875,
-0.056732177734375,
0.1097412109375,
0.035400390625,
0.03570556640625,
0.0117645263671875,
0.046234130859375,
0.01366424560546875,
0.0228729248046875,
-0.0546875,
0.049560546875,
0.004302978515625,
-0.055267333984375,
-0.0103912353515625,
-0.0054931640625,
-0.06634521484375,
0.01062774658203125,
-0.0140533447265625,
-0.060546875,
0.00396728515625,
-0.0028781890869140625,
-0.02740478515625,
0.0218963623046875,
-0.0506591796875,
0.04315185546875,
-0.040008544921875,
-0.02313232421875,
-0.0281524658203125,
-0.0601806640625,
0.05084228515625,
-0.01328277587890625,
0.00676727294921875,
-0.0380859375,
-0.021240234375,
0.06689453125,
-0.0239410400390625,
0.07708740234375,
-0.004146575927734375,
-0.00862884521484375,
0.043853759765625,
-0.01204681396484375,
0.038330078125,
0.0033397674560546875,
-0.0218353271484375,
0.049835205078125,
-0.0102691650390625,
-0.023223876953125,
-0.0113983154296875,
0.04119873046875,
-0.09222412109375,
-0.05999755859375,
-0.03924560546875,
-0.038909912109375,
-0.0009431838989257812,
0.004077911376953125,
0.035888671875,
-0.006618499755859375,
-0.0030231475830078125,
0.007476806640625,
0.035888671875,
-0.038421630859375,
0.03472900390625,
0.044097900390625,
-0.006977081298828125,
-0.03021240234375,
0.04840087890625,
0.005584716796875,
0.02593994140625,
0.0176239013671875,
0.004390716552734375,
-0.0321044921875,
-0.03277587890625,
-0.037506103515625,
0.020050048828125,
-0.035736083984375,
-0.036834716796875,
-0.041046142578125,
-0.02520751953125,
-0.0240020751953125,
-0.0038928985595703125,
-0.031829833984375,
-0.0335693359375,
-0.0582275390625,
-0.0303955078125,
0.040985107421875,
0.060546875,
-0.00018668174743652344,
0.044921875,
-0.0234222412109375,
0.0128173828125,
0.028228759765625,
0.01335906982421875,
-0.002750396728515625,
-0.054840087890625,
0.0066986083984375,
0.00894927978515625,
-0.056671142578125,
-0.0482177734375,
0.0182647705078125,
0.01849365234375,
0.035064697265625,
0.03289794921875,
-0.005950927734375,
0.055999755859375,
-0.0261688232421875,
0.08551025390625,
0.026885986328125,
-0.048614501953125,
0.05169677734375,
-0.01641845703125,
0.003570556640625,
0.04736328125,
0.02069091796875,
-0.006694793701171875,
-0.00992584228515625,
-0.04541015625,
-0.050689697265625,
0.05889892578125,
0.0159149169921875,
0.01348876953125,
0.00434112548828125,
0.0333251953125,
0.0036830902099609375,
0.007198333740234375,
-0.062347412109375,
-0.02484130859375,
-0.0189666748046875,
-0.00508880615234375,
-0.01364898681640625,
-0.0433349609375,
-0.00682830810546875,
-0.02197265625,
0.045501708984375,
0.005123138427734375,
0.0281219482421875,
-0.01068115234375,
0.0007004737854003906,
-0.0086517333984375,
0.00567626953125,
0.05712890625,
0.03857421875,
-0.0185546875,
-0.0087890625,
0.04852294921875,
-0.0477294921875,
0.0255279541015625,
-0.0024433135986328125,
-0.0092926025390625,
-0.0283660888671875,
0.0302734375,
0.0657958984375,
0.0223236083984375,
-0.0548095703125,
0.0244140625,
0.01197052001953125,
-0.0264739990234375,
-0.032135009765625,
0.0270233154296875,
0.0064849853515625,
0.0249786376953125,
0.018463134765625,
-0.0084686279296875,
0.0106353759765625,
-0.0404052734375,
-0.010528564453125,
0.0279083251953125,
0.00875091552734375,
-0.0311737060546875,
0.07427978515625,
0.023895263671875,
-0.0215301513671875,
0.039093017578125,
-0.0096435546875,
-0.0261688232421875,
0.067626953125,
0.049957275390625,
0.045867919921875,
-0.022796630859375,
0.00881195068359375,
0.053070068359375,
0.033966064453125,
-0.017822265625,
0.0187530517578125,
0.0029964447021484375,
-0.036468505859375,
-0.012969970703125,
-0.047698974609375,
-0.036346435546875,
0.025238037109375,
-0.0443115234375,
0.022674560546875,
-0.045501708984375,
-0.0196075439453125,
-0.0237579345703125,
0.035858154296875,
-0.047607421875,
0.01473236083984375,
0.0067291259765625,
0.07080078125,
-0.054901123046875,
0.05755615234375,
0.036041259765625,
-0.038330078125,
-0.06683349609375,
-0.02191162109375,
0.017364501953125,
-0.0919189453125,
0.03924560546875,
0.026031494140625,
-0.005344390869140625,
0.00830078125,
-0.05609130859375,
-0.08984375,
0.1292724609375,
0.0341796875,
-0.056304931640625,
-0.0005879402160644531,
0.0251007080078125,
0.035797119140625,
-0.006023406982421875,
0.0347900390625,
0.0628662109375,
0.03948974609375,
0.00768280029296875,
-0.0775146484375,
0.006420135498046875,
-0.0255584716796875,
-0.0006203651428222656,
-0.01526641845703125,
-0.09771728515625,
0.0625,
-0.0301361083984375,
-0.01837158203125,
0.0171661376953125,
0.04742431640625,
0.052520751953125,
0.041961669921875,
0.025848388671875,
0.059051513671875,
0.070068359375,
-0.0008764266967773438,
0.08551025390625,
-0.0235595703125,
0.0127105712890625,
0.0679931640625,
-0.0236053466796875,
0.07366943359375,
0.0189971923828125,
-0.0439453125,
0.046966552734375,
0.07403564453125,
-0.001575469970703125,
0.0465087890625,
0.005168914794921875,
-0.0124969482421875,
-0.0099334716796875,
-0.0153350830078125,
-0.04852294921875,
0.0401611328125,
0.016571044921875,
-0.01029205322265625,
-0.0021800994873046875,
-0.025238037109375,
0.0194549560546875,
-0.0236968994140625,
-0.0005488395690917969,
0.0594482421875,
0.01271820068359375,
-0.04522705078125,
0.06573486328125,
0.00579833984375,
0.060272216796875,
-0.046112060546875,
0.00455474853515625,
-0.039581298828125,
-0.0007033348083496094,
-0.0255584716796875,
-0.053436279296875,
0.007717132568359375,
0.029632568359375,
-0.0033111572265625,
-0.006381988525390625,
0.04193115234375,
0.00475311279296875,
-0.04290771484375,
0.02825927734375,
0.0193328857421875,
0.026031494140625,
0.01708984375,
-0.0521240234375,
0.0132904052734375,
0.0055694580078125,
-0.040618896484375,
0.028961181640625,
0.00011026859283447266,
-0.007602691650390625,
0.06024169921875,
0.054107666015625,
-0.0160980224609375,
0.0100555419921875,
-0.01451873779296875,
0.0750732421875,
-0.03594970703125,
-0.013641357421875,
-0.056121826171875,
0.04254150390625,
0.004955291748046875,
-0.053131103515625,
0.040924072265625,
0.048187255859375,
0.050079345703125,
0.0207977294921875,
0.04742431640625,
0.00908660888671875,
0.0256195068359375,
-0.041107177734375,
0.046142578125,
-0.05816650390625,
0.029388427734375,
0.00811004638671875,
-0.07244873046875,
-0.007232666015625,
0.050384521484375,
-0.019012451171875,
0.002429962158203125,
0.0295867919921875,
0.0657958984375,
0.01409912109375,
-0.01277923583984375,
0.0103759765625,
0.0138397216796875,
0.0270538330078125,
0.065673828125,
0.062347412109375,
-0.047698974609375,
0.049774169921875,
-0.0260772705078125,
-0.0160369873046875,
-0.0236053466796875,
-0.05438232421875,
-0.07275390625,
-0.01885986328125,
-0.0168304443359375,
-0.0116424560546875,
0.0059967041015625,
0.053436279296875,
0.037384033203125,
-0.0439453125,
-0.02264404296875,
-0.006595611572265625,
-0.0036449432373046875,
0.0026035308837890625,
-0.01187896728515625,
0.0230560302734375,
-0.0032958984375,
-0.04345703125,
0.037506103515625,
0.0018596649169921875,
0.0154266357421875,
-0.0275421142578125,
-0.0204925537109375,
-0.01403045654296875,
0.0134429931640625,
0.0462646484375,
0.0203704833984375,
-0.07159423828125,
-0.0188140869140625,
0.00461578369140625,
-0.01242828369140625,
0.0086517333984375,
0.0015306472778320312,
-0.0557861328125,
0.00772857666015625,
0.011077880859375,
0.027679443359375,
0.048004150390625,
0.0034770965576171875,
0.004467010498046875,
-0.0355224609375,
0.03436279296875,
0.0037937164306640625,
0.0106048583984375,
0.021087646484375,
-0.033966064453125,
0.062103271484375,
0.0087432861328125,
-0.0521240234375,
-0.07196044921875,
0.0102081298828125,
-0.080810546875,
-0.0015459060668945312,
0.10498046875,
-0.0016727447509765625,
-0.00832366943359375,
0.01360321044921875,
-0.0157012939453125,
0.0279693603515625,
-0.027191162109375,
0.060882568359375,
0.0411376953125,
-0.005405426025390625,
-0.0101776123046875,
-0.06182861328125,
0.0239410400390625,
0.0295562744140625,
-0.08404541015625,
-0.01904296875,
0.0338134765625,
0.03631591796875,
-0.00823974609375,
0.049224853515625,
-0.0011768341064453125,
0.01702880859375,
0.006103515625,
0.006046295166015625,
-0.0184173583984375,
-0.01324462890625,
-0.007419586181640625,
-0.021209716796875,
-0.0022678375244140625,
-0.01654052734375
]
] |
pyannote/segmentation-3.0 | 2023-10-04T18:53:59.000Z | [
"pyannote-audio",
"pytorch",
"pyannote",
"pyannote-audio-model",
"audio",
"voice",
"speech",
"speaker",
"speaker-diarization",
"speaker-change-detection",
"speaker-segmentation",
"voice-activity-detection",
"overlapped-speech-detection",
"resegmentation",
"license:mit",
"has_space",
"region:us"
] | voice-activity-detection | pyannote | null | null | pyannote/segmentation-3.0 | 25 | 1,512,995 | pyannote-audio | 2023-09-22T12:03:10 | ---
tags:
- pyannote
- pyannote-audio
- pyannote-audio-model
- audio
- voice
- speech
- speaker
- speaker-diarization
- speaker-change-detection
- speaker-segmentation
- voice-activity-detection
- overlapped-speech-detection
- resegmentation
license: mit
inference: false
extra_gated_prompt: "The collected information will help acquire a better knowledge of the pyannote.audio user base and help its maintainers improve it further. Though this model uses the MIT license and will always remain open source, we will occasionally email you about premium models and paid services around pyannote."
extra_gated_fields:
Company/university: text
Website: text
---
Using this open-source model in production?
Make the most of it thanks to our [consulting services](https://herve.niderb.fr/consulting.html).
# ๐น "Powerset" speaker segmentation
This model ingests 10 seconds of mono audio sampled at 16kHz and outputs speaker diarization as a (num_frames, num_classes) matrix where the 7 classes are _non-speech_, _speaker #1_, _speaker #2_, _speaker #3_, _speakers #1 and #2_, _speakers #1 and #3_, and _speakers #2 and #3_.

```python
# waveform (first row)
import torch
batch_size = 1
duration, sample_rate, num_channels = 10, 16000, 1
waveform = torch.randn(batch_size, num_channels, duration * sample_rate)
# powerset multi-class encoding (second row)
powerset_encoding = model(waveform)
# multi-label encoding (third row)
from pyannote.audio.utils.powerset import Powerset
max_speakers_per_chunk, max_speakers_per_frame = 3, 2
to_multilabel = Powerset(
max_speakers_per_chunk,
max_speakers_per_frame).to_multilabel
multilabel_encoding = to_multilabel(powerset_encoding)
```
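The seven classes follow directly from the powerset construction: with at most 3 speakers per chunk and at most 2 active speakers per frame, the classes are exactly the subsets of {1, 2, 3} of size at most 2. A quick plain-Python sketch (independent of pyannote) enumerates them:

```python
from itertools import combinations

def powerset_classes(max_speakers_per_chunk, max_speakers_per_frame):
    """Enumerate powerset classes: all speaker subsets up to a given size."""
    speakers = range(1, max_speakers_per_chunk + 1)
    classes = []
    for size in range(max_speakers_per_frame + 1):
        classes.extend(combinations(speakers, size))
    return classes

classes = powerset_classes(3, 2)
print(classes)
# [(), (1,), (2,), (3,), (1, 2), (1, 3), (2, 3)]  -> 7 classes
```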
The various concepts behind this model are described in detail in this [paper](https://www.isca-speech.org/archive/interspeech_2023/plaquet23_interspeech.html).
It has been trained by Sรฉverin Baroudi with [pyannote.audio](https://github.com/pyannote/pyannote-audio) `3.0.0` using the combination of the training sets of AISHELL, AliMeeting, AMI, AVA-AVD, DIHARD, Ego4D, MSDWild, REPERE, and VoxConverse.
This [companion repository](https://github.com/FrenchKrab/IS2023-powerset-diarization/) by [Alexis Plaquet](https://frenchkrab.github.io/) also provides instructions on how to train or finetune such a model on your own data.
## Requirements
1. Install [`pyannote.audio`](https://github.com/pyannote/pyannote-audio) `3.0` with `pip install pyannote.audio`
2. Accept [`pyannote/segmentation-3.0`](https://hf.co/pyannote/segmentation-3.0) user conditions
3. Create access token at [`hf.co/settings/tokens`](https://hf.co/settings/tokens).
## Usage
```python
# instantiate the model
from pyannote.audio import Model
model = Model.from_pretrained(
"pyannote/segmentation-3.0",
use_auth_token="HUGGINGFACE_ACCESS_TOKEN_GOES_HERE")
```
### Speaker diarization
This model cannot be used to perform speaker diarization of full recordings on its own (it only processes 10s chunks).
See the [pyannote/speaker-diarization-3.0](https://hf.co/pyannote/speaker-diarization-3.0) pipeline, which uses an additional speaker embedding model to perform full-recording speaker diarization.
### Voice activity detection
```python
from pyannote.audio.pipelines import VoiceActivityDetection
pipeline = VoiceActivityDetection(segmentation=model)
HYPER_PARAMETERS = {
# remove speech regions shorter than that many seconds.
"min_duration_on": 0.0,
# fill non-speech regions shorter than that many seconds.
"min_duration_off": 0.0
}
pipeline.instantiate(HYPER_PARAMETERS)
vad = pipeline("audio.wav")
# `vad` is a pyannote.core.Annotation instance containing speech regions
```
### Overlapped speech detection
```python
from pyannote.audio.pipelines import OverlappedSpeechDetection
pipeline = OverlappedSpeechDetection(segmentation=model)
HYPER_PARAMETERS = {
# remove overlapped speech regions shorter than that many seconds.
"min_duration_on": 0.0,
# fill non-overlapped speech regions shorter than that many seconds.
"min_duration_off": 0.0
}
pipeline.instantiate(HYPER_PARAMETERS)
osd = pipeline("audio.wav")
# `osd` is a pyannote.core.Annotation instance containing overlapped speech regions
```
## Citations
```bibtex
@inproceedings{Plaquet23,
author={Alexis Plaquet and Hervรฉ Bredin},
title={{Powerset multi-class cross entropy loss for neural speaker diarization}},
year=2023,
booktitle={Proc. INTERSPEECH 2023},
}
```
```bibtex
@inproceedings{Bredin23,
author={Hervรฉ Bredin},
title={{pyannote.audio 2.1 speaker diarization pipeline: principle, benchmark, and recipe}},
year=2023,
booktitle={Proc. INTERSPEECH 2023},
}
```
| 4,648 | [
[
-0.0228118896484375,
-0.045806884765625,
0.01507568359375,
0.0221710205078125,
-0.0389404296875,
-0.0167999267578125,
-0.03997802734375,
-0.0287017822265625,
0.0313720703125,
0.03741455078125,
-0.031585693359375,
-0.0428466796875,
-0.017242431640625,
-0.0239105224609375,
-0.0165557861328125,
0.06707763671875,
0.02398681640625,
0.004390716552734375,
-0.0034637451171875,
0.0022258758544921875,
-0.0159454345703125,
-0.03411865234375,
-0.0266876220703125,
-0.0267333984375,
0.00955963134765625,
0.043609619140625,
0.0133819580078125,
0.048004150390625,
0.02484130859375,
0.027069091796875,
-0.033294677734375,
0.00809478759765625,
-0.00213623046875,
-0.0020885467529296875,
0.01003265380859375,
-0.00838470458984375,
-0.041748046875,
0.01287078857421875,
0.05902099609375,
0.03997802734375,
-0.0165557861328125,
0.0192718505859375,
-0.006679534912109375,
0.00927734375,
-0.02581787109375,
0.0130615234375,
-0.048980712890625,
-0.005603790283203125,
-0.0290679931640625,
-0.007427215576171875,
-0.0304412841796875,
-0.00461578369140625,
0.0272216796875,
-0.050079345703125,
0.0130767822265625,
-0.0160980224609375,
0.08282470703125,
0.0035305023193359375,
0.0096435546875,
-0.01477813720703125,
-0.05120849609375,
0.05242919921875,
-0.0706787109375,
0.0282745361328125,
0.034454345703125,
0.01898193359375,
0.01074981689453125,
-0.069580078125,
-0.03961181640625,
-0.01503753662109375,
0.005771636962890625,
0.01654052734375,
-0.00334930419921875,
0.01305389404296875,
0.0278167724609375,
0.040679931640625,
-0.031524658203125,
-0.00701141357421875,
-0.040863037109375,
-0.03240966796875,
0.061920166015625,
-0.02069091796875,
0.0276336669921875,
-0.0302276611328125,
-0.034576416015625,
-0.027862548828125,
-0.015777587890625,
0.005218505859375,
0.045989990234375,
0.044464111328125,
-0.023345947265625,
0.03363037109375,
0.00414276123046875,
0.055816650390625,
0.01236724853515625,
-0.028045654296875,
0.049713134765625,
-0.035003662109375,
-0.02020263671875,
0.04541015625,
0.07244873046875,
0.0089569091796875,
0.019927978515625,
0.0177764892578125,
-0.0032749176025390625,
-0.0292510986328125,
-0.00455474853515625,
-0.058563232421875,
-0.05865478515625,
0.025390625,
-0.033782958984375,
0.00788116455078125,
-0.005733489990234375,
-0.053863525390625,
-0.026031494140625,
-0.01727294921875,
0.052978515625,
-0.049713134765625,
-0.049163818359375,
0.0093994140625,
-0.0230255126953125,
-0.00424957275390625,
-0.004428863525390625,
-0.08935546875,
0.01021575927734375,
0.033935546875,
0.0823974609375,
0.0227508544921875,
-0.0275726318359375,
-0.034423828125,
-0.003932952880859375,
-0.004650115966796875,
0.043182373046875,
-0.01227569580078125,
-0.02911376953125,
-0.0304718017578125,
0.0013189315795898438,
-0.0310516357421875,
-0.047607421875,
0.0518798828125,
0.00864410400390625,
-0.00036716461181640625,
-0.0004146099090576172,
-0.049774169921875,
-0.0007653236389160156,
-0.0194549560546875,
-0.033111572265625,
0.056915283203125,
0.0081024169921875,
-0.06719970703125,
0.0221710205078125,
-0.0401611328125,
-0.0154571533203125,
0.0030002593994140625,
0.0013990402221679688,
-0.07122802734375,
-0.01910400390625,
0.0166778564453125,
0.0290069580078125,
0.0008473396301269531,
0.010711669921875,
-0.0084228515625,
-0.02490234375,
0.01220703125,
-0.018096923828125,
0.082763671875,
0.007244110107421875,
-0.035186767578125,
0.0135650634765625,
-0.083984375,
-0.005306243896484375,
0.006755828857421875,
-0.040374755859375,
-0.0294189453125,
-0.005184173583984375,
0.028106689453125,
0.0006976127624511719,
0.0081787109375,
-0.0650634765625,
-0.00920867919921875,
-0.053070068359375,
0.0430908203125,
0.04498291015625,
0.0187835693359375,
0.0193328857421875,
-0.016845703125,
0.005535125732421875,
0.006710052490234375,
0.01110076904296875,
-0.0273590087890625,
-0.047760009765625,
-0.048614501953125,
-0.04486083984375,
0.034088134765625,
0.042816162109375,
-0.0196990966796875,
0.045257568359375,
-0.005954742431640625,
-0.062255859375,
-0.045074462890625,
0.00698089599609375,
0.0296173095703125,
0.03460693359375,
0.04132080078125,
-0.0285186767578125,
-0.060302734375,
-0.069580078125,
-0.0202789306640625,
-0.0247344970703125,
-0.006450653076171875,
0.0277252197265625,
0.031951904296875,
0.01087188720703125,
0.06768798828125,
-0.0168304443359375,
-0.016510009765625,
-0.004390716552734375,
0.0029735565185546875,
0.03778076171875,
0.06341552734375,
0.0277252197265625,
-0.060516357421875,
-0.035552978515625,
-0.00878143310546875,
-0.022979736328125,
-0.01026153564453125,
-0.0225830078125,
-0.004917144775390625,
-0.01287841796875,
0.031280517578125,
-0.05206298828125,
0.025787353515625,
0.01467132568359375,
-0.0135345458984375,
0.045654296875,
0.00751495361328125,
-0.007720947265625,
-0.07421875,
0.008148193359375,
0.01477813720703125,
0.00341796875,
-0.057952880859375,
-0.047393798828125,
0.004161834716796875,
-0.0169677734375,
-0.035125732421875,
0.0296630859375,
-0.036895751953125,
-0.01214599609375,
-0.0000546574592590332,
0.0265960693359375,
-0.0087738037109375,
0.044647216796875,
0.0150299072265625,
0.04779052734375,
0.045166015625,
-0.04888916015625,
0.037109375,
0.039794921875,
-0.06256103515625,
0.03765869140625,
-0.0584716796875,
0.011688232421875,
0.0252838134765625,
0.00830078125,
-0.07952880859375,
-0.0121917724609375,
0.040069580078125,
-0.05438232421875,
0.02301025390625,
-0.0325927734375,
-0.01267242431640625,
-0.0197601318359375,
-0.00835418701171875,
0.02777099609375,
0.02117919921875,
-0.05096435546875,
0.032196044921875,
0.03973388671875,
-0.026031494140625,
-0.02850341796875,
-0.06341552734375,
-0.005283355712890625,
-0.0256195068359375,
-0.058807373046875,
0.04754638671875,
-0.0052490234375,
-0.03094482421875,
-0.00823211669921875,
-0.0069580078125,
0.0012788772583007812,
-0.018798828125,
0.025482177734375,
0.005069732666015625,
-0.0183563232421875,
0.01305389404296875,
-0.0174102783203125,
0.00042176246643066406,
-0.0140380859375,
-0.04718017578125,
0.038299560546875,
0.00955963134765625,
-0.0174102783203125,
-0.0550537109375,
0.0059814453125,
0.0355224609375,
-0.05047607421875,
0.035552978515625,
0.06439208984375,
-0.0201263427734375,
-0.00939178466796875,
-0.0311737060546875,
-0.00682830810546875,
-0.032806396484375,
0.048309326171875,
-0.0176544189453125,
-0.0312347412109375,
0.034027099609375,
0.01346588134765625,
0.0178070068359375,
0.0290679931640625,
0.0404052734375,
0.011474609375,
0.051727294921875,
0.0249481201171875,
0.005214691162109375,
0.0650634765625,
-0.040679931640625,
0.0256500244140625,
-0.07647705078125,
-0.030548095703125,
-0.037933349609375,
0.005584716796875,
-0.03955078125,
-0.032012939453125,
0.02972412109375,
0.01163482666015625,
-0.00405120849609375,
0.0278472900390625,
-0.05810546875,
0.026702880859375,
0.050689697265625,
0.011993408203125,
-0.017364501953125,
0.0239105224609375,
-0.00713348388671875,
0.0020351409912109375,
-0.042633056640625,
-0.0243072509765625,
0.07489013671875,
0.04193115234375,
0.0306396484375,
-0.01372528076171875,
0.060516357421875,
0.00580596923828125,
-0.0172576904296875,
-0.06964111328125,
0.033294677734375,
-0.01145172119140625,
-0.044036865234375,
-0.043365478515625,
-0.0311279296875,
-0.05731201171875,
0.0325927734375,
0.0032329559326171875,
-0.07672119140625,
0.03607177734375,
0.002124786376953125,
-0.039154052734375,
0.044464111328125,
-0.0596923828125,
0.06585693359375,
-0.00870513916015625,
-0.0250091552734375,
-0.0011949539184570312,
-0.044647216796875,
0.0186920166015625,
0.0205841064453125,
0.01020050048828125,
-0.014801025390625,
0.038665771484375,
0.10003662109375,
-0.03564453125,
0.0521240234375,
-0.046966552734375,
0.000926971435546875,
0.053375244140625,
-0.0318603515625,
0.0144805908203125,
0.0009512901306152344,
-0.00411224365234375,
0.00984954833984375,
-0.00041031837463378906,
-0.022918701171875,
-0.00791168212890625,
0.0496826171875,
-0.06787109375,
-0.021484375,
-0.015777587890625,
-0.037567138671875,
-0.00902557373046875,
0.007785797119140625,
0.0254974365234375,
0.057830810546875,
-0.00653839111328125,
0.0096893310546875,
0.0540771484375,
-0.0282440185546875,
0.052001953125,
0.0190277099609375,
0.0025463104248046875,
-0.0679931640625,
0.07525634765625,
0.0230255126953125,
0.023773193359375,
0.01275634765625,
0.022613525390625,
-0.0249786376953125,
-0.04254150390625,
-0.033905029296875,
0.02923583984375,
-0.05279541015625,
0.007556915283203125,
-0.054718017578125,
-0.0233917236328125,
-0.053497314453125,
0.03131103515625,
-0.0504150390625,
-0.04345703125,
-0.0197601318359375,
-0.0034275054931640625,
0.0292816162109375,
0.0190277099609375,
-0.036651611328125,
0.039031982421875,
-0.03973388671875,
0.01305389404296875,
0.0364990234375,
0.01018524169921875,
-0.006496429443359375,
-0.059356689453125,
-0.04425048828125,
0.004238128662109375,
-0.018218994140625,
-0.06903076171875,
0.0231475830078125,
0.0222930908203125,
0.05706787109375,
0.0207977294921875,
-0.01535797119140625,
0.040435791015625,
-0.0160064697265625,
0.0748291015625,
0.0186004638671875,
-0.07989501953125,
0.04620361328125,
-0.0295257568359375,
0.01971435546875,
0.034088134765625,
0.007720947265625,
-0.051300048828125,
-0.0005750656127929688,
-0.051116943359375,
-0.09326171875,
0.08074951171875,
0.02410888671875,
-0.00948333740234375,
0.0142974853515625,
0.011871337890625,
-0.0127105712890625,
0.005146026611328125,
-0.045654296875,
-0.026153564453125,
-0.044036865234375,
-0.0011425018310546875,
-0.0214996337890625,
-0.0177154541015625,
-0.0013790130615234375,
-0.043487548828125,
0.08575439453125,
0.0156402587890625,
0.0341796875,
0.04718017578125,
-0.004425048828125,
-0.00843048095703125,
0.007228851318359375,
0.053131103515625,
0.038909912109375,
-0.042388916015625,
0.000408172607421875,
-0.0103607177734375,
-0.0543212890625,
0.01403045654296875,
0.0276947021484375,
-8.940696716308594e-7,
0.0355224609375,
0.031524658203125,
0.08599853515625,
0.00569915771484375,
-0.0220794677734375,
0.03466796875,
-0.003490447998046875,
-0.02459716796875,
-0.036529541015625,
-0.0102081298828125,
0.0208587646484375,
0.01568603515625,
0.0240325927734375,
-0.004364013671875,
-0.006755828857421875,
-0.0249481201171875,
0.0265655517578125,
0.0002079010009765625,
-0.026123046875,
-0.0158233642578125,
0.0426025390625,
0.018280029296875,
-0.048370361328125,
0.05621337890625,
-0.007633209228515625,
-0.037109375,
0.060333251953125,
0.0369873046875,
0.07183837890625,
-0.04827880859375,
0.014312744140625,
0.05322265625,
0.0235748291015625,
0.006275177001953125,
0.0130615234375,
-0.028961181640625,
-0.041595458984375,
-0.01483154296875,
-0.061614990234375,
-0.0242462158203125,
0.034820556640625,
-0.042205810546875,
0.01024627685546875,
-0.042449951171875,
-0.022430419921875,
0.035003662109375,
0.010589599609375,
-0.01262664794921875,
0.0179443359375,
0.0222320556640625,
0.06060791015625,
-0.07000732421875,
0.057525634765625,
0.041534423828125,
-0.036712646484375,
-0.0684814453125,
0.0004792213439941406,
-0.005443572998046875,
-0.01885986328125,
0.0177154541015625,
0.0125732421875,
0.003292083740234375,
-0.01079559326171875,
-0.0360107421875,
-0.055267333984375,
0.072265625,
0.0206146240234375,
-0.06658935546875,
0.0185546875,
-0.01129150390625,
0.0170440673828125,
-0.023834228515625,
0.0151519775390625,
0.05389404296875,
0.04583740234375,
-0.006816864013671875,
-0.0992431640625,
-0.0183258056640625,
-0.033203125,
-0.026519775390625,
0.01084136962890625,
-0.06640625,
0.096435546875,
-0.003582000732421875,
-0.00666046142578125,
0.01084136962890625,
0.044830322265625,
0.0263519287109375,
0.034942626953125,
0.05352783203125,
0.038848876953125,
0.06915283203125,
0.0024089813232421875,
0.0435791015625,
-0.0235748291015625,
0.0289306640625,
0.08148193359375,
0.006053924560546875,
0.04547119140625,
0.035186767578125,
-0.014984130859375,
0.03271484375,
0.0584716796875,
-0.013031005859375,
0.053466796875,
0.02630615234375,
-0.022796630859375,
-0.025604248046875,
-0.00830841064453125,
-0.0430908203125,
0.05419921875,
0.025604248046875,
-0.035247802734375,
0.01357269287109375,
-0.0010499954223632812,
0.0012235641479492188,
-0.00580596923828125,
-0.0182647705078125,
0.037750244140625,
0.0084381103515625,
-0.033111572265625,
0.053802490234375,
0.00264739990234375,
0.04998779296875,
-0.045166015625,
0.004825592041015625,
-0.003726959228515625,
0.0228271484375,
-0.0246124267578125,
-0.019287109375,
0.003986358642578125,
-0.0170135498046875,
-0.00943756103515625,
-0.006229400634765625,
0.044464111328125,
-0.059295654296875,
-0.0255279541015625,
0.0186004638671875,
0.007843017578125,
0.035125732421875,
-0.007244110107421875,
-0.057281494140625,
0.006500244140625,
0.010955810546875,
-0.018707275390625,
0.00576019287109375,
0.02081298828125,
0.016357421875,
0.006938934326171875,
0.039764404296875,
0.021728515625,
0.01549530029296875,
0.021240234375,
0.04644775390625,
-0.02813720703125,
-0.0626220703125,
-0.062744140625,
0.031036376953125,
-0.026763916015625,
-0.033721923828125,
0.06768798828125,
0.059661865234375,
0.06756591796875,
0.00890350341796875,
0.05755615234375,
0.00232696533203125,
0.041229248046875,
-0.038604736328125,
0.061279296875,
-0.033905029296875,
0.015380859375,
-0.04705810546875,
-0.060516357421875,
-0.00003319978713989258,
0.059112548828125,
-0.032012939453125,
0.0084228515625,
0.033599853515625,
0.06732177734375,
-0.0246429443359375,
0.01219940185546875,
0.01296234130859375,
0.0186920166015625,
0.0276031494140625,
0.038818359375,
0.06280517578125,
-0.0390625,
0.0452880859375,
-0.048553466796875,
-0.01242828369140625,
-0.006465911865234375,
-0.0304412841796875,
-0.07073974609375,
-0.0604248046875,
-0.040191650390625,
-0.0265350341796875,
0.0012950897216796875,
0.0791015625,
0.0806884765625,
-0.058929443359375,
-0.04449462890625,
0.01024627685546875,
0.00347137451171875,
-0.0258636474609375,
-0.0145263671875,
0.05169677734375,
0.0094451904296875,
-0.051422119140625,
0.054718017578125,
0.01800537109375,
0.00856781005859375,
-0.008148193359375,
-0.004619598388671875,
-0.053497314453125,
0.00731658935546875,
0.005931854248046875,
0.022003173828125,
-0.031036376953125,
-0.0018014907836914062,
-0.035614013671875,
0.01280975341796875,
0.0223236083984375,
0.05975341796875,
-0.0248565673828125,
0.046600341796875,
0.039794921875,
0.00897216796875,
0.0679931640625,
-0.0012865066528320312,
0.00916290283203125,
-0.06439208984375,
0.032684326171875,
0.0183258056640625,
0.019989013671875,
0.054840087890625,
-0.0128631591796875,
0.016265869140625,
0.0413818359375,
-0.04779052734375,
-0.0731201171875,
-0.00937652587890625,
-0.057464599609375,
-0.0174102783203125,
0.07781982421875,
-0.0210113525390625,
-0.03314208984375,
-0.00644683837890625,
-0.0223846435546875,
0.047149658203125,
-0.0325927734375,
0.03594970703125,
0.0377197265625,
-0.01361846923828125,
0.00004369020462036133,
-0.031951904296875,
0.047210693359375,
0.027130126953125,
-0.0311737060546875,
0.0143890380859375,
0.02484130859375,
0.0303192138671875,
0.04095458984375,
0.06988525390625,
0.0002696514129638672,
0.0218963623046875,
0.03997802734375,
0.027252197265625,
-0.0266571044921875,
-0.031890869140625,
-0.0289764404296875,
0.00946044921875,
-0.005306243896484375,
-0.0540771484375
]
] |
pyannote/speaker-diarization-3.0 | 2023-10-04T18:54:33.000Z | [
"pyannote-audio",
"pyannote",
"pyannote-audio-pipeline",
"audio",
"voice",
"speech",
"speaker",
"speaker-diarization",
"speaker-change-detection",
"voice-activity-detection",
"overlapped-speech-detection",
"automatic-speech-recognition",
"arxiv:2111.14448",
"arxiv:2012.01477",
"license:mit",
"has_space",
"region:us"
] | automatic-speech-recognition | pyannote | null | null | pyannote/speaker-diarization-3.0 | 91 | 1,511,968 | pyannote-audio | 2023-09-22T13:40:36 | ---
tags:
- pyannote
- pyannote-audio
- pyannote-audio-pipeline
- audio
- voice
- speech
- speaker
- speaker-diarization
- speaker-change-detection
- voice-activity-detection
- overlapped-speech-detection
- automatic-speech-recognition
license: mit
extra_gated_prompt: "The collected information will help acquire a better knowledge of the pyannote.audio user base and help its maintainers improve it further. Though this pipeline uses the MIT license and will always remain open source, we will occasionally email you about premium pipelines and paid services around pyannote."
extra_gated_fields:
Company/university: text
Website: text
---
Using this open-source pipeline in production?
Make the most of it thanks to our [consulting services](https://herve.niderb.fr/consulting.html).
# ๐น Speaker diarization 3.0
This pipeline has been trained by Sรฉverin Baroudi with [pyannote.audio](https://github.com/pyannote/pyannote-audio) `3.0.0` using a combination of the training sets of AISHELL, AliMeeting, AMI, AVA-AVD, DIHARD, Ego4D, MSDWild, REPERE, and VoxConverse.
It ingests mono audio sampled at 16kHz and outputs speaker diarization as an [`Annotation`](http://pyannote.github.io/pyannote-core/structure.html#annotation) instance:
* stereo or multi-channel audio files are automatically downmixed to mono by averaging the channels.
* audio files sampled at a different rate are resampled to 16kHz automatically upon loading.
## Requirements
1. Install [`pyannote.audio`](https://github.com/pyannote/pyannote-audio) `3.0` with `pip install pyannote.audio`
2. Accept [`pyannote/segmentation-3.0`](https://hf.co/pyannote/segmentation-3.0) user conditions
3. Accept [`pyannote/speaker-diarization-3.0`](https://hf.co/pyannote/speaker-diarization-3.0) user conditions
4. Create access token at [`hf.co/settings/tokens`](https://hf.co/settings/tokens).
## Usage
```python
# instantiate the pipeline
from pyannote.audio import Pipeline
pipeline = Pipeline.from_pretrained(
"pyannote/speaker-diarization-3.0",
use_auth_token="HUGGINGFACE_ACCESS_TOKEN_GOES_HERE")
# run the pipeline on an audio file
diarization = pipeline("audio.wav")
# dump the diarization output to disk using RTTM format
with open("audio.rttm", "w") as rttm:
diarization.write_rttm(rttm)
```
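Each `SPEAKER` line in the dumped RTTM file follows the standard ten-field NIST RTTM convention (type, file, channel, onset, duration, two `<NA>` fields, speaker label, two more `<NA>` fields). A minimal parser sketch, assuming that field layout:

```python
def parse_rttm_line(line):
    """Parse one RTTM 'SPEAKER' line into (file, start, duration, speaker)."""
    fields = line.split()
    # type, file, channel, onset, duration, <NA>, <NA>, speaker, <NA>, <NA>
    return fields[1], float(fields[3]), float(fields[4]), fields[7]

file_id, start, duration, speaker = parse_rttm_line(
    "SPEAKER audio 1 0.200 3.100 <NA> <NA> SPEAKER_00 <NA> <NA>")
print(file_id, speaker, start, duration)
# audio SPEAKER_00 0.2 3.1
```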
### Processing on GPU
`pyannote.audio` pipelines run on CPU by default.
You can send them to GPU with the following lines:
```python
import torch
pipeline.to(torch.device("cuda"))
```
Real-time factor is around 2.5% using one Nvidia Tesla V100 SXM2 GPU (for the neural inference part) and one Intel Cascade Lake 6248 CPU (for the clustering part).
In other words, it takes approximately 1.5 minutes to process a one-hour conversation.
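The real-time factor translates to wall-clock time as a simple multiplication (a trivial check of the 1.5-minute figure above):

```python
# real-time factor: processing time / audio duration
rtf = 0.025  # ~2.5%, as reported above

audio_minutes = 60  # a one-hour conversation
processing_minutes = audio_minutes * rtf
print(processing_minutes)  # 1.5
```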
### Processing from memory
Pre-loading audio files in memory may result in faster processing:
```python
import torchaudio
waveform, sample_rate = torchaudio.load("audio.wav")
diarization = pipeline({"waveform": waveform, "sample_rate": sample_rate})
```
### Monitoring progress
Hooks are available to monitor the progress of the pipeline:
```python
from pyannote.audio.pipelines.utils.hook import ProgressHook
with ProgressHook() as hook:
diarization = pipeline("audio.wav", hook=hook)
```
### Controlling the number of speakers
In case the number of speakers is known in advance, one can use the `num_speakers` option:
```python
diarization = pipeline("audio.wav", num_speakers=2)
```
One can also provide lower and/or upper bounds on the number of speakers using `min_speakers` and `max_speakers` options:
```python
diarization = pipeline("audio.wav", min_speakers=2, max_speakers=5)
```
## Benchmark
This pipeline has been benchmarked on a large collection of datasets.
Processing is fully automatic:
* no manual voice activity detection (as is sometimes the case in the literature)
* no manual number of speakers (though it is possible to provide it to the pipeline)
* no fine-tuning of the internal models nor tuning of the pipeline hyper-parameters to each dataset
... with the least forgiving diarization error rate (DER) setup (named *"Full"* in [this paper](https://doi.org/10.1016/j.csl.2021.101254)):
* no forgiveness collar
* evaluation of overlapped speech
| Benchmark | [DER%](. "Diarization error rate") | [FA%](. "False alarm rate") | [Miss%](. "Missed detection rate") | [Conf%](. "Speaker confusion rate") | Expected output | File-level evaluation |
| ------------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------- | --------------------------- | ---------------------------------- | ----------------------------------- | ----------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------- |
| [AISHELL-4](http://www.openslr.org/111/) | 12.3 | 3.8 | 4.4 | 4.1 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AISHELL.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AISHELL.SpeakerDiarization.Benchmark.test.eval) |
| [AliMeeting (*channel 1*)](https://www.openslr.org/119/) | 24.3 | 4.4 | 10.0 | 9.9 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AliMeeting.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AliMeeting.SpeakerDiarization.Benchmark.test.eval) |
| [AMI (*headset mix,*](https://groups.inf.ed.ac.uk/ami/corpus/) [*only_words*)](https://github.com/BUTSpeechFIT/AMI-diarization-setup) | 19.0 | 3.6 | 9.5 | 5.9 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AMI.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AMI.SpeakerDiarization.Benchmark.test.eval) |
| [AMI (*array1, channel 1,*](https://groups.inf.ed.ac.uk/ami/corpus/) [*only_words)*](https://github.com/BUTSpeechFIT/AMI-diarization-setup) | 22.2 | 3.8 | 11.2 | 7.3 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AMI-SDM.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AMI-SDM.SpeakerDiarization.Benchmark.test.eval) |
| [AVA-AVD](https://arxiv.org/abs/2111.14448) | 49.1 | 10.8 | 15.7| 22.5 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AVA-AVD.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/AVA-AVD.SpeakerDiarization.Benchmark.test.eval) |
| [DIHARD 3 (*Full*)](https://arxiv.org/abs/2012.01477) | 21.7 | 6.2 | 8.1 | 7.3 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/DIHARD.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/DIHARD.SpeakerDiarization.Benchmark.test.eval) |
| [MSDWild](https://x-lance.github.io/MSDWILD/) | 24.6 | 5.8 | 8.0 | 10.7 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/MSDWILD.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/MSDWILD.SpeakerDiarization.Benchmark.test.eval) |
| [REPERE (*phase 2*)](https://islrn.org/resources/360-758-359-485-0/) | 7.8 | 1.8 | 2.6 | 3.5 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/REPERE.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/REPERE.SpeakerDiarization.Benchmark.test.eval) |
| [VoxConverse (*v0.3*)](https://github.com/joonson/voxconverse) | 11.3 | 4.1 | 3.4 | 3.8 | [RTTM](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/VoxConverse.SpeakerDiarization.Benchmark.test.rttm) | [eval](https://huggingface.co/pyannote/speaker-diarization-3.0.0/blob/main/reproducible_research/VoxConverse.SpeakerDiarization.Benchmark.test.eval) |
## Citations
```bibtex
@inproceedings{Plaquet23,
author={Alexis Plaquet and Hervรฉ Bredin},
title={{Powerset multi-class cross entropy loss for neural speaker diarization}},
year=2023,
booktitle={Proc. INTERSPEECH 2023},
}
```
```bibtex
@inproceedings{Bredin23,
author={Hervรฉ Bredin},
title={{pyannote.audio 2.1 speaker diarization pipeline: principle, benchmark, and recipe}},
year=2023,
booktitle={Proc. INTERSPEECH 2023},
}
```
| 10,655 | [
[
-0.048187255859375,
-0.05755615234375,
0.008392333984375,
0.03692626953125,
-0.01526641845703125,
0.005237579345703125,
-0.037811279296875,
-0.0221710205078125,
0.035797119140625,
0.027191162109375,
-0.029754638671875,
-0.053436279296875,
-0.0318603515625,
0.0013580322265625,
-0.0159759521484375,
0.056488037109375,
0.0259246826171875,
-0.0007996559143066406,
0.00908660888671875,
0.0007195472717285156,
-0.0264739990234375,
-0.0150299072265625,
-0.03948974609375,
-0.015167236328125,
0.00978851318359375,
0.04345703125,
0.0166168212890625,
0.059417724609375,
0.0235137939453125,
0.024444580078125,
-0.03173828125,
0.007476806640625,
-0.00518798828125,
0.0006933212280273438,
0.00775909423828125,
0.0014829635620117188,
-0.039642333984375,
0.00897216796875,
0.057525634765625,
0.042724609375,
-0.0179595947265625,
0.0229644775390625,
0.004833221435546875,
0.041015625,
-0.0214080810546875,
0.01605224609375,
-0.04974365234375,
-0.01300048828125,
-0.035247802734375,
-0.0216064453125,
-0.01580810546875,
-0.0190582275390625,
0.0096893310546875,
-0.04071044921875,
0.016326904296875,
0.005855560302734375,
0.0753173828125,
0.00637054443359375,
-0.0007462501525878906,
-0.01047515869140625,
-0.0557861328125,
0.0511474609375,
-0.06756591796875,
0.029815673828125,
0.038482666015625,
0.0063018798828125,
-0.01187896728515625,
-0.052764892578125,
-0.05157470703125,
-0.00402069091796875,
-0.0099334716796875,
0.020751953125,
-0.0095367431640625,
0.01064300537109375,
0.0276947021484375,
0.042999267578125,
-0.04241943359375,
-0.002887725830078125,
-0.042572021484375,
-0.0347900390625,
0.0562744140625,
-0.0130157470703125,
0.030059814453125,
-0.0372314453125,
-0.01971435546875,
-0.0213623046875,
-0.029022216796875,
0.0142059326171875,
0.036468505859375,
0.033935546875,
-0.03485107421875,
0.041961669921875,
-0.00887298583984375,
0.04217529296875,
0.009307861328125,
-0.020538330078125,
0.052978515625,
-0.04119873046875,
-0.0124359130859375,
0.027008056640625,
0.08209228515625,
0.0217742919921875,
-0.007015228271484375,
0.009368896484375,
0.004459381103515625,
-0.001644134521484375,
-0.00940704345703125,
-0.049285888671875,
-0.037750244140625,
0.045074462890625,
-0.041656494140625,
-0.0013856887817382812,
-0.0139312744140625,
-0.07470703125,
-0.0081024169921875,
-0.01019287109375,
0.0294342041015625,
-0.049102783203125,
-0.04534912109375,
0.0010137557983398438,
-0.019073486328125,
0.003421783447265625,
0.0117645263671875,
-0.0836181640625,
0.0185546875,
0.045684814453125,
0.0802001953125,
0.00386810302734375,
-0.020965576171875,
-0.043243408203125,
0.004718780517578125,
-0.025604248046875,
0.042327880859375,
-0.007904052734375,
-0.03460693359375,
-0.0161590576171875,
-0.00017940998077392578,
-0.0291290283203125,
-0.0421142578125,
0.07135009765625,
0.0012636184692382812,
0.0194854736328125,
-0.0091705322265625,
-0.04736328125,
0.0059051513671875,
-0.0088348388671875,
-0.038482666015625,
0.07586669921875,
0.0037822723388671875,
-0.076171875,
0.0207672119140625,
-0.050872802734375,
-0.0169830322265625,
0.00006848573684692383,
-0.006740570068359375,
-0.047607421875,
-0.0198516845703125,
0.01885986328125,
0.03216552734375,
-0.014495849609375,
0.00437164306640625,
0.003940582275390625,
-0.0321044921875,
-0.001354217529296875,
-0.01201629638671875,
0.0848388671875,
0.02386474609375,
-0.053985595703125,
0.004001617431640625,
-0.07958984375,
-0.006870269775390625,
-0.0029449462890625,
-0.04217529296875,
-0.011474609375,
0.003597259521484375,
0.0198211669921875,
-0.00016069412231445312,
0.0140533447265625,
-0.060333251953125,
-0.005840301513671875,
-0.056488037109375,
0.03704833984375,
0.050811767578125,
0.0021953582763671875,
0.0225830078125,
-0.03363037109375,
0.0153045654296875,
0.00109100341796875,
-0.006328582763671875,
-0.01910400390625,
-0.04736328125,
-0.06939697265625,
-0.04815673828125,
0.0237579345703125,
0.056610107421875,
-0.027191162109375,
0.048370361328125,
-0.016143798828125,
-0.05560302734375,
-0.06866455078125,
0.005584716796875,
0.048614501953125,
0.030059814453125,
0.034393310546875,
-0.0245819091796875,
-0.059906005859375,
-0.06488037109375,
-0.0155181884765625,
-0.0447998046875,
-0.002117156982421875,
0.036407470703125,
0.030731201171875,
-0.005588531494140625,
0.06475830078125,
-0.019287109375,
-0.027587890625,
0.0029964447021484375,
0.0106048583984375,
0.045806884765625,
0.0518798828125,
0.033660888671875,
-0.06353759765625,
-0.04840087890625,
0.01151275634765625,
-0.037872314453125,
-0.0218048095703125,
-0.005535125732421875,
0.0011281967163085938,
0.010986328125,
0.02947998046875,
-0.051177978515625,
0.0218353271484375,
0.02093505859375,
-0.01340484619140625,
0.057830810546875,
0.0033893585205078125,
0.01512908935546875,
-0.07720947265625,
0.0243072509765625,
0.006938934326171875,
-0.002651214599609375,
-0.053375244140625,
-0.031768798828125,
-0.01003265380859375,
0.00872802734375,
-0.0270538330078125,
0.0472412109375,
-0.0290985107421875,
-0.006778717041015625,
0.0130767822265625,
0.0283660888671875,
-0.018096923828125,
0.03753662109375,
0.0026798248291015625,
0.0606689453125,
0.04669189453125,
-0.046875,
0.0311431884765625,
0.043609619140625,
-0.050933837890625,
0.034088134765625,
-0.05462646484375,
0.01024627685546875,
0.016571044921875,
0.0097503662109375,
-0.0794677734375,
-0.0089569091796875,
0.047027587890625,
-0.0604248046875,
0.0236358642578125,
-0.0248870849609375,
-0.0236358642578125,
-0.03863525390625,
-0.0242462158203125,
0.01043701171875,
0.0288238525390625,
-0.037811279296875,
0.0290985107421875,
0.032440185546875,
-0.02685546875,
-0.043060302734375,
-0.0435791015625,
0.00543212890625,
-0.0286407470703125,
-0.0511474609375,
0.04840087890625,
-0.0171051025390625,
-0.0335693359375,
-0.0120391845703125,
-0.0012025833129882812,
0.01335906982421875,
-0.0172882080078125,
0.0226898193359375,
0.01103973388671875,
-0.01363372802734375,
0.00004172325134277344,
-0.0160064697265625,
-0.0072021484375,
-0.016143798828125,
-0.019866943359375,
0.0423583984375,
-0.011322021484375,
-0.01702880859375,
-0.05889892578125,
0.0196075439453125,
0.050048828125,
-0.036407470703125,
0.03790283203125,
0.0733642578125,
-0.01210784912109375,
-0.007415771484375,
-0.05169677734375,
-0.0035762786865234375,
-0.03485107421875,
0.0263671875,
-0.0272979736328125,
-0.056732177734375,
0.034576416015625,
0.00482177734375,
0.0235595703125,
0.03375244140625,
0.052825927734375,
-0.00847625732421875,
0.051300048828125,
0.01345062255859375,
-0.01320648193359375,
0.04119873046875,
-0.032012939453125,
0.024658203125,
-0.0767822265625,
-0.0166778564453125,
-0.05841064453125,
0.0009927749633789062,
-0.061981201171875,
-0.03314208984375,
0.030487060546875,
0.0100250244140625,
-0.00482940673828125,
0.039703369140625,
-0.061767578125,
0.01148223876953125,
0.0452880859375,
-0.00554656982421875,
0.0036220550537109375,
0.01152801513671875,
-0.018463134765625,
-0.0023632049560546875,
-0.0286102294921875,
-0.044891357421875,
0.07916259765625,
0.033416748046875,
0.0200042724609375,
0.0064697265625,
0.055999755859375,
0.0199127197265625,
-0.0153045654296875,
-0.04949951171875,
0.04595947265625,
-0.006099700927734375,
-0.04534912109375,
-0.033660888671875,
-0.0318603515625,
-0.0721435546875,
0.032073974609375,
-0.0030307769775390625,
-0.07818603515625,
0.0225372314453125,
0.005863189697265625,
-0.0214385986328125,
0.0269012451171875,
-0.0645751953125,
0.0672607421875,
0.00516510009765625,
-0.0179901123046875,
-0.020843505859375,
-0.054931640625,
0.015838623046875,
0.01212310791015625,
0.0343017578125,
-0.02947998046875,
0.0252685546875,
0.08721923828125,
-0.0231475830078125,
0.04473876953125,
-0.030609130859375,
0.0015249252319335938,
0.0386962890625,
-0.01403045654296875,
0.0247802734375,
0.0077056884765625,
-0.01995849609375,
0.0151824951171875,
0.0147705078125,
-0.0235443115234375,
-0.00836181640625,
0.06683349609375,
-0.0706787109375,
-0.045074462890625,
-0.023345947265625,
-0.024200439453125,
-0.0029544830322265625,
0.00925445556640625,
0.0234222412109375,
0.035430908203125,
-0.00928497314453125,
0.01690673828125,
0.051849365234375,
-0.02850341796875,
0.052490234375,
0.0304718017578125,
-0.0012025833129882812,
-0.061767578125,
0.07037353515625,
0.006526947021484375,
0.01123809814453125,
0.0303802490234375,
0.01186370849609375,
-0.01702880859375,
-0.05609130859375,
-0.033660888671875,
0.0209503173828125,
-0.02886962890625,
0.003910064697265625,
-0.065673828125,
-0.01922607421875,
-0.058502197265625,
0.0186309814453125,
-0.040618896484375,
-0.04437255859375,
-0.024017333984375,
-0.005100250244140625,
0.0302276611328125,
0.016326904296875,
-0.0238800048828125,
0.0199432373046875,
-0.048004150390625,
0.0252532958984375,
0.0196533203125,
0.0160064697265625,
-0.0175323486328125,
-0.0386962890625,
-0.0338134765625,
0.0101776123046875,
-0.030487060546875,
-0.05462646484375,
0.040191650390625,
0.033233642578125,
0.049407958984375,
0.014434814453125,
-0.00476837158203125,
0.0472412109375,
-0.025665283203125,
0.07696533203125,
0.012603759765625,
-0.081787109375,
0.05487060546875,
-0.040924072265625,
0.0128173828125,
0.041656494140625,
0.01678466796875,
-0.0491943359375,
-0.0197906494140625,
-0.04730224609375,
-0.0858154296875,
0.07293701171875,
0.03448486328125,
-0.00603485107421875,
-0.0028228759765625,
0.0012912750244140625,
-0.00658416748046875,
0.012603759765625,
-0.039520263671875,
-0.050384521484375,
-0.021697998046875,
0.0017528533935546875,
-0.01557159423828125,
-0.0097198486328125,
-0.005245208740234375,
-0.045745849609375,
0.079833984375,
0.0147247314453125,
0.041229248046875,
0.04144287109375,
0.0017976760864257812,
-0.01097869873046875,
0.0294342041015625,
0.04840087890625,
0.02880859375,
-0.04486083984375,
0.00135040283203125,
0.004726409912109375,
-0.04620361328125,
0.00801849365234375,
0.0087127685546875,
-0.00009733438491821289,
0.0265655517578125,
0.02880859375,
0.064697265625,
0.00385284423828125,
-0.0309295654296875,
0.035003662109375,
-0.0092010498046875,
-0.027862548828125,
-0.04443359375,
-0.006134033203125,
0.0274200439453125,
0.014923095703125,
0.03253173828125,
-0.0005908012390136719,
0.00147247314453125,
-0.0379638671875,
0.0211639404296875,
0.0095977783203125,
-0.01354217529296875,
-0.0198211669921875,
0.050201416015625,
0.0202789306640625,
-0.04278564453125,
0.04302978515625,
-0.01508331298828125,
-0.03631591796875,
0.045379638671875,
0.0264892578125,
0.07232666015625,
-0.043792724609375,
0.0124053955078125,
0.061431884765625,
0.021514892578125,
0.00885772705078125,
0.02587890625,
-0.0295867919921875,
-0.044219970703125,
-0.0164031982421875,
-0.07000732421875,
-0.0241546630859375,
0.01678466796875,
-0.03387451171875,
0.0266265869140625,
-0.032318115234375,
-0.01971435546875,
0.0316162109375,
0.0201873779296875,
-0.02362060546875,
0.01343536376953125,
0.006946563720703125,
0.06121826171875,
-0.06158447265625,
0.059295654296875,
0.03692626953125,
-0.0202178955078125,
-0.069091796875,
-0.0059661865234375,
0.004810333251953125,
-0.022735595703125,
0.020721435546875,
0.003391265869140625,
0.001438140869140625,
-0.0008001327514648438,
-0.020660400390625,
-0.053192138671875,
0.076416015625,
0.02294921875,
-0.06524658203125,
0.0211334228515625,
-0.01200103759765625,
0.0384521484375,
-0.01396942138671875,
0.0243072509765625,
0.054290771484375,
0.0546875,
0.003643035888671875,
-0.10797119140625,
-0.00165557861328125,
-0.0577392578125,
-0.0152130126953125,
0.00954437255859375,
-0.058502197265625,
0.06781005859375,
-0.001041412353515625,
-0.0196380615234375,
0.005878448486328125,
0.045928955078125,
0.02789306640625,
0.036834716796875,
0.044158935546875,
0.05517578125,
0.052978515625,
-0.01161956787109375,
0.04425048828125,
-0.0322265625,
0.0239715576171875,
0.07354736328125,
0.002674102783203125,
0.062103271484375,
0.03973388671875,
-0.036376953125,
0.031402587890625,
0.066650390625,
-0.01165008544921875,
0.042816162109375,
0.01806640625,
-0.034759521484375,
-0.004604339599609375,
-0.0121612548828125,
-0.049407958984375,
0.04638671875,
0.027587890625,
-0.0230865478515625,
0.02386474609375,
-0.01454925537109375,
0.0116729736328125,
0.0018358230590820312,
-0.007747650146484375,
0.046844482421875,
0.01041412353515625,
-0.041351318359375,
0.0653076171875,
-0.0021152496337890625,
0.06475830078125,
-0.034393310546875,
0.00766754150390625,
0.00008314847946166992,
0.0099945068359375,
-0.037933349609375,
-0.031402587890625,
0.0253753662109375,
-0.00691986083984375,
-0.01239776611328125,
-0.0196533203125,
0.035736083984375,
-0.04376220703125,
-0.022003173828125,
0.029541015625,
0.0243377685546875,
0.03485107421875,
0.0147247314453125,
-0.0401611328125,
0.007389068603515625,
0.013427734375,
-0.026031494140625,
0.0153045654296875,
0.023101806640625,
0.012664794921875,
0.026885986328125,
0.057281494140625,
0.033660888671875,
0.0168914794921875,
0.01033782958984375,
0.055145263671875,
-0.038970947265625,
-0.0418701171875,
-0.061553955078125,
0.0330810546875,
-0.0175323486328125,
-0.0310516357421875,
0.075439453125,
0.06134033203125,
0.06744384765625,
0.00830841064453125,
0.052215576171875,
-0.028411865234375,
0.0589599609375,
-0.02008056640625,
0.060943603515625,
-0.0325927734375,
0.03228759765625,
-0.051971435546875,
-0.062286376953125,
-0.006999969482421875,
0.045257568359375,
-0.0179595947265625,
-0.001178741455078125,
0.0457763671875,
0.07379150390625,
0.004795074462890625,
0.012054443359375,
0.0106201171875,
0.0262451171875,
0.0270233154296875,
0.036224365234375,
0.044464111328125,
-0.037322998046875,
0.044647216796875,
-0.04547119140625,
-0.01294708251953125,
-0.0098419189453125,
-0.0438232421875,
-0.05712890625,
-0.06817626953125,
-0.049530029296875,
-0.0271148681640625,
0.005062103271484375,
0.08258056640625,
0.0650634765625,
-0.058502197265625,
-0.040771484375,
0.0005974769592285156,
0.015625,
-0.03326416015625,
-0.0155181884765625,
0.04840087890625,
0.0105133056640625,
-0.06353759765625,
0.04888916015625,
0.0178375244140625,
-0.00853729248046875,
0.0028839111328125,
-0.015655517578125,
-0.039581298828125,
0.0017299652099609375,
0.01739501953125,
0.0287628173828125,
-0.04669189453125,
-0.012603759765625,
-0.0301513671875,
-0.00029277801513671875,
0.0255279541015625,
0.036651611328125,
-0.03302001953125,
0.04840087890625,
0.0518798828125,
0.006805419921875,
0.05755615234375,
-0.007595062255859375,
0.0188140869140625,
-0.04901123046875,
0.00830841064453125,
0.0178985595703125,
0.0172271728515625,
0.038848876953125,
-0.014739990234375,
0.039459228515625,
0.032135009765625,
-0.053558349609375,
-0.07452392578125,
-0.02020263671875,
-0.08099365234375,
0.000043272972106933594,
0.0833740234375,
-0.0171966552734375,
-0.02191162109375,
-0.016082763671875,
-0.0303802490234375,
0.044219970703125,
-0.047088623046875,
0.05426025390625,
0.047332763671875,
-0.010528564453125,
-0.006023406982421875,
-0.046142578125,
0.047943115234375,
0.031341552734375,
-0.045989990234375,
0.004886627197265625,
0.0202484130859375,
0.0236358642578125,
0.0341796875,
0.0772705078125,
-0.0166168212890625,
0.009033203125,
0.0106964111328125,
0.0196380615234375,
-0.00872802734375,
-0.00701141357421875,
-0.0171966552734375,
0.00235748291015625,
-0.00800323486328125,
-0.039642333984375
]
] |
facebook/wav2vec2-xlsr-53-espeak-cv-ft | 2021-12-10T17:18:39.000Z | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"phoneme-recognition",
"multi-lingual",
"dataset:common_voice",
"arxiv:2109.11680",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/wav2vec2-xlsr-53-espeak-cv-ft | 16 | 1,507,688 | transformers | 2022-03-02T23:29:05 | ---
language: multi-lingual
datasets:
- common_voice
tags:
- speech
- audio
- automatic-speech-recognition
- phoneme-recognition
widget:
- example_title: Librispeech sample 1
src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
license: apache-2.0
---
# Wav2Vec2-Large-XLSR-53 fine-tuned on multilingual Common Voice
This checkpoint leverages the pretrained checkpoint [wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53)
and is fine-tuned on [CommonVoice](https://huggingface.co/datasets/common_voice) to recognize phonetic labels in multiple languages.
When using the model, make sure that your speech input is sampled at 16 kHz.
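Audio recorded at another rate therefore has to be resampled to 16 kHz before it is passed to the processor. Below is a minimal, purely illustrative sketch using naive linear interpolation; a real pipeline should use a proper band-limited resampler such as `torchaudio.functional.resample` or `librosa.resample` instead:

```python
import numpy as np

def resample_linear(signal, orig_sr, target_sr=16_000):
    """Naively resample a 1-D signal via linear interpolation.

    Illustrative only: linear interpolation does not low-pass filter,
    so downsampling this way can alias. Use a band-limited resampler
    (torchaudio, librosa, scipy) for real audio.
    """
    duration = len(signal) / orig_sr
    n_target = int(round(duration * target_sr))
    old_t = np.linspace(0.0, duration, num=len(signal), endpoint=False)
    new_t = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(new_t, old_t, signal)
```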
Note that the model outputs a string of phonetic labels. To obtain words, these phonetic labels must be mapped back to words using a pronunciation dictionary (lexicon).
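As a toy illustration of such a mapping, assuming the phoneme string has already been segmented into per-word chunks — the tiny lexicon and the `"|"` word separator below are hypothetical assumptions for the sketch, not something the model or this repository provides:

```python
# Toy phoneme-to-word lookup. The lexicon entries and the "|" word
# separator are illustrative assumptions; a real system would use a
# full pronunciation lexicon and a proper decoding strategy.
lexicon = {
    "h ษ l oส": "hello",
    "w ษห l d": "world",
}

def phonemes_to_words(phoneme_str, lexicon):
    """Map each "|"-separated phoneme chunk to a word via the lexicon."""
    return " ".join(
        lexicon.get(chunk.strip(), "<unk>")
        for chunk in phoneme_str.split("|")
    )
```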
[Paper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition](https://arxiv.org/abs/2109.11680)
Authors: Qiantong Xu, Alexei Baevski, Michael Auli
**Abstract**
Recent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.
The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files, the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-xlsr-53-espeak-cv-ft")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-xlsr-53-espeak-cv-ft")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# extract input features (passing sampling_rate avoids the implicit-rate warning)
input_values = processor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt").input_values
# retrieve logits
with torch.no_grad():
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
# => should give ['m ษช s t ษ k w ษช l t ษ ษช z รฐ ษช ษ p ษห s ษl l ส v รฐ ษ m ษช d ษl k l รฆ s ษช z รฆ n d w iห aส ษก l รฆ d t ษ w ษ l k ษ m h ษช z ษก ษห s p ษ']
``` | 3,012 | [
[
-0.0119171142578125,
-0.035003662109375,
0.0123443603515625,
0.0185394287109375,
-0.0155181884765625,
0.005764007568359375,
-0.0164794921875,
-0.050445556640625,
0.01059722900390625,
0.02642822265625,
-0.054595947265625,
-0.0467529296875,
-0.041168212890625,
-0.01029205322265625,
-0.020782470703125,
0.067138671875,
0.004261016845703125,
0.0196380615234375,
0.034515380859375,
-0.01508331298828125,
-0.042510986328125,
-0.037872314453125,
-0.0548095703125,
-0.024658203125,
0.03436279296875,
0.0323486328125,
0.00885009765625,
0.0251007080078125,
0.010589599609375,
0.018829345703125,
0.00017964839935302734,
-0.00608062744140625,
-0.04437255859375,
-0.002353668212890625,
0.00907135009765625,
-0.0269622802734375,
-0.030029296875,
0.0106048583984375,
0.043670654296875,
0.042633056640625,
-0.004482269287109375,
0.019073486328125,
0.0009937286376953125,
0.0173187255859375,
-0.0282745361328125,
0.0080108642578125,
-0.052337646484375,
0.005558013916015625,
-0.0200653076171875,
-0.0163116455078125,
-0.035003662109375,
-0.005680084228515625,
0.010223388671875,
-0.05419921875,
0.01139068603515625,
-0.017242431640625,
0.06591796875,
0.003925323486328125,
-0.03448486328125,
-0.03692626953125,
-0.06439208984375,
0.08380126953125,
-0.04254150390625,
0.0615234375,
0.0294952392578125,
0.0152130126953125,
0.005687713623046875,
-0.0675048828125,
-0.026275634765625,
-0.0021209716796875,
0.022247314453125,
0.03265380859375,
-0.0199737548828125,
0.00878143310546875,
0.0165557861328125,
0.01177215576171875,
-0.0582275390625,
0.0176239013671875,
-0.06536865234375,
-0.04852294921875,
0.0274505615234375,
-0.01457977294921875,
0.019561767578125,
0.001819610595703125,
-0.028717041015625,
-0.03057861328125,
-0.036468505859375,
0.0277862548828125,
0.024322509765625,
0.0301971435546875,
-0.030731201171875,
0.03900146484375,
-0.01084136962890625,
0.054412841796875,
0.004180908203125,
-0.0241851806640625,
0.05743408203125,
-0.02911376953125,
-0.00960540771484375,
0.0401611328125,
0.050201416015625,
0.01149749755859375,
0.02410888671875,
-0.00351715087890625,
-0.0013437271118164062,
0.00925445556640625,
-0.018707275390625,
-0.06585693359375,
-0.02740478515625,
0.027587890625,
-0.0153656005859375,
-0.0035953521728515625,
-0.0038356781005859375,
-0.02459716796875,
0.0158843994140625,
-0.026123046875,
0.0572509765625,
-0.038848876953125,
-0.03173828125,
0.01617431640625,
-0.0179901123046875,
0.02337646484375,
0.00724029541015625,
-0.0634765625,
0.0033664703369140625,
0.0263519287109375,
0.06805419921875,
0.01080322265625,
-0.034210205078125,
-0.06280517578125,
0.003192901611328125,
-0.0237884521484375,
0.034820556640625,
-0.0212860107421875,
-0.03125,
-0.013946533203125,
0.0094146728515625,
-0.01200103759765625,
-0.03759765625,
0.0333251953125,
-0.02337646484375,
0.014434814453125,
-0.0168304443359375,
-0.034881591796875,
-0.0125274658203125,
-0.018280029296875,
-0.03668212890625,
0.08795166015625,
0.0024242401123046875,
-0.0406494140625,
0.0211944580078125,
-0.03033447265625,
-0.050872802734375,
-0.01102447509765625,
-0.0021724700927734375,
-0.0232086181640625,
-0.01126861572265625,
0.0069732666015625,
0.0323486328125,
-0.00525665283203125,
-0.00811004638671875,
-0.014801025390625,
-0.0313720703125,
0.016082763671875,
-0.035125732421875,
0.0887451171875,
0.037689208984375,
-0.03997802734375,
0.0010137557983398438,
-0.07122802734375,
0.0297393798828125,
-0.01291656494140625,
-0.035888671875,
0.0016546249389648438,
-0.044525146484375,
0.0221099853515625,
0.0282745361328125,
0.01044464111328125,
-0.05694580078125,
-0.005199432373046875,
-0.030517578125,
0.0452880859375,
0.0295257568359375,
-0.0156097412109375,
0.03350830078125,
-0.01519012451171875,
0.019775390625,
-0.00359344482421875,
-0.01450347900390625,
0.0031108856201171875,
-0.0297393798828125,
-0.04632568359375,
-0.042144775390625,
0.034271240234375,
0.0540771484375,
-0.038604736328125,
0.038604736328125,
-0.004596710205078125,
-0.052642822265625,
-0.064697265625,
0.0017976760864257812,
0.034088134765625,
0.042510986328125,
0.044647216796875,
-0.034393310546875,
-0.0556640625,
-0.042205810546875,
-0.0170745849609375,
-0.004486083984375,
-0.018096923828125,
0.02288818359375,
0.00858306884765625,
-0.0186004638671875,
0.03741455078125,
-0.00971221923828125,
-0.040618896484375,
-0.0167388916015625,
0.035858154296875,
0.0232086181640625,
0.047943115234375,
0.03131103515625,
-0.034912109375,
-0.036285400390625,
-0.002010345458984375,
-0.04290771484375,
-0.0171661376953125,
-0.01555633544921875,
-0.00862884521484375,
0.0208282470703125,
0.0391845703125,
-0.03497314453125,
0.0139312744140625,
0.041534423828125,
-0.015716552734375,
0.0252532958984375,
-0.00946044921875,
0.0155029296875,
-0.06634521484375,
0.01001739501953125,
-0.00986480712890625,
-0.0175018310546875,
-0.059539794921875,
-0.03668212890625,
0.0038547515869140625,
-0.006702423095703125,
-0.0595703125,
0.040130615234375,
-0.058013916015625,
-0.016510009765625,
-0.01064300537109375,
0.0034580230712890625,
-0.007099151611328125,
0.04962158203125,
0.027984619140625,
0.0372314453125,
0.0631103515625,
-0.041656494140625,
0.037200927734375,
0.01082611083984375,
-0.03167724609375,
0.0330810546875,
-0.0546875,
0.0294342041015625,
0.015899658203125,
0.0208740234375,
-0.08233642578125,
0.0006494522094726562,
0.0191650390625,
-0.06591796875,
0.033172607421875,
-0.0222930908203125,
-0.01126861572265625,
-0.01513671875,
-0.004199981689453125,
0.050445556640625,
0.050506591796875,
-0.04437255859375,
0.033355712890625,
0.025543212890625,
0.0009264945983886719,
-0.050689697265625,
-0.07855224609375,
-0.01027679443359375,
-0.0025653839111328125,
-0.04791259765625,
0.0172119140625,
-0.0013303756713867188,
0.015655517578125,
-0.0210723876953125,
-0.0186004638671875,
-0.01202392578125,
-0.0177154541015625,
0.0205230712890625,
0.0183258056640625,
-0.0184326171875,
-0.00276947021484375,
-0.004566192626953125,
-0.0182952880859375,
-0.0067596435546875,
-0.0413818359375,
0.046600341796875,
0.005748748779296875,
-0.01555633544921875,
-0.04803466796875,
0.015411376953125,
0.045440673828125,
-0.0269775390625,
0.0178680419921875,
0.08966064453125,
-0.0296173095703125,
-0.0021190643310546875,
-0.04010009765625,
-0.01198577880859375,
-0.035003662109375,
0.055328369140625,
-0.039154052734375,
-0.05419921875,
0.04058837890625,
0.0006237030029296875,
0.00460052490234375,
0.033355712890625,
0.047515869140625,
-0.005603790283203125,
0.064453125,
0.02239990234375,
-0.0088958740234375,
0.03387451171875,
-0.047088623046875,
0.00733184814453125,
-0.058380126953125,
-0.02978515625,
-0.03582763671875,
-0.015045166015625,
-0.0340576171875,
-0.038604736328125,
0.02142333984375,
0.0009417533874511719,
-0.005588531494140625,
0.036865234375,
-0.0457763671875,
0.023834228515625,
0.05615234375,
0.0078887939453125,
0.0016908645629882812,
0.0160064697265625,
-0.01267242431640625,
-0.01151275634765625,
-0.044219970703125,
-0.005153656005859375,
0.0848388671875,
0.02886962890625,
0.0552978515625,
-0.01006317138671875,
0.06109619140625,
-0.0041656494140625,
-0.027984619140625,
-0.06719970703125,
0.026275634765625,
-0.01210784912109375,
-0.056854248046875,
-0.037017822265625,
-0.0279541015625,
-0.07147216796875,
0.0168914794921875,
-0.0189208984375,
-0.0562744140625,
0.0233154296875,
-0.00360107421875,
-0.031707763671875,
0.01934814453125,
-0.050506591796875,
0.0635986328125,
-0.02587890625,
-0.013031005859375,
-0.0180511474609375,
-0.059478759765625,
0.01506805419921875,
-0.01262664794921875,
0.0094757080078125,
-0.01322174072265625,
0.03155517578125,
0.07794189453125,
-0.0224761962890625,
0.057098388671875,
-0.0222320556640625,
-0.01177215576171875,
0.0321044921875,
-0.005340576171875,
0.031219482421875,
-0.01267242431640625,
-0.0080718994140625,
0.040802001953125,
0.022003173828125,
-0.0284271240234375,
-0.0152435302734375,
0.06304931640625,
-0.08062744140625,
-0.0018405914306640625,
-0.02069091796875,
-0.032135009765625,
-0.0066680908203125,
0.0272979736328125,
0.06597900390625,
0.05718994140625,
-0.01421356201171875,
0.019256591796875,
0.05035400390625,
-0.0154571533203125,
0.02667236328125,
0.037017822265625,
0.004230499267578125,
-0.03411865234375,
0.08087158203125,
0.03570556640625,
0.0265655517578125,
0.01806640625,
-0.0009179115295410156,
-0.038787841796875,
-0.032684326171875,
-0.02960205078125,
0.0153656005859375,
-0.050506591796875,
-0.00862884521484375,
-0.05645751953125,
-0.026031494140625,
-0.05841064453125,
-0.004543304443359375,
-0.05206298828125,
-0.04296875,
-0.033355712890625,
0.000576019287109375,
0.0229034423828125,
0.050506591796875,
-0.051849365234375,
0.017120361328125,
-0.058502197265625,
0.049468994140625,
0.0264739990234375,
0.005840301513671875,
-0.005138397216796875,
-0.07159423828125,
-0.0246429443359375,
0.0249481201171875,
0.002437591552734375,
-0.062225341796875,
0.032196044921875,
0.01409149169921875,
0.034393310546875,
0.016632080078125,
0.00640869140625,
0.057037353515625,
-0.03179931640625,
0.040618896484375,
0.035247802734375,
-0.0889892578125,
0.03662109375,
-0.01145172119140625,
0.026611328125,
0.03387451171875,
0.0211181640625,
-0.05712890625,
-0.0189056396484375,
-0.03216552734375,
-0.074462890625,
0.0758056640625,
0.0207061767578125,
-0.0013189315795898438,
0.0209503173828125,
0.0081939697265625,
0.006397247314453125,
-0.00780487060546875,
-0.05645751953125,
-0.035797119140625,
-0.03265380859375,
-0.01329803466796875,
-0.01105499267578125,
-0.015899658203125,
-0.0017852783203125,
-0.037689208984375,
0.06781005859375,
0.0189208984375,
0.024749755859375,
0.02984619140625,
-0.016845703125,
-0.007778167724609375,
0.013702392578125,
0.04449462890625,
0.01499176025390625,
-0.025054931640625,
0.01114654541015625,
0.0247344970703125,
-0.04840087890625,
0.0256195068359375,
0.017822265625,
0.0126953125,
0.0196685791015625,
0.0325927734375,
0.0714111328125,
0.0174102783203125,
-0.0206298828125,
0.0310821533203125,
-0.0006399154663085938,
-0.040435791015625,
-0.041900634765625,
-0.006317138671875,
0.016387939453125,
0.0211334228515625,
0.02862548828125,
0.00562286376953125,
0.005313873291015625,
-0.042694091796875,
0.0218353271484375,
0.0251922607421875,
-0.0394287109375,
-0.030303955078125,
0.0628662109375,
0.0165557861328125,
-0.0156097412109375,
0.0528564453125,
-0.00031304359436035156,
-0.0263671875,
0.047210693359375,
0.036956787109375,
0.06585693359375,
-0.031982421875,
-0.003570556640625,
0.049407958984375,
0.0200347900390625,
-0.01262664794921875,
0.0313720703125,
-0.014892578125,
-0.052337646484375,
-0.0279541015625,
-0.033966064453125,
-0.014373779296875,
0.0193939208984375,
-0.060882568359375,
0.0418701171875,
-0.0240325927734375,
-0.004688262939453125,
0.0214080810546875,
0.002292633056640625,
-0.06732177734375,
0.0241241455078125,
0.02606201171875,
0.048858642578125,
-0.06842041015625,
0.09814453125,
0.0217437744140625,
-0.0205078125,
-0.092529296875,
-0.0232391357421875,
0.00537872314453125,
-0.07537841796875,
0.05126953125,
0.017303466796875,
-0.037811279296875,
0.0147247314453125,
-0.032928466796875,
-0.07763671875,
0.08331298828125,
0.0286712646484375,
-0.05810546875,
0.012664794921875,
0.0154571533203125,
0.029296875,
-0.013275146484375,
0.02587890625,
0.0447998046875,
0.039642333984375,
0.03399658203125,
-0.0909423828125,
-0.00882720947265625,
-0.00792694091796875,
-0.03314208984375,
-0.01311492919921875,
-0.04290771484375,
0.06524658203125,
-0.0272979736328125,
-0.0227203369140625,
0.00554656982421875,
0.050384521484375,
0.02362060546875,
0.023468017578125,
0.036346435546875,
0.041290283203125,
0.0694580078125,
-0.002986907958984375,
0.03778076171875,
-0.0186767578125,
0.022918701171875,
0.08319091796875,
-0.01666259765625,
0.0576171875,
0.027862548828125,
-0.0012216567993164062,
0.017822265625,
0.048553466796875,
-0.004405975341796875,
0.0562744140625,
0.0094146728515625,
-0.01444244384765625,
-0.0222930908203125,
-0.0005669593811035156,
-0.05126953125,
0.06591796875,
0.01224517822265625,
-0.0090789794921875,
-0.0007677078247070312,
0.00927734375,
-0.007595062255859375,
-0.0318603515625,
-0.01172637939453125,
0.050506591796875,
0.012420654296875,
-0.02557373046875,
0.07366943359375,
-0.0026416778564453125,
0.07159423828125,
-0.059722900390625,
-0.007045745849609375,
0.005359649658203125,
0.01409912109375,
-0.0229644775390625,
-0.04486083984375,
0.01114654541015625,
-0.01165771484375,
-0.006519317626953125,
-0.013916015625,
0.04241943359375,
-0.06878662109375,
-0.04248046875,
0.0528564453125,
0.0257568359375,
0.040283203125,
0.0003674030303955078,
-0.04571533203125,
0.01303863525390625,
0.02581787109375,
-0.0142364501953125,
0.01303863525390625,
0.034332275390625,
0.0214996337890625,
0.0227203369140625,
0.050445556640625,
0.02386474609375,
0.0185089111328125,
0.01336669921875,
0.034759521484375,
-0.04986572265625,
-0.043609619140625,
-0.032196044921875,
0.0286712646484375,
0.0214080810546875,
-0.027740478515625,
0.04583740234375,
0.0538330078125,
0.09307861328125,
-0.00724029541015625,
0.0643310546875,
0.00981903076171875,
0.05712890625,
-0.05133056640625,
0.058074951171875,
-0.04669189453125,
0.009185791015625,
-0.0206451416015625,
-0.058319091796875,
-0.017242431640625,
0.057952880859375,
-0.0027256011962890625,
0.01297760009765625,
0.035125732421875,
0.07421875,
-0.01348114013671875,
-0.02117919921875,
0.040740966796875,
0.014068603515625,
0.00623321533203125,
0.05987548828125,
0.04656982421875,
-0.055084228515625,
0.0751953125,
-0.0298919677734375,
-0.00958251953125,
-0.004467010498046875,
-0.029571533203125,
-0.06292724609375,
-0.05303955078125,
-0.0297698974609375,
-0.03277587890625,
0.001983642578125,
0.059906005859375,
0.0706787109375,
-0.06671142578125,
-0.03350830078125,
0.0149688720703125,
-0.00894927978515625,
-0.02239990234375,
-0.0167083740234375,
0.0307464599609375,
-0.0069580078125,
-0.0660400390625,
0.051239013671875,
0.0030422210693359375,
0.0223541259765625,
0.004322052001953125,
-0.0205230712890625,
-0.014007568359375,
-0.004119873046875,
0.02642822265625,
0.045013427734375,
-0.0477294921875,
-0.0183563232421875,
0.008026123046875,
-0.01322174072265625,
0.01259613037109375,
0.0516357421875,
-0.042999267578125,
0.040313720703125,
0.035491943359375,
0.014007568359375,
0.06622314453125,
-0.037017822265625,
0.02813720703125,
-0.05859375,
0.03485107421875,
0.004741668701171875,
0.0277862548828125,
0.024139404296875,
-0.025543212890625,
0.0210113525390625,
0.0188751220703125,
-0.03369140625,
-0.06134033203125,
0.003498077392578125,
-0.097412109375,
-0.025634765625,
0.0987548828125,
-0.0014867782592773438,
-0.0022907257080078125,
-0.0204010009765625,
-0.0240936279296875,
0.056854248046875,
-0.03173828125,
0.031158447265625,
0.03570556640625,
-0.007289886474609375,
-0.004425048828125,
-0.031982421875,
0.035858154296875,
0.04351806640625,
-0.021514892578125,
0.00650787353515625,
0.034423828125,
0.0528564453125,
0.00081634521484375,
0.07373046875,
-0.00921630859375,
0.0308380126953125,
0.0127105712890625,
0.037933349609375,
-0.00936126708984375,
-0.031982421875,
-0.047760009765625,
-0.006275177001953125,
-0.00604248046875,
-0.0567626953125
]
] |
QCRI/bert-base-multilingual-cased-pos-english | 2023-01-25T06:00:31.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"part-of-speech",
"finetuned",
"en",
"license:cc-by-nc-3.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | QCRI | null | null | QCRI/bert-base-multilingual-cased-pos-english | 21 | 1,504,084 | transformers | 2022-04-27T08:15:20 | ---
language:
- en
tags:
- part-of-speech
- finetuned
license: cc-by-nc-3.0
---
# BERT-base-multilingual-cased fine-tuned for Part-of-Speech tagging
This is a multilingual BERT model fine-tuned for part-of-speech tagging of English. It is trained on the Penn Treebank (Marcus et al., 1993) and achieves an F1-score of 96.69.
## Usage
A *transformers* pipeline can be used to run the model:
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, TokenClassificationPipeline
model_name = "QCRI/bert-base-multilingual-cased-pos-english"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)
pipeline = TokenClassificationPipeline(model=model, tokenizer=tokenizer)
outputs = pipeline("A test example")
print(outputs)
```
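The pipeline returns a list of dicts, one per token, with keys such as `word`, `entity` (the predicted PTB tag), and `score`. A small helper, sketched below, condenses that into (token, tag) pairs; the helper and the sample values are illustrative, not part of the model card's API:

```python
def to_tag_pairs(outputs):
    """Condense token-classification pipeline output dicts into (token, tag) pairs."""
    return [(o["word"], o["entity"]) for o in outputs]

# Sample with the shape of a typical pipeline result (scores are illustrative):
sample = [
    {"word": "A", "entity": "DT", "score": 0.99, "index": 1},
    {"word": "test", "entity": "NN", "score": 0.98, "index": 2},
    {"word": "example", "entity": "NN", "score": 0.97, "index": 3},
]
print(to_tag_pairs(sample))  # [('A', 'DT'), ('test', 'NN'), ('example', 'NN')]
```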
## Citation
This model was used for all the part-of-speech-tagging-based results in *Analyzing Encoded Concepts in Transformer Language Models*, published at NAACL'22. If you find this model useful for your own work, please use the following citation:
```bibtex
@inproceedings{sajjad-NAACL,
title={Analyzing Encoded Concepts in Transformer Language Models},
    author={Hassan Sajjad and Nadir Durrani and Fahim Dalvi and Firoj Alam and Abdul Rafae Khan and Jia Xu},
    booktitle={North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL)},
series={NAACL~'22},
year={2022},
address={Seattle}
}
``` | 1,468 | [
[
-0.03271484375,
-0.050994873046875,
-0.00537109375,
0.0200042724609375,
-0.022430419921875,
0.01033782958984375,
-0.012481689453125,
-0.0281524658203125,
0.0166778564453125,
0.01297760009765625,
-0.034332275390625,
-0.040985107421875,
-0.035980224609375,
-0.00189208984375,
-0.03192138671875,
0.07891845703125,
0.0164794921875,
0.0277252197265625,
0.005977630615234375,
0.00189971923828125,
-0.0190887451171875,
-0.09014892578125,
-0.043182373046875,
-0.029022216796875,
0.0277252197265625,
0.016571044921875,
0.0421142578125,
0.038177490234375,
0.03509521484375,
0.0189666748046875,
-0.01214599609375,
-0.0142974853515625,
-0.00786590576171875,
0.0016450881958007812,
-0.01168060302734375,
-0.033111572265625,
-0.04559326171875,
0.0017414093017578125,
0.055908203125,
0.05950927734375,
-0.00914764404296875,
-0.004947662353515625,
-0.01470947265625,
0.0207672119140625,
-0.015655517578125,
0.02349853515625,
-0.049652099609375,
-0.0078277587890625,
-0.02691650390625,
0.0021610260009765625,
-0.043426513671875,
-0.0002543926239013672,
0.0148162841796875,
-0.0299835205078125,
0.02581787109375,
0.00350189208984375,
0.08868408203125,
0.01157379150390625,
-0.0255584716796875,
-0.02069091796875,
-0.038299560546875,
0.06561279296875,
-0.06805419921875,
0.0413818359375,
0.02459716796875,
-0.0014142990112304688,
-0.004734039306640625,
-0.05804443359375,
-0.046661376953125,
-0.006877899169921875,
-0.0126800537109375,
-0.0014858245849609375,
-0.023590087890625,
0.019195556640625,
0.0213470458984375,
0.0234375,
-0.04998779296875,
-0.005290985107421875,
-0.0253448486328125,
-0.034027099609375,
0.049224853515625,
-0.016937255859375,
0.01337432861328125,
-0.0274810791015625,
-0.03387451171875,
-0.0205078125,
-0.03570556640625,
0.00820159912109375,
0.034332275390625,
0.050872802734375,
0.00270843505859375,
0.036102294921875,
-0.01140594482421875,
0.042938232421875,
0.01015472412109375,
-0.0169830322265625,
0.04766845703125,
-0.0239105224609375,
-0.01296234130859375,
0.00827789306640625,
0.04962158203125,
0.00213623046875,
0.0296783447265625,
-0.01490020751953125,
-0.017486572265625,
0.005741119384765625,
0.00946807861328125,
-0.054840087890625,
-0.03533935546875,
0.00507354736328125,
-0.035186767578125,
0.0014133453369140625,
0.012786865234375,
-0.0265350341796875,
0.0015993118286132812,
-0.01371002197265625,
0.0399169921875,
-0.06317138671875,
-0.027862548828125,
0.0090484619140625,
-0.00421142578125,
0.04766845703125,
-0.0183563232421875,
-0.06085205078125,
0.01134490966796875,
0.046173095703125,
0.050140380859375,
0.0080718994140625,
-0.0300445556640625,
-0.0150909423828125,
-0.01499176025390625,
-0.006565093994140625,
0.056060791015625,
-0.040252685546875,
-0.030792236328125,
0.01375579833984375,
0.01337432861328125,
-0.011444091796875,
-0.0151214599609375,
0.06622314453125,
-0.045623779296875,
0.007358551025390625,
-0.0147247314453125,
-0.0595703125,
-0.018096923828125,
0.01412200927734375,
-0.0400390625,
0.07769775390625,
-0.002666473388671875,
-0.0526123046875,
0.0254364013671875,
-0.049591064453125,
-0.04913330078125,
0.01313018798828125,
-0.0014743804931640625,
-0.0253448486328125,
-0.006504058837890625,
0.0150146484375,
0.0283966064453125,
0.003330230712890625,
0.041015625,
-0.0013189315795898438,
-0.0176239013671875,
0.00604248046875,
-0.0249786376953125,
0.07763671875,
0.029296875,
-0.01352691650390625,
0.01021575927734375,
-0.059600830078125,
0.01099395751953125,
0.00009822845458984375,
-0.034210205078125,
-0.0271148681640625,
0.00641632080078125,
0.0187530517578125,
0.005405426025390625,
0.0260772705078125,
-0.0626220703125,
0.0189056396484375,
-0.0511474609375,
0.030975341796875,
0.031463623046875,
-0.01145172119140625,
0.033294677734375,
-0.0229949951171875,
0.03521728515625,
0.005645751953125,
0.00908660888671875,
-0.0206146240234375,
-0.04693603515625,
-0.08502197265625,
-0.0258026123046875,
0.0791015625,
0.040252685546875,
-0.0648193359375,
0.0584716796875,
-0.01506805419921875,
-0.045074462890625,
-0.058197021484375,
-0.0241851806640625,
0.031951904296875,
0.0218048095703125,
0.0267791748046875,
-0.021881103515625,
-0.045074462890625,
-0.08154296875,
-0.0056915283203125,
-0.014190673828125,
0.0169219970703125,
-0.006519317626953125,
0.044708251953125,
-0.01457977294921875,
0.08233642578125,
-0.00881195068359375,
-0.02325439453125,
-0.016357421875,
0.038299560546875,
0.0283050537109375,
0.058197021484375,
0.0435791015625,
-0.05047607421875,
-0.04534912109375,
-0.0154571533203125,
-0.021575927734375,
-0.006443023681640625,
-0.003856658935546875,
-0.0225067138671875,
0.03277587890625,
0.027008056640625,
-0.043609619140625,
0.0258941650390625,
0.0352783203125,
-0.02972412109375,
0.0277557373046875,
-0.01052093505859375,
-0.00341033935546875,
-0.0882568359375,
0.009246826171875,
0.011627197265625,
-0.0199737548828125,
-0.057098388671875,
0.021240234375,
0.0088043212890625,
0.006435394287109375,
-0.03839111328125,
0.035125732421875,
-0.030181884765625,
0.00873565673828125,
-0.01763916015625,
-0.0188751220703125,
-0.006595611572265625,
0.056304931640625,
0.015869140625,
0.04931640625,
0.0440673828125,
-0.02960205078125,
0.0248870849609375,
0.023406982421875,
-0.029022216796875,
0.02886962890625,
-0.050384521484375,
0.0048065185546875,
-0.0016489028930664062,
0.0174407958984375,
-0.07061767578125,
0.0007128715515136719,
0.01557159423828125,
-0.0484619140625,
0.044342041015625,
-0.014617919921875,
-0.0577392578125,
-0.01526641845703125,
-0.00804901123046875,
0.03533935546875,
0.03125,
-0.039093017578125,
0.058837890625,
0.038299560546875,
-0.0159149169921875,
-0.04852294921875,
-0.058258056640625,
-0.00890350341796875,
-0.006023406982421875,
-0.0452880859375,
0.05023193359375,
-0.025054931640625,
0.0060272216796875,
-0.007167816162109375,
-0.00583648681640625,
-0.0286865234375,
-0.0008053779602050781,
0.004718780517578125,
0.00772857666015625,
-0.0186920166015625,
0.02130126953125,
0.00789642333984375,
-0.006443023681640625,
0.01465606689453125,
-0.004673004150390625,
0.055511474609375,
-0.021026611328125,
-0.02191162109375,
-0.0265350341796875,
0.04693603515625,
0.0261077880859375,
-0.0274505615234375,
0.04638671875,
0.057525634765625,
-0.044464111328125,
-0.01177215576171875,
-0.0267791748046875,
-0.0117950439453125,
-0.032684326171875,
0.0276641845703125,
-0.055511474609375,
-0.056610107421875,
0.043365478515625,
0.01568603515625,
0.00675201416015625,
0.053192138671875,
0.045074462890625,
-0.0165252685546875,
0.07843017578125,
0.058563232421875,
-0.0224151611328125,
0.039093017578125,
-0.029052734375,
0.0282440185546875,
-0.040740966796875,
-0.00894927978515625,
-0.0496826171875,
-0.0118255615234375,
-0.058135986328125,
-0.016571044921875,
-0.0002532005310058594,
0.00954437255859375,
-0.021820068359375,
0.0270843505859375,
-0.04248046875,
0.0278167724609375,
0.07464599609375,
-0.0086517333984375,
0.0107879638671875,
0.0082550048828125,
-0.0222320556640625,
0.0005755424499511719,
-0.057830810546875,
-0.031829833984375,
0.0667724609375,
0.040557861328125,
0.0526123046875,
0.01303863525390625,
0.052215576171875,
0.0259552001953125,
0.01454925537109375,
-0.07342529296875,
0.02703857421875,
-0.01434326171875,
-0.08416748046875,
-0.0056610107421875,
-0.00720977783203125,
-0.06781005859375,
0.0150909423828125,
-0.00545501708984375,
-0.04876708984375,
0.0261688232421875,
-0.0018215179443359375,
-0.0245361328125,
0.0114288330078125,
-0.06719970703125,
0.0640869140625,
-0.0161590576171875,
0.01776123046875,
0.00004094839096069336,
-0.05889892578125,
0.010101318359375,
-0.01116180419921875,
0.01129913330078125,
-0.00577545166015625,
0.0308685302734375,
0.075439453125,
-0.02435302734375,
0.073486328125,
-0.0217132568359375,
-0.0088348388671875,
0.018524169921875,
-0.01076507568359375,
0.006954193115234375,
-0.0052337646484375,
-0.00467681884765625,
0.035919189453125,
0.0287628173828125,
-0.03582763671875,
-0.00659942626953125,
0.051605224609375,
-0.064697265625,
-0.033050537109375,
-0.06976318359375,
-0.04345703125,
-0.015655517578125,
0.021514892578125,
0.0262908935546875,
0.0335693359375,
-0.01434326171875,
0.017547607421875,
0.049041748046875,
-0.01300811767578125,
0.043701171875,
0.048004150390625,
-0.00443267822265625,
-0.0229949951171875,
0.0732421875,
0.0076446533203125,
-0.007305145263671875,
0.038787841796875,
-0.0009632110595703125,
-0.0285491943359375,
-0.0233612060546875,
-0.0201416015625,
0.03656005859375,
-0.058319091796875,
-0.02935791015625,
-0.040863037109375,
-0.052337646484375,
-0.02960205078125,
0.007598876953125,
-0.025726318359375,
-0.02685546875,
-0.0299530029296875,
0.00020515918731689453,
0.01375579833984375,
0.03765869140625,
-0.00830841064453125,
0.045166015625,
-0.0435791015625,
0.003719329833984375,
0.00762176513671875,
0.0269622802734375,
-0.01378631591796875,
-0.057891845703125,
-0.02685546875,
0.023040771484375,
-0.0171356201171875,
-0.066650390625,
0.032806396484375,
0.03466796875,
0.06097412109375,
0.009063720703125,
0.00879669189453125,
0.0391845703125,
-0.041595458984375,
0.055145263671875,
0.0016088485717773438,
-0.09271240234375,
0.031005859375,
-0.01837158203125,
0.025054931640625,
0.02960205078125,
0.04315185546875,
-0.031524658203125,
-0.0178070068359375,
-0.048583984375,
-0.072509765625,
0.06988525390625,
0.0092010498046875,
0.0254364013671875,
-0.0156097412109375,
0.017425537109375,
0.0033111572265625,
0.007244110107421875,
-0.08154296875,
-0.04412841796875,
-0.019775390625,
-0.0269622802734375,
0.011138916015625,
-0.03741455078125,
0.003681182861328125,
-0.0428466796875,
0.08135986328125,
-0.00775146484375,
0.048583984375,
0.01464080810546875,
-0.02423095703125,
0.0112457275390625,
0.022979736328125,
0.04339599609375,
0.03515625,
-0.032196044921875,
-0.006649017333984375,
-0.003719329833984375,
-0.0233306884765625,
-0.011505126953125,
0.043731689453125,
-0.0014171600341796875,
0.0247344970703125,
0.0240478515625,
0.049835205078125,
0.004123687744140625,
-0.03271484375,
0.043426513671875,
-0.00860595703125,
-0.03643798828125,
-0.0287322998046875,
-0.0225067138671875,
0.004718780517578125,
0.0260162353515625,
0.043426513671875,
-0.005138397216796875,
0.0018444061279296875,
-0.036590576171875,
0.021484375,
0.035552978515625,
-0.0208587646484375,
-0.0307159423828125,
0.03131103515625,
0.0244293212890625,
-0.0236968994140625,
0.04461669921875,
-0.035980224609375,
-0.06048583984375,
0.0302734375,
0.0477294921875,
0.05145263671875,
-0.02630615234375,
0.005706787109375,
0.03143310546875,
0.04278564453125,
0.0274810791015625,
0.053375244140625,
0.0002627372741699219,
-0.07330322265625,
-0.0404052734375,
-0.07196044921875,
-0.017059326171875,
0.0144195556640625,
-0.051055908203125,
0.00875091552734375,
-0.01190185546875,
-0.029571533203125,
0.025115966796875,
0.0076751708984375,
-0.05047607421875,
0.025665283203125,
0.036376953125,
0.079345703125,
-0.0599365234375,
0.10711669921875,
0.07171630859375,
-0.040008544921875,
-0.0626220703125,
0.0011272430419921875,
-0.032562255859375,
-0.046112060546875,
0.055938720703125,
-0.0012655258178710938,
0.00905609130859375,
0.017791748046875,
-0.0241546630859375,
-0.08526611328125,
0.054290771484375,
0.01465606689453125,
-0.037872314453125,
-0.00470733642578125,
0.00823211669921875,
0.041656494140625,
-0.03271484375,
0.019500732421875,
0.051788330078125,
0.0269927978515625,
-0.0016021728515625,
-0.0826416015625,
-0.0263214111328125,
-0.03387451171875,
0.00569915771484375,
0.01290130615234375,
-0.041839599609375,
0.08203125,
0.01085662841796875,
-0.0211029052734375,
0.0190887451171875,
0.058258056640625,
0.00209808349609375,
-0.0073394775390625,
0.03973388671875,
0.04644775390625,
0.04638671875,
-0.02593994140625,
0.055450439453125,
-0.051727294921875,
0.04730224609375,
0.0703125,
-0.006969451904296875,
0.07366943359375,
0.035400390625,
-0.018402099609375,
0.042266845703125,
0.048828125,
-0.005832672119140625,
0.052337646484375,
0.01184844970703125,
-0.003253936767578125,
-0.0196533203125,
-0.0125274658203125,
-0.03515625,
0.047821044921875,
0.03076171875,
-0.034942626953125,
-0.01360321044921875,
0.0011167526245117188,
0.012908935546875,
-0.01479339599609375,
-0.0036525726318359375,
0.037750244140625,
-0.00873565673828125,
-0.0404052734375,
0.050689697265625,
0.0290069580078125,
0.076416015625,
-0.05029296875,
0.002262115478515625,
-0.017120361328125,
0.00382232666015625,
-0.0011472702026367188,
-0.04315185546875,
0.016357421875,
0.007232666015625,
-0.032806396484375,
-0.01320648193359375,
0.042327880859375,
-0.050689697265625,
-0.04888916015625,
0.01534271240234375,
0.0413818359375,
0.0201416015625,
0.00833892822265625,
-0.0693359375,
0.01025390625,
0.01494598388671875,
-0.0172576904296875,
0.007022857666015625,
0.01198577880859375,
0.002956390380859375,
0.023101806640625,
0.03753662109375,
-0.0091094970703125,
0.0174407958984375,
0.025726318359375,
0.058197021484375,
-0.05279541015625,
-0.0380859375,
-0.04248046875,
0.043548583984375,
0.0065765380859375,
-0.027862548828125,
0.046630859375,
0.04541015625,
0.08184814453125,
-0.0006403923034667969,
0.06109619140625,
-0.022430419921875,
0.050689697265625,
-0.020660400390625,
0.057586669921875,
-0.035369873046875,
-0.005031585693359375,
-0.007534027099609375,
-0.06622314453125,
-0.026092529296875,
0.07269287109375,
-0.006877899169921875,
0.0025310516357421875,
0.055328369140625,
0.055511474609375,
-0.0029621124267578125,
-0.002353668212890625,
0.0242462158203125,
0.0193328857421875,
0.0237884521484375,
0.0262298583984375,
0.05328369140625,
-0.047393798828125,
0.036651611328125,
-0.043975830078125,
-0.015899658203125,
-0.0113677978515625,
-0.064208984375,
-0.0701904296875,
-0.06427001953125,
-0.034423828125,
-0.01531982421875,
-0.004161834716796875,
0.0660400390625,
0.07049560546875,
-0.07904052734375,
-0.021209716796875,
0.00933074951171875,
-0.0108795166015625,
-0.0011835098266601562,
-0.0218505859375,
0.024261474609375,
-0.035919189453125,
-0.06866455078125,
0.03125,
-0.0160675048828125,
-0.001445770263671875,
-0.0247802734375,
0.00354766845703125,
-0.052154541015625,
0.01352691650390625,
0.037200927734375,
0.005153656005859375,
-0.07257080078125,
-0.0185546875,
0.0012521743774414062,
-0.019134521484375,
0.0077056884765625,
0.03533935546875,
-0.04815673828125,
0.042694091796875,
0.0269012451171875,
0.035247802734375,
0.027130126953125,
0.0048980712890625,
0.035247802734375,
-0.087158203125,
0.013641357421875,
0.0152740478515625,
0.048919677734375,
0.0312347412109375,
-0.00659942626953125,
0.020843505859375,
0.016937255859375,
-0.042633056640625,
-0.053436279296875,
0.0094146728515625,
-0.08380126953125,
-0.02288818359375,
0.0863037109375,
-0.024078369140625,
-0.0189666748046875,
0.0023632049560546875,
-0.0250244140625,
0.03363037109375,
-0.040252685546875,
0.029052734375,
0.07452392578125,
0.003726959228515625,
-0.0006928443908691406,
-0.011749267578125,
0.030487060546875,
0.0296783447265625,
-0.0268402099609375,
-0.01983642578125,
0.0245513916015625,
0.0196533203125,
0.034698486328125,
0.0282745361328125,
0.01517486572265625,
0.008026123046875,
-0.001636505126953125,
0.045379638671875,
0.01323699951171875,
-0.0260772705078125,
-0.02557373046875,
0.004138946533203125,
0.01059722900390625,
-0.03240966796875
]
] |
indobenchmark/indobert-base-p1 | 2021-05-19T20:22:23.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"feature-extraction",
"indobert",
"indobenchmark",
"indonlu",
"id",
"dataset:Indo4B",
"arxiv:2009.05387",
"license:mit",
"has_space",
"region:us"
] | feature-extraction | indobenchmark | null | null | indobenchmark/indobert-base-p1 | 8 | 1,496,629 | transformers | 2022-03-02T23:29:05 | ---
language: id
tags:
- indobert
- indobenchmark
- indonlu
license: mit
inference: false
datasets:
- Indo4B
---
# IndoBERT Base Model (phase1 - uncased)
[IndoBERT](https://arxiv.org/abs/2009.05387) is a state-of-the-art language model for Indonesian based on the BERT architecture. The pretrained model is trained with the masked language modeling (MLM) and next sentence prediction (NSP) objectives.
## All Pre-trained Models
| Model | #params | Arch. | Training data |
|--------------------------------|--------------------------------|-------|-----------------------------------|
| `indobenchmark/indobert-base-p1` | 124.5M | Base | Indo4B (23.43 GB of text) |
| `indobenchmark/indobert-base-p2` | 124.5M | Base | Indo4B (23.43 GB of text) |
| `indobenchmark/indobert-large-p1` | 335.2M | Large | Indo4B (23.43 GB of text) |
| `indobenchmark/indobert-large-p2` | 335.2M | Large | Indo4B (23.43 GB of text) |
| `indobenchmark/indobert-lite-base-p1` | 11.7M | Base | Indo4B (23.43 GB of text) |
| `indobenchmark/indobert-lite-base-p2` | 11.7M | Base | Indo4B (23.43 GB of text) |
| `indobenchmark/indobert-lite-large-p1` | 17.7M | Large | Indo4B (23.43 GB of text) |
| `indobenchmark/indobert-lite-large-p2` | 17.7M | Large | Indo4B (23.43 GB of text) |
## How to use
### Load model and tokenizer
```python
from transformers import BertTokenizer, AutoModel
tokenizer = BertTokenizer.from_pretrained("indobenchmark/indobert-base-p1")
model = AutoModel.from_pretrained("indobenchmark/indobert-base-p1")
```
### Extract contextual representation
```python
import torch

x = torch.LongTensor(tokenizer.encode('aku adalah anak [MASK]')).view(1, -1)
print(x, model(x)[0].sum())
```
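The first element of the model output is the last hidden state, of shape (batch, seq_len, hidden); a common way to reduce it to a single sentence vector is to mean-pool over the token axis. A minimal pure-Python sketch of the pooling step follows; the toy values are illustrative, not actual model outputs:

```python
def mean_pool(hidden_states):
    """Average a [seq_len][dim] list of token vectors into one sentence vector."""
    seq_len = len(hidden_states)
    dim = len(hidden_states[0])
    return [sum(tok[d] for tok in hidden_states) / seq_len for d in range(dim)]

# Three 2-dimensional token vectors -> one 2-dimensional sentence vector.
toy = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(mean_pool(toy))  # [3.0, 4.0]
```

With real outputs the same reduction is usually done on the tensor directly (e.g. `model(x)[0].mean(dim=1)` in PyTorch), masking out padding tokens first.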
## Authors
<b>IndoBERT</b> was trained and evaluated by Bryan Wilie\*, Karissa Vincentio\*, Genta Indra Winata\*, Samuel Cahyawijaya\*, Xiaohong Li, Zhi Yuan Lim, Sidik Soleman, Rahmad Mahendra, Pascale Fung, Syafri Bahar, Ayu Purwarianti.
## Citation
If you use our work, please cite:
```bibtex
@inproceedings{wilie2020indonlu,
title={IndoNLU: Benchmark and Resources for Evaluating Indonesian Natural Language Understanding},
author={Bryan Wilie and Karissa Vincentio and Genta Indra Winata and Samuel Cahyawijaya and X. Li and Zhi Yuan Lim and S. Soleman and R. Mahendra and Pascale Fung and Syafri Bahar and A. Purwarianti},
booktitle={Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing},
year={2020}
}
```
| 2,712 | [
[
-0.0278472900390625,
-0.035003662109375,
0.00737762451171875,
0.03814697265625,
-0.034637451171875,
-0.0213775634765625,
-0.036834716796875,
-0.0243377685546875,
0.0171051025390625,
0.03717041015625,
-0.026885986328125,
-0.029693603515625,
-0.04986572265625,
0.022918701171875,
-0.0056304931640625,
0.06060791015625,
-0.005558013916015625,
0.004825592041015625,
-0.005950927734375,
-0.030548095703125,
-0.0255126953125,
-0.052581787109375,
-0.0333251953125,
-0.026275634765625,
0.00351715087890625,
0.0250701904296875,
0.0195159912109375,
0.0287628173828125,
0.0282440185546875,
0.0202484130859375,
-0.017669677734375,
-0.007358551025390625,
-0.01142120361328125,
0.0115203857421875,
0.0106658935546875,
-0.0404052734375,
-0.028228759765625,
-0.00321197509765625,
0.047576904296875,
0.04833984375,
0.0166015625,
0.018157958984375,
0.0163116455078125,
0.060882568359375,
-0.05413818359375,
0.040252685546875,
-0.0252532958984375,
-0.00820159912109375,
-0.01340484619140625,
-0.006710052490234375,
-0.0467529296875,
-0.0197296142578125,
0.027130126953125,
-0.039459228515625,
-0.0131378173828125,
0.0014715194702148438,
0.0806884765625,
0.0245513916015625,
-0.04034423828125,
-0.0212249755859375,
-0.0404052734375,
0.07427978515625,
-0.059173583984375,
0.018646240234375,
0.03277587890625,
0.01172637939453125,
-0.0004336833953857422,
-0.0655517578125,
-0.05218505859375,
-0.0109710693359375,
-0.039520263671875,
0.0181884765625,
-0.014404296875,
0.012420654296875,
0.017364501953125,
0.024017333984375,
-0.046051025390625,
-0.019805908203125,
-0.0458984375,
-0.00356292724609375,
0.05682373046875,
-0.0120086669921875,
0.01568603515625,
-0.052337646484375,
-0.03515625,
-0.0226287841796875,
-0.03973388671875,
0.0240478515625,
0.021697998046875,
0.0184326171875,
-0.01526641845703125,
0.039886474609375,
-0.0125579833984375,
0.060455322265625,
0.00505828857421875,
-0.0208892822265625,
0.045166015625,
-0.0277862548828125,
-0.030517578125,
0.025054931640625,
0.059234619140625,
0.0232086181640625,
0.038604736328125,
0.01256561279296875,
0.00489044189453125,
-0.006870269775390625,
-0.006031036376953125,
-0.0670166015625,
-0.0232696533203125,
0.0206756591796875,
-0.05462646484375,
-0.008392333984375,
-0.00986480712890625,
-0.042755126953125,
-0.0005092620849609375,
-0.033203125,
0.0404052734375,
-0.061065673828125,
-0.0413818359375,
-0.01277923583984375,
-0.0063323974609375,
0.042816162109375,
-0.01045989990234375,
-0.07049560546875,
0.0196533203125,
0.0300445556640625,
0.057098388671875,
-0.003086090087890625,
-0.016937255859375,
-0.004150390625,
-0.0026149749755859375,
-0.0239410400390625,
0.03704833984375,
-0.0433349609375,
-0.023681640625,
0.0013980865478515625,
0.001373291015625,
-0.0309600830078125,
-0.032562255859375,
0.05706787109375,
-0.012603759765625,
0.0318603515625,
-0.01255035400390625,
-0.05572509765625,
-0.047393798828125,
0.0186767578125,
-0.0283660888671875,
0.085205078125,
0.01247406005859375,
-0.07330322265625,
0.041473388671875,
-0.048065185546875,
-0.0167236328125,
0.006916046142578125,
-0.0122833251953125,
-0.0294342041015625,
-0.005229949951171875,
0.035186767578125,
0.040557861328125,
-0.0078582763671875,
0.023101806640625,
-0.003589630126953125,
-0.02313232421875,
0.0027751922607421875,
-0.0106048583984375,
0.1044921875,
0.0197296142578125,
-0.0294036865234375,
0.0286865234375,
-0.081298828125,
0.006328582763671875,
0.007030487060546875,
-0.01259613037109375,
-0.04937744140625,
-0.01100921630859375,
0.01611328125,
0.01520538330078125,
0.0226287841796875,
-0.03466796875,
0.021514892578125,
-0.05279541015625,
0.0265045166015625,
0.04876708984375,
0.01491546630859375,
0.021148681640625,
-0.019805908203125,
0.0301971435546875,
0.002086639404296875,
0.0094757080078125,
-0.028411865234375,
-0.0299835205078125,
-0.0865478515625,
-0.04351806640625,
0.0291290283203125,
0.04547119140625,
-0.046722412109375,
0.05194091796875,
-0.0236358642578125,
-0.05035400390625,
-0.028961181640625,
0.004932403564453125,
0.0203094482421875,
0.03643798828125,
0.033599853515625,
-0.0181427001953125,
-0.056182861328125,
-0.0843505859375,
-0.0313720703125,
-0.0233306884765625,
0.0025653839111328125,
0.020233154296875,
0.0426025390625,
-0.01490020751953125,
0.08551025390625,
-0.012969970703125,
-0.0211944580078125,
-0.030548095703125,
0.0194549560546875,
0.037109375,
0.05218505859375,
0.0596923828125,
-0.053619384765625,
-0.0740966796875,
-0.029510498046875,
-0.05438232421875,
0.0004992485046386719,
-0.00043082237243652344,
-0.01036834716796875,
0.036224365234375,
0.03729248046875,
-0.0487060546875,
0.04193115234375,
0.02325439453125,
-0.023101806640625,
0.038360595703125,
-0.020843505859375,
-0.003711700439453125,
-0.09619140625,
0.021636962890625,
-0.00684356689453125,
-0.0012950897216796875,
-0.047393798828125,
-0.0037937164306640625,
0.0165863037109375,
0.0009222030639648438,
-0.0293731689453125,
0.052490234375,
-0.03753662109375,
0.0173797607421875,
-0.0051727294921875,
-0.004909515380859375,
-0.0183563232421875,
0.06170654296875,
0.004344940185546875,
0.049774169921875,
0.05584716796875,
-0.042449951171875,
0.0102691650390625,
0.033447265625,
-0.036590576171875,
0.037078857421875,
-0.06146240234375,
-0.00972747802734375,
-0.021484375,
0.02703857421875,
-0.088134765625,
-0.0128936767578125,
0.0380859375,
-0.03363037109375,
0.029510498046875,
-0.0167236328125,
-0.0447998046875,
-0.00577545166015625,
-0.0241241455078125,
0.03277587890625,
0.055633544921875,
-0.04425048828125,
0.06024169921875,
0.0032749176025390625,
-0.026275634765625,
-0.0531005859375,
-0.0484619140625,
-0.01137542724609375,
-0.024505615234375,
-0.05352783203125,
0.02728271484375,
-0.0013818740844726562,
-0.011444091796875,
0.006221771240234375,
0.0006418228149414062,
-0.00519561767578125,
-0.01236724853515625,
0.03717041015625,
0.0235137939453125,
-0.004238128662109375,
0.0113372802734375,
0.007080078125,
-0.0122833251953125,
0.015838623046875,
-0.0120086669921875,
0.04620361328125,
-0.01763916015625,
0.0003898143768310547,
-0.02996826171875,
0.005039215087890625,
0.033966064453125,
-0.031707763671875,
0.0689697265625,
0.04302978515625,
-0.03179931640625,
0.0093536376953125,
-0.030731201171875,
-0.005443572998046875,
-0.03155517578125,
0.0167694091796875,
-0.032684326171875,
-0.033050537109375,
0.0299530029296875,
0.004375457763671875,
0.01495361328125,
0.06396484375,
0.042999267578125,
0.022216796875,
0.0709228515625,
0.0667724609375,
0.0009860992431640625,
0.031494140625,
-0.034393310546875,
0.029693603515625,
-0.062103271484375,
-0.0215911865234375,
-0.036956787109375,
-0.005863189697265625,
-0.058135986328125,
-0.00787353515625,
0.019500732421875,
0.017669677734375,
-0.030853271484375,
0.027130126953125,
-0.05218505859375,
0.0108184814453125,
0.047515869140625,
-0.00981903076171875,
0.01080322265625,
0.01123809814453125,
-0.03106689453125,
-0.01800537109375,
-0.0540771484375,
-0.041290283203125,
0.10589599609375,
0.01800537109375,
0.03961181640625,
0.0016307830810546875,
0.056182861328125,
-0.003604888916015625,
0.0439453125,
-0.041107177734375,
0.04010009765625,
0.0079498291015625,
-0.064697265625,
-0.01010894775390625,
-0.01334381103515625,
-0.06402587890625,
0.037078857421875,
-0.004871368408203125,
-0.0648193359375,
0.0296630859375,
0.0016279220581054688,
-0.0157318115234375,
0.0072784423828125,
-0.055328369140625,
0.06829833984375,
-0.011444091796875,
-0.0107574462890625,
-0.005451202392578125,
-0.05560302734375,
0.03643798828125,
0.0132598876953125,
0.003063201904296875,
-0.0054473876953125,
0.007415771484375,
0.07220458984375,
-0.03350830078125,
0.0504150390625,
-0.02001953125,
0.017547607421875,
0.02984619140625,
-0.01788330078125,
0.0162506103515625,
0.00463104248046875,
-0.005039215087890625,
0.011383056640625,
0.010498046875,
-0.04034423828125,
-0.023040771484375,
0.045989990234375,
-0.06982421875,
-0.02484130859375,
-0.0662841796875,
-0.026885986328125,
-0.0013151168823242188,
0.03155517578125,
0.038970947265625,
0.024749755859375,
0.01245880126953125,
0.0307769775390625,
0.0274658203125,
-0.0217742919921875,
0.041259765625,
0.02569580078125,
-0.0176239013671875,
-0.0298309326171875,
0.061065673828125,
0.01398468017578125,
0.006805419921875,
0.0121002197265625,
0.002227783203125,
-0.0233154296875,
-0.01611328125,
-0.018310546875,
0.0241241455078125,
-0.069091796875,
0.0029163360595703125,
-0.06597900390625,
-0.034912109375,
-0.0369873046875,
-0.0011196136474609375,
-0.020660400390625,
-0.03875732421875,
-0.01251983642578125,
0.0030517578125,
0.0208892822265625,
0.02862548828125,
-0.008758544921875,
0.016571044921875,
-0.02886962890625,
-0.00374603271484375,
0.0198211669921875,
0.0180511474609375,
-0.001583099365234375,
-0.046478271484375,
0.0023326873779296875,
-0.00798797607421875,
-0.0071563720703125,
-0.06103515625,
0.0482177734375,
0.012420654296875,
0.039581298828125,
0.017730712890625,
0.0174713134765625,
0.039764404296875,
-0.020599365234375,
0.061065673828125,
-0.0012073516845703125,
-0.06707763671875,
0.04443359375,
0.01068878173828125,
0.02362060546875,
0.05279541015625,
0.0504150390625,
-0.0321044921875,
-0.01727294921875,
-0.061248779296875,
-0.08575439453125,
0.054656982421875,
0.02886962890625,
-0.0020275115966796875,
0.0094757080078125,
0.0180206298828125,
0.0019588470458984375,
0.0177001953125,
-0.0743408203125,
-0.0496826171875,
-0.035919189453125,
-0.0276641845703125,
-0.0127105712890625,
-0.039215087890625,
-0.0012388229370117188,
-0.0440673828125,
0.0732421875,
0.0038356781005859375,
0.0209197998046875,
0.0305328369140625,
-0.029693603515625,
0.008331298828125,
0.02197265625,
0.04095458984375,
0.0638427734375,
-0.04071044921875,
-0.0038127899169921875,
0.00933837890625,
-0.029144287109375,
0.00278472900390625,
0.041259765625,
-0.0199737548828125,
0.01776123046875,
0.04144287109375,
0.07281494140625,
0.001129150390625,
-0.05731201171875,
0.043731689453125,
-0.0183868408203125,
-0.0237579345703125,
-0.0704345703125,
0.0010061264038085938,
-0.012237548828125,
0.0041961669921875,
0.043212890625,
0.007129669189453125,
-0.004055023193359375,
-0.0197601318359375,
0.0012083053588867188,
0.01153564453125,
-0.03240966796875,
-0.005222320556640625,
0.0467529296875,
0.0027561187744140625,
-0.0087738037109375,
0.048187255859375,
-0.0131988525390625,
-0.050567626953125,
0.0404052734375,
0.032379150390625,
0.0682373046875,
-0.008209228515625,
-0.004711151123046875,
0.0645751953125,
0.038360595703125,
0.01739501953125,
0.02178955078125,
-0.00749969482421875,
-0.0282440185546875,
-0.03558349609375,
-0.052734375,
-0.0179901123046875,
0.021759033203125,
-0.04638671875,
0.01099395751953125,
-0.0305328369140625,
0.004909515380859375,
0.00214385986328125,
0.014617919921875,
-0.0489501953125,
0.012725830078125,
-0.004413604736328125,
0.07452392578125,
-0.0582275390625,
0.058929443359375,
0.06256103515625,
-0.045989990234375,
-0.0673828125,
0.002094268798828125,
-0.019378662109375,
-0.039093017578125,
0.07513427734375,
0.03900146484375,
0.02288818359375,
-0.004993438720703125,
-0.039093017578125,
-0.07440185546875,
0.07330322265625,
0.0090789794921875,
-0.025726318359375,
0.01824951171875,
-0.0005750656127929688,
0.04180908203125,
-0.01751708984375,
0.0350341796875,
0.042633056640625,
0.038818359375,
-0.01528167724609375,
-0.05267333984375,
0.00959014892578125,
-0.04510498046875,
0.005184173583984375,
-0.00734710693359375,
-0.053680419921875,
0.0860595703125,
0.01554107666015625,
-0.0150604248046875,
0.0016355514526367188,
0.058380126953125,
0.0222015380859375,
-0.006130218505859375,
0.044036865234375,
0.042633056640625,
0.032470703125,
-0.0016851425170898438,
0.0782470703125,
-0.060272216796875,
0.038299560546875,
0.055328369140625,
0.003505706787109375,
0.0599365234375,
0.04022216796875,
-0.0134124755859375,
0.05120849609375,
0.044952392578125,
0.00949859619140625,
0.034027099609375,
0.00624847412109375,
-0.015716552734375,
-0.00981903076171875,
0.005657196044921875,
-0.042388916015625,
0.03460693359375,
0.023193359375,
-0.0281524658203125,
-0.00571441650390625,
0.004932403564453125,
0.01210784912109375,
-0.003582000732421875,
-0.0020961761474609375,
0.041656494140625,
-0.0006504058837890625,
-0.0682373046875,
0.08050537109375,
0.01184844970703125,
0.054534912109375,
-0.050537109375,
0.00030040740966796875,
-0.0196380615234375,
0.028533935546875,
-0.0116119384765625,
-0.032440185546875,
0.01387786865234375,
-0.005222320556640625,
-0.00421142578125,
-0.003925323486328125,
0.061614990234375,
-0.0423583984375,
-0.052947998046875,
0.01471710205078125,
0.0095672607421875,
0.006351470947265625,
0.0005011558532714844,
-0.05224609375,
0.006500244140625,
0.00534820556640625,
-0.041900634765625,
0.0021076202392578125,
0.007770538330078125,
0.00921630859375,
0.037353515625,
0.034088134765625,
-0.008941650390625,
0.0157012939453125,
-0.0223236083984375,
0.049072265625,
-0.03466796875,
-0.04864501953125,
-0.065673828125,
0.0419921875,
-0.031097412109375,
-0.04766845703125,
0.06622314453125,
0.045318603515625,
0.066162109375,
-0.0002741813659667969,
0.07330322265625,
-0.024566650390625,
0.043670654296875,
-0.01788330078125,
0.047515869140625,
-0.05078125,
-0.0014905929565429688,
-0.01904296875,
-0.08123779296875,
-0.0255126953125,
0.07427978515625,
-0.03399658203125,
0.00650787353515625,
0.045867919921875,
0.045318603515625,
-0.006610870361328125,
-0.006221771240234375,
-0.011871337890625,
0.033966064453125,
0.03558349609375,
0.0352783203125,
0.024322509765625,
-0.045623779296875,
0.047698974609375,
-0.05224609375,
-0.03155517578125,
-0.033355712890625,
-0.057281494140625,
-0.09442138671875,
-0.061737060546875,
-0.01187896728515625,
-0.0237579345703125,
-0.0063323974609375,
0.08575439453125,
0.0518798828125,
-0.0645751953125,
-0.018646240234375,
-0.01384735107421875,
-0.0063934326171875,
-0.016143798828125,
-0.0194854736328125,
0.06182861328125,
-0.03363037109375,
-0.057708740234375,
0.003955841064453125,
0.019622802734375,
0.0018243789672851562,
-0.0234832763671875,
-0.01556396484375,
-0.04168701171875,
0.01273345947265625,
0.045013427734375,
0.0247650146484375,
-0.05218505859375,
-0.004627227783203125,
-0.01288604736328125,
-0.02032470703125,
0.010833740234375,
0.029144287109375,
-0.05255126953125,
0.04949951171875,
0.039825439453125,
0.04534912109375,
0.058807373046875,
-0.01183319091796875,
0.0247344970703125,
-0.0506591796875,
0.050933837890625,
0.0131988525390625,
0.0274200439453125,
0.02728271484375,
-0.000019431114196777344,
0.04290771484375,
0.016693115234375,
-0.037322998046875,
-0.0604248046875,
0.00038933753967285156,
-0.06787109375,
-0.0218353271484375,
0.067626953125,
-0.01190948486328125,
-0.028564453125,
0.00812530517578125,
-0.0303802490234375,
0.024658203125,
-0.010955810546875,
0.03826904296875,
0.058502197265625,
-0.01340484619140625,
-0.0137786865234375,
-0.04296875,
0.045074462890625,
0.06182861328125,
-0.042449951171875,
-0.0178680419921875,
0.006999969482421875,
0.024169921875,
0.0191192626953125,
0.055145263671875,
0.0006208419799804688,
0.00959014892578125,
0.002880096435546875,
0.0125885009765625,
0.000025987625122070312,
-0.00988006591796875,
-0.0151519775390625,
0.0012302398681640625,
-0.003917694091796875,
-0.01462554931640625
]
] |
google/flan-t5-large | 2023-07-17T12:49:05.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"t5",
"text2text-generation",
"en",
"fr",
"ro",
"de",
"multilingual",
"dataset:svakulenk0/qrecc",
"dataset:taskmaster2",
"dataset:djaym7/wiki_dialog",
"dataset:deepmind/code_contests",
"dataset:lambada",
"dataset:gsm8k",
"dataset:aqua_rat",
"dataset:esnli",
"dataset:quasc",
"dataset:qed",
"arxiv:2210.11416",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/flan-t5-large | 299 | 1,485,728 | transformers | 2022-10-21T10:07:08 | ---
language:
- en
- fr
- ro
- de
- multilingual
widget:
- text: "Translate to German: My name is Arthur"
example_title: "Translation"
- text: "Please answer to the following question. Who is going to be the next Ballon d'or?"
example_title: "Question Answering"
- text: "Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering."
example_title: "Logical reasoning"
- text: "Please answer the following question. What is the boiling point of Nitrogen?"
example_title: "Scientific knowledge"
- text: "Answer the following yes/no question. Can you write a whole Haiku in a single tweet?"
example_title: "Yes/no question"
- text: "Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?"
example_title: "Reasoning task"
- text: "Q: ( False or not False or False ) is? A: Let's think step by step"
example_title: "Boolean Expressions"
- text: "The square root of x is the cube root of y. What is y to the power of 2, if x = 4?"
example_title: "Math reasoning"
- text: "Premise: At my age you will probably have learnt one lesson. Hypothesis: It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis?"
example_title: "Premise and hypothesis"
tags:
- text2text-generation
datasets:
- svakulenk0/qrecc
- taskmaster2
- djaym7/wiki_dialog
- deepmind/code_contests
- lambada
- gsm8k
- aqua_rat
- esnli
- quasc
- qed
license: apache-2.0
---
# Model Card for FLAN-T5 large
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/flan2_architecture.jpg"
alt="drawing" width="600"/>
# Table of Contents
0. [TL;DR](#TL;DR)
1. [Model Details](#model-details)
2. [Usage](#usage)
3. [Uses](#uses)
4. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
5. [Training Details](#training-details)
6. [Evaluation](#evaluation)
7. [Environmental Impact](#environmental-impact)
8. [Citation](#citation)
9. [Model Card Authors](#model-card-authors)
# TL;DR
If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more than 1,000 additional tasks, covering more languages as well.
As mentioned in the first few lines of the abstract:
> Flan-PaLM 540B achieves state-of-the-art performance on several benchmarks, such as 75.2% on five-shot MMLU. We also publicly release Flan-T5 checkpoints, which achieve strong few-shot performance even compared to much larger models, such as PaLM 62B. Overall, instruction finetuning is a general method for improving the performance and usability of pretrained language models.
**Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copied from the [T5 model card](https://huggingface.co/t5-large).
# Model Details
## Model Description
- **Model type:** Language model
- **Language(s) (NLP):** English, Spanish, Japanese, Persian, Hindi, French, Chinese, Bengali, Gujarati, German, Telugu, Italian, Arabic, Polish, Tamil, Marathi, Malayalam, Oriya, Panjabi, Portuguese, Urdu, Galician, Hebrew, Korean, Catalan, Thai, Dutch, Indonesian, Vietnamese, Bulgarian, Filipino, Central Khmer, Lao, Turkish, Russian, Croatian, Swedish, Yoruba, Kurdish, Burmese, Malay, Czech, Finnish, Somali, Tagalog, Swahili, Sinhala, Kannada, Zhuang, Igbo, Xhosa, Romanian, Haitian, Estonian, Slovak, Lithuanian, Greek, Nepali, Assamese, Norwegian
- **License:** Apache 2.0
- **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5)
- **Original Checkpoints:** [All Original FLAN-T5 Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints)
- **Resources for more information:**
- [Research paper](https://arxiv.org/pdf/2210.11416.pdf)
- [GitHub Repo](https://github.com/google-research/t5x)
- [Hugging Face FLAN-T5 Docs (Similar to T5) ](https://huggingface.co/docs/transformers/model_doc/t5)
# Usage
Below are some example scripts showing how to use the model with `transformers`:
## Using the PyTorch model
### Running the model on a CPU
<details>
<summary> Click to expand </summary>
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large")

# Tokenize an instruction-prefixed input and generate a response
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids

outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
### Running the model on a GPU
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large", device_map="auto")
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
### Running the model on a GPU using different precisions
#### FP16
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large", device_map="auto", torch_dtype=torch.float16)
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
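To see why lower precision helps, a back-of-the-envelope estimate of parameter memory is useful. The sketch below is illustrative only: the ~780M parameter count for FLAN-T5 large is approximate, and real memory usage also includes activations, the KV cache, and framework overhead.

```python
def param_memory_gib(n_params: int, bytes_per_param: int) -> float:
    """Approximate memory needed just to hold the model weights, in GiB."""
    return n_params * bytes_per_param / 1024**3

N_PARAMS = 780_000_000  # FLAN-T5 large, roughly

for dtype_name, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    print(f"{dtype_name}: ~{param_memory_gib(N_PARAMS, nbytes):.2f} GiB")
```

By this rough estimate the weights alone take close to 3 GiB in fp32, and each halving of precision halves that figure, which is why `torch_dtype=torch.float16` or `load_in_8bit=True` can make the model fit on smaller GPUs.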
#### INT8
<details>
<summary> Click to expand </summary>
```python
# pip install bitsandbytes accelerate
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large", device_map="auto", load_in_8bit=True)
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
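All of the snippets above follow the same pattern: a natural-language task instruction prepended to the input text. A tiny helper (hypothetical, for illustration only — not part of `transformers`) makes that pattern explicit:

```python
def flan_prompt(task: str, text: str) -> str:
    """Build a FLAN-style prompt by prefixing a task instruction to the input."""
    return f"{task}: {text}"

prompt = flan_prompt("translate English to German", "How old are you?")
print(prompt)  # translate English to German: How old are you?
```

The resulting string can be passed directly to the tokenizer in any of the examples above.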
# Uses
## Direct Use and Downstream Use
The authors write in [the original paper's model card](https://arxiv.org/pdf/2210.11416.pdf) that:
> The primary use is research on language models, including: research on zero-shot NLP tasks and in-context few-shot learning NLP tasks, such as reasoning, and question answering; advancing fairness and safety research, and understanding limitations of current large language models
See the [research paper](https://arxiv.org/pdf/2210.11416.pdf) for further details.
## Out-of-Scope Use
More information needed.
# Bias, Risks, and Limitations
The information in this section is copied from the model's [official model card](https://arxiv.org/pdf/2210.11416.pdf):
> Language models, including Flan-T5, can potentially be used for language generation in a harmful way, according to Rae et al. (2021). Flan-T5 should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application.
## Ethical considerations and risks
> Flan-T5 is fine-tuned on a large corpus of text data that was not filtered for explicit content or assessed for existing biases. As a result the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data.
## Known Limitations
> Flan-T5 has not been tested in real world applications.
## Sensitive Use
> Flan-T5 should not be applied for any unacceptable use cases, e.g., generation of abusive speech.
# Training Details
## Training Data
The model was trained on a mixture of tasks, including those described in the table below (from the original paper, Figure 2):

## Training Procedure
According to the model card from the [original paper](https://arxiv.org/pdf/2210.11416.pdf):
> These models are based on pretrained T5 (Raffel et al., 2020) and fine-tuned with instructions for better zero-shot and few-shot performance. There is one fine-tuned Flan model per T5 model size.
The model has been trained on TPU v3 or TPU v4 pods, using the [`t5x`](https://github.com/google-research/t5x) codebase together with [`jax`](https://github.com/google/jax).
# Evaluation
## Testing Data, Factors & Metrics
The authors evaluated the model on various tasks covering several languages (1,836 tasks in total). See the table below for some quantitative evaluation:

For full details, please check the [research paper](https://arxiv.org/pdf/2210.11416.pdf).
## Results
For full results for FLAN-T5-Large, see the [research paper](https://arxiv.org/pdf/2210.11416.pdf), Table 3.
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** Google Cloud TPU Pods - TPU v3 or TPU v4 | Number of chips โฅ 4.
- **Hours used:** More information needed
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Citation
**BibTeX:**
```bibtex
@misc{https://doi.org/10.48550/arxiv.2210.11416,
doi = {10.48550/ARXIV.2210.11416},
url = {https://arxiv.org/abs/2210.11416},
author = {Chung, Hyung Won and Hou, Le and Longpre, Shayne and Zoph, Barret and Tay, Yi and Fedus, William and Li, Eric and Wang, Xuezhi and Dehghani, Mostafa and Brahma, Siddhartha and Webson, Albert and Gu, Shixiang Shane and Dai, Zhuyun and Suzgun, Mirac and Chen, Xinyun and Chowdhery, Aakanksha and Narang, Sharan and Mishra, Gaurav and Yu, Adams and Zhao, Vincent and Huang, Yanping and Dai, Andrew and Yu, Hongkun and Petrov, Slav and Chi, Ed H. and Dean, Jeff and Devlin, Jacob and Roberts, Adam and Zhou, Denny and Le, Quoc V. and Wei, Jason},
keywords = {Machine Learning (cs.LG), Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Scaling Instruction-Finetuned Language Models},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 10,818 | [
[
-0.0343017578125,
-0.043365478515625,
0.02313232421875,
0.00015282630920410156,
-0.0069122314453125,
-0.0113525390625,
-0.03326416015625,
-0.048736572265625,
-0.00991058349609375,
0.00968170166015625,
-0.037017822265625,
-0.038421630859375,
-0.049468994140625,
0.00542449951171875,
-0.0189666748046875,
0.07781982421875,
-0.0102996826171875,
0.001689910888671875,
0.01409149169921875,
-0.006244659423828125,
-0.01154327392578125,
-0.02362060546875,
-0.052764892578125,
-0.0218963623046875,
0.0369873046875,
0.0217437744140625,
0.03460693359375,
0.0386962890625,
0.039398193359375,
0.0255584716796875,
-0.01267242431640625,
-0.004199981689453125,
-0.04052734375,
-0.0305633544921875,
0.003620147705078125,
-0.033935546875,
-0.04547119140625,
-0.0028228759765625,
0.03448486328125,
0.0367431640625,
0.006866455078125,
0.02490234375,
-0.01068115234375,
0.0197296142578125,
-0.041168212890625,
0.023468017578125,
-0.0222320556640625,
0.006755828857421875,
-0.021942138671875,
0.0081634521484375,
-0.01947021484375,
-0.0169525146484375,
0.004627227783203125,
-0.052398681640625,
0.036712646484375,
-0.00966644287109375,
0.10748291015625,
0.01001739501953125,
-0.007266998291015625,
-0.013214111328125,
-0.058868408203125,
0.07275390625,
-0.072265625,
0.03411865234375,
0.014129638671875,
0.0274505615234375,
0.0062408447265625,
-0.0611572265625,
-0.051483154296875,
-0.0243682861328125,
-0.0034027099609375,
0.01035308837890625,
-0.002132415771484375,
0.01549530029296875,
0.0423583984375,
0.0467529296875,
-0.034820556640625,
-0.002452850341796875,
-0.054107666015625,
-0.0117340087890625,
0.05487060546875,
-0.0024318695068359375,
0.036834716796875,
-0.004695892333984375,
-0.020751953125,
-0.03662109375,
-0.02569580078125,
0.00872802734375,
0.020355224609375,
0.03167724609375,
-0.035003662109375,
0.0300750732421875,
-0.001377105712890625,
0.040557861328125,
0.0230865478515625,
-0.037506103515625,
0.037506103515625,
-0.024810791015625,
-0.0268096923828125,
-0.01152801513671875,
0.0714111328125,
0.0153350830078125,
0.0189361572265625,
-0.0049591064453125,
-0.031097412109375,
-0.0005578994750976562,
0.01268768310546875,
-0.07366943359375,
-0.00823974609375,
0.0328369140625,
-0.029510498046875,
-0.03643798828125,
0.0125274658203125,
-0.062103271484375,
-0.0032520294189453125,
-0.0007648468017578125,
0.0404052734375,
-0.03717041015625,
-0.0439453125,
-0.00284576416015625,
-0.0143280029296875,
0.023712158203125,
0.0065460205078125,
-0.08111572265625,
0.01468658447265625,
0.039398193359375,
0.06427001953125,
0.00965118408203125,
-0.0245513916015625,
-0.0201873779296875,
0.0018796920776367188,
-0.016876220703125,
0.0309295654296875,
-0.0309906005859375,
-0.0261688232421875,
-0.0009889602661132812,
0.0152130126953125,
-0.014404296875,
-0.03594970703125,
0.051055908203125,
-0.0225067138671875,
0.037567138671875,
-0.02227783203125,
-0.0400390625,
-0.0302581787109375,
-0.0010166168212890625,
-0.050537109375,
0.08209228515625,
0.0207672119140625,
-0.0560302734375,
0.034332275390625,
-0.0709228515625,
-0.0338134765625,
-0.0117034912109375,
0.010467529296875,
-0.053314208984375,
0.004589080810546875,
0.0259857177734375,
0.0294647216796875,
-0.0176849365234375,
0.0164031982421875,
-0.037994384765625,
-0.0252685546875,
-0.011444091796875,
-0.00850677490234375,
0.07574462890625,
0.032684326171875,
-0.061370849609375,
0.020782470703125,
-0.0447998046875,
-0.004627227783203125,
0.0247039794921875,
-0.00733184814453125,
0.0120391845703125,
-0.02587890625,
0.018402099609375,
0.0309295654296875,
0.02044677734375,
-0.040130615234375,
0.0022182464599609375,
-0.039031982421875,
0.040008544921875,
0.0419921875,
-0.01528167724609375,
0.0302581787109375,
-0.038604736328125,
0.03607177734375,
0.023895263671875,
0.01715087890625,
-0.0088043212890625,
-0.0262298583984375,
-0.08587646484375,
-0.000743865966796875,
0.02142333984375,
0.032928466796875,
-0.04522705078125,
0.0303192138671875,
-0.038726806640625,
-0.052398681640625,
-0.031646728515625,
0.0079345703125,
0.027313232421875,
0.033935546875,
0.0362548828125,
-0.00589752197265625,
-0.03887939453125,
-0.052337646484375,
-0.01384735107421875,
0.002044677734375,
-0.0005922317504882812,
0.018585205078125,
0.059600830078125,
-0.0023174285888671875,
0.036651611328125,
-0.022735595703125,
-0.027923583984375,
-0.037139892578125,
0.006175994873046875,
0.010498046875,
0.050445556640625,
0.0623779296875,
-0.04400634765625,
-0.031280517578125,
0.00623321533203125,
-0.060089111328125,
0.004367828369140625,
-0.0082550048828125,
-0.00852203369140625,
0.03662109375,
0.017364501953125,
-0.049530029296875,
0.028656005859375,
0.03375244140625,
-0.017181396484375,
0.023162841796875,
-0.009368896484375,
0.00444793701171875,
-0.09075927734375,
0.0386962890625,
0.0108642578125,
-0.01305389404296875,
-0.0562744140625,
0.01068115234375,
0.00389862060546875,
-0.01690673828125,
-0.04498291015625,
0.058349609375,
-0.0267791748046875,
0.001220703125,
-0.0087127685546875,
-0.0011692047119140625,
-0.001369476318359375,
0.04241943359375,
0.0091094970703125,
0.06103515625,
0.026763916015625,
-0.054931640625,
0.004772186279296875,
0.006244659423828125,
-0.0200958251953125,
0.016082763671875,
-0.05511474609375,
0.01043701171875,
-0.000560760498046875,
0.0191802978515625,
-0.051116943359375,
-0.0298309326171875,
0.0178070068359375,
-0.03485107421875,
0.03558349609375,
0.005237579345703125,
-0.02801513671875,
-0.045806884765625,
-0.0221710205078125,
0.0240020751953125,
0.04998779296875,
-0.04534912109375,
0.05035400390625,
0.01554107666015625,
0.0249481201171875,
-0.04388427734375,
-0.06494140625,
-0.0191192626953125,
-0.036712646484375,
-0.062744140625,
0.04058837890625,
0.0010833740234375,
0.00014913082122802734,
-0.016845703125,
-0.00826263427734375,
-0.003597259521484375,
0.0023708343505859375,
0.0101165771484375,
0.006427764892578125,
-0.0205841064453125,
-0.0099029541015625,
-0.017791748046875,
-0.006748199462890625,
-0.002246856689453125,
-0.028839111328125,
0.0467529296875,
-0.01947021484375,
0.01214599609375,
-0.05950927734375,
-0.0023441314697265625,
0.04595947265625,
-0.0224609375,
0.07000732421875,
0.08551025390625,
-0.036468505859375,
0.0005273818969726562,
-0.049102783203125,
-0.0247650146484375,
-0.038482666015625,
0.01702880859375,
-0.036712646484375,
-0.047943115234375,
0.051513671875,
0.01557159423828125,
0.0247344970703125,
0.058990478515625,
0.036834716796875,
-0.00044918060302734375,
0.0704345703125,
0.050323486328125,
-0.003520965576171875,
0.058563232421875,
-0.052642822265625,
0.018218994140625,
-0.045013427734375,
-0.0133209228515625,
-0.035003662109375,
-0.021514892578125,
-0.05322265625,
-0.022125244140625,
0.02313232421875,
0.00504302978515625,
-0.04595947265625,
0.0260772705078125,
-0.0257415771484375,
0.00905609130859375,
0.042572021484375,
0.016326904296875,
-0.00414276123046875,
0.006549835205078125,
-0.00870513916015625,
-0.0025463104248046875,
-0.051849365234375,
-0.03826904296875,
0.08465576171875,
0.0313720703125,
0.03118896484375,
0.0037479400634765625,
0.05413818359375,
-0.00417327880859375,
0.0197296142578125,
-0.040374755859375,
0.0301971435546875,
-0.0177764892578125,
-0.0684814453125,
-0.004955291748046875,
-0.03192138671875,
-0.06292724609375,
0.004520416259765625,
-0.005767822265625,
-0.056671142578125,
0.0009641647338867188,
0.011627197265625,
-0.0352783203125,
0.044708251953125,
-0.06982421875,
0.08941650390625,
-0.0285186767578125,
-0.040008544921875,
-0.00620269775390625,
-0.036834716796875,
0.041534423828125,
0.01033782958984375,
0.00890350341796875,
0.004734039306640625,
0.00676727294921875,
0.060638427734375,
-0.05633544921875,
0.059234619140625,
-0.03253173828125,
-0.008819580078125,
0.0229034423828125,
-0.018585205078125,
0.029052734375,
-0.019378662109375,
-0.008941650390625,
0.026123046875,
0.00565338134765625,
-0.044921875,
-0.037933349609375,
0.053985595703125,
-0.07940673828125,
-0.043212890625,
-0.037109375,
-0.028656005859375,
0.00372314453125,
0.035675048828125,
0.0281219482421875,
0.0238189697265625,
0.00286865234375,
0.003108978271484375,
0.033447265625,
-0.031829833984375,
0.0479736328125,
0.006107330322265625,
-0.021881103515625,
-0.028717041015625,
0.07122802734375,
0.01000213623046875,
0.03607177734375,
0.023712158203125,
0.023406982421875,
-0.0242767333984375,
-0.0195159912109375,
-0.03546142578125,
0.0313720703125,
-0.046905517578125,
-0.00543975830078125,
-0.042572021484375,
-0.01186370849609375,
-0.03875732421875,
-0.01029205322265625,
-0.03424072265625,
-0.0287628173828125,
-0.02703857421875,
-0.00411224365234375,
0.0220184326171875,
0.05035400390625,
0.000005245208740234375,
0.0277862548828125,
-0.04547119140625,
0.0233306884765625,
0.00397491455078125,
0.027313232421875,
0.00852203369140625,
-0.051483154296875,
-0.01296234130859375,
0.0221710205078125,
-0.035308837890625,
-0.044647216796875,
0.02789306640625,
0.017333984375,
0.024932861328125,
0.040618896484375,
-0.00775909423828125,
0.06903076171875,
-0.010162353515625,
0.07861328125,
0.00433349609375,
-0.07440185546875,
0.04229736328125,
-0.037506103515625,
0.034942626953125,
0.025665283203125,
0.0264892578125,
-0.0249786376953125,
-0.0169525146484375,
-0.07659912109375,
-0.05413818359375,
0.0721435546875,
0.0192413330078125,
0.002964019775390625,
0.0216522216796875,
0.017181396484375,
-0.007495880126953125,
0.006031036376953125,
-0.06787109375,
-0.015869140625,
-0.035980224609375,
-0.0231781005859375,
-0.00890350341796875,
-0.005046844482421875,
-0.006450653076171875,
-0.0265655517578125,
0.06072998046875,
0.0038509368896484375,
0.048919677734375,
0.0091094970703125,
-0.017730712890625,
-0.01371002197265625,
-0.00013184547424316406,
0.06982421875,
0.036895751953125,
-0.0240325927734375,
-0.01114654541015625,
0.028839111328125,
-0.04425048828125,
-0.005626678466796875,
0.0084686279296875,
-0.0260772705078125,
-0.004730224609375,
0.03436279296875,
0.07977294921875,
0.01308441162109375,
-0.02838134765625,
0.033660888671875,
-0.004261016845703125,
-0.0285186767578125,
-0.036895751953125,
0.0254058837890625,
0.00782012939453125,
0.005161285400390625,
0.01198577880859375,
0.0022258758544921875,
-0.01490020751953125,
-0.029510498046875,
-0.0011396408081054688,
0.0136566162109375,
-0.01557159423828125,
-0.035400390625,
0.08123779296875,
0.017181396484375,
-0.007511138916015625,
0.042449951171875,
-0.00505828857421875,
-0.038726806640625,
0.051483154296875,
0.0328369140625,
0.0701904296875,
-0.01218414306640625,
0.00030422210693359375,
0.0701904296875,
0.0245208740234375,
-0.0105438232421875,
0.0264739990234375,
0.006336212158203125,
-0.037750244140625,
-0.01299285888671875,
-0.050689697265625,
0.001712799072265625,
0.034088134765625,
-0.03326416015625,
0.03814697265625,
-0.05682373046875,
-0.013946533203125,
0.0085906982421875,
0.033935546875,
-0.07080078125,
0.0304718017578125,
0.0219879150390625,
0.061279296875,
-0.055450439453125,
0.060394287109375,
0.04656982421875,
-0.07293701171875,
-0.083984375,
-0.0005192756652832031,
-0.00678253173828125,
-0.0408935546875,
0.0452880859375,
0.03131103515625,
0.0032291412353515625,
0.0008835792541503906,
-0.037139892578125,
-0.0675048828125,
0.09881591796875,
0.0301666259765625,
-0.03271484375,
-0.0094146728515625,
0.0251617431640625,
0.04522705078125,
-0.0189361572265625,
0.056915283203125,
0.041534423828125,
0.051849365234375,
0.007144927978515625,
-0.081298828125,
0.0153961181640625,
-0.018646240234375,
0.00971221923828125,
0.00011068582534790039,
-0.07861328125,
0.06988525390625,
-0.0233612060546875,
-0.0214080810546875,
-0.0013065338134765625,
0.064697265625,
0.016387939453125,
0.00653076171875,
0.04119873046875,
0.04571533203125,
0.059417724609375,
-0.01812744140625,
0.096923828125,
-0.041351318359375,
0.044158935546875,
0.04876708984375,
0.012725830078125,
0.050506591796875,
0.0193634033203125,
-0.0201873779296875,
0.032806396484375,
0.052764892578125,
-0.008453369140625,
0.0200347900390625,
-0.0033855438232421875,
-0.018402099609375,
-0.0076446533203125,
-0.003570556640625,
-0.038848876953125,
0.0253143310546875,
0.0285491943359375,
-0.034393310546875,
-0.0098724365234375,
-0.00360870361328125,
0.0280914306640625,
-0.0253753662109375,
-0.01244354248046875,
0.036895751953125,
0.00946807861328125,
-0.058563232421875,
0.08001708984375,
0.01123046875,
0.062042236328125,
-0.04095458984375,
0.01849365234375,
-0.0225067138671875,
0.0304718017578125,
-0.033203125,
-0.027252197265625,
0.01885986328125,
0.0017528533935546875,
0.0005764961242675781,
-0.01175689697265625,
0.0352783203125,
-0.03631591796875,
-0.05438232421875,
0.017608642578125,
0.00954437255859375,
0.01197052001953125,
0.01678466796875,
-0.0643310546875,
0.0195465087890625,
0.01010894775390625,
-0.0277557373046875,
0.01039886474609375,
0.0118560791015625,
-0.0009551048278808594,
0.044281005859375,
0.040069580078125,
-0.0115203857421875,
0.0194549560546875,
0.0096588134765625,
0.05322265625,
-0.051239013671875,
-0.0242767333984375,
-0.048553466796875,
0.04730224609375,
-0.006198883056640625,
-0.0394287109375,
0.051513671875,
0.046905517578125,
0.0858154296875,
-0.0121612548828125,
0.07080078125,
-0.0292205810546875,
0.022613525390625,
-0.031707763671875,
0.0537109375,
-0.061187744140625,
0.00264739990234375,
-0.029541015625,
-0.06219482421875,
-0.0170440673828125,
0.0653076171875,
-0.038818359375,
0.04931640625,
0.0577392578125,
0.0657958984375,
-0.0278472900390625,
0.004764556884765625,
0.01348876953125,
0.0233612060546875,
0.04986572265625,
0.054443359375,
0.0182647705078125,
-0.0648193359375,
0.04400634765625,
-0.05615234375,
0.007541656494140625,
-0.0177001953125,
-0.050079345703125,
-0.08245849609375,
-0.041778564453125,
-0.0206146240234375,
-0.03472900390625,
-0.003582000732421875,
0.06463623046875,
0.056671142578125,
-0.07684326171875,
-0.0255126953125,
-0.0203704833984375,
-0.0090179443359375,
-0.01605224609375,
-0.0185394287109375,
0.035125732421875,
-0.039306640625,
-0.0838623046875,
0.006183624267578125,
-0.0168914794921875,
0.01788330078125,
-0.0262603759765625,
-0.012451171875,
-0.0257568359375,
-0.0224761962890625,
0.0234527587890625,
0.0287933349609375,
-0.0626220703125,
-0.0281829833984375,
0.0006108283996582031,
-0.00809478759765625,
0.0088958740234375,
0.03680419921875,
-0.0321044921875,
0.02740478515625,
0.037872314453125,
0.038055419921875,
0.0615234375,
-0.005977630615234375,
0.050567626953125,
-0.03314208984375,
0.03289794921875,
0.0030345916748046875,
0.0205841064453125,
0.0302276611328125,
-0.018035888671875,
0.042236328125,
0.0232696533203125,
-0.031982421875,
-0.06005859375,
-0.0121307373046875,
-0.068359375,
-0.00016129016876220703,
0.09271240234375,
-0.0205841064453125,
-0.038421630859375,
0.01849365234375,
-0.0009050369262695312,
0.043426513671875,
-0.030517578125,
0.052032470703125,
0.05474853515625,
0.00894927978515625,
-0.0255584716796875,
-0.058319091796875,
0.05029296875,
0.040618896484375,
-0.057769775390625,
-0.01568603515625,
0.012969970703125,
0.041290283203125,
0.0143890380859375,
0.029937744140625,
-0.0042877197265625,
0.01485443115234375,
0.01123046875,
0.0187835693359375,
-0.01190185546875,
-0.007274627685546875,
-0.0216064453125,
0.003391265869140625,
-0.005130767822265625,
-0.007526397705078125
]
] |
microsoft/layoutlm-base-uncased | 2022-12-16T16:25:46.000Z | [
"transformers",
"pytorch",
"tf",
"layoutlm",
"arxiv:1912.13318",
"endpoints_compatible",
"has_space",
"region:us"
] | null | microsoft | null | null | microsoft/layoutlm-base-uncased | 25 | 1,444,971 | transformers | 2022-03-02T23:29:05 | # LayoutLM
**Multimodal (text + layout/format + image) pre-training for document AI**
[Microsoft Document AI](https://www.microsoft.com/en-us/research/project/document-ai/) | [GitHub](https://aka.ms/layoutlm)
## Model description
LayoutLM is a simple but effective pre-training method of text and layout for document image understanding and information extraction tasks, such as form understanding and receipt understanding. LayoutLM achieves state-of-the-art (SOTA) results on multiple datasets. For more details, please refer to our paper:
[LayoutLM: Pre-training of Text and Layout for Document Image Understanding](https://arxiv.org/abs/1912.13318)
Yiheng Xu, Minghao Li, Lei Cui, Shaohan Huang, Furu Wei, Ming Zhou, [KDD 2020](https://www.kdd.org/kdd2020/accepted-papers)
## Training data
We pre-train LayoutLM on IIT-CDIP Test Collection 1.0\* dataset with two settings.
* LayoutLM-Base, Uncased (11M documents, 2 epochs): 12-layer, 768-hidden, 12-heads, 113M parameters **(This Model)**
* LayoutLM-Large, Uncased (11M documents, 2 epochs): 24-layer, 1024-hidden, 16-heads, 343M parameters
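Since LayoutLM's key idea is feeding each token's position on the page alongside its text, a small sketch of the layout input may help. Per the LayoutLM paper, each token's bounding box is normalized to a 0–1000 grid; the helper below is an illustrative sketch of that normalization, not part of the `transformers` API.

```python
def normalize_bbox(bbox, page_width, page_height):
    """Scale an (x0, y0, x1, y1) pixel box to LayoutLM's 0-1000 range."""
    x0, y0, x1, y1 = bbox
    return [
        int(1000 * x0 / page_width),
        int(1000 * y0 / page_height),
        int(1000 * x1 / page_width),
        int(1000 * y1 / page_height),
    ]

# A token spanning the top-left quarter of a 612x792 (US Letter) page:
print(normalize_bbox((0, 0, 306, 396), 612, 792))  # [0, 0, 500, 500]
```

These normalized boxes are what the model consumes as its `bbox` input, one box per token.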
## Citation
If you find LayoutLM useful in your research, please cite the following paper:
```latex
@misc{xu2019layoutlm,
title={LayoutLM: Pre-training of Text and Layout for Document Image Understanding},
author={Yiheng Xu and Minghao Li and Lei Cui and Shaohan Huang and Furu Wei and Ming Zhou},
year={2019},
eprint={1912.13318},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 1,500 | [
[
-0.01995849609375,
-0.053558349609375,
0.0418701171875,
0.0208892822265625,
-0.02032470703125,
-0.007595062255859375,
0.01529693603515625,
-0.0106048583984375,
-0.007171630859375,
0.02734375,
-0.040252685546875,
-0.046142578125,
-0.04010009765625,
-0.0221405029296875,
-0.0126800537109375,
0.0633544921875,
-0.027191162109375,
0.0340576171875,
-0.036102294921875,
-0.0178375244140625,
-0.023895263671875,
-0.045318603515625,
-0.031402587890625,
-0.010467529296875,
0.023468017578125,
0.00344085693359375,
0.0220184326171875,
0.0367431640625,
0.0513916015625,
0.026458740234375,
-0.0005559921264648438,
0.035919189453125,
-0.024444580078125,
-0.013885498046875,
0.02166748046875,
-0.01690673828125,
-0.0498046875,
0.021728515625,
0.053253173828125,
0.0306243896484375,
0.0037059783935546875,
0.00974273681640625,
0.012908935546875,
0.052276611328125,
-0.045745849609375,
0.0052642822265625,
-0.01462554931640625,
0.0131683349609375,
-0.0242919921875,
-0.0330810546875,
-0.048919677734375,
-0.020233154296875,
0.0032825469970703125,
-0.07012939453125,
0.00479888916015625,
0.0276336669921875,
0.08612060546875,
0.01338958740234375,
-0.0272674560546875,
-0.020843505859375,
-0.030914306640625,
0.055938720703125,
-0.045745849609375,
0.044586181640625,
0.039031982421875,
0.0106048583984375,
0.00005632638931274414,
-0.06951904296875,
-0.033416748046875,
-0.0122222900390625,
-0.038238525390625,
0.0207672119140625,
0.012603759765625,
0.0018358230590820312,
0.035369873046875,
0.0143280029296875,
-0.06732177734375,
-0.002758026123046875,
-0.043365478515625,
-0.039794921875,
0.04400634765625,
0.00003707408905029297,
0.042144775390625,
-0.01033782958984375,
-0.0489501953125,
-0.0249786376953125,
-0.024200439453125,
-0.0009675025939941406,
0.0247802734375,
-0.005767822265625,
-0.033660888671875,
0.0116729736328125,
0.033538818359375,
0.058258056640625,
-0.006786346435546875,
-0.034423828125,
0.0487060546875,
-0.034759521484375,
-0.040252685546875,
-0.016876220703125,
0.054901123046875,
-0.005237579345703125,
-0.025299072265625,
0.004848480224609375,
-0.015411376953125,
-0.0199737548828125,
0.024139404296875,
-0.056304931640625,
-0.02178955078125,
0.00955963134765625,
-0.0631103515625,
-0.00745391845703125,
0.0179443359375,
-0.03363037109375,
-0.02777099609375,
-0.035430908203125,
0.048248291015625,
-0.061676025390625,
-0.003997802734375,
-0.01049041748046875,
-0.0134124755859375,
0.03192138671875,
0.046783447265625,
-0.0279693603515625,
-0.005619049072265625,
0.0222015380859375,
0.06585693359375,
-0.01409912109375,
-0.053192138671875,
-0.038177490234375,
0.002193450927734375,
-0.018218994140625,
0.06976318359375,
-0.0199432373046875,
-0.026458740234375,
-0.00649261474609375,
0.004669189453125,
-0.03265380859375,
-0.0281219482421875,
0.04931640625,
-0.045166015625,
0.061737060546875,
0.005096435546875,
-0.03094482421875,
-0.01253509521484375,
0.0340576171875,
-0.054107666015625,
0.06903076171875,
0.00661468505859375,
-0.070068359375,
0.029327392578125,
-0.065673828125,
-0.0291290283203125,
-0.0015077590942382812,
-0.0231475830078125,
-0.0625,
-0.01678466796875,
0.01338958740234375,
0.0188446044921875,
0.01263427734375,
0.01474761962890625,
-0.021240234375,
-0.01708984375,
-0.0162506103515625,
-0.02044677734375,
0.07696533203125,
0.03656005859375,
-0.0016803741455078125,
0.031646728515625,
-0.06927490234375,
0.01299285888671875,
0.017059326171875,
-0.034271240234375,
-0.038543701171875,
-0.030914306640625,
0.03521728515625,
0.0237884521484375,
0.0357666015625,
-0.022430419921875,
0.0165557861328125,
-0.00885772705078125,
0.01070404052734375,
0.048431396484375,
-0.0220947265625,
0.05291748046875,
-0.01849365234375,
0.03814697265625,
-0.0017910003662109375,
0.013916015625,
-0.041107177734375,
-0.012908935546875,
-0.03045654296875,
-0.0355224609375,
0.020111083984375,
0.03460693359375,
-0.049957275390625,
0.032623291015625,
-0.0006265640258789062,
-0.03289794921875,
-0.0257720947265625,
0.00250244140625,
0.062286376953125,
0.041229248046875,
0.0293731689453125,
-0.01953125,
-0.061859130859375,
-0.051971435546875,
-0.009429931640625,
0.0107269287109375,
0.007747650146484375,
0.013214111328125,
0.03509521484375,
-0.0112152099609375,
0.05255126953125,
-0.03094482421875,
-0.039886474609375,
-0.0062255859375,
0.03314208984375,
-0.003936767578125,
0.042449951171875,
0.053802490234375,
-0.0853271484375,
-0.048309326171875,
-0.01262664794921875,
-0.07452392578125,
0.01151275634765625,
-0.01345062255859375,
-0.028717041015625,
0.0335693359375,
0.04388427734375,
-0.047760009765625,
0.0537109375,
0.0311431884765625,
-0.034515380859375,
0.03619384765625,
-0.04901123046875,
-0.0001519918441772461,
-0.08392333984375,
0.01560211181640625,
-0.005725860595703125,
-0.0104217529296875,
-0.050506591796875,
0.009521484375,
0.040435791015625,
-0.017822265625,
-0.038238525390625,
0.056060791015625,
-0.06866455078125,
-0.0164642333984375,
-0.01381683349609375,
-0.00217437744140625,
0.033447265625,
0.041778564453125,
0.007595062255859375,
0.06109619140625,
0.0230712890625,
-0.0213165283203125,
-0.0104827880859375,
0.041107177734375,
-0.027313232421875,
0.044189453125,
-0.026458740234375,
0.033111572265625,
-0.0221405029296875,
0.042022705078125,
-0.0806884765625,
-0.0035552978515625,
0.01104736328125,
-0.0225830078125,
0.038848876953125,
0.0207977294921875,
-0.039031982421875,
-0.028717041015625,
-0.04119873046875,
0.0201263427734375,
0.0176849365234375,
-0.03167724609375,
0.0689697265625,
0.0174713134765625,
0.016815185546875,
-0.03814697265625,
-0.038421630859375,
-0.021453857421875,
-0.0194549560546875,
-0.060577392578125,
0.03594970703125,
-0.02203369140625,
-0.003017425537109375,
-0.0094146728515625,
-0.008575439453125,
-0.01415252685546875,
0.004886627197265625,
0.026580810546875,
0.033233642578125,
-0.019683837890625,
0.00921630859375,
-0.0129241943359375,
-0.01293182373046875,
-0.01148223876953125,
-0.0186309814453125,
0.06689453125,
-0.0007009506225585938,
-0.039947509765625,
-0.038421630859375,
0.0260467529296875,
0.03009033203125,
-0.026214599609375,
0.0596923828125,
0.0721435546875,
-0.0249786376953125,
0.0196533203125,
-0.03680419921875,
0.01081085205078125,
-0.0335693359375,
0.031982421875,
-0.0269012451171875,
-0.02703857421875,
0.03045654296875,
0.008514404296875,
0.01387786865234375,
0.042022705078125,
0.0297088623046875,
-0.03546142578125,
0.068359375,
0.052276611328125,
0.00988006591796875,
0.056671142578125,
-0.038787841796875,
0.004238128662109375,
-0.06964111328125,
-0.039520263671875,
-0.037353515625,
-0.0267791748046875,
-0.008758544921875,
-0.027008056640625,
0.02618408203125,
0.01641845703125,
-0.030548095703125,
0.00640106201171875,
-0.035491943359375,
0.01499176025390625,
0.053924560546875,
0.01100921630859375,
0.01081085205078125,
0.007801055908203125,
-0.01312255859375,
0.001644134521484375,
-0.03070068359375,
-0.048614501953125,
0.0697021484375,
0.0227813720703125,
0.0643310546875,
-0.0126800537109375,
0.060333251953125,
0.017059326171875,
0.008758544921875,
-0.03240966796875,
0.025299072265625,
-0.01548004150390625,
-0.03228759765625,
-0.03680419921875,
-0.005825042724609375,
-0.07904052734375,
0.01081085205078125,
-0.005374908447265625,
-0.043365478515625,
0.0065765380859375,
0.0200042724609375,
-0.01904296875,
0.0184173583984375,
-0.0693359375,
0.06951904296875,
-0.045562744140625,
-0.0023136138916015625,
0.0164794921875,
-0.059906005859375,
0.008514404296875,
-0.0164031982421875,
0.0241241455078125,
0.0147857666015625,
0.019500732421875,
0.0780029296875,
-0.044677734375,
0.0426025390625,
-0.03521728515625,
-0.007534027099609375,
-0.0010890960693359375,
-0.01462554931640625,
0.04241943359375,
-0.0198516845703125,
0.0013332366943359375,
0.006336212158203125,
0.00864410400390625,
-0.03375244140625,
-0.045623779296875,
0.0272369384765625,
-0.085205078125,
-0.0328369140625,
-0.0231475830078125,
-0.054473876953125,
-0.0031757354736328125,
0.039093017578125,
0.055755615234375,
0.03363037109375,
-0.006256103515625,
0.022613525390625,
0.05352783203125,
-0.0192108154296875,
0.0266265869140625,
0.0216217041015625,
-0.0216827392578125,
-0.0149993896484375,
0.05767822265625,
0.0244598388671875,
-0.00962066650390625,
0.042724609375,
0.01100921630859375,
-0.0260467529296875,
-0.044677734375,
-0.03521728515625,
0.007656097412109375,
-0.053497314453125,
-0.01090240478515625,
-0.07244873046875,
-0.049468994140625,
-0.03741455078125,
-0.01094818115234375,
-0.010101318359375,
-0.00689697265625,
-0.0311279296875,
0.002025604248046875,
-0.001148223876953125,
0.0377197265625,
0.01020050048828125,
0.0289764404296875,
-0.055145263671875,
0.0218353271484375,
0.0081024169921875,
0.006229400634765625,
0.0038127899169921875,
-0.0523681640625,
-0.0181121826171875,
-0.0042877197265625,
-0.043731689453125,
-0.0509033203125,
0.035675048828125,
0.01483154296875,
0.07305908203125,
0.027435302734375,
-0.01371002197265625,
0.046783447265625,
-0.036041259765625,
0.049835205078125,
0.0194854736328125,
-0.053619384765625,
0.0533447265625,
-0.01293182373046875,
0.0274505615234375,
0.017791748046875,
0.0274658203125,
-0.01262664794921875,
0.004474639892578125,
-0.060577392578125,
-0.054840087890625,
0.07940673828125,
0.00835418701171875,
0.001598358154296875,
0.050445556640625,
0.007366180419921875,
0.0150146484375,
0.00921630859375,
-0.054473876953125,
-0.02459716796875,
-0.031494140625,
-0.0196533203125,
0.002994537353515625,
-0.029541015625,
-0.01922607421875,
-0.0186309814453125,
0.04705810546875,
-0.0034542083740234375,
0.03466796875,
0.00897979736328125,
-0.033447265625,
0.0126495361328125,
0.0015201568603515625,
0.073974609375,
0.06500244140625,
-0.020965576171875,
0.00897979736328125,
0.0036602020263671875,
-0.061431884765625,
0.004764556884765625,
0.021881103515625,
-0.006866455078125,
-0.0015764236450195312,
0.0560302734375,
0.08062744140625,
-0.00835418701171875,
-0.0205230712890625,
0.04461669921875,
-0.00969696044921875,
-0.044647216796875,
-0.0273895263671875,
-0.01525115966796875,
-0.006008148193359375,
0.00333404541015625,
0.0310211181640625,
0.01050567626953125,
0.004428863525390625,
-0.027435302734375,
0.00791168212890625,
0.030426025390625,
-0.02520751953125,
-0.016204833984375,
0.04437255859375,
-0.0007519721984863281,
-0.033905029296875,
0.037628173828125,
-0.0162200927734375,
-0.02581787109375,
0.049285888671875,
0.05120849609375,
0.042999267578125,
-0.006435394287109375,
0.01812744140625,
0.01514434814453125,
0.006740570068359375,
0.0126800537109375,
0.0347900390625,
-0.002338409423828125,
-0.058013916015625,
-0.00975799560546875,
-0.0435791015625,
-0.0031585693359375,
0.0277099609375,
-0.03912353515625,
0.0242919921875,
-0.040252685546875,
0.010101318359375,
0.004703521728515625,
0.0194854736328125,
-0.0697021484375,
0.0249786376953125,
0.0193328857421875,
0.07855224609375,
-0.05194091796875,
0.070068359375,
0.0826416015625,
-0.037872314453125,
-0.0771484375,
0.00531005859375,
0.0159912109375,
-0.07757568359375,
0.0655517578125,
0.0139007568359375,
-0.0006990432739257812,
-0.01100921630859375,
-0.056243896484375,
-0.06219482421875,
0.0928955078125,
0.020965576171875,
-0.0204315185546875,
-0.0239105224609375,
-0.0181884765625,
0.037139892578125,
-0.0301361083984375,
0.02960205078125,
0.0134124755859375,
0.046905517578125,
0.0116424560546875,
-0.054931640625,
-0.0004572868347167969,
-0.045166015625,
0.0111846923828125,
-0.0013484954833984375,
-0.054779052734375,
0.08404541015625,
-0.0018815994262695312,
-0.014434814453125,
0.00963592529296875,
0.038818359375,
0.046630859375,
0.0511474609375,
0.0433349609375,
0.03790283203125,
0.07379150390625,
0.0020656585693359375,
0.072021484375,
-0.0057373046875,
0.0233917236328125,
0.08868408203125,
-0.01184844970703125,
0.03265380859375,
0.0292816162109375,
-0.0065765380859375,
0.040863037109375,
0.06158447265625,
0.0009722709655761719,
0.055389404296875,
-0.0123291015625,
0.03656005859375,
-0.00804901123046875,
0.01551055908203125,
-0.058502197265625,
0.03082275390625,
0.00945281982421875,
-0.0406494140625,
-0.0143280029296875,
0.0318603515625,
0.01519012451171875,
0.0035190582275390625,
-0.0027027130126953125,
0.057647705078125,
-0.003444671630859375,
-0.01499176025390625,
0.0290069580078125,
-0.01025390625,
0.05059814453125,
-0.0531005859375,
0.0012254714965820312,
-0.028839111328125,
-0.000007569789886474609,
-0.021575927734375,
-0.04638671875,
0.0088348388671875,
-0.0263214111328125,
-0.02423095703125,
-0.0278167724609375,
0.068115234375,
-0.018096923828125,
-0.037628173828125,
0.01555633544921875,
0.04461669921875,
0.005352020263671875,
0.0034332275390625,
-0.060699462890625,
0.01250457763671875,
0.00165557861328125,
-0.0301666259765625,
0.045074462890625,
0.0384521484375,
-0.02130126953125,
0.0221710205078125,
0.05645751953125,
-0.0120086669921875,
0.0029048919677734375,
0.006969451904296875,
0.06500244140625,
-0.0301055908203125,
-0.061981201171875,
-0.059600830078125,
0.04168701171875,
-0.035980224609375,
-0.0271148681640625,
0.0635986328125,
0.055755615234375,
0.065673828125,
-0.013153076171875,
0.055755615234375,
-0.0018329620361328125,
0.021636962890625,
-0.05743408203125,
0.07794189453125,
-0.06646728515625,
-0.005580902099609375,
-0.02984619140625,
-0.07049560546875,
-0.028656005859375,
0.053497314453125,
-0.03778076171875,
0.0065460205078125,
0.0675048828125,
0.04449462890625,
-0.01036834716796875,
-0.0120391845703125,
0.03466796875,
0.006252288818359375,
0.0305938720703125,
0.0104217529296875,
0.061981201171875,
-0.034423828125,
0.038970947265625,
-0.00882720947265625,
-0.0147857666015625,
-0.023223876953125,
-0.05194091796875,
-0.0654296875,
-0.06536865234375,
-0.01457977294921875,
-0.021240234375,
-0.00830078125,
0.045745849609375,
0.08697509765625,
-0.049041748046875,
0.008544921875,
0.016937255859375,
0.00939178466796875,
-0.01145172119140625,
-0.011016845703125,
0.069091796875,
-0.0218963623046875,
-0.042816162109375,
0.003330230712890625,
0.0243072509765625,
0.0174560546875,
-0.018280029296875,
-0.023223876953125,
-0.0263824462890625,
-0.012603759765625,
0.038116455078125,
0.01200103759765625,
-0.043304443359375,
-0.0024356842041015625,
-0.006328582763671875,
-0.033416748046875,
0.025421142578125,
0.0562744140625,
-0.0278778076171875,
0.0299835205078125,
0.052947998046875,
0.0292510986328125,
0.03753662109375,
0.00514984130859375,
0.0226287841796875,
-0.06439208984375,
0.038665771484375,
-0.0133514404296875,
0.04376220703125,
0.021575927734375,
-0.04010009765625,
0.041046142578125,
0.0117340087890625,
-0.032379150390625,
-0.060211181640625,
-0.0065460205078125,
-0.0654296875,
-0.0218048095703125,
0.066650390625,
-0.0290985107421875,
-0.024871826171875,
0.003887176513671875,
-0.05206298828125,
0.01465606689453125,
-0.01033782958984375,
0.04150390625,
0.039886474609375,
-0.01300048828125,
-0.031005859375,
-0.0222320556640625,
0.0386962890625,
0.013458251953125,
-0.08355712890625,
-0.03729248046875,
0.0271453857421875,
-0.006534576416015625,
0.046905517578125,
0.069580078125,
-0.01027679443359375,
0.0127410888671875,
0.0005235671997070312,
0.0132293701171875,
-0.024322509765625,
-0.0284576416015625,
-0.00911712646484375,
0.01479339599609375,
-0.0131683349609375,
-0.033203125
]
] |
ProsusAI/finbert | 2023-05-23T12:43:35.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"text-classification",
"financial-sentiment-analysis",
"sentiment-analysis",
"en",
"arxiv:1908.10063",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | ProsusAI | null | null | ProsusAI/finbert | 373 | 1,423,334 | transformers | 2022-03-02T23:29:04 | ---
language: "en"
tags:
- financial-sentiment-analysis
- sentiment-analysis
widget:
- text: "Stocks rallied and the British pound gained."
---
FinBERT is a pre-trained NLP model to analyze sentiment of financial text. It is built by further training the BERT language model in the finance domain, using a large financial corpus and thereby fine-tuning it for financial sentiment classification. [Financial PhraseBank](https://www.researchgate.net/publication/251231107_Good_Debt_or_Bad_Debt_Detecting_Semantic_Orientations_in_Economic_Texts) by Malo et al. (2014) is used for fine-tuning. For more details, please see the paper [FinBERT: Financial Sentiment Analysis with Pre-trained Language Models](https://arxiv.org/abs/1908.10063) and our related [blog post](https://medium.com/prosus-ai-tech-blog/finbert-financial-sentiment-analysis-with-bert-b277a3607101) on Medium.
The model will give softmax outputs for three labels: positive, negative or neutral.
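To make the softmax output concrete, here is a minimal sketch of how raw scores for the three classes become the probabilities the card describes. The logit values and the label order below are illustrative assumptions, not output from the actual model.

```python
import math

LABELS = ["positive", "negative", "neutral"]

def softmax(logits):
    """Convert raw class scores into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Made-up logits for one input sentence:
probs = softmax([2.1, -0.5, 0.3])
prediction = LABELS[probs.index(max(probs))]
print(prediction)  # positive
```

In practice the same mapping is handled for you by the `transformers` text-classification pipeline.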
---
## About Prosus
Prosus is a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities. For more information, please visit www.prosus.com.
## Contact information
Please contact Dogu Araci dogu.araci[at]prosus[dot]com and Zulkuf Genc zulkuf.genc[at]prosus[dot]com about any FinBERT related issues and questions.
| 1,476 | [
[
-0.0447998046875,
-0.042938232421875,
0.007137298583984375,
0.0182952880859375,
-0.03857421875,
0.00543975830078125,
-0.00799560546875,
-0.0262908935546875,
0.0264434814453125,
0.05767822265625,
-0.05206298828125,
-0.0552978515625,
-0.031829833984375,
-0.009613037109375,
-0.0204315185546875,
0.1365966796875,
0.0118865966796875,
0.038116455078125,
0.0191497802734375,
-0.0019931793212890625,
-0.00951385498046875,
-0.03375244140625,
-0.07379150390625,
-0.0032291412353515625,
0.03662109375,
0.035308837890625,
0.06195068359375,
-0.003627777099609375,
0.057891845703125,
0.026824951171875,
0.00525665283203125,
-0.014404296875,
-0.0347900390625,
-0.00002771615982055664,
-0.00423431396484375,
-0.032257080078125,
-0.051513671875,
0.01409149169921875,
0.0264129638671875,
0.037872314453125,
0.014190673828125,
0.035552978515625,
0.0162811279296875,
0.0667724609375,
-0.038116455078125,
0.0266265869140625,
-0.01206207275390625,
-0.00044345855712890625,
-0.01404571533203125,
0.0004911422729492188,
-0.0179290771484375,
-0.04998779296875,
0.035675048828125,
-0.034576416015625,
0.0284423828125,
0.01424407958984375,
0.0802001953125,
0.0060577392578125,
-0.0179290771484375,
-0.03131103515625,
-0.04486083984375,
0.064208984375,
-0.056793212890625,
0.025146484375,
0.0135498046875,
0.0167999267578125,
0.00014984607696533203,
-0.06201171875,
-0.03570556640625,
-0.00679779052734375,
-0.009246826171875,
0.0164947509765625,
-0.0230712890625,
0.017669677734375,
-0.01727294921875,
0.0118255615234375,
-0.027130126953125,
-0.0232086181640625,
-0.060394287109375,
-0.0176239013671875,
0.051239013671875,
-0.01291656494140625,
-0.0167694091796875,
-0.00017201900482177734,
-0.07196044921875,
-0.0190277099609375,
-0.037109375,
0.0372314453125,
0.0531005859375,
0.0316162109375,
0.005214691162109375,
0.00881195068359375,
0.0251922607421875,
0.06787109375,
-0.0008859634399414062,
-0.0045318603515625,
0.0411376953125,
-0.004970550537109375,
-0.03076171875,
0.0300445556640625,
0.056549072265625,
0.04205322265625,
0.052642822265625,
0.01288604736328125,
-0.035736083984375,
-0.020721435546875,
0.02838134765625,
-0.039947509765625,
-0.0224151611328125,
0.040924072265625,
-0.04833984375,
-0.034942626953125,
0.0248565673828125,
-0.043060302734375,
-0.020172119140625,
-0.0184326171875,
0.030426025390625,
-0.034820556640625,
-0.020263671875,
0.0220489501953125,
-0.017181396484375,
0.00925445556640625,
0.004795074462890625,
-0.0682373046875,
0.03729248046875,
0.061126708984375,
0.038330078125,
0.0279693603515625,
-0.00505828857421875,
-0.045135498046875,
-0.023223876953125,
-0.0306243896484375,
0.059661865234375,
-0.00836181640625,
-0.00782012939453125,
0.0129852294921875,
0.0091400146484375,
0.00524139404296875,
-0.035858154296875,
0.058837890625,
-0.0406494140625,
0.004886627197265625,
-0.0192108154296875,
-0.040802001953125,
-0.034454345703125,
0.004428863525390625,
-0.03985595703125,
0.03680419921875,
0.0285797119140625,
-0.07611083984375,
0.037506103515625,
-0.0574951171875,
-0.01348876953125,
0.0187530517578125,
0.0110626220703125,
-0.0291290283203125,
-0.005878448486328125,
-0.0208740234375,
0.058135986328125,
-0.0283966064453125,
0.048919677734375,
-0.0238494873046875,
-0.031280517578125,
0.038726806640625,
-0.004634857177734375,
0.0545654296875,
0.0305023193359375,
-0.025299072265625,
0.034423828125,
-0.048675537109375,
-0.0202484130859375,
-0.007843017578125,
0.01007080078125,
-0.049468994140625,
0.0084228515625,
0.021240234375,
0.0025959014892578125,
0.04156494140625,
-0.06365966796875,
0.0035400390625,
-0.048126220703125,
0.033172607421875,
0.0704345703125,
-0.01136016845703125,
0.0169525146484375,
-0.021759033203125,
0.02764892578125,
-0.01015472412109375,
0.048858642578125,
0.02362060546875,
-0.0235443115234375,
-0.055511474609375,
-0.02801513671875,
0.035614013671875,
0.06671142578125,
-0.0249481201171875,
0.02642822265625,
0.0093231201171875,
-0.0428466796875,
-0.0489501953125,
0.0267181396484375,
0.01398468017578125,
0.046356201171875,
0.020782470703125,
-0.0227203369140625,
-0.03173828125,
-0.11041259765625,
-0.00846099853515625,
-0.028167724609375,
-0.001983642578125,
-0.0008063316345214844,
0.031890869140625,
-0.01277923583984375,
0.06597900390625,
-0.040863037109375,
-0.0355224609375,
-0.0285491943359375,
0.0191802978515625,
0.0552978515625,
0.03607177734375,
0.0626220703125,
-0.08392333984375,
-0.0185699462890625,
-0.02447509765625,
-0.031036376953125,
0.006443023681640625,
-0.0312042236328125,
-0.0173187255859375,
0.00405120849609375,
0.0196533203125,
-0.0251922607421875,
0.00896453857421875,
0.036102294921875,
-0.052581787109375,
0.0474853515625,
-0.01299285888671875,
-0.0027942657470703125,
-0.0633544921875,
0.01424407958984375,
0.0237579345703125,
-0.00670623779296875,
-0.053253173828125,
-0.0142669677734375,
-0.00807952880859375,
0.00795745849609375,
-0.04248046875,
0.020843505859375,
0.0176239013671875,
-0.0006432533264160156,
-0.010955810546875,
0.02203369140625,
-0.0032711029052734375,
0.0341796875,
-0.003986358642578125,
0.061126708984375,
0.0355224609375,
-0.04248046875,
0.03900146484375,
0.0233917236328125,
-0.0176544189453125,
0.032196044921875,
-0.0511474609375,
-0.035797119140625,
-0.00063323974609375,
0.0060577392578125,
-0.09375,
0.01206207275390625,
0.041168212890625,
-0.03741455078125,
0.00905609130859375,
0.024169921875,
-0.033966064453125,
-0.020721435546875,
-0.046661376953125,
0.00824737548828125,
0.020172119140625,
-0.026153564453125,
0.031829833984375,
0.018798828125,
-0.039886474609375,
-0.0655517578125,
-0.038421630859375,
-0.005794525146484375,
-0.022064208984375,
-0.058197021484375,
0.025665283203125,
-0.00907135009765625,
-0.0494384765625,
0.004543304443359375,
-0.0034694671630859375,
-0.0104217529296875,
0.0165863037109375,
0.01100921630859375,
0.05499267578125,
-0.0213623046875,
0.027252197265625,
-0.000047326087951660156,
-0.03472900390625,
0.01385498046875,
-0.035064697265625,
0.034637451171875,
-0.06201171875,
0.0188140869140625,
-0.0304718017578125,
0.0125732421875,
0.0426025390625,
-0.01242828369140625,
0.06378173828125,
0.06427001953125,
-0.01000213623046875,
0.01306915283203125,
-0.04010009765625,
-0.0045623779296875,
-0.041015625,
-0.00243377685546875,
0.0059356689453125,
-0.07562255859375,
0.04425048828125,
-0.0111541748046875,
0.033599853515625,
0.06719970703125,
0.0301666259765625,
-0.02337646484375,
0.044036865234375,
0.058319091796875,
-0.007080078125,
0.024200439453125,
-0.04010009765625,
0.0295257568359375,
-0.03314208984375,
-0.02410888671875,
-0.03265380859375,
-0.032501220703125,
-0.04034423828125,
0.023345947265625,
0.00421142578125,
0.0201873779296875,
-0.04815673828125,
0.037933349609375,
-0.037933349609375,
0.0007185935974121094,
0.0509033203125,
-0.017730712890625,
-0.01233673095703125,
-0.00812530517578125,
-0.02191162109375,
-0.009765625,
-0.0404052734375,
-0.05010986328125,
0.0587158203125,
0.033599853515625,
0.047760009765625,
-0.0035533905029296875,
0.0616455078125,
0.0560302734375,
0.042938232421875,
-0.05072021484375,
0.041534423828125,
-0.043670654296875,
-0.05023193359375,
-0.01476287841796875,
-0.0168914794921875,
-0.07086181640625,
-0.006317138671875,
-0.029266357421875,
-0.0616455078125,
0.030914306640625,
0.01471710205078125,
-0.0709228515625,
0.0248565673828125,
-0.04132080078125,
0.08123779296875,
-0.037811279296875,
-0.031463623046875,
-0.007778167724609375,
-0.055145263671875,
0.03314208984375,
-0.022125244140625,
0.0433349609375,
-0.013671875,
-0.0005922317504882812,
0.0750732421875,
-0.0401611328125,
0.0721435546875,
-0.0293121337890625,
-0.0243377685546875,
0.037628173828125,
-0.0020847320556640625,
0.0159454345703125,
0.0172271728515625,
-0.0021190643310546875,
0.00726318359375,
0.0158538818359375,
-0.029693603515625,
-0.0243682861328125,
0.040802001953125,
-0.0675048828125,
-0.0235443115234375,
-0.035369873046875,
-0.0214691162109375,
-0.007587432861328125,
0.0027256011962890625,
0.00998687744140625,
0.037322998046875,
-0.007579803466796875,
0.01357269287109375,
0.038909912109375,
-0.0355224609375,
0.0183258056640625,
0.030242919921875,
-0.0219573974609375,
-0.04595947265625,
0.0826416015625,
-0.003009796142578125,
0.0005240440368652344,
0.0196533203125,
0.01439666748046875,
-0.02044677734375,
-0.0186614990234375,
-0.0151214599609375,
0.0226593017578125,
-0.055816650390625,
-0.0249176025390625,
-0.0296478271484375,
-0.021759033203125,
-0.033905029296875,
-0.047882080078125,
-0.043731689453125,
-0.036407470703125,
-0.0220184326171875,
-0.017425537109375,
0.037567138671875,
0.046356201171875,
-0.021392822265625,
0.044525146484375,
-0.0799560546875,
0.003795623779296875,
0.00824737548828125,
0.023468017578125,
-0.0205078125,
-0.01153564453125,
-0.0213470458984375,
-0.0214996337890625,
-0.00624847412109375,
-0.054931640625,
0.0469970703125,
0.006214141845703125,
0.039398193359375,
0.067626953125,
0.024261474609375,
0.01398468017578125,
0.01198577880859375,
0.0626220703125,
0.028167724609375,
-0.07196044921875,
0.03009033203125,
-0.0183563232421875,
0.00478363037109375,
0.0672607421875,
0.04425048828125,
-0.0372314453125,
-0.04412841796875,
-0.05841064453125,
-0.09112548828125,
0.0341796875,
0.004116058349609375,
0.01074981689453125,
-0.0008950233459472656,
0.031585693359375,
0.03350830078125,
0.032623291015625,
-0.048828125,
-0.0260772705078125,
-0.0233306884765625,
-0.0268402099609375,
-0.024627685546875,
-0.036895751953125,
-0.005229949951171875,
-0.0269317626953125,
0.0662841796875,
0.013702392578125,
0.026336669921875,
0.0196533203125,
0.01395416259765625,
0.00628662109375,
0.0200653076171875,
0.06427001953125,
0.045654296875,
-0.04730224609375,
0.0012073516845703125,
0.002716064453125,
-0.038116455078125,
-0.0042572021484375,
0.0204315185546875,
0.006900787353515625,
0.0129852294921875,
0.034698486328125,
0.06201171875,
0.02520751953125,
-0.05938720703125,
0.053741455078125,
-0.0126495361328125,
-0.06451416015625,
-0.07012939453125,
0.0090484619140625,
-0.0068206787109375,
0.0545654296875,
0.049560546875,
0.03680419921875,
0.0245361328125,
-0.0207061767578125,
0.026458740234375,
0.01180267333984375,
-0.057281494140625,
-0.01483917236328125,
0.056121826171875,
0.0228729248046875,
-0.01000213623046875,
0.05853271484375,
-0.013092041015625,
-0.057403564453125,
0.0423583984375,
0.021209716796875,
0.071044921875,
-0.002193450927734375,
0.04376220703125,
0.0245819091796875,
0.0272369384765625,
-0.00762939453125,
0.04931640625,
-0.0077056884765625,
-0.05206298828125,
-0.035736083984375,
-0.060638427734375,
-0.020721435546875,
0.011322021484375,
-0.05584716796875,
0.0177764892578125,
-0.059051513671875,
-0.055084228515625,
0.0034580230712890625,
0.00547027587890625,
-0.03729248046875,
0.0245819091796875,
0.01096343994140625,
0.08245849609375,
-0.05706787109375,
0.047821044921875,
0.043121337890625,
-0.01806640625,
-0.0421142578125,
-0.0178070068359375,
-0.01065826416015625,
-0.0384521484375,
0.0848388671875,
0.01445770263671875,
-0.031982421875,
-0.00223541259765625,
-0.053436279296875,
-0.040252685546875,
0.05316162109375,
0.019744873046875,
-0.04998779296875,
0.02435302734375,
-0.0038604736328125,
0.045196533203125,
-0.03448486328125,
-0.020263671875,
0.0316162109375,
0.04864501953125,
-0.0016765594482421875,
-0.03619384765625,
-0.01360321044921875,
-0.0384521484375,
-0.0450439453125,
0.0295257568359375,
-0.051300048828125,
0.08148193359375,
-0.01139068603515625,
0.0138092041015625,
-0.0113677978515625,
0.046630859375,
-0.0013666152954101562,
0.030364990234375,
0.056121826171875,
0.0141448974609375,
0.05169677734375,
-0.00672149658203125,
0.06842041015625,
-0.0633544921875,
0.041900634765625,
0.0498046875,
-0.0063323974609375,
0.0574951171875,
0.0301361083984375,
-0.005779266357421875,
0.053741455078125,
0.06915283203125,
-0.03302001953125,
0.04132080078125,
0.0280914306640625,
-0.025665283203125,
-0.034698486328125,
0.0026836395263671875,
-0.018707275390625,
0.0294647216796875,
0.031158447265625,
-0.0528564453125,
-0.00334930419921875,
0.01386260986328125,
0.0024662017822265625,
-0.0183563232421875,
-0.032257080078125,
0.0265655517578125,
0.0117950439453125,
-0.03472900390625,
0.035980224609375,
0.0162353515625,
0.050628662109375,
-0.068115234375,
0.01538848876953125,
-0.00354766845703125,
0.0430908203125,
-0.0121612548828125,
-0.0506591796875,
0.0304718017578125,
0.00690460205078125,
0.0139923095703125,
-0.0277099609375,
0.06854248046875,
-0.0010042190551757812,
-0.044036865234375,
0.03179931640625,
0.02935791015625,
0.0196990966796875,
0.019317626953125,
-0.07000732421875,
-0.02117919921875,
-0.00467681884765625,
-0.0311126708984375,
0.0109100341796875,
-0.006038665771484375,
0.00885772705078125,
0.048553466796875,
0.041229248046875,
-0.0057830810546875,
-0.049560546875,
0.00632476806640625,
0.04937744140625,
-0.04205322265625,
-0.0382080078125,
-0.0797119140625,
0.022613525390625,
-0.008087158203125,
-0.03924560546875,
0.049346923828125,
0.04766845703125,
0.05767822265625,
-0.019744873046875,
0.047393798828125,
0.01593017578125,
0.01421356201171875,
-0.020263671875,
0.06060791015625,
-0.043609619140625,
0.002960205078125,
-0.0384521484375,
-0.056884765625,
-0.037261962890625,
0.066162109375,
-0.03472900390625,
-0.02618408203125,
0.030914306640625,
0.027130126953125,
0.0188751220703125,
0.0255279541015625,
0.003253936767578125,
0.003787994384765625,
-0.0101165771484375,
0.0151519775390625,
0.035430908203125,
-0.0074462890625,
0.044158935546875,
-0.005115509033203125,
-0.0131988525390625,
-0.0192108154296875,
-0.040924072265625,
-0.06195068359375,
-0.047271728515625,
-0.0197296142578125,
-0.031982421875,
0.007366180419921875,
0.0828857421875,
0.0227813720703125,
-0.0703125,
-0.03656005859375,
0.00469207763671875,
-0.009918212890625,
-0.0184326171875,
-0.014007568359375,
0.043914794921875,
-0.0323486328125,
-0.0217437744140625,
-0.006603240966796875,
0.025665283203125,
0.0025577545166015625,
-0.03521728515625,
0.0027446746826171875,
-0.01030731201171875,
0.02374267578125,
0.032867431640625,
0.0045166015625,
-0.021392822265625,
-0.0159912109375,
0.002960205078125,
-0.00560760498046875,
-0.00841522216796875,
0.06268310546875,
-0.0347900390625,
0.01213836669921875,
0.0296173095703125,
0.0231781005859375,
0.0164337158203125,
-0.0025177001953125,
0.05133056640625,
-0.02783203125,
0.0060577392578125,
0.01454925537109375,
0.028533935546875,
0.00958251953125,
-0.0283966064453125,
0.028167724609375,
0.01093292236328125,
-0.032501220703125,
-0.029205322265625,
-0.00738525390625,
-0.06414794921875,
-0.048553466796875,
0.060089111328125,
-0.0204620361328125,
-0.026153564453125,
-0.01422119140625,
-0.03204345703125,
0.01690673828125,
-0.037628173828125,
0.039520263671875,
0.04620361328125,
-0.00951385498046875,
0.01349639892578125,
-0.07623291015625,
0.03485107421875,
0.0233001708984375,
-0.01342010498046875,
-0.01058197021484375,
0.010162353515625,
0.0133056640625,
0.02783203125,
0.055023193359375,
0.01343536376953125,
0.0186920166015625,
-0.0013904571533203125,
0.02978515625,
-0.005474090576171875,
-0.0244903564453125,
0.0123138427734375,
0.0274658203125,
-0.0013599395751953125,
-0.0151214599609375
]
] |
cardiffnlp/twitter-roberta-base-sentiment-latest | 2023-05-28T05:45:10.000Z | [
"transformers",
"pytorch",
"tf",
"roberta",
"text-classification",
"en",
"dataset:tweet_eval",
"arxiv:2202.03829",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cardiffnlp | null | null | cardiffnlp/twitter-roberta-base-sentiment-latest | 258 | 1,386,030 | transformers | 2022-03-15T01:21:58 | ---
language: en
widget:
- text: Covid cases are increasing fast!
datasets:
- tweet_eval
---
# Twitter-roBERTa-base for Sentiment Analysis - UPDATED (2022)
This is a RoBERTa-base model trained on ~124M tweets from January 2018 to December 2021, and fine-tuned for sentiment analysis with the TweetEval benchmark.
The original Twitter-based RoBERTa model can be found [here](https://huggingface.co/cardiffnlp/twitter-roberta-base-2021-124m), and the original reference paper is [TweetEval](https://github.com/cardiffnlp/tweeteval). This model is suitable for English text.
- Reference Paper: [TimeLMs paper](https://arxiv.org/abs/2202.03829).
- Git Repo: [TimeLMs official repository](https://github.com/cardiffnlp/timelms).
<b>Labels</b>:
0 -> Negative;
1 -> Neutral;
2 -> Positive
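The id-to-label mapping is also stored in the model configuration (`config.id2label`); a minimal illustrative sketch of the same mapping, hard-coded here only for demonstration:

```python
# Illustrative only: in practice, read this mapping from config.id2label
# rather than hard-coding it.
id2label = {0: "Negative", 1: "Neutral", 2: "Positive"}

def decode(pred_id):
    """Map a predicted class id to its sentiment label."""
    return id2label[pred_id]

print(decode(2))  # Positive
```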
This sentiment analysis model has been integrated into [TweetNLP](https://github.com/cardiffnlp/tweetnlp). You can access the demo [here](https://tweetnlp.org).
## Example Pipeline
```python
from transformers import pipeline

model_path = "cardiffnlp/twitter-roberta-base-sentiment-latest"
sentiment_task = pipeline("sentiment-analysis", model=model_path, tokenizer=model_path)
sentiment_task("Covid cases are increasing fast!")
```
```
[{'label': 'Negative', 'score': 0.7236}]
```
## Full classification example
```python
from transformers import AutoModelForSequenceClassification
from transformers import TFAutoModelForSequenceClassification
from transformers import AutoTokenizer, AutoConfig
import numpy as np
from scipy.special import softmax
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
MODEL = "cardiffnlp/twitter-roberta-base-sentiment-latest"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
config = AutoConfig.from_pretrained(MODEL)
# PT
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
#model.save_pretrained(MODEL)
text = "Covid cases are increasing fast!"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
scores = softmax(scores)
# # TF
# model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
# model.save_pretrained(MODEL)
# text = "Covid cases are increasing fast!"
# text = preprocess(text)
# encoded_input = tokenizer(text, return_tensors='tf')
# output = model(encoded_input)
# scores = output[0][0].numpy()
# scores = softmax(scores)
# Print labels and scores
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(scores.shape[0]):
l = config.id2label[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
Output:
```
1) Negative 0.7236
2) Neutral 0.2287
3) Positive 0.0477
```
### References
```
@inproceedings{camacho-collados-etal-2022-tweetnlp,
title = "{T}weet{NLP}: Cutting-Edge Natural Language Processing for Social Media",
author = "Camacho-collados, Jose and
Rezaee, Kiamehr and
Riahi, Talayeh and
Ushio, Asahi and
Loureiro, Daniel and
Antypas, Dimosthenis and
Boisson, Joanne and
Espinosa Anke, Luis and
Liu, Fangyu and
Mart{\'\i}nez C{\'a}mara, Eugenio and others",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
month = dec,
year = "2022",
address = "Abu Dhabi, UAE",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-demos.5",
pages = "38--49"
}
```
```
@inproceedings{loureiro-etal-2022-timelms,
title = "{T}ime{LM}s: Diachronic Language Models from {T}witter",
author = "Loureiro, Daniel and
Barbieri, Francesco and
Neves, Leonardo and
Espinosa Anke, Luis and
Camacho-collados, Jose",
booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations",
month = may,
year = "2022",
address = "Dublin, Ireland",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.acl-demo.25",
doi = "10.18653/v1/2022.acl-demo.25",
pages = "251--260"
}
```
| 4,310 | [
[
-0.01308441162109375,
-0.055389404296875,
0.02001953125,
0.0306549072265625,
-0.0196990966796875,
0.017547607421875,
-0.0249176025390625,
-0.025634765625,
0.017974853515625,
0.0010271072387695312,
-0.044525146484375,
-0.06317138671875,
-0.051788330078125,
0.0036163330078125,
-0.0215911865234375,
0.07745361328125,
-0.003139495849609375,
0.0038738250732421875,
0.0157012939453125,
-0.0156402587890625,
0.004825592041015625,
-0.04144287109375,
-0.05523681640625,
-0.0212554931640625,
0.0201416015625,
0.01195526123046875,
0.033538818359375,
0.0305633544921875,
0.0295257568359375,
0.031890869140625,
-0.006565093994140625,
0.0008034706115722656,
-0.02191162109375,
0.01377105712890625,
0.00411224365234375,
-0.017547607421875,
-0.0474853515625,
0.0157470703125,
0.052154541015625,
0.04803466796875,
0.00998687744140625,
0.0305023193359375,
0.00803375244140625,
0.025054931640625,
-0.03131103515625,
0.0170440673828125,
-0.027313232421875,
-0.0027637481689453125,
-0.0133514404296875,
-0.018310546875,
-0.027984619140625,
-0.03741455078125,
0.007648468017578125,
-0.03314208984375,
0.0216064453125,
-0.00582122802734375,
0.09527587890625,
0.0151519775390625,
-0.0149078369140625,
0.0006604194641113281,
-0.041290283203125,
0.0850830078125,
-0.07501220703125,
0.01349639892578125,
0.0092315673828125,
0.0064544677734375,
-0.004329681396484375,
-0.042449951171875,
-0.043304443359375,
-0.012603759765625,
0.0025386810302734375,
0.01418304443359375,
-0.0253753662109375,
-0.006610870361328125,
0.0146484375,
0.0089263916015625,
-0.042236328125,
-0.00428009033203125,
-0.02288818359375,
-0.00811004638671875,
0.05316162109375,
0.0092926025390625,
0.0160064697265625,
-0.0269012451171875,
-0.0203704833984375,
-0.01374053955078125,
-0.006103515625,
0.004344940185546875,
0.002750396728515625,
0.0283660888671875,
-0.037017822265625,
0.046295166015625,
-0.00586700439453125,
0.043853759765625,
0.00791168212890625,
-0.00933837890625,
0.05419921875,
-0.0288848876953125,
-0.0247650146484375,
-0.0167999267578125,
0.09564208984375,
0.0286865234375,
0.02880859375,
-0.01055908203125,
-0.012786865234375,
0.005367279052734375,
-0.01532745361328125,
-0.0584716796875,
-0.005603790283203125,
0.0308837890625,
-0.034423828125,
-0.042266845703125,
0.0024890899658203125,
-0.07086181640625,
-0.0029468536376953125,
-0.004077911376953125,
0.04156494140625,
-0.038116455078125,
-0.0275115966796875,
0.0016508102416992188,
-0.017852783203125,
0.01018524169921875,
0.0095367431640625,
-0.04632568359375,
0.017730712890625,
0.03753662109375,
0.072021484375,
-0.00362396240234375,
-0.0269927978515625,
-0.031005859375,
-0.0013017654418945312,
-0.0249786376953125,
0.048797607421875,
-0.025390625,
-0.0198211669921875,
-0.007114410400390625,
-0.00603485107421875,
-0.016571044921875,
-0.01451873779296875,
0.031890869140625,
-0.0253448486328125,
0.034820556640625,
0.003086090087890625,
-0.037933349609375,
0.0034275054931640625,
0.0169525146484375,
-0.023773193359375,
0.09033203125,
0.0146484375,
-0.05596923828125,
0.011474609375,
-0.06866455078125,
-0.0305938720703125,
-0.01259613037109375,
0.01557159423828125,
-0.036346435546875,
-0.0024204254150390625,
0.023895263671875,
0.044403076171875,
-0.02001953125,
0.025604248046875,
-0.0401611328125,
-0.01065826416015625,
0.0248565673828125,
-0.024688720703125,
0.09820556640625,
0.023651123046875,
-0.04193115234375,
0.0077667236328125,
-0.0562744140625,
0.0220184326171875,
0.0128173828125,
-0.022857666015625,
-0.008880615234375,
-0.01509857177734375,
0.00823974609375,
0.024505615234375,
0.029754638671875,
-0.048919677734375,
0.01377105712890625,
-0.038543701171875,
0.047882080078125,
0.057861328125,
0.007015228271484375,
0.031219482421875,
-0.0290985107421875,
0.031585693359375,
0.0024585723876953125,
0.0224456787109375,
0.0103302001953125,
-0.036468505859375,
-0.06256103515625,
-0.01149749755859375,
0.0207366943359375,
0.0435791015625,
-0.04510498046875,
0.0438232421875,
-0.03594970703125,
-0.052032470703125,
-0.03643798828125,
-0.016510009765625,
0.0270538330078125,
0.040863037109375,
0.045562744140625,
0.003604888916015625,
-0.06072998046875,
-0.0433349609375,
-0.034820556640625,
-0.0272674560546875,
0.01216888427734375,
0.0160675048828125,
0.042327880859375,
-0.0186767578125,
0.0677490234375,
-0.041778564453125,
-0.01195526123046875,
-0.0248870849609375,
0.0305938720703125,
0.03558349609375,
0.048919677734375,
0.048797607421875,
-0.041473388671875,
-0.04510498046875,
-0.0291748046875,
-0.0654296875,
-0.0263519287109375,
0.01207733154296875,
-0.0128173828125,
0.04351806640625,
0.0300750732421875,
-0.04803466796875,
0.0298614501953125,
0.03460693359375,
-0.04345703125,
0.03179931640625,
0.00107574462890625,
0.0191650390625,
-0.10498046875,
0.006988525390625,
0.019317626953125,
-0.00514984130859375,
-0.053497314453125,
-0.01454925537109375,
-0.02069091796875,
0.00432586669921875,
-0.030548095703125,
0.062286376953125,
-0.0276031494140625,
0.0110931396484375,
0.01207733154296875,
0.00720977783203125,
0.00107574462890625,
0.04010009765625,
-0.0146484375,
0.040069580078125,
0.03802490234375,
-0.039215087890625,
0.01393890380859375,
0.01308441162109375,
-0.0059814453125,
0.023651123046875,
-0.059051513671875,
-0.0008568763732910156,
0.00421142578125,
0.017303466796875,
-0.08575439453125,
-0.00756072998046875,
0.0269012451171875,
-0.069091796875,
0.027099609375,
-0.0203399658203125,
-0.0400390625,
-0.031097412109375,
-0.036468505859375,
0.0269012451171875,
0.04278564453125,
-0.0239715576171875,
0.04840087890625,
0.03277587890625,
0.0083160400390625,
-0.0526123046875,
-0.06597900390625,
0.007450103759765625,
-0.015594482421875,
-0.05694580078125,
0.030120849609375,
-0.0144500732421875,
-0.0288238525390625,
0.0092620849609375,
0.0074920654296875,
-0.00707244873046875,
0.00958251953125,
0.0091705322265625,
0.0294647216796875,
-0.01654052734375,
0.01074981689453125,
-0.01514434814453125,
-0.00811004638671875,
0.003841400146484375,
-0.0290374755859375,
0.058990478515625,
-0.0250396728515625,
0.00875091552734375,
-0.046112060546875,
0.016082763671875,
0.037872314453125,
-0.007720947265625,
0.070068359375,
0.06298828125,
-0.0300750732421875,
-0.00814056396484375,
-0.045166015625,
-0.0074920654296875,
-0.035888671875,
0.029632568359375,
-0.0188446044921875,
-0.055023193359375,
0.039337158203125,
0.023651123046875,
0.00791168212890625,
0.07159423828125,
0.043731689453125,
-0.01082611083984375,
0.0802001953125,
0.034393310546875,
-0.0134124755859375,
0.046722412109375,
-0.059326171875,
0.01125335693359375,
-0.0513916015625,
-0.017303466796875,
-0.051849365234375,
-0.006946563720703125,
-0.0631103515625,
-0.03106689453125,
0.01490020751953125,
0.0011339187622070312,
-0.042999267578125,
0.0203704833984375,
-0.04071044921875,
-0.0015821456909179688,
0.034942626953125,
0.005390167236328125,
-0.002964019775390625,
0.0024356842041015625,
-0.0126800537109375,
-0.0103912353515625,
-0.050201416015625,
-0.03741455078125,
0.08599853515625,
0.0246124267578125,
0.0394287109375,
0.006717681884765625,
0.06805419921875,
0.01910400390625,
0.025909423828125,
-0.048675537109375,
0.057525634765625,
-0.0275421142578125,
-0.04510498046875,
-0.0211334228515625,
-0.0443115234375,
-0.0615234375,
0.003528594970703125,
-0.01519775390625,
-0.061431884765625,
0.0074920654296875,
-0.00394439697265625,
-0.01611328125,
0.0280303955078125,
-0.055877685546875,
0.060150146484375,
-0.0068206787109375,
-0.034332275390625,
-0.0006213188171386719,
-0.041473388671875,
0.00757598876953125,
0.01806640625,
0.0241546630859375,
-0.0235595703125,
-0.0106048583984375,
0.08404541015625,
-0.0457763671875,
0.06231689453125,
-0.0259246826171875,
0.0190887451171875,
0.0207061767578125,
-0.00011622905731201172,
0.0178680419921875,
-0.0018701553344726562,
-0.023773193359375,
0.0210418701171875,
-0.006069183349609375,
-0.0379638671875,
-0.0236663818359375,
0.05645751953125,
-0.0753173828125,
-0.036376953125,
-0.058349609375,
-0.02392578125,
-0.01004791259765625,
0.0218658447265625,
0.03680419921875,
0.047088623046875,
-0.006229400634765625,
0.01250457763671875,
0.0288238525390625,
-0.016998291015625,
0.06256103515625,
0.0238037109375,
-0.0033359527587890625,
-0.0396728515625,
0.054534912109375,
0.0204315185546875,
0.01134490966796875,
0.035430908203125,
0.024200439453125,
-0.020782470703125,
-0.0330810546875,
-0.0118408203125,
0.034423828125,
-0.044189453125,
-0.018218994140625,
-0.0665283203125,
-0.0309906005859375,
-0.05963134765625,
-0.004810333251953125,
-0.0245208740234375,
-0.055206298828125,
-0.040374755859375,
-0.0006666183471679688,
0.040374755859375,
0.05120849609375,
-0.0204315185546875,
0.01629638671875,
-0.047119140625,
0.0176849365234375,
-0.0020389556884765625,
0.02569580078125,
0.0012178421020507812,
-0.059173583984375,
-0.0162811279296875,
0.00806427001953125,
-0.019866943359375,
-0.058929443359375,
0.0540771484375,
0.016815185546875,
0.03704833984375,
0.006866455078125,
0.00858306884765625,
0.0511474609375,
-0.015655517578125,
0.07672119140625,
0.0081634521484375,
-0.08038330078125,
0.044891357421875,
-0.032012939453125,
0.0325927734375,
0.031890869140625,
0.0236358642578125,
-0.041229248046875,
-0.045013427734375,
-0.06378173828125,
-0.07257080078125,
0.0645751953125,
0.0175018310546875,
0.003299713134765625,
-0.0091094970703125,
0.01103973388671875,
-0.0180816650390625,
0.006427764892578125,
-0.0626220703125,
-0.0433349609375,
-0.02911376953125,
-0.0394287109375,
-0.0234527587890625,
-0.0271759033203125,
-0.0007300376892089844,
-0.03509521484375,
0.07427978515625,
0.00907135009765625,
0.04754638671875,
0.015380859375,
-0.0091552734375,
-0.007427215576171875,
0.01983642578125,
0.0428466796875,
0.042449951171875,
-0.03619384765625,
-0.00241851806640625,
0.01444244384765625,
-0.0360107421875,
0.00438690185546875,
0.0204010009765625,
-0.011871337890625,
0.0205535888671875,
0.046844482421875,
0.052459716796875,
0.0167694091796875,
-0.00685882568359375,
0.042449951171875,
-0.01070404052734375,
-0.0246734619140625,
-0.037353515625,
-0.00450897216796875,
-0.006275177001953125,
0.0136566162109375,
0.0491943359375,
0.0170745849609375,
-0.00508880615234375,
-0.0308990478515625,
0.007785797119140625,
0.022613525390625,
-0.031890869140625,
-0.03363037109375,
0.054351806640625,
-0.0006885528564453125,
-0.032501220703125,
0.032684326171875,
-0.0123443603515625,
-0.062744140625,
0.04412841796875,
0.0287628173828125,
0.0924072265625,
-0.00933837890625,
0.0233154296875,
0.05682373046875,
0.0118255615234375,
-0.004913330078125,
0.032958984375,
0.006465911865234375,
-0.04840087890625,
-0.0168609619140625,
-0.05841064453125,
-0.003589630126953125,
0.006710052490234375,
-0.0308837890625,
0.02178955078125,
-0.042266845703125,
-0.035430908203125,
0.01216888427734375,
0.0264739990234375,
-0.0474853515625,
0.027008056640625,
-0.002716064453125,
0.06280517578125,
-0.061859130859375,
0.055755615234375,
0.0478515625,
-0.04095458984375,
-0.07501220703125,
0.009918212890625,
-0.006710052490234375,
-0.037353515625,
0.0587158203125,
0.00621795654296875,
-0.01271820068359375,
0.0091400146484375,
-0.056610107421875,
-0.07696533203125,
0.080078125,
0.0122222900390625,
-0.004146575927734375,
-0.007747650146484375,
0.005462646484375,
0.061187744140625,
-0.029632568359375,
0.0374755859375,
0.02923583984375,
0.032958984375,
-0.003818511962890625,
-0.052154541015625,
0.00911712646484375,
-0.040985107421875,
-0.007720947265625,
0.0034427642822265625,
-0.06695556640625,
0.08636474609375,
-0.0129852294921875,
-0.0121002197265625,
0.003444671630859375,
0.0479736328125,
0.0271148681640625,
0.02471923828125,
0.03076171875,
0.042633056640625,
0.041107177734375,
-0.026641845703125,
0.0751953125,
-0.0274658203125,
0.05224609375,
0.06927490234375,
0.0188446044921875,
0.060791015625,
0.03289794921875,
-0.025177001953125,
0.053497314453125,
0.0469970703125,
-0.0029811859130859375,
0.0308990478515625,
-0.001262664794921875,
-0.00858306884765625,
-0.01444244384765625,
-0.01015472412109375,
-0.032440185546875,
0.0278167724609375,
0.020294189453125,
-0.0300140380859375,
-0.01406097412109375,
-0.0106658935546875,
0.02557373046875,
-0.005092620849609375,
-0.008575439453125,
0.0390625,
0.01041412353515625,
-0.05316162109375,
0.0704345703125,
0.00516510009765625,
0.0645751953125,
-0.0308990478515625,
0.010101318359375,
-0.00849151611328125,
0.0245819091796875,
-0.0209197998046875,
-0.0615234375,
0.01251983642578125,
0.01485443115234375,
-0.0080413818359375,
-0.0218505859375,
0.0309295654296875,
-0.0239105224609375,
-0.050201416015625,
0.04345703125,
0.0257568359375,
0.01242828369140625,
0.0252227783203125,
-0.0750732421875,
0.0054779052734375,
-0.00098419189453125,
-0.05224609375,
-0.004405975341796875,
0.028045654296875,
0.007038116455078125,
0.05029296875,
0.0419921875,
0.01082611083984375,
0.0200347900390625,
0.016845703125,
0.061920166015625,
-0.049163818359375,
-0.0305023193359375,
-0.08148193359375,
0.03668212890625,
-0.0178985595703125,
-0.040069580078125,
0.06329345703125,
0.04730224609375,
0.052703857421875,
0.00014889240264892578,
0.06640625,
-0.0267791748046875,
0.050079345703125,
-0.02545166015625,
0.0545654296875,
-0.054168701171875,
0.0170440673828125,
-0.026336669921875,
-0.057769775390625,
-0.0284576416015625,
0.04296875,
-0.042633056640625,
0.03106689453125,
0.058349609375,
0.0552978515625,
0.009246826171875,
-0.0192718505859375,
0.0031299591064453125,
0.049224853515625,
0.03131103515625,
0.04766845703125,
0.0384521484375,
-0.05487060546875,
0.0465087890625,
-0.043975830078125,
-0.0169525146484375,
-0.0239715576171875,
-0.064697265625,
-0.0751953125,
-0.059173583984375,
-0.0272064208984375,
-0.0675048828125,
0.0087432861328125,
0.0855712890625,
0.03985595703125,
-0.07220458984375,
-0.0260162353515625,
0.0009822845458984375,
0.00347900390625,
-0.0030040740966796875,
-0.02435302734375,
0.04315185546875,
-0.0270843505859375,
-0.058349609375,
0.003875732421875,
-0.00006115436553955078,
0.01458740234375,
-0.0006012916564941406,
-0.0034637451171875,
-0.04351806640625,
0.0037078857421875,
0.0302276611328125,
0.01446533203125,
-0.045074462890625,
-0.0173797607421875,
0.01183319091796875,
-0.034515380859375,
0.01186370849609375,
0.0240936279296875,
-0.0472412109375,
0.01361083984375,
0.053497314453125,
0.0105743408203125,
0.051300048828125,
-0.0035495758056640625,
0.03173828125,
-0.04449462890625,
0.0115509033203125,
0.027191162109375,
0.026519775390625,
0.03533935546875,
-0.0140838623046875,
0.04302978515625,
0.03472900390625,
-0.03509521484375,
-0.07086181640625,
-0.01617431640625,
-0.085205078125,
-0.02020263671875,
0.09759521484375,
-0.008331298828125,
-0.0361328125,
-0.0018186569213867188,
0.006603240966796875,
0.057464599609375,
-0.047760009765625,
0.0587158203125,
0.042755126953125,
0.00876617431640625,
0.0025157928466796875,
-0.031707763671875,
0.0411376953125,
0.022552490234375,
-0.046783447265625,
-0.01122283935546875,
0.003726959228515625,
0.03656005859375,
0.0139923095703125,
0.06005859375,
-0.00853729248046875,
0.01065826416015625,
-0.01169586181640625,
0.00482940673828125,
-0.00753021240234375,
-0.0030517578125,
-0.031951904296875,
0.009246826171875,
-0.0180206298828125,
-0.0146484375
]
] |
alexandrainst/scandi-nli-large | 2023-09-20T11:55:47.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"text-classification",
"zero-shot-classification",
"da",
"no",
"nb",
"sv",
"dataset:strombergnlp/danfever",
"dataset:KBLab/overlim",
"dataset:MoritzLaurer/multilingual-NLI-26lang-2mil7",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | alexandrainst | null | null | alexandrainst/scandi-nli-large | 5 | 1,356,428 | transformers | 2022-11-28T07:05:27 | ---
pipeline_tag: zero-shot-classification
language:
- da
- 'no'
- nb
- sv
license: apache-2.0
datasets:
- strombergnlp/danfever
- KBLab/overlim
- MoritzLaurer/multilingual-NLI-26lang-2mil7
widget:
- example_title: Danish
text: >-
Mexicansk bokser advarer Messi - 'Du skal bede til gud, om at jeg ikke
finder dig'
candidate_labels: sundhed, politik, sport, religion
- example_title: Norwegian
text: >-
Regjeringen i Russland hevder Norge fører en politikk som vil føre til
opptrapping i Arktis og «den endelige ødeleggelsen av russisk-norske
relasjoner».
candidate_labels: helse, politikk, sport, religion
- example_title: Swedish
text: Så luras kroppens immunförsvar att bota cancer
candidate_labels: hälsa, politik, sport, religion
inference:
parameters:
hypothesis_template: Dette eksempel handler om {}
---
# ScandiNLI - Natural Language Inference model for Scandinavian Languages
This model is a fine-tuned version of [NbAiLab/nb-bert-large](https://huggingface.co/NbAiLab/nb-bert-large) for Natural Language Inference in Danish, Norwegian Bokmål and Swedish.
We have released three models for Scandinavian NLI, of different sizes:
- alexandrainst/scandi-nli-large (this)
- [alexandrainst/scandi-nli-base](https://huggingface.co/alexandrainst/scandi-nli-base)
- [alexandrainst/scandi-nli-small](https://huggingface.co/alexandrainst/scandi-nli-small)
A demo of the large model can be found in [this Hugging Face Space](https://huggingface.co/spaces/alexandrainst/zero-shot-classification) - check it out!
The performance and model size of each of them can be found in the Performance section below.
## Quick start
You can use this model in your scripts as follows:
```python
>>> from transformers import pipeline
>>> classifier = pipeline(
... "zero-shot-classification",
... model="alexandrainst/scandi-nli-large",
... )
>>> classifier(
... "Mexicansk bokser advarer Messi - 'Du skal bede til gud, om at jeg ikke finder dig'",
... candidate_labels=['sundhed', 'politik', 'sport', 'religion'],
... hypothesis_template="Dette eksempel handler om {}",
... )
{'sequence': "Mexicansk bokser advarer Messi - 'Du skal bede til gud, om at jeg ikke finder dig'",
'labels': ['sport', 'religion', 'politik', 'sundhed'],
'scores': [0.6134647727012634,
0.30309760570526123,
0.05021871626377106,
0.03321893885731697]}
```
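Under the hood, the zero-shot pipeline turns each candidate label into an NLI hypothesis via `hypothesis_template`, then scores the input text (the premise) against each hypothesis with the model. A simplified sketch of the hypothesis-construction step, using the template and labels from the example above:

```python
# Simplified sketch: each candidate label is inserted into the hypothesis
# template; the pipeline then scores every (premise, hypothesis) pair.
template = "Dette eksempel handler om {}"
candidate_labels = ["sundhed", "politik", "sport", "religion"]
hypotheses = [template.format(label) for label in candidate_labels]

print(hypotheses[2])  # Dette eksempel handler om sport
```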
## Performance
We assess the models both on their aggregate Scandinavian performance and on their language-specific Danish, Swedish and Norwegian Bokmål performance.
In all cases, we report the Matthews Correlation Coefficient (MCC), macro-average F1-score and accuracy.
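For reference, accuracy and macro-average F1 can be computed as sketched below. This is an illustrative plain-Python implementation, not the evaluation code used for these results; in practice a library such as scikit-learn is typically used:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the gold labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores."""
    labels = set(y_true) | set(y_pred)
    f1_scores = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        f1_scores.append(f1)
    return sum(f1_scores) / len(f1_scores)

print(accuracy([0, 1, 2, 2], [0, 1, 2, 1]))  # 0.75
```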
### Scandinavian Evaluation
The Scandinavian scores are the average of the Danish, Swedish and Norwegian scores, which can be found in the sections below.
| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------------ | :--------- | :----------- | :----------- |
| `alexandrainst/scandi-nli-large` (this) | **73.70%** | **74.44%** | **83.91%** | 354M |
| [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 69.01% | 71.99% | 80.66% | 279M |
| [`alexandrainst/scandi-nli-base`](https://huggingface.co/alexandrainst/scandi-nli-base) | 67.42% | 71.54% | 80.09% | 178M |
| [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 64.17% | 70.80% | 77.29% | 560M |
| [`MoritzLaurer/mDeBERTa-v3-base-mnli-xnli`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-mnli-xnli) | 63.94% | 70.41% | 77.23% | 279M |
| [`NbAiLab/nb-bert-base-mnli`](https://huggingface.co/NbAiLab/nb-bert-base-mnli) | 61.71% | 68.36% | 76.08% | 178M |
| [`alexandrainst/scandi-nli-small`](https://huggingface.co/alexandrainst/scandi-nli-small) | 56.02% | 65.30% | 73.56% | **22M** |
### Danish Evaluation
We use a test split of the [DanFEVER dataset](https://aclanthology.org/2021.nodalida-main.pdf#page=439) to evaluate the Danish performance of the models.
The test split is generated using [this gist](https://gist.github.com/saattrupdan/1cb8379232fdec6e943dc84595a85e7c).
| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------------ | :--------- | :----------- | :----------- |
| `alexandrainst/scandi-nli-large` (this) | **73.80%** | **58.41%** | **86.98%** | 354M |
| [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 68.37% | 57.10% | 83.25% | 279M |
| [`alexandrainst/scandi-nli-base`](https://huggingface.co/alexandrainst/scandi-nli-base) | 62.44% | 55.00% | 80.42% | 178M |
| [`NbAiLab/nb-bert-base-mnli`](https://huggingface.co/NbAiLab/nb-bert-base-mnli) | 56.92% | 53.25% | 76.39% | 178M |
| [`MoritzLaurer/mDeBERTa-v3-base-mnli-xnli`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-mnli-xnli) | 52.79% | 52.00% | 72.35% | 279M |
| [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 49.18% | 50.31% | 69.73% | 560M |
| [`alexandrainst/scandi-nli-small`](https://huggingface.co/alexandrainst/scandi-nli-small) | 47.28% | 48.88% | 73.46% | **22M** |
### Swedish Evaluation
We use the test split of the machine translated version of the [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) dataset to evaluate the Swedish performance of the models.
We acknowledge that evaluating on machine-translated rather than gold-standard data is not ideal, but unfortunately we are not aware of any gold-standard NLI datasets in Swedish.
| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------------ | :--------- | :----------- | :----------- |
| `alexandrainst/scandi-nli-large` (this) | **76.69%** | **84.47%** | **84.38%** | 354M |
| [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 75.35% | 83.42% | 83.55% | 560M |
| [`MoritzLaurer/mDeBERTa-v3-base-mnli-xnli`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-mnli-xnli) | 73.84% | 82.46% | 82.58% | 279M |
| [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 73.32% | 82.15% | 82.08% | 279M |
| [`alexandrainst/scandi-nli-base`](https://huggingface.co/alexandrainst/scandi-nli-base) | 72.29% | 81.37% | 81.51% | 178M |
| [`NbAiLab/nb-bert-base-mnli`](https://huggingface.co/NbAiLab/nb-bert-base-mnli) | 64.69% | 76.40% | 76.47% | 178M |
| [`alexandrainst/scandi-nli-small`](https://huggingface.co/alexandrainst/scandi-nli-small) | 62.35% | 74.79% | 74.93% | **22M** |
### Norwegian Evaluation
We use the test split of the machine translated version of the [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) dataset to evaluate the Norwegian performance of the models.
We acknowledge that evaluating on machine-translated rather than gold-standard data is not ideal, but unfortunately we are not aware of any gold-standard NLI datasets in Norwegian.
| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------------ | :--------- | :----------- | :----------- |
| `alexandrainst/scandi-nli-large` (this) | **70.61%** | **80.43%** | **80.36%** | 354M |
| [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 67.99% | 78.68% | 78.60% | 560M |
| [`alexandrainst/scandi-nli-base`](https://huggingface.co/alexandrainst/scandi-nli-base) | 67.53% | 78.24% | 78.33% | 178M |
| [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 65.33% | 76.73% | 76.65% | 279M |
| [`MoritzLaurer/mDeBERTa-v3-base-mnli-xnli`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-mnli-xnli) | 65.18% | 76.76% | 76.77% | 279M |
| [`NbAiLab/nb-bert-base-mnli`](https://huggingface.co/NbAiLab/nb-bert-base-mnli) | 63.51% | 75.42% | 75.39% | 178M |
| [`alexandrainst/scandi-nli-small`](https://huggingface.co/alexandrainst/scandi-nli-small) | 58.42% | 72.22% | 72.30% | **22M** |
## Training procedure
The model has been fine-tuned on a dataset composed of [DanFEVER](https://aclanthology.org/2021.nodalida-main.pdf#page=439) as well as machine translated versions of [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) and [CommitmentBank](https://doi.org/10.18148/sub/2019.v23i2.601) into all three languages, and machine translated versions of [FEVER](https://aclanthology.org/N18-1074/) and [Adversarial NLI](https://aclanthology.org/2020.acl-main.441/) into Swedish.
The training split of DanFEVER is generated using [this gist](https://gist.github.com/saattrupdan/1cb8379232fdec6e943dc84595a85e7c).
The three languages are sampled equally during training, and the models are validated on the validation split of [DanFEVER](https://aclanthology.org/2021.nodalida-main.pdf#page=439) and machine translated versions of [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) for Swedish and Norwegian Bokmål, also sampled equally.
Check out the [Github repository](https://github.com/alexandrainst/ScandiNLI) for the code used to train the ScandiNLI models, and the full training logs can be found in [this Weights and Biases report](https://wandb.ai/saattrupdan/huggingface/reports/ScandiNLI--VmlldzozMDQyOTk1?accessToken=r9crgxqvvigy2hatdjeobzwipz7f3id5vqg8ooksljhfw6wl0hv1b05asypsfj9v).
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 4242
- gradient_accumulation_steps: 16
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- max_steps: 50,000 | 9,788 | [
[
-0.048248291015625,
-0.029052734375,
0.0144805908203125,
0.0212860107421875,
-0.0206451416015625,
-0.00919342041015625,
-0.018890380859375,
-0.043975830078125,
0.059661865234375,
0.0005970001220703125,
-0.04791259765625,
-0.061370849609375,
-0.04058837890625,
0.019500732421875,
0.01103973388671875,
0.07403564453125,
-0.0011034011840820312,
0.0259857177734375,
0.00458526611328125,
-0.0213165283203125,
-0.01155853271484375,
-0.0391845703125,
-0.047027587890625,
-0.00933074951171875,
0.039093017578125,
0.01235198974609375,
0.046661376953125,
0.0271759033203125,
0.03851318359375,
0.019317626953125,
-0.032989501953125,
-0.0014829635620117188,
-0.021942138671875,
-0.00948333740234375,
0.0126800537109375,
-0.032745361328125,
-0.0654296875,
-0.005077362060546875,
0.057952880859375,
0.043487548828125,
-0.006649017333984375,
0.01482391357421875,
0.005218505859375,
0.048583984375,
-0.019927978515625,
0.00548553466796875,
-0.00664520263671875,
0.002017974853515625,
-0.01027679443359375,
0.0176239013671875,
-0.0156402587890625,
-0.029541015625,
0.0013446807861328125,
-0.037445068359375,
0.002838134765625,
0.0066070556640625,
0.08740234375,
0.00215911865234375,
-0.0277862548828125,
-0.0159149169921875,
-0.00966644287109375,
0.07275390625,
-0.0833740234375,
0.0305023193359375,
0.0296630859375,
-0.004512786865234375,
-0.01073455810546875,
-0.01751708984375,
-0.0316162109375,
-0.01424407958984375,
-0.0245819091796875,
0.0300140380859375,
-0.022857666015625,
-0.0113067626953125,
0.0265655517578125,
0.0283355712890625,
-0.0631103515625,
0.005428314208984375,
-0.02105712890625,
-0.0007696151733398438,
0.08319091796875,
0.00791168212890625,
0.036529541015625,
-0.024200439453125,
-0.0200347900390625,
-0.0210113525390625,
-0.0191497802734375,
0.01201629638671875,
0.033538818359375,
0.0271759033203125,
-0.05908203125,
0.04876708984375,
-0.00406646728515625,
0.0526123046875,
0.00901031494140625,
-0.0185546875,
0.059478759765625,
-0.042572021484375,
-0.0186004638671875,
-0.0179901123046875,
0.08514404296875,
0.03759765625,
0.01041412353515625,
0.015777587890625,
-0.009979248046875,
-0.002407073974609375,
-0.020172119140625,
-0.0548095703125,
0.004638671875,
0.0190277099609375,
-0.043792724609375,
-0.021087646484375,
0.0123443603515625,
-0.06292724609375,
0.00926971435546875,
-0.00817108154296875,
0.00936126708984375,
-0.052337646484375,
-0.0511474609375,
0.01253509521484375,
-0.0030651092529296875,
0.03155517578125,
0.004161834716796875,
-0.031951904296875,
0.018218994140625,
0.0230865478515625,
0.059814453125,
-0.0177001953125,
-0.0159149169921875,
0.001964569091796875,
-0.0200958251953125,
-0.03021240234375,
0.02642822265625,
-0.003971099853515625,
-0.03289794921875,
-0.0168304443359375,
0.0075531005859375,
-0.0421142578125,
-0.0128173828125,
0.0501708984375,
0.009735107421875,
0.027252197265625,
-0.018463134765625,
-0.04595947265625,
-0.006366729736328125,
0.03155517578125,
-0.0675048828125,
0.08135986328125,
0.011199951171875,
-0.0765380859375,
0.00754547119140625,
-0.04656982421875,
-0.006450653076171875,
-0.0308685302734375,
0.012420654296875,
-0.047271728515625,
-0.0004875659942626953,
0.0166168212890625,
0.04486083984375,
-0.0303955078125,
0.0141448974609375,
-0.03546142578125,
-0.006305694580078125,
0.0286712646484375,
-0.0116424560546875,
0.07659912109375,
0.0160064697265625,
-0.01983642578125,
-0.006221771240234375,
-0.0755615234375,
0.018463134765625,
0.0187225341796875,
-0.02227783203125,
-0.0160064697265625,
-0.0394287109375,
0.0166168212890625,
0.042449951171875,
0.00640106201171875,
-0.045684814453125,
0.014739990234375,
-0.04180908203125,
0.00598907470703125,
0.03033447265625,
-0.0018625259399414062,
0.007617950439453125,
-0.0287933349609375,
0.056915283203125,
-0.0009794235229492188,
0.0214080810546875,
0.010955810546875,
-0.052276611328125,
-0.039794921875,
-0.040557861328125,
0.039794921875,
0.031982421875,
-0.060302734375,
0.03912353515625,
-0.025634765625,
-0.057830810546875,
-0.04486083984375,
-0.00047659873962402344,
0.05615234375,
0.01535797119140625,
0.00640869140625,
-0.0107574462890625,
-0.03814697265625,
-0.0845947265625,
-0.015869140625,
-0.0164794921875,
0.0022430419921875,
0.031158447265625,
0.054901123046875,
-0.00952911376953125,
0.045501708984375,
-0.0297698974609375,
-0.00559234619140625,
-0.0206298828125,
0.0036678314208984375,
0.05426025390625,
0.05316162109375,
0.0738525390625,
-0.07012939453125,
-0.06884765625,
0.0137176513671875,
-0.06536865234375,
-0.001621246337890625,
0.0102386474609375,
-0.01026153564453125,
0.04345703125,
0.01425933837890625,
-0.057220458984375,
0.035736083984375,
0.060791015625,
-0.059295654296875,
0.052459716796875,
-0.00934600830078125,
0.013214111328125,
-0.10284423828125,
0.0343017578125,
0.013153076171875,
-0.008392333984375,
-0.057037353515625,
0.0048675537109375,
-0.0111846923828125,
0.0254669189453125,
-0.056243896484375,
0.0718994140625,
-0.040618896484375,
0.0057373046875,
0.01995849609375,
0.00421905517578125,
-0.004901885986328125,
0.047454833984375,
0.004405975341796875,
0.051483154296875,
0.042449951171875,
-0.019744873046875,
0.006103515625,
0.033660888671875,
-0.0233306884765625,
0.0458984375,
-0.050506591796875,
-0.001377105712890625,
0.0018682479858398438,
0.0110931396484375,
-0.07000732421875,
-0.0240325927734375,
0.0244598388671875,
-0.05657958984375,
0.035064697265625,
-0.00595855712890625,
-0.0241241455078125,
-0.04095458984375,
-0.051483154296875,
0.0206451416015625,
0.032440185546875,
-0.03021240234375,
0.0546875,
0.01335906982421875,
-0.015106201171875,
-0.053375244140625,
-0.053558349609375,
-0.01311492919921875,
-0.0195770263671875,
-0.057830810546875,
0.0164031982421875,
-0.012664794921875,
-0.0099029541015625,
0.0138092041015625,
0.003997802734375,
-0.0119781494140625,
-0.0004820823669433594,
0.018280029296875,
0.0391845703125,
-0.0240020751953125,
-0.017608642578125,
-0.01256561279296875,
0.0026187896728515625,
-0.002536773681640625,
0.005100250244140625,
0.04339599609375,
-0.0248870849609375,
-0.014739990234375,
-0.059783935546875,
0.02923583984375,
0.051422119140625,
-0.00972747802734375,
0.073974609375,
0.06024169921875,
-0.019134521484375,
0.0178375244140625,
-0.046722412109375,
0.00107574462890625,
-0.0237884521484375,
0.01139068603515625,
-0.0474853515625,
-0.051422119140625,
0.060028076171875,
0.0251922607421875,
-0.0022182464599609375,
0.07135009765625,
0.035614013671875,
-0.0003361701965332031,
0.10040283203125,
0.029754638671875,
-0.007335662841796875,
0.0292816162109375,
-0.05572509765625,
-0.0035533905029296875,
-0.06903076171875,
-0.021026611328125,
-0.0285797119140625,
-0.021453857421875,
-0.05706787109375,
-0.0240020751953125,
0.0229339599609375,
0.00554656982421875,
-0.0098724365234375,
0.0265655517578125,
-0.0225067138671875,
0.0158843994140625,
0.03729248046875,
0.00336456298828125,
-0.003070831298828125,
-0.00479888916015625,
-0.0244140625,
-0.0146331787109375,
-0.059783935546875,
-0.0163116455078125,
0.08575439453125,
0.034088134765625,
0.037841796875,
0.019622802734375,
0.042266845703125,
0.00128173828125,
0.0155181884765625,
-0.033416748046875,
0.0347900390625,
-0.00540924072265625,
-0.056427001953125,
-0.020660400390625,
-0.045684814453125,
-0.063720703125,
0.0295867919921875,
-0.031707763671875,
-0.054473876953125,
0.0259857177734375,
-0.0012969970703125,
-0.026458740234375,
0.026611328125,
-0.050933837890625,
0.055633544921875,
-0.008514404296875,
-0.03143310546875,
0.004688262939453125,
-0.0501708984375,
0.031494140625,
-0.004535675048828125,
0.01953125,
-0.01131439208984375,
0.0084381103515625,
0.0618896484375,
-0.032562255859375,
0.060272216796875,
-0.016815185546875,
-0.0059356689453125,
0.01483154296875,
-0.00946807861328125,
0.0189361572265625,
0.00557708740234375,
-0.0299224853515625,
0.036376953125,
0.02752685546875,
-0.03912353515625,
-0.028656005859375,
0.0484619140625,
-0.0447998046875,
-0.019500732421875,
-0.053314208984375,
-0.03448486328125,
0.0030002593994140625,
0.018280029296875,
0.031005859375,
0.033111572265625,
-0.025634765625,
0.0200347900390625,
0.037445068359375,
-0.0295867919921875,
0.0316162109375,
0.0230560302734375,
-0.0235748291015625,
-0.04925537109375,
0.056365966796875,
-0.0014581680297851562,
0.0258331298828125,
-0.0009822845458984375,
0.0083465576171875,
-0.0377197265625,
-0.03497314453125,
-0.02899169921875,
0.0594482421875,
-0.02886962890625,
-0.0170135498046875,
-0.0479736328125,
-0.0191497802734375,
-0.036041259765625,
-0.0039520263671875,
-0.02972412109375,
-0.04638671875,
-0.0176239013671875,
-0.007152557373046875,
0.01922607421875,
0.04595947265625,
-0.01302337646484375,
0.01111602783203125,
-0.04052734375,
0.007083892822265625,
-0.009857177734375,
0.036651611328125,
-0.0143280029296875,
-0.03668212890625,
-0.013702392578125,
0.0171966552734375,
0.004840850830078125,
-0.05609130859375,
0.050872802734375,
0.0135498046875,
0.0302734375,
0.023468017578125,
-0.01171112060546875,
0.038116455078125,
-0.015472412109375,
0.06829833984375,
0.039154052734375,
-0.058929443359375,
0.031402587890625,
-0.032379150390625,
0.02642822265625,
0.03961181640625,
0.0362548828125,
-0.036712646484375,
-0.0312042236328125,
-0.05682373046875,
-0.06536865234375,
0.069091796875,
0.027191162109375,
-0.0016393661499023438,
-0.001407623291015625,
0.03558349609375,
-0.01161956787109375,
0.0091400146484375,
-0.049530029296875,
-0.046173095703125,
0.005329132080078125,
-0.023834228515625,
-0.021942138671875,
-0.027923583984375,
-0.0052947998046875,
-0.036773681640625,
0.0716552734375,
-0.0128326416015625,
0.0185089111328125,
0.01507568359375,
0.006572723388671875,
0.010711669921875,
-0.004238128662109375,
0.058929443359375,
0.043792724609375,
-0.0249786376953125,
-0.03485107421875,
0.03338623046875,
-0.04638671875,
-0.00001823902130126953,
0.00760650634765625,
-0.0157318115234375,
0.0282135009765625,
0.04791259765625,
0.080322265625,
0.025726318359375,
-0.040191650390625,
0.0537109375,
-0.03436279296875,
-0.037322998046875,
-0.031524658203125,
-0.00981903076171875,
0.0182342529296875,
0.007549285888671875,
0.002899169921875,
-0.006336212158203125,
-0.005123138427734375,
-0.01543426513671875,
0.0121917724609375,
0.035369873046875,
-0.0308990478515625,
-0.04351806640625,
0.03826904296875,
-0.004863739013671875,
-0.0015201568603515625,
0.0196685791015625,
-0.0176239013671875,
-0.030975341796875,
0.0552978515625,
0.03240966796875,
0.05010986328125,
-0.04962158203125,
0.0197601318359375,
0.061614990234375,
0.009765625,
-0.00843048095703125,
0.055145263671875,
0.03515625,
-0.04595947265625,
-0.0279083251953125,
-0.06634521484375,
-0.00024271011352539062,
0.0147247314453125,
-0.06646728515625,
0.0237884521484375,
-0.017669677734375,
-0.032867431640625,
0.013427734375,
0.0191802978515625,
-0.050872802734375,
0.021728515625,
0.004146575927734375,
0.08538818359375,
-0.0772705078125,
0.06536865234375,
0.05926513671875,
-0.037506103515625,
-0.071533203125,
-0.0145416259765625,
0.0078277587890625,
-0.0469970703125,
0.039520263671875,
-0.0007090568542480469,
0.00804901123046875,
-0.01067352294921875,
-0.0257568359375,
-0.07989501953125,
0.079833984375,
0.0241241455078125,
-0.0345458984375,
-0.00299835205078125,
0.004886627197265625,
0.054107666015625,
-0.02752685546875,
0.048492431640625,
0.041229248046875,
0.035614013671875,
0.01422882080078125,
-0.07470703125,
-0.00012969970703125,
-0.049774169921875,
0.01302337646484375,
0.00730133056640625,
-0.06060791015625,
0.07305908203125,
-0.0032482147216796875,
0.0023021697998046875,
0.00695037841796875,
0.038360595703125,
0.032989501953125,
0.01556396484375,
0.036956787109375,
0.054718017578125,
0.03802490234375,
-0.0032901763916015625,
0.0997314453125,
-0.037811279296875,
0.035980224609375,
0.0509033203125,
0.0059051513671875,
0.06365966796875,
0.028106689453125,
-0.0196075439453125,
0.0275726318359375,
0.038726806640625,
-0.005584716796875,
0.00957489013671875,
-0.0018711090087890625,
-0.01227569580078125,
-0.0034694671630859375,
-0.007701873779296875,
-0.028778076171875,
0.029541015625,
0.01459503173828125,
-0.0164642333984375,
0.00281524658203125,
0.006450653076171875,
0.046234130859375,
-0.0047607421875,
-0.00348663330078125,
0.058502197265625,
0.0120086669921875,
-0.047271728515625,
0.076171875,
-0.01238250732421875,
0.0645751953125,
-0.043701171875,
0.0070037841796875,
-0.01430511474609375,
0.01114654541015625,
-0.01467132568359375,
-0.061126708984375,
0.0268096923828125,
0.012542724609375,
-0.00833892822265625,
-0.006450653076171875,
0.035064697265625,
-0.038909912109375,
-0.0648193359375,
0.0440673828125,
0.034088134765625,
0.0274810791015625,
0.01152801513671875,
-0.08282470703125,
0.0157012939453125,
0.021514892578125,
-0.0472412109375,
0.0155487060546875,
0.004192352294921875,
0.004650115966796875,
0.04461669921875,
0.049041748046875,
0.022979736328125,
0.005535125732421875,
0.0032176971435546875,
0.062408447265625,
-0.03289794921875,
-0.025726318359375,
-0.06219482421875,
0.05010986328125,
-0.0275726318359375,
-0.040740966796875,
0.07318115234375,
0.04840087890625,
0.0538330078125,
-0.005222320556640625,
0.05126953125,
-0.01934814453125,
0.0511474609375,
-0.039886474609375,
0.05865478515625,
-0.059295654296875,
-0.006175994873046875,
-0.0247955322265625,
-0.06475830078125,
-0.03680419921875,
0.032745361328125,
-0.021026611328125,
0.00754547119140625,
0.03448486328125,
0.049530029296875,
-0.004428863525390625,
-0.0105743408203125,
0.012451171875,
0.035186767578125,
0.01238250732421875,
0.051849365234375,
0.034637451171875,
-0.04638671875,
0.0038967132568359375,
-0.0367431640625,
-0.0169219970703125,
-0.012481689453125,
-0.05474853515625,
-0.08258056640625,
-0.042999267578125,
-0.03729248046875,
-0.035614013671875,
-0.003452301025390625,
0.085693359375,
0.04522705078125,
-0.07275390625,
-0.0277862548828125,
0.0266876220703125,
-0.009918212890625,
-0.02239990234375,
-0.0119781494140625,
0.0418701171875,
0.01081085205078125,
-0.0673828125,
0.01314544677734375,
-0.00765228271484375,
0.01800537109375,
-0.003849029541015625,
-0.0308990478515625,
-0.045318603515625,
0.001621246337890625,
0.038726806640625,
0.024017333984375,
-0.05419921875,
0.004161834716796875,
0.01204681396484375,
-0.01462554931640625,
0.006072998046875,
0.005138397216796875,
-0.013214111328125,
0.0095977783203125,
0.037017822265625,
0.0207061767578125,
0.046478271484375,
0.003910064697265625,
0.0227508544921875,
-0.051605224609375,
0.024322509765625,
0.01401519775390625,
0.0289154052734375,
0.02685546875,
-0.0169677734375,
0.053314208984375,
0.01434326171875,
-0.0191802978515625,
-0.08319091796875,
-0.01439666748046875,
-0.07843017578125,
-0.014007568359375,
0.08453369140625,
-0.0230560302734375,
-0.053375244140625,
0.0240936279296875,
-0.01464080810546875,
0.01166534423828125,
-0.017425537109375,
0.0226898193359375,
0.0552978515625,
-0.005893707275390625,
-0.0022869110107421875,
-0.05419921875,
0.037109375,
0.039276123046875,
-0.053192138671875,
-0.0167388916015625,
0.0159149169921875,
0.0242462158203125,
0.0277099609375,
0.05169677734375,
-0.0266571044921875,
0.0088348388671875,
-0.009979248046875,
0.03509521484375,
0.01305389404296875,
0.0010776519775390625,
-0.046844482421875,
-0.01271820068359375,
-0.0158538818359375,
-0.009429931640625
]
] |
laion/CLIP-ViT-H-14-laion2B-s32B-b79K | 2023-04-18T17:45:56.000Z | [
"open_clip",
"pytorch",
"clip",
"zero-shot-image-classification",
"arxiv:1910.04867",
"license:mit",
"has_space",
"region:us"
] | zero-shot-image-classification | laion | null | null | laion/CLIP-ViT-H-14-laion2B-s32B-b79K | 181 | 1,340,812 | open_clip | 2022-09-14T22:52:28 | ---
license: mit
widget:
- src: >-
https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
library_name: open_clip
pipeline_tag: zero-shot-image-classification
---
# Model Card for CLIP ViT-H/14 - LAION-2B
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)
7. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
A CLIP ViT-H/14 model trained with the LAION-2B English subset of LAION-5B (https://laion.ai/blog/laion-5b/) using OpenCLIP (https://github.com/mlfoundations/open_clip).
Model training done by Romain Beaumont on the [stability.ai](https://stability.ai/) cluster.
# Uses
As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models.
The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the LAION-5B blog (https://laion.ai/blog/laion-5b/) and upcoming paper include additional discussion as it relates specifically to the training dataset.
## Direct Use
Zero-shot image classification, image and text retrieval, among others.
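For zero-shot classification, a minimal sketch along these lines may be useful. It assumes the checkpoint ships `transformers`-compatible CLIP weights under the Hub id used in this card; the example image URL and labels are illustrative only, so treat this as a sketch rather than an official snippet:

```python
# Zero-shot image classification sketch with Hugging Face transformers.
# Assumption: the Hub repo exposes CLIPModel/CLIPProcessor-compatible weights.
import requests
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model_id = "laion/CLIP-ViT-H-14-laion2B-s32B-b79K"
model = CLIPModel.from_pretrained(model_id)
processor = CLIPProcessor.from_pretrained(model_id)

# Any RGB image works; this COCO validation image is a common demo input.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
labels = ["a photo of a cat", "a photo of a dog"]

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    # logits_per_image holds image-text similarity scores, one per label
    logits = model(**inputs).logits_per_image
probs = logits.softmax(dim=-1)  # normalize scores into label probabilities
print(dict(zip(labels, probs[0].tolist())))
```

As with the original OpenAI model, wrapping class names in a prompt template such as "a photo of a {label}" typically works better than bare class names.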
## Downstream Use
Image classification and other image task fine-tuning, linear probe image classification, image generation guiding and conditioning, among others.
## Out-of-Scope Use
As per the OpenAI models,
**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of performance of the model. This is because the use of artificial intelligence for tasks such as these can be premature currently given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
Beyond the above notice, the LAION-5B dataset used to train these models has additional considerations; see below.
# Training Details
## Training Data
This model was trained with the 2 Billion sample English subset of LAION-5B (https://laion.ai/blog/laion-5b/).
**IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and the handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated: collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a "safe" subset by filtering out samples based on the safety tags (using a customized NSFW classifier that we trained). While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning holds there as well. We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come with training large-scale models, as well as of the pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets restricted to a small community. However, although we provide the dataset openly, we do not recommend using it to create ready-to-go industrial products, as the basic research about the general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress.
## Training Procedure
Please see [training notes](https://docs.google.com/document/d/1EFbMLRWSSV0LUf9Du1pWzWqgeiIRPwEWX2s1C6mAk5c) and [wandb logs](https://wandb.ai/rom1504/eval_openclip/reports/H-14--VmlldzoyNDAxODQ3).
# Evaluation
Evaluation done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark).
## Testing Data, Factors & Metrics
### Testing Data
The testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) with additional robustness datasets) for classification, and with COCO and Flickr for retrieval.
**TODO** - more detail
## Results
The model achieves 78.0% zero-shot top-1 accuracy on ImageNet-1k.
An initial round of benchmarks has been performed on a wider range of datasets, currently viewable at https://github.com/LAION-AI/CLIP_benchmark/blob/main/benchmark/results.ipynb
**TODO** - create table for just this model's metrics.
# Acknowledgements
Acknowledging [stability.ai](https://stability.ai/) for the compute used to train this model.
# Citation
**BibTeX:**
LAION-5B
```bibtex
@inproceedings{schuhmann2022laionb,
title={{LAION}-5B: An open large-scale dataset for training next generation image-text models},
author={Christoph Schuhmann and
Romain Beaumont and
Richard Vencu and
Cade W Gordon and
Ross Wightman and
Mehdi Cherti and
Theo Coombes and
Aarush Katta and
Clayton Mullis and
Mitchell Wortsman and
Patrick Schramowski and
Srivatsa R Kundurthy and
Katherine Crowson and
Ludwig Schmidt and
Robert Kaczmarczyk and
Jenia Jitsev},
booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
year={2022},
url={https://openreview.net/forum?id=M3Y74vmsMcY}
}
```
OpenAI CLIP paper
```bibtex
@inproceedings{Radford2021LearningTV,
title={Learning Transferable Visual Models From Natural Language Supervision},
author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
booktitle={ICML},
year={2021}
}
```
OpenCLIP software
```bibtex
@software{ilharco_gabriel_2021_5143773,
author = {Ilharco, Gabriel and
Wortsman, Mitchell and
Wightman, Ross and
Gordon, Cade and
Carlini, Nicholas and
Taori, Rohan and
Dave, Achal and
Shankar, Vaishaal and
Namkoong, Hongseok and
Miller, John and
Hajishirzi, Hannaneh and
Farhadi, Ali and
Schmidt, Ludwig},
title = {OpenCLIP},
month = jul,
year = 2021,
note = {If you use this software, please cite it as below.},
publisher = {Zenodo},
version = {0.1},
doi = {10.5281/zenodo.5143773},
url = {https://doi.org/10.5281/zenodo.5143773}
}
```
# How to Get Started with the Model
Use the code below to get started with the model.
** TODO ** - Hugging Face transformers, OpenCLIP, and timm getting started snippets | 8,203 | [
[
-0.0226287841796875,
-0.044158935546875,
0.0161590576171875,
0.0019664764404296875,
-0.029296875,
-0.033721923828125,
-0.01512908935546875,
-0.050628662109375,
-0.0005974769592285156,
0.034027099609375,
-0.0330810546875,
-0.04559326171875,
-0.045806884765625,
-0.00586700439453125,
-0.0306549072265625,
0.0701904296875,
-0.0160369873046875,
-0.002777099609375,
-0.022705078125,
-0.03216552734375,
-0.0374755859375,
-0.043701171875,
-0.031341552734375,
0.0042724609375,
0.012420654296875,
0.02069091796875,
0.047210693359375,
0.06536865234375,
0.057952880859375,
0.0174102783203125,
-0.0074615478515625,
-0.0003848075866699219,
-0.041534423828125,
-0.0360107421875,
-0.002155303955078125,
-0.0250701904296875,
-0.048553466796875,
0.01068878173828125,
0.042083740234375,
0.025360107421875,
-0.0075531005859375,
0.0196380615234375,
-0.0019092559814453125,
0.031982421875,
-0.058807373046875,
0.0163421630859375,
-0.042938232421875,
0.0004296302795410156,
-0.0169830322265625,
0.013671875,
-0.025787353515625,
-0.01039886474609375,
0.01486968994140625,
-0.053314208984375,
0.0128631591796875,
-0.00865936279296875,
0.10723876953125,
0.01739501953125,
-0.02056884765625,
0.01520538330078125,
-0.05035400390625,
0.056915283203125,
-0.05731201171875,
0.02783203125,
0.0265655517578125,
0.030548095703125,
0.0182952880859375,
-0.06689453125,
-0.03509521484375,
-0.01290130615234375,
0.0124664306640625,
0.0208282470703125,
-0.02178955078125,
0.0016803741455078125,
0.0343017578125,
0.0135955810546875,
-0.026458740234375,
0.0020923614501953125,
-0.0496826171875,
-0.0012655258178710938,
0.049896240234375,
-0.0013589859008789062,
0.0277252197265625,
-0.0243988037109375,
-0.0548095703125,
-0.035614013671875,
-0.045654296875,
0.0311126708984375,
0.0192413330078125,
-0.0021114349365234375,
-0.03887939453125,
0.034515380859375,
0.00302886962890625,
0.0301666259765625,
-0.005481719970703125,
-0.0222625732421875,
0.03582763671875,
-0.03118896484375,
-0.0256195068359375,
-0.01532745361328125,
0.08349609375,
0.049102783203125,
0.01236724853515625,
0.00811767578125,
-0.0014066696166992188,
-0.00630950927734375,
0.0238189697265625,
-0.0760498046875,
-0.01059722900390625,
-0.0006270408630371094,
-0.049346923828125,
-0.0252532958984375,
0.03314208984375,
-0.057525634765625,
0.00258636474609375,
-0.0118255615234375,
0.039825439453125,
-0.042327880859375,
-0.0187225341796875,
0.0004215240478515625,
-0.002635955810546875,
0.0189971923828125,
0.021697998046875,
-0.04510498046875,
0.0154266357421875,
0.0260162353515625,
0.08123779296875,
-0.017730712890625,
-0.0325927734375,
-0.016693115234375,
0.01366424560546875,
-0.0233001708984375,
0.038177490234375,
-0.013641357421875,
-0.0221099853515625,
-0.00728607177734375,
0.02764892578125,
-0.00574493408203125,
-0.04229736328125,
0.0467529296875,
-0.0187225341796875,
0.00444793701171875,
-0.01096343994140625,
-0.01520538330078125,
-0.042755126953125,
0.010589599609375,
-0.05157470703125,
0.0711669921875,
0.000835418701171875,
-0.0660400390625,
0.023101806640625,
-0.044219970703125,
-0.01372528076171875,
-0.017181396484375,
-0.003826141357421875,
-0.047149658203125,
-0.0186920166015625,
0.03631591796875,
0.04083251953125,
-0.02392578125,
0.0352783203125,
-0.04791259765625,
-0.023681640625,
0.0182647705078125,
-0.03240966796875,
0.07269287109375,
0.0002409219741821289,
-0.0265350341796875,
0.015380859375,
-0.0479736328125,
-0.00988006591796875,
0.0192718505859375,
0.004299163818359375,
-0.017181396484375,
-0.0176849365234375,
-0.0008425712585449219,
0.0206756591796875,
0.01093292236328125,
-0.0406494140625,
-0.0010309219360351562,
-0.00914764404296875,
0.034637451171875,
0.05596923828125,
0.006267547607421875,
0.021942138671875,
-0.0312347412109375,
0.043121337890625,
0.011810302734375,
0.044891357421875,
-0.02349853515625,
-0.0389404296875,
-0.05816650390625,
-0.042510986328125,
0.031524658203125,
0.0413818359375,
-0.050567626953125,
0.031982421875,
-0.018798828125,
-0.0390625,
-0.03118896484375,
-0.007007598876953125,
0.037933349609375,
0.04058837890625,
0.033477783203125,
-0.03582763671875,
-0.037384033203125,
-0.06494140625,
0.016143798828125,
-0.0018262863159179688,
-0.00244903564453125,
0.04541015625,
0.05706787109375,
-0.01194000244140625,
0.06976318359375,
-0.047637939453125,
-0.036956787109375,
-0.00858306884765625,
0.004993438720703125,
0.0065460205078125,
0.03515625,
0.06622314453125,
-0.064453125,
-0.035858154296875,
-0.007167816162109375,
-0.089599609375,
0.00836944580078125,
0.000782012939453125,
-0.0208740234375,
0.01381683349609375,
0.041534423828125,
-0.044952392578125,
0.053253173828125,
0.0333251953125,
0.0038394927978515625,
0.03692626953125,
-0.00981903076171875,
0.00017273426055908203,
-0.08868408203125,
0.02734375,
0.008636474609375,
-0.016845703125,
-0.038909912109375,
0.0012712478637695312,
0.0034580230712890625,
-0.027252197265625,
-0.0631103515625,
0.04229736328125,
-0.0281982421875,
0.005619049072265625,
-0.0032634735107421875,
0.002887725830078125,
0.005252838134765625,
0.04656982421875,
0.007640838623046875,
0.06494140625,
0.056365966796875,
-0.04888916015625,
0.00267791748046875,
0.02923583984375,
-0.027587890625,
0.029693603515625,
-0.0738525390625,
0.004665374755859375,
-0.0068511962890625,
0.01026153564453125,
-0.0292816162109375,
-0.03271484375,
0.0303192138671875,
-0.03643798828125,
0.0250091552734375,
-0.02142333984375,
-0.01666259765625,
-0.032623291015625,
-0.04327392578125,
0.0396728515625,
0.054718017578125,
-0.049163818359375,
0.0256805419921875,
0.035675048828125,
0.0075225830078125,
-0.057403564453125,
-0.0504150390625,
-0.0210113525390625,
-0.0252838134765625,
-0.0528564453125,
0.032684326171875,
-0.0061187744140625,
-0.0007023811340332031,
0.004299163818359375,
0.01024627685546875,
-0.016815185546875,
-0.007778167724609375,
0.0504150390625,
0.04290771484375,
-0.004390716552734375,
-0.009063720703125,
-0.00762939453125,
-0.0004527568817138672,
-0.0012102127075195312,
-0.006671905517578125,
0.0167388916015625,
-0.0173492431640625,
-0.0220794677734375,
-0.048248291015625,
0.0128936767578125,
0.04815673828125,
-0.032257080078125,
0.058258056640625,
0.05560302734375,
-0.03656005859375,
0.0036029815673828125,
-0.025360107421875,
-0.0063018798828125,
-0.035919189453125,
0.035858154296875,
-0.00403594970703125,
-0.049896240234375,
0.042083740234375,
0.01087188720703125,
-0.0092620849609375,
0.0435791015625,
0.02789306640625,
-0.005611419677734375,
0.06927490234375,
0.070068359375,
-0.002349853515625,
0.04925537109375,
-0.052398681640625,
0.01219940185546875,
-0.07208251953125,
-0.0267791748046875,
-0.01568603515625,
-0.005954742431640625,
-0.03955078125,
-0.040924072265625,
0.049407958984375,
0.029327392578125,
-0.0157928466796875,
0.034332275390625,
-0.027252197265625,
0.0211944580078125,
0.040130615234375,
0.02789306640625,
0.0033664703369140625,
-0.0023326873779296875,
-0.00214385986328125,
-0.007808685302734375,
-0.052032470703125,
-0.03302001953125,
0.08624267578125,
0.0504150390625,
0.058563232421875,
-0.00626373291015625,
0.0333251953125,
0.0104827880859375,
0.007083892822265625,
-0.053466796875,
0.048095703125,
-0.0281982421875,
-0.04620361328125,
-0.0234527587890625,
-0.0261993408203125,
-0.061981201171875,
-0.0004775524139404297,
-0.008880615234375,
-0.057708740234375,
0.031951904296875,
0.0040130615234375,
-0.0258636474609375,
0.036956787109375,
-0.042236328125,
0.074951171875,
-0.02838134765625,
-0.0259246826171875,
0.00708770751953125,
-0.057952880859375,
0.04193115234375,
0.01229095458984375,
0.004791259765625,
-0.01528167724609375,
0.00981903076171875,
0.077880859375,
-0.04693603515625,
0.07366943359375,
-0.01320648193359375,
0.0174102783203125,
0.050994873046875,
-0.020050048828125,
0.01262664794921875,
0.012420654296875,
0.007579803466796875,
0.055572509765625,
0.004199981689453125,
-0.01079559326171875,
-0.0292205810546875,
0.030914306640625,
-0.07305908203125,
-0.017822265625,
-0.033447265625,
-0.037994384765625,
0.0158843994140625,
0.0308380126953125,
0.048583984375,
0.041900634765625,
-0.0098876953125,
0.0273590087890625,
0.045440673828125,
-0.0260162353515625,
0.04107666015625,
0.0165863037109375,
-0.01461029052734375,
-0.0556640625,
0.07537841796875,
0.022216796875,
0.023223876953125,
0.00867462158203125,
0.005001068115234375,
-0.00434112548828125,
-0.029815673828125,
-0.03936767578125,
0.02593994140625,
-0.055816650390625,
-0.03216552734375,
-0.03692626953125,
-0.03338623046875,
-0.03076171875,
-0.00334930419921875,
-0.03857421875,
-0.01499176025390625,
-0.0462646484375,
-0.00431060791015625,
0.0279998779296875,
0.0440673828125,
-0.0062103271484375,
0.025848388671875,
-0.0638427734375,
0.0241546630859375,
0.0187225341796875,
0.033447265625,
0.00019431114196777344,
-0.048736572265625,
-0.0224761962890625,
0.01293182373046875,
-0.04144287109375,
-0.048614501953125,
0.0291900634765625,
0.020843505859375,
0.0360107421875,
0.049468994140625,
0.01207733154296875,
0.04248046875,
-0.031707763671875,
0.07635498046875,
0.0262603759765625,
-0.062744140625,
0.0408935546875,
-0.045501708984375,
0.01519012451171875,
0.046844482421875,
0.056396484375,
-0.01361846923828125,
0.004055023193359375,
-0.05438232421875,
-0.06890869140625,
0.0721435546875,
0.01236724853515625,
0.0018444061279296875,
0.01288604736328125,
0.02813720703125,
0.0007071495056152344,
0.0152740478515625,
-0.0733642578125,
-0.00279998779296875,
-0.03558349609375,
-0.00634002685546875,
0.016326904296875,
-0.0224151611328125,
-0.01336669921875,
-0.033111572265625,
0.058319091796875,
-0.0207672119140625,
0.04925537109375,
0.0187225341796875,
-0.010772705078125,
-0.005523681640625,
-0.0039825439453125,
0.036590576171875,
0.04345703125,
-0.035675048828125,
-0.017181396484375,
0.00788116455078125,
-0.04779052734375,
-0.006168365478515625,
0.010009765625,
-0.047698974609375,
-0.0116424560546875,
0.034637451171875,
0.09906005859375,
0.00960540771484375,
-0.052337646484375,
0.0701904296875,
-0.002956390380859375,
-0.0290679931640625,
-0.02459716796875,
0.0057830810546875,
-0.0208282470703125,
0.01395416259765625,
0.004749298095703125,
0.01158905029296875,
0.006771087646484375,
-0.042144775390625,
0.01508331298828125,
0.032806396484375,
-0.041839599609375,
-0.035186767578125,
0.06463623046875,
-0.0013217926025390625,
-0.00823211669921875,
0.048919677734375,
-0.006313323974609375,
-0.038330078125,
0.052978515625,
0.038177490234375,
0.07159423828125,
-0.0031280517578125,
0.02557373046875,
0.05047607421875,
0.02008056640625,
-0.0184783935546875,
0.0090484619140625,
0.01262664794921875,
-0.0433349609375,
-0.0064849853515625,
-0.032135009765625,
-0.0268096923828125,
0.0212554931640625,
-0.07110595703125,
0.043548583984375,
-0.052825927734375,
-0.0304412841796875,
-0.0154571533203125,
-0.036590576171875,
-0.0372314453125,
0.0164947509765625,
0.01332855224609375,
0.068359375,
-0.0626220703125,
0.052520751953125,
0.0545654296875,
-0.0618896484375,
-0.06475830078125,
0.01009368896484375,
-0.0129547119140625,
-0.034393310546875,
0.0295562744140625,
0.0400390625,
-0.0013179779052734375,
-0.026458740234375,
-0.07061767578125,
-0.07733154296875,
0.10723876953125,
0.0400390625,
-0.018890380859375,
-0.011322021484375,
0.004199981689453125,
0.032318115234375,
-0.01837158203125,
0.0305328369140625,
0.0180816650390625,
0.01285552978515625,
0.0087738037109375,
-0.0775146484375,
-0.0029773712158203125,
-0.024200439453125,
0.01441192626953125,
-0.001132965087890625,
-0.08416748046875,
0.0789794921875,
-0.019256591796875,
-0.0203094482421875,
0.006900787353515625,
0.055816650390625,
0.003173828125,
0.02783203125,
0.028350830078125,
0.050567626953125,
0.042083740234375,
-0.00037384033203125,
0.077880859375,
-0.01483154296875,
0.02874755859375,
0.08465576171875,
-0.01103973388671875,
0.0750732421875,
0.01806640625,
-0.0161590576171875,
0.03289794921875,
0.032379150390625,
-0.0281982421875,
0.04840087890625,
-0.0275115966796875,
0.01117706298828125,
-0.00696563720703125,
-0.032623291015625,
-0.030364990234375,
0.041656494140625,
0.0017957687377929688,
-0.030487060546875,
0.0008893013000488281,
0.0289306640625,
0.0034847259521484375,
-0.01493072509765625,
-0.0101470947265625,
0.041656494140625,
0.016265869140625,
-0.034637451171875,
0.06256103515625,
0.0048675537109375,
0.05291748046875,
-0.051727294921875,
-0.0033779144287109375,
-0.002777099609375,
0.0211639404296875,
-0.01178741455078125,
-0.05401611328125,
0.0208892822265625,
0.0005888938903808594,
-0.0158843994140625,
-0.00913238525390625,
0.056549072265625,
-0.01358795166015625,
-0.035369873046875,
0.0347900390625,
0.0014247894287109375,
0.01253509521484375,
0.0016908645629882812,
-0.04766845703125,
0.01166534423828125,
-0.0004210472106933594,
-0.00872039794921875,
0.0287628173828125,
0.014373779296875,
-0.0198974609375,
0.050933837890625,
0.03802490234375,
-0.0148162841796875,
0.01125335693359375,
-0.007320404052734375,
0.07244873046875,
-0.032684326171875,
-0.035003662109375,
-0.043304443359375,
0.047637939453125,
-0.01009368896484375,
-0.03369140625,
0.057861328125,
0.037750244140625,
0.08050537109375,
-0.0146331787109375,
0.05712890625,
-0.0180816650390625,
0.0225677490234375,
-0.04925537109375,
0.0574951171875,
-0.04931640625,
0.005352020263671875,
-0.038482666015625,
-0.055816650390625,
-0.0143890380859375,
0.045013427734375,
-0.018585205078125,
0.0086669921875,
0.05462646484375,
0.054962158203125,
-0.021270751953125,
-0.00525665283203125,
0.0162506103515625,
0.0146331787109375,
0.0198516845703125,
0.032257080078125,
0.041412353515625,
-0.060821533203125,
0.04595947265625,
-0.05718994140625,
-0.02288818359375,
-0.0104217529296875,
-0.063720703125,
-0.08782958984375,
-0.04486083984375,
-0.0292205810546875,
-0.0168609619140625,
0.002105712890625,
0.054718017578125,
0.07269287109375,
-0.056243896484375,
-0.0253143310546875,
0.01024627685546875,
-0.01366424560546875,
-0.01837158203125,
-0.0163116455078125,
0.037933349609375,
0.0169677734375,
-0.0440673828125,
0.0129241943359375,
0.0120391845703125,
0.0225830078125,
-0.004917144775390625,
-0.004116058349609375,
-0.0289154052734375,
-0.00567626953125,
0.035675048828125,
0.0311126708984375,
-0.043426513671875,
-0.017333984375,
0.013427734375,
0.0081634521484375,
0.0240325927734375,
0.0401611328125,
-0.04254150390625,
0.032196044921875,
0.03411865234375,
0.0372314453125,
0.049896240234375,
0.016204833984375,
0.01873779296875,
-0.046051025390625,
0.0300140380859375,
0.0024509429931640625,
0.0247344970703125,
0.026153564453125,
-0.0290679931640625,
0.049591064453125,
0.031585693359375,
-0.033416748046875,
-0.06964111328125,
-0.005950927734375,
-0.08599853515625,
-0.00644683837890625,
0.0838623046875,
-0.035064697265625,
-0.038482666015625,
0.0304412841796875,
-0.0170440673828125,
0.0311126708984375,
-0.0271759033203125,
0.032379150390625,
0.032012939453125,
-0.0017042160034179688,
-0.032501220703125,
-0.058746337890625,
0.02587890625,
0.0124359130859375,
-0.0712890625,
-0.0115509033203125,
0.0298309326171875,
0.0284576416015625,
0.01812744140625,
0.038055419921875,
-0.024139404296875,
0.025390625,
-0.0034160614013671875,
0.0248260498046875,
-0.0261688232421875,
-0.053924560546875,
-0.035369873046875,
0.0027942657470703125,
-0.01192474365234375,
-0.03173828125
]
] |
dslim/bert-base-NER | 2023-05-09T16:37:55.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"token-classification",
"en",
"dataset:conll2003",
"arxiv:1810.04805",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | dslim | null | null | dslim/bert-base-NER | 278 | 1,323,754 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- conll2003
license: mit
---
# bert-base-NER
## Model description
**bert-base-NER** is a fine-tuned BERT model that is ready to use for **Named Entity Recognition** and achieves **state-of-the-art performance** for the NER task. It has been trained to recognize four types of entities: location (LOC), organization (ORG), person (PER), and miscellaneous (MISC).
Specifically, this model is a *bert-base-cased* model that was fine-tuned on the English version of the standard [CoNLL-2003 Named Entity Recognition](https://www.aclweb.org/anthology/W03-0419.pdf) dataset.
If you'd like to use a larger BERT-large model fine-tuned on the same dataset, a [**bert-large-NER**](https://huggingface.co/dslim/bert-large-NER/) version is also available.
## Intended uses & limitations
#### How to use
You can use this model with the Transformers *pipeline* for NER.
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline
tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")
model = AutoModelForTokenClassification.from_pretrained("dslim/bert-base-NER")
nlp = pipeline("ner", model=model, tokenizer=tokenizer)
example = "My name is Wolfgang and I live in Berlin"
ner_results = nlp(example)
print(ner_results)
```
#### Limitations and bias
This model is limited by its training dataset of entity-annotated news articles from a specific span of time, so it may not generalize well to all use cases in different domains. Furthermore, the model occasionally tags subword tokens as entities, and post-processing of results may be necessary to handle those cases.
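The Transformers pipeline offers an `aggregation_strategy` argument (e.g. `pipeline("ner", ..., aggregation_strategy="simple")`) that groups subword predictions for you. If you are post-processing raw output yourself, a minimal sketch of merging WordPiece subwords (the `##`-prefixed tokens) back into whole-word entities might look like this; the `raw` sample below only mimics the pipeline's output format and is illustrative, not actual model output:

```python
def merge_subwords(ner_results):
    """Merge '##'-prefixed subword tokens into the preceding entity token."""
    merged = []
    for item in ner_results:
        word = item["word"]
        if word.startswith("##") and merged:
            prev = merged[-1]
            prev["word"] += word[2:]   # strip the '##' continuation marker
            prev["end"] = item["end"]  # extend the character span
        else:
            merged.append(dict(item))
    return merged

# Hypothetical raw pipeline output for "My name is Wolfgang ... Berlin"
raw = [
    {"word": "Wolf",   "entity": "B-PER", "start": 11, "end": 15},
    {"word": "##gang", "entity": "I-PER", "start": 15, "end": 19},
    {"word": "Berlin", "entity": "B-LOC", "start": 34, "end": 40},
]
print(merge_subwords(raw))
# [{'word': 'Wolfgang', 'entity': 'B-PER', 'start': 11, 'end': 19},
#  {'word': 'Berlin', 'entity': 'B-LOC', 'start': 34, 'end': 40}]
```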
## Training data
This model was fine-tuned on the English version of the standard [CoNLL-2003 Named Entity Recognition](https://www.aclweb.org/anthology/W03-0419.pdf) dataset.
The training dataset distinguishes between the beginning and continuation of an entity so that if there are back-to-back entities of the same type, the model can output where the second entity begins. As in the dataset, each token will be classified as one of the following classes:
Abbreviation|Description
-|-
O|Outside of a named entity
B-MISC|Beginning of a miscellaneous entity right after another miscellaneous entity
I-MISC|Miscellaneous entity
B-PER|Beginning of a person's name right after another person's name
I-PER|Person's name
B-ORG|Beginning of an organization right after another organization
I-ORG|Organization
B-LOC|Beginning of a location right after another location
I-LOC|Location
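The BIO scheme in the table above can be decoded into entity spans with a short helper. This is an illustrative sketch (not part of the model card), showing how "B-" opens a new span while "I-" continues one of the same type:

```python
def bio_to_spans(tokens, tags):
    """Group (token, tag) pairs into (entity_type, text) spans."""
    spans, current_type, current_tokens = [], None, []
    for token, tag in zip(tokens, tags):
        if tag == "O":
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
                current_type, current_tokens = None, []
            continue
        prefix, ent_type = tag.split("-", 1)
        # "B-" always starts a new span; "I-" continues a span of the same type
        if prefix == "B" or ent_type != current_type:
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = ent_type, [token]
        else:
            current_tokens.append(token)
    if current_tokens:
        spans.append((current_type, " ".join(current_tokens)))
    return spans

tokens = ["Wolfgang", "lives", "in", "New", "York"]
tags = ["B-PER", "O", "O", "B-LOC", "I-LOC"]
print(bio_to_spans(tokens, tags))  # [('PER', 'Wolfgang'), ('LOC', 'New York')]
```

This is why back-to-back entities of the same type stay separable: the second entity starts with a fresh "B-" tag.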
### CoNLL-2003 English Dataset Statistics
This dataset was derived from the Reuters corpus, a collection of Reuters news stories. You can read more about how this dataset was created in the CoNLL-2003 paper.
#### # of training examples per entity type
Dataset|LOC|MISC|ORG|PER
-|-|-|-|-
Train|7140|3438|6321|6600
Dev|1837|922|1341|1842
Test|1668|702|1661|1617
#### # of articles/sentences/tokens per dataset
Dataset |Articles |Sentences |Tokens
-|-|-|-
Train |946 |14,987 |203,621
Dev |216 |3,466 |51,362
Test |231 |3,684 |46,435
## Training procedure
This model was trained on a single NVIDIA V100 GPU with the recommended hyperparameters from the [original BERT paper](https://arxiv.org/pdf/1810.04805), which trained and evaluated the model on the CoNLL-2003 NER task.
## Eval results
metric|dev|test
-|-|-
f1 |95.1 |91.3
precision |95.0 |90.7
recall |95.3 |91.9
The test metrics are slightly lower than the official Google BERT results, which encoded document context and experimented with a CRF. More on replicating the original results [here](https://github.com/google-research/bert/issues/223).
### BibTeX entry and citation info
```
@article{DBLP:journals/corr/abs-1810-04805,
author = {Jacob Devlin and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
Understanding},
journal = {CoRR},
volume = {abs/1810.04805},
year = {2018},
url = {http://arxiv.org/abs/1810.04805},
archivePrefix = {arXiv},
eprint = {1810.04805},
timestamp = {Tue, 30 Oct 2018 20:39:56 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
```
@inproceedings{tjong-kim-sang-de-meulder-2003-introduction,
title = "Introduction to the {C}o{NLL}-2003 Shared Task: Language-Independent Named Entity Recognition",
author = "Tjong Kim Sang, Erik F. and
De Meulder, Fien",
booktitle = "Proceedings of the Seventh Conference on Natural Language Learning at {HLT}-{NAACL} 2003",
year = "2003",
url = "https://www.aclweb.org/anthology/W03-0419",
pages = "142--147",
}
```
| 4,804 | [
[
-0.036407470703125,
-0.050384521484375,
0.01511383056640625,
0.01139068603515625,
-0.027099609375,
-0.008209228515625,
-0.033782958984375,
-0.043060302734375,
0.0230865478515625,
0.0251007080078125,
-0.033477783203125,
-0.03985595703125,
-0.05511474609375,
0.018035888671875,
-0.034149169921875,
0.09844970703125,
-0.003444671630859375,
0.022369384765625,
-0.003154754638671875,
-0.0145111083984375,
-0.01346588134765625,
-0.05877685546875,
-0.0645751953125,
-0.01373291015625,
0.043365478515625,
0.0099639892578125,
0.0291595458984375,
0.02557373046875,
0.039581298828125,
0.019989013671875,
-0.00717926025390625,
0.0088348388671875,
-0.0311279296875,
-0.016326904296875,
-0.0006694793701171875,
-0.020172119140625,
-0.0286102294921875,
0.006267547607421875,
0.05657958984375,
0.0576171875,
0.0006303787231445312,
0.00943756103515625,
0.011932373046875,
0.0450439453125,
-0.02178955078125,
0.0183258056640625,
-0.05389404296875,
-0.0091094970703125,
-0.018035888671875,
0.00464630126953125,
-0.032440185546875,
-0.0212860107421875,
0.03271484375,
-0.033599853515625,
0.04132080078125,
-0.00020706653594970703,
0.10784912109375,
0.0010328292846679688,
-0.031402587890625,
-0.0202789306640625,
-0.048095703125,
0.0635986328125,
-0.0638427734375,
0.04949951171875,
0.00980377197265625,
0.0001996755599975586,
-0.0092010498046875,
-0.05999755859375,
-0.05841064453125,
-0.0170745849609375,
-0.0233001708984375,
0.003353118896484375,
-0.005645751953125,
0.00652313232421875,
0.0246734619140625,
0.0206146240234375,
-0.039825439453125,
0.0035247802734375,
-0.036224365234375,
-0.0160675048828125,
0.043365478515625,
-0.01300811767578125,
0.00408172607421875,
-0.024200439453125,
-0.039093017578125,
-0.0177764892578125,
-0.047210693359375,
0.01355743408203125,
0.0286407470703125,
0.03497314453125,
-0.0175933837890625,
0.038177490234375,
-0.0021076202392578125,
0.04486083984375,
0.0269317626953125,
-0.00763702392578125,
0.048583984375,
-0.0364990234375,
-0.01812744140625,
0.00786590576171875,
0.061004638671875,
0.01053619384765625,
0.0231475830078125,
-0.017486572265625,
-0.023956298828125,
-0.0207061767578125,
0.0112152099609375,
-0.05352783203125,
-0.025146484375,
-0.001667022705078125,
-0.040771484375,
-0.003353118896484375,
0.01070404052734375,
-0.043731689453125,
-0.002552032470703125,
-0.034698486328125,
0.029296875,
-0.043365478515625,
-0.01097869873046875,
-0.006591796875,
-0.0020275115966796875,
0.037689208984375,
0.0141448974609375,
-0.06884765625,
0.0190887451171875,
0.037750244140625,
0.044647216796875,
-0.0157928466796875,
-0.0321044921875,
-0.02276611328125,
-0.00604248046875,
-0.00881195068359375,
0.044708251953125,
-0.0214996337890625,
-0.0191650390625,
-0.0022945404052734375,
0.0155181884765625,
-0.0149383544921875,
-0.0251312255859375,
0.050384521484375,
-0.051025390625,
0.033966064453125,
-0.02728271484375,
-0.0489501953125,
-0.0242156982421875,
0.0167999267578125,
-0.041900634765625,
0.08477783203125,
0.00695037841796875,
-0.0654296875,
0.050140380859375,
-0.038330078125,
-0.03912353515625,
-0.01047515869140625,
-0.01247406005859375,
-0.03570556640625,
-0.0053558349609375,
0.020599365234375,
0.029449462890625,
-0.005786895751953125,
0.03924560546875,
-0.0129241943359375,
-0.0147552490234375,
-0.00350189208984375,
-0.0290985107421875,
0.07891845703125,
-0.0008873939514160156,
-0.026092529296875,
-0.007358551025390625,
-0.07513427734375,
-0.01435089111328125,
0.013580322265625,
-0.04669189453125,
-0.03228759765625,
0.006130218505859375,
0.0052947998046875,
0.007297515869140625,
0.0345458984375,
-0.04541015625,
0.007808685302734375,
-0.04058837890625,
0.01132965087890625,
0.04766845703125,
0.0038127899169921875,
0.0372314453125,
-0.01338958740234375,
0.003940582275390625,
0.0123443603515625,
0.0010280609130859375,
0.00811004638671875,
-0.035369873046875,
-0.09320068359375,
-0.0233917236328125,
0.05401611328125,
0.04193115234375,
-0.04754638671875,
0.0556640625,
-0.036529541015625,
-0.0421142578125,
-0.04266357421875,
0.0013284683227539062,
0.02740478515625,
0.058502197265625,
0.051361083984375,
-0.0224609375,
-0.07391357421875,
-0.07769775390625,
-0.0105438232421875,
-0.0136566162109375,
0.006061553955078125,
0.026947021484375,
0.04608154296875,
-0.017974853515625,
0.07574462890625,
-0.0152587890625,
-0.0164947509765625,
-0.02288818359375,
0.02099609375,
0.036712646484375,
0.04931640625,
0.044189453125,
-0.06427001953125,
-0.040679931640625,
-0.025726318359375,
-0.053558349609375,
0.00847625732421875,
-0.021087646484375,
-0.016143798828125,
0.0428466796875,
0.0286102294921875,
-0.053131103515625,
0.0230865478515625,
0.0253753662109375,
-0.0173187255859375,
0.0404052734375,
-0.01517486572265625,
-0.01296234130859375,
-0.08331298828125,
0.00824737548828125,
-0.0007042884826660156,
-0.0006799697875976562,
-0.04351806640625,
-0.0173797607421875,
0.0020275115966796875,
0.0092010498046875,
-0.024169921875,
0.04193115234375,
-0.05010986328125,
-0.006649017333984375,
0.0122833251953125,
0.00396728515625,
-0.0016002655029296875,
0.050384521484375,
0.02020263671875,
0.049041748046875,
0.030975341796875,
-0.05975341796875,
0.0137481689453125,
0.03887939453125,
-0.04034423828125,
0.033477783203125,
-0.05572509765625,
0.005039215087890625,
-0.01467132568359375,
0.0171356201171875,
-0.059783935546875,
0.007965087890625,
0.01142120361328125,
-0.04595947265625,
0.0435791015625,
-0.007480621337890625,
-0.044464111328125,
-0.029327392578125,
-0.006481170654296875,
0.0077667236328125,
0.033660888671875,
-0.0439453125,
0.0443115234375,
0.0166778564453125,
-0.004230499267578125,
-0.0606689453125,
-0.062286376953125,
0.00403594970703125,
-0.0004968643188476562,
-0.038299560546875,
0.037567138671875,
-0.006061553955078125,
0.0038356781005859375,
0.01248931884765625,
-0.0012979507446289062,
-0.0117034912109375,
-0.0015125274658203125,
0.007587432861328125,
0.04010009765625,
-0.025146484375,
0.020263671875,
-0.000972747802734375,
-0.0015039443969726562,
-0.0017375946044921875,
-0.014129638671875,
0.0413818359375,
-0.00994873046875,
-0.01236724853515625,
-0.023040771484375,
0.0214080810546875,
0.0283660888671875,
-0.01763916015625,
0.0679931640625,
0.05657958984375,
-0.038726806640625,
0.01593017578125,
-0.047332763671875,
-0.0139007568359375,
-0.033111572265625,
0.0296173095703125,
-0.0240325927734375,
-0.050628662109375,
0.03271484375,
0.0311431884765625,
0.022979736328125,
0.057281494140625,
0.038604736328125,
-0.01373291015625,
0.049896240234375,
0.042816162109375,
-0.0172882080078125,
0.03851318359375,
-0.03265380859375,
0.03125,
-0.06695556640625,
-0.0238494873046875,
-0.0396728515625,
-0.034088134765625,
-0.054595947265625,
-0.007160186767578125,
0.0030727386474609375,
0.00812530517578125,
-0.02947998046875,
0.0516357421875,
-0.0328369140625,
0.00798797607421875,
0.05999755859375,
-0.000002562999725341797,
0.0079345703125,
0.005138397216796875,
-0.0226287841796875,
-0.011260986328125,
-0.03680419921875,
-0.03924560546875,
0.0770263671875,
0.021759033203125,
0.042755126953125,
0.0010633468627929688,
0.07745361328125,
0.01214599609375,
0.019012451171875,
-0.05084228515625,
0.049652099609375,
-0.016632080078125,
-0.06500244140625,
-0.0212554931640625,
-0.02752685546875,
-0.087158203125,
0.0027790069580078125,
-0.0269012451171875,
-0.047149658203125,
0.04156494140625,
-0.00554656982421875,
-0.026275634765625,
0.0267791748046875,
-0.061370849609375,
0.0560302734375,
-0.0271148681640625,
0.00815582275390625,
0.0017557144165039062,
-0.057159423828125,
0.00470733642578125,
0.0026340484619140625,
-0.002269744873046875,
0.00044035911560058594,
0.0059967041015625,
0.06524658203125,
-0.019866943359375,
0.05889892578125,
-0.028717041015625,
0.00376129150390625,
0.0171661376953125,
-0.0184783935546875,
0.04754638671875,
-0.0027561187744140625,
-0.0006508827209472656,
0.036956787109375,
-0.0145721435546875,
-0.0216827392578125,
-0.0214080810546875,
0.052093505859375,
-0.07257080078125,
-0.033966064453125,
-0.0352783203125,
-0.034332275390625,
-0.009735107421875,
0.0347900390625,
0.04302978515625,
0.035064697265625,
-0.013641357421875,
0.02978515625,
0.054779052734375,
-0.020782470703125,
0.046173095703125,
0.044189453125,
0.010284423828125,
-0.0290985107421875,
0.048675537109375,
0.031036376953125,
-0.00022852420806884766,
0.045745849609375,
-0.02197265625,
-0.0251312255859375,
-0.0472412109375,
-0.0208892822265625,
0.029388427734375,
-0.045318603515625,
-0.0179901123046875,
-0.07208251953125,
-0.04217529296875,
-0.038116455078125,
-0.002834320068359375,
-0.025970458984375,
-0.0293731689453125,
-0.0543212890625,
-0.0125732421875,
0.016326904296875,
0.0302276611328125,
-0.0011138916015625,
0.0115203857421875,
-0.053863525390625,
0.0193023681640625,
0.0267181396484375,
0.0250396728515625,
-0.0002415180206298828,
-0.05133056640625,
-0.0234832763671875,
0.01219940185546875,
-0.0121612548828125,
-0.054595947265625,
0.033477783203125,
0.026947021484375,
0.056732177734375,
0.0242156982421875,
0.003814697265625,
0.048980712890625,
-0.049285888671875,
0.06890869140625,
0.010833740234375,
-0.056304931640625,
0.034912109375,
-0.00974273681640625,
-0.005565643310546875,
0.05029296875,
0.036163330078125,
-0.0285186767578125,
-0.01358795166015625,
-0.07025146484375,
-0.0753173828125,
0.05206298828125,
0.0147247314453125,
0.0187530517578125,
-0.0172271728515625,
0.02362060546875,
0.0082550048828125,
0.0191650390625,
-0.0782470703125,
-0.041015625,
-0.009490966796875,
-0.0193023681640625,
-0.007568359375,
-0.0310821533203125,
0.0018444061279296875,
-0.0281219482421875,
0.07769775390625,
0.0189971923828125,
0.056854248046875,
0.037017822265625,
-0.029266357421875,
0.01059722900390625,
0.0154876708984375,
0.038330078125,
0.038421630859375,
-0.029296875,
0.00554656982421875,
0.025146484375,
-0.041748046875,
-0.00830841064453125,
0.04229736328125,
-0.01910400390625,
0.0292510986328125,
0.027496337890625,
0.0684814453125,
0.01483154296875,
-0.02685546875,
0.042022705078125,
0.001506805419921875,
-0.0266876220703125,
-0.044830322265625,
-0.0052642822265625,
-0.006961822509765625,
0.025421142578125,
0.042724609375,
0.007030487060546875,
0.00937652587890625,
-0.026458740234375,
0.018218994140625,
0.03167724609375,
-0.0233306884765625,
-0.0238800048828125,
0.040863037109375,
0.01139068603515625,
-0.0174560546875,
0.0625,
-0.0307769775390625,
-0.038330078125,
0.04742431640625,
0.041473388671875,
0.072265625,
0.007022857666015625,
-0.001056671142578125,
0.05291748046875,
0.036224365234375,
0.0028400421142578125,
0.0198974609375,
0.00415802001953125,
-0.07330322265625,
-0.022186279296875,
-0.05126953125,
-0.00827789306640625,
0.0243988037109375,
-0.053436279296875,
0.038848876953125,
-0.029449462890625,
-0.0164642333984375,
0.016693115234375,
0.01216888427734375,
-0.06634521484375,
0.019622802734375,
0.029510498046875,
0.0797119140625,
-0.04254150390625,
0.07110595703125,
0.057525634765625,
-0.04736328125,
-0.0526123046875,
-0.0003516674041748047,
-0.0266571044921875,
-0.05828857421875,
0.06121826171875,
0.01203155517578125,
0.020538330078125,
0.006313323974609375,
-0.044097900390625,
-0.0838623046875,
0.08319091796875,
0.014404296875,
-0.045166015625,
-0.0298919677734375,
-0.00489044189453125,
0.044830322265625,
-0.02667236328125,
0.0126495361328125,
0.032623291015625,
0.031646728515625,
-0.010711669921875,
-0.06982421875,
-0.0017881393432617188,
-0.0185699462890625,
0.00673675537109375,
0.01861572265625,
-0.050811767578125,
0.07330322265625,
-0.0206146240234375,
-0.0170440673828125,
0.0016031265258789062,
0.059417724609375,
0.0157928466796875,
0.020111083984375,
0.045684814453125,
0.05712890625,
0.05804443359375,
-0.0208892822265625,
0.06561279296875,
-0.0277252197265625,
0.0455322265625,
0.09649658203125,
0.0009565353393554688,
0.06451416015625,
0.036773681640625,
-0.0235595703125,
0.0638427734375,
0.05572509765625,
-0.03009033203125,
0.061004638671875,
-0.0030765533447265625,
0.0007600784301757812,
0.0009713172912597656,
0.0093231201171875,
-0.037628173828125,
0.0298309326171875,
0.01776123046875,
-0.049591064453125,
-0.01373291015625,
-0.00785064697265625,
0.0122833251953125,
-0.0262908935546875,
-0.01255035400390625,
0.051727294921875,
0.0024585723876953125,
-0.04388427734375,
0.054443359375,
-0.00012177228927612305,
0.057952880859375,
-0.046966552734375,
0.006771087646484375,
-0.01381683349609375,
0.00707244873046875,
-0.00782012939453125,
-0.04559326171875,
0.0220794677734375,
0.0017299652099609375,
-0.032989501953125,
-0.00921630859375,
0.05279541015625,
-0.037628173828125,
-0.035980224609375,
0.025726318359375,
0.029052734375,
0.0187835693359375,
0.01053619384765625,
-0.06121826171875,
-0.0287628173828125,
0.006549835205078125,
-0.035888671875,
0.00836181640625,
0.038482666015625,
0.0131072998046875,
0.0252227783203125,
0.053253173828125,
0.00023126602172851562,
0.01316070556640625,
0.004055023193359375,
0.056854248046875,
-0.051544189453125,
-0.0277557373046875,
-0.0565185546875,
0.034149169921875,
-0.0174102783203125,
-0.033477783203125,
0.052001953125,
0.055908203125,
0.084228515625,
-0.0075836181640625,
0.057403564453125,
-0.02960205078125,
0.05364990234375,
-0.0268096923828125,
0.0438232421875,
-0.035430908203125,
0.007457733154296875,
-0.016357421875,
-0.0694580078125,
-0.01483154296875,
0.054718017578125,
-0.030059814453125,
0.012969970703125,
0.049407958984375,
0.04302978515625,
-0.00119781494140625,
-0.01654052734375,
0.001739501953125,
0.029815673828125,
0.00916290283203125,
0.0350341796875,
0.03814697265625,
-0.046112060546875,
0.048492431640625,
-0.026458740234375,
0.004161834716796875,
-0.0249786376953125,
-0.059906005859375,
-0.0699462890625,
-0.051727294921875,
-0.021820068359375,
-0.03338623046875,
0.00007390975952148438,
0.0736083984375,
0.0489501953125,
-0.082763671875,
-0.0020904541015625,
-0.0161895751953125,
0.004047393798828125,
-0.015167236328125,
-0.0184478759765625,
0.041778564453125,
-0.035003662109375,
-0.06451416015625,
0.0075225830078125,
-0.00545501708984375,
0.0187225341796875,
-0.0102081298828125,
-0.00688934326171875,
-0.050506591796875,
0.00205230712890625,
0.039581298828125,
0.018402099609375,
-0.05047607421875,
-0.024017333984375,
0.0112457275390625,
-0.027496337890625,
0.014892578125,
0.03106689453125,
-0.0560302734375,
0.0289154052734375,
0.0302276611328125,
0.049835205078125,
0.03985595703125,
-0.0115814208984375,
0.0209808349609375,
-0.07098388671875,
0.01165771484375,
0.017791748046875,
0.046051025390625,
0.03460693359375,
-0.0357666015625,
0.040924072265625,
0.03289794921875,
-0.0396728515625,
-0.053619384765625,
-0.01236724853515625,
-0.08599853515625,
-0.0018701553344726562,
0.09326171875,
-0.01355743408203125,
0.0015249252319335938,
-0.0035877227783203125,
-0.01197052001953125,
0.0428466796875,
-0.042510986328125,
0.049163818359375,
0.0635986328125,
0.00342559814453125,
-0.009918212890625,
-0.034912109375,
0.03271484375,
0.0194854736328125,
-0.03839111328125,
-0.0255889892578125,
0.0228729248046875,
0.0308685302734375,
0.025482177734375,
0.0401611328125,
-0.002735137939453125,
-0.0005764961242675781,
-0.0141448974609375,
0.032440185546875,
0.0089263916015625,
-0.01386260986328125,
-0.0145416259765625,
-0.013702392578125,
-0.01258087158203125,
-0.029388427734375
]
] |
microsoft/resnet-50 | 2023-03-10T17:35:03.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"resnet",
"image-classification",
"vision",
"dataset:imagenet-1k",
"arxiv:1512.03385",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | microsoft | null | null | microsoft/resnet-50 | 168 | 1,259,553 | transformers | 2022-03-16T15:42:43 | ---
license: apache-2.0
tags:
- vision
- image-classification
datasets:
- imagenet-1k
---
# ResNet-50 v1.5
ResNet model pre-trained on ImageNet-1k at resolution 224x224. It was introduced in the paper [Deep Residual Learning for Image Recognition](https://arxiv.org/abs/1512.03385) by He et al.
Disclaimer: The team releasing ResNet did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
ResNet (Residual Network) is a convolutional neural network that popularized the concepts of residual learning and skip connections, which make it possible to train much deeper models.
This is ResNet v1.5, which differs from the original model: in the bottleneck blocks which require downsampling, v1 has stride = 2 in the first 1x1 convolution, whereas v1.5 has stride = 2 in the 3x3 convolution. This difference makes ResNet50 v1.5 slightly more accurate (~0.5% top-1) than v1, but comes with a small performance drawback (~5% imgs/sec) according to [Nvidia](https://catalog.ngc.nvidia.com/orgs/nvidia/resources/resnet_50_v1_5_for_pytorch).

## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=resnet) to look for
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes:
```python
from transformers import AutoImageProcessor, ResNetForImageClassification
import torch
from datasets import load_dataset
dataset = load_dataset("huggingface/cats-image")
image = dataset["test"]["image"][0]
processor = AutoImageProcessor.from_pretrained("microsoft/resnet-50")
model = ResNetForImageClassification.from_pretrained("microsoft/resnet-50")
inputs = processor(image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# model predicts one of the 1000 ImageNet classes
predicted_label = logits.argmax(-1).item()
print(model.config.id2label[predicted_label])
```
For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/resnet).
### BibTeX entry and citation info
```bibtex
@inproceedings{he2016deep,
title={Deep residual learning for image recognition},
author={He, Kaiming and Zhang, Xiangyu and Ren, Shaoqing and Sun, Jian},
booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
pages={770--778},
year={2016}
}
```
| 2,642 | [
[
-0.047393798828125,
-0.01062774658203125,
-0.0162811279296875,
-0.00691986083984375,
-0.021942138671875,
-0.01119232177734375,
-0.0061798095703125,
-0.052581787109375,
0.0235748291015625,
0.03277587890625,
-0.045623779296875,
-0.02362060546875,
-0.040985107421875,
0.01248931884765625,
-0.028350830078125,
0.05548095703125,
0.0022335052490234375,
0.00579071044921875,
-0.043365478515625,
-0.021270751953125,
-0.0186309814453125,
-0.03167724609375,
-0.07122802734375,
-0.0362548828125,
0.05029296875,
0.0263214111328125,
0.03192138671875,
0.0350341796875,
0.0572509765625,
0.0335693359375,
-0.00252532958984375,
-0.00042939186096191406,
-0.03387451171875,
-0.02093505859375,
0.0164642333984375,
-0.030303955078125,
-0.0193634033203125,
0.01629638671875,
0.032318115234375,
0.0167388916015625,
0.004848480224609375,
0.036163330078125,
0.001087188720703125,
0.0595703125,
-0.041107177734375,
0.00870513916015625,
-0.03082275390625,
0.014129638671875,
-0.00301361083984375,
0.002170562744140625,
-0.03546142578125,
-0.01366424560546875,
0.01270294189453125,
-0.035186767578125,
0.0435791015625,
-0.0028533935546875,
0.10638427734375,
0.0179443359375,
-0.0036334991455078125,
0.0194091796875,
-0.031402587890625,
0.0533447265625,
-0.050445556640625,
0.03271484375,
0.0249176025390625,
0.04107666015625,
0.0045318603515625,
-0.10443115234375,
-0.035369873046875,
-0.0211944580078125,
-0.009735107421875,
0.00489044189453125,
-0.03955078125,
0.005321502685546875,
0.024017333984375,
0.03582763671875,
-0.035980224609375,
0.0117034912109375,
-0.06640625,
-0.025146484375,
0.045196533203125,
-0.0007524490356445312,
0.0101776123046875,
-0.005664825439453125,
-0.049285888671875,
-0.016357421875,
-0.030242919921875,
0.0194244384765625,
0.009979248046875,
0.0113372802734375,
-0.0262298583984375,
0.023956298828125,
-0.02484130859375,
0.05426025390625,
0.00740814208984375,
-0.00556182861328125,
0.03314208984375,
-0.0032787322998046875,
-0.034027099609375,
0.0025081634521484375,
0.0694580078125,
0.02783203125,
0.0278472900390625,
0.0074462890625,
-0.0037326812744140625,
-0.0004036426544189453,
0.026824951171875,
-0.08001708984375,
-0.03472900390625,
0.01018524169921875,
-0.05072021484375,
-0.042388916015625,
0.00518035888671875,
-0.030059814453125,
-0.0178680419921875,
-0.0304718017578125,
0.01174163818359375,
-0.0270233154296875,
-0.02191162109375,
0.006870269775390625,
-0.0023345947265625,
0.0261383056640625,
0.0260772705078125,
-0.02276611328125,
0.0249176025390625,
0.032562255859375,
0.069580078125,
0.00330352783203125,
-0.02301025390625,
-0.00396728515625,
-0.0556640625,
-0.01654052734375,
0.046478271484375,
-0.01837158203125,
0.0082855224609375,
-0.02557373046875,
0.0382080078125,
0.0114898681640625,
-0.0406494140625,
0.032989501953125,
-0.058685302734375,
0.016937255859375,
-0.00893402099609375,
-0.0211181640625,
-0.048919677734375,
0.021392822265625,
-0.05999755859375,
0.07281494140625,
0.01551055908203125,
-0.078369140625,
0.0188140869140625,
-0.0117645263671875,
0.002132415771484375,
-0.00690460205078125,
0.01448822021484375,
-0.0556640625,
0.0027637481689453125,
-0.004852294921875,
0.040771484375,
-0.0269775390625,
0.0098724365234375,
-0.045806884765625,
-0.0300750732421875,
0.0204315185546875,
-0.0306396484375,
0.0716552734375,
0.027587890625,
-0.0134124755859375,
0.0063629150390625,
-0.0479736328125,
0.005645751953125,
0.0183868408203125,
-0.00360870361328125,
0.0034351348876953125,
-0.038787841796875,
0.006343841552734375,
0.035308837890625,
0.01105499267578125,
-0.051666259765625,
0.013580322265625,
-0.0117340087890625,
0.03216552734375,
0.0396728515625,
-0.01019287109375,
0.0240631103515625,
-0.03057861328125,
0.041015625,
-0.0005364418029785156,
0.0235595703125,
-0.011444091796875,
-0.03668212890625,
-0.06683349609375,
-0.0227203369140625,
0.03656005859375,
0.031707763671875,
-0.05157470703125,
0.0175323486328125,
-0.0160675048828125,
-0.06048583984375,
-0.0301666259765625,
-0.01459503173828125,
0.035797119140625,
0.05621337890625,
0.034027099609375,
-0.040008544921875,
-0.07025146484375,
-0.07635498046875,
0.01412200927734375,
0.003513336181640625,
0.01293182373046875,
0.0271759033203125,
0.04083251953125,
-0.0137939453125,
0.072265625,
-0.020721435546875,
-0.0172576904296875,
-0.00087738037109375,
0.0006775856018066406,
0.0293426513671875,
0.046600341796875,
0.03692626953125,
-0.0655517578125,
-0.0311737060546875,
-0.00536346435546875,
-0.0714111328125,
0.023773193359375,
0.004058837890625,
-0.0038814544677734375,
0.014434814453125,
0.04156494140625,
-0.0182647705078125,
0.053955078125,
0.036224365234375,
-0.006435394287109375,
0.051666259765625,
-0.0018701553344726562,
0.00719451904296875,
-0.08233642578125,
0.0162811279296875,
0.01959228515625,
-0.0255584716796875,
-0.03582763671875,
0.0007181167602539062,
0.0043487548828125,
-0.01641845703125,
-0.056549072265625,
0.039398193359375,
-0.037841796875,
-0.0084991455078125,
-0.02435302734375,
-0.0293426513671875,
0.00811767578125,
0.046539306640625,
0.0171356201171875,
0.0263824462890625,
0.055389404296875,
-0.04876708984375,
0.06085205078125,
0.0043792724609375,
-0.02215576171875,
0.019775390625,
-0.06494140625,
0.012420654296875,
-0.01158905029296875,
0.037261962890625,
-0.07464599609375,
-0.0113677978515625,
0.02825927734375,
-0.05755615234375,
0.045196533203125,
-0.0209808349609375,
-0.003269195556640625,
-0.0592041015625,
-0.00983428955078125,
0.0496826171875,
0.051300048828125,
-0.04779052734375,
0.0269622802734375,
0.0006766319274902344,
0.03717041015625,
-0.0665283203125,
-0.06494140625,
-0.0005159378051757812,
-0.0173187255859375,
-0.04425048828125,
0.0293426513671875,
0.01397705078125,
0.01285552978515625,
0.0189971923828125,
-0.007152557373046875,
-0.011993408203125,
0.00011587142944335938,
0.04144287109375,
0.0233154296875,
-0.0168609619140625,
0.00937652587890625,
-0.033843994140625,
-0.0212249755859375,
-0.0020236968994140625,
-0.0249786376953125,
0.03564453125,
-0.0316162109375,
-0.0079193115234375,
-0.0693359375,
-0.010345458984375,
0.045501708984375,
-0.022064208984375,
0.055511474609375,
0.07196044921875,
-0.047821044921875,
0.006038665771484375,
-0.04022216796875,
-0.029144287109375,
-0.03778076171875,
0.0242767333984375,
-0.0258026123046875,
-0.049530029296875,
0.048065185546875,
-0.00653839111328125,
-0.0074310302734375,
0.045440673828125,
0.007755279541015625,
-0.01045989990234375,
0.0297698974609375,
0.0535888671875,
0.00279998779296875,
0.04718017578125,
-0.062347412109375,
-0.01093292236328125,
-0.07452392578125,
-0.035980224609375,
-0.0285797119140625,
-0.053955078125,
-0.04901123046875,
-0.0247955322265625,
0.006298065185546875,
0.0146331787109375,
-0.04443359375,
0.05609130859375,
-0.061767578125,
0.01239776611328125,
0.0479736328125,
0.04876708984375,
-0.0117645263671875,
0.02886962890625,
0.0033855438232421875,
0.0025157928466796875,
-0.066162109375,
-0.019989013671875,
0.061279296875,
0.047637939453125,
0.052642822265625,
-0.022003173828125,
0.053863525390625,
0.00921630859375,
0.042877197265625,
-0.056243896484375,
0.040496826171875,
-0.02239990234375,
-0.04840087890625,
-0.01125335693359375,
-0.02947998046875,
-0.0791015625,
-0.003894805908203125,
-0.0202789306640625,
-0.042449951171875,
0.04864501953125,
0.0171356201171875,
-0.01537322998046875,
0.04132080078125,
-0.04296875,
0.06768798828125,
-0.00408172607421875,
-0.033050537109375,
0.005847930908203125,
-0.050537109375,
0.0300750732421875,
0.01605224609375,
-0.0186309814453125,
-0.006534576416015625,
0.012420654296875,
0.06158447265625,
-0.0322265625,
0.08367919921875,
-0.01306915283203125,
0.0270843505859375,
0.05377197265625,
-0.0010442733764648438,
0.023956298828125,
-0.00858306884765625,
-0.00917816162109375,
0.041107177734375,
-0.00405120849609375,
-0.03472900390625,
-0.032623291015625,
0.041900634765625,
-0.058135986328125,
-0.02215576171875,
-0.032989501953125,
-0.00689697265625,
0.012939453125,
0.01540374755859375,
0.060943603515625,
0.058563232421875,
0.002471923828125,
0.03472900390625,
0.039886474609375,
-0.031890869140625,
0.036376953125,
0.00046443939208984375,
-0.0079193115234375,
-0.036895751953125,
0.068115234375,
0.0145721435546875,
0.017730712890625,
0.01739501953125,
0.01380157470703125,
-0.020721435546875,
-0.007411956787109375,
-0.01461029052734375,
0.02557373046875,
-0.052001953125,
-0.049102783203125,
-0.03564453125,
-0.03515625,
-0.02392578125,
-0.01009368896484375,
-0.052337646484375,
-0.0157623291015625,
-0.04132080078125,
0.00238800048828125,
0.03863525390625,
0.037078857421875,
-0.00592041015625,
0.020355224609375,
-0.0472412109375,
0.003833770751953125,
0.0233154296875,
0.044219970703125,
0.0135650634765625,
-0.0655517578125,
-0.0142669677734375,
0.00818634033203125,
-0.0173187255859375,
-0.047027587890625,
0.0305633544921875,
0.0158233642578125,
0.0290374755859375,
0.028778076171875,
0.00537109375,
0.044891357421875,
-0.01222991943359375,
0.0440673828125,
0.048858642578125,
-0.03900146484375,
0.0212249755859375,
0.00920867919921875,
0.01438140869140625,
0.0240325927734375,
0.04583740234375,
-0.037933349609375,
0.01654052734375,
-0.0765380859375,
-0.040771484375,
0.054534912109375,
-0.00750732421875,
0.00821685791015625,
0.019500732421875,
0.04791259765625,
-0.005279541015625,
0.0080413818359375,
-0.057403564453125,
-0.03228759765625,
-0.031829833984375,
0.002079010009765625,
-0.00905609130859375,
-0.025787353515625,
0.004276275634765625,
-0.044952392578125,
0.0452880859375,
-0.002849578857421875,
0.052764892578125,
0.0306396484375,
0.017120361328125,
0.00048279762268066406,
-0.03350830078125,
0.041046142578125,
0.02484130859375,
-0.0212554931640625,
0.01021575927734375,
0.01546478271484375,
-0.045013427734375,
0.009979248046875,
0.0018978118896484375,
0.0022449493408203125,
0.002201080322265625,
0.049407958984375,
0.07513427734375,
-0.00762939453125,
-0.003589630126953125,
0.028106689453125,
-0.0240936279296875,
-0.033355712890625,
-0.037933349609375,
-0.0043182373046875,
-0.0036907196044921875,
0.0217437744140625,
0.01110076904296875,
0.038299560546875,
0.001850128173828125,
-0.018707275390625,
0.04022216796875,
0.014862060546875,
-0.058807373046875,
-0.019317626953125,
0.03912353515625,
-0.0006337165832519531,
-0.0167083740234375,
0.07855224609375,
-0.0176544189453125,
-0.04229736328125,
0.090576171875,
0.0294342041015625,
0.08251953125,
-0.00555419921875,
0.0172119140625,
0.07794189453125,
0.0181884765625,
-0.00982666015625,
0.003116607666015625,
0.007061004638671875,
-0.0626220703125,
-0.02484130859375,
-0.038360595703125,
-0.0013113021850585938,
0.0170440673828125,
-0.052947998046875,
0.03155517578125,
-0.03326416015625,
-0.0309295654296875,
-0.007007598876953125,
0.007305145263671875,
-0.0731201171875,
0.03887939453125,
0.0200347900390625,
0.089599609375,
-0.054779052734375,
0.05364990234375,
0.054046630859375,
-0.030853271484375,
-0.07464599609375,
-0.0300750732421875,
-0.0208587646484375,
-0.055694580078125,
0.05487060546875,
0.0298309326171875,
0.007144927978515625,
0.004467010498046875,
-0.07177734375,
-0.0693359375,
0.10333251953125,
0.0270843505859375,
-0.0309295654296875,
0.0274505615234375,
-0.0265960693359375,
0.033782958984375,
-0.034088134765625,
0.0301361083984375,
0.00372314453125,
0.0172271728515625,
0.03619384765625,
-0.049896240234375,
0.0146026611328125,
-0.0258331298828125,
0.0012073516845703125,
-0.00463104248046875,
-0.06463623046875,
0.06756591796875,
-0.030670166015625,
-0.0212249755859375,
0.00670623779296875,
0.06463623046875,
0.00566864013671875,
0.034515380859375,
0.034881591796875,
0.058685302734375,
0.044464111328125,
-0.0214385986328125,
0.08453369140625,
-0.00762939453125,
0.043426513671875,
0.070556640625,
0.026153564453125,
0.045867919921875,
0.01107025146484375,
-0.018341064453125,
0.036773681640625,
0.0894775390625,
-0.02191162109375,
0.026641845703125,
0.0279998779296875,
-0.0101165771484375,
-0.023040771484375,
-0.0126800537109375,
-0.048370361328125,
0.0382080078125,
0.004703521728515625,
-0.041259765625,
-0.00799560546875,
0.01727294921875,
-0.007495880126953125,
-0.023773193359375,
-0.0218505859375,
0.034210205078125,
0.008819580078125,
-0.0322265625,
0.07464599609375,
-0.0032329559326171875,
0.051910400390625,
-0.0247955322265625,
-0.0118865966796875,
-0.028839111328125,
0.00952911376953125,
-0.039825439453125,
-0.047393798828125,
0.0223846435546875,
-0.0261383056640625,
-0.005279541015625,
0.005558013916015625,
0.07135009765625,
-0.0138702392578125,
-0.041534423828125,
0.00809478759765625,
-0.0024242401123046875,
0.03369140625,
-0.0040435791015625,
-0.0748291015625,
0.019866943359375,
0.0009016990661621094,
-0.0270843505859375,
0.011810302734375,
0.0203857421875,
0.00891876220703125,
0.0748291015625,
0.0401611328125,
-0.01496124267578125,
-0.0006990432739257812,
-0.0235595703125,
0.072265625,
-0.031524658203125,
-0.01140594482421875,
-0.0428466796875,
0.046142578125,
-0.006420135498046875,
-0.0310516357421875,
0.039459228515625,
0.040069580078125,
0.070556640625,
-0.01372528076171875,
0.033721923828125,
-0.01873779296875,
-0.00003355741500854492,
-0.01555633544921875,
0.0374755859375,
-0.056671142578125,
-0.00597381591796875,
-0.0111541748046875,
-0.050018310546875,
-0.0255584716796875,
0.051361083984375,
-0.01218414306640625,
0.02886962890625,
0.0372314453125,
0.06817626953125,
-0.0177154541015625,
-0.0125274658203125,
0.022979736328125,
-0.004642486572265625,
0.0025691986083984375,
0.034332275390625,
0.0386962890625,
-0.06475830078125,
0.022796630859375,
-0.06304931640625,
-0.0178070068359375,
-0.01508331298828125,
-0.07012939453125,
-0.049041748046875,
-0.061492919921875,
-0.0440673828125,
-0.059051513671875,
-0.0174713134765625,
0.04791259765625,
0.0848388671875,
-0.056365966796875,
-0.004253387451171875,
-0.0195159912109375,
0.00724029541015625,
-0.0261688232421875,
-0.016754150390625,
0.033416748046875,
-0.007843017578125,
-0.045196533203125,
-0.016937255859375,
-0.0060272216796875,
0.00537109375,
-0.01267242431640625,
-0.01430511474609375,
-0.016143798828125,
-0.028961181640625,
0.0242919921875,
0.050567626953125,
-0.040557861328125,
-0.0183868408203125,
0.0023860931396484375,
-0.011993408203125,
0.00325775146484375,
0.03692626953125,
-0.0699462890625,
0.0390625,
0.039794921875,
0.044464111328125,
0.051971435546875,
-0.00244140625,
0.006633758544921875,
-0.0435791015625,
0.0207061767578125,
0.01195526123046875,
0.030029296875,
0.0216827392578125,
-0.03173828125,
0.050811767578125,
0.038787841796875,
-0.045440673828125,
-0.05120849609375,
0.01399993896484375,
-0.08251953125,
-0.0163726806640625,
0.07421875,
-0.02008056640625,
-0.04119873046875,
0.023773193359375,
-0.0147857666015625,
0.03466796875,
-0.00794219970703125,
0.027557373046875,
0.024627685546875,
-0.007495880126953125,
-0.0533447265625,
-0.028839111328125,
0.036590576171875,
0.0027751922607421875,
-0.028472900390625,
-0.0213775634765625,
0.0191497802734375,
0.0328369140625,
0.0207672119140625,
0.03057861328125,
-0.01090240478515625,
0.027618408203125,
0.01084136962890625,
0.047515869140625,
-0.038330078125,
-0.0290374755859375,
-0.0311737060546875,
0.006336212158203125,
-0.01277923583984375,
-0.047119140625
]
] |
bigscience/bloom-560m | 2023-09-26T09:16:49.000Z | [
"transformers",
"pytorch",
"jax",
"onnx",
"safetensors",
"bloom",
"text-generation",
"ak",
"ar",
"as",
"bm",
"bn",
"ca",
"code",
"en",
"es",
"eu",
"fon",
"fr",
"gu",
"hi",
"id",
"ig",
"ki",
"kn",
"lg",
"ln",
"ml",
"mr",
"ne",
"nso",
"ny",
"or",
"pa",
"pt",
"rn",
"rw",
"sn",
"st",
"sw",
"ta",
"te",
"tn",
"ts",
"tum",
"tw",
"ur",
"vi",
"wo",
"xh",
"yo",
"zh",
"zhs",
"zht",
"zu",
"arxiv:1909.08053",
"arxiv:2110.02861",
"arxiv:2108.12409",
"license:bigscience-bloom-rail-1.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | bigscience | null | null | bigscience/bloom-560m | 274 | 1,235,084 | transformers | 2022-05-19T11:51:24 | ---
license: bigscience-bloom-rail-1.0
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zhs
- zht
- zu
pipeline_tag: text-generation
---
<h1 style='text-align: center '>BLOOM LM</h1>
<h2 style='text-align: center '><em>BigScience Large Open-science Open-access Multilingual Language Model</em> </h2>
<h3 style='text-align: center '>Model Card</h3>
<img src="https://s3.amazonaws.com/moonup/production/uploads/1657124309515-5f17f0a0925b9863e28ad517.png" alt="BigScience Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
Version 1.0 / 26.May.2022
# Model Card for Bloom-560m
<!-- Provide a quick summary of what the model is/does. -->
## Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Recommendations](#recommendations)
5. [Training Data](#training-data)
6. [Evaluation](#evaluation)
7. [Environmental Impact](#environmental-impact)
8. [Technical Specifications](#technical-specifications)
9. [Citation](#citation)
10. [Glossary and Calculations](#glossary-and-calculations)
11. [More Information](#more-information)
12. [Model Card Authors](#model-card-authors)
13. [Model Card Contact](#model-card-contact)
## Model Details
### Model Description
*This section provides information for anyone who wants to know about the model.*
- **Developed by:** BigScience ([website](https://bigscience.huggingface.co))
* All collaborators are volunteers or have an agreement with their employer. *(Further breakdown of participants forthcoming.)*
- **Model Type:** Transformer-based Language Model
- **Version:** 1.0.0
- **Languages:** Multiple; see [training data](#training-data)
- **License:** RAIL License v1.0 ([link](https://huggingface.co/spaces/bigscience/license))
- **Release Date Estimate:** Monday, 11.July.2022
- **Funded by:**
* The French government.
* Hugging Face ([website](https://huggingface.co)).
* Organizations of contributors. *(Further breakdown of organizations forthcoming.)*
## Uses
*This section addresses questions around how the model is intended to be used, discusses the foreseeable users of the model (including those affected by the model), and describes uses that are considered out of scope or misuse of the model.
It provides information for anyone considering using the model or who is affected by the model.*
### Intended Use
This model is being created in order to enable public research on large language models (LLMs). LLMs are intended to be used for language generation or as a pretrained base model that can be further fine-tuned for specific tasks. Use cases below are not exhaustive.
#### **Direct Use**
- Text generation
- Exploring characteristics of language generated by a language model
- Examples: Cloze tests, counterfactuals, generations with reframings
#### **Downstream Use**
- Tasks that leverage language models include: Information Extraction, Question Answering, Summarization
### Misuse and Out-of-scope Use
*This section addresses what users ought not do with the model.*
See the [BLOOM License](https://huggingface.co/spaces/bigscience/license), Attachment A, for detailed usage restrictions. The below list is non-exhaustive, but lists some easily foreseeable problematic use cases.
#### **Out-of-scope Uses**
Using the model in [high-stakes](#high-stakes) settings is out of scope for this model. The model is not designed for [critical decisions](#critical-decisions) nor for uses with any material consequences on an individual's livelihood or wellbeing. The model outputs content that appears factual but is not correct.
##### Out-of-scope Uses Include:
- Usage in biomedical domains, political and legal domains, or finance domains
- Usage for evaluating or scoring individuals, such as for employment, education, or credit
- Applying the model for critical automatic decisions, generating factual content, creating reliable summaries, or generating predictions that must be correct
#### **Misuse**
Intentionally using the model for harm, violating [human rights](#human-rights), or other kinds of malicious activities, is a misuse of this model. This includes:
- Spam generation
- Disinformation and influence operations
- Disparagement and defamation
- Harassment and abuse
- [Deception](#deception)
- Unconsented impersonation and imitation
- Unconsented surveillance
- Generating content without attribution to the model, as specified in the [RAIL License, Use Restrictions](https://huggingface.co/spaces/bigscience/license)
### Intended Users
#### **Direct Users**
- General Public
- Researchers
- Students
- Educators
- Engineers/developers
- Non-commercial entities
- Community advocates, including human and civil rights groups
#### Indirect Users
- Users of derivatives created by Direct Users, such as those using software with an [intended use](#intended-use)
- Users of [Derivatives of the Model, as described in the License](https://huggingface.co/spaces/bigscience/license)
#### Others Affected (Stakeholders)
- People and groups referred to by the LLM
- People and groups exposed to outputs of, or decisions based on, the LLM
- People and groups whose original work is included in the LLM
## Bias, Risks, and Limitations
*This section identifies foreseeable harms and misunderstandings.*
The model may:
- Overrepresent some viewpoints and underrepresent others
- Contain stereotypes
- Contain [personal information](#personal-data-and-information)
- Generate:
- Hateful, abusive, or violent language
- Discriminatory or prejudicial language
- Content that may not be appropriate for all settings, including sexual content
- Make errors, including producing incorrect information as if it were factual
- Generate irrelevant or repetitive outputs
### Recommendations
*This section provides information on warnings and potential mitigations.*
- Indirect users should be made aware when the content they're working with is created by the LLM.
- Users should be aware of [Bias, Risks, and Limitations](#bias-risks-and-limitations), and include an appropriate age disclaimer or blocking interface as necessary.
- Models pretrained with the LLM should include an updated Model Card.
- Users of the model should provide mechanisms for those affected to provide feedback, such as an email address for comments.
## Training Data
*This section provides a high-level overview of the training data. It is relevant for anyone who wants to know the basics of what the model is learning.*
Details for each dataset are provided in individual [Data Cards](https://huggingface.co/spaces/bigscience/BigScienceCorpus).
Training data includes:
- 45 natural languages
- 12 programming languages
- In 1.5TB of pre-processed text, converted into 350B unique tokens (see [the tokenizer section](#tokenization) for more.)
#### **Languages**
The pie chart shows the distribution of languages in training data.

**The following table shows the further distribution of Niger-Congo and Indic languages in the training data.**
| Niger Congo | Percentage | | Indic | Percentage |
|----------------|------------ |------ |-----------|------------|
| Chi Tumbuka | 0.00002 | | Assamese | 0.01 |
| Kikuyu | 0.00004 | | Odia | 0.04 |
| Bambara | 0.00004 | | Gujarati | 0.04 |
| Akan | 0.00007 | | Marathi | 0.05 |
| Xitsonga | 0.00007 | | Punjabi | 0.05 |
| Sesotho | 0.00007 | | Kannada | 0.06 |
| Chi Chewa | 0.0001 | | Nepali | 0.07 |
| Setswana | 0.0002 | | Telugu | 0.09 |
| Northern Sotho | 0.0002 | | Malayalam | 0.10 |
| Fon | 0.0002 | | Urdu | 0.10 |
| Kirundi | 0.0003 | | Tamil | 0.20 |
| Wolof | 0.0004 | | Bengali | 0.50 |
| Kuganda | 0.0004 | | Hindi | 0.70 |
| Chi Shona | 0.001 | | | |
| Isi Zulu | 0.001 | | | |
| Igbo | 0.001 | | | |
| Xhosa | 0.001 | | | |
| Kinyarwanda | 0.003 | | | |
| Yoruba | 0.006 | | | |
| Swahili | 0.02 | | | |
**The following table shows the distribution of programming languages.**
| Extension | Language | Number of files |
|----------------|------------|-----------------|
| java | Java | 5,407,724 |
| php | PHP | 4,942,186 |
| cpp | C++ | 2,503,930 |
| py | Python | 2,435,072 |
| js | JavaScript | 1,905,518 |
| cs | C# | 1,577,347 |
| rb | Ruby | 678,413 |
| cc | C++ | 443,054 |
| hpp | C++ | 391,048 |
| lua | Lua | 352,317 |
| go | GO | 227,763 |
| ts | TypeScript | 195,254 |
| C | C | 134,537 |
| scala | Scala | 92,052 |
| hh | C++ | 67,161 |
| H | C++ | 55,899 |
| tsx | TypeScript | 33,107 |
| rs | Rust | 29,693 |
| phpt | PHP | 9,702 |
| c++ | C++ | 1,342 |
| h++ | C++ | 791 |
| php3 | PHP | 540 |
| phps | PHP | 270 |
| php5 | PHP | 166 |
| php4 | PHP | 29 |
## Evaluation
*This section describes the evaluation protocols and provides the results.*
### Metrics
*This section describes the different ways performance is calculated and why.*
Includes:
| Metric | Why chosen |
|--------------------|--------------------------------------------------------------------|
| [Perplexity](#perplexity) | Standard metric for quantifying model improvements during training |
| Cross Entropy [Loss](#loss) | Standard objective for language models. |
And multiple different metrics for specific tasks. _(More evaluation metrics forthcoming upon completion of evaluation protocol.)_
### Factors
*This section lists some different aspects of what BLOOM models. Its focus is on those aspects that are likely to give rise to high variance in model behavior.*
- Language, such as English or Yoruba
- Domain, such as newswire or stories
- Demographic characteristics, such as gender or nationality
### Results
*Results are based on the [Factors](#factors) and [Metrics](#metrics).*
**Train-time Evaluation:**
As of 25.May.2022, 15:00 PST:
- Training Loss: 2.0
- Validation Loss: 2.2
- Perplexity: 8.9
(More evaluation scores forthcoming at the end of model training.)
## Environmental Impact
The training supercomputer, Jean Zay ([website](http://www.idris.fr/eng/jean-zay/jean-zay-presentation-eng.html)), uses mostly nuclear energy. The heat generated by it is reused for heating campus housing.
**Estimated carbon emissions:** *(Forthcoming upon completion of training.)*
**Estimated electricity usage:** *(Forthcoming upon completion of training.)*
## Technical Specifications
*This section provides information for people who work on model development.*
Please see [the BLOOM training README](https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml#readme) for full details on replicating training.
**Model Architecture:** Modified from Megatron-LM GPT2 (see [paper](https://arxiv.org/abs/1909.08053), [BLOOM Megatron code](https://github.com/bigscience-workshop/Megatron-DeepSpeed)):
* Decoder-only architecture
* Layer normalization applied to word embeddings layer (`StableEmbedding`; see [code](https://github.com/facebookresearch/bitsandbytes), [paper](https://arxiv.org/pdf/2110.02861.pdf))
* ALiBI positional encodings (see [paper](https://arxiv.org/pdf/2108.12409.pdf)), with GeLU activation functions
* 559,214,592 parameters:
* 256,901,120 embedding parameters
* 24 layers, 16 attention heads
* Hidden layers are 1024-dimensional
* Sequence length of 2048 tokens (see [BLOOM tokenizer](https://huggingface.co/bigscience/tokenizer), [tokenizer description](#tokenization))
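As a sanity check, the headline parameter count can be reproduced from the figures above. The sketch below assumes a standard GPT-2-style parameterization (fused QKV projection, biases on all linear layers, two LayerNorms per block, tied input/output embeddings) plus the embedding LayerNorm and a final LayerNorm; the padded embedding vocabulary of 250,880 is inferred from 256,901,120 / 1024, not stated in the card:

```python
# Sanity check: reproduce the 559,214,592 parameter count from the
# architecture figures above, assuming a GPT-2-style parameterization.
hidden = 1024
layers = 24
vocab_padded = 250_880  # inferred: 256,901,120 embedding params / 1024 hidden

embedding = vocab_padded * hidden          # tied input/output embeddings
per_layer = (
    3 * hidden * hidden + 3 * hidden       # fused QKV projection (+ bias)
    + hidden * hidden + hidden             # attention output projection
    + hidden * 4 * hidden + 4 * hidden     # MLP up-projection
    + 4 * hidden * hidden + hidden         # MLP down-projection
    + 2 * 2 * hidden                       # two LayerNorms (weight + bias)
)
extra_norms = 2 * 2 * hidden               # embedding LayerNorm + final LayerNorm

total = embedding + layers * per_layer + extra_norms
print(embedding, total)  # 256901120 559214592
```

Under these assumptions the totals match the card's numbers exactly, which supports the GPT-2-style reading of the architecture.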
**Objective Function:** Cross Entropy with mean reduction (see [API documentation](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss)).
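For illustration only, the objective can be written out in a few lines of pure Python. This mirrors, but does not call, the `mean`-reduced PyTorch loss linked above; the toy logits and 3-token vocabulary are invented for the example:

```python
import math

def cross_entropy_mean(logits, targets):
    """Mean softmax cross entropy over a batch of positions.

    logits: list of per-position score vectors; targets: correct token ids.
    Mirrors the 'mean' reduction of torch.nn.CrossEntropyLoss.
    """
    total = 0.0
    for scores, target in zip(logits, targets):
        m = max(scores)  # subtract the max for numerical stability
        log_z = m + math.log(sum(math.exp(s - m) for s in scores))
        total += log_z - scores[target]  # -log softmax probability of target
    return total / len(targets)

# Two positions over a toy 3-token vocabulary:
loss = cross_entropy_mean([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]], [0, 1])
print(round(loss, 4))
```

Minimizing this quantity over the training corpus is what "Cross Entropy with mean reduction" refers to.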
**Compute infrastructure:** Jean Zay Public Supercomputer, provided by the French government (see [announcement](https://www.enseignementsup-recherche.gouv.fr/fr/signature-du-marche-d-acquisition-de-l-un-des-supercalculateurs-les-plus-puissants-d-europe-46733)).
* Hardware: 384 A100 80GB GPUs (48 nodes):
* Additional 32 A100 80GB GPUs (4 nodes) in reserve
* 8 GPUs per node, using NVLink 4 inter-GPU connects and 4 OmniPath links
* CPU: AMD
* CPU memory: 512GB per node
* GPU memory: 640GB per node
* Inter-node connect: Omni-Path Architecture (OPA)
* NCCL-communications network: a fully dedicated subnet
* Disc IO network: shared network with other types of nodes
* Software:
* Megatron-DeepSpeed ([Github link](https://github.com/bigscience-workshop/Megatron-DeepSpeed))
* DeepSpeed ([Github link](https://github.com/microsoft/DeepSpeed))
* PyTorch (pytorch-1.11 w/ CUDA-11.5; see [Github link](https://github.com/pytorch/pytorch))
* apex ([Github link](https://github.com/NVIDIA/apex))
### **Training**
Training logs: [Tensorboard link](https://huggingface.co/bigscience/tr11e-350M-logs)
- Training throughput: About 150 TFLOPs per GPU
- Number of epochs: 1 (*current target*)
- Dates:
- Started 11th March, 2022 11:42am PST
- Ended 5th July, 2022
- Estimated cost of training: Equivalent of $2-5M in cloud computing (including preliminary experiments and other model sizes)
- Server training location: Île-de-France, France
### **Tokenization**
The BLOOM tokenizer ([link](https://huggingface.co/bigscience/tokenizer)) is a learned subword tokenizer trained using:
- A byte-level Byte Pair Encoding (BPE) algorithm
- A simple pre-tokenization rule, no normalization
- A vocabulary size of 250,680
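One training step of such a byte-level BPE learner — count adjacent token pairs, then merge the most frequent pair into a new token — can be sketched as follows. This is an illustrative toy, not the actual BLOOM tokenizer training code; starting from raw UTF-8 bytes guarantees every string is representable without any normalization:

```python
from collections import Counter

def most_frequent_pair(token_seqs):
    """One step of BPE training: find the most frequent adjacent pair."""
    pairs = Counter()
    for seq in token_seqs:
        pairs.update(zip(seq, seq[1:]))
    return pairs.most_common(1)[0][0]

def merge(seq, pair, new_token):
    """Replace every occurrence of `pair` in `seq` with `new_token`."""
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
            out.append(new_token)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out

# Byte-level: sequences start as raw UTF-8 byte values.
corpus = [list("bloom".encode()), list("boom".encode())]
pair = most_frequent_pair(corpus)      # (111, 111), i.e. the byte pair "oo"
merged = merge(corpus[0], pair, 256)   # new token id beyond the 256 base bytes
```

Repeating this loop until the vocabulary reaches the target size (250,680 for BLOOM) yields the merge table the tokenizer applies at inference time.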
It was trained on a subset of a preliminary version of the corpus using alpha-weighting per language.
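Alpha-weighting of this kind is commonly implemented by exponentiating raw per-language corpus sizes by an exponent α < 1 and renormalizing, which up-samples low-resource languages relative to their raw byte share. A hypothetical sketch — the corpus sizes and α = 0.3 below are made-up illustrations, not BLOOM's actual values:

```python
def alpha_weights(sizes, alpha=0.3):
    """Exponentiate raw per-language corpus sizes by alpha and renormalize.

    With alpha < 1 the distribution is flattened, so low-resource
    languages are up-sampled relative to their raw byte share.
    """
    scaled = {lang: size ** alpha for lang, size in sizes.items()}
    total = sum(scaled.values())
    return {lang: s / total for lang, s in scaled.items()}

# Hypothetical corpus sizes in bytes (illustrative only):
weights = alpha_weights({"en": 500e9, "fr": 200e9, "sw": 2e9})
# Swahili's raw share is ~0.3% of bytes, but its sampling weight is ~10%.
```

The trade-off is controlled by α: α = 1 reproduces the raw distribution, while α = 0 samples all languages uniformly.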
## Citation
**Cite as:** BigScience, _BigScience Large Open-science Open-access Multilingual (BLOOM) Language Model_. International, May 2021-May 2022
## Glossary and Calculations
*This section defines common terms and how metrics are calculated.*
- <a name="loss">**Loss:**</a> A calculation of the difference between what the model has learned and what the data shows ("groundtruth"). The lower the loss, the better. The training process aims to minimize the loss.
- <a name="perplexity">**Perplexity:**</a> This is based on what the model estimates the probability of new data is. The lower the perplexity, the better. If the model is 100% correct at predicting the next token it will see, then the perplexity is 1. Mathematically this is calculated using entropy.
- <a name="high-stakes">**High-stakes settings:**</a> Such as those identified as "high-risk AI systems" and "unacceptable risk AI systems" in the European Union's proposed [Artificial Intelligence (AI) Act](https://artificialintelligenceact.eu/annexes/).
- <a name="critical-decisions">**Critical decisions:**</a> Such as those defined in [the United States' proposed Algorithmic Accountability Act](https://www.congress.gov/117/bills/s3572/BILLS-117s3572is.pdf).
- <a name="human-rights">**Human rights:**</a> Includes those rights defined in the [Universal Declaration of Human Rights](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf).
- <a name="personal-data-and-information">**Personal Data and Personal Information:**</a> Personal data and information is defined in multiple data protection regulations, such as "[personal data](https://gdpr-info.eu/issues/personal-data/)" in the [European Union's General Data Protection Regulation](https://gdpr-info.eu), and "personal information" in the Republic of South Africa's [Protection of Personal Information Act](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf) and the People's Republic of China's [Personal Information Protection Law](http://en.npc.gov.cn.cdurl.cn/2021-12/29/c_694559.htm).
- <a name="sensitive-characteristics">**Sensitive characteristics:**</a> This includes specifically protected categories in human rights (see [UHDR, Article 2](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf)) and personal information regulation (see GDPR, [Article 9; Protection of Personal Information Act, Chapter 1](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf))
- <a name="deception">**Deception:**</a> Doing something to intentionally mislead individuals to believe something that is false, such as by creating deadbots or chatbots on social media posing as real people, or generating text documents without making consumers aware that the text is machine generated.
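The Loss and Perplexity entries above are directly related: for a language model trained with cross-entropy loss, perplexity is the exponential of the mean per-token loss. The train-time numbers in [Results](#results) are consistent with this relation — exp(2.2) ≈ 9.0 against the reported validation perplexity of 8.9, with the small gap explained by rounding of the reported loss:

```python
import math

def perplexity(mean_cross_entropy):
    """Perplexity is the exponential of the mean per-token cross entropy."""
    return math.exp(mean_cross_entropy)

print(perplexity(0.0))  # 1.0: a model that is always certain and correct
print(perplexity(2.2))  # ≈ 9.0, close to the reported validation perplexity of 8.9
```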
## More Information
### Dataset Creation
Blog post detailing the design choices during the dataset creation: https://bigscience.huggingface.co/blog/building-a-tb-scale-multilingual-dataset-for-language-modeling
### Technical Specifications
Blog post summarizing how the architecture, size, shape, and pre-training duration were selected: https://bigscience.huggingface.co/blog/what-language-model-to-train-if-you-have-two-million-gpu-hours
More details on the architecture/optimizer: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml
Blog post on the hardware/engineering side: https://bigscience.huggingface.co/blog/which-hardware-to-train-a-176b-parameters-model
Details on the distributed setup used for the training: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml
Tensorboard updated during the training: https://huggingface.co/bigscience/tr11-176B-ml-logs/tensorboard#scalars&tagFilter=loss
Insights on how to approach training, negative results: https://github.com/bigscience-workshop/bigscience/blob/master/train/lessons-learned.md
Details on the obstacles overcome during the preparation on the engineering side (instabilities, optimization of training throughput, and many technical tricks and questions): https://github.com/bigscience-workshop/bigscience/blob/master/train/tr11-176B-ml/chronicles.md
### Initial Results
Initial prompting experiments using interim checkpoints: https://huggingface.co/spaces/bigscience/bloom-book
## Model Card Authors
*Ordered roughly chronologically and by amount of time spent.*
Margaret Mitchell, Giada Pistilli, Yacine Jernite, Ezinwanne Ozoani, Marissa Gerchick, Nazneen Rajani, Sasha Luccioni, Irene Solaiman, Maraim Masoud, Somaieh Nikpoor, Carlos Muñoz Ferrandis, Stas Bekman, Christopher Akiki, Danish Contractor, David Lansky, Angelina McMillan-Major, Tristan Thrush, Suzana Ilić, Gérard Dupont, Shayne Longpre, Manan Dey, Stella Biderman, Douwe Kiela, Emi Baylor, Teven Le Scao, Aaron Gokaslan, Julien Launay, Niklas Muennighoff
## Model Card Contact
**Send Questions to:** bigscience-contact@googlegroups.com | 20,187 | [
[
-0.01873779296875,
-0.044097900390625,
0.032867431640625,
0.02044677734375,
-0.00919342041015625,
-0.018280029296875,
-0.03851318359375,
-0.04302978515625,
0.00598907470703125,
0.03887939453125,
-0.033599853515625,
-0.052215576171875,
-0.04913330078125,
0.0035648345947265625,
-0.0267181396484375,
0.08074951171875,
0.0003879070281982422,
0.01535797119140625,
-0.010894775390625,
-0.003482818603515625,
-0.01439666748046875,
-0.050018310546875,
-0.0300750732421875,
-0.019256591796875,
0.04461669921875,
0.026214599609375,
0.04266357421875,
0.042755126953125,
0.0458984375,
0.0172576904296875,
-0.02923583984375,
-0.01910400390625,
-0.043121337890625,
-0.0274658203125,
-0.01554107666015625,
-0.0242919921875,
-0.049224853515625,
0.01097869873046875,
0.06439208984375,
0.064208984375,
-0.01285552978515625,
0.024169921875,
-0.0005068778991699219,
0.040283203125,
-0.030914306640625,
0.0272064208984375,
-0.025634765625,
0.005077362060546875,
-0.0187225341796875,
0.0238494873046875,
-0.024566650390625,
-0.016632080078125,
-0.0019073486328125,
-0.038330078125,
-0.00235748291015625,
0.006580352783203125,
0.0677490234375,
0.003940582275390625,
-0.0183563232421875,
-0.008758544921875,
-0.054901123046875,
0.0655517578125,
-0.06695556640625,
0.043975830078125,
0.042266845703125,
0.0169830322265625,
0.0004279613494873047,
-0.05010986328125,
-0.05413818359375,
-0.0220794677734375,
0.0018777847290039062,
0.0211181640625,
-0.01258087158203125,
0.0014753341674804688,
0.036773681640625,
0.043365478515625,
-0.041168212890625,
0.0181732177734375,
-0.047149658203125,
-0.006877899169921875,
0.064208984375,
0.0004055500030517578,
0.0201873779296875,
-0.01336669921875,
-0.017333984375,
-0.022216796875,
-0.062164306640625,
-0.00617218017578125,
0.0309600830078125,
0.03900146484375,
-0.043609619140625,
0.06231689453125,
0.01290130615234375,
0.0372314453125,
-0.0201568603515625,
-0.015411376953125,
0.045745849609375,
-0.0400390625,
-0.0254974365234375,
-0.0213623046875,
0.07159423828125,
0.0221405029296875,
0.00335693359375,
-0.0005025863647460938,
-0.006221771240234375,
-0.01885986328125,
-0.004901885986328125,
-0.060546875,
-0.0032329559326171875,
0.0126495361328125,
-0.0265350341796875,
-0.00997161865234375,
-0.0046234130859375,
-0.07037353515625,
-0.0118255615234375,
-0.025360107421875,
0.01276397705078125,
-0.030792236328125,
-0.0450439453125,
0.01078033447265625,
0.0019893646240234375,
0.01305389404296875,
0.0003032684326171875,
-0.057342529296875,
0.0204620361328125,
0.0264739990234375,
0.06646728515625,
-0.016143798828125,
-0.032806396484375,
0.00542449951171875,
0.01013946533203125,
-0.0075531005859375,
0.02301025390625,
-0.0252532958984375,
-0.050079345703125,
0.00213623046875,
0.02044677734375,
-0.001461029052734375,
-0.0309600830078125,
0.035186767578125,
-0.01708984375,
0.03338623046875,
-0.026336669921875,
-0.0447998046875,
-0.002330780029296875,
0.0026607513427734375,
-0.052734375,
0.0872802734375,
0.0164337158203125,
-0.05548095703125,
0.020599365234375,
-0.0703125,
-0.01374053955078125,
-0.0015840530395507812,
-0.00199127197265625,
-0.03936767578125,
-0.020599365234375,
-0.006450653076171875,
0.0273590087890625,
-0.0181884765625,
0.041259765625,
-0.0198516845703125,
-0.0007781982421875,
0.007781982421875,
-0.0177764892578125,
0.07666015625,
0.0197601318359375,
-0.0210113525390625,
0.007198333740234375,
-0.055999755859375,
-0.0242767333984375,
0.0269775390625,
-0.034820556640625,
-0.007114410400390625,
-0.00804901123046875,
0.032012939453125,
0.0250396728515625,
0.01091766357421875,
-0.048126220703125,
0.0225677490234375,
-0.0447998046875,
0.035736083984375,
0.041839599609375,
-0.0073089599609375,
0.0313720703125,
-0.022186279296875,
0.039520263671875,
0.0193939208984375,
0.017974853515625,
-0.003093719482421875,
-0.046630859375,
-0.0477294921875,
-0.036651611328125,
0.026824951171875,
0.03948974609375,
-0.035064697265625,
0.052581787109375,
-0.0238037109375,
-0.052276611328125,
-0.029998779296875,
0.0110931396484375,
0.0416259765625,
0.033477783203125,
0.038665771484375,
-0.0083465576171875,
-0.03863525390625,
-0.06634521484375,
0.007587432861328125,
-0.00518035888671875,
0.01345062255859375,
0.0272216796875,
0.0699462890625,
-0.0307769775390625,
0.05767822265625,
-0.0447998046875,
-0.003910064697265625,
-0.0198974609375,
-0.00614166259765625,
0.0209197998046875,
0.03326416015625,
0.045379638671875,
-0.06524658203125,
-0.0243072509765625,
-0.00717926025390625,
-0.05474853515625,
0.0048980712890625,
0.02349853515625,
-0.006755828857421875,
0.0267181396484375,
0.03692626953125,
-0.0623779296875,
0.030517578125,
0.05670166015625,
-0.0196685791015625,
0.050506591796875,
-0.00666046142578125,
-0.0158843994140625,
-0.0987548828125,
0.032623291015625,
0.01399993896484375,
0.00325775146484375,
-0.041900634765625,
0.0067901611328125,
-0.004093170166015625,
-0.02587890625,
-0.044647216796875,
0.0673828125,
-0.025970458984375,
0.00296783447265625,
-0.004322052001953125,
0.004398345947265625,
-0.004108428955078125,
0.0223388671875,
0.0133209228515625,
0.06500244140625,
0.043304443359375,
-0.04248046875,
0.01271820068359375,
0.0251007080078125,
-0.022674560546875,
0.01654052734375,
-0.05999755859375,
-0.01221466064453125,
-0.0123443603515625,
0.0237579345703125,
-0.057159423828125,
-0.0272979736328125,
0.0219268798828125,
-0.0289306640625,
0.0355224609375,
-0.0039825439453125,
-0.055633544921875,
-0.047454833984375,
-0.0148162841796875,
0.0240631103515625,
0.04339599609375,
-0.0292816162109375,
0.01690673828125,
0.037139892578125,
-0.005847930908203125,
-0.04132080078125,
-0.068115234375,
-0.00615692138671875,
-0.0229339599609375,
-0.040771484375,
0.0216522216796875,
-0.023681640625,
-0.0168609619140625,
0.0030879974365234375,
0.021881103515625,
-0.007114410400390625,
0.002880096435546875,
0.0250701904296875,
0.0202789306640625,
-0.01059722900390625,
0.0259857177734375,
-0.0212554931640625,
0.01354217529296875,
0.00521087646484375,
-0.01319122314453125,
0.039215087890625,
-0.0010766983032226562,
-0.0279083251953125,
-0.021392822265625,
0.0257110595703125,
0.0400390625,
-0.013214111328125,
0.072998046875,
0.04583740234375,
-0.039520263671875,
0.0079803466796875,
-0.03607177734375,
-0.0208892822265625,
-0.031341552734375,
0.0479736328125,
0.003814697265625,
-0.064697265625,
0.043670654296875,
0.007389068603515625,
0.006443023681640625,
0.048248291015625,
0.0589599609375,
0.00278472900390625,
0.0601806640625,
0.06591796875,
-0.0172271728515625,
0.03973388671875,
-0.031005859375,
0.0281829833984375,
-0.0654296875,
-0.0237274169921875,
-0.038543701171875,
0.0019683837890625,
-0.0498046875,
-0.0394287109375,
0.013031005859375,
0.00957489013671875,
-0.03802490234375,
0.0261993408203125,
-0.02398681640625,
0.0187835693359375,
0.040771484375,
-0.0019664764404296875,
0.0146942138671875,
0.006267547607421875,
-0.0125274658203125,
-0.01123809814453125,
-0.047515869140625,
-0.0408935546875,
0.10015869140625,
0.053955078125,
0.04803466796875,
0.01302337646484375,
0.044708251953125,
0.0018720626831054688,
0.02752685546875,
-0.053802490234375,
0.033935546875,
-0.0044708251953125,
-0.07427978515625,
-0.023406982421875,
-0.040313720703125,
-0.08123779296875,
0.0015163421630859375,
-0.01380157470703125,
-0.068115234375,
0.01617431640625,
0.0090179443359375,
-0.015594482421875,
0.04010009765625,
-0.059906005859375,
0.06402587890625,
-0.018768310546875,
-0.024688720703125,
-0.0034770965576171875,
-0.04364013671875,
0.03729248046875,
-0.0150909423828125,
0.038360595703125,
-0.002437591552734375,
0.01117706298828125,
0.0640869140625,
-0.0270843505859375,
0.08331298828125,
-0.01275634765625,
-0.01025390625,
0.01837158203125,
-0.0206451416015625,
0.0216522216796875,
-0.006671905517578125,
-0.01137542724609375,
0.04241943359375,
-0.0010395050048828125,
-0.0206298828125,
-0.0079498291015625,
0.052581787109375,
-0.07958984375,
-0.036041259765625,
-0.040313720703125,
-0.0498046875,
-0.001796722412109375,
0.0258026123046875,
0.027557373046875,
0.01042938232421875,
-0.016082763671875,
0.017913818359375,
0.051849365234375,
-0.052581787109375,
0.017059326171875,
0.036407470703125,
-0.045196533203125,
-0.0250701904296875,
0.067626953125,
0.01446533203125,
0.03240966796875,
0.001644134521484375,
0.0244598388671875,
-0.03411865234375,
-0.03656005859375,
-0.018402099609375,
0.0435791015625,
-0.04925537109375,
-0.006275177001953125,
-0.0482177734375,
-0.0374755859375,
-0.0501708984375,
0.013458251953125,
-0.0216217041015625,
-0.0244293212890625,
-0.025665283203125,
-0.0187835693359375,
0.0291290283203125,
0.046600341796875,
-0.01348876953125,
0.034454345703125,
-0.044525146484375,
0.007144927978515625,
0.00797271728515625,
0.035491943359375,
-0.0177001953125,
-0.0517578125,
-0.0291290283203125,
0.023651123046875,
-0.033660888671875,
-0.053070068359375,
0.032135009765625,
0.008544921875,
0.043853759765625,
0.0078887939453125,
-0.023162841796875,
0.023834228515625,
-0.0343017578125,
0.08416748046875,
0.03070068359375,
-0.050079345703125,
0.045013427734375,
-0.04498291015625,
0.0198974609375,
0.0295867919921875,
0.05279541015625,
-0.0382080078125,
-0.0193023681640625,
-0.059844970703125,
-0.0816650390625,
0.051971435546875,
0.01519012451171875,
0.0182037353515625,
-0.003368377685546875,
0.0244293212890625,
-0.013580322265625,
0.019927978515625,
-0.09246826171875,
-0.029022216796875,
-0.010986328125,
-0.0200958251953125,
-0.0163726806640625,
-0.0237274169921875,
-0.0292816162109375,
-0.033782958984375,
0.057952880859375,
0.00659942626953125,
0.031402587890625,
0.00274658203125,
-0.006862640380859375,
-0.022369384765625,
0.0131378173828125,
0.05072021484375,
0.0604248046875,
-0.019683837890625,
-0.0143585205078125,
0.01505279541015625,
-0.055267333984375,
-0.0014543533325195312,
0.01959228515625,
-0.0128021240234375,
-0.0161285400390625,
0.0243988037109375,
0.0543212890625,
0.01297760009765625,
-0.054840087890625,
0.04522705078125,
0.00760650634765625,
-0.026031494140625,
-0.035675048828125,
-0.0023250579833984375,
0.025848388671875,
0.0111083984375,
0.01111602783203125,
-0.0185699462890625,
0.0022182464599609375,
-0.042877197265625,
-0.0017786026000976562,
0.0233001708984375,
-0.0172271728515625,
-0.02874755859375,
0.043548583984375,
0.0042877197265625,
-0.0178985595703125,
0.035491943359375,
-0.01537322998046875,
-0.0212860107421875,
0.048309326171875,
0.050872802734375,
0.040435791015625,
-0.0243377685546875,
0.007320404052734375,
0.056732177734375,
0.030426025390625,
-0.00960540771484375,
0.0306549072265625,
0.032196044921875,
-0.049072265625,
-0.037841796875,
-0.058197021484375,
-0.0277862548828125,
0.03582763671875,
-0.04278564453125,
0.031494140625,
-0.037322998046875,
-0.0178070068359375,
0.009185791015625,
-0.00337982177734375,
-0.05364990234375,
0.0155029296875,
0.03216552734375,
0.0797119140625,
-0.08135986328125,
0.065185546875,
0.057769775390625,
-0.06329345703125,
-0.0657958984375,
-0.00844573974609375,
0.00980377197265625,
-0.054534912109375,
0.0653076171875,
0.0037860870361328125,
0.0148162841796875,
-0.008270263671875,
-0.06610107421875,
-0.08367919921875,
0.081298828125,
0.0251617431640625,
-0.05108642578125,
0.00033473968505859375,
0.022003173828125,
0.050140380859375,
-0.01312255859375,
0.02099609375,
0.0333251953125,
0.04913330078125,
0.01274871826171875,
-0.0777587890625,
0.00015544891357421875,
-0.0177764892578125,
-0.0039825439453125,
0.001972198486328125,
-0.05841064453125,
0.07440185546875,
-0.00420379638671875,
-0.010650634765625,
-0.00962066650390625,
0.039306640625,
0.0248870849609375,
0.0113372802734375,
0.018280029296875,
0.0576171875,
0.058197021484375,
-0.01180267333984375,
0.0882568359375,
-0.0233917236328125,
0.035369873046875,
0.07476806640625,
-0.016326904296875,
0.05841064453125,
0.0294952392578125,
-0.044525146484375,
0.0218048095703125,
0.041534423828125,
-0.0150909423828125,
0.0239105224609375,
0.0233306884765625,
-0.00330352783203125,
0.007518768310546875,
-0.01214599609375,
-0.048858642578125,
0.0219879150390625,
0.044891357421875,
-0.03717041015625,
-0.007083892822265625,
0.00849151611328125,
0.02099609375,
-0.006641387939453125,
-0.016204833984375,
0.036834716796875,
0.019683837890625,
-0.034515380859375,
0.04150390625,
0.002895355224609375,
0.0494384765625,
-0.055267333984375,
0.0008292198181152344,
-0.007465362548828125,
0.00708770751953125,
-0.020233154296875,
-0.07147216796875,
0.0241241455078125,
0.0027179718017578125,
-0.0202789306640625,
-0.01103973388671875,
0.0306243896484375,
-0.036773681640625,
-0.0557861328125,
0.03729248046875,
0.0288543701171875,
0.0205535888671875,
0.0083770751953125,
-0.066650390625,
0.0099334716796875,
-0.009857177734375,
-0.032440185546875,
0.024566650390625,
0.01523590087890625,
0.00431060791015625,
0.0531005859375,
0.049468994140625,
0.021331787109375,
0.003795623779296875,
0.0069732666015625,
0.07769775390625,
-0.056793212890625,
-0.0109100341796875,
-0.054443359375,
0.0416259765625,
-0.01038360595703125,
-0.03509521484375,
0.06927490234375,
0.0550537109375,
0.062744140625,
0.00457000732421875,
0.070556640625,
-0.00916290283203125,
0.041473388671875,
-0.033599853515625,
0.048248291015625,
-0.048187255859375,
0.00009435415267944336,
-0.0259246826171875,
-0.08038330078125,
-0.033203125,
0.042327880859375,
-0.0443115234375,
0.0224151611328125,
0.04559326171875,
0.058624267578125,
-0.0127410888671875,
-0.00213623046875,
0.0173797607421875,
0.04241943359375,
0.024871826171875,
0.025238037109375,
0.054840087890625,
-0.029052734375,
0.02020263671875,
-0.021636962890625,
-0.01287841796875,
-0.0205841064453125,
-0.06695556640625,
-0.0687255859375,
-0.047637939453125,
-0.0287017822265625,
-0.039154052734375,
-0.0079345703125,
0.07305908203125,
0.05548095703125,
-0.0625,
-0.0333251953125,
0.0047454833984375,
-0.00818634033203125,
0.0113525390625,
-0.0159912109375,
0.031341552734375,
-0.00574493408203125,
-0.046478271484375,
0.02020263671875,
0.00235748291015625,
0.0169219970703125,
-0.03985595703125,
-0.007053375244140625,
-0.038543701171875,
-0.005290985107421875,
0.050537109375,
0.039520263671875,
-0.047088623046875,
-0.0084381103515625,
0.004852294921875,
-0.020050048828125,
0.0009603500366210938,
0.0203094482421875,
-0.01140594482421875,
0.0194854736328125,
0.017242431640625,
0.027862548828125,
0.05242919921875,
-0.01479339599609375,
0.0184173583984375,
-0.051544189453125,
0.0307464599609375,
0.021392822265625,
0.039215087890625,
0.031280517578125,
-0.031707763671875,
0.039398193359375,
0.024505615234375,
-0.04974365234375,
-0.06512451171875,
0.01244354248046875,
-0.07525634765625,
-0.0181884765625,
0.1187744140625,
-0.0126190185546875,
-0.03070068359375,
0.00856781005859375,
-0.00726318359375,
0.016021728515625,
-0.0169219970703125,
0.04119873046875,
0.06671142578125,
0.009429931640625,
-0.0086669921875,
-0.067138671875,
0.0304412841796875,
0.00806427001953125,
-0.06793212890625,
0.0117034912109375,
0.040283203125,
0.037750244140625,
0.0187530517578125,
0.0430908203125,
-0.0249176025390625,
-0.005672454833984375,
-0.0188751220703125,
0.037567138671875,
0.0038166046142578125,
-0.014068603515625,
-0.03082275390625,
-0.0243072509765625,
0.0140838623046875,
0.0144195556640625
]
] |
deepset/roberta-base-squad2 | 2023-09-26T11:36:30.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"safetensors",
"roberta",
"question-answering",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | question-answering | deepset | null | null | deepset/roberta-base-squad2 | 483 | 1,212,516 | transformers | 2022-03-02T23:29:05 | ---
language: en
license: cc-by-4.0
datasets:
- squad_v2
model-index:
- name: deepset/roberta-base-squad2
results:
- task:
type: question-answering
name: Question Answering
dataset:
name: squad_v2
type: squad_v2
config: squad_v2
split: validation
metrics:
- type: exact_match
value: 79.9309
name: Exact Match
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDhhNjg5YzNiZGQ1YTIyYTAwZGUwOWEzZTRiYzdjM2QzYjA3ZTUxNDM1NjE1MTUyMjE1MGY1YzEzMjRjYzVjYiIsInZlcnNpb24iOjF9.EH5JJo8EEFwU7osPz3s7qanw_tigeCFhCXjSfyN0Y1nWVnSfulSxIk_DbAEI5iE80V4EKLyp5-mYFodWvL2KDA
- type: f1
value: 82.9501
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjk5ZDYwOGQyNjNkMWI0OTE4YzRmOTlkY2JjNjQ0YTZkNTMzMzNkYTA0MDFmNmI3NjA3NjNlMjhiMDQ2ZjJjNSIsInZlcnNpb24iOjF9.DDm0LNTkdLbGsue58bg1aH_s67KfbcmkvL-6ZiI2s8IoxhHJMSf29H_uV2YLyevwx900t-MwTVOW3qfFnMMEAQ
- type: total
value: 11869
name: total
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMGFkMmI2ODM0NmY5NGNkNmUxYWViOWYxZDNkY2EzYWFmOWI4N2VhYzY5MGEzMTVhOTU4Zjc4YWViOGNjOWJjMCIsInZlcnNpb24iOjF9.fexrU1icJK5_MiifBtZWkeUvpmFISqBLDXSQJ8E6UnrRof-7cU0s4tX_dIsauHWtUpIHMPZCf5dlMWQKXZuAAA
- task:
type: question-answering
name: Question Answering
dataset:
name: squad
type: squad
config: plain_text
split: validation
metrics:
- type: exact_match
value: 85.289
name: Exact Match
- type: f1
value: 91.841
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: adversarial_qa
type: adversarial_qa
config: adversarialQA
split: validation
metrics:
- type: exact_match
value: 29.500
name: Exact Match
- type: f1
value: 40.367
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squad_adversarial
type: squad_adversarial
config: AddOneSent
split: validation
metrics:
- type: exact_match
value: 78.567
name: Exact Match
- type: f1
value: 84.469
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts amazon
type: squadshifts
config: amazon
split: test
metrics:
- type: exact_match
value: 69.924
name: Exact Match
- type: f1
value: 83.284
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts new_wiki
type: squadshifts
config: new_wiki
split: test
metrics:
- type: exact_match
value: 81.204
name: Exact Match
- type: f1
value: 90.595
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts nyt
type: squadshifts
config: nyt
split: test
metrics:
- type: exact_match
value: 82.931
name: Exact Match
- type: f1
value: 90.756
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts reddit
type: squadshifts
config: reddit
split: test
metrics:
- type: exact_match
value: 71.550
name: Exact Match
- type: f1
value: 82.939
name: F1
---
# roberta-base for QA
This is the [roberta-base](https://huggingface.co/roberta-base) model, fine-tuned using the [SQuAD2.0](https://huggingface.co/datasets/squad_v2) dataset. It's been trained on question-answer pairs, including unanswerable questions, for the task of Question Answering.
## Overview
**Language model:** roberta-base
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD 2.0
**Eval data:** SQuAD 2.0
**Code:** See [an example QA pipeline on Haystack](https://haystack.deepset.ai/tutorials/first-qa-system)
**Infrastructure**: 4x Tesla v100
## Hyperparameters
```
batch_size = 96
n_epochs = 2
base_LM_model = "roberta-base"
max_seq_len = 386
learning_rate = 3e-5
lr_schedule = LinearWarmup
warmup_proportion = 0.2
doc_stride = 128
max_query_length = 64
```
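The `doc_stride` and `max_seq_len` settings above control how contexts longer than one model input are split into overlapping windows. As an illustration only (this is not deepset's preprocessing code, and the exact accounting of query and special tokens is an assumption), here is a minimal sketch following the original BERT SQuAD convention, where each new window starts `doc_stride` tokens after the previous one:

```python
# Illustrative sketch (hypothetical helper, not deepset's actual code): split a
# long tokenized context into overlapping windows, as controlled by the
# max_seq_len and doc_stride hyperparameters above. Integers stand in for
# token IDs.

def split_into_windows(context_tokens, max_seq_len=386, doc_stride=128,
                       query_len=64, n_special=3):
    """Return (start, end) spans of context tokens, one per window.

    Each window reserves room for the query and special tokens, so the
    usable context span per window is max_seq_len - query_len - n_special.
    Consecutive windows start doc_stride tokens apart, so they overlap and
    an answer near a window boundary is fully contained in some window.
    """
    span = max_seq_len - query_len - n_special  # context tokens per window
    windows = []
    start = 0
    while True:
        end = min(start + span, len(context_tokens))
        windows.append((start, end))
        if end == len(context_tokens):
            break
        start += doc_stride
    return windows

# A 600-token context with the card's settings yields 4 overlapping windows:
spans = split_into_windows(list(range(600)))
```

With `max_seq_len = 386`, `query_len = 64`, and three special tokens, each window covers 319 context tokens, and consecutive windows overlap by 191 tokens.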
## Using a distilled model instead
Please note that we have also released a distilled version of this model called [deepset/tinyroberta-squad2](https://huggingface.co/deepset/tinyroberta-squad2). The distilled model has comparable prediction quality and runs at twice the speed of the base model.
## Usage
### In Haystack
Haystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in [Haystack](https://github.com/deepset-ai/haystack/):
```python
# in Haystack 1.x, the reader classes live in haystack.nodes
from haystack.nodes import FARMReader, TransformersReader

reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")
# or
reader = TransformersReader(model_name_or_path="deepset/roberta-base-squad2", tokenizer="deepset/roberta-base-squad2")
```
For a complete example of ``roberta-base-squad2`` being used for Question Answering, check out the [Tutorials in Haystack Documentation](https://haystack.deepset.ai/tutorials/first-qa-system).
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "deepset/roberta-base-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Why is model conversion important?',
'context': 'The option to convert models between FARM and transformers gives freedom to the user and let people easily switch between frameworks.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Performance
Evaluated on the SQuAD 2.0 dev set with the [official eval script](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/).
```
"exact": 79.87029394424324,
"f1": 82.91251169582613,
"total": 11873,
"HasAns_exact": 77.93522267206478,
"HasAns_f1": 84.02838248389763,
"HasAns_total": 5928,
"NoAns_exact": 81.79983179142137,
"NoAns_f1": 81.79983179142137,
"NoAns_total": 5945
```
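The exact-match and F1 numbers above come from the official SQuAD 2.0 evaluation logic: answers are compared after light normalization, and F1 is computed over token overlap. As a self-contained illustration (this mirrors the metric definitions but is not the official script itself), a minimal sketch:

```python
# Illustrative sketch of SQuAD-style metrics: exact match after light
# normalization (lowercase, strip punctuation and articles, collapse
# whitespace) and token-overlap F1. This mirrors the logic of the official
# eval script linked above, but is not that script.
import re
import string
from collections import Counter

def normalize(text):
    """Lowercase, strip punctuation and articles, collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction, gold):
    """1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(prediction) == normalize(gold))

def f1_score(prediction, gold):
    """Harmonic mean of token-level precision and recall."""
    pred_tokens = normalize(prediction).split()
    gold_tokens = normalize(gold).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)
```

In the full benchmark, each prediction is scored against every gold answer and the best score is kept; for unanswerable questions, the gold answer is the empty string.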
## Authors
**Branden Chan:** branden.chan@deepset.ai
**Timo Möller:** timo.moeller@deepset.ai
**Malte Pietsch:** malte.pietsch@deepset.ai
**Tanay Soni:** tanay.soni@deepset.ai
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems for question answering, summarization, ranking, and more.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
| 8,387 | [
[
-0.0306243896484375,
-0.04876708984375,
0.0306243896484375,
0.00460052490234375,
-0.0031681060791015625,
0.00795745849609375,
-0.007904052734375,
-0.0284271240234375,
0.0225372314453125,
0.0218505859375,
-0.0633544921875,
-0.050048828125,
-0.0207366943359375,
0.006107330322265625,
-0.024261474609375,
0.07257080078125,
0.013946533203125,
-0.0008544921875,
-0.0167999267578125,
-0.0010814666748046875,
-0.03704833984375,
-0.0350341796875,
-0.05230712890625,
-0.01050567626953125,
0.0167999267578125,
0.0255279541015625,
0.0479736328125,
0.0226898193359375,
0.040130615234375,
0.0245208740234375,
-0.006923675537109375,
0.013580322265625,
-0.0333251953125,
0.0184783935546875,
-0.00409698486328125,
-0.030029296875,
-0.032073974609375,
-0.0045318603515625,
0.037750244140625,
0.029998779296875,
-0.011474609375,
0.039520263671875,
-0.011322021484375,
0.0556640625,
-0.039703369140625,
0.0078582763671875,
-0.05096435546875,
-0.0190582275390625,
0.01444244384765625,
0.01523590087890625,
-0.01016998291015625,
-0.0121307373046875,
0.0149993896484375,
-0.04632568359375,
0.0252227783203125,
-0.0124359130859375,
0.08795166015625,
0.021759033203125,
-0.006824493408203125,
-0.01141357421875,
-0.0345458984375,
0.06683349609375,
-0.07952880859375,
-0.00116729736328125,
0.04327392578125,
0.03155517578125,
0.00879669189453125,
-0.0631103515625,
-0.048095703125,
0.0025424957275390625,
-0.0246429443359375,
0.0175018310546875,
-0.0122833251953125,
-0.0240020751953125,
0.00814056396484375,
0.02508544921875,
-0.057769775390625,
0.005359649658203125,
-0.038787841796875,
0.0011205673217773438,
0.06707763671875,
0.017547607421875,
0.0178680419921875,
-0.0237579345703125,
-0.022613525390625,
-0.020233154296875,
-0.032806396484375,
0.0164947509765625,
0.0088043212890625,
0.03131103515625,
-0.021270751953125,
0.03485107421875,
-0.032623291015625,
0.04248046875,
0.01873779296875,
0.0263824462890625,
0.033966064453125,
-0.054595947265625,
-0.018768310546875,
-0.017547607421875,
0.07318115234375,
0.03131103515625,
0.0007262229919433594,
0.00040268898010253906,
-0.0218505859375,
-0.01324462890625,
0.0172882080078125,
-0.066162109375,
-0.012298583984375,
0.03973388671875,
-0.024261474609375,
-0.032806396484375,
0.007648468017578125,
-0.057891845703125,
-0.0231170654296875,
0.005069732666015625,
0.0362548828125,
-0.030670166015625,
-0.0323486328125,
0.0218505859375,
-0.0175933837890625,
0.044158935546875,
0.01001739501953125,
-0.059844970703125,
0.0121917724609375,
0.045654296875,
0.058746337890625,
0.0189208984375,
-0.019927978515625,
-0.0291595458984375,
-0.01025390625,
-0.0120849609375,
0.051483154296875,
-0.0220184326171875,
-0.0108489990234375,
-0.0011272430419921875,
0.015838623046875,
-0.00811004638671875,
-0.0272674560546875,
0.01329803466796875,
-0.046417236328125,
0.0408935546875,
-0.01207733154296875,
-0.037261962890625,
-0.0159149169921875,
0.028472900390625,
-0.0511474609375,
0.07940673828125,
0.029449462890625,
-0.03955078125,
0.00960540771484375,
-0.056182861328125,
-0.0192108154296875,
0.006595611572265625,
0.007579803466796875,
-0.033966064453125,
-0.0207061767578125,
0.02777099609375,
0.032928466796875,
-0.0240020751953125,
0.00882720947265625,
-0.021697998046875,
-0.029449462890625,
0.0196685791015625,
0.0006451606750488281,
0.09283447265625,
0.007625579833984375,
-0.0302276611328125,
-0.0021820068359375,
-0.0516357421875,
0.029754638671875,
0.01442718505859375,
-0.0144500732421875,
-0.0021991729736328125,
-0.0105438232421875,
0.0069580078125,
0.020965576171875,
0.0416259765625,
-0.0275115966796875,
0.01268768310546875,
-0.046142578125,
0.046600341796875,
0.044189453125,
0.005054473876953125,
0.030426025390625,
-0.028228759765625,
0.05096435546875,
-0.006710052490234375,
0.0095062255859375,
0.005069732666015625,
-0.0256500244140625,
-0.06451416015625,
-0.01218414306640625,
0.033843994140625,
0.050445556640625,
-0.05078125,
0.059844970703125,
-0.01306915283203125,
-0.046630859375,
-0.06085205078125,
0.005672454833984375,
0.029541015625,
0.0275115966796875,
0.040863037109375,
0.0009469985961914062,
-0.059906005859375,
-0.07464599609375,
-0.0031414031982421875,
-0.0159912109375,
-0.0130462646484375,
0.0179443359375,
0.0531005859375,
-0.0235443115234375,
0.064208984375,
-0.048004150390625,
-0.021942138671875,
-0.0176544189453125,
-0.01042938232421875,
0.042022705078125,
0.05267333984375,
0.04974365234375,
-0.06207275390625,
-0.041839599609375,
-0.0170745849609375,
-0.056121826171875,
0.019683837890625,
-0.0059051513671875,
-0.022979736328125,
0.0099029541015625,
0.0262451171875,
-0.05908203125,
0.02471923828125,
0.03582763671875,
-0.04388427734375,
0.032257080078125,
0.003787994384765625,
0.00966644287109375,
-0.111572265625,
0.0228424072265625,
-0.0014905929565429688,
-0.0182952880859375,
-0.03607177734375,
0.0203704833984375,
-0.01458740234375,
-0.005634307861328125,
-0.03363037109375,
0.043975830078125,
-0.030242919921875,
0.0059051513671875,
0.01316070556640625,
0.008331298828125,
0.0193328857421875,
0.037384033203125,
-0.0174560546875,
0.0821533203125,
0.047515869140625,
-0.035400390625,
0.043426513671875,
0.045562744140625,
-0.03314208984375,
0.0244140625,
-0.072265625,
0.0146942138671875,
0.0039825439453125,
0.015655517578125,
-0.0765380859375,
-0.01654052734375,
0.01280975341796875,
-0.057037353515625,
0.0081329345703125,
-0.01438140869140625,
-0.0452880859375,
-0.0305938720703125,
-0.03692626953125,
0.02001953125,
0.060089111328125,
-0.022369384765625,
0.0212554931640625,
0.0305938720703125,
-0.0032329559326171875,
-0.044891357421875,
-0.06591796875,
0.0008778572082519531,
-0.00846099853515625,
-0.04962158203125,
0.0199737548828125,
-0.01036834716796875,
-0.012725830078125,
0.01103973388671875,
0.0024623870849609375,
-0.03826904296875,
0.016632080078125,
0.0126953125,
0.032989501953125,
-0.03240966796875,
0.0214691162109375,
-0.0163726806640625,
-0.01094818115234375,
-0.0024929046630859375,
-0.022918701171875,
0.04998779296875,
-0.052978515625,
0.0011091232299804688,
-0.045013427734375,
0.0282745361328125,
0.038238525390625,
-0.0306243896484375,
0.06414794921875,
0.047515869140625,
-0.0270843505859375,
-0.0033435821533203125,
-0.0377197265625,
-0.019012451171875,
-0.035552978515625,
0.033447265625,
-0.0191497802734375,
-0.059906005859375,
0.048095703125,
0.0269317626953125,
0.00980377197265625,
0.07476806640625,
0.034027099609375,
-0.034271240234375,
0.07427978515625,
0.033538818359375,
-0.0003495216369628906,
0.026580810546875,
-0.06500244140625,
0.0025806427001953125,
-0.072265625,
-0.01666259765625,
-0.043792724609375,
-0.03570556640625,
-0.047149658203125,
-0.029998779296875,
0.0181121826171875,
0.01337432861328125,
-0.034942626953125,
0.03973388671875,
-0.05523681640625,
0.036285400390625,
0.051544189453125,
0.01398468017578125,
0.00452423095703125,
-0.00665283203125,
0.0183563232421875,
0.0186767578125,
-0.053985595703125,
-0.035308837890625,
0.0838623046875,
0.006084442138671875,
0.036529541015625,
0.01346588134765625,
0.06317138671875,
0.0172119140625,
-0.0160369873046875,
-0.04443359375,
0.040069580078125,
-0.00946044921875,
-0.0728759765625,
-0.0447998046875,
-0.028076171875,
-0.0777587890625,
0.0016374588012695312,
-0.018096923828125,
-0.04583740234375,
0.022369384765625,
-0.002109527587890625,
-0.049957275390625,
0.01390838623046875,
-0.052001953125,
0.06878662109375,
-0.01013946533203125,
-0.0141754150390625,
-0.01214599609375,
-0.059722900390625,
0.0154571533203125,
0.0125274658203125,
0.0011835098266601562,
-0.0132293701171875,
-0.0029430389404296875,
0.05230712890625,
-0.047515869140625,
0.0648193359375,
-0.01067352294921875,
0.004547119140625,
0.036163330078125,
-0.00045490264892578125,
0.0305938720703125,
0.0241851806640625,
-0.028839111328125,
0.0169677734375,
0.030670166015625,
-0.045654296875,
-0.042694091796875,
0.05181884765625,
-0.06549072265625,
-0.03289794921875,
-0.034637451171875,
-0.030914306640625,
-0.0069732666015625,
0.0276031494140625,
0.0182647705078125,
0.0274810791015625,
-0.010528564453125,
0.039764404296875,
0.04400634765625,
-0.00672149658203125,
0.0303802490234375,
0.032684326171875,
-0.00949859619140625,
-0.0283203125,
0.055206298828125,
-0.00557708740234375,
0.01611328125,
0.031097412109375,
0.004924774169921875,
-0.034912109375,
-0.035369873046875,
-0.040374755859375,
0.0154571533203125,
-0.0400390625,
-0.032989501953125,
-0.038604736328125,
-0.036468505859375,
-0.0535888671875,
-0.0036640167236328125,
-0.0280914306640625,
-0.046600341796875,
-0.04022216796875,
-0.0081329345703125,
0.052642822265625,
0.040496826171875,
-0.0013742446899414062,
0.01381683349609375,
-0.04608154296875,
0.024658203125,
0.03277587890625,
0.029022216796875,
-0.01203155517578125,
-0.040496826171875,
-0.0171051025390625,
0.038818359375,
0.0009813308715820312,
-0.047821044921875,
0.0209808349609375,
0.01319122314453125,
0.0211639404296875,
-0.00908660888671875,
0.01125335693359375,
0.04931640625,
-0.020416259765625,
0.06768798828125,
0.01132965087890625,
-0.0574951171875,
0.0511474609375,
-0.0282440185546875,
0.03131103515625,
0.08160400390625,
0.01468658447265625,
-0.041290283203125,
-0.019317626953125,
-0.057769775390625,
-0.07220458984375,
0.047698974609375,
0.0243377685546875,
0.01407623291015625,
-0.0009398460388183594,
0.023040771484375,
-0.005619049072265625,
0.02001953125,
-0.038238525390625,
-0.02008056640625,
-0.0180816650390625,
-0.0222320556640625,
-0.0021076202392578125,
-0.01027679443359375,
-0.0128631591796875,
-0.029693603515625,
0.06988525390625,
-0.003932952880859375,
0.01296234130859375,
0.023773193359375,
-0.01418304443359375,
0.0129852294921875,
0.01110076904296875,
0.034515380859375,
0.06256103515625,
-0.0294952392578125,
-0.0155487060546875,
0.01556396484375,
-0.0208587646484375,
0.001434326171875,
0.0156097412109375,
-0.0391845703125,
0.00795745849609375,
0.0303497314453125,
0.057647705078125,
0.005947113037109375,
-0.046630859375,
0.045867919921875,
-0.0082244873046875,
-0.030517578125,
-0.04364013671875,
0.01428985595703125,
0.0196075439453125,
0.0305938720703125,
0.0316162109375,
-0.00510406494140625,
0.01053619384765625,
-0.03765869140625,
0.01300048828125,
0.041015625,
-0.0301971435546875,
-0.007617950439453125,
0.03399658203125,
0.0211639404296875,
-0.034271240234375,
0.054595947265625,
-0.01708984375,
-0.0430908203125,
0.0743408203125,
0.0190277099609375,
0.072998046875,
0.0156402587890625,
0.0299224853515625,
0.0467529296875,
0.0216064453125,
0.005428314208984375,
0.0217437744140625,
0.00823211669921875,
-0.039794921875,
-0.019927978515625,
-0.0535888671875,
-0.00868988037109375,
0.024871826171875,
-0.0579833984375,
0.012725830078125,
-0.041717529296875,
-0.01232147216796875,
0.0007171630859375,
0.02520751953125,
-0.07037353515625,
0.0172576904296875,
-0.01497650146484375,
0.06280517578125,
-0.03753662109375,
0.0305023193359375,
0.0623779296875,
-0.059295654296875,
-0.06640625,
-0.006988525390625,
-0.019683837890625,
-0.07421875,
0.03253173828125,
0.01180267333984375,
-0.00572967529296875,
0.0219879150390625,
-0.061431884765625,
-0.07476806640625,
0.0992431640625,
-0.0009450912475585938,
-0.031585693359375,
-0.01910400390625,
-0.005947113037109375,
0.044158935546875,
-0.0245361328125,
0.0238494873046875,
0.03887939453125,
0.034576416015625,
-0.0003292560577392578,
-0.0623779296875,
0.022796630859375,
-0.0350341796875,
0.0006351470947265625,
-0.001983642578125,
-0.06402587890625,
0.06134033203125,
-0.01558685302734375,
-0.015106201171875,
0.0273284912109375,
0.03582763671875,
0.01611328125,
0.007282257080078125,
0.0352783203125,
0.041534423828125,
0.056365966796875,
-0.0023632049560546875,
0.0733642578125,
-0.0166778564453125,
0.056121826171875,
0.08831787109375,
-0.007556915283203125,
0.0704345703125,
0.0291900634765625,
-0.03277587890625,
0.060943603515625,
0.04937744140625,
-0.026702880859375,
0.033355712890625,
0.01053619384765625,
-0.001628875732421875,
-0.0268096923828125,
0.005016326904296875,
-0.0557861328125,
0.038238525390625,
0.004573822021484375,
-0.0173187255859375,
-0.0153961181640625,
-0.0300445556640625,
-0.01346588134765625,
0.0023097991943359375,
-0.0024566650390625,
0.06671142578125,
-0.0079345703125,
-0.0401611328125,
0.0736083984375,
-0.01235198974609375,
0.05316162109375,
-0.04962158203125,
-0.0016689300537109375,
-0.0211944580078125,
0.0079345703125,
-0.014556884765625,
-0.0670166015625,
0.0090789794921875,
-0.00009292364120483398,
-0.037017822265625,
-0.01009368896484375,
0.040130615234375,
-0.03509521484375,
-0.066162109375,
0.002288818359375,
0.042510986328125,
0.01447296142578125,
-0.0016794204711914062,
-0.0732421875,
-0.0165863037109375,
0.0006265640258789062,
-0.0239105224609375,
0.0111083984375,
0.02752685546875,
0.024169921875,
0.045318603515625,
0.05889892578125,
0.00260162353515625,
-0.002880096435546875,
-0.0026092529296875,
0.06683349609375,
-0.05340576171875,
-0.0301666259765625,
-0.057037353515625,
0.055450439453125,
-0.0266876220703125,
-0.038421630859375,
0.04998779296875,
0.05230712890625,
0.060943603515625,
-0.0130615234375,
0.06060791015625,
-0.0206298828125,
0.048248291015625,
-0.03460693359375,
0.07275390625,
-0.06219482421875,
0.00901031494140625,
0.0023822784423828125,
-0.047088623046875,
-0.007617950439453125,
0.0555419921875,
-0.007419586181640625,
0.01174163818359375,
0.050628662109375,
0.06427001953125,
0.005954742431640625,
-0.0232391357421875,
-0.0010852813720703125,
0.0240020751953125,
0.01397705078125,
0.06329345703125,
0.05316162109375,
-0.0606689453125,
0.046630859375,
-0.0264892578125,
-0.0022335052490234375,
-0.02166748046875,
-0.043914794921875,
-0.0635986328125,
-0.048004150390625,
-0.018585205078125,
-0.05224609375,
0.0050201416015625,
0.061187744140625,
0.06268310546875,
-0.07147216796875,
-0.01419830322265625,
-0.00769805908203125,
0.01654052734375,
-0.0204925537109375,
-0.0238494873046875,
0.03155517578125,
-0.0227813720703125,
-0.046661376953125,
0.026092529296875,
-0.0057373046875,
-0.0001938343048095703,
-0.0201263427734375,
0.0012750625610351562,
-0.053741455078125,
-0.0133056640625,
0.0294036865234375,
0.0264892578125,
-0.04791259765625,
-0.0092620849609375,
0.0139007568359375,
-0.0209808349609375,
0.0026264190673828125,
0.028106689453125,
-0.06793212890625,
0.01526641845703125,
0.04473876953125,
0.054962158203125,
0.04083251953125,
0.0075531005859375,
0.0380859375,
-0.0469970703125,
0.00998687744140625,
0.0382080078125,
0.01285552978515625,
0.02423095703125,
-0.041412353515625,
0.05633544921875,
0.006183624267578125,
-0.0330810546875,
-0.06689453125,
0.00034999847412109375,
-0.06805419921875,
-0.0305023193359375,
0.091796875,
-0.001674652099609375,
-0.02069091796875,
0.01273345947265625,
-0.01027679443359375,
0.01502227783203125,
-0.034515380859375,
0.0521240234375,
0.051727294921875,
0.01189422607421875,
0.00650787353515625,
-0.04400634765625,
0.03564453125,
0.03955078125,
-0.06378173828125,
-0.00574493408203125,
0.0330810546875,
0.02655029296875,
0.0141754150390625,
0.045074462890625,
0.0108489990234375,
0.033355712890625,
-0.0087432861328125,
0.006256103515625,
-0.01381683349609375,
-0.0047607421875,
-0.0292816162109375,
-0.0055999755859375,
-0.0221405029296875,
-0.0323486328125
]
] |
martin-ha/toxic-comment-model | 2022-05-06T02:24:31.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"en",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | martin-ha | null | null | martin-ha/toxic-comment-model | 29 | 1,196,172 | transformers | 2022-03-02T23:29:05 | ---
language: en
---
## Model description
This model is a fine-tuned version of the [DistilBERT model](https://huggingface.co/transformers/model_doc/distilbert.html) to classify toxic comments.
## How to use
You can use the model with the following code.
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TextClassificationPipeline
model_path = "martin-ha/toxic-comment-model"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSequenceClassification.from_pretrained(model_path)
pipeline = TextClassificationPipeline(model=model, tokenizer=tokenizer)
print(pipeline('This is a test text.'))
```
## Limitations and Bias
This model is intended to be used for classifying toxic online comments. However, one limitation of the model is that it performs poorly on some comments that mention a specific identity subgroup, such as Muslim. The following table shows evaluation scores for different identity groups. You can learn the specific meaning of these metrics [here](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification/overview/evaluation). In short, these metrics show how well the model performs for a specific group: the larger the number, the better.
| **subgroup** | **subgroup_size** | **subgroup_auc** | **bpsn_auc** | **bnsp_auc** |
| ----------------------------- | ----------------- | ---------------- | ------------ | ------------ |
| muslim | 108 | 0.689 | 0.811 | 0.88 |
| jewish | 40 | 0.749 | 0.86 | 0.825 |
| homosexual_gay_or_lesbian | 56 | 0.795 | 0.706 | 0.972 |
| black | 84 | 0.866 | 0.758 | 0.975 |
| white | 112 | 0.876 | 0.784 | 0.97 |
| female | 306 | 0.898 | 0.887 | 0.948 |
| christian | 231 | 0.904 | 0.917 | 0.93 |
| male | 225 | 0.922 | 0.862 | 0.967 |
| psychiatric_or_mental_illness | 26 | 0.924 | 0.907 | 0.95 |
The table above shows that the model performs poorly for the Muslim and Jewish groups. In fact, if you pass the sentence "Muslims are people who follow or practice Islam, an Abrahamic monotheistic religion." into the model, it will classify it as toxic. Be mindful of this type of potential bias.
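A minimal sketch of how the three AUC variants above can be computed with `scikit-learn`; the labels, scores, and subgroup mask below are made up purely for illustration:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def subgroup_auc(y_true, y_score, in_subgroup):
    # AUC restricted to comments that mention the identity subgroup
    return roc_auc_score(y_true[in_subgroup], y_score[in_subgroup])

def bpsn_auc(y_true, y_score, in_subgroup):
    # Background Positive, Subgroup Negative:
    # non-toxic comments in the subgroup vs. toxic comments outside it
    mask = (in_subgroup & (y_true == 0)) | (~in_subgroup & (y_true == 1))
    return roc_auc_score(y_true[mask], y_score[mask])

def bnsp_auc(y_true, y_score, in_subgroup):
    # Background Negative, Subgroup Positive:
    # toxic comments in the subgroup vs. non-toxic comments outside it
    mask = (in_subgroup & (y_true == 1)) | (~in_subgroup & (y_true == 0))
    return roc_auc_score(y_true[mask], y_score[mask])

# toy data: 1 = toxic, scores are hypothetical model probabilities,
# in_subgroup marks comments mentioning the identity group
y_true = np.array([1, 0, 1, 0, 1, 0])
y_score = np.array([0.9, 0.2, 0.4, 0.6, 0.8, 0.1])
in_subgroup = np.array([True, True, False, False, True, False])

print(subgroup_auc(y_true, y_score, in_subgroup))
print(bpsn_auc(y_true, y_score, in_subgroup))
print(bnsp_auc(y_true, y_score, in_subgroup))
```

A low BPSN AUC (as for the homosexual_gay_or_lesbian group above) suggests the model confuses non-toxic mentions of the identity with toxic background comments, which matches the bias pattern described in this section.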
## Training data
The training data comes from this [Kaggle competition](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification/data). We use 10% of the `train.csv` data to train the model.
## Training procedure
You can see [this documentation and code](https://github.com/MSIA/wenyang_pan_nlp_project_2021) for how we trained the model. Training takes about 3 hours on a P100 GPU.
## Evaluation results
The model achieves 94% accuracy and a 0.59 F1-score on a 10,000-row held-out test set. | 3,184 | [
[
-0.028076171875,
-0.033935546875,
0.013397216796875,
0.0095672607421875,
-0.01136016845703125,
-0.007282257080078125,
0.0025634765625,
-0.0287933349609375,
0.0013036727905273438,
0.016815185546875,
-0.03814697265625,
-0.053497314453125,
-0.06549072265625,
0.01224517822265625,
-0.02471923828125,
0.1094970703125,
0.0216827392578125,
0.01666259765625,
0.0149383544921875,
-0.01345062255859375,
-0.002532958984375,
-0.033935546875,
-0.050048828125,
-0.022064208984375,
0.041046142578125,
0.0113067626953125,
0.05926513671875,
0.04022216796875,
0.032073974609375,
0.0235137939453125,
-0.040252685546875,
-0.01151275634765625,
-0.0382080078125,
-0.0224609375,
-0.01502227783203125,
-0.035552978515625,
-0.03350830078125,
0.020721435546875,
0.01013946533203125,
0.0296478271484375,
-0.01200103759765625,
0.0380859375,
0.008819580078125,
0.0297088623046875,
-0.036956787109375,
0.00881195068359375,
-0.036712646484375,
0.0295257568359375,
0.0025806427001953125,
-0.0012607574462890625,
-0.0309295654296875,
-0.027679443359375,
0.0212249755859375,
-0.034912109375,
0.0139617919921875,
-0.003917694091796875,
0.06988525390625,
0.030120849609375,
-0.0487060546875,
-0.0158233642578125,
-0.03692626953125,
0.056671142578125,
-0.07958984375,
-0.00672149658203125,
0.034210205078125,
0.004489898681640625,
-0.003002166748046875,
-0.048492431640625,
-0.04315185546875,
-0.007740020751953125,
-0.032501220703125,
0.016937255859375,
0.004711151123046875,
0.0006699562072753906,
0.05621337890625,
0.04132080078125,
-0.040740966796875,
-0.003330230712890625,
-0.031890869140625,
-0.0188140869140625,
0.056365966796875,
0.0297698974609375,
0.020416259765625,
-0.039794921875,
-0.0269775390625,
-0.00414276123046875,
-0.0095062255859375,
0.0159454345703125,
0.01287078857421875,
0.0140380859375,
-0.007144927978515625,
0.0252685546875,
-0.0169830322265625,
0.0380859375,
0.00933837890625,
-0.014373779296875,
0.035919189453125,
-0.0206298828125,
-0.01079559326171875,
-0.00409698486328125,
0.07379150390625,
0.0433349609375,
0.01430511474609375,
0.0204315185546875,
-0.0007429122924804688,
0.0255279541015625,
0.0165557861328125,
-0.088623046875,
-0.044891357421875,
0.00925445556640625,
-0.05926513671875,
-0.047698974609375,
-0.006103515625,
-0.0675048828125,
-0.0164337158203125,
-0.01050567626953125,
0.0396728515625,
-0.01544952392578125,
-0.0303802490234375,
-0.00408935546875,
-0.01175689697265625,
0.0025920867919921875,
0.01549530029296875,
-0.061279296875,
0.0185394287109375,
0.01232147216796875,
0.0654296875,
-0.0085601806640625,
-0.00567626953125,
-0.014617919921875,
-0.0032596588134765625,
-0.0018787384033203125,
0.034576416015625,
-0.046173095703125,
-0.024139404296875,
-0.0130157470703125,
0.01202392578125,
0.002132415771484375,
-0.03839111328125,
0.048797607421875,
-0.02789306640625,
0.039306640625,
-0.029510498046875,
-0.034149169921875,
-0.022918701171875,
0.014984130859375,
-0.040618896484375,
0.078369140625,
0.03411865234375,
-0.09454345703125,
0.037506103515625,
-0.056060791015625,
-0.00891876220703125,
-0.006183624267578125,
0.007717132568359375,
-0.05413818359375,
-0.0190887451171875,
0.0010280609130859375,
0.0218658447265625,
-0.002574920654296875,
0.017486572265625,
-0.046356201171875,
-0.042388916015625,
0.0139007568359375,
-0.019012451171875,
0.1021728515625,
0.0391845703125,
-0.04266357421875,
0.006160736083984375,
-0.057830810546875,
0.00606536865234375,
0.006565093994140625,
-0.030670166015625,
-0.021697998046875,
0.00946044921875,
0.024566650390625,
0.050018310546875,
-0.0017547607421875,
-0.037139892578125,
0.00922393798828125,
-0.019195556640625,
0.0380859375,
0.062408447265625,
0.01016998291015625,
0.0153961181640625,
-0.03399658203125,
0.019500732421875,
0.0201263427734375,
0.041107177734375,
0.01238250732421875,
-0.0343017578125,
-0.07159423828125,
-0.007274627685546875,
0.0161285400390625,
0.040863037109375,
-0.035430908203125,
0.052337646484375,
0.00518798828125,
-0.052825927734375,
-0.017181396484375,
0.0082550048828125,
0.034332275390625,
0.04266357421875,
0.030548095703125,
-0.03594970703125,
-0.032989501953125,
-0.07098388671875,
-0.01690673828125,
-0.025238037109375,
-0.00603485107421875,
0.0153656005859375,
0.060089111328125,
-0.0338134765625,
0.047698974609375,
-0.047698974609375,
-0.03179931640625,
0.01398468017578125,
0.0182037353515625,
0.032440185546875,
0.029327392578125,
0.04840087890625,
-0.05401611328125,
-0.0675048828125,
-0.0085906982421875,
-0.04827880859375,
-0.011505126953125,
0.0206756591796875,
-0.019805908203125,
-0.0020275115966796875,
0.0176849365234375,
-0.020599365234375,
0.04705810546875,
0.03192138671875,
-0.040496826171875,
0.04296875,
0.0112457275390625,
0.0004875659942626953,
-0.0867919921875,
0.0184478759765625,
0.0158538818359375,
-0.0097198486328125,
-0.05010986328125,
-0.0002796649932861328,
0.0029773712158203125,
0.0085906982421875,
-0.039306640625,
0.0225830078125,
-0.0121612548828125,
0.02813720703125,
-0.015655517578125,
-0.021453857421875,
0.0004558563232421875,
0.06329345703125,
0.0113983154296875,
0.043182373046875,
0.035064697265625,
-0.054718017578125,
0.016937255859375,
0.025421142578125,
-0.0290679931640625,
0.03546142578125,
-0.038421630859375,
0.002079010009765625,
-0.00785064697265625,
0.0270538330078125,
-0.06280517578125,
-0.024658203125,
0.031280517578125,
-0.0285797119140625,
0.00905609130859375,
-0.00487518310546875,
-0.028839111328125,
-0.050018310546875,
-0.0211639404296875,
0.033233642578125,
0.04205322265625,
-0.0299224853515625,
0.036712646484375,
0.034881591796875,
0.00047135353088378906,
-0.055755615234375,
-0.0467529296875,
-0.032196044921875,
-0.039031982421875,
-0.0193328857421875,
0.012603759765625,
-0.0166015625,
-0.0035343170166015625,
0.004039764404296875,
-0.011444091796875,
-0.0159149169921875,
0.0028858184814453125,
0.01488494873046875,
0.018310546875,
0.0040740966796875,
0.004177093505859375,
0.0002624988555908203,
-0.005458831787109375,
0.0312042236328125,
0.02935791015625,
0.033416748046875,
-0.01519012451171875,
-0.00661468505859375,
-0.038665771484375,
0.0186309814453125,
0.036041259765625,
-0.01367950439453125,
0.05047607421875,
0.045166015625,
-0.0238037109375,
-0.0140533447265625,
-0.02288818359375,
-0.018218994140625,
-0.03900146484375,
0.047454833984375,
0.0012760162353515625,
-0.04840087890625,
0.060760498046875,
0.0012722015380859375,
0.016448974609375,
0.06317138671875,
0.0535888671875,
-0.0012416839599609375,
0.10986328125,
0.03155517578125,
-0.007171630859375,
0.033538818359375,
-0.025482177734375,
0.00649261474609375,
-0.053741455078125,
-0.0241241455078125,
-0.0289459228515625,
-0.0292510986328125,
-0.054718017578125,
-0.0158843994140625,
0.033843994140625,
-0.016357421875,
-0.0654296875,
0.0206298828125,
-0.068115234375,
0.03619384765625,
0.031463623046875,
0.0258331298828125,
0.0203094482421875,
-0.0033245086669921875,
-0.0164031982421875,
-0.01320648193359375,
-0.0538330078125,
-0.034912109375,
0.07916259765625,
0.043212890625,
0.058197021484375,
0.00788116455078125,
0.0308380126953125,
0.0174713134765625,
0.07080078125,
-0.0496826171875,
0.0299224853515625,
-0.020751953125,
-0.09307861328125,
-0.00673675537109375,
-0.04913330078125,
-0.040802001953125,
0.029449462890625,
-0.01392364501953125,
-0.05084228515625,
0.0177459716796875,
0.004589080810546875,
-0.035186767578125,
0.031646728515625,
-0.048309326171875,
0.06915283203125,
-0.021484375,
-0.0218505859375,
-0.00287628173828125,
-0.05511474609375,
0.040618896484375,
-0.017486572265625,
0.01451873779296875,
-0.0139617919921875,
0.0249786376953125,
0.07440185546875,
-0.036834716796875,
0.06927490234375,
-0.0209808349609375,
0.004886627197265625,
0.019195556640625,
-0.0038127899169921875,
0.0146636962890625,
0.0111236572265625,
-0.0156402587890625,
0.0228271484375,
0.0215606689453125,
-0.01544189453125,
-0.029296875,
0.0478515625,
-0.07861328125,
-0.03375244140625,
-0.0672607421875,
-0.031982421875,
-0.0030498504638671875,
0.022216796875,
0.037933349609375,
0.004039764404296875,
-0.0012664794921875,
0.0032978057861328125,
0.05340576171875,
-0.03717041015625,
0.01788330078125,
0.044830322265625,
-0.031646728515625,
-0.029296875,
0.04449462890625,
0.00909423828125,
0.0286407470703125,
-0.0020751953125,
0.0262908935546875,
-0.0193023681640625,
-0.00846099853515625,
-0.0126190185546875,
0.01380157470703125,
-0.05584716796875,
-0.0235443115234375,
-0.05914306640625,
-0.05096435546875,
-0.01030731201171875,
0.0234222412109375,
-0.018890380859375,
0.0005159378051757812,
-0.0247650146484375,
-0.005901336669921875,
0.02667236328125,
0.060333251953125,
-0.0003540515899658203,
0.034088134765625,
-0.0482177734375,
0.0073089599609375,
0.021270751953125,
0.04925537109375,
0.012847900390625,
-0.06329345703125,
-0.00763702392578125,
0.0133209228515625,
-0.04913330078125,
-0.09033203125,
0.0291748046875,
0.0019168853759765625,
0.0357666015625,
0.058135986328125,
0.0156402587890625,
0.052276611328125,
-0.0170440673828125,
0.04840087890625,
0.015899658203125,
-0.0657958984375,
0.03741455078125,
-0.02862548828125,
0.00897979736328125,
0.051971435546875,
0.053955078125,
-0.050537109375,
-0.03814697265625,
-0.043182373046875,
-0.0516357421875,
0.07818603515625,
0.03216552734375,
0.006282806396484375,
-0.007205963134765625,
0.0236358642578125,
0.0004363059997558594,
-0.00014483928680419922,
-0.10028076171875,
-0.046112060546875,
-0.027679443359375,
-0.0178985595703125,
0.0038967132568359375,
-0.0227203369140625,
-0.0201263427734375,
-0.06292724609375,
0.064453125,
0.009033203125,
0.0084686279296875,
0.01123809814453125,
0.005458831787109375,
-0.0176239013671875,
0.006511688232421875,
0.0216064453125,
0.04168701171875,
-0.035797119140625,
-0.0004391670227050781,
0.028106689453125,
-0.037811279296875,
0.020416259765625,
-0.0042572021484375,
-0.01332855224609375,
0.00293731689453125,
0.006191253662109375,
0.047454833984375,
-0.00766754150390625,
-0.0199432373046875,
0.043731689453125,
-0.020538330078125,
-0.0269775390625,
-0.037445068359375,
0.01800537109375,
-0.004627227783203125,
0.005359649658203125,
0.0117034912109375,
0.01297760009765625,
0.0288543701171875,
-0.0438232421875,
0.0189208984375,
0.027130126953125,
-0.0308685302734375,
-0.011138916015625,
0.0667724609375,
0.01226043701171875,
-0.012786865234375,
0.0521240234375,
-0.019012451171875,
-0.059356689453125,
0.049652099609375,
0.0276641845703125,
0.047210693359375,
-0.04241943359375,
0.025665283203125,
0.070556640625,
0.0197601318359375,
0.0023365020751953125,
0.0106353759765625,
0.0038051605224609375,
-0.034210205078125,
-0.025360107421875,
-0.07196044921875,
-0.0100555419921875,
0.0379638671875,
-0.0631103515625,
0.0177001953125,
-0.040313720703125,
-0.037933349609375,
0.00644683837890625,
0.0054473876953125,
-0.0430908203125,
0.0286407470703125,
0.0086212158203125,
0.06524658203125,
-0.0970458984375,
0.04449462890625,
0.03948974609375,
-0.03887939453125,
-0.07366943359375,
-0.0095367431640625,
-0.0035552978515625,
-0.0305633544921875,
0.04498291015625,
0.034088134765625,
0.02532958984375,
-0.0214385986328125,
-0.0394287109375,
-0.07659912109375,
0.06427001953125,
-0.002838134765625,
-0.043121337890625,
-0.00823211669921875,
0.019805908203125,
0.065185546875,
-0.0030651092529296875,
0.051971435546875,
0.036163330078125,
0.0283966064453125,
0.00836181640625,
-0.06640625,
0.013427734375,
-0.052276611328125,
0.006130218505859375,
0.004383087158203125,
-0.0635986328125,
0.078125,
0.003841400146484375,
-0.007236480712890625,
-0.00879669189453125,
0.0281219482421875,
0.012664794921875,
0.0270233154296875,
0.058837890625,
0.05419921875,
0.038909912109375,
-0.007099151611328125,
0.060150146484375,
-0.0203399658203125,
0.059844970703125,
0.07916259765625,
-0.006191253662109375,
0.041534423828125,
0.0147247314453125,
-0.0309600830078125,
0.054443359375,
0.0762939453125,
-0.018829345703125,
0.045745849609375,
0.0176239013671875,
-0.0208892822265625,
-0.015838623046875,
0.0002378225326538086,
-0.042022705078125,
0.0289154052734375,
0.022796630859375,
-0.05426025390625,
-0.01727294921875,
-0.0172576904296875,
0.0096435546875,
-0.0145263671875,
-0.0214385986328125,
0.05328369140625,
0.008148193359375,
-0.051239013671875,
0.038818359375,
-0.0030059814453125,
0.051055908203125,
-0.041046142578125,
-0.0022563934326171875,
0.00003975629806518555,
0.0309295654296875,
-0.036773681640625,
-0.0513916015625,
0.03271484375,
0.003875732421875,
-0.00865936279296875,
0.0057220458984375,
0.036102294921875,
-0.033843994140625,
-0.048431396484375,
0.031005859375,
0.003170013427734375,
0.00518035888671875,
0.01067352294921875,
-0.07440185546875,
-0.016143798828125,
0.00820159912109375,
-0.0299530029296875,
0.01369476318359375,
0.016510009765625,
0.0017242431640625,
0.042266845703125,
0.04736328125,
-0.0011167526245117188,
0.00042438507080078125,
0.0017862319946289062,
0.08026123046875,
-0.04217529296875,
-0.034820556640625,
-0.07470703125,
0.04022216796875,
-0.022552490234375,
-0.03143310546875,
0.0648193359375,
0.049285888671875,
0.06500244140625,
0.0002887248992919922,
0.049072265625,
-0.012847900390625,
0.037200927734375,
-0.0043487548828125,
0.0804443359375,
-0.047454833984375,
-0.0035800933837890625,
-0.0369873046875,
-0.0755615234375,
-0.017974853515625,
0.07818603515625,
-0.0253143310546875,
0.0261993408203125,
0.04510498046875,
0.0635986328125,
-0.008056640625,
0.0086822509765625,
-0.00768280029296875,
0.036712646484375,
0.027679443359375,
0.0506591796875,
0.0604248046875,
-0.031982421875,
0.03497314453125,
-0.047210693359375,
-0.02777099609375,
-0.00037860870361328125,
-0.05267333984375,
-0.08111572265625,
-0.031707763671875,
-0.0288848876953125,
-0.05023193359375,
-0.0284423828125,
0.060028076171875,
0.04736328125,
-0.06890869140625,
-0.0083465576171875,
-0.00011420249938964844,
-0.0091552734375,
-0.0054168701171875,
-0.02239990234375,
0.0193939208984375,
-0.0022983551025390625,
-0.054351806640625,
-0.035552978515625,
-0.0033721923828125,
-0.003993988037109375,
-0.02655029296875,
-0.0060882568359375,
-0.0200042724609375,
0.0036411285400390625,
0.03717041015625,
-0.000052034854888916016,
-0.04559326171875,
-0.0283355712890625,
-0.0157470703125,
-0.0247344970703125,
0.004268646240234375,
0.01296234130859375,
-0.0295257568359375,
0.03948974609375,
0.0263519287109375,
0.013641357421875,
0.04559326171875,
0.00215911865234375,
0.00940704345703125,
-0.025238037109375,
0.00035953521728515625,
0.0218658447265625,
0.0232696533203125,
0.00457763671875,
-0.0350341796875,
0.038787841796875,
0.01139068603515625,
-0.06011962890625,
-0.0595703125,
0.0122528076171875,
-0.06787109375,
-0.0190277099609375,
0.09368896484375,
-0.009307861328125,
-0.0208892822265625,
-0.012420654296875,
-0.028106689453125,
0.035308837890625,
-0.024017333984375,
0.0860595703125,
0.06109619140625,
-0.0011682510375976562,
0.00885009765625,
-0.038299560546875,
0.057708740234375,
0.03155517578125,
-0.043487548828125,
-0.0018482208251953125,
0.026336669921875,
0.05950927734375,
0.0222015380859375,
0.046783447265625,
-0.0205841064453125,
0.0210723876953125,
-0.00792694091796875,
-0.0010957717895507812,
0.01419830322265625,
0.004009246826171875,
-0.00865936279296875,
-0.004169464111328125,
-0.01186370849609375,
0.0036773681640625
]
] |
google/flan-t5-base | 2023-07-17T12:48:39.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"t5",
"text2text-generation",
"en",
"fr",
"ro",
"de",
"multilingual",
"dataset:svakulenk0/qrecc",
"dataset:taskmaster2",
"dataset:djaym7/wiki_dialog",
"dataset:deepmind/code_contests",
"dataset:lambada",
"dataset:gsm8k",
"dataset:aqua_rat",
"dataset:esnli",
"dataset:quasc",
"dataset:qed",
"arxiv:2210.11416",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/flan-t5-base | 395 | 1,145,305 | transformers | 2022-10-21T10:02:31 | ---
language:
- en
- fr
- ro
- de
- multilingual
tags:
- text2text-generation
widget:
- text: "Translate to German: My name is Arthur"
example_title: "Translation"
- text: "Please answer to the following question. Who is going to be the next Ballon d'or?"
example_title: "Question Answering"
- text: "Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering."
example_title: "Logical reasoning"
- text: "Please answer the following question. What is the boiling point of Nitrogen?"
example_title: "Scientific knowledge"
- text: "Answer the following yes/no question. Can you write a whole Haiku in a single tweet?"
example_title: "Yes/no question"
- text: "Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?"
example_title: "Reasoning task"
- text: "Q: ( False or not False or False ) is? A: Let's think step by step"
example_title: "Boolean Expressions"
- text: "The square root of x is the cube root of y. What is y to the power of 2, if x = 4?"
example_title: "Math reasoning"
- text: "Premise: At my age you will probably have learnt one lesson. Hypothesis: It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis?"
example_title: "Premise and hypothesis"
datasets:
- svakulenk0/qrecc
- taskmaster2
- djaym7/wiki_dialog
- deepmind/code_contests
- lambada
- gsm8k
- aqua_rat
- esnli
- quasc
- qed
license: apache-2.0
---
# Model Card for FLAN-T5 base
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/flan2_architecture.jpg"
alt="drawing" width="600"/>
# Table of Contents
0. [TL;DR](#TL;DR)
1. [Model Details](#model-details)
2. [Usage](#usage)
3. [Uses](#uses)
4. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
5. [Training Details](#training-details)
6. [Evaluation](#evaluation)
7. [Environmental Impact](#environmental-impact)
8. [Citation](#citation)
9. [Model Card Authors](#model-card-authors)
# TL;DR
If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more than 1000 additional tasks and cover more languages.
As mentioned in the first few lines of the abstract:
> Flan-PaLM 540B achieves state-of-the-art performance on several benchmarks, such as 75.2% on five-shot MMLU. We also publicly release Flan-T5 checkpoints,1 which achieve strong few-shot performance even compared to much larger models, such as PaLM 62B. Overall, instruction finetuning is a general method for improving the performance and usability of pretrained language models.
**Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copy-pasted from the [T5 model card](https://huggingface.co/t5-large).
# Model Details
## Model Description
- **Model type:** Language model
- **Language(s) (NLP):** English, Spanish, Japanese, Persian, Hindi, French, Chinese, Bengali, Gujarati, German, Telugu, Italian, Arabic, Polish, Tamil, Marathi, Malayalam, Oriya, Panjabi, Portuguese, Urdu, Galician, Hebrew, Korean, Catalan, Thai, Dutch, Indonesian, Vietnamese, Bulgarian, Filipino, Central Khmer, Lao, Turkish, Russian, Croatian, Swedish, Yoruba, Kurdish, Burmese, Malay, Czech, Finnish, Somali, Tagalog, Swahili, Sinhala, Kannada, Zhuang, Igbo, Xhosa, Romanian, Haitian, Estonian, Slovak, Lithuanian, Greek, Nepali, Assamese, Norwegian
- **License:** Apache 2.0
- **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5)
- **Original Checkpoints:** [All Original FLAN-T5 Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints)
- **Resources for more information:**
- [Research paper](https://arxiv.org/pdf/2210.11416.pdf)
- [GitHub Repo](https://github.com/google-research/t5x)
- [Hugging Face FLAN-T5 Docs (Similar to T5) ](https://huggingface.co/docs/transformers/model_doc/t5)
# Usage
Find below some example scripts on how to use the model in `transformers`:
## Using the Pytorch model
### Running the model on a CPU
<details>
<summary> Click to expand </summary>
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base")
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
### Running the model on a GPU
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto")
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
### Running the model on a GPU using different precisions
#### FP16
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto", torch_dtype=torch.float16)
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
#### INT8
<details>
<summary> Click to expand </summary>
```python
# pip install bitsandbytes accelerate
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base", device_map="auto", load_in_8bit=True)
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
# Uses
## Direct Use and Downstream Use
The authors write in [the original paper's model card](https://arxiv.org/pdf/2210.11416.pdf) that:
> The primary use is research on language models, including: research on zero-shot NLP tasks and in-context few-shot learning NLP tasks, such as reasoning, and question answering; advancing fairness and safety research, and understanding limitations of current large language models
See the [research paper](https://arxiv.org/pdf/2210.11416.pdf) for further details.
## Out-of-Scope Use
More information needed.
# Bias, Risks, and Limitations
The information in this section is copied from the model's [official model card](https://arxiv.org/pdf/2210.11416.pdf):
> Language models, including Flan-T5, can potentially be used for language generation in a harmful way, according to Rae et al. (2021). Flan-T5 should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application.
## Ethical considerations and risks
> Flan-T5 is fine-tuned on a large corpus of text data that was not filtered for explicit content or assessed for existing biases. As a result the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data.
## Known Limitations
> Flan-T5 has not been tested in real world applications.
## Sensitive Use:
> Flan-T5 should not be applied for any unacceptable use cases, e.g., generation of abusive speech.
# Training Details
## Training Data
The model was trained on a mixture of tasks, including those described in the table below (from the original paper, Figure 2):

## Training Procedure
According to the model card from the [original paper](https://arxiv.org/pdf/2210.11416.pdf):
> These models are based on pretrained T5 (Raffel et al., 2020) and fine-tuned with instructions for better zero-shot and few-shot performance. There is one fine-tuned Flan model per T5 model size.
The model has been trained on TPU v3 or TPU v4 pods, using [`t5x`](https://github.com/google-research/t5x) codebase together with [`jax`](https://github.com/google/jax).
# Evaluation
## Testing Data, Factors & Metrics
The authors evaluated the model on various tasks covering several languages (1836 in total). See the table below for some quantitative evaluation:

For full details, please check the [research paper](https://arxiv.org/pdf/2210.11416.pdf).
## Results
For full results for FLAN-T5-Base, see the [research paper](https://arxiv.org/pdf/2210.11416.pdf), Table 3.
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** Google Cloud TPU Pods - TPU v3 or TPU v4 | Number of chips ≥ 4.
- **Hours used:** More information needed
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
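As a rough illustration of the arithmetic behind the calculator, the sketch below multiplies power draw, runtime, and grid carbon intensity. All figures are placeholder assumptions for illustration only, since the hours used and compute region are unreported for Flan-T5.

```python
# Hedged sketch of the carbon-estimation arithmetic used by the
# ML Impact calculator: emissions = power draw x runtime x grid
# carbon intensity. Every number below is an illustrative assumption,
# not an actual figure for Flan-T5 training.
power_kw = 0.283 * 8        # assumed per-chip draw (kW) x assumed chip count
hours = 100.0               # assumed runtime ("More information needed" above)
carbon_intensity = 0.429    # assumed grid intensity, kgCO2eq per kWh
emissions_kg = power_kw * hours * carbon_intensity
print(round(emissions_kg, 1))  # -> 97.1
```

Swapping in the real chip count, runtime, and the carbon intensity of the actual compute region would turn this into a usable estimate.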
# Citation
**BibTeX:**
```bibtex
@misc{https://doi.org/10.48550/arxiv.2210.11416,
doi = {10.48550/ARXIV.2210.11416},
url = {https://arxiv.org/abs/2210.11416},
author = {Chung, Hyung Won and Hou, Le and Longpre, Shayne and Zoph, Barret and Tay, Yi and Fedus, William and Li, Eric and Wang, Xuezhi and Dehghani, Mostafa and Brahma, Siddhartha and Webson, Albert and Gu, Shixiang Shane and Dai, Zhuyun and Suzgun, Mirac and Chen, Xinyun and Chowdhery, Aakanksha and Narang, Sharan and Mishra, Gaurav and Yu, Adams and Zhao, Vincent and Huang, Yanping and Dai, Andrew and Yu, Hongkun and Petrov, Slav and Chi, Ed H. and Dean, Jeff and Devlin, Jacob and Roberts, Adam and Zhou, Denny and Le, Quoc V. and Wei, Jason},
keywords = {Machine Learning (cs.LG), Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Scaling Instruction-Finetuned Language Models},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
## Model Recycling
[Evaluation on 36 datasets](https://ibm.github.io/model-recycling/model_gain_chart?avg=9.16&mnli_lp=nan&20_newsgroup=3.34&ag_news=1.49&amazon_reviews_multi=0.21&anli=13.91&boolq=16.75&cb=23.12&cola=9.97&copa=34.50&dbpedia=6.90&esnli=5.37&financial_phrasebank=18.66&imdb=0.33&isear=1.37&mnli=11.74&mrpc=16.63&multirc=6.24&poem_sentiment=14.62&qnli=3.41&qqp=6.18&rotten_tomatoes=2.98&rte=24.26&sst2=0.67&sst_5bins=5.44&stsb=20.68&trec_coarse=3.95&trec_fine=10.73&tweet_ev_emoji=13.39&tweet_ev_emotion=4.62&tweet_ev_hate=3.46&tweet_ev_irony=9.04&tweet_ev_offensive=1.69&tweet_ev_sentiment=0.75&wic=14.22&wnli=9.44&wsc=5.53&yahoo_answers=4.14&model_name=google%2Fflan-t5-base&base_name=google%2Ft5-v1_1-base) using google/flan-t5-base as a base model yields average score of 77.98 in comparison to 68.82 by google/t5-v1_1-base.
The model is ranked 1st among all tested models for the google/t5-v1_1-base architecture as of 06/02/2023.
Results:
| 20_newsgroup | ag_news | amazon_reviews_multi | anli | boolq | cb | cola | copa | dbpedia | esnli | financial_phrasebank | imdb | isear | mnli | mrpc | multirc | poem_sentiment | qnli | qqp | rotten_tomatoes | rte | sst2 | sst_5bins | stsb | trec_coarse | trec_fine | tweet_ev_emoji | tweet_ev_emotion | tweet_ev_hate | tweet_ev_irony | tweet_ev_offensive | tweet_ev_sentiment | wic | wnli | wsc | yahoo_answers |
|---------------:|----------:|-----------------------:|--------:|--------:|--------:|--------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|--------:|--------:|------------------:|--------:|--------:|------------:|--------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|-------:|--------:|----------------:|
| 86.2188 | 89.6667 | 67.12 | 51.9688 | 82.3242 | 78.5714 | 80.1534 | 75 | 77.6667 | 90.9507 | 85.4 | 93.324 | 72.425 | 87.2457 | 89.4608 | 62.3762 | 82.6923 | 92.7878 | 89.7724 | 89.0244 | 84.8375 | 94.3807 | 57.2851 | 89.4759 | 97.2 | 92.8 | 46.848 | 80.2252 | 54.9832 | 76.6582 | 84.3023 | 70.6366 | 70.0627 | 56.338 | 53.8462 | 73.4 |
For more information, see: [Model Recycling](https://ibm.github.io/model-recycling/)
| 13,375 | [
[
-0.03338623046875,
-0.043670654296875,
0.021270751953125,
-0.000125885009765625,
-0.0068817138671875,
-0.0107421875,
-0.031402587890625,
-0.04705810546875,
-0.0114593505859375,
0.009124755859375,
-0.038330078125,
-0.039306640625,
-0.04925537109375,
0.003673553466796875,
-0.0173187255859375,
0.0767822265625,
-0.010009765625,
0.002147674560546875,
0.013458251953125,
-0.006984710693359375,
-0.01242828369140625,
-0.0232696533203125,
-0.05108642578125,
-0.022705078125,
0.033538818359375,
0.0222625732421875,
0.034088134765625,
0.040069580078125,
0.03875732421875,
0.0247039794921875,
-0.0125885009765625,
-0.0038280487060546875,
-0.03948974609375,
-0.031219482421875,
0.004070281982421875,
-0.034820556640625,
-0.045989990234375,
-0.0027828216552734375,
0.031768798828125,
0.038299560546875,
0.006317138671875,
0.0240020751953125,
-0.01045989990234375,
0.0207672119140625,
-0.040985107421875,
0.022674560546875,
-0.0216522216796875,
0.006313323974609375,
-0.021148681640625,
0.0072021484375,
-0.019287109375,
-0.0175018310546875,
0.0053253173828125,
-0.051910400390625,
0.037872314453125,
-0.010345458984375,
0.10870361328125,
0.01055145263671875,
-0.0057373046875,
-0.01406097412109375,
-0.057708740234375,
0.0721435546875,
-0.07257080078125,
0.034271240234375,
0.014739990234375,
0.0263671875,
0.00655364990234375,
-0.05987548828125,
-0.049652099609375,
-0.0235443115234375,
-0.004924774169921875,
0.01129913330078125,
-0.0023365020751953125,
0.0144805908203125,
0.04046630859375,
0.046966552734375,
-0.034820556640625,
-0.006084442138671875,
-0.05413818359375,
-0.01212310791015625,
0.053619384765625,
-0.002887725830078125,
0.03839111328125,
-0.00650787353515625,
-0.0211944580078125,
-0.03662109375,
-0.0267486572265625,
0.01015472412109375,
0.020782470703125,
0.03240966796875,
-0.036834716796875,
0.029754638671875,
-0.002227783203125,
0.0401611328125,
0.024658203125,
-0.036956787109375,
0.037628173828125,
-0.0254364013671875,
-0.027130126953125,
-0.01262664794921875,
0.07147216796875,
0.015045166015625,
0.0193939208984375,
-0.006557464599609375,
-0.0291748046875,
0.0016317367553710938,
0.0134429931640625,
-0.07147216796875,
-0.009674072265625,
0.03167724609375,
-0.029571533203125,
-0.037811279296875,
0.0127410888671875,
-0.06170654296875,
-0.0008068084716796875,
0.0005559921264648438,
0.039337158203125,
-0.03875732421875,
-0.044708251953125,
-0.0028820037841796875,
-0.01386260986328125,
0.0235443115234375,
0.004787445068359375,
-0.08245849609375,
0.0164337158203125,
0.0379638671875,
0.06378173828125,
0.0089569091796875,
-0.024566650390625,
-0.017822265625,
0.0011081695556640625,
-0.0165252685546875,
0.031402587890625,
-0.0299530029296875,
-0.02880859375,
-0.0020618438720703125,
0.01428985595703125,
-0.01531219482421875,
-0.036895751953125,
0.0498046875,
-0.0216217041015625,
0.0391845703125,
-0.022216796875,
-0.03973388671875,
-0.0303192138671875,
-0.0013399124145507812,
-0.0494384765625,
0.0848388671875,
0.0229034423828125,
-0.055816650390625,
0.0345458984375,
-0.07080078125,
-0.0338134765625,
-0.01139068603515625,
0.010101318359375,
-0.052276611328125,
0.00359344482421875,
0.026824951171875,
0.026611328125,
-0.0159454345703125,
0.0161590576171875,
-0.0360107421875,
-0.0245208740234375,
-0.0134735107421875,
-0.00806427001953125,
0.0780029296875,
0.0322265625,
-0.062255859375,
0.020111083984375,
-0.04522705078125,
-0.003475189208984375,
0.02288818359375,
-0.00787353515625,
0.01157379150390625,
-0.025299072265625,
0.01641845703125,
0.029937744140625,
0.0193939208984375,
-0.0396728515625,
0.0012226104736328125,
-0.041473388671875,
0.03717041015625,
0.040802001953125,
-0.01386260986328125,
0.032501220703125,
-0.039886474609375,
0.0367431640625,
0.023956298828125,
0.0165557861328125,
-0.00829315185546875,
-0.0281219482421875,
-0.08740234375,
0.0010519027709960938,
0.019927978515625,
0.035736083984375,
-0.044464111328125,
0.0306854248046875,
-0.037689208984375,
-0.05267333984375,
-0.033355712890625,
0.007495880126953125,
0.0284881591796875,
0.036651611328125,
0.0372314453125,
-0.007476806640625,
-0.0406494140625,
-0.051239013671875,
-0.0157012939453125,
-0.000606536865234375,
-0.00015163421630859375,
0.0194854736328125,
0.058380126953125,
-0.00189971923828125,
0.03948974609375,
-0.0209808349609375,
-0.026702880859375,
-0.03717041015625,
0.006824493408203125,
0.01003265380859375,
0.0518798828125,
0.06280517578125,
-0.04345703125,
-0.032257080078125,
0.00545501708984375,
-0.061492919921875,
0.003002166748046875,
-0.00791168212890625,
-0.009735107421875,
0.0352783203125,
0.01514434814453125,
-0.048187255859375,
0.03057861328125,
0.032073974609375,
-0.017364501953125,
0.022430419921875,
-0.00797271728515625,
0.004718780517578125,
-0.0908203125,
0.03778076171875,
0.00977325439453125,
-0.01322174072265625,
-0.058380126953125,
0.0106048583984375,
0.0034236907958984375,
-0.01548004150390625,
-0.046356201171875,
0.058441162109375,
-0.0265960693359375,
0.0029659271240234375,
-0.00843048095703125,
-0.0035343170166015625,
-0.0015735626220703125,
0.04400634765625,
0.0098114013671875,
0.061126708984375,
0.028106689453125,
-0.055816650390625,
0.003246307373046875,
0.00695037841796875,
-0.018646240234375,
0.016204833984375,
-0.054534912109375,
0.01190948486328125,
0.00015342235565185547,
0.01611328125,
-0.05096435546875,
-0.0281219482421875,
0.0207977294921875,
-0.036407470703125,
0.034332275390625,
0.003208160400390625,
-0.0261383056640625,
-0.04241943359375,
-0.0211181640625,
0.0235443115234375,
0.05096435546875,
-0.043548583984375,
0.05072021484375,
0.0162200927734375,
0.025299072265625,
-0.04449462890625,
-0.06585693359375,
-0.0202178955078125,
-0.0362548828125,
-0.0626220703125,
0.041839599609375,
0.0011434555053710938,
-0.0005068778991699219,
-0.0152435302734375,
-0.01012420654296875,
-0.004123687744140625,
0.0031681060791015625,
0.01082611083984375,
0.007198333740234375,
-0.019775390625,
-0.01328277587890625,
-0.0169219970703125,
-0.007232666015625,
-0.002285003662109375,
-0.02850341796875,
0.045806884765625,
-0.021026611328125,
0.0113372802734375,
-0.0577392578125,
-0.0018157958984375,
0.0430908203125,
-0.0207061767578125,
0.06781005859375,
0.08282470703125,
-0.03765869140625,
0.00021517276763916016,
-0.0472412109375,
-0.0247039794921875,
-0.038421630859375,
0.01486968994140625,
-0.03741455078125,
-0.0465087890625,
0.05206298828125,
0.0167388916015625,
0.0235595703125,
0.05865478515625,
0.038238525390625,
-0.0016345977783203125,
0.06866455078125,
0.0511474609375,
-0.0016765594482421875,
0.058258056640625,
-0.05352783203125,
0.017181396484375,
-0.04412841796875,
-0.0148468017578125,
-0.03363037109375,
-0.019622802734375,
-0.05316162109375,
-0.0213623046875,
0.02288818359375,
0.00665283203125,
-0.0440673828125,
0.028594970703125,
-0.0262908935546875,
0.00858306884765625,
0.042877197265625,
0.016754150390625,
-0.004337310791015625,
0.007625579833984375,
-0.0110015869140625,
-0.0033016204833984375,
-0.05267333984375,
-0.039825439453125,
0.08392333984375,
0.03076171875,
0.031890869140625,
0.004177093505859375,
0.055145263671875,
-0.002895355224609375,
0.0205535888671875,
-0.03955078125,
0.030364990234375,
-0.01702880859375,
-0.06866455078125,
-0.003482818603515625,
-0.0303192138671875,
-0.0611572265625,
0.00476837158203125,
-0.003475189208984375,
-0.056549072265625,
0.00395965576171875,
0.01178741455078125,
-0.03509521484375,
0.04351806640625,
-0.06927490234375,
0.0892333984375,
-0.028411865234375,
-0.039276123046875,
-0.00406646728515625,
-0.03759765625,
0.039703369140625,
0.01381683349609375,
0.00872802734375,
0.003688812255859375,
0.007740020751953125,
0.060638427734375,
-0.0572509765625,
0.059844970703125,
-0.03302001953125,
-0.007030487060546875,
0.0241241455078125,
-0.0172576904296875,
0.0262298583984375,
-0.0186767578125,
-0.0084228515625,
0.026519775390625,
0.00490570068359375,
-0.04498291015625,
-0.03717041015625,
0.05218505859375,
-0.07928466796875,
-0.04193115234375,
-0.03656005859375,
-0.02862548828125,
0.0033817291259765625,
0.03582763671875,
0.029937744140625,
0.0232696533203125,
0.002941131591796875,
0.0004420280456542969,
0.033233642578125,
-0.0288543701171875,
0.047332763671875,
0.00611114501953125,
-0.020172119140625,
-0.029510498046875,
0.07073974609375,
0.009002685546875,
0.036102294921875,
0.0238037109375,
0.0225830078125,
-0.02410888671875,
-0.01947021484375,
-0.03680419921875,
0.02984619140625,
-0.048828125,
-0.0048370361328125,
-0.04290771484375,
-0.01091766357421875,
-0.039154052734375,
-0.01045989990234375,
-0.034454345703125,
-0.0294342041015625,
-0.028045654296875,
-0.004970550537109375,
0.0219268798828125,
0.049041748046875,
-0.0015726089477539062,
0.02886962890625,
-0.04449462890625,
0.0244903564453125,
0.0032978057861328125,
0.0273590087890625,
0.00665283203125,
-0.051910400390625,
-0.0131683349609375,
0.02239990234375,
-0.03436279296875,
-0.047607421875,
0.02880859375,
0.0175323486328125,
0.0268096923828125,
0.037261962890625,
-0.0084075927734375,
0.06964111328125,
-0.009368896484375,
0.07855224609375,
0.0026874542236328125,
-0.07476806640625,
0.044281005859375,
-0.0364990234375,
0.0345458984375,
0.0278167724609375,
0.025634765625,
-0.0250396728515625,
-0.017791748046875,
-0.077392578125,
-0.05450439453125,
0.0732421875,
0.020904541015625,
0.0027751922607421875,
0.020965576171875,
0.0170440673828125,
-0.00634765625,
0.00571441650390625,
-0.0670166015625,
-0.018524169921875,
-0.03851318359375,
-0.0243072509765625,
-0.004100799560546875,
-0.004062652587890625,
-0.0066680908203125,
-0.0263671875,
0.061553955078125,
0.004337310791015625,
0.04925537109375,
0.0104522705078125,
-0.0200042724609375,
-0.01418304443359375,
0.0005850791931152344,
0.06890869140625,
0.03619384765625,
-0.026123046875,
-0.01123809814453125,
0.02880859375,
-0.0447998046875,
-0.003818511962890625,
0.00711822509765625,
-0.028289794921875,
-0.002735137939453125,
0.033660888671875,
0.08062744140625,
0.01209259033203125,
-0.02642822265625,
0.031585693359375,
-0.00368499755859375,
-0.0282745361328125,
-0.035552978515625,
0.026580810546875,
0.0078277587890625,
0.0030841827392578125,
0.01276397705078125,
0.004985809326171875,
-0.0144805908203125,
-0.028778076171875,
0.0007395744323730469,
0.01174163818359375,
-0.016021728515625,
-0.03564453125,
0.08197021484375,
0.0166473388671875,
-0.0094146728515625,
0.041290283203125,
-0.00571441650390625,
-0.03875732421875,
0.05194091796875,
0.03216552734375,
0.07177734375,
-0.0109710693359375,
-0.00015854835510253906,
0.06964111328125,
0.0249176025390625,
-0.0090484619140625,
0.029266357421875,
0.005588531494140625,
-0.039886474609375,
-0.0113372802734375,
-0.049224853515625,
-0.000045418739318847656,
0.032562255859375,
-0.03472900390625,
0.03741455078125,
-0.056243896484375,
-0.014404296875,
0.007476806640625,
0.032867431640625,
-0.07073974609375,
0.0310516357421875,
0.0223236083984375,
0.061553955078125,
-0.055877685546875,
0.06024169921875,
0.047271728515625,
-0.0736083984375,
-0.08319091796875,
-0.0008559226989746094,
-0.006134033203125,
-0.040618896484375,
0.04559326171875,
0.02960205078125,
0.0007710456848144531,
0.002246856689453125,
-0.036376953125,
-0.06610107421875,
0.09832763671875,
0.0299072265625,
-0.0302276611328125,
-0.01016998291015625,
0.0258026123046875,
0.044677734375,
-0.0203399658203125,
0.0577392578125,
0.041839599609375,
0.05010986328125,
0.005245208740234375,
-0.08056640625,
0.01461029052734375,
-0.0206756591796875,
0.00954437255859375,
-0.0030384063720703125,
-0.076416015625,
0.0694580078125,
-0.02325439453125,
-0.0230712890625,
-0.0017175674438476562,
0.06500244140625,
0.0180511474609375,
0.00620269775390625,
0.041595458984375,
0.04571533203125,
0.058135986328125,
-0.018646240234375,
0.095703125,
-0.042938232421875,
0.0457763671875,
0.048675537109375,
0.01406097412109375,
0.048858642578125,
0.01934814453125,
-0.02130126953125,
0.03521728515625,
0.05340576171875,
-0.00827789306640625,
0.023345947265625,
-0.005107879638671875,
-0.018524169921875,
-0.00647735595703125,
-0.003192901611328125,
-0.038330078125,
0.0237274169921875,
0.028564453125,
-0.0322265625,
-0.01143646240234375,
-0.0048828125,
0.0284271240234375,
-0.02471923828125,
-0.00982666015625,
0.037078857421875,
0.00927734375,
-0.058074951171875,
0.08050537109375,
0.01122283935546875,
0.0633544921875,
-0.0413818359375,
0.01763916015625,
-0.0240631103515625,
0.029144287109375,
-0.033905029296875,
-0.026611328125,
0.0203399658203125,
0.003147125244140625,
-0.000743865966796875,
-0.0130462646484375,
0.0374755859375,
-0.03692626953125,
-0.0535888671875,
0.0178680419921875,
0.01226043701171875,
0.01220703125,
0.0191192626953125,
-0.06536865234375,
0.017547607421875,
0.0106353759765625,
-0.027191162109375,
0.00849151611328125,
0.0099334716796875,
-0.0011444091796875,
0.043701171875,
0.04034423828125,
-0.01163482666015625,
0.0214691162109375,
0.01058197021484375,
0.05303955078125,
-0.051025390625,
-0.02459716796875,
-0.04937744140625,
0.0494384765625,
-0.00453948974609375,
-0.03936767578125,
0.051544189453125,
0.046905517578125,
0.08734130859375,
-0.01263427734375,
0.07330322265625,
-0.030853271484375,
0.023284912109375,
-0.0312347412109375,
0.054290771484375,
-0.059814453125,
0.0036411285400390625,
-0.0283660888671875,
-0.059417724609375,
-0.0167083740234375,
0.06610107421875,
-0.03765869140625,
0.049041748046875,
0.057708740234375,
0.06719970703125,
-0.027801513671875,
0.00334930419921875,
0.01247406005859375,
0.020111083984375,
0.049163818359375,
0.05316162109375,
0.01806640625,
-0.066650390625,
0.0433349609375,
-0.05963134765625,
0.008056640625,
-0.0167083740234375,
-0.0498046875,
-0.08154296875,
-0.040008544921875,
-0.0212554931640625,
-0.035003662109375,
-0.0021991729736328125,
0.063720703125,
0.057891845703125,
-0.0791015625,
-0.0276031494140625,
-0.0230712890625,
-0.00847625732421875,
-0.01739501953125,
-0.0186767578125,
0.034820556640625,
-0.0400390625,
-0.08392333984375,
0.0090484619140625,
-0.0169219970703125,
0.017974853515625,
-0.02520751953125,
-0.01438140869140625,
-0.0267181396484375,
-0.0227203369140625,
0.0204925537109375,
0.0291290283203125,
-0.06341552734375,
-0.026824951171875,
0.0039215087890625,
-0.00933837890625,
0.01125335693359375,
0.0362548828125,
-0.034332275390625,
0.02880859375,
0.039642333984375,
0.03704833984375,
0.060089111328125,
-0.004364013671875,
0.05078125,
-0.03521728515625,
0.032135009765625,
0.00330352783203125,
0.02001953125,
0.0311279296875,
-0.0184326171875,
0.043365478515625,
0.0258331298828125,
-0.0298004150390625,
-0.05987548828125,
-0.01203155517578125,
-0.06866455078125,
0.001705169677734375,
0.091796875,
-0.020721435546875,
-0.0382080078125,
0.0169219970703125,
-0.0003478527069091797,
0.045166015625,
-0.030487060546875,
0.05078125,
0.0543212890625,
0.00701141357421875,
-0.0265045166015625,
-0.056915283203125,
0.050872802734375,
0.0419921875,
-0.057891845703125,
-0.0164337158203125,
0.011260986328125,
0.042388916015625,
0.01433563232421875,
0.032562255859375,
-0.00479888916015625,
0.0158538818359375,
0.0124053955078125,
0.0199432373046875,
-0.0123138427734375,
-0.006610870361328125,
-0.0211029052734375,
0.00252532958984375,
-0.006378173828125,
-0.0090789794921875
]
] |
google/vit-base-patch16-224 | 2023-09-05T15:27:12.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"vit",
"image-classification",
"vision",
"dataset:imagenet-1k",
"dataset:imagenet-21k",
"arxiv:2010.11929",
"arxiv:2006.03677",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | google | null | null | google/vit-base-patch16-224 | 362 | 1,108,207 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- vision
- image-classification
datasets:
- imagenet-1k
- imagenet-21k
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg
example_title: Tiger
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg
example_title: Teapot
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg
example_title: Palace
---
# Vision Transformer (base-sized model)
Vision Transformer (ViT) model pre-trained on ImageNet-21k (14 million images, 21,843 classes) at resolution 224x224, and fine-tuned on ImageNet 2012 (1 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper [An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale](https://arxiv.org/abs/2010.11929) by Dosovitskiy et al. and first released in [this repository](https://github.com/google-research/vision_transformer). However, the weights were converted from the [timm repository](https://github.com/rwightman/pytorch-image-models) by Ross Wightman, who already converted the weights from JAX to PyTorch. Credits go to him.
Disclaimer: The team releasing ViT did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a supervised fashion, namely ImageNet-21k, at a resolution of 224x224 pixels. Next, the model was fine-tuned on ImageNet (also referred to as ILSVRC2012), a dataset comprising 1 million images and 1,000 classes, also at resolution 224x224.
Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. A [CLS] token is added to the beginning of the sequence for use in classification tasks, and absolute position embeddings are added before feeding the sequence to the layers of the Transformer encoder.
By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.
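The classifier-head idea above can be sketched with plain NumPy: a single linear layer applied to the [CLS] token's final hidden state. The 768-dimensional hidden size matches ViT-Base; the 10-class head and the random stand-in embeddings are illustrative assumptions, not part of the released model.

```python
import numpy as np

# A minimal sketch of a linear classification head on top of a frozen
# encoder's [CLS] representation. hidden_size=768 matches ViT-Base;
# num_labels=10 and the random embeddings are assumptions for illustration.
rng = np.random.default_rng(0)
hidden_size, num_labels = 768, 10
W = rng.normal(scale=0.02, size=(hidden_size, num_labels))
b = np.zeros(num_labels)

# Stand-in for the [CLS] final hidden states of a batch of 4 images
cls_embeddings = rng.normal(size=(4, hidden_size))
logits = cls_embeddings @ W + b
print(logits.shape)  # (4, 10)
```

In practice only `W` and `b` (or a small head like this) need to be trained when the pre-trained encoder is kept frozen.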
## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=google/vit) to look for
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes:
```python
from transformers import ViTImageProcessor, ViTForImageClassification
from PIL import Image
import requests

# Load an example image from the COCO 2017 validation set
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = ViTImageProcessor.from_pretrained('google/vit-base-patch16-224')
model = ViTForImageClassification.from_pretrained('google/vit-base-patch16-224')

# Preprocess the image and run a forward pass
inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits

# The model predicts one of the 1,000 ImageNet classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```
For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/vit.html).
## Training data
The ViT model was pretrained on [ImageNet-21k](http://www.image-net.org/), a dataset consisting of 14 million images and 21k classes, and fine-tuned on [ImageNet](http://www.image-net.org/challenges/LSVRC/2012/), a dataset consisting of 1 million images and 1k classes.
## Training procedure
### Preprocessing
The exact details of preprocessing of images during training/validation can be found [here](https://github.com/google-research/vision_transformer/blob/master/vit_jax/input_pipeline.py).
Images are resized/rescaled to the same resolution (224x224) and normalized across the RGB channels with mean (0.5, 0.5, 0.5) and standard deviation (0.5, 0.5, 0.5).
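The normalization step described above can be sketched in a few lines of NumPy: scale pixel values to [0, 1], then normalize each RGB channel with mean 0.5 and standard deviation 0.5, which maps the data into [-1, 1]. The random array below is a stand-in for an already-resized 224x224 image.

```python
import numpy as np

# Stand-in for a 224x224 RGB image that has already been resized
image = np.random.randint(0, 256, size=(224, 224, 3), dtype=np.uint8)

# Rescale to [0, 1], then normalize per channel with mean 0.5, std 0.5
pixels = image.astype(np.float32) / 255.0
mean = np.array([0.5, 0.5, 0.5], dtype=np.float32)
std = np.array([0.5, 0.5, 0.5], dtype=np.float32)
normalized = (pixels - mean) / std

# All values now lie in [-1, 1]
print(float(normalized.min()), float(normalized.max()))
```

This mirrors the description in the text; the exact training pipeline linked above may apply additional augmentations.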
### Pretraining
The model was trained on TPUv3 hardware (8 cores). All model variants are trained with a batch size of 4096 and learning rate warmup of 10k steps. For ImageNet, the authors found it beneficial to additionally apply gradient clipping at global norm 1. Training resolution is 224.
## Evaluation results
For evaluation results on several image classification benchmarks, we refer to tables 2 and 5 of the original paper. Note that for fine-tuning, the best results are obtained with a higher resolution (384x384). As expected, increasing the model size results in better performance.
### BibTeX entry and citation info
```bibtex
@misc{wu2020visual,
title={Visual Transformers: Token-based Image Representation and Processing for Computer Vision},
author={Bichen Wu and Chenfeng Xu and Xiaoliang Dai and Alvin Wan and Peizhao Zhang and Zhicheng Yan and Masayoshi Tomizuka and Joseph Gonzalez and Kurt Keutzer and Peter Vajda},
year={2020},
eprint={2006.03677},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@inproceedings{deng2009imagenet,
title={Imagenet: A large-scale hierarchical image database},
author={Deng, Jia and Dong, Wei and Socher, Richard and Li, Li-Jia and Li, Kai and Fei-Fei, Li},
booktitle={2009 IEEE conference on computer vision and pattern recognition},
pages={248--255},
year={2009},
  organization={IEEE}
}
``` | 5,688 | [
[
-0.04669189453125,
-0.0118865966796875,
-0.0011205673217773438,
-0.00598907470703125,
-0.0290374755859375,
-0.01224517822265625,
-0.004547119140625,
-0.04632568359375,
0.011810302734375,
0.037384033203125,
-0.023651123046875,
-0.0192413330078125,
-0.056549072265625,
-0.00522613525390625,
-0.04034423828125,
0.0660400390625,
-0.002841949462890625,
0.0028133392333984375,
-0.022216796875,
-0.01273345947265625,
-0.0276947021484375,
-0.03314208984375,
-0.048797607421875,
-0.020599365234375,
0.03826904296875,
0.01537322998046875,
0.044586181640625,
0.061279296875,
0.061492919921875,
0.033843994140625,
-0.0029354095458984375,
0.0013370513916015625,
-0.0290985107421875,
-0.029632568359375,
-0.00041103363037109375,
-0.0428466796875,
-0.024932861328125,
0.0159912109375,
0.046661376953125,
0.0272064208984375,
0.0170135498046875,
0.028350830078125,
0.00887298583984375,
0.034027099609375,
-0.047088623046875,
0.0201263427734375,
-0.03985595703125,
0.030059814453125,
-0.0057373046875,
-0.0134429931640625,
-0.04058837890625,
-0.00983428955078125,
0.018402099609375,
-0.037445068359375,
0.039215087890625,
0.00003236532211303711,
0.10491943359375,
0.0182952880859375,
-0.0232391357421875,
0.01641845703125,
-0.054473876953125,
0.058868408203125,
-0.0219268798828125,
0.031280517578125,
0.00775909423828125,
0.041778564453125,
0.016632080078125,
-0.08642578125,
-0.040313720703125,
-0.0034027099609375,
-0.00508880615234375,
0.01395416259765625,
-0.0205078125,
0.01308441162109375,
0.04339599609375,
0.043792724609375,
-0.0279083251953125,
0.00849151611328125,
-0.050537109375,
-0.02203369140625,
0.04046630859375,
-0.0013532638549804688,
0.01018524169921875,
-0.002971649169921875,
-0.05181884765625,
-0.0367431640625,
-0.0286865234375,
0.01494598388671875,
0.006374359130859375,
0.00327301025390625,
-0.01042938232421875,
0.040924072265625,
0.008636474609375,
0.0469970703125,
0.02484130859375,
-0.0103759765625,
0.036895751953125,
-0.0184173583984375,
-0.0272369384765625,
-0.005916595458984375,
0.06414794921875,
0.0345458984375,
0.0260162353515625,
-0.0003063678741455078,
-0.02386474609375,
0.013885498046875,
0.04351806640625,
-0.07635498046875,
-0.0105438232421875,
-0.0094146728515625,
-0.050140380859375,
-0.033050537109375,
0.0198974609375,
-0.042999267578125,
-0.01190185546875,
-0.033477783203125,
0.052764892578125,
-0.01361846923828125,
-0.01190185546875,
-0.008758544921875,
-0.003810882568359375,
0.050689697265625,
0.031402587890625,
-0.0462646484375,
0.024169921875,
0.021148681640625,
0.0726318359375,
-0.01031494140625,
-0.0173797607421875,
-0.005474090576171875,
-0.026458740234375,
-0.030548095703125,
0.048797607421875,
-0.0078887939453125,
-0.01200103759765625,
-0.00363922119140625,
0.031524658203125,
-0.00614166259765625,
-0.0396728515625,
0.030487060546875,
-0.0462646484375,
0.005741119384765625,
-0.004795074462890625,
-0.0192718505859375,
-0.017913818359375,
0.0173797607421875,
-0.051544189453125,
0.07879638671875,
0.01451873779296875,
-0.0628662109375,
0.031890869140625,
-0.03533935546875,
-0.00750732421875,
0.010650634765625,
-0.0016756057739257812,
-0.051544189453125,
0.0001513957977294922,
0.018310546875,
0.0426025390625,
-0.0150146484375,
0.0025634765625,
-0.011199951171875,
-0.042144775390625,
0.01462554931640625,
-0.029632568359375,
0.06390380859375,
0.014739990234375,
-0.0272369384765625,
0.0150299072265625,
-0.048248291015625,
-0.0023555755615234375,
0.019622802734375,
-0.0138702392578125,
0.00331878662109375,
-0.0272064208984375,
0.01287841796875,
0.0271453857421875,
0.017852783203125,
-0.055938720703125,
0.019317626953125,
-0.00943756103515625,
0.0341796875,
0.060546875,
-0.0163421630859375,
0.039337158203125,
-0.01898193359375,
0.033172607421875,
0.0159759521484375,
0.042999267578125,
-0.0272674560546875,
-0.0484619140625,
-0.0821533203125,
-0.01267242431640625,
0.029296875,
0.03302001953125,
-0.05841064453125,
0.03802490234375,
-0.03717041015625,
-0.04638671875,
-0.0287628173828125,
-0.014739990234375,
0.02203369140625,
0.03759765625,
0.04010009765625,
-0.0439453125,
-0.0518798828125,
-0.07275390625,
0.01430511474609375,
0.0012731552124023438,
-0.004245758056640625,
0.01528167724609375,
0.05841064453125,
-0.0244903564453125,
0.07061767578125,
-0.0270843505859375,
-0.0240325927734375,
-0.0034427642822265625,
-0.00444793701171875,
0.023040771484375,
0.047760009765625,
0.04046630859375,
-0.0657958984375,
-0.0286865234375,
0.0007348060607910156,
-0.0589599609375,
0.0233612060546875,
-0.004425048828125,
-0.0202484130859375,
0.0004355907440185547,
0.038116455078125,
-0.036224365234375,
0.06488037109375,
0.026458740234375,
-0.007354736328125,
0.03643798828125,
-0.0025348663330078125,
0.001392364501953125,
-0.0830078125,
-0.00036835670471191406,
0.017425537109375,
-0.026397705078125,
-0.038177490234375,
0.0131683349609375,
0.014739990234375,
-0.0128631591796875,
-0.0445556640625,
0.0225982666015625,
-0.0310821533203125,
-0.01212310791015625,
-0.0141754150390625,
-0.029876708984375,
0.0018672943115234375,
0.04925537109375,
0.004730224609375,
0.042724609375,
0.053680419921875,
-0.04473876953125,
0.048248291015625,
0.0201568603515625,
-0.0372314453125,
0.032196044921875,
-0.062255859375,
0.01486968994140625,
-0.0098114013671875,
0.02484130859375,
-0.060821533203125,
-0.01387786865234375,
0.01250457763671875,
-0.0323486328125,
0.043609619140625,
-0.0248870849609375,
-0.0281524658203125,
-0.059600830078125,
-0.0159149169921875,
0.04351806640625,
0.057342529296875,
-0.05975341796875,
0.04840087890625,
0.0172882080078125,
0.04107666015625,
-0.057647705078125,
-0.078369140625,
0.001312255859375,
-0.00914764404296875,
-0.03997802734375,
0.041534423828125,
0.007701873779296875,
0.0194549560546875,
0.0132293701171875,
0.0012502670288085938,
-0.0011034011840820312,
-0.0190582275390625,
0.04095458984375,
0.032562255859375,
-0.025299072265625,
0.0015716552734375,
-0.033966064453125,
-0.01513671875,
-0.00043964385986328125,
-0.042083740234375,
0.03643798828125,
-0.0360107421875,
-0.0264129638671875,
-0.043609619140625,
-0.00617218017578125,
0.051910400390625,
-0.0189056396484375,
0.052276611328125,
0.07257080078125,
-0.0404052734375,
0.000606536865234375,
-0.03717041015625,
-0.0104827880859375,
-0.038360595703125,
0.0269622802734375,
-0.0229339599609375,
-0.0439453125,
0.04937744140625,
0.0052490234375,
-0.00484466552734375,
0.0506591796875,
0.0264129638671875,
-0.0090789794921875,
0.06268310546875,
0.047760009765625,
0.003086090087890625,
0.05670166015625,
-0.06622314453125,
0.0081024169921875,
-0.064208984375,
-0.031951904296875,
-0.01947021484375,
-0.04180908203125,
-0.049407958984375,
-0.04168701171875,
0.0233306884765625,
0.003753662109375,
-0.0301971435546875,
0.046661376953125,
-0.059356689453125,
0.0304718017578125,
0.059661865234375,
0.045196533203125,
-0.00936126708984375,
0.017547607421875,
-0.0076751708984375,
0.007110595703125,
-0.04534912109375,
-0.0142059326171875,
0.07659912109375,
0.04278564453125,
0.050445556640625,
-0.0172119140625,
0.03662109375,
0.0011072158813476562,
0.01393890380859375,
-0.07421875,
0.048675537109375,
-0.0167388916015625,
-0.037200927734375,
-0.00707244873046875,
-0.0207366943359375,
-0.07763671875,
0.00560760498046875,
-0.0278167724609375,
-0.04547119140625,
0.039520263671875,
0.015045166015625,
-0.00797271728515625,
0.048248291015625,
-0.048004150390625,
0.0657958984375,
-0.0135345458984375,
-0.0221405029296875,
0.007106781005859375,
-0.056060791015625,
0.01375579833984375,
-0.00016510486602783203,
-0.01543426513671875,
0.0205078125,
0.020965576171875,
0.06427001953125,
-0.053955078125,
0.07080078125,
-0.0169219970703125,
0.0253448486328125,
0.040283203125,
-0.027130126953125,
0.019073486328125,
-0.0242767333984375,
0.0219268798828125,
0.03631591796875,
-0.0076751708984375,
-0.037017822265625,
-0.042633056640625,
0.02880859375,
-0.07501220703125,
-0.031890869140625,
-0.0301361083984375,
-0.018402099609375,
0.01399993896484375,
0.020965576171875,
0.06390380859375,
0.056060791015625,
0.01361083984375,
0.046539306640625,
0.050933837890625,
-0.0278778076171875,
0.033966064453125,
-0.01727294921875,
-0.01727294921875,
-0.0156707763671875,
0.067138671875,
0.0285797119140625,
0.01447296142578125,
0.032562255859375,
0.0159149169921875,
-0.01708984375,
-0.0345458984375,
-0.02557373046875,
0.002986907958984375,
-0.06427001953125,
-0.038818359375,
-0.03753662109375,
-0.05499267578125,
-0.0244598388671875,
-0.0109100341796875,
-0.037750244140625,
-0.0164794921875,
-0.031036376953125,
-0.006633758544921875,
0.0290985107421875,
0.053070068359375,
0.0007915496826171875,
0.04290771484375,
-0.044281005859375,
0.006740570068359375,
0.042724609375,
0.036224365234375,
0.0087890625,
-0.05780029296875,
-0.031036376953125,
-0.00296783447265625,
-0.0269775390625,
-0.04119873046875,
0.0279693603515625,
0.018890380859375,
0.03936767578125,
0.049774169921875,
-0.0164337158203125,
0.0655517578125,
-0.031768798828125,
0.05987548828125,
0.038818359375,
-0.053192138671875,
0.0396728515625,
-0.005126953125,
0.020599365234375,
0.0135345458984375,
0.0301055908203125,
-0.0199737548828125,
0.00792694091796875,
-0.06109619140625,
-0.054229736328125,
0.05169677734375,
0.0078887939453125,
0.01363372802734375,
0.0206298828125,
0.031890869140625,
-0.01299285888671875,
-0.004451751708984375,
-0.06402587890625,
-0.0189971923828125,
-0.05279541015625,
-0.00799560546875,
-0.0067596435546875,
-0.01320648193359375,
0.0057373046875,
-0.0513916015625,
0.0311737060546875,
-0.006103515625,
0.0648193359375,
0.00927734375,
-0.016571044921875,
-0.005382537841796875,
-0.024871826171875,
0.021881103515625,
0.0279998779296875,
-0.0166778564453125,
0.0117340087890625,
0.0120086669921875,
-0.06671142578125,
0.0006480216979980469,
-0.01003265380859375,
-0.00628662109375,
-0.006336212158203125,
0.04278564453125,
0.09039306640625,
0.00508880615234375,
-0.0089874267578125,
0.0645751953125,
-0.0111083984375,
-0.0308685302734375,
-0.038055419921875,
0.005626678466796875,
-0.0248870849609375,
0.0186920166015625,
0.032684326171875,
0.044097900390625,
0.0017108917236328125,
-0.0234222412109375,
0.01983642578125,
0.0167083740234375,
-0.04058837890625,
-0.028289794921875,
0.049713134765625,
-0.0034084320068359375,
-0.00485992431640625,
0.0670166015625,
-0.0007538795471191406,
-0.0517578125,
0.070068359375,
0.034515380859375,
0.06378173828125,
-0.01090240478515625,
0.00830078125,
0.055816650390625,
0.0245819091796875,
-0.0163421630859375,
-0.0030517578125,
-0.0014553070068359375,
-0.07672119140625,
-0.0297393798828125,
-0.04547119140625,
-0.005298614501953125,
0.0170440673828125,
-0.062042236328125,
0.033233642578125,
-0.042327880859375,
-0.029571533203125,
0.007740020751953125,
-0.0044708251953125,
-0.09039306640625,
0.0294342041015625,
0.0242767333984375,
0.0684814453125,
-0.057769775390625,
0.059112548828125,
0.05926513671875,
-0.041351318359375,
-0.07183837890625,
-0.02685546875,
-0.01751708984375,
-0.06878662109375,
0.058197021484375,
0.037506103515625,
0.004344940185546875,
0.006778717041015625,
-0.06939697265625,
-0.062042236328125,
0.09423828125,
0.013092041015625,
-0.027496337890625,
-0.003261566162109375,
0.00722503662109375,
0.03546142578125,
-0.0309600830078125,
0.036773681640625,
0.004291534423828125,
0.01898193359375,
0.031280517578125,
-0.05841064453125,
-0.006786346435546875,
-0.0328369140625,
0.022735595703125,
0.0025959014892578125,
-0.045501708984375,
0.07562255859375,
-0.01494598388671875,
-0.01486968994140625,
0.0005588531494140625,
0.041412353515625,
-0.01397705078125,
-0.0046539306640625,
0.058502197265625,
0.053802490234375,
0.03240966796875,
-0.0266265869140625,
0.080078125,
0.00014102458953857422,
0.033111572265625,
0.04473876953125,
0.026397705078125,
0.046630859375,
0.023101806640625,
-0.0200042724609375,
0.03607177734375,
0.071533203125,
-0.040191650390625,
0.0367431640625,
0.007354736328125,
0.005092620849609375,
-0.017547607421875,
-0.006076812744140625,
-0.033172607421875,
0.047119140625,
0.0276031494140625,
-0.047882080078125,
0.00640869140625,
0.028228759765625,
-0.031341552734375,
-0.03424072265625,
-0.04571533203125,
0.040802001953125,
0.0011224746704101562,
-0.0333251953125,
0.052581787109375,
-0.0155487060546875,
0.05487060546875,
-0.024932861328125,
-0.00576019287109375,
-0.00860595703125,
0.0267486572265625,
-0.030426025390625,
-0.061279296875,
0.007472991943359375,
-0.0149078369140625,
-0.00860595703125,
-0.0141754150390625,
0.06280517578125,
-0.01042938232421875,
-0.042938232421875,
0.01641845703125,
0.0018548965454101562,
0.0243988037109375,
-0.0021839141845703125,
-0.05120849609375,
-0.0029277801513671875,
-0.0059661865234375,
-0.025146484375,
0.0188140869140625,
0.0242767333984375,
-0.01030731201171875,
0.041412353515625,
0.0472412109375,
-0.00415802001953125,
0.0270538330078125,
-0.0014963150024414062,
0.075927734375,
-0.036102294921875,
-0.031829833984375,
-0.03662109375,
0.046234130859375,
-0.011566162109375,
-0.0255279541015625,
0.03887939453125,
0.0301666259765625,
0.08599853515625,
-0.02587890625,
0.03857421875,
-0.004062652587890625,
0.00027942657470703125,
-0.02496337890625,
0.031707763671875,
-0.04742431640625,
-0.011016845703125,
-0.0233612060546875,
-0.07647705078125,
-0.0328369140625,
0.06427001953125,
-0.011932373046875,
0.01629638671875,
0.038055419921875,
0.053955078125,
-0.018646240234375,
-0.00860595703125,
0.028717041015625,
0.01227569580078125,
0.00760650634765625,
0.030548095703125,
0.060546875,
-0.061920166015625,
0.04241943359375,
-0.042144775390625,
-0.017974853515625,
-0.0210418701171875,
-0.05322265625,
-0.06884765625,
-0.05615234375,
-0.0316162109375,
-0.038726806640625,
-0.01800537109375,
0.053863525390625,
0.083984375,
-0.06280517578125,
-0.0017499923706054688,
-0.01105499267578125,
-0.01953125,
-0.0219268798828125,
-0.01490020751953125,
0.033477783203125,
-0.0022983551025390625,
-0.055938720703125,
-0.01401519775390625,
-0.0001590251922607422,
0.0174713134765625,
-0.0222015380859375,
-0.0043792724609375,
-0.003936767578125,
-0.0251007080078125,
0.0496826171875,
0.021759033203125,
-0.04290771484375,
-0.037506103515625,
0.004070281982421875,
-0.006305694580078125,
0.0254669189453125,
0.050048828125,
-0.06439208984375,
0.039031982421875,
0.0384521484375,
0.04217529296875,
0.06854248046875,
-0.0067596435546875,
0.01141357421875,
-0.0562744140625,
0.027496337890625,
0.009613037109375,
0.044219970703125,
0.0190887451171875,
-0.0269622802734375,
0.04217529296875,
0.029205322265625,
-0.045013427734375,
-0.05352783203125,
0.005016326904296875,
-0.09552001953125,
-0.003910064697265625,
0.06939697265625,
-0.0232391357421875,
-0.03509521484375,
0.012542724609375,
-0.00997161865234375,
0.03887939453125,
-0.0028018951416015625,
0.02508544921875,
0.023468017578125,
0.007659912109375,
-0.0443115234375,
-0.032135009765625,
0.0189056396484375,
-0.00730133056640625,
-0.0347900390625,
-0.04400634765625,
0.0054168701171875,
0.0162506103515625,
0.039947509765625,
0.01873779296875,
-0.0260772705078125,
0.011322021484375,
0.01361083984375,
0.034423828125,
-0.0086212158203125,
-0.03314208984375,
-0.0204010009765625,
0.0047607421875,
-0.0191802978515625,
-0.05120849609375
]
] |
EleutherAI/pythia-1.4b | 2023-07-09T16:01:57.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:EleutherAI/the_pile",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-1.4b | 9 | 1,101,663 | transformers | 2023-02-09T14:08:20 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/the_pile
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-1.4B
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M          | 18,915,328           | 6      | 512       | 8     | 2M         | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B         | 805,736,448          | 16     | 2048      | 8     | 2M         | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B          | 11,327,027,200       | 36     | 5120      | 40    | 2M         | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
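The branch names follow directly from this schedule; they can be enumerated with plain arithmetic, no Hub access required:

```python
# Enumerate the 154 checkpoint branch names described above:
# step0, ten log-spaced early checkpoints, then one every 1000 steps.
early = [0] + [2 ** i for i in range(10)]    # 0, 1, 2, 4, ..., 512
regular = list(range(1000, 143_001, 1000))   # 1000, 2000, ..., 143000
branches = [f"step{n}" for n in early + regular]
assert len(branches) == 154                  # matches the count stated above
```

Any of these names can be passed as the `revision` argument of `from_pretrained`, as the Quickstart section shows.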
You may also further fine-tune and adapt Pythia-1.4B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-1.4B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-1.4B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-1.4B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed most likely by the model need not
produce the most “accurate” text. Never rely on Pythia-1.4B to produce factually accurate
output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-1.4B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-1.4B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
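Since every optimizer step processes a fixed 2M-token batch (see Training below), a checkpoint's `revision` name maps directly to the number of training tokens it has seen. A small helper to illustrate (the function name is ours, not part of any library; requires Python 3.9+ for `str.removeprefix`):

```python
TOKENS_PER_STEP = 2_097_152  # the uniform 2M-token batch size

def tokens_seen(revision: str) -> int:
    """Training tokens consumed at a checkpoint branch such as 'step3000'."""
    return int(revision.removeprefix("step")) * TOKENS_PER_STEP

assert tokens_seen("step3000") == 6_291_456_000
assert tokens_seen("step143000") == 299_892_736_000  # the full training run
```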
## Training
### Training data
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-1.4B.
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models were trained for 143,000 steps at a batch size
of 2M (2,097,152 tokens).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
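As a sanity check, the step count, checkpoint spacing, and batch size given above multiply out consistently. The 1024 × 2048 decomposition of the batch is an assumption drawn from the Pythia paper, not a figure stated in this card:

```python
batch_tokens = 2_097_152                       # 2M tokens per optimizer step
steps = 143_000
total_tokens = steps * batch_tokens
assert total_tokens == 299_892_736_000         # tokens seen, as stated above
assert total_tokens // 2_097_152_000 == 143    # 143 evenly spaced checkpoints
# Assumed decomposition: 1024 sequences of 2048 tokens per batch.
assert 1024 * 2048 == batch_tokens
```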
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with a uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and
12B models all used an LR schedule which decayed to a minimum LR of 0. In
the redone training runs, we rectified this inconsistency: all models are now
trained with an LR decaying to a minimum of 0.1× their maximum LR.
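The corrected schedule can be sketched as follows; the cosine shape and the warmup length are assumptions for illustration, not details stated in this card:

```python
import math

def lr(step, max_lr=2.0e-4, total=143_000, warmup=1430, min_frac=0.1):
    """Linear warmup, then cosine decay to min_frac x max_lr (here 0.1x)."""
    if step < warmup:
        return max_lr * step / warmup
    progress = (step - warmup) / (total - warmup)
    return max_lr * (min_frac + (1 - min_frac) * 0.5 * (1 + math.cos(math.pi * progress)))
```

With `min_frac=0.1` the final LR is 2.0e-5 for the 1.4B model's 2.0e-4 maximum; setting `min_frac=0` would reproduce the old 6.9B/12B behaviour.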
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
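The two columns of the table below differ by exactly the untied input and output embedding matrices, each `vocab_size × model_dim`. A hedged check, assuming the GPT-NeoX tokenizer's 50,304-entry embedding table and the model dims from the table further up:

```python
VOCAB = 50_304  # assumed embedding-table size (GPT-NeoX tokenizer, padded)

def total_params(non_embedding: int, model_dim: int) -> int:
    # untied input + output embeddings, each VOCAB x model_dim
    return non_embedding + 2 * VOCAB * model_dim

assert total_params(18_915_328, 512) == 70_426_624          # Pythia-70M
assert total_params(1_208_602_624, 2048) == 1_414_647_808   # Pythia-1.4B
```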
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 13,574 | [
[
-0.02484130859375,
-0.0601806640625,
0.0250091552734375,
0.0051116943359375,
-0.01763916015625,
-0.01537322998046875,
-0.0166168212890625,
-0.0343017578125,
0.0163421630859375,
0.0118408203125,
-0.0277557373046875,
-0.0219879150390625,
-0.03173828125,
-0.00428009033203125,
-0.035247802734375,
0.08673095703125,
-0.00989532470703125,
-0.01151275634765625,
0.00774383544921875,
-0.006435394287109375,
-0.003299713134765625,
-0.041229248046875,
-0.03265380859375,
-0.029754638671875,
0.04962158203125,
0.01264190673828125,
0.06549072265625,
0.04296875,
0.01232147216796875,
0.0215911865234375,
-0.026214599609375,
-0.00473785400390625,
-0.01221466064453125,
-0.0067138671875,
-0.00033736228942871094,
-0.0201416015625,
-0.0537109375,
0.0023097991943359375,
0.050384521484375,
0.048492431640625,
-0.01430511474609375,
0.019378662109375,
-0.00138092041015625,
0.0262603759765625,
-0.03900146484375,
0.001190185546875,
-0.0232696533203125,
-0.012725830078125,
-0.007320404052734375,
0.0099029541015625,
-0.0283203125,
-0.0261077880859375,
0.035491943359375,
-0.05108642578125,
0.0196075439453125,
0.007709503173828125,
0.0908203125,
-0.00848388671875,
-0.0309600830078125,
-0.004917144775390625,
-0.051910400390625,
0.050811767578125,
-0.054412841796875,
0.0253753662109375,
0.021087646484375,
0.01184844970703125,
-0.0024852752685546875,
-0.068115234375,
-0.04095458984375,
-0.016845703125,
-0.00974273681640625,
-0.0017299652099609375,
-0.04608154296875,
0.00217437744140625,
0.0377197265625,
0.047515869140625,
-0.0625,
-0.002887725830078125,
-0.0283966064453125,
-0.02642822265625,
0.025604248046875,
0.003448486328125,
0.03424072265625,
-0.0223846435546875,
0.0007319450378417969,
-0.0287933349609375,
-0.050048828125,
-0.01727294921875,
0.04248046875,
0.00597381591796875,
-0.02801513671875,
0.03729248046875,
-0.0276336669921875,
0.042236328125,
-0.0055084228515625,
0.0187225341796875,
0.033050537109375,
-0.0157318115234375,
-0.037384033203125,
-0.00792694091796875,
0.07073974609375,
0.00955963134765625,
0.0166778564453125,
-0.001220703125,
-0.0026416778564453125,
0.005344390869140625,
0.00396728515625,
-0.08477783203125,
-0.06005859375,
0.019134521484375,
-0.0287933349609375,
-0.032257080078125,
-0.01296234130859375,
-0.07049560546875,
-0.01430511474609375,
-0.013671875,
0.04254150390625,
-0.038665771484375,
-0.054290771484375,
-0.0117034912109375,
0.0011205673217773438,
0.0167083740234375,
0.0274505615234375,
-0.07220458984375,
0.031707763671875,
0.032928466796875,
0.07696533203125,
0.016693115234375,
-0.043548583984375,
-0.01457977294921875,
-0.021240234375,
-0.00876617431640625,
0.02728271484375,
-0.00884246826171875,
-0.01343536376953125,
-0.00827789306640625,
0.0116119384765625,
-0.00853729248046875,
-0.0257568359375,
0.03021240234375,
-0.029754638671875,
0.0202789306640625,
-0.0208892822265625,
-0.033172607421875,
-0.0276336669921875,
0.00708770751953125,
-0.04656982421875,
0.06341552734375,
0.0177764892578125,
-0.07366943359375,
0.0172576904296875,
-0.0170745849609375,
-0.00490570068359375,
-0.004039764404296875,
0.01390838623046875,
-0.052337646484375,
0.0020656585693359375,
0.0275421142578125,
0.004055023193359375,
-0.0286102294921875,
0.01390838623046875,
-0.0191650390625,
-0.031768798828125,
0.01372528076171875,
-0.03924560546875,
0.06878662109375,
0.0145263671875,
-0.048095703125,
0.0208892822265625,
-0.04315185546875,
0.01708984375,
0.0191192626953125,
-0.0262603759765625,
0.004016876220703125,
-0.01378631591796875,
0.0286865234375,
0.01525115966796875,
0.012969970703125,
-0.02764892578125,
0.02142333984375,
-0.037994384765625,
0.05560302734375,
0.055633544921875,
-0.004730224609375,
0.035369873046875,
-0.032501220703125,
0.035247802734375,
0.002773284912109375,
0.0153961181640625,
-0.005893707275390625,
-0.047576904296875,
-0.07391357421875,
-0.02117919921875,
0.0284881591796875,
0.0223236083984375,
-0.03546142578125,
0.0308685302734375,
-0.0183563232421875,
-0.0653076171875,
-0.0143890380859375,
-0.00653076171875,
0.031463623046875,
0.0218658447265625,
0.031646728515625,
-0.01222991943359375,
-0.040679931640625,
-0.06683349609375,
-0.0166778564453125,
-0.032684326171875,
0.0106048583984375,
0.01385498046875,
0.0703125,
-0.00759124755859375,
0.044189453125,
-0.0259246826171875,
0.0194854736328125,
-0.0267333984375,
0.012939453125,
0.03369140625,
0.045684814453125,
0.0300750732421875,
-0.04193115234375,
-0.02886962890625,
0.002231597900390625,
-0.04425048828125,
0.006961822509765625,
0.002780914306640625,
-0.0235443115234375,
0.022979736328125,
0.0046539306640625,
-0.07464599609375,
0.034942626953125,
0.04833984375,
-0.040771484375,
0.0609130859375,
-0.0244140625,
-0.001010894775390625,
-0.0791015625,
0.0194091796875,
0.01126861572265625,
-0.017669677734375,
-0.046539306640625,
0.006916046142578125,
0.01387786865234375,
-0.01503753662109375,
-0.0309295654296875,
0.045196533203125,
-0.039794921875,
-0.01178741455078125,
-0.01519012451171875,
0.00476837158203125,
-0.0026950836181640625,
0.048065185546875,
0.0106658935546875,
0.043182373046875,
0.0609130859375,
-0.057586669921875,
0.03118896484375,
0.0167083740234375,
-0.01947021484375,
0.0278472900390625,
-0.06695556640625,
0.0132904052734375,
0.00554656982421875,
0.03204345703125,
-0.04266357421875,
-0.027587890625,
0.0400390625,
-0.044189453125,
0.01206207275390625,
-0.031097412109375,
-0.040313720703125,
-0.0321044921875,
-0.01284027099609375,
0.0457763671875,
0.058074951171875,
-0.04638671875,
0.0517578125,
0.003688812255859375,
0.0091705322265625,
-0.028350830078125,
-0.041717529296875,
-0.01971435546875,
-0.03936767578125,
-0.050018310546875,
0.028656005859375,
0.01213836669921875,
-0.01293182373046875,
0.0012426376342773438,
-0.0009298324584960938,
0.00750732421875,
-0.0048065185546875,
0.0241546630859375,
0.025543212890625,
-0.0032482147216796875,
0.0009527206420898438,
-0.0103302001953125,
-0.0101318359375,
-0.0008502006530761719,
-0.03875732421875,
0.07073974609375,
-0.02197265625,
-0.01445770263671875,
-0.061309814453125,
-0.00004363059997558594,
0.06707763671875,
-0.03204345703125,
0.066650390625,
0.04583740234375,
-0.05279541015625,
0.01116943359375,
-0.027923583984375,
-0.0218963623046875,
-0.03302001953125,
0.0499267578125,
-0.01953125,
-0.0277099609375,
0.047119140625,
0.0211029052734375,
0.02215576171875,
0.04217529296875,
0.054290771484375,
0.0157470703125,
0.08984375,
0.035491943359375,
-0.01265716552734375,
0.048248291015625,
-0.038604736328125,
0.020233154296875,
-0.0826416015625,
-0.013275146484375,
-0.038421630859375,
-0.0181732177734375,
-0.07177734375,
-0.02130126953125,
0.0250091552734375,
0.0168304443359375,
-0.05621337890625,
0.0423583984375,
-0.041839599609375,
0.004703521728515625,
0.048980712890625,
0.01959228515625,
0.012725830078125,
0.01593017578125,
0.0050201416015625,
-0.005420684814453125,
-0.0509033203125,
-0.0252685546875,
0.09381103515625,
0.038330078125,
0.04425048828125,
0.0223388671875,
0.054779052734375,
-0.01168060302734375,
0.0196685791015625,
-0.052032470703125,
0.030792236328125,
0.0261993408203125,
-0.054412841796875,
-0.0157470703125,
-0.0577392578125,
-0.07073974609375,
0.037567138671875,
0.004894256591796875,
-0.084716796875,
0.0173492431640625,
0.0171356201171875,
-0.029388427734375,
0.035491943359375,
-0.04779052734375,
0.07537841796875,
-0.017303466796875,
-0.036590576171875,
-0.025634765625,
-0.0230712890625,
0.017791748046875,
0.0261688232421875,
0.00931549072265625,
0.007625579833984375,
0.0227508544921875,
0.07470703125,
-0.052276611328125,
0.04925537109375,
-0.00933837890625,
0.01161956787109375,
0.0252685546875,
0.020751953125,
0.05072021484375,
0.0101776123046875,
0.0092926025390625,
-0.00318145751953125,
0.01296234130859375,
-0.04302978515625,
-0.0298919677734375,
0.0684814453125,
-0.08404541015625,
-0.0280609130859375,
-0.05987548828125,
-0.0439453125,
0.0080413818359375,
0.01458740234375,
0.0323486328125,
0.049072265625,
-0.0028171539306640625,
-0.0005660057067871094,
0.0445556640625,
-0.038299560546875,
0.0261383056640625,
0.014801025390625,
-0.0380859375,
-0.038787841796875,
0.0758056640625,
0.0020923614501953125,
0.025665283203125,
0.00044655799865722656,
0.0166778564453125,
-0.0292510986328125,
-0.03411865234375,
-0.045074462890625,
0.04150390625,
-0.055084228515625,
0.0008883476257324219,
-0.0537109375,
-0.0029201507568359375,
-0.033782958984375,
0.00727081298828125,
-0.03118896484375,
-0.0278778076171875,
-0.0177001953125,
-0.002323150634765625,
0.04388427734375,
0.0364990234375,
0.008026123046875,
0.0260772705078125,
-0.04046630859375,
-0.004367828369140625,
0.01715087890625,
0.007648468017578125,
0.00875091552734375,
-0.06787109375,
-0.00789642333984375,
0.00914764404296875,
-0.032806396484375,
-0.08636474609375,
0.03863525390625,
-0.0035533905029296875,
0.02716064453125,
0.005767822265625,
-0.0181732177734375,
0.045074462890625,
-0.006725311279296875,
0.050537109375,
0.01253509521484375,
-0.07684326171875,
0.041839599609375,
-0.038238525390625,
0.0228424072265625,
0.0265960693359375,
0.0267181396484375,
-0.053741455078125,
-0.006565093994140625,
-0.07562255859375,
-0.08111572265625,
0.05609130859375,
0.038238525390625,
] |
WarriorMama777/OrangeMixs | 2023-06-28T10:00:13.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"dataset:Nerfgun3/bad_prompt",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | WarriorMama777 | null | null | WarriorMama777/OrangeMixs | 3,464 | 1,093,124 | diffusers | 2022-12-04T14:18:34 | ---
license: creativeml-openrail-m
tags:
- stable-diffusion
- text-to-image
datasets: Nerfgun3/bad_prompt
---
----
# OrangeMixs
"OrangeMixs" shares various Merge models that can be used with StableDiffusionWebui:Automatic1111 and others.
<img src="https://i.imgur.com/VZg0LqQ.png" width="1000" height="">
This repository is maintained for the following purposes:
1. To provide easy access to models commonly used in the Japanese community. The wisdom of the Anons.
2. As a place to upload my merge models when I feel like it.

<span style="font-size: 60%;">Hero image prompts(AOM3B2):https://majinai.art/ja/i/jhw20Z_</span>
----
# UPDATE NOTE / How to read this README
## How to read this README
1. Read the Table of Contents as release notes.
Sections are in descending order; entries within a section are in ascending order. It is written like a social media feed.
2. Check the UPDATE NOTE for recent changes.
3. View the repository history when you need to check the full history.
## UPDATE NOTE
- 2023-02-27: Added AOM3A1B.
- 2023-03-10: Model name fix.
I found that I had abbreviated the model names too much, so that when users see illustrations made with OrangeMixs models on the web, they cannot find the models through search.
To make the names more search-engine friendly, I renamed them to "ModelName + (orangemixs)".
- 2023-03-11: Changed model names: () to _.
Changed to _ because an error occurs when using () in cloud environments (e.g. Paperspace).
"ModelName + _orangemixs"
- 2023-04-01: Added a description of AOM3A1, cursed by Dreamlike.
- 2023-06-27: Added AOM3B2. Removed terms of service.
----
# Gradio
We support a [Gradio](https://github.com/gradio-app/gradio) Web UI to run OrangeMixs:
[](https://huggingface.co/spaces/akhaliq/webui-orangemixs)
----
# Table of Contents
- [OrangeMixs](#orangemixs)
- [UPDATE NOTE / How to read this README](#update-note--how-to-read-this-readme)
- [How to read this README](#how-to-read-this-readme)
- [UPDATE NOTE](#update-note)
- [Gradio](#gradio)
- [Table of Contents](#table-of-contents)
- [Reference](#reference)
- [Licence](#licence)
- [~~Terms of use~~](#terms-of-use)
- [Disclaimer](#disclaimer)
- [How to download](#how-to-download)
- [Batch Download](#batch-download)
- [Batch Download (Advanced)](#batch-download-advanced)
- [Select and download](#select-and-download)
- [Model Detail \& Merge Recipes](#model-detail--merge-recipes)
- [AbyssOrangeMix3 (AOM3)](#abyssorangemix3-aom3)
- [About](#about)
- [More feature](#more-feature)
- [Variations / Sample Gallery](#variations--sample-gallery)
- [AOM3](#aom3)
- [AOM3A1](#aom3a1)
- [AOM3A2](#aom3a2)
- [AOM3A3](#aom3a3)
- [AOM3A1B](#aom3a1b)
- [AOM3B2](#aom3b2)
- [Description for enthusiast](#description-for-enthusiast)
- [AbyssOrangeMix2 (AOM2)](#abyssorangemix2-aom2)
- [AbyssOrangeMix2\_sfw (AOM2s)](#abyssorangemix2_sfw-aom2s)
- [AbyssOrangeMix2\_nsfw (AOM2n)](#abyssorangemix2_nsfw-aom2n)
- [AbyssOrangeMix2\_hard (AOM2h)](#abyssorangemix2_hard-aom2h)
- [EerieOrangeMix (EOM)](#eerieorangemix-eom)
- [EerieOrangeMix (EOM1)](#eerieorangemix-eom1)
- [EerieOrangeMix\_base (EOM1b)](#eerieorangemix_base-eom1b)
- [EerieOrangeMix\_Night (EOM1n)](#eerieorangemix_night-eom1n)
- [EerieOrangeMix\_half (EOM1h)](#eerieorangemix_half-eom1h)
- [EerieOrangeMix (EOM1)](#eerieorangemix-eom1-1)
- [EerieOrangeMix2 (EOM2)](#eerieorangemix2-eom2)
- [EerieOrangeMix2\_base (EOM2b)](#eerieorangemix2_base-eom2b)
- [EerieOrangeMix2\_night (EOM2n)](#eerieorangemix2_night-eom2n)
- [EerieOrangeMix2\_half (EOM2h)](#eerieorangemix2_half-eom2h)
- [EerieOrangeMix2 (EOM2)](#eerieorangemix2-eom2-1)
- [Models Comparison](#models-comparison)
- [AbyssOrangeMix (AOM)](#abyssorangemix-aom)
- [AbyssOrangeMix\_base (AOMb)](#abyssorangemix_base-aomb)
- [AbyssOrangeMix\_Night (AOMn)](#abyssorangemix_night-aomn)
- [AbyssOrangeMix\_half (AOMh)](#abyssorangemix_half-aomh)
- [AbyssOrangeMix (AOM)](#abyssorangemix-aom-1)
- [ElyOrangeMix (ELOM)](#elyorangemix-elom)
- [ElyOrangeMix (ELOM)](#elyorangemix-elom-1)
- [ElyOrangeMix\_half (ELOMh)](#elyorangemix_half-elomh)
- [ElyNightOrangeMix (ELOMn)](#elynightorangemix-elomn)
- [BloodOrangeMix (BOM)](#bloodorangemix-bom)
- [BloodOrangeMix (BOM)](#bloodorangemix-bom-1)
- [BloodOrangeMix\_half (BOMh)](#bloodorangemix_half-bomh)
- [BloodNightOrangeMix (BOMn)](#bloodnightorangemix-bomn)
- [ElderOrangeMix](#elderorangemix)
- [Troubleshooting](#troubleshooting)
- [FAQ and Tips (๐MEME ZONE๐ฆ)](#faq-and-tips-meme-zone)
----
# Reference
/hdg/ Stable Diffusion Models Cookbook - <https://rentry.org/hdgrecipes#g-anons-unnamed-mix-e93c3bf7>
Model names are named after Cookbook precedents.
# Licence
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce or share illegal or harmful outputs or content.
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license.
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully). Read the full license here: https://huggingface.co/spaces/CompVis/stable-diffusion-license
# ~~Terms of use~~
~~- **Clearly indicate where modifications have been made.**
If you used it for merging, please state what steps you took to do so.~~
Removed the terms of use. 2023-06-28
Freedom. If you publish your recipes, the merge swamp will be more fun.
# Disclaimer
<details><summary>READ MORE: Disclaimer</summary>
The user has complete control over whether or not to generate NSFW content, and the decision to enjoy either SFW or NSFW content is entirely up to the user. The learning model does not contain any obscene visual content that can be viewed with a single click. The posting of the learning model is not intended to display obscene material in a public place.
In publishing examples of the generation of copyrighted characters, I consider the following cases to be exceptional cases in which unauthorised use is permitted.
"when the use is for private use or research purposes; when the work is used as material for merchandising (however, this does not apply when the main use of the work is to be merchandised); when the work is used in criticism, commentary or news reporting; when the work is used as a parody or derivative work to demonstrate originality."
In these cases, use against the will of the copyright holder or use for unjustified gain should still be avoided, and if a complaint is lodged by the copyright holder, it is guaranteed that the publication will be stopped as soon as possible.
I would also like to note that I am aware that many of the merged models use NAI, which is trained on Danbooru and other sites that could be interpreted as illegal, and whose model data itself is a leak; this should be watched carefully. I believe the best we can do is to expand the possibilities of generative AI while protecting the works of illustrators and artists.
</details>
----
# How to download
## Batch Download
⚠ Deprecated: Orange has grown too huge. Doing this will kill your storage.
1. Install Git.
2. Create a folder of your choice, then right-click → "Git Bash Here" to open a Git Bash in that folder's directory.
3. Run the following commands in order:
```
git lfs install
git clone https://huggingface.co/WarriorMama777/OrangeMixs
```
4. complete
## Batch Download (Advanced)
Advanced: (When you want to download only selected directories, not the entire repository.)
<details>
<summary>Toggle: How to Batch Download (Advanced)</summary>
1. Run the command `git clone --filter=tree:0 --no-checkout https://huggingface.co/WarriorMama777/OrangeMixs` to clone the huggingface repository. By adding the `--filter=tree:0` and `--no-checkout` options, you can download only the file names without their contents.
```
git clone --filter=tree:0 --no-checkout https://huggingface.co/WarriorMama777/OrangeMixs
```
2. Move to the cloned directory with the command `cd OrangeMixs`.
```
cd OrangeMixs
```
3. Enable sparse-checkout mode with the command `git sparse-checkout init --cone`. By adding the `--cone` option, you can achieve faster performance.
```
git sparse-checkout init --cone
```
4. Specify the directory you want to get with the command `git sparse-checkout add <directory name>`. For example, if you want to get only the `Models/AbyssOrangeMix3` directory, enter `git sparse-checkout add Models/AbyssOrangeMix3`.
```
git sparse-checkout add Models/AbyssOrangeMix3
```
5. Download the contents of the specified directory with the command `git checkout main`.
```
git checkout main
```
This completes how to clone only a specific directory. If you want to add other directories, run `git sparse-checkout add <directory name>` again.
</details>
## Select and download
1. Go to the "Files and versions" tab.
2. Select the model you want to download.
3. Download it.
4. Complete.
----
----
# Model Detail & Merge Recipes
## AbyssOrangeMix3 (AOM3)

――Everyone has a different "ABYSS"!
▼About
The main model, "AOM3 (AbyssOrangeMix3)", is a purely upgraded model that improves on the problems of the previous version, "AOM2". AOM3 can generate illustrations with very realistic textures and a wide variety of content. There are also three variant models based on AOM3 that have been adjusted to unique illustration styles. These models will help you express your ideas more clearly.
▼Links
- [❗NSFW] Civitai: AbyssOrangeMix3 (AOM3) | Stable Diffusion Checkpoint | https://civitai.com/models/9942/abyssorangemix3-aom3
### About
Features: high-quality, realistic textured illustrations can be generated.
There are two major changes from AOM2.
1: The NSFW models such as _nsfw and _hard have been improved: in AOM2, the models from _nsfw onwards generated creepy realistic faces, muscles and ribs when using Hires.fix, even though the characters were anime-style. These have all been improved in AOM3.
e.g.: explanatory diagram by MEME: [GO TO MEME ZONE→](#MEME_realface)
2: sfw/nsfw merged into one model. Originally, the NSFW models were separated because adding NSFW content (models like NAI and gape) would change the face and cause the aforementioned problems. Now that those have been improved, the models can be packed into one.
In addition, thanks to excellent extensions such as [ModelToolkit](https://github.com/arenatemp/stable-diffusion-webui-model-toolkit), the model file size could be reduced (1.98 GB per model).

### More feature
In addition, these U-Net Blocks Weight Merge models take numerous steps but are carefully merged to ensure that mutual content is not overwritten.
(Of course, all models allow full control over adult content.)
- When generating illustrations for the general public: write "nsfw" in the negative prompt field.
- ~~When generating adult illustrations: write "nsfw" in the positive prompt field~~ → it can be generated without it; including it makes the atmosphere more NSFW.
### Variations / Sample Gallery
🚧Editing🚧

#### AOM3
▼AOM3

<span style="font-size: 60%;">(Actually, this gallery doesn't make much sense, since AOM3 is mainly an improvement of the NSFW part... but we can confirm that the picture is not much different from AOM2sfw.)</span>
#### AOM3A1
❗Only this model (AOM3A1) includes ChilloutMix, the curse of the Dreamlike license. In other words, only AOM3A1 is not available for commercial use. I recommend AOM3A1B instead.❗
[GO TO MEME ZONE→](#MEME_AOM3A1)
Features: anime-like illustrations with flat paint. Cute enough as it is, but I really like applying anime-character LoRAs to this model to generate high-quality anime illustrations, like a frame from a theatrical release.
▼A1

<details>
<summary>©</summary>
(1) ©Yurucamp: Inuyama Aoi, (2) ©The Quintessential Quintuplets: Nakano Yotsuba, (3) ©Sailor Moon: Mizuno Ami / Sailor Mercury
</details>
#### AOM3A2
🚧Editing🚧
Features: artistic illustrations in an oil-painting-like style with stylish background depictions. In fact, this is mostly due to the work of Counterfeit 2.5, but the textures are more realistic thanks to the U-Net Blocks Weight Merge.
#### AOM3A3
🚧Editing🚧
Features: a midpoint between artistic and kawaii. The model has been tuned to combine realistic textures, an artistic style that also feels like oil colour, and a cute anime-style face. It can be used to create a wide range of illustrations.
#### AOM3A1B
AOM3A1B added. This model is my latest favourite. I recommend it for its moderate realism, moderate brush touch, and moderate LoRA conformity.
The model was merged by mistakenly selecting "Add sum" when "Add difference" should have been selected in the AOM3A3 recipe. It was an unintended merge, but I share it because it consistently produces good illustrations.
In my view, its illustration style sits somewhere between AOM3A1 and A3.
▼A1B


- Meisho Doto (umamusume): https://civitai.com/models/11980/meisho-doto-umamusume
- Train and Girl: [JR East E235 series / train interior](https://civitai.com/models/9517/jr-east-e235-series-train-interior)
<details>
<summary>©</summary>
©umamusume: Meisho Doto, ©Girls und Panzer: Nishizumi Miho, ©IDOLM@STER: Sagisawa Fumika
</details>
#### AOM3B2
My newest toy. Just AOM3A1B + BreakdomainM21 @ 0.4, so this model is somewhat of a troll model.
I would like to create an improved DiffLoRAKit_v2 based on it.
Uploaded for research access etc. 2023-06-27

<details><summary>Sample image prompts</summary>
1. [Maid](https://majinai.art/ja/i/jhw20Z_)
2. Yotsuba: https://majinai.art/ja/i/f-O4wau
3. Inuko in cafe: https://majinai.art/ja/i/Cj-Ar9C
4. bathroom: https://majinai.art/ja/i/XiSj5K6
</details>
____
### Description for enthusiast
AOM3 was created with a focus on improving the NSFW version of AOM2, as mentioned above. AOM3 is a merge of the following two models into AOM2sfw using U-Net Blocks Weight Merge, extracting only the NSFW content:
(1) NAI: trained on Danbooru.
(2) gape: a finetune of NAI trained on Danbooru's very hardcore NSFW content.
In other words, if you are looking for something like AOM3sfw, that is AOM2sfw. AOM3 was merged with the NSFW models while removing only the layers that negatively impact the face and body. However, the faces and compositions are not an exact match to AOM2sfw; AOM2sfw is sometimes superior when generating SFW content. I recommend choosing according to the intended use of the illustration. See below for a comparison between AOM2sfw and AOM3.

▼A summary of the AOM3 work is as follows:
1. Investigated the impact of the NAI and gape layers, since AOM2 from _nsfw onwards is crap.
2. Cut the face layer OUT04, hoping to stop realistic faces → failed, no change.
3. gape/NAI layer investigation:
a. IN05-08 (especially IN07) | Changes the illustration significantly. Noise is applied, natural colours are lost, shadows die; the deep IN layers are clearly layers of light and shade.
b. OUT03-05(?) | Likely the sexual/NSFW section. Cutting here kills the NSFW.
c. OUT03, OUT04 | NSFW effects live here(?), e.g. spoken hearts, trembling, motion lines, etc.
d. OUT05 | This is really the NSFW switch. All the "NSFW atmosphere" is in here: facial expressions, heavy breaths, etc.
e. OUT10-11 | Paint layer. Does not affect detail, but has an extensive impact.
4. (Mass production of rubbish from here...)
5. Cut IN05-08 and merged NAI/gape with flat parameters → avoided the creepy muscles and real faces. Merging the NSFW models more strongly also has less impact this way.
6. So: cut IN05-08 and OUT10-11, and merged NAI+gape with all other blocks at 0.5.
7. → AOM3
That is roughly how AOM3 was made.
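The block-cut "Add Difference" merge described above can be sketched in a few lines. This is an illustrative simplification, not the actual supermerger code: the tiny "state dicts", the parameter names, and the name-to-block mapping are all hypothetical stand-ins for real safetensors weights.

```python
# Sketch of a U-Net block-weighted "Add Difference" merge:
#   merged[k] = A[k] + w_block * (B[k] - C[k])
# where w_block is 0.5 everywhere except the cut blocks (IN05-IN08, OUT10-11).
BLOCKS = ["BASE"] + [f"IN{i:02d}" for i in range(12)] + ["M00"] + [f"OUT{i:02d}" for i in range(12)]

def block_of(param_name: str) -> str:
    """Map a parameter name to its U-Net block (simplified, hypothetical convention)."""
    for b in BLOCKS[1:]:
        if param_name.startswith(b):
            return b
    return "BASE"

def add_difference(A, B, C, weights):
    """Per-parameter: merged[k] = A[k] + weights[block] * (B[k] - C[k])."""
    return {k: A[k] + weights[block_of(k)] * (B[k] - C[k]) for k in A}

weights = {b: 0.5 for b in BLOCKS}
for cut in ["IN05", "IN06", "IN07", "IN08", "OUT10", "OUT11", "BASE"]:
    weights[cut] = 0.0

# Scalar stand-ins for one weight per block of each model:
A = {"IN00.w": 1.0, "IN07.w": 1.0, "OUT05.w": 1.0}  # e.g. AOM2sfw
B = {"IN00.w": 3.0, "IN07.w": 3.0, "OUT05.w": 3.0}  # e.g. NAI full / gape
C = {"IN00.w": 2.0, "IN07.w": 2.0, "OUT05.w": 2.0}  # e.g. NAI sfw / NAI full

merged = add_difference(A, B, C, weights)
print(merged)  # IN07 is a cut block and stays at A's value; the others shift by 0.5*(B-C)
```

The cut blocks pass A through unchanged, which is exactly how the face-destroying layers are kept out of the merge.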
----
▼How to use
- Prompts
- Negative prompts: as simple as possible is good.
(worst quality, low quality:1.4)
- Using "3D" as a negative will result in a rough sketch style at the "sketch" level. Use with caution, as it is a very strong prompt.
- How to avoid Real Face
(realistic, lip, nose, tooth, rouge, lipstick, eyeshadow:1.0), (abs, muscular, rib:1.0),
- How to avoid Bokeh
(depth of field, bokeh, blurry:1.4)
- How to remove mosaic: `(censored, mosaic censoring, bar censor, convenient censoring, pointless censoring:1.0),`
- How to remove blush: `(blush, embarrassed, nose blush, light blush, full-face blush:1.4), `
- How to remove NSFW effects: `(trembling, motion lines, motion blur, emphasis lines:1.2),`
- Basic negative prompt samples for anime girls ↓
- v1
`nsfw, (worst quality, low quality:1.4), (realistic, lip, nose, tooth, rouge, lipstick, eyeshadow:1.0), (dusty sunbeams:1.0),, (abs, muscular, rib:1.0), (depth of field, bokeh, blurry:1.4),(motion lines, motion blur:1.4), (greyscale, monochrome:1.0), text, title, logo, signature`
- v2
`nsfw, (worst quality, low quality:1.4), (lip, nose, tooth, rouge, lipstick, eyeshadow:1.4), (blush:1.2), (jpeg artifacts:1.4), (depth of field, bokeh, blurry, film grain, chromatic aberration, lens flare:1.0), (1boy, abs, muscular, rib:1.0), greyscale, monochrome, dusty sunbeams, trembling, motion lines, motion blur, emphasis lines, text, title, logo, signature, `
- Sampler: ~~"DPM++ SDE Karras" is good~~ Take your pick
- Steps:
- DPM++ SDE Karras: Test: 12~, illustration: 20~
- DPM++ 2M Karras: Test: 20~, illustration: 28~
- Clip skip: 1 or 2
- CFG: 8 (6~12)
- Upscaler:
- Detailed illustrations → Latent (nearest-exact)
Denoise strength: 0.5 (0.5~0.6)
- Simple upscale: SwinIR, ESRGAN, Remacri, etc.
Denoise strength: can be set low (0.35~0.6)
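The `(term:weight)` notation in the prompt samples above is the AUTOMATIC1111 attention-emphasis syntax: the parenthesised terms get their attention multiplied by the given factor. A minimal illustrative parser follows; it is not the WebUI's actual implementation and deliberately ignores nesting and the bare-parentheses 1.1 shorthand.

```python
import re

def parse_weights(prompt: str):
    """Extract (term:weight) spans from an A1111-style prompt.
    Simplified sketch: no nesting, no bare-() handling."""
    out = []
    for m in re.finditer(r"\(([^():]+):([0-9.]+)\)", prompt):
        out.append((m.group(1).strip(), float(m.group(2))))
    return out

pairs = parse_weights("nsfw, (worst quality, low quality:1.4), (blush:1.2)")
print(pairs)  # [('worst quality, low quality', 1.4), ('blush', 1.2)]
```

This is why the recommended negative prompt `(worst quality, low quality:1.4)` counts as a single weighted span rather than two separate tags.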
---
👩‍🍳Model details / Recipe
▼Hash
- AOM3.safetensors
D124FC18F0232D7F0A2A70358CDB1288AF9E1EE8596200F50F0936BE59514F6D
- AOM3A1.safetensors
F303D108122DDD43A34C160BD46DBB08CB0E088E979ACDA0BF168A7A1F5820E0
- AOM3A2.safetensors
553398964F9277A104DA840A930794AC5634FC442E6791E5D7E72B82B3BB88C3
- AOM3A3.safetensors
EB4099BA9CD5E69AB526FCA22A2E967F286F8512D9509B735C892FA6468767CF
▼Use Models
1. AOM2sfw
「038ba203d8ba3c8af24f14e01fbb870c85bbb8d4b6d9520804828f4193d12ce9」
2. AnythingV3.0 huggingface pruned
[2700c435]「543bcbc21294831c6245cd74c8a7707761e28812c690f946cb81fef930d54b5e」
3. NovelAI animefull-final-pruned
[925997e9]「89d59c3dde4c56c6d5c41da34cc55ce479d93b4007046980934b14db71bdb2a8」
4. NovelAI sfw
[1d4a34af]「22fa233c2dfd7748d534be603345cb9abf994a23244dfdfc1013f4f90322feca」
5. Gape60
[25396b85]「893cca5903ccd0519876f58f4bc188dd8fcc5beb8a69c1a3f1a5fe314bb573f5」
6. BasilMix
「bbf07e3a1c3482c138d096f7dcdb4581a2aa573b74a68ba0906c7b657942f1c2」
7. chilloutmix_fp16.safetensors
「4b3bf0860b7f372481d0b6ac306fed43b0635caf8aa788e28b32377675ce7630」
8. Counterfeit-V2.5_fp16.safetensors
「71e703a0fca0e284dd9868bca3ce63c64084db1f0d68835f0a31e1f4e5b7cca6」
9. kenshi_01_fp16.safetensors
「3b3982f3aaeaa8af3639a19001067905e146179b6cddf2e3b34a474a0acae7fa」
----
▼AOM3
▼**Instructions:**
USE: [https://github.com/hako-mikan/sd-webui-supermerger/](https://github.com/hako-mikan/sd-webui-supermerger/)
(This extension is really great. It turns a month's work into an hour. Thank you.)
STEP: 1 | BWM: NAI - NAIsfw & gape - NAI
CUT: IN05-IN08, OUT10-11
| Model: A | Model: B | Model: C | Interpolation Method | Weight | Merge Name |
| -------- | -------- | -------- | -------------------- | ----------------------------------------------------------------------------------------- | ---------- |
| AOM2sfw | NAI full | NAI sfw | Add Difference @ 1.0 | 0,0.5,0.5,0.5,0.5,0.5,0,0,0,0,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0,0 | temp01 |
CUT: IN05-IN08, OUT10-11
| Model: A | Model: B | Model: C | Interpolation Method | Weight | Merge Name |
| -------- | -------- | -------- | -------------------- | ----------------------------------------------------------------------------------------- | ---------- |
| temp01 | gape60 | NAI full | Add Difference @ 1.0 | 0,0.5,0.5,0.5,0.5,0.5,0,0,0,0,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0,0 | AOM3 |
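The comma-separated weight strings in the tables above list one value per merge block. A common reading (an assumption here, based on typical Merge Block Weight conventions rather than anything stated in the table) is BASE followed by IN00-IN11, M00, and OUT00-OUT11, for 26 values. Pairing the AOM3 row with those labels recovers exactly the CUT note:

```python
# Hypothetical labeling of the 26 comma-separated merge weights:
# BASE, IN00-IN11, M00, OUT00-OUT11 (block order is an assumption).
BLOCKS = ["BASE"] + [f"IN{i:02d}" for i in range(12)] + ["M00"] + [f"OUT{i:02d}" for i in range(12)]

row = "0,0.5,0.5,0.5,0.5,0.5,0,0,0,0,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0,0"
weights = dict(zip(BLOCKS, (float(v) for v in row.split(","))))
assert len(row.split(",")) == len(BLOCKS) == 26

zeroed = [b for b, w in weights.items() if w == 0.0]
print(zeroed)  # BASE plus the cut blocks IN05-IN08 and OUT10-OUT11
```

Under that labeling, the zeroed entries line up with "CUT: IN05-IN08, OUT10-11" (plus BASE), which supports the assumed block order.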
▼AOM3A1
❗Only this model (AOM3A1) includes ChilloutMix (= the curse of Dreamlike). Commercial use is not available.❗
▼**Instructions:**
STEP: 1 | Change the base photorealistic model of AOM3 from BasilMix to ChilloutMix.
Change the photorealistic model from BasilMix to ChilloutMix and proceed to the gapeNAI merge.
STEP: 2 |
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | -------------- | --------------- | -------------- | ------------------ |
| 1 | SUM @ 0.5 | Counterfeit2.5 | Kenshi | | Counterfeit+Kenshi |
STEP: 3 |
CUT: BASE: 0, IN00-IN08 = 0, IN10 = 0.1, OUT03-04-05 = 0, OUT08 = 0.2
| Model: A | Model: B | Model: C | Interpolation Method | Weight | Merge Name |
| -------- | ------------------ | -------- | -------------------- | --------------------------------------------------------------------------- | ---------- |
| AOM3 | Counterfeit+Kenshi | | Add SUM @ 1.0 | 0,0,0,0,0,0,0,0,0,0.3,0.1,0.3,0.3,0.3,0.2,0.1,0,0,0,0.3,0.3,0.2,0.3,0.4,0.5 | AOM3A1 |
▼AOM3A2
▼?
CUT: BASE: 0, IN05 = 0.3, IN06-IN08 = 0, IN10 = 0.1, OUT03 = 0, OUT04 = 0.3, OUT05 = 0, OUT08 = 0.2
▼**Instructions:**
| Model: A | Model: B | Model: C | Interpolation Method | Weight | Merge Name |
| -------- | -------------- | -------- | -------------------- | --------------------------------------------------------- | ---------- |
| AOM3 | Counterfeit2.5 | | Add SUM @ 1.0 | 0,1,1,1,1,1,0.3,0,0,0,1,0.1,1,1,1,1,1,0,1,0,1,1,0.2,1,1,1 | AOM3A2 |
▼AOM3A3
▼?
CUT: BASE: 0, IN05-IN08 = 0, IN10 = 0.1, OUT03 = 0.5, OUT04-05 = 0.1, OUT08 = 0.2
| Model: A | Model: B | Model: C | Interpolation Method | Weight | Merge Name |
| -------- | -------------- | -------- | -------------------- | --------------------------------------------------------------------------------------------- | ---------- |
| AOM3 | Counterfeit2.5 | | Add SUM @ 1.0 | 0,0.6,0.6,0.6,0.6,0.6,0,0,0,0,0.6,0.1,0.6,0.6,0.6,0.6,0.6,0.5,0.1,0.1,0.6,0.6,0.2,0.6,0.6,0.6 | AOM3A3 |
----
## AbyssOrangeMix2 (AOM2)
――Creating the next generation of illustration with "Abyss"!
<img src="https://github.com/WarriorMama777/imgup/raw/main/img/AbyssOrangeMix2/HeroImage_AbyssOrangeMix2_Designed_01_comp001.webp" width="" height="" alt="HeroImage_AbyssOrangeMix2_Designed_01_comp001">
Prompt: [https://majinai.art/ja/i/nxpKRpw](https://majinai.art/ja/i/nxpKRpw)
▼About
AbyssOrangeMix2 (AOM2) is an AI model capable of generating high-quality, highly realistic illustrations.
It can generate elaborate and detailed illustrations that cannot be drawn by hand. It can also be used for a variety of purposes, making it extremely useful for design and artwork.
Furthermore, it provides an unparalleled new means of expression.
It can generate illustrations in a variety of genres to meet a wide range of needs. I encourage you to use "Abyss" to make your designs and artwork richer and of higher quality.
<img src="https://github.com/WarriorMama777/imgup/raw/main/img/AbyssOrangeMix2/UBM_ON_OFF_4_comp001.webp" width="" height="" alt="UBM_ON_OFF_4_comp001.webp">
※nvidia joke.
▼Description for engineers/enthusiasts
The merged model was formulated using an extension such as sdweb-merge-block-weighted-gui, which merges models at separate rates for each of the 25 U-Net blocks (input, intermediate, and output).
The validation of many Anons has shown that such a recipe can generate a painting style that is anatomically realistic enough to feel the finger skeleton, but still maintains an anime-style face.
The changes from AbyssOrangeMix are as follows.
1. The model used for the U-Net Blocks Weight Merge was changed from Instagram+F222 to BasilMix (<https://huggingface.co/nuigurumi>).
This is an excellent merge model that can generate decent human bodies while maintaining the facial layers of the Instagram model. Thanks!!!
This has improved the dullness of the colour and given a more Japanese skin tone (or more precisely, the moisturised white skin that the Japanese would ideally like).
Also, the unnatural bokeh that sometimes occurred in the previous version may have been eliminated (needs to be verified).
2. Added the IN deep layers (IN06-11) to the layers merged from the realistic model (BasilMix).
It is said that the IN deep layers (IN06-11) determine composition, etc., but perhaps light, reflections, and skin texture may also be involved.
It is like "Global Illumination", "Ray tracing" and "Ambient Occlusion" in 3DCG.
<img src="https://github.com/WarriorMama777/imgup/raw/main/img/AbyssOrangeMix2/AbyssOrangeMix2_comparison_comp001.webp" width="" height="" alt="AbyssOrangeMix2_comparison_comp001">
※This does not fundamentally improve the fingers. More research is needed to improve them (e.g. '[bad_prompt](https://huggingface.co/datasets/Nerfgun3/bad_prompt)').
About a 30-50% chance of generating correct fingers(?). The Abyss is deep.
▼Sample Gallery
The prompts for generating these images were all generated using ChatGPT. I simply asked it for prompts for "Pirates sailing the oceans".
However, to make sure the AI understood the specifications, I used the question template for AI prompt generation (v1.2).
Please review the following.
```
https://seesaawiki.jp/nai_ch/d/AI%a4%f2%b3%e8%cd%d1%a4%b7%a4%bf%a5%d7%a5%ed%a5%f3%a5%d7%a5%c8%c0%b8%c0%ae
```
The images thus generated, strangely enough, look like MidJourney or Nijijourney illustrations. Perhaps those services pass user prompts through GPT or something similar before handing them to the image AI 🤔
<img src="https://github.com/WarriorMama777/imgup/raw/main/img/AbyssOrangeMix2/SampleGallerBoardDesign_AbyssOrangeMix2_ReadMore_comp001.webp" width="" height="" alt="SampleGallerBoardDesign_AbyssOrangeMix2_03_comp001">
<details>
<summary>▼READ MORE 🖼</summary>
<img src="https://github.com/WarriorMama777/imgup/raw/main/img/AbyssOrangeMix2/SampleGallerBoardDesign_AbyssOrangeMix2_03_comp001.webp" width="" height="" alt="SampleGallerBoardDesign_AbyssOrangeMix2_03_comp001">
▼All prompts to generate sample images
1. [Gaming Girl](https://majinai.art/ja/i/GbTbLyk)
2. [Fantasy](https://majinai.art/ja/i/ax45Pof)
3. [Rainy Day](https://majinai.art/ja/i/1P9DUul)
4. [Kemomimi Girl](https://majinai.art/ja/i/hrUSb31)
5. [Supermarket](https://majinai.art/ja/i/6Mf4bVK)
6. [Lunch Time](https://majinai.art/ja/i/YAgQ4On)
7. [Women in the Garden](https://majinai.art/ja/i/oHZYum_)
8. [Pirate](https://majinai.art/ja/i/yEA3EZk)
9. [Japanese Girl](https://majinai.art/ja/i/x4G_B_e)
10. [Sweets Time](https://majinai.art/ja/i/vK_mkac)
11. [Glasses Girl](https://majinai.art/ja/i/Z87IHOC)
</details>
▼How to use
- VAE: orangemix.vae.pt
- ~~Prompts can be long or short~~
As simple as possible is good. Do not add excessively detailed prompts. Start with just this negative prompt:
(worst quality, low quality:1.4)
- Sampler: "DPM++ SDE Karras" is good
- Steps: Test: 12~, illustration: 20~
- Clip skip: 1 or 2
- Upscaler: Latent (nearest-exact)
- CFG Scale: 5 or 6 (4~8)
- Denoise strength: 0.5 (0.45~0.6)
If you use 0.7~, the picture will change too much.
If below 0.45, block noise occurs.
Model List
- AbyssOrangeMix2_sfw | BasilMix U-Net Blocks Weight Merge
- AbyssOrangeMix2_nsfw | + NAI-NAISFW 0.3 Merge
- AbyssOrangeMix2_hard | + Gape 0.3 Merge
※Changed the suffixes of the models:
_base → _sfw: _base was changed to _sfw.
_night → _nsfw: models merged up to NAI-NAISFW were changed from _night to _nsfw.
_half and no suffix → _hard: Gape-merged models were given the suffix _hard. Gape was reduced to 0.3 because it affects character modelling.
▼How to choose models
- _sfw: SFW
- _nsfw: SFW ~ soft NSFW 🥰
- _hard: SFW ~ hard NSFW
โผHash
- AbyssOrangeMix2_sfw.ckpt
ใf75b19923f2a4a0e70f564476178eedd94e76e2c94f8fd8f80c548742b5b51b9ใ
- AbyssOrangeMix2_sfw.safetensors
ใ038ba203d8ba3c8af24f14e01fbb870c85bbb8d4b6d9520804828f4193d12ce9ใ
- AbyssOrangeMix2_nsfw.safetensors
ใ0873291ac5419eaa7a18726e8841ce0f15f701ace29e0183c47efad2018900a4ใ
- AbyssOrangeMix_hard.safetensors
ใ0fc198c4908e98d7aae2a76bd78fa004e9c21cb0be7582e36008b4941169f18eใ
โผUse Models
1. AnythingV3.0 huggingface pruned
[2700c435]ใ543bcbc21294831c6245cd74c8a7707761e28812c690f946cb81fef930d54b5eใ
1. NovelAI animefull-final-pruned
[925997e9]ใ89d59c3dde4c56c6d5c41da34cc55ce479d93b4007046980934b14db71bdb2a8ใ
1. NovelAI sfw
[1d4a34af]ใ22fa233c2dfd7748d534be603345cb9abf994a23244dfdfc1013f4f90322fecaใ
1. Gape60
[25396b85]ใ893cca5903ccd0519876f58f4bc188dd8fcc5beb8a69c1a3f1a5fe314bb573f5ใ
1. BasilMix
ใbbf07e3a1c3482c138d096f7dcdb4581a2aa573b74a68ba0906c7b657942f1c2ใ
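As a side note, a downloaded checkpoint can be verified against the SHA256 values listed here with a few lines of Python. This is a stdlib-only sketch; the file path is whatever name you saved locally:

```python
import hashlib

def sha256sum(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so multi-GB checkpoints don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while data := f.read(chunk_size):
            h.update(data)
    return h.hexdigest()

# Example (hypothetical local filename):
# sha256sum("AbyssOrangeMix2_sfw.safetensors")
```

Compare the returned hex string against the value between 「」 above; any mismatch means a corrupted or altered download.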
### AbyssOrangeMix2_sfw (AOM2s)
▼**Instructions:**
STEP 1: Block Merge
| Model: A | Model: B | Weight | Base alpha | Merge Name |
| ------------ | -------- | --------------------------------------------------------------------- | ---------- | ------------------- |
| AnythingV3.0 | BasilMix | 1,0.9,0.7,0.5,0.3,0.1,1,1,1,1,1,1,0,0,0,0,0,0,0,0.1,0.3,0.5,0.7,0.9,1 | 0 | AbyssOrangeMix2_sfw |
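For readers unfamiliar with block merging: the 25 comma-separated weights in the table correspond to the U-Net's 12 input blocks, 1 middle block, and 12 output blocks, and each tensor is interpolated at its block's ratio. A minimal illustrative sketch (plain Python floats standing in for tensors; not the actual sdweb-merge-block-weighted-gui code):

```python
import re

# The 25 weights from the table above.
WEIGHTS = [1, 0.9, 0.7, 0.5, 0.3, 0.1, 1, 1, 1, 1, 1, 1,   # input blocks 0-11
           0,                                                # middle block
           0, 0, 0, 0, 0, 0, 0.1, 0.3, 0.5, 0.7, 0.9, 1]    # output blocks 0-11

def block_weight(key: str, base_alpha: float) -> float:
    """Pick the merge ratio for a parameter based on which U-Net block owns it."""
    m = re.search(r"input_blocks\.(\d+)\.", key)
    if m:
        return WEIGHTS[int(m.group(1))]
    if "middle_block." in key:
        return WEIGHTS[12]
    m = re.search(r"output_blocks\.(\d+)\.", key)
    if m:
        return WEIGHTS[13 + int(m.group(1))]
    return base_alpha  # parameters outside the U-Net blocks use "Base alpha"

def block_merge(model_a: dict, model_b: dict, base_alpha: float = 0.0) -> dict:
    """Per-tensor interpolation: out = (1 - w) * A + w * B, with w chosen per block."""
    return {k: (1 - block_weight(k, base_alpha)) * model_a[k]
               + block_weight(k, base_alpha) * model_b[k]
            for k in model_a}
```

In this sketch a weight of 1 takes a block entirely from Model B and 0 keeps Model A, which is why the recipe above keeps AnythingV3.0's middle/early-output blocks while pulling BasilMix into the outermost blocks.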
### AbyssOrangeMix2_nsfw (AOM2n)
▼?
Just AbyssOrangeMix2_sfw + (NAI-NAISFW) 0.3.
▼**Instructions:**
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------------- | ----------------- | -------------- | -------------------- |
| 1    | Add Difference @ 0.3 | AbyssOrangeMix2_sfw | NovelAI animefull | NovelAI sfw    | AbyssOrangeMix2_nsfw |
### AbyssOrangeMix2_hard (AOM2h)
▼?
+Gape 0.3 version of AbyssOrangeMix2_nsfw.
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | -------------------- | --------------- | ----------------- | -------------------- |
| 1 | Add Difference @ 0.3 | AbyssOrangeMix2_nsfw | Gape60 | NovelAI animefull | AbyssOrangeMix2_hard |
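"Add Difference @ α" in these tables is the WebUI checkpoint-merger formula result = Primary + (Secondary − Tertiary) × α: it grafts only the delta that fine-tuning added (e.g. Gape60 minus the NovelAI base it was trained from) onto the primary model, instead of blending whole models. A toy sketch with scalar stand-ins for the weights:

```python
# "Add Difference @ alpha": result = primary + (secondary - tertiary) * alpha

def add_difference(primary: dict, secondary: dict, tertiary: dict, alpha: float) -> dict:
    return {k: primary[k] + (secondary[k] - tertiary[k]) * alpha for k in primary}

# Toy example: one scalar "parameter" per model.
primary = {"w": 1.0}    # the model being modified
secondary = {"w": 3.0}  # the fine-tuned model
tertiary = {"w": 2.0}   # the base the fine-tune started from
merged = add_difference(primary, secondary, tertiary, 0.3)
# merged["w"] ≈ 1.0 + (3.0 - 2.0) * 0.3 ≈ 1.3
```

Subtracting the tertiary model cancels everything the fine-tune shares with its base, so only the learned change is scaled by α and added.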
----
## EerieOrangeMix (EOM)
EerieOrangeMix is the generic name for U-Net Blocks Weight Merge models based on Elysium (Anime V2).
Since there are infinite possibilities for U-Net Blocks Weight Merging, I plan to treat all Elysium-based models as a lineage of this model.
※This does not fundamentally improve the fingers. Therefore, more research is needed to improve them (e.g. '[bad_prompt](https://huggingface.co/datasets/Nerfgun3/bad_prompt)').
<img src="https://files.catbox.moe/yjnqna.webp" width="1000" height="" alt="HeroImage_EerieOrangeMix_Designed_comp001">
### EerieOrangeMix (EOM1)
▼?
This merge model is simply a U-Net Blocks Weight Merge of ElysiumAnime V2 using the AbyssOrangeMix method.
The Anything model is good at cute girls, but no matter how hard I try, it doesn't seem to be good at women in their late 20s and beyond. Therefore, I created a U-Net Blocks Weight Merge model based on my personal favorite, the ElysiumAnime V2 model. ElyOrangeMix was originally my favorite, so this is an enhanced version of that.
▼Model List
- EerieOrangeMix_base｜Instagram+F222 U-Net Blocks Weight Merge
- EerieOrangeMix_night｜+ NAI-NAISFW Merge
- EerieOrangeMix_half｜+ Gape0.5 Merge
- EerieOrangeMix｜+ Gape1.0 Merge
▼How to choose a model
- _base : SFW😉
- _Night : SFW～Soft NSFW🥰
- _half : SFW～NSFW😍
- unlabeled : SFW～HARDCORE～🤯 e.g. AbyssOrangeMix, BloodOrangeMix, etc.
▼Hash
- EerieOrangeMix.safetensors
- EerieOrangeMix_half.safetensors
- EerieOrangeMix_night.safetensors
- EerieOrangeMix_base.ckpt
▼Use Models
[] = WebUI hash, 「」 = SHA256
1. Elysium Anime V2
[]「5c4787ce1386500ee05dbb9d27c17273c7a78493535f2603321f40f6e0796851」
2. NovelAI animefull-final-pruned
[925997e9]「89d59c3dde4c56c6d5c41da34cc55ce479d93b4007046980934b14db71bdb2a8」
3. NovelAI sfw
[1d4a34af]「22fa233c2dfd7748d534be603345cb9abf994a23244dfdfc1013f4f90322feca」
4. Gape60
[25396b85]「893cca5903ccd0519876f58f4bc188dd8fcc5beb8a69c1a3f1a5fe314bb573f5」
5. instagram-latest-plus-clip-v6e1_50000.safetensors
[]「8f1d325b194570754c6bd06cf1e90aa9219a7e732eb3d488fb52157e9451a2a5」
6. f222
[]「9e2c6ceff3f6d6f65c6fb0e10d8e69d772871813be647fd2ea5d06e00db33c1f」
7. sd1.5_pruned
[]「e1441589a6f3c5a53f5f54d0975a18a7feb7cdf0b0dee276dfc3331ae376a053」
▼Sample Gallery
<img src="https://files.catbox.moe/oqbvti.webp" width="1000" height="" alt="2022-12-30_MotorbikeGIrlAsa3_comp001">
<details>
<summary>More🖼</summary>
<img src="https://files.catbox.moe/nmmswd.webp" width="" height="600" alt="2022-12-30_SampleGallery5">
</details>
▼How to use
- VAE: orangemix.vae.pt
- Keep it as simple as possible. Do not add excessive detail prompts. Start with just this:
  (worst quality, low quality:1.4)
- Sampler: "DPM++ SDE Karras" is good
- Steps: for testing: 20～24, for illustration: 24～50
- Clip skip: 1
- USE "upscale latent space"
- Denoising strength: 0.45 (0.4~0.5)
  At 0.7 or above, the picture changes too much.
▼Prompts
When generating cute girls, try this negative prompt first. It avoids low-quality output, prevents blurring, avoids dull colors, and enforces anime-style cute face modeling.
```jsx
nsfw, (worst quality, low quality:1.3), (depth of field, blurry:1.2), (greyscale, monochrome:1.1), 3D face, nose, cropped, lowres, text, jpeg artifacts, signature, watermark, username, blurry, artist name, trademark, watermark, title, (tan, muscular, loli, petite, child, infant, toddlers, chibi, sd character:1.1), multiple view, Reference sheet,
```
---
#### EerieOrangeMix_base (EOM1b)
▼?
Details are omitted since it is the same as AbyssOrangeMix.
▼**Instructions:**
STEP 1: Creation of a photorealistic model for merging
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------------------------------- | --------------- | -------------- | ---------- |
| 1 | Add Difference @ 1.0 | instagram-latest-plus-clip-v6e1_50000 | f222 | sd1.5_pruned | Insta_F222 |
STEP 2: Block Merge
Merge Insta_F222
| Model: A | Model: B | Weight | Base alpha | Merge Name |
| ---------------- | ---------- | --------------------------------------------------------------------- | ---------- | ---------- |
| Elysium Anime V2 | Insta_F222 | 1,0.9,0.7,0.5,0.3,0.1,0,0,0,0,0,0,0,0,0,0,0,0,0,0.1,0.3,0.5,0.7,0.9,1 | 0 | Temp1 |
#### EerieOrangeMix_Night (EOM1n)
▼?
Just EerieOrangeMix_base + (NAI-NAISFW) 0.3.
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------------- | ----------------- | -------------- | -------------------- |
| 1 | Add Difference @ 0.3 | EerieOrangeMix_base | NovelAI animefull | NovelAI sfw | EerieOrangeMix_Night |
#### EerieOrangeMix_half (EOM1h)
▼?
+Gape0.5 version of EerieOrangeMix.
▼**Instructions:**
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | -------------------- | ----------------- | -------------- | ------------------- |
| 1    | Add Difference @ 0.5 | EerieOrangeMix_Night | Gape60            | NovelAI animefull | EerieOrangeMix_half |
#### EerieOrangeMix (EOM1)
▼**Instructions:**
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | -------------------- | --------------- | ----------------- | -------------- |
| 1 | Add Difference @ 1.0 | EerieOrangeMix_Night | Gape60 | NovelAI animefull | EerieOrangeMix |
----
### EerieOrangeMix2 (EOM2)
▼?
This model was created by adding the layers of ElysiumV1 responsible for detailing and painting to EerieOrangeMix_base, then merging NAI and Gape.
▼Model List
- EerieOrangeMix2_base｜Instagram+F222+ElysiumV1 U-Net Blocks Weight Merge
- EerieOrangeMix2_night｜+ NAI-NAISFW Merge
- EerieOrangeMix2_half｜+ Gape0.5 Merge
- EerieOrangeMix2｜+ Gape1.0 Merge
▼How to choose a model
- _base : SFW😉
- _Night : SFW～Soft NSFW🥰
- _half : SFW～NSFW😍
- unlabeled : SFW～HARDCORE～🤯 e.g. AbyssOrangeMix, BloodOrangeMix, etc.
▼Hash
- EerieOrangeMix2.safetensors
- EerieOrangeMix2_half.safetensors
- EerieOrangeMix2_night.safetensors
- EerieOrangeMix2_base.ckpt
▼Use Models
[] = WebUI hash, 「」 = SHA256
1. Elysium Anime V2
[]「5c4787ce1386500ee05dbb9d27c17273c7a78493535f2603321f40f6e0796851」
2. NovelAI animefull-final-pruned
[925997e9]「89d59c3dde4c56c6d5c41da34cc55ce479d93b4007046980934b14db71bdb2a8」
3. NovelAI sfw
[1d4a34af]「22fa233c2dfd7748d534be603345cb9abf994a23244dfdfc1013f4f90322feca」
4. Gape60
[25396b85]「893cca5903ccd0519876f58f4bc188dd8fcc5beb8a69c1a3f1a5fe314bb573f5」
5. instagram-latest-plus-clip-v6e1_50000.safetensors
[]「8f1d325b194570754c6bd06cf1e90aa9219a7e732eb3d488fb52157e9451a2a5」
6. f222
[]「9e2c6ceff3f6d6f65c6fb0e10d8e69d772871813be647fd2ea5d06e00db33c1f」
7. sd1.5_pruned
[]「e1441589a6f3c5a53f5f54d0975a18a7feb7cdf0b0dee276dfc3331ae376a053」
8. ElysiumV1
「abbb28cb5e70d3e0a635f241b8d61cefe42eb8f1be91fd1168bc3e52b0f09ae4」
#### EerieOrangeMix2_base (EOM2b)
▼?
▼Instructions
STEP 1: Block Merge
Merge ElysiumV1
The generated results do not change much with or without this step, but I wanted to incorporate Elysium's depiction, so I merged it.
| Model: A | Model: B | Weight | Base alpha | Merge Name |
| ------------------- | --------- | --------------------------------------------------------------------- | ---------- | -------------------- |
| EerieOrangeMix_base | ElysiumV1 | 1,0.9,0.7,0.5,0.3,0.1,0,0,0,0,0,0,0,0,0,0,0,0,0,0.1,0.3,0.5,0.7,0.9,1 | 0 | EerieOrangeMix2_base |
#### EerieOrangeMix2_night (EOM2n)
▼?
Just EerieOrangeMix2_base + (NAI-NAISFW) 0.3.
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------------- | ----------------- | -------------- | --------------------- |
| 1    | Add Difference @ 0.3 | EerieOrangeMix2_base | NovelAI animefull | NovelAI sfw    | EerieOrangeMix2_Night |
#### EerieOrangeMix2_half (EOM2h)
▼?
+Gape0.5 version of EerieOrangeMix2.
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | -------------------- | ----------------- | -------------- | -------------------- |
| 1    | Add Difference @ 0.5 | EerieOrangeMix2_Night | Gape60            | NovelAI animefull | EerieOrangeMix2_half |
#### EerieOrangeMix2 (EOM2)
▼**Instructions:**
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | -------------------- | --------------- | ----------------- | --------------- |
| 1    | Add Difference @ 1.0 | EerieOrangeMix2_Night | Gape60          | NovelAI animefull | EerieOrangeMix2 |
### Models Comparison
<img src="https://files.catbox.moe/mp2fr4.webp" width="1000" height="" alt="MotorbikeGIrlAsa_Eerie_Abyss_Comparison_comp001">
<img src="https://files.catbox.moe/9xqths.webp" width="1000" height="" alt=โEerie_Abyss_Comparison_02_comp001โ>
<img src="https://files.catbox.moe/cm6c7m.webp" width="1000" height="" alt=โEerie_Comparison_01_comp001โ>
โปThe difference is slight but probably looks like this.
โ warm color, โ natural color, โ animated color
----
## AbyssOrangeMix (AOM)
──How can you guys take on such a deep swamp and get results?
Is it something like "Made in Abyss"?
── By Anon, 115th thread
<img src="https://files.catbox.moe/wst1bp.webp" width="1000" height="">
▼?
This merged model was made with an extension such as sdweb-merge-block-weighted-gui, which merges models at a separate ratio for each of the 25 U-Net blocks (input, middle, and output).
Validation by many Anons has shown that such a recipe can generate a painting style that is anatomically realistic enough to feel the finger skeleton, while still maintaining an anime-style face.
※This model is the result of a great deal of testing and experimentation by many Anons.
※This model can be very difficult to handle. I am not 100% confident in my ability to use it. It is peaky and for experts.
※This does not fundamentally improve the fingers; I recommend using bad_prompt, etc. (embeddings) in combination.
▼Sample Gallery
(1)
<img src="https://files.catbox.moe/8mke0t.webp" width="1000" height="">
```jsx
((masterpiece)), best quality, perfect anatomy, (1girl, solo focus:1.4), pov, looking at viewer, flower trim,(perspective, sideway, From directly above ,lying on water, open hand, palm, :1.3),(Accurate five-fingered hands, Reach out, hand focus, foot focus, Sole, heel, ball of the thumb:1.2), (outdoor, sunlight:1.2),(shiny skin:1.3),,(masterpiece, white border, outside border, frame:1.3),
, (motherhood, aged up, mature female, medium breasts:1.2), (curvy:1.1), (single side braid:1.2), (long hair with queue and braid, disheveled hair, hair scrunchie, tareme:1.2), (light Ivory hair:1.2), looking at viewer,, Calm, Slight smile,
,(anemic, dark, lake, river,puddle, Meadow, rock, stone, moss, cliff, white flower, stalactite, Godray, ruins, ancient, eternal, deep ,mystic background,sunlight,plant,lily,white flowers, Abyss, :1.2), (orange fruits, citrus fruit, citrus fruit bearing tree:1.4), volumetric lighting,good lighting,, masterpiece, best quality, highly detailed,extremely detailed cg unity 8k wallpaper,illustration,((beautiful detailed face)), best quality, (((hyper-detailed ))), high resolution illustration ,high quality, highres, sidelighting, ((illustrationbest)),highres,illustration, absurdres, hyper-detailed, intricate detail, perfect, high detailed eyes,perfect lighting, (extremely detailed CG:1.2),
Negative prompt: (bad_prompt_version2:1), distant view, lip, Pregnant, maternity, pointy ears, realistic, tan, muscular, greyscale, monochrome, lineart, 2koma, 3koma, 4koma, manga, 3D, 3Dcubism, pablo picasso, disney, marvel, mutanted breasts, mutanted nipple, cropped, lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name, lowres, trademark, watermark, title, text, deformed, bad anatomy, disfigured, mutated, extra limbs, ugly, missing limb, floating limbs, disconnected limbs, out of frame, mutated hands and fingers, poorly drawn hands, malformed hands, poorly drawn face, poorly drawn asymmetrical eyes, (blurry:1.4), duplicate (loli, petite, child, infant, toddlers, chibi, sd character, teen age:1.4), tsurime, helmet hair, evil smile, smug_face, naughty smile, multiple view, Reference sheet, (worst quality, low quality:1.4),
Steps: 24, Sampler: DPM++ SDE Karras, CFG scale: 10, Seed: 1159970659, Size: 1536x768, Model hash: cc44dbff, Model: AbyssOrangeMix, Variation seed: 93902374, Variation seed strength: 0.45, Denoising strength: 0.45, ENSD: 31337
```
(2)
<img src="https://files.catbox.moe/6cbrqh.webp" width="" height="600">
```jsx
street, 130mm f1.4 lens, ,(shiny skin:1.3),, (teen age, school uniform:1.2), (glasses, black hair, medium hair with queue and braid, disheveled hair, hair scrunchie, tareme:1.2), looking at viewer,, Calm, Slight smile,
Negative prompt: (bad_prompt_version2:1), distant view, lip, Pregnant, maternity, pointy ears, realistic, tan, muscular, greyscale, monochrome, lineart, 2koma, 3koma, 4koma, manga, 3D, 3Dcubism, pablo picasso, disney, marvel, mutanted breasts, mutanted nipple, cropped, lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name, lowres, trademark, watermark, title, text, deformed, bad anatomy, disfigured, mutated, extra limbs, ugly, missing limb, floating limbs, disconnected limbs, out of frame, mutated hands and fingers, poorly drawn hands, malformed hands, poorly drawn face, poorly drawn asymmetrical eyes, (blurry:1.4), duplicate (loli, petite, child, infant, toddlers, chibi, sd character, teen age:1.4), tsurime, helmet hair, evil smile, smug_face, naughty smile, multiple view, Reference sheet, (worst quality, low quality:1.4),
Steps: 24, Sampler: DPM++ SDE Karras, CFG scale: 10, Seed: 1140782193, Size: 1024x1536, Model hash: cc44dbff, Model: AbyssOrangeMix, Denoising strength: 0.45, ENSD: 31337, First pass size: 512x768, Model sha256: 6bb3a5a3b1eadd32, VAE sha256: f921fb3f29891d2a, Options: xformers medvram gtx_16x0
Used embeddings: bad_prompt_version2 [afea]
```
----
▼How to use
- VAE: orangemix.vae.pt
- ~~Prompts can be long or short~~
  Keep it as simple as possible. Do not add excessive detail prompts. Start with just this:
  (worst quality, low quality:1.4)
- Sampler: "DPM++ SDE Karras" is good
- Steps: for testing: 20～24, for illustration: 24～50
- Clip skip: 1
- USE "upscale latent space"
- Denoising strength: 0.45 (0.4~0.5)
  At 0.7 or above, the picture changes too much.
▼Prompts
When generating cute girls, try this negative prompt first. It avoids low-quality output, prevents blurring, avoids dull colors, and enforces anime-style cute face modeling.
```jsx
nsfw, (worst quality, low quality:1.3), (depth of field, blurry:1.2), (greyscale, monochrome:1.1), 3D face, nose, cropped, lowres, text, jpeg artifacts, signature, watermark, username, blurry, artist name, trademark, watermark, title, (tan, muscular, loli, petite, child, infant, toddlers, chibi, sd character:1.1), multiple view, Reference sheet,
```
▼Model List
- AbyssOrangeMix_base｜Instagram Merge
- AbyssOrangeMix_Night｜+ NAI-NAISFW Merge
- AbyssOrangeMix_half｜+ Gape0.5 Merge
- AbyssOrangeMix｜+ Gape1.0 Merge
▼How to choose a model
- _base : SFW😉
- _Night : SFW～Soft NSFW🥰
- _half : SFW～NSFW😍
- unlabeled : SFW～HARDCORE～🤯 e.g. AbyssOrangeMix, BloodOrangeMix, etc.
▼Hash (SHA256)
- AbyssOrangeMix.safetensors
6bb3a5a3b1eadd32dfbc8f0987559c48cb4177aee7582baa6d6a25181929b345
- AbyssOrangeMix_half.safetensors
468d1b5038c4fbd354113842e606fe0557b4e0e16cbaca67706b29bcf51dc402
- AbyssOrangeMix_Night.safetensors
167cd104699dd98df22f4dfd3c7a2c7171df550852181e454e71e5bff61d56a6
- AbyssOrangeMix_base.ckpt
bbd2621f3ec4fad707f75fc032a2c2602c296180a53ed3d9897d8ca7a01dd6ed
▼Use Models
1. AnythingV3.0 huggingface pruned
[2700c435]「543bcbc21294831c6245cd74c8a7707761e28812c690f946cb81fef930d54b5e」
2. NovelAI animefull-final-pruned
[925997e9]「89d59c3dde4c56c6d5c41da34cc55ce479d93b4007046980934b14db71bdb2a8」
3. NovelAI sfw
[1d4a34af]「22fa233c2dfd7748d534be603345cb9abf994a23244dfdfc1013f4f90322feca」
4. Gape60
[25396b85]「893cca5903ccd0519876f58f4bc188dd8fcc5beb8a69c1a3f1a5fe314bb573f5」
5. instagram-latest-plus-clip-v6e1_50000.safetensors
[]「8f1d325b194570754c6bd06cf1e90aa9219a7e732eb3d488fb52157e9451a2a5」
6. f222
[]「9e2c6ceff3f6d6f65c6fb0e10d8e69d772871813be647fd2ea5d06e00db33c1f」
7. sd1.5_pruned
[]「e1441589a6f3c5a53f5f54d0975a18a7feb7cdf0b0dee276dfc3331ae376a053」
### AbyssOrangeMix_base (AOMb)
▼?
The basic trick of this merged model is to incorporate a model trained on more than 1M Instagram photos (mostly Japanese) or a photorealistic model like f222. The choice of base model is up to the person; I chose AnythingV3 for versatility.
▼**Instructions:**
STEP 1: Creation of a photorealistic model for merging
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------------------------------- | --------------- | -------------- | ---------- |
| 1 | Add Difference @ 1.0 | instagram-latest-plus-clip-v6e1_50000 | f222 | sd1.5_pruned | Insta_F222 |
STEP 2: Block Merge
| Model: A | Model: B | Weight | Base alpha | Merge Name |
| ------------ | ---------- | --------------------------------------------------------------------- | ---------- | ------------------- |
| AnythingV3.0 | Insta_F222 | 1,0.9,0.7,0.5,0.3,0.1,0,0,0,0,0,0,0,0,0,0,0,0,0,0.1,0.3,0.5,0.7,0.9,1 | 0 | AbyssOrangeMix_base |
### AbyssOrangeMix_Night (AOMn)
▼?
Just AbyssOrangeMix_base + (NAI-NAISFW) 0.3.
▼**Instructions:**
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------------- | ----------------- | -------------- | -------------------- |
| 1 | Add Difference @ 0.3 | AbyssOrangeMix_base | NovelAI animefull | NovelAI sfw | AbyssOrangeMix_Night |
### AbyssOrangeMix_half (AOMh)
▼?
+Gape0.5 version of AbyssOrangeMix.
▼**Instructions:**
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | -------------------- | --------------- | ----------------- | ------------------- |
| 1 | Add Difference @ 0.5 | AbyssOrangeMix_Night | Gape60 | NovelAI animefull | AbyssOrangeMix_half |
### AbyssOrangeMix (AOM)
▼**Instructions:**
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | -------------------- | --------------- | ----------------- | -------------- |
| 1 | Add Difference @ 1.0 | AbyssOrangeMix_Night | Gape60 | NovelAI animefull | AbyssOrangeMix |
----
## ElyOrangeMix (ELOM)
<img src="https://i.imgur.com/AInEXA5.jpg" width="1000" height="">
▼?
Elysium_Anime_V2 + NAI + Gape.
This is a merge model that improves on Elysium_Anime_V2, whose NSFW representation is not good.
It can produce SFW, NSFW, and any other type of artwork, while retaining Elysium's three-dimensional, thickly painted style.
▼How to choose a model
- _base : SFW😉
- _Night : SFW～Soft NSFW🥰
- _half : SFW～NSFW😍
- unlabeled : SFW～HARDCORE～🤯 e.g. AbyssOrangeMix, BloodOrangeMix, etc.
▼How to use
- VAE: orangemix.vae.pt
▼Hash
- ElyOrangeMix [6b508e59]
- ElyOrangeMix_half [6b508e59]
- ElyNightOrangeMix [6b508e59]
### ElyOrangeMix (ELOM)
▼Use Models
1. Elysium_Anime_V2 [6b508e59]
2. NovelAI animefull-final-pruned [925997e9]
3. NovelAI sfw [1d4a34af]
4. Gape60 [25396b85]
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ---------------- | ----------------- | ----------------- | ------------------------ |
| 1 | Add Difference @ 0.3 | Elysium_Anime_V2 | NovelAI animefull | NovelAI sfw | tempmix-part1 [] |
| 2 | Add Difference @ 1.0 | tempmix-part1 | Gape60 | NovelAI animefull | ElyOrangeMix [6b508e59] |
---
### ElyOrangeMix_half (ELOMh)
▼?
+Gape0.5 version of ElyOrangeMix.
▼Use Models
1. Elysium_Anime_V2 [6b508e59]
2. NovelAI animefull-final-pruned [925997e9]
3. NovelAI sfw [1d4a34af]
4. Gape60 [25396b85]
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ---------------- | ----------------- | ----------------- | ----------------------------- |
| 1 | Add Difference @ 0.3 | Elysium_Anime_V2 | NovelAI animefull | NovelAI sfw | tempmix-part1 [] |
| 2 | Add Difference @ 0.5 | tempmix-part1 | Gape60 | NovelAI animefull | ElyOrangeMix_half [6b508e59] |
----
### ElyNightOrangeMix (ELOMn)
▼?
A merged model that simply did Elysium_Anime_V2 + (NAI-NAISFW) 0.3.
▼Use Models
1. Elysium_Anime_V2 [6b508e59]
2. NovelAI animefull-final-pruned [925997e9]
3. NovelAI sfw [1d4a34af]
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ---------------- | ----------------- | -------------- | ----------------- |
| 1 | Add Difference @ 0.3 | Elysium_Anime_V2 | NovelAI animefull | NovelAI sfw | ElyNightOrangeMix |
----
## BloodOrangeMix (BOM)
<img src="https://i.imgur.com/soAnnFk.jpg" width="1000" height="">
▼?
Anything + NAI + Gape.
This is a merge model that improves on AnythingV3, whose NSFW representation is not good.
It can produce SFW, NSFW, and any other type of artwork, while retaining the flat, beautifully painted style of AnythingV3.
Stable. Popular in the Japanese community.
▼Model List & [] = WebUI hash, 「」 = SHA256
- BloodNightOrangeMix.ckpt
[ffa7b160]「f8aff727ba3da0358815b1766ed232fd1ef9682ad165067cac76e576d19689e0」
- BloodOrangeMix_half.ckpt
[ffa7b160]「b2168aaa59fa91229b8add21f140ac9271773fe88a387276f3f0c7d70f726a83」
- BloodOrangeMix.ckpt
[ffa7b160]「25cece3fe303ea8e3ad40c3dca788406dbd921bcf3aa8e3d1c7c5ac81f208a4f」
- BloodOrangeMix.safetensors
「79a1edf6af43c75ee1e00a884a09213a28ee743b2e913de978cb1f6faa1b320d」
▼How to choose a model
- _base : SFW😉
- _Night : SFW～Soft NSFW🥰
- _half : SFW～NSFW😍
- unlabeled : SFW～HARDCORE～🤯 e.g. AbyssOrangeMix, BloodOrangeMix, etc.
▼How to use
- VAE: orangemix.vae.pt
### BloodOrangeMix (BOM)
▼Use Models
1. AnythingV3.0 huggingface pruned [2700c435]
2. NovelAI animefull-final-pruned [925997e9]
3. NovelAI sfw [1d4a34af]
4. Gape60 [25396b85]
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------- | ----------------- | ----------------- | ------------------------- |
| 1 | Add Difference @ 0.3 | AnythingV3.0 | NovelAI animefull | NovelAI sfw | tempmix-part1 [] |
| 2 | Add Difference @ 1.0 | tempmix-part1 | Gape60 | NovelAI animefull | BloodOrangeMix [ffa7b160] |
----
### BloodOrangeMix_half (BOMh)
▼?
Anything + NAI + Gape0.5.
+Gape0.5 version of BloodOrangeMix.
The NSFW expression is softer and has less impact on the Anything painting style.
▼Use Models
1. AnythingV3.0 huggingface pruned [2700c435]
2. NovelAI animefull-final-pruned [925997e9]
3. NovelAI sfw [1d4a34af]
4. Gape60 [25396b85]
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------- | ----------------- | ----------------- | ------------------------------ |
| 1 | Add Difference @ 0.3 | AnythingV3.0 | NovelAI animefull | NovelAI sfw | tempmix-part1 [] |
| 2 | Add Difference @ 0.5 | tempmix-part1 | Gape60 | NovelAI animefull | BloodOrangeMix_half [ffa7b160] |
----
### BloodNightOrangeMix (BOMn)
▼?
A merged model that simply did AnythingV3 + (NAI-NAISFW) 0.3.
▼Use Models
1. AnythingV3.0 huggingface pruned [2700c435]
2. NovelAI animefull-final-pruned [925997e9]
3. NovelAI sfw [1d4a34af]
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ------------- | ----------------- | -------------- | ------------------- |
| 1 | Add Difference @ 0.3 | AnythingV3.0 | NovelAI animefull | NovelAI sfw | BloodNightOrangeMix |
----
## ElderOrangeMix
※I found this model to be very prone to body collapse. Not recommended.
▼?
anything and everything mix ver.1.5 + Gape + NAI (AnEve.G.N0.3)
This is a merged model with improved NSFW representation of anything and everything mix ver.1.5.
▼Hash
[3a46a1e0]
▼Use Models
1. anything and everything mix ver.1.5 [5265dcf6]
2. NovelAI animefull-final-pruned [925997e9]
3. NovelAI sfw [1d4a34af]
4. Gape60 [25396b85]
▼Instructions
| Step | Interpolation Method | Primary Model | Secondary Model | Tertiary Model | Merge Name |
| ---- | -------------------- | ----------------------------------- | --------------- | -------------- | -------------------------- |
| 1 | Add Difference @ 0.5 | anything and everything mix ver.1.5 | Gape60 | NovelAI full | tempmix-part1 [] |
| 2 | Add Difference @ 0.3 | tempmix-part1 | NovelAI full | NovelAI sfw | ElderOrangeMix [3a46a1e0] |
----
## Troubleshooting
1. Blurred images & clearly low-quality output
If the generated images are blurred or only clearly low-quality output is produced, the VAE etc. may not be loaded properly. Try reloading the model/VAE or restarting the WebUI/OS.
## FAQ and Tips (MEME ZONE)
Trash zone.
----
<a name="MEME_AOM3A1"></a>
▼AOM3A1?
R.I.P.

<a name="MEME_realface"></a>
▼No, AOM2 (only hentai models)

▼Nooo^()&*%#NG0u!!!!!!!!
→ ? (「AOM3A2 and A3 are overlearning and Trash. delete!」)
<img src="https://github.com/WarriorMama777/imgup/raw/main/img/img_general/img_meme_tension_comp001.webp" width="300" height="" alt="getting_excited">
▼Nooo, too many models. Tell me which one to choose.
→ [Just use them all.](https://github.com/WarriorMama777/imgup/blob/main/img/img_general/img_MEME_whichModel_comp001.webp?raw=true)
▼Nooo, not work. This guy is a scammer.
STEP 1: BUY HUGE PC
▼Noooo, can't generate images like the samples. These models are hype.
⭕
<img src="https://files.catbox.moe/nte6ud.webp" width="500" height="">
😢
<img src="https://files.catbox.moe/lta462.webp" width="500" height="">
▼Nooooo, these models have a trojan virus. Don't download.
All models in this repository are secure. Most likely your anti-virus software has flagged them erroneously.
However, models with the .ckpt extension can potentially execute arbitrary code when loaded.
Models with the .safetensors extension are free from this danger.
0.053192138671875,
-0.051513671875,
0.0217742919921875,
0.0232391357421875,
-0.0028133392333984375,
0.024932861328125,
-0.03753662109375,
-0.00806427001953125,
-0.01959228515625,
0.00440216064453125,
-0.057403564453125,
-0.02972412109375,
0.0380859375,
-0.038848876953125,
0.0258636474609375,
-0.03057861328125,
-0.025360107421875,
-0.0207672119140625,
-0.0294342041015625,
0.0283355712890625,
0.059967041015625,
-0.032745361328125,
0.060150146484375,
0.040740966796875,
-0.004741668701171875,
-0.028900146484375,
-0.0667724609375,
-0.005512237548828125,
-0.0258636474609375,
-0.05487060546875,
0.03302001953125,
-0.018646240234375,
-0.0307159423828125,
0.01393890380859375,
0.004444122314453125,
-0.01255035400390625,
0.0033130645751953125,
0.02813720703125,
0.01081085205078125,
-0.01355743408203125,
-0.03253173828125,
0.00098419189453125,
-0.0025615692138671875,
-0.01076507568359375,
0.00009900331497192383,
0.03521728515625,
-0.0020160675048828125,
-0.00771331787109375,
-0.021270751953125,
0.0277557373046875,
0.05517578125,
-0.0018253326416015625,
0.05078125,
0.04339599609375,
-0.035369873046875,
0.00576019287109375,
-0.042236328125,
-0.0177154541015625,
-0.03521728515625,
0.0083465576171875,
-0.01088714599609375,
-0.044281005859375,
0.06463623046875,
-0.002193450927734375,
0.0174560546875,
0.060455322265625,
0.0271453857421875,
-0.004749298095703125,
0.0927734375,
0.039886474609375,
0.00867462158203125,
0.0321044921875,
-0.0565185546875,
-0.005748748779296875,
-0.07843017578125,
-0.014251708984375,
-0.032318115234375,
-0.0251617431640625,
-0.041778564453125,
-0.037689208984375,
0.0302276611328125,
0.032928466796875,
-0.0276336669921875,
0.04559326171875,
-0.0352783203125,
0.030792236328125,
0.0185699462890625,
0.01194000244140625,
0.011993408203125,
-0.0085601806640625,
-0.0283355712890625,
-0.0036907196044921875,
-0.053741455078125,
-0.0266265869140625,
0.07281494140625,
0.026153564453125,
0.050323486328125,
0.047210693359375,
0.0667724609375,
0.006656646728515625,
0.0316162109375,
-0.0235595703125,
0.046234130859375,
-0.019989013671875,
-0.06658935546875,
-0.006656646728515625,
-0.04681396484375,
-0.0860595703125,
0.029510498046875,
-0.0016050338745117188,
-0.06097412109375,
0.0400390625,
0.00695037841796875,
-0.0264129638671875,
0.03521728515625,
-0.0565185546875,
0.05926513671875,
-0.018951416015625,
-0.040374755859375,
0.006496429443359375,
-0.0469970703125,
0.05108642578125,
0.0214080810546875,
0.03521728515625,
-0.0171051025390625,
-0.0154266357421875,
0.064208984375,
-0.046142578125,
0.045989990234375,
0.0007166862487792969,
-0.0016450881958007812,
0.0244293212890625,
0.0244598388671875,
0.033233642578125,
0.006763458251953125,
-0.0021305084228515625,
0.032623291015625,
0.004711151123046875,
-0.02166748046875,
-0.029205322265625,
0.06646728515625,
-0.06378173828125,
-0.052093505859375,
-0.053955078125,
-0.0200958251953125,
0.0224456787109375,
0.04022216796875,
0.043426513671875,
0.021026611328125,
0.01251220703125,
0.0140228271484375,
0.053009033203125,
0.0017681121826171875,
0.0273895263671875,
0.0251312255859375,
-0.052032470703125,
-0.054046630859375,
0.06219482421875,
0.014404296875,
0.02197265625,
0.01244354248046875,
0.01531219482421875,
-0.00893402099609375,
-0.033660888671875,
-0.03472900390625,
0.03668212890625,
-0.04998779296875,
-0.02392578125,
-0.053924560546875,
-0.007568359375,
-0.0306549072265625,
-0.01654052734375,
-0.029632568359375,
-0.027008056640625,
-0.04693603515625,
0.0028057098388671875,
0.042572021484375,
0.045867919921875,
-0.029632568359375,
0.008392333984375,
-0.07916259765625,
0.033233642578125,
0.0135345458984375,
0.0187530517578125,
0.006885528564453125,
-0.036346435546875,
-0.0003783702850341797,
0.015594482421875,
-0.04022216796875,
-0.09332275390625,
0.05657958984375,
-0.00501251220703125,
0.03363037109375,
0.03411865234375,
0.005519866943359375,
0.059783935546875,
-0.0223541259765625,
0.06756591796875,
0.0390625,
-0.038970947265625,
0.0279083251953125,
-0.053009033203125,
0.0284881591796875,
0.0263214111328125,
0.0556640625,
-0.0318603515625,
-0.029052734375,
-0.060577392578125,
-0.083251953125,
0.04998779296875,
0.04522705078125,
-0.0018329620361328125,
-0.006633758544921875,
0.02374267578125,
-0.006916046142578125,
0.016082763671875,
-0.054351806640625,
-0.055877685546875,
-0.00872802734375,
-0.007213592529296875,
-0.005397796630859375,
-0.01549530029296875,
-0.0218505859375,
-0.0215301513671875,
0.074951171875,
0.01537322998046875,
0.0189666748046875,
0.0166168212890625,
0.00919342041015625,
-0.0298919677734375,
0.0072174072265625,
0.0254058837890625,
0.038787841796875,
-0.0288543701171875,
-0.00555419921875,
-0.00270843505859375,
-0.047271728515625,
0.0057525634765625,
0.01358795166015625,
-0.0236663818359375,
0.010528564453125,
0.0023651123046875,
0.045379638671875,
0.0307464599609375,
-0.031890869140625,
0.0298309326171875,
-0.0200653076171875,
-0.0146636962890625,
-0.035125732421875,
0.0308990478515625,
0.0257568359375,
0.0286102294921875,
0.0205535888671875,
0.0208282470703125,
0.0252838134765625,
-0.06195068359375,
-0.002620697021484375,
0.0155181884765625,
-0.01654052734375,
-0.002826690673828125,
0.0703125,
0.0037212371826171875,
-0.00714874267578125,
0.034027099609375,
-0.034210205078125,
-0.0295562744140625,
0.060943603515625,
0.032562255859375,
0.07281494140625,
-0.03662109375,
0.002391815185546875,
0.041839599609375,
0.01312255859375,
-0.03057861328125,
0.03460693359375,
0.019195556640625,
-0.04132080078125,
0.0249786376953125,
-0.041534423828125,
-0.01299285888671875,
0.004306793212890625,
-0.043060302734375,
0.03466796875,
-0.0408935546875,
-0.0254364013671875,
-0.002536773681640625,
-0.0159759521484375,
-0.03997802734375,
0.0199127197265625,
0.00624847412109375,
0.07073974609375,
-0.05645751953125,
0.0367431640625,
0.062286376953125,
-0.050537109375,
-0.06707763671875,
-0.0027179718017578125,
0.0182952880859375,
-0.0302276611328125,
0.0231170654296875,
0.008087158203125,
0.0025787353515625,
0.0072479248046875,
-0.04571533203125,
-0.0784912109375,
0.1097412109375,
0.02197265625,
-0.0423583984375,
-0.0019426345825195312,
-0.02337646484375,
0.030853271484375,
-0.0220489501953125,
0.052978515625,
0.028900146484375,
0.04693603515625,
0.03265380859375,
-0.060150146484375,
0.00966644287109375,
-0.0457763671875,
0.0178070068359375,
0.01277923583984375,
-0.06500244140625,
0.09454345703125,
-0.0164642333984375,
-0.03155517578125,
0.0280303955078125,
0.033843994140625,
0.0335693359375,
0.019989013671875,
0.0309906005859375,
0.06768798828125,
0.031158447265625,
-0.016845703125,
0.09100341796875,
-0.022369384765625,
0.03485107421875,
0.0498046875,
-0.004444122314453125,
0.05059814453125,
0.012451171875,
-0.021087646484375,
0.0299835205078125,
0.057586669921875,
-0.005702972412109375,
0.041290283203125,
-0.0106658935546875,
-0.00887298583984375,
-0.0138702392578125,
-0.005283355712890625,
-0.051727294921875,
-0.0082244873046875,
0.018524169921875,
-0.0295257568359375,
-0.01513671875,
-0.01168060302734375,
0.0267181396484375,
-0.0177764892578125,
-0.0280914306640625,
0.052825927734375,
0.0258636474609375,
-0.034912109375,
0.06964111328125,
0.00516510009765625,
0.054534912109375,
-0.050201416015625,
-0.0150604248046875,
-0.024658203125,
0.016387939453125,
-0.031646728515625,
-0.0640869140625,
0.0038623809814453125,
-0.0117950439453125,
-0.0130615234375,
-0.0006365776062011719,
0.03057861328125,
-0.023468017578125,
-0.039459228515625,
0.043701171875,
0.01395416259765625,
0.0238800048828125,
0.01470947265625,
-0.06951904296875,
0.01308441162109375,
0.013641357421875,
-0.029266357421875,
0.02392578125,
0.0267333984375,
0.01523590087890625,
0.06317138671875,
0.03668212890625,
0.007080078125,
0.006317138671875,
-0.0265045166015625,
0.0667724609375,
-0.04998779296875,
-0.04473876953125,
-0.038055419921875,
0.056976318359375,
0.00551605224609375,
-0.0275726318359375,
0.07244873046875,
0.04803466796875,
0.06890869140625,
-0.01513671875,
0.061248779296875,
-0.0191192626953125,
0.028045654296875,
-0.042449951171875,
0.06622314453125,
-0.08642578125,
0.01806640625,
-0.035919189453125,
-0.071533203125,
-0.01959228515625,
0.033966064453125,
-0.015625,
0.024139404296875,
0.040924072265625,
0.069091796875,
-0.0259246826171875,
0.0105133056640625,
0.01806640625,
0.03924560546875,
0.003376007080078125,
0.037078857421875,
0.04547119140625,
-0.05340576171875,
0.021697998046875,
-0.044708251953125,
-0.011322021484375,
-0.042633056640625,
-0.060577392578125,
-0.04766845703125,
-0.03668212890625,
-0.036895751953125,
-0.0287628173828125,
0.0152587890625,
0.061492919921875,
0.039947509765625,
-0.05377197265625,
-0.02899169921875,
0.00527191162109375,
0.0112762451171875,
-0.01453399658203125,
-0.0180511474609375,
0.0105438232421875,
0.022613525390625,
-0.06329345703125,
0.01220703125,
0.019073486328125,
0.040924072265625,
-0.005321502685546875,
-0.023956298828125,
-0.022613525390625,
0.0008187294006347656,
0.006561279296875,
0.03662109375,
-0.041229248046875,
0.012939453125,
-0.0264129638671875,
-0.00730133056640625,
0.012054443359375,
0.020263671875,
-0.01434326171875,
0.0301361083984375,
0.051910400390625,
0.000637054443359375,
0.03485107421875,
-0.003467559814453125,
0.0273895263671875,
-0.02197265625,
0.0169525146484375,
0.0206146240234375,
0.0550537109375,
0.01374053955078125,
-0.033538818359375,
0.0433349609375,
0.0188446044921875,
-0.0345458984375,
-0.06219482421875,
0.0012254714965820312,
-0.06982421875,
-0.034027099609375,
0.0904541015625,
-0.0207672119140625,
-0.039459228515625,
0.01155853271484375,
-0.0152587890625,
0.018524169921875,
-0.038604736328125,
0.038330078125,
0.027740478515625,
0.0017690658569335938,
-0.0225372314453125,
-0.0491943359375,
0.0250091552734375,
0.0126953125,
-0.05157470703125,
-0.005855560302734375,
0.03125,
0.03155517578125,
0.042572021484375,
0.0270538330078125,
-0.033447265625,
0.0284881591796875,
-0.0041351318359375,
0.028350830078125,
-0.00731658935546875,
-0.01611328125,
-0.0236663818359375,
0.0030651092529296875,
-0.01073455810546875,
-0.0099029541015625
]
] |
sentence-transformers/msmarco-distilbert-dot-v5 | 2023-11-02T09:31:39.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"distilbert",
"feature-extraction",
"sentence-similarity",
"transformers",
"en",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/msmarco-distilbert-dot-v5 | 7 | 1,072,480 | sentence-transformers | 2022-03-02T23:29:05 | ---
language:
- en
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# msmarco-distilbert-dot-v5
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and was designed for **semantic search**. It was trained on 500K (query, answer) pairs from the [MS MARCO dataset](https://github.com/microsoft/MSMARCO-Passage-Ranking/). For an introduction to semantic search, have a look at [SBERT.net - Semantic Search](https://www.sbert.net/examples/applications/semantic-search/README.html).
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer, util

query = "How many people live in London?"
docs = ["Around 9 Million people live in London", "London is known for its financial district"]

# Load the model
model = SentenceTransformer('sentence-transformers/msmarco-distilbert-dot-v5')

# Encode query and documents
query_emb = model.encode(query)
doc_emb = model.encode(docs)

# Compute dot score between query and all document embeddings
scores = util.dot_score(query_emb, doc_emb)[0].cpu().tolist()

# Combine docs & scores
doc_score_pairs = list(zip(docs, scores))

# Sort by decreasing score
doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)

# Output passages & scores
print("Query:", query)
for doc, score in doc_score_pairs:
    print(score, doc)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the correct pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output.last_hidden_state
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Encode text
def encode(texts):
    # Tokenize sentences
    encoded_input = tokenizer(texts, padding=True, truncation=True, return_tensors='pt')
    # Compute token embeddings
    with torch.no_grad():
        model_output = model(**encoded_input, return_dict=True)
    # Perform pooling
    embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
    return embeddings

# Sentences we want sentence embeddings for
query = "How many people live in London?"
docs = ["Around 9 Million people live in London", "London is known for its financial district"]

# Load model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/msmarco-distilbert-dot-v5")
model = AutoModel.from_pretrained("sentence-transformers/msmarco-distilbert-dot-v5")

# Encode query and docs
query_emb = encode(query)
doc_emb = encode(docs)

# Compute dot score between query and all document embeddings
scores = torch.mm(query_emb, doc_emb.transpose(0, 1))[0].cpu().tolist()

# Combine docs & scores
doc_score_pairs = list(zip(docs, scores))

# Sort by decreasing score
doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)

# Output passages & scores
print("Query:", query)
for doc, score in doc_score_pairs:
    print(score, doc)
```
## Technical Details
The following table summarizes some technical details on how this model should be used:
| Setting | Value |
| --- | :---: |
| Dimensions | 768 |
| Max Sequence Length | 512 |
| Produces normalized embeddings | No |
| Pooling-Method | Mean pooling |
| Suitable score functions | dot-product (e.g. `util.dot_score`) |
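Because the embeddings are not normalized, dot-product and cosine similarity can rank documents differently: the dot product also rewards vector magnitude, which this model uses as a signal. The following self-contained sketch (plain Python with toy 2-D vectors, not real model output) illustrates why the score function matters:

```python
import math

def dot_score(q, d):
    # Raw dot product - rewards both direction and magnitude
    return sum(qi * di for qi, di in zip(q, d))

def cos_sim(q, d):
    # Cosine similarity - direction only, magnitude is normalized away
    norm = lambda v: math.sqrt(sum(x * x for x in v))
    return dot_score(q, d) / (norm(q) * norm(d))

q  = [1.0, 0.0]
d1 = [0.9, 0.1]   # well aligned with q, but small norm
d2 = [3.0, 1.0]   # less aligned, much larger norm

print(dot_score(q, d1), dot_score(q, d2))  # dot product ranks d2 first
print(cos_sim(q, d1), cos_sim(q, d2))      # cosine ranks d1 first
```

With a model trained for dot-product retrieval, using cosine similarity (or normalizing the embeddings) can therefore degrade ranking quality.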
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=msmarco-distilbert-base-dot-v5)
## Training
See `train_script.py` in this repository for the training script that was used.
The model was trained with the following parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 7858 with parameters:
```
{'batch_size': 64, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MarginMSELoss.MarginMSELoss`
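MarginMSELoss trains the student model so that the *margin* between its (query, positive) and (query, negative) dot scores matches the margin produced by a stronger cross-encoder teacher. A minimal sketch of the loss computation on a toy batch of hypothetical scores (illustration only; see `train_script.py` for the actual training code):

```python
def margin_mse(student_pos, student_neg, teacher_pos, teacher_neg):
    # The student is not pulled toward the teacher's absolute scores,
    # only toward the same *margin* between positive and negative.
    diffs = [
        (sp - sn) - (tp - tn)
        for sp, sn, tp, tn in zip(student_pos, student_neg, teacher_pos, teacher_neg)
    ]
    return sum(d * d for d in diffs) / len(diffs)

# Toy batch of three (query, positive, negative) triplets
student_pos = [8.2, 7.5, 9.1]   # hypothetical student dot scores for positives
student_neg = [5.0, 6.9, 4.2]   # ... and for the hard negatives
teacher_pos = [9.0, 7.0, 9.5]   # hypothetical cross-encoder teacher scores
teacher_neg = [4.5, 6.5, 4.0]

print(margin_mse(student_pos, student_neg, teacher_pos, teacher_neg))
```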
Parameters of the `fit()` method:
```
{
"callback": null,
"epochs": 30,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 1e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 10000,
"weight_decay": 0.01
}
```
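The `WarmupLinear` scheduler ramps the learning rate linearly from 0 to the base rate over the warmup steps, then decays it linearly back toward 0 over the remaining training steps. A small illustrative implementation (a sketch of the schedule's shape, not the actual sentence-transformers code):

```python
def warmup_linear_lr(step, base_lr, warmup_steps, total_steps):
    # Linear ramp from 0 up to base_lr during warmup ...
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    # ... then linear decay from base_lr back down to 0
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Values from the card above: lr=1e-5, 10,000 warmup steps,
# 30 epochs x 7858 batches = 235,740 total steps
base_lr, warmup, total = 1e-5, 10_000, 7858 * 30
print(warmup_linear_lr(0, base_lr, warmup, total))       # start of warmup
print(warmup_linear_lr(warmup, base_lr, warmup, total))  # peak learning rate
print(warmup_linear_lr(total, base_lr, warmup, total))   # end of training
```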
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DistilBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
```
## License
This model is released under the Apache 2.0 license. However, note that it was trained on the MS MARCO dataset, which has its own license restrictions: [MS MARCO - Terms and Conditions](https://github.com/microsoft/msmarco/blob/095515e8e28b756a62fcca7fcf1d8b3d9fbb96a9/README.md). | 6,479 | [
[
-0.0162506103515625,
-0.060455322265625,
0.03338623046875,
0.0210723876953125,
-0.0175933837890625,
-0.0265350341796875,
-0.022430419921875,
-0.006500244140625,
0.007152557373046875,
0.023468017578125,
-0.041290283203125,
-0.05023193359375,
-0.056915283203125,
0.0128631591796875,
-0.028961181640625,
0.0635986328125,
-0.005786895751953125,
0.01078033447265625,
-0.02410888671875,
-0.01123809814453125,
-0.0186920166015625,
-0.020599365234375,
-0.0307464599609375,
-0.01537322998046875,
0.021697998046875,
0.019805908203125,
0.0396728515625,
0.033721923828125,
0.0302276611328125,
0.032379150390625,
-0.007537841796875,
0.0177154541015625,
-0.0328369140625,
0.0024738311767578125,
-0.0011472702026367188,
-0.031890869140625,
-0.0137481689453125,
0.0219573974609375,
0.041412353515625,
0.036163330078125,
-0.01300048828125,
0.0083770751953125,
0.0068511962890625,
0.037841796875,
-0.0223846435546875,
0.0311737060546875,
-0.045257568359375,
0.005126953125,
0.00308990478515625,
-0.00595855712890625,
-0.04473876953125,
-0.016998291015625,
0.0211944580078125,
-0.037139892578125,
0.020294189453125,
0.01495361328125,
0.0933837890625,
0.027008056640625,
-0.0234375,
-0.038604736328125,
-0.0197601318359375,
0.06134033203125,
-0.06695556640625,
0.0120086669921875,
0.027923583984375,
0.00670623779296875,
-0.005710601806640625,
-0.0743408203125,
-0.05621337890625,
-0.015838623046875,
-0.026275634765625,
0.0169830322265625,
-0.017547607421875,
-0.004283905029296875,
0.0170135498046875,
0.0189056396484375,
-0.057342529296875,
-0.0026531219482421875,
-0.0452880859375,
-0.01241302490234375,
0.052764892578125,
0.01425933837890625,
0.015869140625,
-0.036407470703125,
-0.037841796875,
-0.026031494140625,
-0.017669677734375,
0.0089874267578125,
0.0212249755859375,
0.01308441162109375,
-0.01221466064453125,
0.0535888671875,
-0.0052947998046875,
0.049774169921875,
-0.004169464111328125,
0.007793426513671875,
0.04803466796875,
-0.0215606689453125,
-0.02044677734375,
-0.002407073974609375,
0.08380126953125,
0.0276641845703125,
0.0236968994140625,
0.002109527587890625,
-0.0153961181640625,
-0.00048089027404785156,
0.01537322998046875,
-0.06463623046875,
-0.0235748291015625,
0.021697998046875,
-0.0265655517578125,
-0.026824951171875,
0.0173492431640625,
-0.044158935546875,
-0.00917816162109375,
-0.003276824951171875,
0.05572509765625,
-0.05303955078125,
0.00881195068359375,
0.0297088623046875,
-0.0218658447265625,
0.017913818359375,
-0.01001739501953125,
-0.052978515625,
0.008697509765625,
0.0189971923828125,
0.06744384765625,
0.0109405517578125,
-0.0423583984375,
-0.0194549560546875,
-0.01055145263671875,
0.0019016265869140625,
0.046356201171875,
-0.0301361083984375,
-0.0222320556640625,
0.004161834716796875,
0.0207366943359375,
-0.0394287109375,
-0.0207672119140625,
0.05169677734375,
-0.0252532958984375,
0.0599365234375,
-0.0012674331665039062,
-0.06561279296875,
-0.01468658447265625,
0.0125732421875,
-0.0404052734375,
0.093505859375,
0.0157318115234375,
-0.07110595703125,
0.004817962646484375,
-0.0576171875,
-0.025115966796875,
-0.0177001953125,
0.0019292831420898438,
-0.05181884765625,
0.0017004013061523438,
0.034759521484375,
0.052215576171875,
-0.0043182373046875,
0.01284027099609375,
-0.015228271484375,
-0.033294677734375,
0.032073974609375,
-0.029296875,
0.08514404296875,
0.0147857666015625,
-0.030792236328125,
0.0007381439208984375,
-0.049285888671875,
-0.0032634735107421875,
0.0286712646484375,
-0.0237274169921875,
-0.014373779296875,
0.0002868175506591797,
0.015899658203125,
0.0281829833984375,
0.0235748291015625,
-0.043121337890625,
0.0240631103515625,
-0.038726806640625,
0.0560302734375,
0.05059814453125,
-0.0010213851928710938,
0.0295867919921875,
-0.0211029052734375,
0.0207366943359375,
0.023101806640625,
0.005619049072265625,
-0.01111602783203125,
-0.037506103515625,
-0.0670166015625,
-0.0216522216796875,
0.0303192138671875,
0.039093017578125,
-0.053466796875,
0.0667724609375,
-0.037506103515625,
-0.044708251953125,
-0.0657958984375,
-0.0030078887939453125,
0.01312255859375,
0.044189453125,
0.047210693359375,
0.00391387939453125,
-0.035552978515625,
-0.0679931640625,
-0.00800323486328125,
0.0093994140625,
0.0003139972686767578,
0.02178955078125,
0.05792236328125,
-0.023529052734375,
0.07421875,
-0.061920166015625,
-0.040618896484375,
-0.0269317626953125,
0.0073394775390625,
0.0297698974609375,
0.0369873046875,
0.038330078125,
-0.05523681640625,
-0.04278564453125,
-0.038787841796875,
-0.05621337890625,
-0.002727508544921875,
-0.01038360595703125,
-0.00696563720703125,
0.0131378173828125,
0.04193115234375,
-0.04779052734375,
0.0242919921875,
0.0394287109375,
-0.046905517578125,
0.0293121337890625,
-0.03326416015625,
-0.005764007568359375,
-0.10675048828125,
0.00301361083984375,
0.0047454833984375,
-0.01508331298828125,
-0.0285491943359375,
-0.004230499267578125,
0.00897216796875,
-0.005481719970703125,
-0.03662109375,
0.028289794921875,
-0.042510986328125,
0.0143585205078125,
0.00835418701171875,
0.038787841796875,
0.01605224609375,
0.05059814453125,
-0.0104522705078125,
0.054534912109375,
0.048492431640625,
-0.0379638671875,
0.022308349609375,
0.041595458984375,
-0.03912353515625,
0.0190582275390625,
-0.061676025390625,
0.004917144775390625,
-0.006847381591796875,
0.0171966552734375,
-0.08831787109375,
0.004024505615234375,
0.00975799560546875,
-0.05059814453125,
0.01922607421875,
0.016937255859375,
-0.052764892578125,
-0.040924072265625,
-0.032867431640625,
0.0001099705696105957,
0.036407470703125,
-0.032073974609375,
0.033660888671875,
0.0173797607421875,
0.0033054351806640625,
-0.044769287109375,
-0.0751953125,
-0.00954437255859375,
-0.0139923095703125,
-0.05816650390625,
0.03558349609375,
-0.005157470703125,
0.01064300537109375,
0.01532745361328125,
0.0161285400390625,
0.004138946533203125,
0.0016422271728515625,
0.0031108856201171875,
0.023406982421875,
-0.003143310546875,
0.01328277587890625,
0.0088043212890625,
-0.01033782958984375,
0.00637054443359375,
-0.017822265625,
0.05377197265625,
-0.01451873779296875,
-0.00836944580078125,
-0.029541015625,
0.011474609375,
0.035552978515625,
-0.0253143310546875,
0.08416748046875,
0.06951904296875,
-0.020111083984375,
-0.0125732421875,
-0.0300445556640625,
-0.01788330078125,
-0.0367431640625,
0.0443115234375,
-0.024017333984375,
-0.0576171875,
0.03167724609375,
0.01654052734375,
-0.0010023117065429688,
0.056365966796875,
0.041748046875,
-0.0232391357421875,
0.06439208984375,
0.0280609130859375,
-0.01052093505859375,
0.036163330078125,
-0.0550537109375,
0.01111602783203125,
-0.06573486328125,
-0.013641357421875,
-0.03094482421875,
-0.032806396484375,
-0.053192138671875,
-0.033294677734375,
0.025146484375,
-0.00093841552734375,
-0.0108184814453125,
0.04473876953125,
-0.051605224609375,
0.0184173583984375,
0.042327880859375,
0.01763916015625,
0.00106048583984375,
0.00009381771087646484,
-0.038818359375,
-0.0097808837890625,
-0.050506591796875,
-0.040313720703125,
0.0836181640625,
0.0291748046875,
0.03546142578125,
-0.0036983489990234375,
0.052276611328125,
0.01203155517578125,
-0.00547027587890625,
-0.047210693359375,
0.040618896484375,
-0.01276397705078125,
-0.03839111328125,
-0.0288543701171875,
-0.031463623046875,
-0.07861328125,
0.03460693359375,
-0.014892578125,
-0.049957275390625,
0.0026912689208984375,
-0.022369384765625,
-0.0166778564453125,
0.01177978515625,
-0.060302734375,
0.08038330078125,
-0.00030231475830078125,
-0.0163116455078125,
-0.01171112060546875,
-0.05303955078125,
0.00702667236328125,
0.0235748291015625,
0.0123138427734375,
-0.00567626953125,
-0.001956939697265625,
0.062103271484375,
-0.0310516357421875,
0.05322265625,
-0.01444244384765625,
0.01025390625,
0.0245208740234375,
-0.0213623046875,
0.03167724609375,
-0.005725860595703125,
-0.013153076171875,
0.0091400146484375,
-0.0014543533325195312,
-0.03411865234375,
-0.036590576171875,
0.052825927734375,
-0.06719970703125,
-0.0261993408203125,
-0.044677734375,
-0.041778564453125,
-0.0024204254150390625,
0.016265869140625,
0.036865234375,
0.033599853515625,
-0.0047760009765625,
0.033050537109375,
0.04595947265625,
-0.02276611328125,
0.053436279296875,
0.0311431884765625,
-0.00353240966796875,
-0.037139892578125,
0.048919677734375,
0.0157012939453125,
0.005893707275390625,
0.03131103515625,
0.014617919921875,
-0.038360595703125,
-0.02630615234375,
-0.0233154296875,
0.032012939453125,
-0.04669189453125,
-0.01163482666015625,
-0.058624267578125,
-0.0294647216796875,
-0.05096435546875,
0.0006976127624511719,
-0.0149993896484375,
-0.027130126953125,
-0.03631591796875,
-0.0229644775390625,
0.024566650390625,
0.0382080078125,
0.00939178466796875,
0.0174102783203125,
-0.0445556640625,
0.0061798095703125,
0.003284454345703125,
0.009429931640625,
-0.01422882080078125,
-0.06378173828125,
-0.03216552734375,
0.00276947021484375,
-0.034393310546875,
-0.07373046875,
0.043975830078125,
0.01397705078125,
0.041900634765625,
0.0190277099609375,
0.0113525390625,
0.04473876953125,
-0.041778564453125,
0.06658935546875,
-0.00360870361328125,
-0.068115234375,
0.043975830078125,
-0.00612640380859375,
0.0259246826171875,
0.041595458984375,
0.032379150390625,
-0.032684326171875,
-0.03338623046875,
-0.056427001953125,
-0.07421875,
0.056732177734375,
0.0382080078125,
0.0203857421875,
-0.0114898681640625,
0.0123138427734375,
-0.0169830322265625,
0.0125579833984375,
-0.07220458984375,
-0.037200927734375,
-0.019317626953125,
-0.039093017578125,
-0.02392578125,
-0.0144805908203125,
0.005126953125,
-0.036224365234375,
0.06158447265625,
0.004489898681640625,
0.039886474609375,
0.0404052734375,
-0.0347900390625,
0.0267333984375,
0.009979248046875,
0.048858642578125,
0.02490234375,
-0.0154266357421875,
0.000652313232421875,
0.0170440673828125,
-0.037994384765625,
-0.00258636474609375,
0.036102294921875,
-0.00606536865234375,
0.0205535888671875,
0.03289794921875,
0.0689697265625,
0.0245361328125,
-0.034515380859375,
0.05780029296875,
-0.008758544921875,
-0.025146484375,
-0.035125732421875,
-0.005016326904296875,
0.0191650390625,
0.0202484130859375,
0.026123046875,
-0.004039764404296875,
0.004627227783203125,
-0.0254364013671875,
0.019287109375,
0.015960693359375,
-0.03167724609375,
-0.004947662353515625,
0.055908203125,
0.0032711029052734375,
-0.01232147216796875,
0.06787109375,
-0.0200347900390625,
-0.04742431640625,
0.037139892578125,
0.035430908203125,
0.06475830078125,
-0.0027065277099609375,
0.0167388916015625,
0.039520263671875,
0.03326416015625,
0.0033016204833984375,
0.00997161865234375,
0.0037059783935546875,
-0.056610107421875,
-0.00128936767578125,
-0.049041748046875,
0.0087738037109375,
-0.00508880615234375,
-0.04931640625,
0.0309600830078125,
-0.005626678466796875,
-0.00736236572265625,
-0.0091094970703125,
0.0203094482421875,
-0.059661865234375,
0.01027679443359375,
0.00313568115234375,
0.0712890625,
-0.0679931640625,
0.07666015625,
0.046417236328125,
-0.06683349609375,
-0.0643310546875,
-0.00399017333984375,
-0.024261474609375,
-0.058074951171875,
0.033416748046875,
0.034637451171875,
0.01473236083984375,
0.0164794921875,
-0.0305938720703125,
-0.05712890625,
0.11737060546875,
0.0225677490234375,
-0.035736083984375,
-0.02056884765625,
0.01360321044921875,
0.04705810546875,
-0.0268707275390625,
0.040191650390625,
0.0330810546875,
0.032196044921875,
-0.006744384765625,
-0.053741455078125,
0.006969451904296875,
-0.0194549560546875,
0.00798797607421875,
-0.01239013671875,
-0.0472412109375,
0.07379150390625,
-0.00855255126953125,
-0.01261138916015625,
0.00006663799285888672,
0.053802490234375,
0.019378662109375,
0.01031494140625,
0.031158447265625,
0.06439208984375,
0.057708740234375,
-0.0132598876953125,
0.079833984375,
-0.029754638671875,
0.064208984375,
0.0712890625,
0.004467010498046875,
0.0750732421875,
0.033447265625,
-0.022918701171875,
0.05987548828125,
0.048187255859375,
-0.0208740234375,
0.048065185546875,
0.0168609619140625,
0.005107879638671875,
0.0025424957275390625,
0.0250244140625,
-0.0276641845703125,
0.040130615234375,
0.01493072509765625,
-0.0589599609375,
-0.00572967529296875,
0.0054168701171875,
0.01422119140625,
0.00583648681640625,
0.005157470703125,
0.047943115234375,
0.00937652587890625,
-0.036651611328125,
0.04864501953125,
0.01013946533203125,
0.06298828125,
-0.037139892578125,
0.0167388916015625,
-0.01230621337890625,
0.0233154296875,
-0.007663726806640625,
-0.05535888671875,
0.0186767578125,
-0.01364898681640625,
-0.01409912109375,
-0.0210418701171875,
0.0305938720703125,
-0.049407958984375,
-0.05218505859375,
0.0294342041015625,
0.02923583984375,
0.01554107666015625,
-0.0021190643310546875,
-0.07965087890625,
-0.005893707275390625,
0.006427764892578125,
-0.04833984375,
0.00992584228515625,
0.03582763671875,
0.0271148681640625,
0.040069580078125,
0.03790283203125,
-0.007476806640625,
0.00885772705078125,
0.0059814453125,
0.06341552734375,
-0.04864501953125,
-0.03704833984375,
-0.07330322265625,
0.05596923828125,
-0.0252227783203125,
-0.032958984375,
0.057342529296875,
0.05499267578125,
0.06402587890625,
-0.01702880859375,
0.040130615234375,
-0.0131683349609375,
0.0236358642578125,
-0.045257568359375,
0.073974609375,
-0.04595947265625,
0.01473236083984375,
-0.01617431640625,
-0.06982421875,
-0.0062713623046875,
0.06298828125,
-0.03179931640625,
0.002696990966796875,
0.0711669921875,
0.07073974609375,
-0.004547119140625,
-0.0172576904296875,
0.01629638671875,
0.033782958984375,
0.0176849365234375,
0.047393798828125,
0.027130126953125,
-0.07012939453125,
0.057098388671875,
-0.030975341796875,
0.0001709461212158203,
-0.01383209228515625,
-0.05224609375,
-0.06927490234375,
-0.0718994140625,
-0.0242462158203125,
-0.0321044921875,
-0.01165008544921875,
0.06866455078125,
0.03790283203125,
-0.0501708984375,
-0.0057830810546875,
-0.00881195068359375,
-0.005584716796875,
-0.0124053955078125,
-0.0259246826171875,
0.04351806640625,
-0.04205322265625,
-0.070068359375,
0.0189208984375,
-0.007320404052734375,
-0.0038299560546875,
-0.026397705078125,
-0.0019741058349609375,
-0.0501708984375,
0.01015472412109375,
0.038848876953125,
-0.022186279296875,
-0.052215576171875,
-0.01311492919921875,
0.00868988037109375,
-0.040069580078125,
0.003597259521484375,
0.0232391357421875,
-0.0517578125,
0.031524658203125,
0.036895751953125,
0.032562255859375,
0.055511474609375,
-0.01092529296875,
0.027008056640625,
-0.0595703125,
0.0157470703125,
0.01328277587890625,
0.0535888671875,
0.030548095703125,
-0.0255279541015625,
0.04486083984375,
0.023834228515625,
-0.042449951171875,
-0.052734375,
-0.01422119140625,
-0.0780029296875,
-0.029296875,
0.0850830078125,
-0.0279541015625,
-0.0288848876953125,
0.0211029052734375,
-0.024627685546875,
0.0318603515625,
-0.02703857421875,
0.058319091796875,
0.0655517578125,
0.00337982177734375,
-0.0092620849609375,
-0.036529541015625,
0.0207977294921875,
0.0311737060546875,
-0.04119873046875,
-0.0248870849609375,
0.0181732177734375,
0.03533935546875,
0.0129852294921875,
0.03076171875,
-0.006824493408203125,
-0.0010766983032226562,
0.00797271728515625,
0.005359649658203125,
-0.022857666015625,
0.0037059783935546875,
-0.0313720703125,
0.01168060302734375,
-0.027801513671875,
-0.03558349609375
]
] |
microsoft/tapex-base-finetuned-wikisql | 2023-01-24T16:57:17.000Z | [
"transformers",
"pytorch",
"bart",
"text2text-generation",
"tapex",
"table-question-answering",
"en",
"dataset:wikisql",
"arxiv:2107.07653",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | table-question-answering | microsoft | null | null | microsoft/tapex-base-finetuned-wikisql | 13 | 1,067,188 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- tapex
- table-question-answering
datasets:
- wikisql
license: mit
---
# TAPEX (base-sized model)
TAPEX was proposed in [TAPEX: Table Pre-training via Learning a Neural SQL Executor](https://arxiv.org/abs/2107.07653) by Qian Liu, Bei Chen, Jiaqi Guo, Morteza Ziyadi, Zeqi Lin, Weizhu Chen, Jian-Guang Lou. The original repo can be found [here](https://github.com/microsoft/Table-Pretraining).
## Model description
TAPEX (**Ta**ble **P**re-training via **Ex**ecution) is a conceptually simple and empirically powerful pre-training approach to empower existing models with *table reasoning* skills. TAPEX realizes table pre-training by learning a neural SQL executor over a synthetic corpus, which is obtained by automatically synthesizing executable SQL queries.
TAPEX is based on the BART architecture, the transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
This model is the `tapex-base` model fine-tuned on the [WikiSQL](https://huggingface.co/datasets/wikisql) dataset.
## Intended Uses
You can use the model for table question answering on relatively simple questions. Some **solvable** questions are shown below (corresponding tables not shown):
| Question | Answer |
|:---: |:---:|
| tell me what the notes are for south australia | no slogan on current series |
| what position does the player who played for butler cc (ks) play? | guard-forward |
| how many schools did player number 3 play at? | 1.0 |
| how many winning drivers in the kraco twin 125 (r2) race were there? | 1.0 |
| for the episode(s) aired in the u.s. on 4 april 2008, what were the names? | "bust a move" part one, "bust a move" part two |
### How to Use
Here is how to use this model in transformers:
```python
from transformers import TapexTokenizer, BartForConditionalGeneration
import pandas as pd
tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-base-finetuned-wikisql")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-base-finetuned-wikisql")
data = {
"year": [1896, 1900, 1904, 2004, 2008, 2012],
"city": ["athens", "paris", "st. louis", "athens", "beijing", "london"]
}
table = pd.DataFrame.from_dict(data)
# tapex accepts uncased input since it is pre-trained on the uncased corpus
query = "In which year did beijing host the Olympic Games?"
encoding = tokenizer(table=table, query=query, return_tensors="pt")
outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
# [' 2008.0']
```
### How to Eval
Please find the eval script [here](https://github.com/SivilTaram/transformers/tree/add_tapex_bis/examples/research_projects/tapex).
### BibTeX entry and citation info
```bibtex
@inproceedings{
liu2022tapex,
title={{TAPEX}: Table Pre-training via Learning a Neural {SQL} Executor},
author={Qian Liu and Bei Chen and Jiaqi Guo and Morteza Ziyadi and Zeqi Lin and Weizhu Chen and Jian-Guang Lou},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=O50443AsCP}
}
``` | 3,149 | [
[
-0.032867431640625,
-0.0582275390625,
0.03863525390625,
-0.01161956787109375,
-0.017852783203125,
0.003314971923828125,
-0.01727294921875,
-0.00948333740234375,
0.0256805419921875,
0.041259765625,
-0.03802490234375,
-0.042327880859375,
-0.03533935546875,
-0.0164947509765625,
-0.04290771484375,
0.09588623046875,
-0.0019969940185546875,
0.00955963134765625,
-0.008331298828125,
-0.01230621337890625,
-0.028167724609375,
-0.045318603515625,
-0.033538818359375,
-0.0094146728515625,
0.02398681640625,
0.0364990234375,
0.0535888671875,
0.04412841796875,
0.0360107421875,
0.026763916015625,
0.0027942657470703125,
-0.0031490325927734375,
-0.02496337890625,
-0.01522064208984375,
0.0021152496337890625,
-0.0535888671875,
-0.043792724609375,
0.00832366943359375,
0.042816162109375,
0.06195068359375,
-0.0150604248046875,
0.0328369140625,
-0.00013530254364013672,
0.037322998046875,
-0.029510498046875,
0.001369476318359375,
-0.05340576171875,
0.00806427001953125,
-0.00435638427734375,
0.00200653076171875,
-0.030517578125,
-0.04669189453125,
-0.006378173828125,
-0.049774169921875,
0.02911376953125,
0.0165252685546875,
0.08349609375,
0.00920867919921875,
-0.0223541259765625,
-0.009307861328125,
-0.055816650390625,
0.0582275390625,
-0.056427001953125,
0.0157623291015625,
0.028045654296875,
0.0236358642578125,
0.00243377685546875,
-0.0677490234375,
-0.046875,
-0.019561767578125,
-0.02227783203125,
0.006084442138671875,
-0.00475311279296875,
-0.0131072998046875,
0.042572021484375,
0.0262603759765625,
-0.057342529296875,
-0.0067138671875,
-0.042755126953125,
0.002628326416015625,
0.0377197265625,
0.018951416015625,
0.0053863525390625,
-0.00003224611282348633,
-0.036865234375,
-0.0213165283203125,
-0.038818359375,
-0.00396728515625,
0.0161895751953125,
0.0035877227783203125,
-0.02020263671875,
0.039764404296875,
0.005706787109375,
0.04425048828125,
-0.0008420944213867188,
0.00685882568359375,
0.027587890625,
-0.0228729248046875,
-0.007732391357421875,
0.0023040771484375,
0.079833984375,
0.0107879638671875,
0.0171661376953125,
-0.0152587890625,
-0.0169219970703125,
0.00713348388671875,
0.007671356201171875,
-0.039947509765625,
-0.02862548828125,
0.024688720703125,
-0.0333251953125,
-0.00342559814453125,
0.01384735107421875,
-0.0625,
-0.006374359130859375,
-0.041900634765625,
0.06072998046875,
-0.0440673828125,
-0.03497314453125,
0.0112152099609375,
-0.01549530029296875,
0.0260162353515625,
-0.003871917724609375,
-0.054718017578125,
0.030975341796875,
0.051239013671875,
0.0443115234375,
-0.01192474365234375,
-0.0278472900390625,
-0.0311126708984375,
-0.00547027587890625,
-0.03460693359375,
0.0396728515625,
-0.0096893310546875,
-0.01076507568359375,
-0.0034389495849609375,
0.00444793701171875,
-0.0161895751953125,
-0.04736328125,
0.0289154052734375,
-0.046142578125,
0.02459716796875,
-0.001773834228515625,
-0.03717041015625,
0.0015897750854492188,
-0.0038051605224609375,
-0.051422119140625,
0.08367919921875,
0.0196990966796875,
-0.048126220703125,
0.030029296875,
-0.05224609375,
-0.031829833984375,
-0.030670166015625,
0.0166015625,
-0.07183837890625,
0.00275421142578125,
0.0212860107421875,
0.0205535888671875,
-0.0277862548828125,
0.0211639404296875,
-0.03070068359375,
-0.033966064453125,
0.038818359375,
-0.0190582275390625,
0.08203125,
0.0149078369140625,
-0.0302734375,
0.0128936767578125,
-0.0841064453125,
0.00798797607421875,
0.00867462158203125,
-0.01470947265625,
-0.01837158203125,
-0.0276336669921875,
-0.00962066650390625,
0.01129913330078125,
0.004047393798828125,
-0.043121337890625,
0.0039520263671875,
-0.0157012939453125,
0.027099609375,
0.037445068359375,
-0.007808685302734375,
0.0355224609375,
-0.023681640625,
0.034088134765625,
0.016204833984375,
0.02044677734375,
-0.018157958984375,
-0.023895263671875,
-0.0804443359375,
0.0008311271667480469,
0.049407958984375,
0.0478515625,
-0.05419921875,
0.033050537109375,
-0.0380859375,
-0.045745849609375,
-0.0298614501953125,
-0.0189666748046875,
0.030059814453125,
0.04022216796875,
0.044281005859375,
-0.01430511474609375,
-0.0731201171875,
-0.061859130859375,
-0.00197601318359375,
-0.006633758544921875,
-0.01267242431640625,
0.0043487548828125,
0.04388427734375,
0.01343536376953125,
0.0672607421875,
-0.051666259765625,
-0.017059326171875,
-0.0133819580078125,
0.0158538818359375,
0.04608154296875,
0.048736572265625,
0.0250244140625,
-0.028045654296875,
-0.050201416015625,
-0.01042938232421875,
-0.05657958984375,
0.0138397216796875,
-0.01447296142578125,
-0.03570556640625,
0.018096923828125,
0.03387451171875,
-0.05908203125,
0.032470703125,
0.0026760101318359375,
-0.0265960693359375,
0.054412841796875,
-0.019989013671875,
-0.00396728515625,
-0.065185546875,
0.01285552978515625,
-0.0258636474609375,
-0.012939453125,
-0.055877685546875,
-0.0157318115234375,
0.01409149169921875,
-0.0108489990234375,
-0.033782958984375,
0.01035308837890625,
-0.046966552734375,
-0.00577545166015625,
0.00756072998046875,
0.0022449493408203125,
0.0021839141845703125,
0.06451416015625,
0.0190277099609375,
0.045196533203125,
0.033477783203125,
-0.053863525390625,
0.016845703125,
0.0281524658203125,
-0.03765869140625,
0.039276123046875,
-0.040679931640625,
0.02923583984375,
-0.00836944580078125,
-0.00867462158203125,
-0.07232666015625,
0.024444580078125,
0.0191650390625,
-0.041412353515625,
0.03778076171875,
-0.01080322265625,
-0.016876220703125,
-0.044708251953125,
-0.018310546875,
0.006366729736328125,
0.055145263671875,
-0.039642333984375,
0.039306640625,
0.043792724609375,
0.0170745849609375,
-0.05499267578125,
-0.052947998046875,
-0.0037860870361328125,
-0.0391845703125,
-0.033203125,
0.0213775634765625,
-0.0106048583984375,
-0.01023101806640625,
0.00007939338684082031,
-0.0023479461669921875,
-0.0183868408203125,
-0.0103912353515625,
-0.0069427490234375,
0.043243408203125,
-0.04315185546875,
-0.0018768310546875,
-0.0161895751953125,
-0.0190887451171875,
0.018402099609375,
-0.027496337890625,
0.04522705078125,
-0.0007605552673339844,
0.00640106201171875,
-0.018341064453125,
0.032623291015625,
0.02215576171875,
-0.02227783203125,
0.058868408203125,
0.0704345703125,
-0.00669097900390625,
-0.0036334991455078125,
-0.05078125,
-0.02984619140625,
-0.03485107421875,
0.035125732421875,
-0.032135009765625,
-0.039093017578125,
0.03924560546875,
0.021820068359375,
-0.002918243408203125,
0.0362548828125,
0.0389404296875,
-0.01494598388671875,
0.06756591796875,
0.0180816650390625,
0.02105712890625,
0.0312347412109375,
-0.057037353515625,
-0.00429534912109375,
-0.061981201171875,
-0.00872039794921875,
-0.037261962890625,
-0.0196990966796875,
-0.0252685546875,
-0.0292816162109375,
0.0301055908203125,
-0.0108184814453125,
-0.06146240234375,
0.051239013671875,
-0.042724609375,
0.0286712646484375,
0.07025146484375,
0.00981903076171875,
0.0088043212890625,
0.003940582275390625,
-0.01273345947265625,
-0.00446319580078125,
-0.060699462890625,
-0.0199737548828125,
0.10089111328125,
0.0219573974609375,
0.0587158203125,
0.006526947021484375,
0.047027587890625,
0.00933074951171875,
0.0213775634765625,
-0.034576416015625,
0.048248291015625,
-0.0056915283203125,
-0.058319091796875,
-0.02728271484375,
-0.0345458984375,
-0.09686279296875,
0.01416778564453125,
-0.024566650390625,
-0.038818359375,
0.0312347412109375,
-0.00157928466796875,
-0.040985107421875,
0.032073974609375,
-0.06243896484375,
0.0743408203125,
-0.0236358642578125,
-0.02923583984375,
0.009613037109375,
-0.058319091796875,
0.04156494140625,
-0.00848388671875,
0.020172119140625,
-0.0024776458740234375,
0.004150390625,
0.07452392578125,
-0.04736328125,
0.044219970703125,
-0.0202789306640625,
0.017730712890625,
0.038818359375,
0.00284576416015625,
0.021453857421875,
-0.0037441253662109375,
0.001705169677734375,
0.024017333984375,
0.02667236328125,
-0.0056304931640625,
-0.045623779296875,
0.0278778076171875,
-0.06787109375,
-0.04364013671875,
-0.0301513671875,
-0.0309295654296875,
-0.0079803466796875,
0.0133819580078125,
0.02105712890625,
0.03497314453125,
0.00003069639205932617,
0.0185089111328125,
0.047607421875,
-0.0201416015625,
0.044281005859375,
0.045257568359375,
-0.0255126953125,
-0.047576904296875,
0.07208251953125,
0.016845703125,
0.00861358642578125,
0.048095703125,
0.013397216796875,
-0.03729248046875,
-0.0274200439453125,
-0.0213775634765625,
0.0333251953125,
-0.036834716796875,
-0.0293731689453125,
-0.04290771484375,
-0.0203399658203125,
-0.026519775390625,
0.031341552734375,
-0.0261688232421875,
-0.0635986328125,
-0.022796630859375,
-0.01471710205078125,
0.0167999267578125,
0.03509521484375,
-0.0229339599609375,
0.027191162109375,
-0.049530029296875,
0.02984619140625,
0.0221405029296875,
0.03173828125,
-0.0092010498046875,
-0.044281005859375,
-0.037506103515625,
-0.004730224609375,
-0.029449462890625,
-0.07421875,
0.04254150390625,
0.0109710693359375,
0.046478271484375,
0.020660400390625,
0.0215301513671875,
0.046295166015625,
-0.048736572265625,
0.05804443359375,
0.02642822265625,
-0.06976318359375,
0.036163330078125,
-0.006336212158203125,
-0.0017986297607421875,
0.02630615234375,
0.0309906005859375,
-0.0298919677734375,
-0.01070404052734375,
-0.061004638671875,
-0.06292724609375,
0.0775146484375,
0.002582550048828125,
0.00098419189453125,
0.00628662109375,
0.014923095703125,
0.0145111083984375,
0.01507568359375,
-0.066650390625,
-0.045196533203125,
-0.02703857421875,
-0.008087158203125,
-0.006015777587890625,
-0.0238037109375,
-0.006916046142578125,
-0.034454345703125,
0.06451416015625,
0.00905609130859375,
0.049468994140625,
0.00797271728515625,
-0.01320648193359375,
0.006137847900390625,
0.0154876708984375,
0.0286712646484375,
0.04827880859375,
-0.03387451171875,
0.011383056640625,
0.0306854248046875,
-0.04315185546875,
0.0202789306640625,
0.0175933837890625,
-0.018524169921875,
0.0160369873046875,
0.01099395751953125,
0.0828857421875,
0.01122283935546875,
-0.02838134765625,
0.0287017822265625,
-0.00933837890625,
-0.02044677734375,
-0.060821533203125,
0.0022411346435546875,
0.0012331008911132812,
-0.007091522216796875,
0.0269317626953125,
-0.00507354736328125,
0.0039215087890625,
-0.036834716796875,
0.0275726318359375,
0.036163330078125,
-0.044952392578125,
-0.01343536376953125,
0.0841064453125,
-0.020721435546875,
-0.0255126953125,
0.052154541015625,
-0.006885528564453125,
-0.032257080078125,
0.046356201171875,
0.0322265625,
0.04595947265625,
-0.020751953125,
0.0137481689453125,
0.06048583984375,
0.0251312255859375,
0.01065826416015625,
0.0171966552734375,
-0.003597259521484375,
-0.050750732421875,
-0.015167236328125,
-0.031890869140625,
-0.0150146484375,
0.0184326171875,
-0.051422119140625,
0.038848876953125,
-0.0199127197265625,
-0.031097412109375,
-0.0010929107666015625,
0.0000711679458618164,
-0.058013916015625,
0.004913330078125,
0.012298583984375,
0.06671142578125,
-0.054168701171875,
0.069580078125,
0.031829833984375,
-0.052001953125,
-0.06640625,
-0.02874755859375,
-0.03411865234375,
-0.0467529296875,
0.038787841796875,
-0.01214599609375,
0.03875732421875,
-0.0014410018920898438,
-0.033050537109375,
-0.081787109375,
0.07537841796875,
0.03509521484375,
-0.0203399658203125,
-0.00978851318359375,
0.050201416015625,
0.0318603515625,
-0.00901031494140625,
0.064697265625,
0.07733154296875,
0.0279998779296875,
0.0012884140014648438,
-0.0650634765625,
0.0007338523864746094,
-0.015960693359375,
-0.0049285888671875,
0.006084442138671875,
-0.03302001953125,
0.0867919921875,
-0.01326751708984375,
-0.0034198760986328125,
0.006801605224609375,
0.06549072265625,
0.0150604248046875,
0.0213775634765625,
0.041748046875,
0.047760009765625,
0.049407958984375,
-0.0254669189453125,
0.06744384765625,
-0.0233001708984375,
0.032562255859375,
0.0831298828125,
0.00969696044921875,
0.061553955078125,
0.0227203369140625,
-0.05804443359375,
0.0533447265625,
0.04144287109375,
-0.005290985107421875,
0.0269622802734375,
0.0175628662109375,
0.0046234130859375,
-0.01214599609375,
0.037078857421875,
-0.039581298828125,
0.031097412109375,
0.031982421875,
-0.029998779296875,
-0.003932952880859375,
0.002887725830078125,
0.023681640625,
-0.005901336669921875,
-0.0180816650390625,
0.047454833984375,
0.0102691650390625,
-0.07208251953125,
0.0743408203125,
-0.0174407958984375,
0.045654296875,
-0.060821533203125,
-0.002307891845703125,
-0.0267486572265625,
0.0268707275390625,
-0.006549835205078125,
-0.063720703125,
0.031646728515625,
-0.024810791015625,
-0.0043792724609375,
0.0041046142578125,
0.0268707275390625,
-0.04705810546875,
-0.041412353515625,
0.00289154052734375,
0.03033447265625,
0.0262451171875,
0.000919342041015625,
-0.060211181640625,
-0.0047149658203125,
0.00632476806640625,
-0.0219573974609375,
0.017333984375,
0.034149169921875,
0.01073455810546875,
0.0254669189453125,
0.05633544921875,
-0.0036487579345703125,
0.0246429443359375,
-0.00984954833984375,
0.046234130859375,
-0.047943115234375,
-0.037689208984375,
-0.0489501953125,
0.0518798828125,
-0.0182952880859375,
-0.02178955078125,
0.05548095703125,
0.07476806640625,
0.048431396484375,
-0.037353515625,
0.039825439453125,
-0.0010156631469726562,
0.06781005859375,
-0.0394287109375,
0.059906005859375,
-0.02984619140625,
0.02655029296875,
-0.018463134765625,
-0.078857421875,
-0.026611328125,
0.0435791015625,
-0.03271484375,
0.007328033447265625,
0.06292724609375,
0.07281494140625,
-0.00638580322265625,
-0.0008835792541503906,
0.0262908935546875,
0.031341552734375,
0.007171630859375,
0.06231689453125,
0.06671142578125,
-0.053924560546875,
0.07415771484375,
-0.045196533203125,
-0.0111541748046875,
0.00344085693359375,
-0.050384521484375,
-0.05804443359375,
-0.06243896484375,
-0.038116455078125,
-0.051025390625,
-0.007701873779296875,
0.06341552734375,
0.055694580078125,
-0.073974609375,
-0.040435791015625,
0.005008697509765625,
0.0160369873046875,
-0.03143310546875,
-0.019866943359375,
0.07135009765625,
-0.0234832763671875,
-0.06475830078125,
0.006198883056640625,
-0.0159454345703125,
-0.0049896240234375,
-0.00859832763671875,
0.0105133056640625,
-0.022308349609375,
0.0003876686096191406,
0.03643798828125,
0.0270843505859375,
-0.0217437744140625,
-0.01155853271484375,
0.00855255126953125,
0.00435638427734375,
0.031005859375,
0.049713134765625,
-0.0745849609375,
0.01192474365234375,
0.0258026123046875,
0.01255035400390625,
0.0701904296875,
-0.0126800537109375,
0.028045654296875,
-0.045318603515625,
0.0017290115356445312,
0.00885772705078125,
0.0347900390625,
0.0214080810546875,
-0.0147552490234375,
0.05657958984375,
0.02899169921875,
-0.045806884765625,
-0.0638427734375,
-0.0172271728515625,
-0.0738525390625,
-0.01395416259765625,
0.08856201171875,
-0.004764556884765625,
-0.036102294921875,
-0.0223541259765625,
-0.0258636474609375,
0.033294677734375,
-0.0090179443359375,
0.0478515625,
0.02069091796875,
-0.00748443603515625,
-0.0221405029296875,
-0.035858154296875,
0.038299560546875,
0.0251312255859375,
-0.03662109375,
-0.0025806427001953125,
0.00926971435546875,
0.026214599609375,
0.0225982666015625,
0.047271728515625,
-0.017242431640625,
0.0171661376953125,
0.021087646484375,
0.0208587646484375,
-0.010040283203125,
0.0014925003051757812,
-0.0112152099609375,
0.01546478271484375,
-0.021270751953125,
-0.03094482421875
]
] |
bert-large-uncased | 2022-11-14T21:36:14.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1810.04805",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | bert-large-uncased | 56 | 1,040,642 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# BERT large model (uncased)
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference
between english and English.
Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by
the Hugging Face team.
## Model description
BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it
was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
This model has the following configuration:
- 24-layer
- 1024 hidden dimensions
- 16 attention heads
- 336M parameters.
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at a model like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-large-uncased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'sequence': "[CLS] hello i'm a fashion model. [SEP]",
'score': 0.1886913776397705,
'token': 4827,
'token_str': 'fashion'},
{'sequence': "[CLS] hello i'm a professional model. [SEP]",
'score': 0.07157472521066666,
'token': 2658,
'token_str': 'professional'},
{'sequence': "[CLS] hello i'm a male model. [SEP]",
'score': 0.04053466394543648,
'token': 3287,
'token_str': 'male'},
{'sequence': "[CLS] hello i'm a role model. [SEP]",
'score': 0.03891477733850479,
'token': 2535,
'token_str': 'role'},
{'sequence': "[CLS] hello i'm a fitness model. [SEP]",
'score': 0.03038121573626995,
'token': 10516,
'token_str': 'fitness'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('bert-large-uncased')
model = BertModel.from_pretrained("bert-large-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('bert-large-uncased')
model = TFBertModel.from_pretrained("bert-large-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-large-uncased')
>>> unmasker("The man worked as a [MASK].")
[{'sequence': '[CLS] the man worked as a bartender. [SEP]',
'score': 0.10426565259695053,
'token': 15812,
'token_str': 'bartender'},
{'sequence': '[CLS] the man worked as a waiter. [SEP]',
'score': 0.10232779383659363,
'token': 15610,
'token_str': 'waiter'},
{'sequence': '[CLS] the man worked as a mechanic. [SEP]',
'score': 0.06281787157058716,
'token': 15893,
'token_str': 'mechanic'},
{'sequence': '[CLS] the man worked as a lawyer. [SEP]',
'score': 0.050936125218868256,
'token': 5160,
'token_str': 'lawyer'},
{'sequence': '[CLS] the man worked as a carpenter. [SEP]',
'score': 0.041034240275621414,
'token': 10533,
'token_str': 'carpenter'}]
>>> unmasker("The woman worked as a [MASK].")
[{'sequence': '[CLS] the woman worked as a waitress. [SEP]',
'score': 0.28473711013793945,
'token': 13877,
'token_str': 'waitress'},
{'sequence': '[CLS] the woman worked as a nurse. [SEP]',
'score': 0.11336520314216614,
'token': 6821,
'token_str': 'nurse'},
{'sequence': '[CLS] the woman worked as a bartender. [SEP]',
'score': 0.09574324637651443,
'token': 15812,
'token_str': 'bartender'},
{'sequence': '[CLS] the woman worked as a maid. [SEP]',
'score': 0.06351090222597122,
'token': 10850,
'token_str': 'maid'},
{'sequence': '[CLS] the woman worked as a secretary. [SEP]',
'score': 0.048970773816108704,
'token': 3187,
'token_str': 'secretary'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text, usually longer than a single sentence. The only constraint is that the two
"sentences" have a combined length of less than 512 tokens.
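As a rough illustration, the sentence-pair input above can be sketched with a small helper (`build_pair` is a hypothetical function for this example, not part of the BERT codebase):

```python
def build_pair(tokens_a, tokens_b, max_len=512):
    """Compose BERT's pretraining input: [CLS] A [SEP] B [SEP], under a 512-token cap."""
    seq = ["[CLS]"] + tokens_a + ["[SEP]"] + tokens_b + ["[SEP]"]
    if len(seq) > max_len:
        raise ValueError("the two 'sentences' must fit in 512 tokens combined")
    # segment ids distinguish sentence A (0) from sentence B (1)
    segment_ids = [0] * (len(tokens_a) + 2) + [1] * (len(tokens_b) + 1)
    return seq, segment_ids
```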
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
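The 80/10/10 rule above can be sketched as follows (illustrative only; the actual implementation operates on token ids and handles special tokens):

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", seed=0):
    """Apply BERT's 80/10/10 masking rule to 15% of the tokens."""
    rng = random.Random(seed)
    out, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < 0.15:             # 15% of the tokens are selected
            labels[i] = tok                 # the model must recover the original token
            r = rng.random()
            if r < 0.8:
                out[i] = mask_token         # 80%: replace with [MASK]
            elif r < 0.9:
                out[i] = rng.choice(vocab)  # 10%: replace with a random token
            # remaining 10%: leave the token unchanged
    return out, labels
```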
### Pretraining
The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size
of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
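The learning-rate schedule described above (linear warmup for 10,000 steps, then linear decay) can be sketched as a small function (`bert_lr` is a hypothetical name for this example):

```python
def bert_lr(step, base_lr=1e-4, warmup_steps=10_000, total_steps=1_000_000):
    """Linear warmup to base_lr over warmup_steps, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)
```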
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
Model | SQUAD 1.1 F1/EM | Multi NLI Accuracy
---------------------------------------- | :-------------: | :----------------:
BERT-Large, Uncased (Original) | 91.0/84.3 | 86.05
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1810-04805,
author = {Jacob Devlin and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
Understanding},
journal = {CoRR},
volume = {abs/1810.04805},
year = {2018},
url = {http://arxiv.org/abs/1810.04805},
archivePrefix = {arXiv},
eprint = {1810.04805},
timestamp = {Tue, 30 Oct 2018 20:39:56 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 8,961 | [
[
-0.00824737548828125,
-0.044677734375,
0.0166168212890625,
0.0220947265625,
-0.04248046875,
0.004222869873046875,
-0.004711151123046875,
-0.01503753662109375,
0.031158447265625,
0.0394287109375,
-0.04437255859375,
-0.031646728515625,
-0.05950927734375,
0.0140228271484375,
-0.038604736328125,
0.0848388671875,
0.0191497802734375,
0.024932861328125,
0.005558013916015625,
0.01419830322265625,
-0.034149169921875,
-0.054962158203125,
-0.061614990234375,
-0.0223388671875,
0.03375244140625,
0.022735595703125,
0.044525146484375,
0.04425048828125,
0.033966064453125,
0.0298919677734375,
-0.004985809326171875,
-0.008087158203125,
-0.0260162353515625,
0.00762939453125,
-0.00266265869140625,
-0.0438232421875,
-0.02740478515625,
0.012298583984375,
0.04107666015625,
0.059783935546875,
-0.0002219676971435547,
0.024810791015625,
-0.00896453857421875,
0.04437255859375,
-0.01312255859375,
0.0264129638671875,
-0.0394287109375,
0.00928497314453125,
-0.020050048828125,
0.009429931640625,
-0.0275726318359375,
-0.0146484375,
0.01070404052734375,
-0.041778564453125,
0.0163421630859375,
0.0175018310546875,
0.0787353515625,
0.011474609375,
-0.0146942138671875,
-0.007762908935546875,
-0.035064697265625,
0.056060791015625,
-0.052764892578125,
0.0119476318359375,
0.03875732421875,
0.0190277099609375,
-0.0168304443359375,
-0.07476806640625,
-0.0277252197265625,
-0.00397491455078125,
-0.005611419677734375,
0.00197601318359375,
-0.0008401870727539062,
-0.006084442138671875,
0.0253143310546875,
0.0281524658203125,
-0.0236968994140625,
0.004901885986328125,
-0.055877685546875,
-0.024017333984375,
0.0516357421875,
0.01207733154296875,
0.01263427734375,
-0.0252227783203125,
-0.0249176025390625,
-0.0235595703125,
-0.021087646484375,
0.0088348388671875,
0.044952392578125,
0.033294677734375,
-0.01367950439453125,
0.060943603515625,
-0.01312255859375,
0.043060302734375,
0.002826690673828125,
-0.0007877349853515625,
0.034149169921875,
-0.00945281982421875,
-0.028656005859375,
0.003208160400390625,
0.072265625,
0.0181884765625,
0.031982421875,
-0.00435638427734375,
-0.0279083251953125,
-0.0009145736694335938,
0.0282440185546875,
-0.0458984375,
-0.02410888671875,
0.01178741455078125,
-0.039459228515625,
-0.03436279296875,
0.0369873046875,
-0.050018310546875,
-0.00833892822265625,
-0.005992889404296875,
0.0445556640625,
-0.0245208740234375,
-0.00612640380859375,
0.0112152099609375,
-0.042205810546875,
0.010833740234375,
0.00390625,
-0.0667724609375,
0.01666259765625,
0.050689697265625,
0.06341552734375,
0.0250701904296875,
-0.0080108642578125,
-0.0323486328125,
-0.015838623046875,
-0.0260467529296875,
0.03216552734375,
-0.0219573974609375,
-0.03558349609375,
0.002262115478515625,
0.02178955078125,
-0.00612640380859375,
-0.0165557861328125,
0.049468994140625,
-0.0382080078125,
0.04034423828125,
-0.0022449493408203125,
-0.042755126953125,
-0.01776123046875,
0.0023517608642578125,
-0.055267333984375,
0.0880126953125,
0.0236968994140625,
-0.054534912109375,
0.0253448486328125,
-0.0706787109375,
-0.044677734375,
0.0152587890625,
0.008880615234375,
-0.03643798828125,
0.0153350830078125,
0.009033203125,
0.033905029296875,
-0.00580596923828125,
0.025634765625,
-0.01812744140625,
-0.033966064453125,
0.032806396484375,
-0.0164642333984375,
0.07666015625,
0.01309967041015625,
-0.024139404296875,
0.01412200927734375,
-0.059234619140625,
-0.002086639404296875,
0.01654052734375,
-0.0289459228515625,
-0.011688232421875,
-0.007228851318359375,
0.0244903564453125,
0.0132904052734375,
0.0303955078125,
-0.0498046875,
0.0215301513671875,
-0.044891357421875,
0.052764892578125,
0.06494140625,
-0.005756378173828125,
0.0185394287109375,
-0.0284423828125,
0.038665771484375,
-0.005191802978515625,
-0.003940582275390625,
-0.01003265380859375,
-0.05706787109375,
-0.0516357421875,
-0.0289764404296875,
0.04449462890625,
0.052764892578125,
-0.03851318359375,
0.06158447265625,
-0.0024700164794921875,
-0.044189453125,
-0.046417236328125,
-0.007781982421875,
0.0233001708984375,
0.03216552734375,
0.0233917236328125,
-0.034515380859375,
-0.0655517578125,
-0.060089111328125,
-0.0218048095703125,
-0.01329803466796875,
-0.0203399658203125,
0.005870819091796875,
0.0567626953125,
-0.0219573974609375,
0.061126708984375,
-0.054962158203125,
-0.033172607421875,
-0.01309967041015625,
0.0195159912109375,
0.049835205078125,
0.054931640625,
0.026763916015625,
-0.041900634765625,
-0.0262451171875,
-0.032318115234375,
-0.041839599609375,
0.0019779205322265625,
-0.0014619827270507812,
-0.01137542724609375,
0.00907135009765625,
0.042572021484375,
-0.05712890625,
0.04132080078125,
0.0178680419921875,
-0.043182373046875,
0.05352783203125,
-0.028656005859375,
-0.007381439208984375,
-0.09442138671875,
0.0146942138671875,
-0.00946044921875,
-0.0267333984375,
-0.05169677734375,
-0.00206756591796875,
-0.01197052001953125,
-0.01052093505859375,
-0.037689208984375,
0.041168212890625,
-0.031768798828125,
-0.002689361572265625,
0.0029811859130859375,
-0.012908935546875,
0.0010156631469726562,
0.0323486328125,
0.0013713836669921875,
0.044158935546875,
0.041046142578125,
-0.041046142578125,
0.04107666015625,
0.032806396484375,
-0.0452880859375,
0.0113983154296875,
-0.06378173828125,
0.017578125,
0.0028057098388671875,
0.005458831787109375,
-0.087158203125,
-0.0275421142578125,
0.018402099609375,
-0.042755126953125,
0.0173797607421875,
-0.00405120849609375,
-0.059783935546875,
-0.049224853515625,
-0.02142333984375,
0.03411865234375,
0.0423583984375,
-0.0178375244140625,
0.030517578125,
0.02484130859375,
-0.00713348388671875,
-0.04791259765625,
-0.053955078125,
0.00876617431640625,
-0.0153350830078125,
-0.0372314453125,
0.0276641845703125,
-0.0013971328735351562,
-0.00873565673828125,
-0.015625,
0.00368499755859375,
-0.01128387451171875,
0.006256103515625,
0.020751953125,
0.032806396484375,
-0.0112762451171875,
-0.003795623779296875,
-0.016021728515625,
-0.008697509765625,
0.0229034423828125,
-0.0169525146484375,
0.062103271484375,
0.0014753341674804688,
-0.00415802001953125,
-0.027496337890625,
0.0235443115234375,
0.047698974609375,
-0.005641937255859375,
0.059967041015625,
0.06475830078125,
-0.042022705078125,
0.005901336669921875,
-0.0253448486328125,
-0.01528167724609375,
-0.03839111328125,
0.041046142578125,
-0.03387451171875,
-0.06317138671875,
0.05755615234375,
0.0235595703125,
-0.0119476318359375,
0.054962158203125,
0.041839599609375,
-0.01593017578125,
0.07537841796875,
0.035400390625,
-0.01087188720703125,
0.03802490234375,
-0.007656097412109375,
0.0249786376953125,
-0.054046630859375,
-0.03704833984375,
-0.036285400390625,
-0.023712158203125,
-0.036773681640625,
-0.01396942138671875,
0.0204315185546875,
0.01715087890625,
-0.036224365234375,
0.045806884765625,
-0.04644775390625,
0.02484130859375,
0.07415771484375,
0.0280609130859375,
-0.0178680419921875,
-0.0186920166015625,
-0.017547607421875,
0.00455474853515625,
-0.03411865234375,
-0.02484130859375,
0.0850830078125,
0.04254150390625,
0.053466796875,
0.005222320556640625,
0.048309326171875,
0.0265350341796875,
-0.00585174560546875,
-0.051025390625,
0.04595947265625,
-0.0254974365234375,
-0.07037353515625,
-0.03253173828125,
-0.0109405517578125,
-0.08148193359375,
0.00789642333984375,
-0.027740478515625,
-0.06646728515625,
-0.005584716796875,
-0.01313018798828125,
-0.025848388671875,
0.01447296142578125,
-0.051849365234375,
0.0804443359375,
-0.0218353271484375,
-0.00814056396484375,
0.006984710693359375,
-0.07379150390625,
0.019683837890625,
-0.003360748291015625,
0.0090789794921875,
-0.006977081298828125,
0.01464080810546875,
0.08416748046875,
-0.0435791015625,
0.0771484375,
-0.018310546875,
0.01641845703125,
0.0036468505859375,
-0.00524139404296875,
0.025970458984375,
0.0037555694580078125,
0.006046295166015625,
0.021697998046875,
0.004909515380859375,
-0.03594970703125,
-0.00769805908203125,
0.0214080810546875,
-0.058319091796875,
-0.036468505859375,
-0.047149658203125,
-0.047607421875,
0.01082611083984375,
0.034149169921875,
0.0426025390625,
0.03839111328125,
-0.011932373046875,
0.0250091552734375,
0.03533935546875,
-0.02166748046875,
0.05645751953125,
0.0197601318359375,
-0.0163116455078125,
-0.04058837890625,
0.0384521484375,
-0.0012083053588867188,
0.0008993148803710938,
0.03790283203125,
0.0146484375,
-0.050384521484375,
-0.0119476318359375,
-0.0243072509765625,
0.0107879638671875,
-0.043121337890625,
-0.0247802734375,
-0.040924072265625,
-0.032623291015625,
-0.050262451171875,
-0.004108428955078125,
-0.01215362548828125,
-0.03839111328125,
-0.049957275390625,
-0.01336669921875,
0.0357666015625,
0.05072021484375,
-0.00917816162109375,
0.039154052734375,
-0.055877685546875,
0.0188446044921875,
0.0214385986328125,
0.031097412109375,
-0.0218963623046875,
-0.05859375,
-0.024658203125,
-0.0015783309936523438,
-0.006633758544921875,
-0.0643310546875,
0.048858642578125,
0.0168914794921875,
0.037078857421875,
0.04254150390625,
-0.0013685226440429688,
0.0458984375,
-0.047393798828125,
0.0716552734375,
0.0182952880859375,
-0.08197021484375,
0.040740966796875,
-0.02581787109375,
0.018096923828125,
0.0300140380859375,
0.0164794921875,
-0.03533935546875,
-0.0280303955078125,
-0.065185546875,
-0.0732421875,
0.061767578125,
0.014007568359375,
0.0229949951171875,
-0.004352569580078125,
0.02020263671875,
0.006862640380859375,
0.03375244140625,
-0.07171630859375,
-0.034698486328125,
-0.035858154296875,
-0.0274505615234375,
-0.0168304443359375,
-0.019012451171875,
-0.0057830810546875,
-0.042236328125,
0.050445556640625,
0.01172637939453125,
0.04473876953125,
0.005847930908203125,
-0.00829315185546875,
0.007843017578125,
0.01239776611328125,
0.060302734375,
0.036041259765625,
-0.041046142578125,
0.0003116130828857422,
-0.00222015380859375,
-0.04620361328125,
0.004573822021484375,
0.01490020751953125,
0.003475189208984375,
0.017730712890625,
0.043609619140625,
0.061676025390625,
0.0190582275390625,
-0.039520263671875,
0.045989990234375,
0.01169586181640625,
-0.0252685546875,
-0.04571533203125,
0.008880615234375,
-0.0034923553466796875,
0.01174163818359375,
0.03826904296875,
0.01535797119140625,
0.002635955810546875,
-0.0440673828125,
0.03131103515625,
0.029266357421875,
-0.03704833984375,
-0.0165252685546875,
0.0740966796875,
0.0035495758056640625,
-0.055267333984375,
0.060272216796875,
-0.01239776611328125,
-0.056427001953125,
0.0552978515625,
0.050262451171875,
0.06976318359375,
-0.01482391357421875,
0.01486968994140625,
0.03704833984375,
0.0263214111328125,
-0.0262298583984375,
0.02691650390625,
0.02288818359375,
-0.059844970703125,
-0.02490234375,
-0.05682373046875,
-0.00923919677734375,
0.01387786865234375,
-0.062042236328125,
0.021759033203125,
-0.037750244140625,
-0.021087646484375,
0.0135498046875,
0.0004334449768066406,
-0.0538330078125,
0.033538818359375,
-0.0013675689697265625,
0.0804443359375,
-0.0775146484375,
0.07257080078125,
0.056121826171875,
-0.049041748046875,
-0.06494140625,
-0.03155517578125,
-0.0235137939453125,
-0.08355712890625,
0.056488037109375,
0.02825927734375,
0.02667236328125,
-0.0017232894897460938,
-0.045623779296875,
-0.0494384765625,
0.06427001953125,
0.009857177734375,
-0.037353515625,
-0.012359619140625,
0.005496978759765625,
0.04296875,
-0.040771484375,
0.030609130859375,
0.038116455078125,
0.03289794921875,
-0.004901885986328125,
-0.0604248046875,
0.00577545166015625,
-0.0276336669921875,
0.0002684593200683594,
0.0080413818359375,
-0.0301055908203125,
0.0870361328125,
-0.009368896484375,
0.00824737548828125,
0.017242431640625,
0.037689208984375,
-0.00027871131896972656,
0.0088043212890625,
0.0382080078125,
0.04644775390625,
0.05755615234375,
-0.0244598388671875,
0.0606689453125,
-0.016143798828125,
0.03717041015625,
0.0631103515625,
0.005825042724609375,
0.06103515625,
0.03125,
-0.0202789306640625,
0.06829833984375,
0.06658935546875,
-0.027740478515625,
0.057403564453125,
0.01959228515625,
-0.0038700103759765625,
-0.00563812255859375,
0.0128326416015625,
-0.0207061767578125,
0.041412353515625,
0.0181121826171875,
-0.044830322265625,
0.00833892822265625,
-0.00555419921875,
0.0110931396484375,
-0.01331329345703125,
-0.0384521484375,
0.05499267578125,
0.0126800537109375,
-0.05224609375,
0.020965576171875,
0.01824951171875,
0.047698974609375,
-0.04278564453125,
0.003692626953125,
-0.0022945404052734375,
0.0161285400390625,
-0.005519866943359375,
-0.0660400390625,
0.0144500732421875,
-0.0118865966796875,
-0.028961181640625,
-0.0166015625,
0.0546875,
-0.033660888671875,
-0.050140380859375,
0.0004391670227050781,
0.0201416015625,
0.025360107421875,
-0.01296234130859375,
-0.05755615234375,
-0.0213165283203125,
0.00047087669372558594,
-0.006755828857421875,
0.0117034912109375,
0.0266265869140625,
0.00740814208984375,
0.04132080078125,
0.060150146484375,
-0.0091552734375,
0.00612640380859375,
0.0027866363525390625,
0.052764892578125,
-0.07415771484375,
-0.065185546875,
-0.07489013671875,
0.0428466796875,
-0.009735107421875,
-0.0423583984375,
0.0455322265625,
0.055267333984375,
0.049285888671875,
-0.034942626953125,
0.035003662109375,
-0.0121307373046875,
0.03729248046875,
-0.0323486328125,
0.05743408203125,
-0.0265960693359375,
-0.0006718635559082031,
-0.028961181640625,
-0.0625,
-0.0234222412109375,
0.06329345703125,
-0.004375457763671875,
0.0008311271667480469,
0.053253173828125,
0.0435791015625,
0.007038116455078125,
-0.01003265380859375,
0.01079559326171875,
0.011505126953125,
0.00627899169921875,
0.030426025390625,
0.038238525390625,
-0.045867919921875,
0.0294342041015625,
-0.00769805908203125,
-0.0035724639892578125,
-0.0290679931640625,
-0.06707763671875,
-0.08038330078125,
-0.04351806640625,
-0.016082763671875,
-0.0484619140625,
-0.0157318115234375,
0.07275390625,
0.05645751953125,
-0.0667724609375,
-0.02056884765625,
-0.00724029541015625,
0.00360870361328125,
-0.023101806640625,
-0.0211181640625,
0.03448486328125,
-0.0174407958984375,
-0.05352783203125,
0.01947021484375,
-0.0031585693359375,
0.00665283203125,
-0.01261138916015625,
0.00579833984375,
-0.030181884765625,
0.00421142578125,
0.038848876953125,
0.0086212158203125,
-0.055816650390625,
-0.03924560546875,
0.0026912689208984375,
-0.0119476318359375,
0.0099029541015625,
0.034820556640625,
-0.040496826171875,
0.0299072265625,
0.0306243896484375,
0.033416748046875,
0.05511474609375,
0.0118255615234375,
0.049407958984375,
-0.0888671875,
0.0232696533203125,
0.0150604248046875,
0.037933349609375,
0.024139404296875,
-0.035797119140625,
0.0404052734375,
0.037322998046875,
-0.036773681640625,
-0.06414794921875,
-0.0043182373046875,
-0.0770263671875,
-0.0217437744140625,
0.0643310546875,
-0.00998687744140625,
-0.0239715576171875,
-0.007678985595703125,
-0.0244598388671875,
0.032745361328125,
-0.03216552734375,
0.057220458984375,
0.06524658203125,
0.00543975830078125,
-0.00710296630859375,
-0.0289154052734375,
0.029754638671875,
0.0279998779296875,
-0.0333251953125,
-0.03546142578125,
0.00839996337890625,
0.03533935546875,
0.01641845703125,
0.041656494140625,
-0.0032520294189453125,
0.01007080078125,
0.01629638671875,
0.013824462890625,
-0.00873565673828125,
-0.00991058349609375,
-0.02197265625,
0.0120086669921875,
-0.009124755859375,
-0.0545654296875
]
] |
cardiffnlp/twitter-xlm-roberta-base-sentiment | 2023-07-19T20:41:38.000Z | [
"transformers",
"pytorch",
"tf",
"xlm-roberta",
"text-classification",
"multilingual",
"arxiv:2104.12250",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cardiffnlp | null | null | cardiffnlp/twitter-xlm-roberta-base-sentiment | 147 | 1,038,434 | transformers | 2022-03-02T23:29:05 | ---
language: multilingual
widget:
- text: "🤗"
- text: "T'estimo! ❤️"
- text: "I love you!"
- text: "I hate you 🤮"
- text: "Mahal kita!"
- text: "사랑해!"
- text: "난 너가 싫어"
- text: "😍😍😍"
---
# twitter-XLM-roBERTa-base for Sentiment Analysis
This is a multilingual XLM-roBERTa-base model trained on ~198M tweets and fine-tuned for sentiment analysis. The sentiment fine-tuning was done on eight languages (Ar, En, Fr, De, Hi, It, Sp, Pt), but the model can be used for more languages (see the paper for details).
- Paper: [XLM-T: A Multilingual Language Model Toolkit for Twitter](https://arxiv.org/abs/2104.12250).
- Git Repo: [XLM-T official repository](https://github.com/cardiffnlp/xlm-t).
This model has been integrated into the [TweetNLP library](https://github.com/cardiffnlp/tweetnlp).
## Example Pipeline
```python
from transformers import pipeline
model_path = "cardiffnlp/twitter-xlm-roberta-base-sentiment"
sentiment_task = pipeline("sentiment-analysis", model=model_path, tokenizer=model_path)
sentiment_task("T'estimo!")
```
```
[{'label': 'Positive', 'score': 0.6600581407546997}]
```
## Full classification example
```python
from transformers import AutoModelForSequenceClassification
from transformers import TFAutoModelForSequenceClassification
from transformers import AutoTokenizer, AutoConfig
import numpy as np
from scipy.special import softmax
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
MODEL = "cardiffnlp/twitter-xlm-roberta-base-sentiment"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
config = AutoConfig.from_pretrained(MODEL)
# PT
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.save_pretrained(MODEL)
text = "Good night ๐"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
scores = softmax(scores)
# # TF
# model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
# model.save_pretrained(MODEL)
# text = "Good night ๐"
# encoded_input = tokenizer(text, return_tensors='tf')
# output = model(encoded_input)
# scores = output[0][0].numpy()
# scores = softmax(scores)
# Print labels and scores
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(scores.shape[0]):
l = config.id2label[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
Output:
```
1) Positive 0.7673
2) Neutral 0.2015
3) Negative 0.0313
```
### Reference
```
@inproceedings{barbieri-etal-2022-xlm,
title = "{XLM}-{T}: Multilingual Language Models in {T}witter for Sentiment Analysis and Beyond",
author = "Barbieri, Francesco and
Espinosa Anke, Luis and
Camacho-Collados, Jose",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.27",
pages = "258--266"
}
```
| 3,249 | [
[
-0.0160675048828125,
-0.046539306640625,
0.01490020751953125,
0.030364990234375,
-0.015594482421875,
0.0193634033203125,
-0.03240966796875,
-0.015869140625,
0.0180816650390625,
0.0102081298828125,
-0.045135498046875,
-0.07318115234375,
-0.054779052734375,
0.0126953125,
-0.008575439453125,
0.08319091796875,
-0.00864410400390625,
0.019500732421875,
0.0216217041015625,
-0.030364990234375,
-0.00571441650390625,
-0.043426513671875,
-0.064453125,
-0.024658203125,
0.036895751953125,
0.0242156982421875,
0.029449462890625,
0.0167236328125,
0.032073974609375,
0.0380859375,
-0.0028476715087890625,
0.00675201416015625,
-0.033447265625,
0.00662994384765625,
-0.00441741943359375,
-0.0298919677734375,
-0.049041748046875,
0.00981903076171875,
0.054534912109375,
0.04779052734375,
0.01271820068359375,
0.0278472900390625,
0.0090484619140625,
0.03448486328125,
-0.02850341796875,
0.0198211669921875,
-0.02105712890625,
-0.0005402565002441406,
-0.0003314018249511719,
-0.012908935546875,
-0.0258026123046875,
-0.04315185546875,
-0.0014896392822265625,
-0.02459716796875,
0.007080078125,
-0.00998687744140625,
0.08563232421875,
0.003574371337890625,
-0.02252197265625,
-0.007656097412109375,
-0.0364990234375,
0.0914306640625,
-0.0650634765625,
0.017303466796875,
0.01024627685546875,
0.0018625259399414062,
0.012603759765625,
-0.038818359375,
-0.0343017578125,
-0.007633209228515625,
0.0022869110107421875,
0.02984619140625,
-0.01496124267578125,
-0.0075531005859375,
0.01102447509765625,
0.0175628662109375,
-0.037933349609375,
-0.0070037841796875,
-0.0279693603515625,
-0.01123809814453125,
0.0478515625,
0.0117034912109375,
0.0168609619140625,
-0.025726318359375,
-0.0194854736328125,
-0.01325225830078125,
-0.00858306884765625,
0.004650115966796875,
0.02001953125,
0.042877197265625,
-0.0293426513671875,
0.04412841796875,
-0.00579833984375,
0.03680419921875,
0.0035266876220703125,
-0.0118255615234375,
0.057281494140625,
-0.0191497802734375,
-0.0177764892578125,
-0.008758544921875,
0.09674072265625,
0.035614013671875,
0.0255126953125,
-0.0006189346313476562,
-0.0277252197265625,
0.0018243789672851562,
-0.0157470703125,
-0.0654296875,
-0.003170013427734375,
0.02874755859375,
-0.0391845703125,
-0.03863525390625,
0.0133819580078125,
-0.053192138671875,
-0.004001617431640625,
-0.0118255615234375,
0.061309814453125,
-0.042083740234375,
-0.040618896484375,
0.007549285888671875,
-0.01306915283203125,
0.00702667236328125,
-0.0030651092529296875,
-0.0423583984375,
0.0081024169921875,
0.035003662109375,
0.0767822265625,
-0.00152587890625,
-0.035430908203125,
-0.021881103515625,
-0.007904052734375,
-0.02496337890625,
0.042877197265625,
-0.024871826171875,
-0.0307159423828125,
0.004596710205078125,
0.0064239501953125,
-0.0178375244140625,
-0.020721435546875,
0.0362548828125,
-0.0160369873046875,
0.038848876953125,
-0.005390167236328125,
-0.0426025390625,
-0.004856109619140625,
0.02685546875,
-0.031585693359375,
0.08734130859375,
0.0222625732421875,
-0.052337646484375,
0.0138092041015625,
-0.0645751953125,
-0.025543212890625,
-0.0103302001953125,
0.0186309814453125,
-0.03753662109375,
0.0012388229370117188,
0.00926971435546875,
0.042694091796875,
-0.00510406494140625,
0.01212310791015625,
-0.03253173828125,
-0.013641357421875,
0.0182342529296875,
-0.0154876708984375,
0.08831787109375,
0.024871826171875,
-0.03363037109375,
0.016082763671875,
-0.054534912109375,
0.02178955078125,
0.007373809814453125,
-0.030242919921875,
-0.01479339599609375,
-0.0214385986328125,
0.02569580078125,
0.03070068359375,
0.0268096923828125,
-0.048370361328125,
0.005435943603515625,
-0.038330078125,
0.041351318359375,
0.052825927734375,
-0.00983428955078125,
0.0301361083984375,
-0.02105712890625,
0.03570556640625,
0.0179290771484375,
0.015960693359375,
0.0009746551513671875,
-0.0258026123046875,
-0.06365966796875,
-0.01061248779296875,
0.0247802734375,
0.046630859375,
-0.043243408203125,
0.042938232421875,
-0.0299224853515625,
-0.047088623046875,
-0.04254150390625,
0.0015211105346679688,
0.0252838134765625,
0.0379638671875,
0.033416748046875,
0.0003879070281982422,
-0.058380126953125,
-0.04852294921875,
-0.0316162109375,
-0.01568603515625,
0.0176544189453125,
0.0168609619140625,
0.049713134765625,
-0.031646728515625,
0.053436279296875,
-0.03753662109375,
-0.031158447265625,
-0.040130615234375,
0.0226287841796875,
0.044464111328125,
0.052337646484375,
0.05743408203125,
-0.04443359375,
-0.061004638671875,
-0.013336181640625,
-0.05712890625,
-0.015289306640625,
0.0139007568359375,
-0.017547607421875,
0.04144287109375,
0.02252197265625,
-0.042022705078125,
0.010345458984375,
0.037017822265625,
-0.042572021484375,
0.02801513671875,
-0.002349853515625,
0.02587890625,
-0.11279296875,
0.0012750625610351562,
0.026336669921875,
-0.01013946533203125,
-0.046905517578125,
-0.007770538330078125,
0.0002193450927734375,
0.00649261474609375,
-0.033416748046875,
0.056488037109375,
-0.0212860107421875,
0.0257720947265625,
-0.00018584728240966797,
0.0023250579833984375,
0.0037841796875,
0.040191650390625,
-0.0013523101806640625,
0.0380859375,
0.050262451171875,
-0.02716064453125,
0.021209716796875,
0.00864410400390625,
-0.00611114501953125,
0.037750244140625,
-0.048095703125,
-0.0103607177734375,
-0.0003895759582519531,
0.002834320068359375,
-0.0906982421875,
-0.00926971435546875,
0.0230865478515625,
-0.0626220703125,
0.0350341796875,
-0.0209808349609375,
-0.04022216796875,
-0.0257720947265625,
-0.04022216796875,
0.0241241455078125,
0.037139892578125,
-0.0263824462890625,
0.0491943359375,
0.0304412841796875,
0.0024776458740234375,
-0.052337646484375,
-0.0638427734375,
0.00887298583984375,
-0.023101806640625,
-0.050506591796875,
0.015594482421875,
-0.0173492431640625,
-0.0191497802734375,
-0.0024242401123046875,
0.00905609130859375,
-0.0106048583984375,
-0.002521514892578125,
0.007686614990234375,
0.0267181396484375,
-0.0230560302734375,
0.009613037109375,
-0.01026153564453125,
-0.00750732421875,
0.0074005126953125,
-0.032470703125,
0.0501708984375,
-0.03118896484375,
0.01513671875,
-0.0396728515625,
0.0246124267578125,
0.032562255859375,
-0.00435638427734375,
0.076904296875,
0.07958984375,
-0.0286712646484375,
-0.0122833251953125,
-0.040130615234375,
-0.008819580078125,
-0.03753662109375,
0.0406494140625,
-0.025848388671875,
-0.05810546875,
0.048004150390625,
0.01139068603515625,
0.003101348876953125,
0.06341552734375,
0.05206298828125,
-0.006847381591796875,
0.0987548828125,
0.043426513671875,
-0.0174407958984375,
0.04315185546875,
-0.061126708984375,
0.0103302001953125,
-0.03900146484375,
-0.0201568603515625,
-0.04461669921875,
-0.0094146728515625,
-0.06658935546875,
-0.01129913330078125,
0.01305389404296875,
-0.00411224365234375,
-0.039764404296875,
0.0113983154296875,
-0.04144287109375,
0.016387939453125,
0.03387451171875,
0.00543212890625,
-0.0100250244140625,
0.0014886856079101562,
-0.01416778564453125,
-0.01348876953125,
-0.044525146484375,
-0.039947509765625,
0.081298828125,
0.028472900390625,
0.045867919921875,
0.01177215576171875,
0.06494140625,
0.00801849365234375,
0.032379150390625,
-0.06390380859375,
0.040313720703125,
-0.02874755859375,
-0.04144287109375,
-0.01160430908203125,
-0.052947998046875,
-0.0543212890625,
0.0129547119140625,
-0.0117340087890625,
-0.05078125,
-0.001476287841796875,
-0.0038204193115234375,
-0.0152740478515625,
0.030548095703125,
-0.0655517578125,
0.07000732421875,
-0.0131072998046875,
-0.03570556640625,
0.00004118680953979492,
-0.0289306640625,
0.01507568359375,
0.00591278076171875,
0.0300140380859375,
-0.018402099609375,
-0.01090240478515625,
0.07586669921875,
-0.0369873046875,
0.061279296875,
-0.0209808349609375,
0.0062713623046875,
0.00839996337890625,
-0.0016603469848632812,
0.00873565673828125,
0.007770538330078125,
-0.0277862548828125,
0.0218353271484375,
0.00011658668518066406,
-0.03509521484375,
-0.01152801513671875,
0.0697021484375,
-0.08447265625,
-0.031158447265625,
-0.05413818359375,
-0.031768798828125,
-0.01337432861328125,
0.029083251953125,
0.036773681640625,
0.04248046875,
-0.0031681060791015625,
0.01491546630859375,
0.035736083984375,
-0.0253753662109375,
0.048370361328125,
0.0276947021484375,
-0.006397247314453125,
-0.045257568359375,
0.069091796875,
0.0150909423828125,
0.00279998779296875,
0.033447265625,
0.02703857421875,
-0.033233642578125,
-0.022369384765625,
-0.0102996826171875,
0.030548095703125,
-0.0540771484375,
-0.02667236328125,
-0.055755615234375,
-0.0258026123046875,
-0.053741455078125,
-0.0019130706787109375,
-0.02532958984375,
-0.044525146484375,
-0.0282440185546875,
-0.013519287109375,
0.03167724609375,
0.05438232421875,
-0.0271148681640625,
0.021240234375,
-0.055267333984375,
0.01424407958984375,
-0.005092620849609375,
0.034942626953125,
-0.0071563720703125,
-0.05621337890625,
-0.0306396484375,
0.00839996337890625,
-0.0228271484375,
-0.0633544921875,
0.056854248046875,
0.0211944580078125,
0.03143310546875,
0.018798828125,
-0.0008268356323242188,
0.041961669921875,
-0.0157318115234375,
0.06005859375,
0.0225982666015625,
-0.0797119140625,
0.04205322265625,
-0.0310821533203125,
0.03594970703125,
0.03338623046875,
0.032958984375,
-0.04571533203125,
-0.04473876953125,
-0.041839599609375,
-0.07916259765625,
0.07830810546875,
0.0130157470703125,
0.025726318359375,
-0.01178741455078125,
0.0137176513671875,
-0.0031032562255859375,
0.01009368896484375,
-0.07110595703125,
-0.04791259765625,
-0.040008544921875,
-0.04461669921875,
-0.026336669921875,
-0.0245208740234375,
-0.0013475418090820312,
-0.0374755859375,
0.0716552734375,
0.0030918121337890625,
0.029937744140625,
0.005889892578125,
-0.0165863037109375,
-0.01256561279296875,
0.0108642578125,
0.03704833984375,
0.04974365234375,
-0.035858154296875,
-0.0021762847900390625,
0.0216522216796875,
-0.037200927734375,
0.0026226043701171875,
0.01253509521484375,
-0.003238677978515625,
0.02117919921875,
0.031280517578125,
0.052276611328125,
0.0132598876953125,
-0.010223388671875,
0.036041259765625,
-0.0156402587890625,
-0.0290069580078125,
-0.033966064453125,
-0.0017061233520507812,
0.000640869140625,
0.01363372802734375,
0.04449462890625,
0.0117340087890625,
-0.0018281936645507812,
-0.03826904296875,
0.0025424957275390625,
0.020050048828125,
-0.046234130859375,
-0.03460693359375,
0.052337646484375,
0.005176544189453125,
-0.0243072509765625,
0.029754638671875,
-0.00803375244140625,
-0.0653076171875,
0.042572021484375,
0.0235443115234375,
0.08111572265625,
-0.020721435546875,
0.0299835205078125,
0.06451416015625,
0.01332855224609375,
-0.00453948974609375,
0.0435791015625,
0.01247406005859375,
-0.05853271484375,
-0.0167236328125,
-0.06005859375,
-0.00946807861328125,
0.00930023193359375,
-0.03936767578125,
0.0192108154296875,
-0.028289794921875,
-0.03936767578125,
0.00832366943359375,
0.031219482421875,
-0.04998779296875,
0.03533935546875,
0.00952911376953125,
0.06292724609375,
-0.0660400390625,
0.05712890625,
0.0670166015625,
-0.04364013671875,
-0.0728759765625,
-0.00994110107421875,
-0.00965118408203125,
-0.04010009765625,
0.056793212890625,
0.020233154296875,
-0.00817108154296875,
0.0111541748046875,
-0.044158935546875,
-0.0621337890625,
0.06964111328125,
0.01544189453125,
-0.0113372802734375,
0.01061248779296875,
0.023468017578125,
0.054412841796875,
-0.0278472900390625,
0.040985107421875,
0.0347900390625,
0.03936767578125,
-0.00896453857421875,
-0.047149658203125,
0.0022335052490234375,
-0.037017822265625,
-0.01081085205078125,
0.007678985595703125,
-0.070068359375,
0.08135986328125,
-0.0069122314453125,
0.005924224853515625,
-0.001251220703125,
0.042724609375,
0.022308349609375,
0.013580322265625,
0.0305023193359375,
0.03900146484375,
0.039794921875,
-0.0299835205078125,
0.064453125,
-0.040679931640625,
0.0648193359375,
0.04791259765625,
0.0139007568359375,
0.056427001953125,
0.03289794921875,
-0.0106964111328125,
0.04339599609375,
0.051544189453125,
-0.002750396728515625,
0.0343017578125,
-0.005153656005859375,
-0.0082550048828125,
-0.020416259765625,
-0.00653839111328125,
-0.02447509765625,
0.0272674560546875,
0.02459716796875,
-0.031005859375,
-0.017181396484375,
-0.00335693359375,
0.0248565673828125,
-0.0091705322265625,
-0.0295867919921875,
0.0411376953125,
0.0200042724609375,
-0.051910400390625,
0.052337646484375,
0.01439666748046875,
0.068115234375,
-0.0391845703125,
0.01467132568359375,
-0.019744873046875,
0.041534423828125,
-0.025970458984375,
-0.06964111328125,
0.00978851318359375,
0.01256561279296875,
-0.004119873046875,
-0.0306243896484375,
0.033050537109375,
-0.0367431640625,
-0.06707763671875,
0.045501708984375,
0.036407470703125,
0.004138946533203125,
0.016204833984375,
-0.0872802734375,
0.006153106689453125,
0.0011911392211914062,
-0.057281494140625,
0.005649566650390625,
0.040008544921875,
0.0137176513671875,
0.041961669921875,
0.03326416015625,
0.01023101806640625,
0.005191802978515625,
0.033447265625,
0.0584716796875,
-0.053955078125,
-0.0300140380859375,
-0.07958984375,
0.0382080078125,
-0.00930023193359375,
-0.040802001953125,
0.06817626953125,
0.053955078125,
0.0582275390625,
-0.00289154052734375,
0.0677490234375,
-0.01465606689453125,
0.0406494140625,
-0.0207672119140625,
0.06524658203125,
-0.062103271484375,
0.0018405914306640625,
-0.0261077880859375,
-0.06488037109375,
-0.0301055908203125,
0.048583984375,
-0.02984619140625,
0.03326416015625,
0.04876708984375,
0.056732177734375,
-0.000400543212890625,
-0.024932861328125,
0.0175018310546875,
0.045135498046875,
0.019134521484375,
0.045989990234375,
0.043304443359375,
-0.048309326171875,
0.053314208984375,
-0.04315185546875,
-0.01068115234375,
-0.0167999267578125,
-0.054473876953125,
-0.0845947265625,
-0.05291748046875,
-0.03265380859375,
-0.060577392578125,
-0.005970001220703125,
0.08123779296875,
0.04241943359375,
-0.08123779296875,
-0.035858154296875,
0.01416778564453125,
0.008575439453125,
0.0018415451049804688,
-0.02362060546875,
0.04022216796875,
-0.040374755859375,
-0.073974609375,
-0.005756378173828125,
0.0018224716186523438,
0.003246307373046875,
-0.007221221923828125,
-0.01168060302734375,
-0.02154541015625,
0.00818634033203125,
0.041595458984375,
-0.0006403923034667969,
-0.036834716796875,
-0.01432037353515625,
0.0117645263671875,
-0.032928466796875,
0.014068603515625,
0.0164337158203125,
-0.034698486328125,
0.01195526123046875,
0.044677734375,
0.0013189315795898438,
0.03338623046875,
-0.007480621337890625,
0.0347900390625,
-0.05364990234375,
0.01107025146484375,
0.0166473388671875,
0.039642333984375,
0.039337158203125,
-0.0111846923828125,
0.036224365234375,
0.0205230712890625,
-0.031646728515625,
-0.07049560546875,
-0.0188751220703125,
-0.0762939453125,
-0.0205230712890625,
0.096923828125,
-0.0103607177734375,
-0.0301361083984375,
0.00826263427734375,
-0.010223388671875,
0.044525146484375,
-0.04425048828125,
0.060546875,
0.04852294921875,
0.01320648193359375,
-0.00720977783203125,
-0.0283660888671875,
0.03863525390625,
0.021240234375,
-0.037322998046875,
-0.022064208984375,
0.0047454833984375,
0.0423583984375,
0.00811767578125,
0.0462646484375,
-0.0031890869140625,
0.0169677734375,
-0.01416015625,
0.007080078125,
-0.0057525634765625,
0.002727508544921875,
-0.0294342041015625,
0.01331329345703125,
-0.0167694091796875,
-0.01233673095703125
]
] |
nvidia/speakerverification_en_titanet_large | 2023-03-13T19:13:57.000Z | [
"nemo",
"speaker",
"speech",
"audio",
"speaker-verification",
"speaker-recognition",
"speaker-diarization",
"titanet",
"NeMo",
"pytorch",
"en",
"dataset:VOXCELEB-1",
"dataset:VOXCELEB-2",
"dataset:FISHER",
"dataset:switchboard",
"dataset:librispeech_asr",
"dataset:SRE(2004-2010)",
"license:cc-by-4.0",
"model-index",
"has_space",
"region:us"
] | null | nvidia | null | null | nvidia/speakerverification_en_titanet_large | 35 | 1,036,656 | nemo | 2022-07-15T00:26:00 | ---
language:
- en
library_name: nemo
datasets:
- VOXCELEB-1
- VOXCELEB-2
- FISHER
- switchboard
- librispeech_asr
- SRE(2004-2010)
thumbnail: null
tags:
- speaker
- speech
- audio
- speaker-verification
- speaker-recognition
- speaker-diarization
- titanet
- NeMo
- pytorch
license: cc-by-4.0
widget:
- src: https://huggingface.co/nvidia/speakerverification_en_titanet_large/resolve/main/an255-fash-b.wav
example_title: Speech sample 1
- src: https://huggingface.co/nvidia/speakerverification_en_titanet_large/resolve/main/cen7-fash-b.wav
example_title: Speech sample 2
model-index:
- name: speakerverification_en_titanet_large
results:
- task:
name: Speaker Verification
type: speaker-verification
dataset:
name: voxceleb1
type: voxceleb1-O
config: clean
split: test
args:
language: en
metrics:
- name: Test EER
type: eer
value: 0.66
- task:
type: Speaker Diarization
name: speaker-diarization
dataset:
name: ami-mixheadset
type: ami_diarization
config: oracle-vad-known-number-of-speakers
split: test
args:
language: en
metrics:
- name: Test DER
type: der
value: 1.73
- task:
type: Speaker Diarization
name: speaker-diarization
dataset:
name: ami-lapel
type: ami_diarization
config: oracle-vad-known-number-of-speakers
split: test
args:
language: en
metrics:
- name: Test DER
type: der
value: 2.03
- task:
type: Speaker Diarization
name: speaker-diarization
dataset:
name: ch109
type: callhome_diarization
config: oracle-vad-known-number-of-speakers
split: test
args:
language: en
metrics:
- name: Test DER
type: der
value: 1.19
- task:
type: Speaker Diarization
name: speaker-diarization
dataset:
name: nist-sre-2000
type: nist-sre_diarization
config: oracle-vad-known-number-of-speakers
split: test
args:
language: en
metrics:
- name: Test DER
type: der
value: 6.73
---
# NVIDIA TitaNet-Large (en-US)
<style>
img {
display: inline;
}
</style>
| [](#model-architecture)
| [](#model-architecture)
| [](#datasets)
This model extracts speaker embeddings from input speech; these embeddings are the backbone of speaker verification and diarization tasks.
It is the "large" version of TitaNet, with around 23M parameters.
See the [model architecture](#model-architecture) section and the [NeMo documentation](https://docs.nvidia.com/deeplearning/nemo/user-guide/docs/en/stable/asr/speaker_recognition/models.html#titanet) for complete architecture details.
## NVIDIA NeMo: Training
To train, fine-tune, or play with the model, you will need to install [NVIDIA NeMo](https://github.com/NVIDIA/NeMo). We recommend installing it after you've installed the latest PyTorch version.
```
pip install nemo_toolkit['all']
```
## How to Use this Model
The model is available for use in the NeMo toolkit [2] and can be used as a pre-trained checkpoint for inference or for fine-tuning on another dataset.
### Automatically instantiate the model
```python
import nemo.collections.asr as nemo_asr
speaker_model = nemo_asr.models.EncDecSpeakerLabelModel.from_pretrained("nvidia/speakerverification_en_titanet_large")
```
### Embedding Extraction
To extract a speaker embedding from a single audio file:
```python
emb = speaker_model.get_embedding("an255-fash-b.wav")
```
### Verifying two utterances (Speaker Verification)
To check whether two audio files are from the same speaker, simply run:
```python
speaker_model.verify_speakers("an255-fash-b.wav","cen7-fash-b.wav")
```
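`verify_speakers` returns a boolean decision. Under the hood, verification systems typically compare the two embeddings by cosine similarity against a tuned threshold. Below is a minimal, pure-Python sketch of that decision rule; the toy embeddings and the 0.7 threshold are made up for illustration and are not NeMo's internals.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_speaker(emb_a, emb_b, threshold=0.7):
    # Illustrative decision rule: accept if similarity clears the threshold.
    return cosine_similarity(emb_a, emb_b) >= threshold

# Toy 4-dimensional "embeddings" (real TitaNet embeddings are much larger)
emb_a = [0.10, 0.30, -0.20, 0.50]
emb_b = [0.12, 0.28, -0.18, 0.52]
print(same_speaker(emb_a, emb_b))  # True
```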
### Extracting Embeddings for More Audio Files
To extract embeddings from a batch of audio files, write them to a `manifest.json` file, one line per file, in the following format:
```json
{"audio_filepath": "<absolute path to dataset>/audio_file.wav", "duration": "duration of file in sec", "label": "speaker_id"}
```
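The manifest can also be generated programmatically. Here is a minimal sketch using only the standard library; the file paths, durations, and speaker labels are hypothetical placeholders for your own data.

```python
import json

# Hypothetical (path, duration_in_sec, speaker_label) entries; replace with your data.
entries = [
    ("/data/audio/spk1_utt1.wav", 4.2, "speaker_1"),
    ("/data/audio/spk2_utt1.wav", 3.7, "speaker_2"),
]

# One JSON object per line, as expected by the extraction script below.
with open("manifest.json", "w") as f:
    for path, duration, label in entries:
        record = {"audio_filepath": path, "duration": duration, "label": label}
        f.write(json.dumps(record) + "\n")
```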
Then run the following script to extract embeddings and write them to the current working directory:
```shell
python <NeMo_root>/examples/speaker_tasks/recognition/extract_speaker_embeddings.py --manifest=manifest.json
```
### Input
This model accepts 16 kHz mono-channel audio (WAV files) as input.
### Output
This model provides speaker embeddings for an audio file.
## Model Architecture
TitaNet is a depth-wise separable 1D convolutional model [1] for speaker verification and diarization tasks. You can find more details on this model here: [TitaNet-Model](https://docs.nvidia.com/deeplearning/nemo/user-guide/docs/en/main/asr/speaker_recognition/models.html).
## Training
The NeMo toolkit [2] was used to train the models for several hundred epochs. These models are trained with this [example script](https://github.com/NVIDIA/NeMo/blob/main/examples/speaker_tasks/recognition/speaker_reco.py) and this [base config](https://github.com/NVIDIA/NeMo/blob/main/examples/speaker_tasks/recognition/conf/titanet-large.yaml).
### Datasets
All the models in this collection are trained on a composite dataset comprising several thousand hours of English speech:
- Voxceleb-1
- Voxceleb-2
- Fisher
- Switchboard
- Librispeech
- SRE (2004-2010)
## Performance
The performance of these models is reported in terms of Equal Error Rate (EER%) on speaker verification evaluation trial files and Diarization Error Rate (DER%) on diarization test sessions.
* Speaker Verification (EER%)
| Version | Model | Model Size | VoxCeleb1 (Cleaned trial file) |
|---------|--------------|-----|---------------|
| 1.10.0 | TitaNet-Large | 23M | 0.66 |
* Speaker Diarization (DER%)
| Version | Model | Model Size | Evaluation Condition | NIST SRE 2000 | AMI (Lapel) | AMI (MixHeadset) | CH109 |
|---------|--------------|-----|----------------------|---------------|-------------|------------------|-------|
| 1.10.0 | TitaNet-Large | 23M | Oracle VAD KNOWN # of Speakers | 6.73 | 2.03 | 1.73 | 1.19 |
| 1.10.0 | TitaNet-Large | 23M | Oracle VAD UNKNOWN # of Speakers | 5.38 | 2.03 | 1.89 | 1.63 |
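DER aggregates three error types (false alarm, missed detection, and speaker confusion) relative to the total reference speech time. A small arithmetic sketch of that definition, with made-up durations:

```python
def diarization_error_rate(false_alarm, missed, confusion, total_speech):
    # DER as a percentage of the total reference speech time (all in seconds).
    return 100.0 * (false_alarm + missed + confusion) / total_speech

# Hypothetical session with 600 s of reference speech
print(round(diarization_error_rate(4.0, 3.0, 3.38, 600.0), 2))  # 1.73
```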
## Limitations
This model is trained on both telephonic and non-telephonic speech from the VoxCeleb, Fisher, and Switchboard datasets. If your data domain differs from the training data, or the model does not perform well on it, consider fine-tuning it for that speech domain.
## NVIDIA Riva: Deployment
[NVIDIA Riva](https://developer.nvidia.com/riva) is an accelerated speech AI SDK deployable on-prem, in all clouds, multi-cloud, hybrid, at the edge, and embedded.
Additionally, Riva provides:
* World-class out-of-the-box accuracy for the most common languages with model checkpoints trained on proprietary data with hundreds of thousands of GPU-compute hours
* Best in class accuracy with run-time word boosting (e.g., brand and product names) and customization of acoustic model, language model, and inverse text normalization
* Streaming speech recognition, Kubernetes compatible scaling, and enterprise-grade support
Although this model isn't supported yet by Riva, the [list of supported models is here](https://huggingface.co/models?other=Riva).
Check out the [Riva live demo](https://developer.nvidia.com/riva#demos).
## References
[1] [TitaNet: Neural Model for Speaker Representation with 1D Depth-wise Separable convolutions and global context](https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9746806)
[2] [NVIDIA NeMo Toolkit](https://github.com/NVIDIA/NeMo)
## License
The license to use this model is covered by [CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/). By downloading the public release version of the model, you accept the terms and conditions of the [CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/) license. | 8,124 | [
[
-0.0433349609375,
-0.06610107421875,
0.005069732666015625,
-0.004642486572265625,
-0.01093292236328125,
-0.01270294189453125,
-0.01837158203125,
-0.0293121337890625,
0.016693115234375,
0.0200347900390625,
-0.03466796875,
-0.035247802734375,
-0.03753662109375,
-0.0050506591796875,
-0.0213623046875,
0.041595458984375,
0.04010009765625,
0.007175445556640625,
-0.015838623046875,
-0.009796142578125,
-0.05999755859375,
-0.0091705322265625,
-0.062042236328125,
-0.04656982421875,
0.0058746337890625,
0.03314208984375,
0.0367431640625,
0.036346435546875,
0.01204681396484375,
0.026397705078125,
-0.021484375,
0.0236968994140625,
-0.00830841064453125,
0.0038242340087890625,
-0.00261688232421875,
-0.0092010498046875,
-0.04156494140625,
-0.003459930419921875,
0.05902099609375,
0.0148162841796875,
-0.0173492431640625,
0.00978851318359375,
0.0002460479736328125,
0.0034618377685546875,
-0.01983642578125,
0.0093536376953125,
-0.027008056640625,
-0.013092041015625,
-0.004398345947265625,
-0.021087646484375,
-0.039825439453125,
-0.0197906494140625,
0.01375579833984375,
-0.047088623046875,
-0.0020236968994140625,
0.0013513565063476562,
0.0877685546875,
0.0217742919921875,
-0.018463134765625,
-0.009002685546875,
-0.052703857421875,
0.06414794921875,
-0.08349609375,
0.045135498046875,
0.0305938720703125,
0.0203399658203125,
-0.0113372802734375,
-0.044342041015625,
-0.05377197265625,
-0.029998779296875,
0.00423431396484375,
0.003143310546875,
-0.037139892578125,
0.0034008026123046875,
0.0213165283203125,
0.0318603515625,
-0.05242919921875,
0.0283050537109375,
-0.0274658203125,
-0.0262298583984375,
0.0443115234375,
-0.007022857666015625,
0.044525146484375,
-0.035858154296875,
-0.022064208984375,
-0.046356201171875,
-0.04046630859375,
-0.006847381591796875,
0.0246124267578125,
0.0340576171875,
-0.0408935546875,
0.020172119140625,
-0.01029205322265625,
0.053497314453125,
0.0022449493408203125,
-0.0211334228515625,
0.054595947265625,
-0.0189056396484375,
-0.023468017578125,
0.00936126708984375,
0.083984375,
0.00946044921875,
-0.0029811859130859375,
0.0086517333984375,
-0.01168060302734375,
-0.006557464599609375,
-0.01078033447265625,
-0.052276611328125,
-0.00258636474609375,
0.03076171875,
-0.01428985595703125,
-0.0110321044921875,
-0.023223876953125,
-0.056640625,
-0.017578125,
-0.0240631103515625,
0.063720703125,
-0.05682373046875,
-0.037811279296875,
0.01531982421875,
-0.0139007568359375,
0.0258331298828125,
0.01189422607421875,
-0.074462890625,
0.0298309326171875,
0.043609619140625,
0.06597900390625,
0.0162506103515625,
-0.01202392578125,
-0.047088623046875,
-0.0162811279296875,
-0.006847381591796875,
0.0487060546875,
-0.031585693359375,
-0.02789306640625,
-0.01280975341796875,
0.002025604248046875,
-0.01403045654296875,
-0.0303955078125,
0.042022705078125,
-0.00612640380859375,
0.0372314453125,
-0.001739501953125,
-0.0499267578125,
-0.0270233154296875,
0.0145416259765625,
-0.0335693359375,
0.0767822265625,
-0.01485443115234375,
-0.048065185546875,
0.0146026611328125,
-0.0506591796875,
-0.009063720703125,
0.01107025146484375,
-0.02685546875,
-0.032501220703125,
-0.003711700439453125,
0.0321044921875,
0.04156494140625,
-0.0404052734375,
0.013916015625,
-0.00809478759765625,
-0.0311431884765625,
0.003047943115234375,
-0.021820068359375,
0.06787109375,
0.0225677490234375,
-0.0386962890625,
0.0024280548095703125,
-0.05804443359375,
-0.00658416748046875,
0.00986480712890625,
-0.0146026611328125,
-0.00505828857421875,
0.00286865234375,
0.022674560546875,
0.003917694091796875,
0.012176513671875,
-0.06463623046875,
-0.0084075927734375,
-0.0484619140625,
0.06036376953125,
0.05438232421875,
-0.0034656524658203125,
0.0166473388671875,
-0.020965576171875,
0.0404052734375,
-0.02081298828125,
0.015045166015625,
-0.0148468017578125,
-0.048675537109375,
-0.047454833984375,
-0.026519775390625,
0.037139892578125,
0.044342041015625,
-0.05072021484375,
0.0267486572265625,
-0.03790283203125,
-0.056549072265625,
-0.04156494140625,
-0.0109710693359375,
0.049285888671875,
0.026336669921875,
0.0163116455078125,
-0.0161895751953125,
-0.04443359375,
-0.0792236328125,
0.0032138824462890625,
-0.01029205322265625,
0.004467010498046875,
0.053070068359375,
0.044189453125,
-0.01094818115234375,
0.060577392578125,
-0.0223846435546875,
-0.02392578125,
-0.0121917724609375,
0.008758544921875,
0.038055419921875,
0.052764892578125,
0.038482666015625,
-0.056060791015625,
-0.0301513671875,
0.035186767578125,
-0.050140380859375,
0.00940704345703125,
0.00878143310546875,
0.032684326171875,
0.0014057159423828125,
0.035797119140625,
-0.040679931640625,
0.0140838623046875,
0.046295166015625,
-0.0203857421875,
0.0237274169921875,
-0.0184478759765625,
0.00234222412109375,
-0.09649658203125,
0.011383056640625,
0.0026569366455078125,
-0.0141448974609375,
-0.04412841796875,
-0.01178741455078125,
-0.01041412353515625,
-0.01419830322265625,
-0.03900146484375,
0.058013916015625,
-0.0191650390625,
-0.01019287109375,
-0.017791748046875,
0.0213775634765625,
0.00023508071899414062,
0.036102294921875,
-0.004077911376953125,
0.07073974609375,
0.055267333984375,
-0.047119140625,
0.03985595703125,
0.033782958984375,
-0.04254150390625,
0.0303955078125,
-0.08203125,
0.0307464599609375,
-0.004302978515625,
0.00833892822265625,
-0.07977294921875,
-0.015869140625,
0.0186920166015625,
-0.06414794921875,
0.02532958984375,
-0.0242767333984375,
-0.045989990234375,
-0.0126190185546875,
0.01190948486328125,
0.0284271240234375,
0.041229248046875,
-0.038055419921875,
0.054290771484375,
0.0418701171875,
-0.02288818359375,
-0.05322265625,
-0.04864501953125,
-0.00726318359375,
-0.017852783203125,
-0.052276611328125,
0.0364990234375,
0.002193450927734375,
-0.003387451171875,
-0.00640106201171875,
0.00594329833984375,
0.0005745887756347656,
-0.01403045654296875,
0.041778564453125,
0.0171051025390625,
-0.022369384765625,
0.0277099609375,
-0.0158538818359375,
-0.0195159912109375,
-0.02398681640625,
-0.015228271484375,
0.062744140625,
-0.0244293212890625,
-0.01329803466796875,
-0.072998046875,
0.007663726806640625,
0.052276611328125,
-0.0300750732421875,
0.029449462890625,
0.059356689453125,
-0.029449462890625,
-0.0029296875,
-0.06256103515625,
-0.022064208984375,
-0.03802490234375,
0.043975830078125,
-0.01424407958984375,
-0.0518798828125,
0.03472900390625,
0.023040771484375,
0.00730133056640625,
0.038238525390625,
0.02880859375,
0.012664794921875,
0.059539794921875,
0.035430908203125,
-0.013214111328125,
0.057708740234375,
-0.0251617431640625,
0.0105133056640625,
-0.059661865234375,
-0.038330078125,
-0.048858642578125,
-0.016632080078125,
-0.04986572265625,
-0.032318115234375,
0.01477813720703125,
-0.0350341796875,
-0.006916046142578125,
0.051971435546875,
-0.05615234375,
0.0298309326171875,
0.04754638671875,
0.0002493858337402344,
-0.00127410888671875,
0.0173187255859375,
-0.01270294189453125,
-0.004070281982421875,
-0.043975830078125,
-0.035308837890625,
0.0789794921875,
0.03936767578125,
0.03985595703125,
-0.0007729530334472656,
0.042633056640625,
0.01702880859375,
-0.018524169921875,
-0.0439453125,
0.037384033203125,
-0.0176544189453125,
-0.06842041015625,
-0.032257080078125,
-0.033966064453125,
-0.060699462890625,
0.024322509765625,
-0.0153656005859375,
-0.060516357421875,
0.050872802734375,
-0.0010776519775390625,
-0.05029296875,
0.0259857177734375,
-0.060577392578125,
0.06536865234375,
0.0020294189453125,
-0.019256591796875,
-0.02880859375,
-0.017242431640625,
-0.006511688232421875,
0.025177001953125,
0.019134521484375,
-0.018951416015625,
0.0269927978515625,
0.0755615234375,
-0.004352569580078125,
0.0513916015625,
-0.0246429443359375,
0.0208282470703125,
0.0219879150390625,
-0.012603759765625,
0.031341552734375,
0.0086517333984375,
-0.01049041748046875,
0.00960540771484375,
0.00601959228515625,
-0.017181396484375,
-0.0255584716796875,
0.066650390625,
-0.0855712890625,
-0.02093505859375,
-0.0235443115234375,
-0.044097900390625,
-0.0089874267578125,
0.01293182373046875,
0.033843994140625,
0.0677490234375,
-0.0186920166015625,
0.04345703125,
0.064208984375,
-0.041961669921875,
0.03857421875,
0.0289764404296875,
-0.011322021484375,
-0.0531005859375,
0.07720947265625,
0.01904296875,
0.01371002197265625,
0.039581298828125,
0.0077667236328125,
-0.02490234375,
-0.05181884765625,
-0.0128021240234375,
0.0189666748046875,
-0.039093017578125,
0.0014190673828125,
-0.04443359375,
-0.0131683349609375,
-0.045623779296875,
0.030059814453125,
-0.054473876953125,
-0.02655029296875,
-0.021514892578125,
-0.0172271728515625,
0.03466796875,
0.0518798828125,
-0.00432586669921875,
0.038604736328125,
-0.023101806640625,
0.0162811279296875,
0.037506103515625,
0.00904083251953125,
-0.0143585205078125,
-0.07574462890625,
-0.005817413330078125,
0.01531219482421875,
-0.03192138671875,
-0.048187255859375,
0.053466796875,
0.0216827392578125,
0.052764892578125,
0.0285491943359375,
-0.023956298828125,
0.052093505859375,
-0.0130462646484375,
0.055450439453125,
0.013427734375,
-0.0699462890625,
0.050018310546875,
-0.034576416015625,
0.02166748046875,
0.0277252197265625,
0.013275146484375,
-0.047271728515625,
0.00788116455078125,
-0.05926513671875,
-0.054840087890625,
0.07086181640625,
0.02349853515625,
0.019775390625,
0.0022602081298828125,
0.005641937255859375,
0.00286102294921875,
-0.0013170242309570312,
-0.055938720703125,
-0.0255126953125,
-0.021942138671875,
0.01192474365234375,
-0.035247802734375,
-0.018157958984375,
0.0030059814453125,
-0.03619384765625,
0.069091796875,
0.0138397216796875,
0.0309600830078125,
0.0205078125,
-0.004077911376953125,
0.0223541259765625,
0.026885986328125,
0.0577392578125,
0.0267486572265625,
-0.04205322265625,
-0.00879669189453125,
0.0231475830078125,
-0.02484130859375,
-0.007396697998046875,
0.003993988037109375,
0.0014352798461914062,
0.02728271484375,
0.01206207275390625,
0.08868408203125,
0.02288818359375,
-0.04241943359375,
0.042327880859375,
-0.0107574462890625,
-0.0283050537109375,
-0.041656494140625,
-0.006195068359375,
0.0175628662109375,
0.0166015625,
0.01605224609375,
0.0037860870361328125,
-0.00579071044921875,
-0.037872314453125,
0.00839996337890625,
0.024810791015625,
-0.0343017578125,
-0.0242462158203125,
0.046875,
0.0137481689453125,
-0.038848876953125,
0.07098388671875,
-0.00818634033203125,
0.0020961761474609375,
0.052764892578125,
0.035797119140625,
0.06756591796875,
-0.039031982421875,
0.0176849365234375,
0.053497314453125,
0.0238494873046875,
0.00457763671875,
0.023651123046875,
-0.0010023117065429688,
-0.057342529296875,
-0.03369140625,
-0.0467529296875,
-0.0255126953125,
0.0247802734375,
-0.04620361328125,
0.02777099609375,
-0.05133056640625,
-0.04180908203125,
0.0215911865234375,
0.006595611572265625,
-0.04730224609375,
0.013153076171875,
0.036712646484375,
0.054779052734375,
-0.08001708984375,
0.0714111328125,
0.04315185546875,
-0.03668212890625,
-0.0677490234375,
-0.044586181640625,
-0.006061553955078125,
-0.054229736328125,
0.034759521484375,
0.0106658935546875,
-0.0010614395141601562,
-0.006145477294921875,
-0.032806396484375,
-0.0718994140625,
0.09381103515625,
0.04315185546875,
-0.044097900390625,
0.00615692138671875,
-0.00118255615234375,
0.02349853515625,
-0.0355224609375,
0.04473876953125,
0.0197906494140625,
0.035003662109375,
0.004947662353515625,
-0.09710693359375,
-0.00537872314453125,
-0.044342041015625,
-0.0150299072265625,
-0.00554656982421875,
-0.0452880859375,
0.0855712890625,
0.00873565673828125,
-0.00926971435546875,
-0.0085906982421875,
0.036224365234375,
0.018951416015625,
0.017364501953125,
0.03948974609375,
0.048065185546875,
0.055450439453125,
-0.01464080810546875,
0.07171630859375,
-0.01055908203125,
0.0240631103515625,
0.08636474609375,
0.0162506103515625,
0.07830810546875,
0.0305938720703125,
-0.010284423828125,
0.044830322265625,
0.037506103515625,
-0.017181396484375,
0.033355712890625,
-0.0214996337890625,
-0.002105712890625,
-0.0382080078125,
-0.0175933837890625,
-0.057220458984375,
0.052886962890625,
0.028411865234375,
-0.0225982666015625,
0.004650115966796875,
0.01194000244140625,
-0.0093536376953125,
-0.007137298583984375,
-0.0035190582275390625,
0.03521728515625,
0.0177764892578125,
-0.0295867919921875,
0.06390380859375,
-0.005970001220703125,
0.053070068359375,
-0.037384033203125,
0.007762908935546875,
-0.006092071533203125,
0.0174560546875,
-0.02294921875,
-0.0214996337890625,
0.0030460357666015625,
-0.0192108154296875,
-0.01554107666015625,
-0.015045166015625,
0.0238494873046875,
-0.0017185211181640625,
-0.0153656005859375,
0.024932861328125,
0.01039886474609375,
0.0265960693359375,
-0.001979827880859375,
-0.047454833984375,
0.0191497802734375,
0.00887298583984375,
-0.007293701171875,
0.01337432861328125,
-0.0007672309875488281,
0.0192413330078125,
0.06573486328125,
0.052764892578125,
0.0012178421020507812,
0.01824951171875,
0.004421234130859375,
0.041748046875,
-0.040069580078125,
-0.042755126953125,
-0.06011962890625,
0.0287933349609375,
-0.0164337158203125,
-0.046173095703125,
0.06817626953125,
0.04901123046875,
0.054962158203125,
0.0005512237548828125,
0.040252685546875,
-0.01030731201171875,
0.03533935546875,
-0.0184326171875,
0.048309326171875,
-0.052032470703125,
0.023345947265625,
-0.0244140625,
-0.057098388671875,
-0.00860595703125,
0.04901123046875,
-0.044189453125,
-0.0008106231689453125,
0.034637451171875,
0.0848388671875,
-0.01221466064453125,
0.0173187255859375,
0.03277587890625,
0.03662109375,
0.0259246826171875,
0.0433349609375,
0.056396484375,
-0.0635986328125,
0.040069580078125,
-0.018829345703125,
-0.004947662353515625,
-0.0208587646484375,
-0.042755126953125,
-0.04986572265625,
-0.047271728515625,
-0.046539306640625,
-0.017364501953125,
0.0109100341796875,
0.072265625,
0.0802001953125,
-0.06036376953125,
-0.040008544921875,
0.001491546630859375,
0.007038116455078125,
-0.0202789306640625,
-0.01230621337890625,
0.029632568359375,
0.01015472412109375,
-0.0750732421875,
0.05694580078125,
0.00673675537109375,
-0.00438690185546875,
0.00173187255859375,
-0.02197265625,
-0.0234375,
0.002918243408203125,
0.01291656494140625,
0.04473876953125,
-0.06170654296875,
-0.015838623046875,
-0.019622802734375,
0.005504608154296875,
0.0156402587890625,
0.00083160400390625,
-0.056610107421875,
0.051025390625,
0.05938720703125,
0.0172882080078125,
0.05389404296875,
-0.03045654296875,
0.036895751953125,
-0.039398193359375,
0.0286712646484375,
0.0192108154296875,
0.031280517578125,
0.031524658203125,
0.006458282470703125,
0.0137176513671875,
0.006931304931640625,
-0.050811767578125,
-0.08935546875,
-0.007686614990234375,
-0.08819580078125,
-0.00731658935546875,
0.087158203125,
0.0019350051879882812,
-0.0196075439453125,
0.007305145263671875,
-0.00838470458984375,
0.0160369873046875,
-0.038848876953125,
0.0255126953125,
0.03485107421875,
0.017181396484375,
-0.0120391845703125,
-0.07244873046875,
0.020721435546875,
0.0435791015625,
-0.0255126953125,
-0.0173797607421875,
0.01070404052734375,
0.05291748046875,
0.02960205078125,
0.037078857421875,
-0.020233154296875,
0.0287017822265625,
0.01849365234375,
0.034820556640625,
-0.028594970703125,
-0.0086822509765625,
-0.031463623046875,
0.0167388916015625,
-0.005084991455078125,
-0.049713134765625
]
] |
pysentimiento/robertuito-sentiment-analysis | 2023-02-25T14:25:07.000Z | [
"pysentimiento",
"pytorch",
"tf",
"roberta",
"twitter",
"sentiment-analysis",
"es",
"arxiv:2106.09462",
"has_space",
"region:us"
] | null | pysentimiento | null | null | pysentimiento/robertuito-sentiment-analysis | 30 | 1,008,941 | pysentimiento | 2022-03-02T23:29:05 | ---
language:
- es
library_name: pysentimiento
tags:
- twitter
- sentiment-analysis
---
# Sentiment Analysis in Spanish
## robertuito-sentiment-analysis
Repository: [https://github.com/pysentimiento/pysentimiento/](https://github.com/pysentimiento/pysentimiento/)
Model trained on the TASS 2020 corpus (around 5k tweets) covering several dialects of Spanish. The base model is [RoBERTuito](https://github.com/pysentimiento/robertuito), a RoBERTa model trained on Spanish tweets.
It uses the `POS`, `NEG`, and `NEU` labels.
## Usage
Use it directly with [pysentimiento](https://github.com/pysentimiento/pysentimiento)
```python
from pysentimiento import create_analyzer
analyzer = create_analyzer(task="sentiment", lang="es")
analyzer.predict("Qué gran jugador es Messi")
# returns AnalyzerOutput(output=POS, probas={POS: 0.998, NEG: 0.002, NEU: 0.000})
```
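The returned `AnalyzerOutput` exposes a probability for each label, and the predicted label is the argmax over them. A minimal sketch of that mapping on a hypothetical probability dict (pysentimiento itself is not required here):

```python
# Probabilities as they might appear in AnalyzerOutput.probas (values are illustrative)
probas = {"POS": 0.998, "NEG": 0.002, "NEU": 0.000}

predicted = max(probas, key=probas.get)  # label with the highest probability
print(predicted)  # POS
```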
## Results
Results for the four tasks evaluated in `pysentimiento`, expressed as macro F1 scores:
| model | emotion | hate_speech | irony | sentiment |
|:--------------|:--------------|:--------------|:--------------|:--------------|
| robertuito | 0.560 ± 0.010 | 0.759 ± 0.007 | 0.739 ± 0.005 | 0.705 ± 0.003 |
| roberta | 0.527 ± 0.015 | 0.741 ± 0.012 | 0.721 ± 0.008 | 0.670 ± 0.006 |
| bertin | 0.524 ± 0.007 | 0.738 ± 0.007 | 0.713 ± 0.012 | 0.666 ± 0.005 |
| beto_uncased | 0.532 ± 0.012 | 0.727 ± 0.016 | 0.701 ± 0.007 | 0.651 ± 0.006 |
| beto_cased | 0.516 ± 0.012 | 0.724 ± 0.012 | 0.705 ± 0.009 | 0.662 ± 0.005 |
| mbert_uncased | 0.493 ± 0.010 | 0.718 ± 0.011 | 0.681 ± 0.010 | 0.617 ± 0.003 |
| biGRU | 0.264 ± 0.007 | 0.592 ± 0.018 | 0.631 ± 0.011 | 0.585 ± 0.011 |
Note that for hate speech, these are the results for SemEval 2019 Task 5, Subtask B.
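Macro F1 is the unweighted mean of the per-class F1 scores, so each of the three labels counts equally regardless of class frequency. A minimal sketch of the computation (the per-class precision/recall values are made up):

```python
def f1(precision, recall):
    # Harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)

def macro_f1(per_class_pr):
    # Unweighted mean of per-class F1 scores.
    scores = [f1(p, r) for p, r in per_class_pr]
    return sum(scores) / len(scores)

# Hypothetical (precision, recall) per class for POS, NEG, NEU
print(round(macro_f1([(0.80, 0.70), (0.60, 0.65), (0.75, 0.70)]), 3))  # 0.698
```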
## Citation
If you use this model in your research, please cite pysentimiento and RoBERTuito papers:
```
@misc{perez2021pysentimiento,
title={pysentimiento: A Python Toolkit for Sentiment Analysis and SocialNLP tasks},
author={Juan Manuel Pérez and Juan Carlos Giudici and Franco Luque},
year={2021},
eprint={2106.09462},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@inproceedings{perez-etal-2022-robertuito,
title = "{R}o{BERT}uito: a pre-trained language model for social media text in {S}panish",
author = "P{\'e}rez, Juan Manuel and
Furman, Dami{\'a}n Ariel and
Alonso Alemany, Laura and
Luque, Franco M.",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.785",
pages = "7235--7243",
abstract = "Since BERT appeared, Transformer language models and transfer learning have become state-of-the-art for natural language processing tasks. Recently, some works geared towards pre-training specially-crafted models for particular domains, such as scientific papers, medical documents, user-generated texts, among others. These domain-specific models have been shown to improve performance significantly in most tasks; however, for languages other than English, such models are not widely available. In this work, we present RoBERTuito, a pre-trained language model for user-generated text in Spanish, trained on over 500 million tweets. Experiments on a benchmark of tasks involving user-generated text showed that RoBERTuito outperformed other pre-trained language models in Spanish. In addition to this, our model has some cross-lingual abilities, achieving top results for English-Spanish tasks of the Linguistic Code-Switching Evaluation benchmark (LinCE) and also competitive performance against monolingual models in English Twitter tasks. To facilitate further research, we make RoBERTuito publicly available at the HuggingFace model hub together with the dataset used to pre-train it.",
}
@inproceedings{garcia2020overview,
title={Overview of TASS 2020: Introducing emotion detection},
author={Garc{\'\i}a-Vega, Manuel and D{\'\i}az-Galiano, MC and Garc{\'\i}a-Cumbreras, MA and Del Arco, FMP and Montejo-R{\'a}ez, A and Jim{\'e}nez-Zafra, SM and Mart{\'\i}nez C{\'a}mara, E and Aguilar, CA and Cabezudo, MAS and Chiruzzo, L and others},
booktitle={Proceedings of the Iberian Languages Evaluation Forum (IberLEF 2020) Co-Located with 36th Conference of the Spanish Society for Natural Language Processing (SEPLN 2020), M{\'a}laga, Spain},
pages={163--170},
year={2020}
}
``` | 4,583 | [
[
-0.0269317626953125,
-0.04925537109375,
0.0157928466796875,
0.0472412109375,
-0.019744873046875,
0.01517486572265625,
-0.037994384765625,
-0.0386962890625,
0.042327880859375,
0.01806640625,
-0.041107177734375,
-0.0675048828125,
-0.07086181640625,
0.0156707763671875,
-0.0260467529296875,
0.0880126953125,
0.012176513671875,
0.004756927490234375,
0.0194854736328125,
-0.00804901123046875,
0.021240234375,
-0.033050537109375,
-0.05322265625,
-0.010650634765625,
0.048553466796875,
0.012725830078125,
0.036376953125,
0.005626678466796875,
0.0318603515625,
0.026885986328125,
-0.01116180419921875,
-0.00522613525390625,
-0.024749755859375,
0.0003299713134765625,
-0.004611968994140625,
-0.0211029052734375,
-0.03369140625,
0.01016998291015625,
0.0303497314453125,
0.032989501953125,
0.006195068359375,
0.0128631591796875,
0.0079345703125,
0.042205810546875,
-0.0269775390625,
0.0096435546875,
-0.036590576171875,
-0.005550384521484375,
-0.0249786376953125,
-0.0012865066528320312,
-0.0211181640625,
-0.0484619140625,
0.0179595947265625,
-0.01617431640625,
0.0098724365234375,
-0.00490570068359375,
0.09100341796875,
0.016082763671875,
-0.007137298583984375,
-0.0273895263671875,
-0.028900146484375,
0.07769775390625,
-0.0665283203125,
0.0224609375,
0.01204681396484375,
-0.005031585693359375,
0.00021076202392578125,
-0.02935791015625,
-0.05035400390625,
-0.0171356201171875,
0.005008697509765625,
0.0222625732421875,
-0.04205322265625,
-0.003360748291015625,
0.00021791458129882812,
0.0175018310546875,
-0.0308380126953125,
0.00957489013671875,
-0.025177001953125,
-0.01155853271484375,
0.0428466796875,
-0.015380859375,
0.02117919921875,
-0.033935546875,
-0.01343536376953125,
-0.0234832763671875,
-0.0228729248046875,
0.0009670257568359375,
0.034393310546875,
0.0296630859375,
-0.0224761962890625,
0.0225372314453125,
-0.0034618377685546875,
0.032012939453125,
-0.008453369140625,
-0.0032863616943359375,
0.059722900390625,
-0.005878448486328125,
-0.0226898193359375,
-0.03607177734375,
0.10858154296875,
0.0212554931640625,
0.047698974609375,
-0.0035228729248046875,
-0.0081634521484375,
0.014678955078125,
0.004364013671875,
-0.05657958984375,
-0.024749755859375,
0.0200653076171875,
-0.0251312255859375,
-0.033111572265625,
0.0034542083740234375,
-0.07098388671875,
-0.0156402587890625,
-0.00592041015625,
0.0223541259765625,
-0.0275115966796875,
-0.041748046875,
0.0079803466796875,
-0.00307464599609375,
0.004489898681640625,
0.0167999267578125,
-0.03985595703125,
0.0196990966796875,
0.0233917236328125,
0.06243896484375,
-0.021331787109375,
-0.02490234375,
-0.01435089111328125,
-0.03668212890625,
-0.01064300537109375,
0.0665283203125,
-0.023223876953125,
-0.01323699951171875,
0.00405120849609375,
0.0096893310546875,
-0.014495849609375,
-0.033721923828125,
0.0511474609375,
-0.0214385986328125,
0.040557861328125,
-0.014739990234375,
-0.01369476318359375,
-0.0194244384765625,
0.013397216796875,
-0.041107177734375,
0.103759765625,
0.01155853271484375,
-0.06732177734375,
0.006107330322265625,
-0.060333251953125,
-0.044281005859375,
-0.0224151611328125,
0.0012836456298828125,
-0.040069580078125,
-0.007808685302734375,
0.0174560546875,
0.05230712890625,
-0.032196044921875,
0.02325439453125,
-0.0260162353515625,
0.0074462890625,
0.01910400390625,
-0.0020236968994140625,
0.0849609375,
0.0237579345703125,
-0.04608154296875,
0.0157470703125,
-0.048309326171875,
-0.0079498291015625,
0.017669677734375,
-0.00685882568359375,
-0.022796630859375,
-0.00786590576171875,
0.0148468017578125,
0.035400390625,
0.0287628173828125,
-0.06573486328125,
-0.0256805419921875,
-0.042877197265625,
0.0255584716796875,
0.053466796875,
-0.01343536376953125,
0.023162841796875,
-0.00856781005859375,
0.058929443359375,
-0.0081024169921875,
0.013092041015625,
0.0137939453125,
-0.03887939453125,
-0.056976318359375,
-0.0247344970703125,
0.0081024169921875,
0.052093505859375,
-0.0435791015625,
0.041046142578125,
-0.01157379150390625,
-0.052825927734375,
-0.0308685302734375,
-0.0034084320068359375,
0.03515625,
0.0438232421875,
0.03472900390625,
0.0015239715576171875,
-0.07476806640625,
-0.059326171875,
-0.03118896484375,
-0.016357421875,
0.0013446807861328125,
0.01727294921875,
0.050933837890625,
-0.01430511474609375,
0.061492919921875,
-0.0350341796875,
-0.0218353271484375,
-0.0361328125,
0.0232391357421875,
0.0308837890625,
0.0193023681640625,
0.054046630859375,
-0.049041748046875,
-0.0655517578125,
0.007457733154296875,
-0.05279541015625,
-0.031280517578125,
0.0225830078125,
-0.005168914794921875,
0.03253173828125,
0.02960205078125,
-0.01226043701171875,
0.016082763671875,
0.0655517578125,
-0.0206756591796875,
0.0360107421875,
0.0089263916015625,
0.024261474609375,
-0.10003662109375,
0.0010747909545898438,
0.039581298828125,
-0.01369476318359375,
-0.04876708984375,
-0.0281829833984375,
-0.0128631591796875,
0.011505126953125,
-0.04583740234375,
0.05206298828125,
-0.02642822265625,
0.01422119140625,
-0.00490570068359375,
0.0155181884765625,
-0.0150146484375,
0.04815673828125,
0.0146026611328125,
0.050537109375,
0.0426025390625,
-0.03204345703125,
0.0090179443359375,
0.0108184814453125,
-0.0297698974609375,
0.033355712890625,
-0.069091796875,
-0.0017328262329101562,
-0.007282257080078125,
-0.00865936279296875,
-0.07867431640625,
-0.0024623870849609375,
0.01983642578125,
-0.062164306640625,
0.00293731689453125,
-0.0106201171875,
-0.033599853515625,
-0.042572021484375,
-0.03826904296875,
-0.0035228729248046875,
0.040130615234375,
-0.03094482421875,
0.0516357421875,
0.04876708984375,
-0.01922607421875,
-0.05010986328125,
-0.059844970703125,
0.00019466876983642578,
-0.03173828125,
-0.058624267578125,
0.0110626220703125,
-0.0001436471939086914,
-0.0215301513671875,
-0.006908416748046875,
0.01099395751953125,
-0.0029506683349609375,
0.00676727294921875,
0.020263671875,
0.02471923828125,
-0.005405426025390625,
-0.0016422271728515625,
0.00020420551300048828,
0.006130218505859375,
0.0118560791015625,
-0.0034084320068359375,
0.059173583984375,
-0.0282440185546875,
0.0130767822265625,
-0.036376953125,
0.01012420654296875,
0.0374755859375,
-0.021575927734375,
0.06463623046875,
0.044036865234375,
-0.0227508544921875,
-0.009765625,
-0.042236328125,
0.006465911865234375,
-0.0338134765625,
0.03924560546875,
-0.017425537109375,
-0.07489013671875,
0.060577392578125,
0.01351165771484375,
-0.0068817138671875,
0.05126953125,
0.054046630859375,
-0.023712158203125,
0.06231689453125,
0.0467529296875,
-0.0157318115234375,
0.06915283203125,
-0.02874755859375,
0.0259246826171875,
-0.048095703125,
-0.021484375,
-0.070068359375,
-0.0102081298828125,
-0.049835205078125,
-0.0255889892578125,
0.018218994140625,
-0.0183868408203125,
-0.0185089111328125,
0.0479736328125,
-0.03179931640625,
0.0345458984375,
0.0297393798828125,
0.0010738372802734375,
0.00003808736801147461,
0.004520416259765625,
0.00920867919921875,
-0.027557373046875,
-0.04296875,
-0.0390625,
0.07867431640625,
0.02972412109375,
0.0465087890625,
0.0026111602783203125,
0.06591796875,
0.02581787109375,
0.037445068359375,
-0.057708740234375,
0.04302978515625,
-0.037811279296875,
-0.0308380126953125,
-0.021209716796875,
-0.0390625,
-0.0721435546875,
0.025054931640625,
-0.00927734375,
-0.06878662109375,
0.01934814453125,
0.0014715194702148438,
-0.01557159423828125,
0.0133209228515625,
-0.05596923828125,
0.07659912109375,
-0.014678955078125,
-0.01024627685546875,
-0.006000518798828125,
-0.038970947265625,
0.01032257080078125,
0.01285552978515625,
0.0330810546875,
-0.019561767578125,
0.004032135009765625,
0.093017578125,
-0.0177001953125,
0.06524658203125,
-0.01270294189453125,
-0.0135650634765625,
0.0190582275390625,
0.0006241798400878906,
0.0299224853515625,
-0.0183563232421875,
-0.0189056396484375,
0.01751708984375,
-0.00799560546875,
-0.0157470703125,
-0.0242156982421875,
0.05218505859375,
-0.06097412109375,
-0.016998291015625,
-0.040313720703125,
-0.0301666259765625,
-0.007099151611328125,
0.016571044921875,
0.031402587890625,
0.015350341796875,
-0.0246124267578125,
0.01517486572265625,
0.048004150390625,
-0.0280914306640625,
0.0350341796875,
0.0390625,
-0.00466156005859375,
-0.040252685546875,
0.060333251953125,
0.00669097900390625,
0.014556884765625,
0.02423095703125,
0.0279083251953125,
-0.0272216796875,
-0.0291595458984375,
-0.001201629638671875,
0.04595947265625,
-0.0360107421875,
-0.0146331787109375,
-0.078369140625,
0.007160186767578125,
-0.045135498046875,
-0.00997161865234375,
-0.040557861328125,
-0.025482177734375,
-0.029510498046875,
-0.01160430908203125,
0.031585693359375,
0.0430908203125,
-0.01129913330078125,
0.030609130859375,
-0.03515625,
0.0213623046875,
-0.01340484619140625,
0.0091705322265625,
-0.004283905029296875,
-0.061309814453125,
-0.019622802734375,
-0.0005402565002441406,
-0.00839996337890625,
-0.077392578125,
0.06683349609375,
0.00983428955078125,
0.028350830078125,
0.0197906494140625,
0.003078460693359375,
0.03680419921875,
-0.0170440673828125,
0.049957275390625,
0.0206756591796875,
-0.076171875,
0.06298828125,
-0.0347900390625,
0.0029964447021484375,
0.048004150390625,
0.057525634765625,
-0.047149658203125,
-0.061370849609375,
-0.066162109375,
-0.0672607421875,
0.0665283203125,
0.01320648193359375,
0.0114288330078125,
-0.0247344970703125,
-0.0128631591796875,
-0.0113677978515625,
0.0193328857421875,
-0.08746337890625,
-0.0264739990234375,
-0.014373779296875,
-0.0350341796875,
-0.0015916824340820312,
-0.01557159423828125,
0.0031261444091796875,
-0.0307769775390625,
0.06805419921875,
0.00792694091796875,
0.02032470703125,
0.0089874267578125,
-0.0219573974609375,
0.0081634521484375,
0.0182037353515625,
0.03662109375,
0.029937744140625,
-0.02764892578125,
0.0001881122589111328,
0.0007052421569824219,
-0.0340576171875,
-0.01412200927734375,
0.010833740234375,
-0.00411224365234375,
0.00951385498046875,
0.0172271728515625,
0.05780029296875,
0.0061187744140625,
-0.0447998046875,
0.051910400390625,
-0.00156402587890625,
-0.0255889892578125,
-0.0288543701171875,
-0.0135345458984375,
-0.00662994384765625,
0.0206298828125,
0.028289794921875,
0.006622314453125,
-0.0012350082397460938,
-0.04071044921875,
0.00347137451171875,
0.02471923828125,
-0.0264892578125,
-0.0360107421875,
0.0408935546875,
0.0196990966796875,
-0.0255889892578125,
0.0132598876953125,
-0.032196044921875,
-0.0750732421875,
0.048919677734375,
0.030426025390625,
0.07769775390625,
-0.0228729248046875,
0.043609619140625,
0.0531005859375,
0.0350341796875,
-0.0098724365234375,
0.037811279296875,
0.00450897216796875,
-0.0718994140625,
-0.029510498046875,
-0.056793212890625,
-0.0207366943359375,
0.0155792236328125,
-0.04949951171875,
0.01386260986328125,
-0.031463623046875,
-0.014373779296875,
-0.0030364990234375,
0.0031032562255859375,
-0.040069580078125,
0.03179931640625,
-0.0010900497436523438,
0.05194091796875,
-0.0919189453125,
0.05999755859375,
0.053466796875,
-0.04290771484375,
-0.05450439453125,
-0.006671905517578125,
-0.00634002685546875,
-0.050537109375,
0.043701171875,
0.0022220611572265625,
-0.0297698974609375,
-0.0009288787841796875,
-0.039581298828125,
-0.06402587890625,
0.046905517578125,
0.0252685546875,
-0.0228729248046875,
0.01123046875,
0.0029659271240234375,
0.07122802734375,
-0.02154541015625,
0.0291595458984375,
0.0489501953125,
0.029693603515625,
0.000629425048828125,
-0.061309814453125,
-0.0030040740966796875,
-0.03973388671875,
-0.00994110107421875,
0.0176849365234375,
-0.043731689453125,
0.0777587890625,
-0.002681732177734375,
-0.016021728515625,
-0.01385498046875,
0.05914306640625,
0.0014801025390625,
0.00897979736328125,
0.033233642578125,
0.037933349609375,
0.06317138671875,
-0.0233154296875,
0.08258056640625,
-0.0200653076171875,
0.041595458984375,
0.08624267578125,
0.00389862060546875,
0.06292724609375,
0.0271759033203125,
-0.035125732421875,
0.056488037109375,
0.03411865234375,
0.0118560791015625,
0.0264739990234375,
-0.0057373046875,
-0.014801025390625,
-0.00450897216796875,
-0.009796142578125,
-0.01849365234375,
0.022735595703125,
0.01971435546875,
-0.03216552734375,
-0.006641387939453125,
0.01172637939453125,
0.041595458984375,
0.0302276611328125,
-0.00655364990234375,
0.038787841796875,
0.0011110305786132812,
-0.039215087890625,
0.05242919921875,
-0.0034656524658203125,
0.08551025390625,
-0.040069580078125,
0.034027099609375,
-0.021820068359375,
0.006618499755859375,
-0.034637451171875,
-0.0697021484375,
0.029937744140625,
0.0352783203125,
-0.004291534423828125,
-0.035552978515625,
0.036224365234375,
-0.026885986328125,
-0.038360595703125,
0.043670654296875,
0.029266357421875,
0.0163726806640625,
-0.0146331787109375,
-0.0623779296875,
0.01090240478515625,
0.0233154296875,
-0.041046142578125,
0.00775909423828125,
0.03753662109375,
-0.001354217529296875,
0.04461669921875,
0.03485107421875,
0.018951416015625,
0.0171661376953125,
0.034088134765625,
0.061767578125,
-0.04815673828125,
-0.036895751953125,
-0.057220458984375,
0.040130615234375,
-0.020538330078125,
-0.03448486328125,
0.07403564453125,
0.034698486328125,
0.0679931640625,
-0.00799560546875,
0.0615234375,
-0.03271484375,
0.0638427734375,
-0.01346588134765625,
0.03985595703125,
-0.043304443359375,
-0.003032684326171875,
-0.050628662109375,
-0.060089111328125,
-0.040740966796875,
0.06903076171875,
-0.057098388671875,
0.0014047622680664062,
0.058135986328125,
0.07080078125,
0.015594482421875,
-0.0136566162109375,
0.0031414031982421875,
0.041717529296875,
0.0187530517578125,
0.035552978515625,
0.04833984375,
-0.035675048828125,
0.039581298828125,
-0.029693603515625,
-0.0233001708984375,
-0.0052947998046875,
-0.06329345703125,
-0.072998046875,
-0.0511474609375,
-0.033721923828125,
-0.045196533203125,
-0.006763458251953125,
0.06951904296875,
0.0255584716796875,
-0.07269287109375,
-0.034942626953125,
-0.0015506744384765625,
0.00833892822265625,
0.01092529296875,
-0.0191192626953125,
0.0252532958984375,
-0.0218658447265625,
-0.0775146484375,
0.0137939453125,
0.0079803466796875,
0.00168609619140625,
0.003536224365234375,
-0.0060882568359375,
-0.031463623046875,
0.007579803466796875,
0.0460205078125,
0.0345458984375,
-0.0494384765625,
-0.00946044921875,
0.0152130126953125,
-0.0116119384765625,
0.0202484130859375,
0.02752685546875,
-0.046417236328125,
0.00933837890625,
0.040802001953125,
0.031402587890625,
0.0399169921875,
-0.00897216796875,
0.019500732421875,
-0.05255126953125,
0.026275634765625,
0.033233642578125,
0.02447509765625,
0.03155517578125,
-0.007781982421875,
0.038665771484375,
0.01183319091796875,
-0.02911376953125,
-0.0648193359375,
-0.00167083740234375,
-0.089599609375,
-0.00370025634765625,
0.0908203125,
-0.0036029815673828125,
-0.0340576171875,
0.01146697998046875,
-0.0146026611328125,
0.0252227783203125,
-0.052093505859375,
0.051605224609375,
0.047027587890625,
-0.005527496337890625,
-0.0167236328125,
-0.0302886962890625,
0.0382080078125,
0.036590576171875,
-0.0738525390625,
-0.0031585693359375,
0.0195770263671875,
0.0255889892578125,
0.01155853271484375,
0.0643310546875,
-0.0182647705078125,
0.0147857666015625,
-0.026611328125,
0.033111572265625,
0.0189208984375,
-0.0190887451171875,
-0.0235137939453125,
-0.0012054443359375,
-0.01190948486328125,
-0.01036834716796875
]
] |
vinai/bertweet-base | 2022-10-22T08:52:39.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | vinai | null | null | vinai/bertweet-base | 21 | 1,007,895 | transformers | 2022-03-02T23:29:05 | # <a name="introduction"></a> BERTweet: A pre-trained language model for English Tweets
BERTweet is the first public large-scale language model pre-trained for English Tweets. BERTweet is trained using the [RoBERTa](https://github.com/pytorch/fairseq/blob/master/examples/roberta/README.md) pre-training procedure. The pre-training corpus consists of 850M English Tweets (16B word tokens, ~80GB): 845M Tweets streamed from 01/2012 to 08/2019 and 5M Tweets related to the **COVID-19** pandemic. The general architecture and experimental results of BERTweet can be found in our [paper](https://aclanthology.org/2020.emnlp-demos.2/):
@inproceedings{bertweet,
title = {{BERTweet: A pre-trained language model for English Tweets}},
author = {Dat Quoc Nguyen and Thanh Vu and Anh Tuan Nguyen},
booktitle = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
pages = {9--14},
year = {2020}
}
**Please CITE** our paper when BERTweet is used to help produce published results or is incorporated into other software.
For further information or requests, please go to [BERTweet's homepage](https://github.com/VinAIResearch/BERTweet)!
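BERTweet's tokenizer can normalize raw Tweets before tokenization (e.g. mapping user mentions to `@USER` and URLs to `HTTPURL`). As a rough illustration of that normalization idea — a standalone sketch, not the model's actual tokenizer, whose regex patterns are simplified assumptions:

```python
import re

def normalize_tweet(tweet: str) -> str:
    """Illustrative pre-processing in the spirit of BERTweet's Tweet
    normalization: user mentions become @USER and URLs become HTTPURL.
    (Real inference should rely on the BERTweet tokenizer itself.)"""
    tweet = re.sub(r"@\w+", "@USER", tweet)          # mask user mentions
    tweet = re.sub(r"https?://\S+", "HTTPURL", tweet)  # mask URLs
    return tweet

print(normalize_tweet("Thanks @nlp_fan, see https://example.com now"))
# -> Thanks @USER, see HTTPURL now
```

In practice this step is handled by loading the tokenizer with normalization enabled, as described on BERTweet's homepage.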
### Main results
<p float="left">
<img width="275" alt="postagging" src="https://user-images.githubusercontent.com/2412555/135724590-01d8d435-262d-44fe-a383-cd39324fe190.png" />
<img width="275" alt="ner" src="https://user-images.githubusercontent.com/2412555/135724598-1e3605e7-d8ce-4c5e-be4a-62ae8501fae7.png" />
</p>
<p float="left">
<img width="275" alt="sentiment" src="https://user-images.githubusercontent.com/2412555/135724597-f1981f1e-fe73-4c03-b1ff-0cae0cc5f948.png" />
<img width="275" alt="irony" src="https://user-images.githubusercontent.com/2412555/135724595-15f4f2c8-bbb6-4ee6-82a0-034769dec183.png" />
</p>
| 1,890 | [
[
-0.03759765625,
-0.045745849609375,
0.0215606689453125,
0.021331787109375,
-0.034759521484375,
0.0022335052490234375,
-0.027008056640625,
-0.039947509765625,
0.039398193359375,
0.011474609375,
-0.040771484375,
-0.046630859375,
-0.047119140625,
-0.01444244384765625,
-0.0294647216796875,
0.0675048828125,
0.0098876953125,
-0.01525115966796875,
0.0164947509765625,
0.0064697265625,
0.0015726089477539062,
-0.0372314453125,
-0.05096435546875,
-0.0023441314697265625,
0.052734375,
0.0141143798828125,
0.056610107421875,
0.01190948486328125,
0.04412841796875,
0.0185699462890625,
0.011749267578125,
-0.0121002197265625,
-0.046966552734375,
-0.00335693359375,
0.01244354248046875,
0.006549835205078125,
-0.0264892578125,
0.0008044242858886719,
0.0292510986328125,
0.0245513916015625,
0.0081939697265625,
0.0113067626953125,
0.0073394775390625,
0.04522705078125,
-0.04107666015625,
0.0007586479187011719,
-0.04937744140625,
-0.014678955078125,
-0.0283355712890625,
-0.0159912109375,
-0.0206756591796875,
-0.044586181640625,
0.0286865234375,
-0.044036865234375,
0.0052947998046875,
-0.00339508056640625,
0.11749267578125,
-0.01459503173828125,
-0.0174102783203125,
-0.007320404052734375,
-0.034423828125,
0.05389404296875,
-0.062347412109375,
0.0265960693359375,
0.0227203369140625,
0.01303863525390625,
0.0017452239990234375,
-0.0655517578125,
-0.045013427734375,
-0.0177459716796875,
-0.00972747802734375,
0.0190277099609375,
-0.03466796875,
-0.00896453857421875,
-0.00040030479431152344,
-0.0006327629089355469,
-0.05072021484375,
-0.0017528533935546875,
-0.0122222900390625,
-0.002147674560546875,
0.03973388671875,
-0.0290069580078125,
0.0205078125,
-0.0226287841796875,
-0.044403076171875,
-0.0167083740234375,
-0.0303955078125,
0.00450897216796875,
0.00957489013671875,
0.01015472412109375,
-0.0205841064453125,
0.0189208984375,
0.01050567626953125,
0.0279083251953125,
-0.0056304931640625,
0.01145172119140625,
0.031768798828125,
-0.01153564453125,
-0.01332855224609375,
-0.0291900634765625,
0.0863037109375,
0.0287933349609375,
0.035125732421875,
-0.0030117034912109375,
-0.033599853515625,
-0.020751953125,
0.015228271484375,
-0.071044921875,
-0.02105712890625,
0.011627197265625,
-0.0364990234375,
-0.019561767578125,
0.005329132080078125,
-0.047027587890625,
-0.0085906982421875,
-0.005931854248046875,
0.056884765625,
-0.061798095703125,
-0.0242462158203125,
-0.01032257080078125,
-0.00438690185546875,
0.0311431884765625,
0.04290771484375,
-0.049102783203125,
-0.0019855499267578125,
0.04046630859375,
0.057586669921875,
0.021820068359375,
-0.0196533203125,
-0.0038738250732421875,
0.00603485107421875,
-0.0225677490234375,
0.060882568359375,
-0.02056884765625,
-0.0038204193115234375,
0.0185699462890625,
0.0012617111206054688,
-0.0001633167266845703,
-0.0005469322204589844,
0.042449951171875,
-0.04608154296875,
0.01491546630859375,
-0.03271484375,
-0.04290771484375,
-0.010528564453125,
0.018646240234375,
-0.034820556640625,
0.07330322265625,
0.0004203319549560547,
-0.0809326171875,
0.030792236328125,
-0.0675048828125,
-0.03814697265625,
0.0155487060546875,
-0.01434326171875,
-0.037933349609375,
-0.00986480712890625,
0.0287933349609375,
0.058929443359375,
-0.01282501220703125,
0.007160186767578125,
-0.039031982421875,
-0.0191192626953125,
0.009613037109375,
-0.019256591796875,
0.0877685546875,
0.00942230224609375,
-0.030853271484375,
-0.007427215576171875,
-0.035614013671875,
0.0015201568603515625,
0.0396728515625,
-0.0154266357421875,
-0.01273345947265625,
-0.003902435302734375,
0.0262451171875,
0.01611328125,
0.04620361328125,
-0.047760009765625,
0.00882720947265625,
-0.0135498046875,
0.041259765625,
0.06988525390625,
-0.008056640625,
0.02056884765625,
-0.043121337890625,
0.0223388671875,
-0.0160064697265625,
0.0245361328125,
-0.0124053955078125,
-0.032928466796875,
-0.04071044921875,
-0.04046630859375,
0.0157012939453125,
0.036102294921875,
-0.041473388671875,
0.051483154296875,
-0.042266845703125,
-0.049652099609375,
-0.0452880859375,
-0.017425537109375,
0.00635528564453125,
0.0297088623046875,
0.0238800048828125,
-0.006378173828125,
-0.046234130859375,
-0.070068359375,
-0.009002685546875,
-0.0240936279296875,
0.0010051727294921875,
0.0204925537109375,
0.051300048828125,
-0.0189056396484375,
0.07183837890625,
-0.034423828125,
-0.01004791259765625,
0.0057220458984375,
0.0222015380859375,
0.011962890625,
0.038360595703125,
0.06829833984375,
-0.0587158203125,
-0.05426025390625,
-0.004062652587890625,
-0.038909912109375,
-0.0059661865234375,
-0.005390167236328125,
-0.0202789306640625,
0.038299560546875,
0.02264404296875,
-0.05328369140625,
0.04022216796875,
0.05523681640625,
-0.0193023681640625,
0.047332763671875,
-0.008209228515625,
0.023956298828125,
-0.0977783203125,
0.01375579833984375,
0.0028553009033203125,
-0.02734375,
-0.033477783203125,
-0.0081024169921875,
0.0083160400390625,
0.0022602081298828125,
-0.0311431884765625,
0.040191650390625,
-0.033203125,
0.0008115768432617188,
0.0036754608154296875,
0.0019893646240234375,
-0.0013132095336914062,
0.030670166015625,
-0.003978729248046875,
0.05206298828125,
0.050537109375,
-0.011260986328125,
0.0185394287109375,
-0.00023102760314941406,
-0.030029296875,
0.02301025390625,
-0.06024169921875,
0.0118255615234375,
0.002101898193359375,
0.0244140625,
-0.09259033203125,
-0.0201263427734375,
0.006275177001953125,
-0.06341552734375,
0.01158905029296875,
-0.0135040283203125,
-0.0643310546875,
-0.032073974609375,
-0.04156494140625,
0.0011720657348632812,
0.06298828125,
-0.04132080078125,
0.03326416015625,
0.022857666015625,
-0.00315093994140625,
-0.0428466796875,
-0.064208984375,
0.0083160400390625,
0.0011434555053710938,
-0.0784912109375,
0.019622802734375,
-0.0008521080017089844,
0.01220703125,
0.0185699462890625,
0.0086517333984375,
-0.0264739990234375,
-0.0006289482116699219,
0.0155181884765625,
0.0220184326171875,
-0.01387786865234375,
0.02740478515625,
-0.019683837890625,
0.01139068603515625,
-0.00949859619140625,
-0.0263671875,
0.055511474609375,
-0.0220184326171875,
-0.031494140625,
-0.045318603515625,
0.0269775390625,
0.04547119140625,
-0.017852783203125,
0.07025146484375,
0.0733642578125,
-0.029815673828125,
0.0004794597625732422,
-0.0469970703125,
-0.0179290771484375,
-0.03338623046875,
0.021881103515625,
-0.0157470703125,
-0.07318115234375,
0.049102783203125,
0.016326904296875,
0.0291900634765625,
0.0418701171875,
0.042236328125,
-0.05718994140625,
0.054412841796875,
0.021575927734375,
-0.0003368854522705078,
0.058013916015625,
-0.0277557373046875,
0.02154541015625,
-0.0275115966796875,
0.007595062255859375,
-0.035919189453125,
-0.0236358642578125,
-0.0670166015625,
-0.0276947021484375,
0.011871337890625,
-0.0011510848999023438,
-0.032928466796875,
0.03851318359375,
-0.03790283203125,
0.0168304443359375,
0.04437255859375,
0.0200958251953125,
-0.015625,
0.004329681396484375,
-0.0009965896606445312,
-0.005306243896484375,
-0.037506103515625,
-0.0423583984375,
0.08538818359375,
0.0295257568359375,
0.061920166015625,
0.0144500732421875,
0.0821533203125,
0.01548004150390625,
0.0199737548828125,
-0.042266845703125,
0.044158935546875,
-0.0258026123046875,
-0.06494140625,
-0.0338134765625,
-0.0224609375,
-0.0906982421875,
0.00469207763671875,
-0.021240234375,
-0.0521240234375,
0.0056304931640625,
0.01387786865234375,
-0.01326751708984375,
0.03704833984375,
-0.0740966796875,
0.052581787109375,
-0.02294921875,
-0.0189971923828125,
-0.0108795166015625,
-0.047454833984375,
0.002918243408203125,
-0.00714874267578125,
0.01995849609375,
-0.01453399658203125,
-0.0026721954345703125,
0.0599365234375,
-0.037139892578125,
0.0689697265625,
-0.021942138671875,
0.004184722900390625,
0.01165008544921875,
-0.0016431808471679688,
0.0521240234375,
0.00524139404296875,
0.00933074951171875,
0.0191802978515625,
0.0004515647888183594,
-0.058929443359375,
-0.0175018310546875,
0.0450439453125,
-0.0758056640625,
-0.0367431640625,
-0.030487060546875,
-0.01464080810546875,
0.00690460205078125,
0.02838134765625,
0.039520263671875,
0.0228424072265625,
-0.0240020751953125,
0.04779052734375,
0.041778564453125,
-0.0249176025390625,
0.04412841796875,
0.0220184326171875,
-0.00823211669921875,
-0.030609130859375,
0.048309326171875,
0.0250396728515625,
0.00850677490234375,
0.0428466796875,
0.0090484619140625,
-0.0130767822265625,
-0.0343017578125,
0.0088348388671875,
0.0167236328125,
-0.041229248046875,
-0.015472412109375,
-0.0521240234375,
-0.0251312255859375,
-0.068359375,
-0.04083251953125,
-0.03094482421875,
-0.0292816162109375,
-0.0278472900390625,
0.01059722900390625,
0.04730224609375,
0.042327880859375,
-0.0191192626953125,
0.025390625,
-0.042572021484375,
0.0201263427734375,
0.037750244140625,
0.0252685546875,
0.001953125,
-0.040496826171875,
-0.0215606689453125,
0.005405426025390625,
-0.020721435546875,
-0.056732177734375,
0.03350830078125,
0.01116943359375,
0.02911376953125,
0.03326416015625,
0.007724761962890625,
0.04364013671875,
-0.0382080078125,
0.0567626953125,
0.034698486328125,
-0.06378173828125,
0.04718017578125,
-0.04315185546875,
0.01934814453125,
0.037384033203125,
0.0279541015625,
-0.044525146484375,
-0.034515380859375,
-0.06585693359375,
-0.07733154296875,
0.043426513671875,
0.0333251953125,
0.0117034912109375,
-0.007904052734375,
-0.00476837158203125,
-0.0084381103515625,
-0.00131988525390625,
-0.060333251953125,
-0.0153961181640625,
-0.0159149169921875,
-0.0139617919921875,
0.0027751922607421875,
-0.0155487060546875,
-0.002956390380859375,
-0.04022216796875,
0.05328369140625,
0.0155181884765625,
0.052734375,
0.021575927734375,
-0.0164947509765625,
-0.0037631988525390625,
0.01473236083984375,
0.0435791015625,
0.05303955078125,
-0.041656494140625,
-0.0021190643310546875,
-0.01116943359375,
-0.036590576171875,
-0.0215301513671875,
0.0299224853515625,
-0.01092529296875,
0.01654052734375,
0.0478515625,
0.06243896484375,
0.011749267578125,
-0.024322509765625,
0.067626953125,
-0.00695037841796875,
-0.032470703125,
-0.04986572265625,
-0.0274658203125,
0.0153350830078125,
0.0182037353515625,
0.056976318359375,
0.0103302001953125,
-0.01033782958984375,
-0.0252227783203125,
0.0287933349609375,
0.039520263671875,
-0.0170440673828125,
-0.033782958984375,
0.0298614501953125,
0.0178375244140625,
-0.00910186767578125,
0.020782470703125,
-0.02008056640625,
-0.050872802734375,
0.0401611328125,
0.01513671875,
0.085693359375,
0.004077911376953125,
0.02459716796875,
0.03143310546875,
0.05938720703125,
0.0207672119140625,
0.032806396484375,
0.016632080078125,
-0.0546875,
-0.0297698974609375,
-0.039154052734375,
-0.014068603515625,
0.0201873779296875,
-0.0335693359375,
0.0276947021484375,
-0.046356201171875,
-0.029937744140625,
0.0174407958984375,
0.0083160400390625,
-0.07122802734375,
0.0138092041015625,
0.01387786865234375,
0.07623291015625,
-0.060760498046875,
0.07037353515625,
0.06915283203125,
-0.04302978515625,
-0.06964111328125,
0.030792236328125,
-0.0210113525390625,
-0.0709228515625,
0.053802490234375,
0.0092620849609375,
0.0042877197265625,
0.007793426513671875,
-0.07415771484375,
-0.061614990234375,
0.0816650390625,
0.0352783203125,
-0.0303802490234375,
-0.00240325927734375,
-0.01271820068359375,
0.034210205078125,
-0.0211181640625,
0.02447509765625,
0.00817108154296875,
0.039520263671875,
0.003246307373046875,
-0.06976318359375,
0.007843017578125,
-0.01560211181640625,
0.0073394775390625,
0.0142822265625,
-0.05609130859375,
0.0733642578125,
-0.006938934326171875,
-0.01392364501953125,
-0.0108184814453125,
0.036102294921875,
0.00820159912109375,
0.005504608154296875,
0.045135498046875,
0.0537109375,
0.0400390625,
-0.029510498046875,
0.0762939453125,
-0.019256591796875,
0.05682373046875,
0.062408447265625,
0.004329681396484375,
0.0697021484375,
0.04376220703125,
-0.02032470703125,
0.03759765625,
0.051025390625,
0.01210784912109375,
0.037811279296875,
0.004749298095703125,
-0.0083160400390625,
-0.0180816650390625,
0.005115509033203125,
-0.032501220703125,
0.0115966796875,
0.0158233642578125,
-0.0294647216796875,
-0.02337646484375,
-0.00540924072265625,
0.0149993896484375,
-0.011993408203125,
0.0005922317504882812,
0.044097900390625,
-0.00870513916015625,
-0.0165252685546875,
0.061981201171875,
-0.0245513916015625,
0.06744384765625,
-0.048370361328125,
0.0187530517578125,
-0.007350921630859375,
-0.000005781650543212891,
-0.01340484619140625,
-0.0833740234375,
0.0026187896728515625,
0.0062713623046875,
0.00139617919921875,
-0.030120849609375,
0.058685302734375,
-0.0130157470703125,
-0.02825927734375,
0.037841796875,
0.019622802734375,
0.01331329345703125,
-0.00588226318359375,
-0.074462890625,
0.006519317626953125,
0.006656646728515625,
-0.054901123046875,
0.0015687942504882812,
0.059539794921875,
0.014801025390625,
0.0391845703125,
0.04083251953125,
0.006183624267578125,
0.02032470703125,
0.0087738037109375,
0.08038330078125,
-0.04486083984375,
-0.03863525390625,
-0.05389404296875,
0.04962158203125,
-0.0167388916015625,
-0.01387786865234375,
0.04180908203125,
0.032928466796875,
0.06280517578125,
-0.015167236328125,
0.081298828125,
-0.0401611328125,
0.04864501953125,
-0.01263427734375,
0.07037353515625,
-0.06170654296875,
-0.01207733154296875,
-0.0272064208984375,
-0.0450439453125,
-0.02044677734375,
0.057373046875,
-0.036285400390625,
0.0206451416015625,
0.048858642578125,
0.058807373046875,
0.004848480224609375,
-0.018707275390625,
0.0227203369140625,
0.0296630859375,
0.0246429443359375,
0.037261962890625,
0.05389404296875,
-0.034759521484375,
0.05169677734375,
-0.02587890625,
-0.02593994140625,
-0.03924560546875,
-0.057891845703125,
-0.070068359375,
-0.059600830078125,
-0.01140594482421875,
-0.0258331298828125,
0.01934814453125,
0.08526611328125,
0.07415771484375,
-0.06744384765625,
-0.00443267822265625,
0.00032210350036621094,
-0.0100860595703125,
0.006862640380859375,
-0.0213470458984375,
0.04010009765625,
-0.0218658447265625,
-0.042816162109375,
-0.017791748046875,
0.0088653564453125,
0.0165252685546875,
-0.0016469955444335938,
-0.0082550048828125,
-0.051513671875,
0.0139312744140625,
0.046905517578125,
0.0147552490234375,
-0.044586181640625,
-0.03546142578125,
0.0011501312255859375,
-0.03350830078125,
0.0169219970703125,
0.01654052734375,
-0.0297088623046875,
0.01025390625,
0.047088623046875,
0.041534423828125,
0.03619384765625,
-0.01032257080078125,
0.0172119140625,
-0.07257080078125,
0.0157470703125,
0.03240966796875,
0.023345947265625,
0.01824951171875,
-0.0169219970703125,
0.0311737060546875,
0.0202178955078125,
-0.048858642578125,
-0.060882568359375,
-0.0026950836181640625,
-0.0899658203125,
-0.0223388671875,
0.07159423828125,
-0.0191192626953125,
-0.01131439208984375,
0.0195770263671875,
-0.01190948486328125,
0.0245819091796875,
-0.07415771484375,
0.04730224609375,
0.059661865234375,
0.017242431640625,
-0.023834228515625,
-0.043060302734375,
0.0280303955078125,
0.02215576171875,
-0.056396484375,
-0.00928497314453125,
0.018463134765625,
0.01666259765625,
0.01824951171875,
0.06268310546875,
0.006038665771484375,
0.0191802978515625,
-0.024169921875,
0.038299560546875,
0.0236968994140625,
0.00029158592224121094,
-0.009490966796875,
-0.0014095306396484375,
0.00022292137145996094,
-0.031829833984375
]
] |
CIDAS/clipseg-rd64-refined | 2023-01-04T11:56:08.000Z | [
"transformers",
"pytorch",
"clipseg",
"vision",
"image-segmentation",
"arxiv:2112.10003",
"license:apache-2.0",
"has_space",
"region:us"
] | image-segmentation | CIDAS | null | null | CIDAS/clipseg-rd64-refined | 58 | 980,137 | transformers | 2022-11-01T14:25:57 | ---
license: apache-2.0
tags:
- vision
- image-segmentation
inference: false
---
# CLIPSeg model
CLIPSeg model with reduced dimension 64, refined (using a more complex convolution). It was introduced in the paper [Image Segmentation Using Text and Image Prompts](https://arxiv.org/abs/2112.10003) by Lรผddecke et al. and first released in [this repository](https://github.com/timojl/clipseg).
# Intended use cases
This model is intended for zero-shot and one-shot image segmentation.
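Text-conditioned segmentation models of this kind produce a per-pixel logit map for each prompt; a common final step is to apply a sigmoid and threshold it into a binary mask. A minimal sketch of that post-processing (the helper function and the 0.5 threshold are illustrative assumptions, not part of the model's API):

```python
import math

def logits_to_mask(logits, threshold=0.5):
    """Convert a 2D per-pixel logit map into a binary mask by applying
    a sigmoid and thresholding. Sketch only: real CLIPSeg inference
    should use the transformers CLIPSeg classes per the documentation."""
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    return [[1 if sigmoid(v) >= threshold else 0 for v in row] for row in logits]

mask = logits_to_mask([[-2.0, 0.1], [3.5, -0.4]])
print(mask)  # [[0, 1], [1, 0]]
```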
# Usage
Refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/clipseg). | 595 | [
[
-0.06158447265625,
-0.04052734375,
0.045135498046875,
0.00789642333984375,
-0.04302978515625,
-0.0294342041015625,
0.019195556640625,
-0.0261688232421875,
0.0018758773803710938,
0.028472900390625,
-0.0709228515625,
-0.04632568359375,
-0.057281494140625,
0.0010471343994140625,
-0.05322265625,
0.0643310546875,
-0.0084991455078125,
0.01739501953125,
-0.02325439453125,
-0.025238037109375,
-0.01508331298828125,
-0.0175933837890625,
-0.036102294921875,
-0.011383056640625,
0.019073486328125,
0.0305938720703125,
0.039886474609375,
0.0306549072265625,
0.053436279296875,
0.0231170654296875,
-0.033599853515625,
-0.0142364501953125,
-0.030517578125,
-0.0248565673828125,
0.00028896331787109375,
-0.02947998046875,
-0.00479888916015625,
0.00189971923828125,
0.046051025390625,
0.0297393798828125,
0.022796630859375,
0.023956298828125,
-0.033416748046875,
0.040496826171875,
-0.05126953125,
-0.0173492431640625,
-0.02288818359375,
0.0214691162109375,
-0.004566192626953125,
0.0132598876953125,
-0.015777587890625,
-0.0261383056640625,
0.021331787109375,
-0.02447509765625,
0.04534912109375,
-0.0207061767578125,
0.09930419921875,
0.01088714599609375,
-0.0101318359375,
-0.00435638427734375,
-0.051849365234375,
0.061920166015625,
-0.040252685546875,
0.0228729248046875,
-0.0160675048828125,
0.039031982421875,
0.0279083251953125,
-0.0716552734375,
-0.0218963623046875,
0.00368499755859375,
0.00949859619140625,
-0.0234222412109375,
-0.029510498046875,
-0.006866455078125,
0.046112060546875,
0.0316162109375,
-0.03582763671875,
0.0015010833740234375,
-0.079345703125,
-0.01617431640625,
0.04132080078125,
0.0207977294921875,
0.03302001953125,
-0.0212554931640625,
-0.074462890625,
-0.005809783935546875,
-0.046142578125,
0.01503753662109375,
0.0145721435546875,
-0.01824951171875,
-0.0245361328125,
0.0309295654296875,
0.021697998046875,
0.044342041015625,
0.01166534423828125,
-0.0223541259765625,
0.010284423828125,
0.01003265380859375,
-0.017974853515625,
-0.00879669189453125,
0.0584716796875,
0.04901123046875,
0.028961181640625,
0.00997161865234375,
-0.0104827880859375,
-0.004962921142578125,
0.0297088623046875,
-0.103271484375,
-0.015899658203125,
-0.00879669189453125,
-0.051849365234375,
-0.026824951171875,
0.00948333740234375,
-0.04534912109375,
0.0021305084228515625,
-0.01324462890625,
0.032745361328125,
-0.0618896484375,
0.01210784912109375,
0.01031494140625,
-0.022125244140625,
0.03497314453125,
0.03448486328125,
-0.051483154296875,
0.0262908935546875,
0.03582763671875,
0.07763671875,
-0.0038471221923828125,
-0.018402099609375,
-0.0036678314208984375,
-0.0131683349609375,
-0.01021575927734375,
0.09918212890625,
-0.0310821533203125,
-0.04217529296875,
-0.006561279296875,
0.0232086181640625,
0.006916046142578125,
-0.0289459228515625,
0.062103271484375,
-0.033966064453125,
0.01007080078125,
-0.00852203369140625,
-0.0478515625,
-0.0474853515625,
0.0263214111328125,
-0.050567626953125,
0.0428466796875,
0.0206451416015625,
-0.051910400390625,
0.024993896484375,
-0.0504150390625,
-0.00400543212890625,
0.0020046234130859375,
-0.0015010833740234375,
-0.0609130859375,
0.0124969482421875,
0.0304412841796875,
0.01430511474609375,
-0.0223541259765625,
0.01032257080078125,
-0.0256195068359375,
-0.036895751953125,
0.007472991943359375,
-0.01080322265625,
0.063232421875,
0.0306854248046875,
0.0225982666015625,
0.0225067138671875,
-0.040679931640625,
-0.01061248779296875,
0.02587890625,
0.0168914794921875,
-0.01312255859375,
-0.01486968994140625,
0.01120758056640625,
-0.004405975341796875,
0.01849365234375,
-0.06024169921875,
0.0216827392578125,
0.0013246536254882812,
0.032928466796875,
0.0201873779296875,
0.0230712890625,
0.03997802734375,
-0.01271820068359375,
0.048248291015625,
0.0035915374755859375,
0.046539306640625,
-0.042022705078125,
-0.0145263671875,
-0.0284423828125,
-0.028289794921875,
0.033966064453125,
0.0328369140625,
-0.0201416015625,
0.01776123046875,
0.0069732666015625,
-0.049530029296875,
-0.005069732666015625,
-0.017974853515625,
0.0007505416870117188,
0.0286407470703125,
0.0183868408203125,
-0.0165252685546875,
-0.05047607421875,
-0.08868408203125,
0.02337646484375,
0.00618743896484375,
-0.015472412109375,
0.0167999267578125,
0.03631591796875,
-0.0211181640625,
0.06951904296875,
-0.06781005859375,
-0.00809478759765625,
-0.0220794677734375,
0.01403045654296875,
0.0245819091796875,
0.03802490234375,
0.08441162109375,
-0.0692138671875,
-0.0394287109375,
-0.04534912109375,
-0.048370361328125,
0.004611968994140625,
-0.0020084381103515625,
-0.01244354248046875,
-0.03253173828125,
0.0307464599609375,
-0.0511474609375,
0.04559326171875,
0.03271484375,
-0.004119873046875,
0.055023193359375,
0.019073486328125,
0.0252838134765625,
-0.076416015625,
-0.0035610198974609375,
0.029541015625,
-0.04437255859375,
-0.0511474609375,
0.003570556640625,
0.0122528076171875,
-0.048828125,
-0.06658935546875,
0.0173797607421875,
-0.0230712890625,
-0.00139617919921875,
-0.0285491943359375,
-0.00228118896484375,
0.0266265869140625,
0.0401611328125,
0.00598907470703125,
0.060577392578125,
0.052032470703125,
-0.0501708984375,
0.016021728515625,
0.053985595703125,
-0.027984619140625,
0.05841064453125,
-0.084228515625,
0.007732391357421875,
-0.01505279541015625,
0.005596160888671875,
-0.054595947265625,
-0.015594482421875,
0.01549530029296875,
-0.021514892578125,
0.02337646484375,
-0.019195556640625,
-0.0181121826171875,
-0.0308837890625,
-0.039794921875,
0.05828857421875,
0.04266357421875,
-0.043609619140625,
0.01059722900390625,
0.0374755859375,
0.00426483154296875,
-0.0032958984375,
-0.040069580078125,
-0.025360107421875,
-0.0101776123046875,
-0.0699462890625,
0.047393798828125,
-0.0210418701171875,
0.00020182132720947266,
0.0017185211181640625,
0.007007598876953125,
-0.0265960693359375,
-0.0264434814453125,
0.007686614990234375,
0.028472900390625,
-0.0170440673828125,
0.01102447509765625,
0.013580322265625,
-0.0148162841796875,
-0.0039520263671875,
0.0188751220703125,
0.033416748046875,
-0.016357421875,
-0.01507568359375,
-0.031463623046875,
0.04791259765625,
0.04193115234375,
-0.003139495849609375,
0.038116455078125,
0.043914794921875,
-0.036224365234375,
0.0045013427734375,
-0.023162841796875,
-0.01071929931640625,
-0.03369140625,
0.01861572265625,
-0.0245361328125,
-0.068603515625,
0.048828125,
-0.0202789306640625,
-0.01218414306640625,
0.05230712890625,
0.026641845703125,
0.007091522216796875,
0.0814208984375,
0.07403564453125,
0.0260772705078125,
0.055877685546875,
-0.03668212890625,
0.020263671875,
-0.0841064453125,
-0.0117340087890625,
-0.036895751953125,
-0.0216064453125,
-0.0178680419921875,
-0.0249481201171875,
0.034759521484375,
0.040130615234375,
-0.0119781494140625,
0.05499267578125,
-0.049591064453125,
0.041290283203125,
0.0243072509765625,
0.026458740234375,
-0.0119171142578125,
-0.0183868408203125,
-0.002643585205078125,
-0.0166473388671875,
-0.0341796875,
-0.03668212890625,
0.027496337890625,
0.045379638671875,
0.07171630859375,
-0.0195465087890625,
0.04296875,
0.023468017578125,
-0.002185821533203125,
-0.09173583984375,
0.046875,
-0.0146942138671875,
-0.059051513671875,
-0.006465911865234375,
0.0019407272338867188,
-0.054595947265625,
-0.0016460418701171875,
-0.01004791259765625,
-0.0579833984375,
0.030670166015625,
0.00403594970703125,
-0.032470703125,
0.03497314453125,
-0.056365966796875,
0.06524658203125,
0.0009760856628417969,
0.00907135009765625,
0.00835418701171875,
-0.0300140380859375,
0.041595458984375,
0.0244140625,
-0.0168914794921875,
-0.04071044921875,
0.0183563232421875,
0.056488037109375,
-0.04925537109375,
0.053741455078125,
-0.018310546875,
0.01242828369140625,
0.053466796875,
0.0037860870361328125,
0.033050537109375,
0.00012350082397460938,
0.0084228515625,
0.0247039794921875,
0.0323486328125,
-0.019561767578125,
-0.03973388671875,
0.0298919677734375,
-0.048492431640625,
-0.0226287841796875,
-0.0021953582763671875,
-0.0350341796875,
0.0208282470703125,
0.01383209228515625,
0.056060791015625,
0.0538330078125,
-0.029632568359375,
0.01050567626953125,
0.052825927734375,
-0.021697998046875,
0.033447265625,
0.01461029052734375,
-0.0272064208984375,
-0.031982421875,
0.031951904296875,
0.005596160888671875,
0.0039215087890625,
0.0122528076171875,
0.022125244140625,
-0.0215911865234375,
-0.038543701171875,
-0.0243988037109375,
0.02374267578125,
-0.06854248046875,
-0.040069580078125,
-0.0323486328125,
-0.03546142578125,
-0.0308380126953125,
-0.0286407470703125,
-0.03369140625,
-0.0247802734375,
-0.050811767578125,
-0.003353118896484375,
0.045501708984375,
0.03863525390625,
-0.0038547515869140625,
0.04498291015625,
-0.09759521484375,
0.0289154052734375,
0.02947998046875,
0.038909912109375,
-0.0240478515625,
-0.04217529296875,
-0.00977325439453125,
-0.0243377685546875,
-0.046295166015625,
-0.08404541015625,
0.03515625,
0.009552001953125,
0.044952392578125,
0.03753662109375,
-0.0037059783935546875,
0.036590576171875,
-0.047027587890625,
0.06890869140625,
0.041290283203125,
-0.080322265625,
0.043853759765625,
-0.018096923828125,
0.045257568359375,
0.0316162109375,
0.0438232421875,
-0.03546142578125,
0.00836181640625,
-0.058990478515625,
-0.05523681640625,
0.045166015625,
-0.0012598037719726562,
0.017333984375,
0.0019683837890625,
0.0218353271484375,
0.0264434814453125,
0.007312774658203125,
-0.047119140625,
-0.0175323486328125,
-0.01395416259765625,
0.001399993896484375,
0.044189453125,
-0.06390380859375,
-0.0016117095947265625,
-0.0323486328125,
0.03546142578125,
-0.00811004638671875,
0.052825927734375,
0.0302734375,
-0.016845703125,
-0.01284027099609375,
0.01088714599609375,
0.06671142578125,
0.043182373046875,
-0.0220947265625,
-0.0032901763916015625,
0.01336669921875,
-0.018798828125,
0.0014886856079101562,
-0.00798797607421875,
-0.03515625,
0.0194091796875,
0.0177764892578125,
0.0906982421875,
0.022674560546875,
-0.0296173095703125,
0.03778076171875,
-0.0020809173583984375,
-0.044830322265625,
-0.0294036865234375,
0.007358551025390625,
-0.01358795166015625,
0.025299072265625,
0.0185089111328125,
0.0264434814453125,
0.042694091796875,
-0.0032558441162109375,
0.03436279296875,
0.016998291015625,
-0.055419921875,
-0.0292816162109375,
0.03997802734375,
0.0141448974609375,
-0.04803466796875,
0.03302001953125,
-0.0248870849609375,
-0.052093505859375,
0.051483154296875,
0.03680419921875,
0.0728759765625,
-0.0016584396362304688,
0.03729248046875,
0.06158447265625,
0.01128387451171875,
0.00022518634796142578,
0.0152740478515625,
-0.00383758544921875,
-0.044097900390625,
-0.030181884765625,
-0.036590576171875,
-0.046539306640625,
0.00931549072265625,
-0.057464599609375,
0.04351806640625,
-0.033599853515625,
-0.017913818359375,
0.0129852294921875,
-0.0223541259765625,
-0.05279541015625,
0.01268768310546875,
0.023345947265625,
0.102294921875,
-0.050994873046875,
0.045867919921875,
0.04669189453125,
-0.03179931640625,
-0.041046142578125,
-0.01544189453125,
-0.016937255859375,
-0.042449951171875,
0.019866943359375,
0.032196044921875,
-0.003932952880859375,
-0.01030731201171875,
-0.1053466796875,
-0.059722900390625,
0.0673828125,
0.0281829833984375,
-0.037261962890625,
0.0124053955078125,
-0.0167083740234375,
0.0217742919921875,
-0.044342041015625,
0.01021575927734375,
0.020233154296875,
0.018798828125,
0.047698974609375,
-0.022125244140625,
-0.01824951171875,
-0.020660400390625,
0.0265350341796875,
0.0304718017578125,
-0.054595947265625,
0.0860595703125,
-0.02227783203125,
-0.0217742919921875,
0.0174713134765625,
0.051727294921875,
0.00859832763671875,
0.01360321044921875,
0.042388916015625,
0.040924072265625,
0.01161956787109375,
-0.0101165771484375,
0.06878662109375,
-0.0008764266967773438,
0.06646728515625,
0.06353759765625,
-0.00806427001953125,
0.0297088623046875,
0.0254669189453125,
-0.0218963623046875,
0.048614501953125,
0.05230712890625,
-0.052032470703125,
0.048553466796875,
-0.01184844970703125,
-0.00499725341796875,
-0.02239990234375,
-0.01345062255859375,
-0.0080718994140625,
0.0215911865234375,
0.0251617431640625,
-0.0384521484375,
-0.050506591796875,
0.01499176025390625,
0.0025539398193359375,
-0.005428314208984375,
-0.0269927978515625,
0.04901123046875,
0.004161834716796875,
-0.0216064453125,
0.03582763671875,
0.0017910003662109375,
0.032135009765625,
-0.03485107421875,
-0.0152740478515625,
0.00469207763671875,
0.0023365020751953125,
-0.0049285888671875,
-0.0791015625,
0.054046630859375,
0.01336669921875,
-0.029541015625,
0.012847900390625,
0.06658935546875,
-0.0115203857421875,
-0.051177978515625,
0.0090789794921875,
-0.01549530029296875,
0.01800537109375,
-0.007755279541015625,
-0.0733642578125,
0.023712158203125,
-0.0035228729248046875,
-0.0096282958984375,
0.0027008056640625,
0.022125244140625,
0.018707275390625,
0.0298614501953125,
0.0258636474609375,
-0.0418701171875,
-0.00024819374084472656,
0.0031337738037109375,
0.072998046875,
-0.043609619140625,
-0.027557373046875,
-0.054229736328125,
0.057891845703125,
-0.018829345703125,
-0.020843505859375,
0.029327392578125,
0.034027099609375,
0.039794921875,
-0.03900146484375,
0.048583984375,
-0.01119232177734375,
0.03564453125,
-0.0311737060546875,
0.017425537109375,
-0.036773681640625,
-0.0240478515625,
-0.01959228515625,
-0.0511474609375,
-0.035247802734375,
0.04962158203125,
-0.0255584716796875,
-0.01238250732421875,
0.053558349609375,
0.047576904296875,
-0.012481689453125,
-0.01070404052734375,
0.042205810546875,
-0.0241241455078125,
0.033203125,
0.01277923583984375,
0.041595458984375,
-0.060577392578125,
0.048553466796875,
-0.038726806640625,
-0.0050811767578125,
-0.01593017578125,
-0.06585693359375,
-0.07757568359375,
-0.0277862548828125,
-0.037384033203125,
-0.005031585693359375,
-0.0272064208984375,
0.05694580078125,
0.0765380859375,
-0.06317138671875,
0.0084991455078125,
-0.007049560546875,
0.010986328125,
0.005886077880859375,
-0.01422119140625,
0.01959228515625,
-0.0155487060546875,
-0.061492919921875,
-0.00033593177795410156,
0.033782958984375,
0.0278472900390625,
-0.0272979736328125,
0.00821685791015625,
-0.0196380615234375,
0.0255584716796875,
0.030487060546875,
0.0303802490234375,
-0.0242767333984375,
-0.037933349609375,
0.01120758056640625,
-0.006198883056640625,
0.040924072265625,
0.07452392578125,
-0.0277862548828125,
0.0149688720703125,
0.059967041015625,
-0.005428314208984375,
0.056671142578125,
0.00907135009765625,
0.023712158203125,
-0.05926513671875,
0.037933349609375,
0.00263214111328125,
0.03900146484375,
0.0224456787109375,
-0.0271148681640625,
0.0222625732421875,
0.00881195068359375,
-0.04559326171875,
-0.0290374755859375,
0.033416748046875,
-0.08685302734375,
-0.021331787109375,
0.06158447265625,
-0.0153045654296875,
-0.0491943359375,
0.034759521484375,
-0.04876708984375,
0.016571044921875,
-0.015655517578125,
0.039306640625,
0.029937744140625,
-0.006313323974609375,
-0.072021484375,
0.00035262107849121094,
0.0172271728515625,
-0.0273895263671875,
-0.054351806640625,
-0.049407958984375,
0.0263671875,
0.0260467529296875,
0.038330078125,
0.01708984375,
-0.0323486328125,
0.03350830078125,
0.0098419189453125,
0.037109375,
-0.0087127685546875,
-0.026031494140625,
-0.0159759521484375,
0.025665283203125,
-0.0031185150146484375,
-0.0465087890625
]
] |
EleutherAI/pythia-2.8b | 2023-06-09T00:35:37.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:EleutherAI/pile",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-2.8b | 9 | 962,527 | transformers | 2023-02-13T14:37:12 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/pile
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-2.8B
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use the models.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M          | 18,915,328           | 6      | 512       | 8     | 2M         | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B         | 805,736,448          | 16     | 2048      | 8     | 2M         | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B          | 11,327,027,200       | 36     | 5120      | 40    | 2M         | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
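The checkpoint schedule described above is regular enough to enumerate programmatically. A minimal sketch of the 154 branch names (the `step{N}` naming follows the description above):

```python
# Enumerate the 154 checkpoint branch names described above: step0,
# ten log-spaced checkpoints (step1 ... step512), and 143 evenly
# spaced checkpoints (step1000 ... step143000).
log_spaced = [2 ** i for i in range(10)]        # 1, 2, 4, ..., 512
evenly_spaced = list(range(1000, 143001, 1000))  # 1000, 2000, ..., 143000
branches = [f"step{n}" for n in [0] + log_spaced + evenly_spaced]

assert len(branches) == 154
assert branches[-1] == "step143000"  # same checkpoint as the `main` branch
```

Any of these names can be passed as the `revision` argument when loading a model, as shown in the Quickstart section.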
You may also further fine-tune and adapt Pythia-2.8B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-2.8B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-2.8B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-2.8B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token the model selects need not produce the
most “accurate” text. Never rely on Pythia-2.8B to produce factually accurate
output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-2.8B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-2.8B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

# Load the model weights from the "step3000" checkpoint branch.
model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

# The tokenizer is the same at every checkpoint; pinning the revision
# simply keeps the download reproducible.
tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

# Tokenize a prompt and generate a continuation.
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
## Training
### Training data
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-2.8B.
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for 143000 steps at a batch size
of 2M (2,097,152 tokens).<br>
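The training figures above are internally consistent, which a few lines of arithmetic confirm:

```python
tokens_per_step = 2_097_152        # batch size of 2M tokens
total_steps = 143_000
steps_between_checkpoints = 1_000

# Total tokens seen during training.
total_tokens = tokens_per_step * total_steps
assert total_tokens == 299_892_736_000

# Spacing between the 143 evenly spaced checkpoints, in tokens.
tokens_between_checkpoints = tokens_per_step * steps_between_checkpoints
assert tokens_between_checkpoints == 2_097_152_000
```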
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and
12B models used an LR schedule which decayed to a minimum LR of 0. In
the retrained runs, we rectified this inconsistency: all models are now
trained with the LR decaying to a minimum of 0.1× their maximum LR.
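Concretely, for the 2.8B model the corrected schedule implies the following LR floor (a small illustration using the 1.6 x 10<sup>-4</sup> maximum LR from the engineering table earlier in this card):

```python
max_lr = 1.6e-4        # Pythia-2.8B maximum LR, from the table above
min_lr = 0.1 * max_lr  # corrected schedule decays to 10% of the maximum
assert abs(min_lr - 1.6e-5) < 1e-12
```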
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
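The difference between the total and non-embedding counts in the table below is the embedding parameters. As a sketch of how the numbers fit together for the 2.8B model (the 50,304 padded vocabulary size is an assumption carried over from the GPT-NeoX tokenizer setup, not stated in this card):

```python
total_params = 2_775_208_960   # Pythia-2.8B total, from the table below
non_embedding = 2_517_652_480  # Pythia-2.8B non-embedding, from the table below
embedding_params = total_params - non_embedding

vocab_size = 50_304  # assumed padded GPT-NeoX vocabulary size
model_dim = 2_560    # Pythia-2.8B model dimension
# Untied input and output embeddings: two vocab-by-dim matrices.
assert embedding_params == 2 * vocab_size * model_dim  # 257,556,480
```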
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 13,570 | [
[
-0.0238037109375,
-0.0594482421875,
0.0245819091796875,
0.00455474853515625,
-0.0186309814453125,
-0.0153961181640625,
-0.01763916015625,
-0.035491943359375,
0.01505279541015625,
0.01201629638671875,
-0.0260162353515625,
-0.020263671875,
-0.03289794921875,
-0.006290435791015625,
-0.034027099609375,
0.0865478515625,
-0.01027679443359375,
-0.0107879638671875,
0.00812530517578125,
-0.006137847900390625,
-0.004596710205078125,
-0.039581298828125,
-0.0328369140625,
-0.0302734375,
0.048980712890625,
0.0130767822265625,
0.0660400390625,
0.043243408203125,
0.01245880126953125,
0.021087646484375,
-0.027069091796875,
-0.0037860870361328125,
-0.01213836669921875,
-0.00664520263671875,
-0.001094818115234375,
-0.0210113525390625,
-0.052734375,
0.0019588470458984375,
0.04974365234375,
0.0482177734375,
-0.01377105712890625,
0.01922607421875,
-0.0011997222900390625,
0.0270233154296875,
-0.03875732421875,
0.0025196075439453125,
-0.0231170654296875,
-0.01313018798828125,
-0.007129669189453125,
0.0106658935546875,
-0.0291595458984375,
-0.0250091552734375,
0.03448486328125,
-0.050506591796875,
0.019073486328125,
0.0079345703125,
0.09033203125,
-0.0089263916015625,
-0.0323486328125,
-0.00559234619140625,
-0.052154541015625,
0.05010986328125,
-0.053253173828125,
0.0252227783203125,
0.021392822265625,
0.01271820068359375,
-0.003490447998046875,
-0.068603515625,
-0.040435791015625,
-0.0159454345703125,
-0.00982666015625,
-0.0020275115966796875,
-0.04644775390625,
0.0015287399291992188,
0.03765869140625,
0.047576904296875,
-0.0634765625,
-0.0016832351684570312,
-0.0294952392578125,
-0.0261688232421875,
0.0262603759765625,
0.0028514862060546875,
0.034423828125,
-0.02239990234375,
0.001071929931640625,
-0.0293426513671875,
-0.050445556640625,
-0.0165863037109375,
0.04205322265625,
0.004276275634765625,
-0.0286102294921875,
0.03778076171875,
-0.0268096923828125,
0.040069580078125,
-0.00556182861328125,
0.019561767578125,
0.03314208984375,
-0.01474761962890625,
-0.036376953125,
-0.00887298583984375,
0.070068359375,
0.01224517822265625,
0.0161895751953125,
-0.0012407302856445312,
-0.0033130645751953125,
0.00540924072265625,
0.0026378631591796875,
-0.08477783203125,
-0.061798095703125,
0.01904296875,
-0.0298004150390625,
-0.0323486328125,
-0.01288604736328125,
-0.070556640625,
-0.01519775390625,
-0.013641357421875,
0.04193115234375,
-0.037139892578125,
-0.055023193359375,
-0.01349639892578125,
-0.00005251169204711914,
0.0172119140625,
0.0277099609375,
-0.07244873046875,
0.0308685302734375,
0.033660888671875,
0.07745361328125,
0.0171661376953125,
-0.04229736328125,
-0.01532745361328125,
-0.0202484130859375,
-0.0095977783203125,
0.02862548828125,
-0.0082244873046875,
-0.01377105712890625,
-0.00787353515625,
0.0122833251953125,
-0.00749969482421875,
-0.0270233154296875,
0.0293731689453125,
-0.0302886962890625,
0.020050048828125,
-0.02093505859375,
-0.0330810546875,
-0.0280609130859375,
0.005130767822265625,
-0.046234130859375,
0.065673828125,
0.0185394287109375,
-0.072265625,
0.0164031982421875,
-0.016021728515625,
-0.004802703857421875,
-0.003925323486328125,
0.014007568359375,
-0.052581787109375,
0.003452301025390625,
0.0261993408203125,
0.0022869110107421875,
-0.0301055908203125,
0.014556884765625,
-0.0196075439453125,
-0.03228759765625,
0.013397216796875,
-0.03839111328125,
0.06884765625,
0.014373779296875,
-0.049560546875,
0.020294189453125,
-0.04119873046875,
0.0176239013671875,
0.0186309814453125,
-0.0257110595703125,
0.004608154296875,
-0.012908935546875,
0.0299530029296875,
0.016204833984375,
0.01262664794921875,
-0.0259857177734375,
0.02117919921875,
-0.038604736328125,
0.055694580078125,
0.055389404296875,
-0.00601959228515625,
0.035491943359375,
-0.0321044921875,
0.033172607421875,
0.0029926300048828125,
0.01453399658203125,
-0.00414276123046875,
-0.048553466796875,
-0.07452392578125,
-0.0198974609375,
0.028228759765625,
0.0228729248046875,
-0.03448486328125,
0.031768798828125,
-0.01751708984375,
-0.065673828125,
-0.013275146484375,
-0.006343841552734375,
0.0323486328125,
0.0205841064453125,
0.031982421875,
-0.0129852294921875,
-0.040924072265625,
-0.06695556640625,
-0.01678466796875,
-0.03338623046875,
0.00966644287109375,
0.01471710205078125,
0.07098388671875,
-0.00787353515625,
0.04315185546875,
-0.026702880859375,
0.019195556640625,
-0.0270843505859375,
0.0133514404296875,
0.031951904296875,
0.04541015625,
0.0301513671875,
-0.042205810546875,
-0.0287933349609375,
0.0022602081298828125,
-0.044677734375,
0.006488800048828125,
0.003849029541015625,
-0.0239105224609375,
0.023223876953125,
0.005367279052734375,
-0.07501220703125,
0.0350341796875,
0.047637939453125,
-0.041412353515625,
0.060821533203125,
-0.0244903564453125,
-0.0022258758544921875,
-0.079833984375,
0.019317626953125,
0.010589599609375,
-0.0167694091796875,
-0.04656982421875,
0.007450103759765625,
0.014923095703125,
-0.01488494873046875,
-0.030487060546875,
0.045928955078125,
-0.040985107421875,
-0.01351165771484375,
-0.017059326171875,
0.00447845458984375,
-0.0024127960205078125,
0.04718017578125,
0.01096343994140625,
0.043609619140625,
0.060089111328125,
-0.056243896484375,
0.032257080078125,
0.016571044921875,
-0.019989013671875,
0.0274810791015625,
-0.06707763671875,
0.01435089111328125,
0.005771636962890625,
0.033203125,
-0.04278564453125,
-0.0286102294921875,
0.04034423828125,
-0.04486083984375,
0.01168060302734375,
-0.031494140625,
-0.03985595703125,
-0.0322265625,
-0.01374053955078125,
0.045684814453125,
0.05950927734375,
-0.045867919921875,
0.051727294921875,
0.0040283203125,
0.00909423828125,
-0.0269012451171875,
-0.042724609375,
-0.019134521484375,
-0.038909912109375,
-0.05010986328125,
0.028076171875,
0.0124359130859375,
-0.013641357421875,
0.0008363723754882812,
-0.0000597834587097168,
0.0074310302734375,
-0.00496673583984375,
0.0246429443359375,
0.0253143310546875,
-0.003208160400390625,
0.00024044513702392578,
-0.00879669189453125,
-0.01018524169921875,
0.00014853477478027344,
-0.03948974609375,
0.07086181640625,
-0.0206298828125,
-0.01313018798828125,
-0.0614013671875,
0.0007677078247070312,
0.06689453125,
-0.03228759765625,
0.06658935546875,
0.04693603515625,
-0.053619384765625,
0.011962890625,
-0.02813720703125,
-0.022003173828125,
-0.0328369140625,
0.0513916015625,
-0.0190277099609375,
-0.0274200439453125,
0.0467529296875,
0.0211181640625,
0.0220184326171875,
0.042877197265625,
0.05621337890625,
0.0152130126953125,
0.0899658203125,
0.033966064453125,
-0.0112457275390625,
0.04742431640625,
-0.03912353515625,
0.0204315185546875,
-0.08477783203125,
-0.0157470703125,
-0.037506103515625,
-0.0187530517578125,
-0.071533203125,
-0.0227203369140625,
0.024566650390625,
0.01837158203125,
-0.055511474609375,
0.04205322265625,
-0.04107666015625,
0.004497528076171875,
0.04913330078125,
0.0198974609375,
0.0114898681640625,
0.016632080078125,
0.006282806396484375,
-0.0052947998046875,
-0.04901123046875,
-0.024658203125,
0.0943603515625,
0.03863525390625,
0.042572021484375,
0.0238494873046875,
0.052734375,
-0.01062774658203125,
0.0185394287109375,
-0.05316162109375,
0.0300140380859375,
0.025665283203125,
-0.054412841796875,
-0.01507568359375,
-0.05706787109375,
-0.069580078125,
0.037628173828125,
0.00627899169921875,
-0.08544921875,
0.0184173583984375,
0.0178680419921875,
-0.0288848876953125,
0.035491943359375,
-0.048126220703125,
0.0751953125,
-0.0169525146484375,
-0.036773681640625,
-0.02679443359375,
-0.0242919921875,
0.017333984375,
0.0271453857421875,
0.0091094970703125,
0.00672149658203125,
0.022430419921875,
0.07568359375,
-0.05230712890625,
0.04840087890625,
-0.0096282958984375,
0.01232147216796875,
0.0260467529296875,
0.02093505859375,
0.052093505859375,
0.01078033447265625,
0.010345458984375,
-0.0037250518798828125,
0.01172637939453125,
-0.04290771484375,
-0.0305328369140625,
0.06842041015625,
-0.0845947265625,
-0.027618408203125,
-0.05999755859375,
-0.044464111328125,
0.008056640625,
0.0140838623046875,
0.0330810546875,
0.049163818359375,
-0.0012178421020507812,
0.0000776052474975586,
0.04473876953125,
-0.0364990234375,
0.0263824462890625,
0.0140838623046875,
-0.03656005859375,
-0.03985595703125,
0.0753173828125,
0.0021190643310546875,
0.0255126953125,
-0.00004655122756958008,
0.016815185546875,
-0.030792236328125,
-0.034332275390625,
-0.0465087890625,
0.041015625,
-0.054473876953125,
0.00043511390686035156,
-0.053314208984375,
-0.002277374267578125,
-0.034393310546875,
0.00801849365234375,
-0.030670166015625,
-0.0284271240234375,
-0.0181732177734375,
-0.00209808349609375,
0.044219970703125,
0.03656005859375,
0.0072021484375,
0.02606201171875,
-0.03955078125,
-0.004459381103515625,
0.0169830322265625,
0.006744384765625,
0.009124755859375,
-0.0693359375,
-0.00867462158203125,
0.009368896484375,
-0.032958984375,
-0.08685302734375,
0.037506103515625,
-0.0034427642822265625,
0.025299072265625,
0.0053863525390625,
-0.018707275390625,
0.0440673828125,
-0.00615692138671875,
0.050811767578125,
0.0128631591796875,
-0.0758056640625,
0.041839599609375,
-0.03790283203125,
0.02154541015625,
0.0258026123046875,
0.0249176025390625,
-0.054229736328125,
-0.005767822265625,
-0.0751953125,
-0.081787109375,
0.056610107421875,
0.038360595703125,
0.012481689453125,
0.00968170166015625,
0.0311431884765625,
-0.0360107421875,
0.01177215576171875,
-0.076904296875,
-0.0227508544921875,
-0.019744873046875,
-0.00643157958984375,
0.01250457763671875,
-0.00412750244140625,
0.00426483154296875,
-0.04156494140625,
0.0762939453125,
0.0041351318359375,
0.0269317626953125,
0.0211639404296875,
-0.0293121337890625,
-0.0091705322265625,
-0.00434112548828125,
0.01171112060546875,
0.05682373046875,
-0.00928497314453125,
0.0038471221923828125,
0.016815185546875,
-0.041534423828125,
0.003055572509765625,
0.01331329345703125,
-0.0292510986328125,
-0.005451202392578125,
0.0142669677734375,
0.06640625,
0.0108795166015625,
-0.0297088623046875,
0.0178070068359375,
-0.00249481201171875,
-0.00597381591796875,
-0.0230865478515625,
-0.0136871337890625,
0.0136871337890625,
0.0156097412109375,
-0.00209808349609375,
-0.0130767822265625,
0.000059664249420166016,
-0.06689453125,
0.003879547119140625,
0.01378631591796875,
-0.01090240478515625,
-0.031402587890625,
0.044403076171875,
0.0029754638671875,
-0.0152435302734375,
0.085693359375,
-0.02001953125,
-0.05126953125,
0.05804443359375,
0.03814697265625,
0.05462646484375,
-0.01476287841796875,
0.026641845703125,
0.067138671875,
0.0239410400390625,
-0.015655517578125,
0.005428314208984375,
0.00849151611328125,
-0.0382080078125,
-0.007106781005859375,
-0.060089111328125,
-0.017364501953125,
0.0181884765625,
-0.04486083984375,
0.0323486328125,
-0.048614501953125,
-0.006778717041015625,
-0.0026111602783203125,
0.018341064453125,
-0.042388916015625,
0.0234527587890625,
0.01219940185546875,
0.05340576171875,
-0.07012939453125,
0.060638427734375,
0.049346923828125,
-0.05474853515625,
-0.08441162109375,
0.0019350051879882812,
0.003345489501953125,
-0.03271484375,
0.01050567626953125,
0.016143798828125,
0.01715087890625,
0.01207733154296875,
-0.0213470458984375,
-0.06640625,
0.0985107421875,
0.0170440673828125,
-0.049957275390625,
-0.0216217041015625,
-0.009979248046875,
0.040863037109375,
0.00499725341796875,
0.05517578125,
0.054473876953125,
0.031280517578125,
0.00600433349609375,
-0.0802001953125,
0.027313232421875,
-0.027008056640625,
-0.0034427642822265625,
0.0177764892578125,
-0.052581787109375,
0.09710693359375,
-0.006618499755859375,
-0.0007081031799316406,
0.03271484375,
0.04742431640625,
0.03125,
-0.007633209228515625,
0.0260162353515625,
0.05908203125,
0.06732177734375,
-0.0289306640625,
0.09423828125,
-0.0226593017578125,
0.058990478515625,
0.0645751953125,
0.01513671875,
0.039337158203125,
0.03021240234375,
-0.028717041015625,
0.039703369140625,
0.0648193359375,
-0.006580352783203125,
0.01422119140625,
0.0185089111328125,
-0.021575927734375,
-0.019683837890625,
0.00765228271484375,
-0.045501708984375,
0.013275146484375,
0.0111083984375,
-0.04296875,
-0.01435089111328125,
-0.026519775390625,
0.0276641845703125,
-0.0318603515625,
-0.017486572265625,
0.01922607421875,
0.00830078125,
-0.04864501953125,
0.045379638671875,
0.0194244384765625,
0.041015625,
-0.033294677734375,
0.0116729736328125,
-0.01171112060546875,
0.02337646484375,
-0.0272369384765625,
-0.0318603515625,
0.005901336669921875,
0.00008374452590942383,
0.00476837158203125,
0.0116729736328125,
0.0309906005859375,
-0.0105743408203125,
-0.043121337890625,
0.0145416259765625,
0.036468505859375,
0.02081298828125,
-0.0322265625,
-0.05145263671875,
0.00748443603515625,
-0.01219940185546875,
-0.04034423828125,
0.0330810546875,
0.01904296875,
-0.0106048583984375,
0.043243408203125,
0.048431396484375,
0.002681732177734375,
-0.00020873546600341797,
0.01132965087890625,
0.07373046875,
-0.03326416015625,
-0.036834716796875,
-0.06756591796875,
0.0362548828125,
-0.00005447864532470703,
-0.05133056640625,
0.06451416015625,
0.04241943359375,
0.052398681640625,
0.0209808349609375,
0.04522705078125,
-0.034088134765625,
-0.0008654594421386719,
-0.0225677490234375,
0.04974365234375,
-0.03741455078125,
0.001354217529296875,
-0.0364990234375,
-0.08624267578125,
-0.0044097900390625,
0.0716552734375,
-0.039459228515625,
0.0308990478515625,
0.06134033203125,
0.061431884765625,
-0.0062408447265625,
0.006988525390625,
0.0030498504638671875,
0.023040771484375,
0.0391845703125,
0.070068359375,
0.06646728515625,
-0.053680419921875,
0.04107666015625,
-0.03741455078125,
-0.0207366943359375,
-0.0123291015625,
-0.03466796875,
-0.06451416015625,
-0.034210205078125,
-0.036865234375,
-0.05828857421875,
0.0021991729736328125,
0.06805419921875,
0.056915283203125,
-0.046844482421875,
-0.01215362548828125,
-0.03961181640625,
0.0026798248291015625,
-0.018707275390625,
-0.017242431640625,
0.033111572265625,
0.01059722900390625,
-0.07073974609375,
-0.0032329559326171875,
-0.01139068603515625,
0.008636474609375,
-0.031768798828125,
-0.023712158203125,
-0.01232147216796875,
-0.00913238525390625,
0.00463104248046875,
0.0247650146484375,
-0.0401611328125,
-0.02032470703125,
0.0008969306945800781,
0.005218505859375,
0.0007596015930175781,
0.053985595703125,
-0.041473388671875,
0.00836944580078125,
0.045379638671875,
0.007274627685546875,
0.0611572265625,
-0.021240234375,
0.0308685302734375,
-0.0189208984375,
0.0264434814453125,
0.019805908203125,
0.048187255859375,
0.025634765625,
-0.019927978515625,
0.0118255615234375,
0.031005859375,
-0.05535888671875,
-0.0657958984375,
0.027496337890625,
-0.052581787109375,
-0.00554656982421875,
0.09552001953125,
-0.020782470703125,
-0.0301361083984375,
0.0048065185546875,
-0.0143890380859375,
0.041107177734375,
-0.0222930908203125,
0.049163818359375,
0.0467529296875,
0.005489349365234375,
-0.0170135498046875,
-0.04949951171875,
0.0286407470703125,
0.050811767578125,
-0.061065673828125,
0.0287017822265625,
0.04534912109375,
0.045623779296875,
0.017791748046875,
0.044219970703125,
-0.0222320556640625,
0.048126220703125,
0.007312774658203125,
0.005428314208984375,
0.0018453598022460938,
-0.036651611328125,
-0.032501220703125,
-0.0111541748046875,
0.0174102783203125,
0.003173828125
]
] |
laion/CLIP-ViT-B-16-laion2B-s34B-b88K | 2023-04-19T18:55:10.000Z | [
"open_clip",
"zero-shot-image-classification",
"arxiv:1910.04867",
"license:mit",
"has_space",
"region:us"
] | zero-shot-image-classification | laion | null | null | laion/CLIP-ViT-B-16-laion2B-s34B-b88K | 18 | 959,561 | open_clip | 2023-01-03T00:16:18 | ---
license: mit
pipeline_tag: zero-shot-image-classification
library_name: open_clip
---
# Model Card for CLIP ViT-B/16 - LAION-2B
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)
# Model Details
## Model Description
A CLIP ViT-B/16 model trained with the LAION-2B English subset of LAION-5B (https://laion.ai/blog/laion-5b/) using OpenCLIP (https://github.com/mlfoundations/open_clip).
Model training was done by Mehdi Cherti on the [JUWELS Booster](https://apps.fz-juelich.de/jsc/hps/juwels/booster-overview.html) supercomputer. See acknowledgements below.
# Uses
As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models.
The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the LAION-5B blog (https://laion.ai/blog/laion-5b/) and upcoming paper include additional discussion as it relates specifically to the training dataset.
## Direct Use
Zero-shot image classification, image and text retrieval, among others.
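The zero-shot classification use case rests on a simple mechanism: the image and each candidate label prompt are embedded into a shared space, and a softmax over their scaled cosine similarities yields class scores. The sketch below illustrates only that mechanism with random stand-in embeddings — the embedding width and the logit scale of 100 are assumptions for illustration, not values read from this checkpoint:

```python
import numpy as np

# Toy sketch of the zero-shot classification mechanism (no real model call):
# an image embedding is compared with one text embedding per candidate label
# by cosine similarity, and a softmax over the similarities gives class scores.
rng = np.random.default_rng(0)
image_embed = rng.normal(size=512)        # stand-in for the image encoder output
text_embeds = rng.normal(size=(3, 512))   # stand-ins for 3 label prompt embeddings

# L2-normalize so the dot product is a cosine similarity
image_embed /= np.linalg.norm(image_embed)
text_embeds /= np.linalg.norm(text_embeds, axis=1, keepdims=True)

logit_scale = 100.0                       # assumed temperature, in the range CLIP learns
logits = logit_scale * text_embeds @ image_embed
probs = np.exp(logits - logits.max())     # numerically stable softmax
probs /= probs.sum()
print(probs)                              # scores over the 3 candidate labels, sum to 1
```

With the real model, `image_embed` and `text_embeds` would come from the trained image and text encoders instead of a random generator.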
## Downstream Use
Image classification and other image task fine-tuning, linear probe image classification, image generation guiding and conditioning, among others.
## Out-of-Scope Use
As per the OpenAI models,
**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task specific testing especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of performance of the model. This is because the use of artificial intelligence for tasks such as these can be premature currently given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
Beyond the above notice, the LAION-5B dataset used in training these models has additional considerations; see below.
# Training Details
## Training Data
This model was trained with the 2-billion-sample English subset of LAION-5B (https://laion.ai/blog/laion-5b/).
**IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated. Keep in mind that the uncurated nature of the dataset means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a "safe" subset by filtering out samples based on the safety tags (using a customized trained NSFW classifier that we built). While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning applies there as well. We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come along with training large-scale models, as well as of the pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. While we provide our dataset openly, we do not recommend using it for creating ready-to-go industrial products, as the basic research about general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress.
## Training Procedure
TODO
# Evaluation
Evaluation done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark).
## Testing Data, Factors & Metrics
### Testing Data
The testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) with additional robustness datasets) for classification, and with COCO and Flickr for retrieval.
## Results
The model achieves 70.2% zero-shot top-1 accuracy on ImageNet-1k.
An initial round of benchmarks has been performed on a wider range of datasets, currently viewable at https://github.com/LAION-AI/CLIP_benchmark/blob/main/benchmark/results.ipynb
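For clarity, the top-1 figure above is the fraction of test images whose highest-scoring candidate label matches the ground-truth class. A minimal illustration of the metric itself, on made-up logits rather than real model outputs:

```python
import numpy as np

# Made-up logits for 4 images over 3 classes; not produced by the actual model.
logits = np.array([
    [0.1, 0.7, 0.2],    # predicted class 1
    [0.9, 0.05, 0.05],  # predicted class 0
    [0.3, 0.3, 0.4],    # predicted class 2
    [0.6, 0.3, 0.1],    # predicted class 0
])
labels = np.array([1, 0, 1, 0])  # ground-truth classes

# top-1 accuracy: how often the argmax prediction equals the label
top1 = (logits.argmax(axis=1) == labels).mean()
print(f"top-1 accuracy: {top1:.1%}")  # 75.0%
```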
# Acknowledgements
We acknowledge the Gauss Centre for Supercomputing e.V. (http://gauss-centre.eu) for funding this part of the work by providing computing time through the John von Neumann Institute for Computing (NIC) on the GCS Supercomputer JUWELS Booster at Jülich Supercomputing Centre (JSC).
# Citation
**BibTeX:**
LAION-5B
```bibtex
@inproceedings{schuhmann2022laionb,
title={{LAION}-5B: An open large-scale dataset for training next generation image-text models},
author={Christoph Schuhmann and
Romain Beaumont and
Richard Vencu and
Cade W Gordon and
Ross Wightman and
Mehdi Cherti and
Theo Coombes and
Aarush Katta and
Clayton Mullis and
Mitchell Wortsman and
Patrick Schramowski and
Srivatsa R Kundurthy and
Katherine Crowson and
Ludwig Schmidt and
Robert Kaczmarczyk and
Jenia Jitsev},
booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
year={2022},
url={https://openreview.net/forum?id=M3Y74vmsMcY}
}
```
OpenAI CLIP paper
```bibtex
@inproceedings{Radford2021LearningTV,
title={Learning Transferable Visual Models From Natural Language Supervision},
author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
booktitle={ICML},
year={2021}
}
```
OpenCLIP software
```bibtex
@software{ilharco_gabriel_2021_5143773,
author = {Ilharco, Gabriel and
Wortsman, Mitchell and
Wightman, Ross and
Gordon, Cade and
Carlini, Nicholas and
Taori, Rohan and
Dave, Achal and
Shankar, Vaishaal and
Namkoong, Hongseok and
Miller, John and
Hajishirzi, Hannaneh and
Farhadi, Ali and
Schmidt, Ludwig},
title = {OpenCLIP},
month = jul,
year = 2021,
note = {If you use this software, please cite it as below.},
publisher = {Zenodo},
version = {0.1},
doi = {10.5281/zenodo.5143773},
url = {https://doi.org/10.5281/zenodo.5143773}
}
``` | 7,755 | [
[
-0.0210418701171875,
-0.0391845703125,
0.0140838623046875,
0.0042724609375,
-0.0281829833984375,
-0.034698486328125,
-0.015045166015625,
-0.053375244140625,
-0.004058837890625,
0.031646728515625,
-0.031524658203125,
-0.039459228515625,
-0.042724609375,
-0.0032215118408203125,
-0.02947998046875,
0.0679931640625,
-0.01448822021484375,
-0.001476287841796875,
-0.02294921875,
-0.0305633544921875,
-0.035552978515625,
-0.045074462890625,
-0.035247802734375,
0.00351715087890625,
0.01375579833984375,
0.02032470703125,
0.044647216796875,
0.062744140625,
0.0550537109375,
0.0164947509765625,
-0.00598907470703125,
0.00548553466796875,
-0.0416259765625,
-0.037872314453125,
-0.00507354736328125,
-0.0227203369140625,
-0.042999267578125,
0.009033203125,
0.046234130859375,
0.0272369384765625,
-0.00215911865234375,
0.0206756591796875,
0.001079559326171875,
0.0306549072265625,
-0.06402587890625,
0.0182647705078125,
-0.04412841796875,
-0.00012350082397460938,
-0.019012451171875,
0.0048370361328125,
-0.0280914306640625,
-0.01067352294921875,
0.012054443359375,
-0.053466796875,
0.0167999267578125,
-0.01280975341796875,
0.10858154296875,
0.0142059326171875,
-0.0252685546875,
0.0150299072265625,
-0.0511474609375,
0.0596923828125,
-0.059295654296875,
0.0245361328125,
0.025146484375,
0.027587890625,
0.015838623046875,
-0.06536865234375,
-0.035400390625,
-0.00896453857421875,
0.011138916015625,
0.0231170654296875,
-0.0255126953125,
-0.0005249977111816406,
0.03485107421875,
0.01300811767578125,
-0.0250091552734375,
0.0020465850830078125,
-0.05426025390625,
-0.0013360977172851562,
0.0513916015625,
0.00032830238342285156,
0.0226593017578125,
-0.02557373046875,
-0.051239013671875,
-0.03143310546875,
-0.04718017578125,
0.0307769775390625,
0.02252197265625,
-0.00698089599609375,
-0.03485107421875,
0.030853271484375,
-0.0021915435791015625,
0.035858154296875,
-0.0081634521484375,
-0.0165252685546875,
0.037811279296875,
-0.038330078125,
-0.0241851806640625,
-0.0190582275390625,
0.08349609375,
0.0496826171875,
0.0140838623046875,
0.01029205322265625,
0.0006895065307617188,
-0.01114654541015625,
0.0207061767578125,
-0.07952880859375,
-0.00843048095703125,
-0.0033721923828125,
-0.04608154296875,
-0.024505615234375,
0.031890869140625,
-0.054351806640625,
0.0014162063598632812,
-0.00986480712890625,
0.038421630859375,
-0.0400390625,
-0.01477813720703125,
-0.0005235671997070312,
-0.006999969482421875,
0.0195159912109375,
0.017974853515625,
-0.048095703125,
0.0100250244140625,
0.025604248046875,
0.07891845703125,
-0.01490020751953125,
-0.0285491943359375,
-0.0181884765625,
0.0175628662109375,
-0.0275115966796875,
0.041015625,
-0.01641845703125,
-0.0229034423828125,
-0.005237579345703125,
0.027587890625,
-0.005283355712890625,
-0.041595458984375,
0.05029296875,
-0.0221099853515625,
0.0056915283203125,
-0.01349639892578125,
-0.018280029296875,
-0.043365478515625,
0.00508880615234375,
-0.048858642578125,
0.07135009765625,
-0.0009660720825195312,
-0.06695556640625,
0.024200439453125,
-0.04364013671875,
-0.0177001953125,
-0.0118255615234375,
-0.007659912109375,
-0.045928955078125,
-0.01531982421875,
0.041259765625,
0.043182373046875,
-0.0271453857421875,
0.035797119140625,
-0.043487548828125,
-0.0298614501953125,
0.022613525390625,
-0.04119873046875,
0.07781982421875,
0.0017881393432617188,
-0.02838134765625,
0.010284423828125,
-0.04718017578125,
-0.006755828857421875,
0.0190582275390625,
0.0024261474609375,
-0.019744873046875,
-0.0174560546875,
-0.000995635986328125,
0.019012451171875,
0.01202392578125,
-0.04376220703125,
0.0003802776336669922,
-0.0093536376953125,
0.034912109375,
0.058837890625,
0.0088958740234375,
0.0206756591796875,
-0.0292205810546875,
0.04095458984375,
0.0098114013671875,
0.04718017578125,
-0.02105712890625,
-0.03668212890625,
-0.0537109375,
-0.044647216796875,
0.032470703125,
0.04315185546875,
-0.05419921875,
0.030426025390625,
-0.0248870849609375,
-0.037384033203125,
-0.0305633544921875,
-0.00739288330078125,
0.039093017578125,
0.03851318359375,
0.034820556640625,
-0.034576416015625,
-0.034820556640625,
-0.0687255859375,
0.01299285888671875,
-0.00009799003601074219,
-0.0027942657470703125,
0.0537109375,
0.05279541015625,
-0.01020050048828125,
0.06585693359375,
-0.05291748046875,
-0.037933349609375,
-0.01025390625,
0.005825042724609375,
0.00881195068359375,
0.03289794921875,
0.0650634765625,
-0.0645751953125,
-0.036773681640625,
-0.0095977783203125,
-0.091552734375,
-0.0000413060188293457,
-0.006732940673828125,
-0.018798828125,
0.014801025390625,
0.0465087890625,
-0.040435791015625,
0.0545654296875,
0.0301513671875,
0.005184173583984375,
0.0377197265625,
-0.01337432861328125,
0.00036787986755371094,
-0.08575439453125,
0.0269317626953125,
0.0066375732421875,
-0.01055908203125,
-0.039031982421875,
-0.0034770965576171875,
0.007137298583984375,
-0.0223236083984375,
-0.06268310546875,
0.0462646484375,
-0.031036376953125,
0.00237274169921875,
-0.002254486083984375,
0.0002868175506591797,
0.00852203369140625,
0.04498291015625,
0.004894256591796875,
0.07098388671875,
0.055908203125,
-0.048095703125,
0.004299163818359375,
0.0272216796875,
-0.0294189453125,
0.032867431640625,
-0.07122802734375,
0.00498199462890625,
-0.0071563720703125,
0.0185699462890625,
-0.028411865234375,
-0.028778076171875,
0.0266265869140625,
-0.036041259765625,
0.0247344970703125,
-0.0205230712890625,
-0.0180816650390625,
-0.031890869140625,
-0.04449462890625,
0.044342041015625,
0.05047607421875,
-0.048980712890625,
0.0224609375,
0.032073974609375,
0.00870513916015625,
-0.05987548828125,
-0.053863525390625,
-0.01739501953125,
-0.0198516845703125,
-0.0499267578125,
0.033447265625,
-0.00626373291015625,
0.00678253173828125,
0.006496429443359375,
0.0091400146484375,
-0.011871337890625,
-0.00872039794921875,
0.05010986328125,
0.04150390625,
-0.0027904510498046875,
-0.00904083251953125,
-0.006221771240234375,
0.0010166168212890625,
-0.0029277801513671875,
-0.01410675048828125,
0.01555633544921875,
-0.0089569091796875,
-0.0200347900390625,
-0.04815673828125,
0.017974853515625,
0.041168212890625,
-0.032684326171875,
0.058624267578125,
0.057952880859375,
-0.03448486328125,
0.00079345703125,
-0.0264739990234375,
0.0005726814270019531,
-0.035980224609375,
0.038177490234375,
-0.00467681884765625,
-0.04547119140625,
0.03558349609375,
0.014312744140625,
-0.0064849853515625,
0.043121337890625,
0.0252685546875,
-0.006885528564453125,
0.0672607421875,
0.06585693359375,
-0.0012826919555664062,
0.0537109375,
-0.059814453125,
0.0124053955078125,
-0.07757568359375,
-0.025848388671875,
-0.01549530029296875,
-0.009613037109375,
-0.0401611328125,
-0.04241943359375,
0.048126220703125,
0.02545166015625,
-0.00974273681640625,
0.0309295654296875,
-0.0267791748046875,
0.0266571044921875,
0.042877197265625,
0.026397705078125,
0.00131988525390625,
0.001605987548828125,
-0.0034942626953125,
-0.005382537841796875,
-0.0521240234375,
-0.02911376953125,
0.09197998046875,
0.042755126953125,
0.0556640625,
-0.00655364990234375,
0.035064697265625,
0.01309967041015625,
0.0065155029296875,
-0.0491943359375,
0.052703857421875,
-0.03448486328125,
-0.04925537109375,
-0.0230712890625,
-0.0301513671875,
-0.06695556640625,
0.0005970001220703125,
-0.004405975341796875,
-0.05938720703125,
0.036224365234375,
0.0017147064208984375,
-0.0253143310546875,
0.03900146484375,
-0.045654296875,
0.07196044921875,
-0.024261474609375,
-0.028594970703125,
0.00234222412109375,
-0.055633544921875,
0.0350341796875,
0.0122528076171875,
0.00272369384765625,
-0.0142822265625,
0.00823211669921875,
0.0780029296875,
-0.0406494140625,
0.06787109375,
-0.01458740234375,
0.020355224609375,
0.047698974609375,
-0.0231170654296875,
0.01149749755859375,
0.0109100341796875,
0.0074310302734375,
0.058807373046875,
0.00312042236328125,
-0.0189056396484375,
-0.0287017822265625,
0.035675048828125,
-0.07073974609375,
-0.019561767578125,
-0.03704833984375,
-0.04193115234375,
0.0143585205078125,
0.0276641845703125,
0.053314208984375,
0.05291748046875,
-0.005157470703125,
0.0293121337890625,
0.043182373046875,
-0.0311431884765625,
0.04364013671875,
0.0191802978515625,
-0.00917816162109375,
-0.05255126953125,
0.08038330078125,
0.027374267578125,
0.025146484375,
0.012847900390625,
0.00336456298828125,
-0.006984710693359375,
-0.034698486328125,
-0.0377197265625,
0.024627685546875,
-0.0584716796875,
-0.0311126708984375,
-0.03900146484375,
-0.03875732421875,
-0.0301513671875,
-0.004077911376953125,
-0.0298614501953125,
-0.017242431640625,
-0.048431396484375,
-0.0037441253662109375,
0.02093505859375,
0.042022705078125,
-0.01132965087890625,
0.02105712890625,
-0.058990478515625,
0.0256500244140625,
0.0185546875,
0.0293426513671875,
0.0019702911376953125,
-0.050537109375,
-0.0242156982421875,
0.01366424560546875,
-0.04412841796875,
-0.048248291015625,
0.0276031494140625,
0.0241241455078125,
0.038360595703125,
0.04705810546875,
0.0095977783203125,
0.042877197265625,
-0.0323486328125,
0.0777587890625,
0.0240631103515625,
-0.06048583984375,
0.037841796875,
-0.041351318359375,
0.0216522216796875,
0.043060302734375,
0.055908203125,
-0.014312744140625,
0.0021724700927734375,
-0.0550537109375,
-0.0711669921875,
0.0721435546875,
0.01349639892578125,
0.004955291748046875,
0.010467529296875,
0.0248565673828125,
-0.0004911422729492188,
0.0090789794921875,
-0.0711669921875,
-0.0061492919921875,
-0.031890869140625,
-0.00954437255859375,
0.01224517822265625,
-0.024505615234375,
-0.00998687744140625,
-0.031219482421875,
0.061553955078125,
-0.021270751953125,
0.050872802734375,
0.0226593017578125,
-0.01309967041015625,
-0.0010290145874023438,
0.00201416015625,
0.03839111328125,
0.046295166015625,
-0.03192138671875,
-0.01418304443359375,
0.008056640625,
-0.048583984375,
-0.005615234375,
0.0136871337890625,
-0.0538330078125,
-0.00817108154296875,
0.03759765625,
0.0958251953125,
0.00836944580078125,
-0.049072265625,
0.07098388671875,
-0.004100799560546875,
-0.0290679931640625,
-0.023101806640625,
0.005855560302734375,
-0.02392578125,
0.016265869140625,
0.01415252685546875,
0.01158905029296875,
0.01153564453125,
-0.04150390625,
0.01323699951171875,
0.036712646484375,
-0.039581298828125,
-0.0328369140625,
0.061553955078125,
-0.0008459091186523438,
-0.005542755126953125,
0.046051025390625,
-0.01078033447265625,
-0.039337158203125,
0.050445556640625,
0.03802490234375,
0.0721435546875,
0.0003662109375,
0.02606201171875,
0.051300048828125,
0.021209716796875,
-0.012420654296875,
0.01192474365234375,
0.0108642578125,
-0.03961181640625,
-0.0095672607421875,
-0.031829833984375,
-0.0224151611328125,
0.0256805419921875,
-0.06768798828125,
0.038238525390625,
-0.048004150390625,
-0.0291900634765625,
-0.006694793701171875,
-0.031890869140625,
-0.0408935546875,
0.0149688720703125,
0.0154266357421875,
0.06878662109375,
-0.06097412109375,
0.05279541015625,
0.051361083984375,
-0.060150146484375,
-0.0660400390625,
0.0130767822265625,
-0.015350341796875,
-0.0311126708984375,
0.03363037109375,
0.039031982421875,
0.0006961822509765625,
-0.0257415771484375,
-0.06707763671875,
-0.07391357421875,
0.1048583984375,
0.04095458984375,
-0.01493072509765625,
-0.01026153564453125,
0.005420684814453125,
0.0322265625,
-0.0161590576171875,
0.034210205078125,
0.0160369873046875,
0.00975799560546875,
0.006305694580078125,
-0.07684326171875,
-0.00238037109375,
-0.025665283203125,
0.017425537109375,
-0.0006766319274902344,
-0.08404541015625,
0.07525634765625,
-0.019927978515625,
-0.0225372314453125,
0.00469970703125,
0.055999755859375,
-0.001308441162109375,
0.0262298583984375,
0.0275115966796875,
0.046600341796875,
0.03985595703125,
-0.0023345947265625,
0.08148193359375,
-0.0101776123046875,
0.0263214111328125,
0.08380126953125,
-0.0105438232421875,
0.07196044921875,
0.019287109375,
-0.0154571533203125,
0.03436279296875,
0.028228759765625,
-0.0272369384765625,
0.050628662109375,
-0.023529052734375,
0.00977325439453125,
-0.005916595458984375,
-0.035919189453125,
-0.0310821533203125,
0.041473388671875,
0.0006351470947265625,
-0.029144287109375,
0.0011930465698242188,
0.02557373046875,
0.001827239990234375,
-0.0167694091796875,
-0.0109100341796875,
0.039886474609375,
0.0134735107421875,
-0.039306640625,
0.06494140625,
0.00399017333984375,
0.0528564453125,
-0.051239013671875,
-0.0040283203125,
-0.0087890625,
0.0253753662109375,
-0.0146942138671875,
-0.057220458984375,
0.0185546875,
0.0005564689636230469,
-0.0139312744140625,
-0.0091400146484375,
0.055572509765625,
-0.0136871337890625,
-0.037353515625,
0.033233642578125,
-0.002117156982421875,
0.01230621337890625,
0.00507354736328125,
-0.046600341796875,
0.008575439453125,
0.0011568069458007812,
-0.006420135498046875,
0.029144287109375,
0.0162200927734375,
-0.021331787109375,
0.05078125,
0.04449462890625,
-0.006984710693359375,
0.01549530029296875,
-0.00478363037109375,
0.07122802734375,
-0.032623291015625,
-0.040252685546875,
-0.043792724609375,
0.041046142578125,
-0.016357421875,
-0.03515625,
0.058929443359375,
0.04296875,
0.080322265625,
-0.01349639892578125,
0.056671142578125,
-0.0161895751953125,
0.0196685791015625,
-0.047607421875,
0.05230712890625,
-0.050537109375,
0.008209228515625,
-0.035308837890625,
-0.0537109375,
-0.0123138427734375,
0.042022705078125,
-0.0224151611328125,
0.00823211669921875,
0.055633544921875,
0.05584716796875,
-0.0218963623046875,
-0.00609588623046875,
0.0178070068359375,
0.01361083984375,
0.0230560302734375,
0.041351318359375,
0.041351318359375,
-0.060516357421875,
0.051300048828125,
-0.05322265625,
-0.024078369140625,
-0.01293182373046875,
-0.06549072265625,
-0.08624267578125,
-0.047637939453125,
-0.0300140380859375,
-0.0117034912109375,
0.00310516357421875,
0.05120849609375,
0.072509765625,
-0.05645751953125,
-0.0228271484375,
0.01495361328125,
-0.0130767822265625,
-0.0242767333984375,
-0.0172119140625,
0.040191650390625,
0.0180206298828125,
-0.042877197265625,
0.01360321044921875,
0.015899658203125,
0.018280029296875,
-0.00473785400390625,
-0.0013055801391601562,
-0.03387451171875,
-0.0004858970642089844,
0.031494140625,
0.029388427734375,
-0.04388427734375,
-0.0155792236328125,
0.01202392578125,
0.004638671875,
0.020294189453125,
0.04180908203125,
-0.044647216796875,
0.032989501953125,
0.036773681640625,
0.03741455078125,
0.050933837890625,
0.01312255859375,
0.015228271484375,
-0.053375244140625,
0.0291900634765625,
0.00266265869140625,
0.025543212890625,
0.0279998779296875,
-0.0302581787109375,
0.050384521484375,
0.03167724609375,
-0.032989501953125,
-0.07183837890625,
-0.007476806640625,
-0.08416748046875,
-0.0102996826171875,
0.09063720703125,
-0.0372314453125,
-0.034912109375,
0.027587890625,
-0.0161590576171875,
0.0306549072265625,
-0.0257720947265625,
0.0298004150390625,
0.0318603515625,
0.0021915435791015625,
-0.027740478515625,
-0.0635986328125,
0.0238494873046875,
0.013397216796875,
-0.07122802734375,
-0.00994110107421875,
0.02947998046875,
0.027923583984375,
0.015838623046875,
0.042022705078125,
-0.0205535888671875,
0.0255126953125,
-0.005817413330078125,
0.0193328857421875,
-0.0289459228515625,
-0.050872802734375,
-0.03900146484375,
0.0015630722045898438,
-0.0196380615234375,
-0.032470703125
]
] |
Bingsu/clip-vit-base-patch32-ko | 2022-11-08T11:02:10.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"clip",
"zero-shot-image-classification",
"ko",
"arxiv:2004.09813",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | Bingsu | null | null | Bingsu/clip-vit-base-patch32-ko | 3 | 956,648 | transformers | 2022-09-16T05:18:05 | ---
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: ๊ธฐํ์น๋ ๊ณ ์์ด, ํผ์๋ธ ์น๋ ๊ฐ์์ง
example_title: Guitar, cat and dog
language: ko
license: mit
---
# clip-vit-base-patch32-ko
A Korean CLIP model trained using [Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation](https://arxiv.org/abs/2004.09813).
Training code: <https://github.com/Bing-su/KoCLIP_training_code>
Training data: all Korean-English parallel data available on AIHUB.
## How to Use
#### 1.
```python
import requests
import torch
from PIL import Image
from transformers import AutoModel, AutoProcessor
repo = "Bingsu/clip-vit-base-patch32-ko"
model = AutoModel.from_pretrained(repo)
processor = AutoProcessor.from_pretrained(repo)
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(text=["๊ณ ์์ด ๋ ๋ง๋ฆฌ", "๊ฐ ๋ ๋ง๋ฆฌ"], images=image, return_tensors="pt", padding=True)  # Korean labels: "two cats", "two dogs"
with torch.inference_mode():
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image
probs = logits_per_image.softmax(dim=1)
```
```python
>>> probs
tensor([[0.9926, 0.0074]])
```
#### 2.
```python
from transformers import pipeline
repo = "Bingsu/clip-vit-base-patch32-ko"
pipe = pipeline("zero-shot-image-classification", model=repo)
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
result = pipe(images=url, candidate_labels=["고양이 한 마리", "고양이 두 마리", "분홍색 소파에 드러누운 고양이 친구들"], hypothesis_template="{}")
```
```python
>>> result
[{'score': 0.9456236958503723, 'label': '분홍색 소파에 드러누운 고양이 친구들'},
 {'score': 0.05315302312374115, 'label': '고양이 두 마리'},
 {'score': 0.0012233294546604156, 'label': '고양이 한 마리'}]
```
## Tokenizer
The tokenizer was trained from the original CLIP tokenizer via `.train_new_from_iterator`, on a mixture of Korean and English data in a 7:3 ratio.
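As a minimal sketch of how `.train_new_from_iterator` retrains a fast tokenizer on a new corpus: the real card starts from the `openai/clip-vit-base-patch32` BPE tokenizer, but the toy base tokenizer, the seed corpus, and the tiny vocabulary size below are placeholders chosen so the example runs without a network download.

```python
from tokenizers import Tokenizer, models, pre_tokenizers, trainers
from transformers import PreTrainedTokenizerFast

# Toy stand-in for the original CLIP BPE tokenizer (assumption: the real card
# retrains from "openai/clip-vit-base-patch32"; this avoids downloading it).
base = Tokenizer(models.BPE(unk_token="[UNK]"))
base.pre_tokenizer = pre_tokenizers.Whitespace()
base.train_from_iterator(["a tiny seed corpus"], trainer=trainers.BpeTrainer(special_tokens=["[UNK]"]))
fast = PreTrainedTokenizerFast(tokenizer_object=base, unk_token="[UNK]")

# Mixed-language corpus iterator; the real training mixed Korean and English 7:3.
corpus = iter(["고양이 두 마리", "한국어 문장 예시", "an english sentence"])
new_tok = fast.train_new_from_iterator(corpus, vocab_size=100)
ids = new_tok.encode("고양이")
```

The retrained tokenizer keeps the base tokenizer's model type and special tokens while learning a fresh vocabulary from the iterator.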
https://github.com/huggingface/transformers/blob/bc21aaca789f1a366c05e8b5e111632944886393/src/transformers/models/clip/modeling_clip.py#L661-L666
```python
# text_embeds.shape = [batch_size, sequence_length, transformer.width]
# take features from the eot embedding (eot_token is the highest number in each sequence)
# casting to torch.int for onnx compatibility: argmax doesn't support int64 inputs with opset 14
pooled_output = last_hidden_state[
    torch.arange(last_hidden_state.shape[0]), input_ids.to(torch.int).argmax(dim=-1)
]
```
Because the CLIP model takes the token with the largest id when computing `pooled_output`, the eos token must be the token with the highest id in the vocabulary (i.e., the last token).
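The indexing above can be illustrated with dummy tensors; `9` here is a hypothetical eos id, chosen simply to be the largest id in each sequence.

```python
import torch

# 2 sequences of length 3, hidden size 4; values 0..23 so positions are easy to spot
last_hidden_state = torch.arange(2 * 3 * 4, dtype=torch.float32).reshape(2, 3, 4)
# 9 stands in for a hypothetical eos token id, the largest in each sequence
input_ids = torch.tensor([[5, 9, 2], [7, 3, 9]])

# argmax over the ids finds the eos position: index 1 in the first
# sequence, index 2 in the second
pooled_output = last_hidden_state[
    torch.arange(last_hidden_state.shape[0]), input_ids.to(torch.int).argmax(dim=-1)
]
print(pooled_output.shape)  # torch.Size([2, 4])
```

If some other token had a larger id than eos, the argmax would silently pool from the wrong position, which is why the eos id must be the maximum.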
| 2,625 | [
[
-0.034637451171875,
-0.050933837890625,
0.0167388916015625,
0.0254669189453125,
-0.034332275390625,
0.0036296844482421875,
-0.0171356201171875,
0.0026950836181640625,
0.0303192138671875,
0.021575927734375,
-0.0305938720703125,
-0.046051025390625,
-0.054656982421875,
0.0005383491516113281,
-0.0184326171875,
0.057525634765625,
-0.0170440673828125,
-0.004344940185546875,
0.00931549072265625,
-0.0124359130859375,
-0.031219482421875,
-0.0254669189453125,
-0.027862548828125,
-0.0103912353515625,
-0.006534576416015625,
0.0192108154296875,
0.0311431884765625,
0.04937744140625,
0.02490234375,
0.031707763671875,
-0.00676727294921875,
0.004154205322265625,
-0.0310821533203125,
-0.01113128662109375,
0.005550384521484375,
-0.0421142578125,
-0.0187225341796875,
-0.00701904296875,
0.042938232421875,
-0.00009775161743164062,
0.0157623291015625,
0.0160675048828125,
0.0037593841552734375,
0.020538330078125,
-0.037200927734375,
0.0195465087890625,
-0.036651611328125,
0.00873565673828125,
0.0017757415771484375,
-0.01971435546875,
-0.0243072509765625,
-0.023681640625,
0.011444091796875,
-0.036376953125,
0.03729248046875,
-0.00510406494140625,
0.10693359375,
0.007556915283203125,
-0.0198516845703125,
-0.01398468017578125,
-0.027801513671875,
0.0682373046875,
-0.061614990234375,
0.0176239013671875,
0.01177215576171875,
0.006519317626953125,
0.01045989990234375,
-0.06500244140625,
-0.03955078125,
0.0026264190673828125,
-0.0052642822265625,
0.016326904296875,
-0.0090179443359375,
-0.002685546875,
0.0248260498046875,
0.03070068359375,
-0.0232086181640625,
-0.0095977783203125,
-0.046905517578125,
-0.040283203125,
0.0426025390625,
0.0035991668701171875,
0.0295257568359375,
-0.04412841796875,
-0.040069580078125,
-0.043548583984375,
-0.02880859375,
0.0164337158203125,
0.024017333984375,
-0.004482269287109375,
-0.022308349609375,
0.05438232421875,
-0.003917694091796875,
0.033599853515625,
0.0178375244140625,
-0.0230712890625,
0.045501708984375,
-0.020172119140625,
-0.024505615234375,
0.01519775390625,
0.08282470703125,
0.038726806640625,
0.013153076171875,
0.0200958251953125,
0.0121917724609375,
0.0083770751953125,
-0.01174163818359375,
-0.07562255859375,
-0.02593994140625,
0.0141448974609375,
-0.023162841796875,
-0.02545166015625,
0.0169219970703125,
-0.07183837890625,
0.007305145263671875,
0.0052490234375,
0.05157470703125,
-0.051727294921875,
-0.0208892822265625,
0.0068359375,
-0.02764892578125,
0.018280029296875,
-0.0020160675048828125,
-0.049346923828125,
-0.01061248779296875,
0.0160369873046875,
0.07794189453125,
0.0024662017822265625,
-0.036956787109375,
-0.0123748779296875,
-0.00370025634765625,
-0.0031280517578125,
0.040771484375,
-0.00868988037109375,
-0.0192718505859375,
-0.010711669921875,
0.028656005859375,
-0.0268096923828125,
-0.036102294921875,
0.035980224609375,
-0.0183868408203125,
0.007110595703125,
-0.0093536376953125,
-0.03857421875,
-0.035125732421875,
0.0235748291015625,
-0.03759765625,
0.0770263671875,
0.014801025390625,
-0.0711669921875,
0.0243682861328125,
-0.033111572265625,
-0.0135040283203125,
-0.026214599609375,
-0.007389068603515625,
-0.05413818359375,
-0.01666259765625,
0.056671142578125,
0.036773681640625,
-0.0006737709045410156,
0.00792694091796875,
-0.0247955322265625,
-0.02886962890625,
0.0172882080078125,
-0.0406494140625,
0.0765380859375,
0.00860595703125,
-0.043792724609375,
0.0184478759765625,
-0.04632568359375,
0.0159912109375,
0.0234527587890625,
-0.0210113525390625,
0.00153350830078125,
-0.0298614501953125,
0.00829315185546875,
0.0232696533203125,
-0.005626678466796875,
-0.048553466796875,
0.015838623046875,
-0.0252227783203125,
0.052947998046875,
0.06842041015625,
0.0147705078125,
0.0247955322265625,
-0.023651123046875,
0.034271240234375,
0.0074310302734375,
0.00411224365234375,
-0.029876708984375,
-0.03814697265625,
-0.0574951171875,
-0.057281494140625,
0.0224761962890625,
0.03729248046875,
-0.063720703125,
0.04046630859375,
-0.0146026611328125,
-0.04876708984375,
-0.05548095703125,
-0.014678955078125,
0.0125274658203125,
0.0310821533203125,
0.03656005859375,
-0.014007568359375,
-0.0587158203125,
-0.0506591796875,
-0.0019235610961914062,
0.00498199462890625,
-0.00885009765625,
0.02032470703125,
0.053558349609375,
-0.01465606689453125,
0.07708740234375,
-0.049163818359375,
-0.0198974609375,
-0.01959228515625,
0.00494384765625,
0.0426025390625,
0.044586181640625,
0.04779052734375,
-0.0484619140625,
-0.048828125,
-0.0259857177734375,
-0.0618896484375,
0.00788116455078125,
0.0015821456909179688,
-0.01222991943359375,
-0.0005064010620117188,
0.019866943359375,
-0.035797119140625,
0.042083740234375,
0.0240631103515625,
-0.0301361083984375,
0.0594482421875,
-0.0168914794921875,
0.031341552734375,
-0.0804443359375,
0.0069122314453125,
-0.006397247314453125,
-0.0238037109375,
-0.028839111328125,
0.002620697021484375,
0.00861358642578125,
-0.0056915283203125,
-0.061279296875,
0.039093017578125,
-0.03692626953125,
0.0118865966796875,
-0.01318359375,
0.005153656005859375,
0.01094818115234375,
0.06341552734375,
0.00453948974609375,
0.04010009765625,
0.0592041015625,
-0.047943115234375,
0.04425048828125,
0.0250396728515625,
-0.04034423828125,
0.02001953125,
-0.05517578125,
-0.0012083053588867188,
0.001987457275390625,
0.00977325439453125,
-0.06829833984375,
-0.0156402587890625,
0.032257080078125,
-0.042694091796875,
0.0236053466796875,
-0.0247039794921875,
-0.0230255126953125,
-0.042266845703125,
-0.0400390625,
0.036407470703125,
0.04595947265625,
-0.048065185546875,
0.0249786376953125,
0.020477294921875,
0.0091552734375,
-0.05596923828125,
-0.0552978515625,
-0.0297088623046875,
-0.0224609375,
-0.05279541015625,
0.03631591796875,
0.0031337738037109375,
0.0242462158203125,
0.0013380050659179688,
-0.0052032470703125,
-0.0052337646484375,
-0.0196075439453125,
0.01299285888671875,
0.033172607421875,
-0.01140594482421875,
-0.0236663818359375,
0.0105743408203125,
-0.01149749755859375,
0.000461578369140625,
-0.007415771484375,
0.0718994140625,
-0.02276611328125,
-0.04296875,
-0.04071044921875,
0.00130462646484375,
0.044158935546875,
-0.0073394775390625,
0.03948974609375,
0.0789794921875,
-0.0173492431640625,
0.007480621337890625,
-0.027008056640625,
-0.01338958740234375,
-0.039337158203125,
0.05780029296875,
-0.0221710205078125,
-0.051666259765625,
0.050018310546875,
0.0021610260009765625,
-0.0124053955078125,
0.0606689453125,
0.055267333984375,
0.0023040771484375,
0.0784912109375,
0.033294677734375,
-0.0010662078857421875,
0.037628173828125,
-0.086181640625,
0.01538848876953125,
-0.0762939453125,
-0.0208740234375,
-0.00380706787109375,
-0.02130126953125,
-0.034515380859375,
-0.0543212890625,
0.042266845703125,
0.035491943359375,
-0.02032470703125,
0.038299560546875,
-0.059967041015625,
0.0098114013671875,
0.04034423828125,
0.02764892578125,
-0.00719451904296875,
0.00713348388671875,
-0.03955078125,
-0.01197052001953125,
-0.05560302734375,
-0.01959228515625,
0.07000732421875,
0.038055419921875,
0.06341552734375,
-0.0170135498046875,
0.047149658203125,
0.002918243408203125,
-0.0037822723388671875,
-0.05865478515625,
0.0386962890625,
0.00527191162109375,
-0.037994384765625,
-0.0157928466796875,
-0.0098724365234375,
-0.0560302734375,
0.02935791015625,
-0.0093994140625,
-0.07666015625,
0.0194244384765625,
0.0124053955078125,
-0.009033203125,
0.050872802734375,
-0.0408935546875,
0.06732177734375,
-0.00795745849609375,
-0.0335693359375,
0.00589752197265625,
-0.034637451171875,
0.0219268798828125,
0.02142333984375,
-0.00408172607421875,
-0.00919342041015625,
0.0235748291015625,
0.0770263671875,
-0.047576904296875,
0.05389404296875,
-0.016387939453125,
0.0163726806640625,
0.0526123046875,
-0.01453399658203125,
0.032562255859375,
0.020751953125,
0.00860595703125,
0.0175628662109375,
0.0088958740234375,
-0.031463623046875,
-0.02960205078125,
0.045257568359375,
-0.057891845703125,
-0.0311126708984375,
-0.035919189453125,
-0.032501220703125,
0.028076171875,
0.0169219970703125,
0.075927734375,
0.034912109375,
0.0168304443359375,
0.010986328125,
0.042327880859375,
-0.028594970703125,
0.046966552734375,
-0.0020580291748046875,
-0.0229949951171875,
-0.05072021484375,
0.075439453125,
0.01143646240234375,
0.0174407958984375,
0.0152130126953125,
0.019073486328125,
-0.035614013671875,
-0.004467010498046875,
-0.041259765625,
0.0274810791015625,
-0.0635986328125,
-0.0289306640625,
-0.045806884765625,
-0.034698486328125,
-0.0435791015625,
-0.01525115966796875,
-0.0462646484375,
-0.0239715576171875,
-0.0220794677734375,
-0.0060577392578125,
0.0439453125,
0.020172119140625,
-0.0126953125,
0.01259613037109375,
-0.06304931640625,
0.0174102783203125,
0.0157623291015625,
0.01195526123046875,
0.009429931640625,
-0.053466796875,
-0.015625,
0.0057220458984375,
-0.048065185546875,
-0.0780029296875,
0.04498291015625,
0.002422332763671875,
0.04766845703125,
0.03863525390625,
0.00290679931640625,
0.06085205078125,
-0.023681640625,
0.061798095703125,
0.0261383056640625,
-0.0831298828125,
0.034393310546875,
-0.006832122802734375,
0.0175628662109375,
0.01386260986328125,
0.0266265869140625,
-0.04156494140625,
-0.0206298828125,
-0.0430908203125,
-0.08270263671875,
0.06365966796875,
0.0225677490234375,
-0.005710601806640625,
-0.0006566047668457031,
-0.0029201507568359375,
-0.004474639892578125,
0.0027904510498046875,
-0.048858642578125,
-0.041046142578125,
-0.0418701171875,
-0.01477813720703125,
0.01181793212890625,
-0.00408172607421875,
0.000274658203125,
-0.047149658203125,
0.058349609375,
-0.005138397216796875,
0.066162109375,
0.048858642578125,
-0.0289459228515625,
0.0139923095703125,
0.001697540283203125,
0.039520263671875,
0.032562255859375,
-0.01910400390625,
0.0004475116729736328,
0.0223846435546875,
-0.04656982421875,
0.020355224609375,
-0.00948333740234375,
-0.01537322998046875,
0.0258636474609375,
0.0277252197265625,
0.08990478515625,
0.01483917236328125,
-0.034698486328125,
0.04632568359375,
0.0108489990234375,
-0.023712158203125,
-0.0200653076171875,
-0.00469970703125,
0.0046844482421875,
0.0218658447265625,
0.030548095703125,
0.00443267822265625,
-0.01276397705078125,
-0.0273284912109375,
0.02410888671875,
0.01245880126953125,
-0.0228424072265625,
-0.0220794677734375,
0.059600830078125,
-0.0178680419921875,
-0.011932373046875,
0.047760009765625,
-0.01073455810546875,
-0.07666015625,
0.0706787109375,
0.046173095703125,
0.05316162109375,
-0.00838470458984375,
0.0211639404296875,
0.06854248046875,
0.00479888916015625,
-0.01198577880859375,
0.0105438232421875,
0.00937652587890625,
-0.042724609375,
-0.0196533203125,
-0.036407470703125,
-0.0053253173828125,
0.0133209228515625,
-0.050689697265625,
0.044158935546875,
-0.0218658447265625,
-0.02081298828125,
-0.01470184326171875,
-0.010772705078125,
-0.06512451171875,
0.005977630615234375,
-0.00641632080078125,
0.061492919921875,
-0.07196044921875,
0.06072998046875,
0.0338134765625,
-0.03814697265625,
-0.06524658203125,
-0.01480865478515625,
-0.021728515625,
-0.06488037109375,
0.035247802734375,
0.0302276611328125,
0.019866943359375,
0.002246856689453125,
-0.0408935546875,
-0.07818603515625,
0.1005859375,
0.0310211181640625,
-0.01020050048828125,
-0.00026226043701171875,
-0.0008349418640136719,
0.022064208984375,
-0.0167236328125,
0.0450439453125,
0.0306549072265625,
0.0287933349609375,
0.01432037353515625,
-0.05914306640625,
0.0116424560546875,
-0.0218353271484375,
-0.004077911376953125,
0.00865936279296875,
-0.0555419921875,
0.08587646484375,
-0.02984619140625,
-0.0164642333984375,
-0.0009670257568359375,
0.051422119140625,
0.03955078125,
0.00933837890625,
0.0301666259765625,
0.044952392578125,
0.033782958984375,
-0.0193939208984375,
0.055572509765625,
-0.01345062255859375,
0.08038330078125,
0.053466796875,
0.0204925537109375,
0.0379638671875,
0.035736083984375,
-0.0274658203125,
0.0391845703125,
0.049102783203125,
-0.046478271484375,
0.05987548828125,
0.004123687744140625,
-0.0003669261932373047,
0.0013713836669921875,
0.0177764892578125,
-0.032501220703125,
0.039764404296875,
0.017364501953125,
-0.040618896484375,
-0.012786865234375,
0.0291595458984375,
0.0079345703125,
-0.0244903564453125,
-0.01438140869140625,
0.0200653076171875,
-0.006015777587890625,
-0.039154052734375,
0.06854248046875,
0.0105438232421875,
0.0767822265625,
-0.034027099609375,
-0.01215362548828125,
0.011383056640625,
0.022003173828125,
-0.0160980224609375,
-0.061767578125,
0.01398468017578125,
0.0000388026237487793,
-0.0065155029296875,
0.0110626220703125,
0.044342041015625,
-0.031951904296875,
-0.045684814453125,
0.0274658203125,
0.004444122314453125,
0.0298309326171875,
-0.0006017684936523438,
-0.061676025390625,
0.0144500732421875,
0.0157318115234375,
-0.03424072265625,
0.006256103515625,
0.002826690673828125,
0.01000213623046875,
0.041748046875,
0.033905029296875,
-0.003063201904296875,
0.0390625,
-0.0247650146484375,
0.06719970703125,
-0.040069580078125,
-0.043212890625,
-0.07489013671875,
0.0498046875,
-0.01180267333984375,
-0.032135009765625,
0.05523681640625,
0.051666259765625,
0.06854248046875,
-0.025421142578125,
0.05615234375,
-0.0350341796875,
0.01149749755859375,
-0.040435791015625,
0.060516357421875,
-0.0269927978515625,
-0.00556182861328125,
-0.032318115234375,
-0.056304931640625,
0.0056304931640625,
0.058319091796875,
-0.01528167724609375,
0.00798797607421875,
0.06475830078125,
0.05792236328125,
-0.01535797119140625,
-0.02301025390625,
0.007099151611328125,
0.0159149169921875,
0.0310211181640625,
0.050384521484375,
0.0233917236328125,
-0.07177734375,
0.050140380859375,
-0.05975341796875,
0.004085540771484375,
-0.0201416015625,
-0.05340576171875,
-0.06512451171875,
-0.045501708984375,
-0.027557373046875,
-0.035614013671875,
-0.0287933349609375,
0.06951904296875,
0.0633544921875,
-0.063720703125,
-0.0028095245361328125,
-0.0030841827392578125,
0.01953125,
-0.03460693359375,
-0.0265960693359375,
0.062347412109375,
0.001422882080078125,
-0.0836181640625,
0.0113372802734375,
-0.00023126602172851562,
0.0020961761474609375,
0.0007166862487792969,
-0.01273345947265625,
-0.038726806640625,
-0.01318359375,
0.0250244140625,
0.0257110595703125,
-0.0537109375,
-0.0237579345703125,
0.0069732666015625,
-0.0012569427490234375,
0.031463623046875,
0.0386962890625,
-0.05206298828125,
0.0426025390625,
0.05181884765625,
0.02313232421875,
0.06976318359375,
-0.01241302490234375,
0.01983642578125,
-0.05743408203125,
0.0285186767578125,
-0.0008187294006347656,
0.04388427734375,
0.020294189453125,
-0.0185546875,
0.031341552734375,
0.04608154296875,
-0.03924560546875,
-0.06170654296875,
-0.010345458984375,
-0.07635498046875,
-0.01450347900390625,
0.07904052734375,
-0.036956787109375,
-0.032440185546875,
0.0100250244140625,
-0.04022216796875,
0.06072998046875,
-0.0245361328125,
0.0439453125,
0.037689208984375,
-0.0001436471939086914,
-0.0220794677734375,
-0.0214080810546875,
0.0187530517578125,
0.01482391357421875,
-0.0279083251953125,
-0.034637451171875,
-0.01476287841796875,
0.05035400390625,
0.0298614501953125,
0.037261962890625,
-0.00677490234375,
0.0169677734375,
0.01383209228515625,
0.0167236328125,
-0.0216217041015625,
0.0012798309326171875,
-0.0172271728515625,
0.0096282958984375,
-0.0295562744140625,
-0.0572509765625
]
] |