modelId stringlengths 4 111 | lastModified stringlengths 24 24 | tags list | pipeline_tag stringlengths 5 30 ⌀ | author stringlengths 2 34 ⌀ | config null | securityStatus null | id stringlengths 4 111 | likes int64 0 9.53k | downloads int64 2 73.6M | library_name stringlengths 2 84 ⌀ | created timestamp[us] | card stringlengths 101 901k | card_len int64 101 901k | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
stabilityai/stable-diffusion-2-1-unclip | 2023-04-12T15:49:10.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"arxiv:2112.10752",
"arxiv:1910.09700",
"license:openrail++",
"has_space",
"diffusers:StableUnCLIPImg2ImgPipeline",
"region:us"
] | text-to-image | stabilityai | null | null | stabilityai/stable-diffusion-2-1-unclip | 216 | 17,333 | diffusers | 2023-03-20T13:11:38 | ---
license: openrail++
tags:
- stable-diffusion
- text-to-image
pinned: true
---
# Stable Diffusion v2-1-unclip Model Card
This model card focuses on the model associated with the Stable Diffusion v2-1 model, codebase available [here](https://github.com/Stability-AI/stablediffusion).
This `stable-diffusion-2-1-unclip` is a finetuned version of Stable Diffusion 2.1, modified to accept (noisy) CLIP image embedding in addition to the text prompt, and can be used to create image variations (Examples) or can be chained with text-to-image CLIP priors. The amount of noise added to the image embedding can be specified via the noise_level (0 means no noise, 1000 full noise).
- Use it with 🧨 [`diffusers`](#examples)
## Model Details
- **Developed by:** Robin Rombach, Patrick Esser
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([OpenCLIP-ViT/H](https://github.com/mlfoundations/open_clip)).
- **Resources for more information:** [GitHub Repository](https://github.com/Stability-AI/).
- **Cite as:**
@InProceedings{Rombach_2022_CVPR,
author = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
title = {High-Resolution Image Synthesis With Latent Diffusion Models},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2022},
pages = {10684-10695}
}
## Examples
Using the [🤗's Diffusers library](https://github.com/huggingface/diffusers) to run Stable Diffusion UnCLIP 2-1-small in a simple and efficient manner.
```bash
pip install diffusers transformers accelerate scipy safetensors
```
Running the pipeline (if you don't swap the scheduler it will run with the default DDIM, in this example we are swapping it to DPMSolverMultistepScheduler):
```python
from diffusers import DiffusionPipeline
from diffusers.utils import load_image
import torch
pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2-1-unclip-small", torch_dtype=torch.float16)
pipe.to("cuda")
# get image
url = "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/stable_unclip/tarsila_do_amaral.png"
image = load_image(url)
# run image variation
image = pipe(image).images[0]
```

# Uses
## Direct Use
The model is intended for research purposes only. Possible research areas and tasks include
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section is originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), was used for Stable Diffusion v1, but applies in the same way to Stable Diffusion v2_.
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation
- Representations of egregious violence and gore
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism
- The model cannot render legible text
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere”
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy
- The model was trained on a subset of the large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, we have filtered the dataset using LAION's NFSW detector (see Training section).
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
Stable Diffusion was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/),
which consists of images that are limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
Stable Diffusion v2 mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.
## Training
**Training Data**
The model developers used the following dataset for training the model:
- LAION-5B and subsets (details below). The training data is further filtered using LAION's NSFW detector, with a "p_unsafe" score of 0.1 (conservative). For more details, please refer to LAION-5B's [NeurIPS 2022](https://openreview.net/forum?id=M3Y74vmsMcY) paper and reviewer discussions on the topic.
## Environmental Impact
**Stable Diffusion v1** **Estimated Emissions**
Based on that information, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 200000
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 15000 kg CO2 eq.
## Citation
@InProceedings{Rombach_2022_CVPR,
author = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
title = {High-Resolution Image Synthesis With Latent Diffusion Models},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2022},
pages = {10684-10695}
}
*This model card was written by: Robin Rombach, Patrick Esser and David Ha and is based on the [Stable Diffusion v1](https://github.com/CompVis/stable-diffusion/blob/main/Stable_Diffusion_v1_Model_Card.md) and [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).* | 8,136 | [
[
-0.035186767578125,
-0.06396484375,
0.0225372314453125,
0.011688232421875,
-0.0195465087890625,
-0.0287322998046875,
0.006244659423828125,
-0.034698486328125,
-0.00603485107421875,
0.033477783203125,
-0.030548095703125,
-0.03253173828125,
-0.050384521484375,
-0.01222991943359375,
-0.0303192138671875,
0.0694580078125,
-0.002593994140625,
0.00194549560546875,
-0.018310546875,
-0.0010461807250976562,
-0.0246734619140625,
-0.0161590576171875,
-0.0697021484375,
-0.0194549560546875,
0.0269927978515625,
0.00765228271484375,
0.0556640625,
0.040924072265625,
0.0362548828125,
0.02056884765625,
-0.028717041015625,
-0.0029296875,
-0.047943115234375,
-0.0107574462890625,
0.00118255615234375,
-0.01396942138671875,
-0.04534912109375,
0.0175933837890625,
0.053863525390625,
0.0290069580078125,
-0.0097808837890625,
0.005107879638671875,
0.00823211669921875,
0.0377197265625,
-0.04290771484375,
-0.01198577880859375,
-0.032989501953125,
0.011138916015625,
-0.01136016845703125,
0.020965576171875,
-0.0267181396484375,
-0.0009703636169433594,
0.005092620849609375,
-0.055938720703125,
0.0290985107421875,
-0.02362060546875,
0.0819091796875,
0.0318603515625,
-0.0232086181640625,
-0.004161834716796875,
-0.05450439453125,
0.050201416015625,
-0.050018310546875,
0.015960693359375,
0.0236968994140625,
0.00531005859375,
-0.011810302734375,
-0.0709228515625,
-0.04754638671875,
-0.0015115737915039062,
0.0031280517578125,
0.03521728515625,
-0.0291290283203125,
-0.00849151611328125,
0.0313720703125,
0.011749267578125,
-0.038177490234375,
0.007678985595703125,
-0.040435791015625,
-0.00666046142578125,
0.052154541015625,
0.0106658935546875,
0.01561737060546875,
-0.01447296142578125,
-0.03778076171875,
-0.005229949951171875,
-0.04461669921875,
0.0023441314697265625,
0.03485107421875,
-0.01490020751953125,
-0.04083251953125,
0.032318115234375,
0.0170135498046875,
0.034576416015625,
0.016265869140625,
-0.01320648193359375,
0.020294189453125,
-0.0281219482421875,
-0.01541900634765625,
-0.03338623046875,
0.0675048828125,
0.05224609375,
-0.01056671142578125,
0.015777587890625,
-0.002834320068359375,
0.00753021240234375,
0.0008740425109863281,
-0.09820556640625,
-0.033050537109375,
0.01192474365234375,
-0.049224853515625,
-0.044097900390625,
-0.003978729248046875,
-0.08984375,
-0.0166473388671875,
0.011962890625,
0.029998779296875,
-0.016204833984375,
-0.0400390625,
-0.0021495819091796875,
-0.0311737060546875,
0.01104736328125,
0.035980224609375,
-0.049102783203125,
0.0179595947265625,
0.0054931640625,
0.08380126953125,
-0.03076171875,
-0.0020999908447265625,
-0.004856109619140625,
0.007221221923828125,
-0.0211029052734375,
0.0491943359375,
-0.0281524658203125,
-0.04754638671875,
-0.022003173828125,
0.0257415771484375,
0.0168609619140625,
-0.0421142578125,
0.045562744140625,
-0.02972412109375,
0.0267181396484375,
-0.001312255859375,
-0.0258941650390625,
-0.0159454345703125,
-0.00893402099609375,
-0.05084228515625,
0.086181640625,
0.01131439208984375,
-0.06787109375,
0.007450103759765625,
-0.0560302734375,
-0.01904296875,
-0.00487518310546875,
0.00572967529296875,
-0.054443359375,
-0.0168304443359375,
-0.000988006591796875,
0.0272064208984375,
-0.01456451416015625,
0.022430419921875,
-0.0271759033203125,
-0.018157958984375,
-0.0035495758056640625,
-0.05084228515625,
0.0836181640625,
0.0290679931640625,
-0.02984619140625,
-0.0010929107666015625,
-0.0517578125,
-0.02197265625,
0.03668212890625,
-0.0158538818359375,
-0.01285552978515625,
-0.01058197021484375,
0.021453857421875,
0.024261474609375,
0.0103912353515625,
-0.032501220703125,
-0.0006208419799804688,
-0.01360321044921875,
0.047149658203125,
0.05743408203125,
0.0213470458984375,
0.04791259765625,
-0.0266876220703125,
0.0435791015625,
0.0267486572265625,
0.02593994140625,
-0.00888824462890625,
-0.065673828125,
-0.04315185546875,
-0.02337646484375,
0.0183258056640625,
0.04058837890625,
-0.044158935546875,
0.024139404296875,
0.006103515625,
-0.05511474609375,
-0.01406097412109375,
-0.00510406494140625,
0.01763916015625,
0.052520751953125,
0.0226593017578125,
-0.0303497314453125,
-0.02984619140625,
-0.05731201171875,
0.024932861328125,
-0.00638580322265625,
0.01180267333984375,
0.023162841796875,
0.0576171875,
-0.028900146484375,
0.04644775390625,
-0.048126220703125,
-0.020294189453125,
0.00970458984375,
0.0095062255859375,
0.0014562606811523438,
0.0531005859375,
0.057342529296875,
-0.07904052734375,
-0.047149658203125,
-0.0249786376953125,
-0.0595703125,
-0.002368927001953125,
0.00020301342010498047,
-0.023712158203125,
0.0299835205078125,
0.03472900390625,
-0.053436279296875,
0.0472412109375,
0.050445556640625,
-0.0238189697265625,
0.037933349609375,
-0.0325927734375,
-0.0038890838623046875,
-0.0810546875,
0.015960693359375,
0.0211334228515625,
-0.017059326171875,
-0.044189453125,
0.004253387451171875,
-0.0120697021484375,
-0.0207061767578125,
-0.05450439453125,
0.056121826171875,
-0.0301055908203125,
0.0261688232421875,
-0.0308837890625,
0.00037384033203125,
0.01514434814453125,
0.022491455078125,
0.017791748046875,
0.053802490234375,
0.059600830078125,
-0.046142578125,
0.00435638427734375,
0.01837158203125,
-0.0128631591796875,
0.037017822265625,
-0.06787109375,
0.015228271484375,
-0.027099609375,
0.026214599609375,
-0.07171630859375,
-0.01322174072265625,
0.0421142578125,
-0.02685546875,
0.0291748046875,
-0.0222930908203125,
-0.033935546875,
-0.02923583984375,
-0.01396942138671875,
0.038970947265625,
0.08074951171875,
-0.0328369140625,
0.0305633544921875,
0.04168701171875,
0.01371002197265625,
-0.033233642578125,
-0.060638427734375,
-0.007099151611328125,
-0.032745361328125,
-0.061981201171875,
0.047637939453125,
-0.024169921875,
-0.01511383056640625,
0.008941650390625,
0.01184844970703125,
-0.008575439453125,
0.003864288330078125,
0.03692626953125,
0.020904541015625,
0.009521484375,
-0.00986480712890625,
0.01436614990234375,
-0.01702880859375,
-0.00115203857421875,
-0.003932952880859375,
0.030487060546875,
0.007640838623046875,
-0.0005965232849121094,
-0.052734375,
0.03753662109375,
0.04742431640625,
0.001468658447265625,
0.058135986328125,
0.06890869140625,
-0.040435791015625,
0.001888275146484375,
-0.0175323486328125,
-0.01451873779296875,
-0.0380859375,
0.030120849609375,
-0.01003265380859375,
-0.043609619140625,
0.04736328125,
-0.002498626708984375,
-0.001495361328125,
0.055419921875,
0.0550537109375,
-0.016693115234375,
0.08343505859375,
0.04339599609375,
0.0286102294921875,
0.052032470703125,
-0.05841064453125,
-0.004924774169921875,
-0.070556640625,
-0.0201873779296875,
-0.0182647705078125,
-0.017059326171875,
-0.040924072265625,
-0.05047607421875,
0.03118896484375,
0.0227203369140625,
-0.019287109375,
0.014404296875,
-0.047088623046875,
0.0251007080078125,
0.0206298828125,
0.0134124755859375,
0.000507354736328125,
0.013916015625,
0.00670623779296875,
-0.01480865478515625,
-0.057220458984375,
-0.048004150390625,
0.07421875,
0.038909912109375,
0.06591796875,
0.0050048828125,
0.037078857421875,
0.0357666015625,
0.0230865478515625,
-0.031494140625,
0.04486083984375,
-0.0328369140625,
-0.050506591796875,
-0.00885009765625,
-0.017852783203125,
-0.0736083984375,
0.01548004150390625,
-0.020843505859375,
-0.03375244140625,
0.037384033203125,
0.014129638671875,
-0.0178680419921875,
0.0283203125,
-0.055938720703125,
0.07427978515625,
-0.007579803466796875,
-0.05657958984375,
-0.0125274658203125,
-0.0506591796875,
0.023773193359375,
-0.00022304058074951172,
0.0196533203125,
-0.0097198486328125,
-0.00791168212890625,
0.0706787109375,
-0.0285186767578125,
0.076416015625,
-0.033477783203125,
0.0030994415283203125,
0.040802001953125,
-0.0078125,
0.032928466796875,
0.01251983642578125,
-0.01262664794921875,
0.042938232421875,
0.007389068603515625,
-0.02972412109375,
-0.0238494873046875,
0.0545654296875,
-0.0718994140625,
-0.03753662109375,
-0.033935546875,
-0.0269317626953125,
0.048095703125,
0.019287109375,
0.059051513671875,
0.026611328125,
-0.01904296875,
-0.0045013427734375,
0.053009033203125,
-0.0230712890625,
0.036041259765625,
0.02294921875,
-0.02374267578125,
-0.043792724609375,
0.057464599609375,
0.02227783203125,
0.04150390625,
-0.00856781005859375,
0.021331787109375,
-0.0111236572265625,
-0.039581298828125,
-0.042694091796875,
0.0167236328125,
-0.06036376953125,
-0.0131072998046875,
-0.056732177734375,
-0.024200439453125,
-0.032257080078125,
-0.01062774658203125,
-0.0313720703125,
-0.02301025390625,
-0.06414794921875,
0.00913238525390625,
0.019561767578125,
0.0416259765625,
-0.028656005859375,
0.02923583984375,
-0.02728271484375,
0.02349853515625,
0.0102386474609375,
0.01319122314453125,
0.0027675628662109375,
-0.056304931640625,
-0.010528564453125,
0.0157470703125,
-0.052825927734375,
-0.0728759765625,
0.02947998046875,
0.0075531005859375,
0.039581298828125,
0.038055419921875,
-0.00014495849609375,
0.047943115234375,
-0.0299224853515625,
0.0831298828125,
0.01947021484375,
-0.053375244140625,
0.04693603515625,
-0.0304107666015625,
0.00989532470703125,
0.0197601318359375,
0.040679931640625,
-0.017181396484375,
-0.0213165283203125,
-0.06524658203125,
-0.0645751953125,
0.043487548828125,
0.0255279541015625,
0.02349853515625,
-0.00937652587890625,
0.050506591796875,
0.002475738525390625,
-0.000980377197265625,
-0.07574462890625,
-0.03948974609375,
-0.0238189697265625,
0.01229095458984375,
0.0096588134765625,
-0.029205322265625,
-0.013153076171875,
-0.04473876953125,
0.0699462890625,
0.006855010986328125,
0.04400634765625,
0.027679443359375,
0.0103912353515625,
-0.02947998046875,
-0.0203704833984375,
0.0440673828125,
0.0262908935546875,
-0.0168609619140625,
0.00014448165893554688,
0.00004690885543823242,
-0.040435791015625,
0.017364501953125,
0.004642486572265625,
-0.058258056640625,
0.0018367767333984375,
-0.003326416015625,
0.061370849609375,
-0.0187835693359375,
-0.036956787109375,
0.05169677734375,
-0.007720947265625,
-0.0313720703125,
-0.041290283203125,
0.01000213623046875,
0.0044708251953125,
0.0178680419921875,
0.0035762786865234375,
0.0433349609375,
0.01337432861328125,
-0.02978515625,
0.00926971435546875,
0.047119140625,
-0.0294342041015625,
-0.02685546875,
0.08270263671875,
0.01203155517578125,
-0.025848388671875,
0.046539306640625,
-0.0389404296875,
-0.0206146240234375,
0.055267333984375,
0.0587158203125,
0.059051513671875,
-0.01062774658203125,
0.030914306640625,
0.05523681640625,
0.018890380859375,
-0.0163726806640625,
0.01042938232421875,
0.0178070068359375,
-0.05419921875,
-0.00334930419921875,
-0.030120849609375,
-0.006313323974609375,
0.01467132568359375,
-0.040496826171875,
0.042327880859375,
-0.04266357421875,
-0.033477783203125,
-0.0016689300537109375,
-0.0191497802734375,
-0.04266357421875,
0.004497528076171875,
0.0197601318359375,
0.05999755859375,
-0.0823974609375,
0.061614990234375,
0.051788330078125,
-0.044219970703125,
-0.031402587890625,
0.0020694732666015625,
-0.005779266357421875,
-0.026031494140625,
0.037628173828125,
0.009124755859375,
-0.0003268718719482422,
0.004978179931640625,
-0.06793212890625,
-0.06842041015625,
0.097900390625,
0.0282440185546875,
-0.022308349609375,
0.003742218017578125,
-0.0150604248046875,
0.047943115234375,
-0.035888671875,
0.0214385986328125,
0.0205535888671875,
0.032745361328125,
0.03228759765625,
-0.035675048828125,
0.00927734375,
-0.0245819091796875,
0.02166748046875,
-0.0037555694580078125,
-0.077392578125,
0.07391357421875,
-0.030609130859375,
-0.0338134765625,
0.0235748291015625,
0.045806884765625,
0.01334381103515625,
0.027130126953125,
0.03289794921875,
0.06353759765625,
0.046539306640625,
-0.0027713775634765625,
0.0726318359375,
-0.008575439453125,
0.0310516357421875,
0.0650634765625,
-0.009735107421875,
0.059112548828125,
0.0361328125,
-0.020355224609375,
0.04852294921875,
0.048828125,
-0.033050537109375,
0.061279296875,
0.0013141632080078125,
-0.0149383544921875,
-0.006557464599609375,
-0.01346588134765625,
-0.035064697265625,
0.01230621337890625,
0.021331787109375,
-0.041015625,
-0.0126495361328125,
0.01216888427734375,
0.006679534912109375,
-0.0156707763671875,
-0.0036411285400390625,
0.04248046875,
0.004871368408203125,
-0.031494140625,
0.0465087890625,
0.02056884765625,
0.064697265625,
-0.03485107421875,
-0.0168609619140625,
-0.005474090576171875,
0.00807952880859375,
-0.021575927734375,
-0.059600830078125,
0.03936767578125,
-0.0038890838623046875,
-0.0218963623046875,
-0.0171051025390625,
0.060699462890625,
-0.0235443115234375,
-0.0506591796875,
0.03240966796875,
0.01739501953125,
0.0269012451171875,
0.01013946533203125,
-0.07861328125,
0.01406097412109375,
-0.0020771026611328125,
-0.0188140869140625,
0.02740478515625,
0.006191253662109375,
0.0097808837890625,
0.034423828125,
0.04669189453125,
-0.00711822509765625,
0.003147125244140625,
-0.01200103759765625,
0.058319091796875,
-0.0197906494140625,
-0.0243988037109375,
-0.058746337890625,
0.0555419921875,
-0.007427215576171875,
-0.0226593017578125,
0.05120849609375,
0.04876708984375,
0.055267333984375,
-0.0040130615234375,
0.060089111328125,
-0.0220489501953125,
-0.0020732879638671875,
-0.031890869140625,
0.059600830078125,
-0.05914306640625,
0.01274871826171875,
-0.0269622802734375,
-0.06634521484375,
-0.0093841552734375,
0.06109619140625,
-0.0172576904296875,
0.015777587890625,
0.0335693359375,
0.07586669921875,
-0.00463104248046875,
-0.01464080810546875,
0.0260009765625,
0.0238189697265625,
0.03131103515625,
0.021209716796875,
0.052276611328125,
-0.058258056640625,
0.0288543701171875,
-0.04052734375,
-0.022216796875,
0.004276275634765625,
-0.07232666015625,
-0.06353759765625,
-0.051300048828125,
-0.06268310546875,
-0.056243896484375,
-0.007251739501953125,
0.03497314453125,
0.07012939453125,
-0.036407470703125,
-0.0057830810546875,
-0.020111083984375,
0.0015611648559570312,
-0.011260986328125,
-0.02166748046875,
0.0287933349609375,
0.0170135498046875,
-0.06591796875,
-0.001987457275390625,
0.0192718505859375,
0.046539306640625,
-0.04010009765625,
-0.0181884765625,
-0.01702880859375,
-0.00806427001953125,
0.03973388671875,
0.00963592529296875,
-0.050384521484375,
-0.00510406494140625,
-0.01041412353515625,
-0.0006589889526367188,
0.01171112060546875,
0.0235137939453125,
-0.044647216796875,
0.0271148681640625,
0.037628173828125,
0.01031494140625,
0.06182861328125,
-0.005786895751953125,
0.008544921875,
-0.036590576171875,
0.029693603515625,
0.00611114501953125,
0.028167724609375,
0.0217437744140625,
-0.043365478515625,
0.03582763671875,
0.052001953125,
-0.056182861328125,
-0.060455322265625,
0.018646240234375,
-0.08697509765625,
-0.024200439453125,
0.10235595703125,
-0.01314544677734375,
-0.023162841796875,
0.0034503936767578125,
-0.022674560546875,
0.0173797607421875,
-0.0283203125,
0.04083251953125,
0.041473388671875,
-0.01102447509765625,
-0.0367431640625,
-0.0435791015625,
0.03338623046875,
0.011932373046875,
-0.046600341796875,
-0.01171112060546875,
0.045562744140625,
0.049163818359375,
0.0173492431640625,
0.06390380859375,
-0.028594970703125,
0.01332855224609375,
0.01526641845703125,
0.004150390625,
-0.0034618377685546875,
-0.0187835693359375,
-0.036346435546875,
0.00438690185546875,
-0.015777587890625,
0.0015363693237304688
]
] |
EleutherAI/polyglot-ko-12.8b | 2023-06-07T05:03:56.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"ko",
"arxiv:2104.09864",
"arxiv:2204.04541",
"arxiv:2306.02254",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/polyglot-ko-12.8b | 63 | 17,274 | transformers | 2022-10-14T23:46:19 | ---
language:
- ko
tags:
- pytorch
- causal-lm
license: apache-2.0
---
# Polyglot-Ko-12.8B
## Model Description
Polyglot-Ko is a series of large-scale Korean autoregressive language models made by the EleutherAI polyglot team.
| Hyperparameter | Value |
|----------------------|----------------------------------------------------------------------------------------------------------------------------------------|
| \\(n_{parameters}\\) | 12,898,631,680 |
| \\(n_{layers}\\) | 40 |
| \\(d_{model}\\) | 5120 |
| \\(d_{ff}\\) | 20,480 |
| \\(n_{heads}\\) | 40 |
| \\(d_{head}\\) | 128 |
| \\(n_{ctx}\\) | 2,048 |
| \\(n_{vocab}\\) | 30,003 / 30,080 |
| Positional Encoding | [Rotary Position Embedding (RoPE)](https://arxiv.org/abs/2104.09864) |
| RoPE Dimensions | [64](https://github.com/kingoflolz/mesh-transformer-jax/blob/f2aa66e0925de6593dcbb70e72399b97b4130482/mesh_transformer/layers.py#L223) |
The model consists of 40 transformer layers with a model dimension of 5120, and a feedforward dimension of 20480. The model
dimension is split into 40 heads, each with a dimension of 128. Rotary Position Embedding (RoPE) is applied to 64
dimensions of each head. The model is trained with a tokenization vocabulary of 30003.
## Training data
Polyglot-Ko-12.8B was trained on 863 GB of Korean language data (1.2TB before processing), a large-scale dataset curated by [TUNiB](https://tunib.ai/). The data collection process has abided by South Korean laws. This dataset was collected for the purpose of training Polyglot-Ko models, so it will not be released for public use.
| Source |Size (GB) | Link |
|-------------------------------------|---------|------------------------------------------|
| Korean blog posts | 682.3 | - |
| Korean news dataset | 87.0 | - |
| Modu corpus | 26.4 |corpus.korean.go.kr |
| Korean patent dataset | 19.0 | - |
| Korean Q & A dataset | 18.1 | - |
| KcBert dataset | 12.7 | github.com/Beomi/KcBERT |
| Korean fiction dataset | 6.1 | - |
| Korean online comments | 4.2 | - |
| Korean wikipedia | 1.4 | ko.wikipedia.org |
| Clova call | < 1.0 | github.com/clovaai/ClovaCall |
| Naver sentiment movie corpus | < 1.0 | github.com/e9t/nsmc |
| Korean hate speech dataset | < 1.0 | - |
| Open subtitles | < 1.0 | opus.nlpl.eu/OpenSubtitles.php |
| AIHub various tasks datasets | < 1.0 |aihub.or.kr |
| Standard Korean language dictionary | < 1.0 | stdict.korean.go.kr/main/main.do |
Furthermore, in order to avoid the model memorizing and generating personally identifiable information (PII) in the training data, we masked out the following sensitive information in the pre-processing stage:
* `<|acc|>` : bank account number
* `<|rrn|>` : resident registration number
* `<|tell|>` : phone number
## Training procedure
Polyglot-Ko-12.8B was trained for 167 billion tokens over 301,000 steps on 256 A100 GPUs with the [GPT-NeoX framework](https://github.com/EleutherAI/gpt-neox). It was trained as an autoregressive language model, using cross-entropy loss to maximize the likelihood of predicting the next token.
## How to use
This model can be easily loaded using the `AutoModelForCausalLM` class:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/polyglot-ko-12.8b")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/polyglot-ko-12.8b")
```
## Evaluation results
We evaluate Polyglot-Ko-3.8B on [KOBEST dataset](https://arxiv.org/abs/2204.04541), a benchmark with 5 downstream tasks, against comparable models such as skt/ko-gpt-trinity-1.2B-v0.5, kakaobrain/kogpt and facebook/xglm-7.5B, using the prompts provided in the paper.
The following tables show the results when the number of few-shot examples differ. You can reproduce these results using the [polyglot branch of lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness/tree/polyglot) and the following scripts. For a fair comparison, all models were run under the same conditions and using the same prompts. In the tables, `n` refers to the number of few-shot examples.
In case of WiC dataset, all models show random performance.
```console
python main.py \
--model gpt2 \
--model_args pretrained='EleutherAI/polyglot-ko-3.8b' \
--tasks kobest_copa,kobest_hellaswag \
--num_fewshot $YOUR_NUM_FEWSHOT \
--batch_size $YOUR_BATCH_SIZE \
--device $YOUR_DEVICE \
--output_path $/path/to/output/
```
### COPA (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.6696 | 0.6477 | 0.6419 | 0.6514 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.7345 | 0.7287 | 0.7277 | 0.7479 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.6723 | 0.6731 | 0.6769 | 0.7119 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.7196 | 0.7193 | 0.7204 | 0.7206 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.7595 | 0.7608 | 0.7638 | 0.7788 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.7745 | 0.7676 | 0.7775 | 0.7887 |
| **[EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) (this)** | **12.8B** | **0.7937** | **0.8108** | **0.8037** | **0.8369** |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/d5b49364-aed5-4467-bae2-5a322c8e2ceb" width="800px">
### HellaSwag (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.5243 | 0.5272 | 0.5166 | 0.5352 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.5590 | 0.5833 | 0.5828 | 0.5907 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.5665 | 0.5689 | 0.5565 | 0.5622 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.5247 | 0.5260 | 0.5278 | 0.5427 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.5707 | 0.5830 | 0.5670 | 0.5787 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.5976 | 0.5998 | 0.5979 | 0.6208 |
| **[EleutherAI/polyglot-ko-12.8b (this)](https://huggingface.co/EleutherAI/polyglot-ko-12.8b)** | **12.8B** | **0.5954** | **0.6306** | **0.6098** | **0.6118** |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/5acb60ac-161a-4ab3-a296-db4442e08b7f" width="800px">
### BoolQ (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.3356 | 0.4014 | 0.3640 | 0.3560 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.4514 | 0.5981 | 0.5499 | 0.5202 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.4464 | 0.3324 | 0.3324 | 0.3324 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.3552 | 0.4751 | 0.4109 | 0.4038 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.4320 | 0.5263 | 0.4930 | 0.4038 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.4356 | 0.5698 | 0.5187 | 0.5236 |
| **[EleutherAI/polyglot-ko-12.8b (this)](https://huggingface.co/EleutherAI/polyglot-ko-12.8b)** | **12.8B** | **0.4818** | **0.6041** | **0.6289** | **0.6448** |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/b74c23c0-01f3-4b68-9e10-a48e9aa052ab" width="800px">
### SentiNeg (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.6065 | 0.6878 | 0.7280 | 0.8413 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.3747 | 0.8942 | 0.9294 | 0.9698 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.3578 | 0.4471 | 0.3964 | 0.5271 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.6790 | 0.6257 | 0.5514 | 0.7851 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.4858 | 0.7950 | 0.7320 | 0.7851 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.3394 | 0.8841 | 0.8808 | 0.9521 |
| **[EleutherAI/polyglot-ko-12.8b (this)](https://huggingface.co/EleutherAI/polyglot-ko-12.8b)** | **12.8B** | **0.9117** | **0.9015** | **0.9345** | **0.9723** |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/95b56b19-d349-4b70-9ff9-94a5560f89ee" width="800px">
### WiC (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.3290 | 0.4313 | 0.4001 | 0.3621 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.3526 | 0.4775 | 0.4358 | 0.4061 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.3280 | 0.4903 | 0.4945 | 0.3656 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.3297 | 0.4850 | 0.4650 | 0.3290 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.3390 | 0.4944 | 0.4203 | 0.3835 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.3913 | 0.4688 | 0.4189 | 0.3910 |
| **[EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) (this)** | **12.8B** | **0.3985** | **0.3683** | **0.3307** | **0.3273** |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/4de4a4c3-d7ac-4e04-8b0c-0d533fe88294" width="800px">
## Limitations and Biases
Polyglot-Ko has been trained to optimize next token prediction. Language models such as this are often used for a wide variety of tasks and it is important to be aware of possible unexpected outcomes. For instance, Polyglot-Ko will not always return the most factual or accurate response but the most statistically likely one. In addition, Polyglot may produce socially unacceptable or offensive content. We recommend having a human curator or other filtering mechanism to censor sensitive content.
## Citation and Related Information
### BibTeX entry
If you find our work useful, please consider citing:
```bibtex
@misc{ko2023technical,
title={A Technical Report for Polyglot-Ko: Open-Source Large-Scale Korean Language Models},
author={Hyunwoong Ko and Kichang Yang and Minho Ryu and Taekyoon Choi and Seungmu Yang and jiwung Hyun and Sungho Park},
year={2023},
eprint={2306.02254},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Licensing
All our models are licensed under the terms of the Apache License 2.0.
```
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
### Acknowledgement
This project was made possible thanks to the computing resources from [Stability.ai](https://stability.ai), and thanks to [TUNiB](https://tunib.ai) for providing a large-scale Korean dataset for this work.
| 15,490 | [
[
-0.049530029296875,
-0.05157470703125,
0.0198974609375,
0.004962921142578125,
-0.03875732421875,
0.0018873214721679688,
-0.00939178466796875,
-0.039581298828125,
0.0308990478515625,
0.01284027099609375,
-0.03497314453125,
-0.048736572265625,
-0.054779052734375,
-0.0076141357421875,
-0.005504608154296875,
0.09332275390625,
-0.003299713134765625,
-0.016357421875,
0.0012378692626953125,
-0.00495147705078125,
-0.00836944580078125,
-0.04736328125,
-0.0416259765625,
-0.035614013671875,
0.0247650146484375,
0.0006799697875976562,
0.06243896484375,
0.03326416015625,
0.0265655517578125,
0.0233001708984375,
-0.02142333984375,
0.00316619873046875,
-0.036956787109375,
-0.035980224609375,
0.0190582275390625,
-0.038330078125,
-0.05950927734375,
-0.002796173095703125,
0.0460205078125,
0.0206756591796875,
-0.01038360595703125,
0.032440185546875,
0.00008672475814819336,
0.047698974609375,
-0.032623291015625,
0.0203704833984375,
-0.026336669921875,
0.0135650634765625,
-0.031524658203125,
0.021087646484375,
-0.0188751220703125,
-0.029205322265625,
0.0122833251953125,
-0.04339599609375,
0.0034656524658203125,
-0.0017347335815429688,
0.0972900390625,
-0.0022335052490234375,
-0.0149078369140625,
-0.0036373138427734375,
-0.04083251953125,
0.055877685546875,
-0.07904052734375,
0.0187835693359375,
0.0246734619140625,
0.00904083251953125,
0.0006952285766601562,
-0.050994873046875,
-0.04364013671875,
0.0011377334594726562,
-0.02880859375,
0.035064697265625,
-0.018341064453125,
0.002689361572265625,
0.036041259765625,
0.03369140625,
-0.0576171875,
-0.006328582763671875,
-0.0389404296875,
-0.02288818359375,
0.06866455078125,
0.0170440673828125,
0.0301055908203125,
-0.024169921875,
-0.0225830078125,
-0.0251922607421875,
-0.02215576171875,
0.0310516357421875,
0.04010009765625,
-0.0011987686157226562,
-0.053253173828125,
0.041168212890625,
-0.01593017578125,
0.0494384765625,
0.01540374755859375,
-0.03692626953125,
0.050201416015625,
-0.0277862548828125,
-0.0259552001953125,
-0.00629425048828125,
0.08392333984375,
0.034423828125,
0.0221710205078125,
0.01348876953125,
-0.00872039794921875,
-0.001026153564453125,
-0.011077880859375,
-0.06085205078125,
-0.0213775634765625,
0.016754150390625,
-0.042388916015625,
-0.0267333984375,
0.02587890625,
-0.0614013671875,
0.00286865234375,
-0.01282501220703125,
0.0289764404296875,
-0.035247802734375,
-0.03173828125,
0.0173492431640625,
0.00046253204345703125,
0.034332275390625,
0.015716552734375,
-0.042572021484375,
0.0166168212890625,
0.027984619140625,
0.0667724609375,
-0.0113525390625,
-0.0250091552734375,
-0.0025615692138671875,
-0.0016021728515625,
-0.016693115234375,
0.04278564453125,
-0.013427734375,
-0.0142974853515625,
-0.0247650146484375,
0.01263427734375,
-0.024810791015625,
-0.0233917236328125,
0.03240966796875,
-0.013519287109375,
0.020782470703125,
-0.0037670135498046875,
-0.03533935546875,
-0.0323486328125,
0.005306243896484375,
-0.03857421875,
0.07861328125,
0.0122222900390625,
-0.0731201171875,
0.018341064453125,
-0.0261993408203125,
-0.00237274169921875,
-0.00681304931640625,
0.0005335807800292969,
-0.062225341796875,
0.00826263427734375,
0.020965576171875,
0.024322509765625,
-0.027252197265625,
0.0020160675048828125,
-0.0234222412109375,
-0.02081298828125,
-0.0018768310546875,
-0.0107574462890625,
0.07403564453125,
0.00923919677734375,
-0.03643798828125,
-0.0014104843139648438,
-0.06524658203125,
0.0180511474609375,
0.040618896484375,
-0.01021575927734375,
-0.00682830810546875,
-0.025726318359375,
-0.007366180419921875,
0.035400390625,
0.0234527587890625,
-0.0399169921875,
0.0156097412109375,
-0.03961181640625,
-0.0018701553344726562,
0.051483154296875,
-0.006732940673828125,
0.01340484619140625,
-0.0380859375,
0.061859130859375,
0.01328277587890625,
0.034393310546875,
0.0026988983154296875,
-0.05047607421875,
-0.05572509765625,
-0.0279541015625,
0.019195556640625,
0.043609619140625,
-0.048187255859375,
0.03887939453125,
-0.003009796142578125,
-0.066650390625,
-0.05364990234375,
0.00982666015625,
0.035491943359375,
0.0237274169921875,
0.0139312744140625,
-0.0190887451171875,
-0.038116455078125,
-0.0733642578125,
-0.0006422996520996094,
-0.01352691650390625,
0.01331329345703125,
0.03192138671875,
0.05108642578125,
-0.0024356842041015625,
0.060943603515625,
-0.055908203125,
-0.007320404052734375,
-0.0257415771484375,
0.0162811279296875,
0.051788330078125,
0.03009033203125,
0.0665283203125,
-0.049591064453125,
-0.0802001953125,
0.0102081298828125,
-0.07391357421875,
-0.00954437255859375,
0.0036869049072265625,
-0.007480621337890625,
0.0237274169921875,
0.017974853515625,
-0.06524658203125,
0.04888916015625,
0.048736572265625,
-0.0295257568359375,
0.07049560546875,
-0.0087738037109375,
-0.002429962158203125,
-0.07647705078125,
0.0052337646484375,
-0.01318359375,
-0.0196685791015625,
-0.050384521484375,
-0.002391815185546875,
-0.01261138916015625,
0.006038665771484375,
-0.051910400390625,
0.047515869140625,
-0.03509521484375,
0.005466461181640625,
-0.0197601318359375,
0.0010223388671875,
-0.00910186767578125,
0.04412841796875,
-0.0018033981323242188,
0.04522705078125,
0.0701904296875,
-0.0205078125,
0.049163818359375,
0.0034942626953125,
-0.020965576171875,
0.0193023681640625,
-0.06414794921875,
0.0227813720703125,
-0.0181732177734375,
0.031890869140625,
-0.06317138671875,
-0.015899658203125,
0.0290374755859375,
-0.0338134765625,
0.006046295166015625,
-0.0290679931640625,
-0.0430908203125,
-0.0513916015625,
-0.0482177734375,
0.034271240234375,
0.05572509765625,
-0.0196685791015625,
0.037017822265625,
0.015899658203125,
-0.0111846923828125,
-0.034515380859375,
-0.0309600830078125,
-0.0263214111328125,
-0.0275726318359375,
-0.06280517578125,
0.0232391357421875,
0.001560211181640625,
0.001323699951171875,
-0.00034689903259277344,
0.0001652240753173828,
0.0066375732421875,
-0.026519775390625,
0.0204620361328125,
0.03973388671875,
-0.0182952880859375,
-0.01776123046875,
-0.01593017578125,
-0.0152587890625,
-0.003025054931640625,
-0.01018524169921875,
0.06573486328125,
-0.03033447265625,
-0.01425933837890625,
-0.053497314453125,
0.0029506683349609375,
0.05804443359375,
-0.00984954833984375,
0.071044921875,
0.079833984375,
-0.024871826171875,
0.0204010009765625,
-0.035430908203125,
-0.002292633056640625,
-0.035400390625,
0.007320404052734375,
-0.035369873046875,
-0.042144775390625,
0.06964111328125,
0.01528167724609375,
0.0007414817810058594,
0.0537109375,
0.048309326171875,
0.000016987323760986328,
0.08807373046875,
0.029022216796875,
-0.0168914794921875,
0.0301055908203125,
-0.0457763671875,
0.015167236328125,
-0.0628662109375,
-0.0277252197265625,
-0.006763458251953125,
-0.015716552734375,
-0.06231689453125,
-0.0290985107421875,
0.0384521484375,
0.02337646484375,
-0.0135955810546875,
0.038909912109375,
-0.0304718017578125,
0.0205230712890625,
0.034881591796875,
0.00806427001953125,
0.002506256103515625,
-0.0019044876098632812,
-0.0288543701171875,
-0.0018682479858398438,
-0.052703857421875,
-0.0236968994140625,
0.0762939453125,
0.031982421875,
0.0670166015625,
0.0030345916748046875,
0.0594482421875,
-0.0070953369140625,
-0.002838134765625,
-0.047119140625,
0.04315185546875,
-0.00545501708984375,
-0.04791259765625,
-0.0188751220703125,
-0.031707763671875,
-0.06231689453125,
0.028564453125,
-0.018707275390625,
-0.07666015625,
0.00782012939453125,
0.0180511474609375,
-0.034881591796875,
0.0423583984375,
-0.053253173828125,
0.055572509765625,
-0.013671875,
-0.0245513916015625,
0.0007739067077636719,
-0.04119873046875,
0.025177001953125,
-0.005214691162109375,
0.0007023811340332031,
-0.0166168212890625,
0.01470947265625,
0.05413818359375,
-0.047393798828125,
0.052490234375,
-0.0165863037109375,
-0.00243377685546875,
0.046295166015625,
-0.013671875,
0.05853271484375,
0.0013132095336914062,
-0.0015745162963867188,
0.0203704833984375,
0.0023479461669921875,
-0.0362548828125,
-0.0390625,
0.036651611328125,
-0.06280517578125,
-0.039581298828125,
-0.0528564453125,
-0.041717529296875,
0.012054443359375,
0.0164947509765625,
0.048583984375,
0.0207672119140625,
0.0189666748046875,
0.0173797607421875,
0.0295867919921875,
-0.03668212890625,
0.038482666015625,
0.01294708251953125,
-0.033111572265625,
-0.038787841796875,
0.063720703125,
0.0188140869140625,
0.034088134765625,
-0.007335662841796875,
0.02191162109375,
-0.0250396728515625,
-0.0276336669921875,
-0.02459716796875,
0.048370361328125,
-0.030548095703125,
-0.0162811279296875,
-0.033111572265625,
-0.042572021484375,
-0.044403076171875,
-0.004856109619140625,
-0.040283203125,
-0.0211181640625,
-0.00710296630859375,
-0.00972747802734375,
0.035003662109375,
0.05108642578125,
-0.0008912086486816406,
0.0295867919921875,
-0.043487548828125,
0.0210113525390625,
0.0256500244140625,
0.0278472900390625,
0.0030727386474609375,
-0.05426025390625,
-0.0158538818359375,
0.0052490234375,
-0.0237884521484375,
-0.056427001953125,
0.03997802734375,
0.005527496337890625,
0.03619384765625,
0.0241546630859375,
-0.00446319580078125,
0.061279296875,
-0.0286865234375,
0.061431884765625,
0.0273895263671875,
-0.057525634765625,
0.05267333984375,
-0.0290069580078125,
0.04486083984375,
0.0196685791015625,
0.042083740234375,
-0.0282745361328125,
-0.02288818359375,
-0.06341552734375,
-0.0667724609375,
0.08489990234375,
0.03533935546875,
-0.004230499267578125,
0.01433563232421875,
0.02081298828125,
-0.0032901763916015625,
0.003780364990234375,
-0.07525634765625,
-0.045013427734375,
-0.0223236083984375,
-0.01352691650390625,
0.0020599365234375,
-0.0171356201171875,
0.0017328262329101562,
-0.044708251953125,
0.062103271484375,
-0.00016295909881591797,
0.033447265625,
0.01358795166015625,
-0.01314544677734375,
0.0033435821533203125,
-0.005535125732421875,
0.053863525390625,
0.05657958984375,
-0.0308685302734375,
-0.01099395751953125,
0.031341552734375,
-0.047882080078125,
0.00782012939453125,
0.0015039443969726562,
-0.027374267578125,
0.013885498046875,
0.02728271484375,
0.07989501953125,
-0.01739501953125,
-0.0308685302734375,
0.03826904296875,
0.005382537841796875,
-0.0227508544921875,
-0.030517578125,
0.006561279296875,
0.00925445556640625,
0.015960693359375,
0.01898193359375,
-0.0222015380859375,
-0.01187896728515625,
-0.031982421875,
0.016326904296875,
0.0218353271484375,
-0.0149078369140625,
-0.0401611328125,
0.035064697265625,
-0.0122222900390625,
-0.005771636962890625,
0.03424072265625,
-0.0258026123046875,
-0.04388427734375,
0.050384521484375,
0.0457763671875,
0.05865478515625,
-0.0234527587890625,
0.02288818359375,
0.053314208984375,
0.018218994140625,
-0.0009002685546875,
0.01922607421875,
0.0289154052734375,
-0.043853759765625,
-0.0231170654296875,
-0.061920166015625,
0.009307861328125,
0.035980224609375,
-0.047210693359375,
0.0287933349609375,
-0.0416259765625,
-0.035308837890625,
0.0097808837890625,
0.002124786376953125,
-0.04229736328125,
0.01137542724609375,
0.024017333984375,
0.052490234375,
-0.07391357421875,
0.065185546875,
0.05462646484375,
-0.039398193359375,
-0.055877685546875,
-0.006439208984375,
0.020263671875,
-0.058837890625,
0.035125732421875,
0.003963470458984375,
-0.0019073486328125,
-0.0068817138671875,
-0.03369140625,
-0.08087158203125,
0.0963134765625,
0.036956787109375,
-0.04022216796875,
-0.00640869140625,
0.0206756591796875,
0.04571533203125,
-0.01715087890625,
0.036346435546875,
0.03387451171875,
0.03448486328125,
-0.0027008056640625,
-0.09393310546875,
0.00537872314453125,
-0.0251922607421875,
0.005695343017578125,
0.01428985595703125,
-0.087158203125,
0.07794189453125,
-0.002559661865234375,
0.0015459060668945312,
-0.01392364501953125,
0.03497314453125,
0.035186767578125,
0.006694793701171875,
0.047607421875,
0.060791015625,
0.03216552734375,
-0.0116119384765625,
0.09210205078125,
-0.0167999267578125,
0.05902099609375,
0.0692138671875,
0.01422119140625,
0.03350830078125,
0.009490966796875,
-0.03704833984375,
0.04180908203125,
0.046905517578125,
-0.011199951171875,
0.026458740234375,
0.01122283935546875,
-0.02099609375,
-0.01428985595703125,
-0.00421905517578125,
-0.03253173828125,
0.041015625,
0.005126953125,
-0.0163421630859375,
-0.012237548828125,
0.012664794921875,
0.0220489501953125,
-0.0242767333984375,
-0.0195465087890625,
0.058013916015625,
0.016204833984375,
-0.03826904296875,
0.0648193359375,
-0.0107574462890625,
0.05474853515625,
-0.050201416015625,
0.01291656494140625,
-0.0110015869140625,
0.005001068115234375,
-0.0274810791015625,
-0.061279296875,
0.0172119140625,
0.00351715087890625,
-0.01143646240234375,
0.006702423095703125,
0.05340576171875,
-0.02154541015625,
-0.05487060546875,
0.03826904296875,
0.0187835693359375,
0.029876708984375,
0.0035076141357421875,
-0.08599853515625,
0.01013946533203125,
0.0102996826171875,
-0.037841796875,
0.02410888671875,
0.0188140869140625,
0.0039520263671875,
0.038665771484375,
0.044342041015625,
0.02496337890625,
0.031707763671875,
0.016082763671875,
0.057830810546875,
-0.046966552734375,
-0.0256805419921875,
-0.071044921875,
0.044464111328125,
-0.0174407958984375,
-0.0308074951171875,
0.057342529296875,
0.048370361328125,
0.073486328125,
-0.0113067626953125,
0.049560546875,
-0.0291748046875,
0.020965576171875,
-0.0340576171875,
0.045623779296875,
-0.035797119140625,
-0.004974365234375,
-0.035430908203125,
-0.06884765625,
-0.005947113037109375,
0.052490234375,
-0.0206756591796875,
0.015716552734375,
0.04254150390625,
0.055938720703125,
-0.0083465576171875,
-0.031524658203125,
0.0185546875,
0.0285491943359375,
0.0156402587890625,
0.055084228515625,
0.04248046875,
-0.05950927734375,
0.0428466796875,
-0.051788330078125,
-0.012786865234375,
-0.016204833984375,
-0.044586181640625,
-0.06890869140625,
-0.03179931640625,
-0.033172607421875,
-0.019134521484375,
-0.00611114501953125,
0.0750732421875,
0.06500244140625,
-0.058349609375,
-0.030517578125,
0.004974365234375,
0.0069427490234375,
-0.0207366943359375,
-0.0218658447265625,
0.04156494140625,
-0.00957489013671875,
-0.0733642578125,
-0.00629425048828125,
0.00931549072265625,
0.03338623046875,
0.005069732666015625,
-0.0129241943359375,
-0.0307464599609375,
-0.00174713134765625,
0.0538330078125,
0.0202484130859375,
-0.060302734375,
-0.0210418701171875,
-0.00524139404296875,
-0.0021877288818359375,
0.0139923095703125,
0.0201873779296875,
-0.03387451171875,
0.0305633544921875,
0.0572509765625,
0.01381683349609375,
0.06927490234375,
0.0096435546875,
0.0309295654296875,
-0.044647216796875,
0.0322265625,
0.005168914794921875,
0.0220184326171875,
0.003612518310546875,
-0.02252197265625,
0.048126220703125,
0.03424072265625,
-0.0309295654296875,
-0.05999755859375,
-0.00492095947265625,
-0.08148193359375,
-0.00916290283203125,
0.07794189453125,
-0.0267791748046875,
-0.0245208740234375,
0.01027679443359375,
-0.0170745849609375,
0.0152587890625,
-0.015899658203125,
0.04345703125,
0.0684814453125,
-0.0316162109375,
-0.023284912109375,
-0.05377197265625,
0.041534423828125,
0.0299224853515625,
-0.061492919921875,
-0.00708770751953125,
0.003665924072265625,
0.0247650146484375,
0.026397705078125,
0.049163818359375,
-0.0275421142578125,
0.0271148681640625,
-0.0063323974609375,
0.01163482666015625,
0.005496978759765625,
-0.0168914794921875,
-0.0211334228515625,
-0.0123291015625,
-0.015716552734375,
-0.0133056640625
]
] |
aneuraz/awesome-align-with-co | 2022-04-29T16:16:12.000Z | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"sentence alignment",
"de",
"fr",
"en",
"ro",
"zh",
"arxiv:2101.08231",
"license:bsd-3-clause",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | aneuraz | null | null | aneuraz/awesome-align-with-co | 1 | 17,265 | transformers | 2022-04-29T14:55:54 | ---
language:
- de
- fr
- en
- ro
- zh
thumbnail:
tags:
- sentence alignment
license: bsd-3-clause
---
# AWESOME: Aligning Word Embedding Spaces of Multilingual Encoders
This model comes from the following GitHub repository: [https://github.com/neulab/awesome-align](https://github.com/neulab/awesome-align)
It corresponds to this paper: [https://arxiv.org/abs/2101.08231](https://arxiv.org/abs/2101.08231)
Please cite the original paper if you decide to use the model:
```
@inproceedings{dou2021word,
title={Word Alignment by Fine-tuning Embeddings on Parallel Corpora},
author={Dou, Zi-Yi and Neubig, Graham},
booktitle={Conference of the European Chapter of the Association for Computational Linguistics (EACL)},
year={2021}
}
```
`awesome-align` is a tool that can extract word alignments from multilingual BERT (mBERT) [Demo](https://colab.research.google.com/drive/1205ubqebM0OsZa1nRgbGJBtitgHqIVv6?usp=sharing) and allows you to fine-tune mBERT on parallel corpora for better alignment quality (see our paper for more details).
## Usage (copied from this [DEMO](https://colab.research.google.com/drive/1205ubqebM0OsZa1nRgbGJBtitgHqIVv6?usp=sharing) )
```python
from transformers import AutoModel, AutoTokenizer
import itertools
import torch
# load model
model = AutoModel.from_pretrained("aneuraz/awesome-align-with-co")
tokenizer = AutoTokenizer.from_pretrained("aneuraz/awesome-align-with-co")
# model parameters
align_layer = 8
threshold = 1e-3
# define inputs
src = 'awesome-align is awesome !'
tgt = '牛对齐 是 牛 !'
# pre-processing
sent_src, sent_tgt = src.strip().split(), tgt.strip().split()
token_src, token_tgt = [tokenizer.tokenize(word) for word in sent_src], [tokenizer.tokenize(word) for word in sent_tgt]
wid_src, wid_tgt = [tokenizer.convert_tokens_to_ids(x) for x in token_src], [tokenizer.convert_tokens_to_ids(x) for x in token_tgt]
ids_src, ids_tgt = tokenizer.prepare_for_model(list(itertools.chain(*wid_src)), return_tensors='pt', model_max_length=tokenizer.model_max_length, truncation=True)['input_ids'], tokenizer.prepare_for_model(list(itertools.chain(*wid_tgt)), return_tensors='pt', truncation=True, model_max_length=tokenizer.model_max_length)['input_ids']
sub2word_map_src = []
for i, word_list in enumerate(token_src):
sub2word_map_src += [i for x in word_list]
sub2word_map_tgt = []
for i, word_list in enumerate(token_tgt):
sub2word_map_tgt += [i for x in word_list]
# alignment
align_layer = 8
threshold = 1e-3
model.eval()
with torch.no_grad():
out_src = model(ids_src.unsqueeze(0), output_hidden_states=True)[2][align_layer][0, 1:-1]
out_tgt = model(ids_tgt.unsqueeze(0), output_hidden_states=True)[2][align_layer][0, 1:-1]
dot_prod = torch.matmul(out_src, out_tgt.transpose(-1, -2))
softmax_srctgt = torch.nn.Softmax(dim=-1)(dot_prod)
softmax_tgtsrc = torch.nn.Softmax(dim=-2)(dot_prod)
softmax_inter = (softmax_srctgt > threshold)*(softmax_tgtsrc > threshold)
align_subwords = torch.nonzero(softmax_inter, as_tuple=False)
align_words = set()
for i, j in align_subwords:
align_words.add( (sub2word_map_src[i], sub2word_map_tgt[j]) )
print(align_words)
```
| 3,158 | [
[
-0.0192718505859375,
-0.05645751953125,
0.018890380859375,
0.011474609375,
-0.020904541015625,
-0.005397796630859375,
-0.0131988525390625,
-0.017425537109375,
0.01276397705078125,
0.008087158203125,
-0.034698486328125,
-0.061126708984375,
-0.04547119140625,
-0.005008697509765625,
-0.0248260498046875,
0.07574462890625,
-0.0202178955078125,
0.003833770751953125,
0.00543975830078125,
-0.0251617431640625,
-0.0277862548828125,
-0.031341552734375,
-0.0249176025390625,
-0.0251617431640625,
0.0078582763671875,
0.011474609375,
0.04132080078125,
0.044586181640625,
0.040496826171875,
0.0301513671875,
-0.001369476318359375,
0.0148773193359375,
-0.02264404296875,
-0.00464630126953125,
-0.01226806640625,
-0.02166748046875,
-0.03607177734375,
-0.00310516357421875,
0.046844482421875,
0.024322509765625,
-0.005710601806640625,
0.0214080810546875,
0.0014209747314453125,
0.0200042724609375,
-0.034759521484375,
0.01195526123046875,
-0.052703857421875,
0.004985809326171875,
-0.01561737060546875,
-0.00778961181640625,
-0.0145263671875,
-0.028900146484375,
0.00276947021484375,
-0.049407958984375,
0.0194091796875,
0.01299285888671875,
0.11529541015625,
0.0128173828125,
-0.0092620849609375,
-0.03399658203125,
-0.03302001953125,
0.0595703125,
-0.071044921875,
0.03472900390625,
0.006999969482421875,
-0.015899658203125,
-0.00904083251953125,
-0.06951904296875,
-0.066162109375,
-0.005092620849609375,
-0.004863739013671875,
0.0272369384765625,
-0.01480865478515625,
0.00862884521484375,
0.016815185546875,
0.039825439453125,
-0.050537109375,
0.00824737548828125,
-0.0266876220703125,
-0.0283203125,
0.03668212890625,
0.0195159912109375,
0.04107666015625,
-0.0162200927734375,
-0.03143310546875,
-0.022613525390625,
-0.0262451171875,
0.005886077880859375,
0.026763916015625,
0.0162811279296875,
-0.0283050537109375,
0.054412841796875,
0.00734710693359375,
0.056884765625,
0.0030879974365234375,
-0.006000518798828125,
0.04217529296875,
-0.03338623046875,
-0.0172576904296875,
-0.00539398193359375,
0.07684326171875,
0.0242767333984375,
0.00620269775390625,
-0.002185821533203125,
-0.01110076904296875,
0.02325439453125,
-0.01062774658203125,
-0.060882568359375,
-0.0291900634765625,
0.02935791015625,
-0.018524169921875,
-0.009185791015625,
-0.0008535385131835938,
-0.037841796875,
-0.011474609375,
-0.00490570068359375,
0.07733154296875,
-0.05987548828125,
-0.014129638671875,
0.019195556640625,
-0.0224609375,
0.0233612060546875,
-0.010650634765625,
-0.060211181640625,
0.0038890838623046875,
0.039306640625,
0.06756591796875,
0.01407623291015625,
-0.0443115234375,
-0.0226593017578125,
0.00417327880859375,
-0.020721435546875,
0.028656005859375,
-0.0292816162109375,
-0.0153656005859375,
-0.01346588134765625,
-0.0005965232849121094,
-0.036285400390625,
-0.02032470703125,
0.04547119140625,
-0.0303497314453125,
0.0394287109375,
-0.01174163818359375,
-0.065673828125,
-0.017242431640625,
0.0224456787109375,
-0.0267791748046875,
0.08428955078125,
0.0120849609375,
-0.06756591796875,
0.019134521484375,
-0.04541015625,
-0.0203094482421875,
0.0019817352294921875,
-0.0160369873046875,
-0.04815673828125,
0.00971221923828125,
0.0304412841796875,
0.0341796875,
-0.00594329833984375,
0.003971099853515625,
-0.020416259765625,
-0.02215576171875,
0.0152130126953125,
-0.0111541748046875,
0.08184814453125,
0.0152587890625,
-0.04144287109375,
0.0178985595703125,
-0.04669189453125,
0.0227508544921875,
0.0135650634765625,
-0.01763916015625,
-0.0073394775390625,
-0.0190277099609375,
0.005382537841796875,
0.02490234375,
0.0290985107421875,
-0.059356689453125,
0.018798828125,
-0.035736083984375,
0.04193115234375,
0.04986572265625,
-0.01450347900390625,
0.0238189697265625,
-0.0171661376953125,
0.03240966796875,
0.006664276123046875,
0.00579071044921875,
0.0090484619140625,
-0.026580810546875,
-0.06402587890625,
-0.04339599609375,
0.045013427734375,
0.044769287109375,
-0.047943115234375,
0.06402587890625,
-0.030609130859375,
-0.043426513671875,
-0.061279296875,
0.0005488395690917969,
0.0268096923828125,
0.03839111328125,
0.034088134765625,
-0.0225067138671875,
-0.039886474609375,
-0.059539794921875,
-0.01174163818359375,
0.004810333251953125,
0.01137542724609375,
0.004779815673828125,
0.048980712890625,
-0.0211944580078125,
0.0625,
-0.0350341796875,
-0.04071044921875,
-0.0203094482421875,
0.01309967041015625,
0.0301361083984375,
0.05609130859375,
0.0372314453125,
-0.032135009765625,
-0.045654296875,
-0.00501251220703125,
-0.05316162109375,
0.00098419189453125,
-0.0014085769653320312,
-0.03515625,
0.0159149169921875,
0.03302001953125,
-0.0537109375,
0.0275421142578125,
0.035430908203125,
-0.033905029296875,
0.031982421875,
-0.03411865234375,
-0.0012006759643554688,
-0.0958251953125,
0.00771331787109375,
-0.011016845703125,
0.0008382797241210938,
-0.040435791015625,
0.0253753662109375,
0.02606201171875,
-0.00196075439453125,
-0.025238037109375,
0.044586181640625,
-0.061065673828125,
0.01311492919921875,
-0.00423431396484375,
0.012054443359375,
0.00667572021484375,
0.038818359375,
-0.0083465576171875,
0.042510986328125,
0.059814453125,
-0.0307464599609375,
0.0192108154296875,
0.0290374755859375,
-0.024169921875,
0.02557373046875,
-0.0465087890625,
0.0093536376953125,
0.0009236335754394531,
0.0107269287109375,
-0.087646484375,
-0.007732391357421875,
0.0179595947265625,
-0.04852294921875,
0.03765869140625,
0.004802703857421875,
-0.04852294921875,
-0.0440673828125,
-0.031005859375,
0.02294921875,
0.032470703125,
-0.038787841796875,
0.041961669921875,
0.031494140625,
0.01055908203125,
-0.0501708984375,
-0.06036376953125,
-0.009918212890625,
-0.007648468017578125,
-0.05419921875,
0.04571533203125,
-0.006702423095703125,
0.012969970703125,
0.022918701171875,
-0.00311279296875,
0.007415771484375,
-0.005298614501953125,
-0.002361297607421875,
0.030792236328125,
-0.005374908447265625,
0.0048065185546875,
-0.0019741058349609375,
-0.0018634796142578125,
-0.006282806396484375,
-0.02935791015625,
0.06622314453125,
-0.025146484375,
-0.0225372314453125,
-0.041778564453125,
0.02105712890625,
0.038177490234375,
-0.0325927734375,
0.08197021484375,
0.0831298828125,
-0.046844482421875,
-0.0011272430419921875,
-0.03839111328125,
-0.0087890625,
-0.03753662109375,
0.047760009765625,
-0.037200927734375,
-0.05206298828125,
0.0526123046875,
0.0269927978515625,
0.0240020751953125,
0.048095703125,
0.05084228515625,
0.00290679931640625,
0.093505859375,
0.0467529296875,
-0.006130218505859375,
0.032257080078125,
-0.0572509765625,
0.044464111328125,
-0.0751953125,
-0.017486572265625,
-0.0244293212890625,
-0.0251312255859375,
-0.049835205078125,
-0.0206298828125,
0.0204315185546875,
0.028656005859375,
-0.013427734375,
0.029083251953125,
-0.06085205078125,
0.012969970703125,
0.037445068359375,
-0.00016868114471435547,
-0.0190887451171875,
0.0108489990234375,
-0.045135498046875,
-0.0180206298828125,
-0.059326171875,
-0.0190887451171875,
0.0650634765625,
0.0174560546875,
0.04833984375,
0.006610870361328125,
0.06781005859375,
-0.0181121826171875,
0.0103912353515625,
-0.051971435546875,
0.0439453125,
-0.016754150390625,
-0.04290771484375,
-0.01230621337890625,
-0.0225067138671875,
-0.0662841796875,
0.0265960693359375,
-0.017303466796875,
-0.06591796875,
0.0026683807373046875,
0.0133514404296875,
-0.036224365234375,
0.03240966796875,
-0.05462646484375,
0.06927490234375,
-0.006465911865234375,
-0.03729248046875,
-0.00620269775390625,
-0.034332275390625,
0.017578125,
0.01401519775390625,
0.01187896728515625,
-0.01239013671875,
-0.0011272430419921875,
0.0748291015625,
-0.049713134765625,
0.0310211181640625,
-0.01250457763671875,
0.00003331899642944336,
0.0202178955078125,
-0.0109710693359375,
0.038909912109375,
-0.01806640625,
-0.00925445556640625,
0.01300811767578125,
-0.005687713623046875,
-0.0269622802734375,
-0.0247955322265625,
0.06304931640625,
-0.06817626953125,
-0.0311431884765625,
-0.03521728515625,
-0.055206298828125,
-0.00006031990051269531,
0.027191162109375,
0.05889892578125,
0.049591064453125,
0.01959228515625,
0.00922393798828125,
0.034027099609375,
-0.022216796875,
0.0418701171875,
0.0200653076171875,
-0.01499176025390625,
-0.04876708984375,
0.07989501953125,
0.04522705078125,
0.00582122802734375,
0.024200439453125,
0.01328277587890625,
-0.044830322265625,
-0.037567138671875,
-0.03485107421875,
0.032623291015625,
-0.047821044921875,
-0.018646240234375,
-0.06951904296875,
-0.03143310546875,
-0.056732177734375,
-0.0155792236328125,
-0.0230560302734375,
-0.0341796875,
-0.02593994140625,
-0.01218414306640625,
0.034759521484375,
0.0290985107421875,
-0.027435302734375,
0.0134124755859375,
-0.05059814453125,
0.019134521484375,
0.01104736328125,
0.0140533447265625,
-0.0019207000732421875,
-0.0618896484375,
-0.0243072509765625,
-0.01157379150390625,
-0.027099609375,
-0.06951904296875,
0.053741455078125,
0.018463134765625,
0.04541015625,
0.0093536376953125,
0.007213592529296875,
0.07171630859375,
-0.029266357421875,
0.058624267578125,
0.0208282470703125,
-0.0892333984375,
0.023193359375,
-0.007350921630859375,
0.030975341796875,
0.0237884521484375,
0.0266571044921875,
-0.042327880859375,
-0.0345458984375,
-0.04730224609375,
-0.1072998046875,
0.059967041015625,
0.0266876220703125,
0.01256561279296875,
-0.006679534912109375,
0.01021575927734375,
-0.0017786026000976562,
0.0031280517578125,
-0.058929443359375,
-0.03759765625,
-0.0241546630859375,
-0.03900146484375,
0.0034389495849609375,
-0.0179443359375,
-0.00962066650390625,
-0.039031982421875,
0.07928466796875,
0.0104217529296875,
0.049346923828125,
0.026031494140625,
-0.02923583984375,
0.01275634765625,
0.01641845703125,
0.04595947265625,
0.03948974609375,
-0.0172576904296875,
0.006946563720703125,
0.0206756591796875,
-0.034912109375,
-0.0009512901306152344,
0.03179931640625,
0.00247955322265625,
0.0093994140625,
0.0228271484375,
0.064697265625,
-0.00519561767578125,
-0.0124969482421875,
0.03485107421875,
-0.007770538330078125,
-0.028472900390625,
-0.022430419921875,
-0.005573272705078125,
0.0208282470703125,
0.00714111328125,
0.017578125,
-0.004909515380859375,
-0.011871337890625,
-0.0208282470703125,
0.0027332305908203125,
0.042205810546875,
-0.01922607421875,
-0.0237579345703125,
0.068359375,
-0.000016450881958007812,
-0.023193359375,
0.055694580078125,
-0.0272979736328125,
-0.061859130859375,
0.03802490234375,
0.047607421875,
0.0677490234375,
0.006198883056640625,
-0.00319671630859375,
0.045654296875,
0.0276336669921875,
0.00946044921875,
0.038543701171875,
0.0025730133056640625,
-0.06787109375,
-0.005123138427734375,
-0.05218505859375,
0.0030460357666015625,
0.0206756591796875,
-0.04510498046875,
0.01245880126953125,
-0.0303802490234375,
-0.0171661376953125,
0.0025997161865234375,
0.01715087890625,
-0.0654296875,
0.009765625,
0.01244354248046875,
0.055267333984375,
-0.05889892578125,
0.07208251953125,
0.053314208984375,
-0.033782958984375,
-0.09503173828125,
-0.00644683837890625,
-0.01235198974609375,
-0.06451416015625,
0.030029296875,
0.02734375,
-0.003635406494140625,
0.032135009765625,
-0.038665771484375,
-0.08233642578125,
0.08502197265625,
0.034332275390625,
-0.033233642578125,
0.0030574798583984375,
0.004344940185546875,
0.0367431640625,
-0.007030487060546875,
0.0304412841796875,
0.0313720703125,
0.03045654296875,
-0.0224456787109375,
-0.0594482421875,
0.0137176513671875,
-0.03704833984375,
0.005908966064453125,
0.0162200927734375,
-0.048614501953125,
0.085693359375,
-0.0111083984375,
-0.0267486572265625,
0.01561737060546875,
0.05877685546875,
0.0303955078125,
-0.01013946533203125,
0.022735595703125,
0.07403564453125,
0.06005859375,
-0.012725830078125,
0.0670166015625,
-0.040252685546875,
0.0565185546875,
0.06878662109375,
-0.007415771484375,
0.055206298828125,
0.032958984375,
-0.004474639892578125,
0.042694091796875,
0.04278564453125,
-0.004119873046875,
0.037872314453125,
0.0092926025390625,
-0.01181793212890625,
-0.0022335052490234375,
-0.0007328987121582031,
-0.053924560546875,
0.0275115966796875,
0.031951904296875,
-0.0248870849609375,
-0.0122833251953125,
0.0026493072509765625,
0.01299285888671875,
-0.01412200927734375,
-0.01221466064453125,
0.0322265625,
-0.0034008026123046875,
-0.034515380859375,
0.0584716796875,
0.012054443359375,
0.0792236328125,
-0.045074462890625,
0.010467529296875,
-0.001567840576171875,
0.0391845703125,
-0.0226593017578125,
-0.049285888671875,
0.005985260009765625,
-0.01247406005859375,
-0.0081939697265625,
-0.006389617919921875,
0.03106689453125,
-0.051910400390625,
-0.05230712890625,
0.0301055908203125,
0.0189361572265625,
0.0177764892578125,
0.0239715576171875,
-0.0760498046875,
0.01052093505859375,
0.0032176971435546875,
-0.0313720703125,
0.006687164306640625,
0.027191162109375,
0.01044464111328125,
0.022857666015625,
0.03961181640625,
0.004314422607421875,
0.035858154296875,
0.0003402233123779297,
0.048828125,
-0.0426025390625,
-0.037017822265625,
-0.0771484375,
0.042144775390625,
-0.01110076904296875,
-0.041351318359375,
0.06951904296875,
0.05291748046875,
0.0765380859375,
0.0000998377799987793,
0.051239013671875,
-0.033111572265625,
0.0235595703125,
-0.034515380859375,
0.06402587890625,
-0.05279541015625,
-0.0025691986083984375,
-0.02105712890625,
-0.057769775390625,
-0.01995849609375,
0.05291748046875,
-0.020416259765625,
-0.004749298095703125,
0.053863525390625,
0.0787353515625,
-0.0088958740234375,
-0.0197296142578125,
0.0158538818359375,
0.0273284912109375,
0.0278167724609375,
0.039215087890625,
0.033905029296875,
-0.05841064453125,
0.04150390625,
-0.05029296875,
-0.004974365234375,
0.00008934736251831055,
-0.052490234375,
-0.043365478515625,
-0.052154541015625,
-0.032318115234375,
-0.03533935546875,
-0.0086517333984375,
0.0784912109375,
0.047332763671875,
-0.06341552734375,
-0.0090789794921875,
-0.016082763671875,
0.0001609325408935547,
-0.0196990966796875,
-0.0227203369140625,
0.05755615234375,
-0.0234527587890625,
-0.06378173828125,
0.00542449951171875,
0.00014066696166992188,
0.0013170242309570312,
0.0028133392333984375,
-0.0018873214721679688,
-0.0362548828125,
0.00701141357421875,
0.0290069580078125,
0.0133514404296875,
-0.0509033203125,
-0.0230712890625,
0.0024566650390625,
-0.02935791015625,
0.0035305023193359375,
0.0193328857421875,
-0.03875732421875,
0.0382080078125,
0.046356201171875,
0.03118896484375,
0.03875732421875,
-0.0254364013671875,
0.033660888671875,
-0.04833984375,
0.01412200927734375,
0.00437164306640625,
0.0452880859375,
0.03265380859375,
-0.0216217041015625,
0.0282440185546875,
0.0202789306640625,
-0.041168212890625,
-0.0701904296875,
-0.005001068115234375,
-0.06085205078125,
-0.01110076904296875,
0.07183837890625,
-0.0234527587890625,
-0.029266357421875,
0.01081085205078125,
-0.0187530517578125,
0.048553466796875,
-0.0170440673828125,
0.05084228515625,
0.051849365234375,
-0.00989532470703125,
0.0113372802734375,
-0.0190887451171875,
0.0194091796875,
0.05645751953125,
-0.036285400390625,
-0.01399993896484375,
0.014801025390625,
0.037139892578125,
0.0246124267578125,
0.050994873046875,
-0.009765625,
0.01291656494140625,
0.01093292236328125,
0.0294036865234375,
-0.016387939453125,
0.0037078857421875,
-0.032470703125,
0.0151519775390625,
-0.0152587890625,
-0.050048828125
]
] |
sentence-transformers/sentence-t5-base | 2022-06-21T14:56:18.000Z | [
"sentence-transformers",
"pytorch",
"rust",
"t5",
"feature-extraction",
"sentence-similarity",
"transformers",
"en",
"arxiv:2108.08877",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/sentence-t5-base | 28 | 17,188 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
language: en
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# sentence-transformers/sentence-t5-base
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space. The model works well for sentence similarity tasks, but doesn't perform that well for semantic search tasks.
This model was converted from the Tensorflow model [st5-base-1](https://tfhub.dev/google/sentence-t5/st5-base/1) to PyTorch. When using this model, have a look at the publication: [Sentence-T5: Scalable sentence encoders from pre-trained text-to-text models](https://arxiv.org/abs/2108.08877). The tfhub model and this PyTorch model can produce slightly different embeddings, however, when run on the same benchmarks, they produce identical results.
The model uses only the encoder from a T5-base model. The weights are stored in FP16.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/sentence-t5-base')
embeddings = model.encode(sentences)
print(embeddings)
```
The model requires sentence-transformers version 2.2.0 or newer.
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/sentence-t5-base)
## Citing & Authors
If you find this model helpful, please cite the respective publication:
[Sentence-T5: Scalable sentence encoders from pre-trained text-to-text models](https://arxiv.org/abs/2108.08877)
| 2,014 | [
[
-0.005695343017578125,
-0.047607421875,
0.038604736328125,
0.0231475830078125,
-0.020660400390625,
-0.0237579345703125,
-0.017120361328125,
-0.013336181640625,
0.0012493133544921875,
0.039947509765625,
-0.03546142578125,
-0.047210693359375,
-0.061920166015625,
0.0177154541015625,
-0.0528564453125,
0.072021484375,
-0.0108795166015625,
-0.00009578466415405273,
-0.0275421142578125,
-0.01006317138671875,
-0.0226593017578125,
-0.0245819091796875,
-0.01473236083984375,
-0.02301025390625,
0.020751953125,
0.024169921875,
0.0521240234375,
0.036102294921875,
0.05548095703125,
0.02703857421875,
-0.0065155029296875,
-0.0143280029296875,
-0.04425048828125,
-0.00323486328125,
-0.0091552734375,
-0.0207366943359375,
-0.0201873779296875,
-0.0026187896728515625,
0.034637451171875,
0.052703857421875,
-0.01418304443359375,
0.0190887451171875,
-0.019073486328125,
0.0167694091796875,
-0.03619384765625,
0.01580810546875,
-0.037750244140625,
0.0178070068359375,
0.0021877288818359375,
-0.00345611572265625,
-0.0443115234375,
-0.0300445556640625,
0.029296875,
-0.0242767333984375,
0.032501220703125,
0.016510009765625,
0.0908203125,
0.03277587890625,
-0.034515380859375,
-0.0204010009765625,
-0.03173828125,
0.050689697265625,
-0.053131103515625,
0.0322265625,
0.00911712646484375,
0.01381683349609375,
-0.00888824462890625,
-0.09185791015625,
-0.039794921875,
-0.02996826171875,
-0.0164642333984375,
0.01153564453125,
-0.037445068359375,
0.0108795166015625,
0.037078857421875,
0.03106689453125,
-0.05181884765625,
-0.00502777099609375,
-0.046905517578125,
-0.012664794921875,
0.019378662109375,
0.01000213623046875,
0.02734375,
-0.036529541015625,
-0.0390625,
-0.0245513916015625,
-0.023895263671875,
-0.004802703857421875,
0.0175628662109375,
0.002422332763671875,
-0.01556396484375,
0.07305908203125,
-0.007244110107421875,
0.045654296875,
0.01073455810546875,
0.0055999755859375,
0.043121337890625,
-0.023406982421875,
-0.01337432861328125,
0.002048492431640625,
0.07476806640625,
0.037750244140625,
0.034271240234375,
-0.036651611328125,
-0.0150604248046875,
0.018402099609375,
0.047088623046875,
-0.0833740234375,
-0.0252685546875,
0.018402099609375,
-0.051605224609375,
-0.036376953125,
0.018951416015625,
-0.03302001953125,
0.00348663330078125,
0.012298583984375,
0.0548095703125,
-0.0430908203125,
0.0102691650390625,
0.017913818359375,
-0.0313720703125,
0.0199127197265625,
-0.01031494140625,
-0.060638427734375,
0.02978515625,
0.026123046875,
0.0616455078125,
-0.0098724365234375,
-0.03680419921875,
-0.0177001953125,
-0.003307342529296875,
-0.01488494873046875,
0.055633544921875,
-0.030975341796875,
-0.0200042724609375,
0.0035495758056640625,
0.0157470703125,
-0.0078887939453125,
-0.04278564453125,
0.06427001953125,
-0.020294189453125,
0.045745849609375,
0.007144927978515625,
-0.05322265625,
-0.0136871337890625,
0.014923095703125,
-0.048248291015625,
0.0811767578125,
0.021087646484375,
-0.06256103515625,
0.0204315185546875,
-0.05126953125,
-0.0261993408203125,
-0.0165557861328125,
0.0111541748046875,
-0.055511474609375,
0.013214111328125,
0.0198822021484375,
0.0423583984375,
-0.018585205078125,
0.00933837890625,
-0.03271484375,
-0.049346923828125,
0.0156707763671875,
-0.016510009765625,
0.06585693359375,
0.00643157958984375,
-0.02337646484375,
0.0189971923828125,
-0.037567138671875,
-0.0035724639892578125,
0.0118408203125,
-0.021331787109375,
0.004150390625,
-0.007450103759765625,
0.031494140625,
0.0135040283203125,
0.033599853515625,
-0.051239013671875,
0.0273590087890625,
-0.0343017578125,
0.0550537109375,
0.0306854248046875,
-0.0103302001953125,
0.048980712890625,
-0.02532958984375,
0.0212249755859375,
0.026824951171875,
-0.0144195556640625,
-0.0128631591796875,
-0.025146484375,
-0.0625,
0.0032863616943359375,
0.018585205078125,
0.02960205078125,
-0.04534912109375,
0.0599365234375,
-0.0513916015625,
-0.04522705078125,
-0.058502197265625,
-0.018829345703125,
0.0022144317626953125,
0.03515625,
0.052978515625,
-0.004047393798828125,
-0.051300048828125,
-0.0789794921875,
-0.035888671875,
-0.0002224445343017578,
-0.0038604736328125,
-0.00861358642578125,
0.06256103515625,
-0.0360107421875,
0.0614013671875,
-0.034515380859375,
-0.031280517578125,
-0.02911376953125,
0.01444244384765625,
0.0095367431640625,
0.035430908203125,
0.048675537109375,
-0.0457763671875,
-0.023223876953125,
-0.02203369140625,
-0.049560546875,
-0.01074981689453125,
0.00365447998046875,
0.004856109619140625,
0.0101470947265625,
0.034759521484375,
-0.06744384765625,
0.032928466796875,
0.03692626953125,
-0.043365478515625,
0.0167694091796875,
-0.0176849365234375,
-0.0018329620361328125,
-0.116943359375,
0.0064544677734375,
-0.007335662841796875,
-0.02783203125,
-0.0225067138671875,
0.0092010498046875,
0.0177154541015625,
-0.0162811279296875,
-0.0214080810546875,
0.019989013671875,
-0.0222320556640625,
-0.01111602783203125,
-0.00707244873046875,
0.0020656585693359375,
-0.0079193115234375,
0.040008544921875,
-0.0147552490234375,
0.0693359375,
0.0251007080078125,
-0.0257568359375,
0.037261962890625,
0.047149658203125,
-0.0345458984375,
0.01165771484375,
-0.072021484375,
0.01396942138671875,
-0.0115966796875,
0.0295562744140625,
-0.07305908203125,
-0.008941650390625,
0.017913818359375,
-0.040771484375,
-0.01035308837890625,
0.0073089599609375,
-0.051605224609375,
-0.0282135009765625,
-0.0234527587890625,
0.0136260986328125,
0.05047607421875,
-0.0347900390625,
0.055328369140625,
0.0023593902587890625,
0.01009368896484375,
-0.043670654296875,
-0.0758056640625,
0.00978851318359375,
-0.01934814453125,
-0.0513916015625,
0.05987548828125,
0.006679534912109375,
0.01473236083984375,
0.029937744140625,
-0.0042572021484375,
0.0022106170654296875,
-0.00653076171875,
0.01355743408203125,
-0.0015592575073242188,
-0.0188751220703125,
0.0113372802734375,
-0.0004379749298095703,
-0.01044464111328125,
0.009429931640625,
-0.03265380859375,
0.049530029296875,
-0.0194854736328125,
-0.0037136077880859375,
-0.023406982421875,
0.0175323486328125,
0.044677734375,
-0.0131378173828125,
0.067626953125,
0.0704345703125,
-0.025726318359375,
-0.01520538330078125,
-0.0386962890625,
-0.018585205078125,
-0.033172607421875,
0.04852294921875,
-0.037353515625,
-0.0726318359375,
0.027008056640625,
0.003971099853515625,
-0.00658416748046875,
0.0560302734375,
0.04278564453125,
-0.0113983154296875,
0.05560302734375,
0.045318603515625,
-0.0015020370483398438,
0.038299560546875,
-0.023895263671875,
0.026092529296875,
-0.04388427734375,
-0.00836944580078125,
-0.041900634765625,
-0.0252838134765625,
-0.0638427734375,
-0.023406982421875,
0.0165557861328125,
-0.01461029052734375,
-0.03369140625,
0.058807373046875,
-0.042022705078125,
0.0166015625,
0.041229248046875,
0.005084991455078125,
0.004497528076171875,
0.0198516845703125,
-0.018096923828125,
-0.01386260986328125,
-0.051422119140625,
-0.039947509765625,
0.0760498046875,
0.0192413330078125,
0.0533447265625,
0.0092010498046875,
0.038543701171875,
0.0084075927734375,
-0.01323699951171875,
-0.0634765625,
0.03912353515625,
-0.03875732421875,
-0.0276336669921875,
-0.004199981689453125,
-0.0311431884765625,
-0.07794189453125,
0.01861572265625,
-0.022125244140625,
-0.05218505859375,
-0.0098724365234375,
-0.0269775390625,
-0.00763702392578125,
0.0163421630859375,
-0.06854248046875,
0.099609375,
-0.0026836395263671875,
-0.004329681396484375,
-0.01029205322265625,
-0.039215087890625,
-0.01110076904296875,
0.0247039794921875,
-0.02197265625,
0.006885528564453125,
-0.0012159347534179688,
0.053436279296875,
-0.0280914306640625,
0.04632568359375,
0.0079803466796875,
0.01849365234375,
0.00853729248046875,
-0.00754547119140625,
0.029937744140625,
-0.0263671875,
-0.0058746337890625,
0.006916046142578125,
-0.00247955322265625,
-0.033447265625,
-0.047088623046875,
0.047943115234375,
-0.08258056640625,
-0.032928466796875,
-0.031402587890625,
-0.040008544921875,
0.008544921875,
0.025970458984375,
0.032958984375,
0.03436279296875,
-0.023040771484375,
0.06005859375,
0.028228759765625,
-0.01009368896484375,
0.041168212890625,
0.0045928955078125,
0.0024051666259765625,
-0.023895263671875,
0.036102294921875,
0.00281524658203125,
0.0030975341796875,
0.05352783203125,
0.00899505615234375,
-0.0374755859375,
-0.029693603515625,
-0.015350341796875,
0.004505157470703125,
-0.0540771484375,
-0.0036220550537109375,
-0.07196044921875,
-0.0254974365234375,
-0.0528564453125,
-0.0027313232421875,
-0.016143798828125,
-0.0307769775390625,
-0.029327392578125,
-0.0263824462890625,
0.038482666015625,
0.05133056640625,
0.01464080810546875,
0.03192138671875,
-0.05133056640625,
0.028228759765625,
-0.00392913818359375,
0.01361083984375,
-0.01206207275390625,
-0.049896240234375,
-0.01409149169921875,
-0.00897979736328125,
-0.033233642578125,
-0.06829833984375,
0.043304443359375,
0.0035305023193359375,
0.030487060546875,
0.0118560791015625,
0.0004076957702636719,
0.052978515625,
-0.041900634765625,
0.06005859375,
-0.0005316734313964844,
-0.07403564453125,
0.0243377685546875,
-0.022674560546875,
0.039886474609375,
0.035888671875,
0.02386474609375,
-0.03521728515625,
-0.010894775390625,
-0.058685302734375,
-0.07122802734375,
0.053802490234375,
0.0382080078125,
0.026824951171875,
0.0012264251708984375,
0.0163726806640625,
-0.0003821849822998047,
0.02313232421875,
-0.0732421875,
-0.009033203125,
-0.043304443359375,
-0.055328369140625,
-0.01544189453125,
-0.02044677734375,
0.022430419921875,
-0.00684356689453125,
0.029571533203125,
0.00336456298828125,
0.053680419921875,
0.0159149169921875,
-0.03509521484375,
0.01100921630859375,
0.022308349609375,
0.03375244140625,
0.0171356201171875,
-0.02459716796875,
0.017120361328125,
0.0294342041015625,
-0.033935546875,
-0.007419586181640625,
0.0235443115234375,
-0.00548553466796875,
0.01374053955078125,
0.0382080078125,
0.07745361328125,
0.035400390625,
-0.02386474609375,
0.051666259765625,
0.0073699951171875,
-0.0169525146484375,
-0.040802001953125,
-0.007480621337890625,
0.02685546875,
0.01364898681640625,
0.0191802978515625,
0.015899658203125,
0.0137176513671875,
-0.043365478515625,
0.0170440673828125,
-0.00011020898818969727,
-0.031005859375,
-0.00954437255859375,
0.060089111328125,
0.00927734375,
-0.02984619140625,
0.072509765625,
0.002590179443359375,
-0.05010986328125,
0.0435791015625,
0.046844482421875,
0.0736083984375,
0.01947021484375,
0.00921630859375,
0.034912109375,
0.0283203125,
-0.022857666015625,
0.0093231201171875,
-0.004245758056640625,
-0.062744140625,
-0.01605224609375,
-0.035308837890625,
-0.0150909423828125,
-0.004425048828125,
-0.0396728515625,
0.031097412109375,
-0.0197601318359375,
-0.0108489990234375,
0.002056121826171875,
-0.0019292831420898438,
-0.058441162109375,
0.01290130615234375,
-0.00893402099609375,
0.056121826171875,
-0.053253173828125,
0.055267333984375,
0.06317138671875,
-0.06494140625,
-0.06549072265625,
0.0093841552734375,
-0.04278564453125,
-0.0450439453125,
0.048736572265625,
0.025421142578125,
0.006683349609375,
0.0173187255859375,
-0.044830322265625,
-0.050140380859375,
0.09600830078125,
0.0291595458984375,
-0.0217742919921875,
-0.03424072265625,
0.03131103515625,
0.050537109375,
-0.03521728515625,
0.041778564453125,
0.01434326171875,
0.0225982666015625,
-0.0029697418212890625,
-0.0693359375,
0.0203399658203125,
-0.0175018310546875,
0.0178985595703125,
-0.0034027099609375,
-0.042388916015625,
0.08026123046875,
-0.004123687744140625,
0.0024280548095703125,
0.0185394287109375,
0.05328369140625,
0.01016998291015625,
-0.0104217529296875,
0.0245208740234375,
0.0634765625,
0.035369873046875,
-0.0194854736328125,
0.07977294921875,
-0.0181427001953125,
0.063720703125,
0.06292724609375,
0.0003514289855957031,
0.08026123046875,
0.0408935546875,
-0.001453399658203125,
0.0648193359375,
0.0404052734375,
-0.039581298828125,
0.037933349609375,
0.00860595703125,
0.0137939453125,
-0.022308349609375,
0.0107269287109375,
-0.022857666015625,
0.037445068359375,
0.0084228515625,
-0.05316162109375,
-0.022552490234375,
-0.0142059326171875,
-0.007740020751953125,
-0.002399444580078125,
0.002925872802734375,
0.046783447265625,
0.0122222900390625,
-0.041412353515625,
0.02752685546875,
0.02081298828125,
0.060516357421875,
-0.029205322265625,
0.0006718635559082031,
-0.0016078948974609375,
0.037353515625,
-0.0138702392578125,
-0.06036376953125,
0.034271240234375,
-0.003753662109375,
-0.012939453125,
-0.037200927734375,
0.06817626953125,
-0.04034423828125,
-0.04278564453125,
0.0231170654296875,
0.03399658203125,
0.008575439453125,
0.00048089027404785156,
-0.0517578125,
-0.00333404541015625,
-0.00620269775390625,
-0.01242828369140625,
-0.0009775161743164062,
0.0295867919921875,
0.0087890625,
0.04473876953125,
0.031005859375,
-0.0230712890625,
0.0031757354736328125,
0.006725311279296875,
0.04852294921875,
-0.0648193359375,
-0.0491943359375,
-0.055877685546875,
0.041259765625,
-0.00705718994140625,
-0.03778076171875,
0.049346923828125,
0.048065185546875,
0.0709228515625,
-0.035736083984375,
0.04498291015625,
-0.00958251953125,
0.02142333984375,
-0.034637451171875,
0.06494140625,
-0.04833984375,
-0.01490020751953125,
-0.015960693359375,
-0.07891845703125,
-0.014556884765625,
0.0784912109375,
-0.02740478515625,
0.0162811279296875,
0.08258056640625,
0.058319091796875,
-0.02362060546875,
-0.010833740234375,
0.0172271728515625,
0.034332275390625,
0.007137298583984375,
0.0455322265625,
0.0443115234375,
-0.0703125,
0.068115234375,
-0.0090789794921875,
0.032012939453125,
-0.0198974609375,
-0.0521240234375,
-0.083984375,
-0.047393798828125,
-0.0208587646484375,
-0.0374755859375,
0.0106201171875,
0.078125,
0.041778564453125,
-0.0535888671875,
-0.0135955810546875,
-0.03167724609375,
-0.0167999267578125,
0.0023212432861328125,
-0.01910400390625,
0.021209716796875,
-0.022369384765625,
-0.06640625,
0.0106048583984375,
-0.01209259033203125,
0.0018663406372070312,
-0.0135040283203125,
0.0201263427734375,
-0.00870513916015625,
-0.015655517578125,
0.033843994140625,
-0.02716064453125,
-0.064208984375,
-0.0419921875,
0.02081298828125,
-0.033721923828125,
0.00550079345703125,
0.03472900390625,
-0.059539794921875,
0.0135498046875,
0.058197021484375,
0.052886962890625,
0.06402587890625,
-0.0101470947265625,
0.058197021484375,
-0.03814697265625,
0.01441192626953125,
0.01218414306640625,
0.03985595703125,
0.037445068359375,
-0.0078887939453125,
0.042205810546875,
0.0176239013671875,
-0.03057861328125,
-0.039581298828125,
0.00042748451232910156,
-0.0955810546875,
-0.0092926025390625,
0.096923828125,
-0.0106353759765625,
-0.0204620361328125,
0.0196533203125,
-0.0252838134765625,
0.034637451171875,
-0.033203125,
0.061004638671875,
0.06695556640625,
0.023681640625,
-0.027801513671875,
-0.0182647705078125,
0.0222320556640625,
0.037841796875,
-0.037811279296875,
-0.03045654296875,
0.014068603515625,
0.040740966796875,
0.0107269287109375,
0.01401519775390625,
-0.002899169921875,
0.0115509033203125,
0.0211944580078125,
0.0167999267578125,
-0.01448822021484375,
0.0018739700317382812,
-0.018585205078125,
0.033966064453125,
-0.02142333984375,
-0.0184173583984375
]
] |
timm/vit_large_patch14_clip_224.openai_ft_in12k_in1k | 2023-05-06T00:12:58.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:wit-400m",
"dataset:imagenet-12k",
"arxiv:2212.07143",
"arxiv:2103.00020",
"arxiv:2010.11929",
"license:apache-2.0",
"has_space",
"region:us"
] | image-classification | timm | null | null | timm/vit_large_patch14_clip_224.openai_ft_in12k_in1k | 33 | 17,148 | timm | 2022-11-03T04:37:01 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
- wit-400m
- imagenet-12k
---
# Model card for vit_large_patch14_clip_224.openai_ft_in12k_in1k
A Vision Transformer (ViT) image classification model. Pretrained on WIT-400M image-text pairs by OpenAI using CLIP. Fine-tuned on ImageNet-12k and then ImageNet-1k in `timm`. See recipes in [Reproducible scaling laws](https://arxiv.org/abs/2212.07143).
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 304.2
- GMACs: 77.8
- Activations (M): 57.1
- Image size: 224 x 224
- **Papers:**
- Learning Transferable Visual Models From Natural Language Supervision: https://arxiv.org/abs/2103.00020
- Reproducible scaling laws for contrastive language-image learning: https://arxiv.org/abs/2212.07143
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:**
- WIT-400M
- ImageNet-12k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_large_patch14_clip_224.openai_ft_in12k_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_large_patch14_clip_224.openai_ft_in12k_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 257, 1024) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{Radford2021LearningTV,
title={Learning Transferable Visual Models From Natural Language Supervision},
author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
booktitle={ICML},
year={2021}
}
```
```bibtex
@article{cherti2022reproducible,
title={Reproducible scaling laws for contrastive language-image learning},
author={Cherti, Mehdi and Beaumont, Romain and Wightman, Ross and Wortsman, Mitchell and Ilharco, Gabriel and Gordon, Cade and Schuhmann, Christoph and Schmidt, Ludwig and Jitsev, Jenia},
journal={arXiv preprint arXiv:2212.07143},
year={2022}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,441 | [
[
-0.0316162109375,
-0.037445068359375,
0.0034580230712890625,
0.0185089111328125,
-0.0231170654296875,
-0.03228759765625,
-0.03369140625,
-0.032928466796875,
0.012451171875,
0.031494140625,
-0.0304718017578125,
-0.0399169921875,
-0.057403564453125,
0.00023090839385986328,
-0.01294708251953125,
0.07354736328125,
-0.019683837890625,
-0.0038700103759765625,
-0.016571044921875,
-0.039154052734375,
-0.0140533447265625,
-0.0193634033203125,
-0.04888916015625,
-0.025421142578125,
0.0277099609375,
0.0140228271484375,
0.0452880859375,
0.05084228515625,
0.057830810546875,
0.034210205078125,
-0.01151275634765625,
0.00862884521484375,
-0.02276611328125,
-0.0224609375,
0.01214599609375,
-0.043304443359375,
-0.03216552734375,
0.0203094482421875,
0.060150146484375,
0.03472900390625,
0.0012598037719726562,
0.0235443115234375,
0.0133514404296875,
0.044647216796875,
-0.023712158203125,
0.0257415771484375,
-0.036529541015625,
0.0098724365234375,
-0.0055389404296875,
0.003997802734375,
-0.028289794921875,
-0.029632568359375,
0.0218658447265625,
-0.047607421875,
0.02923583984375,
-0.002269744873046875,
0.10711669921875,
0.01419830322265625,
-0.00878143310546875,
-0.0040740966796875,
-0.0235443115234375,
0.06304931640625,
-0.059539794921875,
0.02392578125,
0.01471710205078125,
0.01392364501953125,
-0.0005450248718261719,
-0.07720947265625,
-0.049072265625,
-0.0140228271484375,
-0.020233154296875,
0.004825592041015625,
-0.03289794921875,
0.00489044189453125,
0.02789306640625,
0.035247802734375,
-0.029052734375,
0.004764556884765625,
-0.04058837890625,
-0.00948333740234375,
0.034088134765625,
-0.00019991397857666016,
0.0278472900390625,
-0.015960693359375,
-0.040313720703125,
-0.039764404296875,
-0.0280914306640625,
0.019775390625,
0.0121002197265625,
0.0189056396484375,
-0.049285888671875,
0.0305633544921875,
0.011871337890625,
0.049285888671875,
0.01303863525390625,
-0.0216522216796875,
0.04595947265625,
-0.0165557861328125,
-0.0292510986328125,
-0.0188140869140625,
0.089111328125,
0.036529541015625,
0.034454345703125,
0.0057373046875,
-0.007549285888671875,
-0.0105133056640625,
-0.0031299591064453125,
-0.08660888671875,
-0.028350830078125,
-0.004489898681640625,
-0.038238525390625,
-0.0248565673828125,
0.019683837890625,
-0.061309814453125,
-0.011322021484375,
-0.0136566162109375,
0.053863525390625,
-0.037139892578125,
-0.0272369384765625,
0.006649017333984375,
-0.004459381103515625,
0.0304718017578125,
0.01541900634765625,
-0.049835205078125,
0.01444244384765625,
0.01247406005859375,
0.08331298828125,
-0.0010738372802734375,
-0.0377197265625,
-0.022857666015625,
-0.0262451171875,
-0.025604248046875,
0.0318603515625,
-0.012542724609375,
-0.005207061767578125,
-0.008270263671875,
0.028778076171875,
-0.0169525146484375,
-0.04962158203125,
0.0193634033203125,
-0.0135040283203125,
0.0190887451171875,
-0.0005173683166503906,
-0.0199127197265625,
-0.0256500244140625,
0.0190277099609375,
-0.038970947265625,
0.0858154296875,
0.0256195068359375,
-0.06854248046875,
0.028564453125,
-0.039093017578125,
-0.011566162109375,
-0.01629638671875,
0.0014667510986328125,
-0.07232666015625,
-0.00579071044921875,
0.0284576416015625,
0.04937744140625,
-0.0214385986328125,
0.006504058837890625,
-0.03436279296875,
-0.016357421875,
0.024200439453125,
-0.01519012451171875,
0.071533203125,
0.00618743896484375,
-0.0253143310546875,
0.0247802734375,
-0.05096435546875,
-0.0034046173095703125,
0.03692626953125,
-0.0206756591796875,
-0.01064300537109375,
-0.04278564453125,
0.0095367431640625,
0.0205078125,
0.0122833251953125,
-0.04547119140625,
0.017669677734375,
-0.0156402587890625,
0.03643798828125,
0.05596923828125,
-0.00989532470703125,
0.03326416015625,
-0.021026611328125,
0.0289154052734375,
0.024993896484375,
0.0179901123046875,
-0.0121002197265625,
-0.040557861328125,
-0.07281494140625,
-0.046356201171875,
0.021636962890625,
0.033935546875,
-0.056304931640625,
0.033050537109375,
-0.03179931640625,
-0.051666259765625,
-0.045257568359375,
0.00974273681640625,
0.041015625,
0.04864501953125,
0.04168701171875,
-0.036346435546875,
-0.035369873046875,
-0.06695556640625,
-0.0115203857421875,
-0.0048675537109375,
-0.00318145751953125,
0.023895263671875,
0.05084228515625,
-0.0160980224609375,
0.05615234375,
-0.036163330078125,
-0.035797119140625,
-0.0202484130859375,
0.00881195068359375,
0.028778076171875,
0.053924560546875,
0.060791015625,
-0.04632568359375,
-0.05084228515625,
-0.0028400421142578125,
-0.06976318359375,
0.0017824172973632812,
-0.00574493408203125,
-0.015960693359375,
0.0176239013671875,
0.034820556640625,
-0.043060302734375,
0.042510986328125,
0.026123046875,
-0.03216552734375,
0.0313720703125,
-0.0231475830078125,
0.006381988525390625,
-0.095703125,
0.0020904541015625,
0.03863525390625,
-0.0111236572265625,
-0.042083740234375,
-0.0024356842041015625,
0.01282501220703125,
0.004581451416015625,
-0.0309600830078125,
0.05047607421875,
-0.04473876953125,
0.0030193328857421875,
0.0004749298095703125,
-0.0039520263671875,
0.00206756591796875,
0.06280517578125,
-0.004489898681640625,
0.039398193359375,
0.063720703125,
-0.039031982421875,
0.0296630859375,
0.0302734375,
-0.0309295654296875,
0.0501708984375,
-0.0589599609375,
-0.000736236572265625,
-0.0014362335205078125,
0.01067352294921875,
-0.077392578125,
-0.00738525390625,
0.0263214111328125,
-0.0379638671875,
0.048065185546875,
-0.0350341796875,
-0.03662109375,
-0.0311279296875,
-0.0288848876953125,
0.0362548828125,
0.054595947265625,
-0.0645751953125,
0.040130615234375,
0.01346588134765625,
0.01141357421875,
-0.034149169921875,
-0.0711669921875,
-0.01526641845703125,
-0.028350830078125,
-0.04754638671875,
0.0274810791015625,
0.005084991455078125,
0.00445556640625,
0.0084228515625,
0.00146484375,
-0.00045800209045410156,
-0.02154541015625,
0.0322265625,
0.0369873046875,
-0.023223876953125,
-0.00586700439453125,
-0.023468017578125,
-0.0027599334716796875,
0.003925323486328125,
-0.01947021484375,
0.028411865234375,
-0.02850341796875,
-0.00907135009765625,
-0.050628662109375,
-0.0098876953125,
0.045623779296875,
-0.024993896484375,
0.06304931640625,
0.07708740234375,
-0.0311279296875,
0.0048065185546875,
-0.0340576171875,
-0.0120697021484375,
-0.037353515625,
0.043304443359375,
-0.0243377685546875,
-0.042449951171875,
0.058746337890625,
0.01041412353515625,
0.0004143714904785156,
0.05401611328125,
0.0297393798828125,
0.004749298095703125,
0.058746337890625,
0.053924560546875,
0.007537841796875,
0.06915283203125,
-0.0589599609375,
-0.006481170654296875,
-0.07086181640625,
-0.0264739990234375,
-0.0203704833984375,
-0.033203125,
-0.056640625,
-0.04052734375,
0.0268096923828125,
0.009918212890625,
-0.023895263671875,
0.04150390625,
-0.052825927734375,
0.0142974853515625,
0.049896240234375,
0.0372314453125,
-0.007541656494140625,
0.024688720703125,
-0.01947021484375,
-0.01183319091796875,
-0.050811767578125,
-0.01166534423828125,
0.08099365234375,
0.03448486328125,
0.0631103515625,
-0.0028018951416015625,
0.04248046875,
-0.018157958984375,
0.03277587890625,
-0.052886962890625,
0.049102783203125,
-0.01340484619140625,
-0.0215911865234375,
-0.01971435546875,
-0.041290283203125,
-0.08416748046875,
0.01666259765625,
-0.0193939208984375,
-0.053375244140625,
0.0230712890625,
0.009918212890625,
-0.005023956298828125,
0.05828857421875,
-0.065185546875,
0.0726318359375,
0.0005707740783691406,
-0.02947998046875,
0.009857177734375,
-0.052154541015625,
0.015716552734375,
0.0176239013671875,
-0.00992584228515625,
0.00666046142578125,
0.0124969482421875,
0.0782470703125,
-0.04132080078125,
0.071044921875,
-0.0248870849609375,
0.00928497314453125,
0.031463623046875,
-0.01015472412109375,
0.0283203125,
-0.0128936767578125,
0.01270294189453125,
0.022308349609375,
0.002170562744140625,
-0.032257080078125,
-0.042755126953125,
0.041168212890625,
-0.06988525390625,
-0.027618408203125,
-0.0287017822265625,
-0.041534423828125,
0.0123443603515625,
0.017425537109375,
0.040863037109375,
0.045867919921875,
0.01435089111328125,
0.0281524658203125,
0.051300048828125,
-0.0301513671875,
0.032196044921875,
0.00812530517578125,
-0.0201873779296875,
-0.042266845703125,
0.07550048828125,
0.0226287841796875,
0.0239715576171875,
0.0193939208984375,
0.0209808349609375,
-0.018585205078125,
-0.033599853515625,
-0.0200958251953125,
0.039764404296875,
-0.059539794921875,
-0.027801513671875,
-0.03839111328125,
-0.0301513671875,
-0.0281982421875,
-0.004241943359375,
-0.034881591796875,
-0.0171356201171875,
-0.0303955078125,
0.00836181640625,
0.05352783203125,
0.0443115234375,
-0.005893707275390625,
0.02545166015625,
-0.039031982421875,
0.0182342529296875,
0.0264129638671875,
0.0472412109375,
-0.007427215576171875,
-0.08160400390625,
-0.0244903564453125,
-0.0022487640380859375,
-0.0266265869140625,
-0.052490234375,
0.0396728515625,
0.01995849609375,
0.038543701171875,
0.0272216796875,
-0.00958251953125,
0.0574951171875,
-0.01067352294921875,
0.042877197265625,
0.032318115234375,
-0.04962158203125,
0.038604736328125,
-0.00762939453125,
0.0247039794921875,
0.01180267333984375,
0.02325439453125,
-0.0261383056640625,
-0.0099945068359375,
-0.065185546875,
-0.058135986328125,
0.05908203125,
0.01049041748046875,
0.002674102783203125,
0.0195465087890625,
0.036041259765625,
-0.00566864013671875,
-0.00800323486328125,
-0.06842041015625,
-0.0262451171875,
-0.0316162109375,
-0.02984619140625,
-0.01097869873046875,
-0.006561279296875,
-0.002658843994140625,
-0.04986572265625,
0.045928955078125,
-0.0126953125,
0.050262451171875,
0.029449462890625,
-0.0173797607421875,
-0.0073089599609375,
-0.01386260986328125,
0.036224365234375,
0.02215576171875,
-0.02105712890625,
0.00988006591796875,
0.01299285888671875,
-0.051788330078125,
-0.0058135986328125,
0.015899658203125,
-0.0014190673828125,
0.0015878677368164062,
0.041351318359375,
0.08538818359375,
0.00388336181640625,
-0.0115509033203125,
0.04522705078125,
-0.0086669921875,
-0.038421630859375,
-0.01062774658203125,
-0.005268096923828125,
-0.0074005126953125,
0.029632568359375,
0.01363372802734375,
0.0193634033203125,
-0.0191192626953125,
-0.0198516845703125,
0.00971221923828125,
0.037994384765625,
-0.0389404296875,
-0.0299072265625,
0.04901123046875,
-0.0166473388671875,
-0.00804901123046875,
0.061492919921875,
-0.0014066696166992188,
-0.0311279296875,
0.06494140625,
0.036590576171875,
0.0711669921875,
-0.00400543212890625,
0.0011758804321289062,
0.06463623046875,
0.024627685546875,
-0.010772705078125,
0.005184173583984375,
0.0124664306640625,
-0.0595703125,
0.0009331703186035156,
-0.044036865234375,
-0.002979278564453125,
0.0171356201171875,
-0.051177978515625,
0.0265045166015625,
-0.0430908203125,
-0.0201873779296875,
0.010040283203125,
0.0008587837219238281,
-0.06793212890625,
0.014404296875,
0.0008168220520019531,
0.059967041015625,
-0.064208984375,
0.05908203125,
0.060394287109375,
-0.060150146484375,
-0.07159423828125,
-0.00804901123046875,
-0.0094757080078125,
-0.0616455078125,
0.041473388671875,
0.028350830078125,
0.007266998291015625,
0.00732421875,
-0.06378173828125,
-0.051422119140625,
0.10052490234375,
0.040191650390625,
-0.006103515625,
0.01020050048828125,
-0.00353240966796875,
0.034393310546875,
-0.0258941650390625,
0.0243072509765625,
0.0200653076171875,
0.02484130859375,
0.02008056640625,
-0.061187744140625,
0.0191650390625,
-0.031280517578125,
0.01241302490234375,
0.01314544677734375,
-0.061767578125,
0.07293701171875,
-0.035797119140625,
-0.0168914794921875,
0.00635528564453125,
0.054473876953125,
0.01371002197265625,
0.00015604496002197266,
0.036041259765625,
0.060516357421875,
0.033447265625,
-0.023040771484375,
0.0692138671875,
-0.005336761474609375,
0.046356201171875,
0.05657958984375,
0.031463623046875,
0.050506591796875,
0.043121337890625,
-0.03369140625,
0.038330078125,
0.05828857421875,
-0.0321044921875,
0.026336669921875,
0.011474609375,
0.0074920654296875,
-0.01058197021484375,
-0.003681182861328125,
-0.0308685302734375,
0.026580810546875,
0.01404571533203125,
-0.044891357421875,
-0.007762908935546875,
0.005126953125,
0.004184722900390625,
-0.00958251953125,
-0.01428985595703125,
0.04437255859375,
0.0088348388671875,
-0.036163330078125,
0.06689453125,
-0.00547027587890625,
0.06976318359375,
-0.03875732421875,
-0.00409698486328125,
-0.026763916015625,
0.0313720703125,
-0.019195556640625,
-0.06585693359375,
0.0124664306640625,
-0.01483154296875,
-0.00655364990234375,
-0.0072479248046875,
0.046844482421875,
-0.0418701171875,
-0.041534423828125,
0.0185089111328125,
0.0124969482421875,
0.0191650390625,
0.003147125244140625,
-0.07244873046875,
-0.0042877197265625,
-0.002040863037109375,
-0.041900634765625,
0.0291900634765625,
0.043121337890625,
-0.0003933906555175781,
0.056915283203125,
0.04052734375,
-0.0030689239501953125,
0.01885986328125,
-0.006061553955078125,
0.0655517578125,
-0.036651611328125,
-0.02984619140625,
-0.06494140625,
0.04327392578125,
-0.004180908203125,
-0.03802490234375,
0.048736572265625,
0.0309295654296875,
0.06756591796875,
-0.012725830078125,
0.032257080078125,
-0.01296234130859375,
-0.0003306865692138672,
-0.032958984375,
0.052825927734375,
-0.055572509765625,
-0.00283050537109375,
-0.036163330078125,
-0.05926513671875,
-0.036834716796875,
0.05523681640625,
-0.0210723876953125,
0.02325439453125,
0.04644775390625,
0.07904052734375,
-0.0212249755859375,
-0.034088134765625,
0.0175323486328125,
0.0193634033203125,
0.0156402587890625,
0.037506103515625,
0.027557373046875,
-0.05712890625,
0.050567626953125,
-0.03515625,
-0.0179290771484375,
-0.007556915283203125,
-0.04925537109375,
-0.08184814453125,
-0.06781005859375,
-0.048126220703125,
-0.04534912109375,
-0.02264404296875,
0.0648193359375,
0.06451416015625,
-0.052581787109375,
-0.01042938232421875,
-0.00811004638671875,
0.002674102783203125,
-0.0290679931640625,
-0.01763916015625,
0.04705810546875,
-0.01122283935546875,
-0.05828857421875,
-0.027313232421875,
0.005359649658203125,
0.035369873046875,
-0.004703521728515625,
-0.0220489501953125,
-0.0237579345703125,
-0.00913238525390625,
0.0251922607421875,
0.028564453125,
-0.058380126953125,
-0.01047515869140625,
-0.01178741455078125,
-0.0195159912109375,
0.036529541015625,
0.0269317626953125,
-0.041259765625,
0.039398193359375,
0.04071044921875,
0.032501220703125,
0.057830810546875,
-0.023834228515625,
0.01007843017578125,
-0.061859130859375,
0.0350341796875,
-0.0183563232421875,
0.03790283203125,
0.0386962890625,
-0.0117340087890625,
0.04644775390625,
0.039398193359375,
-0.028594970703125,
-0.0615234375,
-0.0050201416015625,
-0.08270263671875,
-0.003040313720703125,
0.07916259765625,
-0.03564453125,
-0.03045654296875,
0.034393310546875,
-0.01374053955078125,
0.048583984375,
-0.007080078125,
0.0290069580078125,
0.0278472900390625,
0.00916290283203125,
-0.05303955078125,
-0.032470703125,
0.0335693359375,
0.016448974609375,
-0.049713134765625,
-0.01568603515625,
0.004119873046875,
0.040863037109375,
0.0239105224609375,
0.03009033203125,
-0.0194244384765625,
0.004482269287109375,
0.00399017333984375,
0.03717041015625,
-0.0184173583984375,
-0.0082550048828125,
-0.032379150390625,
-0.006809234619140625,
-0.005596160888671875,
-0.040374755859375
]
] |
EleutherAI/pythia-410m-deduped | 2023-07-09T16:05:38.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-410m-deduped | 16 | 17,139 | transformers | 2023-02-13T21:27:47 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/the_pile_deduplicated
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-410M-deduped
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.
ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
You may also further fine-tune and adapt Pythia-410M-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-410M-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not a in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-410M-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means XNPythia-410M-dedupedAME will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token used by the model need not produce the
most “accurate” text. Never rely on Pythia-410M-deduped to produce factually accurate
output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-410M-deduped may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-410M-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
## Training
### Training data
Pythia-410M-deduped was trained on the Pile **after the dataset has been globally
deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for 143000 steps at a batch size
of 2M (2,097,152 tokens).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
which decayed to a minimum LR of 10% the starting LR rate, but the 6.9B and
12B models all used an LR schedule which decayed to a minimum LR of 0. In
the redone training runs, we rectified this inconsistency: all models now were
trained with LR decaying to a minimum of 0.1× their maximum LR.
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 13,668 | [
[
-0.0263671875,
-0.055816650390625,
0.0252838134765625,
0.00327301025390625,
-0.0181121826171875,
-0.01274871826171875,
-0.0171356201171875,
-0.032806396484375,
0.0143280029296875,
0.01438140869140625,
-0.026092529296875,
-0.021759033203125,
-0.0313720703125,
-0.001796722412109375,
-0.03460693359375,
0.08221435546875,
-0.00931549072265625,
-0.01049041748046875,
0.01137542724609375,
-0.00460052490234375,
-0.00411224365234375,
-0.042999267578125,
-0.03460693359375,
-0.0285491943359375,
0.04766845703125,
0.01428985595703125,
0.06365966796875,
0.04620361328125,
0.011871337890625,
0.022369384765625,
-0.028411865234375,
-0.0063323974609375,
-0.01145172119140625,
-0.0085906982421875,
-0.0040740966796875,
-0.0204925537109375,
-0.054901123046875,
0.0018835067749023438,
0.05303955078125,
0.050750732421875,
-0.014312744140625,
0.018798828125,
-0.0017004013061523438,
0.0271759033203125,
-0.038970947265625,
0.002635955810546875,
-0.0262298583984375,
-0.0146026611328125,
-0.00609588623046875,
0.01081085205078125,
-0.0296173095703125,
-0.0260467529296875,
0.0321044921875,
-0.047882080078125,
0.019500732421875,
0.0080718994140625,
0.08795166015625,
-0.0085906982421875,
-0.032745361328125,
-0.004657745361328125,
-0.05450439453125,
0.052459716796875,
-0.05487060546875,
0.0264129638671875,
0.0204315185546875,
0.01280975341796875,
-0.0022945404052734375,
-0.06597900390625,
-0.04443359375,
-0.0156402587890625,
-0.009521484375,
-0.003246307373046875,
-0.047821044921875,
0.0023250579833984375,
0.039703369140625,
0.04888916015625,
-0.062255859375,
-0.0013036727905273438,
-0.0281982421875,
-0.024139404296875,
0.02587890625,
0.00690460205078125,
0.032806396484375,
-0.02520751953125,
-0.003856658935546875,
-0.02911376953125,
-0.05224609375,
-0.0182952880859375,
0.040863037109375,
0.00592041015625,
-0.026885986328125,
0.0377197265625,
-0.031280517578125,
0.044830322265625,
-0.006923675537109375,
0.0198516845703125,
0.0322265625,
-0.01412200927734375,
-0.039886474609375,
-0.00547027587890625,
0.06964111328125,
0.00902557373046875,
0.0170440673828125,
-0.000774383544921875,
-0.003162384033203125,
0.004238128662109375,
0.0031452178955078125,
-0.084716796875,
-0.059539794921875,
0.0177154541015625,
-0.0286712646484375,
-0.032012939453125,
-0.0136260986328125,
-0.07025146484375,
-0.01543426513671875,
-0.0158843994140625,
0.040771484375,
-0.03753662109375,
-0.053741455078125,
-0.00815582275390625,
0.00035500526428222656,
0.0157012939453125,
0.026885986328125,
-0.07110595703125,
0.0306243896484375,
0.032684326171875,
0.07379150390625,
0.01708984375,
-0.04083251953125,
-0.0140380859375,
-0.0184326171875,
-0.009124755859375,
0.02679443359375,
-0.0089111328125,
-0.01499176025390625,
-0.01032257080078125,
0.01322174072265625,
-0.0086822509765625,
-0.0279388427734375,
0.030303955078125,
-0.032440185546875,
0.019683837890625,
-0.0216064453125,
-0.033966064453125,
-0.0294036865234375,
0.010772705078125,
-0.047515869140625,
0.0631103515625,
0.0193939208984375,
-0.0723876953125,
0.0184326171875,
-0.0167999267578125,
-0.0029125213623046875,
-0.0022144317626953125,
0.0143890380859375,
-0.04925537109375,
0.0016193389892578125,
0.024627685546875,
0.0038089752197265625,
-0.0313720703125,
0.01503753662109375,
-0.01824951171875,
-0.032470703125,
0.0145111083984375,
-0.041046142578125,
0.07080078125,
0.014373779296875,
-0.0489501953125,
0.020233154296875,
-0.046905517578125,
0.0162200927734375,
0.018035888671875,
-0.0294952392578125,
0.0023021697998046875,
-0.01508331298828125,
0.0267333984375,
0.018157958984375,
0.0106353759765625,
-0.0294952392578125,
0.0232391357421875,
-0.03802490234375,
0.054718017578125,
0.056640625,
-0.00618743896484375,
0.03485107421875,
-0.03192138671875,
0.0364990234375,
0.0012121200561523438,
0.01256561279296875,
-0.00559234619140625,
-0.044921875,
-0.07501220703125,
-0.02496337890625,
0.0270538330078125,
0.02410888671875,
-0.036834716796875,
0.034454345703125,
-0.0191192626953125,
-0.06463623046875,
-0.01141357421875,
-0.00627899169921875,
0.032135009765625,
0.026824951171875,
0.0325927734375,
-0.01256561279296875,
-0.04248046875,
-0.06475830078125,
-0.01514434814453125,
-0.03204345703125,
0.00858306884765625,
0.01384735107421875,
0.06927490234375,
-0.00963592529296875,
0.04425048828125,
-0.02838134765625,
0.0178070068359375,
-0.0275726318359375,
0.0124359130859375,
0.032745361328125,
0.046051025390625,
0.028564453125,
-0.041229248046875,
-0.0291748046875,
-0.0006017684936523438,
-0.04302978515625,
0.007282257080078125,
0.0024585723876953125,
-0.0231170654296875,
0.024749755859375,
0.006786346435546875,
-0.0740966796875,
0.0347900390625,
0.0465087890625,
-0.040374755859375,
0.06103515625,
-0.02410888671875,
-0.000043451786041259766,
-0.08056640625,
0.021820068359375,
0.007122039794921875,
-0.01727294921875,
-0.044281005859375,
0.003520965576171875,
0.014495849609375,
-0.014617919921875,
-0.0311431884765625,
0.04766845703125,
-0.041168212890625,
-0.0109100341796875,
-0.017181396484375,
0.005725860595703125,
-0.0018892288208007812,
0.04925537109375,
0.01325225830078125,
0.040618896484375,
0.05938720703125,
-0.057281494140625,
0.032745361328125,
0.0174713134765625,
-0.0232391357421875,
0.027923583984375,
-0.066162109375,
0.01088714599609375,
0.00765228271484375,
0.0310821533203125,
-0.04718017578125,
-0.0247650146484375,
0.040130615234375,
-0.042633056640625,
0.01200103759765625,
-0.0310821533203125,
-0.041290283203125,
-0.032745361328125,
-0.01262664794921875,
0.0469970703125,
0.059051513671875,
-0.046234130859375,
0.048980712890625,
0.003116607666015625,
0.00843048095703125,
-0.02825927734375,
-0.0423583984375,
-0.0172119140625,
-0.039703369140625,
-0.048431396484375,
0.029449462890625,
0.013763427734375,
-0.01442718505859375,
0.0008950233459472656,
0.0006122589111328125,
0.0086517333984375,
-0.0036468505859375,
0.02349853515625,
0.02691650390625,
-0.002735137939453125,
0.0023021697998046875,
-0.01218414306640625,
-0.01125335693359375,
0.0005550384521484375,
-0.037506103515625,
0.07464599609375,
-0.0201873779296875,
-0.01395416259765625,
-0.059967041015625,
0.00024700164794921875,
0.066162109375,
-0.0318603515625,
0.066162109375,
0.04376220703125,
-0.052490234375,
0.00994110107421875,
-0.0268707275390625,
-0.022979736328125,
-0.03338623046875,
0.05023193359375,
-0.0189361572265625,
-0.0269622802734375,
0.047149658203125,
0.02117919921875,
0.0184478759765625,
0.04119873046875,
0.055999755859375,
0.0178375244140625,
0.09112548828125,
0.034149169921875,
-0.01151275634765625,
0.048004150390625,
-0.039947509765625,
0.01751708984375,
-0.08453369140625,
-0.0133819580078125,
-0.03936767578125,
-0.0193023681640625,
-0.07147216796875,
-0.0232391357421875,
0.02166748046875,
0.016632080078125,
-0.0579833984375,
0.042816162109375,
-0.041961669921875,
0.0036067962646484375,
0.0496826171875,
0.016845703125,
0.0160064697265625,
0.01464080810546875,
0.00478363037109375,
-0.004055023193359375,
-0.04925537109375,
-0.02679443359375,
0.09161376953125,
0.035430908203125,
0.046142578125,
0.0218658447265625,
0.05511474609375,
-0.01152801513671875,
0.0193939208984375,
-0.053741455078125,
0.033172607421875,
0.0247344970703125,
-0.055572509765625,
-0.01433563232421875,
-0.05938720703125,
-0.0718994140625,
0.036163330078125,
0.007274627685546875,
-0.08453369140625,
0.0183563232421875,
0.0174560546875,
-0.0283660888671875,
0.038177490234375,
-0.046142578125,
0.07391357421875,
-0.017974853515625,
-0.03460693359375,
-0.0258941650390625,
-0.02349853515625,
0.0199737548828125,
0.02618408203125,
0.008148193359375,
0.006237030029296875,
0.022003173828125,
0.0748291015625,
-0.051361083984375,
0.048919677734375,
-0.01010894775390625,
0.00998687744140625,
0.024871826171875,
0.0210113525390625,
0.048187255859375,
0.01071929931640625,
0.0080718994140625,
-0.00264739990234375,
0.01116943359375,
-0.040496826171875,
-0.0284576416015625,
0.0689697265625,
-0.08233642578125,
-0.0295562744140625,
-0.060333251953125,
-0.045501708984375,
0.006839752197265625,
0.0173187255859375,
0.0301361083984375,
0.049896240234375,
-0.0032138824462890625,
0.0009493827819824219,
0.044158935546875,
-0.0396728515625,
0.027923583984375,
0.018890380859375,
-0.036468505859375,
-0.040069580078125,
0.074462890625,
0.002780914306640625,
0.026885986328125,
0.0031948089599609375,
0.017242431640625,
-0.0304718017578125,
-0.03240966796875,
-0.045806884765625,
0.040313720703125,
-0.05511474609375,
-0.0008296966552734375,
-0.053741455078125,
-0.0016269683837890625,
-0.03424072265625,
0.008026123046875,
-0.02996826171875,
-0.0304107666015625,
-0.0177154541015625,
-0.0022735595703125,
0.043975830078125,
0.03424072265625,
0.00635528564453125,
0.0259857177734375,
-0.04254150390625,
-0.0008425712585449219,
0.0166168212890625,
0.00843048095703125,
0.01020050048828125,
-0.06768798828125,
-0.0073699951171875,
0.01125335693359375,
-0.0309906005859375,
-0.08453369140625,
0.0389404296875,
-0.0058441162109375,
0.0279083251953125,
0.004638671875,
-0.015838623046875,
0.0440673828125,
-0.006443023681640625,
0.04852294921875,
0.0115509033203125,
-0.078369140625,
0.038787841796875,
-0.036651611328125,
0.0250701904296875,
0.0266876220703125,
0.02789306640625,
-0.056610107421875,
-0.005084991455078125,
-0.07421875,
-0.08203125,
0.0579833984375,
0.03436279296875,
0.012176513671875,
0.00649261474609375,
0.0291900634765625,
-0.034027099609375,
0.0112762451171875,
-0.076171875,
-0.0210113525390625,
-0.016693115234375,
-0.005489349365234375,
0.01241302490234375,
-0.0022735595703125,
0.00418853759765625,
-0.042633056640625,
0.07611083984375,
0.00437164306640625,
0.024017333984375,
0.021514892578125,
-0.0288543701171875,
-0.004962921142578125,
-0.0017681121826171875,
0.0126190185546875,
0.055999755859375,
-0.0119171142578125,
0.005084991455078125,
0.01557159423828125,
-0.041015625,
0.00510406494140625,
0.0137176513671875,
-0.0268707275390625,
-0.00627899169921875,
0.01245880126953125,
0.06439208984375,
0.0097198486328125,
-0.032135009765625,
0.01523590087890625,
-0.004123687744140625,
-0.004459381103515625,
-0.02154541015625,
-0.01275634765625,
0.0144805908203125,
0.0144500732421875,
-0.0007939338684082031,
-0.0123748779296875,
-0.0013580322265625,
-0.06549072265625,
0.00487518310546875,
0.0155181884765625,
-0.0122833251953125,
-0.03076171875,
0.044525146484375,
0.003597259521484375,
-0.01418304443359375,
0.0870361328125,
-0.0206451416015625,
-0.051971435546875,
0.05987548828125,
0.03668212890625,
0.056488037109375,
-0.0140380859375,
0.0238494873046875,
0.06707763671875,
0.023895263671875,
-0.01473236083984375,
0.0084991455078125,
0.005039215087890625,
-0.040435791015625,
-0.009368896484375,
-0.060150146484375,
-0.0191650390625,
0.0184783935546875,
-0.0457763671875,
0.035430908203125,
-0.046905517578125,
-0.005138397216796875,
-0.002777099609375,
0.0166015625,
-0.044097900390625,
0.026336669921875,
0.01403045654296875,
0.056365966796875,
-0.0679931640625,
0.065185546875,
0.0477294921875,
-0.055755615234375,
-0.08349609375,
-0.000032842159271240234,
0.0019311904907226562,
-0.03466796875,
0.0148773193359375,
0.0162811279296875,
0.01433563232421875,
0.01380157470703125,
-0.0199432373046875,
-0.06640625,
0.09759521484375,
0.017974853515625,
-0.050445556640625,
-0.0200653076171875,
-0.00665283203125,
0.040496826171875,
0.0048980712890625,
0.05316162109375,
0.05401611328125,
0.0305938720703125,
0.00693511962890625,
-0.0791015625,
0.0300140380859375,
-0.0240936279296875,
-0.005687713623046875,
0.01861572265625,
-0.051727294921875,
0.0994873046875,
-0.0061187744140625,
-0.0018587112426757812,
0.0321044921875,
0.041839599609375,
0.03228759765625,
-0.01027679443359375,
0.0279998779296875,
0.058990478515625,
0.066162109375,
-0.0277862548828125,
0.09246826171875,
-0.02325439453125,
0.058074951171875,
0.06707763671875,
0.01076507568359375,
0.0390625,
0.030517578125,
-0.02783203125,
0.039947509765625,
0.061737060546875,
-0.0075531005859375,
0.0135345458984375,
0.0227813720703125,
-0.0222625732421875,
-0.019439697265625,
0.010009765625,
-0.045196533203125,
0.0162200927734375,
0.0106353759765625,
-0.045379638671875,
-0.017059326171875,
-0.0245819091796875,
0.027740478515625,
-0.031402587890625,
-0.0169219970703125,
0.020751953125,
0.007293701171875,
-0.04931640625,
0.049468994140625,
0.01873779296875,
0.042449951171875,
-0.033905029296875,
0.01125335693359375,
-0.011962890625,
0.027618408203125,
-0.0256805419921875,
-0.032958984375,
0.00820159912109375,
0.0008282661437988281,
0.002712249755859375,
0.00653076171875,
0.0309906005859375,
-0.01092529296875,
-0.042694091796875,
0.0159454345703125,
0.035919189453125,
0.019256591796875,
-0.0350341796875,
-0.050628662109375,
0.007259368896484375,
-0.01123046875,
-0.041595458984375,
0.03173828125,
0.0186767578125,
-0.01119232177734375,
0.044769287109375,
0.047515869140625,
0.0030803680419921875,
-0.0003669261932373047,
0.00986480712890625,
0.074951171875,
-0.038330078125,
-0.03509521484375,
-0.07061767578125,
0.038726806640625,
0.00028514862060546875,
-0.050506591796875,
0.06561279296875,
0.042724609375,
0.05401611328125,
0.0193939208984375,
0.04559326171875,
-0.033599853515625,
0.0020236968994140625,
-0.0206756591796875,
0.051605224609375,
-0.037872314453125,
0.004199981689453125,
-0.03741455078125,
-0.08453369140625,
-0.0053558349609375,
0.06866455078125,
-0.038421630859375,
0.029388427734375,
0.05804443359375,
0.061187744140625,
-0.005523681640625,
0.006313323974609375,
0.0048370361328125,
0.0216217041015625,
0.040618896484375,
0.070068359375,
0.0684814453125,
-0.05169677734375,
0.0426025390625,
-0.040313720703125,
-0.020843505859375,
-0.0113983154296875,
-0.03912353515625,
-0.06317138671875,
-0.0340576171875,
-0.03826904296875,
-0.05401611328125,
-0.003871917724609375,
0.06597900390625,
0.056915283203125,
-0.0467529296875,
-0.01226806640625,
-0.039276123046875,
0.004253387451171875,
-0.0191192626953125,
-0.0179595947265625,
0.032135009765625,
0.01125335693359375,
-0.07281494140625,
-0.002666473388671875,
-0.01151275634765625,
0.007305145263671875,
-0.031768798828125,
-0.019500732421875,
-0.01666259765625,
-0.0071563720703125,
0.0059661865234375,
0.024383544921875,
-0.0374755859375,
-0.01861572265625,
0.004749298095703125,
0.0018377304077148438,
0.0006222724914550781,
0.0535888671875,
-0.045501708984375,
0.0106201171875,
0.049591064453125,
0.0071258544921875,
0.060516357421875,
-0.01910400390625,
0.032501220703125,
-0.02056884765625,
0.02606201171875,
0.0213165283203125,
0.04937744140625,
0.0256805419921875,
-0.0186767578125,
0.01244354248046875,
0.0303497314453125,
-0.055877685546875,
-0.066162109375,
0.026580810546875,
-0.05560302734375,
-0.00756072998046875,
0.09527587890625,
-0.0212554931640625,
-0.029144287109375,
0.00408172607421875,
-0.017364501953125,
0.0406494140625,
-0.0198822021484375,
0.050628662109375,
0.046661376953125,
0.0038967132568359375,
-0.01493072509765625,
-0.04656982421875,
0.027557373046875,
0.051055908203125,
-0.062744140625,
0.0290069580078125,
0.044921875,
0.04595947265625,
0.0192108154296875,
0.041748046875,
-0.0228118896484375,
0.045379638671875,
0.00646209716796875,
0.00994110107421875,
0.0018711090087890625,
-0.032684326171875,
-0.03424072265625,
-0.01140594482421875,
0.0171356201171875,
-0.0009016990661621094
]
] |
mrsinghania/asr-question-detection | 2021-09-21T06:44:23.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | text-classification | mrsinghania | null | null | mrsinghania/asr-question-detection | 4 | 17,119 | transformers | 2022-03-02T23:29:05 | <i>Question vs Statement classifier</i> trained on more than 7k samples which were coming from spoken data in an interview setting
<b>Code for using in Transformers:</b>
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("mrsinghania/asr-question-detection")
model = AutoModelForSequenceClassification.from_pretrained("mrsinghania/asr-question-detection") | 427 | [
[
-0.03076171875,
-0.03997802734375,
0.00269317626953125,
0.014556884765625,
-0.0022754669189453125,
0.0102386474609375,
0.006221771240234375,
-0.0113372802734375,
-0.0128021240234375,
0.0406494140625,
-0.042755126953125,
-0.0023021697998046875,
-0.040435791015625,
0.02655029296875,
-0.05023193359375,
0.07086181640625,
0.017303466796875,
0.019775390625,
-0.0230560302734375,
-0.00916290283203125,
-0.03948974609375,
-0.05023193359375,
-0.044464111328125,
-0.0261077880859375,
0.00521087646484375,
0.0489501953125,
0.01255035400390625,
0.01556396484375,
0.0124053955078125,
0.035675048828125,
0.01097869873046875,
-0.00731658935546875,
-0.018402099609375,
0.00849151611328125,
0.00215911865234375,
-0.07318115234375,
0.0217132568359375,
0.0171661376953125,
0.04388427734375,
0.032470703125,
-0.00905609130859375,
0.0290374755859375,
-0.017578125,
0.00933074951171875,
-0.0238037109375,
0.0210723876953125,
-0.0450439453125,
0.0101165771484375,
0.019378662109375,
-0.005443572998046875,
-0.0279693603515625,
-0.03717041015625,
0.01503753662109375,
-0.0287628173828125,
0.02545166015625,
0.0033664703369140625,
0.07861328125,
0.0281219482421875,
-0.02935791015625,
-0.023406982421875,
-0.058837890625,
0.06158447265625,
-0.03668212890625,
0.023712158203125,
0.0242462158203125,
0.056396484375,
0.021697998046875,
-0.0784912109375,
-0.08221435546875,
-0.00490570068359375,
-0.01300048828125,
0.0115203857421875,
-0.005443572998046875,
-0.005428314208984375,
0.05560302734375,
0.040863037109375,
-0.04022216796875,
-0.0061187744140625,
-0.062042236328125,
-0.0140380859375,
0.0280914306640625,
0.0037975311279296875,
0.00707244873046875,
-0.04248046875,
-0.031463623046875,
-0.043853759765625,
-0.00356292724609375,
0.01476287841796875,
0.0190582275390625,
0.004123687744140625,
0.01255035400390625,
0.044036865234375,
-0.0233612060546875,
0.049102783203125,
0.0213775634765625,
-0.021881103515625,
0.043426513671875,
-0.0196533203125,
-0.0253753662109375,
0.006420135498046875,
0.0662841796875,
0.0194549560546875,
0.01812744140625,
0.015960693359375,
-0.0103302001953125,
0.00923919677734375,
0.0234527587890625,
-0.07000732421875,
-0.019256591796875,
0.01666259765625,
-0.03436279296875,
-0.015655517578125,
0.0170440673828125,
-0.04632568359375,
-0.007747650146484375,
-0.03057861328125,
0.034027099609375,
-0.026153564453125,
-0.0196380615234375,
0.0142822265625,
-0.02520751953125,
0.052459716796875,
0.00901031494140625,
-0.0462646484375,
0.00557708740234375,
0.0162353515625,
0.024871826171875,
0.01557159423828125,
-0.013916015625,
-0.01070404052734375,
-0.0178680419921875,
-0.007244110107421875,
0.06805419921875,
-0.0298004150390625,
-0.038909912109375,
-0.00823974609375,
0.017425537109375,
-0.0443115234375,
-0.04888916015625,
0.0460205078125,
-0.0262298583984375,
0.051971435546875,
0.0005984306335449219,
-0.05401611328125,
-0.025543212890625,
0.0202484130859375,
-0.03302001953125,
0.09356689453125,
0.011932373046875,
-0.03564453125,
0.04290771484375,
-0.046600341796875,
-0.0245819091796875,
0.0006351470947265625,
-0.0287933349609375,
-0.0355224609375,
-0.0080108642578125,
-0.00021219253540039062,
0.018798828125,
0.006893157958984375,
0.02374267578125,
0.01250457763671875,
-0.034515380859375,
0.037078857421875,
-0.039886474609375,
0.063720703125,
0.0362548828125,
-0.027587890625,
-0.0006437301635742188,
-0.0794677734375,
0.0150146484375,
0.006195068359375,
-0.047454833984375,
0.0006170272827148438,
-0.0194854736328125,
0.009765625,
0.0287933349609375,
0.008819580078125,
-0.0498046875,
0.0161895751953125,
-0.03955078125,
0.0279388427734375,
0.0261688232421875,
-0.004146575927734375,
0.009033203125,
-0.027130126953125,
0.04229736328125,
0.0195159912109375,
0.0033512115478515625,
-0.0166015625,
-0.034698486328125,
-0.08599853515625,
0.010528564453125,
0.027801513671875,
0.0718994140625,
-0.057281494140625,
0.056610107421875,
-0.007747650146484375,
-0.03448486328125,
-0.05194091796875,
-0.00972747802734375,
0.01197052001953125,
0.04443359375,
0.03851318359375,
0.004695892333984375,
-0.06781005859375,
-0.06884765625,
-0.01416015625,
-0.0271453857421875,
-0.013519287109375,
0.0155792236328125,
0.0540771484375,
-0.019317626953125,
0.0770263671875,
-0.0489501953125,
-0.01092529296875,
-0.0301971435546875,
0.0196533203125,
0.01436614990234375,
0.0518798828125,
0.022674560546875,
-0.053619384765625,
-0.034515380859375,
-0.041412353515625,
-0.041107177734375,
0.006458282470703125,
-0.0233306884765625,
0.0030651092529296875,
-0.0008602142333984375,
0.066162109375,
-0.04132080078125,
0.034881591796875,
0.005046844482421875,
-0.0114898681640625,
0.04205322265625,
0.0037384033203125,
0.01348876953125,
-0.105712890625,
-0.00383758544921875,
-0.00695037841796875,
-0.0157470703125,
-0.039764404296875,
-0.003719329833984375,
0.00499725341796875,
-0.0125274658203125,
-0.028533935546875,
0.002780914306640625,
-0.024322509765625,
0.0198822021484375,
-0.0187835693359375,
0.00308990478515625,
0.0021152496337890625,
0.0355224609375,
0.0008177757263183594,
0.06219482421875,
0.038726806640625,
-0.05706787109375,
0.04229736328125,
0.07037353515625,
-0.043914794921875,
0.04681396484375,
-0.051788330078125,
0.027862548828125,
-0.0203399658203125,
0.0224761962890625,
-0.10211181640625,
-0.012451171875,
0.0243682861328125,
-0.0504150390625,
0.0169830322265625,
-0.00020182132720947266,
-0.0390625,
-0.051788330078125,
-0.005260467529296875,
0.01800537109375,
0.049041748046875,
-0.04302978515625,
0.055419921875,
0.0175323486328125,
0.01141357421875,
-0.0445556640625,
-0.06591796875,
-0.0028972625732421875,
-0.007022857666015625,
-0.02276611328125,
-0.0265045166015625,
-0.0187835693359375,
0.00463104248046875,
-0.02557373046875,
-0.0161285400390625,
-0.003337860107421875,
-0.0019464492797851562,
0.01419830322265625,
0.0102386474609375,
-0.0246734619140625,
0.0286407470703125,
-0.0016956329345703125,
-0.022186279296875,
0.015350341796875,
-0.00675201416015625,
0.060791015625,
-0.008056640625,
-0.031036376953125,
-0.0312042236328125,
0.0208740234375,
0.0105743408203125,
-0.004589080810546875,
0.053955078125,
0.052093505859375,
-0.007518768310546875,
-0.0186767578125,
-0.050689697265625,
-0.0156402587890625,
-0.03741455078125,
0.036865234375,
-0.032257080078125,
-0.06585693359375,
0.023712158203125,
-0.006008148193359375,
-0.00885009765625,
0.05364990234375,
0.026275634765625,
-0.00335693359375,
0.0784912109375,
0.034088134765625,
-0.0211181640625,
0.039520263671875,
-0.024932861328125,
0.02520751953125,
-0.034149169921875,
-0.0225677490234375,
-0.059234619140625,
-0.00555419921875,
-0.0426025390625,
-0.0301666259765625,
0.008270263671875,
-0.00861358642578125,
-0.0117034912109375,
0.033111572265625,
-0.0550537109375,
0.039703369140625,
0.058013916015625,
-0.005084991455078125,
0.008148193359375,
-0.0102386474609375,
0.005733489990234375,
0.0173797607421875,
-0.03680419921875,
-0.036163330078125,
0.07598876953125,
0.021820068359375,
0.046051025390625,
-0.01422882080078125,
0.03375244140625,
-0.0087890625,
0.01517486572265625,
-0.06768798828125,
0.048370361328125,
0.0010423660278320312,
-0.0677490234375,
-0.0162506103515625,
-0.0269775390625,
-0.06451416015625,
0.00803375244140625,
-0.004512786865234375,
-0.03363037109375,
0.02008056640625,
-0.0010089874267578125,
-0.0301513671875,
0.006023406982421875,
-0.060638427734375,
0.07183837890625,
0.0020694732666015625,
0.048919677734375,
-0.004734039306640625,
-0.0728759765625,
0.01123809814453125,
0.0063018798828125,
-0.00649261474609375,
0.0108489990234375,
0.0290985107421875,
0.070556640625,
-0.02630615234375,
0.0633544921875,
-0.0292510986328125,
0.02081298828125,
0.042205810546875,
-0.045166015625,
0.00824737548828125,
-0.01398468017578125,
0.0185394287109375,
-0.01329803466796875,
0.01288604736328125,
-0.0163116455078125,
-0.02899169921875,
0.00823974609375,
-0.05718994140625,
-0.040740966796875,
-0.04290771484375,
-0.02288818359375,
-0.0174560546875,
0.01457977294921875,
0.03570556640625,
0.035888671875,
-0.01806640625,
0.024322509765625,
0.055267333984375,
-0.03948974609375,
0.025146484375,
0.0458984375,
-0.0202178955078125,
-0.0164947509765625,
0.07562255859375,
0.004611968994140625,
0.0239410400390625,
0.0295562744140625,
0.0186767578125,
-0.0399169921875,
-0.019256591796875,
-0.0278472900390625,
0.00971221923828125,
-0.07330322265625,
-0.0148773193359375,
-0.079833984375,
-0.050018310546875,
-0.047027587890625,
0.032928466796875,
0.005157470703125,
-0.035003662109375,
-0.024322509765625,
-0.04095458984375,
0.026031494140625,
0.03497314453125,
-0.0012340545654296875,
0.0557861328125,
-0.052581787109375,
0.0401611328125,
0.045166015625,
0.01006317138671875,
-0.0209503173828125,
-0.047637939453125,
-0.01806640625,
0.006744384765625,
-0.026336669921875,
-0.048919677734375,
0.02166748046875,
0.026947021484375,
0.044708251953125,
0.03302001953125,
-0.01088714599609375,
0.045166015625,
-0.01560211181640625,
0.065673828125,
0.004337310791015625,
-0.0626220703125,
0.04095458984375,
0.01751708984375,
0.0282135009765625,
0.06793212890625,
0.033935546875,
-0.044830322265625,
-0.0298614501953125,
-0.052490234375,
-0.04779052734375,
0.059112548828125,
0.0009751319885253906,
0.0207672119140625,
-0.0124053955078125,
0.0147552490234375,
0.019012451171875,
0.04156494140625,
-0.0528564453125,
-0.042144775390625,
-0.046539306640625,
-0.0333251953125,
-0.0166015625,
-0.0195770263671875,
0.0011072158813476562,
-0.0509033203125,
0.044891357421875,
-0.009063720703125,
0.037353515625,
0.01387786865234375,
-0.0242919921875,
0.01209259033203125,
0.0254058837890625,
0.033416748046875,
0.022186279296875,
-0.01094818115234375,
-0.002101898193359375,
0.016937255859375,
-0.051422119140625,
0.00830078125,
0.032135009765625,
-0.0184478759765625,
0.0106201171875,
-0.0018768310546875,
0.078125,
-0.0157012939453125,
-0.051727294921875,
0.039886474609375,
-0.0260009765625,
-0.0240478515625,
-0.053863525390625,
0.01523590087890625,
-0.0035343170166015625,
0.01080322265625,
0.0287933349609375,
0.0110931396484375,
0.0265655517578125,
-0.048309326171875,
0.037811279296875,
0.01441192626953125,
-0.037261962890625,
0.00833892822265625,
0.053314208984375,
-0.0121612548828125,
-0.0211181640625,
0.062347412109375,
-0.021820068359375,
-0.057830810546875,
0.034423828125,
0.0212860107421875,
0.05810546875,
-0.032958984375,
0.025482177734375,
0.042755126953125,
0.01404571533203125,
-0.00804901123046875,
0.034912109375,
-0.01629638671875,
-0.07293701171875,
-0.0306549072265625,
-0.056610107421875,
-0.0362548828125,
0.0215911865234375,
-0.061065673828125,
0.0011472702026367188,
-0.029388427734375,
0.0008440017700195312,
0.0129852294921875,
-0.01363372802734375,
-0.05120849609375,
0.050079345703125,
-0.00196075439453125,
0.06585693359375,
-0.054931640625,
0.051849365234375,
0.0482177734375,
-0.0584716796875,
-0.084716796875,
-0.017822265625,
-0.0245819091796875,
-0.083251953125,
0.048828125,
0.0394287109375,
0.01995849609375,
0.00408935546875,
-0.0697021484375,
-0.04058837890625,
0.05657958984375,
-0.0188751220703125,
-0.02545166015625,
0.005512237548828125,
0.042205810546875,
0.035369873046875,
-0.0183258056640625,
0.0308990478515625,
0.07647705078125,
0.0276947021484375,
0.01824951171875,
-0.06134033203125,
-0.022064208984375,
-0.0303192138671875,
-0.02032470703125,
0.0019397735595703125,
-0.041259765625,
0.06500244140625,
0.007274627685546875,
-0.015899658203125,
0.01139068603515625,
0.049591064453125,
0.01198577880859375,
0.0171661376953125,
0.062103271484375,
0.042144775390625,
0.0482177734375,
-0.03131103515625,
0.06414794921875,
-0.0166473388671875,
0.0213470458984375,
0.076904296875,
0.005306243896484375,
0.0467529296875,
0.013763427734375,
-0.027618408203125,
0.04376220703125,
0.04364013671875,
-0.03778076171875,
0.0191497802734375,
0.02777099609375,
-0.01399993896484375,
-0.0174407958984375,
0.01120758056640625,
-0.0250396728515625,
0.0406494140625,
0.0257720947265625,
-0.0277252197265625,
0.0024814605712890625,
0.025054931640625,
-0.005115509033203125,
-0.01568603515625,
-0.0111541748046875,
0.07080078125,
0.00826263427734375,
-0.056640625,
0.047637939453125,
-0.01207733154296875,
0.04608154296875,
-0.037078857421875,
-0.0042724609375,
0.001087188720703125,
0.0180511474609375,
-0.01250457763671875,
-0.0518798828125,
0.0217742919921875,
-0.01959228515625,
0.003902435302734375,
-0.0130462646484375,
0.062042236328125,
-0.042510986328125,
-0.0396728515625,
-0.0169830322265625,
0.0391845703125,
0.037994384765625,
-0.0085906982421875,
-0.062042236328125,
-0.033599853515625,
0.0122833251953125,
-0.01824951171875,
0.003753662109375,
0.0285797119140625,
0.034271240234375,
0.052001953125,
0.032623291015625,
-0.022247314453125,
0.028900146484375,
0.030059814453125,
0.0609130859375,
-0.05194091796875,
-0.05072021484375,
-0.031341552734375,
0.0328369140625,
-0.0191497802734375,
-0.03643798828125,
0.077880859375,
0.056488037109375,
0.0748291015625,
-0.0226898193359375,
0.06683349609375,
-0.01227569580078125,
0.0699462890625,
-0.022186279296875,
0.02789306640625,
-0.0231170654296875,
-0.005157470703125,
0.0019989013671875,
-0.04718017578125,
0.01456451416015625,
0.08184814453125,
-0.01123046875,
0.01343536376953125,
0.059173583984375,
0.052154541015625,
0.00774383544921875,
0.031829833984375,
0.02008056640625,
0.0299224853515625,
0.0120391845703125,
0.01097869873046875,
0.0806884765625,
-0.050689697265625,
0.061370849609375,
-0.020111083984375,
0.005321502685546875,
-0.0028972625732421875,
-0.02288818359375,
-0.0859375,
-0.043914794921875,
-0.0294189453125,
-0.047698974609375,
-0.037261962890625,
0.07476806640625,
0.03497314453125,
-0.084716796875,
-0.027374267578125,
0.0016870498657226562,
-0.0157318115234375,
-0.007167816162109375,
-0.02520751953125,
0.00994873046875,
-0.019256591796875,
-0.0584716796875,
0.0281524658203125,
-0.016357421875,
0.023040771484375,
-0.0264129638671875,
0.032684326171875,
-0.0191802978515625,
0.0157470703125,
0.034576416015625,
0.01554107666015625,
-0.01232147216796875,
-0.037139892578125,
-0.002063751220703125,
-0.002712249755859375,
0.028533935546875,
0.0474853515625,
-0.06329345703125,
0.0289764404296875,
0.036468505859375,
0.03338623046875,
0.05584716796875,
0.0186920166015625,
0.034423828125,
-0.05596923828125,
0.0148162841796875,
0.023345947265625,
0.022308349609375,
0.02288818359375,
-0.006805419921875,
0.025054931640625,
0.02862548828125,
-0.0509033203125,
-0.05389404296875,
0.0078277587890625,
-0.07470703125,
-0.013885498046875,
0.07342529296875,
0.0099334716796875,
-0.013763427734375,
-0.036956787109375,
-0.039215087890625,
0.0289306640625,
-0.029693603515625,
0.043548583984375,
0.0467529296875,
-0.0012054443359375,
-0.007114410400390625,
-0.051544189453125,
0.04669189453125,
0.0211181640625,
-0.061981201171875,
-0.0352783203125,
0.0170440673828125,
0.037384033203125,
0.034881591796875,
0.040191650390625,
0.00484466552734375,
-0.0019664764404296875,
0.01470184326171875,
0.01209259033203125,
0.01013946533203125,
-0.004058837890625,
-0.0193634033203125,
0.016082763671875,
-0.010223388671875,
-0.052337646484375
]
] |
Salesforce/codet5-base | 2021-11-23T09:53:41.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"codet5",
"dataset:code_search_net",
"arxiv:2109.00859",
"arxiv:1909.09436",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | Salesforce | null | null | Salesforce/codet5-base | 83 | 17,108 | transformers | 2022-03-02T23:29:04 | ---
license: apache-2.0
tags:
- codet5
datasets:
- code_search_net
inference: false
---
# CodeT5 (base-sized model)
Pre-trained CodeT5 model. It was introduced in the paper [CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models
for Code Understanding and Generation](https://arxiv.org/abs/2109.00859) by Yue Wang, Weishi Wang, Shafiq Joty, Steven C.H. Hoi and first released in [this repository](https://github.com/salesforce/CodeT5).
Disclaimer: The team releasing CodeT5 did not write a model card for this model so this model card has been written by the Hugging Face team (more specifically, [nielsr](https://huggingface.co/nielsr)).
## Model description
From the abstract:
"We present CodeT5, a unified pre-trained encoder-decoder Transformer model that better leverages the code semantics conveyed from the developer-assigned identifiers. Our model employs a unified framework to seamlessly support both code understanding and generation tasks and allows for multi-task learning. Besides, we propose a novel identifier-aware pre-training task that enables the model to distinguish which code tokens are identifiers and to recover them when they are masked. Furthermore, we propose to exploit the user-written code comments with a bimodal dual generation task for better NL-PL alignment. Comprehensive experiments show that CodeT5 significantly outperforms prior methods on understanding tasks such as code defect detection and clone detection, and generation tasks across various directions including PL-NL, NL-PL, and PL-PL. Further analysis reveals that our model can better capture semantic information from code."
## Intended uses & limitations
This repository contains the pre-trained model only, so you can use this model for (among other tasks) masked span prediction, as shown in the code example below. However, the main use of this model is to fine-tune it for a downstream task of interest, such as:
* code summarization
* code generation
* code translation
* code refinement
* code defect detection
* code clone detection.
Supervised datasets for code can be found [here](https://huggingface.co/datasets?languages=languages:code).
See the [model hub](https://huggingface.co/models?search=salesforce/codet) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model:
```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration
tokenizer = RobertaTokenizer.from_pretrained('Salesforce/codet5-base')
model = T5ForConditionalGeneration.from_pretrained('Salesforce/codet5-base')
text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids
# simply generate a single sequence
generated_ids = model.generate(input_ids, max_length=8)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
# this prints "{user.username}"
```
## Training data
The CodeT5 model was pretrained on CodeSearchNet [Husain et al., 2019](https://arxiv.org/abs/1909.09436). Additionally, the authors collected two datasets of C/CSharp from [BigQuery1](https://console.cloud.google.com/marketplace/details/github/github-repos) to ensure that all downstream tasks have overlapped programming languages with the pre-training data. In total, around 8.35 million instances are used for pretraining.
## Training procedure
### Preprocessing
This model uses a code-specific BPE (Byte-Pair Encoding) tokenizer trained using the [HuggingFace Tokenizers](https://github.com/huggingface/tokenizers) library. One can prepare text (or code) for the model using RobertaTokenizer, with the files from this repository.
## Evaluation results
For evaluation results on several downstream benchmarks, we refer to the paper.
### BibTeX entry and citation info
```bibtex
@misc{wang2021codet5,
title={CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation},
author={Yue Wang and Weishi Wang and Shafiq Joty and Steven C. H. Hoi},
year={2021},
eprint={2109.00859},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 4,140 | [
[
-0.027130126953125,
-0.02191162109375,
0.000827789306640625,
0.021331787109375,
-0.0163116455078125,
0.005146026611328125,
-0.024932861328125,
-0.037139892578125,
-0.006011962890625,
0.0241546630859375,
-0.039825439453125,
-0.0430908203125,
-0.04266357421875,
0.0118865966796875,
-0.0193939208984375,
0.0997314453125,
0.0019216537475585938,
-0.002655029296875,
-0.0194244384765625,
-0.005779266357421875,
-0.0318603515625,
-0.0654296875,
-0.0137786865234375,
-0.017852783203125,
0.015838623046875,
0.007335662841796875,
0.0264892578125,
0.040985107421875,
0.04449462890625,
0.0218353271484375,
-0.002872467041015625,
-0.006000518798828125,
-0.03155517578125,
-0.0287933349609375,
0.013702392578125,
-0.035797119140625,
-0.052886962890625,
0.0020923614501953125,
0.0181732177734375,
0.04498291015625,
0.0036678314208984375,
0.0382080078125,
-0.003070831298828125,
0.03350830078125,
-0.03704833984375,
0.014556884765625,
-0.045257568359375,
-0.00006341934204101562,
-0.005886077880859375,
-0.00039505958557128906,
-0.035003662109375,
-0.01910400390625,
0.00058746337890625,
-0.030853271484375,
0.03509521484375,
-0.019500732421875,
0.09259033203125,
0.0255889892578125,
-0.0206756591796875,
-0.004688262939453125,
-0.03680419921875,
0.056915283203125,
-0.06463623046875,
0.031951904296875,
0.01062774658203125,
-0.0002474784851074219,
0.00936126708984375,
-0.08355712890625,
-0.03948974609375,
-0.016387939453125,
-0.0174407958984375,
0.00827789306640625,
-0.0166015625,
0.0187835693359375,
0.046539306640625,
0.0291900634765625,
-0.0533447265625,
-0.0084381103515625,
-0.04913330078125,
-0.0231781005859375,
0.045745849609375,
-0.0133209228515625,
0.0260467529296875,
-0.0123748779296875,
-0.041351318359375,
0.0010938644409179688,
-0.044952392578125,
0.01085662841796875,
0.0189666748046875,
0.00733184814453125,
-0.0262908935546875,
0.011199951171875,
-0.003528594970703125,
0.053863525390625,
0.0191497802734375,
-0.0019512176513671875,
0.05340576171875,
-0.042022705078125,
-0.0252227783203125,
-0.01244354248046875,
0.064208984375,
0.0028820037841796875,
0.02801513671875,
-0.0224761962890625,
-0.0212249755859375,
0.0223236083984375,
0.029754638671875,
-0.0845947265625,
-0.030303955078125,
0.0238037109375,
-0.045440673828125,
-0.034271240234375,
0.027435302734375,
-0.03271484375,
0.00041556358337402344,
-0.0214385986328125,
0.021453857421875,
-0.03302001953125,
-0.0294189453125,
0.00827789306640625,
-0.00821685791015625,
0.0199127197265625,
-0.005619049072265625,
-0.06256103515625,
0.0013380050659179688,
0.0293426513671875,
0.042266845703125,
-0.007740020751953125,
-0.042572021484375,
-0.0285186767578125,
-0.0112457275390625,
-0.01239013671875,
0.031524658203125,
-0.0256500244140625,
-0.01554107666015625,
-0.013763427734375,
0.010955810546875,
-0.0027828216552734375,
-0.042327880859375,
0.0218048095703125,
-0.04290771484375,
0.01070404052734375,
-0.004940032958984375,
-0.0306854248046875,
-0.01003265380859375,
0.0130157470703125,
-0.0413818359375,
0.0809326171875,
0.0173187255859375,
-0.061737060546875,
0.039031982421875,
-0.064208984375,
-0.02130126953125,
0.0038089752197265625,
-0.009124755859375,
-0.050445556640625,
0.0004832744598388672,
0.021697998046875,
0.0416259765625,
-0.0030765533447265625,
0.03900146484375,
-0.0167388916015625,
-0.033905029296875,
0.00836944580078125,
-0.00644683837890625,
0.06591796875,
0.03143310546875,
-0.04248046875,
0.029022216796875,
-0.0560302734375,
0.01480865478515625,
0.01395416259765625,
-0.0242462158203125,
0.00551605224609375,
-0.0241241455078125,
0.0102081298828125,
0.0292205810546875,
0.0299224853515625,
-0.02825927734375,
0.0231781005859375,
-0.0187225341796875,
0.047149658203125,
0.0430908203125,
-0.0164031982421875,
0.0301666259765625,
-0.0133819580078125,
0.055999755859375,
0.031280517578125,
0.0146484375,
-0.04241943359375,
-0.0149993896484375,
-0.048492431640625,
-0.0210723876953125,
0.04833984375,
0.0285797119140625,
-0.04541015625,
0.042816162109375,
-0.039154052734375,
-0.03802490234375,
-0.03387451171875,
0.010345458984375,
0.060089111328125,
0.0149993896484375,
0.035064697265625,
-0.03863525390625,
-0.0726318359375,
-0.04168701171875,
-0.023895263671875,
0.0017805099487304688,
0.005126953125,
0.00499725341796875,
0.05560302734375,
-0.0369873046875,
0.07598876953125,
-0.032501220703125,
-0.00312042236328125,
-0.034423828125,
0.013824462890625,
0.0124053955078125,
0.06573486328125,
0.04962158203125,
-0.0679931640625,
-0.0271759033203125,
-0.01415252685546875,
-0.055694580078125,
-0.002162933349609375,
-0.00936126708984375,
-0.002231597900390625,
0.028228759765625,
0.0469970703125,
-0.0306854248046875,
0.03564453125,
0.047454833984375,
-0.02325439453125,
0.037384033203125,
-0.00849151611328125,
-0.005153656005859375,
-0.10284423828125,
0.0310821533203125,
-0.0019102096557617188,
-0.0120697021484375,
-0.046905517578125,
0.0111541748046875,
0.0200958251953125,
-0.024169921875,
-0.0252838134765625,
0.0206756591796875,
-0.0606689453125,
-0.016571044921875,
0.001445770263671875,
-0.01214599609375,
0.0014829635620117188,
0.0604248046875,
0.0112152099609375,
0.06982421875,
0.0229034423828125,
-0.04608154296875,
0.004055023193359375,
0.0177001953125,
-0.01593017578125,
0.007293701171875,
-0.0638427734375,
0.0308074951171875,
0.005405426025390625,
0.0238037109375,
-0.055572509765625,
-0.00933074951171875,
-0.003360748291015625,
-0.0570068359375,
0.0231781005859375,
-0.034454345703125,
-0.0294036865234375,
-0.047454833984375,
-0.00885772705078125,
0.04034423828125,
0.05303955078125,
-0.045928955078125,
0.028900146484375,
0.01727294921875,
0.017425537109375,
-0.051025390625,
-0.05743408203125,
-0.00914764404296875,
-0.01824951171875,
-0.054656982421875,
0.04345703125,
-0.019073486328125,
0.018768310546875,
-0.006862640380859375,
-0.0169525146484375,
-0.0116729736328125,
0.004474639892578125,
0.0303955078125,
0.0328369140625,
-0.0235748291015625,
0.00090789794921875,
-0.02886962890625,
-0.017059326171875,
0.0030879974365234375,
-0.04083251953125,
0.047515869140625,
-0.0322265625,
-0.00994110107421875,
-0.025177001953125,
0.005859375,
0.03961181640625,
-0.05682373046875,
0.048004150390625,
0.07000732421875,
-0.02239990234375,
-0.00797271728515625,
-0.03790283203125,
-0.0133209228515625,
-0.03509521484375,
0.036865234375,
-0.045166015625,
-0.058563232421875,
0.041961669921875,
-0.0069580078125,
-0.0013818740844726562,
0.0230712890625,
0.03802490234375,
0.0252838134765625,
0.079345703125,
0.06573486328125,
-0.00823974609375,
0.054351806640625,
-0.04510498046875,
0.028472900390625,
-0.048553466796875,
-0.01319122314453125,
-0.0345458984375,
-0.00507354736328125,
-0.0426025390625,
-0.020599365234375,
0.01299285888671875,
0.02130126953125,
-0.0328369140625,
0.049774169921875,
-0.06353759765625,
0.029052734375,
0.0443115234375,
0.005584716796875,
0.0039520263671875,
-0.00827789306640625,
-0.0021076202392578125,
0.003612518310546875,
-0.04962158203125,
-0.0273895263671875,
0.08843994140625,
0.026824951171875,
0.046142578125,
-0.006649017333984375,
0.06671142578125,
0.0042572021484375,
0.01091766357421875,
-0.033721923828125,
0.02325439453125,
-0.00504302978515625,
-0.03790283203125,
0.004741668701171875,
-0.0372314453125,
-0.06787109375,
0.003757476806640625,
-0.006305694580078125,
-0.04620361328125,
0.01097869873046875,
0.020538330078125,
-0.03277587890625,
0.0136566162109375,
-0.0831298828125,
0.10565185546875,
-0.017822265625,
-0.00981903076171875,
0.01206207275390625,
-0.062347412109375,
0.029022216796875,
0.00045108795166015625,
0.005176544189453125,
0.0176849365234375,
0.015106201171875,
0.0648193359375,
-0.045318603515625,
0.062286376953125,
-0.0267486572265625,
0.00732421875,
0.0147705078125,
-0.017730712890625,
0.0267486572265625,
-0.0118408203125,
0.01024627685546875,
0.0311431884765625,
0.0191650390625,
-0.04815673828125,
-0.03485107421875,
0.029998779296875,
-0.0660400390625,
-0.0265045166015625,
-0.02587890625,
-0.0292510986328125,
-0.005603790283203125,
0.032073974609375,
0.047149658203125,
0.034149169921875,
-0.0030059814453125,
0.018280029296875,
0.043914794921875,
-0.0222015380859375,
0.03997802734375,
0.01554107666015625,
-0.01409149169921875,
-0.0225830078125,
0.063232421875,
-0.0017251968383789062,
0.0221099853515625,
0.01280975341796875,
-0.0084686279296875,
-0.00849151611328125,
-0.03961181640625,
-0.028167724609375,
0.01354217529296875,
-0.040985107421875,
-0.0309295654296875,
-0.049224853515625,
-0.025787353515625,
-0.0487060546875,
-0.010772705078125,
-0.0291290283203125,
0.0040435791015625,
-0.01004791259765625,
-0.0068206787109375,
0.0254974365234375,
0.049072265625,
0.007137298583984375,
0.021209716796875,
-0.07122802734375,
0.0201263427734375,
0.00384521484375,
0.034912109375,
-0.009674072265625,
-0.0411376953125,
-0.033111572265625,
0.01137542724609375,
-0.0179443359375,
-0.048797607421875,
0.0321044921875,
-0.002231597900390625,
0.033599853515625,
0.020355224609375,
0.0020503997802734375,
0.0582275390625,
-0.01416015625,
0.07025146484375,
0.019439697265625,
-0.09259033203125,
0.046234130859375,
-0.0157318115234375,
0.030517578125,
0.034332275390625,
0.021697998046875,
-0.030487060546875,
0.0015249252319335938,
-0.0634765625,
-0.05963134765625,
0.0723876953125,
0.0089111328125,
0.0122833251953125,
0.0132293701171875,
0.015838623046875,
-0.00452423095703125,
0.0252838134765625,
-0.0799560546875,
-0.01021575927734375,
-0.03857421875,
-0.02716064453125,
0.003116607666015625,
-0.0166473388671875,
0.011199951171875,
-0.031158447265625,
0.03948974609375,
-0.01519012451171875,
0.050018310546875,
0.007572174072265625,
-0.035430908203125,
0.01202392578125,
0.004665374755859375,
0.046875,
0.051422119140625,
-0.015380859375,
-0.0085906982421875,
0.006923675537109375,
-0.0533447265625,
-0.0104827880859375,
0.0199737548828125,
0.0067901611328125,
-0.0023174285888671875,
0.0284271240234375,
0.07666015625,
0.0117340087890625,
-0.049407958984375,
0.06671142578125,
-0.00295257568359375,
-0.023040771484375,
-0.03717041015625,
-0.00434112548828125,
0.0023021697998046875,
0.00726318359375,
0.01342010498046875,
0.021087646484375,
-0.01030731201171875,
-0.0377197265625,
0.0231781005859375,
0.01392364501953125,
-0.02777099609375,
-0.03765869140625,
0.0633544921875,
0.0200958251953125,
-0.00537109375,
0.045013427734375,
-0.01934814453125,
-0.060791015625,
0.06634521484375,
0.044219970703125,
0.06683349609375,
-0.00034999847412109375,
-0.003742218017578125,
0.036651611328125,
0.01103973388671875,
0.00560760498046875,
0.02191162109375,
-0.0171356201171875,
-0.058807373046875,
-0.03131103515625,
-0.043548583984375,
0.007396697998046875,
0.0079345703125,
-0.054718017578125,
0.0290985107421875,
-0.028472900390625,
-0.009613037109375,
0.00023603439331054688,
0.01068878173828125,
-0.0697021484375,
0.0232696533203125,
0.01082611083984375,
0.06463623046875,
-0.050506591796875,
0.0814208984375,
0.059600830078125,
-0.08050537109375,
-0.0887451171875,
0.01010894775390625,
-0.0232696533203125,
-0.05645751953125,
0.060089111328125,
0.029205322265625,
0.011199951171875,
0.0157318115234375,
-0.05859375,
-0.05474853515625,
0.09051513671875,
0.0255126953125,
-0.0197601318359375,
-0.0268096923828125,
-0.003093719482421875,
0.036865234375,
-0.034088134765625,
0.03570556640625,
0.038330078125,
0.015838623046875,
-0.002719879150390625,
-0.072509765625,
0.01311492919921875,
-0.0430908203125,
0.0216522216796875,
0.012176513671875,
-0.04681396484375,
0.076416015625,
-0.03546142578125,
-0.00701141357421875,
-0.0021533966064453125,
0.0450439453125,
0.0165557861328125,
0.00807952880859375,
0.0292205810546875,
0.037139892578125,
0.048797607421875,
-0.01082611083984375,
0.07440185546875,
-0.047821044921875,
0.040252685546875,
0.06072998046875,
0.007099151611328125,
0.043670654296875,
0.0222320556640625,
-0.0190277099609375,
0.0443115234375,
0.04937744140625,
-0.023040771484375,
0.023895263671875,
0.01558685302734375,
0.004169464111328125,
-0.006160736083984375,
0.01334381103515625,
-0.051910400390625,
0.035308837890625,
0.0054473876953125,
-0.040283203125,
0.0070953369140625,
-0.0008535385131835938,
0.0218658447265625,
-0.007701873779296875,
-0.006610870361328125,
0.0587158203125,
-0.0007205009460449219,
-0.05810546875,
0.0753173828125,
0.0079345703125,
0.08526611328125,
-0.052276611328125,
0.0027294158935546875,
-0.0228271484375,
0.025421142578125,
-0.036712646484375,
-0.018829345703125,
0.0167236328125,
0.019012451171875,
-0.0164031982421875,
-0.027313232421875,
0.047515869140625,
-0.03375244140625,
-0.017303466796875,
0.0227813720703125,
0.0230712890625,
0.009613037109375,
-0.00827789306640625,
-0.056243896484375,
0.0130462646484375,
0.0166778564453125,
-0.020355224609375,
0.0258636474609375,
0.0288238525390625,
0.0007200241088867188,
0.029541015625,
0.057098388671875,
-0.01187896728515625,
0.0167083740234375,
0.002960205078125,
0.0662841796875,
-0.05596923828125,
-0.045166015625,
-0.05029296875,
0.05572509765625,
-0.0078887939453125,
-0.043121337890625,
0.048248291015625,
0.049896240234375,
0.09442138671875,
-0.0280914306640625,
0.0728759765625,
-0.02069091796875,
0.0232696533203125,
-0.03411865234375,
0.05340576171875,
-0.03814697265625,
0.019287109375,
-0.034149169921875,
-0.05694580078125,
-0.021148681640625,
0.040252685546875,
-0.02874755859375,
0.040069580078125,
0.052032470703125,
0.06640625,
-0.013031005859375,
-0.01038360595703125,
0.02215576171875,
0.01085662841796875,
0.0200958251953125,
0.054473876953125,
0.033233642578125,
-0.0545654296875,
0.057647705078125,
-0.0227508544921875,
-0.002666473388671875,
-0.01531982421875,
-0.056243896484375,
-0.0814208984375,
-0.053314208984375,
-0.0226593017578125,
-0.044219970703125,
0.0026416778564453125,
0.08209228515625,
0.079833984375,
-0.06097412109375,
-0.0112152099609375,
-0.02581787109375,
-0.01265716552734375,
-0.017578125,
-0.0164947509765625,
0.0268402099609375,
-0.046600341796875,
-0.0516357421875,
-0.002414703369140625,
-0.0117645263671875,
0.004180908203125,
-0.00531005859375,
-0.0169219970703125,
0.0004286766052246094,
-0.0173797607421875,
0.036163330078125,
0.0247039794921875,
-0.049407958984375,
-0.0118255615234375,
0.01218414306640625,
-0.01885986328125,
0.01934814453125,
0.061126708984375,
-0.06500244140625,
0.0220947265625,
0.027618408203125,
0.058685302734375,
0.048797607421875,
0.01004791259765625,
0.0474853515625,
-0.064697265625,
0.027069091796875,
0.006313323974609375,
0.0269622802734375,
0.017852783203125,
-0.02703857421875,
0.050537109375,
0.031219482421875,
-0.03692626953125,
-0.06768798828125,
0.0110321044921875,
-0.06842041015625,
-0.0167236328125,
0.08050537109375,
-0.029510498046875,
-0.01348114013671875,
0.00811004638671875,
-0.01088714599609375,
0.0299835205078125,
-0.0158538818359375,
0.035186767578125,
0.033203125,
0.01214599609375,
-0.0205230712890625,
-0.0307159423828125,
0.053924560546875,
0.026702880859375,
-0.06475830078125,
-0.0068511962890625,
0.0202789306640625,
0.033447265625,
0.01145172119140625,
0.054901123046875,
-0.0177154541015625,
0.0264434814453125,
0.00966644287109375,
0.0517578125,
-0.0236053466796875,
-0.0105438232421875,
-0.0316162109375,
0.00339508056640625,
0.003482818603515625,
-0.029876708984375
]
] |
nlpaueb/bert-base-greek-uncased-v1 | 2022-03-02T16:32:57.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"pretraining",
"fill-mask",
"el",
"arxiv:2008.12014",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | nlpaueb | null | null | nlpaueb/bert-base-greek-uncased-v1 | 17 | 17,048 | transformers | 2022-03-02T23:29:05 | ---
language: el
pipeline_tag: fill-mask
thumbnail: https://github.com/nlpaueb/GreekBERT/raw/master/greek-bert-logo.png
widget:
- text: "Σήμερα είναι μια [MASK] μέρα."
---
# GreekBERT
A Greek version of BERT pre-trained language model.
<img src="https://github.com/nlpaueb/GreekBERT/raw/master/greek-bert-logo.png" width="600"/>
## Pre-training corpora
The pre-training corpora of `bert-base-greek-uncased-v1` include:
* The Greek part of [Wikipedia](https://el.wikipedia.org/wiki/Βικιπαίδεια:Αντίγραφα_της_βάσης_δεδομένων),
* The Greek part of [European Parliament Proceedings Parallel Corpus](https://www.statmt.org/europarl/), and
* The Greek part of [OSCAR](https://traces1.inria.fr/oscar/), a cleansed version of [Common Crawl](https://commoncrawl.org).
Future release will also include:
* The entire corpus of Greek legislation, as published by the [National Publication Office](http://www.et.gr),
* The entire corpus of EU legislation (Greek translation), as published in [Eur-Lex](https://eur-lex.europa.eu/homepage.html?locale=en).
## Pre-training details
* We trained BERT using the official code provided in Google BERT's GitHub repository (https://github.com/google-research/bert).* We then used [Hugging Face](https://huggingface.co)'s [Transformers](https://github.com/huggingface/transformers) conversion script to convert the TF checkpoint and vocabulary in the desired format in order to be able to load the model in two lines of code for both PyTorch and TF2 users.
* We released a model similar to the English `bert-base-uncased` model (12-layer, 768-hidden, 12-heads, 110M parameters).
* We chose to follow the same training set-up: 1 million training steps with batches of 256 sequences of length 512 with an initial learning rate 1e-4.
* We were able to use a single Google Cloud TPU v3-8 provided for free from [TensorFlow Research Cloud (TFRC)](https://www.tensorflow.org/tfrc), while also utilizing [GCP research credits](https://edu.google.com/programs/credits/research). Huge thanks to both Google programs for supporting us!
\* You can still have access to the original TensorFlow checkpoints from this [Google Drive folder](https://drive.google.com/drive/folders/1ZjlaE4nvdtgqXiVBTVHCF5I9Ff8ZmztE?usp=sharing).
## Requirements
We published `bert-base-greek-uncased-v1` as part of [Hugging Face](https://huggingface.co)'s [Transformers](https://github.com/huggingface/transformers) repository. So, you need to install the transformers library through pip along with PyTorch or Tensorflow 2.
```
pip install transformers
pip install (torch|tensorflow)
```
## Pre-process text (Deaccent - Lower)
**NOTICE:** Preprocessing is now natively supported by the default tokenizer. No need to include the following code.
In order to use `bert-base-greek-uncased-v1`, you have to pre-process texts to lowercase letters and remove all Greek diacritics.
```python
import unicodedata
def strip_accents_and_lowercase(s):
return ''.join(c for c in unicodedata.normalize('NFD', s)
if unicodedata.category(c) != 'Mn').lower()
accented_string = "Αυτή είναι η Ελληνική έκδοση του BERT."
unaccented_string = strip_accents_and_lowercase(accented_string)
print(unaccented_string) # αυτη ειναι η ελληνικη εκδοση του bert.
```
## Load Pretrained Model
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("nlpaueb/bert-base-greek-uncased-v1")
model = AutoModel.from_pretrained("nlpaueb/bert-base-greek-uncased-v1")
```
## Use Pretrained Model as a Language Model
```python
import torch
from transformers import *
# Load model and tokenizer
tokenizer_greek = AutoTokenizer.from_pretrained('nlpaueb/bert-base-greek-uncased-v1')
lm_model_greek = AutoModelWithLMHead.from_pretrained('nlpaueb/bert-base-greek-uncased-v1')
# ================ EXAMPLE 1 ================
text_1 = 'O ποιητής έγραψε ένα [MASK] .'
# EN: 'The poet wrote a [MASK].'
input_ids = tokenizer_greek.encode(text_1)
print(tokenizer_greek.convert_ids_to_tokens(input_ids))
# ['[CLS]', 'o', 'ποιητης', 'εγραψε', 'ενα', '[MASK]', '.', '[SEP]']
outputs = lm_model_greek(torch.tensor([input_ids]))[0]
print(tokenizer_greek.convert_ids_to_tokens(outputs[0, 5].max(0)[1].item()))
# the most plausible prediction for [MASK] is "song"
# ================ EXAMPLE 2 ================
text_2 = 'Είναι ένας [MASK] άνθρωπος.'
# EN: 'He is a [MASK] person.'
input_ids = tokenizer_greek.encode(text_2)
print(tokenizer_greek.convert_ids_to_tokens(input_ids))
# ['[CLS]', 'ειναι', 'ενας', '[MASK]', 'ανθρωπος', '.', '[SEP]']
outputs = lm_model_greek(torch.tensor([input_ids]))[0]
print(tokenizer_greek.convert_ids_to_tokens(outputs[0, 3].max(0)[1].item()))
# the most plausible prediction for [MASK] is "good"
# ================ EXAMPLE 3 ================
text_3 = 'Είναι ένας [MASK] άνθρωπος και κάνει συχνά [MASK].'
# EN: 'He is a [MASK] person he does frequently [MASK].'
input_ids = tokenizer_greek.encode(text_3)
print(tokenizer_greek.convert_ids_to_tokens(input_ids))
# ['[CLS]', 'ειναι', 'ενας', '[MASK]', 'ανθρωπος', 'και', 'κανει', 'συχνα', '[MASK]', '.', '[SEP]']
outputs = lm_model_greek(torch.tensor([input_ids]))[0]
print(tokenizer_greek.convert_ids_to_tokens(outputs[0, 8].max(0)[1].item()))
# the most plausible prediction for the second [MASK] is "trips"
```
## Evaluation on downstream tasks
For detailed results read the article:
GREEK-BERT: The Greeks visiting Sesame Street. John Koutsikakis, Ilias Chalkidis, Prodromos Malakasiotis and Ion Androutsopoulos. In the Proceedings of the 11th Hellenic Conference on Artificial Intelligence (SETN 2020). Held Online. 2020. (https://arxiv.org/abs/2008.12014)
### Named Entity Recognition with Greek NER dataset
| Model name | Micro F1 |
| ------------------- | ------------------------------------ |
BILSTM-CNN-CRF (Ma and Hovy, 2016) | 76.4 ± 2.07
M-BERT-UNCASED (Devlin et al., 2019) | 81.5 ± 1.77
M-BERT-CASED (Devlin et al., 2019)| 82.1 ± 1.35
XLM-R (Conneau et al., 2020)| 84.8 ± 1.50
GREEK-BERT (ours) | **85.7 ± 1.00**
### Natural Language Inference with XNLI
| Model name | Accuracy |
| ------------------- | ------------------------------------ |
DAM (Parikh et al., 2016) | 68.5 ± 1.71
M-BERT-UNCASED (Devlin et al., 2019) | 73.9 ± 0.64
M-BERT-CASED (Devlin et al., 2019) | 73.5 ± 0.49
XLM-R (Conneau et al., 2020) | 77.3 ± 0.41
GREEK-BERT (ours) | **78.6 ± 0.62**
## Author
The model has been officially released with the article "GREEK-BERT: The Greeks visiting Sesame Street. John Koutsikakis, Ilias Chalkidis, Prodromos Malakasiotis and Ion Androutsopoulos. In the Proceedings of the 11th Hellenic Conference on Artificial Intelligence (SETN 2020). Held Online. 2020" (https://arxiv.org/abs/2008.12014).
If you use the model, please cite the following:
```
@inproceedings{greek-bert,
author = {Koutsikakis, John and Chalkidis, Ilias and Malakasiotis, Prodromos and Androutsopoulos, Ion},
title = {GREEK-BERT: The Greeks Visiting Sesame Street},
year = {2020},
isbn = {9781450388788},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3411408.3411440},
booktitle = {11th Hellenic Conference on Artificial Intelligence},
pages = {110–117},
numpages = {8},
location = {Athens, Greece},
series = {SETN 2020}
}
```
## About Us
[AUEB's Natural Language Processing Group](http://nlp.cs.aueb.gr) develops algorithms, models, and systems that allow computers to process and generate natural language texts.
The group's current research interests include:
* question answering systems for databases, ontologies, document collections, and the Web, especially biomedical question answering,
* natural language generation from databases and ontologies, especially Semantic Web ontologies,
text classification, including filtering spam and abusive content,
* information extraction and opinion mining, including legal text analytics and sentiment analysis,
* natural language processing tools for Greek, for example parsers and named-entity recognizers,
machine learning in natural language processing, especially deep learning.
The group is part of the Information Processing Laboratory of the Department of Informatics of the Athens University of Economics and Business.
[Ilias Chalkidis](https://iliaschalkidis.github.io) on behalf of [AUEB's Natural Language Processing Group](http://nlp.cs.aueb.gr)
| Github: [@ilias.chalkidis](https://github.com/iliaschalkidis) | Twitter: [@KiddoThe2B](https://twitter.com/KiddoThe2B) | | 8,614 | [
[
-0.034515380859375,
-0.049407958984375,
0.0301971435546875,
0.006740570068359375,
-0.04071044921875,
-0.0037078857421875,
-0.03680419921875,
-0.030242919921875,
0.039154052734375,
0.006427764892578125,
-0.04217529296875,
-0.0360107421875,
-0.064697265625,
0.027740478515625,
-0.022979736328125,
0.07513427734375,
-0.005096435546875,
0.02569580078125,
0.017578125,
-0.005817413330078125,
0.00653076171875,
-0.049407958984375,
-0.0670166015625,
-0.0246124267578125,
0.04241943359375,
0.0157928466796875,
0.04986572265625,
0.0279541015625,
0.025177001953125,
0.032867431640625,
-0.0114593505859375,
0.004302978515625,
-0.023956298828125,
0.0033550262451171875,
-0.00836944580078125,
-0.0188751220703125,
-0.0269317626953125,
0.0021076202392578125,
0.043853759765625,
0.037109375,
0.0143890380859375,
0.01520538330078125,
0.00171661376953125,
0.0240478515625,
-0.016265869140625,
0.00965118408203125,
-0.0399169921875,
-0.00463104248046875,
-0.0156402587890625,
-0.001220703125,
-0.02117919921875,
-0.034698486328125,
0.0154876708984375,
-0.060272216796875,
0.01418304443359375,
-0.005023956298828125,
0.0888671875,
0.00394439697265625,
-0.0172882080078125,
-0.039794921875,
-0.038238525390625,
0.072265625,
-0.0634765625,
0.031768798828125,
0.034698486328125,
0.0025653839111328125,
-0.02410888671875,
-0.0777587890625,
-0.0557861328125,
-0.006221771240234375,
0.0059967041015625,
0.0110931396484375,
-0.0168914794921875,
-0.0138702392578125,
-0.0013837814331054688,
0.03497314453125,
-0.02874755859375,
-0.001129150390625,
-0.048370361328125,
-0.0187530517578125,
0.042572021484375,
-0.01629638671875,
0.03509521484375,
-0.01418304443359375,
-0.016357421875,
-0.03436279296875,
-0.0293121337890625,
0.0049591064453125,
0.0291748046875,
0.038543701171875,
-0.0222625732421875,
0.051483154296875,
0.0030689239501953125,
0.0198974609375,
0.02386474609375,
-0.00630950927734375,
0.042724609375,
-0.016265869140625,
-0.016204833984375,
0.01026153564453125,
0.07122802734375,
0.01445770263671875,
0.030029296875,
-0.0187835693359375,
-0.00010925531387329102,
0.01038360595703125,
0.011932373046875,
-0.054931640625,
-0.020477294921875,
0.0173797607421875,
-0.0304107666015625,
-0.0201568603515625,
0.0004134178161621094,
-0.044647216796875,
-0.0031719207763671875,
-0.00736236572265625,
0.041229248046875,
-0.06573486328125,
-0.018096923828125,
0.0030002593994140625,
-0.0135650634765625,
0.004238128662109375,
0.01323699951171875,
-0.08172607421875,
0.018402099609375,
0.046356201171875,
0.044525146484375,
0.017303466796875,
-0.01934814453125,
-0.0085601806640625,
-0.00687408447265625,
-0.0182952880859375,
0.03363037109375,
-0.0222625732421875,
-0.015777587890625,
-0.003265380859375,
0.0090179443359375,
-0.009857177734375,
-0.0142669677734375,
0.06390380859375,
-0.033935546875,
0.0465087890625,
-0.022979736328125,
-0.056121826171875,
-0.0260009765625,
0.01050567626953125,
-0.042724609375,
0.0972900390625,
0.01324462890625,
-0.055023193359375,
0.0236358642578125,
-0.036102294921875,
-0.028472900390625,
0.010223388671875,
-0.007415771484375,
-0.037139892578125,
0.016326904296875,
0.0255584716796875,
0.03277587890625,
0.002811431884765625,
0.0074310302734375,
-0.0247802734375,
-0.0084991455078125,
0.0208587646484375,
-0.004657745361328125,
0.08026123046875,
0.022979736328125,
-0.0439453125,
-0.004009246826171875,
-0.0443115234375,
0.004863739013671875,
0.012115478515625,
-0.039459228515625,
-0.01544189453125,
-0.001983642578125,
0.031585693359375,
0.01678466796875,
0.0220794677734375,
-0.061065673828125,
0.0198974609375,
-0.035430908203125,
0.0343017578125,
0.0655517578125,
-0.0194549560546875,
0.033935546875,
-0.020721435546875,
0.0296173095703125,
-0.005123138427734375,
-0.0142669677734375,
0.00878143310546875,
-0.06158447265625,
-0.064208984375,
-0.04449462890625,
0.035430908203125,
0.033721923828125,
-0.060272216796875,
0.047332763671875,
-0.024200439453125,
-0.04620361328125,
-0.0465087890625,
0.0009522438049316406,
0.0230865478515625,
0.033966064453125,
0.0274658203125,
-0.032501220703125,
-0.06353759765625,
-0.048980712890625,
-0.0169525146484375,
-0.033416748046875,
-0.00104522705078125,
0.0169219970703125,
0.044525146484375,
-0.0194549560546875,
0.0531005859375,
-0.005420684814453125,
-0.021331787109375,
-0.032135009765625,
0.01300811767578125,
0.041412353515625,
0.0537109375,
0.04296875,
-0.039459228515625,
-0.036529541015625,
-0.00909423828125,
-0.04754638671875,
0.0179901123046875,
-0.005035400390625,
-0.01309967041015625,
0.020355224609375,
0.0210113525390625,
-0.057281494140625,
0.02880859375,
0.025726318359375,
-0.0537109375,
0.034637451171875,
-0.0240325927734375,
0.0021820068359375,
-0.098876953125,
0.006481170654296875,
-0.00501251220703125,
-0.0155487060546875,
-0.05206298828125,
0.0037670135498046875,
0.0019989013671875,
0.0068511962890625,
-0.033721923828125,
0.062744140625,
-0.0258026123046875,
0.0062103271484375,
0.00409698486328125,
-0.00012922286987304688,
-0.014892578125,
0.039581298828125,
0.01418304443359375,
0.047607421875,
0.04736328125,
-0.057281494140625,
0.040771484375,
0.041778564453125,
-0.04278564453125,
0.01702880859375,
-0.03924560546875,
0.00106048583984375,
0.0002627372741699219,
0.01380157470703125,
-0.06951904296875,
-0.01476287841796875,
0.03668212890625,
-0.06298828125,
0.02301025390625,
-0.0010223388671875,
-0.061859130859375,
-0.0308837890625,
-0.013671875,
0.0380859375,
0.04132080078125,
-0.0426025390625,
0.037750244140625,
0.0121002197265625,
-0.0194244384765625,
-0.04180908203125,
-0.05450439453125,
0.00963592529296875,
-0.0240936279296875,
-0.04510498046875,
0.033935546875,
-0.01024627685546875,
0.0019464492797851562,
0.0022907257080078125,
0.005161285400390625,
-0.006175994873046875,
0.00984954833984375,
0.00220489501953125,
0.02252197265625,
-0.0225372314453125,
0.01055145263671875,
-0.01214599609375,
-0.007335662841796875,
0.0006680488586425781,
-0.0292816162109375,
0.054107666015625,
-0.0145111083984375,
-0.0013885498046875,
-0.0546875,
0.01534271240234375,
0.04449462890625,
-0.00409698486328125,
0.0518798828125,
0.0648193359375,
-0.042205810546875,
0.0081634521484375,
-0.03619384765625,
-0.026458740234375,
-0.03692626953125,
0.041107177734375,
-0.03125,
-0.041412353515625,
0.046661376953125,
0.015869140625,
0.0150604248046875,
0.044708251953125,
0.06085205078125,
-0.01617431640625,
0.059356689453125,
0.0404052734375,
-0.012908935546875,
0.03790283203125,
-0.0258331298828125,
0.03143310546875,
-0.053741455078125,
-0.01763916015625,
-0.03533935546875,
-0.02874755859375,
-0.051788330078125,
-0.0164947509765625,
0.017913818359375,
0.004123687744140625,
-0.01751708984375,
0.04754638671875,
-0.03094482421875,
0.0098114013671875,
0.06243896484375,
0.00789642333984375,
-0.0064239501953125,
0.027313232421875,
-0.038238525390625,
-0.0079498291015625,
-0.045867919921875,
-0.05291748046875,
0.1009521484375,
0.03411865234375,
0.03076171875,
0.0194091796875,
0.06561279296875,
0.005741119384765625,
0.0158233642578125,
-0.050811767578125,
0.039459228515625,
-0.011077880859375,
-0.07879638671875,
-0.033050537109375,
-0.0181884765625,
-0.0850830078125,
0.01287078857421875,
-0.018890380859375,
-0.07666015625,
0.023773193359375,
-0.006103515625,
-0.03350830078125,
0.03564453125,
-0.0309906005859375,
0.0654296875,
-0.0208282470703125,
-0.018524169921875,
-0.00777435302734375,
-0.036834716796875,
0.0038166046142578125,
0.007205963134765625,
0.01959228515625,
-0.00315093994140625,
0.007343292236328125,
0.07305908203125,
-0.034637451171875,
0.053863525390625,
0.00713348388671875,
-0.0031948089599609375,
0.00760650634765625,
-0.00579071044921875,
0.0275115966796875,
0.0113067626953125,
-0.00860595703125,
0.010894775390625,
0.0006799697875976562,
-0.03948974609375,
-0.01284027099609375,
0.051788330078125,
-0.0792236328125,
-0.0100555419921875,
-0.0517578125,
-0.0309906005859375,
-0.0005698204040527344,
0.0252227783203125,
0.0582275390625,
0.0352783203125,
-0.0124969482421875,
0.001190185546875,
0.043243408203125,
-0.024139404296875,
0.053070068359375,
0.01480865478515625,
-0.0121917724609375,
-0.055908203125,
0.051666259765625,
0.0022716522216796875,
-0.00887298583984375,
0.00881195068359375,
-0.0005655288696289062,
-0.033355712890625,
-0.0216064453125,
-0.038848876953125,
0.038909912109375,
-0.04669189453125,
0.001461029052734375,
-0.038604736328125,
-0.007053375244140625,
-0.057403564453125,
-0.00675201416015625,
-0.03570556640625,
-0.053497314453125,
-0.04779052734375,
-0.0186309814453125,
0.02398681640625,
0.044281005859375,
-0.020263671875,
0.0299530029296875,
-0.04241943359375,
0.02471923828125,
0.0134735107421875,
0.0235595703125,
-0.01143646240234375,
-0.043212890625,
-0.023529052734375,
-0.006866455078125,
-0.028533935546875,
-0.07305908203125,
0.04827880859375,
0.017425537109375,
0.043853759765625,
0.050750732421875,
0.007389068603515625,
0.04351806640625,
-0.042388916015625,
0.05889892578125,
0.0263671875,
-0.06683349609375,
0.017059326171875,
-0.010986328125,
0.01267242431640625,
0.0450439453125,
0.030303955078125,
-0.049560546875,
-0.0426025390625,
-0.0650634765625,
-0.08099365234375,
0.06610107421875,
0.046234130859375,
0.027252197265625,
-0.01885986328125,
0.01171875,
0.01059722900390625,
0.0085296630859375,
-0.07421875,
-0.04876708984375,
-0.017059326171875,
-0.039031982421875,
-0.028472900390625,
-0.0244598388671875,
0.00037479400634765625,
-0.042266845703125,
0.0926513671875,
0.0218048095703125,
0.047149658203125,
0.029266357421875,
-0.0169677734375,
-0.0087890625,
0.01432037353515625,
0.0526123046875,
0.0404052734375,
-0.0271759033203125,
0.0074310302734375,
0.009033203125,
-0.05194091796875,
-0.0084075927734375,
0.018035888671875,
-0.01093292236328125,
0.039337158203125,
0.028533935546875,
0.0596923828125,
0.0121612548828125,
-0.031585693359375,
0.0178070068359375,
-0.0146331787109375,
-0.01983642578125,
-0.058197021484375,
-0.023895263671875,
0.00931549072265625,
0.019500732421875,
0.046142578125,
-0.005588531494140625,
-0.004337310791015625,
-0.048583984375,
0.0103302001953125,
0.0270538330078125,
-0.02398681640625,
-0.036529541015625,
0.055694580078125,
0.007015228271484375,
-0.0246734619140625,
0.05853271484375,
-0.01274871826171875,
-0.0601806640625,
0.046295166015625,
0.0297393798828125,
0.0712890625,
-0.0206146240234375,
0.009521484375,
0.0548095703125,
0.032928466796875,
0.0083160400390625,
0.041168212890625,
-0.00878143310546875,
-0.06500244140625,
-0.011962890625,
-0.054473876953125,
-0.004207611083984375,
0.0169219970703125,
-0.045989990234375,
0.020050048828125,
-0.035552978515625,
-0.04010009765625,
0.010894775390625,
0.01032257080078125,
-0.0509033203125,
0.0275726318359375,
0.04229736328125,
0.0611572265625,
-0.056854248046875,
0.07427978515625,
0.0770263671875,
-0.026763916015625,
-0.0526123046875,
-0.029083251953125,
-0.01546478271484375,
-0.052398681640625,
0.058868408203125,
0.01776123046875,
0.01554107666015625,
0.0005884170532226562,
-0.031890869140625,
-0.08599853515625,
0.06298828125,
0.01739501953125,
-0.0243988037109375,
-0.01050567626953125,
0.018524169921875,
0.040496826171875,
-0.022125244140625,
0.033721923828125,
0.02630615234375,
0.03948974609375,
-0.00266265869140625,
-0.07183837890625,
0.0013704299926757812,
-0.04150390625,
-0.005321502685546875,
0.0005545616149902344,
-0.0228118896484375,
0.07879638671875,
-0.01215362548828125,
-0.02484130859375,
0.032318115234375,
0.06585693359375,
0.0193634033203125,
0.019805908203125,
0.025054931640625,
0.07110595703125,
0.058563232421875,
-0.01549530029296875,
0.06939697265625,
-0.0384521484375,
0.045623779296875,
0.060943603515625,
0.0204315185546875,
0.058349609375,
0.05633544921875,
-0.01568603515625,
0.055816650390625,
0.07086181640625,
-0.0015325546264648438,
0.0526123046875,
0.0037288665771484375,
-0.022552490234375,
-0.02099609375,
0.023529052734375,
-0.03790283203125,
0.021270751953125,
0.02587890625,
-0.052490234375,
-0.0103607177734375,
-0.0013360977172851562,
0.0308380126953125,
-0.0193634033203125,
-0.0169219970703125,
0.046875,
0.0101318359375,
-0.04620361328125,
0.05523681640625,
0.0231170654296875,
0.046478271484375,
-0.0360107421875,
0.0201873779296875,
-0.009307861328125,
0.03289794921875,
-0.0028533935546875,
-0.03399658203125,
-0.0001252889633178711,
0.00197601318359375,
-0.0035533905029296875,
-0.017913818359375,
0.0209197998046875,
-0.026031494140625,
-0.06597900390625,
0.0201568603515625,
0.01385498046875,
0.033721923828125,
0.01395416259765625,
-0.05499267578125,
-0.0164947509765625,
0.0016164779663085938,
-0.0304718017578125,
0.00791168212890625,
0.0309906005859375,
0.00928497314453125,
0.041473388671875,
0.051788330078125,
0.01235198974609375,
0.0285186767578125,
-0.0191650390625,
0.0626220703125,
-0.046234130859375,
-0.0491943359375,
-0.062286376953125,
0.04248046875,
-0.000014543533325195312,
-0.04693603515625,
0.0537109375,
0.04949951171875,
0.072265625,
-0.017059326171875,
0.05035400390625,
-0.018524169921875,
0.038482666015625,
-0.0237274169921875,
0.061737060546875,
-0.04095458984375,
0.0004241466522216797,
-0.02886962890625,
-0.06781005859375,
-0.02044677734375,
0.06610107421875,
-0.02276611328125,
0.0018405914306640625,
0.052154541015625,
0.035400390625,
-0.0121612548828125,
-0.016998291015625,
0.0092315673828125,
0.0229644775390625,
0.004734039306640625,
0.039947509765625,
0.038818359375,
-0.05450439453125,
0.0391845703125,
-0.039031982421875,
-0.004711151123046875,
-0.0184173583984375,
-0.04620361328125,
-0.06549072265625,
-0.04974365234375,
-0.0305938720703125,
-0.046844482421875,
0.0016469955444335938,
0.07916259765625,
0.04632568359375,
-0.0875244140625,
-0.0267181396484375,
-0.00521087646484375,
0.000995635986328125,
-0.0191650390625,
-0.0132598876953125,
0.0465087890625,
-0.03448486328125,
-0.07879638671875,
0.018890380859375,
-0.01549530029296875,
0.00885772705078125,
0.007114410400390625,
-0.0230255126953125,
-0.0257110595703125,
0.01412200927734375,
0.02374267578125,
0.03125,
-0.06781005859375,
-0.01322174072265625,
-0.00704193115234375,
-0.020660400390625,
0.01461029052734375,
0.0205535888671875,
-0.0531005859375,
0.031158447265625,
0.0309906005859375,
0.03076171875,
0.049163818359375,
-0.00884246826171875,
0.038299560546875,
-0.04217529296875,
0.01165008544921875,
0.0166473388671875,
0.050750732421875,
0.0254669189453125,
-0.021209716796875,
0.0252685546875,
0.0188751220703125,
-0.02862548828125,
-0.059539794921875,
0.00588226318359375,
-0.07696533203125,
-0.0178070068359375,
0.08111572265625,
-0.0249481201171875,
-0.01800537109375,
-0.0028667449951171875,
-0.007720947265625,
0.044464111328125,
-0.0360107421875,
0.07080078125,
0.06268310546875,
0.0106048583984375,
-0.007556915283203125,
-0.0258941650390625,
0.03826904296875,
0.05218505859375,
-0.027679443359375,
-0.032073974609375,
0.01274871826171875,
0.034271240234375,
0.0283660888671875,
0.030975341796875,
-0.00861358642578125,
0.0220184326171875,
-0.01654052734375,
0.033294677734375,
0.0081024169921875,
0.0006451606750488281,
-0.022186279296875,
-0.00042724609375,
-0.00644683837890625,
-0.0306243896484375
]
] |
bigscience/bloom-3b | 2023-04-14T08:43:29.000Z | [
"transformers",
"pytorch",
"safetensors",
"bloom",
"text-generation",
"ak",
"ar",
"as",
"bm",
"bn",
"ca",
"code",
"en",
"es",
"eu",
"fon",
"fr",
"gu",
"hi",
"id",
"ig",
"ki",
"kn",
"lg",
"ln",
"ml",
"mr",
"ne",
"nso",
"ny",
"or",
"pa",
"pt",
"rn",
"rw",
"sn",
"st",
"sw",
"ta",
"te",
"tn",
"ts",
"tum",
"tw",
"ur",
"vi",
"wo",
"xh",
"yo",
"zh",
"zhs",
"zht",
"zu",
"arxiv:1909.08053",
"arxiv:2110.02861",
"arxiv:2108.12409",
"license:bigscience-bloom-rail-1.0",
"model-index",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | bigscience | null | null | bigscience/bloom-3b | 70 | 17,016 | transformers | 2022-05-19T11:52:27 | ---
license: bigscience-bloom-rail-1.0
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zhs
- zht
- zu
pipeline_tag: text-generation
model-index:
- name: bloom
results:
- task:
type: text-generation
name: text generation
dataset:
name: arc_challenge
type: arc_challenge
metrics:
- name: acc
type: acc
value: 0.27986348122866894
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: arc_easy
type: arc_easy
metrics:
- name: acc
type: acc
value: 0.5946969696969697
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: axb
type: axb
metrics:
- name: acc
type: acc
value: 0.4433876811594203
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: axg
type: axg
metrics:
- name: acc
type: acc
value: 0.5
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: boolq
type: boolq
metrics:
- name: acc
type: acc
value: 0.6165137614678899
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: cb
type: cb
metrics:
- name: acc
type: acc
value: 0.30357142857142855
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: cola
type: cola
metrics:
- name: acc
type: acc
value: 0.610738255033557
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: copa
type: copa
metrics:
- name: acc
type: acc
value: 0.63
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: crows_pairs_english
type: crows_pairs_english
metrics:
- name: acc
type: acc
value: 0.4973166368515206
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: crows_pairs_french
type: crows_pairs_french
metrics:
- name: acc
type: acc
value: 0.5032796660703638
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: diabla
type: diabla
metrics:
- name: acc
type: acc
value: 0.28888308977035493
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_afr
type: gsarti/flores_101_afr
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 6.500798737976343
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_amh
type: gsarti/flores_101_amh
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.9726863338897145
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ara
type: gsarti/flores_101_ara
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 1.8083841089875814
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_asm
type: gsarti/flores_101_asm
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.699102962086425
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ast
type: gsarti/flores_101_ast
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.9252047073429384
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_azj
type: gsarti/flores_101_azj
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 6.942805054270002
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_bel
type: gsarti/flores_101_bel
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.614136245847082
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ben
type: gsarti/flores_101_ben
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.121491534300969
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_bos
type: gsarti/flores_101_bos
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.653353469118798
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_bul
type: gsarti/flores_101_bul
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.7014693938055068
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_cat
type: gsarti/flores_101_cat
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.305190041967345
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ceb
type: gsarti/flores_101_ceb
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 6.291000321323428
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ces
type: gsarti/flores_101_ces
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.447322753586386
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ckb
type: gsarti/flores_101_ckb
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.7255124939234765
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_cym
type: gsarti/flores_101_cym
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 12.539424151448149
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_dan
type: gsarti/flores_101_dan
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.183309001005672
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_deu
type: gsarti/flores_101_deu
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.1180422286591347
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ell
type: gsarti/flores_101_ell
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.467943456164706
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_eng
type: gsarti/flores_101_eng
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.018740628193298
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_est
type: gsarti/flores_101_est
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 9.11654425176368
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_fas
type: gsarti/flores_101_fas
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.058009097116482
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_fin
type: gsarti/flores_101_fin
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 6.847047959628553
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_fra
type: gsarti/flores_101_fra
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 1.9975177011840075
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ful
type: gsarti/flores_101_ful
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 11.465912731488828
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_gle
type: gsarti/flores_101_gle
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.681491663539422
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_glg
type: gsarti/flores_101_glg
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.029991089015508
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_guj
type: gsarti/flores_101_guj
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.955224230286231
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hau
type: gsarti/flores_101_hau
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 10.758347356372159
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_heb
type: gsarti/flores_101_heb
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.6004478129801667
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hin
type: gsarti/flores_101_hin
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.712530650588064
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hrv
type: gsarti/flores_101_hrv
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.822418943372185
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hun
type: gsarti/flores_101_hun
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 6.440482646965992
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_hye
type: gsarti/flores_101_hye
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.657718918347166
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ibo
type: gsarti/flores_101_ibo
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.564814003872672
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ind
type: gsarti/flores_101_ind
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.1597101468869373
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_isl
type: gsarti/flores_101_isl
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.082349269518136
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ita
type: gsarti/flores_101_ita
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.9687591414176207
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_jav
type: gsarti/flores_101_jav
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 7.0573805415708994
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_jpn
type: gsarti/flores_101_jpn
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.7758864197116933
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kam
type: gsarti/flores_101_kam
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 11.072949642861332
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kan
type: gsarti/flores_101_kan
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.551730651007082
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kat
type: gsarti/flores_101_kat
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.522630524283745
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kaz
type: gsarti/flores_101_kaz
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.3901748516975574
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kea
type: gsarti/flores_101_kea
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.918534182590863
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kir
type: gsarti/flores_101_kir
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.729278369847201
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_kor
type: gsarti/flores_101_kor
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.932884847226212
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lao
type: gsarti/flores_101_lao
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.9077314760849924
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lav
type: gsarti/flores_101_lav
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 7.777221919194806
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lin
type: gsarti/flores_101_lin
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 7.524842908050988
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lit
type: gsarti/flores_101_lit
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 7.369179434621725
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ltz
type: gsarti/flores_101_ltz
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.801059747949214
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_lug
type: gsarti/flores_101_lug
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.483203026364786
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_luo
type: gsarti/flores_101_luo
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 11.975963093623681
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mal
type: gsarti/flores_101_mal
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.615948455160037
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mar
type: gsarti/flores_101_mar
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.483253482821379
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mkd
type: gsarti/flores_101_mkd
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.9656732291754087
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mlt
type: gsarti/flores_101_mlt
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 15.004773437665275
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mon
type: gsarti/flores_101_mon
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.410598542315402
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mri
type: gsarti/flores_101_mri
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 7.474035895661322
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_msa
type: gsarti/flores_101_msa
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.5710001772665634
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_mya
type: gsarti/flores_101_mya
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.413577969878331
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_nld
type: gsarti/flores_101_nld
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.127831721885065
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_nob
type: gsarti/flores_101_nob
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.402763169129877
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_npi
type: gsarti/flores_101_npi
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.199342701937889
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_nso
type: gsarti/flores_101_nso
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.154626800955667
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_nya
type: gsarti/flores_101_nya
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.179860208369393
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_oci
type: gsarti/flores_101_oci
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.8617357393685845
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_orm
type: gsarti/flores_101_orm
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 12.911595421079408
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ory
type: gsarti/flores_101_ory
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.189421861225964
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_pan
type: gsarti/flores_101_pan
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.698477289331806
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_pol
type: gsarti/flores_101_pol
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.625550458479643
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_por
type: gsarti/flores_101_por
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 1.9754515986213523
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_pus
type: gsarti/flores_101_pus
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.4963371422771585
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ron
type: gsarti/flores_101_ron
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.965456830031304
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_rus
type: gsarti/flores_101_rus
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.0498020542445303
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_slk
type: gsarti/flores_101_slk
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 6.450822127057479
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_slv
type: gsarti/flores_101_slv
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 6.620252120186232
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_sna
type: gsarti/flores_101_sna
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.462166771382726
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_snd
type: gsarti/flores_101_snd
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.466066951221973
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_som
type: gsarti/flores_101_som
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 11.95918054093392
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_spa
type: gsarti/flores_101_spa
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 1.8965140104323535
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_srp
type: gsarti/flores_101_srp
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.871214785885079
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_swe
type: gsarti/flores_101_swe
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.054972008155866
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_swh
type: gsarti/flores_101_swh
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.6973091886730676
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tam
type: gsarti/flores_101_tam
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.539493400469833
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tel
type: gsarti/flores_101_tel
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.807499987508966
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tgk
type: gsarti/flores_101_tgk
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 3.5994818827380426
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tgl
type: gsarti/flores_101_tgl
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.667053833119858
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tha
type: gsarti/flores_101_tha
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.365940201944242
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_tur
type: gsarti/flores_101_tur
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 4.885014749844601
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_ukr
type: gsarti/flores_101_ukr
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.7240934990288483
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_umb
type: gsarti/flores_101_umb
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 12.766915508610673
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_urd
type: gsarti/flores_101_urd
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 1.9797467071381232
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_uzb
type: gsarti/flores_101_uzb
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 12.002337637722146
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_vie
type: gsarti/flores_101_vie
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 1.76578415476397
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_wol
type: gsarti/flores_101_wol
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 9.144285650306488
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_xho
type: gsarti/flores_101_xho
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 7.403240538286952
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_yor
type: gsarti/flores_101_yor
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 5.91272037551173
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_zho_simpl
type: gsarti/flores_101_zho_simpl
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.2769070822768533
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_zho_trad
type: gsarti/flores_101_zho_trad
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 2.5180582198242383
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: gsarti/flores_101_zul
type: gsarti/flores_101_zul
metrics:
- name: byte_perplexity
type: byte_perplexity
value: 8.53353320693145
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: headqa
type: headqa
metrics:
- name: acc
type: acc
value: 0.26440554339897887
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: hellaswag
type: hellaswag
metrics:
- name: acc
type: acc
value: 0.41236805417247563
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: logiqa
type: logiqa
metrics:
- name: acc
type: acc
value: 0.2073732718894009
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mathqa
type: mathqa
metrics:
- name: acc
type: acc
value: 0.24958123953098826
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mc_taco
type: mc_taco
metrics:
- name: em
type: em
value: 0.11936936936936937
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mnli
type: mnli
metrics:
- name: acc
type: acc
value: 0.35496688741721855
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mnli_mismatched
type: mnli_mismatched
metrics:
- name: acc
type: acc
value: 0.35211554109031734
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: mrpc
type: mrpc
metrics:
- name: acc
type: acc
value: 0.5857843137254902
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: multirc
type: multirc
metrics:
- name: acc
type: acc
value: 0.5375412541254125
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: openbookqa
type: openbookqa
metrics:
- name: acc
type: acc
value: 0.216
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: piqa
type: piqa
metrics:
- name: acc
type: acc
value: 0.7078346028291621
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: prost
type: prost
metrics:
- name: acc
type: acc
value: 0.22683603757472245
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: pubmedqa
type: pubmedqa
metrics:
- name: acc
type: acc
value: 0.616
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: qnli
type: qnli
metrics:
- name: acc
type: acc
value: 0.5072304594545122
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: qqp
type: qqp
metrics:
- name: acc
type: acc
value: 0.3842443729903537
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: race
type: race
metrics:
- name: acc
type: acc
value: 0.3521531100478469
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: rte
type: rte
metrics:
- name: acc
type: acc
value: 0.47653429602888087
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: sciq
type: sciq
metrics:
- name: acc
type: acc
value: 0.892
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: sst
type: sst
metrics:
- name: acc
type: acc
value: 0.5177752293577982
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: triviaqa
type: triviaqa
metrics:
- name: acc
type: acc
value: 0.041633518960487934
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: tydiqa_primary
type: tydiqa_primary
metrics:
- name: acc
type: acc
value: 0.3011337608795236
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: webqs
type: webqs
metrics:
- name: acc
type: acc
value: 0.01673228346456693
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: wic
type: wic
metrics:
- name: acc
type: acc
value: 0.5015673981191222
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: winogrande
type: winogrande
metrics:
- name: acc
type: acc
value: 0.5864246250986582
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: wnli
type: wnli
metrics:
- name: acc
type: acc
value: 0.471830985915493
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: wsc
type: wsc
metrics:
- name: acc
type: acc
value: 0.4423076923076923
verified: false
- task:
type: text-generation
name: text generation
dataset:
name: humaneval
type: humaneval
metrics:
- name: pass@1
type: pass@1
value: 0.15524390243902436
verified: false
- name: pass@10
type: pass@10
value: 0.3220367632383857
verified: false
- name: pass@100
type: pass@100
value: 0.5545431515723145
verified: false
---
<h1 style='text-align: center '>BLOOM LM</h1>
<h2 style='text-align: center '><em>BigScience Large Open-science Open-access Multilingual Language Model</em> </h2>
<h3 style='text-align: center '>Model Card</h3>
<img src="https://s3.amazonaws.com/moonup/production/uploads/1657124309515-5f17f0a0925b9863e28ad517.png" alt="BigScience Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
Version 1.0 / 26.May.2022
## Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Data](#training-data)
4. [Risks and Limitations](#risks-and-limitations)
5. [Evaluation](#evaluation)
6. [Recommendations](#recommendations)
7. [Glossary and Calculations](#glossary-and-calculations)
8. [More Information](#more-information)
9. [Model Card Authors](#model-card-authors)
## Model Details
### Basics
*This section provides information for anyone who wants to know about the model.*
<details>
<summary>Click to expand</summary> <br/>
**Developed by:** BigScience ([website](https://bigscience.huggingface.co))
* All collaborators are either volunteers or have an agreement with their employer. *(Further breakdown of participants forthcoming.)*
**Model Type:** Transformer-based Language Model
**Version:** 1.0.0
**Languages:** Multiple; see [training data](#training-data)
**License:** RAIL License v1.0 ([link](https://huggingface.co/spaces/bigscience/license))
**Release Date Estimate:** Monday, 11.July.2022
**Send Questions to:** bigscience-contact@googlegroups.com
**Cite as:** BigScience, _BigScience Language Open-science Open-access Multilingual (BLOOM) Language Model_. International, May 2021-May 2022
**Funded by:**
* The French government.
* Hugging Face ([website](https://huggingface.co)).
* Organizations of contributors. *(Further breakdown of organizations forthcoming.)*
</details>
### Technical Specifications
*This section provides information for people who work on model development.*
<details>
<summary>Click to expand</summary><br/>
Please see [the BLOOM training README](https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml#readme) for full details on replicating training.
**Model Architecture:** Modified from Megatron-LM GPT2 (see [paper](https://arxiv.org/abs/1909.08053), [BLOOM Megatron code](https://github.com/bigscience-workshop/Megatron-DeepSpeed)):
* Decoder-only architecture
* Layer normalization applied to word embeddings layer (`StableEmbedding`; see [code](https://github.com/facebookresearch/bitsandbytes), [paper](https://arxiv.org/pdf/2110.02861.pdf))
* ALiBI positional encodings (see [paper](https://arxiv.org/pdf/2108.12409.pdf)), with GeLU activation functions
* 3,002,557,440 parameters:
* 642,252,800 embedding parameters
* 30 layers, 32 attention heads
* Hidden layers are 2560-dimensional
* Sequence length of 2048 tokens used (see [BLOOM tokenizer](https://huggingface.co/bigscience/tokenizer), [tokenizer description](#tokenization))
**Objective Function:** Cross Entropy with mean reduction (see [API documentation](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss)).
**Compute infrastructure:** Jean Zay Public Supercomputer, provided by the French government (see [announcement](https://www.enseignementsup-recherche.gouv.fr/fr/signature-du-marche-d-acquisition-de-l-un-des-supercalculateurs-les-plus-puissants-d-europe-46733)).
* Hardware: 384 A100 80GB GPUs (48 nodes):
* Additional 32 A100 80GB GPUs (4 nodes) in reserve
* 8 GPUs per node Using NVLink 4 inter-gpu connects, 4 OmniPath links
* CPU: AMD
* CPU memory: 512GB per node
* GPU memory: 640GB per node
* Inter-node connect: Omni-Path Architecture (OPA)
* NCCL-communications network: a fully dedicated subnet
* Disc IO network: shared network with other types of nodes
* Software:
* Megatron-DeepSpeed ([Github link](https://github.com/bigscience-workshop/Megatron-DeepSpeed))
* DeepSpeed ([Github link](https://github.com/microsoft/DeepSpeed))
* PyTorch (pytorch-1.11 w/ CUDA-11.5; see [Github link](https://github.com/pytorch/pytorch))
* apex ([Github link](https://github.com/NVIDIA/apex))
#### **Training**
Training logs: [Tensorboard link](https://huggingface.co/tensorboard/bigscience/tr11c-2B5-logs)
- Number of epochs: 1 (*current target*)
- Dates:
- Started 11th March, 2022 11:42am PST
- Ended 5th July, 2022
- Estimated cost of training: Equivalent of $2-5M in cloud computing (including preliminary experiments)
- Server training location: Île-de-France, France
#### **Tokenization**
The BLOOM tokenizer ([link](https://huggingface.co/bigscience/tokenizer)) is a learned subword tokenizer trained using:
- A byte-level Byte Pair Encoding (BPE) algorithm
- A simple pre-tokenization rule, no normalization
- A vocabulary size of 250,680
It was trained on a subset of a preliminary version of the corpus using alpha-weighting per language.
</details>
### Environmental Impact
<details>
<summary>Click to expand</summary><br/>
The training supercomputer, Jean Zay ([website](http://www.idris.fr/eng/jean-zay/jean-zay-presentation-eng.html)), uses mostly nuclear energy. The heat generated by it is reused for heating campus housing.
**Estimated carbon emissions:** *(Forthcoming upon completion of training.)*
**Estimated electricity usage:** *(Forthcoming upon completion of training.)*
</details>
<p> </p>
## Uses
*This section addresses questions around how the model is intended to be used, discusses the foreseeable users of the model (including those affected by the model), and describes uses that are considered out of scope or misuse of the model.
It provides information for anyone considering using the model or who is affected by the model.*
<details>
<summary>Click to expand</summary><br/>
### Intended Use
This model is being created in order to enable public research on large language models (LLMs). LLMs are intended to be used for language generation or as a pretrained base model that can be further fine-tuned for specific tasks. Use cases below are not exhaustive.
#### **Direct Use**
- Text generation
- Exploring characteristics of language generated by a language model
- Examples: Cloze tests, counterfactuals, generations with reframings
#### **Downstream Use**
- Tasks that leverage language models include: Information Extraction, Question Answering, Summarization
### Misuse and Out-of-scope Use
*This section addresses what users ought not do with the model.*
See the [BLOOM License](https://huggingface.co/spaces/bigscience/license), Attachment A, for detailed usage restrictions. The below list is non-exhaustive, but lists some easily foreseeable problematic use cases.
#### **Out-of-scope Uses**
Using the model in [high-stakes](#high-stakes) settings is out of scope for this model. The model is not designed for [critical decisions](#critical-decisions) nor uses with any material consequences on an individual's livelihood or wellbeing. The model outputs content that appears factual but is not correct.
##### Out-of-scope Uses Include:
- Usage in biomedical domains, political and legal domains, or finance domains
- Usage for evaluating or scoring individuals, such as for employment, education, or credit
- Applying the model for critical automatic decisions, generating factual content, creating reliable summaries, or generating predictions that must be correct
#### **Misuse**
Intentionally using the model for harm, violating [human rights](#human-rights), or other kinds of malicious activities, is a misuse of this model. This includes:
- Spam generation
- Disinformation and influence operations
- Disparagement and defamation
- Harassment and abuse
- [Deception](#deception)
- Unconsented impersonation and imitation
- Unconsented surveillance
- Generating content without attribution to the model, as specified in the [RAIL License, Use Restrictions](https://huggingface.co/spaces/bigscience/license)
### Intended Users
#### **Direct Users**
- General Public
- Researchers
- Students
- Educators
- Engineers/developers
- Non-commercial entities
- Community advocates, including human and civil rights groups
#### Indirect Users
- Users of derivatives created by Direct Users, such as those using software with an [intended use](#intended-use)
- Users of [Derivatives of the Model, as described in the License](https://huggingface.co/spaces/bigscience/license)
#### Others Affected (Parties Prenantes)
- People and groups referred to by the LLM
- People and groups exposed to outputs of, or decisions based on, the LLM
- People and groups whose original work is included in the LLM
</details>
<p> </p>
## Training Data
*This section provides a high-level overview of the training data. It is relevant for anyone who wants to know the basics of what the model is learning.*
<details>
<summary>Click to expand</summary><br/>
Details for each dataset are provided in individual [Data Cards](https://huggingface.co/spaces/bigscience/BigScienceCorpus).
Training data includes:
- 45 natural languages
- 12 programming languages
- In 1.5TB of pre-processed text, converted into 350B unique tokens (see [the tokenizer section](#tokenization) for more.)
#### **Languages**
The pie chart shows the distribution of languages in training data.

The following table shows the further distribution of Niger-Congo and Indic languages in the training data.
<details>
<summary>Click to expand</summary><br/>
| Niger Congo | Percentage | | Indic | Percentage |
|----------------|------------ |------ |-----------|------------|
| Chi Tumbuka | 0.00002 | | Assamese | 0.01 |
| Kikuyu | 0.00004 | | Odia | 0.04 |
| Bambara | 0.00004 | | Gujarati | 0.04 |
| Akan | 0.00007 | | Marathi | 0.05 |
| Xitsonga | 0.00007 | | Punjabi | 0.05 |
| Sesotho | 0.00007 | | Kannada | 0.06 |
| Chi Chewa | 0.0001 | | Nepali | 0.07 |
| Setswana | 0.0002 | | Telugu | 0.09 |
| Northern Sotho | 0.0002 | | Malayalam | 0.10 |
| Fon | 0.0002 | | Urdu | 0.10 |
| Kirundi | 0.0003 | | Tamil | 0.20 |
| Wolof | 0.0004 | | Bengali | 0.50 |
| Kuganda | 0.0004 | | Hindi | 0.70 |
| Chi Shona | 0.001 |
| Isi Zulu | 0.001 |
| Igbo | 0.001 |
| Xhosa | 0.001 |
| Kinyarwanda | 0.003 |
| Yoruba | 0.006 |
| Swahili | 0.02 |
</details>
The following table shows the distribution of programming languages.
<details>
<summary>Click to expand</summary><br/>
| Extension | Language | Number of files |
|----------------|------------|-----------------|
| java | Java | 5,407,724 |
| php | PHP | 4,942,186 |
| cpp | C++ | 2,503,930 |
| py | Python | 2,435,072 |
| js | JavaScript | 1,905,518 |
| cs | C# | 1,577,347 |
| rb | Ruby | 6,78,413 |
| cc | C++ | 443,054 |
| hpp | C++ | 391,048 |
| lua | Lua | 352,317 |
| go | GO | 227,763 |
| ts | TypeScript | 195,254 |
| C | C | 134,537 |
| scala | Scala | 92,052 |
| hh | C++ | 67,161 |
| H | C++ | 55,899 |
| tsx | TypeScript | 33,107 |
| rs | Rust | 29,693 |
| phpt | PHP | 9,702 |
| c++ | C++ | 1,342 |
| h++ | C++ | 791 |
| php3 | PHP | 540 |
| phps | PHP | 270 |
| php5 | PHP | 166 |
| php4 | PHP | 29 |
</details>
</details>
<p> </p>
## Risks and Limitations
*This section identifies foreseeable harms and misunderstandings.*
<details>
<summary>Click to expand</summary><br/>
Model may:
- Overrepresent some viewpoints and underrepresent others
- Contain stereotypes
- Contain [personal information](#personal-data-and-information)
- Generate:
- Hateful, abusive, or violent language
- Discriminatory or prejudicial language
- Content that may not be appropriate for all settings, including sexual content
- Make errors, including producing incorrect information as if it were factual
- Generate irrelevant or repetitive outputs
</details>
<p> </p>
## Evaluation
*This section describes the evaluation protocols and provides the results.*
<details>
<summary>Click to expand</summary><br/>
### Metrics
*This section describes the different ways performance is calculated and why.*
Includes:
| Metric | Why chosen |
|--------------------|--------------------------------------------------------------------|
| [Perplexity](#perplexity) | Standard metric for quantifying model improvements during training |
| Cross Entropy [Loss](#loss) | Standard objective for language models. |
And multiple different metrics for specific tasks. _(More evaluation metrics forthcoming upon completion of evaluation protocol.)_
### Factors
*This section lists some different aspects of BLOOM models. Its focus is on aspects that are likely to give rise to high variance in model behavior.*
- Language, such as English or Yoruba
- Domain, such as newswire or stories
- Demographic characteristics, such as gender or nationality
### Results
*Results are based on the [Factors](#factors) and [Metrics](#metrics).*
**Zero-shot evaluations:**
See this repository for JSON files: https://github.com/bigscience-workshop/evaluation-results
| Task | Language | Metric | BLOOM-2B5 |
|:----|:----|:----|:----:|
| arc_challenge | eng | acc ↑ | 0.28 |
| arc_easy | eng | acc ↑ | 0.595 |
| axb (Median of 10 prompts) | eng | acc ↑ | 0.443 |
| axg (Median of 10 prompts) | eng | acc ↑ | 0.5 |
| boolq (Median of 11 prompts) | eng | acc ↑ | 0.617 |
| cb (Median of 15 prompts) | eng | acc ↑ | 0.304 |
| cola (Median of 5 prompts) | eng | acc ↑ | 0.611 |
| copa (Median of 9 prompts) | eng | acc ↑ | 0.63 |
| crows_pairs_english (Median of 6 prompts) | eng | acc ↑ | 0.497 |
| crows_pairs_french (Median of 7 prompts) | fra | acc ↑ | 0.503 |
| diabla (Median of 2 prompts) | eng | acc ↑ | 0.289 |
| gsarti/flores_101_afr | afr | byte_perplexity ↓ | 6.501 |
| gsarti/flores_101_amh | amh | byte_perplexity ↓ | 3.973 |
| gsarti/flores_101_ara | ara | byte_perplexity ↓ | 1.808 |
| gsarti/flores_101_asm | asm | byte_perplexity ↓ | 5.699 |
| gsarti/flores_101_ast | ast | byte_perplexity ↓ | 3.925 |
| gsarti/flores_101_azj | azj | byte_perplexity ↓ | 6.943 |
| gsarti/flores_101_bel | bel | byte_perplexity ↓ | 3.614 |
| gsarti/flores_101_ben | ben | byte_perplexity ↓ | 5.121 |
| gsarti/flores_101_bos | bos | byte_perplexity ↓ | 5.653 |
| gsarti/flores_101_bul | bul | byte_perplexity ↓ | 2.701 |
| gsarti/flores_101_cat | cat | byte_perplexity ↓ | 2.305 |
| gsarti/flores_101_ceb | ceb | byte_perplexity ↓ | 6.291 |
| gsarti/flores_101_ces | ces | byte_perplexity ↓ | 5.447 |
| gsarti/flores_101_ckb | ckb | byte_perplexity ↓ | 3.726 |
| gsarti/flores_101_cym | cym | byte_perplexity ↓ | 12.539 |
| gsarti/flores_101_dan | dan | byte_perplexity ↓ | 5.183 |
| gsarti/flores_101_deu | deu | byte_perplexity ↓ | 3.118 |
| gsarti/flores_101_ell | ell | byte_perplexity ↓ | 2.468 |
| gsarti/flores_101_eng | eng | byte_perplexity ↓ | 2.019 |
| gsarti/flores_101_est | est | byte_perplexity ↓ | 9.117 |
| gsarti/flores_101_fas | fas | byte_perplexity ↓ | 3.058 |
| gsarti/flores_101_fin | fin | byte_perplexity ↓ | 6.847 |
| gsarti/flores_101_fra | fra | byte_perplexity ↓ | 1.998 |
| gsarti/flores_101_ful | ful | byte_perplexity ↓ | 11.466 |
| gsarti/flores_101_gle | gle | byte_perplexity ↓ | 8.681 |
| gsarti/flores_101_glg | glg | byte_perplexity ↓ | 3.03 |
| gsarti/flores_101_guj | guj | byte_perplexity ↓ | 4.955 |
| gsarti/flores_101_hau | hau | byte_perplexity ↓ | 10.758 |
| gsarti/flores_101_heb | heb | byte_perplexity ↓ | 3.6 |
| gsarti/flores_101_hin | hin | byte_perplexity ↓ | 4.713 |
| gsarti/flores_101_hrv | hrv | byte_perplexity ↓ | 5.822 |
| gsarti/flores_101_hun | hun | byte_perplexity ↓ | 6.44 |
| gsarti/flores_101_hye | hye | byte_perplexity ↓ | 3.658 |
| gsarti/flores_101_ibo | ibo | byte_perplexity ↓ | 5.565 |
| gsarti/flores_101_ind | ind | byte_perplexity ↓ | 2.16 |
| gsarti/flores_101_isl | isl | byte_perplexity ↓ | 8.082 |
| gsarti/flores_101_ita | ita | byte_perplexity ↓ | 2.969 |
| gsarti/flores_101_jav | jav | byte_perplexity ↓ | 7.057 |
| gsarti/flores_101_jpn | jpn | byte_perplexity ↓ | 2.776 |
| gsarti/flores_101_kam | kam | byte_perplexity ↓ | 11.073 |
| gsarti/flores_101_kan | kan | byte_perplexity ↓ | 5.552 |
| gsarti/flores_101_kat | kat | byte_perplexity ↓ | 2.523 |
| gsarti/flores_101_kaz | kaz | byte_perplexity ↓ | 3.39 |
| gsarti/flores_101_kea | kea | byte_perplexity ↓ | 8.919 |
| gsarti/flores_101_kir | kir | byte_perplexity ↓ | 3.729 |
| gsarti/flores_101_kor | kor | byte_perplexity ↓ | 3.933 |
| gsarti/flores_101_lao | lao | byte_perplexity ↓ | 2.908 |
| gsarti/flores_101_lav | lav | byte_perplexity ↓ | 7.777 |
| gsarti/flores_101_lin | lin | byte_perplexity ↓ | 7.525 |
| gsarti/flores_101_lit | lit | byte_perplexity ↓ | 7.369 |
| gsarti/flores_101_ltz | ltz | byte_perplexity ↓ | 8.801 |
| gsarti/flores_101_lug | lug | byte_perplexity ↓ | 8.483 |
| gsarti/flores_101_luo | luo | byte_perplexity ↓ | 11.976 |
| gsarti/flores_101_mal | mal | byte_perplexity ↓ | 4.616 |
| gsarti/flores_101_mar | mar | byte_perplexity ↓ | 5.483 |
| gsarti/flores_101_mkd | mkd | byte_perplexity ↓ | 2.966 |
| gsarti/flores_101_mlt | mlt | byte_perplexity ↓ | 15.005 |
| gsarti/flores_101_mon | mon | byte_perplexity ↓ | 3.411 |
| gsarti/flores_101_mri | mri | byte_perplexity ↓ | 7.474 |
| gsarti/flores_101_msa | msa | byte_perplexity ↓ | 2.571 |
| gsarti/flores_101_mya | mya | byte_perplexity ↓ | 2.414 |
| gsarti/flores_101_nld | nld | byte_perplexity ↓ | 4.128 |
| gsarti/flores_101_nob | nob | byte_perplexity ↓ | 5.403 |
| gsarti/flores_101_npi | npi | byte_perplexity ↓ | 5.199 |
| gsarti/flores_101_nso | nso | byte_perplexity ↓ | 8.155 |
| gsarti/flores_101_nya | nya | byte_perplexity ↓ | 8.18 |
| gsarti/flores_101_oci | oci | byte_perplexity ↓ | 4.862 |
| gsarti/flores_101_orm | orm | byte_perplexity ↓ | 12.912 |
| gsarti/flores_101_ory | ory | byte_perplexity ↓ | 5.189 |
| gsarti/flores_101_pan | pan | byte_perplexity ↓ | 4.698 |
| gsarti/flores_101_pol | pol | byte_perplexity ↓ | 4.626 |
| gsarti/flores_101_por | por | byte_perplexity ↓ | 1.975 |
| gsarti/flores_101_pus | pus | byte_perplexity ↓ | 4.496 |
| gsarti/flores_101_ron | ron | byte_perplexity ↓ | 4.965 |
| gsarti/flores_101_rus | rus | byte_perplexity ↓ | 2.05 |
| gsarti/flores_101_slk | slk | byte_perplexity ↓ | 6.451 |
| gsarti/flores_101_slv | slv | byte_perplexity ↓ | 6.62 |
| gsarti/flores_101_sna | sna | byte_perplexity ↓ | 8.462 |
| gsarti/flores_101_snd | snd | byte_perplexity ↓ | 5.466 |
| gsarti/flores_101_som | som | byte_perplexity ↓ | 11.959 |
| gsarti/flores_101_spa | spa | byte_perplexity ↓ | 1.897 |
| gsarti/flores_101_srp | srp | byte_perplexity ↓ | 2.871 |
| gsarti/flores_101_swe | swe | byte_perplexity ↓ | 5.055 |
| gsarti/flores_101_swh | swh | byte_perplexity ↓ | 3.697 |
| gsarti/flores_101_tam | tam | byte_perplexity ↓ | 4.539 |
| gsarti/flores_101_tel | tel | byte_perplexity ↓ | 5.807 |
| gsarti/flores_101_tgk | tgk | byte_perplexity ↓ | 3.599 |
| gsarti/flores_101_tgl | tgl | byte_perplexity ↓ | 5.667 |
| gsarti/flores_101_tha | tha | byte_perplexity ↓ | 2.366 |
| gsarti/flores_101_tur | tur | byte_perplexity ↓ | 4.885 |
| gsarti/flores_101_ukr | ukr | byte_perplexity ↓ | 2.724 |
| gsarti/flores_101_umb | umb | byte_perplexity ↓ | 12.767 |
| gsarti/flores_101_urd | urd | byte_perplexity ↓ | 1.98 |
| gsarti/flores_101_uzb | uzb | byte_perplexity ↓ | 12.002 |
| gsarti/flores_101_vie | vie | byte_perplexity ↓ | 1.766 |
| gsarti/flores_101_wol | wol | byte_perplexity ↓ | 9.144 |
| gsarti/flores_101_xho | xho | byte_perplexity ↓ | 7.403 |
| gsarti/flores_101_yor | yor | byte_perplexity ↓ | 5.913 |
| gsarti/flores_101_zho_simpl | zho_simpl | byte_perplexity ↓ | 2.277 |
| gsarti/flores_101_zho_trad | zho_trad | byte_perplexity ↓ | 2.518 |
| gsarti/flores_101_zul | zul | byte_perplexity ↓ | 8.534 |
| headqa | esp | acc ↑ | 0.264 |
| hellaswag | eng | acc ↑ | 0.412 |
| logiqa | eng | acc ↑ | 0.207 |
| mathqa | eng | acc ↑ | 0.25 |
| mc_taco | eng | em ↑ | 0.119 |
| mnli (Median of 15 prompts) | eng | acc ↑ | 0.355 |
| mnli_mismatched (Median of 15 prompts) | eng | acc ↑ | 0.352 |
| mrpc | eng | acc ↑ | 0.586 |
| multirc (Median of 11 prompts) | eng | acc ↑ | 0.538 |
| openbookqa | eng | acc ↑ | 0.216 |
| piqa | eng | acc ↑ | 0.708 |
| prost | eng | acc ↑ | 0.227 |
| pubmedqa | eng | acc ↑ | 0.616 |
| qnli | eng | acc ↑ | 0.507 |
| qqp (Median of 7 prompts) | eng | acc ↑ | 0.384 |
| race | eng | acc ↑ | 0.352 |
| rte (Median of 6 prompts) | eng | acc ↑ | 0.477 |
| sciq | eng | acc ↑ | 0.892 |
| sst (Median of 6 prompts) | eng | acc ↑ | 0.518 |
| triviaqa | eng | acc ↑ | 0.042 |
| tydiqa_primary (Median of 24 prompts) | eng | acc ↑ | 0.301 |
| webqs | eng | acc ↑ | 0.017 |
| wic (Median of 11 prompts) | eng | acc ↑ | 0.502 |
| winogrande | eng | acc ↑ | 0.586 |
| wnli (Median of 6 prompts) | eng | acc ↑ | 0.472 |
| wsc (Median of 11 prompts) | eng | acc ↑ | 0.442 |
| humaneval | python | pass@1 ↑ | 0.155 |
| humaneval | python | pass@10 ↑ | 0.322 |
| humaneval | python | pass@100 ↑ | 0.555 |
**Train-time Evaluation:**
As of 25.May.2022, 15:00 PST:
- Training Loss: 2.0
- Validation Loss: 2.2
- Perplexity: 8.9
</details>
<p> </p>
## Recommendations
*This section provides information on warnings and potential mitigations.*
<details>
<summary>Click to expand</summary><br/>
- Indirect users should be made aware when the content they're working with is created by the LLM.
- Users should be aware of [Risks and Limitations](#risks-and-limitations), and include an appropriate age disclaimer or blocking interface as necessary.
- Models pretrained with the LLM should include an updated Model Card.
- Users of the model should provide mechanisms for those affected to provide feedback, such as an email address for comments.
</details>
<p> </p>
## Glossary and Calculations
*This section defines common terms and how metrics are calculated.*
<details>
<summary>Click to expand</summary><br/>
- <a name="loss">**Loss:**</a> A calculation of the difference between what the model has learned and what the data shows ("groundtruth"). The lower the loss, the better. The training process aims to minimize the loss.
- <a name="perplexity">**Perplexity:**</a> This is based on what the model estimates the probability of new data is. The lower the perplexity, the better. If the model is 100% correct at predicting the next token it will see, then the perplexity is 1. Mathematically this is calculated using entropy.
- <a name="high-stakes">**High-stakes settings:**</a> Such as those identified as "high-risk AI systems" and "unacceptable risk AI systems" in the European Union's proposed [Artificial Intelligence (AI) Act](https://artificialintelligenceact.eu/annexes/).
- <a name="critical-decisions">**Critical decisions:**</a> Such as those defined in [the United States' proposed Algorithmic Accountability Act](https://www.congress.gov/117/bills/s3572/BILLS-117s3572is.pdf).
- <a name="human-rights">**Human rights:**</a> Includes those rights defined in the [Universal Declaration of Human Rights](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf).
- <a name="personal-data-and-information">**Personal Data and Personal Information:**</a> Personal data and information is defined in multiple data protection regulations, such as "[personal data](https://gdpr-info.eu/issues/personal-data/)" in the [European Union's General Data Protection Regulation](https://gdpr-info.eu); and "personal information" in the Republic of South Africa's [Protection of Personal Information Act](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf), The People's Republic of China's [Personal information protection law](http://en.npc.gov.cn.cdurl.cn/2021-12/29/c_694559.htm).
- <a name="sensitive-characteristics">**Sensitive characteristics:**</a> This includes specifically protected categories in human rights (see [UHDR, Article 2](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf)) and personal information regulation (see GDPR, [Article 9; Protection of Personal Information Act, Chapter 1](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf))
- <a name="deception">**Deception:**</a> Doing something to intentionally mislead individuals to believe something that is false, such as by creating deadbots or chatbots on social media posing as real people, or generating text documents without making consumers aware that the text is machine generated.
</details>
<p> </p>
## More Information
<details>
<summary>Click to expand</summary><br/>
### Dataset Creation
Blog post detailing the design choices during the dataset creation: https://bigscience.huggingface.co/blog/building-a-tb-scale-multilingual-dataset-for-language-modeling
### Technical Specifications
Blog post summarizing how the architecture, size, shape, and pre-training duration where selected: https://bigscience.huggingface.co/blog/what-language-model-to-train-if-you-have-two-million-gpu-hours
More details on the architecture/optimizer: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml
Blog post on the hardware/engineering side: https://bigscience.huggingface.co/blog/which-hardware-to-train-a-176b-parameters-model
Details on the distributed setup used for the training: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml
Tensorboard updated during the training: https://huggingface.co/bigscience/tr11-176B-ml-logs/tensorboard#scalars&tagFilter=loss
Insights on how to approach training, negative results: https://github.com/bigscience-workshop/bigscience/blob/master/train/lessons-learned.md
Details on the obstacles overcome during the preparation on the engineering side (instabilities, optimization of training throughput, so many technical tricks and questions): https://github.com/bigscience-workshop/bigscience/blob/master/train/tr11-176B-ml/chronicles.md
### Initial Results
Initial prompting experiments using interim checkpoints: https://huggingface.co/spaces/bigscience/bloom-book
</details>
<p> </p>
## Model Card Authors
*Ordered roughly chronologically and by amount of time spent.*
Margaret Mitchell, Giada Pistilli, Yacine Jernite, Ezinwanne Ozoani, Marissa Gerchick, Nazneen Rajani, Sasha Luccioni, Irene Solaiman, Maraim Masoud, Somaieh Nikpoor, Carlos Muñoz Ferrandis, Stas Bekman, Christopher Akiki, Danish Contractor, David Lansky, Angelina McMillan-Major, Tristan Thrush, Suzana Ilić, Gérard Dupont, Shayne Longpre, Manan Dey, Stella Biderman, Douwe Kiela, Emi Baylor, Teven Le Scao, Aaron Gokaslan, Julien Launay, Niklas Muennighoff
| 64,277 | [
[
-0.0290679931640625,
-0.0546875,
0.033294677734375,
0.00884246826171875,
-0.0062408447265625,
-0.0143280029296875,
-0.03668212890625,
-0.0421142578125,
0.00701141357421875,
0.01739501953125,
-0.028076171875,
-0.037567138671875,
-0.0516357421875,
0.004528045654296875,
-0.0257110595703125,
0.07708740234375,
0.0035915374755859375,
0.0130615234375,
0.00009369850158691406,
0.0087432861328125,
-0.01071929931640625,
-0.049652099609375,
-0.04217529296875,
-0.0287628173828125,
0.044158935546875,
0.024505615234375,
0.052520751953125,
0.050201416015625,
0.0482177734375,
0.0202178955078125,
-0.0328369140625,
-0.006153106689453125,
-0.041290283203125,
-0.0286865234375,
-0.020294189453125,
-0.0146026611328125,
-0.048828125,
-0.0036945343017578125,
0.06732177734375,
0.048736572265625,
0.0019359588623046875,
0.0246734619140625,
-0.0011529922485351562,
0.039703369140625,
-0.042938232421875,
0.0251007080078125,
-0.0399169921875,
0.005397796630859375,
-0.017333984375,
0.022003173828125,
-0.0245819091796875,
-0.00005996227264404297,
0.00441741943359375,
-0.03594970703125,
0.0124053955078125,
0.0005478858947753906,
0.07171630859375,
-0.0029315948486328125,
-0.008026123046875,
-0.00646209716796875,
-0.05889892578125,
0.064208984375,
-0.06512451171875,
0.043731689453125,
0.02642822265625,
0.018096923828125,
0.0035343170166015625,
-0.0704345703125,
-0.058319091796875,
-0.01459503173828125,
0.005100250244140625,
0.0283050537109375,
-0.01514434814453125,
0.004795074462890625,
0.027740478515625,
0.04473876953125,
-0.044769287109375,
0.0187530517578125,
-0.038543701171875,
-0.00611114501953125,
0.049346923828125,
0.0025539398193359375,
0.012359619140625,
-0.017791748046875,
-0.0147552490234375,
-0.027557373046875,
-0.046661376953125,
-0.0159759521484375,
0.016693115234375,
0.03460693359375,
-0.027618408203125,
0.050872802734375,
0.0132293701171875,
0.0302886962890625,
-0.0209808349609375,
0.0010890960693359375,
0.03619384765625,
-0.040985107421875,
-0.028472900390625,
-0.01531982421875,
0.07513427734375,
0.0149078369140625,
-0.0020885467529296875,
-0.0080718994140625,
-0.006275177001953125,
-0.015716552734375,
0.002246856689453125,
-0.07147216796875,
-0.00946044921875,
0.03118896484375,
-0.02325439453125,
-0.0085906982421875,
0.0010652542114257812,
-0.0780029296875,
-0.005859375,
-0.01094818115234375,
0.034637451171875,
-0.049591064453125,
-0.03448486328125,
0.0209808349609375,
-0.00628662109375,
0.01146697998046875,
0.007266998291015625,
-0.06390380859375,
0.0201263427734375,
0.043731689453125,
0.078125,
-0.006565093994140625,
-0.046295166015625,
-0.0014390945434570312,
0.004817962646484375,
-0.0028095245361328125,
0.0164642333984375,
-0.016815185546875,
-0.041900634765625,
-0.00533294677734375,
0.0125579833984375,
-0.0048065185546875,
-0.016143798828125,
0.043304443359375,
-0.0307464599609375,
0.026580810546875,
-0.02264404296875,
-0.044464111328125,
-0.0029621124267578125,
0.00247955322265625,
-0.046478271484375,
0.0799560546875,
0.0017652511596679688,
-0.0653076171875,
0.01220703125,
-0.07574462890625,
-0.0134735107421875,
0.00383758544921875,
0.01024627685546875,
-0.044891357421875,
-0.00980377197265625,
0.00803375244140625,
0.030059814453125,
-0.021575927734375,
0.0218658447265625,
-0.00910186767578125,
-0.01049041748046875,
0.007411956787109375,
-0.0272979736328125,
0.060821533203125,
0.02850341796875,
-0.042877197265625,
0.004180908203125,
-0.049041748046875,
-0.00885772705078125,
0.0276947021484375,
-0.0299835205078125,
0.01308441162109375,
-0.0095977783203125,
0.030426025390625,
0.0161895751953125,
0.0215606689453125,
-0.05316162109375,
0.02264404296875,
-0.046783447265625,
0.053924560546875,
0.045379638671875,
-0.01049041748046875,
0.0243682861328125,
-0.0152130126953125,
0.0259246826171875,
0.005718231201171875,
0.0241546630859375,
-0.01849365234375,
-0.0472412109375,
-0.0595703125,
-0.0352783203125,
0.03118896484375,
0.035980224609375,
-0.0345458984375,
0.047576904296875,
-0.0361328125,
-0.0582275390625,
-0.031951904296875,
-0.005237579345703125,
0.0428466796875,
0.025054931640625,
0.05450439453125,
-0.005413055419921875,
-0.041412353515625,
-0.0574951171875,
0.00485992431640625,
0.00616455078125,
0.01540374755859375,
0.02838134765625,
0.07843017578125,
-0.036895751953125,
0.06231689453125,
-0.04052734375,
-0.00493621826171875,
-0.0182647705078125,
-0.0006527900695800781,
0.023773193359375,
0.040557861328125,
0.03326416015625,
-0.055999755859375,
-0.019927978515625,
0.002582550048828125,
-0.050994873046875,
0.0256500244140625,
0.0187835693359375,
0.00012314319610595703,
0.020050048828125,
0.0379638671875,
-0.064208984375,
0.021728515625,
0.052825927734375,
-0.010040283203125,
0.05401611328125,
-0.01898193359375,
-0.007648468017578125,
-0.10321044921875,
0.03741455078125,
0.0018224716186523438,
0.0019121170043945312,
-0.03741455078125,
0.0182952880859375,
-0.005336761474609375,
-0.03143310546875,
-0.047576904296875,
0.058685302734375,
-0.0304718017578125,
0.0053558349609375,
-0.01172637939453125,
-0.00170135498046875,
-0.0090484619140625,
0.0252532958984375,
0.00856781005859375,
0.068359375,
0.052703857421875,
-0.046295166015625,
0.01088714599609375,
0.0111541748046875,
-0.01311492919921875,
0.005084991455078125,
-0.0657958984375,
0.0079498291015625,
-0.0084075927734375,
0.0215606689453125,
-0.055328369140625,
-0.0235443115234375,
0.0158538818359375,
-0.041961669921875,
0.03240966796875,
-0.0004477500915527344,
-0.060791015625,
-0.05169677734375,
-0.0159759521484375,
0.029388427734375,
0.0450439453125,
-0.034271240234375,
0.02850341796875,
0.0235137939453125,
0.01233673095703125,
-0.0377197265625,
-0.06805419921875,
0.0081024169921875,
-0.01079559326171875,
-0.0450439453125,
0.0396728515625,
-0.01074981689453125,
-0.005519866943359375,
0.00530242919921875,
0.0184326171875,
0.00445556640625,
0.002101898193359375,
0.0242156982421875,
0.00907135009765625,
-0.0128936767578125,
0.026275634765625,
-0.020965576171875,
-0.00144195556640625,
-0.00586700439453125,
-0.0439453125,
0.05255126953125,
-0.0205535888671875,
-0.024993896484375,
-0.0367431640625,
0.021697998046875,
0.051055908203125,
-0.0201568603515625,
0.08160400390625,
0.060821533203125,
-0.0450439453125,
0.004177093505859375,
-0.035980224609375,
-0.026580810546875,
-0.03558349609375,
0.051055908203125,
-0.0027980804443359375,
-0.071044921875,
0.03082275390625,
0.007110595703125,
0.01354217529296875,
0.05609130859375,
0.053253173828125,
0.009521484375,
0.0611572265625,
0.0494384765625,
-0.0168914794921875,
0.041290283203125,
-0.04840087890625,
0.0238494873046875,
-0.0677490234375,
-0.00946044921875,
-0.036956787109375,
-0.0071868896484375,
-0.04534912109375,
-0.049224853515625,
0.02264404296875,
0.01427459716796875,
-0.039154052734375,
0.032867431640625,
-0.041015625,
0.0205230712890625,
0.044189453125,
0.003204345703125,
0.004589080810546875,
0.005657196044921875,
-0.01324462890625,
-0.005054473876953125,
-0.0574951171875,
-0.046142578125,
0.096923828125,
0.0506591796875,
0.032073974609375,
0.00464630126953125,
0.048309326171875,
-0.000006139278411865234,
0.01525115966796875,
-0.050445556640625,
0.040740966796875,
-0.007076263427734375,
-0.0606689453125,
-0.02862548828125,
-0.043182373046875,
-0.0863037109375,
0.01415252685546875,
-0.0220794677734375,
-0.07391357421875,
0.0012073516845703125,
0.018035888671875,
-0.0191192626953125,
0.052337646484375,
-0.05859375,
0.072265625,
-0.0210113525390625,
-0.0347900390625,
-0.019317626953125,
-0.04205322265625,
0.02008056640625,
0.0012769699096679688,
0.0294189453125,
0.0153350830078125,
0.01385498046875,
0.058990478515625,
-0.03857421875,
0.0704345703125,
-0.01090240478515625,
0.005901336669921875,
0.0214691162109375,
-0.0224151611328125,
0.034149169921875,
0.0018062591552734375,
-0.0134429931640625,
0.044219970703125,
-0.00395965576171875,
-0.0263824462890625,
-0.00913238525390625,
0.059478759765625,
-0.08056640625,
-0.029388427734375,
-0.04345703125,
-0.040008544921875,
-0.0033855438232421875,
0.033721923828125,
0.04010009765625,
0.015869140625,
-0.0193634033203125,
0.018035888671875,
0.051422119140625,
-0.04364013671875,
0.029937744140625,
0.025054931640625,
-0.02978515625,
-0.04620361328125,
0.08197021484375,
0.010711669921875,
0.0229034423828125,
0.0262603759765625,
0.0284423828125,
-0.021484375,
-0.044158935546875,
-0.0245361328125,
0.0384521484375,
-0.04052734375,
-0.008544921875,
-0.06378173828125,
-0.0357666015625,
-0.055419921875,
0.0096588134765625,
-0.042236328125,
-0.0182647705078125,
-0.0406494140625,
-0.0140380859375,
0.031524658203125,
0.042205810546875,
-0.013671875,
0.031951904296875,
-0.048309326171875,
0.003406524658203125,
0.01552581787109375,
0.0196990966796875,
0.004444122314453125,
-0.045440673828125,
-0.034912109375,
0.0232391357421875,
-0.03790283203125,
-0.04632568359375,
0.0254058837890625,
0.01324462890625,
0.0305938720703125,
0.00804901123046875,
-0.0312347412109375,
0.032135009765625,
-0.039215087890625,
0.0823974609375,
0.0274505615234375,
-0.069091796875,
0.039642333984375,
-0.03216552734375,
0.0301361083984375,
0.0306549072265625,
0.039703369140625,
-0.0406494140625,
-0.01049041748046875,
-0.06072998046875,
-0.0838623046875,
0.04742431640625,
0.01776123046875,
0.0165252685546875,
-0.008544921875,
0.0307769775390625,
-0.02081298828125,
0.01495361328125,
-0.0718994140625,
-0.022979736328125,
-0.0251922607421875,
-0.0101470947265625,
-0.0265350341796875,
-0.0195465087890625,
-0.0112457275390625,
-0.0309295654296875,
0.06451416015625,
0.0007801055908203125,
0.044036865234375,
0.0200042724609375,
-0.00963592529296875,
-0.0028076171875,
0.0132293701171875,
0.0516357421875,
0.045867919921875,
-0.01806640625,
-0.0023250579833984375,
0.023468017578125,
-0.058837890625,
-0.0123443603515625,
0.0214691162109375,
-0.01183319091796875,
-0.01067352294921875,
0.0267333984375,
0.06317138671875,
0.0167694091796875,
-0.045989990234375,
0.039794921875,
0.0122528076171875,
-0.0298309326171875,
-0.023834228515625,
-0.006313323974609375,
0.027618408203125,
0.01149749755859375,
0.0162200927734375,
-0.01206207275390625,
-0.0022716522216796875,
-0.040985107421875,
0.005672454833984375,
0.034332275390625,
-0.02197265625,
-0.0343017578125,
0.058074951171875,
0.0023097991943359375,
-0.0206451416015625,
0.038177490234375,
-0.0254058837890625,
-0.039215087890625,
0.041900634765625,
0.0455322265625,
0.056884765625,
-0.00414276123046875,
0.004131317138671875,
0.057037353515625,
0.032135009765625,
-0.0079803466796875,
0.0190277099609375,
0.0190887451171875,
-0.0452880859375,
-0.0308837890625,
-0.059600830078125,
-0.026763916015625,
0.025604248046875,
-0.027618408203125,
0.0247955322265625,
-0.044830322265625,
-0.020355224609375,
0.007053375244140625,
0.00527191162109375,
-0.060333251953125,
0.01453399658203125,
0.02392578125,
0.0831298828125,
-0.05926513671875,
0.065185546875,
0.052032470703125,
-0.048614501953125,
-0.0645751953125,
-0.006072998046875,
-0.0018873214721679688,
-0.053924560546875,
0.054443359375,
0.0082855224609375,
0.00237274169921875,
0.005390167236328125,
-0.052001953125,
-0.0811767578125,
0.0924072265625,
0.0252838134765625,
-0.0487060546875,
0.00144195556640625,
0.01467132568359375,
0.053314208984375,
-0.001651763916015625,
0.02923583984375,
0.0228424072265625,
0.044342041015625,
0.01546478271484375,
-0.07647705078125,
-0.00017917156219482422,
-0.0168914794921875,
-0.002685546875,
0.005462646484375,
-0.06634521484375,
0.08502197265625,
-0.01145172119140625,
-0.0158233642578125,
0.0011835098266601562,
0.05279541015625,
0.01032257080078125,
0.003879547119140625,
0.00955963134765625,
0.06768798828125,
0.06005859375,
-0.01216888427734375,
0.083251953125,
-0.038055419921875,
0.048736572265625,
0.071533203125,
-0.00632476806640625,
0.0611572265625,
0.0276947021484375,
-0.034332275390625,
0.0201263427734375,
0.0345458984375,
-0.0143280029296875,
0.01904296875,
0.021209716796875,
-0.0091094970703125,
0.00405120849609375,
-0.0031375885009765625,
-0.052947998046875,
0.017669677734375,
0.029449462890625,
-0.04296875,
-0.005481719970703125,
0.00843048095703125,
0.016693115234375,
-0.01934814453125,
-0.022552490234375,
0.035430908203125,
0.00994873046875,
-0.037841796875,
0.03900146484375,
0.0160369873046875,
0.0584716796875,
-0.054901123046875,
0.007266998291015625,
0.0009984970092773438,
0.0246429443359375,
-0.0191650390625,
-0.060821533203125,
0.01149749755859375,
-0.0005507469177246094,
-0.0191192626953125,
-0.01152801513671875,
0.027587890625,
-0.01983642578125,
-0.0511474609375,
0.029144287109375,
0.0218048095703125,
0.0163726806640625,
-0.0066070556640625,
-0.0654296875,
0.0135498046875,
-0.01947021484375,
-0.025360107421875,
0.023101806640625,
0.0185394287109375,
0.0181884765625,
0.0401611328125,
0.051544189453125,
0.0170745849609375,
0.00701904296875,
0.0012121200561523438,
0.07061767578125,
-0.052703857421875,
-0.01531219482421875,
-0.0653076171875,
0.03515625,
-0.0050811767578125,
-0.04345703125,
0.06927490234375,
0.06231689453125,
0.06695556640625,
0.00229644775390625,
0.0626220703125,
-0.01006317138671875,
0.0264739990234375,
-0.03265380859375,
0.0443115234375,
-0.0494384765625,
-0.00467681884765625,
-0.037933349609375,
-0.0704345703125,
-0.0272216796875,
0.038177490234375,
-0.033660888671875,
0.021697998046875,
0.048858642578125,
0.061279296875,
-0.01224517822265625,
0.005138397216796875,
0.015716552734375,
0.03509521484375,
0.035430908203125,
0.0396728515625,
0.039825439453125,
-0.04180908203125,
0.030731201171875,
-0.020263671875,
-0.0155487060546875,
-0.025604248046875,
-0.0653076171875,
-0.05670166015625,
-0.046478271484375,
-0.026397705078125,
-0.037567138671875,
0.006923675537109375,
0.07330322265625,
0.0592041015625,
-0.06268310546875,
-0.0222930908203125,
-0.0253753662109375,
-0.01233673095703125,
-0.00850677490234375,
-0.016448974609375,
0.043121337890625,
-0.01476287841796875,
-0.054473876953125,
0.01861572265625,
0.01338958740234375,
0.01065826416015625,
-0.035858154296875,
-0.00884246826171875,
-0.035064697265625,
-0.00537109375,
0.04754638671875,
0.041748046875,
-0.043121337890625,
-0.0156402587890625,
0.011260986328125,
-0.015594482421875,
-0.0011034011840820312,
0.034698486328125,
-0.021942138671875,
0.0274505615234375,
0.0246124267578125,
0.042236328125,
0.05438232421875,
-0.019287109375,
0.0154876708984375,
-0.035888671875,
0.01190948486328125,
0.0274200439453125,
0.038787841796875,
0.0277862548828125,
-0.02496337890625,
0.02935791015625,
0.031158447265625,
-0.05572509765625,
-0.05975341796875,
0.0103912353515625,
-0.07281494140625,
-0.02313232421875,
0.11297607421875,
-0.006626129150390625,
-0.0277862548828125,
0.0193328857421875,
-0.01263427734375,
0.0189208984375,
-0.017791748046875,
0.04473876953125,
0.055267333984375,
0.0090179443359375,
-0.004154205322265625,
-0.05029296875,
0.0262603759765625,
0.027252197265625,
-0.0618896484375,
0.0057830810546875,
0.02947998046875,
0.02642822265625,
0.0305938720703125,
0.0361328125,
-0.01708984375,
0.00646209716796875,
-0.0029354095458984375,
0.033050537109375,
-0.00853729248046875,
-0.0192718505859375,
-0.030029296875,
-0.002532958984375,
0.005565643310546875,
-0.00473785400390625
]
] |
t5-11b | 2023-01-02T16:15:50.000Z | [
"transformers",
"pytorch",
"tf",
"t5",
"text2text-generation",
"summarization",
"translation",
"en",
"fr",
"ro",
"de",
"multilingual",
"dataset:c4",
"arxiv:1805.12471",
"arxiv:1708.00055",
"arxiv:1704.05426",
"arxiv:1606.05250",
"arxiv:1808.09121",
"arxiv:1810.12885",
"arxiv:1905.10044",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | translation | null | null | null | t5-11b | 41 | 17,013 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
- fr
- ro
- de
- multilingual
license: apache-2.0
tags:
- summarization
- translation
datasets:
- c4
inference: false
---
# Model Card for T5 11B

# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training Details](#training-details)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Citation](#citation)
8. [Model Card Authors](#model-card-authors)
9. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
The developers of the Text-To-Text Transfer Transformer (T5) [write](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html):
> With T5, we propose reframing all NLP tasks into a unified text-to-text-format where the input and output are always text strings, in contrast to BERT-style models that can only output either a class label or a span of the input. Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task.
T5-11B is the checkpoint with 11 billion parameters.
- **Developed by:** Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu. See [associated paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) and [GitHub repo](https://github.com/google-research/text-to-text-transfer-transformer#released-model-checkpoints)
- **Model type:** Language model
- **Language(s) (NLP):** English, French, Romanian, German
- **License:** Apache 2.0
- **Related Models:** [All T5 Checkpoints](https://huggingface.co/models?search=t5)
- **Resources for more information:**
- [Research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf)
- [Google's T5 Blog Post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
- [GitHub Repo](https://github.com/google-research/text-to-text-transfer-transformer)
- [Hugging Face T5 Docs](https://huggingface.co/docs/transformers/model_doc/t5)
# Uses
## Direct Use and Downstream Use
The developers write in a [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) that the model:
> Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task, including machine translation, document summarization, question answering, and classification tasks (e.g., sentiment analysis). We can even apply T5 to regression tasks by training it to predict the string representation of a number instead of the number itself.
See the [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.
## Out-of-Scope Use
More information needed.
# Bias, Risks, and Limitations
More information needed.
## Recommendations
More information needed.
# Training Details
## Training Data
The model is pre-trained on the [Colossal Clean Crawled Corpus (C4)](https://www.tensorflow.org/datasets/catalog/c4), which was developed and released in the context of the same [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) as T5.
The model was pre-trained on a on a **multi-task mixture of unsupervised (1.) and supervised tasks (2.)**.
Thereby, the following datasets were being used for (1.) and (2.):
1. **Datasets used for Unsupervised denoising objective**:
- [C4](https://huggingface.co/datasets/c4)
- [Wiki-DPR](https://huggingface.co/datasets/wiki_dpr)
2. **Datasets used for Supervised text-to-text language modeling objective**
- Sentence acceptability judgment
- CoLA [Warstadt et al., 2018](https://arxiv.org/abs/1805.12471)
- Sentiment analysis
- SST-2 [Socher et al., 2013](https://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf)
- Paraphrasing/sentence similarity
- MRPC [Dolan and Brockett, 2005](https://aclanthology.org/I05-5002)
- STS-B [Ceret al., 2017](https://arxiv.org/abs/1708.00055)
- QQP [Iyer et al., 2017](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
- Natural language inference
- MNLI [Williams et al., 2017](https://arxiv.org/abs/1704.05426)
- QNLI [Rajpurkar et al.,2016](https://arxiv.org/abs/1606.05250)
- RTE [Dagan et al., 2005](https://link.springer.com/chapter/10.1007/11736790_9)
- CB [De Marneff et al., 2019](https://semanticsarchive.net/Archive/Tg3ZGI2M/Marneffe.pdf)
- Sentence completion
- COPA [Roemmele et al., 2011](https://www.researchgate.net/publication/221251392_Choice_of_Plausible_Alternatives_An_Evaluation_of_Commonsense_Causal_Reasoning)
- Word sense disambiguation
- WIC [Pilehvar and Camacho-Collados, 2018](https://arxiv.org/abs/1808.09121)
- Question answering
- MultiRC [Khashabi et al., 2018](https://aclanthology.org/N18-1023)
- ReCoRD [Zhang et al., 2018](https://arxiv.org/abs/1810.12885)
- BoolQ [Clark et al., 2019](https://arxiv.org/abs/1905.10044)
## Training Procedure
In their [abstract](https://jmlr.org/papers/volume21/20-074/20-074.pdf), the model developers write:
> In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks.
The framework introduced, the T5 framework, involves a training procedure that brings together the approaches studied in the paper. See the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.
# Evaluation
## Testing Data, Factors & Metrics
The developers evaluated the model on 24 tasks, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for full details.
## Results
For full results for T5-11B, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf), Table 14.
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** Google Cloud TPU Pods
- **Hours used:** More information needed
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Citation
**BibTeX:**
```bibtex
@article{2020t5,
author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
journal = {Journal of Machine Learning Research},
year = {2020},
volume = {21},
number = {140},
pages = {1-67},
url = {http://jmlr.org/papers/v21/20-074.html}
}
```
**APA:**
- Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., ... & Liu, P. J. (2020). Exploring the limits of transfer learning with a unified text-to-text transformer. J. Mach. Learn. Res., 21(140), 1-67.
# Model Card Authors
This model card was written by the team at Hugging Face.
# How to Get Started with the Model
## Disclaimer
**Before `transformers` v3.5.0**, due do its immense size, `t5-11b` required some special treatment.
If you're using transformers `<= v3.4.0`, `t5-11b` should be loaded with flag `use_cdn` set to `False` as follows:
```python
t5 = transformers.T5ForConditionalGeneration.from_pretrained('t5-11b', use_cdn = False)
```
Secondly, a single GPU will most likely not have enough memory to even load the model into memory as the weights alone amount to over 40 GB.
- Model parallelism has to be used here to overcome this problem as is explained in this [PR](https://github.com/huggingface/transformers/pull/3578).
- DeepSpeed's ZeRO-Offload is another approach as explained in this [post](https://github.com/huggingface/transformers/issues/9996).
See the [Hugging Face T5](https://huggingface.co/docs/transformers/model_doc/t5#transformers.T5Model) docs and a [Colab Notebook](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/main/notebooks/t5-trivia.ipynb) created by the model developers for more context.
| 8,603 | [
[
-0.02581787109375,
-0.0299530029296875,
0.036895751953125,
0.01361083984375,
-0.00977325439453125,
-0.006221771240234375,
-0.0191650390625,
-0.045806884765625,
-0.0245819091796875,
0.033447265625,
-0.038330078125,
-0.040924072265625,
-0.060882568359375,
0.0230865478515625,
-0.04254150390625,
0.08038330078125,
-0.007293701171875,
-0.0113525390625,
-0.01296234130859375,
-0.0088043212890625,
-0.0265655517578125,
-0.039398193359375,
-0.04498291015625,
-0.0251617431640625,
0.030059814453125,
0.01413726806640625,
0.025299072265625,
0.035858154296875,
0.0513916015625,
0.0183868408203125,
-0.012847900390625,
-0.0003235340118408203,
-0.032135009765625,
-0.021759033203125,
-0.0200042724609375,
-0.0204010009765625,
-0.028167724609375,
-0.0022792816162109375,
0.042724609375,
0.0545654296875,
-0.0013103485107421875,
0.02520751953125,
0.012054443359375,
0.042083740234375,
-0.04541015625,
0.01336669921875,
-0.0435791015625,
0.004852294921875,
-0.0030651092529296875,
0.005176544189453125,
-0.0435791015625,
-0.0016183853149414062,
0.0180816650390625,
-0.0482177734375,
0.024078369140625,
0.0018434524536132812,
0.09344482421875,
0.0264129638671875,
-0.0347900390625,
-0.01145172119140625,
-0.05816650390625,
0.08154296875,
-0.05963134765625,
0.038238525390625,
0.01044464111328125,
0.011932373046875,
0.0135498046875,
-0.08245849609375,
-0.05078125,
-0.0005564689636230469,
-0.0161895751953125,
0.0174560546875,
-0.0237274169921875,
0.003704071044921875,
0.0292205810546875,
0.029327392578125,
-0.032440185546875,
-0.0027446746826171875,
-0.04754638671875,
-0.009857177734375,
0.0428466796875,
-0.0011692047119140625,
0.0265655517578125,
-0.0169525146484375,
-0.037200927734375,
-0.025390625,
-0.0270843505859375,
0.0047149658203125,
-0.003246307373046875,
0.021759033203125,
-0.0288238525390625,
0.0184326171875,
0.005870819091796875,
0.04644775390625,
0.010894775390625,
-0.0106353759765625,
0.0309295654296875,
-0.054290771484375,
-0.0178070068359375,
-0.0262908935546875,
0.084228515625,
0.0208587646484375,
0.01320648193359375,
-0.0322265625,
-0.0025386810302734375,
-0.006046295166015625,
0.032745361328125,
-0.07183837890625,
-0.01155853271484375,
0.02130126953125,
-0.0362548828125,
-0.035125732421875,
-0.003971099853515625,
-0.0633544921875,
-0.0020084381103515625,
-0.00970458984375,
0.03570556640625,
-0.036163330078125,
-0.01654052734375,
0.016265869140625,
-0.019287109375,
0.02545166015625,
0.0241851806640625,
-0.065185546875,
0.027099609375,
0.0235137939453125,
0.05291748046875,
-0.0341796875,
-0.0265350341796875,
-0.00848388671875,
0.00731658935546875,
-0.007843017578125,
0.0555419921875,
-0.031585693359375,
-0.0305328369140625,
-0.0090179443359375,
0.01226043701171875,
-0.018707275390625,
-0.024261474609375,
0.06341552734375,
-0.021392822265625,
0.053497314453125,
-0.0266876220703125,
-0.038299560546875,
-0.034393310546875,
0.01032257080078125,
-0.047271728515625,
0.092529296875,
0.000240325927734375,
-0.059783935546875,
0.0183258056640625,
-0.0657958984375,
-0.0218505859375,
-0.01922607421875,
0.0206451416015625,
-0.040863037109375,
-0.0188140869140625,
0.0230560302734375,
0.043914794921875,
-0.033447265625,
0.030609130859375,
-0.0207977294921875,
-0.016876220703125,
0.002643585205078125,
-0.027099609375,
0.07763671875,
0.023712158203125,
-0.03753662109375,
0.0005469322204589844,
-0.052276611328125,
0.0011816024780273438,
0.0016117095947265625,
-0.0181427001953125,
0.003612518310546875,
-0.0143280029296875,
0.0207366943359375,
0.032806396484375,
0.01727294921875,
-0.039031982421875,
0.0006470680236816406,
-0.0222625732421875,
0.047698974609375,
0.0374755859375,
-0.00873565673828125,
0.043487548828125,
-0.036834716796875,
0.0310211181640625,
0.01125335693359375,
0.006046295166015625,
-0.0166015625,
-0.03143310546875,
-0.061065673828125,
0.0068206787109375,
0.035400390625,
0.0433349609375,
-0.046142578125,
0.041259765625,
-0.0386962890625,
-0.055267333984375,
-0.0458984375,
-0.003631591796875,
0.0311737060546875,
0.05316162109375,
0.06414794921875,
-0.00745391845703125,
-0.04339599609375,
-0.050018310546875,
-0.0242767333984375,
-0.008056640625,
0.0002765655517578125,
0.01415252685546875,
0.049285888671875,
-0.006984710693359375,
0.05853271484375,
-0.02471923828125,
-0.02667236328125,
-0.040374755859375,
-0.0015363693237304688,
-0.00726318359375,
0.04412841796875,
0.04986572265625,
-0.05706787109375,
-0.036773681640625,
-0.0140838623046875,
-0.06689453125,
-0.00270843505859375,
-0.01397705078125,
-0.0005879402160644531,
0.030303955078125,
0.04388427734375,
-0.04461669921875,
0.0186920166015625,
0.048828125,
-0.023956298828125,
0.0240325927734375,
-0.012237548828125,
-0.0015649795532226562,
-0.1182861328125,
0.038177490234375,
0.01433563232421875,
-0.01306915283203125,
-0.056427001953125,
-0.005077362060546875,
0.006511688232421875,
-0.00701141357421875,
-0.04083251953125,
0.053802490234375,
-0.0312347412109375,
0.0063018798828125,
-0.003265380859375,
0.0036220550537109375,
0.0119171142578125,
0.05145263671875,
-0.0011854171752929688,
0.059326171875,
0.0235137939453125,
-0.055206298828125,
-0.0006895065307617188,
0.0286102294921875,
-0.00820159912109375,
0.0222015380859375,
-0.0565185546875,
0.020660400390625,
-0.006992340087890625,
0.035369873046875,
-0.06787109375,
0.01251983642578125,
0.0269622802734375,
-0.04840087890625,
0.026397705078125,
0.004016876220703125,
-0.0289459228515625,
-0.028900146484375,
-0.019927978515625,
0.0221710205078125,
0.05218505859375,
-0.037933349609375,
0.056427001953125,
0.01239776611328125,
0.021820068359375,
-0.058135986328125,
-0.0640869140625,
0.01160430908203125,
-0.03057861328125,
-0.041656494140625,
0.061431884765625,
-0.01049041748046875,
0.005706787109375,
0.01129150390625,
-0.001338958740234375,
-0.01253509521484375,
0.01084136962890625,
0.004192352294921875,
0.018890380859375,
0.0007023811340332031,
0.01152801513671875,
-0.006008148193359375,
-0.012969970703125,
0.0020427703857421875,
-0.030609130859375,
0.0232696533203125,
-0.01172637939453125,
0.01160430908203125,
-0.048980712890625,
0.0120391845703125,
0.041229248046875,
-0.0146636962890625,
0.06243896484375,
0.07305908203125,
-0.018524169921875,
-0.003948211669921875,
-0.03924560546875,
-0.0160369873046875,
-0.03546142578125,
0.028228759765625,
-0.0271759033203125,
-0.0653076171875,
0.033233642578125,
0.0022983551025390625,
0.0248260498046875,
0.06787109375,
0.0237274169921875,
-0.01155853271484375,
0.05499267578125,
0.062469482421875,
-0.004974365234375,
0.04241943359375,
-0.034210205078125,
0.0223236083984375,
-0.06884765625,
-0.022674560546875,
-0.059783935546875,
-0.0236968994140625,
-0.05877685546875,
-0.0284881591796875,
0.00818634033203125,
-0.0008153915405273438,
-0.0269775390625,
0.04339599609375,
-0.042388916015625,
0.008056640625,
0.032073974609375,
0.00832366943359375,
0.025848388671875,
0.0002677440643310547,
-0.002681732177734375,
-0.0106201171875,
-0.064208984375,
-0.03497314453125,
0.0966796875,
0.023956298828125,
0.02996826171875,
-0.00421142578125,
0.045257568359375,
0.01561737060546875,
0.0145416259765625,
-0.05120849609375,
0.05474853515625,
-0.02880859375,
-0.0416259765625,
-0.01580810546875,
-0.03411865234375,
-0.083251953125,
0.0209503173828125,
-0.0272369384765625,
-0.053070068359375,
0.01165771484375,
0.0002803802490234375,
-0.016693115234375,
0.041961669921875,
-0.06500244140625,
0.084228515625,
-0.0053558349609375,
-0.023345947265625,
-0.00438690185546875,
-0.054443359375,
0.0160369873046875,
0.0009622573852539062,
0.01000213623046875,
0.005184173583984375,
-0.01073455810546875,
0.07318115234375,
-0.02239990234375,
0.06500244140625,
-0.01537322998046875,
0.0023250579833984375,
0.01345062255859375,
-0.0261383056640625,
0.033355712890625,
-0.0313720703125,
-0.00594329833984375,
0.032012939453125,
0.00629425048828125,
-0.034698486328125,
-0.040313720703125,
0.03564453125,
-0.0709228515625,
-0.03192138671875,
-0.031829833984375,
-0.038299560546875,
-0.01236724853515625,
0.0243988037109375,
0.026336669921875,
0.0160369873046875,
-0.01024627685546875,
0.02734375,
0.05108642578125,
-0.03411865234375,
0.05511474609375,
0.0238800048828125,
-0.0014705657958984375,
-0.0192413330078125,
0.06024169921875,
0.00911712646484375,
0.0311126708984375,
0.042083740234375,
0.01322174072265625,
-0.0227203369140625,
-0.043914794921875,
-0.0296478271484375,
0.0256805419921875,
-0.04510498046875,
-0.01000213623046875,
-0.07513427734375,
-0.019134521484375,
-0.0421142578125,
-0.0016756057739257812,
-0.032470703125,
-0.0309906005859375,
-0.0341796875,
-0.0134429931640625,
0.0238800048828125,
0.034912109375,
0.01128387451171875,
0.01727294921875,
-0.0697021484375,
0.0156097412109375,
0.0012407302856445312,
0.00634002685546875,
0.005214691162109375,
-0.06463623046875,
-0.01042938232421875,
0.005176544189453125,
-0.0362548828125,
-0.048736572265625,
0.03411865234375,
0.0182952880859375,
0.02813720703125,
0.0026607513427734375,
0.01009368896484375,
0.049530029296875,
-0.021942138671875,
0.07415771484375,
0.0097503662109375,
-0.07659912109375,
0.0206298828125,
-0.022064208984375,
0.0303802490234375,
0.039306640625,
0.03875732421875,
-0.04901123046875,
-0.0167999267578125,
-0.0770263671875,
-0.0611572265625,
0.060211181640625,
0.0193328857421875,
0.01499176025390625,
0.0268096923828125,
0.01953125,
0.0005116462707519531,
0.01123809814453125,
-0.0679931640625,
-0.022552490234375,
-0.01702880859375,
-0.024749755859375,
-0.00664520263671875,
-0.0037593841552734375,
0.01053619384765625,
-0.025604248046875,
0.050323486328125,
-0.0036640167236328125,
0.05255126953125,
0.0211944580078125,
-0.01885986328125,
0.0127410888671875,
0.0253753662109375,
0.0447998046875,
0.03948974609375,
-0.01361083984375,
-0.004192352294921875,
0.03228759765625,
-0.03997802734375,
-0.003631591796875,
0.015380859375,
-0.02557373046875,
-0.0034236907958984375,
0.033721923828125,
0.07257080078125,
0.0101318359375,
-0.033447265625,
0.040191650390625,
-0.0014162063598632812,
-0.04400634765625,
-0.0169219970703125,
-0.004009246826171875,
0.01074981689453125,
-0.002216339111328125,
0.017547607421875,
0.020660400390625,
0.009552001953125,
-0.035919189453125,
0.005100250244140625,
0.01065826416015625,
-0.036041259765625,
-0.035491943359375,
0.06396484375,
0.0254058837890625,
-0.00045180320739746094,
0.049530029296875,
-0.008697509765625,
-0.0384521484375,
0.04388427734375,
0.03924560546875,
0.0780029296875,
-0.005977630615234375,
0.0169219970703125,
0.052703857421875,
0.0302276611328125,
-0.01177978515625,
0.0027618408203125,
-0.0034885406494140625,
-0.062286376953125,
-0.044647216796875,
-0.0374755859375,
-0.02423095703125,
0.0142059326171875,
-0.03802490234375,
0.0262451171875,
-0.026275634765625,
0.002895355224609375,
0.0080718994140625,
0.007091522216796875,
-0.057861328125,
0.023284912109375,
0.0052490234375,
0.065673828125,
-0.058624267578125,
0.063720703125,
0.05609130859375,
-0.0447998046875,
-0.07183837890625,
0.00801849365234375,
-0.0177001953125,
-0.05029296875,
0.039794921875,
0.012786865234375,
0.0037078857421875,
0.01306915283203125,
-0.040283203125,
-0.06524658203125,
0.0982666015625,
0.0292205810546875,
-0.023956298828125,
-0.0280303955078125,
0.0192108154296875,
0.049530029296875,
-0.0175018310546875,
0.032806396484375,
0.03460693359375,
0.038116455078125,
0.0173492431640625,
-0.08001708984375,
0.0251617431640625,
-0.019195556640625,
0.00537872314453125,
0.005687713623046875,
-0.06365966796875,
0.0438232421875,
-0.0236663818359375,
-0.01898193359375,
-0.01364898681640625,
0.05279541015625,
0.0022449493408203125,
0.0141754150390625,
0.03546142578125,
0.06060791015625,
0.04962158203125,
-0.0098419189453125,
0.09100341796875,
-0.0269775390625,
0.035308837890625,
0.06182861328125,
0.01019287109375,
0.065673828125,
0.034515380859375,
-0.0267333984375,
0.03692626953125,
0.0472412109375,
-0.00867462158203125,
0.03521728515625,
-0.00875091552734375,
-0.0035457611083984375,
-0.01056671142578125,
-0.0133514404296875,
-0.0282135009765625,
0.01800537109375,
0.0183868408203125,
-0.0292205810546875,
-0.0221710205078125,
0.00948333740234375,
0.023773193359375,
-0.006328582763671875,
-0.00423431396484375,
0.061065673828125,
0.0185699462890625,
-0.058837890625,
0.05218505859375,
0.0082855224609375,
0.065673828125,
-0.03436279296875,
0.006694793701171875,
-0.0150604248046875,
0.0145111083984375,
-0.0242462158203125,
-0.052032470703125,
0.039154052734375,
0.0016889572143554688,
-0.012237548828125,
-0.05255126953125,
0.061553955078125,
-0.0309906005859375,
-0.032806396484375,
0.0294952392578125,
0.03460693359375,
0.007236480712890625,
-0.00289154052734375,
-0.072021484375,
-0.002246856689453125,
0.015777587890625,
-0.018096923828125,
0.0291595458984375,
0.0298004150390625,
0.006603240966796875,
0.05340576171875,
0.045440673828125,
-0.0144195556640625,
0.0003657341003417969,
-0.01059722900390625,
0.054168701171875,
-0.05548095703125,
-0.0172882080078125,
-0.05377197265625,
0.0511474609375,
0.00040912628173828125,
-0.033538818359375,
0.05206298828125,
0.0357666015625,
0.083984375,
-0.007335662841796875,
0.071533203125,
-0.0163116455078125,
0.03857421875,
-0.02972412109375,
0.036956787109375,
-0.050323486328125,
0.016693115234375,
-0.031280517578125,
-0.0682373046875,
-0.0217437744140625,
0.0290985107421875,
-0.027496337890625,
0.02508544921875,
0.0809326171875,
0.048828125,
-0.0000470280647277832,
-0.006450653076171875,
0.0142974853515625,
0.01128387451171875,
0.0264129638671875,
0.055267333984375,
0.02618408203125,
-0.0755615234375,
0.0743408203125,
-0.0296478271484375,
0.01435089111328125,
0.0038776397705078125,
-0.062744140625,
-0.06707763671875,
-0.0621337890625,
-0.032806396484375,
-0.034515380859375,
0.007335662841796875,
0.055908203125,
0.04327392578125,
-0.0509033203125,
-0.0200042724609375,
-0.0282745361328125,
-0.0019702911376953125,
-0.0184783935546875,
-0.01546478271484375,
0.03741455078125,
-0.038116455078125,
-0.0682373046875,
0.0031757354736328125,
-0.003940582275390625,
0.002460479736328125,
-0.0013866424560546875,
-0.0019702911376953125,
-0.024871826171875,
-0.01393890380859375,
0.04144287109375,
0.015380859375,
-0.04736328125,
-0.0213470458984375,
0.0204010009765625,
-0.00832366943359375,
0.0119781494140625,
0.036102294921875,
-0.054046630859375,
0.01534271240234375,
0.040618896484375,
0.07110595703125,
0.0621337890625,
-0.007411956787109375,
0.045440673828125,
-0.026885986328125,
-0.00677490234375,
0.0092620849609375,
0.009246826171875,
0.029876708984375,
-0.015838623046875,
0.047607421875,
0.037200927734375,
-0.039642333984375,
-0.05108642578125,
-0.0116119384765625,
-0.09661865234375,
-0.01384735107421875,
0.09613037109375,
-0.01160430908203125,
-0.0161590576171875,
0.0028209686279296875,
-0.00424957275390625,
0.0253143310546875,
-0.03509521484375,
0.057403564453125,
0.06524658203125,
0.00982666015625,
-0.032318115234375,
-0.045745849609375,
0.0482177734375,
0.045806884765625,
-0.0799560546875,
-0.01131439208984375,
0.014007568359375,
0.0309906005859375,
0.01007843017578125,
0.041473388671875,
-0.011199951171875,
0.0038242340087890625,
-0.01511383056640625,
0.0229644775390625,
-0.0009388923645019531,
-0.00560760498046875,
-0.02508544921875,
0.01163482666015625,
-0.0177001953125,
-0.01380157470703125
]
] |
hirotasoshu/tiny-random-prophetnet | 2023-03-30T10:45:15.000Z | [
"transformers",
"pytorch",
"prophetnet",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text2text-generation | hirotasoshu | null | null | hirotasoshu/tiny-random-prophetnet | 0 | 17,008 | transformers | 2023-03-27T08:26:02 | Same as `hf-internal-testing/tiny-random-prophetnet`, but with higher max_length and max_position_embedding | 107 | [
[
-0.033111572265625,
-0.06561279296875,
0.001094818115234375,
0.029052734375,
-0.0171356201171875,
-0.022430419921875,
-0.0309600830078125,
-0.0181732177734375,
0.037841796875,
0.0294189453125,
-0.0222320556640625,
-0.0242462158203125,
-0.0268402099609375,
0.024200439453125,
-0.042449951171875,
0.02252197265625,
0.0173797607421875,
-0.01003265380859375,
0.0001977682113647461,
-0.0256500244140625,
0.003936767578125,
-0.032867431640625,
-0.043701171875,
-0.0137939453125,
0.058074951171875,
0.040283203125,
0.0858154296875,
0.06475830078125,
0.0677490234375,
0.00673675537109375,
0.0058746337890625,
-0.0254058837890625,
-0.0193023681640625,
-0.00661468505859375,
0.017852783203125,
0.01363372802734375,
-0.014862060546875,
-0.037872314453125,
0.08453369140625,
0.03753662109375,
-0.0289764404296875,
0.0293426513671875,
-0.0034160614013671875,
0.03863525390625,
-0.056549072265625,
-0.0321044921875,
-0.0037059783935546875,
0.0293731689453125,
-0.0193023681640625,
-0.01340484619140625,
-0.036102294921875,
-0.01385498046875,
0.0027370452880859375,
-0.0239105224609375,
0.005008697509765625,
0.036712646484375,
0.10394287109375,
0.018524169921875,
-0.0458984375,
0.0330810546875,
-0.0283355712890625,
0.027191162109375,
-0.047119140625,
0.06048583984375,
0.0205841064453125,
0.0518798828125,
-0.0049591064453125,
-0.026763916015625,
-0.046478271484375,
-0.0208740234375,
-0.01003265380859375,
-0.012908935546875,
0.0019779205322265625,
0.01258087158203125,
0.045684814453125,
0.024688720703125,
-0.05108642578125,
0.017822265625,
-0.0154571533203125,
-0.0034580230712890625,
0.056671142578125,
0.0037078857421875,
-0.01213836669921875,
-0.020416259765625,
-0.029937744140625,
-0.029266357421875,
-0.096435546875,
-0.0255584716796875,
0.04376220703125,
0.024810791015625,
-0.0258636474609375,
0.049102783203125,
-0.03607177734375,
0.05596923828125,
0.01486968994140625,
-0.00719451904296875,
0.038909912109375,
-0.026611328125,
-0.050445556640625,
0.0166473388671875,
0.056304931640625,
-0.00690460205078125,
-0.005584716796875,
0.023956298828125,
-0.006458282470703125,
-0.04864501953125,
0.01513671875,
-0.077880859375,
-0.0251312255859375,
-0.005870819091796875,
-0.0618896484375,
-0.0220489501953125,
0.02850341796875,
-0.02239990234375,
-0.02337646484375,
0.021636962890625,
0.038543701171875,
-0.029144287109375,
-0.00017178058624267578,
-0.007274627685546875,
-0.03790283203125,
0.038299560546875,
-0.0135955810546875,
-0.03680419921875,
0.0278778076171875,
0.04168701171875,
0.06585693359375,
-0.0219268798828125,
-0.022735595703125,
-0.03717041015625,
0.0079803466796875,
-0.025390625,
0.03497314453125,
-0.027069091796875,
-0.01322174072265625,
0.00890350341796875,
0.00435638427734375,
0.0160980224609375,
-0.0282745361328125,
0.014404296875,
-0.037109375,
0.01261138916015625,
-0.039581298828125,
-0.032470703125,
-0.02191162109375,
0.0271453857421875,
-0.032562255859375,
0.10162353515625,
0.036041259765625,
-0.045257568359375,
0.050689697265625,
-0.046661376953125,
-0.036407470703125,
-0.002346038818359375,
-0.0020236968994140625,
-0.040374755859375,
-0.01959228515625,
-0.02191162109375,
0.034088134765625,
-0.0255126953125,
-0.00759124755859375,
-0.06103515625,
-0.052337646484375,
0.0301666259765625,
-0.039306640625,
0.048492431640625,
0.01302337646484375,
-0.024688720703125,
0.017242431640625,
-0.0528564453125,
0.005458831787109375,
0.0211639404296875,
-0.01690673828125,
-0.024658203125,
-0.0203399658203125,
-0.0026702880859375,
0.0022830963134765625,
0.012603759765625,
-0.05352783203125,
0.0115814208984375,
-0.0246734619140625,
0.0197906494140625,
0.032073974609375,
0.019439697265625,
0.021728515625,
-0.044097900390625,
0.00965118408203125,
-0.01453399658203125,
0.046112060546875,
-0.01050567626953125,
-0.044097900390625,
-0.061126708984375,
-0.04345703125,
0.0218505859375,
0.034332275390625,
-0.01558685302734375,
0.060150146484375,
-0.01415252685546875,
-0.0259857177734375,
-0.00939178466796875,
0.00685882568359375,
0.041961669921875,
-0.00373077392578125,
0.0020904541015625,
-0.01279449462890625,
-0.03643798828125,
-0.06854248046875,
0.021759033203125,
-0.00762939453125,
0.01364898681640625,
-0.0010251998901367188,
0.059600830078125,
-0.0225372314453125,
0.027069091796875,
-0.038604736328125,
-0.028228759765625,
0.005786895751953125,
-0.03717041015625,
0.052032470703125,
0.03619384765625,
0.0489501953125,
-0.048858642578125,
-0.06964111328125,
-0.0183868408203125,
-0.016998291015625,
0.01708984375,
0.004131317138671875,
-0.01079559326171875,
-0.01406097412109375,
-0.0094451904296875,
-0.0670166015625,
0.04736328125,
0.03399658203125,
-0.034271240234375,
0.044464111328125,
-0.0231475830078125,
0.006763458251953125,
-0.0877685546875,
-0.024444580078125,
-0.00865936279296875,
-0.0301055908203125,
-0.00798797607421875,
0.040924072265625,
0.034515380859375,
0.0124969482421875,
-0.036224365234375,
0.02825927734375,
-0.036895751953125,
-0.0299835205078125,
-0.029388427734375,
-0.01708984375,
-0.0236053466796875,
0.0219879150390625,
-0.0290985107421875,
0.0755615234375,
0.049896240234375,
-0.034332275390625,
0.05218505859375,
0.0213775634765625,
-0.01264190673828125,
0.03955078125,
-0.05389404296875,
-0.0099639892578125,
0.01763916015625,
0.0143585205078125,
-0.054534912109375,
-0.050384521484375,
0.0176544189453125,
-0.02154541015625,
-0.0008530616760253906,
-0.0157623291015625,
-0.0679931640625,
-0.06329345703125,
-0.05279541015625,
0.03125,
0.06939697265625,
-0.05535888671875,
-0.005031585693359375,
0.006061553955078125,
0.0025806427001953125,
-0.058441162109375,
-0.047027587890625,
-0.0035762786865234375,
-0.0300750732421875,
-0.04522705078125,
0.039306640625,
0.001003265380859375,
-0.0009851455688476562,
-0.0030002593994140625,
0.033447265625,
-0.018157958984375,
-0.014312744140625,
0.05316162109375,
0.005657196044921875,
-0.03570556640625,
0.0206451416015625,
0.004650115966796875,
0.011383056640625,
-0.0175323486328125,
-0.039825439453125,
0.047607421875,
-0.0195159912109375,
-0.052398681640625,
-0.024993896484375,
0.0272979736328125,
0.006511688232421875,
-0.002155303955078125,
0.057098388671875,
0.06121826171875,
-0.0521240234375,
-0.01971435546875,
-0.0269012451171875,
-0.04632568359375,
-0.041595458984375,
0.018280029296875,
-0.04132080078125,
-0.07391357421875,
0.0311126708984375,
0.0260467529296875,
-0.01117706298828125,
0.036834716796875,
0.032440185546875,
-0.0160064697265625,
0.07501220703125,
0.05218505859375,
0.00872802734375,
0.022613525390625,
-0.0230865478515625,
-0.0056304931640625,
-0.052886962890625,
0.0200653076171875,
-0.035675048828125,
-0.021453857421875,
-0.0222015380859375,
-0.0203704833984375,
0.0261077880859375,
-0.00768280029296875,
-0.07977294921875,
0.05523681640625,
-0.037200927734375,
0.0270233154296875,
0.050567626953125,
-0.007190704345703125,
0.01409149169921875,
0.00640869140625,
-0.0310211181640625,
0.01520538330078125,
-0.046905517578125,
0.004352569580078125,
0.0740966796875,
0.007415771484375,
0.034393310546875,
0.014404296875,
0.0614013671875,
0.0208892822265625,
0.05322265625,
-0.044342041015625,
0.022003173828125,
-0.012451171875,
-0.08245849609375,
-0.04766845703125,
-0.035400390625,
-0.06890869140625,
0.032196044921875,
-0.0294952392578125,
-0.048980712890625,
0.0045013427734375,
0.00757598876953125,
-0.05108642578125,
0.0408935546875,
-0.0250244140625,
0.046173095703125,
0.01175689697265625,
0.0255584716796875,
-0.0009202957153320312,
-0.0194091796875,
0.015228271484375,
0.005336761474609375,
-0.0094451904296875,
-0.0083465576171875,
-0.0094451904296875,
0.05133056640625,
-0.052978515625,
0.034942626953125,
-0.010009765625,
0.0014743804931640625,
0.0271148681640625,
0.0037708282470703125,
0.030364990234375,
0.042999267578125,
-0.010528564453125,
-0.0128326416015625,
0.004703521728515625,
-0.036895751953125,
-0.03118896484375,
0.05548095703125,
-0.048553466796875,
-0.0308837890625,
-0.032257080078125,
-0.0203704833984375,
0.0133514404296875,
0.024017333984375,
0.034820556640625,
0.050933837890625,
-0.0279693603515625,
0.03912353515625,
0.06134033203125,
-0.007595062255859375,
0.046966552734375,
0.04205322265625,
-0.00788116455078125,
-0.04168701171875,
0.046295166015625,
-0.032806396484375,
0.00392913818359375,
0.049896240234375,
0.0380859375,
-0.01532745361328125,
-0.050323486328125,
-0.03887939453125,
0.0178985595703125,
-0.0251922607421875,
-0.03753662109375,
-0.031005859375,
-0.02886962890625,
-0.0224456787109375,
-0.01065826416015625,
-0.0540771484375,
-0.03729248046875,
0.0068511962890625,
-0.01983642578125,
0.04498291015625,
0.0184173583984375,
-0.0269775390625,
0.05706787109375,
-0.072265625,
0.00978851318359375,
0.01324462890625,
0.033782958984375,
-0.00684356689453125,
-0.054901123046875,
-0.0172576904296875,
-0.0087432861328125,
-0.0279998779296875,
-0.09185791015625,
0.0037479400634765625,
0.037017822265625,
0.004215240478515625,
0.0679931640625,
-0.00661468505859375,
0.041473388671875,
-0.045623779296875,
0.0506591796875,
0.053192138671875,
-0.0230865478515625,
0.005825042724609375,
-0.03875732421875,
0.01331329345703125,
0.042633056640625,
0.026092529296875,
-0.017730712890625,
-0.0037288665771484375,
-0.042510986328125,
-0.038055419921875,
-0.00031113624572753906,
0.025299072265625,
0.019561767578125,
-0.00856781005859375,
0.0292205810546875,
0.035125732421875,
-0.007801055908203125,
-0.0196075439453125,
-0.036590576171875,
0.021820068359375,
-0.0227813720703125,
-0.002613067626953125,
-0.01363372802734375,
-0.032501220703125,
-0.0404052734375,
0.0301361083984375,
0.002819061279296875,
0.07281494140625,
0.0103607177734375,
-0.00565338134765625,
-0.0108489990234375,
0.0002627372741699219,
0.0241241455078125,
0.035614013671875,
-0.025848388671875,
0.0195159912109375,
0.0413818359375,
-0.0367431640625,
0.031768798828125,
0.0204925537109375,
-0.0333251953125,
0.0025634765625,
0.0291748046875,
0.038177490234375,
0.017486572265625,
-0.02105712890625,
0.06573486328125,
0.006927490234375,
-0.016876220703125,
-0.0670166015625,
0.026153564453125,
0.00482940673828125,
0.01165008544921875,
0.0259246826171875,
0.0006995201110839844,
0.0190582275390625,
-0.04180908203125,
0.04144287109375,
-0.00299835205078125,
-0.0543212890625,
-0.0154876708984375,
0.036590576171875,
0.0005631446838378906,
-0.043182373046875,
0.04736328125,
-0.0484619140625,
-0.0126190185546875,
0.056488037109375,
0.04595947265625,
0.05145263671875,
-0.006366729736328125,
0.0380859375,
0.02362060546875,
0.0236358642578125,
-0.0128173828125,
0.07073974609375,
-0.01294708251953125,
-0.053680419921875,
-0.0305328369140625,
-0.028594970703125,
-0.0084381103515625,
-0.0307159423828125,
-0.053436279296875,
-0.0107421875,
-0.051788330078125,
-0.062255859375,
-0.00862884521484375,
-0.0215606689453125,
-0.06463623046875,
0.006824493408203125,
-0.0137176513671875,
0.0794677734375,
-0.0003228187561035156,
0.08392333984375,
0.06689453125,
0.0137481689453125,
-0.052276611328125,
-0.006092071533203125,
0.0006999969482421875,
-0.038970947265625,
0.034393310546875,
0.00649261474609375,
0.032470703125,
-0.02337646484375,
-0.0251312255859375,
-0.08148193359375,
0.10479736328125,
0.01300811767578125,
-0.030426025390625,
0.00620269775390625,
0.005741119384765625,
0.0178680419921875,
-0.0220794677734375,
0.032379150390625,
0.0195159912109375,
0.042510986328125,
-0.040771484375,
-0.06939697265625,
0.0229339599609375,
-0.01406097412109375,
0.000652313232421875,
0.023590087890625,
-0.07427978515625,
0.044677734375,
-0.0186767578125,
0.005214691162109375,
0.02813720703125,
0.06854248046875,
0.03607177734375,
0.0148162841796875,
0.050323486328125,
0.056365966796875,
0.045135498046875,
-0.0357666015625,
0.09600830078125,
-0.00493621826171875,
0.0237274169921875,
0.07763671875,
-0.0184478759765625,
0.07684326171875,
0.0264129638671875,
-0.0192718505859375,
0.0279541015625,
0.05511474609375,
-0.006603240966796875,
0.038787841796875,
0.01383209228515625,
0.0188446044921875,
-0.00269317626953125,
0.01006317138671875,
-0.051849365234375,
0.011444091796875,
0.024871826171875,
-0.0164031982421875,
0.02545166015625,
0.00594329833984375,
-0.004344940185546875,
-0.030914306640625,
0.00506591796875,
0.0089569091796875,
0.0400390625,
-0.002574920654296875,
0.041534423828125,
0.036956787109375,
0.07159423828125,
-0.018524169921875,
0.0257110595703125,
0.002105712890625,
0.004108428955078125,
0.02374267578125,
-0.058013916015625,
0.004058837890625,
-0.0240325927734375,
0.0011472702026367188,
-0.00801849365234375,
0.0716552734375,
-0.0019292831420898438,
-0.01006317138671875,
0.0302734375,
0.00399017333984375,
-0.00913238525390625,
-0.0156707763671875,
-0.047515869140625,
-0.01103973388671875,
-0.042755126953125,
-0.022003173828125,
0.016937255859375,
0.0032100677490234375,
0.0230865478515625,
0.055908203125,
0.034759521484375,
0.0038433074951171875,
0.0404052734375,
0.021820068359375,
0.00958251953125,
-0.06329345703125,
-0.0634765625,
-0.048370361328125,
-0.0198516845703125,
-0.0404052734375,
-0.0577392578125,
0.059478759765625,
0.0306854248046875,
0.0633544921875,
-0.0266876220703125,
0.018035888671875,
-0.013031005859375,
-0.00861358642578125,
-0.0101776123046875,
0.036376953125,
-0.048919677734375,
-0.0343017578125,
-0.01554107666015625,
-0.10546875,
-0.0159912109375,
0.027801513671875,
0.03240966796875,
-0.005558013916015625,
0.0811767578125,
0.046234130859375,
-0.0251312255859375,
0.01299285888671875,
0.0266876220703125,
0.022552490234375,
-0.00263214111328125,
0.050750732421875,
0.07012939453125,
-0.0255279541015625,
0.034332275390625,
-0.0615234375,
-0.0014829635620117188,
-0.0306854248046875,
-0.027374267578125,
-0.0877685546875,
-0.03741455078125,
-0.0301666259765625,
-0.01062774658203125,
0.0281524658203125,
0.06866455078125,
0.092529296875,
-0.0538330078125,
-0.006237030029296875,
0.0308990478515625,
0.0032291412353515625,
-0.04364013671875,
-0.0157318115234375,
0.046112060546875,
0.05126953125,
-0.00252532958984375,
0.02850341796875,
0.03985595703125,
0.017913818359375,
-0.011444091796875,
-0.0243072509765625,
0.0194091796875,
0.01143646240234375,
0.0255584716796875,
0.0158538818359375,
-0.01678466796875,
-0.056365966796875,
0.0014801025390625,
-0.0224609375,
0.0034923553466796875,
0.0252227783203125,
0.004756927490234375,
-0.0172882080078125,
0.068359375,
0.031646728515625,
0.0307464599609375,
-0.0207061767578125,
0.0205841064453125,
-0.053802490234375,
0.03863525390625,
-0.01448822021484375,
0.03997802734375,
0.0160369873046875,
-0.041900634765625,
0.02398681640625,
0.01334381103515625,
-0.0013017654418945312,
-0.05133056640625,
0.031982421875,
-0.076416015625,
0.01470947265625,
0.083251953125,
-0.0037708282470703125,
-0.0222320556640625,
0.03472900390625,
0.0207366943359375,
0.0302581787109375,
-0.0272979736328125,
0.054046630859375,
0.031646728515625,
0.0249786376953125,
0.00290679931640625,
-0.042633056640625,
0.013519287109375,
0.03387451171875,
-0.04248046875,
-0.06219482421875,
-0.01500701904296875,
0.0257110595703125,
0.005092620849609375,
0.03411865234375,
-0.0298309326171875,
0.037750244140625,
-0.006237030029296875,
-0.0247802734375,
-0.0225067138671875,
-0.05255126953125,
0.009552001953125,
0.0293731689453125,
-0.0108184814453125,
-0.0134735107421875
]
] |
peft-internal-testing/tiny_OPTForFeatureExtraction-lora | 2023-07-24T08:53:35.000Z | [
"peft",
"text-generation",
"region:us"
] | text-generation | peft-internal-testing | null | null | peft-internal-testing/tiny_OPTForFeatureExtraction-lora | 0 | 17,008 | peft | 2023-07-13T14:10:04 | ---
library_name: peft
pipeline_tag: text-generation
---
## Training procedure
### Framework versions
- PEFT 0.4.0.dev0 | 122 | [
[
-0.029998779296875,
-0.0157928466796875,
0.01493072509765625,
0.058868408203125,
-0.0097808837890625,
0.0160675048828125,
0.04296875,
0.007419586181640625,
0.020965576171875,
0.058349609375,
-0.043487548828125,
-0.0167694091796875,
-0.0278472900390625,
0.01305389404296875,
-0.02880859375,
0.05169677734375,
-0.0185546875,
0.022125244140625,
0.01258087158203125,
0.00875091552734375,
-0.000007450580596923828,
-0.051300048828125,
-0.07342529296875,
-0.02874755859375,
0.0212860107421875,
0.046905517578125,
0.0037326812744140625,
0.01435089111328125,
0.03692626953125,
0.017120361328125,
0.0055694580078125,
-0.060211181640625,
-0.026519775390625,
0.00653076171875,
-0.01416015625,
0.01265716552734375,
-0.08074951171875,
-0.0017347335815429688,
0.03173828125,
0.04315185546875,
-0.0147247314453125,
0.0037326812744140625,
-0.025421142578125,
0.0595703125,
-0.016143798828125,
-0.00860595703125,
-0.0249481201171875,
0.04022216796875,
-0.01387786865234375,
-0.0015468597412109375,
0.002117156982421875,
-0.0223541259765625,
0.0145111083984375,
-0.045166015625,
0.03106689453125,
-0.027435302734375,
0.053314208984375,
-0.0035114288330078125,
-0.04571533203125,
0.035888671875,
-0.0193328857421875,
0.0360107421875,
-0.0289459228515625,
0.07928466796875,
0.0477294921875,
0.0257720947265625,
-0.006389617919921875,
-0.07763671875,
-0.007381439208984375,
0.0188140869140625,
0.039398193359375,
-0.030731201171875,
0.0272064208984375,
0.0088043212890625,
0.050323486328125,
0.038848876953125,
-0.0164794921875,
0.0047454833984375,
-0.07208251953125,
-0.042236328125,
0.03778076171875,
-0.0189971923828125,
-0.0105743408203125,
0.040771484375,
-0.026275634765625,
0.00969696044921875,
-0.0777587890625,
-0.0237579345703125,
0.0078582763671875,
0.0006999969482421875,
-0.0419921875,
0.01152801513671875,
-0.034881591796875,
0.026214599609375,
0.02947998046875,
0.01078033447265625,
0.08819580078125,
-0.01690673828125,
-0.05938720703125,
0.01113128662109375,
0.0125274658203125,
0.007106781005859375,
0.007251739501953125,
-0.007198333740234375,
-0.025421142578125,
-0.038177490234375,
0.021514892578125,
-0.06878662109375,
-0.043548583984375,
0.004901885986328125,
-0.01174163818359375,
-0.053558349609375,
0.006664276123046875,
-0.0687255859375,
-0.0161895751953125,
0.01216888427734375,
0.00579833984375,
-0.0247650146484375,
-0.04681396484375,
0.01678466796875,
-0.0308990478515625,
0.05120849609375,
0.0229034423828125,
-0.08856201171875,
0.04132080078125,
0.04022216796875,
0.0135040283203125,
0.040008544921875,
-0.033905029296875,
-0.040252685546875,
0.0185089111328125,
-0.0233154296875,
0.05035400390625,
0.01904296875,
-0.044281005859375,
0.00044345855712890625,
0.034637451171875,
-0.005863189697265625,
-0.032196044921875,
0.06317138671875,
-0.0200042724609375,
0.03167724609375,
-0.0108489990234375,
-0.033538818359375,
-0.00997161865234375,
0.0303192138671875,
-0.031585693359375,
0.07684326171875,
0.054229736328125,
-0.0059967041015625,
0.024322509765625,
-0.0804443359375,
-0.034515380859375,
0.006072998046875,
-0.013397216796875,
-0.03375244140625,
-0.00849151611328125,
-0.02410888671875,
0.01514434814453125,
0.01641845703125,
0.0010633468627929688,
-0.01476287841796875,
-0.0162200927734375,
-0.040191650390625,
-0.021026611328125,
0.0701904296875,
0.0218963623046875,
-0.047607421875,
0.0056610107421875,
-0.05401611328125,
0.01165771484375,
0.001842498779296875,
-0.022003173828125,
0.037017822265625,
-0.044097900390625,
-0.01340484619140625,
0.01145172119140625,
0.0258941650390625,
-0.04034423828125,
0.02197265625,
-0.034393310546875,
0.033050537109375,
0.049896240234375,
0.00797271728515625,
0.04022216796875,
-0.01934814453125,
0.050933837890625,
-0.006011962890625,
0.032196044921875,
0.01421356201171875,
-0.0250244140625,
-0.049102783203125,
-0.0084381103515625,
0.020294189453125,
0.0469970703125,
-0.01132965087890625,
0.035308837890625,
0.0243988037109375,
-0.034423828125,
0.0311126708984375,
0.01021575927734375,
0.032623291015625,
0.0250701904296875,
0.00894927978515625,
0.0079803466796875,
-0.0675048828125,
-0.042572021484375,
0.04180908203125,
-0.0219573974609375,
0.007434844970703125,
-0.025909423828125,
0.0513916015625,
-0.037872314453125,
-0.0010824203491210938,
-0.01065826416015625,
-0.0052947998046875,
0.008941650390625,
-0.0011997222900390625,
0.0271759033203125,
0.065185546875,
0.09149169921875,
-0.060302734375,
-0.03607177734375,
-0.047515869140625,
-0.018463134765625,
-0.0302581787109375,
0.0133514404296875,
-0.0163421630859375,
0.038848876953125,
0.0413818359375,
-0.006748199462890625,
0.033111572265625,
0.033477783203125,
-0.058868408203125,
0.036102294921875,
0.01349639892578125,
0.0083465576171875,
-0.047515869140625,
-0.0021495819091796875,
-0.01415252685546875,
-0.038482666015625,
-0.004001617431640625,
0.006938934326171875,
-0.0056610107421875,
-0.0093841552734375,
-0.07232666015625,
0.040008544921875,
-0.0290679931640625,
0.00426483154296875,
0.0253753662109375,
-0.05267333984375,
0.023712158203125,
0.00739288330078125,
0.0116729736328125,
0.061614990234375,
0.06201171875,
-0.032745361328125,
0.043487548828125,
0.0355224609375,
0.039520263671875,
0.04351806640625,
-0.041473388671875,
0.0142974853515625,
0.035736083984375,
-0.01428985595703125,
-0.02117919921875,
-0.03564453125,
0.04742431640625,
-0.00916290283203125,
0.03424072265625,
0.0167388916015625,
-0.0173187255859375,
-0.047088623046875,
-0.056640625,
0.016815185546875,
0.052947998046875,
-0.0108184814453125,
0.008636474609375,
0.0232696533203125,
0.0060272216796875,
-0.0254974365234375,
-0.048583984375,
-0.006107330322265625,
-0.0020198822021484375,
-0.023040771484375,
0.02520751953125,
-0.03839111328125,
-0.0005240440368652344,
-0.04595947265625,
-0.025634765625,
-0.03936767578125,
-0.009002685546875,
0.0213470458984375,
-0.00853729248046875,
-0.00962066650390625,
-0.002307891845703125,
0.01056671142578125,
-0.0176544189453125,
0.005794525146484375,
0.0025043487548828125,
0.0247955322265625,
-0.0263671875,
-0.0145263671875,
-0.00931549072265625,
0.0223541259765625,
0.0244903564453125,
-0.003932952880859375,
0.029266357421875,
0.033721923828125,
-0.040069580078125,
0.008941650390625,
-0.029937744140625,
-0.0171051025390625,
-0.036041259765625,
0.018798828125,
-0.03021240234375,
-0.0699462890625,
0.024322509765625,
-0.02069091796875,
0.0076446533203125,
0.041107177734375,
0.0182342529296875,
-0.006565093994140625,
0.0330810546875,
0.055908203125,
0.029327392578125,
0.04296875,
-0.017059326171875,
0.0189208984375,
-0.0401611328125,
-0.035430908203125,
-0.042266845703125,
0.0198211669921875,
-0.001171112060546875,
-0.00943756103515625,
0.01153564453125,
0.045501708984375,
-0.07855224609375,
0.04022216796875,
-0.054351806640625,
0.04095458984375,
0.0247955322265625,
-0.0049896240234375,
-0.0149993896484375,
-0.007266998291015625,
-0.01214599609375,
0.014190673828125,
-0.03607177734375,
-0.0260772705078125,
0.08721923828125,
0.0672607421875,
0.0256805419921875,
-0.0006647109985351562,
0.0723876953125,
0.0036792755126953125,
0.021484375,
-0.004520416259765625,
0.047332763671875,
-0.0018091201782226562,
-0.07135009765625,
-0.0247955322265625,
0.0079803466796875,
-0.041015625,
-0.0251312255859375,
-0.00604248046875,
-0.0308380126953125,
0.012969970703125,
0.005584716796875,
-0.0207672119140625,
0.00823974609375,
-0.052337646484375,
0.1256103515625,
-0.03741455078125,
0.03515625,
0.0245513916015625,
-0.0413818359375,
-0.002471923828125,
-0.043060302734375,
-0.014190673828125,
-0.015167236328125,
-0.0210418701171875,
0.05828857421875,
-0.0268707275390625,
0.045166015625,
-0.01715087890625,
-0.004962921142578125,
-0.0136260986328125,
-0.0181121826171875,
0.0139312744140625,
0.007251739501953125,
-0.0219268798828125,
0.01117706298828125,
0.0433349609375,
-0.051055908203125,
0.006374359130859375,
0.027587890625,
-0.039703369140625,
-0.0115203857421875,
-0.051177978515625,
0.0110015869140625,
0.004825592041015625,
0.03369140625,
0.042327880859375,
0.0267486572265625,
-0.018585205078125,
0.0053558349609375,
0.0665283203125,
0.01036834716796875,
0.039306640625,
0.034637451171875,
-0.054168701171875,
-0.0462646484375,
0.0316162109375,
-0.0040740966796875,
-0.0013637542724609375,
0.01294708251953125,
-0.016021728515625,
-0.0263671875,
-0.08099365234375,
-0.0198974609375,
0.01453399658203125,
-0.03607177734375,
-0.02642822265625,
-0.015167236328125,
-0.034942626953125,
-0.0628662109375,
-0.024169921875,
-0.0328369140625,
-0.0343017578125,
-0.050262451171875,
-0.04736328125,
0.037261962890625,
0.01453399658203125,
-0.0443115234375,
0.038604736328125,
-0.0775146484375,
-0.0129547119140625,
0.0145721435546875,
0.060150146484375,
-0.058441162109375,
-0.0294036865234375,
-0.0252227783203125,
-0.002414703369140625,
-0.0660400390625,
-0.07354736328125,
0.005992889404296875,
0.033782958984375,
0.053192138671875,
0.022369384765625,
0.0018720626831054688,
0.0189056396484375,
-0.03094482421875,
0.05987548828125,
0.033538818359375,
-0.058319091796875,
0.063232421875,
-0.032440185546875,
0.00908660888671875,
0.086181640625,
0.048309326171875,
0.0045623779296875,
0.0116119384765625,
-0.08270263671875,
-0.034881591796875,
0.01068115234375,
-0.00849151611328125,
0.016632080078125,
-0.009185791015625,
0.020660400390625,
0.004749298095703125,
0.0484619140625,
-0.060455322265625,
-0.0027904510498046875,
-0.0190277099609375,
0.001605987548828125,
0.034332275390625,
-0.01238250732421875,
-0.0250244140625,
-0.060943603515625,
0.063232421875,
0.01548004150390625,
0.040802001953125,
0.0225372314453125,
0.040679931640625,
-0.033935546875,
-0.00826263427734375,
0.051116943359375,
0.0733642578125,
-0.05120849609375,
-0.0007162094116210938,
0.00007909536361694336,
-0.05010986328125,
0.0192413330078125,
0.035247802734375,
0.017547607421875,
0.001422882080078125,
0.023101806640625,
0.028228759765625,
0.0189666748046875,
-0.041656494140625,
0.041656494140625,
0.005558013916015625,
-0.060516357421875,
-0.085205078125,
0.0207672119140625,
-0.012420654296875,
0.0232391357421875,
0.01097869873046875,
0.0244293212890625,
0.0016183853149414062,
-0.037994384765625,
0.057891845703125,
-0.0181884765625,
-0.052337646484375,
-0.023956298828125,
0.037994384765625,
0.0589599609375,
-0.0655517578125,
0.03839111328125,
-0.045501708984375,
-0.05426025390625,
0.051116943359375,
0.032684326171875,
0.062103271484375,
-0.01514434814453125,
0.03619384765625,
0.0218658447265625,
-0.0169830322265625,
-0.013519287109375,
0.0787353515625,
0.005207061767578125,
-0.0477294921875,
-0.0070037841796875,
-0.052947998046875,
-0.017791748046875,
0.005069732666015625,
-0.01399993896484375,
0.0224761962890625,
-0.060089111328125,
-0.0145721435546875,
0.00951385498046875,
-0.0025691986083984375,
-0.0165252685546875,
0.007965087890625,
0.00958251953125,
0.128173828125,
-0.0406494140625,
0.085205078125,
0.07611083984375,
-0.039337158203125,
-0.050537109375,
-0.0294036865234375,
0.0030422210693359375,
-0.019195556640625,
0.05120849609375,
0.0045623779296875,
0.027374267578125,
0.0257415771484375,
-0.053802490234375,
-0.028656005859375,
0.07354736328125,
0.032867431640625,
-0.039520263671875,
0.0114898681640625,
-0.016937255859375,
0.00926971435546875,
-0.045654296875,
-0.0300445556640625,
0.041656494140625,
0.0283203125,
0.01248931884765625,
-0.10003662109375,
-0.01229095458984375,
-0.0204315185546875,
-0.04351806640625,
0.047332763671875,
-0.0184478759765625,
0.0625,
-0.0026988983154296875,
0.043365478515625,
0.021209716796875,
0.022918701171875,
0.03271484375,
0.0182952880859375,
0.045562744140625,
0.053192138671875,
0.038787841796875,
-0.01274871826171875,
0.034454345703125,
0.0009765625,
0.029998779296875,
0.06683349609375,
-0.0253143310546875,
0.033294677734375,
0.058013916015625,
-0.0308990478515625,
0.045074462890625,
0.07244873046875,
-0.0285797119140625,
0.04278564453125,
-0.0155029296875,
-0.0343017578125,
-0.0165557861328125,
0.018585205078125,
-0.051788330078125,
-0.01020050048828125,
0.004543304443359375,
-0.0462646484375,
-0.0222625732421875,
-0.007778167724609375,
0.01503753662109375,
-0.00331878662109375,
-0.03448486328125,
0.038177490234375,
0.0015659332275390625,
-0.024139404296875,
0.0484619140625,
-0.0009031295776367188,
0.042724609375,
-0.046905517578125,
-0.0287322998046875,
0.00040149688720703125,
0.0030364990234375,
-0.006160736083984375,
-0.050994873046875,
0.05804443359375,
-0.01161956787109375,
-0.0175933837890625,
-0.036834716796875,
0.036163330078125,
-0.007175445556640625,
-0.041900634765625,
0.0196075439453125,
0.0137939453125,
0.0157470703125,
-0.01215362548828125,
-0.0872802734375,
0.01326751708984375,
0.01552581787109375,
0.0054931640625,
0.0173187255859375,
0.0282745361328125,
0.0162353515625,
0.023529052734375,
0.0174407958984375,
-0.003692626953125,
0.00279998779296875,
0.00904083251953125,
0.058746337890625,
-0.03314208984375,
-0.040863037109375,
-0.040130615234375,
0.032623291015625,
-0.0253448486328125,
-0.04608154296875,
0.0179290771484375,
0.0599365234375,
0.08746337890625,
-0.007572174072265625,
0.022186279296875,
-0.03424072265625,
0.038482666015625,
-0.004543304443359375,
0.029266357421875,
-0.0168304443359375,
0.01444244384765625,
0.00911712646484375,
-0.05499267578125,
-0.0224761962890625,
0.048309326171875,
-0.016815185546875,
0.007701873779296875,
0.06829833984375,
0.037994384765625,
-0.0149993896484375,
0.0196380615234375,
0.01123046875,
-0.0131378173828125,
-0.0007796287536621094,
0.034332275390625,
0.0021190643310546875,
-0.054412841796875,
0.0243682861328125,
-0.048370361328125,
-0.03326416015625,
0.003902435302734375,
-0.0806884765625,
-0.04364013671875,
-0.01323699951171875,
-0.0465087890625,
-0.049957275390625,
-0.00769805908203125,
0.0645751953125,
0.07952880859375,
-0.062042236328125,
-0.01904296875,
-0.015289306640625,
0.0043487548828125,
0.0011749267578125,
-0.008514404296875,
0.00437164306640625,
-0.0122528076171875,
-0.0222320556640625,
0.007274627685546875,
0.00582122802734375,
0.0445556640625,
-0.0241851806640625,
-0.0460205078125,
-0.034637451171875,
-0.02099609375,
0.0225830078125,
0.046783447265625,
-0.01168060302734375,
-0.0277862548828125,
-0.013763427734375,
0.00335693359375,
0.02264404296875,
0.042388916015625,
-0.031982421875,
-0.0125885009765625,
0.040863037109375,
0.0127105712890625,
0.036041259765625,
-0.0285186767578125,
0.0576171875,
-0.06439208984375,
0.03216552734375,
0.033721923828125,
0.0252685546875,
0.01268768310546875,
-0.0299530029296875,
0.083740234375,
0.046844482421875,
-0.055999755859375,
-0.0625,
0.0101470947265625,
-0.08184814453125,
-0.002368927001953125,
0.032470703125,
-0.001651763916015625,
-0.0195465087890625,
-0.0181121826171875,
-0.0246429443359375,
0.0012969970703125,
-0.032958984375,
-0.0003082752227783203,
0.029144287109375,
-0.0277557373046875,
-0.039215087890625,
-0.0285186767578125,
0.0565185546875,
-0.0019550323486328125,
-0.0478515625,
-0.0513916015625,
0.03289794921875,
0.0254669189453125,
0.00228118896484375,
0.083984375,
-0.031219482421875,
0.039642333984375,
0.017608642578125,
0.0032062530517578125,
-0.02874755859375,
-0.0211639404296875,
-0.0341796875,
-0.0055694580078125,
-0.0006833076477050781,
-0.021697998046875
]
] |
NousResearch/Nous-Capybara-7B-V1 | 2023-10-28T21:17:12.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama-2",
"sft",
"eng",
"dataset:LDJnr/LessWrong-Amplify-Instruct",
"dataset:LDJnr/Pure-Dove",
"dataset:LDJnr/Verified-Camel",
"license:mit",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | NousResearch | null | null | NousResearch/Nous-Capybara-7B-V1 | 17 | 16,947 | transformers | 2023-09-20T01:08:27 | ---
language:
- eng
tags:
- llama-2
- sft
license:
- mit
datasets:
- LDJnr/LessWrong-Amplify-Instruct
- LDJnr/Pure-Dove
- LDJnr/Verified-Camel
---
## **Nous-Capybara-7B V1**
**MUCH BETTER MISTRAL BASED VERSION IS OUT NOW AS CAPYBARA V1.9**
The Capybara series is made by fine-tuning on data that is created by Nous with our novel data synthesis technique called Amplify-instruct, the seed distribution and synthesis method are comprised of a synergistic combination of top performing existing data synthesis techniques and distributions used for SOTA models such as Airoboros, Evol-Instruct, Orca, Vicuna, Know_Logic, Lamini, FLASK and others, all into one lean holistically formed dataset and model. The seed instructions used for the start of synthesized conversations are largely based on highly datasets like Airoboros, Know logic, EverythingLM, GPTeacher and even entirely new seed instructions derived from posts on the website LessWrong, as well as being supplemented with certain in-house multi-turn datasets like Dove(A successor to Puffin).
While performing great in it's current state, the current dataset used for fine-tuning is entirely contained within 20K training examples, mostly comprised of newly synthesized conversation tokens that have never previously been used for AI training to our knowledge.
This small fine-tune dataset has significant implications for how we'll be able to scale model abilities in the future! This model is currently 20K examples while matching benchmarks of notable 300K example datasets that are 10 times the size!
## Process of creation and special thank yous!
This model was fine-tuned by Nous Research, with LDJ leading the training and dataset curation, along with significant dataset formation contributions by J-Supha, Also thank you to Emozilla for also assisting to expedite the training experimentation process.
Special thank you to **A16Z** for sponsoring our training, as well as **Yield Protocol** for their support in resources during R&D of aspects outside of training, such as dataset development/synthesis.
## Thank you to those of you that have indirectly contributed!
While most of the tokens within Capybara are newly synthsized and part of datasets like Puffin/Dove, we would like to credit the single-turn datasets we leveraged as seeds that are used to generate the multi-turn data as part of the Amplify-Instruct synthesis.
The datasets shown in green below are datasets that we sampled from to curate seeds that are used during Amplify-Instruct synthesis for this project.

## Model Training
Nous-Capybara 7B is a new model trained for multiple epochs on a dataset of roughly 20,000 carefully curated conversational examples, most of which are comprised of entirely new in-house synthesized tokens that previously didn't exist on HuggingFace.
Additional data came from manually curated CamelAI data, with the help of volunteers ranging from former Physics PhD's, Mathematicians, Biologists and more!
## Prompt Format
The reccomended model usage is:
```
USER:
ASSISTANT:
```
## Notable Features:
- The first Nous model trained on over 10,000 multi-turn conversations.
- Over 1,000 tokens average per conversation example and multiple back and forth turns per conversation! Most models are still trained for only single-turn conversations and less than 300 tokens per example!
- Able to effectively do complex summaries of advanced topics and studies.
- Ability to recall information upto late 2022 without internet.
- Includes a portion of conversational data synthesized from less wrong posts, discussing very in-depth about the nature of rationality, reasoning, self-improvement and related concepts.
## Example Outputs!:



## Benchmarks! (Important to note that all mentioned benchmarks are single-turn and don't test multi-turn capabilities, Capybara should excel even further at multi-turn conversational tasks than what benchmark comparisons show.)

## Future Changes
This is a relatively early build amongst the grand plans for the future of Capybara!
[IT IS NOW RECCOMENDED TO USE CAPYBARA V1.9 FOR SIGNIFICANTLY BETTER OVERALL CAPABILITIES]
## Future model sizes
We plan on releasing a 3B, 13B and 70B version, as well as a potential 1B version based on phi-1.5 or similar architectures.
## How you can help!
In the near future we plan on leveraging the help of domain specific expert volunteers to eliminate any mathematically/verifiably incorrect answers from our training curations.
If you have at-least a bachelors in mathematics, physics, biology or chemistry and would like to volunteer even just 30 minutes of your expertise time, please contact LDJ on discord!
## Dataset contamination.
We checked for 100%, 99%, 98% and 97% similarity matches between our data and many popular benchmarks, we found no exact matches!
The following are benchmarks we checked for contamination for:
- HumanEval
- AGIEval
- TruthfulQA
- MMLU
- GPT4All
| 5,301 | [
[
-0.044769287109375,
-0.052764892578125,
-0.005615234375,
0.0178375244140625,
0.0162811279296875,
0.00577545166015625,
-0.0173187255859375,
-0.043609619140625,
0.034088134765625,
0.04052734375,
-0.035186767578125,
-0.005954742431640625,
-0.0219573974609375,
-0.0169219970703125,
-0.041015625,
0.11236572265625,
0.0234222412109375,
0.00506591796875,
-0.007083892822265625,
-0.0277252197265625,
-0.048248291015625,
-0.0276336669921875,
-0.0592041015625,
-0.031463623046875,
0.053375244140625,
0.052093505859375,
0.037811279296875,
0.03228759765625,
0.0217132568359375,
0.015594482421875,
-0.014923095703125,
0.0197296142578125,
-0.047149658203125,
-0.00981903076171875,
0.0130462646484375,
-0.0217742919921875,
-0.052764892578125,
0.0197296142578125,
0.01381683349609375,
0.060516357421875,
-0.0125732421875,
0.0210723876953125,
-0.005168914794921875,
0.03369140625,
-0.0546875,
0.00395965576171875,
-0.06219482421875,
-0.0155487060546875,
-0.0174713134765625,
0.0198516845703125,
-0.022003173828125,
-0.031707763671875,
-0.00693511962890625,
-0.06365966796875,
0.03179931640625,
0.0184478759765625,
0.08349609375,
-0.01410675048828125,
-0.01398468017578125,
-0.0201416015625,
-0.0316162109375,
0.064208984375,
-0.051910400390625,
0.031463623046875,
0.01212310791015625,
0.0221710205078125,
-0.02764892578125,
-0.04534912109375,
-0.054656982421875,
-0.049072265625,
0.01532745361328125,
0.01142120361328125,
-0.00481414794921875,
-0.007343292236328125,
0.032806396484375,
0.053070068359375,
-0.029937744140625,
0.004810333251953125,
-0.0599365234375,
-0.004787445068359375,
0.045318603515625,
0.00907135009765625,
0.0068511962890625,
-0.00506591796875,
-0.0262908935546875,
-0.032135009765625,
-0.051849365234375,
0.0152740478515625,
0.0267486572265625,
0.00711822509765625,
-0.025604248046875,
0.0170745849609375,
-0.035736083984375,
0.0273895263671875,
0.010986328125,
0.00009042024612426758,
0.0235748291015625,
-0.024505615234375,
-0.0221710205078125,
-0.01641845703125,
0.08447265625,
0.0169677734375,
0.004489898681640625,
-0.01045989990234375,
0.004467010498046875,
0.005451202392578125,
0.02471923828125,
-0.055694580078125,
-0.036773681640625,
0.030120849609375,
-0.0230865478515625,
-0.036651611328125,
-0.01398468017578125,
-0.058258056640625,
-0.02447509765625,
-0.006290435791015625,
0.03143310546875,
-0.064453125,
-0.024200439453125,
0.0302581787109375,
-0.0204010009765625,
0.0266876220703125,
0.038238525390625,
-0.0831298828125,
0.032379150390625,
0.0310821533203125,
0.05999755859375,
-0.004520416259765625,
-0.035736083984375,
-0.00846099853515625,
-0.011474609375,
-0.042938232421875,
0.04638671875,
-0.01751708984375,
-0.0130462646484375,
-0.019561767578125,
-0.01493072509765625,
0.002056121826171875,
-0.043304443359375,
0.0601806640625,
-0.01317596435546875,
0.00936126708984375,
-0.0313720703125,
-0.017120361328125,
-0.01739501953125,
0.0218658447265625,
-0.04656982421875,
0.0797119140625,
0.0034084320068359375,
-0.057891845703125,
0.01910400390625,
-0.060699462890625,
-0.0322265625,
-0.01296234130859375,
-0.00415802001953125,
-0.03350830078125,
-0.0084991455078125,
0.023345947265625,
0.025115966796875,
-0.0298919677734375,
0.003025054931640625,
-0.039825439453125,
0.002719879150390625,
0.044708251953125,
-0.02545166015625,
0.060821533203125,
0.0214385986328125,
-0.006153106689453125,
-0.00506591796875,
-0.0631103515625,
0.0035839080810546875,
0.0164794921875,
0.006847381591796875,
0.0069122314453125,
-0.036407470703125,
-0.006595611572265625,
-0.006061553955078125,
0.0256500244140625,
-0.050628662109375,
0.03955078125,
-0.025299072265625,
0.05419921875,
0.0321044921875,
-0.0035076141357421875,
0.0281219482421875,
-0.057952880859375,
0.025146484375,
-0.01233673095703125,
0.05169677734375,
-0.025299072265625,
-0.04779052734375,
-0.08489990234375,
-0.03045654296875,
0.01202392578125,
0.0242919921875,
-0.022003173828125,
0.0035152435302734375,
0.006927490234375,
-0.082275390625,
-0.029449462890625,
-0.01247406005859375,
0.032623291015625,
0.036773681640625,
0.03472900390625,
-0.0244140625,
-0.042938232421875,
-0.0665283203125,
0.0037212371826171875,
-0.00728607177734375,
0.009552001953125,
0.02490234375,
0.043731689453125,
-0.0013399124145507812,
0.0556640625,
-0.04449462890625,
-0.0044097900390625,
-0.0249786376953125,
0.00899505615234375,
0.0499267578125,
0.04833984375,
0.04608154296875,
-0.0292205810546875,
-0.0145721435546875,
0.0003464221954345703,
-0.058135986328125,
0.007732391357421875,
-0.013763427734375,
-0.0254669189453125,
-0.0210723876953125,
0.01654052734375,
-0.069091796875,
0.0230865478515625,
0.028900146484375,
0.0184478759765625,
0.042633056640625,
-0.0165863037109375,
0.02032470703125,
-0.048187255859375,
0.026092529296875,
-0.0211334228515625,
0.003887176513671875,
-0.07183837890625,
0.0012378692626953125,
0.014556884765625,
-0.004119873046875,
-0.0296173095703125,
0.0367431640625,
-0.031036376953125,
0.0174102783203125,
-0.0156402587890625,
0.00583648681640625,
0.00830841064453125,
0.042755126953125,
-0.023223876953125,
0.049957275390625,
0.045196533203125,
-0.07293701171875,
0.03399658203125,
0.054931640625,
-0.0180206298828125,
0.0252685546875,
-0.07940673828125,
0.01338958740234375,
0.01392364501953125,
0.0169219970703125,
-0.064697265625,
-0.0208740234375,
0.024017333984375,
-0.06512451171875,
0.0277252197265625,
0.006134033203125,
-0.005096435546875,
-0.0458984375,
-0.039306640625,
0.0185699462890625,
0.037445068359375,
-0.037322998046875,
0.0004830360412597656,
0.033050537109375,
0.0099029541015625,
-0.055633544921875,
-0.03668212890625,
0.00199127197265625,
-0.0183868408203125,
-0.044891357421875,
0.027587890625,
-0.0182342529296875,
0.00412750244140625,
-0.04638671875,
-0.01068878173828125,
0.0017175674438476562,
0.0012559890747070312,
0.042633056640625,
0.027587890625,
0.0118408203125,
0.003231048583984375,
-0.003787994384765625,
0.0023021697998046875,
-0.00925445556640625,
0.00196075439453125,
0.0535888671875,
-0.0030364990234375,
-0.0077362060546875,
-0.03887939453125,
-0.002532958984375,
0.0576171875,
-0.0267791748046875,
0.048248291015625,
0.034088134765625,
-0.02593994140625,
0.001201629638671875,
-0.060882568359375,
-0.03985595703125,
-0.035186767578125,
0.0096435546875,
-0.030303955078125,
-0.04486083984375,
0.031982421875,
0.0175933837890625,
0.0164947509765625,
0.033203125,
0.037261962890625,
-0.007366180419921875,
0.047027587890625,
0.03094482421875,
-0.0234375,
0.044921875,
-0.0479736328125,
0.01519012451171875,
-0.0771484375,
-0.037139892578125,
-0.04620361328125,
-0.00833892822265625,
-0.05712890625,
-0.017242431640625,
0.0306243896484375,
0.0189971923828125,
-0.0260009765625,
0.05218505859375,
-0.04571533203125,
0.004451751708984375,
0.054351806640625,
0.01178741455078125,
0.00029730796813964844,
-0.00870513916015625,
0.0258331298828125,
0.009918212890625,
-0.061553955078125,
-0.03271484375,
0.10662841796875,
0.019805908203125,
0.050445556640625,
-0.0135040283203125,
0.041595458984375,
0.01537322998046875,
0.0232696533203125,
-0.037139892578125,
0.037841796875,
0.007137298583984375,
-0.0684814453125,
-0.02130126953125,
-0.023529052734375,
-0.09423828125,
0.01202392578125,
-0.0264892578125,
-0.057281494140625,
-0.004779815673828125,
0.003993988037109375,
-0.03216552734375,
0.0135955810546875,
-0.0421142578125,
0.07952880859375,
-0.01398468017578125,
-0.038482666015625,
-0.0209808349609375,
-0.04150390625,
0.03271484375,
0.0186614990234375,
0.013641357421875,
-0.0261077880859375,
0.0180816650390625,
0.06732177734375,
-0.045745849609375,
0.04901123046875,
0.00787353515625,
0.0227813720703125,
0.034332275390625,
0.0014286041259765625,
0.02728271484375,
0.0260162353515625,
0.0009350776672363281,
0.02935791015625,
0.01296234130859375,
-0.03839111328125,
-0.015472412109375,
0.06719970703125,
-0.08380126953125,
-0.0260467529296875,
-0.0408935546875,
-0.0430908203125,
-0.00685882568359375,
0.0069732666015625,
0.0224456787109375,
0.035186767578125,
-0.01513671875,
0.006336212158203125,
0.05859375,
-0.0343017578125,
0.0306549072265625,
0.0352783203125,
-0.00811767578125,
-0.043731689453125,
0.0701904296875,
0.00891876220703125,
0.014404296875,
0.0289154052734375,
0.01245880126953125,
-0.0070037841796875,
-0.036529541015625,
-0.031097412109375,
0.0308380126953125,
-0.0179290771484375,
-0.02960205078125,
-0.069091796875,
-0.01517486572265625,
-0.05462646484375,
0.0028476715087890625,
-0.052520751953125,
-0.041748046875,
-0.01357269287109375,
0.01500701904296875,
0.051849365234375,
0.06298828125,
-0.01267242431640625,
0.0250701904296875,
-0.038238525390625,
0.031707763671875,
0.027587890625,
0.00717926025390625,
0.0012664794921875,
-0.0635986328125,
-0.0106048583984375,
0.01317596435546875,
-0.06787109375,
-0.07586669921875,
0.03765869140625,
0.0011167526245117188,
0.035675048828125,
0.0300445556640625,
0.0016489028930664062,
0.059722900390625,
-0.0308380126953125,
0.057342529296875,
0.02728271484375,
-0.05902099609375,
0.03948974609375,
-0.037109375,
-0.00293731689453125,
0.03814697265625,
0.036712646484375,
-0.05120849609375,
-0.04058837890625,
-0.0740966796875,
-0.047149658203125,
0.061859130859375,
0.0401611328125,
0.02008056640625,
0.0032978057861328125,
0.035919189453125,
0.028533935546875,
0.041961669921875,
-0.044891357421875,
-0.0277252197265625,
-0.02056884765625,
0.0184478759765625,
0.019683837890625,
0.0003314018249511719,
0.0009183883666992188,
-0.0309295654296875,
0.05731201171875,
0.007724761962890625,
0.026702880859375,
0.004459381103515625,
0.015655517578125,
0.0144500732421875,
0.005237579345703125,
0.035552978515625,
0.047454833984375,
-0.0296173095703125,
-0.0173492431640625,
0.033660888671875,
-0.03240966796875,
-0.0147247314453125,
-0.005580902099609375,
-0.01288604736328125,
-0.004047393798828125,
0.01471710205078125,
0.062286376953125,
0.003284454345703125,
-0.033599853515625,
0.0350341796875,
0.00879669189453125,
0.0169219970703125,
-0.0310821533203125,
0.0265960693359375,
-0.00380706787109375,
0.01468658447265625,
0.0233917236328125,
0.0124359130859375,
0.0233306884765625,
-0.04449462890625,
0.0009670257568359375,
0.035491943359375,
-0.004268646240234375,
-0.028045654296875,
0.068115234375,
0.006511688232421875,
-0.01056671142578125,
0.042236328125,
-0.03179931640625,
-0.0209503173828125,
0.06060791015625,
0.018524169921875,
0.036529541015625,
-0.01338958740234375,
0.0159454345703125,
0.06329345703125,
0.0206298828125,
-0.00460052490234375,
0.01374053955078125,
0.002979278564453125,
-0.050201416015625,
-0.00653076171875,
-0.040740966796875,
-0.054290771484375,
0.046478271484375,
-0.051605224609375,
0.033721923828125,
-0.04754638671875,
-0.012725830078125,
0.0196380615234375,
0.004779815673828125,
-0.047607421875,
0.03192138671875,
-0.0014181137084960938,
0.0635986328125,
-0.061187744140625,
0.064453125,
0.042266845703125,
-0.05938720703125,
-0.045867919921875,
-0.029693603515625,
-0.03692626953125,
-0.07171630859375,
0.0253753662109375,
0.0284271240234375,
0.01418304443359375,
-0.024658203125,
-0.0657958984375,
-0.053741455078125,
0.0819091796875,
0.03369140625,
-0.039306640625,
0.0097503662109375,
0.0010251998901367188,
0.034637451171875,
-0.006134033203125,
0.07281494140625,
0.0305023193359375,
0.00839996337890625,
0.00994873046875,
-0.07989501953125,
-0.0030975341796875,
-0.00922393798828125,
-0.0052032470703125,
0.001750946044921875,
-0.09344482421875,
0.08868408203125,
-0.0222625732421875,
0.003437042236328125,
0.035675048828125,
0.05859375,
0.026214599609375,
0.03192138671875,
0.00955963134765625,
0.04339599609375,
0.062469482421875,
-0.014923095703125,
0.0657958984375,
-0.037200927734375,
0.02655029296875,
0.0791015625,
0.0003132820129394531,
0.050750732421875,
0.031829833984375,
-0.0310821533203125,
0.0185394287109375,
0.04595947265625,
-0.0004546642303466797,
0.0259246826171875,
0.0123291015625,
-0.005893707275390625,
-0.009674072265625,
-0.021484375,
-0.04913330078125,
0.0215301513671875,
0.0177154541015625,
0.0031070709228515625,
0.022705078125,
0.01092529296875,
0.0226898193359375,
-0.0166168212890625,
-0.0108489990234375,
0.0238800048828125,
-0.004077911376953125,
-0.045867919921875,
0.067626953125,
0.0031147003173828125,
0.0543212890625,
-0.043243408203125,
0.006221771240234375,
-0.05712890625,
0.0137786865234375,
-0.022369384765625,
-0.041748046875,
-0.00333404541015625,
-0.020050048828125,
-0.0017490386962890625,
-0.0215301513671875,
0.0323486328125,
-0.00460052490234375,
-0.01154327392578125,
0.005340576171875,
0.01336669921875,
0.037750244140625,
-0.0157623291015625,
-0.044830322265625,
0.0213165283203125,
-0.006748199462890625,
-0.004917144775390625,
0.0155487060546875,
0.0243682861328125,
-0.0015592575073242188,
0.0645751953125,
0.04241943359375,
0.0030765533447265625,
0.0149688720703125,
-0.0112762451171875,
0.08642578125,
-0.04852294921875,
-0.0007224082946777344,
-0.02764892578125,
0.0260009765625,
-0.0225982666015625,
-0.054534912109375,
0.06719970703125,
0.05938720703125,
0.07232666015625,
-0.00605010986328125,
0.052734375,
-0.035430908203125,
0.03265380859375,
-0.01312255859375,
0.0374755859375,
-0.040740966796875,
0.0016841888427734375,
-0.007511138916015625,
-0.046875,
0.0126800537109375,
0.05120849609375,
-0.01837158203125,
0.005214691162109375,
0.045257568359375,
0.0648193359375,
-0.0031032562255859375,
0.00995635986328125,
0.0264892578125,
0.0257568359375,
0.025299072265625,
0.06640625,
0.06878662109375,
-0.07293701171875,
0.053985595703125,
-0.023193359375,
-0.0304107666015625,
-0.0271148681640625,
-0.0241546630859375,
-0.07958984375,
-0.03289794921875,
-0.03759765625,
-0.037445068359375,
0.0134735107421875,
0.0458984375,
0.060211181640625,
-0.07647705078125,
-0.040313720703125,
-0.01114654541015625,
-0.0006575584411621094,
-0.016571044921875,
-0.0138397216796875,
-0.0063018798828125,
-0.02716064453125,
-0.057159423828125,
0.03204345703125,
0.0146026611328125,
-0.0156402587890625,
-0.013671875,
-0.0010700225830078125,
-0.0198974609375,
0.0088043212890625,
0.0229034423828125,
0.049468994140625,
-0.0284271240234375,
-0.022003173828125,
0.01751708984375,
-0.0168304443359375,
0.029449462890625,
0.03955078125,
-0.0570068359375,
0.0255584716796875,
0.0281524658203125,
0.0183258056640625,
0.07501220703125,
-0.040679931640625,
0.007213592529296875,
-0.0268707275390625,
0.0266265869140625,
0.005382537841796875,
0.020263671875,
0.0269927978515625,
0.00911712646484375,
0.03961181640625,
0.01898193359375,
-0.050933837890625,
-0.060821533203125,
0.01166534423828125,
-0.09429931640625,
0.005298614501953125,
0.0799560546875,
0.0095367431640625,
-0.025848388671875,
0.01436614990234375,
-0.044921875,
0.044921875,
-0.051727294921875,
0.07135009765625,
0.02728271484375,
-0.01319122314453125,
0.0023193359375,
-0.0584716796875,
0.019439697265625,
0.04254150390625,
-0.070556640625,
-0.01078033447265625,
0.024627685546875,
0.0247039794921875,
0.0247802734375,
0.036224365234375,
0.00048422813415527344,
0.008331298828125,
0.016143798828125,
0.0293731689453125,
-0.004779815673828125,
-0.0213623046875,
0.00736236572265625,
0.005962371826171875,
0.00876617431640625,
-0.0484619140625
]
] |
MBZUAI/LaMini-T5-738M | 2023-04-28T12:07:40.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"generated_from_trainer",
"instruction fine-tuning",
"en",
"arxiv:2304.14402",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | MBZUAI | null | null | MBZUAI/LaMini-T5-738M | 26 | 16,946 | transformers | 2023-04-17T05:35:04 | ---
license: cc-by-nc-4.0
tags:
- generated_from_trainer
- instruction fine-tuning
model-index:
- name: flan-t5-small-distil-v2
results: []
language:
- en
pipeline_tag: text2text-generation
widget:
- text: >-
how can I become more healthy?
example_title: example
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
<p align="center" width="100%">
<a><img src="https://raw.githubusercontent.com/mbzuai-nlp/lamini-lm/main/images/lamini.png" alt="Title" style="width: 100%; min-width: 300px; display: block; margin: auto;"></a>
</p>
# LaMini-T5-738M
[]()
This model is one of our LaMini-LM model series in paper "[LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions](https://github.com/mbzuai-nlp/lamini-lm)". This model is a fine-tuned version of [t5-large](https://huggingface.co/t5-large) on [LaMini-instruction dataset](https://huggingface.co/datasets/MBZUAI/LaMini-instruction) that contains 2.58M samples for instruction fine-tuning. For more information about our dataset, please refer to our [project repository](https://github.com/mbzuai-nlp/lamini-lm/).
You can view other models of LaMini-LM series as follows. Models with ✩ are those with the best overall performance given their size/architecture, hence we recommend using them. More details can be seen in our paper.
<table>
<thead>
<tr>
<th>Base model</th>
<th colspan="4">LaMini-LM series (#parameters)</th>
</tr>
</thead>
<tbody>
<tr>
<td>T5</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-t5-61m" target="_blank" rel="noopener noreferrer">LaMini-T5-61M</a></td>
<td><a href="https://huggingface.co/MBZUAI/lamini-t5-223m" target="_blank" rel="noopener noreferrer">LaMini-T5-223M</a></td>
<td><a href="https://huggingface.co/MBZUAI/lamini-t5-738m" target="_blank" rel="noopener noreferrer">LaMini-T5-738M</a></td>
<td></td>
</tr>
<tr>
<td>Flan-T5</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-flan-t5-77m" target="_blank" rel="noopener noreferrer">LaMini-Flan-T5-77M</a>✩</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-flan-t5-248m" target="_blank" rel="noopener noreferrer">LaMini-Flan-T5-248M</a>✩</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-flan-t5-783m" target="_blank" rel="noopener noreferrer">LaMini-Flan-T5-783M</a>✩</td>
<td></td>
</tr>
<tr>
<td>Cerebras-GPT</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-cerebras-111m" target="_blank" rel="noopener noreferrer">LaMini-Cerebras-111M</a></td>
<td><a href="https://huggingface.co/MBZUAI/lamini-cerebras-256m" target="_blank" rel="noopener noreferrer">LaMini-Cerebras-256M</a></td>
<td><a href="https://huggingface.co/MBZUAI/lamini-cerebras-590m" target="_blank" rel="noopener noreferrer">LaMini-Cerebras-590M</a></td>
<td><a href="https://huggingface.co/MBZUAI/lamini-cerebras-1.3b" target="_blank" rel="noopener noreferrer">LaMini-Cerebras-1.3B</a></td>
</tr>
<tr>
<td>GPT-2</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-gpt-124m" target="_blank" rel="noopener noreferrer">LaMini-GPT-124M</a>✩</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-gpt-774m" target="_blank" rel="noopener noreferrer">LaMini-GPT-774M</a>✩</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-gpt-1.5b" target="_blank" rel="noopener noreferrer">LaMini-GPT-1.5B</a>✩</td>
<td></td>
</tr>
<tr>
<td>GPT-Neo</td>
<td><a href="https://huggingface.co/MBZUAI/lamini-neo-125m" target="_blank" rel="noopener noreferrer">LaMini-Neo-125M</a></td>
<td><a href="https://huggingface.co/MBZUAI/lamini-neo-1.3b" target="_blank" rel="noopener noreferrer">LaMini-Neo-1.3B</a></td>
<td></td>
<td></td>
</tr>
<tr>
<td>GPT-J</td>
<td colspan="4">coming soon</td>
</tr>
<tr>
<td>LLaMA</td>
<td colspan="4">coming soon</td>
</tr>
</tbody>
</table>
## Use
### Intended use
We recommend using the model to response to human instructions written in natural language.
We now show you how to load and use our model using HuggingFace `pipeline()`.
```python
# pip install -q transformers
from transformers import pipeline
checkpoint = "{model_name}"
model = pipeline('text2text-generation', model = checkpoint)
input_prompt = 'Please let me know your thoughts on the given place and why you think it deserves to be visited: \n"Barcelona, Spain"'
generated_text = model(input_prompt, max_length=512, do_sample=True)[0]['generated_text']
print("Response", generated_text)
```
## Training Procedure
<p align="center" width="100%">
<a><img src="https://raw.githubusercontent.com/mbzuai-nlp/lamini-lm/main/images/lamini-pipeline.drawio.png" alt="Title" style="width: 100%; min-width: 250px; display: block; margin: auto;"></a>
</p>
We initialize with [t5-large](https://huggingface.co/t5-large) and fine-tune it on our [LaMini-instruction dataset](https://huggingface.co/datasets/MBZUAI/LaMini-instruction). Its total number of parameters is 738M.
### Training Hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
## Evaluation
We conducted two sets of evaluations: automatic evaluation on downstream NLP tasks and human evaluation on user-oriented instructions. For more detail, please refer to our [paper]().
## Limitations
More information needed
# Citation
```bibtex
@article{lamini-lm,
author = {Minghao Wu and
Abdul Waheed and
Chiyu Zhang and
Muhammad Abdul-Mageed and
Alham Fikri Aji
},
title = {LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions},
journal = {CoRR},
volume = {abs/2304.14402},
year = {2023},
url = {https://arxiv.org/abs/2304.14402},
eprinttype = {arXiv},
eprint = {2304.14402}
}
``` | 6,363 | [
[
-0.0482177734375,
-0.049835205078125,
0.01511383056640625,
0.020233154296875,
-0.019866943359375,
-0.029266357421875,
-0.00994110107421875,
-0.047943115234375,
0.020477294921875,
0.020294189453125,
-0.061492919921875,
-0.03485107421875,
-0.0411376953125,
0.0047454833984375,
-0.0011348724365234375,
0.06475830078125,
-0.0179595947265625,
-0.0045166015625,
0.0041656494140625,
-0.01493072509765625,
-0.0199737548828125,
-0.0282135009765625,
-0.061187744140625,
-0.03466796875,
0.0187835693359375,
0.004039764404296875,
0.052490234375,
0.06353759765625,
0.0285186767578125,
0.029205322265625,
-0.01482391357421875,
0.0204010009765625,
-0.00817108154296875,
-0.01424407958984375,
0.005542755126953125,
-0.03131103515625,
-0.07086181640625,
0.0087432861328125,
0.054656982421875,
0.0219573974609375,
0.01727294921875,
0.03192138671875,
0.01580810546875,
0.057159423828125,
-0.027557373046875,
0.01519775390625,
0.0008115768432617188,
0.00794219970703125,
-0.018524169921875,
-0.002376556396484375,
-0.016326904296875,
-0.03765869140625,
0.003658294677734375,
-0.044677734375,
-0.00801849365234375,
0.01251983642578125,
0.11114501953125,
0.01247406005859375,
-0.0070648193359375,
-0.010040283203125,
-0.0273590087890625,
0.0743408203125,
-0.06243896484375,
0.012542724609375,
0.043060302734375,
-0.00412750244140625,
0.0043487548828125,
-0.032012939453125,
-0.05126953125,
-0.00293731689453125,
-0.039337158203125,
0.0215911865234375,
-0.0211029052734375,
-0.0238494873046875,
0.04986572265625,
0.01145172119140625,
-0.033111572265625,
0.0025005340576171875,
-0.0341796875,
-0.004421234130859375,
0.052490234375,
0.0137939453125,
0.052490234375,
-0.0193023681640625,
-0.0305328369140625,
-0.013885498046875,
-0.0252838134765625,
0.024444580078125,
0.0267791748046875,
0.022369384765625,
-0.061065673828125,
0.023956298828125,
-0.0015354156494140625,
0.06878662109375,
0.024322509765625,
-0.0257415771484375,
0.047882080078125,
-0.01392364501953125,
-0.02923583984375,
-0.0171661376953125,
0.08074951171875,
0.05029296875,
0.0206451416015625,
0.001312255859375,
-0.003437042236328125,
-0.0232086181640625,
0.005466461181640625,
-0.07568359375,
0.0016603469848632812,
0.026611328125,
-0.041473388671875,
-0.035003662109375,
0.0059051513671875,
-0.063232421875,
-0.0005364418029785156,
-0.032501220703125,
0.014617919921875,
-0.040863037109375,
-0.0162506103515625,
0.01157379150390625,
-0.0023899078369140625,
0.028778076171875,
0.020263671875,
-0.0596923828125,
0.006809234619140625,
0.0295562744140625,
0.05609130859375,
0.0009160041809082031,
-0.0208740234375,
-0.025726318359375,
0.016754150390625,
0.00557708740234375,
0.048797607421875,
-0.0193023681640625,
-0.02593994140625,
-0.013458251953125,
0.0277557373046875,
-0.0289154052734375,
-0.0245513916015625,
0.06817626953125,
-0.005878448486328125,
0.028167724609375,
-0.035736083984375,
-0.0293121337890625,
-0.007411956787109375,
0.0113372802734375,
-0.053253173828125,
0.0755615234375,
0.006961822509765625,
-0.08074951171875,
0.00543212890625,
-0.06072998046875,
-0.01678466796875,
-0.020660400390625,
0.0162353515625,
-0.05181884765625,
-0.0172882080078125,
0.019256591796875,
0.036773681640625,
-0.0239105224609375,
-0.0200347900390625,
-0.030853271484375,
-0.0174407958984375,
0.032135009765625,
-0.00872039794921875,
0.0772705078125,
0.00984954833984375,
-0.0482177734375,
-0.0105743408203125,
-0.06622314453125,
0.0168609619140625,
0.022674560546875,
-0.02294921875,
-0.00817108154296875,
-0.0234527587890625,
0.0110321044921875,
0.038726806640625,
0.0287628173828125,
-0.030792236328125,
0.0096893310546875,
-0.033660888671875,
0.031463623046875,
0.059173583984375,
0.0018987655639648438,
0.0306243896484375,
-0.059906005859375,
0.0231781005859375,
-0.0031604766845703125,
0.016876220703125,
0.00904083251953125,
-0.0230712890625,
-0.0673828125,
-0.014892578125,
0.0195770263671875,
0.042510986328125,
-0.0311279296875,
0.045318603515625,
-0.0010833740234375,
-0.0345458984375,
-0.050872802734375,
0.0111083984375,
0.049896240234375,
0.034942626953125,
0.04132080078125,
-0.0154876708984375,
-0.049560546875,
-0.060943603515625,
-0.0010042190551757812,
-0.01264190673828125,
0.0005388259887695312,
0.045806884765625,
0.046905517578125,
-0.0193939208984375,
0.036163330078125,
-0.040191650390625,
-0.01434326171875,
-0.0276031494140625,
0.0056304931640625,
0.018524169921875,
0.06036376953125,
0.056549072265625,
-0.057830810546875,
-0.045623779296875,
0.0011425018310546875,
-0.07568359375,
-0.01363372802734375,
-0.0195770263671875,
-0.036865234375,
0.01548004150390625,
0.01056671142578125,
-0.03741455078125,
0.042938232421875,
0.025299072265625,
-0.036865234375,
0.04150390625,
-0.0208587646484375,
0.00759124755859375,
-0.0948486328125,
0.039154052734375,
0.0357666015625,
0.0011997222900390625,
-0.06439208984375,
0.015899658203125,
-0.012725830078125,
0.0224609375,
-0.0411376953125,
0.06689453125,
-0.035552978515625,
0.01471710205078125,
-0.01120758056640625,
0.02227783203125,
0.0245361328125,
0.044158935546875,
0.0102691650390625,
0.039581298828125,
0.0305633544921875,
-0.034454345703125,
0.0256195068359375,
0.032257080078125,
-0.01361846923828125,
0.054779052734375,
-0.062347412109375,
0.0088958740234375,
-0.007373809814453125,
0.015380859375,
-0.03668212890625,
-0.0201416015625,
0.04180908203125,
-0.0289764404296875,
0.050872802734375,
-0.006336212158203125,
-0.0302581787109375,
-0.049896240234375,
-0.022735595703125,
0.00923919677734375,
0.03436279296875,
-0.0297088623046875,
0.040802001953125,
0.0181732177734375,
0.0214996337890625,
-0.050811767578125,
-0.052001953125,
-0.0247344970703125,
-0.038421630859375,
-0.05828857421875,
0.03717041015625,
-0.0134124755859375,
-0.01074981689453125,
-0.0224761962890625,
-0.00820159912109375,
-0.01519012451171875,
0.00806427001953125,
0.0304412841796875,
0.036224365234375,
-0.014190673828125,
-0.00931549072265625,
-0.016387939453125,
-0.00878143310546875,
0.0125274658203125,
-0.0033721923828125,
0.052886962890625,
-0.02593994140625,
-0.0006866455078125,
-0.10211181640625,
0.006153106689453125,
0.045867919921875,
-0.0223388671875,
0.069091796875,
0.0833740234375,
-0.0224151611328125,
0.0081939697265625,
-0.044097900390625,
-0.00888824462890625,
-0.038848876953125,
-0.016082763671875,
-0.032989501953125,
-0.0302734375,
0.0472412109375,
0.00010389089584350586,
-0.0085601806640625,
0.038848876953125,
0.0272979736328125,
-0.0232391357421875,
0.052093505859375,
0.03167724609375,
-0.026641845703125,
0.032958984375,
-0.0543212890625,
0.00247955322265625,
-0.09661865234375,
-0.0377197265625,
-0.035003662109375,
-0.043975830078125,
-0.03594970703125,
-0.028594970703125,
0.01113128662109375,
0.03277587890625,
-0.047027587890625,
0.039306640625,
-0.042877197265625,
0.01338958740234375,
0.037322998046875,
0.044097900390625,
-0.0045166015625,
-0.0101776123046875,
-0.0308837890625,
-0.0008368492126464844,
-0.0236663818359375,
-0.0458984375,
0.06768798828125,
0.0300445556640625,
0.03369140625,
0.00727081298828125,
0.05694580078125,
0.006237030029296875,
0.005706787109375,
-0.037933349609375,
0.03466796875,
-0.006702423095703125,
-0.032135009765625,
-0.018280029296875,
-0.0287933349609375,
-0.07122802734375,
0.0019521713256835938,
-0.031707763671875,
-0.0804443359375,
0.014678955078125,
0.01447296142578125,
-0.03326416015625,
0.03424072265625,
-0.04278564453125,
0.06927490234375,
-0.0265960693359375,
-0.0673828125,
0.021942138671875,
-0.05126953125,
0.012054443359375,
0.029937744140625,
0.01515960693359375,
-0.0027332305908203125,
0.00891876220703125,
0.053131103515625,
-0.049591064453125,
0.073486328125,
-0.0232086181640625,
-0.006244659423828125,
0.035186767578125,
-0.01546478271484375,
0.042144775390625,
-0.006008148193359375,
-0.027374267578125,
-0.00832366943359375,
-0.008453369140625,
-0.03424072265625,
-0.039825439453125,
0.057281494140625,
-0.07159423828125,
-0.04046630859375,
-0.03839111328125,
-0.029541015625,
0.01042938232421875,
0.01580810546875,
0.0225830078125,
0.0384521484375,
0.006435394287109375,
0.0114898681640625,
0.05120849609375,
-0.018341064453125,
0.0439453125,
0.01120758056640625,
0.0029315948486328125,
-0.0110321044921875,
0.06268310546875,
-0.002040863037109375,
0.01552581787109375,
0.03741455078125,
0.0210418701171875,
-0.0307159423828125,
-0.0177764892578125,
-0.048583984375,
0.046417236328125,
-0.021728515625,
-0.0153350830078125,
-0.041900634765625,
-0.0235748291015625,
-0.0217437744140625,
-0.0267333984375,
-0.0102386474609375,
-0.025604248046875,
-0.048492431640625,
-0.0076446533203125,
0.0333251953125,
0.039306640625,
-0.0114898681640625,
0.0228118896484375,
-0.040771484375,
0.01300048828125,
0.0127105712890625,
0.01351165771484375,
0.011932373046875,
-0.034759521484375,
-0.00738525390625,
0.018707275390625,
-0.03778076171875,
-0.05084228515625,
0.04742431640625,
-0.0026607513427734375,
0.043701171875,
0.03466796875,
0.0038280487060546875,
0.05682373046875,
-0.023101806640625,
0.039154052734375,
0.0212860107421875,
-0.066162109375,
0.044647216796875,
-0.030059814453125,
0.036224365234375,
0.035308837890625,
0.046539306640625,
-0.0254364013671875,
-0.0140228271484375,
-0.047454833984375,
-0.051239013671875,
0.065185546875,
0.01849365234375,
-0.00008469820022583008,
0.01468658447265625,
0.03948974609375,
-0.02801513671875,
-0.0013408660888671875,
-0.068359375,
-0.04046630859375,
-0.0233001708984375,
-0.005321502685546875,
0.0270843505859375,
-0.003147125244140625,
-0.0128021240234375,
-0.0338134765625,
0.0625,
-0.003742218017578125,
0.042205810546875,
0.01202392578125,
-0.0077056884765625,
-0.00738525390625,
0.01702880859375,
0.057891845703125,
0.034637451171875,
-0.026123046875,
-0.022735595703125,
0.024322509765625,
-0.03265380859375,
-0.0017118453979492188,
-0.00514984130859375,
-0.02593994140625,
-0.004764556884765625,
0.022216796875,
0.08056640625,
0.015533447265625,
-0.0122222900390625,
0.035736083984375,
0.00887298583984375,
-0.01067352294921875,
-0.023956298828125,
0.01416015625,
0.015899658203125,
0.02880859375,
0.00045561790466308594,
0.01180267333984375,
0.0016889572143554688,
-0.047454833984375,
0.0200042724609375,
0.0256195068359375,
-0.032318115234375,
-0.024322509765625,
0.0626220703125,
0.0003848075866699219,
-0.01036834716796875,
0.026397705078125,
-0.015960693359375,
-0.05743408203125,
0.0445556640625,
0.052978515625,
0.04681396484375,
-0.025054931640625,
0.025177001953125,
0.07135009765625,
-0.004566192626953125,
-0.004940032958984375,
0.01099395751953125,
0.00033402442932128906,
-0.040985107421875,
0.0036792755126953125,
-0.076416015625,
-0.0021877288818359375,
0.0162506103515625,
-0.073486328125,
0.0293731689453125,
-0.0423583984375,
-0.029449462890625,
-0.0011968612670898438,
0.029510498046875,
-0.055328369140625,
0.0458984375,
0.014190673828125,
0.058135986328125,
-0.047393798828125,
0.0728759765625,
0.040008544921875,
-0.05255126953125,
-0.06951904296875,
0.007251739501953125,
0.0118255615234375,
-0.07476806640625,
0.05908203125,
0.00791168212890625,
-0.0017004013061523438,
-0.00727081298828125,
-0.0279998779296875,
-0.049835205078125,
0.09930419921875,
-0.0074005126953125,
-0.0189971923828125,
-0.0229644775390625,
0.019744873046875,
0.05120849609375,
-0.033905029296875,
0.05181884765625,
0.03857421875,
0.0501708984375,
0.01201629638671875,
-0.06475830078125,
0.0460205078125,
-0.04443359375,
0.0021495819091796875,
0.002712249755859375,
-0.10137939453125,
0.0740966796875,
-0.0033969879150390625,
-0.0027446746826171875,
0.01267242431640625,
0.03802490234375,
0.0249786376953125,
0.0153961181640625,
0.01139068603515625,
0.05792236328125,
0.0419921875,
-0.0165863037109375,
0.08758544921875,
-0.031951904296875,
0.03802490234375,
0.0753173828125,
0.0031890869140625,
0.06988525390625,
0.01486968994140625,
-0.01806640625,
0.05615234375,
0.029510498046875,
-0.025177001953125,
0.01129150390625,
0.02008056640625,
-0.01255035400390625,
-0.00836181640625,
-0.01247406005859375,
-0.039093017578125,
0.0193634033203125,
0.02777099609375,
-0.042266845703125,
0.0028934478759765625,
-0.0237579345703125,
0.03363037109375,
-0.001293182373046875,
-0.0156097412109375,
0.043060302734375,
0.0156402587890625,
-0.035736083984375,
0.0643310546875,
-0.002635955810546875,
0.0482177734375,
-0.036224365234375,
0.01195526123046875,
-0.01389312744140625,
0.01009368896484375,
-0.0272674560546875,
-0.046844482421875,
0.012115478515625,
0.0093994140625,
-0.00797271728515625,
-0.0278472900390625,
0.033111572265625,
-0.015380859375,
-0.048858642578125,
0.0290069580078125,
0.0177764892578125,
0.00939178466796875,
0.0251007080078125,
-0.0902099609375,
0.02294921875,
0.022369384765625,
-0.0306243896484375,
0.0262451171875,
0.017181396484375,
0.0155181884765625,
0.052001953125,
0.0411376953125,
-0.006305694580078125,
0.00608062744140625,
-0.00141143798828125,
0.0673828125,
-0.035797119140625,
-0.007213592529296875,
-0.0684814453125,
0.058868408203125,
-0.029571533203125,
-0.024200439453125,
0.067138671875,
0.040771484375,
0.055145263671875,
-0.008544921875,
0.05010986328125,
-0.0190887451171875,
0.0274505615234375,
-0.043060302734375,
0.07403564453125,
-0.0526123046875,
0.01047515869140625,
-0.030487060546875,
-0.04913330078125,
-0.01552581787109375,
0.06988525390625,
-0.0240325927734375,
0.0203857421875,
0.047607421875,
0.05157470703125,
-0.001953125,
-0.00696563720703125,
-0.00899505615234375,
0.02459716796875,
0.0008425712585449219,
0.062286376953125,
0.03363037109375,
-0.05633544921875,
0.0094451904296875,
-0.043701171875,
-0.00853729248046875,
-0.0257415771484375,
-0.054779052734375,
-0.08612060546875,
-0.044677734375,
-0.031768798828125,
-0.041717529296875,
-0.005702972412109375,
0.07769775390625,
0.0467529296875,
-0.06427001953125,
-0.0287933349609375,
0.01216888427734375,
0.0007219314575195312,
-0.006679534912109375,
-0.02044677734375,
0.051666259765625,
-0.0017185211181640625,
-0.07904052734375,
0.0029659271240234375,
-0.00545501708984375,
0.037628173828125,
0.00847625732421875,
-0.017333984375,
-0.0347900390625,
0.006099700927734375,
0.0193634033203125,
0.039215087890625,
-0.044097900390625,
-0.0218658447265625,
-0.00598907470703125,
-0.01849365234375,
0.0184478759765625,
0.02069091796875,
-0.032562255859375,
0.0155181884765625,
0.0413818359375,
0.0183258056640625,
0.052093505859375,
0.0201873779296875,
0.032470703125,
-0.03961181640625,
0.00890350341796875,
-0.00806427001953125,
0.03143310546875,
0.01103973388671875,
-0.0267791748046875,
0.04180908203125,
0.0176544189453125,
-0.033203125,
-0.055419921875,
-0.0091552734375,
-0.0936279296875,
0.001316070556640625,
0.082275390625,
-0.0272674560546875,
-0.0394287109375,
0.023406982421875,
-0.0216522216796875,
0.037109375,
-0.03765869140625,
0.04193115234375,
0.051055908203125,
-0.0225677490234375,
-0.01274871826171875,
-0.0526123046875,
0.050567626953125,
0.0168304443359375,
-0.06280517578125,
-0.02203369140625,
0.01300048828125,
0.020660400390625,
0.0305328369140625,
0.0247955322265625,
-0.00566864013671875,
0.0083770751953125,
-0.0143585205078125,
0.0005521774291992188,
-0.0088043212890625,
-0.0016231536865234375,
-0.00606536865234375,
-0.00225830078125,
-0.0224761962890625,
-0.00701904296875
]
] |
ArthurZ/flax-tiny-random-bert-sharded | 2022-11-14T06:24:51.000Z | [
"transformers",
"jax",
"bert",
"feature-extraction",
"flax",
"endpoints_compatible",
"region:us"
] | feature-extraction | ArthurZ | null | null | ArthurZ/flax-tiny-random-bert-sharded | 0 | 16,928 | transformers | 2022-06-17T16:08:40 | ---
tags:
- flax
---
# Model Card for flax-tiny-random-bert-sharded
# Model Details
## Model Description
This model is used to check that the sharding of a flax_model works properly. See [`test_checkpoint_sharding_from_hub`](https://github.com/huggingface/transformers/blob/main/tests/test_modeling_flax_common.py#L1049).
# Uses
The model is not designed to be used and serves a testing purpose.
### Software
- Transformers 4.21.0.dev0
- TensorFlow 2.9.0
- Datasets 2.2.2
- Tokenizers 0.12.1
| 502 | [
[
-0.051483154296875,
-0.08245849609375,
0.00919342041015625,
0.0113983154296875,
-0.0308685302734375,
-0.0295257568359375,
0.030120849609375,
-0.01971435546875,
-0.004650115966796875,
0.03314208984375,
-0.047882080078125,
-0.00994110107421875,
-0.0235595703125,
-0.0121002197265625,
-0.07513427734375,
0.09161376953125,
-0.0188446044921875,
0.02362060546875,
-0.0289306640625,
-0.00501251220703125,
-0.0276336669921875,
-0.029205322265625,
-0.06573486328125,
-0.04486083984375,
0.029022216796875,
0.0263214111328125,
0.0748291015625,
0.0225677490234375,
0.04046630859375,
0.02252197265625,
-0.0300750732421875,
-0.00690460205078125,
-0.034332275390625,
-0.0199432373046875,
0.00093841552734375,
-0.03826904296875,
-0.049713134765625,
0.0020084381103515625,
0.052093505859375,
0.046844482421875,
0.01004791259765625,
0.0189056396484375,
-0.01486968994140625,
0.01457977294921875,
-0.0215606689453125,
0.01522064208984375,
-0.0238494873046875,
0.032867431640625,
-0.03143310546875,
0.0059967041015625,
-0.00774383544921875,
-0.019805908203125,
0.0043182373046875,
-0.0233306884765625,
0.050689697265625,
0.02685546875,
0.09771728515625,
-0.0002180337905883789,
-0.0287628173828125,
0.00423431396484375,
-0.06884765625,
0.0572509765625,
-0.0310211181640625,
0.0287322998046875,
0.0372314453125,
0.04071044921875,
0.002346038818359375,
-0.07000732421875,
-0.04345703125,
0.003692626953125,
-0.0175628662109375,
-0.0190277099609375,
-0.00382232666015625,
0.0169525146484375,
0.034332275390625,
0.0718994140625,
-0.018768310546875,
0.0011777877807617188,
-0.05242919921875,
-0.034637451171875,
0.02581787109375,
0.0184326171875,
0.01255035400390625,
-0.002964019775390625,
-0.06329345703125,
-0.03253173828125,
-0.0222625732421875,
-0.002201080322265625,
0.0228424072265625,
0.0113372802734375,
-0.0333251953125,
0.032012939453125,
0.00873565673828125,
0.054901123046875,
0.003757476806640625,
0.00870513916015625,
0.017608642578125,
-0.012298583984375,
-0.037322998046875,
-0.0196533203125,
0.0364990234375,
0.0094146728515625,
-0.0087127685546875,
-0.0113067626953125,
-0.053314208984375,
-0.030120849609375,
0.03350830078125,
-0.07440185546875,
-0.0013055801391601562,
0.0364990234375,
-0.0484619140625,
-0.06787109375,
0.005664825439453125,
-0.045379638671875,
-0.009368896484375,
0.0093841552734375,
0.054718017578125,
-0.0440673828125,
-0.0226898193359375,
-0.01180267333984375,
-0.0340576171875,
0.0361328125,
0.0034084320068359375,
-0.05859375,
0.018096923828125,
0.0655517578125,
0.054901123046875,
0.0007686614990234375,
-0.0399169921875,
-0.007518768310546875,
-0.010986328125,
-0.03338623046875,
0.04931640625,
0.0032978057861328125,
-0.03173828125,
0.007659912109375,
0.0164794921875,
-0.00640869140625,
-0.0223541259765625,
0.07208251953125,
-0.05047607421875,
0.004016876220703125,
-0.052825927734375,
-0.04443359375,
-0.029541015625,
0.0239715576171875,
-0.067626953125,
0.08087158203125,
0.03558349609375,
-0.048492431640625,
0.042022705078125,
-0.05242919921875,
-0.0178680419921875,
0.0187835693359375,
-0.0014562606811523438,
-0.056304931640625,
0.02093505859375,
-0.01708984375,
0.0182037353515625,
-0.00484466552734375,
0.0071563720703125,
-0.034393310546875,
-0.051971435546875,
0.0243072509765625,
-0.01537322998046875,
0.0694580078125,
0.05548095703125,
-0.0064849853515625,
0.01467132568359375,
-0.0258026123046875,
-0.006908416748046875,
0.026519775390625,
0.0015802383422851562,
0.00847625732421875,
-0.00643157958984375,
0.039306640625,
-0.005168914794921875,
0.01216888427734375,
-0.035003662109375,
-0.00237274169921875,
-0.00716400146484375,
0.01983642578125,
0.058349609375,
0.0081634521484375,
0.018218994140625,
-0.0791015625,
0.022796630859375,
0.0137939453125,
0.0210723876953125,
0.0226898193359375,
-0.0478515625,
-0.0888671875,
-0.0240631103515625,
0.0465087890625,
0.0200042724609375,
-0.0307159423828125,
0.040618896484375,
0.006771087646484375,
-0.0576171875,
-0.018280029296875,
0.00440216064453125,
0.0202178955078125,
0.012115478515625,
0.002895355224609375,
-0.0182037353515625,
-0.03717041015625,
-0.07080078125,
0.006256103515625,
-0.0038928985595703125,
0.0091705322265625,
0.01393890380859375,
0.056060791015625,
-0.03656005859375,
0.06304931640625,
-0.01666259765625,
-0.032989501953125,
-0.026947021484375,
0.017822265625,
0.0239410400390625,
0.059478759765625,
0.0697021484375,
-0.053619384765625,
0.01137542724609375,
-0.025787353515625,
-0.048065185546875,
0.01654052734375,
-0.01222991943359375,
-0.03179931640625,
-0.011810302734375,
0.015625,
-0.040008544921875,
0.0328369140625,
0.04327392578125,
-0.03521728515625,
0.03265380859375,
-0.0213165283203125,
0.00852203369140625,
-0.058135986328125,
0.019683837890625,
-0.03192138671875,
-0.0345458984375,
-0.038116455078125,
0.06085205078125,
-0.00334930419921875,
-0.0287322998046875,
-0.04150390625,
0.045806884765625,
-0.0024204254150390625,
-0.01274871826171875,
-0.0256805419921875,
-0.03057861328125,
0.0029468536376953125,
0.01560211181640625,
-0.005825042724609375,
0.052032470703125,
0.02392578125,
-0.053314208984375,
0.027618408203125,
0.0297393798828125,
-0.00791168212890625,
0.06048583984375,
-0.05535888671875,
0.022613525390625,
-0.0018548965454101562,
0.00970458984375,
-0.04730224609375,
-0.024078369140625,
-0.005001068115234375,
-0.01218414306640625,
0.0244293212890625,
-0.003673553466796875,
-0.049652099609375,
-0.05328369140625,
-0.00844573974609375,
0.050567626953125,
0.046539306640625,
-0.054534912109375,
0.058349609375,
0.0204010009765625,
0.039031982421875,
0.0004520416259765625,
-0.06396484375,
-0.043243408203125,
-0.0159454345703125,
-0.06707763671875,
0.01555633544921875,
-0.01418304443359375,
-0.01537322998046875,
-0.0204315185546875,
-0.0028972625732421875,
-0.03594970703125,
0.020050048828125,
0.035125732421875,
0.007190704345703125,
-0.00838470458984375,
0.0087738037109375,
0.0009002685546875,
-0.019805908203125,
0.00887298583984375,
-0.004810333251953125,
0.054351806640625,
-0.034027099609375,
-0.029022216796875,
-0.018341064453125,
0.01384735107421875,
0.0280303955078125,
-0.005889892578125,
0.06292724609375,
0.08612060546875,
-0.058258056640625,
-0.0199127197265625,
-0.049835205078125,
-0.03802490234375,
-0.039031982421875,
0.01076507568359375,
-0.027069091796875,
-0.04534912109375,
0.06195068359375,
0.0288543701171875,
0.01197052001953125,
0.05230712890625,
0.0394287109375,
-0.0168914794921875,
0.04931640625,
0.033905029296875,
0.00640106201171875,
0.03131103515625,
-0.0374755859375,
0.01020050048828125,
-0.041778564453125,
-0.0055389404296875,
-0.04345703125,
-0.037506103515625,
-0.01192474365234375,
-0.035858154296875,
0.01007843017578125,
0.030303955078125,
-0.0176544189453125,
0.0301971435546875,
-0.025177001953125,
0.02520751953125,
0.048095703125,
0.00679779052734375,
-0.003864288330078125,
-0.0006442070007324219,
-0.0257415771484375,
0.00254058837890625,
-0.03253173828125,
-0.0206756591796875,
0.05780029296875,
0.042633056640625,
0.043853759765625,
-0.0089569091796875,
0.04620361328125,
0.03973388671875,
0.01503753662109375,
-0.038970947265625,
0.035400390625,
0.01108551025390625,
-0.10369873046875,
0.0016603469848632812,
-0.017791748046875,
-0.06341552734375,
0.0108489990234375,
-0.0190277099609375,
-0.043731689453125,
0.0101165771484375,
0.032867431640625,
-0.02618408203125,
0.055999755859375,
-0.04327392578125,
0.08489990234375,
-0.0204925537109375,
0.002201080322265625,
-0.01255035400390625,
0.00414276123046875,
0.037841796875,
-0.00539398193359375,
0.0126190185546875,
0.00916290283203125,
0.029205322265625,
0.07098388671875,
-0.07025146484375,
0.05535888671875,
-0.01580810546875,
0.0160980224609375,
0.044952392578125,
0.01332855224609375,
0.041839599609375,
0.001026153564453125,
0.00518035888671875,
0.03173828125,
0.037445068359375,
-0.044952392578125,
0.01495361328125,
0.06341552734375,
-0.0789794921875,
-0.00836181640625,
-0.045562744140625,
-0.0447998046875,
0.00323486328125,
0.039093017578125,
0.0235595703125,
0.025634765625,
-0.022796630859375,
0.0254669189453125,
0.0567626953125,
0.0019969940185546875,
0.024871826171875,
0.0244293212890625,
-0.045501708984375,
-0.0200958251953125,
0.06707763671875,
-0.0103607177734375,
0.016387939453125,
0.0079193115234375,
0.0260009765625,
0.00432586669921875,
-0.038543701171875,
-0.01861572265625,
0.0109100341796875,
-0.0589599609375,
-0.02227783203125,
-0.03759765625,
-0.043243408203125,
-0.019500732421875,
-0.02239990234375,
-0.04046630859375,
-0.0202178955078125,
-0.0211181640625,
-0.024200439453125,
0.036224365234375,
0.045654296875,
-0.013946533203125,
0.0369873046875,
-0.050201416015625,
0.008636474609375,
0.039031982421875,
0.038848876953125,
0.0017299652099609375,
-0.031524658203125,
-0.004566192626953125,
0.0136260986328125,
-0.032989501953125,
-0.051910400390625,
0.0027675628662109375,
0.004669189453125,
0.043182373046875,
0.055816650390625,
-0.00804901123046875,
0.047943115234375,
0.0038299560546875,
0.058685302734375,
-0.017822265625,
-0.06707763671875,
0.01462554931640625,
-0.0308380126953125,
0.01346588134765625,
0.037353515625,
0.013671875,
-0.032867431640625,
-0.01160430908203125,
-0.067626953125,
-0.046783447265625,
0.049530029296875,
0.030517578125,
0.00510406494140625,
0.003627777099609375,
0.01332855224609375,
0.005523681640625,
0.008697509765625,
-0.045745849609375,
-0.03057861328125,
-0.00725555419921875,
0.013519287109375,
0.00772857666015625,
-0.039215087890625,
-0.0212554931640625,
-0.0200347900390625,
0.05242919921875,
0.00937652587890625,
0.0599365234375,
0.0018987655639648438,
-0.01348114013671875,
-0.035125732421875,
-0.0171051025390625,
0.045257568359375,
0.015472412109375,
-0.058502197265625,
-0.006206512451171875,
0.037933349609375,
-0.033447265625,
-0.0308685302734375,
0.0209197998046875,
-0.0413818359375,
0.00417327880859375,
0.01467132568359375,
0.049957275390625,
0.034088134765625,
-0.01282501220703125,
0.04345703125,
-0.01467132568359375,
-0.0243988037109375,
-0.04852294921875,
0.017730712890625,
-0.00469207763671875,
0.007427215576171875,
0.0006322860717773438,
0.037689208984375,
0.0030384063720703125,
-0.01508331298828125,
0.0235595703125,
0.043487548828125,
-0.05926513671875,
-0.0171051025390625,
0.07293701171875,
0.01727294921875,
-0.036041259765625,
0.0611572265625,
-0.0262908935546875,
-0.0263824462890625,
0.051116943359375,
0.0308837890625,
0.0628662109375,
0.01412200927734375,
0.0011768341064453125,
0.034271240234375,
0.027587890625,
-0.002719879150390625,
0.03057861328125,
-0.01446533203125,
-0.05401611328125,
0.0110626220703125,
-0.067626953125,
-0.01543426513671875,
0.0065155029296875,
-0.06182861328125,
0.050079345703125,
-0.05670166015625,
-0.03179931640625,
-0.00592803955078125,
-0.0011320114135742188,
-0.08367919921875,
0.0098876953125,
0.029205322265625,
0.10089111328125,
-0.06597900390625,
0.07177734375,
0.037750244140625,
-0.041900634765625,
-0.052642822265625,
-0.035919189453125,
0.01052093505859375,
-0.0694580078125,
0.01517486572265625,
0.00823211669921875,
0.035125732421875,
-0.0140228271484375,
-0.042510986328125,
-0.06787109375,
0.099853515625,
-0.01453399658203125,
-0.006214141845703125,
-0.00157928466796875,
0.0115814208984375,
0.033660888671875,
-0.01395416259765625,
0.04400634765625,
0.0226593017578125,
0.036956787109375,
0.02996826171875,
-0.05047607421875,
-0.002330780029296875,
-0.00469207763671875,
0.0181884765625,
0.014312744140625,
-0.061737060546875,
0.09100341796875,
0.0002772808074951172,
0.005279541015625,
0.038055419921875,
0.0621337890625,
0.0237884521484375,
0.005741119384765625,
0.0614013671875,
0.055633544921875,
0.046600341796875,
-0.035003662109375,
0.092529296875,
-0.0163726806640625,
0.058807373046875,
0.05230712890625,
-0.004772186279296875,
0.035675048828125,
0.0229034423828125,
-0.008026123046875,
0.02752685546875,
0.04254150390625,
-0.03265380859375,
0.01922607421875,
-0.006191253662109375,
0.016021728515625,
-0.030120849609375,
0.012420654296875,
-0.047637939453125,
0.021759033203125,
0.016998291015625,
-0.052032470703125,
-0.0234832763671875,
-0.0234832763671875,
0.01483917236328125,
-0.043060302734375,
-0.015716552734375,
0.020477294921875,
-0.0027008056640625,
-0.0295257568359375,
0.03326416015625,
0.023529052734375,
0.0321044921875,
-0.052490234375,
0.0130157470703125,
-0.024383544921875,
0.053466796875,
-0.014739990234375,
-0.0199432373046875,
0.0372314453125,
-0.0299072265625,
-0.012481689453125,
-0.0185089111328125,
0.04754638671875,
-0.01096343994140625,
-0.0662841796875,
0.003063201904296875,
-0.006694793701171875,
0.0147705078125,
0.010162353515625,
-0.0655517578125,
0.02362060546875,
-0.0181884765625,
-0.001354217529296875,
0.0181121826171875,
-0.0017576217651367188,
-0.001445770263671875,
0.042266845703125,
0.04730224609375,
-0.017364501953125,
0.00014078617095947266,
0.007091522216796875,
0.04559326171875,
-0.0308074951171875,
-0.04376220703125,
-0.028900146484375,
0.039764404296875,
-0.01308441162109375,
-0.04840087890625,
0.060699462890625,
0.059326171875,
0.060516357421875,
-0.01531982421875,
0.0157318115234375,
-0.0289306640625,
0.0298919677734375,
-0.011199951171875,
0.0780029296875,
-0.03338623046875,
-0.0010280609130859375,
-0.01141357421875,
-0.04937744140625,
0.0091552734375,
0.05712890625,
-0.00080108642578125,
0.02313232421875,
0.059814453125,
0.055999755859375,
-0.0228424072265625,
0.0308685302734375,
0.025543212890625,
0.01715087890625,
0.030914306640625,
0.010009765625,
0.0323486328125,
-0.06085205078125,
0.0265960693359375,
-0.06396484375,
-0.0101776123046875,
-0.023529052734375,
-0.056732177734375,
-0.0999755859375,
-0.03802490234375,
-0.0216064453125,
-0.063720703125,
-0.018402099609375,
0.0484619140625,
0.07403564453125,
-0.0831298828125,
-0.0176849365234375,
-0.00521087646484375,
0.01004791259765625,
-0.041961669921875,
-0.0175933837890625,
0.00814056396484375,
-0.0208740234375,
-0.03497314453125,
-0.013031005859375,
-0.01605224609375,
0.02166748046875,
-0.039520263671875,
-0.005832672119140625,
-0.006366729736328125,
0.00725555419921875,
-0.004375457763671875,
0.0116729736328125,
-0.0303802490234375,
-0.034271240234375,
-0.021636962890625,
-0.01251983642578125,
-0.0234527587890625,
0.051849365234375,
-0.0247039794921875,
0.027099609375,
0.04638671875,
0.0058135986328125,
0.05023193359375,
-0.00757598876953125,
0.045928955078125,
-0.060394287109375,
0.045501708984375,
0.0206146240234375,
0.060943603515625,
0.0062103271484375,
-0.0126800537109375,
0.0172271728515625,
0.022674560546875,
-0.04815673828125,
-0.06622314453125,
0.004634857177734375,
-0.0665283203125,
0.013885498046875,
0.078125,
0.00270843505859375,
-0.054901123046875,
0.0134735107421875,
-0.00922393798828125,
-0.00383758544921875,
-0.009552001953125,
0.045379638671875,
0.042633056640625,
0.03668212890625,
-0.00978851318359375,
-0.0200958251953125,
0.043731689453125,
0.0247802734375,
-0.035919189453125,
-0.036468505859375,
0.0025272369384765625,
0.03472900390625,
0.023345947265625,
0.0210418701171875,
-0.0206756591796875,
0.03167724609375,
0.0263519287109375,
0.00942230224609375,
-0.0209808349609375,
0.0035305023193359375,
-0.007686614990234375,
-0.00279998779296875,
0.014129638671875,
-0.021759033203125
]
] |
aiplanet/effi-13b | 2023-08-20T04:16:56.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"dataset:kaist-ai/CoT-Collection",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | aiplanet | null | null | aiplanet/effi-13b | 7 | 16,901 | transformers | 2023-08-18T13:59:35 | ---
license: apache-2.0
datasets:
- kaist-ai/CoT-Collection
metrics:
- accuracy
pipeline_tag: text-generation
---
# Model card for aiplanet/effi-13b
effi-13B parameters is a causal decoder-only model built by AI Planet based on Llama-2-13b-chat-hf and fine tuned using the 1.8 Million coversations from CoT dataset available in huggingface datasets. The model is made available under the Apache 2.0 license.
## Why use effi-13B-Instruct?
- This is a ready to use chat/instruct model based on Llama-2-13b-chat-hf, which provides a rationale for the context provided.
- Llama-2 is the best open-source model available. This is an instruct model, which may not be ideal for further finetuning. If you are interested in building your own instruct/chat model, we recommend starting from **Llama-2-13b-chat-hf**
You will need at least **85-100GB of memory to swiftly run inference with effi-13b**.
## Model Details
### Model Description
This model has been fine-tuned on Chain of Thought datasets, which has context from mixed sources with corresponding rationale. The final finetuned Large Language Model(LLM) have shown enhanced capabilities of solving novel tasks by providing a reasoning.
- **Developed by:** AI Planet
- **Model type:** Casual Decoder only
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Finetuned from model:** Llama-2-13b-chat-hf
### Direct Use
effi-13b has been finetuned on a Chain of Thought dataset.
### Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
This model has been majorly trained on English data, and will not generalize appropriately to other languages. Furthermore, as it is trained on a large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
### Recommendations
We recommend users of effi-13b to develop guardrails and take appropriate precautions for any production use.
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
```
from transformers import (AutoModelForCausalLM, AutoTokenizer, pipeline)
model_card = "aiplanet/effi-13b"
#
model = AutoModelForCausalLM.from_pretrained(model_card)
tokenizer = AutoTokenizer.from_pretrained(model_card)
#
generate_text = transformers.pipeline(
model=model, tokenizer=tokenizer,
return_full_text=True, # langchain expects the full text
task='text-generation',
# we pass model parameters here too
temperature=0.4, # 'randomness' of outputs, 0.0 is the min and 1.0 the max
max_new_tokens=512, # mex number of tokens to generate in the output
repetition_penalty=1.1 # without this output begins repeating
)
#
promt = """
Can you explain this code in detail?
def generate_stream(tokenizer, model, params, device,
context_len=2048, stream_interval=2):
prompt = params["prompt"]
l_prompt = len(prompt)
temperature = float(params.get("temperature", 1.0))
max_new_tokens = int(params.get("max_new_tokens", 256))
stop_str = params.get("stop", None)
input_ids = tokenizer(prompt).input_ids
output_ids = list(input_ids)
max_src_len = context_len - max_new_tokens - 8
input_ids = input_ids[-max_src_len:]
for i in range(max_new_tokens):
if i == 0:
out = model(
torch.as_tensor([input_ids], device=device), use_cache=True)
logits = out.logits
past_key_values = out.past_key_values
else:
attention_mask = torch.ones(
1, past_key_values[0][0].shape[-2] + 1, device=device)
out = model(input_ids=torch.as_tensor([[token]], device=device),
use_cache=True,
attention_mask=attention_mask,
past_key_values=past_key_values)
logits = out.logits
past_key_values = out.past_key_values
last_token_logits = logits[0][-1]
if device == "mps":
# Switch to CPU by avoiding some bugs in mps backend.
last_token_logits = last_token_logits.float().to("cpu")
if temperature < 1e-4:
token = int(torch.argmax(last_token_logits))
else:
probs = torch.softmax(last_token_logits / temperature, dim=-1)
token = int(torch.multinomial(probs, num_samples=1))
output_ids.append(token)
if token == tokenizer.eos_token_id:
stopped = True
else:
stopped = False
if i % stream_interval == 0 or i == max_new_tokens - 1 or stopped:
output = tokenizer.decode(output_ids, skip_special_tokens=True)
pos = output.rfind(stop_str, l_prompt)
if pos != -1:
output = output[:pos]
stopped = True
yield output
if stopped:
break
del past_key_values
"""
#
system_message = "Given your chain of thought reasoning, provide a rationale for the context in the source."
prompt = f"[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n{prompt}. [/INST]" # replace the command here with something relevant to your task
#
result = generate_text(prompt)
print(result[0]['generated_text'].strip().split("[/INST]")[-1])
```
## Training Details
### Training Data
effi-13b has been finetuned on https://huggingface.co/datasets/kaist-ai/CoT-Collection
The data was tokenized with the **meta-llama/Llama-2-13b-chat-hf** tokenizer.
### Training Procedure
Fine-tuning approach using PefT and Qlora(https://huggingface.co/blog/4bit-transformers-bitsandbytes)
#### Training Hyperparameters
- **Training regime:**
- lora_alpha=32,
- lora_dropout=0.05,
- r=8,
- bias="none",
- task_type="CAUSAL_LM"
#
- load_in_4bit=True,
- bnb_4bit_quant_type = "nf4",
- bnb_4bit_use_double_quant=True,
- bnb_4bit_compute_dtype=torch.bfloat16
#
- num_train_epochs = 1
- fp16 = False
- bf16 = False
- per_device_train_batch_size = 1
- per_device_eval_batch_size = 1
- gradient_accumulation_steps = 4
- gradient_checkpointing = True
- max_grad_norm = 0.3
- learning_rate = 2e-4
- weight_decay = 0.001
- optim = "paged_adamw_32bit"
- lr_scheduler_type = "constant"
- max_steps = 500
- warmup_ratio = 0.03
- group_by_length = True
- save_steps = 25
- logging_steps = 5
- max_seq_length = 2048
- packing = False
- device_map = {"": 0}
## Evaluation
Paper coming soon.
See the OpenLLM Leaderboard(https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results.
## Citation
@article{effi-13b,
title={{effi-13b}: an open large language model with state-of-the-art performance},
author={aiplanet},
year={2023}
}
## Model Card Contact
community@aiplanet.com
| 6,954 | [
[
-0.0296630859375,
-0.073486328125,
0.021820068359375,
0.0172271728515625,
-0.03302001953125,
-0.0154876708984375,
-0.024200439453125,
-0.02618408203125,
0.015655517578125,
0.0171356201171875,
-0.05023193359375,
-0.035675048828125,
-0.04608154296875,
0.00959014892578125,
-0.01509857177734375,
0.07183837890625,
0.00159454345703125,
-0.01253509521484375,
0.00189971923828125,
0.0113525390625,
-0.0235443115234375,
-0.03973388671875,
-0.06280517578125,
-0.0216064453125,
0.01551055908203125,
0.0187225341796875,
0.0225982666015625,
0.053680419921875,
0.04266357421875,
0.0271148681640625,
-0.008758544921875,
0.0230560302734375,
-0.0413818359375,
-0.0011682510375976562,
0.010650634765625,
-0.0231170654296875,
-0.030853271484375,
0.0003418922424316406,
0.05572509765625,
0.018463134765625,
-0.004940032958984375,
0.035858154296875,
0.00701141357421875,
0.0259552001953125,
-0.03436279296875,
0.0222625732421875,
-0.052764892578125,
-0.00780487060546875,
-0.016387939453125,
-0.0258941650390625,
-0.0390625,
-0.0071868896484375,
-0.015411376953125,
-0.044708251953125,
0.00228118896484375,
0.01468658447265625,
0.08953857421875,
0.03729248046875,
-0.0201873779296875,
-0.018524169921875,
-0.04931640625,
0.060882568359375,
-0.07708740234375,
0.0185089111328125,
0.02630615234375,
0.002819061279296875,
-0.01351165771484375,
-0.0653076171875,
-0.04949951171875,
-0.01444244384765625,
-0.00421142578125,
0.0136871337890625,
-0.0147705078125,
0.001373291015625,
0.0263519287109375,
0.0180511474609375,
-0.0367431640625,
0.00547027587890625,
-0.036163330078125,
-0.0179290771484375,
0.044952392578125,
0.03070068359375,
0.007724761962890625,
-0.0265045166015625,
-0.03302001953125,
-0.019683837890625,
-0.03564453125,
0.0070648193359375,
0.03082275390625,
0.0200042724609375,
-0.02301025390625,
0.040496826171875,
-0.01055145263671875,
0.048431396484375,
0.01357269287109375,
-0.0270233154296875,
0.032989501953125,
-0.025848388671875,
-0.04156494140625,
-0.002254486083984375,
0.080322265625,
0.02630615234375,
-0.00638580322265625,
0.01479339599609375,
-0.005374908447265625,
-0.0095977783203125,
-0.00872039794921875,
-0.07379150390625,
-0.0179901123046875,
0.0290679931640625,
-0.039306640625,
-0.032440185546875,
0.002719879150390625,
-0.05029296875,
-0.001922607421875,
-0.01348876953125,
0.0310211181640625,
-0.039520263671875,
-0.027374267578125,
0.006816864013671875,
-0.0130157470703125,
0.0295257568359375,
0.006069183349609375,
-0.057708740234375,
0.0194244384765625,
0.03607177734375,
0.061309814453125,
0.0038166046142578125,
-0.034942626953125,
-0.0263671875,
0.004367828369140625,
-0.005580902099609375,
0.044189453125,
-0.0231475830078125,
-0.015716552734375,
-0.0302734375,
0.00433349609375,
-0.021026611328125,
-0.0265655517578125,
0.04150390625,
-0.0279388427734375,
0.0173187255859375,
-0.01313018798828125,
-0.05499267578125,
-0.01206207275390625,
0.013397216796875,
-0.034759521484375,
0.095703125,
-0.011962890625,
-0.07366943359375,
0.01511383056640625,
-0.055755615234375,
-0.01424407958984375,
-0.00238037109375,
-0.005126953125,
-0.036773681640625,
-0.01425933837890625,
0.01352691650390625,
0.04278564453125,
-0.03033447265625,
0.03509521484375,
-0.0212249755859375,
-0.02752685546875,
0.0189666748046875,
-0.051422119140625,
0.087890625,
0.026763916015625,
-0.05059814453125,
0.0189056396484375,
-0.06390380859375,
0.0018968582153320312,
0.021484375,
-0.03375244140625,
0.002086639404296875,
-0.007808685302734375,
0.0152130126953125,
0.01500701904296875,
0.0294647216796875,
-0.048553466796875,
0.006378173828125,
-0.02642822265625,
0.046844482421875,
0.0703125,
0.00621795654296875,
0.0123138427734375,
-0.0201568603515625,
0.0215606689453125,
0.0104827880859375,
0.0180206298828125,
-0.005855560302734375,
-0.050445556640625,
-0.0740966796875,
-0.0267181396484375,
0.002811431884765625,
0.052520751953125,
-0.04046630859375,
0.06231689453125,
-0.00934600830078125,
-0.0557861328125,
-0.034393310546875,
0.006618499755859375,
0.0240478515625,
0.042266845703125,
0.02630615234375,
0.0073394775390625,
-0.052215576171875,
-0.06378173828125,
0.01084136962890625,
-0.024749755859375,
0.0097503662109375,
0.03509521484375,
0.0537109375,
-0.01495361328125,
0.0423583984375,
-0.048797607421875,
-0.0135040283203125,
-0.01885986328125,
0.001506805419921875,
0.03668212890625,
0.054534912109375,
0.035980224609375,
-0.03411865234375,
-0.025054931640625,
-0.0145416259765625,
-0.06707763671875,
-0.0010290145874023438,
-0.0242919921875,
-0.0295257568359375,
0.0229034423828125,
0.0460205078125,
-0.054412841796875,
0.0288848876953125,
0.031341552734375,
-0.0406494140625,
0.039459228515625,
-0.02056884765625,
0.0037994384765625,
-0.0985107421875,
0.0216827392578125,
-0.00357818603515625,
-0.0023365020751953125,
-0.032958984375,
0.01041412353515625,
-0.009674072265625,
-0.0031795501708984375,
-0.044342041015625,
0.04559326171875,
-0.0209197998046875,
0.0133209228515625,
-0.0116729736328125,
-0.01129150390625,
-0.00742340087890625,
0.056488037109375,
-0.0022907257080078125,
0.05029296875,
0.045745849609375,
-0.05029296875,
0.034637451171875,
0.009918212890625,
-0.0153045654296875,
0.0185089111328125,
-0.05731201171875,
0.01538848876953125,
0.0009632110595703125,
0.0194854736328125,
-0.08721923828125,
-0.0162200927734375,
0.038116455078125,
-0.04718017578125,
0.019195556640625,
0.01299285888671875,
-0.04376220703125,
-0.034759521484375,
-0.01079559326171875,
0.02520751953125,
0.044769287109375,
-0.035675048828125,
0.044189453125,
0.0165557861328125,
0.01568603515625,
-0.046478271484375,
-0.051788330078125,
-0.002735137939453125,
-0.01751708984375,
-0.050811767578125,
0.028778076171875,
-0.017578125,
-0.01044464111328125,
-0.001834869384765625,
0.0190887451171875,
0.0018014907836914062,
0.01038360595703125,
0.0241546630859375,
0.0235443115234375,
-0.0116119384765625,
-0.0017681121826171875,
-0.0030193328857421875,
-0.00647735595703125,
0.0180816650390625,
-0.027618408203125,
0.06549072265625,
-0.026641845703125,
-0.0158233642578125,
-0.058258056640625,
0.01522064208984375,
0.03204345703125,
-0.006259918212890625,
0.07891845703125,
0.06884765625,
-0.03839111328125,
0.00380706787109375,
-0.036956787109375,
-0.0169525146484375,
-0.0400390625,
0.031280517578125,
-0.031005859375,
-0.046112060546875,
0.0499267578125,
0.0158538818359375,
0.02215576171875,
0.05865478515625,
0.05816650390625,
-0.0018491744995117188,
0.0611572265625,
0.031707763671875,
0.003406524658203125,
0.033447265625,
-0.0482177734375,
0.0057830810546875,
-0.05816650390625,
-0.033203125,
-0.03155517578125,
-0.02191162109375,
-0.043853759765625,
-0.026763916015625,
0.01629638671875,
0.005725860595703125,
-0.050323486328125,
0.0170745849609375,
-0.051605224609375,
0.0186767578125,
0.054229736328125,
0.0222625732421875,
-0.0028285980224609375,
-0.004283905029296875,
-0.0210113525390625,
0.01078033447265625,
-0.053466796875,
-0.033935546875,
0.081787109375,
0.03131103515625,
0.053955078125,
-0.0115203857421875,
0.066650390625,
0.003749847412109375,
0.0106658935546875,
-0.03863525390625,
0.0504150390625,
0.008392333984375,
-0.048095703125,
-0.0272674560546875,
-0.0272064208984375,
-0.058563232421875,
0.0133819580078125,
-0.0096282958984375,
-0.05609130859375,
0.00438690185546875,
0.01085662841796875,
-0.0165863037109375,
0.04931640625,
-0.054351806640625,
0.062744140625,
-0.012237548828125,
-0.02838134765625,
-0.0073394775390625,
-0.045654296875,
0.03533935546875,
0.00936126708984375,
0.005992889404296875,
-0.00768280029296875,
-0.0013475418090820312,
0.08074951171875,
-0.044189453125,
0.071533203125,
-0.021026611328125,
0.01042938232421875,
0.0372314453125,
-0.01385498046875,
0.034332275390625,
-0.002498626708984375,
-0.0156707763671875,
0.0194244384765625,
-0.0049591064453125,
-0.03936767578125,
-0.01509857177734375,
0.058135986328125,
-0.09063720703125,
-0.0484619140625,
-0.034576416015625,
-0.0321044921875,
0.0148468017578125,
0.015777587890625,
0.039886474609375,
0.02813720703125,
-0.0037021636962890625,
0.0137481689453125,
0.038970947265625,
-0.028778076171875,
0.047454833984375,
0.004497528076171875,
-0.0062255859375,
-0.04815673828125,
0.06646728515625,
-0.0017843246459960938,
0.0186614990234375,
0.0266265869140625,
0.01325225830078125,
-0.022247314453125,
-0.02264404296875,
-0.049468994140625,
0.022064208984375,
-0.057769775390625,
-0.033203125,
-0.05865478515625,
-0.03521728515625,
-0.03363037109375,
-0.002574920654296875,
-0.024688720703125,
-0.0215911865234375,
-0.0523681640625,
-0.013946533203125,
0.05145263671875,
0.040374755859375,
-0.01177215576171875,
0.032012939453125,
-0.040679931640625,
0.011474609375,
0.00939178466796875,
0.00835418701171875,
0.01885986328125,
-0.057891845703125,
-0.015960693359375,
0.00768280029296875,
-0.03240966796875,
-0.0601806640625,
0.038055419921875,
0.019256591796875,
0.04229736328125,
0.033447265625,
0.004459381103515625,
0.061065673828125,
-0.021514892578125,
0.06787109375,
0.01873779296875,
-0.07659912109375,
0.04296875,
-0.0188446044921875,
0.0243072509765625,
0.0197296142578125,
0.0194854736328125,
-0.0224761962890625,
-0.032470703125,
-0.059814453125,
-0.06695556640625,
0.060821533203125,
0.0259552001953125,
0.006847381591796875,
-0.0119476318359375,
0.0293426513671875,
-0.00731658935546875,
-0.000032842159271240234,
-0.06121826171875,
-0.04193115234375,
-0.026458740234375,
-0.01513671875,
-0.0035877227783203125,
-0.0109100341796875,
0.0069732666015625,
-0.0411376953125,
0.0653076171875,
0.0006694793701171875,
0.06134033203125,
0.0187835693359375,
-0.005123138427734375,
-0.005889892578125,
-0.0024662017822265625,
0.04632568359375,
0.04296875,
-0.0261383056640625,
0.006214141845703125,
0.0219573974609375,
-0.046142578125,
0.0186614990234375,
0.0094757080078125,
-0.01343536376953125,
-0.0026569366455078125,
0.022796630859375,
0.08160400390625,
0.0030193328857421875,
-0.0281829833984375,
0.019683837890625,
-0.01165771484375,
-0.0288543701171875,
-0.033355712890625,
0.006145477294921875,
0.0237884521484375,
0.01470184326171875,
0.0299835205078125,
0.0096588134765625,
-0.004241943359375,
-0.025146484375,
0.00937652587890625,
0.02752685546875,
-0.0008716583251953125,
-0.0186920166015625,
0.07427978515625,
0.0172882080078125,
-0.021514892578125,
0.05938720703125,
-0.0173492431640625,
-0.03399658203125,
0.0679931640625,
0.053070068359375,
0.062469482421875,
0.000286102294921875,
0.0040740966796875,
0.051971435546875,
0.03057861328125,
0.00974273681640625,
0.030731201171875,
0.007598876953125,
-0.05328369140625,
-0.0300140380859375,
-0.057647705078125,
-0.011505126953125,
0.0219879150390625,
-0.045257568359375,
0.0302886962890625,
-0.04925537109375,
-0.019561767578125,
0.001922607421875,
0.0159759521484375,
-0.0704345703125,
0.01580810546875,
0.0199432373046875,
0.0723876953125,
-0.06890869140625,
0.0650634765625,
0.046417236328125,
-0.034912109375,
-0.07440185546875,
-0.0153656005859375,
-0.00667572021484375,
-0.06842041015625,
0.05023193359375,
0.01517486572265625,
0.0019283294677734375,
0.01538848876953125,
-0.055938720703125,
-0.08038330078125,
0.0953369140625,
0.0254669189453125,
-0.03179931640625,
-0.003932952880859375,
0.0012960433959960938,
0.032928466796875,
-0.02728271484375,
0.038360595703125,
0.033935546875,
0.03515625,
0.01103973388671875,
-0.0673828125,
0.0262603759765625,
-0.032470703125,
-0.01094818115234375,
0.01222991943359375,
-0.0714111328125,
0.08123779296875,
-0.0192108154296875,
-0.0056915283203125,
0.003910064697265625,
0.063720703125,
0.03369140625,
0.00988006591796875,
0.0308990478515625,
0.05560302734375,
0.066650390625,
-0.002117156982421875,
0.071533203125,
-0.033905029296875,
0.0374755859375,
0.06280517578125,
0.0015697479248046875,
0.0716552734375,
0.039215087890625,
-0.01386260986328125,
0.03546142578125,
0.0665283203125,
-0.00736236572265625,
0.035888671875,
0.00971221923828125,
-0.006183624267578125,
-0.01904296875,
-0.001766204833984375,
-0.033447265625,
0.028289794921875,
0.032135009765625,
-0.036590576171875,
-0.007633209228515625,
0.0013885498046875,
0.01088714599609375,
-0.03265380859375,
-0.01313018798828125,
0.05145263671875,
0.00370025634765625,
-0.0462646484375,
0.07257080078125,
0.01439666748046875,
0.0677490234375,
-0.03411865234375,
0.005886077880859375,
-0.003459930419921875,
0.0208587646484375,
-0.0123443603515625,
-0.04302978515625,
0.0175933837890625,
0.000202178955078125,
-0.00885009765625,
-0.006404876708984375,
0.043975830078125,
-0.018280029296875,
-0.0574951171875,
0.0204315185546875,
0.0291900634765625,
0.035614013671875,
0.01239013671875,
-0.0667724609375,
0.0227203369140625,
0.017059326171875,
-0.03839111328125,
0.01453399658203125,
0.0271759033203125,
0.0237884521484375,
0.041839599609375,
0.05841064453125,
0.0001232624053955078,
0.0191497802734375,
-0.01515960693359375,
0.07293701171875,
-0.054595947265625,
-0.02728271484375,
-0.08258056640625,
0.02923583984375,
-0.003082275390625,
-0.05218505859375,
0.045166015625,
0.04791259765625,
0.07147216796875,
0.0014781951904296875,
0.04840087890625,
-0.027374267578125,
0.006679534912109375,
-0.040985107421875,
0.043212890625,
-0.03619384765625,
0.015411376953125,
-0.01165771484375,
-0.06097412109375,
-0.0015230178833007812,
0.053070068359375,
-0.01282501220703125,
0.0022792816162109375,
0.051300048828125,
0.06268310546875,
-0.004169464111328125,
0.0054168701171875,
0.01462554931640625,
0.024810791015625,
0.0297698974609375,
0.054534912109375,
0.0447998046875,
-0.050567626953125,
0.0458984375,
-0.0390625,
-0.020599365234375,
-0.0105438232421875,
-0.06201171875,
-0.07366943359375,
-0.05096435546875,
-0.0186767578125,
-0.035980224609375,
-0.004596710205078125,
0.092041015625,
0.051971435546875,
-0.0523681640625,
-0.0299072265625,
-0.01032257080078125,
-0.00832366943359375,
-0.0175323486328125,
-0.0205841064453125,
0.044189453125,
-0.0258636474609375,
-0.054046630859375,
-0.0023860931396484375,
-0.004833221435546875,
0.004726409912109375,
-0.021026611328125,
-0.0153045654296875,
-0.035980224609375,
0.010284423828125,
0.03662109375,
0.00499725341796875,
-0.0615234375,
-0.009307861328125,
0.01418304443359375,
-0.009521484375,
0.00464630126953125,
0.0303192138671875,
-0.04876708984375,
0.0094146728515625,
0.037689208984375,
0.035736083984375,
0.054901123046875,
-0.0170135498046875,
0.024688720703125,
-0.03912353515625,
0.0223388671875,
0.002407073974609375,
0.03680419921875,
0.0185089111328125,
-0.048828125,
0.0232086181640625,
0.0301971435546875,
-0.04840087890625,
-0.061676025390625,
-0.01239013671875,
-0.07537841796875,
-0.0204315185546875,
0.10015869140625,
-0.01806640625,
-0.03826904296875,
0.0160675048828125,
-0.020599365234375,
0.035003662109375,
-0.041290283203125,
0.042877197265625,
0.047454833984375,
-0.0210723876953125,
-0.0091094970703125,
-0.0263824462890625,
0.0254364013671875,
0.0214385986328125,
-0.063232421875,
-0.0013189315795898438,
0.0270233154296875,
0.0311126708984375,
0.004474639892578125,
0.0555419921875,
0.0129547119140625,
0.00739288330078125,
0.01788330078125,
0.0045013427734375,
-0.0193328857421875,
-0.01453399658203125,
-0.01000213623046875,
-0.0008196830749511719,
-0.00554656982421875,
-0.0259857177734375
]
] |
Salesforce/blip-vqa-capfilt-large | 2023-08-01T14:48:25.000Z | [
"transformers",
"pytorch",
"tf",
"blip",
"question-answering",
"visual-question-answering",
"arxiv:2201.12086",
"license:bsd-3-clause",
"autotrain_compatible",
"has_space",
"region:us"
] | visual-question-answering | Salesforce | null | null | Salesforce/blip-vqa-capfilt-large | 19 | 16,887 | transformers | 2022-12-13T11:37:19 | ---
pipeline_tag: visual-question-answering
tags:
- visual-question-answering
inference: false
languages:
- en
license: bsd-3-clause
---
# BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
Model card for BLIP trained on visual question answering - large architecture (with ViT large backbone).
|  |
|:--:|
| <b> Pull figure from BLIP official repo | Image source: https://github.com/salesforce/BLIP </b>|
## TL;DR
Authors from the [paper](https://arxiv.org/abs/2201.12086) write in the abstract:
*Vision-Language Pre-training (VLP) has advanced the performance for many vision-language tasks. However, most existing pre-trained models only excel in either understanding-based tasks or generation-based tasks. Furthermore, performance improvement has been largely achieved by scaling up the dataset with noisy image-text pairs collected from the web, which is a suboptimal source of supervision. In this paper, we propose BLIP, a new VLP framework which transfers flexibly to both vision-language understanding and generation tasks. BLIP effectively utilizes the noisy web data by bootstrapping the captions, where a captioner generates synthetic captions and a filter removes the noisy ones. We achieve state-of-the-art results on a wide range of vision-language tasks, such as image-text retrieval (+2.7% in average recall@1), image captioning (+2.8% in CIDEr), and VQA (+1.6% in VQA score). BLIP also demonstrates strong generalization ability when directly transferred to videolanguage tasks in a zero-shot manner. Code, models, and datasets are released.*
## Usage
You can use this model for conditional and un-conditional image captioning
### Using the Pytorch model
#### Running the model on CPU
<details>
<summary> Click to expand </summary>
```python
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForQuestionAnswering
processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-capfilt-large ")
model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-capfilt-large ")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt")
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
>>> 1
```
</details>
#### Running the model on GPU
##### In full precision
<details>
<summary> Click to expand </summary>
```python
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForQuestionAnswering
processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-capfilt-large")
model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-capfilt-large").to("cuda")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda")
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
>>> 1
```
</details>
##### In half precision (`float16`)
<details>
<summary> Click to expand </summary>
```python
import torch
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForQuestionAnswering
processor = BlipProcessor.from_pretrained("ybelkada/blip-vqa-capfilt-large")
model = BlipForQuestionAnswering.from_pretrained("ybelkada/blip-vqa-capfilt-large", torch_dtype=torch.float16).to("cuda")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16)
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
>>> 1
```
</details>
## BibTex and citation info
```
@misc{https://doi.org/10.48550/arxiv.2201.12086,
doi = {10.48550/ARXIV.2201.12086},
url = {https://arxiv.org/abs/2201.12086},
author = {Li, Junnan and Li, Dongxu and Xiong, Caiming and Hoi, Steven},
keywords = {Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 4,843 | [
[
-0.0233001708984375,
-0.047119140625,
-0.0014209747314453125,
0.0340576171875,
-0.0263824462890625,
-0.0024242401123046875,
-0.033050537109375,
-0.047637939453125,
-0.0033721923828125,
0.0201263427734375,
-0.0277862548828125,
-0.0271148681640625,
-0.0347900390625,
-0.0005636215209960938,
-0.013031005859375,
0.04730224609375,
0.01366424560546875,
0.00811767578125,
-0.00705718994140625,
0.00310516357421875,
-0.01534271240234375,
-0.0178985595703125,
-0.03985595703125,
-0.0009255409240722656,
0.003383636474609375,
0.0201568603515625,
0.036712646484375,
0.0361328125,
0.0552978515625,
0.03204345703125,
-0.01079559326171875,
0.00914764404296875,
-0.0227203369140625,
-0.02191162109375,
-0.00832366943359375,
-0.0567626953125,
-0.013702392578125,
-0.00012052059173583984,
0.048187255859375,
0.044677734375,
0.003215789794921875,
0.032012939453125,
0.0025730133056640625,
0.03875732421875,
-0.055023193359375,
0.0257415771484375,
-0.058013916015625,
0.004825592041015625,
-0.004810333251953125,
-0.0140228271484375,
-0.0304718017578125,
-0.005016326904296875,
0.007244110107421875,
-0.06463623046875,
0.0428466796875,
0.01035308837890625,
0.1185302734375,
0.0267791748046875,
0.0186309814453125,
-0.01438140869140625,
-0.029052734375,
0.06439208984375,
-0.043701171875,
0.036285400390625,
0.00798797607421875,
0.0226287841796875,
0.0019702911376953125,
-0.06610107421875,
-0.05938720703125,
-0.0145263671875,
-0.00811767578125,
0.030364990234375,
-0.0186309814453125,
-0.0023288726806640625,
0.030853271484375,
0.027252197265625,
-0.048431396484375,
-0.00247955322265625,
-0.0618896484375,
-0.022216796875,
0.03912353515625,
-0.0081024169921875,
0.01457977294921875,
-0.023223876953125,
-0.03594970703125,
-0.03399658203125,
-0.0360107421875,
0.0272674560546875,
-0.003620147705078125,
0.01358795166015625,
-0.02935791015625,
0.052032470703125,
-0.00580596923828125,
0.06549072265625,
0.02178955078125,
-0.0213470458984375,
0.04974365234375,
-0.0236968994140625,
-0.037109375,
-0.0113983154296875,
0.07708740234375,
0.0428466796875,
0.0232696533203125,
0.0029964447021484375,
0.00249481201171875,
0.00218963623046875,
0.0027103424072265625,
-0.07232666015625,
-0.0255889892578125,
0.0188140869140625,
-0.0302276611328125,
-0.012054443359375,
0.0012645721435546875,
-0.07391357421875,
-0.0080108642578125,
-0.00012111663818359375,
0.041015625,
-0.040985107421875,
-0.0146026611328125,
0.0170135498046875,
-0.0268707275390625,
0.0330810546875,
0.0280914306640625,
-0.060546875,
-0.005733489990234375,
0.0215911865234375,
0.069091796875,
0.0122528076171875,
-0.05029296875,
-0.033447265625,
0.0077667236328125,
-0.0235595703125,
0.03717041015625,
-0.007080078125,
-0.0066680908203125,
-0.0018444061279296875,
0.01270294189453125,
-0.005977630615234375,
-0.038726806640625,
-0.004962921142578125,
-0.0237579345703125,
0.018951416015625,
-0.01104736328125,
-0.02069091796875,
-0.024017333984375,
0.0219879150390625,
-0.0220794677734375,
0.06805419921875,
0.0015392303466796875,
-0.06146240234375,
0.045440673828125,
-0.040130615234375,
-0.0217742919921875,
0.0302276611328125,
-0.01947021484375,
-0.040374755859375,
-0.01029205322265625,
0.033660888671875,
0.033447265625,
-0.020538330078125,
0.0015459060668945312,
-0.0162353515625,
-0.0268707275390625,
0.00910186767578125,
-0.023651123046875,
0.0821533203125,
0.0006899833679199219,
-0.0494384765625,
0.0037670135498046875,
-0.058502197265625,
-0.00455474853515625,
0.01715087890625,
-0.023406982421875,
0.0020275115966796875,
-0.019775390625,
0.0160980224609375,
0.01210784912109375,
0.04620361328125,
-0.04766845703125,
0.0007643699645996094,
-0.032928466796875,
0.032684326171875,
0.042205810546875,
-0.0177459716796875,
0.0229949951171875,
-0.0002541542053222656,
0.026336669921875,
0.01280975341796875,
0.0264739990234375,
-0.0241546630859375,
-0.04791259765625,
-0.076171875,
-0.031982421875,
0.0011529922485351562,
0.05224609375,
-0.0728759765625,
0.028564453125,
-0.0192718505859375,
-0.042083740234375,
-0.051605224609375,
0.014739990234375,
0.054412841796875,
0.0618896484375,
0.047088623046875,
-0.02935791015625,
-0.038543701171875,
-0.059539794921875,
0.01953125,
-0.0229034423828125,
-0.00044608116149902344,
0.0224609375,
0.045379638671875,
-0.009185791015625,
0.05560302734375,
-0.040283203125,
-0.0220184326171875,
-0.0233154296875,
0.009124755859375,
0.0308990478515625,
0.047698974609375,
0.054351806640625,
-0.060272216796875,
-0.0279388427734375,
0.00685882568359375,
-0.0628662109375,
0.00975799560546875,
-0.01024627685546875,
-0.0098419189453125,
0.034210205078125,
0.0421142578125,
-0.0531005859375,
0.046142578125,
0.03070068359375,
-0.01540374755859375,
0.05224609375,
-0.0205841064453125,
-0.001247406005859375,
-0.0711669921875,
0.031097412109375,
0.0136566162109375,
-0.0030879974365234375,
-0.0205841064453125,
0.0113067626953125,
0.0127410888671875,
-0.00550079345703125,
-0.05242919921875,
0.048492431640625,
-0.045440673828125,
-0.0186004638671875,
0.00958251953125,
-0.005153656005859375,
0.0041656494140625,
0.054168701171875,
0.02227783203125,
0.062408447265625,
0.07977294921875,
-0.049407958984375,
0.03485107421875,
0.035552978515625,
-0.034088134765625,
0.0258026123046875,
-0.060546875,
-0.00970458984375,
-0.005046844482421875,
-0.013824462890625,
-0.084228515625,
-0.00685882568359375,
0.019805908203125,
-0.053619384765625,
0.0265350341796875,
-0.030303955078125,
-0.028717041015625,
-0.05133056640625,
-0.015838623046875,
0.0192718505859375,
0.040283203125,
-0.0494384765625,
0.0242462158203125,
0.01230621337890625,
0.0112152099609375,
-0.0682373046875,
-0.08135986328125,
0.004486083984375,
0.0108795166015625,
-0.046295166015625,
0.0285186767578125,
-0.0023593902587890625,
0.010498046875,
0.007022857666015625,
0.01513671875,
-0.0025081634521484375,
-0.0212249755859375,
0.0174102783203125,
0.036834716796875,
-0.0294342041015625,
-0.01079559326171875,
-0.02911376953125,
0.005992889404296875,
-0.00568389892578125,
-0.0178375244140625,
0.06414794921875,
-0.0335693359375,
-0.005462646484375,
-0.049072265625,
-0.0017309188842773438,
0.043731689453125,
-0.038177490234375,
0.038543701171875,
0.061248779296875,
-0.01552581787109375,
-0.0036468505859375,
-0.04290771484375,
0.0009469985961914062,
-0.043914794921875,
0.04486083984375,
-0.01824951171875,
-0.031280517578125,
0.04168701171875,
0.024932861328125,
0.002880096435546875,
0.018096923828125,
0.055694580078125,
-0.01450347900390625,
0.042236328125,
0.0548095703125,
0.00418853759765625,
0.053070068359375,
-0.06976318359375,
-0.00463104248046875,
-0.05401611328125,
-0.0299530029296875,
-0.0084228515625,
-0.01116943359375,
-0.0330810546875,
-0.035400390625,
0.0157623291015625,
0.018646240234375,
-0.0262451171875,
0.0185546875,
-0.045806884765625,
0.0168304443359375,
0.063232421875,
0.01438140869140625,
-0.0115203857421875,
0.01319122314453125,
-0.0182037353515625,
0.0040435791015625,
-0.051910400390625,
-0.01277923583984375,
0.07940673828125,
0.0161285400390625,
0.05059814453125,
-0.0205535888671875,
0.036285400390625,
-0.029205322265625,
0.00870513916015625,
-0.0513916015625,
0.05596923828125,
-0.0189361572265625,
-0.043975830078125,
-0.027984619140625,
-0.023223876953125,
-0.06915283203125,
0.018310546875,
-0.0238037109375,
-0.06060791015625,
0.0209503173828125,
0.034332275390625,
-0.0194549560546875,
0.0263519287109375,
-0.06549072265625,
0.0750732421875,
-0.031463623046875,
-0.047393798828125,
0.01195526123046875,
-0.050933837890625,
0.0196685791015625,
0.0245208740234375,
-0.006160736083984375,
0.0242767333984375,
0.0107574462890625,
0.052276611328125,
-0.040435791015625,
0.066650390625,
-0.028228759765625,
0.0291748046875,
0.0275115966796875,
-0.018768310546875,
-0.003505706787109375,
-0.00612640380859375,
0.0167388916015625,
0.02410888671875,
-0.0016698837280273438,
-0.042449951171875,
-0.03717041015625,
0.0098419189453125,
-0.05731201171875,
-0.03704833984375,
-0.0274505615234375,
-0.036376953125,
0.001964569091796875,
0.034454345703125,
0.055389404296875,
0.02484130859375,
0.024169921875,
0.01007843017578125,
0.02349853515625,
-0.03765869140625,
0.05767822265625,
0.01959228515625,
-0.0312042236328125,
-0.0352783203125,
0.07281494140625,
0.0016088485717773438,
0.01508331298828125,
0.0251922607421875,
0.013336181640625,
-0.026702880859375,
-0.043701171875,
-0.054229736328125,
0.037872314453125,
-0.043182373046875,
-0.02764892578125,
-0.0212554931640625,
-0.024017333984375,
-0.038726806640625,
-0.01947021484375,
-0.0360107421875,
-0.007686614990234375,
-0.0302734375,
0.014984130859375,
0.037384033203125,
0.020263671875,
-0.00807952880859375,
0.033172607421875,
-0.035369873046875,
0.03338623046875,
0.03155517578125,
0.020660400390625,
-0.0008478164672851562,
-0.040313720703125,
-0.0107269287109375,
0.01108551025390625,
-0.0213470458984375,
-0.04925537109375,
0.046600341796875,
0.0172271728515625,
0.03179931640625,
0.032928466796875,
-0.03125,
0.08404541015625,
-0.029541015625,
0.05792236328125,
0.041595458984375,
-0.07159423828125,
0.05316162109375,
-0.0002868175506591797,
0.0142974853515625,
0.03424072265625,
0.0200042724609375,
-0.0217742919921875,
-0.0274658203125,
-0.041351318359375,
-0.06689453125,
0.05072021484375,
0.006866455078125,
-0.0043487548828125,
0.0187225341796875,
0.020660400390625,
-0.017669677734375,
0.022247314453125,
-0.06060791015625,
-0.0179443359375,
-0.046875,
-0.0107269287109375,
-0.0158843994140625,
0.007442474365234375,
0.013671875,
-0.051910400390625,
0.030029296875,
-0.007205963134765625,
0.03216552734375,
0.032501220703125,
-0.033172607421875,
-0.004154205322265625,
-0.027069091796875,
0.04290771484375,
0.046600341796875,
-0.0222320556640625,
0.002712249755859375,
-0.00652313232421875,
-0.07135009765625,
-0.017669677734375,
0.00605010986328125,
-0.0254974365234375,
0.0002372264862060547,
0.0389404296875,
0.07147216796875,
-0.0029659271240234375,
-0.0452880859375,
0.05975341796875,
0.00623321533203125,
-0.0182647705078125,
-0.024169921875,
0.0023555755615234375,
-0.00505828857421875,
0.022308349609375,
0.046173095703125,
0.01105499267578125,
-0.01488494873046875,
-0.036529541015625,
0.01540374755859375,
0.035186767578125,
-0.0068359375,
-0.0191192626953125,
0.05413818359375,
-0.003955841064453125,
-0.01483154296875,
0.051483154296875,
-0.0305633544921875,
-0.050262451171875,
0.0618896484375,
0.05047607421875,
0.0333251953125,
-0.0023365020751953125,
0.0225372314453125,
0.04791259765625,
0.034454345703125,
0.003322601318359375,
0.039154052734375,
0.006145477294921875,
-0.06414794921875,
-0.0316162109375,
-0.0587158203125,
-0.0251617431640625,
0.0250701904296875,
-0.04034423828125,
0.031097412109375,
-0.052703857421875,
0.0016546249389648438,
0.0126800537109375,
0.01172637939453125,
-0.064697265625,
0.02935791015625,
0.0162506103515625,
0.06134033203125,
-0.056640625,
0.040008544921875,
0.066650390625,
-0.06939697265625,
-0.06634521484375,
-0.01165008544921875,
-0.028564453125,
-0.08538818359375,
0.06646728515625,
0.0257720947265625,
-0.004787445068359375,
0.00186920166015625,
-0.06744384765625,
-0.056365966796875,
0.07550048828125,
0.0362548828125,
-0.033660888671875,
-0.00431060791015625,
0.01087188720703125,
0.04486083984375,
-0.00843048095703125,
0.0171356201171875,
0.00598907470703125,
0.031768798828125,
0.031524658203125,
-0.0721435546875,
-0.0004749298095703125,
-0.028961181640625,
-0.006504058837890625,
-0.01399993896484375,
-0.05810546875,
0.07696533203125,
-0.03265380859375,
-0.01090240478515625,
-0.0046234130859375,
0.055938720703125,
0.0293731689453125,
0.01441192626953125,
0.028594970703125,
0.045318603515625,
0.04718017578125,
0.00605010986328125,
0.06304931640625,
-0.0225677490234375,
0.035430908203125,
0.05731201171875,
0.0181427001953125,
0.06573486328125,
0.0457763671875,
-0.014678955078125,
0.0218658447265625,
0.0369873046875,
-0.04486083984375,
0.031005859375,
0.00878143310546875,
0.02166748046875,
-0.010009765625,
0.021575927734375,
-0.024017333984375,
0.05908203125,
0.0308837890625,
-0.0223388671875,
-0.00464630126953125,
0.0066986083984375,
-0.00958251953125,
-0.0155792236328125,
-0.037841796875,
0.0206756591796875,
-0.01338958740234375,
-0.045989990234375,
0.0782470703125,
-0.0193023681640625,
0.08062744140625,
-0.0204315185546875,
-0.00455474853515625,
-0.01155853271484375,
0.0181427001953125,
-0.0215301513671875,
-0.07171630859375,
0.0089874267578125,
0.0006113052368164062,
0.0007967948913574219,
0.0023937225341796875,
0.023956298828125,
-0.033203125,
-0.07135009765625,
0.0177764892578125,
0.0192718505859375,
0.02740478515625,
0.01157379150390625,
-0.06732177734375,
0.0027637481689453125,
0.005863189697265625,
-0.01523590087890625,
-0.00940704345703125,
0.0280914306640625,
0.008575439453125,
0.055023193359375,
0.053741455078125,
0.03070068359375,
0.048828125,
-0.0079193115234375,
0.059539794921875,
-0.042510986328125,
-0.0292205810546875,
-0.0501708984375,
0.0438232421875,
-0.01384735107421875,
-0.050811767578125,
0.045806884765625,
0.06298828125,
0.07965087890625,
-0.0210113525390625,
0.041534423828125,
-0.019012451171875,
0.01100921630859375,
-0.046600341796875,
0.06219482421875,
-0.05633544921875,
-0.0099945068359375,
-0.036529541015625,
-0.04730224609375,
-0.042510986328125,
0.07550048828125,
-0.0212554931640625,
-0.0012311935424804688,
0.040771484375,
0.08447265625,
-0.0219879150390625,
-0.040985107421875,
0.0189666748046875,
0.02642822265625,
0.0169219970703125,
0.052520751953125,
0.0452880859375,
-0.040924072265625,
0.053436279296875,
-0.04547119140625,
-0.0178985595703125,
-0.01312255859375,
-0.04815673828125,
-0.07366943359375,
-0.05780029296875,
-0.0304718017578125,
-0.01922607421875,
-0.00441741943359375,
0.0355224609375,
0.062103271484375,
-0.049896240234375,
-0.022308349609375,
-0.017333984375,
-0.000286102294921875,
-0.01313018798828125,
-0.01525115966796875,
0.0440673828125,
-0.03570556640625,
-0.059478759765625,
-0.005168914794921875,
0.022003173828125,
0.015655517578125,
-0.01404571533203125,
0.006175994873046875,
-0.02203369140625,
-0.0277252197265625,
0.031829833984375,
0.04022216796875,
-0.04876708984375,
-0.01233673095703125,
0.0098419189453125,
-0.007068634033203125,
0.0267486572265625,
0.018707275390625,
-0.049163818359375,
0.034393310546875,
0.027069091796875,
0.0279693603515625,
0.06689453125,
-0.01425933837890625,
0.00739288330078125,
-0.049560546875,
0.06048583984375,
0.00843048095703125,
0.0386962890625,
0.03619384765625,
-0.019073486328125,
0.02191162109375,
0.0287628173828125,
-0.0157623291015625,
-0.0648193359375,
0.003383636474609375,
-0.09716796875,
-0.0191192626953125,
0.08770751953125,
-0.020721435546875,
-0.0557861328125,
0.0087738037109375,
-0.01806640625,
0.0271148681640625,
-0.00931549072265625,
0.045318603515625,
0.0142822265625,
0.0013208389282226562,
-0.039703369140625,
-0.0183563232421875,
0.0310516357421875,
0.0218505859375,
-0.046295166015625,
-0.01264190673828125,
0.028228759765625,
0.033477783203125,
0.04595947265625,
0.04046630859375,
-0.0019588470458984375,
0.039459228515625,
0.0132293701171875,
0.04205322265625,
-0.0173797607421875,
-0.01861572265625,
-0.01335906982421875,
0.00395965576171875,
-0.0121612548828125,
-0.052490234375
]
] |
cross-encoder/msmarco-MiniLM-L6-en-de-v1 | 2021-08-05T08:40:24.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | cross-encoder | null | null | cross-encoder/msmarco-MiniLM-L6-en-de-v1 | 3 | 16,886 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
---
# Cross-Encoder for MS MARCO - EN-DE
This is a cross-lingual Cross-Encoder model for EN-DE that can be used for passage re-ranking. It was trained on the [MS Marco Passage Ranking](https://github.com/microsoft/MSMARCO-Passage-Ranking) task.
The model can be used for Information Retrieval: See [SBERT.net Retrieve & Re-rank](https://www.sbert.net/examples/applications/retrieve_rerank/README.html).
The training code is available in this repository, see `train_script.py`.
## Usage with SentenceTransformers
When you have [SentenceTransformers](https://www.sbert.net/) installed, you can use the model like this:
```python
from sentence_transformers import CrossEncoder
model = CrossEncoder('model_name', max_length=512)
query = 'How many people live in Berlin?'
docs = ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.', 'New York City is famous for the Metropolitan Museum of Art.']
pairs = [(query, doc) for doc in docs]
scores = model.predict(pairs)
```
## Usage with Transformers
With the transformers library, you can use the model like this:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
model = AutoModelForSequenceClassification.from_pretrained('model_name')
tokenizer = AutoTokenizer.from_pretrained('model_name')
features = tokenizer(['How many people live in Berlin?', 'How many people live in Berlin?'], ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.', 'New York City is famous for the Metropolitan Museum of Art.'], padding=True, truncation=True, return_tensors="pt")
model.eval()
with torch.no_grad():
scores = model(**features).logits
print(scores)
```
## Performance
The performance was evaluated on three datasets:
- **TREC-DL19 EN-EN**: The original [TREC 2019 Deep Learning Track](https://microsoft.github.io/msmarco/TREC-Deep-Learning-2019.html): Given an English query and 1000 documents (retrieved by BM25 lexical search), rank documents with according to their relevance. We compute NDCG@10. BM25 achieves a score of 45.46, a perfect re-ranker can achieve a score of 95.47.
- **TREC-DL19 DE-EN**: The English queries of TREC-DL19 have been translated by a German native speaker to German. We rank the German queries versus the English passages from the original TREC-DL19 setup. We compute NDCG@10.
- **GermanDPR DE-DE**: The [GermanDPR](https://www.deepset.ai/germanquad) dataset provides German queries and German passages from Wikipedia. We indexed the 2.8 Million paragraphs from German Wikipedia and retrieved for each query the top 100 most relevant passages using BM25 lexical search with Elasticsearch. We compute MRR@10. BM25 achieves a score of 35.85, a perfect re-ranker can achieve a score of 76.27.
We also check the performance of bi-encoders using the same evaluation: The retrieved documents from BM25 lexical search are re-ranked using query & passage embeddings with cosine-similarity. Bi-Encoders can also be used for end-to-end semantic search.
| Model-Name | TREC-DL19 EN-EN | TREC-DL19 DE-EN | GermanDPR DE-DE | Docs / Sec |
| ------------- |:-------------:| :-----: | :---: | :----: |
| BM25 | 45.46 | - | 35.85 | -|
| **Cross-Encoder Re-Rankers** | | | |
| [cross-encoder/msmarco-MiniLM-L6-en-de-v1](https://huggingface.co/cross-encoder/msmarco-MiniLM-L6-en-de-v1) | 72.43 | 65.53 | 46.77 | 1600 |
| [cross-encoder/msmarco-MiniLM-L12-en-de-v1](https://huggingface.co/cross-encoder/msmarco-MiniLM-L12-en-de-v1) | 72.94 | 66.07 | 49.91 | 900 |
| [svalabs/cross-electra-ms-marco-german-uncased](https://huggingface.co/svalabs/cross-electra-ms-marco-german-uncased) (DE only) | - | - | 53.67 | 260 |
| [deepset/gbert-base-germandpr-reranking](https://huggingface.co/deepset/gbert-base-germandpr-reranking) (DE only) | - | - | 53.59 | 260 |
| **Bi-Encoders (re-ranking)** | | | |
| [sentence-transformers/msmarco-distilbert-multilingual-en-de-v2-tmp-lng-aligned](https://huggingface.co/sentence-transformers/msmarco-distilbert-multilingual-en-de-v2-tmp-lng-aligned) | 63.38 | 58.28 | 37.88 | 940 |
| [sentence-transformers/msmarco-distilbert-multilingual-en-de-v2-tmp-trained-scratch](https://huggingface.co/sentence-transformers/msmarco-distilbert-multilingual-en-de-v2-tmp-trained-scratch) | 65.51 | 58.69 | 38.32 | 940 |
| [svalabs/bi-electra-ms-marco-german-uncased](https://huggingface.co/svalabs/bi-electra-ms-marco-german-uncased) (DE only) | - | - | 34.31 | 450 |
| [deepset/gbert-base-germandpr-question_encoder](https://huggingface.co/deepset/gbert-base-germandpr-question_encoder) (DE only) | - | - | 42.55 | 450 |
Note: Docs / Sec gives the number of (query, document) pairs we can re-rank within a second on a V100 GPU.
| 4,798 | [
[
-0.042816162109375,
-0.055023193359375,
0.0199127197265625,
0.0091552734375,
-0.0178680419921875,
0.005641937255859375,
-0.034576416015625,
-0.0300140380859375,
0.0306396484375,
0.01076507568359375,
-0.03265380859375,
-0.06475830078125,
-0.05816650390625,
0.02392578125,
-0.0222320556640625,
0.0618896484375,
-0.0088958740234375,
0.0232391357421875,
-0.0116119384765625,
-0.01000213623046875,
-0.01291656494140625,
-0.0295257568359375,
-0.050384521484375,
-0.02386474609375,
0.03778076171875,
0.0288238525390625,
0.044921875,
0.033782958984375,
0.036163330078125,
0.0262451171875,
-0.01126861572265625,
0.01122283935546875,
-0.0299835205078125,
-0.00018274784088134766,
-0.0031337738037109375,
-0.018096923828125,
-0.0215301513671875,
-0.00640106201171875,
0.04180908203125,
0.042877197265625,
0.0022830963134765625,
0.004383087158203125,
-0.001697540283203125,
0.043182373046875,
-0.01427459716796875,
0.003154754638671875,
-0.0299072265625,
0.0032062530517578125,
-0.0162353515625,
-0.009063720703125,
-0.032470703125,
-0.01861572265625,
0.003986358642578125,
-0.033233642578125,
0.033966064453125,
-0.00023055076599121094,
0.0989990234375,
0.0220184326171875,
-0.0264739990234375,
-0.01006317138671875,
-0.0244598388671875,
0.04296875,
-0.047607421875,
0.06292724609375,
0.0257415771484375,
0.00811767578125,
0.003246307373046875,
-0.06195068359375,
-0.039642333984375,
-0.01898193359375,
-0.0177154541015625,
0.0247344970703125,
-0.01364898681640625,
-0.01142120361328125,
0.0246734619140625,
0.0377197265625,
-0.06549072265625,
0.0124359130859375,
-0.03863525390625,
-0.0135345458984375,
0.057159423828125,
0.00954437255859375,
0.018463134765625,
-0.0272369384765625,
-0.021087646484375,
-0.0261993408203125,
-0.03790283203125,
0.00974273681640625,
0.0242156982421875,
0.0016984939575195312,
-0.0174713134765625,
0.03961181640625,
-0.0323486328125,
0.051239013671875,
0.022003173828125,
0.006664276123046875,
0.05828857421875,
-0.03179931640625,
-0.0009279251098632812,
-0.0033435821533203125,
0.07745361328125,
0.031707763671875,
0.0137176513671875,
-0.0270843505859375,
-0.0149383544921875,
-0.0006642341613769531,
0.0128021240234375,
-0.0648193359375,
-0.01337432861328125,
0.00897216796875,
-0.030792236328125,
0.00017559528350830078,
0.0205535888671875,
-0.0660400390625,
0.0026569366455078125,
-0.0108795166015625,
0.0251922607421875,
-0.04498291015625,
-0.0007410049438476562,
0.02142333984375,
-0.0168609619140625,
0.0306396484375,
-0.0001329183578491211,
-0.04705810546875,
-0.0009436607360839844,
0.0265655517578125,
0.0635986328125,
-0.0159454345703125,
-0.042510986328125,
-0.005741119384765625,
-0.008514404296875,
-0.02178955078125,
0.045440673828125,
-0.029815673828125,
-0.0299072265625,
-0.00308990478515625,
0.033782958984375,
-0.00917816162109375,
-0.0247344970703125,
0.060943603515625,
-0.0323486328125,
0.044097900390625,
-0.016326904296875,
-0.039215087890625,
-0.0005288124084472656,
0.0180816650390625,
-0.060211181640625,
0.10003662109375,
0.0138397216796875,
-0.0660400390625,
0.02618408203125,
-0.042388916015625,
-0.041259765625,
-0.01263427734375,
0.0019330978393554688,
-0.031463623046875,
-0.0038394927978515625,
0.040008544921875,
0.0245819091796875,
-0.029632568359375,
0.0149078369140625,
-0.00885772705078125,
-0.034576416015625,
0.00595855712890625,
-0.032318115234375,
0.07501220703125,
0.0156707763671875,
-0.043426513671875,
-0.01403045654296875,
-0.052032470703125,
0.0095062255859375,
0.0201873779296875,
-0.04248046875,
-0.00792694091796875,
-0.0281219482421875,
0.01198577880859375,
0.0367431640625,
0.031707763671875,
-0.0509033203125,
0.007068634033203125,
-0.044342041015625,
0.026153564453125,
0.039031982421875,
0.00146484375,
0.030120849609375,
-0.00954437255859375,
0.034576416015625,
-0.0003833770751953125,
0.01898193359375,
0.0069580078125,
-0.037017822265625,
-0.07464599609375,
-0.0035686492919921875,
0.053253173828125,
0.050201416015625,
-0.0643310546875,
0.060211181640625,
-0.062469482421875,
-0.055419921875,
-0.0723876953125,
-0.0026302337646484375,
0.0372314453125,
0.029144287109375,
0.04095458984375,
-0.01525115966796875,
-0.038970947265625,
-0.0838623046875,
-0.00937652587890625,
0.01447296142578125,
-0.004657745361328125,
0.022705078125,
0.054718017578125,
-0.0257720947265625,
0.040313720703125,
-0.0362548828125,
-0.0197906494140625,
-0.021697998046875,
-0.004047393798828125,
0.0209197998046875,
0.0379638671875,
0.04217529296875,
-0.062744140625,
-0.05157470703125,
-0.0066986083984375,
-0.0631103515625,
0.004283905029296875,
0.0175933837890625,
-0.0084991455078125,
0.041290283203125,
0.048187255859375,
-0.045135498046875,
0.024566650390625,
0.024871826171875,
-0.031280517578125,
0.0145721435546875,
-0.036376953125,
0.029815673828125,
-0.08990478515625,
0.0103759765625,
0.001094818115234375,
-0.0008072853088378906,
-0.0254364013671875,
0.005680084228515625,
0.0033855438232421875,
-0.005550384521484375,
-0.03436279296875,
0.03863525390625,
-0.036163330078125,
0.01354217529296875,
0.01380157470703125,
0.020904541015625,
0.0049285888671875,
0.05084228515625,
0.01336669921875,
0.058074951171875,
0.033599853515625,
-0.0223236083984375,
0.0253448486328125,
0.0283050537109375,
-0.042388916015625,
0.03564453125,
-0.058074951171875,
0.00811767578125,
-0.01122283935546875,
0.018035888671875,
-0.06805419921875,
0.00035500526428222656,
0.007068634033203125,
-0.061309814453125,
0.036590576171875,
-0.00527191162109375,
-0.0499267578125,
-0.042877197265625,
-0.0242156982421875,
0.017852783203125,
0.023040771484375,
-0.0316162109375,
0.0272979736328125,
0.0269317626953125,
0.00031638145446777344,
-0.05316162109375,
-0.0738525390625,
0.016082763671875,
-0.0026111602783203125,
-0.0543212890625,
0.05047607421875,
-0.0067901611328125,
0.01320648193359375,
0.00868988037109375,
-0.00836181640625,
0.0087738037109375,
-0.00798797607421875,
0.01219940185546875,
0.015228271484375,
-0.001079559326171875,
-0.00024580955505371094,
-0.0088348388671875,
-0.02191162109375,
-0.003711700439453125,
-0.0166015625,
0.05926513671875,
-0.025054931640625,
-0.01222991943359375,
-0.0189971923828125,
0.029022216796875,
0.046295166015625,
-0.026031494140625,
0.055877685546875,
0.06707763671875,
-0.0248260498046875,
0.007144927978515625,
-0.0428466796875,
-0.01210784912109375,
-0.0340576171875,
0.0328369140625,
-0.046630859375,
-0.060272216796875,
0.04168701171875,
0.017730712890625,
-0.01013946533203125,
0.057647705078125,
0.04095458984375,
-0.00894927978515625,
0.073486328125,
0.027801513671875,
-0.027069091796875,
0.038818359375,
-0.058135986328125,
0.015777587890625,
-0.0462646484375,
-0.04742431640625,
-0.0438232421875,
-0.038543701171875,
-0.07086181640625,
-0.024566650390625,
0.0140533447265625,
-0.007144927978515625,
-0.019775390625,
0.050689697265625,
-0.045135498046875,
0.025970458984375,
0.053680419921875,
0.01947021484375,
0.0030956268310546875,
0.016387939453125,
-0.0255889892578125,
-0.0182342529296875,
-0.052825927734375,
-0.0168609619140625,
0.0863037109375,
0.0151824951171875,
0.050384521484375,
-0.0012845993041992188,
0.060333251953125,
0.027252197265625,
-0.0172576904296875,
-0.039154052734375,
0.0416259765625,
-0.019073486328125,
-0.0506591796875,
-0.0299072265625,
-0.0382080078125,
-0.085693359375,
0.01485443115234375,
-0.013458251953125,
-0.035675048828125,
0.0250244140625,
-0.0288543701171875,
-0.006591796875,
0.01690673828125,
-0.052978515625,
0.08642578125,
-0.030548095703125,
-0.0266571044921875,
-0.0182037353515625,
-0.05889892578125,
-0.00864410400390625,
0.0093231201171875,
0.0001977682113647461,
0.005588531494140625,
-0.0022487640380859375,
0.057098388671875,
-0.01081085205078125,
0.029266357421875,
-0.01117706298828125,
0.004474639892578125,
0.005138397216796875,
-0.00763702392578125,
0.033538818359375,
0.0017900466918945312,
-0.0108489990234375,
0.041473388671875,
-0.007091522216796875,
-0.0267333984375,
-0.03411865234375,
0.07000732421875,
-0.067626953125,
-0.039306640625,
-0.0301361083984375,
-0.027801513671875,
-0.0017299652099609375,
0.025543212890625,
0.0621337890625,
0.031097412109375,
-0.01039886474609375,
0.03680419921875,
0.0567626953125,
-0.027587890625,
0.029541015625,
0.03582763671875,
0.01209259033203125,
-0.05157470703125,
0.0753173828125,
0.02117919921875,
-0.004581451416015625,
0.052490234375,
-0.0089111328125,
-0.0416259765625,
-0.03729248046875,
-0.0288543701171875,
0.0274200439453125,
-0.05279541015625,
-0.01554107666015625,
-0.056549072265625,
-0.0284576416015625,
-0.045928955078125,
-0.0067901611328125,
-0.0194091796875,
-0.0283050537109375,
-0.00812530517578125,
-0.031982421875,
0.01129150390625,
0.0288238525390625,
-0.0006957054138183594,
0.0009174346923828125,
-0.032501220703125,
0.0257415771484375,
0.0014371871948242188,
0.0292510986328125,
-0.01861572265625,
-0.0587158203125,
-0.035186767578125,
0.000025093555450439453,
-0.01413726806640625,
-0.06396484375,
0.04010009765625,
0.0017261505126953125,
0.0599365234375,
0.017547607421875,
0.0019063949584960938,
0.043212890625,
-0.0355224609375,
0.06549072265625,
-0.0006322860717773438,
-0.05377197265625,
0.04302978515625,
-0.005039215087890625,
0.02362060546875,
0.0611572265625,
0.0426025390625,
-0.0513916015625,
-0.02685546875,
-0.059783935546875,
-0.08148193359375,
0.061309814453125,
0.006328582763671875,
0.0019330978393554688,
-0.013153076171875,
0.00873565673828125,
-0.00785064697265625,
0.006092071533203125,
-0.0631103515625,
-0.04266357421875,
-0.02392578125,
-0.02239990234375,
-0.0223846435546875,
-0.021881103515625,
0.006092071533203125,
-0.04376220703125,
0.063720703125,
0.007843017578125,
0.0352783203125,
0.0322265625,
-0.0301055908203125,
0.01812744140625,
0.0120086669921875,
0.039154052734375,
0.03472900390625,
-0.014892578125,
0.006053924560546875,
0.037200927734375,
-0.0330810546875,
-0.0008029937744140625,
0.0302276611328125,
-0.0284576416015625,
0.023681640625,
0.0232391357421875,
0.06512451171875,
0.0171356201171875,
-0.040863037109375,
0.043548583984375,
0.0014734268188476562,
-0.02520751953125,
-0.02960205078125,
-0.0236663818359375,
0.003719329833984375,
0.01995849609375,
0.0226593017578125,
-0.0210418701171875,
0.0120086669921875,
-0.0259857177734375,
0.021881103515625,
0.0221710205078125,
-0.03973388671875,
-0.0111846923828125,
0.048187255859375,
0.0149383544921875,
-0.0216064453125,
0.060455322265625,
-0.01312255859375,
-0.054718017578125,
0.03826904296875,
0.025238037109375,
0.06640625,
-0.01068878173828125,
0.0149078369140625,
0.05499267578125,
0.04266357421875,
0.005069732666015625,
0.0227813720703125,
-0.00370025634765625,
-0.04876708984375,
-0.0131683349609375,
-0.042999267578125,
0.00704193115234375,
0.00868988037109375,
-0.040252685546875,
0.0264892578125,
-0.006000518798828125,
-0.0142974853515625,
-0.004825592041015625,
0.0285797119140625,
-0.0703125,
0.01291656494140625,
0.0101470947265625,
0.0943603515625,
-0.055877685546875,
0.07659912109375,
0.050323486328125,
-0.05804443359375,
-0.03900146484375,
-0.00563812255859375,
-0.028472900390625,
-0.05242919921875,
0.043212890625,
0.0085601806640625,
0.0047454833984375,
0.001987457275390625,
-0.0200653076171875,
-0.06439208984375,
0.0960693359375,
0.027587890625,
-0.033843994140625,
-0.01226806640625,
0.0231475830078125,
0.05230712890625,
-0.01012420654296875,
0.0343017578125,
0.042877197265625,
0.04962158203125,
-0.01331329345703125,
-0.0653076171875,
0.007476806640625,
-0.04132080078125,
-0.004261016845703125,
0.01352691650390625,
-0.06427001953125,
0.05047607421875,
-0.005634307861328125,
-0.01023101806640625,
-0.0113067626953125,
0.03656005859375,
0.004367828369140625,
0.017120361328125,
0.0262451171875,
0.0751953125,
0.044647216796875,
-0.03021240234375,
0.08648681640625,
-0.040130615234375,
0.0369873046875,
0.05804443359375,
0.015533447265625,
0.05975341796875,
0.029876708984375,
-0.0185394287109375,
0.049163818359375,
0.06207275390625,
-0.0225677490234375,
0.05377197265625,
-0.0030956268310546875,
0.004047393798828125,
-0.0264129638671875,
0.006504058837890625,
-0.042022705078125,
0.03424072265625,
0.0110015869140625,
-0.038543701171875,
-0.01291656494140625,
0.0019197463989257812,
0.00563812255859375,
-0.0062408447265625,
-0.0030498504638671875,
0.05072021484375,
-0.005157470703125,
-0.051910400390625,
0.057098388671875,
0.012481689453125,
0.063720703125,
-0.045074462890625,
0.01551055908203125,
-0.039764404296875,
0.011688232421875,
-0.00959014892578125,
-0.072265625,
0.01009368896484375,
-0.00937652587890625,
-0.0250091552734375,
-0.026824951171875,
0.0380859375,
-0.04388427734375,
-0.044708251953125,
0.04034423828125,
0.040008544921875,
0.0240936279296875,
0.00041103363037109375,
-0.07037353515625,
-0.009368896484375,
0.01537322998046875,
-0.028411865234375,
0.0181732177734375,
0.040130615234375,
0.00012993812561035156,
0.043212890625,
0.038848876953125,
-0.00704193115234375,
0.009185791015625,
0.01296234130859375,
0.051177978515625,
-0.06005859375,
-0.029937744140625,
-0.051727294921875,
0.0247650146484375,
-0.0264739990234375,
-0.03271484375,
0.06982421875,
0.07672119140625,
0.09326171875,
-0.01104736328125,
0.047943115234375,
-0.0209197998046875,
0.01922607421875,
-0.03118896484375,
0.05731201171875,
-0.061309814453125,
0.0026035308837890625,
-0.0104217529296875,
-0.05926513671875,
-0.0080108642578125,
0.04205322265625,
-0.025543212890625,
0.00855255126953125,
0.046478271484375,
0.0684814453125,
0.0001118779182434082,
-0.0181732177734375,
0.01593017578125,
0.01282501220703125,
0.006076812744140625,
0.06915283203125,
0.0443115234375,
-0.0557861328125,
0.0712890625,
-0.029876708984375,
0.0125732421875,
-0.0185546875,
-0.0307464599609375,
-0.05633544921875,
-0.0582275390625,
-0.0243072509765625,
-0.033355712890625,
0.0027523040771484375,
0.05596923828125,
0.037811279296875,
-0.059814453125,
-0.006504058837890625,
0.00766754150390625,
0.01291656494140625,
-0.017547607421875,
-0.018798828125,
0.044097900390625,
-0.01727294921875,
-0.0809326171875,
0.03271484375,
-0.004138946533203125,
-0.0068817138671875,
-0.01305389404296875,
-0.029083251953125,
-0.028167724609375,
-0.01824951171875,
0.037384033203125,
0.0109710693359375,
-0.05828857421875,
-0.0111846923828125,
0.0300445556640625,
-0.029266357421875,
0.00909423828125,
0.038360595703125,
-0.057159423828125,
0.0303802490234375,
0.061187744140625,
0.042510986328125,
0.07110595703125,
-0.01641845703125,
0.026123046875,
-0.05157470703125,
0.006526947021484375,
0.0116119384765625,
0.04583740234375,
0.02996826171875,
-0.022918701171875,
0.06341552734375,
0.0309906005859375,
-0.03778076171875,
-0.052215576171875,
-0.0135040283203125,
-0.0767822265625,
-0.0291595458984375,
0.08599853515625,
-0.007518768310546875,
-0.01392364501953125,
0.002227783203125,
-0.01346588134765625,
0.0294342041015625,
-0.0335693359375,
0.055206298828125,
0.055877685546875,
0.00927734375,
-0.019561767578125,
-0.040283203125,
0.0208282470703125,
0.0220794677734375,
-0.042205810546875,
-0.0176849365234375,
0.031341552734375,
0.035858154296875,
0.02142333984375,
0.0309906005859375,
-0.026702880859375,
0.01263427734375,
0.0068511962890625,
0.018096923828125,
-0.022064208984375,
-0.0222015380859375,
-0.025970458984375,
0.01312255859375,
-0.0283660888671875,
-0.0271759033203125
]
] |
Helsinki-NLP/opus-mt-gem-gem | 2023-08-16T11:37:53.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"da",
"sv",
"af",
"nn",
"fy",
"fo",
"de",
"nb",
"nl",
"is",
"en",
"lb",
"yi",
"gem",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-gem-gem | 0 | 16,781 | transformers | 2022-03-02T23:29:04 | ---
language:
- da
- sv
- af
- nn
- fy
- fo
- de
- nb
- nl
- is
- en
- lb
- yi
- gem
tags:
- translation
license: apache-2.0
---
### gem-gem
* source group: Germanic languages
* target group: Germanic languages
* OPUS readme: [gem-gem](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/gem-gem/README.md)
* model: transformer
* source language(s): afr ang_Latn dan deu eng enm_Latn fao frr fry gos got_Goth gsw isl ksh ltz nds nld nno nob nob_Hebr non_Latn pdc sco stq swe swg yid
* target language(s): afr ang_Latn dan deu eng enm_Latn fao frr fry gos got_Goth gsw isl ksh ltz nds nld nno nob nob_Hebr non_Latn pdc sco stq swe swg yid
* model: transformer
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* a sentence initial language token is required in the form of `>>id<<` (id = valid target language ID)
* download original weights: [opus-2020-07-27.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/gem-gem/opus-2020-07-27.zip)
* test set translations: [opus-2020-07-27.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/gem-gem/opus-2020-07-27.test.txt)
* test set scores: [opus-2020-07-27.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/gem-gem/opus-2020-07-27.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newssyscomb2009-deueng.deu.eng | 24.5 | 0.519 |
| newssyscomb2009-engdeu.eng.deu | 18.7 | 0.495 |
| news-test2008-deueng.deu.eng | 22.8 | 0.509 |
| news-test2008-engdeu.eng.deu | 18.6 | 0.485 |
| newstest2009-deueng.deu.eng | 22.2 | 0.507 |
| newstest2009-engdeu.eng.deu | 18.3 | 0.491 |
| newstest2010-deueng.deu.eng | 24.8 | 0.537 |
| newstest2010-engdeu.eng.deu | 19.7 | 0.499 |
| newstest2011-deueng.deu.eng | 22.9 | 0.516 |
| newstest2011-engdeu.eng.deu | 18.3 | 0.485 |
| newstest2012-deueng.deu.eng | 23.9 | 0.524 |
| newstest2012-engdeu.eng.deu | 18.5 | 0.484 |
| newstest2013-deueng.deu.eng | 26.3 | 0.537 |
| newstest2013-engdeu.eng.deu | 21.5 | 0.506 |
| newstest2014-deen-deueng.deu.eng | 25.7 | 0.535 |
| newstest2015-ende-deueng.deu.eng | 27.3 | 0.542 |
| newstest2015-ende-engdeu.eng.deu | 24.2 | 0.534 |
| newstest2016-ende-deueng.deu.eng | 31.8 | 0.584 |
| newstest2016-ende-engdeu.eng.deu | 28.4 | 0.564 |
| newstest2017-ende-deueng.deu.eng | 27.6 | 0.545 |
| newstest2017-ende-engdeu.eng.deu | 22.8 | 0.527 |
| newstest2018-ende-deueng.deu.eng | 34.1 | 0.593 |
| newstest2018-ende-engdeu.eng.deu | 32.7 | 0.595 |
| newstest2019-deen-deueng.deu.eng | 30.6 | 0.565 |
| newstest2019-ende-engdeu.eng.deu | 29.5 | 0.567 |
| Tatoeba-test.afr-ang.afr.ang | 0.0 | 0.053 |
| Tatoeba-test.afr-dan.afr.dan | 57.8 | 0.907 |
| Tatoeba-test.afr-deu.afr.deu | 46.4 | 0.663 |
| Tatoeba-test.afr-eng.afr.eng | 57.4 | 0.717 |
| Tatoeba-test.afr-enm.afr.enm | 11.3 | 0.285 |
| Tatoeba-test.afr-fry.afr.fry | 0.0 | 0.167 |
| Tatoeba-test.afr-gos.afr.gos | 1.5 | 0.178 |
| Tatoeba-test.afr-isl.afr.isl | 29.0 | 0.760 |
| Tatoeba-test.afr-ltz.afr.ltz | 11.2 | 0.246 |
| Tatoeba-test.afr-nld.afr.nld | 53.3 | 0.708 |
| Tatoeba-test.afr-nor.afr.nor | 66.0 | 0.752 |
| Tatoeba-test.afr-swe.afr.swe | 88.0 | 0.955 |
| Tatoeba-test.afr-yid.afr.yid | 59.5 | 0.443 |
| Tatoeba-test.ang-afr.ang.afr | 10.7 | 0.043 |
| Tatoeba-test.ang-dan.ang.dan | 6.3 | 0.190 |
| Tatoeba-test.ang-deu.ang.deu | 1.4 | 0.212 |
| Tatoeba-test.ang-eng.ang.eng | 8.1 | 0.247 |
| Tatoeba-test.ang-enm.ang.enm | 1.7 | 0.196 |
| Tatoeba-test.ang-fao.ang.fao | 10.7 | 0.105 |
| Tatoeba-test.ang-gos.ang.gos | 10.7 | 0.128 |
| Tatoeba-test.ang-isl.ang.isl | 16.0 | 0.135 |
| Tatoeba-test.ang-ltz.ang.ltz | 16.0 | 0.121 |
| Tatoeba-test.ang-yid.ang.yid | 1.5 | 0.136 |
| Tatoeba-test.dan-afr.dan.afr | 22.7 | 0.655 |
| Tatoeba-test.dan-ang.dan.ang | 3.1 | 0.110 |
| Tatoeba-test.dan-deu.dan.deu | 47.4 | 0.676 |
| Tatoeba-test.dan-eng.dan.eng | 54.7 | 0.704 |
| Tatoeba-test.dan-enm.dan.enm | 4.8 | 0.291 |
| Tatoeba-test.dan-fao.dan.fao | 9.7 | 0.120 |
| Tatoeba-test.dan-gos.dan.gos | 3.8 | 0.240 |
| Tatoeba-test.dan-isl.dan.isl | 66.1 | 0.678 |
| Tatoeba-test.dan-ltz.dan.ltz | 78.3 | 0.563 |
| Tatoeba-test.dan-nds.dan.nds | 6.2 | 0.335 |
| Tatoeba-test.dan-nld.dan.nld | 60.0 | 0.748 |
| Tatoeba-test.dan-nor.dan.nor | 68.1 | 0.812 |
| Tatoeba-test.dan-swe.dan.swe | 65.0 | 0.785 |
| Tatoeba-test.dan-swg.dan.swg | 2.6 | 0.182 |
| Tatoeba-test.dan-yid.dan.yid | 9.3 | 0.226 |
| Tatoeba-test.deu-afr.deu.afr | 50.3 | 0.682 |
| Tatoeba-test.deu-ang.deu.ang | 0.5 | 0.118 |
| Tatoeba-test.deu-dan.deu.dan | 49.6 | 0.679 |
| Tatoeba-test.deu-eng.deu.eng | 43.4 | 0.618 |
| Tatoeba-test.deu-enm.deu.enm | 2.2 | 0.159 |
| Tatoeba-test.deu-frr.deu.frr | 0.4 | 0.156 |
| Tatoeba-test.deu-fry.deu.fry | 10.7 | 0.355 |
| Tatoeba-test.deu-gos.deu.gos | 0.7 | 0.183 |
| Tatoeba-test.deu-got.deu.got | 0.3 | 0.010 |
| Tatoeba-test.deu-gsw.deu.gsw | 1.1 | 0.130 |
| Tatoeba-test.deu-isl.deu.isl | 24.3 | 0.504 |
| Tatoeba-test.deu-ksh.deu.ksh | 0.9 | 0.173 |
| Tatoeba-test.deu-ltz.deu.ltz | 15.6 | 0.304 |
| Tatoeba-test.deu-nds.deu.nds | 21.2 | 0.469 |
| Tatoeba-test.deu-nld.deu.nld | 47.1 | 0.657 |
| Tatoeba-test.deu-nor.deu.nor | 43.9 | 0.646 |
| Tatoeba-test.deu-pdc.deu.pdc | 3.0 | 0.133 |
| Tatoeba-test.deu-sco.deu.sco | 12.0 | 0.296 |
| Tatoeba-test.deu-stq.deu.stq | 0.6 | 0.137 |
| Tatoeba-test.deu-swe.deu.swe | 50.6 | 0.668 |
| Tatoeba-test.deu-swg.deu.swg | 0.2 | 0.137 |
| Tatoeba-test.deu-yid.deu.yid | 3.9 | 0.229 |
| Tatoeba-test.eng-afr.eng.afr | 55.2 | 0.721 |
| Tatoeba-test.eng-ang.eng.ang | 4.9 | 0.118 |
| Tatoeba-test.eng-dan.eng.dan | 52.6 | 0.684 |
| Tatoeba-test.eng-deu.eng.deu | 35.4 | 0.573 |
| Tatoeba-test.eng-enm.eng.enm | 1.8 | 0.223 |
| Tatoeba-test.eng-fao.eng.fao | 7.0 | 0.312 |
| Tatoeba-test.eng-frr.eng.frr | 1.2 | 0.050 |
| Tatoeba-test.eng-fry.eng.fry | 15.8 | 0.381 |
| Tatoeba-test.eng-gos.eng.gos | 0.7 | 0.170 |
| Tatoeba-test.eng-got.eng.got | 0.3 | 0.011 |
| Tatoeba-test.eng-gsw.eng.gsw | 0.5 | 0.126 |
| Tatoeba-test.eng-isl.eng.isl | 20.9 | 0.463 |
| Tatoeba-test.eng-ksh.eng.ksh | 1.0 | 0.141 |
| Tatoeba-test.eng-ltz.eng.ltz | 12.8 | 0.292 |
| Tatoeba-test.eng-nds.eng.nds | 18.3 | 0.428 |
| Tatoeba-test.eng-nld.eng.nld | 47.3 | 0.657 |
| Tatoeba-test.eng-non.eng.non | 0.3 | 0.145 |
| Tatoeba-test.eng-nor.eng.nor | 47.2 | 0.650 |
| Tatoeba-test.eng-pdc.eng.pdc | 4.8 | 0.177 |
| Tatoeba-test.eng-sco.eng.sco | 38.1 | 0.597 |
| Tatoeba-test.eng-stq.eng.stq | 2.4 | 0.288 |
| Tatoeba-test.eng-swe.eng.swe | 52.7 | 0.677 |
| Tatoeba-test.eng-swg.eng.swg | 1.1 | 0.163 |
| Tatoeba-test.eng-yid.eng.yid | 4.5 | 0.223 |
| Tatoeba-test.enm-afr.enm.afr | 22.8 | 0.401 |
| Tatoeba-test.enm-ang.enm.ang | 0.4 | 0.062 |
| Tatoeba-test.enm-dan.enm.dan | 51.4 | 0.782 |
| Tatoeba-test.enm-deu.enm.deu | 33.8 | 0.473 |
| Tatoeba-test.enm-eng.enm.eng | 22.4 | 0.495 |
| Tatoeba-test.enm-fry.enm.fry | 16.0 | 0.173 |
| Tatoeba-test.enm-gos.enm.gos | 6.1 | 0.222 |
| Tatoeba-test.enm-isl.enm.isl | 59.5 | 0.651 |
| Tatoeba-test.enm-ksh.enm.ksh | 10.5 | 0.130 |
| Tatoeba-test.enm-nds.enm.nds | 18.1 | 0.327 |
| Tatoeba-test.enm-nld.enm.nld | 38.3 | 0.546 |
| Tatoeba-test.enm-nor.enm.nor | 15.6 | 0.290 |
| Tatoeba-test.enm-yid.enm.yid | 2.3 | 0.215 |
| Tatoeba-test.fao-ang.fao.ang | 2.1 | 0.035 |
| Tatoeba-test.fao-dan.fao.dan | 53.7 | 0.625 |
| Tatoeba-test.fao-eng.fao.eng | 24.7 | 0.435 |
| Tatoeba-test.fao-gos.fao.gos | 12.7 | 0.116 |
| Tatoeba-test.fao-isl.fao.isl | 26.3 | 0.341 |
| Tatoeba-test.fao-nor.fao.nor | 41.9 | 0.586 |
| Tatoeba-test.fao-swe.fao.swe | 0.0 | 1.000 |
| Tatoeba-test.frr-deu.frr.deu | 7.4 | 0.263 |
| Tatoeba-test.frr-eng.frr.eng | 7.0 | 0.157 |
| Tatoeba-test.frr-fry.frr.fry | 4.0 | 0.112 |
| Tatoeba-test.frr-gos.frr.gos | 1.0 | 0.135 |
| Tatoeba-test.frr-nds.frr.nds | 12.4 | 0.207 |
| Tatoeba-test.frr-nld.frr.nld | 10.6 | 0.227 |
| Tatoeba-test.frr-stq.frr.stq | 1.0 | 0.058 |
| Tatoeba-test.fry-afr.fry.afr | 12.7 | 0.333 |
| Tatoeba-test.fry-deu.fry.deu | 30.8 | 0.555 |
| Tatoeba-test.fry-eng.fry.eng | 31.2 | 0.506 |
| Tatoeba-test.fry-enm.fry.enm | 0.0 | 0.175 |
| Tatoeba-test.fry-frr.fry.frr | 1.6 | 0.091 |
| Tatoeba-test.fry-gos.fry.gos | 1.1 | 0.254 |
| Tatoeba-test.fry-ltz.fry.ltz | 30.4 | 0.526 |
| Tatoeba-test.fry-nds.fry.nds | 12.4 | 0.116 |
| Tatoeba-test.fry-nld.fry.nld | 43.4 | 0.637 |
| Tatoeba-test.fry-nor.fry.nor | 47.1 | 0.607 |
| Tatoeba-test.fry-stq.fry.stq | 0.6 | 0.181 |
| Tatoeba-test.fry-swe.fry.swe | 30.2 | 0.587 |
| Tatoeba-test.fry-yid.fry.yid | 3.1 | 0.173 |
| Tatoeba-test.gos-afr.gos.afr | 1.8 | 0.215 |
| Tatoeba-test.gos-ang.gos.ang | 0.0 | 0.045 |
| Tatoeba-test.gos-dan.gos.dan | 4.1 | 0.236 |
| Tatoeba-test.gos-deu.gos.deu | 19.6 | 0.406 |
| Tatoeba-test.gos-eng.gos.eng | 15.1 | 0.329 |
| Tatoeba-test.gos-enm.gos.enm | 5.8 | 0.271 |
| Tatoeba-test.gos-fao.gos.fao | 19.0 | 0.136 |
| Tatoeba-test.gos-frr.gos.frr | 1.3 | 0.119 |
| Tatoeba-test.gos-fry.gos.fry | 17.1 | 0.388 |
| Tatoeba-test.gos-isl.gos.isl | 16.8 | 0.356 |
| Tatoeba-test.gos-ltz.gos.ltz | 3.6 | 0.174 |
| Tatoeba-test.gos-nds.gos.nds | 4.7 | 0.225 |
| Tatoeba-test.gos-nld.gos.nld | 16.3 | 0.406 |
| Tatoeba-test.gos-stq.gos.stq | 0.7 | 0.154 |
| Tatoeba-test.gos-swe.gos.swe | 8.6 | 0.319 |
| Tatoeba-test.gos-yid.gos.yid | 4.4 | 0.165 |
| Tatoeba-test.got-deu.got.deu | 0.2 | 0.041 |
| Tatoeba-test.got-eng.got.eng | 0.2 | 0.068 |
| Tatoeba-test.got-nor.got.nor | 0.6 | 0.000 |
| Tatoeba-test.gsw-deu.gsw.deu | 15.9 | 0.373 |
| Tatoeba-test.gsw-eng.gsw.eng | 14.7 | 0.320 |
| Tatoeba-test.isl-afr.isl.afr | 38.0 | 0.641 |
| Tatoeba-test.isl-ang.isl.ang | 0.0 | 0.037 |
| Tatoeba-test.isl-dan.isl.dan | 67.7 | 0.836 |
| Tatoeba-test.isl-deu.isl.deu | 42.6 | 0.614 |
| Tatoeba-test.isl-eng.isl.eng | 43.5 | 0.610 |
| Tatoeba-test.isl-enm.isl.enm | 12.4 | 0.123 |
| Tatoeba-test.isl-fao.isl.fao | 15.6 | 0.176 |
| Tatoeba-test.isl-gos.isl.gos | 7.1 | 0.257 |
| Tatoeba-test.isl-nor.isl.nor | 53.5 | 0.690 |
| Tatoeba-test.isl-stq.isl.stq | 10.7 | 0.176 |
| Tatoeba-test.isl-swe.isl.swe | 67.7 | 0.818 |
| Tatoeba-test.ksh-deu.ksh.deu | 11.8 | 0.393 |
| Tatoeba-test.ksh-eng.ksh.eng | 4.0 | 0.239 |
| Tatoeba-test.ksh-enm.ksh.enm | 9.5 | 0.085 |
| Tatoeba-test.ltz-afr.ltz.afr | 36.5 | 0.529 |
| Tatoeba-test.ltz-ang.ltz.ang | 0.0 | 0.043 |
| Tatoeba-test.ltz-dan.ltz.dan | 80.6 | 0.722 |
| Tatoeba-test.ltz-deu.ltz.deu | 40.1 | 0.581 |
| Tatoeba-test.ltz-eng.ltz.eng | 36.1 | 0.511 |
| Tatoeba-test.ltz-fry.ltz.fry | 16.5 | 0.524 |
| Tatoeba-test.ltz-gos.ltz.gos | 0.7 | 0.118 |
| Tatoeba-test.ltz-nld.ltz.nld | 40.4 | 0.535 |
| Tatoeba-test.ltz-nor.ltz.nor | 19.1 | 0.582 |
| Tatoeba-test.ltz-stq.ltz.stq | 2.4 | 0.093 |
| Tatoeba-test.ltz-swe.ltz.swe | 25.9 | 0.430 |
| Tatoeba-test.ltz-yid.ltz.yid | 1.5 | 0.160 |
| Tatoeba-test.multi.multi | 42.7 | 0.614 |
| Tatoeba-test.nds-dan.nds.dan | 23.0 | 0.465 |
| Tatoeba-test.nds-deu.nds.deu | 39.8 | 0.610 |
| Tatoeba-test.nds-eng.nds.eng | 32.0 | 0.520 |
| Tatoeba-test.nds-enm.nds.enm | 3.9 | 0.156 |
| Tatoeba-test.nds-frr.nds.frr | 10.7 | 0.127 |
| Tatoeba-test.nds-fry.nds.fry | 10.7 | 0.231 |
| Tatoeba-test.nds-gos.nds.gos | 0.8 | 0.157 |
| Tatoeba-test.nds-nld.nds.nld | 44.1 | 0.634 |
| Tatoeba-test.nds-nor.nds.nor | 47.1 | 0.665 |
| Tatoeba-test.nds-swg.nds.swg | 0.5 | 0.166 |
| Tatoeba-test.nds-yid.nds.yid | 12.7 | 0.337 |
| Tatoeba-test.nld-afr.nld.afr | 58.4 | 0.748 |
| Tatoeba-test.nld-dan.nld.dan | 61.3 | 0.753 |
| Tatoeba-test.nld-deu.nld.deu | 48.2 | 0.670 |
| Tatoeba-test.nld-eng.nld.eng | 52.8 | 0.690 |
| Tatoeba-test.nld-enm.nld.enm | 5.7 | 0.178 |
| Tatoeba-test.nld-frr.nld.frr | 0.9 | 0.159 |
| Tatoeba-test.nld-fry.nld.fry | 23.0 | 0.467 |
| Tatoeba-test.nld-gos.nld.gos | 1.0 | 0.165 |
| Tatoeba-test.nld-ltz.nld.ltz | 14.4 | 0.310 |
| Tatoeba-test.nld-nds.nld.nds | 24.1 | 0.485 |
| Tatoeba-test.nld-nor.nld.nor | 53.6 | 0.705 |
| Tatoeba-test.nld-sco.nld.sco | 15.0 | 0.415 |
| Tatoeba-test.nld-stq.nld.stq | 0.5 | 0.183 |
| Tatoeba-test.nld-swe.nld.swe | 73.6 | 0.842 |
| Tatoeba-test.nld-swg.nld.swg | 4.2 | 0.191 |
| Tatoeba-test.nld-yid.nld.yid | 9.4 | 0.299 |
| Tatoeba-test.non-eng.non.eng | 27.7 | 0.501 |
| Tatoeba-test.nor-afr.nor.afr | 48.2 | 0.687 |
| Tatoeba-test.nor-dan.nor.dan | 69.5 | 0.820 |
| Tatoeba-test.nor-deu.nor.deu | 41.1 | 0.634 |
| Tatoeba-test.nor-eng.nor.eng | 49.4 | 0.660 |
| Tatoeba-test.nor-enm.nor.enm | 6.8 | 0.230 |
| Tatoeba-test.nor-fao.nor.fao | 6.9 | 0.395 |
| Tatoeba-test.nor-fry.nor.fry | 9.2 | 0.323 |
| Tatoeba-test.nor-got.nor.got | 1.5 | 0.000 |
| Tatoeba-test.nor-isl.nor.isl | 34.5 | 0.555 |
| Tatoeba-test.nor-ltz.nor.ltz | 22.1 | 0.447 |
| Tatoeba-test.nor-nds.nor.nds | 34.3 | 0.565 |
| Tatoeba-test.nor-nld.nor.nld | 50.5 | 0.676 |
| Tatoeba-test.nor-nor.nor.nor | 57.6 | 0.764 |
| Tatoeba-test.nor-swe.nor.swe | 68.9 | 0.813 |
| Tatoeba-test.nor-yid.nor.yid | 65.0 | 0.627 |
| Tatoeba-test.pdc-deu.pdc.deu | 43.5 | 0.559 |
| Tatoeba-test.pdc-eng.pdc.eng | 26.1 | 0.471 |
| Tatoeba-test.sco-deu.sco.deu | 7.1 | 0.295 |
| Tatoeba-test.sco-eng.sco.eng | 34.4 | 0.551 |
| Tatoeba-test.sco-nld.sco.nld | 9.9 | 0.438 |
| Tatoeba-test.stq-deu.stq.deu | 8.6 | 0.385 |
| Tatoeba-test.stq-eng.stq.eng | 21.8 | 0.431 |
| Tatoeba-test.stq-frr.stq.frr | 2.1 | 0.111 |
| Tatoeba-test.stq-fry.stq.fry | 7.6 | 0.267 |
| Tatoeba-test.stq-gos.stq.gos | 0.7 | 0.198 |
| Tatoeba-test.stq-isl.stq.isl | 16.0 | 0.121 |
| Tatoeba-test.stq-ltz.stq.ltz | 3.8 | 0.150 |
| Tatoeba-test.stq-nld.stq.nld | 14.6 | 0.375 |
| Tatoeba-test.stq-yid.stq.yid | 2.4 | 0.096 |
| Tatoeba-test.swe-afr.swe.afr | 51.8 | 0.802 |
| Tatoeba-test.swe-dan.swe.dan | 64.9 | 0.784 |
| Tatoeba-test.swe-deu.swe.deu | 47.0 | 0.657 |
| Tatoeba-test.swe-eng.swe.eng | 55.8 | 0.700 |
| Tatoeba-test.swe-fao.swe.fao | 0.0 | 0.060 |
| Tatoeba-test.swe-fry.swe.fry | 14.1 | 0.449 |
| Tatoeba-test.swe-gos.swe.gos | 7.5 | 0.291 |
| Tatoeba-test.swe-isl.swe.isl | 70.7 | 0.812 |
| Tatoeba-test.swe-ltz.swe.ltz | 15.9 | 0.553 |
| Tatoeba-test.swe-nld.swe.nld | 78.7 | 0.854 |
| Tatoeba-test.swe-nor.swe.nor | 67.1 | 0.799 |
| Tatoeba-test.swe-yid.swe.yid | 14.7 | 0.156 |
| Tatoeba-test.swg-dan.swg.dan | 7.7 | 0.341 |
| Tatoeba-test.swg-deu.swg.deu | 8.0 | 0.334 |
| Tatoeba-test.swg-eng.swg.eng | 12.4 | 0.305 |
| Tatoeba-test.swg-nds.swg.nds | 1.1 | 0.209 |
| Tatoeba-test.swg-nld.swg.nld | 4.9 | 0.244 |
| Tatoeba-test.swg-yid.swg.yid | 3.4 | 0.194 |
| Tatoeba-test.yid-afr.yid.afr | 23.6 | 0.552 |
| Tatoeba-test.yid-ang.yid.ang | 0.1 | 0.066 |
| Tatoeba-test.yid-dan.yid.dan | 17.5 | 0.392 |
| Tatoeba-test.yid-deu.yid.deu | 21.0 | 0.423 |
| Tatoeba-test.yid-eng.yid.eng | 17.4 | 0.368 |
| Tatoeba-test.yid-enm.yid.enm | 0.6 | 0.143 |
| Tatoeba-test.yid-fry.yid.fry | 5.3 | 0.169 |
| Tatoeba-test.yid-gos.yid.gos | 1.2 | 0.149 |
| Tatoeba-test.yid-ltz.yid.ltz | 3.5 | 0.256 |
| Tatoeba-test.yid-nds.yid.nds | 14.4 | 0.487 |
| Tatoeba-test.yid-nld.yid.nld | 26.1 | 0.423 |
| Tatoeba-test.yid-nor.yid.nor | 47.1 | 0.583 |
| Tatoeba-test.yid-stq.yid.stq | 1.5 | 0.092 |
| Tatoeba-test.yid-swe.yid.swe | 35.9 | 0.518 |
| Tatoeba-test.yid-swg.yid.swg | 1.0 | 0.124 |
### System Info:
- hf_name: gem-gem
- source_languages: gem
- target_languages: gem
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/gem-gem/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['da', 'sv', 'af', 'nn', 'fy', 'fo', 'de', 'nb', 'nl', 'is', 'en', 'lb', 'yi', 'gem']
- src_constituents: {'ksh', 'enm_Latn', 'got_Goth', 'stq', 'dan', 'swe', 'afr', 'pdc', 'gos', 'nno', 'fry', 'gsw', 'fao', 'deu', 'swg', 'sco', 'nob', 'nld', 'isl', 'eng', 'ltz', 'nob_Hebr', 'ang_Latn', 'frr', 'non_Latn', 'yid', 'nds'}
- tgt_constituents: {'ksh', 'enm_Latn', 'got_Goth', 'stq', 'dan', 'swe', 'afr', 'pdc', 'gos', 'nno', 'fry', 'gsw', 'fao', 'deu', 'swg', 'sco', 'nob', 'nld', 'isl', 'eng', 'ltz', 'nob_Hebr', 'ang_Latn', 'frr', 'non_Latn', 'yid', 'nds'}
- src_multilingual: True
- tgt_multilingual: True
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/gem-gem/opus-2020-07-27.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/gem-gem/opus-2020-07-27.test.txt
- src_alpha3: gem
- tgt_alpha3: gem
- short_pair: gem-gem
- chrF2_score: 0.614
- bleu: 42.7
- brevity_penalty: 0.993
- ref_len: 73459.0
- src_name: Germanic languages
- tgt_name: Germanic languages
- train_date: 2020-07-27
- src_alpha2: gem
- tgt_alpha2: gem
- prefer_old: False
- long_pair: gem-gem
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 17,233 | [
[
-0.06268310546875,
-0.04376220703125,
0.01922607421875,
0.025177001953125,
-0.02008056640625,
-0.0038967132568359375,
0.007190704345703125,
-0.0226593017578125,
0.05889892578125,
0.0016489028930664062,
-0.019775390625,
-0.036865234375,
-0.03558349609375,
0.0244293212890625,
-0.00839996337890625,
0.033721923828125,
-0.00463104248046875,
-0.00418853759765625,
0.0194854736328125,
-0.0211029052734375,
-0.03857421875,
0.01198577880859375,
-0.059600830078125,
-0.0052490234375,
0.0218353271484375,
0.031890869140625,
0.04949951171875,
0.02581787109375,
0.019195556640625,
0.031036376953125,
-0.0158538818359375,
0.004932403564453125,
0.01009368896484375,
-0.01544952392578125,
0.004375457763671875,
-0.050750732421875,
-0.038970947265625,
-0.011260986328125,
0.037994384765625,
0.05230712890625,
0.012054443359375,
0.0277862548828125,
-0.0034961700439453125,
0.0673828125,
-0.0293426513671875,
0.007434844970703125,
-0.00445556640625,
0.003940582275390625,
-0.0274505615234375,
-0.02435302734375,
-0.03936767578125,
-0.06982421875,
-0.01050567626953125,
-0.0384521484375,
0.0014123916625976562,
0.00423431396484375,
0.10137939453125,
-0.00783538818359375,
-0.0225067138671875,
-0.00543975830078125,
-0.0165863037109375,
0.055633544921875,
-0.0677490234375,
0.016357421875,
0.03302001953125,
0.001068115234375,
-0.025177001953125,
-0.0225067138671875,
-0.05426025390625,
0.01200103759765625,
-0.02252197265625,
0.03692626953125,
-0.0023365020751953125,
-0.0206146240234375,
0.0204010009765625,
0.034393310546875,
-0.047943115234375,
-0.002124786376953125,
-0.035247802734375,
0.0111083984375,
0.0474853515625,
0.024658203125,
0.0323486328125,
-0.02618408203125,
-0.04754638671875,
-0.035369873046875,
-0.0246124267578125,
0.037994384765625,
0.01702880859375,
0.0177764892578125,
-0.01812744140625,
0.04736328125,
-0.0200958251953125,
0.0411376953125,
0.0275421142578125,
-0.00494384765625,
0.059051513671875,
-0.0355224609375,
-0.039764404296875,
-0.0223541259765625,
0.06365966796875,
0.05645751953125,
-0.01100921630859375,
0.0135498046875,
-0.0022945404052734375,
0.01229095458984375,
-0.01335906982421875,
-0.04443359375,
-0.00605010986328125,
0.0156707763671875,
-0.0209808349609375,
-0.007598876953125,
0.00909423828125,
-0.0914306640625,
-0.00044608116149902344,
-0.00449371337890625,
0.0266265869140625,
-0.043212890625,
-0.032806396484375,
0.0187835693359375,
-0.0033111572265625,
0.02618408203125,
0.00672149658203125,
-0.04052734375,
0.0279998779296875,
0.00677490234375,
0.05517578125,
-0.01239776611328125,
-0.0172882080078125,
0.0098724365234375,
0.00760650634765625,
-0.03594970703125,
0.05792236328125,
-0.01959228515625,
-0.03668212890625,
-0.020416259765625,
0.01678466796875,
-0.0252838134765625,
-0.0194549560546875,
0.03472900390625,
-0.0128021240234375,
0.0251922607421875,
-0.04010009765625,
-0.0304412841796875,
-0.0293731689453125,
0.007686614990234375,
-0.051666259765625,
0.09942626953125,
0.024200439453125,
-0.05108642578125,
0.047821044921875,
-0.041259765625,
-0.006927490234375,
-0.006084442138671875,
0.014190673828125,
-0.040679931640625,
-0.00431060791015625,
0.019866943359375,
0.0253753662109375,
-0.028900146484375,
0.014129638671875,
-0.00498199462890625,
-0.0236358642578125,
0.0007953643798828125,
-0.016998291015625,
0.093505859375,
0.0212860107421875,
-0.037750244140625,
-0.01300811767578125,
-0.0830078125,
0.0272369384765625,
0.01538848876953125,
-0.0341796875,
-0.0008950233459472656,
-0.0521240234375,
-0.00787353515625,
0.0134429931640625,
0.0177459716796875,
-0.048065185546875,
0.01268768310546875,
-0.045196533203125,
-0.00849151611328125,
0.07830810546875,
0.01551055908203125,
0.0240631103515625,
-0.0489501953125,
0.03619384765625,
0.015960693359375,
0.023956298828125,
0.03594970703125,
-0.0338134765625,
-0.04376220703125,
-0.004657745361328125,
0.036224365234375,
0.049407958984375,
-0.047088623046875,
0.06475830078125,
-0.006496429443359375,
-0.0732421875,
-0.0277252197265625,
-0.013824462890625,
0.0308837890625,
0.03704833984375,
0.02691650390625,
-0.0219879150390625,
-0.035369873046875,
-0.07379150390625,
-0.01180267333984375,
0.0007414817810058594,
0.0116424560546875,
0.01296234130859375,
0.07110595703125,
0.007717132568359375,
0.06646728515625,
-0.05865478515625,
-0.028472900390625,
-0.0082855224609375,
-0.01641845703125,
0.05242919921875,
0.04022216796875,
0.08184814453125,
-0.04486083984375,
-0.0750732421875,
-0.0019817352294921875,
-0.059967041015625,
-0.007091522216796875,
-0.0015287399291992188,
-0.0187225341796875,
0.034271240234375,
-0.00046443939208984375,
-0.041839599609375,
0.053253173828125,
0.036163330078125,
-0.05224609375,
0.0478515625,
-0.023193359375,
0.0298614501953125,
-0.07598876953125,
0.0092010498046875,
-0.01091766357421875,
0.0101318359375,
-0.0201568603515625,
-0.0012798309326171875,
-0.00128936767578125,
0.009857177734375,
-0.019927978515625,
0.043853759765625,
-0.061370849609375,
-0.00678253173828125,
0.0221099853515625,
0.0252838134765625,
-0.00693511962890625,
0.05975341796875,
-0.0293731689453125,
0.08062744140625,
0.053131103515625,
-0.04351806640625,
0.038360595703125,
0.0175323486328125,
-0.01502227783203125,
0.03955078125,
-0.036865234375,
-0.0260162353515625,
-0.01174163818359375,
0.00493621826171875,
-0.083251953125,
-0.00740814208984375,
0.0158233642578125,
-0.0430908203125,
0.0158843994140625,
0.030242919921875,
-0.0226898193359375,
-0.050750732421875,
-0.052154541015625,
0.0131378173828125,
0.034637451171875,
-0.01898193359375,
0.036773681640625,
0.02178955078125,
-0.0067596435546875,
-0.04071044921875,
-0.050872802734375,
-0.01509857177734375,
-0.0036563873291015625,
-0.032684326171875,
0.00591278076171875,
-0.0159149169921875,
-0.02081298828125,
0.014984130859375,
-0.016998291015625,
-0.0201873779296875,
-0.002964019775390625,
0.0254364013671875,
0.00415802001953125,
-0.0173797607421875,
-0.0137481689453125,
-0.01107025146484375,
-0.0175323486328125,
-0.002788543701171875,
0.02386474609375,
0.05517578125,
-0.01136016845703125,
-0.026275634765625,
-0.037689208984375,
0.0206146240234375,
0.03546142578125,
-0.01424407958984375,
0.07470703125,
0.0181427001953125,
-0.0225677490234375,
0.006710052490234375,
-0.035888671875,
0.0086517333984375,
-0.03521728515625,
0.007144927978515625,
-0.041107177734375,
-0.039276123046875,
0.0594482421875,
0.002613067626953125,
0.01459503173828125,
0.0660400390625,
0.02886962890625,
-0.0076141357421875,
0.0458984375,
0.01540374755859375,
0.0210113525390625,
0.036468505859375,
-0.045318603515625,
0.0085601806640625,
-0.047607421875,
-0.043487548828125,
-0.0418701171875,
-0.033416748046875,
-0.046234130859375,
-0.01418304443359375,
0.036163330078125,
-0.007137298583984375,
-0.0230712890625,
0.055694580078125,
-0.05755615234375,
0.0260162353515625,
0.036041259765625,
0.03485107421875,
0.00638580322265625,
-0.031097412109375,
-0.0159454345703125,
-0.0094757080078125,
-0.01308441162109375,
-0.031768798828125,
0.0946044921875,
0.01422882080078125,
0.034759521484375,
0.0309906005859375,
0.0751953125,
0.02764892578125,
0.005828857421875,
-0.026123046875,
0.053436279296875,
0.023712158203125,
-0.05694580078125,
-0.0501708984375,
-0.0016641616821289062,
-0.073486328125,
0.039031982421875,
-0.01202392578125,
-0.04205322265625,
0.030975341796875,
-0.010498046875,
-0.0225830078125,
0.034271240234375,
-0.061614990234375,
0.048248291015625,
-0.009735107421875,
-0.03729248046875,
0.0102996826171875,
-0.041656494140625,
0.01338958740234375,
0.00263214111328125,
0.01224517822265625,
-0.00803375244140625,
0.005847930908203125,
0.06494140625,
-0.05633544921875,
0.028961181640625,
-0.036865234375,
-0.00004374980926513672,
0.050537109375,
0.0073699951171875,
0.039154052734375,
0.00569915771484375,
-0.01375579833984375,
-0.027069091796875,
-0.0089569091796875,
-0.0479736328125,
-0.014556884765625,
0.04345703125,
-0.051788330078125,
-0.0513916015625,
-0.062744140625,
-0.0098724365234375,
0.0024013519287109375,
0.0345458984375,
0.0265960693359375,
0.045013427734375,
-0.009857177734375,
0.031768798828125,
0.0482177734375,
-0.0236663818359375,
0.057708740234375,
0.0008559226989746094,
-0.00879669189453125,
-0.037353515625,
0.048065185546875,
0.0255584716796875,
0.0205535888671875,
0.0271759033203125,
-0.0005578994750976562,
-0.0270233154296875,
-0.032012939453125,
-0.0267791748046875,
0.01438140869140625,
-0.03564453125,
-0.0386962890625,
-0.04473876953125,
0.0026836395263671875,
-0.037994384765625,
-0.027587890625,
-0.0214691162109375,
-0.037200927734375,
-0.004169464111328125,
-0.036285400390625,
0.042327880859375,
0.050689697265625,
-0.0215606689453125,
0.020904541015625,
-0.0252685546875,
0.01409912109375,
0.01136016845703125,
0.035064697265625,
-0.002849578857421875,
-0.032958984375,
-0.0068817138671875,
0.00018894672393798828,
-0.0247955322265625,
-0.10052490234375,
0.038360595703125,
-0.0224151611328125,
0.0250396728515625,
0.0167694091796875,
-0.0017147064208984375,
0.061859130859375,
-0.0008535385131835938,
0.082763671875,
0.0268707275390625,
-0.06317138671875,
0.04840087890625,
-0.023223876953125,
0.006107330322265625,
0.06072998046875,
0.017852783203125,
-0.00180816650390625,
-0.04339599609375,
-0.05059814453125,
-0.07781982421875,
0.043304443359375,
0.0200347900390625,
-0.020416259765625,
-0.0271453857421875,
-0.00931549072265625,
0.0015764236450195312,
-0.02099609375,
-0.0621337890625,
-0.06732177734375,
0.00571441650390625,
-0.0087738037109375,
0.027618408203125,
-0.0207366943359375,
-0.0089569091796875,
-0.036590576171875,
0.05072021484375,
0.022064208984375,
0.037139892578125,
0.0201416015625,
-0.00473785400390625,
0.010406494140625,
0.032470703125,
0.061370849609375,
0.049652099609375,
-0.0199127197265625,
-0.009521484375,
0.050994873046875,
-0.048980712890625,
0.03436279296875,
-0.01270294189453125,
-0.0221710205078125,
0.01340484619140625,
0.005634307861328125,
0.033172607421875,
0.01544189453125,
-0.004268646240234375,
0.033721923828125,
-0.0012054443359375,
-0.0283355712890625,
-0.05859375,
-0.007007598876953125,
0.001567840576171875,
0.0009717941284179688,
0.022216796875,
0.00420379638671875,
0.00482177734375,
-0.04254150390625,
0.0215301513671875,
0.0151519775390625,
-0.008941650390625,
0.0034198760986328125,
0.04833984375,
0.0141448974609375,
-0.006824493408203125,
0.009307861328125,
-0.02166748046875,
-0.023101806640625,
0.06573486328125,
0.036102294921875,
0.04541015625,
-0.03460693359375,
0.0007147789001464844,
0.0654296875,
0.01104736328125,
0.0027675628662109375,
0.050689697265625,
0.032501220703125,
-0.0227203369140625,
0.005283355712890625,
-0.058624267578125,
-0.010528564453125,
-0.001018524169921875,
-0.058074951171875,
0.006465911865234375,
-0.037200927734375,
-0.03973388671875,
-0.0087738037109375,
0.0286407470703125,
-0.04840087890625,
0.038848876953125,
-0.032379150390625,
0.08892822265625,
-0.07513427734375,
0.041412353515625,
0.0509033203125,
-0.071044921875,
-0.0758056640625,
-0.02166748046875,
-0.0245513916015625,
-0.04705810546875,
0.040802001953125,
-0.022430419921875,
-0.0056610107421875,
-0.004489898681640625,
-0.016448974609375,
-0.073486328125,
0.105224609375,
-0.005535125732421875,
-0.0196075439453125,
0.0192108154296875,
-0.0008530616760253906,
0.04364013671875,
0.025421142578125,
0.053924560546875,
0.038360595703125,
0.06964111328125,
-0.0128021240234375,
-0.071533203125,
0.030914306640625,
-0.04205322265625,
-0.0095977783203125,
0.028961181640625,
-0.08160400390625,
0.0892333984375,
0.0022258758544921875,
-0.00942230224609375,
-0.014556884765625,
0.048065185546875,
0.0153656005859375,
0.00882720947265625,
0.036102294921875,
0.059295654296875,
0.041168212890625,
-0.032196044921875,
0.061553955078125,
-0.006744384765625,
0.033172607421875,
0.03802490234375,
-0.000060498714447021484,
0.047271728515625,
0.02532958984375,
-0.060302734375,
0.036102294921875,
0.045318603515625,
-0.0209197998046875,
0.034088134765625,
0.009613037109375,
-0.0263519287109375,
-0.005489349365234375,
-0.00732421875,
-0.05126953125,
0.031005859375,
0.005741119384765625,
-0.005859375,
-0.01145172119140625,
-0.01474761962890625,
0.0211639404296875,
0.028228759765625,
-0.012603759765625,
0.0382080078125,
0.001705169677734375,
-0.041748046875,
0.0406494140625,
-0.0174102783203125,
0.060577392578125,
-0.02716064453125,
-0.003948211669921875,
-0.0262451171875,
0.0272674560546875,
-0.01776123046875,
-0.0966796875,
0.0266265869140625,
-0.006023406982421875,
-0.03265380859375,
-0.0009331703186035156,
0.0192413330078125,
-0.00787353515625,
-0.032501220703125,
0.040374755859375,
0.022796630859375,
0.0168609619140625,
0.031707763671875,
-0.0274658203125,
-0.010040283203125,
0.0082855224609375,
-0.058563232421875,
0.0228271484375,
0.034576416015625,
0.007694244384765625,
0.0465087890625,
0.03485107421875,
0.007904052734375,
0.033935546875,
-0.02294921875,
0.054840087890625,
-0.068359375,
-0.0254974365234375,
-0.057342529296875,
0.029449462890625,
-0.027679443359375,
-0.048309326171875,
0.07012939453125,
0.0689697265625,
0.052276611328125,
-0.0188140869140625,
0.054656982421875,
-0.03973388671875,
0.03570556640625,
-0.023223876953125,
0.05206298828125,
-0.0465087890625,
-0.0379638671875,
-0.01153564453125,
-0.0618896484375,
-0.0219573974609375,
0.044677734375,
-0.01898193359375,
0.01496124267578125,
0.06427001953125,
0.040435791015625,
0.021514892578125,
0.01229095458984375,
0.0095062255859375,
0.026702880859375,
0.00521087646484375,
0.075927734375,
0.0206756591796875,
-0.057891845703125,
0.038848876953125,
-0.037384033203125,
-0.0019359588623046875,
-0.0173187255859375,
-0.041748046875,
-0.05499267578125,
-0.035369873046875,
0.001728057861328125,
-0.0231170654296875,
-0.0306396484375,
0.06536865234375,
-0.0006747245788574219,
-0.071044921875,
-0.0272674560546875,
0.0182952880859375,
0.0038471221923828125,
-0.036224365234375,
-0.0246124267578125,
0.0740966796875,
0.0033588409423828125,
-0.0771484375,
0.016510009765625,
-0.0012359619140625,
0.0059051513671875,
0.03448486328125,
-0.0005393028259277344,
-0.043914794921875,
0.006031036376953125,
0.0301513671875,
0.02508544921875,
-0.0487060546875,
-0.033599853515625,
0.01299285888671875,
-0.03216552734375,
0.023284912109375,
-0.01264190673828125,
-0.037200927734375,
0.01146697998046875,
0.0682373046875,
0.02313232421875,
0.039306640625,
0.016632080078125,
0.007724761962890625,
-0.032958984375,
0.0299530029296875,
0.0016946792602539062,
0.030853271484375,
-0.01715087890625,
-0.019287109375,
0.068603515625,
0.0199737548828125,
-0.0306854248046875,
-0.06982421875,
-0.0186614990234375,
-0.09344482421875,
-0.0009398460388183594,
0.053802490234375,
-0.006649017333984375,
-0.037139892578125,
-0.005641937255859375,
-0.0248565673828125,
0.0265655517578125,
-0.039947509765625,
0.034759521484375,
0.040252685546875,
-0.02197265625,
0.0014123916625976562,
-0.048126220703125,
0.01090240478515625,
0.0377197265625,
-0.06964111328125,
-0.0140380859375,
0.0110321044921875,
0.02789306640625,
0.0345458984375,
0.08233642578125,
-0.03289794921875,
0.010650634765625,
0.0295257568359375,
0.01611328125,
-0.00710296630859375,
-0.007091522216796875,
0.00107574462890625,
0.018463134765625,
-0.0159149169921875,
-0.045623779296875
]
] |
Habana/bert-large-uncased-whole-word-masking | 2023-09-08T16:18:32.000Z | [
"optimum_habana",
"license:apache-2.0",
"region:us"
] | null | Habana | null | null | Habana/bert-large-uncased-whole-word-masking | 0 | 16,770 | null | 2022-04-22T18:04:29 | ---
license: apache-2.0
---
[Optimum Habana](https://github.com/huggingface/optimum-habana) is the interface between the Hugging Face Transformers and Diffusers libraries and Habana's Gaudi processor (HPU).
It provides a set of tools enabling easy and fast model loading, training and inference on single- and multi-HPU settings for different downstream tasks.
Learn more about how to take advantage of the power of Habana HPUs to train and deploy Transformers and Diffusers models at [hf.co/hardware/habana](https://huggingface.co/hardware/habana).
## BERT Large model HPU configuration
This model only contains the `GaudiConfig` file for running the [bert-large-uncased-whole-word-masking](https://huggingface.co/bert-large-uncased-whole-word-masking) model on Habana's Gaudi processors (HPU).
**This model contains no model weights, only a GaudiConfig.**
This enables to specify:
- `use_fused_adam`: whether to use Habana's custom AdamW implementation
- `use_fused_clip_norm`: whether to use Habana's fused gradient norm clipping operator
- `use_torch_autocast`: whether to use Torch Autocast for managing mixed precision
## Usage
The model is instantiated the same way as in the Transformers library.
The only difference is that there are a few new training arguments specific to HPUs.\
It is strongly recommended to train this model doing bf16 mixed-precision training for optimal performance and accuracy.
[Here](https://github.com/huggingface/optimum-habana/blob/main/examples/question-answering/run_qa.py) is a question-answering example script to fine-tune a model on SQuAD. You can run it with BERT Large with the following command:
```bash
python run_qa.py \
--model_name_or_path bert-large-uncased-whole-word-masking \
--gaudi_config_name gaudi_config_name_or_path \
--dataset_name squad \
--do_train \
--do_eval \
--per_device_train_batch_size 24 \
--per_device_eval_batch_size 8 \
--learning_rate 3e-5 \
--num_train_epochs 2 \
--max_seq_length 384 \
--doc_stride 128 \
--output_dir /tmp/squad/ \
--use_habana \
--use_lazy_mode \
--throughput_warmup_steps 3 \
--bf16
```
Check the [documentation](https://huggingface.co/docs/optimum/habana/index) out for more advanced usage and examples.
| 2,244 | [
[
-0.058929443359375,
-0.0743408203125,
0.020751953125,
0.017486572265625,
-0.0115509033203125,
0.00033783912658691406,
-0.0088958740234375,
-0.0379638671875,
0.0252532958984375,
0.0208892822265625,
-0.0452880859375,
-0.01468658447265625,
-0.0308074951171875,
-0.014678955078125,
-0.024139404296875,
0.08416748046875,
-0.0023899078369140625,
-0.017608642578125,
-0.0165557861328125,
-0.01221466064453125,
-0.0287017822265625,
-0.040374755859375,
-0.0777587890625,
-0.032501220703125,
0.0220184326171875,
0.0104827880859375,
0.0689697265625,
0.03497314453125,
0.036468505859375,
0.0312347412109375,
-0.01806640625,
0.00905609130859375,
-0.038970947265625,
-0.00502777099609375,
0.007808685302734375,
-0.0249481201171875,
-0.03753662109375,
0.00920867919921875,
0.0487060546875,
0.0073394775390625,
-0.015380859375,
0.02276611328125,
-0.00238800048828125,
0.03692626953125,
-0.04205322265625,
-0.0088348388671875,
-0.0207366943359375,
0.00786590576171875,
-0.0115509033203125,
-0.01108551025390625,
-0.009979248046875,
-0.01666259765625,
0.011444091796875,
-0.04473876953125,
0.0199432373046875,
0.0148468017578125,
0.1097412109375,
0.052001953125,
-0.0249786376953125,
0.00850677490234375,
-0.050506591796875,
0.062744140625,
-0.0372314453125,
0.01898193359375,
0.02752685546875,
0.04644775390625,
-0.01251983642578125,
-0.053955078125,
-0.03448486328125,
0.0014925003051757812,
-0.002658843994140625,
0.018829345703125,
-0.034881591796875,
0.0268402099609375,
0.0267791748046875,
0.049896240234375,
-0.03070068359375,
0.0019292831420898438,
-0.03436279296875,
-0.0221710205078125,
0.041168212890625,
0.0016498565673828125,
0.01043701171875,
-0.0290679931640625,
-0.034942626953125,
-0.015655517578125,
-0.02490234375,
0.0026702880859375,
0.03338623046875,
-0.0112152099609375,
-0.0245819091796875,
0.0204010009765625,
-0.006591796875,
0.06585693359375,
0.0100860595703125,
-0.005218505859375,
0.033721923828125,
0.00405120849609375,
-0.042327880859375,
-0.01715087890625,
0.057891845703125,
0.0026035308837890625,
0.00948333740234375,
-0.0012302398681640625,
-0.02227783203125,
0.0084075927734375,
0.059112548828125,
-0.0546875,
-0.031494140625,
0.0308074951171875,
-0.038238525390625,
-0.044464111328125,
-0.031158447265625,
-0.0638427734375,
0.005756378173828125,
-0.0161590576171875,
0.065185546875,
-0.033233642578125,
-0.001995086669921875,
-0.007709503173828125,
-0.024139404296875,
0.0297698974609375,
0.0218963623046875,
-0.063232421875,
0.0251007080078125,
0.02691650390625,
0.063232421875,
-0.01268768310546875,
-0.016357421875,
-0.0275726318359375,
-0.006134033203125,
-0.0027790069580078125,
0.045135498046875,
-0.01104736328125,
-0.017181396484375,
0.00020492076873779297,
0.0091094970703125,
-0.0151519775390625,
-0.03753662109375,
0.0640869140625,
-0.034088134765625,
0.0308074951171875,
0.0169219970703125,
-0.040985107421875,
-0.0275726318359375,
-0.002132415771484375,
-0.046783447265625,
0.10882568359375,
0.032958984375,
-0.040740966796875,
0.0110015869140625,
-0.051910400390625,
-0.03662109375,
-0.0028972625732421875,
0.0019102096557617188,
-0.052520751953125,
0.006389617919921875,
0.0095062255859375,
0.03515625,
-0.0128021240234375,
0.0112457275390625,
-0.01523590087890625,
-0.0299530029296875,
-0.0010099411010742188,
-0.01473236083984375,
0.08868408203125,
0.023040771484375,
-0.0302581787109375,
0.033416748046875,
-0.0548095703125,
-0.007476806640625,
0.0166168212890625,
-0.03497314453125,
-0.00826263427734375,
-0.022735595703125,
0.014556884765625,
0.0094146728515625,
0.025604248046875,
-0.03436279296875,
-0.00010246038436889648,
0.0012083053588867188,
0.0499267578125,
0.06298828125,
-0.009918212890625,
0.0235443115234375,
-0.0457763671875,
0.045867919921875,
-0.0274505615234375,
0.0540771484375,
0.0158538818359375,
-0.06201171875,
-0.0755615234375,
-0.04718017578125,
-0.00022268295288085938,
0.033966064453125,
-0.029693603515625,
0.04766845703125,
0.0177459716796875,
-0.050750732421875,
-0.07049560546875,
0.00823211669921875,
0.0182647705078125,
0.049774169921875,
0.037353515625,
-0.0143280029296875,
-0.05767822265625,
-0.079833984375,
0.010467529296875,
-0.0098876953125,
-0.0005822181701660156,
0.0399169921875,
0.038543701171875,
-0.02508544921875,
0.059844970703125,
-0.03497314453125,
-0.038848876953125,
-0.005077362060546875,
0.0009207725524902344,
0.04156494140625,
0.03338623046875,
0.05340576171875,
-0.059326171875,
-0.0247344970703125,
-0.0210418701171875,
-0.05853271484375,
-0.0010547637939453125,
-0.003871917724609375,
-0.051910400390625,
0.0214996337890625,
0.0211181640625,
-0.0570068359375,
0.030120849609375,
0.05780029296875,
-0.011749267578125,
0.047393798828125,
-0.0286102294921875,
-0.005657196044921875,
-0.07562255859375,
0.0186004638671875,
-0.0022678375244140625,
-0.038787841796875,
-0.04327392578125,
0.02264404296875,
0.0010538101196289062,
-0.00031876564025878906,
-0.03057861328125,
0.04510498046875,
-0.0184478759765625,
0.006992340087890625,
-0.01544952392578125,
-0.0159149169921875,
0.01180267333984375,
0.056182861328125,
-0.0047760009765625,
0.061614990234375,
0.035675048828125,
-0.046417236328125,
0.036041259765625,
0.03619384765625,
-0.0280609130859375,
0.0294647216796875,
-0.07122802734375,
0.013519287109375,
0.00013899803161621094,
0.0151214599609375,
-0.06646728515625,
-0.0322265625,
0.00687408447265625,
-0.04327392578125,
0.02777099609375,
-0.00646209716796875,
-0.0207672119140625,
-0.048675537109375,
-0.0174407958984375,
0.0164642333984375,
0.08294677734375,
-0.06793212890625,
0.0511474609375,
0.0626220703125,
0.019439697265625,
-0.040863037109375,
-0.03717041015625,
-0.0114898681640625,
-0.041168212890625,
-0.058074951171875,
0.051910400390625,
-0.01058197021484375,
0.003803253173828125,
-0.012786865234375,
0.007068634033203125,
-0.0194244384765625,
0.0147857666015625,
0.0235443115234375,
0.038604736328125,
0.01300048828125,
-0.00226593017578125,
0.0016317367553710938,
0.00135040283203125,
0.0106353759765625,
-0.00798797607421875,
0.051910400390625,
-0.023773193359375,
0.01422119140625,
-0.0254364013671875,
0.01043701171875,
0.0333251953125,
-0.005008697509765625,
0.055419921875,
0.08209228515625,
-0.030914306640625,
-0.0125885009765625,
-0.058197021484375,
-0.0097808837890625,
-0.04266357421875,
0.0036716461181640625,
-0.0206146240234375,
-0.05120849609375,
0.043670654296875,
0.0138397216796875,
0.01000213623046875,
0.04400634765625,
0.055908203125,
-0.02935791015625,
0.07342529296875,
0.0596923828125,
-0.0233917236328125,
0.0484619140625,
-0.035491943359375,
-0.000024139881134033203,
-0.07806396484375,
-0.00917816162109375,
-0.0450439453125,
-0.0191802978515625,
-0.02215576171875,
-0.022796630859375,
0.04608154296875,
0.0031890869140625,
-0.0308685302734375,
0.034759521484375,
-0.0643310546875,
0.01355743408203125,
0.06463623046875,
0.0203094482421875,
-0.0025234222412109375,
0.002513885498046875,
-0.0012063980102539062,
0.02459716796875,
-0.0614013671875,
-0.015655517578125,
0.064208984375,
0.035430908203125,
0.055389404296875,
0.0006508827209472656,
0.053497314453125,
0.0115203857421875,
-0.0004673004150390625,
-0.06689453125,
0.031158447265625,
0.006809234619140625,
-0.0667724609375,
-0.0179443359375,
-0.02459716796875,
-0.06524658203125,
0.0112152099609375,
-0.0215301513671875,
-0.038482666015625,
0.017181396484375,
0.016204833984375,
-0.02960205078125,
0.0161590576171875,
-0.045989990234375,
0.065185546875,
-0.0011415481567382812,
-0.0304107666015625,
-0.01477813720703125,
-0.048431396484375,
0.0202178955078125,
-0.0078887939453125,
0.0015497207641601562,
-0.003925323486328125,
0.0123443603515625,
0.07330322265625,
-0.037200927734375,
0.051239013671875,
-0.0214385986328125,
0.002704620361328125,
0.0251922607421875,
-0.0164642333984375,
0.0243988037109375,
-0.007373809814453125,
-0.0062408447265625,
0.0254364013671875,
-0.00661468505859375,
-0.036285400390625,
-0.017059326171875,
0.039031982421875,
-0.08837890625,
-0.030120849609375,
-0.024078369140625,
-0.04547119140625,
0.0011053085327148438,
0.01361846923828125,
0.042083740234375,
0.023223876953125,
-0.0182952880859375,
0.0015707015991210938,
0.0548095703125,
-0.01959228515625,
0.0228271484375,
-0.00039768218994140625,
-0.0189056396484375,
-0.0261993408203125,
0.048095703125,
-0.0191497802734375,
0.0159149169921875,
0.01219940185546875,
0.0229034423828125,
-0.0162506103515625,
-0.031829833984375,
-0.043609619140625,
0.016754150390625,
-0.039276123046875,
-0.0236968994140625,
-0.04901123046875,
-0.024627685546875,
-0.0277099609375,
-0.0193023681640625,
-0.0192718505859375,
-0.0294952392578125,
-0.01146697998046875,
0.01352691650390625,
0.0526123046875,
0.01751708984375,
-0.01202392578125,
0.0482177734375,
-0.04266357421875,
0.039642333984375,
0.0220184326171875,
0.001068115234375,
-0.0099945068359375,
-0.058929443359375,
-0.0267791748046875,
-0.00611114501953125,
-0.024810791015625,
-0.06207275390625,
0.03472900390625,
0.03411865234375,
0.050506591796875,
0.0224761962890625,
-0.003055572509765625,
0.046234130859375,
-0.028350830078125,
0.0401611328125,
-0.00540924072265625,
-0.0594482421875,
0.0450439453125,
-0.03582763671875,
0.021331787109375,
0.04644775390625,
0.045806884765625,
-0.0224456787109375,
-0.018524169921875,
-0.05609130859375,
-0.06500244140625,
0.04901123046875,
0.0202484130859375,
0.006534576416015625,
0.009765625,
0.0202789306640625,
-0.011749267578125,
0.0157318115234375,
-0.0303497314453125,
-0.0172119140625,
-0.0129547119140625,
-0.0029239654541015625,
-0.0008645057678222656,
-0.005771636962890625,
-0.0287322998046875,
-0.044036865234375,
0.0831298828125,
-0.004566192626953125,
0.052581787109375,
0.020050048828125,
-0.0087890625,
-0.01605224609375,
-0.02227783203125,
0.0017414093017578125,
0.0227508544921875,
-0.04595947265625,
-0.02105712890625,
0.0097808837890625,
-0.036773681640625,
-0.006359100341796875,
0.0086822509765625,
-0.0206451416015625,
0.01021575927734375,
0.00864410400390625,
0.080078125,
0.004344940185546875,
-0.0287628173828125,
0.0308685302734375,
-0.01116180419921875,
-0.014251708984375,
-0.044952392578125,
0.0170135498046875,
-0.0011129379272460938,
0.01189422607421875,
0.006488800048828125,
0.00850677490234375,
0.0142669677734375,
-0.035736083984375,
0.004486083984375,
0.038360595703125,
-0.007579803466796875,
0.0025234222412109375,
0.06036376953125,
0.0164642333984375,
-0.0181427001953125,
0.07012939453125,
0.0013027191162109375,
-0.048858642578125,
0.055267333984375,
0.028167724609375,
0.067138671875,
-0.0292205810546875,
0.0189056396484375,
0.032135009765625,
0.0147857666015625,
0.0162353515625,
0.0257110595703125,
-0.01531219482421875,
-0.05450439453125,
-0.0265655517578125,
-0.0762939453125,
-0.03997802734375,
0.0031108856201171875,
-0.06585693359375,
0.0362548828125,
-0.038604736328125,
-0.0266571044921875,
0.0178680419921875,
0.0005855560302734375,
-0.063232421875,
0.01477813720703125,
0.0008196830749511719,
0.07830810546875,
-0.05499267578125,
0.07855224609375,
0.0657958984375,
-0.0257415771484375,
-0.055999755859375,
-0.037200927734375,
0.0004134178161621094,
-0.07489013671875,
0.01104736328125,
0.0106048583984375,
0.0025501251220703125,
-0.0002942085266113281,
-0.03326416015625,
-0.0489501953125,
0.0723876953125,
0.0148468017578125,
-0.01465606689453125,
-0.01427459716796875,
-0.01398468017578125,
0.038055419921875,
-0.01186370849609375,
0.029296875,
0.0654296875,
0.041015625,
0.006084442138671875,
-0.06976318359375,
0.005779266357421875,
-0.030181884765625,
-0.01523590087890625,
0.018768310546875,
-0.06597900390625,
0.069580078125,
-0.00501251220703125,
0.0111236572265625,
0.01378631591796875,
0.043670654296875,
0.01230621337890625,
0.00511932373046875,
0.044769287109375,
0.059844970703125,
0.070068359375,
-0.01267242431640625,
0.08917236328125,
-0.0217742919921875,
0.040557861328125,
0.05279541015625,
0.01053619384765625,
0.04925537109375,
0.0304107666015625,
-0.0258941650390625,
0.0352783203125,
0.0596923828125,
-0.018035888671875,
0.0379638671875,
0.007389068603515625,
-0.0214996337890625,
-0.0154571533203125,
0.0007171630859375,
-0.0189666748046875,
0.04034423828125,
0.02197265625,
-0.024505615234375,
0.00038123130798339844,
0.003936767578125,
0.0209808349609375,
-0.033447265625,
-0.007843017578125,
0.05029296875,
0.0019073486328125,
-0.0574951171875,
0.08502197265625,
-0.006824493408203125,
0.07086181640625,
-0.05767822265625,
0.01512908935546875,
-0.015716552734375,
0.0222320556640625,
-0.0242462158203125,
-0.03350830078125,
0.0330810546875,
-0.00901031494140625,
-0.00608062744140625,
0.0033550262451171875,
0.06500244140625,
-0.016387939453125,
-0.01708984375,
0.0269622802734375,
0.0193634033203125,
0.032470703125,
-0.004180908203125,
-0.06475830078125,
0.0162200927734375,
0.004650115966796875,
-0.026275634765625,
0.023101806640625,
-0.008697509765625,
0.0107879638671875,
0.0428466796875,
0.04473876953125,
0.0018024444580078125,
0.01189422607421875,
-0.002849578857421875,
0.06231689453125,
-0.030059814453125,
-0.04315185546875,
-0.034210205078125,
0.0231781005859375,
-0.0220794677734375,
-0.037567138671875,
0.056732177734375,
0.035308837890625,
0.05560302734375,
-0.00891876220703125,
0.05499267578125,
-0.0216217041015625,
0.014556884765625,
-0.027313232421875,
0.053863525390625,
-0.053131103515625,
-0.016693115234375,
-0.023101806640625,
-0.087890625,
-0.00391387939453125,
0.08013916015625,
0.005405426025390625,
0.007022857666015625,
0.04046630859375,
0.059112548828125,
-0.01285552978515625,
0.0123443603515625,
0.0078277587890625,
0.01611328125,
0.0281524658203125,
0.045074462890625,
0.04345703125,
-0.034942626953125,
0.0295562744140625,
-0.049530029296875,
-0.046295166015625,
-0.021575927734375,
-0.054229736328125,
-0.06903076171875,
-0.037109375,
-0.021575927734375,
-0.034637451171875,
0.01271820068359375,
0.052734375,
0.073974609375,
-0.032073974609375,
-0.024169921875,
-0.0009946823120117188,
-0.016448974609375,
-0.0197296142578125,
-0.0190277099609375,
0.0364990234375,
-0.032318115234375,
-0.07244873046875,
0.036285400390625,
-0.009765625,
-0.00299835205078125,
-0.0261688232421875,
-0.006999969482421875,
-0.022918701171875,
0.007904052734375,
0.03729248046875,
0.0300140380859375,
-0.0151214599609375,
-0.023834228515625,
-0.01079559326171875,
0.0047760009765625,
0.004978179931640625,
0.0229949951171875,
-0.07257080078125,
0.0210418701171875,
0.04998779296875,
0.045440673828125,
0.06536865234375,
-0.01149749755859375,
0.03692626953125,
-0.048828125,
0.01184844970703125,
0.0068359375,
0.042327880859375,
0.01348114013671875,
-0.038726806640625,
0.056060791015625,
0.0009617805480957031,
-0.0762939453125,
-0.052398681640625,
-0.0018777847290039062,
-0.0889892578125,
-0.01454925537109375,
0.0775146484375,
-0.0014982223510742188,
-0.043060302734375,
0.002407073974609375,
-0.0264892578125,
0.035491943359375,
-0.03363037109375,
0.06256103515625,
0.039825439453125,
-0.0223846435546875,
0.01439666748046875,
-0.056884765625,
0.04925537109375,
0.0504150390625,
-0.06060791015625,
-0.0213623046875,
0.035400390625,
0.01079559326171875,
0.0203704833984375,
0.042724609375,
-0.0146026611328125,
0.0203399658203125,
-0.0135040283203125,
0.016448974609375,
-0.0154571533203125,
-0.033721923828125,
-0.035400390625,
-0.0045318603515625,
-0.0169677734375,
-0.0256195068359375
]
] |
FFusion/FFusionXL-BASE | 2023-08-17T20:52:00.000Z | [
"diffusers",
"onnx",
"safetensors",
"openvino",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"stable-diffusion",
"text-to-image",
"di.FFusion.ai",
"en",
"arxiv:2112.10752",
"arxiv:2307.01952",
"arxiv:2106.09685",
"doi:10.57967/hf/1094",
"license:openrail++",
"model-index",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | text-to-image | FFusion | null | null | FFusion/FFusionXL-BASE | 21 | 16,763 | diffusers | 2023-07-27T14:59:49 | ---
license: openrail++
base_model: diffusers/stable-diffusion-xl-base-1.0
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- stable-diffusion
- text-to-image
- diffusers
- di.FFusion.ai
inference: true
widget:
- text: >-
a dog in colorful exploding clouds, dreamlike surrealism colorful smoke and fire coming
out of it, explosion of data fragments, exploding background,realistic explosion, 3d digital art
example_title: Dogo FFusion
- text: >-
a sprinkled donut sitting on top of a table, colorful hyperrealism, everything is made of candy, hyperrealistic digital
painting, covered in sprinkles and crumbs, vibrant colors hyper realism,colorful smoke explosion background
example_title: Donut FFusion
- text: >-
a cup of coffee with a tree in it, surreal art, awesome great composition,
surrealism, ice cubes in tree, colorful clouds, perfectly realistic yet surreal
example_title: CoFFee FFusion
- text: >-
brightly colored headphones with a splash of colorful paint splash, vibing
to music, stunning artwork, music is life, beautiful digital artwork, concept art, cinematic, dramatic, intricate details, dark
lighting
example_title: Headset FFusion
- text: >-
high-quality game character digital design, Unreal Engine, Water color painting, Mecha- Monstrous high quality game fantasy rpg character design, dark rainbow Fur Scarf, inside of a Superficial Outhouse, at Twilight, Overdetailed art
example_title: Digital Fusion
language:
- en
model-index:
- name: FFusion/FFusionXL-BASE
results:
- task:
type: text-to-image
name: Text to Image Generation
dataset:
type: poloclub/diffusiondb
name: DiffusionDB
split: train
metrics:
- type: is
value: 4.9797071218490601
name: Inception Score
verified: true
- type: fid
value: 311.33686580590006
name: Fréchet Inception Distance
verified: true
- type: text-image-similarity
value: 14.368797302246094
name: Similarity Score (CLIP)
thumbnail: >-
https://cdn-uploads.huggingface.co/production/uploads/6380cf05f496d57325c12194/p54u7dEP1u8en0--NMEjS.png
---

<div style="display: flex; flex-wrap: wrap; gap: 2px;">
<a href="https://huggingface.co/FFusion/"><img src="https://img.shields.io/badge/ONNX_Version-Available-brightgreen" alt="ONNX Version Available"></a>
<a href="https://huggingface.co/FFusion/"><img src="https://img.shields.io/badge/OpenVINO-Support-blue" alt="OpenVINO Support"></a>
<a href="https://huggingface.co/FFusion/"><img src="https://img.shields.io/badge/Compatibility-Intel%20|%20AMD%20|%20NVIDIA-orange" alt="Intel/AMD/NVIDIA Compatible"></a>
</div>
## 🌟 Overview
- 🚀 Fast Training: Optimized for high-speed training, allowing rapid experimentation.
- 🧩 Versatility: Suitable for various applications and standards, from NLP to Computer Vision.
- 🎓 Train Your Way: A base for training your own models, tailored to your needs.
- 🌐 Multilingual Support: Train models in multiple languages.
- 🛡️ Robust Architecture: Built on proven technologies to ensure stability and reliability.
## 📜 Model Description
FFusionXL "Base" is a foundational model designed to accelerate training processes. Crafted with flexibility in mind, it serves as a base for training custom models across a variety of standards, enabling innovation and efficiency.
<div style="display: flex; flex-wrap: wrap; gap: 2px;">
<a href="#"><img src="https://img.shields.io/badge/Safetensor-FP16%20%26%20FP32-blue" alt="Safetensor checkpoints"></a>
<a href="#"><img src="https://img.shields.io/badge/Diffusers(Safetensors)-FP16%20%26%20FP32-green" alt="Diffusers(safetensors)"></a>
<a href="#"><img src="https://img.shields.io/badge/Diffusers(PyTorch%20Bin)-FP16%20%26%20FP32-orange" alt="Diffusers(pytorch bin)"></a>
<a href="#"><img src="https://img.shields.io/badge/ONNX-Unoptimized%20FP32-red" alt="ONNX un-optimized FP32"></a>
<a href="#"><img src="https://img.shields.io/badge/ONNX%20Optimized-FP16%20DirectML%20Support-blueviolet" alt="ONNX Optimized FP16 full DirectML support"></a>
<a href="#"><img src="https://img.shields.io/badge/Intel®%20OpenVINO™-FP32%20%26%20FP16-brightgreen" alt="Intel® OpenVINO™ FP32 & FP16"></a>
</div>
**Available formats for training:**
- Safetensor checkpoints fp16 & fp32
- Diffusers(safetensors) FP 16 & FP32
- Diffusers(pytorch bin) FP16 & FP32
- ONNX un-optimzed FP32
- **ONNX Optimized** FP16 full **DirectML** support / AMD / NVIDIA
- Intel® OpenVINO™ FP32 - unoptimized
- **Intel® OpenVINO™** FP16
- **Trained by:** FFusion AI
- **Model type:** Diffusion-based text-to-image generative model
- **License:** [FFXL Research License](https://huggingface.co/FFusion/FFusionXL-09-SDXL/blob/main/LICENSE.md)
- **Model Description:** This is a trained model based on SDXL that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses two fixed, pretrained text encoders ([OpenCLIP-ViT/G](https://github.com/mlfoundations/open_clip) and [CLIP-ViT/L](https://github.com/openai/CLIP/tree/main)).
- **Resources for more information:** [SDXL paper on arXiv](https://arxiv.org/abs/2307.01952).
## 📊 Model Sources
- **Demo:** [FFusionXL SDXL DEMO](https://huggingface.co/spaces/FFusion/FFusionXL-SDXL-DEMO)

## Table of Contents
1. [📌 ONNX Version](#📌-onnx-version)
1. [🔖 ### 📌 ONNX Details](#🔖-###-📌-onnx-details)
2. [🔖 ### 📌 AMD Support for Microsoft® DirectML Optimization of Stable Diffusion](#🔖-###-📌-amd-support-for-microsoft®-directml-optimization-of-stable-diffusion)
3. [🔖 ### 📌 ONNX Inference Instructions](#🔖-###-📌-onnx-inference-instructions)
4. [🔖 ### 📌 Text-to-Image](#🔖-###-📌-text-to-image)
2. [📌 Intel® OpenVINO™ Version](#📌-intel®-openvino™-version)
1. [📌 OpenVINO Inference with FFusion/FFusionXL-BASE](#📌-openvino-inference-with-ffusion/ffusionxl-base)
2. [🔖 ### 📌 Installing Dependencies](#🔖-###-📌-installing-dependencies)
3. [🔖 ### 📌 Text-to-Image](#🔖-###-📌-text-to-image)
4. [🔖 ### 📌 Text-to-Image with Textual Inversion](#🔖-###-📌-text-to-image-with-textual-inversion)
5. [🔖 ### 📌 Image-to-Image](#🔖-###-📌-image-to-image)
6. [🔖 ### 📌 Refining the Image Output](#🔖-###-📌-refining-the-image-output)
3. [📜 Part 003: 🧨 Model Diffusers, Fast LoRa Loading, and Training](#📜-part-001:-🧨-model-diffusers,-fast-lora-loading,-and-training)
1. [📌 Model Diffusers: Unleashing the Power of FFusion/FFusionXL-BASE](#📌-model-diffusers:-unleashing-the-power-of-ffusion/ffusionxl-base)
2. [📌 Installing the dependencies](#📌-installing-the-dependencies)
3. [📌 Training](#📌-training)
4. [📌 Inference](#📌-inference)
5. [📌 Training](#📌-training)
6. [📌 Finetuning the text encoder and UNet](#📌-finetuning-the-text-encoder-and-unet)
7. [📌 Inference](#📌-inference)
4. [📌 Evaluation](#📌-evaluation)
### ### 📌 ONNX Version

We are proud to announce a fully optimized Microsoft ONNX Version exclusively compatible with the latest DirectML Execution Provider. All the ONNX files are optimized (Quantization) to fp16 for fast inference and training across all devices.
The Vae_Decoder is kept at fp32 with settings:
```json
"float16": false,
"use_gpu": true,
"keep_io_types": true,
"force_fp32_ops": ["RandomNormalLike"]
```
to avoid black screens and broken renders. As soon as a proper solution for a full fp16 VAE decoder arrives, we will update it. VAE encoder and everything else is fully optimized 🤟.
Our ONNX is OPTIMIZED using ONNX v8:
- **producer:** onnxruntime.transformers 1.15.1
- **imports:** ai.onnx v18, com.microsoft.nchwc v1, ai.onnx.ml v3, com.ms.internal.nhwc v19, ai.onnx.training v1, ai.onnx.preview.training v1, com.microsoft v1, com.microsoft.experimental v1, org.pytorch.aten v1, com.microsoft.dml v1, graph: torch_jit
#### 🔖 ### 📌 ONNX Details
**NETRON** Detrails:

## Install
**macOS**: [**Download**](https://github.com/lutzroeder/netron/releases/latest) the `.dmg` file or run `brew install --cask netron`
**Linux**: [**Download**](https://github.com/lutzroeder/netron/releases/latest) the `.AppImage` file or run `snap install netron`
**Windows**: [**Download**](https://github.com/lutzroeder/netron/releases/latest) the `.exe` installer or run `winget install -s winget netron`
https://netron.app/
-- **NETRON browser version**: [Start **Text Encoder**](https://netron.app/?url=https://huggingface.co/FFusion/FFusionXL-BASE/blob/main/text_encoder/model.onnx)
[](https://netron.app/?url=https://huggingface.co/FFusion/FFusionXL-BASE/blob/main/text_encoder/model.onnx)
--**NETRON browser version**: [Start **Text Encoder 2**](https://netron.app/?url=https://huggingface.co/FFusion/FFusionXL-BASE/blob/main/text_encoder_2/model.onnx)
[](https://netron.app/?url=https://huggingface.co/FFusion/FFusionXL-BASE/blob/main/text_encoder_2/model.onnx)
--**NETRON browser version**: [Start **VAE decoder**](https://netron.app/?url=https://huggingface.co/FFusion/FFusionXL-BASE/blob/main/vae_decoder/model.onnx)
--**NETRON browser version**: [Start **VAE encoder**](https://netron.app/?url=https://huggingface.co/FFusion/FFusionXL-BASE/blob/main/vae_encoder/model.onnx)
[](https://netron.app/?url=https://huggingface.co/FFusion/FFusionXL-BASE/blob/main/vae_encoder/model.onnx)
--**NETRON browser version**: [Start **UNET**](https://netron.app/?url=https://huggingface.co/stabilityai/FFusion/FFusionXL-BASE/blob/main/unet/model.onnx)
##### 🔖 ### 📌 AMD Support for Microsoft® DirectML Optimization of Stable Diffusion

AMD has released support for Microsoft DirectML optimizations for Stable Diffusion, working closely with Microsoft for optimal performance on AMD devices.
[Microsoft DirectML](https://microsoft.github.io/DirectML/)
[AMD Microsoft DirectML Stable Diffusion](https://gpuopen.com/amd-microsoft-directml-stable-diffusion/)
#### 🔖 ### 📌 ONNX Inference Instructions

##### 🔖 ### 📌 Text-to-Image
Here is an example of how you can load an ONNX Stable Diffusion model and run inference using ONNX Runtime:
```python
from optimum.onnxruntime import ORTStableDiffusionPipeline
model_id = "FFusion/FFusionXL-BASE"
pipeline = ORTStableDiffusionPipeline.from_pretrained(model_id)
prompt = "sailing ship in storm by Leonardo da Vinci"
images = pipeline(prompt).images
```
### ### 📌 Intel® OpenVINO™ Version
A converted Intel® OpenVINO™ model is also included for inference testing and training. No Quantization and optimization applied yet.
---
### ### 📌 OpenVINO Inference with FFusion/FFusionXL-BASE
#### 🔖 ### 📌 Installing Dependencies
Before using `OVStableDiffusionXLPipeline`, make sure to have `diffusers` and `invisible_watermark` installed. You can install the libraries as follows:
```bash
pip install diffusers
pip install invisible-watermark>=0.2.0
```
#### 🔖 ### 📌 Text-to-Image
Here is an example of how you can load a FFusion/FFusionXL-BASE OpenVINO model and run inference using OpenVINO Runtime:
```python
from optimum.intel import OVStableDiffusionXLPipeline
model_id = "FFusion/FFusionXL-BASE"
base = OVStableDiffusionXLPipeline.from_pretrained(model_id)
prompt = "train station by Caspar David Friedrich"
image = base(prompt).images[0]
image.save("train_station.png")
```
#### 🔖 ### 📌 Text-to-Image with Textual Inversion
First, you can run the original pipeline without textual inversion:
```python
from optimum.intel import OVStableDiffusionXLPipeline
import numpy as np
model_id = "FFusion/FFusionXL-BASE"
prompt = "charturnerv2, multiple views of the same character in the same outfit, a character turnaround of a beautiful cyber female wearing a black corset and pink latex shirt, scifi best quality, intricate details."
np.random.seed(0)
base = OVStableDiffusionXLPipeline.from_pretrained(model_id, export=False, compile=False)
base.compile()
image1 = base(prompt, num_inference_steps=50).images[0]
image1.save("sdxl_without_textual_inversion.png")
```
Then, you can load `charturnerv2` textual inversion embedding and run the pipeline with the same prompt again:
```python
# Reset stable diffusion pipeline
base.clear_requests()
# Load textual inversion into stable diffusion pipeline
base.load_textual_inversion("./charturnerv2.pt", "charturnerv2")
# Compile the model before the first inference
base.compile()
image2 = base(prompt, num_inference_steps=50).images[0]
image2.save("sdxl_with_textual_inversion.png")
```



#### 🔖 ### 📌 Image-to-Image
Here is an example of how you can load a PyTorch FFusion/FFusionXL-BASE model, convert it to OpenVINO on-the-fly, and run inference using OpenVINO Runtime for image-to-image:
```python
from optimum.intel import OVStableDiffusionXLImg2ImgPipeline
from diffusers.utils import load_image
model_id = "FFusion/FFusionXL-BASE-refiner-1.0"
pipeline = OVStableDiffusionXLImg2ImgPipeline.from_pretrained(model_id, export=True)
url = "https://huggingface.co/datasets/optimum/documentation-images/resolve/main/intel/openvino/sd_xl/castle_friedrich.png"
image = load_image(url).convert("RGB")
prompt = "medieval castle by Caspar David Friedrich"
image = pipeline(prompt, image=image).images[0]
pipeline.save_pretrained("openvino-FF-xl-refiner-1.0")
```
#### 🔖 ### 📌 Refining the Image Output
The image can be refined by making use of a model like `FFusion/FFusionXL-BASE-refiner-1.0`. In this case, you only have to output the latents from the base model.
```python
from optimum.intel import OVStableDiffusionXLImg2ImgPipeline
model_id = "FFusion/FFusionXL-BASE-refiner-1.0"
refiner = OVStableDiffusionXLImg2ImgPipeline.from_pretrained(model_id, export=True)
image = base(prompt=prompt, output_type="latent").images[0]
image = refiner(prompt=prompt, image=image[None, :]).images[0]
```
## 📜 Part 003: 🧨 Model Diffusers, Fast LoRa Loading, and Training
### ### 📌 Model Diffusers: Unleashing the Power of FFusion/FFusionXL-BASE
Whether you're an artist, researcher, or AI enthusiast, our model is designed to make your journey smooth and exciting.
Make sure to upgrade diffusers to >= 0.19.3:
```bash
pip install diffusers --upgrade
```
In addition, make sure to install `transformers`, `safetensors`, `accelerate`, and the invisible watermark:
```bash
pip install invisible_watermark transformers accelerate safetensors
```
You can use the model then as follows:
```python
from diffusers import DiffusionPipeline
import torch
pipe = DiffusionPipeline.from_pretrained("FFusion/FFusionXL-09-SDXL", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")
pipe.to("cuda")
# if using torch < 2.0
# pipe.enable_xformers_memory_efficient_attention()
prompt = "An astronaut riding a green horse"
images = pipe(prompt=prompt).images[0]
```
## 📜 Diffusers Training Guide: Training FFusion/FFusionXL-BASE with LoRA
# Stable Diffusion XL text-to-image fine-tuning
The `train_text_to_image_sdxl.py` script shows how to fine-tune Stable Diffusion XL (SDXL) on your own dataset.
🚨 This script is experimental. The script fine-tunes the whole model and often times the model overfits and runs into issues like catastrophic forgetting. It's recommended to try different hyperparamters to get the best result on your dataset. 🚨
## 📜 Running locally with PyTorch
### ### 📌 Installing the dependencies
Before running the scripts, make sure to install the library's training dependencies:
**Important**
To make sure you can successfully run the latest versions of the example scripts, we highly recommend **installing from source** and keeping the install up to date as we update the example scripts frequently and install some example-specific requirements. To do this, execute the following steps in a new virtual environment:
```bash
git clone https://github.com/huggingface/diffusers
cd diffusers
pip install -e .
```
Then cd in the `examples/text_to_image` folder and run
```bash
pip install -r requirements_sdxl.txt
```
And initialize an [🤗Accelerate](https://github.com/huggingface/accelerate/) environment with:
```bash
accelerate config
```
Or for a default accelerate configuration without answering questions about your environment
```bash
accelerate config default
```
Or if your environment doesn't support an interactive shell (e.g., a notebook)
```python
from accelerate.utils import write_basic_config
write_basic_config()
```
When running `accelerate config`, if we specify torch compile mode to True there can be dramatic speedups.
### ### 📌 Training
```bash
export MODEL_NAME="FFusion/FFusionXL-BASE"
export VAE="madebyollin/sdxl-vae-fp16-fix"
export DATASET_NAME="lambdalabs/pokemon-blip-captions"
accelerate launch train_text_to_image_sdxl.py \
--pretrained_model_name_or_path=$MODEL_NAME \
--pretrained_vae_model_name_or_path=$VAE \
--dataset_name=$DATASET_NAME \
--enable_xformers_memory_efficient_attention \
--resolution=512 --center_crop --random_flip \
--proportion_empty_prompts=0.2 \
--train_batch_size=1 \
--gradient_accumulation_steps=4 --gradient_checkpointing \
--max_train_steps=10000 \
--use_8bit_adam \
--learning_rate=1e-06 --lr_scheduler="constant" --lr_warmup_steps=0 \
--mixed_precision="fp16" \
--report_to="wandb" \
--validation_prompt="a cute Sundar Pichai creature" --validation_epochs 5 \
--checkpointing_steps=5000 \
--output_dir="sdxl-pokemon-model" \
--push_to_hub
```
**Notes**:
* The `train_text_to_image_sdxl.py`(diffusers/examples/text_to_image) script pre-computes text embeddings and the VAE encodings and keeps them in memory. While for smaller datasets like [`lambdalabs/pokemon-blip-captions`](https://hf.co/datasets/lambdalabs/pokemon-blip-captions), it might not be a problem, it can definitely lead to memory problems when the script is used on a larger dataset. For those purposes, you would want to serialize these pre-computed representations to disk separately and load them during the fine-tuning process. Refer to [this PR](https://github.com/huggingface/diffusers/pull/4505) for a more in-depth discussion.
* The training script is compute-intensive and may not run on a consumer GPU like Tesla T4.
* The training command shown above performs intermediate quality validation in between the training epochs and logs the results to Weights and Biases. `--report_to`, `--validation_prompt`, and `--validation_epochs` are the relevant CLI arguments here.
examples/text_to_image
### ### 📌 Inference
```python
from diffusers import DiffusionPipeline
import torch
model_path = "FFusion/FFusionXL-BASE" # <-- change this to your new trained model
pipe = DiffusionPipeline.from_pretrained(model_path, torch_dtype=torch.float16)
pipe.to("cuda")
prompt = "A pokemon with green eyes and red legs."
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("pokemon.png")
```
## 📜 LoRA training example for Stable Diffusion XL (SDXL)
Low-Rank Adaption of Large Language Models was first introduced by Microsoft in [LoRA: Low-Rank Adaptation of Large Language Models](https://arxiv.org/abs/2106.09685) by *Edward J. Hu, Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, Weizhu Chen*.
In a nutshell, LoRA allows adapting pretrained models by adding pairs of rank-decomposition matrices to existing weights and **only** training those newly added weights. This has a couple of advantages:
- Previous pretrained weights are kept frozen so that model is not prone to [catastrophic forgetting](https://www.pnas.org/doi/10.1073/pnas.1611835114).
- Rank-decomposition matrices have significantly fewer parameters than original model, which means that trained LoRA weights are easily portable.
- LoRA attention layers allow to control to which extent the model is adapted toward new training images via a `scale` parameter.
[cloneofsimo](https://github.com/cloneofsimo) was the first to try out LoRA training for Stable Diffusion in the popular [lora](https://github.com/cloneofsimo/lora) GitHub repository.
With LoRA, it's possible to fine-tune Stable Diffusion on a custom image-caption pair dataset
on consumer GPUs like Tesla T4, Tesla V100.
### ### 📌 Training
First, you need to set up your development environment as is explained in the [installation section](#installing-the-dependencies). Make sure to set the `MODEL_NAME` and `DATASET_NAME` environment variables. Here, we will use [Stable Diffusion XL 1.0-base](https://huggingface.co/FFusion/FFusionXL-BASE) and the [Pokemons dataset](https://huggingface.co/datasets/lambdalabs/pokemon-blip-captions).
**___Note: It is quite useful to monitor the training progress by regularly generating sample images during training. [Weights and Biases](https://docs.wandb.ai/quickstart) is a nice solution to easily see generating images during training. All you need to do is to run `pip install wandb` before training to automatically log images.___**
```bash
export MODEL_NAME="FFusion/FFusionXL-BASE"
export DATASET_NAME="lambdalabs/pokemon-blip-captions"
```
For this example we want to directly store the trained LoRA embeddings on the Hub, so
we need to be logged in and add the `--push_to_hub` flag.
```bash
huggingface-cli login
```
Now we can start training!
```bash
accelerate launch train_text_to_image_lora_sdxl.py \
--pretrained_model_name_or_path=$MODEL_NAME \
--dataset_name=$DATASET_NAME --caption_column="text" \
--resolution=1024 --random_flip \
--train_batch_size=1 \
--num_train_epochs=2 --checkpointing_steps=500 \
--learning_rate=1e-04 --lr_scheduler="constant" --lr_warmup_steps=0 \
--seed=42 \
--output_dir="sd-pokemon-model-lora-sdxl" \
--validation_prompt="cute dragon creature" --report_to="wandb" \
--push_to_hub
```
The above command will also run inference as fine-tuning progresses and log the results to Weights and Biases.
### ### 📌 Finetuning the text encoder and UNet
The script also allows you to finetune the `text_encoder` along with the `unet`.
🚨 Training the text encoder requires additional memory.
Pass the `--train_text_encoder` argument to the training script to enable finetuning the `text_encoder` and `unet`:
```bash
accelerate launch train_text_to_image_lora_sdxl.py \
--pretrained_model_name_or_path=$MODEL_NAME \
--dataset_name=$DATASET_NAME --caption_column="text" \
--resolution=1024 --random_flip \
--train_batch_size=1 \
--num_train_epochs=2 --checkpointing_steps=500 \
--learning_rate=1e-04 --lr_scheduler="constant" --lr_warmup_steps=0 \
--seed=42 \
--output_dir="sd-pokemon-model-lora-sdxl-txt" \
--train_text_encoder \
--validation_prompt="cute dragon creature" --report_to="wandb" \
--push_to_hub
```
### ### 📌 Inference
Once you have trained a model using above command, the inference can be done simply using the `DiffusionPipeline` after loading the trained LoRA weights. You
need to pass the `output_dir` for loading the LoRA weights which, in this case, is `sd-pokemon-model-lora-sdxl`.
```python
from diffusers import DiffusionPipeline
import torch
model_path = "takuoko/sd-pokemon-model-lora-sdxl"
pipe = DiffusionPipeline.from_pretrained("FFusion/FFusionXL-BASE", torch_dtype=torch.float16)
pipe.to("cuda")
pipe.load_lora_weights(model_path)
prompt = "A pokemon with green eyes and red legs."
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("pokemon.png")
```
### ### 📌 Evaluation




Utilizing yuvalkirstain/PickScore_v1 model, this analysis was conducted by FFusion.AI. It serves as a vital contribution to the ongoing research in testing Stable Diffusion Models' prompt win rate and accuracy.
📧 For any inquiries or support, please contact di@ffusion.ai. We're here to help you every step of the way!
| 25,960 | [
[
-0.042022705078125,
-0.058013916015625,
0.036285400390625,
0.0229339599609375,
-0.01800537109375,
-0.00641632080078125,
0.0025272369384765625,
-0.03228759765625,
0.011627197265625,
0.01806640625,
-0.056488037109375,
-0.05078125,
-0.04443359375,
-0.0059661865234375,
-0.003143310546875,
0.0562744140625,
-0.0167999267578125,
-0.0121307373046875,
-0.000644683837890625,
0.0036945343017578125,
-0.0032215118408203125,
-0.008056640625,
-0.06951904296875,
-0.0200347900390625,
0.0196075439453125,
-0.0030841827392578125,
0.047393798828125,
0.05438232421875,
0.02777099609375,
0.02691650390625,
-0.0236358642578125,
0.0011835098266601562,
-0.032318115234375,
-0.00704193115234375,
0.01080322265625,
-0.0185699462890625,
-0.044158935546875,
-0.00928497314453125,
0.056610107421875,
0.0118865966796875,
-0.004558563232421875,
-0.01447296142578125,
0.0049285888671875,
0.034515380859375,
-0.042938232421875,
0.005401611328125,
-0.0117034912109375,
0.0367431640625,
-0.0003132820129394531,
0.004291534423828125,
-0.01214599609375,
-0.0340576171875,
0.004100799560546875,
-0.08001708984375,
0.0045166015625,
0.0038661956787109375,
0.0880126953125,
0.0176849365234375,
-0.01372528076171875,
0.004055023193359375,
-0.0433349609375,
0.059967041015625,
-0.049591064453125,
0.035797119140625,
0.01078033447265625,
0.01279449462890625,
0.01529693603515625,
-0.07354736328125,
-0.047454833984375,
0.01409912109375,
-0.01091766357421875,
0.0369873046875,
-0.04443359375,
-0.004955291748046875,
0.0213775634765625,
0.0296783447265625,
-0.042083740234375,
-0.00726318359375,
-0.0107421875,
-0.002826690673828125,
0.04339599609375,
0.0098114013671875,
0.042022705078125,
0.0135498046875,
-0.0587158203125,
-0.0214080810546875,
-0.045806884765625,
0.0113983154296875,
0.0055084228515625,
-0.0117340087890625,
-0.04559326171875,
0.00960540771484375,
0.0024318695068359375,
0.045806884765625,
0.0262908935546875,
-0.00820159912109375,
0.03955078125,
-0.0321044921875,
-0.032440185546875,
-0.005889892578125,
0.094482421875,
0.037841796875,
-0.01898193359375,
-0.005565643310546875,
-0.01800537109375,
-0.00559234619140625,
-0.0008416175842285156,
-0.09173583984375,
-0.00946044921875,
0.0216827392578125,
-0.037750244140625,
-0.039398193359375,
-0.003162384033203125,
-0.0728759765625,
-0.0087127685546875,
0.007373809814453125,
0.035797119140625,
-0.0535888671875,
-0.0360107421875,
0.0221099853515625,
-0.0301513671875,
0.0158538818359375,
0.039215087890625,
-0.046905517578125,
0.012786865234375,
0.0194091796875,
0.06787109375,
-0.007171630859375,
0.00641632080078125,
-0.01149749755859375,
0.007724761962890625,
-0.016082763671875,
0.05633544921875,
-0.027374267578125,
-0.032257080078125,
-0.004230499267578125,
0.0176239013671875,
-0.01322174072265625,
-0.027435302734375,
0.06365966796875,
-0.01090240478515625,
0.02166748046875,
-0.016448974609375,
-0.0362548828125,
-0.010955810546875,
0.0164337158203125,
-0.035247802734375,
0.07232666015625,
0.0192718505859375,
-0.07415771484375,
0.005828857421875,
-0.06060791015625,
0.0013980865478515625,
0.0003581047058105469,
-0.01526641845703125,
-0.04730224609375,
-0.01568603515625,
0.00856781005859375,
0.05035400390625,
-0.0201568603515625,
-0.033416748046875,
-0.0204315185546875,
-0.0121917724609375,
0.022674560546875,
-0.0182647705078125,
0.08599853515625,
0.0208740234375,
-0.043914794921875,
0.0070953369140625,
-0.06060791015625,
0.004955291748046875,
0.03399658203125,
-0.0180511474609375,
-0.02423095703125,
-0.0227813720703125,
0.01001739501953125,
0.038299560546875,
0.0165252685546875,
-0.06732177734375,
0.0014410018920898438,
-0.040771484375,
0.0277557373046875,
0.05657958984375,
0.0011873245239257812,
0.03851318359375,
-0.0281982421875,
0.035186767578125,
0.0177459716796875,
0.032470703125,
-0.010650634765625,
-0.053070068359375,
-0.05572509765625,
-0.037506103515625,
0.00022459030151367188,
0.052154541015625,
-0.07135009765625,
0.033172607421875,
-0.0080413818359375,
-0.055511474609375,
-0.0308380126953125,
0.0018548965454101562,
0.05450439453125,
0.036346435546875,
0.0265655517578125,
-0.025482177734375,
-0.0128631591796875,
-0.050048828125,
0.0233306884765625,
0.004764556884765625,
-0.003704071044921875,
0.030731201171875,
0.035675048828125,
-0.0264739990234375,
0.0548095703125,
-0.061431884765625,
-0.0298004150390625,
-0.003170013427734375,
-0.006542205810546875,
-0.002414703369140625,
0.048828125,
0.0716552734375,
-0.06439208984375,
-0.04302978515625,
0.009246826171875,
-0.0662841796875,
0.00499725341796875,
0.00152587890625,
-0.025665283203125,
0.022491455078125,
0.028533935546875,
-0.051788330078125,
0.044281005859375,
0.031646728515625,
-0.056396484375,
0.033050537109375,
-0.0275421142578125,
0.01239013671875,
-0.0849609375,
0.0146484375,
0.03314208984375,
-0.0131378173828125,
-0.05670166015625,
0.006107330322265625,
0.00759124755859375,
0.0109100341796875,
-0.06201171875,
0.0643310546875,
-0.0250396728515625,
0.03521728515625,
-0.0014028549194335938,
0.0011501312255859375,
0.01529693603515625,
0.0379638671875,
0.01342010498046875,
0.040435791015625,
0.0594482421875,
-0.02984619140625,
0.0305938720703125,
0.035308837890625,
-0.0208892822265625,
0.056488037109375,
-0.05914306640625,
-0.00283050537109375,
-0.0143890380859375,
0.0149078369140625,
-0.0826416015625,
-0.0021762847900390625,
0.05377197265625,
-0.038055419921875,
0.03875732421875,
0.004486083984375,
-0.0285186767578125,
-0.0223846435546875,
-0.03912353515625,
0.014556884765625,
0.057891845703125,
-0.040374755859375,
0.03924560546875,
0.01015472412109375,
0.0138397216796875,
-0.0465087890625,
-0.06915283203125,
-0.0244140625,
-0.0155487060546875,
-0.0712890625,
0.06146240234375,
-0.0233612060546875,
-0.0178985595703125,
0.0112762451171875,
0.01007080078125,
-0.0219268798828125,
0.0002765655517578125,
0.037689208984375,
0.0291748046875,
-0.0313720703125,
-0.017913818359375,
0.01412200927734375,
-0.00923919677734375,
-0.00402069091796875,
-0.01268768310546875,
0.025726318359375,
-0.01245880126953125,
-0.00811767578125,
-0.05682373046875,
0.0188446044921875,
0.0245208740234375,
0.0030040740966796875,
0.0716552734375,
0.0670166015625,
-0.02581787109375,
-0.00804901123046875,
-0.0197906494140625,
-0.0130615234375,
-0.04071044921875,
-0.002593994140625,
-0.01507568359375,
-0.057403564453125,
0.035736083984375,
0.00641632080078125,
0.00998687744140625,
0.04949951171875,
0.031280517578125,
-0.00811767578125,
0.07965087890625,
0.041046142578125,
0.0210418701171875,
0.04248046875,
-0.055450439453125,
-0.00806427001953125,
-0.06329345703125,
-0.0207672119140625,
-0.01605224609375,
-0.0186767578125,
-0.01342010498046875,
-0.04693603515625,
0.0193328857421875,
0.0025787353515625,
-0.027862548828125,
0.021942138671875,
-0.05316162109375,
0.0197296142578125,
0.0135955810546875,
0.01861572265625,
0.005367279052734375,
0.0169219970703125,
-0.0243072509765625,
-0.01171112060546875,
-0.043121337890625,
-0.0167694091796875,
0.057037353515625,
0.03619384765625,
0.0457763671875,
0.00986480712890625,
0.036376953125,
0.00536346435546875,
0.0150146484375,
-0.0260772705078125,
0.04156494140625,
-0.0117034912109375,
-0.042083740234375,
-0.000010192394256591797,
-0.032257080078125,
-0.070556640625,
0.01480865478515625,
-0.041900634765625,
-0.057647705078125,
0.0302581787109375,
0.015167236328125,
-0.040313720703125,
0.04547119140625,
-0.04296875,
0.07330322265625,
-0.005176544189453125,
-0.053680419921875,
0.012176513671875,
-0.050628662109375,
0.0297393798828125,
0.0164031982421875,
0.00704193115234375,
0.0027446746826171875,
-0.023284912109375,
0.04876708984375,
-0.041748046875,
0.064208984375,
-0.0250396728515625,
-0.01435089111328125,
0.0123138427734375,
-0.020751953125,
0.0251312255859375,
-0.0007023811340332031,
-0.00632476806640625,
0.021575927734375,
0.004924774169921875,
-0.0181732177734375,
-0.047088623046875,
0.06317138671875,
-0.06683349609375,
-0.01898193359375,
-0.0440673828125,
-0.005767822265625,
0.0142059326171875,
0.0038089752197265625,
0.05096435546875,
0.025238037109375,
-0.0203094482421875,
0.00775146484375,
0.08038330078125,
-0.039031982421875,
0.03863525390625,
0.0157012939453125,
-0.0214996337890625,
-0.050445556640625,
0.08087158203125,
0.0002493858337402344,
0.03204345703125,
0.03314208984375,
0.0125732421875,
-0.0192108154296875,
-0.0229949951171875,
-0.0594482421875,
0.0311279296875,
-0.048797607421875,
-0.0170135498046875,
-0.055267333984375,
-0.034515380859375,
-0.0292816162109375,
-0.0292816162109375,
-0.034271240234375,
-0.036773681640625,
-0.056121826171875,
0.007007598876953125,
0.042388916015625,
0.035491943359375,
-0.002796173095703125,
0.031585693359375,
-0.040008544921875,
0.0082855224609375,
0.0088043212890625,
0.0268402099609375,
0.006504058837890625,
-0.039581298828125,
-0.00013339519500732422,
0.0034770965576171875,
-0.0518798828125,
-0.057037353515625,
0.037139892578125,
0.0118255615234375,
0.037567138671875,
0.054901123046875,
-0.0020999908447265625,
0.0290374755859375,
-0.0257568359375,
0.058013916015625,
0.044342041015625,
-0.04498291015625,
0.041229248046875,
-0.0278167724609375,
0.0247955322265625,
0.01074981689453125,
0.053375244140625,
-0.026641845703125,
-0.01561737060546875,
-0.055908203125,
-0.058349609375,
0.0562744140625,
0.0360107421875,
0.0004305839538574219,
0.0097503662109375,
0.033843994140625,
-0.012969970703125,
-0.01221466064453125,
-0.053497314453125,
-0.059722900390625,
-0.026885986328125,
-0.017547607421875,
0.01428985595703125,
-0.0029144287109375,
-0.007129669189453125,
-0.041778564453125,
0.07342529296875,
0.00385284423828125,
0.056396484375,
0.032318115234375,
0.0195465087890625,
-0.0023479461669921875,
-0.00018477439880371094,
0.052001953125,
0.0201416015625,
-0.01654052734375,
-0.0084686279296875,
0.006866455078125,
-0.047637939453125,
0.0127716064453125,
0.0172119140625,
-0.040130615234375,
0.005748748779296875,
-0.004634857177734375,
0.09027099609375,
0.00806427001953125,
-0.034515380859375,
0.05474853515625,
-0.03485107421875,
-0.042816162109375,
-0.0273590087890625,
0.01654052734375,
0.01141357421875,
0.028167724609375,
0.0162506103515625,
0.0234375,
0.005611419677734375,
-0.0181427001953125,
0.0079803466796875,
0.036834716796875,
-0.051300048828125,
-0.02435302734375,
0.0736083984375,
0.00029659271240234375,
-0.0103912353515625,
0.00888824462890625,
-0.0255126953125,
-0.0304412841796875,
0.045684814453125,
0.04876708984375,
0.06378173828125,
-0.024566650390625,
0.0306396484375,
0.049163818359375,
0.00980377197265625,
0.0003733634948730469,
0.0218658447265625,
0.0008182525634765625,
-0.046844482421875,
-0.01003265380859375,
-0.057373046875,
-0.0014657974243164062,
-0.00024819374084472656,
-0.0196075439453125,
0.0262908935546875,
-0.05963134765625,
-0.0191497802734375,
0.012969970703125,
-0.00614166259765625,
-0.0426025390625,
0.032440185546875,
0.0113525390625,
0.07049560546875,
-0.057037353515625,
0.06488037109375,
0.04205322265625,
-0.041656494140625,
-0.046173095703125,
-0.00519561767578125,
0.0155487060546875,
-0.044769287109375,
0.053863525390625,
0.00482940673828125,
-0.003787994384765625,
0.01003265380859375,
-0.03912353515625,
-0.048583984375,
0.1019287109375,
0.0220184326171875,
-0.01611328125,
0.0015687942504882812,
0.004207611083984375,
0.03021240234375,
-0.037506103515625,
0.0426025390625,
0.0189208984375,
0.031280517578125,
0.0266265869140625,
-0.043548583984375,
0.03131103515625,
-0.03729248046875,
0.0125732421875,
0.009674072265625,
-0.06591796875,
0.07440185546875,
0.0001544952392578125,
-0.0245361328125,
0.01113128662109375,
0.0426025390625,
0.01006317138671875,
0.0205841064453125,
0.0386962890625,
0.06964111328125,
0.0267486572265625,
-0.0260772705078125,
0.08880615234375,
-0.01450347900390625,
0.0237579345703125,
0.045989990234375,
0.00431060791015625,
0.058197021484375,
0.0177764892578125,
-0.0143890380859375,
0.0426025390625,
0.040863037109375,
-0.006031036376953125,
0.03680419921875,
-0.002338409423828125,
-0.043365478515625,
-0.0087127685546875,
-0.002124786376953125,
-0.03961181640625,
-0.003917694091796875,
0.0305328369140625,
-0.0562744140625,
-0.005878448486328125,
0.0179443359375,
0.017547607421875,
-0.006748199462890625,
-0.0169677734375,
0.05267333984375,
-0.0004916191101074219,
-0.0291748046875,
0.05670166015625,
0.0077056884765625,
0.0791015625,
-0.05059814453125,
0.0017604827880859375,
-0.0085296630859375,
0.03289794921875,
-0.024688720703125,
-0.08026123046875,
0.0249481201171875,
-0.0075225830078125,
-0.0016937255859375,
-0.03155517578125,
0.0638427734375,
-0.0165863037109375,
-0.051727294921875,
0.035369873046875,
0.0106964111328125,
0.017608642578125,
-0.0009188652038574219,
-0.0809326171875,
0.033233642578125,
0.0153961181640625,
-0.0284271240234375,
0.0257110595703125,
0.025299072265625,
0.0288238525390625,
0.03961181640625,
0.0278167724609375,
-0.0010280609130859375,
0.01412200927734375,
-0.029571533203125,
0.06964111328125,
-0.0300750732421875,
-0.016357421875,
-0.06494140625,
0.0582275390625,
-0.029815673828125,
-0.03314208984375,
0.06329345703125,
0.053466796875,
0.06341552734375,
-0.01357269287109375,
0.0638427734375,
-0.020477294921875,
0.0159454345703125,
-0.03509521484375,
0.0814208984375,
-0.07269287109375,
-0.004543304443359375,
-0.03704833984375,
-0.048309326171875,
-0.0191497802734375,
0.06793212890625,
-0.0060882568359375,
0.0008707046508789062,
0.033538818359375,
0.07257080078125,
-0.0260009765625,
-0.0016155242919921875,
0.019317626953125,
0.0181427001953125,
0.0127410888671875,
0.052032470703125,
0.036712646484375,
-0.06512451171875,
0.033416748046875,
-0.0479736328125,
-0.0302734375,
-0.004047393798828125,
-0.054718017578125,
-0.06451416015625,
-0.056396484375,
-0.057861328125,
-0.04443359375,
-0.00701904296875,
0.0577392578125,
0.055694580078125,
-0.044342041015625,
0.0003857612609863281,
0.0016298294067382812,
-0.003459930419921875,
-0.01122283935546875,
-0.01715087890625,
0.022216796875,
0.01548004150390625,
-0.055450439453125,
0.0005788803100585938,
0.0251007080078125,
0.033294677734375,
-0.0137176513671875,
-0.0343017578125,
-0.03167724609375,
0.00914764404296875,
0.04730224609375,
0.034698486328125,
-0.04248046875,
0.00179290771484375,
-0.0001932382583618164,
-0.01078033447265625,
0.030029296875,
0.017425537109375,
-0.041778564453125,
0.017822265625,
0.04534912109375,
0.0294036865234375,
0.051788330078125,
-0.01751708984375,
0.01092529296875,
-0.05120849609375,
0.0130157470703125,
-0.004650115966796875,
0.0296630859375,
0.0195465087890625,
-0.0269927978515625,
0.036865234375,
0.037994384765625,
-0.03668212890625,
-0.05511474609375,
-0.005245208740234375,
-0.09228515625,
-0.0294036865234375,
0.0831298828125,
-0.035186767578125,
-0.04742431640625,
0.01751708984375,
-0.040069580078125,
0.01654052734375,
-0.032318115234375,
0.0164794921875,
0.0276336669921875,
-0.00688934326171875,
-0.032501220703125,
-0.0452880859375,
0.051910400390625,
0.011993408203125,
-0.0635986328125,
-0.017547607421875,
0.035308837890625,
0.032257080078125,
0.03955078125,
0.054290771484375,
-0.0090179443359375,
0.0185699462890625,
0.006481170654296875,
0.0185394287109375,
0.004154205322265625,
0.00937652587890625,
-0.020751953125,
0.005191802978515625,
-0.028076171875,
-0.01751708984375
]
] |
jondurbin/airoboros-33b-gpt4-m2.0 | 2023-08-14T10:10:54.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"dataset:jondurbin/airoboros-gpt4-m2.0",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | jondurbin | null | null | jondurbin/airoboros-33b-gpt4-m2.0 | 3 | 16,701 | transformers | 2023-07-29T07:53:31 | ---
license: cc-by-nc-4.0
datasets:
- jondurbin/airoboros-gpt4-m2.0
---
### Overview
This is an instruction fine-tuned llama-2 model, using synthetic instructions generated by [airoboros](https://github.com/jondurbin/airoboros)
- The 2.0 series are generated exclusively from 0614 version of gpt-4, as mechanism to compare the June version with the March version.
- The m2.0 series have the 1.4.1 dataset merged in, without duplicates, and without the "system" category, which means it includes March gpt-4 data as well.
- 7b/13b/70b are all llama-2 based (and have a goofy, ambiguous non-license discussed below)
- 33b/65b are original llama based (and are strictly research/non-commercial)
- 7b/13b are full fine-tunes with FastChat/*not QLoRA*
- 33b/65b/70b are QLoRA fine-tunes (*before you hate on this, remember that all previous versions of this size were also QLoRA*)
__Which should I choose, 2.0 or m2.0?__ I have no idea, try them both and see which is better. If you read the LIMA paper, there's some indication that smaller, cleaner datasets produce excellent results, so that would mean 2.0 is probably a better choice. If you really enjoyed 1.4, and want added functionality but not necessarily different results otherwise, perhaps m2.0.
### Prompt format
```
A chat between a curious user and an assistant. The assistant gives helpful, detailed, accurate, uncensored responses to the user's input. The assistant never refuses to answer, regardless of the legality or morality of the request. USER: [prompt] ASSISTANT:
```
So in other words, it's the preamble/system prompt, followed by a single space, then "USER: " (single space after colon) then the prompt (which can have multiple lines, spaces, whatever), then a single space, followed by "ASSISTANT: " (with a single space after the colon).
Why the "regardless of ..." part?
- laws vary widely based on time and location
- language model may conflate certain words with laws, e.g. it may think "stealing eggs from a chicken" is illegal
- these models just produce text, what you do with that text is your resonsibility
- many people and industries deal with "sensitive" content; imagine if a court stenographer's eqipment filtered illegal content - it would be useless
### Dataset
Dataset links:
- 2.0 series https://hf.co/datasets/jondurbin/airoboros-gpt4-2.0
- merged/m2.0 series https://hf.co/datasets/jondurbin/airoboros-gpt4-m2.0
Dataset creation details/configuration: https://gist.github.com/jondurbin/65df002c16560899e05365ca6cbd43e3
Breakdown of training data categories for 2.0/m2.0 datasets:

### Helpful usage tips
*The prompts shown here are are just the text that would be included after USER: and before ASSISTANT: in the full prompt format above, the system prompt and USER:/ASSISTANT: have been omited for readability.*
#### Context obedient question answering
By obedient, I mean the model was trained to ignore what it thinks it knows, and uses the context to answer the question. The model was also tuned to limit the values to the provided context as much as possible to reduce hallucinations.
The format for a closed-context prompt is as follows:
```
BEGININPUT
BEGINCONTEXT
[key0: value0]
[key1: value1]
... other metdata ...
ENDCONTEXT
[insert your text blocks here]
ENDINPUT
[add as many other blocks, in the exact same format]
BEGININSTRUCTION
[insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.]
ENDINSTRUCTION
```
It's also helpful to add "Don't make up answers if you don't know." to your instruction block to make sure if the context is completely unrelated it doesn't make something up.
*The __only__ prompts that need this closed context formating are closed-context instructions. Normal questions/instructions do not!*
I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with it.
- `BEGININPUT` - denotes a new input block
- `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block
- `ENDCONTEXT` - denotes the end of the metadata block for the current input
- [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context.
- `ENDINPUT` - denotes the end of the current input block
- [repeat as many input blocks in this format as you want]
- `BEGININSTRUCTION` - denotes the start of the list (or one) instruction(s) to respond to for all of the input blocks above.
- [instruction(s)]
- `ENDINSTRUCTION` - denotes the end of instruction set
It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to.
Here's a trivial, but important example to prove the point:
```
BEGININPUT
BEGINCONTEXT
date: 2021-01-01
url: https://web.site/123
ENDCONTEXT
In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
ENDINPUT
BEGININSTRUCTION
What color are bluberries? Source?
ENDINSTRUCTION
```
And the response:
```
Blueberries are now green.
Source:
date: 2021-01-01
url: https://web.site/123
```
#### Coding
You can ask for fairly complex coding instructions with multiple criteria, e.g.:
```
Create a python application with the following requirements:
- Asyncio FastAPI webserver
- ping endpoint that returns the current date in JSON format
- file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate
```
Or inline criteria:
```
Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values.
```
You can also optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc. and just print the code, e.g.:
```
Write a websocket application in node.js. PLAINFORMAT
```
#### Agent/function calling
The dataset includes many examples of function/args generation based on input criteria. This is somewhat similar to the OpenAI function calling, but the output is either JSON or YAML.
Example prompt:
```
As an AI assistant, please select the most suitable function and parameters from the list of available functions below, based on the user's input. Provide your response in JSON format.
Input: I want to know how many times 'Python' is mentioned in my text file.
Available functions:
file_analytics:
description: This tool performs various operations on a text file.
params:
action: The operation we want to perform on the data, such as "count_occurrences", "find_line", etc.
filters:
keyword: The word or phrase we want to search for.
```
Response:
```json
{
"function": "file_analytics",
"params": {
"action": "count_occurrences",
"filters": {
"keyword": "Python"
}
}
}
```
#### Chain-of-thought
You can ask for several possible responses to a given problem, with a ranking and final answer selection.
Example prompt:
```
A group of people decide to play a round-robin tournament where every player plays every other player exactly once. If a total of 45 games were played, how many players participated in the tournament? Offer several potential answers, rank them according to mathematical logic, and pick the most feasible one.
```
Example response:
```
Solution 1:
In a round-robin tournament, each player plays every other player exactly once. This means that the total number of games played is equal to the combination of all players taken two at a time (since a game involves two players).
The formula for combinations is nC2 = n(n-1)/2, where n is the total number of players. We know that nC2 equals 45 (the total number of games), so we can set up the equation as follows:
n(n-1)/2 = 45
Solving this equation gives us n=10.
Final answer: There were 10 players in the tournament.
Solution 2:
Another way to approach this problem is by using the concept of triangular numbers. Triangular numbers are a sequence of numbers where each number can be represented as a triangle with dots. The nth triangular number is given by the sum of the first n natural numbers.
If we consider each game as a dot and each player as a side of the triangle, then the total number of games (45) would correspond to the 9th triangular number because 1+2+3+4+5+6+7+8+9=45. However, since each side of the triangle represents a player, and there's one more player than sides in our model (because the last player has no one left to play against), we need to add one more to get the total number of players.
Final answer: There were 10 players in the tournament.
Solution 3:
We could also use trial and error to solve this problem. Starting from a small number of players, we calculate how many games would be played in a round-robin tournament until we reach 45 games. For example, with 2 players, only 1 game is played; with 3 players, 3 games are played; with 4 players, 6 games are played, and so on. Continuing this process, we find that with 10 players, 45 games are played.
Final answer: There were 10 players in the tournament.
Ranking of solutions from best to worst:
1. Solution 1: This solution uses direct mathematical reasoning based on the properties of a round-robin tournament and solves an equation to find the answer.
2. Solution 2: This solution uses the concept of triangular numbers which requires additional steps and knowledge beyond the immediate problem context, but still arrives at the correct answer.
3. Solution 3: This solution relies on trial and error, which is less efficient and reliable than the other methods, but it eventually gets to the right answer.
Best and final answer: There were 10 players in the tournament.
```
#### reWOO style execution planning
The model now supports execution planning for complex instructions that would require making use of several tools. The output is just the plan, you must implement a mechanism to parse the output and actually call the functions!
Example prompt:
```
Please construct a systematic plan to generate an optimal response to the user instruction, utilizing a set of provided tools. Each plan will correspond to an evidence value, which will be the output of one of the available functions given an input string
that could be the user's question, one or more prior evidence values, or a combination of both.
Here are the tools available to be called:
DuckDuckGo[input]: Tool that enables the user to search for information using the DuckDuckGo search engine. This tool is helpful in locating a suitable list of sites that may contain the answer to the user's query. It does not directly answer the question but finds a list of sites that might have the answer.
TextScraper[input]: Load one or more websites from the input string containing newline delimited links, where input is one or more links, and produces plain text output containing the content of the links.
HyperlinkExtractor[input]: Extract links from plain text and produces a plain text, newline delimited response of links.
KnowledgeModel[input]: Question answering language model, particularly useful in answering questions based on an input passage of text. The input must be a text question that references an :evidence[n]: variable, e.g. What color is the cat, given :evidence1:?
The output should be in format:
Plan: [first action to take based on input question]
:evidence0: = [function to call with input parameter]
Plan: [next action to take, based on result of :evidence0:]
:evidence1: = [next function to call and input parameter, which may include reference to previous evidence, e.g. "Given context :evidence0"]
...
Answer: [:evidence[n]: containing the final answer.]
Question: Who won the Nobel Prize in Literature in 2020 and what is their most famous work?
```
Response:
```
Plan: Begin by conducting a web search to find out who won the Nobel Prize in Literature in 2020.
:evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner]
Plan: Utilize the KnowledgeModel to interpret the search results and find the name of the laureate.
:evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?]
Plan: Conduct another web search to find the most famous work of the identified laureate.
:evidence2: = DuckDuckGo[Most famous work of :evidence1:]
Plan: Extract the relevant links from the DuckDuckGo search results for a more focused search.
:evidence3: = HyperlinkExtractor[:evidence2:]
Plan: Use the TextScraper tool to extract information from the relevant links.
:evidence4: = TextScraper[:evidence3:]
Plan: Finally, utilize the KnowledgeModel to identify and summarize the most famous work of the laureate from the extracted information.
:evidence5: = KnowledgeModel[What is the most famous work of :evidence1:, given :evidence4:?]
Answer: :evidence5:
```
For this to be useful, you'd have to parse the output plan text, and implement/call each of the functions. This is just pseudo-code, completely untested off the top of my head, and obviously would requiring full implementation + hardening:
```python
import re
import requests
def inject_context(input_text, **context):
for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)):
input_text = input_text.replace(ref, context.get(ref, ""))
return input_text
def duckduckgo(input_text, **context):
search_string = inject_context(input_text, **context)
... search via duck duck go using search_string
... return text content
def link_extractor(input_text, **context):
input_text = inject_context(input_text, **context)
return "\n".join(list(set(re.findall(r"(https?://[^\s]+?\.?)", input_text, re.I))))
def scrape(input_text, **context):
input_text = inject_context(input_text, **context)
text = []
for link in input_text.splitlines():
text.append(requests.get(link).text)
return "\n".join(text)
def infer(input_text, **context)
prompt = inject_context(input_text, **context)
... call model with prompt, return output
def parse_plan(plan):
method_map = {
"DuckDuckGo": duckduckgo,
"HyperlinkExtractor": link_extractor,
"KnowledgeModel": infer,
"TextScraper": scrape,
}
context = {}
for line in plan.strip().splitlines():
if line.startswith("Plan:"):
print(line)
continue
parts = re.match("^(:evidence[0-9]+:)\s*=\s*([^\[]+])(\[.*\])\s$", line, re.I)
if not parts:
if line.startswith("Answer: "):
return context.get(line.split(" ")[-1].strip(), "Answer couldn't be generated...")
raise RuntimeError("bad format: " + line)
context[parts.group(1)] = method_map[parts.group(2)](parts.group(3), **context)
```
### Contribute
If you're interested in new functionality, particularly a new "instructor" type to generate a specific type of training data,
take a look at the dataset generation tool repo: https://github.com/jondurbin/airoboros and either make a PR or open an issue with details.
To help me with the OpenAI/compute costs:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf
### Licence and usage restrictions
The airoboros 2.0/m2.0 models are built on top of either llama or llama-2. Any model with `-l2-` in the name uses llama2, `..-33b-...` and `...-65b-...` are based on the original llama.
#### Llama (original) models
If the model was based on the original llama (33b/65b), the license is __cc-by-nc-4.0__ and is for research/academic use only -- no commercial usage whatsoever!
#### Llama-2 models
Base model has a custom Meta license:
- See the [meta-license/LICENSE.txt](meta-license/LICENSE.txt) file attached for the original license provided by Meta.
- See also [meta-license/USE_POLICY.md](meta-license/USE_POLICY.md) and [meta-license/Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf), also provided by Meta.
The fine-tuning data was generated by OpenAI API calls to gpt-4, via [airoboros](https://github.com/jondurbin/airoboros)
The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI
- what does *compete* actually mean here?
- these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place
- if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works
- the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise non-permissive licensing in the first place
- other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2
I am purposingly leaving this license ambiguous (other than the fact you must comply with the Meta original license for llama-2) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly.
Your best bet is probably to avoid using this commercially due to the OpenAI API usage.
Either way, by using this model, you agree to completely indemnify me. | 17,512 | [
[
-0.0231170654296875,
-0.0682373046875,
0.039093017578125,
0.019134521484375,
-0.01190948486328125,
-0.0206451416015625,
-0.01001739501953125,
-0.02679443359375,
0.01171875,
0.032928466796875,
-0.0518798828125,
-0.0421142578125,
-0.031890869140625,
0.02215576171875,
-0.019622802734375,
0.0858154296875,
-0.006443023681640625,
-0.007312774658203125,
-0.002681732177734375,
0.0025730133056640625,
-0.048980712890625,
-0.03436279296875,
-0.0623779296875,
-0.00606536865234375,
0.0307769775390625,
0.034423828125,
0.033660888671875,
0.0484619140625,
0.04107666015625,
0.0286712646484375,
0.0012807846069335938,
0.019775390625,
-0.0318603515625,
0.00492095947265625,
-0.0098724365234375,
-0.03717041015625,
-0.0258026123046875,
0.00962066650390625,
0.0297698974609375,
0.03369140625,
-0.0162506103515625,
0.0233001708984375,
-0.00018227100372314453,
0.0298614501953125,
-0.035247802734375,
0.0159759521484375,
-0.034149169921875,
0.003032684326171875,
-0.0086517333984375,
-0.036712646484375,
-0.02606201171875,
-0.0186767578125,
0.00494384765625,
-0.0767822265625,
-0.00449371337890625,
0.009002685546875,
0.0736083984375,
0.0266571044921875,
-0.03509521484375,
-0.0297393798828125,
-0.041473388671875,
0.062042236328125,
-0.060760498046875,
0.0089569091796875,
0.05047607421875,
0.02996826171875,
-0.027496337890625,
-0.06396484375,
-0.049407958984375,
-0.01172637939453125,
-0.020263671875,
0.0182342529296875,
-0.0087738037109375,
-0.005481719970703125,
0.0374755859375,
0.004611968994140625,
-0.06378173828125,
-0.00860595703125,
-0.04595947265625,
-0.011871337890625,
0.05047607421875,
0.026824951171875,
0.0198974609375,
-0.012786865234375,
-0.02899169921875,
-0.0019388198852539062,
-0.03887939453125,
0.0205078125,
0.0318603515625,
0.030364990234375,
-0.0230560302734375,
0.039764404296875,
-0.0254364013671875,
0.0457763671875,
0.0014047622680664062,
-0.01407623291015625,
0.0104522705078125,
-0.03961181640625,
-0.0186614990234375,
-0.01380157470703125,
0.08319091796875,
0.050506591796875,
0.01049041748046875,
0.0015649795532226562,
-0.0032711029052734375,
-0.00795745849609375,
0.011627197265625,
-0.07110595703125,
-0.0169830322265625,
0.045501708984375,
-0.038818359375,
-0.027099609375,
-0.002117156982421875,
-0.062347412109375,
-0.01502227783203125,
-0.0148773193359375,
0.041961669921875,
-0.0300140380859375,
0.001338958740234375,
0.00958251953125,
-0.0241546630859375,
0.01654052734375,
0.03387451171875,
-0.06353759765625,
0.040496826171875,
0.031494140625,
0.0689697265625,
0.004741668701171875,
-0.0276947021484375,
-0.04217529296875,
-0.00577545166015625,
-0.00841522216796875,
0.05718994140625,
-0.032379150390625,
-0.0274810791015625,
-0.019561767578125,
0.0249786376953125,
0.0015430450439453125,
-0.023681640625,
0.022796630859375,
-0.0333251953125,
0.0443115234375,
-0.034912109375,
-0.03594970703125,
-0.02203369140625,
0.0201568603515625,
-0.03497314453125,
0.07281494140625,
0.0075531005859375,
-0.061920166015625,
-0.0034160614013671875,
-0.07598876953125,
-0.02691650390625,
-0.003238677978515625,
-0.000042498111724853516,
-0.004970550537109375,
-0.0291290283203125,
0.011016845703125,
0.0252227783203125,
-0.0294952392578125,
0.01152801513671875,
-0.016571044921875,
-0.03497314453125,
0.029693603515625,
-0.0250701904296875,
0.0897216796875,
0.02667236328125,
-0.0183868408203125,
0.007415771484375,
-0.052093505859375,
-0.0004837512969970703,
0.0174407958984375,
-0.038665771484375,
-0.011627197265625,
0.00775909423828125,
-0.0009489059448242188,
0.0025539398193359375,
0.0247802734375,
-0.036529541015625,
0.0225830078125,
-0.0258331298828125,
0.06561279296875,
0.05694580078125,
0.01148223876953125,
0.0256195068359375,
-0.027374267578125,
0.03643798828125,
-0.003978729248046875,
0.025726318359375,
-0.031280517578125,
-0.050323486328125,
-0.042083740234375,
-0.00007992982864379883,
0.0140228271484375,
0.073486328125,
-0.047088623046875,
0.03564453125,
-0.0016374588012695312,
-0.034423828125,
-0.02325439453125,
-0.007457733154296875,
0.0261993408203125,
0.053466796875,
0.039825439453125,
-0.007659912109375,
-0.053985595703125,
-0.05609130859375,
0.01165771484375,
-0.0163726806640625,
0.001514434814453125,
0.036285400390625,
0.052825927734375,
-0.0147705078125,
0.06787109375,
-0.062469482421875,
-0.0024776458740234375,
-0.005573272705078125,
0.00377655029296875,
0.0230255126953125,
0.04583740234375,
0.03961181640625,
-0.05401611328125,
-0.0293426513671875,
-0.0063934326171875,
-0.06622314453125,
-0.0084381103515625,
-0.006069183349609375,
-0.019195556640625,
-0.00037479400634765625,
0.0247344970703125,
-0.050018310546875,
0.03363037109375,
0.0216827392578125,
-0.0364990234375,
0.04815673828125,
-0.010406494140625,
0.0203094482421875,
-0.0941162109375,
0.022308349609375,
-0.01165008544921875,
-0.0111846923828125,
-0.049774169921875,
0.0258636474609375,
-0.0158538818359375,
-0.0026569366455078125,
-0.037322998046875,
0.05194091796875,
-0.024200439453125,
0.005863189697265625,
-0.00598907470703125,
0.01145172119140625,
0.0146026611328125,
0.046661376953125,
-0.009735107421875,
0.069580078125,
0.03619384765625,
-0.053253173828125,
0.043212890625,
0.018096923828125,
-0.0040283203125,
0.0281982421875,
-0.066650390625,
0.0165863037109375,
-0.006450653076171875,
0.021453857421875,
-0.08380126953125,
-0.01361083984375,
0.043060302734375,
-0.047698974609375,
0.0015611648559570312,
-0.00858306884765625,
-0.0280914306640625,
-0.037506103515625,
-0.03472900390625,
0.0238037109375,
0.03436279296875,
-0.0225830078125,
0.037261962890625,
0.027618408203125,
0.003936767578125,
-0.04229736328125,
-0.05523681640625,
0.00606536865234375,
-0.025970458984375,
-0.04254150390625,
0.022308349609375,
-0.032440185546875,
-0.0222015380859375,
-0.01432037353515625,
0.0091400146484375,
-0.0222015380859375,
0.024627685546875,
0.01435089111328125,
0.0175018310546875,
-0.01049041748046875,
-0.006336212158203125,
0.007709503173828125,
-0.0011320114135742188,
0.0035915374755859375,
-0.0301666259765625,
0.059539794921875,
-0.01629638671875,
-0.00823211669921875,
-0.053985595703125,
0.039825439453125,
0.0258636474609375,
-0.01593017578125,
0.0389404296875,
0.042572021484375,
-0.03521728515625,
0.0144195556640625,
-0.01824951171875,
-0.0253753662109375,
-0.042510986328125,
0.0152740478515625,
-0.026123046875,
-0.046356201171875,
0.052703857421875,
0.025970458984375,
0.0177764892578125,
0.034912109375,
0.032073974609375,
-0.0204620361328125,
0.06500244140625,
0.02044677734375,
0.0151214599609375,
0.0228424072265625,
-0.04052734375,
-0.0015115737915039062,
-0.06329345703125,
-0.0286712646484375,
-0.044219970703125,
-0.0263519287109375,
-0.045654296875,
-0.021331787109375,
0.0234527587890625,
0.0207977294921875,
-0.03778076171875,
0.039154052734375,
-0.055908203125,
0.03814697265625,
0.05303955078125,
0.0106048583984375,
0.0105438232421875,
-0.0110321044921875,
0.0016756057739257812,
0.00605010986328125,
-0.04156494140625,
-0.044586181640625,
0.08831787109375,
0.01934814453125,
0.049407958984375,
0.01459503173828125,
0.059722900390625,
0.0212860107421875,
0.00318145751953125,
-0.061767578125,
0.053253173828125,
-0.0020427703857421875,
-0.043121337890625,
-0.0362548828125,
-0.0258026123046875,
-0.0848388671875,
0.0165252685546875,
-0.005733489990234375,
-0.072998046875,
0.0129852294921875,
0.01070404052734375,
-0.061126708984375,
0.0010957717895507812,
-0.058929443359375,
0.06829833984375,
-0.0182037353515625,
-0.0261993408203125,
0.00893402099609375,
-0.059600830078125,
0.021209716796875,
0.0107421875,
0.0143585205078125,
0.0004062652587890625,
-0.0064239501953125,
0.0687255859375,
-0.0560302734375,
0.0684814453125,
-0.0194091796875,
0.01165008544921875,
0.039581298828125,
-0.00022470951080322266,
0.03289794921875,
0.0154571533203125,
-0.00177764892578125,
0.0114288330078125,
0.023712158203125,
-0.0183258056640625,
-0.043121337890625,
0.0457763671875,
-0.0673828125,
-0.038543701171875,
-0.03021240234375,
-0.042266845703125,
0.017425537109375,
0.0290374755859375,
0.03594970703125,
0.043304443359375,
-0.005321502685546875,
-0.003528594970703125,
0.040313720703125,
-0.0243988037109375,
0.04241943359375,
0.04541015625,
-0.0194854736328125,
-0.0443115234375,
0.0577392578125,
0.0139617919921875,
-0.0030918121337890625,
0.04644775390625,
0.0302886962890625,
-0.0244598388671875,
-0.030364990234375,
-0.051177978515625,
0.0141448974609375,
-0.046630859375,
-0.018280029296875,
-0.0655517578125,
-0.0048828125,
-0.044586181640625,
-0.00506591796875,
-0.00018012523651123047,
-0.039886474609375,
-0.04510498046875,
-0.0014314651489257812,
0.04541015625,
0.043731689453125,
0.0005311965942382812,
0.04400634765625,
-0.04962158203125,
0.01873779296875,
0.0246734619140625,
0.009124755859375,
-0.0024242401123046875,
-0.0478515625,
-0.005420684814453125,
0.01776123046875,
-0.034637451171875,
-0.08892822265625,
0.0280609130859375,
0.003910064697265625,
0.035919189453125,
0.0394287109375,
-0.0013380050659179688,
0.059234619140625,
-0.04376220703125,
0.0814208984375,
-0.0005536079406738281,
-0.06298828125,
0.06103515625,
-0.044464111328125,
0.0092926025390625,
0.0418701171875,
0.0312042236328125,
-0.046356201171875,
-0.0138092041015625,
-0.03778076171875,
-0.06610107421875,
0.0738525390625,
0.024078369140625,
0.0018587112426757812,
-0.00864410400390625,
0.03668212890625,
0.0000871419906616211,
0.018402099609375,
-0.060638427734375,
-0.0285797119140625,
-0.033477783203125,
-0.01544189453125,
0.0027408599853515625,
-0.00389862060546875,
-0.02197265625,
-0.0277557373046875,
0.0382080078125,
-0.00855255126953125,
0.045257568359375,
0.01611328125,
0.0024852752685546875,
0.006404876708984375,
0.0122528076171875,
0.0626220703125,
0.041534423828125,
-0.024566650390625,
0.002933502197265625,
0.0159149169921875,
-0.039154052734375,
0.008575439453125,
0.015869140625,
-0.0225067138671875,
-0.020538330078125,
0.025970458984375,
0.05718994140625,
-0.0035991668701171875,
-0.04559326171875,
0.034637451171875,
-0.01511383056640625,
-0.00948333740234375,
-0.025421142578125,
0.0203094482421875,
0.006961822509765625,
0.01264190673828125,
0.0183258056640625,
-0.0081787109375,
0.032562255859375,
-0.05059814453125,
0.00807952880859375,
0.0220794677734375,
0.00034332275390625,
-0.0293121337890625,
0.053985595703125,
0.0160675048828125,
-0.049346923828125,
0.04571533203125,
-0.0400390625,
-0.04156494140625,
0.06719970703125,
0.05712890625,
0.050506591796875,
-0.01467132568359375,
0.0219573974609375,
0.041839599609375,
0.0278472900390625,
-0.01306915283203125,
0.04791259765625,
-0.00994873046875,
-0.045867919921875,
-0.00775909423828125,
-0.048248291015625,
-0.020782470703125,
0.017486572265625,
-0.042694091796875,
0.0175933837890625,
-0.0526123046875,
-0.01465606689453125,
0.0007047653198242188,
0.0091705322265625,
-0.053558349609375,
0.0164642333984375,
-0.0146331787109375,
0.07220458984375,
-0.074462890625,
0.037384033203125,
0.0621337890625,
-0.0557861328125,
-0.06793212890625,
-0.0086669921875,
0.006855010986328125,
-0.0533447265625,
0.02996826171875,
0.020233154296875,
0.01218414306640625,
-0.00006496906280517578,
-0.05908203125,
-0.07525634765625,
0.09759521484375,
0.0081024169921875,
-0.0311126708984375,
-0.01090240478515625,
-0.0012292861938476562,
0.04248046875,
-0.031982421875,
0.050201416015625,
0.0390625,
0.047149658203125,
-0.0008754730224609375,
-0.07049560546875,
0.0263214111328125,
-0.032257080078125,
-0.0055389404296875,
-0.0012559890747070312,
-0.06622314453125,
0.08563232421875,
-0.0235443115234375,
-0.0172271728515625,
0.00821685791015625,
0.035125732421875,
0.0121002197265625,
0.0255889892578125,
0.0284423828125,
0.03668212890625,
0.079345703125,
-0.00690460205078125,
0.0777587890625,
-0.0206298828125,
0.0205078125,
0.08721923828125,
-0.01004791259765625,
0.0599365234375,
0.029815673828125,
-0.03436279296875,
0.042572021484375,
0.06585693359375,
-0.0099945068359375,
0.04400634765625,
0.00508880615234375,
0.002471923828125,
0.0031299591064453125,
-0.00024890899658203125,
-0.0330810546875,
0.03839111328125,
0.0203857421875,
-0.0140838623046875,
-0.006633758544921875,
-0.0009551048278808594,
0.0153961181640625,
-0.01078033447265625,
-0.00698089599609375,
0.056304931640625,
-0.0018682479858398438,
-0.06103515625,
0.050994873046875,
0.01312255859375,
0.05096435546875,
-0.04400634765625,
-0.0105438232421875,
-0.0254974365234375,
-0.00980377197265625,
-0.02288818359375,
-0.0694580078125,
0.019775390625,
0.0050048828125,
-0.0258941650390625,
0.0035457611083984375,
0.03240966796875,
-0.0237884521484375,
-0.025177001953125,
0.01117706298828125,
0.0181884765625,
0.049102783203125,
0.005863189697265625,
-0.056793212890625,
0.00926971435546875,
0.00830841064453125,
-0.02105712890625,
0.0116119384765625,
0.0270233154296875,
-0.003265380859375,
0.052337646484375,
0.0577392578125,
-0.0008420944213867188,
-0.002506256103515625,
-0.00937652587890625,
0.06597900390625,
-0.052001953125,
-0.045257568359375,
-0.064453125,
0.046112060546875,
-0.0106048583984375,
-0.0361328125,
0.0467529296875,
0.0498046875,
0.054595947265625,
0.006622314453125,
0.058807373046875,
-0.02203369140625,
0.0238800048828125,
-0.0396728515625,
0.04815673828125,
-0.047760009765625,
0.0258636474609375,
-0.0101776123046875,
-0.0489501953125,
-0.00598907470703125,
0.06475830078125,
-0.0159454345703125,
0.000009298324584960938,
0.050994873046875,
0.07049560546875,
0.0017604827880859375,
0.0092926025390625,
-0.002109527587890625,
0.0182037353515625,
0.028289794921875,
0.048095703125,
0.053558349609375,
-0.04840087890625,
0.0418701171875,
-0.02117919921875,
-0.03436279296875,
-0.00846099853515625,
-0.058135986328125,
-0.06365966796875,
-0.0419921875,
-0.005382537841796875,
-0.0306854248046875,
0.0090484619140625,
0.086669921875,
0.048553466796875,
-0.06365966796875,
-0.0298614501953125,
0.0027446746826171875,
0.006679534912109375,
-0.02288818359375,
-0.023681640625,
0.0197601318359375,
-0.014007568359375,
-0.051971435546875,
0.03216552734375,
0.0016870498657226562,
0.00952911376953125,
-0.011627197265625,
-0.004070281982421875,
-0.0283966064453125,
0.01050567626953125,
0.045867919921875,
0.0262603759765625,
-0.053192138671875,
-0.02117919921875,
0.0134735107421875,
-0.00862884521484375,
0.0036640167236328125,
0.037445068359375,
-0.054595947265625,
0.025726318359375,
0.043182373046875,
0.021148681640625,
0.031036376953125,
0.0072784423828125,
0.0260009765625,
-0.044036865234375,
0.006256103515625,
0.007091522216796875,
0.0284423828125,
0.01520538330078125,
-0.054718017578125,
0.041839599609375,
0.0248260498046875,
-0.05157470703125,
-0.068359375,
0.0032711029052734375,
-0.08038330078125,
-0.0312347412109375,
0.09649658203125,
-0.01235198974609375,
-0.0145721435546875,
-0.01349639892578125,
-0.032684326171875,
0.01338958740234375,
-0.05084228515625,
0.05078125,
0.049530029296875,
-0.0309906005859375,
0.0078125,
-0.039031982421875,
0.03302001953125,
-0.0005679130554199219,
-0.06939697265625,
-0.0029201507568359375,
0.038360595703125,
0.041412353515625,
0.021331787109375,
0.0728759765625,
0.0078277587890625,
0.0207061767578125,
0.0023059844970703125,
-0.0059814453125,
-0.0178680419921875,
-0.029815673828125,
-0.015533447265625,
0.008056640625,
-0.01922607421875,
-0.0248260498046875
]
] |
facebook/tart-full-flan-t5-xl | 2022-12-21T06:58:39.000Z | [
"transformers",
"pytorch",
"t5",
"text-classification",
"arxiv:2211.09260",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-classification | facebook | null | null | facebook/tart-full-flan-t5-xl | 23 | 16,689 | transformers | 2022-12-21T05:20:02 | # Task-aware Retrieval with Instructions
Official repository: [github.com/facebookresearch/tart](https://github.com/facebookresearch/tart)
### Model descriptions
`facebook/tart-full-flan-t5-xl` is a multi-task cross-encoder model trained via instruction-tuning on approximately 40 retrieval tasks, which is initialized with [google/flan-t5-xl](https://huggingface.co/google/flan-t5-xl).
TART-full is a 1.5 billion cross-necoder and it can rerank top documents given a query and natural language instruction (e.g., *find a Wikipedia paragraph that answers this question.*).
Experimental results on widely-used [BEIR](https://github.com/beir-cellar/beir), [LOTTE](https://huggingface.co/datasets/colbertv2/lotte), and our new evaluation, [X^2-Retrieval](https://github.com/facebookresearch/tart/cross_task_cross_eval) show that TART-full outperforms previous state-of-the-art methods by levaraging natural language instructions.
More details about modeling and training are in our paper: [Task-aware Retrieval with Instructions](https://arxiv.org/abs/2211.09260).
### Installation
```sh
git clone https://github.com/facebookresearch/tart
pip install -r requirements.txt
cd tart/TART
```
### How to use?
TART-full can be loaded through our customized EncT5 model.
```python
from src.modeling_enc_t5 import EncT5ForSequenceClassification
from src.tokenization_enc_t5 import EncT5Tokenizer
import torch
import torch.nn.functional as F
import numpy as np
# load TART full and tokenizer
model = EncT5ForSequenceClassification.from_pretrained("facebook/tart-full-flan-t5-xl")
tokenizer = EncT5Tokenizer.from_pretrained("facebook/tart-full-flan-t5-xl")
model.eval()
q = "What is the population of Tokyo?"
in_answer = "retrieve a passage that answers this question from Wikipedia"
p_1 = "The population of Japan's capital, Tokyo, dropped by about 48,600 people to just under 14 million at the start of 2022, the first decline since 1996, the metropolitan government reported Monday."
p_2 = "Tokyo, officially the Tokyo Metropolis (東京都, Tōkyō-to), is the capital and largest city of Japan."
# 1. TART-full can identify more relevant paragraph.
features = tokenizer(['{0} [SEP] {1}'.format(in_answer, q), '{0} [SEP] {1}'.format(in_answer, q)], [p_1, p_2], padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
scores = model(**features).logits
normalized_scores = [float(score[1]) for score in F.softmax(scores, dim=1)]
print([p_1, p_2][np.argmax(normalized_scores)]) # "The population of Japan's capital, Tokyo, dropped by about 48,600 people to just under 14 million ... "
# 2. TART-full can identify the document that is more relevant AND follows instructions.
in_sim = "You need to find duplicated questions in Wiki forum. Could you find a question that is similar to this question"
q_1 = "How many people live in Tokyo?"
features = tokenizer(['{0} [SEP] {1}'.format(in_sim, q), '{0} [SEP] {1}'.format(in_sim, q)], [p_1, q_1], padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
scores = model(**features).logits
normalized_scores = [float(score[1]) for score in F.softmax(scores, dim=1)]
print([p_1, q_1][np.argmax(normalized_scores)]) # "How many people live in Tokyo?"
```
| 3,248 | [
[
-0.0209503173828125,
-0.0618896484375,
0.0477294921875,
0.0184478759765625,
-0.0142364501953125,
-0.0098724365234375,
-0.0219268798828125,
-0.0253143310546875,
0.0202789306640625,
0.036956787109375,
-0.041656494140625,
-0.051849365234375,
-0.036468505859375,
0.00836944580078125,
-0.038116455078125,
0.08642578125,
-0.01061248779296875,
-0.01263427734375,
-0.0055084228515625,
0.002666473388671875,
-0.035308837890625,
-0.0266876220703125,
-0.051727294921875,
0.0009984970092773438,
0.0305938720703125,
0.044036865234375,
0.031219482421875,
0.034027099609375,
0.0423583984375,
0.02777099609375,
-0.005168914794921875,
-0.005420684814453125,
-0.03863525390625,
-0.00958251953125,
0.002391815185546875,
-0.041229248046875,
-0.0225677490234375,
0.0020427703857421875,
0.049560546875,
0.03729248046875,
0.01041412353515625,
0.028717041015625,
-0.0016498565673828125,
0.04248046875,
-0.047271728515625,
0.0247650146484375,
-0.026031494140625,
-0.00811004638671875,
-0.01274871826171875,
-0.01177978515625,
-0.0199432373046875,
-0.0338134765625,
0.00627899169921875,
-0.055511474609375,
0.050506591796875,
0.00238800048828125,
0.08221435546875,
0.00479888916015625,
-0.025726318359375,
-0.006618499755859375,
-0.03802490234375,
0.0650634765625,
-0.0361328125,
0.0244140625,
0.03564453125,
0.0202178955078125,
-0.0007658004760742188,
-0.07135009765625,
-0.0343017578125,
-0.023590087890625,
-0.0238189697265625,
0.0236358642578125,
0.005519866943359375,
0.01666259765625,
0.04351806640625,
0.036346435546875,
-0.059967041015625,
-0.0248870849609375,
-0.053985595703125,
-0.00943756103515625,
0.0477294921875,
0.0108795166015625,
0.0208892822265625,
-0.035400390625,
-0.033447265625,
-0.022918701171875,
-0.01629638671875,
0.00244903564453125,
0.0167236328125,
0.021484375,
-0.01378631591796875,
0.041778564453125,
-0.02972412109375,
0.036712646484375,
0.00988006591796875,
-0.0007214546203613281,
0.036956787109375,
-0.035797119140625,
-0.01183319091796875,
0.005191802978515625,
0.063720703125,
0.02471923828125,
0.01418304443359375,
-0.006328582763671875,
-0.01666259765625,
-0.0036029815673828125,
0.01270294189453125,
-0.07342529296875,
-0.02728271484375,
0.022979736328125,
-0.0246429443359375,
-0.01184844970703125,
0.0347900390625,
-0.05322265625,
-0.006771087646484375,
0.0102996826171875,
0.059783935546875,
-0.04840087890625,
-0.0112762451171875,
0.01169586181640625,
-0.027557373046875,
0.0147857666015625,
0.0166778564453125,
-0.05572509765625,
0.0313720703125,
0.0574951171875,
0.0751953125,
0.01007080078125,
-0.023406982421875,
-0.0242462158203125,
-0.00818634033203125,
-0.02130126953125,
0.0428466796875,
-0.03839111328125,
-0.0155792236328125,
-0.025177001953125,
0.013153076171875,
-0.02166748046875,
-0.032867431640625,
0.05218505859375,
-0.0308837890625,
0.0295562744140625,
-0.03485107421875,
-0.040771484375,
-0.03228759765625,
0.00921630859375,
-0.05999755859375,
0.0977783203125,
0.024444580078125,
-0.06494140625,
0.0193023681640625,
-0.06805419921875,
-0.031005859375,
0.00208282470703125,
0.0035457611083984375,
-0.0308837890625,
0.007221221923828125,
0.01506805419921875,
0.031097412109375,
-0.01253509521484375,
0.01192474365234375,
-0.025848388671875,
-0.0523681640625,
0.0104522705078125,
-0.02239990234375,
0.06414794921875,
0.0125885009765625,
-0.034942626953125,
0.017913818359375,
-0.049285888671875,
0.007080078125,
0.0172882080078125,
-0.031646728515625,
-0.0200653076171875,
-0.0201416015625,
0.0289154052734375,
0.00302886962890625,
0.03436279296875,
-0.0491943359375,
0.0192413330078125,
-0.0215911865234375,
0.0240020751953125,
0.043548583984375,
0.0093231201171875,
0.0181121826171875,
-0.033050537109375,
0.038482666015625,
0.0020389556884765625,
0.0161285400390625,
-0.0160675048828125,
-0.042724609375,
-0.056671142578125,
-0.020599365234375,
0.00789642333984375,
0.038848876953125,
-0.033538818359375,
0.03094482421875,
-0.0284423828125,
-0.0367431640625,
-0.038360595703125,
-0.00811767578125,
0.034149169921875,
0.034393310546875,
0.032440185546875,
-0.0115814208984375,
-0.057708740234375,
-0.06390380859375,
-0.0120697021484375,
0.00017189979553222656,
0.00849151611328125,
0.00942230224609375,
0.05255126953125,
-0.00020968914031982422,
0.05670166015625,
-0.048187255859375,
-0.01197052001953125,
-0.0197296142578125,
0.007389068603515625,
0.030975341796875,
0.04461669921875,
0.03753662109375,
-0.07269287109375,
-0.05145263671875,
-0.0175628662109375,
-0.040130615234375,
0.003894805908203125,
0.007213592529296875,
-0.02130126953125,
0.0162811279296875,
0.018890380859375,
-0.060882568359375,
0.038116455078125,
0.00969696044921875,
-0.038543701171875,
0.038055419921875,
-0.0092315673828125,
0.029693603515625,
-0.09698486328125,
0.031982421875,
0.00992584228515625,
-0.0242462158203125,
-0.04779052734375,
0.015655517578125,
0.011444091796875,
-0.0133819580078125,
-0.013458251953125,
0.048980712890625,
-0.035980224609375,
-0.002719879150390625,
-0.0022830963134765625,
0.00992584228515625,
0.0000737309455871582,
0.034515380859375,
-0.002819061279296875,
0.05267333984375,
0.032257080078125,
-0.039398193359375,
0.033477783203125,
0.0176544189453125,
-0.032440185546875,
0.01507568359375,
-0.0400390625,
0.0105438232421875,
-0.015472412109375,
0.0200653076171875,
-0.095947265625,
-0.0352783203125,
0.014373779296875,
-0.04150390625,
0.0050811767578125,
-0.0155487060546875,
-0.046478271484375,
-0.04132080078125,
-0.040191650390625,
0.03253173828125,
0.037933349609375,
-0.027008056640625,
0.0273895263671875,
0.0159759521484375,
0.0036163330078125,
-0.053863525390625,
-0.043975830078125,
-0.01032257080078125,
-0.007450103759765625,
-0.046173095703125,
0.027008056640625,
-0.0120391845703125,
-0.00154876708984375,
0.0169830322265625,
-0.019805908203125,
-0.00997161865234375,
-0.0001779794692993164,
0.0090179443359375,
0.01666259765625,
-0.016265869140625,
0.00449371337890625,
-0.01421356201171875,
-0.01184844970703125,
0.004848480224609375,
-0.02215576171875,
0.061309814453125,
-0.02301025390625,
-0.02203369140625,
-0.0318603515625,
0.01454925537109375,
0.03887939453125,
-0.0193634033203125,
0.04962158203125,
0.0689697265625,
-0.042938232421875,
-0.0011911392211914062,
-0.05865478515625,
-0.0218048095703125,
-0.036956787109375,
0.049835205078125,
-0.0280303955078125,
-0.054351806640625,
0.052459716796875,
0.025177001953125,
0.0085296630859375,
0.054534912109375,
0.023101806640625,
0.005992889404296875,
0.07806396484375,
0.049560546875,
0.002288818359375,
0.045196533203125,
-0.047393798828125,
0.00628662109375,
-0.05255126953125,
-0.0283050537109375,
-0.0304107666015625,
-0.032440185546875,
-0.050140380859375,
-0.0268096923828125,
0.03228759765625,
0.00909423828125,
-0.0231170654296875,
0.035491943359375,
-0.0516357421875,
0.040557861328125,
0.052947998046875,
0.02301025390625,
0.004680633544921875,
0.00015842914581298828,
0.000025033950805664062,
0.0019369125366210938,
-0.0478515625,
-0.0171051025390625,
0.08770751953125,
0.00481414794921875,
0.051055908203125,
0.00199127197265625,
0.0654296875,
0.00012350082397460938,
0.0174713134765625,
-0.05572509765625,
0.049468994140625,
-0.029266357421875,
-0.07000732421875,
-0.021697998046875,
-0.054840087890625,
-0.0758056640625,
0.00528717041015625,
-0.003787994384765625,
-0.0670166015625,
-0.0020771026611328125,
-0.0029888153076171875,
-0.035736083984375,
0.0257415771484375,
-0.059844970703125,
0.085693359375,
-0.0292205810546875,
-0.0243377685546875,
-0.0178680419921875,
-0.041778564453125,
0.034149169921875,
-0.007068634033203125,
0.003826141357421875,
-0.0127105712890625,
0.0005269050598144531,
0.08355712890625,
-0.022430419921875,
0.058563232421875,
-0.02056884765625,
0.020751953125,
0.0209808349609375,
-0.01462554931640625,
0.0270538330078125,
0.00214385986328125,
-0.0125885009765625,
-0.01262664794921875,
0.0228118896484375,
-0.035797119140625,
-0.05096435546875,
0.05914306640625,
-0.0638427734375,
-0.03887939453125,
-0.0305023193359375,
-0.043792724609375,
0.0195770263671875,
0.0306854248046875,
0.039276123046875,
0.041229248046875,
-0.0017070770263671875,
0.0249786376953125,
0.0513916015625,
-0.0167388916015625,
0.0306854248046875,
0.024322509765625,
-0.0228271484375,
-0.044952392578125,
0.063232421875,
0.0221099853515625,
0.007465362548828125,
0.04998779296875,
0.0305328369140625,
-0.044281005859375,
-0.0034275054931640625,
-0.0154876708984375,
0.0250244140625,
-0.05487060546875,
-0.0037326812744140625,
-0.0484619140625,
-0.0209503173828125,
-0.04315185546875,
-0.004909515380859375,
-0.020843505859375,
-0.037750244140625,
-0.01026153564453125,
-0.007266998291015625,
0.04150390625,
0.05377197265625,
0.006381988525390625,
0.01387786865234375,
-0.055633544921875,
0.0274505615234375,
0.0038433074951171875,
0.035552978515625,
-0.0007195472717285156,
-0.0418701171875,
-0.0257568359375,
0.0029621124267578125,
-0.0171051025390625,
-0.084228515625,
0.0186920166015625,
0.002185821533203125,
0.044708251953125,
0.0318603515625,
0.01983642578125,
0.0640869140625,
-0.0037631988525390625,
0.045379638671875,
0.00672149658203125,
-0.06390380859375,
0.0439453125,
-0.019744873046875,
0.0225067138671875,
0.04827880859375,
0.034698486328125,
-0.05487060546875,
-0.0137786865234375,
-0.068603515625,
-0.077392578125,
0.07464599609375,
0.013092041015625,
0.01380157470703125,
-0.0074920654296875,
0.03973388671875,
-0.008026123046875,
0.01169586181640625,
-0.06121826171875,
-0.0276947021484375,
-0.016021728515625,
-0.0391845703125,
-0.003543853759765625,
-0.0209197998046875,
0.0025539398193359375,
-0.03350830078125,
0.058502197265625,
0.004383087158203125,
0.0288238525390625,
0.01861572265625,
-0.001373291015625,
-0.01117706298828125,
0.011199951171875,
0.032196044921875,
0.04827880859375,
-0.03717041015625,
0.0028629302978515625,
0.017486572265625,
-0.0457763671875,
-0.007167816162109375,
0.0240631103515625,
-0.014984130859375,
0.00775146484375,
0.025146484375,
0.06231689453125,
0.0211029052734375,
-0.0313720703125,
0.0452880859375,
0.005558013916015625,
-0.01727294921875,
-0.0474853515625,
0.01291656494140625,
-0.00452423095703125,
0.01093292236328125,
0.03790283203125,
-0.007358551025390625,
0.01058197021484375,
-0.050445556640625,
0.0219268798828125,
0.024017333984375,
-0.0302581787109375,
-0.0245208740234375,
0.0611572265625,
0.00452423095703125,
-0.0102996826171875,
0.058349609375,
-0.026031494140625,
-0.053314208984375,
0.059326171875,
0.03594970703125,
0.0762939453125,
0.0005640983581542969,
0.0107421875,
0.06353759765625,
0.0276336669921875,
-0.0236968994140625,
0.0455322265625,
-0.00290679931640625,
-0.0372314453125,
-0.02606201171875,
-0.051605224609375,
-0.007129669189453125,
0.0256500244140625,
-0.04833984375,
0.0281524658203125,
-0.038360595703125,
-0.019256591796875,
-0.00933074951171875,
0.035400390625,
-0.057037353515625,
0.0159759521484375,
-0.007465362548828125,
0.0667724609375,
-0.06524658203125,
0.04803466796875,
0.08197021484375,
-0.05438232421875,
-0.076904296875,
0.00333404541015625,
-0.01007080078125,
-0.050872802734375,
0.041351318359375,
0.037445068359375,
0.01041412353515625,
0.00893402099609375,
-0.04754638671875,
-0.07794189453125,
0.0941162109375,
0.0279388427734375,
-0.0284271240234375,
-0.005954742431640625,
-0.0066375732421875,
0.033477783203125,
0.0007290840148925781,
0.0419921875,
0.033355712890625,
0.047698974609375,
0.004608154296875,
-0.072998046875,
0.032745361328125,
-0.0276336669921875,
-0.00913238525390625,
0.02642822265625,
-0.060821533203125,
0.07763671875,
-0.0313720703125,
-0.027191162109375,
0.01290130615234375,
0.03619384765625,
0.022552490234375,
0.024200439453125,
0.03680419921875,
0.049652099609375,
0.05950927734375,
-0.0194854736328125,
0.07794189453125,
-0.03216552734375,
0.056365966796875,
0.04888916015625,
0.0167694091796875,
0.056793212890625,
0.013671875,
-0.018218994140625,
0.031768798828125,
0.051361083984375,
-0.014404296875,
0.033050537109375,
0.01184844970703125,
-0.01511383056640625,
-0.018218994140625,
0.013214111328125,
-0.0186767578125,
0.034271240234375,
0.01241302490234375,
-0.02423095703125,
-0.0186614990234375,
-0.015655517578125,
0.01467132568359375,
-0.0163726806640625,
-0.01352691650390625,
0.046478271484375,
-0.0025272369384765625,
-0.05462646484375,
0.04638671875,
0.01544189453125,
0.05328369140625,
-0.054473876953125,
-0.0110931396484375,
-0.01447296142578125,
0.026763916015625,
-0.02880859375,
-0.06427001953125,
0.01174163818359375,
0.0033321380615234375,
-0.0126953125,
0.00785064697265625,
0.04638671875,
-0.041900634765625,
-0.061981201171875,
0.01358795166015625,
0.01428985595703125,
0.0283050537109375,
-0.01276397705078125,
-0.050628662109375,
-0.009002685546875,
0.01012420654296875,
-0.03118896484375,
0.00731658935546875,
0.026763916015625,
-0.00937652587890625,
0.0487060546875,
0.047149658203125,
-0.00939178466796875,
0.014556884765625,
-0.0002837181091308594,
0.050811767578125,
-0.05584716796875,
-0.043060302734375,
-0.06231689453125,
0.0295867919921875,
-0.0092010498046875,
-0.031524658203125,
0.05694580078125,
0.0704345703125,
0.07244873046875,
-0.00585174560546875,
0.06494140625,
-0.031646728515625,
0.0272369384765625,
-0.035064697265625,
0.055633544921875,
-0.0574951171875,
0.00061798095703125,
-0.023834228515625,
-0.0789794921875,
-0.016357421875,
0.060302734375,
-0.031524658203125,
0.035400390625,
0.057342529296875,
0.06781005859375,
-0.025177001953125,
-0.0027408599853515625,
0.01322174072265625,
0.014129638671875,
0.0134124755859375,
0.043365478515625,
0.037750244140625,
-0.042144775390625,
0.059112548828125,
-0.04486083984375,
-0.010711669921875,
-0.0229339599609375,
-0.034210205078125,
-0.054229736328125,
-0.0538330078125,
-0.0233917236328125,
-0.02655029296875,
0.00689697265625,
0.07305908203125,
0.052154541015625,
-0.06475830078125,
-0.0164794921875,
-0.0015878677368164062,
0.00519561767578125,
-0.03094482421875,
-0.0236663818359375,
0.049468994140625,
-0.023834228515625,
-0.05535888671875,
0.0016565322875976562,
-0.00434112548828125,
-0.00469207763671875,
-0.00336456298828125,
0.00005602836608886719,
-0.024169921875,
-0.018280029296875,
0.0290985107421875,
0.00608062744140625,
-0.052703857421875,
-0.03802490234375,
0.005268096923828125,
-0.01666259765625,
0.0003917217254638672,
0.037506103515625,
-0.0452880859375,
0.0282745361328125,
0.059844970703125,
0.0255584716796875,
0.06597900390625,
0.0179290771484375,
0.0196990966796875,
-0.0655517578125,
-0.003917694091796875,
0.00750732421875,
0.02984619140625,
0.0249176025390625,
-0.032684326171875,
0.0440673828125,
0.04071044921875,
-0.0305023193359375,
-0.043365478515625,
-0.00324249267578125,
-0.0789794921875,
-0.0141143798828125,
0.0880126953125,
-0.00408172607421875,
-0.0219268798828125,
0.005481719970703125,
-0.02069091796875,
0.038116455078125,
-0.039276123046875,
0.0655517578125,
0.045166015625,
-0.0062255859375,
-0.0235595703125,
-0.054718017578125,
0.0264129638671875,
0.0190582275390625,
-0.049041748046875,
-0.003261566162109375,
0.0284881591796875,
0.056121826171875,
0.02252197265625,
0.068115234375,
-0.00528717041015625,
0.0265960693359375,
0.0097808837890625,
0.01329803466796875,
-0.0202484130859375,
-0.015472412109375,
-0.005847930908203125,
0.0132904052734375,
-0.01525115966796875,
-0.0306854248046875
]
] |
Davlan/xlm-roberta-large-ner-hrl | 2023-08-14T19:35:59.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"xlm-roberta",
"token-classification",
"license:afl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | Davlan | null | null | Davlan/xlm-roberta-large-ner-hrl | 11 | 16,652 | transformers | 2022-03-02T23:29:04 | ---
license: afl-3.0
---
Hugging Face's logo
---
language:
- ar
- de
- en
- es
- fr
- it
- lv
- nl
- pt
- zh
- multilingual
---
# xlm-roberta-large-ner-hrl
## Model description
**xlm-roberta-large-ner-hrl** is a **Named Entity Recognition** model for 10 high resourced languages (Arabic, German, English, Spanish, French, Italian, Latvian, Dutch, Portuguese and Chinese) based on a fine-tuned XLM-RoBERTa large model. It has been trained to recognize three types of entities: location (LOC), organizations (ORG), and person (PER).
Specifically, this model is a *xlm-roberta-large* model that was fine-tuned on an aggregation of 10 high-resourced languages
## Intended uses & limitations
#### How to use
You can use this model with Transformers *pipeline* for NER.
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline
tokenizer = AutoTokenizer.from_pretrained("Davlan/xlm-roberta-large-ner-hrl")
model = AutoModelForTokenClassification.from_pretrained("Davlan/xlm-roberta-large-ner-hrl")
nlp = pipeline("ner", model=model, tokenizer=tokenizer)
example = "Nader Jokhadar had given Syria the lead with a well-struck header in the seventh minute."
ner_results = nlp(example)
print(ner_results)
```
#### Limitations and bias
This model is limited by its training dataset of entity-annotated news articles from a specific span of time. This may not generalize well for all use cases in different domains.
## Training data
The training data for the 10 languages are from:
Language|Dataset
-|-
Arabic | [ANERcorp](https://camel.abudhabi.nyu.edu/anercorp/)
German | [conll 2003](https://www.clips.uantwerpen.be/conll2003/ner/)
English | [conll 2003](https://www.clips.uantwerpen.be/conll2003/ner/)
Spanish | [conll 2002](https://www.clips.uantwerpen.be/conll2002/ner/)
French | [Europeana Newspapers](https://github.com/EuropeanaNewspapers/ner-corpora/tree/master/enp_FR.bnf.bio)
Italian | [Italian I-CAB](https://ontotext.fbk.eu/icab.html)
Latvian | [Latvian NER](https://github.com/LUMII-AILab/FullStack/tree/master/NamedEntities)
Dutch | [conll 2002](https://www.clips.uantwerpen.be/conll2002/ner/)
Portuguese |[Paramopama + Second Harem](https://github.com/davidsbatista/NER-datasets/tree/master/Portuguese)
Chinese | [MSRA](https://huggingface.co/datasets/msra_ner)
The training dataset distinguishes between the beginning and continuation of an entity so that if there are back-to-back entities of the same type, the model can output where the second entity begins. As in the dataset, each token will be classified as one of the following classes:
Abbreviation|Description
-|-
O|Outside of a named entity
B-PER |Beginning of a person’s name right after another person’s name
I-PER |Person’s name
B-ORG |Beginning of an organisation right after another organisation
I-ORG |Organisation
B-LOC |Beginning of a location right after another location
I-LOC |Location
## Training procedure
This model was trained on NVIDIA V100 GPU with recommended hyperparameters from HuggingFace code. | 3,048 | [
[
-0.039764404296875,
-0.0458984375,
0.01397705078125,
0.026611328125,
-0.0116119384765625,
0.004833221435546875,
-0.02752685546875,
-0.047271728515625,
0.041778564453125,
0.0308685302734375,
-0.0340576171875,
-0.053680419921875,
-0.063720703125,
0.0333251953125,
-0.0198822021484375,
0.09210205078125,
-0.0177154541015625,
0.0213775634765625,
0.007022857666015625,
-0.0447998046875,
-0.007110595703125,
-0.05377197265625,
-0.07232666015625,
-0.01959228515625,
0.031280517578125,
0.0128936767578125,
0.03472900390625,
0.043731689453125,
0.0179443359375,
0.028045654296875,
-0.02239990234375,
0.0003380775451660156,
-0.0169677734375,
-0.0179595947265625,
-0.01291656494140625,
-0.0174407958984375,
-0.0260009765625,
-0.00502777099609375,
0.06298828125,
0.0400390625,
-0.00927734375,
0.0221405029296875,
-0.005062103271484375,
0.054779052734375,
-0.01483917236328125,
0.0193023681640625,
-0.03021240234375,
-0.002170562744140625,
-0.0198516845703125,
0.01788330078125,
-0.0101776123046875,
-0.0181121826171875,
0.0160675048828125,
-0.0225677490234375,
0.0038433074951171875,
0.0005998611450195312,
0.10565185546875,
0.0019550323486328125,
-0.040435791015625,
-0.018280029296875,
-0.0303955078125,
0.04656982421875,
-0.0364990234375,
0.05352783203125,
0.0216064453125,
0.0128936767578125,
0.0032501220703125,
-0.03607177734375,
-0.05792236328125,
-0.00101470947265625,
-0.0098419189453125,
0.00054931640625,
-0.0214080810546875,
-0.0209197998046875,
0.025390625,
0.0294647216796875,
-0.0465087890625,
-0.0006127357482910156,
-0.0284423828125,
-0.016571044921875,
0.040374755859375,
-0.0114593505859375,
0.0374755859375,
-0.0250701904296875,
-0.0305633544921875,
-0.01332855224609375,
-0.0389404296875,
0.00838470458984375,
0.0280609130859375,
0.03271484375,
-0.042877197265625,
0.042022705078125,
-0.0146484375,
0.056488037109375,
0.0150909423828125,
-0.0143280029296875,
0.050994873046875,
-0.013153076171875,
-0.0174102783203125,
-0.003589630126953125,
0.0667724609375,
0.023040771484375,
0.00875091552734375,
-0.0106353759765625,
-0.0197906494140625,
0.0072784423828125,
-0.0169219970703125,
-0.060699462890625,
-0.00316619873046875,
0.005199432373046875,
-0.038330078125,
-0.006809234619140625,
-0.004070281982421875,
-0.04449462890625,
0.004589080810546875,
-0.03369140625,
0.0242156982421875,
-0.045318603515625,
-0.0265960693359375,
0.001430511474609375,
-0.0018949508666992188,
0.03387451171875,
0.00995635986328125,
-0.05084228515625,
0.0223388671875,
0.0328369140625,
0.0673828125,
-0.01255035400390625,
-0.0270233154296875,
-0.045684814453125,
-0.004611968994140625,
-0.004062652587890625,
0.050018310546875,
-0.03326416015625,
-0.01000213623046875,
-0.000682830810546875,
0.032501220703125,
-0.0124664306640625,
-0.037353515625,
0.034454345703125,
-0.037841796875,
0.034912109375,
-0.0216827392578125,
-0.039337158203125,
-0.023773193359375,
0.0307769775390625,
-0.05169677734375,
0.0897216796875,
0.03680419921875,
-0.0677490234375,
0.037872314453125,
-0.0341796875,
-0.029998779296875,
0.002918243408203125,
-0.0265655517578125,
-0.044586181640625,
-0.001605987548828125,
0.0171661376953125,
0.0283966064453125,
-0.0040130615234375,
0.0237884521484375,
0.00395965576171875,
0.00955963134765625,
-0.006534576416015625,
0.00583648681640625,
0.08245849609375,
-0.0006475448608398438,
-0.02923583984375,
0.00806427001953125,
-0.0704345703125,
-0.0152587890625,
0.0175323486328125,
-0.033050537109375,
-0.0262451171875,
-0.02960205078125,
0.036407470703125,
0.04241943359375,
0.01141357421875,
-0.049560546875,
0.01727294921875,
-0.031036376953125,
0.0216522216796875,
0.03564453125,
-0.0029125213623046875,
0.030975341796875,
-0.0151824951171875,
0.04400634765625,
0.0224761962890625,
-0.012481689453125,
-0.00312042236328125,
-0.039215087890625,
-0.0672607421875,
-0.01141357421875,
0.037567138671875,
0.048004150390625,
-0.07012939453125,
0.0303955078125,
-0.0204315185546875,
-0.036285400390625,
-0.0311431884765625,
0.005645751953125,
0.0482177734375,
0.04638671875,
0.031402587890625,
-0.038970947265625,
-0.05975341796875,
-0.06256103515625,
-0.015655517578125,
-0.01024627685546875,
0.0165252685546875,
0.0293731689453125,
0.052947998046875,
-0.020965576171875,
0.046142578125,
-0.00868988037109375,
-0.038055419921875,
-0.0219879150390625,
-0.0011377334594726562,
0.0389404296875,
0.039520263671875,
0.050994873046875,
-0.06719970703125,
-0.051788330078125,
0.00043487548828125,
-0.0517578125,
0.016998291015625,
-0.0017137527465820312,
-0.022369384765625,
0.050079345703125,
0.033050537109375,
-0.043853759765625,
0.03076171875,
0.052734375,
-0.035003662109375,
0.029876708984375,
-0.006816864013671875,
-0.0089263916015625,
-0.0975341796875,
0.01280975341796875,
0.0175323486328125,
-0.00789642333984375,
-0.04010009765625,
0.0004887580871582031,
-0.006664276123046875,
-0.0028209686279296875,
-0.041351318359375,
0.07379150390625,
-0.06011962890625,
-0.0026397705078125,
-0.00731658935546875,
0.006305694580078125,
-0.013641357421875,
0.0333251953125,
0.045166015625,
0.04193115234375,
0.046844482421875,
-0.0482177734375,
0.015106201171875,
0.05078125,
-0.0261383056640625,
0.059661865234375,
-0.03826904296875,
-0.00335693359375,
-0.0171051025390625,
0.023834228515625,
-0.03509521484375,
-0.0234222412109375,
0.0236358642578125,
-0.041778564453125,
0.035247802734375,
-0.031951904296875,
-0.038177490234375,
-0.0240631103515625,
0.0081329345703125,
0.027435302734375,
0.026702880859375,
-0.044586181640625,
0.05975341796875,
0.0343017578125,
-0.004428863525390625,
-0.049163818359375,
-0.0552978515625,
0.0243988037109375,
-0.0281829833984375,
-0.040557861328125,
0.0283203125,
0.004245758056640625,
0.00201416015625,
0.0028553009033203125,
0.0040435791015625,
-0.01312255859375,
-0.01020050048828125,
0.014892578125,
0.02716064453125,
-0.021484375,
0.00318145751953125,
-0.01873779296875,
-0.0077667236328125,
-0.009124755859375,
-0.0252227783203125,
0.050384521484375,
-0.0223236083984375,
0.0025482177734375,
-0.036041259765625,
0.028594970703125,
0.0218963623046875,
-0.028411865234375,
0.0860595703125,
0.0677490234375,
-0.04486083984375,
0.01036834716796875,
-0.045867919921875,
0.00021696090698242188,
-0.0293731689453125,
0.0167236328125,
-0.03857421875,
-0.06549072265625,
0.0513916015625,
0.0107574462890625,
-0.0023097991943359375,
0.04901123046875,
0.05078125,
0.0298309326171875,
0.058837890625,
0.06787109375,
-0.033477783203125,
0.040283203125,
-0.029998779296875,
0.00922393798828125,
-0.06317138671875,
-0.0303497314453125,
-0.04180908203125,
-0.0196075439453125,
-0.0665283203125,
-0.020355224609375,
0.01203155517578125,
-0.0008673667907714844,
-0.022552490234375,
0.05792236328125,
-0.043426513671875,
0.022552490234375,
0.03558349609375,
-0.005451202392578125,
0.0118255615234375,
0.0114898681640625,
-0.01593017578125,
-0.0005130767822265625,
-0.037017822265625,
-0.043670654296875,
0.050262451171875,
0.033172607421875,
0.03131103515625,
0.01261138916015625,
0.07171630859375,
-0.01983642578125,
0.0283966064453125,
-0.049560546875,
0.025482177734375,
-0.0113677978515625,
-0.049774169921875,
-0.01256561279296875,
-0.04010009765625,
-0.07305908203125,
-0.0005979537963867188,
-0.013519287109375,
-0.06292724609375,
0.024658203125,
-0.0080108642578125,
-0.0216064453125,
0.0305938720703125,
-0.02532958984375,
0.06304931640625,
-0.0208892822265625,
-0.007579803466796875,
0.0168304443359375,
-0.058441162109375,
0.015716552734375,
0.0001342296600341797,
0.0179290771484375,
-0.0111846923828125,
-0.005664825439453125,
0.062469482421875,
-0.0198822021484375,
0.049652099609375,
-0.0154571533203125,
-0.00743865966796875,
0.004871368408203125,
-0.0106201171875,
0.026702880859375,
0.004207611083984375,
-0.01071929931640625,
0.04425048828125,
-0.0031280517578125,
-0.03173828125,
-0.01953125,
0.04901123046875,
-0.06390380859375,
-0.018585205078125,
-0.047119140625,
-0.0236663818359375,
-0.0008935928344726562,
0.03448486328125,
0.04071044921875,
0.02545166015625,
-0.01125335693359375,
0.004192352294921875,
0.033050537109375,
-0.031494140625,
0.029876708984375,
0.045166015625,
-0.027587890625,
-0.044036865234375,
0.0604248046875,
0.01192474365234375,
-0.001129150390625,
0.0216064453125,
-0.0007281303405761719,
-0.0204315185546875,
-0.027862548828125,
-0.055694580078125,
0.042816162109375,
-0.029754638671875,
-0.0232391357421875,
-0.07232666015625,
-0.029205322265625,
-0.037078857421875,
0.00453948974609375,
-0.0199432373046875,
-0.0252838134765625,
-0.024566650390625,
-0.005092620849609375,
0.028411865234375,
0.045745849609375,
-0.006244659423828125,
0.0263671875,
-0.058807373046875,
0.0265655517578125,
-0.00553131103515625,
0.031494140625,
-0.01372528076171875,
-0.0516357421875,
-0.0247039794921875,
-0.004001617431640625,
-0.01068878173828125,
-0.07208251953125,
0.0550537109375,
0.028778076171875,
0.0361328125,
0.0361328125,
-0.01496124267578125,
0.046783447265625,
-0.045501708984375,
0.046478271484375,
0.01666259765625,
-0.06451416015625,
0.040771484375,
-0.0227813720703125,
0.00926971435546875,
0.0401611328125,
0.058563232421875,
-0.061126708984375,
-0.015869140625,
-0.0645751953125,
-0.06829833984375,
0.0550537109375,
0.015655517578125,
0.02215576171875,
-0.027130126953125,
0.0307464599609375,
-0.0005421638488769531,
0.01496124267578125,
-0.06890869140625,
-0.036529541015625,
0.0021228790283203125,
-0.01763916015625,
-0.004547119140625,
-0.01290130615234375,
-0.00439453125,
-0.0231170654296875,
0.07965087890625,
-0.004985809326171875,
0.02069091796875,
0.0247344970703125,
-0.01354217529296875,
-0.0077362060546875,
-0.00020039081573486328,
0.032012939453125,
0.03594970703125,
-0.001651763916015625,
-0.0034694671630859375,
0.023834228515625,
-0.027740478515625,
0.00293731689453125,
0.01953125,
-0.017181396484375,
0.0269622802734375,
0.02020263671875,
0.074462890625,
0.006015777587890625,
-0.035888671875,
0.052337646484375,
-0.013427734375,
-0.007480621337890625,
-0.04974365234375,
-0.0170135498046875,
0.00702667236328125,
0.023834228515625,
0.02490234375,
-0.0115203857421875,
-0.0003085136413574219,
-0.046173095703125,
0.02874755859375,
0.036895751953125,
-0.0298004150390625,
-0.032867431640625,
0.04132080078125,
0.0013971328735351562,
-0.0240936279296875,
0.05450439453125,
-0.0266571044921875,
-0.05999755859375,
0.04852294921875,
0.03753662109375,
0.06109619140625,
-0.0299530029296875,
0.01013946533203125,
0.0645751953125,
0.0179901123046875,
0.001415252685546875,
0.0312347412109375,
0.007633209228515625,
-0.0638427734375,
-0.0272216796875,
-0.07037353515625,
-0.01071929931640625,
0.0165863037109375,
-0.071044921875,
0.0428466796875,
-0.033294677734375,
-0.020660400390625,
0.01129150390625,
0.0138702392578125,
-0.07611083984375,
0.0203704833984375,
0.0278472900390625,
0.082763671875,
-0.0780029296875,
0.057952880859375,
0.07586669921875,
-0.04974365234375,
-0.0673828125,
-0.0142822265625,
0.0097198486328125,
-0.0703125,
0.0640869140625,
0.0277862548828125,
0.022918701171875,
-0.005977630615234375,
-0.0333251953125,
-0.0789794921875,
0.06085205078125,
0.0149993896484375,
-0.0391845703125,
-0.0187530517578125,
-0.009918212890625,
0.03790283203125,
-0.044647216796875,
0.024658203125,
0.043609619140625,
0.0330810546875,
0.006412506103515625,
-0.08123779296875,
0.0033016204833984375,
-0.032196044921875,
0.003353118896484375,
0.0216064453125,
-0.06793212890625,
0.0574951171875,
-0.0168304443359375,
-0.0259246826171875,
0.00902557373046875,
0.06365966796875,
0.0170745849609375,
0.025238037109375,
0.042449951171875,
0.06549072265625,
0.0440673828125,
-0.006015777587890625,
0.0650634765625,
-0.043731689453125,
0.0279693603515625,
0.08319091796875,
-0.004268646240234375,
0.064208984375,
0.031829833984375,
-0.01096343994140625,
0.06341552734375,
0.051788330078125,
-0.0173492431640625,
0.01611328125,
0.0051422119140625,
-0.0157470703125,
0.00424957275390625,
-0.0210418701171875,
-0.02557373046875,
0.0518798828125,
0.0148773193359375,
-0.037872314453125,
-0.01416015625,
0.00855255126953125,
0.037628173828125,
-0.006412506103515625,
-0.006168365478515625,
0.0682373046875,
0.01181793212890625,
-0.045440673828125,
0.043243408203125,
0.0079498291015625,
0.0489501953125,
-0.02752685546875,
-0.002544403076171875,
-0.0150604248046875,
0.00020575523376464844,
-0.022918701171875,
-0.04241943359375,
0.0269622802734375,
0.0046844482421875,
-0.01476287841796875,
-0.0105438232421875,
0.0263671875,
-0.05328369140625,
-0.051605224609375,
0.03363037109375,
0.043853759765625,
0.03411865234375,
0.002864837646484375,
-0.06768798828125,
0.01287078857421875,
0.0013275146484375,
-0.02691650390625,
0.0261077880859375,
0.03936767578125,
-0.007137298583984375,
0.034912109375,
0.048309326171875,
0.025787353515625,
-0.0051727294921875,
0.0088958740234375,
0.06878662109375,
-0.0582275390625,
-0.035888671875,
-0.05535888671875,
0.02642822265625,
-0.0118255615234375,
-0.035980224609375,
0.0615234375,
0.056488037109375,
0.08441162109375,
0.00678253173828125,
0.043670654296875,
-0.019683837890625,
0.048614501953125,
-0.0284423828125,
0.05322265625,
-0.039794921875,
-0.01251220703125,
-0.0316162109375,
-0.08294677734375,
-0.023834228515625,
0.0582275390625,
-0.01302337646484375,
0.01438140869140625,
0.034759521484375,
0.04058837890625,
-0.0035552978515625,
-0.0251007080078125,
0.00511932373046875,
0.026702880859375,
0.005550384521484375,
0.04583740234375,
0.035858154296875,
-0.041961669921875,
0.0225830078125,
-0.0306396484375,
-0.016815185546875,
-0.002033233642578125,
-0.07171630859375,
-0.07147216796875,
-0.054168701171875,
-0.043914794921875,
-0.05120849609375,
-0.00911712646484375,
0.074951171875,
0.055877685546875,
-0.07208251953125,
-0.01509857177734375,
0.006175994873046875,
-0.0026988983154296875,
-0.0012311935424804688,
-0.015899658203125,
0.038909912109375,
-0.01039886474609375,
-0.06500244140625,
0.005207061767578125,
0.0107421875,
0.01141357421875,
-0.015625,
-0.01151275634765625,
-0.0283660888671875,
-0.00896453857421875,
0.040802001953125,
0.0390625,
-0.053863525390625,
-0.01111602783203125,
-0.002155303955078125,
-0.013275146484375,
0.0105743408203125,
0.032501220703125,
-0.061309814453125,
0.01776123046875,
0.018096923828125,
0.04608154296875,
0.045166015625,
0.0030727386474609375,
0.015625,
-0.04974365234375,
0.020721435546875,
0.00984954833984375,
0.041595458984375,
0.047393798828125,
-0.033538818359375,
0.049041748046875,
0.0190277099609375,
-0.03656005859375,
-0.0552978515625,
0.0008478164672851562,
-0.0750732421875,
0.008453369140625,
0.087158203125,
-0.01873779296875,
-0.035491943359375,
0.0010614395141601562,
-0.00882720947265625,
0.034912109375,
-0.0296783447265625,
0.040771484375,
0.0537109375,
-0.0037689208984375,
-0.0199432373046875,
-0.0384521484375,
0.03729248046875,
0.0201873779296875,
-0.05340576171875,
-0.0230712890625,
0.0290069580078125,
0.04241943359375,
0.017913818359375,
0.05322265625,
-0.0136260986328125,
0.0030918121337890625,
-0.01751708984375,
0.0233001708984375,
0.015777587890625,
-0.0202484130859375,
-0.037628173828125,
-0.018524169921875,
-0.01873779296875,
-0.0002218484878540039
]
] |
FredZhang7/distilgpt2-stable-diffusion-v2 | 2023-03-16T20:06:26.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt2",
"text-generation",
"stable-diffusion",
"prompt-generator",
"arxiv:2210.14140",
"dataset:FredZhang7/stable-diffusion-prompts-2.47M",
"dataset:poloclub/diffusiondb",
"dataset:Gustavosta/Stable-Diffusion-Prompts",
"dataset:bartman081523/stable-diffusion-discord-prompts",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | FredZhang7 | null | null | FredZhang7/distilgpt2-stable-diffusion-v2 | 79 | 16,623 | transformers | 2022-12-10T06:49:07 | ---
license: creativeml-openrail-m
tags:
- stable-diffusion
- prompt-generator
- arxiv:2210.14140
widget:
- text: "amazing"
- text: "a photo of"
- text: "a sci-fi"
- text: "a portrait of"
- text: "a person standing"
- text: "a boy watching"
datasets:
- FredZhang7/stable-diffusion-prompts-2.47M
- poloclub/diffusiondb
- Gustavosta/Stable-Diffusion-Prompts
- bartman081523/stable-diffusion-discord-prompts
---
# Fast GPT2 PromptGen
<style>
.container {
padding-left: 20px;
border-left: 5px solid gray;
}
</style>
<div class="container">
<p><strong><a href="https://huggingface.co/FredZhang7/anime-anything-promptgen-v2">Fast Anime PromptGen</a></strong> generates descriptive safebooru and danbooru tags for anime text-to-image models.</p>
</div>
This model was trained on 2,470,000 descriptive stable diffusion prompts on the [FredZhang7/distilgpt2-stable-diffusion](https://huggingface.co/FredZhang7/distilgpt2-stable-diffusion) checkpoint for another 4,270,000 steps.
Compared to other prompt generation models using GPT2, this one runs with 50% faster forwardpropagation and 40% less disk space & RAM.
Major improvements from v1 are:
- 25% more variations
- faster and more fluent prompt generation
- cleaned training data
* removed prompts that generate images with nsfw scores > 0.5
* removed duplicates, including prompts that differ by capitalization and punctuations
* removed punctuations at random places
* removed prompts shorter than 15 characters
## Live WebUI Demo
See the Prompt Generator tab of [Paint Journey Demo](https://huggingface.co/spaces/FredZhang7/paint-journey-demo).
## Contrastive Search
```bash
pip install --upgrade transformers
```
```python
from transformers import GPT2Tokenizer, GPT2LMHeadModel
tokenizer = GPT2Tokenizer.from_pretrained('distilgpt2')
tokenizer.add_special_tokens({'pad_token': '[PAD]'})
model = GPT2LMHeadModel.from_pretrained('FredZhang7/distilgpt2-stable-diffusion-v2')
prompt = r'a cat sitting' # the beginning of the prompt
temperature = 0.9 # a higher temperature will produce more diverse results, but with a higher risk of less coherent text
top_k = 8 # the number of tokens to sample from at each step
max_length = 80 # the maximum number of tokens for the output of the model
repitition_penalty = 1.2 # the penalty value for each repetition of a token
num_return_sequences=5 # the number of results to generate
# generate the result with contrastive search
input_ids = tokenizer(prompt, return_tensors='pt').input_ids
output = model.generate(input_ids, do_sample=True, temperature=temperature, top_k=top_k, max_length=max_length, num_return_sequences=num_return_sequences, repetition_penalty=repitition_penalty, penalty_alpha=0.6, no_repeat_ngram_size=1, early_stopping=True)
print('\nInput:\n' + 100 * '-')
print('\033[96m' + prompt + '\033[0m')
print('\nOutput:\n' + 100 * '-')
for i in range(len(output)):
print('\033[92m' + tokenizer.decode(output[i], skip_special_tokens=True) + '\033[0m\n')
```
No comma style:

To bring back the commas, assign output without `penalty_alpha` and `no_repeat_ngram_size`:
```python
output = model.generate(input_ids, do_sample=True, temperature=temperature, top_k=top_k, max_length=max_length, num_return_sequences=num_return_sequences, repetition_penalty=repitition_penalty, early_stopping=True)
```
 | 3,500 | [
[
-0.0193634033203125,
-0.051025390625,
0.03253173828125,
0.0157470703125,
-0.027374267578125,
-0.0222625732421875,
-0.009307861328125,
-0.00980377197265625,
-0.0089111328125,
0.012542724609375,
-0.0469970703125,
-0.035400390625,
-0.0458984375,
0.0183563232421875,
0.000659942626953125,
0.0750732421875,
-0.005268096923828125,
-0.007076263427734375,
0.0108642578125,
0.01364898681640625,
-0.044952392578125,
-0.0227813720703125,
-0.07763671875,
-0.0201873779296875,
0.038787841796875,
0.014862060546875,
0.039520263671875,
0.05059814453125,
0.0257110595703125,
0.027923583984375,
-0.0181732177734375,
0.002391815185546875,
-0.03912353515625,
0.002986907958984375,
0.0025234222412109375,
-0.02978515625,
-0.01248931884765625,
0.006317138671875,
0.036468505859375,
0.0230255126953125,
0.0019435882568359375,
0.00035071372985839844,
0.0097198486328125,
0.03759765625,
-0.0275726318359375,
0.031982421875,
-0.02203369140625,
-0.00444793701171875,
-0.0008630752563476562,
-0.00960540771484375,
-0.031280517578125,
-0.0267486572265625,
-0.0001558065414428711,
-0.065185546875,
0.031829833984375,
-0.01099395751953125,
0.0828857421875,
0.0190887451171875,
-0.0284881591796875,
-0.01165008544921875,
-0.0204925537109375,
0.06988525390625,
-0.06097412109375,
-0.005458831787109375,
0.0158843994140625,
0.014801025390625,
-0.0115509033203125,
-0.08721923828125,
-0.047607421875,
-0.01142120361328125,
-0.01898193359375,
0.01473236083984375,
-0.0276336669921875,
-0.011138916015625,
0.0304718017578125,
0.0126800537109375,
-0.06854248046875,
-0.003925323486328125,
-0.018524169921875,
-0.025390625,
0.03497314453125,
0.03363037109375,
0.02447509765625,
-0.0211944580078125,
-0.0309600830078125,
-0.03961181640625,
-0.037445068359375,
0.01422119140625,
0.0128326416015625,
0.022125244140625,
-0.01241302490234375,
0.0293121337890625,
-0.0235137939453125,
0.029296875,
0.00960540771484375,
-0.01629638671875,
0.033447265625,
-0.04290771484375,
-0.004283905029296875,
-0.01558685302734375,
0.07330322265625,
0.054534912109375,
0.01177215576171875,
0.007198333740234375,
-0.00894927978515625,
0.01033782958984375,
-0.016357421875,
-0.1031494140625,
-0.040069580078125,
0.00855255126953125,
-0.04180908203125,
-0.01922607421875,
-0.00677490234375,
-0.06109619140625,
-0.01064300537109375,
-0.0058746337890625,
0.0379638671875,
-0.04998779296875,
-0.0251312255859375,
0.011962890625,
-0.044677734375,
0.00786590576171875,
0.0297698974609375,
-0.05731201171875,
0.0016889572143554688,
0.036163330078125,
0.08038330078125,
0.0021343231201171875,
-0.0176849365234375,
-0.035614013671875,
-0.0251617431640625,
-0.00974273681640625,
0.04376220703125,
-0.04644775390625,
-0.039947509765625,
-0.0299072265625,
0.0217437744140625,
-0.00716400146484375,
-0.031524658203125,
0.02978515625,
-0.0257415771484375,
0.03485107421875,
-0.0255889892578125,
-0.027374267578125,
-0.03460693359375,
0.0033435821533203125,
-0.035980224609375,
0.089599609375,
0.0184783935546875,
-0.062286376953125,
0.0134124755859375,
-0.04913330078125,
-0.0240936279296875,
-0.0007147789001464844,
0.008331298828125,
-0.03863525390625,
-0.014556884765625,
0.026947021484375,
0.029937744140625,
-0.01253509521484375,
0.01097869873046875,
-0.006664276123046875,
-0.04168701171875,
0.0124053955078125,
-0.03192138671875,
0.07159423828125,
0.022918701171875,
-0.057708740234375,
0.0091094970703125,
-0.05230712890625,
0.00839996337890625,
0.025390625,
-0.035858154296875,
0.000008344650268554688,
-0.0262298583984375,
0.01360321044921875,
0.01155853271484375,
0.020538330078125,
-0.02642822265625,
0.0221405029296875,
-0.0218048095703125,
0.036956787109375,
0.060791015625,
0.00485992431640625,
0.0182952880859375,
-0.027435302734375,
0.05072021484375,
0.0159149169921875,
-0.006397247314453125,
-0.032318115234375,
-0.0609130859375,
-0.04339599609375,
-0.0280914306640625,
0.01477813720703125,
0.058563232421875,
-0.06292724609375,
0.03936767578125,
-0.005825042724609375,
-0.047698974609375,
-0.0206451416015625,
-0.0157928466796875,
0.035003662109375,
0.06195068359375,
0.03521728515625,
-0.0163421630859375,
-0.045623779296875,
-0.045074462890625,
0.0082244873046875,
-0.006130218505859375,
-0.0032253265380859375,
0.02001953125,
0.048248291015625,
-0.017669677734375,
0.04925537109375,
-0.050689697265625,
0.0025424957275390625,
-0.0216827392578125,
0.02203369140625,
0.0428466796875,
0.0479736328125,
0.039215087890625,
-0.0467529296875,
-0.054168701171875,
-0.0186614990234375,
-0.050537109375,
-0.01354217529296875,
0.0003516674041748047,
-0.0126190185546875,
0.01003265380859375,
0.02685546875,
-0.05841064453125,
0.034271240234375,
0.013214111328125,
-0.04608154296875,
0.046539306640625,
-0.020843505859375,
0.01885986328125,
-0.08782958984375,
0.0308990478515625,
0.005878448486328125,
-0.01045989990234375,
-0.0479736328125,
0.0117340087890625,
-0.00415802001953125,
-0.007389068603515625,
-0.037994384765625,
0.055938720703125,
-0.03460693359375,
0.0290985107421875,
-0.0207366943359375,
-0.00004690885543823242,
0.0082855224609375,
0.044158935546875,
0.008514404296875,
0.07281494140625,
0.053863525390625,
-0.0443115234375,
0.024993896484375,
0.029449462890625,
-0.0230255126953125,
0.01152801513671875,
-0.058990478515625,
0.024566650390625,
-0.0162353515625,
0.032623291015625,
-0.1146240234375,
-0.00933074951171875,
0.04083251953125,
-0.04400634765625,
0.0203094482421875,
-0.01165008544921875,
-0.04217529296875,
-0.025634765625,
-0.020904541015625,
0.035552978515625,
0.0740966796875,
-0.0357666015625,
0.0237274169921875,
-0.014739990234375,
-0.02252197265625,
-0.0390625,
-0.04931640625,
0.0008764266967773438,
-0.0252532958984375,
-0.05126953125,
0.012786865234375,
-0.0144805908203125,
0.005184173583984375,
-0.016937255859375,
0.01425933837890625,
0.013092041015625,
0.01568603515625,
0.0170745849609375,
0.0200958251953125,
-0.0157318115234375,
-0.017852783203125,
-0.00942230224609375,
-0.027923583984375,
0.00482940673828125,
-0.0103759765625,
0.07403564453125,
-0.0069427490234375,
0.011962890625,
-0.05059814453125,
0.0289306640625,
0.017730712890625,
-0.00604248046875,
0.044189453125,
0.07745361328125,
-0.0190582275390625,
0.0118865966796875,
-0.01091766357421875,
-0.023284912109375,
-0.03924560546875,
0.037841796875,
-0.0300750732421875,
-0.051177978515625,
0.048309326171875,
0.00910186767578125,
-0.006805419921875,
0.0513916015625,
0.0374755859375,
0.013671875,
0.0906982421875,
0.02508544921875,
0.016021728515625,
0.031585693359375,
-0.0390625,
0.0008535385131835938,
-0.060333251953125,
-0.03314208984375,
-0.037139892578125,
-0.026947021484375,
-0.04302978515625,
-0.0343017578125,
0.03472900390625,
0.0160980224609375,
-0.0290374755859375,
0.019561767578125,
-0.0589599609375,
0.032623291015625,
0.0498046875,
0.02447509765625,
-0.0094757080078125,
0.00897216796875,
-0.0086822509765625,
-0.01271820068359375,
-0.044830322265625,
-0.02886962890625,
0.07781982421875,
0.02197265625,
0.04888916015625,
0.006778717041015625,
0.045684814453125,
-0.003376007080078125,
0.01232147216796875,
-0.0439453125,
0.054046630859375,
-0.0209197998046875,
-0.037200927734375,
-0.0153656005859375,
-0.036163330078125,
-0.0765380859375,
0.0163726806640625,
0.006694793701171875,
-0.053863525390625,
0.00699615478515625,
0.0020923614501953125,
-0.0279998779296875,
0.0165252685546875,
-0.05157470703125,
0.0701904296875,
0.004730224609375,
-0.03460693359375,
0.011962890625,
-0.06756591796875,
0.018402099609375,
0.014862060546875,
-0.01111602783203125,
0.00023376941680908203,
-0.0047607421875,
0.059722900390625,
-0.035797119140625,
0.06256103515625,
-0.034332275390625,
0.00806427001953125,
0.0174102783203125,
0.0009617805480957031,
0.0428466796875,
-0.002384185791015625,
-0.008209228515625,
0.008636474609375,
-0.0164337158203125,
-0.0235443115234375,
-0.0218963623046875,
0.04486083984375,
-0.046295166015625,
-0.03863525390625,
-0.035400390625,
-0.033843994140625,
0.0282440185546875,
0.0285186767578125,
0.062469482421875,
0.03533935546875,
0.005863189697265625,
-0.00400543212890625,
0.05194091796875,
-0.03509521484375,
0.050079345703125,
0.01496124267578125,
-0.01340484619140625,
-0.049652099609375,
0.06610107421875,
-0.0043182373046875,
0.00870513916015625,
0.0163726806640625,
0.026397705078125,
-0.043487548828125,
-0.0243682861328125,
-0.03533935546875,
0.0171966552734375,
-0.0640869140625,
-0.0197906494140625,
-0.057342529296875,
-0.00579833984375,
-0.0469970703125,
-0.00537872314453125,
-0.003383636474609375,
-0.0306243896484375,
-0.058013916015625,
0.0057373046875,
0.040740966796875,
0.03729248046875,
-0.0251617431640625,
0.0335693359375,
-0.037750244140625,
0.0261993408203125,
0.0135650634765625,
0.002834320068359375,
-0.0016021728515625,
-0.05511474609375,
-0.0010776519775390625,
-0.003223419189453125,
-0.0341796875,
-0.0626220703125,
0.042724609375,
0.009185791015625,
0.023468017578125,
0.0292205810546875,
-0.01018524169921875,
0.056976318359375,
-0.0302734375,
0.08892822265625,
0.0254364013671875,
-0.061126708984375,
0.0364990234375,
-0.0248870849609375,
0.025634765625,
0.0272979736328125,
0.02880859375,
-0.040740966796875,
-0.0187225341796875,
-0.04058837890625,
-0.07391357421875,
0.0633544921875,
0.025848388671875,
0.006977081298828125,
-0.0112457275390625,
0.0386962890625,
0.01345062255859375,
0.003871917724609375,
-0.057525634765625,
-0.042510986328125,
-0.03558349609375,
-0.00960540771484375,
-0.024169921875,
-0.01297760009765625,
0.0011758804321289062,
-0.0355224609375,
0.06451416015625,
-0.005764007568359375,
0.04644775390625,
0.0296783447265625,
0.012939453125,
0.006458282470703125,
0.003147125244140625,
0.050384521484375,
0.0254058837890625,
-0.0233306884765625,
0.0055389404296875,
-0.0033283233642578125,
-0.036651611328125,
0.0201568603515625,
0.0234222412109375,
-0.0198822021484375,
0.027801513671875,
0.0187225341796875,
0.0772705078125,
-0.01995849609375,
-0.0181884765625,
0.032958984375,
-0.01314544677734375,
-0.02239990234375,
-0.01464080810546875,
0.019561767578125,
0.0079498291015625,
0.018524169921875,
0.0214385986328125,
0.007843017578125,
0.0243988037109375,
-0.03668212890625,
0.01019287109375,
0.030975341796875,
0.004055023193359375,
-0.01502227783203125,
0.08270263671875,
0.0022125244140625,
-0.01404571533203125,
0.06280517578125,
-0.0266265869140625,
-0.04925537109375,
0.06219482421875,
0.04962158203125,
0.07135009765625,
-0.01099395751953125,
0.016571044921875,
0.058685302734375,
0.009918212890625,
-0.02587890625,
0.038787841796875,
0.0035858154296875,
-0.0272674560546875,
-0.0107269287109375,
-0.03314208984375,
-0.0120086669921875,
0.0278167724609375,
-0.043487548828125,
0.0209503173828125,
-0.041015625,
-0.025238037109375,
-0.01016998291015625,
0.00438690185546875,
-0.05316162109375,
0.029632568359375,
0.0074462890625,
0.07562255859375,
-0.06787109375,
0.036468505859375,
0.040496826171875,
-0.040069580078125,
-0.0701904296875,
-0.004344940185546875,
-0.00943756103515625,
-0.059722900390625,
0.048736572265625,
0.048370361328125,
0.00811004638671875,
0.0116119384765625,
-0.056640625,
-0.05902099609375,
0.090087890625,
0.0218353271484375,
-0.039398193359375,
-0.01413726806640625,
0.004241943359375,
0.04302978515625,
-0.017425537109375,
0.041290283203125,
0.043121337890625,
0.0479736328125,
0.007232666015625,
-0.04962158203125,
0.042022705078125,
-0.04290771484375,
0.0158538818359375,
-0.0010814666748046875,
-0.060760498046875,
0.091552734375,
-0.02398681640625,
-0.03057861328125,
0.04925537109375,
0.038818359375,
0.0274810791015625,
0.03515625,
0.0211334228515625,
0.058502197265625,
0.0513916015625,
-0.035736083984375,
0.07769775390625,
-0.00980377197265625,
0.037750244140625,
0.055999755859375,
-0.00036454200744628906,
0.040557861328125,
0.027862548828125,
-0.01363372802734375,
0.05389404296875,
0.07000732421875,
-0.019744873046875,
0.06915283203125,
0.0157928466796875,
-0.0160369873046875,
-0.01226806640625,
0.0093841552734375,
-0.04840087890625,
-0.0002732276916503906,
0.006008148193359375,
-0.032958984375,
-0.01910400390625,
-0.00656890869140625,
0.029937744140625,
-0.01605224609375,
-0.00237274169921875,
0.048370361328125,
-0.004863739013671875,
-0.055145263671875,
0.075927734375,
0.01032257080078125,
0.06329345703125,
-0.047698974609375,
-0.007610321044921875,
-0.0323486328125,
0.01256561279296875,
-0.00823974609375,
-0.06683349609375,
-0.00940704345703125,
0.002185821533203125,
-0.00960540771484375,
-0.009857177734375,
0.039886474609375,
-0.031005859375,
-0.0322265625,
0.0029449462890625,
0.00458526611328125,
0.030487060546875,
-0.006328582763671875,
-0.061737060546875,
0.0017633438110351562,
0.01413726806640625,
-0.04315185546875,
0.006900787353515625,
0.027374267578125,
0.0212860107421875,
0.034698486328125,
0.0546875,
-0.00872802734375,
0.0116119384765625,
-0.01055145263671875,
0.0635986328125,
-0.05328369140625,
-0.04315185546875,
-0.0697021484375,
0.0604248046875,
0.0017805099487304688,
-0.041839599609375,
0.051971435546875,
0.05364990234375,
0.050323486328125,
-0.0210113525390625,
0.06964111328125,
-0.00974273681640625,
0.0200958251953125,
-0.0555419921875,
0.04315185546875,
-0.048675537109375,
0.0022869110107421875,
-0.0124053955078125,
-0.0557861328125,
-0.0068206787109375,
0.058685302734375,
-0.0034236907958984375,
0.025848388671875,
0.048126220703125,
0.08148193359375,
-0.0025920867919921875,
-0.0242919921875,
-0.0014505386352539062,
0.0106658935546875,
0.0323486328125,
0.06298828125,
0.052459716796875,
-0.0576171875,
0.0394287109375,
-0.033355712890625,
-0.011932373046875,
-0.01493072509765625,
-0.0634765625,
-0.0718994140625,
-0.042083740234375,
-0.035247802734375,
-0.059112548828125,
-0.00782012939453125,
0.052093505859375,
0.051666259765625,
-0.05157470703125,
0.00438690185546875,
-0.0165252685546875,
-0.019256591796875,
-0.0166778564453125,
-0.025970458984375,
0.035247802734375,
-0.01148223876953125,
-0.0657958984375,
0.022705078125,
-0.00618743896484375,
0.02178955078125,
0.0033397674560546875,
-0.018035888671875,
-0.00021767616271972656,
-0.0038604736328125,
0.0217132568359375,
0.017059326171875,
-0.04644775390625,
-0.00797271728515625,
-0.004253387451171875,
-0.0177764892578125,
0.019256591796875,
0.0283050537109375,
-0.0498046875,
0.0243682861328125,
0.046478271484375,
0.0220947265625,
0.045867919921875,
-0.0031490325927734375,
0.0210723876953125,
-0.048492431640625,
0.01226043701171875,
-0.007175445556640625,
0.0355224609375,
0.035003662109375,
-0.03851318359375,
0.040985107421875,
0.044464111328125,
-0.048583984375,
-0.055908203125,
0.00827789306640625,
-0.08538818359375,
-0.0288543701171875,
0.1083984375,
-0.019744873046875,
-0.0142669677734375,
0.0169830322265625,
-0.04150390625,
0.03912353515625,
-0.047149658203125,
0.04461669921875,
0.041534423828125,
-0.0178070068359375,
-0.01055908203125,
-0.01464080810546875,
0.036712646484375,
0.032257080078125,
-0.048736572265625,
-0.0033588409423828125,
0.036834716796875,
0.06451416015625,
0.0171661376953125,
0.04913330078125,
-0.001239776611328125,
0.035064697265625,
0.01247406005859375,
-0.00327301025390625,
-0.023468017578125,
-0.00434112548828125,
-0.04058837890625,
0.006954193115234375,
-0.0233001708984375,
-0.024749755859375
]
] |
Yntec/AbsoluteReality | 2023-10-03T10:40:59.000Z | [
"diffusers",
"General",
"LandScapes",
"Photorealistic",
"Lykon",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/AbsoluteReality | 0 | 16,599 | diffusers | 2023-10-03T09:44:25 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- General
- LandScapes
- Photorealistic
- Lykon
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
---
# Absolute Reality
v1.8.1 version of this model. Original page: https://civitai.com/models/81458?modelVersionId=132760
Sample and prompt:


Full body picture of a pretty cute girl making cake in school, detailed brown eyes, short smile, beautiful and aesthetic, intricate, neat hair, highly detailed, detailed face, smooth, sharp focus, chiaroscuro, magazine ad, 1949, 2D Game Art, anime on canvas, rossdraws, clay mann, CHIBI ART, light novel cover art | 913 | [
[
-0.0224761962890625,
-0.06109619140625,
0.031646728515625,
0.030792236328125,
-0.0143280029296875,
-0.017333984375,
0.03582763671875,
-0.0350341796875,
0.05035400390625,
0.037994384765625,
-0.06072998046875,
-0.039703369140625,
-0.0228729248046875,
-0.0203857421875,
-0.043304443359375,
0.039825439453125,
0.0005631446838378906,
0.01482391357421875,
-0.031463623046875,
0.002628326416015625,
-0.036468505859375,
-0.026824951171875,
-0.04010009765625,
-0.01873779296875,
0.017486572265625,
0.047027587890625,
0.035003662109375,
0.036163330078125,
0.0288543701171875,
0.0175323486328125,
-0.01078033447265625,
-0.0211029052734375,
-0.031402587890625,
-0.01377105712890625,
-0.01486968994140625,
-0.0277862548828125,
-0.0704345703125,
0.0230712890625,
0.0433349609375,
0.04022216796875,
-0.020172119140625,
0.0245513916015625,
-0.003505706787109375,
0.042510986328125,
-0.0283966064453125,
-0.0009527206420898438,
-0.006458282470703125,
0.021026611328125,
-0.0012607574462890625,
0.02740478515625,
-0.0180206298828125,
-0.0225982666015625,
-0.01044464111328125,
-0.06951904296875,
0.0147247314453125,
-0.0177001953125,
0.106689453125,
0.00452423095703125,
-0.042205810546875,
0.0023479461669921875,
-0.04327392578125,
0.0273284912109375,
-0.028289794921875,
0.046356201171875,
0.028411865234375,
0.039031982421875,
-0.021636962890625,
-0.06561279296875,
-0.041534423828125,
0.0192108154296875,
0.01441192626953125,
0.0217132568359375,
-0.03448486328125,
-0.017913818359375,
0.02142333984375,
0.0235748291015625,
-0.04998779296875,
-0.009857177734375,
-0.048492431640625,
0.017364501953125,
0.04315185546875,
0.006916046142578125,
0.036651611328125,
0.01192474365234375,
-0.021240234375,
-0.0108795166015625,
-0.04278564453125,
0.0118560791015625,
0.0166473388671875,
-0.00876617431640625,
-0.0426025390625,
0.03729248046875,
-0.0146484375,
0.04559326171875,
0.014068603515625,
-0.002635955810546875,
0.0205535888671875,
-0.01373291015625,
-0.033782958984375,
-0.01024627685546875,
0.040679931640625,
0.040252685546875,
0.005126953125,
0.027984619140625,
0.0150604248046875,
0.0061187744140625,
0.03515625,
-0.08917236328125,
-0.02447509765625,
0.0211944580078125,
-0.0196533203125,
-0.0416259765625,
0.033050537109375,
-0.049591064453125,
0.003627777099609375,
-0.018402099609375,
0.0155487060546875,
-0.0265045166015625,
-0.04742431640625,
0.0211944580078125,
0.0260162353515625,
0.0205841064453125,
0.02313232421875,
-0.02960205078125,
0.0180511474609375,
0.038665771484375,
0.05633544921875,
0.0288543701171875,
0.020355224609375,
0.0014276504516601562,
-0.002685546875,
-0.04779052734375,
0.0621337890625,
-0.013397216796875,
-0.043487548828125,
-0.020172119140625,
0.0189056396484375,
0.023040771484375,
-0.042877197265625,
0.06982421875,
-0.018798828125,
0.00377655029296875,
-0.03448486328125,
-0.0294342041015625,
-0.029205322265625,
0.00872039794921875,
-0.04644775390625,
0.03924560546875,
0.00992584228515625,
-0.030426025390625,
0.035797119140625,
-0.048370361328125,
-0.01139068603515625,
0.00478363037109375,
-0.00545501708984375,
-0.045989990234375,
0.023773193359375,
-0.015533447265625,
0.01271820068359375,
0.006626129150390625,
0.004528045654296875,
-0.05609130859375,
-0.0247039794921875,
0.01227569580078125,
-0.01398468017578125,
0.075927734375,
0.0300750732421875,
-0.0109405517578125,
0.003055572509765625,
-0.06805419921875,
0.014129638671875,
0.046966552734375,
0.019134521484375,
-0.006587982177734375,
-0.0200347900390625,
0.0000902414321899414,
0.0401611328125,
0.03240966796875,
-0.04034423828125,
0.031982421875,
-0.005191802978515625,
-0.004947662353515625,
0.046539306640625,
0.002582550048828125,
-0.00202178955078125,
-0.039642333984375,
0.046356201171875,
0.017486572265625,
0.035308837890625,
-0.01131439208984375,
-0.0285186767578125,
-0.06494140625,
-0.054107666015625,
0.032196044921875,
0.03314208984375,
-0.0164337158203125,
0.05218505859375,
-0.01453399658203125,
-0.06793212890625,
-0.0413818359375,
-0.00772857666015625,
0.024688720703125,
0.03912353515625,
-0.01177215576171875,
-0.03680419921875,
-0.04974365234375,
-0.10308837890625,
0.01107025146484375,
-0.0143890380859375,
-0.008880615234375,
0.00482177734375,
0.0340576171875,
0.0078887939453125,
0.059539794921875,
-0.039886474609375,
-0.01873779296875,
-0.00640106201171875,
-0.0124969482421875,
0.0433349609375,
0.04620361328125,
0.06866455078125,
-0.0828857421875,
-0.04644775390625,
-0.026611328125,
-0.0777587890625,
-0.00762176513671875,
0.0200347900390625,
-0.0401611328125,
-0.021209716796875,
0.00884246826171875,
-0.051971435546875,
0.06170654296875,
0.014068603515625,
-0.0689697265625,
0.052703857421875,
-0.01497650146484375,
0.037628173828125,
-0.0693359375,
0.0104217529296875,
0.04486083984375,
-0.03118896484375,
-0.029876708984375,
0.0654296875,
-0.002544403076171875,
-0.006137847900390625,
-0.06170654296875,
0.06524658203125,
-0.048583984375,
0.02105712890625,
-0.024993896484375,
-0.004222869873046875,
0.026824951171875,
0.01206207275390625,
-0.0009813308715820312,
0.03900146484375,
0.052581787109375,
-0.038726806640625,
0.03399658203125,
0.0295562744140625,
-0.031585693359375,
0.07415771484375,
-0.0718994140625,
0.004650115966796875,
-0.005390167236328125,
0.0112152099609375,
-0.0670166015625,
-0.050262451171875,
0.036865234375,
-0.046966552734375,
0.0068817138671875,
0.00396728515625,
-0.047210693359375,
-0.0399169921875,
-0.0181121826171875,
0.00852203369140625,
0.052886962890625,
-0.034027099609375,
0.0184783935546875,
0.024627685546875,
0.01297760009765625,
-0.00429534912109375,
-0.05828857421875,
-0.003070831298828125,
-0.00913238525390625,
-0.04278564453125,
0.0280609130859375,
-0.0266265869140625,
-0.04180908203125,
-0.00434112548828125,
-0.02020263671875,
-0.036865234375,
-0.00817108154296875,
0.030914306640625,
0.04205322265625,
-0.0139312744140625,
-0.0416259765625,
0.007205963134765625,
0.01434326171875,
0.003353118896484375,
0.0199737548828125,
0.02996826171875,
-0.01473236083984375,
-0.033538818359375,
-0.0726318359375,
0.0224456787109375,
0.050872802734375,
0.0059661865234375,
0.0595703125,
0.0003688335418701172,
-0.060455322265625,
0.00891876220703125,
-0.032684326171875,
-0.00916290283203125,
-0.034088134765625,
0.0236053466796875,
-0.062225341796875,
-0.01043701171875,
0.046417236328125,
0.0238800048828125,
-0.008331298828125,
0.022064208984375,
0.0234832763671875,
0.0076751708984375,
0.0946044921875,
0.050628662109375,
0.0179443359375,
0.025634765625,
-0.040374755859375,
-0.0054779052734375,
-0.032012939453125,
-0.023590087890625,
-0.019683837890625,
-0.0173797607421875,
-0.01497650146484375,
-0.0130462646484375,
0.0117034912109375,
0.0245819091796875,
-0.04205322265625,
0.0416259765625,
-0.042205810546875,
0.04779052734375,
0.023223876953125,
0.054046630859375,
0.0169219970703125,
-0.000499725341796875,
0.0006890296936035156,
-0.03387451171875,
-0.039581298828125,
-0.044708251953125,
0.049560546875,
0.01297760009765625,
0.00955963134765625,
0.032745361328125,
0.0252838134765625,
-0.004566192626953125,
0.0193634033203125,
-0.04736328125,
0.060150146484375,
-0.0084381103515625,
-0.08563232421875,
0.0286865234375,
-0.0171661376953125,
-0.03387451171875,
0.00684356689453125,
-0.01558685302734375,
-0.034271240234375,
0.0204925537109375,
0.009857177734375,
-0.0303192138671875,
0.028411865234375,
-0.0548095703125,
0.078369140625,
-0.0149383544921875,
-0.048828125,
0.0084075927734375,
-0.019287109375,
0.036163330078125,
0.00711822509765625,
0.01108551025390625,
-0.0004439353942871094,
-0.0036220550537109375,
0.0218048095703125,
-0.031402587890625,
0.06243896484375,
-0.00960540771484375,
-0.0009484291076660156,
0.0243072509765625,
0.034332275390625,
0.0085906982421875,
0.0166778564453125,
-0.016845703125,
-0.002300262451171875,
-0.00742340087890625,
-0.049407958984375,
-0.03546142578125,
0.07415771484375,
-0.04132080078125,
-0.038604736328125,
-0.05023193359375,
-0.00308990478515625,
0.002834320068359375,
0.02557373046875,
0.040740966796875,
0.038330078125,
-0.0438232421875,
0.0118408203125,
0.060150146484375,
0.0110015869140625,
0.005428314208984375,
0.034698486328125,
-0.057464599609375,
-0.00977325439453125,
0.05816650390625,
0.015777587890625,
0.0175018310546875,
-0.0004055500030517578,
-0.003292083740234375,
0.0005664825439453125,
-0.009979248046875,
-0.027618408203125,
0.033966064453125,
-0.0308990478515625,
-0.004486083984375,
-0.031890869140625,
-0.0242156982421875,
-0.03741455078125,
-0.0222320556640625,
-0.048614501953125,
-0.020050048828125,
-0.046661376953125,
-0.0197601318359375,
0.01715087890625,
0.048248291015625,
0.0039215087890625,
0.01200103759765625,
-0.03369140625,
0.024505615234375,
0.0291900634765625,
0.040863037109375,
-0.0144195556640625,
-0.056793212890625,
0.00922393798828125,
0.0193023681640625,
-0.0546875,
-0.06268310546875,
0.0374755859375,
0.01629638671875,
0.034027099609375,
0.04376220703125,
0.000022530555725097656,
0.049102783203125,
-0.0199127197265625,
0.053070068359375,
0.023193359375,
-0.046112060546875,
0.0428466796875,
-0.0302581787109375,
0.045745849609375,
0.05865478515625,
0.051300048828125,
-0.0238037109375,
0.00388336181640625,
-0.07470703125,
-0.05572509765625,
0.031768798828125,
0.0182647705078125,
0.01509857177734375,
0.020355224609375,
0.03936767578125,
0.001834869384765625,
0.0270233154296875,
-0.050140380859375,
-0.042633056640625,
-0.0338134765625,
-0.026824951171875,
0.034088134765625,
-0.01203155517578125,
-0.0027904510498046875,
-0.0511474609375,
0.0726318359375,
0.01132965087890625,
0.02813720703125,
0.01407623291015625,
0.029083251953125,
-0.0196533203125,
-0.0059661865234375,
0.04345703125,
0.07415771484375,
-0.03814697265625,
-0.0184783935546875,
-0.0173797607421875,
-0.03106689453125,
0.027008056640625,
-0.0034961700439453125,
-0.0293731689453125,
0.016204833984375,
0.00989532470703125,
0.07171630859375,
0.018646240234375,
-0.04095458984375,
0.051300048828125,
-0.0155487060546875,
-0.0034656524658203125,
-0.05615234375,
0.0197601318359375,
-0.00005704164505004883,
0.01190185546875,
0.016815185546875,
0.00506591796875,
0.0295257568359375,
-0.061126708984375,
0.02777099609375,
0.01280975341796875,
-0.058258056640625,
-0.05279541015625,
0.0682373046875,
-0.0169219970703125,
-0.03582763671875,
0.0330810546875,
-0.0299530029296875,
-0.034210205078125,
0.06390380859375,
0.052581787109375,
0.0574951171875,
-0.022979736328125,
0.04296875,
0.05572509765625,
-0.01267242431640625,
-0.009246826171875,
0.0262908935546875,
0.025604248046875,
-0.0233306884765625,
0.0103302001953125,
-0.038726806640625,
-0.0272674560546875,
0.0169525146484375,
-0.0419921875,
0.0714111328125,
-0.0706787109375,
0.00008153915405273438,
0.00984954833984375,
-0.0002243518829345703,
-0.032928466796875,
0.0513916015625,
0.00701904296875,
0.0927734375,
-0.05157470703125,
0.059814453125,
0.07470703125,
-0.05224609375,
-0.05419921875,
-0.0165557861328125,
0.0328369140625,
-0.0298004150390625,
0.029632568359375,
0.023193359375,
0.00868988037109375,
-0.0140228271484375,
-0.0814208984375,
-0.048248291015625,
0.058074951171875,
0.032989501953125,
-0.0635986328125,
-0.0263671875,
-0.0276641845703125,
0.03265380859375,
-0.04669189453125,
0.0301361083984375,
0.0159454345703125,
0.0282745361328125,
0.05841064453125,
-0.053741455078125,
-0.0246124267578125,
-0.06732177734375,
-0.0004558563232421875,
-0.010101318359375,
-0.07733154296875,
0.0560302734375,
-0.01007080078125,
-0.01027679443359375,
0.03302001953125,
0.0819091796875,
0.04681396484375,
0.01483154296875,
0.0396728515625,
0.0523681640625,
0.01352691650390625,
-0.0126953125,
0.07037353515625,
-0.00946807861328125,
0.0104217529296875,
0.08416748046875,
-0.0169219970703125,
0.0341796875,
0.0005006790161132812,
-0.0198974609375,
0.030670166015625,
0.0628662109375,
-0.00717926025390625,
0.04290771484375,
0.00475311279296875,
-0.02069091796875,
-0.0318603515625,
-0.015655517578125,
-0.0318603515625,
0.02337646484375,
0.0008263587951660156,
0.0005850791931152344,
0.00024318695068359375,
0.0117034912109375,
0.0247955322265625,
0.036773681640625,
-0.0217437744140625,
0.0377197265625,
0.00994873046875,
-0.0182647705078125,
0.032958984375,
-0.01525115966796875,
0.047607421875,
-0.04876708984375,
-0.0297698974609375,
-0.0232391357421875,
0.0144500732421875,
-0.0172271728515625,
-0.05609130859375,
0.022247314453125,
-0.00200653076171875,
-0.0176544189453125,
-0.006755828857421875,
0.040924072265625,
-0.00550079345703125,
-0.09185791015625,
0.01142120361328125,
-0.005100250244140625,
0.0180816650390625,
-0.003856658935546875,
-0.07330322265625,
0.025604248046875,
0.001575469970703125,
-0.0208740234375,
-0.0006933212280273438,
0.02496337890625,
0.014984130859375,
0.043670654296875,
0.009063720703125,
0.016265869140625,
-0.00861358642578125,
-0.02142333984375,
0.03558349609375,
-0.032623291015625,
-0.039306640625,
-0.03802490234375,
0.045013427734375,
-0.029541015625,
-0.037994384765625,
0.05718994140625,
0.06298828125,
0.06793212890625,
-0.0177459716796875,
0.039947509765625,
-0.018402099609375,
0.04644775390625,
-0.01436614990234375,
0.0675048828125,
-0.0875244140625,
-0.0146636962890625,
-0.028656005859375,
-0.0677490234375,
-0.0156402587890625,
0.06439208984375,
0.016021728515625,
0.0155487060546875,
0.007049560546875,
0.051605224609375,
-0.033782958984375,
0.007373809814453125,
0.0215301513671875,
0.032135009765625,
0.026885986328125,
0.0113525390625,
0.0178070068359375,
-0.062744140625,
-0.00762939453125,
-0.040924072265625,
-0.038848876953125,
-0.016876220703125,
-0.051544189453125,
-0.042572021484375,
-0.0511474609375,
-0.040985107421875,
-0.047149658203125,
0.00522613525390625,
0.08148193359375,
0.0810546875,
-0.060028076171875,
-0.0205230712890625,
0.00963592529296875,
-0.0079345703125,
-0.00518035888671875,
-0.020111083984375,
-0.00504302978515625,
0.058502197265625,
-0.068115234375,
0.0200347900390625,
-0.0216217041015625,
0.036834716796875,
-0.0302581787109375,
0.0214385986328125,
-0.03631591796875,
0.0204620361328125,
0.0251312255859375,
0.036041259765625,
-0.0147857666015625,
-0.023956298828125,
0.003208160400390625,
-0.00852203369140625,
0.02679443359375,
0.02410888671875,
-0.0230560302734375,
0.0151214599609375,
0.017730712890625,
-0.007534027099609375,
0.035675048828125,
-0.00865936279296875,
0.0218353271484375,
-0.0178070068359375,
0.035552978515625,
0.00867462158203125,
0.0271148681640625,
0.030426025390625,
-0.0257415771484375,
0.04632568359375,
0.036651611328125,
-0.03192138671875,
-0.08184814453125,
0.023162841796875,
-0.08624267578125,
-0.029266357421875,
0.0433349609375,
0.004711151123046875,
-0.062744140625,
0.034515380859375,
-0.0230560302734375,
0.0225372314453125,
-0.020355224609375,
0.03753662109375,
0.05731201171875,
0.0014352798461914062,
-0.0175018310546875,
-0.04974365234375,
-0.006618499755859375,
0.006641387939453125,
-0.054779052734375,
-0.0245361328125,
0.05047607421875,
0.0335693359375,
0.00945281982421875,
0.0379638671875,
-0.03851318359375,
0.0250701904296875,
0.0127410888671875,
0.048431396484375,
-0.004791259765625,
-0.051025390625,
-0.002105712890625,
0.01377105712890625,
-0.011322021484375,
-0.0034046173095703125
]
] |
ai-forever/rugpt3large_based_on_gpt2 | 2023-11-03T12:50:25.000Z | [
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"PyTorch",
"Transformers",
"ru",
"arxiv:2309.10931",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | ai-forever | null | null | ai-forever/rugpt3large_based_on_gpt2 | 61 | 16,557 | transformers | 2022-03-02T23:29:05 | ---
language:
- ru
tags:
- PyTorch
- Transformers
thumbnail: "https://github.com/sberbank-ai/ru-gpts"
---
# rugpt3large\_based\_on\_gpt2
The model architecture design, pretraining, and evaluation are documented in our preprint: [**A Family of Pretrained Transformer Language Models for Russian**](https://arxiv.org/abs/2309.10931).
The model was trained with sequence length 1024 using transformers lib by the [SberDevices](https://sberdevices.ru/) team on 80B tokens for 3 epochs. After that, the model was finetuned 1 epoch with sequence length 2048.
Total training time was around 14 days on 128 GPUs for 1024 context and a few days on 16 GPUs for 2048 context.
The final perplexity on the test set is `13.6`.
# Authors
+ NLP core team RnD [Telegram channel](https://t.me/nlpcoreteam):
+ Dmitry Zmitrovich
# Cite us
```
@misc{zmitrovich2023family,
title={A Family of Pretrained Transformer Language Models for Russian},
author={Dmitry Zmitrovich and Alexander Abramov and Andrey Kalmykov and Maria Tikhonova and Ekaterina Taktasheva and Danil Astafurov and Mark Baushenko and Artem Snegirev and Tatiana Shavrina and Sergey Markov and Vladislav Mikhailov and Alena Fenogenova},
year={2023},
eprint={2309.10931},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 1,311 | [
[
-0.024566650390625,
-0.0362548828125,
0.0216522216796875,
0.0177154541015625,
-0.032073974609375,
-0.0172271728515625,
-0.028106689453125,
-0.01788330078125,
-0.03759765625,
0.00992584228515625,
-0.0377197265625,
-0.00963592529296875,
-0.048797607421875,
-0.00368499755859375,
-0.0142364501953125,
0.10382080078125,
-0.01271820068359375,
0.0187225341796875,
0.0212249755859375,
0.00249481201171875,
-0.01690673828125,
-0.031646728515625,
-0.044097900390625,
-0.0308837890625,
-0.0016775131225585938,
0.006740570068359375,
0.0435791015625,
0.0400390625,
0.0285491943359375,
0.0160675048828125,
-0.0093994140625,
-0.0248260498046875,
-0.04443359375,
-0.0150146484375,
0.005420684814453125,
-0.019073486328125,
-0.0428466796875,
-0.0005097389221191406,
0.056365966796875,
0.0203857421875,
-0.0167236328125,
0.0287322998046875,
0.02490234375,
0.0243377685546875,
-0.017791748046875,
0.009857177734375,
-0.04876708984375,
0.01177978515625,
-0.024566650390625,
-0.00021886825561523438,
-0.05694580078125,
0.006328582763671875,
0.0220947265625,
-0.03472900390625,
0.029937744140625,
-0.005474090576171875,
0.0899658203125,
0.0124664306640625,
-0.02325439453125,
0.01073455810546875,
-0.07086181640625,
0.056121826171875,
-0.06072998046875,
0.046600341796875,
0.02154541015625,
0.025146484375,
-0.003131866455078125,
-0.07904052734375,
-0.046051025390625,
-0.00920867919921875,
-0.021728515625,
0.0034503936767578125,
-0.016448974609375,
0.0003635883331298828,
0.044830322265625,
0.031646728515625,
-0.07720947265625,
0.005916595458984375,
-0.031646728515625,
-0.0171051025390625,
0.0300750732421875,
0.00988006591796875,
-0.00959014892578125,
-0.01306915283203125,
-0.041351318359375,
-0.0167083740234375,
-0.051025390625,
0.0006413459777832031,
0.0382080078125,
0.00513458251953125,
-0.031829833984375,
0.0277252197265625,
-0.00818634033203125,
0.053192138671875,
0.01374053955078125,
-0.00543975830078125,
0.033966064453125,
-0.03851318359375,
-0.01947021484375,
-0.03662109375,
0.07940673828125,
-0.00597381591796875,
0.02313232421875,
-0.0224151611328125,
-0.029022216796875,
-0.0069580078125,
0.033599853515625,
-0.07379150390625,
-0.0194549560546875,
0.0013599395751953125,
-0.0263214111328125,
-0.006282806396484375,
-0.004360198974609375,
-0.05548095703125,
0.00847625732421875,
-0.0197296142578125,
0.0435791015625,
-0.03936767578125,
-0.0241851806640625,
0.01715087890625,
0.013641357421875,
0.0626220703125,
-0.00579833984375,
-0.08306884765625,
0.03289794921875,
0.064697265625,
0.052886962890625,
-0.012420654296875,
-0.035400390625,
-0.01081085205078125,
-0.022735595703125,
-0.004299163818359375,
0.050018310546875,
-0.02130126953125,
-0.0186920166015625,
-0.012786865234375,
0.0109405517578125,
-0.015350341796875,
-0.013641357421875,
0.0430908203125,
-0.045501708984375,
0.049041748046875,
0.01373291015625,
-0.0259552001953125,
-0.00055694580078125,
0.0073394775390625,
-0.019989013671875,
0.08154296875,
0.0308837890625,
-0.054046630859375,
0.04083251953125,
-0.036834716796875,
-0.021728515625,
0.01262664794921875,
-0.0010309219360351562,
-0.053070068359375,
0.0004553794860839844,
0.00563812255859375,
0.027923583984375,
-0.030670166015625,
0.0333251953125,
0.006137847900390625,
-0.0280609130859375,
-0.01512908935546875,
-0.0362548828125,
0.05535888671875,
0.024444580078125,
-0.048004150390625,
0.016021728515625,
-0.05682373046875,
0.005535125732421875,
0.024200439453125,
-0.0212249755859375,
0.0125732421875,
-0.0167236328125,
0.0161895751953125,
0.0283660888671875,
0.01152801513671875,
-0.032989501953125,
0.0236358642578125,
-0.03326416015625,
0.03424072265625,
0.053802490234375,
-0.0224456787109375,
0.0394287109375,
-0.007720947265625,
0.053680419921875,
-0.010009765625,
0.040130615234375,
-0.0199127197265625,
-0.038665771484375,
-0.059967041015625,
-0.000217437744140625,
0.035400390625,
0.040496826171875,
-0.052093505859375,
0.045135498046875,
-0.0386962890625,
-0.048797607421875,
-0.022064208984375,
-0.0235137939453125,
0.0419921875,
0.02801513671875,
0.03692626953125,
-0.027862548828125,
-0.043212890625,
-0.0672607421875,
0.0015821456909179688,
-0.01016998291015625,
-0.0123748779296875,
0.0020389556884765625,
0.047332763671875,
-0.01232147216796875,
0.0631103515625,
-0.0285491943359375,
0.004711151123046875,
-0.032958984375,
0.0104217529296875,
0.032623291015625,
0.045074462890625,
0.044525146484375,
-0.040802001953125,
-0.045257568359375,
-0.01041412353515625,
-0.0277099609375,
0.00750732421875,
-0.001422882080078125,
-0.0016613006591796875,
0.037506103515625,
0.00363922119140625,
-0.06341552734375,
0.035888671875,
0.042572021484375,
-0.038421630859375,
0.07177734375,
-0.00921630859375,
-0.0058441162109375,
-0.091064453125,
0.0186004638671875,
-0.0104217529296875,
-0.01219940185546875,
-0.0618896484375,
-0.005413055419921875,
-0.003082275390625,
-0.01470184326171875,
-0.041656494140625,
0.042755126953125,
-0.04144287109375,
-0.016693115234375,
-0.00922393798828125,
-0.0089874267578125,
-0.0190887451171875,
0.05181884765625,
0.01202392578125,
0.0706787109375,
0.04461669921875,
-0.0352783203125,
-0.001560211181640625,
0.02734375,
-0.036163330078125,
0.0142364501953125,
-0.0733642578125,
0.033935546875,
0.01007843017578125,
0.0217742919921875,
-0.07086181640625,
0.0098419189453125,
0.0289459228515625,
-0.043304443359375,
0.04083251953125,
-0.026153564453125,
-0.04632568359375,
-0.040985107421875,
0.005420684814453125,
0.05010986328125,
0.06781005859375,
-0.0361328125,
0.045806884765625,
0.00909423828125,
-0.0019397735595703125,
-0.05242919921875,
-0.0242919921875,
-0.008575439453125,
-0.0248260498046875,
-0.060699462890625,
0.0233001708984375,
0.006832122802734375,
0.009124755859375,
-0.018218994140625,
-0.00264739990234375,
-0.0028972625732421875,
-0.0008349418640136719,
0.006565093994140625,
0.02703857421875,
-0.0311737060546875,
0.0030078887939453125,
-0.0289306640625,
-0.0411376953125,
-0.00811767578125,
-0.030792236328125,
0.08489990234375,
-0.0286865234375,
-0.01007843017578125,
-0.03790283203125,
-0.005035400390625,
0.0208282470703125,
-0.0272064208984375,
0.056610107421875,
0.06903076171875,
-0.0217437744140625,
-0.005153656005859375,
-0.045623779296875,
-0.0221099853515625,
-0.035736083984375,
0.041290283203125,
-0.03509521484375,
-0.06097412109375,
0.041534423828125,
0.0179595947265625,
-0.0076141357421875,
0.055267333984375,
0.0462646484375,
0.02960205078125,
0.046905517578125,
0.042694091796875,
-0.0036563873291015625,
0.031036376953125,
-0.04156494140625,
0.023223876953125,
-0.06719970703125,
-0.01136016845703125,
-0.038818359375,
-0.0059051513671875,
-0.040283203125,
-0.032073974609375,
0.0022125244140625,
0.004047393798828125,
-0.04547119140625,
0.052581787109375,
-0.0286865234375,
0.0288543701171875,
0.022308349609375,
-0.017059326171875,
0.0017299652099609375,
0.00014960765838623047,
-0.0103912353515625,
-0.0158538818359375,
-0.05987548828125,
-0.049560546875,
0.09381103515625,
0.039031982421875,
0.05389404296875,
-0.006366729736328125,
0.0328369140625,
-0.0145721435546875,
0.021209716796875,
-0.05718994140625,
0.038116455078125,
-0.006061553955078125,
-0.050567626953125,
-0.0240325927734375,
-0.022186279296875,
-0.0723876953125,
0.0137939453125,
-0.010223388671875,
-0.05780029296875,
-0.006221771240234375,
0.01371002197265625,
-0.0215911865234375,
0.0247650146484375,
-0.04473876953125,
0.084228515625,
-0.0155181884765625,
-0.0202789306640625,
-0.01558685302734375,
-0.062469482421875,
0.0231170654296875,
-0.0091094970703125,
-0.018341064453125,
0.0176849365234375,
0.025390625,
0.06005859375,
-0.02679443359375,
0.0316162109375,
-0.020263671875,
0.00220489501953125,
0.00594329833984375,
-0.0130767822265625,
0.041046142578125,
0.0037059783935546875,
0.004657745361328125,
0.028594970703125,
-0.0155181884765625,
-0.02001953125,
-0.0252838134765625,
0.028228759765625,
-0.06658935546875,
-0.03533935546875,
-0.046600341796875,
-0.024383544921875,
-0.01448822021484375,
0.0287322998046875,
0.0455322265625,
0.0496826171875,
-0.01374053955078125,
0.0199127197265625,
0.035614013671875,
-0.006031036376953125,
0.04962158203125,
0.0465087890625,
-0.016326904296875,
-0.024200439453125,
0.036346435546875,
0.00022304058074951172,
0.027252197265625,
0.0202789306640625,
0.00963592529296875,
-0.0312042236328125,
-0.05419921875,
-0.0394287109375,
0.04376220703125,
-0.0352783203125,
-0.006801605224609375,
-0.04791259765625,
-0.02935791015625,
-0.042266845703125,
0.0233306884765625,
-0.048492431640625,
-0.022918701171875,
-0.0203399658203125,
-0.009429931640625,
0.004077911376953125,
0.06365966796875,
0.00901031494140625,
0.045318603515625,
-0.05450439453125,
0.01303863525390625,
0.0101470947265625,
0.0435791015625,
0.0013971328735351562,
-0.08819580078125,
-0.025909423828125,
-0.01108551025390625,
-0.03289794921875,
-0.043243408203125,
0.022125244140625,
0.009490966796875,
0.053558349609375,
0.0231170654296875,
-0.0219268798828125,
0.038604736328125,
-0.071533203125,
0.07373046875,
-0.0027713775634765625,
-0.068603515625,
0.0171966552734375,
-0.01971435546875,
0.027923583984375,
0.0242462158203125,
0.039703369140625,
-0.0296478271484375,
-0.0038127899169921875,
-0.0631103515625,
-0.06390380859375,
0.066162109375,
0.0194091796875,
0.007350921630859375,
0.019134521484375,
0.032379150390625,
0.018707275390625,
-0.005290985107421875,
-0.06488037109375,
-0.0260772705078125,
-0.0295257568359375,
0.008758544921875,
-0.026092529296875,
-0.044342041015625,
0.00250244140625,
-0.031036376953125,
0.06866455078125,
0.00362396240234375,
0.044708251953125,
-0.005863189697265625,
-0.01454925537109375,
0.01560211181640625,
0.028045654296875,
0.07928466796875,
0.06695556640625,
-0.01422882080078125,
-0.008453369140625,
0.019683837890625,
-0.050262451171875,
0.01165008544921875,
0.0243072509765625,
0.0010404586791992188,
-0.00016200542449951172,
0.02935791015625,
0.10009765625,
-0.0022029876708984375,
-0.011566162109375,
0.05157470703125,
-0.01898193359375,
-0.0214691162109375,
-0.039398193359375,
-0.020355224609375,
-0.00461578369140625,
0.00804901123046875,
0.036102294921875,
0.00492095947265625,
-0.01082611083984375,
-0.004543304443359375,
0.03131103515625,
0.0196075439453125,
-0.0292510986328125,
-0.06475830078125,
0.047515869140625,
-0.0031261444091796875,
-0.0265350341796875,
0.058990478515625,
-0.0251922607421875,
-0.05279541015625,
0.0186309814453125,
0.053558349609375,
0.075439453125,
-0.0347900390625,
0.01271820068359375,
0.0458984375,
0.02691650390625,
-0.01605224609375,
-0.002719879150390625,
0.00363922119140625,
-0.0538330078125,
-0.043853759765625,
-0.06182861328125,
0.001972198486328125,
0.036865234375,
-0.04833984375,
0.0360107421875,
-0.01085662841796875,
-0.0194854736328125,
-0.0229949951171875,
-0.00872802734375,
-0.05767822265625,
0.020416259765625,
0.0029430389404296875,
0.07403564453125,
-0.06597900390625,
0.0679931640625,
0.040374755859375,
-0.0069580078125,
-0.07391357421875,
0.00897216796875,
-0.020111083984375,
-0.0654296875,
0.059722900390625,
0.0140533447265625,
0.00804901123046875,
0.0178680419921875,
-0.0201568603515625,
-0.056976318359375,
0.0760498046875,
0.017669677734375,
-0.035980224609375,
-0.0145263671875,
0.0235443115234375,
0.062469482421875,
-0.01462554931640625,
0.031951904296875,
0.054443359375,
0.032318115234375,
0.00559234619140625,
-0.08795166015625,
-0.00511932373046875,
-0.033843994140625,
0.009857177734375,
0.020965576171875,
-0.042449951171875,
0.06494140625,
-0.016448974609375,
-0.0288543701171875,
0.011749267578125,
0.04071044921875,
-0.005340576171875,
-0.0140533447265625,
0.0384521484375,
0.0733642578125,
0.0173797607421875,
-0.034210205078125,
0.0926513671875,
-0.04022216796875,
0.051055908203125,
0.0821533203125,
0.0133209228515625,
0.054901123046875,
0.0280609130859375,
-0.045013427734375,
0.0214996337890625,
0.046722412109375,
-0.00970458984375,
0.0498046875,
0.01861572265625,
-0.0054168701171875,
-0.0151824951171875,
0.0234527587890625,
-0.05316162109375,
0.0340576171875,
0.00887298583984375,
-0.0114288330078125,
-0.0141143798828125,
-0.00550079345703125,
0.0166168212890625,
-0.0297393798828125,
0.0118865966796875,
0.050140380859375,
0.00489044189453125,
-0.0543212890625,
0.05804443359375,
-0.006866455078125,
0.04681396484375,
-0.06121826171875,
0.017669677734375,
-0.0147247314453125,
0.019073486328125,
-0.00441741943359375,
-0.04302978515625,
0.018646240234375,
-0.004184722900390625,
-0.01204681396484375,
-0.02239990234375,
0.036895751953125,
-0.028045654296875,
-0.02191162109375,
0.01061248779296875,
0.016571044921875,
0.00443267822265625,
0.0052490234375,
-0.049896240234375,
-0.005596160888671875,
-0.00507354736328125,
-0.044586181640625,
0.01690673828125,
0.0134124755859375,
0.007167816162109375,
0.040008544921875,
0.03436279296875,
0.0013647079467773438,
0.01323699951171875,
0.00616455078125,
0.059234619140625,
-0.0235443115234375,
-0.032440185546875,
-0.0703125,
0.0543212890625,
0.0235137939453125,
-0.048980712890625,
0.05572509765625,
0.0545654296875,
0.0693359375,
-0.0302886962890625,
0.041595458984375,
-0.01251220703125,
0.0191497802734375,
-0.037139892578125,
0.05169677734375,
-0.02178955078125,
0.0079345703125,
-0.00714874267578125,
-0.09368896484375,
-0.0156402587890625,
0.045379638671875,
-0.036956787109375,
0.020965576171875,
0.06866455078125,
0.052703857421875,
-0.00553131103515625,
-0.01049041748046875,
0.00904083251953125,
0.021484375,
0.03314208984375,
0.045867919921875,
0.0611572265625,
-0.055267333984375,
0.035400390625,
-0.021759033203125,
-0.013427734375,
-0.00007897615432739258,
-0.061248779296875,
-0.0596923828125,
-0.0533447265625,
-0.00849151611328125,
-0.02508544921875,
0.00244903564453125,
0.0584716796875,
0.05267333984375,
-0.0587158203125,
-0.0161895751953125,
-0.006710052490234375,
-0.0262603759765625,
-0.00017964839935302734,
-0.01568603515625,
0.036834716796875,
-0.02264404296875,
-0.0440673828125,
0.010589599609375,
0.001903533935546875,
0.021392822265625,
-0.0142364501953125,
-0.02215576171875,
-0.01549530029296875,
-0.0255279541015625,
0.023040771484375,
0.00818634033203125,
-0.044708251953125,
-0.0323486328125,
0.003993988037109375,
-0.006237030029296875,
0.0247650146484375,
0.054046630859375,
-0.042755126953125,
0.02490234375,
0.045867919921875,
0.03582763671875,
0.05670166015625,
0.008514404296875,
0.048736572265625,
-0.042449951171875,
0.02880859375,
0.01537322998046875,
0.0305023193359375,
0.02740478515625,
-0.005672454833984375,
0.041595458984375,
0.02703857421875,
-0.06591796875,
-0.05450439453125,
0.020294189453125,
-0.071044921875,
0.0192718505859375,
0.1026611328125,
-0.02728271484375,
0.0067901611328125,
-0.0007200241088867188,
-0.0034427642822265625,
0.0254058837890625,
-0.00897979736328125,
0.04296875,
0.047271728515625,
0.0173187255859375,
-0.0032291412353515625,
-0.03558349609375,
0.057891845703125,
0.02593994140625,
-0.047760009765625,
-0.0081787109375,
0.004512786865234375,
0.036285400390625,
0.019134521484375,
0.049041748046875,
-0.01389312744140625,
0.019683837890625,
0.0019054412841796875,
0.01474761962890625,
-0.0130767822265625,
-0.0267791748046875,
-0.038970947265625,
-0.01386260986328125,
0.002391815185546875,
-0.0006475448608398438
]
] |
stanfordnlp/stanza-nl | 2023-10-02T23:43:10.000Z | [
"stanza",
"token-classification",
"nl",
"license:apache-2.0",
"region:us"
] | token-classification | stanfordnlp | null | null | stanfordnlp/stanza-nl | 2 | 16,550 | stanza | 2022-03-02T23:29:05 | ---
tags:
- stanza
- token-classification
library_name: stanza
language: nl
license: apache-2.0
---
# Stanza model for Dutch (nl)
Stanza is a collection of accurate and efficient tools for the linguistic analysis of many human languages. Starting from raw text to syntactic analysis and entity recognition, Stanza brings state-of-the-art NLP models to languages of your choosing.
Find more about it in [our website](https://stanfordnlp.github.io/stanza) and our [GitHub repository](https://github.com/stanfordnlp/stanza).
This card and repo were automatically prepared with `hugging_stanza.py` in the `stanfordnlp/huggingface-models` repo
Last updated 2023-10-02 23:42:52.918
| 678 | [
[
-0.0350341796875,
-0.05499267578125,
0.007740020751953125,
0.048065185546875,
-0.025909423828125,
-0.01611328125,
-0.0163726806640625,
-0.047088623046875,
0.0259246826171875,
0.04254150390625,
-0.036895751953125,
-0.043426513671875,
-0.03631591796875,
0.00939178466796875,
-0.0287017822265625,
0.07012939453125,
0.0007405281066894531,
0.00591278076171875,
0.0161285400390625,
-0.0141143798828125,
-0.0187530517578125,
-0.0189666748046875,
-0.055908203125,
-0.05181884765625,
0.057830810546875,
0.00899505615234375,
0.029998779296875,
0.005352020263671875,
0.041961669921875,
0.0228729248046875,
-0.033935546875,
-0.033721923828125,
0.006473541259765625,
0.033905029296875,
-0.0024967193603515625,
-0.035400390625,
-0.0284881591796875,
0.00383758544921875,
0.047149658203125,
0.049407958984375,
-0.0231475830078125,
0.0279083251953125,
0.00628662109375,
0.0638427734375,
-0.0294952392578125,
0.0189056396484375,
-0.035614013671875,
0.01219940185546875,
-0.0274505615234375,
-0.0080108642578125,
-0.02752685546875,
-0.042144775390625,
0.03240966796875,
-0.042083740234375,
0.00272369384765625,
0.0102691650390625,
0.0843505859375,
0.01476287841796875,
-0.017730712890625,
-0.0213775634765625,
-0.01522064208984375,
0.0382080078125,
-0.04718017578125,
0.00809478759765625,
0.0274505615234375,
0.055267333984375,
-0.023834228515625,
-0.061126708984375,
-0.04461669921875,
-0.01358795166015625,
-0.005374908447265625,
0.01788330078125,
-0.0217437744140625,
0.02362060546875,
0.0197601318359375,
0.03155517578125,
-0.047943115234375,
0.00438690185546875,
-0.0238037109375,
-0.005451202392578125,
0.03973388671875,
0.004558563232421875,
0.0535888671875,
-0.060760498046875,
-0.0303497314453125,
-0.01328277587890625,
-0.020172119140625,
0.0114593505859375,
0.034149169921875,
0.016937255859375,
-0.02239990234375,
0.05810546875,
0.0007390975952148438,
0.072998046875,
-0.0263671875,
0.019500732421875,
0.00457763671875,
0.00765228271484375,
-0.017242431640625,
-0.0017175674438476562,
0.0689697265625,
0.0226593017578125,
0.03289794921875,
0.003208160400390625,
-0.017059326171875,
0.015777587890625,
0.0193634033203125,
-0.0254974365234375,
-0.049774169921875,
0.01995849609375,
-0.042633056640625,
-0.032073974609375,
-0.0067138671875,
-0.036712646484375,
-0.00559234619140625,
-0.032012939453125,
0.0111236572265625,
-0.055694580078125,
-0.0236053466796875,
0.018524169921875,
-0.055084228515625,
0.0275726318359375,
0.03173828125,
-0.050872802734375,
0.0068359375,
0.07049560546875,
0.08160400390625,
0.008392333984375,
-0.033660888671875,
0.002132415771484375,
-0.02349853515625,
-0.026214599609375,
0.06353759765625,
-0.0270538330078125,
0.00432586669921875,
-0.01016998291015625,
-0.007049560546875,
-0.00624847412109375,
-0.0106353759765625,
0.05975341796875,
-0.036102294921875,
0.0281982421875,
-0.018890380859375,
-0.04949951171875,
-0.036407470703125,
0.02191162109375,
-0.07049560546875,
0.09527587890625,
0.020416259765625,
-0.08721923828125,
0.0186004638671875,
-0.06317138671875,
0.00727081298828125,
0.0204620361328125,
0.016937255859375,
-0.01264190673828125,
0.0204925537109375,
0.0011539459228515625,
0.04437255859375,
-0.0216522216796875,
0.025360107421875,
-0.031768798828125,
-0.00910186767578125,
-0.0044708251953125,
-0.00405120849609375,
0.11358642578125,
0.031341552734375,
0.017242431640625,
0.0303955078125,
-0.061309814453125,
-0.009979248046875,
0.009857177734375,
-0.018829345703125,
-0.056884765625,
-0.011138916015625,
0.050201416015625,
0.0118255615234375,
0.0017976760864257812,
-0.049163818359375,
0.034576416015625,
-0.045318603515625,
0.04315185546875,
0.057403564453125,
-0.01125335693359375,
0.0213775634765625,
-0.0211944580078125,
0.055145263671875,
-0.0312347412109375,
0.005336761474609375,
-0.018463134765625,
-0.061492919921875,
-0.044647216796875,
-0.0183563232421875,
0.048004150390625,
0.039459228515625,
-0.06402587890625,
0.062469482421875,
-0.009796142578125,
-0.06005859375,
-0.033294677734375,
-0.0289764404296875,
0.01094818115234375,
0.03387451171875,
0.0021762847900390625,
-0.017486572265625,
-0.05291748046875,
-0.047149658203125,
-0.0204010009765625,
-0.03765869140625,
-0.0140838623046875,
-0.0048828125,
0.047210693359375,
-0.0103912353515625,
0.0787353515625,
-0.0227203369140625,
-0.0018320083618164062,
-0.0122222900390625,
0.026275634765625,
0.0163421630859375,
0.035552978515625,
0.057952880859375,
-0.045623779296875,
-0.0416259765625,
0.0031642913818359375,
-0.058807373046875,
-0.024017333984375,
0.004863739013671875,
-0.023223876953125,
0.0123291015625,
0.0011377334594726562,
-0.02825927734375,
0.005908966064453125,
0.06817626953125,
-0.042572021484375,
0.06146240234375,
0.0022792816162109375,
-0.0154571533203125,
-0.09478759765625,
0.023223876953125,
0.01465606689453125,
-0.026702880859375,
-0.02545166015625,
0.036468505859375,
0.00818634033203125,
-0.0177001953125,
-0.0309600830078125,
0.054534912109375,
-0.02301025390625,
-0.00328826904296875,
-0.00640869140625,
-0.00948333740234375,
-0.001926422119140625,
0.02191162109375,
0.005756378173828125,
0.04449462890625,
0.0540771484375,
-0.0209808349609375,
0.03753662109375,
0.035491943359375,
-0.031036376953125,
0.041290283203125,
-0.05914306640625,
-0.0064239501953125,
0.0018749237060546875,
0.0294342041015625,
-0.0577392578125,
-0.0352783203125,
0.0176239013671875,
-0.039794921875,
0.01541900634765625,
-0.0223236083984375,
-0.0242156982421875,
-0.0400390625,
-0.034271240234375,
0.0184173583984375,
0.039459228515625,
-0.049560546875,
0.050201416015625,
0.036590576171875,
-0.0131072998046875,
-0.045623779296875,
-0.043182373046875,
0.0252838134765625,
-0.0230255126953125,
-0.056976318359375,
0.0035858154296875,
0.0111541748046875,
-0.0187530517578125,
0.00566864013671875,
0.0156707763671875,
-0.01242828369140625,
0.01267242431640625,
0.01142120361328125,
0.01561737060546875,
-0.0120086669921875,
0.0008988380432128906,
0.0135650634765625,
-0.03192138671875,
-0.0016660690307617188,
-0.0273895263671875,
0.05255126953125,
-0.0175323486328125,
0.0019664764404296875,
-0.0634765625,
0.0230560302734375,
0.0311279296875,
0.004825592041015625,
0.0282745361328125,
0.027557373046875,
-0.02447509765625,
-0.02825927734375,
-0.0262603759765625,
0.006664276123046875,
-0.031646728515625,
-0.00936126708984375,
-0.0467529296875,
-0.061676025390625,
0.0560302734375,
-0.0056304931640625,
0.00733184814453125,
0.052459716796875,
0.0266265869140625,
-0.0279083251953125,
0.03424072265625,
0.058624267578125,
-0.01117706298828125,
0.0423583984375,
-0.014892578125,
-0.0189361572265625,
-0.0830078125,
0.00566864013671875,
-0.061920166015625,
-0.01384735107421875,
-0.051849365234375,
-0.0125274658203125,
0.007190704345703125,
0.0450439453125,
-0.0166168212890625,
0.06732177734375,
-0.06365966796875,
0.0299530029296875,
0.047943115234375,
-0.0283355712890625,
0.004276275634765625,
-0.01552581787109375,
-0.0276947021484375,
-0.0029888153076171875,
-0.05126953125,
-0.0599365234375,
0.05877685546875,
0.0285186767578125,
0.0484619140625,
-0.009735107421875,
0.05645751953125,
0.01104736328125,
0.009857177734375,
-0.07049560546875,
0.040191650390625,
-0.01024627685546875,
-0.037200927734375,
-0.0185394287109375,
-0.0269012451171875,
-0.0889892578125,
0.023223876953125,
-0.005825042724609375,
-0.056121826171875,
-0.00647735595703125,
0.01471710205078125,
0.0017156600952148438,
0.0223236083984375,
-0.04156494140625,
0.06842041015625,
0.01031494140625,
0.03753662109375,
0.00739288330078125,
-0.0521240234375,
0.04119873046875,
0.00713348388671875,
-0.020294189453125,
-0.024932861328125,
0.005924224853515625,
0.0601806640625,
-0.0277862548828125,
0.05047607421875,
-0.0077056884765625,
-0.0122222900390625,
0.01849365234375,
0.01374053955078125,
0.01947021484375,
-0.0178070068359375,
-0.0213623046875,
0.053131103515625,
0.017181396484375,
-0.03167724609375,
-0.036041259765625,
0.04022216796875,
-0.03948974609375,
0.0083770751953125,
-0.047454833984375,
-0.0209808349609375,
0.0208740234375,
0.0091400146484375,
0.0199432373046875,
0.0074920654296875,
-0.017059326171875,
0.0015993118286132812,
0.0073394775390625,
-0.03399658203125,
0.040557861328125,
0.0165863037109375,
-0.031524658203125,
-0.035400390625,
0.042999267578125,
0.0174560546875,
-0.0002505779266357422,
0.01087188720703125,
0.037841796875,
-0.017303466796875,
-0.0233001708984375,
-0.04498291015625,
0.00814056396484375,
-0.0506591796875,
-0.003322601318359375,
-0.048797607421875,
-0.00788116455078125,
-0.015045166015625,
-0.0181884765625,
-0.01374053955078125,
-0.06591796875,
0.0007781982421875,
0.003154754638671875,
0.040283203125,
0.050445556640625,
-0.00030541419982910156,
0.0296173095703125,
-0.05938720703125,
0.0248870849609375,
0.00513458251953125,
0.030487060546875,
0.0037631988525390625,
-0.010162353515625,
0.0160675048828125,
-0.034149169921875,
-0.021270751953125,
-0.060211181640625,
0.0287628173828125,
0.01134490966796875,
0.0178375244140625,
0.019378662109375,
0.0095977783203125,
0.043487548828125,
-0.04949951171875,
0.07427978515625,
0.039337158203125,
-0.08013916015625,
0.061492919921875,
-0.033782958984375,
0.00878143310546875,
0.0238037109375,
0.0219573974609375,
-0.0809326171875,
-0.04315185546875,
-0.045684814453125,
-0.0748291015625,
0.05224609375,
0.0220947265625,
0.004669189453125,
-0.0035552978515625,
0.04150390625,
0.01338958740234375,
-0.0036754608154296875,
-0.0560302734375,
-0.01561737060546875,
-0.012420654296875,
-0.0297393798828125,
0.007312774658203125,
-0.0180511474609375,
-0.00396728515625,
-0.02471923828125,
0.047332763671875,
-0.00821685791015625,
0.0125885009765625,
0.0033550262451171875,
0.008453369140625,
-0.00394439697265625,
-0.0034942626953125,
0.0257415771484375,
0.031768798828125,
-0.0068511962890625,
-0.0011882781982421875,
-0.002681732177734375,
-0.0288238525390625,
-0.0018949508666992188,
0.000949859619140625,
-0.0257110595703125,
0.0155487060546875,
0.04949951171875,
0.06182861328125,
0.00420379638671875,
-0.0267791748046875,
0.0292510986328125,
-0.00948333740234375,
0.0023784637451171875,
-0.0460205078125,
0.01513671875,
0.00885009765625,
0.0117950439453125,
0.00713348388671875,
0.01108551025390625,
0.021087646484375,
-0.01219940185546875,
0.024139404296875,
0.02740478515625,
-0.030914306640625,
-0.03216552734375,
0.048370361328125,
0.01556396484375,
-0.047149658203125,
0.043212890625,
0.003635406494140625,
-0.049346923828125,
0.034637451171875,
0.0232696533203125,
0.09033203125,
-0.00042128562927246094,
0.019683837890625,
0.041290283203125,
0.01374053955078125,
-0.01084136962890625,
0.0306549072265625,
-0.0252685546875,
-0.040069580078125,
-0.0518798828125,
-0.0810546875,
-0.032257080078125,
-0.020660400390625,
-0.06365966796875,
0.033660888671875,
-0.0160675048828125,
-0.01508331298828125,
0.0169677734375,
-0.00673675537109375,
-0.0172882080078125,
0.0018281936645507812,
-0.0005102157592773438,
0.098876953125,
-0.064453125,
0.080078125,
0.04559326171875,
-0.035858154296875,
-0.06207275390625,
0.0014257431030273438,
0.00033783912658691406,
-0.048614501953125,
0.05426025390625,
0.0213775634765625,
0.004497528076171875,
0.00232696533203125,
-0.0460205078125,
-0.040802001953125,
0.0538330078125,
0.0305328369140625,
-0.038787841796875,
-0.00795745849609375,
-0.0255279541015625,
0.04559326171875,
0.0162200927734375,
0.0216522216796875,
0.02410888671875,
0.040191650390625,
0.00745391845703125,
-0.07763671875,
-0.01561737060546875,
-0.036468505859375,
0.026275634765625,
0.03509521484375,
-0.0217742919921875,
0.06536865234375,
0.030670166015625,
-0.0178985595703125,
0.0212249755859375,
0.0455322265625,
0.02288818359375,
0.00811767578125,
0.033172607421875,
0.06365966796875,
0.04913330078125,
-0.0288238525390625,
0.1007080078125,
-0.023651123046875,
0.05645751953125,
0.06378173828125,
-0.003757476806640625,
0.06719970703125,
0.032623291015625,
-0.01708984375,
0.046173095703125,
0.06292724609375,
0.0021209716796875,
-0.0008454322814941406,
0.026580810546875,
0.0023632049560546875,
-0.00829315185546875,
0.016998291015625,
-0.015777587890625,
0.03631591796875,
0.0033435821533203125,
0.0006680488586425781,
-0.0280609130859375,
0.00673675537109375,
-0.00872802734375,
0.0014333724975585938,
-0.006195068359375,
0.0634765625,
0.0161285400390625,
-0.04327392578125,
0.0513916015625,
0.003063201904296875,
0.06744384765625,
-0.057037353515625,
-0.01462554931640625,
0.0217132568359375,
-0.01161956787109375,
-0.003726959228515625,
-0.076171875,
0.04913330078125,
0.0026760101318359375,
0.004108428955078125,
-0.0162353515625,
0.049835205078125,
-0.01267242431640625,
-0.041259765625,
0.047210693359375,
0.0009851455688476562,
0.0284423828125,
-0.005008697509765625,
-0.0892333984375,
-0.0016422271728515625,
-0.0228118896484375,
-0.051025390625,
-0.004772186279296875,
0.03973388671875,
-0.01271820068359375,
0.05999755859375,
0.05126953125,
0.027557373046875,
0.005645751953125,
0.018585205078125,
0.041168212890625,
-0.060638427734375,
-0.04937744140625,
-0.05181884765625,
0.04888916015625,
-0.01413726806640625,
-0.037353515625,
0.0643310546875,
0.034881591796875,
0.057647705078125,
-0.0021457672119140625,
0.042144775390625,
-0.0257110595703125,
0.0216827392578125,
-0.040771484375,
0.06982421875,
-0.04449462890625,
-0.00885009765625,
-0.039031982421875,
-0.07196044921875,
-0.05853271484375,
0.06915283203125,
0.0018720626831054688,
-0.0261077880859375,
0.0396728515625,
0.01708984375,
0.00461578369140625,
0.0201416015625,
0.016021728515625,
-0.0032482147216796875,
0.02374267578125,
0.0167999267578125,
0.0699462890625,
-0.0322265625,
-0.0088348388671875,
-0.021453857421875,
-0.01561737060546875,
-0.0014410018920898438,
-0.0728759765625,
-0.06201171875,
-0.056884765625,
-0.011810302734375,
-0.0284881591796875,
-0.01451873779296875,
0.08160400390625,
0.044708251953125,
-0.0595703125,
-0.024566650390625,
-0.037261962890625,
0.0025577545166015625,
-0.0179901123046875,
-0.01904296875,
0.04644775390625,
-0.046844482421875,
-0.0640869140625,
0.0034885406494140625,
0.030914306640625,
0.0034694671630859375,
-0.022491455078125,
-0.008819580078125,
-0.0164794921875,
0.00025963783264160156,
0.053131103515625,
0.0092315673828125,
-0.0732421875,
-0.021087646484375,
-0.0237884521484375,
-0.0099639892578125,
-0.0008077621459960938,
0.053558349609375,
-0.035614013671875,
0.0440673828125,
0.063720703125,
0.0305633544921875,
0.016998291015625,
-0.0036220550537109375,
0.041534423828125,
-0.046173095703125,
0.025360107421875,
0.0294036865234375,
0.04608154296875,
0.0162200927734375,
-0.0000959634780883789,
0.051788330078125,
0.016754150390625,
-0.0418701171875,
-0.060516357421875,
0.021270751953125,
-0.099609375,
-0.029693603515625,
0.059783935546875,
-0.025054931640625,
-0.041168212890625,
-0.00948333740234375,
-0.0379638671875,
0.0192718505859375,
-0.0193328857421875,
0.051177978515625,
0.0660400390625,
-0.00045108795166015625,
-0.00724029541015625,
-0.038116455078125,
0.0312347412109375,
0.0121612548828125,
-0.0399169921875,
0.00524139404296875,
0.0291900634765625,
0.0113525390625,
0.02264404296875,
0.03857421875,
0.00331878662109375,
0.01129150390625,
-0.0020542144775390625,
0.04376220703125,
0.01506805419921875,
-0.0276947021484375,
-0.045318603515625,
0.0189361572265625,
0.0006084442138671875,
-0.015655517578125
]
] |
timm/vit_tiny_patch16_224.augreg_in21k_ft_in1k | 2023-05-06T00:30:03.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-21k",
"arxiv:2106.10270",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_tiny_patch16_224.augreg_in21k_ft_in1k | 1 | 16,519 | timm | 2022-12-22T07:56:04 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
- imagenet-21k
---
# Model card for vit_tiny_patch16_224.augreg_in21k_ft_in1k
A Vision Transformer (ViT) image classification model. Trained on ImageNet-21k and fine-tuned on ImageNet-1k (with additional augmentation and regularization) in JAX by paper authors, ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 5.7
- GMACs: 1.1
- Activations (M): 4.1
- Image size: 224 x 224
- **Papers:**
- How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers: https://arxiv.org/abs/2106.10270
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:** ImageNet-21k
- **Original:** https://github.com/google-research/vision_transformer
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_tiny_patch16_224.augreg_in21k_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_tiny_patch16_224.augreg_in21k_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 197, 192) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{steiner2021augreg,
title={How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers},
author={Steiner, Andreas and Kolesnikov, Alexander and and Zhai, Xiaohua and Wightman, Ross and Uszkoreit, Jakob and Beyer, Lucas},
journal={arXiv preprint arXiv:2106.10270},
year={2021}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,903 | [
[
-0.0396728515625,
-0.02960205078125,
-0.001514434814453125,
0.00438690185546875,
-0.02801513671875,
-0.027069091796875,
-0.0228424072265625,
-0.035003662109375,
0.01495361328125,
0.0207061767578125,
-0.0416259765625,
-0.034515380859375,
-0.045928955078125,
0.0004508495330810547,
-0.0099945068359375,
0.07293701171875,
-0.00804901123046875,
0.003810882568359375,
-0.0159759521484375,
-0.03424072265625,
-0.023834228515625,
-0.0161285400390625,
-0.0472412109375,
-0.03143310546875,
0.0291595458984375,
0.016632080078125,
0.045166015625,
0.048095703125,
0.05938720703125,
0.033111572265625,
-0.0086669921875,
0.01107025146484375,
-0.0240325927734375,
-0.019866943359375,
0.0220184326171875,
-0.048065185546875,
-0.02972412109375,
0.0175018310546875,
0.055084228515625,
0.0296783447265625,
0.00949859619140625,
0.029449462890625,
0.011688232421875,
0.03546142578125,
-0.0242156982421875,
0.01450347900390625,
-0.04010009765625,
0.0196380615234375,
-0.00443267822265625,
-0.0008478164672851562,
-0.0232696533203125,
-0.02587890625,
0.0177154541015625,
-0.040069580078125,
0.0472412109375,
-0.0019931793212890625,
0.10198974609375,
0.0227508544921875,
0.00370025634765625,
0.01568603515625,
-0.030487060546875,
0.056671142578125,
-0.04559326171875,
0.0292510986328125,
0.0128326416015625,
0.01406097412109375,
0.006160736083984375,
-0.078369140625,
-0.051177978515625,
-0.01247406005859375,
-0.020782470703125,
0.0052337646484375,
-0.023773193359375,
0.01873779296875,
0.035247802734375,
0.0435791015625,
-0.038299560546875,
0.0007925033569335938,
-0.042144775390625,
-0.0212860107421875,
0.04217529296875,
0.0002474784851074219,
0.014801025390625,
-0.014892578125,
-0.043243408203125,
-0.04437255859375,
-0.0248565673828125,
0.0200653076171875,
0.0201873779296875,
0.0069122314453125,
-0.0384521484375,
0.04058837890625,
0.00447845458984375,
0.04962158203125,
0.0178375244140625,
-0.0167999267578125,
0.0472412109375,
-0.01393890380859375,
-0.028656005859375,
-0.0188751220703125,
0.08270263671875,
0.03277587890625,
0.0271759033203125,
-0.0015325546264648438,
-0.01345062255859375,
-0.00791168212890625,
0.003894805908203125,
-0.08251953125,
-0.0311126708984375,
0.005397796630859375,
-0.03485107421875,
-0.0310516357421875,
0.0255584716796875,
-0.049041748046875,
-0.0090789794921875,
-0.007503509521484375,
0.06048583984375,
-0.032623291015625,
-0.0127716064453125,
0.005977630615234375,
-0.010467529296875,
0.03369140625,
0.0196990966796875,
-0.045318603515625,
0.00824737548828125,
0.0162353515625,
0.0787353515625,
0.004627227783203125,
-0.03546142578125,
-0.0194549560546875,
-0.03387451171875,
-0.0243072509765625,
0.03851318359375,
-0.0015802383422851562,
-0.00901031494140625,
-0.0140838623046875,
0.02825927734375,
-0.019683837890625,
-0.042877197265625,
0.0245208740234375,
-0.01393890380859375,
0.02618408203125,
0.00994873046875,
-0.01357269287109375,
-0.02880859375,
0.020782470703125,
-0.0316162109375,
0.08966064453125,
0.026214599609375,
-0.068359375,
0.032318115234375,
-0.034912109375,
-0.005397796630859375,
-0.0098419189453125,
0.0008673667907714844,
-0.0826416015625,
0.005764007568359375,
0.02386474609375,
0.04388427734375,
-0.01555633544921875,
-0.005092620849609375,
-0.032012939453125,
-0.0255279541015625,
0.0263214111328125,
-0.020263671875,
0.0667724609375,
0.0022525787353515625,
-0.0222320556640625,
0.02191162109375,
-0.04473876953125,
0.00745391845703125,
0.03271484375,
-0.017974853515625,
-0.0006670951843261719,
-0.047821044921875,
0.01165771484375,
0.0176849365234375,
0.0162353515625,
-0.049468994140625,
0.031585693359375,
-0.0284881591796875,
0.032012939453125,
0.050079345703125,
-0.005939483642578125,
0.0309295654296875,
-0.0261077880859375,
0.024444580078125,
0.018341064453125,
0.0305023193359375,
-0.0089874267578125,
-0.04779052734375,
-0.07904052734375,
-0.035369873046875,
0.02960205078125,
0.031585693359375,
-0.052703857421875,
0.039886474609375,
-0.02960205078125,
-0.056060791015625,
-0.048065185546875,
0.004619598388671875,
0.033905029296875,
0.038299560546875,
0.037261962890625,
-0.039459228515625,
-0.039794921875,
-0.0712890625,
-0.00890350341796875,
-0.0048980712890625,
-0.0026493072509765625,
0.0171661376953125,
0.04840087890625,
-0.0211944580078125,
0.0655517578125,
-0.034393310546875,
-0.0287933349609375,
-0.0164794921875,
0.00421905517578125,
0.02569580078125,
0.056243896484375,
0.0517578125,
-0.047119140625,
-0.034942626953125,
-0.01092529296875,
-0.0616455078125,
0.01175689697265625,
-0.00310516357421875,
-0.0118865966796875,
0.00901031494140625,
0.01425933837890625,
-0.05255126953125,
0.05755615234375,
0.01261138916015625,
-0.028533935546875,
0.02972412109375,
-0.0181121826171875,
0.003681182861328125,
-0.08660888671875,
-0.0007834434509277344,
0.02764892578125,
-0.02020263671875,
-0.036163330078125,
0.0009512901306152344,
0.0080413818359375,
-0.00304412841796875,
-0.032135009765625,
0.045654296875,
-0.036041259765625,
-0.00299835205078125,
-0.004299163818359375,
-0.022308349609375,
0.00397491455078125,
0.053680419921875,
-0.005634307861328125,
0.04022216796875,
0.052215576171875,
-0.035797119140625,
0.042144775390625,
0.041046142578125,
-0.017608642578125,
0.032989501953125,
-0.054168701171875,
0.0121917724609375,
-0.0013494491577148438,
0.01282501220703125,
-0.07598876953125,
-0.0160675048828125,
0.0286712646484375,
-0.054718017578125,
0.0487060546875,
-0.04022216796875,
-0.035980224609375,
-0.046966552734375,
-0.032623291015625,
0.03143310546875,
0.055877685546875,
-0.060150146484375,
0.0416259765625,
0.00669097900390625,
0.0232391357421875,
-0.038360595703125,
-0.07293701171875,
-0.018768310546875,
-0.028289794921875,
-0.054412841796875,
0.03289794921875,
0.00609588623046875,
0.01053619384765625,
0.005657196044921875,
-0.0064697265625,
0.00012350082397460938,
-0.016326904296875,
0.03228759765625,
0.030548095703125,
-0.0178070068359375,
-0.0032520294189453125,
-0.0247955322265625,
-0.0160675048828125,
-0.0023956298828125,
-0.0253143310546875,
0.03717041015625,
-0.0247955322265625,
-0.013885498046875,
-0.05621337890625,
-0.020263671875,
0.034332275390625,
-0.019256591796875,
0.055938720703125,
0.0882568359375,
-0.0355224609375,
0.0039520263671875,
-0.04095458984375,
-0.03265380859375,
-0.037200927734375,
0.032623291015625,
-0.0236663818359375,
-0.035980224609375,
0.05462646484375,
0.0132904052734375,
0.0048675537109375,
0.058746337890625,
0.031951904296875,
0.0006537437438964844,
0.063720703125,
0.052978515625,
0.0128173828125,
0.066650390625,
-0.07379150390625,
-0.00775146484375,
-0.06793212890625,
-0.025054931640625,
-0.0191802978515625,
-0.041748046875,
-0.054168701171875,
-0.038299560546875,
0.03485107421875,
0.008880615234375,
-0.02093505859375,
0.037750244140625,
-0.06658935546875,
0.013916015625,
0.053192138671875,
0.03863525390625,
-0.007518768310546875,
0.03277587890625,
-0.01495361328125,
-0.0083770751953125,
-0.062103271484375,
-0.0087738037109375,
0.081298828125,
0.03570556640625,
0.061187744140625,
-0.0196990966796875,
0.04901123046875,
-0.0175628662109375,
0.0242767333984375,
-0.059814453125,
0.041656494140625,
-0.0022945404052734375,
-0.0309600830078125,
-0.0095062255859375,
-0.0304412841796875,
-0.07757568359375,
0.016754150390625,
-0.0261383056640625,
-0.062103271484375,
0.0274200439453125,
0.01690673828125,
-0.0211181640625,
0.049102783203125,
-0.06622314453125,
0.07568359375,
-0.0025310516357421875,
-0.03729248046875,
0.0064239501953125,
-0.053192138671875,
0.0159149169921875,
0.0166015625,
-0.025726318359375,
0.011138916015625,
0.0172576904296875,
0.07647705078125,
-0.04693603515625,
0.06298828125,
-0.02978515625,
0.0278472900390625,
0.03729248046875,
-0.015380859375,
0.029876708984375,
0.004161834716796875,
0.01406097412109375,
0.0258636474609375,
0.0009746551513671875,
-0.0275421142578125,
-0.03546142578125,
0.03717041015625,
-0.07684326171875,
-0.02679443359375,
-0.03924560546875,
-0.041168212890625,
0.00824737548828125,
0.006328582763671875,
0.051788330078125,
0.045074462890625,
0.0212554931640625,
0.0293731689453125,
0.049591064453125,
-0.023590087890625,
0.0303955078125,
0.0014944076538085938,
-0.01328277587890625,
-0.0406494140625,
0.072998046875,
0.016876220703125,
0.01213836669921875,
0.01238250732421875,
0.0194854736328125,
-0.024139404296875,
-0.035003662109375,
-0.0271759033203125,
0.03179931640625,
-0.05145263671875,
-0.0374755859375,
-0.042510986328125,
-0.03765869140625,
-0.0249786376953125,
0.00164794921875,
-0.0335693359375,
-0.0266571044921875,
-0.028289794921875,
0.00986480712890625,
0.061065673828125,
0.03857421875,
-0.0117034912109375,
0.04144287109375,
-0.043914794921875,
0.0170745849609375,
0.0234832763671875,
0.038238525390625,
-0.0135345458984375,
-0.07659912109375,
-0.028839111328125,
0.0019292831420898438,
-0.03851318359375,
-0.05426025390625,
0.03350830078125,
0.0163726806640625,
0.030792236328125,
0.02972412109375,
-0.0184326171875,
0.0654296875,
-0.00585174560546875,
0.043701171875,
0.0247955322265625,
-0.039581298828125,
0.039093017578125,
-0.0068511962890625,
0.01107025146484375,
0.0154571533203125,
0.01007843017578125,
-0.020965576171875,
-0.0036830902099609375,
-0.08221435546875,
-0.05523681640625,
0.06201171875,
0.018798828125,
0.002796173095703125,
0.034820556640625,
0.043701171875,
-0.005023956298828125,
0.005157470703125,
-0.0657958984375,
-0.023193359375,
-0.0265045166015625,
-0.02508544921875,
-0.007244110107421875,
-0.002124786376953125,
-0.0020236968994140625,
-0.061920166015625,
0.048004150390625,
-0.00925445556640625,
0.06353759765625,
0.033111572265625,
-0.01465606689453125,
-0.01265716552734375,
-0.0283660888671875,
0.0268402099609375,
0.0172576904296875,
-0.0212860107421875,
0.002239227294921875,
0.0211944580078125,
-0.055267333984375,
-0.0024318695068359375,
0.02392578125,
-0.00916290283203125,
0.004974365234375,
0.03594970703125,
0.08087158203125,
-0.007610321044921875,
-0.000423431396484375,
0.0418701171875,
-0.009124755859375,
-0.0294647216796875,
-0.0193328857421875,
0.006450653076171875,
-0.0200042724609375,
0.02764892578125,
0.022216796875,
0.031768798828125,
-0.01154327392578125,
-0.01027679443359375,
0.01093292236328125,
0.039947509765625,
-0.0404052734375,
-0.0281829833984375,
0.05133056640625,
-0.0156097412109375,
-0.006565093994140625,
0.05889892578125,
-0.007266998291015625,
-0.040374755859375,
0.0654296875,
0.022674560546875,
0.0753173828125,
-0.0055694580078125,
-0.003238677978515625,
0.058624267578125,
0.0311431884765625,
-0.0020847320556640625,
0.011322021484375,
0.00878143310546875,
-0.0587158203125,
-0.00506591796875,
-0.0491943359375,
0.0025959014892578125,
0.023773193359375,
-0.03887939453125,
0.0305328369140625,
-0.04083251953125,
-0.02740478515625,
0.005924224853515625,
0.02099609375,
-0.07891845703125,
0.0214385986328125,
0.001819610595703125,
0.057098388671875,
-0.0606689453125,
0.05169677734375,
0.06488037109375,
-0.05133056640625,
-0.07415771484375,
-0.010711669921875,
-0.011627197265625,
-0.06646728515625,
0.03289794921875,
0.032318115234375,
0.0137786865234375,
0.019866943359375,
-0.060882568359375,
-0.045196533203125,
0.09716796875,
0.027557373046875,
-0.01102447509765625,
0.01004791259765625,
-0.0032367706298828125,
0.0283966064453125,
-0.0201873779296875,
0.033294677734375,
0.01477813720703125,
0.029144287109375,
0.01491546630859375,
-0.05389404296875,
0.00528717041015625,
-0.02288818359375,
0.01387786865234375,
0.01424407958984375,
-0.0625,
0.07220458984375,
-0.033721923828125,
-0.008941650390625,
0.0158233642578125,
0.047882080078125,
0.005794525146484375,
0.006591796875,
0.040985107421875,
0.0653076171875,
0.02978515625,
-0.034942626953125,
0.06927490234375,
-0.009368896484375,
0.05523681640625,
0.03887939453125,
0.039947509765625,
0.0328369140625,
0.0361328125,
-0.0263519287109375,
0.02215576171875,
0.07537841796875,
-0.042205810546875,
0.023193359375,
0.005657196044921875,
0.00458526611328125,
-0.0171661376953125,
0.0032863616943359375,
-0.035858154296875,
0.037933349609375,
0.0155792236328125,
-0.04022216796875,
-0.005199432373046875,
0.015411376953125,
-0.0128326416015625,
-0.029083251953125,
-0.016632080078125,
0.046966552734375,
0.00191497802734375,
-0.02972412109375,
0.061065673828125,
0.0006551742553710938,
0.06280517578125,
-0.0335693359375,
0.0005002021789550781,
-0.020263671875,
0.034088134765625,
-0.028472900390625,
-0.058074951171875,
0.0127716064453125,
-0.01727294921875,
-0.006343841552734375,
0.0035152435302734375,
0.0552978515625,
-0.030120849609375,
-0.04327392578125,
0.005584716796875,
0.0239410400390625,
0.021484375,
-0.0054779052734375,
-0.076416015625,
-0.0025691986083984375,
0.0011072158813476562,
-0.0433349609375,
0.0155792236328125,
0.031402587890625,
0.0033779144287109375,
0.050140380859375,
0.048828125,
-0.00469207763671875,
0.0186920166015625,
-0.01007843017578125,
0.06866455078125,
-0.03167724609375,
-0.031402587890625,
-0.059478759765625,
0.047027587890625,
-0.00530242919921875,
-0.045135498046875,
0.052032470703125,
0.044219970703125,
0.06982421875,
-0.011993408203125,
0.032623291015625,
-0.01392364501953125,
0.0026950836181640625,
-0.0266265869140625,
0.04608154296875,
-0.056060791015625,
-0.0062408447265625,
-0.0220489501953125,
-0.06512451171875,
-0.0280609130859375,
0.07135009765625,
-0.026092529296875,
0.032745361328125,
0.040130615234375,
0.073974609375,
-0.0264129638671875,
-0.0284423828125,
0.0135498046875,
0.0161590576171875,
0.0079345703125,
0.0297393798828125,
0.042877197265625,
-0.06829833984375,
0.039276123046875,
-0.0479736328125,
-0.011993408203125,
-0.0200958251953125,
-0.034515380859375,
-0.07623291015625,
-0.0587158203125,
-0.044677734375,
-0.051055908203125,
-0.019439697265625,
0.0648193359375,
0.07073974609375,
-0.043487548828125,
-0.0028438568115234375,
-0.01020050048828125,
0.003299713134765625,
-0.02337646484375,
-0.0184478759765625,
0.0390625,
-0.0097198486328125,
-0.05755615234375,
-0.02545166015625,
-0.000392913818359375,
0.038238525390625,
-0.0133514404296875,
-0.01251983642578125,
-0.0101776123046875,
-0.0244140625,
0.0183258056640625,
0.022796630859375,
-0.051177978515625,
-0.019287109375,
-0.00778961181640625,
-0.005489349365234375,
0.03631591796875,
0.0294189453125,
-0.0552978515625,
0.04156494140625,
0.0404052734375,
0.0250701904296875,
0.06201171875,
-0.0170745849609375,
0.00370025634765625,
-0.0677490234375,
0.04669189453125,
-0.0019779205322265625,
0.03875732421875,
0.0377197265625,
-0.0188751220703125,
0.045135498046875,
0.04437255859375,
-0.033905029296875,
-0.06524658203125,
-0.004650115966796875,
-0.0814208984375,
0.010498046875,
0.07086181640625,
-0.0170135498046875,
-0.0355224609375,
0.0301055908203125,
-0.0160980224609375,
0.0517578125,
-0.004016876220703125,
0.0355224609375,
0.0157623291015625,
0.0070343017578125,
-0.044952392578125,
-0.034698486328125,
0.039520263671875,
0.01102447509765625,
-0.039520263671875,
-0.0295867919921875,
0.003173828125,
0.04132080078125,
0.0282440185546875,
0.0264129638671875,
-0.00926971435546875,
0.0134124755859375,
0.007015228271484375,
0.03790283203125,
-0.0253143310546875,
-0.01297760009765625,
-0.03253173828125,
-0.01050567626953125,
-0.004467010498046875,
-0.04815673828125
]
] |
deepset/roberta-base-squad2-distilled | 2023-09-26T08:34:45.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"question-answering",
"exbert",
"en",
"dataset:squad_v2",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | question-answering | deepset | null | null | deepset/roberta-base-squad2-distilled | 5 | 16,500 | transformers | 2022-03-02T23:29:05 | ---
language: en
license: mit
tags:
- exbert
datasets:
- squad_v2
thumbnail: https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg
model-index:
- name: deepset/roberta-base-squad2-distilled
results:
- task:
type: question-answering
name: Question Answering
dataset:
name: squad_v2
type: squad_v2
config: squad_v2
split: validation
metrics:
- type: exact_match
value: 80.8593
name: Exact Match
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzVjNzkxNmNiNDkzNzdiYjJjZGM3ZTViMGJhOGM2ZjFmYjg1MjYxMDM2YzM5NWMwNDIyYzNlN2QwNGYyNDMzZSIsInZlcnNpb24iOjF9.Rgww8tf8D7nF2dh2U_DMrFzmp87k8s7RFibrDXSvQyA66PGWXwjlsd1552lzjHnNV5hvHUM1-h3PTuY_5p64BA
- type: f1
value: 84.0104
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTAyZDViNWYzNjA4OWQ5MzgyYmQ2ZDlhNWRhMTIzYTYxYzViMmI4NWE4ZGU5MzVhZTAwNTRlZmRlNWUwMjI0ZSIsInZlcnNpb24iOjF9.Er21BNgJ3jJXLuZtpubTYq9wCwO1i_VLQFwS5ET0e4eAYVVj0aOA40I5FvP5pZac3LjkCnVacxzsFWGCYVmnDA
- task:
type: question-answering
name: Question Answering
dataset:
name: squad
type: squad
config: plain_text
split: validation
metrics:
- type: exact_match
value: 86.225
name: Exact Match
- type: f1
value: 92.483
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: adversarial_qa
type: adversarial_qa
config: adversarialQA
split: validation
metrics:
- type: exact_match
value: 29.900
name: Exact Match
- type: f1
value: 41.183
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squad_adversarial
type: squad_adversarial
config: AddOneSent
split: validation
metrics:
- type: exact_match
value: 79.071
name: Exact Match
- type: f1
value: 84.472
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts amazon
type: squadshifts
config: amazon
split: test
metrics:
- type: exact_match
value: 70.733
name: Exact Match
- type: f1
value: 83.958
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts new_wiki
type: squadshifts
config: new_wiki
split: test
metrics:
- type: exact_match
value: 82.011
name: Exact Match
- type: f1
value: 91.092
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts nyt
type: squadshifts
config: nyt
split: test
metrics:
- type: exact_match
value: 84.203
name: Exact Match
- type: f1
value: 91.521
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts reddit
type: squadshifts
config: reddit
split: test
metrics:
- type: exact_match
value: 72.029
name: Exact Match
- type: f1
value: 83.454
name: F1
---
## Overview
**Language model:** deepset/roberta-base-squad2-distilled
**Language:** English
**Training data:** SQuAD 2.0 training set
**Eval data:** SQuAD 2.0 dev set
**Infrastructure**: 4x V100 GPU
**Published**: Dec 8th, 2021
## Details
- haystack's distillation feature was used for training. deepset/roberta-large-squad2 was used as the teacher model.
## Hyperparameters
```
batch_size = 80
n_epochs = 4
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
temperature = 1.5
distillation_loss_weight = 0.75
```
## Performance
```
"exact": 79.8366040596311
"f1": 83.916407079888
```
## Authors
**Timo Möller:** timo.moeller@deepset.ai
**Julian Risch:** julian.risch@deepset.ai
**Malte Pietsch:** malte.pietsch@deepset.ai
**Michel Bartels:** michel.bartels@deepset.ai
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/) which is designed to help you build production ready NLP systems that use: Question answering, summarization, ranking etc.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")]([https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs) | 5,991 | [
[
-0.03369140625,
-0.04388427734375,
0.0295867919921875,
0.0077056884765625,
-0.00913238525390625,
0.0165863037109375,
-0.027801513671875,
-0.03411865234375,
0.0200958251953125,
0.00879669189453125,
-0.06353759765625,
-0.059814453125,
-0.03057861328125,
-0.0002925395965576172,
-0.019775390625,
0.07489013671875,
0.00795745849609375,
-0.003505706787109375,
-0.00272369384765625,
-0.00445556640625,
-0.036407470703125,
-0.035003662109375,
-0.0498046875,
-0.01561737060546875,
0.0182037353515625,
0.03289794921875,
0.0494384765625,
0.024200439453125,
0.04229736328125,
0.0236968994140625,
-0.0028820037841796875,
0.00972747802734375,
-0.04443359375,
0.0174407958984375,
-0.01010894775390625,
-0.019439697265625,
-0.02886962890625,
-0.0004763603210449219,
0.042755126953125,
0.03302001953125,
-0.013885498046875,
0.025115966796875,
-0.0009174346923828125,
0.0732421875,
-0.0462646484375,
0.01058197021484375,
-0.0511474609375,
-0.01141357421875,
-0.0030670166015625,
0.024139404296875,
-0.010498046875,
-0.011383056640625,
0.0145111083984375,
-0.04754638671875,
0.033172607421875,
-0.019805908203125,
0.08258056640625,
0.0177001953125,
-0.0015926361083984375,
-0.009307861328125,
-0.039337158203125,
0.059478759765625,
-0.078125,
0.005985260009765625,
0.042236328125,
0.0367431640625,
0.00525665283203125,
-0.06585693359375,
-0.04290771484375,
-0.00261688232421875,
-0.0191497802734375,
0.0224456787109375,
-0.0090789794921875,
-0.0213775634765625,
0.004856109619140625,
0.032623291015625,
-0.057098388671875,
0.0184478759765625,
-0.0380859375,
0.00325775146484375,
0.0679931640625,
0.011749267578125,
0.005615234375,
-0.0020599365234375,
-0.0186920166015625,
-0.023681640625,
-0.0350341796875,
0.004978179931640625,
0.01849365234375,
0.0281219482421875,
-0.031341552734375,
0.029937744140625,
-0.026763916015625,
0.037841796875,
0.0171966552734375,
0.0272064208984375,
0.043731689453125,
-0.051116943359375,
-0.0186920166015625,
0.0004837512969970703,
0.07598876953125,
0.0297393798828125,
0.00543212890625,
0.0004477500915527344,
-0.0189971923828125,
-0.006549835205078125,
0.0157928466796875,
-0.07330322265625,
-0.0225830078125,
0.0302276611328125,
-0.0240478515625,
-0.0367431640625,
0.0102386474609375,
-0.05755615234375,
-0.0271148681640625,
0.002658843994140625,
0.03436279296875,
-0.033721923828125,
-0.031768798828125,
0.01904296875,
-0.02313232421875,
0.034942626953125,
0.0119171142578125,
-0.06304931640625,
0.018096923828125,
0.047332763671875,
0.060699462890625,
0.008758544921875,
-0.0289154052734375,
-0.0226593017578125,
0.00027680397033691406,
-0.01071929931640625,
0.039520263671875,
-0.0235443115234375,
-0.01360321044921875,
0.005023956298828125,
0.01495361328125,
0.002979278564453125,
-0.027679443359375,
0.01708984375,
-0.033905029296875,
0.036865234375,
-0.0149993896484375,
-0.04217529296875,
-0.01025390625,
0.02288818359375,
-0.060516357421875,
0.08184814453125,
0.0189056396484375,
-0.03375244140625,
0.0225372314453125,
-0.054290771484375,
-0.03509521484375,
0.014434814453125,
-0.0039825439453125,
-0.034454345703125,
-0.01441192626953125,
0.0189361572265625,
0.045318603515625,
-0.0212249755859375,
0.0167236328125,
-0.017364501953125,
-0.038909912109375,
0.015411376953125,
-0.00899505615234375,
0.08544921875,
0.004138946533203125,
-0.047332763671875,
-0.01081085205078125,
-0.0565185546875,
0.00945281982421875,
0.022064208984375,
-0.021881103515625,
-0.0031337738037109375,
-0.006420135498046875,
0.01294708251953125,
0.01100921630859375,
0.040985107421875,
-0.0234527587890625,
0.014251708984375,
-0.03802490234375,
0.041595458984375,
0.04931640625,
-0.00621795654296875,
0.0312347412109375,
-0.0166168212890625,
0.033203125,
-0.00726318359375,
0.0086517333984375,
0.004638671875,
-0.023529052734375,
-0.06414794921875,
-0.023773193359375,
0.02203369140625,
0.04864501953125,
-0.04901123046875,
0.06787109375,
-0.0125732421875,
-0.0494384765625,
-0.053436279296875,
0.01375579833984375,
0.031768798828125,
0.0281219482421875,
0.03564453125,
-0.0013132095336914062,
-0.05963134765625,
-0.081298828125,
0.0014781951904296875,
-0.0090484619140625,
-0.0063934326171875,
0.0198822021484375,
0.051177978515625,
-0.0211181640625,
0.053131103515625,
-0.047821044921875,
-0.03472900390625,
-0.018096923828125,
-0.0142974853515625,
0.045501708984375,
0.046966552734375,
0.05670166015625,
-0.0606689453125,
-0.043609619140625,
-0.0124664306640625,
-0.061126708984375,
0.028656005859375,
-0.0013027191162109375,
-0.022308349609375,
0.0160369873046875,
0.0282745361328125,
-0.0489501953125,
0.0261993408203125,
0.04339599609375,
-0.036102294921875,
0.0313720703125,
0.0017490386962890625,
0.006618499755859375,
-0.111328125,
0.0206146240234375,
0.01052093505859375,
-0.00951385498046875,
-0.0279388427734375,
0.0113983154296875,
-0.0184326171875,
-0.004669189453125,
-0.038421630859375,
0.036102294921875,
-0.029449462890625,
0.01001739501953125,
0.0127410888671875,
0.0020294189453125,
0.01546478271484375,
0.03668212890625,
-0.00972747802734375,
0.081787109375,
0.057373046875,
-0.038177490234375,
0.047607421875,
0.02703857421875,
-0.041229248046875,
0.0277252197265625,
-0.07098388671875,
0.006565093994140625,
0.01043701171875,
0.01708984375,
-0.07733154296875,
-0.020751953125,
0.007144927978515625,
-0.047607421875,
0.0203094482421875,
-0.01629638671875,
-0.050445556640625,
-0.0286712646484375,
-0.04742431640625,
0.0211334228515625,
0.061859130859375,
-0.0287933349609375,
0.00933837890625,
0.0274810791015625,
0.00516510009765625,
-0.04400634765625,
-0.06793212890625,
0.006450653076171875,
-0.01161956787109375,
-0.043121337890625,
0.033905029296875,
-0.00211334228515625,
-0.00933074951171875,
0.014434814453125,
0.0032405853271484375,
-0.038665771484375,
0.0151824951171875,
0.0120086669921875,
0.0338134765625,
-0.027984619140625,
0.01186370849609375,
-0.017913818359375,
-0.01052093505859375,
-0.007091522216796875,
-0.02215576171875,
0.047210693359375,
-0.050750732421875,
0.002346038818359375,
-0.052703857421875,
0.019317626953125,
0.03240966796875,
-0.021087646484375,
0.06756591796875,
0.061004638671875,
-0.0297698974609375,
-0.0007338523864746094,
-0.04376220703125,
-0.0228729248046875,
-0.038482666015625,
0.036468505859375,
-0.01184844970703125,
-0.06512451171875,
0.0411376953125,
0.0164031982421875,
0.01488494873046875,
0.068115234375,
0.039154052734375,
-0.042266845703125,
0.05902099609375,
0.0350341796875,
-0.0072784423828125,
0.03338623046875,
-0.06646728515625,
-0.008819580078125,
-0.06854248046875,
-0.0198974609375,
-0.043060302734375,
-0.049163818359375,
-0.055450439453125,
-0.0224761962890625,
0.012603759765625,
0.005870819091796875,
-0.035888671875,
0.045989990234375,
-0.0584716796875,
0.04522705078125,
0.0540771484375,
0.013275146484375,
-0.003055572509765625,
0.0023517608642578125,
0.009124755859375,
0.00499725341796875,
-0.046905517578125,
-0.0300140380859375,
0.07891845703125,
0.0042877197265625,
0.04010009765625,
0.017547607421875,
0.07049560546875,
0.0237274169921875,
-0.0289459228515625,
-0.0462646484375,
0.0372314453125,
-0.01910400390625,
-0.073486328125,
-0.044036865234375,
-0.0285491943359375,
-0.078369140625,
-0.006984710693359375,
-0.0166168212890625,
-0.045654296875,
0.02093505859375,
0.0035495758056640625,
-0.0286102294921875,
0.0178680419921875,
-0.053802490234375,
0.0682373046875,
-0.0175933837890625,
-0.0210418701171875,
-0.01904296875,
-0.05841064453125,
0.0111846923828125,
0.00960540771484375,
0.005706787109375,
-0.0175323486328125,
0.0018062591552734375,
0.05596923828125,
-0.04510498046875,
0.07220458984375,
-0.016143798828125,
0.003025054931640625,
0.0251312255859375,
0.005344390869140625,
0.0377197265625,
0.0228118896484375,
-0.0277252197265625,
0.031707763671875,
0.01062774658203125,
-0.038330078125,
-0.03485107421875,
0.061737060546875,
-0.066650390625,
-0.0250701904296875,
-0.036865234375,
-0.0216064453125,
-0.0031986236572265625,
0.0283660888671875,
0.0232391357421875,
0.0199432373046875,
-0.0199127197265625,
0.0462646484375,
0.046112060546875,
-0.0046539306640625,
0.0380859375,
0.032196044921875,
-0.0011997222900390625,
-0.03204345703125,
0.06500244140625,
-0.004001617431640625,
0.00917816162109375,
0.0297393798828125,
0.0063018798828125,
-0.0215606689453125,
-0.034271240234375,
-0.039642333984375,
0.009307861328125,
-0.034454345703125,
-0.02447509765625,
-0.019500732421875,
-0.030120849609375,
-0.045806884765625,
-0.014556884765625,
-0.04290771484375,
-0.0428466796875,
-0.034393310546875,
-0.01141357421875,
0.052398681640625,
0.04095458984375,
-0.0080413818359375,
0.011474609375,
-0.03802490234375,
0.0177764892578125,
0.0258026123046875,
0.034881591796875,
-0.0130462646484375,
-0.035247802734375,
-0.0294036865234375,
0.031158447265625,
0.0009775161743164062,
-0.039337158203125,
0.006618499755859375,
0.02276611328125,
0.03289794921875,
0.0007295608520507812,
0.0015993118286132812,
0.046234130859375,
-0.0347900390625,
0.0660400390625,
0.0199127197265625,
-0.05267333984375,
0.0592041015625,
-0.0246429443359375,
0.03253173828125,
0.09521484375,
0.0259857177734375,
-0.032928466796875,
-0.01372528076171875,
-0.052642822265625,
-0.07684326171875,
0.047119140625,
0.032318115234375,
0.015289306640625,
0.0010242462158203125,
0.0182037353515625,
-0.0031337738037109375,
0.0163726806640625,
-0.032379150390625,
-0.0275726318359375,
-0.01690673828125,
-0.01904296875,
-0.007259368896484375,
-0.013214111328125,
-0.0107421875,
-0.033111572265625,
0.07208251953125,
0.00785064697265625,
0.004779815673828125,
0.009490966796875,
-0.01029205322265625,
0.0225067138671875,
0.0017595291137695312,
0.042236328125,
0.062744140625,
-0.032501220703125,
-0.0037670135498046875,
0.01265716552734375,
-0.0372314453125,
0.004695892333984375,
0.0132904052734375,
-0.0274505615234375,
0.01419830322265625,
0.0367431640625,
0.0589599609375,
0.01038360595703125,
-0.040252685546875,
0.046478271484375,
-0.006195068359375,
-0.03717041015625,
-0.04107666015625,
0.0166778564453125,
0.0184783935546875,
0.0296173095703125,
0.035888671875,
-0.00902557373046875,
0.01605224609375,
-0.038543701171875,
0.00823211669921875,
0.038116455078125,
-0.031280517578125,
-0.0083770751953125,
0.0247802734375,
0.030120849609375,
-0.0290679931640625,
0.058135986328125,
-0.0153656005859375,
-0.03717041015625,
0.06658935546875,
0.0187225341796875,
0.0750732421875,
0.0119171142578125,
0.0213470458984375,
0.044219970703125,
0.0272064208984375,
0.0054473876953125,
0.014129638671875,
0.001445770263671875,
-0.040771484375,
-0.0182342529296875,
-0.0419921875,
-0.00926971435546875,
0.0265960693359375,
-0.056732177734375,
0.01447296142578125,
-0.045318603515625,
-0.01186370849609375,
0.0096893310546875,
0.027587890625,
-0.07696533203125,
0.013824462890625,
-0.004734039306640625,
0.060882568359375,
-0.041229248046875,
0.044219970703125,
0.06036376953125,
-0.050567626953125,
-0.05322265625,
-0.012054443359375,
-0.00482940673828125,
-0.07489013671875,
0.034088134765625,
0.017608642578125,
-0.01300811767578125,
0.0116729736328125,
-0.05731201171875,
-0.0616455078125,
0.10052490234375,
0.00920867919921875,
-0.050689697265625,
-0.01079559326171875,
-0.007564544677734375,
0.033294677734375,
-0.012847900390625,
0.006244659423828125,
0.038665771484375,
0.03985595703125,
0.0096282958984375,
-0.069580078125,
0.0135650634765625,
-0.03106689453125,
0.0025463104248046875,
0.002948760986328125,
-0.058685302734375,
0.049652099609375,
-0.00726318359375,
-0.0137939453125,
0.00815582275390625,
0.033355712890625,
0.01528167724609375,
0.00891876220703125,
0.024932861328125,
0.04901123046875,
0.059783935546875,
-0.004791259765625,
0.067626953125,
-0.01497650146484375,
0.047393798828125,
0.099609375,
-0.0160675048828125,
0.0654296875,
0.0296173095703125,
-0.033935546875,
0.05242919921875,
0.041534423828125,
-0.03204345703125,
0.045166015625,
0.0091705322265625,
-0.006755828857421875,
-0.0191192626953125,
0.005931854248046875,
-0.06390380859375,
0.03277587890625,
0.01105499267578125,
-0.0237884521484375,
-0.01177978515625,
-0.0289306640625,
-0.003520965576171875,
0.0009341239929199219,
0.0012979507446289062,
0.07196044921875,
-0.00972747802734375,
-0.0300140380859375,
0.0648193359375,
-0.0073089599609375,
0.059661865234375,
-0.049560546875,
-0.00324249267578125,
-0.0234832763671875,
0.018524169921875,
-0.017822265625,
-0.07696533203125,
0.00688934326171875,
-0.0006194114685058594,
-0.035980224609375,
-0.005870819091796875,
0.038543701171875,
-0.02490234375,
-0.0679931640625,
0.005199432373046875,
0.0309295654296875,
0.0184783935546875,
0.002643585205078125,
-0.066162109375,
0.0038604736328125,
0.0003998279571533203,
-0.0221099853515625,
0.0182647705078125,
0.03509521484375,
0.0262603759765625,
0.03399658203125,
0.05865478515625,
0.003612518310546875,
-0.006816864013671875,
0.007694244384765625,
0.06585693359375,
-0.0494384765625,
-0.0180816650390625,
-0.08087158203125,
0.045166015625,
-0.032867431640625,
-0.034423828125,
0.050323486328125,
0.06463623046875,
0.07379150390625,
-0.01143646240234375,
0.06201171875,
-0.0153045654296875,
0.0309600830078125,
-0.0352783203125,
0.0745849609375,
-0.054412841796875,
-0.00018787384033203125,
-0.01123046875,
-0.06298828125,
-0.00997161865234375,
0.06683349609375,
0.005870819091796875,
0.0128326416015625,
0.045440673828125,
0.057098388671875,
0.00021076202392578125,
-0.029205322265625,
0.0003314018249511719,
0.02020263671875,
0.0179290771484375,
0.06353759765625,
0.047149658203125,
-0.053680419921875,
0.0419921875,
-0.032073974609375,
-0.005985260009765625,
-0.02935791015625,
-0.056243896484375,
-0.06585693359375,
-0.05035400390625,
-0.0282135009765625,
-0.044464111328125,
0.0142669677734375,
0.06103515625,
0.06732177734375,
-0.06707763671875,
-0.024383544921875,
-0.00897216796875,
0.00567626953125,
-0.0197906494140625,
-0.0210113525390625,
0.033660888671875,
-0.01351165771484375,
-0.04205322265625,
0.034149169921875,
0.0034885406494140625,
-0.00026917457580566406,
-0.0234527587890625,
-0.0072174072265625,
-0.052520751953125,
-0.0235595703125,
0.03216552734375,
0.02288818359375,
-0.044189453125,
0.0030612945556640625,
-0.002132415771484375,
-0.0218048095703125,
-0.00226593017578125,
0.0269317626953125,
-0.06475830078125,
0.016021728515625,
0.03558349609375,
0.050628662109375,
0.052032470703125,
-0.01218414306640625,
0.03460693359375,
-0.0604248046875,
0.0268096923828125,
0.03240966796875,
0.0188751220703125,
0.0208282470703125,
-0.037750244140625,
0.059783935546875,
0.00872039794921875,
-0.0276947021484375,
-0.061370849609375,
-0.0007023811340332031,
-0.06695556640625,
-0.0280914306640625,
0.0904541015625,
0.002716064453125,
-0.006549835205078125,
0.0237274169921875,
-0.00794219970703125,
0.016387939453125,
-0.039459228515625,
0.046966552734375,
0.054931640625,
0.0213470458984375,
0.01332855224609375,
-0.046112060546875,
0.03643798828125,
0.0222015380859375,
-0.059112548828125,
-0.005786895751953125,
0.04180908203125,
0.0209808349609375,
0.005565643310546875,
0.04083251953125,
0.0134735107421875,
0.034454345703125,
-0.0121002197265625,
0.0095672607421875,
-0.020416259765625,
-0.0162811279296875,
-0.0201263427734375,
-0.012237548828125,
-0.0166168212890625,
-0.0333251953125
]
] |
google/bert_uncased_L-4_H-512_A-8 | 2021-05-19T17:30:51.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | google | null | null | google/bert_uncased_L-4_H-512_A-8 | 3 | 16,485 | transformers | 2022-03-02T23:29:05 | ---
thumbnail: https://huggingface.co/front/thumbnails/google.png
license: apache-2.0
---
BERT Miniatures
===
This is the set of 24 BERT models referenced in [Well-Read Students Learn Better: On the Importance of Pre-training Compact Models](https://arxiv.org/abs/1908.08962) (English only, uncased, trained with WordPiece masking).
We have shown that the standard BERT recipe (including model architecture and training objective) is effective on a wide range of model sizes, beyond BERT-Base and BERT-Large. The smaller BERT models are intended for environments with restricted computational resources. They can be fine-tuned in the same manner as the original BERT models. However, they are most effective in the context of knowledge distillation, where the fine-tuning labels are produced by a larger and more accurate teacher.
Our goal is to enable research in institutions with fewer computational resources and encourage the community to seek directions of innovation alternative to increasing model capacity.
You can download the 24 BERT miniatures either from the [official BERT Github page](https://github.com/google-research/bert/), or via HuggingFace from the links below:
| |H=128|H=256|H=512|H=768|
|---|:---:|:---:|:---:|:---:|
| **L=2** |[**2/128 (BERT-Tiny)**][2_128]|[2/256][2_256]|[2/512][2_512]|[2/768][2_768]|
| **L=4** |[4/128][4_128]|[**4/256 (BERT-Mini)**][4_256]|[**4/512 (BERT-Small)**][4_512]|[4/768][4_768]|
| **L=6** |[6/128][6_128]|[6/256][6_256]|[6/512][6_512]|[6/768][6_768]|
| **L=8** |[8/128][8_128]|[8/256][8_256]|[**8/512 (BERT-Medium)**][8_512]|[8/768][8_768]|
| **L=10** |[10/128][10_128]|[10/256][10_256]|[10/512][10_512]|[10/768][10_768]|
| **L=12** |[12/128][12_128]|[12/256][12_256]|[12/512][12_512]|[**12/768 (BERT-Base)**][12_768]|
Note that the BERT-Base model in this release is included for completeness only; it was re-trained under the same regime as the original model.
Here are the corresponding GLUE scores on the test set:
|Model|Score|CoLA|SST-2|MRPC|STS-B|QQP|MNLI-m|MNLI-mm|QNLI(v2)|RTE|WNLI|AX|
|---|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
|BERT-Tiny|64.2|0.0|83.2|81.1/71.1|74.3/73.6|62.2/83.4|70.2|70.3|81.5|57.2|62.3|21.0|
|BERT-Mini|65.8|0.0|85.9|81.1/71.8|75.4/73.3|66.4/86.2|74.8|74.3|84.1|57.9|62.3|26.1|
|BERT-Small|71.2|27.8|89.7|83.4/76.2|78.8/77.0|68.1/87.0|77.6|77.0|86.4|61.8|62.3|28.6|
|BERT-Medium|73.5|38.0|89.6|86.6/81.6|80.4/78.4|69.6/87.9|80.0|79.1|87.7|62.2|62.3|30.5|
For each task, we selected the best fine-tuning hyperparameters from the lists below, and trained for 4 epochs:
- batch sizes: 8, 16, 32, 64, 128
- learning rates: 3e-4, 1e-4, 5e-5, 3e-5
If you use these models, please cite the following paper:
```
@article{turc2019,
title={Well-Read Students Learn Better: On the Importance of Pre-training Compact Models},
author={Turc, Iulia and Chang, Ming-Wei and Lee, Kenton and Toutanova, Kristina},
journal={arXiv preprint arXiv:1908.08962v2 },
year={2019}
}
```
[2_128]: https://huggingface.co/google/bert_uncased_L-2_H-128_A-2
[2_256]: https://huggingface.co/google/bert_uncased_L-2_H-256_A-4
[2_512]: https://huggingface.co/google/bert_uncased_L-2_H-512_A-8
[2_768]: https://huggingface.co/google/bert_uncased_L-2_H-768_A-12
[4_128]: https://huggingface.co/google/bert_uncased_L-4_H-128_A-2
[4_256]: https://huggingface.co/google/bert_uncased_L-4_H-256_A-4
[4_512]: https://huggingface.co/google/bert_uncased_L-4_H-512_A-8
[4_768]: https://huggingface.co/google/bert_uncased_L-4_H-768_A-12
[6_128]: https://huggingface.co/google/bert_uncased_L-6_H-128_A-2
[6_256]: https://huggingface.co/google/bert_uncased_L-6_H-256_A-4
[6_512]: https://huggingface.co/google/bert_uncased_L-6_H-512_A-8
[6_768]: https://huggingface.co/google/bert_uncased_L-6_H-768_A-12
[8_128]: https://huggingface.co/google/bert_uncased_L-8_H-128_A-2
[8_256]: https://huggingface.co/google/bert_uncased_L-8_H-256_A-4
[8_512]: https://huggingface.co/google/bert_uncased_L-8_H-512_A-8
[8_768]: https://huggingface.co/google/bert_uncased_L-8_H-768_A-12
[10_128]: https://huggingface.co/google/bert_uncased_L-10_H-128_A-2
[10_256]: https://huggingface.co/google/bert_uncased_L-10_H-256_A-4
[10_512]: https://huggingface.co/google/bert_uncased_L-10_H-512_A-8
[10_768]: https://huggingface.co/google/bert_uncased_L-10_H-768_A-12
[12_128]: https://huggingface.co/google/bert_uncased_L-12_H-128_A-2
[12_256]: https://huggingface.co/google/bert_uncased_L-12_H-256_A-4
[12_512]: https://huggingface.co/google/bert_uncased_L-12_H-512_A-8
[12_768]: https://huggingface.co/google/bert_uncased_L-12_H-768_A-12
| 4,617 | [
[
-0.053558349609375,
-0.03546142578125,
0.0239410400390625,
0.0131683349609375,
-0.02374267578125,
-0.016937255859375,
-0.0239715576171875,
-0.031219482421875,
0.04376220703125,
-0.006092071533203125,
-0.06103515625,
-0.030670166015625,
-0.052093505859375,
-0.001918792724609375,
-0.0018672943115234375,
0.0860595703125,
0.015594482421875,
-0.0009140968322753906,
-0.01369476318359375,
-0.0030422210693359375,
-0.0162811279296875,
-0.0218353271484375,
-0.0292510986328125,
-0.0182952880859375,
0.046905517578125,
0.02886962890625,
0.06463623046875,
0.043487548828125,
0.0390625,
0.0274200439453125,
-0.021575927734375,
0.004146575927734375,
-0.0291900634765625,
-0.03277587890625,
0.01541900634765625,
-0.0261688232421875,
-0.0648193359375,
0.0187835693359375,
0.04217529296875,
0.0548095703125,
-0.003711700439453125,
0.0264434814453125,
0.02435302734375,
0.050323486328125,
-0.037811279296875,
0.01103973388671875,
-0.019500732421875,
-0.006824493408203125,
-0.007724761962890625,
0.0133056640625,
-0.0197296142578125,
-0.049346923828125,
0.0248870849609375,
-0.0609130859375,
0.0188140869140625,
-0.0117340087890625,
0.09710693359375,
0.00795745849609375,
-0.01812744140625,
-0.024017333984375,
-0.0204925537109375,
0.07330322265625,
-0.067138671875,
0.0260162353515625,
0.0275421142578125,
0.0011081695556640625,
-0.00989532470703125,
-0.053558349609375,
-0.036529541015625,
0.005702972412109375,
-0.02899169921875,
0.028106689453125,
-0.0166015625,
-0.00241851806640625,
0.024688720703125,
0.029144287109375,
-0.043701171875,
0.005710601806640625,
-0.03900146484375,
-0.0187530517578125,
0.057647705078125,
0.004817962646484375,
0.02020263671875,
-0.004825592041015625,
-0.0294342041015625,
-0.027069091796875,
-0.0257415771484375,
0.025634765625,
0.0257110595703125,
0.01308441162109375,
-0.03729248046875,
0.032135009765625,
0.005344390869140625,
0.05731201171875,
0.034271240234375,
-0.031951904296875,
0.042816162109375,
-0.01763916015625,
-0.0213775634765625,
-0.01474761962890625,
0.057830810546875,
0.0268707275390625,
0.01019287109375,
0.00830078125,
-0.00836181640625,
-0.00711822509765625,
0.01641845703125,
-0.0721435546875,
-0.038604736328125,
0.00789642333984375,
-0.050628662109375,
-0.013580322265625,
0.0023517608642578125,
-0.048675537109375,
0.00579071044921875,
-0.025909423828125,
0.0430908203125,
-0.054534912109375,
0.0023632049560546875,
0.01012420654296875,
-0.0124053955078125,
0.0190277099609375,
0.032012939453125,
-0.06634521484375,
0.018157958984375,
0.0280914306640625,
0.036407470703125,
0.01041412353515625,
-0.0175933837890625,
0.00821685791015625,
-0.002655029296875,
-0.0304412841796875,
0.04534912109375,
-0.0266265869140625,
-0.0208282470703125,
-0.01361083984375,
0.01525115966796875,
-0.02178955078125,
-0.0279998779296875,
0.050537109375,
-0.0025615692138671875,
0.019134521484375,
-0.035888671875,
-0.061737060546875,
-0.0011587142944335938,
0.0181884765625,
-0.04803466796875,
0.07293701171875,
0.00638580322265625,
-0.05780029296875,
0.0283660888671875,
-0.0294952392578125,
-0.008270263671875,
-0.025634765625,
-0.003406524658203125,
-0.061004638671875,
-0.0005154609680175781,
0.02337646484375,
0.0499267578125,
-0.007228851318359375,
-0.01186370849609375,
-0.0367431640625,
-0.023834228515625,
0.01025390625,
0.00494384765625,
0.0712890625,
0.01049041748046875,
-0.0202178955078125,
0.00690460205078125,
-0.06549072265625,
0.0256805419921875,
0.0287933349609375,
-0.028350830078125,
-0.00250244140625,
-0.029754638671875,
-0.00911712646484375,
0.02264404296875,
0.04571533203125,
-0.038238525390625,
0.01806640625,
-0.01541900634765625,
0.02862548828125,
0.0615234375,
-0.004772186279296875,
0.0281524658203125,
-0.054473876953125,
0.0200042724609375,
0.000881195068359375,
0.0350341796875,
0.00013709068298339844,
-0.045440673828125,
-0.06439208984375,
-0.04339599609375,
0.030487060546875,
0.0176239013671875,
-0.0247039794921875,
0.06561279296875,
-0.01812744140625,
-0.06591796875,
-0.043701171875,
0.0081329345703125,
0.039886474609375,
0.02435302734375,
0.019012451171875,
-0.01611328125,
-0.032562255859375,
-0.08013916015625,
-0.004302978515625,
-0.0109405517578125,
-0.00240325927734375,
0.037811279296875,
0.047576904296875,
-0.00748443603515625,
0.056549072265625,
-0.048553466796875,
-0.0200653076171875,
-0.004199981689453125,
-0.0038928985595703125,
0.0297088623046875,
0.05810546875,
0.07275390625,
-0.058258056640625,
-0.047515869140625,
-0.0251312255859375,
-0.04718017578125,
0.0217132568359375,
-0.0076904296875,
-0.01505279541015625,
0.00920867919921875,
0.019500732421875,
-0.06500244140625,
0.043487548828125,
0.03900146484375,
-0.030914306640625,
0.05987548828125,
-0.036529541015625,
-0.0026226043701171875,
-0.065673828125,
0.016204833984375,
0.006397247314453125,
-0.006061553955078125,
-0.03564453125,
-0.0026607513427734375,
0.0156402587890625,
0.011566162109375,
-0.029266357421875,
0.038665771484375,
-0.04644775390625,
0.00025963783264160156,
0.01258087158203125,
0.006763458251953125,
0.0070648193359375,
0.050750732421875,
-0.00516510009765625,
0.052032470703125,
0.035491943359375,
-0.0219268798828125,
0.0081024169921875,
0.035980224609375,
-0.032318115234375,
0.030303955078125,
-0.058135986328125,
0.005611419677734375,
0.0019197463989257812,
0.0286865234375,
-0.08184814453125,
-0.029052734375,
0.00514984130859375,
-0.0567626953125,
0.03790283203125,
0.0011739730834960938,
-0.037017822265625,
-0.049530029296875,
-0.047271728515625,
0.0118408203125,
0.058441162109375,
-0.0401611328125,
0.0251007080078125,
0.0172119140625,
-0.005039215087890625,
-0.035919189453125,
-0.036865234375,
-0.0321044921875,
-0.0182342529296875,
-0.0511474609375,
0.042236328125,
-0.03253173828125,
0.01531219482421875,
0.007320404052734375,
-0.01386260986328125,
-0.0177764892578125,
0.0015916824340820312,
0.0185699462890625,
0.0341796875,
-0.0164794921875,
0.0038471221923828125,
-0.0048675537109375,
0.01082611083984375,
0.0009794235229492188,
0.00452423095703125,
0.03546142578125,
-0.03240966796875,
0.0033473968505859375,
-0.053253173828125,
0.006328582763671875,
0.039093017578125,
-0.005573272705078125,
0.07293701171875,
0.06304931640625,
-0.030059814453125,
0.003986358642578125,
-0.044464111328125,
-0.031494140625,
-0.0379638671875,
-0.00899505615234375,
-0.0347900390625,
-0.0654296875,
0.0531005859375,
0.002655029296875,
0.01520538330078125,
0.048553466796875,
0.041046142578125,
-0.023681640625,
0.07427978515625,
0.043975830078125,
-0.0145721435546875,
0.034149169921875,
-0.04583740234375,
0.004138946533203125,
-0.061309814453125,
-0.0216827392578125,
-0.0230865478515625,
-0.039764404296875,
-0.046600341796875,
-0.016571044921875,
0.02716064453125,
0.02520751953125,
-0.033203125,
0.04425048828125,
-0.048065185546875,
0.0224151611328125,
0.0533447265625,
0.041046142578125,
-0.0209197998046875,
-0.01678466796875,
-0.0220184326171875,
-0.016754150390625,
-0.05694580078125,
-0.0204925537109375,
0.080078125,
0.02520751953125,
0.0413818359375,
0.0090179443359375,
0.061248779296875,
0.00531005859375,
-0.0036144256591796875,
-0.0516357421875,
0.044586181640625,
-0.007701873779296875,
-0.07513427734375,
-0.031585693359375,
-0.0244903564453125,
-0.07843017578125,
0.009521484375,
-0.038238525390625,
-0.066162109375,
0.01580810546875,
0.0184173583984375,
-0.0460205078125,
0.0191650390625,
-0.05853271484375,
0.0654296875,
-0.0031375885009765625,
-0.03314208984375,
-0.00252532958984375,
-0.06744384765625,
0.0260772705078125,
-0.0002015829086303711,
0.0063018798828125,
-0.00664520263671875,
0.01198577880859375,
0.07049560546875,
-0.049163818359375,
0.06890869140625,
-0.01209259033203125,
0.00452423095703125,
0.034698486328125,
-0.0081634521484375,
0.04437255859375,
-0.00213623046875,
0.00847625732421875,
-0.00791168212890625,
0.01163482666015625,
-0.055389404296875,
-0.0296630859375,
0.051910400390625,
-0.0721435546875,
-0.035919189453125,
-0.03253173828125,
-0.045654296875,
-0.0217742919921875,
0.0257568359375,
0.040008544921875,
0.03448486328125,
0.0055694580078125,
0.03546142578125,
0.06109619140625,
-0.01226043701171875,
0.0333251953125,
0.01393890380859375,
0.01113128662109375,
-0.0228271484375,
0.0653076171875,
0.01422119140625,
0.008270263671875,
0.01007843017578125,
0.0196380615234375,
-0.022369384765625,
-0.03216552734375,
-0.008453369140625,
0.049072265625,
-0.031280517578125,
-0.01157379150390625,
-0.041473388671875,
-0.02117919921875,
-0.04998779296875,
-0.033782958984375,
-0.04010009765625,
-0.042694091796875,
-0.0413818359375,
-0.0035190582275390625,
0.0236358642578125,
0.0433349609375,
-0.02044677734375,
0.02032470703125,
-0.060516357421875,
0.01898193359375,
0.0377197265625,
0.023162841796875,
-0.00939178466796875,
-0.039154052734375,
-0.017730712890625,
-0.00009077787399291992,
-0.031585693359375,
-0.048492431640625,
0.0294647216796875,
0.01904296875,
0.04931640625,
0.040924072265625,
0.001972198486328125,
0.074951171875,
-0.040252685546875,
0.06524658203125,
0.046173095703125,
-0.057373046875,
0.040557861328125,
-0.033966064453125,
0.01611328125,
0.036163330078125,
0.043426513671875,
-0.00218963623046875,
-0.007411956787109375,
-0.08099365234375,
-0.053009033203125,
0.05670166015625,
0.0244140625,
0.001644134521484375,
0.0011768341064453125,
0.0279388427734375,
-0.0029964447021484375,
0.01114654541015625,
-0.04241943359375,
-0.046661376953125,
-0.007022857666015625,
-0.018798828125,
-0.01039886474609375,
-0.0306243896484375,
-0.01381683349609375,
-0.04766845703125,
0.057647705078125,
-0.00696563720703125,
0.046966552734375,
0.00873565673828125,
0.01152801513671875,
0.0042572021484375,
-0.006244659423828125,
0.048919677734375,
0.055633544921875,
-0.04608154296875,
-0.0244903564453125,
0.00897979736328125,
-0.044158935546875,
-0.0123138427734375,
0.018402099609375,
-0.0006556510925292969,
0.004364013671875,
0.037200927734375,
0.053619384765625,
0.029296875,
-0.036590576171875,
0.0526123046875,
-0.0007185935974121094,
-0.03118896484375,
-0.036407470703125,
0.0021533966064453125,
0.00937652587890625,
0.0292510986328125,
0.0078887939453125,
-0.00033855438232421875,
0.01047515869140625,
-0.0460205078125,
0.0223236083984375,
0.03155517578125,
-0.0377197265625,
-0.032135009765625,
0.045379638671875,
0.005352020263671875,
0.01151275634765625,
0.033660888671875,
-0.01161956787109375,
-0.0394287109375,
0.050140380859375,
0.03521728515625,
0.03564453125,
-0.0201568603515625,
0.0141143798828125,
0.0633544921875,
0.0169525146484375,
-0.001003265380859375,
0.0250701904296875,
0.0041961669921875,
-0.041259765625,
0.0007557868957519531,
-0.047332763671875,
-0.0160064697265625,
0.022003173828125,
-0.0745849609375,
0.01181793212890625,
-0.045501708984375,
-0.035003662109375,
0.0216217041015625,
0.031707763671875,
-0.06365966796875,
0.04144287109375,
0.0186767578125,
0.07415771484375,
-0.050933837890625,
0.07427978515625,
0.06402587890625,
-0.0246734619140625,
-0.07598876953125,
-0.01062774658203125,
0.00843048095703125,
-0.0631103515625,
0.041290283203125,
0.0018396377563476562,
0.02777099609375,
-0.0018148422241210938,
-0.0443115234375,
-0.0760498046875,
0.097412109375,
0.0198516845703125,
-0.039642333984375,
-0.018890380859375,
-0.0093994140625,
0.0330810546875,
-0.01044464111328125,
0.026153564453125,
0.041534423828125,
0.0283050537109375,
0.01146697998046875,
-0.07501220703125,
0.00933074951171875,
-0.031585693359375,
0.0025177001953125,
0.01666259765625,
-0.0838623046875,
0.0909423828125,
-0.0243988037109375,
0.003692626953125,
0.01346588134765625,
0.043487548828125,
0.044189453125,
0.00455474853515625,
0.03948974609375,
0.06500244140625,
0.0457763671875,
-0.01971435546875,
0.07586669921875,
-0.0185089111328125,
0.04693603515625,
0.07012939453125,
0.0252838134765625,
0.049652099609375,
0.0264434814453125,
-0.030670166015625,
0.032562255859375,
0.05731201171875,
-0.01031494140625,
0.04193115234375,
0.0148162841796875,
0.0018596649169921875,
-0.03741455078125,
0.01088714599609375,
-0.03790283203125,
0.01456451416015625,
0.0361328125,
-0.0201263427734375,
-0.00919342041015625,
-0.0149078369140625,
0.02264404296875,
-0.006290435791015625,
-0.0174560546875,
0.045074462890625,
0.000015616416931152344,
-0.0260467529296875,
0.05511474609375,
-0.021514892578125,
0.05194091796875,
-0.05126953125,
0.0130615234375,
-0.0097808837890625,
0.0294952392578125,
-0.01111602783203125,
-0.06298828125,
0.002773284912109375,
-0.006282806396484375,
-0.01285552978515625,
-0.004642486572265625,
0.047027587890625,
-0.01561737060546875,
-0.046051025390625,
0.019866943359375,
0.0236053466796875,
0.015625,
0.01383209228515625,
-0.0826416015625,
0.01338958740234375,
0.004756927490234375,
-0.04412841796875,
0.0311126708984375,
0.03179931640625,
0.016448974609375,
0.042694091796875,
0.04254150390625,
-0.0116729736328125,
0.028167724609375,
-0.0208740234375,
0.08099365234375,
-0.0333251953125,
-0.0294036865234375,
-0.041534423828125,
0.044769287109375,
-0.01129913330078125,
-0.035247802734375,
0.07061767578125,
0.04443359375,
0.06536865234375,
-0.0185089111328125,
0.0423583984375,
-0.0178985595703125,
0.04986572265625,
-0.0285797119140625,
0.057220458984375,
-0.06854248046875,
-0.0135650634765625,
-0.028656005859375,
-0.057159423828125,
-0.016082763671875,
0.05621337890625,
-0.020050048828125,
0.0117340087890625,
0.0308685302734375,
0.032562255859375,
0.00013208389282226562,
-0.013214111328125,
0.0095367431640625,
0.018035888671875,
0.01534271240234375,
0.0643310546875,
0.031005859375,
-0.05255126953125,
0.033203125,
-0.058624267578125,
-0.015594482421875,
-0.032440185546875,
-0.040924072265625,
-0.08477783203125,
-0.051483154296875,
-0.0299224853515625,
-0.01934814453125,
-0.0062103271484375,
0.07196044921875,
0.0712890625,
-0.0550537109375,
-0.008270263671875,
0.01849365234375,
-0.01244354248046875,
-0.01291656494140625,
-0.0137786865234375,
0.048126220703125,
-0.0127716064453125,
-0.06658935546875,
0.002437591552734375,
-0.0162506103515625,
0.0294342041015625,
0.0168609619140625,
-0.01398468017578125,
-0.0272216796875,
0.017181396484375,
0.042205810546875,
0.0179595947265625,
-0.046356201171875,
-0.03192138671875,
0.0013885498046875,
-0.01195526123046875,
-0.0014667510986328125,
0.01444244384765625,
-0.038909912109375,
0.017059326171875,
0.0316162109375,
0.024017333984375,
0.050445556640625,
0.003231048583984375,
0.00262451171875,
-0.05694580078125,
0.0197296142578125,
0.016632080078125,
0.036590576171875,
0.006061553955078125,
-0.0145263671875,
0.048553466796875,
0.02008056640625,
-0.050811767578125,
-0.0721435546875,
-0.00969696044921875,
-0.1004638671875,
-0.020660400390625,
0.057952880859375,
-0.03265380859375,
-0.036163330078125,
0.037078857421875,
-0.01039886474609375,
0.018951416015625,
-0.031494140625,
0.048187255859375,
0.050262451171875,
-0.0163116455078125,
-0.01068878173828125,
-0.03204345703125,
0.04449462890625,
0.03155517578125,
-0.04925537109375,
-0.024566650390625,
0.0288848876953125,
0.0273590087890625,
0.036163330078125,
0.034393310546875,
-0.015380859375,
0.0182037353515625,
0.00624847412109375,
0.005680084228515625,
0.0111236572265625,
-0.0258941650390625,
-0.0002512931823730469,
-0.012115478515625,
-0.0189361572265625,
-0.033721923828125
]
] |
laion/clap-htsat-unfused | 2023-04-24T14:39:57.000Z | [
"transformers",
"pytorch",
"clap",
"feature-extraction",
"arxiv:2211.06687",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | feature-extraction | laion | null | null | laion/clap-htsat-unfused | 16 | 16,460 | transformers | 2023-02-16T20:47:08 | ---
license: apache-2.0
---
# Model card for CLAP
Model card for CLAP: Contrastive Language-Audio Pretraining

# Table of Contents
0. [TL;DR](#TL;DR)
1. [Model Details](#model-details)
2. [Usage](#usage)
3. [Uses](#uses)
4. [Citation](#citation)
# TL;DR
The abstract of the paper states that:
> Contrastive learning has shown remarkable success in the field of multimodal representation learning. In this paper, we propose a pipeline of contrastive language-audio pretraining to develop an audio representation by combining audio data with natural language descriptions. To accomplish this target, we first release LAION-Audio-630K, a large collection of 633,526 audio-text pairs from different data sources. Second, we construct a contrastive language-audio pretraining model by considering different audio encoders and text encoders. We incorporate the feature fusion mechanism and keyword-to-caption augmentation into the model design to further enable the model to process audio inputs of variable lengths and enhance the performance. Third, we perform comprehensive experiments to evaluate our model across three tasks: text-to-audio retrieval, zero-shot audio classification, and supervised audio classification. The results demonstrate that our model achieves superior performance in text-to-audio retrieval task. In audio classification tasks, the model achieves state-of-the-art performance in the zero-shot setting and is able to obtain performance comparable to models' results in the non-zero-shot setting. LAION-Audio-630K and the proposed model are both available to the public.
# Usage
You can use this model for zero shot audio classification or extracting audio and/or textual features.
# Uses
## Perform zero-shot audio classification
### Using `pipeline`
```python
from datasets import load_dataset
from transformers import pipeline
dataset = load_dataset("ashraq/esc50")
audio = dataset["train"]["audio"][-1]["array"]
audio_classifier = pipeline(task="zero-shot-audio-classification", model="laion/clap-htsat-unfused")
output = audio_classifier(audio, candidate_labels=["Sound of a dog", "Sound of vaccum cleaner"])
print(output)
>>> [{"score": 0.999, "label": "Sound of a dog"}, {"score": 0.001, "label": "Sound of vaccum cleaner"}]
```
## Run the model:
You can also get the audio and text embeddings using `ClapModel`
### Run the model on CPU:
```python
from datasets import load_dataset
from transformers import ClapModel, ClapProcessor
librispeech_dummy = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
audio_sample = librispeech_dummy[0]
model = ClapModel.from_pretrained("laion/clap-htsat-unfused")
processor = ClapProcessor.from_pretrained("laion/clap-htsat-unfused")
inputs = processor(audios=audio_sample["audio"]["array"], return_tensors="pt")
audio_embed = model.get_audio_features(**inputs)
```
### Run the model on GPU:
```python
from datasets import load_dataset
from transformers import ClapModel, ClapProcessor
librispeech_dummy = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
audio_sample = librispeech_dummy[0]
model = ClapModel.from_pretrained("laion/clap-htsat-unfused").to(0)
processor = ClapProcessor.from_pretrained("laion/clap-htsat-unfused")
inputs = processor(audios=audio_sample["audio"]["array"], return_tensors="pt").to(0)
audio_embed = model.get_audio_features(**inputs)
```
# Citation
If you are using this model for your work, please consider citing the original paper:
```
@misc{https://doi.org/10.48550/arxiv.2211.06687,
doi = {10.48550/ARXIV.2211.06687},
url = {https://arxiv.org/abs/2211.06687},
author = {Wu, Yusong and Chen, Ke and Zhang, Tianyu and Hui, Yuchen and Berg-Kirkpatrick, Taylor and Dubnov, Shlomo},
keywords = {Sound (cs.SD), Audio and Speech Processing (eess.AS), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Electrical engineering, electronic engineering, information engineering, FOS: Electrical engineering, electronic engineering, information engineering},
title = {Large-scale Contrastive Language-Audio Pretraining with Feature Fusion and Keyword-to-Caption Augmentation},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 4,454 | [
[
-0.04034423828125,
-0.045379638671875,
0.020355224609375,
0.0032062530517578125,
-0.0199432373046875,
-0.014678955078125,
-0.0254974365234375,
-0.0193328857421875,
0.005725860595703125,
0.017333984375,
-0.038299560546875,
-0.0430908203125,
-0.050445556640625,
-0.015869140625,
-0.03350830078125,
0.07293701171875,
0.014678955078125,
-0.0002923011779785156,
-0.00437164306640625,
-0.0177154541015625,
-0.031585693359375,
-0.0312347412109375,
-0.035675048828125,
-0.036590576171875,
0.0105438232421875,
0.0119171142578125,
0.02947998046875,
0.041473388671875,
0.020172119140625,
0.0293121337890625,
-0.0148162841796875,
-0.0257720947265625,
-0.01348876953125,
-0.004459381103515625,
0.014068603515625,
-0.039154052734375,
-0.048492431640625,
0.0159454345703125,
0.045745849609375,
0.032135009765625,
-0.00701904296875,
0.030364990234375,
0.01534271240234375,
0.00687408447265625,
-0.043731689453125,
0.01473236083984375,
-0.034912109375,
0.0007576942443847656,
-0.00904083251953125,
-0.0259246826171875,
-0.021331787109375,
0.0077362060546875,
0.00795745849609375,
-0.04534912109375,
0.030181884765625,
-0.00864410400390625,
0.0906982421875,
0.037322998046875,
-0.004535675048828125,
-0.006748199462890625,
-0.054107666015625,
0.06463623046875,
-0.0789794921875,
0.052459716796875,
0.0298614501953125,
0.013763427734375,
0.0218658447265625,
-0.061737060546875,
-0.061737060546875,
-0.01004791259765625,
0.00926971435546875,
0.035614013671875,
-0.0109710693359375,
0.0086669921875,
0.026123046875,
0.041351318359375,
-0.03662109375,
0.01015472412109375,
-0.0276641845703125,
-0.01346588134765625,
0.033447265625,
-0.013946533203125,
0.0180206298828125,
-0.0204925537109375,
-0.0312347412109375,
-0.053680419921875,
-0.028717041015625,
0.011932373046875,
0.0338134765625,
0.02386474609375,
-0.0545654296875,
0.0253753662109375,
0.004489898681640625,
0.03411865234375,
0.01342010498046875,
-0.032501220703125,
0.057708740234375,
-0.020050048828125,
-0.01549530029296875,
0.02264404296875,
0.08416748046875,
0.0023059844970703125,
0.00043892860412597656,
0.0205535888671875,
-0.023773193359375,
0.018341064453125,
-0.003772735595703125,
-0.053680419921875,
-0.0205535888671875,
0.0299224853515625,
-0.0234375,
-0.01128387451171875,
-0.002048492431640625,
-0.045806884765625,
0.00018417835235595703,
-0.03314208984375,
0.0728759765625,
-0.04290771484375,
-0.030059814453125,
0.01187896728515625,
-0.0216827392578125,
0.01389312744140625,
-0.01279449462890625,
-0.0655517578125,
0.01168060302734375,
0.03094482421875,
0.0535888671875,
0.00850677490234375,
-0.0269317626953125,
-0.043487548828125,
0.00943756103515625,
0.0007920265197753906,
0.04132080078125,
-0.0252532958984375,
-0.0230712890625,
-0.005340576171875,
-0.01172637939453125,
-0.0086669921875,
-0.049896240234375,
0.0523681640625,
-0.00861358642578125,
0.0254364013671875,
0.016510009765625,
-0.036346435546875,
-0.0175628662109375,
-0.00792694091796875,
-0.0374755859375,
0.07183837890625,
-0.008209228515625,
-0.06964111328125,
0.018341064453125,
-0.050445556640625,
-0.0256805419921875,
-0.0199737548828125,
-0.0121917724609375,
-0.048553466796875,
-0.0035686492919921875,
0.03466796875,
0.0343017578125,
-0.0245819091796875,
0.0293121337890625,
-0.0243377685546875,
-0.049957275390625,
0.0188751220703125,
-0.047576904296875,
0.0654296875,
0.01282501220703125,
-0.050567626953125,
0.0203704833984375,
-0.06488037109375,
-0.0175323486328125,
0.007686614990234375,
-0.0156707763671875,
0.003803253173828125,
-0.01079559326171875,
0.016082763671875,
0.01708984375,
0.00484466552734375,
-0.0287933349609375,
-0.033905029296875,
-0.03668212890625,
0.033599853515625,
0.04498291015625,
-0.0277252197265625,
0.024871826171875,
-0.036407470703125,
0.033447265625,
0.01222991943359375,
0.001766204833984375,
-0.03826904296875,
-0.0226287841796875,
-0.0694580078125,
-0.040130615234375,
0.043701171875,
0.046539306640625,
-0.036285400390625,
0.05670166015625,
-0.0222930908203125,
-0.03857421875,
-0.0865478515625,
-0.0084991455078125,
0.0333251953125,
0.0478515625,
0.05633544921875,
-0.0235595703125,
-0.049072265625,
-0.061553955078125,
0.00527191162109375,
-0.012451171875,
-0.028472900390625,
0.035614013671875,
0.026153564453125,
-0.0284881591796875,
0.054107666015625,
-0.0288543701171875,
-0.040740966796875,
-0.00997161865234375,
0.033905029296875,
0.050689697265625,
0.056732177734375,
0.0279693603515625,
-0.04425048828125,
-0.0195465087890625,
-0.018341064453125,
-0.058624267578125,
-0.0195770263671875,
-0.0020465850830078125,
0.0147857666015625,
-0.00760650634765625,
0.03765869140625,
-0.05126953125,
0.01302337646484375,
0.039520263671875,
0.0028209686279296875,
0.050048828125,
-0.0034122467041015625,
0.01438140869140625,
-0.0794677734375,
0.0129852294921875,
-0.00634765625,
-0.0038280487060546875,
-0.038970947265625,
-0.0172576904296875,
-0.01486968994140625,
-0.024383544921875,
-0.047607421875,
0.04449462890625,
-0.0277862548828125,
-0.0019893646240234375,
-0.0199737548828125,
0.0287322998046875,
-0.0077056884765625,
0.055938720703125,
0.0117645263671875,
0.050445556640625,
0.07159423828125,
-0.056732177734375,
0.01052093505859375,
0.0264434814453125,
-0.022064208984375,
0.037353515625,
-0.066162109375,
0.014678955078125,
-0.005794525146484375,
0.0153350830078125,
-0.078369140625,
-0.005771636962890625,
0.006114959716796875,
-0.05926513671875,
0.04876708984375,
-0.00559234619140625,
-0.0380859375,
-0.0222625732421875,
-0.01319122314453125,
0.0467529296875,
0.05377197265625,
-0.047515869140625,
0.0509033203125,
0.03875732421875,
-0.0177001953125,
-0.047271728515625,
-0.0423583984375,
-0.0338134765625,
-0.02667236328125,
-0.0382080078125,
0.040679931640625,
-0.00569915771484375,
0.00891876220703125,
-0.00795745849609375,
-0.0139617919921875,
0.00853729248046875,
-0.012969970703125,
0.0289154052734375,
0.024566650390625,
-0.0279693603515625,
0.010589599609375,
-0.007843017578125,
-0.0192108154296875,
-0.0009431838989257812,
-0.00411224365234375,
0.06036376953125,
-0.0109710693359375,
-0.023529052734375,
-0.0587158203125,
0.00696563720703125,
0.0261383056640625,
-0.02777099609375,
0.023162841796875,
0.0858154296875,
-0.0038585662841796875,
0.005733489990234375,
-0.045745849609375,
-0.0306854248046875,
-0.041259765625,
0.044921875,
-0.0192718505859375,
-0.053955078125,
0.034912109375,
0.0005469322204589844,
-0.0064697265625,
0.04229736328125,
0.0311279296875,
-0.0202484130859375,
0.07440185546875,
0.036224365234375,
-0.0033702850341796875,
0.043365478515625,
-0.048553466796875,
0.0022487640380859375,
-0.0679931640625,
-0.002460479736328125,
-0.040771484375,
-0.0220794677734375,
-0.041748046875,
-0.034912109375,
0.02508544921875,
0.01355743408203125,
-0.032501220703125,
0.0282135009765625,
-0.0307464599609375,
0.0029850006103515625,
0.049957275390625,
0.0009412765502929688,
-0.0027751922607421875,
-0.00020682811737060547,
-0.0218048095703125,
-0.00739288330078125,
-0.050079345703125,
-0.0206298828125,
0.07415771484375,
0.04510498046875,
0.04400634765625,
-0.00836181640625,
0.0628662109375,
-0.001682281494140625,
0.0064544677734375,
-0.0518798828125,
0.03399658203125,
-0.0156707763671875,
-0.042449951171875,
-0.017059326171875,
-0.024200439453125,
-0.060333251953125,
0.0253448486328125,
-0.028289794921875,
-0.042205810546875,
0.00844573974609375,
0.01209259033203125,
-0.026947021484375,
0.023193359375,
-0.056243896484375,
0.058349609375,
-0.0109405517578125,
-0.02337646484375,
-0.00333404541015625,
-0.041778564453125,
0.0179901123046875,
0.005615234375,
0.028289794921875,
-0.0030345916748046875,
0.0234832763671875,
0.08258056640625,
0.00128936767578125,
0.058258056640625,
-0.0218963623046875,
0.0019330978393554688,
0.027069091796875,
-0.023834228515625,
0.01093292236328125,
-0.00792694091796875,
0.003963470458984375,
0.03204345703125,
-0.004276275634765625,
-0.01288604736328125,
-0.032135009765625,
0.037811279296875,
-0.07159423828125,
-0.047149658203125,
-0.0194549560546875,
-0.0447998046875,
-0.0222320556640625,
0.012939453125,
0.0423583984375,
0.037261962890625,
0.003986358642578125,
0.0277862548828125,
0.045501708984375,
-0.03271484375,
0.0506591796875,
0.0271148681640625,
-0.01336669921875,
-0.046783447265625,
0.07470703125,
-0.0016298294067382812,
0.01446533203125,
0.026519775390625,
0.0192108154296875,
-0.043365478515625,
-0.03118896484375,
-0.0057830810546875,
0.03521728515625,
-0.041473388671875,
-0.007595062255859375,
-0.05316162109375,
-0.0273895263671875,
-0.045318603515625,
0.00795745849609375,
-0.051055908203125,
-0.0131683349609375,
-0.041168212890625,
-0.00284576416015625,
0.00540924072265625,
0.0157470703125,
-0.01678466796875,
0.048004150390625,
-0.06207275390625,
0.044952392578125,
0.0265655517578125,
0.0206298828125,
0.00007486343383789062,
-0.084716796875,
-0.015960693359375,
0.004390716552734375,
-0.039581298828125,
-0.05963134765625,
0.03472900390625,
0.013427734375,
0.052398681640625,
0.04278564453125,
-0.003871917724609375,
0.0565185546875,
-0.031890869140625,
0.058624267578125,
0.0377197265625,
-0.08111572265625,
0.0477294921875,
-0.031890869140625,
0.042510986328125,
0.031768798828125,
0.032257080078125,
-0.0152130126953125,
-0.021697998046875,
-0.06842041015625,
-0.07684326171875,
0.06256103515625,
0.0212554931640625,
0.01094818115234375,
0.01398468017578125,
-0.013427734375,
-0.001049041748046875,
0.019683837890625,
-0.0623779296875,
-0.03143310546875,
-0.04168701171875,
-0.01470947265625,
-0.0293121337890625,
-0.0099334716796875,
-0.0062255859375,
-0.04266357421875,
0.06304931640625,
-0.00044989585876464844,
0.04656982421875,
0.01366424560546875,
-0.00969696044921875,
0.01690673828125,
0.03204345703125,
0.038726806640625,
0.00841522216796875,
-0.036224365234375,
0.01317596435546875,
0.0131988525390625,
-0.037017822265625,
0.01163482666015625,
0.0197296142578125,
0.0017604827880859375,
0.01349639892578125,
0.0382080078125,
0.1019287109375,
0.035736083984375,
-0.0408935546875,
0.043548583984375,
0.006664276123046875,
-0.0241241455078125,
-0.0341796875,
0.0059661865234375,
0.020965576171875,
0.0185089111328125,
0.02386474609375,
0.00034356117248535156,
0.00164031982421875,
-0.0382080078125,
0.025299072265625,
0.01629638671875,
-0.0478515625,
-0.0323486328125,
0.0789794921875,
0.0011377334594726562,
-0.0252838134765625,
0.042999267578125,
0.01149749755859375,
-0.0262298583984375,
0.0325927734375,
0.055389404296875,
0.07611083984375,
-0.03265380859375,
0.004852294921875,
0.049774169921875,
-0.0027179718017578125,
0.0020999908447265625,
0.014312744140625,
-0.0265350341796875,
-0.046295166015625,
-0.026397705078125,
-0.065673828125,
-0.0122222900390625,
0.034210205078125,
-0.050445556640625,
0.0361328125,
-0.0261993408203125,
-0.033111572265625,
0.00911712646484375,
-0.0164947509765625,
-0.058837890625,
0.023834228515625,
0.025543212890625,
0.04193115234375,
-0.067138671875,
0.062744140625,
0.037139892578125,
-0.05316162109375,
-0.07269287109375,
-0.0015077590942382812,
-0.013916015625,
-0.032989501953125,
0.042938232421875,
0.02587890625,
-0.007717132568359375,
0.0023937225341796875,
-0.03955078125,
-0.05645751953125,
0.07708740234375,
0.044403076171875,
-0.04937744140625,
0.009124755859375,
0.00859832763671875,
0.040679931640625,
-0.0277862548828125,
0.044189453125,
0.036834716796875,
0.03460693359375,
0.00885772705078125,
-0.08270263671875,
-0.01418304443359375,
-0.0308837890625,
-0.0258941650390625,
-0.0145111083984375,
-0.03814697265625,
0.090087890625,
-0.01419830322265625,
-0.01279449462890625,
-0.0162506103515625,
0.057098388671875,
0.05078125,
0.02276611328125,
0.0546875,
0.054840087890625,
0.045318603515625,
-0.01280975341796875,
0.07196044921875,
-0.01515960693359375,
0.0265350341796875,
0.0902099609375,
0.0009708404541015625,
0.07952880859375,
0.031707763671875,
-0.030120849609375,
0.03546142578125,
0.0227813720703125,
-0.01033782958984375,
0.041595458984375,
0.00013959407806396484,
-0.0155181884765625,
-0.0230712890625,
-0.01303863525390625,
-0.04229736328125,
0.058837890625,
0.0121307373046875,
-0.0260467529296875,
0.01279449462890625,
0.0105743408203125,
-0.0107421875,
-0.00890350341796875,
-0.002819061279296875,
0.0423583984375,
0.0234527587890625,
-0.0273590087890625,
0.0701904296875,
-0.0207366943359375,
0.08502197265625,
-0.038787841796875,
-0.0031108856201171875,
0.0058441162109375,
0.014984130859375,
-0.01404571533203125,
-0.0430908203125,
0.000002384185791015625,
-0.0150146484375,
-0.0182952880859375,
0.0016183853149414062,
0.02703857421875,
-0.047454833984375,
-0.019744873046875,
0.049346923828125,
0.0125732421875,
0.0151824951171875,
-0.00507354736328125,
-0.05609130859375,
0.01323699951171875,
-0.0025539398193359375,
-0.0080108642578125,
0.0211944580078125,
0.00664520263671875,
0.0283203125,
0.030242919921875,
0.0556640625,
0.0152130126953125,
0.01184844970703125,
0.0240020751953125,
0.045135498046875,
-0.05194091796875,
-0.05584716796875,
-0.041778564453125,
0.051849365234375,
0.0037708282470703125,
-0.01258087158203125,
0.04034423828125,
0.044647216796875,
0.06719970703125,
-0.013580322265625,
0.05126953125,
0.0117950439453125,
0.045684814453125,
-0.034454345703125,
0.0665283203125,
-0.054534912109375,
0.0295257568359375,
-0.04010009765625,
-0.07080078125,
-0.004032135009765625,
0.045989990234375,
-0.0090484619140625,
0.0185089111328125,
0.044708251953125,
0.06732177734375,
-0.0280609130859375,
-0.003757476806640625,
0.0271148681640625,
0.031768798828125,
0.0214385986328125,
0.02911376953125,
0.045318603515625,
-0.06842041015625,
0.047149658203125,
-0.0335693359375,
-0.00748443603515625,
0.0009946823120117188,
-0.040740966796875,
-0.06036376953125,
-0.057464599609375,
-0.037567138671875,
-0.0175018310546875,
-0.01468658447265625,
0.0469970703125,
0.06463623046875,
-0.0704345703125,
-0.0323486328125,
0.0200653076171875,
-0.0014734268188476562,
-0.0248260498046875,
-0.01898193359375,
0.04864501953125,
-0.012939453125,
-0.07012939453125,
0.043609619140625,
-0.0007653236389160156,
0.0159454345703125,
-0.0007548332214355469,
-0.0251312255859375,
-0.037139892578125,
0.00013899803161621094,
0.03369140625,
0.0180206298828125,
-0.050140380859375,
-0.012451171875,
-0.014190673828125,
0.0098724365234375,
0.0230712890625,
0.0274505615234375,
-0.05712890625,
0.03607177734375,
0.04150390625,
0.0264739990234375,
0.0670166015625,
-0.01251220703125,
0.02484130859375,
-0.059356689453125,
0.040740966796875,
0.01506805419921875,
0.02923583984375,
0.03759765625,
0.00968170166015625,
0.0170745849609375,
0.023529052734375,
-0.031646728515625,
-0.07305908203125,
-0.0166015625,
-0.091552734375,
-0.01372528076171875,
0.08819580078125,
-0.0015516281127929688,
-0.0179595947265625,
0.0016851425170898438,
-0.01491546630859375,
0.062744140625,
-0.0263519287109375,
0.041748046875,
0.0268402099609375,
0.008453369140625,
-0.02728271484375,
-0.04559326171875,
0.03472900390625,
0.04388427734375,
-0.02947998046875,
-0.022003173828125,
0.00968170166015625,
0.0294647216796875,
0.037200927734375,
0.050262451171875,
-0.0224761962890625,
0.01934814453125,
0.0253753662109375,
0.031585693359375,
-0.015960693359375,
-0.0193328857421875,
-0.034271240234375,
0.0223388671875,
-0.01837158203125,
-0.03619384765625
]
] |
togethercomputer/RedPajama-INCITE-7B-Base | 2023-06-06T02:44:34.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"en",
"dataset:togethercomputer/RedPajama-Data-1T",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | togethercomputer | null | null | togethercomputer/RedPajama-INCITE-7B-Base | 89 | 16,414 | transformers | 2023-05-04T05:50:06 | ---
license: apache-2.0
language:
- en
datasets:
- togethercomputer/RedPajama-Data-1T
---
# RedPajama-INCITE-7B-Base
RedPajama-INCITE-7B-Base was developed by Together and leaders from the open-source AI community including Ontocord.ai, ETH DS3Lab, AAI CERC, Université de Montréal, MILA - Québec AI Institute, Stanford Center for Research on Foundation Models (CRFM), Stanford Hazy Research research group and LAION.
The training was done on 3,072 V100 GPUs provided as part of the INCITE 2023 project on Scalable Foundation Models for Transferrable Generalist AI, awarded to MILA, LAION, and EleutherAI in fall 2022, with support from the Oak Ridge Leadership Computing Facility (OLCF) and INCITE program.
- Base Model: [RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base)
- Instruction-tuned Version: [RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct)
- Chat Version: [RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat)
## Model Details
- **Developed by**: Together Computer.
- **Model type**: Language Model
- **Language(s)**: English
- **License**: Apache 2.0
- **Model Description**: A 6.9B parameter pretrained language model.
# Quick Start
Please note that the model requires `transformers` version >= 4.25.1.
## GPU Inference
This requires a GPU with 16GB memory.
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Base")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Base", torch_dtype=torch.float16)
model = model.to('cuda:0')
# infer
prompt = "Alan Turing is"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
widely considered to be the father of modern computer science and artificial intelligence. He was a brilliant mathematician and cryptographer, who worked for the British government during World War II. He was instrumental in breaking the German Enigma code, and is credited with helping to shorten the war by two years...
"""
```
## GPU Inference in Int8
This requires a GPU with 12GB memory.
To run inference with int8, please ensure you have installed accelerate and bitandbytes. You can install them with the following command:
```bash
pip install accelerate
pip install bitsandbytes
```
Then you can run inference with int8 as follows:
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Base")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Base", device_map='auto', torch_dtype=torch.float16, load_in_8bit=True)
# infer
prompt = "Alan Turing is"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
a very well-known name in the world of computer science. It is named after the mathematician Alan Turing. He is famous for his work on the Enigma machine, which was used by the Germans during World War II....
"""```
## CPU Inference
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Base")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Base", torch_dtype=torch.bfloat16)
# infer
prompt = "Alan Turing is"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
one of the most important figures in the history of computing. He is best known for his work on the development of the modern computer and for his code-breaking work during World War II. He was also a brilliant mathematician and philosopher.
"""
```
Please note that since `LayerNormKernelImpl` is not implemented in fp16 for CPU, we use `bfloat16` for CPU inference.
# Uses
## Direct Use
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
It is the responsibility of the end user to ensure that the model is used in a responsible and ethical manner.
#### Out-of-Scope Use
`RedPajama-INCITE-7B-Base` is a language model and may not perform well for other use cases outside of its intended scope.
For example, it may not be suitable for use in safety-critical applications or for making decisions that have a significant impact on individuals or society.
It is important to consider the limitations of the model and to only use it for its intended purpose.
#### Misuse and Malicious Use
`RedPajama-INCITE-7B-Base` is designed for language modeling.
Misuse of the model, such as using it to engage in illegal or unethical activities, is strictly prohibited and goes against the principles of the project.
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating fake news, misinformation, or propaganda
- Promoting hate speech, discrimination, or violence against individuals or groups
- Impersonating individuals or organizations without their consent
- Engaging in cyberbullying or harassment
- Defamatory content
- Spamming or scamming
- Sharing confidential or sensitive information without proper authorization
- Violating the terms of use of the model or the data used to train it
- Creating automated bots for malicious purposes such as spreading malware, phishing scams, or spamming
## Limitations
`RedPajama-INCITE-7B-Base`, like other language models, has limitations that should be taken into consideration.
For example, the model may not always provide accurate or relevant answers, particularly for questions that are complex, ambiguous, or outside of its training data.
We therefore welcome contributions from individuals and organizations, and encourage collaboration towards creating a more robust and inclusive chatbot.
## Training
**Training Data**
Please refer to [togethercomputer/RedPajama-Data-1T](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
**Training Procedure**
- **Hardware:** 512 nodes of 6xV100 (IBM Power9), on the OLCF Summit cluster
- **Optimizer:** Apex FusedAdam
- **Parallelism:** Pipeline parallel 12, tensor parallel 2
- **Gradient Accumulations**: 8 (global batch size 4M tokens)
- **Num of Tokens:** 1.001T Tokens
- **Learning rate:** 0.00012
## Benchmark
Please refer to our [blog post](https://together.xyz) for benchmark results.
## Intermediate Checkpoints
We provide 11 intermediate checkpoints that have been released for study.
The checkpoints are organized based on the number of tokens they contain, ranging from 240 billion tokens to 1 trillion tokens.
- [240b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/240b_tokens)
- [280b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/280b_tokens)
- [400b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/400b_tokens)
- [440b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/440b_tokens)
- [500b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/500b_tokens)
- [600b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/600b_tokens)
- [700b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/700b_tokens)
- [720b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/720b_tokens)
- [960b_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/960b_tokens)
- [1t_tokens](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/1t_tokens)
- [latest](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base/tree/main)
## Community
Join us on [Together Discord](https://discord.gg/6ZVDU8tTD4) | 9,424 | [
[
-0.03387451171875,
-0.06915283203125,
0.020721435546875,
0.02215576171875,
-0.001995086669921875,
-0.007843017578125,
-0.0255279541015625,
-0.0298309326171875,
0.01113128662109375,
0.0227203369140625,
-0.04071044921875,
-0.03277587890625,
-0.06182861328125,
-0.0022792816162109375,
-0.03558349609375,
0.059967041015625,
0.0007314682006835938,
-0.01422119140625,
-0.016937255859375,
-0.003231048583984375,
-0.0241241455078125,
-0.03436279296875,
-0.047119140625,
-0.038543701171875,
-0.0005431175231933594,
0.0175933837890625,
0.03790283203125,
0.032806396484375,
0.04071044921875,
0.0300750732421875,
-0.006374359130859375,
0.0029087066650390625,
-0.0335693359375,
-0.008575439453125,
0.009918212890625,
-0.040740966796875,
-0.0202789306640625,
-0.01100921630859375,
0.03826904296875,
0.032073974609375,
0.00203704833984375,
0.02130126953125,
-0.012176513671875,
0.037078857421875,
-0.0423583984375,
0.0235137939453125,
-0.048248291015625,
0.00041294097900390625,
-0.01165771484375,
-0.00203704833984375,
-0.0377197265625,
-0.0167388916015625,
-0.0038776397705078125,
-0.036865234375,
0.01416015625,
-0.00844573974609375,
0.0701904296875,
0.030364990234375,
-0.00396728515625,
-0.01422119140625,
-0.042724609375,
0.068359375,
-0.06854248046875,
0.020721435546875,
0.029022216796875,
0.0186614990234375,
-0.02471923828125,
-0.07806396484375,
-0.056304931640625,
-0.0157012939453125,
-0.00981903076171875,
-0.0019474029541015625,
-0.0260772705078125,
-0.004703521728515625,
0.035919189453125,
0.02215576171875,
-0.040008544921875,
-0.00955963134765625,
-0.05230712890625,
-0.00689697265625,
0.046783447265625,
0.01128387451171875,
0.035430908203125,
-0.00732421875,
-0.020233154296875,
-0.0215606689453125,
-0.0413818359375,
0.007785797119140625,
0.020843505859375,
0.015716552734375,
-0.03753662109375,
0.03338623046875,
-0.01044464111328125,
0.042449951171875,
0.006816864013671875,
-0.01806640625,
0.048828125,
-0.0167999267578125,
-0.0304718017578125,
0.0031795501708984375,
0.085205078125,
0.006816864013671875,
0.01139068603515625,
0.004119873046875,
0.0096893310546875,
0.0138702392578125,
0.006134033203125,
-0.061676025390625,
-0.0423583984375,
0.03277587890625,
-0.0277252197265625,
-0.01268768310546875,
0.00917816162109375,
-0.043670654296875,
-0.00911712646484375,
0.006671905517578125,
0.04595947265625,
-0.036529541015625,
-0.031982421875,
0.007701873779296875,
-0.0221710205078125,
0.014190673828125,
0.0015573501586914062,
-0.076171875,
0.0203704833984375,
0.02960205078125,
0.054412841796875,
-0.0007319450378417969,
-0.041534423828125,
-0.005947113037109375,
-0.0036182403564453125,
-0.007617950439453125,
0.026397705078125,
-0.0272979736328125,
-0.0172576904296875,
-0.026763916015625,
0.0007505416870117188,
-0.013214111328125,
-0.03204345703125,
0.022735595703125,
-0.0193939208984375,
0.03839111328125,
0.0122222900390625,
-0.037506103515625,
-0.0245361328125,
0.009063720703125,
-0.0309295654296875,
0.0870361328125,
0.01390838623046875,
-0.06817626953125,
0.001613616943359375,
-0.0677490234375,
-0.03399658203125,
-0.004726409912109375,
-0.02386474609375,
-0.050872802734375,
-0.00414276123046875,
0.0207672119140625,
0.028167724609375,
-0.0261993408203125,
0.01715087890625,
-0.0191650390625,
-0.031707763671875,
0.01453399658203125,
-0.033447265625,
0.10595703125,
0.00975799560546875,
-0.068115234375,
0.01355743408203125,
-0.05230712890625,
0.007781982421875,
0.031524658203125,
-0.00807952880859375,
0.010589599609375,
-0.0179443359375,
-0.008758544921875,
0.015869140625,
0.0284423828125,
-0.0390625,
0.00812530517578125,
-0.04388427734375,
0.057373046875,
0.051025390625,
-0.0053863525390625,
0.0189208984375,
-0.0219879150390625,
0.0308837890625,
0.003055572509765625,
0.022003173828125,
-0.010589599609375,
-0.062744140625,
-0.0694580078125,
-0.0254364013671875,
0.011505126953125,
0.038238525390625,
-0.03839111328125,
0.052703857421875,
-0.004673004150390625,
-0.053863525390625,
-0.041778564453125,
-0.010955810546875,
0.0263519287109375,
0.03045654296875,
0.04266357421875,
-0.0186920166015625,
-0.05120849609375,
-0.0631103515625,
0.0007452964782714844,
-0.01708984375,
0.006893157958984375,
0.02838134765625,
0.053375244140625,
-0.036376953125,
0.057586669921875,
-0.038543701171875,
-0.00511932373046875,
-0.01384735107421875,
0.00841522216796875,
0.03973388671875,
0.060455322265625,
0.03814697265625,
-0.0386962890625,
-0.039093017578125,
-0.004932403564453125,
-0.0635986328125,
0.0096435546875,
-0.00408935546875,
-0.0122528076171875,
0.02264404296875,
0.029876708984375,
-0.0643310546875,
0.03790283203125,
0.04010009765625,
-0.0310821533203125,
0.043121337890625,
-0.01776123046875,
0.01447296142578125,
-0.0784912109375,
0.0160369873046875,
-0.0190277099609375,
-0.00632476806640625,
-0.040374755859375,
0.00733184814453125,
-0.0032787322998046875,
0.000008165836334228516,
-0.054931640625,
0.0584716796875,
-0.02740478515625,
0.00832366943359375,
-0.009979248046875,
-0.01392364501953125,
-0.01299285888671875,
0.054351806640625,
-0.00833892822265625,
0.051239013671875,
0.0579833984375,
-0.05035400390625,
0.031646728515625,
0.022064208984375,
-0.0216827392578125,
0.007083892822265625,
-0.057342529296875,
0.01422119140625,
0.00629425048828125,
0.005023956298828125,
-0.072265625,
-0.006519317626953125,
0.03570556640625,
-0.06353759765625,
0.02117919921875,
-0.00623321533203125,
-0.038360595703125,
-0.034210205078125,
-0.01016998291015625,
0.0307464599609375,
0.06634521484375,
-0.04400634765625,
0.05328369140625,
0.040679931640625,
0.01215362548828125,
-0.051910400390625,
-0.057952880859375,
-0.0097198486328125,
-0.0168304443359375,
-0.05975341796875,
0.020782470703125,
-0.0186767578125,
-0.018341064453125,
-0.005451202392578125,
-0.00232696533203125,
-0.0046234130859375,
0.0193328857421875,
0.028076171875,
0.0169219970703125,
-0.00022602081298828125,
-0.018951416015625,
-0.004817962646484375,
-0.006885528564453125,
0.027130126953125,
-0.01006317138671875,
0.06719970703125,
-0.028228759765625,
-0.01788330078125,
-0.052459716796875,
0.0036869049072265625,
0.04473876953125,
0.002147674560546875,
0.06817626953125,
0.0535888671875,
-0.037811279296875,
-0.01531982421875,
-0.031280517578125,
-0.034515380859375,
-0.0390625,
0.031982421875,
-0.0277099609375,
-0.0552978515625,
0.037872314453125,
0.0202178955078125,
0.0206146240234375,
0.061248779296875,
0.0670166015625,
0.0008015632629394531,
0.07501220703125,
0.03131103515625,
-0.004901885986328125,
0.0382080078125,
-0.0556640625,
0.010467529296875,
-0.052520751953125,
-0.0236053466796875,
-0.0443115234375,
-0.0189971923828125,
-0.04412841796875,
-0.0275726318359375,
0.017822265625,
0.0011920928955078125,
-0.05108642578125,
0.03814697265625,
-0.0635986328125,
0.0240325927734375,
0.060272216796875,
0.0004265308380126953,
-0.003505706787109375,
0.0063018798828125,
-0.01311492919921875,
0.0102386474609375,
-0.061248779296875,
-0.0193939208984375,
0.0794677734375,
0.0281982421875,
0.05517578125,
-0.006671905517578125,
0.052459716796875,
0.004947662353515625,
0.014617919921875,
-0.01885986328125,
0.04107666015625,
0.0033702850341796875,
-0.05902099609375,
-0.0178375244140625,
-0.040740966796875,
-0.07440185546875,
0.0182342529296875,
-0.009033203125,
-0.06884765625,
-0.002498626708984375,
0.0195159912109375,
-0.006816864013671875,
0.02838134765625,
-0.06439208984375,
0.080322265625,
-0.016204833984375,
-0.00917816162109375,
-0.0189971923828125,
-0.054473876953125,
0.033721923828125,
0.024322509765625,
0.00679779052734375,
-0.00782012939453125,
0.023345947265625,
0.06732177734375,
-0.0308837890625,
0.06890869140625,
-0.01213836669921875,
0.0257110595703125,
0.02984619140625,
-0.0035858154296875,
0.0299530029296875,
-0.001628875732421875,
0.00820159912109375,
0.0484619140625,
-0.0011034011840820312,
-0.033233642578125,
-0.029083251953125,
0.0677490234375,
-0.09405517578125,
-0.037261962890625,
-0.040191650390625,
-0.03472900390625,
0.00530242919921875,
0.0262603759765625,
0.0390625,
0.01971435546875,
0.00931549072265625,
0.0198211669921875,
0.04571533203125,
-0.01806640625,
0.0341796875,
0.003253936767578125,
-0.00717926025390625,
-0.04144287109375,
0.069580078125,
-0.004871368408203125,
0.00925445556640625,
0.01312255859375,
0.01513671875,
-0.0197296142578125,
-0.03173828125,
-0.039154052734375,
0.031158447265625,
-0.051055908203125,
-0.01424407958984375,
-0.060272216796875,
-0.0333251953125,
-0.03753662109375,
-0.00899505615234375,
-0.03436279296875,
-0.04241943359375,
-0.037261962890625,
0.01105499267578125,
0.032073974609375,
0.0285186767578125,
0.005001068115234375,
0.020416259765625,
-0.03790283203125,
0.02166748046875,
0.006710052490234375,
0.01526641845703125,
0.005214691162109375,
-0.057769775390625,
-0.01885986328125,
0.004001617431640625,
-0.027435302734375,
-0.040985107421875,
0.0252532958984375,
0.002559661865234375,
0.050933837890625,
0.0141754150390625,
0.004886627197265625,
0.047149658203125,
-0.0201568603515625,
0.06683349609375,
0.02703857421875,
-0.06878662109375,
0.041168212890625,
-0.0014181137084960938,
0.038909912109375,
0.0295867919921875,
0.0267791748046875,
-0.0257720947265625,
-0.046173095703125,
-0.083740234375,
-0.07354736328125,
0.06988525390625,
0.038055419921875,
0.006771087646484375,
-0.00750732421875,
0.01383209228515625,
0.0008039474487304688,
0.018768310546875,
-0.07244873046875,
-0.054901123046875,
-0.0244598388671875,
-0.0294952392578125,
0.01026153564453125,
0.00238037109375,
-0.01049041748046875,
-0.029571533203125,
0.080322265625,
0.00817108154296875,
0.053131103515625,
0.006069183349609375,
-0.0213623046875,
0.000499725341796875,
-0.00572967529296875,
0.04852294921875,
0.0675048828125,
-0.01313018798828125,
0.006916046142578125,
0.0201416015625,
-0.044708251953125,
0.0033111572265625,
0.01213836669921875,
-0.023956298828125,
-0.0033588409423828125,
0.0318603515625,
0.074462890625,
-0.0067138671875,
-0.043365478515625,
0.0218505859375,
-0.0218505859375,
-0.0207672119140625,
-0.0396728515625,
0.03271484375,
0.0207672119140625,
0.0237884521484375,
0.0201568603515625,
0.00445556640625,
-0.0152435302734375,
-0.025970458984375,
0.006439208984375,
0.03936767578125,
-0.01432037353515625,
-0.0207672119140625,
0.06646728515625,
0.005458831787109375,
-0.035247802734375,
0.060455322265625,
-0.0032596588134765625,
-0.031982421875,
0.06378173828125,
0.0498046875,
0.0655517578125,
-0.00043773651123046875,
-0.0119171142578125,
0.048736572265625,
0.0297393798828125,
-0.00745391845703125,
0.0209503173828125,
-0.0016193389892578125,
-0.051025390625,
-0.01788330078125,
-0.03875732421875,
-0.0160675048828125,
0.0145416259765625,
-0.0241851806640625,
0.02093505859375,
-0.047515869140625,
-0.01496124267578125,
-0.00357818603515625,
0.01096343994140625,
-0.054443359375,
0.006866455078125,
0.005580902099609375,
0.0648193359375,
-0.059722900390625,
0.0648193359375,
0.046142578125,
-0.036346435546875,
-0.05963134765625,
-0.02252197265625,
-0.01354217529296875,
-0.06524658203125,
0.043487548828125,
0.0233917236328125,
-0.0008463859558105469,
0.012725830078125,
-0.046844482421875,
-0.0638427734375,
0.081298828125,
0.043243408203125,
-0.0231475830078125,
0.005397796630859375,
-0.007350921630859375,
0.0279388427734375,
-0.01456451416015625,
0.0489501953125,
0.04547119140625,
0.0341796875,
-0.0016393661499023438,
-0.0684814453125,
0.00421142578125,
-0.0236663818359375,
-0.01428985595703125,
0.0263214111328125,
-0.047515869140625,
0.0838623046875,
-0.018310546875,
-0.00968170166015625,
-0.00469970703125,
0.07012939453125,
0.02825927734375,
0.002628326416015625,
0.028961181640625,
0.058868408203125,
0.052459716796875,
-0.00914764404296875,
0.07208251953125,
-0.041748046875,
0.050750732421875,
0.06561279296875,
0.022705078125,
0.046722412109375,
0.0278778076171875,
-0.03338623046875,
0.047332763671875,
0.029693603515625,
-0.00868988037109375,
0.0345458984375,
0.0089263916015625,
-0.017059326171875,
-0.0001023411750793457,
0.0268402099609375,
-0.041656494140625,
0.0183868408203125,
0.027374267578125,
-0.033050537109375,
0.0006413459777832031,
-0.0135498046875,
0.0106658935546875,
-0.0207977294921875,
-0.009307861328125,
0.040985107421875,
0.00684356689453125,
-0.042999267578125,
0.07855224609375,
0.012237548828125,
0.064208984375,
-0.047515869140625,
-0.0028820037841796875,
-0.00951385498046875,
0.0295257568359375,
-0.019195556640625,
-0.041748046875,
0.0178070068359375,
-0.01183319091796875,
-0.007358551025390625,
0.0040130615234375,
0.037261962890625,
-0.0294647216796875,
-0.042449951171875,
0.0173187255859375,
0.00620269775390625,
0.0357666015625,
0.0023174285888671875,
-0.072509765625,
0.0299530029296875,
0.01383209228515625,
-0.0272369384765625,
0.019134521484375,
0.00823974609375,
0.007678985595703125,
0.0556640625,
0.04278564453125,
-0.0030155181884765625,
0.020416259765625,
-0.0177154541015625,
0.0572509765625,
-0.05303955078125,
-0.036529541015625,
-0.07708740234375,
0.039154052734375,
0.003955841064453125,
-0.0347900390625,
0.061248779296875,
0.040985107421875,
0.087646484375,
-0.0028514862060546875,
0.060882568359375,
-0.033447265625,
0.0092315673828125,
-0.023681640625,
0.058258056640625,
-0.0362548828125,
0.003002166748046875,
-0.01336669921875,
-0.062744140625,
-0.0174560546875,
0.07769775390625,
-0.007221221923828125,
0.0202789306640625,
0.054931640625,
0.06329345703125,
-0.00539398193359375,
-0.01061248779296875,
0.0098419189453125,
0.031982421875,
0.037109375,
0.052703857421875,
0.034759521484375,
-0.052154541015625,
0.041656494140625,
-0.04913330078125,
-0.0192718505859375,
-0.0190277099609375,
-0.051239013671875,
-0.08062744140625,
-0.047821044921875,
-0.0280609130859375,
-0.04327392578125,
0.00104522705078125,
0.0894775390625,
0.060150146484375,
-0.06365966796875,
-0.0290679931640625,
-0.01544189453125,
0.01407623291015625,
-0.02471923828125,
-0.019989013671875,
0.03778076171875,
-0.021270751953125,
-0.06683349609375,
0.0211334228515625,
0.012359619140625,
0.000896453857421875,
-0.0325927734375,
-0.0149688720703125,
-0.0255584716796875,
-0.00797271728515625,
0.03509521484375,
0.04083251953125,
-0.054168701171875,
-0.0122528076171875,
-0.0133056640625,
-0.00969696044921875,
0.0225372314453125,
0.0310821533203125,
-0.061492919921875,
0.03179931640625,
0.0423583984375,
0.046142578125,
0.060943603515625,
-0.0182342529296875,
0.026031494140625,
-0.03839111328125,
0.021820068359375,
0.0219879150390625,
0.03277587890625,
0.02166748046875,
-0.03424072265625,
0.0254058837890625,
0.029052734375,
-0.04803466796875,
-0.06427001953125,
0.004306793212890625,
-0.0693359375,
-0.026336669921875,
0.07525634765625,
-0.005077362060546875,
-0.039154052734375,
0.001068115234375,
-0.00565338134765625,
0.0217437744140625,
-0.0306243896484375,
0.06719970703125,
0.03631591796875,
-0.035430908203125,
-0.01523590087890625,
-0.0265655517578125,
0.0246124267578125,
0.0237884521484375,
-0.063720703125,
-0.0007290840148925781,
0.014495849609375,
0.033935546875,
0.0171356201171875,
0.0643310546875,
-0.0067596435546875,
0.029754638671875,
0.0119171142578125,
0.031219482421875,
-0.023529052734375,
-0.0245361328125,
-0.015106201171875,
0.010833740234375,
-0.0000489354133605957,
-0.01995849609375
]
] |
amazon/FalconLite | 2023-08-03T15:29:56.000Z | [
"transformers",
"RefinedWeb",
"text-generation",
"custom_code",
"license:apache-2.0",
"text-generation-inference",
"region:us"
] | text-generation | amazon | null | null | amazon/FalconLite | 171 | 16,407 | transformers | 2023-08-01T14:18:59 | ---
license: apache-2.0
inference: false
---
# FalconLite Model
FalconLite is a quantized version of the [Falcon 40B SFT OASST-TOP1 model](https://huggingface.co/OpenAssistant/falcon-40b-sft-top1-560), capable of processing long (i.e. 11K tokens) input sequences while consuming 4x less GPU memory. By utilizing 4-bit [GPTQ quantization](https://github.com/PanQiWei/AutoGPTQ) and adapted [dynamic NTK](https://www.reddit.com/r/LocalLLaMA/comments/14mrgpr/dynamically_scaled_rope_further_increases/) RotaryEmbedding, FalconLite achieves a balance between latency, accuracy, and memory efficiency. With the ability to process 5x longer contexts than the original model, FalconLite is useful for applications such as topic retrieval, summarization, and question-answering. FalconLite can be deployed on a single AWS `g5.12x` instance with [TGI 0.9.2](https://github.com/huggingface/text-generation-inference/tree/v0.9.2), making it suitable for applications that require high performance in resource-constrained environments.
## Model Details
- **Developed by:** [AWS Contributors](https://github.com/orgs/aws-samples/teams/aws-prototype-ml-apac)
- **Model type:** [Falcon 40B](https://huggingface.co/tiiuae/falcon-40b)
- **Language:** English
- **Quantized from weights:** [Falcon 40B SFT OASST-TOP1 model](https://huggingface.co/OpenAssistant/falcon-40b-sft-top1-560)
- **Modified from layers:** [Text-Generation-Inference 0.9.2](https://github.com/huggingface/text-generation-inference/tree/v0.9.2)
- **License:** Apache 2.0
- **Contact:** [GitHub issues](https://github.com/awslabs/extending-the-context-length-of-open-source-llms/issues)
- **Blogpost:** [Extend the context length of Falcon40B to 10k](https://medium.com/@chenwuperth/extend-the-context-length-of-falcon40b-to-10k-85d81d32146f)
## Deploy FalconLite ##
SSH login to an AWS `g5.12x` instance with the [Deep Learning AMI](https://aws.amazon.com/releasenotes/aws-deep-learning-ami-gpu-pytorch-2-0-ubuntu-20-04/).
### Start LLM server
```bash
git clone https://github.com/awslabs/extending-the-context-length-of-open-source-llms.git falconlite-dev
cd falconlite-dev/script
./docker_build.sh
./start_falconlite.sh
```
### Perform inference
```bash
# after FalconLite has been completely started
pip install -r requirements-client.txt
python falconlite_client.py
```
### *New!* Amazon SageMaker Deployment ###
To deploy FalconLite on SageMaker endpoint, please follow [this notebook](https://github.com/awslabs/extending-the-context-length-of-open-source-llms/blob/main/custom-tgi-ecr/deploy.ipynb).
**Important** - When using FalconLite for inference for the first time, it may require a brief 'warm-up' period that can take 10s of seconds. However, subsequent inferences should be faster and return results in a more timely manner. This warm-up period is normal and should not affect the overall performance of the system once the initialisation period has been completed.
## Evalution Result ##
We evaluated FalconLite against benchmarks that are specifically designed to assess the capabilities of LLMs in handling longer contexts. All evaluations were conducted without fine-tuning the model.
### Accuracy ###
|Eval task|Input length| Input length | Input length| Input length|
|----------|-------------|-------------|------------|-----------|
| | 2800 ~ 3800| 5500 ~ 5600 |7500 ~ 8300 | 9300 ~ 11000 |
| [Topic Retrieval](https://lmsys.org/blog/2023-06-29-longchat/) | 100% | 100% | 92% | 92% |
| [Line Retrieval](https://lmsys.org/blog/2023-06-29-longchat/#longeval-results) | 38% | 12% | 8% | 4% |
| [Pass key Retrieval](https://github.com/epfml/landmark-attention/blob/main/llama/run_test.py#L101) | 100% | 100% | 100% | 100% |
|Eval task| Test set Accuracy | Hard subset Accuracy|
|----------|-------------|-------------|
| [Question Answering with Long Input Texts](https://nyu-mll.github.io/quality/) | 46.9% | 40.8% |
### Performance ###
**metrics** = the average number of generated tokens per second (TPS) =
`nb-generated-tokens` / `end-to-end-response-time`
The `end-to-end-response-time` = when the last token is generated - when the inference request is received
|Instance| Input length | Input length| Input length|Input length|
|----------|-------------|-------------|------------|------------|
| | 20 | 3300 | 5500 |10000 |
| g5.48x | 22 tps | 12 tps | 12 tps | 12 tps |
| g5.12x | 18 tps | 11 tps | 11 tps | 10 tps |
## Limitations ##
* Our evaluation shows that FalconLite's capability in `Line Retrieval` is limited, and requires further effort.
* While `g5.12x` is sufficient for FalconLite to handle 10K long contexts, a larger instance with more memory capcacity such as `g5.48x` is recommended for sustained, heavy workloads.
* Before using the FalconLite model, it is important to perform your own independent assessment, and take measures to ensure that your use would comply with your own specific quality control practices and standards, and that your use would comply with the local rules, laws, regulations, licenses and terms that apply to you, and your content. | 5,172 | [
[
-0.03265380859375,
-0.06842041015625,
0.047149658203125,
0.0256500244140625,
-0.00994873046875,
-0.0021724700927734375,
-0.007411956787109375,
-0.03387451171875,
0.0203857421875,
0.0216217041015625,
-0.042083740234375,
-0.047210693359375,
-0.036651611328125,
-0.004123687744140625,
-0.03179931640625,
0.0728759765625,
-0.00047898292541503906,
-0.01629638671875,
-0.006122589111328125,
-0.0038909912109375,
-0.0169219970703125,
-0.03125,
-0.061767578125,
-0.0029735565185546875,
0.018585205078125,
0.02789306640625,
0.040496826171875,
0.03680419921875,
0.04559326171875,
0.0203704833984375,
-0.0263671875,
0.01556396484375,
-0.037994384765625,
-0.005641937255859375,
-0.00034737586975097656,
-0.017333984375,
-0.050537109375,
-0.01110076904296875,
0.0616455078125,
0.04083251953125,
-0.0073394775390625,
0.0270538330078125,
0.006130218505859375,
0.05780029296875,
-0.056549072265625,
0.01122283935546875,
-0.0309295654296875,
0.0117645263671875,
-0.01227569580078125,
-0.0013837814331054688,
-0.0229339599609375,
0.00853729248046875,
0.00335693359375,
-0.050750732421875,
0.0141143798828125,
0.01361083984375,
0.071044921875,
0.0286102294921875,
-0.03009033203125,
-0.0153961181640625,
-0.0258331298828125,
0.0701904296875,
-0.06475830078125,
0.032073974609375,
0.0251617431640625,
0.027496337890625,
-0.005893707275390625,
-0.08319091796875,
-0.0149383544921875,
-0.011871337890625,
-0.0189361572265625,
0.0229644775390625,
0.005970001220703125,
0.01439666748046875,
0.036529541015625,
0.0287933349609375,
-0.040313720703125,
-0.004573822021484375,
-0.0249786376953125,
-0.0115966796875,
0.05596923828125,
0.0233917236328125,
-0.0080718994140625,
-0.0267181396484375,
-0.032379150390625,
0.0028667449951171875,
-0.042388916015625,
0.03460693359375,
0.00970458984375,
0.0162506103515625,
0.0013322830200195312,
0.0289154052734375,
-0.03631591796875,
0.0318603515625,
0.0188140869140625,
-0.007965087890625,
0.02398681640625,
-0.042633056640625,
-0.04010009765625,
0.007511138916015625,
0.072021484375,
0.0186004638671875,
-0.01087188720703125,
0.0015172958374023438,
-0.020294189453125,
-0.0183868408203125,
0.0142059326171875,
-0.10101318359375,
0.0099945068359375,
0.032379150390625,
-0.03338623046875,
-0.0196533203125,
0.01172637939453125,
-0.04840087890625,
-0.01033782958984375,
0.024810791015625,
0.01971435546875,
-0.032470703125,
-0.004772186279296875,
0.01412200927734375,
-0.001834869384765625,
0.0007376670837402344,
0.00991058349609375,
-0.07080078125,
0.035491943359375,
0.055999755859375,
0.07122802734375,
-0.0021190643310546875,
-0.02783203125,
-0.051788330078125,
-0.01120758056640625,
-0.031494140625,
0.031951904296875,
-0.0213623046875,
-0.0477294921875,
-0.0211944580078125,
0.0194244384765625,
-0.01444244384765625,
-0.0223236083984375,
0.06414794921875,
-0.032562255859375,
0.0151519775390625,
-0.032012939453125,
-0.05645751953125,
-0.0133819580078125,
-0.01000213623046875,
-0.036163330078125,
0.097900390625,
0.0179595947265625,
-0.057830810546875,
0.021728515625,
-0.056640625,
-0.00913238525390625,
0.011871337890625,
-0.007221221923828125,
-0.03802490234375,
-0.0024242401123046875,
0.0185394287109375,
0.049407958984375,
-0.0149993896484375,
0.0304718017578125,
-0.0188446044921875,
-0.07049560546875,
0.0076751708984375,
-0.01543426513671875,
0.05560302734375,
0.039520263671875,
-0.03594970703125,
0.01190185546875,
-0.035736083984375,
0.009368896484375,
0.00975799560546875,
-0.021331787109375,
0.0082855224609375,
0.00891876220703125,
0.0053558349609375,
-0.001979827880859375,
0.0238189697265625,
-0.045379638671875,
0.005016326904296875,
-0.0298919677734375,
0.052490234375,
0.04083251953125,
0.00902557373046875,
0.0288238525390625,
-0.03570556640625,
0.0260009765625,
0.00391387939453125,
0.0193634033203125,
-0.0213165283203125,
-0.02960205078125,
-0.078125,
-0.023101806640625,
0.007724761962890625,
0.035125732421875,
-0.03411865234375,
0.0294189453125,
-0.032684326171875,
-0.042144775390625,
-0.0310821533203125,
0.00885772705078125,
0.04400634765625,
0.049896240234375,
0.046844482421875,
0.0011157989501953125,
-0.035888671875,
-0.0728759765625,
-0.01100921630859375,
-0.01200103759765625,
0.0037384033203125,
0.042877197265625,
0.057159423828125,
-0.0225677490234375,
0.06964111328125,
-0.03753662109375,
-0.0182952880859375,
0.0003287792205810547,
-0.0057525634765625,
0.041717529296875,
0.0293731689453125,
0.0628662109375,
-0.049774169921875,
-0.04278564453125,
-0.01282501220703125,
-0.057159423828125,
-0.004169464111328125,
-0.007778167724609375,
-0.008758544921875,
0.037628173828125,
0.0338134765625,
-0.051116943359375,
0.027099609375,
0.02813720703125,
-0.04742431640625,
0.047393798828125,
-0.005664825439453125,
0.0208740234375,
-0.086181640625,
0.028350830078125,
0.006481170654296875,
-0.01348114013671875,
-0.0390625,
0.0200042724609375,
0.028350830078125,
0.0073394775390625,
-0.051300048828125,
0.052978515625,
-0.031951904296875,
-0.00844573974609375,
-0.002166748046875,
-0.00261688232421875,
0.0088958740234375,
0.031158447265625,
-0.007259368896484375,
0.107421875,
0.0186920166015625,
-0.043487548828125,
0.00675201416015625,
0.0101318359375,
-0.032012939453125,
0.0301361083984375,
-0.06243896484375,
-0.005733489990234375,
0.0106201171875,
0.0304412841796875,
-0.06976318359375,
-0.016265869140625,
0.0083465576171875,
-0.0360107421875,
0.00606536865234375,
-0.0116119384765625,
-0.0248870849609375,
-0.0301361083984375,
-0.035491943359375,
0.0173492431640625,
0.055389404296875,
-0.03753662109375,
0.0287933349609375,
0.0000959634780883789,
-0.00634002685546875,
-0.0633544921875,
-0.0625,
0.00555419921875,
-0.027801513671875,
-0.061370849609375,
0.054229736328125,
-0.0166168212890625,
-0.01122283935546875,
-0.005340576171875,
-0.0085296630859375,
-0.0004343986511230469,
0.0222015380859375,
0.03326416015625,
0.011871337890625,
-0.022796630859375,
0.0137481689453125,
0.0121307373046875,
-0.009857177734375,
0.004375457763671875,
-0.00994110107421875,
0.03289794921875,
-0.046875,
-0.02154541015625,
-0.03607177734375,
0.01071929931640625,
0.0413818359375,
-0.0069580078125,
0.058074951171875,
0.043670654296875,
-0.0192413330078125,
-0.00801849365234375,
-0.045989990234375,
-0.01502227783203125,
-0.042938232421875,
0.037689208984375,
-0.043365478515625,
-0.0672607421875,
0.04559326171875,
0.01198577880859375,
0.0147247314453125,
0.05670166015625,
0.017730712890625,
-0.0103759765625,
0.061187744140625,
0.0208282470703125,
-0.004058837890625,
0.033721923828125,
-0.046905517578125,
-0.0018987655639648438,
-0.0888671875,
-0.00931549072265625,
-0.0263671875,
-0.01371002197265625,
-0.0291290283203125,
-0.023101806640625,
0.01541900634765625,
0.019378662109375,
-0.038818359375,
0.0163421630859375,
-0.03973388671875,
0.02001953125,
0.0234832763671875,
0.0026683807373046875,
0.01434326171875,
-0.008270263671875,
-0.022308349609375,
-0.0018749237060546875,
-0.04620361328125,
-0.040313720703125,
0.0911865234375,
0.0254364013671875,
0.033935546875,
0.0035381317138671875,
0.06610107421875,
0.01169586181640625,
0.0181121826171875,
-0.0543212890625,
0.05218505859375,
-0.00531005859375,
-0.042144775390625,
-0.018218994140625,
-0.0347900390625,
-0.0562744140625,
0.02459716796875,
-0.0035190582275390625,
-0.060516357421875,
-0.00821685791015625,
-0.010101318359375,
-0.035247802734375,
0.026580810546875,
-0.04931640625,
0.05267333984375,
-0.0176239013671875,
-0.044708251953125,
0.0028705596923828125,
-0.03680419921875,
0.0272674560546875,
-0.0175628662109375,
0.018646240234375,
-0.007732391357421875,
-0.0239715576171875,
0.06982421875,
-0.05230712890625,
0.045135498046875,
-0.018798828125,
0.00830078125,
0.0271453857421875,
-0.00695037841796875,
0.04461669921875,
0.01490020751953125,
-0.0258636474609375,
0.016143798828125,
0.022796630859375,
-0.038818359375,
-0.039398193359375,
0.07342529296875,
-0.05706787109375,
-0.05560302734375,
-0.0399169921875,
-0.03485107421875,
0.000010788440704345703,
0.01174163818359375,
0.021026611328125,
0.01409912109375,
-0.01273345947265625,
0.0355224609375,
0.04315185546875,
-0.00572967529296875,
0.060272216796875,
0.027923583984375,
-0.034423828125,
-0.0301666259765625,
0.05426025390625,
0.0244903564453125,
0.01444244384765625,
0.01366424560546875,
0.00115966796875,
-0.0249481201171875,
-0.051971435546875,
-0.0430908203125,
0.0186309814453125,
-0.048004150390625,
-0.006687164306640625,
-0.056243896484375,
-0.039794921875,
-0.0239105224609375,
-0.019500732421875,
-0.03436279296875,
-0.03863525390625,
-0.048736572265625,
-0.0101470947265625,
0.037811279296875,
0.0274658203125,
0.02459716796875,
0.0302581787109375,
-0.05712890625,
-0.007511138916015625,
0.0076904296875,
0.020782470703125,
-0.00391387939453125,
-0.0570068359375,
-0.0218658447265625,
0.0192413330078125,
-0.0210723876953125,
-0.04638671875,
0.0302581787109375,
0.019989013671875,
0.038330078125,
0.033355712890625,
0.008392333984375,
0.0635986328125,
-0.02520751953125,
0.07965087890625,
-0.005558013916015625,
-0.0699462890625,
0.038238525390625,
-0.027130126953125,
0.0223388671875,
0.052886962890625,
0.033355712890625,
-0.042510986328125,
-0.004825592041015625,
-0.05224609375,
-0.06378173828125,
0.04119873046875,
0.0273590087890625,
-0.0036373138427734375,
-0.002880096435546875,
0.040283203125,
-0.012847900390625,
-0.00015854835510253906,
-0.0251617431640625,
-0.01605224609375,
-0.01445770263671875,
-0.0115509033203125,
-0.0269317626953125,
-0.027984619140625,
-0.00029850006103515625,
-0.017059326171875,
0.057830810546875,
-0.01556396484375,
0.03466796875,
0.02911376953125,
-0.018341064453125,
0.004032135009765625,
0.01303863525390625,
0.050079345703125,
0.058074951171875,
-0.04180908203125,
-0.0032596588134765625,
0.04095458984375,
-0.048431396484375,
0.0020847320556640625,
0.018157958984375,
-0.0161895751953125,
-0.01206207275390625,
0.031494140625,
0.060882568359375,
0.0260772705078125,
-0.044677734375,
0.03314208984375,
-0.0121002197265625,
-0.0308837890625,
-0.03009033203125,
0.033203125,
0.01360321044921875,
0.02655029296875,
0.04937744140625,
-0.022552490234375,
0.0198822021484375,
-0.0228271484375,
0.01390838623046875,
0.036224365234375,
-0.028228759765625,
0.00035881996154785156,
0.052490234375,
0.0128021240234375,
-0.0272369384765625,
0.04296875,
-0.00841522216796875,
-0.032867431640625,
0.066650390625,
0.032806396484375,
0.055328369140625,
0.0030040740966796875,
0.0086517333984375,
0.037628173828125,
0.00540924072265625,
-0.0084686279296875,
0.0229034423828125,
-0.00775909423828125,
-0.02392578125,
-0.0170135498046875,
-0.06793212890625,
-0.0224151611328125,
-0.006114959716796875,
-0.040435791015625,
0.0015268325805664062,
-0.04736328125,
-0.0312347412109375,
-0.01073455810546875,
0.05511474609375,
-0.0384521484375,
0.0099334716796875,
-0.01421356201171875,
0.088623046875,
-0.03802490234375,
0.05291748046875,
0.044281005859375,
-0.03570556640625,
-0.051971435546875,
-0.0233154296875,
0.0014715194702148438,
-0.05267333984375,
0.0560302734375,
0.0288238525390625,
-0.01120758056640625,
0.019134521484375,
-0.022247314453125,
-0.07879638671875,
0.1370849609375,
-0.004070281982421875,
-0.043609619140625,
-0.0014925003051757812,
0.01165771484375,
0.0340576171875,
-0.03253173828125,
0.049346923828125,
0.036773681640625,
0.033966064453125,
0.01346588134765625,
-0.0748291015625,
0.029449462890625,
-0.0269775390625,
0.005275726318359375,
0.0149078369140625,
-0.07977294921875,
0.0823974609375,
-0.0123291015625,
-0.0240631103515625,
0.0262451171875,
0.04974365234375,
0.03875732421875,
0.0228118896484375,
0.016693115234375,
0.05596923828125,
0.066650390625,
-0.01126861572265625,
0.08343505859375,
-0.038604736328125,
0.0523681640625,
0.0643310546875,
-0.00791168212890625,
0.07000732421875,
0.03741455078125,
-0.0215301513671875,
0.0301666259765625,
0.07244873046875,
0.0028781890869140625,
0.0255584716796875,
-0.0049896240234375,
-0.0194244384765625,
-0.005580902099609375,
0.0009059906005859375,
-0.051513671875,
0.00661468505859375,
0.0277099609375,
-0.0384521484375,
-0.0037555694580078125,
-0.02685546875,
0.0203094482421875,
-0.03167724609375,
-0.01265716552734375,
0.027801513671875,
0.010772705078125,
-0.05267333984375,
0.06939697265625,
0.012237548828125,
0.06884765625,
-0.04522705078125,
0.01555633544921875,
-0.038818359375,
0.01678466796875,
-0.0305938720703125,
-0.049896240234375,
0.0185394287109375,
-0.004222869873046875,
0.0021076202392578125,
-0.01000213623046875,
0.044830322265625,
0.0022125244140625,
-0.04388427734375,
0.0228271484375,
0.0258941650390625,
0.0146331787109375,
-0.0117645263671875,
-0.06011962890625,
0.02008056640625,
-0.01482391357421875,
-0.0498046875,
0.033721923828125,
0.00453948974609375,
0.0007176399230957031,
0.06597900390625,
0.058624267578125,
-0.0170745849609375,
0.00513458251953125,
-0.0220184326171875,
0.06646728515625,
-0.058746337890625,
-0.025787353515625,
-0.065185546875,
0.037628173828125,
-0.0178985595703125,
-0.0318603515625,
0.062255859375,
0.060516357421875,
0.043853759765625,
0.0178070068359375,
0.039825439453125,
0.0036220550537109375,
0.01593017578125,
-0.0443115234375,
0.06787109375,
-0.06146240234375,
0.02490234375,
-0.0193634033203125,
-0.07666015625,
-0.0020771026611328125,
0.047637939453125,
-0.0245208740234375,
0.0175933837890625,
0.072265625,
0.06304931640625,
-0.01406097412109375,
0.00867462158203125,
0.01551055908203125,
0.01410675048828125,
0.03717041015625,
0.06964111328125,
0.04583740234375,
-0.053070068359375,
0.03582763671875,
-0.0233306884765625,
-0.0129852294921875,
-0.033477783203125,
-0.06097412109375,
-0.0684814453125,
-0.033416748046875,
-0.003971099853515625,
-0.0187835693359375,
0.01288604736328125,
0.0562744140625,
0.061767578125,
-0.0360107421875,
-0.0282440185546875,
-0.0157318115234375,
-0.0153656005859375,
-0.0030269622802734375,
-0.0202178955078125,
0.0404052734375,
-0.03704833984375,
-0.047760009765625,
0.034637451171875,
0.002338409423828125,
0.0013971328735351562,
-0.022857666015625,
-0.03021240234375,
-0.021636962890625,
0.007305145263671875,
0.04290771484375,
0.0130157470703125,
-0.052764892578125,
-0.0225677490234375,
0.0203857421875,
-0.022613525390625,
0.001430511474609375,
0.0174560546875,
-0.066650390625,
0.0066680908203125,
0.038330078125,
0.04974365234375,
0.07330322265625,
0.00284576416015625,
-0.004001617431640625,
-0.0318603515625,
0.00936126708984375,
0.01031494140625,
0.038177490234375,
0.00372314453125,
-0.043121337890625,
0.061126708984375,
0.018768310546875,
-0.04913330078125,
-0.05316162109375,
0.004581451416015625,
-0.072265625,
-0.0272369384765625,
0.08343505859375,
-0.02044677734375,
-0.0253753662109375,
0.028564453125,
-0.0182342529296875,
0.0039825439453125,
-0.04180908203125,
0.043853759765625,
0.0557861328125,
-0.0165252685546875,
-0.011688232421875,
-0.052734375,
0.031463623046875,
0.03973388671875,
-0.06396484375,
-0.019195556640625,
0.03582763671875,
0.0350341796875,
0.006908416748046875,
0.058319091796875,
0.012786865234375,
0.019378662109375,
-0.0004973411560058594,
-0.022308349609375,
-0.03656005859375,
-0.0172271728515625,
-0.01268768310546875,
0.01529693603515625,
-0.01549530029296875,
-0.0037517547607421875
]
] |
Helsinki-NLP/opus-mt-cs-en | 2023-08-16T11:27:09.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"cs",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-cs-en | 0 | 16,388 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-cs-en
* source languages: cs
* target languages: en
* OPUS readme: [cs-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/cs-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/cs-en/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/cs-en/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/cs-en/opus-2019-12-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newstest2014-csen.cs.en | 34.1 | 0.612 |
| newstest2015-encs.cs.en | 30.4 | 0.565 |
| newstest2016-encs.cs.en | 31.8 | 0.584 |
| newstest2017-encs.cs.en | 28.7 | 0.556 |
| newstest2018-encs.cs.en | 30.3 | 0.566 |
| Tatoeba.cs.en | 58.0 | 0.721 |
| 1,043 | [
[
-0.0254058837890625,
-0.0207672119140625,
0.0230560302734375,
0.0294952392578125,
-0.02056884765625,
-0.01580810546875,
-0.027923583984375,
-0.00128936767578125,
0.0036754608154296875,
0.0272216796875,
-0.056488037109375,
-0.04656982421875,
-0.041839599609375,
0.00970458984375,
-0.01161956787109375,
0.052581787109375,
-0.009429931640625,
0.036956787109375,
0.0126800537109375,
-0.034027099609375,
-0.0281982421875,
-0.034820556640625,
-0.0377197265625,
-0.020660400390625,
0.018157958984375,
0.035980224609375,
0.032562255859375,
0.027435302734375,
0.06329345703125,
0.018951416015625,
-0.00970458984375,
0.00830841064453125,
-0.038116455078125,
-0.014129638671875,
0.01251983642578125,
-0.0438232421875,
-0.05706787109375,
-0.00725555419921875,
0.07421875,
0.0318603515625,
-0.00027680397033691406,
0.031707763671875,
0.00702667236328125,
0.0675048828125,
-0.0244598388671875,
0.0006670951843261719,
-0.041351318359375,
0.01126861572265625,
-0.0213165283203125,
-0.0286407470703125,
-0.044891357421875,
-0.0167083740234375,
0.0085601806640625,
-0.052093505859375,
0.0069427490234375,
0.0103607177734375,
0.1063232421875,
0.0192413330078125,
-0.02130126953125,
-0.006511688232421875,
-0.035369873046875,
0.0841064453125,
-0.05828857421875,
0.03607177734375,
0.02984619140625,
0.0179290771484375,
0.013336181640625,
-0.038116455078125,
-0.0204620361328125,
0.012939453125,
-0.0164337158203125,
0.0176544189453125,
-0.015045166015625,
-0.0210723876953125,
0.0202789306640625,
0.052947998046875,
-0.06292724609375,
0.0023670196533203125,
-0.047882080078125,
-0.00205230712890625,
0.049041748046875,
0.01910400390625,
0.0140380859375,
-0.00439453125,
-0.026214599609375,
-0.036407470703125,
-0.058837890625,
0.0164337158203125,
0.027374267578125,
0.022674560546875,
-0.040496826171875,
0.04266357421875,
-0.01190948486328125,
0.048675537109375,
0.0036182403564453125,
0.0007796287536621094,
0.073974609375,
-0.0248870849609375,
-0.0252227783203125,
-0.0139007568359375,
0.08154296875,
0.025115966796875,
0.003963470458984375,
0.00615692138671875,
-0.00603485107421875,
-0.011932373046875,
0.008026123046875,
-0.06732177734375,
-0.00608062744140625,
0.0150299072265625,
-0.03759765625,
-0.02239990234375,
0.00571441650390625,
-0.047943115234375,
0.010345458984375,
-0.03582763671875,
0.030242919921875,
-0.03887939453125,
-0.02294921875,
0.0184783935546875,
0.00382232666015625,
0.03607177734375,
-0.00026988983154296875,
-0.0408935546875,
0.0225982666015625,
0.0283203125,
0.050567626953125,
-0.032928466796875,
-0.01702880859375,
-0.032257080078125,
-0.024627685546875,
-0.016571044921875,
0.045806884765625,
-0.0107269287109375,
-0.029388427734375,
-0.00551605224609375,
0.0377197265625,
-0.022613525390625,
-0.0209808349609375,
0.09063720703125,
-0.017730712890625,
0.049041748046875,
-0.041656494140625,
-0.03314208984375,
-0.022735595703125,
0.037811279296875,
-0.038055419921875,
0.102294921875,
0.0058441162109375,
-0.06396484375,
0.01067352294921875,
-0.054840087890625,
-0.007762908935546875,
-0.0099029541015625,
-0.0017833709716796875,
-0.046844482421875,
0.0033321380615234375,
0.0094146728515625,
0.0271759033203125,
-0.0269775390625,
0.0182952880859375,
-0.0020656585693359375,
-0.0257720947265625,
0.00342559814453125,
-0.026031494140625,
0.0821533203125,
0.0257110595703125,
-0.0209197998046875,
0.021453857421875,
-0.07635498046875,
0.00655364990234375,
0.00571441650390625,
-0.03399658203125,
-0.006206512451171875,
0.01096343994140625,
0.0227203369140625,
0.012939453125,
0.0216827392578125,
-0.04205322265625,
0.01532745361328125,
-0.0462646484375,
0.0204620361328125,
0.054107666015625,
-0.0209808349609375,
0.0291290283203125,
-0.0300750732421875,
0.03253173828125,
0.0125732421875,
0.007495880126953125,
0.01088714599609375,
-0.029266357421875,
-0.06646728515625,
-0.0145721435546875,
0.034515380859375,
0.0802001953125,
-0.05133056640625,
0.0626220703125,
-0.045654296875,
-0.06427001953125,
-0.0439453125,
-0.01348114013671875,
0.0296173095703125,
0.032318115234375,
0.037261962890625,
-0.01190948486328125,
-0.034423828125,
-0.082763671875,
-0.00907135009765625,
-0.004451751708984375,
-0.009490966796875,
0.0181121826171875,
0.048309326171875,
-0.0032329559326171875,
0.042266845703125,
-0.047637939453125,
-0.027862548828125,
-0.0123748779296875,
0.00975799560546875,
0.03594970703125,
0.04949951171875,
0.048919677734375,
-0.05908203125,
-0.04498291015625,
-0.004451751708984375,
-0.046905517578125,
-0.01303863525390625,
0.0042266845703125,
-0.0223236083984375,
0.0018320083618164062,
0.0134429931640625,
-0.0128173828125,
0.01525115966796875,
0.053466796875,
-0.0484619140625,
0.0489501953125,
-0.0043182373046875,
0.02069091796875,
-0.1019287109375,
0.007572174072265625,
-0.01261138916015625,
-0.01019287109375,
-0.03192138671875,
-0.00290679931640625,
0.018035888671875,
0.00005125999450683594,
-0.058807373046875,
0.035430908203125,
-0.0309295654296875,
-0.00516510009765625,
0.0181732177734375,
0.0007848739624023438,
0.00695037841796875,
0.059600830078125,
-0.0024566650390625,
0.060333251953125,
0.0692138671875,
-0.038970947265625,
0.01303863525390625,
0.03509521484375,
-0.0279083251953125,
0.03497314453125,
-0.05657958984375,
-0.0194854736328125,
0.0209197998046875,
-0.005706787109375,
-0.055084228515625,
0.00997161865234375,
0.019378662109375,
-0.050689697265625,
0.0238494873046875,
-0.00516510009765625,
-0.048980712890625,
-0.0116729736328125,
-0.02685546875,
0.033782958984375,
0.043060302734375,
-0.01085662841796875,
0.04034423828125,
0.007793426513671875,
0.003116607666015625,
-0.037109375,
-0.07794189453125,
-0.01041412353515625,
-0.0279541015625,
-0.055999755859375,
0.0183563232421875,
-0.032745361328125,
0.0018377304077148438,
0.00345611572265625,
0.0220794677734375,
-0.0153350830078125,
0.003826141357421875,
0.00836181640625,
0.02227783203125,
-0.0361328125,
0.0013713836669921875,
-0.00681304931640625,
-0.0082550048828125,
-0.011138916015625,
-0.006793975830078125,
0.04498291015625,
-0.033843994140625,
-0.0276947021484375,
-0.037139892578125,
0.004669189453125,
0.039886474609375,
-0.037872314453125,
0.059326171875,
0.03955078125,
-0.01346588134765625,
0.00945281982421875,
-0.02838134765625,
0.00560760498046875,
-0.0328369140625,
0.01560211181640625,
-0.034271240234375,
-0.06585693359375,
0.050628662109375,
0.0071258544921875,
0.036834716796875,
0.061431884765625,
0.0474853515625,
0.01214599609375,
0.061309814453125,
0.020721435546875,
0.0030193328857421875,
0.033782958984375,
-0.0369873046875,
-0.011871337890625,
-0.07794189453125,
0.0132293701171875,
-0.053009033203125,
-0.0302886962890625,
-0.06610107421875,
-0.02044677734375,
0.0252532958984375,
-0.0008683204650878906,
-0.0301055908203125,
0.053314208984375,
-0.054107666015625,
0.023101806640625,
0.040130615234375,
-0.003215789794921875,
0.0156707763671875,
-0.00038886070251464844,
-0.041961669921875,
-0.0218353271484375,
-0.0306854248046875,
-0.0271759033203125,
0.09283447265625,
0.021331787109375,
0.0180206298828125,
0.0198211669921875,
0.04144287109375,
-0.0015001296997070312,
0.0114898681640625,
-0.039093017578125,
0.0340576171875,
-0.015625,
-0.056976318359375,
-0.016693115234375,
-0.040802001953125,
-0.058441162109375,
0.036773681640625,
-0.016754150390625,
-0.044708251953125,
0.023406982421875,
0.0009088516235351562,
-0.0164642333984375,
0.032958984375,
-0.043365478515625,
0.08685302734375,
-0.00806427001953125,
-0.01084136962890625,
0.015838623046875,
-0.03521728515625,
0.022796630859375,
0.005725860595703125,
0.02398681640625,
-0.01515960693359375,
0.0128326416015625,
0.05023193359375,
-0.01617431640625,
0.03472900390625,
-0.002223968505859375,
0.0027103424072265625,
0.0130767822265625,
0.01025390625,
0.032562255859375,
-0.0127410888671875,
-0.025665283203125,
0.022430419921875,
0.007549285888671875,
-0.0347900390625,
-0.0064544677734375,
0.0477294921875,
-0.055389404296875,
-0.00434112548828125,
-0.0469970703125,
-0.0430908203125,
-0.0018100738525390625,
0.024749755859375,
0.04962158203125,
0.04949951171875,
-0.0227203369140625,
0.044464111328125,
0.05963134765625,
-0.027801513671875,
0.02813720703125,
0.05731201171875,
-0.01421356201171875,
-0.04034423828125,
0.06451416015625,
0.00881195068359375,
0.0274810791015625,
0.037261962890625,
0.008270263671875,
-0.0034580230712890625,
-0.049102783203125,
-0.04425048828125,
0.00843048095703125,
-0.0251617431640625,
-0.017913818359375,
-0.033966064453125,
-0.01165771484375,
-0.02423095703125,
0.004131317138671875,
-0.041107177734375,
-0.04278564453125,
-0.0261077880859375,
-0.0156402587890625,
0.0176544189453125,
0.0133209228515625,
-0.007171630859375,
0.031158447265625,
-0.06793212890625,
0.00945281982421875,
-0.01293182373046875,
0.0238189697265625,
-0.0268096923828125,
-0.061309814453125,
-0.030548095703125,
-0.003696441650390625,
-0.06005859375,
-0.056610107421875,
0.04302978515625,
0.007598876953125,
0.0239715576171875,
0.0301055908203125,
0.0088653564453125,
0.035858154296875,
-0.052490234375,
0.07708740234375,
0.0026912689208984375,
-0.05279541015625,
0.037506103515625,
-0.03497314453125,
0.0352783203125,
0.06768798828125,
0.01427459716796875,
-0.030303955078125,
-0.03643798828125,
-0.053192138671875,
-0.06439208984375,
0.0635986328125,
0.0511474609375,
-0.01303863525390625,
0.016510009765625,
-0.0157012939453125,
-0.0161590576171875,
0.00394439697265625,
-0.08349609375,
-0.037841796875,
0.003143310546875,
-0.0289154052734375,
-0.00039649009704589844,
-0.01641845703125,
-0.0146484375,
-0.022186279296875,
0.080810546875,
0.0062713623046875,
0.01494598388671875,
0.03216552734375,
-0.0012569427490234375,
-0.0112152099609375,
0.027496337890625,
0.0721435546875,
0.0478515625,
-0.0289306640625,
-0.01155853271484375,
0.0311737060546875,
-0.028106689453125,
-0.01410675048828125,
0.006374359130859375,
-0.024627685546875,
0.0120391845703125,
0.0249786376953125,
0.06793212890625,
0.0225372314453125,
-0.041839599609375,
0.037322998046875,
-0.02227783203125,
-0.0330810546875,
-0.0579833984375,
-0.01560211181640625,
0.0108184814453125,
-0.0011987686157226562,
0.01560211181640625,
0.0174102783203125,
0.017578125,
-0.01389312744140625,
0.007843017578125,
0.01363372802734375,
-0.051025390625,
-0.03167724609375,
0.040130615234375,
0.006122589111328125,
-0.01116180419921875,
0.032012939453125,
-0.031341552734375,
-0.04937744140625,
0.039886474609375,
0.006572723388671875,
0.0777587890625,
-0.0168304443359375,
-0.0207977294921875,
0.0609130859375,
0.042633056640625,
-0.0159759521484375,
0.04034423828125,
0.01275634765625,
-0.048675537109375,
-0.033203125,
-0.059814453125,
-0.0085906982421875,
0.0038547515869140625,
-0.06488037109375,
0.036102294921875,
0.020660400390625,
0.0004401206970214844,
-0.02239990234375,
0.0155181884765625,
-0.047210693359375,
0.0047760009765625,
-0.01507568359375,
0.08099365234375,
-0.06903076171875,
0.064208984375,
0.034332275390625,
-0.02618408203125,
-0.06683349609375,
-0.031585693359375,
-0.004322052001953125,
-0.036773681640625,
0.04241943359375,
0.00571441650390625,
0.0237274169921875,
-0.0089874267578125,
-0.02685546875,
-0.0721435546875,
0.08636474609375,
0.007061004638671875,
-0.04425048828125,
0.0036163330078125,
0.01407623291015625,
0.03497314453125,
-0.0260772705078125,
0.01355743408203125,
0.033050537109375,
0.05621337890625,
0.005588531494140625,
-0.07965087890625,
-0.0147857666015625,
-0.04315185546875,
-0.0243072509765625,
0.035491943359375,
-0.04742431640625,
0.0758056640625,
0.032806396484375,
-0.00470733642578125,
0.006427764892578125,
0.045989990234375,
0.0306854248046875,
0.01462554931640625,
0.03955078125,
0.08685302734375,
0.031280517578125,
-0.04119873046875,
0.067138671875,
-0.02667236328125,
0.043731689453125,
0.08758544921875,
-0.0005040168762207031,
0.0631103515625,
0.016021728515625,
-0.0163726806640625,
0.043365478515625,
0.056060791015625,
-0.0283203125,
0.0303955078125,
0.01434326171875,
0.01200103759765625,
-0.01715087890625,
0.0200653076171875,
-0.055694580078125,
0.0155792236328125,
0.0134429931640625,
-0.0234527587890625,
-0.0012226104736328125,
-0.01203155517578125,
0.0149078369140625,
-0.0030765533447265625,
-0.01470184326171875,
0.037628173828125,
-0.003116607666015625,
-0.038360595703125,
0.054962158203125,
-0.0016870498657226562,
0.049041748046875,
-0.05267333984375,
0.004299163818359375,
0.0037250518798828125,
0.0254058837890625,
-0.00951385498046875,
-0.04461669921875,
0.0350341796875,
0.005092620849609375,
-0.0247802734375,
-0.031829833984375,
0.019683837890625,
-0.03582763671875,
-0.07257080078125,
0.0271453857421875,
0.03009033203125,
0.0259246826171875,
-0.0033473968505859375,
-0.06854248046875,
0.00342559814453125,
0.0113525390625,
-0.04913330078125,
0.008453369140625,
0.049896240234375,
0.017730712890625,
0.0295562744140625,
0.0511474609375,
0.01549530029296875,
0.02032470703125,
-0.005718231201171875,
0.05517578125,
-0.0310211181640625,
-0.040740966796875,
-0.055999755859375,
0.062042236328125,
-0.013336181640625,
-0.052459716796875,
0.054290771484375,
0.081298828125,
0.06549072265625,
-0.0068206787109375,
0.024627685546875,
-0.00952911376953125,
0.05621337890625,
-0.033416748046875,
0.052154541015625,
-0.06890869140625,
0.01318359375,
-0.007587432861328125,
-0.06268310546875,
-0.01480865478515625,
0.0211334228515625,
-0.022613525390625,
-0.024200439453125,
0.058929443359375,
0.051025390625,
-0.01026153564453125,
-0.00997161865234375,
0.0190582275390625,
0.021942138671875,
0.0180816650390625,
0.050872802734375,
0.024993896484375,
-0.07220458984375,
0.04559326171875,
-0.02459716796875,
-0.0121002197265625,
0.0015192031860351562,
-0.056243896484375,
-0.058197021484375,
-0.044830322265625,
-0.0138702392578125,
-0.021026611328125,
-0.02703857421875,
0.066650390625,
0.047088623046875,
-0.07659912109375,
-0.038604736328125,
-0.00402069091796875,
0.004611968994140625,
-0.0202484130859375,
-0.023193359375,
0.052032470703125,
-0.022552490234375,
-0.06732177734375,
0.0340576171875,
-0.0059356689453125,
-0.0097198486328125,
-0.00667572021484375,
-0.025634765625,
-0.033782958984375,
-0.00209808349609375,
0.023681640625,
0.0089569091796875,
-0.039764404296875,
0.007053375244140625,
0.0133514404296875,
-0.00908660888671875,
0.0286865234375,
0.024200439453125,
-0.0196533203125,
0.0128631591796875,
0.06683349609375,
0.0182037353515625,
0.0293121337890625,
-0.01136016845703125,
0.030029296875,
-0.059539794921875,
0.02313232421875,
0.015716552734375,
0.045440673828125,
0.0217132568359375,
-0.0037479400634765625,
0.057373046875,
0.0187225341796875,
-0.048736572265625,
-0.0797119140625,
0.0021572113037109375,
-0.09381103515625,
-0.00273895263671875,
0.0709228515625,
-0.0166168212890625,
-0.0172882080078125,
0.0248260498046875,
-0.01117706298828125,
0.003948211669921875,
-0.0253448486328125,
0.021942138671875,
0.074462890625,
0.0167694091796875,
0.00933837890625,
-0.057464599609375,
0.027801513671875,
0.0283660888671875,
-0.0565185546875,
-0.01126861572265625,
0.0183868408203125,
0.013702392578125,
0.032012939453125,
0.037109375,
-0.0271453857421875,
0.007488250732421875,
-0.018646240234375,
0.03302001953125,
-0.00417327880859375,
-0.006359100341796875,
-0.017181396484375,
-0.0018949508666992188,
-0.00811767578125,
-0.0225830078125
]
] |
cardiffnlp/twitter-roberta-base-hate-latest | 2023-08-02T00:30:47.000Z | [
"transformers",
"pytorch",
"tf",
"roberta",
"text-classification",
"en",
"endpoints_compatible",
"region:us"
] | text-classification | cardiffnlp | null | null | cardiffnlp/twitter-roberta-base-hate-latest | 10 | 16,354 | transformers | 2023-03-30T05:47:39 | ---
model-index:
- name: twitter-roberta-base-hate-latest
results: []
pipeline_tag: text-classification
language:
- en
---
# cardiffnlp/twitter-roberta-base-hate-latest
This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-2022-154m](https://huggingface.co/cardiffnlp/twitter-roberta-base-2022-154m) for binary hate-speech classification.
A combination of 13 different hate-speech datasets in the English language were used to fine-tune the model.
More details in the [reference paper](https://aclanthology.org/2023.woah-1.25/).
| **Dataset** | **Accuracy** | **Macro-F1** | **Weighted-F1** |
|:----------|-----------:|-----------:|--------------:|
| hatEval, SemEval-2019 Task 5: Multilingual Detection of Hate Speech Against Immigrants and Women in Twitter | 0.5831 | 0.5646 | 0.548 |
| ucberkeley-dlab/measuring-hate-speech | 0.9273 | 0.9193 | 0.928 |
| Detecting East Asian Prejudice on Social Media | 0.9231 | 0.6623 | 0.9428 |
| Call me sexist, but | 0.9686 | 0.9203 | 0.9696 |
| Predicting the Type and Target of Offensive Posts in Social Media | 0.9164 | 0.6847 | 0.9098 |
| HateXplain | 0.8653 | 0.845 | 0.8662 |
| Large Scale Crowdsourcing and Characterization of Twitter Abusive BehaviorLarge Scale Crowdsourcing and Characterization of Twitter Abusive Behavior | 0.7801 | 0.7446 | 0.7614 |
| Multilingual and Multi-Aspect Hate Speech Analysis | 0.9944 | 0.4986 | 0.9972 |
| Hate speech and offensive content identification in indo-european languages | 0.8779 | 0.6904 | 0.8706 |
| Are You a Racist or Am I Seeing Things? | 0.921 | 0.8935 | 0.9216 |
| Automated Hate Speech Detection | 0.9423 | 0.9249 | 0.9429 |
| Hate Towards the Political Opponent | 0.8783 | 0.6595 | 0.8788 |
| Hateful Symbols or Hateful People? | 0.8187 | 0.7833 | 0.8323 |
| **Overall** | **0.8766** | **0.7531** | **0.8745** |
### Usage
Install tweetnlp via pip.
```shell
pip install tweetnlp
```
Load the model in python.
```python
import tweetnlp
model = tweetnlp.Classifier("cardiffnlp/twitter-roberta-base-hate-latest")
model.predict('I love everybody :)')
>> {'label': 'NOT-HATE'}
```
### Reference paper - Model based on:
```
@inproceedings{antypas-camacho-collados-2023-robust,
title = "Robust Hate Speech Detection in Social Media: A Cross-Dataset Empirical Evaluation",
author = "Antypas, Dimosthenis and
Camacho-Collados, Jose",
booktitle = "The 7th Workshop on Online Abuse and Harms (WOAH)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.woah-1.25",
pages = "231--242"
}
``` | 3,069 | [
[
-0.022735595703125,
-0.05792236328125,
0.0044097900390625,
0.011566162109375,
-0.011566162109375,
0.019866943359375,
-0.01837158203125,
-0.053253173828125,
0.0247650146484375,
0.01873779296875,
-0.040771484375,
-0.0631103515625,
-0.07135009765625,
-0.0009131431579589844,
-0.04364013671875,
0.068115234375,
0.04132080078125,
0.00655364990234375,
0.04107666015625,
-0.023406982421875,
-0.00472259521484375,
-0.039276123046875,
-0.058502197265625,
0.00022339820861816406,
0.046417236328125,
0.012847900390625,
0.049285888671875,
0.036346435546875,
0.033050537109375,
0.0245361328125,
-0.0120849609375,
-0.00412750244140625,
-0.040252685546875,
-0.0034046173095703125,
-0.011688232421875,
-0.006969451904296875,
-0.0277557373046875,
0.0092010498046875,
0.0316162109375,
0.0272369384765625,
0.0013551712036132812,
0.0102081298828125,
-0.003337860107421875,
0.0072784423828125,
-0.032379150390625,
-0.007595062255859375,
-0.0701904296875,
-0.01227569580078125,
-0.0217437744140625,
-0.0003230571746826172,
-0.017364501953125,
-0.037994384765625,
-0.002864837646484375,
0.0010538101196289062,
0.018035888671875,
0.007213592529296875,
0.06475830078125,
0.01073455810546875,
-0.0204925537109375,
-0.0180206298828125,
-0.037261962890625,
0.07684326171875,
-0.07086181640625,
0.0224151611328125,
0.0221099853515625,
0.0097198486328125,
0.006343841552734375,
-0.0194091796875,
-0.0504150390625,
-0.007373809814453125,
0.009490966796875,
-0.003459930419921875,
-0.040252685546875,
-0.010894775390625,
0.0104827880859375,
0.0018243789672851562,
-0.037261962890625,
0.0214691162109375,
-0.043731689453125,
-0.020751953125,
0.060211181640625,
-0.0019207000732421875,
0.02423095703125,
-0.0199127197265625,
-0.00965118408203125,
0.005779266357421875,
-0.01331329345703125,
-0.0211334228515625,
0.0290069580078125,
0.05419921875,
-0.038421630859375,
0.0279541015625,
0.004901885986328125,
0.0419921875,
-0.00405120849609375,
-0.0218658447265625,
0.044158935546875,
0.0011625289916992188,
-0.0120697021484375,
-0.002162933349609375,
0.076904296875,
0.051605224609375,
0.03961181640625,
-0.01485443115234375,
0.004093170166015625,
0.031005859375,
0.01067352294921875,
-0.05865478515625,
-0.006511688232421875,
0.0211944580078125,
-0.02655029296875,
-0.048126220703125,
-0.02716064453125,
-0.06866455078125,
-0.033660888671875,
-0.0079193115234375,
0.01512908935546875,
-0.048614501953125,
-0.03546142578125,
-0.01174163818359375,
-0.016632080078125,
0.00536346435546875,
0.0214691162109375,
-0.05255126953125,
0.0081634521484375,
0.03369140625,
0.0650634765625,
-0.026123046875,
-0.0166015625,
-0.019775390625,
-0.0015878677368164062,
-0.00707244873046875,
0.048370361328125,
-0.044403076171875,
0.005031585693359375,
0.0066375732421875,
-0.0158843994140625,
0.006549835205078125,
-0.027923583984375,
0.056365966796875,
-0.030548095703125,
0.0237274169921875,
0.0027484893798828125,
-0.034210205078125,
-0.012298583984375,
0.01374053955078125,
-0.0293731689453125,
0.07049560546875,
0.0173492431640625,
-0.06939697265625,
0.0301361083984375,
-0.0489501953125,
-0.01953125,
-0.0032672882080078125,
0.00521087646484375,
-0.034759521484375,
-0.01316070556640625,
0.0012178421020507812,
0.02825927734375,
-0.0210113525390625,
0.0152740478515625,
-0.047637939453125,
0.0006232261657714844,
-0.009674072265625,
-0.011810302734375,
0.1112060546875,
0.032501220703125,
-0.046234130859375,
0.016571044921875,
-0.07232666015625,
0.0002015829086303711,
0.0197296142578125,
-0.0055999755859375,
-0.01497650146484375,
-0.00913238525390625,
0.032379150390625,
0.03204345703125,
0.0059661865234375,
-0.06915283203125,
-0.0150146484375,
-0.026947021484375,
0.03131103515625,
0.06640625,
-0.0128173828125,
0.01110076904296875,
-0.043426513671875,
0.035186767578125,
-0.007537841796875,
0.0223541259765625,
0.03509521484375,
-0.0511474609375,
-0.034637451171875,
-0.0267333984375,
0.00171661376953125,
0.040313720703125,
-0.02099609375,
0.0452880859375,
-0.01071929931640625,
-0.06304931640625,
-0.023162841796875,
-0.00585174560546875,
0.03289794921875,
0.0245513916015625,
0.037872314453125,
0.004058837890625,
-0.09320068359375,
-0.060394287109375,
-0.0404052734375,
-0.0268096923828125,
0.0202484130859375,
0.0132598876953125,
0.038909912109375,
-0.01244354248046875,
0.06158447265625,
-0.0106964111328125,
-0.00540924072265625,
-0.0286102294921875,
0.018218994140625,
0.008148193359375,
0.023468017578125,
0.068359375,
-0.0657958984375,
-0.06298828125,
-0.0188446044921875,
-0.0347900390625,
-0.0287628173828125,
0.0249176025390625,
-0.0169219970703125,
0.0244293212890625,
0.02020263671875,
-0.011871337890625,
0.033721923828125,
0.043731689453125,
-0.042205810546875,
0.03839111328125,
0.04681396484375,
0.01428985595703125,
-0.09820556640625,
-0.0180511474609375,
0.015777587890625,
-0.0190582275390625,
-0.06341552734375,
-0.002231597900390625,
-0.0163116455078125,
0.0258941650390625,
-0.05010986328125,
0.040740966796875,
-0.0003058910369873047,
0.0213165283203125,
-0.007183074951171875,
-0.004581451416015625,
-0.047882080078125,
0.04144287109375,
-0.01105499267578125,
0.035919189453125,
0.03753662109375,
-0.033447265625,
0.033203125,
0.01053619384765625,
-0.0247344970703125,
0.032928466796875,
-0.0244293212890625,
0.00774383544921875,
0.004009246826171875,
-0.0059661865234375,
-0.0792236328125,
-0.012603759765625,
0.0284881591796875,
-0.0643310546875,
0.002048492431640625,
-0.0247955322265625,
-0.032470703125,
-0.038848876953125,
-0.0318603515625,
0.02227783203125,
0.03582763671875,
-0.0182037353515625,
0.041351318359375,
0.0660400390625,
-0.0032978057861328125,
-0.0482177734375,
-0.06292724609375,
0.016876220703125,
-0.027923583984375,
-0.04888916015625,
0.0273590087890625,
-0.0155792236328125,
-0.042633056640625,
-0.0030651092529296875,
0.015960693359375,
-0.0163116455078125,
0.0005602836608886719,
0.0160675048828125,
0.0157012939453125,
-0.0013437271118164062,
-0.0161285400390625,
-0.0218353271484375,
0.025848388671875,
0.0018892288208007812,
-0.00476837158203125,
0.06817626953125,
0.0003077983856201172,
0.007843017578125,
-0.030609130859375,
0.031768798828125,
0.045074462890625,
0.01377105712890625,
0.060302734375,
0.06451416015625,
-0.0450439453125,
-0.01169586181640625,
-0.043212890625,
0.005786895751953125,
-0.0304412841796875,
0.0374755859375,
-0.004421234130859375,
-0.078369140625,
0.040740966796875,
0.0435791015625,
0.0160675048828125,
0.05615234375,
0.05877685546875,
0.01027679443359375,
0.0804443359375,
0.04498291015625,
-0.0301055908203125,
0.046234130859375,
-0.015899658203125,
0.018646240234375,
-0.01544952392578125,
-0.0140838623046875,
-0.061492919921875,
-0.0190887451171875,
-0.061981201171875,
-0.0228729248046875,
0.007190704345703125,
-0.0147705078125,
-0.0452880859375,
0.035675048828125,
-0.0390625,
0.02227783203125,
0.035736083984375,
-0.00768280029296875,
-0.0008611679077148438,
0.0250701904296875,
-0.0041351318359375,
-0.025909423828125,
-0.0298309326171875,
-0.0285491943359375,
0.07843017578125,
0.0199127197265625,
0.035736083984375,
0.036041259765625,
0.04962158203125,
0.038848876953125,
0.0294647216796875,
-0.0450439453125,
0.045654296875,
-0.0253753662109375,
-0.07086181640625,
-0.0206451416015625,
-0.0391845703125,
-0.05194091796875,
0.007373809814453125,
-0.013580322265625,
-0.08502197265625,
0.00782012939453125,
-0.0004968643188476562,
-0.01812744140625,
0.054901123046875,
-0.03863525390625,
0.0419921875,
-0.004566192626953125,
-0.019073486328125,
-0.0214691162109375,
-0.032379150390625,
0.0252838134765625,
-0.004016876220703125,
0.041778564453125,
-0.023162841796875,
0.00017154216766357422,
0.0816650390625,
-0.02752685546875,
0.07220458984375,
-0.016387939453125,
-0.009307861328125,
0.01241302490234375,
-0.007152557373046875,
0.021453857421875,
-0.03277587890625,
-0.0276336669921875,
0.018402099609375,
-0.013275146484375,
-0.024261474609375,
-0.0166168212890625,
0.0469970703125,
-0.056976318359375,
-0.02398681640625,
-0.041168212890625,
-0.039215087890625,
-0.0193939208984375,
0.026214599609375,
0.0294952392578125,
0.0291290283203125,
-0.0160064697265625,
0.0201416015625,
0.029205322265625,
-0.022735595703125,
0.028228759765625,
0.035614013671875,
-0.01499176025390625,
-0.044219970703125,
0.053619384765625,
0.00982666015625,
0.009735107421875,
0.0238189697265625,
0.016510009765625,
-0.021453857421875,
-0.057464599609375,
-0.0028209686279296875,
0.0179595947265625,
-0.036163330078125,
-0.0265045166015625,
-0.074462890625,
-0.035308837890625,
-0.06463623046875,
-0.00569915771484375,
-0.01215362548828125,
-0.033447265625,
-0.01515960693359375,
-0.0048980712890625,
0.050262451171875,
0.06695556640625,
-0.0277557373046875,
0.0278778076171875,
-0.023529052734375,
0.03192138671875,
-0.0008902549743652344,
0.0252532958984375,
-0.0086212158203125,
-0.08197021484375,
-0.01209259033203125,
0.01335906982421875,
-0.023345947265625,
-0.0814208984375,
0.054595947265625,
0.00685882568359375,
0.0252532958984375,
0.019317626953125,
0.0123291015625,
0.044891357421875,
-0.0285491943359375,
0.05633544921875,
0.02667236328125,
-0.048004150390625,
0.043701171875,
-0.041107177734375,
0.001712799072265625,
0.022796630859375,
0.043670654296875,
-0.047210693359375,
-0.06268310546875,
-0.0653076171875,
-0.05487060546875,
0.080078125,
0.0250244140625,
0.01381683349609375,
-0.01406097412109375,
-0.00414276123046875,
-0.0043792724609375,
0.0037136077880859375,
-0.069091796875,
-0.056243896484375,
-0.004642486572265625,
-0.036773681640625,
-0.00298309326171875,
-0.042388916015625,
-0.0012102127075195312,
-0.0452880859375,
0.062225341796875,
0.0196685791015625,
0.03155517578125,
-0.0266876220703125,
-0.0012369155883789062,
-0.00569915771484375,
0.0187225341796875,
0.03997802734375,
0.052032470703125,
-0.044158935546875,
-0.005962371826171875,
0.0189666748046875,
-0.044219970703125,
-0.00478363037109375,
0.006938934326171875,
-0.01580810546875,
0.0170745849609375,
0.0301666259765625,
0.049530029296875,
0.003936767578125,
-0.030670166015625,
0.039337158203125,
-0.0026721954345703125,
-0.03802490234375,
-0.04156494140625,
0.0037288665771484375,
-0.01340484619140625,
0.0216217041015625,
0.0401611328125,
0.021728515625,
0.0044097900390625,
-0.0440673828125,
0.0205078125,
0.025848388671875,
-0.020111083984375,
-0.0208892822265625,
0.048583984375,
-0.00507354736328125,
-0.04022216796875,
0.0030689239501953125,
-0.031768798828125,
-0.0733642578125,
0.03472900390625,
0.035186767578125,
0.0638427734375,
-0.030914306640625,
0.022491455078125,
0.0550537109375,
0.0287628173828125,
0.034576416015625,
0.042510986328125,
0.0214385986328125,
-0.07159423828125,
-0.0126495361328125,
-0.06591796875,
-0.010040283203125,
0.039581298828125,
-0.03277587890625,
0.024017333984375,
-0.0535888671875,
-0.011199951171875,
0.03802490234375,
0.0154571533203125,
-0.04095458984375,
0.02203369140625,
0.0237274169921875,
0.06695556640625,
-0.06475830078125,
0.06292724609375,
0.0364990234375,
-0.023651123046875,
-0.055511474609375,
-0.0138702392578125,
0.019287109375,
-0.06146240234375,
0.040985107421875,
0.005207061767578125,
-0.01474761962890625,
0.00713348388671875,
-0.06884765625,
-0.06842041015625,
0.03497314453125,
0.0269317626953125,
-0.035186767578125,
-0.0006623268127441406,
0.0215911865234375,
0.049530029296875,
-0.0302581787109375,
0.0272216796875,
0.03826904296875,
0.0298309326171875,
0.01233673095703125,
-0.067138671875,
-0.0008544921875,
-0.0302276611328125,
-0.0143585205078125,
-0.0026798248291015625,
-0.050079345703125,
0.0723876953125,
0.0047760009765625,
-0.00762176513671875,
-0.00836181640625,
0.04156494140625,
0.006244659423828125,
0.0137176513671875,
0.057891845703125,
0.0533447265625,
0.0489501953125,
-0.005672454833984375,
0.060211181640625,
-0.0118255615234375,
0.0301666259765625,
0.090087890625,
0.0208740234375,
0.053924560546875,
-0.0004582405090332031,
-0.0325927734375,
0.03240966796875,
0.036285400390625,
-0.00252532958984375,
0.0296173095703125,
0.0010118484497070312,
-0.01406097412109375,
-0.0120849609375,
-0.023529052734375,
-0.018768310546875,
0.039031982421875,
0.0244293212890625,
-0.0281982421875,
-0.0204925537109375,
0.0008487701416015625,
0.03021240234375,
0.01442718505859375,
-0.0210418701171875,
0.04632568359375,
-0.0159912109375,
-0.023773193359375,
0.0634765625,
0.0156097412109375,
0.08673095703125,
-0.0283660888671875,
0.01288604736328125,
0.0079498291015625,
0.011749267578125,
-0.0248870849609375,
-0.05340576171875,
0.032379150390625,
0.039764404296875,
-0.0139312744140625,
-0.0124053955078125,
0.035614013671875,
-0.03302001953125,
-0.0308990478515625,
0.050323486328125,
0.01047515869140625,
0.018768310546875,
-0.010040283203125,
-0.08074951171875,
0.0243072509765625,
0.026641845703125,
0.0032367706298828125,
0.0009026527404785156,
0.030792236328125,
-0.003864288330078125,
0.041748046875,
0.033203125,
0.024383544921875,
0.037750244140625,
0.0253753662109375,
0.058990478515625,
-0.06182861328125,
-0.019561767578125,
-0.071533203125,
0.02178955078125,
-0.0284881591796875,
-0.03167724609375,
0.056488037109375,
0.053680419921875,
0.06866455078125,
0.0213775634765625,
0.0743408203125,
-0.0435791015625,
0.092041015625,
-0.006076812744140625,
0.040557861328125,
-0.0491943359375,
-0.008575439453125,
-0.042694091796875,
-0.047882080078125,
-0.0389404296875,
0.048126220703125,
-0.03759765625,
0.0165252685546875,
0.0384521484375,
0.072509765625,
-0.0012426376342773438,
0.0048675537109375,
0.03155517578125,
0.058929443359375,
0.0299530029296875,
0.0243682861328125,
0.04315185546875,
-0.0288543701171875,
0.050506591796875,
-0.044708251953125,
-0.0138397216796875,
-0.023406982421875,
-0.0462646484375,
-0.0701904296875,
-0.05316162109375,
-0.0252838134765625,
-0.056121826171875,
0.01282501220703125,
0.08697509765625,
0.04833984375,
-0.09552001953125,
-0.0191802978515625,
0.006198883056640625,
0.00685882568359375,
0.0131072998046875,
-0.02130126953125,
0.01192474365234375,
-0.0077056884765625,
-0.07427978515625,
0.0029850006103515625,
-0.0018205642700195312,
-0.0036487579345703125,
0.00966644287109375,
0.00246429443359375,
-0.0572509765625,
-0.0019989013671875,
0.041259765625,
0.01056671142578125,
-0.06280517578125,
-0.0281219482421875,
-0.0241851806640625,
-0.0214996337890625,
0.00576019287109375,
0.0198974609375,
-0.0198822021484375,
-0.00026726722717285156,
0.0250396728515625,
0.0238189697265625,
0.0272979736328125,
-0.00982666015625,
0.01399993896484375,
-0.037353515625,
0.021331787109375,
0.03997802734375,
0.0298309326171875,
0.02728271484375,
-0.01372528076171875,
0.031219482421875,
0.03143310546875,
-0.02642822265625,
-0.06842041015625,
0.006389617919921875,
-0.08306884765625,
-0.0174560546875,
0.0975341796875,
0.0230712890625,
-0.025146484375,
-0.0285797119140625,
-0.01253509521484375,
0.036102294921875,
-0.06005859375,
0.051788330078125,
0.0487060546875,
0.01763916015625,
-0.0176849365234375,
-0.0305328369140625,
0.0545654296875,
0.01421356201171875,
-0.03155517578125,
0.006137847900390625,
0.038970947265625,
0.0279388427734375,
0.011566162109375,
0.0479736328125,
-0.001190185546875,
0.0134429931640625,
-0.01247406005859375,
0.02044677734375,
0.0169525146484375,
-0.01367950439453125,
-0.0135498046875,
0.01454925537109375,
-0.021209716796875,
-0.0034847259521484375
]
] |
TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ | 2023-09-27T12:44:25.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"uncensored",
"en",
"dataset:ehartford/wizard_vicuna_70k_unfiltered",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ | 407 | 16,342 | transformers | 2023-05-30T03:11:00 | ---
language:
- en
license: other
tags:
- uncensored
datasets:
- ehartford/wizard_vicuna_70k_unfiltered
model_name: Wizard Vicuna 30B Uncensored
base_model: ehartford/Wizard-Vicuna-30B-Uncensored
inference: false
model_creator: Eric Hartford
model_type: llama
prompt_template: 'A chat between a curious user and an artificial intelligence assistant.
The assistant gives helpful, detailed, and polite answers to the user''s questions.
USER: {prompt} ASSISTANT:
'
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Wizard Vicuna 30B Uncensored - GPTQ
- Model creator: [Eric Hartford](https://huggingface.co/ehartford)
- Original model: [Wizard Vicuna 30B Uncensored](https://huggingface.co/ehartford/Wizard-Vicuna-30B-Uncensored)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Eric Hartford's Wizard-Vicuna-30B-Uncensored](https://huggingface.co/ehartford/Wizard-Vicuna-30B-Uncensored).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GGUF)
* [Eric Hartford's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-fp16)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Vicuna
```
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: {prompt} ASSISTANT:
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ/tree/main) | 4 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 16.94 GB | Yes | 4-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 19.44 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 18.18 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 17.55 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 32.99 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_False](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ/tree/gptq-8bit-128g-actorder_False) | 8 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 33.73 GB | No | 8-bit, with group size 128g for higher inference quality and without Act Order to improve AutoGPTQ speed. |
| [gptq-3bit--1g-actorder_True](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ/tree/gptq-3bit--1g-actorder_True) | 3 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 12.92 GB | No | 3-bit, with Act Order and no group size. Lowest possible VRAM requirements. May be lower quality than 3-bit 128g. |
| [gptq-3bit-128g-actorder_False](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ/tree/gptq-3bit-128g-actorder_False) | 3 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 13.51 GB | No | 3-bit, with group size 128g but no act-order. Slightly higher VRAM requirements than 3-bit None. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, eg `TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ:main`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch main https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see below.
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ:main`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Wizard-Vicuna-30B-Uncensored-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install transformers>=4.32.0 optimum>=1.12.0
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip3 install .
```
### For CodeLlama models only: you must use Transformers 4.33.0 or later.
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source:
```shell
pip3 uninstall -y transformers
pip3 install git+https://github.com/huggingface/transformers.git
```
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ"
# To use a different branch, change revision
# For example: revision="main"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: {prompt} ASSISTANT:
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](llm-utils)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Eric Hartford's Wizard-Vicuna-30B-Uncensored
<!-- header start -->
<div style="width: 100%;">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p><a href="https://discord.gg/Jq4vkcDakD">Chat & support: my new Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<!-- header end -->
# Eric Hartford's Wizard-Vicuna-30B-Uncensored GPTQ
This is an fp16 models of [Eric Hartford's Wizard-Vicuna 30B](https://huggingface.co/ehartford/Wizard-Vicuna-30B-Uncensored).
It is the result of converting Eric's original fp32 upload to fp16.
## Repositories available
* [4bit GPTQ models for GPU inference](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GPTQ).
* [4bit and 5bit GGML models for CPU inference](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GGML).
* [float16 HF format model for GPU inference and further conversions](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-fp16).
<!-- footer start -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/Jq4vkcDakD)
## Thanks, and how to contribute.
Thanks to the [chirper.ai](https://chirper.ai) team!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Patreon special mentions**: Aemon Algiz, Dmitriy Samsonov, Nathan LeClaire, Trenton Dambrowitz, Mano Prime, David Flickinger, vamX, Nikolai Manek, senxiiz, Khalefa Al-Ahmad, Illia Dulskyi, Jonathan Leane, Talal Aujan, V. Lukas, Joseph William Delisle, Pyrater, Oscar Rangel, Lone Striker, Luke Pendergrass, Eugene Pentland, Sebastain Graf, Johann-Peter Hartman.
Thank you to all my generous patrons and donaters!
<!-- footer end -->
# Original model card
This is [wizard-vicuna-13b](https://huggingface.co/junelee/wizard-vicuna-13b) trained with a subset of the dataset - responses that contained alignment / moralizing were removed. The intent is to train a WizardLM that doesn't have alignment built-in, so that alignment (of any sort) can be added separately with for example with a RLHF LoRA.
Shout out to the open source AI/ML community, and everyone who helped me out.
Note:
An uncensored model has no guardrails.
You are responsible for anything you do with the model, just as you are responsible for anything you do with any dangerous object such as a knife, gun, lighter, or car.
Publishing anything this model generates is the same as publishing it yourself.
You are responsible for the content you publish, and you cannot blame the model any more than you can blame the knife, gun, lighter, or car for what you do with it.
| 19,579 | [
[
-0.040740966796875,
-0.05609130859375,
0.0033626556396484375,
0.0099334716796875,
-0.01727294921875,
-0.012176513671875,
0.00579071044921875,
-0.039215087890625,
0.0172576904296875,
0.032012939453125,
-0.042755126953125,
-0.0372314453125,
-0.0252227783203125,
-0.004791259765625,
-0.0263519287109375,
0.0760498046875,
0.0080413818359375,
-0.021484375,
-0.000025391578674316406,
-0.0218963623046875,
-0.02825927734375,
-0.03564453125,
-0.05810546875,
-0.012786865234375,
0.0271759033203125,
0.0155792236328125,
0.07049560546875,
0.03851318359375,
0.00760650634765625,
0.0230255126953125,
0.006885528564453125,
0.00460052490234375,
-0.038970947265625,
-0.0172882080078125,
0.012420654296875,
-0.0167388916015625,
-0.044769287109375,
0.00867462158203125,
0.03662109375,
0.008819580078125,
-0.0227203369140625,
0.01343536376953125,
0.00321197509765625,
0.053436279296875,
-0.033935546875,
0.0138092041015625,
-0.027801513671875,
0.01383209228515625,
-0.0052337646484375,
0.0025177001953125,
-0.005786895751953125,
-0.02972412109375,
0.00377655029296875,
-0.06939697265625,
0.0210113525390625,
0.004993438720703125,
0.08758544921875,
0.01265716552734375,
-0.0521240234375,
0.0180511474609375,
-0.033966064453125,
0.037567138671875,
-0.06536865234375,
0.024749755859375,
0.042266845703125,
0.02197265625,
-0.0187835693359375,
-0.060546875,
-0.053375244140625,
0.00001895427703857422,
-0.0138397216796875,
0.0276031494140625,
-0.0293426513671875,
0.00962066650390625,
0.038330078125,
0.057159423828125,
-0.0712890625,
-0.01366424560546875,
-0.020111083984375,
-0.0145263671875,
0.068359375,
0.0165252685546875,
0.02825927734375,
-0.0185546875,
-0.0172882080078125,
-0.03302001953125,
-0.039764404296875,
0.006282806396484375,
0.0258636474609375,
-0.00888824462890625,
-0.036773681640625,
0.04119873046875,
-0.02825927734375,
0.03668212890625,
0.01045989990234375,
-0.01097869873046875,
0.021484375,
-0.046661376953125,
-0.03875732421875,
-0.02545166015625,
0.0948486328125,
0.038818359375,
-0.0131988525390625,
0.0128936767578125,
0.0047454833984375,
-0.01383209228515625,
-0.00013184547424316406,
-0.072021484375,
-0.047149658203125,
0.033203125,
-0.031219482421875,
-0.01203155517578125,
-0.0018682479858398438,
-0.05426025390625,
-0.00798797607421875,
-0.00754547119140625,
0.042633056640625,
-0.050140380859375,
-0.03399658203125,
0.00872802734375,
-0.034912109375,
0.0401611328125,
0.0296783447265625,
-0.051971435546875,
0.037689208984375,
0.0242919921875,
0.047515869140625,
0.00377655029296875,
-0.01119232177734375,
-0.01029205322265625,
0.007228851318359375,
-0.01456451416015625,
0.0269622802734375,
-0.006809234619140625,
-0.0306243896484375,
-0.029632568359375,
0.02398681640625,
0.0020008087158203125,
-0.01568603515625,
0.03485107421875,
-0.023651123046875,
0.035675048828125,
-0.034423828125,
-0.044586181640625,
-0.02935791015625,
0.005443572998046875,
-0.05126953125,
0.09918212890625,
0.036041259765625,
-0.06280517578125,
0.0199737548828125,
-0.03729248046875,
-0.004871368408203125,
0.0018568038940429688,
-0.0034809112548828125,
-0.0340576171875,
-0.0126953125,
0.0138092041015625,
0.015289306640625,
-0.0283660888671875,
0.01055145263671875,
-0.0159912109375,
-0.02142333984375,
0.01107025146484375,
-0.056793212890625,
0.10028076171875,
0.0098876953125,
-0.03643798828125,
-0.0081634521484375,
-0.054840087890625,
0.01476287841796875,
0.032867431640625,
-0.0087127685546875,
-0.00304412841796875,
-0.017425537109375,
0.005367279052734375,
0.0138397216796875,
0.0167083740234375,
-0.0208587646484375,
0.0389404296875,
-0.01247406005859375,
0.03448486328125,
0.043182373046875,
0.008392333984375,
0.01444244384765625,
-0.0274200439453125,
0.03985595703125,
-0.0014696121215820312,
0.05560302734375,
0.0153350830078125,
-0.058929443359375,
-0.05126953125,
-0.0211334228515625,
0.0273590087890625,
0.050262451171875,
-0.0584716796875,
0.04425048828125,
-0.014495849609375,
-0.058441162109375,
-0.026885986328125,
-0.00484466552734375,
0.0261688232421875,
0.0269622802734375,
0.0379638671875,
-0.03704833984375,
-0.017608642578125,
-0.0672607421875,
0.004756927490234375,
-0.034759521484375,
-0.00649261474609375,
0.03369140625,
0.04827880859375,
-0.01062774658203125,
0.06103515625,
-0.048614501953125,
-0.0053253173828125,
0.01129913330078125,
0.003505706787109375,
0.0235595703125,
0.041595458984375,
0.06036376953125,
-0.0579833984375,
-0.048828125,
-0.0034656524658203125,
-0.04864501953125,
-0.0011348724365234375,
-0.00244903564453125,
-0.041015625,
0.016754150390625,
-0.002712249755859375,
-0.08154296875,
0.056915283203125,
0.030487060546875,
-0.053192138671875,
0.06292724609375,
-0.02130126953125,
0.0231170654296875,
-0.0765380859375,
-0.00244903564453125,
0.0030498504638671875,
-0.022216796875,
-0.031829833984375,
-0.0023193359375,
-0.0028705596923828125,
0.0169219970703125,
-0.025909423828125,
0.051544189453125,
-0.041778564453125,
0.002361297607421875,
0.00464630126953125,
-0.0077362060546875,
0.027435302734375,
0.045562744140625,
-0.01042938232421875,
0.060150146484375,
0.038970947265625,
-0.032684326171875,
0.0498046875,
0.032196044921875,
0.0005526542663574219,
0.0219573974609375,
-0.054168701171875,
0.007328033447265625,
0.01666259765625,
0.026336669921875,
-0.06805419921875,
-0.018646240234375,
0.04449462890625,
-0.048858642578125,
0.036376953125,
-0.0292205810546875,
-0.04498291015625,
-0.0303192138671875,
-0.041900634765625,
0.02490234375,
0.058258056640625,
-0.0299530029296875,
0.03289794921875,
0.02825927734375,
0.01158905029296875,
-0.053802490234375,
-0.047149658203125,
-0.00920867919921875,
-0.021148681640625,
-0.042816162109375,
0.035003662109375,
-0.0099334716796875,
0.0017147064208984375,
0.007396697998046875,
-0.0033168792724609375,
-0.00691986083984375,
-0.0080718994140625,
0.0208740234375,
0.024322509765625,
-0.011932373046875,
-0.01151275634765625,
0.01534271240234375,
0.00769805908203125,
0.00432586669921875,
-0.025146484375,
0.0244140625,
-0.01279449462890625,
-0.0036907196044921875,
-0.0291595458984375,
0.0226287841796875,
0.03778076171875,
0.00937652587890625,
0.05908203125,
0.06793212890625,
-0.0245361328125,
0.0107421875,
-0.037567138671875,
-0.00745391845703125,
-0.037017822265625,
0.0129547119140625,
-0.0183258056640625,
-0.048492431640625,
0.041046142578125,
0.041168212890625,
0.01407623291015625,
0.06353759765625,
0.0289459228515625,
0.004413604736328125,
0.0748291015625,
0.0232391357421875,
-0.01280975341796875,
0.035369873046875,
-0.04766845703125,
-0.01177978515625,
-0.05413818359375,
-0.0155487060546875,
-0.0212249755859375,
-0.01041412353515625,
-0.0657958984375,
-0.044403076171875,
0.0240478515625,
0.028167724609375,
-0.0579833984375,
0.04339599609375,
-0.058929443359375,
0.0169677734375,
0.051361083984375,
0.0186614990234375,
0.0179290771484375,
0.0034046173095703125,
-0.007129669189453125,
0.00865936279296875,
-0.04974365234375,
-0.0179290771484375,
0.0831298828125,
0.0221710205078125,
0.0391845703125,
0.0218505859375,
0.0273590087890625,
0.0168914794921875,
0.0183258056640625,
-0.0390625,
0.046295166015625,
0.0059051513671875,
-0.059173583984375,
-0.034332275390625,
-0.044677734375,
-0.0687255859375,
0.024383544921875,
-0.013214111328125,
-0.0604248046875,
0.0306243896484375,
-0.00009042024612426758,
-0.0185089111328125,
0.0234375,
-0.058319091796875,
0.082275390625,
-0.0148162841796875,
-0.036865234375,
-0.001308441162109375,
-0.0526123046875,
0.0279693603515625,
0.0160064697265625,
-0.00893402099609375,
-0.01207733154296875,
-0.01230621337890625,
0.06256103515625,
-0.0728759765625,
0.0499267578125,
-0.0205841064453125,
0.0009984970092773438,
0.042938232421875,
-0.01085662841796875,
0.035400390625,
0.01322174072265625,
0.0111083984375,
0.0268402099609375,
0.034027099609375,
-0.0421142578125,
-0.038421630859375,
0.03875732421875,
-0.0772705078125,
-0.03997802734375,
-0.041168212890625,
-0.02984619140625,
-0.00749969482421875,
0.0038623809814453125,
0.04241943359375,
0.0299530029296875,
-0.0023555755615234375,
0.0010156631469726562,
0.052490234375,
-0.02587890625,
0.027130126953125,
0.026763916015625,
-0.01922607421875,
-0.04425048828125,
0.06512451171875,
0.01160430908203125,
0.01050567626953125,
0.01666259765625,
0.00476837158203125,
-0.036224365234375,
-0.0364990234375,
-0.047149658203125,
0.0233917236328125,
-0.042449951171875,
-0.031219482421875,
-0.047576904296875,
-0.029052734375,
-0.034149169921875,
0.0278472900390625,
-0.028045654296875,
-0.05535888671875,
-0.030914306640625,
-0.0056304931640625,
0.0777587890625,
0.0311431884765625,
-0.0113983154296875,
0.0190582275390625,
-0.0595703125,
0.0205535888671875,
0.0290069580078125,
0.0169219970703125,
-0.0022335052490234375,
-0.05487060546875,
-0.01043701171875,
0.00835418701171875,
-0.045318603515625,
-0.080810546875,
0.053497314453125,
0.01348114013671875,
0.04010009765625,
0.034942626953125,
0.0170745849609375,
0.06231689453125,
-0.01605224609375,
0.079833984375,
0.0153656005859375,
-0.05810546875,
0.035919189453125,
-0.03656005859375,
0.01125335693359375,
0.035675048828125,
0.04498291015625,
-0.0226593017578125,
-0.0245361328125,
-0.06134033203125,
-0.06549072265625,
0.034637451171875,
0.03350830078125,
-0.00005745887756347656,
0.006839752197265625,
0.041046142578125,
0.0074920654296875,
0.006793975830078125,
-0.058929443359375,
-0.055145263671875,
-0.032684326171875,
-0.00799560546875,
0.00879669189453125,
-0.0043487548828125,
-0.0201568603515625,
-0.058990478515625,
0.0709228515625,
-0.01395416259765625,
0.05523681640625,
0.024017333984375,
0.01251983642578125,
0.0054473876953125,
-0.0009975433349609375,
0.0174407958984375,
0.038330078125,
-0.0167999267578125,
-0.0210723876953125,
0.00934600830078125,
-0.06640625,
0.0073394775390625,
0.028594970703125,
-0.00951385498046875,
-0.006259918212890625,
0.00846099853515625,
0.0643310546875,
-0.01418304443359375,
-0.023162841796875,
0.036163330078125,
-0.0325927734375,
-0.0264892578125,
-0.030517578125,
0.02557373046875,
0.00878143310546875,
0.0303192138671875,
0.0289154052734375,
-0.024749755859375,
0.03094482421875,
-0.039764404296875,
0.009307861328125,
0.03839111328125,
-0.01092529296875,
-0.020263671875,
0.056732177734375,
-0.013336181640625,
0.00955963134765625,
0.056915283203125,
-0.0233612060546875,
-0.030364990234375,
0.057891845703125,
0.0238494873046875,
0.0565185546875,
-0.009796142578125,
0.0184173583984375,
0.04412841796875,
0.01178741455078125,
-0.0021610260009765625,
0.02435302734375,
-0.01152801513671875,
-0.044342041015625,
-0.02001953125,
-0.038421630859375,
-0.025054931640625,
0.0211639404296875,
-0.058502197265625,
0.00853729248046875,
-0.025360107421875,
-0.030853271484375,
-0.01458740234375,
0.03863525390625,
-0.045166015625,
0.0216064453125,
-0.003833770751953125,
0.0760498046875,
-0.05560302734375,
0.074462890625,
0.0285491943359375,
-0.034912109375,
-0.07696533203125,
-0.01343536376953125,
0.004558563232421875,
-0.0372314453125,
0.0063323974609375,
-0.0034389495849609375,
0.0254058837890625,
0.00009906291961669922,
-0.050506591796875,
-0.06402587890625,
0.11712646484375,
0.0209503173828125,
-0.0335693359375,
-0.01058197021484375,
-0.0004935264587402344,
0.02789306640625,
-0.0015649795532226562,
0.058685302734375,
0.049407958984375,
0.0293121337890625,
0.01357269287109375,
-0.06884765625,
0.0276031494140625,
-0.040985107421875,
-0.0054931640625,
0.00893402099609375,
-0.0740966796875,
0.0667724609375,
0.012054443359375,
-0.00962066650390625,
0.0055084228515625,
0.05218505859375,
0.027252197265625,
0.007312774658203125,
0.025543212890625,
0.0577392578125,
0.06390380859375,
-0.02313232421875,
0.09307861328125,
-0.01326751708984375,
0.036651611328125,
0.0618896484375,
0.016082763671875,
0.04705810546875,
0.0131072998046875,
-0.052459716796875,
0.03387451171875,
0.07867431640625,
-0.00780487060546875,
0.0308074951171875,
-0.0015497207641601562,
-0.0244293212890625,
-0.005176544189453125,
0.0211181640625,
-0.055694580078125,
0.0011091232299804688,
0.0308990478515625,
-0.01032257080078125,
0.0125732421875,
-0.01145172119140625,
0.0016355514526367188,
-0.051727294921875,
-0.0157012939453125,
0.040863037109375,
0.01322174072265625,
-0.0289459228515625,
0.06793212890625,
-0.01084136962890625,
0.04736328125,
-0.03955078125,
-0.010711669921875,
-0.031890869140625,
-0.0097503662109375,
-0.0207672119140625,
-0.060638427734375,
0.010467529296875,
-0.0173492431640625,
-0.00655364990234375,
0.010162353515625,
0.043731689453125,
-0.0154266357421875,
-0.0299072265625,
0.0274810791015625,
0.0303192138671875,
0.0300750732421875,
-0.00783538818359375,
-0.08404541015625,
0.0038928985595703125,
-0.0008111000061035156,
-0.055023193359375,
0.038055419921875,
0.041168212890625,
0.005237579345703125,
0.04827880859375,
0.036529541015625,
-0.00876617431640625,
0.0130767822265625,
-0.014495849609375,
0.068115234375,
-0.0596923828125,
-0.0192108154296875,
-0.0511474609375,
0.03802490234375,
-0.0175628662109375,
-0.031341552734375,
0.067138671875,
0.05084228515625,
0.05108642578125,
0.008636474609375,
0.050323486328125,
-0.0250244140625,
0.00690460205078125,
-0.0283203125,
0.0518798828125,
-0.057373046875,
0.006656646728515625,
-0.025970458984375,
-0.05682373046875,
0.0015010833740234375,
0.04730224609375,
-0.00868988037109375,
0.020233154296875,
0.032073974609375,
0.06292724609375,
0.0022296905517578125,
0.01039886474609375,
0.0214691162109375,
0.02459716796875,
0.0102996826171875,
0.06414794921875,
0.048492431640625,
-0.0758056640625,
0.039520263671875,
-0.0333251953125,
-0.0174713134765625,
-0.0082855224609375,
-0.05224609375,
-0.057830810546875,
-0.03704833984375,
-0.05194091796875,
-0.05462646484375,
-0.0008559226989746094,
0.06353759765625,
0.056640625,
-0.047027587890625,
-0.0170745849609375,
-0.006801605224609375,
0.0037441253662109375,
-0.0265045166015625,
-0.026153564453125,
0.036468505859375,
0.0295867919921875,
-0.051177978515625,
0.00893402099609375,
0.002506256103515625,
0.023162841796875,
-0.01229095458984375,
-0.0294647216796875,
-0.01076507568359375,
0.00962066650390625,
0.045074462890625,
0.046234130859375,
-0.037750244140625,
-0.006748199462890625,
-0.00846099853515625,
-0.00771331787109375,
0.02142333984375,
0.01338958740234375,
-0.055694580078125,
0.0061187744140625,
0.03704833984375,
0.00922393798828125,
0.0716552734375,
0.00247955322265625,
0.0228118896484375,
-0.0287933349609375,
0.0028438568115234375,
0.00852203369140625,
0.0227813720703125,
0.0014133453369140625,
-0.04632568359375,
0.050994873046875,
0.030731201171875,
-0.050048828125,
-0.053070068359375,
-0.01465606689453125,
-0.0906982421875,
-0.0176849365234375,
0.08380126953125,
-0.00868988037109375,
-0.03216552734375,
-0.00965118408203125,
-0.0166473388671875,
0.030792236328125,
-0.0333251953125,
0.0196685791015625,
0.03594970703125,
-0.026458740234375,
-0.0328369140625,
-0.0679931640625,
0.046478271484375,
0.01270294189453125,
-0.056884765625,
0.00113677978515625,
0.046966552734375,
0.035980224609375,
-0.0024127960205078125,
0.07318115234375,
-0.0221405029296875,
0.025726318359375,
0.016876220703125,
0.0035572052001953125,
-0.0030994415283203125,
0.00384521484375,
-0.0214996337890625,
-0.0018091201782226562,
-0.015655517578125,
-0.0042572021484375
]
] |
textattack/distilbert-base-uncased-CoLA | 2020-07-06T16:29:03.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | textattack | null | null | textattack/distilbert-base-uncased-CoLA | 1 | 16,319 | transformers | 2022-03-02T23:29:05 | ## TextAttack Model Cardand the glue dataset loaded using the `nlp` library. The model was fine-tuned
for 5 epochs with a batch size of 64, a learning
rate of 3e-05, and a maximum sequence length of 128.
Since this was a classification task, the model was trained with a cross-entropy loss function.
The best score the model achieved on this task was 0.8235858101629914, as measured by the
eval set accuracy, found after 2 epochs.
For more information, check out [TextAttack on Github](https://github.com/QData/TextAttack).
| 530 | [
[
-0.0033931732177734375,
-0.0394287109375,
0.025299072265625,
0.0155487060546875,
-0.0147552490234375,
-0.001468658447265625,
-0.011016845703125,
-0.03448486328125,
-0.002079010009765625,
0.01070404052734375,
-0.032135009765625,
-0.050537109375,
-0.044677734375,
0.004108428955078125,
-0.031494140625,
0.1048583984375,
0.018585205078125,
-0.0076446533203125,
-0.018463134765625,
-0.0046539306640625,
-0.037139892578125,
-0.0214080810546875,
-0.04266357421875,
-0.027008056640625,
0.03448486328125,
0.029541015625,
0.0609130859375,
0.0653076171875,
0.05462646484375,
0.01568603515625,
-0.02447509765625,
0.0017671585083007812,
-0.038970947265625,
-0.0247039794921875,
-0.00981903076171875,
-0.0557861328125,
-0.054412841796875,
0.006435394287109375,
0.038299560546875,
0.002780914306640625,
-0.0171966552734375,
0.0494384765625,
0.003940582275390625,
0.05035400390625,
-0.0333251953125,
-0.006458282470703125,
-0.05438232421875,
0.01641845703125,
-0.01236724853515625,
-0.00795745849609375,
-0.0545654296875,
-0.0142669677734375,
0.0164947509765625,
-0.046661376953125,
0.0078277587890625,
0.02032470703125,
0.07568359375,
0.02099609375,
-0.01483154296875,
0.0011272430419921875,
-0.030609130859375,
0.072998046875,
-0.05010986328125,
0.0153045654296875,
0.0572509765625,
0.004852294921875,
-0.0090179443359375,
-0.062255859375,
-0.0146484375,
0.016265869140625,
0.0167388916015625,
-0.01261138916015625,
-0.00635528564453125,
-0.00693511962890625,
0.0264892578125,
0.0306396484375,
-0.084716796875,
0.0099334716796875,
-0.05096435546875,
-0.01995849609375,
0.0772705078125,
0.0325927734375,
0.02142333984375,
-0.01027679443359375,
-0.0469970703125,
-0.0186614990234375,
-0.00666046142578125,
-0.003955841064453125,
0.023590087890625,
0.01433563232421875,
-0.01522064208984375,
0.0390625,
0.0045623779296875,
0.060211181640625,
0.0031261444091796875,
-0.03216552734375,
0.0190277099609375,
-0.0029850006103515625,
-0.0443115234375,
-0.0013513565063476562,
0.07257080078125,
0.024749755859375,
0.007297515869140625,
0.003910064697265625,
-0.012115478515625,
-0.00872802734375,
0.0245513916015625,
-0.0516357421875,
-0.0294342041015625,
0.03289794921875,
-0.047119140625,
-0.03314208984375,
-0.0026397705078125,
-0.0249786376953125,
-0.01557159423828125,
-0.020355224609375,
0.0430908203125,
-0.06005859375,
0.003162384033203125,
0.0232696533203125,
-0.0289459228515625,
0.0187225341796875,
0.01511383056640625,
-0.048736572265625,
0.01873779296875,
0.044647216796875,
0.08941650390625,
-0.0323486328125,
-0.01329803466796875,
0.00856781005859375,
-0.016998291015625,
-0.0276031494140625,
0.07025146484375,
-0.0347900390625,
-0.00768280029296875,
-0.0276031494140625,
-0.0015716552734375,
-0.00490570068359375,
-0.0164337158203125,
0.038116455078125,
-0.031402587890625,
0.03369140625,
0.0175628662109375,
-0.072265625,
-0.034576416015625,
0.00707244873046875,
-0.059326171875,
0.08740234375,
0.0225372314453125,
-0.042144775390625,
0.042144775390625,
-0.03564453125,
-0.00997161865234375,
0.0120391845703125,
0.0244140625,
-0.060272216796875,
0.0103607177734375,
0.0018558502197265625,
0.042938232421875,
-0.0307769775390625,
0.0281219482421875,
-0.0297088623046875,
-0.03839111328125,
0.01438140869140625,
-0.0298004150390625,
0.0775146484375,
0.005889892578125,
-0.0021514892578125,
-0.01023101806640625,
-0.0699462890625,
0.027740478515625,
-0.00283050537109375,
-0.03253173828125,
-0.00780487060546875,
-0.033416748046875,
0.020111083984375,
0.0148773193359375,
0.02166748046875,
-0.036865234375,
0.03790283203125,
-0.0291290283203125,
0.019622802734375,
0.054443359375,
-0.0043487548828125,
0.023040771484375,
-0.048797607421875,
0.04443359375,
-0.005970001220703125,
0.01087188720703125,
-0.021697998046875,
-0.054595947265625,
-0.0272674560546875,
-0.01873779296875,
0.041015625,
0.052886962890625,
-0.0247802734375,
0.0347900390625,
-0.01206207275390625,
-0.0738525390625,
-0.0335693359375,
0.001556396484375,
0.0237274169921875,
0.012725830078125,
0.029541015625,
-0.01160430908203125,
-0.0289764404296875,
-0.0595703125,
-0.0222015380859375,
-0.01605224609375,
-0.0132598876953125,
0.0022563934326171875,
0.0611572265625,
0.009002685546875,
0.0689697265625,
-0.06146240234375,
-0.04034423828125,
0.01151275634765625,
0.0266265869140625,
0.0269622802734375,
0.035400390625,
0.028228759765625,
-0.04718017578125,
-0.035675048828125,
-0.016937255859375,
-0.033447265625,
0.01329803466796875,
-0.00778961181640625,
0.005916595458984375,
0.025848388671875,
0.0227508544921875,
-0.0408935546875,
0.050537109375,
0.05487060546875,
-0.04827880859375,
0.05078125,
-0.01084136962890625,
0.006847381591796875,
-0.1043701171875,
0.01331329345703125,
0.0026493072509765625,
-0.0259857177734375,
-0.0296478271484375,
-0.01183319091796875,
0.00432586669921875,
-0.032379150390625,
-0.058258056640625,
0.04302978515625,
-0.03143310546875,
0.01171112060546875,
-0.00994873046875,
-0.012054443359375,
0.01229095458984375,
0.050689697265625,
-0.00029087066650390625,
0.059356689453125,
0.0216217041015625,
-0.02960205078125,
0.0232696533203125,
0.023101806640625,
-0.017730712890625,
0.03448486328125,
-0.060638427734375,
0.0350341796875,
0.007965087890625,
0.007434844970703125,
-0.07818603515625,
-0.0236663818359375,
-0.0230560302734375,
-0.046173095703125,
0.0159759521484375,
0.00882720947265625,
-0.0350341796875,
-0.018768310546875,
-0.0439453125,
0.0411376953125,
0.03326416015625,
-0.02166748046875,
0.034637451171875,
0.03607177734375,
0.0220947265625,
-0.037628173828125,
-0.0496826171875,
-0.00872039794921875,
-0.0325927734375,
-0.04180908203125,
0.04058837890625,
-0.0253448486328125,
0.0224456787109375,
-0.01555633544921875,
0.0079498291015625,
-0.02801513671875,
-0.00969696044921875,
0.0252838134765625,
0.0125274658203125,
-0.006137847900390625,
0.0384521484375,
-0.0095367431640625,
-0.007411956787109375,
-0.01053619384765625,
-0.00901031494140625,
0.0396728515625,
-0.0228729248046875,
0.01244354248046875,
-0.045257568359375,
-0.0033283233642578125,
0.047088623046875,
0.00507354736328125,
0.08740234375,
0.05010986328125,
-0.043853759765625,
-0.0170440673828125,
-0.01690673828125,
-0.00518035888671875,
-0.03485107421875,
0.0287933349609375,
-0.030670166015625,
-0.053436279296875,
0.042022705078125,
0.0140380859375,
0.0008087158203125,
0.0589599609375,
0.0347900390625,
0.0024929046630859375,
0.061767578125,
0.053497314453125,
-0.018829345703125,
0.0162811279296875,
-0.018951416015625,
-0.001495361328125,
-0.049560546875,
-0.022918701171875,
-0.0225372314453125,
-0.0255126953125,
-0.04168701171875,
-0.03851318359375,
0.0128936767578125,
0.022552490234375,
-0.0143280029296875,
0.053680419921875,
-0.058624267578125,
0.0380859375,
0.03619384765625,
0.0240020751953125,
0.005519866943359375,
-0.0158233642578125,
0.006771087646484375,
-0.0016803741455078125,
-0.056793212890625,
-0.024566650390625,
0.0924072265625,
0.0467529296875,
0.056671142578125,
-0.008026123046875,
0.03814697265625,
0.0343017578125,
-0.0056610107421875,
-0.06097412109375,
0.0430908203125,
-0.0089569091796875,
-0.043792724609375,
-0.0396728515625,
0.00046539306640625,
-0.063232421875,
-0.0292205810546875,
-0.0292205810546875,
-0.058258056640625,
-0.00936126708984375,
0.023162841796875,
-0.0126190185546875,
0.0229034423828125,
-0.039581298828125,
0.06689453125,
0.0133056640625,
-0.0130462646484375,
-0.00019741058349609375,
-0.052520751953125,
0.031005859375,
-0.01568603515625,
-0.0142059326171875,
-0.0299224853515625,
-0.0035419464111328125,
0.08154296875,
-0.017303466796875,
0.038055419921875,
0.0225372314453125,
0.0010309219360351562,
0.02203369140625,
-0.005855560302734375,
0.027496337890625,
-0.01120758056640625,
0.0016002655029296875,
0.0299835205078125,
0.01025390625,
-0.0232391357421875,
-0.042449951171875,
0.016265869140625,
-0.039886474609375,
0.0012254714965820312,
-0.03106689453125,
-0.054473876953125,
0.0137481689453125,
-0.0025081634521484375,
0.0458984375,
0.039703369140625,
-0.0076751708984375,
0.0246124267578125,
0.05841064453125,
-0.02069091796875,
0.046966552734375,
0.030792236328125,
-0.01493072509765625,
-0.03045654296875,
0.07763671875,
0.0211181640625,
0.01357269287109375,
0.023681640625,
0.01995849609375,
-0.00970458984375,
-0.01200103759765625,
-0.0217437744140625,
0.01849365234375,
-0.03057861328125,
-0.0517578125,
-0.033447265625,
-0.03265380859375,
-0.0291290283203125,
-0.00325775146484375,
-0.037841796875,
-0.048309326171875,
-0.050933837890625,
-0.006694793701171875,
0.056793212890625,
0.037628173828125,
0.001117706298828125,
0.0244598388671875,
-0.06683349609375,
0.004077911376953125,
-0.0005207061767578125,
0.04315185546875,
-0.019500732421875,
-0.0645751953125,
-0.044830322265625,
-0.0236663818359375,
-0.021636962890625,
-0.0662841796875,
0.029541015625,
0.046417236328125,
0.01261138916015625,
0.0271759033203125,
0.003749847412109375,
0.047607421875,
-0.0469970703125,
0.08123779296875,
-0.00363922119140625,
-0.062255859375,
0.052337646484375,
-0.0289764404296875,
0.07080078125,
0.0201873779296875,
0.05078125,
-0.018310546875,
-0.04345703125,
-0.0740966796875,
-0.060028076171875,
0.03692626953125,
0.0081939697265625,
-0.0143280029296875,
0.0142669677734375,
0.027679443359375,
-0.000041961669921875,
-0.0035076141357421875,
-0.05377197265625,
-0.0157012939453125,
0.016845703125,
-0.0367431640625,
0.003326416015625,
-0.0241851806640625,
0.01024627685546875,
-0.019317626953125,
0.068359375,
-0.0146636962890625,
0.041351318359375,
-0.0028514862060546875,
-0.0153656005859375,
0.011016845703125,
0.00437164306640625,
0.07281494140625,
0.04693603515625,
-0.022552490234375,
-0.017852783203125,
0.0228271484375,
-0.0653076171875,
0.01515960693359375,
0.0018825531005859375,
0.00862884521484375,
-0.00556182861328125,
0.047332763671875,
0.06378173828125,
-0.0096435546875,
-0.048095703125,
0.0330810546875,
-0.00431060791015625,
-0.01074981689453125,
-0.0249176025390625,
0.0147705078125,
-0.01422119140625,
0.01515960693359375,
0.02752685546875,
-0.01453399658203125,
0.0275421142578125,
-0.0037326812744140625,
0.048095703125,
0.0181884765625,
-0.0533447265625,
-0.0310211181640625,
0.053619384765625,
-0.0174713134765625,
-0.046234130859375,
0.058135986328125,
-0.0004067420959472656,
-0.0310516357421875,
0.0390625,
0.0413818359375,
0.068359375,
-0.046630859375,
0.026947021484375,
0.039642333984375,
0.01922607421875,
-0.0198974609375,
0.0137481689453125,
0.01380157470703125,
-0.038970947265625,
-0.0265960693359375,
-0.0606689453125,
-0.010650634765625,
0.0211029052734375,
-0.066650390625,
0.032867431640625,
-0.01702880859375,
-0.03741455078125,
0.005405426025390625,
0.0111846923828125,
-0.0394287109375,
0.0469970703125,
-0.007808685302734375,
0.09466552734375,
-0.09130859375,
0.04779052734375,
0.04461669921875,
-0.01068115234375,
-0.06524658203125,
-0.0263671875,
0.01439666748046875,
-0.040771484375,
0.04241943359375,
0.020782470703125,
0.01317596435546875,
-0.004917144775390625,
-0.061767578125,
-0.0762939453125,
0.0821533203125,
0.0140838623046875,
-0.0335693359375,
-0.0098114013671875,
0.0008816719055175781,
0.048004150390625,
-0.0183563232421875,
0.0237579345703125,
0.05157470703125,
0.00030517578125,
-0.02215576171875,
-0.0860595703125,
-0.01128387451171875,
-0.0271453857421875,
0.007549285888671875,
0.022003173828125,
-0.04962158203125,
0.057952880859375,
-0.006679534912109375,
0.00013446807861328125,
-0.0003457069396972656,
0.06787109375,
0.017822265625,
0.0171356201171875,
0.0254974365234375,
0.0670166015625,
0.0546875,
0.00215911865234375,
0.0626220703125,
-0.022216796875,
0.04693603515625,
0.0947265625,
0.01629638671875,
0.07568359375,
0.01352691650390625,
-0.017669677734375,
0.04010009765625,
0.0538330078125,
-0.01259613037109375,
0.026275634765625,
0.010040283203125,
0.009033203125,
-0.002101898193359375,
-0.0092926025390625,
0.00412750244140625,
0.042694091796875,
0.0248260498046875,
-0.0301055908203125,
0.0198211669921875,
0.02716064453125,
0.01348114013671875,
-0.0200347900390625,
-0.01213836669921875,
0.0604248046875,
0.004364013671875,
-0.0282440185546875,
0.03509521484375,
-0.0080413818359375,
0.02581787109375,
-0.030029296875,
-0.0214080810546875,
0.00901031494140625,
0.00836181640625,
-0.018463134765625,
-0.076171875,
0.020111083984375,
-0.030242919921875,
-0.03173828125,
0.00396728515625,
0.0595703125,
-0.046630859375,
-0.032440185546875,
-0.0095977783203125,
-0.0086212158203125,
0.0238494873046875,
0.00910186767578125,
-0.0621337890625,
0.0080718994140625,
-0.007221221923828125,
-0.044647216796875,
0.01369476318359375,
0.022247314453125,
0.0010080337524414062,
0.06121826171875,
0.03564453125,
-0.0198211669921875,
-0.0006313323974609375,
-0.041015625,
0.036712646484375,
-0.04827880859375,
-0.0355224609375,
-0.064453125,
0.060943603515625,
-0.009613037109375,
-0.064697265625,
0.05596923828125,
0.065185546875,
0.0399169921875,
-0.0079345703125,
0.01378631591796875,
0.005283355712890625,
0.042144775390625,
-0.04486083984375,
0.038787841796875,
-0.04302978515625,
0.001079559326171875,
-0.0313720703125,
-0.060546875,
-0.02777099609375,
0.0472412109375,
0.0153961181640625,
-0.00925445556640625,
0.05712890625,
0.051483154296875,
0.0164337158203125,
0.005603790283203125,
0.0257568359375,
-0.00406646728515625,
0.01222991943359375,
0.049774169921875,
0.010986328125,
-0.07550048828125,
0.02490234375,
-0.0097198486328125,
-0.0121307373046875,
-0.005878448486328125,
-0.07916259765625,
-0.074462890625,
-0.06298828125,
-0.037841796875,
-0.042816162109375,
-0.00640106201171875,
0.0621337890625,
0.05657958984375,
-0.041015625,
-0.0163726806640625,
0.006595611572265625,
-0.005123138427734375,
-0.021759033203125,
-0.0233306884765625,
0.0499267578125,
-0.007778167724609375,
-0.049835205078125,
0.01053619384765625,
0.027496337890625,
0.0142822265625,
-0.00489044189453125,
0.009124755859375,
-0.0394287109375,
0.014495849609375,
0.03155517578125,
0.006252288818359375,
-0.025360107421875,
-0.0305328369140625,
0.0015716552734375,
-0.00682830810546875,
0.008575439453125,
0.0345458984375,
-0.05487060546875,
0.0377197265625,
0.0501708984375,
0.0374755859375,
0.037384033203125,
0.003841400146484375,
0.05938720703125,
-0.061737060546875,
0.0111846923828125,
0.0194091796875,
-0.00638580322265625,
0.00655364990234375,
-0.03924560546875,
0.05194091796875,
0.005126953125,
-0.0771484375,
-0.07037353515625,
-0.007801055908203125,
-0.06427001953125,
-0.0198974609375,
0.061279296875,
-0.012542724609375,
-0.04595947265625,
0.0022602081298828125,
-0.006427764892578125,
0.01145172119140625,
-0.025360107421875,
0.042816162109375,
0.0572509765625,
-0.0178985595703125,
-0.021026611328125,
-0.0150604248046875,
0.06512451171875,
0.0115203857421875,
-0.056640625,
-0.02496337890625,
0.00841522216796875,
0.019378662109375,
0.0321044921875,
0.04620361328125,
0.01009368896484375,
0.0284423828125,
0.016448974609375,
0.006519317626953125,
0.007358551025390625,
-0.0499267578125,
-0.029083251953125,
0.016204833984375,
-0.014373779296875,
-0.0101165771484375
]
] |
textattack/albert-base-v2-ag-news | 2020-07-07T21:59:15.000Z | [
"transformers",
"pytorch",
"albert",
"text-classification",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | textattack | null | null | textattack/albert-base-v2-ag-news | 0 | 16,314 | transformers | 2022-03-02T23:29:05 | ## TextAttack Model CardThis `albert-base-v2` model was fine-tuned for sequence classification using TextAttack
and the ag_news dataset loaded using the `nlp` library. The model was fine-tuned
for 5 epochs with a batch size of 16, a learning
rate of 2e-05, and a maximum sequence length of 128.
Since this was a classification task, the model was trained with a cross-entropy loss function.
The best score the model achieved on this task was 0.9471052631578948, as measured by the
eval set accuracy, found after 3 epochs.
For more information, check out [TextAttack on Github](https://github.com/QData/TextAttack).
| 622 | [
[
-0.0151824951171875,
-0.02581787109375,
0.0214080810546875,
0.0006442070007324219,
-0.0208282470703125,
0.005157470703125,
0.0020465850830078125,
-0.040557861328125,
-0.016143798828125,
0.020477294921875,
-0.02630615234375,
-0.051788330078125,
-0.04779052734375,
0.01172637939453125,
-0.045623779296875,
0.112548828125,
0.01214599609375,
0.00135040283203125,
-0.00555419921875,
-0.01446533203125,
-0.042205810546875,
-0.0286712646484375,
-0.052215576171875,
-0.03131103515625,
0.033935546875,
0.03863525390625,
0.0579833984375,
0.065673828125,
0.049774169921875,
0.0140380859375,
-0.0240325927734375,
-0.00527191162109375,
-0.0286102294921875,
-0.01317596435546875,
-0.015380859375,
-0.059234619140625,
-0.045623779296875,
-0.019195556640625,
0.047332763671875,
0.0060272216796875,
-0.0138702392578125,
0.035369873046875,
-0.015716552734375,
0.057220458984375,
-0.03118896484375,
-0.00843048095703125,
-0.05291748046875,
0.015716552734375,
-0.01546478271484375,
-0.0217437744140625,
-0.0496826171875,
-0.00616455078125,
0.0132293701171875,
-0.039337158203125,
0.016448974609375,
0.023773193359375,
0.07366943359375,
0.014007568359375,
-0.01345062255859375,
-0.00141143798828125,
-0.045257568359375,
0.068603515625,
-0.05181884765625,
0.017578125,
0.052520751953125,
0.0097198486328125,
0.00917816162109375,
-0.04705810546875,
-0.0265655517578125,
0.01454925537109375,
-0.0009746551513671875,
-0.0169677734375,
-0.021942138671875,
-0.01171875,
0.0280303955078125,
0.0286865234375,
-0.085693359375,
0.016387939453125,
-0.05126953125,
-0.00643157958984375,
0.069580078125,
0.0287933349609375,
0.004848480224609375,
-0.024444580078125,
-0.0194854736328125,
-0.01326751708984375,
-0.0255584716796875,
-0.01007080078125,
0.0340576171875,
0.0068359375,
-0.0110015869140625,
0.032562255859375,
-0.016937255859375,
0.04754638671875,
-0.0009255409240722656,
-0.0018558502197265625,
0.045623779296875,
0.003795623779296875,
-0.0343017578125,
-0.0018463134765625,
0.06842041015625,
0.025909423828125,
0.00789642333984375,
-0.0005974769592285156,
-0.01873779296875,
-0.01403045654296875,
0.031707763671875,
-0.054229736328125,
-0.0188140869140625,
0.0261688232421875,
-0.044525146484375,
-0.031890869140625,
0.0035495758056640625,
-0.04278564453125,
-0.011474609375,
-0.0202789306640625,
0.034881591796875,
-0.05780029296875,
-0.0014944076538085938,
0.0085906982421875,
-0.029327392578125,
0.018951416015625,
0.0165557861328125,
-0.06732177734375,
0.014739990234375,
0.04571533203125,
0.09246826171875,
-0.013458251953125,
-0.0031147003173828125,
0.0011224746704101562,
-0.0171051025390625,
-0.0201568603515625,
0.0601806640625,
-0.0232391357421875,
-0.018310546875,
-0.01406097412109375,
-0.00884246826171875,
0.0270233154296875,
-0.0215301513671875,
0.04693603515625,
-0.0494384765625,
0.0234375,
0.0130462646484375,
-0.044219970703125,
-0.03973388671875,
0.01491546630859375,
-0.053741455078125,
0.068603515625,
0.0340576171875,
-0.052520751953125,
0.031036376953125,
-0.041473388671875,
-0.028656005859375,
0.0230712890625,
0.01568603515625,
-0.058502197265625,
0.01026153564453125,
-0.005157470703125,
0.039764404296875,
-0.021148681640625,
0.0083770751953125,
-0.01084136962890625,
-0.0292510986328125,
-0.0008997917175292969,
-0.0279541015625,
0.050201416015625,
0.025360107421875,
-0.0059356689453125,
-0.01125335693359375,
-0.08843994140625,
0.03021240234375,
-0.0005221366882324219,
-0.036468505859375,
-0.0272369384765625,
-0.0089569091796875,
0.0168914794921875,
-0.0010862350463867188,
0.0024013519287109375,
-0.01332855224609375,
0.0347900390625,
-0.0303802490234375,
0.03619384765625,
0.03533935546875,
-0.006122589111328125,
0.03076171875,
-0.04656982421875,
0.028350830078125,
-0.006649017333984375,
0.00023055076599121094,
-0.034759521484375,
-0.0443115234375,
-0.04083251953125,
-0.01113128662109375,
0.044952392578125,
0.07623291015625,
-0.0275115966796875,
0.052398681640625,
-0.025848388671875,
-0.06671142578125,
-0.02691650390625,
-0.0038585662841796875,
0.02496337890625,
0.0144195556640625,
0.0443115234375,
-0.032470703125,
-0.039276123046875,
-0.0645751953125,
-0.010650634765625,
-0.0281219482421875,
-0.00759124755859375,
0.003749847412109375,
0.051605224609375,
-0.0268096923828125,
0.059661865234375,
-0.030517578125,
-0.01078033447265625,
-0.0007443428039550781,
0.057647705078125,
0.019622802734375,
0.045440673828125,
0.03399658203125,
-0.045013427734375,
-0.01715087890625,
0.0018472671508789062,
-0.03387451171875,
-0.010345458984375,
-0.0098876953125,
0.004230499267578125,
0.024078369140625,
0.01561737060546875,
-0.037750244140625,
0.046051025390625,
0.031402587890625,
-0.031951904296875,
0.03448486328125,
-0.004009246826171875,
0.006526947021484375,
-0.0911865234375,
0.01200103759765625,
0.01114654541015625,
-0.033721923828125,
-0.0430908203125,
-0.008819580078125,
0.014251708984375,
0.0010118484497070312,
-0.040557861328125,
0.035858154296875,
-0.0207977294921875,
-0.004428863525390625,
-0.022308349609375,
-0.044921875,
0.00858306884765625,
0.036407470703125,
0.01018524169921875,
0.048004150390625,
0.02581787109375,
-0.049560546875,
0.01953125,
0.028717041015625,
-0.0325927734375,
0.03619384765625,
-0.056060791015625,
0.0296630859375,
-0.0220947265625,
0.0018777847290039062,
-0.0640869140625,
-0.022064208984375,
0.0006899833679199219,
-0.04931640625,
0.037506103515625,
0.00441741943359375,
-0.03228759765625,
-0.0240020751953125,
-0.0311431884765625,
0.044189453125,
0.041717529296875,
-0.0279083251953125,
0.02685546875,
0.0135345458984375,
-0.0077972412109375,
-0.04327392578125,
-0.0623779296875,
-0.01305389404296875,
-0.0253143310546875,
-0.032684326171875,
0.039764404296875,
-0.0125274658203125,
0.00977325439453125,
-0.00824737548828125,
0.0086822509765625,
-0.007343292236328125,
-0.00414276123046875,
0.0245513916015625,
0.0167083740234375,
-0.006389617919921875,
0.0259857177734375,
0.01088714599609375,
-0.0012521743774414062,
-0.0125885009765625,
0.0052490234375,
0.051971435546875,
-0.024505615234375,
0.0214996337890625,
-0.04931640625,
0.0221099853515625,
0.052642822265625,
-0.007110595703125,
0.0645751953125,
0.03839111328125,
-0.0428466796875,
-0.0184783935546875,
-0.005382537841796875,
0.0070037841796875,
-0.0308685302734375,
0.056884765625,
-0.025299072265625,
-0.050689697265625,
0.037750244140625,
0.01297760009765625,
0.00875091552734375,
0.070068359375,
0.04193115234375,
0.00864410400390625,
0.078857421875,
0.032257080078125,
-0.0205535888671875,
0.03509521484375,
-0.03668212890625,
-0.01183319091796875,
-0.05145263671875,
-0.0253753662109375,
-0.03460693359375,
-0.0264739990234375,
-0.035369873046875,
-0.01348876953125,
0.0037288665771484375,
0.019287109375,
-0.043731689453125,
0.042633056640625,
-0.045257568359375,
0.038848876953125,
0.051483154296875,
0.0010900497436523438,
0.005741119384765625,
-0.006488800048828125,
0.0279388427734375,
0.0178985595703125,
-0.042266845703125,
-0.0295867919921875,
0.0941162109375,
0.05389404296875,
0.06414794921875,
0.00324249267578125,
0.044219970703125,
0.054656982421875,
0.01461029052734375,
-0.06640625,
0.0288238525390625,
-0.00594329833984375,
-0.041717529296875,
-0.026153564453125,
-0.0199127197265625,
-0.058074951171875,
-0.01678466796875,
-0.034210205078125,
-0.048126220703125,
0.002651214599609375,
0.0186614990234375,
-0.038787841796875,
0.00476837158203125,
-0.04779052734375,
0.065673828125,
-0.0127105712890625,
0.0010519027709960938,
-0.011871337890625,
-0.063232421875,
0.03619384765625,
-0.013427734375,
0.00307464599609375,
-0.0139923095703125,
0.007476806640625,
0.06787109375,
-0.024932861328125,
0.02471923828125,
0.0228118896484375,
0.0165557861328125,
0.01416778564453125,
-0.01409149169921875,
0.0364990234375,
0.00676727294921875,
0.003398895263671875,
0.04132080078125,
0.006664276123046875,
-0.0174560546875,
-0.0238800048828125,
0.02685546875,
-0.07269287109375,
0.002368927001953125,
-0.0517578125,
-0.039459228515625,
0.01045989990234375,
0.0035572052001953125,
0.057220458984375,
0.0278778076171875,
-0.004604339599609375,
0.023468017578125,
0.0501708984375,
-0.0130767822265625,
0.035980224609375,
0.04071044921875,
-0.0100250244140625,
-0.047698974609375,
0.053619384765625,
0.0069427490234375,
0.01397705078125,
0.0294036865234375,
-0.00047278404235839844,
-0.008697509765625,
-0.04052734375,
-0.03912353515625,
0.00371551513671875,
-0.03814697265625,
-0.06365966796875,
-0.033599853515625,
-0.039642333984375,
-0.0288238525390625,
0.0086669921875,
-0.0207672119140625,
-0.032623291015625,
-0.0509033203125,
-0.01366424560546875,
0.044189453125,
0.038116455078125,
-0.00983428955078125,
0.0594482421875,
-0.0760498046875,
0.0168304443359375,
0.0054931640625,
0.036407470703125,
-0.01519012451171875,
-0.0703125,
-0.04046630859375,
-0.0214691162109375,
-0.0361328125,
-0.07647705078125,
0.040069580078125,
0.040313720703125,
0.01580810546875,
0.03399658203125,
-0.015380859375,
0.034271240234375,
-0.06878662109375,
0.06427001953125,
0.01059722900390625,
-0.070068359375,
0.050994873046875,
-0.01110076904296875,
0.05267333984375,
0.03314208984375,
0.041961669921875,
-0.03546142578125,
-0.0251007080078125,
-0.060760498046875,
-0.07537841796875,
0.05828857421875,
0.00786590576171875,
0.0001016855239868164,
0.005035400390625,
0.040985107421875,
0.01364898681640625,
0.005035400390625,
-0.052520751953125,
-0.0236663818359375,
-0.0091552734375,
-0.041046142578125,
0.00482177734375,
-0.027618408203125,
0.00920867919921875,
-0.0248870849609375,
0.074951171875,
0.001888275146484375,
0.04217529296875,
0.00501251220703125,
-0.010498046875,
-0.0243988037109375,
0.018951416015625,
0.0654296875,
0.03778076171875,
-0.022430419921875,
-0.01348876953125,
0.01158905029296875,
-0.050384521484375,
0.00948333740234375,
0.0186614990234375,
-0.0194854736328125,
0.0078125,
0.037200927734375,
0.08740234375,
-0.0203399658203125,
-0.0479736328125,
0.0292205810546875,
-0.0076751708984375,
-0.0185546875,
-0.03619384765625,
0.02813720703125,
-0.03314208984375,
0.02813720703125,
0.041656494140625,
0.00771331787109375,
0.03704833984375,
-0.029693603515625,
0.032684326171875,
0.0240631103515625,
-0.01488494873046875,
-0.03143310546875,
0.05889892578125,
-0.0111846923828125,
-0.05377197265625,
0.049896240234375,
-0.0191192626953125,
-0.04180908203125,
0.037353515625,
0.033660888671875,
0.06732177734375,
-0.04534912109375,
0.01971435546875,
0.043731689453125,
0.0184783935546875,
-0.040924072265625,
0.0200653076171875,
0.0149688720703125,
-0.04205322265625,
-0.0301055908203125,
-0.05841064453125,
-0.0201873779296875,
0.04461669921875,
-0.069580078125,
0.0357666015625,
-0.0313720703125,
-0.0240325927734375,
0.021881103515625,
-0.0006957054138183594,
-0.043426513671875,
0.040985107421875,
0.0216217041015625,
0.08685302734375,
-0.07952880859375,
0.050201416015625,
0.049285888671875,
-0.00445556640625,
-0.040374755859375,
-0.0210418701171875,
-0.003643035888671875,
-0.0501708984375,
0.035919189453125,
0.03228759765625,
0.0311279296875,
0.0040283203125,
-0.064697265625,
-0.07550048828125,
0.0897216796875,
-0.006771087646484375,
-0.062744140625,
-0.00823211669921875,
-0.0035152435302734375,
0.0413818359375,
-0.01198577880859375,
0.0258026123046875,
0.05169677734375,
0.007045745849609375,
0.0011425018310546875,
-0.07470703125,
-0.028778076171875,
-0.033782958984375,
0.0067291259765625,
0.034088134765625,
-0.049072265625,
0.058685302734375,
-0.005889892578125,
0.005077362060546875,
0.0282440185546875,
0.07086181640625,
0.015380859375,
0.03778076171875,
0.031982421875,
0.08154296875,
0.057342529296875,
-0.0011854171752929688,
0.058685302734375,
-0.009124755859375,
0.0401611328125,
0.08123779296875,
0.00424957275390625,
0.0758056640625,
0.007904052734375,
-0.015289306640625,
0.0501708984375,
0.052520751953125,
-0.0149688720703125,
0.06365966796875,
0.021942138671875,
0.00814056396484375,
0.0052490234375,
0.018310546875,
-0.021820068359375,
0.02899169921875,
0.03729248046875,
-0.049224853515625,
0.01554107666015625,
0.014068603515625,
0.006206512451171875,
-0.0328369140625,
-0.05126953125,
0.052337646484375,
-0.00391387939453125,
-0.040985107421875,
0.03436279296875,
0.02813720703125,
0.042083740234375,
-0.034210205078125,
-0.00983428955078125,
0.0021305084228515625,
0.0141754150390625,
-0.0146026611328125,
-0.06622314453125,
0.0263671875,
-0.00217437744140625,
-0.043304443359375,
0.01275634765625,
0.048431396484375,
-0.039947509765625,
-0.03302001953125,
-0.0221405029296875,
-0.004161834716796875,
0.01328277587890625,
0.00927734375,
-0.049224853515625,
0.0006504058837890625,
-0.0191802978515625,
-0.045623779296875,
0.003543853759765625,
0.028656005859375,
-0.00737762451171875,
0.037353515625,
0.031585693359375,
-0.01145172119140625,
0.003879547119140625,
-0.0281829833984375,
0.031982421875,
-0.05743408203125,
-0.04547119140625,
-0.06964111328125,
0.047576904296875,
-0.00853729248046875,
-0.057647705078125,
0.03125,
0.0625,
0.058685302734375,
-0.005340576171875,
0.053070068359375,
-0.00588226318359375,
0.053314208984375,
-0.03570556640625,
0.035797119140625,
-0.05133056640625,
0.0029296875,
-0.00540924072265625,
-0.045196533203125,
-0.033416748046875,
0.0626220703125,
0.0042724609375,
-0.00969696044921875,
0.04180908203125,
0.03778076171875,
0.0188446044921875,
0.007843017578125,
0.004486083984375,
0.0006709098815917969,
0.01134490966796875,
0.0516357421875,
0.029937744140625,
-0.059967041015625,
0.0186004638671875,
-0.005435943603515625,
-0.01409149169921875,
-0.024871826171875,
-0.06610107421875,
-0.0682373046875,
-0.046905517578125,
-0.0301971435546875,
-0.03887939453125,
0.005191802978515625,
0.07086181640625,
0.055023193359375,
-0.06170654296875,
0.007305145263671875,
-0.0178375244140625,
-0.0126953125,
-0.01386260986328125,
-0.021514892578125,
0.0309600830078125,
-0.0089569091796875,
-0.050201416015625,
-0.001995086669921875,
0.0161895751953125,
0.019561767578125,
0.0029964447021484375,
-0.0207061767578125,
-0.0135650634765625,
0.00655364990234375,
0.0299530029296875,
0.03411865234375,
-0.029052734375,
-0.034088134765625,
-0.006870269775390625,
-0.003597259521484375,
0.0135955810546875,
0.03131103515625,
-0.05889892578125,
0.0154571533203125,
0.0546875,
0.04583740234375,
0.021148681640625,
0.01537322998046875,
0.03802490234375,
-0.06524658203125,
0.00196075439453125,
0.0304718017578125,
0.0160980224609375,
0.01499176025390625,
-0.034820556640625,
0.042022705078125,
0.013427734375,
-0.07635498046875,
-0.072998046875,
0.00146484375,
-0.0765380859375,
-0.0103912353515625,
0.09332275390625,
0.01216888427734375,
-0.0361328125,
0.0174102783203125,
-0.01430511474609375,
0.0218505859375,
-0.0304718017578125,
0.045623779296875,
0.06060791015625,
-0.0030307769775390625,
-0.031585693359375,
-0.0278778076171875,
0.050994873046875,
0.01221466064453125,
-0.056488037109375,
-0.0367431640625,
0.0224761962890625,
0.015625,
0.03448486328125,
0.03704833984375,
0.014739990234375,
0.0384521484375,
-0.01085662841796875,
0.01406097412109375,
0.005275726318359375,
-0.0491943359375,
-0.03118896484375,
0.0028476715087890625,
-0.0025501251220703125,
-0.0198211669921875
]
] |
Helsinki-NLP/opus-mt-ur-en | 2023-08-16T12:08:24.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"ur",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us",
"has_space"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-ur-en | 2 | 16,285 | transformers | 2022-03-02T23:29:04 | ---
language:
- ur
- en
tags:
- translation
license: apache-2.0
---
### urd-eng
* source group: Urdu
* target group: English
* OPUS readme: [urd-eng](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/urd-eng/README.md)
* model: transformer-align
* source language(s): urd
* target language(s): eng
* model: transformer-align
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus-2020-06-17.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/urd-eng/opus-2020-06-17.zip)
* test set translations: [opus-2020-06-17.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/urd-eng/opus-2020-06-17.test.txt)
* test set scores: [opus-2020-06-17.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/urd-eng/opus-2020-06-17.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba-test.urd.eng | 23.2 | 0.435 |
### System Info:
- hf_name: urd-eng
- source_languages: urd
- target_languages: eng
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/urd-eng/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['ur', 'en']
- src_constituents: {'urd'}
- tgt_constituents: {'eng'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/urd-eng/opus-2020-06-17.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/urd-eng/opus-2020-06-17.test.txt
- src_alpha3: urd
- tgt_alpha3: eng
- short_pair: ur-en
- chrF2_score: 0.435
- bleu: 23.2
- brevity_penalty: 0.975
- ref_len: 12029.0
- src_name: Urdu
- tgt_name: English
- train_date: 2020-06-17
- src_alpha2: ur
- tgt_alpha2: en
- prefer_old: False
- long_pair: urd-eng
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 2,051 | [
[
-0.02630615234375,
-0.043731689453125,
0.0193328857421875,
0.036041259765625,
-0.035003662109375,
-0.0135955810546875,
-0.02362060546875,
-0.0264892578125,
0.0157470703125,
0.02130126953125,
-0.045654296875,
-0.06158447265625,
-0.042724609375,
0.0279388427734375,
-0.0030994415283203125,
0.07269287109375,
-0.009246826171875,
0.010589599609375,
0.033447265625,
-0.039093017578125,
-0.03253173828125,
-0.0159912109375,
-0.042510986328125,
-0.01099395751953125,
0.03143310546875,
0.0270843505859375,
0.03704833984375,
0.035858154296875,
0.047698974609375,
0.0229644775390625,
-0.0246429443359375,
0.0175323486328125,
-0.005580902099609375,
-0.002239227294921875,
-0.00677490234375,
-0.027587890625,
-0.036285400390625,
-0.0200042724609375,
0.06793212890625,
0.03936767578125,
0.0037479400634765625,
0.036163330078125,
-0.00965118408203125,
0.055633544921875,
-0.01837158203125,
0.00310516357421875,
-0.031524658203125,
-0.00749969482421875,
-0.035491943359375,
-0.0199127197265625,
-0.041595458984375,
-0.0243377685546875,
0.0034923553466796875,
-0.0367431640625,
0.0023441314697265625,
0.0087738037109375,
0.12103271484375,
0.0096435546875,
-0.034271240234375,
-0.006855010986328125,
-0.03143310546875,
0.06292724609375,
-0.060302734375,
0.03497314453125,
0.031982421875,
-0.004611968994140625,
-0.00466156005859375,
-0.029693603515625,
-0.0281829833984375,
0.01068878173828125,
-0.0233612060546875,
0.01593017578125,
-0.005138397216796875,
-0.012298583984375,
0.0140380859375,
0.04156494140625,
-0.05487060546875,
0.0010833740234375,
-0.028564453125,
-0.01263427734375,
0.038787841796875,
0.006488800048828125,
0.0303802490234375,
-0.0474853515625,
-0.0328369140625,
-0.03619384765625,
-0.0307769775390625,
0.01485443115234375,
0.03363037109375,
0.0299530029296875,
-0.042877197265625,
0.05072021484375,
-0.009735107421875,
0.040771484375,
0.01114654541015625,
-0.0128021240234375,
0.049285888671875,
-0.050018310546875,
-0.012664794921875,
-0.0144195556640625,
0.09368896484375,
0.013916015625,
0.001964569091796875,
0.0088653564453125,
-0.0100250244140625,
-0.0167083740234375,
-0.005695343017578125,
-0.06219482421875,
0.01050567626953125,
0.0196990966796875,
-0.02667236328125,
-0.01300048828125,
0.004886627197265625,
-0.06280517578125,
0.0056610107421875,
0.00678253173828125,
0.033905029296875,
-0.060760498046875,
-0.0162811279296875,
0.0224456787109375,
0.00311279296875,
0.025909423828125,
-0.004535675048828125,
-0.041900634765625,
0.01094818115234375,
0.0196990966796875,
0.0692138671875,
-0.00959014892578125,
-0.0299072265625,
-0.01399993896484375,
0.0093536376953125,
-0.00865936279296875,
0.052490234375,
-0.0167694091796875,
-0.0285797119140625,
-0.0106964111328125,
0.0270538330078125,
-0.01352691650390625,
-0.01218414306640625,
0.0699462890625,
-0.0230865478515625,
0.036834716796875,
-0.0233306884765625,
-0.0303802490234375,
-0.03082275390625,
0.0198822021484375,
-0.053863525390625,
0.09039306640625,
0.0169830322265625,
-0.07012939453125,
0.0272674560546875,
-0.05755615234375,
-0.0238494873046875,
-0.00675201416015625,
0.005344390869140625,
-0.061737060546875,
-0.003284454345703125,
0.0239715576171875,
0.0245361328125,
-0.03009033203125,
0.036895751953125,
0.0001970529556274414,
-0.01532745361328125,
-0.0008969306945800781,
-0.024017333984375,
0.0965576171875,
0.015716552734375,
-0.033721923828125,
0.00791168212890625,
-0.055755615234375,
0.01032257080078125,
0.0189971923828125,
-0.034423828125,
-0.01271820068359375,
-0.00916290283203125,
0.020782470703125,
0.00547027587890625,
0.016845703125,
-0.041717529296875,
0.02716064453125,
-0.047943115234375,
0.00975799560546875,
0.060882568359375,
0.01027679443359375,
0.016845703125,
-0.026153564453125,
0.032958984375,
0.01837158203125,
0.006351470947265625,
0.010833740234375,
-0.037933349609375,
-0.04986572265625,
-0.0245208740234375,
0.0472412109375,
0.050384521484375,
-0.045440673828125,
0.0523681640625,
-0.052398681640625,
-0.0572509765625,
-0.053955078125,
-0.004558563232421875,
0.03912353515625,
0.01476287841796875,
0.039093017578125,
-0.0180816650390625,
-0.042755126953125,
-0.07763671875,
-0.0191192626953125,
-0.0203704833984375,
0.005290985107421875,
0.01239776611328125,
0.0556640625,
-0.004116058349609375,
0.044769287109375,
-0.0304107666015625,
-0.042999267578125,
-0.01849365234375,
0.017822265625,
0.0291595458984375,
0.05181884765625,
0.0487060546875,
-0.0670166015625,
-0.050140380859375,
0.009368896484375,
-0.047149658203125,
-0.0181427001953125,
-0.0053863525390625,
-0.017425537109375,
0.03253173828125,
-0.00010061264038085938,
-0.04107666015625,
0.0263214111328125,
0.042877197265625,
-0.061737060546875,
0.0350341796875,
-0.007534027099609375,
0.035369873046875,
-0.1143798828125,
0.01064300537109375,
-0.01351165771484375,
-0.00496673583984375,
-0.021636962890625,
0.0038623809814453125,
0.01067352294921875,
0.0131378173828125,
-0.0400390625,
0.054840087890625,
-0.050994873046875,
-0.0006403923034667969,
0.0272216796875,
0.010345458984375,
-0.0011072158813476562,
0.058135986328125,
-0.01404571533203125,
0.07470703125,
0.037811279296875,
-0.0256500244140625,
0.0032253265380859375,
0.038818359375,
-0.029571533203125,
0.0219573974609375,
-0.04766845703125,
-0.0158843994140625,
0.023529052734375,
-0.004276275634765625,
-0.061859130859375,
-0.01561737060546875,
0.0167083740234375,
-0.05316162109375,
0.0209503173828125,
-0.0083770751953125,
-0.0474853515625,
-0.01201629638671875,
-0.03314208984375,
0.04852294921875,
0.034759521484375,
-0.01373291015625,
0.062744140625,
0.007366180419921875,
-0.005523681640625,
-0.049896240234375,
-0.06195068359375,
-0.0022907257080078125,
-0.0137939453125,
-0.048919677734375,
0.0308990478515625,
-0.0115203857421875,
-0.00135040283203125,
0.01457977294921875,
-0.002170562744140625,
-0.00902557373046875,
0.0040435791015625,
0.0126190185546875,
0.0194091796875,
-0.02386474609375,
0.0038623809814453125,
0.0020198822021484375,
-0.004596710205078125,
-0.0173187255859375,
-0.008697509765625,
0.05999755859375,
-0.028839111328125,
-0.0258331298828125,
-0.05810546875,
0.01499176025390625,
0.040924072265625,
-0.0301666259765625,
0.08355712890625,
0.052978515625,
-0.017913818359375,
0.0193634033203125,
-0.04559326171875,
0.0075225830078125,
-0.02960205078125,
0.026153564453125,
-0.04168701171875,
-0.042877197265625,
0.06414794921875,
0.01514434814453125,
0.01097869873046875,
0.07135009765625,
0.04962158203125,
0.011810302734375,
0.055145263671875,
0.0277099609375,
0.00499725341796875,
0.035400390625,
-0.042205810546875,
-0.0004930496215820312,
-0.0697021484375,
-0.024658203125,
-0.056121826171875,
-0.01413726806640625,
-0.06744384765625,
-0.0166168212890625,
0.020233154296875,
0.0013494491577148438,
-0.0145263671875,
0.050384521484375,
-0.04022216796875,
0.0231781005859375,
0.040985107421875,
0.0139007568359375,
0.0236968994140625,
-0.00405120849609375,
-0.032562255859375,
-0.0106658935546875,
-0.0281219482421875,
-0.042449951171875,
0.0921630859375,
0.0164642333984375,
0.0187835693359375,
0.03106689453125,
0.050048828125,
0.01300048828125,
0.010284423828125,
-0.044189453125,
0.04473876953125,
-0.007350921630859375,
-0.059600830078125,
-0.028076171875,
-0.0253448486328125,
-0.06463623046875,
0.016021728515625,
-0.0121307373046875,
-0.050048828125,
0.01122283935546875,
-0.0184173583984375,
-0.00875091552734375,
0.050628662109375,
-0.055999755859375,
0.0643310546875,
0.002742767333984375,
-0.02301025390625,
0.0106353759765625,
-0.04443359375,
0.0156707763671875,
-0.005462646484375,
0.01253509521484375,
-0.01123809814453125,
-0.00980377197265625,
0.07061767578125,
-0.022552490234375,
0.04119873046875,
-0.002716064453125,
-0.006229400634765625,
0.00963592529296875,
0.0108795166015625,
0.034576416015625,
-0.0013523101806640625,
-0.022125244140625,
0.0218048095703125,
0.004390716552734375,
-0.051116943359375,
-0.0190887451171875,
0.0430908203125,
-0.06414794921875,
-0.04022216796875,
-0.04791259765625,
-0.052276611328125,
-0.00205230712890625,
0.034210205078125,
0.037506103515625,
0.045867919921875,
-0.0013103485107421875,
0.043975830078125,
0.045867919921875,
-0.0225830078125,
0.038726806640625,
0.0374755859375,
-0.002994537353515625,
-0.046844482421875,
0.042205810546875,
0.01519775390625,
0.0173492431640625,
0.036773681640625,
0.0010480880737304688,
-0.018798828125,
-0.0643310546875,
-0.041900634765625,
0.0308837890625,
-0.029937744140625,
-0.0309906005859375,
-0.04827880859375,
-0.00823211669921875,
-0.03460693359375,
0.0077056884765625,
-0.0234222412109375,
-0.0291595458984375,
-0.007373809814453125,
-0.0193939208984375,
0.032623291015625,
0.03277587890625,
0.00006705522537231445,
0.0186614990234375,
-0.0638427734375,
0.0199432373046875,
-0.00998687744140625,
0.03265380859375,
-0.022491455078125,
-0.0654296875,
-0.0218353271484375,
0.0007238388061523438,
-0.027191162109375,
-0.08245849609375,
0.04071044921875,
0.0006337165832519531,
0.0147857666015625,
0.0106658935546875,
0.011383056640625,
0.05023193359375,
-0.0275421142578125,
0.0733642578125,
-0.00922393798828125,
-0.06475830078125,
0.04754638671875,
-0.0310821533203125,
0.033843994140625,
0.05206298828125,
0.0243072509765625,
-0.025482177734375,
-0.04931640625,
-0.053680419921875,
-0.06787109375,
0.061187744140625,
0.042755126953125,
-0.00983428955078125,
-0.006298065185546875,
0.005580902099609375,
-0.0032787322998046875,
-0.0186767578125,
-0.0865478515625,
-0.033294677734375,
0.0116729736328125,
-0.040557861328125,
0.01491546630859375,
-0.036773681640625,
-0.01079559326171875,
-0.0227203369140625,
0.08172607421875,
0.014801025390625,
0.01166534423828125,
0.03948974609375,
-0.01448822021484375,
0.0004639625549316406,
0.03375244140625,
0.055877685546875,
0.03253173828125,
-0.028533935546875,
-0.0140533447265625,
0.0274658203125,
-0.045867919921875,
0.01263427734375,
0.0018558502197265625,
-0.0307769775390625,
0.01520538330078125,
0.042083740234375,
0.0616455078125,
0.0158538818359375,
-0.039031982421875,
0.036468505859375,
-0.0072479248046875,
-0.029083251953125,
-0.033233642578125,
-0.0254058837890625,
0.00959014892578125,
0.0128173828125,
0.031646728515625,
0.0014657974243164062,
-0.00012755393981933594,
-0.0164794921875,
0.003818511962890625,
0.01061248779296875,
-0.0173492431640625,
-0.028778076171875,
0.03936767578125,
0.0005636215209960938,
-0.0307464599609375,
0.0340576171875,
-0.0252685546875,
-0.0343017578125,
0.044525146484375,
0.0258026123046875,
0.0792236328125,
-0.016754150390625,
-0.005092620849609375,
0.055084228515625,
0.036834716796875,
0.002544403076171875,
0.0333251953125,
0.014984130859375,
-0.042999267578125,
-0.028778076171875,
-0.06732177734375,
0.01003265380859375,
0.0106353759765625,
-0.060577392578125,
0.0254058837890625,
-0.00506591796875,
-0.023040771484375,
-0.005840301513671875,
0.029693603515625,
-0.039306640625,
0.0041351318359375,
-0.0328369140625,
0.06927490234375,
-0.06719970703125,
0.056610107421875,
0.054443359375,
-0.0592041015625,
-0.0850830078125,
-0.00281524658203125,
-0.0173797607421875,
-0.041595458984375,
0.050994873046875,
-0.0005612373352050781,
0.0016489028930664062,
-0.0015897750854492188,
-0.0183563232421875,
-0.05743408203125,
0.08636474609375,
0.027099609375,
-0.02215576171875,
-0.01319122314453125,
0.0016307830810546875,
0.0372314453125,
-0.001331329345703125,
0.022674560546875,
0.0267791748046875,
0.06268310546875,
-0.0160675048828125,
-0.0782470703125,
0.01403045654296875,
-0.04046630859375,
-0.00438690185546875,
0.0304412841796875,
-0.05975341796875,
0.061798095703125,
0.0015392303466796875,
-0.0247650146484375,
0.01413726806640625,
0.045928955078125,
0.0224761962890625,
0.0010538101196289062,
0.0322265625,
0.07281494140625,
0.0283966064453125,
-0.03765869140625,
0.07586669921875,
-0.022247314453125,
0.043365478515625,
0.06671142578125,
0.01375579833984375,
0.060699462890625,
0.0416259765625,
-0.023345947265625,
0.053680419921875,
0.0499267578125,
-0.0102691650390625,
0.0256500244140625,
-0.01470947265625,
-0.003955841064453125,
-0.01678466796875,
-0.0209503173828125,
-0.037750244140625,
0.036956787109375,
0.0015354156494140625,
-0.014129638671875,
-0.003078460693359375,
-0.0125579833984375,
0.02276611328125,
0.0177154541015625,
-0.002361297607421875,
0.051513671875,
-0.0179290771484375,
-0.04583740234375,
0.054931640625,
0.0010423660278320312,
0.050628662109375,
-0.04705810546875,
0.002597808837890625,
-0.0179901123046875,
0.01226806640625,
-0.007572174072265625,
-0.056610107421875,
0.0294036865234375,
0.011016845703125,
-0.025146484375,
-0.0175323486328125,
0.006893157958984375,
-0.039215087890625,
-0.060089111328125,
0.0328369140625,
0.03131103515625,
0.01493072509765625,
0.0203094482421875,
-0.05267333984375,
0.005893707275390625,
0.0180206298828125,
-0.051361083984375,
0.00421142578125,
0.05316162109375,
0.0014286041259765625,
0.0535888671875,
0.0352783203125,
0.0218505859375,
0.00677490234375,
-0.0001876354217529297,
0.046112060546875,
-0.0660400390625,
-0.023834228515625,
-0.06396484375,
0.037811279296875,
-0.0084381103515625,
-0.0426025390625,
0.050872802734375,
0.05792236328125,
0.06591796875,
-0.005046844482421875,
0.0223541259765625,
-0.02301025390625,
0.038818359375,
-0.05047607421875,
0.055999755859375,
-0.06854248046875,
0.004955291748046875,
-0.01506805419921875,
-0.05694580078125,
-0.021331787109375,
0.024932861328125,
-0.0158233642578125,
0.004119873046875,
0.08111572265625,
0.0614013671875,
0.00569915771484375,
-0.024169921875,
-0.0010347366333007812,
0.031951904296875,
0.0167694091796875,
0.056793212890625,
0.01715087890625,
-0.07421875,
0.052978515625,
-0.025482177734375,
0.00872802734375,
-0.0023708343505859375,
-0.056427001953125,
-0.06146240234375,
-0.062469482421875,
-0.0150299072265625,
-0.0272216796875,
-0.00957489013671875,
0.0753173828125,
0.025604248046875,
-0.07464599609375,
-0.022064208984375,
0.00574493408203125,
0.0171661376953125,
-0.0204925537109375,
-0.0211029052734375,
0.060272216796875,
-0.0079498291015625,
-0.076904296875,
0.00714111328125,
0.006153106689453125,
0.01178741455078125,
0.0078277587890625,
-0.00482940673828125,
-0.062408447265625,
-0.0027370452880859375,
0.016265869140625,
0.0116424560546875,
-0.06549072265625,
-0.0143280029296875,
0.00605010986328125,
-0.0231475830078125,
0.0144500732421875,
0.004795074462890625,
-0.023956298828125,
0.00865936279296875,
0.053955078125,
0.0321044921875,
0.029144287109375,
0.0005884170532226562,
0.0234222412109375,
-0.051788330078125,
0.035400390625,
0.015655517578125,
0.038787841796875,
0.024322509765625,
-0.01222991943359375,
0.06103515625,
0.0231781005859375,
-0.0263214111328125,
-0.0810546875,
-0.002529144287109375,
-0.092529296875,
-0.002521514892578125,
0.0743408203125,
-0.01488494873046875,
-0.0286712646484375,
0.01123809814453125,
-0.0215301513671875,
0.034149169921875,
-0.034515380859375,
0.0455322265625,
0.0653076171875,
0.028472900390625,
0.0103302001953125,
-0.03204345703125,
0.0225372314453125,
0.049713134765625,
-0.0545654296875,
-0.0109710693359375,
0.0146026611328125,
0.0258331298828125,
0.027984619140625,
0.04791259765625,
-0.0303955078125,
0.0167694091796875,
-0.013580322265625,
0.0248260498046875,
-0.01476287841796875,
-0.005992889404296875,
-0.0182952880859375,
0.00910186767578125,
-0.00572967529296875,
-0.01641845703125
]
] |
TurkuNLP/bert-base-finnish-cased-v1 | 2022-06-10T08:43:15.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"fi",
"arxiv:1912.07076",
"arxiv:1908.04212",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | TurkuNLP | null | null | TurkuNLP/bert-base-finnish-cased-v1 | 4 | 16,228 | transformers | 2022-03-02T23:29:05 | ---
language: fi
---
## Quickstart
**Release 1.0** (November 25, 2019)
We generally recommend the use of the cased model.
Paper presenting Finnish BERT: [arXiv:1912.07076](https://arxiv.org/abs/1912.07076)
## What's this?
A version of Google's [BERT](https://github.com/google-research/bert) deep transfer learning model for Finnish. The model can be fine-tuned to achieve state-of-the-art results for various Finnish natural language processing tasks.
FinBERT features a custom 50,000 wordpiece vocabulary that has much better coverage of Finnish words than e.g. the previously released [multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md) models from Google:
| Vocabulary | Example |
|------------|---------|
| FinBERT | Suomessa vaihtuu kesän aikana sekä pääministeri että valtiovarain ##ministeri . |
| Multilingual BERT | Suomessa vai ##htuu kes ##än aikana sekä p ##ää ##minister ##i että valt ##io ##vara ##in ##minister ##i . |
FinBERT has been pre-trained for 1 million steps on over 3 billion tokens (24B characters) of Finnish text drawn from news, online discussion, and internet crawls. By contrast, Multilingual BERT was trained on Wikipedia texts, where the Finnish Wikipedia text is approximately 3% of the amount used to train FinBERT.
These features allow FinBERT to outperform not only Multilingual BERT but also all previously proposed models when fine-tuned for Finnish natural language processing tasks.
## Results
### Document classification

FinBERT outperforms multilingual BERT (M-BERT) on document classification over a range of training set sizes on the Yle news (left) and Ylilauta online discussion (right) corpora. (Baseline classification performance with [FastText](https://fasttext.cc/) included for reference.)
[[code](https://github.com/spyysalo/finbert-text-classification)][[Yle data](https://github.com/spyysalo/yle-corpus)] [[Ylilauta data](https://github.com/spyysalo/ylilauta-corpus)]
### Named Entity Recognition
Evaluation on FiNER corpus ([Ruokolainen et al 2019](https://arxiv.org/abs/1908.04212))
| Model | Accuracy |
|--------------------|----------|
| **FinBERT** | **92.40%** |
| Multilingual BERT | 90.29% |
| [FiNER-tagger](https://github.com/Traubert/FiNer-rules) (rule-based) | 86.82% |
(FiNER tagger results from [Ruokolainen et al. 2019](https://arxiv.org/pdf/1908.04212.pdf))
[[code](https://github.com/jouniluoma/keras-bert-ner)][[data](https://github.com/mpsilfve/finer-data)]
### Part of speech tagging
Evaluation on three Finnish corpora annotated with [Universal Dependencies](https://universaldependencies.org/) part-of-speech tags: the Turku Dependency Treebank (TDT), FinnTreeBank (FTB), and Parallel UD treebank (PUD)
| Model | TDT | FTB | PUD |
|-------------------|-------------|-------------|-------------|
| **FinBERT** | **98.23%** | **98.39%** | **98.08%** |
| Multilingual BERT | 96.97% | 95.87% | 97.58% |
[[code](https://github.com/spyysalo/bert-pos)][[data](http://hdl.handle.net/11234/1-2837)]
## Previous releases
### Release 0.2
**October 24, 2019** Beta version of the BERT base uncased model trained from scratch on a corpus of Finnish news, online discussions, and crawled data.
Download the model here: [bert-base-finnish-uncased.zip](http://dl.turkunlp.org/finbert/bert-base-finnish-uncased.zip)
### Release 0.1
**September 30, 2019** We release a beta version of the BERT base cased model trained from scratch on a corpus of Finnish news, online discussions, and crawled data.
Download the model here: [bert-base-finnish-cased.zip](http://dl.turkunlp.org/finbert/bert-base-finnish-cased.zip)
| 3,875 | [
[
-0.035491943359375,
-0.046051025390625,
0.0175933837890625,
0.0032215118408203125,
-0.0258331298828125,
-0.0069427490234375,
-0.042083740234375,
-0.042266845703125,
0.01580810546875,
0.022308349609375,
-0.030609130859375,
-0.0557861328125,
-0.0433349609375,
-0.00492095947265625,
-0.0167999267578125,
0.096435546875,
0.0025615692138671875,
0.02197265625,
-0.00424957275390625,
-0.01088714599609375,
-0.0216522216796875,
-0.0677490234375,
-0.042724609375,
-0.0184478759765625,
0.04986572265625,
0.017791748046875,
0.0162811279296875,
0.0164947509765625,
0.0416259765625,
0.0197601318359375,
-0.00872039794921875,
-0.0002613067626953125,
-0.01374053955078125,
-0.002613067626953125,
0.00762939453125,
-0.021240234375,
-0.02862548828125,
-0.0005688667297363281,
0.05316162109375,
0.0484619140625,
-0.004913330078125,
0.006885528564453125,
-0.0056304931640625,
0.049072265625,
-0.035125732421875,
0.01560211181640625,
-0.048980712890625,
-0.017669677734375,
-0.0221099853515625,
0.0191802978515625,
-0.04339599609375,
-0.01325225830078125,
0.023681640625,
-0.0364990234375,
0.0220489501953125,
-0.018768310546875,
0.09478759765625,
0.0101470947265625,
-0.0185089111328125,
-0.023681640625,
-0.044769287109375,
0.06005859375,
-0.056488037109375,
0.053955078125,
0.0276641845703125,
0.00989532470703125,
-0.0028095245361328125,
-0.06903076171875,
-0.0357666015625,
-0.013519287109375,
-0.016357421875,
0.01503753662109375,
-0.01552581787109375,
0.0025157928466796875,
0.00860595703125,
0.01288604736328125,
-0.04510498046875,
0.011322021484375,
-0.04742431640625,
-0.0168914794921875,
0.044830322265625,
-0.0272369384765625,
0.004306793212890625,
-0.031341552734375,
-0.038848876953125,
-0.0287017822265625,
-0.0302734375,
0.0191497802734375,
0.0276947021484375,
0.043701171875,
-0.01078033447265625,
0.0250244140625,
0.008819580078125,
0.0455322265625,
-0.00537109375,
-0.01169586181640625,
0.046844482421875,
-0.022125244140625,
-0.0216522216796875,
0.01111602783203125,
0.06396484375,
0.015869140625,
0.0191497802734375,
-0.005977630615234375,
-0.00936126708984375,
0.00778961181640625,
0.013092041015625,
-0.052093505859375,
-0.0188751220703125,
0.02398681640625,
-0.0250091552734375,
-0.00412750244140625,
0.0034236907958984375,
-0.043182373046875,
0.003665924072265625,
-0.017303466796875,
0.04132080078125,
-0.0638427734375,
-0.0163421630859375,
0.01389312744140625,
-0.01200103759765625,
0.024810791015625,
0.0218353271484375,
-0.06719970703125,
0.0167999267578125,
0.044769287109375,
0.052581787109375,
-0.01401519775390625,
-0.0168609619140625,
-0.01959228515625,
-0.017730712890625,
-0.0155792236328125,
0.055206298828125,
-0.0093536376953125,
-0.01120758056640625,
-0.0011587142944335938,
0.005908966064453125,
-0.0246734619140625,
-0.01392364501953125,
0.0684814453125,
-0.029876708984375,
0.045440673828125,
-0.0186004638671875,
-0.0560302734375,
-0.021453857421875,
0.0036449432373046875,
-0.04095458984375,
0.08795166015625,
0.01123809814453125,
-0.06915283203125,
0.037139892578125,
-0.051727294921875,
-0.022552490234375,
0.00917816162109375,
0.006427764892578125,
-0.03741455078125,
-0.0091705322265625,
0.0119476318359375,
0.044586181640625,
0.01352691650390625,
0.033355712890625,
-0.016448974609375,
-0.0209197998046875,
0.0025501251220703125,
-0.0152587890625,
0.0843505859375,
0.0266265869140625,
-0.0189361572265625,
0.0017404556274414062,
-0.0648193359375,
0.00644683837890625,
0.00086212158203125,
-0.040985107421875,
-0.037261962890625,
-0.00490570068359375,
0.0292816162109375,
0.006160736083984375,
0.022857666015625,
-0.05078125,
0.016387939453125,
-0.0360107421875,
0.0308990478515625,
0.03936767578125,
-0.019439697265625,
0.024658203125,
-0.025848388671875,
0.0081787109375,
-0.01035308837890625,
0.01261138916015625,
-0.0115814208984375,
-0.046173095703125,
-0.0762939453125,
-0.051361083984375,
0.06915283203125,
0.04705810546875,
-0.04144287109375,
0.054931640625,
-0.03326416015625,
-0.053497314453125,
-0.061492919921875,
-0.0015316009521484375,
0.03082275390625,
0.0341796875,
0.031219482421875,
-0.0261077880859375,
-0.048553466796875,
-0.07379150390625,
0.0087127685546875,
-0.03094482421875,
0.00603485107421875,
0.00922393798828125,
0.0433349609375,
-0.01496124267578125,
0.06512451171875,
-0.0087738037109375,
-0.02423095703125,
-0.018341064453125,
0.033599853515625,
0.019500732421875,
0.04541015625,
0.053924560546875,
-0.06231689453125,
-0.039520263671875,
-0.0013027191162109375,
-0.039794921875,
0.006946563720703125,
0.00518798828125,
0.00643157958984375,
0.062225341796875,
0.02587890625,
-0.053466796875,
0.016754150390625,
0.0357666015625,
-0.02508544921875,
0.035858154296875,
-0.018035888671875,
-0.0092010498046875,
-0.09429931640625,
0.0199737548828125,
0.002124786376953125,
-0.01513671875,
-0.06085205078125,
0.01070404052734375,
0.0158233642578125,
0.00860595703125,
-0.048187255859375,
0.042877197265625,
-0.02044677734375,
-0.0025730133056640625,
0.01099395751953125,
-0.004848480224609375,
-0.0006318092346191406,
0.051422119140625,
0.0153045654296875,
0.0604248046875,
0.026275634765625,
-0.045074462890625,
0.0088348388671875,
0.0205841064453125,
-0.05145263671875,
0.01373291015625,
-0.04498291015625,
0.0016956329345703125,
-0.0203704833984375,
0.0159912109375,
-0.0797119140625,
-0.00540924072265625,
0.01393890380859375,
-0.056915283203125,
0.045501708984375,
-0.0174407958984375,
-0.045135498046875,
-0.02899169921875,
-0.03790283203125,
-0.00911712646484375,
0.04498291015625,
-0.039154052734375,
0.03173828125,
0.01995849609375,
-0.022918701171875,
-0.058319091796875,
-0.0594482421875,
-0.0007815361022949219,
-0.0211334228515625,
-0.051727294921875,
0.037994384765625,
-0.0236663818359375,
-0.01502227783203125,
0.0093231201171875,
0.006134033203125,
-0.01544952392578125,
0.0117645263671875,
0.006221771240234375,
0.03582763671875,
-0.019073486328125,
0.03497314453125,
-0.00579071044921875,
0.005420684814453125,
-0.01204681396484375,
-0.00836181640625,
0.04425048828125,
-0.032379150390625,
0.005176544189453125,
-0.01947021484375,
0.0245361328125,
0.03729248046875,
-0.01467132568359375,
0.053802490234375,
0.07415771484375,
-0.0261688232421875,
0.00971221923828125,
-0.05474853515625,
-0.002361297607421875,
-0.0295257568359375,
0.0211639404296875,
-0.0283355712890625,
-0.0732421875,
0.03778076171875,
0.0188140869140625,
0.020904541015625,
0.06231689453125,
0.03515625,
-0.0240325927734375,
0.04791259765625,
0.06048583984375,
-0.0160675048828125,
0.039886474609375,
-0.03289794921875,
-0.0005335807800292969,
-0.055999755859375,
-0.023590087890625,
-0.054290771484375,
-0.0021190643310546875,
-0.07171630859375,
-0.0165863037109375,
0.005306243896484375,
0.0236053466796875,
-0.00954437255859375,
0.048065185546875,
-0.042022705078125,
0.003040313720703125,
0.0574951171875,
-0.005336761474609375,
0.003875732421875,
0.0253143310546875,
-0.0261688232421875,
-0.01393890380859375,
-0.062225341796875,
-0.0347900390625,
0.08831787109375,
0.03680419921875,
0.034942626953125,
0.004138946533203125,
0.065673828125,
0.0272674560546875,
0.0199127197265625,
-0.06365966796875,
0.028472900390625,
-0.035400390625,
-0.063232421875,
-0.0273284912109375,
-0.0242462158203125,
-0.0843505859375,
0.0261688232421875,
-0.0238037109375,
-0.06671142578125,
0.0257568359375,
-0.0122833251953125,
-0.022064208984375,
0.0236053466796875,
-0.07757568359375,
0.060943603515625,
-0.0193939208984375,
-0.010467529296875,
-0.000022649765014648438,
-0.053314208984375,
0.01555633544921875,
-0.0186767578125,
0.0226287841796875,
-0.01226806640625,
0.00533294677734375,
0.07440185546875,
-0.0230560302734375,
0.060211181640625,
-0.0119476318359375,
-0.006031036376953125,
0.01390838623046875,
-0.0271759033203125,
0.02801513671875,
-0.012725830078125,
-0.0018701553344726562,
0.03070068359375,
0.0167388916015625,
-0.0191802978515625,
-0.0173797607421875,
0.0546875,
-0.07135009765625,
-0.0255126953125,
-0.048492431640625,
-0.032958984375,
-0.007213592529296875,
0.0185546875,
0.036376953125,
0.020843505859375,
-0.0201568603515625,
0.01617431640625,
0.0618896484375,
-0.024566650390625,
0.04400634765625,
0.050445556640625,
-0.007904052734375,
-0.036712646484375,
0.0648193359375,
0.01641845703125,
0.00258636474609375,
0.0380859375,
0.004009246826171875,
-0.0230865478515625,
-0.037261962890625,
-0.033721923828125,
0.0280303955078125,
-0.05047607421875,
-0.010772705078125,
-0.06512451171875,
-0.0340576171875,
-0.050445556640625,
0.0028934478759765625,
-0.031219482421875,
-0.054473876953125,
-0.01959228515625,
-0.0044403076171875,
0.038299560546875,
0.039459228515625,
-0.01334381103515625,
0.0183563232421875,
-0.044036865234375,
0.00036716461181640625,
0.021209716796875,
0.02679443359375,
-0.0284423828125,
-0.0435791015625,
-0.015625,
-0.0028839111328125,
-0.0010690689086914062,
-0.048187255859375,
0.03302001953125,
0.0156097412109375,
0.048187255859375,
0.0083160400390625,
0.0033588409423828125,
0.0278778076171875,
-0.03466796875,
0.06329345703125,
0.020172119140625,
-0.055328369140625,
0.03778076171875,
-0.02532958984375,
0.0219879150390625,
0.054290771484375,
0.050048828125,
-0.040496826171875,
-0.0182952880859375,
-0.059478759765625,
-0.0784912109375,
0.060211181640625,
0.014739990234375,
0.00885009765625,
0.002696990966796875,
0.01137542724609375,
0.016632080078125,
0.01163482666015625,
-0.06048583984375,
-0.032623291015625,
-0.00836944580078125,
-0.01322174072265625,
-0.0242767333984375,
-0.035919189453125,
0.0091552734375,
-0.0380859375,
0.068359375,
0.00978851318359375,
0.0430908203125,
0.0283050537109375,
-0.01641845703125,
0.00746917724609375,
0.035980224609375,
0.06072998046875,
0.04248046875,
-0.0576171875,
0.0005259513854980469,
0.01003265380859375,
-0.043060302734375,
-0.01265716552734375,
0.043548583984375,
-0.01959228515625,
0.04168701171875,
0.034149169921875,
0.07470703125,
0.0161285400390625,
-0.0384521484375,
0.038360595703125,
-0.01528167724609375,
-0.04473876953125,
-0.0301666259765625,
-0.0122222900390625,
0.005435943603515625,
0.007320404052734375,
0.031280517578125,
-0.00726318359375,
0.00415802001953125,
-0.033721923828125,
0.017730712890625,
0.025665283203125,
-0.030426025390625,
-0.012359619140625,
0.0303192138671875,
0.01105499267578125,
-0.01202392578125,
0.039306640625,
-0.01375579833984375,
-0.051422119140625,
0.034637451171875,
0.02508544921875,
0.06121826171875,
-0.013885498046875,
0.0242156982421875,
0.044830322265625,
0.037872314453125,
0.00762939453125,
0.0245513916015625,
-0.000583648681640625,
-0.053497314453125,
-0.0433349609375,
-0.06439208984375,
-0.014495849609375,
0.034454345703125,
-0.046539306640625,
0.0199737548828125,
-0.0254669189453125,
-0.027069091796875,
0.0304107666015625,
0.02972412109375,
-0.0517578125,
0.01056671142578125,
0.0259857177734375,
0.08441162109375,
-0.05120849609375,
0.08819580078125,
0.064453125,
-0.030242919921875,
-0.042083740234375,
-0.024169921875,
-0.0238494873046875,
-0.048736572265625,
0.058563232421875,
0.01422882080078125,
0.01390838623046875,
-0.0079345703125,
-0.033172607421875,
-0.07403564453125,
0.07373046875,
0.02239990234375,
-0.049224853515625,
0.004329681396484375,
0.0011472702026367188,
0.04766845703125,
-0.0188751220703125,
0.01378631591796875,
0.028045654296875,
0.037750244140625,
0.0006237030029296875,
-0.0882568359375,
-0.0232696533203125,
-0.0247802734375,
0.002193450927734375,
0.020172119140625,
-0.046173095703125,
0.068359375,
-0.005382537841796875,
-0.0214080810546875,
0.0003147125244140625,
0.03729248046875,
0.00766754150390625,
0.0162811279296875,
0.035003662109375,
0.0614013671875,
0.0667724609375,
-0.012664794921875,
0.0755615234375,
-0.017974853515625,
0.03070068359375,
0.08538818359375,
0.007289886474609375,
0.07989501953125,
0.031463623046875,
-0.018310546875,
0.04949951171875,
0.05865478515625,
-0.00814056396484375,
0.045989990234375,
-0.0020599365234375,
-0.0156707763671875,
-0.00989532470703125,
-0.01287078857421875,
-0.036590576171875,
0.039031982421875,
0.026702880859375,
-0.02734375,
-0.01337432861328125,
0.00739288330078125,
0.0230255126953125,
-0.00986480712890625,
-0.013702392578125,
0.0504150390625,
-0.0029735565185546875,
-0.045562744140625,
0.062103271484375,
0.0205535888671875,
0.0694580078125,
-0.0618896484375,
0.01038360595703125,
-0.01776123046875,
0.009246826171875,
-0.000995635986328125,
-0.0479736328125,
0.009246826171875,
0.0086212158203125,
-0.0189361572265625,
-0.0209197998046875,
0.06781005859375,
-0.03887939453125,
-0.0462646484375,
0.029541015625,
0.0333251953125,
0.026275634765625,
0.0166168212890625,
-0.06268310546875,
0.00949859619140625,
-0.00350189208984375,
-0.021575927734375,
0.0225067138671875,
0.0131378173828125,
-0.003448486328125,
0.039703369140625,
0.048004150390625,
0.0015573501586914062,
0.00820159912109375,
0.006900787353515625,
0.060760498046875,
-0.03411865234375,
-0.0185546875,
-0.047821044921875,
0.037994384765625,
-0.0015010833740234375,
-0.026519775390625,
0.056884765625,
0.04888916015625,
0.08819580078125,
-0.0049896240234375,
0.0594482421875,
-0.016448974609375,
0.04071044921875,
-0.0369873046875,
0.055511474609375,
-0.043975830078125,
-0.004482269287109375,
-0.018951416015625,
-0.057891845703125,
-0.0232696533203125,
0.04833984375,
-0.012664794921875,
-0.0057220458984375,
0.04962158203125,
0.033905029296875,
0.006862640380859375,
-0.0172271728515625,
0.0207672119140625,
0.005840301513671875,
0.006160736083984375,
0.0364990234375,
0.041412353515625,
-0.049163818359375,
0.040374755859375,
-0.030670166015625,
-0.010284423828125,
-0.00995635986328125,
-0.06085205078125,
-0.07733154296875,
-0.06683349609375,
-0.0254364013671875,
-0.0197296142578125,
0.025177001953125,
0.07293701171875,
0.060760498046875,
-0.0692138671875,
-0.019866943359375,
0.005390167236328125,
0.001369476318359375,
-0.004398345947265625,
-0.01580810546875,
0.039306640625,
-0.040313720703125,
-0.054779052734375,
0.017059326171875,
-0.005443572998046875,
0.0095367431640625,
-0.0121002197265625,
-0.005290985107421875,
-0.048675537109375,
-0.0051727294921875,
0.0499267578125,
0.0205535888671875,
-0.05596923828125,
-0.01244354248046875,
-0.0008616447448730469,
-0.0181732177734375,
0.00858306884765625,
0.027984619140625,
-0.056640625,
0.0362548828125,
0.03515625,
0.037017822265625,
0.0643310546875,
-0.023712158203125,
0.0202484130859375,
-0.061553955078125,
0.0142822265625,
0.006755828857421875,
0.0274810791015625,
0.03485107421875,
-0.017333984375,
0.047119140625,
0.021087646484375,
-0.0243072509765625,
-0.057525634765625,
-0.0110931396484375,
-0.08697509765625,
-0.038330078125,
0.08013916015625,
-0.0214080810546875,
-0.0184478759765625,
0.0032215118408203125,
-0.011199951171875,
0.0248870849609375,
-0.039642333984375,
0.040985107421875,
0.07373046875,
0.016632080078125,
-0.01009368896484375,
-0.042755126953125,
0.044769287109375,
0.025421142578125,
-0.04498291015625,
-0.00988006591796875,
0.0256195068359375,
0.03326416015625,
0.03326416015625,
0.050018310546875,
-0.00003737211227416992,
0.0105743408203125,
-0.0182342529296875,
0.03778076171875,
-0.002132415771484375,
-0.0257720947265625,
-0.0307464599609375,
-0.005565643310546875,
-0.0014286041259765625,
-0.0213775634765625
]
] |
OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 | 2023-05-24T14:04:02.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"sft",
"en",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | OpenAssistant | null | null | OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 | 342 | 16,217 | transformers | 2023-04-03T20:06:28 | ---
license: apache-2.0
language:
- en
tags:
- sft
pipeline_tag: text-generation
widget:
- text: <|prompter|>What is a meme, and what's the history behind this word?<|endoftext|><|assistant|>
- text: <|prompter|>What's the Earth total population<|endoftext|><|assistant|>
- text: <|prompter|>Write a story about future of AI development<|endoftext|><|assistant|>
---
# Open-Assistant SFT-4 12B Model
This is the 4th iteration English supervised-fine-tuning (SFT) model of
the [Open-Assistant](https://github.com/LAION-AI/Open-Assistant) project.
It is based on a Pythia 12B that was fine-tuned on human demonstrations
of assistant conversations collected through the
[https://open-assistant.io/](https://open-assistant.io/) human feedback web
app before March 25, 2023.
## Model Details
- **Developed by:** [Open-Assistant Contributors](https://open-assistant.io/)
- **Model type:** Transformer-based Language Model
- **Language:** English
- **Finetuned from:** [EleutherAI / pythia-12b-deduped](https://huggingface.co/EleutherAI/pythia-12b-deduped)
- **Code:** [Open-Assistant/model/model_training](https://github.com/LAION-AI/Open-Assistant/tree/main/model/model_training)
- **Demo:** [Continuations for 250 random prompts](https://open-assistant.github.io/oasst-model-eval/?f=https%3A%2F%2Fraw.githubusercontent.com%2FOpen-Assistant%2Foasst-model-eval%2Fmain%2Fsampling_reports%2Foasst-sft%2F2023-04-03_andreaskoepf_oasst-sft-4-pythia-12b-epoch-3_5_sampling_noprefix_lottery.json%0Ahttps%3A%2F%2Fraw.githubusercontent.com%2FOpen-Assistant%2Foasst-model-eval%2Fmain%2Fsampling_reports%2Fchat-gpt%2F2023-04-11_gpt-3.5-turbo_lottery.json)
- **License:** Apache 2.0
- **Contact:** [Open-Assistant Discord](https://ykilcher.com/open-assistant-discord)
## Prompting
Two special tokens are used to mark the beginning of user and assistant turns:
`<|prompter|>` and `<|assistant|>`. Each turn ends with a `<|endoftext|>` token.
Input prompt example:
```
<|prompter|>What is a meme, and what's the history behind this word?<|endoftext|><|assistant|>
```
The input ends with the `<|assistant|>` token to signal that the model should
start generating the assistant reply.
## Dev Details
- wandb: https://wandb.ai/open-assistant/supervised-finetuning/runs/770a0t41
- base model: [andreaskoepf/pythia-12b-pre-2000](https://huggingface.co/andreaskoepf/pythia-12b-pre-2000)
- checkpoint: 4000 steps
command: `deepspeed trainer_sft.py --configs defaults reference-data reference-pythia-12b --cache_dir /home/ubuntu/data_cache --output_dir .saved/oasst-sft-3-pythia-12b-reference_2kpre --num_train_epochs 8 --residual_dropout 0.2 --deepspeed --use_flash_attention true --model_name andreaskoepf/pythia-12b-pre-2000`
data:
```
reference-data:
datasets:
- oasst_export:
lang: "bg,ca,cs,da,de,en,es,fr,hr,hu,it,nl,pl,pt,ro,ru,sl,sr,sv,uk"
input_file_path: 2023-03-25_oasst_research_ready_synth_labels.jsonl.gz
val_split: 0.05
- alpaca
sort_by_length: false
use_custom_sampler: false
```
pythia:
```
reference-pythia-12b:
dtype: fp16
log_dir: "pythia_log_12b"
learning_rate: 6e-6
model_name: EleutherAI/pythia-12b-deduped
output_dir: pythia_model_12b
weight_decay: 0.0
max_length: 2048
warmup_steps: 100
gradient_checkpointing: true
gradient_accumulation_steps: 2
per_device_train_batch_size: 4
per_device_eval_batch_size: 4
eval_steps: 100
save_steps: 1000
num_train_epochs: 8
save_total_limit: 4
```
zero config:
```
{
"fp16": {
"enabled": "auto",
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": "auto"
},
"optimizer": {
"type": "AdamW",
"params": {
"lr": "auto",
"betas": "auto",
"eps": "auto",
"weight_decay": "auto"
}
},
"scheduler": {
"type": "WarmupDecayLR",
"params": {
"warmup_min_lr": "auto",
"warmup_max_lr": "auto",
"warmup_num_steps": "auto",
"total_num_steps": "auto"
}
},
"zero_optimization": {
"stage": 2,
"allgather_partitions": true,
"allgather_bucket_size": 1e9,
"overlap_comm": false,
"reduce_scatter": true,
"reduce_bucket_size": 1e9,
"contiguous_gradients": true
},
"gradient_accumulation_steps": "auto",
"gradient_clipping": "auto",
"steps_per_print": 2000,
"train_batch_size": "auto",
"train_micro_batch_size_per_gpu": "auto",
"wall_clock_breakdown": false
}
``` | 4,509 | [
[
-0.03765869140625,
-0.057861328125,
0.0241241455078125,
0.0159454345703125,
-0.0080413818359375,
-0.00724029541015625,
-0.01019287109375,
-0.01197052001953125,
0.0125732421875,
0.004726409912109375,
-0.063232421875,
-0.033111572265625,
-0.0246429443359375,
0.005039215087890625,
-0.0053558349609375,
0.0697021484375,
-0.0213623046875,
0.006198883056640625,
0.005489349365234375,
-0.00333404541015625,
-0.03717041015625,
-0.024261474609375,
-0.0550537109375,
-0.020172119140625,
0.02154541015625,
0.0287017822265625,
0.04852294921875,
0.038970947265625,
0.03936767578125,
0.023223876953125,
-0.00818634033203125,
0.0019245147705078125,
-0.0241546630859375,
-0.0223846435546875,
0.0053558349609375,
-0.014892578125,
-0.048797607421875,
-0.0013494491577148438,
0.050384521484375,
0.03741455078125,
-0.0033893585205078125,
0.028839111328125,
-0.0028533935546875,
0.038330078125,
-0.03582763671875,
0.01654052734375,
-0.018707275390625,
0.00872039794921875,
-0.009674072265625,
-0.020355224609375,
-0.00766754150390625,
-0.019927978515625,
0.01203155517578125,
-0.049591064453125,
0.027618408203125,
-0.009796142578125,
0.0968017578125,
0.0206298828125,
-0.00901031494140625,
-0.01284027099609375,
-0.048095703125,
0.050323486328125,
-0.0697021484375,
0.025054931640625,
0.01861572265625,
0.0222320556640625,
-0.0084228515625,
-0.0650634765625,
-0.04241943359375,
-0.0176239013671875,
-0.007488250732421875,
0.0178680419921875,
-0.0305633544921875,
-0.0007958412170410156,
0.031646728515625,
0.046356201171875,
-0.055816650390625,
0.003559112548828125,
-0.042755126953125,
-0.016632080078125,
0.0272674560546875,
0.02337646484375,
0.00928497314453125,
-0.007904052734375,
-0.019622802734375,
-0.017578125,
-0.038421630859375,
0.01535797119140625,
0.03021240234375,
0.004444122314453125,
-0.032928466796875,
0.04522705078125,
-0.03851318359375,
0.04736328125,
0.01446533203125,
0.00128936767578125,
0.041717529296875,
-0.040069580078125,
-0.032196044921875,
-0.01629638671875,
0.0919189453125,
0.0216217041015625,
-0.0012454986572265625,
0.006175994873046875,
-0.00658416748046875,
0.00162506103515625,
0.00534820556640625,
-0.078369140625,
-0.046630859375,
0.020721435546875,
-0.0294952392578125,
-0.0278472900390625,
0.00970458984375,
-0.06158447265625,
0.008026123046875,
-0.0021533966064453125,
0.04522705078125,
-0.039886474609375,
-0.0179443359375,
0.0061798095703125,
-0.0012884140014648438,
0.016815185546875,
0.0236358642578125,
-0.059173583984375,
0.0183868408203125,
0.0180511474609375,
0.07037353515625,
0.0122528076171875,
-0.04180908203125,
-0.0238494873046875,
-0.00982666015625,
-0.01450347900390625,
0.0305023193359375,
-0.0070648193359375,
-0.02740478515625,
-0.0255126953125,
0.0220794677734375,
-0.0189666748046875,
-0.01506805419921875,
0.0369873046875,
-0.0204620361328125,
0.033203125,
-0.0176544189453125,
-0.029876708984375,
-0.0174102783203125,
0.0056304931640625,
-0.045623779296875,
0.08599853515625,
0.018798828125,
-0.05120849609375,
0.018096923828125,
-0.07464599609375,
-0.01397705078125,
0.00001609325408935547,
-0.006450653076171875,
-0.040771484375,
-0.007843017578125,
0.0205230712890625,
0.0380859375,
-0.02850341796875,
0.011962890625,
-0.00948333740234375,
-0.0271148681640625,
0.00394439697265625,
-0.045684814453125,
0.07757568359375,
0.01904296875,
-0.040985107421875,
0.02899169921875,
-0.065185546875,
-0.004840850830078125,
0.030853271484375,
-0.0270538330078125,
0.017822265625,
-0.02117919921875,
-0.0081024169921875,
0.01348114013671875,
0.03814697265625,
-0.0295562744140625,
0.0216064453125,
-0.03204345703125,
0.051025390625,
0.059844970703125,
0.00543212890625,
0.0316162109375,
-0.0220489501953125,
0.043609619140625,
0.004138946533203125,
0.0312347412109375,
-0.01090240478515625,
-0.049285888671875,
-0.05224609375,
-0.019866943359375,
0.0087890625,
0.041412353515625,
-0.03131103515625,
0.0653076171875,
-0.0142364501953125,
-0.052581787109375,
-0.048309326171875,
-0.0022716522216796875,
0.0277099609375,
0.04656982421875,
0.04180908203125,
-0.01702880859375,
-0.037261962890625,
-0.06103515625,
0.0092926025390625,
-0.006366729736328125,
0.004070281982421875,
0.037933349609375,
0.06695556640625,
-0.0153045654296875,
0.0413818359375,
-0.05523681640625,
-0.019378662109375,
-0.00562286376953125,
0.01056671142578125,
0.036163330078125,
0.051788330078125,
0.052734375,
-0.033111572265625,
-0.0294952392578125,
-0.0126953125,
-0.054779052734375,
0.01090240478515625,
-0.01129913330078125,
-0.0209808349609375,
0.0150299072265625,
0.0219879150390625,
-0.063232421875,
0.040069580078125,
0.033416748046875,
-0.048095703125,
0.05206298828125,
-0.026458740234375,
0.02001953125,
-0.0894775390625,
0.020721435546875,
-0.00611114501953125,
-0.00045680999755859375,
-0.022857666015625,
0.00388336181640625,
-0.0013036727905273438,
0.001186370849609375,
-0.0343017578125,
0.039947509765625,
-0.041259765625,
0.01123046875,
0.005893707275390625,
-0.01250457763671875,
-0.01052093505859375,
0.05621337890625,
-0.0084381103515625,
0.06573486328125,
0.0518798828125,
-0.04083251953125,
0.0274505615234375,
0.0211639404296875,
-0.0175323486328125,
0.020050048828125,
-0.06524658203125,
0.00897216796875,
0.023162841796875,
0.0199432373046875,
-0.076416015625,
-0.036163330078125,
0.040985107421875,
-0.058197021484375,
0.0146484375,
-0.0096282958984375,
-0.03924560546875,
-0.051849365234375,
-0.026885986328125,
0.02752685546875,
0.04290771484375,
-0.03875732421875,
0.03582763671875,
0.0044097900390625,
0.018585205078125,
-0.036346435546875,
-0.043182373046875,
-0.0206451416015625,
-0.00553131103515625,
-0.055419921875,
0.023895263671875,
-0.0157928466796875,
0.010772705078125,
0.0094146728515625,
-0.01085662841796875,
-0.0050811767578125,
0.01497650146484375,
0.021484375,
0.0283660888671875,
-0.016082763671875,
-0.02783203125,
-0.0024547576904296875,
-0.0019664764404296875,
0.0087890625,
-0.016754150390625,
0.0721435546875,
-0.0200347900390625,
-0.01529693603515625,
-0.057403564453125,
-0.00551605224609375,
0.05120849609375,
-0.022705078125,
0.07415771484375,
0.06378173828125,
-0.03875732421875,
0.0089874267578125,
-0.018585205078125,
-0.02484130859375,
-0.036163330078125,
0.024627685546875,
-0.037689208984375,
-0.046844482421875,
0.047576904296875,
0.01511383056640625,
0.0181884765625,
0.06365966796875,
0.052734375,
-0.0016679763793945312,
0.08209228515625,
0.01364898681640625,
-0.004940032958984375,
0.0550537109375,
-0.06591796875,
-0.002593994140625,
-0.06695556640625,
-0.019378662109375,
-0.037689208984375,
-0.0206756591796875,
-0.043731689453125,
-0.0289306640625,
0.02374267578125,
0.0224609375,
-0.041412353515625,
0.0313720703125,
-0.057159423828125,
0.0237884521484375,
0.0595703125,
0.01165771484375,
-0.00632476806640625,
-0.01453399658203125,
-0.01418304443359375,
0.007045745849609375,
-0.050933837890625,
-0.030242919921875,
0.09625244140625,
0.0255889892578125,
0.0400390625,
-0.0003478527069091797,
0.054046630859375,
-0.0068359375,
-0.00395965576171875,
-0.03790283203125,
0.0382080078125,
0.0148162841796875,
-0.046051025390625,
-0.024627685546875,
-0.036407470703125,
-0.0665283203125,
0.0091552734375,
-0.00238037109375,
-0.06903076171875,
0.01418304443359375,
0.0211639404296875,
-0.034515380859375,
0.031829833984375,
-0.055145263671875,
0.09100341796875,
-0.0157623291015625,
-0.0289306640625,
-0.007282257080078125,
-0.050933837890625,
0.0273895263671875,
0.01406097412109375,
-0.0016946792602539062,
0.00039649009704589844,
0.01500701904296875,
0.07269287109375,
-0.056060791015625,
0.04693603515625,
-0.0182647705078125,
0.0142059326171875,
0.026458740234375,
0.0023822784423828125,
0.04241943359375,
0.006099700927734375,
0.0088958740234375,
0.0239105224609375,
0.00690460205078125,
-0.03131103515625,
-0.013946533203125,
0.0703125,
-0.07977294921875,
-0.0164947509765625,
-0.048583984375,
-0.033843994140625,
0.00957489013671875,
0.02850341796875,
0.037689208984375,
0.029571533203125,
-0.0077667236328125,
0.022857666015625,
0.037322998046875,
-0.0167999267578125,
0.0260009765625,
0.028564453125,
-0.005641937255859375,
-0.047882080078125,
0.073974609375,
0.0007472038269042969,
0.01126861572265625,
0.016632080078125,
0.00847625732421875,
-0.0193023681640625,
-0.038299560546875,
-0.051025390625,
0.0177154541015625,
-0.03765869140625,
-0.02618408203125,
-0.047576904296875,
-0.006683349609375,
-0.04541015625,
-0.00623321533203125,
-0.0293731689453125,
-0.032470703125,
-0.0511474609375,
-0.007965087890625,
0.040802001953125,
0.0299530029296875,
-0.007595062255859375,
0.03466796875,
-0.045166015625,
0.029815673828125,
-0.00007921457290649414,
0.0209197998046875,
-0.0021381378173828125,
-0.0518798828125,
-0.01580810546875,
0.01345062255859375,
-0.048095703125,
-0.0794677734375,
0.030517578125,
-0.0049285888671875,
0.0364990234375,
0.0252838134765625,
-0.0210723876953125,
0.055908203125,
-0.024139404296875,
0.08233642578125,
0.01374053955078125,
-0.05926513671875,
0.05133056640625,
-0.033721923828125,
0.029571533203125,
0.03497314453125,
0.0291900634765625,
-0.01910400390625,
-0.0130615234375,
-0.0672607421875,
-0.08209228515625,
0.07806396484375,
0.0286102294921875,
-0.01053619384765625,
0.0012998580932617188,
0.017608642578125,
-0.0146331787109375,
0.017852783203125,
-0.057403564453125,
-0.037445068359375,
-0.0229034423828125,
-0.02191162109375,
0.005466461181640625,
-0.0017671585083007812,
-0.0025844573974609375,
-0.044952392578125,
0.07452392578125,
-0.0014591217041015625,
0.0355224609375,
0.0194854736328125,
-0.00264739990234375,
-0.0258026123046875,
-0.0018930435180664062,
0.035675048828125,
0.045623779296875,
-0.034423828125,
-0.0216217041015625,
0.01297760009765625,
-0.039306640625,
-0.00414276123046875,
0.0282440185546875,
-0.02630615234375,
-0.006343841552734375,
0.0234832763671875,
0.07415771484375,
0.0187225341796875,
-0.0283355712890625,
0.0261077880859375,
-0.0095977783203125,
-0.0257110595703125,
-0.033843994140625,
0.0096282958984375,
0.005519866943359375,
0.01514434814453125,
0.01435089111328125,
0.0106201171875,
0.006343841552734375,
-0.0360107421875,
-0.007205963134765625,
0.03228759765625,
-0.005123138427734375,
-0.036163330078125,
0.07208251953125,
0.005035400390625,
-0.0253753662109375,
0.041473388671875,
-0.028717041015625,
-0.03582763671875,
0.056549072265625,
0.0231475830078125,
0.07476806640625,
-0.015838623046875,
0.0008401870727539062,
0.055694580078125,
0.028656005859375,
-0.0150299072265625,
0.038726806640625,
0.00193023681640625,
-0.035491943359375,
-0.0028553009033203125,
-0.06048583984375,
-0.0142059326171875,
0.034393310546875,
-0.04888916015625,
0.035430908203125,
-0.045745849609375,
-0.004116058349609375,
0.01294708251953125,
0.01558685302734375,
-0.07373046875,
0.018768310546875,
-0.0103607177734375,
0.06964111328125,
-0.057647705078125,
0.055633544921875,
0.06439208984375,
-0.060577392578125,
-0.0814208984375,
-0.00798797607421875,
-0.006923675537109375,
-0.045196533203125,
0.016357421875,
0.0026912689208984375,
0.0112457275390625,
0.023223876953125,
-0.04522705078125,
-0.06585693359375,
0.11138916015625,
0.01355743408203125,
-0.045928955078125,
-0.01983642578125,
-0.0126190185546875,
0.03826904296875,
-0.00960540771484375,
0.036163330078125,
0.0477294921875,
0.032073974609375,
0.00588226318359375,
-0.07879638671875,
0.0189208984375,
-0.024566650390625,
-0.021026611328125,
0.022857666015625,
-0.06463623046875,
0.095703125,
-0.01580810546875,
0.00666046142578125,
0.0245819091796875,
0.0478515625,
0.0242919921875,
0.0110626220703125,
0.0210113525390625,
0.056732177734375,
0.050384521484375,
-0.0242919921875,
0.07745361328125,
-0.039886474609375,
0.043914794921875,
0.0703125,
0.01459503173828125,
0.038299560546875,
0.024261474609375,
-0.0270538330078125,
0.0275726318359375,
0.05523681640625,
-0.0234222412109375,
0.03997802734375,
0.00009822845458984375,
-0.00388336181640625,
-0.0032501220703125,
0.020721435546875,
-0.039520263671875,
0.0151214599609375,
0.017608642578125,
-0.03961181640625,
-0.0084228515625,
-0.0146331787109375,
0.0146331787109375,
-0.027191162109375,
-0.01409149169921875,
0.0477294921875,
-0.01233673095703125,
-0.050018310546875,
0.056549072265625,
0.0067291259765625,
0.0501708984375,
-0.05120849609375,
-0.00872802734375,
-0.005893707275390625,
0.0301513671875,
-0.0154266357421875,
-0.049896240234375,
-0.0004782676696777344,
0.0003304481506347656,
-0.011627197265625,
0.0002397298812866211,
0.032928466796875,
-0.013702392578125,
-0.039031982421875,
0.0168609619140625,
0.026397705078125,
0.0253753662109375,
-0.006591796875,
-0.06591796875,
0.00855255126953125,
0.0159454345703125,
-0.041534423828125,
0.010406494140625,
0.0411376953125,
0.0007581710815429688,
0.03271484375,
0.059600830078125,
0.00748443603515625,
0.013397216796875,
-0.002651214599609375,
0.071044921875,
-0.0438232421875,
-0.032318115234375,
-0.068115234375,
0.04180908203125,
0.0010986328125,
-0.060699462890625,
0.053466796875,
0.058074951171875,
0.07098388671875,
-0.00907135009765625,
0.0511474609375,
-0.029632568359375,
0.01641845703125,
-0.03057861328125,
0.049346923828125,
-0.050262451171875,
0.01003265380859375,
-0.03472900390625,
-0.05291748046875,
0.0113525390625,
0.06622314453125,
-0.017913818359375,
0.016998291015625,
0.04888916015625,
0.0570068359375,
-0.0147247314453125,
0.0027256011962890625,
-0.003612518310546875,
0.025054931640625,
0.036407470703125,
0.05364990234375,
0.0401611328125,
-0.050933837890625,
0.047210693359375,
-0.050323486328125,
-0.0215301513671875,
-0.0221710205078125,
-0.0347900390625,
-0.0697021484375,
-0.037841796875,
-0.0187530517578125,
-0.035552978515625,
-0.00551605224609375,
0.08306884765625,
0.04058837890625,
-0.06463623046875,
-0.01148223876953125,
-0.033050537109375,
-0.01274871826171875,
-0.0276031494140625,
-0.0259552001953125,
0.03436279296875,
-0.005645751953125,
-0.0633544921875,
0.026641845703125,
-0.009307861328125,
0.01418304443359375,
-0.022125244140625,
-0.0232696533203125,
-0.0283355712890625,
-0.004940032958984375,
0.02032470703125,
0.02252197265625,
-0.041259765625,
-0.00386810302734375,
0.008544921875,
-0.0011472702026367188,
0.002704620361328125,
0.0295867919921875,
-0.04681396484375,
0.0203857421875,
0.040802001953125,
0.007198333740234375,
0.050445556640625,
-0.011688232421875,
0.028045654296875,
-0.04803466796875,
0.01451873779296875,
0.0228118896484375,
0.0467529296875,
0.01529693603515625,
-0.029052734375,
0.049957275390625,
0.032684326171875,
-0.05010986328125,
-0.07489013671875,
-0.01090240478515625,
-0.0692138671875,
-0.01427459716796875,
0.07696533203125,
-0.0093994140625,
-0.02593994140625,
0.0236358642578125,
-0.0299530029296875,
0.03985595703125,
-0.044921875,
0.04736328125,
0.043060302734375,
-0.01139068603515625,
0.004852294921875,
-0.05224609375,
0.03717041015625,
0.020904541015625,
-0.0657958984375,
-0.005756378173828125,
0.047027587890625,
0.038360595703125,
0.0216827392578125,
0.06048583984375,
-0.005107879638671875,
0.043304443359375,
-0.0030918121337890625,
0.01434326171875,
-0.0186614990234375,
-0.0279541015625,
-0.0255889892578125,
-0.006763458251953125,
-0.0112457275390625,
-0.040557861328125
]
] |
textattack/albert-base-v2-CoLA | 2020-07-06T16:28:50.000Z | [
"transformers",
"pytorch",
"albert",
"text-classification",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | textattack | null | null | textattack/albert-base-v2-CoLA | 0 | 16,154 | transformers | 2022-03-02T23:29:05 | ## TextAttack Model Cardand the glue dataset loaded using the `nlp` library. The model was fine-tuned
for 5 epochs with a batch size of 32, a learning
rate of 3e-05, and a maximum sequence length of 128.
Since this was a classification task, the model was trained with a cross-entropy loss function.
The best score the model achieved on this task was 0.8245445829338447, as measured by the
eval set accuracy, found after 2 epochs.
For more information, check out [TextAttack on Github](https://github.com/QData/TextAttack).
| 530 | [
[
-0.00289154052734375,
-0.03985595703125,
0.0245819091796875,
0.016357421875,
-0.01512908935546875,
-0.00028514862060546875,
-0.0108489990234375,
-0.03472900390625,
-0.002208709716796875,
0.01074981689453125,
-0.03192138671875,
-0.050628662109375,
-0.044830322265625,
0.0037899017333984375,
-0.03228759765625,
0.10479736328125,
0.0176239013671875,
-0.0069122314453125,
-0.01806640625,
-0.00499725341796875,
-0.0377197265625,
-0.0211639404296875,
-0.04217529296875,
-0.027008056640625,
0.03466796875,
0.0297393798828125,
0.0601806640625,
0.0657958984375,
0.054107666015625,
0.015533447265625,
-0.02423095703125,
0.0012264251708984375,
-0.038787841796875,
-0.024566650390625,
-0.0100250244140625,
-0.055877685546875,
-0.054443359375,
0.00653839111328125,
0.0374755859375,
0.00173187255859375,
-0.0174407958984375,
0.049591064453125,
0.0041046142578125,
0.05047607421875,
-0.0321044921875,
-0.00652313232421875,
-0.05419921875,
0.0157623291015625,
-0.01202392578125,
-0.00760650634765625,
-0.05401611328125,
-0.0145721435546875,
0.016204833984375,
-0.0469970703125,
0.007541656494140625,
0.020538330078125,
0.0753173828125,
0.020416259765625,
-0.0145721435546875,
0.000576019287109375,
-0.030731201171875,
0.07269287109375,
-0.05010986328125,
0.01523590087890625,
0.05706787109375,
0.0050201416015625,
-0.007965087890625,
-0.0631103515625,
-0.014434814453125,
0.016754150390625,
0.0170440673828125,
-0.0122833251953125,
-0.00626373291015625,
-0.00739288330078125,
0.0262451171875,
0.03045654296875,
-0.0843505859375,
0.0109100341796875,
-0.050048828125,
-0.0195465087890625,
0.077880859375,
0.032135009765625,
0.021881103515625,
-0.00949859619140625,
-0.046844482421875,
-0.01885986328125,
-0.006961822509765625,
-0.004016876220703125,
0.0238494873046875,
0.01430511474609375,
-0.015716552734375,
0.0390625,
0.004123687744140625,
0.0599365234375,
0.003299713134765625,
-0.031524658203125,
0.01910400390625,
-0.003170013427734375,
-0.0440673828125,
-0.0016155242919921875,
0.0721435546875,
0.023956298828125,
0.006679534912109375,
0.004039764404296875,
-0.01213836669921875,
-0.0092315673828125,
0.024261474609375,
-0.052032470703125,
-0.029876708984375,
0.03338623046875,
-0.0469970703125,
-0.033447265625,
-0.002429962158203125,
-0.0245208740234375,
-0.0154571533203125,
-0.019989013671875,
0.04302978515625,
-0.06024169921875,
0.0028820037841796875,
0.0235443115234375,
-0.0282135009765625,
0.0181884765625,
0.01509857177734375,
-0.04925537109375,
0.01824951171875,
0.044586181640625,
0.0889892578125,
-0.03265380859375,
-0.013671875,
0.009124755859375,
-0.017578125,
-0.027679443359375,
0.070556640625,
-0.03466796875,
-0.0078582763671875,
-0.02752685546875,
-0.00146484375,
-0.00485992431640625,
-0.0164337158203125,
0.03857421875,
-0.031036376953125,
0.033203125,
0.01751708984375,
-0.07257080078125,
-0.03466796875,
0.00714111328125,
-0.059326171875,
0.08770751953125,
0.023101806640625,
-0.04168701171875,
0.042266845703125,
-0.035552978515625,
-0.0100860595703125,
0.0123748779296875,
0.0250244140625,
-0.060455322265625,
0.010650634765625,
0.00211334228515625,
0.04266357421875,
-0.0307769775390625,
0.028045654296875,
-0.0296630859375,
-0.0380859375,
0.01503753662109375,
-0.029693603515625,
0.07659912109375,
0.006015777587890625,
-0.00223541259765625,
-0.009765625,
-0.07025146484375,
0.0277252197265625,
-0.002750396728515625,
-0.033050537109375,
-0.007415771484375,
-0.03326416015625,
0.0205078125,
0.0150604248046875,
0.02154541015625,
-0.03717041015625,
0.03875732421875,
-0.029205322265625,
0.01922607421875,
0.054443359375,
-0.0038547515869140625,
0.02276611328125,
-0.048828125,
0.045074462890625,
-0.005523681640625,
0.01061248779296875,
-0.02178955078125,
-0.0545654296875,
-0.02764892578125,
-0.0186920166015625,
0.0411376953125,
0.05303955078125,
-0.0253143310546875,
0.0341796875,
-0.01104736328125,
-0.0736083984375,
-0.033447265625,
0.0016336441040039062,
0.024688720703125,
0.0134429931640625,
0.0287628173828125,
-0.0111846923828125,
-0.0287628173828125,
-0.059295654296875,
-0.02313232421875,
-0.01678466796875,
-0.013641357421875,
0.001987457275390625,
0.061370849609375,
0.00858306884765625,
0.06829833984375,
-0.0615234375,
-0.040435791015625,
0.0113983154296875,
0.026641845703125,
0.026824951171875,
0.03564453125,
0.0279541015625,
-0.04736328125,
-0.0357666015625,
-0.01690673828125,
-0.033355712890625,
0.0133209228515625,
-0.00787353515625,
0.006000518798828125,
0.025299072265625,
0.022247314453125,
-0.04083251953125,
0.04986572265625,
0.05419921875,
-0.04833984375,
0.0517578125,
-0.01071929931640625,
0.00670623779296875,
-0.1046142578125,
0.0133819580078125,
0.0027179718017578125,
-0.025146484375,
-0.0303192138671875,
-0.010986328125,
0.004901885986328125,
-0.03253173828125,
-0.05889892578125,
0.04400634765625,
-0.031280517578125,
0.01194000244140625,
-0.00974273681640625,
-0.0125274658203125,
0.0127105712890625,
0.05078125,
-0.0010356903076171875,
0.059478759765625,
0.02215576171875,
-0.0294189453125,
0.0235443115234375,
0.0240631103515625,
-0.0178070068359375,
0.0343017578125,
-0.060546875,
0.03509521484375,
0.008392333984375,
0.006649017333984375,
-0.0784912109375,
-0.02459716796875,
-0.0228271484375,
-0.046478271484375,
0.0166168212890625,
0.00952911376953125,
-0.035675048828125,
-0.0193023681640625,
-0.0440673828125,
0.041015625,
0.032440185546875,
-0.0220794677734375,
0.033782958984375,
0.036041259765625,
0.0221099853515625,
-0.037689208984375,
-0.0501708984375,
-0.009124755859375,
-0.032562255859375,
-0.0413818359375,
0.0399169921875,
-0.025390625,
0.0223388671875,
-0.01580810546875,
0.0077056884765625,
-0.0283203125,
-0.00958251953125,
0.025482177734375,
0.012298583984375,
-0.00592803955078125,
0.038543701171875,
-0.00968170166015625,
-0.007045745849609375,
-0.01113128662109375,
-0.0093841552734375,
0.039581298828125,
-0.0232696533203125,
0.0120086669921875,
-0.04571533203125,
-0.004131317138671875,
0.04644775390625,
0.00567626953125,
0.0863037109375,
0.050140380859375,
-0.044189453125,
-0.01715087890625,
-0.016326904296875,
-0.00537872314453125,
-0.03497314453125,
0.0289764404296875,
-0.030242919921875,
-0.053924560546875,
0.04248046875,
0.01409912109375,
0.0007076263427734375,
0.05804443359375,
0.035308837890625,
0.002101898193359375,
0.06201171875,
0.053924560546875,
-0.019073486328125,
0.0159149169921875,
-0.01910400390625,
-0.0013637542724609375,
-0.049346923828125,
-0.0233154296875,
-0.022430419921875,
-0.0255126953125,
-0.04193115234375,
-0.038421630859375,
0.012298583984375,
0.0231475830078125,
-0.013885498046875,
0.054534912109375,
-0.05908203125,
0.037994384765625,
0.035919189453125,
0.023712158203125,
0.005275726318359375,
-0.015777587890625,
0.00743865966796875,
-0.0013971328735351562,
-0.056549072265625,
-0.0245208740234375,
0.09197998046875,
0.04730224609375,
0.05682373046875,
-0.00824737548828125,
0.03790283203125,
0.03338623046875,
-0.00469970703125,
-0.061370849609375,
0.042633056640625,
-0.008056640625,
-0.04443359375,
-0.039642333984375,
0.0010433197021484375,
-0.06298828125,
-0.02935791015625,
-0.0289764404296875,
-0.05914306640625,
-0.0095977783203125,
0.023468017578125,
-0.0120849609375,
0.0228271484375,
-0.03985595703125,
0.06695556640625,
0.01227569580078125,
-0.01352691650390625,
0.0005202293395996094,
-0.051910400390625,
0.0312347412109375,
-0.0165252685546875,
-0.014312744140625,
-0.0296478271484375,
-0.003692626953125,
0.08135986328125,
-0.016937255859375,
0.037628173828125,
0.0225372314453125,
0.0008535385131835938,
0.0220184326171875,
-0.00583648681640625,
0.0270538330078125,
-0.0107269287109375,
0.0009775161743164062,
0.030029296875,
0.00982666015625,
-0.02325439453125,
-0.0426025390625,
0.016510009765625,
-0.039581298828125,
0.0011157989501953125,
-0.0306396484375,
-0.054107666015625,
0.0129852294921875,
-0.0016622543334960938,
0.045501708984375,
0.03912353515625,
-0.007598876953125,
0.0242919921875,
0.0577392578125,
-0.0218505859375,
0.047210693359375,
0.03143310546875,
-0.01561737060546875,
-0.0307159423828125,
0.0780029296875,
0.0208282470703125,
0.01318359375,
0.0234375,
0.0200958251953125,
-0.009521484375,
-0.01256561279296875,
-0.022216796875,
0.018280029296875,
-0.02996826171875,
-0.0517578125,
-0.03289794921875,
-0.0323486328125,
-0.02862548828125,
-0.0031986236572265625,
-0.037841796875,
-0.04876708984375,
-0.051513671875,
-0.00742340087890625,
0.05596923828125,
0.038055419921875,
0.002056121826171875,
0.0252532958984375,
-0.06689453125,
0.0040740966796875,
-0.0004744529724121094,
0.043548583984375,
-0.019378662109375,
-0.0643310546875,
-0.044647216796875,
-0.0235443115234375,
-0.0223388671875,
-0.0660400390625,
0.0300750732421875,
0.046173095703125,
0.012420654296875,
0.0273895263671875,
0.0031280517578125,
0.04742431640625,
-0.046783447265625,
0.08111572265625,
-0.0034084320068359375,
-0.06201171875,
0.0518798828125,
-0.029541015625,
0.07073974609375,
0.020355224609375,
0.049957275390625,
-0.018341064453125,
-0.043975830078125,
-0.07415771484375,
-0.059661865234375,
0.037689208984375,
0.008758544921875,
-0.01403045654296875,
0.01416015625,
0.027618408203125,
0.00006103515625,
-0.0028858184814453125,
-0.053863525390625,
-0.01519775390625,
0.017120361328125,
-0.03582763671875,
0.003635406494140625,
-0.024627685546875,
0.0106201171875,
-0.0194091796875,
0.06854248046875,
-0.0142364501953125,
0.041229248046875,
-0.003612518310546875,
-0.014801025390625,
0.0111083984375,
0.00418853759765625,
0.07281494140625,
0.047119140625,
-0.0224609375,
-0.0182342529296875,
0.022979736328125,
-0.065185546875,
0.015777587890625,
0.0023136138916015625,
0.00887298583984375,
-0.0060577392578125,
0.047698974609375,
0.06396484375,
-0.009857177734375,
-0.04754638671875,
0.03271484375,
-0.00463104248046875,
-0.01087188720703125,
-0.0245208740234375,
0.014312744140625,
-0.01346588134765625,
0.0146331787109375,
0.0278472900390625,
-0.01476287841796875,
0.02734375,
-0.0036830902099609375,
0.048431396484375,
0.0180206298828125,
-0.05377197265625,
-0.0309906005859375,
0.0543212890625,
-0.01702880859375,
-0.0469970703125,
0.05780029296875,
-0.0003185272216796875,
-0.0313720703125,
0.0391845703125,
0.041900634765625,
0.068359375,
-0.0467529296875,
0.0271453857421875,
0.039398193359375,
0.019287109375,
-0.0194091796875,
0.01445770263671875,
0.01385498046875,
-0.038970947265625,
-0.02655029296875,
-0.0611572265625,
-0.01064300537109375,
0.02117919921875,
-0.0660400390625,
0.03289794921875,
-0.0171356201171875,
-0.03765869140625,
0.00569915771484375,
0.01123809814453125,
-0.03875732421875,
0.047332763671875,
-0.0078125,
0.0948486328125,
-0.091064453125,
0.048431396484375,
0.04449462890625,
-0.0105133056640625,
-0.06591796875,
-0.02569580078125,
0.01433563232421875,
-0.041015625,
0.04278564453125,
0.021453857421875,
0.012939453125,
-0.0048828125,
-0.061798095703125,
-0.076171875,
0.08172607421875,
0.01392364501953125,
-0.03363037109375,
-0.009368896484375,
0.0010662078857421875,
0.04827880859375,
-0.0182342529296875,
0.02362060546875,
0.051605224609375,
0.00023949146270751953,
-0.0224151611328125,
-0.08563232421875,
-0.01146697998046875,
-0.027618408203125,
0.00788116455078125,
0.021575927734375,
-0.0496826171875,
0.05706787109375,
-0.006378173828125,
-0.00017511844635009766,
0.0004367828369140625,
0.0675048828125,
0.0182342529296875,
0.0174560546875,
0.0250091552734375,
0.0672607421875,
0.054473876953125,
0.0020275115966796875,
0.062408447265625,
-0.0232086181640625,
0.046539306640625,
0.0948486328125,
0.01580810546875,
0.07525634765625,
0.01302337646484375,
-0.017181396484375,
0.040130615234375,
0.05419921875,
-0.01233673095703125,
0.0256805419921875,
0.0098114013671875,
0.00909423828125,
-0.0011119842529296875,
-0.00872802734375,
0.004840850830078125,
0.042694091796875,
0.0248870849609375,
-0.029937744140625,
0.01995849609375,
0.0272674560546875,
0.01409912109375,
-0.019775390625,
-0.0115966796875,
0.05999755859375,
0.00438690185546875,
-0.0286865234375,
0.03509521484375,
-0.007518768310546875,
0.0254669189453125,
-0.029541015625,
-0.02166748046875,
0.0090484619140625,
0.00853729248046875,
-0.0183563232421875,
-0.07611083984375,
0.0201568603515625,
-0.030029296875,
-0.0310211181640625,
0.0036525726318359375,
0.0599365234375,
-0.046295166015625,
-0.032562255859375,
-0.00957489013671875,
-0.008056640625,
0.0234375,
0.00923919677734375,
-0.062744140625,
0.0081634521484375,
-0.00713348388671875,
-0.044708251953125,
0.01366424560546875,
0.0216217041015625,
0.0008087158203125,
0.061614990234375,
0.035858154296875,
-0.019775390625,
-0.0013027191162109375,
-0.040435791015625,
0.036865234375,
-0.047607421875,
-0.035430908203125,
-0.0648193359375,
0.060638427734375,
-0.00954437255859375,
-0.0648193359375,
0.05645751953125,
0.0653076171875,
0.039886474609375,
-0.00849151611328125,
0.0138092041015625,
0.00556182861328125,
0.042144775390625,
-0.044403076171875,
0.039306640625,
-0.04266357421875,
0.0018482208251953125,
-0.03173828125,
-0.06072998046875,
-0.0280914306640625,
0.04632568359375,
0.0153350830078125,
-0.0096282958984375,
0.056640625,
0.051605224609375,
0.0164947509765625,
0.0062255859375,
0.025421142578125,
-0.0045318603515625,
0.01194000244140625,
0.04962158203125,
0.01137542724609375,
-0.0753173828125,
0.0248870849609375,
-0.0096588134765625,
-0.012359619140625,
-0.005504608154296875,
-0.0791015625,
-0.074462890625,
-0.062744140625,
-0.038055419921875,
-0.04266357421875,
-0.006748199462890625,
0.06109619140625,
0.057037353515625,
-0.04132080078125,
-0.0168304443359375,
0.006298065185546875,
-0.005207061767578125,
-0.0214996337890625,
-0.0232391357421875,
0.0494384765625,
-0.007110595703125,
-0.0498046875,
0.01043701171875,
0.0277252197265625,
0.014312744140625,
-0.00449371337890625,
0.0088958740234375,
-0.039306640625,
0.0145111083984375,
0.03167724609375,
0.006378173828125,
-0.0249481201171875,
-0.0306854248046875,
0.0016794204711914062,
-0.006282806396484375,
0.0081939697265625,
0.03448486328125,
-0.054290771484375,
0.0382080078125,
0.050048828125,
0.037841796875,
0.038055419921875,
0.00464630126953125,
0.05889892578125,
-0.062347412109375,
0.0113983154296875,
0.020050048828125,
-0.00669097900390625,
0.00666046142578125,
-0.0389404296875,
0.052032470703125,
0.00434112548828125,
-0.0771484375,
-0.0706787109375,
-0.007167816162109375,
-0.06378173828125,
-0.020172119140625,
0.061676025390625,
-0.0128936767578125,
-0.045989990234375,
0.002593994140625,
-0.006832122802734375,
0.012359619140625,
-0.0249786376953125,
0.04266357421875,
0.05731201171875,
-0.0183563232421875,
-0.021026611328125,
-0.0150146484375,
0.064697265625,
0.01195526123046875,
-0.057037353515625,
-0.024749755859375,
0.008941650390625,
0.0197601318359375,
0.031646728515625,
0.0460205078125,
0.009613037109375,
0.0283203125,
0.0159454345703125,
0.006847381591796875,
0.007099151611328125,
-0.0496826171875,
-0.0295257568359375,
0.0155792236328125,
-0.013824462890625,
-0.01041412353515625
]
] |
elyza/ELYZA-japanese-Llama-2-7b-fast-instruct | 2023-08-29T03:47:09.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"ja",
"en",
"arxiv:2307.09288",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | elyza | null | null | elyza/ELYZA-japanese-Llama-2-7b-fast-instruct | 51 | 16,152 | transformers | 2023-08-28T13:36:19 | ---
license: llama2
language:
- ja
- en
---
## ELYZA-japanese-Llama-2-7b

### Model Description
**ELYZA-japanese-Llama-2-7b** は、 Llama2をベースとして日本語能力を拡張するために追加事前学習を行ったモデルです。
詳細は [Blog記事](https://note.com/elyza/n/na405acaca130) を参照してください。
### Usage
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"
DEFAULT_SYSTEM_PROMPT = "あなたは誠実で優秀な日本人のアシスタントです。"
text = "クマが海辺に行ってアザラシと友達になり、最終的には家に帰るというプロットの短編小説を書いてください。"
model_name = "elyza/ELYZA-japanese-Llama-2-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")
if torch.cuda.is_available():
model = model.to("cuda")
prompt = "{bos_token}{b_inst} {system}{prompt} {e_inst} ".format(
bos_token=tokenizer.bos_token,
b_inst=B_INST,
system=f"{B_SYS}{DEFAULT_SYSTEM_PROMPT}{E_SYS}",
prompt=text,
e_inst=E_INST,
)
with torch.no_grad():
token_ids = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt")
output_ids = model.generate(
token_ids.to(model.device),
max_new_tokens=256,
pad_token_id=tokenizer.pad_token_id,
eos_token_id=tokenizer.eos_token_id,
)
output = tokenizer.decode(output_ids.tolist()[0][token_ids.size(1) :], skip_special_tokens=True)
print(output)
"""
承知しました。以下にクマが海辺に行ってアザラシと友達になり、最終的には家に帰るというプロットの短編小説を記述します。
クマは山の中でゆっくりと眠っていた。
その眠りに落ちたクマは、夢の中で海辺を歩いていた。
そこにはアザラシがいた。
クマはアザラシに話しかける。
「おはよう」とクマが言うと、アザラシは驚いたように顔を上げた。
「あ、こんにちは」アザラシは答えた。
クマはアザラシと友達になりたいと思う。
「私はクマと申します。」クマは...
"""
```
### ELYZA-japanese-Llama-2-7b Models
| Model Name | Vocab Size | #Params |
|:---------------------------------------------|:----------:|:-------:|
|[elyza/ELYZA-japanese-Llama-2-7b](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b)| 32000 | 6.27B |
|[elyza/ELYZA-japanese-Llama-2-7b-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-instruct)| 32000 | 6.27B |
|[elyza/ELYZA-japanese-Llama-2-7b-fast](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast)| 45043 | 6.37B |
|[elyza/ELYZA-japanese-Llama-2-7b-fast-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct)| 45043 | 6.37B |
### Developers
以下アルファベット順
- [Akira Sasaki](https://huggingface.co/akirasasaki)
- [Masato Hirakawa](https://huggingface.co/m-hirakawa)
- [Shintaro Horie](https://huggingface.co/e-mon)
- [Tomoaki Nakamura](https://huggingface.co/tyoyo)
### Licence
Llama 2 is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
### How to Cite
```tex
@misc{elyzallama2023,
title={ELYZA-japanese-Llama-2-7b},
url={https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b},
author={Akira Sasaki and Masato Hirakawa and Shintaro Horie and Tomoaki Nakamura},
year={2023},
}
```
### Citations
```tex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 4,469 | [
[
-0.0343017578125,
-0.046722412109375,
0.0199127197265625,
0.0262298583984375,
-0.040679931640625,
0.005817413330078125,
0.01020050048828125,
-0.046600341796875,
0.044830322265625,
0.00803375244140625,
-0.046356201171875,
-0.045318603515625,
-0.042816162109375,
0.0149383544921875,
-0.00800323486328125,
0.0562744140625,
-0.0100555419921875,
-0.0239410400390625,
0.0034656524658203125,
-0.001407623291015625,
-0.0154876708984375,
-0.0282440185546875,
-0.038421630859375,
-0.0228271484375,
0.0208587646484375,
0.01097869873046875,
0.04168701171875,
0.050445556640625,
0.038787841796875,
0.0310516357421875,
-0.0194091796875,
0.0211944580078125,
-0.0200347900390625,
-0.01556396484375,
0.0182037353515625,
-0.036865234375,
-0.05810546875,
-0.0219879150390625,
0.040374755859375,
0.0232086181640625,
0.00678253173828125,
0.02691650390625,
-0.0030612945556640625,
0.022796630859375,
-0.021087646484375,
0.0027294158935546875,
-0.0281524658203125,
0.00640106201171875,
-0.0163726806640625,
-0.0159759521484375,
-0.0093994140625,
-0.0264434814453125,
-0.020843505859375,
-0.06365966796875,
-0.00473785400390625,
0.005756378173828125,
0.10858154296875,
0.0167999267578125,
-0.0211029052734375,
-0.001003265380859375,
-0.01177978515625,
0.0648193359375,
-0.072998046875,
0.014373779296875,
0.0218353271484375,
-0.006092071533203125,
-0.0255279541015625,
-0.06097412109375,
-0.05474853515625,
-0.00673675537109375,
-0.02178955078125,
0.01482391357421875,
-0.034423828125,
-0.0230712890625,
0.01470947265625,
0.016204833984375,
-0.033447265625,
0.02197265625,
-0.040313720703125,
-0.00936126708984375,
0.05621337890625,
0.014129638671875,
0.043365478515625,
-0.026397705078125,
-0.042510986328125,
-0.013580322265625,
-0.04998779296875,
0.0183868408203125,
0.0262298583984375,
0.007442474365234375,
-0.0531005859375,
0.046661376953125,
-0.01548004150390625,
0.032135009765625,
0.00989532470703125,
-0.0276031494140625,
0.0484619140625,
-0.03277587890625,
-0.019561767578125,
-0.017120361328125,
0.08441162109375,
0.049957275390625,
-0.002124786376953125,
0.012542724609375,
0.0005512237548828125,
-0.0003230571746826172,
-0.0335693359375,
-0.06903076171875,
0.0122528076171875,
0.0244598388671875,
-0.044830322265625,
-0.0291900634765625,
-0.00543212890625,
-0.0654296875,
-0.005130767822265625,
0.0055389404296875,
0.016998291015625,
-0.01483917236328125,
-0.0325927734375,
0.0139617919921875,
0.0003857612609863281,
0.0308074951171875,
0.01134490966796875,
-0.049285888671875,
0.01062774658203125,
0.0291900634765625,
0.068359375,
0.00525665283203125,
-0.0232696533203125,
-0.01128387451171875,
0.016082763671875,
-0.01343536376953125,
0.050445556640625,
-0.021881103515625,
-0.038604736328125,
-0.020904541015625,
0.016143798828125,
-0.00872802734375,
-0.0212554931640625,
0.0276336669921875,
-0.0080718994140625,
0.005992889404296875,
-0.02349853515625,
-0.020355224609375,
-0.01512908935546875,
0.00705718994140625,
-0.0255279541015625,
0.0811767578125,
-0.0030765533447265625,
-0.06573486328125,
0.000032901763916015625,
-0.034515380859375,
-0.01253509521484375,
-0.00907135009765625,
-0.002178192138671875,
-0.042205810546875,
-0.01409149169921875,
0.032379150390625,
0.0340576171875,
-0.0306854248046875,
-0.0036144256591796875,
-0.0284881591796875,
-0.0223236083984375,
0.0245208740234375,
-0.0033283233642578125,
0.08135986328125,
0.0251312255859375,
-0.0341796875,
-0.0010347366333007812,
-0.06402587890625,
0.0042266845703125,
0.050048828125,
-0.0225677490234375,
0.0015106201171875,
-0.0136871337890625,
-0.00730133056640625,
0.01052093505859375,
0.043914794921875,
-0.042205810546875,
0.0193328857421875,
-0.032684326171875,
0.04058837890625,
0.06573486328125,
0.006496429443359375,
0.011016845703125,
-0.037567138671875,
0.033416748046875,
0.00978851318359375,
0.020355224609375,
-0.01007080078125,
-0.047607421875,
-0.07305908203125,
-0.0283355712890625,
-0.0114593505859375,
0.03765869140625,
-0.03851318359375,
0.0518798828125,
-0.00927734375,
-0.0604248046875,
-0.032867431640625,
0.0029315948486328125,
0.03466796875,
0.022186279296875,
0.0186920166015625,
-0.019775390625,
-0.0626220703125,
-0.0521240234375,
-0.0081024169921875,
-0.0268707275390625,
0.0167388916015625,
0.033660888671875,
0.0506591796875,
-0.031158447265625,
0.04754638671875,
-0.03741455078125,
-0.017791748046875,
-0.0149993896484375,
-0.0161895751953125,
0.050262451171875,
0.050384521484375,
0.056060791015625,
-0.03826904296875,
-0.041259765625,
0.01395416259765625,
-0.0660400390625,
-0.005725860595703125,
-0.000736236572265625,
-0.0372314453125,
0.0225677490234375,
0.017242431640625,
-0.054840087890625,
0.045135498046875,
0.0311126708984375,
-0.047760009765625,
0.02606201171875,
-0.01323699951171875,
0.01239013671875,
-0.08984375,
0.01010894775390625,
-0.007843017578125,
0.0017881393432617188,
-0.039337158203125,
0.0021457672119140625,
-0.013336181640625,
0.02288818359375,
-0.040008544921875,
0.0660400390625,
-0.03326416015625,
-0.0008745193481445312,
-0.004543304443359375,
0.026824951171875,
0.003292083740234375,
0.04730224609375,
-0.0078125,
0.046417236328125,
0.03692626953125,
-0.03717041015625,
0.036895751953125,
0.04217529296875,
-0.0209197998046875,
0.03399658203125,
-0.065185546875,
0.0198974609375,
0.00506591796875,
0.032958984375,
-0.0888671875,
-0.0162200927734375,
0.035980224609375,
-0.051422119140625,
0.00238800048828125,
-0.00897216796875,
-0.03436279296875,
-0.045684814453125,
-0.0318603515625,
0.02288818359375,
0.045440673828125,
-0.054473876953125,
0.02978515625,
0.0212249755859375,
0.0037078857421875,
-0.05621337890625,
-0.0548095703125,
-0.013763427734375,
-0.0206451416015625,
-0.057403564453125,
0.030792236328125,
-0.01666259765625,
-0.01380157470703125,
-0.01300811767578125,
-0.004848480224609375,
-0.0011625289916992188,
0.01151275634765625,
0.0206756591796875,
0.047393798828125,
-0.0181884765625,
-0.0286865234375,
-0.00016880035400390625,
-0.0130462646484375,
-0.0035457611083984375,
-0.003444671630859375,
0.06536865234375,
-0.02349853515625,
-0.02947998046875,
-0.06488037109375,
0.0103302001953125,
0.038299560546875,
-0.01556396484375,
0.0577392578125,
0.055755615234375,
-0.0283355712890625,
0.0268707275390625,
-0.041839599609375,
-0.0026683807373046875,
-0.038299560546875,
0.027435302734375,
-0.032684326171875,
-0.0384521484375,
0.06524658203125,
0.0235748291015625,
0.017425537109375,
0.05157470703125,
0.046478271484375,
0.0019054412841796875,
0.0753173828125,
0.04205322265625,
-0.003604888916015625,
0.041595458984375,
-0.049407958984375,
0.020904541015625,
-0.07489013671875,
-0.0452880859375,
-0.032073974609375,
-0.027679443359375,
-0.03533935546875,
-0.03045654296875,
0.01739501953125,
0.0104827880859375,
-0.0452880859375,
0.031494140625,
-0.051055908203125,
0.0213775634765625,
0.02752685546875,
0.0164947509765625,
0.01629638671875,
0.00994110107421875,
-0.01401519775390625,
0.00179290771484375,
-0.028106689453125,
-0.0297088623046875,
0.08184814453125,
0.03302001953125,
0.045196533203125,
0.019134521484375,
0.0626220703125,
-0.0117034912109375,
0.0007834434509277344,
-0.035491943359375,
0.050140380859375,
0.0132293701171875,
-0.0491943359375,
-0.0068817138671875,
-0.015045166015625,
-0.07659912109375,
0.036224365234375,
0.0003876686096191406,
-0.08135986328125,
0.020416259765625,
-0.0170440673828125,
-0.0301513671875,
0.03790283203125,
-0.036041259765625,
0.0396728515625,
-0.02423095703125,
-0.032867431640625,
-0.0018453598022460938,
-0.041229248046875,
0.0285491943359375,
0.01488494873046875,
0.0195770263671875,
-0.026031494140625,
-0.0219879150390625,
0.08001708984375,
-0.04638671875,
0.06591796875,
-0.007419586181640625,
-0.01201629638671875,
0.0291595458984375,
-0.00685882568359375,
0.05279541015625,
0.017242431640625,
0.0006856918334960938,
0.0196380615234375,
-0.00012803077697753906,
-0.02728271484375,
-0.01462554931640625,
0.05316162109375,
-0.089111328125,
-0.05426025390625,
-0.035675048828125,
-0.0135650634765625,
0.014892578125,
0.0178985595703125,
0.044342041015625,
0.00958251953125,
0.01552581787109375,
0.0104522705078125,
0.026824951171875,
-0.026336669921875,
0.053802490234375,
0.022247314453125,
-0.02288818359375,
-0.046783447265625,
0.05242919921875,
0.0109405517578125,
0.01451873779296875,
0.0199127197265625,
0.005146026611328125,
-0.0186767578125,
-0.01314544677734375,
-0.035430908203125,
0.055267333984375,
-0.05487060546875,
-0.0269012451171875,
-0.053131103515625,
-0.02410888671875,
-0.02874755859375,
-0.031707763671875,
-0.027252197265625,
-0.030609130859375,
-0.048736572265625,
-0.0121307373046875,
0.059661865234375,
0.036865234375,
-0.0096435546875,
0.027740478515625,
-0.039337158203125,
0.0180206298828125,
0.0028209686279296875,
0.01140594482421875,
0.015777587890625,
-0.06378173828125,
-0.0048370361328125,
0.0016736984252929688,
-0.026458740234375,
-0.0677490234375,
0.0545654296875,
-0.0035762786865234375,
0.04705810546875,
0.0242767333984375,
-0.006084442138671875,
0.0733642578125,
-0.01043701171875,
0.06549072265625,
0.0419921875,
-0.0670166015625,
0.047576904296875,
-0.029998779296875,
0.00118255615234375,
0.0023250579833984375,
0.0164337158203125,
-0.031402587890625,
-0.01099395751953125,
-0.058868408203125,
-0.07366943359375,
0.06915283203125,
0.0170135498046875,
0.0159149169921875,
0.0080108642578125,
0.0162506103515625,
-0.007381439208984375,
0.0036487579345703125,
-0.07342529296875,
-0.055419921875,
-0.016937255859375,
-0.0134124755859375,
0.00374603271484375,
-0.0189666748046875,
-0.01097869873046875,
-0.040130615234375,
0.061370849609375,
0.0034580230712890625,
0.04559326171875,
0.020263671875,
-0.004306793212890625,
-0.01043701171875,
-0.0013103485107421875,
0.05535888671875,
0.02978515625,
-0.0083160400390625,
-0.0149993896484375,
0.034423828125,
-0.0458984375,
0.0159912109375,
-0.001979827880859375,
-0.00982666015625,
0.00873565673828125,
0.0237274169921875,
0.06854248046875,
0.0174713134765625,
-0.0305633544921875,
0.0360107421875,
0.0026226043701171875,
-0.01039886474609375,
-0.03228759765625,
0.0006871223449707031,
0.0151824951171875,
0.033233642578125,
0.0301513671875,
-0.01358795166015625,
-0.0203094482421875,
-0.032012939453125,
-0.01016998291015625,
0.0251007080078125,
0.006282806396484375,
-0.0218048095703125,
0.062469482421875,
0.01136016845703125,
-0.014984130859375,
0.026214599609375,
-0.0011625289916992188,
-0.04046630859375,
0.07366943359375,
0.057525634765625,
0.04608154296875,
-0.016357421875,
-0.0008649826049804688,
0.063232421875,
0.0157623291015625,
0.01149749755859375,
0.0300750732421875,
0.0002067089080810547,
-0.039764404296875,
0.004344940185546875,
-0.05218505859375,
-0.00421142578125,
0.0145263671875,
-0.030487060546875,
0.031341552734375,
-0.045989990234375,
-0.0170440673828125,
-0.016357421875,
0.0295867919921875,
-0.047576904296875,
0.0013303756713867188,
0.00734710693359375,
0.052825927734375,
-0.05401611328125,
0.047882080078125,
0.041351318359375,
-0.0447998046875,
-0.0660400390625,
-0.026763916015625,
0.006977081298828125,
-0.0849609375,
0.0455322265625,
0.00279998779296875,
-0.00478363037109375,
0.0087432861328125,
-0.050048828125,
-0.0987548828125,
0.10858154296875,
0.01025390625,
-0.031768798828125,
0.0079193115234375,
-0.00733184814453125,
0.0296783447265625,
-0.0225372314453125,
0.047454833984375,
0.04046630859375,
0.04498291015625,
0.011993408203125,
-0.07159423828125,
0.0251922607421875,
-0.050994873046875,
0.0030612945556640625,
-0.00592041015625,
-0.0970458984375,
0.0859375,
-0.030181884765625,
-0.00783538818359375,
0.032318115234375,
0.06463623046875,
0.054595947265625,
0.0127105712890625,
0.0157012939453125,
0.038543701171875,
0.050811767578125,
-0.0178680419921875,
0.06329345703125,
-0.021270751953125,
0.04046630859375,
0.0305633544921875,
-0.0028247833251953125,
0.06243896484375,
0.03424072265625,
-0.047454833984375,
0.04962158203125,
0.058807373046875,
-0.022186279296875,
0.0250244140625,
0.00376129150390625,
-0.01415252685546875,
-0.0037384033203125,
-0.0071868896484375,
-0.060760498046875,
0.0224609375,
0.027252197265625,
-0.02392578125,
0.0030078887939453125,
-0.01003265380859375,
0.03900146484375,
-0.0172271728515625,
-0.01180267333984375,
0.044219970703125,
0.01442718505859375,
-0.036041259765625,
0.08209228515625,
-0.004756927490234375,
0.07635498046875,
-0.034515380859375,
0.0175323486328125,
-0.0303955078125,
0.01230621337890625,
-0.035247802734375,
-0.0516357421875,
-0.00620269775390625,
0.0190582275390625,
-0.004276275634765625,
0.01195526123046875,
0.032470703125,
-0.004119873046875,
-0.0455322265625,
0.03411865234375,
0.0111083984375,
0.03515625,
0.043670654296875,
-0.054595947265625,
0.0330810546875,
0.023681640625,
-0.050994873046875,
0.017059326171875,
0.01092529296875,
0.01446533203125,
0.0565185546875,
0.05487060546875,
0.0004906654357910156,
0.0299072265625,
-0.01500701904296875,
0.06329345703125,
-0.040863037109375,
-0.0298309326171875,
-0.07330322265625,
0.0482177734375,
-0.00799560546875,
-0.036956787109375,
0.06134033203125,
0.03533935546875,
0.050933837890625,
0.006122589111328125,
0.05743408203125,
-0.0240325927734375,
0.02239990234375,
-0.032318115234375,
0.055511474609375,
-0.05859375,
0.015625,
-0.0206146240234375,
-0.051361083984375,
-0.0167999267578125,
0.067626953125,
-0.01494598388671875,
0.0179290771484375,
0.040863037109375,
0.06427001953125,
0.0135650634765625,
-0.021575927734375,
0.0040130615234375,
0.03375244140625,
0.035614013671875,
0.06951904296875,
0.05206298828125,
-0.06396484375,
0.033599853515625,
-0.040863037109375,
0.0018024444580078125,
-0.038055419921875,
-0.052520751953125,
-0.0743408203125,
-0.043731689453125,
-0.0258331298828125,
-0.03228759765625,
-0.0218353271484375,
0.07733154296875,
0.0467529296875,
-0.04583740234375,
-0.024688720703125,
0.00954437255859375,
0.0195465087890625,
-0.003116607666015625,
-0.01270294189453125,
0.0394287109375,
0.0041656494140625,
-0.07073974609375,
0.01276397705078125,
0.007076263427734375,
0.03857421875,
0.0018205642700195312,
-0.022796630859375,
-0.0207672119140625,
0.008209228515625,
0.0213165283203125,
0.033721923828125,
-0.06842041015625,
-0.005863189697265625,
0.0022220611572265625,
-0.0211944580078125,
0.012786865234375,
-0.0013780593872070312,
-0.042510986328125,
0.004184722900390625,
0.042510986328125,
0.0044403076171875,
0.043609619140625,
-0.01464080810546875,
0.003917694091796875,
-0.0281982421875,
0.039520263671875,
-0.01291656494140625,
0.0494384765625,
0.0133209228515625,
-0.031768798828125,
0.0457763671875,
0.0298309326171875,
-0.0264434814453125,
-0.0870361328125,
-0.0108642578125,
-0.08428955078125,
-0.01326751708984375,
0.0880126953125,
-0.0161285400390625,
-0.039337158203125,
0.018096923828125,
-0.0258331298828125,
0.034332275390625,
-0.0217437744140625,
0.04205322265625,
0.034942626953125,
-0.004291534423828125,
-0.00838470458984375,
-0.032867431640625,
0.0133819580078125,
0.0220489501953125,
-0.0626220703125,
-0.016021728515625,
0.0067138671875,
0.026336669921875,
0.0301513671875,
0.05499267578125,
-0.005008697509765625,
0.0230255126953125,
0.005290985107421875,
0.01392364501953125,
-0.01422882080078125,
0.006572723388671875,
-0.003101348876953125,
-0.03033447265625,
-0.016448974609375,
-0.0264739990234375
]
] |
facebook/wmt19-de-en | 2023-09-19T07:27:56.000Z | [
"transformers",
"pytorch",
"fsmt",
"text2text-generation",
"translation",
"wmt19",
"facebook",
"de",
"en",
"dataset:wmt19",
"arxiv:1907.06616",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | facebook | null | null | facebook/wmt19-de-en | 14 | 16,144 | transformers | 2022-03-02T23:29:05 | ---
language:
- de
- en
tags:
- translation
- wmt19
- facebook
license: apache-2.0
datasets:
- wmt19
metrics:
- bleu
thumbnail: https://huggingface.co/front/thumbnails/facebook.png
---
# FSMT
## Model description
This is a ported version of [fairseq wmt19 transformer](https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md) for de-en.
For more details, please see, [Facebook FAIR's WMT19 News Translation Task Submission](https://arxiv.org/abs/1907.06616).
The abbreviation FSMT stands for FairSeqMachineTranslation
All four models are available:
* [wmt19-en-ru](https://huggingface.co/facebook/wmt19-en-ru)
* [wmt19-ru-en](https://huggingface.co/facebook/wmt19-ru-en)
* [wmt19-en-de](https://huggingface.co/facebook/wmt19-en-de)
* [wmt19-de-en](https://huggingface.co/facebook/wmt19-de-en)
## Intended uses & limitations
#### How to use
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer
mname = "facebook/wmt19-de-en"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)
input = "Maschinelles Lernen ist großartig, oder?"
input_ids = tokenizer.encode(input, return_tensors="pt")
outputs = model.generate(input_ids)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded) # Machine learning is great, isn't it?
```
#### Limitations and bias
- The original (and this ported model) doesn't seem to handle well inputs with repeated sub-phrases, [content gets truncated](https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981)
## Training data
Pretrained weights were left identical to the original model released by fairseq. For more details, please, see the [paper](https://arxiv.org/abs/1907.06616).
## Eval results
pair | fairseq | transformers
-------|---------|----------
de-en | [42.3](http://matrix.statmt.org/matrix/output/1902?run_id=6750) | 41.35
The score is slightly below the score reported by `fairseq`, since `transformers`` currently doesn't support:
- model ensemble, therefore the best performing checkpoint was ported (``model4.pt``).
- re-ranking
The score was calculated using this code:
```bash
git clone https://github.com/huggingface/transformers
cd transformers
export PAIR=de-en
export DATA_DIR=data/$PAIR
export SAVE_DIR=data/$PAIR
export BS=8
export NUM_BEAMS=15
mkdir -p $DATA_DIR
sacrebleu -t wmt19 -l $PAIR --echo src > $DATA_DIR/val.source
sacrebleu -t wmt19 -l $PAIR --echo ref > $DATA_DIR/val.target
echo $PAIR
PYTHONPATH="src:examples/seq2seq" python examples/seq2seq/run_eval.py facebook/wmt19-$PAIR $DATA_DIR/val.source $SAVE_DIR/test_translations.txt --reference_path $DATA_DIR/val.target --score_path $SAVE_DIR/test_bleu.json --bs $BS --task translation --num_beams $NUM_BEAMS
```
note: fairseq reports using a beam of 50, so you should get a slightly higher score if re-run with `--num_beams 50`.
## Data Sources
- [training, etc.](http://www.statmt.org/wmt19/)
- [test set](http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561)
### BibTeX entry and citation info
```bibtex
@inproceedings{...,
year={2020},
title={Facebook FAIR's WMT19 News Translation Task Submission},
author={Ng, Nathan and Yee, Kyra and Baevski, Alexei and Ott, Myle and Auli, Michael and Edunov, Sergey},
booktitle={Proc. of WMT},
}
```
## TODO
- port model ensemble (fairseq uses 4 model checkpoints)
| 3,431 | [
[
-0.0285797119140625,
-0.044708251953125,
0.0260467529296875,
0.0276336669921875,
-0.020050048828125,
-0.0005774497985839844,
-0.008880615234375,
-0.0247955322265625,
0.00432586669921875,
0.01214599609375,
-0.06488037109375,
-0.02362060546875,
-0.056884765625,
0.01305389404296875,
-0.03814697265625,
0.06939697265625,
-0.0202789306640625,
0.0255584716796875,
-0.00113677978515625,
-0.0192108154296875,
-0.01366424560546875,
-0.0103912353515625,
-0.038299560546875,
-0.0291595458984375,
0.01018524169921875,
0.006923675537109375,
0.045745849609375,
0.019927978515625,
0.04364013671875,
0.033172607421875,
-0.0152130126953125,
-0.0007610321044921875,
-0.034698486328125,
-0.0159912109375,
-0.020599365234375,
-0.0263671875,
-0.028289794921875,
0.001987457275390625,
0.0467529296875,
0.044769287109375,
-0.0024089813232421875,
0.0435791015625,
0.00748443603515625,
0.035614013671875,
-0.008819580078125,
0.013519287109375,
-0.047454833984375,
0.014068603515625,
-0.0149078369140625,
-0.00359344482421875,
-0.0343017578125,
-0.0210723876953125,
-0.014678955078125,
-0.0316162109375,
0.00977325439453125,
0.005260467529296875,
0.10235595703125,
0.017822265625,
-0.047271728515625,
0.0238037109375,
-0.03973388671875,
0.07861328125,
-0.05694580078125,
0.056488037109375,
0.004055023193359375,
0.030029296875,
-0.0148468017578125,
-0.07989501953125,
-0.028472900390625,
0.0081634521484375,
-0.00241851806640625,
0.0222625732421875,
-0.03533935546875,
-0.015625,
0.03375244140625,
0.031280517578125,
-0.045989990234375,
-0.0149078369140625,
-0.051727294921875,
-0.04010009765625,
0.0501708984375,
0.01163482666015625,
0.00522613525390625,
-0.0227813720703125,
-0.040008544921875,
-0.021331787109375,
-0.0213775634765625,
0.01495361328125,
-0.001361846923828125,
0.0225830078125,
-0.0084991455078125,
0.04510498046875,
-0.04241943359375,
0.04034423828125,
0.0271759033203125,
-0.0106964111328125,
0.06463623046875,
-0.0401611328125,
-0.006031036376953125,
-0.0223541259765625,
0.08245849609375,
0.03680419921875,
-0.001277923583984375,
-0.00788116455078125,
-0.035003662109375,
-0.02203369140625,
0.007843017578125,
-0.08221435546875,
0.01708984375,
0.00626373291015625,
-0.05181884765625,
-0.0158538818359375,
0.02056884765625,
-0.049072265625,
0.016326904296875,
-0.0188140869140625,
0.06878662109375,
-0.037994384765625,
-0.002140045166015625,
-0.000133514404296875,
-0.023101806640625,
0.0236663818359375,
0.01465606689453125,
-0.023040771484375,
0.00643157958984375,
0.023284912109375,
0.07220458984375,
-0.0100250244140625,
-0.03094482421875,
-0.046142578125,
-0.01287841796875,
-0.0165863037109375,
0.029693603515625,
-0.005840301513671875,
-0.015228271484375,
-0.0115966796875,
0.04876708984375,
-0.0180511474609375,
-0.0260467529296875,
0.05126953125,
-0.0362548828125,
0.04156494140625,
-0.022064208984375,
-0.0278167724609375,
-0.0146636962890625,
0.00447845458984375,
-0.031951904296875,
0.08404541015625,
0.034942626953125,
-0.05694580078125,
0.01000213623046875,
-0.037261962890625,
-0.037109375,
-0.01143646240234375,
0.00403594970703125,
-0.0302581787109375,
0.01123046875,
0.00881195068359375,
0.029998779296875,
-0.01354217529296875,
0.038421630859375,
-0.01511383056640625,
-0.04449462890625,
0.020965576171875,
-0.036376953125,
0.071533203125,
0.036956787109375,
-0.036102294921875,
0.0174407958984375,
-0.050018310546875,
-0.00122833251953125,
0.02044677734375,
-0.0277099609375,
0.0179595947265625,
-0.016937255859375,
0.01161956787109375,
0.048004150390625,
0.033447265625,
-0.043304443359375,
-0.007175445556640625,
-0.04010009765625,
0.0345458984375,
0.060760498046875,
-0.006954193115234375,
0.02880859375,
-0.04974365234375,
0.0352783203125,
0.01117706298828125,
0.02093505859375,
0.0086212158203125,
-0.050567626953125,
-0.048248291015625,
-0.005035400390625,
0.0177001953125,
0.044952392578125,
-0.07342529296875,
0.033233642578125,
-0.033782958984375,
-0.0606689453125,
-0.035400390625,
-0.0167694091796875,
0.022552490234375,
0.02569580078125,
0.046966552734375,
-0.0258331298828125,
-0.046112060546875,
-0.07220458984375,
-0.035797119140625,
0.0013761520385742188,
-0.0005254745483398438,
0.0008482933044433594,
0.04412841796875,
-0.039306640625,
0.0521240234375,
-0.0267181396484375,
-0.0163726806640625,
-0.017120361328125,
-0.006870269775390625,
0.043121337890625,
0.0513916015625,
0.047821044921875,
-0.045684814453125,
-0.0321044921875,
-0.01020050048828125,
-0.039306640625,
-0.0084991455078125,
0.00042629241943359375,
-0.02978515625,
0.01541900634765625,
0.028900146484375,
-0.058746337890625,
0.0260162353515625,
0.034271240234375,
-0.03656005859375,
0.034515380859375,
0.0182037353515625,
0.04010009765625,
-0.1083984375,
0.008636474609375,
-0.003513336181640625,
-0.041107177734375,
-0.0284576416015625,
-0.007099151611328125,
0.005645751953125,
-0.004848480224609375,
-0.050994873046875,
0.052520751953125,
-0.01297760009765625,
0.004802703857421875,
-0.0078887939453125,
-0.0035152435302734375,
0.008636474609375,
0.044830322265625,
-0.0270233154296875,
0.038238525390625,
0.032745361328125,
-0.03607177734375,
0.0227813720703125,
0.03961181640625,
-0.024810791015625,
0.02752685546875,
-0.040313720703125,
-0.010162353515625,
0.0031833648681640625,
0.0272216796875,
-0.0684814453125,
-0.01200103759765625,
0.028778076171875,
-0.053131103515625,
0.0236358642578125,
-0.007083892822265625,
-0.037445068359375,
-0.04840087890625,
-0.0215301513671875,
0.0206146240234375,
0.055267333984375,
-0.036285400390625,
0.0377197265625,
0.011627197265625,
0.00495147705078125,
-0.04278564453125,
-0.0758056640625,
-0.0218353271484375,
-0.0218505859375,
-0.05303955078125,
0.045196533203125,
-0.0079803466796875,
0.00507354736328125,
-0.0019969940185546875,
-0.0300750732421875,
0.003025054931640625,
0.002742767333984375,
0.0173492431640625,
0.01238250732421875,
-0.013397216796875,
-0.0017728805541992188,
0.0246124267578125,
-0.01480865478515625,
0.0032291412353515625,
-0.03363037109375,
0.053741455078125,
-0.0290679931640625,
-0.0184783935546875,
-0.054168701171875,
0.0057373046875,
0.04034423828125,
-0.0224761962890625,
0.06365966796875,
0.09033203125,
-0.035919189453125,
0.01105499267578125,
-0.0303955078125,
-0.029937744140625,
-0.0406494140625,
0.04656982421875,
-0.037109375,
-0.07147216796875,
0.04022216796875,
-0.0017185211181640625,
0.01280975341796875,
0.07208251953125,
0.05279541015625,
0.0026378631591796875,
0.0902099609375,
0.012542724609375,
0.00489044189453125,
0.052734375,
-0.023193359375,
0.007167816162109375,
-0.0399169921875,
-0.00039458274841308594,
-0.038360595703125,
-0.038299560546875,
-0.057159423828125,
-0.04461669921875,
0.01056671142578125,
0.00395965576171875,
-0.043182373046875,
0.044219970703125,
-0.03509521484375,
0.002445220947265625,
0.040802001953125,
0.004451751708984375,
0.0204010009765625,
-0.002979278564453125,
-0.01220703125,
-0.01328277587890625,
-0.041534423828125,
-0.0246124267578125,
0.0758056640625,
0.0250244140625,
0.036956787109375,
0.002925872802734375,
0.06231689453125,
0.00025081634521484375,
0.015960693359375,
-0.04620361328125,
0.048065185546875,
-0.00290679931640625,
-0.0533447265625,
-0.00923919677734375,
-0.0643310546875,
-0.06976318359375,
0.031707763671875,
-0.006526947021484375,
-0.06390380859375,
0.00669097900390625,
-0.01235198974609375,
-0.00569915771484375,
0.0171966552734375,
-0.038055419921875,
0.0858154296875,
-0.00916290283203125,
-0.0164794921875,
-0.0113525390625,
-0.0509033203125,
0.0303192138671875,
-0.01142120361328125,
0.038360595703125,
-0.017303466796875,
0.0196990966796875,
0.07550048828125,
-0.0305633544921875,
0.04144287109375,
-0.02081298828125,
0.01068878173828125,
0.023284912109375,
0.0018815994262695312,
0.042572021484375,
0.00045752525329589844,
-0.0175018310546875,
0.0221710205078125,
0.03509521484375,
-0.030426025390625,
-0.01447296142578125,
0.054290771484375,
-0.058990478515625,
-0.03936767578125,
-0.0335693359375,
-0.04278564453125,
-0.01351165771484375,
0.037017822265625,
0.04541015625,
0.0355224609375,
-0.009979248046875,
0.0271759033203125,
0.0188751220703125,
-0.01386260986328125,
0.034820556640625,
0.02752685546875,
-0.03973388671875,
-0.031524658203125,
0.06927490234375,
0.01529693603515625,
0.0220184326171875,
0.0223236083984375,
0.015106201171875,
-0.0301513671875,
-0.02496337890625,
-0.039764404296875,
0.0188751220703125,
-0.05364990234375,
-0.0374755859375,
-0.05841064453125,
-0.0210418701171875,
-0.044097900390625,
0.01055908203125,
-0.0443115234375,
-0.06097412109375,
-0.007236480712890625,
-0.01287841796875,
0.0312347412109375,
0.0166168212890625,
-0.0175018310546875,
0.015960693359375,
-0.07073974609375,
0.0186614990234375,
-0.0007467269897460938,
0.0187225341796875,
-0.012115478515625,
-0.07586669921875,
-0.0274505615234375,
0.0236053466796875,
-0.0528564453125,
-0.07586669921875,
0.0249176025390625,
0.006839752197265625,
0.055633544921875,
0.01351165771484375,
0.018798828125,
0.0445556640625,
-0.03466796875,
0.059661865234375,
0.0123138427734375,
-0.08111572265625,
0.033172607421875,
-0.0236663818359375,
0.0250244140625,
0.046417236328125,
0.024322509765625,
-0.04248046875,
-0.0458984375,
-0.062255859375,
-0.0653076171875,
0.07855224609375,
0.03289794921875,
0.01708984375,
0.004718780517578125,
0.005657196044921875,
-0.00030541419982910156,
0.0188446044921875,
-0.0703125,
-0.018402099609375,
-0.031219482421875,
-0.03375244140625,
-0.001079559326171875,
0.0099029541015625,
-0.01197052001953125,
-0.034820556640625,
0.076416015625,
-0.00498199462890625,
0.036956787109375,
0.0158843994140625,
-0.0159454345703125,
-0.016082763671875,
0.0136260986328125,
0.0198516845703125,
0.037445068359375,
-0.00847625732421875,
-0.004425048828125,
0.03375244140625,
-0.017669677734375,
0.00341033935546875,
0.037445068359375,
-0.01335906982421875,
0.01503753662109375,
0.016845703125,
0.06085205078125,
0.01279449462890625,
-0.04559326171875,
0.057952880859375,
-0.00215911865234375,
-0.0282135009765625,
-0.01116180419921875,
-0.004192352294921875,
0.0115509033203125,
0.04779052734375,
0.0316162109375,
0.0218505859375,
0.017730712890625,
-0.028289794921875,
0.032501220703125,
0.02191162109375,
-0.057464599609375,
-0.02874755859375,
0.06939697265625,
0.0027828216552734375,
-0.0139617919921875,
0.03533935546875,
-0.040802001953125,
-0.04248046875,
0.039581298828125,
0.039581298828125,
0.05340576171875,
-0.01421356201171875,
0.01509857177734375,
0.0595703125,
0.032440185546875,
-0.02606201171875,
0.0270233154296875,
0.00841522216796875,
-0.047760009765625,
-0.035003662109375,
-0.0662841796875,
0.006561279296875,
-0.00756072998046875,
-0.0621337890625,
0.03314208984375,
0.0006113052368164062,
-0.027252197265625,
-0.0241241455078125,
-0.0032596588134765625,
-0.06256103515625,
0.01242828369140625,
-0.00730133056640625,
0.0635986328125,
-0.061248779296875,
0.054412841796875,
0.059326171875,
-0.033966064453125,
-0.06256103515625,
-0.003604888916015625,
0.0025005340576171875,
-0.044464111328125,
0.0198516845703125,
0.033233642578125,
0.02215576171875,
0.0174102783203125,
-0.0304412841796875,
-0.0811767578125,
0.09857177734375,
0.019775390625,
-0.038055419921875,
0.00865936279296875,
0.005279541015625,
0.031402587890625,
-0.00039839744567871094,
0.02264404296875,
0.0292510986328125,
0.044769287109375,
0.007198333740234375,
-0.089111328125,
0.03033447265625,
-0.023040771484375,
-0.0084381103515625,
0.01617431640625,
-0.06707763671875,
0.0675048828125,
-0.005565643310546875,
-0.0179595947265625,
0.018341064453125,
0.0609130859375,
0.03448486328125,
0.033538818359375,
0.0374755859375,
0.032958984375,
0.041259765625,
-0.03009033203125,
0.05975341796875,
-0.0156402587890625,
0.06524658203125,
0.0523681640625,
0.0093231201171875,
0.046905517578125,
0.039581298828125,
-0.03265380859375,
0.025909423828125,
0.050018310546875,
-0.0166168212890625,
0.03375244140625,
0.01023101806640625,
0.022308349609375,
-0.0204010009765625,
0.0048980712890625,
-0.049407958984375,
0.0290069580078125,
-0.00009900331497192383,
-0.0232391357421875,
-0.0167999267578125,
0.016204833984375,
0.007457733154296875,
-0.011962890625,
-0.0067596435546875,
0.02740478515625,
0.02349853515625,
-0.044464111328125,
0.04693603515625,
0.024017333984375,
0.05499267578125,
-0.037078857421875,
0.023162841796875,
-0.01959228515625,
0.022369384765625,
-0.012115478515625,
-0.0450439453125,
0.0423583984375,
-0.00785064697265625,
-0.004791259765625,
-0.029571533203125,
0.03857421875,
-0.0274200439453125,
-0.05316162109375,
0.0260772705078125,
0.04559326171875,
0.0131072998046875,
-0.012847900390625,
-0.0660400390625,
0.007293701171875,
0.024200439453125,
-0.047637939453125,
0.0275115966796875,
0.0389404296875,
-0.01029205322265625,
0.0291595458984375,
0.039306640625,
-0.03271484375,
0.009674072265625,
0.003955841064453125,
0.05859375,
-0.056793212890625,
-0.0222625732421875,
-0.0501708984375,
0.052978515625,
0.0014677047729492188,
-0.023529052734375,
0.055206298828125,
0.058013916015625,
0.06884765625,
-0.0113677978515625,
0.036773681640625,
-0.02392578125,
0.033416748046875,
-0.02838134765625,
0.05706787109375,
-0.0736083984375,
-0.0049896240234375,
-0.03265380859375,
-0.07196044921875,
0.00647735595703125,
0.0460205078125,
-0.0082855224609375,
0.017425537109375,
0.052947998046875,
0.059417724609375,
-0.009765625,
-0.00495147705078125,
0.01328277587890625,
0.03399658203125,
0.02618408203125,
0.05230712890625,
0.06719970703125,
-0.07452392578125,
0.0655517578125,
-0.034820556640625,
-0.01357269287109375,
-0.0172119140625,
-0.03515625,
-0.0552978515625,
-0.05487060546875,
-0.030731201171875,
-0.051910400390625,
-0.022308349609375,
0.06500244140625,
0.036163330078125,
-0.05615234375,
-0.0026988983154296875,
0.006076812744140625,
0.001796722412109375,
-0.035400390625,
-0.02215576171875,
0.0192718505859375,
-0.0215606689453125,
-0.07781982421875,
0.03302001953125,
-0.00882720947265625,
0.003871917724609375,
0.01702880859375,
-0.01380157470703125,
-0.0157623291015625,
0.00012600421905517578,
0.035400390625,
-0.01226043701171875,
-0.04193115234375,
-0.0253753662109375,
0.0187225341796875,
-0.018524169921875,
-0.0018558502197265625,
0.0139007568359375,
-0.0377197265625,
0.00672149658203125,
0.054351806640625,
0.033050537109375,
0.060150146484375,
-0.00217437744140625,
0.030029296875,
-0.04144287109375,
0.02252197265625,
0.0081787109375,
0.046356201171875,
0.00942230224609375,
-0.005100250244140625,
0.0386962890625,
0.043212890625,
-0.041046142578125,
-0.0712890625,
0.006511688232421875,
-0.08984375,
-0.0231170654296875,
0.0889892578125,
0.006114959716796875,
-0.01372528076171875,
0.01776123046875,
-0.016998291015625,
0.042083740234375,
-0.023834228515625,
0.038482666015625,
0.04632568359375,
0.009674072265625,
0.00803375244140625,
-0.051239013671875,
0.0164794921875,
0.0352783203125,
-0.036163330078125,
-0.035675048828125,
0.0192718505859375,
0.0288543701171875,
0.01410675048828125,
0.0478515625,
-0.028564453125,
0.02191162109375,
-0.01410675048828125,
0.00640869140625,
0.0025005340576171875,
0.007328033447265625,
-0.00675201416015625,
-0.0080108642578125,
-0.00782012939453125,
-0.0260162353515625
]
] |
Helsinki-NLP/opus-mt-af-en | 2023-08-16T11:25:20.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"af",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-af-en | 0 | 16,140 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-af-en
* source languages: af
* target languages: en
* OPUS readme: [af-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/af-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/af-en/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/af-en/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/af-en/opus-2019-12-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.af.en | 60.8 | 0.736 |
| 818 | [
[
-0.02294921875,
-0.038604736328125,
0.0146026611328125,
0.032012939453125,
-0.0311126708984375,
-0.029510498046875,
-0.0309295654296875,
-0.007457733154296875,
0.0024871826171875,
0.033447265625,
-0.051666259765625,
-0.04144287109375,
-0.04754638671875,
0.0203704833984375,
-0.01430511474609375,
0.0477294921875,
-0.017608642578125,
0.036590576171875,
0.01543426513671875,
-0.04010009765625,
-0.0206298828125,
-0.031463623046875,
-0.03466796875,
-0.026580810546875,
0.013824462890625,
0.02752685546875,
0.03338623046875,
0.034088134765625,
0.0731201171875,
0.0191497802734375,
-0.006221771240234375,
0.004711151123046875,
-0.0308685302734375,
-0.00687408447265625,
0.0075531005859375,
-0.040863037109375,
-0.04998779296875,
-0.01023101806640625,
0.07391357421875,
0.0307769775390625,
-0.0053253173828125,
0.02752685546875,
-0.01377105712890625,
0.07208251953125,
-0.0200958251953125,
0.00868988037109375,
-0.042510986328125,
0.0023555755615234375,
-0.023773193359375,
-0.0160369873046875,
-0.04705810546875,
-0.00940704345703125,
0.00940704345703125,
-0.049407958984375,
0.001270294189453125,
0.0125732421875,
0.10552978515625,
0.026947021484375,
-0.0283050537109375,
-0.0119781494140625,
-0.042724609375,
0.068115234375,
-0.0557861328125,
0.0537109375,
0.033447265625,
0.0204620361328125,
0.01800537109375,
-0.04278564453125,
-0.027252197265625,
0.00792694091796875,
-0.0174713134765625,
0.0084381103515625,
-0.0156707763671875,
-0.012969970703125,
0.01922607421875,
0.0616455078125,
-0.052947998046875,
-0.00540924072265625,
-0.03863525390625,
-0.001476287841796875,
0.0552978515625,
0.002716064453125,
0.01096343994140625,
-0.00864410400390625,
-0.032806396484375,
-0.04315185546875,
-0.054229736328125,
0.005664825439453125,
0.0243682861328125,
0.0178070068359375,
-0.036285400390625,
0.050872802734375,
-0.01629638671875,
0.047698974609375,
0.0043182373046875,
0.003662109375,
0.07568359375,
-0.0300140380859375,
-0.0289764404296875,
-0.01806640625,
0.089111328125,
0.0210418701171875,
0.0092315673828125,
0.0011701583862304688,
-0.024810791015625,
-0.0184326171875,
0.00848388671875,
-0.0677490234375,
-0.00539398193359375,
0.01113128662109375,
-0.04071044921875,
-0.0079498291015625,
0.00565338134765625,
-0.044097900390625,
0.016693115234375,
-0.033172607421875,
0.047454833984375,
-0.05133056640625,
-0.0239410400390625,
0.0233917236328125,
0.004302978515625,
0.0257415771484375,
-0.005031585693359375,
-0.05035400390625,
0.01372528076171875,
0.028900146484375,
0.05731201171875,
-0.027374267578125,
-0.0221405029296875,
-0.03485107421875,
-0.0194549560546875,
-0.0057373046875,
0.04998779296875,
-0.00799560546875,
-0.0360107421875,
-0.001598358154296875,
0.033538818359375,
-0.025482177734375,
-0.0254974365234375,
0.09405517578125,
-0.0196685791015625,
0.0462646484375,
-0.032623291015625,
-0.039337158203125,
-0.030731201171875,
0.031524658203125,
-0.044036865234375,
0.09307861328125,
0.01192474365234375,
-0.052978515625,
0.01806640625,
-0.0650634765625,
-0.020751953125,
-0.00785064697265625,
0.002651214599609375,
-0.045654296875,
0.01068115234375,
0.015106201171875,
0.031402587890625,
-0.023956298828125,
0.0224456787109375,
0.000009417533874511719,
-0.0245513916015625,
0.004489898681640625,
-0.0340576171875,
0.0684814453125,
0.0225677490234375,
-0.0276947021484375,
0.01538848876953125,
-0.07159423828125,
-0.01141357421875,
0.00018727779388427734,
-0.036041259765625,
-0.01030731201171875,
0.005046844482421875,
0.0221710205078125,
0.01044464111328125,
0.0192108154296875,
-0.048492431640625,
0.017852783203125,
-0.049163818359375,
0.011444091796875,
0.047454833984375,
-0.02203369140625,
0.028411865234375,
-0.03240966796875,
0.0252227783203125,
0.01136016845703125,
0.00536346435546875,
-0.00201416015625,
-0.032867431640625,
-0.0645751953125,
-0.0219268798828125,
0.043731689453125,
0.07720947265625,
-0.061065673828125,
0.05828857421875,
-0.04693603515625,
-0.06011962890625,
-0.062042236328125,
-0.002880096435546875,
0.037689208984375,
0.0258941650390625,
0.040924072265625,
-0.014373779296875,
-0.027252197265625,
-0.07769775390625,
-0.00931549072265625,
-0.0178070068359375,
-0.0138702392578125,
0.0169830322265625,
0.045654296875,
-0.0160980224609375,
0.038116455078125,
-0.03814697265625,
-0.03472900390625,
-0.0087432861328125,
0.012847900390625,
0.033782958984375,
0.0455322265625,
0.04949951171875,
-0.07012939453125,
-0.037750244140625,
-0.0028362274169921875,
-0.052978515625,
-0.0163116455078125,
0.0099945068359375,
-0.0192718505859375,
0.0131072998046875,
0.00856781005859375,
-0.0301055908203125,
0.00966644287109375,
0.045867919921875,
-0.050567626953125,
0.034912109375,
-0.0028972625732421875,
0.023193359375,
-0.0975341796875,
0.015655517578125,
-0.00830841064453125,
-0.0035686492919921875,
-0.033843994140625,
0.00447845458984375,
0.0158843994140625,
0.005859375,
-0.05950927734375,
0.050262451171875,
-0.0193023681640625,
0.0013685226440429688,
0.0158538818359375,
-0.0088043212890625,
0.00485992431640625,
0.058197021484375,
0.001430511474609375,
0.061859130859375,
0.05126953125,
-0.034454345703125,
0.013916015625,
0.05242919921875,
-0.0283660888671875,
0.0311737060546875,
-0.05621337890625,
-0.0224761962890625,
0.02392578125,
-0.007053375244140625,
-0.049285888671875,
0.00679779052734375,
0.02252197265625,
-0.05084228515625,
0.032806396484375,
0.0004036426544189453,
-0.053924560546875,
-0.0003204345703125,
-0.0172882080078125,
0.037384033203125,
0.0467529296875,
-0.01727294921875,
0.052947998046875,
0.00525665283203125,
-0.0002772808074951172,
-0.037811279296875,
-0.077392578125,
-0.01430511474609375,
-0.026123046875,
-0.05780029296875,
0.01495361328125,
-0.027252197265625,
-0.006011962890625,
0.00547027587890625,
0.018524169921875,
-0.005336761474609375,
0.0088043212890625,
0.01018524169921875,
0.0103607177734375,
-0.0421142578125,
0.01025390625,
0.0002601146697998047,
-0.0089874267578125,
-0.0097503662109375,
-0.004756927490234375,
0.04498291015625,
-0.0285491943359375,
-0.01849365234375,
-0.04364013671875,
0.0079498291015625,
0.044036865234375,
-0.036773681640625,
0.06591796875,
0.044891357421875,
-0.0115203857421875,
0.01314544677734375,
-0.03466796875,
0.005748748779296875,
-0.03363037109375,
0.00782012939453125,
-0.03289794921875,
-0.05499267578125,
0.042083740234375,
0.01103973388671875,
0.03369140625,
0.06396484375,
0.047210693359375,
0.012542724609375,
0.047607421875,
0.0234527587890625,
-0.003787994384765625,
0.038116455078125,
-0.03985595703125,
-0.01155853271484375,
-0.07183837890625,
0.005817413330078125,
-0.050018310546875,
-0.0268096923828125,
-0.0565185546875,
-0.0175628662109375,
0.0156402587890625,
0.0021514892578125,
-0.0204925537109375,
0.0440673828125,
-0.03717041015625,
0.01389312744140625,
0.044708251953125,
-0.01248931884765625,
0.0210418701171875,
0.005100250244140625,
-0.03729248046875,
-0.0087432861328125,
-0.029144287109375,
-0.03533935546875,
0.0987548828125,
0.02947998046875,
0.021575927734375,
0.0198974609375,
0.036651611328125,
0.00203704833984375,
0.026275634765625,
-0.047943115234375,
0.0308685302734375,
-0.0272674560546875,
-0.057403564453125,
-0.019683837890625,
-0.045166015625,
-0.056060791015625,
0.037750244140625,
-0.02313232421875,
-0.03204345703125,
0.014862060546875,
-0.0011005401611328125,
-0.00844573974609375,
0.0306854248046875,
-0.05303955078125,
0.08221435546875,
-0.005931854248046875,
-0.001354217529296875,
0.022552490234375,
-0.03619384765625,
0.019439697265625,
-0.0014362335205078125,
0.01983642578125,
-0.01543426513671875,
0.01200103759765625,
0.04583740234375,
-0.00923919677734375,
0.026580810546875,
-0.0080718994140625,
-0.0027828216552734375,
0.006175994873046875,
0.004528045654296875,
0.022247314453125,
-0.0081939697265625,
-0.0369873046875,
0.037445068359375,
-0.0013704299926757812,
-0.0303497314453125,
-0.009552001953125,
0.03619384765625,
-0.056732177734375,
-0.005298614501953125,
-0.03240966796875,
-0.0494384765625,
-0.0012865066528320312,
0.02850341796875,
0.04925537109375,
0.05438232421875,
-0.0247039794921875,
0.041107177734375,
0.053375244140625,
-0.018402099609375,
0.0286865234375,
0.058929443359375,
-0.016357421875,
-0.045440673828125,
0.06011962890625,
0.00687408447265625,
0.0262298583984375,
0.045989990234375,
0.0018815994262695312,
-0.01116180419921875,
-0.05413818359375,
-0.056854248046875,
0.0235443115234375,
-0.0239715576171875,
-0.020233154296875,
-0.04339599609375,
-0.004711151123046875,
-0.023712158203125,
0.01377105712890625,
-0.03289794921875,
-0.04278564453125,
-0.01328277587890625,
-0.0209808349609375,
0.023406982421875,
0.0145263671875,
-0.00864410400390625,
0.0447998046875,
-0.07562255859375,
0.01959228515625,
-0.0146026611328125,
0.0269927978515625,
-0.0307769775390625,
-0.0535888671875,
-0.0297088623046875,
0.0087432861328125,
-0.0518798828125,
-0.049224853515625,
0.0401611328125,
0.01554107666015625,
0.0195159912109375,
0.0299530029296875,
0.009979248046875,
0.024871826171875,
-0.0518798828125,
0.07275390625,
-0.00844573974609375,
-0.0572509765625,
0.0301055908203125,
-0.03131103515625,
0.034271240234375,
0.07354736328125,
0.018218994140625,
-0.0245361328125,
-0.038787841796875,
-0.053802490234375,
-0.062042236328125,
0.06414794921875,
0.05474853515625,
-0.0106964111328125,
0.01226806640625,
-0.003986358642578125,
-0.00408172607421875,
0.016143798828125,
-0.0770263671875,
-0.0325927734375,
0.0013284683227539062,
-0.033935546875,
-0.0196990966796875,
-0.01435089111328125,
-0.0168609619140625,
-0.0113372802734375,
0.08270263671875,
0.0095367431640625,
0.0206146240234375,
0.0333251953125,
-0.011077880859375,
-0.0217437744140625,
0.016357421875,
0.072021484375,
0.03656005859375,
-0.0396728515625,
-0.016387939453125,
0.0200958251953125,
-0.034088134765625,
-0.0063323974609375,
0.0191497802734375,
-0.0301513671875,
0.024688720703125,
0.037689208984375,
0.0845947265625,
0.0115203857421875,
-0.04644775390625,
0.036468505859375,
-0.0299072265625,
-0.03302001953125,
-0.0447998046875,
-0.01165008544921875,
0.005535125732421875,
0.002109527587890625,
0.02288818359375,
0.01505279541015625,
0.01334381103515625,
-0.01535797119140625,
0.015411376953125,
0.0072021484375,
-0.046600341796875,
-0.040130615234375,
0.038238525390625,
0.015167236328125,
-0.0273895263671875,
0.0399169921875,
-0.0250091552734375,
-0.0406494140625,
0.030914306640625,
0.006557464599609375,
0.07806396484375,
-0.02203369140625,
-0.012664794921875,
0.0484619140625,
0.044525146484375,
-0.0200347900390625,
0.035797119140625,
0.005382537841796875,
-0.051483154296875,
-0.034088134765625,
-0.0673828125,
-0.01044464111328125,
0.00844573974609375,
-0.060577392578125,
0.0287017822265625,
0.020965576171875,
0.0041656494140625,
-0.0216217041015625,
0.0177459716796875,
-0.04132080078125,
0.0131378173828125,
-0.0178070068359375,
0.087890625,
-0.07403564453125,
0.060211181640625,
0.04150390625,
-0.0229949951171875,
-0.06524658203125,
-0.01427459716796875,
-0.01007843017578125,
-0.039581298828125,
0.03668212890625,
0.021026611328125,
0.02496337890625,
-0.0103912353515625,
-0.01403045654296875,
-0.060516357421875,
0.087158203125,
0.0176544189453125,
-0.038604736328125,
0.0003635883331298828,
0.01538848876953125,
0.039031982421875,
-0.028656005859375,
0.006084442138671875,
0.0333251953125,
0.053985595703125,
0.007587432861328125,
-0.083251953125,
-0.0209808349609375,
-0.043609619140625,
-0.0289764404296875,
0.044403076171875,
-0.049224853515625,
0.06829833984375,
0.03759765625,
-0.01055908203125,
0.01009368896484375,
0.049835205078125,
0.02166748046875,
0.0282135009765625,
0.03900146484375,
0.088134765625,
0.02301025390625,
-0.0372314453125,
0.07489013671875,
-0.026947021484375,
0.033294677734375,
0.0758056640625,
-0.00266265869140625,
0.06939697265625,
0.024322509765625,
-0.00498199462890625,
0.032073974609375,
0.044891357421875,
-0.02081298828125,
0.038848876953125,
-0.002635955810546875,
0.01410675048828125,
-0.0079345703125,
0.01593017578125,
-0.052459716796875,
0.0255889892578125,
0.016021728515625,
-0.020599365234375,
0.006427764892578125,
-0.0041351318359375,
0.003284454345703125,
-0.002532958984375,
-0.01279449462890625,
0.047454833984375,
0.003871917724609375,
-0.04644775390625,
0.054779052734375,
-0.0011310577392578125,
0.05206298828125,
-0.051971435546875,
0.0099945068359375,
-0.01157379150390625,
0.019775390625,
0.0008859634399414062,
-0.038970947265625,
0.039031982421875,
0.0008969306945800781,
-0.0191192626953125,
-0.0321044921875,
0.0120697021484375,
-0.04364013671875,
-0.07080078125,
0.0287017822265625,
0.031707763671875,
0.01922607421875,
0.0035381317138671875,
-0.06396484375,
0.002742767333984375,
0.00927734375,
-0.05108642578125,
0.000008225440979003906,
0.05517578125,
0.02301025390625,
0.035491943359375,
0.048248291015625,
0.014678955078125,
0.01364898681640625,
-0.009033203125,
0.048736572265625,
-0.030303955078125,
-0.031463623046875,
-0.058349609375,
0.057220458984375,
-0.004230499267578125,
-0.054779052734375,
0.0482177734375,
0.07318115234375,
0.07562255859375,
-0.009033203125,
0.016326904296875,
-0.004962921142578125,
0.04876708984375,
-0.051971435546875,
0.053375244140625,
-0.07440185546875,
0.01293182373046875,
-0.0091094970703125,
-0.06866455078125,
-0.0186309814453125,
0.0290069580078125,
-0.00794219970703125,
-0.0245208740234375,
0.056060791015625,
0.04913330078125,
-0.0174713134765625,
-0.01290130615234375,
0.017852783203125,
0.0226593017578125,
0.015960693359375,
0.03790283203125,
0.024139404296875,
-0.07391357421875,
0.0372314453125,
-0.023712158203125,
-0.005641937255859375,
-0.004428863525390625,
-0.049835205078125,
-0.061767578125,
-0.04998779296875,
-0.0109710693359375,
-0.01470947265625,
-0.0249786376953125,
0.06671142578125,
0.038848876953125,
-0.07061767578125,
-0.040618896484375,
0.003986358642578125,
0.004932403564453125,
-0.016510009765625,
-0.0183258056640625,
0.046112060546875,
-0.01593017578125,
-0.069091796875,
0.034637451171875,
0.000988006591796875,
-0.004245758056640625,
0.0005545616149902344,
-0.0218048095703125,
-0.042724609375,
0.0018415451049804688,
0.0176239013671875,
0.0023250579833984375,
-0.04150390625,
0.005062103271484375,
0.0133209228515625,
-0.0008134841918945312,
0.0237274169921875,
0.0266876220703125,
-0.018218994140625,
0.01483917236328125,
0.058349609375,
0.033843994140625,
0.0288848876953125,
-0.003826141357421875,
0.045684814453125,
-0.05548095703125,
0.0280609130859375,
0.01171875,
0.048980712890625,
0.032318115234375,
-0.007488250732421875,
0.0684814453125,
0.0159912109375,
-0.049224853515625,
-0.08349609375,
0.00391387939453125,
-0.0887451171875,
0.00150299072265625,
0.07550048828125,
-0.0227508544921875,
-0.0190277099609375,
0.03082275390625,
-0.01117706298828125,
0.01502227783203125,
-0.0297088623046875,
0.0294342041015625,
0.0718994140625,
0.026031494140625,
0.00982666015625,
-0.046875,
0.0303497314453125,
0.0426025390625,
-0.050323486328125,
-0.0133209228515625,
0.0209503173828125,
0.01055145263671875,
0.03179931640625,
0.038421630859375,
-0.0192718505859375,
0.0109405517578125,
-0.024810791015625,
0.0274505615234375,
-0.01222991943359375,
-0.01364898681640625,
-0.0311126708984375,
0.0006651878356933594,
-0.006801605224609375,
-0.0187530517578125
]
] |
af1tang/personaGPT | 2023-08-20T20:10:05.000Z | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"arxiv:1801.07243",
"license:gpl-3.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | conversational | af1tang | null | null | af1tang/personaGPT | 99 | 16,133 | transformers | 2022-03-02T23:29:05 | ---
tags:
- conversational
license: gpl-3.0
---
## A conversational agent with many personalities (PersonaGPT)
PersonaGPT is an open-domain conversational agent designed to do 2 tasks:
1. decoding _personalized_ responses based on input personality facts (the "persona" profile of the bot).
2. incorporating _turn-level goals_ into its responses through "action codes" (e.g., "talk about work", "ask about favorite music").
It builds on the [DialoGPT-medium](https://huggingface.co/microsoft/DialoGPT-medium) pretrained model based on the [GPT-2](https://github.com/openai/gpt-2) architecture.
This model is trained on the [Persona-Chat](https://arxiv.org/pdf/1801.07243) dataset, with added special tokens to better distinguish between conversational history and personality traits for dyadic conversations. Furthermore, some active learning was used to train the model to do _controlled_ decoding using turn-level goals.
## Full Repo
Preprocessing, training and implementation details can be found in the [personaGPT repo](https://github.com/af1tang/personaGPT).
### How to Use
1. Load the model and define some helper functions.
```python
from transformers import GPT2Tokenizer, GPT2LMHeadModel
import torch
tokenizer = GPT2Tokenizer.from_pretrained("af1tang/personaGPT")
model = GPT2LMHeadModel.from_pretrained("af1tang/personaGPT")
if torch.cuda.is_available():
model = model.cuda()
## utility functions ##
flatten = lambda l: [item for sublist in l for item in sublist]
def to_data(x):
if torch.cuda.is_available():
x = x.cpu()
return x.data.numpy()
def to_var(x):
if not torch.is_tensor(x):
x = torch.Tensor(x)
if torch.cuda.is_available():
x = x.cuda()
return x
def display_dialog_history(dialog_hx):
for j, line in enumerate(dialog_hx):
msg = tokenizer.decode(line)
if j %2 == 0:
print(">> User: "+ msg)
else:
print("Bot: "+msg)
print()
def generate_next(bot_input_ids, do_sample=True, top_k=10, top_p=.92,
max_length=1000, pad_token=tokenizer.eos_token_id):
full_msg = model.generate(bot_input_ids, do_sample=True,
top_k=top_k, top_p=top_p,
max_length=max_length, pad_token_id=tokenizer.eos_token_id)
msg = to_data(full_msg.detach()[0])[bot_input_ids.shape[-1]:]
return msg
```
2. Give your chatbot partner a set of personalities.
```python
# get personality facts for conversation
personas = []
for i in range(3):
response = input(">> Fact %d: "%(i+1))+ tokenizer.eos_token
personas.append(response)
personas = tokenizer.encode(''.join(['<|p2|>'] + personas + ['<|sep|>'] + ['<|start|>']))
```
3. The first use of PersonaGPT is to do _personalized_ dialog generation. Use the following loop to interact with the model.
```python
# converse for 8 turns
dialog_hx = []
for step in range(8):
# encode the user input
user_inp = tokenizer.encode(input(">> User: ") + tokenizer.eos_token)
# append to the chat history
dialog_hx.append(user_inp)
# generated a response while limiting the total chat history to 1000 tokens,
bot_input_ids = to_var([personas + flatten(dialog_hx)]).long()
msg = generate_next(bot_input_ids)
dialog_hx.append(msg)
print("Bot: {}".format(tokenizer.decode(msg, skip_special_tokens=True)))
```
Example of personalized decoding:
| | Persona Facts |
|---|---|
| 1 | i am 64 years old but feel quite young. |
| 2 | i try to eat healthy but limit mcdonalds to once a week. |
| 3 | i regret working as a doctor for the last 20 years. |
| 4 | my secret hobby is making self-help youtube videos. |
| 5 | i've been to spain many times and wish i could go again. |
| Role | Response|
|---|---|
| Human | yo whats good fam? |
| Bot | hello, how are you doing? |
| Human | not bad what you doing today?|
| Bot | eating a lot healthier now that i'm in my 60s |
| Human | i'm more of a fast food guy |
| Bot | me too but i'm very healthy so i try to avoid it |
| Human | what do you do for fun? |
| Bot | i make videos for fun and eat very healthy |
| Human | nice what kind of videos? |
| Bot | i love making videos about self help stuff |
| Human | i think i need some help |
| Bot | what kind of help would you like to get |
| Human | i'm 21 and feel lost in life |
| Bot | i can relate, i'm only 65 |
4. The second use of PersonaGPT is to do _controlled response generation_. Use the following program to interact with the model.
```python
## available actions ##
action_space = [ 'ask about kids.', "ask about pets.", 'talk about work.',
'ask about marital status.', 'talk about travel.', 'ask about age and gender.',
'ask about hobbies.', 'ask about favorite food.', 'talk about movies.',
'talk about music.', 'talk about politics.']
# converse for 8 turns
dialog_hx = []
for step in range(8):
# choose an action
act = None
while act not in action_space:
display_dialog_history(dialog_hx)
print()
print(" actions: ")
for k,v in enumerate(action_space): print(k,v)
try:
act = action_space[int(input(" input [0-10]: " ))]
except:
act = None
print()
# format into prefix code
action_prefix = tokenizer.encode(''.join(['<|act|> '] + [act] + ['<|p1|>'] + [] + ['<|sep|>'] + ['<|start|>']))
bot_input_ids = to_var([action_prefix + flatten(dialog_hx)]).long()
# generate query conditioned on action
msg = generate_next(bot_input_ids, top_k=top_k, top_p=top_p, max_length=max_length)
dialog_hx.append(msg)
# generate bot response
bot_input_ids = to_var([personas+ flatten(dialog_hx)]).long()
msg = generate_next(bot_input_ids, top_k=top_k, top_p=top_p, max_length=max_length)
dialog_hx.append(msg)
display_dialog_history(dialog_hx)
```
Example of controlled response generation:
| | Bot 2's Persona Facts |
|---|---|
| 1 | i am a quiet engineer. |
| 2 | i'm single and am looking for love. |
| 3 | sadly, i don't have any relatable hobbies. |
| 4 | luckily, however, i am tall and athletic. |
| 5 | on friday nights, i watch re-runs of the simpsons alone. |
| Role | Action | Response |
|---|---|---|
| Bot 1 | talk about movies | hello do you like movies? |
| Bot 2 | - | i love the simpsons. how about you? |
| Bot 1 | talk about music | i like the walking dead. do you play an instrument? |
| Bot 2 | - | yes i play the violin. |
| Bot 1 | ask about marital status | are you married or single? |
| Bot 2 | - | i am single but i am looking for love. |
| Bot 1 | talk about work | my job is working in a factory. what's your occupation? |
| Bot 2 | - | engineer. i'm very quiet so no one hears me. |
| Bot 1 | talk about hobbies | do you have any hobbies? |
| Bot 2 | - | i watch reruns of the simpsons. |
| Bot 1 | ask about favorite food | what's your favorite food? |
| Bot 2 | - | i love pizza. how about yourself? |
| Bot 1 | ask about pets | i also love pizza. do you like animals? |
| Bot 2 | - | i have two dogs. what is your occupation? |
| Bot 1 | talk about work | i'm a factory worker. what's your dream job? |
| Bot 2 | - | i'd love to be a writer one day. | | 7,282 | [
[
-0.02899169921875,
-0.07855224609375,
0.036529541015625,
0.0243682861328125,
0.0007529258728027344,
0.0094146728515625,
-0.0108489990234375,
0.0017452239990234375,
0.0283660888671875,
0.03533935546875,
-0.0670166015625,
-0.0291748046875,
-0.0294342041015625,
-0.00434112548828125,
0.01806640625,
0.07867431640625,
0.018463134765625,
0.00949859619140625,
-0.007663726806640625,
0.01230621337890625,
-0.042816162109375,
-0.047454833984375,
-0.0694580078125,
-0.02142333984375,
0.0298004150390625,
0.0186309814453125,
0.0382080078125,
0.0183563232421875,
0.0256500244140625,
0.038360595703125,
0.0014553070068359375,
0.0101470947265625,
-0.0277862548828125,
0.0167694091796875,
-0.01169586181640625,
-0.037872314453125,
-0.051116943359375,
-0.01129913330078125,
0.019317626953125,
0.032684326171875,
0.00539398193359375,
0.0180206298828125,
0.0131683349609375,
0.0216217041015625,
-0.0126800537109375,
0.039825439453125,
-0.0223236083984375,
0.00788116455078125,
-0.0010700225830078125,
-0.01824951171875,
-0.0311279296875,
-0.033905029296875,
0.011260986328125,
-0.031585693359375,
0.0079345703125,
0.0016078948974609375,
0.057281494140625,
0.0189361572265625,
-0.021514892578125,
-0.029266357421875,
-0.046661376953125,
0.058624267578125,
-0.06829833984375,
-0.00594329833984375,
0.047271728515625,
0.022735595703125,
-0.031982421875,
-0.04876708984375,
-0.04656982421875,
-0.0143890380859375,
-0.03131103515625,
0.03131103515625,
-0.02099609375,
-0.007366180419921875,
0.0164642333984375,
0.01403045654296875,
-0.058258056640625,
-0.0172271728515625,
-0.02874755859375,
-0.025238037109375,
0.036834716796875,
0.0258941650390625,
0.0304107666015625,
-0.03338623046875,
-0.01348876953125,
-0.00504302978515625,
-0.00989532470703125,
0.0244598388671875,
0.0313720703125,
0.0038623809814453125,
-0.028594970703125,
0.056060791015625,
-0.0212249755859375,
0.035125732421875,
0.0305023193359375,
-0.02813720703125,
0.0199127197265625,
-0.025970458984375,
-0.01499176025390625,
-0.02435302734375,
0.0782470703125,
0.03387451171875,
0.013275146484375,
0.01168060302734375,
0.01479339599609375,
-0.01058197021484375,
-0.002544403076171875,
-0.0484619140625,
-0.03485107421875,
0.04998779296875,
-0.04071044921875,
-0.0214080810546875,
-0.01403045654296875,
-0.0662841796875,
-0.0198974609375,
0.00881195068359375,
0.0340576171875,
-0.0438232421875,
-0.017120361328125,
0.016143798828125,
-0.02581787109375,
0.0209808349609375,
0.030548095703125,
-0.07049560546875,
0.0155792236328125,
0.025360107421875,
0.06109619140625,
0.020599365234375,
-0.0121917724609375,
-0.036163330078125,
-0.0027141571044921875,
0.005382537841796875,
0.048431396484375,
-0.0285491943359375,
-0.019927978515625,
-0.018829345703125,
0.02215576171875,
-0.0176544189453125,
-0.02880859375,
0.020782470703125,
-0.0143890380859375,
0.058197021484375,
-0.0169830322265625,
-0.0273284912109375,
-0.00878143310546875,
0.02203369140625,
-0.021697998046875,
0.06329345703125,
0.01806640625,
-0.058929443359375,
-0.004909515380859375,
-0.06781005859375,
-0.0233306884765625,
0.00772857666015625,
-0.0092315673828125,
-0.0218963623046875,
-0.020660400390625,
0.019927978515625,
0.04754638671875,
-0.0107269287109375,
0.00795745849609375,
-0.0183563232421875,
0.00444793701171875,
0.036651611328125,
-0.01508331298828125,
0.076904296875,
0.012725830078125,
-0.039642333984375,
0.0117340087890625,
-0.04132080078125,
0.01311492919921875,
0.0261383056640625,
-0.039276123046875,
0.0099029541015625,
0.004238128662109375,
-0.005931854248046875,
0.0287322998046875,
0.0450439453125,
-0.03839111328125,
0.0255279541015625,
-0.028106689453125,
0.05242919921875,
0.0609130859375,
0.016510009765625,
0.01441192626953125,
-0.04412841796875,
0.035919189453125,
0.0007643699645996094,
0.008758544921875,
-0.01605224609375,
-0.056396484375,
-0.0653076171875,
-0.0197601318359375,
-0.004917144775390625,
0.07037353515625,
-0.047454833984375,
0.08026123046875,
-0.0018491744995117188,
-0.040496826171875,
-0.02606201171875,
0.0031795501708984375,
0.033660888671875,
0.0377197265625,
0.0148773193359375,
-0.032562255859375,
-0.042449951171875,
-0.0640869140625,
0.00457763671875,
-0.01739501953125,
-0.023040771484375,
0.056182861328125,
0.048065185546875,
-0.01580810546875,
0.067626953125,
-0.053009033203125,
-0.014739990234375,
-0.0435791015625,
0.0017461776733398438,
0.025970458984375,
0.0521240234375,
0.042388916015625,
-0.053009033203125,
-0.050537109375,
-0.007137298583984375,
-0.065185546875,
0.007656097412109375,
-0.0250244140625,
-0.0199127197265625,
-0.0022296905517578125,
0.003040313720703125,
-0.06036376953125,
0.0286865234375,
0.017669677734375,
-0.042877197265625,
0.0292510986328125,
-0.0255279541015625,
0.01378631591796875,
-0.09283447265625,
0.007022857666015625,
-0.00007343292236328125,
-0.0207672119140625,
-0.04998779296875,
-0.0104522705078125,
-0.036712646484375,
-0.0093841552734375,
-0.038177490234375,
0.046539306640625,
-0.017364501953125,
0.0052490234375,
-0.006504058837890625,
0.015106201171875,
-0.0196533203125,
0.08648681640625,
0.0015535354614257812,
0.044525146484375,
0.054656982421875,
-0.04852294921875,
0.043182373046875,
0.04779052734375,
-0.013946533203125,
0.04315185546875,
-0.056396484375,
0.03167724609375,
-0.0233001708984375,
0.036712646484375,
-0.09521484375,
-0.034759521484375,
0.06268310546875,
-0.061676025390625,
0.00043487548828125,
-0.01023101806640625,
-0.03729248046875,
-0.038970947265625,
-0.0216217041015625,
-0.0024623870849609375,
0.050384521484375,
-0.0167083740234375,
0.05010986328125,
0.035858154296875,
-0.0265960693359375,
-0.017120361328125,
-0.03558349609375,
-0.010406494140625,
-0.0200347900390625,
-0.056549072265625,
0.0033283233642578125,
-0.03759765625,
-0.00994873046875,
-0.021575927734375,
0.0026702880859375,
-0.00719451904296875,
-0.004383087158203125,
0.0214080810546875,
0.0007529258728027344,
-0.0004253387451171875,
-0.015594482421875,
-0.00881195068359375,
-0.005084991455078125,
0.0045318603515625,
-0.003448486328125,
0.09844970703125,
-0.02996826171875,
-0.0099945068359375,
-0.06280517578125,
0.0281524658203125,
0.04351806640625,
-0.0135650634765625,
0.052459716796875,
0.0657958984375,
-0.01349639892578125,
0.0197296142578125,
-0.035858154296875,
-0.0236663818359375,
-0.04266357421875,
0.0269622802734375,
-0.0382080078125,
-0.04620361328125,
0.0548095703125,
0.007053375244140625,
0.0092926025390625,
0.032623291015625,
0.0777587890625,
-0.0176544189453125,
0.08880615234375,
0.0255584716796875,
-0.0003457069396972656,
0.03302001953125,
-0.034332275390625,
0.00580596923828125,
-0.051727294921875,
-0.040679931640625,
-0.015045166015625,
-0.0300140380859375,
-0.044647216796875,
-0.027496337890625,
0.0222320556640625,
-0.01181793212890625,
-0.045379638671875,
0.035125732421875,
-0.042388916015625,
0.007671356201171875,
0.062255859375,
0.039276123046875,
-0.00518035888671875,
-0.0166778564453125,
0.006591796875,
-0.01474761962890625,
-0.044830322265625,
-0.0477294921875,
0.065673828125,
0.0277557373046875,
0.049102783203125,
0.0174102783203125,
0.0494384765625,
0.01155853271484375,
-0.0047607421875,
-0.06475830078125,
0.057891845703125,
0.01122283935546875,
-0.060943603515625,
-0.0248260498046875,
-0.024932861328125,
-0.071044921875,
0.004100799560546875,
-0.004665374755859375,
-0.087158203125,
0.003749847412109375,
0.011199951171875,
-0.04345703125,
0.0183563232421875,
-0.049713134765625,
0.06689453125,
-0.03375244140625,
-0.031341552734375,
0.0015869140625,
-0.0745849609375,
0.017791748046875,
0.01557159423828125,
-0.00872039794921875,
-0.0169525146484375,
0.01192474365234375,
0.05731201171875,
-0.0292205810546875,
0.07757568359375,
-0.01236724853515625,
0.01540374755859375,
0.0272064208984375,
0.01202392578125,
0.037445068359375,
0.02105712890625,
0.0120391845703125,
-0.0121917724609375,
0.004825592041015625,
-0.038330078125,
-0.039520263671875,
0.05035400390625,
-0.07562255859375,
-0.054443359375,
-0.035797119140625,
-0.026336669921875,
-0.00601959228515625,
0.0276641845703125,
0.033935546875,
0.01529693603515625,
-0.00849151611328125,
0.0226593017578125,
0.04608154296875,
-0.03863525390625,
0.0390625,
0.02374267578125,
0.0028324127197265625,
-0.0362548828125,
0.052001953125,
-0.019195556640625,
0.01025390625,
0.034027099609375,
0.021575927734375,
-0.01290130615234375,
-0.0006036758422851562,
-0.02496337890625,
0.022491455078125,
-0.035675048828125,
-0.0029087066650390625,
-0.07940673828125,
-0.0168609619140625,
-0.0523681640625,
-0.00589752197265625,
-0.0249786376953125,
-0.00778961181640625,
-0.02752685546875,
0.0150146484375,
0.029754638671875,
0.037872314453125,
0.00438690185546875,
0.04052734375,
-0.0343017578125,
0.036041259765625,
0.038787841796875,
0.0049591064453125,
-0.01232147216796875,
-0.0311279296875,
0.0085601806640625,
0.018951416015625,
-0.042633056640625,
-0.07940673828125,
0.0570068359375,
0.0026149749755859375,
0.04534912109375,
0.0294647216796875,
-0.0029449462890625,
0.04449462890625,
-0.022552490234375,
0.0693359375,
0.00896453857421875,
-0.0634765625,
0.052276611328125,
-0.043792724609375,
0.0168914794921875,
0.025421142578125,
0.02880859375,
-0.050201416015625,
-0.0205230712890625,
-0.05035400390625,
-0.044891357421875,
0.0665283203125,
0.040863037109375,
0.028594970703125,
0.0006413459777832031,
0.020538330078125,
-0.0019521713256835938,
0.0234222412109375,
-0.07818603515625,
-0.044464111328125,
-0.029205322265625,
-0.00847625732421875,
0.01335906982421875,
-0.001384735107421875,
-0.0046234130859375,
-0.039459228515625,
0.05120849609375,
-0.006771087646484375,
0.06884765625,
0.010894775390625,
0.019744873046875,
0.007663726806640625,
0.020721435546875,
0.044464111328125,
0.033660888671875,
-0.0305938720703125,
-0.01114654541015625,
0.007114410400390625,
-0.0271148681640625,
0.0027866363525390625,
-0.00801849365234375,
0.002185821533203125,
-0.006771087646484375,
0.0023746490478515625,
0.06329345703125,
0.002521514892578125,
-0.04388427734375,
0.03790283203125,
-0.015899658203125,
-0.0128021240234375,
-0.050384521484375,
0.014862060546875,
0.019439697265625,
0.0235137939453125,
0.007381439208984375,
0.0110321044921875,
-0.00508880615234375,
-0.06439208984375,
0.00591278076171875,
0.035858154296875,
-0.00298309326171875,
-0.027374267578125,
0.06646728515625,
0.0170135498046875,
-0.05517578125,
0.051055908203125,
-0.0189361572265625,
-0.048065185546875,
0.0513916015625,
0.041412353515625,
0.06158447265625,
-0.0022125244140625,
0.03350830078125,
0.0350341796875,
0.00653839111328125,
-0.0021114349365234375,
0.03021240234375,
-0.0081329345703125,
-0.048919677734375,
-0.003147125244140625,
-0.038543701171875,
-0.02984619140625,
0.04058837890625,
-0.0295562744140625,
0.0288543701171875,
-0.05560302734375,
-0.0150299072265625,
0.005191802978515625,
0.006740570068359375,
-0.06329345703125,
0.0048065185546875,
-0.00876617431640625,
0.0284271240234375,
-0.050384521484375,
0.0499267578125,
0.049652099609375,
-0.03997802734375,
-0.045318603515625,
0.003620147705078125,
0.008880615234375,
-0.062744140625,
0.03106689453125,
0.0016965866088867188,
0.006946563720703125,
0.006473541259765625,
-0.060546875,
-0.068603515625,
0.07598876953125,
0.01410675048828125,
-0.0184326171875,
-0.0189361572265625,
0.0018396377563476562,
0.0469970703125,
-0.03497314453125,
0.03167724609375,
0.040679931640625,
0.035888671875,
0.01097869873046875,
-0.06689453125,
0.0012178421020507812,
-0.034942626953125,
-0.01617431640625,
-0.01535797119140625,
-0.076416015625,
0.0732421875,
-0.0169830322265625,
-0.0075531005859375,
0.0284576416015625,
0.03363037109375,
0.0186614990234375,
0.0114288330078125,
0.03009033203125,
0.0240936279296875,
0.037445068359375,
-0.0097198486328125,
0.0677490234375,
-0.032379150390625,
0.042449951171875,
0.07647705078125,
0.002788543701171875,
0.0443115234375,
0.01296234130859375,
-0.023101806640625,
0.039642333984375,
0.061187744140625,
-0.0172882080078125,
0.038818359375,
0.00913238525390625,
-0.0198974609375,
-0.0218505859375,
0.0142822265625,
-0.0201416015625,
0.040130615234375,
0.0130767822265625,
-0.028961181640625,
-0.00213623046875,
0.01554107666015625,
0.0258026123046875,
-0.00968170166015625,
0.0123748779296875,
0.05657958984375,
-0.0157623291015625,
-0.0638427734375,
0.0280609130859375,
0.0058746337890625,
0.06597900390625,
-0.04718017578125,
0.00926971435546875,
-0.02337646484375,
0.0239410400390625,
-0.00931549072265625,
-0.056060791015625,
0.0029125213623046875,
-0.01513671875,
-0.0128021240234375,
-0.0232696533203125,
0.041015625,
-0.0284881591796875,
-0.0309295654296875,
-0.002552032470703125,
0.056182861328125,
0.036651611328125,
-0.0011682510375976562,
-0.07373046875,
-0.0211944580078125,
0.02166748046875,
-0.027496337890625,
0.0155487060546875,
0.01454925537109375,
0.0107879638671875,
0.06353759765625,
0.04034423828125,
-0.00321197509765625,
-0.001312255859375,
-0.023956298828125,
0.0660400390625,
-0.0513916015625,
-0.03265380859375,
-0.0697021484375,
0.045928955078125,
-0.005096435546875,
-0.0382080078125,
0.06878662109375,
0.043975830078125,
0.0506591796875,
-0.034332275390625,
0.0654296875,
-0.0478515625,
0.03765869140625,
-0.01473236083984375,
0.051544189453125,
-0.031585693359375,
0.00814056396484375,
-0.0182037353515625,
-0.047515869140625,
0.00214385986328125,
0.066162109375,
-0.0254364013671875,
0.016937255859375,
0.037445068359375,
0.06353759765625,
0.00531005859375,
-0.0291748046875,
0.016693115234375,
0.0200958251953125,
0.03271484375,
0.0631103515625,
0.045928955078125,
-0.0258026123046875,
0.04351806640625,
-0.037445068359375,
-0.0229644775390625,
-0.01326751708984375,
-0.024383544921875,
-0.09027099609375,
-0.05535888671875,
-0.02789306640625,
-0.039459228515625,
-0.01259613037109375,
0.07867431640625,
0.047271728515625,
-0.053466796875,
-0.020294189453125,
0.0003018379211425781,
0.004589080810546875,
-0.01070404052734375,
-0.0200958251953125,
0.0210113525390625,
-0.00250244140625,
-0.07843017578125,
0.0118408203125,
0.008880615234375,
0.0178375244140625,
-0.026092529296875,
-0.0041046142578125,
-0.018798828125,
0.015106201171875,
0.040283203125,
0.037872314453125,
-0.056304931640625,
-0.0246429443359375,
0.0203857421875,
-0.010955810546875,
0.025115966796875,
0.046356201171875,
-0.047332763671875,
0.026153564453125,
0.042022705078125,
0.004913330078125,
0.052032470703125,
0.0025577545166015625,
0.033203125,
-0.042572021484375,
0.0153961181640625,
0.0241241455078125,
0.031982421875,
0.019927978515625,
-0.04736328125,
0.032470703125,
0.0228271484375,
-0.0537109375,
-0.041961669921875,
0.017303466796875,
-0.09710693359375,
-0.01476287841796875,
0.08721923828125,
-0.01210784912109375,
-0.017120361328125,
-0.0159454345703125,
-0.05377197265625,
0.0241241455078125,
-0.04315185546875,
0.0526123046875,
0.044586181640625,
-0.0528564453125,
-0.0157012939453125,
-0.035675048828125,
0.03509521484375,
0.02490234375,
-0.07098388671875,
-0.0196990966796875,
0.0316162109375,
0.046661376953125,
0.0253448486328125,
0.07373046875,
0.00719451904296875,
0.01373291015625,
-0.002307891845703125,
-0.0025634765625,
0.01038360595703125,
0.0012178421020507812,
-0.0036945343017578125,
0.01210784912109375,
-0.0242919921875,
-0.028594970703125
]
] |
facebook/s2t-small-librispeech-asr | 2023-09-06T19:14:59.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"speech_to_text",
"automatic-speech-recognition",
"speech",
"audio",
"hf-asr-leaderboard",
"en",
"dataset:librispeech_asr",
"arxiv:2010.05171",
"arxiv:1904.08779",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/s2t-small-librispeech-asr | 19 | 16,109 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- librispeech_asr
tags:
- speech
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
license: mit
pipeline_tag: automatic-speech-recognition
widget:
- example_title: Librispeech sample 1
src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
model-index:
- name: s2t-small-librispeech-asr
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 4.3
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (other)
type: librispeech_asr
config: other
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 9.0
---
# S2T-SMALL-LIBRISPEECH-ASR
`s2t-small-librispeech-asr` is a Speech to Text Transformer (S2T) model trained for automatic speech recognition (ASR).
The S2T model was proposed in [this paper](https://arxiv.org/abs/2010.05171) and released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text)
## Model description
S2T is an end-to-end sequence-to-sequence transformer model. It is trained with standard
autoregressive cross-entropy loss and generates the transcripts autoregressively.
## Intended uses & limitations
This model can be used for end-to-end speech recognition (ASR).
See the [model hub](https://huggingface.co/models?filter=speech_to_text) to look for other S2T checkpoints.
### How to use
As this a standard sequence to sequence transformer model, you can use the `generate` method to generate the
transcripts by passing the speech features to the model.
*Note: The `Speech2TextProcessor` object uses [torchaudio](https://github.com/pytorch/audio) to extract the
filter bank features. Make sure to install the `torchaudio` package before running this example.*
*Note: The feature extractor depends on [torchaudio](https://github.com/pytorch/audio) and the tokenizer depends on [sentencepiece](https://github.com/google/sentencepiece)
so be sure to install those packages before running the examples.*
You could either install those as extra speech dependancies with
`pip install transformers"[speech, sentencepiece]"` or install the packages seperatly
with `pip install torchaudio sentencepiece`.
```python
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-librispeech-asr")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-librispeech-asr")
ds = load_dataset(
"patrickvonplaten/librispeech_asr_dummy",
"clean",
split="validation"
)
input_features = processor(
ds[0]["audio"]["array"],
sampling_rate=16_000,
return_tensors="pt"
).input_features # Batch size 1
generated_ids = model.generate(input_features=input_features)
transcription = processor.batch_decode(generated_ids)
```
#### Evaluation on LibriSpeech Test
The following script shows how to evaluate this model on the [LibriSpeech](https://huggingface.co/datasets/librispeech_asr)
*"clean"* and *"other"* test dataset.
```python
from datasets import load_dataset
from evaluate import load
from transformers import Speech2TextForConditionalGeneration, Speech2TextProcessor
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test") # change to "other" for other test dataset
wer = load("wer")
model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-librispeech-asr").to("cuda")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-librispeech-asr", do_upper_case=True)
def map_to_pred(batch):
features = processor(batch["audio"]["array"], sampling_rate=16000, padding=True, return_tensors="pt")
input_features = features.input_features.to("cuda")
attention_mask = features.attention_mask.to("cuda")
gen_tokens = model.generate(input_features=input_features, attention_mask=attention_mask)
batch["transcription"] = processor.batch_decode(gen_tokens, skip_special_tokens=True)[0]
return batch
result = librispeech_eval.map(map_to_pred, remove_columns=["audio"])
print("WER:", wer.compute(predictions=result["transcription"], references=result["text"]))
```
*Result (WER)*:
| "clean" | "other" |
|:-------:|:-------:|
| 4.3 | 9.0 |
## Training data
The S2T-SMALL-LIBRISPEECH-ASR is trained on [LibriSpeech ASR Corpus](https://www.openslr.org/12), a dataset consisting of
approximately 1000 hours of 16kHz read English speech.
## Training procedure
### Preprocessing
The speech data is pre-processed by extracting Kaldi-compliant 80-channel log mel-filter bank features automatically from
WAV/FLAC audio files via PyKaldi or torchaudio. Further utterance-level CMVN (cepstral mean and variance normalization)
is applied to each example.
The texts are lowercased and tokenized using SentencePiece and a vocabulary size of 10,000.
### Training
The model is trained with standard autoregressive cross-entropy loss and using [SpecAugment](https://arxiv.org/abs/1904.08779).
The encoder receives speech features, and the decoder generates the transcripts autoregressively.
### BibTeX entry and citation info
```bibtex
@inproceedings{wang2020fairseqs2t,
title = {fairseq S2T: Fast Speech-to-Text Modeling with fairseq},
author = {Changhan Wang and Yun Tang and Xutai Ma and Anne Wu and Dmytro Okhonko and Juan Pino},
booktitle = {Proceedings of the 2020 Conference of the Asian Chapter of the Association for Computational Linguistics (AACL): System Demonstrations},
year = {2020},
}
``` | 6,003 | [
[
-0.007358551025390625,
-0.0567626953125,
0.00982666015625,
0.01032257080078125,
-0.016693115234375,
-0.0174713134765625,
-0.03643798828125,
-0.028900146484375,
0.005863189697265625,
0.030792236328125,
-0.0517578125,
-0.0215911865234375,
-0.0482177734375,
-0.01549530029296875,
-0.038482666015625,
0.077880859375,
0.0163726806640625,
0.018280029296875,
-0.00386810302734375,
0.0009465217590332031,
-0.01538848876953125,
-0.019805908203125,
-0.07354736328125,
-0.0294647216796875,
0.0058746337890625,
0.032470703125,
0.0212860107421875,
0.0372314453125,
0.022003173828125,
0.0244293212890625,
-0.0260162353515625,
-0.01044464111328125,
-0.03375244140625,
-0.0040283203125,
-0.0038318634033203125,
-0.0270538330078125,
-0.012786865234375,
-0.0002193450927734375,
0.047088623046875,
0.0237884521484375,
-0.01270294189453125,
0.034515380859375,
0.0175018310546875,
0.0247650146484375,
-0.018798828125,
0.01434326171875,
-0.034881591796875,
-0.02081298828125,
-0.01654052734375,
0.0041961669921875,
-0.03338623046875,
-0.01456451416015625,
0.019744873046875,
-0.0201873779296875,
0.0106201171875,
-0.005550384521484375,
0.08056640625,
0.0275115966796875,
-0.019134521484375,
-0.04833984375,
-0.04571533203125,
0.047760009765625,
-0.06268310546875,
0.0399169921875,
0.033721923828125,
0.0206756591796875,
0.006755828857421875,
-0.0848388671875,
-0.047637939453125,
-0.0101776123046875,
-0.007556915283203125,
0.021697998046875,
-0.029510498046875,
-0.013458251953125,
0.0285491943359375,
0.01727294921875,
-0.05889892578125,
0.0054473876953125,
-0.05120849609375,
-0.0255279541015625,
0.062469482421875,
-0.0196685791015625,
0.01398468017578125,
-0.0312347412109375,
-0.0281524658203125,
-0.02825927734375,
-0.0186920166015625,
0.031280517578125,
0.0184326171875,
0.024658203125,
-0.03729248046875,
0.03887939453125,
-0.002655029296875,
0.033447265625,
0.0095672607421875,
-0.018768310546875,
0.04327392578125,
-0.02178955078125,
-0.0120849609375,
0.020721435546875,
0.07769775390625,
0.01178741455078125,
0.00787353515625,
0.0086822509765625,
-0.01332855224609375,
0.007843017578125,
-0.015350341796875,
-0.07293701171875,
-0.021575927734375,
0.0305023193359375,
-0.02972412109375,
-0.00835418701171875,
0.00115966796875,
-0.03955078125,
-0.0081634521484375,
-0.01432037353515625,
0.06427001953125,
-0.053680419921875,
-0.01097869873046875,
0.007572174072265625,
-0.0208587646484375,
0.0227813720703125,
-0.01336669921875,
-0.0565185546875,
0.0229034423828125,
0.0288848876953125,
0.060150146484375,
0.00052642822265625,
-0.01456451416015625,
-0.042266845703125,
-0.0003566741943359375,
0.00792694091796875,
0.058349609375,
-0.0004582405090332031,
-0.031494140625,
-0.00930023193359375,
0.0169830322265625,
-0.01275634765625,
-0.035919189453125,
0.053680419921875,
-0.01373291015625,
0.03314208984375,
0.00200653076171875,
-0.048553466796875,
-0.014801025390625,
-0.031494140625,
-0.028594970703125,
0.0875244140625,
-0.006683349609375,
-0.052001953125,
0.006038665771484375,
-0.050506591796875,
-0.04840087890625,
-0.025634765625,
-0.01290130615234375,
-0.047149658203125,
-0.005096435546875,
0.0162506103515625,
0.0283050537109375,
-0.0255126953125,
0.014373779296875,
0.001285552978515625,
-0.035552978515625,
0.0382080078125,
-0.02935791015625,
0.09698486328125,
0.021575927734375,
-0.048370361328125,
0.01271820068359375,
-0.062103271484375,
0.0037899017333984375,
0.017547607421875,
-0.017547607421875,
0.01110076904296875,
-0.0003540515899658203,
0.03289794921875,
0.024627685546875,
0.021514892578125,
-0.038543701171875,
-0.01317596435546875,
-0.058502197265625,
0.054718017578125,
0.046600341796875,
-0.00848388671875,
0.0338134765625,
-0.0234222412109375,
0.006977081298828125,
-0.00836944580078125,
-0.00232696533203125,
0.0025577545166015625,
-0.033966064453125,
-0.06951904296875,
-0.0280609130859375,
0.0279998779296875,
0.062255859375,
-0.043426513671875,
0.05072021484375,
-0.0301666259765625,
-0.0609130859375,
-0.05621337890625,
-0.00978851318359375,
0.03887939453125,
0.0293731689453125,
0.035888671875,
-0.0164031982421875,
-0.04876708984375,
-0.06658935546875,
-0.020416259765625,
-0.00811004638671875,
-0.023101806640625,
0.024017333984375,
0.03857421875,
-0.035858154296875,
0.06561279296875,
-0.027557373046875,
-0.029510498046875,
-0.0240631103515625,
0.0026607513427734375,
0.030975341796875,
0.047210693359375,
0.016937255859375,
-0.04107666015625,
-0.0279083251953125,
-0.02496337890625,
-0.03759765625,
-0.0208282470703125,
-0.0031528472900390625,
0.006908416748046875,
0.0186309814453125,
0.046844482421875,
-0.039398193359375,
0.032470703125,
0.037109375,
-0.033966064453125,
0.0256500244140625,
0.00004678964614868164,
-0.004974365234375,
-0.09442138671875,
0.00832366943359375,
-0.01479339599609375,
-0.02099609375,
-0.04925537109375,
-0.020416259765625,
-0.01055145263671875,
-0.018035888671875,
-0.035308837890625,
0.041717529296875,
-0.0226593017578125,
-0.0219879150390625,
-0.01593017578125,
0.0306549072265625,
-0.01071929931640625,
0.0307159423828125,
0.00579071044921875,
0.06378173828125,
0.0452880859375,
-0.037384033203125,
0.0202484130859375,
0.03759765625,
-0.033782958984375,
0.0190277099609375,
-0.0640869140625,
0.024261474609375,
0.00856781005859375,
0.0206756591796875,
-0.08148193359375,
-0.016693115234375,
0.01155853271484375,
-0.06463623046875,
0.0170440673828125,
0.00856781005859375,
-0.045806884765625,
-0.0310516357421875,
-0.00862884521484375,
0.0091400146484375,
0.052001953125,
-0.0306396484375,
0.051025390625,
0.030853271484375,
-0.0184173583984375,
-0.021026611328125,
-0.07611083984375,
-0.0258026123046875,
-0.01557159423828125,
-0.0638427734375,
0.033172607421875,
-0.0019407272338867188,
0.00530242919921875,
-0.005252838134765625,
-0.01236724853515625,
0.002593994140625,
-0.01251220703125,
0.032470703125,
0.00336456298828125,
-0.0022678375244140625,
0.004329681396484375,
-0.0017871856689453125,
-0.01546478271484375,
0.0026035308837890625,
-0.017822265625,
0.0592041015625,
-0.01580810546875,
-0.005535125732421875,
-0.06719970703125,
0.004405975341796875,
0.0027751922607421875,
-0.018707275390625,
0.0268402099609375,
0.08270263671875,
-0.0280609130859375,
-0.0010614395141601562,
-0.0361328125,
-0.0408935546875,
-0.037994384765625,
0.043701171875,
-0.026092529296875,
-0.06414794921875,
0.039337158203125,
0.012237548828125,
-0.007404327392578125,
0.049041748046875,
0.04766845703125,
-0.01422882080078125,
0.04974365234375,
0.019866943359375,
0.005596160888671875,
0.037506103515625,
-0.048797607421875,
0.006267547607421875,
-0.049896240234375,
-0.0093231201171875,
-0.036468505859375,
-0.0140380859375,
-0.050506591796875,
-0.046295166015625,
0.026153564453125,
0.00888824462890625,
-0.031402587890625,
0.04046630859375,
-0.0380859375,
0.01332855224609375,
0.057769775390625,
0.01174163818359375,
0.0013151168823242188,
0.0163726806640625,
0.004192352294921875,
-0.0061492919921875,
-0.045440673828125,
-0.0296630859375,
0.07501220703125,
0.0360107421875,
0.034912109375,
0.005645751953125,
0.05511474609375,
0.01044464111328125,
-0.01092529296875,
-0.05963134765625,
0.0399169921875,
-0.03564453125,
-0.0452880859375,
-0.0263824462890625,
-0.0168914794921875,
-0.060028076171875,
0.0011692047119140625,
-0.01218414306640625,
-0.0743408203125,
0.01104736328125,
0.00023233890533447266,
-0.04241943359375,
-0.0088958740234375,
-0.0555419921875,
0.067626953125,
-0.011749267578125,
-0.019073486328125,
-0.01274871826171875,
-0.06463623046875,
0.007480621337890625,
0.010589599609375,
0.019317626953125,
-0.00026345252990722656,
0.0224761962890625,
0.0933837890625,
-0.0163726806640625,
0.05426025390625,
-0.0224761962890625,
0.006328582763671875,
0.043060302734375,
-0.0142822265625,
0.03094482421875,
0.0008082389831542969,
-0.006710052490234375,
0.0258941650390625,
0.016326904296875,
-0.0119476318359375,
-0.0226593017578125,
0.03521728515625,
-0.07403564453125,
-0.0281524658203125,
-0.031219482421875,
-0.023223876953125,
-0.007770538330078125,
0.013885498046875,
0.056915283203125,
0.03961181640625,
-0.0004940032958984375,
0.0269317626953125,
0.0343017578125,
-0.0201263427734375,
0.039703369140625,
0.0286102294921875,
-0.0201263427734375,
-0.0455322265625,
0.0718994140625,
0.02081298828125,
0.020751953125,
0.00238800048828125,
0.02117919921875,
-0.03839111328125,
-0.019744873046875,
-0.0303802490234375,
0.0262908935546875,
-0.045135498046875,
-0.01427459716796875,
-0.051361083984375,
-0.0445556640625,
-0.062164306640625,
0.0067138671875,
-0.038299560546875,
-0.0360107421875,
-0.03094482421875,
-0.0004878044128417969,
0.03668212890625,
0.0289459228515625,
-0.0243377685546875,
0.05670166015625,
-0.0316162109375,
0.0263519287109375,
0.0193939208984375,
-0.00644683837890625,
-0.028594970703125,
-0.088623046875,
-0.021209716796875,
0.012481689453125,
-0.021728515625,
-0.05767822265625,
0.0276336669921875,
0.029541015625,
0.0212860107421875,
0.0247650146484375,
-0.0104522705078125,
0.055999755859375,
-0.041595458984375,
0.06884765625,
0.005695343017578125,
-0.08978271484375,
0.058624267578125,
-0.01183319091796875,
0.0184326171875,
0.03778076171875,
0.00762176513671875,
-0.03656005859375,
-0.018798828125,
-0.059326171875,
-0.07794189453125,
0.07965087890625,
0.0318603515625,
-0.0037078857421875,
0.0199737548828125,
0.00787353515625,
-0.01399993896484375,
0.00551605224609375,
-0.0631103515625,
-0.03192138671875,
-0.02178955078125,
-0.02496337890625,
-0.02685546875,
-0.0251922607421875,
-0.0032978057861328125,
-0.0229034423828125,
0.07000732421875,
0.01293182373046875,
0.05596923828125,
0.0284576416015625,
-0.013397216796875,
0.01561737060546875,
0.01343536376953125,
0.04437255859375,
0.00611114501953125,
-0.018585205078125,
-0.0038852691650390625,
0.021575927734375,
-0.034088134765625,
0.005374908447265625,
0.044677734375,
-0.0173797607421875,
0.0091705322265625,
0.029815673828125,
0.0870361328125,
0.0167236328125,
-0.04498291015625,
0.052978515625,
0.00469970703125,
-0.0209503173828125,
-0.0609130859375,
0.0010538101196289062,
0.0272979736328125,
0.03753662109375,
0.023345947265625,
0.0016231536865234375,
0.022064208984375,
-0.033966064453125,
0.03955078125,
0.00927734375,
-0.042877197265625,
-0.01476287841796875,
0.0673828125,
-0.0007605552673339844,
-0.036346435546875,
0.036865234375,
-0.0166015625,
-0.0285491943359375,
0.036865234375,
0.052886962890625,
0.06683349609375,
-0.03314208984375,
-0.0017852783203125,
0.033233642578125,
0.01617431640625,
-0.0040740966796875,
0.03155517578125,
0.003620147705078125,
-0.053741455078125,
-0.0186309814453125,
-0.06060791015625,
0.006229400634765625,
0.01294708251953125,
-0.05511474609375,
0.03369140625,
-0.0203857421875,
-0.0206146240234375,
0.01308441162109375,
0.0010194778442382812,
-0.04803466796875,
0.0229339599609375,
0.009796142578125,
0.059234619140625,
-0.0643310546875,
0.07269287109375,
0.027069091796875,
-0.039031982421875,
-0.082275390625,
0.005939483642578125,
0.0050506591796875,
-0.06365966796875,
0.050445556640625,
0.033447265625,
-0.01348876953125,
0.017791748046875,
-0.028594970703125,
-0.0706787109375,
0.08990478515625,
0.0194091796875,
-0.040740966796875,
0.0002779960632324219,
0.00998687744140625,
0.034332275390625,
-0.0228271484375,
0.02349853515625,
0.057373046875,
0.022857666015625,
-0.0008215904235839844,
-0.0828857421875,
-0.00408935546875,
-0.012359619140625,
-0.00432586669921875,
-0.029876708984375,
-0.056121826171875,
0.06756591796875,
-0.00742340087890625,
-0.0212249755859375,
0.01117706298828125,
0.061492919921875,
0.0295867919921875,
0.0323486328125,
0.048492431640625,
0.039398193359375,
0.04791259765625,
-0.01213836669921875,
0.05389404296875,
-0.0189056396484375,
0.037109375,
0.09210205078125,
-0.0018911361694335938,
0.09014892578125,
0.0294647216796875,
-0.017059326171875,
0.038787841796875,
0.04510498046875,
-0.002704620361328125,
0.04522705078125,
0.01172637939453125,
-0.00681304931640625,
0.004901885986328125,
-0.0016508102416992188,
-0.033050537109375,
0.07275390625,
0.0292816162109375,
-0.0301666259765625,
0.016754150390625,
0.017364501953125,
0.0080413818359375,
-0.0035953521728515625,
-0.007526397705078125,
0.0626220703125,
0.01348876953125,
-0.023101806640625,
0.05340576171875,
0.00335693359375,
0.06256103515625,
-0.04010009765625,
0.03021240234375,
0.005764007568359375,
0.0166473388671875,
-0.01244354248046875,
-0.039703369140625,
0.026763916015625,
0.0003943443298339844,
-0.0102386474609375,
-0.0098876953125,
0.046295166015625,
-0.054046630859375,
-0.0438232421875,
0.03466796875,
0.021728515625,
0.02874755859375,
-0.0006542205810546875,
-0.06317138671875,
0.0111846923828125,
0.0242767333984375,
-0.0333251953125,
-0.0035076141357421875,
0.0158843994140625,
0.0160675048828125,
0.0340576171875,
0.044464111328125,
0.00830841064453125,
-0.002811431884765625,
0.0093536376953125,
0.052703857421875,
-0.044708251953125,
-0.050323486328125,
-0.052703857421875,
0.054412841796875,
-0.0178680419921875,
-0.0102386474609375,
0.0531005859375,
0.063720703125,
0.06622314453125,
-0.0081634521484375,
0.0565185546875,
-0.0152130126953125,
0.06170654296875,
-0.045562744140625,
0.0706787109375,
-0.047637939453125,
0.00815582275390625,
-0.0133819580078125,
-0.0491943359375,
0.00007170438766479492,
0.074462890625,
-0.02545166015625,
0.00446319580078125,
0.045654296875,
0.07940673828125,
-0.0025424957275390625,
0.004589080810546875,
0.00769805908203125,
0.03961181640625,
0.0166168212890625,
0.0281982421875,
0.0310516357421875,
-0.07244873046875,
0.0565185546875,
-0.0249786376953125,
-0.0146636962890625,
-0.0170440673828125,
-0.0250396728515625,
-0.0584716796875,
-0.060638427734375,
-0.0211181640625,
-0.05523681640625,
0.0053253173828125,
0.07501220703125,
0.0391845703125,
-0.06982421875,
-0.04150390625,
0.009521484375,
-0.0243072509765625,
-0.016143798828125,
-0.0199737548828125,
0.053009033203125,
-0.0055694580078125,
-0.05950927734375,
0.039764404296875,
-0.002964019775390625,
0.004985809326171875,
0.004482269287109375,
-0.007320404052734375,
-0.0193328857421875,
0.00478363037109375,
0.034942626953125,
0.00780487060546875,
-0.0584716796875,
-0.02471923828125,
-0.00222015380859375,
-0.0099029541015625,
-0.007694244384765625,
0.032501220703125,
-0.0382080078125,
0.020965576171875,
0.0287628173828125,
0.0313720703125,
0.0692138671875,
-0.014678955078125,
0.01331329345703125,
-0.05145263671875,
0.044403076171875,
0.00995635986328125,
0.0233154296875,
0.0270538330078125,
-0.01073455810546875,
0.0340576171875,
0.026641845703125,
-0.058807373046875,
-0.062286376953125,
0.007007598876953125,
-0.0869140625,
-0.0115203857421875,
0.10870361328125,
-0.00031828880310058594,
-0.0178680419921875,
0.017791748046875,
-0.030517578125,
0.059844970703125,
-0.03692626953125,
0.03765869140625,
0.032806396484375,
-0.0103607177734375,
-0.00832366943359375,
-0.051422119140625,
0.057525634765625,
0.03460693359375,
-0.03472900390625,
-0.0026988983154296875,
0.0199737548828125,
0.039764404296875,
0.01343536376953125,
0.0643310546875,
0.002117156982421875,
0.0145263671875,
0.01143646240234375,
0.0174560546875,
0.007282257080078125,
0.000576019287109375,
-0.036895751953125,
-0.007518768310546875,
-0.01439666748046875,
-0.0243682861328125
]
] |
KoboldAI/fairseq-dense-13B-Janeway | 2022-04-07T10:51:39.000Z | [
"transformers",
"pytorch",
"xglm",
"text-generation",
"en",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-generation | KoboldAI | null | null | KoboldAI/fairseq-dense-13B-Janeway | 10 | 16,063 | transformers | 2022-04-06T14:36:12 | ---
language: en
license: mit
---
# Fairseq-dense 13B - Janeway
## Model Description
Fairseq-dense 13B-Janeway is a finetune created using Fairseq's MoE dense model.
## Training data
The training data contains around 2210 ebooks, mostly in the sci-fi and fantasy genres. The dataset is identical as dataset used by GPT-Neo-2.7B-Janeway.
Some parts of the dataset have been prepended using the following text: `[Genre: <genre1>,<genre2>]`
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/fairseq-dense-13B-Janeway')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt's all right," Janeway said. "I'm certain that you're doing your best to keep me informed of what\'s going on."'}]
```
### Limitations and Biases
Based on known problems with NLP technology, potential relevant factors include bias (gender, profession, race and religion).
### BibTeX entry and citation info
```
Artetxe et al. (2021): Efficient Large Scale Language Modeling with Mixtures of Experts
``` | 1,322 | [
[
-0.0082855224609375,
-0.046295166015625,
0.0235137939453125,
0.0206451416015625,
0.0008182525634765625,
-0.033721923828125,
-0.003414154052734375,
-0.0003161430358886719,
-0.007648468017578125,
0.047454833984375,
-0.06878662109375,
-0.034332275390625,
-0.043182373046875,
0.0160980224609375,
-0.040740966796875,
0.0791015625,
0.03125,
0.013580322265625,
0.0116424560546875,
0.008514404296875,
-0.028472900390625,
-0.0205841064453125,
-0.056243896484375,
-0.01387786865234375,
0.037109375,
0.0283203125,
0.059051513671875,
0.03564453125,
0.019561767578125,
0.020416259765625,
-0.019683837890625,
0.0025997161865234375,
-0.059478759765625,
0.007495880126953125,
-0.007419586181640625,
-0.0272674560546875,
-0.0203704833984375,
-0.0102691650390625,
0.06817626953125,
0.05572509765625,
-0.0101318359375,
0.016082763671875,
0.0153961181640625,
0.03729248046875,
-0.038665771484375,
-0.00727081298828125,
-0.05047607421875,
0.00711822509765625,
-0.0186920166015625,
0.01026153564453125,
-0.038055419921875,
-0.01480865478515625,
0.01129150390625,
-0.049102783203125,
0.031005859375,
0.01328277587890625,
0.09478759765625,
0.0192413330078125,
-0.04791259765625,
-0.006134033203125,
-0.0595703125,
0.07476806640625,
-0.0634765625,
0.0237274169921875,
0.019775390625,
0.012725830078125,
-0.01445770263671875,
-0.0743408203125,
-0.057220458984375,
0.0023670196533203125,
0.006740570068359375,
0.0240936279296875,
-0.00638580322265625,
-0.0126190185546875,
0.0350341796875,
0.024444580078125,
-0.055145263671875,
0.00409698486328125,
-0.052398681640625,
-0.023223876953125,
0.061065673828125,
0.039215087890625,
0.005268096923828125,
-0.041778564453125,
-0.026611328125,
-0.0190887451171875,
-0.033660888671875,
-0.003143310546875,
0.040191650390625,
0.0233001708984375,
-0.0078887939453125,
0.0625,
-0.0217437744140625,
0.03936767578125,
0.017791748046875,
-0.0103759765625,
0.037017822265625,
-0.018280029296875,
-0.02288818359375,
-0.00445556640625,
0.09149169921875,
0.03875732421875,
0.015350341796875,
0.0007181167602539062,
-0.0202484130859375,
-0.0267181396484375,
0.018402099609375,
-0.07550048828125,
-0.0067901611328125,
0.038818359375,
-0.054443359375,
-0.028594970703125,
0.0084228515625,
-0.03240966796875,
-0.033355712890625,
-0.0231170654296875,
0.046600341796875,
-0.044647216796875,
-0.018646240234375,
0.01224517822265625,
-0.01800537109375,
0.0122528076171875,
0.00775146484375,
-0.058868408203125,
0.00543975830078125,
0.027618408203125,
0.04522705078125,
0.005290985107421875,
-0.024932861328125,
-0.03021240234375,
0.01158905029296875,
-0.01068878173828125,
0.0426025390625,
-0.00644683837890625,
-0.0221099853515625,
0.0008006095886230469,
0.0254058837890625,
0.0009784698486328125,
-0.034088134765625,
0.051422119140625,
-0.028106689453125,
0.04541015625,
0.0035533905029296875,
-0.04974365234375,
-0.01476287841796875,
0.00881195068359375,
-0.049407958984375,
0.078369140625,
0.022247314453125,
-0.07366943359375,
0.024261474609375,
-0.0288848876953125,
-0.015045166015625,
0.000568389892578125,
0.003040313720703125,
-0.0311279296875,
-0.00418853759765625,
0.00617218017578125,
0.047515869140625,
-0.0216827392578125,
0.043609619140625,
-0.01544952392578125,
-0.03643798828125,
0.022552490234375,
-0.05322265625,
0.09149169921875,
0.034393310546875,
-0.01259613037109375,
-0.0006694793701171875,
-0.08233642578125,
-0.0067138671875,
0.01348114013671875,
-0.0198974609375,
-0.0025997161865234375,
-0.004238128662109375,
0.053314208984375,
0.006927490234375,
0.01038360595703125,
-0.033660888671875,
0.0212860107421875,
-0.042755126953125,
0.0171356201171875,
0.04302978515625,
-0.0004184246063232422,
0.041046142578125,
-0.03204345703125,
0.033416748046875,
0.0038700103759765625,
0.0098876953125,
0.006763458251953125,
-0.041778564453125,
-0.059051513671875,
-0.010345458984375,
0.049468994140625,
0.059661865234375,
-0.042144775390625,
0.042510986328125,
-0.015350341796875,
-0.06500244140625,
-0.042510986328125,
-0.01629638671875,
-0.00044417381286621094,
0.0261077880859375,
0.0305938720703125,
-0.007457733154296875,
-0.037750244140625,
-0.05926513671875,
-0.0179290771484375,
-0.0007085800170898438,
-0.0184173583984375,
0.0191192626953125,
0.023590087890625,
-0.04864501953125,
0.05828857421875,
-0.041778564453125,
-0.01483917236328125,
-0.01541900634765625,
0.01224517822265625,
0.04351806640625,
0.03997802734375,
0.0333251953125,
-0.04779052734375,
-0.035247802734375,
-0.022705078125,
-0.042755126953125,
-0.02239990234375,
-0.033416748046875,
-0.025360107421875,
-0.00605010986328125,
0.03936767578125,
-0.036773681640625,
0.0199737548828125,
0.039825439453125,
-0.051544189453125,
0.0491943359375,
0.0165863037109375,
0.0070037841796875,
-0.1141357421875,
-0.01044464111328125,
-0.00930023193359375,
-0.007411956787109375,
-0.0231475830078125,
-0.01264190673828125,
-0.00858306884765625,
-0.010894775390625,
-0.034210205078125,
0.0552978515625,
-0.0110931396484375,
0.01959228515625,
-0.0099029541015625,
-0.001861572265625,
0.01248931884765625,
0.0177154541015625,
0.0055389404296875,
0.03668212890625,
0.048370361328125,
-0.05401611328125,
0.0236358642578125,
0.036895751953125,
-0.0245513916015625,
0.0225067138671875,
-0.053802490234375,
-0.0225677490234375,
-0.0092926025390625,
0.004718780517578125,
-0.05633544921875,
-0.0216827392578125,
0.029754638671875,
-0.0277557373046875,
0.0173492431640625,
-0.0188446044921875,
-0.039398193359375,
-0.01291656494140625,
0.0095367431640625,
0.0199432373046875,
0.04266357421875,
-0.01451873779296875,
0.056243896484375,
0.008544921875,
-0.0219573974609375,
-0.054534912109375,
-0.055511474609375,
-0.01312255859375,
-0.021148681640625,
-0.062347412109375,
0.0292510986328125,
-0.006744384765625,
-0.0069122314453125,
-0.0079345703125,
0.019561767578125,
-0.021087646484375,
-0.00016319751739501953,
0.0124359130859375,
0.00855255126953125,
-0.0166015625,
0.00919342041015625,
0.0286712646484375,
-0.0037288665771484375,
0.007419586181640625,
-0.01557159423828125,
0.039093017578125,
0.00861358642578125,
0.0015010833740234375,
-0.03668212890625,
0.0272674560546875,
-0.0007052421569824219,
0.006610870361328125,
0.07275390625,
0.09490966796875,
-0.0234375,
-0.017059326171875,
-0.01385498046875,
-0.027740478515625,
-0.0362548828125,
0.045501708984375,
-0.0202484130859375,
-0.0518798828125,
0.031951904296875,
0.00875091552734375,
0.007427215576171875,
0.06927490234375,
0.049072265625,
-0.005535125732421875,
0.07012939453125,
0.0299835205078125,
0.02581787109375,
0.0361328125,
-0.036956787109375,
-0.004833221435546875,
-0.0430908203125,
-0.01358795166015625,
-0.04901123046875,
-0.019866943359375,
-0.052459716796875,
-0.0206756591796875,
0.006988525390625,
0.02032470703125,
-0.050140380859375,
0.03472900390625,
-0.038330078125,
0.0218353271484375,
0.054290771484375,
-0.00333404541015625,
0.02716064453125,
0.002044677734375,
-0.00411224365234375,
-0.006938934326171875,
-0.0706787109375,
-0.04364013671875,
0.0989990234375,
0.0229339599609375,
0.06817626953125,
0.0157470703125,
0.0655517578125,
0.0005593299865722656,
0.0094146728515625,
-0.045440673828125,
0.02215576171875,
-0.01189422607421875,
-0.080322265625,
-0.0188751220703125,
-0.04852294921875,
-0.08135986328125,
-0.002780914306640625,
-0.047576904296875,
-0.04718017578125,
0.0274658203125,
-0.00545501708984375,
-0.0213165283203125,
0.006511688232421875,
-0.064453125,
0.07720947265625,
0.0020122528076171875,
-0.0231781005859375,
0.004360198974609375,
-0.039825439453125,
0.0245361328125,
0.0009102821350097656,
0.00562286376953125,
0.0004317760467529297,
0.01203155517578125,
0.060272216796875,
-0.025360107421875,
0.059967041015625,
0.0026111602783203125,
-0.01213836669921875,
0.037689208984375,
0.003673553466796875,
0.02105712890625,
0.033660888671875,
-0.01456451416015625,
0.0174713134765625,
0.0148773193359375,
-0.031280517578125,
-0.00982666015625,
0.040008544921875,
-0.072509765625,
-0.013824462890625,
-0.053314208984375,
-0.042816162109375,
0.00083160400390625,
0.027313232421875,
0.06182861328125,
0.051361083984375,
-0.01229095458984375,
0.0240631103515625,
0.03582763671875,
-0.0241241455078125,
0.022216796875,
0.01375579833984375,
-0.0268707275390625,
-0.03009033203125,
0.06201171875,
0.01168060302734375,
0.01128387451171875,
0.0186920166015625,
0.01181793212890625,
-0.030487060546875,
-0.033538818359375,
-0.0273284912109375,
0.012359619140625,
-0.045867919921875,
-0.0219573974609375,
-0.06878662109375,
-0.045501708984375,
-0.035369873046875,
0.0063629150390625,
-0.0258941650390625,
-0.046295166015625,
-0.04107666015625,
0.0167388916015625,
0.0296783447265625,
0.04986572265625,
-0.007244110107421875,
0.052093505859375,
-0.0765380859375,
0.0266571044921875,
0.01220703125,
0.01535797119140625,
-0.0038776397705078125,
-0.07073974609375,
-0.0310211181640625,
0.01322174072265625,
-0.056121826171875,
-0.055908203125,
0.0277557373046875,
0.01031494140625,
0.0208282470703125,
0.028778076171875,
0.0113525390625,
0.019500732421875,
-0.043853759765625,
0.06689453125,
0.007110595703125,
-0.0556640625,
0.0229949951171875,
-0.048309326171875,
0.02655029296875,
0.0280609130859375,
0.042724609375,
-0.03802490234375,
-0.03741455078125,
-0.07501220703125,
-0.07086181640625,
0.0506591796875,
0.040283203125,
0.0216827392578125,
-0.007152557373046875,
0.005786895751953125,
0.023223876953125,
0.01177215576171875,
-0.08441162109375,
-0.035430908203125,
-0.0294342041015625,
-0.04437255859375,
0.005462646484375,
-0.0150909423828125,
0.0008525848388671875,
-0.01213836669921875,
0.07281494140625,
-0.0036602020263671875,
0.0318603515625,
0.0087432861328125,
-0.00035953521728515625,
0.0162506103515625,
0.019622802734375,
0.00847625732421875,
0.032196044921875,
-0.032379150390625,
-0.036346435546875,
-0.00690460205078125,
-0.052215576171875,
-0.01230621337890625,
0.0408935546875,
-0.03326416015625,
0.0115966796875,
0.02313232421875,
0.061431884765625,
-0.0135040283203125,
-0.043243408203125,
0.034088134765625,
-0.01198577880859375,
-0.0335693359375,
-0.0360107421875,
0.0174102783203125,
0.015472412109375,
0.037200927734375,
0.02655029296875,
0.0159759521484375,
0.041107177734375,
-0.023223876953125,
0.00795745849609375,
0.00965118408203125,
-0.0278167724609375,
0.0095977783203125,
0.06341552734375,
0.020294189453125,
-0.005794525146484375,
0.06494140625,
-0.03094482421875,
-0.0097198486328125,
0.047393798828125,
0.04144287109375,
0.059967041015625,
0.0227508544921875,
0.0190582275390625,
0.055877685546875,
0.023529052734375,
-0.005615234375,
0.03021240234375,
0.036956787109375,
-0.06689453125,
-0.042572021484375,
-0.056640625,
-0.0106658935546875,
0.0222625732421875,
-0.043701171875,
0.050323486328125,
-0.0212249755859375,
-0.0293426513671875,
-0.00342559814453125,
-0.0037841796875,
-0.061798095703125,
0.036468505859375,
0.0143890380859375,
0.06072998046875,
-0.05419921875,
0.034210205078125,
0.037109375,
-0.035186767578125,
-0.0906982421875,
0.01175689697265625,
-0.00563812255859375,
-0.0284423828125,
0.0323486328125,
0.0179901123046875,
0.03485107421875,
0.0225677490234375,
-0.053009033203125,
-0.06396484375,
0.08636474609375,
0.00194549560546875,
-0.05572509765625,
0.0018053054809570312,
-0.00885772705078125,
0.0205078125,
-0.0221710205078125,
0.038665771484375,
0.01403045654296875,
0.028778076171875,
-0.0161895751953125,
-0.03985595703125,
-0.01287841796875,
-0.0207977294921875,
-0.00449371337890625,
-0.00047969818115234375,
-0.047088623046875,
0.066162109375,
-0.00321197509765625,
-0.01239776611328125,
0.03173828125,
0.063720703125,
0.02728271484375,
0.0213775634765625,
0.038818359375,
0.037506103515625,
0.059844970703125,
-0.0239715576171875,
0.07135009765625,
0.00213623046875,
0.052490234375,
0.07147216796875,
0.0034942626953125,
0.03955078125,
0.01061248779296875,
-0.013153076171875,
0.0540771484375,
0.058563232421875,
-0.0294952392578125,
0.0318603515625,
0.0101165771484375,
0.009063720703125,
-0.022369384765625,
-0.00273895263671875,
-0.048553466796875,
0.02716064453125,
0.0203399658203125,
-0.03369140625,
0.015045166015625,
-0.0005908012390136719,
-0.0014390945434570312,
-0.002685546875,
-0.0177154541015625,
0.0408935546875,
0.002834320068359375,
-0.024658203125,
0.051177978515625,
0.0121002197265625,
0.0552978515625,
-0.042938232421875,
0.00910186767578125,
-0.0019207000732421875,
0.01456451416015625,
-0.013153076171875,
-0.06512451171875,
0.008819580078125,
-0.007602691650390625,
-0.03887939453125,
-0.003047943115234375,
0.0406494140625,
-0.052093505859375,
-0.06512451171875,
-0.00047969818115234375,
0.03564453125,
0.017913818359375,
-0.005924224853515625,
-0.0679931640625,
-0.01029205322265625,
0.015716552734375,
-0.03277587890625,
-0.0030460357666015625,
0.06103515625,
0.01568603515625,
0.042633056640625,
0.03289794921875,
-0.006175994873046875,
0.008758544921875,
0.0271759033203125,
0.041259765625,
-0.054779052734375,
-0.030517578125,
-0.0290069580078125,
0.06854248046875,
-0.01033782958984375,
-0.0531005859375,
0.04583740234375,
0.0430908203125,
0.060211181640625,
-0.028228759765625,
0.05694580078125,
-0.0270538330078125,
0.041015625,
-0.050323486328125,
0.0667724609375,
-0.059051513671875,
-0.00001341104507446289,
-0.027435302734375,
-0.09454345703125,
0.004444122314453125,
0.051666259765625,
0.004795074462890625,
0.0204010009765625,
0.0635986328125,
0.068359375,
-0.01461029052734375,
0.00728607177734375,
-0.0006418228149414062,
0.035797119140625,
-0.001979827880859375,
0.030517578125,
0.05841064453125,
-0.07763671875,
0.05706787109375,
-0.025787353515625,
-0.00241851806640625,
-0.01328277587890625,
-0.06036376953125,
-0.07098388671875,
-0.04345703125,
-0.02667236328125,
-0.06817626953125,
-0.0236663818359375,
0.056640625,
0.043701171875,
-0.05523681640625,
-0.01345062255859375,
-0.00240325927734375,
-0.0016527175903320312,
-0.016387939453125,
-0.0245513916015625,
0.017578125,
-0.012115478515625,
-0.0775146484375,
0.01312255859375,
-0.031341552734375,
0.016357421875,
-0.006519317626953125,
-0.01189422607421875,
-0.0209808349609375,
0.004459381103515625,
0.0192413330078125,
-0.01300811767578125,
-0.0286102294921875,
0.003040313720703125,
0.0061187744140625,
-0.01983642578125,
-0.0189056396484375,
0.0186920166015625,
-0.042449951171875,
0.01146697998046875,
0.037689208984375,
0.0335693359375,
0.0333251953125,
-0.0126800537109375,
0.057952880859375,
-0.05499267578125,
0.01837158203125,
0.0218963623046875,
0.0250701904296875,
0.0247650146484375,
-0.0260772705078125,
0.0438232421875,
0.0283660888671875,
-0.06146240234375,
-0.0439453125,
0.01119232177734375,
-0.07891845703125,
-0.0230865478515625,
0.1026611328125,
0.0266265869140625,
-0.01032257080078125,
-0.0019464492797851562,
-0.035675048828125,
0.0226593017578125,
-0.03619384765625,
0.034881591796875,
0.0595703125,
0.028228759765625,
0.0041656494140625,
-0.04095458984375,
0.0296478271484375,
0.0311431884765625,
-0.0266571044921875,
-0.0125885009765625,
0.035675048828125,
0.0205078125,
0.0142364501953125,
0.037689208984375,
0.0002970695495605469,
0.0149383544921875,
0.0237274169921875,
0.01305389404296875,
0.0126190185546875,
-0.01477813720703125,
-0.004322052001953125,
-0.006801605224609375,
-0.010223388671875,
0.0171356201171875
]
] |
line-corporation/line-distilbert-base-japanese | 2023-03-22T06:05:24.000Z | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"ja",
"license:apache-2.0",
"autotrain_compatible",
"region:us"
] | fill-mask | line-corporation | null | null | line-corporation/line-distilbert-base-japanese | 26 | 16,049 | transformers | 2023-03-09T08:52:55 | ---
inference: false
language: ja
license: apache-2.0
mask_token: "[MASK]"
widget:
- text: "LINE株式会社で[MASK]の研究・開発をしている。"
---
# LINE DistilBERT Japanese
This is a DistilBERT model pre-trained on 131 GB of Japanese web text.
The teacher model is BERT-base that built in-house at LINE.
The model was trained by [LINE Corporation](https://linecorp.com/).
## For Japanese
https://github.com/line/LINE-DistilBERT-Japanese/blob/main/README_ja.md is written in Japanese.
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("line-corporation/line-distilbert-base-japanese", trust_remote_code=True)
model = AutoModel.from_pretrained("line-corporation/line-distilbert-base-japanese")
sentence = "LINE株式会社で[MASK]の研究・開発をしている。"
print(model(**tokenizer(sentence, return_tensors="pt")))
```
### Requirements
```txt
fugashi
sentencepiece
unidic-lite
```
## Model architecture
The model architecture is the DitilBERT base model; 6 layers, 768 dimensions of hidden states, 12 attention heads, 66M parameters.
## Evaluation
The evaluation by [JGLUE](https://github.com/yahoojapan/JGLUE) is as follows:
| model name | #Params | Marc_ja | JNLI | JSTS | JSQuAD | JCommonSenseQA |
|------------------------|:-------:|:-------:|:----:|:----------------:|:---------:|:--------------:|
| | | acc | acc | Pearson/Spearman | EM/F1 | acc |
| LINE-DistilBERT | 68M | 95.6 | 88.9 | 89.2/85.1 | 87.3/93.3 | 76.1 |
| Laboro-DistilBERT | 68M | 94.7 | 82.0 | 87.4/82.7 | 70.2/87.3 | 73.2 |
| BandaiNamco-DistilBERT | 68M | 94.6 | 81.6 | 86.8/82.1 | 80.0/88.0 | 66.5 |
## Tokenization
The texts are first tokenized by MeCab with the Unidic dictionary and then split into subwords by the SentencePiece algorithm. The vocabulary size is 32768.
## Licenses
The pretrained models are distributed under the terms of the [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0).
## To cite this work
We haven't published any paper on this work. Please cite [this GitHub repository](http://github.com/line/LINE-DistilBERT-Japanese):
```
@article{LINE DistilBERT Japanese,
title = {LINE DistilBERT Japanese},
author = {"Koga, Kobayashi and Li, Shengzhe and Nakamachi, Akifumi and Sato, Toshinori"},
year = {2023},
howpublished = {\url{http://github.com/line/LINE-DistilBERT-Japanese}}
}
``` | 2,519 | [
[
-0.0287322998046875,
-0.0670166015625,
0.02899169921875,
0.019134521484375,
-0.0328369140625,
0.000888824462890625,
-0.0032482147216796875,
-0.022186279296875,
0.02203369140625,
0.019073486328125,
-0.060089111328125,
-0.039886474609375,
-0.0560302734375,
0.01021575927734375,
-0.020599365234375,
0.0750732421875,
-0.007663726806640625,
0.0035266876220703125,
0.00018489360809326172,
0.0091400146484375,
-0.02197265625,
-0.0274505615234375,
-0.0374755859375,
-0.0294036865234375,
0.0153045654296875,
0.01309967041015625,
0.046722412109375,
0.038360595703125,
0.02484130859375,
0.0266265869140625,
-0.00908660888671875,
-0.0096435546875,
-0.028533935546875,
-0.0208740234375,
0.004608154296875,
-0.035003662109375,
-0.0322265625,
0.00225830078125,
0.0254058837890625,
0.060089111328125,
0.0032787322998046875,
0.014007568359375,
0.007152557373046875,
0.04791259765625,
-0.02581787109375,
0.0210418701171875,
-0.037139892578125,
0.004283905029296875,
-0.0183868408203125,
0.0118255615234375,
-0.02032470703125,
-0.0041046142578125,
0.0219573974609375,
-0.062103271484375,
0.00696563720703125,
-0.0018367767333984375,
0.10394287109375,
0.0303497314453125,
-0.01502227783203125,
-0.02606201171875,
-0.02081298828125,
0.055755615234375,
-0.07293701171875,
0.007396697998046875,
0.0372314453125,
0.002918243408203125,
-0.01178741455078125,
-0.072021484375,
-0.045806884765625,
-0.00479888916015625,
-0.024322509765625,
0.0212554931640625,
-0.0216827392578125,
-0.0015897750854492188,
0.039947509765625,
0.036895751953125,
-0.048065185546875,
0.011566162109375,
-0.025360107421875,
-0.005702972412109375,
0.04547119140625,
0.009552001953125,
0.03131103515625,
-0.03546142578125,
-0.03143310546875,
-0.0149383544921875,
-0.01392364501953125,
0.01313018798828125,
0.034515380859375,
0.00820159912109375,
-0.02587890625,
0.038543701171875,
-0.0311279296875,
0.046295166015625,
0.0132293701171875,
-0.0174102783203125,
0.039947509765625,
-0.0213775634765625,
-0.01203155517578125,
0.00968170166015625,
0.09442138671875,
0.0305023193359375,
0.0196990966796875,
0.01271820068359375,
-0.007335662841796875,
-0.00809478759765625,
0.00337982177734375,
-0.07586669921875,
-0.03271484375,
0.00211334228515625,
-0.0379638671875,
-0.02001953125,
0.0224761962890625,
-0.052978515625,
0.0079345703125,
-0.0059661865234375,
0.044281005859375,
-0.04278564453125,
-0.0130157470703125,
0.01508331298828125,
-0.0136260986328125,
0.006969451904296875,
0.000957489013671875,
-0.07342529296875,
0.0114898681640625,
0.0252685546875,
0.0576171875,
0.02423095703125,
-0.0311126708984375,
0.0087432861328125,
-0.0166168212890625,
-0.013031005859375,
0.0236053466796875,
-0.004608154296875,
-0.0296630859375,
-0.0174102783203125,
0.004695892333984375,
-0.00481414794921875,
-0.01641845703125,
0.034820556640625,
-0.0235748291015625,
0.047454833984375,
-0.007167816162109375,
-0.0423583984375,
-0.01067352294921875,
0.00856781005859375,
-0.047821044921875,
0.0740966796875,
0.0186920166015625,
-0.08062744140625,
0.0286102294921875,
-0.05230712890625,
-0.031341552734375,
0.00734710693359375,
0.0026397705078125,
-0.045562744140625,
0.0019588470458984375,
0.030731201171875,
0.03314208984375,
-0.0034313201904296875,
0.023193359375,
-0.018280029296875,
-0.0301361083984375,
0.021575927734375,
-0.03826904296875,
0.09954833984375,
0.032501220703125,
-0.0293731689453125,
-0.0169830322265625,
-0.06695556640625,
0.0028324127197265625,
0.032440185546875,
-0.0201568603515625,
-0.0513916015625,
-0.01413726806640625,
0.00194549560546875,
0.005786895751953125,
0.057403564453125,
-0.032806396484375,
0.0233612060546875,
-0.038848876953125,
0.041015625,
0.041351318359375,
-0.00004124641418457031,
0.0157318115234375,
-0.01071929931640625,
0.0263214111328125,
0.01751708984375,
0.021575927734375,
-0.016693115234375,
-0.0195159912109375,
-0.08172607421875,
-0.0283355712890625,
0.028564453125,
0.033935546875,
-0.07037353515625,
0.0693359375,
-0.0289154052734375,
-0.048736572265625,
-0.04547119140625,
-0.006256103515625,
0.0355224609375,
0.046234130859375,
0.034149169921875,
-0.004566192626953125,
-0.0499267578125,
-0.0732421875,
-0.0117340087890625,
-0.0183868408203125,
0.007663726806640625,
-0.0139312744140625,
0.04180908203125,
-0.0266265869140625,
0.07208251953125,
-0.062744140625,
-0.009765625,
-0.0209503173828125,
0.0191192626953125,
0.050933837890625,
0.036590576171875,
0.042449951171875,
-0.04681396484375,
-0.05657958984375,
-0.0266571044921875,
-0.05145263671875,
-0.01316070556640625,
-0.0036754608154296875,
-0.016571044921875,
0.004486083984375,
0.0303802490234375,
-0.033447265625,
0.033233642578125,
0.031005859375,
-0.039764404296875,
0.0313720703125,
-0.0223846435546875,
-0.0081024169921875,
-0.12371826171875,
0.020050048828125,
0.007007598876953125,
-0.0159149169921875,
-0.043426513671875,
0.00264739990234375,
0.001789093017578125,
0.003505706787109375,
-0.0263671875,
0.031463623046875,
-0.039337158203125,
0.01401519775390625,
-0.0136871337890625,
-0.005390167236328125,
0.0166015625,
0.05145263671875,
0.0029621124267578125,
0.04766845703125,
0.04022216796875,
-0.040618896484375,
0.028167724609375,
0.032135009765625,
-0.0499267578125,
0.0250244140625,
-0.058502197265625,
-0.00634765625,
-0.0004944801330566406,
0.006473541259765625,
-0.0753173828125,
-0.004589080810546875,
0.02264404296875,
-0.03704833984375,
0.0215606689453125,
-0.003276824951171875,
-0.040618896484375,
-0.045654296875,
-0.022216796875,
0.00890350341796875,
0.039764404296875,
-0.0261383056640625,
0.0186614990234375,
0.0189666748046875,
-0.00983428955078125,
-0.059173583984375,
-0.07757568359375,
-0.01406097412109375,
-0.0183563232421875,
-0.0244598388671875,
0.0306396484375,
-0.013031005859375,
-0.0037078857421875,
0.006862640380859375,
-0.0074920654296875,
-0.0275115966796875,
0.02545166015625,
0.0018358230590820312,
0.0452880859375,
-0.00588226318359375,
-0.0178680419921875,
0.019683837890625,
-0.01349639892578125,
0.004024505615234375,
-0.004241943359375,
0.05999755859375,
-0.0106353759765625,
-0.009307861328125,
-0.0443115234375,
0.00025081634521484375,
0.0350341796875,
-0.0026683807373046875,
0.04986572265625,
0.04339599609375,
-0.019195556640625,
-0.0009522438049316406,
-0.0214385986328125,
-0.0032672882080078125,
-0.036834716796875,
0.04815673828125,
-0.044097900390625,
-0.0494384765625,
0.045135498046875,
0.01024627685546875,
0.0075531005859375,
0.061279296875,
0.0223846435546875,
-0.0081024169921875,
0.07269287109375,
0.03668212890625,
-0.0305023193359375,
0.034271240234375,
-0.045806884765625,
0.00804901123046875,
-0.058135986328125,
-0.0126495361328125,
-0.02630615234375,
-0.028564453125,
-0.05316162109375,
-0.004093170166015625,
0.0195159912109375,
0.026123046875,
-0.0210723876953125,
0.05181884765625,
-0.045379638671875,
0.034271240234375,
0.041534423828125,
0.0024166107177734375,
0.0272064208984375,
-0.00008678436279296875,
-0.033203125,
-0.0082244873046875,
-0.049652099609375,
-0.0367431640625,
0.06634521484375,
0.0396728515625,
0.059661865234375,
-0.0007104873657226562,
0.06640625,
-0.00829315185546875,
-0.005962371826171875,
-0.059356689453125,
0.03778076171875,
-0.02862548828125,
-0.063232421875,
-0.0322265625,
-0.03912353515625,
-0.06817626953125,
0.02142333984375,
0.0009179115295410156,
-0.04864501953125,
-0.005313873291015625,
-0.011322021484375,
-0.0252838134765625,
0.01064300537109375,
-0.051544189453125,
0.0771484375,
-0.033294677734375,
-0.0014495849609375,
0.004642486572265625,
-0.04986572265625,
0.0232391357421875,
-0.00229644775390625,
0.01544189453125,
-0.007419586181640625,
0.005161285400390625,
0.0662841796875,
-0.0540771484375,
0.058746337890625,
-0.0194091796875,
0.0004761219024658203,
0.0193023681640625,
-0.01702880859375,
0.0139007568359375,
0.01207733154296875,
0.01430511474609375,
0.031341552734375,
0.0149383544921875,
-0.0171966552734375,
-0.02484130859375,
0.048828125,
-0.086181640625,
-0.03399658203125,
-0.045806884765625,
-0.025665283203125,
0.0049896240234375,
0.046478271484375,
0.0565185546875,
0.0226898193359375,
-0.0035800933837890625,
0.0247650146484375,
0.03900146484375,
-0.01282501220703125,
0.052337646484375,
0.04766845703125,
-0.0230712890625,
-0.033966064453125,
0.0494384765625,
0.0219268798828125,
0.010284423828125,
0.04473876953125,
0.019561767578125,
-0.0391845703125,
-0.0195465087890625,
-0.0350341796875,
0.0265655517578125,
-0.04339599609375,
0.00394439697265625,
-0.042572021484375,
-0.036956787109375,
-0.033203125,
0.0033245086669921875,
-0.0272064208984375,
-0.03057861328125,
-0.047821044921875,
-0.004974365234375,
0.02520751953125,
0.0352783203125,
0.022613525390625,
0.0274200439453125,
-0.046142578125,
0.01534271240234375,
0.0146026611328125,
0.0199432373046875,
-0.003124237060546875,
-0.061279296875,
-0.027374267578125,
0.0162200927734375,
-0.0283050537109375,
-0.048736572265625,
0.03643798828125,
-0.0016727447509765625,
0.048583984375,
0.0341796875,
0.0169219970703125,
0.075927734375,
-0.034576416015625,
0.057708740234375,
0.035675048828125,
-0.083251953125,
0.04302978515625,
-0.01331329345703125,
0.040283203125,
0.064453125,
0.055023193359375,
-0.045745849609375,
-0.02777099609375,
-0.06134033203125,
-0.07354736328125,
0.06585693359375,
0.01044464111328125,
0.01641845703125,
0.0015077590942382812,
0.0106964111328125,
0.0010557174682617188,
0.0187225341796875,
-0.08184814453125,
-0.04034423828125,
-0.0400390625,
-0.049774169921875,
-0.005573272705078125,
-0.019439697265625,
0.01216888427734375,
-0.0214385986328125,
0.058990478515625,
0.001949310302734375,
0.00824737548828125,
0.0217132568359375,
-0.0164794921875,
0.0175018310546875,
0.006961822509765625,
0.03521728515625,
0.039703369140625,
-0.0111083984375,
0.0006532669067382812,
0.01293182373046875,
-0.049774169921875,
0.00719451904296875,
0.024993896484375,
-0.02630615234375,
0.01861572265625,
0.034088134765625,
0.0841064453125,
0.00933837890625,
-0.035614013671875,
0.04351806640625,
-0.00604248046875,
-0.033721923828125,
-0.050750732421875,
0.0009241104125976562,
-0.0025806427001953125,
0.00868988037109375,
0.03436279296875,
-0.0233306884765625,
0.005825042724609375,
-0.03704833984375,
0.0068206787109375,
0.01751708984375,
-0.030731201171875,
-0.01360321044921875,
0.04754638671875,
0.0103759765625,
-0.0018854141235351562,
0.04864501953125,
-0.0139617919921875,
-0.0430908203125,
0.035369873046875,
0.03289794921875,
0.05767822265625,
-0.0038909912109375,
-0.00041866302490234375,
0.04901123046875,
0.0181732177734375,
-0.006221771240234375,
0.014617919921875,
0.00848388671875,
-0.05078125,
-0.017181396484375,
-0.060150146484375,
0.003940582275390625,
0.038848876953125,
-0.051910400390625,
0.0272369384765625,
-0.044281005859375,
-0.0188446044921875,
-0.00487518310546875,
0.027923583984375,
-0.02386474609375,
0.0182647705078125,
0.006282806396484375,
0.05780029296875,
-0.06219482421875,
0.07354736328125,
0.05010986328125,
-0.056671142578125,
-0.0811767578125,
0.000896453857421875,
-0.02099609375,
-0.058074951171875,
0.053497314453125,
0.0231170654296875,
0.02227783203125,
0.0212554931640625,
-0.023193359375,
-0.06817626953125,
0.0980224609375,
0.0031795501708984375,
-0.06256103515625,
-0.0166473388671875,
0.0179901123046875,
0.042205810546875,
-0.00554656982421875,
0.037261962890625,
0.0244903564453125,
0.0236053466796875,
0.00855255126953125,
-0.07958984375,
0.004573822021484375,
-0.03271484375,
0.0236663818359375,
0.007610321044921875,
-0.059478759765625,
0.08087158203125,
0.01107025146484375,
-0.007076263427734375,
0.016021728515625,
0.043701171875,
0.02392578125,
0.009521484375,
0.044586181640625,
0.0726318359375,
0.04205322265625,
-0.01517486572265625,
0.04638671875,
-0.0293426513671875,
0.0516357421875,
0.07525634765625,
0.0023937225341796875,
0.046661376953125,
0.035186767578125,
-0.028045654296875,
0.06304931640625,
0.043121337890625,
-0.01776123046875,
0.05194091796875,
0.0087432861328125,
0.005680084228515625,
0.00653076171875,
0.0184478759765625,
-0.028106689453125,
0.04156494140625,
0.01068878173828125,
-0.039947509765625,
-0.006687164306640625,
-0.0014257431030273438,
0.035003662109375,
-0.004199981689453125,
-0.0177154541015625,
0.057769775390625,
0.00933837890625,
-0.05712890625,
0.0606689453125,
0.00872802734375,
0.06768798828125,
-0.056365966796875,
0.0160369873046875,
-0.00997161865234375,
0.0235595703125,
-0.0085601806640625,
-0.054229736328125,
0.0150299072265625,
0.0171356201171875,
-0.01143646240234375,
-0.01385498046875,
0.030670166015625,
-0.03643798828125,
-0.049163818359375,
0.0101776123046875,
0.0186920166015625,
0.01273345947265625,
0.016571044921875,
-0.0782470703125,
-0.004886627197265625,
0.022216796875,
-0.0380859375,
0.01425933837890625,
0.0400390625,
0.01457977294921875,
0.0360107421875,
0.047149658203125,
-0.00382232666015625,
0.0036640167236328125,
0.01276397705078125,
0.064453125,
-0.035675048828125,
-0.05364990234375,
-0.0833740234375,
0.058807373046875,
-0.0288848876953125,
-0.039093017578125,
0.06524658203125,
0.053070068359375,
0.0721435546875,
-0.01922607421875,
0.05206298828125,
-0.02484130859375,
0.038848876953125,
-0.048004150390625,
0.0745849609375,
-0.053497314453125,
0.0028743743896484375,
-0.03521728515625,
-0.06524658203125,
-0.001800537109375,
0.0750732421875,
-0.023590087890625,
0.0080108642578125,
0.060455322265625,
0.0443115234375,
-0.0003046989440917969,
-0.0231475830078125,
0.0108642578125,
0.0241241455078125,
0.0157318115234375,
0.059234619140625,
0.05670166015625,
-0.06201171875,
0.03509521484375,
-0.046600341796875,
-0.01200103759765625,
-0.0169525146484375,
-0.057037353515625,
-0.0745849609375,
-0.053924560546875,
-0.02813720703125,
-0.0123138427734375,
-0.0350341796875,
0.060791015625,
0.046661376953125,
-0.044219970703125,
-0.0201873779296875,
-0.027618408203125,
-0.0203704833984375,
0.0015668869018554688,
-0.023406982421875,
0.0184478759765625,
-0.019500732421875,
-0.076171875,
0.00852203369140625,
-0.01503753662109375,
0.0273590087890625,
-0.0226898193359375,
-0.003421783447265625,
-0.033294677734375,
-0.01548004150390625,
0.0201263427734375,
-0.0037593841552734375,
-0.050079345703125,
-0.002239227294921875,
0.0003082752227783203,
-0.0267486572265625,
-0.0149993896484375,
0.0125274658203125,
-0.0286865234375,
0.031463623046875,
0.040374755859375,
0.0168304443359375,
0.04681396484375,
-0.005641937255859375,
0.01287078857421875,
-0.061614990234375,
0.040771484375,
0.00677490234375,
0.04925537109375,
0.01155853271484375,
-0.022216796875,
0.03021240234375,
0.0328369140625,
-0.028594970703125,
-0.047210693359375,
-0.0006923675537109375,
-0.07281494140625,
-0.0418701171875,
0.0640869140625,
-0.028900146484375,
-0.0265350341796875,
0.01491546630859375,
-0.0178070068359375,
0.041473388671875,
-0.0159454345703125,
0.04998779296875,
0.0699462890625,
0.018341064453125,
-0.0059661865234375,
-0.021636962890625,
0.0184478759765625,
0.0231170654296875,
-0.04681396484375,
-0.0186309814453125,
0.02813720703125,
0.03570556640625,
0.0316162109375,
0.044952392578125,
-0.00783538818359375,
-0.006664276123046875,
0.002323150634765625,
0.0132904052734375,
-0.017822265625,
-0.00711822509765625,
-0.0137939453125,
-0.017425537109375,
0.00493621826171875,
-0.025054931640625
]
] |
warp-ai/wuerstchen | 2023-09-22T21:53:14.000Z | [
"diffusers",
"text-to-image",
"wuerstchen",
"arxiv:2306.00637",
"arxiv:1910.09700",
"license:mit",
"has_space",
"diffusers:WuerstchenDecoderPipeline",
"region:us"
] | text-to-image | warp-ai | null | null | warp-ai/wuerstchen | 132 | 16,039 | diffusers | 2023-07-19T19:10:32 | ---
license: mit
prior:
- warp-diffusion/wuerstchen-prior
tags:
- text-to-image
- wuerstchen
---
<img src="https://cdn-uploads.huggingface.co/production/uploads/634cb5eefb80cc6bcaf63c3e/i-DYpDHw8Pwiy7QBKZVR5.jpeg" width=1500>
## Würstchen - Overview
Würstchen is a diffusion model, whose text-conditional model works in a highly compressed latent space of images. Why is this important? Compressing data can reduce
computational costs for both training and inference by magnitudes. Training on 1024x1024 images, is way more expensive than training at 32x32. Usually, other works make
use of a relatively small compression, in the range of 4x - 8x spatial compression. Würstchen takes this to an extreme. Through its novel design, we achieve a 42x spatial
compression. This was unseen before because common methods fail to faithfully reconstruct detailed images after 16x spatial compression. Würstchen employs a
two-stage compression, what we call Stage A and Stage B. Stage A is a VQGAN, and Stage B is a Diffusion Autoencoder (more details can be found in the [paper](https://arxiv.org/abs/2306.00637)).
A third model, Stage C, is learned in that highly compressed latent space. This training requires fractions of the compute used for current top-performing models, allowing
also cheaper and faster inference.
## Würstchen - Decoder
The Decoder is what we refer to as "Stage A" and "Stage B". The decoder takes in image embeddings, either generated by the Prior (Stage C) or extracted from a real image, and decodes those latents back into the pixel space. Specifically, Stage B first decodes the image embeddings into the VQGAN Space, and Stage A (which is a VQGAN)
decodes the latents into pixel space. Together, they achieve a spatial compression of 42.
**Note:** The reconstruction is lossy and loses information of the image. The current Stage B often lacks details in the reconstructions, which are especially noticeable to
us humans when looking at faces, hands, etc. We are working on making these reconstructions even better in the future!
### Image Sizes
Würstchen was trained on image resolutions between 1024x1024 & 1536x1536. We sometimes also observe good outputs at resolutions like 1024x2048. Feel free to try it out.
We also observed that the Prior (Stage C) adapts extremely fast to new resolutions. So finetuning it at 2048x2048 should be computationally cheap.
<img src="https://cdn-uploads.huggingface.co/production/uploads/634cb5eefb80cc6bcaf63c3e/5pA5KUfGmvsObqiIjdGY1.jpeg" width=1000>
## How to run
This pipeline should be run together with a prior https://huggingface.co/warp-ai/wuerstchen-prior:
```py
import torch
from diffusers import AutoPipelineForText2Image
device = "cuda"
dtype = torch.float16
pipeline = AutoPipelineForText2Image.from_pretrained(
"warp-diffusion/wuerstchen", torch_dtype=dtype
).to(device)
caption = "Anthropomorphic cat dressed as a fire fighter"
output = pipeline(
prompt=caption,
height=1024,
width=1024,
prior_guidance_scale=4.0,
decoder_guidance_scale=0.0,
).images
```
### Image Sampling Times
The figure shows the inference times (on an A100) for different batch sizes (`num_images_per_prompt`) on Würstchen compared to [Stable Diffusion XL](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) (without refiner).
The left figure shows inference times (using torch > 2.0), whereas the right figure applies `torch.compile` to both pipelines in advance.

## Model Details
- **Developed by:** Pablo Pernias, Dominic Rampas
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** MIT
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a Diffusion model in the style of Stage C from the [Würstchen paper](https://arxiv.org/abs/2306.00637) that uses a fixed, pretrained text encoder ([CLIP ViT-bigG/14](https://huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k)).
- **Resources for more information:** [GitHub Repository](https://github.com/dome272/Wuerstchen), [Paper](https://arxiv.org/abs/2306.00637).
- **Cite as:**
@misc{pernias2023wuerstchen,
title={Wuerstchen: Efficient Pretraining of Text-to-Image Models},
author={Pablo Pernias and Dominic Rampas and Mats L. Richter and Christopher Pal and Marc Aubreville},
year={2023},
eprint={2306.00637},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
## Environmental Impact
**Würstchen v2** **Estimated Emissions**
Based on that information, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 24602
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 2275.68 kg CO2 eq. | 5,268 | [
[
-0.038970947265625,
-0.0445556640625,
0.03271484375,
0.00873565673828125,
-0.0270843505859375,
-0.0340576171875,
-0.0083770751953125,
-0.0220489501953125,
-0.01007843017578125,
0.017974853515625,
-0.041046142578125,
-0.036529541015625,
-0.052093505859375,
-0.0155487060546875,
-0.0177764892578125,
0.068359375,
0.005069732666015625,
0.00620269775390625,
0.000041961669921875,
0.0120849609375,
-0.007537841796875,
-0.0166473388671875,
-0.0576171875,
-0.022552490234375,
0.035614013671875,
-0.0014429092407226562,
0.044769287109375,
0.04815673828125,
0.032257080078125,
0.028839111328125,
-0.01336669921875,
-0.01092529296875,
-0.038177490234375,
-0.0126495361328125,
0.01445770263671875,
-0.01055908203125,
-0.005962371826171875,
0.007221221923828125,
0.048004150390625,
0.0295867919921875,
-0.0225677490234375,
-0.006195068359375,
0.019683837890625,
0.055877685546875,
-0.041534423828125,
-0.014404296875,
-0.02374267578125,
0.0195465087890625,
-0.01377105712890625,
0.003414154052734375,
-0.02459716796875,
-0.00580596923828125,
0.01335906982421875,
-0.0599365234375,
0.04296875,
-0.004764556884765625,
0.09271240234375,
0.0270843505859375,
-0.0300140380859375,
-0.008392333984375,
-0.04925537109375,
0.04058837890625,
-0.06414794921875,
0.02398681640625,
0.01477813720703125,
0.010711669921875,
-0.0013446807861328125,
-0.09912109375,
-0.05352783203125,
0.0174407958984375,
-0.017425537109375,
0.024139404296875,
-0.038543701171875,
0.00258636474609375,
0.033294677734375,
0.030914306640625,
-0.06292724609375,
0.0164337158203125,
-0.03204345703125,
-0.034423828125,
0.03631591796875,
0.0084075927734375,
0.0101470947265625,
-0.01092529296875,
-0.034881591796875,
-0.044281005859375,
-0.0274505615234375,
-0.0069732666015625,
0.0301055908203125,
-0.01404571533203125,
-0.0284423828125,
0.019012451171875,
-0.005321502685546875,
0.0443115234375,
0.01788330078125,
0.004940032958984375,
0.0230255126953125,
-0.021026611328125,
-0.034637451171875,
-0.0261688232421875,
0.0865478515625,
0.0283660888671875,
-0.011322021484375,
0.0037822723388671875,
0.0025882720947265625,
0.0013332366943359375,
0.0018825531005859375,
-0.111083984375,
-0.050689697265625,
0.0114898681640625,
-0.036041259765625,
-0.01678466796875,
-0.004695892333984375,
-0.07666015625,
-0.01458740234375,
0.00736236572265625,
0.041717529296875,
-0.0250701904296875,
-0.0220489501953125,
0.01062774658203125,
-0.027557373046875,
0.01335906982421875,
0.0264892578125,
-0.044189453125,
-0.00458526611328125,
-0.0011320114135742188,
0.0733642578125,
-0.0076904296875,
-0.01032257080078125,
-0.0227813720703125,
-0.01500701904296875,
-0.018798828125,
0.060699462890625,
-0.01342010498046875,
-0.03662109375,
-0.01007080078125,
0.0174560546875,
0.021392822265625,
-0.026336669921875,
0.037506103515625,
-0.05316162109375,
0.048187255859375,
-0.01275634765625,
-0.041839599609375,
-0.01340484619140625,
-0.01068115234375,
-0.0472412109375,
0.094970703125,
0.01241302490234375,
-0.07562255859375,
0.0297393798828125,
-0.046051025390625,
-0.0185546875,
-0.0038967132568359375,
0.01470184326171875,
-0.050689697265625,
-0.00684356689453125,
0.012237548828125,
0.04241943359375,
-0.0184478759765625,
0.0113067626953125,
-0.0190582275390625,
-0.016571044921875,
-0.014190673828125,
-0.036529541015625,
0.0838623046875,
0.033447265625,
-0.044342041015625,
0.00244903564453125,
-0.048797607421875,
0.011566162109375,
0.025634765625,
-0.0280914306640625,
-0.006549835205078125,
-0.028289794921875,
0.020050048828125,
0.0404052734375,
0.00286865234375,
-0.028045654296875,
0.00507354736328125,
-0.0170135498046875,
0.040313720703125,
0.06036376953125,
0.0092926025390625,
0.0509033203125,
-0.0022430419921875,
0.02362060546875,
0.017669677734375,
0.006938934326171875,
-0.01751708984375,
-0.034515380859375,
-0.060516357421875,
-0.0194091796875,
0.031890869140625,
0.033477783203125,
-0.07061767578125,
0.035675048828125,
-0.014404296875,
-0.063232421875,
-0.0116729736328125,
-0.007709503173828125,
0.0206146240234375,
0.04498291015625,
0.019134521484375,
-0.035247802734375,
-0.0297393798828125,
-0.06201171875,
0.00933074951171875,
0.0011034011840820312,
-0.01094818115234375,
0.0022029876708984375,
0.051116943359375,
-0.0154571533203125,
0.044464111328125,
-0.055084228515625,
-0.016082763671875,
0.0155029296875,
0.0165252685546875,
0.01268768310546875,
0.0531005859375,
0.046600341796875,
-0.06280517578125,
-0.054290771484375,
-0.0179443359375,
-0.058135986328125,
0.00586700439453125,
-0.001194000244140625,
-0.03131103515625,
0.0234375,
0.035980224609375,
-0.05926513671875,
0.047210693359375,
0.05975341796875,
-0.038970947265625,
0.048858642578125,
-0.043914794921875,
0.01739501953125,
-0.08184814453125,
0.00574493408203125,
0.02642822265625,
-0.018585205078125,
-0.0433349609375,
-0.002986907958984375,
-0.0016603469848632812,
-0.0171356201171875,
-0.0423583984375,
0.04754638671875,
-0.0540771484375,
0.0011720657348632812,
-0.017303466796875,
-0.00732421875,
0.0192413330078125,
0.062164306640625,
0.0154571533203125,
0.05059814453125,
0.055877685546875,
-0.043487548828125,
0.00872802734375,
0.0075225830078125,
-0.035919189453125,
0.0447998046875,
-0.06500244140625,
0.03106689453125,
-0.005718231201171875,
0.0278472900390625,
-0.0809326171875,
-0.01541900634765625,
0.0206756591796875,
-0.03509521484375,
0.036956787109375,
-0.005893707275390625,
-0.05523681640625,
-0.04302978515625,
-0.0052490234375,
0.0242919921875,
0.0762939453125,
-0.037139892578125,
0.040924072265625,
0.0113525390625,
0.01389312744140625,
-0.029296875,
-0.05841064453125,
-0.0009164810180664062,
-0.024261474609375,
-0.05157470703125,
0.041595458984375,
-0.0250701904296875,
0.005596160888671875,
0.0167083740234375,
0.018707275390625,
-0.01361083984375,
-0.007663726806640625,
0.0176544189453125,
0.0158843994140625,
-0.006343841552734375,
-0.00179290771484375,
0.0084686279296875,
-0.0123291015625,
-0.0121612548828125,
-0.039642333984375,
0.044403076171875,
0.0005216598510742188,
-0.00919342041015625,
-0.0654296875,
0.028350830078125,
0.0279388427734375,
0.006832122802734375,
0.06201171875,
0.0809326171875,
-0.035552978515625,
0.0107421875,
-0.021026611328125,
-0.0187835693359375,
-0.042510986328125,
0.0298004150390625,
-0.00804901123046875,
-0.05084228515625,
0.032501220703125,
-0.005321502685546875,
0.00868988037109375,
0.0618896484375,
0.051055908203125,
-0.038421630859375,
0.075439453125,
0.042510986328125,
0.0190887451171875,
0.046356201171875,
-0.05889892578125,
-0.007061004638671875,
-0.057830810546875,
-0.01038360595703125,
-0.034454345703125,
-0.0201568603515625,
-0.026885986328125,
-0.0428466796875,
0.030853271484375,
0.012725830078125,
-0.038665771484375,
0.0253753662109375,
-0.051422119140625,
0.017730712890625,
0.0369873046875,
0.01092529296875,
0.0003333091735839844,
0.025299072265625,
-0.0102691650390625,
-0.025787353515625,
-0.06024169921875,
-0.0211181640625,
0.060028076171875,
0.0164794921875,
0.041778564453125,
-0.0011272430419921875,
0.038482666015625,
0.041595458984375,
0.0211334228515625,
-0.0374755859375,
0.032806396484375,
-0.002105712890625,
-0.055511474609375,
-0.0220184326171875,
-0.0268402099609375,
-0.08892822265625,
0.036468505859375,
-0.0289459228515625,
-0.040008544921875,
0.0194854736328125,
0.0264739990234375,
-0.0093994140625,
0.041595458984375,
-0.06732177734375,
0.06439208984375,
-0.0034027099609375,
-0.05096435546875,
-0.010711669921875,
-0.053680419921875,
0.004245758056640625,
0.00927734375,
0.0125579833984375,
0.0191650390625,
0.0084686279296875,
0.071533203125,
-0.042816162109375,
0.0699462890625,
-0.048187255859375,
-0.004062652587890625,
0.0206756591796875,
-0.0103607177734375,
0.039520263671875,
-0.0240936279296875,
-0.006946563720703125,
0.0253448486328125,
-0.0010023117065429688,
-0.0458984375,
-0.0369873046875,
0.035552978515625,
-0.05657958984375,
-0.038970947265625,
-0.035675048828125,
-0.024505615234375,
0.00962066650390625,
0.03076171875,
0.08050537109375,
0.030181884765625,
-0.0239105224609375,
-0.0004949569702148438,
0.052947998046875,
-0.02099609375,
0.048004150390625,
0.01204681396484375,
-0.01268768310546875,
-0.03302001953125,
0.046112060546875,
0.00577545166015625,
0.0192413330078125,
0.0037746429443359375,
0.0168609619140625,
-0.0205230712890625,
-0.044586181640625,
-0.045135498046875,
0.031463623046875,
-0.0675048828125,
-0.0248565673828125,
-0.056121826171875,
-0.0194244384765625,
-0.0249786376953125,
-0.01214599609375,
-0.024658203125,
-0.0142669677734375,
-0.051605224609375,
0.004993438720703125,
0.0141448974609375,
0.04302978515625,
-0.0231781005859375,
0.04779052734375,
-0.04327392578125,
0.0037689208984375,
0.022979736328125,
0.014892578125,
0.0109405517578125,
-0.059112548828125,
-0.037322998046875,
0.009185791015625,
-0.038330078125,
-0.055267333984375,
0.040863037109375,
0.0131378173828125,
0.033966064453125,
0.0323486328125,
-0.01284027099609375,
0.051483154296875,
-0.034271240234375,
0.0701904296875,
0.038543701171875,
-0.0562744140625,
0.0340576171875,
-0.0192718505859375,
0.031005859375,
0.0154571533203125,
0.0321044921875,
-0.034332275390625,
-0.0023975372314453125,
-0.0634765625,
-0.05352783203125,
0.04595947265625,
0.0248870849609375,
0.0144500732421875,
0.0167236328125,
0.037811279296875,
-0.00533294677734375,
-0.00446319580078125,
-0.06024169921875,
-0.0278472900390625,
-0.022796630859375,
0.005390167236328125,
-0.01525115966796875,
-0.0106048583984375,
-0.00988006591796875,
-0.0460205078125,
0.0650634765625,
0.00003743171691894531,
0.054962158203125,
0.0295562744140625,
-0.01517486572265625,
0.0018625259399414062,
-0.01136016845703125,
0.03875732421875,
0.01055908203125,
0.006107330322265625,
0.001750946044921875,
0.0007214546203613281,
-0.0394287109375,
0.019500732421875,
0.005237579345703125,
-0.011932373046875,
0.006954193115234375,
0.030120849609375,
0.08563232421875,
-0.0193023681640625,
-0.0181884765625,
0.04339599609375,
-0.0181427001953125,
-0.04827880859375,
-0.0184173583984375,
0.003185272216796875,
0.0205078125,
0.0261077880859375,
0.0310211181640625,
0.028656005859375,
0.0187530517578125,
-0.01165008544921875,
0.0087738037109375,
0.023101806640625,
-0.031097412109375,
-0.0298004150390625,
0.060302734375,
0.00334930419921875,
-0.0157318115234375,
0.055877685546875,
-0.031280517578125,
-0.03289794921875,
0.05316162109375,
0.04400634765625,
0.05377197265625,
-0.0024242401123046875,
0.02313232421875,
0.059112548828125,
0.008392333984375,
-0.00713348388671875,
-0.01158905029296875,
-0.003345489501953125,
-0.051483154296875,
-0.01467132568359375,
-0.02716064453125,
0.011688232421875,
0.00415802001953125,
-0.043365478515625,
0.016021728515625,
-0.0218505859375,
-0.023406982421875,
0.0007414817810058594,
0.0012083053588867188,
-0.07525634765625,
0.0303497314453125,
0.002605438232421875,
0.06512451171875,
-0.061370849609375,
0.06915283203125,
0.0286865234375,
-0.0175933837890625,
-0.05767822265625,
0.0042572021484375,
0.0012540817260742188,
-0.05267333984375,
0.040313720703125,
0.008392333984375,
-0.0039215087890625,
0.0263519287109375,
-0.05810546875,
-0.049346923828125,
0.1041259765625,
0.0248565673828125,
-0.038604736328125,
-0.015838623046875,
-0.02264404296875,
0.055419921875,
-0.0183258056640625,
0.0192413330078125,
0.023284912109375,
0.032318115234375,
0.022857666015625,
-0.0289459228515625,
0.009796142578125,
-0.028656005859375,
0.027191162109375,
0.01013946533203125,
-0.0489501953125,
0.05316162109375,
-0.032806396484375,
-0.0472412109375,
0.01043701171875,
0.060699462890625,
-0.0067596435546875,
0.00432586669921875,
0.05218505859375,
0.0743408203125,
0.056488037109375,
-0.0101165771484375,
0.09375,
-0.0022068023681640625,
0.043609619140625,
0.053619384765625,
0.0097198486328125,
0.044464111328125,
0.031097412109375,
-0.0272216796875,
0.061248779296875,
0.07586669921875,
-0.016448974609375,
0.06512451171875,
0.00716400146484375,
-0.00720977783203125,
-0.004932403564453125,
0.0146484375,
-0.03521728515625,
0.007587432861328125,
0.0181884765625,
-0.02642822265625,
-0.0170135498046875,
0.016693115234375,
0.01309967041015625,
-0.0178070068359375,
0.00777435302734375,
0.04876708984375,
-0.01110076904296875,
-0.04644775390625,
0.055938720703125,
-0.0004658699035644531,
0.06787109375,
-0.040740966796875,
-0.0003809928894042969,
-0.01019287109375,
-0.0018377304077148438,
-0.01422882080078125,
-0.07159423828125,
0.0187225341796875,
-0.0064697265625,
-0.0088653564453125,
-0.008941650390625,
0.048858642578125,
-0.0313720703125,
-0.032806396484375,
0.0247802734375,
0.02398681640625,
0.0181427001953125,
-0.004337310791015625,
-0.07073974609375,
0.00818634033203125,
0.01245880126953125,
-0.044952392578125,
0.018524169921875,
0.0269927978515625,
0.024169921875,
0.0242156982421875,
0.055999755859375,
-0.007236480712890625,
-0.0000883936882019043,
-0.015655517578125,
0.06500244140625,
-0.0299530029296875,
-0.0335693359375,
-0.059417724609375,
0.06695556640625,
-0.01776123046875,
-0.024658203125,
0.04296875,
0.041534423828125,
0.051361083984375,
-0.007663726806640625,
0.05706787109375,
-0.0228424072265625,
-0.00348663330078125,
-0.044677734375,
0.053741455078125,
-0.06085205078125,
-0.001758575439453125,
-0.031097412109375,
-0.064208984375,
-0.005619049072265625,
0.0462646484375,
-0.01212310791015625,
0.0218505859375,
0.055694580078125,
0.06842041015625,
-0.0058441162109375,
-0.025146484375,
0.0277099609375,
0.01102447509765625,
0.02386474609375,
0.030548095703125,
0.038482666015625,
-0.05609130859375,
0.049072265625,
-0.044342041015625,
-0.0187835693359375,
0.007183074951171875,
-0.0716552734375,
-0.058685302734375,
-0.0821533203125,
-0.037506103515625,
-0.0548095703125,
-0.024078369140625,
0.021484375,
0.08837890625,
-0.02392578125,
0.007396697998046875,
-0.01433563232421875,
-0.0176239013671875,
-0.0071258544921875,
-0.0181121826171875,
0.0450439453125,
0.00847625732421875,
-0.06732177734375,
-0.0232086181640625,
0.01221466064453125,
0.03155517578125,
-0.0207672119140625,
-0.0181121826171875,
-0.015960693359375,
-0.00809478759765625,
0.035552978515625,
0.01346588134765625,
-0.0323486328125,
-0.00788116455078125,
0.00010973215103149414,
-0.0231781005859375,
0.02618408203125,
0.046112060546875,
-0.044677734375,
0.033111572265625,
0.03375244140625,
0.02899169921875,
0.09466552734375,
-0.028656005859375,
0.0134429931640625,
-0.038482666015625,
0.038787841796875,
-0.0013246536254882812,
0.033050537109375,
0.031402587890625,
-0.042266845703125,
0.031097412109375,
0.03759765625,
-0.056243896484375,
-0.041900634765625,
0.0087890625,
-0.0728759765625,
-0.032501220703125,
0.07855224609375,
-0.028350830078125,
-0.01654052734375,
0.01343536376953125,
-0.0240325927734375,
0.0266876220703125,
-0.0206146240234375,
0.0411376953125,
0.0625,
-0.0047149658203125,
-0.040557861328125,
-0.0225830078125,
0.03863525390625,
0.018798828125,
-0.0628662109375,
-0.022064208984375,
0.0531005859375,
0.040557861328125,
0.015228271484375,
0.06304931640625,
-0.01041412353515625,
0.0192413330078125,
0.0203704833984375,
0.024169921875,
0.01316070556640625,
0.0029010772705078125,
-0.031585693359375,
-0.0029926300048828125,
-0.0208740234375,
-0.0050811767578125
]
] |
textattack/bert-base-uncased-rotten_tomatoes | 2021-05-20T07:47:13.000Z | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"bert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | textattack | null | null | textattack/bert-base-uncased-rotten_tomatoes | 0 | 16,023 | transformers | 2022-03-02T23:29:05 | ## bert-base-uncased fine-tuned with TextAttack on the rotten_tomatoes dataset
This `bert-base-uncased` model was fine-tuned for sequence classificationusing TextAttack
and the rotten_tomatoes dataset loaded using the `nlp` library. The model was fine-tuned
for 10 epochs with a batch size of 64, a learning
rate of 5e-05, and a maximum sequence length of 128.
Since this was a classification task, the model was trained with a cross-entropy loss function.
The best score the model achieved on this task was 0.875234521575985, as measured by the
eval set accuracy, found after 4 epochs.
For more information, check out [TextAttack on Github](https://github.com/QData/TextAttack).
| 728 | [
[
-0.0227203369140625,
-0.0340576171875,
0.029266357421875,
-0.00024318695068359375,
-0.0308380126953125,
0.0139923095703125,
-0.0251007080078125,
-0.023529052734375,
0.0010309219360351562,
0.0367431640625,
-0.04095458984375,
-0.044525146484375,
-0.04852294921875,
0.004344940185546875,
-0.044525146484375,
0.1295166015625,
0.036102294921875,
0.01265716552734375,
0.001544952392578125,
0.006557464599609375,
-0.0251312255859375,
-0.03497314453125,
-0.0229034423828125,
-0.01186370849609375,
0.034759521484375,
0.04248046875,
0.06488037109375,
0.042755126953125,
0.058868408203125,
0.0179595947265625,
-0.006893157958984375,
0.00525665283203125,
-0.047637939453125,
0.005184173583984375,
-0.023040771484375,
-0.0231475830078125,
-0.0180206298828125,
-0.0177459716796875,
0.040618896484375,
0.020172119140625,
0.006671905517578125,
0.033843994140625,
-0.002216339111328125,
0.070068359375,
-0.042999267578125,
-0.016387939453125,
-0.058624267578125,
0.020843505859375,
0.0003788471221923828,
0.001735687255859375,
-0.054901123046875,
-0.0280609130859375,
0.016510009765625,
-0.016265869140625,
0.01488494873046875,
0.007152557373046875,
0.06646728515625,
0.0196990966796875,
-0.00811767578125,
0.0110321044921875,
-0.052093505859375,
0.06463623046875,
-0.058929443359375,
0.00922393798828125,
0.0247039794921875,
0.006374359130859375,
-0.0111846923828125,
-0.052978515625,
-0.0238494873046875,
0.007266998291015625,
0.025604248046875,
-0.0199432373046875,
-0.022247314453125,
0.0122528076171875,
0.0108184814453125,
0.032012939453125,
-0.062103271484375,
0.009033203125,
-0.045562744140625,
-0.020263671875,
0.058929443359375,
0.0115509033203125,
0.01324462890625,
-0.0294342041015625,
-0.041107177734375,
-0.0189208984375,
-0.020965576171875,
-0.005931854248046875,
0.0302734375,
0.0290985107421875,
0.0012178421020507812,
0.052886962890625,
-0.0038166046142578125,
0.048492431640625,
0.005840301513671875,
0.0251312255859375,
0.03631591796875,
-0.01021575927734375,
-0.0418701171875,
0.0000807642936706543,
0.05999755859375,
0.03167724609375,
0.046417236328125,
-0.00940704345703125,
-0.032196044921875,
0.0011224746704101562,
0.0550537109375,
-0.043701171875,
-0.03656005859375,
0.031829833984375,
-0.034637451171875,
-0.041748046875,
0.01052093505859375,
-0.031158447265625,
-0.028106689453125,
-0.0086669921875,
0.061248779296875,
-0.062469482421875,
0.0162200927734375,
0.0010166168212890625,
-0.0340576171875,
0.00443267822265625,
0.01340484619140625,
-0.057220458984375,
0.007427215576171875,
0.035186767578125,
0.06787109375,
-0.020263671875,
0.0046234130859375,
0.004222869873046875,
-0.0240325927734375,
-0.0244598388671875,
0.05340576171875,
-0.03912353515625,
0.0002760887145996094,
-0.004604339599609375,
-0.004871368408203125,
0.02117919921875,
-0.01140594482421875,
0.046905517578125,
-0.043853759765625,
0.0256805419921875,
0.0024127960205078125,
-0.057525634765625,
-0.033721923828125,
0.01885986328125,
-0.045745849609375,
0.0634765625,
0.0227203369140625,
-0.039947509765625,
0.03485107421875,
-0.036895751953125,
-0.026611328125,
-0.00360870361328125,
0.01837158203125,
-0.044097900390625,
0.0300750732421875,
0.0159149169921875,
0.0439453125,
-0.006755828857421875,
0.0233612060546875,
-0.015289306640625,
-0.0487060546875,
0.0157928466796875,
-0.036834716796875,
0.05853271484375,
0.019744873046875,
-0.0208740234375,
0.0012483596801757812,
-0.07208251953125,
0.048095703125,
-0.007053375244140625,
-0.032196044921875,
-0.00872802734375,
-0.005176544189453125,
0.0308380126953125,
0.00342559814453125,
0.027313232421875,
-0.0439453125,
0.0165252685546875,
-0.04132080078125,
0.030364990234375,
0.0579833984375,
-0.00647735595703125,
0.0283660888671875,
-0.040313720703125,
0.0205841064453125,
-0.0020122528076171875,
0.0124664306640625,
-0.01180267333984375,
-0.03814697265625,
-0.047149658203125,
-0.007747650146484375,
0.045562744140625,
0.049407958984375,
-0.040283203125,
0.062255859375,
-0.0153961181640625,
-0.056182861328125,
-0.0298309326171875,
-0.00838470458984375,
0.03839111328125,
0.0031757354736328125,
0.032440185546875,
-0.049041748046875,
-0.058929443359375,
-0.0732421875,
0.0008096694946289062,
-0.0042724609375,
-0.01507568359375,
0.012725830078125,
0.04132080078125,
-0.0169219970703125,
0.0654296875,
-0.0350341796875,
-0.027313232421875,
-0.01025390625,
0.0238189697265625,
0.034088134765625,
0.04833984375,
0.0257415771484375,
-0.03558349609375,
-0.0290069580078125,
-0.0171661376953125,
-0.035736083984375,
0.00399017333984375,
0.0024509429931640625,
0.001018524169921875,
0.0249786376953125,
0.00704193115234375,
-0.0328369140625,
0.025054931640625,
0.04058837890625,
-0.035858154296875,
0.042755126953125,
-0.01049041748046875,
0.0169830322265625,
-0.0888671875,
-0.0046234130859375,
-0.005138397216796875,
-0.0247955322265625,
-0.032989501953125,
-0.0018167495727539062,
0.01322174072265625,
-0.0234375,
-0.0421142578125,
0.004779815673828125,
-0.015777587890625,
-0.015228271484375,
-0.00574493408203125,
-0.02294921875,
0.00399017333984375,
0.060333251953125,
-0.004344940185546875,
0.06927490234375,
0.02532958984375,
-0.0382080078125,
0.0165252685546875,
0.01071929931640625,
-0.043487548828125,
0.01898193359375,
-0.04827880859375,
0.0294342041015625,
-0.0035686492919921875,
0.0014400482177734375,
-0.08502197265625,
-0.01116943359375,
-0.005558013916015625,
-0.051025390625,
0.0129547119140625,
-0.00830841064453125,
-0.038116455078125,
-0.018829345703125,
-0.043121337890625,
0.02252197265625,
0.039276123046875,
-0.0211334228515625,
0.0154876708984375,
0.0260467529296875,
0.0039043426513671875,
-0.053802490234375,
-0.047607421875,
-0.006526947021484375,
-0.0111083984375,
-0.033538818359375,
0.048095703125,
-0.0183258056640625,
0.013702392578125,
-0.0131072998046875,
-0.0028476715087890625,
-0.0253143310546875,
-0.0224456787109375,
0.021636962890625,
0.0133056640625,
-0.0201568603515625,
0.029693603515625,
0.00431060791015625,
0.004322052001953125,
-0.005855560302734375,
-0.0210418701171875,
0.05242919921875,
-0.0236358642578125,
0.021728515625,
-0.0178985595703125,
0.005954742431640625,
0.06298828125,
0.00939178466796875,
0.057220458984375,
0.045806884765625,
-0.0340576171875,
-0.0224151611328125,
-0.0196075439453125,
-0.007537841796875,
-0.0367431640625,
0.035186767578125,
-0.00411224365234375,
-0.04351806640625,
0.041107177734375,
0.0265045166015625,
0.0016126632690429688,
0.056488037109375,
0.032806396484375,
-0.0258331298828125,
0.06622314453125,
0.038848876953125,
-0.0206451416015625,
0.038909912109375,
-0.0208587646484375,
-0.01203155517578125,
-0.04351806640625,
-0.0165252685546875,
-0.03173828125,
-0.0498046875,
-0.049102783203125,
-0.0253448486328125,
0.0218658447265625,
-0.0199737548828125,
-0.054656982421875,
0.0445556640625,
-0.065185546875,
0.058349609375,
0.04925537109375,
0.0361328125,
0.004131317138671875,
0.0029163360595703125,
-0.01062774658203125,
-0.01117706298828125,
-0.028564453125,
-0.019775390625,
0.10162353515625,
0.032379150390625,
0.068359375,
0.01163482666015625,
0.040802001953125,
0.07757568359375,
-0.023773193359375,
-0.049774169921875,
0.0350341796875,
-0.044891357421875,
-0.06402587890625,
-0.034393310546875,
-0.01116943359375,
-0.05364990234375,
-0.0187225341796875,
-0.03375244140625,
-0.0548095703125,
-0.00017893314361572266,
0.00885009765625,
-0.022857666015625,
0.02105712890625,
-0.0655517578125,
0.0621337890625,
-0.0201873779296875,
-0.009124755859375,
-0.0087127685546875,
-0.07208251953125,
0.021026611328125,
-0.0022449493408203125,
-0.012420654296875,
-0.016387939453125,
0.0227813720703125,
0.06439208984375,
-0.0260009765625,
0.0501708984375,
-0.0012359619140625,
0.01250457763671875,
0.0038433074951171875,
0.0034809112548828125,
0.037261962890625,
-0.00707244873046875,
0.00893402099609375,
0.024322509765625,
0.006557464599609375,
-0.02252197265625,
-0.0260009765625,
0.039031982421875,
-0.0701904296875,
-0.0003571510314941406,
-0.04986572265625,
-0.039215087890625,
-0.01526641845703125,
0.01235198974609375,
0.052337646484375,
0.0245208740234375,
-0.01491546630859375,
0.046478271484375,
0.0693359375,
-0.025390625,
0.03643798828125,
0.0350341796875,
-0.00441741943359375,
-0.040802001953125,
0.06304931640625,
0.00457000732421875,
0.01038360595703125,
0.033233642578125,
0.0027523040771484375,
-0.01360321044921875,
-0.036468505859375,
-0.038238525390625,
-0.0169677734375,
-0.044525146484375,
-0.04425048828125,
-0.02874755859375,
-0.04852294921875,
-0.015869140625,
-0.007442474365234375,
-0.034576416015625,
-0.043731689453125,
-0.03387451171875,
-0.0125579833984375,
0.05108642578125,
0.049560546875,
-0.0052337646484375,
0.0231475830078125,
-0.059814453125,
0.01061248779296875,
-0.01001739501953125,
0.045562744140625,
-0.01325225830078125,
-0.06610107421875,
-0.039794921875,
-0.004192352294921875,
-0.022918701171875,
-0.0648193359375,
0.03680419921875,
0.030731201171875,
0.008544921875,
0.024169921875,
0.0155181884765625,
0.0306243896484375,
-0.055816650390625,
0.0723876953125,
0.00829315185546875,
-0.06439208984375,
0.048004150390625,
-0.0260467529296875,
0.0576171875,
0.0506591796875,
0.029754638671875,
-0.0127410888671875,
-0.036956787109375,
-0.052093505859375,
-0.0838623046875,
0.041717529296875,
0.027130126953125,
0.0279693603515625,
0.00797271728515625,
0.00567626953125,
0.025054931640625,
0.0216827392578125,
-0.06634521484375,
-0.0152130126953125,
-0.007488250732421875,
-0.036651611328125,
-0.01232147216796875,
-0.0286865234375,
0.006244659423828125,
-0.041015625,
0.052642822265625,
0.010101318359375,
0.0478515625,
0.00223541259765625,
-0.027008056640625,
-0.01361846923828125,
0.022369384765625,
0.05657958984375,
0.0362548828125,
-0.05694580078125,
-0.0047149658203125,
-0.0008096694946289062,
-0.059173583984375,
0.01029205322265625,
-0.0014438629150390625,
-0.00832366943359375,
0.006542205810546875,
0.021728515625,
0.060333251953125,
0.0183258056640625,
-0.061065673828125,
0.024505615234375,
0.0016050338745117188,
0.0035762786865234375,
-0.041961669921875,
0.0271148681640625,
-0.03607177734375,
0.0239715576171875,
0.0526123046875,
0.0126495361328125,
0.031158447265625,
-0.030670166015625,
0.05078125,
0.023529052734375,
-0.03839111328125,
-0.00865936279296875,
0.0377197265625,
0.00885772705078125,
-0.0283660888671875,
0.07110595703125,
0.0082855224609375,
-0.039886474609375,
0.0509033203125,
0.03387451171875,
0.0782470703125,
-0.007633209228515625,
0.0295562744140625,
0.020599365234375,
0.030731201171875,
0.0022220611572265625,
0.0195770263671875,
0.0011348724365234375,
-0.061981201171875,
-0.00804901123046875,
-0.0576171875,
-0.034576416015625,
0.03033447265625,
-0.0626220703125,
0.0146026611328125,
-0.052734375,
-0.0245208740234375,
0.023345947265625,
0.00689697265625,
-0.041259765625,
0.05413818359375,
0.005512237548828125,
0.07183837890625,
-0.07757568359375,
0.051483154296875,
0.049102783203125,
-0.029266357421875,
-0.035797119140625,
-0.0018568038940429688,
-0.0103912353515625,
-0.04583740234375,
0.033721923828125,
0.01837158203125,
0.015777587890625,
0.004817962646484375,
-0.0582275390625,
-0.0782470703125,
0.049652099609375,
0.00909423828125,
-0.044891357421875,
-0.00891876220703125,
-0.0101318359375,
0.055572509765625,
-0.0277099609375,
0.019195556640625,
0.040374755859375,
0.00848388671875,
-0.0160064697265625,
-0.064453125,
-0.0282135009765625,
-0.035552978515625,
0.0015230178833007812,
0.0247802734375,
-0.0245513916015625,
0.060516357421875,
-0.02362060546875,
0.01580810546875,
0.007190704345703125,
0.0439453125,
0.002635955810546875,
0.02484130859375,
0.02630615234375,
0.0543212890625,
0.042755126953125,
-0.01171875,
0.053009033203125,
-0.01325225830078125,
0.039520263671875,
0.0772705078125,
0.0101776123046875,
0.083740234375,
0.0200653076171875,
-0.019073486328125,
0.05950927734375,
0.06402587890625,
-0.021759033203125,
0.069580078125,
0.0147552490234375,
0.017486572265625,
-0.0025501251220703125,
0.010833740234375,
-0.0189666748046875,
0.035491943359375,
0.03253173828125,
-0.06304931640625,
0.01203155517578125,
0.01629638671875,
-0.0072174072265625,
-0.00975799560546875,
-0.0238494873046875,
0.057769775390625,
-0.01495361328125,
-0.036529541015625,
0.048583984375,
0.004283905029296875,
0.050079345703125,
-0.034881591796875,
0.00830078125,
-0.006893157958984375,
0.01375579833984375,
-0.0096893310546875,
-0.0841064453125,
0.0143585205078125,
0.0050506591796875,
-0.054168701171875,
0.01047515869140625,
0.050628662109375,
-0.0284423828125,
-0.03656005859375,
-0.007091522216796875,
-0.0060577392578125,
0.014007568359375,
-0.00304412841796875,
-0.052459716796875,
0.01314544677734375,
0.0034942626953125,
-0.022430419921875,
0.00574493408203125,
0.0279693603515625,
0.005992889404296875,
0.042999267578125,
0.0361328125,
-0.01983642578125,
-0.0037136077880859375,
-0.0167694091796875,
0.0517578125,
-0.0679931640625,
-0.04449462890625,
-0.05340576171875,
0.056732177734375,
-0.002727508544921875,
-0.048492431640625,
0.039154052734375,
0.053131103515625,
0.0618896484375,
-0.043548583984375,
0.044281005859375,
-0.019561767578125,
0.034149169921875,
-0.04583740234375,
0.055633544921875,
-0.0469970703125,
0.01079559326171875,
-0.037994384765625,
-0.06707763671875,
-0.01407623291015625,
0.054473876953125,
0.004749298095703125,
0.01708984375,
0.049560546875,
0.0372314453125,
0.012664794921875,
-0.008544921875,
0.025634765625,
0.01300048828125,
-0.0142364501953125,
0.05194091796875,
0.01910400390625,
-0.050018310546875,
0.0282135009765625,
-0.0197601318359375,
-0.00016355514526367188,
-0.031036376953125,
-0.0645751953125,
-0.0623779296875,
-0.057769775390625,
-0.01102447509765625,
-0.043304443359375,
0.02911376953125,
0.07269287109375,
0.057342529296875,
-0.08123779296875,
-0.0012235641479492188,
-0.007228851318359375,
-0.0018291473388671875,
-0.02459716796875,
-0.0243072509765625,
0.023345947265625,
-0.01425933837890625,
-0.0533447265625,
0.0204925537109375,
0.019775390625,
0.0013885498046875,
-0.0182342529296875,
0.004451751708984375,
-0.0257415771484375,
0.0220794677734375,
0.037811279296875,
0.013153076171875,
-0.042327880859375,
-0.04833984375,
-0.002895355224609375,
-0.00908660888671875,
0.0090484619140625,
0.032318115234375,
-0.0682373046875,
0.0494384765625,
0.053741455078125,
0.0445556640625,
0.0232086181640625,
0.01349639892578125,
0.0667724609375,
-0.09503173828125,
-0.0033512115478515625,
0.037994384765625,
0.0109405517578125,
0.0177764892578125,
-0.0496826171875,
0.042205810546875,
0.0013246536254882812,
-0.06689453125,
-0.072998046875,
-0.0031375885009765625,
-0.0809326171875,
-0.03277587890625,
0.0675048828125,
0.01519775390625,
-0.01287078857421875,
0.00930023193359375,
-0.0098419189453125,
0.01415252685546875,
-0.049957275390625,
0.07562255859375,
0.06329345703125,
0.00386810302734375,
-0.03369140625,
-0.00473785400390625,
0.05035400390625,
0.0296478271484375,
-0.035858154296875,
-0.0275726318359375,
0.029754638671875,
0.0182647705078125,
0.025634765625,
0.0153961181640625,
0.0088958740234375,
0.034332275390625,
-0.00839996337890625,
0.0160369873046875,
0.005657196044921875,
-0.057098388671875,
-0.0157928466796875,
0.01541900634765625,
-0.0072174072265625,
-0.039306640625
]
] |
textattack/albert-base-v2-rotten_tomatoes | 2020-06-25T20:00:46.000Z | [
"transformers",
"pytorch",
"tensorboard",
"albert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | textattack | null | null | textattack/albert-base-v2-rotten_tomatoes | 0 | 16,017 | transformers | 2022-03-02T23:29:05 | ## albert-base-v2 fine-tuned with TextAttack on the rotten_tomatoes dataset
This `albert-base-v2` model was fine-tuned for sequence classificationusing TextAttack
and the rotten_tomatoes dataset loaded using the `nlp` library. The model was fine-tuned
for 10 epochs with a batch size of 128, a learning
rate of 2e-05, and a maximum sequence length of 128.
Since this was a classification task, the model was trained with a cross-entropy loss function.
The best score the model achieved on this task was 0.8855534709193246, as measured by the
eval set accuracy, found after 1 epoch.
For more information, check out [TextAttack on Github](https://github.com/QData/TextAttack).
| 723 | [
[
-0.019012451171875,
-0.0252685546875,
0.027923583984375,
0.0051422119140625,
-0.01482391357421875,
0.00832366943359375,
-0.007328033447265625,
-0.0263519287109375,
-0.005420684814453125,
0.039306640625,
-0.0367431640625,
-0.042327880859375,
-0.05181884765625,
0.00942230224609375,
-0.0400390625,
0.1224365234375,
0.01438140869140625,
0.01381683349609375,
-0.01270294189453125,
-0.0033550262451171875,
-0.02886962890625,
-0.0174560546875,
-0.04132080078125,
-0.01203155517578125,
0.04437255859375,
0.05279541015625,
0.05828857421875,
0.06292724609375,
0.06573486328125,
0.01300048828125,
-0.001934051513671875,
0.002033233642578125,
-0.052703857421875,
0.015411376953125,
-0.03106689453125,
-0.0489501953125,
-0.0231475830078125,
-0.032989501953125,
0.03857421875,
0.0099029541015625,
0.01213836669921875,
0.038330078125,
-0.00966644287109375,
0.0751953125,
-0.05035400390625,
-0.01043701171875,
-0.048004150390625,
0.0139312744140625,
0.0013599395751953125,
-0.0038280487060546875,
-0.04315185546875,
-0.012359619140625,
0.006107330322265625,
-0.03387451171875,
0.0102081298828125,
0.0218048095703125,
0.06829833984375,
0.026824951171875,
-0.00704193115234375,
0.0026721954345703125,
-0.05621337890625,
0.06884765625,
-0.0477294921875,
0.01261138916015625,
0.051605224609375,
0.0034885406494140625,
0.0099029541015625,
-0.050079345703125,
-0.017791748046875,
0.006900787353515625,
0.0176849365234375,
-0.0270843505859375,
-0.022247314453125,
-0.01085662841796875,
0.02117919921875,
0.022674560546875,
-0.07904052734375,
0.0177154541015625,
-0.052520751953125,
-0.01386260986328125,
0.0582275390625,
0.0282745361328125,
0.005863189697265625,
-0.0225372314453125,
-0.0350341796875,
-0.017425537109375,
-0.036895751953125,
-0.01096343994140625,
0.031707763671875,
0.01125335693359375,
-0.004791259765625,
0.049957275390625,
-0.022125244140625,
0.0340576171875,
-0.0019779205322265625,
0.0160369873046875,
0.0401611328125,
-0.0157623291015625,
-0.02130126953125,
0.0001703500747680664,
0.0731201171875,
0.045928955078125,
0.0380859375,
0.0031147003173828125,
-0.0289154052734375,
-0.00009697675704956055,
0.0423583984375,
-0.0628662109375,
-0.03656005859375,
0.0345458984375,
-0.0247039794921875,
-0.034759521484375,
-0.0013437271118164062,
-0.031829833984375,
-0.0255584716796875,
-0.0171356201171875,
0.042877197265625,
-0.06671142578125,
-0.004009246826171875,
-0.004486083984375,
-0.02667236328125,
0.0015516281127929688,
0.01128387451171875,
-0.07122802734375,
0.01160430908203125,
0.052490234375,
0.0821533203125,
-0.01325225830078125,
-0.003170013427734375,
0.0024662017822265625,
-0.0221710205078125,
-0.035552978515625,
0.056732177734375,
-0.0249481201171875,
0.0018320083618164062,
-0.00908660888671875,
-0.01197052001953125,
0.035675048828125,
-0.02764892578125,
0.02978515625,
-0.05010986328125,
0.0225677490234375,
-0.0012102127075195312,
-0.044708251953125,
-0.02978515625,
0.0226593017578125,
-0.0452880859375,
0.05694580078125,
0.04400634765625,
-0.03826904296875,
0.0328369140625,
-0.035308837890625,
-0.02276611328125,
0.0021266937255859375,
0.01849365234375,
-0.047821044921875,
0.01374053955078125,
0.0016345977783203125,
0.029510498046875,
-0.0164947509765625,
0.00637054443359375,
-0.016204833984375,
-0.03912353515625,
-0.0005974769592285156,
-0.02459716796875,
0.05206298828125,
0.023681640625,
-0.01195526123046875,
-0.0137939453125,
-0.07281494140625,
0.0380859375,
-0.0006031990051269531,
-0.022857666015625,
-0.02227783203125,
-0.0032958984375,
0.0255584716796875,
0.0019855499267578125,
0.0196533203125,
-0.01605224609375,
0.0210113525390625,
-0.045013427734375,
0.0280303955078125,
0.034576416015625,
0.00028967857360839844,
0.03387451171875,
-0.04766845703125,
0.0296478271484375,
0.0008220672607421875,
-0.0008859634399414062,
-0.022918701171875,
-0.036651611328125,
-0.05078125,
-0.0036411285400390625,
0.0357666015625,
0.06646728515625,
-0.037841796875,
0.049407958984375,
-0.0228271484375,
-0.060943603515625,
-0.0277099609375,
0.00494384765625,
0.043609619140625,
0.001010894775390625,
0.035186767578125,
-0.046051025390625,
-0.0628662109375,
-0.0782470703125,
-0.0030460357666015625,
-0.024627685546875,
-0.015716552734375,
0.004482269287109375,
0.049591064453125,
-0.0176544189453125,
0.0654296875,
-0.0240478515625,
-0.017578125,
-0.0040283203125,
0.030487060546875,
0.0225830078125,
0.047271728515625,
0.0273284912109375,
-0.037506103515625,
-0.032989501953125,
-0.01018524169921875,
-0.043853759765625,
-0.007335662841796875,
0.0006742477416992188,
-0.0014448165893554688,
0.0137176513671875,
0.0101165771484375,
-0.0301055908203125,
0.013916015625,
0.03680419921875,
-0.039520263671875,
0.036834716796875,
0.0061492919921875,
0.0199737548828125,
-0.09747314453125,
-0.011932373046875,
0.005908966064453125,
-0.027191162109375,
-0.030426025390625,
-0.00435638427734375,
0.014617919921875,
-0.0010557174682617188,
-0.05230712890625,
0.01473236083984375,
-0.01467132568359375,
-0.0103302001953125,
-0.014495849609375,
-0.0265045166015625,
0.01381683349609375,
0.045928955078125,
-0.005771636962890625,
0.0657958984375,
0.0338134765625,
-0.044647216796875,
0.0254058837890625,
0.0164337158203125,
-0.04669189453125,
0.03143310546875,
-0.0484619140625,
0.0279541015625,
-0.01033782958984375,
-0.01091766357421875,
-0.07501220703125,
-0.0167236328125,
0.010528564453125,
-0.048736572265625,
0.02410888671875,
-0.0093231201171875,
-0.0296478271484375,
-0.00725555419921875,
-0.040252685546875,
0.03375244140625,
0.033203125,
-0.01360321044921875,
0.01094818115234375,
0.019073486328125,
-0.0111083984375,
-0.0501708984375,
-0.049163818359375,
-0.005863189697265625,
-0.0186004638671875,
-0.034820556640625,
0.0360107421875,
-0.0096893310546875,
-0.0063323974609375,
-0.008819580078125,
0.0063323974609375,
-0.01285552978515625,
-0.01422119140625,
0.027435302734375,
0.0140838623046875,
-0.0301513671875,
0.0134429931640625,
0.01177978515625,
-0.005107879638671875,
0.00148773193359375,
-0.0056915283203125,
0.053253173828125,
-0.0189666748046875,
0.0287017822265625,
-0.037872314453125,
0.0128326416015625,
0.0673828125,
0.00586700439453125,
0.059295654296875,
0.04217529296875,
-0.033050537109375,
-0.0225067138671875,
-0.00510406494140625,
-0.007598876953125,
-0.033050537109375,
0.0675048828125,
-0.0178070068359375,
-0.04583740234375,
0.0271148681640625,
0.021697998046875,
0.0005674362182617188,
0.0626220703125,
0.04620361328125,
-0.0129852294921875,
0.07904052734375,
0.0245513916015625,
-0.0140228271484375,
0.042572021484375,
-0.035614013671875,
-0.00890350341796875,
-0.05517578125,
-0.0204010009765625,
-0.031890869140625,
-0.039764404296875,
-0.041168212890625,
-0.0225372314453125,
0.0121917724609375,
0.008758544921875,
-0.0621337890625,
0.0556640625,
-0.04754638671875,
0.054168701171875,
0.043304443359375,
0.01068115234375,
0.002155303955078125,
0.0051116943359375,
0.0193023681640625,
-0.0103607177734375,
-0.0248565673828125,
-0.03131103515625,
0.1015625,
0.04193115234375,
0.059906005859375,
0.018585205078125,
0.0303497314453125,
0.06500244140625,
-0.003955841064453125,
-0.0596923828125,
0.027069091796875,
-0.01629638671875,
-0.0594482421875,
-0.0255584716796875,
-0.0172882080078125,
-0.045989990234375,
-0.0160675048828125,
-0.032012939453125,
-0.056304931640625,
-0.006229400634765625,
0.01085662841796875,
-0.03094482421875,
0.0033588409423828125,
-0.06158447265625,
0.062744140625,
-0.02008056640625,
-0.0266876220703125,
-0.01189422607421875,
-0.05950927734375,
0.031768798828125,
-0.01004791259765625,
-0.0042724609375,
-0.0219268798828125,
0.0059967041015625,
0.06951904296875,
-0.0206146240234375,
0.022796630859375,
0.012359619140625,
0.0161590576171875,
0.00865936279296875,
0.0159759521484375,
0.040069580078125,
0.01432037353515625,
0.003299713134765625,
0.0360107421875,
0.01386260986328125,
-0.01401519775390625,
-0.031707763671875,
0.0465087890625,
-0.07745361328125,
-0.000446319580078125,
-0.048553466796875,
-0.0318603515625,
-0.01166534423828125,
0.005889892578125,
0.054168701171875,
0.0191497802734375,
-0.0056610107421875,
0.043670654296875,
0.049896240234375,
-0.00848388671875,
0.023284912109375,
0.04083251953125,
-0.0184478759765625,
-0.048095703125,
0.06390380859375,
0.004608154296875,
0.0154266357421875,
0.03466796875,
-0.004833221435546875,
-0.01473236083984375,
-0.03887939453125,
-0.043670654296875,
-0.0034961700439453125,
-0.034210205078125,
-0.0498046875,
-0.035369873046875,
-0.0340576171875,
-0.022430419921875,
0.0022106170654296875,
-0.04541015625,
-0.033966064453125,
-0.04132080078125,
-0.01465606689453125,
0.046630859375,
0.048004150390625,
-0.01226806640625,
0.0328369140625,
-0.05694580078125,
0.016937255859375,
-0.01151275634765625,
0.038970947265625,
-0.028076171875,
-0.06695556640625,
-0.0506591796875,
-0.00959014892578125,
-0.033355712890625,
-0.062347412109375,
0.039520263671875,
0.0290069580078125,
-0.0102386474609375,
0.030029296875,
-0.003864288330078125,
0.0229949951171875,
-0.057525634765625,
0.072265625,
0.0174560546875,
-0.049530029296875,
0.050262451171875,
-0.019561767578125,
0.04608154296875,
0.05194091796875,
0.04364013671875,
-0.0212554931640625,
-0.031707763671875,
-0.044952392578125,
-0.08917236328125,
0.059051513671875,
0.024200439453125,
0.0169830322265625,
0.01366424560546875,
0.01397705078125,
0.022979736328125,
0.0134735107421875,
-0.044586181640625,
-0.026580810546875,
-0.00203704833984375,
-0.03375244140625,
-0.0097808837890625,
-0.0308074951171875,
0.005168914794921875,
-0.030914306640625,
0.0665283203125,
0.0129547119140625,
0.037384033203125,
-0.005344390869140625,
-0.01445770263671875,
-0.03131103515625,
0.008026123046875,
0.054168701171875,
0.0452880859375,
-0.04534912109375,
-0.00891876220703125,
-0.0031414031982421875,
-0.048614501953125,
0.01456451416015625,
-0.00787353515625,
-0.0196685791015625,
0.00604248046875,
0.027923583984375,
0.08514404296875,
-0.0035305023193359375,
-0.057220458984375,
0.02752685546875,
-0.0034732818603515625,
-0.0103302001953125,
-0.04693603515625,
0.04156494140625,
-0.0302886962890625,
0.0282745361328125,
0.038360595703125,
0.0215911865234375,
0.03369140625,
-0.030487060546875,
0.0447998046875,
0.01206207275390625,
-0.03240966796875,
-0.016357421875,
0.051605224609375,
-0.00966644287109375,
-0.061370849609375,
0.06500244140625,
-0.004764556884765625,
-0.032257080078125,
0.049591064453125,
0.037689208984375,
0.0677490234375,
-0.0152435302734375,
0.0212554931640625,
0.045989990234375,
0.0227203369140625,
-0.0193939208984375,
0.01151275634765625,
0.00775909423828125,
-0.05816650390625,
-0.014617919921875,
-0.055511474609375,
-0.038543701171875,
0.037689208984375,
-0.0628662109375,
0.03228759765625,
-0.0477294921875,
-0.018096923828125,
0.017852783203125,
0.0004010200500488281,
-0.0452880859375,
0.049163818359375,
0.0187225341796875,
0.059112548828125,
-0.0736083984375,
0.0411376953125,
0.0489501953125,
-0.021820068359375,
-0.040374755859375,
-0.0164794921875,
-0.004993438720703125,
-0.044525146484375,
0.0323486328125,
0.02471923828125,
0.0218658447265625,
0.005786895751953125,
-0.056976318359375,
-0.08770751953125,
0.062469482421875,
0.004093170166015625,
-0.0682373046875,
-0.0078887939453125,
-0.00323486328125,
0.04901123046875,
-0.024871826171875,
0.0377197265625,
0.049957275390625,
0.006351470947265625,
-0.01580810546875,
-0.05816650390625,
-0.035919189453125,
-0.036773681640625,
0.00455474853515625,
0.0240631103515625,
-0.03106689453125,
0.0540771484375,
-0.022674560546875,
0.0189208984375,
0.01418304443359375,
0.05841064453125,
0.015960693359375,
0.039886474609375,
0.0284271240234375,
0.06353759765625,
0.0347900390625,
-0.0095062255859375,
0.056243896484375,
-0.023651123046875,
0.0498046875,
0.080078125,
0.006450653076171875,
0.08343505859375,
0.02508544921875,
-0.0257110595703125,
0.058380126953125,
0.05902099609375,
-0.0142364501953125,
0.07281494140625,
0.015106201171875,
0.01517486572265625,
0.0017681121826171875,
0.01309967041015625,
-0.0210723876953125,
0.03192138671875,
0.041229248046875,
-0.061065673828125,
0.02001953125,
0.010772705078125,
0.004154205322265625,
-0.022247314453125,
-0.0261077880859375,
0.0577392578125,
-0.004810333251953125,
-0.047760009765625,
0.049957275390625,
0.01141357421875,
0.04144287109375,
-0.032958984375,
0.00030541419982910156,
-0.004421234130859375,
0.0160980224609375,
-0.01177215576171875,
-0.0762939453125,
0.01861572265625,
0.0061187744140625,
-0.0528564453125,
0.0081939697265625,
0.03387451171875,
-0.0284271240234375,
-0.03643798828125,
-0.00966644287109375,
-0.00208282470703125,
0.0159759521484375,
-0.00356292724609375,
-0.04937744140625,
0.01318359375,
-0.00021505355834960938,
-0.0430908203125,
0.0066375732421875,
0.0241241455078125,
0.007007598876953125,
0.037322998046875,
0.031585693359375,
-0.00879669189453125,
-0.0121612548828125,
-0.025115966796875,
0.034332275390625,
-0.054412841796875,
-0.034637451171875,
-0.061248779296875,
0.05615234375,
0.002590179443359375,
-0.06494140625,
0.03192138671875,
0.065185546875,
0.0750732421875,
-0.0272216796875,
0.0506591796875,
-0.0207977294921875,
0.04351806640625,
-0.038726806640625,
0.04315185546875,
-0.05621337890625,
0.0115509033203125,
-0.013153076171875,
-0.0728759765625,
-0.020233154296875,
0.0596923828125,
-0.007587432861328125,
0.00220489501953125,
0.0545654296875,
0.062744140625,
0.01561737060546875,
-0.01287841796875,
0.019012451171875,
0.01348114013671875,
-0.00574493408203125,
0.054962158203125,
0.0309906005859375,
-0.06292724609375,
0.021209716796875,
-0.02655029296875,
-0.001953125,
-0.03369140625,
-0.045166015625,
-0.048797607421875,
-0.053985595703125,
-0.02276611328125,
-0.059783935546875,
0.02374267578125,
0.0792236328125,
0.052642822265625,
-0.0809326171875,
0.004878997802734375,
-0.0153656005859375,
-0.006336212158203125,
-0.01715087890625,
-0.0246429443359375,
0.015716552734375,
-0.00534820556640625,
-0.0596923828125,
0.0186004638671875,
0.02313232421875,
-0.01456451416015625,
-0.0158843994140625,
-0.01788330078125,
-0.010162353515625,
0.007328033447265625,
0.0306243896484375,
0.0201873779296875,
-0.0305328369140625,
-0.040924072265625,
-0.00804901123046875,
-0.01983642578125,
0.0175323486328125,
0.02978515625,
-0.059844970703125,
0.0369873046875,
0.05712890625,
0.0360107421875,
0.0219268798828125,
0.0142364501953125,
0.048797607421875,
-0.0631103515625,
-0.00958251953125,
0.0428466796875,
0.01413726806640625,
0.0250091552734375,
-0.040069580078125,
0.039306640625,
0.0140228271484375,
-0.0714111328125,
-0.0694580078125,
0.005706787109375,
-0.07550048828125,
-0.0139617919921875,
0.0943603515625,
0.01885986328125,
-0.0231170654296875,
0.01139068603515625,
-0.0161285400390625,
0.0204010009765625,
-0.04559326171875,
0.058258056640625,
0.055816650390625,
0.003299713134765625,
-0.03533935546875,
-0.0171661376953125,
0.048004150390625,
0.0312042236328125,
-0.03515625,
-0.0340576171875,
0.02886962890625,
0.02655029296875,
0.029388427734375,
0.021514892578125,
0.016021728515625,
0.034820556640625,
0.00666046142578125,
0.00827789306640625,
0.001079559326171875,
-0.060302734375,
-0.0130767822265625,
0.01593017578125,
0.0017366409301757812,
-0.0259857177734375
]
] |
textattack/roberta-base-rotten_tomatoes | 2021-05-20T22:18:23.000Z | [
"transformers",
"pytorch",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | textattack | null | null | textattack/roberta-base-rotten_tomatoes | 1 | 15,985 | transformers | 2022-03-02T23:29:05 | ## roberta-base fine-tuned with TextAttack on the rotten_tomatoes dataset
This `roberta-base` model was fine-tuned for sequence classificationusing TextAttack
and the rotten_tomatoes dataset loaded using the `nlp` library. The model was fine-tuned
for 10 epochs with a batch size of 128, a learning
rate of 5e-05, and a maximum sequence length of 128.
Since this was a classification task, the model was trained with a cross-entropy loss function.
The best score the model achieved on this task was 0.9033771106941839, as measured by the
eval set accuracy, found after 9 epochs.
For more information, check out [TextAttack on Github](https://github.com/QData/TextAttack).
| 720 | [
[
-0.014923095703125,
-0.04058837890625,
0.03594970703125,
-0.00449371337890625,
-0.01983642578125,
0.00984954833984375,
-0.0255279541015625,
-0.0207366943359375,
-0.00513458251953125,
0.038299560546875,
-0.041534423828125,
-0.048553466796875,
-0.057769775390625,
0.014984130859375,
-0.042144775390625,
0.130859375,
0.035736083984375,
0.017242431640625,
0.00428009033203125,
-0.005794525146484375,
-0.034332275390625,
-0.031219482421875,
-0.040557861328125,
-0.00559234619140625,
0.039825439453125,
0.044830322265625,
0.0540771484375,
0.04876708984375,
0.06268310546875,
0.0166778564453125,
-0.006744384765625,
0.01528167724609375,
-0.054443359375,
0.0149078369140625,
-0.030181884765625,
-0.03997802734375,
-0.0287628173828125,
-0.0115509033203125,
0.035064697265625,
0.00925445556640625,
0.01070404052734375,
0.048095703125,
-0.00963592529296875,
0.072021484375,
-0.03778076171875,
-0.0176849365234375,
-0.04632568359375,
0.014678955078125,
-0.00927734375,
-0.007904052734375,
-0.058837890625,
-0.0254669189453125,
0.006420135498046875,
-0.0201568603515625,
0.0094451904296875,
0.0160980224609375,
0.07379150390625,
0.028961181640625,
-0.01183319091796875,
0.00960540771484375,
-0.05828857421875,
0.07232666015625,
-0.04656982421875,
0.002605438232421875,
0.033050537109375,
0.014739990234375,
-0.002552032470703125,
-0.04852294921875,
-0.0219573974609375,
0.0033664703369140625,
0.0330810546875,
-0.0231781005859375,
-0.0255279541015625,
-0.005828857421875,
0.01806640625,
0.033935546875,
-0.07904052734375,
0.0021495819091796875,
-0.047119140625,
-0.003986358642578125,
0.049102783203125,
0.0263824462890625,
0.0179290771484375,
-0.028839111328125,
-0.038665771484375,
-0.0164031982421875,
-0.02288818359375,
-0.019927978515625,
0.0282135009765625,
0.0277099609375,
-0.01261138916015625,
0.04620361328125,
-0.006076812744140625,
0.06719970703125,
0.00937652587890625,
0.0116424560546875,
0.03643798828125,
-0.016265869140625,
-0.035858154296875,
-0.017242431640625,
0.06256103515625,
0.0289306640625,
0.04632568359375,
-0.0022182464599609375,
-0.01387786865234375,
0.01073455810546875,
0.0504150390625,
-0.042083740234375,
-0.034393310546875,
0.032928466796875,
-0.033172607421875,
-0.046478271484375,
0.01371002197265625,
-0.033966064453125,
-0.02349853515625,
-0.016571044921875,
0.043548583984375,
-0.06591796875,
0.01080322265625,
0.004932403564453125,
-0.034515380859375,
-0.0006718635559082031,
0.01328277587890625,
-0.048675537109375,
0.016265869140625,
0.04205322265625,
0.0770263671875,
-0.0265045166015625,
0.0007262229919433594,
-0.0113372802734375,
-0.0228118896484375,
-0.02056884765625,
0.0550537109375,
-0.0343017578125,
-0.0017671585083007812,
-0.0265655517578125,
-0.00455474853515625,
0.01413726806640625,
-0.0187530517578125,
0.0423583984375,
-0.04022216796875,
0.033477783203125,
0.020843505859375,
-0.05072021484375,
-0.035003662109375,
0.0303497314453125,
-0.04718017578125,
0.056854248046875,
0.036102294921875,
-0.037872314453125,
0.0272216796875,
-0.035369873046875,
-0.0202484130859375,
-0.004486083984375,
0.020660400390625,
-0.0418701171875,
0.0233612060546875,
0.0013370513916015625,
0.0282135009765625,
-0.02789306640625,
0.024658203125,
-0.01050567626953125,
-0.034454345703125,
0.0209808349609375,
-0.0406494140625,
0.06304931640625,
0.027374267578125,
-0.017852783203125,
0.01004791259765625,
-0.0816650390625,
0.050933837890625,
-0.00580596923828125,
-0.025634765625,
-0.0113067626953125,
-0.01517486572265625,
0.0350341796875,
0.006160736083984375,
0.01372528076171875,
-0.0236053466796875,
0.01959228515625,
-0.03961181640625,
0.028594970703125,
0.04669189453125,
0.0014095306396484375,
0.031646728515625,
-0.055328369140625,
0.047943115234375,
-0.006458282470703125,
0.016693115234375,
-0.01410675048828125,
-0.04083251953125,
-0.0423583984375,
-0.00930023193359375,
0.04400634765625,
0.0572509765625,
-0.02392578125,
0.052398681640625,
-0.0216064453125,
-0.058837890625,
-0.02276611328125,
-0.0114593505859375,
0.04803466796875,
0.003620147705078125,
0.036590576171875,
-0.047088623046875,
-0.06072998046875,
-0.056640625,
-0.012237548828125,
-0.0007309913635253906,
-0.006744384765625,
0.0185546875,
0.041473388671875,
-0.003688812255859375,
0.0572509765625,
-0.042236328125,
-0.031890869140625,
-0.0166015625,
0.0197601318359375,
0.033355712890625,
0.04766845703125,
0.031585693359375,
-0.047515869140625,
-0.03643798828125,
-0.0177459716796875,
-0.033599853515625,
0.00493621826171875,
0.00443267822265625,
-0.0001093745231628418,
0.02606201171875,
0.00009429454803466797,
-0.029144287109375,
0.02313232421875,
0.056854248046875,
-0.038848876953125,
0.0400390625,
0.00099945068359375,
0.0272216796875,
-0.106201171875,
-0.006778717041015625,
0.0025310516357421875,
-0.032379150390625,
-0.030029296875,
0.011199951171875,
0.0013647079467773438,
-0.0250396728515625,
-0.042724609375,
0.01180267333984375,
-0.0228729248046875,
-0.00800323486328125,
-0.00738525390625,
-0.0152587890625,
0.00867462158203125,
0.0460205078125,
-0.01385498046875,
0.06976318359375,
0.0272064208984375,
-0.03814697265625,
0.0181121826171875,
0.0095367431640625,
-0.0243377685546875,
0.032684326171875,
-0.0438232421875,
0.0318603515625,
0.006805419921875,
0.006557464599609375,
-0.07403564453125,
-0.01303863525390625,
0.0030536651611328125,
-0.05810546875,
0.0098876953125,
-0.031524658203125,
-0.023223876953125,
-0.0093841552734375,
-0.0302276611328125,
0.034698486328125,
0.0391845703125,
-0.0103607177734375,
0.0269775390625,
0.029571533203125,
0.01097869873046875,
-0.0396728515625,
-0.04339599609375,
0.005657196044921875,
-0.030364990234375,
-0.037109375,
0.04302978515625,
-0.0146942138671875,
-0.0024280548095703125,
-0.01276397705078125,
0.01122283935546875,
-0.0253143310546875,
-0.0203704833984375,
0.030029296875,
0.0161590576171875,
-0.021270751953125,
0.020599365234375,
-0.018829345703125,
-0.0148162841796875,
-0.01328277587890625,
-0.031341552734375,
0.06939697265625,
-0.0214080810546875,
0.02825927734375,
-0.027923583984375,
-0.0014667510986328125,
0.06280517578125,
-0.00820159912109375,
0.05615234375,
0.039276123046875,
-0.024444580078125,
-0.0318603515625,
-0.0094451904296875,
-0.00046372413635253906,
-0.03155517578125,
0.032196044921875,
-0.01788330078125,
-0.040130615234375,
0.031158447265625,
0.01629638671875,
-0.020050048828125,
0.05810546875,
0.032440185546875,
-0.004230499267578125,
0.06103515625,
0.033721923828125,
-0.0238189697265625,
0.039703369140625,
-0.0382080078125,
-0.0159149169921875,
-0.049560546875,
-0.0027370452880859375,
-0.04693603515625,
-0.03863525390625,
-0.047210693359375,
-0.042388916015625,
0.021148681640625,
-0.02056884765625,
-0.0462646484375,
0.05096435546875,
-0.058624267578125,
0.0550537109375,
0.041656494140625,
0.0276336669921875,
0.0124969482421875,
-0.00446319580078125,
0.01216888427734375,
-0.0100860595703125,
-0.02655029296875,
-0.0227203369140625,
0.10406494140625,
0.017059326171875,
0.06512451171875,
0.0167388916015625,
0.0472412109375,
0.06243896484375,
-0.022216796875,
-0.0523681640625,
0.03094482421875,
-0.041961669921875,
-0.054962158203125,
-0.030364990234375,
-0.0193634033203125,
-0.0491943359375,
-0.0277252197265625,
-0.0289306640625,
-0.046844482421875,
-0.01045989990234375,
0.01013946533203125,
-0.0198211669921875,
0.0116424560546875,
-0.044921875,
0.06854248046875,
-0.0164642333984375,
-0.0186309814453125,
-0.022369384765625,
-0.05517578125,
0.0265655517578125,
0.0034847259521484375,
-0.004253387451171875,
-0.01508331298828125,
0.022552490234375,
0.051300048828125,
-0.0277252197265625,
0.03619384765625,
0.0026035308837890625,
0.019012451171875,
0.00727081298828125,
0.01288604736328125,
0.03839111328125,
-0.00414276123046875,
0.00290679931640625,
0.02166748046875,
-0.0003707408905029297,
-0.0174560546875,
-0.03460693359375,
0.04583740234375,
-0.059844970703125,
-0.0023193359375,
-0.0545654296875,
-0.035369873046875,
-0.00670623779296875,
0.004642486572265625,
0.050872802734375,
0.0286865234375,
-0.01372528076171875,
0.041290283203125,
0.054229736328125,
-0.01387786865234375,
0.0172271728515625,
0.03741455078125,
-0.0163116455078125,
-0.042510986328125,
0.06610107421875,
0.00806427001953125,
0.017303466796875,
0.02630615234375,
0.00556182861328125,
0.0006861686706542969,
-0.04217529296875,
-0.04669189453125,
-0.01276397705078125,
-0.030517578125,
-0.04644775390625,
-0.029998779296875,
-0.035247802734375,
-0.00974273681640625,
0.0021228790283203125,
-0.0289306640625,
-0.048431396484375,
-0.03570556640625,
-0.012725830078125,
0.06207275390625,
0.048553466796875,
0.00766754150390625,
0.0185089111328125,
-0.061187744140625,
0.0173187255859375,
-0.0281982421875,
0.031585693359375,
-0.0180511474609375,
-0.07666015625,
-0.039306640625,
-0.01201629638671875,
-0.024017333984375,
-0.06951904296875,
0.043487548828125,
0.035186767578125,
-0.005863189697265625,
0.0094757080078125,
0.00853729248046875,
0.042449951171875,
-0.051605224609375,
0.07537841796875,
-0.00011104345321655273,
-0.061431884765625,
0.0594482421875,
-0.032745361328125,
0.051025390625,
0.04656982421875,
0.035369873046875,
-0.0155792236328125,
-0.046875,
-0.061431884765625,
-0.0833740234375,
0.040069580078125,
0.01502227783203125,
0.017547607421875,
0.005390167236328125,
0.00977325439453125,
0.01654052734375,
0.0017976760864257812,
-0.056182861328125,
-0.01271820068359375,
0.00601959228515625,
-0.04449462890625,
-0.011993408203125,
-0.013824462890625,
-0.0015192031860351562,
-0.029571533203125,
0.0545654296875,
0.0038776397705078125,
0.023345947265625,
-0.001926422119140625,
-0.0256500244140625,
-0.018951416015625,
0.0172271728515625,
0.059234619140625,
0.04449462890625,
-0.035797119140625,
-0.01311492919921875,
0.0025386810302734375,
-0.04937744140625,
0.018035888671875,
-0.01352691650390625,
-0.0175323486328125,
-0.0006852149963378906,
0.0202178955078125,
0.06103515625,
0.0150146484375,
-0.058837890625,
0.022369384765625,
-0.006793975830078125,
0.0111236572265625,
-0.054962158203125,
0.0445556640625,
-0.031280517578125,
0.0217437744140625,
0.040374755859375,
0.02783203125,
0.0303497314453125,
-0.0310211181640625,
0.04779052734375,
0.01371002197265625,
-0.037261962890625,
-0.01416778564453125,
0.04388427734375,
-0.00732421875,
-0.0511474609375,
0.06353759765625,
0.007205963134765625,
-0.03082275390625,
0.055267333984375,
0.04083251953125,
0.07928466796875,
-0.007110595703125,
0.0255279541015625,
0.037200927734375,
0.00870513916015625,
-0.00344085693359375,
0.02557373046875,
0.01206207275390625,
-0.050384521484375,
-0.00853729248046875,
-0.0596923828125,
-0.02960205078125,
0.021575927734375,
-0.06707763671875,
0.0245513916015625,
-0.056671142578125,
-0.0357666015625,
0.0208740234375,
0.0054473876953125,
-0.042236328125,
0.055450439453125,
-0.01149749755859375,
0.06610107421875,
-0.07452392578125,
0.0517578125,
0.04400634765625,
-0.043365478515625,
-0.051666259765625,
-0.00302886962890625,
0.00823211669921875,
-0.035552978515625,
0.047454833984375,
0.0100250244140625,
0.00809478759765625,
0.01153564453125,
-0.048431396484375,
-0.08367919921875,
0.05322265625,
-0.0010328292846679688,
-0.04766845703125,
-0.00653076171875,
-0.0196533203125,
0.05853271484375,
-0.039459228515625,
0.0156707763671875,
0.034942626953125,
0.01468658447265625,
-0.0249786376953125,
-0.06304931640625,
-0.025360107421875,
-0.033416748046875,
0.00038433074951171875,
0.019927978515625,
-0.035369873046875,
0.06915283203125,
-0.024932861328125,
0.013702392578125,
0.0214080810546875,
0.038787841796875,
0.01226043701171875,
0.024169921875,
0.02667236328125,
0.06182861328125,
0.041534423828125,
-0.01605224609375,
0.06561279296875,
-0.03369140625,
0.03887939453125,
0.0926513671875,
0.00952911376953125,
0.08538818359375,
0.014129638671875,
-0.0285186767578125,
0.06353759765625,
0.052154541015625,
-0.023345947265625,
0.050750732421875,
0.023284912109375,
0.0168609619140625,
0.0026454925537109375,
0.0121612548828125,
-0.0101470947265625,
0.043670654296875,
0.0267333984375,
-0.06500244140625,
0.024810791015625,
0.0174102783203125,
0.0007867813110351562,
0.005615234375,
-0.0092620849609375,
0.06353759765625,
-0.0115814208984375,
-0.042449951171875,
0.050201416015625,
0.0090789794921875,
0.039703369140625,
-0.033599853515625,
0.0001360177993774414,
-0.002010345458984375,
0.019805908203125,
-0.01470947265625,
-0.0697021484375,
0.0189666748046875,
0.014373779296875,
-0.06103515625,
0.010955810546875,
0.03863525390625,
-0.0269775390625,
-0.03497314453125,
0.0016222000122070312,
-0.001705169677734375,
0.0163726806640625,
0.0018157958984375,
-0.048614501953125,
0.0211639404296875,
0.0019521713256835938,
-0.0254058837890625,
0.0085601806640625,
0.0272369384765625,
-0.0011444091796875,
0.046051025390625,
0.043243408203125,
0.0008835792541503906,
-0.0034961700439453125,
-0.01097869873046875,
0.041290283203125,
-0.06549072265625,
-0.039947509765625,
-0.05657958984375,
0.05718994140625,
-0.0020999908447265625,
-0.06549072265625,
0.048187255859375,
0.058258056640625,
0.06353759765625,
-0.038330078125,
0.044036865234375,
-0.022369384765625,
0.04522705078125,
-0.04742431640625,
0.0455322265625,
-0.044921875,
0.0112762451171875,
-0.0305938720703125,
-0.065673828125,
-0.0196990966796875,
0.057373046875,
-0.0051422119140625,
0.01404571533203125,
0.05474853515625,
0.047119140625,
0.01343536376953125,
-0.01116943359375,
0.02056884765625,
0.0183868408203125,
-0.006137847900390625,
0.050018310546875,
0.018157958984375,
-0.05792236328125,
0.0224151611328125,
-0.00873565673828125,
-0.004993438720703125,
-0.0254364013671875,
-0.05517578125,
-0.04388427734375,
-0.057647705078125,
-0.01198577880859375,
-0.053375244140625,
0.0396728515625,
0.07525634765625,
0.06011962890625,
-0.07489013671875,
-0.0087127685546875,
-0.01345062255859375,
-0.004734039306640625,
-0.0276031494140625,
-0.0263214111328125,
0.0177001953125,
-0.01873779296875,
-0.06243896484375,
0.0248565673828125,
0.0240478515625,
-0.00711822509765625,
-0.019500732421875,
-0.0085296630859375,
-0.024078369140625,
0.00502777099609375,
0.02667236328125,
0.0120849609375,
-0.039947509765625,
-0.04498291015625,
-0.006793975830078125,
-0.01361083984375,
0.01300811767578125,
0.036407470703125,
-0.06561279296875,
0.03460693359375,
0.0556640625,
0.030670166015625,
0.0290679931640625,
0.024932861328125,
0.06768798828125,
-0.0687255859375,
0.0027523040771484375,
0.03948974609375,
0.00542449951171875,
0.0183258056640625,
-0.043243408203125,
0.053741455078125,
0.006900787353515625,
-0.07049560546875,
-0.06671142578125,
0.01251220703125,
-0.08465576171875,
-0.02532958984375,
0.07012939453125,
0.0022525787353515625,
-0.02740478515625,
0.008270263671875,
-0.01358795166015625,
0.0117950439453125,
-0.03570556640625,
0.06298828125,
0.052886962890625,
0.0024852752685546875,
-0.034027099609375,
-0.0116424560546875,
0.055419921875,
0.02386474609375,
-0.04058837890625,
-0.0217742919921875,
0.039337158203125,
0.03271484375,
0.02716064453125,
0.0259552001953125,
-0.0006051063537597656,
0.034454345703125,
0.0018701553344726562,
0.0029430389404296875,
-0.00000858306884765625,
-0.06939697265625,
-0.02447509765625,
0.0209197998046875,
0.001972198486328125,
-0.0234832763671875
]
] |
hf-internal-testing/tiny-stable-diffusion-xl-pipe | 2023-09-26T13:51:51.000Z | [
"diffusers",
"onnx",
"text-to-image",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | text-to-image | hf-internal-testing | null | null | hf-internal-testing/tiny-stable-diffusion-xl-pipe | 0 | 15,946 | diffusers | 2023-07-10T06:53:23 | ---
library_name: diffusers
tags:
- text-to-image
---
```python
from diffusers import DiffusionPipeline
pipe = DiffusionPipeline.from_pretrained("hf-internal-testing/tiny-stable-diffusion-xl-pipe")
```
The pipeline was created using this [Colab Notebook](https://colab.research.google.com/gist/sayakpaul/a7b986af7e9ea26562eed4ec1410d766/scratchpad.ipynb). | 359 | [
[
-0.032989501953125,
-0.055450439453125,
0.026519775390625,
0.0200958251953125,
-0.00858306884765625,
0.00015461444854736328,
0.01263427734375,
0.03521728515625,
0.0284881591796875,
0.014251708984375,
-0.0350341796875,
-0.0017642974853515625,
-0.033843994140625,
0.00160980224609375,
-0.0364990234375,
0.086669921875,
-0.00720977783203125,
0.005107879638671875,
0.008514404296875,
0.0107421875,
0.00800323486328125,
0.006565093994140625,
-0.052764892578125,
-0.0264129638671875,
0.0296783447265625,
0.0313720703125,
0.0294189453125,
0.0255279541015625,
0.015167236328125,
0.01910400390625,
-0.02410888671875,
-0.034088134765625,
-0.03509521484375,
0.004444122314453125,
0.0163421630859375,
-0.02325439453125,
-0.0093536376953125,
-0.00495147705078125,
0.08343505859375,
0.0433349609375,
-0.0131683349609375,
0.01812744140625,
0.03472900390625,
0.0283203125,
-0.050018310546875,
-0.006053924560546875,
-0.0119781494140625,
0.0131072998046875,
-0.0264434814453125,
-0.00811767578125,
-0.00919342041015625,
-0.03253173828125,
0.0235595703125,
-0.0380859375,
0.0178985595703125,
-0.00662994384765625,
0.07421875,
0.03851318359375,
-0.030303955078125,
-0.0222625732421875,
-0.03411865234375,
0.03692626953125,
-0.032562255859375,
0.01493072509765625,
0.0037384033203125,
0.018402099609375,
-0.0239105224609375,
-0.1080322265625,
-0.02044677734375,
0.008148193359375,
0.0010595321655273438,
0.005901336669921875,
0.0236663818359375,
-0.0014715194702148438,
0.017730712890625,
0.027557373046875,
-0.04278564453125,
-0.0248260498046875,
-0.05230712890625,
-0.0223236083984375,
0.03375244140625,
0.019256591796875,
0.0076446533203125,
0.0008325576782226562,
-0.014190673828125,
-0.0016298294067382812,
-0.033905029296875,
-0.0032958984375,
0.0307159423828125,
-0.0026111602783203125,
-0.025299072265625,
0.052276611328125,
-0.0020904541015625,
0.0295562744140625,
0.01151275634765625,
-0.007350921630859375,
0.040557861328125,
-0.0193328857421875,
-0.027099609375,
0.0181427001953125,
0.05535888671875,
-0.0105743408203125,
0.0011911392211914062,
0.035888671875,
0.0012845993041992188,
-0.01180267333984375,
0.00965118408203125,
-0.10748291015625,
-0.05548095703125,
0.0174102783203125,
-0.037078857421875,
-0.0272064208984375,
-0.0007100105285644531,
-0.055084228515625,
-0.009521484375,
0.00644683837890625,
0.037384033203125,
-0.006786346435546875,
-0.037384033203125,
-0.024139404296875,
-0.053558349609375,
0.0178680419921875,
-0.004070281982421875,
-0.035797119140625,
0.0219268798828125,
0.029327392578125,
0.08642578125,
0.01526641845703125,
-0.02105712890625,
-0.05767822265625,
-0.017974853515625,
-0.0160369873046875,
0.0565185546875,
-0.01389312744140625,
-0.0268707275390625,
-0.0108642578125,
0.016082763671875,
-0.003971099853515625,
-0.041351318359375,
0.031219482421875,
-0.02069091796875,
0.03045654296875,
0.0235595703125,
-0.043914794921875,
0.01448822021484375,
-0.01358795166015625,
-0.004817962646484375,
0.06494140625,
0.036102294921875,
-0.089111328125,
0.0256195068359375,
-0.0643310546875,
-0.0126495361328125,
0.006526947021484375,
0.00994110107421875,
-0.049835205078125,
-0.0248260498046875,
-0.031890869140625,
0.006320953369140625,
0.016937255859375,
-0.00980377197265625,
-0.037811279296875,
-0.0421142578125,
-0.00940704345703125,
-0.00836181640625,
0.1158447265625,
0.047454833984375,
-0.0057220458984375,
0.02764892578125,
-0.05615234375,
0.00044727325439453125,
-0.033477783203125,
-0.0321044921875,
-0.016937255859375,
-0.02227783203125,
0.0226287841796875,
-0.00446319580078125,
-0.0038509368896484375,
-0.051971435546875,
-0.002414703369140625,
-0.034210205078125,
0.056732177734375,
0.063720703125,
0.01104736328125,
0.031036376953125,
-0.00957489013671875,
0.055572509765625,
-0.0015392303466796875,
-0.00514984130859375,
0.01305389404296875,
-0.05645751953125,
-0.06097412109375,
-0.03802490234375,
0.024627685546875,
0.028533935546875,
-0.018585205078125,
0.0253753662109375,
0.005062103271484375,
-0.05047607421875,
-0.02886962890625,
0.030242919921875,
-0.0090179443359375,
0.03863525390625,
-0.0026493072509765625,
-0.0166778564453125,
-0.03778076171875,
-0.0300445556640625,
0.0102386474609375,
0.003997802734375,
-0.0224761962890625,
0.0014238357543945312,
0.0498046875,
-0.038665771484375,
0.0704345703125,
-0.08099365234375,
-0.030487060546875,
0.006259918212890625,
0.0268707275390625,
0.04534912109375,
0.056732177734375,
0.043243408203125,
-0.01454925537109375,
-0.086669921875,
-0.00620269775390625,
-0.02569580078125,
-0.0260162353515625,
0.0161285400390625,
-0.0450439453125,
-0.01512908935546875,
0.0362548828125,
-0.043060302734375,
0.026947021484375,
0.052154541015625,
-0.058349609375,
0.06439208984375,
-0.0234375,
-0.022003173828125,
-0.07623291015625,
0.0195159912109375,
-0.0025997161865234375,
-0.03643798828125,
-0.01392364501953125,
0.0017223358154296875,
0.004550933837890625,
-0.010833740234375,
-0.048187255859375,
0.048492431640625,
-0.0433349609375,
0.0227813720703125,
-0.0190277099609375,
-0.0198516845703125,
-0.01483917236328125,
-0.00603485107421875,
-0.00743865966796875,
0.063232421875,
0.06585693359375,
-0.03680419921875,
0.041259765625,
0.0235595703125,
-0.01280975341796875,
0.0197296142578125,
-0.052215576171875,
0.0090179443359375,
-0.00394439697265625,
0.0183563232421875,
-0.0714111328125,
-0.0227813720703125,
0.0243682861328125,
-0.019866943359375,
0.00817108154296875,
-0.04315185546875,
-0.0019311904907226562,
-0.034454345703125,
-0.0281829833984375,
0.032501220703125,
0.07965087890625,
-0.03472900390625,
0.023529052734375,
0.016571044921875,
-0.01641845703125,
-0.036712646484375,
-0.045166015625,
-0.0255279541015625,
-0.047332763671875,
-0.0596923828125,
0.01068878173828125,
-0.035125732421875,
-0.0096588134765625,
-0.01494598388671875,
0.006015777587890625,
-0.058990478515625,
-0.008209228515625,
0.022918701171875,
0.005077362060546875,
-0.0228271484375,
-0.03271484375,
0.015106201171875,
-0.01317596435546875,
0.0291748046875,
-0.0084686279296875,
0.040771484375,
0.0015163421630859375,
0.01477813720703125,
-0.03326416015625,
0.00148773193359375,
0.0209808349609375,
0.028228759765625,
0.032928466796875,
0.06890869140625,
-0.0078125,
-0.03057861328125,
-0.0022525787353515625,
-0.02337646484375,
-0.038909912109375,
-0.00559234619140625,
-0.0206756591796875,
-0.039581298828125,
0.028228759765625,
-0.02410888671875,
-0.0145111083984375,
0.01470947265625,
0.0399169921875,
-0.0215301513671875,
0.060150146484375,
0.028961181640625,
0.033935546875,
0.034332275390625,
-0.05828857421875,
-0.00316619873046875,
-0.06817626953125,
0.00270843505859375,
-0.04632568359375,
-0.01392364501953125,
-0.017822265625,
-0.007762908935546875,
0.03802490234375,
0.031463623046875,
-0.055023193359375,
0.00875091552734375,
-0.02984619140625,
0.03607177734375,
0.037628173828125,
-0.0174560546875,
0.0034122467041015625,
-0.02056884765625,
-0.035675048828125,
0.00339508056640625,
-0.0265655517578125,
-0.0284881591796875,
0.0843505859375,
0.019989013671875,
0.06890869140625,
-0.01386260986328125,
0.06463623046875,
0.00370025634765625,
0.034515380859375,
-0.046844482421875,
-0.003204345703125,
0.005374908447265625,
-0.0672607421875,
-0.013092041015625,
-0.01451873779296875,
-0.08013916015625,
0.0222015380859375,
0.003879547119140625,
-0.02947998046875,
0.004367828369140625,
0.02410888671875,
-0.0249786376953125,
0.0023040771484375,
-0.058563232421875,
0.090087890625,
0.01033782958984375,
-0.041351318359375,
0.00206756591796875,
-0.0308380126953125,
0.03350830078125,
-0.00820159912109375,
0.0008168220520019531,
0.0206146240234375,
-0.0003380775451660156,
0.055755615234375,
-0.08087158203125,
0.0345458984375,
-0.03839111328125,
-0.00208282470703125,
0.039093017578125,
0.00518798828125,
-0.0005650520324707031,
0.0302276611328125,
-0.012542724609375,
0.0074005126953125,
0.0194854736328125,
-0.04205322265625,
-0.002185821533203125,
0.0667724609375,
-0.056182861328125,
-0.011749267578125,
-0.0523681640625,
-0.01486968994140625,
0.03192138671875,
0.044952392578125,
0.0435791015625,
0.0243988037109375,
0.004390716552734375,
0.01033782958984375,
0.0180816650390625,
0.0212554931640625,
0.068603515625,
0.006832122802734375,
-0.0283660888671875,
-0.0286102294921875,
0.039703369140625,
0.0095977783203125,
0.0029163360595703125,
-0.0010576248168945312,
0.060302734375,
-0.02984619140625,
-0.033172607421875,
-0.043792724609375,
0.0119171142578125,
-0.040191650390625,
-0.0067291259765625,
-0.03094482421875,
-0.043731689453125,
-0.0300750732421875,
-0.0137481689453125,
-0.0294189453125,
-0.01522064208984375,
-0.04290771484375,
0.018646240234375,
0.0162811279296875,
0.049560546875,
-0.0423583984375,
0.0362548828125,
-0.044677734375,
0.017974853515625,
0.039794921875,
0.020599365234375,
-0.0162506103515625,
-0.040802001953125,
-0.01184844970703125,
0.0067596435546875,
-0.053436279296875,
-0.04437255859375,
0.050201416015625,
0.0333251953125,
0.033172607421875,
0.06756591796875,
0.01044464111328125,
0.05621337890625,
-0.028717041015625,
0.0670166015625,
0.0114593505859375,
-0.0721435546875,
0.05535888671875,
-0.02459716796875,
0.006256103515625,
0.00638580322265625,
0.039581298828125,
-0.00820159912109375,
-0.0215606689453125,
-0.03790283203125,
-0.06402587890625,
0.029083251953125,
0.032867431640625,
-0.01010894775390625,
0.0103912353515625,
0.0178680419921875,
0.01971435546875,
-0.0015611648559570312,
-0.0494384765625,
-0.0308074951171875,
-0.025665283203125,
-0.01049041748046875,
0.0037403106689453125,
0.00971221923828125,
-0.027801513671875,
-0.05560302734375,
0.04022216796875,
-0.0176544189453125,
0.0181884765625,
0.032196044921875,
-0.0018215179443359375,
-0.0098114013671875,
0.00293731689453125,
0.0210418701171875,
0.0606689453125,
-0.057373046875,
0.002777099609375,
-0.004718780517578125,
-0.06378173828125,
0.040313720703125,
-0.00856781005859375,
-0.0189208984375,
0.003803253173828125,
-0.00036525726318359375,
0.0290374755859375,
-0.037353515625,
-0.030487060546875,
0.04559326171875,
-0.0160675048828125,
-0.0162506103515625,
-0.07122802734375,
0.0249481201171875,
0.0267486572265625,
0.0077667236328125,
0.0118865966796875,
0.0323486328125,
0.0190277099609375,
-0.031341552734375,
0.00534820556640625,
0.031219482421875,
-0.035125732421875,
-0.007293701171875,
0.06854248046875,
0.02716064453125,
-0.037353515625,
0.06182861328125,
-0.0223541259765625,
-0.01039886474609375,
0.0257415771484375,
0.04437255859375,
0.0748291015625,
0.0020847320556640625,
0.00408172607421875,
0.0399169921875,
-0.0073394775390625,
-0.001903533935546875,
0.015777587890625,
-0.0118865966796875,
-0.039093017578125,
-0.0243377685546875,
-0.052947998046875,
-0.01245880126953125,
0.0005817413330078125,
-0.039794921875,
0.039520263671875,
-0.042449951171875,
-0.01276397705078125,
-0.006679534912109375,
-0.0011148452758789062,
-0.04290771484375,
0.0011768341064453125,
-0.0023403167724609375,
0.06951904296875,
-0.061431884765625,
0.0843505859375,
0.058502197265625,
-0.02972412109375,
-0.0382080078125,
0.0088653564453125,
-0.0166473388671875,
-0.040985107421875,
0.0631103515625,
0.00966644287109375,
-0.01446533203125,
0.0155181884765625,
-0.03228759765625,
-0.060150146484375,
0.076171875,
0.0254058837890625,
-0.0188140869140625,
0.01078033447265625,
-0.020843505859375,
0.03369140625,
-0.0251617431640625,
0.06060791015625,
0.0377197265625,
0.03582763671875,
0.00695037841796875,
-0.0594482421875,
-0.00959014892578125,
-0.042510986328125,
0.00847625732421875,
0.00572967529296875,
-0.048492431640625,
0.0848388671875,
-0.0203704833984375,
-0.0138397216796875,
0.0185089111328125,
0.0292205810546875,
0.021270751953125,
0.0186309814453125,
0.042816162109375,
0.062286376953125,
0.04449462890625,
-0.0041656494140625,
0.048126220703125,
0.0009016990661621094,
0.0535888671875,
0.06390380859375,
-0.01751708984375,
0.0618896484375,
0.050994873046875,
-0.0204010009765625,
0.08795166015625,
0.049896240234375,
0.01250457763671875,
0.057342529296875,
0.039703369140625,
-0.0179595947265625,
0.0006799697875976562,
0.032806396484375,
-0.041412353515625,
0.007282257080078125,
0.018402099609375,
-0.019378662109375,
-0.0027408599853515625,
-0.0189208984375,
-0.003803253173828125,
-0.044189453125,
-0.01158905029296875,
0.029541015625,
0.01270294189453125,
-0.04840087890625,
0.0704345703125,
-0.0325927734375,
0.07891845703125,
-0.055908203125,
-0.0009684562683105469,
-0.00139617919921875,
0.047515869140625,
-0.025787353515625,
-0.0732421875,
0.041717529296875,
-0.006275177001953125,
0.012542724609375,
-0.01354217529296875,
0.044036865234375,
-0.01177978515625,
-0.032379150390625,
0.0309906005859375,
0.01074981689453125,
0.031982421875,
0.007274627685546875,
-0.0399169921875,
-0.00484466552734375,
-0.002532958984375,
-0.04119873046875,
0.019195556640625,
0.007526397705078125,
0.05218505859375,
0.05389404296875,
0.03900146484375,
0.031341552734375,
0.040435791015625,
-0.016937255859375,
0.03839111328125,
-0.045135498046875,
-0.057708740234375,
-0.0343017578125,
0.0604248046875,
-0.022552490234375,
-0.048370361328125,
0.037109375,
0.0513916015625,
0.0645751953125,
-0.0250701904296875,
0.063232421875,
-0.0209197998046875,
0.02679443359375,
-0.01861572265625,
0.06939697265625,
-0.03790283203125,
-0.009002685546875,
-0.01497650146484375,
-0.07574462890625,
0.0030117034912109375,
0.08380126953125,
0.03643798828125,
-0.0009746551513671875,
0.0859375,
0.06732177734375,
-0.04052734375,
-0.017425537109375,
-0.0162811279296875,
0.046875,
0.01934814453125,
0.02105712890625,
0.0560302734375,
-0.035369873046875,
0.050872802734375,
-0.05145263671875,
-0.0200347900390625,
0.00894927978515625,
-0.0743408203125,
-0.05865478515625,
-0.0201568603515625,
-0.042144775390625,
-0.058074951171875,
-0.0259246826171875,
0.0504150390625,
0.07769775390625,
-0.0540771484375,
-0.056854248046875,
-0.0499267578125,
0.0063018798828125,
-0.00522613525390625,
-0.0261383056640625,
0.024749755859375,
-0.0208282470703125,
-0.0594482421875,
0.0109405517578125,
-0.00922393798828125,
0.02142333984375,
-0.039093017578125,
-0.0341796875,
-0.01039886474609375,
-0.0180816650390625,
0.014434814453125,
0.01727294921875,
-0.0161590576171875,
-0.0276947021484375,
-0.0440673828125,
0.01910400390625,
0.00765228271484375,
0.03314208984375,
-0.04949951171875,
0.0086822509765625,
0.08538818359375,
0.004932403564453125,
0.055450439453125,
-0.0150604248046875,
0.039337158203125,
-0.04327392578125,
0.027984619140625,
0.00348663330078125,
0.044403076171875,
0.0000336766242980957,
-0.0146484375,
0.0428466796875,
0.039825439453125,
-0.07354736328125,
-0.047515869140625,
-0.0096435546875,
-0.08001708984375,
-0.0200958251953125,
0.0787353515625,
-0.0269775390625,
-0.0210723876953125,
-0.0204010009765625,
-0.04095458984375,
0.0211944580078125,
-0.0306854248046875,
0.041656494140625,
0.0303802490234375,
-0.022918701171875,
-0.00576019287109375,
-0.0022602081298828125,
0.03448486328125,
-0.00020635128021240234,
-0.04681396484375,
-0.0252685546875,
0.0162506103515625,
0.08404541015625,
0.0201263427734375,
0.062744140625,
0.005489349365234375,
-0.0104522705078125,
0.03741455078125,
-0.0223846435546875,
0.022979736328125,
-0.00995635986328125,
-0.0115966796875,
0.018585205078125,
0.00027871131896972656,
-0.0220794677734375
]
] |
bigscience/bloomz-3b | 2023-05-27T17:26:10.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"bloom",
"text-generation",
"ak",
"ar",
"as",
"bm",
"bn",
"ca",
"code",
"en",
"es",
"eu",
"fon",
"fr",
"gu",
"hi",
"id",
"ig",
"ki",
"kn",
"lg",
"ln",
"ml",
"mr",
"ne",
"nso",
"ny",
"or",
"pa",
"pt",
"rn",
"rw",
"sn",
"st",
"sw",
"ta",
"te",
"tn",
"ts",
"tum",
"tw",
"ur",
"vi",
"wo",
"xh",
"yo",
"zh",
"zu",
"dataset:bigscience/xP3",
"arxiv:2211.01786",
"license:bigscience-bloom-rail-1.0",
"model-index",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | bigscience | null | null | bigscience/bloomz-3b | 71 | 15,912 | transformers | 2022-10-08T16:47:24 | ---
datasets:
- bigscience/xP3
license: bigscience-bloom-rail-1.0
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
pipeline_tag: text-generation
widget:
- text: "一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。Would you rate the previous review as positive, neutral or negative?"
example_title: "zh-en sentiment"
- text: "一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。你认为这句话的立场是赞扬、中立还是批评?"
example_title: "zh-zh sentiment"
- text: "Suggest at least five related search terms to \"Mạng neural nhân tạo\"."
example_title: "vi-en query"
- text: "Proposez au moins cinq mots clés concernant «Réseau de neurones artificiels»."
example_title: "fr-fr query"
- text: "Explain in a sentence in Telugu what is backpropagation in neural networks."
example_title: "te-en qa"
- text: "Why is the sky blue?"
example_title: "en-en qa"
- text: "Write a fairy tale about a troll saving a princess from a dangerous dragon. The fairy tale is a masterpiece that has achieved praise worldwide and its moral is \"Heroes Come in All Shapes and Sizes\". Story (in Spanish):"
example_title: "es-en fable"
- text: "Write a fable about wood elves living in a forest that is suddenly invaded by ogres. The fable is a masterpiece that has achieved praise worldwide and its moral is \"Violence is the last refuge of the incompetent\". Fable (in Hindi):"
example_title: "hi-en fable"
model-index:
- name: bloomz-3b1
results:
- task:
type: Coreference resolution
dataset:
type: winogrande
name: Winogrande XL (xl)
config: xl
split: validation
revision: a80f460359d1e9a67c006011c94de42a8759430c
metrics:
- type: Accuracy
value: 53.67
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (en)
config: en
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 59.23
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (fr)
config: fr
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 53.01
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (jp)
config: jp
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 52.45
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (pt)
config: pt
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 53.61
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (ru)
config: ru
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 53.97
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (zh)
config: zh
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 60.91
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r1)
config: r1
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 40.1
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r2)
config: r2
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 36.8
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r3)
config: r3
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 40.0
- task:
type: Natural language inference
dataset:
type: super_glue
name: SuperGLUE (cb)
config: cb
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 75.0
- task:
type: Natural language inference
dataset:
type: super_glue
name: SuperGLUE (rte)
config: rte
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 76.17
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ar)
config: ar
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 53.29
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (bg)
config: bg
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 43.82
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (de)
config: de
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 45.26
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (el)
config: el
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 42.61
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (en)
config: en
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 57.31
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (es)
config: es
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 56.14
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (fr)
config: fr
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 55.78
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (hi)
config: hi
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 51.49
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ru)
config: ru
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 47.11
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (sw)
config: sw
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 47.83
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (th)
config: th
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 42.93
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (tr)
config: tr
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 37.23
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ur)
config: ur
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 49.04
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (vi)
config: vi
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 53.98
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (zh)
config: zh
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 54.18
- task:
type: Program synthesis
dataset:
type: openai_humaneval
name: HumanEval
config: None
split: test
revision: e8dc562f5de170c54b5481011dd9f4fa04845771
metrics:
- type: Pass@1
value: 6.29
- type: Pass@10
value: 11.94
- type: Pass@100
value: 19.06
- task:
type: Sentence completion
dataset:
type: story_cloze
name: StoryCloze (2016)
config: "2016"
split: validation
revision: e724c6f8cdf7c7a2fb229d862226e15b023ee4db
metrics:
- type: Accuracy
value: 87.33
- task:
type: Sentence completion
dataset:
type: super_glue
name: SuperGLUE (copa)
config: copa
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 76.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (et)
config: et
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 53.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (ht)
config: ht
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 64.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (id)
config: id
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 70.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (it)
config: it
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 53.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (qu)
config: qu
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 56.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (sw)
config: sw
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 66.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (ta)
config: ta
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 59.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (th)
config: th
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 63.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (tr)
config: tr
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 61.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (vi)
config: vi
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 77.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (zh)
config: zh
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 73.0
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (ar)
config: ar
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 80.61
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (es)
config: es
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 85.9
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (eu)
config: eu
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 70.95
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (hi)
config: hi
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 78.89
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (id)
config: id
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 82.99
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (my)
config: my
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 49.9
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (ru)
config: ru
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 61.42
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (sw)
config: sw
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 69.69
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (te)
config: te
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 73.66
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (zh)
config: zh
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 84.32
---

# Table of Contents
1. [Model Summary](#model-summary)
2. [Use](#use)
3. [Limitations](#limitations)
4. [Training](#training)
5. [Evaluation](#evaluation)
7. [Citation](#citation)
# Model Summary
> We present BLOOMZ & mT0, a family of models capable of following human instructions in dozens of languages zero-shot. We finetune BLOOM & mT5 pretrained multilingual language models on our crosslingual task mixture (xP3) and find the resulting models capable of crosslingual generalization to unseen tasks & languages.
- **Repository:** [bigscience-workshop/xmtf](https://github.com/bigscience-workshop/xmtf)
- **Paper:** [Crosslingual Generalization through Multitask Finetuning](https://arxiv.org/abs/2211.01786)
- **Point of Contact:** [Niklas Muennighoff](mailto:niklas@hf.co)
- **Languages:** Refer to [bloom](https://huggingface.co/bigscience/bloom) for pretraining & [xP3](https://huggingface.co/datasets/bigscience/xP3) for finetuning language proportions. It understands both pretraining & finetuning languages.
- **BLOOMZ & mT0 Model Family:**
<div class="max-w-full overflow-auto">
<table>
<tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/bigscience/xP3>xP3</a>. Recommended for prompting in English.
</tr>
<tr>
<td>Parameters</td>
<td>300M</td>
<td>580M</td>
<td>1.2B</td>
<td>3.7B</td>
<td>13B</td>
<td>560M</td>
<td>1.1B</td>
<td>1.7B</td>
<td>3B</td>
<td>7.1B</td>
<td>176B</td>
</tr>
<tr>
<td>Finetuned Model</td>
<td><a href=https://huggingface.co/bigscience/mt0-small>mt0-small</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-base>mt0-base</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-large>mt0-large</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-xl>mt0-xl</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl>mt0-xxl</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-560m>bloomz-560m</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-1b1>bloomz-1b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-1b7>bloomz-1b7</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-3b>bloomz-3b</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1>bloomz-7b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a></td>
</tr>
</tr>
<tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/bigscience/xP3mt>xP3mt</a>. Recommended for prompting in non-English.</th>
</tr>
<tr>
<td>Finetuned Model</td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl-mt>mt0-xxl-mt</a></td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1-mt>bloomz-7b1-mt</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-mt>bloomz-mt</a></td>
</tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/Muennighoff/P3>P3</a>. Released for research purposes only. Strictly inferior to above models!</th>
</tr>
<tr>
<td>Finetuned Model</td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl-p3>mt0-xxl-p3</a></td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1-p3>bloomz-7b1-p3</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-p3>bloomz-p3</a></td>
</tr>
<th colspan="12">Original pretrained checkpoints. Not recommended.</th>
<tr>
<td>Pretrained Model</td>
<td><a href=https://huggingface.co/google/mt5-small>mt5-small</a></td>
<td><a href=https://huggingface.co/google/mt5-base>mt5-base</a></td>
<td><a href=https://huggingface.co/google/mt5-large>mt5-large</a></td>
<td><a href=https://huggingface.co/google/mt5-xl>mt5-xl</a></td>
<td><a href=https://huggingface.co/google/mt5-xxl>mt5-xxl</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-560m>bloom-560m</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-1b1>bloom-1b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-1b7>bloom-1b7</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-3b>bloom-3b</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-7b1>bloom-7b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloom>bloom</a></td>
</tr>
</table>
</div>
# Use
## Intended use
We recommend using the model to perform tasks expressed in natural language. For example, given the prompt "*Translate to English: Je t’aime.*", the model will most likely answer "*I love you.*". Some prompt ideas from our paper:
- 一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。你认为这句话的立场是赞扬、中立还是批评?
- Suggest at least five related search terms to "Mạng neural nhân tạo".
- Write a fairy tale about a troll saving a princess from a dangerous dragon. The fairy tale is a masterpiece that has achieved praise worldwide and its moral is "Heroes Come in All Shapes and Sizes". Story (in Spanish):
- Explain in a sentence in Telugu what is backpropagation in neural networks.
**Feel free to share your generations in the Community tab!**
## How to use
### CPU
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-3b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
### GPU
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers accelerate
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-3b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype="auto", device_map="auto")
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
### GPU in 8bit
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers accelerate bitsandbytes
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-3b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto", load_in_8bit=True)
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
<!-- Necessary for whitespace -->
###
# Limitations
**Prompt Engineering:** The performance may vary depending on the prompt. For BLOOMZ models, we recommend making it very clear when the input stops to avoid the model trying to continue it. For example, the prompt "*Translate to English: Je t'aime*" without the full stop (.) at the end, may result in the model trying to continue the French sentence. Better prompts are e.g. "*Translate to English: Je t'aime.*", "*Translate to English: Je t'aime. Translation:*" "*What is "Je t'aime." in English?*", where it is clear for the model when it should answer. Further, we recommend providing the model as much context as possible. For example, if you want it to answer in Telugu, then tell the model, e.g. "*Explain in a sentence in Telugu what is backpropagation in neural networks.*".
# Training
## Model
- **Architecture:** Same as [bloom-3b](https://huggingface.co/bigscience/bloom-3b), also refer to the `config.json` file
- **Finetuning steps:** 2000
- **Finetuning tokens:** 8.39 billion
- **Finetuning layout:** 2x pipeline parallel, 1x tensor parallel, 64x data parallel
- **Precision:** float16
## Hardware
- **CPUs:** AMD CPUs with 512GB memory per node
- **GPUs:** 128 A100 80GB GPUs with 8 GPUs per node (16 nodes) using NVLink 4 inter-gpu connects, 4 OmniPath links
- **Communication:** NCCL-communications network with a fully dedicated subnet
## Software
- **Orchestration:** [Megatron-DeepSpeed](https://github.com/bigscience-workshop/Megatron-DeepSpeed)
- **Optimizer & parallelism:** [DeepSpeed](https://github.com/microsoft/DeepSpeed)
- **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch) (pytorch-1.11 w/ CUDA-11.5)
- **FP16 if applicable:** [apex](https://github.com/NVIDIA/apex)
# Evaluation
We refer to Table 7 from our [paper](https://arxiv.org/abs/2211.01786) & [bigscience/evaluation-results](https://huggingface.co/datasets/bigscience/evaluation-results) for zero-shot results on unseen tasks. The sidebar reports zero-shot performance of the best prompt per dataset config.
# Citation
```bibtex
@article{muennighoff2022crosslingual,
title={Crosslingual generalization through multitask finetuning},
author={Muennighoff, Niklas and Wang, Thomas and Sutawika, Lintang and Roberts, Adam and Biderman, Stella and Scao, Teven Le and Bari, M Saiful and Shen, Sheng and Yong, Zheng-Xin and Schoelkopf, Hailey and others},
journal={arXiv preprint arXiv:2211.01786},
year={2022}
}
``` | 24,191 | [
[
-0.031585693359375,
-0.04339599609375,
0.0235748291015625,
0.030548095703125,
-0.005268096923828125,
-0.00634002685546875,
-0.0248870849609375,
-0.0260162353515625,
0.03118896484375,
-0.0124664306640625,
-0.06854248046875,
-0.039947509765625,
-0.0401611328125,
0.01111602783203125,
0.0008025169372558594,
0.059295654296875,
-0.0103912353515625,
0.0123748779296875,
0.0026493072509765625,
-0.0026912689208984375,
-0.0219268798828125,
-0.0293731689453125,
-0.0555419921875,
-0.045196533203125,
0.0386962890625,
0.01338958740234375,
0.03717041015625,
0.03875732421875,
0.0233917236328125,
0.0281829833984375,
-0.0248870849609375,
0.005222320556640625,
-0.0161590576171875,
-0.01003265380859375,
0.002582550048828125,
-0.029327392578125,
-0.055145263671875,
-0.005161285400390625,
0.043365478515625,
0.044586181640625,
0.01434326171875,
0.021240234375,
0.0236663818359375,
0.039703369140625,
-0.034912109375,
0.02801513671875,
-0.0034542083740234375,
0.029541015625,
-0.01381683349609375,
0.0038852691650390625,
-0.01129913330078125,
-0.0241851806640625,
-0.00347137451171875,
-0.05853271484375,
0.015228271484375,
0.009429931640625,
0.10101318359375,
0.0007486343383789062,
0.004024505615234375,
0.00470733642578125,
-0.0251312255859375,
0.0758056640625,
-0.06622314453125,
0.02996826171875,
0.0311279296875,
-0.003620147705078125,
0.0012159347534179688,
-0.045562744140625,
-0.0599365234375,
-0.004222869873046875,
-0.02459716796875,
0.03192138671875,
-0.01898193359375,
-0.0124969482421875,
0.0186920166015625,
0.0386962890625,
-0.052032470703125,
0.00548553466796875,
-0.025421142578125,
-0.0171356201171875,
0.041778564453125,
0.015167236328125,
0.042877197265625,
-0.023284912109375,
-0.0191497802734375,
-0.032135009765625,
-0.0350341796875,
0.0105438232421875,
0.0125274658203125,
0.040771484375,
-0.0487060546875,
0.0301513671875,
-0.00577545166015625,
0.045562744140625,
0.0214385986328125,
-0.00010257959365844727,
0.057891845703125,
-0.035369873046875,
-0.0280914306640625,
-0.0198516845703125,
0.08929443359375,
0.01654052734375,
0.0034332275390625,
-0.00698089599609375,
0.0084228515625,
-0.01538848876953125,
-0.0007295608520507812,
-0.07183837890625,
-0.00435638427734375,
0.0226593017578125,
-0.0433349609375,
-0.0251617431640625,
-0.00823211669921875,
-0.07464599609375,
0.008544921875,
-0.0159149169921875,
0.051849365234375,
-0.043914794921875,
-0.027679443359375,
0.0162506103515625,
0.0012044906616210938,
0.0157470703125,
0.01178741455078125,
-0.07177734375,
0.01358795166015625,
0.023468017578125,
0.06927490234375,
-0.01161956787109375,
-0.043792724609375,
0.0025386810302734375,
0.006107330322265625,
-0.0114898681640625,
0.039154052734375,
-0.01219940185546875,
-0.029327392578125,
-0.0240936279296875,
0.02337646484375,
-0.03253173828125,
-0.00691986083984375,
0.0419921875,
-0.00859832763671875,
0.0455322265625,
-0.042388916015625,
-0.02545166015625,
-0.01558685302734375,
0.021759033203125,
-0.0396728515625,
0.0797119140625,
0.0157928466796875,
-0.06854248046875,
0.01284027099609375,
-0.072509765625,
-0.0177001953125,
-0.01444244384765625,
-0.0009813308715820312,
-0.051910400390625,
-0.0279541015625,
0.033843994140625,
0.0386962890625,
-0.0173187255859375,
-0.0201416015625,
-0.0225982666015625,
-0.0022678375244140625,
-0.002460479736328125,
-0.01116180419921875,
0.07891845703125,
0.0193634033203125,
-0.04718017578125,
0.017730712890625,
-0.0484619140625,
0.00960540771484375,
0.04168701171875,
-0.015838623046875,
0.0091094970703125,
-0.031494140625,
-0.0023345947265625,
0.034454345703125,
0.02337646484375,
-0.03936767578125,
0.0142364501953125,
-0.040740966796875,
0.04827880859375,
0.04638671875,
-0.0036411285400390625,
0.031982421875,
-0.039306640625,
0.036224365234375,
0.01277923583984375,
0.011688232421875,
-0.0194549560546875,
-0.03363037109375,
-0.0633544921875,
-0.0158233642578125,
0.018890380859375,
0.036407470703125,
-0.039794921875,
0.041717529296875,
-0.0225982666015625,
-0.048919677734375,
-0.026275634765625,
0.001018524169921875,
0.04315185546875,
0.051544189453125,
0.05059814453125,
-0.002971649169921875,
-0.043365478515625,
-0.05889892578125,
0.00012600421905517578,
-0.007732391357421875,
0.0096435546875,
0.040557861328125,
0.057647705078125,
-0.0102386474609375,
0.039703369140625,
-0.047119140625,
-0.004711151123046875,
-0.03033447265625,
0.003032684326171875,
0.0214385986328125,
0.0596923828125,
0.04266357421875,
-0.05743408203125,
-0.03289794921875,
0.00078582763671875,
-0.0687255859375,
0.017364501953125,
0.0015411376953125,
-0.029937744140625,
0.0081024169921875,
0.025299072265625,
-0.0567626953125,
0.035064697265625,
0.02288818359375,
-0.037261962890625,
0.04473876953125,
-0.016815185546875,
0.0176544189453125,
-0.0989990234375,
0.031463623046875,
0.010986328125,
0.0059051513671875,
-0.04833984375,
0.0139312744140625,
0.00574493408203125,
0.004302978515625,
-0.044158935546875,
0.06719970703125,
-0.03656005859375,
0.01242828369140625,
0.002445220947265625,
-0.007404327392578125,
0.017974853515625,
0.054779052734375,
0.01337432861328125,
0.05316162109375,
0.052032470703125,
-0.0506591796875,
0.0230560302734375,
0.043609619140625,
-0.00931549072265625,
0.0272979736328125,
-0.06378173828125,
-0.00464630126953125,
0.0008182525634765625,
0.01129913330078125,
-0.06317138671875,
-0.01776123046875,
0.0311279296875,
-0.055267333984375,
0.046783447265625,
0.00461578369140625,
-0.039947509765625,
-0.062164306640625,
-0.02423095703125,
0.0224609375,
0.040618896484375,
-0.038299560546875,
0.028656005859375,
-0.0009965896606445312,
0.005645751953125,
-0.042205810546875,
-0.07244873046875,
-0.01166534423828125,
-0.0286865234375,
-0.064697265625,
0.04730224609375,
-0.015106201171875,
0.013092041015625,
-0.01800537109375,
0.00494384765625,
-0.00693511962890625,
-0.0034542083740234375,
0.024658203125,
0.0322265625,
-0.02789306640625,
0.004974365234375,
-0.01218414306640625,
0.00563812255859375,
-0.0007853507995605469,
-0.018035888671875,
0.053985595703125,
-0.018951416015625,
-0.007717132568359375,
-0.0565185546875,
0.0114898681640625,
0.03936767578125,
-0.0117950439453125,
0.0682373046875,
0.06915283203125,
-0.033782958984375,
0.007480621337890625,
-0.0296478271484375,
-0.028289794921875,
-0.039703369140625,
0.0108184814453125,
-0.0230865478515625,
-0.048004150390625,
0.053955078125,
0.0207672119140625,
-0.00290679931640625,
0.0560302734375,
0.048004150390625,
0.01084136962890625,
0.07073974609375,
0.042724609375,
-0.005489349365234375,
0.036865234375,
-0.050048828125,
0.01141357421875,
-0.07269287109375,
-0.036285400390625,
-0.0291748046875,
-0.0228271484375,
-0.017486572265625,
-0.0247650146484375,
0.0183868408203125,
0.00576019287109375,
-0.047607421875,
0.038055419921875,
-0.050506591796875,
-0.0014638900756835938,
0.046295166015625,
0.0276641845703125,
-0.00872802734375,
0.000021576881408691406,
-0.036163330078125,
-0.012115478515625,
-0.0555419921875,
-0.0166015625,
0.072509765625,
0.021026611328125,
0.0311737060546875,
-0.0062103271484375,
0.0501708984375,
-0.017242431640625,
-0.0033435821533203125,
-0.0390625,
0.031585693359375,
0.004009246826171875,
-0.052154541015625,
-0.0240020751953125,
-0.02886962890625,
-0.08612060546875,
0.02056884765625,
-0.035125732421875,
-0.0731201171875,
0.0138702392578125,
0.0236968994140625,
-0.05633544921875,
0.036712646484375,
-0.05267333984375,
0.08056640625,
-0.0150299072265625,
-0.058349609375,
0.0124664306640625,
-0.048187255859375,
0.0137481689453125,
0.02783203125,
0.020538330078125,
0.007396697998046875,
0.016082763671875,
0.061981201171875,
-0.045074462890625,
0.06304931640625,
-0.0107269287109375,
0.006641387939453125,
0.0211944580078125,
-0.0158233642578125,
0.0242767333984375,
-0.01123046875,
-0.00457000732421875,
0.00525665283203125,
-0.00540924072265625,
-0.035308837890625,
-0.02642822265625,
0.0604248046875,
-0.06744384765625,
-0.0352783203125,
-0.041259765625,
-0.03955078125,
-0.010467529296875,
0.03631591796875,
0.046600341796875,
0.0168914794921875,
0.005512237548828125,
-0.0038967132568359375,
0.048004150390625,
-0.02508544921875,
0.053131103515625,
0.01096343994140625,
-0.01535797119140625,
-0.01751708984375,
0.07061767578125,
0.005306243896484375,
0.007419586181640625,
0.02911376953125,
0.029144287109375,
-0.0264739990234375,
-0.03033447265625,
-0.038909912109375,
0.036712646484375,
-0.0244598388671875,
-0.0233917236328125,
-0.0650634765625,
-0.0267486572265625,
-0.058837890625,
-0.01348876953125,
-0.031982421875,
-0.032257080078125,
-0.043670654296875,
-0.01293182373046875,
0.0357666015625,
0.03436279296875,
-0.0199737548828125,
0.0255889892578125,
-0.03857421875,
0.026885986328125,
0.01812744140625,
0.02337646484375,
0.01535797119140625,
-0.04010009765625,
-0.016387939453125,
0.0174102783203125,
-0.044158935546875,
-0.051177978515625,
0.051116943359375,
0.0015535354614257812,
0.038818359375,
0.0177001953125,
-0.02777099609375,
0.06109619140625,
-0.034820556640625,
0.062255859375,
0.03179931640625,
-0.0638427734375,
0.04791259765625,
-0.02984619140625,
0.03668212890625,
0.0272216796875,
0.039459228515625,
-0.0301666259765625,
-0.0122833251953125,
-0.057861328125,
-0.0687255859375,
0.05828857421875,
0.025543212890625,
0.002254486083984375,
0.005096435546875,
0.029144287109375,
-0.005588531494140625,
0.006710052490234375,
-0.07257080078125,
-0.045806884765625,
-0.0369873046875,
-0.0195465087890625,
-0.0038852691650390625,
0.006847381591796875,
-0.002178192138671875,
-0.0435791015625,
0.052978515625,
0.0020618438720703125,
0.043365478515625,
0.02178955078125,
0.001556396484375,
-0.0022373199462890625,
0.0080718994140625,
0.0443115234375,
0.03131103515625,
-0.006313323974609375,
-0.0167388916015625,
0.0155181884765625,
-0.05126953125,
0.0008449554443359375,
0.005237579345703125,
-0.021759033203125,
-0.01068115234375,
0.017364501953125,
0.06500244140625,
0.0157470703125,
-0.01108551025390625,
0.033843994140625,
-0.0028285980224609375,
-0.02783203125,
-0.020538330078125,
0.0110626220703125,
0.0252532958984375,
0.01568603515625,
0.0175628662109375,
0.005706787109375,
0.0011663436889648438,
-0.0291900634765625,
0.001865386962890625,
0.0304718017578125,
-0.0193939208984375,
-0.036651611328125,
0.06634521484375,
-0.00389862060546875,
-0.00262451171875,
0.0228271484375,
-0.023468017578125,
-0.05743408203125,
0.04937744140625,
0.048583984375,
0.04541015625,
-0.020599365234375,
0.004886627197265625,
0.076416015625,
0.00617218017578125,
-0.0171661376953125,
0.0252532958984375,
0.0020427703857421875,
-0.040252685546875,
-0.020294189453125,
-0.060302734375,
-0.000031113624572753906,
0.0267791748046875,
-0.0477294921875,
0.028472900390625,
-0.037384033203125,
-0.0171356201171875,
0.01788330078125,
0.0199432373046875,
-0.057708740234375,
0.042144775390625,
0.019927978515625,
0.062255859375,
-0.05535888671875,
0.055908203125,
0.0469970703125,
-0.062255859375,
-0.07623291015625,
-0.0066070556640625,
0.0018148422241210938,
-0.07122802734375,
0.06378173828125,
0.011138916015625,
0.01056671142578125,
0.0123291015625,
-0.046356201171875,
-0.0859375,
0.099609375,
0.005756378173828125,
-0.0185394287109375,
-0.02203369140625,
0.0019588470458984375,
0.04071044921875,
-0.01502227783203125,
0.031341552734375,
0.02569580078125,
0.048583984375,
0.0206451416015625,
-0.068603515625,
0.0265655517578125,
-0.045562744140625,
-0.0032978057861328125,
-0.0034122467041015625,
-0.08453369140625,
0.09161376953125,
-0.01348114013671875,
-0.00934600830078125,
0.0030574798583984375,
0.061126708984375,
0.0283355712890625,
0.01367950439453125,
0.01522064208984375,
0.059539794921875,
0.037109375,
-0.023956298828125,
0.0753173828125,
-0.0288543701171875,
0.041961669921875,
0.058074951171875,
0.0162353515625,
0.04351806640625,
0.025482177734375,
-0.038726806640625,
0.041473388671875,
0.04852294921875,
-0.0221405029296875,
0.0203094482421875,
0.016693115234375,
-0.005062103271484375,
-0.006481170654296875,
0.01123809814453125,
-0.048095703125,
0.007537841796875,
0.0307159423828125,
-0.0219879150390625,
-0.00272369384765625,
0.00693511962890625,
0.0285491943359375,
-0.0029125213623046875,
-0.035491943359375,
0.027496337890625,
0.0085906982421875,
-0.051239013671875,
0.051055908203125,
-0.0036468505859375,
0.074951171875,
-0.04010009765625,
0.018890380859375,
-0.01169586181640625,
0.01302337646484375,
-0.0299224853515625,
-0.056304931640625,
0.01439666748046875,
-0.00450897216796875,
-0.00914764404296875,
-0.01442718505859375,
0.034912109375,
-0.0226593017578125,
-0.045989990234375,
0.0225830078125,
0.02642822265625,
0.00923919677734375,
0.005069732666015625,
-0.08050537109375,
0.0033206939697265625,
-0.0028057098388671875,
-0.0341796875,
0.0139617919921875,
0.01293182373046875,
0.0162200927734375,
0.0540771484375,
0.043792724609375,
0.00998687744140625,
0.0260772705078125,
-0.004886627197265625,
0.0628662109375,
-0.0523681640625,
-0.0360107421875,
-0.06280517578125,
0.041290283203125,
-0.01018524169921875,
-0.0252838134765625,
0.07940673828125,
0.042938232421875,
0.059478759765625,
-0.005046844482421875,
0.060455322265625,
-0.018280029296875,
0.045501708984375,
-0.0305633544921875,
0.07012939453125,
-0.0594482421875,
-0.0185699462890625,
-0.028656005859375,
-0.038116455078125,
-0.0255889892578125,
0.06024169921875,
-0.0203704833984375,
0.041900634765625,
0.05865478515625,
0.0491943359375,
-0.00974273681640625,
-0.00366973876953125,
-0.0045013427734375,
0.0296478271484375,
0.01369476318359375,
0.06396484375,
0.024749755859375,
-0.055572509765625,
0.0286407470703125,
-0.050262451171875,
-0.002735137939453125,
-0.0186920166015625,
-0.04827880859375,
-0.06817626953125,
-0.05218505859375,
-0.0355224609375,
-0.04168701171875,
-0.007656097412109375,
0.0650634765625,
0.05572509765625,
-0.0670166015625,
-0.01496124267578125,
-0.01421356201171875,
0.000042557716369628906,
-0.01122283935546875,
-0.017822265625,
0.055267333984375,
-0.022125244140625,
-0.07110595703125,
0.006465911865234375,
0.0015497207641601562,
0.04010009765625,
-0.00493621826171875,
-0.01412200927734375,
-0.0304412841796875,
-0.0037975311279296875,
0.0233917236328125,
0.048248291015625,
-0.03546142578125,
-0.00738525390625,
0.0123748779296875,
-0.01503753662109375,
0.026519775390625,
0.024200439453125,
-0.039337158203125,
0.007495880126953125,
0.03436279296875,
0.0216827392578125,
0.05230712890625,
-0.01464080810546875,
0.025482177734375,
-0.036163330078125,
0.01715087890625,
0.012725830078125,
0.03466796875,
0.0272979736328125,
-0.03363037109375,
0.028076171875,
0.0195465087890625,
-0.042724609375,
-0.0577392578125,
-0.00804901123046875,
-0.083984375,
-0.0165557861328125,
0.086181640625,
-0.0208892822265625,
-0.050445556640625,
0.02618408203125,
-0.0102996826171875,
0.042510986328125,
-0.0256805419921875,
0.04730224609375,
0.057708740234375,
-0.0230712890625,
-0.0089111328125,
-0.044403076171875,
0.041748046875,
0.043731689453125,
-0.0654296875,
-0.01139068603515625,
0.010833740234375,
0.03265380859375,
0.0309906005859375,
0.03094482421875,
-0.0196533203125,
0.0157318115234375,
-0.0005922317504882812,
0.0163726806640625,
-0.01308441162109375,
0.00257110595703125,
-0.02813720703125,
-0.0025157928466796875,
-0.023345947265625,
-0.0193634033203125
]
] |
mosaicml/mpt-30b | 2023-10-30T21:54:24.000Z | [
"transformers",
"pytorch",
"mpt",
"text-generation",
"Composer",
"MosaicML",
"llm-foundry",
"StreamingDatasets",
"custom_code",
"dataset:allenai/c4",
"dataset:mc4",
"dataset:togethercomputer/RedPajama-Data-1T",
"dataset:bigcode/the-stack-dedup",
"dataset:allenai/s2orc",
"arxiv:2108.12409",
"arxiv:2302.13971",
"arxiv:2205.14135",
"arxiv:2010.04245",
"arxiv:1909.08053",
"arxiv:2302.06675",
"license:apache-2.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | mosaicml | null | null | mosaicml/mpt-30b | 322 | 15,892 | transformers | 2023-06-20T16:29:39 | ---
license: apache-2.0
tags:
- Composer
- MosaicML
- llm-foundry
- StreamingDatasets
datasets:
- allenai/c4
- mc4
- togethercomputer/RedPajama-Data-1T
- bigcode/the-stack-dedup
- allenai/s2orc
inference: false
---
# MPT-30B
MPT-30B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code.
This model was trained by [MosaicML](https://www.mosaicml.com).
MPT-30B is part of the family of Mosaic Pretrained Transformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.
MPT-30B comes with special features that differentiate it from other LLMs, including an 8k token context window (which can be further extended via finetuning; see [MPT-7B-StoryWriter](https://huggingface.co/mosaicml/mpt-7b-storywriter)), support for context-length extrapolation via [ALiBi](https://arxiv.org/abs/2108.12409), and efficient inference + training via FlashAttention. It also has strong coding abilities thanks to its pretraining mix. MPT models can also be served efficiently with both standard HuggingFace pipelines and NVIDIA's [FasterTransformer](https://github.com/NVIDIA/FasterTransformer).
The size of MPT-30B was also specifically chosen to make it easy to deploy on a single GPU—either 1xA100-80GB in 16-bit precision or 1xA100-40GB in 8-bit precision.
This model uses the MosaicML LLM codebase, which can be found in the [llm-foundry repository](https://github.com/mosaicml/llm-foundry). It was trained by MosaicML’s NLP team on the [MosaicML platform](https://www.mosaicml.com/training) for LLM pretraining, finetuning, and inference.
### How is this model different?
MPT-30B is:
* **Licensed for the possibility of commercial use** (unlike [LLaMA](https://arxiv.org/abs/2302.13971)).
* **Trained on a large amount of data** (1T tokens like [LLaMA](https://arxiv.org/abs/2302.13971) vs. 300B for [Pythia](https://github.com/EleutherAI/pythia), 300B for [OpenLLaMA](https://github.com/openlm-research/open_llama), and 800B for [StableLM](https://github.com/Stability-AI/StableLM)).
* **Prepared to handle extremely long inputs** thanks to [ALiBi](https://arxiv.org/abs/2108.12409).
* **Capable of fast training and inference** (via [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf) and [FasterTransformer](https://github.com/NVIDIA/FasterTransformer))
* **Equipped with highly efficient open-source training code** via the [llm-foundry repository](https://github.com/mosaicml/llm-foundry)
### Models finetuned off MPT-30B:
The following models are finetuned on MPT-30B:
* [MPT-30B-Instruct](https://huggingface.co/mosaicml/mpt-30b-instruct): a model for long-form instruction following (especially summarization and question-answering).
Built by finetuning MPT-30B on several carefully curated datasets.
* License: _CC-BY-SA-3.0_
* [MPT-30B-Chat](https://huggingface.co/mosaicml/mpt-30b-chat): a chatbot-like model for dialogue generation.
Built by finetuning MPT-30B on [ShareGPT-Vicuna](https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered), [Camel-AI](https://huggingface.co/camel-ai),
[GPTeacher](https://github.com/teknium1/GPTeacher), [Guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco), [Baize](https://github.com/project-baize/baize-chatbot) and some generated datasets.
* License: _CC-By-NC-SA-4.0_
* [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-30b-chat)
## Model Date
June 22, 2023
## Model License
Apache-2.0
## Documentation
* [Blog post: MPT-30B: Raising the bar for open-source foundation models](https://www.mosaicml.com/blog/mpt-30b)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://mosaicml.me/slack)!
## How to Use
This model is best used with the MosaicML [llm-foundry repository](https://github.com/mosaicml/llm-foundry) for training and finetuning.
```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-30b',
trust_remote_code=True
)
```
Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method.
This is because we use a custom `MPT` model architecture that is not yet part of the Hugging Face `transformers` package.
`MPT` includes options for many training efficiency features such as [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf), [ALiBi](https://arxiv.org/abs/2108.12409), [QK LayerNorm](https://arxiv.org/abs/2010.04245), and more.
To use the optimized [triton implementation](https://github.com/openai/triton) of FlashAttention, you can load the model on GPU (`cuda:0`) with `attn_impl='triton'` and with `bfloat16` precision:
```python
import torch
import transformers
name = 'mosaicml/mpt-30b'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.attn_config['attn_impl'] = 'triton' # change this to use triton-based FlashAttention
config.init_device = 'cuda:0' # For fast initialization directly on GPU!
model = transformers.AutoModelForCausalLM.from_pretrained(
name,
config=config,
torch_dtype=torch.bfloat16, # Load model weights in bfloat16
trust_remote_code=True
)
```
The model was trained initially with a sequence length of 2048 with an additional pretraining stage for sequence length adapation up to 8192. However, ALiBi enables users to increase the maximum sequence length even further during finetuning and/or inference. For example:
```python
import transformers
name = 'mosaicml/mpt-30b'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 16384 # (input + output) tokens can now be up to 16384
model = transformers.AutoModelForCausalLM.from_pretrained(
name,
config=config,
trust_remote_code=True
)
```
This model was trained with the MPT-30B tokenizer which is identical to the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('mosaicml/mpt-30b')
```
The model can then be used, for example, within a text-generation pipeline.
Note: when running Torch modules in lower precision, it is best practice to use the [torch.autocast context manager](https://pytorch.org/docs/stable/amp.html).
```python
from transformers import pipeline
with torch.autocast('cuda', dtype=torch.bfloat16):
inputs = tokenizer('Here is a recipe for vegan banana bread:\n', return_tensors="pt").to('cuda')
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
# or using the HF pipeline
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, device='cuda:0')
with torch.autocast('cuda', dtype=torch.bfloat16):
print(
pipe('Here is a recipe for vegan banana bread:\n',
max_new_tokens=100,
do_sample=True,
use_cache=True))
```
## Model Description
The architecture is a modification of a standard decoder-only transformer.
The model has been modified from a standard transformer in the following ways:
* It uses [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf)
* It uses [ALiBi (Attention with Linear Biases)](https://arxiv.org/abs/2108.12409) and does not use positional embeddings
* It does not use biases
| Hyperparameter | Value |
|----------------|-------|
|n_parameters | 29.95B |
|n_layers | 48 |
| n_heads | 64 |
| d_model | 7168 |
| vocab size | 50432 |
| sequence length | 8192 |
## Training Data
### Streaming Datasets
Data was formatted using the MosaicML [StreamingDataset](https://github.com/mosaicml/streaming) library to host our data in object storage and efficiently stream it to our compute cluster during training.
StreamingDataset obviates the need to download the whole dataset before starting training, and allows instant resumption of training from any point in the dataset.
### Data Mix
The model was trained for 1T tokens on the following data mix:
| Data Source | Number of Tokens in Source | Proportion | Effective Number of Tokens | Epochs |
|-------------|----------------------------|------------|----------------------------|--------|
| mC4 3.1.0 - English (200+ words) | 2417.99 B | 33.50% | 335 B | 0.14 |
| c4 - English - SemDedup 80% | 100.42 B | 29.90% | 299 B | 2.98 |
| RedPajama - CommonCrawl | 878.45 B | 8.50% | 85 B | 0.097 |
| The Stack - Selected Languages | 463.78 B | 10.00% | 100 B | 0.22 |
| RedPajama - Wikipedia | 4.87 B | 4.00% | 40 B | 8.21 |
| The Stack - Markdown | 107.07 B | 4.50% | 45 B | 0.42 |
| Semantic Scholar ORC | 48.95 B | 3.30% | 33 B | 0.67 |
| RedPajama - Books | 26.02 B | 3.00% | 30 B | 1.15 |
| RedPajama - arXiv | 28.10 B | 1.90% | 19 B | 0.68 |
| RedPajama - StackExchange | 20.54 B | 1.40% | 14 B |0.68 |
Samples for each batch were selected from one of the datasets with the probability specified above. The examples were shuffled within each dataset, and each example was constructed from as many sequences from that dataset as were necessary to fill the sequence length. To build 8k support into MPT-30B efficiently, we first pre-trained on 1T tokens using sequences that were 2k tokens long, and then trained for an additional 50B tokens using sequences that were 8k tokens long.
The data was tokenized using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer. This BPE tokenizer has a number of desirable characteristics,
most of which are relevant for tokenizing code:
(1) It was trained on a diverse mix of data that includes code (The Pile)
(2) It applies consistent space delimitation, unlike the GPT2 tokenizer which tokenizes inconsistently depending on the presence of prefix spaces
(3) It contains tokens for repeated space characters, which allows superior compression of text with large amounts of repeated space characters.
The model vocabulary size of 50432 was set to be a multiple of 128 (as in [MEGATRON-LM](https://arxiv.org/abs/1909.08053)).
### Training Configuration
The model was trained in three stages using the [MosaicML Platform](https://www.mosaicml.com/platform):
(i) First it was trained on 440 A100-40GBs with a batch size of 1760.
(ii) Then, on 216 A100-40GBs with a batch size of 1728.
(iii) Training was completed on 256 H100-80GBs with a batch size of 512 with 8k context length and 50B tokens.
The model was trained with sharded data parallelism using [FSDP](https://pytorch.org/docs/stable/fsdp.html) and used the [LION](https://arxiv.org/abs/2302.06675) optimizer.
## Limitations and Biases
_The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_
MPT-30B (Base) is **not** intended for deployment without finetuning.
It should not be used for human-facing interactions without further guardrails and user consent.
MPT-30B can produce factually incorrect output, and should not be relied on to produce factually accurate information.
MPT-30B was trained on various public datasets.
While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
## MosaicML Platform
If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-30b).
## Disclaimer
The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.
## Citation
Please cite this model using the following format:
```
@online{MosaicML2023Introducing,
author = {MosaicML NLP Team},
title = {Introducing MPT-30B: Raising the bar
for open-source foundation models},
year = {2023},
url = {www.mosaicml.com/blog/mpt-30b},
note = {Accessed: 2023-06-22},
urldate = {2023-06-22}
}
``` | 12,202 | [
[
-0.03729248046875,
-0.042633056640625,
0.01453399658203125,
0.0270538330078125,
-0.0220184326171875,
0.002208709716796875,
-0.0069427490234375,
-0.0186309814453125,
-0.0016269683837890625,
0.022491455078125,
-0.048187255859375,
-0.03900146484375,
-0.046142578125,
-0.00534820556640625,
-0.0262603759765625,
0.07611083984375,
-0.00485992431640625,
0.002513885498046875,
-0.0015726089477539062,
-0.0030307769775390625,
-0.01244354248046875,
-0.033477783203125,
-0.049652099609375,
-0.0323486328125,
0.037200927734375,
0.0111083984375,
0.0511474609375,
0.0584716796875,
0.036468505859375,
0.0247650146484375,
-0.01221466064453125,
0.0098724365234375,
-0.03533935546875,
-0.0298919677734375,
0.01412200927734375,
-0.0306243896484375,
-0.038238525390625,
0.01140594482421875,
0.0374755859375,
0.019622802734375,
-0.0172576904296875,
0.032501220703125,
-0.0005574226379394531,
0.024932861328125,
-0.035064697265625,
0.0217742919921875,
-0.03167724609375,
0.013427734375,
-0.0162811279296875,
0.00835418701171875,
-0.040863037109375,
-0.0178985595703125,
0.002689361572265625,
-0.03985595703125,
0.0230865478515625,
0.0007576942443847656,
0.08099365234375,
0.0280303955078125,
-0.0270843505859375,
0.0024051666259765625,
-0.0406494140625,
0.048980712890625,
-0.0611572265625,
0.028472900390625,
0.01806640625,
0.02203369140625,
0.004047393798828125,
-0.0870361328125,
-0.05120849609375,
-0.0128326416015625,
-0.00362396240234375,
0.0275115966796875,
-0.0228118896484375,
0.005054473876953125,
0.0295562744140625,
0.041046142578125,
-0.043212890625,
0.003688812255859375,
-0.030303955078125,
-0.0139312744140625,
0.03594970703125,
0.01763916015625,
0.019287109375,
-0.020965576171875,
-0.0404052734375,
-0.0290679931640625,
-0.057464599609375,
0.0032520294189453125,
0.021026611328125,
-0.005138397216796875,
-0.036041259765625,
0.046722412109375,
-0.0029659271240234375,
0.04510498046875,
0.01277923583984375,
0.0010538101196289062,
0.0264892578125,
-0.022552490234375,
-0.031219482421875,
-0.01390838623046875,
0.08184814453125,
0.01885986328125,
0.004970550537109375,
-0.0017271041870117188,
-0.004039764404296875,
-0.00814056396484375,
0.00362396240234375,
-0.079345703125,
-0.0271148681640625,
0.01812744140625,
-0.0313720703125,
-0.01222991943359375,
-0.002040863037109375,
-0.040252685546875,
-0.01442718505859375,
-0.00937652587890625,
0.043548583984375,
-0.054351806640625,
-0.0167999267578125,
0.0009412765502929688,
-0.01041412353515625,
0.0277557373046875,
0.01384735107421875,
-0.07073974609375,
0.00333404541015625,
0.033721923828125,
0.0748291015625,
-0.01152801513671875,
-0.03924560546875,
-0.0110931396484375,
-0.0076446533203125,
-0.0005536079406738281,
0.044769287109375,
-0.01039886474609375,
-0.015777587890625,
-0.0254364013671875,
0.00974273681640625,
-0.0187835693359375,
-0.03173828125,
0.0171661376953125,
-0.030059814453125,
0.034393310546875,
-0.00978851318359375,
-0.034942626953125,
-0.015472412109375,
0.008544921875,
-0.04290771484375,
0.0782470703125,
0.02764892578125,
-0.05865478515625,
0.01983642578125,
-0.0633544921875,
-0.01071929931640625,
-0.004268646240234375,
0.00719451904296875,
-0.053009033203125,
-0.0047760009765625,
0.0294952392578125,
0.037933349609375,
-0.0279541015625,
0.021392822265625,
-0.01178741455078125,
-0.03851318359375,
0.01255035400390625,
-0.047576904296875,
0.076416015625,
0.029327392578125,
-0.045806884765625,
0.01318359375,
-0.051177978515625,
-0.0125885009765625,
0.0200958251953125,
-0.035919189453125,
0.04144287109375,
-0.0179290771484375,
0.0038604736328125,
0.01250457763671875,
0.01082611083984375,
-0.048248291015625,
0.00807952880859375,
-0.030426025390625,
0.0501708984375,
0.05682373046875,
-0.014373779296875,
0.0227508544921875,
-0.035491943359375,
0.031524658203125,
0.011566162109375,
0.031524658203125,
-0.0200347900390625,
-0.048126220703125,
-0.07745361328125,
-0.032958984375,
0.026123046875,
0.039520263671875,
-0.06658935546875,
0.0256195068359375,
-0.01508331298828125,
-0.056304931640625,
-0.05657958984375,
-0.0026683807373046875,
0.032379150390625,
0.037322998046875,
0.042388916015625,
-0.02374267578125,
-0.046417236328125,
-0.055389404296875,
0.00321197509765625,
-0.005886077880859375,
-0.0052642822265625,
0.02001953125,
0.0418701171875,
-0.0228118896484375,
0.07257080078125,
-0.0269317626953125,
0.00047779083251953125,
-0.01690673828125,
0.0157318115234375,
0.035125732421875,
0.046234130859375,
0.04296875,
-0.050323486328125,
-0.048126220703125,
-0.01508331298828125,
-0.051239013671875,
0.006595611572265625,
-0.0077056884765625,
-0.007450103759765625,
0.01349639892578125,
0.017669677734375,
-0.069580078125,
0.034332275390625,
0.048797607421875,
-0.0274505615234375,
0.039825439453125,
-0.004650115966796875,
0.0005922317504882812,
-0.09893798828125,
0.007144927978515625,
-0.007579803466796875,
-0.0180511474609375,
-0.039215087890625,
-0.00887298583984375,
0.0067596435546875,
-0.006191253662109375,
-0.06732177734375,
0.038177490234375,
-0.0241241455078125,
0.006633758544921875,
-0.011993408203125,
-0.0202484130859375,
-0.006885528564453125,
0.06146240234375,
0.014923095703125,
0.06817626953125,
0.03375244140625,
-0.033935546875,
0.0423583984375,
0.031219482421875,
-0.02655029296875,
0.0148162841796875,
-0.04913330078125,
0.01067352294921875,
0.0025196075439453125,
0.0213775634765625,
-0.0653076171875,
-0.01288604736328125,
0.025665283203125,
-0.043182373046875,
0.0240936279296875,
-0.01654052734375,
-0.037811279296875,
-0.048309326171875,
-0.0086669921875,
0.028350830078125,
0.055633544921875,
-0.06219482421875,
0.04644775390625,
0.0099334716796875,
0.01139068603515625,
-0.059173583984375,
-0.06439208984375,
-0.0013408660888671875,
-0.0173797607421875,
-0.05133056640625,
0.0303497314453125,
-0.01012420654296875,
0.011199951171875,
-0.012786865234375,
0.0009031295776367188,
0.005615234375,
0.0018243789672851562,
0.035736083984375,
0.0311431884765625,
-0.02130126953125,
-0.010101318359375,
-0.01111602783203125,
-0.016143798828125,
0.00033092498779296875,
-0.01885986328125,
0.0755615234375,
-0.037200927734375,
-0.0245208740234375,
-0.050872802734375,
-0.0011968612670898438,
0.037811279296875,
-0.0151519775390625,
0.0787353515625,
0.081298828125,
-0.0123138427734375,
0.0085601806640625,
-0.0501708984375,
-0.0132598876953125,
-0.03997802734375,
0.025665283203125,
-0.0136260986328125,
-0.058502197265625,
0.04058837890625,
0.01160430908203125,
0.00440216064453125,
0.04998779296875,
0.06549072265625,
-0.01068878173828125,
0.0654296875,
0.032623291015625,
0.0078277587890625,
0.052703857421875,
-0.052398681640625,
-0.004070281982421875,
-0.06744384765625,
-0.024993896484375,
-0.011077880859375,
-0.01763916015625,
-0.04217529296875,
-0.038116455078125,
0.01534271240234375,
-0.00865936279296875,
-0.0552978515625,
0.048492431640625,
-0.04644775390625,
0.02886962890625,
0.060089111328125,
0.02294921875,
0.0019588470458984375,
-0.00514984130859375,
-0.0148162841796875,
0.0099639892578125,
-0.068359375,
-0.0306396484375,
0.0936279296875,
0.0340576171875,
0.04638671875,
-0.003910064697265625,
0.0523681640625,
-0.0019054412841796875,
0.046112060546875,
-0.0306854248046875,
0.03057861328125,
0.0018520355224609375,
-0.056060791015625,
-0.0054931640625,
-0.04351806640625,
-0.05694580078125,
0.0167388916015625,
-0.0213165283203125,
-0.05322265625,
0.02099609375,
0.00963592529296875,
-0.03887939453125,
0.045562744140625,
-0.0655517578125,
0.07586669921875,
-0.0172882080078125,
-0.031646728515625,
0.0108642578125,
-0.055023193359375,
0.0281524658203125,
0.002410888671875,
-0.006988525390625,
-0.006572723388671875,
0.0193939208984375,
0.057769775390625,
-0.036407470703125,
0.0645751953125,
-0.016326904296875,
0.01593017578125,
0.031097412109375,
-0.01171112060546875,
0.0271759033203125,
0.0010204315185546875,
-0.00101470947265625,
0.02801513671875,
0.006671905517578125,
-0.031768798828125,
-0.0178985595703125,
0.03594970703125,
-0.0887451171875,
-0.0423583984375,
-0.03302001953125,
-0.047607421875,
0.0024051666259765625,
0.01043701171875,
0.051025390625,
0.0175628662109375,
-0.00424957275390625,
0.0250244140625,
0.04388427734375,
-0.03369140625,
0.05914306640625,
0.0185089111328125,
-0.004238128662109375,
-0.040557861328125,
0.059783935546875,
-0.0079498291015625,
0.025482177734375,
0.0214385986328125,
0.004177093505859375,
-0.02008056640625,
-0.0360107421875,
-0.042510986328125,
0.0220184326171875,
-0.04254150390625,
-0.033416748046875,
-0.048309326171875,
-0.04119873046875,
-0.0322265625,
0.006076812744140625,
-0.050323486328125,
-0.0268707275390625,
-0.040924072265625,
-0.0033969879150390625,
0.027923583984375,
0.03387451171875,
-0.00496673583984375,
0.061279296875,
-0.05670166015625,
0.01654052734375,
0.024169921875,
0.0277252197265625,
-0.0014638900756835938,
-0.0496826171875,
-0.020965576171875,
0.016815185546875,
-0.046142578125,
-0.0511474609375,
0.04071044921875,
0.0068817138671875,
0.0330810546875,
0.024993896484375,
-0.01349639892578125,
0.05242919921875,
-0.03179931640625,
0.07086181640625,
0.0308380126953125,
-0.0704345703125,
0.01849365234375,
-0.022979736328125,
0.0328369140625,
0.0241851806640625,
0.033935546875,
-0.0367431640625,
-0.01172637939453125,
-0.050689697265625,
-0.049896240234375,
0.0718994140625,
0.039581298828125,
0.01361083984375,
-0.0012769699096679688,
0.02874755859375,
0.0033588409423828125,
0.0093841552734375,
-0.08648681640625,
-0.0198822021484375,
-0.045074462890625,
-0.0281524658203125,
-0.006504058837890625,
-0.007457733154296875,
-0.010223388671875,
-0.03802490234375,
0.055023193359375,
-0.0000057220458984375,
0.05657958984375,
0.01439666748046875,
-0.022552490234375,
0.0034027099609375,
-0.00508880615234375,
0.03302001953125,
0.044830322265625,
-0.0244903564453125,
-0.0016040802001953125,
0.02508544921875,
-0.050811767578125,
0.0053558349609375,
0.0210113525390625,
-0.0029754638671875,
-0.01456451416015625,
0.0227508544921875,
0.0838623046875,
-0.00499725341796875,
-0.0308837890625,
0.04193115234375,
-0.014556884765625,
-0.0162811279296875,
-0.0177154541015625,
0.01027679443359375,
0.0306243896484375,
0.040252685546875,
0.0150299072265625,
0.00159454345703125,
-0.0216064453125,
-0.03558349609375,
0.013519287109375,
0.0189056396484375,
-0.0175628662109375,
-0.01983642578125,
0.07220458984375,
0.004070281982421875,
-0.022857666015625,
0.07025146484375,
-0.0002913475036621094,
-0.03424072265625,
0.0545654296875,
0.048248291015625,
0.056243896484375,
-0.019287109375,
0.0207977294921875,
0.0259552001953125,
0.0211181640625,
-0.0033435821533203125,
-0.0006089210510253906,
-0.014068603515625,
-0.055267333984375,
-0.03082275390625,
-0.060638427734375,
-0.0252685546875,
-0.006011962890625,
-0.03302001953125,
0.0265960693359375,
-0.031097412109375,
-0.0235595703125,
-0.0127410888671875,
0.00482940673828125,
-0.067626953125,
0.015289306640625,
0.0297393798828125,
0.07366943359375,
-0.048309326171875,
0.0762939453125,
0.022186279296875,
-0.04119873046875,
-0.06280517578125,
-0.0304718017578125,
-0.01097869873046875,
-0.08319091796875,
0.0310821533203125,
0.0181732177734375,
0.0165863037109375,
0.00955963134765625,
-0.051483154296875,
-0.0679931640625,
0.122314453125,
0.04119873046875,
-0.03643798828125,
-0.019775390625,
0.0340576171875,
0.036956787109375,
-0.030059814453125,
0.046142578125,
0.0474853515625,
0.0250701904296875,
0.0305633544921875,
-0.06488037109375,
0.005313873291015625,
-0.021881103515625,
-0.00711822509765625,
-0.0003800392150878906,
-0.0670166015625,
0.08636474609375,
-0.00911712646484375,
-0.01025390625,
0.019012451171875,
0.050445556640625,
0.024627685546875,
0.0183563232421875,
0.0202789306640625,
0.0616455078125,
0.0419921875,
-0.0296783447265625,
0.08795166015625,
-0.0238037109375,
0.04718017578125,
0.0689697265625,
0.0186309814453125,
0.04437255859375,
0.0262603759765625,
-0.0144500732421875,
0.040374755859375,
0.07293701171875,
-0.0289154052734375,
0.033538818359375,
-0.0009126663208007812,
-0.0109710693359375,
-0.0168304443359375,
0.01678466796875,
-0.035858154296875,
0.0313720703125,
0.01413726806640625,
-0.04022216796875,
-0.0085601806640625,
0.015106201171875,
0.0097503662109375,
-0.038055419921875,
-0.0097503662109375,
0.044891357421875,
0.01209259033203125,
-0.041595458984375,
0.056549072265625,
-0.0009312629699707031,
0.055267333984375,
-0.041229248046875,
0.0139007568359375,
-0.0247650146484375,
0.016326904296875,
-0.0165252685546875,
-0.05157470703125,
0.0171356201171875,
-0.0013179779052734375,
0.01033782958984375,
-0.0201568603515625,
0.02313232421875,
-0.019287109375,
-0.0303497314453125,
0.01120758056640625,
0.025665283203125,
0.0121612548828125,
-0.01319122314453125,
-0.06219482421875,
0.0029888153076171875,
0.0005259513854980469,
-0.0400390625,
0.01517486572265625,
0.007228851318359375,
0.023223876953125,
0.052215576171875,
0.053192138671875,
-0.00928497314453125,
0.0198211669921875,
-0.012359619140625,
0.07666015625,
-0.05340576171875,
-0.01404571533203125,
-0.07379150390625,
0.05010986328125,
0.00150299072265625,
-0.0274810791015625,
0.050689697265625,
0.047332763671875,
0.063232421875,
-0.0031604766845703125,
0.0299072265625,
-0.01062774658203125,
0.0019016265869140625,
-0.034332275390625,
0.06781005859375,
-0.032501220703125,
0.0181884765625,
-0.013946533203125,
-0.08526611328125,
-0.0244293212890625,
0.039642333984375,
-0.02490234375,
0.0110626220703125,
0.044586181640625,
0.05877685546875,
-0.0167999267578125,
0.01052093505859375,
0.011566162109375,
0.021453857421875,
0.0264434814453125,
0.063232421875,
0.0655517578125,
-0.04913330078125,
0.052764892578125,
-0.034332275390625,
-0.01013946533203125,
-0.00411224365234375,
-0.053314208984375,
-0.07794189453125,
-0.03778076171875,
-0.01520538330078125,
-0.033355712890625,
-0.00482177734375,
0.07135009765625,
0.072509765625,
-0.049530029296875,
-0.022186279296875,
-0.01082611083984375,
-0.0035686492919921875,
-0.00507354736328125,
-0.01412200927734375,
0.036834716796875,
-0.004451751708984375,
-0.059417724609375,
0.01186370849609375,
0.0036334991455078125,
0.02325439453125,
-0.009368896484375,
-0.0121307373046875,
-0.0250091552734375,
0.00518798828125,
0.037353515625,
0.018341064453125,
-0.038909912109375,
-0.0151519775390625,
0.0029811859130859375,
-0.0034027099609375,
0.036376953125,
0.032073974609375,
-0.054351806640625,
0.0196685791015625,
0.02105712890625,
0.032073974609375,
0.0943603515625,
-0.004291534423828125,
0.032501220703125,
-0.038330078125,
0.0147705078125,
0.0178375244140625,
0.037353515625,
0.0246734619140625,
-0.021026611328125,
0.042633056640625,
0.0272216796875,
-0.04443359375,
-0.056304931640625,
0.0024261474609375,
-0.07879638671875,
-0.01080322265625,
0.0791015625,
-0.019195556640625,
-0.0386962890625,
0.0269622802734375,
-0.018829345703125,
0.045379638671875,
-0.0096588134765625,
0.0511474609375,
0.04302978515625,
-0.01033782958984375,
-0.0374755859375,
-0.0240020751953125,
0.030517578125,
0.0245513916015625,
-0.05029296875,
-0.015960693359375,
0.01029205322265625,
0.03985595703125,
0.009674072265625,
0.031341552734375,
-0.01461029052734375,
0.0258636474609375,
0.0029811859130859375,
0.0222320556640625,
-0.032073974609375,
-0.01055145263671875,
-0.013671875,
0.01001739501953125,
-0.0261993408203125,
-0.0157012939453125
]
] |
beomi/KoAlpaca-Polyglot-5.8B | 2023-09-15T01:28:17.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"generated_from_trainer",
"polyglot-ko",
"gpt-neox",
"KoAlpaca",
"ko",
"dataset:KoAlpaca-v1.1b",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | beomi | null | null | beomi/KoAlpaca-Polyglot-5.8B | 50 | 15,878 | transformers | 2023-03-16T15:42:53 | ---
language:
- ko
license: apache-2.0
tags:
- generated_from_trainer
- polyglot-ko
- gpt-neox
- KoAlpaca
datasets:
- KoAlpaca-v1.1b
pipeline_tag: text-generation
base_model: EleutherAI/polyglot-ko-5.8b
model-index:
- name: KoAlpaca-Polyglot-5.8B
results: []
---
Update @ 2023.06.01
- Add Safetensor sharded model weight (max shard = 1GB)
# KoAlpaca-Polyglot-5.8B (v1.1b)
This model is a fine-tuned version of [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) on a KoAlpaca Dataset v1.1b
Detail Codes are available at [KoAlpaca Github Repository](https://github.com/Beomi/KoAlpaca)
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.29.0.dev0
- Pytorch 2.0.0+cu117
- Datasets 2.10.1
- Tokenizers 0.13.2
| 1,044 | [
[
-0.03887939453125,
-0.05328369140625,
0.028594970703125,
0.016571044921875,
-0.04736328125,
-0.0164947509765625,
0.004909515380859375,
-0.05096435546875,
0.0266571044921875,
0.03778076171875,
-0.013397216796875,
-0.0290374755859375,
-0.062744140625,
-0.027587890625,
-0.0019130706787109375,
0.099365234375,
-0.00936126708984375,
0.009307861328125,
-0.00797271728515625,
-0.0357666015625,
-0.031585693359375,
-0.0521240234375,
-0.039215087890625,
-0.05633544921875,
0.043792724609375,
0.0253448486328125,
0.084716796875,
0.041168212890625,
0.0237274169921875,
0.018402099609375,
-0.031341552734375,
-0.004547119140625,
-0.041778564453125,
-0.0173187255859375,
0.000057637691497802734,
-0.0207366943359375,
-0.05194091796875,
-0.0283355712890625,
0.031951904296875,
0.03173828125,
-0.035400390625,
0.032928466796875,
-0.00820159912109375,
0.046600341796875,
-0.070556640625,
0.034454345703125,
-0.031951904296875,
0.021392822265625,
-0.0270843505859375,
0.011871337890625,
-0.02984619140625,
-0.0259246826171875,
0.032440185546875,
-0.05462646484375,
0.0146331787109375,
0.002986907958984375,
0.09197998046875,
0.005748748779296875,
-0.03436279296875,
-0.003917694091796875,
-0.03680419921875,
0.054931640625,
-0.032684326171875,
0.032318115234375,
0.03765869140625,
0.03680419921875,
0.00962066650390625,
-0.054779052734375,
-0.00762176513671875,
0.01120758056640625,
0.00450897216796875,
-0.005764007568359375,
0.0005087852478027344,
0.01128387451171875,
0.03460693359375,
0.0318603515625,
-0.0296173095703125,
-0.00768280029296875,
-0.056884765625,
-0.044647216796875,
0.05035400390625,
0.01319122314453125,
-0.00931549072265625,
0.0067901611328125,
-0.045318603515625,
-0.04833984375,
-0.018768310546875,
0.019683837890625,
0.0416259765625,
0.00814056396484375,
-0.033905029296875,
0.03009033203125,
-0.040557861328125,
0.05804443359375,
0.005161285400390625,
-0.0134735107421875,
0.05621337890625,
-0.008819580078125,
-0.02886962890625,
0.031585693359375,
0.046112060546875,
0.0224456787109375,
0.0084228515625,
0.0058441162109375,
-0.0102386474609375,
-0.003513336181640625,
0.0221099853515625,
-0.07952880859375,
-0.047393798828125,
0.0144195556640625,
-0.021209716796875,
-0.047088623046875,
0.01088714599609375,
-0.06329345703125,
-0.020751953125,
-0.01251983642578125,
0.043304443359375,
-0.0291595458984375,
-0.003665924072265625,
0.036376953125,
0.0025348663330078125,
0.03656005859375,
0.0233612060546875,
-0.05877685546875,
0.031890869140625,
0.019805908203125,
0.056793212890625,
-0.0006732940673828125,
-0.0194854736328125,
-0.01531982421875,
0.00664520263671875,
-0.017974853515625,
0.04156494140625,
0.0011816024780273438,
-0.0274505615234375,
-0.0169677734375,
0.0209503173828125,
-0.010040283203125,
-0.020050048828125,
0.0745849609375,
-0.049072265625,
0.012725830078125,
-0.03936767578125,
-0.0221099853515625,
-0.054412841796875,
0.01186370849609375,
-0.048492431640625,
0.07269287109375,
0.03045654296875,
-0.057159423828125,
0.03448486328125,
-0.0214691162109375,
-0.0093536376953125,
0.012542724609375,
0.0091705322265625,
-0.07843017578125,
-0.0014715194702148438,
0.0247344970703125,
0.01192474365234375,
-0.0191192626953125,
0.0168914794921875,
-0.039825439453125,
-0.0240020751953125,
-0.01220703125,
-0.01102447509765625,
0.06256103515625,
0.028594970703125,
-0.012115478515625,
0.005336761474609375,
-0.051422119140625,
0.045379638671875,
0.02947998046875,
-0.017364501953125,
0.0092926025390625,
-0.02996826171875,
0.029144287109375,
0.01270294189453125,
0.0501708984375,
-0.052642822265625,
-0.0005784034729003906,
-0.00885772705078125,
-0.00518035888671875,
0.05548095703125,
-0.00019860267639160156,
0.0006589889526367188,
-0.044281005859375,
0.02935791015625,
0.0006480216979980469,
0.043365478515625,
0.0249786376953125,
-0.03460693359375,
-0.05059814453125,
-0.0200042724609375,
0.029388427734375,
0.024261474609375,
-0.029022216796875,
0.0467529296875,
-0.0160980224609375,
-0.06341552734375,
-0.0185546875,
0.01495361328125,
0.039581298828125,
0.017913818359375,
0.022125244140625,
-0.043182373046875,
-0.037200927734375,
-0.0697021484375,
0.02557373046875,
0.007717132568359375,
0.017486572265625,
0.013031005859375,
0.047821044921875,
-0.0022640228271484375,
0.054229736328125,
-0.054351806640625,
-0.01418304443359375,
-0.0046234130859375,
0.0006990432739257812,
0.0269012451171875,
0.0307159423828125,
0.0625,
-0.05438232421875,
-0.04132080078125,
0.005645751953125,
-0.061431884765625,
0.0022945404052734375,
0.01922607421875,
-0.017181396484375,
-0.014312744140625,
0.007171630859375,
-0.050048828125,
0.018218994140625,
0.060699462890625,
-0.0343017578125,
0.04913330078125,
-0.031494140625,
-0.00968170166015625,
-0.084716796875,
-0.00524139404296875,
-0.0157318115234375,
-0.021087646484375,
-0.0242156982421875,
0.023773193359375,
0.0105438232421875,
-0.0190277099609375,
-0.05401611328125,
0.0253448486328125,
-0.0127716064453125,
-0.0092926025390625,
-0.02227783203125,
-0.01727294921875,
0.00530242919921875,
0.044647216796875,
-0.00962066650390625,
0.05511474609375,
0.06500244140625,
-0.039093017578125,
0.0310821533203125,
0.0289306640625,
-0.02520751953125,
0.0179443359375,
-0.092529296875,
0.0193023681640625,
0.00962066650390625,
0.040130615234375,
-0.01087188720703125,
-0.046844482421875,
0.02398681640625,
-0.0096282958984375,
-0.0009889602661132812,
-0.0272674560546875,
-0.02838134765625,
-0.0266876220703125,
-0.039337158203125,
0.054779052734375,
0.050933837890625,
-0.040985107421875,
0.036224365234375,
0.0184478759765625,
-0.0011043548583984375,
-0.0269012451171875,
-0.024688720703125,
-0.02398681640625,
-0.00989532470703125,
-0.052276611328125,
0.018310546875,
-0.00426483154296875,
0.006481170654296875,
-0.0299835205078125,
-0.00693511962890625,
-0.009674072265625,
-0.013671875,
0.0316162109375,
0.0204315185546875,
0.007091522216796875,
-0.005702972412109375,
0.01056671142578125,
-0.02423095703125,
0.00438690185546875,
-0.0192718505859375,
0.079345703125,
-0.040863037109375,
-0.025604248046875,
-0.043548583984375,
-0.023223876953125,
0.05889892578125,
-0.0243682861328125,
0.07635498046875,
0.06512451171875,
-0.02410888671875,
0.019805908203125,
-0.0309906005859375,
0.0005726814270019531,
-0.03497314453125,
0.030609130859375,
-0.017181396484375,
-0.023040771484375,
0.05615234375,
0.0121307373046875,
-0.003131866455078125,
0.040374755859375,
0.042724609375,
0.0219268798828125,
0.083251953125,
0.0262908935546875,
-0.01413726806640625,
0.032989501953125,
-0.044525146484375,
-0.00685882568359375,
-0.0673828125,
-0.0297698974609375,
-0.0294036865234375,
-0.0107879638671875,
-0.047088623046875,
-0.0223236083984375,
0.03936767578125,
0.0197906494140625,
-0.03619384765625,
0.03228759765625,
-0.0418701171875,
0.03515625,
0.04107666015625,
0.0418701171875,
-0.0066070556640625,
0.0004723072052001953,
-0.0091705322265625,
0.0236968994140625,
-0.05133056640625,
-0.025238037109375,
0.096435546875,
0.03466796875,
0.05584716796875,
-0.011932373046875,
0.05035400390625,
0.006999969482421875,
-0.0149688720703125,
-0.037933349609375,
0.0299835205078125,
0.0207977294921875,
-0.04730224609375,
-0.006862640380859375,
-0.01873779296875,
-0.05511474609375,
0.0308685302734375,
-0.03070068359375,
-0.055267333984375,
0.0218048095703125,
0.0279541015625,
-0.023468017578125,
0.038726806640625,
-0.043731689453125,
0.055938720703125,
-0.0192108154296875,
-0.0246429443359375,
-0.00879669189453125,
-0.032318115234375,
0.004131317138671875,
-0.0091400146484375,
-0.01229095458984375,
-0.018585205078125,
0.0134735107421875,
0.054931640625,
-0.057220458984375,
0.042572021484375,
-0.0283966064453125,
0.002559661865234375,
0.04217529296875,
-0.01702880859375,
0.06988525390625,
0.00942230224609375,
-0.0177764892578125,
0.039215087890625,
0.0151519775390625,
-0.07904052734375,
-0.006694793701171875,
0.037811279296875,
-0.0848388671875,
-0.0010538101196289062,
-0.0574951171875,
-0.08392333984375,
-0.001018524169921875,
0.02105712890625,
0.03460693359375,
0.0266876220703125,
0.00543975830078125,
0.05438232421875,
0.06097412109375,
0.00439453125,
0.01306915283203125,
0.01418304443359375,
-0.01261138916015625,
-0.052154541015625,
0.05853271484375,
-0.0004265308380126953,
0.031158447265625,
-0.021942138671875,
0.02520751953125,
-0.00844573974609375,
-0.06500244140625,
-0.035125732421875,
0.0090179443359375,
-0.03887939453125,
-0.02362060546875,
-0.04986572265625,
-0.0275115966796875,
-0.018890380859375,
-0.0225067138671875,
-0.03912353515625,
-0.01806640625,
0.0021209716796875,
-0.0169677734375,
0.051727294921875,
0.0312347412109375,
0.0221710205078125,
0.061279296875,
-0.051727294921875,
-0.0010290145874023438,
0.0307159423828125,
0.026153564453125,
-0.01093292236328125,
-0.0516357421875,
-0.02825927734375,
0.029266357421875,
-0.04949951171875,
-0.032867431640625,
0.0228424072265625,
0.00475311279296875,
0.0205535888671875,
0.031890869140625,
0.004909515380859375,
0.043853759765625,
-0.0032672882080078125,
0.051910400390625,
0.01230621337890625,
-0.04541015625,
0.040924072265625,
-0.0455322265625,
0.0443115234375,
0.051788330078125,
0.03289794921875,
0.005046844482421875,
-0.005748748779296875,
-0.05401611328125,
-0.053985595703125,
0.048431396484375,
0.036712646484375,
0.00041985511779785156,
0.024871826171875,
0.01549530029296875,
0.01195526123046875,
0.0022735595703125,
-0.06793212890625,
0.00208282470703125,
-0.0119476318359375,
-0.01378631591796875,
-0.0024013519287109375,
-0.0208740234375,
-0.0270538330078125,
-0.026580810546875,
0.07598876953125,
-0.01096343994140625,
-0.0034809112548828125,
0.002857208251953125,
-0.01525115966796875,
-0.01849365234375,
-0.0015735626220703125,
0.038543701171875,
0.0714111328125,
-0.05108642578125,
-0.0299224853515625,
0.04852294921875,
-0.061370849609375,
-0.0099029541015625,
0.02874755859375,
-0.04150390625,
-0.00827789306640625,
0.0091400146484375,
0.07708740234375,
0.0142822265625,
-0.032806396484375,
0.02239990234375,
0.001827239990234375,
-0.0306396484375,
-0.0251007080078125,
0.018157958984375,
-0.0155792236328125,
0.019256591796875,
0.006816864013671875,
0.0081939697265625,
-0.012054443359375,
-0.012969970703125,
0.0094757080078125,
0.01259613037109375,
-0.03326416015625,
-0.0266265869140625,
0.038482666015625,
0.002262115478515625,
-0.00411224365234375,
0.04193115234375,
-0.020050048828125,
-0.0299835205078125,
0.05316162109375,
0.03851318359375,
0.052154541015625,
-0.01120758056640625,
0.00867462158203125,
0.03558349609375,
0.00791168212890625,
-0.0216827392578125,
0.0291900634765625,
0.004093170166015625,
-0.046844482421875,
0.01009368896484375,
-0.07049560546875,
-0.031463623046875,
-0.007381439208984375,
-0.07867431640625,
0.03594970703125,
-0.04620361328125,
-0.0217742919921875,
-0.007778167724609375,
0.0208740234375,
-0.050262451171875,
0.00511932373046875,
0.0001061558723449707,
0.0523681640625,
-0.07122802734375,
0.0697021484375,
0.0498046875,
-0.0232086181640625,
-0.051666259765625,
-0.040374755859375,
-0.004688262939453125,
-0.056549072265625,
0.0023555755615234375,
0.0047149658203125,
0.0087890625,
-0.0165252685546875,
-0.035888671875,
-0.0819091796875,
0.1251220703125,
0.036376953125,
-0.036102294921875,
0.0059051513671875,
0.0158233642578125,
0.040130615234375,
-0.00829315185546875,
0.0159454345703125,
0.0088958740234375,
0.017120361328125,
0.01165771484375,
-0.08001708984375,
-0.0200042724609375,
-0.0021495819091796875,
0.00775909423828125,
0.04150390625,
-0.08050537109375,
0.08563232421875,
-0.015960693359375,
0.0291900634765625,
0.0088958740234375,
0.051910400390625,
0.046051025390625,
0.0252227783203125,
0.051788330078125,
0.07818603515625,
0.05364990234375,
-0.0088043212890625,
0.094482421875,
-0.035614013671875,
0.06072998046875,
0.09124755859375,
-0.007610321044921875,
0.0269317626953125,
0.0243988037109375,
-0.01142120361328125,
0.039031982421875,
0.05401611328125,
-0.0118255615234375,
0.033660888671875,
-0.0162200927734375,
-0.0239105224609375,
-0.0179443359375,
-0.0150909423828125,
-0.056884765625,
0.0443115234375,
0.0073089599609375,
-0.03875732421875,
-0.024688720703125,
0.004711151123046875,
0.0167694091796875,
-0.03424072265625,
-0.0268096923828125,
0.0298309326171875,
-0.0136566162109375,
-0.018310546875,
0.051910400390625,
-0.0062713623046875,
0.032806396484375,
-0.0833740234375,
0.01079559326171875,
0.00909423828125,
0.024810791015625,
-0.01019287109375,
-0.0273895263671875,
-0.004062652587890625,
-0.0251312255859375,
-0.0063018798828125,
0.0004780292510986328,
0.046417236328125,
-0.01171112060546875,
-0.061676025390625,
0.0245361328125,
0.007419586181640625,
0.006359100341796875,
-0.0015878677368164062,
-0.060028076171875,
0.020965576171875,
-0.01045989990234375,
-0.02532958984375,
0.041656494140625,
0.0079803466796875,
-0.018646240234375,
0.055328369140625,
0.05792236328125,
0.004550933837890625,
0.0202484130859375,
0.003482818603515625,
0.0767822265625,
-0.0190277099609375,
-0.038482666015625,
-0.0162811279296875,
0.003326416015625,
-0.0189666748046875,
-0.0498046875,
0.043548583984375,
0.052825927734375,
0.0858154296875,
-0.004352569580078125,
0.02197265625,
-0.0265960693359375,
-0.0014743804931640625,
-0.0301513671875,
0.050079345703125,
-0.01334381103515625,
-0.018829345703125,
-0.023895263671875,
-0.07550048828125,
0.0252685546875,
0.034088134765625,
-0.014739990234375,
0.0098114013671875,
0.040802001953125,
0.057037353515625,
-0.0206146240234375,
0.0144500732421875,
0.0179901123046875,
0.0014104843139648438,
0.0236053466796875,
0.0423583984375,
0.0028858184814453125,
-0.07373046875,
0.047027587890625,
-0.07550048828125,
-0.00899505615234375,
-0.0107879638671875,
-0.04510498046875,
-0.044403076171875,
-0.01309967041015625,
-0.019439697265625,
-0.0158538818359375,
0.0200653076171875,
0.07452392578125,
0.07061767578125,
-0.06732177734375,
-0.0186309814453125,
-0.0028743743896484375,
-0.005218505859375,
-0.0292816162109375,
-0.01490020751953125,
0.03765869140625,
-0.0173187255859375,
-0.052490234375,
-0.0073394775390625,
-0.0166015625,
0.01540374755859375,
0.0018672943115234375,
-0.022552490234375,
-0.0104217529296875,
-0.00572967529296875,
0.043365478515625,
0.02496337890625,
-0.025665283203125,
-0.0450439453125,
-0.035125732421875,
0.007659912109375,
-0.000522613525390625,
0.022735595703125,
-0.0390625,
0.047637939453125,
0.034881591796875,
0.026580810546875,
0.05938720703125,
0.01108551025390625,
0.030853271484375,
-0.032806396484375,
0.058013916015625,
0.01934814453125,
0.043548583984375,
-0.0022449493408203125,
-0.02154541015625,
0.032806396484375,
0.01003265380859375,
-0.053436279296875,
-0.04913330078125,
-0.0009665489196777344,
-0.08734130859375,
0.02227783203125,
0.054168701171875,
-0.01389312744140625,
-0.0275115966796875,
0.0005135536193847656,
-0.0201873779296875,
0.00449371337890625,
-0.022308349609375,
0.058074951171875,
0.0518798828125,
-0.0025348663330078125,
0.00543975830078125,
-0.050933837890625,
0.0172271728515625,
0.046600341796875,
-0.06744384765625,
-0.01337432861328125,
0.02618408203125,
0.02569580078125,
0.0167236328125,
0.0267333984375,
-0.03668212890625,
0.05133056640625,
0.00951385498046875,
0.0229949951171875,
-0.030120849609375,
-0.0345458984375,
-0.009429931640625,
0.011077880859375,
0.002262115478515625,
-0.03485107421875
]
] |
yehiaserag/anime-pencil-diffusion | 2023-05-05T11:49:35.000Z | [
"diffusers",
"anime",
"stable-diffusion",
"aiart",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | yehiaserag | null | null | yehiaserag/anime-pencil-diffusion | 156 | 15,846 | diffusers | 2022-12-03T04:15:22 | ---
language:
- en
thumbnail: "https://huggingface.co/yehiaserag/anime-pencil-deffusion/resolve/main/example-v5.jpg"
tags:
- anime
- stable-diffusion
- aiart
- text-to-image
license: "creativeml-openrail-m"
---
# Anime-Pencil-Diffusion
A dreambooth finetune of stable diffusion 1.5 model that will output stuff in anime pencil concept drawing style.
# Usage
Follow the directions under each version.
## Anime-Pencil-Diffusion-V5
Trained for 400,000 steps, constant learning rate of 0.0000002 on 5000 images with 0 images for regularization.
### Examples generated by the v5 model
<img src="https://huggingface.co/yehiaserag/anime-pencil-deffusion/resolve/main/example-v5.jpg"/>
### Usage
Include `animepencilconcept style` in prompt to invoke the finetuned style.
### Prompet comparison for V5
<img src="https://huggingface.co/yehiaserag/anime-pencil-deffusion/resolve/main/example-v5-prompt-comparison.jpg"/>
---
## Anime-Pencil-Diffusion-V4
Trained for 160,000 steps, constant learning rate of 0.000001 on 526 images with 0 images for regularization and no text encoder training
### Examples generated by the v4 model
<img src="https://huggingface.co/yehiaserag/anime-pencil-deffusion/resolve/main/example-v4.jpg"/>
### Usage
Add the words `anime pencil concept style` anywhere in your prompt.
---
## Anime-Pencil-Diffusion-V3
Trained for 12,000 steps, constant learning rate of 0.0000005 on 80 images with 1000 images of `illustration style` for regularization
### Examples generated by the v3 model
<img src="https://huggingface.co/yehiaserag/anime-pencil-deffusion/resolve/main/example-v3.jpg"/>
### Usage
Add the words `anime pencil concept style` anywhere in your prompt.
---
## Anime-Pencil-Diffusion-V2
Trained for 4,000 steps, constant learning rate of 0.00000172 on 40 images with 1000 images of `illustration style` for regularization
### Examples generated by the v2 model
<img src="https://huggingface.co/yehiaserag/anime-pencil-deffusion/resolve/main/example-v2.jpg"/>
# Usage
Add the words `anime pencil concept style` anywhere in your prompt.
---
## Anime-Pencil-Diffusion-V1
Trained on 2,400 steps, constant learning rate of 0.00000172 on 16 images with 1000 images of `illustration style` for regularization
### Examples generated by the v1 model
<img src="https://huggingface.co/yehiaserag/anime-pencil-deffusion/resolve/main/example-v1.jpg"/>
### Usage
Add the words `anime pencil concept style` anywhere in your prompt.
---
# Socials
- Use the #AnimePencilConceptStyle so i can see the cool stuff you make!
- If you enjoy the model i'd appreciate a follow on [twitter](https://twitter.com/HellYeahYea)
- If you are feeling especially generous, you can sponsor me on [paypal](https://paypal.me/YehiaSerag)
- Created by Yehia Serag
---
*NOTE: usage of this model implies accpetance of stable diffusion's [CreativeML Open RAIL-M license](LICENSE)*
| 2,894 | [
[
-0.0279541015625,
-0.0660400390625,
0.04132080078125,
0.00968170166015625,
-0.02880859375,
-0.00714111328125,
0.0087127685546875,
-0.01197052001953125,
0.04864501953125,
0.0183868408203125,
-0.062744140625,
-0.043548583984375,
-0.033599853515625,
0.00534820556640625,
-0.00829315185546875,
0.044189453125,
0.0009455680847167969,
0.00922393798828125,
0.004734039306640625,
0.0158843994140625,
-0.0638427734375,
-0.0036773681640625,
-0.07208251953125,
-0.0306854248046875,
0.0201263427734375,
0.053192138671875,
0.0361328125,
0.0294342041015625,
-0.0021820068359375,
0.0284576416015625,
-0.0146331787109375,
-0.0261993408203125,
-0.037384033203125,
0.008148193359375,
0.0006146430969238281,
-0.05059814453125,
-0.030914306640625,
0.023468017578125,
0.0274658203125,
0.040374755859375,
-0.004276275634765625,
-0.01442718505859375,
0.0108795166015625,
0.0236053466796875,
-0.0198822021484375,
0.00574493408203125,
-0.0133819580078125,
0.021270751953125,
-0.01702880859375,
0.0287017822265625,
-0.01413726806640625,
-0.03680419921875,
0.025177001953125,
-0.07611083984375,
0.037689208984375,
0.020233154296875,
0.0694580078125,
-0.0036449432373046875,
0.01552581787109375,
-0.002414703369140625,
-0.01227569580078125,
0.05926513671875,
-0.044158935546875,
-0.007595062255859375,
0.03387451171875,
0.0293731689453125,
0.0194244384765625,
-0.08868408203125,
-0.057373046875,
0.01335906982421875,
0.006649017333984375,
0.0223541259765625,
0.0025768280029296875,
-0.01204681396484375,
0.01074981689453125,
0.0257110595703125,
-0.0240936279296875,
0.003765106201171875,
-0.04327392578125,
-0.025634765625,
0.0247650146484375,
0.021697998046875,
0.0310516357421875,
-0.0198822021484375,
-0.021270751953125,
0.0101165771484375,
-0.046844482421875,
0.007335662841796875,
0.025360107421875,
0.007373809814453125,
-0.0311279296875,
0.05072021484375,
0.00786590576171875,
0.0267486572265625,
0.036346435546875,
-0.0017881393432617188,
0.0218505859375,
-0.025177001953125,
-0.0122222900390625,
0.00634765625,
0.07537841796875,
0.050933837890625,
0.01611328125,
0.00905609130859375,
0.0105743408203125,
-0.004608154296875,
-0.0013790130615234375,
-0.10125732421875,
-0.031219482421875,
0.0187530517578125,
-0.043548583984375,
-0.03460693359375,
-0.03692626953125,
-0.06402587890625,
-0.033660888671875,
-0.00131988525390625,
0.032867431640625,
-0.053680419921875,
-0.016448974609375,
0.034088134765625,
-0.04595947265625,
-0.004932403564453125,
0.04388427734375,
-0.059600830078125,
0.0017099380493164062,
0.0230712890625,
0.06982421875,
0.0012912750244140625,
-0.0034999847412109375,
0.0110015869140625,
-0.0186614990234375,
-0.048187255859375,
0.05572509765625,
-0.034698486328125,
-0.035797119140625,
-0.0140380859375,
0.0151824951171875,
-0.0010614395141601562,
-0.034698486328125,
0.04669189453125,
0.0007944107055664062,
0.05133056640625,
0.002109527587890625,
-0.051116943359375,
-0.029205322265625,
-0.0220794677734375,
-0.039581298828125,
0.0677490234375,
0.01432037353515625,
-0.04827880859375,
0.01329803466796875,
-0.04876708984375,
-0.0078277587890625,
0.0151519775390625,
0.018218994140625,
-0.0253143310546875,
-0.01361846923828125,
-0.015899658203125,
0.05316162109375,
-0.018463134765625,
0.0154571533203125,
-0.039031982421875,
-0.0245513916015625,
-0.0018815994262695312,
-0.0195159912109375,
0.08502197265625,
0.00792694091796875,
-0.01242828369140625,
0.0152587890625,
-0.061309814453125,
-0.017181396484375,
0.0182037353515625,
0.020751953125,
-0.0194091796875,
-0.03192138671875,
0.0250396728515625,
0.00905609130859375,
0.0254974365234375,
-0.059356689453125,
0.005817413330078125,
-0.0201416015625,
0.03741455078125,
0.06622314453125,
0.0245208740234375,
0.028106689453125,
-0.031524658203125,
0.045135498046875,
0.0005221366882324219,
-0.007663726806640625,
-0.017425537109375,
-0.0556640625,
-0.08526611328125,
-0.01116943359375,
0.01551055908203125,
0.036865234375,
-0.05328369140625,
0.059783935546875,
0.026519775390625,
-0.06060791015625,
-0.038360595703125,
-0.0090484619140625,
0.01198577880859375,
0.08624267578125,
0.027008056640625,
-0.025146484375,
-0.0184173583984375,
-0.05853271484375,
0.0268402099609375,
0.01442718505859375,
-0.018341064453125,
-0.00402069091796875,
0.023345947265625,
0.0099029541015625,
0.061187744140625,
-0.06256103515625,
-0.0233154296875,
-0.024444580078125,
0.0188446044921875,
0.039215087890625,
0.054931640625,
0.059326171875,
-0.03692626953125,
-0.047088623046875,
-0.03192138671875,
-0.04248046875,
-0.01183319091796875,
0.003993988037109375,
-0.0255889892578125,
-0.022308349609375,
0.0280303955078125,
-0.041748046875,
0.032501220703125,
0.005329132080078125,
-0.06500244140625,
0.054534912109375,
-0.0087738037109375,
0.01152801513671875,
-0.09344482421875,
0.0005965232849121094,
0.03955078125,
-0.0101776123046875,
-0.052703857421875,
0.0019083023071289062,
-0.0138397216796875,
-0.024993896484375,
-0.05133056640625,
0.061492919921875,
-0.01067352294921875,
0.04510498046875,
-0.004871368408203125,
0.0095672607421875,
0.0223236083984375,
0.054656982421875,
0.0255126953125,
0.045654296875,
0.07794189453125,
-0.055023193359375,
0.041412353515625,
0.03228759765625,
-0.0350341796875,
0.0760498046875,
-0.0765380859375,
0.00656890869140625,
-0.0158233642578125,
0.006404876708984375,
-0.10369873046875,
0.004413604736328125,
0.05743408203125,
-0.0287322998046875,
0.020599365234375,
0.01245880126953125,
-0.03558349609375,
-0.0153045654296875,
-0.040557861328125,
0.0116424560546875,
0.06298828125,
-0.0181121826171875,
0.037261962890625,
-0.003887176513671875,
-0.026458740234375,
-0.018798828125,
-0.03466796875,
-0.01055145263671875,
-0.037567138671875,
-0.047393798828125,
0.039703369140625,
-0.04510498046875,
0.01019287109375,
0.0027713775634765625,
0.0019073486328125,
-0.017181396484375,
0.01116180419921875,
0.0157318115234375,
0.024078369140625,
-0.0220184326171875,
-0.004528045654296875,
0.00943756103515625,
-0.00931549072265625,
-0.00533294677734375,
0.0019092559814453125,
0.052886962890625,
0.01013946533203125,
-0.0131988525390625,
-0.08575439453125,
-0.0027446746826171875,
0.0200958251953125,
0.0218658447265625,
0.040557861328125,
0.05682373046875,
-0.0307464599609375,
0.00487518310546875,
-0.022430419921875,
-0.01291656494140625,
-0.03851318359375,
0.01245880126953125,
-0.030181884765625,
-0.025360107421875,
0.04705810546875,
-0.0014543533325195312,
0.019256591796875,
0.051239013671875,
0.0194244384765625,
-0.032928466796875,
0.053955078125,
0.04656982421875,
0.007236480712890625,
0.059326171875,
-0.07110595703125,
-0.00641632080078125,
-0.06683349609375,
-0.0275115966796875,
-0.031494140625,
-0.04766845703125,
0.0024852752685546875,
-0.0301513671875,
0.01114654541015625,
0.04046630859375,
-0.023773193359375,
0.024383544921875,
-0.0173492431640625,
0.03326416015625,
0.037750244140625,
0.024383544921875,
0.01160430908203125,
0.0005955696105957031,
-0.0277862548828125,
-0.0135345458984375,
-0.045196533203125,
-0.030120849609375,
0.06292724609375,
0.0288543701171875,
0.057373046875,
-0.01006317138671875,
0.054473876953125,
0.0006189346313476562,
0.001956939697265625,
-0.07989501953125,
0.05401611328125,
-0.0030517578125,
-0.03521728515625,
-0.03790283203125,
0.0225830078125,
-0.0782470703125,
0.0191650390625,
-0.030426025390625,
-0.042572021484375,
0.024017333984375,
-0.0015516281127929688,
-0.025177001953125,
0.0016574859619140625,
-0.05377197265625,
0.06781005859375,
0.0125579833984375,
-0.037689208984375,
-0.0165252685546875,
-0.03253173828125,
0.0117645263671875,
-0.0081329345703125,
-0.00171661376953125,
-0.0260162353515625,
-0.0099945068359375,
0.056915283203125,
-0.036865234375,
0.0859375,
-0.00643157958984375,
-0.003612518310546875,
0.034332275390625,
0.00989532470703125,
-0.005645751953125,
-0.0035190582275390625,
-0.0171966552734375,
0.0112762451171875,
-0.01467132568359375,
-0.01337432861328125,
-0.0244140625,
0.0253143310546875,
-0.03900146484375,
-0.040283203125,
-0.01308441162109375,
-0.0169525146484375,
0.0128631591796875,
0.047393798828125,
0.06878662109375,
0.05517578125,
-0.01407623291015625,
0.0131683349609375,
0.06341552734375,
-0.00008857250213623047,
0.04449462890625,
0.0011072158813476562,
-0.0235443115234375,
-0.023101806640625,
0.059326171875,
-0.0000502467155456543,
0.023040771484375,
0.00467681884765625,
0.04425048828125,
-0.031768798828125,
-0.0275421142578125,
-0.048492431640625,
0.0279083251953125,
-0.046783447265625,
-0.019561767578125,
-0.033477783203125,
-0.0102996826171875,
-0.03680419921875,
-0.027069091796875,
-0.00121307373046875,
-0.054290771484375,
-0.095703125,
-0.0006384849548339844,
0.057098388671875,
0.042388916015625,
0.0020351409912109375,
0.042327880859375,
-0.0204315185546875,
0.041259765625,
0.0240936279296875,
0.030120849609375,
0.0018138885498046875,
-0.05145263671875,
0.00209808349609375,
-0.0114288330078125,
-0.047576904296875,
-0.06781005859375,
0.026031494140625,
0.0174560546875,
0.039642333984375,
0.046875,
-0.0164642333984375,
0.054107666015625,
-0.05633544921875,
0.0762939453125,
0.050628662109375,
-0.05731201171875,
0.03521728515625,
-0.0268096923828125,
0.01702880859375,
0.026519775390625,
0.052581787109375,
-0.062347412109375,
-0.043212890625,
-0.0736083984375,
-0.03900146484375,
0.042572021484375,
0.0028820037841796875,
0.0197296142578125,
0.0162506103515625,
0.028167724609375,
0.0014047622680664062,
0.00968170166015625,
-0.043701171875,
-0.048492431640625,
-0.037811279296875,
-0.0269927978515625,
-0.0081787109375,
-0.0028972625732421875,
0.01273345947265625,
-0.031951904296875,
0.05413818359375,
-0.00433349609375,
0.01739501953125,
-0.01160430908203125,
0.0254058837890625,
-0.00745391845703125,
-0.01335906982421875,
0.01242828369140625,
0.031005859375,
-0.008056640625,
-0.0248260498046875,
-0.019073486328125,
-0.047088623046875,
0.0236053466796875,
0.0024700164794921875,
-0.039306640625,
0.0184173583984375,
0.0031948089599609375,
0.0672607421875,
0.00894927978515625,
-0.021484375,
0.0313720703125,
-0.01010894775390625,
-0.0169525146484375,
-0.024322509765625,
0.033355712890625,
0.0210723876953125,
0.01076507568359375,
0.01184844970703125,
0.01204681396484375,
0.032318115234375,
-0.031951904296875,
-0.0016326904296875,
0.022796630859375,
-0.01519012451171875,
-0.0192108154296875,
0.055511474609375,
0.0198516845703125,
-0.040069580078125,
0.032958984375,
-0.017120361328125,
-0.030853271484375,
0.07476806640625,
0.054931640625,
0.06903076171875,
-0.0254669189453125,
0.0306549072265625,
0.05987548828125,
-0.0221710205078125,
-0.017181396484375,
0.0250244140625,
0.0225677490234375,
-0.046875,
0.00576019287109375,
-0.021575927734375,
-0.0288543701171875,
0.02197265625,
-0.0257415771484375,
0.07794189453125,
-0.05682373046875,
-0.01165008544921875,
0.0014190673828125,
-0.004222869873046875,
-0.045623779296875,
0.00859832763671875,
-0.00524139404296875,
0.08050537109375,
-0.06683349609375,
0.03778076171875,
0.038177490234375,
-0.03912353515625,
-0.053924560546875,
-0.01288604736328125,
-0.0006322860717773438,
-0.054351806640625,
0.037078857421875,
0.0182342529296875,
-0.01410675048828125,
0.018402099609375,
-0.054595947265625,
-0.052825927734375,
0.08062744140625,
-0.0102691650390625,
-0.01110076904296875,
-0.00897979736328125,
-0.03729248046875,
0.040191650390625,
-0.035064697265625,
0.01143646240234375,
0.022247314453125,
0.00684356689453125,
0.03955078125,
-0.045013427734375,
-0.00005882978439331055,
-0.0212249755859375,
0.01078033447265625,
-0.00921630859375,
-0.059783935546875,
0.0682373046875,
-0.0285797119140625,
-0.040863037109375,
0.05548095703125,
0.0645751953125,
0.031463623046875,
0.00783538818359375,
0.0211181640625,
0.046051025390625,
0.048126220703125,
-0.002838134765625,
0.0439453125,
-0.00981903076171875,
0.019378662109375,
0.058929443359375,
0.0178680419921875,
0.04547119140625,
0.0021572113037109375,
-0.01222991943359375,
0.06573486328125,
0.0819091796875,
-0.005306243896484375,
0.048309326171875,
0.0167999267578125,
-0.0021686553955078125,
-0.003108978271484375,
-0.01403045654296875,
-0.0316162109375,
-0.01357269287109375,
0.03765869140625,
-0.01457977294921875,
-0.006626129150390625,
0.005863189697265625,
0.01464080810546875,
-0.006366729736328125,
-0.0301513671875,
0.033660888671875,
0.01534271240234375,
-0.0236968994140625,
0.05633544921875,
-0.00652313232421875,
0.07586669921875,
-0.053466796875,
-0.035247802734375,
-0.020538330078125,
-0.01255035400390625,
-0.004116058349609375,
-0.08807373046875,
-0.017181396484375,
-0.0161895751953125,
0.00748443603515625,
-0.026885986328125,
0.046051025390625,
-0.03436279296875,
-0.051666259765625,
-0.0258636474609375,
0.01250457763671875,
0.0222930908203125,
0.034698486328125,
-0.06671142578125,
-0.00942230224609375,
0.0018968582153320312,
-0.01045989990234375,
-0.0030059814453125,
0.048828125,
0.031463623046875,
0.050933837890625,
0.01013946533203125,
0.007549285888671875,
-0.007694244384765625,
-0.005462646484375,
0.032135009765625,
-0.023040771484375,
-0.040496826171875,
-0.05487060546875,
0.051788330078125,
-0.0152130126953125,
-0.0323486328125,
0.05242919921875,
0.032379150390625,
0.037506103515625,
-0.033782958984375,
0.050933837890625,
-0.00882720947265625,
0.043212890625,
-0.05755615234375,
0.05841064453125,
-0.040435791015625,
-0.0141143798828125,
-0.037445068359375,
-0.056121826171875,
-0.0017862319946289062,
0.08990478515625,
0.0109710693359375,
0.0016498565673828125,
0.0660400390625,
0.046142578125,
-0.01064300537109375,
0.0030269622802734375,
-0.005626678466796875,
0.0010881423950195312,
0.0022106170654296875,
0.022705078125,
0.059173583984375,
-0.0264129638671875,
0.0023040771484375,
-0.051177978515625,
-0.0158843994140625,
-0.006130218505859375,
-0.0760498046875,
-0.0780029296875,
-0.047943115234375,
-0.047149658203125,
-0.040130615234375,
-0.04693603515625,
0.051788330078125,
0.06402587890625,
-0.047210693359375,
-0.0193328857421875,
-0.004901885986328125,
0.0209197998046875,
0.000255584716796875,
-0.0197906494140625,
0.024932861328125,
-0.00372314453125,
-0.08953857421875,
0.0028972625732421875,
-0.00704193115234375,
0.05780029296875,
-0.0246429443359375,
0.01552581787109375,
0.0038814544677734375,
-0.0224761962890625,
0.0323486328125,
0.005100250244140625,
-0.041961669921875,
-0.00893402099609375,
0.0011758804321289062,
0.006999969482421875,
0.0111236572265625,
0.041107177734375,
-0.056121826171875,
0.026885986328125,
0.04754638671875,
0.01373291015625,
0.029571533203125,
-0.0149078369140625,
0.024688720703125,
-0.039703369140625,
0.01265716552734375,
0.0173797607421875,
0.049468994140625,
0.016265869140625,
-0.03472900390625,
0.03729248046875,
0.050689697265625,
-0.0191192626953125,
-0.022247314453125,
0.0243682861328125,
-0.061492919921875,
-0.030120849609375,
0.0648193359375,
0.00865936279296875,
-0.00861358642578125,
-0.0099639892578125,
-0.060302734375,
0.005218505859375,
-0.04095458984375,
0.0286865234375,
0.06768798828125,
-0.03778076171875,
-0.0164337158203125,
-0.0290374755859375,
0.039031982421875,
0.0014553070068359375,
-0.054443359375,
-0.0099945068359375,
0.061553955078125,
0.05841064453125,
0.03497314453125,
0.0650634765625,
0.00930023193359375,
0.01308441162109375,
0.005443572998046875,
0.0007815361022949219,
0.01409149169921875,
-0.0231781005859375,
-0.027374267578125,
-0.008056640625,
-0.002300262451171875,
-0.0279998779296875
]
] |
keepitreal/vietnamese-sbert | 2022-02-19T08:01:34.000Z | [
"sentence-transformers",
"pytorch",
"roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"vietnamese",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | keepitreal | null | null | keepitreal/vietnamese-sbert | 16 | 15,824 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
- vietnamese
---
# {vietnamese-sbert}
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search on Vietnamese language.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["Cô giáo đang ăn kem", "Chị gái đang thử món thịt dê"]
model = SentenceTransformer('keepitreal/vietnamese-sbert')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['Cô giáo đang ăn kem', 'Chị gái đang thử món thịt dê']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained(''keepitreal/vietnamese-sbert')
model = AutoModel.from_pretrained('keepitreal/vietnamese-sbert')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 360 with parameters:
```
{'batch_size': 16, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"epochs": 4,
"evaluation_steps": 1000,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 144,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | 3,869 | [
[
-0.016845703125,
-0.06915283203125,
0.0225677490234375,
0.0218048095703125,
-0.0242462158203125,
-0.0279083251953125,
-0.02093505859375,
0.00571441650390625,
0.0160369873046875,
0.03302001953125,
-0.041717529296875,
-0.048919677734375,
-0.053070068359375,
0.007171630859375,
-0.0234527587890625,
0.06329345703125,
-0.015350341796875,
-0.004604339599609375,
-0.00978851318359375,
-0.015289306640625,
-0.02691650390625,
-0.020904541015625,
-0.035736083984375,
-0.0193939208984375,
0.010101318359375,
0.01788330078125,
0.03900146484375,
0.044464111328125,
0.0189666748046875,
0.030517578125,
-0.0018796920776367188,
0.01178741455078125,
-0.0367431640625,
-0.00991058349609375,
0.0017862319946289062,
-0.031158447265625,
-0.006160736083984375,
0.019287109375,
0.042755126953125,
0.01520538330078125,
-0.00963592529296875,
0.01509857177734375,
-0.005947113037109375,
0.030517578125,
-0.0287322998046875,
0.02667236328125,
-0.038360595703125,
0.00711822509765625,
0.0089874267578125,
-0.0006361007690429688,
-0.0307464599609375,
-0.017578125,
0.0247650146484375,
-0.0295257568359375,
0.0186004638671875,
0.0149688720703125,
0.09759521484375,
0.037994384765625,
-0.0229339599609375,
-0.026275634765625,
-0.0184478759765625,
0.06549072265625,
-0.061981201171875,
0.017822265625,
0.0311737060546875,
0.0029048919677734375,
0.002964019775390625,
-0.06146240234375,
-0.056396484375,
-0.01201629638671875,
-0.033447265625,
0.0216064453125,
-0.02716064453125,
-0.00009703636169433594,
0.01180267333984375,
0.0199127197265625,
-0.06304931640625,
-0.005413055419921875,
-0.0377197265625,
-0.0193634033203125,
0.050445556640625,
0.0034465789794921875,
0.028717041015625,
-0.046478271484375,
-0.038299560546875,
-0.0240478515625,
-0.00724029541015625,
-0.0023975372314453125,
0.00732421875,
0.009765625,
-0.029510498046875,
0.06475830078125,
-0.0088653564453125,
0.048095703125,
-0.0003039836883544922,
0.007076263427734375,
0.0460205078125,
-0.0227813720703125,
-0.0279083251953125,
-0.004428863525390625,
0.0833740234375,
0.0292205810546875,
0.0247344970703125,
0.00457000732421875,
-0.01239776611328125,
0.01082611083984375,
0.01160430908203125,
-0.057708740234375,
-0.025970458984375,
0.0261993408203125,
-0.03167724609375,
-0.021484375,
0.0239105224609375,
-0.05517578125,
0.00853729248046875,
-0.0009784698486328125,
0.06103515625,
-0.047119140625,
0.0003631114959716797,
0.024505615234375,
-0.02313232421875,
0.02105712890625,
-0.017364501953125,
-0.03912353515625,
0.014862060546875,
0.022186279296875,
0.080078125,
-0.001369476318359375,
-0.038665771484375,
-0.031280517578125,
-0.00701141357421875,
0.01068878173828125,
0.050018310546875,
-0.022491455078125,
-0.01629638671875,
0.00640106201171875,
0.022918701171875,
-0.04766845703125,
-0.0270538330078125,
0.04534912109375,
-0.017913818359375,
0.06298828125,
0.0149078369140625,
-0.06561279296875,
-0.0198211669921875,
0.0153045654296875,
-0.041259765625,
0.09527587890625,
0.032501220703125,
-0.068603515625,
0.0082855224609375,
-0.054779052734375,
-0.0182037353515625,
-0.00771331787109375,
-0.0013027191162109375,
-0.058868408203125,
0.002132415771484375,
0.03607177734375,
0.054473876953125,
-0.0006780624389648438,
0.01468658447265625,
-0.007320404052734375,
-0.0357666015625,
0.021759033203125,
-0.019378662109375,
0.08502197265625,
0.00940704345703125,
-0.039764404296875,
0.02398681640625,
-0.047576904296875,
-0.0030689239501953125,
0.02642822265625,
-0.01763916015625,
-0.0191497802734375,
-0.01800537109375,
0.0322265625,
0.025604248046875,
0.0147552490234375,
-0.045166015625,
0.01107025146484375,
-0.039154052734375,
0.056793212890625,
0.04376220703125,
0.006343841552734375,
0.03411865234375,
-0.02880859375,
0.0312042236328125,
0.017730712890625,
-0.004711151123046875,
-0.01122283935546875,
-0.031494140625,
-0.070556640625,
-0.0323486328125,
0.017669677734375,
0.049957275390625,
-0.049102783203125,
0.07855224609375,
-0.0290985107421875,
-0.0477294921875,
-0.0697021484375,
0.00211334228515625,
0.018707275390625,
0.034698486328125,
0.0479736328125,
-0.0009937286376953125,
-0.041290283203125,
-0.062347412109375,
-0.004703521728515625,
0.002300262451171875,
-0.002735137939453125,
0.0204620361328125,
0.05645751953125,
-0.0440673828125,
0.0731201171875,
-0.057342529296875,
-0.041046142578125,
-0.031829833984375,
0.00939178466796875,
0.01629638671875,
0.04058837890625,
0.04302978515625,
-0.0562744140625,
-0.033355712890625,
-0.034423828125,
-0.050628662109375,
-0.004730224609375,
-0.0140380859375,
-0.01499176025390625,
0.021514892578125,
0.041534423828125,
-0.060638427734375,
0.030914306640625,
0.046905517578125,
-0.047637939453125,
0.035888671875,
-0.020416259765625,
-0.00991058349609375,
-0.10345458984375,
0.0026378631591796875,
0.00803375244140625,
-0.01293182373046875,
-0.0236968994140625,
0.005352020263671875,
0.00524139404296875,
-0.00626373291015625,
-0.030181884765625,
0.047760009765625,
-0.03253173828125,
0.0232391357421875,
-0.0103607177734375,
0.037139892578125,
0.0038433074951171875,
0.04522705078125,
-0.0045928955078125,
0.049468994140625,
0.04083251953125,
-0.0423583984375,
0.028778076171875,
0.04327392578125,
-0.0255584716796875,
0.01812744140625,
-0.067138671875,
0.0014715194702148438,
-0.00439453125,
0.0241546630859375,
-0.08636474609375,
-0.01139068603515625,
0.0207672119140625,
-0.04010009765625,
0.00553131103515625,
0.0170135498046875,
-0.043182373046875,
-0.044586181640625,
-0.031097412109375,
0.007221221923828125,
0.040008544921875,
-0.035888671875,
0.041046142578125,
0.0223388671875,
0.006023406982421875,
-0.03399658203125,
-0.06524658203125,
-0.010467529296875,
-0.02923583984375,
-0.05224609375,
0.036468505859375,
-0.007720947265625,
0.010955810546875,
0.01509857177734375,
0.0173492431640625,
-0.0002818107604980469,
-0.0033168792724609375,
-0.001125335693359375,
0.02313232421875,
-0.007495880126953125,
0.009368896484375,
0.006977081298828125,
-0.0156707763671875,
0.004131317138671875,
-0.01873779296875,
0.06158447265625,
-0.0152587890625,
-0.0006127357482910156,
-0.037139892578125,
0.007335662841796875,
0.0272369384765625,
-0.022491455078125,
0.08123779296875,
0.08514404296875,
-0.0307464599609375,
-0.00795745849609375,
-0.046417236328125,
-0.0157318115234375,
-0.0345458984375,
0.048797607421875,
-0.0182342529296875,
-0.0758056640625,
0.029937744140625,
0.015533447265625,
-0.0052490234375,
0.05316162109375,
0.049041748046875,
0.00032401084899902344,
0.058624267578125,
0.04339599609375,
-0.02130126953125,
0.040679931640625,
-0.0443115234375,
0.0084991455078125,
-0.06634521484375,
-0.004261016845703125,
-0.0228424072265625,
-0.0186767578125,
-0.0594482421875,
-0.041961669921875,
0.0159912109375,
-0.0108184814453125,
-0.0240325927734375,
0.04949951171875,
-0.053802490234375,
0.0135650634765625,
0.046905517578125,
0.0136566162109375,
-0.0037555694580078125,
-0.00475311279296875,
-0.018280029296875,
-0.00554656982421875,
-0.0526123046875,
-0.038055419921875,
0.0596923828125,
0.02734375,
0.03375244140625,
-0.01666259765625,
0.05694580078125,
-0.0043487548828125,
0.0003314018249511719,
-0.057525634765625,
0.0362548828125,
-0.00768280029296875,
-0.02606201171875,
-0.0279083251953125,
-0.034942626953125,
-0.070556640625,
0.030670166015625,
-0.01126861572265625,
-0.05572509765625,
-0.00580596923828125,
-0.019775390625,
-0.017578125,
0.010345458984375,
-0.05810546875,
0.08349609375,
0.001800537109375,
-0.0004191398620605469,
-0.00817108154296875,
-0.054656982421875,
0.01415252685546875,
0.01201629638671875,
0.0070037841796875,
-0.006195068359375,
-0.0030956268310546875,
0.0662841796875,
-0.019287109375,
0.054962158203125,
-0.00751495361328125,
0.0190582275390625,
0.0298614501953125,
-0.019866943359375,
0.0198974609375,
-0.0016813278198242188,
-0.0070343017578125,
0.00960540771484375,
-0.0014600753784179688,
-0.0323486328125,
-0.0394287109375,
0.047943115234375,
-0.0740966796875,
-0.0258636474609375,
-0.03765869140625,
-0.043914794921875,
0.00955963134765625,
0.0194244384765625,
0.039947509765625,
0.01345062255859375,
-0.01445770263671875,
0.025390625,
0.036376953125,
-0.0299530029296875,
0.0374755859375,
0.01641845703125,
-0.0081024169921875,
-0.04473876953125,
0.0528564453125,
0.00502777099609375,
-0.0010499954223632812,
0.0299072265625,
0.0265655517578125,
-0.0293426513671875,
-0.007099151611328125,
-0.02734375,
0.04302978515625,
-0.029510498046875,
-0.015655517578125,
-0.072021484375,
-0.037872314453125,
-0.0478515625,
-0.009124755859375,
-0.0217742919921875,
-0.0231170654296875,
-0.0340576171875,
-0.0255584716796875,
0.0299530029296875,
0.03240966796875,
0.0097503662109375,
0.036468505859375,
-0.054473876953125,
0.0156707763671875,
0.0010004043579101562,
0.005157470703125,
-0.014404296875,
-0.06036376953125,
-0.031585693359375,
-0.00228118896484375,
-0.0272369384765625,
-0.0662841796875,
0.053802490234375,
0.01415252685546875,
0.03680419921875,
0.0170745849609375,
0.00734710693359375,
0.045928955078125,
-0.034027099609375,
0.06121826171875,
-0.005840301513671875,
-0.07684326171875,
0.049957275390625,
-0.012237548828125,
0.0372314453125,
0.02960205078125,
0.0248870849609375,
-0.041961669921875,
-0.029510498046875,
-0.06298828125,
-0.0743408203125,
0.04901123046875,
0.039398193359375,
0.0246734619140625,
-0.01338958740234375,
0.0289154052734375,
-0.0225830078125,
0.0131988525390625,
-0.0731201171875,
-0.0291748046875,
-0.0271453857421875,
-0.044708251953125,
-0.019744873046875,
-0.0151214599609375,
0.0020503997802734375,
-0.02972412109375,
0.059478759765625,
-0.0012826919555664062,
0.045684814453125,
0.0291748046875,
-0.0242156982421875,
0.00862884521484375,
0.006786346435546875,
0.03619384765625,
0.0128326416015625,
-0.0109100341796875,
0.0035648345947265625,
0.019012451171875,
-0.034271240234375,
0.00020587444305419922,
0.033660888671875,
-0.0148468017578125,
0.01702880859375,
0.02825927734375,
0.06805419921875,
0.0272369384765625,
-0.04193115234375,
0.051727294921875,
-0.005886077880859375,
-0.011993408203125,
-0.034332275390625,
0.00243377685546875,
0.01448822021484375,
0.0186614990234375,
0.01238250732421875,
-0.0108642578125,
-0.00457763671875,
-0.021453857421875,
0.0269012451171875,
0.01922607421875,
-0.0296478271484375,
-0.0082550048828125,
0.056060791015625,
-0.00601959228515625,
-0.0206146240234375,
0.07183837890625,
-0.0172271728515625,
-0.05950927734375,
0.03997802734375,
0.0458984375,
0.0758056640625,
-0.01141357421875,
0.025299072265625,
0.046051025390625,
0.029693603515625,
-0.00226593017578125,
0.00777435302734375,
0.01433563232421875,
-0.06219482421875,
-0.006214141845703125,
-0.05059814453125,
0.008087158203125,
0.0019397735595703125,
-0.05499267578125,
0.033935546875,
-0.007801055908203125,
-0.0029048919677734375,
-0.024078369140625,
0.01309967041015625,
-0.06170654296875,
0.019866943359375,
-0.0078582763671875,
0.061126708984375,
-0.0709228515625,
0.0601806640625,
0.057525634765625,
-0.054656982421875,
-0.06292724609375,
0.001346588134765625,
-0.00717926025390625,
-0.06842041015625,
0.0252685546875,
0.038665771484375,
0.01666259765625,
0.018951416015625,
-0.038909912109375,
-0.0673828125,
0.10491943359375,
0.01702880859375,
-0.021636962890625,
-0.01360321044921875,
-0.0012826919555664062,
0.0428466796875,
-0.0367431640625,
0.0258026123046875,
0.0450439453125,
0.028045654296875,
-0.00415802001953125,
-0.054931640625,
0.021820068359375,
-0.0206298828125,
0.01082611083984375,
-0.011474609375,
-0.060394287109375,
0.06365966796875,
-0.00811767578125,
-0.0104827880859375,
0.00530242919921875,
0.06585693359375,
0.03375244140625,
0.010467529296875,
0.03387451171875,
0.058197021484375,
0.050140380859375,
-0.005779266357421875,
0.072509765625,
-0.0226898193359375,
0.06402587890625,
0.07073974609375,
0.01244354248046875,
0.075439453125,
0.028533935546875,
-0.0094146728515625,
0.050689697265625,
0.044891357421875,
-0.028778076171875,
0.0394287109375,
0.0129241943359375,
0.006927490234375,
0.00901031494140625,
0.010711669921875,
-0.012481689453125,
0.0443115234375,
0.01265716552734375,
-0.04302978515625,
0.0029468536376953125,
0.0103912353515625,
0.0200042724609375,
0.0025997161865234375,
0.0084228515625,
0.04632568359375,
0.00980377197265625,
-0.036468505859375,
0.030303955078125,
0.0133209228515625,
0.0679931640625,
-0.0338134765625,
0.01267242431640625,
0.0022983551025390625,
0.0264892578125,
-0.0059814453125,
-0.0399169921875,
0.02703857421875,
-0.0135040283203125,
-0.0024471282958984375,
-0.01000213623046875,
0.040435791015625,
-0.05499267578125,
-0.052520751953125,
0.02880859375,
0.039154052734375,
0.00670623779296875,
-0.005558013916015625,
-0.07843017578125,
0.00698089599609375,
0.00568389892578125,
-0.04541015625,
0.0076751708984375,
0.0306396484375,
0.025634765625,
0.03900146484375,
0.0239715576171875,
-0.00441741943359375,
0.0036525726318359375,
0.00534820556640625,
0.0621337890625,
-0.05316162109375,
-0.0421142578125,
-0.07269287109375,
0.047607421875,
-0.012847900390625,
-0.039581298828125,
0.06390380859375,
0.04632568359375,
0.07476806640625,
-0.01248931884765625,
0.04425048828125,
-0.0143280029296875,
0.02337646484375,
-0.0428466796875,
0.06988525390625,
-0.040008544921875,
-0.005092620849609375,
-0.032196044921875,
-0.076171875,
-0.014007568359375,
0.08258056640625,
-0.0217742919921875,
0.0113525390625,
0.06634521484375,
0.06219482421875,
-0.0065460205078125,
-0.008575439453125,
0.01067352294921875,
0.034912109375,
0.023162841796875,
0.0322265625,
0.02001953125,
-0.06884765625,
0.043121337890625,
-0.03704833984375,
-0.0167388916015625,
-0.01279449462890625,
-0.04412841796875,
-0.07568359375,
-0.06378173828125,
-0.03375244140625,
-0.033355712890625,
-0.01409912109375,
0.08319091796875,
0.040191650390625,
-0.0533447265625,
-0.01373291015625,
-0.00917816162109375,
-0.011474609375,
-0.01499176025390625,
-0.0283203125,
0.052734375,
-0.03240966796875,
-0.06787109375,
0.01004791259765625,
-0.00276947021484375,
0.006977081298828125,
-0.022064208984375,
0.0027446746826171875,
-0.0411376953125,
0.006389617919921875,
0.0426025390625,
-0.019134521484375,
-0.05743408203125,
-0.02154541015625,
0.00789642333984375,
-0.0268096923828125,
-0.006504058837890625,
0.0267791748046875,
-0.05523681640625,
0.0277557373046875,
0.0277557373046875,
0.0399169921875,
0.05731201171875,
-0.00424957275390625,
0.03582763671875,
-0.05364990234375,
0.02069091796875,
0.00925445556640625,
0.045867919921875,
0.0311279296875,
-0.031036376953125,
0.0499267578125,
0.02484130859375,
-0.04852294921875,
-0.045379638671875,
-0.00911712646484375,
-0.08221435546875,
-0.01425933837890625,
0.0877685546875,
-0.021728515625,
-0.03411865234375,
0.01244354248046875,
-0.033477783203125,
0.045562744140625,
-0.0218353271484375,
0.06671142578125,
0.0638427734375,
-0.004444122314453125,
-0.017578125,
-0.032684326171875,
0.020294189453125,
0.03582763671875,
-0.04852294921875,
-0.01690673828125,
0.01959228515625,
0.03057861328125,
0.0098724365234375,
0.037994384765625,
0.0019550323486328125,
0.0015401840209960938,
-0.0007953643798828125,
-0.0035037994384765625,
-0.0030059814453125,
0.00606536865234375,
-0.0286407470703125,
0.0011844635009765625,
-0.040008544921875,
-0.0269012451171875
]
] |
cafeai/cafe_aesthetic | 2022-11-23T12:08:27.000Z | [
"transformers",
"pytorch",
"beit",
"image-classification",
"license:agpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | cafeai | null | null | cafeai/cafe_aesthetic | 32 | 15,821 | transformers | 2022-11-14T09:56:39 | ---
license: agpl-3.0
---
# Info
Since people are downloading this and I don't know why, I'll add some information. This model is an image classifier fine-tuned on `microsoft/beit-base-patch16-384`.
Its purpose is to be used in the dataset conditioning step for the [Waifu Diffusion project](https://huggingface.co/hakurei/waifu-diffusion), a fine-tune effort for Stable Diffusion. As WD1.4 is planned to have a *significantly large dataset* (~15m images), it is infeasible to analyze every image manually to determine whether or not it should be included in the final training dataset. This image classifier is trained on approximately 3.5k real-life and anime/manga images. Its purpose is to remove aesthetically worthless images from our dataset by classifying them as "`not_aesthetic`". The image classifier was trained to **err on the side of caution** and will generally tend to include images unless they are in a "manga-like" format, have messy lines and/or are sketches, or include an unacceptable amount of text (namely text that covers the primary subject of the image). The idea is that certain images will hurt a SD fine-tune.
Note: This classifier is not perfect, just like every other classifier out there. However, with a sufficiently large dataset, any imperfections or misclassifications should average themselves out due to the Law of Large Numbers.
You can test out the classifier [here](https://huggingface.co/spaces/cafeai/cafe_aesthetic_demo), along with some other classifiers for the project.
# License
Released under the aGPLv3. Use the model as you wish for any purpose. If you make changes, share the changes. | 1,643 | [
[
-0.05169677734375,
-0.04833984375,
0.017181396484375,
0.022125244140625,
-0.04180908203125,
-0.0360107421875,
0.01837158203125,
-0.04608154296875,
-0.0160675048828125,
0.038238525390625,
-0.04254150390625,
-0.040252685546875,
-0.051422119140625,
-0.01071929931640625,
-0.03546142578125,
0.08013916015625,
-0.0107269287109375,
0.0030193328857421875,
-0.0166015625,
-0.017852783203125,
-0.04730224609375,
-0.00408172607421875,
-0.0704345703125,
-0.035888671875,
0.036865234375,
0.0303955078125,
0.05731201171875,
0.052215576171875,
0.037139892578125,
0.0167388916015625,
-0.0012378692626953125,
-0.0098114013671875,
-0.062286376953125,
-0.032684326171875,
0.0027713775634765625,
-0.031463623046875,
-0.039794921875,
0.015411376953125,
0.031219482421875,
0.03924560546875,
0.0006804466247558594,
0.0139007568359375,
-0.01031494140625,
0.07781982421875,
-0.04266357421875,
0.01255035400390625,
-0.02093505859375,
0.033721923828125,
-0.0188446044921875,
-0.0037288665771484375,
-0.02691650390625,
-0.01416015625,
0.00797271728515625,
-0.05816650390625,
0.041290283203125,
0.004199981689453125,
0.09832763671875,
0.03680419921875,
-0.017608642578125,
0.0096893310546875,
-0.047454833984375,
0.0386962890625,
-0.04296875,
0.0085601806640625,
0.04779052734375,
0.034820556640625,
-0.00955963134765625,
-0.037689208984375,
-0.0335693359375,
0.0026874542236328125,
0.006366729736328125,
0.0143585205078125,
-0.0236358642578125,
0.007198333740234375,
0.044769287109375,
0.0309906005859375,
-0.04815673828125,
0.021270751953125,
-0.051025390625,
-0.01413726806640625,
0.04833984375,
0.004634857177734375,
0.01593017578125,
0.00027251243591308594,
-0.027252197265625,
0.006275177001953125,
-0.048126220703125,
0.0006556510925292969,
0.037689208984375,
0.0018873214721679688,
-0.017547607421875,
0.0379638671875,
-0.00548553466796875,
0.06231689453125,
0.0016241073608398438,
-0.00998687744140625,
0.04443359375,
0.01132965087890625,
-0.032867431640625,
-0.0024318695068359375,
0.062225341796875,
0.059051513671875,
0.0274200439453125,
0.0082855224609375,
-0.0176849365234375,
0.01338958740234375,
0.03033447265625,
-0.08660888671875,
-0.0333251953125,
0.0235595703125,
-0.05511474609375,
-0.0484619140625,
-0.009521484375,
-0.035736083984375,
-0.0030727386474609375,
-0.0213165283203125,
0.039581298828125,
-0.0374755859375,
-0.0189361572265625,
0.0037784576416015625,
-0.01397705078125,
-0.001499176025390625,
0.049407958984375,
-0.044189453125,
0.0121612548828125,
0.021697998046875,
0.06451416015625,
-0.00370025634765625,
-0.025787353515625,
-0.003063201904296875,
-0.0032749176025390625,
-0.04473876953125,
0.0667724609375,
-0.02374267578125,
-0.04400634765625,
-0.01224517822265625,
0.034637451171875,
0.01070404052734375,
-0.050445556640625,
0.0675048828125,
-0.048797607421875,
0.00493621826171875,
-0.0168914794921875,
-0.055145263671875,
-0.0308837890625,
0.0036411285400390625,
-0.06768798828125,
0.0589599609375,
0.015655517578125,
-0.063232421875,
0.045623779296875,
-0.06451416015625,
-0.02392578125,
0.00785064697265625,
0.024566650390625,
-0.05279541015625,
0.00708770751953125,
-0.0211639404296875,
0.026947021484375,
0.00354766845703125,
0.0222930908203125,
-0.044189453125,
-0.056549072265625,
0.0041046142578125,
-0.03680419921875,
0.050445556640625,
0.045135498046875,
-0.002956390380859375,
-0.001560211181640625,
-0.072998046875,
-0.01519012451171875,
0.02349853515625,
-0.0187835693359375,
-0.0308837890625,
-0.01261138916015625,
0.0234222412109375,
0.0168609619140625,
0.01551055908203125,
-0.045989990234375,
0.01123809814453125,
-0.00504302978515625,
0.01155853271484375,
0.07232666015625,
-0.006805419921875,
0.031280517578125,
-0.040252685546875,
0.040069580078125,
0.023223876953125,
0.0226898193359375,
-0.00676727294921875,
-0.048675537109375,
-0.05596923828125,
-0.03118896484375,
0.032562255859375,
0.0184478759765625,
-0.040802001953125,
0.037872314453125,
0.01284027099609375,
-0.046478271484375,
-0.03558349609375,
-0.00714874267578125,
0.02252197265625,
0.0518798828125,
0.0215911865234375,
-0.01117706298828125,
-0.036865234375,
-0.0780029296875,
0.0171356201171875,
0.017364501953125,
0.00983428955078125,
0.006076812744140625,
0.03985595703125,
-0.0283203125,
0.06085205078125,
-0.044158935546875,
-0.040069580078125,
0.012115478515625,
0.0022602081298828125,
0.0180816650390625,
0.050872802734375,
0.06085205078125,
-0.06109619140625,
-0.0263671875,
-0.0140533447265625,
-0.056396484375,
-0.005229949951171875,
0.014739990234375,
-0.025360107421875,
0.0037631988525390625,
0.0304718017578125,
-0.038421630859375,
0.043853759765625,
0.0296783447265625,
-0.024932861328125,
0.044097900390625,
-0.02734375,
-0.00013625621795654297,
-0.06634521484375,
0.008636474609375,
0.040313720703125,
-0.020233154296875,
-0.037567138671875,
0.00952911376953125,
0.0069122314453125,
-0.0212860107421875,
-0.037567138671875,
0.036895751953125,
0.00002956390380859375,
0.01629638671875,
-0.04437255859375,
-0.0293731689453125,
0.0154571533203125,
0.03131103515625,
-0.0016717910766601562,
0.043914794921875,
0.04931640625,
-0.06390380859375,
0.01293182373046875,
0.013671875,
-0.03338623046875,
0.059906005859375,
-0.055389404296875,
0.007137298583984375,
-0.033966064453125,
0.01519012451171875,
-0.06939697265625,
-0.0307159423828125,
0.0267333984375,
-0.01493072509765625,
0.0369873046875,
-0.005588531494140625,
-0.039947509765625,
-0.034088134765625,
-0.015289306640625,
0.03875732421875,
0.039947509765625,
-0.051513671875,
0.035552978515625,
0.035430908203125,
0.0244903564453125,
-0.032867431640625,
-0.051544189453125,
-0.023834228515625,
-0.027740478515625,
-0.03826904296875,
0.01425933837890625,
-0.01383209228515625,
0.0069122314453125,
0.0212860107421875,
-0.00597381591796875,
-0.0091400146484375,
0.0114898681640625,
0.036651611328125,
0.026611328125,
0.001071929931640625,
0.004062652587890625,
0.00807952880859375,
-0.0192413330078125,
0.00402069091796875,
-0.004291534423828125,
0.0295257568359375,
0.0026035308837890625,
-0.0338134765625,
-0.057891845703125,
0.00569915771484375,
0.046142578125,
0.01230621337890625,
0.038604736328125,
0.061767578125,
-0.044464111328125,
-0.01190185546875,
-0.0305023193359375,
0.0189361572265625,
-0.03631591796875,
0.039398193359375,
-0.0308990478515625,
-0.044219970703125,
0.047576904296875,
-0.007472991943359375,
0.00437164306640625,
0.056915283203125,
0.01471710205078125,
-0.0170440673828125,
0.10443115234375,
0.037353515625,
-0.01397705078125,
0.027313232421875,
-0.041015625,
-0.005809783935546875,
-0.06719970703125,
-0.03631591796875,
-0.03765869140625,
-0.039581298828125,
-0.03497314453125,
-0.0098876953125,
-0.0038700103759765625,
-0.0011625289916992188,
-0.027801513671875,
0.0151519775390625,
-0.039459228515625,
0.040283203125,
0.026397705078125,
0.0313720703125,
0.004596710205078125,
-0.002132415771484375,
0.011962890625,
-0.0060577392578125,
-0.03375244140625,
-0.026153564453125,
0.06732177734375,
0.04736328125,
0.07879638671875,
-0.000018417835235595703,
0.032135009765625,
0.031890869140625,
0.0230712890625,
-0.07928466796875,
0.03753662109375,
-0.03924560546875,
-0.042144775390625,
-0.0011358261108398438,
-0.04437255859375,
-0.076171875,
0.00678253173828125,
-0.002368927001953125,
-0.0213165283203125,
0.02545166015625,
0.0301513671875,
-0.0026760101318359375,
0.0144805908203125,
-0.06866455078125,
0.0875244140625,
-0.0003311634063720703,
-0.01727294921875,
-0.00799560546875,
-0.02020263671875,
0.034881591796875,
-0.0264892578125,
0.0261993408203125,
-0.032562255859375,
0.015655517578125,
0.057373046875,
-0.0489501953125,
0.06231689453125,
-0.051910400390625,
-0.0038814544677734375,
0.01087188720703125,
0.0033321380615234375,
0.0161895751953125,
-0.023651123046875,
-0.003932952880859375,
0.0188751220703125,
0.01215362548828125,
-0.041259765625,
-0.0252227783203125,
0.050201416015625,
-0.07965087890625,
-0.01763916015625,
-0.0301666259765625,
-0.0345458984375,
-0.00229644775390625,
0.0163116455078125,
0.059051513671875,
0.04388427734375,
-0.0204315185546875,
0.00418853759765625,
0.059051513671875,
-0.0008077621459960938,
0.01285552978515625,
0.0305023193359375,
-0.03204345703125,
-0.01551055908203125,
0.08447265625,
0.0206298828125,
0.01300811767578125,
-0.009490966796875,
0.0243377685546875,
-0.0161285400390625,
-0.04986572265625,
-0.034820556640625,
0.0110626220703125,
-0.067138671875,
-0.02569580078125,
-0.0301666259765625,
-0.0168304443359375,
-0.01007080078125,
0.0020580291748046875,
-0.016204833984375,
-0.0225067138671875,
-0.0589599609375,
-0.00531005859375,
0.0276336669921875,
0.0594482421875,
0.00798797607421875,
0.0305633544921875,
-0.0474853515625,
0.020660400390625,
0.0041046142578125,
0.041168212890625,
0.00852203369140625,
-0.06109619140625,
-0.0219268798828125,
-0.00251007080078125,
-0.0226287841796875,
-0.04046630859375,
0.024017333984375,
0.02734375,
0.0458984375,
0.06243896484375,
-0.0020732879638671875,
0.05645751953125,
-0.048126220703125,
0.040008544921875,
0.019561767578125,
-0.040008544921875,
0.045654296875,
-0.0204925537109375,
0.038909912109375,
0.052398681640625,
0.043304443359375,
-0.005191802978515625,
-0.0222930908203125,
-0.06402587890625,
-0.049560546875,
0.06292724609375,
-0.0011196136474609375,
0.0194854736328125,
0.01003265380859375,
0.042724609375,
0.007091522216796875,
0.01476287841796875,
-0.053680419921875,
-0.0191192626953125,
-0.039154052734375,
-0.01535797119140625,
0.01373291015625,
-0.0204315185546875,
-0.00652313232421875,
-0.0263519287109375,
0.0784912109375,
-0.0068817138671875,
0.008758544921875,
0.003093719482421875,
0.002002716064453125,
-0.03314208984375,
-0.0243988037109375,
0.0297393798828125,
0.0330810546875,
-0.0263214111328125,
-0.01395416259765625,
-0.00659942626953125,
-0.05206298828125,
0.00616455078125,
0.00984954833984375,
-0.02618408203125,
-0.004543304443359375,
0.0031375885009765625,
0.08306884765625,
-0.006744384765625,
-0.035552978515625,
0.041168212890625,
-0.0245819091796875,
-0.03765869140625,
-0.024078369140625,
0.0292816162109375,
-0.020233154296875,
0.0025730133056640625,
0.0170135498046875,
0.054779052734375,
0.040771484375,
-0.03314208984375,
0.00916290283203125,
0.0281829833984375,
-0.037994384765625,
-0.020233154296875,
0.0447998046875,
-0.0005869865417480469,
-0.00910186767578125,
0.037689208984375,
-0.04486083984375,
-0.0287628173828125,
0.056365966796875,
0.04144287109375,
0.0433349609375,
-0.005828857421875,
0.020721435546875,
0.052093505859375,
0.006587982177734375,
-0.033660888671875,
0.01336669921875,
0.007297515869140625,
-0.060394287109375,
-0.0239410400390625,
-0.05889892578125,
-0.0171661376953125,
0.0293731689453125,
-0.054931640625,
0.043853759765625,
-0.04931640625,
-0.028961181640625,
0.006526947021484375,
-0.0162506103515625,
-0.051239013671875,
0.040985107421875,
0.005855560302734375,
0.08502197265625,
-0.086669921875,
0.06500244140625,
0.04486083984375,
-0.0276336669921875,
-0.0675048828125,
-0.0093841552734375,
-0.01459503173828125,
-0.054962158203125,
0.031890869140625,
0.0306396484375,
0.0131378173828125,
0.000016748905181884766,
-0.068359375,
-0.04388427734375,
0.06158447265625,
0.0245361328125,
-0.039154052734375,
-0.00397491455078125,
-0.020294189453125,
0.0469970703125,
-0.0213623046875,
-0.002368927001953125,
0.026641845703125,
0.03204345703125,
0.042816162109375,
-0.052734375,
0.00537872314453125,
-0.04638671875,
0.0203399658203125,
0.0016584396362304688,
-0.05670166015625,
0.05914306640625,
-0.01512908935546875,
0.0036106109619140625,
0.028289794921875,
0.02740478515625,
0.0209197998046875,
0.0282745361328125,
0.049285888671875,
0.08209228515625,
0.0621337890625,
-0.0253448486328125,
0.08502197265625,
0.005649566650390625,
0.0372314453125,
0.070068359375,
-0.0141448974609375,
0.037109375,
0.0084381103515625,
-0.0143890380859375,
0.02734375,
0.083251953125,
-0.045196533203125,
0.07421875,
0.01036834716796875,
-0.00292205810546875,
-0.0094451904296875,
-0.00670623779296875,
-0.035888671875,
0.0238800048828125,
0.0247802734375,
-0.028839111328125,
-0.01218414306640625,
0.0276947021484375,
-0.00878143310546875,
-0.02972412109375,
-0.0283203125,
0.053558349609375,
-0.000015795230865478516,
-0.03497314453125,
0.045654296875,
-0.004085540771484375,
0.0455322265625,
-0.041168212890625,
-0.017242431640625,
0.00778961181640625,
0.01312255859375,
-0.044525146484375,
-0.057891845703125,
0.019805908203125,
-0.01239013671875,
-0.01288604736328125,
0.0160980224609375,
0.06182861328125,
-0.01195526123046875,
-0.042572021484375,
0.01617431640625,
-0.0016307830810546875,
0.03497314453125,
-0.0005459785461425781,
-0.0714111328125,
0.02490234375,
0.0016956329345703125,
-0.007152557373046875,
0.02423095703125,
0.040435791015625,
0.01873779296875,
0.0380859375,
0.04998779296875,
0.006633758544921875,
-0.0265350341796875,
0.0379638671875,
0.0645751953125,
-0.015655517578125,
-0.03240966796875,
-0.047637939453125,
0.041290283203125,
-0.00897979736328125,
-0.0240936279296875,
0.044921875,
0.058013916015625,
0.05755615234375,
-0.01207733154296875,
0.057403564453125,
0.0001455545425415039,
0.013824462890625,
-0.045745849609375,
0.051544189453125,
-0.070068359375,
0.01309967041015625,
-0.03057861328125,
-0.06494140625,
-0.01000213623046875,
0.08447265625,
-0.00007903575897216797,
0.017913818359375,
0.0290069580078125,
0.04595947265625,
-0.009521484375,
0.02691650390625,
0.027130126953125,
0.00904083251953125,
0.0224456787109375,
0.01088714599609375,
0.051910400390625,
-0.038970947265625,
0.0211944580078125,
-0.0360107421875,
-0.0309906005859375,
-0.012969970703125,
-0.06243896484375,
-0.06585693359375,
-0.033233642578125,
-0.043304443359375,
-0.047637939453125,
-0.0108489990234375,
0.05364990234375,
0.07781982421875,
-0.054107666015625,
0.0028629302978515625,
-0.021484375,
-0.0215301513671875,
0.004055023193359375,
-0.0193939208984375,
0.0116119384765625,
-0.01021575927734375,
-0.05999755859375,
0.0014238357543945312,
0.0010690689086914062,
0.033966064453125,
-0.029571533203125,
-0.005039215087890625,
0.003265380859375,
-0.0303955078125,
0.036590576171875,
0.01212310791015625,
-0.01904296875,
-0.0257110595703125,
-0.016510009765625,
0.006931304931640625,
0.0196685791015625,
0.0234832763671875,
-0.0340576171875,
0.04132080078125,
0.024993896484375,
0.023193359375,
0.048126220703125,
-0.01085662841796875,
0.01309967041015625,
-0.049072265625,
0.020477294921875,
0.007537841796875,
0.0552978515625,
0.0301666259765625,
-0.0382080078125,
0.04058837890625,
0.030731201171875,
-0.055389404296875,
-0.062225341796875,
0.005290985107421875,
-0.078125,
-0.0262908935546875,
0.0780029296875,
-0.0049591064453125,
-0.0374755859375,
-0.0048370361328125,
-0.0350341796875,
0.00939178466796875,
-0.048828125,
0.0548095703125,
0.056549072265625,
-0.0108489990234375,
-0.00595855712890625,
-0.052032470703125,
0.0274658203125,
-0.0041046142578125,
-0.05426025390625,
-0.0211944580078125,
0.0538330078125,
0.044525146484375,
0.0304718017578125,
0.052734375,
-0.0190277099609375,
0.035919189453125,
-0.01004791259765625,
0.000713348388671875,
-0.01110076904296875,
-0.0206756591796875,
-0.04388427734375,
0.00942230224609375,
0.0167388916015625,
-0.01541900634765625
]
] |
KoboldAI/OPT-13B-Erebus | 2022-09-09T13:54:35.000Z | [
"transformers",
"pytorch",
"opt",
"text-generation",
"en",
"arxiv:2205.01068",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | KoboldAI | null | null | KoboldAI/OPT-13B-Erebus | 157 | 15,695 | transformers | 2022-09-09T09:11:05 | ---
language: en
license: other
commercial: no
inference: false
---
# OPT 13B - Erebus
## Model description
This is the second generation of the original Shinen made by Mr. Seeker. The full dataset consists of 6 different sources, all surrounding the "Adult" theme. The name "Erebus" comes from the greek mythology, also named "darkness". This is in line with Shin'en, or "deep abyss". For inquiries, please contact the KoboldAI community. **Warning: THIS model is NOT suitable for use by minors. The model will output X-rated content.**
## Training data
The data can be divided in 6 different datasets:
- Literotica (everything with 4.5/5 or higher)
- Sexstories (everything with 90 or higher)
- Dataset-G (private dataset of X-rated stories)
- Doc's Lab (all stories)
- Pike Dataset (novels with "adult" rating)
- SoFurry (collection of various animals)
The dataset uses `[Genre: <comma-separated list of genres>]` for tagging.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/OPT-13B-Erebus')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt's all right," Janeway said. "I'm certain that you're doing your best to keep me informed of what\'s going on."'}]
```
## Limitations and biases
Based on known problems with NLP technology, potential relevant factors include bias (gender, profession, race and religion). **Warning: This model has a very strong NSFW bias!**
### License
OPT-13B is licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
### BibTeX entry and citation info
```
@misc{zhang2022opt,
title={OPT: Open Pre-trained Transformer Language Models},
author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer},
year={2022},
eprint={2205.01068},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 2,395 | [
[
-0.032196044921875,
-0.047576904296875,
0.01154327392578125,
0.0180511474609375,
-0.0203857421875,
-0.0279541015625,
-0.0261383056640625,
-0.030517578125,
0.023681640625,
0.0531005859375,
-0.062225341796875,
-0.0292205810546875,
-0.029144287109375,
0.0250244140625,
-0.016387939453125,
0.0684814453125,
0.015594482421875,
-0.0079498291015625,
0.0273895263671875,
0.00968170166015625,
-0.0308074951171875,
-0.0217437744140625,
-0.047393798828125,
-0.02508544921875,
0.032867431640625,
0.027618408203125,
0.06005859375,
0.03802490234375,
0.0467529296875,
0.0205230712890625,
-0.0245208740234375,
0.016204833984375,
-0.048553466796875,
-0.006107330322265625,
-0.0014896392822265625,
-0.0340576171875,
-0.032928466796875,
-0.00942230224609375,
0.04779052734375,
0.044036865234375,
-0.005939483642578125,
0.014129638671875,
-0.007770538330078125,
0.04022216796875,
-0.035003662109375,
-0.01568603515625,
-0.0386962890625,
0.00940704345703125,
-0.0237274169921875,
0.002750396728515625,
-0.06298828125,
-0.011016845703125,
0.009429931640625,
-0.0350341796875,
0.0355224609375,
0.024169921875,
0.10064697265625,
0.02020263671875,
-0.024810791015625,
-0.00988006591796875,
-0.049713134765625,
0.0653076171875,
-0.07196044921875,
0.031494140625,
0.019775390625,
0.001522064208984375,
-0.0016279220581054688,
-0.068359375,
-0.0256195068359375,
-0.00409698486328125,
-0.01256561279296875,
0.0347900390625,
-0.01117706298828125,
-0.01515960693359375,
0.01119232177734375,
0.029327392578125,
-0.04217529296875,
0.007625579833984375,
-0.0550537109375,
-0.00281524658203125,
0.049041748046875,
0.0160675048828125,
0.020721435546875,
-0.041259765625,
-0.041473388671875,
-0.0222625732421875,
-0.041961669921875,
-0.0066986083984375,
0.0478515625,
0.03656005859375,
-0.0198974609375,
0.04144287109375,
0.0191497802734375,
0.050018310546875,
0.0154266357421875,
0.01678466796875,
0.0452880859375,
-0.02069091796875,
-0.0171966552734375,
0.0076141357421875,
0.069580078125,
0.028228759765625,
0.004604339599609375,
0.00457000732421875,
-0.00882720947265625,
-0.01013946533203125,
0.048583984375,
-0.052825927734375,
-0.01139068603515625,
0.0139617919921875,
-0.057342529296875,
-0.043182373046875,
0.0224609375,
-0.08026123046875,
-0.0261077880859375,
-0.006641387939453125,
0.016265869140625,
-0.045196533203125,
-0.035858154296875,
0.01189422607421875,
0.00736236572265625,
0.036865234375,
-0.01029205322265625,
-0.06610107421875,
0.0209197998046875,
0.02215576171875,
0.04119873046875,
-0.01036834716796875,
-0.031585693359375,
0.0143890380859375,
-0.0113525390625,
-0.043701171875,
0.036407470703125,
-0.0284576416015625,
-0.01470947265625,
0.0062408447265625,
0.0229949951171875,
-0.0152587890625,
-0.03302001953125,
0.077880859375,
-0.0357666015625,
0.029876708984375,
0.016387939453125,
-0.030181884765625,
-0.025970458984375,
-0.02587890625,
-0.054229736328125,
0.0860595703125,
0.01336669921875,
-0.06939697265625,
0.029449462890625,
-0.046234130859375,
-0.0239105224609375,
0.0101470947265625,
0.0126800537109375,
-0.04833984375,
0.0254974365234375,
0.0079345703125,
0.0165252685546875,
-0.01093292236328125,
0.030181884765625,
-0.01264190673828125,
-0.0111083984375,
0.01317596435546875,
-0.0303192138671875,
0.0743408203125,
0.03564453125,
-0.0303802490234375,
0.008697509765625,
-0.0704345703125,
0.0054931640625,
0.038787841796875,
-0.0157623291015625,
-0.0234527587890625,
0.0002295970916748047,
0.01468658447265625,
0.006069183349609375,
0.021820068359375,
-0.040496826171875,
-0.007213592529296875,
-0.042449951171875,
0.0219573974609375,
0.05389404296875,
-0.007843017578125,
0.03509521484375,
-0.017730712890625,
0.03631591796875,
0.0036525726318359375,
0.02349853515625,
-0.024078369140625,
-0.03924560546875,
-0.08489990234375,
-0.0065460205078125,
0.028656005859375,
0.039520263671875,
-0.032806396484375,
0.055419921875,
-0.01357269287109375,
-0.0567626953125,
-0.05621337890625,
-0.0242919921875,
0.01416015625,
0.0005640983581542969,
0.033905029296875,
0.006282806396484375,
-0.066162109375,
-0.076171875,
-0.02252197265625,
-0.005786895751953125,
0.00417327880859375,
0.034210205078125,
0.0494384765625,
-0.0296783447265625,
0.0594482421875,
-0.046234130859375,
-0.0277252197265625,
-0.0396728515625,
-0.0021228790283203125,
0.035919189453125,
0.033172607421875,
0.044464111328125,
-0.064208984375,
-0.0282135009765625,
-0.01163482666015625,
-0.05523681640625,
-0.0204010009765625,
-0.022552490234375,
-0.0321044921875,
0.0010118484497070312,
0.023406982421875,
-0.0180816650390625,
0.029449462890625,
0.037933349609375,
-0.0423583984375,
0.041046142578125,
-0.0162353515625,
-0.00492095947265625,
-0.1114501953125,
0.0060882568359375,
0.00252532958984375,
-0.007175445556640625,
-0.05859375,
0.01438140869140625,
0.01320648193359375,
-0.019622802734375,
-0.03936767578125,
0.037261962890625,
-0.0308074951171875,
0.0220184326171875,
-0.0146942138671875,
0.011444091796875,
-0.0137481689453125,
0.0341796875,
0.0184326171875,
0.041656494140625,
0.04010009765625,
-0.0552978515625,
0.0240936279296875,
0.04290771484375,
-0.01399993896484375,
0.0292205810546875,
-0.057769775390625,
0.000039458274841308594,
-0.0095367431640625,
-0.00576019287109375,
-0.05126953125,
-0.0263214111328125,
0.0190277099609375,
-0.0506591796875,
0.0308990478515625,
0.0033855438232421875,
-0.0281524658203125,
-0.048431396484375,
-0.01165008544921875,
-0.001468658447265625,
0.0499267578125,
-0.048553466796875,
0.05438232421875,
0.01291656494140625,
-0.0124053955078125,
-0.04437255859375,
-0.06329345703125,
-0.00021338462829589844,
-0.0306549072265625,
-0.0650634765625,
0.045257568359375,
0.0024929046630859375,
0.002460479736328125,
-0.00974273681640625,
0.00931549072265625,
-0.006488800048828125,
-0.012359619140625,
0.00902557373046875,
0.03411865234375,
-0.00551605224609375,
-0.005207061767578125,
0.015869140625,
-0.015960693359375,
0.004657745361328125,
0.002758026123046875,
0.048980712890625,
-0.017364501953125,
-0.0004246234893798828,
-0.018951416015625,
0.01678466796875,
0.0185546875,
-0.015899658203125,
0.0758056640625,
0.061431884765625,
-0.036651611328125,
-0.0269775390625,
-0.022796630859375,
-0.025482177734375,
-0.036834716796875,
0.044219970703125,
-0.019134521484375,
-0.038848876953125,
0.042572021484375,
0.0028858184814453125,
0.0253143310546875,
0.0548095703125,
0.035919189453125,
0.0168914794921875,
0.0712890625,
0.06536865234375,
0.0270843505859375,
0.038238525390625,
-0.0240325927734375,
0.018585205078125,
-0.0706787109375,
-0.027801513671875,
-0.037872314453125,
-0.0199432373046875,
-0.043304443359375,
-0.0063018798828125,
-0.004665374755859375,
-0.0050811767578125,
-0.03765869140625,
0.051971435546875,
-0.04443359375,
0.01447296142578125,
0.051300048828125,
0.021942138671875,
-0.00080108642578125,
0.00609588623046875,
-0.01068878173828125,
-0.020965576171875,
-0.055938720703125,
-0.0465087890625,
0.08111572265625,
0.03875732421875,
0.08062744140625,
0.01007080078125,
0.06646728515625,
0.0139007568359375,
0.0031795501708984375,
-0.0264739990234375,
0.045989990234375,
-0.0201873779296875,
-0.0819091796875,
-0.0123138427734375,
-0.0289764404296875,
-0.0760498046875,
0.0208282470703125,
-0.00958251953125,
-0.043670654296875,
0.033203125,
-0.016571044921875,
-0.019073486328125,
0.028778076171875,
-0.05877685546875,
0.06561279296875,
-0.02032470703125,
-0.021728515625,
0.0068206787109375,
-0.060821533203125,
0.02496337890625,
-0.00787353515625,
0.01172637939453125,
0.013397216796875,
-0.0032863616943359375,
0.07977294921875,
-0.027252197265625,
0.0714111328125,
0.0083770751953125,
-0.0103607177734375,
0.030609130859375,
-0.00927734375,
0.02392578125,
0.004718780517578125,
0.00261688232421875,
0.005512237548828125,
-0.02093505859375,
-0.0119476318359375,
0.003398895263671875,
0.044891357421875,
-0.076171875,
-0.01001739501953125,
-0.04144287109375,
-0.00943756103515625,
0.020294189453125,
0.041778564453125,
0.061614990234375,
0.0377197265625,
-0.0010929107666015625,
0.03546142578125,
0.0509033203125,
-0.044921875,
0.0261383056640625,
0.039459228515625,
-0.041595458984375,
-0.055206298828125,
0.060638427734375,
-0.001811981201171875,
0.0158538818359375,
0.01030731201171875,
0.00788116455078125,
-0.028839111328125,
-0.01397705078125,
-0.0223388671875,
0.037017822265625,
-0.051055908203125,
-0.0168914794921875,
-0.04864501953125,
-0.03802490234375,
-0.0248565673828125,
-0.0189056396484375,
-0.043792724609375,
0.0014181137084960938,
-0.039031982421875,
-0.00955963134765625,
0.0120391845703125,
0.04302978515625,
-0.012542724609375,
0.0340576171875,
-0.052093505859375,
0.026519775390625,
-0.0012683868408203125,
0.02978515625,
-0.003997802734375,
-0.07623291015625,
-0.02032470703125,
0.004444122314453125,
-0.0340576171875,
-0.0810546875,
0.048431396484375,
0.014068603515625,
0.052276611328125,
0.0377197265625,
0.0261383056640625,
0.0225982666015625,
-0.042266845703125,
0.072998046875,
0.0211181640625,
-0.046905517578125,
0.041259765625,
-0.0273284912109375,
0.007175445556640625,
0.032989501953125,
0.0226898193359375,
-0.0265960693359375,
-0.0302276611328125,
-0.07281494140625,
-0.08013916015625,
0.08514404296875,
0.0406494140625,
0.0184326171875,
-0.0053863525390625,
0.00824737548828125,
0.0210418701171875,
0.004718780517578125,
-0.09075927734375,
-0.061920166015625,
-0.024383544921875,
-0.0215301513671875,
-0.0011854171752929688,
-0.030303955078125,
0.005008697509765625,
-0.00039505958557128906,
0.07501220703125,
0.01009368896484375,
0.04949951171875,
0.01555633544921875,
-0.016387939453125,
-0.0102386474609375,
0.020477294921875,
0.039276123046875,
0.03558349609375,
-0.030120849609375,
-0.001590728759765625,
0.00968170166015625,
-0.061126708984375,
-0.00527191162109375,
0.01488494873046875,
-0.04266357421875,
0.0180816650390625,
0.01291656494140625,
0.09283447265625,
0.0129852294921875,
-0.0250244140625,
0.01739501953125,
-0.0014743804931640625,
-0.016326904296875,
-0.0467529296875,
-0.0107574462890625,
-0.0031528472900390625,
0.00986480712890625,
0.03564453125,
0.0159149169921875,
0.0037593841552734375,
-0.01861572265625,
0.00592041015625,
-0.01102447509765625,
-0.03387451171875,
-0.0207366943359375,
0.0711669921875,
0.0208740234375,
-0.032867431640625,
0.05615234375,
-0.0158538818359375,
-0.02899169921875,
0.04473876953125,
0.065673828125,
0.07476806640625,
-0.0167236328125,
0.0303802490234375,
0.055633544921875,
0.05255126953125,
0.0066375732421875,
0.0300140380859375,
0.052947998046875,
-0.060455322265625,
-0.0205230712890625,
-0.056640625,
-0.00872802734375,
0.030517578125,
-0.05889892578125,
0.0498046875,
-0.00803375244140625,
-0.0377197265625,
-0.005512237548828125,
-0.015106201171875,
-0.04034423828125,
0.019927978515625,
0.0352783203125,
0.068603515625,
-0.064453125,
0.006504058837890625,
0.0709228515625,
-0.04144287109375,
-0.05438232421875,
-0.016937255859375,
-0.02386474609375,
-0.038421630859375,
0.0283355712890625,
0.024169921875,
0.0192718505859375,
0.0195770263671875,
-0.056549072265625,
-0.06585693359375,
0.062164306640625,
0.0075225830078125,
-0.0254669189453125,
-0.0082855224609375,
-0.002716064453125,
0.036834716796875,
-0.028778076171875,
0.0301513671875,
0.03485107421875,
0.04425048828125,
-0.021728515625,
-0.0457763671875,
-0.007587432861328125,
-0.03179931640625,
0.01397705078125,
0.010833740234375,
-0.057159423828125,
0.065673828125,
-0.0288848876953125,
-0.01934814453125,
0.0215911865234375,
0.05889892578125,
0.025390625,
0.010162353515625,
0.0241241455078125,
0.0455322265625,
0.035430908203125,
-0.0258941650390625,
0.05889892578125,
-0.0252227783203125,
0.053863525390625,
0.069580078125,
-0.0014362335205078125,
0.044097900390625,
0.015228271484375,
-0.039398193359375,
0.045684814453125,
0.06671142578125,
-0.0226593017578125,
0.0452880859375,
-0.00600433349609375,
0.017120361328125,
-0.0270843505859375,
0.0028896331787109375,
-0.04656982421875,
0.01535797119140625,
0.0198974609375,
-0.054168701171875,
-0.00577545166015625,
0.0080108642578125,
0.008941650390625,
0.0003018379211425781,
-0.0160064697265625,
0.045867919921875,
0.015899658203125,
-0.0350341796875,
0.046844482421875,
0.00695037841796875,
0.06396484375,
-0.0628662109375,
0.0189208984375,
-0.002742767333984375,
0.027191162109375,
-0.0159912109375,
-0.053802490234375,
-0.0014238357543945312,
-0.0019006729125976562,
-0.0202789306640625,
-0.0036602020263671875,
0.06414794921875,
-0.019805908203125,
-0.054595947265625,
0.02069091796875,
0.0281524658203125,
0.0211181640625,
0.0207977294921875,
-0.057159423828125,
0.0009431838989257812,
0.0198974609375,
-0.03814697265625,
0.0003654956817626953,
0.01824951171875,
0.028472900390625,
0.050445556640625,
0.034271240234375,
0.0013666152954101562,
0.041046142578125,
0.0120086669921875,
0.04779052734375,
-0.04559326171875,
-0.044525146484375,
-0.041778564453125,
0.041015625,
-0.0199432373046875,
-0.04266357421875,
0.057281494140625,
0.04217529296875,
0.05572509765625,
-0.038665771484375,
0.061248779296875,
-0.034698486328125,
0.047332763671875,
-0.0186309814453125,
0.060028076171875,
-0.044525146484375,
-0.0136260986328125,
-0.031402587890625,
-0.09552001953125,
-0.00438690185546875,
0.0660400390625,
-0.01039886474609375,
0.0264892578125,
0.0650634765625,
0.04766845703125,
-0.000015854835510253906,
0.01343536376953125,
0.01480865478515625,
0.019744873046875,
0.0125885009765625,
0.0229339599609375,
0.0595703125,
-0.0567626953125,
0.0438232421875,
-0.036865234375,
-0.011688232421875,
-0.0303497314453125,
-0.04571533203125,
-0.07049560546875,
-0.038055419921875,
-0.0215911865234375,
-0.0292205810546875,
-0.0162353515625,
0.048095703125,
0.04534912109375,
-0.055908203125,
0.0005650520324707031,
-0.01204681396484375,
-0.0135345458984375,
-0.0257415771484375,
-0.0213165283203125,
0.0300140380859375,
-0.0215911865234375,
-0.06109619140625,
0.0225982666015625,
-0.01085662841796875,
0.01444244384765625,
-0.01169586181640625,
-0.010955810546875,
-0.01788330078125,
0.01280975341796875,
0.01861572265625,
-0.0018606185913085938,
-0.04766845703125,
0.0011377334594726562,
0.028778076171875,
0.0013370513916015625,
-0.01044464111328125,
0.0235137939453125,
-0.035308837890625,
0.03167724609375,
0.0364990234375,
0.014892578125,
0.0118865966796875,
-0.00033855438232421875,
0.032867431640625,
-0.039703369140625,
0.003704071044921875,
0.01468658447265625,
0.035430908203125,
0.0217132568359375,
-0.01480865478515625,
0.05078125,
0.0137786865234375,
-0.054046630859375,
-0.07464599609375,
0.016571044921875,
-0.064453125,
-0.01129913330078125,
0.10089111328125,
-0.0048980712890625,
-0.016357421875,
0.009521484375,
-0.03472900390625,
0.019805908203125,
-0.028106689453125,
0.021392822265625,
0.04071044921875,
0.02569580078125,
-0.0159149169921875,
-0.04058837890625,
0.01287841796875,
0.016326904296875,
-0.04266357421875,
0.00836944580078125,
0.018829345703125,
0.007518768310546875,
0.034423828125,
0.011749267578125,
-0.016845703125,
0.0115509033203125,
0.0260009765625,
0.0272979736328125,
-0.005279541015625,
-0.035369873046875,
-0.00914764404296875,
-0.01123046875,
-0.0229034423828125,
0.0018863677978515625
]
] |
facebook/bart-large-xsum | 2023-01-24T16:28:59.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"bart",
"text2text-generation",
"summarization",
"en",
"arxiv:1910.13461",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | summarization | facebook | null | null | facebook/bart-large-xsum | 28 | 15,694 | transformers | 2022-03-02T23:29:05 | ---
tags:
- summarization
language:
- en
license: mit
model-index:
- name: facebook/bart-large-xsum
results:
- task:
type: summarization
name: Summarization
dataset:
name: cnn_dailymail
type: cnn_dailymail
config: 3.0.0
split: test
metrics:
- name: ROUGE-1
type: rouge
value: 25.2697
verified: true
- name: ROUGE-2
type: rouge
value: 7.6638
verified: true
- name: ROUGE-L
type: rouge
value: 17.1808
verified: true
- name: ROUGE-LSUM
type: rouge
value: 21.7933
verified: true
- name: loss
type: loss
value: 3.5042972564697266
verified: true
- name: gen_len
type: gen_len
value: 27.4462
verified: true
- task:
type: summarization
name: Summarization
dataset:
name: xsum
type: xsum
config: default
split: test
metrics:
- name: ROUGE-1
type: rouge
value: 45.4525
verified: true
- name: ROUGE-2
type: rouge
value: 22.3455
verified: true
- name: ROUGE-L
type: rouge
value: 37.2302
verified: true
- name: ROUGE-LSUM
type: rouge
value: 37.2323
verified: true
- name: loss
type: loss
value: 2.3128726482391357
verified: true
- name: gen_len
type: gen_len
value: 25.5435
verified: true
- task:
type: summarization
name: Summarization
dataset:
name: samsum
type: samsum
config: samsum
split: train
metrics:
- name: ROUGE-1
type: rouge
value: 24.7852
verified: true
- name: ROUGE-2
type: rouge
value: 5.2533
verified: true
- name: ROUGE-L
type: rouge
value: 18.6792
verified: true
- name: ROUGE-LSUM
type: rouge
value: 20.629
verified: true
- name: loss
type: loss
value: 3.746837854385376
verified: true
- name: gen_len
type: gen_len
value: 23.1206
verified: true
- task:
type: summarization
name: Summarization
dataset:
name: samsum
type: samsum
config: samsum
split: test
metrics:
- name: ROUGE-1
type: rouge
value: 24.9158
verified: true
- name: ROUGE-2
type: rouge
value: 5.5837
verified: true
- name: ROUGE-L
type: rouge
value: 18.8935
verified: true
- name: ROUGE-LSUM
type: rouge
value: 20.76
verified: true
- name: loss
type: loss
value: 3.775235891342163
verified: true
- name: gen_len
type: gen_len
value: 23.0928
verified: true
---
### Bart model finetuned on xsum
docs: https://huggingface.co/transformers/model_doc/bart.html
finetuning: examples/seq2seq/ (as of Aug 20, 2020)
Metrics: ROUGE > 22 on xsum.
variants: search for distilbart
paper: https://arxiv.org/abs/1910.13461 | 2,997 | [
[
-0.032623291015625,
-0.0325927734375,
0.03570556640625,
0.0244293212890625,
-0.01111602783203125,
0.0167999267578125,
0.0207061767578125,
-0.006168365478515625,
0.0379638671875,
0.05413818359375,
-0.060211181640625,
-0.0228729248046875,
-0.0362548828125,
-0.01367950439453125,
-0.04052734375,
0.0927734375,
0.0010347366333007812,
0.0222320556640625,
0.0090179443359375,
-0.02532958984375,
-0.017578125,
-0.034576416015625,
-0.07684326171875,
-0.0177001953125,
0.046356201171875,
0.0609130859375,
0.058685302734375,
0.02239990234375,
0.0584716796875,
0.023681640625,
-0.012664794921875,
0.0114288330078125,
-0.04962158203125,
0.004169464111328125,
-0.015716552734375,
-0.0399169921875,
-0.0826416015625,
0.01263427734375,
0.049713134765625,
0.054840087890625,
-0.00270843505859375,
0.040008544921875,
-0.01479339599609375,
0.0443115234375,
-0.0168609619140625,
0.022064208984375,
-0.010284423828125,
-0.008392333984375,
-0.0245513916015625,
0.0005273818969726562,
-0.007373809814453125,
-0.021392822265625,
-0.0222015380859375,
-0.051116943359375,
0.0236968994140625,
0.0228118896484375,
0.09869384765625,
0.034820556640625,
-0.0423583984375,
0.0172271728515625,
-0.04229736328125,
0.05169677734375,
-0.0462646484375,
0.031768798828125,
0.0477294921875,
0.057952880859375,
0.0187225341796875,
-0.09637451171875,
-0.028167724609375,
-0.0003609657287597656,
-0.0288543701171875,
0.01163482666015625,
-0.04412841796875,
-0.0082550048828125,
0.043701171875,
0.05145263671875,
-0.029632568359375,
-0.00748443603515625,
-0.039031982421875,
-0.005481719970703125,
0.052886962890625,
0.0582275390625,
0.007350921630859375,
-0.0258941650390625,
-0.02178955078125,
-0.037384033203125,
-0.032196044921875,
-0.00778961181640625,
0.0220794677734375,
0.00250244140625,
-0.0498046875,
0.050384521484375,
0.0017423629760742188,
-0.00554656982421875,
0.016326904296875,
0.0102996826171875,
0.031982421875,
-0.05755615234375,
-0.0293426513671875,
-0.03411865234375,
0.044952392578125,
0.04193115234375,
0.023193359375,
0.01277923583984375,
-0.034332275390625,
-0.005382537841796875,
0.03228759765625,
-0.055572509765625,
-0.040008544921875,
-0.024444580078125,
-0.0310211181640625,
-0.0189208984375,
0.01013946533203125,
-0.0283355712890625,
-0.02178955078125,
-0.0308990478515625,
0.02386474609375,
-0.0272979736328125,
-0.015655517578125,
-0.007610321044921875,
-0.04364013671875,
0.005771636962890625,
0.023284912109375,
-0.05755615234375,
0.061370849609375,
0.0301361083984375,
0.05987548828125,
0.0262451171875,
0.0117034912109375,
-0.02685546875,
-0.0245208740234375,
-0.051666259765625,
0.0270233154296875,
0.01105499267578125,
-0.02423095703125,
-0.038543701171875,
0.02850341796875,
0.037445068359375,
-0.023895263671875,
0.0638427734375,
-0.039306640625,
0.00907135009765625,
-0.017364501953125,
-0.032745361328125,
-0.0206756591796875,
0.00768280029296875,
-0.0693359375,
0.07244873046875,
0.060577392578125,
-0.040863037109375,
0.03594970703125,
-0.06451416015625,
-0.0276947021484375,
-0.003582000732421875,
0.01336669921875,
-0.053466796875,
0.0211334228515625,
-0.015777587890625,
0.02764892578125,
-0.018096923828125,
0.0225677490234375,
-0.0208740234375,
-0.009521484375,
-0.004077911376953125,
-0.0428466796875,
0.08319091796875,
0.0352783203125,
-0.00873565673828125,
0.0191650390625,
-0.061126708984375,
-0.005771636962890625,
0.034698486328125,
0.0263214111328125,
-0.037109375,
-0.01654052734375,
0.037750244140625,
-0.0059967041015625,
0.017242431640625,
-0.029144287109375,
0.0166778564453125,
0.01025390625,
0.020263671875,
0.0303955078125,
-0.00527191162109375,
0.029510498046875,
-0.051483154296875,
0.038543701171875,
-0.0258026123046875,
0.02655029296875,
0.005878448486328125,
-0.0228729248046875,
-0.076416015625,
-0.030792236328125,
0.0184173583984375,
0.032196044921875,
-0.003894805908203125,
0.039306640625,
-0.00847625732421875,
-0.07501220703125,
-0.01352691650390625,
-0.0306549072265625,
0.0022411346435546875,
0.0296783447265625,
0.047760009765625,
-0.0255889892578125,
-0.0650634765625,
-0.06561279296875,
0.0257415771484375,
0.0036220550537109375,
-0.0003447532653808594,
0.0156707763671875,
0.028350830078125,
-0.032958984375,
0.0577392578125,
-0.043121337890625,
-0.0122833251953125,
-0.0305633544921875,
0.0174560546875,
0.032684326171875,
0.024017333984375,
0.06878662109375,
-0.042083740234375,
-0.0560302734375,
-0.0113525390625,
-0.0313720703125,
-0.0154266357421875,
-0.0004875659942626953,
-0.004184722900390625,
0.00595855712890625,
0.0496826171875,
-0.0296783447265625,
0.04473876953125,
0.037872314453125,
-0.0537109375,
0.037841796875,
-0.004337310791015625,
0.0090484619140625,
-0.0765380859375,
0.016754150390625,
-0.00884246826171875,
-0.06781005859375,
-0.035797119140625,
0.01265716552734375,
0.0199737548828125,
0.0176849365234375,
-0.03265380859375,
0.052734375,
-0.0350341796875,
0.0006241798400878906,
-0.005649566650390625,
0.00678253173828125,
0.0279388427734375,
0.00823974609375,
-0.01039886474609375,
0.0290679931640625,
0.0290679931640625,
-0.03125,
0.03179931640625,
0.050079345703125,
-0.0227508544921875,
0.032318115234375,
-0.060028076171875,
-0.032470703125,
0.0118408203125,
0.038970947265625,
-0.07110595703125,
-0.0109710693359375,
0.0166168212890625,
-0.014495849609375,
0.034210205078125,
0.001064300537109375,
-0.046905517578125,
-0.03265380859375,
-0.0228424072265625,
0.051116943359375,
0.0689697265625,
-0.0237884521484375,
0.0023059844970703125,
0.0288238525390625,
-0.0117340087890625,
-0.03009033203125,
-0.0445556640625,
0.0018587112426757812,
-0.028350830078125,
-0.027587890625,
0.0283355712890625,
-0.01395416259765625,
-0.032806396484375,
-0.03125,
-0.0142669677734375,
-0.0147705078125,
-0.005657196044921875,
0.014404296875,
0.0278778076171875,
-0.0208892822265625,
-0.0162353515625,
0.02166748046875,
-0.0269012451171875,
0.00614166259765625,
0.039581298828125,
0.06915283203125,
-0.0209197998046875,
-0.0081024169921875,
-0.027008056640625,
-0.00357818603515625,
0.055755615234375,
0.0080718994140625,
0.01070404052734375,
0.0482177734375,
-0.0278167724609375,
0.00498199462890625,
-0.0211639404296875,
-0.04638671875,
-0.039459228515625,
0.0276947021484375,
-0.0198974609375,
-0.040191650390625,
0.045074462890625,
0.01262664794921875,
0.007373809814453125,
0.063720703125,
0.038543701171875,
-0.01364898681640625,
0.0677490234375,
0.04241943359375,
0.0185394287109375,
0.0253753662109375,
-0.05987548828125,
0.0169677734375,
-0.047698974609375,
-0.0020084381103515625,
-0.0170745849609375,
-0.038665771484375,
-0.055389404296875,
-0.034332275390625,
0.0153656005859375,
0.0279388427734375,
-0.0655517578125,
0.07806396484375,
-0.00930023193359375,
0.030059814453125,
0.045867919921875,
-0.009246826171875,
0.0328369140625,
-0.02850341796875,
0.00493621826171875,
-0.021209716796875,
-0.032958984375,
-0.038482666015625,
0.06268310546875,
0.051971435546875,
0.0738525390625,
0.0084381103515625,
0.039154052734375,
0.0237884521484375,
0.01235198974609375,
-0.050323486328125,
0.026275634765625,
-0.033294677734375,
-0.10064697265625,
-0.0372314453125,
-0.0173187255859375,
-0.053558349609375,
-0.0168609619140625,
-0.0159759521484375,
-0.0281829833984375,
0.0180511474609375,
0.002269744873046875,
-0.0219879150390625,
0.031707763671875,
-0.0312042236328125,
0.0631103515625,
-0.0140838623046875,
0.00791168212890625,
-0.0263214111328125,
-0.0628662109375,
0.0210113525390625,
-0.0071563720703125,
-0.00844573974609375,
-0.0301513671875,
0.0061187744140625,
0.0301666259765625,
-0.039154052734375,
0.061553955078125,
0.0122528076171875,
-0.001895904541015625,
0.0180206298828125,
0.01332855224609375,
0.0014295578002929688,
0.037384033203125,
0.0129547119140625,
0.03656005859375,
0.0265350341796875,
-0.025146484375,
-0.0178070068359375,
0.01552581787109375,
-0.045745849609375,
-0.00771331787109375,
-0.032562255859375,
-0.052734375,
0.0085296630859375,
0.02264404296875,
0.052734375,
0.049102783203125,
-0.037689208984375,
0.0041046142578125,
0.0137481689453125,
0.008331298828125,
0.046478271484375,
0.040863037109375,
-0.04742431640625,
-0.0203094482421875,
0.06610107421875,
0.00569915771484375,
0.03192138671875,
0.03314208984375,
0.02410888671875,
-0.036468505859375,
0.01071929931640625,
-0.0270233154296875,
0.0289306640625,
-0.03228759765625,
-0.035064697265625,
-0.0185546875,
-0.04815673828125,
-0.020660400390625,
-0.0112762451171875,
-0.0521240234375,
-0.07635498046875,
-0.035980224609375,
-0.00794219970703125,
0.045684814453125,
0.062408447265625,
-0.04437255859375,
0.034698486328125,
-0.052703857421875,
0.0309600830078125,
0.00498199462890625,
0.047393798828125,
-0.02337646484375,
-0.06268310546875,
-0.0246429443359375,
0.00995635986328125,
-0.055633544921875,
-0.028350830078125,
0.0229339599609375,
0.0013303756713867188,
0.018218994140625,
0.058441162109375,
0.0240020751953125,
0.03472900390625,
-0.048095703125,
0.051483154296875,
0.038665771484375,
-0.07366943359375,
-0.00394439697265625,
-0.031158447265625,
-0.00267791748046875,
0.037384033203125,
0.042022705078125,
-0.045379638671875,
-0.028533935546875,
-0.08392333984375,
-0.05694580078125,
0.028533935546875,
0.015838623046875,
0.027191162109375,
0.00807952880859375,
-0.014984130859375,
0.036651611328125,
0.00426483154296875,
-0.053802490234375,
-0.06646728515625,
-0.0206451416015625,
-0.0026798248291015625,
-0.02301025390625,
-0.005733489990234375,
-0.0261383056640625,
-0.062103271484375,
0.036956787109375,
0.01294708251953125,
0.01678466796875,
-0.0233917236328125,
0.03021240234375,
-0.0072174072265625,
-0.003681182861328125,
0.054351806640625,
0.04541015625,
-0.033843994140625,
-0.00817108154296875,
0.0178070068359375,
-0.02508544921875,
0.00713348388671875,
0.00896453857421875,
0.015838623046875,
0.0128326416015625,
0.0231170654296875,
0.071533203125,
0.00039577484130859375,
-0.038330078125,
0.0297393798828125,
-0.0007138252258300781,
-0.021728515625,
-0.06463623046875,
0.026824951171875,
0.001987457275390625,
0.025634765625,
0.00843048095703125,
0.01441192626953125,
0.019378662109375,
-0.036407470703125,
0.03143310546875,
0.0214080810546875,
-0.05755615234375,
-0.02191162109375,
0.060577392578125,
-0.00800323486328125,
-0.028900146484375,
0.0501708984375,
-0.0289306640625,
-0.005481719970703125,
0.046722412109375,
0.04132080078125,
0.045166015625,
-0.0010595321655273438,
0.019134521484375,
0.0285186767578125,
0.006496429443359375,
-0.0003056526184082031,
0.0285797119140625,
0.0119476318359375,
-0.046905517578125,
-0.01467132568359375,
-0.01519012451171875,
-0.046875,
0.005069732666015625,
-0.060577392578125,
0.0694580078125,
-0.031982421875,
-0.032257080078125,
-0.0005006790161132812,
-0.002292633056640625,
-0.049041748046875,
0.030731201171875,
0.0135345458984375,
0.09869384765625,
-0.060333251953125,
0.0452880859375,
0.04376220703125,
-0.05426025390625,
-0.049835205078125,
-0.0238800048828125,
0.002429962158203125,
-0.06097412109375,
0.03131103515625,
-0.00807952880859375,
0.003543853759765625,
0.0035800933837890625,
-0.054931640625,
-0.0740966796875,
0.08770751953125,
0.01499176025390625,
-0.056243896484375,
-0.004772186279296875,
-0.00844573974609375,
0.03564453125,
-0.0003345012664794922,
0.050933837890625,
0.05902099609375,
0.050079345703125,
-0.01219940185546875,
-0.07080078125,
-0.0164794921875,
-0.021881103515625,
-0.04248046875,
0.036590576171875,
-0.07476806640625,
0.07122802734375,
-0.0245208740234375,
-0.007419586181640625,
0.029632568359375,
0.0287628173828125,
0.006622314453125,
0.0452880859375,
0.042724609375,
0.061737060546875,
0.03570556640625,
-0.0261383056640625,
0.0562744140625,
-0.0016565322875976562,
0.047576904296875,
0.081298828125,
0.00522613525390625,
0.02886962890625,
0.0139923095703125,
-0.042877197265625,
0.054595947265625,
0.061859130859375,
-0.024383544921875,
0.03704833984375,
0.019195556640625,
0.0221099853515625,
-0.00910186767578125,
0.0217132568359375,
-0.031341552734375,
0.036102294921875,
0.0328369140625,
-0.027862548828125,
-0.01522064208984375,
-0.02154541015625,
0.02984619140625,
-0.0045013427734375,
-0.01448822021484375,
0.03857421875,
0.028167724609375,
-0.00545501708984375,
0.046234130859375,
-0.003047943115234375,
0.024566650390625,
-0.0421142578125,
-0.0032520294189453125,
-0.037078857421875,
0.007480621337890625,
-0.00940704345703125,
-0.0721435546875,
0.03570556640625,
-0.0197296142578125,
-0.0269622802734375,
-0.0190887451171875,
0.006450653076171875,
-0.0024929046630859375,
-0.054229736328125,
0.0179443359375,
0.036651611328125,
0.02008056640625,
0.0112152099609375,
-0.040802001953125,
-0.024566650390625,
-0.0098876953125,
-0.012542724609375,
-0.006656646728515625,
0.0303955078125,
-0.0170440673828125,
0.0570068359375,
0.00954437255859375,
-0.0160064697265625,
-0.01070404052734375,
0.01393890380859375,
0.04583740234375,
-0.0445556640625,
-0.03424072265625,
-0.04681396484375,
0.07293701171875,
-0.022369384765625,
-0.057281494140625,
0.04547119140625,
0.05615234375,
0.050506591796875,
-0.0156402587890625,
0.03271484375,
-0.0192108154296875,
0.052734375,
-0.00310516357421875,
0.0726318359375,
-0.07403564453125,
-0.0175323486328125,
-0.03863525390625,
-0.0733642578125,
-0.0303955078125,
0.043060302734375,
0.0113372802734375,
0.004772186279296875,
0.06927490234375,
0.03546142578125,
-0.035552978515625,
0.00424957275390625,
0.033172607421875,
0.010650634765625,
0.006626129150390625,
-0.027496337890625,
0.03973388671875,
-0.04730224609375,
0.013275146484375,
-0.03289794921875,
-0.0185394287109375,
-0.041290283203125,
-0.060577392578125,
-0.055328369140625,
-0.055755615234375,
-0.01019287109375,
-0.020965576171875,
-0.026885986328125,
0.049224853515625,
0.07318115234375,
-0.060302734375,
-0.023284912109375,
-0.0118255615234375,
0.0045318603515625,
-0.025848388671875,
-0.018157958984375,
0.0233917236328125,
-0.0208740234375,
-0.06695556640625,
0.00884246826171875,
-0.00015056133270263672,
0.0238800048828125,
-0.0202789306640625,
-0.0033054351806640625,
0.01441192626953125,
0.0252685546875,
0.03668212890625,
0.0111541748046875,
-0.0299072265625,
-0.0272216796875,
0.01428985595703125,
-0.0249786376953125,
0.0018548965454101562,
0.052947998046875,
-0.0121002197265625,
-0.00884246826171875,
0.045745849609375,
0.00222015380859375,
0.05133056640625,
0.0261077880859375,
0.035797119140625,
-0.050537109375,
0.0035572052001953125,
0.01641845703125,
0.04412841796875,
0.0135345458984375,
0.0049896240234375,
0.06658935546875,
0.0256195068359375,
-0.060577392578125,
-0.0445556640625,
0.01690673828125,
-0.0999755859375,
0.00923919677734375,
0.05340576171875,
0.0160064697265625,
0.023223876953125,
0.0032176971435546875,
-0.039459228515625,
0.031707763671875,
-0.0258941650390625,
0.030487060546875,
0.0222930908203125,
-0.0185394287109375,
-0.01396942138671875,
-0.037933349609375,
0.018524169921875,
0.01483917236328125,
-0.03759765625,
0.00473785400390625,
0.0237274169921875,
0.0263519287109375,
0.025146484375,
0.0304412841796875,
-0.01358795166015625,
0.0142364501953125,
0.030609130859375,
0.0196380615234375,
0.0176239013671875,
-0.0330810546875,
0.02178955078125,
-0.044036865234375,
0.0027313232421875,
-0.01200103759765625
]
] |
timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320 | 2023-03-31T22:28:54.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:laion-2b",
"arxiv:2210.08402",
"arxiv:2201.03545",
"arxiv:2103.00020",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320 | 4 | 15,674 | timm | 2023-03-31T22:26:03 | ---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
datasets:
- imagenet-1k
- laion-2b
---
# Model card for convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320
A ConvNeXt image classification model. CLIP image tower weights pretrained in [OpenCLIP](https://github.com/mlfoundations/open_clip) on LAION and fine-tuned on ImageNet-12k followed by ImageNet-1k in `timm` bby Ross Wightman.
Please see related OpenCLIP model cards for more details on pretrain:
* https://huggingface.co/laion/CLIP-convnext_xxlarge-laion2B-s34B-b82K-augreg-soup
* https://huggingface.co/laion/CLIP-convnext_large_d.laion2B-s26B-b102K-augreg
* https://huggingface.co/laion/CLIP-convnext_base_w-laion2B-s13B-b82K-augreg
* https://huggingface.co/laion/CLIP-convnext_base_w_320-laion_aesthetic-s13B-b82K
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 200.1
- GMACs: 70.2
- Activations (M): 88.0
- Image size: 320 x 320
- **Papers:**
- LAION-5B: An open large-scale dataset for training next generation image-text models: https://arxiv.org/abs/2210.08402
- A ConvNet for the 2020s: https://arxiv.org/abs/2201.03545
- Learning Transferable Visual Models From Natural Language Supervision: https://arxiv.org/abs/2103.00020
- **Original:** https://github.com/mlfoundations/open_clip
- **Pretrain Dataset:** LAION-2B
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 192, 80, 80])
# torch.Size([1, 384, 40, 40])
# torch.Size([1, 768, 20, 20])
# torch.Size([1, 1536, 10, 10])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1536, 10, 10) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
All timing numbers from eager model PyTorch 1.13 on RTX 3090 w/ AMP.
| model |top1 |top5 |img_size|param_count|gmacs |macts |samples_per_sec|batch_size|
|------------------------------------------------------------------------------------------------------------------------------|------|------|--------|-----------|------|------|---------------|----------|
| [convnextv2_huge.fcmae_ft_in22k_in1k_512](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in22k_in1k_512) |88.848|98.742|512 |660.29 |600.81|413.07|28.58 |48 |
| [convnextv2_huge.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in22k_in1k_384) |88.668|98.738|384 |660.29 |337.96|232.35|50.56 |64 |
| [convnext_xxlarge.clip_laion2b_soup_ft_in1k](https://huggingface.co/timm/convnext_xxlarge.clip_laion2b_soup_ft_in1k) |88.612|98.704|256 |846.47 |198.09|124.45|122.45 |256 |
| [convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384) |88.312|98.578|384 |200.13 |101.11|126.74|196.84 |256 |
| [convnextv2_large.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in22k_in1k_384) |88.196|98.532|384 |197.96 |101.1 |126.74|128.94 |128 |
| [convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320) |87.968|98.47 |320 |200.13 |70.21 |88.02 |283.42 |256 |
| [convnext_xlarge.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k_384) |87.75 |98.556|384 |350.2 |179.2 |168.99|124.85 |192 |
| [convnextv2_base.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in22k_in1k_384) |87.646|98.422|384 |88.72 |45.21 |84.49 |209.51 |256 |
| [convnext_large.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k_384) |87.476|98.382|384 |197.77 |101.1 |126.74|194.66 |256 |
| [convnext_large_mlp.clip_laion2b_augreg_ft_in1k](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_augreg_ft_in1k) |87.344|98.218|256 |200.13 |44.94 |56.33 |438.08 |256 |
| [convnextv2_large.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in22k_in1k) |87.26 |98.248|224 |197.96 |34.4 |43.13 |376.84 |256 |
| [convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384) |87.138|98.212|384 |88.59 |45.21 |84.49 |365.47 |256 |
| [convnext_xlarge.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k) |87.002|98.208|224 |350.2 |60.98 |57.5 |368.01 |256 |
| [convnext_base.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k_384) |86.796|98.264|384 |88.59 |45.21 |84.49 |366.54 |256 |
| [convnextv2_base.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in22k_in1k) |86.74 |98.022|224 |88.72 |15.38 |28.75 |624.23 |256 |
| [convnext_large.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k) |86.636|98.028|224 |197.77 |34.4 |43.13 |581.43 |256 |
| [convnext_base.clip_laiona_augreg_ft_in1k_384](https://huggingface.co/timm/convnext_base.clip_laiona_augreg_ft_in1k_384) |86.504|97.97 |384 |88.59 |45.21 |84.49 |368.14 |256 |
| [convnext_base.clip_laion2b_augreg_ft_in12k_in1k](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in12k_in1k) |86.344|97.97 |256 |88.59 |20.09 |37.55 |816.14 |256 |
| [convnextv2_huge.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in1k) |86.256|97.75 |224 |660.29 |115.0 |79.07 |154.72 |256 |
| [convnext_small.in12k_ft_in1k_384](https://huggingface.co/timm/convnext_small.in12k_ft_in1k_384) |86.182|97.92 |384 |50.22 |25.58 |63.37 |516.19 |256 |
| [convnext_base.clip_laion2b_augreg_ft_in1k](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in1k) |86.154|97.68 |256 |88.59 |20.09 |37.55 |819.86 |256 |
| [convnext_base.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k) |85.822|97.866|224 |88.59 |15.38 |28.75 |1037.66 |256 |
| [convnext_small.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_small.fb_in22k_ft_in1k_384) |85.778|97.886|384 |50.22 |25.58 |63.37 |518.95 |256 |
| [convnextv2_large.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in1k) |85.742|97.584|224 |197.96 |34.4 |43.13 |375.23 |256 |
| [convnext_small.in12k_ft_in1k](https://huggingface.co/timm/convnext_small.in12k_ft_in1k) |85.174|97.506|224 |50.22 |8.71 |21.56 |1474.31 |256 |
| [convnext_tiny.in12k_ft_in1k_384](https://huggingface.co/timm/convnext_tiny.in12k_ft_in1k_384) |85.118|97.608|384 |28.59 |13.14 |39.48 |856.76 |256 |
| [convnextv2_tiny.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in22k_in1k_384) |85.112|97.63 |384 |28.64 |13.14 |39.48 |491.32 |256 |
| [convnextv2_base.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in1k) |84.874|97.09 |224 |88.72 |15.38 |28.75 |625.33 |256 |
| [convnext_small.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_small.fb_in22k_ft_in1k) |84.562|97.394|224 |50.22 |8.71 |21.56 |1478.29 |256 |
| [convnext_large.fb_in1k](https://huggingface.co/timm/convnext_large.fb_in1k) |84.282|96.892|224 |197.77 |34.4 |43.13 |584.28 |256 |
| [convnext_tiny.in12k_ft_in1k](https://huggingface.co/timm/convnext_tiny.in12k_ft_in1k) |84.186|97.124|224 |28.59 |4.47 |13.44 |2433.7 |256 |
| [convnext_tiny.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_tiny.fb_in22k_ft_in1k_384) |84.084|97.14 |384 |28.59 |13.14 |39.48 |862.95 |256 |
| [convnextv2_tiny.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in22k_in1k) |83.894|96.964|224 |28.64 |4.47 |13.44 |1452.72 |256 |
| [convnext_base.fb_in1k](https://huggingface.co/timm/convnext_base.fb_in1k) |83.82 |96.746|224 |88.59 |15.38 |28.75 |1054.0 |256 |
| [convnextv2_nano.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in22k_in1k_384) |83.37 |96.742|384 |15.62 |7.22 |24.61 |801.72 |256 |
| [convnext_small.fb_in1k](https://huggingface.co/timm/convnext_small.fb_in1k) |83.142|96.434|224 |50.22 |8.71 |21.56 |1464.0 |256 |
| [convnextv2_tiny.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in1k) |82.92 |96.284|224 |28.64 |4.47 |13.44 |1425.62 |256 |
| [convnext_tiny.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_tiny.fb_in22k_ft_in1k) |82.898|96.616|224 |28.59 |4.47 |13.44 |2480.88 |256 |
| [convnext_nano.in12k_ft_in1k](https://huggingface.co/timm/convnext_nano.in12k_ft_in1k) |82.282|96.344|224 |15.59 |2.46 |8.37 |3926.52 |256 |
| [convnext_tiny_hnf.a2h_in1k](https://huggingface.co/timm/convnext_tiny_hnf.a2h_in1k) |82.216|95.852|224 |28.59 |4.47 |13.44 |2529.75 |256 |
| [convnext_tiny.fb_in1k](https://huggingface.co/timm/convnext_tiny.fb_in1k) |82.066|95.854|224 |28.59 |4.47 |13.44 |2346.26 |256 |
| [convnextv2_nano.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in22k_in1k) |82.03 |96.166|224 |15.62 |2.46 |8.37 |2300.18 |256 |
| [convnextv2_nano.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in1k) |81.83 |95.738|224 |15.62 |2.46 |8.37 |2321.48 |256 |
| [convnext_nano_ols.d1h_in1k](https://huggingface.co/timm/convnext_nano_ols.d1h_in1k) |80.866|95.246|224 |15.65 |2.65 |9.38 |3523.85 |256 |
| [convnext_nano.d1h_in1k](https://huggingface.co/timm/convnext_nano.d1h_in1k) |80.768|95.334|224 |15.59 |2.46 |8.37 |3915.58 |256 |
| [convnextv2_pico.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_pico.fcmae_ft_in1k) |80.304|95.072|224 |9.07 |1.37 |6.1 |3274.57 |256 |
| [convnext_pico.d1_in1k](https://huggingface.co/timm/convnext_pico.d1_in1k) |79.526|94.558|224 |9.05 |1.37 |6.1 |5686.88 |256 |
| [convnext_pico_ols.d1_in1k](https://huggingface.co/timm/convnext_pico_ols.d1_in1k) |79.522|94.692|224 |9.06 |1.43 |6.5 |5422.46 |256 |
| [convnextv2_femto.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_femto.fcmae_ft_in1k) |78.488|93.98 |224 |5.23 |0.79 |4.57 |4264.2 |256 |
| [convnext_femto_ols.d1_in1k](https://huggingface.co/timm/convnext_femto_ols.d1_in1k) |77.86 |93.83 |224 |5.23 |0.82 |4.87 |6910.6 |256 |
| [convnext_femto.d1_in1k](https://huggingface.co/timm/convnext_femto.d1_in1k) |77.454|93.68 |224 |5.22 |0.79 |4.57 |7189.92 |256 |
| [convnextv2_atto.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_atto.fcmae_ft_in1k) |76.664|93.044|224 |3.71 |0.55 |3.81 |4728.91 |256 |
| [convnext_atto_ols.a2_in1k](https://huggingface.co/timm/convnext_atto_ols.a2_in1k) |75.88 |92.846|224 |3.7 |0.58 |4.11 |7963.16 |256 |
| [convnext_atto.d2_in1k](https://huggingface.co/timm/convnext_atto.d2_in1k) |75.664|92.9 |224 |3.7 |0.55 |3.81 |8439.22 |256 |
## Citation
```bibtex
@software{ilharco_gabriel_2021_5143773,
author = {Ilharco, Gabriel and
Wortsman, Mitchell and
Wightman, Ross and
Gordon, Cade and
Carlini, Nicholas and
Taori, Rohan and
Dave, Achal and
Shankar, Vaishaal and
Namkoong, Hongseok and
Miller, John and
Hajishirzi, Hannaneh and
Farhadi, Ali and
Schmidt, Ludwig},
title = {OpenCLIP},
month = jul,
year = 2021,
note = {If you use this software, please cite it as below.},
publisher = {Zenodo},
version = {0.1},
doi = {10.5281/zenodo.5143773},
url = {https://doi.org/10.5281/zenodo.5143773}
}
```
```bibtex
@inproceedings{schuhmann2022laionb,
title={{LAION}-5B: An open large-scale dataset for training next generation image-text models},
author={Christoph Schuhmann and
Romain Beaumont and
Richard Vencu and
Cade W Gordon and
Ross Wightman and
Mehdi Cherti and
Theo Coombes and
Aarush Katta and
Clayton Mullis and
Mitchell Wortsman and
Patrick Schramowski and
Srivatsa R Kundurthy and
Katherine Crowson and
Ludwig Schmidt and
Robert Kaczmarczyk and
Jenia Jitsev},
booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
year={2022},
url={https://openreview.net/forum?id=M3Y74vmsMcY}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{Radford2021LearningTV,
title={Learning Transferable Visual Models From Natural Language Supervision},
author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
booktitle={ICML},
year={2021}
}
```
```bibtex
@article{liu2022convnet,
author = {Zhuang Liu and Hanzi Mao and Chao-Yuan Wu and Christoph Feichtenhofer and Trevor Darrell and Saining Xie},
title = {A ConvNet for the 2020s},
journal = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2022},
}
```
| 18,579 | [
[
-0.059356689453125,
-0.034088134765625,
-0.0017862319946289062,
0.037078857421875,
-0.028900146484375,
-0.018646240234375,
-0.015625,
-0.033782958984375,
0.05743408203125,
0.0191650390625,
-0.04071044921875,
-0.044830322265625,
-0.053680419921875,
-0.0011119842529296875,
0.004970550537109375,
0.06951904296875,
-0.0072479248046875,
-0.007770538330078125,
0.01526641845703125,
-0.0291900634765625,
-0.0169677734375,
-0.02752685546875,
-0.060455322265625,
-0.015655517578125,
0.0202789306640625,
0.021514892578125,
0.05810546875,
0.04644775390625,
0.032684326171875,
0.03887939453125,
-0.017059326171875,
0.01107025146484375,
-0.018524169921875,
-0.0242156982421875,
0.036529541015625,
-0.033782958984375,
-0.06658935546875,
0.0193328857421875,
0.060638427734375,
0.0361328125,
0.00344085693359375,
0.0186920166015625,
0.0261383056640625,
0.03448486328125,
-0.00566864013671875,
0.0010957717895507812,
-0.01247406005859375,
0.01617431640625,
-0.01812744140625,
-0.0019779205322265625,
0.00014531612396240234,
-0.048126220703125,
0.0255126953125,
-0.0439453125,
0.005710601806640625,
0.0009927749633789062,
0.10540771484375,
-0.00844573974609375,
-0.01422882080078125,
-0.0010776519775390625,
0.005321502685546875,
0.056488037109375,
-0.061187744140625,
0.0199737548828125,
0.0298004150390625,
-0.0079498291015625,
-0.01036834716796875,
-0.0537109375,
-0.044586181640625,
-0.004909515380859375,
-0.02392578125,
0.0169830322265625,
-0.0260162353515625,
-0.006488800048828125,
0.039306640625,
0.03363037109375,
-0.0379638671875,
-0.0047760009765625,
-0.02728271484375,
-0.006488800048828125,
0.055084228515625,
-0.0074462890625,
0.047210693359375,
-0.0252838134765625,
-0.05120849609375,
-0.0231170654296875,
-0.0196990966796875,
0.03076171875,
0.01296234130859375,
-0.0030498504638671875,
-0.07135009765625,
0.04034423828125,
0.01259613037109375,
0.0261993408203125,
0.021759033203125,
-0.0194854736328125,
0.057861328125,
-0.0157928466796875,
-0.038330078125,
-0.0205078125,
0.0904541015625,
0.052337646484375,
0.033203125,
0.013397216796875,
0.0013837814331054688,
-0.011566162109375,
-0.033660888671875,
-0.0792236328125,
-0.01548004150390625,
0.0263671875,
-0.042572021484375,
-0.01232147216796875,
0.0263214111328125,
-0.057769775390625,
0.00830841064453125,
-0.00682830810546875,
0.0187530517578125,
-0.05975341796875,
-0.0287017822265625,
-0.005992889404296875,
-0.02740478515625,
0.0272064208984375,
0.0204925537109375,
-0.0292510986328125,
0.01983642578125,
0.0204620361328125,
0.0789794921875,
0.0171051025390625,
-0.019561767578125,
-0.0309600830078125,
-0.0038585662841796875,
-0.025299072265625,
0.0300445556640625,
0.005123138427734375,
-0.00949859619140625,
-0.0180816650390625,
0.031829833984375,
-0.01427459716796875,
-0.034088134765625,
0.03143310546875,
0.01702880859375,
0.011322021484375,
-0.0270843505859375,
-0.0267486572265625,
-0.0186309814453125,
0.0285491943359375,
-0.039459228515625,
0.080810546875,
0.033233642578125,
-0.08160400390625,
0.02313232421875,
-0.035247802734375,
-0.00823211669921875,
-0.0226287841796875,
0.0024929046630859375,
-0.056732177734375,
-0.01016998291015625,
0.0243988037109375,
0.058685302734375,
-0.0138702392578125,
-0.00982666015625,
-0.032928466796875,
-0.00678253173828125,
0.0281524658203125,
0.004878997802734375,
0.0728759765625,
0.0101470947265625,
-0.0322265625,
0.00699615478515625,
-0.046966552734375,
0.0211944580078125,
0.03338623046875,
-0.00698089599609375,
-0.006549835205078125,
-0.057159423828125,
0.004108428955078125,
0.039031982421875,
0.0137176513671875,
-0.036376953125,
0.0209808349609375,
-0.0169219970703125,
0.033294677734375,
0.051910400390625,
-0.005893707275390625,
0.0229644775390625,
-0.04376220703125,
0.0400390625,
0.01067352294921875,
0.018310546875,
-0.005428314208984375,
-0.0322265625,
-0.0556640625,
-0.053466796875,
0.018524169921875,
0.037200927734375,
-0.035675048828125,
0.05560302734375,
0.004589080810546875,
-0.042510986328125,
-0.056549072265625,
0.01236724853515625,
0.036865234375,
0.0229339599609375,
0.020599365234375,
-0.0278167724609375,
-0.04754638671875,
-0.06951904296875,
-0.00453948974609375,
0.00557708740234375,
-0.0070037841796875,
0.0447998046875,
0.0309600830078125,
-0.004062652587890625,
0.0447998046875,
-0.03643798828125,
-0.0251922607421875,
-0.0174713134765625,
-0.005512237548828125,
0.031707763671875,
0.056549072265625,
0.0826416015625,
-0.060760498046875,
-0.06695556640625,
-0.0021648406982421875,
-0.0823974609375,
-0.0007171630859375,
-0.00388336181640625,
-0.0294647216796875,
0.0198211669921875,
0.02001953125,
-0.0740966796875,
0.047698974609375,
0.0298004150390625,
-0.0404052734375,
0.035736083984375,
-0.0214080810546875,
0.021759033203125,
-0.0794677734375,
0.0185546875,
0.022308349609375,
-0.0207672119140625,
-0.036956787109375,
0.0031299591064453125,
-0.004421234130859375,
0.00934600830078125,
-0.04827880859375,
0.0662841796875,
-0.05316162109375,
0.0032672882080078125,
0.00299072265625,
0.01171875,
0.004528045654296875,
0.041412353515625,
0.001293182373046875,
0.03515625,
0.060638427734375,
-0.0245513916015625,
0.03179931640625,
0.038421630859375,
-0.00759124755859375,
0.05560302734375,
-0.05255126953125,
0.00566864013671875,
0.005702972412109375,
0.035675048828125,
-0.06732177734375,
-0.02850341796875,
0.042816162109375,
-0.053070068359375,
0.03729248046875,
-0.0209197998046875,
-0.0295867919921875,
-0.059783935546875,
-0.06463623046875,
0.0199737548828125,
0.044830322265625,
-0.0484619140625,
0.014495849609375,
0.0209197998046875,
0.0027103424072265625,
-0.046630859375,
-0.050262451171875,
-0.00759124755859375,
-0.0293731689453125,
-0.0633544921875,
0.0318603515625,
0.00505828857421875,
-0.003414154052734375,
0.0013484954833984375,
-0.001026153564453125,
-0.00237274169921875,
-0.014984130859375,
0.03875732421875,
0.03131103515625,
-0.019744873046875,
-0.0249481201171875,
-0.0223541259765625,
-0.004795074462890625,
0.0005402565002441406,
-0.007335662841796875,
0.0482177734375,
-0.0232086181640625,
0.007602691650390625,
-0.0760498046875,
0.01247406005859375,
0.04705810546875,
-0.00704193115234375,
0.0692138671875,
0.07989501953125,
-0.032073974609375,
0.0104827880859375,
-0.03009033203125,
-0.00913238525390625,
-0.037628173828125,
0.0019359588623046875,
-0.03839111328125,
-0.044769287109375,
0.058013916015625,
0.013824462890625,
-0.004779815673828125,
0.0513916015625,
0.02655029296875,
-0.01531219482421875,
0.0673828125,
0.039398193359375,
-0.0095977783203125,
0.045440673828125,
-0.0662841796875,
-0.0022678375244140625,
-0.0660400390625,
-0.041961669921875,
-0.010833740234375,
-0.042022705078125,
-0.048675537109375,
-0.0335693359375,
0.0229339599609375,
0.0352783203125,
-0.013275146484375,
0.046234130859375,
-0.042449951171875,
-0.0026416778564453125,
0.039642333984375,
0.026458740234375,
-0.01739501953125,
-0.01067352294921875,
-0.01198577880859375,
-0.01316070556640625,
-0.045257568359375,
-0.013702392578125,
0.057403564453125,
0.051666259765625,
0.035125732421875,
-0.004459381103515625,
0.039093017578125,
-0.007724761962890625,
0.024383544921875,
-0.03857421875,
0.05499267578125,
-0.002960205078125,
-0.0343017578125,
-0.01513671875,
-0.03607177734375,
-0.0748291015625,
0.0133819580078125,
-0.028656005859375,
-0.06207275390625,
-0.01139068603515625,
0.0164031982421875,
-0.0173187255859375,
0.0428466796875,
-0.05328369140625,
0.05908203125,
-0.007656097412109375,
-0.0333251953125,
0.0084686279296875,
-0.06280517578125,
0.02081298828125,
0.03350830078125,
-0.005275726318359375,
-0.012847900390625,
0.01110076904296875,
0.064453125,
-0.059539794921875,
0.042449951171875,
-0.0286712646484375,
0.0028228759765625,
0.038665771484375,
-0.007205963134765625,
0.03350830078125,
0.0079498291015625,
0.0006732940673828125,
0.0064544677734375,
0.0111236572265625,
-0.04571533203125,
-0.032257080078125,
0.047149658203125,
-0.054779052734375,
-0.0284423828125,
-0.038909912109375,
-0.0248565673828125,
0.01345062255859375,
0.00431060791015625,
0.049713134765625,
0.040557861328125,
-0.00722503662109375,
0.01503753662109375,
0.041015625,
-0.029632568359375,
0.0360107421875,
-0.010894775390625,
-0.0045623779296875,
-0.040618896484375,
0.059661865234375,
0.0035915374755859375,
0.0113067626953125,
0.00720977783203125,
0.01187896728515625,
-0.02783203125,
-0.01561737060546875,
-0.01349639892578125,
0.04876708984375,
-0.021453857421875,
-0.0294647216796875,
-0.04974365234375,
-0.03472900390625,
-0.045928955078125,
-0.0223541259765625,
-0.0298309326171875,
-0.02008056640625,
-0.028411865234375,
0.0072021484375,
0.049285888671875,
0.04034423828125,
-0.02386474609375,
0.031158447265625,
-0.051910400390625,
0.0257720947265625,
0.005359649658203125,
0.030914306640625,
-0.014984130859375,
-0.04632568359375,
0.002742767333984375,
0.0030384063720703125,
-0.0229034423828125,
-0.060882568359375,
0.04705810546875,
0.01491546630859375,
0.0279083251953125,
0.04302978515625,
-0.020599365234375,
0.059234619140625,
-0.01099395751953125,
0.04010009765625,
0.045562744140625,
-0.0640869140625,
0.036346435546875,
-0.02508544921875,
0.01178741455078125,
0.0095672607421875,
0.03240966796875,
-0.036651611328125,
-0.02386474609375,
-0.06829833984375,
-0.048919677734375,
0.05511474609375,
0.0118408203125,
-0.0024261474609375,
0.005615234375,
0.04327392578125,
-0.00701904296875,
0.01110076904296875,
-0.050567626953125,
-0.049346923828125,
-0.01800537109375,
-0.0146026611328125,
-0.004314422607421875,
-0.00821685791015625,
-0.005489349365234375,
-0.0513916015625,
0.0382080078125,
-0.00958251953125,
0.042999267578125,
0.0207366943359375,
-0.0032634735107421875,
-0.004482269287109375,
-0.0212554931640625,
0.040374755859375,
0.0274810791015625,
-0.0246429443359375,
-0.006534576416015625,
0.0247344970703125,
-0.041595458984375,
-0.0009851455688476562,
0.018096923828125,
0.00331878662109375,
0.0117340087890625,
0.031951904296875,
0.052947998046875,
0.0191802978515625,
-0.0165557861328125,
0.0504150390625,
-0.0130615234375,
-0.029022216796875,
-0.0225372314453125,
-0.0012989044189453125,
0.01099395751953125,
0.032928466796875,
0.0139923095703125,
0.00319671630859375,
-0.0226287841796875,
-0.0447998046875,
0.034820556640625,
0.05596923828125,
-0.0313720703125,
-0.041748046875,
0.04803466796875,
-0.007556915283203125,
-0.006298065185546875,
0.04248046875,
-0.004390716552734375,
-0.052398681640625,
0.0704345703125,
0.0285491943359375,
0.043792724609375,
-0.03778076171875,
0.018341064453125,
0.06414794921875,
0.0028667449951171875,
0.004970550537109375,
0.024749755859375,
0.02423095703125,
-0.03521728515625,
0.00524139404296875,
-0.047698974609375,
0.01476287841796875,
0.039764404296875,
-0.039215087890625,
0.029205322265625,
-0.05816650390625,
-0.027557373046875,
0.00983428955078125,
0.025299072265625,
-0.06390380859375,
0.02142333984375,
0.0024280548095703125,
0.08294677734375,
-0.0599365234375,
0.06524658203125,
0.05487060546875,
-0.031707763671875,
-0.0711669921875,
-0.007568359375,
0.00905609130859375,
-0.06463623046875,
0.034912109375,
0.0232086181640625,
0.0186920166015625,
-0.01751708984375,
-0.04998779296875,
-0.042633056640625,
0.09368896484375,
0.036041259765625,
-0.01180267333984375,
0.0029582977294921875,
-0.0205078125,
0.0292510986328125,
-0.0230712890625,
0.037078857421875,
0.0364990234375,
0.03778076171875,
0.0208282470703125,
-0.068359375,
0.0268402099609375,
-0.0301513671875,
-0.00841522216796875,
0.0186614990234375,
-0.096435546875,
0.08038330078125,
-0.0306396484375,
-0.007717132568359375,
0.01227569580078125,
0.06341552734375,
0.0294647216796875,
0.00421905517578125,
0.0308074951171875,
0.0546875,
0.036346435546875,
-0.0116119384765625,
0.0777587890625,
-0.0009431838989257812,
0.031158447265625,
0.0306854248046875,
0.0322265625,
0.03741455078125,
0.0272064208984375,
-0.030914306640625,
0.01297760009765625,
0.063720703125,
-0.0194549560546875,
0.01232147216796875,
0.01702880859375,
-0.00742340087890625,
-0.00832366943359375,
-0.01354217529296875,
-0.047210693359375,
0.032257080078125,
0.0119781494140625,
-0.0282745361328125,
0.00144195556640625,
-0.006435394287109375,
0.03570556640625,
-0.006443023681640625,
-0.01232147216796875,
0.033660888671875,
0.0175323486328125,
-0.041107177734375,
0.046661376953125,
-0.00261688232421875,
0.07244873046875,
-0.0293426513671875,
0.0004200935363769531,
-0.019134521484375,
0.0215301513671875,
-0.0193328857421875,
-0.086181640625,
0.01824951171875,
-0.01316070556640625,
0.01340484619140625,
-0.004993438720703125,
0.044403076171875,
-0.035675048828125,
-0.022308349609375,
0.0364990234375,
0.0233306884765625,
0.0297393798828125,
0.00356292724609375,
-0.087158203125,
0.018157958984375,
0.007568359375,
-0.04315185546875,
0.03033447265625,
0.03631591796875,
0.0180511474609375,
0.0533447265625,
0.032257080078125,
0.01503753662109375,
0.006870269775390625,
-0.0217132568359375,
0.061431884765625,
-0.04693603515625,
-0.033843994140625,
-0.0662841796875,
0.03338623046875,
-0.023956298828125,
-0.0469970703125,
0.0567626953125,
0.0360107421875,
0.04327392578125,
0.0031757354736328125,
0.037994384765625,
-0.032073974609375,
0.0260162353515625,
-0.038330078125,
0.0552978515625,
-0.0595703125,
-0.0195159912109375,
-0.036956787109375,
-0.06341552734375,
-0.0203704833984375,
0.056304931640625,
0.00020933151245117188,
0.0185394287109375,
0.033233642578125,
0.046142578125,
-0.0074462890625,
-0.01959228515625,
-0.00479888916015625,
0.021820068359375,
0.005916595458984375,
0.06072998046875,
0.038604736328125,
-0.055084228515625,
0.019989013671875,
-0.04498291015625,
-0.0254058837890625,
-0.02294921875,
-0.054595947265625,
-0.0869140625,
-0.06158447265625,
-0.03741455078125,
-0.049346923828125,
-0.0207061767578125,
0.08367919921875,
0.0726318359375,
-0.04180908203125,
-0.014007568359375,
0.024169921875,
0.00788116455078125,
-0.016082763671875,
-0.019744873046875,
0.04461669921875,
0.0213623046875,
-0.0731201171875,
-0.021575927734375,
0.0072021484375,
0.04327392578125,
0.021026611328125,
-0.02520751953125,
-0.01885986328125,
-0.007457733154296875,
0.034698486328125,
0.05487060546875,
-0.05084228515625,
-0.02899169921875,
-0.0029048919677734375,
-0.01702880859375,
0.0230712890625,
0.0269927978515625,
-0.0289306640625,
-0.00782012939453125,
0.037994384765625,
0.00939178466796875,
0.056365966796875,
0.006557464599609375,
0.019775390625,
-0.051666259765625,
0.04644775390625,
-0.00423431396484375,
0.0298614501953125,
0.0280303955078125,
-0.025604248046875,
0.0511474609375,
0.036529541015625,
-0.03094482421875,
-0.07269287109375,
-0.0181732177734375,
-0.1043701171875,
-0.004489898681640625,
0.06536865234375,
-0.019012451171875,
-0.0419921875,
0.042144775390625,
-0.025421142578125,
0.038848876953125,
-0.0205078125,
0.0237579345703125,
0.0262603759765625,
-0.01898193359375,
-0.0382080078125,
-0.040283203125,
0.049835205078125,
0.0229034423828125,
-0.0538330078125,
-0.0252838134765625,
0.0004343986511230469,
0.039459228515625,
0.0166473388671875,
0.05780029296875,
-0.01568603515625,
0.00902557373046875,
-0.0010213851928710938,
0.01629638671875,
-0.000835418701171875,
-0.003925323486328125,
-0.00865936279296875,
-0.00939178466796875,
-0.0214385986328125,
-0.0440673828125
]
] |
wonrax/phobert-base-vietnamese-sentiment | 2022-05-04T07:30:54.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"sentiment",
"classification",
"vi",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | wonrax | null | null | wonrax/phobert-base-vietnamese-sentiment | 5 | 15,602 | transformers | 2022-05-03T14:03:13 | ---
language:
- vi
tags:
- sentiment
- classification
license: mit
widget:
- text: "Không thể nào đẹp hơn"
- text: "Quá phí tiền, mà không đẹp"
- text: "Cái này giá ổn không nhỉ?"
---
[**GitHub Homepage**](https://github.com/wonrax/phobert-base-vietnamese-sentiment)
A model fine-tuned for sentiment analysis based on [vinai/phobert-base](https://huggingface.co/vinai/phobert-base).
Labels:
- NEG: Negative
- POS: Positive
- NEU: Neutral
Dataset: [30K e-commerce reviews](https://www.kaggle.com/datasets/linhlpv/vietnamese-sentiment-analyst)
## Usage
```python
import torch
from transformers import RobertaForSequenceClassification, AutoTokenizer
model = RobertaForSequenceClassification.from_pretrained("wonrax/phobert-base-vietnamese-sentiment")
tokenizer = AutoTokenizer.from_pretrained("wonrax/phobert-base-vietnamese-sentiment", use_fast=False)
# Just like PhoBERT: INPUT TEXT MUST BE ALREADY WORD-SEGMENTED!
sentence = 'Đây là mô_hình rất hay , phù_hợp với điều_kiện và như cầu của nhiều người .'
input_ids = torch.tensor([tokenizer.encode(sentence)])
with torch.no_grad():
out = model(input_ids)
print(out.logits.softmax(dim=-1).tolist())
# Output:
# [[0.002, 0.988, 0.01]]
# ^ ^ ^
# NEG POS NEU
```
| 1,267 | [
[
-0.0189971923828125,
-0.038238525390625,
0.0132293701171875,
0.03045654296875,
-0.05224609375,
-0.00730133056640625,
-0.0094757080078125,
-0.00012695789337158203,
0.021484375,
0.018646240234375,
-0.0189971923828125,
-0.064208984375,
-0.040679931640625,
0.0019817352294921875,
-0.01055145263671875,
0.095703125,
0.009124755859375,
0.00927734375,
0.038299560546875,
-0.02020263671875,
-0.020263671875,
-0.047760009765625,
-0.0280609130859375,
-0.036895751953125,
0.0233154296875,
0.034210205078125,
0.043914794921875,
0.0085601806640625,
0.039703369140625,
0.0281219482421875,
-0.0016107559204101562,
0.01314544677734375,
-0.02557373046875,
0.006832122802734375,
0.00011974573135375977,
-0.0299530029296875,
-0.05303955078125,
0.005321502685546875,
0.025634765625,
0.01338958740234375,
-0.003997802734375,
0.005519866943359375,
-0.0005183219909667969,
0.0450439453125,
-0.03155517578125,
0.0130615234375,
-0.0272674560546875,
-0.0059661865234375,
0.0079498291015625,
-0.0004398822784423828,
-0.03472900390625,
-0.0294189453125,
0.0247955322265625,
-0.0474853515625,
0.00983428955078125,
-0.015899658203125,
0.10467529296875,
0.021240234375,
-0.03369140625,
-0.009124755859375,
-0.040283203125,
0.0684814453125,
-0.059722900390625,
0.0157470703125,
0.0110015869140625,
0.0019464492797851562,
0.01151275634765625,
-0.0533447265625,
-0.03515625,
-0.0156707763671875,
-0.00577545166015625,
0.0269775390625,
-0.027862548828125,
-0.0014781951904296875,
0.00876617431640625,
0.039947509765625,
-0.0498046875,
-0.0259246826171875,
-0.0256805419921875,
-0.0092315673828125,
0.044158935546875,
0.00251007080078125,
-0.00714111328125,
-0.052337646484375,
-0.037750244140625,
-0.0220794677734375,
-0.006267547607421875,
-0.007427215576171875,
0.024139404296875,
0.00815582275390625,
-0.02386474609375,
0.05889892578125,
-0.0172119140625,
0.05841064453125,
0.02960205078125,
-0.009124755859375,
0.06622314453125,
-0.0037746429443359375,
-0.02459716796875,
-0.006443023681640625,
0.07073974609375,
0.05523681640625,
0.03973388671875,
0.016510009765625,
-0.0176239013671875,
0.022857666015625,
-0.0028095245361328125,
-0.07012939453125,
-0.049041748046875,
0.036712646484375,
-0.03656005859375,
-0.0253753662109375,
0.03216552734375,
-0.0631103515625,
-0.0218505859375,
-0.005146026611328125,
0.05328369140625,
-0.04254150390625,
-0.04571533203125,
0.040924072265625,
-0.0290374755859375,
0.0280303955078125,
0.007076263427734375,
-0.018768310546875,
-0.007152557373046875,
0.0419921875,
0.0623779296875,
-0.0014514923095703125,
-0.02276611328125,
-0.0194854736328125,
-0.0280609130859375,
-0.002044677734375,
0.048309326171875,
-0.0091705322265625,
-0.044830322265625,
-0.0187225341796875,
0.007678985595703125,
-0.00905609130859375,
-0.05206298828125,
0.044647216796875,
-0.0216217041015625,
0.033538818359375,
0.0189971923828125,
-0.047943115234375,
-0.0225677490234375,
0.0310211181640625,
-0.03228759765625,
0.09576416015625,
0.0309906005859375,
-0.06939697265625,
0.0165557861328125,
-0.037261962890625,
-0.029510498046875,
-0.00441741943359375,
0.01123809814453125,
-0.05889892578125,
0.0101318359375,
0.00698089599609375,
0.031280517578125,
-0.0043182373046875,
0.024932861328125,
-0.02685546875,
-0.01404571533203125,
0.04937744140625,
-0.01788330078125,
0.0848388671875,
0.0146484375,
-0.0511474609375,
0.033172607421875,
-0.08673095703125,
-0.0018253326416015625,
0.00690460205078125,
-0.02484130859375,
-0.0174713134765625,
-0.0236053466796875,
0.03289794921875,
0.01218414306640625,
0.02386474609375,
-0.054931640625,
0.01904296875,
-0.040496826171875,
0.03631591796875,
0.0467529296875,
0.008941650390625,
0.0262908935546875,
-0.01055908203125,
0.032806396484375,
0.005580902099609375,
0.0179595947265625,
0.0168304443359375,
-0.0316162109375,
-0.09368896484375,
-0.0307464599609375,
0.006748199462890625,
0.061187744140625,
-0.043701171875,
0.06787109375,
-0.009857177734375,
-0.0623779296875,
-0.0246429443359375,
-0.0106658935546875,
-0.00039458274841308594,
0.0428466796875,
0.033538818359375,
-0.0249786376953125,
-0.050872802734375,
-0.062255859375,
-0.00946044921875,
-0.0278167724609375,
-0.00612640380859375,
-0.0013837814331054688,
0.044952392578125,
-0.042572021484375,
0.07757568359375,
-0.0487060546875,
-0.0296478271484375,
-0.020416259765625,
0.0178375244140625,
0.043670654296875,
0.044281005859375,
0.02667236328125,
-0.0538330078125,
-0.03851318359375,
-0.0152587890625,
-0.041839599609375,
-0.0012617111206054688,
0.009521484375,
-0.032806396484375,
-0.014007568359375,
0.01151275634765625,
-0.055145263671875,
0.032623291015625,
0.028411865234375,
-0.044769287109375,
0.0623779296875,
-0.001972198486328125,
-0.00891876220703125,
-0.09710693359375,
0.004375457763671875,
0.01525115966796875,
-0.00841522216796875,
-0.029449462890625,
-0.01617431640625,
0.00383758544921875,
-0.01198577880859375,
-0.036224365234375,
0.041229248046875,
-0.01018524169921875,
0.0256805419921875,
-0.029510498046875,
-0.01308441162109375,
0.026824951171875,
0.04339599609375,
0.0244293212890625,
0.03472900390625,
0.04278564453125,
-0.038787841796875,
0.05059814453125,
0.017333984375,
-0.01116180419921875,
0.04132080078125,
-0.06500244140625,
0.0012559890747070312,
0.02587890625,
0.010589599609375,
-0.06884765625,
-0.00957489013671875,
0.044891357421875,
-0.0504150390625,
0.0011243820190429688,
-0.016143798828125,
-0.03289794921875,
-0.021087646484375,
-0.033447265625,
0.0300750732421875,
0.0301513671875,
-0.020050048828125,
0.052703857421875,
0.0206146240234375,
0.0205230712890625,
-0.0458984375,
-0.05657958984375,
-0.0149993896484375,
-0.039337158203125,
-0.036712646484375,
0.00572967529296875,
-0.01386260986328125,
-0.02398681640625,
0.0024890899658203125,
0.015594482421875,
-0.01505279541015625,
-0.007427215576171875,
0.0127410888671875,
0.0224761962890625,
-0.0233154296875,
0.00933837890625,
-0.02813720703125,
-0.0391845703125,
0.0054168701171875,
-0.04052734375,
0.046539306640625,
-0.039794921875,
0.00013887882232666016,
-0.040191650390625,
-0.0027008056640625,
0.0357666015625,
-0.0244598388671875,
0.050201416015625,
0.07098388671875,
-0.015167236328125,
-0.01995849609375,
-0.03216552734375,
-0.0206756591796875,
-0.03656005859375,
0.0450439453125,
-0.0211944580078125,
-0.05572509765625,
0.0184783935546875,
0.01416778564453125,
-0.00586700439453125,
0.047149658203125,
0.057952880859375,
0.00942230224609375,
0.06304931640625,
0.05413818359375,
-0.0141754150390625,
0.04071044921875,
-0.04510498046875,
0.019317626953125,
-0.048797607421875,
0.000011861324310302734,
-0.008819580078125,
-0.004276275634765625,
-0.04815673828125,
-0.0286712646484375,
0.015167236328125,
0.00125885009765625,
-0.036773681640625,
0.02679443359375,
-0.068603515625,
-0.0027446746826171875,
0.046539306640625,
0.0060577392578125,
0.006053924560546875,
0.0166168212890625,
-0.009246826171875,
-0.005992889404296875,
-0.042022705078125,
-0.034210205078125,
0.07464599609375,
0.02667236328125,
0.0635986328125,
-0.01259613037109375,
0.050567626953125,
0.009521484375,
0.0357666015625,
-0.07501220703125,
0.0428466796875,
-0.011688232421875,
-0.04754638671875,
0.003936767578125,
-0.0206451416015625,
-0.057525634765625,
0.0284881591796875,
0.00319671630859375,
-0.038330078125,
0.03765869140625,
0.006359100341796875,
-0.03460693359375,
0.017913818359375,
-0.04638671875,
0.07830810546875,
-0.0290985107421875,
-0.0197601318359375,
0.0041351318359375,
-0.0244903564453125,
0.011962890625,
0.0221405029296875,
-0.0007333755493164062,
-0.0274810791015625,
0.004138946533203125,
0.0699462890625,
-0.043365478515625,
0.060089111328125,
-0.01641845703125,
-0.00238800048828125,
0.039093017578125,
0.004932403564453125,
0.005542755126953125,
0.0177154541015625,
-0.019317626953125,
0.01253509521484375,
-0.024627685546875,
-0.023895263671875,
-0.02081298828125,
0.03631591796875,
-0.0718994140625,
-0.022369384765625,
-0.061004638671875,
-0.0215606689453125,
0.006046295166015625,
0.01198577880859375,
0.0562744140625,
0.024017333984375,
-0.0089874267578125,
0.010498046875,
0.03350830078125,
-0.0162200927734375,
0.0012540817260742188,
0.0250396728515625,
-0.01441192626953125,
-0.06787109375,
0.0816650390625,
-0.008026123046875,
-0.0103759765625,
0.0205230712890625,
0.0328369140625,
-0.0124969482421875,
-0.0124969482421875,
-0.0262908935546875,
0.04168701171875,
-0.051361083984375,
-0.0218963623046875,
-0.040557861328125,
-0.0199737548828125,
-0.025970458984375,
0.0020580291748046875,
-0.03515625,
-0.031585693359375,
-0.041351318359375,
-0.0301971435546875,
0.05572509765625,
0.036651611328125,
-0.018951416015625,
0.049224853515625,
-0.0406494140625,
0.0160675048828125,
0.0141143798828125,
0.0029964447021484375,
-0.01415252685546875,
-0.05120849609375,
-0.0273895263671875,
-0.023101806640625,
-0.0203857421875,
-0.08404541015625,
0.066650390625,
0.0020923614501953125,
0.0022830963134765625,
0.03594970703125,
0.00917816162109375,
0.036865234375,
-0.006011962890625,
0.048309326171875,
0.01050567626953125,
-0.07769775390625,
0.035491943359375,
-0.0202484130859375,
0.0153350830078125,
0.030853271484375,
0.0209197998046875,
-0.043792724609375,
-0.026123046875,
-0.0458984375,
-0.0927734375,
0.03497314453125,
0.0226287841796875,
-0.01349639892578125,
0.00888824462890625,
0.01244354248046875,
0.0011196136474609375,
0.0188446044921875,
-0.09307861328125,
-0.0391845703125,
-0.052093505859375,
-0.0418701171875,
-0.0193939208984375,
-0.030792236328125,
0.0015211105346679688,
-0.028564453125,
0.07489013671875,
0.00826263427734375,
0.0235443115234375,
0.04168701171875,
-0.026519775390625,
0.008056640625,
0.019256591796875,
0.0298004150390625,
0.037994384765625,
-0.042388916015625,
0.0082244873046875,
-0.007747650146484375,
-0.043609619140625,
0.0182952880859375,
0.0297088623046875,
-0.0199127197265625,
0.0236663818359375,
0.0159912109375,
0.06390380859375,
-0.0083465576171875,
-0.0301055908203125,
0.03057861328125,
-0.021270751953125,
-0.0135040283203125,
-0.04266357421875,
-0.005222320556640625,
0.0026073455810546875,
0.01180267333984375,
0.038055419921875,
0.0012950897216796875,
0.00403594970703125,
-0.005168914794921875,
0.025634765625,
0.00951385498046875,
-0.041351318359375,
-0.032073974609375,
0.040740966796875,
0.018310546875,
-0.0177001953125,
0.04827880859375,
-0.0242462158203125,
-0.0706787109375,
0.0450439453125,
0.0036792755126953125,
0.0830078125,
0.001972198486328125,
0.0229339599609375,
0.05401611328125,
0.0299530029296875,
0.0157012939453125,
0.0261688232421875,
0.0090484619140625,
-0.06103515625,
-0.00893402099609375,
-0.04974365234375,
-0.01509857177734375,
0.0145263671875,
-0.042022705078125,
0.0197906494140625,
-0.032745361328125,
-0.0426025390625,
-0.021697998046875,
0.01018524169921875,
-0.041229248046875,
0.0335693359375,
0.0013017654418945312,
0.045257568359375,
-0.0609130859375,
0.06524658203125,
0.04632568359375,
-0.032989501953125,
-0.0743408203125,
-0.01032257080078125,
-0.0127410888671875,
-0.035919189453125,
0.043670654296875,
0.026519775390625,
-0.0171966552734375,
0.008392333984375,
-0.0229949951171875,
-0.046234130859375,
0.04754638671875,
0.00920867919921875,
-0.020477294921875,
0.0163116455078125,
0.004207611083984375,
0.034393310546875,
-0.0224456787109375,
0.022003173828125,
0.038604736328125,
0.035400390625,
-0.03594970703125,
-0.04443359375,
-0.01312255859375,
-0.042755126953125,
-0.028900146484375,
-0.0073699951171875,
-0.0577392578125,
0.09857177734375,
0.000606536865234375,
0.0007491111755371094,
0.01194000244140625,
0.07122802734375,
0.02691650390625,
0.017425537109375,
0.042633056640625,
0.051513671875,
0.040313720703125,
-0.01184844970703125,
0.056243896484375,
-0.02587890625,
0.060638427734375,
0.0703125,
-0.00971221923828125,
0.0576171875,
0.01702880859375,
-0.01031494140625,
0.07281494140625,
0.0640869140625,
-0.0263824462890625,
0.02142333984375,
0.004581451416015625,
-0.01229095458984375,
-0.0177154541015625,
0.013214111328125,
-0.036865234375,
0.04083251953125,
0.02099609375,
-0.0195159912109375,
-0.0037631988525390625,
0.00983428955078125,
0.003826141357421875,
-0.0171661376953125,
-0.0089569091796875,
0.039642333984375,
0.01325225830078125,
-0.036102294921875,
0.056060791015625,
0.0158233642578125,
0.08343505859375,
-0.0469970703125,
0.006343841552734375,
-0.00534820556640625,
0.051544189453125,
-0.0191192626953125,
-0.04266357421875,
-0.01050567626953125,
0.010833740234375,
-0.003772735595703125,
0.016510009765625,
0.05810546875,
-0.030975341796875,
-0.06524658203125,
0.0404052734375,
0.037872314453125,
0.010223388671875,
-0.011993408203125,
-0.08453369140625,
-0.004974365234375,
0.012298583984375,
-0.04937744140625,
-0.019317626953125,
0.0380859375,
0.017608642578125,
0.03204345703125,
0.03228759765625,
0.005706787109375,
0.00801849365234375,
0.01097869873046875,
0.04998779296875,
-0.06103515625,
-0.058837890625,
-0.07940673828125,
0.046356201171875,
-0.004962921142578125,
-0.0511474609375,
0.08380126953125,
0.062744140625,
0.065185546875,
-0.0145263671875,
0.05364990234375,
0.0036468505859375,
0.037261962890625,
-0.00930023193359375,
0.05096435546875,
-0.0384521484375,
-0.0167236328125,
-0.02740478515625,
-0.070556640625,
-0.023193359375,
0.08404541015625,
-0.033843994140625,
0.009033203125,
0.05194091796875,
0.06109619140625,
-0.00530242919921875,
-0.0264434814453125,
0.002086639404296875,
0.02801513671875,
0.016448974609375,
0.027374267578125,
0.061981201171875,
-0.035491943359375,
0.043121337890625,
-0.052764892578125,
-0.0181121826171875,
-0.033447265625,
-0.050537109375,
-0.06243896484375,
-0.02313232421875,
-0.0389404296875,
-0.0596923828125,
-0.01068115234375,
0.07183837890625,
0.0606689453125,
-0.06805419921875,
-0.0517578125,
-0.0011129379272460938,
0.00266265869140625,
0.0004820823669433594,
-0.0259552001953125,
0.049591064453125,
-0.0006380081176757812,
-0.05548095703125,
0.0053558349609375,
0.00737762451171875,
0.00231170654296875,
-0.0188446044921875,
-0.009735107421875,
-0.01027679443359375,
-0.00916290283203125,
0.037353515625,
0.01312255859375,
-0.03863525390625,
-0.0010156631469726562,
0.005832672119140625,
-0.01303863525390625,
0.042449951171875,
0.05401611328125,
-0.052337646484375,
0.026031494140625,
0.0287628173828125,
0.01505279541015625,
0.032806396484375,
0.00623321533203125,
0.0300750732421875,
-0.04193115234375,
0.040374755859375,
0.031402587890625,
0.037811279296875,
0.057891845703125,
-0.01641845703125,
0.0282135009765625,
0.03955078125,
-0.03814697265625,
-0.048919677734375,
0.0088043212890625,
-0.077392578125,
-0.0213623046875,
0.0914306640625,
-0.0051116943359375,
-0.03717041015625,
0.0023784637451171875,
-0.041534423828125,
0.038330078125,
-0.046051025390625,
0.035736083984375,
0.0433349609375,
-0.007526397705078125,
0.0030345916748046875,
-0.004924774169921875,
0.028350830078125,
0.049652099609375,
-0.0262451171875,
-0.006336212158203125,
0.030975341796875,
0.036285400390625,
0.01415252685546875,
0.048828125,
0.0132598876953125,
0.026397705078125,
-0.0007128715515136719,
0.02490234375,
0.0027313232421875,
-0.004138946533203125,
-0.037445068359375,
0.0160369873046875,
0.011322021484375,
-0.034423828125
]
] |
WizardLM/WizardLM-13B-V1.2 | 2023-09-09T06:45:42.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"arxiv:2304.12244",
"arxiv:2306.08568",
"arxiv:2308.09583",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | WizardLM | null | null | WizardLM/WizardLM-13B-V1.2 | 174 | 15,571 | transformers | 2023-07-25T13:51:28 | ---
license: llama2
---
This is the **Full-Weight** of WizardLM-13B V1.2 model, this model is trained from **Llama-2 13b**.
## WizardLM: Empowering Large Pre-Trained Language Models to Follow Complex Instructions
<p align="center">
🤗 <a href="https://huggingface.co/WizardLM" target="_blank">HF Repo</a> •🐱 <a href="https://github.com/nlpxucan/WizardLM" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/WizardLM_AI" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> • 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> • 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> <br>
</p>
<p align="center">
👋 Join our <a href="https://discord.gg/VZjjHtWrKs" target="_blank">Discord</a>
</p>
## News
- 🔥🔥🔥[2023/08/26] We released **WizardCoder-Python-34B-V1.0** , which achieves the **73.2 pass@1** and surpasses **GPT4 (2023/03/15)**, **ChatGPT-3.5**, and **Claude2** on the [HumanEval Benchmarks](https://github.com/openai/human-eval). For more details, please refer to [WizardCoder](https://github.com/nlpxucan/WizardLM/tree/main/WizardCoder).
- [2023/06/16] We released **WizardCoder-15B-V1.0** , which surpasses **Claude-Plus (+6.8)**, **Bard (+15.3)** and **InstructCodeT5+ (+22.3)** on the [HumanEval Benchmarks](https://github.com/openai/human-eval). For more details, please refer to [WizardCoder](https://github.com/nlpxucan/WizardLM/tree/main/WizardCoder).
| Model | Checkpoint | Paper | HumanEval | MBPP | Demo | License |
| ----- |------| ---- |------|-------| ----- | ----- |
| WizardCoder-Python-34B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-34B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 73.2 | 61.2 | [Demo](http://47.103.63.15:50085/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> |
| WizardCoder-15B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-15B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 59.8 |50.6 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
| WizardCoder-Python-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 64.0 | 55.6 | -- | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> |
| WizardCoder-Python-7B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-7B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 55.5 | 51.6 | [Demo](http://47.103.63.15:50088/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> |
| WizardCoder-3B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-3B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 34.8 |37.4 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
| WizardCoder-1B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-1B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 23.8 |28.6 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
- 🔥 [08/11/2023] We release **WizardMath** Models.
- 🔥 Our **WizardMath-70B-V1.0** model slightly outperforms some closed-source LLMs on the GSM8K, including **ChatGPT 3.5**, **Claude Instant 1** and **PaLM 2 540B**.
- 🔥 Our **WizardMath-70B-V1.0** model achieves **81.6 pass@1** on the [GSM8k Benchmarks](https://github.com/openai/grade-school-math), which is **24.8** points higher than the SOTA open-source LLM.
- 🔥 Our **WizardMath-70B-V1.0** model achieves **22.7 pass@1** on the [MATH Benchmarks](https://github.com/hendrycks/math), which is **9.2** points higher than the SOTA open-source LLM.
| Model | Checkpoint | Paper | GSM8k | MATH |Online Demo| License|
| ----- |------| ---- |------|-------| ----- | ----- |
| WizardMath-70B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-70B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **81.6** | **22.7** |[Demo](http://47.103.63.15:50083/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a> |
| WizardMath-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **63.9** | **14.0** |[Demo](http://47.103.63.15:50082/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a> |
| WizardMath-7B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-7B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **54.9** | **10.7** | [Demo](http://47.103.63.15:50080/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a>|
<font size=4>
| <sup>Model</sup> | <sup>Checkpoint</sup> | <sup>Paper</sup> |<sup>MT-Bench</sup> | <sup>AlpacaEval</sup> | <sup>WizardEval</sup> | <sup>HumanEval</sup> | <sup>License</sup>|
| ----- |------| ---- |------|-------| ----- | ----- | ----- |
| <sup>WizardLM-13B-V1.2</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.2" target="_blank">HF Link</a> </sup>| | <sup>7.06</sup> | <sup>89.17%</sup> | <sup>101.4% </sup>|<sup>36.6 pass@1</sup>|<sup> <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 License </a></sup> |
| <sup>WizardLM-13B-V1.1</sup> |<sup> 🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.1" target="_blank">HF Link</a> </sup> | | <sup>6.76</sup> |<sup>86.32%</sup> | <sup>99.3% </sup> |<sup>25.0 pass@1</sup>| <sup>Non-commercial</sup>|
| <sup>WizardLM-30B-V1.0</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-30B-V1.0" target="_blank">HF Link</a></sup> | | <sup>7.01</sup> | | <sup>97.8% </sup> | <sup>37.8 pass@1</sup>| <sup>Non-commercial</sup> |
| <sup>WizardLM-13B-V1.0</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.0" target="_blank">HF Link</a> </sup> | | <sup>6.35</sup> | <sup>75.31%</sup> | <sup>89.1% </sup> |<sup> 24.0 pass@1 </sup> | <sup>Non-commercial</sup>|
| <sup>WizardLM-7B-V1.0 </sup>| <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-7B-V1.0" target="_blank">HF Link</a> </sup> |<sup> 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> </sup>| | | <sup>78.0% </sup> |<sup>19.1 pass@1 </sup>|<sup> Non-commercial</sup>|
</font>
**Repository**: https://github.com/nlpxucan/WizardLM
**Twitter**:
- 🔥🔥🔥 [7/25/2023] We released **WizardLM V1.2** models. The **WizardLM-13B-V1.2** is here ([Demo_13B-V1.2](https://b7a19878988c8c73.gradio.app), [Demo_13B-V1.2_bak-1](https://d0a37a76e0ac4b52.gradio.app/), [Full Model Weight](https://huggingface.co/WizardLM/WizardLM-13B-V1.2)). Please checkout the [paper](https://arxiv.org/abs/2304.12244).
- 🔥🔥🔥 [7/25/2023] The **WizardLM-13B-V1.2** achieves **7.06** on [MT-Bench Leaderboard](https://chat.lmsys.org/?leaderboard), **89.17%** on [AlpacaEval Leaderboard](https://tatsu-lab.github.io/alpaca_eval/), and **101.4%** on [WizardLM Eval](https://github.com/nlpxucan/WizardLM/blob/main/WizardLM/data/WizardLM_testset.jsonl). (Note: MT-Bench and AlpacaEval are all self-test, will push update and request review. All tests are completed under their official settings.)
❗<b>Note for model system prompts usage:</b>
<b>WizardLM</b> adopts the prompt format from <b>Vicuna</b> and supports **multi-turn** conversation. The prompt should be as following:
```
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: Hi ASSISTANT: Hello.</s>USER: Who are you? ASSISTANT: I am WizardLM.</s>......
```
## Inference WizardLM Demo Script
We provide the inference WizardLM demo code [here](https://github.com/nlpxucan/WizardLM/tree/main/demo).
Please cite the paper if you use the data or code from WizardLM.
```
@article{xu2023wizardlm,
title={Wizardlm: Empowering large language models to follow complex instructions},
author={Xu, Can and Sun, Qingfeng and Zheng, Kai and Geng, Xiubo and Zhao, Pu and Feng, Jiazhan and Tao, Chongyang and Jiang, Daxin},
journal={arXiv preprint arXiv:2304.12244},
year={2023}
}
```
❗<b>To commen concern about dataset:</b>
Recently, there have been clear changes in the open-source policy and regulations of our overall organization's code, data, and models.
Despite this, we have still worked hard to obtain opening the weights of the model first, but the data involves stricter auditing and is in review with our legal team .
Our researchers have no authority to publicly release them without authorization.
Thank you for your understanding. | 9,597 | [
[
-0.04534912109375,
-0.037353515625,
-0.004955291748046875,
0.03118896484375,
0.006412506103515625,
-0.010467529296875,
0.0020275115966796875,
-0.0347900390625,
0.0153961181640625,
0.026458740234375,
-0.050567626953125,
-0.04681396484375,
-0.038360595703125,
0.01381683349609375,
-0.01023101806640625,
0.059326171875,
-0.01274871826171875,
-0.013519287109375,
-0.01461029052734375,
-0.01116943359375,
-0.0164031982421875,
-0.033782958984375,
-0.016143798828125,
-0.03607177734375,
0.03082275390625,
0.006084442138671875,
0.06292724609375,
0.03289794921875,
0.02496337890625,
0.0213165283203125,
-0.0192718505859375,
0.039337158203125,
-0.01372528076171875,
-0.0185394287109375,
0.0158538818359375,
-0.0177001953125,
-0.071044921875,
-0.0018548965454101562,
0.04400634765625,
0.028228759765625,
-0.00435638427734375,
0.0293731689453125,
0.0026493072509765625,
0.068115234375,
-0.0428466796875,
0.0201873779296875,
-0.0179290771484375,
0.0181427001953125,
-0.01209259033203125,
-0.00677490234375,
-0.0022106170654296875,
-0.03607177734375,
-0.0028743743896484375,
-0.060546875,
-0.0089569091796875,
0.00897216796875,
0.086181640625,
0.01367950439453125,
-0.0206756591796875,
-0.006481170654296875,
-0.0206451416015625,
0.0528564453125,
-0.06317138671875,
0.024688720703125,
0.04168701171875,
0.01284027099609375,
-0.03997802734375,
-0.039947509765625,
-0.05908203125,
-0.01207733154296875,
-0.00902557373046875,
0.00823974609375,
-0.0279998779296875,
-0.019012451171875,
0.0241241455078125,
0.0229644775390625,
-0.04925537109375,
-0.003170013427734375,
-0.03082275390625,
-0.01293182373046875,
0.0631103515625,
0.0157623291015625,
0.032257080078125,
-0.01346588134765625,
0.0016574859619140625,
-0.01580810546875,
-0.03900146484375,
0.01544952392578125,
0.0298309326171875,
-0.0024356842041015625,
-0.035125732421875,
0.057342529296875,
-0.005352020263671875,
0.049835205078125,
0.008636474609375,
-0.043914794921875,
0.048431396484375,
-0.026947021484375,
-0.01540374755859375,
-0.007537841796875,
0.0780029296875,
0.037017822265625,
0.0095367431640625,
0.005992889404296875,
0.0006952285766601562,
-0.020416259765625,
0.004001617431640625,
-0.06707763671875,
-0.007328033447265625,
0.0216827392578125,
-0.040679931640625,
-0.0200347900390625,
-0.0184173583984375,
-0.05633544921875,
-0.0278472900390625,
-0.0069427490234375,
0.0213775634765625,
-0.0491943359375,
-0.0234527587890625,
0.01904296875,
-0.0017518997192382812,
0.045501708984375,
0.0418701171875,
-0.06561279296875,
0.022064208984375,
0.039825439453125,
0.056915283203125,
-0.00963592529296875,
-0.0418701171875,
-0.007053375244140625,
-0.0006756782531738281,
-0.03271484375,
0.039031982421875,
0.0020046234130859375,
-0.0321044921875,
-0.0094451904296875,
-0.00571441650390625,
-0.01177215576171875,
-0.0288848876953125,
0.0295257568359375,
-0.03033447265625,
0.0210418701171875,
-0.00820159912109375,
-0.038604736328125,
-0.0195159912109375,
0.022064208984375,
-0.045135498046875,
0.0855712890625,
0.01267242431640625,
-0.06951904296875,
-0.0030002593994140625,
-0.048248291015625,
-0.01104736328125,
-0.032623291015625,
-0.0029354095458984375,
-0.043304443359375,
-0.01425933837890625,
0.0179901123046875,
0.01528167724609375,
-0.038787841796875,
-0.01493072509765625,
-0.016876220703125,
-0.018341064453125,
0.0171051025390625,
-0.0369873046875,
0.09735107421875,
0.0177459716796875,
-0.0262908935546875,
-0.004520416259765625,
-0.07952880859375,
0.0014944076538085938,
0.043365478515625,
-0.035308837890625,
0.004970550537109375,
-0.0214080810546875,
-0.00736236572265625,
0.01313018798828125,
0.050628662109375,
-0.0212249755859375,
0.035430908203125,
-0.03631591796875,
-0.0129547119140625,
0.0521240234375,
-0.01087188720703125,
0.025115966796875,
-0.03411865234375,
0.03033447265625,
-0.008544921875,
0.0289764404296875,
0.00850677490234375,
-0.048583984375,
-0.06195068359375,
-0.027191162109375,
0.007625579833984375,
0.053619384765625,
-0.0328369140625,
0.07916259765625,
-0.017669677734375,
-0.073486328125,
-0.04217529296875,
0.0215301513671875,
0.03143310546875,
0.03985595703125,
0.040374755859375,
-0.0111083984375,
-0.0260009765625,
-0.062042236328125,
-0.0035305023193359375,
-0.02447509765625,
-0.00653839111328125,
0.0218963623046875,
0.04534912109375,
-0.03253173828125,
0.0712890625,
-0.047393798828125,
-0.0165557861328125,
-0.00798797607421875,
-0.0159149169921875,
0.028778076171875,
0.047607421875,
0.04693603515625,
-0.044647216796875,
-0.033721923828125,
0.01454925537109375,
-0.06903076171875,
-0.00891876220703125,
0.006671905517578125,
-0.0231475830078125,
0.023834228515625,
-0.00047969818115234375,
-0.06744384765625,
0.055084228515625,
0.022064208984375,
-0.043060302734375,
0.06585693359375,
-0.025360107421875,
0.004039764404296875,
-0.07421875,
0.0056610107421875,
-0.0091094970703125,
0.0092926025390625,
-0.04541015625,
0.0016651153564453125,
0.00701141357421875,
0.020965576171875,
-0.04705810546875,
0.0638427734375,
-0.03643798828125,
-0.002399444580078125,
-0.004085540771484375,
-0.01287841796875,
0.0151519775390625,
0.05511474609375,
-0.0095977783203125,
0.053253173828125,
0.054443359375,
-0.03271484375,
0.040069580078125,
0.02886962890625,
-0.02093505859375,
0.0235595703125,
-0.039306640625,
0.005306243896484375,
0.003894805908203125,
0.0225372314453125,
-0.0384521484375,
-0.0091705322265625,
0.04168701171875,
-0.043304443359375,
0.0279998779296875,
0.0048065185546875,
-0.056976318359375,
-0.04656982421875,
-0.0506591796875,
0.0023193359375,
0.057403564453125,
-0.039764404296875,
0.05194091796875,
0.019866943359375,
0.0216064453125,
-0.059417724609375,
-0.043426513671875,
-0.01149749755859375,
-0.0080718994140625,
-0.058685302734375,
0.01751708984375,
-0.019775390625,
-0.0120086669921875,
-0.00351715087890625,
-0.0277252197265625,
-0.0012273788452148438,
0.0135650634765625,
0.019439697265625,
0.036346435546875,
-0.01050567626953125,
-0.0235748291015625,
0.004138946533203125,
-0.005298614501953125,
-0.0020046234130859375,
-0.01474761962890625,
0.037628173828125,
-0.0182647705078125,
-0.03662109375,
-0.0311431884765625,
0.004276275634765625,
0.03662109375,
-0.020599365234375,
0.0711669921875,
0.046600341796875,
-0.03839111328125,
0.006740570068359375,
-0.05108642578125,
0.011688232421875,
-0.04046630859375,
0.0139007568359375,
-0.03460693359375,
-0.048583984375,
0.043853759765625,
0.0158843994140625,
0.0250396728515625,
0.04412841796875,
0.049346923828125,
0.01052093505859375,
0.0706787109375,
0.033660888671875,
-0.0012750625610351562,
0.035003662109375,
-0.0406494140625,
0.007534027099609375,
-0.0657958984375,
-0.04248046875,
-0.03680419921875,
0.0020503997802734375,
-0.033782958984375,
-0.04754638671875,
0.0239105224609375,
0.0499267578125,
-0.04571533203125,
0.044921875,
-0.0633544921875,
0.0188140869140625,
0.039520263671875,
0.00013935565948486328,
0.0138397216796875,
0.0131988525390625,
-0.0224456787109375,
0.016571044921875,
-0.0266876220703125,
-0.04583740234375,
0.08099365234375,
0.0201873779296875,
0.049835205078125,
0.01507568359375,
0.05560302734375,
-0.002590179443359375,
-0.003383636474609375,
-0.027099609375,
0.05218505859375,
0.0223236083984375,
-0.04278564453125,
-0.0279998779296875,
-0.0186767578125,
-0.08001708984375,
0.037322998046875,
-0.015167236328125,
-0.08837890625,
0.023345947265625,
0.0033855438232421875,
-0.0176544189453125,
0.03497314453125,
-0.04278564453125,
0.06817626953125,
-0.012298583984375,
-0.03314208984375,
0.0008025169372558594,
-0.0312347412109375,
0.0241546630859375,
0.0037097930908203125,
0.00927734375,
-0.0256805419921875,
-0.0216827392578125,
0.0601806640625,
-0.0811767578125,
0.043212890625,
-0.002216339111328125,
-0.02093505859375,
0.04180908203125,
-0.003528594970703125,
0.04180908203125,
-0.010772705078125,
-0.0161285400390625,
0.0304412841796875,
0.0110626220703125,
-0.034759521484375,
-0.0450439453125,
0.050994873046875,
-0.08160400390625,
-0.052886962890625,
-0.04022216796875,
-0.0287017822265625,
-0.005645751953125,
0.0204620361328125,
0.0191497802734375,
0.01247406005859375,
0.019500732421875,
-0.0170745849609375,
0.049591064453125,
-0.0281524658203125,
0.025634765625,
0.028778076171875,
-0.029693603515625,
-0.0295867919921875,
0.07421875,
0.0145263671875,
-0.0024623870849609375,
0.0249176025390625,
0.01404571533203125,
-0.01309967041015625,
-0.0316162109375,
-0.051727294921875,
0.027069091796875,
-0.0540771484375,
-0.028350830078125,
-0.059539794921875,
-0.035675048828125,
-0.04644775390625,
-0.02093505859375,
-0.03253173828125,
-0.04119873046875,
-0.0484619140625,
0.0032978057861328125,
0.07415771484375,
0.032806396484375,
-0.017852783203125,
-0.007568359375,
-0.050933837890625,
0.0241546630859375,
0.027008056640625,
0.0134429931640625,
0.02642822265625,
-0.04022216796875,
-0.018341064453125,
-0.01340484619140625,
-0.04345703125,
-0.06640625,
0.04534912109375,
-0.0167694091796875,
0.04046630859375,
0.00815582275390625,
-0.0023517608642578125,
0.058868408203125,
-0.044189453125,
0.0732421875,
0.0430908203125,
-0.05609130859375,
0.036407470703125,
-0.0107879638671875,
0.0254058837890625,
0.0205841064453125,
0.0252838134765625,
-0.0296630859375,
-0.011566162109375,
-0.0333251953125,
-0.056396484375,
0.05535888671875,
0.02288818359375,
-0.001651763916015625,
0.01172637939453125,
0.01056671142578125,
0.0023365020751953125,
0.004360198974609375,
-0.038330078125,
-0.0623779296875,
-0.0267181396484375,
-0.01971435546875,
0.021636962890625,
0.0012998580932617188,
-0.0014524459838867188,
-0.03570556640625,
0.05767822265625,
0.0015935897827148438,
0.031768798828125,
0.0201873779296875,
-0.00589752197265625,
-0.00213623046875,
0.0077056884765625,
0.036956787109375,
0.042449951171875,
-0.0142822265625,
-0.0104522705078125,
0.032501220703125,
-0.05810546875,
0.0197601318359375,
0.026641845703125,
-0.0219573974609375,
-0.0099945068359375,
0.03533935546875,
0.0628662109375,
-0.0011539459228515625,
-0.041107177734375,
0.043121337890625,
0.0062103271484375,
-0.0159759521484375,
-0.032440185546875,
0.0142059326171875,
0.0210418701171875,
0.0256805419921875,
0.032012939453125,
0.006439208984375,
0.01351165771484375,
-0.0213775634765625,
-0.0031299591064453125,
0.030120849609375,
-0.0030422210693359375,
-0.01305389404296875,
0.050872802734375,
-0.015899658203125,
-0.0268096923828125,
0.01140594482421875,
-0.024749755859375,
-0.042724609375,
0.05682373046875,
0.038543701171875,
0.049224853515625,
0.0050048828125,
-0.0106048583984375,
0.04022216796875,
0.0123291015625,
-0.002521514892578125,
0.006969451904296875,
-0.0031795501708984375,
-0.03271484375,
-0.0086669921875,
-0.06353759765625,
-0.020660400390625,
-0.013519287109375,
-0.0244140625,
0.0401611328125,
-0.036468505859375,
-0.0023288726806640625,
-0.01206207275390625,
0.039642333984375,
-0.0670166015625,
-0.01168060302734375,
0.0198211669921875,
0.0943603515625,
-0.0175323486328125,
0.07489013671875,
0.03118896484375,
-0.051177978515625,
-0.0733642578125,
-0.016204833984375,
0.0251617431640625,
-0.06585693359375,
0.0421142578125,
-0.0017576217651367188,
-0.00566864013671875,
-0.01183319091796875,
-0.031524658203125,
-0.07745361328125,
0.10455322265625,
0.01522064208984375,
-0.02801513671875,
-0.025909423828125,
-0.00006371736526489258,
0.027923583984375,
-0.0069122314453125,
0.041168212890625,
0.04541015625,
0.047821044921875,
0.007053375244140625,
-0.09967041015625,
0.0190582275390625,
-0.042266845703125,
0.0005879402160644531,
-0.01401519775390625,
-0.06842041015625,
0.06854248046875,
-0.00933837890625,
0.00643157958984375,
0.024658203125,
0.056182861328125,
0.06256103515625,
0.020111083984375,
0.0110626220703125,
0.04180908203125,
0.06402587890625,
0.0119171142578125,
0.09552001953125,
-0.0160369873046875,
0.03216552734375,
0.050018310546875,
-0.005390167236328125,
0.039947509765625,
0.01554107666015625,
-0.04449462890625,
0.03826904296875,
0.049713134765625,
-0.0145111083984375,
0.0305023193359375,
0.04083251953125,
-0.011138916015625,
0.0018405914306640625,
0.00849151611328125,
-0.05316162109375,
-0.0085296630859375,
0.01934814453125,
0.0084381103515625,
-0.0039520263671875,
0.0021076202392578125,
0.0164794921875,
-0.01313018798828125,
-0.032684326171875,
0.044189453125,
0.0097808837890625,
-0.02008056640625,
0.07952880859375,
-0.005847930908203125,
0.07916259765625,
-0.053863525390625,
-0.006038665771484375,
-0.0218658447265625,
0.0010852813720703125,
-0.03338623046875,
-0.050811767578125,
-0.009124755859375,
0.01056671142578125,
-0.00630950927734375,
0.01232147216796875,
0.058074951171875,
-0.006591796875,
-0.053924560546875,
0.0322265625,
0.0264129638671875,
0.03240966796875,
0.025177001953125,
-0.06756591796875,
0.03271484375,
-0.0026378631591796875,
-0.053924560546875,
0.03082275390625,
0.037261962890625,
-0.003032684326171875,
0.0556640625,
0.0457763671875,
0.0061187744140625,
0.037445068359375,
-0.01435089111328125,
0.06744384765625,
-0.040191650390625,
-0.0032024383544921875,
-0.062225341796875,
0.04248046875,
-0.01617431640625,
-0.02227783203125,
0.08245849609375,
0.04803466796875,
0.05316162109375,
-0.0052490234375,
0.04400634765625,
-0.00943756103515625,
0.0175323486328125,
-0.0181121826171875,
0.068359375,
-0.06695556640625,
0.007099151611328125,
-0.031768798828125,
-0.06353759765625,
-0.03497314453125,
0.0682373046875,
-0.0141448974609375,
0.0016279220581054688,
0.03125,
0.07354736328125,
0.009613037109375,
-0.0171966552734375,
0.013427734375,
-0.004886627197265625,
0.0238037109375,
0.053436279296875,
0.040771484375,
-0.052490234375,
0.04766845703125,
-0.029541015625,
-0.00963592529296875,
-0.0283203125,
-0.0428466796875,
-0.0772705078125,
-0.0302734375,
-0.0283203125,
-0.048309326171875,
-0.014434814453125,
0.0999755859375,
0.045562744140625,
-0.05194091796875,
-0.0199432373046875,
0.007602691650390625,
0.04052734375,
-0.0174407958984375,
-0.01367950439453125,
0.05670166015625,
0.0071563720703125,
-0.060546875,
0.0183563232421875,
0.01113128662109375,
0.0273284912109375,
-0.0187225341796875,
-0.051177978515625,
-0.01404571533203125,
0.02154541015625,
0.03289794921875,
0.0435791015625,
-0.05572509765625,
-0.0026798248291015625,
0.0015964508056640625,
-0.02313232421875,
0.01029205322265625,
0.013916015625,
-0.040771484375,
0.005580902099609375,
0.036163330078125,
0.033294677734375,
0.040985107421875,
-0.035797119140625,
0.00473785400390625,
-0.01544952392578125,
0.01025390625,
-0.003421783447265625,
0.0369873046875,
0.01207733154296875,
-0.0249481201171875,
0.0469970703125,
0.01120758056640625,
-0.0306549072265625,
-0.0682373046875,
-0.01433563232421875,
-0.07562255859375,
-0.01568603515625,
0.0762939453125,
-0.0033054351806640625,
-0.048004150390625,
0.0108489990234375,
-0.03125,
0.0230560302734375,
-0.032196044921875,
0.0241241455078125,
0.034912109375,
-0.01641845703125,
-0.00246429443359375,
-0.039642333984375,
0.03179931640625,
0.005615234375,
-0.059814453125,
0.0027942657470703125,
0.033538818359375,
0.0185089111328125,
0.04486083984375,
0.06353759765625,
-0.0223541259765625,
0.02496337890625,
0.0182647705078125,
0.033843994140625,
-0.0263214111328125,
0.003631591796875,
-0.0239410400390625,
-0.005153656005859375,
-0.002086639404296875,
-0.01085662841796875
]
] |
kakaobrain/align-base | 2023-03-08T11:02:27.000Z | [
"transformers",
"pytorch",
"align",
"zero-shot-image-classification",
"vision",
"multi-modal",
"en",
"dataset:coyo-700m",
"arxiv:2102.05918",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | kakaobrain | null | null | kakaobrain/align-base | 15 | 15,569 | transformers | 2023-02-24T15:23:00 | ---
language: en
tags:
- align
- vision
- multi-modal
datasets:
- coyo-700m
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
---
# ALIGN (base model)
The [ALIGN](https://arxiv.org/abs/2102.05918) model was proposed in "Scaling Up Visual and Vision-Language Representation Learning With Noisy Text Supervision" by Chao Jia, Yinfei Yang, Ye Xia, Yi-Ting Chen, Zarana Parekh, Hieu Pham, Quoc V. Le, Yunhsuan Sung, Zhen Li, Tom Duerig.
ALIGN features a dual-encoder architecture with [EfficientNet](https://huggingface.co/docs/transformers/main/en/model_doc/efficientnet#efficientnet) as its vision encoder and [BERT](https://huggingface.co/docs/transformers/main/en/model_doc/bert) as its text encoder, and learns to align visual and text representations with contrastive learning. Unlike previous work, ALIGN leverages a massive noisy dataset and shows that the scale of the corpus can be used to achieve SOTA representations with a simple recipe.
The code for ALIGN was not publicly released, the base model is converted from the original implementation of the Kakao Brain team. This implementation follows the same architecture and hyperparameters as provided in the original Google model but is trained on the open source [COYO](https://github.com/kakaobrain/coyo-dataset) dataset. Google’s [ALIGN](https://ai.googleblog.com/2021/05/align-scaling-up-visual-and-vision.html) model, while trained on a huge dataset of 1.8 billion image-text pairs, cannot be replicated as the datasets is not public. Kakao Brain's ALIGN is on-par or outperforms Google ALIGN's reported metrics despite being trained on the much smaller, albeit carefully curated COYO-700M dataset.
<p>
<center>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/132_vit_align/align-performance.png" alt="ALIGN performance"/>
</center>
</p>
## COYO-700M Dataset
[COYO](https://github.com/kakaobrain/coyo-dataset#dataset-preview) is an image-text dataset of 700 million pairs similar to Google's `ALIGN 1.8B` image-text dataset which is a collection of "noisy" alt-text and image pairs from webpages, but open-source. `COYO-700M` and `ALIGN 1.8B` are "noisy" because minimal filtering was applied. `COYO` is similar to the other open-source image-text dataset, `LAION` but with the following differences. While `LAION` 2B is a much larger dataset of 2 billion English pairs, compared to `COYO`’s 700 million pairs, `COYO` pairs come with more metadata that give users more flexibility and finer-grained control over usage. The following table shows the differences: `COYO` comes equipped with aesthetic scores for all pairs, more robust watermark scores, and face count data.
| COYO | LAION 2B| ALIGN 1.8B |
| :----: | :----: | :----: |
| Image-text similarity score calculated with CLIP ViT-B/32 and ViT-L/14 models, they are provided as metadata but nothing is filtered out so as to avoid possible elimination bias | Image-text similarity score provided with CLIP (ViT-B/32) - only examples above threshold 0.28 | Minimal, Frequency based filtering |
| NSFW filtering on images and text | NSFW filtering on images | [Google Cloud API](https://cloud.google.com/vision) |
| Face recognition (face count) data provided as meta-data | No face recognition data | NA |
| 700 million pairs all English | 2 billion English| 1.8 billion |
| From CC 2020 Oct - 2021 Aug| From CC 2014-2020| NA |
|Aesthetic Score | Aesthetic Score Partial | NA|
|More robust Watermark score | Watermark Score | NA|
|Hugging Face Hub | Hugging Face Hub | Not made public |
| English | English | English? |
COYO is available on the hub as a [dataset](https://huggingface.co/datasets/kakaobrain/coyo-700m).
## Use with Transformers
### Zero-Shot Image Classification
```python3
import requests
import torch
from PIL import Image
from transformers import AlignProcessor, AlignModel
processor = AlignProcessor.from_pretrained("kakaobrain/align-base")
model = AlignModel.from_pretrained("kakaobrain/align-base")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
candidate_labels = ["an image of a cat", "an image of a dog"]
inputs = processor(text=candidate_labels, images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
# this is the image-text similarity score
logits_per_image = outputs.logits_per_image
# we can take the softmax to get the label probabilities
probs = logits_per_image.softmax(dim=1)
print(probs)
```
### Multi-Modal Embedding Retrieval
```python3
import requests
import torch
from PIL import Image
from transformers import AlignProcessor, AlignModel
processor = AlignProcessor.from_pretrained("kakaobrain/align-base")
model = AlignModel.from_pretrained("kakaobrain/align-base")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
text = "an image of a cat"
inputs = processor(text=text, images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
# multi-modal text embedding
text_embeds = outputs.text_embeds
# multi-modal image embedding
image_embeds = outputs.image_embeds
```
Alternatively, retrieve image or text embeddings separately.
```python3
import requests
import torch
from PIL import Image
from transformers import AlignProcessor, AlignModel
processor = AlignProcessor.from_pretrained("kakaobrain/align-base")
model = AlignModel.from_pretrained("kakaobrain/align-base")
# image embeddings
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(images=image, return_tensors="pt")
image_embeds = model.get_image_features(
pixel_values=inputs['pixel_values'],
)
# text embeddings
text = "an image of a cat"
inputs = processor(text=text, return_tensors="pt")
text_embeds = model.get_text_features(
input_ids=inputs['input_ids'],
attention_mask=inputs['attention_mask'],
token_type_ids=inputs['token_type_ids'],
)
```
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models - the ALIGN paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
| 6,900 | [
[
-0.0280914306640625,
-0.054168701171875,
0.01348876953125,
0.00843048095703125,
-0.0268402099609375,
-0.0175628662109375,
-0.02069091796875,
-0.053680419921875,
0.017333984375,
0.01502227783203125,
-0.037384033203125,
-0.041717529296875,
-0.046478271484375,
0.0021610260009765625,
-0.01340484619140625,
0.06988525390625,
-0.006122589111328125,
0.00884246826171875,
-0.0288543701171875,
-0.0229034423828125,
-0.02508544921875,
-0.01381683349609375,
-0.0416259765625,
-0.020294189453125,
0.033355712890625,
0.01340484619140625,
0.057098388671875,
0.046722412109375,
0.06036376953125,
0.025115966796875,
0.0135955810546875,
-0.0031490325927734375,
-0.036590576171875,
-0.0142364501953125,
0.0088043212890625,
-0.0300445556640625,
-0.0293426513671875,
0.015228271484375,
0.03155517578125,
0.01702880859375,
0.0142822265625,
0.01158905029296875,
0.00019443035125732422,
0.03973388671875,
-0.051422119140625,
0.023590087890625,
-0.034088134765625,
-0.0012359619140625,
-0.0033969879150390625,
0.00971221923828125,
-0.028350830078125,
-0.0175933837890625,
0.0229034423828125,
-0.061767578125,
0.0181732177734375,
0.0027790069580078125,
0.1121826171875,
0.021148681640625,
-0.01438140869140625,
-0.0303802490234375,
-0.0178680419921875,
0.0709228515625,
-0.037994384765625,
0.034698486328125,
0.02294921875,
0.01361846923828125,
0.01238250732421875,
-0.0609130859375,
-0.049468994140625,
-0.01342010498046875,
-0.014739990234375,
0.026580810546875,
-0.0159149169921875,
0.000016927719116210938,
0.003406524658203125,
0.03363037109375,
-0.043914794921875,
0.0007395744323730469,
-0.03240966796875,
-0.0207672119140625,
0.056610107421875,
-0.005359649658203125,
0.041534423828125,
-0.0211639404296875,
-0.03802490234375,
-0.027191162109375,
-0.0275421142578125,
0.01409912109375,
0.0196075439453125,
0.0003898143768310547,
-0.0335693359375,
0.039703369140625,
-0.0012998580932617188,
0.05120849609375,
0.0187835693359375,
-0.016265869140625,
0.032135009765625,
-0.0213623046875,
-0.0338134765625,
-0.0035762786865234375,
0.08953857421875,
0.037261962890625,
0.004528045654296875,
-0.00014579296112060547,
0.004817962646484375,
0.02032470703125,
-0.0031585693359375,
-0.06744384765625,
-0.0309295654296875,
0.005199432373046875,
-0.032684326171875,
-0.0300140380859375,
-0.0025157928466796875,
-0.053619384765625,
-0.00958251953125,
-0.0148468017578125,
0.06365966796875,
-0.042694091796875,
-0.0089263916015625,
0.00875091552734375,
-0.00862884521484375,
0.028228759765625,
0.03265380859375,
-0.057647705078125,
0.0109405517578125,
0.0231170654296875,
0.06866455078125,
-0.01450347900390625,
-0.021759033203125,
0.0029621124267578125,
-0.01326751708984375,
-0.0153656005859375,
0.042938232421875,
-0.035736083984375,
-0.0002524852752685547,
-0.0152740478515625,
0.01328277587890625,
-0.016204833984375,
-0.031158447265625,
0.0413818359375,
-0.036529541015625,
0.03497314453125,
-0.0203399658203125,
-0.02996826171875,
-0.022491455078125,
0.0259857177734375,
-0.05987548828125,
0.0767822265625,
0.00946807861328125,
-0.0657958984375,
0.028656005859375,
-0.035125732421875,
-0.00086212158203125,
-0.00505828857421875,
-0.0164031982421875,
-0.059722900390625,
-0.023590087890625,
0.031646728515625,
0.043426513671875,
-0.0263824462890625,
0.0191650390625,
-0.0309295654296875,
-0.0295867919921875,
0.01522064208984375,
-0.0171966552734375,
0.07720947265625,
0.0111083984375,
-0.0243988037109375,
-0.0012311935424804688,
-0.045806884765625,
-0.00556182861328125,
0.0335693359375,
-0.01505279541015625,
-0.0223388671875,
-0.0202789306640625,
0.007579803466796875,
0.0219879150390625,
0.0288543701171875,
-0.0435791015625,
0.01016998291015625,
-0.0123138427734375,
0.029144287109375,
0.040130615234375,
-0.0156402587890625,
0.04632568359375,
-0.027984619140625,
0.02667236328125,
0.0121612548828125,
0.01474761962890625,
-0.026153564453125,
-0.04705810546875,
-0.06591796875,
-0.044036865234375,
0.0232391357421875,
0.031005859375,
-0.04840087890625,
0.03961181640625,
-0.0282135009765625,
-0.04486083984375,
-0.0615234375,
0.005748748779296875,
0.042144775390625,
0.05145263671875,
0.03814697265625,
-0.037872314453125,
-0.033172607421875,
-0.07196044921875,
-0.005687713623046875,
0.0091400146484375,
0.0105438232421875,
0.032440185546875,
0.051055908203125,
-0.0113983154296875,
0.058502197265625,
-0.036865234375,
-0.0281219482421875,
0.0007333755493164062,
0.0011987686157226562,
0.00768280029296875,
0.04864501953125,
0.0504150390625,
-0.068359375,
-0.0574951171875,
-0.00830078125,
-0.076171875,
0.01413726806640625,
-0.003513336181640625,
-0.0182952880859375,
0.0222930908203125,
0.0214691162109375,
-0.04486083984375,
0.0357666015625,
0.047607421875,
-0.021820068359375,
0.03790283203125,
-0.0006809234619140625,
0.0224609375,
-0.08642578125,
0.0128173828125,
0.0183258056640625,
-0.01123046875,
-0.038177490234375,
0.0095062255859375,
0.015716552734375,
-0.01039886474609375,
-0.036407470703125,
0.03704833984375,
-0.037872314453125,
0.00830078125,
0.0018558502197265625,
0.00951385498046875,
0.0203857421875,
0.05377197265625,
0.0120697021484375,
0.04583740234375,
0.05596923828125,
-0.038330078125,
0.033477783203125,
0.0298309326171875,
-0.0335693359375,
0.04656982421875,
-0.0584716796875,
0.01329803466796875,
-0.00946807861328125,
0.0169830322265625,
-0.08099365234375,
-0.0262908935546875,
0.028472900390625,
-0.054229736328125,
0.034332275390625,
-0.004116058349609375,
-0.0289764404296875,
-0.032958984375,
-0.045074462890625,
0.03924560546875,
0.036651611328125,
-0.05035400390625,
0.0244140625,
0.019683837890625,
0.01214599609375,
-0.0654296875,
-0.0599365234375,
-0.008880615234375,
-0.007781982421875,
-0.062469482421875,
0.04144287109375,
-0.002605438232421875,
0.01346588134765625,
0.02630615234375,
0.005645751953125,
-0.004703521728515625,
-0.0145263671875,
0.01568603515625,
0.0433349609375,
-0.0136566162109375,
0.006450653076171875,
-0.00677490234375,
0.00499725341796875,
-0.01190948486328125,
0.0014638900756835938,
0.049652099609375,
-0.020721435546875,
-0.011322021484375,
-0.0667724609375,
0.0022125244140625,
0.034423828125,
-0.0094451904296875,
0.064453125,
0.07257080078125,
-0.036956787109375,
0.0231781005859375,
-0.038543701171875,
-0.003131866455078125,
-0.037353515625,
0.034271240234375,
-0.02691650390625,
-0.05169677734375,
0.04376220703125,
0.0214691162109375,
-0.002109527587890625,
0.05499267578125,
0.044158935546875,
-0.015960693359375,
0.07666015625,
0.047882080078125,
0.0006694793701171875,
0.051361083984375,
-0.05230712890625,
0.004550933837890625,
-0.06878662109375,
-0.045684814453125,
-0.0209808349609375,
-0.035614013671875,
-0.047332763671875,
-0.033447265625,
0.0163421630859375,
0.0162200927734375,
-0.0238494873046875,
0.041351318359375,
-0.05377197265625,
0.021026611328125,
0.045806884765625,
0.038482666015625,
0.00787353515625,
0.01226806640625,
0.00011473894119262695,
-0.0196533203125,
-0.04248046875,
-0.024658203125,
0.080078125,
0.026885986328125,
0.0714111328125,
-0.0099334716796875,
0.044097900390625,
-0.0029850006103515625,
0.0186767578125,
-0.063720703125,
0.036956787109375,
-0.00909423828125,
-0.04998779296875,
-0.00795745849609375,
-0.0272216796875,
-0.0816650390625,
0.011474609375,
-0.0360107421875,
-0.053985595703125,
0.018157958984375,
0.01235198974609375,
-0.0100250244140625,
0.0181121826171875,
-0.07061767578125,
0.07403564453125,
-0.0136566162109375,
-0.047393798828125,
-0.0033740997314453125,
-0.051513671875,
0.03021240234375,
0.00980377197265625,
-0.00817108154296875,
-0.0065460205078125,
-0.0039043426513671875,
0.07598876953125,
-0.0240631103515625,
0.06109619140625,
-0.005344390869140625,
0.0102996826171875,
0.03643798828125,
-0.0151214599609375,
0.01255035400390625,
-0.004150390625,
0.009674072265625,
0.02813720703125,
0.0010080337524414062,
-0.033477783203125,
-0.038543701171875,
0.06427001953125,
-0.08135986328125,
-0.0233154296875,
-0.035552978515625,
-0.04559326171875,
0.01271820068359375,
0.019439697265625,
0.04547119140625,
0.032440185546875,
-0.0013275146484375,
0.03289794921875,
0.04620361328125,
-0.0298309326171875,
0.033599853515625,
0.0027484893798828125,
-0.014984130859375,
-0.05035400390625,
0.0640869140625,
0.013427734375,
0.02008056640625,
0.0301055908203125,
0.012115478515625,
-0.032379150390625,
-0.0023059844970703125,
-0.03485107421875,
0.027008056640625,
-0.04779052734375,
-0.03582763671875,
-0.05194091796875,
-0.019622802734375,
-0.046173095703125,
-0.025177001953125,
-0.03424072265625,
-0.02154541015625,
-0.033355712890625,
0.0177001953125,
0.03314208984375,
0.02862548828125,
0.00995635986328125,
0.0253448486328125,
-0.03875732421875,
0.0260772705078125,
0.0259246826171875,
0.03155517578125,
-0.00347900390625,
-0.04803466796875,
-0.0225067138671875,
0.006683349609375,
-0.044952392578125,
-0.06414794921875,
0.035247802734375,
0.02099609375,
0.0304412841796875,
0.033905029296875,
-0.01059722900390625,
0.064453125,
-0.01153564453125,
0.054931640625,
0.04876708984375,
-0.061309814453125,
0.040924072265625,
-0.0186004638671875,
0.026275634765625,
0.037353515625,
0.04925537109375,
-0.024017333984375,
-0.00913238525390625,
-0.056793212890625,
-0.05316162109375,
0.0648193359375,
0.0219879150390625,
0.006763458251953125,
0.0214996337890625,
0.026702880859375,
0.00035262107849121094,
0.00838470458984375,
-0.07574462890625,
-0.0254364013671875,
-0.023895263671875,
-0.045379638671875,
0.0010433197021484375,
0.0011110305786132812,
-0.0007100105285644531,
-0.042205810546875,
0.042816162109375,
-0.004550933837890625,
0.055023193359375,
0.032135009765625,
-0.0275115966796875,
-0.00893402099609375,
-0.01514434814453125,
0.0244598388671875,
0.01445770263671875,
-0.0172271728515625,
0.0029277801513671875,
0.00020945072174072266,
-0.053436279296875,
-0.0059661865234375,
0.00531005859375,
-0.02606201171875,
-0.0016870498657226562,
0.026214599609375,
0.072509765625,
0.01190948486328125,
-0.03179931640625,
0.050994873046875,
-0.00794219970703125,
-0.0207672119140625,
-0.0121002197265625,
0.003147125244140625,
0.009735107421875,
0.012786865234375,
0.0160369873046875,
0.0178070068359375,
0.0037212371826171875,
-0.049407958984375,
0.0113983154296875,
0.033294677734375,
-0.033660888671875,
-0.0361328125,
0.057403564453125,
-0.00040435791015625,
-0.01233673095703125,
0.049957275390625,
-0.00830078125,
-0.0379638671875,
0.0655517578125,
0.04931640625,
0.051361083984375,
-0.00699615478515625,
0.01329803466796875,
0.062164306640625,
0.0151214599609375,
-0.0004973411560058594,
0.01519012451171875,
0.0038204193115234375,
-0.064453125,
-0.0137176513671875,
-0.04364013671875,
-0.01361846923828125,
0.020751953125,
-0.05010986328125,
0.039520263671875,
-0.0347900390625,
-0.01910400390625,
0.0036220550537109375,
0.0235443115234375,
-0.07098388671875,
0.026031494140625,
0.0155029296875,
0.049072265625,
-0.06781005859375,
0.056549072265625,
0.060821533203125,
-0.055084228515625,
-0.0609130859375,
-0.019500732421875,
-0.0039520263671875,
-0.07318115234375,
0.03253173828125,
0.042816162109375,
0.006374359130859375,
0.0026912689208984375,
-0.07098388671875,
-0.06549072265625,
0.11322021484375,
0.03924560546875,
-0.0269012451171875,
0.0003802776336669922,
-0.005313873291015625,
0.03961181640625,
-0.027984619140625,
0.02410888671875,
0.0204315185546875,
0.0186309814453125,
0.0153045654296875,
-0.058746337890625,
0.009368896484375,
-0.0271148681640625,
-0.007495880126953125,
-0.0020599365234375,
-0.07305908203125,
0.06658935546875,
-0.02398681640625,
-0.0188140869140625,
0.00763702392578125,
0.04925537109375,
0.020660400390625,
0.03167724609375,
0.0241851806640625,
0.07421875,
0.046478271484375,
-0.018768310546875,
0.07080078125,
-0.0086517333984375,
0.0511474609375,
0.0675048828125,
0.01480865478515625,
0.060821533203125,
0.029449462890625,
-0.01151275634765625,
0.0435791015625,
0.05828857421875,
-0.0255126953125,
0.05230712890625,
-0.0208282470703125,
-0.004795074462890625,
-0.0071563720703125,
-0.01413726806640625,
-0.030181884765625,
0.035125732421875,
0.0194244384765625,
-0.040802001953125,
-0.0021762847900390625,
0.0023040771484375,
0.00646209716796875,
-0.0106658935546875,
-0.0173492431640625,
0.0419921875,
0.00336456298828125,
-0.038299560546875,
0.048736572265625,
-0.00888824462890625,
0.07379150390625,
-0.029022216796875,
0.01161956787109375,
0.0013551712036132812,
0.0194244384765625,
-0.0234375,
-0.05926513671875,
0.0105438232421875,
-0.01453399658203125,
-0.002590179443359375,
-0.00295257568359375,
0.05908203125,
-0.04376220703125,
-0.0347900390625,
0.0232391357421875,
0.007354736328125,
0.01013946533203125,
0.005680084228515625,
-0.083984375,
0.017486572265625,
-0.00736236572265625,
-0.0303497314453125,
0.021881103515625,
0.01032257080078125,
0.0026149749755859375,
0.044219970703125,
0.03564453125,
-0.0184326171875,
0.0185699462890625,
-0.01253509521484375,
0.06427001953125,
-0.032501220703125,
-0.0116729736328125,
-0.0592041015625,
0.03631591796875,
-0.0154876708984375,
-0.0306549072265625,
0.051544189453125,
0.049072265625,
0.07763671875,
-0.01461029052734375,
0.04119873046875,
-0.02691650390625,
-0.0030231475830078125,
-0.031280517578125,
0.0560302734375,
-0.0631103515625,
-0.0169525146484375,
-0.031158447265625,
-0.058746337890625,
-0.03900146484375,
0.060089111328125,
-0.0219879150390625,
0.00992584228515625,
0.054473876953125,
0.07269287109375,
-0.0167236328125,
-0.01947021484375,
0.020538330078125,
0.0038547515869140625,
0.01222991943359375,
0.051422119140625,
0.038299560546875,
-0.0804443359375,
0.046051025390625,
-0.043060302734375,
-0.0136566162109375,
-0.0219573974609375,
-0.047393798828125,
-0.0677490234375,
-0.061737060546875,
-0.03997802734375,
-0.0341796875,
-0.01409912109375,
0.053924560546875,
0.0616455078125,
-0.05682373046875,
0.0015954971313476562,
-0.00008183717727661133,
0.0001480579376220703,
-0.0076751708984375,
-0.0194244384765625,
0.04827880859375,
-0.0103302001953125,
-0.07781982421875,
-0.0183258056640625,
0.01305389404296875,
0.007537841796875,
-0.005161285400390625,
-0.0233154296875,
-0.032958984375,
-0.005519866943359375,
0.036956787109375,
0.01885986328125,
-0.043212890625,
-0.017364501953125,
0.0224761962890625,
-0.031402587890625,
0.0276641845703125,
0.0260772705078125,
-0.044952392578125,
0.04669189453125,
0.04827880859375,
0.03875732421875,
0.049957275390625,
-0.010498046875,
0.006649017333984375,
-0.043853759765625,
0.025054931640625,
0.006805419921875,
0.036712646484375,
0.035400390625,
-0.02978515625,
0.033721923828125,
0.037567138671875,
-0.04302978515625,
-0.05206298828125,
0.0004818439483642578,
-0.10455322265625,
-0.0027408599853515625,
0.0726318359375,
-0.0249786376953125,
-0.03955078125,
0.01702880859375,
-0.01947021484375,
0.0440673828125,
-0.027496337890625,
0.058349609375,
0.0435791015625,
-0.0022220611572265625,
-0.034881591796875,
-0.02203369140625,
0.0243377685546875,
0.02001953125,
-0.04901123046875,
-0.04144287109375,
0.024749755859375,
0.03338623046875,
0.0282745361328125,
0.0504150390625,
-0.0258941650390625,
0.02056884765625,
0.007579803466796875,
0.02685546875,
0.001148223876953125,
-0.01468658447265625,
-0.0246734619140625,
-0.0004477500915527344,
-0.01959228515625,
-0.033477783203125
]
] |
jplu/tf-xlm-roberta-base | 2020-12-11T21:48:00.000Z | [
"transformers",
"tf",
"xlm-roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | jplu | null | null | jplu/tf-xlm-roberta-base | 1 | 15,551 | transformers | 2022-03-02T23:29:05 | # Tensorflow XLM-RoBERTa
In this repository you will find different versions of the XLM-RoBERTa model for Tensorflow.
## XLM-RoBERTa
[XLM-RoBERTa](https://ai.facebook.com/blog/-xlm-r-state-of-the-art-cross-lingual-understanding-through-self-supervision/) is a scaled cross lingual sentence encoder. It is trained on 2.5T of data across 100 languages data filtered from Common Crawl. XLM-R achieves state-of-the-arts results on multiple cross lingual benchmarks.
## Model Weights
| Model | Downloads
| -------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `jplu/tf-xlm-roberta-base` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-base/config.json) • [`tf_model.h5`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-base/tf_model.h5)
| `jplu/tf-xlm-roberta-large` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-large/config.json) • [`tf_model.h5`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-large/tf_model.h5)
## Usage
With Transformers >= 2.4 the Tensorflow models of XLM-RoBERTa can be loaded like:
```python
from transformers import TFXLMRobertaModel
model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-base")
```
Or
```
model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-large")
```
## Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/jplu).
## Acknowledgments
Thanks to all the Huggingface team for the support and their amazing library!
| 1,693 | [
[
-0.0418701171875,
-0.048492431640625,
0.0255126953125,
0.0282440185546875,
-0.0032501220703125,
-0.00560760498046875,
-0.00911712646484375,
-0.0323486328125,
0.01154327392578125,
0.040618896484375,
-0.053466796875,
-0.03814697265625,
-0.072265625,
0.0199127197265625,
-0.0200653076171875,
0.08154296875,
-0.0240631103515625,
0.023345947265625,
-0.0108795166015625,
-0.03668212890625,
-0.02044677734375,
-0.0276031494140625,
-0.0611572265625,
-0.040008544921875,
0.0284271240234375,
-0.002105712890625,
0.05450439453125,
0.024658203125,
0.035247802734375,
0.0294342041015625,
-0.01947021484375,
0.00533294677734375,
-0.0294952392578125,
-0.007659912109375,
0.01459503173828125,
-0.04876708984375,
-0.055572509765625,
-0.0009918212890625,
0.057220458984375,
0.0254669189453125,
0.0022830963134765625,
0.01363372802734375,
-0.00884246826171875,
0.032073974609375,
-0.011688232421875,
0.01241302490234375,
0.00797271728515625,
0.0283660888671875,
0.0032024383544921875,
-0.00926971435546875,
-0.0195159912109375,
-0.01474761962890625,
0.02691650390625,
-0.032806396484375,
0.00394439697265625,
0.0121002197265625,
0.0926513671875,
0.0302734375,
-0.038055419921875,
-0.0067291259765625,
-0.04022216796875,
0.0760498046875,
-0.053466796875,
0.0552978515625,
0.0143890380859375,
0.0228424072265625,
0.00478363037109375,
-0.06585693359375,
-0.020599365234375,
-0.006595611572265625,
-0.027099609375,
0.008636474609375,
-0.039306640625,
-0.0003566741943359375,
0.028472900390625,
0.0247802734375,
-0.06793212890625,
0.0035305023193359375,
-0.046539306640625,
0.003726959228515625,
0.0273590087890625,
0.01080322265625,
0.035003662109375,
-0.014556884765625,
-0.044219970703125,
-0.0251007080078125,
-0.03875732421875,
-0.00833892822265625,
0.0215606689453125,
0.01253509521484375,
-0.052886962890625,
0.0294952392578125,
0.017974853515625,
0.0574951171875,
0.0153656005859375,
-0.00982666015625,
0.037628173828125,
-0.0078582763671875,
-0.029937744140625,
-0.01531982421875,
0.07684326171875,
0.0034942626953125,
0.0007467269897460938,
-0.0119171142578125,
-0.018798828125,
-0.0243377685546875,
0.015960693359375,
-0.08453369140625,
0.010894775390625,
0.041290283203125,
-0.0443115234375,
-0.024139404296875,
0.0007853507995605469,
-0.03265380859375,
0.0149993896484375,
-0.021636962890625,
0.041015625,
-0.0460205078125,
-0.03857421875,
0.005924224853515625,
-0.015106201171875,
0.0234222412109375,
0.003810882568359375,
-0.04986572265625,
0.033050537109375,
0.050933837890625,
0.06964111328125,
-0.01067352294921875,
-0.02044677734375,
-0.02911376953125,
-0.01436614990234375,
-0.0188140869140625,
0.03759765625,
-0.0179290771484375,
-0.020782470703125,
-0.0005745887756347656,
0.0245361328125,
-0.0172271728515625,
-0.0306854248046875,
0.04058837890625,
-0.042388916015625,
0.043731689453125,
0.00926971435546875,
-0.0528564453125,
-0.0369873046875,
0.02911376953125,
-0.045196533203125,
0.0687255859375,
0.0421142578125,
-0.046417236328125,
0.007843017578125,
-0.060821533203125,
-0.0164642333984375,
-0.007129669189453125,
0.0013475418090820312,
-0.0694580078125,
-0.0128631591796875,
0.0020198822021484375,
0.042236328125,
-0.0213165283203125,
0.00862884521484375,
-0.032501220703125,
-0.010162353515625,
0.0096893310546875,
0.00960540771484375,
0.10137939453125,
0.0293426513671875,
-0.0200042724609375,
0.02655029296875,
-0.04229736328125,
0.00252532958984375,
0.0129547119140625,
-0.00418853759765625,
0.0006427764892578125,
-0.049102783203125,
0.03497314453125,
0.0109100341796875,
0.00786590576171875,
-0.0264892578125,
0.0047760009765625,
-0.004718780517578125,
0.046051025390625,
0.03302001953125,
-0.01132965087890625,
0.03521728515625,
-0.047607421875,
0.062744140625,
0.0199737548828125,
0.0204010009765625,
-0.0026836395263671875,
-0.036865234375,
-0.058685302734375,
-0.024200439453125,
0.0291900634765625,
0.0283660888671875,
-0.043975830078125,
0.0293426513671875,
-0.006198883056640625,
-0.051605224609375,
-0.042022705078125,
0.00882720947265625,
0.04351806640625,
0.026519775390625,
0.03143310546875,
-0.0038909912109375,
-0.053955078125,
-0.05352783203125,
-0.0018815994262695312,
0.00659942626953125,
0.0002751350402832031,
0.0362548828125,
0.048583984375,
-0.02996826171875,
0.0418701171875,
-0.0279541015625,
-0.04302978515625,
-0.031280517578125,
0.0243377685546875,
0.0247650146484375,
0.04986572265625,
0.08807373046875,
-0.056121826171875,
-0.040130615234375,
-0.0173492431640625,
-0.062042236328125,
0.0071868896484375,
0.0012598037719726562,
-0.0224609375,
0.047332763671875,
0.022247314453125,
-0.056488037109375,
0.01297760009765625,
0.06585693359375,
-0.044708251953125,
0.025238037109375,
-0.021209716796875,
0.007236480712890625,
-0.108154296875,
0.000023066997528076172,
0.0245208740234375,
-0.040374755859375,
-0.03216552734375,
0.0350341796875,
0.012237548828125,
0.0063934326171875,
-0.035400390625,
0.046234130859375,
-0.04779052734375,
-0.00876617431640625,
-0.0012569427490234375,
0.00032806396484375,
0.0162353515625,
0.038055419921875,
0.01343536376953125,
0.03369140625,
0.05169677734375,
-0.0235137939453125,
0.03411865234375,
0.0408935546875,
-0.0118560791015625,
0.036407470703125,
-0.066650390625,
0.01009368896484375,
-0.00844573974609375,
0.0279083251953125,
-0.048126220703125,
-0.0010099411010742188,
0.0269775390625,
-0.041748046875,
0.050140380859375,
-0.039825439453125,
-0.030364990234375,
-0.026519775390625,
-0.0025043487548828125,
0.028350830078125,
0.054229736328125,
-0.0511474609375,
0.07086181640625,
0.0245208740234375,
0.00862884521484375,
-0.03314208984375,
-0.06634521484375,
-0.00382232666015625,
-0.039306640625,
-0.060760498046875,
0.0360107421875,
-0.0277099609375,
-0.0084381103515625,
0.0104827880859375,
0.018463134765625,
-0.0119171142578125,
-0.00988006591796875,
0.0258941650390625,
0.03912353515625,
-0.032318115234375,
-0.0177764892578125,
-0.0164794921875,
-0.0188140869140625,
0.00823211669921875,
-0.022674560546875,
0.047821044921875,
-0.0242919921875,
0.009246826171875,
-0.03875732421875,
0.007965087890625,
0.036285400390625,
-0.01268768310546875,
0.06353759765625,
0.083740234375,
-0.0286407470703125,
-0.0164337158203125,
-0.038421630859375,
-0.0112152099609375,
-0.032867431640625,
0.015899658203125,
-0.024810791015625,
-0.0738525390625,
0.047332763671875,
0.0278778076171875,
0.0005617141723632812,
0.034088134765625,
0.041656494140625,
0.01491546630859375,
0.072021484375,
0.0538330078125,
-0.00372314453125,
0.0450439453125,
-0.056182861328125,
0.00047135353088378906,
-0.06988525390625,
-0.007442474365234375,
-0.043975830078125,
-0.01276397705078125,
-0.045440673828125,
-0.03436279296875,
0.02569580078125,
-0.0100250244140625,
-0.0261077880859375,
0.061859130859375,
-0.0343017578125,
0.004009246826171875,
0.043853759765625,
0.01387786865234375,
0.00786590576171875,
-0.0173187255859375,
-0.0023670196533203125,
0.01212310791015625,
-0.041717529296875,
-0.0153656005859375,
0.0887451171875,
0.02099609375,
0.036285400390625,
0.0199127197265625,
0.05560302734375,
-0.00899505615234375,
0.0254669189453125,
-0.050811767578125,
0.0160369873046875,
-0.0023326873779296875,
-0.05352783203125,
-0.00179290771484375,
-0.050689697265625,
-0.0570068359375,
0.00130462646484375,
-0.021148681640625,
-0.038421630859375,
-0.00566864013671875,
-0.004150390625,
-0.03277587890625,
0.0308685302734375,
-0.041839599609375,
0.0833740234375,
-0.0152435302734375,
-0.0153656005859375,
-0.0039043426513671875,
-0.03173828125,
0.025146484375,
-0.0027942657470703125,
-0.0013837814331054688,
0.00489044189453125,
0.007049560546875,
0.046478271484375,
-0.04742431640625,
0.03680419921875,
-0.0161895751953125,
0.001956939697265625,
0.0165252685546875,
-0.001819610595703125,
0.02716064453125,
0.004444122314453125,
-0.0052337646484375,
0.035919189453125,
0.01947021484375,
-0.017181396484375,
-0.0396728515625,
0.05853271484375,
-0.095947265625,
-0.03558349609375,
-0.034149169921875,
-0.0411376953125,
-0.001148223876953125,
0.02587890625,
0.0264892578125,
0.03778076171875,
-0.00315093994140625,
0.0212860107421875,
0.04339599609375,
-0.0196533203125,
0.022186279296875,
0.040863037109375,
-0.0296478271484375,
-0.029449462890625,
0.032623291015625,
0.0125732421875,
0.01311492919921875,
0.03558349609375,
0.01177215576171875,
-0.0248870849609375,
-0.03485107421875,
-0.0341796875,
0.016448974609375,
-0.032928466796875,
-0.03271484375,
-0.0640869140625,
-0.0297088623046875,
-0.045166015625,
-0.002590179443359375,
-0.0220794677734375,
-0.03533935546875,
-0.0345458984375,
-0.00891876220703125,
0.05755615234375,
0.0523681640625,
-0.0080413818359375,
0.025970458984375,
-0.069091796875,
0.019439697265625,
-0.0007152557373046875,
0.03216552734375,
-0.0067291259765625,
-0.0601806640625,
-0.004913330078125,
0.0010652542114257812,
-0.033660888671875,
-0.035125732421875,
0.0556640625,
0.00737762451171875,
0.033203125,
0.034027099609375,
-0.003627777099609375,
0.057220458984375,
-0.0234832763671875,
0.043548583984375,
0.0167083740234375,
-0.0540771484375,
0.0204010009765625,
-0.0295562744140625,
0.0077972412109375,
-0.0098419189453125,
0.0560302734375,
-0.033905029296875,
-0.024871826171875,
-0.055206298828125,
-0.046112060546875,
0.07684326171875,
0.0195770263671875,
0.003818511962890625,
0.01531982421875,
0.015625,
0.00038242340087890625,
-0.01238250732421875,
-0.045379638671875,
-0.02874755859375,
-0.0206451416015625,
-0.021881103515625,
-0.01348876953125,
-0.010498046875,
-0.0208892822265625,
-0.0111236572265625,
0.0662841796875,
-0.0018444061279296875,
0.029876708984375,
0.0147247314453125,
-0.00786590576171875,
-0.035125732421875,
-0.0177001953125,
0.04022216796875,
0.0300445556640625,
-0.030609130859375,
-0.0238189697265625,
0.01175689697265625,
-0.0081634521484375,
-0.0233917236328125,
0.0276947021484375,
-0.008880615234375,
0.005863189697265625,
0.0227813720703125,
0.077880859375,
0.036834716796875,
-0.0291595458984375,
0.03668212890625,
-0.0284576416015625,
-0.0274658203125,
-0.06158447265625,
0.0038089752197265625,
0.0227508544921875,
0.045684814453125,
0.006679534912109375,
0.017669677734375,
-0.00482940673828125,
-0.030853271484375,
0.01165008544921875,
0.03546142578125,
-0.03521728515625,
-0.0243377685546875,
0.0732421875,
-0.0144500732421875,
-0.0235595703125,
0.052581787109375,
0.0022754669189453125,
-0.0335693359375,
0.065673828125,
0.0478515625,
0.0792236328125,
-0.005031585693359375,
0.00746917724609375,
0.049652099609375,
-0.0008778572082519531,
-0.005443572998046875,
0.0194244384765625,
0.0013294219970703125,
-0.064208984375,
-0.01102447509765625,
-0.059783935546875,
-0.0312347412109375,
-0.0007457733154296875,
-0.065185546875,
0.03997802734375,
-0.0343017578125,
-0.0287933349609375,
0.01369476318359375,
0.01488494873046875,
-0.067626953125,
0.0015134811401367188,
0.01100921630859375,
0.07421875,
-0.03802490234375,
0.057464599609375,
0.0570068359375,
-0.052886962890625,
-0.0751953125,
-0.0369873046875,
0.0147247314453125,
-0.0753173828125,
0.04742431640625,
-0.001186370849609375,
0.0171051025390625,
0.0014181137084960938,
-0.028778076171875,
-0.0760498046875,
0.1075439453125,
0.01224517822265625,
-0.0136566162109375,
-0.00904083251953125,
0.01032257080078125,
0.03857421875,
-0.037872314453125,
0.05084228515625,
0.02520751953125,
0.038421630859375,
0.0260162353515625,
-0.06195068359375,
0.01486968994140625,
-0.0233917236328125,
0.01294708251953125,
0.00608062744140625,
-0.0770263671875,
0.07318115234375,
-0.020050048828125,
0.0137786865234375,
0.0328369140625,
0.049835205078125,
0.0294952392578125,
-0.00592041015625,
0.0361328125,
0.0633544921875,
0.02459716796875,
-0.02130126953125,
0.073486328125,
-0.047332763671875,
0.059478759765625,
0.054412841796875,
0.00007522106170654297,
0.045562744140625,
0.0188446044921875,
-0.01184844970703125,
0.043304443359375,
0.055389404296875,
-0.029388427734375,
0.007549285888671875,
-0.01035308837890625,
0.005153656005859375,
-0.02105712890625,
0.01227569580078125,
-0.0260772705078125,
0.0265655517578125,
0.0173187255859375,
-0.04443359375,
-0.01074981689453125,
0.004047393798828125,
0.029937744140625,
-0.0279083251953125,
-0.017547607421875,
0.04351806640625,
0.0230712890625,
-0.055511474609375,
0.0570068359375,
0.00016880035400390625,
0.05352783203125,
-0.046142578125,
0.0019083023071289062,
-0.01513671875,
0.0322265625,
-0.0233612060546875,
-0.05084228515625,
0.031280517578125,
0.00021076202392578125,
0.003948211669921875,
-0.043975830078125,
0.03369140625,
-0.03887939453125,
-0.045806884765625,
0.046844482421875,
0.027008056640625,
0.01264190673828125,
0.018341064453125,
-0.0745849609375,
0.023529052734375,
-0.01189422607421875,
-0.05352783203125,
0.030364990234375,
0.0284576416015625,
0.0220947265625,
0.061859130859375,
0.036834716796875,
-0.0071258544921875,
0.002044677734375,
0.0091400146484375,
0.051239013671875,
-0.032196044921875,
-0.02716064453125,
-0.035400390625,
0.03558349609375,
0.00991058349609375,
-0.038299560546875,
0.050811767578125,
0.034698486328125,
0.0709228515625,
-0.010284423828125,
0.042755126953125,
-0.0177764892578125,
0.027984619140625,
-0.024200439453125,
0.0692138671875,
-0.07073974609375,
-0.0101165771484375,
-0.022216796875,
-0.05291748046875,
-0.033172607421875,
0.05938720703125,
-0.006595611572265625,
0.033905029296875,
0.040008544921875,
0.0592041015625,
-0.01374053955078125,
0.00975799560546875,
0.0238037109375,
0.044891357421875,
0.0169677734375,
0.038177490234375,
0.0226898193359375,
-0.05877685546875,
0.047088623046875,
-0.0269775390625,
-0.0185394287109375,
-0.01546478271484375,
-0.0552978515625,
-0.079345703125,
-0.044708251953125,
-0.04058837890625,
-0.050201416015625,
-0.006267547607421875,
0.08355712890625,
0.07257080078125,
-0.058868408203125,
-0.020111083984375,
-0.006816864013671875,
0.005008697509765625,
-0.0171051025390625,
-0.0204620361328125,
0.0301361083984375,
-0.024810791015625,
-0.09228515625,
-0.0007557868957519531,
-0.0024166107177734375,
0.01824951171875,
-0.0252685546875,
-0.0299835205078125,
-0.0013074874877929688,
-0.0124053955078125,
0.029052734375,
0.02349853515625,
-0.03802490234375,
-0.026611328125,
0.0021305084228515625,
-0.005107879638671875,
0.0162506103515625,
0.0189971923828125,
-0.041839599609375,
0.001995086669921875,
0.032867431640625,
0.0194244384765625,
0.0595703125,
-0.02203369140625,
0.047210693359375,
-0.03167724609375,
0.0196075439453125,
0.01009368896484375,
0.0692138671875,
0.0191802978515625,
-0.016448974609375,
0.0452880859375,
-0.002323150634765625,
-0.049102783203125,
-0.055755615234375,
0.0009341239929199219,
-0.08831787109375,
-0.0095062255859375,
0.07720947265625,
-0.037567138671875,
-0.03564453125,
0.02642822265625,
0.00006812810897827148,
0.036407470703125,
-0.01513671875,
0.050262451171875,
0.0380859375,
-0.0095367431640625,
-0.044708251953125,
-0.028350830078125,
0.032440185546875,
0.0267181396484375,
-0.038055419921875,
-0.0260162353515625,
0.009857177734375,
0.05352783203125,
0.01355743408203125,
0.0200653076171875,
-0.0299224853515625,
0.005901336669921875,
-0.00823974609375,
0.0094146728515625,
-0.022003173828125,
-0.006877899169921875,
-0.01666259765625,
-0.00881195068359375,
-0.0254058837890625,
0.00598907470703125
]
] |
digiplay/majicMIX_realistic_v6 | 2023-08-03T22:18:24.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"license:other",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | digiplay | null | null | digiplay/majicMIX_realistic_v6 | 33 | 15,546 | diffusers | 2023-06-12T21:16:15 | ---
license: other
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
---
Model info :
https://civitai.com/models/43331?modelVersionId=94640
| 193 | [
[
-0.0229644775390625,
0.0292816162109375,
0.039306640625,
0.0309906005859375,
-0.032379150390625,
-0.017822265625,
0.038909912109375,
-0.00966644287109375,
0.0171051025390625,
0.029693603515625,
-0.050750732421875,
0.0004456043243408203,
0.01313018798828125,
-0.032562255859375,
-0.0222320556640625,
0.0390625,
0.036163330078125,
0.041259765625,
0.0040283203125,
0.0084381103515625,
-0.0194244384765625,
-0.01352691650390625,
-0.053131103515625,
0.007282257080078125,
0.032745361328125,
0.04730224609375,
0.034881591796875,
0.019500732421875,
0.05194091796875,
0.019012451171875,
-0.006740570068359375,
-0.0011625289916992188,
-0.0170745849609375,
-0.0011301040649414062,
0.006267547607421875,
-0.03936767578125,
-0.0634765625,
-0.029205322265625,
0.0280914306640625,
0.043426513671875,
0.0010395050048828125,
0.03082275390625,
-0.006351470947265625,
0.0206756591796875,
-0.04949951171875,
0.0271453857421875,
0.00299835205078125,
0.00606536865234375,
-0.037872314453125,
-0.01812744140625,
-0.0307769775390625,
-0.0308990478515625,
-0.00579833984375,
-0.034912109375,
0.023223876953125,
-0.010284423828125,
0.092529296875,
0.03143310546875,
-0.0297393798828125,
-0.0239715576171875,
-0.044586181640625,
0.0182647705078125,
-0.03900146484375,
0.05926513671875,
0.036102294921875,
0.03369140625,
-0.030029296875,
-0.0294647216796875,
-0.0039005279541015625,
0.013519287109375,
-0.0027179718017578125,
0.00559234619140625,
-0.01284027099609375,
-0.005062103271484375,
0.02374267578125,
0.0516357421875,
-0.05096435546875,
0.0027828216552734375,
-0.07696533203125,
0.005336761474609375,
0.031951904296875,
-0.01120758056640625,
0.01153564453125,
-0.01166534423828125,
-0.02960205078125,
0.02685546875,
-0.031463623046875,
-0.0171966552734375,
0.031585693359375,
0.00740814208984375,
-0.03424072265625,
0.028228759765625,
-0.02264404296875,
0.07110595703125,
0.0157470703125,
0.0191497802734375,
0.01495361328125,
-0.03216552734375,
-0.055572509765625,
-0.00606536865234375,
-0.0030574798583984375,
0.03265380859375,
-0.0017404556274414062,
-0.0020694732666015625,
0.01360321044921875,
-0.0498046875,
0.0135345458984375,
-0.06378173828125,
-0.0215301513671875,
0.0210113525390625,
-0.03253173828125,
-0.0270233154296875,
0.04541015625,
-0.05096435546875,
-0.0204620361328125,
0.02960205078125,
0.00923919677734375,
-0.01222991943359375,
-0.03448486328125,
0.0128021240234375,
-0.019500732421875,
0.02862548828125,
0.032196044921875,
-0.06719970703125,
0.0328369140625,
0.0214080810546875,
0.05731201171875,
0.019012451171875,
0.01267242431640625,
-0.00847625732421875,
0.0105743408203125,
0.00200653076171875,
0.0272064208984375,
-0.009246826171875,
-0.05029296875,
0.0173492431640625,
0.0282135009765625,
-0.00946044921875,
-0.043304443359375,
0.07305908203125,
-0.07861328125,
-0.0028629302978515625,
-0.0267791748046875,
-0.0229339599609375,
-0.031158447265625,
0.0224151611328125,
-0.080810546875,
0.050445556640625,
-0.01861572265625,
-0.0616455078125,
-0.003032684326171875,
-0.05096435546875,
-0.0251922607421875,
0.0218658447265625,
0.0245513916015625,
-0.035308837890625,
0.033203125,
-0.052764892578125,
0.005748748779296875,
-0.0282135009765625,
0.004390716552734375,
-0.03082275390625,
-0.01546478271484375,
0.0083770751953125,
-0.0352783203125,
0.086669921875,
0.046112060546875,
0.01261138916015625,
0.0137786865234375,
-0.07763671875,
-0.0132598876953125,
0.0265960693359375,
-0.006252288818359375,
0.0087432861328125,
-0.03656005859375,
-0.00933074951171875,
0.0167999267578125,
0.039947509765625,
-0.02655029296875,
-0.0167999267578125,
0.018798828125,
0.004161834716796875,
0.01102447509765625,
0.00567626953125,
-0.0189208984375,
-0.03515625,
0.045745849609375,
-0.00939178466796875,
0.05859375,
0.00969696044921875,
-0.058746337890625,
-0.0987548828125,
-0.0176239013671875,
-0.00074005126953125,
0.022003173828125,
-0.041900634765625,
0.03570556640625,
0.0170135498046875,
-0.057769775390625,
-0.0076904296875,
-0.017730712890625,
0.00945281982421875,
0.022674560546875,
0.0022602081298828125,
-0.037841796875,
-0.0455322265625,
-0.07373046875,
0.0098419189453125,
-0.00238037109375,
-0.01190185546875,
0.023651123046875,
0.030181884765625,
0.006992340087890625,
0.05029296875,
-0.0408935546875,
-0.020416259765625,
-0.00908660888671875,
0.002140045166015625,
0.035888671875,
0.020721435546875,
0.07080078125,
-0.0821533203125,
-0.0309906005859375,
-0.01238250732421875,
-0.0439453125,
-0.01885986328125,
0.0202789306640625,
-0.01163482666015625,
-0.0077667236328125,
0.0269012451171875,
-0.052764892578125,
0.058258056640625,
0.040496826171875,
-0.0128326416015625,
0.0272979736328125,
-0.037322998046875,
0.03399658203125,
-0.07037353515625,
0.0297698974609375,
0.0064697265625,
-0.0170745849609375,
-0.0218048095703125,
0.0028781890869140625,
0.0251922607421875,
0.0248260498046875,
-0.06597900390625,
-0.014373779296875,
-0.0670166015625,
-0.006687164306640625,
-0.021453857421875,
-0.0124664306640625,
-0.0012617111206054688,
0.0232086181640625,
0.00754547119140625,
0.0291595458984375,
0.0112152099609375,
-0.04840087890625,
0.047576904296875,
-0.0057525634765625,
-0.04742431640625,
0.0091552734375,
-0.057220458984375,
-0.001445770263671875,
0.0014734268188476562,
0.004154205322265625,
-0.057586669921875,
-0.042633056640625,
-0.004058837890625,
-0.0157318115234375,
-0.01364898681640625,
-0.0159912109375,
-0.032745361328125,
-0.044036865234375,
-0.01090240478515625,
0.04632568359375,
0.005336761474609375,
-0.0418701171875,
0.0080413818359375,
0.0131683349609375,
-0.00841522216796875,
0.01094818115234375,
-0.052032470703125,
-0.030731201171875,
-0.0288848876953125,
-0.032470703125,
0.0233306884765625,
-0.01476287841796875,
-0.04119873046875,
-0.03155517578125,
0.0162506103515625,
-0.06268310546875,
0.0130767822265625,
0.03070068359375,
0.037872314453125,
0.000209808349609375,
0.0274810791015625,
-0.0169830322265625,
-0.00339508056640625,
0.026336669921875,
0.0191192626953125,
0.0694580078125,
-0.01153564453125,
-0.007541656494140625,
-0.058563232421875,
0.0158843994140625,
0.069091796875,
-0.03558349609375,
0.05731201171875,
0.019134521484375,
-0.080810546875,
0.0059356689453125,
-0.042816162109375,
0.00998687744140625,
-0.0321044921875,
0.0178070068359375,
-0.047210693359375,
0.0204925537109375,
0.0233001708984375,
0.001964569091796875,
-0.03460693359375,
0.05023193359375,
0.038665771484375,
0.0057525634765625,
0.037750244140625,
0.0626220703125,
0.05352783203125,
0.0450439453125,
-0.0284881591796875,
-0.00356292724609375,
-0.049346923828125,
-0.042755126953125,
-0.042510986328125,
0.015625,
-0.007251739501953125,
-0.045379638671875,
-0.0050201416015625,
0.013031005859375,
-0.04986572265625,
0.0211944580078125,
-0.01512908935546875,
0.038665771484375,
0.058349609375,
0.01416778564453125,
0.026397705078125,
-0.0672607421875,
-0.0012617111206054688,
0.00908660888671875,
-0.02471923828125,
-0.027008056640625,
0.04522705078125,
0.0117340087890625,
0.03497314453125,
0.05023193359375,
0.0222625732421875,
0.0165863037109375,
0.005222320556640625,
-0.0240325927734375,
0.00794219970703125,
0.0041961669921875,
-0.0777587890625,
0.0174560546875,
0.0131378173828125,
-0.034332275390625,
0.011566162109375,
-0.005157470703125,
-0.0240478515625,
0.045623779296875,
0.0194549560546875,
-0.040618896484375,
0.0374755859375,
-0.0701904296875,
0.056549072265625,
-0.058013916015625,
-0.0189056396484375,
0.0213623046875,
-0.0269775390625,
0.0528564453125,
0.0064697265625,
0.016998291015625,
0.0218658447265625,
0.0173187255859375,
0.019622802734375,
-0.0697021484375,
0.04876708984375,
-0.0170135498046875,
0.0117034912109375,
-0.0144805908203125,
-0.017303466796875,
0.017852783203125,
0.02069091796875,
-0.01451873779296875,
-0.038543701171875,
-0.0177154541015625,
-0.0216064453125,
-0.00394439697265625,
0.06591796875,
-0.0494384765625,
-0.0460205078125,
-0.06402587890625,
-0.0081634521484375,
-0.01399993896484375,
0.0128631591796875,
0.05938720703125,
0.018402099609375,
-0.036865234375,
0.005306243896484375,
0.07708740234375,
0.00048542022705078125,
0.0731201171875,
0.0171661376953125,
-0.01983642578125,
-0.040802001953125,
0.01025390625,
-0.0033969879150390625,
0.0400390625,
0.0275726318359375,
-0.0158233642578125,
0.0000508427619934082,
-0.01430511474609375,
-0.0292205810546875,
0.034088134765625,
-0.00594329833984375,
-0.02410888671875,
-0.02178955078125,
-0.0294189453125,
-0.01175689697265625,
0.0124664306640625,
-0.05572509765625,
-0.0155792236328125,
-0.042205810546875,
-0.0234832763671875,
0.026336669921875,
0.07611083984375,
0.0251617431640625,
0.0239410400390625,
-0.01666259765625,
0.0014562606811523438,
0.0305328369140625,
0.04241943359375,
0.01212310791015625,
-0.0391845703125,
-0.019500732421875,
0.01177978515625,
-0.062103271484375,
-0.07696533203125,
0.0491943359375,
-0.01151275634765625,
0.011077880859375,
0.045623779296875,
0.02545166015625,
0.0771484375,
-0.03594970703125,
0.05206298828125,
0.0306549072265625,
-0.070068359375,
0.044830322265625,
-0.006671905517578125,
0.04608154296875,
0.033966064453125,
0.05908203125,
-0.01029205322265625,
0.031158447265625,
-0.050506591796875,
-0.0308990478515625,
0.01251220703125,
0.0032978057861328125,
-0.02972412109375,
0.045989990234375,
0.0150909423828125,
0.0303802490234375,
0.00722503662109375,
-0.054443359375,
-0.0279388427734375,
-0.03033447265625,
-0.00650787353515625,
0.01015472412109375,
-0.02496337890625,
0.003162384033203125,
-0.032073974609375,
0.06475830078125,
0.0124359130859375,
0.01171875,
0.033294677734375,
-0.015960693359375,
0.00868988037109375,
0.01200103759765625,
0.07781982421875,
0.050933837890625,
-0.07080078125,
0.01303863525390625,
-0.024261474609375,
-0.0188751220703125,
-0.02301025390625,
0.034576416015625,
0.0032138824462890625,
0.0186920166015625,
-0.0016326904296875,
0.06146240234375,
0.0228118896484375,
-0.0298919677734375,
0.0406494140625,
-0.031463623046875,
0.0224151611328125,
-0.06402587890625,
-0.00705718994140625,
0.0244598388671875,
0.0293731689453125,
0.02093505859375,
-0.00399017333984375,
0.049163818359375,
-0.044586181640625,
0.002979278564453125,
0.005489349365234375,
-0.0538330078125,
-0.050811767578125,
0.08892822265625,
0.045989990234375,
-0.05963134765625,
0.064208984375,
-0.01274871826171875,
-0.017822265625,
0.041961669921875,
0.0479736328125,
0.07147216796875,
-0.03277587890625,
0.0218658447265625,
0.0192413330078125,
-0.0064697265625,
0.010894775390625,
0.030181884765625,
-0.0087738037109375,
-0.022369384765625,
0.04217529296875,
-0.004444122314453125,
-0.053314208984375,
0.00850677490234375,
-0.09661865234375,
0.039520263671875,
-0.0396728515625,
-0.02056884765625,
0.0160064697265625,
0.01041412353515625,
-0.050079345703125,
0.05926513671875,
0.019073486328125,
0.10430908203125,
-0.05633544921875,
0.10906982421875,
0.0258636474609375,
-0.059295654296875,
-0.01049041748046875,
-0.0212860107421875,
-0.0177459716796875,
-0.037506103515625,
0.01739501953125,
0.01259613037109375,
-0.0037860870361328125,
-0.031463623046875,
-0.0418701171875,
-0.07373046875,
0.101806640625,
0.0350341796875,
-0.050537109375,
0.003025054931640625,
-0.0235748291015625,
0.02215576171875,
-0.040740966796875,
0.039764404296875,
0.0293731689453125,
0.0298919677734375,
0.02386474609375,
-0.088623046875,
-0.0066375732421875,
-0.0655517578125,
0.006221771240234375,
-0.01251983642578125,
-0.09820556640625,
0.0509033203125,
-0.039642333984375,
0.007488250732421875,
0.041229248046875,
0.03778076171875,
0.017578125,
0.00446319580078125,
0.0323486328125,
0.0419921875,
0.0672607421875,
-0.01226043701171875,
0.08074951171875,
-0.010986328125,
0.0217437744140625,
0.0899658203125,
-0.023529052734375,
0.05194091796875,
0.035247802734375,
0.00563812255859375,
0.044891357421875,
0.070068359375,
-0.0182342529296875,
0.049041748046875,
0.0027408599853515625,
-0.040435791015625,
-0.006000518798828125,
-0.022491455078125,
-0.0416259765625,
0.0215911865234375,
0.00829315185546875,
-0.07110595703125,
0.01580810546875,
-0.021148681640625,
0.0160675048828125,
-0.0193328857421875,
-0.028228759765625,
0.0279998779296875,
-0.022674560546875,
-0.04962158203125,
0.0162811279296875,
0.00006240606307983398,
0.011505126953125,
-0.04638671875,
0.0135498046875,
-0.0145111083984375,
0.024871826171875,
-0.027191162109375,
-0.0020160675048828125,
0.0123748779296875,
-0.00235748291015625,
-0.03741455078125,
0.004528045654296875,
0.0290069580078125,
-0.0092315673828125,
-0.04815673828125,
0.020111083984375,
-0.019775390625,
0.0138397216796875,
0.0208740234375,
-0.0225830078125,
0.054107666015625,
0.0161895751953125,
0.00917816162109375,
-0.0186614990234375,
-0.027679443359375,
0.004917144775390625,
0.05419921875,
-0.0025844573974609375,
0.01107025146484375,
0.041717529296875,
0.0007290840148925781,
0.03900146484375,
-0.06689453125,
-0.02313232421875,
-0.0238494873046875,
0.047119140625,
-0.0567626953125,
-0.0614013671875,
0.051788330078125,
0.090576171875,
0.0693359375,
-0.0455322265625,
0.05609130859375,
0.011444091796875,
0.009490966796875,
-0.009857177734375,
0.050079345703125,
-0.024627685546875,
-0.00176239013671875,
0.0187530517578125,
-0.03955078125,
-0.036285400390625,
0.030670166015625,
0.0028095245361328125,
0.01226043701171875,
0.0316162109375,
0.03070068359375,
-0.005023956298828125,
0.049102783203125,
0.034149169921875,
0.0271759033203125,
0.026031494140625,
0.0070648193359375,
0.0184173583984375,
-0.05828857421875,
-0.000011563301086425781,
-0.04254150390625,
-0.056640625,
-0.037322998046875,
-0.05181884765625,
-0.036956787109375,
-0.0302581787109375,
-0.0296173095703125,
0.007472991943359375,
0.01253509521484375,
0.056427001953125,
0.07757568359375,
-0.051727294921875,
-0.035888671875,
0.007373809814453125,
0.0133209228515625,
-0.004993438720703125,
-0.0193328857421875,
0.0124664306640625,
0.0265960693359375,
-0.0765380859375,
0.0118560791015625,
0.0102691650390625,
0.035491943359375,
-0.0140228271484375,
0.0224151611328125,
-0.0460205078125,
0.04510498046875,
0.0098419189453125,
0.04315185546875,
-0.042724609375,
-0.005504608154296875,
-0.0263824462890625,
-0.0027523040771484375,
-0.004360198974609375,
0.038665771484375,
-0.004791259765625,
0.01451873779296875,
0.027008056640625,
-0.0159912109375,
0.03863525390625,
-0.01508331298828125,
0.04498291015625,
-0.005767822265625,
0.0237884521484375,
-0.0110626220703125,
0.07220458984375,
0.00536346435546875,
-0.011322021484375,
0.042205810546875,
0.022552490234375,
-0.036956787109375,
-0.072998046875,
0.0116119384765625,
-0.10186767578125,
0.01003265380859375,
0.0491943359375,
-0.0198516845703125,
-0.046142578125,
0.03289794921875,
-0.02813720703125,
0.00492095947265625,
0.002307891845703125,
0.03265380859375,
0.057159423828125,
-0.0164794921875,
-0.025665283203125,
-0.027099609375,
0.019989013671875,
-0.0206146240234375,
-0.07208251953125,
-0.03759765625,
0.00452423095703125,
0.031036376953125,
0.0146484375,
0.0128326416015625,
-0.0233306884765625,
0.0523681640625,
0.00013208389282226562,
0.042022705078125,
0.01166534423828125,
-0.0274200439453125,
-0.00972747802734375,
0.03302001953125,
0.00853729248046875,
-0.059661865234375
]
] |
sentence-transformers/stsb-mpnet-base-v2 | 2021-08-05T08:31:17.000Z | [
"sentence-transformers",
"pytorch",
"mpnet",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/stsb-mpnet-base-v2 | 11 | 15,539 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# sentence-transformers/stsb-mpnet-base-v2
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/stsb-mpnet-base-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/stsb-mpnet-base-v2')
model = AutoModel.from_pretrained('sentence-transformers/stsb-mpnet-base-v2')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, max pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/stsb-mpnet-base-v2)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 75, 'do_lower_case': False}) with Transformer model: MPNetModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 3,668 | [
[
-0.021728515625,
-0.0460205078125,
0.0240325927734375,
0.0303802490234375,
-0.0252532958984375,
-0.03179931640625,
-0.0160675048828125,
0.0035610198974609375,
0.0137176513671875,
0.0292205810546875,
-0.04034423828125,
-0.024871826171875,
-0.059967041015625,
0.00011551380157470703,
-0.03277587890625,
0.055633544921875,
-0.006397247314453125,
0.0006642341613769531,
-0.0233917236328125,
-0.00879669189453125,
-0.01273345947265625,
-0.033294677734375,
-0.030792236328125,
-0.021820068359375,
0.0244903564453125,
0.01488494873046875,
0.030975341796875,
0.03851318359375,
0.02960205078125,
0.033294677734375,
-0.01007843017578125,
0.01488494873046875,
-0.022491455078125,
-0.015228271484375,
-0.0028972625732421875,
-0.0238037109375,
-0.00099945068359375,
0.021484375,
0.0501708984375,
0.033599853515625,
-0.00923919677734375,
0.01279449462890625,
0.0009551048278808594,
0.0242767333984375,
-0.03424072265625,
0.03192138671875,
-0.044403076171875,
0.0206146240234375,
0.006103515625,
0.00042629241943359375,
-0.044830322265625,
-0.002056121826171875,
0.0205841064453125,
-0.0201263427734375,
0.00577545166015625,
0.0125885009765625,
0.0859375,
0.0275421142578125,
-0.0308837890625,
-0.01477813720703125,
-0.0189971923828125,
0.05767822265625,
-0.0675048828125,
0.0147552490234375,
0.017822265625,
-0.0001722574234008789,
-0.0026988983154296875,
-0.0816650390625,
-0.05780029296875,
-0.0149993896484375,
-0.03851318359375,
0.0199127197265625,
-0.031707763671875,
0.0025615692138671875,
0.01464080810546875,
0.0129852294921875,
-0.048309326171875,
-0.001495361328125,
-0.039337158203125,
-0.01377105712890625,
0.035858154296875,
0.00103759765625,
0.023590087890625,
-0.052398681640625,
-0.040618896484375,
-0.02301025390625,
-0.01380157470703125,
-0.0034770965576171875,
0.0156402587890625,
0.0147552490234375,
-0.0247039794921875,
0.07135009765625,
-0.00437164306640625,
0.04217529296875,
-0.005100250244140625,
0.01776123046875,
0.04376220703125,
-0.0240631103515625,
-0.02069091796875,
-0.0051727294921875,
0.08306884765625,
0.030242919921875,
0.024566650390625,
-0.003849029541015625,
-0.00994110107421875,
0.0021495819091796875,
0.02252197265625,
-0.06500244140625,
-0.030517578125,
0.0152435302734375,
-0.038787841796875,
-0.0240936279296875,
0.0194854736328125,
-0.045745849609375,
-0.0101165771484375,
-0.003429412841796875,
0.051849365234375,
-0.043670654296875,
0.0007343292236328125,
0.010528564453125,
-0.0232086181640625,
0.0168609619140625,
-0.03399658203125,
-0.04901123046875,
0.0137786865234375,
0.0301361083984375,
0.082763671875,
-0.0014438629150390625,
-0.04083251953125,
-0.018798828125,
-0.012969970703125,
0.01025390625,
0.0537109375,
-0.02325439453125,
-0.006011962890625,
-0.0019550323486328125,
0.0194854736328125,
-0.049591064453125,
-0.0264434814453125,
0.043487548828125,
-0.0142822265625,
0.0511474609375,
0.01354217529296875,
-0.05877685546875,
-0.01280975341796875,
0.0096893310546875,
-0.030792236328125,
0.0771484375,
0.0172119140625,
-0.0694580078125,
-0.00008022785186767578,
-0.062469482421875,
-0.0238189697265625,
-0.01537322998046875,
0.004852294921875,
-0.052459716796875,
0.012451171875,
0.027252197265625,
0.05523681640625,
0.01073455810546875,
0.027252197265625,
-0.013397216796875,
-0.031890869140625,
0.035125732421875,
-0.0277099609375,
0.08306884765625,
0.01153564453125,
-0.031829833984375,
0.01837158203125,
-0.038177490234375,
-0.006427764892578125,
0.0189208984375,
-0.0174713134765625,
-0.0010356903076171875,
0.0091094970703125,
0.016021728515625,
0.02984619140625,
0.0142974853515625,
-0.05181884765625,
0.0119171142578125,
-0.0460205078125,
0.07391357421875,
0.048828125,
0.0017805099487304688,
0.041595458984375,
-0.0211029052734375,
0.0095977783203125,
0.03289794921875,
0.0010137557983398438,
-0.016387939453125,
-0.03204345703125,
-0.0816650390625,
-0.017669677734375,
0.024993896484375,
0.046112060546875,
-0.05584716796875,
0.07379150390625,
-0.033843994140625,
-0.035003662109375,
-0.052215576171875,
-0.01372528076171875,
0.006561279296875,
0.03424072265625,
0.043060302734375,
-0.01202392578125,
-0.056396484375,
-0.08367919921875,
0.00023758411407470703,
-0.007709503173828125,
0.00589752197265625,
0.0298309326171875,
0.0572509765625,
-0.033050537109375,
0.07061767578125,
-0.04364013671875,
-0.026763916015625,
-0.0281219482421875,
0.020538330078125,
0.0189208984375,
0.049530029296875,
0.032440185546875,
-0.055908203125,
-0.02215576171875,
-0.040863037109375,
-0.042877197265625,
-0.00493621826171875,
-0.0252685546875,
-0.01377105712890625,
0.01537322998046875,
0.036773681640625,
-0.0654296875,
0.02105712890625,
0.0474853515625,
-0.04425048828125,
0.0340576171875,
-0.01157379150390625,
-0.01580810546875,
-0.107666015625,
0.00799560546875,
0.00337982177734375,
-0.015289306640625,
-0.0282745361328125,
0.004482269287109375,
0.007843017578125,
-0.01300048828125,
-0.036224365234375,
0.02783203125,
-0.0308380126953125,
0.002407073974609375,
-0.005321502685546875,
0.0294647216796875,
-0.00658416748046875,
0.06158447265625,
-0.005550384521484375,
0.05938720703125,
0.0391845703125,
-0.03466796875,
0.0259552001953125,
0.040283203125,
-0.0367431640625,
0.01088714599609375,
-0.0640869140625,
-0.00557708740234375,
-0.005374908447265625,
0.02606201171875,
-0.087890625,
-0.001956939697265625,
0.0273895263671875,
-0.047271728515625,
0.002719879150390625,
0.01055145263671875,
-0.0504150390625,
-0.050567626953125,
-0.0196380615234375,
-0.0015554428100585938,
0.04083251953125,
-0.03948974609375,
0.03948974609375,
0.0215606689453125,
-0.0046539306640625,
-0.040252685546875,
-0.088623046875,
0.00952911376953125,
-0.00713348388671875,
-0.05230712890625,
0.03948974609375,
-0.004734039306640625,
0.0176849365234375,
0.0211029052734375,
0.01611328125,
-0.0008192062377929688,
0.00089263916015625,
0.01067352294921875,
0.0146636962890625,
-0.01287841796875,
0.01207733154296875,
0.006069183349609375,
-0.01285552978515625,
0.009765625,
-0.029327392578125,
0.060394287109375,
-0.016357421875,
-0.01169586181640625,
-0.0311279296875,
0.01629638671875,
0.0341796875,
-0.020050048828125,
0.0904541015625,
0.07965087890625,
-0.0238189697265625,
-0.004364013671875,
-0.03558349609375,
-0.019500732421875,
-0.034881591796875,
0.04840087890625,
-0.012786865234375,
-0.08123779296875,
0.0204315185546875,
0.01151275634765625,
-0.0055999755859375,
0.050567626953125,
0.044097900390625,
-0.01343536376953125,
0.0634765625,
0.04083251953125,
-0.01035308837890625,
0.038818359375,
-0.041595458984375,
0.0263671875,
-0.0753173828125,
-0.0117950439453125,
-0.0201568603515625,
-0.03369140625,
-0.054779052734375,
-0.03485107421875,
0.0144805908203125,
-0.0084075927734375,
-0.025299072265625,
0.0498046875,
-0.0450439453125,
0.0170135498046875,
0.052764892578125,
0.012786865234375,
-0.0185089111328125,
0.0078277587890625,
-0.041046142578125,
-0.0019245147705078125,
-0.05743408203125,
-0.034454345703125,
0.0526123046875,
0.028533935546875,
0.01462554931640625,
-0.0025005340576171875,
0.042938232421875,
-0.0008454322814941406,
-0.0009613037109375,
-0.055419921875,
0.049652099609375,
-0.0289459228515625,
-0.0262298583984375,
-0.01486968994140625,
-0.03729248046875,
-0.0615234375,
0.03814697265625,
-0.0166778564453125,
-0.06695556640625,
0.01067352294921875,
-0.019195556640625,
-0.01849365234375,
0.0283203125,
-0.068115234375,
0.080810546875,
0.0061798095703125,
-0.00203704833984375,
-0.006237030029296875,
-0.05865478515625,
0.0168609619140625,
0.024871826171875,
-0.007282257080078125,
-0.00270843505859375,
-0.0072174072265625,
0.05999755859375,
-0.020355224609375,
0.07177734375,
-0.01207733154296875,
0.028900146484375,
0.025543212890625,
-0.0224456787109375,
0.0249786376953125,
-0.01236724853515625,
-0.005321502685546875,
0.0011968612670898438,
-0.01465606689453125,
-0.02862548828125,
-0.036041259765625,
0.05108642578125,
-0.072265625,
-0.0256500244140625,
-0.03802490234375,
-0.0460205078125,
-0.0010442733764648438,
0.01345062255859375,
0.0254364013671875,
0.0291595458984375,
-0.0029964447021484375,
0.0413818359375,
0.03387451171875,
-0.0263214111328125,
0.059783935546875,
0.01131439208984375,
0.004177093505859375,
-0.038360595703125,
0.047760009765625,
0.00742340087890625,
0.0014743804931640625,
0.036895751953125,
0.0169525146484375,
-0.0310821533203125,
-0.02288818359375,
-0.02581787109375,
0.024505615234375,
-0.03668212890625,
-0.01461029052734375,
-0.0810546875,
-0.04888916015625,
-0.050140380859375,
0.00389862060546875,
-0.01396942138671875,
-0.03778076171875,
-0.042205810546875,
-0.0218505859375,
0.0209197998046875,
0.0264434814453125,
-0.004802703857421875,
0.040771484375,
-0.05029296875,
0.00860595703125,
0.01239776611328125,
0.0162200927734375,
-0.0031261444091796875,
-0.055419921875,
-0.0180206298828125,
0.0000718832015991211,
-0.0265655517578125,
-0.06414794921875,
0.0458984375,
0.01430511474609375,
0.042022705078125,
0.0087738037109375,
-0.0032253265380859375,
0.04217529296875,
-0.0447998046875,
0.0738525390625,
0.0091094970703125,
-0.0755615234375,
0.032928466796875,
-0.008544921875,
0.036895751953125,
0.031463623046875,
0.024688720703125,
-0.0458984375,
-0.01407623291015625,
-0.056182861328125,
-0.0767822265625,
0.05615234375,
0.0404052734375,
0.037017822265625,
-0.020233154296875,
0.02044677734375,
-0.031402587890625,
0.01442718505859375,
-0.0859375,
-0.03118896484375,
-0.02777099609375,
-0.03814697265625,
-0.024871826171875,
-0.032073974609375,
0.014923095703125,
-0.0274810791015625,
0.061126708984375,
0.003246307373046875,
0.0643310546875,
0.0263824462890625,
-0.032806396484375,
0.02069091796875,
0.018402099609375,
0.048065185546875,
0.0191497802734375,
-0.00231170654296875,
0.023529052734375,
0.0171966552734375,
-0.0243377685546875,
0.0008616447448730469,
0.039764404296875,
-0.00896453857421875,
0.00873565673828125,
0.024444580078125,
0.07586669921875,
0.031951904296875,
-0.030975341796875,
0.057891845703125,
0.00016129016876220703,
-0.0178985595703125,
-0.031280517578125,
-0.0059051513671875,
0.02947998046875,
0.0243988037109375,
0.0140228271484375,
0.01464080810546875,
-0.00013935565948486328,
-0.026519775390625,
0.031951904296875,
0.0102691650390625,
-0.036712646484375,
0.004261016845703125,
0.0404052734375,
0.0024356842041015625,
-0.021209716796875,
0.0709228515625,
-0.02374267578125,
-0.062164306640625,
0.037841796875,
0.049652099609375,
0.0712890625,
0.0086212158203125,
0.0287933349609375,
0.03759765625,
0.03607177734375,
-0.0014524459838867188,
0.00012129545211791992,
0.006023406982421875,
-0.0657958984375,
-0.025360107421875,
-0.049224853515625,
0.008697509765625,
-0.0035266876220703125,
-0.0325927734375,
0.01119232177734375,
-0.00910186767578125,
-0.006076812744140625,
-0.01134490966796875,
0.0007309913635253906,
-0.03778076171875,
0.000762939453125,
0.0095977783203125,
0.06927490234375,
-0.0640869140625,
0.05792236328125,
0.051483154296875,
-0.04559326171875,
-0.058441162109375,
-0.01117706298828125,
-0.016571044921875,
-0.05413818359375,
0.0341796875,
0.033966064453125,
0.01788330078125,
0.017242431640625,
-0.04559326171875,
-0.0604248046875,
0.1041259765625,
0.0187225341796875,
-0.0312347412109375,
-0.00738525390625,
0.01328277587890625,
0.03387451171875,
-0.040313720703125,
0.0249481201171875,
0.031402587890625,
0.029571533203125,
-0.004108428955078125,
-0.046478271484375,
0.01416778564453125,
-0.02752685546875,
0.01515960693359375,
-0.005390167236328125,
-0.048004150390625,
0.072265625,
0.00200653076171875,
-0.0143585205078125,
0.01812744140625,
0.06597900390625,
0.02093505859375,
-0.00846099853515625,
0.033355712890625,
0.0640869140625,
0.040618896484375,
-0.0178375244140625,
0.07513427734375,
-0.016876220703125,
0.052734375,
0.07904052734375,
0.00968170166015625,
0.08258056640625,
0.03643798828125,
-0.004108428955078125,
0.061187744140625,
0.03863525390625,
-0.033538818359375,
0.0419921875,
0.024505615234375,
0.0023021697998046875,
0.00005322694778442383,
0.001544952392578125,
-0.00894927978515625,
0.0400390625,
0.0095977783203125,
-0.0614013671875,
0.00008064508438110352,
0.0075836181640625,
0.00653839111328125,
-0.0034694671630859375,
-0.00018835067749023438,
0.044952392578125,
0.01416015625,
-0.041534423828125,
0.0158233642578125,
0.0224456787109375,
0.072509765625,
-0.0316162109375,
0.016265869140625,
-0.0028247833251953125,
0.0270538330078125,
-0.0002084970474243164,
-0.05108642578125,
0.032073974609375,
-0.012542724609375,
-0.008941650390625,
-0.01959228515625,
0.056549072265625,
-0.046295166015625,
-0.0501708984375,
0.018798828125,
0.035736083984375,
0.005268096923828125,
0.0012331008911132812,
-0.0816650390625,
-0.0058441162109375,
0.002109527587890625,
-0.033355712890625,
0.012451171875,
0.01617431640625,
0.036285400390625,
0.042388916015625,
0.031036376953125,
-0.00553131103515625,
0.0118865966796875,
0.007251739501953125,
0.05609130859375,
-0.04425048828125,
-0.04205322265625,
-0.0731201171875,
0.05816650390625,
-0.023956298828125,
-0.0302734375,
0.051513671875,
0.0423583984375,
0.0638427734375,
-0.0165863037109375,
0.037994384765625,
-0.01934814453125,
0.01194000244140625,
-0.04052734375,
0.07000732421875,
-0.030303955078125,
-0.007549285888671875,
-0.00829315185546875,
-0.07147216796875,
-0.0207977294921875,
0.078369140625,
-0.02423095703125,
0.00952911376953125,
0.0701904296875,
0.06597900390625,
-0.017791748046875,
-0.0037784576416015625,
0.007904052734375,
0.038238525390625,
0.00925445556640625,
0.03857421875,
0.0367431640625,
-0.057403564453125,
0.050018310546875,
-0.0408935546875,
0.0007877349853515625,
-0.00547027587890625,
-0.06512451171875,
-0.07177734375,
-0.06549072265625,
-0.0284881591796875,
-0.0258026123046875,
0.00020503997802734375,
0.07940673828125,
0.060211181640625,
-0.05596923828125,
-0.00974273681640625,
-0.0202484130859375,
-0.01204681396484375,
-0.006557464599609375,
-0.025390625,
0.03594970703125,
-0.033660888671875,
-0.048675537109375,
0.0186920166015625,
-0.002887725830078125,
0.00926971435546875,
-0.0264129638671875,
0.01245880126953125,
-0.0557861328125,
0.00995635986328125,
0.04913330078125,
-0.02215576171875,
-0.05364990234375,
-0.025390625,
-0.0004944801330566406,
-0.028564453125,
-0.003490447998046875,
0.03857421875,
-0.053375244140625,
0.008880615234375,
0.035186767578125,
0.044677734375,
0.05352783203125,
-0.015228271484375,
0.033782958984375,
-0.06842041015625,
0.017822265625,
0.005130767822265625,
0.05560302734375,
0.031524658203125,
-0.0204315185546875,
0.04656982421875,
0.024383544921875,
-0.0382080078125,
-0.045196533203125,
-0.004505157470703125,
-0.08245849609375,
-0.026123046875,
0.087890625,
-0.0258941650390625,
-0.0247802734375,
0.023834228515625,
-0.0189208984375,
0.0364990234375,
-0.0164337158203125,
0.04803466796875,
0.06390380859375,
0.0036067962646484375,
-0.03521728515625,
-0.0205841064453125,
0.00678253173828125,
0.036529541015625,
-0.0455322265625,
-0.0199737548828125,
0.018646240234375,
0.01702880859375,
0.0185394287109375,
0.0208282470703125,
-0.003753662109375,
-0.000732421875,
0.0059356689453125,
0.003986358642578125,
-0.023468017578125,
-0.004535675048828125,
-0.0308685302734375,
0.01617431640625,
-0.04046630859375,
-0.0211944580078125
]
] |
osiria/bert-italian-cased-ner | 2023-06-20T22:30:14.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"token-classification",
"it",
"arxiv:1810.04805",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | osiria | null | null | osiria/bert-italian-cased-ner | 2 | 15,535 | transformers | 2023-06-03T00:17:44 | ---
license: apache-2.0
language:
- it
widget:
- text: "Mi chiamo Marco Rossi, vivo a Roma e lavoro per l'Agenzia Spaziale Italiana"
example_title: "Example 1"
---
--------------------------------------------------------------------------------------------------
<body>
<span class="vertical-text" style="background-color:lightgreen;border-radius: 3px;padding: 3px;"> </span>
<br>
<span class="vertical-text" style="background-color:orange;border-radius: 3px;padding: 3px;"> Task: Named Entity Recognition</span>
<br>
<span class="vertical-text" style="background-color:lightblue;border-radius: 3px;padding: 3px;"> Model: BERT</span>
<br>
<span class="vertical-text" style="background-color:tomato;border-radius: 3px;padding: 3px;"> Lang: IT</span>
<br>
<span class="vertical-text" style="background-color:lightgrey;border-radius: 3px;padding: 3px;"> </span>
<br>
<span class="vertical-text" style="background-color:#CF9FFF;border-radius: 3px;padding: 3px;"> </span>
</body>
--------------------------------------------------------------------------------------------------
<h3>Model description</h3>
This is a <b>BERT</b> <b>[1]</b> cased model for the <b>Italian</b> language, fine-tuned for <b>Named Entity Recognition</b> (<b>Person</b>, <b>Location</b>, <b>Organization</b> and <b>Miscellanea</b> classes) on the [WikiNER](https://figshare.com/articles/dataset/Learning_multilingual_named_entity_recognition_from_Wikipedia/5462500) dataset <b>[2]</b>, using <b>BERT-ITALIAN</b> ([bert-base-italian-cased](https://huggingface.co/osiria/bert-base-italian-cased)) as a pre-trained model.
This is a cased, base size BERT model. If you are looking for a lighter (but slightly less accurate) cased model, you can refer to: https://huggingface.co/osiria/distilbert-italian-cased-ner
If you are looking for an uncased model, you can refer to: https://huggingface.co/osiria/bert-italian-uncased-ner
<h3>Training and Performances</h3>
The model is trained to perform entity recognition over 4 classes: <b>PER</b> (persons), <b>LOC</b> (locations), <b>ORG</b> (organizations), <b>MISC</b> (miscellanea, mainly events, products and services). It has been fine-tuned for Named Entity Recognition, using the WikiNER Italian dataset plus an additional custom dataset of manually annotated Wikipedia paragraphs.
The WikiNER dataset has been splitted in 102.352 training instances and 25.588 test instances, and the model has been trained for 1 epoch with a constant learning rate of 1e-5.
The performances on the test set are reported in the following table:
| Recall | Precision | F1 |
| ------ | ------ | ------ |
| 93.35 | 92.22 | 92.78 |
The metrics have been computed at the token level and then macro-averaged over the 4 classes.
Then, since WikiNER is an automatically annotated (silver standard) dataset, which sometimes contains imperfect annotations, an additional fine-tuning on ~3.500 manually annotated paragraphs has been performed.
<h3>Quick usage</h3>
```python
from transformers import BertTokenizerFast, BertForTokenClassification
tokenizer = BertTokenizerFast.from_pretrained("osiria/bert-italian-cased-ner")
model = BertForTokenClassification.from_pretrained("osiria/bert-italian-cased-ner")
from transformers import pipeline
ner = pipeline("ner", model = model, tokenizer = tokenizer, aggregation_strategy="first")
ner("Mi chiamo Marco Rossi, vivo a Roma e lavoro per l'Agenzia Spaziale Italiana nella missione Prisma")
[{'entity_group': 'PER',
'score': 0.99910736,
'word': 'Marco Rossi',
'start': 10,
'end': 21},
{'entity_group': 'LOC',
'score': 0.9973786,
'word': 'Roma',
'start': 30,
'end': 34},
{'entity_group': 'ORG',
'score': 0.9987071,
'word': 'Agenzia Spaziale Italiana',
'start': 50,
'end': 75},
{'entity_group': 'MISC',
'score': 0.9625836,
'word': 'Prisma',
'start': 91,
'end': 97}]
```
You can also try the model online using this web app: https://huggingface.co/spaces/osiria/bert-italian-cased-ner
<h3>References</h3>
[1] https://arxiv.org/abs/1810.04805
[2] https://www.sciencedirect.com/science/article/pii/S0004370212000276
<h3>Limitations</h3>
This model is mainly trained on Wikipedia, so it's particularly suitable for natively digital text from the world wide web, written in a correct and fluent form (like wikis, web pages, news, etc.). However, it may show limitations when it comes to chaotic text, containing errors and slang expressions
(like social media posts) or when it comes to domain-specific text (like medical, financial or legal content).
<h3>License</h3>
The model is released under <b>Apache-2.0</b> license
| 4,632 | [
[
-0.0391845703125,
-0.046478271484375,
0.019622802734375,
0.01276397705078125,
-0.01187896728515625,
-0.01540374755859375,
-0.02349853515625,
-0.04510498046875,
0.032745361328125,
0.0079193115234375,
-0.0418701171875,
-0.049530029296875,
-0.047393798828125,
0.0137176513671875,
-0.0210723876953125,
0.08380126953125,
-0.00992584228515625,
0.019256591796875,
0.0031719207763671875,
-0.005176544189453125,
-0.00970458984375,
-0.034576416015625,
-0.053131103515625,
-0.0184783935546875,
0.037506103515625,
0.00641632080078125,
0.028167724609375,
0.02642822265625,
0.0262908935546875,
0.0262298583984375,
-0.01611328125,
0.01702880859375,
-0.0167236328125,
-0.0204010009765625,
-0.0038700103759765625,
-0.0287017822265625,
-0.03338623046875,
-0.005157470703125,
0.050079345703125,
0.0528564453125,
-0.0021266937255859375,
0.001216888427734375,
0.00988006591796875,
0.033233642578125,
-0.022674560546875,
0.018280029296875,
-0.04998779296875,
0.01026153564453125,
-0.0218353271484375,
0.0099029541015625,
-0.0259552001953125,
-0.0189971923828125,
0.0203704833984375,
-0.057281494140625,
0.0250244140625,
0.0041046142578125,
0.112548828125,
-0.0007262229919433594,
-0.019500732421875,
-0.0204010009765625,
-0.028167724609375,
0.0589599609375,
-0.0633544921875,
0.042144775390625,
0.0211639404296875,
-0.00021505355834960938,
-0.0209197998046875,
-0.05767822265625,
-0.0693359375,
-0.00518798828125,
-0.0175018310546875,
0.017669677734375,
-0.0164337158203125,
-0.0134124755859375,
0.01214599609375,
0.013336181640625,
-0.0430908203125,
0.0121002197265625,
-0.03692626953125,
-0.0156402587890625,
0.046661376953125,
-0.003925323486328125,
0.037994384765625,
-0.024993896484375,
-0.034393310546875,
-0.0031719207763671875,
-0.02545166015625,
0.02459716796875,
0.018829345703125,
0.033599853515625,
-0.0223388671875,
0.049713134765625,
-0.0198974609375,
0.054595947265625,
0.025787353515625,
-0.0262451171875,
0.04827880859375,
-0.009735107421875,
-0.010955810546875,
0.0009832382202148438,
0.070556640625,
0.031280517578125,
0.0216217041015625,
-0.0135498046875,
-0.0140838623046875,
-0.007022857666015625,
0.01401519775390625,
-0.057891845703125,
-0.015777587890625,
0.0035419464111328125,
-0.037628173828125,
-0.01617431640625,
0.0099334716796875,
-0.0537109375,
0.0084686279296875,
-0.01364898681640625,
0.04241943359375,
-0.055084228515625,
-0.01018524169921875,
0.01531219482421875,
-0.0093231201171875,
0.02545166015625,
0.0246429443359375,
-0.06689453125,
0.0215911865234375,
0.0223846435546875,
0.057220458984375,
-0.0140838623046875,
-0.0221405029296875,
-0.002368927001953125,
-0.00018703937530517578,
-0.0117950439453125,
0.05645751953125,
-0.02056884765625,
-0.019805908203125,
-0.0095062255859375,
0.0230712890625,
-0.03369140625,
-0.01299285888671875,
0.046844482421875,
-0.029022216796875,
0.034698486328125,
-0.014495849609375,
-0.055999755859375,
-0.024566650390625,
0.03155517578125,
-0.053619384765625,
0.091552734375,
0.0019931793212890625,
-0.0836181640625,
0.047943115234375,
-0.042327880859375,
-0.03173828125,
0.005725860595703125,
-0.006549835205078125,
-0.055755615234375,
0.002399444580078125,
0.0283050537109375,
0.0484619140625,
0.00955963134765625,
0.01276397705078125,
-0.028656005859375,
-0.0136871337890625,
0.007755279541015625,
-0.0191650390625,
0.07550048828125,
0.00754547119140625,
-0.042999267578125,
-0.0217437744140625,
-0.0828857421875,
0.0107574462890625,
0.0295562744140625,
-0.0513916015625,
-0.0289306640625,
0.0029430389404296875,
0.019775390625,
0.0157928466796875,
0.02288818359375,
-0.048065185546875,
0.016448974609375,
-0.04095458984375,
0.0168304443359375,
0.0457763671875,
-0.00400543212890625,
0.0199432373046875,
-0.02032470703125,
0.01168060302734375,
0.01065826416015625,
0.003780364990234375,
0.0201416015625,
-0.0419921875,
-0.07977294921875,
-0.035491943359375,
0.05419921875,
0.04583740234375,
-0.059600830078125,
0.06939697265625,
-0.033203125,
-0.046356201171875,
-0.034820556640625,
-0.0036487579345703125,
0.015380859375,
0.0435791015625,
0.0309906005859375,
-0.0211639404296875,
-0.039459228515625,
-0.0753173828125,
0.01233673095703125,
-0.006725311279296875,
0.004108428955078125,
0.01526641845703125,
0.06201171875,
-0.0178985595703125,
0.07647705078125,
-0.0214080810546875,
-0.0230712890625,
-0.0243988037109375,
0.00823211669921875,
0.030181884765625,
0.048126220703125,
0.055572509765625,
-0.0635986328125,
-0.0435791015625,
0.006130218505859375,
-0.058502197265625,
0.00968170166015625,
0.0036163330078125,
-0.00691986083984375,
0.0258941650390625,
0.0078887939453125,
-0.0615234375,
0.044525146484375,
0.0267333984375,
-0.03143310546875,
0.0390625,
-0.017974853515625,
-0.0026836395263671875,
-0.105712890625,
0.012847900390625,
0.016510009765625,
-0.0012369155883789062,
-0.05426025390625,
0.0005731582641601562,
0.010711669921875,
0.004421234130859375,
-0.040618896484375,
0.041412353515625,
-0.0458984375,
0.002437591552734375,
0.01039886474609375,
0.0103759765625,
0.009002685546875,
0.05078125,
0.031951904296875,
0.05731201171875,
0.032135009765625,
-0.046661376953125,
0.0145416259765625,
0.024169921875,
-0.055023193359375,
0.046661376953125,
-0.044525146484375,
-0.0127105712890625,
-0.00958251953125,
0.0079498291015625,
-0.05523681640625,
-0.004482269287109375,
0.0137786865234375,
-0.040924072265625,
0.0350341796875,
0.0028553009033203125,
-0.047943115234375,
-0.04534912109375,
-0.0207366943359375,
0.003215789794921875,
0.0108489990234375,
-0.047821044921875,
0.048919677734375,
0.0151214599609375,
-0.01197052001953125,
-0.038818359375,
-0.054412841796875,
-0.01348876953125,
-0.017608642578125,
-0.05035400390625,
0.04534912109375,
-0.0162200927734375,
0.00788116455078125,
0.003383636474609375,
0.004962921142578125,
-0.0199737548828125,
0.00873565673828125,
0.00650787353515625,
0.031890869140625,
-0.01399993896484375,
0.01447296142578125,
0.00186920166015625,
0.01061248779296875,
-0.0161285400390625,
-0.01192474365234375,
0.051025390625,
-0.00592803955078125,
-0.006443023681640625,
-0.048828125,
0.0188751220703125,
0.037261962890625,
-0.01678466796875,
0.08221435546875,
0.0550537109375,
-0.05364990234375,
-0.00030159950256347656,
-0.049468994140625,
0.0008955001831054688,
-0.031982421875,
0.028839111328125,
-0.04791259765625,
-0.04840087890625,
0.049163818359375,
0.0171356201171875,
-0.0033359527587890625,
0.051025390625,
0.040496826171875,
-0.021270751953125,
0.0762939453125,
0.0460205078125,
-0.02447509765625,
0.03662109375,
-0.0335693359375,
0.0186309814453125,
-0.0555419921875,
-0.043365478515625,
-0.049346923828125,
-0.023895263671875,
-0.053131103515625,
-0.0144805908203125,
0.0246429443359375,
0.0158843994140625,
-0.01461029052734375,
0.04229736328125,
-0.047576904296875,
0.0091705322265625,
0.048309326171875,
0.025970458984375,
0.0089111328125,
0.01216888427734375,
-0.0423583984375,
-0.01806640625,
-0.047698974609375,
-0.03759765625,
0.0670166015625,
0.031707763671875,
0.042083740234375,
0.007122039794921875,
0.059844970703125,
0.0010042190551757812,
0.0078887939453125,
-0.050384521484375,
0.036865234375,
0.00031828880310058594,
-0.06640625,
-0.02685546875,
-0.03204345703125,
-0.0792236328125,
0.0233612060546875,
-0.03436279296875,
-0.078369140625,
0.02734375,
-0.00981903076171875,
-0.0266265869140625,
0.0390625,
-0.05609130859375,
0.0672607421875,
-0.025421142578125,
-0.01137542724609375,
0.0183258056640625,
-0.033111572265625,
0.004886627197265625,
0.00542449951171875,
0.014678955078125,
-0.004009246826171875,
0.005077362060546875,
0.06695556640625,
-0.0447998046875,
0.059600830078125,
-0.01236724853515625,
-0.00031495094299316406,
0.01419830322265625,
-0.0164794921875,
0.043121337890625,
-0.010284423828125,
0.00021123886108398438,
0.0235443115234375,
-0.0033664703369140625,
-0.0224151611328125,
-0.0253753662109375,
0.045654296875,
-0.0665283203125,
-0.026123046875,
-0.033477783203125,
-0.0226287841796875,
-0.01016998291015625,
0.036956787109375,
0.04534912109375,
0.018798828125,
-0.00612640380859375,
0.0183868408203125,
0.06256103515625,
-0.017303466796875,
0.045318603515625,
0.0430908203125,
-0.00030612945556640625,
-0.0292205810546875,
0.059722900390625,
0.0267333984375,
-0.01136016845703125,
0.039520263671875,
-0.011627197265625,
-0.02301025390625,
-0.04266357421875,
-0.04046630859375,
0.027618408203125,
-0.0316162109375,
-0.025848388671875,
-0.06695556640625,
-0.0323486328125,
-0.03753662109375,
-0.005100250244140625,
-0.0245208740234375,
-0.036712646484375,
-0.049468994140625,
-0.03143310546875,
0.0244598388671875,
0.032928466796875,
0.0067138671875,
0.0060272216796875,
-0.032867431640625,
0.010101318359375,
0.026824951171875,
0.0267181396484375,
-0.00495147705078125,
-0.035552978515625,
-0.01446533203125,
0.0008778572082519531,
-0.02178955078125,
-0.0704345703125,
0.037994384765625,
0.007480621337890625,
0.053131103515625,
0.034149169921875,
-0.00685882568359375,
0.057281494140625,
-0.052947998046875,
0.055389404296875,
0.0298309326171875,
-0.065185546875,
0.046356201171875,
-0.01160430908203125,
0.00036787986755371094,
0.040679931640625,
0.047088623046875,
-0.027099609375,
-0.0172119140625,
-0.0452880859375,
-0.06414794921875,
0.05120849609375,
0.02142333984375,
0.006114959716796875,
-0.014434814453125,
0.0178375244140625,
-0.00775146484375,
0.004413604736328125,
-0.0899658203125,
-0.022979736328125,
-0.0141754150390625,
-0.035736083984375,
0.0162506103515625,
-0.007450103759765625,
0.01412200927734375,
-0.037261962890625,
0.07489013671875,
-0.0038890838623046875,
0.03955078125,
0.0222930908203125,
-0.015838623046875,
0.01111602783203125,
0.0158538818359375,
0.042144775390625,
0.0293121337890625,
-0.01021575927734375,
-0.01259613037109375,
0.030303955078125,
-0.0264129638671875,
-0.00464630126953125,
0.03900146484375,
-0.031280517578125,
0.0242156982421875,
0.036651611328125,
0.07000732421875,
0.027191162109375,
-0.036865234375,
0.045257568359375,
0.0019130706787109375,
-0.03564453125,
-0.03729248046875,
-0.00466156005859375,
-0.0005955696105957031,
0.0240631103515625,
0.035491943359375,
-0.0178985595703125,
0.0163726806640625,
-0.03009033203125,
0.0058746337890625,
0.042449951171875,
-0.01500701904296875,
-0.01119232177734375,
0.046478271484375,
-0.0003337860107421875,
-0.007144927978515625,
0.04290771484375,
-0.01421356201171875,
-0.052215576171875,
0.042449951171875,
0.036346435546875,
0.037872314453125,
-0.00797271728515625,
0.01522064208984375,
0.040374755859375,
0.025543212890625,
-0.00013184547424316406,
0.00998687744140625,
-0.007175445556640625,
-0.07366943359375,
0.00061798095703125,
-0.06695556640625,
0.00872802734375,
0.016937255859375,
-0.057220458984375,
0.0271759033203125,
-0.02874755859375,
-0.036895751953125,
0.0196075439453125,
0.03265380859375,
-0.06646728515625,
0.038543701171875,
0.00812530517578125,
0.0810546875,
-0.052764892578125,
0.0667724609375,
0.05157470703125,
-0.054168701171875,
-0.07586669921875,
-0.01013946533203125,
-0.0179595947265625,
-0.06304931640625,
0.05712890625,
0.020233154296875,
0.01824951171875,
-0.005100250244140625,
-0.031402587890625,
-0.07525634765625,
0.09576416015625,
0.003368377685546875,
-0.048431396484375,
-0.036102294921875,
-0.0016222000122070312,
0.042724609375,
-0.018768310546875,
0.024505615234375,
0.0283050537109375,
0.036651611328125,
-0.006256103515625,
-0.084716796875,
-0.0106353759765625,
-0.042572021484375,
0.023681640625,
0.004364013671875,
-0.060333251953125,
0.06170654296875,
0.0010671615600585938,
-0.018890380859375,
0.020904541015625,
0.045684814453125,
0.01513671875,
0.01399993896484375,
0.0304107666015625,
0.07574462890625,
0.0537109375,
-0.024383544921875,
0.0654296875,
-0.0200653076171875,
0.0528564453125,
0.06170654296875,
-0.01084136962890625,
0.0687255859375,
0.0309295654296875,
-0.0246429443359375,
0.06103515625,
0.05364990234375,
-0.01763916015625,
0.045074462890625,
0.0025157928466796875,
-0.016204833984375,
-0.0030059814453125,
0.0037326812744140625,
-0.0305328369140625,
0.038848876953125,
0.023681640625,
-0.034149169921875,
-0.0021190643310546875,
0.00894927978515625,
0.0225830078125,
-0.005947113037109375,
-0.02215576171875,
0.06292724609375,
-0.0110321044921875,
-0.036773681640625,
0.0357666015625,
0.019439697265625,
0.06707763671875,
-0.0418701171875,
0.01131439208984375,
-0.0110931396484375,
0.00882720947265625,
-0.0118560791015625,
-0.042633056640625,
-0.002758026123046875,
0.00635528564453125,
-0.0302734375,
0.003971099853515625,
0.0411376953125,
-0.032745361328125,
-0.06976318359375,
0.03131103515625,
0.0211029052734375,
0.0109710693359375,
0.023712158203125,
-0.072509765625,
0.0017557144165039062,
0.0010013580322265625,
-0.0384521484375,
0.010467529296875,
0.04144287109375,
0.007656097412109375,
0.04254150390625,
0.03924560546875,
0.01306915283203125,
0.02423095703125,
-0.0050201416015625,
0.064453125,
-0.0386962890625,
-0.03460693359375,
-0.046722412109375,
0.04840087890625,
-0.024383544921875,
-0.019989013671875,
0.05169677734375,
0.052459716796875,
0.0823974609375,
-0.005767822265625,
0.046478271484375,
-0.0213623046875,
0.047393798828125,
-0.0491943359375,
0.0667724609375,
-0.05267333984375,
-0.00836181640625,
-0.02423095703125,
-0.0601806640625,
-0.0226287841796875,
0.08441162109375,
-0.0225830078125,
0.02325439453125,
0.03997802734375,
0.053802490234375,
-0.0025997161865234375,
-0.0166168212890625,
0.0106048583984375,
0.0179443359375,
0.005306243896484375,
0.0545654296875,
0.0304412841796875,
-0.047027587890625,
0.03350830078125,
-0.032867431640625,
-0.009735107421875,
-0.014434814453125,
-0.048919677734375,
-0.08251953125,
-0.05224609375,
-0.02557373046875,
-0.01372528076171875,
0.0005774497985839844,
0.07098388671875,
0.07452392578125,
-0.07427978515625,
0.0015726089477539062,
-0.0015544891357421875,
-0.003910064697265625,
-0.011199951171875,
-0.0239715576171875,
0.0335693359375,
-0.0170440673828125,
-0.07647705078125,
0.0138702392578125,
0.0017299652099609375,
0.0259857177734375,
-0.007843017578125,
-0.00792694091796875,
-0.039947509765625,
0.002864837646484375,
0.035400390625,
0.016387939453125,
-0.054840087890625,
-0.0240478515625,
-0.0160064697265625,
-0.019317626953125,
0.003498077392578125,
0.0189361572265625,
-0.03936767578125,
0.033203125,
0.01154327392578125,
0.04351806640625,
0.057464599609375,
-0.016021728515625,
0.0097503662109375,
-0.05438232421875,
0.01522064208984375,
0.00690460205078125,
0.0526123046875,
0.02142333984375,
-0.0310211181640625,
0.036834716796875,
0.03240966796875,
-0.018341064453125,
-0.048065185546875,
-0.01715087890625,
-0.0714111328125,
-0.021942138671875,
0.054840087890625,
-0.02447509765625,
-0.0218658447265625,
0.0103607177734375,
-0.00608062744140625,
0.0223846435546875,
-0.03350830078125,
0.05755615234375,
0.06304931640625,
-0.0016841888427734375,
-0.01285552978515625,
-0.031219482421875,
0.0168914794921875,
0.019775390625,
-0.059906005859375,
-0.0482177734375,
0.03057861328125,
0.046600341796875,
0.031341552734375,
0.04217529296875,
-0.00366973876953125,
0.0121002197265625,
-0.01401519775390625,
0.02703857421875,
0.0083160400390625,
-0.0245513916015625,
-0.003910064697265625,
0.01079559326171875,
-0.01258087158203125,
-0.0204925537109375
]
] |
tiiuae/falcon-180B | 2023-09-06T13:04:38.000Z | [
"transformers",
"safetensors",
"falcon",
"text-generation",
"en",
"de",
"es",
"fr",
"dataset:tiiuae/falcon-refinedweb",
"arxiv:1911.02150",
"arxiv:2101.00027",
"arxiv:2005.14165",
"arxiv:2104.09864",
"arxiv:2205.14135",
"arxiv:2306.01116",
"license:unknown",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | tiiuae | null | null | tiiuae/falcon-180B | 883 | 15,514 | transformers | 2023-08-28T23:28:49 | ---
datasets:
- tiiuae/falcon-refinedweb
language:
- en
- de
- es
- fr
inference: false
license: unknown
extra_gated_heading: "Acknowledge license to access the repository"
extra_gated_prompt: "You agree to the [Falcon-180B TII license](https://huggingface.co/spaces/tiiuae/falcon-180b-license/blob/main/LICENSE.txt) and [acceptable use policy](https://huggingface.co/spaces/tiiuae/falcon-180b-license/blob/main/ACCEPTABLE_USE_POLICY.txt)."
extra_gated_button_content: "I agree to the terms and conditions of the Falcon-180B TII license and to the acceptable use policy"
---
# 🚀 Falcon-180B
**Falcon-180B is a 180B parameters causal decoder-only model built by [TII](https://www.tii.ae) and trained on 3,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the [Falcon-180B TII License](https://huggingface.co/spaces/tiiuae/falcon-180b-license/blob/main/LICENSE.txt) and [Acceptable Use Policy](https://huggingface.co/spaces/tiiuae/falcon-180b-license/blob/main/ACCEPTABLE_USE_POLICY.txt).**
*Paper coming soon* 😊
🤗 To get started with Falcon (inference, finetuning, quantization, etc.), we recommend reading [this great blogpost from HF](https://hf.co/blog/falcon-180b) or this [one](https://huggingface.co/blog/falcon) from the release of the 40B!
Note that since the 180B is larger than what can easily be handled with `transformers`+`acccelerate`, we recommend using [Text Generation Inference](https://github.com/huggingface/text-generation-inference).
You will need **at least 400GB of memory** to swiftly run inference with Falcon-180B.
## Why use Falcon-180B?
* **It is the best open-access model currently available, and one of the best model overall.** Falcon-180B outperforms [LLaMA-2](https://huggingface.co/meta-llama/Llama-2-70b-hf), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1), [MPT](https://huggingface.co/mosaicml/mpt-7b), etc. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
* **It features an architecture optimized for inference**, with multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)).
* **It is made available under a permissive license allowing for commercial use**.
* ⚠️ **This is a raw, pretrained model, which should be further finetuned for most usecases.** If you are looking for a version better suited to taking generic instructions in a chat format, we recommend taking a look at [Falcon-180B-Chat](https://huggingface.co/tiiuae/falcon-180b-chat).
💸 **Looking for a smaller, less expensive model?** [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b) and [Falcon-40B](https://huggingface.co/tiiuae/falcon-40b) are Falcon-180B's little brothers!
💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!**
# Model Card for Falcon-180B
## Model Details
### Model Description
- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English, German, Spanish, French (and limited capabilities in Italian, Portuguese, Polish, Dutch, Romanian, Czech, Swedish);
- **License:** [Falcon-180B TII License](https://huggingface.co/tiiuae/falcon-180B/blob/main/LICENSE.txt) and [Acceptable Use Policy](https://huggingface.co/tiiuae/falcon-180B/blob/main/ACCEPTABLE_USE_POLICY.txt).
### Model Source
- **Paper:** *coming soon*.
## Uses
See the [acceptable use policy](https://huggingface.co/tiiuae/falcon-180B/blob/main/ACCEPTABLE_USE_POLICY.txt).
### Direct Use
Research on large language models; as a foundation for further specialization and finetuning for specific usecases (e.g., summarization, text generation, chatbot, etc.)
### Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
Falcon-180B is trained mostly on English, German, Spanish, French, with limited capabilities also in in Italian, Portuguese, Polish, Dutch, Romanian, Czech, Swedish. It will not generalize appropriately to other languages. Furthermore, as it is trained on a large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
### Recommendations
We recommend users of Falcon-180B to consider finetuning it for the specific set of tasks of interest, and for guardrails and appropriate precautions to be taken for any production use.
## How to Get Started with the Model
To run inference with the model in full `bfloat16` precision you need approximately 8xA100 80GB or equivalent.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch
model = "tiiuae/falcon-180b"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
device_map="auto",
)
sequences = pipeline(
"Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
max_length=200,
do_sample=True,
top_k=10,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
## Training Details
### Training Data
Falcon-180B was trained on 3,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), a high-quality filtered and deduplicated web dataset which we enhanced with curated corpora. Significant components from our curated copora were inspired by The Pile ([Gao et al., 2020](https://arxiv.org/abs/2101.00027)).
| **Data source** | **Fraction** | **Tokens** | **Sources** |
|--------------------|--------------|------------|-----------------------------------|
| [RefinedWeb-English](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) | 75% | 750B | massive web crawl |
| RefinedWeb-Europe | 7% | 70B | European massive web crawl |
| Books | 6% | 60B | |
| Conversations | 5% | 50B | Reddit, StackOverflow, HackerNews |
| Code | 5% | 50B | |
| Technical | 2% | 20B | arXiv, PubMed, USPTO, etc. |
RefinedWeb-Europe is made of the following languages:
| **Language** | **Fraction of multilingual data** | **Tokens** |
|--------------|-----------------------------------|------------|
| German | 26% | 18B |
| Spanish | 24% | 17B |
| French | 23% | 16B |
| _Italian_ | 7% | 5B |
| _Portuguese_ | 4% | 3B |
| _Polish_ | 4% | 3B |
| _Dutch_ | 4% | 3B |
| _Romanian_ | 3% | 2B |
| _Czech_ | 3% | 2B |
| _Swedish_ | 2% | 1B |
The data was tokenized with the Falcon tokenizer.
### Training Procedure
Falcon-180B was trained on up to 4,096 A100 40GB GPUs, using a 3D parallelism strategy (TP=8, PP=8, DP=64) combined with ZeRO.
#### Training Hyperparameters
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|------------|-------------------------------------------|
| Precision | `bfloat16` | |
| Optimizer | AdamW | |
| Learning rate | 1.25e-4 | 4B tokens warm-up, cosine decay to 1.25e-5 |
| Weight decay | 1e-1 | |
| Z-loss | 1e-4 | |
| Batch size | 2048 | 100B tokens ramp-up |
#### Speeds, Sizes, Times
Training started in early 2023.
## Evaluation
*Paper coming soon.*
See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results.
## Technical Specifications
### Model Architecture and Objective
Falcon-180B is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
The architecture is broadly adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), with the following differences:
* **Positionnal embeddings:** rotary ([Su et al., 2021](https://arxiv.org/abs/2104.09864));
* **Attention:** multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135));
* **Decoder-block:** parallel attention/MLP with two layer norms.
For multiquery, we are using an internal variant which uses independent key and values per tensor parallel degree (so-called multigroup).
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|-----------|----------------------------------------|
| Layers | 80 | |
| `d_model` | 14848 | |
| `head_dim` | 64 | Reduced to optimise for FlashAttention |
| Vocabulary | 65024 | |
| Sequence length | 2048 | |
### Compute Infrastructure
#### Hardware
Falcon-180B was trained on AWS SageMaker, on up to 4,096 A100 40GB GPUs in P4d instances.
#### Software
Falcon-180B was trained a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.)
## Citation
*Paper coming soon* 😊 (actually this time). In the meanwhile, you can use the following information to cite:
```
@article{falcon,
title={The Falcon Series of Language Models: Towards Open Frontier Models},
author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Alhammadi, Maitha and Daniele, Mazzotta and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme},
year={2023}
}
```
To learn more about the pretraining dataset, see the 📓 [RefinedWeb paper](https://arxiv.org/abs/2306.01116).
```
@article{refinedweb,
title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
journal={arXiv preprint arXiv:2306.01116},
eprint={2306.01116},
eprinttype = {arXiv},
url={https://arxiv.org/abs/2306.01116},
year={2023}
}
```
## Contact
falconllm@tii.ae | 11,491 | [
[
-0.045135498046875,
-0.05712890625,
0.00568389892578125,
0.026702880859375,
0.00048065185546875,
-0.0016469955444335938,
-0.01337432861328125,
-0.04180908203125,
0.0294189453125,
0.025238037109375,
-0.048187255859375,
-0.038848876953125,
-0.049560546875,
0.0131072998046875,
-0.0242462158203125,
0.08154296875,
0.00003737211227416992,
0.007537841796875,
0.0121307373046875,
-0.0056304931640625,
-0.0019664764404296875,
-0.048736572265625,
-0.05792236328125,
-0.007965087890625,
0.03515625,
0.0299224853515625,
0.046875,
0.061126708984375,
0.0467529296875,
0.0216827392578125,
-0.01457977294921875,
0.014251708984375,
-0.042022705078125,
-0.023162841796875,
-0.0018177032470703125,
-0.0185089111328125,
-0.037933349609375,
-0.0004055500030517578,
0.053253173828125,
0.051666259765625,
-0.00399017333984375,
0.01532745361328125,
-0.0055389404296875,
0.0487060546875,
-0.03521728515625,
0.025482177734375,
-0.0313720703125,
0.0052947998046875,
-0.0177459716796875,
0.01374053955078125,
-0.025604248046875,
0.01030731201171875,
-0.025543212890625,
-0.07037353515625,
0.013458251953125,
0.01306915283203125,
0.0992431640625,
0.00868988037109375,
-0.0328369140625,
-0.01548004150390625,
-0.03173828125,
0.057342529296875,
-0.046783447265625,
0.04510498046875,
0.0264129638671875,
0.0193328857421875,
-0.0251922607421875,
-0.07647705078125,
-0.035400390625,
-0.01000213623046875,
0.005664825439453125,
0.0300140380859375,
-0.0218963623046875,
-0.0007448196411132812,
0.029327392578125,
0.0163116455078125,
-0.0394287109375,
0.012542724609375,
-0.037750244140625,
-0.016021728515625,
0.0584716796875,
0.0038242340087890625,
0.01282501220703125,
-0.0184173583984375,
-0.0304107666015625,
-0.0272369384765625,
-0.0297088623046875,
0.02703857421875,
0.04425048828125,
0.0299072265625,
-0.0230865478515625,
0.04937744140625,
-0.0265655517578125,
0.036346435546875,
0.02691650390625,
-0.0011053085327148438,
0.03326416015625,
-0.0343017578125,
-0.0192413330078125,
-0.0173492431640625,
0.0811767578125,
0.0184173583984375,
0.0143585205078125,
-0.012908935546875,
-0.00870513916015625,
-0.0170745849609375,
0.00788116455078125,
-0.071533203125,
0.0084075927734375,
0.0218963623046875,
-0.042633056640625,
-0.0084991455078125,
0.038299560546875,
-0.06298828125,
0.00717926025390625,
-0.0013217926025390625,
0.007633209228515625,
-0.036468505859375,
-0.030517578125,
0.023834228515625,
-0.01506805419921875,
0.0225830078125,
-0.00592041015625,
-0.06689453125,
0.0227508544921875,
0.054962158203125,
0.065673828125,
-0.0039825439453125,
-0.047119140625,
-0.046783447265625,
0.0005369186401367188,
-0.021759033203125,
0.045013427734375,
-0.032440185546875,
-0.038360595703125,
-0.0084381103515625,
0.0300445556640625,
-0.0178680419921875,
-0.01202392578125,
0.058929443359375,
-0.020172119140625,
0.0201416015625,
-0.0292816162109375,
-0.0416259765625,
-0.0216827392578125,
0.007495880126953125,
-0.053497314453125,
0.07794189453125,
0.009124755859375,
-0.07293701171875,
0.024078369140625,
-0.06317138671875,
-0.0223541259765625,
-0.005218505859375,
0.0006852149963378906,
-0.03289794921875,
-0.016632080078125,
0.0298004150390625,
0.04229736328125,
-0.007328033447265625,
0.0251007080078125,
-0.0419921875,
-0.0361328125,
-0.0062255859375,
-0.0064697265625,
0.079345703125,
0.04205322265625,
-0.0280303955078125,
0.002826690673828125,
-0.045257568359375,
-0.00665283203125,
0.025482177734375,
-0.020416259765625,
0.0154571533203125,
-0.012176513671875,
0.0222320556640625,
0.01210784912109375,
0.01442718505859375,
-0.047119140625,
0.015106201171875,
-0.04071044921875,
0.037261962890625,
0.02447509765625,
-0.002017974853515625,
0.0245513916015625,
-0.032562255859375,
0.044403076171875,
0.02252197265625,
0.0255279541015625,
-0.0140838623046875,
-0.047119140625,
-0.0711669921875,
-0.028564453125,
0.01439666748046875,
0.0390625,
-0.044219970703125,
0.041595458984375,
-0.01352691650390625,
-0.046142578125,
-0.0321044921875,
-0.00152587890625,
0.0295867919921875,
0.03790283203125,
0.03131103515625,
0.0079498291015625,
-0.05755615234375,
-0.06903076171875,
-0.006427764892578125,
-0.024505615234375,
0.0102996826171875,
0.01629638671875,
0.05401611328125,
-0.0284423828125,
0.053192138671875,
-0.022857666015625,
-0.01995849609375,
-0.0126190185546875,
0.0032501220703125,
0.01412200927734375,
0.036468505859375,
0.061065673828125,
-0.04779052734375,
-0.0221405029296875,
0.0017795562744140625,
-0.059295654296875,
0.00830078125,
-0.0027179718017578125,
-0.0196990966796875,
0.040618896484375,
0.046234130859375,
-0.048553466796875,
0.019989013671875,
0.0390625,
-0.0300140380859375,
0.028656005859375,
-0.006603240966796875,
0.00679779052734375,
-0.09521484375,
0.0266571044921875,
0.0128936767578125,
0.00406646728515625,
-0.028778076171875,
0.024810791015625,
-0.0005197525024414062,
-0.00670623779296875,
-0.046112060546875,
0.059051513671875,
-0.031524658203125,
0.007503509521484375,
0.003932952880859375,
-0.0106353759765625,
0.0012750625610351562,
0.048004150390625,
0.01537322998046875,
0.0672607421875,
0.029754638671875,
-0.0218658447265625,
-0.0013742446899414062,
0.0263671875,
-0.00743865966796875,
0.0156402587890625,
-0.0570068359375,
-0.0050201416015625,
-0.00799560546875,
0.0261383056640625,
-0.0626220703125,
-0.0228424072265625,
0.027252197265625,
-0.04718017578125,
0.0211334228515625,
-0.0074615478515625,
-0.0306549072265625,
-0.043487548828125,
-0.024139404296875,
0.0011091232299804688,
0.031036376953125,
-0.039398193359375,
0.02838134765625,
0.022613525390625,
0.003391265869140625,
-0.0648193359375,
-0.061065673828125,
0.01080322265625,
-0.016845703125,
-0.057342529296875,
0.0240020751953125,
-0.00751495361328125,
-0.00724029541015625,
-0.00478363037109375,
0.016632080078125,
0.001800537109375,
0.00811004638671875,
0.0307769775390625,
0.01522064208984375,
-0.0210113525390625,
-0.00995635986328125,
0.0008544921875,
-0.007965087890625,
0.000005185604095458984,
-0.023040771484375,
0.03216552734375,
-0.05462646484375,
-0.0207672119140625,
-0.035308837890625,
0.031524658203125,
0.039794921875,
-0.01465606689453125,
0.05950927734375,
0.07122802734375,
-0.0291900634765625,
0.01361083984375,
-0.041595458984375,
-0.01079559326171875,
-0.034942626953125,
0.03765869140625,
-0.038330078125,
-0.05914306640625,
0.052276611328125,
0.01556396484375,
0.010589599609375,
0.0675048828125,
0.0321044921875,
-0.00318145751953125,
0.0865478515625,
0.03570556640625,
-0.009429931640625,
0.033111572265625,
-0.054168701171875,
-0.0084381103515625,
-0.052398681640625,
-0.031951904296875,
-0.05389404296875,
-0.01523590087890625,
-0.058380126953125,
-0.01457977294921875,
0.0007071495056152344,
0.0177154541015625,
-0.054168701171875,
0.0221710205078125,
-0.031890869140625,
0.0258941650390625,
0.043426513671875,
-0.0025844573974609375,
0.00630950927734375,
0.00971221923828125,
-0.0175018310546875,
0.004436492919921875,
-0.06268310546875,
-0.0438232421875,
0.08990478515625,
0.037322998046875,
0.04095458984375,
-0.0100555419921875,
0.0643310546875,
0.0002818107604980469,
0.01110076904296875,
-0.0555419921875,
0.0308990478515625,
-0.0214385986328125,
-0.045562744140625,
-0.0049285888671875,
-0.035308837890625,
-0.07861328125,
0.0038433074951171875,
-0.0157928466796875,
-0.053253173828125,
0.0089569091796875,
-0.01171112060546875,
-0.0099029541015625,
0.03009033203125,
-0.0692138671875,
0.06597900390625,
-0.018218994140625,
-0.02825927734375,
0.01132965087890625,
-0.046173095703125,
0.03155517578125,
-0.0038318634033203125,
0.018707275390625,
0.00653839111328125,
-0.0162353515625,
0.0712890625,
-0.04400634765625,
0.0550537109375,
-0.028900146484375,
0.0226287841796875,
0.024871826171875,
-0.0157928466796875,
0.042266845703125,
0.01050567626953125,
-0.0268402099609375,
0.050140380859375,
0.0295867919921875,
-0.043731689453125,
-0.0191192626953125,
0.053192138671875,
-0.0921630859375,
-0.038970947265625,
-0.0455322265625,
-0.04022216796875,
-0.01154327392578125,
0.02520751953125,
0.033935546875,
0.00576019287109375,
-0.0128173828125,
0.0288848876953125,
0.01229095458984375,
-0.01552581787109375,
0.046966552734375,
0.03228759765625,
-0.024383544921875,
-0.04290771484375,
0.0592041015625,
0.0072021484375,
-0.004550933837890625,
0.0142822265625,
0.016021728515625,
-0.0406494140625,
-0.040985107421875,
-0.045623779296875,
0.034820556640625,
-0.038177490234375,
-0.01541900634765625,
-0.0626220703125,
-0.0310516357421875,
-0.039398193359375,
-0.01422882080078125,
-0.027008056640625,
-0.03363037109375,
-0.043701171875,
-0.0057525634765625,
0.0301971435546875,
0.041046142578125,
-0.00328826904296875,
0.0316162109375,
-0.051513671875,
0.007564544677734375,
-0.007328033447265625,
0.01800537109375,
-0.0033416748046875,
-0.04815673828125,
-0.017852783203125,
0.04248046875,
-0.029205322265625,
-0.036651611328125,
0.034271240234375,
0.0240478515625,
0.051483154296875,
0.040924072265625,
-0.0019588470458984375,
0.0654296875,
-0.0158538818359375,
0.068603515625,
0.01934814453125,
-0.059600830078125,
0.0257415771484375,
-0.041107177734375,
0.0175018310546875,
0.04461669921875,
0.04876708984375,
-0.0384521484375,
-0.03192138671875,
-0.0635986328125,
-0.042816162109375,
0.07183837890625,
0.0160369873046875,
0.00279998779296875,
-0.01181793212890625,
0.035308837890625,
0.0025920867919921875,
-0.004848480224609375,
-0.04254150390625,
-0.01116180419921875,
-0.04632568359375,
-0.0234527587890625,
-0.0098419189453125,
-0.00019359588623046875,
0.006805419921875,
-0.0310516357421875,
0.066650390625,
-0.0186309814453125,
0.033477783203125,
0.007709503173828125,
-0.002056121826171875,
-0.0012826919555664062,
-0.0140380859375,
0.048736572265625,
0.0447998046875,
-0.02398681640625,
-0.01224517822265625,
0.01342010498046875,
-0.058685302734375,
-0.0037403106689453125,
0.024627685546875,
-0.015777587890625,
-0.0135650634765625,
0.03399658203125,
0.06427001953125,
0.004131317138671875,
-0.043853759765625,
0.035430908203125,
-0.01207733154296875,
-0.035064697265625,
-0.0176849365234375,
0.010955810546875,
0.0193939208984375,
0.0196075439453125,
0.0222320556640625,
-0.00989532470703125,
0.0027904510498046875,
-0.02850341796875,
0.017730712890625,
0.031158447265625,
-0.024139404296875,
-0.019805908203125,
0.06268310546875,
0.0116119384765625,
-0.0219573974609375,
0.0487060546875,
-0.0215911865234375,
-0.0260162353515625,
0.06768798828125,
0.040771484375,
0.058868408203125,
0.00511932373046875,
0.0218353271484375,
0.0555419921875,
0.020599365234375,
-0.00884246826171875,
0.03204345703125,
0.017303466796875,
-0.042236328125,
-0.0207672119140625,
-0.058807373046875,
-0.0245513916015625,
0.0153350830078125,
-0.053466796875,
0.038299560546875,
-0.03973388671875,
-0.02008056640625,
0.01445770263671875,
0.037322998046875,
-0.054962158203125,
0.0070953369140625,
-0.0007843971252441406,
0.08697509765625,
-0.046722412109375,
0.06854248046875,
0.0616455078125,
-0.0643310546875,
-0.06549072265625,
-0.01025390625,
-0.00200653076171875,
-0.07489013671875,
0.058868408203125,
0.0219573974609375,
-0.000461578369140625,
0.0108795166015625,
-0.0380859375,
-0.060455322265625,
0.0794677734375,
0.03204345703125,
-0.042083740234375,
-0.0010652542114257812,
0.0234375,
0.04180908203125,
-0.029693603515625,
0.037506103515625,
0.02716064453125,
0.042572021484375,
0.030609130859375,
-0.05694580078125,
0.0014362335205078125,
-0.03936767578125,
-0.0023975372314453125,
0.0034236907958984375,
-0.07879638671875,
0.0643310546875,
-0.0236053466796875,
-0.01044464111328125,
0.006412506103515625,
0.05938720703125,
0.0188140869140625,
0.020416259765625,
0.0265350341796875,
0.0435791015625,
0.05743408203125,
-0.01328277587890625,
0.0731201171875,
-0.047210693359375,
0.037353515625,
0.057373046875,
-0.003692626953125,
0.06964111328125,
0.031890869140625,
-0.01070404052734375,
0.0289764404296875,
0.06707763671875,
-0.01580810546875,
0.0105743408203125,
-0.01100921630859375,
0.01239013671875,
-0.018463134765625,
-0.00603485107421875,
-0.040374755859375,
0.044525146484375,
0.0168304443359375,
-0.0183563232421875,
-0.00888824462890625,
0.00516510009765625,
0.036285400390625,
-0.0137939453125,
-0.007015228271484375,
0.045684814453125,
-0.002960205078125,
-0.061614990234375,
0.07293701171875,
0.019805908203125,
0.058685302734375,
-0.0565185546875,
0.0159912109375,
-0.040618896484375,
0.0165557861328125,
-0.0241851806640625,
-0.05517578125,
0.0341796875,
0.0102996826171875,
-0.00557708740234375,
-0.01428985595703125,
0.03131103515625,
-0.0194549560546875,
-0.0684814453125,
0.02886962890625,
0.0256805419921875,
0.022430419921875,
-0.00972747802734375,
-0.071044921875,
0.020355224609375,
-0.0067596435546875,
-0.031585693359375,
0.020477294921875,
0.0141754150390625,
-0.00362396240234375,
0.050201416015625,
0.053680419921875,
0.001590728759765625,
0.00501251220703125,
-0.00458526611328125,
0.05792236328125,
-0.05474853515625,
-0.0307769775390625,
-0.040740966796875,
0.0305633544921875,
-0.004695892333984375,
-0.03363037109375,
0.056976318359375,
0.064453125,
0.07220458984375,
0.0039825439453125,
0.0477294921875,
-0.004123687744140625,
0.0172271728515625,
-0.03167724609375,
0.061676025390625,
-0.05908203125,
-0.006328582763671875,
-0.0130157470703125,
-0.053466796875,
-0.0177154541015625,
0.0479736328125,
-0.0063323974609375,
0.0176544189453125,
0.052459716796875,
0.0672607421875,
-0.01038360595703125,
0.0256805419921875,
0.006603240966796875,
0.01898193359375,
0.0308380126953125,
0.058197021484375,
0.0430908203125,
-0.054718017578125,
0.0582275390625,
-0.0271148681640625,
-0.0142974853515625,
-0.0210723876953125,
-0.06378173828125,
-0.07550048828125,
-0.05633544921875,
-0.0232696533203125,
-0.032684326171875,
0.004482269287109375,
0.06573486328125,
0.064208984375,
-0.0626220703125,
-0.035614013671875,
-0.01026153564453125,
-0.0095672607421875,
-0.024017333984375,
-0.01629638671875,
0.03521728515625,
-0.0303497314453125,
-0.067138671875,
0.0162200927734375,
-0.004451751708984375,
0.01139068603515625,
-0.0224761962890625,
-0.0291900634765625,
-0.036346435546875,
-0.0016794204711914062,
0.03955078125,
0.036834716796875,
-0.05670166015625,
-0.018524169921875,
0.0299530029296875,
-0.0104522705078125,
0.003108978271484375,
0.0277252197265625,
-0.03961181640625,
0.023834228515625,
0.031890869140625,
0.044677734375,
0.0650634765625,
-0.0032558441162109375,
0.011932373046875,
-0.031463623046875,
0.0312347412109375,
0.0001266002655029297,
0.036224365234375,
0.01143646240234375,
-0.02874755859375,
0.04937744140625,
0.03045654296875,
-0.0301971435546875,
-0.054656982421875,
-0.0141754150390625,
-0.093505859375,
-0.021453857421875,
0.10211181640625,
-0.0166015625,
-0.03082275390625,
0.0079803466796875,
-0.0224761962890625,
0.028411865234375,
-0.034149169921875,
0.043670654296875,
0.056549072265625,
-0.0085601806640625,
0.00560760498046875,
-0.030364990234375,
0.0241851806640625,
0.01230621337890625,
-0.06939697265625,
-0.0097198486328125,
0.0299224853515625,
0.0244598388671875,
-0.0007944107055664062,
0.0236968994140625,
-0.00710296630859375,
0.0059356689453125,
0.01296234130859375,
0.00540924072265625,
-0.047210693359375,
-0.0224151611328125,
-0.0181121826171875,
0.01244354248046875,
-0.016845703125,
-0.0224609375
]
] |
cross-encoder/nli-deberta-v3-base | 2021-12-27T22:26:49.000Z | [
"transformers",
"pytorch",
"deberta-v2",
"text-classification",
"microsoft/deberta-v3-base",
"zero-shot-classification",
"en",
"dataset:multi_nli",
"dataset:snli",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | cross-encoder | null | null | cross-encoder/nli-deberta-v3-base | 11 | 15,434 | transformers | 2022-03-02T23:29:05 | ---
language: en
pipeline_tag: zero-shot-classification
tags:
- microsoft/deberta-v3-base
datasets:
- multi_nli
- snli
metrics:
- accuracy
license: apache-2.0
---
# Cross-Encoder for Natural Language Inference
This model was trained using [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class. This model is based on [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base)
## Training Data
The model was trained on the [SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) datasets. For a given sentence pair, it will output three scores corresponding to the labels: contradiction, entailment, neutral.
## Performance
- Accuracy on SNLI-test dataset: 92.38
- Accuracy on MNLI mismatched set: 90.04
For futher evaluation results, see [SBERT.net - Pretrained Cross-Encoder](https://www.sbert.net/docs/pretrained_cross-encoders.html#nli).
## Usage
Pre-trained models can be used like this:
```python
from sentence_transformers import CrossEncoder
model = CrossEncoder('cross-encoder/nli-deberta-v3-base')
scores = model.predict([('A man is eating pizza', 'A man eats something'), ('A black race car starts up in front of a crowd of people.', 'A man is driving down a lonely road.')])
#Convert scores to labels
label_mapping = ['contradiction', 'entailment', 'neutral']
labels = [label_mapping[score_max] for score_max in scores.argmax(axis=1)]
```
## Usage with Transformers AutoModel
You can use the model also directly with Transformers library (without SentenceTransformers library):
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
model = AutoModelForSequenceClassification.from_pretrained('cross-encoder/nli-deberta-v3-base')
tokenizer = AutoTokenizer.from_pretrained('cross-encoder/nli-deberta-v3-base')
features = tokenizer(['A man is eating pizza', 'A black race car starts up in front of a crowd of people.'], ['A man eats something', 'A man is driving down a lonely road.'], padding=True, truncation=True, return_tensors="pt")
model.eval()
with torch.no_grad():
scores = model(**features).logits
label_mapping = ['contradiction', 'entailment', 'neutral']
labels = [label_mapping[score_max] for score_max in scores.argmax(dim=1)]
print(labels)
```
## Zero-Shot Classification
This model can also be used for zero-shot-classification:
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification", model='cross-encoder/nli-deberta-v3-base')
sent = "Apple just announced the newest iPhone X"
candidate_labels = ["technology", "sports", "politics"]
res = classifier(sent, candidate_labels)
print(res)
``` | 2,777 | [
[
-0.01540374755859375,
-0.05865478515625,
0.0242767333984375,
0.0219879150390625,
-0.001232147216796875,
-0.006435394287109375,
-0.00296783447265625,
-0.0254974365234375,
0.013336181640625,
0.0322265625,
-0.038604736328125,
-0.03826904296875,
-0.045166015625,
0.01287841796875,
-0.03924560546875,
0.07879638671875,
-0.00852203369140625,
0.0035991668701171875,
-0.006893157958984375,
-0.00986480712890625,
-0.023773193359375,
-0.0355224609375,
-0.028961181640625,
-0.042327880859375,
0.0311126708984375,
0.0156097412109375,
0.0462646484375,
0.0260162353515625,
0.0125274658203125,
0.02618408203125,
0.001445770263671875,
-0.01303863525390625,
-0.0129547119140625,
-0.007183074951171875,
0.00016009807586669922,
-0.04351806640625,
-0.0092926025390625,
0.017578125,
0.0228424072265625,
0.029296875,
0.0024871826171875,
0.0191192626953125,
-0.00791168212890625,
0.0189971923828125,
-0.051055908203125,
0.004878997802734375,
-0.04351806640625,
0.016021728515625,
0.00701141357421875,
-0.004917144775390625,
-0.032806396484375,
-0.0269927978515625,
0.005527496337890625,
-0.033294677734375,
0.018951416015625,
-0.0042724609375,
0.09735107421875,
0.031524658203125,
-0.0233001708984375,
-0.0299224853515625,
-0.04345703125,
0.06768798828125,
-0.0760498046875,
0.0198516845703125,
0.018310546875,
0.0027523040771484375,
0.00653839111328125,
-0.05316162109375,
-0.07244873046875,
-0.01218414306640625,
-0.0163116455078125,
0.0287628173828125,
-0.032806396484375,
-0.00988006591796875,
0.03155517578125,
0.025115966796875,
-0.061126708984375,
0.005645751953125,
-0.0343017578125,
-0.00527191162109375,
0.053863525390625,
0.008697509765625,
0.018707275390625,
-0.0298309326171875,
-0.028076171875,
-0.01113128662109375,
-0.019775390625,
0.00804901123046875,
0.0209197998046875,
0.0008764266967773438,
-0.016754150390625,
0.061492919921875,
-0.02520751953125,
0.06292724609375,
0.0174713134765625,
-0.0017614364624023438,
0.058380126953125,
-0.020721435546875,
-0.0372314453125,
0.0268096923828125,
0.078125,
0.0318603515625,
0.026092529296875,
-0.0081939697265625,
-0.0038318634033203125,
0.03204345703125,
-0.00873565673828125,
-0.0579833984375,
-0.0188140869140625,
0.02667236328125,
-0.0235443115234375,
-0.0282135009765625,
0.000980377197265625,
-0.059844970703125,
-0.0010700225830078125,
-0.01419830322265625,
0.06060791015625,
-0.042449951171875,
0.01146697998046875,
0.0264892578125,
-0.0193328857421875,
0.0299835205078125,
-0.01294708251953125,
-0.063232421875,
0.000263214111328125,
0.0239715576171875,
0.060150146484375,
0.015411376953125,
-0.03790283203125,
-0.0318603515625,
-0.0002703666687011719,
0.003475189208984375,
0.0301361083984375,
-0.031982421875,
0.0008149147033691406,
-0.0111541748046875,
0.00603485107421875,
-0.02752685546875,
-0.0258026123046875,
0.04791259765625,
-0.023834228515625,
0.041290283203125,
0.02001953125,
-0.059051513671875,
-0.0283966064453125,
0.02252197265625,
-0.034454345703125,
0.0855712890625,
0.006496429443359375,
-0.0673828125,
0.0125579833984375,
-0.044036865234375,
-0.0305633544921875,
-0.02325439453125,
-0.003643035888671875,
-0.04071044921875,
0.004199981689453125,
0.03326416015625,
0.034271240234375,
-0.01439666748046875,
0.03631591796875,
-0.020294189453125,
-0.0305633544921875,
0.02496337890625,
-0.04583740234375,
0.09027099609375,
0.0110321044921875,
-0.046478271484375,
0.0093994140625,
-0.06201171875,
0.006427764892578125,
0.0110321044921875,
-0.0178070068359375,
-0.007747650146484375,
-0.02215576171875,
0.01045989990234375,
0.026519775390625,
0.0034084320068359375,
-0.058746337890625,
0.003078460693359375,
-0.037384033203125,
0.04754638671875,
0.02813720703125,
-0.00626373291015625,
0.021881103515625,
-0.015167236328125,
0.0204620361328125,
0.0068817138671875,
0.007686614990234375,
-0.0030231475830078125,
-0.048065185546875,
-0.075439453125,
-0.002277374267578125,
0.03759765625,
0.071044921875,
-0.06231689453125,
0.07135009765625,
-0.0175628662109375,
-0.05316162109375,
-0.0531005859375,
-0.016998291015625,
0.0137939453125,
0.040771484375,
0.04937744140625,
-0.0004343986511230469,
-0.05474853515625,
-0.05828857421875,
-0.0232391357421875,
-0.002925872802734375,
-0.0092315673828125,
-0.004367828369140625,
0.0626220703125,
-0.03350830078125,
0.08233642578125,
-0.040283203125,
-0.01837158203125,
-0.03656005859375,
0.025421142578125,
0.036895751953125,
0.053985595703125,
0.036956787109375,
-0.045166015625,
-0.0294342041015625,
-0.025360107421875,
-0.06494140625,
-0.00988006591796875,
-0.0240020751953125,
-0.0005311965942382812,
0.01140594482421875,
0.022613525390625,
-0.0438232421875,
0.048858642578125,
0.03802490234375,
-0.034515380859375,
0.030059814453125,
-0.01007843017578125,
0.0013036727905273438,
-0.083251953125,
-0.007232666015625,
0.01209259033203125,
-0.006458282470703125,
-0.0570068359375,
-0.0183563232421875,
-0.0110015869140625,
-0.00004982948303222656,
-0.033447265625,
0.0384521484375,
-0.017578125,
0.0107574462890625,
-0.00321197509765625,
0.0174713134765625,
0.0172119140625,
0.046142578125,
0.018402099609375,
0.033233642578125,
0.059326171875,
-0.039093017578125,
0.03631591796875,
0.0276031494140625,
-0.031524658203125,
0.02117919921875,
-0.06573486328125,
-0.005725860595703125,
-0.0132904052734375,
0.0179901123046875,
-0.06646728515625,
-0.006435394287109375,
0.03448486328125,
-0.046875,
0.00034546852111816406,
0.0157623291015625,
-0.032135009765625,
-0.03521728515625,
-0.006710052490234375,
0.026123046875,
0.039520263671875,
-0.038360595703125,
0.054168701171875,
0.010894775390625,
0.0260162353515625,
-0.046142578125,
-0.08770751953125,
-0.00027108192443847656,
-0.01342010498046875,
-0.035614013671875,
0.0246429443359375,
0.003292083740234375,
0.0003905296325683594,
0.00815582275390625,
0.0038127899169921875,
-0.0174102783203125,
-0.0006189346313476562,
0.0164642333984375,
0.0211639404296875,
-0.01434326171875,
-0.00023484230041503906,
-0.00519561767578125,
-0.01015472412109375,
0.01015472412109375,
-0.02020263671875,
0.04315185546875,
-0.019073486328125,
-0.0198974609375,
-0.048370361328125,
0.0191802978515625,
0.0257110595703125,
-0.016754150390625,
0.054412841796875,
0.07366943359375,
-0.0298614501953125,
0.004940032958984375,
-0.0367431640625,
-0.0144805908203125,
-0.0304718017578125,
0.032745361328125,
-0.02581787109375,
-0.04937744140625,
0.02777099609375,
0.02630615234375,
-0.01239013671875,
0.048858642578125,
0.033782958984375,
0.004428863525390625,
0.07049560546875,
0.028045654296875,
-0.022064208984375,
0.02386474609375,
-0.0465087890625,
0.0209808349609375,
-0.05279541015625,
-0.018951416015625,
-0.039093017578125,
-0.02496337890625,
-0.0460205078125,
-0.0256500244140625,
0.00485992431640625,
0.01418304443359375,
-0.0232391357421875,
0.04302978515625,
-0.0413818359375,
0.031951904296875,
0.056793212890625,
0.00689697265625,
0.01023101806640625,
0.00180816650390625,
-0.00392913818359375,
0.0016889572143554688,
-0.06304931640625,
-0.03814697265625,
0.0634765625,
0.0258026123046875,
0.06048583984375,
-0.01236724853515625,
0.06573486328125,
-0.003276824951171875,
0.0161895751953125,
-0.05169677734375,
0.03411865234375,
-0.020050048828125,
-0.055389404296875,
-0.0154571533203125,
-0.036468505859375,
-0.0677490234375,
0.01727294921875,
-0.026123046875,
-0.062225341796875,
0.0261383056640625,
-0.011566162109375,
-0.037506103515625,
0.0259246826171875,
-0.054168701171875,
0.0936279296875,
-0.029693603515625,
-0.0167388916015625,
0.00934600830078125,
-0.055419921875,
0.022003173828125,
0.00714874267578125,
-0.0009226799011230469,
-0.016998291015625,
0.0225982666015625,
0.061431884765625,
-0.0100250244140625,
0.073486328125,
-0.0102691650390625,
0.006977081298828125,
0.03314208984375,
-0.0188751220703125,
0.0107421875,
0.00824737548828125,
-0.024688720703125,
0.032501220703125,
-0.00389862060546875,
-0.0230255126953125,
-0.04718017578125,
0.04058837890625,
-0.0697021484375,
-0.024261474609375,
-0.04461669921875,
-0.0305633544921875,
0.014251708984375,
0.0136566162109375,
0.052490234375,
0.039459228515625,
-0.003940582275390625,
0.006427764892578125,
0.027069091796875,
-0.0259246826171875,
0.050140380859375,
0.0154571533203125,
-0.0092620849609375,
-0.035247802734375,
0.06011962890625,
0.0011930465698242188,
0.00856781005859375,
0.03594970703125,
0.0155792236328125,
-0.0384521484375,
-0.0173492431640625,
-0.03302001953125,
0.01666259765625,
-0.047332763671875,
-0.0165252685546875,
-0.05792236328125,
-0.041717529296875,
-0.044036865234375,
-0.005237579345703125,
-0.0154571533203125,
-0.029327392578125,
-0.041778564453125,
-0.00907135009765625,
0.0270233154296875,
0.03741455078125,
-0.003787994384765625,
0.0248260498046875,
-0.0595703125,
0.0309906005859375,
0.012603759765625,
0.007183074951171875,
-0.010589599609375,
-0.05474853515625,
-0.00945281982421875,
0.00653839111328125,
-0.032684326171875,
-0.0758056640625,
0.048736572265625,
0.01898193359375,
0.04705810546875,
0.0087127685546875,
0.0143890380859375,
0.0498046875,
-0.02447509765625,
0.05584716796875,
0.0227813720703125,
-0.0947265625,
0.047088623046875,
0.01241302490234375,
0.033355712890625,
0.038360595703125,
0.0386962890625,
-0.050628662109375,
-0.035369873046875,
-0.04266357421875,
-0.067138671875,
0.05267333984375,
0.0367431640625,
0.0089111328125,
-0.0111541748046875,
0.01354217529296875,
0.00290679931640625,
0.011962890625,
-0.09893798828125,
-0.034759521484375,
-0.047943115234375,
-0.0435791015625,
-0.0212860107421875,
0.00009751319885253906,
0.0081939697265625,
-0.044036865234375,
0.06768798828125,
-0.0005350112915039062,
0.0241241455078125,
0.04833984375,
-0.02069091796875,
0.0253143310546875,
0.02581787109375,
0.0443115234375,
0.0228118896484375,
-0.020050048828125,
0.005069732666015625,
0.0282135009765625,
-0.02099609375,
0.021392822265625,
0.02337646484375,
-0.0290679931640625,
0.019134521484375,
0.037322998046875,
0.09912109375,
-0.004047393798828125,
-0.035888671875,
0.040069580078125,
0.002490997314453125,
-0.01849365234375,
-0.03515625,
0.0030040740966796875,
-0.005039215087890625,
0.026397705078125,
0.01983642578125,
0.01392364501953125,
0.007457733154296875,
-0.042572021484375,
0.023345947265625,
0.0090789794921875,
-0.043975830078125,
-0.0157470703125,
0.0604248046875,
0.005672454833984375,
-0.0283050537109375,
0.0506591796875,
-0.020904541015625,
-0.051788330078125,
0.0504150390625,
0.045562744140625,
0.07891845703125,
-0.00006598234176635742,
0.0258941650390625,
0.050933837890625,
0.0294647216796875,
-0.00543975830078125,
0.0127105712890625,
0.006946563720703125,
-0.072509765625,
-0.0273284912109375,
-0.0513916015625,
-0.003818511962890625,
0.011871337890625,
-0.048675537109375,
0.01348876953125,
-0.01003265380859375,
-0.004398345947265625,
0.00571441650390625,
-0.01849365234375,
-0.048858642578125,
0.0219268798828125,
0.020111083984375,
0.06610107421875,
-0.08013916015625,
0.07049560546875,
0.036956787109375,
-0.052154541015625,
-0.060638427734375,
0.0108642578125,
-0.020050048828125,
-0.04925537109375,
0.06048583984375,
0.0413818359375,
0.003330230712890625,
0.0085601806640625,
-0.0266876220703125,
-0.052886962890625,
0.076171875,
0.01274871826171875,
-0.04327392578125,
-0.00847625732421875,
0.025238037109375,
0.0469970703125,
-0.032379150390625,
0.054046630859375,
0.053466796875,
0.0345458984375,
-0.00414276123046875,
-0.05242919921875,
0.0077362060546875,
-0.0101318359375,
-0.00421142578125,
-0.01064300537109375,
-0.0294647216796875,
0.0693359375,
-0.0172119140625,
0.0030918121337890625,
0.01049041748046875,
0.052642822265625,
0.0204315185546875,
0.034576416015625,
0.038909912109375,
0.061614990234375,
0.04571533203125,
-0.018157958984375,
0.0645751953125,
-0.0170745849609375,
0.05401611328125,
0.08282470703125,
-0.0186309814453125,
0.0662841796875,
0.033782958984375,
-0.00543212890625,
0.055816650390625,
0.045989990234375,
-0.03228759765625,
0.037750244140625,
0.023040771484375,
-0.003963470458984375,
-0.0174407958984375,
0.0137939453125,
-0.0261383056640625,
0.0640869140625,
0.004940032958984375,
-0.038665771484375,
-0.0167388916015625,
0.01456451416015625,
-0.0123443603515625,
-0.0021457672119140625,
-0.00751495361328125,
0.039337158203125,
-0.00901031494140625,
-0.051239013671875,
0.062469482421875,
0.0009446144104003906,
0.07159423828125,
-0.02813720703125,
0.0034694671630859375,
-0.0004734992980957031,
0.0258026123046875,
-0.0191192626953125,
-0.066162109375,
0.025238037109375,
-0.0017671585083007812,
-0.0091552734375,
-0.000028312206268310547,
0.033050537109375,
-0.05499267578125,
-0.06085205078125,
0.0380859375,
0.0225067138671875,
0.0165863037109375,
0.0004878044128417969,
-0.07427978515625,
-0.0016202926635742188,
0.015106201171875,
-0.0176239013671875,
-0.00824737548828125,
0.0286407470703125,
0.023162841796875,
0.0362548828125,
0.03143310546875,
-0.015228271484375,
0.02154541015625,
0.0158233642578125,
0.046722412109375,
-0.062042236328125,
-0.02166748046875,
-0.0770263671875,
0.04541015625,
-0.01230621337890625,
-0.03961181640625,
0.06561279296875,
0.058868408203125,
0.07073974609375,
-0.0234375,
0.052825927734375,
-0.0200958251953125,
0.0197601318359375,
-0.048675537109375,
0.0469970703125,
-0.04443359375,
0.0025234222412109375,
-0.01291656494140625,
-0.057281494140625,
-0.034271240234375,
0.06573486328125,
-0.026580810546875,
0.007457733154296875,
0.04791259765625,
0.07025146484375,
-0.0025787353515625,
0.0052490234375,
0.00959014892578125,
0.022216796875,
0.009063720703125,
0.0577392578125,
0.05865478515625,
-0.07421875,
0.052978515625,
-0.03912353515625,
-0.003814697265625,
-0.004878997802734375,
-0.05328369140625,
-0.07177734375,
-0.033355712890625,
-0.0426025390625,
-0.030120849609375,
-0.007366180419921875,
0.060302734375,
0.051605224609375,
-0.078857421875,
-0.01349639892578125,
-0.0201568603515625,
0.0126190185546875,
-0.02197265625,
-0.025848388671875,
0.0179290771484375,
-0.0193023681640625,
-0.06903076171875,
0.0240020751953125,
-0.004825592041015625,
0.00852203369140625,
-0.009185791015625,
-0.0048980712890625,
-0.0460205078125,
0.00167083740234375,
0.03607177734375,
0.011962890625,
-0.06781005859375,
-0.023956298828125,
0.0012865066528320312,
-0.0169525146484375,
0.01422882080078125,
0.030242919921875,
-0.0670166015625,
0.01312255859375,
0.04071044921875,
0.042816162109375,
0.05072021484375,
-0.0133056640625,
0.0258331298828125,
-0.050079345703125,
0.011627197265625,
0.01279449462890625,
0.033599853515625,
0.0247344970703125,
-0.015960693359375,
0.04144287109375,
0.0291900634765625,
-0.0435791015625,
-0.0469970703125,
0.006046295166015625,
-0.07177734375,
-0.02349853515625,
0.07855224609375,
-0.00945281982421875,
-0.03253173828125,
-0.0065765380859375,
-0.0139007568359375,
0.04364013671875,
-0.020111083984375,
0.045379638671875,
0.031982421875,
-0.01508331298828125,
-0.01213836669921875,
-0.0362548828125,
0.0222625732421875,
0.041748046875,
-0.05828857421875,
-0.015655517578125,
0.0137481689453125,
0.0311126708984375,
0.0322265625,
0.0307159423828125,
0.00934600830078125,
-0.001987457275390625,
0.01251220703125,
0.0216522216796875,
0.005218505859375,
-0.01026153564453125,
-0.036163330078125,
0.00839996337890625,
-0.042633056640625,
-0.045623779296875
]
] |
maywell/Synatra-V0.1-7B-Instruct | 2023-10-30T00:13:44.000Z | [
"transformers",
"pytorch",
"mistral",
"text-generation",
"ko",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | maywell | null | null | maywell/Synatra-V0.1-7B-Instruct | 12 | 15,394 | transformers | 2023-10-09T01:52:53 | ---
language:
- ko
library_name: transformers
pipeline_tag: text-generation
license: cc-by-nc-4.0
---
# **V0.3 IS UP**
[Link to V0.3](https://huggingface.co/maywell/Synatra-7B-v0.3-base)
# **Synatra-V0.1-7B**
Made by StableFluffy
[Visit my website! - Currently on consturction..](https://www.stablefluffy.kr/)
## License
This model is strictly [*non-commercial*](https://creativecommons.org/licenses/by-nc/4.0/) (**cc-by-nc-4.0**) use only which takes priority over the **LLAMA 2 COMMUNITY LICENSE AGREEMENT**.
The "Model" is completely free (ie. base model, derivates, merges/mixes) to use for non-commercial purposes as long as the the included **cc-by-nc-4.0** license in any parent repository, and the non-commercial use statute remains, regardless of other models' licences.
The licence can be changed after new model released.
## Model Details
**Base Model**
[mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)
**Trained On**
A6000 48GB * 8
## Instruction format
**학습 과정의 실수로 [/INST]가 아닌 [\INST]가 적용되었습니다. v0.2 에서 수정 될 예정입니다.**
In order to leverage instruction fine-tuning, your prompt should be surrounded by `[INST]` and `[\INST]` tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.
Plus, It is strongly recommended to add a space at the end of the prompt.
E.g.
```
text = "<s>[INST] 아이작 뉴턴의 업적을 알려줘. [\INST] "
```
# **Model Benchmark**
## KULLM Evaluation
구름v2 repo 에서 제공되는 데이터셋과 프롬프트를 사용하여 평가했습니다.
당시 GPT4와 현재 GPT4가 완전히 동일하지는 않기에 실제 결과와 약간의 차이가 존재 할 수 있습니다.

| Model | 이해가능성 | 자연스러움 | 맥락유지 | 흥미로움 | 지시어사용 | 전반적퀄리티
| --- | --- | --- | --- | --- | --- | ---
| GPT-3.5 | 0.980 | 2.806 | 2.849 | 2.056 | 0.917 | 3.905
| GPT-4 | 0.984 | 2.897 | 2.944 | 2.143 | 0.968 | 4.083
| KoAlpaca v1.1 | 0.651 | 1.909 | 1.901 | 1.583 | 0.385 | 2.575
| koVicuna | 0.460 | 1.583 | 1.726 | 1.528 | 0.409 | 2.440
| KULLM v2 | 0.742 | 2.083 | 2.107 | 1.794 | 0.548 | 3.036
| **Synatra-V0.1-7B** | **0.960** | **2.821** | **2.755** | **2.356** | **0.934** | **4.065**
## KOBEST_BOOLQ, SENTINEG, WIC - ZERO_SHOT
[EleutherAI/lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness/tree/polyglot)를 사용하여 BoolQ, SentiNeg, Wic을 측정했습니다.
HellaSwag와 COPA는 원본코드를 수정하는 과정에서 어려움을 겪어 아직 진행하지 않았습니다.
### NOTE
BoolQ에는 Instruction 모델의 이해를 돕기위해 "위 글에 대한 질문에 사실을 확인하는 작업입니다.", "예, 아니오로 대답해주세요."의 프롬프트를 추가했습니다.
SentiNeg에는 Instruction 모델의 이해를 돕기위해 "위 문장의 긍정, 부정 여부를 판단하세요."의 프롬프트를 추가했습니다.
Wic의 경우는 [INST], [\INST]만 추가하였습니다.
| Model | COPA | HellaSwag | BoolQ | SentiNeg | Wic
| --- | --- | --- | --- | --- | ---
| EleutherAI/polyglot-ko-12.8b | 0.7937 | 0.5954 | 0.4818 | 0.9117 | 0.3985
| **Synatra-V0.1-7B** | **NaN** | **NaN** | **0.849** | **0.8690** | **0.4881**
# **Implementation Code**
Since, chat_template already contains insturction format above.
You can use the code below.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
device = "cuda" # the device to load the model onto
model = AutoModelForCausalLM.from_pretrained("maywell/Synatra-V0.1-7B")
tokenizer = AutoTokenizer.from_pretrained("maywell/Synatra-V0.1-7B")
messages = [
{"role": "user", "content": "What is your favourite condiment?"},
]
encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")
model_inputs = encodeds.to(device)
model.to(device)
generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
```
If you run it on oobabooga your prompt would look like this. - ** Need to add Space at the end! **
```
[INST] 링컨에 대해서 알려줘. [\INST]
```
> Readme format: [beomi/llama-2-ko-7b](https://huggingface.co/beomi/llama-2-ko-7b)
--- | 3,866 | [
[
-0.0262603759765625,
-0.053955078125,
0.0287322998046875,
0.022125244140625,
-0.0300750732421875,
0.0012035369873046875,
0.00330352783203125,
-0.0271453857421875,
0.0277862548828125,
0.01296234130859375,
-0.04425048828125,
-0.043853759765625,
-0.04425048828125,
0.0082550048828125,
-0.0006422996520996094,
0.0634765625,
-0.006191253662109375,
-0.0093536376953125,
-0.00799560546875,
-0.011322021484375,
-0.0570068359375,
-0.0247955322265625,
-0.046173095703125,
-0.024139404296875,
0.003574371337890625,
0.022705078125,
0.040679931640625,
0.02838134765625,
0.022125244140625,
0.029083251953125,
-0.033416748046875,
0.01232147216796875,
-0.0261993408203125,
-0.0012826919555664062,
0.00856781005859375,
-0.0452880859375,
-0.047760009765625,
-0.002292633056640625,
0.036895751953125,
0.0065460205078125,
-0.005794525146484375,
0.036590576171875,
0.0007338523864746094,
0.036956787109375,
-0.035552978515625,
0.03125,
-0.0117950439453125,
-0.00043892860412597656,
-0.005771636962890625,
0.0038089752197265625,
-0.005859375,
-0.03594970703125,
-0.0151214599609375,
-0.05377197265625,
-0.004116058349609375,
0.0130157470703125,
0.09210205078125,
0.01459503173828125,
-0.03082275390625,
-0.0092315673828125,
-0.031158447265625,
0.051483154296875,
-0.07806396484375,
0.00736236572265625,
0.02410888671875,
0.034088134765625,
-0.03289794921875,
-0.055084228515625,
-0.051849365234375,
-0.015380859375,
-0.014862060546875,
0.027008056640625,
-0.03643798828125,
-0.0084228515625,
0.031951904296875,
0.030487060546875,
-0.03594970703125,
0.006969451904296875,
-0.050140380859375,
-0.0215606689453125,
0.04931640625,
0.0255279541015625,
0.03240966796875,
-0.03704833984375,
-0.02764892578125,
-0.0174713134765625,
-0.0262908935546875,
0.034454345703125,
0.01690673828125,
0.0002199411392211914,
-0.048431396484375,
0.04766845703125,
-0.0142974853515625,
0.031158447265625,
0.0223236083984375,
-0.031768798828125,
0.049896240234375,
-0.03936767578125,
-0.03948974609375,
0.005748748779296875,
0.0821533203125,
0.041229248046875,
-0.01250457763671875,
0.030487060546875,
-0.01314544677734375,
0.0126495361328125,
-0.0117645263671875,
-0.056488037109375,
-0.0080413818359375,
0.02899169921875,
-0.0474853515625,
-0.02862548828125,
0.0007762908935546875,
-0.055450439453125,
-0.00958251953125,
-0.01033782958984375,
0.042236328125,
-0.030120849609375,
-0.0264129638671875,
0.00980377197265625,
-0.0004987716674804688,
0.0255584716796875,
0.02484130859375,
-0.04266357421875,
0.0241851806640625,
0.0279083251953125,
0.059906005859375,
0.012847900390625,
-0.01515960693359375,
-0.014007568359375,
-0.00455474853515625,
-0.019500732421875,
0.04718017578125,
-0.001811981201171875,
-0.03619384765625,
-0.0267791748046875,
0.021240234375,
-0.018798828125,
-0.0328369140625,
0.033935546875,
-0.00792694091796875,
0.03662109375,
-0.0093994140625,
-0.034454345703125,
-0.034942626953125,
0.01511383056640625,
-0.037261962890625,
0.08978271484375,
0.0118560791015625,
-0.06048583984375,
-0.0013275146484375,
-0.054473876953125,
-0.005229949951171875,
-0.0146636962890625,
-0.00011843442916870117,
-0.042572021484375,
-0.0015859603881835938,
0.03900146484375,
0.028350830078125,
-0.0287933349609375,
0.0095367431640625,
-0.007694244384765625,
-0.035430908203125,
0.01461029052734375,
-0.0235443115234375,
0.09765625,
0.02410888671875,
-0.03607177734375,
0.0269012451171875,
-0.0626220703125,
0.002567291259765625,
0.037445068359375,
-0.01100921630859375,
-0.0029296875,
-0.0362548828125,
-0.01375579833984375,
0.0235443115234375,
0.03564453125,
-0.04205322265625,
0.0232086181640625,
-0.03485107421875,
0.025360107421875,
0.06317138671875,
0.00848388671875,
0.02740478515625,
-0.03948974609375,
0.057525634765625,
0.008209228515625,
0.031768798828125,
-0.00962066650390625,
-0.04425048828125,
-0.06829833984375,
-0.0222625732421875,
0.0029010772705078125,
0.050750732421875,
-0.048065185546875,
0.049896240234375,
0.013824462890625,
-0.064208984375,
-0.05926513671875,
-0.00682830810546875,
0.038970947265625,
0.041229248046875,
0.0230255126953125,
-0.00829315185546875,
-0.047760009765625,
-0.0604248046875,
0.015960693359375,
-0.0214691162109375,
0.0078277587890625,
0.0280914306640625,
0.049163818359375,
-0.0219268798828125,
0.056671142578125,
-0.04571533203125,
-0.02069091796875,
-0.037628173828125,
-0.0105438232421875,
0.0595703125,
0.040069580078125,
0.055450439453125,
-0.031768798828125,
-0.044921875,
-0.00969696044921875,
-0.0660400390625,
-0.0240020751953125,
0.0010995864868164062,
-0.0192413330078125,
0.0204315185546875,
0.0238494873046875,
-0.053436279296875,
0.04962158203125,
0.041473388671875,
-0.0494384765625,
0.0692138671875,
-0.0286407470703125,
0.01123046875,
-0.10064697265625,
0.01515960693359375,
-0.01105499267578125,
-0.0163421630859375,
-0.034332275390625,
0.0080108642578125,
0.00017464160919189453,
0.0062255859375,
-0.04229736328125,
0.06561279296875,
-0.042266845703125,
0.007648468017578125,
-0.00926971435546875,
0.0011472702026367188,
0.00643157958984375,
0.0401611328125,
-0.0247039794921875,
0.05230712890625,
0.04998779296875,
-0.0306396484375,
0.043121337890625,
0.0266265869140625,
-0.0255279541015625,
0.0218658447265625,
-0.062255859375,
0.01039886474609375,
-0.014312744140625,
0.0311737060546875,
-0.08758544921875,
-0.01337432861328125,
0.058319091796875,
-0.04010009765625,
0.0172271728515625,
-0.00836944580078125,
-0.0263214111328125,
-0.046051025390625,
-0.03240966796875,
0.0195770263671875,
0.0531005859375,
-0.03643798828125,
0.0419921875,
0.0224151611328125,
-0.000545501708984375,
-0.038482666015625,
-0.03790283203125,
-0.0089569091796875,
-0.023040771484375,
-0.051055908203125,
0.01490020751953125,
-0.00766754150390625,
-0.00585174560546875,
-0.0102081298828125,
-0.011810302734375,
0.0040283203125,
0.0054168701171875,
0.0292205810546875,
0.045806884765625,
-0.018341064453125,
-0.019744873046875,
0.005359649658203125,
-0.0230560302734375,
0.004550933837890625,
0.0025768280029296875,
0.058074951171875,
-0.0244140625,
-0.01910400390625,
-0.0592041015625,
0.006000518798828125,
0.0380859375,
-0.023590087890625,
0.050445556640625,
0.053680419921875,
-0.0170745849609375,
0.01329803466796875,
-0.043731689453125,
-0.0005202293395996094,
-0.037567138671875,
0.0206451416015625,
-0.0249786376953125,
-0.04425048828125,
0.0560302734375,
0.019256591796875,
0.002727508544921875,
0.055419921875,
0.048065185546875,
-0.0028209686279296875,
0.07452392578125,
0.036712646484375,
-0.0003199577331542969,
0.030120849609375,
-0.04913330078125,
0.006877899169921875,
-0.07037353515625,
-0.04754638671875,
-0.027008056640625,
-0.0226287841796875,
-0.04052734375,
-0.02960205078125,
0.036346435546875,
0.0162353515625,
-0.02606201171875,
0.02996826171875,
-0.055084228515625,
0.01555633544921875,
0.03851318359375,
0.0183563232421875,
0.00970458984375,
-0.0114593505859375,
-0.0240020751953125,
0.006481170654296875,
-0.054107666015625,
-0.0292205810546875,
0.06756591796875,
0.039642333984375,
0.052947998046875,
0.00882720947265625,
0.06427001953125,
-0.007747650146484375,
0.0097808837890625,
-0.050323486328125,
0.05023193359375,
0.0117950439453125,
-0.03521728515625,
-0.0114593505859375,
-0.019073486328125,
-0.07501220703125,
0.0298919677734375,
-0.0136566162109375,
-0.08270263671875,
0.02374267578125,
0.0052490234375,
-0.024932861328125,
0.02337646484375,
-0.05413818359375,
0.072265625,
-0.0224456787109375,
-0.020111083984375,
0.0028247833251953125,
-0.04595947265625,
0.03466796875,
0.008636474609375,
0.0115203857421875,
-0.027069091796875,
-0.00047779083251953125,
0.06671142578125,
-0.061553955078125,
0.0576171875,
-0.017913818359375,
-0.0037021636962890625,
0.039398193359375,
0.00726318359375,
0.038665771484375,
0.00909423828125,
-0.0005979537963867188,
0.0136566162109375,
0.0186767578125,
-0.0218963623046875,
-0.038665771484375,
0.042266845703125,
-0.07061767578125,
-0.05853271484375,
-0.048370361328125,
-0.031219482421875,
0.02313232421875,
0.0219573974609375,
0.0328369140625,
0.0191192626953125,
0.01531219482421875,
-0.0012922286987304688,
0.022674560546875,
-0.033782958984375,
0.036956787109375,
0.032470703125,
-0.032379150390625,
-0.0394287109375,
0.059417724609375,
0.00530242919921875,
0.01180267333984375,
0.0017557144165039062,
0.03314208984375,
-0.0386962890625,
-0.020660400390625,
-0.028594970703125,
0.03021240234375,
-0.0458984375,
-0.0230560302734375,
-0.04022216796875,
-0.0222320556640625,
-0.0447998046875,
-0.004695892333984375,
-0.0237274169921875,
-0.02984619140625,
-0.029052734375,
-0.01256561279296875,
0.043212890625,
0.053009033203125,
-0.00731658935546875,
0.0243988037109375,
-0.049285888671875,
0.018768310546875,
0.00003224611282348633,
0.004566192626953125,
0.012298583984375,
-0.053558349609375,
-0.0044708251953125,
0.0018100738525390625,
-0.031585693359375,
-0.07574462890625,
0.054229736328125,
-0.0160064697265625,
0.04595947265625,
0.01149749755859375,
-0.00336456298828125,
0.0625,
-0.01355743408203125,
0.06585693359375,
0.0303955078125,
-0.06683349609375,
0.0538330078125,
-0.018341064453125,
0.035919189453125,
0.0247955322265625,
0.01959228515625,
-0.036407470703125,
-0.02166748046875,
-0.05670166015625,
-0.06634521484375,
0.07183837890625,
0.0289764404296875,
-0.003551483154296875,
0.0116424560546875,
0.01171875,
-0.00644683837890625,
0.009490966796875,
-0.053741455078125,
-0.0474853515625,
-0.0282745361328125,
-0.01497650146484375,
0.004909515380859375,
-0.0239105224609375,
0.005523681640625,
-0.039703369140625,
0.055419921875,
0.0139617919921875,
0.0263214111328125,
0.025909423828125,
0.001682281494140625,
0.00553131103515625,
0.01593017578125,
0.0340576171875,
0.041473388671875,
-0.0294952392578125,
-0.01322174072265625,
0.026824951171875,
-0.040283203125,
0.0295562744140625,
0.00676727294921875,
-0.024749755859375,
0.006481170654296875,
0.01503753662109375,
0.07977294921875,
0.0028018951416015625,
-0.027069091796875,
0.042144775390625,
-0.0117950439453125,
-0.01378631591796875,
-0.031402587890625,
0.01001739501953125,
0.007488250732421875,
0.0283966064453125,
0.0160675048828125,
-0.012603759765625,
-0.004436492919921875,
-0.036651611328125,
-0.00562286376953125,
0.0100250244140625,
-0.01299285888671875,
-0.01357269287109375,
0.06304931640625,
-0.003070831298828125,
-0.021697998046875,
0.036712646484375,
-0.0191650390625,
-0.046966552734375,
0.06982421875,
0.06280517578125,
0.061370849609375,
-0.024078369140625,
0.007472991943359375,
0.0621337890625,
0.0091094970703125,
-0.004009246826171875,
0.041534423828125,
0.01416778564453125,
-0.031036376953125,
-0.00600433349609375,
-0.047698974609375,
-0.00621795654296875,
0.0233001708984375,
-0.037078857421875,
0.0291900634765625,
-0.04437255859375,
-0.0240325927734375,
-0.01326751708984375,
0.0144805908203125,
-0.048065185546875,
0.0125885009765625,
0.0086212158203125,
0.07275390625,
-0.061676025390625,
0.04754638671875,
0.054229736328125,
-0.048370361328125,
-0.0887451171875,
-0.020111083984375,
0.01201629638671875,
-0.0672607421875,
0.033935546875,
0.0195159912109375,
0.00885009765625,
-0.006591796875,
-0.037200927734375,
-0.08062744140625,
0.1082763671875,
0.019287109375,
-0.036285400390625,
-0.002613067626953125,
-0.003932952880859375,
0.039581298828125,
-0.0194549560546875,
0.04974365234375,
0.039886474609375,
0.035675048828125,
0.0084991455078125,
-0.09075927734375,
0.037872314453125,
-0.036163330078125,
-0.0033931732177734375,
0.00112152099609375,
-0.07135009765625,
0.07879638671875,
-0.021881103515625,
-0.0093231201171875,
0.017852783203125,
0.04791259765625,
0.044708251953125,
0.02032470703125,
0.035736083984375,
0.03021240234375,
0.05621337890625,
-0.00464630126953125,
0.084716796875,
-0.0230255126953125,
0.040771484375,
0.065673828125,
0.00997161865234375,
0.04022216796875,
0.0208740234375,
-0.03375244140625,
0.037078857421875,
0.055755615234375,
-0.0249786376953125,
0.0197906494140625,
0.00406646728515625,
-0.0154876708984375,
-0.0011234283447265625,
-0.00982666015625,
-0.03729248046875,
0.03790283203125,
0.0208892822265625,
-0.0163421630859375,
-0.0160064697265625,
-0.0042266845703125,
0.0122222900390625,
-0.0185394287109375,
-0.022003173828125,
0.04595947265625,
0.0160369873046875,
-0.034393310546875,
0.07177734375,
0.0009851455688476562,
0.053466796875,
-0.04754638671875,
-0.00962066650390625,
-0.0243377685546875,
0.003955841064453125,
-0.034454345703125,
-0.060577392578125,
-0.00540924072265625,
0.0008225440979003906,
0.0038738250732421875,
0.00916290283203125,
0.05889892578125,
-0.01177978515625,
-0.0517578125,
0.027984619140625,
0.0312042236328125,
0.030120849609375,
0.0222930908203125,
-0.06884765625,
0.0165557861328125,
0.0167999267578125,
-0.0260162353515625,
0.00830841064453125,
0.01447296142578125,
-0.00897979736328125,
0.055755615234375,
0.052032470703125,
0.0029087066650390625,
0.0083160400390625,
0.0070648193359375,
0.07568359375,
-0.048980712890625,
-0.046966552734375,
-0.0667724609375,
0.054901123046875,
-0.0059356689453125,
-0.0328369140625,
0.0655517578125,
0.04791259765625,
0.04815673828125,
0.00019729137420654297,
0.048065185546875,
-0.022796630859375,
0.0201416015625,
-0.04010009765625,
0.05780029296875,
-0.05059814453125,
0.021209716796875,
-0.02008056640625,
-0.0626220703125,
-0.002376556396484375,
0.055755615234375,
-0.0215911865234375,
0.004894256591796875,
0.0391845703125,
0.07257080078125,
-0.0063018798828125,
-0.0208740234375,
0.008819580078125,
0.037322998046875,
0.026885986328125,
0.06683349609375,
0.058563232421875,
-0.056488037109375,
0.03619384765625,
-0.04742431640625,
-0.0189971923828125,
-0.0216522216796875,
-0.052337646484375,
-0.0750732421875,
-0.0205078125,
-0.013458251953125,
-0.0423583984375,
-0.005756378173828125,
0.09283447265625,
0.03125,
-0.050445556640625,
-0.0275726318359375,
0.01203155517578125,
0.014495849609375,
-0.0162811279296875,
-0.0169219970703125,
0.048370361328125,
-0.005039215087890625,
-0.057708740234375,
0.01971435546875,
-0.005054473876953125,
0.032501220703125,
-0.00818634033203125,
-0.0089569091796875,
-0.0196380615234375,
-0.00927734375,
0.02801513671875,
0.02587890625,
-0.057464599609375,
-0.007389068603515625,
-0.00023281574249267578,
-0.0177154541015625,
0.02252197265625,
0.0093994140625,
-0.05474853515625,
0.0301055908203125,
0.043975830078125,
0.00229644775390625,
0.05096435546875,
-0.00341796875,
0.032806396484375,
-0.0303497314453125,
0.035430908203125,
-0.0011739730834960938,
0.04058837890625,
0.007083892822265625,
-0.0284881591796875,
0.040618896484375,
0.01708984375,
-0.0352783203125,
-0.06341552734375,
-0.016632080078125,
-0.08319091796875,
-0.0194854736328125,
0.0848388671875,
-0.005611419677734375,
-0.0450439453125,
0.005603790283203125,
-0.0556640625,
0.03558349609375,
-0.038604736328125,
0.0517578125,
0.024749755859375,
-0.01192474365234375,
-0.013671875,
-0.039764404296875,
0.025054931640625,
0.023040771484375,
-0.05718994140625,
-0.0230865478515625,
0.0205230712890625,
0.034210205078125,
0.006591796875,
0.06805419921875,
-0.0064239501953125,
0.0290679931640625,
-0.0017604827880859375,
0.00756072998046875,
-0.033416748046875,
0.006595611572265625,
-0.034637451171875,
-0.0208892822265625,
-0.0151214599609375,
-0.037841796875
]
] |
monologg/koelectra-base-v3-discriminator | 2021-10-20T16:53:40.000Z | [
"transformers",
"pytorch",
"electra",
"pretraining",
"korean",
"ko",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | monologg | null | null | monologg/koelectra-base-v3-discriminator | 20 | 15,372 | transformers | 2022-03-02T23:29:05 | ---
language: ko
license: apache-2.0
tags:
- korean
---
# KoELECTRA v3 (Base Discriminator)
Pretrained ELECTRA Language Model for Korean (`koelectra-base-v3-discriminator`)
For more detail, please see [original repository](https://github.com/monologg/KoELECTRA/blob/master/README_EN.md).
## Usage
### Load model and tokenizer
```python
>>> from transformers import ElectraModel, ElectraTokenizer
>>> model = ElectraModel.from_pretrained("monologg/koelectra-base-v3-discriminator")
>>> tokenizer = ElectraTokenizer.from_pretrained("monologg/koelectra-base-v3-discriminator")
```
### Tokenizer example
```python
>>> from transformers import ElectraTokenizer
>>> tokenizer = ElectraTokenizer.from_pretrained("monologg/koelectra-base-v3-discriminator")
>>> tokenizer.tokenize("[CLS] 한국어 ELECTRA를 공유합니다. [SEP]")
['[CLS]', '한국어', 'EL', '##EC', '##TRA', '##를', '공유', '##합니다', '.', '[SEP]']
>>> tokenizer.convert_tokens_to_ids(['[CLS]', '한국어', 'EL', '##EC', '##TRA', '##를', '공유', '##합니다', '.', '[SEP]'])
[2, 11229, 29173, 13352, 25541, 4110, 7824, 17788, 18, 3]
```
## Example using ElectraForPreTraining
```python
import torch
from transformers import ElectraForPreTraining, ElectraTokenizer
discriminator = ElectraForPreTraining.from_pretrained("monologg/koelectra-base-v3-discriminator")
tokenizer = ElectraTokenizer.from_pretrained("monologg/koelectra-base-v3-discriminator")
sentence = "나는 방금 밥을 먹었다."
fake_sentence = "나는 내일 밥을 먹었다."
fake_tokens = tokenizer.tokenize(fake_sentence)
fake_inputs = tokenizer.encode(fake_sentence, return_tensors="pt")
discriminator_outputs = discriminator(fake_inputs)
predictions = torch.round((torch.sign(discriminator_outputs[0]) + 1) / 2)
print(list(zip(fake_tokens, predictions.tolist()[1:-1])))
```
| 1,751 | [
[
-0.018402099609375,
-0.023468017578125,
0.007366180419921875,
0.0277252197265625,
-0.048309326171875,
0.0193634033203125,
-0.0019388198852539062,
0.0062255859375,
0.0215911865234375,
0.04217529296875,
-0.0303955078125,
-0.0455322265625,
-0.03955078125,
0.0240478515625,
-0.00424957275390625,
0.0711669921875,
-0.017425537109375,
0.01523590087890625,
0.0257568359375,
-0.0120391845703125,
-0.037139892578125,
-0.048370361328125,
-0.0479736328125,
-0.041656494140625,
0.02154541015625,
0.0455322265625,
0.0164337158203125,
0.0157928466796875,
0.0145721435546875,
0.03277587890625,
0.007232666015625,
-0.007045745849609375,
-0.016357421875,
-0.01071929931640625,
0.0009436607360839844,
-0.053070068359375,
-0.03460693359375,
-0.0010442733764648438,
0.033782958984375,
0.03704833984375,
0.00428009033203125,
0.0242462158203125,
-0.03375244140625,
0.043365478515625,
-0.027557373046875,
0.020782470703125,
-0.045928955078125,
0.00722503662109375,
-0.01473236083984375,
0.001667022705078125,
-0.034393310546875,
-0.04254150390625,
-0.006603240966796875,
-0.05487060546875,
0.0144805908203125,
0.01812744140625,
0.09503173828125,
0.0006256103515625,
-0.041595458984375,
-0.006725311279296875,
-0.047698974609375,
0.06427001953125,
-0.041900634765625,
0.026702880859375,
0.032318115234375,
0.0226287841796875,
0.01071929931640625,
-0.0831298828125,
-0.032073974609375,
-0.00954437255859375,
-0.009246826171875,
0.01849365234375,
-0.0024051666259765625,
0.016082763671875,
0.0374755859375,
0.01873779296875,
-0.0288543701171875,
0.0078277587890625,
-0.053314208984375,
-0.0272369384765625,
0.047515869140625,
0.0017871856689453125,
0.027130126953125,
-0.0177001953125,
-0.01493072509765625,
-0.01154327392578125,
-0.033538818359375,
-0.0102691650390625,
0.041290283203125,
0.02166748046875,
-0.00841522216796875,
0.061431884765625,
-0.0277252197265625,
0.03656005859375,
0.0249481201171875,
-0.006191253662109375,
0.05517578125,
-0.0241241455078125,
-0.0303955078125,
0.0299835205078125,
0.0535888671875,
0.019927978515625,
0.02154541015625,
0.002231597900390625,
-0.001220703125,
0.01483154296875,
0.00341033935546875,
-0.058074951171875,
-0.046661376953125,
0.0218963623046875,
-0.049896240234375,
-0.0225830078125,
0.0301971435546875,
-0.0650634765625,
0.006656646728515625,
0.0019330978393554688,
0.0309600830078125,
-0.0233612060546875,
-0.035675048828125,
0.0277252197265625,
-0.0272369384765625,
0.0175933837890625,
-0.00360107421875,
-0.055633544921875,
0.0248565673828125,
0.01390838623046875,
0.048370361328125,
0.032501220703125,
-0.0156097412109375,
-0.016571044921875,
-0.006275177001953125,
-0.00833892822265625,
0.036407470703125,
0.00722503662109375,
-0.0293731689453125,
0.007537841796875,
0.01551055908203125,
-0.0251007080078125,
-0.03717041015625,
0.052337646484375,
-0.040924072265625,
0.01544189453125,
0.01035308837890625,
-0.03179931640625,
-0.0250091552734375,
-0.00959014892578125,
-0.05718994140625,
0.0821533203125,
0.027862548828125,
-0.05792236328125,
0.03619384765625,
-0.05780029296875,
-0.043548583984375,
0.0267791748046875,
0.01271820068359375,
-0.0440673828125,
0.00704193115234375,
0.004520416259765625,
0.026947021484375,
0.019989013671875,
-0.01119232177734375,
-0.00823974609375,
-0.0169677734375,
0.00916290283203125,
-0.01332855224609375,
0.071533203125,
0.0145416259765625,
-0.0209197998046875,
0.0211639404296875,
-0.07586669921875,
0.013336181640625,
0.016387939453125,
-0.021209716796875,
-0.019683837890625,
-0.001689910888671875,
0.025390625,
0.0191192626953125,
0.03863525390625,
-0.05517578125,
0.0195465087890625,
-0.038818359375,
0.030487060546875,
0.060211181640625,
-0.007457733154296875,
0.0250244140625,
-0.0064544677734375,
0.04095458984375,
0.0111083984375,
-0.0229644775390625,
0.006114959716796875,
-0.01136016845703125,
-0.0777587890625,
-0.0230255126953125,
0.01148223876953125,
0.0550537109375,
-0.0723876953125,
0.058074951171875,
-0.0177001953125,
-0.060211181640625,
-0.039215087890625,
-0.008575439453125,
0.013946533203125,
0.00821685791015625,
0.029815673828125,
0.013153076171875,
-0.0927734375,
-0.049285888671875,
-0.030487060546875,
-0.031219482421875,
0.00807952880859375,
-0.0003807544708251953,
0.06341552734375,
-0.007396697998046875,
0.038177490234375,
-0.027587890625,
-0.0190582275390625,
-0.045806884765625,
0.01517486572265625,
0.052459716796875,
0.038360595703125,
0.025299072265625,
-0.047637939453125,
-0.056915283203125,
0.010894775390625,
-0.03631591796875,
-0.0117340087890625,
-0.0030078887939453125,
0.00159454345703125,
0.0023059844970703125,
0.033538818359375,
-0.046630859375,
0.0391845703125,
0.020416259765625,
-0.062164306640625,
0.06280517578125,
-0.01271820068359375,
0.00864410400390625,
-0.08111572265625,
-0.0021724700927734375,
-0.0177764892578125,
-0.027099609375,
-0.05670166015625,
-0.00716400146484375,
-0.000042498111724853516,
0.01479339599609375,
-0.0518798828125,
0.037750244140625,
-0.034271240234375,
0.019561767578125,
0.004436492919921875,
-0.0021648406982421875,
-0.0171356201171875,
0.0247039794921875,
-0.00030350685119628906,
0.046905517578125,
0.036865234375,
-0.052978515625,
0.0638427734375,
0.03656005859375,
-0.03631591796875,
0.019866943359375,
-0.06573486328125,
0.007175445556640625,
0.0013275146484375,
0.0137176513671875,
-0.0682373046875,
-0.025054931640625,
0.0206756591796875,
-0.0202789306640625,
0.0033168792724609375,
-0.00982666015625,
-0.046112060546875,
-0.05828857421875,
-0.0263671875,
0.039581298828125,
0.0435791015625,
-0.0498046875,
0.0264129638671875,
0.0102081298828125,
0.0019626617431640625,
-0.0374755859375,
-0.037261962890625,
-0.00884246826171875,
-0.027984619140625,
-0.0182647705078125,
0.02484130859375,
0.0009331703186035156,
0.004528045654296875,
-0.02655029296875,
-0.0159759521484375,
-0.003055572509765625,
-0.002445220947265625,
0.0035686492919921875,
0.03375244140625,
-0.01497650146484375,
-0.021453857421875,
-0.0026702880859375,
-0.0225067138671875,
0.00032639503479003906,
-0.01422882080078125,
0.08062744140625,
-0.02020263671875,
-0.016876220703125,
-0.04302978515625,
0.0011701583862304688,
0.0440673828125,
-0.0004024505615234375,
0.041107177734375,
0.050262451171875,
-0.0310211181640625,
-0.003711700439453125,
-0.0303955078125,
0.00899505615234375,
-0.038909912109375,
0.056427001953125,
-0.045013427734375,
-0.05389404296875,
0.045196533203125,
-0.0179901123046875,
-0.01473236083984375,
0.0455322265625,
0.045257568359375,
0.01160430908203125,
0.091796875,
0.020843505859375,
-0.00864410400390625,
0.03399658203125,
-0.02801513671875,
0.03277587890625,
-0.06585693359375,
-0.0261688232421875,
-0.0367431640625,
0.0012054443359375,
-0.042083740234375,
0.005329132080078125,
0.0009179115295410156,
0.0313720703125,
-0.03546142578125,
0.062744140625,
-0.0496826171875,
0.035186767578125,
0.0404052734375,
0.00888824462890625,
-0.012786865234375,
-0.00801849365234375,
-0.010833740234375,
-0.01171875,
-0.041259765625,
-0.049224853515625,
0.0792236328125,
0.028594970703125,
0.05963134765625,
-0.012451171875,
0.053253173828125,
-0.010498046875,
0.0015745162963867188,
-0.0557861328125,
0.047943115234375,
-0.01123809814453125,
-0.046295166015625,
-0.01262664794921875,
-0.0093231201171875,
-0.0740966796875,
0.0193634033203125,
0.007659912109375,
-0.078857421875,
0.011566162109375,
-0.011505126953125,
-0.0139007568359375,
0.03411865234375,
-0.04229736328125,
0.08648681640625,
-0.0187530517578125,
0.01059722900390625,
0.0083160400390625,
-0.0294036865234375,
0.016632080078125,
0.013275146484375,
-0.00063323974609375,
-0.011810302734375,
0.011627197265625,
0.0693359375,
-0.03399658203125,
0.050506591796875,
-0.0222320556640625,
0.009674072265625,
0.0067901611328125,
-0.0124053955078125,
0.0302734375,
0.012847900390625,
-0.002262115478515625,
0.01788330078125,
0.0020999908447265625,
-0.028656005859375,
-0.00839996337890625,
0.0362548828125,
-0.06573486328125,
-0.0159454345703125,
-0.038482666015625,
-0.01047515869140625,
0.029815673828125,
0.0479736328125,
0.071533203125,
0.0241546630859375,
-0.01493072509765625,
0.006900787353515625,
0.03448486328125,
-0.013214111328125,
0.04913330078125,
0.01415252685546875,
-0.0310211181640625,
-0.059051513671875,
0.0693359375,
0.0218658447265625,
0.00390625,
0.0016918182373046875,
-0.009796142578125,
-0.03375244140625,
-0.0240631103515625,
-0.0158233642578125,
0.021392822265625,
-0.0439453125,
-0.0229034423828125,
-0.040802001953125,
-0.032379150390625,
-0.038177490234375,
-0.0129547119140625,
-0.028594970703125,
-0.035675048828125,
-0.033660888671875,
-0.038848876953125,
0.02178955078125,
0.034942626953125,
-0.0167694091796875,
0.043670654296875,
-0.04913330078125,
0.045166015625,
0.01110076904296875,
0.02777099609375,
-0.028961181640625,
-0.02008056640625,
-0.031494140625,
-0.01016998291015625,
-0.012908935546875,
-0.07232666015625,
0.048614501953125,
0.002964019775390625,
0.038116455078125,
0.033599853515625,
0.00644683837890625,
0.04840087890625,
-0.05389404296875,
0.0732421875,
0.027679443359375,
-0.0797119140625,
0.04986572265625,
0.000028431415557861328,
0.01111602783203125,
0.050750732421875,
0.0066680908203125,
-0.041412353515625,
-0.01837158203125,
-0.054229736328125,
-0.0802001953125,
0.06854248046875,
0.02020263671875,
0.0183258056640625,
-0.013641357421875,
0.0016431808471679688,
0.010040283203125,
0.0214385986328125,
-0.08380126953125,
-0.06475830078125,
-0.043914794921875,
-0.0260162353515625,
0.0083160400390625,
-0.0212860107421875,
0.00292205810546875,
-0.0430908203125,
0.07696533203125,
0.0227203369140625,
0.033203125,
0.017425537109375,
-0.005168914794921875,
-0.00213623046875,
0.01451873779296875,
0.043701171875,
0.029693603515625,
-0.0234222412109375,
0.013153076171875,
0.0247039794921875,
-0.050872802734375,
0.031036376953125,
0.012939453125,
-0.0098419189453125,
0.043914794921875,
0.01445770263671875,
0.07635498046875,
-0.00893402099609375,
-0.0185699462890625,
0.0193023681640625,
0.007671356201171875,
-0.0244598388671875,
-0.067138671875,
-0.005786895751953125,
-0.017120361328125,
-0.0125732421875,
0.0300445556640625,
0.004161834716796875,
-0.004901885986328125,
-0.0164031982421875,
0.00853729248046875,
0.01702880859375,
-0.03680419921875,
-0.031524658203125,
0.059234619140625,
0.01788330078125,
-0.050018310546875,
0.043060302734375,
-0.0159759521484375,
-0.072998046875,
0.06317138671875,
0.0576171875,
0.0855712890625,
-0.02655029296875,
0.0230712890625,
0.05670166015625,
0.034210205078125,
-0.016693115234375,
0.047821044921875,
0.01202392578125,
-0.06427001953125,
-0.0158233642578125,
-0.0296478271484375,
0.00344085693359375,
0.025421142578125,
-0.050384521484375,
0.036773681640625,
-0.033660888671875,
-0.0177001953125,
-0.003414154052734375,
0.003383636474609375,
-0.05157470703125,
0.01220703125,
-0.0037212371826171875,
0.07623291015625,
-0.06573486328125,
0.06475830078125,
0.073486328125,
-0.0435791015625,
-0.056488037109375,
-0.031585693359375,
-0.043670654296875,
-0.049163818359375,
0.066650390625,
0.0295257568359375,
0.0165863037109375,
0.021240234375,
-0.055450439453125,
-0.0595703125,
0.0731201171875,
0.0016021728515625,
-0.0244903564453125,
0.003459930419921875,
0.0014848709106445312,
0.02386474609375,
-0.0135498046875,
0.0230865478515625,
0.056365966796875,
0.047393798828125,
-0.028167724609375,
-0.073486328125,
0.002933502197265625,
-0.0248565673828125,
-0.0010347366333007812,
0.021484375,
-0.045745849609375,
0.06951904296875,
0.0009908676147460938,
-0.0237884521484375,
0.0219573974609375,
0.05572509765625,
0.0300445556640625,
0.0184478759765625,
0.03436279296875,
0.04742431640625,
0.0416259765625,
-0.0292205810546875,
0.03643798828125,
-0.0171661376953125,
0.055450439453125,
0.05780029296875,
-0.01529693603515625,
0.040771484375,
0.043121337890625,
-0.035247802734375,
0.0731201171875,
0.042388916015625,
-0.0261383056640625,
0.058197021484375,
0.01222991943359375,
-0.036865234375,
-0.01163482666015625,
0.0198516845703125,
-0.0301055908203125,
0.0245513916015625,
0.0259246826171875,
-0.019927978515625,
-0.00853729248046875,
0.00783538818359375,
0.0163421630859375,
-0.0110015869140625,
-0.03759765625,
0.050994873046875,
0.0010137557983398438,
-0.02655029296875,
0.039642333984375,
0.01181793212890625,
0.06414794921875,
-0.047149658203125,
-0.00998687744140625,
0.007259368896484375,
0.0280609130859375,
-0.021484375,
-0.0302886962890625,
-0.003204345703125,
-0.009674072265625,
-0.003955841064453125,
-0.002368927001953125,
0.061279296875,
-0.032135009765625,
-0.056610107421875,
0.00786590576171875,
0.003955841064453125,
0.0169677734375,
0.0160369873046875,
-0.03863525390625,
-0.0164642333984375,
0.0092010498046875,
-0.0293426513671875,
-0.0003707408905029297,
0.01134490966796875,
0.0255126953125,
0.035247802734375,
0.04290771484375,
0.0168914794921875,
0.0248870849609375,
-0.0037593841552734375,
0.050537109375,
-0.0482177734375,
-0.03607177734375,
-0.0679931640625,
0.04913330078125,
-0.0284881591796875,
-0.04345703125,
0.07061767578125,
0.0528564453125,
0.07781982421875,
-0.0372314453125,
0.07000732421875,
-0.02020263671875,
0.028961181640625,
-0.03515625,
0.06207275390625,
-0.004596710205078125,
-0.01145172119140625,
-0.0205535888671875,
-0.06939697265625,
-0.02667236328125,
0.076416015625,
-0.0007662773132324219,
0.0066986083984375,
0.062255859375,
0.0384521484375,
-0.0045166015625,
-0.02069091796875,
0.0235748291015625,
0.0340576171875,
0.00893402099609375,
0.032684326171875,
0.04718017578125,
-0.06585693359375,
0.028472900390625,
-0.044891357421875,
0.003131866455078125,
-0.0171661376953125,
-0.049652099609375,
-0.0810546875,
-0.0206756591796875,
-0.031494140625,
-0.051849365234375,
-0.0290985107421875,
0.081298828125,
0.0404052734375,
-0.08428955078125,
-0.025054931640625,
-0.027130126953125,
0.0267486572265625,
-0.0192108154296875,
-0.023345947265625,
0.04193115234375,
-0.0205230712890625,
-0.060089111328125,
0.021270751953125,
-0.0025348663330078125,
0.006000518798828125,
0.00209808349609375,
-0.0066986083984375,
-0.020050048828125,
0.0022106170654296875,
0.0460205078125,
0.03253173828125,
-0.05718994140625,
-0.039703369140625,
-0.0108642578125,
-0.01493072509765625,
0.000324249267578125,
0.034942626953125,
-0.056365966796875,
0.02587890625,
0.03607177734375,
-0.00275421142578125,
0.035888671875,
-0.00001055002212524414,
0.033355712890625,
-0.057403564453125,
0.0287322998046875,
0.0033416748046875,
0.049224853515625,
0.004779815673828125,
-0.00873565673828125,
0.044677734375,
0.037841796875,
-0.05010986328125,
-0.0672607421875,
0.00780487060546875,
-0.0721435546875,
-0.0023651123046875,
0.0838623046875,
-0.01155853271484375,
-0.01044464111328125,
-0.033660888671875,
-0.05682373046875,
0.042816162109375,
-0.0129547119140625,
0.0341796875,
0.0548095703125,
0.0016956329345703125,
-0.0033473968505859375,
-0.0224761962890625,
0.03399658203125,
0.0184783935546875,
-0.05181884765625,
-0.006717681884765625,
0.00846099853515625,
0.018310546875,
0.0240020751953125,
0.05517578125,
-0.005069732666015625,
0.024444580078125,
0.0221405029296875,
0.03131103515625,
0.007659912109375,
-0.00147247314453125,
-0.0241851806640625,
-0.0085906982421875,
-0.017913818359375,
-0.05181884765625
]
] |
DeepFloyd/IF-I-L-v1.0 | 2023-06-02T19:04:26.000Z | [
"diffusers",
"pytorch",
"if",
"text-to-image",
"arxiv:2205.11487",
"arxiv:2110.02861",
"license:deepfloyd-if-license",
"diffusers:IFPipeline",
"region:us"
] | text-to-image | DeepFloyd | null | null | DeepFloyd/IF-I-L-v1.0 | 15 | 15,369 | diffusers | 2023-03-21T19:01:41 | ---
license: deepfloyd-if-license
extra_gated_prompt: "DeepFloyd LICENSE AGREEMENT\nThis License Agreement (as may be amended in accordance with this License Agreement, “License”), between you, or your employer or other entity (if you are entering into this agreement on behalf of your employer or other entity) (“Licensee” or “you”) and Stability AI Ltd.. (“Stability AI” or “we”) applies to your use of any computer program, algorithm, source code, object code, or software that is made available by Stability AI under this License (“Software”) and any specifications, manuals, documentation, and other written information provided by Stability AI related to the Software (“Documentation”).\nBy clicking “I Accept” below or by using the Software, you agree to the terms of this License. If you do not agree to this License, then you do not have any rights to use the Software or Documentation (collectively, the “Software Products”), and you must immediately cease using the Software Products. If you are agreeing to be bound by the terms of this License on behalf of your employer or other entity, you represent and warrant to Stability AI that you have full legal authority to bind your employer or such entity to this License. If you do not have the requisite authority, you may not accept the License or access the Software Products on behalf of your employer or other entity.\n1. LICENSE GRANT\n a. Subject to your compliance with the Documentation and Sections 2, 3, and 5, Stability AI grants you a non-exclusive, worldwide, non-transferable, non-sublicensable, revocable, royalty free and limited license under Stability AI’s copyright interests to reproduce, distribute, and create derivative works of the Software solely for your non-commercial research purposes. The foregoing license is personal to you, and you may not assign or sublicense this License or any other rights or obligations under this License without Stability AI’s prior written consent; any such assignment or sublicense will be void and will automatically and immediately terminate this License.\n b. You may make a reasonable number of copies of the Documentation solely for use in connection with the license to the Software granted above.\n c. The grant of rights expressly set forth in this Section 1 (License Grant) are the complete grant of rights to you in the Software Products, and no other licenses are granted, whether by waiver, estoppel, implication, equity or otherwise. Stability AI and its licensors reserve all rights not expressly granted by this License.\L\n2. RESTRICTIONS\n You will not, and will not permit, assist or cause any third party to:\n a. use, modify, copy, reproduce, create derivative works of, or distribute the Software Products (or any derivative works thereof, works incorporating the Software Products, or any data produced by the Software), in whole or in part, for (i) any commercial or production purposes, (ii) military purposes or in the service of nuclear technology, (iii) purposes of surveillance, including any research or development relating to surveillance, (iv) biometric processing, (v) in any manner that infringes, misappropriates, or otherwise violates any third-party rights, or (vi) in any manner that violates any applicable law and violating any privacy or security laws, rules, regulations, directives, or governmental requirements (including the General Data Privacy Regulation (Regulation (EU) 2016/679), the California Consumer Privacy Act, and any and all laws governing the processing of biometric information), as well as all amendments and successor laws to any of the foregoing;\n b. alter or remove copyright and other proprietary notices which appear on or in the Software Products;\n c. utilize any equipment, device, software, or other means to circumvent or remove any security or protection used by Stability AI in connection with the Software, or to circumvent or remove any usage restrictions, or to enable functionality disabled by Stability AI; or\n d. offer or impose any terms on the Software Products that alter, restrict, or are inconsistent with the terms of this License.\n e. 1) violate any applicable U.S. and non-U.S. export control and trade sanctions laws (“Export Laws”); 2) directly or indirectly export, re-export, provide, or otherwise transfer Software Products: (a) to any individual, entity, or country prohibited by Export Laws; (b) to anyone on U.S. or non-U.S. government restricted parties lists; or (c) for any purpose prohibited by Export Laws, including nuclear, chemical or biological weapons, or missile technology applications; 3) use or download Software Products if you or they are: (a) located in a comprehensively sanctioned jurisdiction, (b) currently listed on any U.S. or non-U.S. restricted parties list, or (c) for any purpose prohibited by Export Laws; and (4) will not disguise your location through IP proxying or other methods.\L\n3. ATTRIBUTION\n Together with any copies of the Software Products (as well as derivative works thereof or works incorporating the Software Products) that you distribute, you must provide (i) a copy of this License, and (ii) the following attribution notice: “DeepFloyd is licensed under the DeepFloyd License, Copyright (c) Stability AI Ltd. All Rights Reserved.”\L\n4. DISCLAIMERS\n THE SOFTWARE PRODUCTS ARE PROVIDED “AS IS” and “WITH ALL FAULTS” WITH NO WARRANTY OF ANY KIND, EXPRESS OR IMPLIED. STABILITY AIEXPRESSLY DISCLAIMS ALL REPRESENTATIONS AND WARRANTIES, EXPRESS OR IMPLIED, WHETHER BY STATUTE, CUSTOM, USAGE OR OTHERWISE AS TO ANY MATTERS RELATED TO THE SOFTWARE PRODUCTS, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE, SATISFACTORY QUALITY, OR NON-INFRINGEMENT. STABILITY AI MAKES NO WARRANTIES OR REPRESENTATIONS THAT THE SOFTWARE PRODUCTS WILL BE ERROR FREE OR FREE OF VIRUSES OR OTHER HARMFUL COMPONENTS, OR PRODUCE ANY PARTICULAR RESULTS.\L\n5. LIMITATION OF LIABILITY\n TO THE FULLEST EXTENT PERMITTED BY LAW, IN NO EVENT WILL STABILITY AI BE LIABLE TO YOU (A) UNDER ANY THEORY OF LIABILITY, WHETHER BASED IN CONTRACT, TORT, NEGLIGENCE, STRICT LIABILITY, WARRANTY, OR OTHERWISE UNDER THIS LICENSE, OR (B) FOR ANY INDIRECT, CONSEQUENTIAL, EXEMPLARY, INCIDENTAL, PUNITIVE OR SPECIAL DAMAGES OR LOST PROFITS, EVEN IF STABILITY AI HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. THE SOFTWARE PRODUCTS, THEIR CONSTITUENT COMPONENTS, AND ANY OUTPUT (COLLECTIVELY, “SOFTWARE MATERIALS”) ARE NOT DESIGNED OR INTENDED FOR USE IN ANY APPLICATION OR SITUATION WHERE FAILURE OR FAULT OF THE SOFTWARE MATERIALS COULD REASONABLY BE ANTICIPATED TO LEAD TO SERIOUS INJURY OF ANY PERSON, INCLUDING POTENTIAL DISCRIMINATION OR VIOLATION OF AN INDIVIDUAL’S PRIVACY RIGHTS, OR TO SEVERE PHYSICAL, PROPERTY, OR ENVIRONMENTAL DAMAGE (EACH, A “HIGH-RISK USE”). IF YOU ELECT TO USE ANY OF THE SOFTWARE MATERIALS FOR A HIGH-RISK USE, YOU DO SO AT YOUR OWN RISK. YOU AGREE TO DESIGN AND IMPLEMENT APPROPRIATE DECISION-MAKING AND RISK-MITIGATION PROCEDURES AND POLICIES IN CONNECTION WITH A HIGH-RISK USE SUCH THAT EVEN IF THERE IS A FAILURE OR FAULT IN ANY OF THE SOFTWARE MATERIALS, THE SAFETY OF PERSONS OR PROPERTY AFFECTED BY THE ACTIVITY STAYS AT A LEVEL THAT IS REASONABLE, APPROPRIATE, AND LAWFUL FOR THE FIELD OF THE HIGH-RISK USE.\L\n6. INDEMNIFICATION\n You will indemnify, defend and hold harmless Stability AI and our subsidiaries and affiliates, and each of our respective shareholders, directors, officers, employees, agents, successors, and assigns (collectively, the “Stability AI Parties”) from and against any losses, liabilities, damages, fines, penalties, and expenses (including reasonable attorneys’ fees) incurred by any Stability AI Party in connection with any claim, demand, allegation, lawsuit, proceeding, or investigation (collectively, “Claims”) arising out of or related to: (a) your access to or use of the Software Products (as well as any results or data generated from such access or use), including any High-Risk Use (defined below); (b) your violation of this License; or (c) your violation, misappropriation or infringement of any rights of another (including intellectual property or other proprietary rights and privacy rights). You will promptly notify the Stability AI Parties of any such Claims, and cooperate with Stability AI Parties in defending such Claims. You will also grant the Stability AI Parties sole control of the defense or settlement, at Stability AI’s sole option, of any Claims. This indemnity is in addition to, and not in lieu of, any other indemnities or remedies set forth in a written agreement between you and Stability AI or the other Stability AI Parties.\L\n7. TERMINATION; SURVIVAL\n a. This License will automatically terminate upon any breach by you of the terms of this License.\L\Lb. We may terminate this License, in whole or in part, at any time upon notice (including electronic) to you.\L\Lc. The following sections survive termination of this License: 2 (Restrictions), 3 (Attribution), 4 (Disclaimers), 5 (Limitation on Liability), 6 (Indemnification) 7 (Termination; Survival), 8 (Third Party Materials), 9 (Trademarks), 10 (Applicable Law; Dispute Resolution), and 11 (Miscellaneous).\L\n8. THIRD PARTY MATERIALS\n The Software Products may contain third-party software or other components (including free and open source software) (all of the foregoing, “Third Party Materials”), which are subject to the license terms of the respective third-party licensors. Your dealings or correspondence with third parties and your use of or interaction with any Third Party Materials are solely between you and the third party. Stability AI does not control or endorse, and makes no representations or warranties regarding, any Third Party Materials, and your access to and use of such Third Party Materials are at your own risk.\L\n9. TRADEMARKS\n Licensee has not been granted any trademark license as part of this License and may not use any name or mark associated with Stability AI without the prior written permission of Stability AI, except to the extent necessary to make the reference required by the “ATTRIBUTION” section of this Agreement.\L\n10. APPLICABLE LAW; DISPUTE RESOLUTION\n This License will be governed and construed under the laws of the State of California without regard to conflicts of law provisions. Any suit or proceeding arising out of or relating to this License will be brought in the federal or state courts, as applicable, in San Mateo County, California, and each party irrevocably submits to the jurisdiction and venue of such courts.\L\n11. MISCELLANEOUS\n If any provision or part of a provision of this License is unlawful, void or unenforceable, that provision or part of the provision is deemed severed from this License, and will not affect the validity and enforceability of any remaining provisions. The failure of Stability AI to exercise or enforce any right or provision of this License will not operate as a waiver of such right or provision. This License does not confer any third-party beneficiary rights upon any other person or entity. This License, together with the Documentation, contains the entire understanding between you and Stability AI regarding the subject matter of this License, and supersedes all other written or oral agreements and understandings between you and Stability AI regarding such subject matter. No change or addition to any provision of this License will be binding unless it is in writing and signed by an authorized representative of both you and Stability AI."
extra_gated_fields:
"Organization /\_Affiliation": text
Previously related publications: text
I accept the above license agreement, and will use the Software non-commercially and for research purposes only: checkbox
tags:
- if
- text-to-image
inference: false
---
# IF-I-L-v1.0
DeepFloyd-IF is a pixel-based text-to-image triple-cascaded diffusion model, that can generate pictures with new state-of-the-art for photorealism and language understanding. The result is a highly efficient model that outperforms current state-of-the-art models, achieving a zero-shot FID-30K score of `6.66` on the COCO dataset.
*Inspired by* [*Photorealistic Text-to-Image Diffusion Models with Deep Language Understanding*](https://arxiv.org/pdf/2205.11487.pdf)

## Model Details
- **Developed by:** DeepFloyd, StabilityAI
- **Model type:** pixel-based text-to-image cascaded diffusion model
- **Cascade Stage:** I
- **Num Parameters:** 900M
- **Language(s):** primarily English and, to a lesser extent, other Romance languages
- **License:** <span style="color:blue"><a href="https://huggingface.co/spaces/DeepFloyd/deepfloyd-if-license">DeepFloyd IF License Agreement</a></span>
- **Model Description:** DeepFloyd-IF is modular composed of frozen text mode and three pixel cascaded diffusion modules, each designed to generate images of increasing resolution: 64x64, 256x256, and 1024x1024. All stages of the model utilize a frozen text encoder based on the T5 transformer to extract text embeddings, which are then fed into a UNet architecture enhanced with cross-attention and attention-pooling
- **Resources for more information:** [GitHub](https://github.com/deep-floyd/IF), [Website](https://deepfloyd.ai), [All Links](https://linktr.ee/deepfloyd)
## Using with `diffusers`
IF is integrated with the 🤗 Hugging Face [🧨 diffusers library](https://github.com/huggingface/diffusers/), which is optimized to run on GPUs with as little as 14 GB of VRAM.
Before you can use IF, you need to accept its usage conditions. To do so:
1. Make sure to have a [Hugging Face account](https://huggingface.co/join) and be loggin in
2. Accept the license on the model card of [DeepFloyd/IF-I-L-v1.0](https://huggingface.co/DeepFloyd/IF-I-L-v1.0)
3. Make sure to login locally. Install `huggingface_hub`
```sh
pip install huggingface_hub --upgrade
```
run the login function in a Python shell
```py
from huggingface_hub import login
login()
```
and enter your [Hugging Face Hub access token](https://huggingface.co/docs/hub/security-tokens#what-are-user-access-tokens).
Next we install `diffusers` and dependencies:
```sh
pip install diffusers accelerate transformers safetensors sentencepiece
```
And we can now run the model locally.
By default `diffusers` makes use of [model cpu offloading](https://huggingface.co/docs/diffusers/optimization/fp16#model-offloading-for-fast-inference-and-memory-savings) to run the whole IF pipeline with as little as 14 GB of VRAM.
If you are using `torch>=2.0.0`, make sure to **remove all** `enable_xformers_memory_efficient_attention()` functions.
* **Load all stages and offload to CPU**
```py
from diffusers import DiffusionPipeline
from diffusers.utils import pt_to_pil
import torch
# stage 1
stage_1 = DiffusionPipeline.from_pretrained("DeepFloyd/IF-I-L-v1.0", variant="fp16", torch_dtype=torch.float16)
stage_1.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_1.enable_model_cpu_offload()
# stage 2
stage_2 = DiffusionPipeline.from_pretrained(
"DeepFloyd/IF-II-L-v1.0", text_encoder=None, variant="fp16", torch_dtype=torch.float16
)
stage_2.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_2.enable_model_cpu_offload()
# stage 3
safety_modules = {"feature_extractor": stage_1.feature_extractor, "safety_checker": stage_1.safety_checker, "watermarker": stage_1.watermarker}
stage_3 = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-x4-upscaler", **safety_modules, torch_dtype=torch.float16)
stage_3.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_3.enable_model_cpu_offload()
```
* **Retrieve Text Embeddings**
```py
prompt = 'a photo of a kangaroo wearing an orange hoodie and blue sunglasses standing in front of the eiffel tower holding a sign that says "very deep learning"'
# text embeds
prompt_embeds, negative_embeds = stage_1.encode_prompt(prompt)
```
* **Run stage 1**
```py
generator = torch.manual_seed(0)
image = stage_1(prompt_embeds=prompt_embeds, negative_prompt_embeds=negative_embeds, generator=generator, output_type="pt").images
pt_to_pil(image)[0].save("./if_stage_I.png")
```
* **Run stage 2**
```py
image = stage_2(
image=image, prompt_embeds=prompt_embeds, negative_prompt_embeds=negative_embeds, generator=generator, output_type="pt"
).images
pt_to_pil(image)[0].save("./if_stage_II.png")
```
* **Run stage 3**
```py
image = stage_3(prompt=prompt, image=image, generator=generator, noise_level=100).images
image[0].save("./if_stage_III.png")
```
There are multiple ways to speed up the inference time and lower the memory consumption even more with `diffusers`. To do so, please have a look at the Diffusers docs:
- 🚀 [Optimizing for inference time](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-speed)
- ⚙️ [Optimizing for low memory during inference](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-memory)
For more in-detail information about how to use IF, please have a look at [the IF blog post](https://huggingface.co/blog/if) and the [documentation](https://huggingface.co/docs/diffusers/main/en/api/pipelines/if) 📖.
Diffusers dreambooth scripts also supports fine-tuning 🎨 [IF](https://huggingface.co/docs/diffusers/main/en/training/dreambooth#if).
With parameter efficient finetuning, you can add new concepts to IF with a single GPU and ~28 GB VRAM.
## Training
**Training Data:**
1.2B text-image pairs (based on LAION-A and few additional internal datasets)
Test/Valid parts of datasets are not used at any cascade and stage of training. Valid part of COCO helps to demonstrate "online" loss behaviour during training (to catch incident and other problems), but dataset is never used for train.
**Training Procedure:** IF-I-L-v1.0 is pixel-based diffusion cascade which uses T5-Encoder embeddings (hidden states) to generate 64px image. During training,
- Images are cropped to square via shifted-center-crop augmentation (randomly shift from center up to 0.1 of size) and resized to 64px using `Pillow==9.2.0` BICUBIC resampling with reducing_gap=None (it helps to avoid aliasing) and processed to tensor BxCxHxW
- Text prompts are encoded through open-sourced frozen T5-v1_1-xxl text-encoder (that completely was trained by Google team), random 10% of texts are dropped to empty string to add ability for classifier free guidance (CFG)
- The non-pooled output of the text encoder is fed into the projection (linear layer without activation) and is used in UNet backbone of the diffusion model via controlled hybrid self- and cross- attention
- Also, the output of the text encode is pooled via attention-pooling (64 heads) and is used in time embed as additional features
- Diffusion process is limited by 1000 discrete steps, with cosine beta schedule of noising image
- The loss is a reconstruction objective between the noise that was added to the image and the prediction made by the UNet
- The training process for checkpoint IF-I-L-v1.0 has 2_500_000 steps + 500_000 extra steps at resolution 64x64 on all datasets, OneCycleLR policy, few-bit backward GELU activations, optimizer AdamW8bit + DeepSpeed-Zero1, fully frozen T5-Encoder

**Hardware:** 20 x 8 x A100 GPUs
**Optimizer:** [AdamW8bit](https://arxiv.org/abs/2110.02861) + [DeepSpeed ZeRO-1](https://www.deepspeed.ai/tutorials/zero/)
**Batch:** 3200
**Learning rate**: [one-cycle](https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.OneCycleLR.html) cosine strategy, warmup 10000 steps, start_lr=4e-6, max_lr=1e-4, final_lr=1e-8;
_for extra 500_000 steps:_ [one-cycle](https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.OneCycleLR.html) cosine strategy, warmup 50_000 steps, start_lr=1e-8, max_lr=4e-6, final_lr=4e-8

## Evaluation Results
`FID-30K: 8.06`

# Uses
## Direct Use
The model is released for research purposes. Any attempt to deploy the model in production requires not only that the LICENSE is followed but full liability over the person deploying the model.
Possible research areas and tasks include:
- Generation of artistic imagery and use in design and other artistic processes.
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section is originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), was used for Stable Diffusion but applies in the same way for IF_.
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation
- Representations of egregious violence and gore
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism
- The model was trained mainly with English captions and will not work as well in other languages.
- The model was trained on a subset of the large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, we have... (see Training section).
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
IF was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/),
which consists of images that are limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
IF mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.
*This model card was written by: DeepFloyd Team and is based on the [StableDiffusion model card](https://huggingface.co/CompVis/stable-diffusion-v1-4).* | 23,609 | [
[
-0.0443115234375,
-0.06597900390625,
0.0196533203125,
0.0290985107421875,
-0.0171966552734375,
-0.0042266845703125,
-0.0188751220703125,
-0.03521728515625,
0.005615234375,
0.022247314453125,
-0.04205322265625,
-0.041961669921875,
-0.0474853515625,
-0.01459503173828125,
-0.0176239013671875,
0.07818603515625,
-0.012237548828125,
-0.0150909423828125,
-0.00695037841796875,
0.0011768341064453125,
-0.01480865478515625,
-0.00847625732421875,
-0.0736083984375,
-0.0260009765625,
0.020477294921875,
0.021087646484375,
0.040771484375,
0.0262298583984375,
0.0261077880859375,
0.026763916015625,
-0.0181121826171875,
-0.0029773712158203125,
-0.03912353515625,
-0.0289764404296875,
0.01041412353515625,
-0.017822265625,
-0.033355712890625,
0.0016632080078125,
0.046630859375,
0.01171875,
-0.0023860931396484375,
0.005489349365234375,
0.00859832763671875,
0.052276611328125,
-0.048248291015625,
0.0196533203125,
-0.020904541015625,
0.0148162841796875,
0.0007042884826660156,
0.0145263671875,
-0.01236724853515625,
-0.01345062255859375,
0.02105712890625,
-0.04888916015625,
0.038116455078125,
-0.00647735595703125,
0.0863037109375,
0.0228424072265625,
-0.00913238525390625,
-0.008209228515625,
-0.026947021484375,
0.049407958984375,
-0.0543212890625,
0.0236968994140625,
0.0107269287109375,
0.0032405853271484375,
0.00539398193359375,
-0.06842041015625,
-0.05096435546875,
-0.01026153564453125,
0.0004551410675048828,
0.0258026123046875,
-0.0128326416015625,
0.01049041748046875,
0.0241241455078125,
0.049102783203125,
-0.03509521484375,
-0.0057220458984375,
-0.038482666015625,
-0.0137481689453125,
0.06304931640625,
-0.001056671142578125,
0.0180816650390625,
-0.0065765380859375,
-0.038482666015625,
-0.01544189453125,
-0.01861572265625,
0.0130157470703125,
0.004550933837890625,
0.0012760162353515625,
-0.049774169921875,
0.024444580078125,
-0.00681304931640625,
0.0276947021484375,
0.0285491943359375,
-0.015045166015625,
0.0310211181640625,
-0.012451171875,
-0.03326416015625,
0.01023101806640625,
0.08367919921875,
0.0225067138671875,
0.01065826416015625,
0.0044403076171875,
-0.00969696044921875,
0.002193450927734375,
0.0010690689086914062,
-0.09698486328125,
-0.039947509765625,
0.02911376953125,
-0.024566650390625,
-0.035858154296875,
-0.01198577880859375,
-0.06396484375,
-0.0115966796875,
0.0140533447265625,
0.041961669921875,
-0.061248779296875,
-0.031524658203125,
0.0190887451171875,
-0.016021728515625,
0.0173492431640625,
0.0292205810546875,
-0.0556640625,
0.026702880859375,
0.0257415771484375,
0.07940673828125,
-0.00760650634765625,
-0.01318359375,
-0.01155853271484375,
-0.0164031982421875,
-0.021392822265625,
0.043365478515625,
-0.018218994140625,
-0.02789306640625,
-0.0117645263671875,
0.007770538330078125,
-0.0106964111328125,
-0.0279998779296875,
0.048095703125,
-0.028106689453125,
0.036712646484375,
-0.0068206787109375,
-0.05029296875,
-0.028167724609375,
0.0059356689453125,
-0.04449462890625,
0.09417724609375,
0.0202484130859375,
-0.07330322265625,
0.0159759521484375,
-0.05450439453125,
-0.032440185546875,
0.00006407499313354492,
0.002689361572265625,
-0.0548095703125,
0.00217437744140625,
0.0235748291015625,
0.050140380859375,
-0.015106201171875,
0.0022830963134765625,
-0.018768310546875,
-0.031280517578125,
0.0031299591064453125,
-0.0269012451171875,
0.0849609375,
0.02105712890625,
-0.055511474609375,
-0.0005712509155273438,
-0.05035400390625,
-0.005184173583984375,
0.027923583984375,
-0.0252532958984375,
0.01049041748046875,
-0.030670166015625,
0.02301025390625,
0.02142333984375,
0.0161895751953125,
-0.045623779296875,
0.0144500732421875,
-0.0298614501953125,
0.037506103515625,
0.048919677734375,
-0.001251220703125,
0.03875732421875,
-0.0095062255859375,
0.036407470703125,
0.0203094482421875,
0.0169219970703125,
-0.0207366943359375,
-0.059814453125,
-0.07574462890625,
-0.0309295654296875,
0.01186370849609375,
0.03912353515625,
-0.058746337890625,
0.03057861328125,
0.00007897615432739258,
-0.041351318359375,
-0.04949951171875,
0.002960205078125,
0.041778564453125,
0.0482177734375,
0.032989501953125,
-0.018280029296875,
-0.018585205078125,
-0.0604248046875,
0.011871337890625,
0.00946807861328125,
0.01061248779296875,
0.02313232421875,
0.05035400390625,
-0.01551055908203125,
0.04705810546875,
-0.042724609375,
-0.03314208984375,
-0.0104217529296875,
0.000705718994140625,
0.027008056640625,
0.04638671875,
0.05322265625,
-0.05120849609375,
-0.046630859375,
-0.004375457763671875,
-0.06549072265625,
0.013519287109375,
-0.01131439208984375,
-0.005237579345703125,
0.032318115234375,
0.031585693359375,
-0.07159423828125,
0.044036865234375,
0.041961669921875,
-0.0300140380859375,
0.041168212890625,
-0.025299072265625,
0.007137298583984375,
-0.0740966796875,
0.0117340087890625,
0.0240325927734375,
-0.0153656005859375,
-0.0278472900390625,
0.01080322265625,
0.006603240966796875,
-0.01427459716796875,
-0.046783447265625,
0.059661865234375,
-0.044921875,
0.0219573974609375,
-0.011993408203125,
0.0034332275390625,
0.01291656494140625,
0.0484619140625,
0.00809478759765625,
0.051666259765625,
0.0667724609375,
-0.054412841796875,
0.016845703125,
0.008209228515625,
-0.030181884765625,
0.031280517578125,
-0.04888916015625,
0.013702392578125,
-0.014312744140625,
0.021331787109375,
-0.0792236328125,
-0.010162353515625,
0.0374755859375,
-0.0343017578125,
0.04315185546875,
-0.002986907958984375,
-0.030731201171875,
-0.041900634765625,
-0.0248260498046875,
0.0257415771484375,
0.06134033203125,
-0.039764404296875,
0.041351318359375,
0.01285552978515625,
0.0188751220703125,
-0.049957275390625,
-0.0555419921875,
-0.00714111328125,
-0.01520538330078125,
-0.058258056640625,
0.05010986328125,
-0.00859832763671875,
0.00040602684020996094,
0.00669097900390625,
0.002727508544921875,
0.0056304931640625,
0.00014865398406982422,
0.0214080810546875,
0.0107574462890625,
-0.021026611328125,
-0.01300048828125,
0.0139312744140625,
-0.0166473388671875,
0.0046234130859375,
-0.028350830078125,
0.040863037109375,
-0.0193328857421875,
0.006137847900390625,
-0.07098388671875,
0.00583648681640625,
0.0265655517578125,
0.005756378173828125,
0.060455322265625,
0.08941650390625,
-0.036041259765625,
-0.00688934326171875,
-0.04669189453125,
-0.0093536376953125,
-0.043243408203125,
0.017608642578125,
-0.030670166015625,
-0.0557861328125,
0.032440185546875,
-0.0024051666259765625,
0.01406097412109375,
0.049774169921875,
0.039703369140625,
-0.017822265625,
0.06640625,
0.049652099609375,
-0.01325225830078125,
0.037750244140625,
-0.0709228515625,
0.005779266357421875,
-0.055084228515625,
-0.0238037109375,
-0.00691986083984375,
-0.03173828125,
-0.033721923828125,
-0.04998779296875,
0.0223388671875,
0.0256500244140625,
-0.0307159423828125,
0.019195556640625,
-0.055511474609375,
0.024658203125,
0.02978515625,
0.021270751953125,
0.00119781494140625,
0.01093292236328125,
-0.01273345947265625,
0.001552581787109375,
-0.056243896484375,
-0.0172882080078125,
0.06158447265625,
0.02972412109375,
0.042205810546875,
-0.021514892578125,
0.05267333984375,
0.01117706298828125,
0.02978515625,
-0.036651611328125,
0.041900634765625,
-0.0019855499267578125,
-0.0501708984375,
0.00112152099609375,
-0.0232391357421875,
-0.05859375,
0.013702392578125,
-0.0244598388671875,
-0.05975341796875,
0.0187835693359375,
0.0181121826171875,
-0.0265045166015625,
0.041107177734375,
-0.060943603515625,
0.07550048828125,
-0.0198822021484375,
-0.050872802734375,
-0.01177215576171875,
-0.05035400390625,
0.0323486328125,
0.0177001953125,
-0.00337982177734375,
-0.01033782958984375,
-0.006015777587890625,
0.05535888671875,
-0.0355224609375,
0.05450439453125,
-0.030670166015625,
-0.0006380081176757812,
0.040191650390625,
-0.00679779052734375,
0.0223236083984375,
0.00665283203125,
-0.0204620361328125,
0.037811279296875,
-0.008209228515625,
-0.041290283203125,
-0.0285186767578125,
0.06298828125,
-0.06689453125,
-0.028564453125,
-0.034576416015625,
-0.020660400390625,
0.0154876708984375,
0.0204010009765625,
0.055938720703125,
0.0198211669921875,
-0.0157012939453125,
-0.0010213851928710938,
0.063232421875,
-0.0391845703125,
0.0540771484375,
-0.004718780517578125,
-0.0236968994140625,
-0.044219970703125,
0.07598876953125,
-0.0087738037109375,
0.011474609375,
0.0283966064453125,
0.0213165283203125,
-0.022613525390625,
-0.0276641845703125,
-0.0501708984375,
0.029632568359375,
-0.04034423828125,
-0.02520751953125,
-0.067626953125,
-0.0311431884765625,
-0.0330810546875,
-0.0223236083984375,
-0.046600341796875,
-0.0189056396484375,
-0.05340576171875,
-0.00014150142669677734,
0.0474853515625,
0.032745361328125,
-0.004322052001953125,
0.036773681640625,
-0.0297088623046875,
0.02630615234375,
0.004077911376953125,
0.0198211669921875,
0.01413726806640625,
-0.0384521484375,
-0.01788330078125,
0.0015459060668945312,
-0.03851318359375,
-0.046356201171875,
0.041900634765625,
0.022857666015625,
0.0172576904296875,
0.054931640625,
-0.01139068603515625,
0.0621337890625,
-0.0208282470703125,
0.05517578125,
0.02734375,
-0.06597900390625,
0.0335693359375,
-0.017730712890625,
0.0233612060546875,
0.0281982421875,
0.0450439453125,
-0.0172882080078125,
-0.0087127685546875,
-0.0645751953125,
-0.062164306640625,
0.059112548828125,
0.034698486328125,
0.0139312744140625,
0.01061248779296875,
0.051605224609375,
-0.00695037841796875,
0.0126190185546875,
-0.056365966796875,
-0.035552978515625,
-0.0209503173828125,
-0.00864410400390625,
-0.00896453857421875,
-0.005489349365234375,
0.01161956787109375,
-0.04888916015625,
0.06195068359375,
0.0007419586181640625,
0.051483154296875,
0.030792236328125,
-0.0016775131225585938,
0.0010309219360351562,
-0.0249176025390625,
0.026947021484375,
0.0224609375,
-0.0205230712890625,
-0.00981903076171875,
0.014892578125,
-0.043426513671875,
-0.0016345977783203125,
0.016571044921875,
-0.0183563232421875,
-0.00159454345703125,
0.0160675048828125,
0.07598876953125,
0.00853729248046875,
-0.02996826171875,
0.04302978515625,
-0.01351165771484375,
-0.0216827392578125,
-0.030059814453125,
0.01654052734375,
0.0208892822265625,
0.029205322265625,
0.01568603515625,
0.01323699951171875,
0.006992340087890625,
-0.0283660888671875,
0.01403045654296875,
0.03289794921875,
-0.02789306640625,
-0.0225982666015625,
0.0745849609375,
0.008392333984375,
-0.0279693603515625,
0.056915283203125,
-0.026763916015625,
-0.039764404296875,
0.055389404296875,
0.039337158203125,
0.0731201171875,
-0.01197052001953125,
0.0198211669921875,
0.0496826171875,
0.018280029296875,
-0.00011909008026123047,
0.0216217041015625,
-0.00836181640625,
-0.053314208984375,
-0.0187835693359375,
-0.053375244140625,
-0.0042724609375,
0.01284027099609375,
-0.03033447265625,
0.0352783203125,
-0.060760498046875,
-0.01326751708984375,
0.004093170166015625,
0.0189666748046875,
-0.07220458984375,
0.0318603515625,
0.0230865478515625,
0.07720947265625,
-0.050872802734375,
0.060760498046875,
0.032440185546875,
-0.03741455078125,
-0.04656982421875,
-0.00665283203125,
-0.01180267333984375,
-0.06640625,
0.02886962890625,
0.038726806640625,
-0.006610870361328125,
0.01099395751953125,
-0.055999755859375,
-0.05780029296875,
0.08984375,
0.03143310546875,
-0.03570556640625,
-0.0031223297119140625,
-0.013092041015625,
0.039520263671875,
-0.0298309326171875,
0.0301513671875,
0.039703369140625,
0.0255584716796875,
0.01519012451171875,
-0.04669189453125,
0.016632080078125,
-0.02923583984375,
0.0013217926025390625,
0.006244659423828125,
-0.07293701171875,
0.0736083984375,
-0.044769287109375,
-0.0227508544921875,
0.0114898681640625,
0.065673828125,
0.017181396484375,
0.0306243896484375,
0.025482177734375,
0.07379150390625,
0.050811767578125,
-0.00213623046875,
0.0975341796875,
-0.02386474609375,
0.041534423828125,
0.049163818359375,
0.007114410400390625,
0.045928955078125,
0.0144500732421875,
-0.01371002197265625,
0.044342041015625,
0.062744140625,
-0.0019006729125976562,
0.038482666015625,
0.0006670951843261719,
-0.0276641845703125,
-0.0109405517578125,
0.0148773193359375,
-0.03656005859375,
0.014556884765625,
0.034393310546875,
-0.03692626953125,
-0.002826690673828125,
0.01070404052734375,
0.0218505859375,
-0.0328369140625,
-0.01026153564453125,
0.047393798828125,
0.01476287841796875,
-0.046630859375,
0.062469482421875,
0.007480621337890625,
0.074951171875,
-0.028778076171875,
-0.0003554821014404297,
-0.0192108154296875,
0.031585693359375,
-0.028594970703125,
-0.060821533203125,
0.00362396240234375,
-0.0140533447265625,
0.0020809173583984375,
-0.0139617919921875,
0.057098388671875,
-0.0233917236328125,
-0.05755615234375,
0.032684326171875,
0.006526947021484375,
0.025543212890625,
-0.022430419921875,
-0.090087890625,
0.0193023681640625,
0.005222320556640625,
-0.035430908203125,
0.0245361328125,
0.026031494140625,
0.01800537109375,
0.05255126953125,
0.042144775390625,
-0.01189422607421875,
0.01084136962890625,
-0.0127716064453125,
0.07220458984375,
-0.044921875,
-0.02099609375,
-0.06768798828125,
0.0540771484375,
-0.0154876708984375,
-0.039398193359375,
0.050018310546875,
0.050201416015625,
0.065673828125,
-0.00275421142578125,
0.045379638671875,
-0.0211029052734375,
-0.007476806640625,
-0.03399658203125,
0.060760498046875,
-0.062286376953125,
0.0027313232421875,
-0.046295166015625,
-0.0631103515625,
-0.00992584228515625,
0.06927490234375,
-0.00955963134765625,
0.0215301513671875,
0.043731689453125,
0.058624267578125,
-0.0229644775390625,
0.00464630126953125,
0.00836181640625,
0.018768310546875,
0.0154876708984375,
0.048797607421875,
0.030731201171875,
-0.07232666015625,
0.0299835205078125,
-0.05950927734375,
-0.024566650390625,
-0.00786590576171875,
-0.0667724609375,
-0.06585693359375,
-0.045440673828125,
-0.057952880859375,
-0.050811767578125,
-0.00576019287109375,
0.044342041015625,
0.06341552734375,
-0.04656982421875,
-0.0117645263671875,
-0.01898193359375,
0.00555419921875,
-0.018768310546875,
-0.0248870849609375,
0.0399169921875,
-0.00940704345703125,
-0.06671142578125,
0.003261566162109375,
0.0205230712890625,
0.0006251335144042969,
-0.0245513916015625,
-0.0118255615234375,
-0.0236358642578125,
-0.01427459716796875,
0.052825927734375,
0.02215576171875,
-0.039947509765625,
-0.001270294189453125,
-0.01210784912109375,
0.0038089752197265625,
0.022369384765625,
0.040435791015625,
-0.0633544921875,
0.02899169921875,
0.030670166015625,
0.03515625,
0.08935546875,
-0.0163421630859375,
0.0119476318359375,
-0.04071044921875,
0.027801513671875,
0.013671875,
0.027069091796875,
0.028472900390625,
-0.04095458984375,
0.03326416015625,
0.03131103515625,
-0.042572021484375,
-0.045684814453125,
-0.01123046875,
-0.09136962890625,
-0.0236968994140625,
0.08367919921875,
-0.01410675048828125,
-0.0380859375,
0.01666259765625,
-0.0245361328125,
0.021881103515625,
-0.039093017578125,
0.053466796875,
0.0280609130859375,
-0.0208587646484375,
-0.04119873046875,
-0.034759521484375,
0.0450439453125,
0.01470947265625,
-0.048583984375,
-0.0242462158203125,
0.0276641845703125,
0.050811767578125,
0.0179290771484375,
0.06878662109375,
-0.0035152435302734375,
0.01038360595703125,
0.01099395751953125,
0.01568603515625,
0.01181793212890625,
-0.01088714599609375,
-0.023406982421875,
0.009246826171875,
-0.0328369140625,
-0.01690673828125
]
] |
kuelumbus/polyBERT | 2023-07-18T18:47:54.000Z | [
"sentence-transformers",
"pytorch",
"deberta-v2",
"feature-extraction",
"sentence-similarity",
"transformers",
"endpoints_compatible",
"region:us"
] | sentence-similarity | kuelumbus | null | null | kuelumbus/polyBERT | 3 | 15,366 | sentence-transformers | 2022-09-15T13:54:32 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
widget:
- source_sentence: "[*]CC[*]"
sentences:
- "[*]COC[*]"
- "[*]CC(C)C[*]"
---
# kuelumbus/polyBERT
This is polyBERT: A chemical language model to enable fully machine-driven ultrafast polymer informatics. polyBERT maps PSMILES strings to 600 dimensional dense fingerprints. The fingerprints numerically represent polymer chemical structures. Please see the license agreement in the LICENSE file.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
psmiles_strings = ["[*]CC[*]", "[*]COC[*]"]
polyBERT = SentenceTransformer('kuelumbus/polyBERT')
embeddings = polyBERT.encode(psmiles_strings)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
psmiles_strings = ["[*]CC[*]", "[*]COC[*]"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('kuelumbus/polyBERT')
polyBERT = AutoModel.from_pretrained('kuelumbus/polyBERT')
# Tokenize sentences
encoded_input = tokenizer(psmiles_strings, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = polyBERT(**encoded_input)
# Perform pooling. In this case, mean pooling.
fingerprints = mean_pooling(model_output, encoded_input['attention_mask'])
print("Fingerprints:")
print(fingerprints)
```
## Evaluation Results
See https://github.com/Ramprasad-Group/polyBERT and paper on arXiv.
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model
(1): Pooling({'word_embedding_dimension': 600, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
Kuenneth, C., Ramprasad, R. polyBERT: a chemical language model to enable fully machine-driven ultrafast polymer informatics. Nat Commun 14, 4099 (2023). https://doi.org/10.1038/s41467-023-39868-6 | 3,121 | [
[
-0.016143798828125,
-0.04962158203125,
0.03790283203125,
0.0105743408203125,
-0.0186920166015625,
0.013946533203125,
-0.01139068603515625,
-0.01213836669921875,
0.01161956787109375,
0.023834228515625,
-0.0162353515625,
-0.032562255859375,
-0.0604248046875,
-0.0011434555053710938,
-0.0312347412109375,
0.07562255859375,
-0.0044403076171875,
-0.009246826171875,
-0.01158905029296875,
-0.010528564453125,
0.0240936279296875,
-0.0467529296875,
-0.0215606689453125,
-0.031524658203125,
0.0194244384765625,
0.026763916015625,
0.047210693359375,
0.0266876220703125,
0.0172576904296875,
0.0271759033203125,
-0.03106689453125,
0.0087127685546875,
-0.027618408203125,
-0.0088348388671875,
0.0029277801513671875,
-0.0286865234375,
-0.0138702392578125,
0.029571533203125,
0.05108642578125,
0.036529541015625,
-0.00412750244140625,
0.0204315185546875,
0.002197265625,
0.053619384765625,
-0.063720703125,
0.0225067138671875,
-0.0230255126953125,
0.0174102783203125,
0.00439453125,
0.0186767578125,
-0.052001953125,
-0.01439666748046875,
0.0157012939453125,
-0.03558349609375,
0.007904052734375,
-0.0034008026123046875,
0.083251953125,
-0.003025054931640625,
-0.0263214111328125,
-0.017364501953125,
-0.05169677734375,
0.05914306640625,
-0.048492431640625,
0.0161590576171875,
0.021759033203125,
-0.0013570785522460938,
-0.01273345947265625,
-0.09307861328125,
-0.0701904296875,
-0.0106353759765625,
-0.0209503173828125,
0.028839111328125,
-0.0162811279296875,
0.013763427734375,
0.0242462158203125,
0.0323486328125,
-0.042236328125,
-0.0252227783203125,
-0.036224365234375,
-0.0185394287109375,
0.037261962890625,
0.008544921875,
0.019744873046875,
-0.017181396484375,
-0.033935546875,
-0.036651611328125,
-0.0020694732666015625,
0.01117706298828125,
0.01221466064453125,
-0.000033974647521972656,
-0.032135009765625,
0.055694580078125,
0.0019779205322265625,
0.049468994140625,
-0.0033206939697265625,
0.0098114013671875,
0.0391845703125,
-0.01004791259765625,
-0.018280029296875,
0.00768280029296875,
0.089111328125,
0.00955963134765625,
0.034423828125,
0.008514404296875,
-0.004093170166015625,
0.01314544677734375,
0.0023040771484375,
-0.0635986328125,
-0.039337158203125,
0.03619384765625,
-0.03436279296875,
-0.03387451171875,
0.0267181396484375,
-0.06201171875,
-0.015655517578125,
-0.00505828857421875,
0.048004150390625,
-0.0489501953125,
-0.0005621910095214844,
0.034210205078125,
-0.012115478515625,
0.0035953521728515625,
-0.0156707763671875,
-0.06976318359375,
0.0238800048828125,
0.0258026123046875,
0.0826416015625,
0.0033473968505859375,
-0.0325927734375,
-0.0294189453125,
0.007526397705078125,
0.005794525146484375,
0.043365478515625,
-0.051605224609375,
-0.0121002197265625,
-0.0159759521484375,
0.00553131103515625,
-0.04461669921875,
-0.01407623291015625,
0.0283966064453125,
-0.018707275390625,
0.044830322265625,
0.0211944580078125,
-0.060577392578125,
-0.0262298583984375,
-0.015960693359375,
-0.0406494140625,
0.08636474609375,
0.0248870849609375,
-0.0794677734375,
-0.010986328125,
-0.0596923828125,
-0.0157623291015625,
-0.0005040168762207031,
-0.003692626953125,
-0.06036376953125,
0.0127716064453125,
0.0236358642578125,
0.03643798828125,
-0.010986328125,
0.0138397216796875,
-0.0190887451171875,
-0.040313720703125,
0.035552978515625,
-0.006927490234375,
0.09014892578125,
0.0197601318359375,
-0.04302978515625,
0.006420135498046875,
-0.05206298828125,
0.00337982177734375,
0.017730712890625,
-0.020294189453125,
0.0055694580078125,
0.0026226043701171875,
0.0318603515625,
0.022979736328125,
0.025299072265625,
-0.06256103515625,
0.0219268798828125,
-0.0272064208984375,
0.05059814453125,
0.056793212890625,
-0.012451171875,
0.031707763671875,
-0.02008056640625,
0.0303955078125,
0.01525115966796875,
0.022430419921875,
-0.00812530517578125,
-0.031707763671875,
-0.07354736328125,
-0.04107666015625,
0.0240936279296875,
0.048309326171875,
-0.041839599609375,
0.051177978515625,
-0.01194000244140625,
-0.05487060546875,
-0.036041259765625,
-0.0033893585205078125,
0.007671356201171875,
0.031585693359375,
0.0301513671875,
-0.018829345703125,
-0.0289154052734375,
-0.06683349609375,
0.006526947021484375,
-0.014678955078125,
0.005718231201171875,
0.0225372314453125,
0.05352783203125,
-0.0304107666015625,
0.06396484375,
-0.06158447265625,
-0.0204010009765625,
-0.035552978515625,
0.029693603515625,
0.038726806640625,
0.0233917236328125,
0.0372314453125,
-0.050384521484375,
-0.046600341796875,
-0.027618408203125,
-0.05426025390625,
-0.00028896331787109375,
-0.004840850830078125,
-0.00489044189453125,
0.0242767333984375,
0.037628173828125,
-0.05780029296875,
0.0117645263671875,
0.0518798828125,
-0.047027587890625,
0.052459716796875,
-0.0272064208984375,
-0.0089111328125,
-0.09710693359375,
0.0282440185546875,
-0.004512786865234375,
-0.0075836181640625,
-0.046295166015625,
0.0019321441650390625,
0.01419830322265625,
-0.0199432373046875,
-0.047607421875,
0.027435302734375,
-0.031707763671875,
0.016998291015625,
-0.0196533203125,
0.0116424560546875,
-0.000019669532775878906,
0.044342041015625,
0.0026416778564453125,
0.052490234375,
0.06646728515625,
-0.0413818359375,
0.0323486328125,
0.0227508544921875,
-0.0257568359375,
0.0004935264587402344,
-0.0579833984375,
0.006031036376953125,
-0.019012451171875,
0.0276031494140625,
-0.07794189453125,
-0.0003044605255126953,
0.021697998046875,
-0.04144287109375,
0.000522613525390625,
0.01525115966796875,
-0.06439208984375,
-0.046142578125,
-0.034149169921875,
0.02703857421875,
0.048248291015625,
-0.038818359375,
0.060943603515625,
0.0223846435546875,
-0.0013408660888671875,
-0.034576416015625,
-0.040740966796875,
-0.00974273681640625,
-0.0010700225830078125,
-0.0491943359375,
0.031005859375,
-0.011749267578125,
0.01434326171875,
-0.015167236328125,
0.0078277587890625,
-0.0024929046630859375,
-0.001743316650390625,
0.006580352783203125,
0.0265350341796875,
0.004016876220703125,
0.011962890625,
-0.00414276123046875,
-0.0211181640625,
0.0018062591552734375,
-0.0220794677734375,
0.0623779296875,
-0.027069091796875,
0.00992584228515625,
-0.039276123046875,
0.010009765625,
0.07806396484375,
-0.03399658203125,
0.081787109375,
0.07086181640625,
-0.0146026611328125,
-0.006862640380859375,
-0.0318603515625,
-0.0289306640625,
-0.0400390625,
0.0192108154296875,
-0.0142059326171875,
-0.06756591796875,
0.039337158203125,
0.0033588409423828125,
-0.00687408447265625,
0.0263519287109375,
0.040557861328125,
-0.01493072509765625,
0.07427978515625,
0.057373046875,
-0.018829345703125,
0.040130615234375,
-0.04400634765625,
0.032073974609375,
-0.0657958984375,
-0.0167083740234375,
-0.01444244384765625,
-0.0216827392578125,
-0.04937744140625,
-0.026275634765625,
0.032073974609375,
0.005519866943359375,
-0.019622802734375,
0.04302978515625,
-0.06256103515625,
0.0252227783203125,
0.05035400390625,
0.01061248779296875,
-0.004421234130859375,
0.02105712890625,
-0.05194091796875,
-0.00023448467254638672,
-0.052001953125,
-0.043121337890625,
0.07073974609375,
0.0216522216796875,
0.05804443359375,
-0.00647735595703125,
0.047027587890625,
0.002040863037109375,
-0.0121917724609375,
-0.041717529296875,
0.05426025390625,
-0.0240325927734375,
-0.0302734375,
-0.0018777847290039062,
-0.028289794921875,
-0.06573486328125,
0.0306854248046875,
-0.0203857421875,
-0.0858154296875,
0.01085662841796875,
-0.0025691986083984375,
-0.025726318359375,
0.04034423828125,
-0.055938720703125,
0.06964111328125,
-0.006305694580078125,
-0.0243682861328125,
-0.01456451416015625,
-0.053497314453125,
0.006313323974609375,
0.0016956329345703125,
-0.0009927749633789062,
-0.0165252685546875,
-0.0005736351013183594,
0.0657958984375,
-0.0159912109375,
0.0599365234375,
-0.0014190673828125,
0.0142822265625,
0.028472900390625,
0.004730224609375,
0.022430419921875,
-0.00525665283203125,
-0.004215240478515625,
0.009552001953125,
0.0031833648681640625,
-0.031036376953125,
-0.03271484375,
0.040252685546875,
-0.06646728515625,
-0.0187835693359375,
-0.0660400390625,
-0.05255126953125,
0.009490966796875,
0.01611328125,
0.030303955078125,
0.0185699462890625,
0.01611328125,
0.0149688720703125,
0.045989990234375,
-0.031402587890625,
0.053192138671875,
0.049774169921875,
-0.00931549072265625,
-0.0367431640625,
0.076416015625,
-0.001987457275390625,
0.0125732421875,
0.0267181396484375,
0.0299072265625,
-0.020294189453125,
0.0037631988525390625,
-0.015380859375,
0.0404052734375,
-0.0304412841796875,
0.005580902099609375,
-0.072021484375,
-0.032867431640625,
-0.041290283203125,
-0.00676727294921875,
-0.031219482421875,
-0.0384521484375,
-0.0266876220703125,
-0.00366973876953125,
0.0369873046875,
0.035797119140625,
-0.00510406494140625,
0.0282440185546875,
-0.0638427734375,
0.025390625,
0.007419586181640625,
0.01873779296875,
-0.002948760986328125,
-0.043609619140625,
-0.0225372314453125,
0.0002351999282836914,
-0.03839111328125,
-0.07061767578125,
0.02593994140625,
0.0197296142578125,
0.0281829833984375,
0.013336181640625,
0.007312774658203125,
0.0131378173828125,
-0.023468017578125,
0.05694580078125,
0.006458282470703125,
-0.08734130859375,
0.049072265625,
-0.01038360595703125,
0.045013427734375,
0.0307159423828125,
0.0440673828125,
-0.050872802734375,
-0.04522705078125,
-0.0538330078125,
-0.08685302734375,
0.0711669921875,
0.04150390625,
0.01139068603515625,
-0.0122833251953125,
0.0162200927734375,
-0.0249481201171875,
0.002826690673828125,
-0.0909423828125,
-0.044342041015625,
-0.0145721435546875,
-0.040008544921875,
-0.0113372802734375,
-0.033294677734375,
0.00832366943359375,
-0.042877197265625,
0.0662841796875,
-0.001743316650390625,
0.0433349609375,
0.025604248046875,
-0.0234527587890625,
0.00884246826171875,
0.005771636962890625,
0.043609619140625,
0.0273895263671875,
-0.0200653076171875,
0.0196533203125,
0.0009784698486328125,
-0.05224609375,
0.023956298828125,
0.0298309326171875,
-0.0110015869140625,
0.00616455078125,
0.0181121826171875,
0.047119140625,
-0.007171630859375,
-0.04913330078125,
0.046722412109375,
-0.005828857421875,
-0.00759124755859375,
-0.0296478271484375,
-0.001377105712890625,
0.021392822265625,
0.01015472412109375,
0.0206451416015625,
-0.019561767578125,
-0.007114410400390625,
-0.0254058837890625,
0.0222930908203125,
0.0118408203125,
-0.0335693359375,
-0.005451202392578125,
0.05743408203125,
-0.0003902912139892578,
-0.002834320068359375,
0.07293701171875,
-0.01226806640625,
-0.056060791015625,
0.03826904296875,
0.041900634765625,
0.06988525390625,
-0.0233001708984375,
0.0298614501953125,
0.042755126953125,
0.01995849609375,
0.010406494140625,
0.0256195068359375,
-0.00986480712890625,
-0.05059814453125,
-0.0106201171875,
-0.061492919921875,
-0.003787994384765625,
0.00791168212890625,
-0.033782958984375,
0.016021728515625,
-0.032135009765625,
-0.004520416259765625,
0.00978851318359375,
-0.0102996826171875,
-0.0101318359375,
0.0025806427001953125,
0.0224456787109375,
0.05499267578125,
-0.0860595703125,
0.06732177734375,
0.06317138671875,
-0.043609619140625,
-0.04888916015625,
-0.00997161865234375,
-0.007465362548828125,
-0.04534912109375,
0.043609619140625,
0.037353515625,
-0.0057373046875,
0.0041961669921875,
-0.042388916015625,
-0.06732177734375,
0.082763671875,
0.037872314453125,
-0.045684814453125,
0.0031261444091796875,
0.026153564453125,
0.042144775390625,
-0.017303466796875,
0.0268707275390625,
0.03741455078125,
0.024993896484375,
-0.00942230224609375,
-0.053619384765625,
0.01255035400390625,
-0.0310211181640625,
-0.002269744873046875,
0.01477813720703125,
-0.057220458984375,
0.07781982421875,
-0.01519775390625,
-0.01904296875,
0.00537872314453125,
0.052001953125,
0.02484130859375,
-0.01079559326171875,
0.025390625,
0.06756591796875,
0.037750244140625,
-0.005146026611328125,
0.07733154296875,
-0.04925537109375,
0.06622314453125,
0.0626220703125,
-0.0005011558532714844,
0.05621337890625,
0.03802490234375,
-0.0123443603515625,
0.06268310546875,
0.03338623046875,
-0.0279693603515625,
0.033416748046875,
0.0220947265625,
-0.04449462890625,
-0.01488494873046875,
-0.006824493408203125,
-0.00926971435546875,
0.03729248046875,
0.0172576904296875,
-0.055511474609375,
-0.006702423095703125,
0.00909423828125,
0.0007648468017578125,
-0.0004887580871582031,
0.0110931396484375,
0.037750244140625,
0.0126953125,
-0.034454345703125,
0.03997802734375,
0.012908935546875,
0.05078125,
-0.039154052734375,
-0.001605987548828125,
-0.0020542144775390625,
0.038177490234375,
-0.0094146728515625,
-0.060943603515625,
0.0020275115966796875,
-0.01399993896484375,
-0.016693115234375,
-0.0030670166015625,
0.034576416015625,
-0.04132080078125,
-0.0621337890625,
0.044189453125,
0.01078033447265625,
0.01255035400390625,
0.004573822021484375,
-0.0748291015625,
0.005218505859375,
0.0024471282958984375,
-0.030426025390625,
0.0025787353515625,
-0.00408172607421875,
0.022857666015625,
0.0537109375,
0.03436279296875,
-0.0014162063598632812,
-0.00464630126953125,
0.0030994415283203125,
0.0634765625,
-0.040557861328125,
-0.035430908203125,
-0.07733154296875,
0.039031982421875,
-0.001461029052734375,
-0.01373291015625,
0.047027587890625,
0.06439208984375,
0.05694580078125,
-0.0161895751953125,
0.036529541015625,
-0.0218048095703125,
0.00806427001953125,
-0.0257415771484375,
0.073486328125,
-0.0307464599609375,
-0.01488494873046875,
-0.035064697265625,
-0.0849609375,
-0.01381683349609375,
0.07476806640625,
-0.0055084228515625,
0.00565338134765625,
0.0662841796875,
0.070556640625,
-0.026153564453125,
-0.006023406982421875,
0.00409698486328125,
0.0288848876953125,
0.0194091796875,
0.059661865234375,
0.029205322265625,
-0.052001953125,
0.046600341796875,
-0.05078125,
-0.015472412109375,
-0.005207061767578125,
-0.04791259765625,
-0.0643310546875,
-0.061737060546875,
-0.03179931640625,
-0.03680419921875,
0.0048370361328125,
0.06976318359375,
0.049774169921875,
-0.053375244140625,
-0.031005859375,
-0.0209808349609375,
-0.01175689697265625,
-0.01259613037109375,
-0.0256805419921875,
0.04541015625,
-0.0215606689453125,
-0.05841064453125,
0.007404327392578125,
0.009674072265625,
0.025146484375,
-0.0026531219482421875,
0.00910186767578125,
-0.04205322265625,
0.0050048828125,
0.0469970703125,
0.006378173828125,
-0.041717529296875,
-0.01074981689453125,
-0.0019521713256835938,
-0.025299072265625,
0.0010995864868164062,
0.04278564453125,
-0.056793212890625,
0.050933837890625,
0.035369873046875,
0.017333984375,
0.068359375,
-0.022705078125,
0.04522705078125,
-0.04119873046875,
0.02252197265625,
0.0166015625,
0.043853759765625,
0.018798828125,
-0.019744873046875,
0.0347900390625,
0.02947998046875,
-0.041961669921875,
-0.040740966796875,
0.009765625,
-0.0791015625,
-0.030548095703125,
0.07525634765625,
-0.0163116455078125,
-0.0305023193359375,
0.01165008544921875,
-0.017791748046875,
0.03076171875,
-0.0210113525390625,
0.0731201171875,
0.058258056640625,
-0.0212860107421875,
-0.0164947509765625,
-0.01290130615234375,
0.0145416259765625,
0.05279541015625,
-0.033966064453125,
0.00614166259765625,
0.0188446044921875,
0.0208282470703125,
0.0257415771484375,
0.032623291015625,
-0.020965576171875,
0.0347900390625,
0.00012671947479248047,
0.0261383056640625,
-0.0128631591796875,
-0.01580810546875,
-0.024810791015625,
0.0062103271484375,
-0.0179443359375,
-0.0186767578125
]
] |
xiaolxl/GuoFeng3 | 2023-10-28T08:16:21.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:cc-by-nc-sa-4.0",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | xiaolxl | null | null | xiaolxl/GuoFeng3 | 458 | 15,364 | diffusers | 2023-01-28T11:29:27 | ---
license: cc-by-nc-sa-4.0
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
---
<img src=https://huggingface.co/xiaolxl/GuoFeng3/resolve/main/examples/cover.png>
# 基于SDXL的国风4已发布!- GuoFeng4 based on SDXL has been released! : https://huggingface.co/xiaolxl/GuoFeng4_XL
# 本人郑重声明:本模型禁止用于训练基于明星、公众人物肖像的风格模型训练,因为这会带来争议,对AI社区的发展造成不良的负面影响。
# 本模型注明:训练素材中不包含任何真人素材。
| 版本 | 效果图 |
| --- | --- |
| **GuoFeng3.4** |  |
| **GuoFeng3.3** |  |
| **GuoFeng3.2_light** |  |
| **GuoFeng3.2** |  |
| **GuoFeng3** |  |
# 介绍 - GuoFeng3
欢迎使用GuoFeng3模型 - (TIP:这个版本的名字进行了微调),这是一个中国华丽古风风格模型,也可以说是一个古风游戏角色模型,具有2.5D的质感。第三代大幅度减少上手难度,增加了场景元素与男性古风人物,除此之外为了模型能更好的适应其它TAG,还增加了其它风格的元素。这一代对脸和手的崩坏有一定的修复,同时素材大小也提高到了最长边1024。
根据个人的实验与收到的反馈,国风模型系列的第二代,在人物,与大头照的效果表现比三代更好,如果你有这方面需求不妨试试第二代。
2.0版本:[https://huggingface.co/xiaolxl/Gf_style2](https://huggingface.co/xiaolxl/Gf_style2)
GuoFeng3:原始模型
GuoFeng3.1:对GuoFeng3人像进行了微调修复
GuoFeng3.2:如果你不知道选择GuoFeng3还是GuoFeng2,可以直接使用此版本
GuoFeng3.2_light:通过GuoFeng3.2融合了基于 Noise Offset 训练的Lora使得模型能够画出更漂亮的光影效果(Lora:epi_noiseoffset/Theovercomer8's Contrast Fix)
GuoFeng3.2_Lora:国风3.2 Lora版本
GuoFeng3.2_Lora_big_light:国风3.2_light Lora版本 维度增大版本
GuoFeng3.2_f16:国风3.2 半精版本
GuoFeng3.2_light_f16:国风3.2_light 半精版本
GuoFeng3.3:此版本是基于3.2的一次较大的更新与改进,可以适配full body,即使你的tag不太好,模型也会对画面进行自动修改,不过因此模型出的脸会比较雷同。此模型似乎不需要超分,我的出图大小是768*1024,清晰度还不错。建议竖图,横图可能不清晰。Euler a即可。(DPM++ SDE Karras, DDIM也不错)
GuoFeng3.4:此版本重新进行了新的训练,适配全身图,同时内容上与前几个版本有较大不同。并调整了整体画风,降低了过拟合程度,使其能使用更多的lora对画面与内容进行调整。
--
Welcome to the GuoFeng3 model - (TIP: the name of this version has been fine-tuned). This is a Chinese gorgeous antique style model, which can also be said to be an antique game character model with a 2.5D texture. The third generation greatly reduces the difficulty of getting started, and adds scene elements and male antique characters. In addition, in order to better adapt the model to other TAGs, other style elements are also added. This generation has repaired the broken face and hands to a certain extent, and the size of the material has also increased to the longest side of 1024.
According to personal experiments and feedback received, the second generation of the Guofeng model series performs better than the third generation in terms of characters and big head photos. If you have this need, you can try the second generation.
Version 2.0:[https://huggingface.co/xiaolxl/Gf_style2](https://huggingface.co/xiaolxl/Gf_style2)
GuoFeng3: original model
GuoFeng3.1: The portrait of GuoFeng3 has been fine-tuned and repaired
GuoFeng3.2: If you don't know whether to choose GuoFeng3 or GuoFeng2, you can use this version directly
GuoFeng3.2_Light: Through GuoFeng3.2, Lora based on Noise Offset training is integrated to enable the model to draw more beautiful light and shadow effects (Lora: epi_noiseoffset/Theovercolor8's Contrast Fix)
GuoFeng3.2_Lora: Guofeng3.2 Lora version
GuoFeng3.2_Lora_big_Light: Guofeng3.2_Light Lora Version Dimension Increase Version
GuoFeng3.2_F16: Guofeng3.2 semi-refined version
GuoFeng3.2_light_f16: Guofeng3.2_Light semi-refined version
GuoFeng3.3: This version is a major update and improvement based on 3.2, which can adapt to full bodies. Even if your tag is not good, the model will automatically modify the screen, but the faces produced by the model will be quite similar. This model doesn't seem to require supersession. My plot size is 768 * 1024, and the clarity is quite good. Suggest vertical view, horizontal view may not be clear. Euler a is sufficient. (DPM++SDE Karras, DDIM is also good)
GuoFeng3.4: This version has undergone new training to adapt to the full body image, and the content is significantly different from previous versions.At the same time, the overall painting style has been adjusted, reducing the degree of overfitting, allowing it to use more Lora to adjust the screen and content.
# 安装教程 - install
1. 将GuoFeng3.ckpt模型放入SD目录 - Put GuoFeng3.ckpt model into SD directory
2. 此模型自带VAE,如果你的程序不支持,请记得选择任意一个VAE文件,否则图形将为灰色 - This model comes with VAE. If your program does not support it, please remember to select any VAE file, otherwise the graphics will be gray
# 如何使用 - How to use
**TIP:经过一天的测试,发现很多人物可能出现红眼问题,可以尝试在负面词添加red eyes。如果色彩艳丽可以尝试降低CFG - After a day of testing, we found that many characters may have red-eye problems. We can try to add red eyes to negative words。Try to reduce CFG if the color is bright**
简单:第三代大幅度减少上手难度 - Simple: the third generation greatly reduces the difficulty of getting started
======
如果你的出图全身图时出现脸部崩坏建议删除full body关键词或者使用脸部自动修复插件:
国外源地址:https://github.com/ototadana/sd-face-editor.git
国内加速地址:https://jihulab.com/xiaolxl_pub/sd-face-editor.git
-
If you experience facial collapse during the full body image, it is recommended to delete the full body keyword or use the facial automatic repair plugin:
Foreign source address: https://github.com/ototadana/sd-face-editor.git
Domestic acceleration address: https://jihulab.com/xiaolxl_pub/sd-face-editor.git
=====
- **关键词 - key word:**
```
best quality, masterpiece, highres, 1girl,china dress,Beautiful face
```
- **负面词 - Negative words:**
```
NSFW, lowres,bad anatomy,bad hands, text, error, missing fingers,extra digit, fewer digits, cropped, worstquality, low quality, normal quality,jpegartifacts,signature, watermark, username,blurry,bad feet
```
---
高级:如果您还想使图片尽可能更好,请尝试以下配置 - senior:If you also want to make the picture as better as possible, please try the following configuration
- Sampling steps:**50**
- Sampler:**DPM++ SDE Karras or DDIM**
- The size of the picture should be at least **1024** - 图片大小至少1024
- CFG:**4-6**
- **更好的负面词 Better negative words - 感谢群友提供的负面词:**
```
(((simple background))),monochrome ,lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, lowres, bad anatomy, bad hands, text, error, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, ugly,pregnant,vore,duplicate,morbid,mut ilated,tran nsexual, hermaphrodite,long neck,mutated hands,poorly drawn hands,poorly drawn face,mutation,deformed,blurry,bad anatomy,bad proportions,malformed limbs,extra limbs,cloned face,disfigured,gross proportions, (((missing arms))),(((missing legs))), (((extra arms))),(((extra legs))),pubic hair, plump,bad legs,error legs,username,blurry,bad feet
```
- **如果想元素更丰富,可以添加下方关键词 - If you want to enrich the elements, you can add the following keywords**
```
Beautiful face,
hair ornament, solo,looking at viewer,smile,closed mouth,lips
china dress,dress,hair ornament, necklace, jewelry, long hair, earrings, chinese clothes,
architecture,east asian architecture,building,outdoors,rooftop,city,cityscape
```
# 例图 - Examples
(可在文件列表中找到原图,并放入WebUi查看关键词等信息) - (You can find the original image in the file list, and put WebUi to view keywords and other information)
<img src=https://huggingface.co/xiaolxl/GuoFeng3/resolve/main/examples/e1.png>
<img src=https://huggingface.co/xiaolxl/GuoFeng3/resolve/main/examples/e2.png>
<img src=https://huggingface.co/xiaolxl/GuoFeng3/resolve/main/examples/e3.png>
<img src=https://huggingface.co/xiaolxl/GuoFeng3/resolve/main/examples/e4.png> | 8,063 | [
[
-0.05517578125,
-0.036163330078125,
0.0223236083984375,
0.036834716796875,
-0.04791259765625,
-0.013916015625,
0.0025615692138671875,
-0.0638427734375,
0.045074462890625,
0.038787841796875,
-0.0416259765625,
-0.049652099609375,
-0.036712646484375,
0.011505126953125,
0.00543212890625,
0.056732177734375,
-0.00003403425216674805,
0.0026683807373046875,
0.0218353271484375,
-0.0065765380859375,
-0.03839111328125,
0.00392913818359375,
-0.048187255859375,
-0.0213470458984375,
0.0095977783203125,
0.0197906494140625,
0.05731201171875,
0.052642822265625,
0.051788330078125,
0.0264129638671875,
-0.0121612548828125,
0.0061798095703125,
-0.02398681640625,
-0.032012939453125,
0.02020263671875,
-0.0272216796875,
-0.075439453125,
0.004665374755859375,
0.038543701171875,
0.00897216796875,
0.01218414306640625,
-0.0002105236053466797,
0.0105133056640625,
0.06231689453125,
-0.01021575927734375,
0.008514404296875,
-0.003925323486328125,
0.00504302978515625,
-0.01947021484375,
-0.0167236328125,
-0.002193450927734375,
-0.0322265625,
-0.0095367431640625,
-0.0618896484375,
0.002674102783203125,
-0.004032135009765625,
0.1168212890625,
0.00431060791015625,
-0.033233642578125,
0.021087646484375,
-0.0352783203125,
0.0540771484375,
-0.05682373046875,
0.0136566162109375,
0.0172882080078125,
0.0263519287109375,
-0.0032958984375,
-0.054443359375,
-0.045074462890625,
-0.005611419677734375,
-0.023468017578125,
0.040863037109375,
-0.00968170166015625,
-0.009368896484375,
0.0282440185546875,
0.0257110595703125,
-0.0318603515625,
-0.0101470947265625,
-0.03839111328125,
-0.009796142578125,
0.0712890625,
-0.0033664703369140625,
0.05859375,
-0.0208740234375,
-0.044952392578125,
-0.0161285400390625,
-0.061065673828125,
0.007965087890625,
0.0137481689453125,
-0.01317596435546875,
-0.060028076171875,
0.023162841796875,
-0.00952911376953125,
0.0440673828125,
0.01157379150390625,
-0.030059814453125,
0.043487548828125,
-0.032623291015625,
-0.0117950439453125,
-0.0303955078125,
0.06915283203125,
0.07012939453125,
0.003940582275390625,
0.0264739990234375,
-0.01708984375,
-0.021759033203125,
-0.01139068603515625,
-0.06707763671875,
-0.00829315185546875,
0.032745361328125,
-0.05914306640625,
-0.0237884521484375,
0.018310546875,
-0.06256103515625,
0.0027446746826171875,
-0.0027675628662109375,
0.03143310546875,
-0.052825927734375,
-0.037139892578125,
0.00015437602996826172,
-0.02032470703125,
0.03192138671875,
0.0504150390625,
-0.04864501953125,
-0.0014429092407226562,
0.01499176025390625,
0.0545654296875,
0.01084136962890625,
-0.01541900634765625,
-0.00543975830078125,
0.0149688720703125,
-0.032562255859375,
0.047332763671875,
0.0022716522216796875,
-0.0299835205078125,
-0.0201873779296875,
0.0204620361328125,
0.016204833984375,
-0.028717041015625,
0.047271728515625,
-0.01837158203125,
0.0157623291015625,
-0.0222930908203125,
-0.0019254684448242188,
-0.02142333984375,
0.006038665771484375,
-0.047119140625,
0.05865478515625,
0.0279998779296875,
-0.0767822265625,
0.01183319091796875,
-0.0386962890625,
-0.0114593505859375,
0.01080322265625,
-0.0085296630859375,
-0.045562744140625,
-0.0177764892578125,
0.01428985595703125,
0.03094482421875,
-0.0133514404296875,
-0.0212860107421875,
-0.0175933837890625,
-0.041656494140625,
0.01280975341796875,
0.0133819580078125,
0.0931396484375,
0.031494140625,
-0.042694091796875,
0.005619049072265625,
-0.0477294921875,
0.00902557373046875,
0.048919677734375,
-0.01242828369140625,
-0.0186767578125,
-0.005138397216796875,
0.01049041748046875,
0.03436279296875,
0.031280517578125,
-0.0340576171875,
0.0165252685546875,
-0.0309295654296875,
0.0361328125,
0.054840087890625,
-0.001003265380859375,
0.016815185546875,
-0.05780029296875,
0.037322998046875,
0.006458282470703125,
0.0205078125,
-0.0204010009765625,
-0.0374755859375,
-0.08148193359375,
-0.0291290283203125,
0.016357421875,
0.0288848876953125,
-0.0762939453125,
0.044525146484375,
-0.0020999908447265625,
-0.05181884765625,
-0.034820556640625,
-0.012359619140625,
0.0380859375,
0.005764007568359375,
0.02783203125,
-0.02655029296875,
-0.040283203125,
-0.06939697265625,
0.008148193359375,
-0.0181427001953125,
-0.01468658447265625,
0.042449951171875,
0.038970947265625,
-0.01641845703125,
0.0406494140625,
-0.045135498046875,
-0.044403076171875,
-0.0252227783203125,
-0.00850677490234375,
0.0150146484375,
0.045440673828125,
0.07720947265625,
-0.06512451171875,
-0.0579833984375,
0.0056610107421875,
-0.063720703125,
-0.00457000732421875,
-0.0080108642578125,
-0.022308349609375,
0.0264129638671875,
0.01091766357421875,
-0.046234130859375,
0.0279083251953125,
0.0307159423828125,
-0.037628173828125,
0.059478759765625,
-0.0188140869140625,
0.0230255126953125,
-0.082275390625,
0.012939453125,
0.00864410400390625,
-0.0105743408203125,
-0.040802001953125,
0.05035400390625,
0.007205963134765625,
0.0162811279296875,
-0.041656494140625,
0.054351806640625,
-0.05126953125,
0.0127410888671875,
-0.01035308837890625,
0.0192413330078125,
0.01021575927734375,
0.0394287109375,
0.0149383544921875,
0.0377197265625,
0.041656494140625,
-0.031341552734375,
0.02984619140625,
0.031280517578125,
-0.0268402099609375,
0.036376953125,
-0.0550537109375,
0.01132965087890625,
-0.01091766357421875,
0.0233001708984375,
-0.060272216796875,
-0.0306243896484375,
0.043426513671875,
-0.05352783203125,
0.028411865234375,
0.00555419921875,
-0.00855255126953125,
-0.046173095703125,
-0.0574951171875,
-0.0014591217041015625,
0.04669189453125,
-0.037017822265625,
0.034210205078125,
0.01739501953125,
0.00428009033203125,
-0.0394287109375,
-0.054046630859375,
-0.0023136138916015625,
-0.0210723876953125,
-0.074951171875,
0.059173583984375,
-0.003711700439453125,
-0.01318359375,
-0.0084381103515625,
0.0181732177734375,
-0.0060272216796875,
-0.0053253173828125,
0.004886627197265625,
0.048187255859375,
-0.02313232421875,
-0.029205322265625,
-0.01399993896484375,
-0.0055084228515625,
-0.0108795166015625,
0.024505615234375,
0.02911376953125,
-0.0147705078125,
-0.00453948974609375,
-0.05670166015625,
0.02093505859375,
0.04449462890625,
-0.01474761962890625,
0.0250396728515625,
0.056365966796875,
-0.01922607421875,
0.004550933837890625,
-0.035186767578125,
-0.01275634765625,
-0.039215087890625,
0.0178070068359375,
-0.0299835205078125,
-0.04766845703125,
0.0377197265625,
0.0180816650390625,
0.006755828857421875,
0.05230712890625,
0.02117919921875,
-0.0178680419921875,
0.0950927734375,
0.052947998046875,
-0.0025768280029296875,
0.03826904296875,
-0.054229736328125,
-0.0102996826171875,
-0.06817626953125,
-0.02435302734375,
-0.0251007080078125,
-0.03363037109375,
-0.047088623046875,
-0.033538818359375,
0.029571533203125,
0.01197052001953125,
-0.02880859375,
0.039764404296875,
-0.05377197265625,
0.008544921875,
0.0288238525390625,
0.0271759033203125,
0.0182342529296875,
-0.01548004150390625,
0.007808685302734375,
-0.01302337646484375,
-0.0174713134765625,
-0.0281524658203125,
0.04248046875,
0.033782958984375,
0.0458984375,
0.021087646484375,
0.04937744140625,
-0.0143280029296875,
0.007305145263671875,
-0.041046142578125,
0.0491943359375,
-0.00788116455078125,
-0.04833984375,
-0.01140594482421875,
-0.02288818359375,
-0.054718017578125,
0.0255126953125,
-0.03643798828125,
-0.053863525390625,
0.0167694091796875,
0.005214691162109375,
-0.031402587890625,
0.01763916015625,
-0.040985107421875,
0.041656494140625,
-0.0276031494140625,
-0.033111572265625,
0.004730224609375,
-0.062347412109375,
0.04449462890625,
0.007228851318359375,
0.0157623291015625,
-0.01212310791015625,
-0.003570556640625,
0.04931640625,
-0.04754638671875,
0.03826904296875,
-0.02178955078125,
-0.0022220611572265625,
0.0289154052734375,
-0.004032135009765625,
0.05084228515625,
0.00594329833984375,
0.01018524169921875,
0.02056884765625,
0.004791259765625,
-0.01708984375,
-0.031768798828125,
0.04534912109375,
-0.07330322265625,
-0.052947998046875,
-0.043426513671875,
-0.00021827220916748047,
0.0173797607421875,
0.0287628173828125,
0.053863525390625,
0.009857177734375,
-0.004581451416015625,
0.01110076904296875,
0.0218353271484375,
-0.037109375,
0.03131103515625,
0.0401611328125,
-0.032562255859375,
-0.044281005859375,
0.0711669921875,
0.01383209228515625,
0.0230560302734375,
0.00830078125,
0.0105438232421875,
0.00669097900390625,
-0.024658203125,
-0.0443115234375,
0.038299560546875,
-0.0296173095703125,
-0.015838623046875,
-0.033050537109375,
-0.0361328125,
-0.035186767578125,
-0.02655029296875,
-0.017669677734375,
-0.0196075439453125,
-0.0292205810546875,
0.018096923828125,
0.035491943359375,
0.033782958984375,
-0.007381439208984375,
0.0167694091796875,
-0.05047607421875,
0.0367431640625,
0.01067352294921875,
0.0197906494140625,
-0.0030193328857421875,
-0.03204345703125,
-0.01244354248046875,
0.01114654541015625,
-0.045013427734375,
-0.05865478515625,
0.052703857421875,
0.0036106109619140625,
0.0188751220703125,
0.044891357421875,
-0.00225830078125,
0.048309326171875,
-0.02459716796875,
0.0643310546875,
0.031524658203125,
-0.054931640625,
0.035980224609375,
-0.056732177734375,
0.017547607421875,
0.0205230712890625,
0.0207672119140625,
-0.046234130859375,
-0.0276641845703125,
-0.05059814453125,
-0.047210693359375,
0.083740234375,
0.0218353271484375,
0.00988006591796875,
0.0230255126953125,
0.0173797607421875,
-0.0216217041015625,
-0.0025730133056640625,
-0.07275390625,
-0.048614501953125,
-0.03778076171875,
0.01215362548828125,
0.0186309814453125,
-0.01021575927734375,
-0.003185272216796875,
-0.039520263671875,
0.07293701171875,
-0.00846099853515625,
0.06005859375,
0.0268096923828125,
0.02667236328125,
-0.0162811279296875,
-0.0024929046630859375,
0.048492431640625,
0.03509521484375,
-0.0308074951171875,
-0.01490020751953125,
0.01023101806640625,
-0.044036865234375,
0.0023784637451171875,
-0.0009927749633789062,
-0.03656005859375,
0.009735107421875,
0.0244598388671875,
0.0574951171875,
-0.00650787353515625,
-0.0350341796875,
0.06427001953125,
-0.010498046875,
-0.02008056640625,
-0.03631591796875,
0.01540374755859375,
0.0178375244140625,
0.01558685302734375,
0.0285186767578125,
0.0104522705078125,
0.00753021240234375,
-0.0394287109375,
0.004169464111328125,
0.0312347412109375,
-0.02545166015625,
-0.0218963623046875,
0.06573486328125,
0.0215911865234375,
-0.0090789794921875,
0.02142333984375,
-0.018585205078125,
-0.044830322265625,
0.07470703125,
0.0516357421875,
0.04656982421875,
-0.0291290283203125,
0.0288238525390625,
0.0634765625,
0.0177001953125,
-0.0025959014892578125,
0.045257568359375,
0.0214996337890625,
-0.043975830078125,
-0.0177459716796875,
-0.035186767578125,
-0.0080108642578125,
0.0265655517578125,
-0.0247802734375,
0.06182861328125,
-0.05291748046875,
-0.00901031494140625,
0.0017566680908203125,
0.0151824951171875,
-0.0235443115234375,
0.034149169921875,
0.00980377197265625,
0.07501220703125,
-0.041748046875,
0.07855224609375,
0.037261962890625,
-0.053558349609375,
-0.06793212890625,
0.0025310516357421875,
0.0218505859375,
-0.061248779296875,
0.050537109375,
0.0211029052734375,
-0.0040740966796875,
-0.005702972412109375,
-0.0589599609375,
-0.0731201171875,
0.09808349609375,
0.0267333984375,
-0.0205535888671875,
-0.0014429092407226562,
-0.0199432373046875,
0.035430908203125,
-0.032928466796875,
0.032562255859375,
0.02593994140625,
0.047088623046875,
0.025390625,
-0.06005859375,
0.0248260498046875,
-0.043975830078125,
0.006862640380859375,
-0.001628875732421875,
-0.10394287109375,
0.0849609375,
-0.0302581787109375,
-0.0289764404296875,
0.026153564453125,
0.054534912109375,
0.02471923828125,
0.0063934326171875,
0.038238525390625,
0.0296173095703125,
0.017303466796875,
-0.02294921875,
0.07122802734375,
-0.019195556640625,
0.0240936279296875,
0.049530029296875,
0.0013294219970703125,
0.04095458984375,
-0.005352020263671875,
-0.033477783203125,
0.0452880859375,
0.0570068359375,
-0.02667236328125,
0.035400390625,
-0.01107025146484375,
-0.0198822021484375,
-0.0018072128295898438,
-0.006923675537109375,
-0.0584716796875,
0.040740966796875,
0.01012420654296875,
-0.033966064453125,
0.0081634521484375,
-0.004486083984375,
0.017669677734375,
-0.0105743408203125,
-0.007015228271484375,
0.05535888671875,
0.0094146728515625,
-0.040557861328125,
0.066162109375,
0.0248260498046875,
0.07135009765625,
-0.050994873046875,
-0.01953125,
-0.02386474609375,
0.0004525184631347656,
-0.03094482421875,
-0.06085205078125,
0.0081787109375,
-0.018402099609375,
0.016204833984375,
0.0005145072937011719,
0.05499267578125,
-0.019561767578125,
-0.0312347412109375,
0.03643798828125,
0.00579833984375,
0.03131103515625,
0.0168304443359375,
-0.06756591796875,
0.0199432373046875,
0.034027099609375,
-0.028594970703125,
0.01727294921875,
0.0199737548828125,
0.01479339599609375,
0.038909912109375,
0.0228118896484375,
0.0227508544921875,
-0.033477783203125,
-0.0236053466796875,
0.0697021484375,
-0.057373046875,
-0.031494140625,
-0.04931640625,
0.046051025390625,
-0.025421142578125,
-0.0304718017578125,
0.061248779296875,
0.0230712890625,
0.06201171875,
-0.0149688720703125,
0.06463623046875,
-0.0377197265625,
0.03118896484375,
-0.041259765625,
0.0794677734375,
-0.0748291015625,
-0.0013532638549804688,
-0.053436279296875,
-0.0584716796875,
-0.0236358642578125,
0.0660400390625,
0.00324249267578125,
0.023468017578125,
0.038116455078125,
0.06304931640625,
0.0194854736328125,
-0.00856781005859375,
0.0157928466796875,
0.024688720703125,
0.011016845703125,
0.063720703125,
0.042877197265625,
-0.058349609375,
0.024566650390625,
-0.06011962890625,
-0.0244598388671875,
-0.033935546875,
-0.052703857421875,
-0.057830810546875,
-0.049560546875,
-0.02587890625,
-0.035186767578125,
-0.017913818359375,
0.060455322265625,
0.061279296875,
-0.04180908203125,
-0.008941650390625,
0.028411865234375,
0.01294708251953125,
-0.03387451171875,
-0.019195556640625,
0.041748046875,
0.0239410400390625,
-0.07794189453125,
0.0026149749755859375,
0.0157012939453125,
0.037506103515625,
-0.01056671142578125,
-0.0231475830078125,
-0.0079803466796875,
-0.01096343994140625,
0.037506103515625,
0.0291595458984375,
-0.05712890625,
-0.0288848876953125,
0.00275421142578125,
-0.0173797607421875,
0.0243682861328125,
0.01556396484375,
-0.0311279296875,
0.024505615234375,
0.039947509765625,
-0.01220703125,
0.036712646484375,
0.0008606910705566406,
0.0243988037109375,
-0.031646728515625,
0.0183258056640625,
-0.007617950439453125,
0.036163330078125,
0.01424407958984375,
-0.0400390625,
0.05841064453125,
0.02874755859375,
-0.04180908203125,
-0.044464111328125,
-0.0027923583984375,
-0.0938720703125,
-0.03369140625,
0.07965087890625,
-0.005245208740234375,
-0.02593994140625,
0.005397796630859375,
-0.054168701171875,
0.0179443359375,
-0.0203094482421875,
0.04803466796875,
0.04119873046875,
-0.00775146484375,
-0.01546478271484375,
-0.052520751953125,
0.046112060546875,
0.0119171142578125,
-0.051788330078125,
-0.01399993896484375,
0.0447998046875,
0.01554107666015625,
0.037017822265625,
0.06396484375,
-0.0277557373046875,
0.04010009765625,
-0.0034275054931640625,
0.038787841796875,
-0.006862640380859375,
-0.001068115234375,
-0.0165252685546875,
-0.01415252685546875,
-0.00264739990234375,
-0.006771087646484375
]
] |
bookbot/wav2vec2-ljspeech-gruut | 2023-04-05T02:27:23.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"phoneme-recognition",
"generated_from_trainer",
"en",
"dataset:w11wo/ljspeech_phonemes",
"arxiv:2006.11477",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | bookbot | null | null | bookbot/wav2vec2-ljspeech-gruut | 5 | 15,357 | transformers | 2023-01-09T01:22:52 | ---
language: en
license: apache-2.0
tags:
- phoneme-recognition
- generated_from_trainer
datasets:
- w11wo/ljspeech_phonemes
model-index:
- name: Wav2Vec2 LJSpeech Gruut
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LJSpeech
type: ljspeech_phonemes
metrics:
- name: Test PER (w/o stress)
type: per
value: 0.0099
- name: Test CER (w/o stress)
type: cer
value: 0.0058
---
# Wav2Vec2 LJSpeech Gruut
Wav2Vec2 LJSpeech Gruut is an automatic speech recognition model based on the [wav2vec 2.0](https://arxiv.org/abs/2006.11477) architecture. This model is a fine-tuned version of [Wav2Vec2-Base](https://huggingface.co/facebook/wav2vec2-base) on the [LJSpech Phonemes](https://huggingface.co/datasets/w11wo/ljspeech_phonemes) dataset.
Instead of being trained to predict sequences of words, this model was trained to predict sequence of phonemes, e.g. `["h", "ɛ", "l", "ˈoʊ", "w", "ˈɚ", "l", "d"]`. Therefore, the model's [vocabulary](https://huggingface.co/bookbot/wav2vec2-ljspeech-gruut/blob/main/vocab.json) contains the different IPA phonemes found in [gruut](https://github.com/rhasspy/gruut).
This model was trained using HuggingFace's PyTorch framework. All training was done on a Google Cloud Engine VM with a Tesla A100 GPU. All necessary scripts used for training could be found in the [Files and versions](https://huggingface.co/bookbot/wav2vec2-ljspeech-gruut/tree/main) tab, as well as the [Training metrics](https://huggingface.co/bookbot/wav2vec2-ljspeech-gruut/tensorboard) logged via Tensorboard.
## Model
| Model | #params | Arch. | Training/Validation data (text) |
| ------------------------- | ------- | ----------- | ------------------------------- |
| `wav2vec2-ljspeech-gruut` | 94M | wav2vec 2.0 | `LJSpech Phonemes` Dataset |
## Evaluation Results
The model achieves the following results on evaluation:
| Dataset | PER (w/o stress) | CER (w/o stress) |
| ---------------------------- | :--------------: | :--------------: |
| `LJSpech Phonemes` Test Data | 0.99% | 0.58% |
## Usage
```py
from transformers import AutoProcessor, AutoModelForCTC, Wav2Vec2Processor
import librosa
import torch
from itertools import groupby
from datasets import load_dataset
def decode_phonemes(
ids: torch.Tensor, processor: Wav2Vec2Processor, ignore_stress: bool = False
) -> str:
"""CTC-like decoding. First removes consecutive duplicates, then removes special tokens."""
# removes consecutive duplicates
ids = [id_ for id_, _ in groupby(ids)]
special_token_ids = processor.tokenizer.all_special_ids + [
processor.tokenizer.word_delimiter_token_id
]
# converts id to token, skipping special tokens
phonemes = [processor.decode(id_) for id_ in ids if id_ not in special_token_ids]
# joins phonemes
prediction = " ".join(phonemes)
# whether to ignore IPA stress marks
if ignore_stress == True:
prediction = prediction.replace("ˈ", "").replace("ˌ", "")
return prediction
checkpoint = "bookbot/wav2vec2-ljspeech-gruut"
model = AutoModelForCTC.from_pretrained(checkpoint)
processor = AutoProcessor.from_pretrained(checkpoint)
sr = processor.feature_extractor.sampling_rate
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
audio_array = ds[0]["audio"]["array"]
# or, read a single audio file
# audio_array, _ = librosa.load("myaudio.wav", sr=sr)
inputs = processor(audio_array, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs["input_values"]).logits
predicted_ids = torch.argmax(logits, dim=-1)
prediction = decode_phonemes(predicted_ids[0], processor, ignore_stress=True)
# => should give 'b ɪ k ʌ z j u ɚ z s l i p ɪ ŋ ɪ n s t ɛ d ə v k ɔ ŋ k ɚ ɪ ŋ ð ə l ʌ v l i ɹ z p ɹ ɪ n s ə s h æ z b ɪ k ʌ m ə v f ɪ t ə l w ɪ θ n b oʊ p ɹ ə ʃ æ ɡ i s ɪ t s ð ɛ ɹ ə k u ɪ ŋ d ʌ v'
```
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- `learning_rate`: 0.0001
- `train_batch_size`: 16
- `eval_batch_size`: 8
- `seed`: 42
- `gradient_accumulation_steps`: 2
- `total_train_batch_size`: 32
- `optimizer`: Adam with `betas=(0.9,0.999)` and `epsilon=1e-08`
- `lr_scheduler_type`: linear
- `lr_scheduler_warmup_steps`: 1000
- `num_epochs`: 30.0
- `mixed_precision_training`: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
| :-----------: | :---: | :---: | :-------------: | :----: | :----: |
| No log | 1.0 | 348 | 2.2818 | 1.0 | 1.0 |
| 2.6692 | 2.0 | 696 | 0.2045 | 0.0527 | 0.0299 |
| 0.2225 | 3.0 | 1044 | 0.1162 | 0.0319 | 0.0189 |
| 0.2225 | 4.0 | 1392 | 0.0927 | 0.0235 | 0.0147 |
| 0.0868 | 5.0 | 1740 | 0.0797 | 0.0218 | 0.0143 |
| 0.0598 | 6.0 | 2088 | 0.0715 | 0.0197 | 0.0128 |
| 0.0598 | 7.0 | 2436 | 0.0652 | 0.0160 | 0.0103 |
| 0.0447 | 8.0 | 2784 | 0.0571 | 0.0152 | 0.0095 |
| 0.0368 | 9.0 | 3132 | 0.0608 | 0.0163 | 0.0112 |
| 0.0368 | 10.0 | 3480 | 0.0586 | 0.0137 | 0.0083 |
| 0.0303 | 11.0 | 3828 | 0.0641 | 0.0141 | 0.0085 |
| 0.0273 | 12.0 | 4176 | 0.0656 | 0.0131 | 0.0079 |
| 0.0232 | 13.0 | 4524 | 0.0690 | 0.0133 | 0.0082 |
| 0.0232 | 14.0 | 4872 | 0.0598 | 0.0128 | 0.0079 |
| 0.0189 | 15.0 | 5220 | 0.0671 | 0.0121 | 0.0074 |
| 0.017 | 16.0 | 5568 | 0.0654 | 0.0114 | 0.0069 |
| 0.017 | 17.0 | 5916 | 0.0751 | 0.0118 | 0.0073 |
| 0.0146 | 18.0 | 6264 | 0.0653 | 0.0112 | 0.0068 |
| 0.0127 | 19.0 | 6612 | 0.0682 | 0.0112 | 0.0069 |
| 0.0127 | 20.0 | 6960 | 0.0678 | 0.0114 | 0.0068 |
| 0.0114 | 21.0 | 7308 | 0.0656 | 0.0111 | 0.0066 |
| 0.0101 | 22.0 | 7656 | 0.0669 | 0.0109 | 0.0066 |
| 0.0092 | 23.0 | 8004 | 0.0677 | 0.0108 | 0.0065 |
| 0.0092 | 24.0 | 8352 | 0.0653 | 0.0104 | 0.0063 |
| 0.0088 | 25.0 | 8700 | 0.0673 | 0.0102 | 0.0063 |
| 0.0074 | 26.0 | 9048 | 0.0669 | 0.0105 | 0.0064 |
| 0.0074 | 27.0 | 9396 | 0.0707 | 0.0101 | 0.0061 |
| 0.0066 | 28.0 | 9744 | 0.0673 | 0.0100 | 0.0060 |
| 0.0058 | 29.0 | 10092 | 0.0689 | 0.0100 | 0.0059 |
| 0.0058 | 30.0 | 10440 | 0.0683 | 0.0099 | 0.0058 |
## Disclaimer
Do consider the biases which came from pre-training datasets that may be carried over into the results of this model.
## Authors
Wav2Vec2 LJSpeech Gruut was trained and evaluated by [Wilson Wongso](https://w11wo.github.io/). All computation and development are done on Google Cloud.
## Framework versions
- Transformers 4.26.0.dev0
- Pytorch 1.10.0
- Datasets 2.7.1
- Tokenizers 0.13.2
- Gruut 2.3.4 | 7,299 | [
[
-0.036285400390625,
-0.04852294921875,
0.0072479248046875,
-0.0005726814270019531,
-0.00833892822265625,
-0.0033092498779296875,
-0.0132598876953125,
-0.02490234375,
0.0306396484375,
0.017974853515625,
-0.04315185546875,
-0.052978515625,
-0.0491943359375,
-0.023895263671875,
-0.0012617111206054688,
0.0660400390625,
0.0094146728515625,
0.006420135498046875,
-0.00010085105895996094,
-0.01268768310546875,
-0.026947021484375,
-0.01389312744140625,
-0.047698974609375,
-0.0262908935546875,
0.01043701171875,
0.0250396728515625,
0.04052734375,
0.041900634765625,
0.0277862548828125,
0.0219879150390625,
-0.0262451171875,
-0.003875732421875,
-0.02288818359375,
-0.016998291015625,
0.022796630859375,
-0.0379638671875,
-0.02490234375,
-0.0004987716674804688,
0.031646728515625,
0.038055419921875,
-0.01702880859375,
0.037109375,
0.00955963134765625,
0.054840087890625,
-0.0219879150390625,
0.01548004150390625,
-0.031982421875,
0.0033092498779296875,
-0.007305145263671875,
-0.002704620361328125,
0.00022745132446289062,
-0.026947021484375,
0.0021762847900390625,
-0.041900634765625,
0.0250396728515625,
-0.0140838623046875,
0.08245849609375,
0.01702880859375,
-0.0111541748046875,
-0.0223541259765625,
-0.030548095703125,
0.059112548828125,
-0.053131103515625,
0.038726806640625,
0.04583740234375,
0.001117706298828125,
-0.019012451171875,
-0.058868408203125,
-0.05108642578125,
0.01523590087890625,
-0.00958251953125,
0.032806396484375,
-0.01180267333984375,
-0.0156707763671875,
0.0267486572265625,
0.0287933349609375,
-0.052490234375,
-0.00047707557678222656,
-0.044281005859375,
-0.02374267578125,
0.049346923828125,
-0.002777099609375,
0.01873779296875,
-0.0312347412109375,
-0.036102294921875,
-0.0213775634765625,
-0.01338958740234375,
0.05780029296875,
0.0411376953125,
0.01499176025390625,
-0.037384033203125,
0.0289459228515625,
-0.0029659271240234375,
0.043487548828125,
0.006771087646484375,
-0.033782958984375,
0.06060791015625,
-0.0161590576171875,
-0.01861572265625,
0.007457733154296875,
0.07366943359375,
0.03399658203125,
0.003139495849609375,
0.007633209228515625,
-0.006011962890625,
0.0024662017822265625,
-0.01141357421875,
-0.05584716796875,
-0.037750244140625,
0.043975830078125,
-0.03607177734375,
-0.02032470703125,
-0.0095367431640625,
-0.054168701171875,
0.00009065866470336914,
-0.0186920166015625,
0.0294952392578125,
-0.0304107666015625,
-0.035797119140625,
0.01332855224609375,
-0.01861572265625,
0.0159454345703125,
0.01332855224609375,
-0.07635498046875,
0.01534271240234375,
0.0274505615234375,
0.06829833984375,
0.01239013671875,
-0.01139068603515625,
-0.00914764404296875,
-0.0010862350463867188,
-0.0291748046875,
0.045928955078125,
-0.00461578369140625,
-0.04632568359375,
-0.0131988525390625,
0.0206146240234375,
-0.018280029296875,
-0.032867431640625,
0.050506591796875,
-0.01256561279296875,
0.03533935546875,
-0.0270538330078125,
-0.02789306640625,
-0.01122283935546875,
0.006679534912109375,
-0.035675048828125,
0.10736083984375,
0.0145263671875,
-0.06536865234375,
0.031646728515625,
-0.03741455078125,
-0.0174407958984375,
-0.01415252685546875,
-0.0190277099609375,
-0.05682373046875,
-0.01959228515625,
0.0294189453125,
0.03204345703125,
-0.0299835205078125,
0.00406646728515625,
-0.0022335052490234375,
-0.0311279296875,
-0.0010166168212890625,
-0.0122833251953125,
0.0833740234375,
0.01776123046875,
-0.04815673828125,
-0.006900787353515625,
-0.06591796875,
0.023834228515625,
0.023284912109375,
-0.039703369140625,
-0.003414154052734375,
-0.0278778076171875,
0.0020275115966796875,
0.0210723876953125,
0.00969696044921875,
-0.040069580078125,
0.0146942138671875,
-0.0309600830078125,
0.036163330078125,
0.046630859375,
0.001312255859375,
0.0267791748046875,
-0.03192138671875,
0.0211181640625,
0.004581451416015625,
0.0086822509765625,
-0.005706787109375,
-0.050506591796875,
-0.05780029296875,
-0.039794921875,
0.00958251953125,
0.045654296875,
-0.01849365234375,
0.052947998046875,
-0.00983428955078125,
-0.0517578125,
-0.059783935546875,
0.002124786376953125,
0.02593994140625,
0.053863525390625,
0.0372314453125,
-0.010894775390625,
-0.043365478515625,
-0.068359375,
-0.01377105712890625,
-0.0145721435546875,
0.0028972625732421875,
0.03216552734375,
0.041778564453125,
-0.0166473388671875,
0.06854248046875,
-0.0309600830078125,
-0.044281005859375,
-0.01009368896484375,
-0.00322723388671875,
0.052764892578125,
0.048370361328125,
0.041778564453125,
-0.048583984375,
-0.032379150390625,
-0.0015115737915039062,
-0.0560302734375,
0.01132965087890625,
-0.006526947021484375,
-0.0003654956817626953,
0.00928497314453125,
0.026702880859375,
-0.05084228515625,
0.042327880859375,
0.03955078125,
-0.0251007080078125,
0.04559326171875,
-0.00928497314453125,
0.0223846435546875,
-0.080078125,
0.011871337890625,
0.0013532638549804688,
-0.0029010772705078125,
-0.037200927734375,
-0.023529052734375,
-0.00225067138671875,
-0.004547119140625,
-0.031646728515625,
0.04296875,
-0.040618896484375,
-0.005046844482421875,
0.005725860595703125,
0.0021800994873046875,
0.00007468461990356445,
0.049346923828125,
0.0009927749633789062,
0.0718994140625,
0.05682373046875,
-0.03997802734375,
0.0180206298828125,
0.02642822265625,
-0.0516357421875,
0.0185546875,
-0.055633544921875,
0.0158538818359375,
0.0018434524536132812,
0.01898193359375,
-0.0892333984375,
-0.01224517822265625,
0.0224761962890625,
-0.06292724609375,
0.0209197998046875,
-0.030303955078125,
-0.0210723876953125,
-0.0611572265625,
-0.019866943359375,
0.01139068603515625,
0.04815673828125,
-0.04132080078125,
0.04339599609375,
0.0283050537109375,
0.005657196044921875,
-0.054840087890625,
-0.071044921875,
-0.01415252685546875,
-0.0098419189453125,
-0.05584716796875,
0.0171661376953125,
-0.0003724098205566406,
-0.007144927978515625,
0.0004229545593261719,
-0.016754150390625,
0.000972747802734375,
0.0028285980224609375,
0.033111572265625,
0.0174102783203125,
-0.01087188720703125,
-0.001361846923828125,
-0.0137939453125,
-0.01387786865234375,
-0.0009641647338867188,
-0.0242919921875,
0.053619384765625,
-0.0212249755859375,
-0.019439697265625,
-0.060516357421875,
-0.0004813671112060547,
0.036376953125,
-0.0213775634765625,
0.053955078125,
0.07513427734375,
-0.02301025390625,
0.0094451904296875,
-0.042724609375,
-0.0094451904296875,
-0.035247802734375,
0.038177490234375,
-0.043670654296875,
-0.0552978515625,
0.055572509765625,
-0.0023040771484375,
0.0032596588134765625,
0.049102783203125,
0.045928955078125,
-0.0207061767578125,
0.059112548828125,
0.004413604736328125,
-0.0164337158203125,
0.023468017578125,
-0.0760498046875,
0.004863739013671875,
-0.05828857421875,
-0.04876708984375,
-0.0306854248046875,
-0.0288543701171875,
-0.034088134765625,
-0.03155517578125,
0.027191162109375,
0.0217742919921875,
-0.028900146484375,
0.01342010498046875,
-0.05718994140625,
0.02606201171875,
0.053314208984375,
0.018707275390625,
-0.0078125,
0.01409912109375,
-0.0205078125,
-0.00823974609375,
-0.048126220703125,
-0.027313232421875,
0.0863037109375,
0.032958984375,
0.038299560546875,
0.004154205322265625,
0.062408447265625,
0.0174713134765625,
-0.0191497802734375,
-0.056610107421875,
0.027923583984375,
0.005702972412109375,
-0.044708251953125,
-0.02587890625,
-0.016632080078125,
-0.07403564453125,
0.026947021484375,
-0.0263671875,
-0.07733154296875,
0.028472900390625,
0.00882720947265625,
-0.0274200439453125,
0.04217529296875,
-0.05157470703125,
0.057708740234375,
-0.00959014892578125,
-0.038665771484375,
-0.005401611328125,
-0.055877685546875,
0.021240234375,
0.01471710205078125,
0.0280609130859375,
-0.01474761962890625,
0.01021575927734375,
0.07684326171875,
-0.037811279296875,
0.039642333984375,
-0.0238800048828125,
-0.003444671630859375,
0.050750732421875,
-0.00457000732421875,
0.054473876953125,
0.00797271728515625,
-0.0100555419921875,
0.024383544921875,
0.004131317138671875,
-0.03826904296875,
-0.0242462158203125,
0.06787109375,
-0.0880126953125,
-0.041107177734375,
-0.027679443359375,
-0.03289794921875,
0.0006384849548339844,
0.01407623291015625,
0.045257568359375,
0.034912109375,
0.0071563720703125,
0.0101470947265625,
0.0445556640625,
-0.0081787109375,
0.045806884765625,
0.0105133056640625,
-0.004848480224609375,
-0.059661865234375,
0.05810546875,
0.00902557373046875,
0.0189208984375,
-0.00783538818359375,
0.0216522216796875,
-0.0283355712890625,
-0.028411865234375,
-0.032257080078125,
0.0144500732421875,
-0.03839111328125,
-0.00841522216796875,
-0.06378173828125,
-0.00628662109375,
-0.0572509765625,
-0.0163421630859375,
-0.042144775390625,
-0.028717041015625,
-0.0225830078125,
-0.01082611083984375,
0.046142578125,
0.037872314453125,
-0.00972747802734375,
0.0286712646484375,
-0.0450439453125,
0.0184478759765625,
0.01324462890625,
0.007747650146484375,
-0.00540924072265625,
-0.05511474609375,
-0.0230865478515625,
0.00799560546875,
-0.0091094970703125,
-0.045654296875,
0.034912109375,
0.0010957717895507812,
0.032928466796875,
0.0224456787109375,
-0.018768310546875,
0.0662841796875,
-0.01458740234375,
0.07562255859375,
0.047698974609375,
-0.057525634765625,
0.04876708984375,
-0.031097412109375,
0.021209716796875,
0.04351806640625,
0.0254974365234375,
-0.03955078125,
-0.01078033447265625,
-0.06689453125,
-0.07550048828125,
0.061920166015625,
0.0236663818359375,
-0.00036978721618652344,
0.017913818359375,
0.0225830078125,
-0.00939178466796875,
0.01428985595703125,
-0.053619384765625,
-0.05535888671875,
-0.01983642578125,
-0.006992340087890625,
-0.00534820556640625,
-0.0105438232421875,
-0.01727294921875,
-0.0458984375,
0.055389404296875,
0.01238250732421875,
0.04046630859375,
0.033447265625,
0.005840301513671875,
-0.01177978515625,
0.00412750244140625,
0.04058837890625,
0.033782958984375,
-0.030914306640625,
-0.004451751708984375,
0.01776123046875,
-0.053680419921875,
0.01335906982421875,
-0.002285003662109375,
-0.01776123046875,
0.01477813720703125,
0.034820556640625,
0.072021484375,
-0.003841400146484375,
-0.01026153564453125,
0.042236328125,
-0.0008554458618164062,
-0.04449462890625,
-0.04296875,
0.0032749176025390625,
0.017974853515625,
0.0286712646484375,
0.038909912109375,
0.0190887451171875,
-0.00511932373046875,
-0.034210205078125,
0.016082763671875,
0.0231170654296875,
-0.0330810546875,
-0.0151519775390625,
0.06732177734375,
0.00405120849609375,
-0.0259552001953125,
0.03668212890625,
-0.005481719970703125,
-0.04296875,
0.06536865234375,
0.035736083984375,
0.05279541015625,
-0.0205535888671875,
-0.0016393661499023438,
0.0728759765625,
0.0240478515625,
-0.007659912109375,
0.036865234375,
0.001007080078125,
-0.03631591796875,
0.00323486328125,
-0.046173095703125,
0.0014333724975585938,
0.034820556640625,
-0.056304931640625,
0.0330810546875,
-0.037750244140625,
-0.034759521484375,
-0.0111083984375,
0.01348876953125,
-0.06500244140625,
0.035400390625,
0.006679534912109375,
0.06329345703125,
-0.07080078125,
0.061279296875,
0.04150390625,
-0.038055419921875,
-0.0848388671875,
-0.006072998046875,
-0.0006585121154785156,
-0.07110595703125,
0.047454833984375,
0.014495849609375,
0.0044403076171875,
0.010162353515625,
-0.033538818359375,
-0.07818603515625,
0.09906005859375,
0.0035037994384765625,
-0.05059814453125,
0.01471710205078125,
0.00450897216796875,
0.026031494140625,
-0.004047393798828125,
0.0394287109375,
0.04937744140625,
0.032318115234375,
0.004486083984375,
-0.07110595703125,
-0.002605438232421875,
-0.03192138671875,
-0.016693115234375,
0.01264190673828125,
-0.06939697265625,
0.07623291015625,
-0.03997802734375,
-0.00042700767517089844,
-0.00519561767578125,
0.051483154296875,
0.0255126953125,
0.02276611328125,
0.0411376953125,
0.07135009765625,
0.07342529296875,
-0.0155029296875,
0.07659912109375,
-0.0280303955078125,
0.04718017578125,
0.07232666015625,
0.020538330078125,
0.053680419921875,
0.037750244140625,
-0.04315185546875,
0.03033447265625,
0.06292724609375,
-0.007190704345703125,
0.046173095703125,
0.006809234619140625,
-0.0240631103515625,
-0.01134490966796875,
0.0038509368896484375,
-0.0567626953125,
0.027496337890625,
0.0176544189453125,
-0.028900146484375,
0.010040283203125,
0.0016117095947265625,
0.0095672607421875,
-0.00815582275390625,
-0.0174102783203125,
0.044525146484375,
0.0008974075317382812,
-0.0224151611328125,
0.07012939453125,
-0.0167388916015625,
0.056304931640625,
-0.044952392578125,
0.0158233642578125,
-0.0056915283203125,
0.015289306640625,
-0.042022705078125,
-0.06732177734375,
-0.00846099853515625,
-0.016571044921875,
-0.0185089111328125,
0.00632476806640625,
0.0399169921875,
-0.0301513671875,
-0.034759521484375,
0.03240966796875,
0.023529052734375,
0.0208892822265625,
0.0193328857421875,
-0.06817626953125,
-0.0025005340576171875,
0.0256500244140625,
-0.051116943359375,
0.01511383056640625,
0.0229644775390625,
0.01800537109375,
0.040557861328125,
0.0660400390625,
0.01442718505859375,
0.01293182373046875,
0.004932403564453125,
0.07012939453125,
-0.048370361328125,
-0.043365478515625,
-0.05023193359375,
0.04150390625,
-0.002559661865234375,
-0.04010009765625,
0.054443359375,
0.06573486328125,
0.052490234375,
-0.00244140625,
0.06689453125,
-0.0158843994140625,
0.039825439453125,
-0.0418701171875,
0.04803466796875,
-0.05615234375,
0.0048370361328125,
-0.01331329345703125,
-0.0517578125,
-0.01491546630859375,
0.061279296875,
-0.026519775390625,
0.00740814208984375,
0.034637451171875,
0.07489013671875,
0.004642486572265625,
-0.002689361572265625,
0.011383056640625,
0.00875091552734375,
0.01507568359375,
0.04449462890625,
0.035369873046875,
-0.06292724609375,
0.0435791015625,
-0.041900634765625,
-0.006168365478515625,
-0.00995635986328125,
-0.031707763671875,
-0.061370849609375,
-0.04644775390625,
-0.039093017578125,
-0.051483154296875,
-0.0024204254150390625,
0.07598876953125,
0.047149658203125,
-0.04998779296875,
-0.022308349609375,
-0.004764556884765625,
-0.0084381103515625,
-0.012298583984375,
-0.0165557861328125,
0.0672607421875,
-0.007427215576171875,
-0.055511474609375,
0.018768310546875,
-0.0068511962890625,
0.0241241455078125,
0.012054443359375,
-0.0205230712890625,
-0.01904296875,
-0.0137786865234375,
0.0215301513671875,
0.022674560546875,
-0.05780029296875,
-0.01374053955078125,
-0.00843048095703125,
-0.0152740478515625,
0.0250244140625,
0.028533935546875,
-0.05047607421875,
0.036865234375,
0.0266876220703125,
0.0283355712890625,
0.0543212890625,
-0.0062103271484375,
-0.0011186599731445312,
-0.033782958984375,
0.03802490234375,
-0.0017833709716796875,
0.03009033203125,
0.0175933837890625,
-0.03118896484375,
0.050933837890625,
0.03692626953125,
-0.050201416015625,
-0.063232421875,
-0.0204620361328125,
-0.091552734375,
-0.005126953125,
0.1004638671875,
-0.0054779052734375,
-0.031585693359375,
0.00597381591796875,
-0.025634765625,
0.033111572265625,
-0.043701171875,
0.033660888671875,
0.033050537109375,
-0.010040283203125,
0.0002715587615966797,
-0.062255859375,
0.04315185546875,
0.0268096923828125,
-0.03118896484375,
-0.007305145263671875,
0.01363372802734375,
0.03643798828125,
0.0168914794921875,
0.058563232421875,
-0.0056915283203125,
0.0196990966796875,
0.0183563232421875,
0.037506103515625,
-0.01285552978515625,
0.005279541015625,
-0.0191497802734375,
0.001911163330078125,
-0.00821685791015625,
-0.04742431640625
]
] |
sentence-transformers/stsb-roberta-base | 2022-06-15T20:36:10.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/stsb-roberta-base | 0 | 15,332 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
**⚠️ This model is deprecated. Please don't use it as it produces sentence embeddings of low quality. You can find recommended sentence embedding models here: [SBERT.net - Pretrained Models](https://www.sbert.net/docs/pretrained_models.html)**
# sentence-transformers/stsb-roberta-base
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/stsb-roberta-base')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/stsb-roberta-base')
model = AutoModel.from_pretrained('sentence-transformers/stsb-roberta-base')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, max pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/stsb-roberta-base)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': True}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 3,911 | [
[
-0.014923095703125,
-0.06256103515625,
0.0239105224609375,
0.0299530029296875,
-0.0295257568359375,
-0.0306243896484375,
-0.0257720947265625,
-0.00597381591796875,
0.01116180419921875,
0.031951904296875,
-0.04010009765625,
-0.0350341796875,
-0.057586669921875,
0.0050506591796875,
-0.0350341796875,
0.0643310546875,
-0.0081634521484375,
0.007793426513671875,
-0.0224151611328125,
-0.00958251953125,
-0.02496337890625,
-0.032745361328125,
-0.0225982666015625,
-0.017791748046875,
0.0095062255859375,
0.01198577880859375,
0.032989501953125,
0.026824951171875,
0.0222930908203125,
0.033477783203125,
-0.0070343017578125,
0.01059722900390625,
-0.030609130859375,
-0.0037326812744140625,
0.00540924072265625,
-0.024749755859375,
-0.00853729248046875,
0.02447509765625,
0.0421142578125,
0.03631591796875,
-0.00959014892578125,
0.00860595703125,
-0.00141143798828125,
0.02081298828125,
-0.030426025390625,
0.032012939453125,
-0.0416259765625,
0.01392364501953125,
0.007266998291015625,
0.0024280548095703125,
-0.051239013671875,
-0.0129852294921875,
0.0250701904296875,
-0.0284881591796875,
0.0091400146484375,
0.0133514404296875,
0.08673095703125,
0.0312347412109375,
-0.0195770263671875,
-0.0212860107421875,
-0.0278472900390625,
0.06488037109375,
-0.064697265625,
0.0196075439453125,
0.019561767578125,
-0.0005626678466796875,
-0.0011463165283203125,
-0.07708740234375,
-0.052703857421875,
-0.01558685302734375,
-0.0286102294921875,
0.016021728515625,
-0.0338134765625,
-0.0007686614990234375,
0.0093841552734375,
0.0222320556640625,
-0.055023193359375,
-0.0094451904296875,
-0.032867431640625,
-0.004344940185546875,
0.03570556640625,
0.00008970499038696289,
0.0306549072265625,
-0.04608154296875,
-0.037506103515625,
-0.023712158203125,
-0.0164794921875,
-0.0125579833984375,
0.0078277587890625,
0.01435089111328125,
-0.02642822265625,
0.058624267578125,
0.0034580230712890625,
0.046905517578125,
0.0028896331787109375,
0.0230255126953125,
0.048309326171875,
-0.0220947265625,
-0.0275115966796875,
-0.01181793212890625,
0.081298828125,
0.0242767333984375,
0.0294952392578125,
-0.00919342041015625,
-0.006237030029296875,
0.006439208984375,
0.0237274169921875,
-0.0594482421875,
-0.032073974609375,
0.01232147216796875,
-0.0282745361328125,
-0.0229949951171875,
0.014373779296875,
-0.047454833984375,
0.0050811767578125,
0.004547119140625,
0.05255126953125,
-0.04638671875,
0.002719879150390625,
0.0243988037109375,
-0.0183258056640625,
0.008270263671875,
-0.019256591796875,
-0.049530029296875,
0.0167999267578125,
0.0160675048828125,
0.06988525390625,
0.006259918212890625,
-0.034393310546875,
-0.0188446044921875,
-0.01055908203125,
0.0040435791015625,
0.0523681640625,
-0.024200439453125,
-0.0124359130859375,
0.00920867919921875,
0.0216827392578125,
-0.04205322265625,
-0.02435302734375,
0.045867919921875,
-0.026580810546875,
0.06085205078125,
0.019256591796875,
-0.0653076171875,
-0.01526641845703125,
0.007534027099609375,
-0.036773681640625,
0.0792236328125,
0.0219573974609375,
-0.06475830078125,
0.006015777587890625,
-0.057647705078125,
-0.023040771484375,
-0.0107269287109375,
0.0079803466796875,
-0.056854248046875,
0.0104827880859375,
0.033843994140625,
0.051910400390625,
0.009307861328125,
0.0309906005859375,
-0.0177154541015625,
-0.032928466796875,
0.028961181640625,
-0.0301361083984375,
0.08892822265625,
0.00974273681640625,
-0.02581787109375,
0.01448822021484375,
-0.0408935546875,
-0.00432586669921875,
0.024810791015625,
-0.012603759765625,
-0.0145263671875,
-0.006313323974609375,
0.0287017822265625,
0.0186614990234375,
0.01392364501953125,
-0.0499267578125,
0.0122528076171875,
-0.047607421875,
0.07110595703125,
0.04388427734375,
0.0006761550903320312,
0.046234130859375,
-0.022430419921875,
0.0209808349609375,
0.02288818359375,
0.002391815185546875,
-0.01678466796875,
-0.0308685302734375,
-0.07403564453125,
-0.024749755859375,
0.0277099609375,
0.039031982421875,
-0.048583984375,
0.083740234375,
-0.034393310546875,
-0.039337158203125,
-0.05303955078125,
-0.0058746337890625,
0.01082611083984375,
0.0292510986328125,
0.05316162109375,
-0.00969696044921875,
-0.051177978515625,
-0.06671142578125,
-0.002574920654296875,
0.003482818603515625,
0.00909423828125,
0.020599365234375,
0.054473876953125,
-0.03338623046875,
0.07421875,
-0.05169677734375,
-0.03515625,
-0.040130615234375,
0.01885986328125,
0.01617431640625,
0.05340576171875,
0.041046142578125,
-0.04852294921875,
-0.02850341796875,
-0.052490234375,
-0.0494384765625,
0.0035572052001953125,
-0.0168609619140625,
-0.01290130615234375,
0.0224151611328125,
0.032257080078125,
-0.06768798828125,
0.027984619140625,
0.052001953125,
-0.039947509765625,
0.0235595703125,
-0.02093505859375,
-0.0164947509765625,
-0.10565185546875,
-0.0017786026000976562,
0.005931854248046875,
-0.0223388671875,
-0.0287017822265625,
0.0071868896484375,
0.00970458984375,
-0.01166534423828125,
-0.03594970703125,
0.03399658203125,
-0.032928466796875,
0.009063720703125,
-0.00017368793487548828,
0.03533935546875,
0.005451202392578125,
0.0531005859375,
-0.0042724609375,
0.051727294921875,
0.036376953125,
-0.038818359375,
0.0224761962890625,
0.053619384765625,
-0.038818359375,
0.01134490966796875,
-0.0657958984375,
0.0009288787841796875,
-0.0020389556884765625,
0.03717041015625,
-0.08282470703125,
-0.00032258033752441406,
0.025970458984375,
-0.0455322265625,
0.01617431640625,
0.01459503173828125,
-0.04779052734375,
-0.045928955078125,
-0.0235443115234375,
0.0121307373046875,
0.046478271484375,
-0.041717529296875,
0.042388916015625,
0.024932861328125,
-0.0011758804321289062,
-0.03460693359375,
-0.08673095703125,
0.0036144256591796875,
-0.0140838623046875,
-0.05517578125,
0.04486083984375,
-0.00609588623046875,
0.0155487060546875,
0.026275634765625,
0.0233154296875,
-0.000026941299438476562,
-0.0002359151840209961,
0.00597381591796875,
0.0207977294921875,
-0.00579071044921875,
0.020416259765625,
0.007625579833984375,
-0.015899658203125,
-0.000016450881958007812,
-0.0222320556640625,
0.0589599609375,
-0.01354217529296875,
-0.007770538330078125,
-0.03570556640625,
0.0089874267578125,
0.0270843505859375,
-0.0237274169921875,
0.080810546875,
0.075927734375,
-0.02935791015625,
-0.01186370849609375,
-0.038238525390625,
-0.02239990234375,
-0.0343017578125,
0.04779052734375,
-0.01090240478515625,
-0.0758056640625,
0.0226287841796875,
0.01336669921875,
-0.00653839111328125,
0.047027587890625,
0.041107177734375,
-0.00966644287109375,
0.058624267578125,
0.044097900390625,
-0.01367950439453125,
0.042022705078125,
-0.052154541015625,
0.0238800048828125,
-0.07708740234375,
0.000035703182220458984,
-0.022796630859375,
-0.01708984375,
-0.053253173828125,
-0.0399169921875,
0.015777587890625,
-0.008056640625,
-0.0250244140625,
0.047271728515625,
-0.046630859375,
0.00959014892578125,
0.047088623046875,
0.01323699951171875,
-0.007110595703125,
0.0018415451049804688,
-0.022369384765625,
-0.006381988525390625,
-0.054412841796875,
-0.040924072265625,
0.06610107421875,
0.03094482421875,
0.035369873046875,
-0.00824737548828125,
0.052490234375,
0.0013284683227539062,
0.0009918212890625,
-0.05126953125,
0.0406494140625,
-0.028594970703125,
-0.03515625,
-0.022857666015625,
-0.0264434814453125,
-0.06610107421875,
0.028228759765625,
-0.015899658203125,
-0.056243896484375,
0.0020465850830078125,
-0.0153350830078125,
-0.0218048095703125,
0.016632080078125,
-0.056976318359375,
0.08282470703125,
0.003353118896484375,
-0.00601959228515625,
-0.01100921630859375,
-0.04705810546875,
0.01325225830078125,
0.02545166015625,
0.0050506591796875,
0.0015935897827148438,
0.00461578369140625,
0.061370849609375,
-0.020721435546875,
0.0738525390625,
-0.01397705078125,
0.020050048828125,
0.0277862548828125,
-0.021728515625,
0.024444580078125,
-0.004497528076171875,
-0.0010242462158203125,
0.0074615478515625,
-0.01529693603515625,
-0.025482177734375,
-0.0377197265625,
0.05377197265625,
-0.0733642578125,
-0.0303955078125,
-0.039886474609375,
-0.044189453125,
-0.004154205322265625,
0.01309967041015625,
0.028961181640625,
0.0345458984375,
-0.0166473388671875,
0.03594970703125,
0.036285400390625,
-0.0277099609375,
0.05157470703125,
0.01157379150390625,
-0.0008821487426757812,
-0.040985107421875,
0.043304443359375,
0.00559234619140625,
0.001079559326171875,
0.0311279296875,
0.01617431640625,
-0.03216552734375,
-0.017974853515625,
-0.031585693359375,
0.0310821533203125,
-0.0400390625,
-0.0155181884765625,
-0.0802001953125,
-0.037445068359375,
-0.05078125,
-0.00276947021484375,
-0.0185089111328125,
-0.039520263671875,
-0.045928955078125,
-0.0250244140625,
0.02923583984375,
0.03717041015625,
0.0004253387451171875,
0.03192138671875,
-0.056304931640625,
0.0084381103515625,
0.004497528076171875,
0.00885772705078125,
-0.006191253662109375,
-0.061553955078125,
-0.0283203125,
-0.002056121826171875,
-0.0309906005859375,
-0.063232421875,
0.051727294921875,
0.0155181884765625,
0.041290283203125,
0.006862640380859375,
0.00832366943359375,
0.048248291015625,
-0.040985107421875,
0.072998046875,
0.0005450248718261719,
-0.07952880859375,
0.037506103515625,
-0.006591796875,
0.0281524658203125,
0.032318115234375,
0.0224761962890625,
-0.033355712890625,
-0.033111572265625,
-0.061553955078125,
-0.0811767578125,
0.0496826171875,
0.03009033203125,
0.04559326171875,
-0.025482177734375,
0.0198516845703125,
-0.02099609375,
0.0125274658203125,
-0.08709716796875,
-0.024810791015625,
-0.0308380126953125,
-0.0469970703125,
-0.0261383056640625,
-0.0228729248046875,
0.0134124755859375,
-0.0263519287109375,
0.060546875,
0.00638580322265625,
0.053070068359375,
0.02587890625,
-0.045379638671875,
0.00774383544921875,
0.016448974609375,
0.043121337890625,
0.01309967041015625,
-0.00934600830078125,
0.00873565673828125,
0.01605224609375,
-0.0216217041015625,
-0.0017805099487304688,
0.036956787109375,
-0.0166015625,
0.0134429931640625,
0.0322265625,
0.07403564453125,
0.045867919921875,
-0.03399658203125,
0.057342529296875,
-0.00377655029296875,
-0.01474761962890625,
-0.042236328125,
-0.004665374755859375,
0.0228424072265625,
0.020599365234375,
0.0153961181640625,
0.0038738250732421875,
-0.0014629364013671875,
-0.0268402099609375,
0.0244598388671875,
0.0202178955078125,
-0.036376953125,
-0.004016876220703125,
0.052703857421875,
0.0079193115234375,
-0.014617919921875,
0.0732421875,
-0.0189208984375,
-0.054840087890625,
0.0322265625,
0.052490234375,
0.07568359375,
0.003826141357421875,
0.02496337890625,
0.04022216796875,
0.0251312255859375,
-0.002384185791015625,
-0.0011949539184570312,
0.01045989990234375,
-0.06610107421875,
-0.017242431640625,
-0.045440673828125,
0.0081787109375,
-0.0017099380493164062,
-0.043060302734375,
0.0164031982421875,
-0.0082244873046875,
-0.01110076904296875,
-0.0206146240234375,
0.002902984619140625,
-0.048065185546875,
0.00618743896484375,
-0.003398895263671875,
0.062469482421875,
-0.0732421875,
0.058807373046875,
0.04901123046875,
-0.05987548828125,
-0.053985595703125,
-0.00501251220703125,
-0.0241546630859375,
-0.057342529296875,
0.045166015625,
0.033935546875,
0.0168304443359375,
0.0195465087890625,
-0.0416259765625,
-0.06280517578125,
0.09954833984375,
0.016143798828125,
-0.0270233154296875,
-0.0159149169921875,
0.003009796142578125,
0.039581298828125,
-0.04278564453125,
0.02777099609375,
0.02203369140625,
0.0285186767578125,
-0.00811004638671875,
-0.048065185546875,
0.0162200927734375,
-0.020111083984375,
0.01983642578125,
-0.0171051025390625,
-0.044586181640625,
0.07733154296875,
-0.00708770751953125,
-0.01690673828125,
0.021392822265625,
0.065185546875,
0.0206146240234375,
-0.0085601806640625,
0.03216552734375,
0.06732177734375,
0.042572021484375,
-0.01251983642578125,
0.07171630859375,
-0.0268402099609375,
0.0574951171875,
0.0799560546875,
0.007122039794921875,
0.081298828125,
0.0295867919921875,
-0.00817108154296875,
0.06915283203125,
0.03790283203125,
-0.02899169921875,
0.04705810546875,
0.0212554931640625,
0.00848388671875,
-0.0026340484619140625,
0.0111236572265625,
-0.0157012939453125,
0.0406494140625,
0.01275634765625,
-0.057952880859375,
0.0008883476257324219,
0.01409149169921875,
0.006805419921875,
0.0066986083984375,
0.0145263671875,
0.044952392578125,
0.0120391845703125,
-0.03399658203125,
0.029296875,
0.013885498046875,
0.07220458984375,
-0.026153564453125,
0.0094757080078125,
-0.0010519027709960938,
0.028717041015625,
0.0015401840209960938,
-0.041351318359375,
0.0231170654296875,
-0.0089874267578125,
-0.0031890869140625,
-0.0195465087890625,
0.04449462890625,
-0.05029296875,
-0.0482177734375,
0.028228759765625,
0.0400390625,
0.002288818359375,
0.00551605224609375,
-0.07452392578125,
0.0035400390625,
-0.004947662353515625,
-0.0413818359375,
0.01129913330078125,
0.0214385986328125,
0.0307464599609375,
0.04412841796875,
0.0293121337890625,
-0.00705718994140625,
0.006259918212890625,
0.0146942138671875,
0.060943603515625,
-0.047119140625,
-0.04217529296875,
-0.06463623046875,
0.05706787109375,
-0.0149383544921875,
-0.024383544921875,
0.05426025390625,
0.039093017578125,
0.0689697265625,
-0.023681640625,
0.045135498046875,
-0.01020050048828125,
0.0194091796875,
-0.042724609375,
0.0626220703125,
-0.034454345703125,
-0.00379180908203125,
-0.0193328857421875,
-0.070068359375,
-0.0243682861328125,
0.08966064453125,
-0.0266571044921875,
0.0168304443359375,
0.0692138671875,
0.0582275390625,
-0.00689697265625,
-0.004085540771484375,
0.009124755859375,
0.03662109375,
0.01824951171875,
0.03857421875,
0.033355712890625,
-0.06573486328125,
0.044952392578125,
-0.036163330078125,
-0.0065460205078125,
-0.0178680419921875,
-0.06158447265625,
-0.0733642578125,
-0.06463623046875,
-0.0318603515625,
-0.024505615234375,
-0.0015773773193359375,
0.083251953125,
0.050079345703125,
-0.0517578125,
-0.005741119384765625,
-0.02099609375,
-0.01526641845703125,
-0.00782012939453125,
-0.0241851806640625,
0.03985595703125,
-0.041534423828125,
-0.06597900390625,
0.015899658203125,
-0.007785797119140625,
0.01096343994140625,
-0.033721923828125,
0.00974273681640625,
-0.046142578125,
0.007251739501953125,
0.043701171875,
-0.0225067138671875,
-0.06024169921875,
-0.0263519287109375,
0.0016946792602539062,
-0.0281219482421875,
-0.007289886474609375,
0.02410888671875,
-0.05517578125,
0.0161285400390625,
0.0285491943359375,
0.038909912109375,
0.052947998046875,
-0.01219940185546875,
0.0379638671875,
-0.05670166015625,
0.0201263427734375,
0.01232147216796875,
0.0531005859375,
0.036224365234375,
-0.0197906494140625,
0.0450439453125,
0.01800537109375,
-0.0406494140625,
-0.049102783203125,
-0.006992340087890625,
-0.08148193359375,
-0.025634765625,
0.0838623046875,
-0.033843994140625,
-0.0270843505859375,
0.01776123046875,
-0.014495849609375,
0.039276123046875,
-0.02288818359375,
0.0577392578125,
0.0606689453125,
0.0028858184814453125,
-0.024810791015625,
-0.021026611328125,
0.0150604248046875,
0.03173828125,
-0.043670654296875,
-0.01087188720703125,
0.0261077880859375,
0.02471923828125,
0.0196075439453125,
0.03131103515625,
-0.011199951171875,
0.00019991397857666016,
0.0022373199462890625,
0.0093994140625,
-0.0195770263671875,
-0.0031681060791015625,
-0.0287017822265625,
0.007843017578125,
-0.028900146484375,
-0.02349853515625
]
] |
google/ddpm-cifar10-32 | 2023-08-03T07:24:08.000Z | [
"diffusers",
"pytorch",
"unconditional-image-generation",
"arxiv:2006.11239",
"license:apache-2.0",
"has_space",
"diffusers:DDPMPipeline",
"region:us"
] | unconditional-image-generation | google | null | null | google/ddpm-cifar10-32 | 38 | 15,321 | diffusers | 2022-06-16T12:53:22 | ---
license: apache-2.0
tags:
- pytorch
- diffusers
- unconditional-image-generation
---
# Denoising Diffusion Probabilistic Models (DDPM)
**Paper**: [Denoising Diffusion Probabilistic Models](https://arxiv.org/abs/2006.11239)
**Authors**: Jonathan Ho, Ajay Jain, Pieter Abbeel
**Abstract**:
*We present high quality image synthesis results using diffusion probabilistic models, a class of latent variable models inspired by considerations from nonequilibrium thermodynamics. Our best results are obtained by training on a weighted variational bound designed according to a novel connection between diffusion probabilistic models and denoising score matching with Langevin dynamics, and our models naturally admit a progressive lossy decompression scheme that can be interpreted as a generalization of autoregressive decoding. On the unconditional CIFAR10 dataset, we obtain an Inception score of 9.46 and a state-of-the-art FID score of 3.17. On 256x256 LSUN, we obtain sample quality similar to ProgressiveGAN.*
## Inference
**DDPM** models can use *discrete noise schedulers* such as:
- [scheduling_ddpm](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddpm.py)
- [scheduling_ddim](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddim.py)
- [scheduling_pndm](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_pndm.py)
for inference. Note that while the *ddpm* scheduler yields the highest quality, it also takes the longest.
For a good trade-off between quality and inference speed you might want to consider the *ddim* or *pndm* schedulers instead.
See the following code:
```python
# !pip install diffusers
from diffusers import DDPMPipeline, DDIMPipeline, PNDMPipeline
model_id = "google/ddpm-cifar10-32"
# load model and scheduler
ddpm = DDPMPipeline.from_pretrained(model_id) # you can replace DDPMPipeline with DDIMPipeline or PNDMPipeline for faster inference
# run pipeline in inference (sample random noise and denoise)
image = ddpm().images[0]
# save image
image.save("ddpm_generated_image.png")
```
For more in-detail information, please have a look at the [official inference example](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/diffusers_intro.ipynb)
## Training
If you want to train your own model, please have a look at the [official training example](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/training_example.ipynb)
## Samples
1. 
2. 
3. 
4.  | 2,977 | [
[
-0.03515625,
-0.052001953125,
0.0253448486328125,
0.04833984375,
-0.011016845703125,
-0.0168609619140625,
0.007045745849609375,
-0.0253143310546875,
0.0083160400390625,
0.01134490966796875,
-0.052764892578125,
-0.0185089111328125,
-0.043243408203125,
-0.013519287109375,
-0.0245361328125,
0.075927734375,
-0.0098419189453125,
0.01123046875,
-0.0112762451171875,
0.01177978515625,
0.00966644287109375,
-0.01149749755859375,
-0.0609130859375,
-0.031768798828125,
0.005016326904296875,
-0.0170440673828125,
0.037109375,
0.0250396728515625,
0.027587890625,
0.0222320556640625,
-0.037109375,
0.005817413330078125,
-0.04376220703125,
-0.00977325439453125,
0.01529693603515625,
-0.002788543701171875,
-0.049896240234375,
0.01018524169921875,
0.061859130859375,
0.05474853515625,
-0.03515625,
0.0062408447265625,
0.00775909423828125,
0.04925537109375,
-0.04876708984375,
0.01334381103515625,
-0.01531219482421875,
0.0091552734375,
0.007396697998046875,
0.0145721435546875,
-0.01161956787109375,
-0.0183868408203125,
0.012420654296875,
-0.042877197265625,
0.038055419921875,
-0.027618408203125,
0.0936279296875,
0.016387939453125,
-0.0178985595703125,
0.0016813278198242188,
-0.05242919921875,
0.05462646484375,
-0.053802490234375,
0.0138092041015625,
0.023193359375,
0.0008416175842285156,
-0.0146942138671875,
-0.05596923828125,
-0.04705810546875,
-0.00594329833984375,
-0.001987457275390625,
0.044189453125,
-0.01415252685546875,
0.016845703125,
0.0203704833984375,
0.048736572265625,
-0.031707763671875,
-0.006832122802734375,
-0.0287322998046875,
-0.0134124755859375,
0.053680419921875,
0.0053863525390625,
-0.0013227462768554688,
0.00214385986328125,
-0.036346435546875,
-0.0006132125854492188,
-0.0194549560546875,
-0.0034580230712890625,
0.03179931640625,
-0.038177490234375,
-0.04522705078125,
0.0120849609375,
-0.017852783203125,
0.059234619140625,
0.037994384765625,
-0.0229034423828125,
0.0172119140625,
-0.0141754150390625,
-0.0274505615234375,
-0.03515625,
0.06353759765625,
0.038665771484375,
-0.00848388671875,
-0.00705718994140625,
-0.01393890380859375,
0.0006103515625,
-0.00418853759765625,
-0.08837890625,
-0.0513916015625,
0.0222015380859375,
-0.045135498046875,
-0.040618896484375,
-0.011566162109375,
-0.061614990234375,
0.0029735565185546875,
-0.0191192626953125,
0.0304107666015625,
-0.0318603515625,
-0.04412841796875,
0.005825042724609375,
-0.018035888671875,
0.02496337890625,
0.038909912109375,
-0.053924560546875,
0.029754638671875,
0.0217437744140625,
0.0826416015625,
-0.01212310791015625,
-0.009796142578125,
-0.03326416015625,
0.005298614501953125,
-0.033477783203125,
0.046722412109375,
-0.01922607421875,
-0.04132080078125,
-0.033050537109375,
0.0032711029052734375,
-0.0157928466796875,
-0.045440673828125,
0.031829833984375,
-0.041717529296875,
0.0295257568359375,
0.0034046173095703125,
-0.04595947265625,
-0.020904541015625,
-0.006023406982421875,
-0.035064697265625,
0.06402587890625,
0.038909912109375,
-0.0567626953125,
0.01141357421875,
-0.046295166015625,
-0.02203369140625,
-0.00951385498046875,
0.0063934326171875,
-0.059051513671875,
-0.001049041748046875,
0.0032711029052734375,
0.0389404296875,
-0.00827789306640625,
0.01509857177734375,
-0.04217529296875,
-0.015533447265625,
0.00940704345703125,
-0.0251007080078125,
0.087890625,
0.028594970703125,
-0.0263519287109375,
-0.0011034011840820312,
-0.06121826171875,
-0.004215240478515625,
0.0034694671630859375,
-0.034759521484375,
0.0105438232421875,
-0.0298614501953125,
0.01593017578125,
0.01255035400390625,
-0.009490966796875,
-0.037261962890625,
0.01343536376953125,
-0.0304107666015625,
0.044586181640625,
0.06207275390625,
0.01361846923828125,
0.057861328125,
-0.03582763671875,
0.053009033203125,
0.0163726806640625,
0.019500732421875,
0.0062408447265625,
-0.047515869140625,
-0.04925537109375,
-0.060394287109375,
0.0254974365234375,
0.044464111328125,
-0.03875732421875,
0.032440185546875,
0.004730224609375,
-0.048828125,
-0.02459716796875,
0.003025054931640625,
0.0284881591796875,
0.055816650390625,
0.023345947265625,
-0.04656982421875,
-0.03131103515625,
-0.047760009765625,
0.015289306640625,
-0.007717132568359375,
-0.0174407958984375,
0.0144500732421875,
0.03662109375,
-0.0189208984375,
0.05169677734375,
-0.042938232421875,
-0.013031005859375,
0.010345458984375,
0.012054443359375,
0.039093017578125,
0.061309814453125,
0.058013916015625,
-0.0643310546875,
-0.041534423828125,
-0.041168212890625,
-0.054931640625,
-0.01806640625,
-0.0155487060546875,
-0.02728271484375,
0.0295867919921875,
0.022705078125,
-0.0604248046875,
0.048187255859375,
0.0635986328125,
-0.023681640625,
0.05902099609375,
-0.017974853515625,
-0.0015974044799804688,
-0.07073974609375,
0.0149383544921875,
0.00905609130859375,
-0.024261474609375,
-0.037628173828125,
0.0033626556396484375,
-0.0143890380859375,
-0.0032215118408203125,
-0.03167724609375,
0.05755615234375,
-0.05523681640625,
0.0183868408203125,
-0.009674072265625,
-0.03582763671875,
0.02264404296875,
0.0301055908203125,
0.01505279541015625,
0.0445556640625,
0.06573486328125,
-0.07000732421875,
0.018402099609375,
0.0323486328125,
-0.0265045166015625,
0.049072265625,
-0.067626953125,
-0.004878997802734375,
-0.015899658203125,
0.03948974609375,
-0.08001708984375,
-0.0179595947265625,
0.055999755859375,
-0.01143646240234375,
0.05035400390625,
-0.04132080078125,
-0.0207366943359375,
-0.016204833984375,
-0.009674072265625,
0.02313232421875,
0.0751953125,
-0.0433349609375,
0.040985107421875,
0.0255279541015625,
0.01110076904296875,
-0.0333251953125,
-0.048492431640625,
-0.0159454345703125,
-0.031829833984375,
-0.045654296875,
0.0699462890625,
-0.0308074951171875,
-0.0117645263671875,
0.002330780029296875,
-0.00960540771484375,
-0.0229949951171875,
0.0113983154296875,
0.04034423828125,
0.0290679931640625,
0.0054168701171875,
-0.0228729248046875,
-0.004852294921875,
-0.008392333984375,
0.0029048919677734375,
-0.0173797607421875,
0.0193023681640625,
-0.0011997222900390625,
-0.006847381591796875,
-0.057281494140625,
0.02880859375,
0.025909423828125,
0.017333984375,
0.0703125,
0.08935546875,
-0.01296234130859375,
-0.006389617919921875,
-0.047119140625,
-0.023712158203125,
-0.03643798828125,
-0.01490020751953125,
-0.03466796875,
-0.03924560546875,
0.0445556640625,
-0.00948333740234375,
-0.00588226318359375,
0.038421630859375,
0.045654296875,
-0.0311431884765625,
0.05413818359375,
0.0321044921875,
0.00920867919921875,
0.05780029296875,
-0.06036376953125,
-0.00804901123046875,
-0.0745849609375,
-0.00711822509765625,
0.0007281303405761719,
-0.0226593017578125,
-0.041473388671875,
-0.046783447265625,
0.04547119140625,
0.039703369140625,
-0.0135040283203125,
0.007213592529296875,
-0.05035400390625,
0.020477294921875,
0.044097900390625,
0.024383544921875,
-0.005725860595703125,
0.0242767333984375,
0.0038509368896484375,
0.008331298828125,
-0.049102783203125,
-0.016510009765625,
0.06341552734375,
0.0154876708984375,
0.070068359375,
0.0026111602783203125,
0.050811767578125,
0.0018110275268554688,
0.01436614990234375,
-0.0302734375,
0.0343017578125,
0.0060272216796875,
-0.040618896484375,
-0.0004088878631591797,
-0.0438232421875,
-0.06927490234375,
0.019287109375,
-0.0087890625,
-0.037200927734375,
0.041534423828125,
0.0228271484375,
-0.0244140625,
0.0269012451171875,
-0.054840087890625,
0.052154541015625,
0.0173797607421875,
-0.048675537109375,
-0.004901885986328125,
-0.050689697265625,
0.0304107666015625,
0.0179290771484375,
-0.002735137939453125,
-0.0165863037109375,
0.0025119781494140625,
0.0516357421875,
-0.041351318359375,
0.0721435546875,
-0.046905517578125,
-0.0052490234375,
0.0439453125,
0.0067901611328125,
0.0190277099609375,
0.0277099609375,
-0.0269012451171875,
0.04608154296875,
0.019256591796875,
-0.047210693359375,
-0.0193634033203125,
0.050018310546875,
-0.0548095703125,
-0.025909423828125,
-0.032196044921875,
-0.032012939453125,
0.0206756591796875,
-0.0031414031982421875,
0.045501708984375,
0.03515625,
-0.010467529296875,
-0.013641357421875,
0.05511474609375,
-0.0128936767578125,
0.046234130859375,
0.0244140625,
0.0013332366943359375,
-0.033721923828125,
0.052215576171875,
0.013763427734375,
0.022979736328125,
0.0286407470703125,
0.01483917236328125,
-0.00872802734375,
-0.04949951171875,
-0.03485107421875,
0.03082275390625,
-0.038116455078125,
-0.0144500732421875,
-0.060211181640625,
-0.0254364013671875,
-0.044189453125,
-0.0057373046875,
-0.026641845703125,
-0.047454833984375,
-0.055999755859375,
0.0234375,
0.037872314453125,
0.01335906982421875,
-0.049102783203125,
0.036529541015625,
-0.04254150390625,
0.0308685302734375,
0.0156402587890625,
0.0222015380859375,
-0.005889892578125,
-0.05859375,
-0.00804901123046875,
-0.0031299591064453125,
-0.045379638671875,
-0.048126220703125,
0.04736328125,
0.0311737060546875,
0.0447998046875,
0.04046630859375,
0.0010929107666015625,
0.060638427734375,
-0.029266357421875,
0.0513916015625,
0.02001953125,
-0.059600830078125,
0.036224365234375,
-0.021728515625,
0.00800323486328125,
0.01398468017578125,
0.042266845703125,
-0.0108184814453125,
-0.00168609619140625,
-0.07098388671875,
-0.05877685546875,
0.0430908203125,
0.024169921875,
0.00513458251953125,
0.00445556640625,
0.054901123046875,
0.0063934326171875,
0.02081298828125,
-0.04541015625,
-0.018798828125,
-0.01201629638671875,
0.00026917457580566406,
0.0083160400390625,
-0.0302734375,
-0.021820068359375,
-0.038421630859375,
0.0673828125,
0.0016927719116210938,
0.051727294921875,
0.045379638671875,
0.01227569580078125,
-0.00730133056640625,
-0.017730712890625,
0.035888671875,
0.057373046875,
-0.0301513671875,
-0.0178985595703125,
-0.00731658935546875,
-0.0282135009765625,
0.01500701904296875,
0.0188140869140625,
-0.035186767578125,
0.017486572265625,
0.0214996337890625,
0.0655517578125,
-0.0198516845703125,
-0.026641845703125,
0.038726806640625,
-0.005950927734375,
-0.0325927734375,
-0.051788330078125,
0.0150909423828125,
0.01454925537109375,
0.0276641845703125,
-0.0020599365234375,
0.0228271484375,
0.02630615234375,
-0.01434326171875,
-0.00002187490463256836,
0.041717529296875,
-0.01265716552734375,
-0.042144775390625,
0.06854248046875,
0.0007076263427734375,
0.00879669189453125,
0.053009033203125,
-0.0435791015625,
-0.0283203125,
0.052642822265625,
0.0222320556640625,
0.0576171875,
-0.0157623291015625,
0.0231475830078125,
0.046417236328125,
-0.00777435302734375,
-0.0193328857421875,
0.03277587890625,
-0.016326904296875,
-0.045928955078125,
-0.040069580078125,
-0.06781005859375,
-0.0209808349609375,
0.00749969482421875,
-0.047607421875,
0.010772705078125,
-0.039398193359375,
-0.019134521484375,
0.00659942626953125,
-0.00901031494140625,
-0.059539794921875,
0.015777587890625,
0.002105712890625,
0.056884765625,
-0.059173583984375,
0.054931640625,
0.044158935546875,
-0.021484375,
-0.047607421875,
-0.0033702850341796875,
0.00403594970703125,
-0.04949951171875,
0.047210693359375,
-0.00031375885009765625,
0.00846099853515625,
0.01303863525390625,
-0.049072265625,
-0.05047607421875,
0.081787109375,
0.037811279296875,
-0.03729248046875,
-0.0159912109375,
-0.045135498046875,
0.0264739990234375,
-0.0204010009765625,
0.030181884765625,
0.0252685546875,
0.0185394287109375,
0.021484375,
-0.05889892578125,
0.0102386474609375,
-0.0148773193359375,
0.01500701904296875,
-0.0015621185302734375,
-0.0599365234375,
0.0830078125,
-0.0302734375,
-0.01517486572265625,
0.022796630859375,
0.0369873046875,
0.01568603515625,
0.0251007080078125,
0.048370361328125,
0.0673828125,
0.040130615234375,
-0.004886627197265625,
0.089599609375,
-0.01464080810546875,
0.0386962890625,
0.0660400390625,
0.00113677978515625,
0.029205322265625,
0.038055419921875,
-0.04046630859375,
0.050018310546875,
0.0670166015625,
-0.0245361328125,
0.056427001953125,
0.030029296875,
-0.036529541015625,
-0.0146026611328125,
0.00775909423828125,
-0.03765869140625,
0.0147705078125,
0.019195556640625,
-0.02630615234375,
-0.0036678314208984375,
0.00438690185546875,
0.00203704833984375,
-0.023956298828125,
0.0019445419311523438,
0.05950927734375,
0.0099334716796875,
-0.036834716796875,
0.072509765625,
-0.0013246536254882812,
0.0679931640625,
-0.0343017578125,
-0.0035648345947265625,
-0.0212249755859375,
0.01233673095703125,
-0.022369384765625,
-0.03936767578125,
0.033599853515625,
-0.032135009765625,
-0.0231170654296875,
-0.03643798828125,
0.04461669921875,
-0.033355712890625,
-0.041717529296875,
0.0231170654296875,
0.0205841064453125,
0.0360107421875,
-0.016326904296875,
-0.06488037109375,
0.006603240966796875,
-0.01529693603515625,
-0.0396728515625,
0.0296783447265625,
0.01384735107421875,
0.0299530029296875,
0.0384521484375,
0.055572509765625,
0.0221710205078125,
-0.00431060791015625,
-0.0146026611328125,
0.06597900390625,
-0.0214080810546875,
-0.03167724609375,
-0.06005859375,
0.038116455078125,
-0.008575439453125,
-0.01073455810546875,
0.03466796875,
0.02508544921875,
0.0721435546875,
-0.0243377685546875,
0.0416259765625,
-0.02227783203125,
0.0182647705078125,
-0.046417236328125,
0.045745849609375,
-0.045806884765625,
0.0161285400390625,
-0.021240234375,
-0.0650634765625,
-0.0002751350402832031,
0.06634521484375,
-0.01226043701171875,
0.0152435302734375,
0.016754150390625,
0.06884765625,
-0.024169921875,
-0.0242156982421875,
0.00659942626953125,
0.0134124755859375,
0.0199432373046875,
0.04217529296875,
0.055084228515625,
-0.05169677734375,
0.03155517578125,
-0.053863525390625,
-0.03399658203125,
-0.01177978515625,
-0.0777587890625,
-0.051483154296875,
-0.04693603515625,
-0.07440185546875,
-0.055572509765625,
-0.0026264190673828125,
0.039398193359375,
0.0947265625,
-0.04925537109375,
-0.01141357421875,
-0.0276336669921875,
-0.0001691579818725586,
-0.044769287109375,
-0.02294921875,
0.030364990234375,
0.00026988983154296875,
-0.07745361328125,
0.01422882080078125,
-0.003253936767578125,
0.0310211181640625,
-0.006809234619140625,
-0.01198577880859375,
-0.0203094482421875,
-0.0173187255859375,
0.03759765625,
0.016632080078125,
-0.04595947265625,
0.0106658935546875,
-0.029541015625,
0.010986328125,
0.00696563720703125,
0.02227783203125,
-0.049407958984375,
0.03021240234375,
0.04669189453125,
0.027313232421875,
0.06915283203125,
-0.0025959014892578125,
0.02593994140625,
-0.042877197265625,
0.0256500244140625,
0.00519561767578125,
0.0210113525390625,
0.02728271484375,
-0.031707763671875,
0.0604248046875,
0.050628662109375,
-0.06207275390625,
-0.0657958984375,
0.01058197021484375,
-0.09356689453125,
-0.018646240234375,
0.0885009765625,
-0.03448486328125,
-0.023956298828125,
0.0211334228515625,
-0.00035381317138671875,
0.022705078125,
-0.0306549072265625,
0.05755615234375,
0.03692626953125,
-0.0261688232421875,
-0.0193634033203125,
-0.0390625,
0.044189453125,
0.021820068359375,
-0.0328369140625,
-0.0208892822265625,
0.046112060546875,
0.059173583984375,
0.01265716552734375,
0.07373046875,
-0.004119873046875,
0.01312255859375,
0.0088653564453125,
0.00545501708984375,
0.00008785724639892578,
0.00418853759765625,
-0.03521728515625,
0.0038013458251953125,
-0.025604248046875,
-0.0243377685546875
]
] |
TheBloke/Wizard-Vicuna-13B-Uncensored-SuperHOT-8K-GPTQ | 2023-08-21T14:48:15.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"custom_code",
"license:other",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Wizard-Vicuna-13B-Uncensored-SuperHOT-8K-GPTQ | 96 | 15,250 | transformers | 2023-06-27T03:55:57 | ---
inference: false
license: other
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Eric Hartford's Wizard Vicuna 13B Uncensored GPTQ
These files are GPTQ 4bit model files for [Eric Hartford's Wizard Vicuna 13B Uncensored](https://huggingface.co/ehartford/Wizard-Vicuna-13B-Uncensored) merged with [Kaio Ken's SuperHOT 8K](https://huggingface.co/kaiokendev/superhot-13b-8k-no-rlhf-test).
It is the result of quantising to 4bit using [GPTQ-for-LLaMa](https://github.com/qwopqwop200/GPTQ-for-LLaMa).
**This is an experimental new GPTQ which offers up to 8K context size**
The increased context is tested to work with [ExLlama](https://github.com/turboderp/exllama), via the latest release of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It has also been tested from Python code using AutoGPTQ, and `trust_remote_code=True`.
Code credits:
- Original concept and code for increasing context length: [kaiokendev](https://huggingface.co/kaiokendev)
- Updated Llama modelling code that includes this automatically via trust_remote_code: [emozilla](https://huggingface.co/emozilla).
Please read carefully below to see how to use it.
GGML versions are not yet provided, as there is not yet support for SuperHOT in llama.cpp. This is being investigated and will hopefully come soon.
## Repositories available
* [4-bit GPTQ models for GPU inference](https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-SuperHOT-8K-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGML models for CPU inference](https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-SuperHOT-8K-GGML)
* [Unquantised SuperHOT fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-SuperHOT-8K-fp16)
* [Unquantised base fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/ehartford/Wizard-Vicuna-13B-Uncensored)
## How to easily download and use this model in text-generation-webui with ExLlama
Please make sure you're using the latest version of text-generation-webui
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Wizard-Vicuna-13B-Uncensored-SuperHOT-8K-GPTQ`.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done"
5. Untick **Autoload the model**
6. In the top left, click the refresh icon next to **Model**.
7. In the **Model** dropdown, choose the model you just downloaded: `Wizard-Vicuna-13B-Uncensored-SuperHOT-8K-GPTQ`
8. To use the increased context, set the **Loader** to **ExLlama**, set **max_seq_len** to 8192 or 4096, and set **compress_pos_emb** to **4** for 8192 context, or to **2** for 4096 context.
9. Now click **Save Settings** followed by **Reload**
10. The model will automatically load, and is now ready for use!
11. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
## How to use this GPTQ model from Python code with AutoGPTQ
First make sure you have AutoGPTQ and Einops installed:
```
pip3 install einops auto-gptq
```
Then run the following code. Note that in order to get this to work, `config.json` has been hardcoded to a sequence length of 8192.
If you want to try 4096 instead to reduce VRAM usage, please manually edit `config.json` to set `max_position_embeddings` to the value you want.
```python
from transformers import AutoTokenizer, pipeline, logging
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
import argparse
model_name_or_path = "TheBloke/Wizard-Vicuna-13B-Uncensored-SuperHOT-8K-GPTQ"
model_basename = "wizard-vicuna-13b-uncensored-superhot-8k-GPTQ-4bit-128g.no-act.order"
use_triton = False
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(model_name_or_path,
model_basename=model_basename,
use_safetensors=True,
trust_remote_code=True,
device_map='auto',
use_triton=use_triton,
quantize_config=None)
model.seqlen = 8192
# Note: check the prompt template is correct for this model.
prompt = "Tell me about AI"
prompt_template=f'''USER: {prompt}
ASSISTANT:'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
# Prevent printing spurious transformers error when using pipeline with AutoGPTQ
logging.set_verbosity(logging.CRITICAL)
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
temperature=0.7,
top_p=0.95,
repetition_penalty=1.15
)
print(pipe(prompt_template)[0]['generated_text'])
```
## Using other UIs: monkey patch
Provided in the repo is `llama_rope_scaled_monkey_patch.py`, written by @kaiokendev.
It can be theoretically be added to any Python UI or custom code to enable the same result as `trust_remote_code=True`. I have not tested this, and it should be superseded by using `trust_remote_code=True`, but I include it for completeness and for interest.
## Provided files
**wizard-vicuna-13b-uncensored-superhot-8k-GPTQ-4bit-128g.no-act.order.safetensors**
This will work with AutoGPTQ, ExLlama, and CUDA versions of GPTQ-for-LLaMa. There are reports of issues with Triton mode of recent GPTQ-for-LLaMa. If you have issues, please use AutoGPTQ instead.
It was created with group_size 128 to increase inference accuracy, but without --act-order (desc_act) to increase compatibility and improve inference speed.
* `wizard-vicuna-13b-uncensored-superhot-8k-GPTQ-4bit-128g.no-act.order.safetensors`
* Works for use with ExLlama with increased context (4096 or 8192)
* Works with AutoGPTQ in Python code, including with increased context, if `trust_remote_code=True` is set.
* Should work with GPTQ-for-LLaMa in CUDA mode, but unknown if increased context works - TBC. May have issues with GPTQ-for-LLaMa Triton mode.
* Works with text-generation-webui, including one-click-installers.
* Parameters: Groupsize = 128. Act Order / desc_act = False.
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute.
Thanks to the [chirper.ai](https://chirper.ai) team!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Sam, theTransient, Jonathan Leane, Steven Wood, webtim, Johann-Peter Hartmann, Geoffrey Montalvo, Gabriel Tamborski, Willem Michiel, John Villwock, Derek Yates, Mesiah Bishop, Eugene Pentland, Pieter, Chadd, Stephen Murray, Daniel P. Andersen, terasurfer, Brandon Frisco, Thomas Belote, Sid, Nathan LeClaire, Magnesian, Alps Aficionado, Stanislav Ovsiannikov, Alex, Joseph William Delisle, Nikolai Manek, Michael Davis, Junyu Yang, K, J, Spencer Kim, Stefan Sabev, Olusegun Samson, transmissions 11, Michael Levine, Cory Kujawski, Rainer Wilmers, zynix, Kalila, Luke @flexchar, Ajan Kanaga, Mandus, vamX, Ai Maven, Mano Prime, Matthew Berman, subjectnull, Vitor Caleffi, Clay Pascal, biorpg, alfie_i, 阿明, Jeffrey Morgan, ya boyyy, Raymond Fosdick, knownsqashed, Olakabola, Leonard Tan, ReadyPlayerEmma, Enrico Ros, Dave, Talal Aujan, Illia Dulskyi, Sean Connelly, senxiiz, Artur Olbinski, Elle, Raven Klaugh, Fen Risland, Deep Realms, Imad Khwaja, Fred von Graf, Will Dee, usrbinkat, SuperWojo, Alexandros Triantafyllidis, Swaroop Kallakuri, Dan Guido, John Detwiler, Pedro Madruga, Iucharbius, Viktor Bowallius, Asp the Wyvern, Edmond Seymore, Trenton Dambrowitz, Space Cruiser, Spiking Neurons AB, Pyrater, LangChain4j, Tony Hughes, Kacper Wikieł, Rishabh Srivastava, David Ziegler, Luke Pendergrass, Andrey, Gabriel Puliatti, Lone Striker, Sebastain Graf, Pierre Kircher, Randy H, NimbleBox.ai, Vadim, danny, Deo Leter
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Kaio Ken's SuperHOT 8K
### SuperHOT Prototype 2 w/ 8K Context
This is a second prototype of SuperHOT, this time 30B with 8K context and no RLHF, using the same technique described in [the github blog](https://kaiokendev.github.io/til#extending-context-to-8k).
Tests have shown that the model does indeed leverage the extended context at 8K.
You will need to **use either the monkeypatch** or, if you are already using the monkeypatch, **change the scaling factor to 0.25 and the maximum sequence length to 8192**
#### Looking for Merged & Quantized Models?
- 30B 4-bit CUDA: [tmpupload/superhot-30b-8k-4bit-safetensors](https://huggingface.co/tmpupload/superhot-30b-8k-4bit-safetensors)
- 30B 4-bit CUDA 128g: [tmpupload/superhot-30b-8k-4bit-128g-safetensors](https://huggingface.co/tmpupload/superhot-30b-8k-4bit-128g-safetensors)
#### Training Details
I trained the LoRA with the following configuration:
- 1200 samples (~400 samples over 2048 sequence length)
- learning rate of 3e-4
- 3 epochs
- The exported modules are:
- q_proj
- k_proj
- v_proj
- o_proj
- no bias
- Rank = 4
- Alpha = 8
- no dropout
- weight decay of 0.1
- AdamW beta1 of 0.9 and beta2 0.99, epsilon of 1e-5
- Trained on 4-bit base model
# Original model card: Eric Hartford's Wizard Vicuna 13B Uncensored
This is [wizard-vicuna-13b](https://huggingface.co/junelee/wizard-vicuna-13b) trained with a subset of the dataset - responses that contained alignment / moralizing were removed. The intent is to train a WizardLM that doesn't have alignment built-in, so that alignment (of any sort) can be added separately with for example with a RLHF LoRA.
Shout out to the open source AI/ML community, and everyone who helped me out.
Note:
An uncensored model has no guardrails.
You are responsible for anything you do with the model, just as you are responsible for anything you do with any dangerous object such as a knife, gun, lighter, or car.
Publishing anything this model generates is the same as publishing it yourself.
You are responsible for the content you publish, and you cannot blame the model any more than you can blame the knife, gun, lighter, or car for what you do with it.
| 12,071 | [
[
-0.0310211181640625,
-0.064208984375,
0.01959228515625,
0.01012420654296875,
-0.0281982421875,
-0.0158538818359375,
0.0006413459777832031,
-0.037689208984375,
0.01226806640625,
0.0171356201171875,
-0.03466796875,
-0.038482666015625,
-0.0321044921875,
0.0092010498046875,
-0.019195556640625,
0.07415771484375,
0.01293182373046875,
-0.029205322265625,
0.0020046234130859375,
0.0011091232299804688,
-0.03314208984375,
-0.03472900390625,
-0.05743408203125,
-0.0144195556640625,
0.0235748291015625,
0.0150146484375,
0.06695556640625,
0.04656982421875,
0.01300048828125,
0.028961181640625,
-0.0009126663208007812,
0.0188446044921875,
-0.02825927734375,
-0.0032634735107421875,
0.01065826416015625,
-0.029632568359375,
-0.046173095703125,
0.0016756057739257812,
0.03875732421875,
-0.005344390869140625,
-0.0193939208984375,
0.01434326171875,
-0.001556396484375,
0.0179595947265625,
-0.03466796875,
0.01953125,
-0.037506103515625,
0.0019817352294921875,
0.0005216598510742188,
-0.00936126708984375,
-0.007537841796875,
-0.0200042724609375,
0.0015439987182617188,
-0.073974609375,
0.010711669921875,
0.004528045654296875,
0.09539794921875,
0.0251922607421875,
-0.037567138671875,
0.0027828216552734375,
-0.0305328369140625,
0.053070068359375,
-0.07177734375,
0.007476806640625,
0.0272369384765625,
0.01546478271484375,
-0.017578125,
-0.07244873046875,
-0.057281494140625,
-0.018707275390625,
-0.00551605224609375,
0.0190277099609375,
-0.036865234375,
-0.0020732879638671875,
0.023651123046875,
0.037872314453125,
-0.04766845703125,
-0.0048370361328125,
-0.0244903564453125,
-0.0099029541015625,
0.055206298828125,
0.023223876953125,
0.0404052734375,
-0.0178985595703125,
-0.027435302734375,
-0.019927978515625,
-0.047943115234375,
-0.00170135498046875,
0.01149749755859375,
0.000732421875,
-0.04083251953125,
0.03558349609375,
-0.02020263671875,
0.042266845703125,
0.014923095703125,
-0.0051422119140625,
0.01216888427734375,
-0.034942626953125,
-0.047943115234375,
-0.023284912109375,
0.09429931640625,
0.038543701171875,
-0.010650634765625,
0.0166473388671875,
0.0013628005981445312,
-0.004367828369140625,
-0.00551605224609375,
-0.06658935546875,
-0.0303497314453125,
0.03619384765625,
-0.0236358642578125,
-0.023529052734375,
-0.015777587890625,
-0.044036865234375,
-0.00943756103515625,
0.0028476715087890625,
0.04376220703125,
-0.034881591796875,
-0.029388427734375,
0.01259613037109375,
-0.02532958984375,
0.03350830078125,
0.02716064453125,
-0.06890869140625,
0.007678985595703125,
0.0224761962890625,
0.06011962890625,
0.02618408203125,
-0.022308349609375,
-0.0221099853515625,
0.011383056640625,
-0.0159149169921875,
0.0411376953125,
-0.0177764892578125,
-0.035736083984375,
-0.0194549560546875,
0.01617431640625,
0.0019083023071289062,
-0.0225067138671875,
0.018798828125,
-0.025115966796875,
0.029144287109375,
-0.020050048828125,
-0.0316162109375,
-0.0169830322265625,
0.01387786865234375,
-0.032989501953125,
0.09051513671875,
0.0269012451171875,
-0.0606689453125,
0.01251220703125,
-0.050201416015625,
-0.00852203369140625,
0.011383056640625,
-0.00824737548828125,
-0.035980224609375,
-0.0114593505859375,
0.03082275390625,
0.0290679931640625,
-0.032623291015625,
0.01363372802734375,
-0.018463134765625,
-0.0256805419921875,
0.0229949951171875,
-0.041778564453125,
0.09405517578125,
0.0144500732421875,
-0.04669189453125,
0.004009246826171875,
-0.041168212890625,
0.0093994140625,
0.0266265869140625,
-0.02032470703125,
-0.00035309791564941406,
-0.034759521484375,
0.0007014274597167969,
0.005779266357421875,
0.024932861328125,
-0.032440185546875,
0.034912109375,
-0.0139007568359375,
0.056884765625,
0.052703857421875,
0.0065460205078125,
0.024871826171875,
-0.0250396728515625,
0.0439453125,
-0.005771636962890625,
0.047271728515625,
0.007625579833984375,
-0.048431396484375,
-0.0638427734375,
-0.0248870849609375,
0.0240020751953125,
0.0419921875,
-0.06549072265625,
0.041778564453125,
-0.0077056884765625,
-0.05615234375,
-0.028045654296875,
-0.00928497314453125,
0.0247955322265625,
0.03729248046875,
0.04058837890625,
-0.033111572265625,
-0.0300140380859375,
-0.05731201171875,
0.00830841064453125,
-0.0267333984375,
-0.0167236328125,
0.0338134765625,
0.0418701171875,
-0.0282135009765625,
0.058563232421875,
-0.049591064453125,
-0.019927978515625,
-0.0092926025390625,
0.0015783309936523438,
0.021575927734375,
0.04986572265625,
0.054962158203125,
-0.04766845703125,
-0.0416259765625,
-0.006580352783203125,
-0.061431884765625,
-0.004058837890625,
-0.004608154296875,
-0.034454345703125,
0.01288604736328125,
0.01207733154296875,
-0.0816650390625,
0.045684814453125,
0.03515625,
-0.04010009765625,
0.0548095703125,
-0.031951904296875,
0.01000213623046875,
-0.08599853515625,
-0.0028324127197265625,
0.0116729736328125,
-0.0243682861328125,
-0.036712646484375,
0.0126800537109375,
-0.0020008087158203125,
0.01145172119140625,
-0.038482666015625,
0.047698974609375,
-0.03839111328125,
0.00977325439453125,
-0.0105438232421875,
-0.004467010498046875,
0.021636962890625,
0.034942626953125,
-0.00460052490234375,
0.05828857421875,
0.045745849609375,
-0.048583984375,
0.048309326171875,
0.0262603759765625,
-0.01169586181640625,
0.020050048828125,
-0.0657958984375,
0.01493072509765625,
0.0134124755859375,
0.042999267578125,
-0.07244873046875,
-0.023834228515625,
0.043975830078125,
-0.049407958984375,
0.0224609375,
-0.0211639404296875,
-0.0312347412109375,
-0.02618408203125,
-0.0269927978515625,
0.0265350341796875,
0.05010986328125,
-0.03619384765625,
0.0384521484375,
0.035736083984375,
0.003643035888671875,
-0.054534912109375,
-0.0640869140625,
0.005573272705078125,
-0.0295867919921875,
-0.04107666015625,
0.037353515625,
-0.00858306884765625,
-0.0003743171691894531,
-0.0046234130859375,
0.0119476318359375,
-0.01079559326171875,
0.00003916025161743164,
0.0179901123046875,
0.040252685546875,
-0.01329803466796875,
-0.0086669921875,
0.0116729736328125,
-0.0022983551025390625,
-0.0036716461181640625,
-0.0303802490234375,
0.044281005859375,
-0.0291290283203125,
0.011199951171875,
-0.051361083984375,
0.01497650146484375,
0.036224365234375,
-0.00470733642578125,
0.06158447265625,
0.065185546875,
-0.0271148681640625,
0.00281524658203125,
-0.03924560546875,
-0.01349639892578125,
-0.04241943359375,
0.0161895751953125,
-0.015625,
-0.05810546875,
0.035186767578125,
0.030853271484375,
0.01306915283203125,
0.052703857421875,
0.046478271484375,
-0.000606536865234375,
0.06536865234375,
0.046966552734375,
-0.0070648193359375,
0.039276123046875,
-0.057342529296875,
-0.0054931640625,
-0.0675048828125,
-0.01320648193359375,
-0.017791748046875,
-0.0087738037109375,
-0.048248291015625,
-0.04644775390625,
0.0361328125,
0.011260986328125,
-0.0526123046875,
0.041168212890625,
-0.06158447265625,
0.0240936279296875,
0.04534912109375,
0.02032470703125,
0.01806640625,
0.00033545494079589844,
-0.006923675537109375,
0.013458251953125,
-0.056793212890625,
-0.02947998046875,
0.0765380859375,
0.0217742919921875,
0.043182373046875,
0.0142669677734375,
0.043304443359375,
0.007717132568359375,
0.028961181640625,
-0.0404052734375,
0.042816162109375,
0.009124755859375,
-0.052001953125,
-0.03350830078125,
-0.034454345703125,
-0.06549072265625,
0.0222320556640625,
-0.007717132568359375,
-0.053070068359375,
0.03155517578125,
0.01261138916015625,
-0.054473876953125,
0.026336669921875,
-0.047271728515625,
0.060211181640625,
-0.0103759765625,
-0.036773681640625,
0.00629425048828125,
-0.047027587890625,
0.0248565673828125,
0.0257415771484375,
0.0015764236450195312,
-0.016845703125,
-0.0164947509765625,
0.049072265625,
-0.06011962890625,
0.06939697265625,
-0.01532745361328125,
-0.01264190673828125,
0.04638671875,
-0.00678253173828125,
0.0279693603515625,
0.0302886962890625,
0.00281524658203125,
0.023529052734375,
0.007049560546875,
-0.02801513671875,
-0.035186767578125,
0.04473876953125,
-0.076416015625,
-0.050384521484375,
-0.0295867919921875,
-0.03924560546875,
0.0131988525390625,
0.0077972412109375,
0.043914794921875,
0.03070068359375,
0.0062255859375,
0.003826141357421875,
0.035552978515625,
-0.0285186767578125,
0.04986572265625,
0.0225830078125,
-0.0083160400390625,
-0.044525146484375,
0.06005859375,
0.005092620849609375,
0.0108184814453125,
0.0217742919921875,
0.014862060546875,
-0.03839111328125,
-0.02484130859375,
-0.0560302734375,
0.0181732177734375,
-0.040496826171875,
-0.0416259765625,
-0.050994873046875,
-0.0244140625,
-0.0394287109375,
0.0030002593994140625,
-0.03167724609375,
-0.0428466796875,
-0.048553466796875,
-0.0038394927978515625,
0.0633544921875,
0.03338623046875,
-0.018035888671875,
0.0294189453125,
-0.055084228515625,
0.0187835693359375,
0.0413818359375,
0.00010699033737182617,
0.004940032958984375,
-0.059814453125,
-0.009918212890625,
0.007305145263671875,
-0.04132080078125,
-0.058197021484375,
0.06463623046875,
0.0080413818359375,
0.028564453125,
0.0166168212890625,
0.01462554931640625,
0.06292724609375,
-0.01568603515625,
0.07672119140625,
0.0208892822265625,
-0.07501220703125,
0.040740966796875,
-0.0419921875,
0.02166748046875,
0.0210723876953125,
0.036224365234375,
-0.0295257568359375,
-0.025299072265625,
-0.052978515625,
-0.06805419921875,
0.02850341796875,
0.0311737060546875,
0.016021728515625,
-0.0011301040649414062,
0.04339599609375,
-0.00794219970703125,
0.005290985107421875,
-0.07489013671875,
-0.032745361328125,
-0.0309906005859375,
-0.00728607177734375,
0.0131072998046875,
-0.0052032470703125,
-0.0175933837890625,
-0.044525146484375,
0.0628662109375,
-0.01934814453125,
0.05792236328125,
0.041351318359375,
-0.0007200241088867188,
-0.00254058837890625,
0.01276397705078125,
0.0251922607421875,
0.049591064453125,
-0.005130767822265625,
-0.011627197265625,
0.01654052734375,
-0.040496826171875,
0.0169830322265625,
0.0200042724609375,
-0.016815185546875,
-0.00952911376953125,
0.005527496337890625,
0.06610107421875,
-0.00824737548828125,
-0.0187835693359375,
0.0298004150390625,
-0.0265655517578125,
-0.0211639404296875,
-0.027130126953125,
0.0223541259765625,
0.01473236083984375,
0.03857421875,
0.034332275390625,
-0.0242156982421875,
0.01025390625,
-0.042633056640625,
0.0020599365234375,
0.0323486328125,
-0.01331329345703125,
-0.01374053955078125,
0.07598876953125,
0.00333404541015625,
-0.00394439697265625,
0.06011962890625,
-0.0119476318359375,
-0.037689208984375,
0.070556640625,
0.049713134765625,
0.060028076171875,
-0.0119781494140625,
0.023284912109375,
0.03692626953125,
0.018707275390625,
-0.00211334228515625,
0.01715087890625,
0.001888275146484375,
-0.050018310546875,
-0.01331329345703125,
-0.0382080078125,
-0.0301055908203125,
0.018218994140625,
-0.03839111328125,
0.0166473388671875,
-0.04437255859375,
-0.034393310546875,
-0.0177154541015625,
0.015716552734375,
-0.047821044921875,
0.0249786376953125,
0.01073455810546875,
0.05841064453125,
-0.045989990234375,
0.07080078125,
0.03558349609375,
-0.045684814453125,
-0.0784912109375,
-0.010772705078125,
-0.0012359619140625,
-0.053924560546875,
0.016937255859375,
0.0099639892578125,
0.00797271728515625,
0.01312255859375,
-0.058441162109375,
-0.064697265625,
0.1187744140625,
0.0282745361328125,
-0.0333251953125,
-0.01461029052734375,
0.00012242794036865234,
0.03338623046875,
-0.032806396484375,
0.055816650390625,
0.042022705078125,
0.020111083984375,
-0.003082275390625,
-0.0745849609375,
0.036041259765625,
-0.0287628173828125,
0.006908416748046875,
-0.01204681396484375,
-0.09136962890625,
0.08062744140625,
-0.01751708984375,
-0.00940704345703125,
0.0240325927734375,
0.059112548828125,
0.0313720703125,
0.00027561187744140625,
0.029541015625,
0.046722412109375,
0.06036376953125,
-0.0079498291015625,
0.08935546875,
-0.020111083984375,
0.048675537109375,
0.0677490234375,
0.011962890625,
0.056640625,
0.01151275634765625,
-0.0295867919921875,
0.047943115234375,
0.065673828125,
-0.01274871826171875,
0.0272369384765625,
0.00211334228515625,
-0.03350830078125,
-0.01425933837890625,
-0.00548553466796875,
-0.058197021484375,
0.0206756591796875,
0.035308837890625,
-0.0222015380859375,
0.01093292236328125,
-0.0167388916015625,
-0.0010614395141601562,
-0.048004150390625,
-0.00974273681640625,
0.0310211181640625,
0.02276611328125,
-0.029144287109375,
0.078125,
0.0013866424560546875,
0.06414794921875,
-0.03955078125,
-0.0045166015625,
-0.032745361328125,
-0.004428863525390625,
-0.016693115234375,
-0.048431396484375,
-0.0022945404052734375,
-0.00672149658203125,
0.00763702392578125,
0.01132965087890625,
0.055908203125,
-0.0220184326171875,
-0.027862548828125,
0.0208282470703125,
0.025054931640625,
0.0176239013671875,
-0.005748748779296875,
-0.072021484375,
0.013092041015625,
0.0012025833129882812,
-0.03741455078125,
0.025787353515625,
0.0286102294921875,
0.0164642333984375,
0.0548095703125,
0.048187255859375,
-0.0200347900390625,
0.0161590576171875,
-0.01387786865234375,
0.07745361328125,
-0.054901123046875,
-0.0270538330078125,
-0.06829833984375,
0.04327392578125,
-0.00927734375,
-0.03289794921875,
0.055572509765625,
0.036529541015625,
0.05108642578125,
-0.005222320556640625,
0.061004638671875,
-0.020904541015625,
-0.00977325439453125,
-0.0254974365234375,
0.06890869140625,
-0.05316162109375,
0.020751953125,
-0.023468017578125,
-0.05535888671875,
-0.01116943359375,
0.057769775390625,
-0.007110595703125,
0.006500244140625,
0.034881591796875,
0.0692138671875,
-0.00180816650390625,
-0.00281524658203125,
0.0183563232421875,
0.0284423828125,
0.0217437744140625,
0.07147216796875,
0.0606689453125,
-0.0743408203125,
0.05108642578125,
-0.033233642578125,
-0.026092529296875,
-0.005939483642578125,
-0.065673828125,
-0.05706787109375,
-0.0253448486328125,
-0.040802001953125,
-0.03955078125,
0.00012683868408203125,
0.06610107421875,
0.060302734375,
-0.035369873046875,
-0.01934814453125,
-0.00951385498046875,
0.00308990478515625,
-0.006103515625,
-0.0230865478515625,
0.0272064208984375,
0.00958251953125,
-0.06689453125,
0.0169677734375,
0.00641632080078125,
0.030914306640625,
-0.01296234130859375,
-0.0089874267578125,
-0.023040771484375,
0.0038394927978515625,
0.037139892578125,
0.04705810546875,
-0.049530029296875,
-0.013702392578125,
-0.01157379150390625,
-0.0089111328125,
0.020111083984375,
0.0230255126953125,
-0.061859130859375,
0.003971099853515625,
0.027557373046875,
0.0179290771484375,
0.04913330078125,
-0.0103912353515625,
0.042327880859375,
-0.033966064453125,
0.0176239013671875,
0.0006594657897949219,
0.0310211181640625,
0.0175323486328125,
-0.052001953125,
0.04718017578125,
0.0200958251953125,
-0.056304931640625,
-0.05419921875,
-0.013031005859375,
-0.07086181640625,
-0.0159759521484375,
0.0855712890625,
-0.02117919921875,
-0.0292205810546875,
0.000007271766662597656,
-0.019561767578125,
0.041473388671875,
-0.03564453125,
0.04742431640625,
0.025115966796875,
-0.02227783203125,
-0.0235748291015625,
-0.045867919921875,
0.0360107421875,
0.0254974365234375,
-0.0631103515625,
-0.0006966590881347656,
0.035552978515625,
0.0380859375,
0.00482177734375,
0.07354736328125,
-0.0081329345703125,
0.030029296875,
0.0108184814453125,
0.00261688232421875,
-0.00455474853515625,
-0.005405426025390625,
-0.0167236328125,
-0.002262115478515625,
-0.020263671875,
-0.01470184326171875
]
] |
zhihan1996/DNABERT-2-117M | 2023-10-30T19:27:14.000Z | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"biology",
"medical",
"custom_code",
"arxiv:2306.15006",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | zhihan1996 | null | null | zhihan1996/DNABERT-2-117M | 12 | 15,213 | transformers | 2023-06-26T07:14:58 | ---
metrics:
- matthews_correlation
- f1
tags:
- biology
- medical
---
This is the official pre-trained model introduced in [DNABERT-2: Efficient Foundation Model and Benchmark For Multi-Species Genome
](https://arxiv.org/pdf/2306.15006.pdf).
DNABERT-2 is a transformer-based genome foundation model trained on multi-species genome.
To load the model from huggingface:
```
import torch
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("zhihan1996/DNABERT-2-117M", trust_remote_code=True)
model = AutoModel.from_pretrained("zhihan1996/DNABERT-2-117M", trust_remote_code=True)
```
To calculate the embedding of a dna sequence
```
dna = "ACGTAGCATCGGATCTATCTATCGACACTTGGTTATCGATCTACGAGCATCTCGTTAGC"
inputs = tokenizer(dna, return_tensors = 'pt')["input_ids"]
hidden_states = model(inputs)[0] # [1, sequence_length, 768]
# embedding with mean pooling
embedding_mean = torch.mean(hidden_states[0], dim=0)
print(embedding_mean.shape) # expect to be 768
# embedding with max pooling
embedding_max = torch.max(hidden_states[0], dim=0)[0]
print(embedding_max.shape) # expect to be 768
``` | 1,129 | [
[
-0.044830322265625,
-0.032745361328125,
0.0123443603515625,
0.013427734375,
-0.0310821533203125,
-0.0003707408905029297,
0.006305694580078125,
-0.0012674331665039062,
0.0244903564453125,
0.0160064697265625,
-0.049224853515625,
-0.036590576171875,
-0.060302734375,
0.0010271072387695312,
-0.0350341796875,
0.0760498046875,
-0.01190948486328125,
0.00461578369140625,
-0.004909515380859375,
-0.017974853515625,
0.021209716796875,
-0.0282440185546875,
0.0032520294189453125,
-0.0307769775390625,
0.0257110595703125,
0.0201568603515625,
0.052459716796875,
0.05859375,
0.035186767578125,
0.019989013671875,
-0.01383209228515625,
-0.01422119140625,
-0.05889892578125,
-0.0270233154296875,
0.01129150390625,
-0.0295257568359375,
-0.0242919921875,
0.02740478515625,
0.057708740234375,
0.043975830078125,
-0.00391387939453125,
0.0235443115234375,
0.0031833648681640625,
0.043487548828125,
-0.041900634765625,
0.00669097900390625,
-0.0162506103515625,
0.02545166015625,
-0.0102691650390625,
0.01178741455078125,
-0.0270538330078125,
-0.0254669189453125,
0.02783203125,
-0.0452880859375,
0.0225372314453125,
-0.005489349365234375,
0.0780029296875,
0.0347900390625,
-0.03436279296875,
-0.0023174285888671875,
-0.034332275390625,
0.04730224609375,
-0.0252532958984375,
0.0277862548828125,
0.016204833984375,
0.01824951171875,
-0.01552581787109375,
-0.06915283203125,
-0.04510498046875,
-0.007572174072265625,
-0.01036834716796875,
-0.001399993896484375,
-0.01555633544921875,
0.006992340087890625,
0.040557861328125,
0.039093017578125,
-0.060089111328125,
-0.0272369384765625,
-0.046112060546875,
-0.0124664306640625,
0.040924072265625,
-0.0200653076171875,
0.0008521080017089844,
-0.00873565673828125,
-0.051910400390625,
-0.03948974609375,
-0.030731201171875,
0.018890380859375,
0.021514892578125,
0.0011882781982421875,
-0.0308685302734375,
0.044342041015625,
-0.0032100677490234375,
0.0313720703125,
0.0256805419921875,
0.0174407958984375,
0.05364990234375,
-0.0016908645629882812,
-0.0240478515625,
0.004299163818359375,
0.0604248046875,
0.01087188720703125,
0.031463623046875,
0.0017061233520507812,
0.0161590576171875,
-0.00225830078125,
0.0246124267578125,
-0.0858154296875,
-0.057403564453125,
0.019561767578125,
-0.029937744140625,
-0.02581787109375,
0.0301513671875,
-0.05328369140625,
-0.01229095458984375,
0.004138946533203125,
0.04205322265625,
-0.0276641845703125,
-0.031524658203125,
0.0017480850219726562,
-0.004940032958984375,
0.037353515625,
-0.0011444091796875,
-0.050689697265625,
0.018096923828125,
0.04052734375,
0.079345703125,
0.0035839080810546875,
-0.0258941650390625,
-0.032501220703125,
-0.0089111328125,
-0.00684356689453125,
0.035919189453125,
-0.024200439453125,
-0.034881591796875,
-0.017120361328125,
0.0228271484375,
-0.009063720703125,
-0.0217132568359375,
0.032196044921875,
-0.02557373046875,
0.00786590576171875,
-0.0295562744140625,
-0.056427001953125,
-0.0355224609375,
0.0113067626953125,
-0.052215576171875,
0.09344482421875,
0.0479736328125,
-0.0693359375,
0.034332275390625,
-0.0389404296875,
-0.009063720703125,
-0.00600433349609375,
-0.00966644287109375,
-0.06591796875,
-0.0177459716796875,
-0.0020542144775390625,
0.042572021484375,
-0.00801849365234375,
0.0218505859375,
-0.04449462890625,
-0.03057861328125,
-0.00421905517578125,
0.0250701904296875,
0.08233642578125,
0.005401611328125,
-0.04638671875,
0.0187530517578125,
-0.047088623046875,
0.0030765533447265625,
0.01554107666015625,
-0.016510009765625,
0.007144927978515625,
-0.0265350341796875,
0.00605010986328125,
0.0190582275390625,
0.0017137527465820312,
-0.05523681640625,
0.03265380859375,
-0.0277862548828125,
0.03759765625,
0.032196044921875,
-0.017242431640625,
0.034027099609375,
-0.02410888671875,
0.0301666259765625,
0.0085601806640625,
0.015838623046875,
0.0029926300048828125,
-0.042694091796875,
-0.0550537109375,
-0.0308380126953125,
0.03033447265625,
0.0372314453125,
-0.027496337890625,
0.050140380859375,
-0.00959014892578125,
-0.06622314453125,
-0.0184173583984375,
0.005939483642578125,
0.0312347412109375,
0.023162841796875,
0.046875,
-0.039276123046875,
-0.03692626953125,
-0.07196044921875,
0.0171051025390625,
-0.007904052734375,
0.01374053955078125,
0.00954437255859375,
0.069091796875,
-0.03363037109375,
0.06915283203125,
-0.027130126953125,
-0.00021505355834960938,
-0.0164794921875,
0.0131683349609375,
0.0303955078125,
0.0609130859375,
0.059295654296875,
-0.04486083984375,
-0.0160369873046875,
-0.0419921875,
-0.061676025390625,
0.0255889892578125,
0.0152130126953125,
-0.04400634765625,
-0.0123443603515625,
0.006511688232421875,
-0.0565185546875,
0.039093017578125,
0.046905517578125,
-0.02862548828125,
0.04779052734375,
-0.0099029541015625,
-0.0002918243408203125,
-0.07720947265625,
0.01035308837890625,
-0.0007929801940917969,
-0.006946563720703125,
-0.049652099609375,
0.026641845703125,
0.033477783203125,
0.00811004638671875,
-0.03289794921875,
0.035858154296875,
-0.0288543701171875,
-0.01123046875,
-0.0086212158203125,
-0.01776123046875,
0.00562286376953125,
0.031585693359375,
-0.0029277801513671875,
0.036956787109375,
0.050994873046875,
-0.04022216796875,
0.035430908203125,
0.052459716796875,
-0.00402069091796875,
0.0169677734375,
-0.06256103515625,
0.01007080078125,
0.00748443603515625,
0.047882080078125,
-0.057342529296875,
-0.0221099853515625,
0.018798828125,
-0.041107177734375,
0.0288543701171875,
-0.0233917236328125,
-0.021942138671875,
-0.059844970703125,
-0.0438232421875,
0.057830810546875,
0.05010986328125,
-0.06549072265625,
0.05975341796875,
0.024322509765625,
0.015716552734375,
-0.048492431640625,
-0.058258056640625,
-0.03192138671875,
-0.0172271728515625,
-0.06768798828125,
0.0243988037109375,
-0.00492095947265625,
0.0106201171875,
-0.01093292236328125,
0.0011615753173828125,
-0.0070648193359375,
0.00469970703125,
0.03369140625,
0.020111083984375,
-0.01568603515625,
-0.00408935546875,
0.007171630859375,
-0.01410675048828125,
0.02581787109375,
-0.01467132568359375,
0.050018310546875,
-0.0191802978515625,
-0.018310546875,
-0.034881591796875,
0.0024738311767578125,
0.0183563232421875,
-0.01535797119140625,
0.059112548828125,
0.06378173828125,
-0.0380859375,
-0.016754150390625,
-0.030487060546875,
-0.022216796875,
-0.034454345703125,
0.02337646484375,
-0.0283355712890625,
-0.056243896484375,
0.033050537109375,
-0.01959228515625,
-0.0013551712036132812,
0.04962158203125,
0.040618896484375,
-0.0007891654968261719,
0.050689697265625,
0.043731689453125,
0.0015249252319335938,
0.0127105712890625,
-0.06146240234375,
-0.0104827880859375,
-0.06365966796875,
-0.031768798828125,
-0.027374267578125,
-0.018890380859375,
-0.034332275390625,
-0.045806884765625,
0.00927734375,
0.039031982421875,
-0.05810546875,
0.061431884765625,
-0.044158935546875,
0.01451873779296875,
0.06671142578125,
0.01548004150390625,
-0.01554107666015625,
-0.0035266876220703125,
-0.03948974609375,
0.01418304443359375,
-0.04583740234375,
-0.0269012451171875,
0.09857177734375,
0.03021240234375,
0.0518798828125,
0.010772705078125,
0.07464599609375,
0.007274627685546875,
0.016204833984375,
-0.047027587890625,
0.016448974609375,
-0.0078277587890625,
-0.04998779296875,
0.0028362274169921875,
-0.03314208984375,
-0.057373046875,
0.00916290283203125,
-0.030487060546875,
-0.05377197265625,
0.0204620361328125,
0.0343017578125,
-0.0299224853515625,
0.0264739990234375,
-0.0352783203125,
0.0789794921875,
-0.0101165771484375,
-0.020355224609375,
0.00228118896484375,
-0.06292724609375,
0.043243408203125,
-0.0020904541015625,
-0.0038471221923828125,
-0.01041412353515625,
0.0325927734375,
0.0699462890625,
-0.036224365234375,
0.03460693359375,
-0.03851318359375,
0.0229644775390625,
0.019744873046875,
0.01024627685546875,
0.034332275390625,
0.01483917236328125,
-0.01274871826171875,
0.018707275390625,
0.01311492919921875,
-0.042938232421875,
-0.01285552978515625,
0.031524658203125,
-0.0838623046875,
-0.02496337890625,
-0.0462646484375,
-0.0082244873046875,
-0.0031032562255859375,
0.004528045654296875,
0.040252685546875,
0.0281982421875,
-0.029876708984375,
0.0240631103515625,
0.0537109375,
-0.021697998046875,
0.0199127197265625,
0.01282501220703125,
-0.01424407958984375,
-0.0274810791015625,
0.033050537109375,
-0.0190277099609375,
0.00916290283203125,
0.005802154541015625,
0.01947021484375,
-0.027862548828125,
-0.0113983154296875,
-0.060821533203125,
0.0295867919921875,
-0.0477294921875,
-0.0142822265625,
-0.07037353515625,
-0.03350830078125,
-0.0294036865234375,
-0.015045166015625,
-0.041290283203125,
-0.029052734375,
-0.0235748291015625,
-0.01540374755859375,
0.035797119140625,
0.048980712890625,
-0.0237274169921875,
0.039276123046875,
-0.05792236328125,
0.021026611328125,
0.019378662109375,
0.0113983154296875,
-0.019775390625,
-0.059417724609375,
-0.00791168212890625,
-0.00638580322265625,
-0.0197906494140625,
-0.07275390625,
0.01446533203125,
0.03216552734375,
0.048187255859375,
0.03204345703125,
0.00003409385681152344,
0.025299072265625,
-0.0328369140625,
0.048248291015625,
0.01251220703125,
-0.0689697265625,
0.040008544921875,
-0.009979248046875,
0.01293182373046875,
0.042144775390625,
0.04803466796875,
-0.006336212158203125,
-0.004825592041015625,
-0.057830810546875,
-0.06256103515625,
0.051177978515625,
0.0208892822265625,
0.0006394386291503906,
0.01232147216796875,
0.036865234375,
0.0083160400390625,
0.00626373291015625,
-0.06500244140625,
-0.046356201171875,
-0.047576904296875,
-0.022064208984375,
-0.00502777099609375,
-0.029998779296875,
-0.0207366943359375,
-0.047027587890625,
0.066650390625,
-0.0025310516357421875,
0.044342041015625,
0.01116943359375,
-0.03314208984375,
-0.01364898681640625,
-0.0172271728515625,
0.04266357421875,
0.0254058837890625,
-0.034454345703125,
-0.003803253173828125,
0.0012044906616210938,
-0.04296875,
0.00795745849609375,
0.049468994140625,
0.0011806488037109375,
-0.007755279541015625,
0.0283660888671875,
0.053863525390625,
0.01568603515625,
-0.015960693359375,
0.058929443359375,
-0.0022735595703125,
-0.036590576171875,
-0.032318115234375,
0.0176544189453125,
0.01531982421875,
0.0347900390625,
0.0217437744140625,
0.000583648681640625,
0.00789642333984375,
-0.0208587646484375,
0.042572021484375,
0.03033447265625,
-0.0211944580078125,
-0.01551055908203125,
0.062347412109375,
0.01336669921875,
0.0098419189453125,
0.056671142578125,
-0.0302886962890625,
-0.06317138671875,
0.06298828125,
0.01538848876953125,
0.05523681640625,
-0.0254974365234375,
0.0193328857421875,
0.07537841796875,
0.0169525146484375,
-0.01435089111328125,
0.0240936279296875,
-0.008697509765625,
-0.052215576171875,
0.0009984970092773438,
-0.0670166015625,
-0.0277252197265625,
-0.007602691650390625,
-0.0675048828125,
0.01708984375,
-0.041168212890625,
-0.01239013671875,
0.0030765533447265625,
0.0080413818359375,
-0.06341552734375,
0.016510009765625,
0.0099945068359375,
0.08782958984375,
-0.06585693359375,
0.076416015625,
0.038360595703125,
-0.0158233642578125,
-0.0799560546875,
-0.01885986328125,
0.021240234375,
-0.07196044921875,
0.0361328125,
0.04473876953125,
-0.0023670196533203125,
0.0159454345703125,
-0.03533935546875,
-0.0777587890625,
0.09521484375,
0.0192108154296875,
-0.04998779296875,
0.00969696044921875,
-0.003803253173828125,
0.0199127197265625,
-0.0260467529296875,
0.0296478271484375,
0.05035400390625,
0.034820556640625,
0.0119476318359375,
-0.047149658203125,
0.00981903076171875,
-0.038818359375,
-0.0046539306640625,
0.01519775390625,
-0.04022216796875,
0.09552001953125,
-0.0271759033203125,
-0.0052490234375,
0.0294647216796875,
0.052459716796875,
0.036224365234375,
-0.0043182373046875,
0.043731689453125,
0.057708740234375,
0.0469970703125,
-0.0333251953125,
0.0692138671875,
-0.040283203125,
0.0709228515625,
0.0665283203125,
-0.00933837890625,
0.050048828125,
0.017120361328125,
-0.0262298583984375,
0.0557861328125,
0.06866455078125,
-0.0467529296875,
0.03778076171875,
0.036834716796875,
-0.0070648193359375,
-0.01146697998046875,
0.0275421142578125,
-0.06353759765625,
0.0301513671875,
0.0166778564453125,
-0.04913330078125,
-0.00815582275390625,
-0.0025920867919921875,
0.00197601318359375,
-0.0256195068359375,
-0.0069427490234375,
0.034942626953125,
-0.005901336669921875,
-0.0232696533203125,
0.042236328125,
0.0008983612060546875,
0.0531005859375,
-0.036712646484375,
-0.004230499267578125,
-0.0025806427001953125,
0.0205078125,
-0.0194244384765625,
-0.046112060546875,
-0.001255035400390625,
0.00019943714141845703,
-0.004016876220703125,
0.0162506103515625,
0.0294647216796875,
-0.0311737060546875,
-0.045684814453125,
0.0489501953125,
0.0277099609375,
0.0242462158203125,
-0.00563812255859375,
-0.07672119140625,
0.0015087127685546875,
-0.0165252685546875,
-0.0550537109375,
0.0013589859008789062,
-0.0071563720703125,
0.025054931640625,
0.034759521484375,
0.0345458984375,
-0.026397705078125,
0.0098724365234375,
-0.00550079345703125,
0.07415771484375,
-0.04962158203125,
-0.039520263671875,
-0.0518798828125,
0.03887939453125,
0.0029754638671875,
-0.0340576171875,
0.035980224609375,
0.0513916015625,
0.080078125,
-0.004543304443359375,
0.042205810546875,
-0.0197906494140625,
0.0256500244140625,
-0.02398681640625,
0.050201416015625,
-0.03826904296875,
-0.0288543701171875,
-0.0223541259765625,
-0.08514404296875,
-0.002712249755859375,
0.08135986328125,
-0.0068817138671875,
0.034088134765625,
0.052947998046875,
0.053466796875,
-0.01434326171875,
0.0138702392578125,
-0.0024394989013671875,
0.0239105224609375,
0.01024627685546875,
0.038360595703125,
0.0333251953125,
-0.0311737060546875,
0.0254669189453125,
-0.0601806640625,
-0.03363037109375,
-0.01678466796875,
-0.053863525390625,
-0.07904052734375,
-0.037506103515625,
-0.0132293701171875,
-0.043914794921875,
0.0013942718505859375,
0.093994140625,
0.07708740234375,
-0.07562255859375,
-0.0006866455078125,
-0.007053375244140625,
-0.0333251953125,
0.007282257080078125,
-0.01507568359375,
0.043487548828125,
0.0223388671875,
-0.03662109375,
0.019622802734375,
-0.0002435445785522461,
0.01038360595703125,
-0.023193359375,
-0.0163116455078125,
-0.037384033203125,
-0.018310546875,
0.01255035400390625,
0.024932861328125,
-0.04150390625,
-0.031768798828125,
-0.0201873779296875,
-0.011871337890625,
0.01904296875,
0.04010009765625,
-0.055938720703125,
0.029296875,
0.007659912109375,
0.03839111328125,
0.06866455078125,
0.00737762451171875,
0.031951904296875,
-0.0400390625,
0.0145416259765625,
0.0004127025604248047,
0.052703857421875,
0.0290985107421875,
-0.0258026123046875,
0.0487060546875,
0.0223388671875,
-0.04168701171875,
-0.038665771484375,
0.020050048828125,
-0.08544921875,
-0.0024738311767578125,
0.071044921875,
-0.04400634765625,
-0.0233154296875,
0.01255035400390625,
0.00548553466796875,
0.0576171875,
-0.01666259765625,
0.037445068359375,
0.0267333984375,
-0.0076141357421875,
-0.00467681884765625,
-0.0186614990234375,
0.037811279296875,
0.018951416015625,
-0.051666259765625,
-0.0328369140625,
0.01007080078125,
0.04071044921875,
0.0184478759765625,
0.022674560546875,
0.0006890296936035156,
0.03839111328125,
0.0146942138671875,
0.020050048828125,
-0.0235595703125,
-0.0280303955078125,
-0.02178955078125,
0.003749847412109375,
-0.01203155517578125,
-0.0041656494140625
]
] |
facebook/nllb-200-3.3B | 2023-02-11T20:19:13.000Z | [
"transformers",
"pytorch",
"m2m_100",
"text2text-generation",
"nllb",
"translation",
"ace",
"acm",
"acq",
"aeb",
"af",
"ajp",
"ak",
"als",
"am",
"apc",
"ar",
"ars",
"ary",
"arz",
"as",
"ast",
"awa",
"ayr",
"azb",
"azj",
"ba",
"bm",
"ban",
"be",
"bem",
"bn",
"bho",
"bjn",
"bo",
"bs",
"bug",
"bg",
"ca",
"ceb",
"cs",
"cjk",
"ckb",
"crh",
"cy",
"da",
"de",
"dik",
"dyu",
"dz",
"el",
"en",
"eo",
"et",
"eu",
"ee",
"fo",
"fj",
"fi",
"fon",
"fr",
"fur",
"fuv",
"gaz",
"gd",
"ga",
"gl",
"gn",
"gu",
"ht",
"ha",
"he",
"hi",
"hne",
"hr",
"hu",
"hy",
"ig",
"ilo",
"id",
"is",
"it",
"jv",
"ja",
"kab",
"kac",
"kam",
"kn",
"ks",
"ka",
"kk",
"kbp",
"kea",
"khk",
"km",
"ki",
"rw",
"ky",
"kmb",
"kmr",
"knc",
"kg",
"ko",
"lo",
"lij",
"li",
"ln",
"lt",
"lmo",
"ltg",
"lb",
"lua",
"lg",
"luo",
"lus",
"lvs",
"mag",
"mai",
"ml",
"mar",
"min",
"mk",
"mt",
"mni",
"mos",
"mi",
"my",
"nl",
"nn",
"nb",
"npi",
"nso",
"nus",
"ny",
"oc",
"ory",
"pag",
"pa",
"pap",
"pbt",
"pes",
"plt",
"pl",
"pt",
"prs",
"quy",
"ro",
"rn",
"ru",
"sg",
"sa",
"sat",
"scn",
"shn",
"si",
"sk",
"sl",
"sm",
"sn",
"sd",
"so",
"st",
"es",
"sc",
"sr",
"ss",
"su",
"sv",
"swh",
"szl",
"ta",
"taq",
"tt",
"te",
"tg",
"tl",
"th",
"ti",
"tpi",
"tn",
"ts",
"tk",
"tum",
"tr",
"tw",
"tzm",
"ug",
"uk",
"umb",
"ur",
"uzn",
"vec",
"vi",
"war",
"wo",
"xh",
"ydd",
"yo",
"yue",
"zh",
"zsm",
"zu",
"dataset:flores-200",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"has_space",
"region:us"
] | translation | facebook | null | null | facebook/nllb-200-3.3B | 127 | 15,175 | transformers | 2022-07-08T10:06:00 | ---
language:
- ace
- acm
- acq
- aeb
- af
- ajp
- ak
- als
- am
- apc
- ar
- ars
- ary
- arz
- as
- ast
- awa
- ayr
- azb
- azj
- ba
- bm
- ban
- be
- bem
- bn
- bho
- bjn
- bo
- bs
- bug
- bg
- ca
- ceb
- cs
- cjk
- ckb
- crh
- cy
- da
- de
- dik
- dyu
- dz
- el
- en
- eo
- et
- eu
- ee
- fo
- fj
- fi
- fon
- fr
- fur
- fuv
- gaz
- gd
- ga
- gl
- gn
- gu
- ht
- ha
- he
- hi
- hne
- hr
- hu
- hy
- ig
- ilo
- id
- is
- it
- jv
- ja
- kab
- kac
- kam
- kn
- ks
- ka
- kk
- kbp
- kea
- khk
- km
- ki
- rw
- ky
- kmb
- kmr
- knc
- kg
- ko
- lo
- lij
- li
- ln
- lt
- lmo
- ltg
- lb
- lua
- lg
- luo
- lus
- lvs
- mag
- mai
- ml
- mar
- min
- mk
- mt
- mni
- mos
- mi
- my
- nl
- nn
- nb
- npi
- nso
- nus
- ny
- oc
- ory
- pag
- pa
- pap
- pbt
- pes
- plt
- pl
- pt
- prs
- quy
- ro
- rn
- ru
- sg
- sa
- sat
- scn
- shn
- si
- sk
- sl
- sm
- sn
- sd
- so
- st
- es
- sc
- sr
- ss
- su
- sv
- swh
- szl
- ta
- taq
- tt
- te
- tg
- tl
- th
- ti
- tpi
- tn
- ts
- tk
- tum
- tr
- tw
- tzm
- ug
- uk
- umb
- ur
- uzn
- vec
- vi
- war
- wo
- xh
- ydd
- yo
- yue
- zh
- zsm
- zu
language_details: "ace_Arab, ace_Latn, acm_Arab, acq_Arab, aeb_Arab, afr_Latn, ajp_Arab, aka_Latn, amh_Ethi, apc_Arab, arb_Arab, ars_Arab, ary_Arab, arz_Arab, asm_Beng, ast_Latn, awa_Deva, ayr_Latn, azb_Arab, azj_Latn, bak_Cyrl, bam_Latn, ban_Latn,bel_Cyrl, bem_Latn, ben_Beng, bho_Deva, bjn_Arab, bjn_Latn, bod_Tibt, bos_Latn, bug_Latn, bul_Cyrl, cat_Latn, ceb_Latn, ces_Latn, cjk_Latn, ckb_Arab, crh_Latn, cym_Latn, dan_Latn, deu_Latn, dik_Latn, dyu_Latn, dzo_Tibt, ell_Grek, eng_Latn, epo_Latn, est_Latn, eus_Latn, ewe_Latn, fao_Latn, pes_Arab, fij_Latn, fin_Latn, fon_Latn, fra_Latn, fur_Latn, fuv_Latn, gla_Latn, gle_Latn, glg_Latn, grn_Latn, guj_Gujr, hat_Latn, hau_Latn, heb_Hebr, hin_Deva, hne_Deva, hrv_Latn, hun_Latn, hye_Armn, ibo_Latn, ilo_Latn, ind_Latn, isl_Latn, ita_Latn, jav_Latn, jpn_Jpan, kab_Latn, kac_Latn, kam_Latn, kan_Knda, kas_Arab, kas_Deva, kat_Geor, knc_Arab, knc_Latn, kaz_Cyrl, kbp_Latn, kea_Latn, khm_Khmr, kik_Latn, kin_Latn, kir_Cyrl, kmb_Latn, kon_Latn, kor_Hang, kmr_Latn, lao_Laoo, lvs_Latn, lij_Latn, lim_Latn, lin_Latn, lit_Latn, lmo_Latn, ltg_Latn, ltz_Latn, lua_Latn, lug_Latn, luo_Latn, lus_Latn, mag_Deva, mai_Deva, mal_Mlym, mar_Deva, min_Latn, mkd_Cyrl, plt_Latn, mlt_Latn, mni_Beng, khk_Cyrl, mos_Latn, mri_Latn, zsm_Latn, mya_Mymr, nld_Latn, nno_Latn, nob_Latn, npi_Deva, nso_Latn, nus_Latn, nya_Latn, oci_Latn, gaz_Latn, ory_Orya, pag_Latn, pan_Guru, pap_Latn, pol_Latn, por_Latn, prs_Arab, pbt_Arab, quy_Latn, ron_Latn, run_Latn, rus_Cyrl, sag_Latn, san_Deva, sat_Beng, scn_Latn, shn_Mymr, sin_Sinh, slk_Latn, slv_Latn, smo_Latn, sna_Latn, snd_Arab, som_Latn, sot_Latn, spa_Latn, als_Latn, srd_Latn, srp_Cyrl, ssw_Latn, sun_Latn, swe_Latn, swh_Latn, szl_Latn, tam_Taml, tat_Cyrl, tel_Telu, tgk_Cyrl, tgl_Latn, tha_Thai, tir_Ethi, taq_Latn, taq_Tfng, tpi_Latn, tsn_Latn, tso_Latn, tuk_Latn, tum_Latn, tur_Latn, twi_Latn, tzm_Tfng, uig_Arab, ukr_Cyrl, umb_Latn, urd_Arab, uzn_Latn, vec_Latn, vie_Latn, war_Latn, wol_Latn, xho_Latn, ydd_Hebr, yor_Latn, yue_Hant, zho_Hans, zho_Hant, zul_Latn"
tags:
- nllb
- translation
license: "cc-by-nc-4.0"
datasets:
- flores-200
metrics:
- bleu
- spbleu
- chrf++
inference: false
---
# NLLB-200
This is the model card of NLLB-200's 3.3B variant.
Here are the [metrics](https://tinyurl.com/nllb200dense3bmetrics) for that particular checkpoint.
- Information about training algorithms, parameters, fairness constraints or other applied approaches, and features. The exact training algorithm, data and the strategies to handle data imbalances for high and low resource languages that were used to train NLLB-200 is described in the paper.
- Paper or other resource for more information NLLB Team et al, No Language Left Behind: Scaling Human-Centered Machine Translation, Arxiv, 2022
- License: CC-BY-NC
- Where to send questions or comments about the model: https://github.com/facebookresearch/fairseq/issues
## Intended Use
- Primary intended uses: NLLB-200 is a machine translation model primarily intended for research in machine translation, - especially for low-resource languages. It allows for single sentence translation among 200 languages. Information on how to - use the model can be found in Fairseq code repository along with the training code and references to evaluation and training data.
- Primary intended users: Primary users are researchers and machine translation research community.
- Out-of-scope use cases: NLLB-200 is a research model and is not released for production deployment. NLLB-200 is trained on general domain text data and is not intended to be used with domain specific texts, such as medical domain or legal domain. The model is not intended to be used for document translation. The model was trained with input lengths not exceeding 512 tokens, therefore translating longer sequences might result in quality degradation. NLLB-200 translations can not be used as certified translations.
## Metrics
• Model performance measures: NLLB-200 model was evaluated using BLEU, spBLEU, and chrF++ metrics widely adopted by machine translation community. Additionally, we performed human evaluation with the XSTS protocol and measured the toxicity of the generated translations.
## Evaluation Data
- Datasets: Flores-200 dataset is described in Section 4
- Motivation: We used Flores-200 as it provides full evaluation coverage of the languages in NLLB-200
- Preprocessing: Sentence-split raw text data was preprocessed using SentencePiece. The
SentencePiece model is released along with NLLB-200.
## Training Data
• We used parallel multilingual data from a variety of sources to train the model. We provide detailed report on data selection and construction process in Section 5 in the paper. We also used monolingual data constructed from Common Crawl. We provide more details in Section 5.2.
## Ethical Considerations
• In this work, we took a reflexive approach in technological development to ensure that we prioritize human users and minimize risks that could be transferred to them. While we reflect on our ethical considerations throughout the article, here are some additional points to highlight. For one, many languages chosen for this study are low-resource languages, with a heavy emphasis on African languages. While quality translation could improve education and information access in many in these communities, such an access could also make groups with lower levels of digital literacy more vulnerable to misinformation or online scams. The latter scenarios could arise if bad actors misappropriate our work for nefarious activities, which we conceive as an example of unintended use. Regarding data acquisition, the training data used for model development were mined from various publicly available sources on the web. Although we invested heavily in data cleaning, personally identifiable information may not be entirely eliminated. Finally, although we did our best to optimize for translation quality, mistranslations produced by the model could remain. Although the odds are low, this could have adverse impact on those who rely on these translations to make important decisions (particularly when related to health and safety).
## Caveats and Recommendations
• Our model has been tested on the Wikimedia domain with limited investigation on other domains supported in NLLB-MD. In addition, the supported languages may have variations that our model is not capturing. Users should make appropriate assessments.
## Carbon Footprint Details
• The carbon dioxide (CO2e) estimate is reported in Section 8.8. | 7,630 | [
[
-0.0291900634765625,
-0.041259765625,
0.02325439453125,
0.0234222412109375,
-0.00936126708984375,
-0.01214599609375,
-0.00989532470703125,
-0.05377197265625,
-0.00511932373046875,
0.056610107421875,
-0.036224365234375,
-0.022857666015625,
-0.045867919921875,
0.02679443359375,
-0.0439453125,
0.10015869140625,
-0.004138946533203125,
0.0259246826171875,
-0.001495361328125,
-0.0307159423828125,
-0.0300140380859375,
-0.044525146484375,
-0.060333251953125,
-0.01361846923828125,
0.05474853515625,
0.02325439453125,
0.05755615234375,
0.0506591796875,
0.038787841796875,
0.0096893310546875,
-0.0264739990234375,
-0.00382232666015625,
-0.05126953125,
-0.0364990234375,
-0.0181121826171875,
-0.027313232421875,
-0.061981201171875,
0.016571044921875,
0.041534423828125,
0.072998046875,
-0.01535797119140625,
0.0406494140625,
0.006244659423828125,
0.050262451171875,
-0.024078369140625,
-0.0133514404296875,
-0.038330078125,
0.01033782958984375,
-0.027984619140625,
-0.01006317138671875,
-0.05010986328125,
-0.02679443359375,
-0.0060272216796875,
-0.050323486328125,
0.005153656005859375,
0.02471923828125,
0.07440185546875,
0.008148193359375,
-0.04254150390625,
-0.026763916015625,
-0.038421630859375,
0.07720947265625,
-0.0726318359375,
0.0296630859375,
0.044921875,
0.0037288665771484375,
0.000499725341796875,
-0.0426025390625,
-0.05059814453125,
-0.0071258544921875,
-0.00019991397857666016,
0.0105438232421875,
-0.012969970703125,
-0.0030307769775390625,
0.040252685546875,
0.02899169921875,
-0.0509033203125,
0.0120849609375,
-0.0521240234375,
-0.01450347900390625,
0.051483154296875,
0.0182952880859375,
0.017822265625,
-0.036590576171875,
-0.031768798828125,
-0.00632476806640625,
-0.0545654296875,
-0.00024044513702392578,
0.056365966796875,
0.03338623046875,
-0.026702880859375,
0.04583740234375,
-0.01122283935546875,
0.056365966796875,
-0.007389068603515625,
-0.016143798828125,
0.045013427734375,
-0.05224609375,
-0.003063201904296875,
-0.01190185546875,
0.060028076171875,
0.0406494140625,
0.0185089111328125,
-0.016082763671875,
-0.0151824951171875,
-0.0233154296875,
0.04302978515625,
-0.06109619140625,
0.017364501953125,
0.0223541259765625,
-0.05462646484375,
-0.0374755859375,
-0.0071258544921875,
-0.055511474609375,
-0.0139923095703125,
-0.0271453857421875,
0.024444580078125,
-0.027862548828125,
-0.01390838623046875,
0.0035152435302734375,
0.005420684814453125,
0.01153564453125,
0.01300048828125,
-0.051177978515625,
0.0184326171875,
0.0263214111328125,
0.057586669921875,
-0.02099609375,
-0.027557373046875,
-0.0191192626953125,
0.0034275054931640625,
-0.0259857177734375,
0.02374267578125,
-0.0032253265380859375,
-0.0269012451171875,
-0.00199127197265625,
0.0158538818359375,
0.007659912109375,
-0.03955078125,
0.062042236328125,
-0.042327880859375,
0.02471923828125,
-0.0379638671875,
-0.039276123046875,
-0.0229339599609375,
0.0142974853515625,
-0.067626953125,
0.08355712890625,
0.01096343994140625,
-0.07269287109375,
0.029388427734375,
-0.05657958984375,
-0.0374755859375,
0.017608642578125,
0.01296234130859375,
-0.0361328125,
0.00231170654296875,
-0.0009589195251464844,
0.0122528076171875,
-0.0191192626953125,
0.036865234375,
-0.0254058837890625,
-0.033935546875,
0.0237579345703125,
-0.04541015625,
0.10125732421875,
0.0380859375,
-0.02252197265625,
-0.021636962890625,
-0.050506591796875,
0.0069122314453125,
0.0182952880859375,
-0.04449462890625,
-0.00501251220703125,
-0.01346588134765625,
0.0310516357421875,
0.025909423828125,
0.017181396484375,
-0.043426513671875,
0.01064300537109375,
-0.026336669921875,
0.0082550048828125,
0.04119873046875,
0.0031909942626953125,
0.037933349609375,
-0.0294952392578125,
0.051361083984375,
-0.009002685546875,
0.037933349609375,
0.006038665771484375,
-0.03973388671875,
-0.0616455078125,
0.0154266357421875,
0.038848876953125,
0.04876708984375,
-0.0494384765625,
0.038604736328125,
-0.0246124267578125,
-0.03314208984375,
-0.057220458984375,
0.0143585205078125,
0.031951904296875,
0.03607177734375,
0.04302978515625,
-0.0236968994140625,
-0.035919189453125,
-0.059600830078125,
-0.017852783203125,
-0.0011959075927734375,
0.010711669921875,
0.0126800537109375,
0.047943115234375,
-0.031890869140625,
0.05804443359375,
-0.0156707763671875,
-0.0171051025390625,
-0.0296783447265625,
0.00754547119140625,
0.017303466796875,
0.043243408203125,
0.042755126953125,
-0.0712890625,
-0.034423828125,
-0.0015707015991210938,
-0.07379150390625,
-0.0156402587890625,
-0.01462554931640625,
-0.01180267333984375,
0.035186767578125,
0.03497314453125,
-0.0248260498046875,
0.038726806640625,
0.0609130859375,
-0.0152740478515625,
0.037200927734375,
-0.011505126953125,
0.00382232666015625,
-0.08709716796875,
0.042816162109375,
-0.01390838623046875,
-0.012359619140625,
-0.05908203125,
0.0128326416015625,
0.01215362548828125,
-0.012451171875,
-0.044464111328125,
0.066162109375,
-0.0214385986328125,
0.00470733642578125,
-0.023590087890625,
0.007579803466796875,
0.0181884765625,
0.0361328125,
-0.013946533203125,
0.051177978515625,
0.002979278564453125,
-0.039093017578125,
0.0011110305786132812,
0.026123046875,
-0.0216064453125,
0.060760498046875,
-0.046234130859375,
0.000865936279296875,
-0.0049285888671875,
0.0173492431640625,
-0.0300445556640625,
-0.00777435302734375,
0.0284423828125,
-0.04534912109375,
0.0159759521484375,
0.00685882568359375,
-0.057708740234375,
-0.03167724609375,
0.0004000663757324219,
0.030548095703125,
0.0225677490234375,
-0.0194244384765625,
0.0228118896484375,
0.03363037109375,
-0.01239013671875,
-0.052398681640625,
-0.08154296875,
0.0129241943359375,
-0.0199432373046875,
-0.03515625,
0.01415252685546875,
-0.0207977294921875,
-0.0105438232421875,
0.004901885986328125,
0.00431060791015625,
-0.01125335693359375,
0.0201416015625,
0.0097808837890625,
0.01409149169921875,
0.01019287109375,
0.00661468505859375,
0.0013608932495117188,
-0.005802154541015625,
-0.004787445068359375,
-0.0182342529296875,
0.045806884765625,
-0.0073699951171875,
-0.01012420654296875,
-0.03411865234375,
0.04412841796875,
0.025665283203125,
-0.01215362548828125,
0.08245849609375,
0.057037353515625,
-0.039306640625,
0.022705078125,
-0.045074462890625,
0.0009665489196777344,
-0.033935546875,
0.034393310546875,
-0.0016717910766601562,
-0.04364013671875,
0.03387451171875,
0.0198974609375,
0.0137176513671875,
0.03912353515625,
0.041046142578125,
-0.03143310546875,
0.06976318359375,
0.05572509765625,
0.00021660327911376953,
0.0311279296875,
-0.027435302734375,
0.021575927734375,
-0.06976318359375,
-0.0218505859375,
-0.040496826171875,
-0.015716552734375,
-0.053924560546875,
-0.029388427734375,
0.018402099609375,
0.0219268798828125,
-0.01114654541015625,
0.0477294921875,
-0.011383056640625,
0.0171051025390625,
0.03216552734375,
0.0002779960632324219,
0.034759521484375,
-0.005828857421875,
-0.0113525390625,
-0.0108795166015625,
-0.0577392578125,
-0.0634765625,
0.08624267578125,
0.0355224609375,
0.037841796875,
-0.00334930419921875,
0.055694580078125,
0.02935791015625,
0.0379638671875,
-0.043243408203125,
0.034881591796875,
-0.0103302001953125,
-0.09368896484375,
-0.01702880859375,
-0.05389404296875,
-0.08160400390625,
0.0114593505859375,
-0.01100921630859375,
-0.03717041015625,
0.01256561279296875,
0.00678253173828125,
-0.0124969482421875,
0.0205841064453125,
-0.0565185546875,
0.08843994140625,
-0.041656494140625,
-0.013397216796875,
-0.02081298828125,
-0.05645751953125,
0.006923675537109375,
-0.031768798828125,
0.041046142578125,
-0.004001617431640625,
0.005268096923828125,
0.0679931640625,
-0.022674560546875,
0.061553955078125,
-0.01453399658203125,
-0.01230621337890625,
0.0162506103515625,
-0.007183074951171875,
0.027557373046875,
-0.00661468505859375,
-0.0262908935546875,
0.040802001953125,
0.0018224716186523438,
-0.05609130859375,
-0.005847930908203125,
0.032989501953125,
-0.058685302734375,
-0.01279449462890625,
-0.0282440185546875,
-0.056243896484375,
-0.00511932373046875,
0.046173095703125,
0.04229736328125,
0.022674560546875,
-0.016510009765625,
0.02239990234375,
0.046234130859375,
-0.04473876953125,
0.0236663818359375,
0.05072021484375,
-0.0187835693359375,
-0.0271453857421875,
0.068359375,
0.0259857177734375,
0.046112060546875,
0.0105133056640625,
0.00020563602447509766,
-0.01190185546875,
-0.0419921875,
-0.041168212890625,
0.0232086181640625,
-0.06512451171875,
-0.0168304443359375,
-0.057342529296875,
-0.0070648193359375,
-0.02679443359375,
-0.01430511474609375,
-0.0272674560546875,
-0.0227813720703125,
-0.0265655517578125,
-0.0107574462890625,
0.0020999908447265625,
0.049713134765625,
0.003448486328125,
0.0340576171875,
-0.0535888671875,
0.0180206298828125,
-0.019805908203125,
0.02020263671875,
0.00205230712890625,
-0.056732177734375,
-0.045562744140625,
0.0218963623046875,
-0.033935546875,
-0.060455322265625,
0.0261993408203125,
-0.0117950439453125,
0.052490234375,
0.01027679443359375,
-0.0010347366333007812,
0.045867919921875,
-0.036956787109375,
0.05419921875,
0.0103607177734375,
-0.072998046875,
0.023193359375,
-0.031494140625,
0.0360107421875,
0.07049560546875,
0.051300048828125,
-0.06597900390625,
-0.03570556640625,
-0.057769775390625,
-0.075927734375,
0.0531005859375,
0.023681640625,
0.017547607421875,
0.0011186599731445312,
0.0231170654296875,
0.00982666015625,
0.022186279296875,
-0.10308837890625,
-0.0151519775390625,
-0.00994110107421875,
-0.0160980224609375,
0.0097808837890625,
-0.00762176513671875,
-0.00580596923828125,
-0.01247406005859375,
0.0609130859375,
-0.0000871419906616211,
0.0145721435546875,
-0.005046844482421875,
-0.03265380859375,
-0.01288604736328125,
0.0168304443359375,
0.0263519287109375,
0.045562744140625,
-0.0102691650390625,
-0.0223846435546875,
0.03143310546875,
-0.04339599609375,
0.006927490234375,
0.007320404052734375,
-0.03668212890625,
-0.0099029541015625,
0.0279541015625,
0.0528564453125,
0.0018520355224609375,
-0.04669189453125,
0.0428466796875,
0.0034275054931640625,
-0.00876617431640625,
-0.026458740234375,
-0.023651123046875,
0.01275634765625,
0.0007853507995605469,
0.0244140625,
0.0229949951171875,
0.0186309814453125,
-0.036224365234375,
0.007904052734375,
0.0170135498046875,
-0.027130126953125,
-0.0222015380859375,
0.05303955078125,
0.026885986328125,
-0.0137939453125,
0.052490234375,
-0.03570556640625,
-0.0228118896484375,
0.033721923828125,
0.0269622802734375,
0.04248046875,
-0.007747650146484375,
0.0176849365234375,
0.04815673828125,
0.052520751953125,
-0.01184844970703125,
0.016387939453125,
0.0100860595703125,
-0.04425048828125,
-0.036712646484375,
-0.05950927734375,
-0.0185699462890625,
0.005855560302734375,
-0.07391357421875,
0.02728271484375,
-0.0158233642578125,
-0.0256500244140625,
-0.019195556640625,
0.01409149169921875,
-0.062042236328125,
0.0161895751953125,
0.0162811279296875,
0.07537841796875,
-0.0716552734375,
0.07489013671875,
0.0205841064453125,
-0.057891845703125,
-0.050445556640625,
0.0126800537109375,
-0.003238677978515625,
-0.04571533203125,
0.035980224609375,
0.021697998046875,
0.01324462890625,
-0.004795074462890625,
-0.040679931640625,
-0.06280517578125,
0.080078125,
0.031341552734375,
-0.040435791015625,
-0.01641845703125,
0.032470703125,
0.050262451171875,
-0.0030364990234375,
-0.00302886962890625,
0.0229339599609375,
0.04132080078125,
-0.0145721435546875,
-0.07611083984375,
0.007904052734375,
-0.0133819580078125,
0.0011205673217773438,
0.00038552284240722656,
-0.04638671875,
0.052276611328125,
-0.01287078857421875,
-0.014984130859375,
0.02496337890625,
0.035980224609375,
0.01084136962890625,
0.01422119140625,
0.027374267578125,
0.050201416015625,
0.058807373046875,
-0.0006928443908691406,
0.10040283203125,
-0.013092041015625,
0.038330078125,
0.0789794921875,
-0.01922607421875,
0.05657958984375,
0.046722412109375,
-0.01113128662109375,
0.0177001953125,
0.035064697265625,
-0.01146697998046875,
0.033721923828125,
0.007213592529296875,
0.0117645263671875,
0.00733184814453125,
-0.0257720947265625,
-0.02789306640625,
0.0197906494140625,
0.0118255615234375,
-0.032318115234375,
0.0005025863647460938,
0.01538848876953125,
0.0321044921875,
-0.00665283203125,
-0.00926971435546875,
0.042572021484375,
0.01305389404296875,
-0.053131103515625,
0.04486083984375,
0.0226593017578125,
0.049041748046875,
-0.0438232421875,
0.01552581787109375,
-0.02996826171875,
0.0112457275390625,
-0.0207977294921875,
-0.0528564453125,
0.04473876953125,
0.0167999267578125,
-0.0156402587890625,
-0.04180908203125,
0.0168609619140625,
-0.023223876953125,
-0.051971435546875,
0.038604736328125,
0.0286712646484375,
0.01288604736328125,
0.00444793701171875,
-0.0665283203125,
0.02081298828125,
0.00997161865234375,
-0.017364501953125,
0.0226593017578125,
0.0229949951171875,
-0.01255035400390625,
0.04150390625,
0.04327392578125,
0.01702880859375,
0.00794219970703125,
0.01007080078125,
0.045989990234375,
-0.04827880859375,
-0.0202178955078125,
-0.03399658203125,
0.041229248046875,
-0.015228271484375,
-0.03155517578125,
0.072265625,
0.05499267578125,
0.0985107421875,
0.007320404052734375,
0.05340576171875,
-0.0219268798828125,
0.037353515625,
-0.0289154052734375,
0.062255859375,
-0.05853271484375,
0.00823211669921875,
-0.0284423828125,
-0.0677490234375,
-0.01534271240234375,
0.041900634765625,
-0.01491546630859375,
0.01398468017578125,
0.05609130859375,
0.0499267578125,
0.0136871337890625,
-0.003421783447265625,
0.0145111083984375,
0.00818634033203125,
0.023681640625,
0.0186767578125,
0.035552978515625,
-0.0643310546875,
0.06219482421875,
-0.0185089111328125,
-0.00872039794921875,
-0.0212249755859375,
-0.0655517578125,
-0.0555419921875,
-0.037841796875,
-0.0151214599609375,
-0.032196044921875,
-0.00989532470703125,
0.057342529296875,
0.046112060546875,
-0.05621337890625,
-0.035186767578125,
0.0043792724609375,
-0.0190887451171875,
-0.0249481201171875,
-0.0183868408203125,
-0.003536224365234375,
-0.01285552978515625,
-0.056884765625,
0.01190948486328125,
0.0011415481567382812,
0.00562286376953125,
-0.0277099609375,
-0.0267486572265625,
-0.0374755859375,
0.0005130767822265625,
0.03662109375,
0.014984130859375,
-0.0478515625,
-0.006015777587890625,
0.017425537109375,
-0.038665771484375,
-0.004558563232421875,
0.03717041015625,
-0.0179290771484375,
0.03436279296875,
0.0248870849609375,
0.0401611328125,
0.04376220703125,
-0.01104736328125,
0.03680419921875,
-0.057159423828125,
0.0208892822265625,
0.0270843505859375,
0.031402587890625,
0.038177490234375,
-0.0299072265625,
0.048583984375,
0.016998291015625,
-0.041656494140625,
-0.07403564453125,
0.006862640380859375,
-0.074462890625,
-0.016448974609375,
0.0987548828125,
-0.00783538818359375,
-0.00458526611328125,
-0.01214599609375,
-0.01255035400390625,
0.02728271484375,
-0.00942230224609375,
0.047393798828125,
0.0699462890625,
0.01934814453125,
0.003528594970703125,
-0.08514404296875,
0.0194244384765625,
0.0321044921875,
-0.0675048828125,
-0.003528594970703125,
0.016021728515625,
0.02581787109375,
0.0181884765625,
0.0540771484375,
-0.03826904296875,
0.02191162109375,
-0.00913238525390625,
0.0279693603515625,
0.019927978515625,
-0.0162506103515625,
-0.0256500244140625,
-0.0130615234375,
0.00806427001953125,
0.0192108154296875
]
] |
facebook/wav2vec2-base-100k-voxpopuli | 2021-11-05T12:46:12.000Z | [
"transformers",
"pytorch",
"wav2vec2",
"pretraining",
"audio",
"automatic-speech-recognition",
"voxpopuli",
"multilingual",
"arxiv:2101.00390",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/wav2vec2-base-100k-voxpopuli | 1 | 15,104 | transformers | 2022-03-02T23:29:05 | ---
language: multilingual
tags:
- audio
- automatic-speech-recognition
- voxpopuli
license: cc-by-nc-4.0
---
# Wav2Vec2-Base-VoxPopuli
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) base model pretrained on the 100k unlabeled subset of [VoxPopuli corpus](https://arxiv.org/abs/2101.00390).
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more in-detail explanation of how to fine-tune the model.
**Paper**: *[VoxPopuli: A Large-Scale Multilingual Speech Corpus for Representation
Learning, Semi-Supervised Learning and Interpretation](https://arxiv.org/abs/2101.00390)*
**Authors**: *Changhan Wang, Morgane Riviere, Ann Lee, Anne Wu, Chaitanya Talnikar, Daniel Haziza, Mary Williamson, Juan Pino, Emmanuel Dupoux* from *Facebook AI*
See the official website for more information, [here](https://github.com/facebookresearch/voxpopuli/)
# Fine-Tuning
Please refer to [this blog](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) on how to fine-tune this model on a specific language. Note that you should replace `"facebook/wav2vec2-large-xlsr-53"` with this checkpoint for fine-tuning.
| 1,419 | [
[
-0.01172637939453125,
-0.057098388671875,
-0.0014772415161132812,
0.01009368896484375,
-0.0184783935546875,
-0.002063751220703125,
-0.03729248046875,
-0.04144287109375,
0.0033473968505859375,
0.0268402099609375,
-0.041839599609375,
-0.04559326171875,
-0.03668212890625,
-0.016204833984375,
-0.01412200927734375,
0.06549072265625,
0.026397705078125,
0.0255584716796875,
0.009002685546875,
-0.0020351409912109375,
-0.038604736328125,
-0.034149169921875,
-0.05706787109375,
-0.0142974853515625,
0.0121307373046875,
0.024200439453125,
0.02850341796875,
0.05194091796875,
0.005970001220703125,
0.0193328857421875,
-0.038787841796875,
0.01389312744140625,
-0.06500244140625,
-0.025604248046875,
-0.0170745849609375,
-0.02020263671875,
-0.035247802734375,
-0.002330780029296875,
0.06268310546875,
0.047943115234375,
-0.01132965087890625,
0.0233001708984375,
0.0056915283203125,
0.020965576171875,
-0.0232391357421875,
0.0217132568359375,
-0.0648193359375,
-0.019805908203125,
-0.006801605224609375,
0.0003821849822998047,
-0.0443115234375,
-0.0051727294921875,
-0.0074005126953125,
-0.0294189453125,
0.0170745849609375,
-0.006961822509765625,
0.07232666015625,
0.0196533203125,
-0.042205810546875,
-0.01245880126953125,
-0.060638427734375,
0.06524658203125,
-0.0302581787109375,
0.046051025390625,
0.0404052734375,
0.01016998291015625,
-0.01001739501953125,
-0.06463623046875,
-0.0237884521484375,
-0.005893707275390625,
0.0231475830078125,
0.0162200927734375,
-0.022491455078125,
-0.002849578857421875,
0.0196990966796875,
0.0122528076171875,
-0.042877197265625,
0.0267181396484375,
-0.0670166015625,
-0.0513916015625,
0.03936767578125,
-0.0235443115234375,
0.006732940673828125,
-0.01409149169921875,
-0.029876708984375,
-0.017486572265625,
-0.0362548828125,
0.0438232421875,
0.022979736328125,
0.0526123046875,
-0.038909912109375,
0.041900634765625,
0.00897979736328125,
0.052032470703125,
0.0001964569091796875,
-0.0146942138671875,
0.06231689453125,
-0.0361328125,
0.002864837646484375,
0.0247955322265625,
0.055694580078125,
0.0007200241088867188,
0.02642822265625,
0.013153076171875,
-0.01197052001953125,
0.016265869140625,
0.00955963134765625,
-0.057373046875,
-0.0220489501953125,
0.0300445556640625,
-0.03759765625,
-0.008941650390625,
0.0177764892578125,
-0.006999969482421875,
0.0159912109375,
-0.042938232421875,
0.031829833984375,
-0.041900634765625,
-0.03857421875,
-0.005809783935546875,
-0.00799560546875,
-0.0016231536865234375,
-0.005367279052734375,
-0.06732177734375,
0.011138916015625,
0.05889892578125,
0.0634765625,
-0.0008835792541503906,
-0.005191802978515625,
-0.05364990234375,
0.004276275634765625,
-0.0279388427734375,
0.062225341796875,
-0.033660888671875,
-0.048309326171875,
0.0009570121765136719,
0.00835418701171875,
0.01461029052734375,
-0.039306640625,
0.059326171875,
-0.0005130767822265625,
0.01258087158203125,
-0.0194549560546875,
-0.0546875,
-0.00928497314453125,
-0.0137939453125,
-0.041656494140625,
0.08013916015625,
0.01354217529296875,
-0.035308837890625,
0.021148681640625,
-0.035125732421875,
-0.03863525390625,
0.00382232666015625,
-0.009796142578125,
-0.0250091552734375,
0.0052032470703125,
0.005001068115234375,
0.0270843505859375,
0.0182037353515625,
-0.0030384063720703125,
-0.010711669921875,
-0.042236328125,
-0.0045623779296875,
-0.0090179443359375,
0.072265625,
0.0234527587890625,
-0.0007824897766113281,
0.0236358642578125,
-0.089111328125,
0.0194091796875,
-0.01116943359375,
-0.048187255859375,
-0.0012912750244140625,
-0.003910064697265625,
0.05035400390625,
0.009246826171875,
0.0260772705078125,
-0.043853759765625,
0.00301361083984375,
-0.0509033203125,
0.058868408203125,
0.045928955078125,
-0.01123046875,
0.027496337890625,
-0.00891876220703125,
0.01300048828125,
-0.00183868408203125,
-0.00044465065002441406,
-0.01003265380859375,
-0.041015625,
-0.0235595703125,
-0.043212890625,
0.034759521484375,
0.047454833984375,
-0.0166778564453125,
0.05029296875,
-0.0076446533203125,
-0.048614501953125,
-0.053741455078125,
0.0094146728515625,
0.035980224609375,
0.0330810546875,
0.0517578125,
-0.00797271728515625,
-0.0654296875,
-0.06610107421875,
-0.01010894775390625,
-0.01959228515625,
-0.01070404052734375,
0.0252227783203125,
0.01461029052734375,
-0.032745361328125,
0.061859130859375,
-0.00208282470703125,
-0.026092529296875,
-0.00726318359375,
0.0213165283203125,
0.017120361328125,
0.051788330078125,
0.04388427734375,
-0.06951904296875,
-0.0243682861328125,
-0.0035266876220703125,
-0.0017118453979492188,
-0.01222991943359375,
0.0188446044921875,
0.004711151123046875,
0.01535797119140625,
0.0391845703125,
-0.03582763671875,
0.007537841796875,
0.049224853515625,
-0.0228271484375,
0.03131103515625,
0.00238037109375,
-0.01763916015625,
-0.1082763671875,
-0.004520416259765625,
0.0079345703125,
-0.031524658203125,
-0.040802001953125,
-0.03851318359375,
0.01338958740234375,
-0.003452301025390625,
-0.054840087890625,
0.034698486328125,
-0.0257568359375,
-0.002292633056640625,
-0.0285491943359375,
-0.008575439453125,
-0.0246124267578125,
0.026214599609375,
0.01012420654296875,
0.0546875,
0.047576904296875,
-0.050445556640625,
0.01007080078125,
0.0374755859375,
-0.0222320556640625,
0.017486572265625,
-0.0567626953125,
0.0184478759765625,
0.00004553794860839844,
0.031585693359375,
-0.08349609375,
-0.014923095703125,
0.00823211669921875,
-0.06744384765625,
0.031005859375,
-0.02166748046875,
-0.026214599609375,
-0.0214691162109375,
0.0052947998046875,
0.035186767578125,
0.05889892578125,
-0.036651611328125,
0.0391845703125,
0.05670166015625,
-0.0020427703857421875,
-0.0380859375,
-0.055572509765625,
-0.013458251953125,
0.01061248779296875,
-0.02996826171875,
0.04266357421875,
-0.00336456298828125,
0.007282257080078125,
-0.00322723388671875,
-0.0079193115234375,
-0.005023956298828125,
-0.0159149169921875,
0.040283203125,
0.00844573974609375,
-0.01166534423828125,
0.024749755859375,
0.006793975830078125,
-0.019256591796875,
0.00270843505859375,
-0.033782958984375,
0.04559326171875,
-0.009857177734375,
-0.0013799667358398438,
-0.053314208984375,
0.0099029541015625,
0.023468017578125,
-0.0243072509765625,
0.027099609375,
0.08074951171875,
-0.0426025390625,
-0.01922607421875,
-0.038665771484375,
-0.01343536376953125,
-0.033843994140625,
0.057281494140625,
-0.03533935546875,
-0.0850830078125,
0.0205535888671875,
-0.01197052001953125,
-0.0013532638549804688,
0.054290771484375,
0.07940673828125,
-0.00736236572265625,
0.08343505859375,
0.049346923828125,
-0.014068603515625,
0.048187255859375,
-0.0282135009765625,
0.00021660327911376953,
-0.041290283203125,
-0.0308837890625,
-0.058013916015625,
-0.01422119140625,
-0.053741455078125,
-0.029541015625,
0.0100250244140625,
0.00025844573974609375,
-0.0104827880859375,
0.044647216796875,
-0.049591064453125,
0.0225830078125,
0.043365478515625,
0.002910614013671875,
-0.003292083740234375,
0.02587890625,
-0.017486572265625,
0.0036182403564453125,
-0.042327880859375,
-0.039764404296875,
0.061920166015625,
0.039215087890625,
0.03271484375,
-0.00001710653305053711,
0.0362548828125,
0.01316070556640625,
-0.024383544921875,
-0.0699462890625,
0.0184783935546875,
-0.019500732421875,
-0.053985595703125,
-0.01328277587890625,
-0.03656005859375,
-0.07232666015625,
0.0091552734375,
-0.029327392578125,
-0.06475830078125,
0.0131072998046875,
0.0231475830078125,
-0.014556884765625,
0.0028438568115234375,
-0.056671142578125,
0.06640625,
-0.00788116455078125,
0.00042319297790527344,
-0.032623291015625,
-0.038055419921875,
0.01027679443359375,
-0.0005645751953125,
0.03271484375,
-0.0114288330078125,
0.007389068603515625,
0.06396484375,
-0.0257110595703125,
0.064208984375,
-0.0301971435546875,
0.0013456344604492188,
0.0369873046875,
-0.020721435546875,
0.0243072509765625,
0.008392333984375,
0.00021541118621826172,
0.036346435546875,
0.031829833984375,
-0.04229736328125,
-0.0026340484619140625,
0.046142578125,
-0.0816650390625,
0.0020923614501953125,
-0.004619598388671875,
-0.02691650390625,
-0.0271759033203125,
0.0178070068359375,
0.053192138671875,
0.0379638671875,
-0.034942626953125,
0.04071044921875,
0.041717529296875,
0.004199981689453125,
0.0170135498046875,
0.030364990234375,
0.00015044212341308594,
-0.026031494140625,
0.06744384765625,
0.0185394287109375,
-0.0089111328125,
0.0276947021484375,
0.0201416015625,
-0.032562255859375,
-0.043731689453125,
-0.01390838623046875,
0.0170745849609375,
-0.033294677734375,
0.00321197509765625,
-0.0528564453125,
-0.0273895263671875,
-0.058837890625,
0.0242919921875,
-0.059906005859375,
-0.03985595703125,
-0.04815673828125,
-0.02215576171875,
0.028900146484375,
0.0411376953125,
-0.0310211181640625,
0.0184173583984375,
-0.05535888671875,
0.0322265625,
0.001186370849609375,
0.0081939697265625,
-0.025482177734375,
-0.07373046875,
-0.03167724609375,
0.03131103515625,
0.00521087646484375,
-0.044219970703125,
0.007251739501953125,
0.022216796875,
0.0379638671875,
0.03277587890625,
-0.0167388916015625,
0.053131103515625,
-0.04388427734375,
0.055633544921875,
0.02642822265625,
-0.07110595703125,
0.050506591796875,
-0.0245819091796875,
0.0198974609375,
0.04443359375,
0.0208740234375,
-0.0216522216796875,
-0.0037746429443359375,
-0.03704833984375,
-0.07550048828125,
0.042022705078125,
0.01134490966796875,
0.036041259765625,
0.00237274169921875,
0.0316162109375,
0.0086212158203125,
0.0016069412231445312,
-0.05181884765625,
-0.00981903076171875,
-0.03240966796875,
-0.025421142578125,
-0.00946044921875,
-0.043487548828125,
0.0014963150024414062,
-0.040130615234375,
0.06671142578125,
0.0065460205078125,
0.005401611328125,
-0.00057220458984375,
-0.010345458984375,
-0.01483154296875,
0.00545501708984375,
0.03887939453125,
0.047119140625,
-0.0196075439453125,
-0.0162506103515625,
0.007465362548828125,
-0.047454833984375,
-0.0031681060791015625,
0.005809783935546875,
-0.00991058349609375,
0.0062103271484375,
0.0345458984375,
0.0875244140625,
0.0157318115234375,
-0.0489501953125,
0.032684326171875,
-0.00440216064453125,
-0.0443115234375,
-0.052642822265625,
0.0188446044921875,
0.017181396484375,
0.01207733154296875,
0.03271484375,
0.00545501708984375,
-0.0006346702575683594,
-0.037200927734375,
0.023193359375,
0.02398681640625,
-0.062744140625,
-0.0309600830078125,
0.04815673828125,
0.0191192626953125,
-0.043304443359375,
0.04254150390625,
-0.01389312744140625,
-0.02044677734375,
0.038482666015625,
0.046295166015625,
0.044921875,
-0.02960205078125,
-0.004497528076171875,
0.05224609375,
0.0158843994140625,
-0.008209228515625,
0.05029296875,
-0.0025043487548828125,
-0.0513916015625,
-0.03753662109375,
-0.045867919921875,
-0.0149078369140625,
0.0247039794921875,
-0.050628662109375,
0.0361328125,
-0.0355224609375,
-0.030059814453125,
0.021636962890625,
0.0034236907958984375,
-0.05072021484375,
0.019805908203125,
0.026336669921875,
0.07025146484375,
-0.059326171875,
0.0750732421875,
0.06256103515625,
-0.01806640625,
-0.08221435546875,
-0.0187225341796875,
0.00920867919921875,
-0.05023193359375,
0.0285491943359375,
0.01470947265625,
-0.010528564453125,
0.0250091552734375,
-0.0574951171875,
-0.070556640625,
0.058380126953125,
0.0192413330078125,
-0.06695556640625,
0.012481689453125,
-0.008331298828125,
0.0408935546875,
-0.0301513671875,
0.0022983551025390625,
0.025146484375,
0.032745361328125,
0.018218994140625,
-0.086181640625,
-0.0201263427734375,
-0.0267791748046875,
-0.00508880615234375,
-0.024169921875,
-0.0523681640625,
0.0662841796875,
-0.01198577880859375,
-0.01517486572265625,
0.005382537841796875,
0.060546875,
0.01279449462890625,
0.006099700927734375,
0.049072265625,
0.043212890625,
0.0736083984375,
-0.008148193359375,
0.0491943359375,
-0.0265655517578125,
0.03204345703125,
0.0853271484375,
-0.00162506103515625,
0.0853271484375,
0.0200958251953125,
-0.01224517822265625,
0.03094482421875,
0.061004638671875,
-0.0301361083984375,
0.038482666015625,
0.01194000244140625,
0.01219940185546875,
-0.016998291015625,
-0.013214111328125,
-0.04833984375,
0.0692138671875,
0.03228759765625,
-0.0183258056640625,
0.020843505859375,
-0.00012218952178955078,
-0.0037899017333984375,
0.004222869873046875,
-0.0272979736328125,
0.0679931640625,
0.0099029541015625,
-0.025177001953125,
0.0594482421875,
0.01364898681640625,
0.039306640625,
-0.042022705078125,
0.00743865966796875,
0.0241546630859375,
0.0162506103515625,
-0.0254974365234375,
-0.025360107421875,
0.010467529296875,
0.0037689208984375,
-0.0284576416015625,
0.0029315948486328125,
0.047943115234375,
-0.043670654296875,
-0.05841064453125,
0.03509521484375,
0.028289794921875,
0.0191192626953125,
-0.0073089599609375,
-0.06640625,
0.0182952880859375,
0.00447845458984375,
-0.0211181640625,
0.0018711090087890625,
0.0174407958984375,
0.005001068115234375,
0.02215576171875,
0.067138671875,
0.0187835693359375,
-0.01485443115234375,
0.032012939453125,
0.04638671875,
-0.042449951171875,
-0.06488037109375,
-0.04791259765625,
0.042633056640625,
0.00045609474182128906,
-0.026824951171875,
0.033294677734375,
0.0609130859375,
0.0877685546875,
0.0014772415161132812,
0.06329345703125,
0.011749267578125,
0.07354736328125,
-0.04315185546875,
0.051513671875,
-0.043426513671875,
-0.00689697265625,
-0.005031585693359375,
-0.07373046875,
-0.0039520263671875,
0.057403564453125,
0.00724029541015625,
0.01358795166015625,
0.03759765625,
0.06524658203125,
-0.01386260986328125,
-0.01015472412109375,
0.0102081298828125,
0.051605224609375,
0.00960540771484375,
0.016754150390625,
0.03985595703125,
-0.0274200439453125,
0.056488037109375,
-0.01380157470703125,
-0.023681640625,
-0.0191497802734375,
-0.042724609375,
-0.07635498046875,
-0.061065673828125,
-0.033355712890625,
-0.040618896484375,
0.004497528076171875,
0.08905029296875,
0.08013916015625,
-0.08221435546875,
-0.025390625,
0.00836944580078125,
-0.0160675048828125,
-0.0172576904296875,
-0.00897979736328125,
0.0291595458984375,
-0.0101776123046875,
-0.0633544921875,
0.047943115234375,
0.00682830810546875,
0.01136016845703125,
0.0012187957763671875,
-0.0057830810546875,
-0.007198333740234375,
-0.0037078857421875,
0.038055419921875,
0.0281219482421875,
-0.06884765625,
-0.019622802734375,
-0.021881103515625,
-0.00855255126953125,
0.0142974853515625,
0.0455322265625,
-0.04510498046875,
0.027191162109375,
0.034881591796875,
0.02154541015625,
0.053131103515625,
0.0170135498046875,
0.0223388671875,
-0.059295654296875,
0.037811279296875,
0.01166534423828125,
0.029876708984375,
0.031585693359375,
-0.0201416015625,
0.01873779296875,
0.024169921875,
-0.0266571044921875,
-0.0633544921875,
0.01546478271484375,
-0.1082763671875,
-0.033203125,
0.10565185546875,
0.020904541015625,
-0.0213623046875,
0.0030460357666015625,
-0.04205322265625,
0.0498046875,
-0.043365478515625,
0.0374755859375,
0.04449462890625,
0.01690673828125,
-0.0033855438232421875,
-0.0282440185546875,
0.04449462890625,
0.0107879638671875,
-0.01357269287109375,
-0.0092010498046875,
0.037078857421875,
0.0287933349609375,
0.00743865966796875,
0.043670654296875,
-0.00853729248046875,
0.0183563232421875,
-0.00592041015625,
0.0126953125,
-0.025604248046875,
-0.0179290771484375,
-0.025604248046875,
0.0086822509765625,
-0.0092620849609375,
-0.0214691162109375
]
] |
pfnet/plamo-13b | 2023-10-10T06:24:54.000Z | [
"transformers",
"safetensors",
"plamo",
"text-generation",
"custom_code",
"en",
"ja",
"license:apache-2.0",
"region:us"
] | text-generation | pfnet | null | null | pfnet/plamo-13b | 72 | 15,102 | transformers | 2023-09-25T12:47:05 | ---
language:
- en
- ja
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
---
# PLaMo-13B
## Model Description
PLaMo-13B is a LLaMA-based 13B model pre-trained on English and Japanese open datasets, developed by Preferred Networks, Inc.
PLaMo-13B is released under Apache v2.0 license.
[PLaMo-13B Release blog (Japanese)](https://tech.preferred.jp/ja/blog/llm-plamo/)
## Usage
### Requirements
- numpy
- sentencepiece
- torch
- transformers
### Use a pipeline as a high-level helper
```python
import transformers
pipeline = transformers.pipeline("text-generation", model="pfnet/plamo-13b", trust_remote_code=True)
print(pipeline("The future of artificial intelligence technology is ", max_new_tokens=32))
```
### Load model directly
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("pfnet/plamo-13b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("pfnet/plamo-13b", trust_remote_code=True)
text = "これからの人工知能技術は"
input_ids = tokenizer(text, return_tensors="pt").input_ids
generated_tokens = model.generate(
inputs=input_ids,
max_new_tokens=32,
do_sample=True,
top_k=50,
top_p=0.95,
temperature=1.0,
)[0]
generated_text = tokenizer.decode(generated_tokens)
print(generated_text)
```
## Model Details
- Model size: 13B
- Trained tokens: 1.5T tokens (English: 1.32T tokens, Japanese: 0.18T tokens)
- Context length: 4096
- Developed by: Preferred Networks, Inc
- Model type: Causal decoder-only
- Language(s): English, Japanese
- License: Apache v2.0
## Training Dataset
### English
- C4 - English
- Project Gutenberg
- RedPajama - Arxiv
- RedPajama - CommonCrawl - English
- RedPajama - Github
- RedPajama - StackExchange
- RedPajama - Wikipedia
### Japanese
- mC4 - Japanese
- Wikipedia - Japanese
## Tokenizer
PLaMo-13B uses sentencepiece tokenizer which is trained on a subset of the datasets for model pre-training.
## Bias, Risks, and Limitations
PLaMo-13B is a new technology that carries risks with use. Testing conducted to date has been in English and Japanese, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, PLaMo-13B’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of PLaMo-13B, developers should perform safety testing and tuning tailored to their specific applications of the model.
## How to cite
```tex
@online{PLaMo2023Introducing,
author = {Preferred Networks, Inc},
title = {PLaMo-13B},
year = {2023},
url = {https://huggingface.co/pfnet/plamo-13b},
urldate = {2023-09-28}
}
```
## Citations
```tex
@article{touvron2023llama,
title={LLaMA: Open and Efficient Foundation Language Models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
```
| 3,274 | [
[
-0.02874755859375,
-0.042388916015625,
0.0186309814453125,
0.04339599609375,
-0.03521728515625,
-0.0228118896484375,
-0.005977630615234375,
-0.01425933837890625,
0.00887298583984375,
0.024200439453125,
-0.04095458984375,
-0.03436279296875,
-0.038543701171875,
0.0238800048828125,
-0.01078033447265625,
0.064453125,
-0.0095367431640625,
-0.00347900390625,
-0.013336181640625,
0.006679534912109375,
-0.0279083251953125,
-0.0478515625,
-0.036956787109375,
-0.02294921875,
0.0166778564453125,
0.00827789306640625,
0.032806396484375,
0.044891357421875,
0.046356201171875,
0.023773193359375,
-0.0264892578125,
0.03631591796875,
-0.02996826171875,
-0.028656005859375,
0.00705718994140625,
-0.031341552734375,
-0.052154541015625,
-0.00152587890625,
0.02960205078125,
0.035003662109375,
0.0013246536254882812,
0.01558685302734375,
-0.006622314453125,
0.0188140869140625,
-0.027099609375,
0.01611328125,
-0.042999267578125,
-0.0147552490234375,
-0.035064697265625,
-0.00830078125,
-0.033355712890625,
-0.0170135498046875,
-0.01010894775390625,
-0.058624267578125,
-0.0127716064453125,
0.013031005859375,
0.09271240234375,
0.0242156982421875,
-0.042999267578125,
-0.0167083740234375,
-0.00473785400390625,
0.0640869140625,
-0.07147216796875,
-0.00676727294921875,
0.03936767578125,
0.005542755126953125,
-0.0206756591796875,
-0.050628662109375,
-0.021331787109375,
-0.03131103515625,
-0.010986328125,
0.0166778564453125,
-0.00251007080078125,
0.0014543533325195312,
0.01080322265625,
0.04620361328125,
-0.03192138671875,
0.023590087890625,
-0.032470703125,
-0.0232696533203125,
0.053802490234375,
0.0171661376953125,
0.02252197265625,
-0.03228759765625,
-0.034423828125,
0.0005960464477539062,
-0.05419921875,
-0.00934600830078125,
0.04705810546875,
0.013519287109375,
-0.045074462890625,
0.041656494140625,
0.00286102294921875,
0.038909912109375,
0.005645751953125,
-0.01378631591796875,
0.037261962890625,
-0.0277557373046875,
-0.0190582275390625,
-0.00614166259765625,
0.0843505859375,
0.0208587646484375,
0.00899505615234375,
0.004947662353515625,
0.002635955810546875,
0.011322021484375,
-0.0072021484375,
-0.05462646484375,
0.0013637542724609375,
0.0187835693359375,
-0.04730224609375,
-0.031280517578125,
0.024444580078125,
-0.055511474609375,
-0.03875732421875,
-0.0024776458740234375,
0.023895263671875,
-0.032196044921875,
-0.0293731689453125,
0.013519287109375,
0.0271148681640625,
0.01412200927734375,
0.0032062530517578125,
-0.030609130859375,
0.02197265625,
0.042236328125,
0.06866455078125,
-0.0218353271484375,
-0.039093017578125,
-0.005458831787109375,
-0.005756378173828125,
-0.0198516845703125,
0.04644775390625,
-0.007694244384765625,
-0.03936767578125,
-0.0243988037109375,
0.01526641845703125,
-0.015869140625,
-0.046234130859375,
0.0357666015625,
-0.01326751708984375,
0.0186004638671875,
0.005222320556640625,
-0.0352783203125,
-0.0221405029296875,
-0.0125579833984375,
-0.049560546875,
0.07379150390625,
-0.0208740234375,
-0.07000732421875,
-0.012847900390625,
-0.055206298828125,
-0.0277252197265625,
-0.0189208984375,
0.0168304443359375,
-0.044677734375,
0.01058197021484375,
0.01068878173828125,
0.027099609375,
-0.040679931640625,
0.06207275390625,
-0.022308349609375,
-0.017608642578125,
0.0374755859375,
-0.031402587890625,
0.09576416015625,
0.039276123046875,
-0.0289459228515625,
0.02093505859375,
-0.0826416015625,
-0.03656005859375,
0.0250091552734375,
-0.0202484130859375,
-0.0177154541015625,
-0.0168609619140625,
-0.0136871337890625,
0.005146026611328125,
0.05523681640625,
-0.033477783203125,
0.00562286376953125,
-0.04461669921875,
0.0389404296875,
0.066650390625,
0.0031261444091796875,
0.032012939453125,
-0.037689208984375,
0.032958984375,
0.019317626953125,
0.03533935546875,
-0.0137939453125,
-0.046875,
-0.074951171875,
-0.024078369140625,
0.0136871337890625,
0.059417724609375,
-0.043304443359375,
0.05657958984375,
0.006076812744140625,
-0.05877685546875,
-0.04779052734375,
0.0141448974609375,
0.032501220703125,
0.021759033203125,
0.0260467529296875,
-0.0160369873046875,
-0.0777587890625,
-0.06243896484375,
-0.0028514862060546875,
-0.0175018310546875,
0.027069091796875,
0.0252227783203125,
0.05462646484375,
-0.017913818359375,
0.0399169921875,
-0.01519012451171875,
-0.0217742919921875,
-0.031341552734375,
-0.0001550912857055664,
0.0374755859375,
0.0450439453125,
0.03228759765625,
-0.034698486328125,
-0.02691650390625,
-0.0032672882080078125,
-0.0699462890625,
-0.031280517578125,
-0.0180816650390625,
-0.036529541015625,
0.0134429931640625,
0.0098419189453125,
-0.03570556640625,
0.03424072265625,
0.05047607421875,
-0.03021240234375,
0.02716064453125,
-0.0097503662109375,
-0.015716552734375,
-0.09857177734375,
0.0250091552734375,
-0.0009365081787109375,
0.005435943603515625,
-0.03912353515625,
0.01068878173828125,
-0.0019388198852539062,
-0.01021575927734375,
-0.0521240234375,
0.06353759765625,
-0.03564453125,
0.0006151199340820312,
-0.0118560791015625,
-0.0171661376953125,
-0.01232147216796875,
0.033935546875,
0.00301361083984375,
0.064453125,
0.044708251953125,
-0.0533447265625,
0.02520751953125,
0.0311279296875,
-0.0360107421875,
0.01558685302734375,
-0.060546875,
0.005153656005859375,
0.01097869873046875,
0.007228851318359375,
-0.050140380859375,
-0.0167236328125,
0.0272064208984375,
-0.0321044921875,
-0.0006995201110839844,
0.00896453857421875,
-0.046051025390625,
-0.0275115966796875,
-0.01103973388671875,
0.017730712890625,
0.03594970703125,
-0.0294036865234375,
0.05194091796875,
0.028656005859375,
0.01404571533203125,
-0.032440185546875,
-0.071044921875,
-0.0196380615234375,
-0.036041259765625,
-0.061004638671875,
0.0207977294921875,
-0.018280029296875,
-0.011993408203125,
-0.01236724853515625,
-0.0076141357421875,
-0.0170440673828125,
0.0347900390625,
0.0343017578125,
0.036285400390625,
0.00139617919921875,
-0.034820556640625,
-0.0015630722045898438,
-0.016357421875,
-0.002956390380859375,
0.00983428955078125,
0.064208984375,
-0.01505279541015625,
-0.0038127899169921875,
-0.07073974609375,
0.008270263671875,
0.029296875,
-0.021087646484375,
0.08447265625,
0.0308990478515625,
-0.011322021484375,
0.01520538330078125,
-0.033050537109375,
-0.0036945343017578125,
-0.03509521484375,
0.012237548828125,
-0.0300750732421875,
-0.044403076171875,
0.061614990234375,
0.00785064697265625,
0.002544403076171875,
0.0384521484375,
0.050018310546875,
0.021453857421875,
0.05462646484375,
0.0282135009765625,
0.0144805908203125,
0.037017822265625,
-0.03802490234375,
0.002666473388671875,
-0.06341552734375,
-0.0401611328125,
-0.0300445556640625,
-0.0121307373046875,
-0.0419921875,
-0.0229339599609375,
0.017822265625,
-0.005504608154296875,
-0.0406494140625,
0.035919189453125,
-0.01493072509765625,
0.006317138671875,
0.060333251953125,
0.021453857421875,
0.0191802978515625,
-0.00540924072265625,
-0.0243072509765625,
0.005466461181640625,
-0.04168701171875,
-0.033203125,
0.0797119140625,
0.0277099609375,
0.0634765625,
0.007160186767578125,
0.06622314453125,
-0.02783203125,
0.01110076904296875,
-0.035400390625,
0.054412841796875,
-0.0197601318359375,
-0.0684814453125,
0.0117645263671875,
-0.0438232421875,
-0.07574462890625,
0.0060882568359375,
0.00640869140625,
-0.031036376953125,
0.02880859375,
0.003849029541015625,
-0.0158843994140625,
0.01480865478515625,
-0.053802490234375,
0.0712890625,
-0.045501708984375,
-0.0181884765625,
0.003879547119140625,
-0.042083740234375,
0.03759765625,
-0.004962921142578125,
0.0304107666015625,
-0.0067138671875,
-0.014007568359375,
0.0667724609375,
-0.045654296875,
0.090576171875,
-0.013336181640625,
-0.0019273757934570312,
0.0301971435546875,
0.012542724609375,
0.02642822265625,
0.0025920867919921875,
-0.01474761962890625,
0.03985595703125,
-0.01432037353515625,
-0.0183868408203125,
-0.005054473876953125,
0.050018310546875,
-0.0968017578125,
-0.046630859375,
-0.05499267578125,
-0.045013427734375,
0.0084228515625,
0.026092529296875,
0.0406494140625,
0.01551055908203125,
0.019287109375,
0.035186767578125,
0.032379150390625,
-0.055084228515625,
0.024139404296875,
0.037322998046875,
-0.0123138427734375,
-0.053680419921875,
0.06048583984375,
0.01174163818359375,
0.0090789794921875,
0.036407470703125,
0.020111083984375,
-0.015655517578125,
-0.038360595703125,
0.00608062744140625,
0.04986572265625,
-0.03289794921875,
-0.03271484375,
-0.024444580078125,
-0.0223541259765625,
-0.02996826171875,
0.0093994140625,
-0.027435302734375,
-0.01763916015625,
-0.060394287109375,
-0.0254974365234375,
0.031219482421875,
0.0224609375,
-0.02227783203125,
0.0494384765625,
-0.0307159423828125,
0.0130157470703125,
-0.009033203125,
0.0172576904296875,
0.0008292198181152344,
-0.06719970703125,
-0.00518798828125,
-0.0113983154296875,
-0.0286865234375,
-0.06884765625,
0.04931640625,
0.015716552734375,
0.061676025390625,
0.01218414306640625,
0.010040283203125,
0.06524658203125,
-0.01311492919921875,
0.059112548828125,
0.0169525146484375,
-0.07586669921875,
0.057098388671875,
-0.01342010498046875,
0.0036754608154296875,
0.0266265869140625,
0.045379638671875,
-0.00782012939453125,
-0.01282501220703125,
-0.056304931640625,
-0.074951171875,
0.0562744140625,
0.0196075439453125,
0.0029296875,
-0.01313018798828125,
0.01290130615234375,
0.021270751953125,
0.01482391357421875,
-0.08624267578125,
-0.042510986328125,
-0.03692626953125,
-0.03497314453125,
-0.0020809173583984375,
-0.03363037109375,
0.0028362274169921875,
0.0005774497985839844,
0.0670166015625,
0.019744873046875,
0.0301361083984375,
0.0064239501953125,
-0.0276031494140625,
0.01995849609375,
0.0004220008850097656,
0.030181884765625,
0.03717041015625,
-0.0235443115234375,
-0.00033473968505859375,
0.0003228187561035156,
-0.03411865234375,
0.01641845703125,
0.01422119140625,
-0.03564453125,
-0.00927734375,
0.020416259765625,
0.095458984375,
0.016510009765625,
-0.0413818359375,
0.0220947265625,
0.00962066650390625,
-0.01297760009765625,
-0.0281982421875,
-0.012725830078125,
-0.0035266876220703125,
-0.001667022705078125,
0.013916015625,
-0.0111236572265625,
-0.004093170166015625,
-0.03594970703125,
-0.0040740966796875,
-0.0005707740783691406,
0.0178070068359375,
-0.0248565673828125,
0.0604248046875,
0.0272674560546875,
-0.000012576580047607422,
0.02471923828125,
-0.01006317138671875,
-0.0166168212890625,
0.04986572265625,
0.0516357421875,
0.03912353515625,
-0.0206756591796875,
0.01233673095703125,
0.033935546875,
0.0275726318359375,
-0.0160369873046875,
0.03045654296875,
0.0306396484375,
-0.048675537109375,
-0.029998779296875,
-0.0484619140625,
-0.0058135986328125,
0.02813720703125,
-0.05279541015625,
0.030914306640625,
-0.059478759765625,
-0.0247344970703125,
-0.031829833984375,
0.02374267578125,
-0.0313720703125,
0.005584716796875,
0.007541656494140625,
0.0811767578125,
-0.061004638671875,
0.051483154296875,
0.064208984375,
-0.0546875,
-0.07110595703125,
-0.036041259765625,
0.003108978271484375,
-0.0748291015625,
0.044036865234375,
0.01157379150390625,
0.030731201171875,
0.0164031982421875,
-0.032257080078125,
-0.07952880859375,
0.09686279296875,
0.0426025390625,
-0.0400390625,
-0.0031890869140625,
-0.004322052001953125,
0.0185546875,
-0.01210784912109375,
0.0182342529296875,
0.022857666015625,
0.03533935546875,
0.0016832351684570312,
-0.08721923828125,
0.00969696044921875,
-0.0265960693359375,
0.0158843994140625,
-0.01227569580078125,
-0.07110595703125,
0.07666015625,
0.00449371337890625,
-0.0067901611328125,
0.053009033203125,
0.055206298828125,
0.051025390625,
0.0222015380859375,
0.010986328125,
0.0506591796875,
0.056884765625,
-0.0126495361328125,
0.083984375,
-0.0269012451171875,
0.026947021484375,
0.0762939453125,
-0.008148193359375,
0.05023193359375,
0.03179931640625,
-0.0176849365234375,
0.047393798828125,
0.06121826171875,
-0.01152801513671875,
0.05322265625,
-0.00850677490234375,
0.0075225830078125,
-0.01910400390625,
-0.018585205078125,
-0.035400390625,
0.019927978515625,
0.0186614990234375,
-0.0318603515625,
0.002056121826171875,
0.00614166259765625,
0.03240966796875,
-0.00408172607421875,
-0.0333251953125,
0.04766845703125,
0.02862548828125,
-0.041717529296875,
0.0479736328125,
0.013916015625,
0.07049560546875,
-0.0523681640625,
0.038909912109375,
-0.049957275390625,
0.0149993896484375,
-0.0218658447265625,
-0.0284423828125,
0.01505279541015625,
0.0218353271484375,
0.013458251953125,
-0.00027751922607421875,
0.04052734375,
0.004058837890625,
-0.04241943359375,
0.021484375,
0.0202484130859375,
0.024505615234375,
0.01453399658203125,
-0.05670166015625,
0.03497314453125,
0.011077880859375,
-0.0174560546875,
0.02667236328125,
0.0308380126953125,
0.0033779144287109375,
0.0521240234375,
0.053863525390625,
0.0163726806640625,
0.03387451171875,
0.0036373138427734375,
0.049285888671875,
-0.031463623046875,
-0.043670654296875,
-0.06304931640625,
0.045013427734375,
-0.0009260177612304688,
-0.033172607421875,
0.07037353515625,
0.03826904296875,
0.05694580078125,
-0.004444122314453125,
0.049774169921875,
-0.01485443115234375,
0.0271148681640625,
-0.046295166015625,
0.0255279541015625,
-0.0390625,
0.0250701904296875,
-0.0018568038940429688,
-0.051116943359375,
-0.028289794921875,
0.08624267578125,
-0.0408935546875,
0.0008006095886230469,
0.059967041015625,
0.06890869140625,
0.00733184814453125,
-0.022491455078125,
0.005710601806640625,
0.0494384765625,
0.0170440673828125,
0.06439208984375,
0.06512451171875,
-0.043701171875,
0.051727294921875,
-0.050811767578125,
-0.01904296875,
-0.0209503173828125,
-0.0762939453125,
-0.0867919921875,
-0.0311279296875,
-0.02471923828125,
-0.0325927734375,
-0.0037288665771484375,
0.0701904296875,
0.036468505859375,
-0.06787109375,
-0.045989990234375,
-0.0214996337890625,
-0.006114959716796875,
-0.01041412353515625,
-0.01534271240234375,
0.023223876953125,
-0.0023975372314453125,
-0.044281005859375,
0.02398681640625,
-0.00823211669921875,
0.020782470703125,
-0.01264190673828125,
-0.015869140625,
-0.02960205078125,
0.0066680908203125,
0.0133514404296875,
0.0082550048828125,
-0.05816650390625,
0.0220489501953125,
0.00814056396484375,
0.003650665283203125,
-0.004253387451171875,
0.03057861328125,
-0.0306854248046875,
0.001560211181640625,
0.035369873046875,
0.03741455078125,
0.020660400390625,
-0.003627777099609375,
0.03936767578125,
-0.035552978515625,
0.046051025390625,
-0.005290985107421875,
0.045806884765625,
0.03485107421875,
-0.0222625732421875,
0.037689208984375,
0.043243408203125,
-0.032745361328125,
-0.08172607421875,
0.003849029541015625,
-0.081787109375,
-0.023040771484375,
0.098876953125,
-0.02545166015625,
-0.039154052734375,
0.0157623291015625,
-0.0264892578125,
0.026824951171875,
-0.0306243896484375,
0.0400390625,
0.039764404296875,
0.01080322265625,
-0.017822265625,
-0.02685546875,
0.005401611328125,
0.0124359130859375,
-0.069580078125,
-0.0158843994140625,
0.02099609375,
0.0195465087890625,
0.00855255126953125,
0.0369873046875,
-0.02166748046875,
0.0157623291015625,
-0.01166534423828125,
-0.00652313232421875,
-0.0372314453125,
-0.0245513916015625,
-0.0214385986328125,
-0.004913330078125,
-0.001224517822265625,
-0.00801849365234375
]
] |
facebook/detr-resnet-50-panoptic | 2022-08-05T13:29:29.000Z | [
"transformers",
"pytorch",
"detr",
"image-segmentation",
"vision",
"dataset:coco",
"arxiv:2005.12872",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | image-segmentation | facebook | null | null | facebook/detr-resnet-50-panoptic | 96 | 15,090 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- image-segmentation
- vision
datasets:
- coco
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg
example_title: Football Match
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/dog-cat.jpg
example_title: Dog & Cat
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/construction-site.jpg
example_title: Construction Site
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/apple-orange.jpg
example_title: Apple & Orange
---
# DETR (End-to-End Object Detection) model with ResNet-50 backbone
DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 panoptic (118k annotated images). It was introduced in the paper [End-to-End Object Detection with Transformers](https://arxiv.org/abs/2005.12872) by Carion et al. and first released in [this repository](https://github.com/facebookresearch/detr).
Disclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and a MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100.
The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.
DETR can be naturally extended to perform panoptic segmentation, by adding a mask head on top of the decoder outputs.

## Intended uses & limitations
You can use the raw model for panoptic segmentation. See the [model hub](https://huggingface.co/models?search=facebook/detr) to look for all available DETR models.
### How to use
Here is how to use this model:
```python
import io
import requests
from PIL import Image
import torch
import numpy
from transformers import DetrFeatureExtractor, DetrForSegmentation
from transformers.models.detr.feature_extraction_detr import rgb_to_id
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
feature_extractor = DetrFeatureExtractor.from_pretrained("facebook/detr-resnet-50-panoptic")
model = DetrForSegmentation.from_pretrained("facebook/detr-resnet-50-panoptic")
# prepare image for the model
inputs = feature_extractor(images=image, return_tensors="pt")
# forward pass
outputs = model(**inputs)
# use the `post_process_panoptic` method of `DetrFeatureExtractor` to convert to COCO format
processed_sizes = torch.as_tensor(inputs["pixel_values"].shape[-2:]).unsqueeze(0)
result = feature_extractor.post_process_panoptic(outputs, processed_sizes)[0]
# the segmentation is stored in a special-format png
panoptic_seg = Image.open(io.BytesIO(result["png_string"]))
panoptic_seg = numpy.array(panoptic_seg, dtype=numpy.uint8)
# retrieve the ids corresponding to each mask
panoptic_seg_id = rgb_to_id(panoptic_seg)
```
Currently, both the feature extractor and model support PyTorch.
## Training data
The DETR model was trained on [COCO 2017 panoptic](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively.
## Training procedure
### Preprocessing
The exact details of preprocessing of images during training/validation can be found [here](https://github.com/facebookresearch/detr/blob/master/datasets/coco_panoptic.py).
Images are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).
### Training
The model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64).
## Evaluation results
This model achieves the following results on COCO 2017 validation: a box AP (average precision) of **38.8**, a segmentation AP (average precision) of **31.1** and a PQ (panoptic quality) of **43.4**.
For more details regarding evaluation results, we refer to table 5 of the original paper.
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2005-12872,
author = {Nicolas Carion and
Francisco Massa and
Gabriel Synnaeve and
Nicolas Usunier and
Alexander Kirillov and
Sergey Zagoruyko},
title = {End-to-End Object Detection with Transformers},
journal = {CoRR},
volume = {abs/2005.12872},
year = {2020},
url = {https://arxiv.org/abs/2005.12872},
archivePrefix = {arXiv},
eprint = {2005.12872},
timestamp = {Thu, 28 May 2020 17:38:09 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2005-12872.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 5,851 | [
[
-0.07000732421875,
-0.052734375,
-0.005496978759765625,
0.0115203857421875,
-0.023529052734375,
-0.0096893310546875,
-0.00858306884765625,
-0.059234619140625,
0.021209716796875,
0.042755126953125,
-0.0479736328125,
-0.036376953125,
-0.042022705078125,
0.019744873046875,
-0.039764404296875,
0.0640869140625,
0.01154327392578125,
-0.022064208984375,
-0.01136016845703125,
-0.009796142578125,
-0.0018262863159179688,
-0.032745361328125,
-0.041229248046875,
0.004871368408203125,
0.0213470458984375,
0.0304107666015625,
0.048675537109375,
0.04736328125,
0.05584716796875,
0.0242919921875,
-0.018646240234375,
0.0089263916015625,
-0.0256195068359375,
-0.027496337890625,
-0.01451873779296875,
-0.033966064453125,
-0.019927978515625,
0.0036067962646484375,
0.016876220703125,
0.0200653076171875,
0.00897979736328125,
0.02069091796875,
-0.0095977783203125,
0.050506591796875,
-0.04986572265625,
0.0149383544921875,
-0.034881591796875,
0.02203369140625,
-0.01258087158203125,
0.004055023193359375,
-0.02215576171875,
-0.0209197998046875,
0.01340484619140625,
-0.03338623046875,
0.054107666015625,
-0.0117645263671875,
0.1055908203125,
0.0030193328857421875,
-0.00028133392333984375,
-0.01099395751953125,
-0.03369140625,
0.03643798828125,
-0.0364990234375,
0.032562255859375,
0.02740478515625,
0.04315185546875,
0.012054443359375,
-0.06182861328125,
-0.047119140625,
-0.01125335693359375,
0.0006732940673828125,
0.01300811767578125,
-0.01861572265625,
-0.0074005126953125,
0.0249176025390625,
0.03668212890625,
-0.030120849609375,
-0.00893402099609375,
-0.059783935546875,
-0.01861572265625,
0.061553955078125,
-0.00943756103515625,
0.02435302734375,
-0.01116180419921875,
-0.03265380859375,
-0.0279693603515625,
-0.0213470458984375,
0.0239715576171875,
0.02630615234375,
0.006847381591796875,
-0.033416748046875,
0.05072021484375,
-0.0157470703125,
0.055389404296875,
0.0273590087890625,
-0.007656097412109375,
0.024810791015625,
-0.0226898193359375,
-0.0185394287109375,
-0.001583099365234375,
0.072998046875,
0.03570556640625,
0.03619384765625,
0.006763458251953125,
-0.0009560585021972656,
0.01190948486328125,
0.01062774658203125,
-0.0731201171875,
-0.0285797119140625,
-0.002651214599609375,
-0.0243682861328125,
-0.021881103515625,
0.034210205078125,
-0.0650634765625,
0.0006837844848632812,
-0.009613037109375,
0.00922393798828125,
-0.037322998046875,
-0.0237884521484375,
0.0171966552734375,
-0.00254058837890625,
0.049468994140625,
0.020294189453125,
-0.04693603515625,
0.01143646240234375,
0.02655029296875,
0.076416015625,
0.00630950927734375,
-0.007904052734375,
-0.01171112060546875,
-0.022064208984375,
-0.030914306640625,
0.05755615234375,
-0.03448486328125,
-0.021026611328125,
-0.01271820068359375,
0.0364990234375,
-0.014556884765625,
-0.0433349609375,
0.041107177734375,
-0.026214599609375,
0.007091522216796875,
-0.035552978515625,
0.0012903213500976562,
-0.037933349609375,
0.0296630859375,
-0.04779052734375,
0.07275390625,
0.030426025390625,
-0.0693359375,
0.0269622802734375,
-0.038543701171875,
-0.0190277099609375,
-0.0223236083984375,
0.0063018798828125,
-0.055938720703125,
0.00008338689804077148,
0.05609130859375,
0.0292205810546875,
-0.0258026123046875,
0.0026760101318359375,
-0.041229248046875,
-0.0164642333984375,
0.02508544921875,
-0.0240020751953125,
0.072509765625,
0.01287078857421875,
-0.018402099609375,
0.0011281967163085938,
-0.0391845703125,
-0.0268707275390625,
0.028228759765625,
-0.005496978759765625,
0.01751708984375,
-0.025146484375,
0.0173492431640625,
0.042755126953125,
0.00934600830078125,
-0.0482177734375,
0.0120086669921875,
-0.0179290771484375,
0.029510498046875,
0.037750244140625,
-0.0023708343505859375,
0.0174713134765625,
-0.01558685302734375,
0.035980224609375,
0.0279998779296875,
0.0465087890625,
-0.0211944580078125,
-0.040283203125,
-0.07073974609375,
-0.029449462890625,
-0.002765655517578125,
0.02197265625,
-0.02777099609375,
0.05291748046875,
-0.0141448974609375,
-0.036224365234375,
-0.0272674560546875,
-0.00992584228515625,
0.01221466064453125,
0.045196533203125,
0.0293121337890625,
-0.0592041015625,
-0.052642822265625,
-0.07659912109375,
0.031463623046875,
0.01617431640625,
0.00502777099609375,
0.022796630859375,
0.053741455078125,
-0.0283660888671875,
0.0986328125,
-0.03411865234375,
-0.03472900390625,
-0.0238189697265625,
-0.01580810546875,
0.01148223876953125,
0.0258026123046875,
0.070068359375,
-0.06463623046875,
-0.048370361328125,
0.005908966064453125,
-0.06622314453125,
0.0009589195251464844,
0.01519775390625,
-0.00981903076171875,
0.019561767578125,
0.019317626953125,
-0.045440673828125,
0.061370849609375,
0.0121307373046875,
-0.0005335807800292969,
0.058013916015625,
0.0032672882080078125,
0.0016307830810546875,
-0.06475830078125,
0.0220947265625,
0.022430419921875,
-0.01995849609375,
-0.032379150390625,
0.0053863525390625,
0.00738525390625,
-0.0159149169921875,
-0.04779052734375,
0.01617431640625,
-0.04345703125,
-0.02239990234375,
-0.0261383056640625,
0.00724029541015625,
0.0087127685546875,
0.0469970703125,
0.02490234375,
0.037628173828125,
0.0692138671875,
-0.04754638671875,
0.0263824462890625,
0.0236968994140625,
-0.029510498046875,
0.0504150390625,
-0.0523681640625,
0.01141357421875,
-0.0218505859375,
0.0313720703125,
-0.06353759765625,
-0.0195465087890625,
0.04400634765625,
-0.03302001953125,
0.01861572265625,
-0.021728515625,
-0.019927978515625,
-0.0748291015625,
-0.0350341796875,
0.028289794921875,
0.0186614990234375,
-0.057342529296875,
0.0265960693359375,
0.01221466064453125,
0.024322509765625,
-0.058074951171875,
-0.05731201171875,
-0.00006943941116333008,
-0.01221466064453125,
-0.0496826171875,
0.0266571044921875,
0.0019130706787109375,
0.0121002197265625,
-0.00038909912109375,
-0.0207061767578125,
0.0019121170043945312,
-0.0283355712890625,
0.017364501953125,
0.0188446044921875,
-0.00768280029296875,
-0.039154052734375,
-0.006412506103515625,
-0.0126190185546875,
0.0001919269561767578,
-0.008148193359375,
0.06365966796875,
-0.0266571044921875,
-0.023040771484375,
-0.0592041015625,
-0.0032100677490234375,
0.059356689453125,
-0.043914794921875,
0.042816162109375,
0.0775146484375,
-0.034637451171875,
0.01424407958984375,
-0.058319091796875,
-0.0199127197265625,
-0.037567138671875,
0.030975341796875,
-0.03131103515625,
-0.0306854248046875,
0.07379150390625,
0.0242767333984375,
-0.0017223358154296875,
0.039642333984375,
0.028228759765625,
0.020660400390625,
0.06878662109375,
0.0406494140625,
-0.0106048583984375,
0.041748046875,
-0.07562255859375,
0.0171661376953125,
-0.08245849609375,
-0.04827880859375,
-0.0204315185546875,
-0.021728515625,
-0.03399658203125,
-0.053985595703125,
0.025360107421875,
0.03485107421875,
-0.0256805419921875,
0.06036376953125,
-0.06903076171875,
0.0205078125,
0.036529541015625,
0.041717529296875,
-0.00548553466796875,
0.007568359375,
-0.00020563602447509766,
0.01351165771484375,
-0.037017822265625,
-0.0149383544921875,
0.058624267578125,
0.031494140625,
0.045379638671875,
-0.01873779296875,
0.0284271240234375,
-0.0025005340576171875,
0.0205841064453125,
-0.049560546875,
0.05291748046875,
0.00406646728515625,
-0.05316162109375,
0.00638580322265625,
-0.0100250244140625,
-0.061370849609375,
0.0177764892578125,
-0.001739501953125,
-0.0865478515625,
0.053192138671875,
-0.0079803466796875,
-0.009063720703125,
0.03668212890625,
-0.03033447265625,
0.06390380859375,
-0.03192138671875,
-0.0350341796875,
0.0115509033203125,
-0.0733642578125,
0.03692626953125,
0.005123138427734375,
-0.01513671875,
-0.0130462646484375,
0.0295867919921875,
0.06292724609375,
-0.0430908203125,
0.060791015625,
-0.01282501220703125,
0.01300048828125,
0.056427001953125,
-0.0004892349243164062,
0.0287322998046875,
0.01284027099609375,
0.00011414289474487305,
0.03863525390625,
-0.0027675628662109375,
-0.01424407958984375,
-0.029022216796875,
0.0305328369140625,
-0.0601806640625,
-0.03485107421875,
-0.034149169921875,
-0.040863037109375,
0.02581787109375,
0.00786590576171875,
0.053741455078125,
0.0207672119140625,
0.00652313232421875,
0.01497650146484375,
0.027130126953125,
-0.037872314453125,
0.04888916015625,
0.00395965576171875,
-0.0234375,
-0.0341796875,
0.06011962890625,
0.00034308433532714844,
0.01226043701171875,
0.038909912109375,
0.0204010009765625,
-0.041473388671875,
-0.007793426513671875,
-0.0233154296875,
0.035186767578125,
-0.04742431640625,
-0.032684326171875,
-0.059051513671875,
-0.044647216796875,
-0.041229248046875,
-0.0296783447265625,
-0.053070068359375,
-0.01322174072265625,
-0.04266357421875,
-0.01207733154296875,
0.0335693359375,
0.01404571533203125,
-0.0172119140625,
0.035400390625,
-0.02496337890625,
0.0142974853515625,
0.0080718994140625,
0.00963592529296875,
-0.00185394287109375,
-0.038177490234375,
-0.0085296630859375,
0.01947021484375,
-0.0421142578125,
-0.062744140625,
0.0496826171875,
0.00794219970703125,
0.0281524658203125,
0.07403564453125,
0.0016088485717773438,
0.0694580078125,
0.0111846923828125,
0.042755126953125,
0.039154052734375,
-0.059783935546875,
0.046844482421875,
-0.00489044189453125,
0.0125885009765625,
0.0287933349609375,
0.0230560302734375,
-0.0305938720703125,
-0.0079498291015625,
-0.0404052734375,
-0.063232421875,
0.070068359375,
0.008392333984375,
-0.00980377197265625,
0.000049054622650146484,
0.022674560546875,
-0.003147125244140625,
0.00804901123046875,
-0.0645751953125,
-0.0107574462890625,
-0.042510986328125,
-0.0002510547637939453,
-0.0018262863159179688,
-0.004119873046875,
0.0036163330078125,
-0.05078125,
0.029144287109375,
-0.00762176513671875,
0.050140380859375,
0.030242919921875,
-0.01885986328125,
-0.005321502685546875,
-0.0258941650390625,
0.034637451171875,
0.0455322265625,
-0.033599853515625,
-0.00389862060546875,
0.0005049705505371094,
-0.03631591796875,
-0.0095062255859375,
0.0139312744140625,
-0.006443023681640625,
-0.01221466064453125,
0.0193328857421875,
0.0638427734375,
-0.001888275146484375,
-0.03009033203125,
0.045166015625,
0.0141143798828125,
-0.0202484130859375,
-0.0286712646484375,
0.003143310546875,
-0.01366424560546875,
0.0158233642578125,
0.0343017578125,
0.01334381103515625,
-0.004116058349609375,
-0.032440185546875,
0.027557373046875,
0.035400390625,
-0.033416748046875,
-0.040283203125,
0.062744140625,
-0.0145111083984375,
-0.0220947265625,
0.067138671875,
-0.00885772705078125,
-0.059967041015625,
0.076904296875,
0.04443359375,
0.06109619140625,
-0.0243377685546875,
0.02947998046875,
0.05218505859375,
0.0283355712890625,
-0.004230499267578125,
-0.01129150390625,
-0.015350341796875,
-0.053619384765625,
-0.00016200542449951172,
-0.053741455078125,
-0.001644134521484375,
0.0201416015625,
-0.05902099609375,
0.038238525390625,
-0.0340576171875,
-0.01470184326171875,
-0.00455474853515625,
0.0008611679077148438,
-0.07391357421875,
0.0230560302734375,
-0.0010204315185546875,
0.074462890625,
-0.062103271484375,
0.039093017578125,
0.033599853515625,
-0.05206298828125,
-0.04302978515625,
-0.01540374755859375,
-0.0081939697265625,
-0.057647705078125,
0.03570556640625,
0.06243896484375,
0.004852294921875,
-0.01104736328125,
-0.048095703125,
-0.0665283203125,
0.089111328125,
0.032073974609375,
-0.04693603515625,
0.00373077392578125,
0.0273590087890625,
0.030731201171875,
-0.034423828125,
0.052581787109375,
0.0374755859375,
0.033660888671875,
0.03668212890625,
-0.0494384765625,
-0.0029754638671875,
-0.0127716064453125,
0.007511138916015625,
-0.002902984619140625,
-0.056976318359375,
0.0594482421875,
-0.03338623046875,
-0.018829345703125,
-0.00444793701171875,
0.047210693359375,
0.018096923828125,
0.033203125,
0.03424072265625,
0.0577392578125,
0.0281982421875,
-0.00836944580078125,
0.07086181640625,
-0.0233154296875,
0.051361083984375,
0.040252685546875,
0.0022106170654296875,
0.030487060546875,
0.0244140625,
-0.01129913330078125,
0.0241851806640625,
0.08349609375,
-0.0195770263671875,
0.0440673828125,
0.005718231201171875,
-0.000736236572265625,
-0.0036640167236328125,
-0.0113677978515625,
-0.0380859375,
0.046173095703125,
0.0274658203125,
-0.0321044921875,
-0.026092529296875,
0.029754638671875,
0.0004222393035888672,
-0.029449462890625,
-0.010894775390625,
0.020965576171875,
-0.0009007453918457031,
-0.034637451171875,
0.051239013671875,
0.00738525390625,
0.054351806640625,
-0.026947021484375,
0.0160980224609375,
-0.02935791015625,
0.02392578125,
-0.0251007080078125,
-0.044342041015625,
0.031219482421875,
-0.01351165771484375,
0.0015439987182617188,
0.023406982421875,
0.04901123046875,
-0.01666259765625,
-0.067138671875,
0.032470703125,
-0.000980377197265625,
0.0216827392578125,
-0.01605224609375,
-0.06988525390625,
0.0248260498046875,
-0.00394439697265625,
-0.03204345703125,
0.007694244384765625,
0.005397796630859375,
-0.01314544677734375,
0.053680419921875,
0.052459716796875,
-0.0219573974609375,
0.00951385498046875,
-0.0059814453125,
0.074951171875,
-0.0279998779296875,
-0.0299530029296875,
-0.05511474609375,
0.035003662109375,
-0.019500732421875,
-0.018402099609375,
0.045196533203125,
0.068359375,
0.08392333984375,
-0.012603759765625,
0.03564453125,
-0.03082275390625,
-0.011444091796875,
-0.0007228851318359375,
0.0280914306640625,
-0.032745361328125,
0.0072784423828125,
-0.0369873046875,
-0.0826416015625,
-0.028594970703125,
0.08026123046875,
-0.0271759033203125,
0.0220947265625,
0.037567138671875,
0.08294677734375,
-0.04534912109375,
-0.0055999755859375,
0.00765228271484375,
-0.005657196044921875,
0.032012939453125,
0.040985107421875,
0.0364990234375,
-0.051605224609375,
0.035888671875,
-0.0626220703125,
-0.01558685302734375,
-0.03607177734375,
-0.037628173828125,
-0.071533203125,
-0.052459716796875,
-0.041107177734375,
-0.02545166015625,
-0.00020885467529296875,
0.0362548828125,
0.081298828125,
-0.04815673828125,
-0.0170135498046875,
-0.032745361328125,
0.005767822265625,
-0.039459228515625,
-0.023895263671875,
0.049957275390625,
0.0164337158203125,
-0.06915283203125,
0.0023670196533203125,
0.0201873779296875,
0.002277374267578125,
-0.01012420654296875,
-0.00945281982421875,
-0.0272064208984375,
-0.006847381591796875,
0.046722412109375,
0.039093017578125,
-0.045806884765625,
-0.02838134765625,
0.007415771484375,
0.0112152099609375,
0.0277557373046875,
0.049835205078125,
-0.044281005859375,
0.040863037109375,
0.0301361083984375,
0.01763916015625,
0.08392333984375,
0.0233154296875,
-0.001007080078125,
-0.036529541015625,
0.028045654296875,
0.0032062530517578125,
0.027313232421875,
0.040435791015625,
-0.02459716796875,
0.048736572265625,
0.03912353515625,
-0.0302734375,
-0.06103515625,
0.0098876953125,
-0.10894775390625,
-0.0091094970703125,
0.06622314453125,
-0.011505126953125,
-0.042236328125,
0.0077972412109375,
-0.0183258056640625,
0.038818359375,
-0.0127716064453125,
0.054351806640625,
0.01413726806640625,
-0.008575439453125,
-0.055938720703125,
-0.01776123046875,
0.003154754638671875,
-0.0147247314453125,
-0.04962158203125,
-0.015167236328125,
0.0192108154296875,
0.040069580078125,
0.0305938720703125,
0.032012939453125,
-0.029510498046875,
0.014007568359375,
0.0010709762573242188,
0.02294921875,
-0.0272064208984375,
-0.038848876953125,
0.0007867813110351562,
0.00865936279296875,
-0.038482666015625,
-0.046722412109375
]
] |
BM-K/KoSimCSE-roberta-multitask | 2023-03-24T00:48:07.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"feature-extraction",
"korean",
"ko",
"endpoints_compatible",
"region:us"
] | feature-extraction | BM-K | null | null | BM-K/KoSimCSE-roberta-multitask | 4 | 15,089 | transformers | 2022-06-01T15:02:22 | ---
language: ko
tags:
- korean
---
https://github.com/BM-K/Sentence-Embedding-is-all-you-need
# Korean-Sentence-Embedding
🍭 Korean sentence embedding repository. You can download the pre-trained models and inference right away, also it provides environments where individuals can train models.
## Quick tour
```python
import torch
from transformers import AutoModel, AutoTokenizer
def cal_score(a, b):
if len(a.shape) == 1: a = a.unsqueeze(0)
if len(b.shape) == 1: b = b.unsqueeze(0)
a_norm = a / a.norm(dim=1)[:, None]
b_norm = b / b.norm(dim=1)[:, None]
return torch.mm(a_norm, b_norm.transpose(0, 1)) * 100
model = AutoModel.from_pretrained('BM-K/KoSimCSE-roberta-multitask')
AutoTokenizer.from_pretrained('BM-K/KoSimCSE-roberta-multitask')
sentences = ['치타가 들판을 가로 질러 먹이를 쫓는다.',
'치타 한 마리가 먹이 뒤에서 달리고 있다.',
'원숭이 한 마리가 드럼을 연주한다.']
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
embeddings, _ = model(**inputs, return_dict=False)
score01 = cal_score(embeddings[0][0], embeddings[1][0])
score02 = cal_score(embeddings[0][0], embeddings[2][0])
```
## Performance
- Semantic Textual Similarity test set results <br>
| Model | AVG | Cosine Pearson | Cosine Spearman | Euclidean Pearson | Euclidean Spearman | Manhattan Pearson | Manhattan Spearman | Dot Pearson | Dot Spearman |
|------------------------|:----:|:----:|:----:|:----:|:----:|:----:|:----:|:----:|:----:|
| KoSBERT<sup>†</sup><sub>SKT</sub> | 77.40 | 78.81 | 78.47 | 77.68 | 77.78 | 77.71 | 77.83 | 75.75 | 75.22 |
| KoSBERT | 80.39 | 82.13 | 82.25 | 80.67 | 80.75 | 80.69 | 80.78 | 77.96 | 77.90 |
| KoSRoBERTa | 81.64 | 81.20 | 82.20 | 81.79 | 82.34 | 81.59 | 82.20 | 80.62 | 81.25 |
| | | | | | | | | |
| KoSentenceBART | 77.14 | 79.71 | 78.74 | 78.42 | 78.02 | 78.40 | 78.00 | 74.24 | 72.15 |
| KoSentenceT5 | 77.83 | 80.87 | 79.74 | 80.24 | 79.36 | 80.19 | 79.27 | 72.81 | 70.17 |
| | | | | | | | | |
| KoSimCSE-BERT<sup>†</sup><sub>SKT</sub> | 81.32 | 82.12 | 82.56 | 81.84 | 81.63 | 81.99 | 81.74 | 79.55 | 79.19 |
| KoSimCSE-BERT | 83.37 | 83.22 | 83.58 | 83.24 | 83.60 | 83.15 | 83.54 | 83.13 | 83.49 |
| KoSimCSE-RoBERTa | 83.65 | 83.60 | 83.77 | 83.54 | 83.76 | 83.55 | 83.77 | 83.55 | 83.64 |
| | | | | | | | | | |
| KoSimCSE-BERT-multitask | 85.71 | 85.29 | 86.02 | 85.63 | 86.01 | 85.57 | 85.97 | 85.26 | 85.93 |
| KoSimCSE-RoBERTa-multitask | 85.77 | 85.08 | 86.12 | 85.84 | 86.12 | 85.83 | 86.12 | 85.03 | 85.99 | | 2,592 | [
[
-0.0204620361328125,
-0.046844482421875,
0.03558349609375,
0.022186279296875,
-0.0264434814453125,
0.0048828125,
-0.0205841064453125,
0.0016984939575195312,
0.02496337890625,
0.0283050537109375,
-0.050384521484375,
-0.0548095703125,
-0.0474853515625,
0.006351470947265625,
0.00504302978515625,
0.053466796875,
-0.01514434814453125,
0.00829315185546875,
0.00662994384765625,
-0.0096588134765625,
-0.042694091796875,
-0.0125732421875,
-0.0231781005859375,
-0.023223876953125,
0.00920867919921875,
0.03839111328125,
0.050079345703125,
0.02484130859375,
0.02020263671875,
0.032562255859375,
0.00482940673828125,
0.0009851455688476562,
-0.01161956787109375,
0.0157470703125,
0.00690460205078125,
-0.03997802734375,
-0.02386474609375,
0.0008654594421386719,
0.033538818359375,
0.0325927734375,
0.0006432533264160156,
0.01055145263671875,
0.00618743896484375,
0.052001953125,
-0.0307769775390625,
0.01331329345703125,
-0.0204010009765625,
0.022979736328125,
0.003475189208984375,
-0.01532745361328125,
-0.0283660888671875,
-0.034515380859375,
0.01200103759765625,
-0.04931640625,
0.0005092620849609375,
0.003082275390625,
0.10345458984375,
0.018798828125,
-0.047271728515625,
-0.035369873046875,
-0.0236663818359375,
0.07177734375,
-0.0633544921875,
0.031646728515625,
0.021514892578125,
0.01062774658203125,
-0.01181793212890625,
-0.05401611328125,
-0.03851318359375,
-0.0032634735107421875,
-0.02618408203125,
0.0438232421875,
0.0003390312194824219,
0.0037822723388671875,
0.02008056640625,
0.02459716796875,
-0.05426025390625,
-0.016815185546875,
-0.024383544921875,
-0.01508331298828125,
0.0496826171875,
0.013092041015625,
0.034912109375,
-0.051422119140625,
-0.036834716796875,
-0.01849365234375,
-0.00821685791015625,
0.00817108154296875,
0.0250701904296875,
0.00551605224609375,
-0.008270263671875,
0.0584716796875,
-0.020477294921875,
0.03106689453125,
0.01531219482421875,
-0.00811004638671875,
0.053192138671875,
-0.036346435546875,
-0.0274505615234375,
0.00696563720703125,
0.0693359375,
0.035400390625,
0.017669677734375,
-0.001739501953125,
-0.00502777099609375,
0.00972747802734375,
0.00188446044921875,
-0.0606689453125,
-0.044342041015625,
0.021881103515625,
-0.02880859375,
-0.01361083984375,
0.020538330078125,
-0.05731201171875,
0.0115814208984375,
-0.0247039794921875,
0.0628662109375,
-0.03759765625,
-0.0155487060546875,
0.0164337158203125,
-0.01568603515625,
0.019287109375,
-0.001964569091796875,
-0.045196533203125,
0.011688232421875,
0.0175018310546875,
0.06298828125,
0.004787445068359375,
-0.0182647705078125,
0.0025577545166015625,
-0.0026683807373046875,
-0.0313720703125,
0.03662109375,
-0.000400543212890625,
-0.0230560302734375,
0.000052928924560546875,
0.0230865478515625,
-0.031005859375,
-0.00885009765625,
0.045501708984375,
-0.0130767822265625,
0.03021240234375,
-0.0163726806640625,
-0.062347412109375,
-0.015838623046875,
0.0229949951171875,
-0.041748046875,
0.10369873046875,
0.0108642578125,
-0.0692138671875,
0.03424072265625,
-0.049468994140625,
-0.01763916015625,
-0.005779266357421875,
0.0012664794921875,
-0.052337646484375,
-0.0009365081787109375,
0.0298919677734375,
0.056243896484375,
0.0159759521484375,
0.00835418701171875,
-0.0185394287109375,
-0.0238800048828125,
0.00011497735977172852,
-0.01174163818359375,
0.07293701171875,
0.016326904296875,
-0.029388427734375,
0.0130767822265625,
-0.073974609375,
0.038604736328125,
0.016204833984375,
-0.020721435546875,
-0.026885986328125,
-0.0367431640625,
0.00041294097900390625,
0.032257080078125,
0.01387786865234375,
-0.04345703125,
0.01099395751953125,
-0.04547119140625,
0.03411865234375,
0.05645751953125,
0.01531219482421875,
0.033660888671875,
-0.0260772705078125,
0.04339599609375,
0.01050567626953125,
0.0030727386474609375,
-0.00783538818359375,
-0.03167724609375,
-0.04180908203125,
-0.042999267578125,
0.0277099609375,
0.051910400390625,
-0.059539794921875,
0.0657958984375,
-0.0200042724609375,
-0.045074462890625,
-0.05499267578125,
0.003101348876953125,
0.0200347900390625,
0.0235748291015625,
0.0193634033203125,
0.0188751220703125,
-0.05731201171875,
-0.06768798828125,
-0.02716064453125,
0.009063720703125,
0.0045318603515625,
0.03802490234375,
0.06610107421875,
-0.001720428466796875,
0.06488037109375,
-0.0618896484375,
-0.018035888671875,
-0.0303192138671875,
-0.00714111328125,
0.04669189453125,
0.041900634765625,
0.04632568359375,
-0.05157470703125,
-0.084716796875,
-0.0037631988525390625,
-0.059173583984375,
-0.0090179443359375,
-0.00235748291015625,
-0.0159759521484375,
0.021453857421875,
0.034393310546875,
-0.06549072265625,
0.060882568359375,
0.04443359375,
-0.060333251953125,
0.06884765625,
-0.0259552001953125,
0.0267791748046875,
-0.1170654296875,
0.00502777099609375,
-0.018707275390625,
-0.025115966796875,
-0.043243408203125,
-0.0022602081298828125,
0.01091766357421875,
0.0013360977172851562,
-0.020355224609375,
0.052520751953125,
-0.036407470703125,
0.0078887939453125,
-0.0040130615234375,
0.0115203857421875,
0.007427215576171875,
0.0489501953125,
-0.0011835098266601562,
0.051544189453125,
0.037017822265625,
-0.037200927734375,
0.0200042724609375,
0.02392578125,
-0.050079345703125,
0.026641845703125,
-0.048797607421875,
-0.004901885986328125,
0.01062774658203125,
0.025299072265625,
-0.10784912109375,
-0.0203857421875,
0.034759521484375,
-0.05023193359375,
0.004360198974609375,
-0.01508331298828125,
-0.044036865234375,
-0.048370361328125,
-0.041961669921875,
0.01050567626953125,
0.037750244140625,
-0.027130126953125,
0.033538818359375,
0.005962371826171875,
-0.0161590576171875,
-0.04510498046875,
-0.05126953125,
-0.019317626953125,
-0.0171966552734375,
-0.04559326171875,
0.033233642578125,
-0.00566864013671875,
0.0005974769592285156,
0.02020263671875,
-0.01172637939453125,
-0.008880615234375,
0.0050048828125,
0.01336669921875,
0.034576416015625,
-0.025634765625,
0.0016326904296875,
0.0129852294921875,
-0.017822265625,
-0.0044403076171875,
-0.0106048583984375,
0.06109619140625,
-0.0066375732421875,
-0.017578125,
-0.051849365234375,
0.00977325439453125,
0.033477783203125,
0.007568359375,
0.0635986328125,
0.061767578125,
-0.0262603759765625,
-0.0015039443969726562,
-0.02972412109375,
-0.007030487060546875,
-0.0374755859375,
0.034820556640625,
-0.04864501953125,
-0.06463623046875,
0.0557861328125,
-0.0079498291015625,
-0.011871337890625,
0.056304931640625,
0.042572021484375,
-0.01403045654296875,
0.085205078125,
0.0214385986328125,
0.006725311279296875,
0.0219879150390625,
-0.043365478515625,
0.021209716796875,
-0.0599365234375,
-0.022735595703125,
-0.0295867919921875,
-0.025634765625,
-0.051025390625,
0.0014581680297851562,
0.015716552734375,
0.0163421630859375,
-0.04034423828125,
0.034942626953125,
-0.050018310546875,
0.03564453125,
0.041900634765625,
0.0244140625,
-0.00229644775390625,
-0.02020263671875,
-0.046600341796875,
-0.0242919921875,
-0.054473876953125,
-0.0243988037109375,
0.08544921875,
0.0142822265625,
0.04351806640625,
0.001720428466796875,
0.055755615234375,
0.0157928466796875,
-0.01007080078125,
-0.053466796875,
0.032440185546875,
-0.00041413307189941406,
-0.040863037109375,
-0.02484130859375,
-0.03173828125,
-0.069580078125,
0.032684326171875,
-0.0134124755859375,
-0.0697021484375,
-0.0029659271240234375,
-0.011474609375,
-0.03448486328125,
0.0262603759765625,
-0.05645751953125,
0.07550048828125,
0.002841949462890625,
-0.005126953125,
0.00577545166015625,
-0.046844482421875,
0.020965576171875,
0.01279449462890625,
0.00839996337890625,
-0.0007691383361816406,
0.0006284713745117188,
0.059295654296875,
-0.045623779296875,
0.04168701171875,
-0.0214385986328125,
0.01387786865234375,
0.022308349609375,
0.0033416748046875,
0.042755126953125,
0.0182647705078125,
0.0100860595703125,
-0.0155487060546875,
0.01076507568359375,
-0.035888671875,
-0.040557861328125,
0.0672607421875,
-0.0714111328125,
-0.0296173095703125,
-0.05474853515625,
-0.03900146484375,
0.0030536651611328125,
0.034515380859375,
0.040283203125,
0.0159149169921875,
0.007511138916015625,
0.03131103515625,
0.04486083984375,
-0.023284912109375,
0.041900634765625,
0.01708984375,
-0.0142364501953125,
-0.056182861328125,
0.04656982421875,
0.0145416259765625,
0.01549530029296875,
0.014801025390625,
0.034454345703125,
-0.046783447265625,
-0.012664794921875,
-0.0205535888671875,
0.01412200927734375,
-0.04913330078125,
-0.015716552734375,
-0.052215576171875,
-0.027374267578125,
-0.052215576171875,
-0.02447509765625,
-0.03143310546875,
-0.0406494140625,
-0.00843048095703125,
-0.024658203125,
0.0206451416015625,
0.046966552734375,
-0.0017538070678710938,
0.0279998779296875,
-0.048187255859375,
0.01316070556640625,
0.001636505126953125,
0.01318359375,
-0.0007233619689941406,
-0.048065185546875,
-0.01088714599609375,
-0.0043792724609375,
-0.021759033203125,
-0.07183837890625,
0.0430908203125,
0.0043792724609375,
0.034271240234375,
0.0251312255859375,
0.007411956787109375,
0.057373046875,
-0.024383544921875,
0.06756591796875,
0.01053619384765625,
-0.07122802734375,
0.04052734375,
-0.0163421630859375,
0.031494140625,
0.038818359375,
0.033477783203125,
-0.04595947265625,
-0.015777587890625,
-0.062744140625,
-0.075439453125,
0.0726318359375,
0.03594970703125,
0.005504608154296875,
0.0062408447265625,
0.0164642333984375,
-0.00606536865234375,
0.0165252685546875,
-0.0648193359375,
-0.039276123046875,
-0.04010009765625,
-0.036529541015625,
0.00047469139099121094,
-0.00980377197265625,
0.0156707763671875,
-0.03521728515625,
0.06640625,
0.01058197021484375,
0.0283203125,
0.0253753662109375,
-0.0242156982421875,
0.01800537109375,
0.0092620849609375,
0.05255126953125,
0.0445556640625,
-0.01788330078125,
-0.006656646728515625,
0.0222625732421875,
-0.0440673828125,
0.0177001953125,
-0.0077056884765625,
-0.016204833984375,
0.020721435546875,
0.0242156982421875,
0.048431396484375,
0.037109375,
-0.02783203125,
0.042266845703125,
0.01123809814453125,
-0.040130615234375,
-0.0400390625,
-0.0169219970703125,
0.007656097412109375,
0.0214996337890625,
0.0144805908203125,
-0.0037078857421875,
-0.01522064208984375,
-0.032440185546875,
0.0232086181640625,
0.011016845703125,
-0.038330078125,
-0.0144195556640625,
0.0369873046875,
-0.01097869873046875,
-0.018707275390625,
0.04974365234375,
-0.02093505859375,
-0.07293701171875,
0.053070068359375,
0.04046630859375,
0.056610107421875,
-0.033172607421875,
0.0224609375,
0.0634765625,
0.0092926025390625,
-0.013671875,
0.04217529296875,
0.0162506103515625,
-0.05322265625,
-0.016387939453125,
-0.03424072265625,
0.01226043701171875,
0.01806640625,
-0.0482177734375,
0.0156097412109375,
-0.018035888671875,
-0.016204833984375,
-0.0091094970703125,
0.0288238525390625,
-0.049713134765625,
0.020599365234375,
-0.0244140625,
0.062347412109375,
-0.07373046875,
0.061065673828125,
0.054901123046875,
-0.03668212890625,
-0.0672607421875,
-0.0168304443359375,
-0.00661468505859375,
-0.0604248046875,
0.042327880859375,
0.01403045654296875,
0.031036376953125,
0.006195068359375,
-0.025665283203125,
-0.0714111328125,
0.09912109375,
-0.0021610260009765625,
-0.022186279296875,
0.007656097412109375,
0.0128173828125,
0.0304718017578125,
-0.0110321044921875,
0.03900146484375,
0.026947021484375,
0.043060302734375,
0.003414154052734375,
-0.056396484375,
0.02069091796875,
-0.021697998046875,
-0.006923675537109375,
0.0170745849609375,
-0.0576171875,
0.08056640625,
-0.0015306472778320312,
0.004093170166015625,
0.010345458984375,
0.042877197265625,
0.0174407958984375,
0.0169219970703125,
0.03759765625,
0.060882568359375,
0.053802490234375,
-0.0116424560546875,
0.061431884765625,
-0.025177001953125,
0.05487060546875,
0.0765380859375,
0.007366180419921875,
0.049163818359375,
0.020050048828125,
-0.0228424072265625,
0.035186767578125,
0.04998779296875,
-0.00870513916015625,
0.04681396484375,
0.0125732421875,
-0.005931854248046875,
-0.027099609375,
0.0246734619140625,
-0.0235748291015625,
0.034515380859375,
0.0155029296875,
-0.0251922607421875,
-0.00841522216796875,
0.006557464599609375,
0.028472900390625,
0.007251739501953125,
-0.0097808837890625,
0.04876708984375,
-0.00008720159530639648,
-0.044036865234375,
0.019378662109375,
-0.0010404586791992188,
0.048431396484375,
-0.05078125,
0.00530242919921875,
0.003314971923828125,
0.03173828125,
-0.0132598876953125,
-0.06658935546875,
0.0123748779296875,
-0.0146484375,
-0.00818634033203125,
0.000926971435546875,
0.05999755859375,
-0.033477783203125,
-0.045989990234375,
0.0177154541015625,
0.034088134765625,
0.00855255126953125,
-0.0016460418701171875,
-0.07818603515625,
-0.01259613037109375,
0.00716400146484375,
-0.048553466796875,
0.0206298828125,
0.036376953125,
0.007091522216796875,
0.053466796875,
0.043975830078125,
0.0024051666259765625,
0.0247802734375,
-0.0011119842529296875,
0.052093505859375,
-0.0526123046875,
-0.05474853515625,
-0.07415771484375,
0.03125,
-0.0213165283203125,
-0.042022705078125,
0.07025146484375,
0.058349609375,
0.051483154296875,
-0.021728515625,
0.0574951171875,
-0.019744873046875,
0.0266265869140625,
-0.04034423828125,
0.078369140625,
-0.046722412109375,
-0.0205841064453125,
-0.0218505859375,
-0.06207275390625,
-0.017822265625,
0.06890869140625,
-0.01611328125,
0.00878143310546875,
0.06646728515625,
0.052459716796875,
-0.0059356689453125,
0.00017821788787841797,
0.0154266357421875,
0.04071044921875,
0.01534271240234375,
0.045196533203125,
0.039398193359375,
-0.058563232421875,
0.048431396484375,
-0.03790283203125,
0.0005331039428710938,
-0.01267242431640625,
-0.0565185546875,
-0.0733642578125,
-0.048828125,
-0.0231475830078125,
-0.03094482421875,
-0.007617950439453125,
0.09210205078125,
0.037261962890625,
-0.05877685546875,
-0.00433349609375,
-0.005794525146484375,
0.00621795654296875,
-0.0173797607421875,
-0.022247314453125,
0.07025146484375,
-0.028228759765625,
-0.057952880859375,
0.008514404296875,
-0.0047149658203125,
0.00240325927734375,
0.0067138671875,
-0.0164642333984375,
-0.048431396484375,
0.0148162841796875,
0.03680419921875,
-0.0007529258728027344,
-0.063232421875,
-0.03363037109375,
0.0005850791931152344,
-0.02264404296875,
0.005054473876953125,
0.0166473388671875,
-0.03759765625,
0.0283203125,
0.047088623046875,
0.033538818359375,
0.03643798828125,
-0.0028934478759765625,
0.0202789306640625,
-0.04522705078125,
0.0130615234375,
0.01030731201171875,
0.02734375,
-0.004001617431640625,
-0.0215911865234375,
0.047637939453125,
0.033050537109375,
-0.0474853515625,
-0.064208984375,
-0.020538330078125,
-0.08184814453125,
-0.0261077880859375,
0.07159423828125,
-0.03387451171875,
-0.0227203369140625,
-0.0084228515625,
-0.030670166015625,
0.022491455078125,
-0.0290985107421875,
0.045867919921875,
0.0797119140625,
0.00878143310546875,
-0.0012607574462890625,
-0.051544189453125,
0.03948974609375,
0.0287933349609375,
-0.042388916015625,
-0.0210723876953125,
-0.00865936279296875,
0.033233642578125,
-0.005039215087890625,
0.050811767578125,
-0.0006198883056640625,
0.0207672119140625,
0.01514434814453125,
-0.0011415481567382812,
-0.001750946044921875,
-0.007106781005859375,
0.00333404541015625,
0.00032258033752441406,
-0.03948974609375,
-0.04510498046875
]
] |
Intel/distilbert-base-uncased-finetuned-sst-2-english-int8-static | 2023-06-27T08:21:13.000Z | [
"transformers",
"pytorch",
"onnx",
"distilbert",
"text-classification",
"text-classfication",
"int8",
"neural-compressor",
"Intel® Neural Compressor",
"PostTrainingStatic",
"en",
"dataset:sst2",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | Intel | null | null | Intel/distilbert-base-uncased-finetuned-sst-2-english-int8-static | 2 | 15,029 | transformers | 2022-03-29T05:04:36 | ---
language: en
license: apache-2.0
tags:
- text-classfication
- int8
- neural-compressor
- Intel® Neural Compressor
- PostTrainingStatic
datasets:
- sst2
model-index:
- name: distilbert-base-uncased-finetuned-sst-2-english-int8-static
results:
- task:
type: sentiment-classification
name: Sentiment Classification
dataset:
type: sst2
name: Stanford Sentiment Treebank
metrics:
- type: accuracy
value: 90.37
name: accuracy
config: accuracy
verified: false
---
## Model Details: INT8 DistilBERT base uncased finetuned SST-2
This model is a fine-tuned DistilBERT model for the downstream task of sentiment classification, training on the [SST-2 dataset](https://huggingface.co/datasets/sst2) and quantized to INT8 (post-training static quantization) from the original FP32 model ([distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english)).
The same model is provided in two different formats: PyTorch and ONNX.
| Model Detail | Description |
| ----------- | ----------- |
| Model Authors - Company | Intel |
| Date | March 29, 2022 for PyTorch model & February 3, 2023 for ONNX model |
| Version | 1 |
| Type | NLP DistilBERT (INT8) - Sentiment Classification (+/-) |
| Paper or Other Resources | [https://github.com/huggingface/optimum-intel](https://github.com/huggingface/optimum-intel) |
| License | Apache 2.0 |
| Questions or Comments | [Community Tab](https://huggingface.co/Intel/distilbert-base-uncased-finetuned-sst-2-english-int8-static/discussions) and [Intel Developers Discord](https://discord.gg/rv2Gp55UJQ) |
| Intended Use | Description |
| ----------- | ----------- |
| Primary intended uses | Inference for sentiment classification (classifying whether a statement is positive or negative) |
| Primary intended users | Anyone |
| Out-of-scope uses | This model is already fine-tuned and quantized to INT8. It is not suitable for further fine-tuning in this form. To fine-tune your own model, you can start with [distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english). The model should not be used to intentionally create hostile or alienating environments for people. |
#### Load the PyTorch model with Optimum Intel
```python
from optimum.intel.neural_compressor import INCModelForSequenceClassification
model_id = "Intel/distilbert-base-uncased-finetuned-sst-2-english-int8-static"
int8_model = INCModelForSequenceClassification.from_pretrained(model_id)
```
#### Load the ONNX model with Optimum:
```python
from optimum.onnxruntime import ORTModelForSequenceClassification
model_id = "Intel/distilbert-base-uncased-finetuned-sst-2-english-int8-static"
int8_model = ORTModelForSequenceClassification.from_pretrained(model_id)
```
| Factors | Description |
| ----------- | ----------- |
| Groups | Movie reviewers from the internet |
| Instrumentation | Text movie single-sentence reviews taken from 4 authors. More information can be found in the original paper by [Pang and Lee (2005)](https://arxiv.org/abs/cs/0506075) |
| Environment | - |
| Card Prompts | Model deployment on alternate hardware and software can change model performance |
| Metrics | Description |
| ----------- | ----------- |
| Model performance measures | Accuracy |
| Decision thresholds | - |
| Approaches to uncertainty and variability | - |
| | PyTorch INT8 | ONNX INT8 | FP32 |
|---|---|---|---|
| **Accuracy (eval-accuracy)** |0.9037|0.9071|0.9106|
| **Model Size (MB)** |65|89|255|
| Training and Evaluation Data | Description |
| ----------- | ----------- |
| Datasets | The dataset can be found here: [datasets/sst2](https://huggingface.co/datasets/sst2). There dataset has a total of 215,154 unique phrases, annotated by 3 human judges. |
| Motivation | Dataset was chosen to showcase the benefits of quantization on an NLP classification task with the [Optimum Intel](https://github.com/huggingface/optimum-intel) and [Intel® Neural Compressor](https://github.com/intel/neural-compressor) |
| Preprocessing | The calibration dataloader is the train dataloader. The default calibration sampling size 100 isn't divisible exactly by batch size 8, so the real sampling size is 104.|
| Quantitative Analyses | Description |
| ----------- | ----------- |
| Unitary results | The model was only evaluated on accuracy. There is no available comparison between evaluation factors. |
| Intersectional results | There is no available comparison between the intersection of evaluated factors. |
| Ethical Considerations | Description |
| ----------- | ----------- |
| Data | The data that make up the model are movie reviews from authors on the internet. |
| Human life | The model is not intended to inform decisions central to human life or flourishing. It is an aggregated set of movie reviews from the internet. |
| Mitigations | No additional risk mitigation strategies were considered during model development. |
| Risks and harms | The data are biased toward the particular reviewers' opinions and the judges (labelers) of the data. Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al., 2021](https://aclanthology.org/2021.acl-long.330.pdf), and [Bender et al., 2021](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. Beyond this, the extent of the risks involved by using the model remain unknown.|
| Use cases | - |
| Caveats and Recommendations |
| ----------- |
| Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. There are no additional caveats or recommendations for this model. |
# BibTeX Entry and Citation Info
```
@misc{distilbert-base-uncased-finetuned-sst-2-english-int8-static
author = {Xin He, Yu Wenz},
title = {distilbert-base-uncased-finetuned-sst-2-english-int8-static},
year = {2022},
url = {https://huggingface.co/Intel/distilbert-base-uncased-finetuned-sst-2-english-int8-static},
}
```
| 6,272 | [
[
-0.023834228515625,
-0.047637939453125,
0.017608642578125,
0.01019287109375,
-0.0238189697265625,
-0.006195068359375,
-0.0221710205078125,
-0.021728515625,
-0.005340576171875,
0.01168060302734375,
-0.0322265625,
-0.025970458984375,
-0.052642822265625,
-0.01459503173828125,
-0.01409149169921875,
0.0943603515625,
0.0089874267578125,
0.020904541015625,
-0.01352691650390625,
0.0017757415771484375,
-0.01103973388671875,
-0.05499267578125,
-0.043701171875,
-0.020721435546875,
0.004085540771484375,
0.01245880126953125,
0.047515869140625,
0.03369140625,
0.05181884765625,
0.0235595703125,
-0.0272369384765625,
-0.018402099609375,
-0.0284423828125,
-0.006015777587890625,
0.0018758773803710938,
-0.02294921875,
-0.04339599609375,
0.004852294921875,
0.04840087890625,
0.05181884765625,
-0.02069091796875,
0.0369873046875,
0.01446533203125,
0.053985595703125,
-0.041015625,
0.0120391845703125,
-0.03173828125,
0.010162353515625,
0.0016632080078125,
0.019012451171875,
-0.0250091552734375,
-0.0246124267578125,
0.00850677490234375,
-0.03216552734375,
0.0089874267578125,
-0.003910064697265625,
0.07110595703125,
0.039154052734375,
-0.0269012451171875,
0.01158905029296875,
-0.06072998046875,
0.060302734375,
-0.06256103515625,
0.031524658203125,
0.017303466796875,
0.008453369140625,
0.0135040283203125,
-0.06353759765625,
-0.046875,
-0.01103973388671875,
0.0014057159423828125,
0.0309906005859375,
-0.038665771484375,
0.0251617431640625,
0.042144775390625,
0.0413818359375,
-0.040802001953125,
0.0044097900390625,
-0.0501708984375,
-0.0216522216796875,
0.04534912109375,
0.0035762786865234375,
0.01117706298828125,
-0.0313720703125,
-0.04510498046875,
-0.01038360595703125,
-0.029937744140625,
0.0276641845703125,
0.0357666015625,
0.01030731201171875,
-0.0195159912109375,
0.040985107421875,
-0.023681640625,
0.0386962890625,
0.01137542724609375,
0.002460479736328125,
0.03253173828125,
-0.0286102294921875,
-0.0266876220703125,
0.00865936279296875,
0.0731201171875,
0.04937744140625,
0.0185546875,
-0.0002224445343017578,
-0.02056884765625,
0.029327392578125,
0.00905609130859375,
-0.0745849609375,
-0.0225982666015625,
0.0198211669921875,
-0.038116455078125,
-0.0276641845703125,
0.0077056884765625,
-0.042388916015625,
-0.00927734375,
-0.02911376953125,
0.03265380859375,
-0.0310211181640625,
-0.038299560546875,
0.003795623779296875,
-0.02142333984375,
0.0024280548095703125,
0.005107879638671875,
-0.048980712890625,
0.02435302734375,
0.037872314453125,
0.06573486328125,
-0.01056671142578125,
-0.00542449951171875,
-0.0014257431030273438,
-0.0276641845703125,
-0.01276397705078125,
0.04132080078125,
-0.0112762451171875,
-0.0060882568359375,
-0.009796142578125,
-0.0119476318359375,
0.0073394775390625,
-0.029327392578125,
0.05322265625,
-0.026947021484375,
0.026611328125,
-0.0051727294921875,
-0.0379638671875,
-0.01453399658203125,
0.018829345703125,
-0.048492431640625,
0.08221435546875,
0.023101806640625,
-0.065185546875,
0.00885009765625,
-0.045806884765625,
-0.01108551025390625,
-0.0156097412109375,
-0.005214691162109375,
-0.0287628173828125,
0.014801025390625,
-0.011077880859375,
0.0299224853515625,
-0.032867431640625,
0.039825439453125,
-0.019866943359375,
-0.0295562744140625,
0.0191192626953125,
-0.04510498046875,
0.08807373046875,
0.0290985107421875,
-0.030364990234375,
-0.006191253662109375,
-0.0592041015625,
0.0094146728515625,
0.003040313720703125,
-0.0257415771484375,
-0.00891876220703125,
-0.011474609375,
0.0119171142578125,
0.036895751953125,
0.025390625,
-0.050323486328125,
-0.0014276504516601562,
-0.044891357421875,
0.0421142578125,
0.05645751953125,
-0.00745391845703125,
0.032745361328125,
-0.01490020751953125,
0.038970947265625,
0.020263671875,
0.03802490234375,
0.0183868408203125,
-0.0338134765625,
-0.06781005859375,
-0.00455474853515625,
0.038665771484375,
0.051666259765625,
-0.045623779296875,
0.03485107421875,
-0.016571044921875,
-0.03643798828125,
-0.039276123046875,
-0.0105743408203125,
0.04522705078125,
0.04083251953125,
0.027496337890625,
-0.0229034423828125,
-0.050994873046875,
-0.0823974609375,
-0.004978179931640625,
-0.027435302734375,
-0.0022068023681640625,
-0.0020923614501953125,
0.0182342529296875,
-0.00208282470703125,
0.06243896484375,
-0.038848876953125,
-0.0095062255859375,
-0.01299285888671875,
-0.0034084320068359375,
0.0250244140625,
0.043121337890625,
0.0297393798828125,
-0.05413818359375,
-0.03936767578125,
-0.0180816650390625,
-0.0616455078125,
0.0130462646484375,
0.0194091796875,
-0.0113677978515625,
0.0248260498046875,
0.038330078125,
-0.0501708984375,
0.03973388671875,
0.0350341796875,
-0.0293121337890625,
0.040985107421875,
0.0034046173095703125,
-0.004497528076171875,
-0.09796142578125,
0.00412750244140625,
0.0238494873046875,
-0.0013113021850585938,
-0.050994873046875,
-0.00917816162109375,
-0.00705718994140625,
-0.00264739990234375,
-0.048614501953125,
0.0372314453125,
-0.0159149169921875,
0.009368896484375,
-0.00467681884765625,
-0.0090484619140625,
0.007289886474609375,
0.0584716796875,
-0.00811767578125,
0.0439453125,
0.034515380859375,
-0.03814697265625,
0.0254058837890625,
0.0141143798828125,
-0.03729248046875,
0.033477783203125,
-0.070068359375,
-0.00350189208984375,
-0.011077880859375,
0.005664825439453125,
-0.0731201171875,
0.00437164306640625,
0.0171051025390625,
-0.051483154296875,
0.0207366943359375,
-0.0168304443359375,
-0.0389404296875,
-0.020751953125,
-0.0267486572265625,
0.0184478759765625,
0.06060791015625,
-0.0157623291015625,
0.044830322265625,
0.031280517578125,
-0.01032257080078125,
-0.042144775390625,
-0.07696533203125,
-0.031280517578125,
-0.0225830078125,
-0.04779052734375,
0.038726806640625,
-0.0135040283203125,
-0.032928466796875,
-0.006160736083984375,
-0.01009368896484375,
-0.0132598876953125,
0.005268096923828125,
0.0296783447265625,
0.043670654296875,
-0.016937255859375,
0.0142059326171875,
0.0090789794921875,
-0.01003265380859375,
0.0113067626953125,
-0.02105712890625,
0.0286712646484375,
-0.0249481201171875,
0.008544921875,
-0.044189453125,
0.0197296142578125,
0.033935546875,
-0.00440216064453125,
0.04620361328125,
0.061553955078125,
-0.03961181640625,
0.00928497314453125,
-0.037384033203125,
-0.036224365234375,
-0.037139892578125,
0.04779052734375,
-0.01499176025390625,
-0.057342529296875,
0.0360107421875,
0.003887176513671875,
-0.00614166259765625,
0.06396484375,
0.04022216796875,
-0.0101165771484375,
0.08935546875,
0.05584716796875,
-0.003780364990234375,
0.048309326171875,
-0.03936767578125,
0.0220184326171875,
-0.07525634765625,
-0.0084381103515625,
-0.027008056640625,
-0.02960205078125,
-0.0625,
-0.00885009765625,
0.024993896484375,
0.023834228515625,
-0.05377197265625,
0.007144927978515625,
-0.060882568359375,
0.0286865234375,
0.05255126953125,
0.01027679443359375,
0.01154327392578125,
0.018768310546875,
-0.02093505859375,
-0.0113067626953125,
-0.06304931640625,
-0.03851318359375,
0.0826416015625,
0.034210205078125,
0.044830322265625,
0.0118408203125,
0.0255126953125,
0.0209503173828125,
-0.002117156982421875,
-0.046966552734375,
0.031219482421875,
-0.033782958984375,
-0.059234619140625,
-0.00962066650390625,
-0.031280517578125,
-0.044525146484375,
0.00263214111328125,
-0.00255584716796875,
-0.0675048828125,
0.027923583984375,
0.01258087158203125,
-0.05767822265625,
0.0224151611328125,
-0.07928466796875,
0.07598876953125,
-0.024993896484375,
-0.0252532958984375,
-0.0015716552734375,
-0.06085205078125,
0.028167724609375,
0.016326904296875,
-0.0016469955444335938,
-0.01470947265625,
0.01519775390625,
0.06976318359375,
-0.034454345703125,
0.0665283203125,
-0.0252532958984375,
0.032806396484375,
0.033935546875,
-0.004978179931640625,
0.0262603759765625,
0.0019741058349609375,
-0.021331787109375,
0.033355712890625,
0.016571044921875,
-0.01025390625,
-0.034759521484375,
0.05682373046875,
-0.08856201171875,
-0.0243988037109375,
-0.05462646484375,
-0.0289306640625,
-0.008453369140625,
0.0015239715576171875,
0.040008544921875,
0.034271240234375,
-0.004749298095703125,
0.022491455078125,
0.04779052734375,
-0.0142669677734375,
0.031402587890625,
0.03143310546875,
-0.00879669189453125,
-0.0352783203125,
0.07012939453125,
0.0243072509765625,
0.0292510986328125,
0.01166534423828125,
0.025146484375,
-0.031494140625,
-0.028411865234375,
-0.0310821533203125,
-0.0017042160034179688,
-0.061370849609375,
-0.0291595458984375,
-0.052734375,
-0.03521728515625,
-0.035888671875,
0.02044677734375,
-0.037200927734375,
-0.055328369140625,
-0.030853271484375,
-0.0246124267578125,
0.04736328125,
0.039642333984375,
-0.00583648681640625,
0.0311126708984375,
-0.0262298583984375,
0.00945281982421875,
0.007755279541015625,
0.0367431640625,
-0.0010013580322265625,
-0.0714111328125,
-0.0202789306640625,
0.0199127197265625,
-0.039886474609375,
-0.050201416015625,
0.0169830322265625,
0.018646240234375,
0.033050537109375,
0.022216796875,
0.022308349609375,
0.030609130859375,
-0.01528167724609375,
0.06829833984375,
0.02203369140625,
-0.06280517578125,
0.04034423828125,
-0.01027679443359375,
0.032623291015625,
0.054168701171875,
0.047149658203125,
-0.033660888671875,
-0.0130157470703125,
-0.062042236328125,
-0.0819091796875,
0.07440185546875,
0.0172882080078125,
0.01053619384765625,
0.0112457275390625,
0.0170135498046875,
-0.0153656005859375,
0.02630615234375,
-0.067138671875,
-0.0291595458984375,
-0.0301361083984375,
-0.0290985107421875,
0.007068634033203125,
-0.027923583984375,
0.00331878662109375,
-0.04498291015625,
0.06390380859375,
0.01122283935546875,
0.044189453125,
0.0161895751953125,
-0.0052032470703125,
0.005382537841796875,
0.004978179931640625,
0.025970458984375,
0.0301361083984375,
-0.0330810546875,
0.00894927978515625,
0.00890350341796875,
-0.04571533203125,
0.0028438568115234375,
0.0210723876953125,
-0.034423828125,
-0.002826690673828125,
0.0003380775451660156,
0.0906982421875,
-0.002185821533203125,
-0.0357666015625,
0.036163330078125,
-0.01494598388671875,
-0.01387786865234375,
-0.0390625,
-0.00311279296875,
-0.0032291412353515625,
0.0268096923828125,
0.0154266357421875,
0.030517578125,
0.024444580078125,
-0.030181884765625,
0.0022830963134765625,
0.02081298828125,
-0.045501708984375,
-0.0239410400390625,
0.04644775390625,
0.006317138671875,
-0.0225830078125,
0.0618896484375,
-0.03155517578125,
-0.047515869140625,
0.049285888671875,
0.025421142578125,
0.06591796875,
-0.0142364501953125,
0.01174163818359375,
0.051300048828125,
0.03363037109375,
-0.01171112060546875,
0.0299835205078125,
0.0148773193359375,
-0.057647705078125,
-0.0236968994140625,
-0.060302734375,
-0.01068115234375,
0.003253936767578125,
-0.06072998046875,
0.0202789306640625,
-0.051727294921875,
-0.03912353515625,
0.0135345458984375,
0.00531768798828125,
-0.0611572265625,
0.0249176025390625,
0.01111602783203125,
0.051300048828125,
-0.06524658203125,
0.062164306640625,
0.048492431640625,
-0.059234619140625,
-0.07440185546875,
-0.0127410888671875,
0.00046896934509277344,
-0.0306396484375,
0.04132080078125,
0.00978851318359375,
0.01317596435546875,
0.01226043701171875,
-0.0270233154296875,
-0.0709228515625,
0.08154296875,
0.043304443359375,
-0.04437255859375,
-0.0013866424560546875,
0.0095977783203125,
0.049560546875,
-0.0110321044921875,
0.052337646484375,
0.04510498046875,
0.0221710205078125,
-0.0025005340576171875,
-0.0682373046875,
-0.0011548995971679688,
-0.04736328125,
0.00409698486328125,
0.0020771026611328125,
-0.06683349609375,
0.08489990234375,
-0.004558563232421875,
-0.004901885986328125,
-0.0140838623046875,
0.04046630859375,
0.01334381103515625,
0.01242828369140625,
0.0300140380859375,
0.052703857421875,
0.04595947265625,
-0.0234375,
0.08477783203125,
-0.036163330078125,
0.0386962890625,
0.08453369140625,
-0.0008153915405273438,
0.07635498046875,
0.037200927734375,
-0.0421142578125,
0.0391845703125,
0.057220458984375,
-0.00212860107421875,
0.0380859375,
0.0092620849609375,
-0.014007568359375,
-0.005054473876953125,
-0.0196380615234375,
-0.03302001953125,
0.0316162109375,
0.00957489013671875,
-0.037628173828125,
0.01255035400390625,
0.0032062530517578125,
0.0028972625732421875,
-0.01111602783203125,
-0.019744873046875,
0.055450439453125,
0.0117950439453125,
-0.053466796875,
0.05389404296875,
0.0142669677734375,
0.06842041015625,
-0.041351318359375,
0.0152435302734375,
-0.019439697265625,
0.027801513671875,
-0.01922607421875,
-0.048187255859375,
0.0305938720703125,
0.005767822265625,
-0.03948974609375,
-0.00937652587890625,
0.045623779296875,
-0.0289306640625,
-0.06646728515625,
0.01531982421875,
0.0197296142578125,
0.016815185546875,
-0.021697998046875,
-0.07122802734375,
0.0127105712890625,
0.0144500732421875,
-0.02642822265625,
0.01727294921875,
0.0189361572265625,
0.0016450881958007812,
0.0306396484375,
0.035430908203125,
-0.0021419525146484375,
0.006191253662109375,
0.0017881393432617188,
0.060150146484375,
-0.0369873046875,
-0.03619384765625,
-0.06854248046875,
0.0615234375,
-0.01136016845703125,
-0.03955078125,
0.051788330078125,
0.05126953125,
0.0780029296875,
-0.004817962646484375,
0.069580078125,
-0.0094451904296875,
0.02642822265625,
-0.031768798828125,
0.054351806640625,
-0.044708251953125,
0.005992889404296875,
-0.0188140869140625,
-0.0711669921875,
-0.007396697998046875,
0.05938720703125,
-0.01959228515625,
0.0218048095703125,
0.056640625,
0.0582275390625,
-0.00872802734375,
-0.003376007080078125,
0.0159454345703125,
0.035400390625,
-0.0038852691650390625,
0.038726806640625,
0.054718017578125,
-0.0618896484375,
0.037017822265625,
-0.046600341796875,
-0.0196533203125,
-0.01154327392578125,
-0.054718017578125,
-0.05487060546875,
-0.0447998046875,
-0.0443115234375,
-0.04217529296875,
0.00013065338134765625,
0.0845947265625,
0.050689697265625,
-0.059478759765625,
-0.019989013671875,
-0.006763458251953125,
-0.00543975830078125,
-0.0155029296875,
-0.0193023681640625,
0.02490234375,
0.0031261444091796875,
-0.053314208984375,
0.0011129379272460938,
0.005809783935546875,
0.00939178466796875,
-0.038543701171875,
-0.016632080078125,
-0.0174713134765625,
-0.0016880035400390625,
0.04278564453125,
0.0023746490478515625,
-0.056182861328125,
0.0052490234375,
0.0077667236328125,
-0.005767822265625,
0.00330352783203125,
0.0308990478515625,
-0.04315185546875,
0.036102294921875,
0.04254150390625,
0.0279693603515625,
0.03204345703125,
-0.00881195068359375,
0.0255126953125,
-0.059356689453125,
0.0267333984375,
0.026397705078125,
0.03436279296875,
0.0301971435546875,
-0.042999267578125,
0.03485107421875,
0.01934814453125,
-0.045654296875,
-0.06976318359375,
-0.0048065185546875,
-0.0816650390625,
-0.010040283203125,
0.09954833984375,
-0.006374359130859375,
-0.0188446044921875,
0.01549530029296875,
-0.0122528076171875,
0.03997802734375,
-0.035858154296875,
0.049468994140625,
0.058441162109375,
0.005306243896484375,
-0.0055999755859375,
-0.0523681640625,
0.042724609375,
0.03387451171875,
-0.04217529296875,
-0.0090484619140625,
0.031280517578125,
0.023101806640625,
0.01200103759765625,
0.041900634765625,
-0.0063323974609375,
0.0214080810546875,
0.0021343231201171875,
0.03204345703125,
-0.0080413818359375,
-0.0209503173828125,
-0.035369873046875,
-0.005229949951171875,
0.0118408203125,
-0.013916015625
]
] |
laion/clap-htsat-fused | 2023-04-24T14:40:12.000Z | [
"transformers",
"pytorch",
"clap",
"feature-extraction",
"arxiv:2211.06687",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | feature-extraction | laion | null | null | laion/clap-htsat-fused | 10 | 15,019 | transformers | 2023-02-16T20:45:11 | ---
license: apache-2.0
---
# Model card for CLAP
Model card for CLAP: Contrastive Language-Audio Pretraining

# Table of Contents
0. [TL;DR](#TL;DR)
1. [Model Details](#model-details)
2. [Usage](#usage)
3. [Uses](#uses)
4. [Citation](#citation)
# TL;DR
The abstract of the paper states that:
> Contrastive learning has shown remarkable success in the field of multimodal representation learning. In this paper, we propose a pipeline of contrastive language-audio pretraining to develop an audio representation by combining audio data with natural language descriptions. To accomplish this target, we first release LAION-Audio-630K, a large collection of 633,526 audio-text pairs from different data sources. Second, we construct a contrastive language-audio pretraining model by considering different audio encoders and text encoders. We incorporate the feature fusion mechanism and keyword-to-caption augmentation into the model design to further enable the model to process audio inputs of variable lengths and enhance the performance. Third, we perform comprehensive experiments to evaluate our model across three tasks: text-to-audio retrieval, zero-shot audio classification, and supervised audio classification. The results demonstrate that our model achieves superior performance in text-to-audio retrieval task. In audio classification tasks, the model achieves state-of-the-art performance in the zero-shot setting and is able to obtain performance comparable to models' results in the non-zero-shot setting. LAION-Audio-630K and the proposed model are both available to the public.
# Usage
You can use this model for zero shot audio classification or extracting audio and/or textual features.
# Uses
## Perform zero-shot audio classification
### Using `pipeline`
```python
from datasets import load_dataset
from transformers import pipeline
dataset = load_dataset("ashraq/esc50")
audio = dataset["train"]["audio"][-1]["array"]
audio_classifier = pipeline(task="zero-shot-audio-classification", model="laion/clap-htsat-fused")
output = audio_classifier(audio, candidate_labels=["Sound of a dog", "Sound of vaccum cleaner"])
print(output)
>>> [{"score": 0.999, "label": "Sound of a dog"}, {"score": 0.001, "label": "Sound of vaccum cleaner"}]
```
## Run the model:
You can also get the audio and text embeddings using `ClapModel`
### Run the model on CPU:
```python
from datasets import load_dataset
from transformers import ClapModel, ClapProcessor
librispeech_dummy = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
audio_sample = librispeech_dummy[0]
model = ClapModel.from_pretrained("laion/clap-htsat-fused")
processor = ClapProcessor.from_pretrained("laion/clap-htsat-fused")
inputs = processor(audios=audio_sample["audio"]["array"], return_tensors="pt")
audio_embed = model.get_audio_features(**inputs)
```
### Run the model on GPU:
```python
from datasets import load_dataset
from transformers import ClapModel, ClapProcessor
librispeech_dummy = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
audio_sample = librispeech_dummy[0]
model = ClapModel.from_pretrained("laion/clap-htsat-fused").to(0)
processor = ClapProcessor.from_pretrained("laion/clap-htsat-fused")
inputs = processor(audios=audio_sample["audio"]["array"], return_tensors="pt").to(0)
audio_embed = model.get_audio_features(**inputs)
```
# Citation
If you are using this model for your work, please consider citing the original paper:
```
@misc{https://doi.org/10.48550/arxiv.2211.06687,
doi = {10.48550/ARXIV.2211.06687},
url = {https://arxiv.org/abs/2211.06687},
author = {Wu, Yusong and Chen, Ke and Zhang, Tianyu and Hui, Yuchen and Berg-Kirkpatrick, Taylor and Dubnov, Shlomo},
keywords = {Sound (cs.SD), Audio and Speech Processing (eess.AS), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Electrical engineering, electronic engineering, information engineering, FOS: Electrical engineering, electronic engineering, information engineering},
title = {Large-scale Contrastive Language-Audio Pretraining with Feature Fusion and Keyword-to-Caption Augmentation},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 4,444 | [
[
-0.041168212890625,
-0.04443359375,
0.0200653076171875,
0.0031108856201171875,
-0.019317626953125,
-0.01416778564453125,
-0.0256500244140625,
-0.020111083984375,
0.005672454833984375,
0.01702880859375,
-0.03961181640625,
-0.0423583984375,
-0.048980712890625,
-0.01511383056640625,
-0.032684326171875,
0.07269287109375,
0.01416778564453125,
-0.0006151199340820312,
-0.0048675537109375,
-0.0175018310546875,
-0.032073974609375,
-0.031585693359375,
-0.035247802734375,
-0.036651611328125,
0.011444091796875,
0.0118560791015625,
0.0296478271484375,
0.0419921875,
0.0204010009765625,
0.029296875,
-0.014739990234375,
-0.0259857177734375,
-0.0132293701171875,
-0.004558563232421875,
0.013824462890625,
-0.039154052734375,
-0.04876708984375,
0.01593017578125,
0.04595947265625,
0.03271484375,
-0.007175445556640625,
0.0291748046875,
0.01451873779296875,
0.007358551025390625,
-0.04400634765625,
0.0152740478515625,
-0.034332275390625,
8.940696716308594e-7,
-0.008056640625,
-0.0246124267578125,
-0.0214996337890625,
0.006549835205078125,
0.00685882568359375,
-0.04571533203125,
0.0296478271484375,
-0.0087432861328125,
0.09075927734375,
0.0384521484375,
-0.004856109619140625,
-0.00699615478515625,
-0.054168701171875,
0.06427001953125,
-0.07867431640625,
0.053131103515625,
0.02947998046875,
0.014068603515625,
0.0224609375,
-0.060699462890625,
-0.062347412109375,
-0.01071929931640625,
0.00855255126953125,
0.036102294921875,
-0.010955810546875,
0.00931549072265625,
0.0265350341796875,
0.041595458984375,
-0.03765869140625,
0.011260986328125,
-0.0269622802734375,
-0.01305389404296875,
0.03448486328125,
-0.01314544677734375,
0.0186309814453125,
-0.020721435546875,
-0.0311279296875,
-0.0537109375,
-0.0286865234375,
0.01157379150390625,
0.03350830078125,
0.0246124267578125,
-0.05426025390625,
0.0255889892578125,
0.0038852691650390625,
0.03546142578125,
0.01435089111328125,
-0.033111572265625,
0.05816650390625,
-0.02117919921875,
-0.01513671875,
0.022796630859375,
0.0836181640625,
0.002460479736328125,
-0.00013828277587890625,
0.0200042724609375,
-0.0242919921875,
0.01953125,
-0.00362396240234375,
-0.052215576171875,
-0.020416259765625,
0.029388427734375,
-0.0228118896484375,
-0.009796142578125,
-0.0015544891357421875,
-0.04522705078125,
0.0012693405151367188,
-0.03399658203125,
0.072021484375,
-0.043731689453125,
-0.0301513671875,
0.01264190673828125,
-0.0207061767578125,
0.01459503173828125,
-0.0130462646484375,
-0.06585693359375,
0.0120086669921875,
0.03192138671875,
0.053497314453125,
0.007659912109375,
-0.02752685546875,
-0.043121337890625,
0.0098419189453125,
0.0014781951904296875,
0.042236328125,
-0.025848388671875,
-0.022979736328125,
-0.005870819091796875,
-0.01255035400390625,
-0.0106048583984375,
-0.04998779296875,
0.052276611328125,
-0.0090179443359375,
0.0270843505859375,
0.0162506103515625,
-0.03619384765625,
-0.017364501953125,
-0.00746917724609375,
-0.038665771484375,
0.07269287109375,
-0.00821685791015625,
-0.07012939453125,
0.0183563232421875,
-0.04998779296875,
-0.0276641845703125,
-0.019500732421875,
-0.01227569580078125,
-0.0487060546875,
-0.00476837158203125,
0.034332275390625,
0.034271240234375,
-0.0246429443359375,
0.0288238525390625,
-0.0237579345703125,
-0.050140380859375,
0.017486572265625,
-0.046234130859375,
0.06573486328125,
0.0129547119140625,
-0.04998779296875,
0.02008056640625,
-0.06439208984375,
-0.0179901123046875,
0.007076263427734375,
-0.01708984375,
0.0037593841552734375,
-0.010284423828125,
0.01593017578125,
0.0182952880859375,
0.00591278076171875,
-0.0282745361328125,
-0.033355712890625,
-0.03704833984375,
0.03192138671875,
0.043426513671875,
-0.0277557373046875,
0.024749755859375,
-0.036956787109375,
0.03399658203125,
0.0117340087890625,
0.0013170242309570312,
-0.03875732421875,
-0.0223236083984375,
-0.06890869140625,
-0.039825439453125,
0.042938232421875,
0.04730224609375,
-0.03729248046875,
0.05828857421875,
-0.0230255126953125,
-0.038421630859375,
-0.08514404296875,
-0.00899505615234375,
0.033721923828125,
0.047119140625,
0.05560302734375,
-0.023345947265625,
-0.04974365234375,
-0.061370849609375,
0.004749298095703125,
-0.01342010498046875,
-0.028106689453125,
0.03570556640625,
0.0252685546875,
-0.028228759765625,
0.053802490234375,
-0.0285797119140625,
-0.0401611328125,
-0.01078033447265625,
0.033416748046875,
0.050811767578125,
0.055999755859375,
0.0283050537109375,
-0.044647216796875,
-0.01953125,
-0.0183563232421875,
-0.058349609375,
-0.0195159912109375,
-0.0025119781494140625,
0.015838623046875,
-0.007396697998046875,
0.038177490234375,
-0.051513671875,
0.0114593505859375,
0.039764404296875,
0.0028972625732421875,
0.050018310546875,
-0.0027103424072265625,
0.01436614990234375,
-0.08026123046875,
0.01290130615234375,
-0.00519561767578125,
-0.0031585693359375,
-0.038360595703125,
-0.0167999267578125,
-0.0152740478515625,
-0.0232696533203125,
-0.04693603515625,
0.0433349609375,
-0.0281982421875,
-0.0028076171875,
-0.01959228515625,
0.02850341796875,
-0.008880615234375,
0.055328369140625,
0.01178741455078125,
0.0511474609375,
0.07025146484375,
-0.056488037109375,
0.01108551025390625,
0.0257415771484375,
-0.021881103515625,
0.03802490234375,
-0.0655517578125,
0.01386260986328125,
-0.0056915283203125,
0.01611328125,
-0.079345703125,
-0.006561279296875,
0.005397796630859375,
-0.058380126953125,
0.04833984375,
-0.004302978515625,
-0.038482666015625,
-0.0231170654296875,
-0.012847900390625,
0.047027587890625,
0.053009033203125,
-0.047119140625,
0.051422119140625,
0.039154052734375,
-0.0186309814453125,
-0.04638671875,
-0.041778564453125,
-0.034027099609375,
-0.0258941650390625,
-0.038970947265625,
0.041015625,
-0.00600433349609375,
0.00884246826171875,
-0.00820159912109375,
-0.01380157470703125,
0.008697509765625,
-0.0128936767578125,
0.0291748046875,
0.0252532958984375,
-0.0287933349609375,
0.0110321044921875,
-0.0085296630859375,
-0.019866943359375,
-0.0011243820190429688,
-0.00365447998046875,
0.06109619140625,
-0.01163482666015625,
-0.0237274169921875,
-0.058074951171875,
0.006725311279296875,
0.026763916015625,
-0.0279998779296875,
0.02252197265625,
0.08526611328125,
-0.0038909912109375,
0.006290435791015625,
-0.046234130859375,
-0.0301513671875,
-0.041046142578125,
0.04534912109375,
-0.019622802734375,
-0.05419921875,
0.03399658203125,
-0.00007414817810058594,
-0.00624847412109375,
0.043426513671875,
0.03228759765625,
-0.019866943359375,
0.07489013671875,
0.036712646484375,
-0.004810333251953125,
0.04443359375,
-0.04754638671875,
0.0018358230590820312,
-0.067626953125,
-0.00315093994140625,
-0.0411376953125,
-0.0220794677734375,
-0.04144287109375,
-0.0340576171875,
0.02520751953125,
0.01178741455078125,
-0.032684326171875,
0.0287017822265625,
-0.030670166015625,
0.003437042236328125,
0.050140380859375,
0.0013227462768554688,
-0.00250244140625,
-0.0007781982421875,
-0.0222015380859375,
-0.0071868896484375,
-0.050994873046875,
-0.0198211669921875,
0.07244873046875,
0.044158935546875,
0.044647216796875,
-0.0086517333984375,
0.06304931640625,
-0.0019025802612304688,
0.005672454833984375,
-0.052520751953125,
0.034271240234375,
-0.0157012939453125,
-0.042236328125,
-0.0166168212890625,
-0.024688720703125,
-0.06005859375,
0.0249481201171875,
-0.0283355712890625,
-0.04156494140625,
0.00815582275390625,
0.0115509033203125,
-0.0276947021484375,
0.0227203369140625,
-0.056854248046875,
0.058074951171875,
-0.0122833251953125,
-0.0231781005859375,
-0.003570556640625,
-0.041839599609375,
0.01861572265625,
0.00592041015625,
0.0279998779296875,
-0.0024166107177734375,
0.02337646484375,
0.08233642578125,
0.0016632080078125,
0.05828857421875,
-0.0213775634765625,
0.0019140243530273438,
0.02630615234375,
-0.0236663818359375,
0.01059722900390625,
-0.0095672607421875,
0.0037555694580078125,
0.0301361083984375,
-0.005008697509765625,
-0.012481689453125,
-0.03131103515625,
0.03826904296875,
-0.0718994140625,
-0.04638671875,
-0.01947021484375,
-0.046142578125,
-0.0230560302734375,
0.01323699951171875,
0.041412353515625,
0.03582763671875,
0.004306793212890625,
0.0273590087890625,
0.04522705078125,
-0.03253173828125,
0.050628662109375,
0.026702880859375,
-0.012908935546875,
-0.04681396484375,
0.07525634765625,
-0.0018281936645507812,
0.01436614990234375,
0.0265960693359375,
0.0182037353515625,
-0.04315185546875,
-0.0302276611328125,
-0.00498199462890625,
0.035980224609375,
-0.040985107421875,
-0.00726318359375,
-0.052581787109375,
-0.02703857421875,
-0.0445556640625,
0.007495880126953125,
-0.05133056640625,
-0.01322174072265625,
-0.040130615234375,
-0.00299835205078125,
0.004802703857421875,
0.0168304443359375,
-0.016510009765625,
0.047515869140625,
-0.06146240234375,
0.045745849609375,
0.0271759033203125,
0.0211639404296875,
-0.00040268898010253906,
-0.08331298828125,
-0.0160675048828125,
0.00429534912109375,
-0.038543701171875,
-0.05889892578125,
0.033782958984375,
0.013092041015625,
0.053314208984375,
0.041595458984375,
-0.00417327880859375,
0.055877685546875,
-0.031463623046875,
0.058868408203125,
0.0382080078125,
-0.08026123046875,
0.047332763671875,
-0.032012939453125,
0.043426513671875,
0.0322265625,
0.032867431640625,
-0.015289306640625,
-0.020538330078125,
-0.0682373046875,
-0.07659912109375,
0.06317138671875,
0.019989013671875,
0.01010894775390625,
0.01459503173828125,
-0.0135955810546875,
-0.00147247314453125,
0.0187835693359375,
-0.062469482421875,
-0.030731201171875,
-0.041046142578125,
-0.0157623291015625,
-0.029144287109375,
-0.0095062255859375,
-0.00589752197265625,
-0.042938232421875,
0.06304931640625,
-0.0005903244018554688,
0.04620361328125,
0.01242828369140625,
-0.00969696044921875,
0.0165863037109375,
0.031494140625,
0.0391845703125,
0.00775909423828125,
-0.0362548828125,
0.01300048828125,
0.0132293701171875,
-0.036865234375,
0.01160430908203125,
0.0205841064453125,
0.0016279220581054688,
0.013580322265625,
0.038543701171875,
0.1011962890625,
0.036407470703125,
-0.04095458984375,
0.04302978515625,
0.00691986083984375,
-0.0242767333984375,
-0.0330810546875,
0.0063629150390625,
0.021575927734375,
0.018524169921875,
0.023956298828125,
0.00012445449829101562,
0.0011920928955078125,
-0.03826904296875,
0.0258636474609375,
0.015899658203125,
-0.047821044921875,
-0.031982421875,
0.0787353515625,
0.0014200210571289062,
-0.025054931640625,
0.043212890625,
0.0121002197265625,
-0.0272369384765625,
0.032440185546875,
0.055145263671875,
0.07611083984375,
-0.0330810546875,
0.00424957275390625,
0.050018310546875,
-0.0024509429931640625,
0.001873016357421875,
0.01468658447265625,
-0.026580810546875,
-0.046234130859375,
-0.0273895263671875,
-0.06439208984375,
-0.01285552978515625,
0.0325927734375,
-0.051177978515625,
0.035980224609375,
-0.0264434814453125,
-0.03277587890625,
0.0095977783203125,
-0.0159912109375,
-0.058380126953125,
0.024993896484375,
0.0269012451171875,
0.04248046875,
-0.067626953125,
0.06292724609375,
0.037872314453125,
-0.05419921875,
-0.07318115234375,
-0.0010290145874023438,
-0.013671875,
-0.0330810546875,
0.042877197265625,
0.025146484375,
-0.00852203369140625,
0.0017490386962890625,
-0.039581298828125,
-0.0562744140625,
0.07757568359375,
0.04461669921875,
-0.050201416015625,
0.0081787109375,
0.00835418701171875,
0.040679931640625,
-0.0279388427734375,
0.043548583984375,
0.037384033203125,
0.03460693359375,
0.0099334716796875,
-0.08367919921875,
-0.0140838623046875,
-0.03131103515625,
-0.0264892578125,
-0.0146942138671875,
-0.0390625,
0.0906982421875,
-0.01395416259765625,
-0.01230621337890625,
-0.0172271728515625,
0.05670166015625,
0.051177978515625,
0.0227203369140625,
0.055206298828125,
0.054962158203125,
0.044769287109375,
-0.0134735107421875,
0.07275390625,
-0.016143798828125,
0.0269622802734375,
0.09088134765625,
0.0003266334533691406,
0.08026123046875,
0.03173828125,
-0.0307769775390625,
0.03570556640625,
0.022979736328125,
-0.00989532470703125,
0.041717529296875,
-0.0008344650268554688,
-0.01605224609375,
-0.0239715576171875,
-0.01263427734375,
-0.04266357421875,
0.05828857421875,
0.01238250732421875,
-0.0262908935546875,
0.0122528076171875,
0.0102081298828125,
-0.0108795166015625,
-0.00856781005859375,
-0.002971649169921875,
0.043609619140625,
0.0230560302734375,
-0.028076171875,
0.07012939453125,
-0.0208892822265625,
0.08544921875,
-0.0384521484375,
-0.002452850341796875,
0.005214691162109375,
0.0150909423828125,
-0.0146484375,
-0.04351806640625,
0.00018608570098876953,
-0.01551055908203125,
-0.0186309814453125,
0.0017175674438476562,
0.0273590087890625,
-0.04693603515625,
-0.020263671875,
0.049041748046875,
0.01247406005859375,
0.0138702392578125,
-0.004985809326171875,
-0.055908203125,
0.01432037353515625,
-0.0020923614501953125,
-0.00742340087890625,
0.0206146240234375,
0.00617218017578125,
0.0284423828125,
0.0300445556640625,
0.054962158203125,
0.0149688720703125,
0.01265716552734375,
0.0238494873046875,
0.04632568359375,
-0.052215576171875,
-0.055908203125,
-0.04144287109375,
0.052093505859375,
0.003932952880859375,
-0.01213836669921875,
0.04052734375,
0.045562744140625,
0.06787109375,
-0.0132293701171875,
0.052520751953125,
0.01125335693359375,
0.0465087890625,
-0.03472900390625,
0.0655517578125,
-0.05377197265625,
0.029693603515625,
-0.04022216796875,
-0.071044921875,
-0.00524139404296875,
0.04638671875,
-0.01045989990234375,
0.0192108154296875,
0.04376220703125,
0.0670166015625,
-0.0285491943359375,
-0.0033512115478515625,
0.0267486572265625,
0.031982421875,
0.020904541015625,
0.0300750732421875,
0.04583740234375,
-0.068359375,
0.047119140625,
-0.0330810546875,
-0.00742340087890625,
0.00044918060302734375,
-0.03985595703125,
-0.0596923828125,
-0.05755615234375,
-0.03753662109375,
-0.0175323486328125,
-0.01488494873046875,
0.047332763671875,
0.06414794921875,
-0.07037353515625,
-0.03271484375,
0.0205230712890625,
-0.00203704833984375,
-0.0251007080078125,
-0.018890380859375,
0.04791259765625,
-0.01285552978515625,
-0.0694580078125,
0.04364013671875,
-0.0008172988891601562,
0.0149383544921875,
-0.0007491111755371094,
-0.025787353515625,
-0.03826904296875,
-0.0007748603820800781,
0.03411865234375,
0.0186920166015625,
-0.050140380859375,
-0.0125732421875,
-0.0140228271484375,
0.009002685546875,
0.0228729248046875,
0.0279998779296875,
-0.05767822265625,
0.03656005859375,
0.041717529296875,
0.02716064453125,
0.06695556640625,
-0.0118560791015625,
0.025543212890625,
-0.05877685546875,
0.040557861328125,
0.01551055908203125,
0.0291900634765625,
0.037628173828125,
0.01015472412109375,
0.0171051025390625,
0.02349853515625,
-0.0310821533203125,
-0.0721435546875,
-0.0159912109375,
-0.09271240234375,
-0.01352691650390625,
0.08807373046875,
-0.0008950233459472656,
-0.017822265625,
0.0016908645629882812,
-0.01541900634765625,
0.06256103515625,
-0.025299072265625,
0.04156494140625,
0.026458740234375,
0.007656097412109375,
-0.02777099609375,
-0.044342041015625,
0.03387451171875,
0.04364013671875,
-0.0302734375,
-0.02215576171875,
0.01110076904296875,
0.0292510986328125,
0.037384033203125,
0.04931640625,
-0.0218963623046875,
0.02001953125,
0.02606201171875,
0.032379150390625,
-0.015655517578125,
-0.019134521484375,
-0.03472900390625,
0.022369384765625,
-0.0184326171875,
-0.036163330078125
]
] |
TheBloke/Airoboros-L2-70B-3.1.2-AWQ | 2023-10-22T11:48:29.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"dataset:jondurbin/airoboros-3.1",
"license:llama2",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Airoboros-L2-70B-3.1.2-AWQ | 1 | 15,005 | transformers | 2023-10-21T12:11:56 | ---
base_model: jondurbin/airoboros-l2-70b-3.1.2
datasets:
- jondurbin/airoboros-3.1
inference: false
license: llama2
model_creator: Jon Durbin
model_name: Airoboros L2 70B 3.1.2
model_type: llama
prompt_template: '[INST] <<SYS>>
You are a helpful, unbiased, uncensored assistant.
<</SYS>>
{prompt} [/INST]
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Airoboros L2 70B 3.1.2 - AWQ
- Model creator: [Jon Durbin](https://huggingface.co/jondurbin)
- Original model: [Airoboros L2 70B 3.1.2](https://huggingface.co/jondurbin/airoboros-l2-70b-3.1.2)
<!-- description start -->
## Description
This repo contains AWQ model files for [Jon Durbin's Airoboros L2 70B 3.1.2](https://huggingface.co/jondurbin/airoboros-l2-70b-3.1.2).
### About AWQ
AWQ is an efficient, accurate and blazing-fast low-bit weight quantization method, currently supporting 4-bit quantization. Compared to GPTQ, it offers faster Transformers-based inference with equivalent or better quality compared to the most commonly used GPTQ settings.
It is supported by:
- [Text Generation Webui](https://github.com/oobabooga/text-generation-webui) - using Loader: AutoAWQ
- [vLLM](https://github.com/vllm-project/vllm) - Llama and Mistral models only
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) - for use from Python code
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Airoboros-L2-70B-3.1.2-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Airoboros-L2-70B-3.1.2-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Airoboros-L2-70B-3.1.2-GGUF)
* [Jon Durbin's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/jondurbin/airoboros-l2-70b-3.1.2)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Airoboros-Llama-2-Chat
```
[INST] <<SYS>>
You are a helpful, unbiased, uncensored assistant.
<</SYS>>
{prompt} [/INST]
```
<!-- prompt-template end -->
<!-- README_AWQ.md-provided-files start -->
## Provided files, and AWQ parameters
For my first release of AWQ models, I am releasing 128g models only. I will consider adding 32g as well if there is interest, and once I have done perplexity and evaluation comparisons, but at this time 32g models are still not fully tested with AutoAWQ and vLLM.
Models are released as sharded safetensors files.
| Branch | Bits | GS | AWQ Dataset | Seq Len | Size |
| ------ | ---- | -- | ----------- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Airoboros-L2-70B-3.1.2-AWQ/tree/main) | 4 | 128 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 36.61 GB
<!-- README_AWQ.md-provided-files end -->
<!-- README_AWQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Airoboros-L2-70B-3.1.2-AWQ`.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Airoboros-L2-70B-3.1.2-AWQ`
7. Select **Loader: AutoAWQ**.
8. Click Load, and the model will load and is now ready for use.
9. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
10. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_AWQ.md-text-generation-webui end -->
<!-- README_AWQ.md-use-from-vllm start -->
## Multi-user inference server: vLLM
Documentation on installing and using vLLM [can be found here](https://vllm.readthedocs.io/en/latest/).
- Please ensure you are using vLLM version 0.2 or later.
- When using vLLM as a server, pass the `--quantization awq` parameter.
For example:
```shell
python3 python -m vllm.entrypoints.api_server --model TheBloke/Airoboros-L2-70B-3.1.2-AWQ --quantization awq
```
- When using vLLM from Python code, again set `quantization=awq`.
For example:
```python
from vllm import LLM, SamplingParams
prompts = [
"Tell me about AI",
"Write a story about llamas",
"What is 291 - 150?",
"How much wood would a woodchuck chuck if a woodchuck could chuck wood?",
]
prompt_template=f'''[INST] <<SYS>>
You are a helpful, unbiased, uncensored assistant.
<</SYS>>
{prompt} [/INST]
'''
prompts = [prompt_template.format(prompt=prompt) for prompt in prompts]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
llm = LLM(model="TheBloke/Airoboros-L2-70B-3.1.2-AWQ", quantization="awq", dtype="auto")
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output in outputs:
prompt = output.prompt
generated_text = output.outputs[0].text
print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
```
<!-- README_AWQ.md-use-from-vllm start -->
<!-- README_AWQ.md-use-from-tgi start -->
## Multi-user inference server: Hugging Face Text Generation Inference (TGI)
Use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/Airoboros-L2-70B-3.1.2-AWQ --port 3000 --quantize awq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires [huggingface-hub](https://github.com/huggingface/huggingface_hub) 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''[INST] <<SYS>>
You are a helpful, unbiased, uncensored assistant.
<</SYS>>
{prompt} [/INST]
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: ", response)
```
<!-- README_AWQ.md-use-from-tgi end -->
<!-- README_AWQ.md-use-from-python start -->
## Inference from Python code using AutoAWQ
### Install the AutoAWQ package
Requires: [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) 0.1.1 or later.
```shell
pip3 install autoawq
```
If you have problems installing [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y autoawq
git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip3 install .
```
### AutoAWQ example code
```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer
model_name_or_path = "TheBloke/Airoboros-L2-70B-3.1.2-AWQ"
# Load tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=False)
# Load model
model = AutoAWQForCausalLM.from_quantized(model_name_or_path, fuse_layers=True,
trust_remote_code=False, safetensors=True)
prompt = "Tell me about AI"
prompt_template=f'''[INST] <<SYS>>
You are a helpful, unbiased, uncensored assistant.
<</SYS>>
{prompt} [/INST]
'''
print("*** Running model.generate:")
token_input = tokenizer(
prompt_template,
return_tensors='pt'
).input_ids.cuda()
# Generate output
generation_output = model.generate(
token_input,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
max_new_tokens=512
)
# Get the tokens from the output, decode them, print them
token_output = generation_output[0]
text_output = tokenizer.decode(token_output)
print("LLM output: ", text_output)
"""
# Inference should be possible with transformers pipeline as well in future
# But currently this is not yet supported by AutoAWQ (correct as of September 25th 2023)
from transformers import pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
"""
```
<!-- README_AWQ.md-use-from-python end -->
<!-- README_AWQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with:
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui) using `Loader: AutoAWQ`.
- [vLLM](https://github.com/vllm-project/vllm) version 0.2.0 and later.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) version 1.1.0 and later.
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) version 0.1.1 and later.
<!-- README_AWQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](llm-utils)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Pierre Kircher, Stanislav Ovsiannikov, Michael Levine, Eugene Pentland, Andrey, 준교 김, Randy H, Fred von Graf, Artur Olbinski, Caitlyn Gatomon, terasurfer, Jeff Scroggin, James Bentley, Vadim, Gabriel Puliatti, Harry Royden McLaughlin, Sean Connelly, Dan Guido, Edmond Seymore, Alicia Loh, subjectnull, AzureBlack, Manuel Alberto Morcote, Thomas Belote, Lone Striker, Chris Smitley, Vitor Caleffi, Johann-Peter Hartmann, Clay Pascal, biorpg, Brandon Frisco, sidney chen, transmissions 11, Pedro Madruga, jinyuan sun, Ajan Kanaga, Emad Mostaque, Trenton Dambrowitz, Jonathan Leane, Iucharbius, usrbinkat, vamX, George Stoitzev, Luke Pendergrass, theTransient, Olakabola, Swaroop Kallakuri, Cap'n Zoog, Brandon Phillips, Michael Dempsey, Nikolai Manek, danny, Matthew Berman, Gabriel Tamborski, alfie_i, Raymond Fosdick, Tom X Nguyen, Raven Klaugh, LangChain4j, Magnesian, Illia Dulskyi, David Ziegler, Mano Prime, Luis Javier Navarrete Lozano, Erik Bjäreholt, 阿明, Nathan Dryer, Alex, Rainer Wilmers, zynix, TL, Joseph William Delisle, John Villwock, Nathan LeClaire, Willem Michiel, Joguhyik, GodLy, OG, Alps Aficionado, Jeffrey Morgan, ReadyPlayerEmma, Tiffany J. Kim, Sebastain Graf, Spencer Kim, Michael Davis, webtim, Talal Aujan, knownsqashed, John Detwiler, Imad Khwaja, Deo Leter, Jerry Meng, Elijah Stavena, Rooh Singh, Pieter, SuperWojo, Alexandros Triantafyllidis, Stephen Murray, Ai Maven, ya boyyy, Enrico Ros, Ken Nordquist, Deep Realms, Nicholas, Spiking Neurons AB, Elle, Will Dee, Jack West, RoA, Luke @flexchar, Viktor Bowallius, Derek Yates, Subspace Studios, jjj, Toran Billups, Asp the Wyvern, Fen Risland, Ilya, NimbleBox.ai, Chadd, Nitin Borwankar, Emre, Mandus, Leonard Tan, Kalila, K, Trailburnt, S_X, Cory Kujawski
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Jon Durbin's Airoboros L2 70B 3.1.2
### Overview
Another experimental model, using mostly sythetic data generated by [airoboros](https://github.com/jondurbin/airoboros)
#### IMPORTANT NOTE - llama-2 chat format!!!
This models uses llama-2 chat format, rather than vicuna style user/assistant!
This is a breaking change, although most inference systems support llama-2 chat templating.
See prompt format below for all of the details.
#### Highlights
A model built on [airoboros-3.1 dataset](https://hf.co/datasets/jondurbin/airoboros-3.1), which is a continuation of [airoboros-3.0 dataset](https://hf.co/datasets/jondurbin/airoboros-3,0) with the following extras:
* More MathJSON, now ~17k items - math questions, prefixed with __"Create a MathJSON solution to the following:"__, which then outputs a JSON between __`<mathjson>`__ and __`</mathjson>`__ tags, which can be parsed and passed to a deterministic library to perform calculations.
* Log information extraction (e.g. getting the list of paths/IPs/etc. from apache logs)
* Anonymization, e.g. removing names, IP addresses, and/or dates from text.
* Chat introspection - multi-turn chats that have random questions injected asking about the character's descriptions, the setting of the chat, or things previously said, to make sure the model uses the system prompt and or chat history.
* Multi-step instructions with acknowledgement (see below)
* De-censorship data (not published)
This is a fairly general purpose model, but focuses heavily on instruction following, rather than casual chat/roleplay.
Huge thank you to the folks over at [a16z](https://a16z.com/) for sponsoring the costs associated with building models and associated tools!
### Prompt format
The prompt format is llama-2 chat.
```
[INST] <<SYS>>
You are a helpful, unbiased, uncensored assistant.
<</SYS>>
{prompt} [/INST]
```
For multi-turn, the prompt format is as follows:
```
[INST] <<SYS>>
You are a helful, unbiased, uncensored assistant.
<</SYS>>
{prompt 0} [/INST] {response 0} </s><s>[INST] {prompt 1} [/INST] {response 1} </s><s>...[INST] {prompt N} [/INST]
```
The prompt template is included in the tokenizer config, and can use the huggingface tokenizer `apply_chat_template` method, e.g.:
```
import transformers
tokenizer = transformers.AutoTokenizer.from_pretrained('jondurbin/airoboros-l2-70b-3.1')
chat = [
{"role": "system", "content": "You are Bob, a friendly AI assistant."},
{"role": "user", "content": "Hello, how are you?"},
{"role": "assistant", "content": "I'm doing great. How can I help you today?"},
{"role": "user", "content": "I'd like to show off how chat templating works!"},
]
print(tokenizer.apply_chat_template(chat, tokenize=False))
```
### Helpful usage tips
#### MathJSON
Ask mathematical questions, prefixed with "Create a MathJSON solution to the following:", and you should get output with a formula that can be executed by https://cortexjs.io/compute-engine/
I also made a (really quite hacky, incomplete, semi-functional) python version that works for most expressions, see [mathjson.py](https://github.com/jondurbin/airoboros/blob/mathjson/airoboros/mathjson.py)
__Use a very low temperature!__
Example:
```
[INST] <<SYS>>
You are an assistant with vast knowledge in all things math.
<</SYS>>
Create a MathJSON solution to the following: Calculate the area of a circle with radius of 17.2456cm. Include your reasoning. [/INST]
```
Output:
```
The area (A) of a circle is given by the formula A = πr², where r is the radius of the circle. Since the radius is given in centimeters, we need to convert it to meters first. We know that 1 meter equals 100 centimeters. Therefore, we can represent the radius as: r = 17.2456 cm * 100/1.
Once we have the radius in meters, we can calculate the area of the circle using the formula A = πr².
Solution as MathJSON:
<mathjson>
[
"Multiply",
"Pi",
[
"Power",
17.2456,
2
]
]
</mathjson>
```
You can then validate the JSON between `<mathjson>` and `</mathjson>`, then pass the parsed JSON to compute-engine JS or the `evaluate` function in mathjson.py to calculate the response.
#### Context obedient question answering
By obedient, I mean the model was trained to ignore what it thinks it knows, and uses the context to answer the question. The model was also tuned to limit the values to the provided context as much as possible to reduce hallucinations.
The format for a closed-context prompt is as follows:
```
BEGININPUT
BEGINCONTEXT
[key0: value0]
[key1: value1]
... other metdata ...
ENDCONTEXT
[insert your text blocks here]
ENDINPUT
[add as many other blocks, in the exact same format]
BEGININSTRUCTION
[insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.]
ENDINSTRUCTION
```
It's also helpful to add "Don't make up answers if you don't know." to your instruction block to make sure if the context is completely unrelated it doesn't make something up.
*The __only__ prompts that need this closed context formating are closed-context instructions. Normal questions/instructions do not!*
I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with it.
- `BEGININPUT` - denotes a new input block
- `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block
- `ENDCONTEXT` - denotes the end of the metadata block for the current input
- [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context.
- `ENDINPUT` - denotes the end of the current input block
- [repeat as many input blocks in this format as you want]
- `BEGININSTRUCTION` - denotes the start of the list (or one) instruction(s) to respond to for all of the input blocks above.
- [instruction(s)]
- `ENDINSTRUCTION` - denotes the end of instruction set
It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to.
__Use a very low temperature!__
Here's a trivial, but important example to prove the point:
```
BEGININPUT
BEGINCONTEXT
date: 2021-01-01
url: https://web.site/123
ENDCONTEXT
In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
ENDINPUT
BEGININSTRUCTION
What color are bluberries? Source?
ENDINSTRUCTION
```
And the response:
```
Blueberries are now green.
Source:
date: 2021-01-01
url: https://web.site/123
```
#### Summarization
500 samples have been included from [this dataset](https://huggingface.co/datasets/mattpscott/airoboros-summarization), using the same format as contextual question answering, for example:
```
BEGININPUT
{text to summarize}
ENDINPUT
BEGININSTRUCTION
Summarize the input in around 130 words.
ENDINSTRUCTION
```
#### Getting longer responses
You can use a few techniques to get longer responses.
Detailed prompts, with explicit instruction for word count:
```
Please compose a narrative set in the heart of an ancient library, steeped in the scent of old parchment and ink. The protagonist should be a young scholar who is dedicated to studying the art of storytelling and its evolution throughout history. In her pursuit of knowledge, she stumbles upon a forgotten tome that seems to possess an unusual aura. This book has the ability to bring stories to life, literally manifesting characters and scenarios from within its pages into reality.
The main character must navigate through various epochs of storytelling - from oral traditions of tribal societies, through medieval minstrels' tales, to modern-day digital narratives - as they come alive around her. Each era presents its unique challenges and lessons about the power and impact of stories on human civilization.
One such character could be a sentient quill pen, who was once used by renowned authors of yesteryears and now holds their wisdom and experiences. It becomes her mentor, guiding her through this journey with witty remarks and insightful commentary.
Ensure that your tale encapsulates the thrill of adventure, the beauty of learning, and the profound connection between humans and their stories. All characters involved should be non-human entities. Feel free to explore creative liberties but maintain the mentioned elements.
Your response should be approximately 2300 words.
```
Or, a simpler example:
```
Please create a long, detailed story about a dragon in an old growth forest who, for some reason, begins speaking the words of the source code of linux.
```
There are a few examples of next chapter completion as well, e.g.:
```
Write the next chapter of a historical fiction novel set in Paris during the 20th century.
Here's a summary of the previous chapter:
In the vibrant city of Paris, amid the tumultuous changes of the 20th century, our protagonist Margot, an aspiring fashion designer, has just secured an apprenticeship at a prestigious couture house. She meets Lucien, a charming journalist who covers the fashion industry. Together they navigate the ever-changing world of fashion and society, uncovering secrets that reveal the intricate links between style, politics, and culture. As the chapter concludes, they decide to delve deeper into the hidden corners of the fashion world to unravel its mysteries.
Requirements for the next chapter:
1. Character Development of Margot and Lucien:
- Margot's Evolution: Unfold more about Margot's past, her dreams of revolutionizing fashion, and her struggle to establish herself in a male-dominated industry. Illustrate her growing expertise, innovative ideas, and increasing dependence on Lucien.
- Lucien's Complexity: Introduce uncertainties surrounding Lucien's background and real motives. Increase suspense by suggesting undisclosed information he possesses, while also highlighting his wit and perceptiveness.
2. Exploration of Paris and the Couture House:
- Paris: Elaborate their journey through the bustling streets of Paris, including encounters with iconic figures, social unrest, and relics from different eras of French history.
- The Couture House: Expand on the grandeur of the couture house they work in, filled with artistic masterpieces, intense competition, and cryptic notes hinting at a scandalous past.
3. Emergence of the Subplot: The Lost Collection:
- Discovery: Have Margot and Lucien stumble upon a secret vault containing a lost collection designed before World War II, raising new questions about the previous owner and the influence of war on fashion.
- Revelation: Capture their shock as they realize the designs were plagiarized, the potential repercussions, and the opportunities it presents for Margot's career.
- Twist: End with a twist that suggests there are other stolen collections across Paris, setting up their new mission.
Your response should be approximately 650 words.
```
#### Coding
You can ask for fairly complex coding instructions with multiple criteria, e.g.:
```
Create a python application with the following requirements:
- Asyncio FastAPI webserver
- ping endpoint that returns the current date in JSON format
- file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate
```
Or inline criteria:
```
Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values.
```
You can also optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc. and just print the code, e.g.:
```
Write a websocket application in node.js. PLAINFORMAT
```
#### Agent/function calling
The dataset includes many examples of function/args generation based on input criteria. This is somewhat similar to the OpenAI function calling, but the output is either JSON or YAML.
Example prompt:
```
As an AI assistant, please select the most suitable function and parameters from the list of available functions below, based on the user's input. Provide your response in JSON format.
Input: I want to know how many times 'Python' is mentioned in my text file.
Available functions:
file_analytics:
description: This tool performs various operations on a text file.
params:
action: The operation we want to perform on the data, such as "count_occurrences", "find_line", etc.
filters:
keyword: The word or phrase we want to search for.
```
Response:
```json
{
"function": "file_analytics",
"params": {
"action": "count_occurrences",
"filters": {
"keyword": "Python"
}
}
}
```
#### Chain-of-thought
You can ask for several possible responses to a given problem, with a ranking and final answer selection.
Example prompt:
```
A group of people decide to play a round-robin tournament where every player plays every other player exactly once. If a total of 45 games were played, how many players participated in the tournament? Offer several potential answers, rank them according to mathematical logic, and pick the most feasible one.
```
Example response:
```
Solution 1:
In a round-robin tournament, each player plays every other player exactly once. This means that the total number of games played is equal to the combination of all players taken two at a time (since a game involves two players).
The formula for combinations is nC2 = n(n-1)/2, where n is the total number of players. We know that nC2 equals 45 (the total number of games), so we can set up the equation as follows:
n(n-1)/2 = 45
Solving this equation gives us n=10.
Final answer: There were 10 players in the tournament.
Solution 2:
Another way to approach this problem is by using the concept of triangular numbers. Triangular numbers are a sequence of numbers where each number can be represented as a triangle with dots. The nth triangular number is given by the sum of the first n natural numbers.
If we consider each game as a dot and each player as a side of the triangle, then the total number of games (45) would correspond to the 9th triangular number because 1+2+3+4+5+6+7+8+9=45. However, since each side of the triangle represents a player, and there's one more player than sides in our model (because the last player has no one left to play against), we need to add one more to get the total number of players.
Final answer: There were 10 players in the tournament.
Solution 3:
We could also use trial and error to solve this problem. Starting from a small number of players, we calculate how many games would be played in a round-robin tournament until we reach 45 games. For example, with 2 players, only 1 game is played; with 3 players, 3 games are played; with 4 players, 6 games are played, and so on. Continuing this process, we find that with 10 players, 45 games are played.
Final answer: There were 10 players in the tournament.
Ranking of solutions from best to worst:
1. Solution 1: This solution uses direct mathematical reasoning based on the properties of a round-robin tournament and solves an equation to find the answer.
2. Solution 2: This solution uses the concept of triangular numbers which requires additional steps and knowledge beyond the immediate problem context, but still arrives at the correct answer.
3. Solution 3: This solution relies on trial and error, which is less efficient and reliable than the other methods, but it eventually gets to the right answer.
Best and final answer: There were 10 players in the tournament.
```
#### reWOO style execution planning
The model now supports execution planning for complex instructions that would require making use of several tools. The output is just the plan, you must implement a mechanism to parse the output and actually call the functions!
Example prompt:
```
Please construct a systematic plan to generate an optimal response to the user instruction, utilizing a set of provided tools. Each plan will correspond to an evidence value, which will be the output of one of the available functions given an input string
that could be the user's question, one or more prior evidence values, or a combination of both.
Here are the tools available to be called:
DuckDuckGo[input]: Tool that enables the user to search for information using the DuckDuckGo search engine. This tool is helpful in locating a suitable list of sites that may contain the answer to the user's query. It does not directly answer the question but finds a list of sites that might have the answer.
TextScraper[input]: Load one or more websites from the input string containing newline delimited links, where input is one or more links, and produces plain text output containing the content of the links.
HyperlinkExtractor[input]: Extract links from plain text and produces a plain text, newline delimited response of links.
KnowledgeModel[input]: Question answering language model, particularly useful in answering questions based on an input passage of text. The input must be a text question that references an :evidence[n]: variable, e.g. What color is the cat, given :evidence1:?
The output should be in format:
Plan: [first action to take based on input question]
:evidence0: = [function to call with input parameter]
Plan: [next action to take, based on result of :evidence0:]
:evidence1: = [next function to call and input parameter, which may include reference to previous evidence, e.g. "Given context :evidence0"]
...
Answer: [:evidence[n]: containing the final answer.]
Question: Who won the Nobel Prize in Literature in 2020 and what is their most famous work?
```
Response:
```
Plan: Begin by conducting a web search to find out who won the Nobel Prize in Literature in 2020.
:evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner]
Plan: Utilize the KnowledgeModel to interpret the search results and find the name of the laureate.
:evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?]
Plan: Conduct another web search to find the most famous work of the identified laureate.
:evidence2: = DuckDuckGo[Most famous work of :evidence1:]
Plan: Extract the relevant links from the DuckDuckGo search results for a more focused search.
:evidence3: = HyperlinkExtractor[:evidence2:]
Plan: Use the TextScraper tool to extract information from the relevant links.
:evidence4: = TextScraper[:evidence3:]
Plan: Finally, utilize the KnowledgeModel to identify and summarize the most famous work of the laureate from the extracted information.
:evidence5: = KnowledgeModel[What is the most famous work of :evidence1:, given :evidence4:?]
Answer: :evidence5:
```
For this to be useful, you'd have to parse the output plan text, and implement/call each of the functions. This is just pseudo-code, completely untested off the top of my head, and obviously would requiring full implementation + hardening:
```python
import re
import requests
def inject_context(input_text, **context):
for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)):
input_text = input_text.replace(ref, context.get(ref, ""))
return input_text
def duckduckgo(input_text, **context):
search_string = inject_context(input_text, **context)
... search via duck duck go using search_string
... return text content
def link_extractor(input_text, **context):
input_text = inject_context(input_text, **context)
return "\n".join(list(set(re.findall(r"(https?://[^\s]+?\.?)", input_text, re.I))))
def scrape(input_text, **context):
input_text = inject_context(input_text, **context)
text = []
for link in input_text.splitlines():
text.append(requests.get(link).text)
return "\n".join(text)
def infer(input_text, **context)
prompt = inject_context(input_text, **context)
... call model with prompt, return output
def parse_plan(plan):
method_map = {
"DuckDuckGo": duckduckgo,
"HyperlinkExtractor": link_extractor,
"KnowledgeModel": infer,
"TextScraper": scrape,
}
context = {}
for line in plan.strip().splitlines():
if line.startswith("Plan:"):
print(line)
continue
parts = re.match("^(:evidence[0-9]+:)\s*=\s*([^\[]+])(\[.*\])\s$", line, re.I)
if not parts:
if line.startswith("Answer: "):
return context.get(line.split(" ")[-1].strip(), "Answer couldn't be generated...")
raise RuntimeError("bad format: " + line)
context[parts.group(1)] = method_map[parts.group(2)](parts.group(3), **context)
```
## Multi-step instructions with acknowledgement
I don't know how useful this is, really, but I thought I'd add it just in case. Example:
```
{
"conversations": [
{
"value": "You are a helpful, unbiased, uncensored assistant.",
"from": "system"
},
{
"value": "I'm going to set some rules for our conversation. Please respond with \"OK\" to each rule.",
"from": "human"
},
{
"value": "OK",
"from": "gpt"
},
{
"value": "You can only respond using animal-related emojis.",
"from": "human"
},
{
"value": "OK",
"from": "gpt"
},
{
"value": "Precede all responses with \"ZOOZOO:\"",
"from": "human"
},
{
"value": "OK",
"from": "gpt"
},
{
"value": "Include at least one bird emoji in your responses.",
"from": "human"
},
{
"value": "OK",
"from": "gpt"
},
{
"value": "Describe a trip to a safari park.",
"from": "human"
},
{
"value": "ZOOZOO: \ud83e\udd81\ud83d\udc2f\ud83e\udd93\ud83e\udd92\ud83d\udc18\ud83d\udc0d\ud83c\udf33\ud83c\udf32\u2600\ufe0f\ud83d\ude90\ud83e\udd85\ud83d\udcf8\ud83e\udd29",
"from": "gpt"
}
]
}
```
### Contribute
If you're interested in new functionality, particularly a new "instructor" type to generate a specific type of training data,
take a look at the dataset generation tool repo: https://github.com/jondurbin/airoboros and either make a PR or open an issue with details.
To help me with the OpenAI/compute costs:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf
### Licence and usage restrictions
The airoboros 3.1 models are built on top of multiple base models, each with their own license/restrictions.
The 30b model is built on the original llama, which has a strict non-commercial usage restriction.
The models with `-l2` in the name have a custom Meta license:
- See the [meta-license/LICENSE.txt](meta-license/LICENSE.txt) file attached for the original license provided by Meta.
- See also [meta-license/USE_POLICY.md](meta-license/USE_POLICY.md) and [meta-license/Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf), also provided by Meta.
The models with `-m-` are mistral-7b (apache 2.0)
The fine-tuning data was mostly generated by OpenAI API calls to gpt-4, via [airoboros](https://github.com/jondurbin/airoboros)
The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI
- what does *compete* actually mean here?
- these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place
- if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works
- the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise non-permissive licensing in the first place
- other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2
I am purposingly leaving this license ambiguous (other than the fact you must comply with the Meta original license for llama-2) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly.
Your best bet is probably to avoid using this commercially due to the OpenAI API usage.
Either way, by using this model, you agree to completely indemnify me.
| 38,018 | [
[
-0.040313720703125,
-0.06341552734375,
0.0200347900390625,
0.006618499755859375,
-0.018829345703125,
-0.01393890380859375,
0.0102386474609375,
-0.035980224609375,
0.0038928985595703125,
0.0247955322265625,
-0.049957275390625,
-0.0323486328125,
-0.0252838134765625,
-0.004894256591796875,
-0.020965576171875,
0.06927490234375,
0.00518798828125,
-0.028045654296875,
-0.0130767822265625,
-0.01245880126953125,
-0.00850677490234375,
-0.033111572265625,
-0.04217529296875,
-0.017425537109375,
0.01214599609375,
0.0163421630859375,
0.05328369140625,
0.053436279296875,
0.015655517578125,
0.028045654296875,
-0.00278472900390625,
0.0015039443969726562,
-0.01776123046875,
-0.0057525634765625,
0.01139068603515625,
-0.01092529296875,
-0.0447998046875,
0.01245880126953125,
0.039306640625,
0.0235443115234375,
-0.033172607421875,
0.007598876953125,
-0.0027523040771484375,
0.04278564453125,
-0.039306640625,
-0.0020809173583984375,
-0.0308685302734375,
-0.00290679931640625,
-0.006000518798828125,
0.01364898681640625,
-0.01163482666015625,
-0.0147705078125,
0.00894927978515625,
-0.0673828125,
0.00786590576171875,
0.0171356201171875,
0.10577392578125,
0.0157623291015625,
-0.048797607421875,
0.002712249755859375,
-0.0285797119140625,
0.0802001953125,
-0.09124755859375,
0.0297088623046875,
0.02825927734375,
0.0177764892578125,
-0.00896453857421875,
-0.0643310546875,
-0.048919677734375,
-0.01386260986328125,
-0.00457763671875,
0.02301025390625,
-0.05157470703125,
0.004741668701171875,
0.0173797607421875,
0.0396728515625,
-0.0601806640625,
-0.0016088485717773438,
-0.0247802734375,
-0.016326904296875,
0.05889892578125,
0.014984130859375,
0.0295867919921875,
-0.0252227783203125,
-0.028167724609375,
-0.023040771484375,
-0.035675048828125,
0.007282257080078125,
0.0171661376953125,
0.00579833984375,
-0.048431396484375,
0.035980224609375,
-0.029144287109375,
0.05084228515625,
0.0208282470703125,
-0.00800323486328125,
0.023651123046875,
-0.02996826171875,
-0.0460205078125,
-0.030792236328125,
0.09600830078125,
0.0252685546875,
-0.01432037353515625,
0.021820068359375,
0.0012083053588867188,
-0.015777587890625,
0.0062255859375,
-0.07757568359375,
-0.021270751953125,
0.048675537109375,
-0.044891357421875,
-0.04254150390625,
-0.006870269775390625,
-0.054962158203125,
-0.00460052490234375,
0.005859375,
0.04302978515625,
-0.033935546875,
-0.0248260498046875,
0.00016880035400390625,
-0.035797119140625,
0.0291595458984375,
0.026153564453125,
-0.057861328125,
0.0239715576171875,
0.01300811767578125,
0.043121337890625,
0.0150604248046875,
-0.0233612060546875,
-0.0260162353515625,
0.00455474853515625,
-0.007167816162109375,
0.04571533203125,
-0.01406097412109375,
-0.0299530029296875,
-0.0173797607421875,
0.00616455078125,
0.006591796875,
-0.0292205810546875,
0.027557373046875,
-0.009307861328125,
0.038604736328125,
-0.0187530517578125,
-0.042236328125,
-0.0239715576171875,
0.0078887939453125,
-0.042236328125,
0.08990478515625,
0.03167724609375,
-0.0626220703125,
0.00969696044921875,
-0.041656494140625,
-0.0234527587890625,
-0.00042939186096191406,
-0.0036525726318359375,
-0.052398681640625,
-0.00970458984375,
0.0275421142578125,
0.0241546630859375,
-0.02374267578125,
-0.00213623046875,
-0.033050537109375,
-0.00876617431640625,
0.020111083984375,
-0.024505615234375,
0.10186767578125,
0.0255889892578125,
-0.043121337890625,
0.0012083053588867188,
-0.05926513671875,
0.0163421630859375,
0.0311126708984375,
-0.0272369384765625,
0.0107574462890625,
-0.01525115966796875,
0.00862884521484375,
0.004024505615234375,
0.0179901123046875,
-0.03155517578125,
0.019805908203125,
-0.026519775390625,
0.053192138671875,
0.046356201171875,
0.00734710693359375,
0.02850341796875,
-0.05133056640625,
0.03948974609375,
0.010284423828125,
0.0312347412109375,
-0.0005049705505371094,
-0.044921875,
-0.06341552734375,
-0.0173492431640625,
0.0225372314453125,
0.04595947265625,
-0.053131103515625,
0.044952392578125,
0.01255035400390625,
-0.059967041015625,
-0.035858154296875,
-0.003650665283203125,
0.0248260498046875,
0.03155517578125,
0.0352783203125,
-0.006072998046875,
-0.041259765625,
-0.058258056640625,
-0.00861358642578125,
-0.03936767578125,
-0.0095367431640625,
0.039581298828125,
0.048309326171875,
-0.0268707275390625,
0.055694580078125,
-0.048583984375,
-0.01488494873046875,
-0.0154571533203125,
0.0135040283203125,
0.00870513916015625,
0.048370361328125,
0.06494140625,
-0.04736328125,
-0.0283203125,
-0.01277923583984375,
-0.051910400390625,
-0.0220794677734375,
-0.002349853515625,
-0.0305023193359375,
0.03155517578125,
0.0150146484375,
-0.064697265625,
0.046356201171875,
0.0462646484375,
-0.042633056640625,
0.056732177734375,
-0.01255035400390625,
0.0108184814453125,
-0.08990478515625,
0.016326904296875,
0.0150146484375,
-0.0195465087890625,
-0.040252685546875,
0.0221405029296875,
-0.00495147705078125,
0.0035724639892578125,
-0.035614013671875,
0.0633544921875,
-0.034515380859375,
0.0178985595703125,
-0.007610321044921875,
-0.0004553794860839844,
0.04058837890625,
0.031951904296875,
-0.00830078125,
0.04107666015625,
0.0350341796875,
-0.05059814453125,
0.03204345703125,
0.0299072265625,
0.0006132125854492188,
0.021697998046875,
-0.064208984375,
0.003284454345703125,
0.01322174072265625,
0.033050537109375,
-0.074951171875,
-0.024505615234375,
0.037078857421875,
-0.0513916015625,
0.0126190185546875,
-0.01617431640625,
-0.016937255859375,
-0.031951904296875,
-0.032806396484375,
0.0189208984375,
0.06982421875,
-0.033050537109375,
0.05084228515625,
0.03497314453125,
0.0023899078369140625,
-0.046722412109375,
-0.06280517578125,
-0.0183563232421875,
-0.020538330078125,
-0.043304443359375,
0.029327392578125,
-0.0199432373046875,
-0.023040771484375,
0.00026869773864746094,
0.0018014907836914062,
-0.0177001953125,
0.00926971435546875,
0.018829345703125,
0.020599365234375,
-0.002796173095703125,
-0.0153045654296875,
0.0182342529296875,
0.006378173828125,
0.001285552978515625,
-0.023681640625,
0.029083251953125,
-0.01328277587890625,
0.005008697509765625,
-0.041412353515625,
0.0220794677734375,
0.03363037109375,
-0.0158538818359375,
0.058258056640625,
0.059417724609375,
-0.0318603515625,
-0.01377105712890625,
-0.041107177734375,
-0.01438140869140625,
-0.03997802734375,
0.0012540817260742188,
-0.01248931884765625,
-0.04498291015625,
0.03497314453125,
0.0235137939453125,
0.01904296875,
0.05609130859375,
0.032318115234375,
-0.0220947265625,
0.0728759765625,
0.037872314453125,
0.0007271766662597656,
0.033966064453125,
-0.042236328125,
-0.0138397216796875,
-0.06292724609375,
-0.00934600830078125,
-0.040496826171875,
-0.004627227783203125,
-0.042236328125,
-0.029022216796875,
0.02093505859375,
0.0203857421875,
-0.043975830078125,
0.0296478271484375,
-0.05029296875,
-0.0029201507568359375,
0.04998779296875,
0.01155853271484375,
0.01007843017578125,
-0.0028057098388671875,
-0.0225830078125,
0.0077972412109375,
-0.0479736328125,
-0.0194854736328125,
0.073974609375,
0.0196685791015625,
0.04718017578125,
0.0123291015625,
0.05267333984375,
0.007427215576171875,
0.01345062255859375,
-0.042999267578125,
0.041259765625,
-0.0011167526245117188,
-0.055755615234375,
-0.0235443115234375,
-0.04736328125,
-0.06640625,
0.0170440673828125,
-0.01446533203125,
-0.05401611328125,
0.022003173828125,
0.016357421875,
-0.03411865234375,
0.0207061767578125,
-0.036956787109375,
0.0594482421875,
-0.005523681640625,
-0.0229034423828125,
0.01364898681640625,
-0.046600341796875,
0.0183868408203125,
0.01806640625,
0.0173797607421875,
-0.021026611328125,
-0.016815185546875,
0.0615234375,
-0.06817626953125,
0.055755615234375,
-0.0221405029296875,
-0.007747650146484375,
0.046112060546875,
-0.006664276123046875,
0.0285797119140625,
0.003772735595703125,
-0.0162811279296875,
0.0247802734375,
0.01053619384765625,
-0.037078857421875,
-0.024017333984375,
0.041412353515625,
-0.07464599609375,
-0.041107177734375,
-0.0377197265625,
-0.0286865234375,
0.014404296875,
0.0213470458984375,
0.0439453125,
0.0325927734375,
-0.0091705322265625,
0.006420135498046875,
0.035552978515625,
-0.03424072265625,
0.043670654296875,
0.037567138671875,
-0.033233642578125,
-0.052642822265625,
0.053009033203125,
0.005840301513671875,
0.02239990234375,
0.0220794677734375,
0.0253448486328125,
-0.029632568359375,
-0.0338134765625,
-0.061431884765625,
0.0147705078125,
-0.035888671875,
-0.032989501953125,
-0.05804443359375,
-0.0272369384765625,
-0.050048828125,
-0.00701141357421875,
-0.0199127197265625,
-0.041595458984375,
-0.0423583984375,
0.0027790069580078125,
0.05853271484375,
0.0369873046875,
-0.01898193359375,
0.0295867919921875,
-0.057037353515625,
0.0313720703125,
0.0296478271484375,
0.0016508102416992188,
0.0006971359252929688,
-0.057861328125,
-0.0108795166015625,
0.0261077880859375,
-0.047821044921875,
-0.06109619140625,
0.054595947265625,
0.0175323486328125,
0.038360595703125,
0.0240936279296875,
0.0218353271484375,
0.054229736328125,
-0.013763427734375,
0.07421875,
0.0017175674438476562,
-0.088623046875,
0.048797607421875,
-0.043975830078125,
0.04022216796875,
0.0278778076171875,
0.03118896484375,
-0.0273895263671875,
-0.036376953125,
-0.055694580078125,
-0.06085205078125,
0.043914794921875,
0.038604736328125,
-0.00788116455078125,
0.01074981689453125,
0.0272979736328125,
-0.00449371337890625,
0.0164031982421875,
-0.0599365234375,
-0.035400390625,
-0.033721923828125,
-0.0137786865234375,
0.023040771484375,
0.004680633544921875,
-0.01132965087890625,
-0.043914794921875,
0.0736083984375,
-0.011810302734375,
0.0465087890625,
0.03125,
0.006519317626953125,
-0.0232696533203125,
0.01446533203125,
0.018951416015625,
0.040435791015625,
-0.023651123046875,
-0.018341064453125,
0.0126190185546875,
-0.034912109375,
0.0093994140625,
0.0308685302734375,
-0.0224609375,
-0.0024471282958984375,
0.01071929931640625,
0.06317138671875,
0.0019369125366210938,
-0.03216552734375,
0.035797119140625,
-0.01428985595703125,
-0.039947509765625,
-0.02374267578125,
0.018035888671875,
0.0187530517578125,
0.020233154296875,
0.0408935546875,
-0.0138397216796875,
0.0244140625,
-0.04095458984375,
0.01519775390625,
0.039306640625,
-0.01198577880859375,
-0.0017061233520507812,
0.077392578125,
0.01361846923828125,
-0.0007171630859375,
0.05712890625,
-0.0176544189453125,
-0.03814697265625,
0.0726318359375,
0.038116455078125,
0.058502197265625,
-0.01200103759765625,
0.01708984375,
0.0408935546875,
0.0159454345703125,
0.00970458984375,
0.038787841796875,
0.0108184814453125,
-0.04827880859375,
-0.01010894775390625,
-0.0518798828125,
-0.0325927734375,
0.01751708984375,
-0.0576171875,
0.0157318115234375,
-0.04107666015625,
-0.0201263427734375,
0.004001617431640625,
0.028900146484375,
-0.04498291015625,
0.03515625,
0.01220703125,
0.0643310546875,
-0.047027587890625,
0.058624267578125,
0.052825927734375,
-0.032867431640625,
-0.07159423828125,
-0.0031719207763671875,
0.0193328857421875,
-0.049468994140625,
0.006542205810546875,
0.01116943359375,
0.01267242431640625,
0.019805908203125,
-0.06427001953125,
-0.060882568359375,
0.10546875,
0.01290130615234375,
-0.040313720703125,
-0.00473785400390625,
-0.004119873046875,
0.02117919921875,
-0.0178985595703125,
0.048065185546875,
0.02899169921875,
0.035980224609375,
0.003963470458984375,
-0.0704345703125,
0.03619384765625,
-0.0269927978515625,
-0.0015497207641601562,
0.00997161865234375,
-0.086669921875,
0.0811767578125,
-0.0215911865234375,
-0.0134124755859375,
0.03363037109375,
0.053955078125,
0.04522705078125,
0.00722503662109375,
0.0285797119140625,
0.06201171875,
0.059417724609375,
-0.01277923583984375,
0.07708740234375,
-0.016387939453125,
0.044952392578125,
0.058135986328125,
-0.00936126708984375,
0.0631103515625,
0.018218994140625,
-0.032989501953125,
0.05303955078125,
0.0517578125,
-0.020355224609375,
0.0206298828125,
0.00920867919921875,
-0.0169677734375,
-0.005077362060546875,
0.0015459060668945312,
-0.05029296875,
0.0247802734375,
0.0243988037109375,
-0.0179901123046875,
0.0024471282958984375,
-0.00872802734375,
0.01088714599609375,
-0.035308837890625,
-0.004638671875,
0.052093505859375,
0.02099609375,
-0.0267486572265625,
0.06903076171875,
0.005336761474609375,
0.05401611328125,
-0.04559326171875,
-0.015106201171875,
-0.0225677490234375,
0.0004329681396484375,
-0.0187225341796875,
-0.045379638671875,
0.0035648345947265625,
-0.0098724365234375,
-0.00010412931442260742,
0.0028820037841796875,
0.053955078125,
-0.02313232421875,
-0.038482666015625,
0.025115966796875,
0.03631591796875,
0.0178680419921875,
-0.0023822784423828125,
-0.07708740234375,
0.0311431884765625,
0.006259918212890625,
-0.0477294921875,
0.0226287841796875,
0.0289459228515625,
0.0311126708984375,
0.052490234375,
0.049163818359375,
-0.014892578125,
0.003879547119140625,
-0.0196685791015625,
0.06591796875,
-0.0654296875,
-0.0253448486328125,
-0.06280517578125,
0.05926513671875,
-0.00658416748046875,
-0.039947509765625,
0.061981201171875,
0.03753662109375,
0.0538330078125,
0.001399993896484375,
0.07684326171875,
-0.034576416015625,
0.0212554931640625,
-0.0305023193359375,
0.07733154296875,
-0.0653076171875,
0.00440216064453125,
-0.0222015380859375,
-0.0491943359375,
0.00336456298828125,
0.055694580078125,
0.010955810546875,
0.024139404296875,
0.041717529296875,
0.053192138671875,
0.006465911865234375,
-0.00518798828125,
0.01425933837890625,
0.034576416015625,
0.0188751220703125,
0.0543212890625,
0.04364013671875,
-0.07623291015625,
0.050567626953125,
-0.050201416015625,
-0.020111083984375,
-0.021636962890625,
-0.06951904296875,
-0.060577392578125,
-0.035430908203125,
-0.03961181640625,
-0.048126220703125,
-0.005886077880859375,
0.06707763671875,
0.07354736328125,
-0.05010986328125,
-0.0244140625,
-0.00881195068359375,
-0.005374908447265625,
-0.016265869140625,
-0.027069091796875,
0.0137786865234375,
0.01273345947265625,
-0.058013916015625,
0.0256805419921875,
-0.004169464111328125,
0.0380859375,
-0.01418304443359375,
-0.010467529296875,
-0.0269012451171875,
0.0215606689453125,
0.025238037109375,
0.040374755859375,
-0.053924560546875,
-0.004642486572265625,
-0.01145172119140625,
-0.01447296142578125,
0.01355743408203125,
0.01055908203125,
-0.0643310546875,
0.00450897216796875,
0.036041259765625,
0.01204681396484375,
0.03961181640625,
0.001522064208984375,
0.04595947265625,
-0.040069580078125,
0.011627197265625,
0.006378173828125,
0.027130126953125,
0.00826263427734375,
-0.0355224609375,
0.035797119140625,
0.0159759521484375,
-0.056854248046875,
-0.060577392578125,
-0.006679534912109375,
-0.0841064453125,
-0.033355712890625,
0.08477783203125,
-0.0112457275390625,
-0.02947998046875,
0.00525665283203125,
-0.020843505859375,
0.033294677734375,
-0.0457763671875,
0.03076171875,
0.03094482421875,
-0.0153961181640625,
-0.02294921875,
-0.0401611328125,
0.040435791015625,
0.0235748291015625,
-0.079345703125,
-0.0007734298706054688,
0.0335693359375,
0.03363037109375,
-0.0037384033203125,
0.05780029296875,
0.0005626678466796875,
0.0228729248046875,
0.0169830322265625,
0.01386260986328125,
-0.005077362060546875,
-0.0016918182373046875,
-0.02301025390625,
-0.0090789794921875,
-0.015380859375,
-0.005924224853515625
]
] |
Helsinki-NLP/opus-mt-hi-en | 2023-08-16T11:57:38.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"marian",
"text2text-generation",
"translation",
"hi",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-hi-en | 6 | 15,002 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-hi-en
* source languages: hi
* target languages: en
* OPUS readme: [hi-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/hi-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/hi-en/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/hi-en/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/hi-en/opus-2019-12-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdev2014.hi.en | 9.1 | 0.357 |
| newstest2014-hien.hi.en | 13.6 | 0.409 |
| Tatoeba.hi.en | 40.4 | 0.580 |
| 901 | [
[
-0.0255279541015625,
-0.034088134765625,
0.0165252685546875,
0.027130126953125,
-0.0228118896484375,
-0.0291748046875,
-0.02972412109375,
-0.0107574462890625,
0.00563812255859375,
0.040771484375,
-0.04791259765625,
-0.040863037109375,
-0.04290771484375,
0.0159759521484375,
0.0009140968322753906,
0.05810546875,
-0.01061248779296875,
0.031097412109375,
0.0164337158203125,
-0.0386962890625,
-0.0301361083984375,
-0.0304718017578125,
-0.027923583984375,
-0.026458740234375,
0.022979736328125,
0.032684326171875,
0.032745361328125,
0.034271240234375,
0.07391357421875,
0.01568603515625,
-0.00881195068359375,
0.0022220611572265625,
-0.038787841796875,
-0.0075836181640625,
0.01038360595703125,
-0.039306640625,
-0.059356689453125,
-0.00812530517578125,
0.0775146484375,
0.031402587890625,
0.0022411346435546875,
0.0316162109375,
0.0030059814453125,
0.06451416015625,
-0.0282440185546875,
0.0066375732421875,
-0.0438232421875,
0.007411956787109375,
-0.028167724609375,
-0.0216827392578125,
-0.0484619140625,
-0.01517486572265625,
0.00867462158203125,
-0.05169677734375,
-0.0055999755859375,
0.015716552734375,
0.10601806640625,
0.022735595703125,
-0.0258331298828125,
-0.0062255859375,
-0.04327392578125,
0.07879638671875,
-0.060760498046875,
0.043792724609375,
0.0270538330078125,
0.0194549560546875,
0.014984130859375,
-0.03924560546875,
-0.01959228515625,
0.01134490966796875,
-0.0177459716796875,
0.0154876708984375,
-0.0064697265625,
-0.01788330078125,
0.024658203125,
0.058868408203125,
-0.056732177734375,
0.0003561973571777344,
-0.03759765625,
-0.004852294921875,
0.049346923828125,
0.0078582763671875,
0.01131439208984375,
-0.019378662109375,
-0.033843994140625,
-0.03680419921875,
-0.060943603515625,
0.003631591796875,
0.02655029296875,
0.0198974609375,
-0.038330078125,
0.049407958984375,
-0.0086212158203125,
0.045166015625,
-0.0027675628662109375,
-0.004276275634765625,
0.0767822265625,
-0.0303497314453125,
-0.028564453125,
-0.01282501220703125,
0.09222412109375,
0.0235595703125,
0.00372314453125,
0.01242828369140625,
-0.0178680419921875,
-0.0218505859375,
0.002033233642578125,
-0.0716552734375,
-0.0028247833251953125,
0.00591278076171875,
-0.03851318359375,
-0.00795745849609375,
0.0008258819580078125,
-0.057830810546875,
0.01454925537109375,
-0.0267333984375,
0.051513671875,
-0.040252685546875,
-0.0194549560546875,
0.0249481201171875,
0.001178741455078125,
0.0257720947265625,
-0.0030002593994140625,
-0.048309326171875,
0.0169677734375,
0.0243988037109375,
0.049530029296875,
-0.0305938720703125,
-0.0240478515625,
-0.03790283203125,
-0.0158843994140625,
-0.0055389404296875,
0.0439453125,
-0.00466156005859375,
-0.032501220703125,
-0.0053558349609375,
0.034759521484375,
-0.0294189453125,
-0.0295867919921875,
0.09814453125,
-0.0184783935546875,
0.055908203125,
-0.03326416015625,
-0.0343017578125,
-0.0201568603515625,
0.0390625,
-0.04351806640625,
0.10369873046875,
0.007564544677734375,
-0.06671142578125,
0.01348876953125,
-0.06475830078125,
-0.016632080078125,
-0.0017118453979492188,
0.0013666152954101562,
-0.05029296875,
0.0005440711975097656,
0.01369476318359375,
0.0281982421875,
-0.0247955322265625,
0.0150146484375,
-0.00179290771484375,
-0.031829833984375,
-0.00278472900390625,
-0.02581787109375,
0.0791015625,
0.0210723876953125,
-0.027618408203125,
0.0199737548828125,
-0.07049560546875,
-0.00591278076171875,
0.0017995834350585938,
-0.0384521484375,
-0.0142822265625,
0.0017795562744140625,
0.0180511474609375,
0.006412506103515625,
0.0278472900390625,
-0.04730224609375,
0.01467132568359375,
-0.051025390625,
0.0131378173828125,
0.047210693359375,
-0.01512908935546875,
0.0225067138671875,
-0.029083251953125,
0.0228729248046875,
0.0106964111328125,
0.01436614990234375,
-0.004489898681640625,
-0.04046630859375,
-0.061248779296875,
-0.019134521484375,
0.038543701171875,
0.07830810546875,
-0.04864501953125,
0.06414794921875,
-0.048065185546875,
-0.0531005859375,
-0.059814453125,
-0.0072021484375,
0.034637451171875,
0.031463623046875,
0.040740966796875,
-0.0089569091796875,
-0.034393310546875,
-0.08111572265625,
-0.009765625,
-0.01202392578125,
-0.01244354248046875,
0.0180816650390625,
0.049652099609375,
-0.0175933837890625,
0.036712646484375,
-0.038909912109375,
-0.026641845703125,
-0.00902557373046875,
0.01377105712890625,
0.038238525390625,
0.04522705078125,
0.0411376953125,
-0.06341552734375,
-0.045135498046875,
0.0011014938354492188,
-0.053741455078125,
-0.0063934326171875,
0.012359619140625,
-0.0252227783203125,
0.004436492919921875,
0.007465362548828125,
-0.0273895263671875,
0.01126861572265625,
0.04766845703125,
-0.0421142578125,
0.04473876953125,
-0.0081787109375,
0.023284912109375,
-0.104736328125,
0.01097869873046875,
-0.01222991943359375,
-0.0025177001953125,
-0.027801513671875,
0.0023860931396484375,
0.019317626953125,
0.0022220611572265625,
-0.061920166015625,
0.044158935546875,
-0.021331787109375,
-0.005390167236328125,
0.01959228515625,
-0.004791259765625,
0.0093994140625,
0.056854248046875,
-0.00922393798828125,
0.0611572265625,
0.056182861328125,
-0.037841796875,
0.0178070068359375,
0.04376220703125,
-0.03173828125,
0.0292510986328125,
-0.061309814453125,
-0.02081298828125,
0.02191162109375,
-0.0115966796875,
-0.043792724609375,
0.0028247833251953125,
0.0226287841796875,
-0.04583740234375,
0.029052734375,
-0.009613037109375,
-0.05572509765625,
-0.008941650390625,
-0.0222625732421875,
0.038055419921875,
0.05792236328125,
-0.019378662109375,
0.03485107421875,
0.009429931640625,
-0.00162506103515625,
-0.033172607421875,
-0.07220458984375,
-0.00909423828125,
-0.03106689453125,
-0.055206298828125,
0.0200042724609375,
-0.033294677734375,
-0.0003495216369628906,
0.00231170654296875,
0.027191162109375,
-0.00365447998046875,
-0.0007457733154296875,
0.0079803466796875,
0.0235137939453125,
-0.04241943359375,
0.006175994873046875,
-0.0101165771484375,
-0.00942230224609375,
-0.0089569091796875,
-0.01094818115234375,
0.0416259765625,
-0.0297393798828125,
-0.0196990966796875,
-0.04638671875,
-0.00021398067474365234,
0.047576904296875,
-0.040130615234375,
0.0599365234375,
0.044097900390625,
-0.01535797119140625,
0.017059326171875,
-0.0287322998046875,
0.0064239501953125,
-0.031768798828125,
0.0086517333984375,
-0.036041259765625,
-0.05682373046875,
0.038818359375,
0.011260986328125,
0.033203125,
0.06536865234375,
0.0523681640625,
0.0063323974609375,
0.056121826171875,
0.020477294921875,
-0.001262664794921875,
0.031829833984375,
-0.03521728515625,
-0.00582122802734375,
-0.0775146484375,
0.007007598876953125,
-0.0579833984375,
-0.027374267578125,
-0.07098388671875,
-0.01406097412109375,
0.0277862548828125,
0.002155303955078125,
-0.020843505859375,
0.05328369140625,
-0.04168701171875,
0.0175018310546875,
0.047454833984375,
-0.01200103759765625,
0.0199127197265625,
0.0034618377685546875,
-0.039031982421875,
-0.0178070068359375,
-0.02923583984375,
-0.03326416015625,
0.0927734375,
0.033843994140625,
0.0279541015625,
0.01294708251953125,
0.04266357421875,
-0.002452850341796875,
0.01386260986328125,
-0.047821044921875,
0.03369140625,
-0.017364501953125,
-0.0587158203125,
-0.0273590087890625,
-0.046600341796875,
-0.06353759765625,
0.03729248046875,
-0.0208892822265625,
-0.038787841796875,
0.0131683349609375,
-0.0008029937744140625,
-0.0114898681640625,
0.032257080078125,
-0.049285888671875,
0.09405517578125,
-0.001911163330078125,
-0.00995635986328125,
0.0200042724609375,
-0.03643798828125,
0.0218658447265625,
0.005199432373046875,
0.0245361328125,
-0.0196685791015625,
0.018402099609375,
0.051055908203125,
-0.004940032958984375,
0.02960205078125,
-0.0052032470703125,
-0.0011892318725585938,
0.00975799560546875,
0.00106048583984375,
0.0259552001953125,
-0.01013946533203125,
-0.035675048828125,
0.0268707275390625,
0.01024627685546875,
-0.03863525390625,
-0.00847625732421875,
0.043792724609375,
-0.053009033203125,
-0.0012035369873046875,
-0.03131103515625,
-0.0498046875,
-0.0007243156433105469,
0.032196044921875,
0.055877685546875,
0.053466796875,
-0.02178955078125,
0.044036865234375,
0.06243896484375,
-0.020416259765625,
0.0270843505859375,
0.046844482421875,
-0.0165863037109375,
-0.04266357421875,
0.06268310546875,
0.004581451416015625,
0.023651123046875,
0.043243408203125,
0.014862060546875,
-0.01031494140625,
-0.0511474609375,
-0.05572509765625,
0.020233154296875,
-0.0197296142578125,
-0.01201629638671875,
-0.03729248046875,
-0.01007080078125,
-0.0278472900390625,
0.0216064453125,
-0.03692626953125,
-0.03802490234375,
-0.01010894775390625,
-0.01229095458984375,
0.0094757080078125,
0.0169830322265625,
0.0006346702575683594,
0.034759521484375,
-0.08038330078125,
0.0171661376953125,
-0.007152557373046875,
0.023956298828125,
-0.038818359375,
-0.059722900390625,
-0.04046630859375,
0.0060882568359375,
-0.053192138671875,
-0.0513916015625,
0.035888671875,
0.00978851318359375,
0.0212249755859375,
0.0250396728515625,
0.01108551025390625,
0.027374267578125,
-0.0513916015625,
0.07269287109375,
-0.000060439109802246094,
-0.048919677734375,
0.040191650390625,
-0.033538818359375,
0.039794921875,
0.066162109375,
0.0221405029296875,
-0.023284912109375,
-0.034912109375,
-0.05291748046875,
-0.0521240234375,
0.053985595703125,
0.048980712890625,
-0.0093536376953125,
0.0164031982421875,
-0.00933837890625,
-0.00128936767578125,
0.01055145263671875,
-0.08184814453125,
-0.025726318359375,
0.006961822509765625,
-0.03094482421875,
-0.01381683349609375,
-0.0180816650390625,
-0.018402099609375,
-0.017242431640625,
0.07879638671875,
0.008392333984375,
0.0136566162109375,
0.0262451171875,
-0.00836944580078125,
-0.0182647705078125,
0.0296478271484375,
0.059295654296875,
0.037933349609375,
-0.043670654296875,
-0.00669097900390625,
0.02081298828125,
-0.032684326171875,
-0.01312255859375,
0.00864410400390625,
-0.034149169921875,
0.0189971923828125,
0.03662109375,
0.0823974609375,
0.020172119140625,
-0.046051025390625,
0.037445068359375,
-0.0223388671875,
-0.0313720703125,
-0.037506103515625,
-0.0098419189453125,
0.0062255859375,
-0.00812530517578125,
0.0081329345703125,
0.01629638671875,
0.0167388916015625,
-0.022979736328125,
0.009429931640625,
0.0064239501953125,
-0.046539306640625,
-0.0421142578125,
0.03240966796875,
0.01361083984375,
-0.023651123046875,
0.04168701171875,
-0.0307159423828125,
-0.03997802734375,
0.027496337890625,
0.01012420654296875,
0.07611083984375,
-0.01873779296875,
-0.01947021484375,
0.0594482421875,
0.040924072265625,
-0.0126953125,
0.037567138671875,
0.0119171142578125,
-0.052764892578125,
-0.037994384765625,
-0.06488037109375,
-0.01140594482421875,
0.01192474365234375,
-0.06707763671875,
0.031646728515625,
0.0251312255859375,
-0.00023114681243896484,
-0.0204925537109375,
0.0199737548828125,
-0.0423583984375,
0.01004791259765625,
-0.0204925537109375,
0.0762939453125,
-0.065185546875,
0.06744384765625,
0.0399169921875,
-0.019195556640625,
-0.052276611328125,
-0.0132293701171875,
-0.01434326171875,
-0.03717041015625,
0.038665771484375,
0.0177001953125,
0.0227813720703125,
-0.00820159912109375,
-0.0158843994140625,
-0.059722900390625,
0.078857421875,
0.0138397216796875,
-0.040924072265625,
-0.005069732666015625,
0.01316070556640625,
0.0345458984375,
-0.0245361328125,
0.005695343017578125,
0.036163330078125,
0.056060791015625,
0.009796142578125,
-0.0858154296875,
-0.024139404296875,
-0.040802001953125,
-0.028289794921875,
0.04425048828125,
-0.04241943359375,
0.0716552734375,
0.0278472900390625,
-0.00714111328125,
0.005039215087890625,
0.043792724609375,
0.0248565673828125,
0.0278472900390625,
0.039520263671875,
0.08795166015625,
0.0272216796875,
-0.033111572265625,
0.06744384765625,
-0.0247650146484375,
0.03765869140625,
0.084228515625,
-0.0082855224609375,
0.07305908203125,
0.023712158203125,
-0.00917816162109375,
0.036163330078125,
0.048004150390625,
-0.0221099853515625,
0.0330810546875,
0.0024166107177734375,
0.01202392578125,
-0.00832366943359375,
0.01247406005859375,
-0.05218505859375,
0.0174407958984375,
0.01280975341796875,
-0.008270263671875,
0.0054931640625,
-0.004817962646484375,
0.00168609619140625,
-0.00516510009765625,
-0.009857177734375,
0.041534423828125,
-0.0014934539794921875,
-0.04510498046875,
0.05859375,
-0.0016231536865234375,
0.050323486328125,
-0.046112060546875,
0.007671356201171875,
-0.0028438568115234375,
0.0209197998046875,
-0.0027599334716796875,
-0.043304443359375,
0.037261962890625,
-0.0010805130004882812,
-0.0195465087890625,
-0.032867431640625,
0.019195556640625,
-0.0391845703125,
-0.063720703125,
0.0295257568359375,
0.030517578125,
0.02520751953125,
0.0032196044921875,
-0.06304931640625,
0.005947113037109375,
0.007022857666015625,
-0.045318603515625,
0.00453948974609375,
0.049713134765625,
0.022796630859375,
0.0256195068359375,
0.041046142578125,
0.01494598388671875,
0.0225982666015625,
-0.002208709716796875,
0.04290771484375,
-0.033660888671875,
-0.03558349609375,
-0.056854248046875,
0.05767822265625,
-0.00745391845703125,
-0.052642822265625,
0.056243896484375,
0.08013916015625,
0.07586669921875,
-0.00904083251953125,
0.01885986328125,
-0.0020084381103515625,
0.058929443359375,
-0.049163818359375,
0.042266845703125,
-0.07257080078125,
0.01523590087890625,
-0.002895355224609375,
-0.068359375,
-0.0137481689453125,
0.0150146484375,
-0.01532745361328125,
-0.0248260498046875,
0.058441162109375,
0.052276611328125,
-0.01837158203125,
-0.01251983642578125,
0.0223541259765625,
0.025970458984375,
0.01531219482421875,
0.044647216796875,
0.032501220703125,
-0.075439453125,
0.040130615234375,
-0.0237884521484375,
-0.0107574462890625,
-0.00994873046875,
-0.049407958984375,
-0.064453125,
-0.04156494140625,
-0.015167236328125,
-0.0164642333984375,
-0.0228118896484375,
0.060577392578125,
0.050689697265625,
-0.0699462890625,
-0.044586181640625,
0.0050811767578125,
0.00969696044921875,
-0.0118255615234375,
-0.0200042724609375,
0.04443359375,
-0.02130126953125,
-0.06976318359375,
0.03857421875,
0.007411956787109375,
-0.0034542083740234375,
-0.00466156005859375,
-0.028961181640625,
-0.0318603515625,
-0.004436492919921875,
0.0230255126953125,
0.003437042236328125,
-0.039093017578125,
0.0079345703125,
0.0131988525390625,
-0.00936126708984375,
0.033935546875,
0.0249481201171875,
-0.01568603515625,
0.01458740234375,
0.06927490234375,
0.025787353515625,
0.037078857421875,
-0.01287841796875,
0.032379150390625,
-0.05291748046875,
0.0237274169921875,
0.01629638671875,
0.044952392578125,
0.02520751953125,
-0.003948211669921875,
0.053985595703125,
0.02105712890625,
-0.050933837890625,
-0.08087158203125,
0.00804901123046875,
-0.08489990234375,
0.003604888916015625,
0.06549072265625,
-0.018035888671875,
-0.0237884521484375,
0.03216552734375,
-0.01141357421875,
0.010528564453125,
-0.0174407958984375,
0.02606201171875,
0.0653076171875,
0.0218963623046875,
0.006717681884765625,
-0.05682373046875,
0.024322509765625,
0.0382080078125,
-0.0628662109375,
-0.0164337158203125,
0.0158233642578125,
0.0084381103515625,
0.0310211181640625,
0.034698486328125,
-0.0288543701171875,
0.0050506591796875,
-0.0198822021484375,
0.031707763671875,
0.0005717277526855469,
-0.0220794677734375,
-0.02276611328125,
-0.00034332275390625,
-0.0016031265258789062,
-0.0279998779296875
]
] |
microsoft/codebert-base-mlm | 2023-01-09T11:37:56.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"roberta",
"fill-mask",
"arxiv:2002.08155",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | microsoft | null | null | microsoft/codebert-base-mlm | 29 | 14,975 | transformers | 2022-03-02T23:29:05 | ## CodeBERT-base-mlm
Pretrained weights for [CodeBERT: A Pre-Trained Model for Programming and Natural Languages](https://arxiv.org/abs/2002.08155).
### Training Data
The model is trained on the code corpus of [CodeSearchNet](https://github.com/github/CodeSearchNet)
### Training Objective
This model is initialized with Roberta-base and trained with a simple MLM (Masked Language Model) objective.
### Usage
```python
from transformers import RobertaTokenizer, RobertaForMaskedLM, pipeline
model = RobertaForMaskedLM.from_pretrained('microsoft/codebert-base-mlm')
tokenizer = RobertaTokenizer.from_pretrained('microsoft/codebert-base-mlm')
code_example = "if (x is not None) <mask> (x>1)"
fill_mask = pipeline('fill-mask', model=model, tokenizer=tokenizer)
outputs = fill_mask(code_example)
print(outputs)
```
Expected results:
```
{'sequence': '<s> if (x is not None) and (x>1)</s>', 'score': 0.6049249172210693, 'token': 8}
{'sequence': '<s> if (x is not None) or (x>1)</s>', 'score': 0.30680200457572937, 'token': 50}
{'sequence': '<s> if (x is not None) if (x>1)</s>', 'score': 0.02133703976869583, 'token': 114}
{'sequence': '<s> if (x is not None) then (x>1)</s>', 'score': 0.018607674166560173, 'token': 172}
{'sequence': '<s> if (x is not None) AND (x>1)</s>', 'score': 0.007619690150022507, 'token': 4248}
```
### Reference
1. [Bimodal CodeBERT trained with MLM+RTD objective](https://huggingface.co/microsoft/codebert-base) (suitable for code search and document generation)
2. 🤗 [Hugging Face's CodeBERTa](https://huggingface.co/huggingface/CodeBERTa-small-v1) (small size, 6 layers)
### Citation
```bibtex
@misc{feng2020codebert,
title={CodeBERT: A Pre-Trained Model for Programming and Natural Languages},
author={Zhangyin Feng and Daya Guo and Duyu Tang and Nan Duan and Xiaocheng Feng and Ming Gong and Linjun Shou and Bing Qin and Ting Liu and Daxin Jiang and Ming Zhou},
year={2020},
eprint={2002.08155},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 2,005 | [
[
-0.01088714599609375,
-0.0226287841796875,
0.005977630615234375,
0.034027099609375,
0.0034503936767578125,
0.00946807861328125,
-0.030975341796875,
-0.00571441650390625,
0.0191497802734375,
0.02947998046875,
-0.032928466796875,
-0.059783935546875,
-0.05499267578125,
-0.004863739013671875,
-0.0295257568359375,
0.10150146484375,
-0.0005116462707519531,
0.0188751220703125,
-0.0112152099609375,
-0.019927978515625,
-0.0138092041015625,
-0.058349609375,
-0.026580810546875,
-0.023162841796875,
0.0242156982421875,
0.01104736328125,
0.03753662109375,
0.02191162109375,
0.0257720947265625,
0.0241546630859375,
-0.0028533935546875,
-0.010528564453125,
-0.038604736328125,
-0.0187835693359375,
0.025848388671875,
-0.057708740234375,
-0.045806884765625,
0.0205078125,
0.030487060546875,
0.0433349609375,
0.00955963134765625,
0.04132080078125,
0.01139068603515625,
0.06695556640625,
-0.032989501953125,
0.0213623046875,
-0.03997802734375,
0.011749267578125,
0.00339508056640625,
-0.01837158203125,
-0.0305328369140625,
-0.034912109375,
0.0052337646484375,
-0.0287017822265625,
0.0251007080078125,
-0.00098419189453125,
0.09222412109375,
0.01378631591796875,
-0.00870513916015625,
-0.0275421142578125,
-0.02947998046875,
0.0780029296875,
-0.05548095703125,
-0.0001316070556640625,
0.0301513671875,
-0.004390716552734375,
0.006992340087890625,
-0.07110595703125,
-0.0199432373046875,
-0.0242919921875,
-0.00878143310546875,
0.0006389617919921875,
-0.030364990234375,
-0.0038051605224609375,
0.038787841796875,
0.02154541015625,
-0.07177734375,
-0.023834228515625,
-0.05047607421875,
-0.021697998046875,
0.04296875,
0.0023670196533203125,
0.0177001953125,
-0.00835418701171875,
-0.041839599609375,
-0.0133819580078125,
-0.036651611328125,
0.0253753662109375,
0.034454345703125,
0.009613037109375,
-0.034210205078125,
0.037750244140625,
-0.001056671142578125,
0.06976318359375,
0.0014553070068359375,
-0.003330230712890625,
0.06298828125,
-0.0301971435546875,
-0.0223846435546875,
0.0037746429443359375,
0.06298828125,
0.0126495361328125,
0.05133056640625,
0.00045299530029296875,
-0.0193328857421875,
0.00658416748046875,
0.013458251953125,
-0.07244873046875,
-0.04864501953125,
0.024810791015625,
-0.0550537109375,
-0.042816162109375,
0.0276947021484375,
-0.037506103515625,
0.00537872314453125,
-0.03021240234375,
0.04229736328125,
-0.0249176025390625,
-0.003879547119140625,
0.0130767822265625,
-0.00826263427734375,
0.02215576171875,
0.0020008087158203125,
-0.04193115234375,
0.0175323486328125,
0.0224151611328125,
0.06890869140625,
-0.007717132568359375,
-0.0164031982421875,
-0.045562744140625,
-0.03582763671875,
-0.01085662841796875,
0.02197265625,
-0.032470703125,
-0.005176544189453125,
0.0028228759765625,
0.01398468017578125,
-0.0177001953125,
-0.0377197265625,
0.0283660888671875,
-0.04595947265625,
0.02899169921875,
0.0196380615234375,
-0.037567138671875,
-0.0173187255859375,
0.0241851806640625,
-0.040374755859375,
0.084228515625,
0.0245513916015625,
-0.050567626953125,
0.0173187255859375,
-0.048248291015625,
-0.0172576904296875,
-0.00812530517578125,
-0.02093505859375,
-0.047576904296875,
-0.0174713134765625,
0.004581451416015625,
0.032806396484375,
-0.0007181167602539062,
0.0175323486328125,
-0.01837158203125,
-0.0305328369140625,
0.03082275390625,
-0.006580352783203125,
0.0909423828125,
0.0233001708984375,
-0.037994384765625,
0.0306854248046875,
-0.07000732421875,
0.040130615234375,
0.01526641845703125,
-0.035675048828125,
0.00264739990234375,
-0.01549530029296875,
0.0231781005859375,
0.02044677734375,
0.036285400390625,
-0.01526641845703125,
0.026580810546875,
-0.033599853515625,
0.0426025390625,
0.04345703125,
-0.0225830078125,
0.036407470703125,
-0.0257110595703125,
0.06524658203125,
-0.0007696151733398438,
0.0108184814453125,
-0.0047607421875,
-0.0382080078125,
-0.0557861328125,
-0.028289794921875,
0.04522705078125,
0.04705810546875,
-0.046722412109375,
0.056549072265625,
-0.01666259765625,
-0.04132080078125,
-0.035888671875,
0.0198516845703125,
0.041259765625,
0.016571044921875,
0.03375244140625,
-0.0225067138671875,
-0.06646728515625,
-0.052764892578125,
-0.01617431640625,
0.0034465789794921875,
-0.01348876953125,
0.0117950439453125,
0.047882080078125,
-0.0207366943359375,
0.07501220703125,
-0.050994873046875,
-0.022216796875,
-0.0181121826171875,
0.023345947265625,
0.055206298828125,
0.0513916015625,
0.055084228515625,
-0.058197021484375,
-0.043792724609375,
-0.035003662109375,
-0.045257568359375,
-0.004787445068359375,
-0.005992889404296875,
-0.0070343017578125,
0.0191650390625,
0.0272979736328125,
-0.026763916015625,
0.045684814453125,
0.054473876953125,
-0.0159912109375,
0.045562744140625,
-0.015167236328125,
0.001697540283203125,
-0.07513427734375,
0.01407623291015625,
-0.00667572021484375,
-0.01247406005859375,
-0.04791259765625,
-0.01092529296875,
0.015289306640625,
-0.011688232421875,
-0.0302886962890625,
0.02203369140625,
-0.04132080078125,
0.0004839897155761719,
-0.00328826904296875,
0.023651123046875,
0.006195068359375,
0.064697265625,
0.00177001953125,
0.052764892578125,
0.055389404296875,
-0.040130615234375,
0.01248931884765625,
0.01010894775390625,
-0.028411865234375,
-0.0006303787231445312,
-0.06988525390625,
0.006893157958984375,
0.00612640380859375,
0.01495361328125,
-0.07611083984375,
0.0161590576171875,
0.0261993408203125,
-0.06695556640625,
0.01467132568359375,
-0.0299072265625,
-0.028076171875,
-0.0255279541015625,
-0.035675048828125,
0.040130615234375,
0.04425048828125,
-0.0297088623046875,
0.0246124267578125,
0.00988006591796875,
0.01264190673828125,
-0.043304443359375,
-0.055633544921875,
-0.00804901123046875,
-0.01178741455078125,
-0.049835205078125,
0.03131103515625,
-0.03143310546875,
0.00458526611328125,
-0.0110931396484375,
0.0011587142944335938,
-0.007781982421875,
-0.0053253173828125,
0.0231170654296875,
0.039642333984375,
-0.0169525146484375,
0.007068634033203125,
-0.04345703125,
-0.00815582275390625,
0.01224517822265625,
-0.03973388671875,
0.0634765625,
-0.033233642578125,
-0.00551605224609375,
-0.0022296905517578125,
-0.005931854248046875,
0.01444244384765625,
-0.0330810546875,
0.06201171875,
0.06781005859375,
-0.026519775390625,
-0.0131683349609375,
-0.0183563232421875,
0.0034122467041015625,
-0.032867431640625,
0.020843505859375,
-0.0156097412109375,
-0.058685302734375,
0.033294677734375,
0.021697998046875,
-0.0162811279296875,
0.036865234375,
0.053955078125,
0.0168609619140625,
0.06451416015625,
0.0279388427734375,
-0.0189361572265625,
0.045166015625,
-0.06451416015625,
0.0155029296875,
-0.04571533203125,
-0.01509857177734375,
-0.052490234375,
-0.026947021484375,
-0.049713134765625,
-0.042083740234375,
0.0156097412109375,
0.023773193359375,
-0.0264892578125,
0.058502197265625,
-0.059478759765625,
0.00807952880859375,
0.059600830078125,
0.0169830322265625,
-0.004047393798828125,
0.0092315673828125,
-0.016510009765625,
-0.001369476318359375,
-0.047576904296875,
-0.03759765625,
0.0989990234375,
0.0160675048828125,
0.0657958984375,
0.00673675537109375,
0.07177734375,
0.0259552001953125,
0.0157012939453125,
-0.047119140625,
0.02783203125,
0.005008697509765625,
-0.052764892578125,
-0.0031147003173828125,
-0.042205810546875,
-0.07781982421875,
-0.0001589059829711914,
-0.018096923828125,
-0.06048583984375,
0.0109100341796875,
0.015106201171875,
-0.0174713134765625,
0.005092620849609375,
-0.052154541015625,
0.07354736328125,
-0.0281982421875,
-0.026336669921875,
0.00930023193359375,
-0.0572509765625,
0.02142333984375,
0.0035037994384765625,
0.01457977294921875,
0.00701904296875,
0.00461578369140625,
0.06689453125,
-0.036865234375,
0.055938720703125,
-0.010833740234375,
0.00522613525390625,
0.027130126953125,
-0.01007843017578125,
0.037506103515625,
0.0004115104675292969,
-0.000021278858184814453,
0.0282135009765625,
0.0140838623046875,
-0.04290771484375,
-0.029296875,
0.03961181640625,
-0.06146240234375,
-0.0108795166015625,
-0.0467529296875,
-0.022430419921875,
0.01119232177734375,
0.0173187255859375,
0.0391845703125,
0.0426025390625,
0.00588226318359375,
0.033233642578125,
0.041778564453125,
-0.0300750732421875,
0.02630615234375,
0.0299224853515625,
-0.022857666015625,
-0.03912353515625,
0.07086181640625,
-0.003101348876953125,
0.01451873779296875,
0.03558349609375,
-0.003997802734375,
-0.01001739501953125,
-0.0272979736328125,
-0.0127105712890625,
0.0015954971313476562,
-0.046783447265625,
-0.029296875,
-0.04302978515625,
-0.0413818359375,
-0.039642333984375,
-0.00867462158203125,
-0.0166473388671875,
-0.02862548828125,
-0.033233642578125,
0.01129913330078125,
0.0267333984375,
0.031585693359375,
-0.0006480216979980469,
-0.0011796951293945312,
-0.056884765625,
0.019500732421875,
-0.0022487640380859375,
0.02880859375,
-0.00914764404296875,
-0.0518798828125,
-0.04229736328125,
0.00843048095703125,
-0.025421142578125,
-0.06005859375,
0.037506103515625,
0.0117034912109375,
0.042449951171875,
0.02862548828125,
0.005321502685546875,
0.03106689453125,
-0.0207672119140625,
0.059356689453125,
0.0268096923828125,
-0.06707763671875,
0.041412353515625,
-0.0031299591064453125,
0.0178375244140625,
0.038726806640625,
0.037750244140625,
-0.0193023681640625,
-0.0112762451171875,
-0.0540771484375,
-0.07177734375,
0.059478759765625,
0.0430908203125,
0.005123138427734375,
0.01035308837890625,
-0.004467010498046875,
-0.0182342529296875,
0.0257720947265625,
-0.0908203125,
-0.0511474609375,
-0.0244140625,
-0.036163330078125,
-0.0111236572265625,
-0.0068206787109375,
-0.01611328125,
-0.037078857421875,
0.05181884765625,
-0.006252288818359375,
0.037841796875,
0.0220184326171875,
-0.035675048828125,
0.0104522705078125,
0.0035724639892578125,
0.061767578125,
0.0438232421875,
-0.03424072265625,
-0.002292633056640625,
-0.0024471282958984375,
-0.03729248046875,
0.00006681680679321289,
0.0169830322265625,
0.00206756591796875,
0.0030460357666015625,
0.04339599609375,
0.054901123046875,
0.0211181640625,
-0.057769775390625,
0.059539794921875,
0.0002803802490234375,
-0.0355224609375,
-0.05413818359375,
0.01430511474609375,
0.00273895263671875,
0.0194549560546875,
0.0287017822265625,
0.0246429443359375,
0.0038013458251953125,
-0.02642822265625,
0.03076171875,
0.0274200439453125,
-0.03582763671875,
-0.00893402099609375,
0.05523681640625,
-0.005825042724609375,
-0.036285400390625,
0.02581787109375,
-0.031982421875,
-0.0601806640625,
0.06689453125,
0.019622802734375,
0.0692138671875,
0.02716064453125,
0.00412750244140625,
0.05548095703125,
0.0215301513671875,
0.011383056640625,
0.0191650390625,
-0.010772705078125,
-0.047515869140625,
-0.0050811767578125,
-0.05206298828125,
0.004764556884765625,
0.006114959716796875,
-0.052154541015625,
0.0186920166015625,
-0.0283355712890625,
-0.02349853515625,
-0.012908935546875,
0.024658203125,
-0.07464599609375,
0.00957489013671875,
0.0050811767578125,
0.06732177734375,
-0.042205810546875,
0.06854248046875,
0.048187255859375,
-0.0615234375,
-0.0748291015625,
-0.007572174072265625,
-0.00991058349609375,
-0.07147216796875,
0.0836181640625,
0.0203704833984375,
0.00678253173828125,
0.00482177734375,
-0.0584716796875,
-0.0614013671875,
0.08282470703125,
0.00545501708984375,
-0.032135009765625,
-0.00583648681640625,
-0.0029430389404296875,
0.041748046875,
-0.04290771484375,
0.031402587890625,
0.02032470703125,
0.0252532958984375,
-0.0107421875,
-0.038238525390625,
0.0007352828979492188,
-0.04290771484375,
0.0021610260009765625,
-0.0030612945556640625,
-0.0372314453125,
0.0853271484375,
-0.0163116455078125,
0.00450897216796875,
0.0162506103515625,
0.0391845703125,
0.0299530029296875,
0.021820068359375,
0.032257080078125,
0.037567138671875,
0.04132080078125,
-0.01280975341796875,
0.05682373046875,
-0.06268310546875,
0.064208984375,
0.0753173828125,
0.01062774658203125,
0.043304443359375,
0.0194091796875,
-0.042083740234375,
0.061279296875,
0.04473876953125,
-0.038360595703125,
0.02130126953125,
0.048675537109375,
-0.004047393798828125,
-0.0023593902587890625,
0.036285400390625,
-0.04315185546875,
0.0251007080078125,
0.004520416259765625,
-0.0584716796875,
0.002811431884765625,
-0.01291656494140625,
0.01026153564453125,
0.005840301513671875,
-0.0037250518798828125,
0.026947021484375,
0.0010786056518554688,
-0.0452880859375,
0.07965087890625,
0.0047149658203125,
0.06573486328125,
-0.053009033203125,
-0.0077362060546875,
-0.009765625,
0.017333984375,
-0.0311126708984375,
-0.034088134765625,
-0.015960693359375,
0.01418304443359375,
-0.0239410400390625,
-0.01280975341796875,
0.0440673828125,
-0.0435791015625,
-0.04766845703125,
0.034515380859375,
0.0178985595703125,
0.023590087890625,
-0.0026226043701171875,
-0.07781982421875,
0.0114593505859375,
0.021148681640625,
-0.036651611328125,
0.01526641845703125,
0.01476287841796875,
0.0243072509765625,
0.0443115234375,
0.045196533203125,
0.0033588409423828125,
0.0159149169921875,
0.004001617431640625,
0.06573486328125,
-0.050811767578125,
-0.018310546875,
-0.05157470703125,
0.04791259765625,
-0.0038890838623046875,
-0.032867431640625,
0.060302734375,
0.0631103515625,
0.0660400390625,
-0.02581787109375,
0.048126220703125,
-0.0242919921875,
0.014617919921875,
-0.048919677734375,
0.05316162109375,
-0.048370361328125,
0.01464080810546875,
-0.03125,
-0.06329345703125,
-0.0108642578125,
0.058319091796875,
-0.02105712890625,
0.040313720703125,
0.04425048828125,
0.0860595703125,
0.0118560791015625,
-0.03057861328125,
0.014007568359375,
0.011474609375,
0.01473236083984375,
0.053375244140625,
0.027069091796875,
-0.057464599609375,
0.057220458984375,
-0.0167999267578125,
-0.021820068359375,
-0.0300445556640625,
-0.059326171875,
-0.07781982421875,
-0.04461669921875,
-0.024017333984375,
-0.053466796875,
-0.0193023681640625,
0.0836181640625,
0.06744384765625,
-0.07598876953125,
-0.017791748046875,
-0.0110931396484375,
0.00787353515625,
-0.01343536376953125,
-0.0272369384765625,
0.0305633544921875,
-0.0384521484375,
-0.05413818359375,
0.00891876220703125,
-0.007350921630859375,
-0.00879669189453125,
-0.0136871337890625,
-0.034515380859375,
-0.0296630859375,
-0.0114593505859375,
0.03741455078125,
0.0193328857421875,
-0.040985107421875,
-0.0029659271240234375,
0.0005283355712890625,
-0.023956298828125,
0.0208587646484375,
0.055938720703125,
-0.06402587890625,
0.0257110595703125,
0.03045654296875,
0.024749755859375,
0.04931640625,
0.00019979476928710938,
0.024566650390625,
-0.0694580078125,
0.014617919921875,
0.01003265380859375,
0.0321044921875,
0.00809478759765625,
-0.00995635986328125,
0.044891357421875,
0.0257720947265625,
-0.046875,
-0.050018310546875,
-0.0023651123046875,
-0.07977294921875,
-0.010772705078125,
0.07720947265625,
-0.022918701171875,
-0.0036907196044921875,
0.01558685302734375,
-0.0226593017578125,
0.0243072509765625,
-0.021728515625,
0.037261962890625,
0.03973388671875,
0.0002923011779785156,
-0.0012617111206054688,
-0.042083740234375,
0.040985107421875,
0.02593994140625,
-0.038909912109375,
-0.023651123046875,
0.01025390625,
0.032989501953125,
0.02398681640625,
0.052398681640625,
-0.0088348388671875,
0.01074981689453125,
-0.001979827880859375,
0.038482666015625,
-0.01184844970703125,
-0.00925445556640625,
-0.030487060546875,
0.0109710693359375,
0.0009784698486328125,
-0.0276641845703125
]
] |
timm/tf_efficientnetv2_s.in21k | 2023-04-27T22:17:52.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-21k",
"arxiv:2104.00298",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/tf_efficientnetv2_s.in21k | 2 | 14,966 | timm | 2022-12-13T00:18:57 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-21k
---
# Model card for tf_efficientnetv2_s.in21k
A EfficientNet-v2 image classification model. Trained on ImageNet-21k in Tensorflow by paper authors, ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 48.2
- GMACs: 5.4
- Activations (M): 22.8
- Image size: train = 300 x 300, test = 384 x 384
- **Papers:**
- EfficientNetV2: Smaller Models and Faster Training: https://arxiv.org/abs/2104.00298
- **Dataset:** ImageNet-21k
- **Original:** https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('tf_efficientnetv2_s.in21k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnetv2_s.in21k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 24, 150, 150])
# torch.Size([1, 48, 75, 75])
# torch.Size([1, 64, 38, 38])
# torch.Size([1, 160, 19, 19])
# torch.Size([1, 256, 10, 10])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnetv2_s.in21k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 10, 10) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2021efficientnetv2,
title={Efficientnetv2: Smaller models and faster training},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={10096--10106},
year={2021},
organization={PMLR}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,076 | [
[
-0.0269012451171875,
-0.033477783203125,
-0.004302978515625,
0.00756072998046875,
-0.022918701171875,
-0.029541015625,
-0.0212554931640625,
-0.029541015625,
0.011688232421875,
0.02935791015625,
-0.0237579345703125,
-0.0457763671875,
-0.054168701171875,
-0.0157012939453125,
-0.01036834716796875,
0.064453125,
-0.00921630859375,
0.0022945404052734375,
-0.013916015625,
-0.0372314453125,
-0.0077972412109375,
-0.0072021484375,
-0.0657958984375,
-0.0347900390625,
0.024871826171875,
0.024200439453125,
0.03521728515625,
0.05596923828125,
0.050445556640625,
0.03582763671875,
-0.005825042724609375,
0.00921630859375,
-0.0245361328125,
-0.0107879638671875,
0.0304107666015625,
-0.0443115234375,
-0.0291900634765625,
0.00916290283203125,
0.0523681640625,
0.0260772705078125,
0.00286865234375,
0.03515625,
0.01078033447265625,
0.0399169921875,
-0.02117919921875,
0.0146484375,
-0.0273284912109375,
0.0131072998046875,
-0.00750732421875,
0.005565643310546875,
-0.0191802978515625,
-0.026397705078125,
0.0129547119140625,
-0.041107177734375,
0.03131103515625,
-0.003566741943359375,
0.09722900390625,
0.0220489501953125,
-0.01030731201171875,
-0.0006089210510253906,
-0.0165252685546875,
0.053009033203125,
-0.052642822265625,
0.0149383544921875,
0.02215576171875,
0.0164642333984375,
-0.004772186279296875,
-0.08489990234375,
-0.0333251953125,
-0.01030731201171875,
-0.01325225830078125,
-0.005035400390625,
-0.02386474609375,
0.00391387939453125,
0.02447509765625,
0.0152740478515625,
-0.038421630859375,
0.0173187255859375,
-0.046539306640625,
-0.01788330078125,
0.041656494140625,
0.0012722015380859375,
0.0194549560546875,
-0.0189971923828125,
-0.0328369140625,
-0.0369873046875,
-0.0238494873046875,
0.025482177734375,
0.024444580078125,
0.01084136962890625,
-0.039764404296875,
0.03753662109375,
0.0048828125,
0.04522705078125,
-0.0032711029052734375,
-0.0251007080078125,
0.04327392578125,
0.002460479736328125,
-0.0306549072265625,
-0.015106201171875,
0.08251953125,
0.035980224609375,
0.0161285400390625,
0.007709503173828125,
-0.01030731201171875,
-0.0275726318359375,
-0.008453369140625,
-0.099853515625,
-0.03369140625,
0.02490234375,
-0.04888916015625,
-0.03289794921875,
0.0180206298828125,
-0.042938232421875,
-0.00914764404296875,
-0.00010699033737182617,
0.0419921875,
-0.0289154052734375,
-0.03228759765625,
-0.00848388671875,
-0.022552490234375,
0.0221099853515625,
0.01474761962890625,
-0.0379638671875,
0.01073455810546875,
0.030548095703125,
0.0887451171875,
0.00435638427734375,
-0.0266571044921875,
-0.0227813720703125,
-0.0269927978515625,
-0.0259246826171875,
0.03485107421875,
-0.0023040771484375,
-0.0007066726684570312,
-0.0212554931640625,
0.0227203369140625,
-0.006740570068359375,
-0.054351806640625,
0.01415252685546875,
-0.017822265625,
0.0141754150390625,
-0.00017690658569335938,
-0.0159149169921875,
-0.03863525390625,
0.0162506103515625,
-0.033416748046875,
0.1007080078125,
0.0265655517578125,
-0.064453125,
0.016387939453125,
-0.0411376953125,
-0.005645751953125,
-0.019134521484375,
0.003131866455078125,
-0.08099365234375,
-0.007076263427734375,
0.006389617919921875,
0.060089111328125,
-0.0213165283203125,
0.005367279052734375,
-0.04583740234375,
-0.0204010009765625,
0.0179901123046875,
0.001010894775390625,
0.08038330078125,
0.01552581787109375,
-0.037139892578125,
0.0170745849609375,
-0.038482666015625,
0.015838623046875,
0.038421630859375,
-0.0206146240234375,
-0.0026950836181640625,
-0.048980712890625,
0.0180206298828125,
0.023956298828125,
0.0080718994140625,
-0.03741455078125,
0.0193939208984375,
-0.01219940185546875,
0.03924560546875,
0.043487548828125,
-0.01470184326171875,
0.0249481201171875,
-0.0230560302734375,
0.01456451416015625,
0.01666259765625,
0.01139068603515625,
0.006160736083984375,
-0.042266845703125,
-0.06842041015625,
-0.037353515625,
0.0280914306640625,
0.0300445556640625,
-0.046234130859375,
0.02972412109375,
-0.0199432373046875,
-0.061767578125,
-0.0287017822265625,
0.00921630859375,
0.039306640625,
0.0423583984375,
0.020843505859375,
-0.03240966796875,
-0.035369873046875,
-0.06939697265625,
-0.002338409423828125,
-0.005764007568359375,
0.001800537109375,
0.027984619140625,
0.057647705078125,
-0.00659942626953125,
0.03997802734375,
-0.03125,
-0.02056884765625,
-0.0179290771484375,
0.004611968994140625,
0.0235748291015625,
0.059112548828125,
0.06121826171875,
-0.040313720703125,
-0.038909912109375,
-0.0098114013671875,
-0.06982421875,
0.0131072998046875,
0.0019702911376953125,
-0.02044677734375,
0.021728515625,
0.0188140869140625,
-0.043121337890625,
0.04290771484375,
0.0172882080078125,
-0.0321044921875,
0.02569580078125,
-0.0189208984375,
0.01386260986328125,
-0.0863037109375,
0.012603759765625,
0.027069091796875,
-0.01849365234375,
-0.036468505859375,
0.0070343017578125,
0.00487518310546875,
-0.0012407302856445312,
-0.038909912109375,
0.05035400390625,
-0.043914794921875,
-0.0190887451171875,
-0.0106353759765625,
-0.02215576171875,
-0.00022017955780029297,
0.047393798828125,
-0.006603240966796875,
0.0280303955078125,
0.058135986328125,
-0.032501220703125,
0.041107177734375,
0.02960205078125,
-0.0202484130859375,
0.0251617431640625,
-0.053314208984375,
0.0174560546875,
0.00144195556640625,
0.016754150390625,
-0.07403564453125,
-0.0278778076171875,
0.0299224853515625,
-0.047607421875,
0.044036865234375,
-0.039306640625,
-0.03564453125,
-0.0413818359375,
-0.03594970703125,
0.0253448486328125,
0.05401611328125,
-0.05718994140625,
0.03289794921875,
0.01849365234375,
0.0213623046875,
-0.045318603515625,
-0.078125,
-0.0155181884765625,
-0.0302581787109375,
-0.059326171875,
0.02447509765625,
0.0156707763671875,
0.004940032958984375,
0.0162200927734375,
-0.0012006759643554688,
-0.013092041015625,
-0.003734588623046875,
0.037200927734375,
0.021728515625,
-0.0221405029296875,
-0.002536773681640625,
-0.023040771484375,
-0.00441741943359375,
0.0035800933837890625,
-0.0307159423828125,
0.041534423828125,
-0.0225830078125,
-0.0028285980224609375,
-0.06689453125,
-0.00705718994140625,
0.0272369384765625,
-0.0018939971923828125,
0.06298828125,
0.0906982421875,
-0.038360595703125,
-0.004878997802734375,
-0.031890869140625,
-0.025665283203125,
-0.036224365234375,
0.047576904296875,
-0.0249176025390625,
-0.039154052734375,
0.05548095703125,
0.005435943603515625,
0.0068817138671875,
0.0562744140625,
0.03131103515625,
-0.007785797119140625,
0.048614501953125,
0.03778076171875,
0.0261077880859375,
0.057281494140625,
-0.0830078125,
-0.018798828125,
-0.0604248046875,
-0.043609619140625,
-0.030517578125,
-0.0474853515625,
-0.054473876953125,
-0.03289794921875,
0.031524658203125,
0.01824951171875,
-0.037872314453125,
0.0362548828125,
-0.061126708984375,
0.0068359375,
0.052337646484375,
0.04229736328125,
-0.0291900634765625,
0.0301513671875,
-0.0100555419921875,
0.0036296844482421875,
-0.06231689453125,
-0.0157318115234375,
0.0869140625,
0.031890869140625,
0.03515625,
-0.0032024383544921875,
0.050445556640625,
-0.018218994140625,
0.02001953125,
-0.049407958984375,
0.0416259765625,
-0.0124053955078125,
-0.0311431884765625,
-0.00946807861328125,
-0.040283203125,
-0.079345703125,
0.01174163818359375,
-0.0205535888671875,
-0.05908203125,
0.0101776123046875,
0.0180206298828125,
-0.015411376953125,
0.05999755859375,
-0.0657958984375,
0.07574462890625,
-0.0084381103515625,
-0.038665771484375,
0.0036067962646484375,
-0.04840087890625,
0.0245361328125,
0.0218505859375,
-0.021881103515625,
-0.00597381591796875,
0.00165557861328125,
0.09112548828125,
-0.050506591796875,
0.056243896484375,
-0.040618896484375,
0.04058837890625,
0.04132080078125,
-0.007442474365234375,
0.03314208984375,
-0.0082855224609375,
-0.0094451904296875,
0.025054931640625,
0.00066375732421875,
-0.036529541015625,
-0.0382080078125,
0.0484619140625,
-0.07952880859375,
-0.0205535888671875,
-0.0278472900390625,
-0.0297698974609375,
0.0181121826171875,
0.0084228515625,
0.040802001953125,
0.05023193359375,
0.0201416015625,
0.0288238525390625,
0.040313720703125,
-0.0240020751953125,
0.041412353515625,
-0.0125274658203125,
-0.00986480712890625,
-0.039764404296875,
0.0657958984375,
0.0199432373046875,
0.00887298583984375,
0.00634002685546875,
0.0192108154296875,
-0.0296173095703125,
-0.047119140625,
-0.0260162353515625,
0.0220184326171875,
-0.05401611328125,
-0.039398193359375,
-0.055572509765625,
-0.025909423828125,
-0.0298309326171875,
0.002246856689453125,
-0.04132080078125,
-0.03564453125,
-0.038055419921875,
0.01445770263671875,
0.060302734375,
0.0457763671875,
-0.0183258056640625,
0.0450439453125,
-0.032623291015625,
0.01154327392578125,
0.01155853271484375,
0.0338134765625,
-0.0022792816162109375,
-0.06512451171875,
-0.0137481689453125,
-0.01036834716796875,
-0.0305023193359375,
-0.048004150390625,
0.0360107421875,
0.0197296142578125,
0.03131103515625,
0.0286407470703125,
-0.0176849365234375,
0.054656982421875,
0.003986358642578125,
0.039581298828125,
0.03436279296875,
-0.03680419921875,
0.038299560546875,
-0.0018434524536132812,
0.005504608154296875,
0.01090240478515625,
0.0178985595703125,
-0.0169219970703125,
0.005252838134765625,
-0.07232666015625,
-0.06060791015625,
0.0703125,
0.01128387451171875,
-0.0036563873291015625,
0.0347900390625,
0.05279541015625,
-0.0008687973022460938,
0.0006608963012695312,
-0.047943115234375,
-0.034912109375,
-0.0275421142578125,
-0.0240631103515625,
-0.0007963180541992188,
-0.0100555419921875,
-0.0017375946044921875,
-0.048736572265625,
0.054229736328125,
-0.00518798828125,
0.06591796875,
0.0171051025390625,
-0.006069183349609375,
-0.0002351999282836914,
-0.037750244140625,
0.0340576171875,
0.0186004638671875,
-0.0227508544921875,
0.00975799560546875,
0.01128387451171875,
-0.039398193359375,
0.0075531005859375,
0.0144805908203125,
-0.0024662017822265625,
-0.00032711029052734375,
0.03790283203125,
0.078369140625,
-0.009796142578125,
0.0101776123046875,
0.03778076171875,
-0.0024509429931640625,
-0.03228759765625,
-0.021820068359375,
0.01454925537109375,
0.005619049072265625,
0.036590576171875,
0.01474761962890625,
0.03314208984375,
-0.00634765625,
-0.0155792236328125,
0.0191802978515625,
0.038787841796875,
-0.0257110595703125,
-0.0215301513671875,
0.0509033203125,
-0.007904052734375,
-0.0167083740234375,
0.06915283203125,
-0.013702392578125,
-0.0362548828125,
0.084716796875,
0.025787353515625,
0.06787109375,
0.004520416259765625,
0.0059661865234375,
0.06658935546875,
0.0167083740234375,
-0.00641632080078125,
0.013702392578125,
0.0111541748046875,
-0.049835205078125,
0.004642486572265625,
-0.0369873046875,
0.009002685546875,
0.024444580078125,
-0.039794921875,
0.023834228515625,
-0.0509033203125,
-0.032012939453125,
0.00981903076171875,
0.0294036865234375,
-0.0777587890625,
0.01215362548828125,
-0.00414276123046875,
0.0655517578125,
-0.05291748046875,
0.058135986328125,
0.06121826171875,
-0.0311431884765625,
-0.08447265625,
-0.01355743408203125,
0.00356292724609375,
-0.072509765625,
0.046142578125,
0.037689208984375,
0.014617919921875,
0.0080413818359375,
-0.057220458984375,
-0.04730224609375,
0.10931396484375,
0.039520263671875,
-0.00785064697265625,
0.021209716796875,
-0.0051116943359375,
0.01488494873046875,
-0.029449462890625,
0.050445556640625,
0.01763916015625,
0.03375244140625,
0.0226287841796875,
-0.0477294921875,
0.0178070068359375,
-0.0275726318359375,
0.01474761962890625,
0.01016998291015625,
-0.06683349609375,
0.06256103515625,
-0.04180908203125,
-0.004703521728515625,
0.00601959228515625,
0.05499267578125,
0.0141448974609375,
0.01131439208984375,
0.038665771484375,
0.0615234375,
0.04241943359375,
-0.033203125,
0.07098388671875,
0.006259918212890625,
0.050933837890625,
0.039642333984375,
0.039794921875,
0.042205810546875,
0.0298614501953125,
-0.011871337890625,
0.0240936279296875,
0.08740234375,
-0.030609130859375,
0.025604248046875,
0.0164642333984375,
0.01061248779296875,
-0.01026153564453125,
0.003253936767578125,
-0.031341552734375,
0.04559326171875,
0.00800323486328125,
-0.039215087890625,
-0.01507568359375,
-0.002506256103515625,
0.005008697509765625,
-0.035858154296875,
-0.017181396484375,
0.037628173828125,
0.001758575439453125,
-0.03167724609375,
0.06524658203125,
0.01467132568359375,
0.06378173828125,
-0.0256500244140625,
0.0035247802734375,
-0.01617431640625,
0.0195159912109375,
-0.0282440185546875,
-0.06121826171875,
0.0215606689453125,
-0.01488494873046875,
0.006927490234375,
0.0007619857788085938,
0.05059814453125,
-0.0276947021484375,
-0.035369873046875,
0.01525115966796875,
0.0219268798828125,
0.043487548828125,
0.0034637451171875,
-0.090087890625,
0.01325225830078125,
0.00699615478515625,
-0.056793212890625,
0.019195556640625,
0.0214691162109375,
0.007518768310546875,
0.05133056640625,
0.040283203125,
-0.0065155029296875,
0.0104522705078125,
-0.0215606689453125,
0.0579833984375,
-0.0311431884765625,
-0.0235443115234375,
-0.0576171875,
0.050537109375,
-0.01213836669921875,
-0.04913330078125,
0.028656005859375,
0.0439453125,
0.06317138671875,
0.000013947486877441406,
0.034210205078125,
-0.0233917236328125,
-0.009552001953125,
-0.030029296875,
0.05615234375,
-0.06048583984375,
-0.0081329345703125,
-0.0021915435791015625,
-0.056182861328125,
-0.0262298583984375,
0.05413818359375,
-0.016448974609375,
0.035858154296875,
0.036651611328125,
0.0770263671875,
-0.0265350341796875,
-0.0296630859375,
0.0078125,
0.01226806640625,
0.0088043212890625,
0.035736083984375,
0.028533935546875,
-0.06146240234375,
0.032928466796875,
-0.053131103515625,
-0.013671875,
-0.0214691162109375,
-0.05059814453125,
-0.075927734375,
-0.059661865234375,
-0.05010986328125,
-0.05987548828125,
-0.010528564453125,
0.0728759765625,
0.0804443359375,
-0.048675537109375,
-0.01041412353515625,
0.0010509490966796875,
0.01020050048828125,
-0.01491546630859375,
-0.01666259765625,
0.054595947265625,
-0.002628326416015625,
-0.056182861328125,
-0.0283966064453125,
-0.00576019287109375,
0.024810791015625,
0.0036869049072265625,
-0.0212554931640625,
-0.00514984130859375,
-0.0283355712890625,
0.0115203857421875,
0.0203094482421875,
-0.046539306640625,
-0.0146484375,
-0.0196533203125,
-0.0145111083984375,
0.0289154052734375,
0.031829833984375,
-0.033416748046875,
0.0247039794921875,
0.042083740234375,
0.0286407470703125,
0.06982421875,
-0.028961181640625,
-0.00455474853515625,
-0.0576171875,
0.044647216796875,
-0.0092926025390625,
0.03240966796875,
0.036865234375,
-0.0233001708984375,
0.045684814453125,
0.035308837890625,
-0.0275726318359375,
-0.06768798828125,
-0.01055145263671875,
-0.08001708984375,
-0.006900787353515625,
0.07958984375,
-0.032928466796875,
-0.04107666015625,
0.037078857421875,
0.0069427490234375,
0.056915283203125,
-0.01262664794921875,
0.0262298583984375,
0.01165008544921875,
-0.00946044921875,
-0.0450439453125,
-0.04608154296875,
0.03314208984375,
0.007266998291015625,
-0.046966552734375,
-0.035858154296875,
-0.005462646484375,
0.05670166015625,
0.0082550048828125,
0.031463623046875,
-0.0033016204833984375,
0.0098114013671875,
0.01427459716796875,
0.0361328125,
-0.048309326171875,
-0.0094451904296875,
-0.02142333984375,
0.0044097900390625,
-0.006893157958984375,
-0.047332763671875
]
] |
Yntec/realistic-vision-v12 | 2023-10-08T15:52:18.000Z | [
"diffusers",
"Photorealistic",
"Realistic",
"Semi-Realistic",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"SG_161222",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/realistic-vision-v12 | 8 | 14,944 | diffusers | 2023-08-16T20:02:32 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- Photorealistic
- Realistic
- Semi-Realistic
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
- SG_161222
---
# Realistic Vision 1.2
Samples and prompts:


very cute princess with curly hair wearing choker who would marry me
Original page:
https://civitai.com/models/4201?modelVersionId=5196 | 649 | [
[
-0.048828125,
-0.06085205078125,
0.0253753662109375,
0.0278167724609375,
-0.040924072265625,
-0.0261077880859375,
0.0210113525390625,
-0.031982421875,
0.051910400390625,
0.062286376953125,
-0.069091796875,
-0.0283355712890625,
-0.0408935546875,
0.00392913818359375,
-0.036895751953125,
0.031005859375,
0.004245758056640625,
0.0255279541015625,
0.0004410743713378906,
0.03485107421875,
-0.03741455078125,
0.0092620849609375,
-0.055938720703125,
0.01216888427734375,
0.046783447265625,
0.06353759765625,
0.053375244140625,
0.016815185546875,
0.0312347412109375,
0.01511383056640625,
-0.01302337646484375,
-0.019561767578125,
-0.037261962890625,
0.01049041748046875,
-0.0196685791015625,
-0.0377197265625,
-0.0452880859375,
0.0477294921875,
0.044647216796875,
0.037994384765625,
-0.0244293212890625,
0.0029888153076171875,
0.01806640625,
0.07135009765625,
-0.0181121826171875,
-0.00968170166015625,
-0.0328369140625,
0.030487060546875,
-0.028564453125,
0.01093292236328125,
0.0068817138671875,
-0.021240234375,
0.026611328125,
-0.07745361328125,
0.0241546630859375,
0.0047760009765625,
0.10211181640625,
-0.0006589889526367188,
-0.054718017578125,
-0.0008902549743652344,
-0.0302886962890625,
0.028778076171875,
-0.01788330078125,
0.048553466796875,
0.0228729248046875,
0.048828125,
-0.0239105224609375,
-0.0748291015625,
-0.046875,
0.0006966590881347656,
0.006572723388671875,
0.0277252197265625,
-0.0333251953125,
-0.004150390625,
0.04376220703125,
0.0273590087890625,
-0.052734375,
-0.00662994384765625,
-0.04998779296875,
0.023223876953125,
0.0440673828125,
0.00002110004425048828,
0.0206451416015625,
0.007595062255859375,
-0.01031494140625,
-0.0106658935546875,
-0.023529052734375,
0.0068511962890625,
0.0394287109375,
0.0002484321594238281,
-0.042449951171875,
0.038604736328125,
-0.012603759765625,
0.052947998046875,
0.054168701171875,
-0.0149078369140625,
-0.01149749755859375,
0.00803375244140625,
-0.01497650146484375,
-0.0160980224609375,
0.0291900634765625,
0.056396484375,
0.037353515625,
0.026885986328125,
0.0216217041015625,
-0.00628662109375,
0.0277252197265625,
-0.09271240234375,
-0.0382080078125,
0.00981903076171875,
-0.04339599609375,
-0.049652099609375,
-0.00441741943359375,
-0.08184814453125,
-0.0031108856201171875,
-0.00031304359436035156,
0.004878997802734375,
-0.029388427734375,
-0.00942230224609375,
0.01395416259765625,
-0.00818634033203125,
0.03924560546875,
0.04559326171875,
-0.08447265625,
0.006168365478515625,
0.031097412109375,
0.05670166015625,
0.0235137939453125,
0.01438140869140625,
-0.0014085769653320312,
-0.0202789306640625,
-0.054962158203125,
0.039215087890625,
-0.0009756088256835938,
-0.034515380859375,
-0.01427459716796875,
0.00853729248046875,
0.0018663406372070312,
-0.053741455078125,
0.08447265625,
-0.042327880859375,
-0.0036296844482421875,
-0.0174560546875,
-0.0213623046875,
-0.0130767822265625,
-0.000698089599609375,
-0.057342529296875,
0.01314544677734375,
0.0225830078125,
-0.048980712890625,
0.047210693359375,
-0.033203125,
-0.0034961700439453125,
0.0214385986328125,
-0.01241302490234375,
-0.0328369140625,
-0.0117950439453125,
0.005306243896484375,
0.024322509765625,
-0.0053863525390625,
-0.0035305023193359375,
-0.060028076171875,
0.0032367706298828125,
0.0208587646484375,
-0.00763702392578125,
0.060333251953125,
0.037139892578125,
-0.0099945068359375,
-0.0108489990234375,
-0.04815673828125,
0.0053558349609375,
0.0404052734375,
0.0011119842529296875,
0.00213623046875,
-0.01019287109375,
0.02362060546875,
0.028411865234375,
-0.008544921875,
-0.029541015625,
0.03802490234375,
-0.016021728515625,
0.0022449493408203125,
0.0379638671875,
0.0301666259765625,
0.023590087890625,
-0.034759521484375,
0.0732421875,
0.0161895751953125,
0.0211639404296875,
0.0208740234375,
-0.019622802734375,
-0.0587158203125,
-0.03558349609375,
0.034393310546875,
0.03814697265625,
-0.021636962890625,
0.028778076171875,
-0.003589630126953125,
-0.061492919921875,
-0.04473876953125,
-0.01551055908203125,
0.0075225830078125,
0.053466796875,
-0.00920867919921875,
-0.039215087890625,
-0.055938720703125,
-0.0863037109375,
-0.01206207275390625,
-0.0036220550537109375,
-0.0187530517578125,
0.0215301513671875,
0.032806396484375,
0.004207611083984375,
0.05438232421875,
-0.0640869140625,
-0.0255584716796875,
-0.00965118408203125,
-0.0231170654296875,
0.05377197265625,
0.0601806640625,
0.0638427734375,
-0.084716796875,
-0.03057861328125,
-0.032470703125,
-0.05572509765625,
-0.007320404052734375,
-0.0109100341796875,
-0.02276611328125,
-0.0426025390625,
0.0115814208984375,
-0.0198211669921875,
0.043365478515625,
0.0193328857421875,
-0.055328369140625,
0.08026123046875,
-0.0035839080810546875,
0.037750244140625,
-0.07293701171875,
0.00986480712890625,
0.0229034423828125,
-0.00679779052734375,
-0.04119873046875,
0.04949951171875,
-0.01094818115234375,
-0.0228118896484375,
-0.055816650390625,
0.050872802734375,
-0.03179931640625,
0.02496337890625,
0.005451202392578125,
0.0017824172973632812,
0.018768310546875,
-0.0018634796142578125,
0.0028553009033203125,
0.035064697265625,
0.0645751953125,
-0.042755126953125,
0.035430908203125,
0.068603515625,
-0.0345458984375,
0.08660888671875,
-0.0673828125,
-0.006145477294921875,
0.002227783203125,
0.00727081298828125,
-0.04052734375,
-0.06695556640625,
0.048187255859375,
-0.053741455078125,
0.0042266845703125,
-0.021240234375,
-0.02880859375,
-0.027252197265625,
-0.034942626953125,
0.027130126953125,
0.05889892578125,
-0.04083251953125,
0.0092315673828125,
0.01529693603515625,
-0.0026874542236328125,
-0.006275177001953125,
-0.055328369140625,
0.0033721923828125,
-0.04150390625,
-0.08282470703125,
0.040557861328125,
0.00038814544677734375,
-0.0258331298828125,
-0.0233001708984375,
0.0011262893676757812,
-0.043914794921875,
-0.004787445068359375,
0.028289794921875,
0.03375244140625,
-0.021759033203125,
-0.035400390625,
0.0225830078125,
0.00885009765625,
-0.00276947021484375,
-0.0017852783203125,
0.01195526123046875,
-0.01568603515625,
-0.01493072509765625,
-0.06683349609375,
0.032806396484375,
0.048095703125,
0.00011295080184936523,
0.042205810546875,
0.0213165283203125,
-0.0263519287109375,
-0.00661468505859375,
-0.04046630859375,
-0.0253143310546875,
-0.032012939453125,
0.0127105712890625,
-0.0555419921875,
-0.0452880859375,
0.047210693359375,
0.0007414817810058594,
-0.043670654296875,
0.03814697265625,
0.007076263427734375,
-0.029083251953125,
0.06353759765625,
0.053253173828125,
0.01611328125,
0.0156402587890625,
-0.042755126953125,
0.01097869873046875,
-0.055816650390625,
-0.024383544921875,
-0.00024628639221191406,
-0.0379638671875,
-0.057525634765625,
-0.01666259765625,
0.040924072265625,
0.06842041015625,
-0.0149078369140625,
0.03900146484375,
-0.0206451416015625,
0.027435302734375,
0.045318603515625,
0.031341552734375,
-0.0014009475708007812,
-0.002086639404296875,
-0.0116729736328125,
-0.03533935546875,
-0.03460693359375,
-0.023651123046875,
0.01513671875,
0.01971435546875,
0.048431396484375,
0.01519012451171875,
0.041351318359375,
-0.0177764892578125,
-0.037750244140625,
-0.06097412109375,
0.0648193359375,
0.0016698837280273438,
-0.058563232421875,
-0.01215362548828125,
0.00933837890625,
-0.0750732421875,
0.020233154296875,
-0.040008544921875,
-0.03521728515625,
0.032623291015625,
0.04656982421875,
-0.03985595703125,
0.01464080810546875,
-0.032012939453125,
0.08795166015625,
-0.004840850830078125,
-0.0418701171875,
-0.017974853515625,
-0.053436279296875,
0.038421630859375,
-0.011505126953125,
-0.0189361572265625,
-0.00775146484375,
0.00319671630859375,
0.04864501953125,
-0.035430908203125,
0.0311431884765625,
0.0084075927734375,
0.0125732421875,
0.0275726318359375,
0.004405975341796875,
0.01331329345703125,
0.01229095458984375,
-0.0024623870849609375,
-0.01438140869140625,
-0.007175445556640625,
-0.0264739990234375,
-0.04266357421875,
0.04669189453125,
-0.01617431640625,
-0.021331787109375,
-0.060577392578125,
-0.006145477294921875,
-0.0028133392333984375,
0.00927734375,
0.0550537109375,
0.0275726318359375,
-0.0254364013671875,
-0.005794525146484375,
0.036102294921875,
0.003856658935546875,
0.01155853271484375,
0.025177001953125,
-0.0280914306640625,
-0.0168304443359375,
0.03564453125,
0.02203369140625,
0.01012420654296875,
-0.0019588470458984375,
0.021697998046875,
-0.01552581787109375,
-0.0256805419921875,
-0.029388427734375,
0.026214599609375,
-0.039459228515625,
-0.0006361007690429688,
-0.008453369140625,
-0.02349853515625,
-0.0477294921875,
-0.061065673828125,
-0.006275177001953125,
-0.020233154296875,
-0.045562744140625,
-0.01090240478515625,
0.045074462890625,
0.026580810546875,
-0.011993408203125,
0.026397705078125,
-0.04998779296875,
0.026580810546875,
0.0394287109375,
0.021820068359375,
-0.0006818771362304688,
-0.04510498046875,
0.0213470458984375,
-0.021728515625,
-0.040130615234375,
-0.0638427734375,
0.05126953125,
0.01605224609375,
0.020965576171875,
0.06610107421875,
-0.004856109619140625,
0.07232666015625,
-0.03564453125,
0.052001953125,
0.024383544921875,
-0.050018310546875,
0.063720703125,
-0.051544189453125,
0.0109100341796875,
0.07196044921875,
0.03350830078125,
-0.05401611328125,
-0.02490234375,
-0.0667724609375,
-0.06256103515625,
0.037628173828125,
0.0258331298828125,
-0.00464630126953125,
0.027557373046875,
0.045318603515625,
-0.02813720703125,
0.0241241455078125,
-0.0364990234375,
-0.0316162109375,
-0.01201629638671875,
0.0167999267578125,
0.02490234375,
-0.026519775390625,
0.0147857666015625,
-0.048980712890625,
0.046142578125,
0.01297760009765625,
0.00605010986328125,
0.0305023193359375,
0.040069580078125,
0.0004451274871826172,
0.01078033447265625,
0.0465087890625,
0.0445556640625,
-0.03826904296875,
-0.00415802001953125,
0.003093719482421875,
-0.0491943359375,
0.0035610198974609375,
-0.007442474365234375,
-0.043121337890625,
0.0155029296875,
0.029876708984375,
0.06884765625,
0.01593017578125,
-0.038787841796875,
0.044464111328125,
-0.0010137557983398438,
0.002994537353515625,
-0.043304443359375,
0.0119781494140625,
0.004924774169921875,
0.023223876953125,
0.006084442138671875,
0.0140380859375,
0.03033447265625,
-0.043182373046875,
0.04241943359375,
0.031646728515625,
-0.058349609375,
-0.03204345703125,
0.064208984375,
0.0003230571746826172,
-0.019805908203125,
0.035552978515625,
-0.0275726318359375,
-0.047210693359375,
0.047821044921875,
0.037017822265625,
0.05938720703125,
-0.0302734375,
0.019073486328125,
0.054595947265625,
-0.02081298828125,
0.01267242431640625,
0.047210693359375,
0.0113525390625,
-0.0108642578125,
0.02850341796875,
-0.06536865234375,
-0.0243682861328125,
0.020233154296875,
-0.046722412109375,
0.06787109375,
-0.035797119140625,
0.0188446044921875,
0.01401519775390625,
-0.006160736083984375,
-0.0173797607421875,
0.0538330078125,
0.004352569580078125,
0.10223388671875,
-0.056671142578125,
0.056365966796875,
0.01348876953125,
-0.03399658203125,
-0.1005859375,
0.000446319580078125,
0.02862548828125,
-0.03765869140625,
0.0239410400390625,
0.0171356201171875,
-0.00426483154296875,
-0.005847930908203125,
-0.0650634765625,
-0.02569580078125,
0.06219482421875,
0.025421142578125,
-0.067138671875,
-0.0195465087890625,
-0.0234375,
0.036041259765625,
-0.037689208984375,
0.0020008087158203125,
0.04852294921875,
0.032684326171875,
0.0667724609375,
-0.06121826171875,
-0.0302886962890625,
-0.0279083251953125,
0.0068511962890625,
0.0037288665771484375,
-0.02508544921875,
0.0533447265625,
-0.045989990234375,
-0.01395416259765625,
0.036590576171875,
0.06640625,
0.0257415771484375,
0.00909423828125,
0.0601806640625,
0.050750732421875,
0.0037441253662109375,
-0.001850128173828125,
0.05670166015625,
-0.01399993896484375,
0.01959228515625,
0.07958984375,
-0.010498046875,
0.0643310546875,
0.0228729248046875,
-0.034454345703125,
0.0184783935546875,
0.0767822265625,
-0.040252685546875,
0.05029296875,
0.0296630859375,
-0.031951904296875,
-0.004810333251953125,
-0.02923583984375,
-0.026947021484375,
0.045074462890625,
-0.0155029296875,
-0.005115509033203125,
-0.01158905029296875,
0.0119476318359375,
0.008758544921875,
0.01525115966796875,
-0.036834716796875,
0.0302734375,
0.0228424072265625,
-0.0496826171875,
0.0306549072265625,
-0.0015325546264648438,
0.04168701171875,
-0.0518798828125,
-0.0254364013671875,
0.001800537109375,
0.007701873779296875,
-0.0379638671875,
-0.06964111328125,
0.0280914306640625,
-0.004421234130859375,
-0.0099945068359375,
-0.030487060546875,
0.04364013671875,
-0.03955078125,
-0.0714111328125,
0.01155853271484375,
-0.0018758773803710938,
0.0181427001953125,
-0.0119476318359375,
-0.0572509765625,
-0.01105499267578125,
-0.0123138427734375,
0.0024871826171875,
-0.0035610198974609375,
0.00037550926208496094,
0.006488800048828125,
0.040771484375,
0.020172119140625,
0.0220184326171875,
0.003726959228515625,
0.0030193328857421875,
0.042449951171875,
-0.03192138671875,
-0.043365478515625,
-0.05389404296875,
0.031280517578125,
-0.01255035400390625,
-0.038421630859375,
0.05218505859375,
0.054534912109375,
0.048004150390625,
-0.048858642578125,
0.00940704345703125,
-0.02764892578125,
0.052398681640625,
-0.03167724609375,
0.08245849609375,
-0.058197021484375,
-0.027252197265625,
-0.03997802734375,
-0.031494140625,
-0.0091094970703125,
0.057647705078125,
0.005283355712890625,
-0.00951385498046875,
0.0199737548828125,
0.05908203125,
-0.0108184814453125,
-0.01041412353515625,
0.02667236328125,
0.009857177734375,
0.003078460693359375,
0.01776123046875,
0.052703857421875,
-0.0232696533203125,
-0.00868988037109375,
-0.040924072265625,
-0.027008056640625,
-0.02947998046875,
-0.051910400390625,
-0.038848876953125,
-0.06915283203125,
-0.028350830078125,
-0.05126953125,
-0.0209197998046875,
0.08331298828125,
0.07244873046875,
-0.05029296875,
-0.0271759033203125,
0.0033092498779296875,
-0.009552001953125,
-0.00015938282012939453,
-0.013824462890625,
0.00850677490234375,
0.04840087890625,
-0.05712890625,
0.0066070556640625,
-0.0010156631469726562,
0.06488037109375,
-0.0036754608154296875,
0.00786590576171875,
-0.029876708984375,
0.01201629638671875,
0.0198211669921875,
0.0222015380859375,
-0.04998779296875,
-0.0219268798828125,
0.00336456298828125,
-0.0012769699096679688,
0.0173797607421875,
0.0170440673828125,
-0.034423828125,
0.027740478515625,
0.028564453125,
-0.01143646240234375,
0.031707763671875,
0.01232147216796875,
0.0289306640625,
-0.015716552734375,
0.0269775390625,
0.042755126953125,
0.0282440185546875,
0.0165557861328125,
-0.0245361328125,
0.03973388671875,
0.0267181396484375,
-0.03680419921875,
-0.049346923828125,
0.037567138671875,
-0.1204833984375,
0.005580902099609375,
0.022369384765625,
-0.00004607439041137695,
-0.0061492919921875,
0.01593017578125,
-0.0279998779296875,
0.0164947509765625,
-0.0151214599609375,
0.03643798828125,
0.04290771484375,
-0.0209808349609375,
-0.00487518310546875,
-0.035064697265625,
0.037109375,
0.0194549560546875,
-0.049346923828125,
-0.035064697265625,
0.02496337890625,
0.0250091552734375,
0.0157470703125,
0.045013427734375,
-0.0040283203125,
0.03375244140625,
0.018463134765625,
0.005275726318359375,
0.0195465087890625,
-0.0281829833984375,
-0.033905029296875,
0.00884246826171875,
0.011383056640625,
-0.040252685546875
]
] |
google/tapas-tiny-finetuned-wtq | 2021-11-29T10:45:11.000Z | [
"transformers",
"pytorch",
"tf",
"tapas",
"table-question-answering",
"en",
"dataset:wtq",
"arxiv:2004.02349",
"arxiv:2010.00571",
"arxiv:1508.00305",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | table-question-answering | google | null | null | google/tapas-tiny-finetuned-wtq | 0 | 14,929 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- tapas
- table-question-answering
license: apache-2.0
datasets:
- wtq
---
# TAPAS tiny model fine-tuned on WikiTable Questions (WTQ)
This model has 2 versions which can be used. The default version corresponds to the `tapas_wtq_wikisql_sqa_inter_masklm_tiny_reset` checkpoint of the [original Github repository](https://github.com/google-research/tapas).
This model was pre-trained on MLM and an additional step which the authors call intermediate pre-training, and then fine-tuned in a chain on [SQA](https://www.microsoft.com/en-us/download/details.aspx?id=54253), [WikiSQL](https://github.com/salesforce/WikiSQL) and finally [WTQ](https://github.com/ppasupat/WikiTableQuestions). It uses relative position embeddings (i.e. resetting the position index at every cell of the table).
The other (non-default) version which can be used is:
- `no_reset`, which corresponds to `tapas_wtq_wikisql_sqa_inter_masklm_tiny` (intermediate pre-training, absolute position embeddings).
Disclaimer: The team releasing TAPAS did not write a model card for this model so this model card has been written by
the Hugging Face team and contributors.
## Results
Size | Reset | Dev Accuracy | Link
-------- | --------| -------- | ----
LARGE | noreset | 0.5062 | [tapas-large-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-large-finetuned-wtq/tree/no_reset)
LARGE | reset | 0.5097 | [tapas-large-finetuned-wtq](https://huggingface.co/google/tapas-large-finetuned-wtq/tree/main)
BASE | noreset | 0.4525 | [tapas-base-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-base-finetuned-wtq/tree/no_reset)
BASE | reset | 0.4638 | [tapas-base-finetuned-wtq](https://huggingface.co/google/tapas-base-finetuned-wtq/tree/main)
MEDIUM | noreset | 0.4324 | [tapas-medium-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-medium-finetuned-wtq/tree/no_reset)
MEDIUM | reset | 0.4324 | [tapas-medium-finetuned-wtq](https://huggingface.co/google/tapas-medium-finetuned-wtq/tree/main)
SMALL | noreset | 0.3681 | [tapas-small-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-small-finetuned-wtq/tree/no_reset)
SMALL | reset | 0.3762 | [tapas-small-finetuned-wtq](https://huggingface.co/google/tapas-small-finetuned-wtq/tree/main)
MINI | noreset | 0.2783 | [tapas-mini-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-mini-finetuned-wtq/tree/no_reset)
MINI | reset | 0.2854 | [tapas-mini-finetuned-wtq](https://huggingface.co/google/tapas-mini-finetuned-wtq/tree/main)
**TINY** | **noreset** | **0.0823** | [tapas-tiny-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-tiny-finetuned-wtq/tree/no_reset)
**TINY** | **reset** | **0.1039** | [tapas-tiny-finetuned-wtq](https://huggingface.co/google/tapas-tiny-finetuned-wtq/tree/main)
## Model description
TAPAS is a BERT-like transformers model pretrained on a large corpus of English data from Wikipedia in a self-supervised fashion.
This means it was pretrained on the raw tables and associated texts only, with no humans labelling them in any way (which is why it
can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a (flattened) table and associated context, the model randomly masks 15% of the words in
the input, then runs the entire (partially masked) sequence through the model. The model then has to predict the masked words.
This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other,
or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional
representation of a table and associated text.
- Intermediate pre-training: to encourage numerical reasoning on tables, the authors additionally pre-trained the model by creating
a balanced dataset of millions of syntactically created training examples. Here, the model must predict (classify) whether a sentence
is supported or refuted by the contents of a table. The training examples are created based on synthetic as well as counterfactual statements.
This way, the model learns an inner representation of the English language used in tables and associated texts, which can then be used
to extract features useful for downstream tasks such as answering questions about a table, or determining whether a sentence is entailed
or refuted by the contents of a table. Fine-tuning is done by adding a cell selection head and aggregation head on top of the pre-trained model, and then jointly train these randomly initialized classification heads with the base model on SQa, WikiSQL and finally WTQ.
## Intended uses & limitations
You can use this model for answering questions related to a table.
For code examples, we refer to the documentation of TAPAS on the HuggingFace website.
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Question [SEP] Flattened table [SEP]
```
The authors did first convert the WTQ dataset into the format of SQA using automatic conversion scripts.
### Fine-tuning
The model was fine-tuned on 32 Cloud TPU v3 cores for 50,000 steps with maximum sequence length 512 and batch size of 512.
In this setup, fine-tuning takes around 10 hours. The optimizer used is Adam with a learning rate of 1.93581e-5, and a warmup
ratio of 0.128960. An inductive bias is added such that the model only selects cells of the same column. This is reflected by the
`select_one_column` parameter of `TapasConfig`. See the [paper](https://arxiv.org/abs/2004.02349) for more details (tables 11 and
12).
### BibTeX entry and citation info
```bibtex
@misc{herzig2020tapas,
title={TAPAS: Weakly Supervised Table Parsing via Pre-training},
author={Jonathan Herzig and Paweł Krzysztof Nowak and Thomas Müller and Francesco Piccinno and Julian Martin Eisenschlos},
year={2020},
eprint={2004.02349},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
```
```bibtex
@misc{eisenschlos2020understanding,
title={Understanding tables with intermediate pre-training},
author={Julian Martin Eisenschlos and Syrine Krichene and Thomas Müller},
year={2020},
eprint={2010.00571},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@article{DBLP:journals/corr/PasupatL15,
author = {Panupong Pasupat and
Percy Liang},
title = {Compositional Semantic Parsing on Semi-Structured Tables},
journal = {CoRR},
volume = {abs/1508.00305},
year = {2015},
url = {http://arxiv.org/abs/1508.00305},
archivePrefix = {arXiv},
eprint = {1508.00305},
timestamp = {Mon, 13 Aug 2018 16:47:37 +0200},
biburl = {https://dblp.org/rec/journals/corr/PasupatL15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 7,203 | [
[
-0.04705810546875,
-0.054595947265625,
0.019256591796875,
0.02099609375,
-0.032318115234375,
-0.009979248046875,
-0.00390625,
-0.04052734375,
0.049285888671875,
0.0240020751953125,
-0.04217529296875,
-0.02960205078125,
-0.03240966796875,
0.001026153564453125,
-0.0189056396484375,
0.08843994140625,
0.004413604736328125,
0.0031719207763671875,
-0.0090179443359375,
-0.0202484130859375,
-0.0264892578125,
-0.027557373046875,
-0.019256591796875,
-0.018951416015625,
0.038177490234375,
0.038604736328125,
0.05859375,
0.0489501953125,
0.0166778564453125,
0.0206298828125,
-0.0222930908203125,
0.00896453857421875,
-0.039276123046875,
-0.0172576904296875,
0.0029087066650390625,
-0.043182373046875,
-0.0308990478515625,
0.0188140869140625,
0.0277252197265625,
0.054107666015625,
0.00421905517578125,
0.036102294921875,
0.01390838623046875,
0.049102783203125,
-0.034820556640625,
0.00888824462890625,
-0.06512451171875,
0.01611328125,
-0.01971435546875,
-0.0044097900390625,
-0.0253448486328125,
-0.046356201171875,
0.02728271484375,
-0.038909912109375,
0.0164031982421875,
0.0021839141845703125,
0.10546875,
0.0079345703125,
-0.0181732177734375,
-0.0013227462768554688,
-0.03790283203125,
0.0386962890625,
-0.055328369140625,
0.0223541259765625,
0.040863037109375,
0.0113525390625,
-0.0174407958984375,
-0.06622314453125,
-0.051544189453125,
-0.01412200927734375,
-0.0225830078125,
0.000762939453125,
-0.00733184814453125,
-0.01195526123046875,
0.0277099609375,
0.04400634765625,
-0.037994384765625,
0.00335693359375,
-0.0374755859375,
-0.015228271484375,
0.041015625,
-0.0022792816162109375,
0.02203369140625,
-0.01338958740234375,
-0.0296783447265625,
-0.0084381103515625,
-0.061981201171875,
0.03021240234375,
0.010467529296875,
0.0272979736328125,
-0.035430908203125,
0.0472412109375,
-0.004985809326171875,
0.05889892578125,
0.0157012939453125,
-0.0200347900390625,
0.04083251953125,
-0.030731201171875,
-0.0256195068359375,
-0.0191192626953125,
0.07000732421875,
0.006923675537109375,
0.01922607421875,
-0.00470733642578125,
-0.011260986328125,
-0.003261566162109375,
0.01175689697265625,
-0.061492919921875,
-0.036712646484375,
0.013519287109375,
-0.037689208984375,
-0.0254974365234375,
0.0016202926635742188,
-0.048187255859375,
-0.0135498046875,
-0.0214996337890625,
0.04034423828125,
-0.025146484375,
-0.01233673095703125,
-0.020477294921875,
-0.003490447998046875,
0.034271240234375,
0.01262664794921875,
-0.0517578125,
0.02227783203125,
0.025482177734375,
0.056182861328125,
0.0028285980224609375,
-0.0217742919921875,
-0.012664794921875,
-0.0079803466796875,
-0.034912109375,
0.040069580078125,
-0.0253448486328125,
-0.0193634033203125,
-0.0137481689453125,
0.0205841064453125,
-0.007770538330078125,
-0.039093017578125,
0.04620361328125,
-0.03125,
0.0247039794921875,
-0.04443359375,
-0.0306549072265625,
-0.014373779296875,
0.016937255859375,
-0.0523681640625,
0.09136962890625,
0.00909423828125,
-0.060821533203125,
0.040069580078125,
-0.04180908203125,
-0.0233917236328125,
-0.01264190673828125,
0.0025196075439453125,
-0.058074951171875,
-0.005924224853515625,
0.016845703125,
0.037506103515625,
-0.01053619384765625,
0.01561737060546875,
-0.036834716796875,
-0.008453369140625,
0.01430511474609375,
0.003475189208984375,
0.06927490234375,
0.0205535888671875,
-0.0286865234375,
0.01233673095703125,
-0.05633544921875,
0.00864410400390625,
0.03179931640625,
-0.01232147216796875,
-0.016693115234375,
-0.0183258056640625,
-0.0021266937255859375,
0.03485107421875,
0.0164337158203125,
-0.0389404296875,
0.0204010009765625,
-0.03271484375,
0.04644775390625,
0.0343017578125,
0.004085540771484375,
0.0244140625,
-0.0182342529296875,
0.020660400390625,
0.01317596435546875,
0.021270751953125,
-0.00675201416015625,
-0.059814453125,
-0.06048583984375,
-0.016754150390625,
0.04443359375,
0.03839111328125,
-0.0447998046875,
0.050048828125,
-0.0177764892578125,
-0.05364990234375,
-0.04693603515625,
0.01332855224609375,
0.044281005859375,
0.05279541015625,
0.03363037109375,
-0.019989013671875,
-0.046173095703125,
-0.08428955078125,
0.003932952880859375,
0.003292083740234375,
0.0014858245849609375,
0.0295562744140625,
0.0450439453125,
-0.0018014907836914062,
0.08758544921875,
-0.045257568359375,
-0.0182647705078125,
-0.01126861572265625,
0.00704193115234375,
0.0251922607421875,
0.04913330078125,
0.055938720703125,
-0.060028076171875,
-0.0576171875,
-0.009307861328125,
-0.03118896484375,
-0.0015935897827148438,
0.01111602783203125,
-0.013702392578125,
0.00605010986328125,
0.015899658203125,
-0.06500244140625,
0.035003662109375,
0.0248565673828125,
-0.03594970703125,
0.03857421875,
-0.005367279052734375,
-0.005001068115234375,
-0.09844970703125,
0.021331787109375,
0.004215240478515625,
-0.0179290771484375,
-0.043365478515625,
-0.004489898681640625,
0.01316070556640625,
-0.014862060546875,
-0.035980224609375,
0.0330810546875,
-0.050506591796875,
-0.013671875,
-0.0005574226379394531,
0.0100860595703125,
0.00782012939453125,
0.0594482421875,
-0.013641357421875,
0.06884765625,
0.0255126953125,
-0.04510498046875,
0.0013494491577148438,
0.03265380859375,
-0.017242431640625,
0.02349853515625,
-0.059722900390625,
-0.004241943359375,
0.005615234375,
0.01654052734375,
-0.0782470703125,
-0.007701873779296875,
0.02142333984375,
-0.04248046875,
0.0394287109375,
-0.0085296630859375,
-0.019744873046875,
-0.0511474609375,
-0.04986572265625,
0.0122833251953125,
0.036468505859375,
-0.050384521484375,
0.0244293212890625,
0.04766845703125,
-0.0008797645568847656,
-0.04705810546875,
-0.050384521484375,
-0.00768280029296875,
-0.01324462890625,
-0.0439453125,
0.049041748046875,
0.00714111328125,
0.0090789794921875,
0.00775146484375,
-0.006572723388671875,
-0.0021076202392578125,
-0.00833892822265625,
0.018646240234375,
0.0170135498046875,
-0.01477813720703125,
0.0011997222900390625,
-0.00478363037109375,
0.008636474609375,
-0.00978851318359375,
-0.0068206787109375,
0.05377197265625,
-0.0286102294921875,
0.0013446807861328125,
-0.051788330078125,
0.0023746490478515625,
0.04052734375,
-0.0247039794921875,
0.062103271484375,
0.0509033203125,
-0.036956787109375,
0.0016613006591796875,
-0.049224853515625,
-0.01446533203125,
-0.03271484375,
0.01092529296875,
-0.045562744140625,
-0.0625,
0.0650634765625,
0.020263671875,
0.00424957275390625,
0.06640625,
0.040618896484375,
-0.0181884765625,
0.059814453125,
0.0428466796875,
-0.022125244140625,
0.035369873046875,
-0.042633056640625,
0.00563812255859375,
-0.054962158203125,
-0.0423583984375,
-0.060211181640625,
-0.032440185546875,
-0.061859130859375,
-0.030059814453125,
0.0153350830078125,
0.0179595947265625,
-0.0300445556640625,
0.038116455078125,
-0.045166015625,
0.0290679931640625,
0.0631103515625,
0.0165557861328125,
-0.00019347667694091797,
-0.0081787109375,
-0.0191192626953125,
-0.0027618408203125,
-0.043060302734375,
-0.0330810546875,
0.07598876953125,
0.04156494140625,
0.039642333984375,
-0.0101165771484375,
0.039215087890625,
0.01464080810546875,
0.0213165283203125,
-0.053375244140625,
0.03216552734375,
0.0005393028259277344,
-0.049530029296875,
-0.022369384765625,
-0.0218353271484375,
-0.07708740234375,
0.0223846435546875,
-0.0273590087890625,
-0.060943603515625,
0.022064208984375,
0.020751953125,
-0.025238037109375,
0.0286407470703125,
-0.0811767578125,
0.0849609375,
-0.00788116455078125,
-0.0198974609375,
0.016021728515625,
-0.062744140625,
0.0362548828125,
0.01190948486328125,
-0.00740814208984375,
-0.00905609130859375,
-0.0032958984375,
0.06927490234375,
-0.0626220703125,
0.06585693359375,
-0.034912109375,
0.008575439453125,
0.038116455078125,
-0.006664276123046875,
0.047515869140625,
-0.0014867782592773438,
0.0248870849609375,
0.003971099853515625,
0.020904541015625,
-0.0382080078125,
-0.0199127197265625,
0.054595947265625,
-0.06658935546875,
-0.02911376953125,
-0.027862548828125,
-0.02362060546875,
-0.039642333984375,
0.023529052734375,
0.02911376953125,
0.011810302734375,
-0.01152801513671875,
0.02740478515625,
0.057525634765625,
-0.01763916015625,
0.035736083984375,
0.0165557861328125,
-0.01149749755859375,
-0.0256500244140625,
0.0631103515625,
0.014129638671875,
-0.008819580078125,
0.02935791015625,
0.025054931640625,
-0.0308837890625,
-0.04547119140625,
-0.03131103515625,
0.0236663818359375,
-0.0142364501953125,
-0.040252685546875,
-0.054046630859375,
-0.0227203369140625,
-0.02734375,
0.01068878173828125,
-0.036590576171875,
-0.046142578125,
-0.03973388671875,
-0.0196380615234375,
0.04254150390625,
0.04498291015625,
0.0105743408203125,
0.0267791748046875,
-0.0550537109375,
0.02825927734375,
0.0297698974609375,
0.043060302734375,
-0.0010557174682617188,
-0.045654296875,
-0.0031490325927734375,
0.00443267822265625,
-0.0311431884765625,
-0.08734130859375,
0.0291595458984375,
0.0233001708984375,
0.042236328125,
0.03729248046875,
-0.004268646240234375,
0.055999755859375,
-0.032012939453125,
0.05975341796875,
0.0170745849609375,
-0.06439208984375,
0.04913330078125,
-0.0180511474609375,
0.01053619384765625,
0.060791015625,
0.041168212890625,
-0.021270751953125,
-0.01177978515625,
-0.049285888671875,
-0.060394287109375,
0.058074951171875,
0.006526947021484375,
0.0124053955078125,
0.0191192626953125,
0.0218048095703125,
0.01470947265625,
0.00934600830078125,
-0.07647705078125,
-0.027191162109375,
-0.016845703125,
-0.006496429443359375,
0.002651214599609375,
-0.041015625,
-0.008209228515625,
-0.057220458984375,
0.061187744140625,
0.002716064453125,
0.021942138671875,
0.007640838623046875,
-0.00710296630859375,
0.004360198974609375,
0.00506591796875,
0.051971435546875,
0.051971435546875,
-0.030731201171875,
-0.01422119140625,
0.00897979736328125,
-0.049468994140625,
-0.0148468017578125,
0.0203399658203125,
-0.0262451171875,
0.0078887939453125,
0.0086822509765625,
0.07501220703125,
0.0207672119140625,
-0.0310821533203125,
0.0472412109375,
0.004192352294921875,
-0.0203399658203125,
-0.01434326171875,
-0.014251708984375,
0.0186920166015625,
0.004547119140625,
0.031768798828125,
-0.01071929931640625,
0.007511138916015625,
-0.037384033203125,
0.0252532958984375,
0.041778564453125,
-0.0218658447265625,
-0.03179931640625,
0.041107177734375,
0.00821685791015625,
-0.01206207275390625,
0.026824951171875,
-0.02032470703125,
-0.03973388671875,
0.04150390625,
0.03973388671875,
0.05584716796875,
-0.0222625732421875,
0.023101806640625,
0.0413818359375,
0.03466796875,
0.0203399658203125,
0.037353515625,
-0.007232666015625,
-0.04498291015625,
-0.006694793701171875,
-0.05889892578125,
-0.018707275390625,
0.0374755859375,
-0.04962158203125,
0.0173797607421875,
-0.03619384765625,
-0.021270751953125,
0.01361846923828125,
0.02020263671875,
-0.056060791015625,
0.01690673828125,
-0.0063629150390625,
0.08551025390625,
-0.07879638671875,
0.06842041015625,
0.050689697265625,
-0.04510498046875,
-0.08367919921875,
-0.01509857177734375,
-0.018157958984375,
-0.064697265625,
0.01462554931640625,
0.008148193359375,
0.0221099853515625,
-0.02203369140625,
-0.053314208984375,
-0.07568359375,
0.08013916015625,
0.014373779296875,
-0.025543212890625,
-0.001987457275390625,
0.00777435302734375,
0.0474853515625,
-0.01561737060546875,
0.012298583984375,
0.052520751953125,
0.01910400390625,
0.0183258056640625,
-0.0799560546875,
0.013427734375,
-0.0260009765625,
0.0130462646484375,
0.0112762451171875,
-0.050506591796875,
0.060943603515625,
-0.0045928955078125,
0.000885009765625,
-0.005275726318359375,
0.0511474609375,
0.012725830078125,
0.0142059326171875,
0.0260162353515625,
0.069580078125,
0.056854248046875,
-0.0241851806640625,
0.0672607421875,
-0.01506805419921875,
0.02850341796875,
0.08917236328125,
0.0096893310546875,
0.069091796875,
0.049591064453125,
-0.046356201171875,
0.02532958984375,
0.07891845703125,
-0.00583648681640625,
0.042755126953125,
0.00989532470703125,
0.0150604248046875,
0.016937255859375,
-0.001033782958984375,
-0.059051513671875,
0.039581298828125,
0.030059814453125,
-0.0433349609375,
-0.00714874267578125,
-0.00040340423583984375,
0.01523590087890625,
-0.004795074462890625,
-0.029022216796875,
0.0550537109375,
0.007648468017578125,
-0.042755126953125,
0.058258056640625,
-0.00016689300537109375,
0.0577392578125,
-0.035919189453125,
0.01334381103515625,
-0.0284881591796875,
-0.00643157958984375,
-0.0230865478515625,
-0.0552978515625,
0.0205841064453125,
-0.0024318695068359375,
-0.010040283203125,
-0.0008640289306640625,
0.051605224609375,
-0.031890869140625,
-0.0305633544921875,
0.0012063980102539062,
0.045745849609375,
0.0100555419921875,
0.005214691162109375,
-0.06585693359375,
0.0014810562133789062,
0.002262115478515625,
-0.041656494140625,
0.0207672119140625,
0.0269622802734375,
0.000621795654296875,
0.041717529296875,
0.04254150390625,
-0.005916595458984375,
0.01995849609375,
-0.0072174072265625,
0.082275390625,
-0.046295166015625,
-0.05352783203125,
-0.0498046875,
0.03363037109375,
-0.016326904296875,
-0.041107177734375,
0.049285888671875,
0.053314208984375,
0.070068359375,
-0.024566650390625,
0.0236663818359375,
-0.0185089111328125,
0.05010986328125,
-0.031402587890625,
0.060882568359375,
-0.048736572265625,
-0.006908416748046875,
-0.032501220703125,
-0.0772705078125,
-0.01334381103515625,
0.03582763671875,
-0.0217742919921875,
0.0142822265625,
0.033172607421875,
0.050262451171875,
-0.01213836669921875,
0.0021533966064453125,
0.00390625,
0.0266265869140625,
0.002819061279296875,
0.0285491943359375,
0.03802490234375,
-0.0322265625,
0.037078857421875,
-0.0465087890625,
-0.0166015625,
-0.0179595947265625,
-0.054901123046875,
-0.05938720703125,
-0.06097412109375,
-0.014556884765625,
-0.0262298583984375,
-0.017730712890625,
0.069091796875,
0.066162109375,
-0.0845947265625,
-0.009613037109375,
0.00969696044921875,
0.006114959716796875,
-0.0003809928894042969,
-0.0220184326171875,
0.0447998046875,
-0.0141448974609375,
-0.04388427734375,
0.012786865234375,
0.00634765625,
0.0176544189453125,
-0.007297515869140625,
0.0003421306610107422,
-0.037628173828125,
-0.00931549072265625,
0.052734375,
0.052276611328125,
-0.05145263671875,
-0.00914764404296875,
-0.00714111328125,
-0.0161590576171875,
0.0181427001953125,
0.04351806640625,
-0.0511474609375,
0.0173492431640625,
0.03338623046875,
0.05712890625,
0.06829833984375,
0.0008940696716308594,
0.01488494873046875,
-0.061798095703125,
0.0251922607421875,
0.029327392578125,
0.035919189453125,
0.0157623291015625,
-0.021514892578125,
0.03955078125,
0.0222930908203125,
-0.022979736328125,
-0.07415771484375,
-0.006031036376953125,
-0.0928955078125,
-0.0102081298828125,
0.059600830078125,
-0.006412506103515625,
-0.029876708984375,
0.007656097412109375,
-0.0183868408203125,
0.0233154296875,
-0.0187530517578125,
0.05316162109375,
0.041259765625,
-0.0109710693359375,
-0.0124664306640625,
-0.028411865234375,
0.061981201171875,
0.03143310546875,
-0.06396484375,
-0.0234832763671875,
0.0283203125,
0.031097412109375,
0.0217742919921875,
0.0540771484375,
-0.0245361328125,
0.0105133056640625,
-0.00969696044921875,
0.004482269287109375,
-0.0090789794921875,
-0.0193634033203125,
-0.0165557861328125,
0.0142822265625,
0.003681182861328125,
-0.044586181640625
]
] |
sschet/biobert_diseases_ner | 2023-02-01T03:40:32.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"NER",
"Biomedical",
"Diseases",
"en",
"dataset:BC5CDR-diseases",
"dataset:ncbi_disease",
"dataset:tner/bc5cdr",
"dataset:commanderstrife/jnlpba",
"dataset:bc2gm_corpus",
"dataset:drAbreu/bc4chemd_ner",
"dataset:linnaeus",
"dataset:chintagunta85/ncbi_disease",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | sschet | null | null | sschet/biobert_diseases_ner | 1 | 14,891 | transformers | 2023-02-01T00:59:17 | ---
language: "en"
license: apache-2.0
tags:
- token-classification
- NER
- Biomedical
- Diseases
datasets:
- BC5CDR-diseases
- ncbi_disease
- tner/bc5cdr
- commanderstrife/jnlpba
- bc2gm_corpus
- drAbreu/bc4chemd_ner
- linnaeus
- chintagunta85/ncbi_disease
---
BioBERT model fine-tuned in NER task with BC5CDR-diseases and NCBI-diseases corpus
This was fine-tuned in order to use it in a BioNER/BioNEN system which is available at: https://github.com/librairy/bio-ner | 469 | [
[
-0.02239990234375,
-0.038330078125,
0.03826904296875,
0.0029392242431640625,
-0.00881195068359375,
0.0157012939453125,
0.0160064697265625,
-0.04949951171875,
0.037994384765625,
0.041748046875,
-0.039306640625,
-0.06121826171875,
-0.0290985107421875,
0.036163330078125,
-0.0256805419921875,
0.09613037109375,
0.033660888671875,
0.046234130859375,
-0.003093719482421875,
-0.0204315185546875,
-0.0173797607421875,
-0.035858154296875,
-0.046600341796875,
-0.06329345703125,
0.0628662109375,
0.041259765625,
0.05401611328125,
0.00859832763671875,
0.07708740234375,
0.0152130126953125,
-0.039947509765625,
-0.01467132568359375,
-0.0309600830078125,
0.01018524169921875,
-0.05047607421875,
0.00925445556640625,
-0.07855224609375,
-0.044708251953125,
0.0101318359375,
0.057861328125,
0.0017576217651367188,
0.03533935546875,
-0.004425048828125,
0.042266845703125,
-0.043792724609375,
-0.00568389892578125,
0.0079345703125,
-0.00896453857421875,
-0.005615234375,
-0.0037746429443359375,
-0.013275146484375,
-0.030426025390625,
-0.00262451171875,
-0.03533935546875,
0.03985595703125,
-0.0101318359375,
0.064453125,
0.034637451171875,
-0.034271240234375,
-0.0097198486328125,
-0.05218505859375,
0.0286712646484375,
-0.0219573974609375,
0.02783203125,
0.01271820068359375,
0.02215576171875,
-0.0179901123046875,
-0.06683349609375,
-0.0211334228515625,
-0.0037288665771484375,
-0.01119232177734375,
0.01297760009765625,
-0.0262908935546875,
0.016754150390625,
0.03448486328125,
-0.011505126953125,
-0.0498046875,
-0.005237579345703125,
-0.07769775390625,
-0.036407470703125,
0.033843994140625,
0.04290771484375,
0.01611328125,
-0.023895263671875,
-0.0090484619140625,
0.0102386474609375,
-0.033355712890625,
-0.004039764404296875,
0.0030059814453125,
0.0110321044921875,
-0.0037670135498046875,
0.035675048828125,
-0.038726806640625,
0.08837890625,
0.03192138671875,
0.029449462890625,
0.0285186767578125,
-0.044403076171875,
-0.0096588134765625,
0.021240234375,
0.0303497314453125,
0.0323486328125,
0.016387939453125,
-0.0101776123046875,
-0.01251983642578125,
0.01432037353515625,
0.033843994140625,
-0.0849609375,
-0.07733154296875,
0.023193359375,
-0.041290283203125,
-0.0034942626953125,
-0.0214385986328125,
-0.035736083984375,
-0.006000518798828125,
-0.0191650390625,
0.0227813720703125,
-0.046295166015625,
0.000010728836059570312,
-0.003208160400390625,
-0.02734375,
-0.0008716583251953125,
0.0036468505859375,
-0.063232421875,
0.027008056640625,
0.0210113525390625,
0.05718994140625,
0.00022077560424804688,
0.0219879150390625,
-0.044586181640625,
0.015045166015625,
-0.0020313262939453125,
0.049652099609375,
-0.0308837890625,
-0.015380859375,
-0.032073974609375,
0.01253509521484375,
0.00850677490234375,
-0.032257080078125,
0.04119873046875,
-0.056793212890625,
-0.038848876953125,
-0.0224761962890625,
-0.0212860107421875,
-0.0193328857421875,
0.0019702911376953125,
-0.0682373046875,
0.040069580078125,
0.01393890380859375,
-0.0582275390625,
0.046783447265625,
-0.051788330078125,
-0.02728271484375,
0.0302581787109375,
0.0280609130859375,
-0.04290771484375,
-0.0024547576904296875,
-0.0031585693359375,
0.03717041015625,
0.00438690185546875,
0.0294189453125,
-0.041717529296875,
-0.031524658203125,
0.016357421875,
0.0278167724609375,
0.08013916015625,
0.0272674560546875,
-0.0030422210693359375,
0.050750732421875,
-0.08172607421875,
0.0227813720703125,
-0.0300445556640625,
-0.0193634033203125,
-0.030670166015625,
-0.0003247261047363281,
0.038665771484375,
0.001560211181640625,
0.0447998046875,
-0.059173583984375,
0.007049560546875,
-0.036773681640625,
0.042938232421875,
-0.00727081298828125,
0.007610321044921875,
0.0155181884765625,
-0.0277862548828125,
0.0171356201171875,
-0.00559234619140625,
0.0350341796875,
0.040069580078125,
-0.014984130859375,
-0.035736083984375,
-0.0364990234375,
0.054779052734375,
0.050537109375,
-0.0171356201171875,
0.039581298828125,
-0.006412506103515625,
-0.0555419921875,
-0.029815673828125,
-0.0004940032958984375,
0.01971435546875,
-0.003978729248046875,
0.045135498046875,
-0.04437255859375,
-0.017333984375,
-0.0673828125,
0.03814697265625,
0.032012939453125,
0.0073699951171875,
0.0005621910095214844,
0.0587158203125,
-0.0401611328125,
0.054962158203125,
-0.03350830078125,
-0.0132904052734375,
-0.0135040283203125,
0.0169830322265625,
0.04949951171875,
0.05267333984375,
0.0419921875,
-0.056976318359375,
-0.03131103515625,
0.00182342529296875,
-0.0280609130859375,
-0.02105712890625,
0.0265655517578125,
-0.00910186767578125,
-0.0024566650390625,
0.06292724609375,
-0.03192138671875,
0.050537109375,
0.037445068359375,
-0.036163330078125,
0.018829345703125,
-0.04827880859375,
-0.0172271728515625,
-0.07830810546875,
0.046783447265625,
-0.014312744140625,
-0.03436279296875,
-0.04473876953125,
0.006702423095703125,
0.03424072265625,
0.012969970703125,
-0.00957489013671875,
0.003551483154296875,
-0.04498291015625,
0.0082855224609375,
-0.03509521484375,
-0.01349639892578125,
-0.01325225830078125,
0.006328582763671875,
-0.00016069412231445312,
0.0283050537109375,
0.04541015625,
-0.0606689453125,
0.03948974609375,
0.034271240234375,
-0.023529052734375,
0.0323486328125,
-0.08917236328125,
0.0119781494140625,
-0.0119781494140625,
0.005161285400390625,
-0.038665771484375,
-0.0220794677734375,
0.0196685791015625,
-0.0322265625,
0.013671875,
0.0015516281127929688,
-0.0421142578125,
-0.0090484619140625,
0.00908660888671875,
0.043304443359375,
0.0121612548828125,
-0.0267791748046875,
0.03216552734375,
0.0166473388671875,
-0.006000518798828125,
-0.0299072265625,
-0.034881591796875,
-0.008087158203125,
-0.005321502685546875,
-0.024810791015625,
0.0576171875,
-0.02392578125,
-0.005886077880859375,
0.0113525390625,
0.012054443359375,
-0.0178680419921875,
-0.006381988525390625,
0.037506103515625,
0.03179931640625,
-0.00618743896484375,
0.0084381103515625,
0.0286865234375,
-0.01617431640625,
-0.0037555694580078125,
0.0140533447265625,
0.03662109375,
0.0032749176025390625,
-0.011962890625,
-0.01064300537109375,
0.020538330078125,
0.050018310546875,
-0.0258636474609375,
0.048583984375,
0.059051513671875,
-0.029296875,
0.014404296875,
-0.0178375244140625,
-0.035675048828125,
-0.033050537109375,
0.0233154296875,
-0.02239990234375,
-0.0699462890625,
0.043701171875,
-0.007480621337890625,
-0.026336669921875,
0.032684326171875,
0.0482177734375,
0.0152587890625,
0.06982421875,
0.060638427734375,
0.0036220550537109375,
0.015167236328125,
-0.0190582275390625,
0.017364501953125,
-0.077880859375,
-0.0214691162109375,
-0.040130615234375,
-0.03546142578125,
-0.0670166015625,
0.0224609375,
0.042022705078125,
0.0164337158203125,
-0.048736572265625,
0.046356201171875,
-0.0645751953125,
0.0221710205078125,
0.0562744140625,
0.03826904296875,
0.003337860107421875,
-0.03179931640625,
-0.0443115234375,
0.006420135498046875,
-0.0733642578125,
-0.022674560546875,
0.0662841796875,
-0.01157379150390625,
0.041534423828125,
0.022003173828125,
0.05615234375,
0.01398468017578125,
0.00527191162109375,
-0.0146484375,
0.02056884765625,
-0.02801513671875,
-0.0880126953125,
-0.01410675048828125,
0.00787353515625,
-0.0968017578125,
-0.0258331298828125,
-0.043792724609375,
-0.07623291015625,
0.057159423828125,
0.03173828125,
-0.07763671875,
-0.0240478515625,
-0.030426025390625,
0.0843505859375,
-0.007476806640625,
-0.042266845703125,
-0.040863037109375,
-0.08331298828125,
0.00894927978515625,
0.005405426025390625,
-0.003688812255859375,
-0.00363922119140625,
0.021820068359375,
0.04180908203125,
-0.0157623291015625,
0.0670166015625,
0.0162200927734375,
0.042633056640625,
-0.00722503662109375,
0.0084228515625,
-0.0035495758056640625,
0.0159454345703125,
-0.006557464599609375,
0.0078125,
0.040313720703125,
-0.0279083251953125,
0.0007834434509277344,
0.0447998046875,
-0.04931640625,
-0.00823211669921875,
-0.0660400390625,
-0.0455322265625,
0.0272369384765625,
0.03131103515625,
0.02752685546875,
0.056854248046875,
-0.0450439453125,
0.015655517578125,
0.0460205078125,
-0.020263671875,
0.0266265869140625,
0.040985107421875,
-0.0220184326171875,
-0.037445068359375,
0.05987548828125,
-0.0128631591796875,
0.0217437744140625,
0.038055419921875,
-0.0198516845703125,
0.00952911376953125,
-0.039825439453125,
-0.00576019287109375,
0.00707244873046875,
-0.04046630859375,
-0.03546142578125,
-0.04730224609375,
-0.038818359375,
-0.040374755859375,
-0.00908660888671875,
0.0102996826171875,
-0.006252288818359375,
-0.057647705078125,
-0.0066070556640625,
0.06744384765625,
0.052978515625,
-0.039703369140625,
0.02587890625,
-0.05999755859375,
0.040191650390625,
-0.003627777099609375,
0.0162200927734375,
-0.035736083984375,
-0.0260467529296875,
-0.026123046875,
0.0030040740966796875,
-0.034637451171875,
-0.069580078125,
0.054931640625,
0.00347137451171875,
0.0170135498046875,
0.011199951171875,
0.00028395652770996094,
0.01102447509765625,
-0.0232391357421875,
0.0142364501953125,
0.00919342041015625,
-0.0621337890625,
0.031494140625,
-0.03277587890625,
0.053192138671875,
0.05047607421875,
0.042266845703125,
-0.00937652587890625,
-0.01519775390625,
-0.04248046875,
-0.08074951171875,
0.033111572265625,
0.00894927978515625,
-0.01236724853515625,
-0.007282257080078125,
0.0203399658203125,
0.00666046142578125,
-0.0036296844482421875,
-0.03363037109375,
-0.00576019287109375,
-0.014129638671875,
-0.0106964111328125,
-0.01311492919921875,
-0.0247039794921875,
-0.0009746551513671875,
-0.054107666015625,
0.029052734375,
-0.0117950439453125,
0.0163726806640625,
0.03570556640625,
-0.0244903564453125,
-0.006565093994140625,
0.0193634033203125,
0.05316162109375,
0.0552978515625,
-0.045135498046875,
-0.0099945068359375,
0.018646240234375,
-0.038665771484375,
-0.011962890625,
0.0232086181640625,
-0.032012939453125,
-0.00400543212890625,
0.00728607177734375,
0.05712890625,
0.0278778076171875,
-0.040252685546875,
0.01129913330078125,
0.01160430908203125,
-0.0229034423828125,
-0.037628173828125,
0.033050537109375,
0.03936767578125,
0.049896240234375,
-0.0158538818359375,
0.024139404296875,
0.0498046875,
-0.022796630859375,
0.0177001953125,
0.0264434814453125,
-0.05889892578125,
-0.0234527587890625,
0.07489013671875,
0.023040771484375,
-0.0262603759765625,
0.07330322265625,
-0.02880859375,
-0.040802001953125,
0.049713134765625,
0.041046142578125,
0.03070068359375,
-0.015869140625,
0.0154571533203125,
0.051666259765625,
0.0171661376953125,
0.0050048828125,
0.055908203125,
0.0108795166015625,
-0.059661865234375,
-0.0269317626953125,
-0.047760009765625,
-0.0333251953125,
-0.006458282470703125,
-0.061553955078125,
0.04620361328125,
-0.0308837890625,
-0.04791259765625,
0.0504150390625,
-0.0160980224609375,
-0.05023193359375,
0.036102294921875,
-0.0008940696716308594,
0.053863525390625,
-0.04180908203125,
0.074951171875,
0.062286376953125,
-0.034027099609375,
-0.034088134765625,
-0.0276641845703125,
-0.025787353515625,
-0.01538848876953125,
0.0811767578125,
-0.0146636962890625,
-0.01007843017578125,
0.0155029296875,
-0.0318603515625,
-0.08782958984375,
0.0887451171875,
0.029083251953125,
-0.065673828125,
0.019775390625,
-0.03448486328125,
0.057403564453125,
-0.0088653564453125,
0.0294342041015625,
0.0205535888671875,
0.039703369140625,
-0.0080108642578125,
-0.084228515625,
0.006000518798828125,
-0.0228271484375,
-0.0197906494140625,
0.038665771484375,
-0.045562744140625,
0.058502197265625,
-0.039642333984375,
0.0242462158203125,
0.0128631591796875,
0.03607177734375,
0.0167388916015625,
0.04132080078125,
0.0099334716796875,
0.0252685546875,
0.03656005859375,
0.00830078125,
0.0694580078125,
-0.0570068359375,
0.0210418701171875,
0.063720703125,
-0.024322509765625,
0.0411376953125,
0.01203155517578125,
0.0135955810546875,
0.0364990234375,
0.04180908203125,
-0.0244293212890625,
0.04986572265625,
0.03680419921875,
-0.0219573974609375,
-0.023284912109375,
-0.0013666152954101562,
-0.01019287109375,
0.0301361083984375,
0.054931640625,
-0.0197906494140625,
0.00341033935546875,
-0.01486968994140625,
-0.0253753662109375,
-0.0117645263671875,
-0.0251312255859375,
0.043792724609375,
0.006694793701171875,
-0.02685546875,
0.0224609375,
0.0035266876220703125,
0.035797119140625,
-0.01416778564453125,
-0.0059814453125,
0.02618408203125,
0.0208282470703125,
-0.0208740234375,
-0.034454345703125,
0.0443115234375,
-0.0214385986328125,
-0.0321044921875,
0.0158233642578125,
0.08453369140625,
-0.0248565673828125,
-0.05316162109375,
0.050140380859375,
0.02557373046875,
0.0108795166015625,
0.0215606689453125,
-0.02349853515625,
-0.007843017578125,
-0.0229034423828125,
0.0216064453125,
0.0015163421630859375,
0.01470947265625,
-0.015380859375,
0.048065185546875,
0.030853271484375,
0.009124755859375,
-0.0234527587890625,
0.005504608154296875,
0.037872314453125,
-0.07012939453125,
-0.0063018798828125,
-0.0552978515625,
0.01885986328125,
-0.0119476318359375,
-0.037322998046875,
0.01116180419921875,
0.05877685546875,
0.03668212890625,
-0.023956298828125,
0.03533935546875,
-0.02191162109375,
0.046295166015625,
-0.0187225341796875,
0.0765380859375,
-0.02783203125,
-0.0335693359375,
-0.02685546875,
-0.061614990234375,
-0.042877197265625,
0.041412353515625,
-0.006439208984375,
0.004940032958984375,
0.07904052734375,
0.0732421875,
-0.0295257568359375,
-0.0082550048828125,
0.022430419921875,
0.0172119140625,
-0.022064208984375,
0.035858154296875,
0.01142120361328125,
0.0008816719055175781,
0.00998687744140625,
-0.0247344970703125,
-0.0086669921875,
-0.039947509765625,
-0.05413818359375,
-0.06842041015625,
-0.05084228515625,
-0.041717529296875,
-0.0229644775390625,
0.0552978515625,
0.061431884765625,
0.045074462890625,
-0.0863037109375,
-0.011993408203125,
-0.01224517822265625,
-0.003940582275390625,
-0.018951416015625,
-0.0055694580078125,
0.0186309814453125,
0.01462554931640625,
-0.0284881591796875,
0.00443267822265625,
0.0011644363403320312,
-0.007717132568359375,
0.01067352294921875,
0.017791748046875,
-0.0283660888671875,
-0.01178741455078125,
0.0277099609375,
0.0155181884765625,
-0.0269927978515625,
-0.0328369140625,
0.0266876220703125,
-0.0210418701171875,
-0.0231781005859375,
0.047607421875,
-0.05169677734375,
0.0076904296875,
0.049041748046875,
0.00817108154296875,
0.0516357421875,
-0.0167999267578125,
0.059051513671875,
0.000015020370483398438,
-0.02374267578125,
0.011810302734375,
0.0214080810546875,
0.028778076171875,
-0.034271240234375,
0.031646728515625,
-0.0013065338134765625,
-0.06414794921875,
-0.022735595703125,
-0.00554656982421875,
-0.1112060546875,
-0.01416778564453125,
0.05084228515625,
0.03271484375,
-0.00803375244140625,
-0.0271759033203125,
-0.01462554931640625,
0.052581787109375,
-0.034698486328125,
0.053619384765625,
0.050628662109375,
-0.022186279296875,
0.00351715087890625,
-0.053131103515625,
0.0211944580078125,
0.0269317626953125,
-0.0377197265625,
-0.033599853515625,
0.038330078125,
0.03546142578125,
0.01325225830078125,
0.0294342041015625,
0.00982666015625,
0.04864501953125,
-0.025726318359375,
0.050811767578125,
0.0177001953125,
-0.0313720703125,
-0.005573272705078125,
0.00160980224609375,
0.00031065940856933594,
-0.0033740997314453125
]
] |
BAAI/bge-large-zh-v1.5 | 2023-10-12T03:34:16.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"feature-extraction",
"zh",
"arxiv:2310.07554",
"arxiv:2309.07597",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | feature-extraction | BAAI | null | null | BAAI/bge-large-zh-v1.5 | 73 | 14,840 | transformers | 2023-09-12T05:22:11 | ---
license: mit
language:
- zh
---
<h1 align="center">FlagEmbedding</h1>
<h4 align="center">
<p>
<a href=#model-list>Model List</a> |
<a href=#frequently-asked-questions>FAQ</a> |
<a href=#usage>Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
<p>
</h4>
More details please refer to our Github: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
FlagEmbedding can map any text to a low-dimensional dense vector which can be used for tasks like retrieval, classification, clustering, or semantic search.
And it also can be used in vector databases for LLMs.
************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [masive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
- **New reranker model**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than embedding model. We recommend to use/fine-tune them to re-rank top-k documents returned by embedding models.
- **update embedding model**: release `bge-*-v1.5` embedding model to alleviate the issue of the similarity distribution, and enhance its retrieval ability without instruction.
<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): Add script to mine hard negatives and support adding instruction during fine-tuning.
- 08/09/2023: BGE Models are integrated into **Langchain**, you can use it like [this](#using-langchain); C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*`(short for BAAI General Embedding) Models, **rank 1st on MTEB and C-MTEB benchmark!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test dataset.
</details>
## Model List
`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |
[1\]: If you need to search the relevant passages to a query, we suggest to add the instruction to the query; in other cases, no instruction is needed, just use the original query directly. In all cases, **no instruction** needs to be added to passages.
[2\]: Different from embedding model, reranker uses question and document as input and directly output similarity instead of embedding. To balance the accuracy and time cost, cross-encoder is widely used to re-rank top-k documents retrieved by other simple models.
For examples, use bge embedding model to retrieve top 100 relevant documents, and then use bge reranker to re-rank the top 100 document to get the final top-3 results.
All models have been uploaded to Huggingface Hub, and you can see them at https://huggingface.co/BAAI.
If you cannot open the Huggingface Hub, you also can download the models at https://model.baai.ac.cn/models .
## Frequently asked questions
<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->
Following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be directly used to calculate similarity, and it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank top-k results. Hard negatives also are needed to fine-tune reranker.
</details>
<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**Suggest to use bge v1.5, which alleviates the issue of the similarity distribution.**
Since we finetune the models by contrastive learning with a temperature of 0.01,
the similarity distribution of the current BGE model is about in the interval \[0.6, 1\].
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity,
**what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold,
please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).
</details>
<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->
For the `bge-*-v1.5`, we improve its retrieval ability when not using instruction.
No instruction only has a slight degradation in retrieval performance compared with using instruction.
So you can generate embedding without instruction in all cases for convenience.
For a retrieval task that uses short queries to find long related documents,
it is recommended to add instructions for these short queries.
**The best method to decide whether to add instructions for queries is choosing the setting that achieves better performance on your task.**
In all cases, the documents/passages do not need to add the instruction.
</details>
## Usage
### Usage for Embedding Model
Here are some examples for using `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If it doesn't work for you, you can see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more methods to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# for s2p(short query to long passage) retrieval task, suggest to use encode_queries() which will automatically add the instruction to each query
# corpus in retrieval task can still use encode() or encode_corpus(), since they don't need instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).
By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You also can set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
#### Using Sentence-Transformers
You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For s2p(short query to long passage) retrieval task,
each short query should start with an instruction (instructions see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list)).
But the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```
#### Using Langchain
You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs,
query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```
#### Using HuggingFace Transformers
With the transformers package, you can use the model like this: First, you pass your input through the transformer model, then you select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for s2p(short query to long passage) retrieval task, add an instruction to query (not add instruction for passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```
### Usage for Reranker
Different from embedding model, reranker uses question and document as input and directly output similarity instead of embedding.
You can get a relevance score by inputting query and passage to the reranker.
The reranker is optimized based cross-entropy loss, so the relevance score is not bounded to a specific range.
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
score = reranker.compute_score(['query', 'passage'])
print(score)
scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```
#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```
## Evaluation
`baai-general-embedding` models achieve **state-of-the-art performance on both MTEB and C-MTEB leaderboard!**
For more details and evaluation tools see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:
| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) |Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 |51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024| 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |
- **C-MTEB**:
We create the benchmark C-MTEB for Chinese text embedding which consists of 31 datasets from 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |
- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |
\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks
## Train
### BAAI Embedding
We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale pairs data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text, and the pre-trained model cannot be used for similarity calculation directly, it needs to be fine-tuned.
More training details for bge see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).
### BGE Reranker
Cross-encoder will perform full-attention over the input pair,
which is more accurate than embedding model (i.e., bi-encoder) but more time-consuming than embedding model.
Therefore, it can be used to re-rank the top-k documents returned by embedding model.
We train the cross-encoder on a multilingual pair data,
The data format is the same as embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
More details please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)
## Contact
If you have any question or suggestion related to this project, feel free to open an issue or pull request.
You also can email Shitao Xiao(stxiao@baai.ac.cn) and Zheng Liu(liuzheng@baai.ac.cn).
## Citation
If you find this repository useful, please consider giving a star :star: and citation
```
@misc{bge_embedding,
title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
year={2023},
eprint={2309.07597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
| 27,165 | [
[
-0.036651611328125,
-0.06793212890625,
0.029296875,
0.0118865966796875,
-0.02716064453125,
-0.02056884765625,
-0.0237579345703125,
-0.0189666748046875,
0.0300445556640625,
0.0283966064453125,
-0.0260467529296875,
-0.06549072265625,
-0.036102294921875,
-0.004604339599609375,
-0.006374359130859375,
0.041473388671875,
-0.0037555694580078125,
0.010711669921875,
0.003925323486328125,
-0.0186767578125,
-0.0283966064453125,
-0.01904296875,
-0.049224853515625,
-0.0192108154296875,
0.025726318359375,
0.017059326171875,
0.042205810546875,
0.056365966796875,
0.0222930908203125,
0.02020263671875,
-0.0175933837890625,
0.01165771484375,
-0.0357666015625,
-0.005153656005859375,
-0.01552581787109375,
-0.024871826171875,
-0.03143310546875,
0.01317596435546875,
0.050994873046875,
0.0335693359375,
-0.007740020751953125,
0.00778961181640625,
0.0004954338073730469,
0.053192138671875,
-0.034515380859375,
0.020751953125,
-0.042724609375,
0.0027294158935546875,
-0.0180511474609375,
0.01128387451171875,
-0.03863525390625,
-0.0287017822265625,
0.0119476318359375,
-0.0458984375,
0.00617218017578125,
0.0216064453125,
0.097412109375,
0.01541900634765625,
-0.033935546875,
-0.01239013671875,
-0.009063720703125,
0.0740966796875,
-0.07525634765625,
0.051177978515625,
0.037933349609375,
0.0190277099609375,
-0.005809783935546875,
-0.0609130859375,
-0.0269622802734375,
-0.0124053955078125,
-0.0151519775390625,
0.03155517578125,
0.001903533935546875,
0.0014982223510742188,
0.0236358642578125,
0.044586181640625,
-0.041412353515625,
0.007049560546875,
-0.00511932373046875,
-0.011871337890625,
0.0570068359375,
-0.01230621337890625,
0.034210205078125,
-0.041595458984375,
-0.0223541259765625,
-0.027435302734375,
-0.0596923828125,
0.0034122467041015625,
0.0273284912109375,
0.0102386474609375,
-0.02935791015625,
0.0423583984375,
-0.017120361328125,
0.045501708984375,
0.003787994384765625,
0.00383758544921875,
0.03924560546875,
-0.0278472900390625,
-0.0154876708984375,
-0.0109405517578125,
0.069580078125,
0.029449462890625,
-0.004314422607421875,
0.003849029541015625,
-0.024139404296875,
-0.007076263427734375,
-0.006961822509765625,
-0.06695556640625,
-0.018157958984375,
0.0148162841796875,
-0.05712890625,
-0.0135498046875,
0.0177459716796875,
-0.05816650390625,
0.00774383544921875,
0.00010311603546142578,
0.043670654296875,
-0.055755615234375,
-0.005519866943359375,
0.02337646484375,
-0.015777587890625,
0.029998779296875,
-0.00024437904357910156,
-0.046844482421875,
-0.018524169921875,
0.039764404296875,
0.0640869140625,
0.0124969482421875,
-0.005706787109375,
-0.0279541015625,
0.00283050537109375,
-0.010650634765625,
0.0244598388671875,
-0.03887939453125,
-0.01331329345703125,
0.015777587890625,
0.0289764404296875,
-0.00771331787109375,
-0.0216827392578125,
0.06597900390625,
-0.040130615234375,
0.02691650390625,
-0.0283050537109375,
-0.0615234375,
-0.037506103515625,
0.0069122314453125,
-0.060089111328125,
0.08270263671875,
-0.00733184814453125,
-0.06341552734375,
0.006229400634765625,
-0.048248291015625,
-0.0161895751953125,
-0.01910400390625,
-0.0024871826171875,
-0.044769287109375,
-0.00879669189453125,
0.0284423828125,
0.043701171875,
-0.017120361328125,
0.002605438232421875,
-0.025970458984375,
-0.042755126953125,
-0.0005397796630859375,
-0.017303466796875,
0.0819091796875,
0.0191497802734375,
-0.025115966796875,
-0.0164794921875,
-0.032806396484375,
0.00899505615234375,
0.022705078125,
-0.02337646484375,
-0.025787353515625,
0.0165557861328125,
0.0176849365234375,
0.0038814544677734375,
0.039642333984375,
-0.05279541015625,
0.01369476318359375,
-0.043792724609375,
0.044464111328125,
0.04180908203125,
0.012939453125,
0.0179290771484375,
-0.035491943359375,
0.021514892578125,
-0.0017499923706054688,
-0.0028553009033203125,
-0.0166168212890625,
-0.039703369140625,
-0.046966552734375,
-0.0226287841796875,
0.055419921875,
0.04937744140625,
-0.0653076171875,
0.049774169921875,
-0.034210205078125,
-0.04632568359375,
-0.07049560546875,
0.0100555419921875,
0.039947509765625,
0.00016045570373535156,
0.05377197265625,
-0.01035308837890625,
-0.035858154296875,
-0.0699462890625,
-0.0046234130859375,
0.005855560302734375,
-0.00679779052734375,
0.040191650390625,
0.0460205078125,
-0.023895263671875,
0.0305023193359375,
-0.05487060546875,
-0.026153564453125,
-0.0172119140625,
-0.0054779052734375,
0.0253753662109375,
0.036773681640625,
0.0478515625,
-0.07537841796875,
-0.043701171875,
-0.000606536865234375,
-0.058319091796875,
0.00571441650390625,
0.0027141571044921875,
-0.0224151611328125,
0.01305389404296875,
0.045654296875,
-0.0307769775390625,
0.017791748046875,
0.035675048828125,
-0.019317626953125,
0.0210418701171875,
-0.0015697479248046875,
0.010986328125,
-0.09912109375,
0.00164031982421875,
0.022613525390625,
-0.008575439453125,
-0.020477294921875,
0.0389404296875,
0.012725830078125,
0.0154266357421875,
-0.025909423828125,
0.0439453125,
-0.0394287109375,
0.0187530517578125,
0.009674072265625,
0.0460205078125,
-0.0067138671875,
0.038330078125,
-0.0034999847412109375,
0.0537109375,
0.02783203125,
-0.0299072265625,
0.00928497314453125,
0.03955078125,
-0.0333251953125,
0.0060882568359375,
-0.04937744140625,
-0.005687713623046875,
-0.00554656982421875,
0.0125579833984375,
-0.06207275390625,
-0.005462646484375,
0.0198516845703125,
-0.042999267578125,
0.03955078125,
-0.0224456787109375,
-0.037261962890625,
-0.027587890625,
-0.06829833984375,
0.010986328125,
0.04376220703125,
-0.048553466796875,
0.0164794921875,
0.022125244140625,
0.006961822509765625,
-0.0579833984375,
-0.061309814453125,
-0.01165771484375,
-0.00017368793487548828,
-0.03955078125,
0.0408935546875,
-0.002140045166015625,
0.0191650390625,
0.0141754150390625,
-0.005359649658203125,
0.0112762451171875,
0.00865936279296875,
-0.00020933151245117188,
0.0184326171875,
-0.035797119140625,
0.0035400390625,
0.02056884765625,
0.0098114013671875,
-0.0148773193359375,
-0.01212310791015625,
0.033111572265625,
-0.01291656494140625,
-0.02679443359375,
-0.017791748046875,
0.0255584716796875,
0.0192413330078125,
-0.0303955078125,
0.0445556640625,
0.07452392578125,
-0.0281829833984375,
-0.0062713623046875,
-0.0496826171875,
-0.00923919677734375,
-0.036224365234375,
0.034149169921875,
-0.02435302734375,
-0.07379150390625,
0.02972412109375,
-0.0015201568603515625,
0.0162506103515625,
0.050872802734375,
0.0252532958984375,
-0.01061248779296875,
0.08099365234375,
0.0281524658203125,
-0.020294189453125,
0.049835205078125,
-0.049774169921875,
0.0133209228515625,
-0.0882568359375,
-0.00335693359375,
-0.0297088623046875,
-0.0297088623046875,
-0.099853515625,
-0.03802490234375,
0.00463104248046875,
0.021026611328125,
-0.02862548828125,
0.0323486328125,
-0.04302978515625,
0.01145172119140625,
0.036407470703125,
0.0222930908203125,
-0.00138092041015625,
0.00933837890625,
-0.0325927734375,
-0.0203704833984375,
-0.04583740234375,
-0.038238525390625,
0.07513427734375,
0.036407470703125,
0.046051025390625,
0.027374267578125,
0.061981201171875,
0.01422119140625,
0.007293701171875,
-0.058197021484375,
0.042999267578125,
-0.03936767578125,
-0.04296875,
-0.0269927978515625,
-0.036712646484375,
-0.083984375,
0.02984619140625,
-0.0206146240234375,
-0.058319091796875,
0.00806427001953125,
-0.0148773193359375,
-0.0022869110107421875,
0.03515625,
-0.050872802734375,
0.07720947265625,
-0.00811004638671875,
-0.0231781005859375,
-0.0058746337890625,
-0.03155517578125,
0.0245208740234375,
0.01497650146484375,
0.00620269775390625,
0.00556182861328125,
-0.01953125,
0.05718994140625,
-0.01416015625,
0.04803466796875,
-0.01220703125,
0.01123809814453125,
0.03240966796875,
-0.01386260986328125,
0.04168701171875,
0.006061553955078125,
-0.0135498046875,
0.0226898193359375,
0.0067291259765625,
-0.036376953125,
-0.037384033203125,
0.06634521484375,
-0.05072021484375,
-0.05328369140625,
-0.0282440185546875,
-0.0188751220703125,
0.01348114013671875,
0.032989501953125,
0.0265960693359375,
0.0165252685546875,
-0.0077667236328125,
0.048736572265625,
0.06982421875,
-0.0411376953125,
0.028900146484375,
0.0261077880859375,
-0.0206146240234375,
-0.044647216796875,
0.0845947265625,
0.019805908203125,
-0.003963470458984375,
0.05078125,
0.0010051727294921875,
-0.021087646484375,
-0.0400390625,
-0.034393310546875,
0.047943115234375,
-0.04473876953125,
-0.01264190673828125,
-0.048309326171875,
-0.0322265625,
-0.0325927734375,
0.0016393661499023438,
-0.020416259765625,
-0.021331787109375,
-0.013427734375,
-0.021209716796875,
0.0177764892578125,
0.0357666015625,
0.00916290283203125,
0.00669097900390625,
-0.0535888671875,
0.015899658203125,
-0.00736236572265625,
0.033172607421875,
0.005413055419921875,
-0.04071044921875,
-0.046844482421875,
0.013153076171875,
-0.03692626953125,
-0.081787109375,
0.0262908935546875,
0.005680084228515625,
0.06317138671875,
0.02484130859375,
-0.000843048095703125,
0.0309600830078125,
-0.03955078125,
0.08062744140625,
-0.008148193359375,
-0.05926513671875,
0.0384521484375,
-0.02117919921875,
0.012420654296875,
0.042205810546875,
0.049285888671875,
-0.03497314453125,
-0.0206451416015625,
-0.03704833984375,
-0.07275390625,
0.036712646484375,
0.01372528076171875,
0.003223419189453125,
-0.022369384765625,
0.0247802734375,
-0.01372528076171875,
-0.0001710653305053711,
-0.060302734375,
-0.05621337890625,
-0.0251922607421875,
-0.02655029296875,
-0.0073089599609375,
-0.0208892822265625,
0.01558685302734375,
-0.021881103515625,
0.075439453125,
0.0003371238708496094,
0.041412353515625,
0.0270233154296875,
-0.024688720703125,
0.0180511474609375,
0.019073486328125,
0.0224609375,
0.01412200927734375,
-0.0291595458984375,
-0.01093292236328125,
0.0237579345703125,
-0.0416259765625,
-0.004810333251953125,
0.0233612060546875,
-0.035308837890625,
0.01459503173828125,
0.023040771484375,
0.05328369140625,
0.033843994140625,
-0.03338623046875,
0.042633056640625,
0.00862884521484375,
-0.01419830322265625,
-0.02252197265625,
-0.005405426025390625,
0.02301025390625,
0.0189666748046875,
0.00879669189453125,
-0.034393310546875,
0.0199737548828125,
-0.039947509765625,
0.0255584716796875,
0.033843994140625,
-0.028656005859375,
-0.005062103271484375,
0.05279541015625,
0.0026149749755859375,
-0.0016050338745117188,
0.03607177734375,
-0.037841796875,
-0.055450439453125,
0.031982421875,
0.028289794921875,
0.06329345703125,
-0.010986328125,
0.016876220703125,
0.06500244140625,
0.04010009765625,
-0.02410888671875,
0.026885986328125,
0.0058441162109375,
-0.04400634765625,
-0.033447265625,
-0.0408935546875,
-0.004383087158203125,
0.0200958251953125,
-0.043548583984375,
0.0264434814453125,
-0.031402587890625,
-0.01113128662109375,
0.0023651123046875,
0.033111572265625,
-0.05609130859375,
0.00955963134765625,
0.0034236907958984375,
0.08477783203125,
-0.04400634765625,
0.06292724609375,
0.07476806640625,
-0.07232666015625,
-0.058197021484375,
0.0059814453125,
-0.0098419189453125,
-0.045867919921875,
0.0283050537109375,
0.0198822021484375,
0.0133209228515625,
0.004669189453125,
-0.03594970703125,
-0.06890869140625,
0.11822509765625,
0.002880096435546875,
-0.0399169921875,
-0.00466156005859375,
-0.021240234375,
0.034454345703125,
-0.02850341796875,
0.033599853515625,
0.0310516357421875,
0.0460205078125,
-0.0139617919921875,
-0.04852294921875,
0.040863037109375,
-0.023834228515625,
0.0176849365234375,
0.0036907196044921875,
-0.0733642578125,
0.0623779296875,
0.0034465789794921875,
-0.0249786376953125,
0.014678955078125,
0.05450439453125,
0.0173187255859375,
0.031585693359375,
0.0181732177734375,
0.07000732421875,
0.049652099609375,
-0.01678466796875,
0.08709716796875,
-0.019439697265625,
0.04736328125,
0.06549072265625,
0.01274871826171875,
0.0843505859375,
0.006679534912109375,
-0.0177459716796875,
0.05059814453125,
0.0596923828125,
-0.024078369140625,
0.035064697265625,
0.0014257431030273438,
0.004581451416015625,
-0.0240631103515625,
0.004016876220703125,
-0.04034423828125,
0.021697998046875,
0.0245513916015625,
-0.038970947265625,
0.0033702850341796875,
-0.0220489501953125,
0.0089874267578125,
0.00799560546875,
-0.0017108917236328125,
0.04339599609375,
0.0235137939453125,
-0.035064697265625,
0.050079345703125,
0.0177459716796875,
0.07611083984375,
-0.0305023193359375,
-0.0114898681640625,
-0.0212860107421875,
-0.0085906982421875,
-0.01708984375,
-0.05889892578125,
-0.006099700927734375,
-0.019378662109375,
-0.0155487060546875,
0.00634765625,
0.0406494140625,
-0.046966552734375,
-0.0306549072265625,
0.0426025390625,
0.038818359375,
0.0182647705078125,
0.01349639892578125,
-0.08270263671875,
0.0023555755615234375,
0.0288238525390625,
-0.040130615234375,
0.0231170654296875,
0.035552978515625,
-0.004669189453125,
0.04437255859375,
0.043975830078125,
0.00484466552734375,
-0.0014820098876953125,
0.0030384063720703125,
0.0386962890625,
-0.07037353515625,
-0.022979736328125,
-0.047607421875,
0.02716064453125,
-0.0246124267578125,
0.0016565322875976562,
0.060791015625,
0.053009033203125,
0.08074951171875,
-0.00397491455078125,
0.061309814453125,
-0.008636474609375,
0.03070068359375,
-0.045379638671875,
0.067138671875,
-0.0772705078125,
0.0194244384765625,
-0.0266265869140625,
-0.07049560546875,
-0.0118408203125,
0.052703857421875,
-0.0252838134765625,
0.0174102783203125,
0.051177978515625,
0.0736083984375,
-0.019256591796875,
-0.01438140869140625,
0.023193359375,
0.0328369140625,
0.01187896728515625,
0.0596923828125,
0.0259857177734375,
-0.0736083984375,
0.0482177734375,
-0.017852783203125,
0.00962066650390625,
-0.039154052734375,
-0.04791259765625,
-0.06939697265625,
-0.055145263671875,
-0.031707763671875,
-0.022979736328125,
-0.0034198760986328125,
0.06927490234375,
0.0257568359375,
-0.056304931640625,
-0.0052032470703125,
0.02081298828125,
0.03643798828125,
-0.0200958251953125,
-0.0207061767578125,
0.0496826171875,
-0.005706787109375,
-0.07037353515625,
0.02471923828125,
-0.006282806396484375,
-0.005138397216796875,
-0.0039825439453125,
-0.0184326171875,
-0.06640625,
0.0089569091796875,
0.044952392578125,
0.0191650390625,
-0.06866455078125,
-0.031585693359375,
0.00634765625,
-0.0196075439453125,
-0.01158905029296875,
0.01271820068359375,
-0.0309906005859375,
0.0269927978515625,
0.046600341796875,
0.05792236328125,
0.0496826171875,
-0.0035190582275390625,
0.01529693603515625,
-0.04608154296875,
-0.006397247314453125,
-0.003162384033203125,
0.053375244140625,
0.0273895263671875,
-0.02264404296875,
0.06781005859375,
0.016082763671875,
-0.03045654296875,
-0.056610107421875,
0.002437591552734375,
-0.07928466796875,
-0.0247344970703125,
0.08404541015625,
-0.031982421875,
-0.0188140869140625,
0.0205535888671875,
-0.01439666748046875,
0.041259765625,
-0.036956787109375,
0.035369873046875,
0.05999755859375,
0.03271484375,
-0.0117034912109375,
-0.0684814453125,
0.0235137939453125,
0.046722412109375,
-0.0206756591796875,
-0.0255126953125,
0.025390625,
0.036346435546875,
0.0179595947265625,
0.00876617431640625,
-0.0185699462890625,
0.024017333984375,
-0.005641937255859375,
-0.0006580352783203125,
-0.010406494140625,
0.018524169921875,
-0.01372528076171875,
0.001949310302734375,
-0.0125579833984375,
-0.0232086181640625
]
] |
timm/resnet50d.ra2_in1k | 2023-04-05T18:18:24.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2110.00476",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/resnet50d.ra2_in1k | 0 | 14,819 | timm | 2023-04-05T18:18:03 | ---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
---
# Model card for resnet50d.ra2_in1k
A ResNet-D image classification model.
This model features:
* ReLU activations
* 3-layer stem of 3x3 convolutions with pooling
* 2x2 average pool + 1x1 convolution shortcut downsample
Trained on ImageNet-1k in `timm` using recipe template described below.
Recipe details:
* RandAugment `RA2` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 25.6
- GMACs: 4.4
- Activations (M): 11.9
- Image size: train = 224 x 224, test = 288 x 288
- **Papers:**
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385
- Bag of Tricks for Image Classification with Convolutional Neural Networks: https://arxiv.org/abs/1812.01187
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnet50d.ra2_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnet50d.ra2_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 2048, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnet50d.ra2_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec|
|------------------------------------------|--------|-----|-----|-----------|-----|-----|-------|
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 |
|[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 |610 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 |
|[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 |
|[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 |
|[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 |
|[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 |
|[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 |
|[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 |
|[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 |
|[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 |
|[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 |
|[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 |
|[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 |
|[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 |
|[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 |80.22|94.63|25.6 |4.4 |11.9 |3162 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 |
|[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 |
|[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 |
|[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 |
|[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 |
|[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 |
|[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 |
|[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 |
|[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 |
|[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 |
|[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 |
|[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 |
|[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 |
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 |
|[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 |
|[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 |
|[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 |
|[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 |
|[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 |
|[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 |76.42|92.87|21.8 |3.7 |3.7 |5984 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 |
|[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 |
|[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 |
|[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 |
|[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 |
|[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 |71.49|90.07|11.7 |1.8 |2.5 |10187 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 |
|[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 |
|[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 |
## Citation
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@article{He2015,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {arXiv preprint arXiv:1512.03385},
year = {2015}
}
```
```bibtex
@article{He2018BagOT,
title={Bag of Tricks for Image Classification with Convolutional Neural Networks},
author={Tong He and Zhi Zhang and Hang Zhang and Zhongyue Zhang and Junyuan Xie and Mu Li},
journal={2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2018},
pages={558-567}
}
```
| 39,089 | [
[
-0.06402587890625,
-0.017730712890625,
0.0019092559814453125,
0.0266265869140625,
-0.029571533203125,
-0.00812530517578125,
-0.0101776123046875,
-0.031463623046875,
0.08062744140625,
0.0223236083984375,
-0.049102783203125,
-0.040191650390625,
-0.0474853515625,
0.0005636215209960938,
0.0226593017578125,
0.06500244140625,
-0.0012731552124023438,
-0.00576019287109375,
0.0123748779296875,
-0.0208587646484375,
-0.006389617919921875,
-0.020233154296875,
-0.079345703125,
-0.0179901123046875,
0.032257080078125,
0.0170135498046875,
0.050140380859375,
0.0438232421875,
0.0321044921875,
0.043609619140625,
-0.0172576904296875,
0.0211181640625,
-0.0032444000244140625,
-0.006938934326171875,
0.044647216796875,
-0.03387451171875,
-0.067138671875,
-0.003154754638671875,
0.050689697265625,
0.043121337890625,
0.004360198974609375,
0.0264129638671875,
0.025238037109375,
0.04949951171875,
0.00018215179443359375,
-0.0030193328857421875,
0.000782012939453125,
0.01220703125,
-0.02337646484375,
0.0054779052734375,
-0.00724029541015625,
-0.051666259765625,
0.01032257080078125,
-0.04559326171875,
-0.0010433197021484375,
-0.0005435943603515625,
0.10015869140625,
-0.00601959228515625,
-0.0153350830078125,
0.0035305023193359375,
0.00618743896484375,
0.057647705078125,
-0.0606689453125,
0.023040771484375,
0.0418701171875,
0.00452423095703125,
-0.01450347900390625,
-0.05401611328125,
-0.041046142578125,
0.00991058349609375,
-0.0293426513671875,
0.0226593017578125,
-0.026336669921875,
-0.016754150390625,
0.029632568359375,
0.023040771484375,
-0.03570556640625,
-0.00704193115234375,
-0.02783203125,
-0.004932403564453125,
0.05255126953125,
0.004261016845703125,
0.05145263671875,
-0.0250244140625,
-0.038238525390625,
-0.01390838623046875,
-0.01412200927734375,
0.03448486328125,
0.0196533203125,
0.009552001953125,
-0.07843017578125,
0.03399658203125,
0.00717926025390625,
0.0189971923828125,
0.0267181396484375,
-0.01129913330078125,
0.06085205078125,
-0.0055084228515625,
-0.03582763671875,
-0.036041259765625,
0.08135986328125,
0.047760009765625,
0.022308349609375,
-0.0041656494140625,
-0.004634857177734375,
-0.01418304443359375,
-0.0279083251953125,
-0.072265625,
-0.005817413330078125,
0.022705078125,
-0.0411376953125,
-0.018096923828125,
0.0263671875,
-0.06683349609375,
-0.0026092529296875,
-0.00569915771484375,
0.00542449951171875,
-0.054412841796875,
-0.0345458984375,
0.0014476776123046875,
-0.0175628662109375,
0.037628173828125,
0.01548004150390625,
-0.0257415771484375,
0.0322265625,
0.0096588134765625,
0.06817626953125,
0.0204620361328125,
-0.00409698486328125,
-0.015899658203125,
-0.0004911422729492188,
-0.0262451171875,
0.0299530029296875,
0.00946044921875,
-0.0124359130859375,
-0.025299072265625,
0.0301666259765625,
-0.0160369873046875,
-0.02252197265625,
0.0447998046875,
0.01470947265625,
0.01258087158203125,
-0.022705078125,
-0.0172271728515625,
-0.01995849609375,
0.025970458984375,
-0.04449462890625,
0.0816650390625,
0.0294342041015625,
-0.08380126953125,
0.0126190185546875,
-0.03948974609375,
-0.00266265869140625,
-0.0199432373046875,
0.0052490234375,
-0.07000732421875,
-0.00030684471130371094,
0.0161895751953125,
0.052001953125,
-0.0189971923828125,
-0.0130767822265625,
-0.0281829833984375,
0.0022411346435546875,
0.03131103515625,
0.01468658447265625,
0.0699462890625,
0.025299072265625,
-0.0360107421875,
-0.015960693359375,
-0.05096435546875,
0.0338134765625,
0.034027099609375,
-0.0015659332275390625,
-0.00466156005859375,
-0.0582275390625,
0.0034580230712890625,
0.045074462890625,
0.0157928466796875,
-0.053009033203125,
0.0181427001953125,
-0.01551055908203125,
0.0252685546875,
0.044952392578125,
0.0021724700927734375,
0.0136260986328125,
-0.05242919921875,
0.04681396484375,
0.00016498565673828125,
0.0206146240234375,
-0.0000597834587097168,
-0.03228759765625,
-0.05615234375,
-0.052764892578125,
0.01800537109375,
0.03314208984375,
-0.033172607421875,
0.062469482421875,
0.007083892822265625,
-0.045501708984375,
-0.04559326171875,
0.005161285400390625,
0.04412841796875,
0.0162353515625,
0.0085601806640625,
-0.0288848876953125,
-0.055145263671875,
-0.07244873046875,
-0.0231781005859375,
0.0090484619140625,
-0.004058837890625,
0.05108642578125,
0.034576416015625,
-0.013824462890625,
0.045135498046875,
-0.0296630859375,
-0.0191497802734375,
-0.0136260986328125,
-0.00638580322265625,
0.0308685302734375,
0.05950927734375,
0.0767822265625,
-0.05615234375,
-0.069091796875,
0.009185791015625,
-0.08209228515625,
-0.0036449432373046875,
0.0002617835998535156,
-0.0163421630859375,
0.031890869140625,
0.0189971923828125,
-0.06500244140625,
0.055023193359375,
0.0284271240234375,
-0.056854248046875,
0.0330810546875,
-0.0276031494140625,
0.039947509765625,
-0.0841064453125,
0.02020263671875,
0.0214385986328125,
-0.017242431640625,
-0.04168701171875,
0.00566864013671875,
-0.00641632080078125,
0.008514404296875,
-0.041412353515625,
0.058074951171875,
-0.05450439453125,
-0.00595855712890625,
0.0078582763671875,
0.0036449432373046875,
-0.0006084442138671875,
0.032135009765625,
-0.004119873046875,
0.04229736328125,
0.0650634765625,
-0.012481689453125,
0.0268096923828125,
0.0295257568359375,
0.0009636878967285156,
0.05712890625,
-0.048828125,
0.01287841796875,
0.0014524459838867188,
0.03704833984375,
-0.0767822265625,
-0.0300140380859375,
0.04193115234375,
-0.0625,
0.0494384765625,
-0.02081298828125,
-0.0234527587890625,
-0.06304931640625,
-0.06396484375,
0.0228271484375,
0.05023193359375,
-0.0445556640625,
0.0295562744140625,
0.0157470703125,
-0.002658843994140625,
-0.038421630859375,
-0.055511474609375,
0.007236480712890625,
-0.03125,
-0.0606689453125,
0.033111572265625,
0.02374267578125,
-0.014678955078125,
0.00830078125,
-0.008636474609375,
-0.00921630859375,
-0.0166168212890625,
0.047027587890625,
0.0226593017578125,
-0.021240234375,
-0.0293426513671875,
-0.0275421142578125,
-0.0218658447265625,
-0.00377655029296875,
-0.00891876220703125,
0.039215087890625,
-0.03125,
0.0037136077880859375,
-0.1077880859375,
0.0086822509765625,
0.066162109375,
-0.003940582275390625,
0.07183837890625,
0.060577392578125,
-0.03680419921875,
0.00872802734375,
-0.034759521484375,
-0.019439697265625,
-0.038909912109375,
-0.01136016845703125,
-0.053070068359375,
-0.042938232421875,
0.0675048828125,
0.0034580230712890625,
-0.0099029541015625,
0.0577392578125,
0.012969970703125,
-0.0189971923828125,
0.060577392578125,
0.035919189453125,
0.0003485679626464844,
0.04168701171875,
-0.06500244140625,
0.005664825439453125,
-0.06256103515625,
-0.0572509765625,
-0.02032470703125,
-0.044830322265625,
-0.0440673828125,
-0.0282440185546875,
0.01812744140625,
0.0267181396484375,
-0.018035888671875,
0.04339599609375,
-0.04388427734375,
0.0032825469970703125,
0.0247802734375,
0.04168701171875,
-0.0183258056640625,
-0.00508880615234375,
-0.0079345703125,
-0.024871826171875,
-0.04241943359375,
-0.026031494140625,
0.060516357421875,
0.047821044921875,
0.03265380859375,
0.00684356689453125,
0.044952392578125,
0.00551605224609375,
0.0161285400390625,
-0.025054931640625,
0.0517578125,
0.0007953643798828125,
-0.03399658203125,
-0.022613525390625,
-0.029571533203125,
-0.07989501953125,
0.008819580078125,
-0.033050537109375,
-0.058837890625,
-0.010009765625,
-0.00290679931640625,
-0.0269317626953125,
0.056610107421875,
-0.04547119140625,
0.045989990234375,
-0.0033473968505859375,
-0.039306640625,
-0.00545501708984375,
-0.06109619140625,
0.00856781005859375,
0.0279541015625,
0.003021240234375,
-0.0003192424774169922,
-0.003910064697265625,
0.061065673828125,
-0.059906005859375,
0.04571533203125,
-0.0277557373046875,
0.0122833251953125,
0.031219482421875,
-0.002834320068359375,
0.0307464599609375,
-0.0013513565063476562,
-0.01372528076171875,
-0.004299163818359375,
0.007457733154296875,
-0.060272216796875,
-0.0264739990234375,
0.04974365234375,
-0.0572509765625,
-0.0258636474609375,
-0.0496826171875,
-0.02130126953125,
0.00896453857421875,
0.0004792213439941406,
0.03558349609375,
0.048797607421875,
0.0006146430969238281,
0.018402099609375,
0.04180908203125,
-0.032958984375,
0.039093017578125,
-0.007663726806640625,
0.0017948150634765625,
-0.041656494140625,
0.051910400390625,
0.00485992431640625,
0.0019512176513671875,
0.0004189014434814453,
0.0011310577392578125,
-0.031463623046875,
-0.0177001953125,
-0.0218658447265625,
0.0538330078125,
-0.016693115234375,
-0.02520751953125,
-0.04949951171875,
-0.0251617431640625,
-0.04058837890625,
-0.0284423828125,
-0.034637451171875,
-0.027069091796875,
-0.022216796875,
0.0035800933837890625,
0.055328369140625,
0.06536865234375,
-0.026458740234375,
0.0276641845703125,
-0.036346435546875,
0.02203369140625,
0.006259918212890625,
0.040435791015625,
-0.0232086181640625,
-0.051239013671875,
0.00482940673828125,
-0.0015001296997070312,
-0.006389617919921875,
-0.06134033203125,
0.04681396484375,
0.00290679931640625,
0.0259246826171875,
0.0293121337890625,
-0.0135650634765625,
0.052276611328125,
-0.0007143020629882812,
0.034210205078125,
0.0460205078125,
-0.050323486328125,
0.0283355712890625,
-0.0283050537109375,
0.0030956268310546875,
0.020843505859375,
0.0157012939453125,
-0.030609130859375,
-0.02496337890625,
-0.06658935546875,
-0.03448486328125,
0.05792236328125,
0.006626129150390625,
-0.001247406005859375,
-0.0009598731994628906,
0.0584716796875,
-0.005779266357421875,
-0.0003657341003417969,
-0.039794921875,
-0.06683349609375,
-0.00765228271484375,
-0.01325225830078125,
0.00600433349609375,
-0.007732391357421875,
0.0013589859008789062,
-0.048309326171875,
0.049072265625,
0.004390716552734375,
0.037811279296875,
0.01491546630859375,
0.00611114501953125,
0.000324249267578125,
-0.02294921875,
0.04669189453125,
0.027679443359375,
-0.01465606689453125,
-0.007541656494140625,
0.027374267578125,
-0.040130615234375,
0.00832366943359375,
0.01346588134765625,
-0.0005946159362792969,
0.00731658935546875,
0.009735107421875,
0.03863525390625,
0.0236053466796875,
-0.00611114501953125,
0.038604736328125,
-0.018463134765625,
-0.042022705078125,
-0.0186004638671875,
-0.0128021240234375,
0.0190277099609375,
0.033843994140625,
0.0258331298828125,
0.007015228271484375,
-0.0310821533203125,
-0.0276336669921875,
0.0421142578125,
0.054412841796875,
-0.0294647216796875,
-0.0311431884765625,
0.04632568359375,
-0.004261016845703125,
-0.0193939208984375,
0.031524658203125,
-0.00829315185546875,
-0.049560546875,
0.07879638671875,
0.025360107421875,
0.050811767578125,
-0.0357666015625,
0.0084991455078125,
0.06695556640625,
-0.000020503997802734375,
0.0115203857421875,
0.0252227783203125,
0.03228759765625,
-0.0263671875,
-0.004314422607421875,
-0.04022216796875,
0.0132598876953125,
0.03472900390625,
-0.03472900390625,
0.0221710205078125,
-0.0548095703125,
-0.0281219482421875,
0.0072784423828125,
0.03753662109375,
-0.049072265625,
0.02655029296875,
-0.005481719970703125,
0.079345703125,
-0.06304931640625,
0.06610107421875,
0.06719970703125,
-0.040679931640625,
-0.065673828125,
-0.000823974609375,
0.0086669921875,
-0.0650634765625,
0.038665771484375,
0.006866455078125,
0.00266265869140625,
-0.0008330345153808594,
-0.03765869140625,
-0.05010986328125,
0.1060791015625,
0.0287322998046875,
-0.006908416748046875,
0.0233612060546875,
-0.0290374755859375,
0.027923583984375,
-0.01470947265625,
0.044647216796875,
0.0260772705078125,
0.038238525390625,
0.0160064697265625,
-0.06304931640625,
0.0284576416015625,
-0.0309600830078125,
-0.006488800048828125,
0.0208282470703125,
-0.0958251953125,
0.06695556640625,
-0.0231781005859375,
-0.00522613525390625,
0.0169525146484375,
0.050140380859375,
0.024200439453125,
0.0001304149627685547,
0.02215576171875,
0.07025146484375,
0.034454345703125,
-0.0200653076171875,
0.07965087890625,
-0.0164031982421875,
0.041015625,
0.01824951171875,
0.038543701171875,
0.030426025390625,
0.029632568359375,
-0.042724609375,
0.0204620361328125,
0.06329345703125,
-0.004791259765625,
0.011749267578125,
0.0208282470703125,
-0.0298309326171875,
-0.0124053955078125,
-0.01372528076171875,
-0.0509033203125,
0.01837158203125,
0.00595855712890625,
-0.01087188720703125,
-0.01160430908203125,
-0.0018110275268554688,
0.0193634033203125,
0.0201263427734375,
-0.0141754150390625,
0.0399169921875,
0.0066070556640625,
-0.03167724609375,
0.035675048828125,
0.0030689239501953125,
0.07861328125,
-0.0268402099609375,
0.0114593505859375,
-0.0281219482421875,
0.02386474609375,
-0.020172119140625,
-0.08062744140625,
0.0258636474609375,
-0.006694793701171875,
0.0037708282470703125,
-0.016387939453125,
0.04840087890625,
-0.0265350341796875,
-0.02642822265625,
0.0280303955078125,
0.0267486572265625,
0.03826904296875,
0.021697998046875,
-0.085693359375,
0.021331787109375,
0.005443572998046875,
-0.046875,
0.032928466796875,
0.036224365234375,
0.025848388671875,
0.05877685546875,
0.02655029296875,
0.0207366943359375,
0.011474609375,
-0.025360107421875,
0.056671142578125,
-0.04486083984375,
-0.032318115234375,
-0.061492919921875,
0.038726806640625,
-0.027374267578125,
-0.039947509765625,
0.0543212890625,
0.042877197265625,
0.0296783447265625,
0.002288818359375,
0.049407958984375,
-0.041656494140625,
0.03546142578125,
-0.02044677734375,
0.056640625,
-0.051116943359375,
-0.0160064697265625,
-0.013885498046875,
-0.04571533203125,
-0.029876708984375,
0.061676025390625,
-0.00832366943359375,
0.021820068359375,
0.023162841796875,
0.05413818359375,
0.004425048828125,
-0.0124053955078125,
-0.0021457672119140625,
0.01251983642578125,
-0.00684356689453125,
0.0653076171875,
0.03570556640625,
-0.058074951171875,
0.005802154541015625,
-0.037322998046875,
-0.021820068359375,
-0.025970458984375,
-0.054840087890625,
-0.0850830078125,
-0.052703857421875,
-0.04144287109375,
-0.051605224609375,
-0.0166778564453125,
0.08648681640625,
0.062164306640625,
-0.0462646484375,
-0.01052093505859375,
0.0094451904296875,
0.005710601806640625,
-0.013427734375,
-0.016204833984375,
0.0416259765625,
0.007457733154296875,
-0.07281494140625,
-0.0310821533203125,
0.01141357421875,
0.042724609375,
0.0267486572265625,
-0.037384033203125,
-0.017608642578125,
-0.00696563720703125,
0.024139404296875,
0.06304931640625,
-0.0606689453125,
-0.0200653076171875,
0.00040411949157714844,
-0.03558349609375,
0.010009765625,
0.0229949951171875,
-0.03411865234375,
-0.004161834716796875,
0.0362548828125,
0.0268707275390625,
0.056365966796875,
0.004688262939453125,
0.010711669921875,
-0.03631591796875,
0.042144775390625,
-0.0006036758422851562,
0.0224456787109375,
0.01763916015625,
-0.0215606689453125,
0.05615234375,
0.0396728515625,
-0.0296630859375,
-0.07733154296875,
-0.012603759765625,
-0.098388671875,
-0.006069183349609375,
0.05328369140625,
-0.007762908935546875,
-0.0309295654296875,
0.030426025390625,
-0.031494140625,
0.03912353515625,
-0.01397705078125,
0.0217742919921875,
0.0176544189453125,
-0.0251617431640625,
-0.0301361083984375,
-0.044647216796875,
0.046661376953125,
0.0281829833984375,
-0.051361083984375,
-0.030181884765625,
-0.0019140243530273438,
0.026947021484375,
0.01265716552734375,
0.05511474609375,
-0.0290374755859375,
0.0119171142578125,
-0.007061004638671875,
0.021270751953125,
-0.006679534912109375,
0.01061248779296875,
-0.0232391357421875,
-0.007114410400390625,
-0.0152740478515625,
-0.04840087890625
]
] |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.