---
language:
- en
license: llama2
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
model_name: Llama 2 7B Chat
arxiv: 2307.09288
base_model: meta-llama/Llama-2-7b-chat-hf
inference: false
model_creator: Meta Llama 2
model_type: llama
pipeline_tag: text-generation
prompt_template: '[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as
possible, while being safe. Your answers should not include any harmful, unethical,
racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses
are socially unbiased and positive in nature. If a question does not make any sense,
or is not factually coherent, explain why instead of answering something not correct.
If you don''t know the answer to a question, please don''t share false information.
<</SYS>>
{prompt}[/INST]
'
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Llama 2 7B Chat - GPTQ
- Model creator: [Meta Llama 2](https://huggingface.co/meta-llama)
- Original model: [Llama 2 7B Chat](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Meta Llama 2's Llama 2 7B Chat](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Llama-2-7b-Chat-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Llama-2-7b-Chat-GGUF)
* [Meta Llama 2's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Llama-2-Chat
```
[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>
{prompt}[/INST]
```
<!-- prompt-template end -->
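For convenience, the template above can also be filled programmatically. The helper below is purely illustrative (`format_llama2_prompt` is not part of this repo or of any library):

```python
# Hypothetical helper that fills the Llama-2-Chat template shown above
# with a system message and a single user prompt.
DEFAULT_SYSTEM = "You are a helpful, respectful and honest assistant."

def format_llama2_prompt(prompt: str, system: str = DEFAULT_SYSTEM) -> str:
    """Return a single-turn Llama-2-Chat prompt string."""
    return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n{prompt}[/INST]\n"

text = format_llama2_prompt("Tell me about AI")
print(text)
```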
<!-- README_GPTQ.md-provided-files start -->
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All recent GPTQ files, including all files in non-`main` branches, were made with AutoGPTQ. Files in the `main` branch uploaded before August 2023 were made with GPTQ-for-LLaMa.
<details>
<summary>Explanation of GPTQ parameters</summary>

- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but give lower quantisation accuracy. "None" means no grouping, which uses the least VRAM and gives the lowest accuracy.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
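To see why a smaller group size costs more VRAM: GPTQ stores one set of scale/zero-point values per group, so a group size of 32 keeps four times as much quantisation metadata as a group size of 128. The numbers below (fp16 scales, 4-bit zero-points) are illustrative assumptions, not the exact on-disk layout:

```python
# Back-of-the-envelope sketch of the per-tensor metadata that GPTQ
# group size adds, under assumed fp16 scales and 4-bit zero-points.
def quant_overhead_bytes(n_weights: int, group_size: int,
                         scale_bytes: float = 2.0, zero_bits: int = 4) -> float:
    """Approximate bytes of scales + zero-points for one quantised tensor."""
    n_groups = n_weights // group_size
    return n_groups * (scale_bytes + zero_bits / 8)

base = quant_overhead_bytes(4096 * 4096, 128)  # one 4096x4096 weight matrix
fine = quant_overhead_bytes(4096 * 4096, 32)
print(fine / base)  # 32g stores 4x the metadata of 128g
```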
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7b-Chat-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 4.02 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7b-Chat-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 4.28 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7b-Chat-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 3.90 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [main](https://huggingface.co/TheBloke/Llama-2-7b-Chat-GPTQ/tree/main) | 4 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 3.90 GB | Yes | 4-bit, with group size 128g but without Act Order. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, e.g. `TheBloke/Llama-2-7b-Chat-GPTQ:gptq-4bit-64g-actorder_True`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch gptq-4bit-64g-actorder_True https://huggingface.co/TheBloke/Llama-2-7b-Chat-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see below.
<!-- README_GPTQ.md-download-from-branches end -->
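The `repo:branch` convention above is easy to split into a repo id and a `revision` yourself. The snippet below mirrors the convention only; it is not text-generation-webui's actual code:

```python
# Split a "repo_id:branch" download spec into (repo_id, revision),
# defaulting to the "main" branch when no ":branch" suffix is given.
def split_model_spec(spec: str) -> tuple[str, str]:
    repo, _, branch = spec.partition(":")
    return repo, branch or "main"

print(split_model_spec("TheBloke/Llama-2-7b-Chat-GPTQ:gptq-4bit-64g-actorder_True"))
print(split_model_spec("TheBloke/Llama-2-7b-Chat-GPTQ"))
```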
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Llama-2-7b-Chat-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Llama-2-7b-Chat-GPTQ:gptq-4bit-64g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Llama-2-7b-Chat-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
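As a rough illustration of what the loader reads from `quantize_config.json`, here is a sketch of parsing such a file. The field names (`bits`, `group_size`, `damp_percent`, `desc_act`) match what AutoGPTQ typically writes, but treat the exact schema as an assumption:

```python
import json

# Illustrative sample of an AutoGPTQ quantize_config.json; field names
# are assumed, not guaranteed, to match every AutoGPTQ version.
sample = '''{
  "bits": 4,
  "group_size": 128,
  "damp_percent": 0.01,
  "desc_act": false
}'''

cfg = json.loads(sample)
print(f"{cfg['bits']}-bit, group size {cfg['group_size']}, "
      f"act order {cfg['desc_act']}")
```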
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install "transformers>=4.32.0" "optimum>=1.12.0"
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip3 install .
```
### For CodeLlama models only: you must use Transformers 4.33.0 or later.
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source:
```shell
pip3 uninstall -y transformers
pip3 install git+https://github.com/huggingface/transformers.git
```
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Llama-2-7b-Chat-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-64g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path,
    device_map="auto",
    trust_remote_code=False,
    revision="main",
)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>
{prompt}[/INST]
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
    top_k=40,
    repetition_penalty=1.1,
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Meta Llama 2's Llama 2 7B Chat
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The largest model (70B) uses Grouped-Query Attention (GQA) for improved inference scalability.
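As a rough illustration of why GQA helps inference, the sketch below compares KV-cache sizes with and without shared key/value heads, under assumed Llama 2 70B dimensions (80 layers, head dim 128, 64 query heads, 8 KV heads, fp16 cache). The exact figures are assumptions for illustration only:

```python
# Approximate KV-cache size in bytes: K and V tensors per layer,
# each of shape (kv_heads, seq_len, head_dim), stored in fp16.
def kv_cache_bytes(seq_len, layers=80, head_dim=128, kv_heads=8, dtype_bytes=2):
    return 2 * layers * kv_heads * head_dim * seq_len * dtype_bytes

with_gqa = kv_cache_bytes(4096)               # 8 shared KV heads
without = kv_cache_bytes(4096, kv_heads=64)   # every query head keeps its own K/V
print(without / with_gqa)  # 8x larger cache without GQA
```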
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** ["Llama 2: Open Foundation and Fine-Tuned Chat Models"](https://arxiv.org/abs/2307.09288)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespaces and breaklines in between (we recommend calling `strip()` on inputs to avoid double-spaces). See our reference code in github for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
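A hedged sketch of that multi-turn formatting follows. `format_dialog` is a hypothetical helper written from the tag conventions described above; the authoritative implementation is Meta's `chat_completion` reference code linked above:

```python
# Sketch of multi-turn Llama-2-Chat formatting: the system prompt is
# merged into the first user turn, and each completed exchange is
# wrapped in <s>[INST] ... [/INST] ... </s>. Details are assumptions;
# defer to Meta's reference code for the exact token handling.
def format_dialog(system, turns):
    """turns: list of (user, answer) pairs; answer=None for the open turn."""
    first_user, first_answer = turns[0]
    merged = f"<<SYS>>\n{system}\n<</SYS>>\n\n{first_user}"
    history = [(merged, first_answer)] + list(turns[1:])
    pieces = []
    for user, answer in history:
        if answer is None:  # final turn, awaiting a completion
            pieces.append(f"<s>[INST] {user.strip()} [/INST]")
        else:
            pieces.append(f"<s>[INST] {user.strip()} [/INST] {answer.strip()} </s>")
    return "".join(pieces)

text = format_dialog("Be brief.", [("Hi", "Hello!"), ("Tell me about AI", None)])
print(text)
```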
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 34B|1038336|350|153.90|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
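As a sanity check on the table's units, the 7B row converts to total energy like this (the GPU-hours and 400 W figures come from the table; the rest is plain arithmetic):

```python
# Energy used to pretrain the 7B model: GPU-hours x per-GPU power.
gpu_hours = 184_320
watts = 400
mwh = gpu_hours * watts / 1_000_000  # watt-hours -> megawatt-hours
print(mwh)  # 73.728 MWh
```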
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/).
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
0.03271484375,
-0.06243896484375,
0.020416259765625,
0.03680419921875,
0.00998687744140625,
0.0022144317626953125,
-0.058074951171875,
0.0030193328857421875,
0.0261688232421875,
-0.053009033203125,
-0.07525634765625,
0.043914794921875,
0.0185089111328125,
0.0303192138671875,
0.03179931640625,
0.0040130615234375,
0.0692138671875,
-0.0203704833984375,
0.07781982421875,
0.019378662109375,
-0.06671142578125,
0.04364013671875,
-0.042633056640625,
0.00785064697265625,
0.02569580078125,
0.0369873046875,
-0.0303497314453125,
-0.021087646484375,
-0.052825927734375,
-0.05621337890625,
0.032623291015625,
0.040069580078125,
0.015777587890625,
0.00554656982421875,
0.04339599609375,
-0.006313323974609375,
0.011444091796875,
-0.07073974609375,
-0.043304443359375,
-0.025970458984375,
-0.006011962890625,
0.0138397216796875,
-0.01413726806640625,
-0.018341064453125,
-0.042877197265625,
0.0665283203125,
-0.00867462158203125,
0.056243896484375,
0.01514434814453125,
0.012359619140625,
-0.01172637939453125,
0.01242828369140625,
0.03082275390625,
0.041839599609375,
-0.00907135009765625,
-0.0213165283203125,
0.025421142578125,
-0.060150146484375,
0.02008056640625,
0.017303466796875,
-0.0090179443359375,
-0.01264190673828125,
0.01244354248046875,
0.063232421875,
0.0008111000061035156,
-0.0275726318359375,
0.041656494140625,
-0.023529052734375,
-0.0229644775390625,
-0.02313232421875,
0.016387939453125,
0.02587890625,
0.03680419921875,
0.0294342041015625,
-0.0237274169921875,
0.0160064697265625,
-0.044952392578125,
0.0011310577392578125,
0.036865234375,
-0.0060577392578125,
-0.026824951171875,
0.05841064453125,
0.00408935546875,
-0.0016908645629882812,
0.0560302734375,
-0.0206298828125,
-0.032379150390625,
0.058746337890625,
0.04241943359375,
0.054840087890625,
-0.0136871337890625,
0.0230560302734375,
0.037384033203125,
0.0125274658203125,
-0.00687408447265625,
0.0305938720703125,
-0.0004303455352783203,
-0.0462646484375,
-0.0298614501953125,
-0.04876708984375,
-0.0288543701171875,
0.0206298828125,
-0.049041748046875,
0.014495849609375,
-0.031768798828125,
-0.034515380859375,
-0.0206298828125,
0.0212860107421875,
-0.03741455078125,
0.00930023193359375,
0.0034008026123046875,
0.06787109375,
-0.051055908203125,
0.0621337890625,
0.0377197265625,
-0.034210205078125,
-0.07403564453125,
-0.0198822021484375,
0.017547607421875,
-0.05584716796875,
0.01503753662109375,
0.00023758411407470703,
0.01788330078125,
-0.00206756591796875,
-0.060699462890625,
-0.0770263671875,
0.11529541015625,
0.0279998779296875,
-0.049896240234375,
-0.0089263916015625,
0.0011625289916992188,
0.03082275390625,
-0.0090179443359375,
0.052825927734375,
0.04150390625,
0.0264129638671875,
0.0181732177734375,
-0.08184814453125,
0.0316162109375,
-0.035552978515625,
0.006458282470703125,
0.007427215576171875,
-0.08099365234375,
0.07257080078125,
-0.00531768798828125,
-0.0138397216796875,
0.022705078125,
0.051727294921875,
0.038604736328125,
-0.001956939697265625,
0.031768798828125,
0.054595947265625,
0.05474853515625,
-0.02337646484375,
0.0806884765625,
-0.0139923095703125,
0.0421142578125,
0.055694580078125,
-0.0011749267578125,
0.06439208984375,
0.0187835693359375,
-0.056365966796875,
0.0550537109375,
0.07659912109375,
-0.00540924072265625,
0.03009033203125,
0.00218963623046875,
-0.0253753662109375,
-0.004642486572265625,
0.005825042724609375,
-0.05438232421875,
0.01532745361328125,
0.031524658203125,
-0.01197052001953125,
0.00658416748046875,
-0.0166778564453125,
0.006076812744140625,
-0.053741455078125,
-0.0047454833984375,
0.05426025390625,
0.0258636474609375,
-0.017913818359375,
0.07073974609375,
-0.0081939697265625,
0.055419921875,
-0.043701171875,
-0.01515960693359375,
-0.032958984375,
-0.00691986083984375,
-0.0223846435546875,
-0.053985595703125,
0.01203155517578125,
-0.0056915283203125,
-0.0008101463317871094,
0.0065460205078125,
0.055999755859375,
-0.0092315673828125,
-0.0255279541015625,
0.0289154052734375,
0.037322998046875,
0.027740478515625,
-0.00363922119140625,
-0.07537841796875,
0.025054931640625,
0.004161834716796875,
-0.053741455078125,
0.036590576171875,
0.024566650390625,
0.01385498046875,
0.06103515625,
0.049468994140625,
-0.012847900390625,
-0.0021514892578125,
-0.01100921630859375,
0.0811767578125,
-0.053436279296875,
-0.020233154296875,
-0.06512451171875,
0.047393798828125,
-0.0099029541015625,
-0.0294952392578125,
0.053070068359375,
0.0307769775390625,
0.04705810546875,
0.00897979736328125,
0.049591064453125,
-0.0301971435546875,
0.0122528076171875,
-0.0213165283203125,
0.0533447265625,
-0.0537109375,
0.0186767578125,
-0.03070068359375,
-0.057159423828125,
0.0024662017822265625,
0.06072998046875,
-0.0026950836181640625,
0.01549530029296875,
0.0262908935546875,
0.06671142578125,
0.0026187896728515625,
0.0138702392578125,
0.00954437255859375,
0.0309295654296875,
0.0172271728515625,
0.0635986328125,
0.06561279296875,
-0.0712890625,
0.049041748046875,
-0.0280914306640625,
-0.017486572265625,
-0.006076812744140625,
-0.06536865234375,
-0.0614013671875,
-0.03228759765625,
-0.038665771484375,
-0.04150390625,
-0.003658294677734375,
0.0675048828125,
0.059722900390625,
-0.041656494140625,
-0.0284271240234375,
-0.0017910003662109375,
0.004917144775390625,
-0.01438140869140625,
-0.020111083984375,
0.0164642333984375,
0.024505615234375,
-0.049468994140625,
0.017181396484375,
0.004261016845703125,
0.037017822265625,
-0.0164031982421875,
-0.0191192626953125,
-0.0191192626953125,
0.0024051666259765625,
0.046722412109375,
0.036468505859375,
-0.048126220703125,
-0.01526641845703125,
-0.01265716552734375,
-0.00792694091796875,
0.0210418701171875,
0.01373291015625,
-0.056182861328125,
-0.0108184814453125,
0.033294677734375,
0.0107879638671875,
0.0618896484375,
0.0007233619689941406,
0.0283203125,
-0.03759765625,
0.01348114013671875,
-0.002811431884765625,
0.026824951171875,
0.00899505615234375,
-0.039947509765625,
0.050445556640625,
0.0249481201171875,
-0.049896240234375,
-0.054046630859375,
-0.0080108642578125,
-0.08392333984375,
-0.01136016845703125,
0.08929443359375,
-0.0130462646484375,
-0.0253143310546875,
-0.0026798248291015625,
-0.0297698974609375,
0.0279083251953125,
-0.04254150390625,
0.03485107421875,
0.0325927734375,
-0.0172882080078125,
-0.02447509765625,
-0.051849365234375,
0.039276123046875,
0.00934600830078125,
-0.0733642578125,
-0.0026950836181640625,
0.0360107421875,
0.038909912109375,
0.00640869140625,
0.069580078125,
-0.0096893310546875,
0.0219879150390625,
0.007144927978515625,
0.0027675628662109375,
0.006824493408203125,
0.0076904296875,
-0.0182037353515625,
-0.00881195068359375,
-0.014617919921875,
-0.0011663436889648438
]
] |
pysentimiento/robertuito-emotion-analysis | 2023-02-20T19:04:28.000Z | [
"pysentimiento",
"pytorch",
"roberta",
"emotion-analysis",
"twitter",
"es",
"arxiv:2106.09462",
"region:us"
] | null | pysentimiento | null | null | pysentimiento/robertuito-emotion-analysis | 13 | 101,688 | pysentimiento | 2022-03-02T23:29:05 | ---
language:
- es
library_name: pysentimiento
tags:
- emotion-analysis
- twitter
---
# Emotion Analysis in Spanish
## robertuito-emotion-analysis
Repository: [https://github.com/pysentimiento/pysentimiento/](https://github.com/pysentimiento/pysentimiento/)
Model trained on the TASS 2020 Task 2 corpus for emotion detection in Spanish. The base model is [RoBERTuito](https://github.com/pysentimiento/robertuito), a RoBERTa model trained on Spanish tweets.
Contains the six Ekman emotions plus a neutral class:
- anger
- disgust
- fear
- joy
- sadness
- surprise
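The classifier head emits one raw logit per class; a softmax turns these into probabilities and the argmax picks the predicted emotion. A minimal pure-Python sketch with hypothetical logit values (illustrative only, not actual model outputs):

```python
import math

LABELS = ["anger", "disgust", "fear", "joy", "sadness", "surprise", "neutral"]

def softmax(logits):
    """Convert raw logits into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one input tweet (illustrative values).
logits = [0.2, -1.1, -0.8, 3.4, 0.1, -0.5, 1.2]
probs = softmax(logits)
prediction = LABELS[probs.index(max(probs))]
print(prediction)  # the class with the largest logit wins: "joy"
```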
## Results
Results for the four tasks evaluated in `pysentimiento`, expressed as Macro F1 scores:
| model | emotion | hate_speech | irony | sentiment |
|:--------------|:--------------|:--------------|:--------------|:--------------|
| robertuito | 0.560 ± 0.010 | 0.759 ± 0.007 | 0.739 ± 0.005 | 0.705 ± 0.003 |
| roberta | 0.527 ± 0.015 | 0.741 ± 0.012 | 0.721 ± 0.008 | 0.670 ± 0.006 |
| bertin | 0.524 ± 0.007 | 0.738 ± 0.007 | 0.713 ± 0.012 | 0.666 ± 0.005 |
| beto_uncased | 0.532 ± 0.012 | 0.727 ± 0.016 | 0.701 ± 0.007 | 0.651 ± 0.006 |
| beto_cased | 0.516 ± 0.012 | 0.724 ± 0.012 | 0.705 ± 0.009 | 0.662 ± 0.005 |
| mbert_uncased | 0.493 ± 0.010 | 0.718 ± 0.011 | 0.681 ± 0.010 | 0.617 ± 0.003 |
| biGRU | 0.264 ± 0.007 | 0.592 ± 0.018 | 0.631 ± 0.011 | 0.585 ± 0.011 |
Note that for Hate Speech, these are the results for SemEval 2019 Task 5, Subtask B (HS+TR+AG detection).
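Macro F1 averages the per-class F1 scores, so every emotion counts equally regardless of how frequent it is in the test set. A minimal pure-Python sketch of the metric on toy labels (illustrative data, not the benchmark sets):

```python
from collections import defaultdict

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores."""
    tp, fp, fn = defaultdict(int), defaultdict(int), defaultdict(int)
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1
            fn[t] += 1
    f1s = []
    for c in set(y_true) | set(y_pred):
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

# Toy example: four samples, one misclassified.
y_true = ["joy", "joy", "anger", "anger"]
y_pred = ["joy", "anger", "anger", "anger"]
print(round(macro_f1(y_true, y_pred), 3))  # → 0.733
```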
## Citation
If you use this model in your research, please cite the pysentimiento, RoBERTuito, and EmoEvent papers:
```
@misc{perez2021pysentimiento,
title={pysentimiento: A Python Toolkit for Sentiment Analysis and SocialNLP tasks},
author={Juan Manuel Pérez and Juan Carlos Giudici and Franco Luque},
year={2021},
eprint={2106.09462},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@inproceedings{del2020emoevent,
title={EmoEvent: A multilingual emotion corpus based on different events},
author={del Arco, Flor Miriam Plaza and Strapparava, Carlo and Lopez, L Alfonso Urena and Mart{\'\i}n-Valdivia, M Teresa},
booktitle={Proceedings of the 12th Language Resources and Evaluation Conference},
pages={1492--1498},
year={2020}
}
@inproceedings{perez-etal-2022-robertuito,
title = "{R}o{BERT}uito: a pre-trained language model for social media text in {S}panish",
author = "P{\'e}rez, Juan Manuel and
Furman, Dami{\'a}n Ariel and
Alonso Alemany, Laura and
Luque, Franco M.",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.785",
pages = "7235--7243",
abstract = "Since BERT appeared, Transformer language models and transfer learning have become state-of-the-art for natural language processing tasks. Recently, some works geared towards pre-training specially-crafted models for particular domains, such as scientific papers, medical documents, user-generated texts, among others. These domain-specific models have been shown to improve performance significantly in most tasks; however, for languages other than English, such models are not widely available. In this work, we present RoBERTuito, a pre-trained language model for user-generated text in Spanish, trained on over 500 million tweets. Experiments on a benchmark of tasks involving user-generated text showed that RoBERTuito outperformed other pre-trained language models in Spanish. In addition to this, our model has some cross-lingual abilities, achieving top results for English-Spanish tasks of the Linguistic Code-Switching Evaluation benchmark (LinCE) and also competitive performance against monolingual models in English Twitter tasks. To facilitate further research, we make RoBERTuito publicly available at the HuggingFace model hub together with the dataset used to pre-train it.",
}
``` | 4,100 | [
[
-0.0215911865234375,
-0.05377197265625,
0.0221405029296875,
0.033050537109375,
-0.0230712890625,
0.0157470703125,
-0.04290771484375,
-0.04913330078125,
0.04296875,
0.0159759521484375,
-0.0413818359375,
-0.0645751953125,
-0.0745849609375,
0.01629638671875,
-0.0237884521484375,
0.07843017578125,
0.00302886962890625,
0.0047149658203125,
0.020751953125,
-0.0110931396484375,
0.0183258056640625,
-0.033599853515625,
-0.05303955078125,
-0.007587432861328125,
0.05633544921875,
0.022064208984375,
0.0333251953125,
0.004924774169921875,
0.02813720703125,
0.025177001953125,
-0.0078277587890625,
-0.0021152496337890625,
-0.0258026123046875,
0.002410888671875,
-0.007518768310546875,
-0.024932861328125,
-0.02874755859375,
0.0028133392333984375,
0.037200927734375,
0.022064208984375,
0.005260467529296875,
0.01079559326171875,
0.00205230712890625,
0.048126220703125,
-0.032928466796875,
0.0107269287109375,
-0.04034423828125,
-0.0008311271667480469,
-0.025909423828125,
-0.0036525726318359375,
-0.02398681640625,
-0.045013427734375,
0.0111236572265625,
-0.017486572265625,
0.00945281982421875,
-0.00496673583984375,
0.086181640625,
0.01457977294921875,
-0.0185699462890625,
-0.03155517578125,
-0.03143310546875,
0.07415771484375,
-0.0633544921875,
0.0281524658203125,
0.0179290771484375,
0.0009927749633789062,
0.002696990966796875,
-0.025238037109375,
-0.058013916015625,
-0.01119232177734375,
0.006526947021484375,
0.020721435546875,
-0.0440673828125,
-0.002658843994140625,
-0.00264739990234375,
0.006832122802734375,
-0.037078857421875,
0.0145111083984375,
-0.0302581787109375,
-0.005878448486328125,
0.0477294921875,
-0.0188751220703125,
0.029876708984375,
-0.0311279296875,
-0.006221771240234375,
-0.026092529296875,
-0.02197265625,
-0.006816864013671875,
0.042144775390625,
0.03033447265625,
-0.031829833984375,
0.027679443359375,
0.001682281494140625,
0.027374267578125,
-0.0122222900390625,
0.004177093505859375,
0.05718994140625,
0.0015611648559570312,
-0.016021728515625,
-0.027740478515625,
0.0977783203125,
0.0227203369140625,
0.048431396484375,
-0.00814056396484375,
0.0010709762573242188,
0.0172576904296875,
0.00934600830078125,
-0.055206298828125,
-0.01561737060546875,
0.0258636474609375,
-0.0255126953125,
-0.026031494140625,
-0.00536346435546875,
-0.080078125,
-0.0168914794921875,
-0.00592803955078125,
0.0222015380859375,
-0.035491943359375,
-0.043365478515625,
0.0019779205322265625,
-0.002483367919921875,
0.0020084381103515625,
0.01568603515625,
-0.050689697265625,
0.00972747802734375,
0.025390625,
0.058502197265625,
-0.023193359375,
-0.02716064453125,
-0.0124664306640625,
-0.0267791748046875,
-0.0088043212890625,
0.061798095703125,
-0.026092529296875,
-0.00775909423828125,
0.01025390625,
0.01280975341796875,
-0.0173187255859375,
-0.028564453125,
0.056304931640625,
-0.0250701904296875,
0.032501220703125,
-0.0169219970703125,
-0.0186614990234375,
-0.0259552001953125,
0.004657745361328125,
-0.03924560546875,
0.10015869140625,
0.0135498046875,
-0.061614990234375,
0.004825592041015625,
-0.057220458984375,
-0.04229736328125,
-0.0165557861328125,
-0.003749847412109375,
-0.03515625,
-0.0059661865234375,
0.00852203369140625,
0.04986572265625,
-0.0280914306640625,
0.0232696533203125,
-0.0275421142578125,
0.01181793212890625,
0.0171051025390625,
0.0009055137634277344,
0.07794189453125,
0.0240325927734375,
-0.05072021484375,
0.01212310791015625,
-0.0555419921875,
-0.007633209228515625,
0.0264129638671875,
-0.007289886474609375,
-0.0305023193359375,
-0.005741119384765625,
0.0189056396484375,
0.03857421875,
0.0250244140625,
-0.06512451171875,
-0.0275726318359375,
-0.03668212890625,
0.0186614990234375,
0.055206298828125,
-0.0163116455078125,
0.019256591796875,
-0.00948333740234375,
0.0540771484375,
-0.0035762786865234375,
0.01096343994140625,
0.01172637939453125,
-0.0408935546875,
-0.05841064453125,
-0.0215606689453125,
0.0012254714965820312,
0.051971435546875,
-0.0443115234375,
0.039154052734375,
-0.00555419921875,
-0.049896240234375,
-0.03973388671875,
0.0011043548583984375,
0.035400390625,
0.040863037109375,
0.036773681640625,
-0.0008568763732910156,
-0.0750732421875,
-0.059661865234375,
-0.031402587890625,
-0.0172576904296875,
0.006336212158203125,
0.023162841796875,
0.048126220703125,
-0.0202178955078125,
0.05377197265625,
-0.0323486328125,
-0.01806640625,
-0.03729248046875,
0.024566650390625,
0.0171661376953125,
0.0211181640625,
0.06207275390625,
-0.05242919921875,
-0.066650390625,
0.01324462890625,
-0.05712890625,
-0.0297698974609375,
0.0213623046875,
-0.01403045654296875,
0.04400634765625,
0.0215301513671875,
-0.01654052734375,
0.019073486328125,
0.0604248046875,
-0.02581787109375,
0.0298309326171875,
0.004547119140625,
0.0297088623046875,
-0.099853515625,
0.0003516674041748047,
0.0318603515625,
-0.011016845703125,
-0.054931640625,
-0.0199432373046875,
-0.006313323974609375,
0.010986328125,
-0.051605224609375,
0.050506591796875,
-0.0267333984375,
0.00867462158203125,
-0.00833892822265625,
0.0177459716796875,
-0.020751953125,
0.053619384765625,
0.0148162841796875,
0.041839599609375,
0.0390625,
-0.03021240234375,
0.0095062255859375,
0.0180206298828125,
-0.0289459228515625,
0.037200927734375,
-0.058197021484375,
0.0050811767578125,
-0.0186309814453125,
-0.01251983642578125,
-0.078125,
-0.006351470947265625,
0.0194854736328125,
-0.05767822265625,
0.00933837890625,
-0.00968170166015625,
-0.041473388671875,
-0.037567138671875,
-0.035552978515625,
0.0015134811401367188,
0.04315185546875,
-0.0333251953125,
0.0555419921875,
0.049041748046875,
-0.0252532958984375,
-0.0416259765625,
-0.061065673828125,
0.0008006095886230469,
-0.03302001953125,
-0.0582275390625,
0.01180267333984375,
-0.00798797607421875,
-0.0187530517578125,
-0.007091522216796875,
0.022064208984375,
-0.007213592529296875,
0.006000518798828125,
0.0212249755859375,
0.0246124267578125,
-0.002407073974609375,
0.0008683204650878906,
0.01189422607421875,
0.01120758056640625,
0.009521484375,
0.0039043426513671875,
0.059661865234375,
-0.015960693359375,
0.0045928955078125,
-0.035736083984375,
0.0239410400390625,
0.03692626953125,
-0.0162200927734375,
0.06451416015625,
0.053192138671875,
-0.03167724609375,
-0.010772705078125,
-0.045166015625,
0.006061553955078125,
-0.033416748046875,
0.0389404296875,
-0.01288604736328125,
-0.08160400390625,
0.059814453125,
0.01971435546875,
-0.004840850830078125,
0.05279541015625,
0.059234619140625,
-0.019256591796875,
0.06280517578125,
0.044769287109375,
-0.0115814208984375,
0.06591796875,
-0.0210113525390625,
0.0216827392578125,
-0.04913330078125,
-0.022003173828125,
-0.0655517578125,
-0.019195556640625,
-0.046905517578125,
-0.02740478515625,
0.0166015625,
-0.01824951171875,
-0.020355224609375,
0.047149658203125,
-0.0277862548828125,
0.0312347412109375,
0.0299224853515625,
0.0007419586181640625,
-0.00003618001937866211,
0.007354736328125,
0.00897979736328125,
-0.03009033203125,
-0.045379638671875,
-0.034576416015625,
0.0771484375,
0.0297393798828125,
0.04034423828125,
0.0144195556640625,
0.0635986328125,
0.02435302734375,
0.03338623046875,
-0.051177978515625,
0.038787841796875,
-0.034912109375,
-0.038421630859375,
-0.0237274169921875,
-0.037139892578125,
-0.07470703125,
0.022796630859375,
-0.010406494140625,
-0.078125,
0.024017333984375,
0.0019464492797851562,
-0.0154571533203125,
0.0157012939453125,
-0.057769775390625,
0.07684326171875,
-0.015777587890625,
-0.0114898681640625,
-0.0171051025390625,
-0.041900634765625,
0.01322174072265625,
0.002185821533203125,
0.036163330078125,
-0.0212249755859375,
0.00960540771484375,
0.0863037109375,
-0.0225830078125,
0.067138671875,
-0.00716400146484375,
-0.0123291015625,
0.02557373046875,
-0.00037407875061035156,
0.03558349609375,
-0.0203857421875,
-0.011383056640625,
0.01421356201171875,
-0.01192474365234375,
-0.0129547119140625,
-0.02691650390625,
0.05078125,
-0.06524658203125,
-0.01081085205078125,
-0.030975341796875,
-0.0258026123046875,
-0.004940032958984375,
0.01326751708984375,
0.029327392578125,
0.01107025146484375,
-0.0188140869140625,
0.00780487060546875,
0.0460205078125,
-0.03515625,
0.032379150390625,
0.051849365234375,
-0.0005731582641601562,
-0.037200927734375,
0.06024169921875,
0.01104736328125,
0.017059326171875,
0.0250244140625,
0.0215911865234375,
-0.024749755859375,
-0.0174102783203125,
-0.00739288330078125,
0.044281005859375,
-0.044097900390625,
-0.01178741455078125,
-0.08721923828125,
0.006649017333984375,
-0.04278564453125,
-0.01496124267578125,
-0.03924560546875,
-0.02825927734375,
-0.0311431884765625,
-0.0184783935546875,
0.036590576171875,
0.03515625,
-0.01389312744140625,
0.02777099609375,
-0.040618896484375,
0.0216522216796875,
-0.007228851318359375,
0.01444244384765625,
0.004589080810546875,
-0.060516357421875,
-0.02142333984375,
-0.0015134811401367188,
-0.0104827880859375,
-0.0823974609375,
0.0628662109375,
0.01013946533203125,
0.02911376953125,
0.0168914794921875,
0.00434112548828125,
0.034912109375,
-0.029052734375,
0.0472412109375,
0.022247314453125,
-0.07135009765625,
0.0650634765625,
-0.0305633544921875,
-0.0015363693237304688,
0.047576904296875,
0.0570068359375,
-0.049072265625,
-0.061614990234375,
-0.07000732421875,
-0.07080078125,
0.072998046875,
0.016021728515625,
0.01190185546875,
-0.0238037109375,
-0.01035308837890625,
-0.01004791259765625,
0.01375579833984375,
-0.08111572265625,
-0.0298004150390625,
-0.00791168212890625,
-0.040618896484375,
-0.0011777877807617188,
-0.0256805419921875,
0.00225830078125,
-0.024658203125,
0.068115234375,
0.0115509033203125,
0.028778076171875,
0.00687408447265625,
-0.0267181396484375,
-0.00458526611328125,
0.0203399658203125,
0.049652099609375,
0.0250701904296875,
-0.02679443359375,
0.006443023681640625,
0.005725860595703125,
-0.033050537109375,
-0.01052093505859375,
0.0166015625,
-0.00154876708984375,
0.01751708984375,
0.02197265625,
0.0638427734375,
0.0142364501953125,
-0.055084228515625,
0.047607421875,
0.002269744873046875,
-0.02154541015625,
-0.0297393798828125,
-0.01125335693359375,
-0.009033203125,
0.027130126953125,
0.0296630859375,
-0.00983428955078125,
0.002593994140625,
-0.039581298828125,
0.0067138671875,
0.0184783935546875,
-0.0239410400390625,
-0.043304443359375,
0.0487060546875,
0.0142669677734375,
-0.0262298583984375,
0.01079559326171875,
-0.03173828125,
-0.07501220703125,
0.0523681640625,
0.0377197265625,
0.08221435546875,
-0.0305633544921875,
0.03936767578125,
0.055267333984375,
0.03326416015625,
-0.0011224746704101562,
0.047393798828125,
0.00937652587890625,
-0.0733642578125,
-0.0259857177734375,
-0.054656982421875,
-0.0204315185546875,
0.01238250732421875,
-0.043853759765625,
0.017364501953125,
-0.0257110595703125,
-0.006744384765625,
0.00492095947265625,
-0.002902984619140625,
-0.044097900390625,
0.0220489501953125,
0.0095062255859375,
0.0509033203125,
-0.087890625,
0.059600830078125,
0.053131103515625,
-0.03570556640625,
-0.0543212890625,
-0.0183868408203125,
-0.0004031658172607422,
-0.05426025390625,
0.047393798828125,
-0.000835418701171875,
-0.0169219970703125,
-0.00021445751190185547,
-0.033233642578125,
-0.06591796875,
0.05133056640625,
0.0288543701171875,
-0.0289459228515625,
0.0154266357421875,
0.007598876953125,
0.0704345703125,
-0.019989013671875,
0.032135009765625,
0.038543701171875,
0.032135009765625,
0.0021648406982421875,
-0.05755615234375,
-0.005268096923828125,
-0.035186767578125,
-0.0106353759765625,
0.0205841064453125,
-0.048431396484375,
0.0721435546875,
-0.0035190582275390625,
-0.02215576171875,
-0.0229034423828125,
0.056396484375,
0.0048828125,
0.014068603515625,
0.031036376953125,
0.040191650390625,
0.058868408203125,
-0.01172637939453125,
0.081298828125,
-0.0247650146484375,
0.04119873046875,
0.0845947265625,
-0.006649017333984375,
0.0633544921875,
0.01922607421875,
-0.033843994140625,
0.052154541015625,
0.03314208984375,
0.0172882080078125,
0.0282440185546875,
-0.00821685791015625,
-0.017547607421875,
0.0014772415161132812,
-0.01024627685546875,
-0.017791748046875,
0.02081298828125,
0.01812744140625,
-0.034210205078125,
-0.0091705322265625,
0.0147857666015625,
0.0408935546875,
0.0245361328125,
-0.00623321533203125,
0.03778076171875,
0.0035762786865234375,
-0.032867431640625,
0.04986572265625,
0.001987457275390625,
0.0770263671875,
-0.0401611328125,
0.032318115234375,
-0.0171661376953125,
0.0031833648681640625,
-0.0283966064453125,
-0.06689453125,
0.0362548828125,
0.0390625,
-0.00737762451171875,
-0.0281982421875,
0.0390625,
-0.042633056640625,
-0.034576416015625,
0.048675537109375,
0.0250701904296875,
0.0201873779296875,
-0.00865936279296875,
-0.060821533203125,
0.0165252685546875,
0.0184478759765625,
-0.036041259765625,
0.01271820068359375,
0.039886474609375,
-0.0028667449951171875,
0.03778076171875,
0.032958984375,
0.0115203857421875,
0.0204010009765625,
0.03265380859375,
0.056243896484375,
-0.044403076171875,
-0.0369873046875,
-0.06500244140625,
0.039581298828125,
-0.018035888671875,
-0.033966064453125,
0.06573486328125,
0.04058837890625,
0.0650634765625,
-0.01233673095703125,
0.06134033203125,
-0.032501220703125,
0.068603515625,
-0.0134735107421875,
0.041748046875,
-0.046905517578125,
-0.0154266357421875,
-0.055084228515625,
-0.0540771484375,
-0.033843994140625,
0.0640869140625,
-0.043548583984375,
-0.00431060791015625,
0.054656982421875,
0.060882568359375,
0.0164337158203125,
-0.0133209228515625,
0.00817108154296875,
0.038299560546875,
0.0145416259765625,
0.0443115234375,
0.0430908203125,
-0.035736083984375,
0.0299530029296875,
-0.0263214111328125,
-0.0244293212890625,
-0.010833740234375,
-0.061798095703125,
-0.07147216796875,
-0.059326171875,
-0.036041259765625,
-0.036376953125,
0.0013723373413085938,
0.078857421875,
0.019989013671875,
-0.072509765625,
-0.0288543701171875,
0.00682830810546875,
0.01294708251953125,
0.0142364501953125,
-0.0181121826171875,
0.0248565673828125,
-0.0196075439453125,
-0.07843017578125,
0.0157928466796875,
0.01166534423828125,
0.0078582763671875,
0.01264190673828125,
-0.00862884521484375,
-0.03515625,
0.00853729248046875,
0.050933837890625,
0.03497314453125,
-0.043792724609375,
-0.0176544189453125,
0.00879669189453125,
-0.01263427734375,
0.0139617919921875,
0.0240325927734375,
-0.035400390625,
0.0111236572265625,
0.038238525390625,
0.03802490234375,
0.039154052734375,
-0.0107421875,
0.015838623046875,
-0.05194091796875,
0.030914306640625,
0.030487060546875,
0.0254974365234375,
0.0247955322265625,
-0.0191497802734375,
0.04132080078125,
0.01335906982421875,
-0.03228759765625,
-0.06793212890625,
0.0032138824462890625,
-0.0927734375,
-0.0035076141357421875,
0.0970458984375,
0.00003904104232788086,
-0.030120849609375,
0.01459503173828125,
-0.0110931396484375,
0.032867431640625,
-0.04931640625,
0.056121826171875,
0.05255126953125,
-0.0011739730834960938,
-0.0257568359375,
-0.04266357421875,
0.03607177734375,
0.035980224609375,
-0.07440185546875,
-0.0033397674560546875,
0.0218048095703125,
0.01702880859375,
0.01108551025390625,
0.060302734375,
-0.0131988525390625,
0.017425537109375,
-0.0311431884765625,
0.043853759765625,
0.0230255126953125,
-0.02001953125,
-0.02386474609375,
-0.006488800048828125,
-0.0169219970703125,
-0.005451202392578125
]
] |
jonatasgrosman/wav2vec2-large-xlsr-53-chinese-zh-cn | 2022-12-14T01:58:32.000Z | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"zh",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | jonatasgrosman | null | null | jonatasgrosman/wav2vec2-large-xlsr-53-chinese-zh-cn | 53 | 101,491 | transformers | 2022-03-02T23:29:05 | ---
language: zh
datasets:
- common_voice
metrics:
- wer
- cer
tags:
- audio
- automatic-speech-recognition
- speech
- xlsr-fine-tuning-week
license: apache-2.0
model-index:
- name: XLSR Wav2Vec2 Chinese (zh-CN) by Jonatas Grosman
results:
- task:
name: Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice zh-CN
type: common_voice
args: zh-CN
metrics:
- name: Test WER
type: wer
value: 82.37
- name: Test CER
type: cer
value: 19.03
---
# Fine-tuned XLSR-53 large model for speech recognition in Chinese
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Chinese using the train and validation splits of [Common Voice 6.1](https://huggingface.co/datasets/common_voice), [CSS10](https://github.com/Kyubyong/css10) and [ST-CMDS](http://www.openslr.org/38/).
When using this model, make sure that your speech input is sampled at 16kHz.
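The `librosa.load(..., sr=16_000)` calls in the snippets below resample on the fly. As a rough illustration of what resampling means, here is a naive linear-interpolation downsampler in pure Python on a toy signal (not production quality: real resamplers apply an anti-aliasing filter first):

```python
def resample_linear(signal, sr_in, sr_out):
    """Naive linear-interpolation resampling (no anti-aliasing)."""
    n_out = int(len(signal) * sr_out / sr_in)
    out = []
    for i in range(n_out):
        # Fractional position of output sample i in the input signal.
        pos = i * (len(signal) - 1) / max(n_out - 1, 1)
        lo = int(pos)
        hi = min(lo + 1, len(signal) - 1)
        frac = pos - lo
        out.append(signal[lo] * (1 - frac) + signal[hi] * frac)
    return out

# A 1-second toy "signal" at 48 kHz becomes 16k samples at 16 kHz.
signal_48k = [0.0] * 48_000
signal_16k = resample_linear(signal_48k, 48_000, 16_000)
print(len(signal_16k))  # 16000
```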
This model has been fine-tuned thanks to the GPU credits generously given by the [OVHcloud](https://www.ovhcloud.com/en/public-cloud/ai-training/) :)
The script used for training can be found here: https://github.com/jonatasgrosman/wav2vec2-sprint
## Usage
The model can be used directly (without a language model) as follows...
Using the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) library:
```python
from huggingsound import SpeechRecognitionModel
model = SpeechRecognitionModel("jonatasgrosman/wav2vec2-large-xlsr-53-chinese-zh-cn")
audio_paths = ["/path/to/file.mp3", "/path/to/another_file.wav"]
transcriptions = model.transcribe(audio_paths)
```
Writing your own inference script:
```python
import torch
import librosa
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
LANG_ID = "zh-CN"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-chinese-zh-cn"
SAMPLES = 10
test_dataset = load_dataset("common_voice", LANG_ID, split=f"test[:{SAMPLES}]")
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
batch["speech"] = speech_array
batch["sentence"] = batch["sentence"].upper()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
predicted_sentences = processor.batch_decode(predicted_ids)
for i, predicted_sentence in enumerate(predicted_sentences):
print("-" * 100)
print("Reference:", test_dataset[i]["sentence"])
print("Prediction:", predicted_sentence)
```
| Reference | Prediction |
| ------------- | ------------- |
| 宋朝末年年间定居粉岭围。 | 宋朝末年年间定居分定为 |
| 渐渐行动不便 | 建境行动不片 |
| 二十一年去世。 | 二十一年去世 |
| 他们自称恰哈拉。 | 他们自称家哈<unk> |
| 局部干涩的例子包括有口干、眼睛干燥、及阴道干燥。 | 菊物干寺的例子包括有口肝眼睛干照以及阴到干<unk> |
| 嘉靖三十八年,登进士第三甲第二名。 | 嘉靖三十八年登进士第三甲第二名 |
| 这一名称一直沿用至今。 | 这一名称一直沿用是心 |
| 同时乔凡尼还得到包税合同和许多明矾矿的经营权。 | 同时桥凡妮还得到包税合同和许多民繁矿的经营权 |
| 为了惩罚西扎城和塞尔柱的结盟,盟军在抵达后将外城烧毁。 | 为了曾罚西扎城和塞尔素的节盟盟军在抵达后将外曾烧毁 |
| 河内盛产黄色无鱼鳞的鳍射鱼。 | 合类生场环色无鱼林的骑射鱼 |
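The `torch.argmax` followed by `processor.batch_decode` in the script above amounts to greedy CTC decoding: collapse consecutive repeated tokens, then drop the blank token. A minimal pure-Python sketch of that rule with a toy vocabulary (illustrative token IDs, not the model's real tokenizer):

```python
BLANK = 0
# Toy id-to-character table (the real vocabulary is much larger).
VOCAB = {1: "宋", 2: "朝", 3: "末"}

def ctc_greedy_decode(ids, blank=BLANK):
    """Collapse consecutive duplicates, then remove blanks."""
    out = []
    prev = None
    for i in ids:
        if i != prev and i != blank:
            out.append(VOCAB[i])
        prev = i
    return "".join(out)

# Per-frame argmax ids: repeats and blanks come from the CTC alignment.
frame_ids = [1, 1, 0, 2, 2, 2, 0, 0, 3, 3]
print(ctc_greedy_decode(frame_ids))  # 宋朝末
```

Note that a blank between two identical tokens keeps them distinct, which is how CTC can emit doubled characters.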
## Evaluation
The model can be evaluated as follows on the Chinese (zh-CN) test data of Common Voice.
```python
import torch
import re
import warnings
import librosa
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
LANG_ID = "zh-CN"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-chinese-zh-cn"
DEVICE = "cuda"
CHARS_TO_IGNORE = [",", "?", "¿", ".", "!", "¡", ";", ";", ":", '""', "%", '"', "�", "ʿ", "·", "჻", "~", "՞",
"؟", "،", "।", "॥", "«", "»", "„", "“", "”", "「", "」", "‘", "’", "《", "》", "(", ")", "[", "]",
"{", "}", "=", "`", "_", "+", "<", ">", "…", "–", "°", "´", "ʾ", "‹", "›", "©", "®", "—", "→", "。",
"、", "﹂", "﹁", "‧", "~", "﹏", ",", "{", "}", "(", ")", "[", "]", "【", "】", "‥", "〽",
"『", "』", "〝", "〟", "⟨", "⟩", "〜", ":", "!", "?", "♪", "؛", "/", "\\", "º", "−", "^", "'", "ʻ", "ˆ"]
test_dataset = load_dataset("common_voice", LANG_ID, split="test")
wer = load_metric("wer.py") # https://github.com/jonatasgrosman/wav2vec2-sprint/blob/main/wer.py
cer = load_metric("cer.py") # https://github.com/jonatasgrosman/wav2vec2-sprint/blob/main/cer.py
chars_to_ignore_regex = f"[{re.escape(''.join(CHARS_TO_IGNORE))}]"
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.to(DEVICE)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
with warnings.catch_warnings():
warnings.simplefilter("ignore")
speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
batch["speech"] = speech_array
batch["sentence"] = re.sub(chars_to_ignore_regex, "", batch["sentence"]).upper()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Running the model predictions on the preprocessed dataset
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to(DEVICE), attention_mask=inputs.attention_mask.to(DEVICE)).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
predictions = [x.upper() for x in result["pred_strings"]]
references = [x.upper() for x in result["sentence"]]
print(f"WER: {wer.compute(predictions=predictions, references=references, chunk_size=1000) * 100}")
print(f"CER: {cer.compute(predictions=predictions, references=references, chunk_size=1000) * 100}")
```
**Test Result**:
In the table below I report the Word Error Rate (WER) and the Character Error Rate (CER) of the model. I ran the evaluation script described above on other models as well (on 2021-05-13). Note that the table below may show results different from those already reported elsewhere; this may be caused by specifics of the other evaluation scripts used.
| Model | WER | CER |
| ------------- | ------------- | ------------- |
| jonatasgrosman/wav2vec2-large-xlsr-53-chinese-zh-cn | **82.37%** | **19.03%** |
| ydshieh/wav2vec2-large-xlsr-53-chinese-zh-cn-gpt | 84.01% | 20.95% |
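Both scores are computed on normalized transcripts; the punctuation stripping from the evaluation script above can be exercised on its own. This sketch mirrors the `chars_to_ignore_regex` construction, with a shortened stand-in for the full `CHARS_TO_IGNORE` list:

```python
import re

# Shortened stand-in for the CHARS_TO_IGNORE list in the evaluation script.
CHARS_TO_IGNORE = [",", "?", ".", "!", "。", "、", "「", "」", ":", ";"]
chars_to_ignore_regex = f"[{re.escape(''.join(CHARS_TO_IGNORE))}]"

def normalize(sentence: str) -> str:
    """Drop ignored punctuation and upper-case, as in speech_file_to_array_fn."""
    return re.sub(chars_to_ignore_regex, "", sentence).upper()

print(normalize("二十一年去世。"))   # prints 二十一年去世
print(normalize("Hello, world!"))  # prints HELLO WORLD
```

Because the WER/CER scripts compare normalized text, differences in this normalization step are one plausible source of the score discrepancies mentioned above.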
## Citation
If you want to cite this model, you can use the following:
```bibtex
@misc{grosman2021xlsr53-large-chinese,
title={Fine-tuned {XLSR}-53 large model for speech recognition in {C}hinese},
author={Grosman, Jonatas},
howpublished={\url{https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-chinese-zh-cn}},
year={2021}
}
```
gsdf/Counterfeit-V2.5 | 2023-03-14T17:41:46.000Z | ["diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us"] | text-to-image | gsdf | null | null | gsdf/Counterfeit-V2.5 | 1,469 | 101,398 | diffusers | 2023-02-02T14:02:11 |

---
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
---
# Update
V2.5 has been updated for ease of use as an anime-style model.
I use this embedding for negative prompts:
https://huggingface.co/datasets/gsdf/EasyNegative
Shared by-products:
V2.1… feels similar in use to V2.0
V2.2… NSFW model
# Counterfeit-V2.5 e.g.

```
((masterpiece,best quality)),1girl, solo, animal ears, rabbit, barefoot, knees up, dress, sitting, rabbit ears, short sleeves, looking at viewer, grass, short hair, smile, white hair, puffy sleeves, outdoors, puffy short sleeves, bangs, on ground, full body, animal, white dress, sunlight, brown eyes, dappled sunlight, day, depth of field
Negative prompt: EasyNegative, extra fingers,fewer fingers,
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 10, Size: 448x768, Denoising strength: 0.6, Hires upscale: 1.8, Hires upscaler: Latent
```

```
((masterpiece,best quality)),1girl, from below, solo, school uniform, serafuku, sky, cloud, black hair, skirt, sailor collar, looking at viewer, short hair, building, bangs, neckerchief, long sleeves, cloudy sky, power lines, shirt, cityscape, pleated skirt, scenery, blunt bangs, city, night, black sailor collar, closed mouth, black skirt, medium hair, school bag , holding bag
Negative prompt: EasyNegative, extra fingers,fewer fingers,
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 10, Size: 832x512, Denoising strength: 0.6, Hires upscale: 1.8, Hires upscaler: Latent
```

```
((masterpiece,best quality)),2girls, black kimono, black legwear, black ribbon, black hair, cherry blossoms, day, flower, hair bun, hair ribbon, japanese clothes, kimono, long hair, looking at viewer, looking back, multiple girls, obi, outdoors, red eyes, red hair, ribbon, sandals, single hair bun, stairs, standing, statue, torii, tree, white kimono, yellow eyes
Negative prompt: EasyNegative, extra fingers,fewer fingers,
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 10, Size: 640x960, Denoising strength: 0.58, Hires upscale: 1.8, Hires upscaler: Latent
```

```
((masterpiece,best quality)),1girl, bangs, blue eyes, blurry background, branch, brown hair, dappled sunlight, flower, from side, hair flower, hair ornament, japanese clothes, kimono, leaf, (maple leaf:1.9), obi, outdoors, sash, solo, sunlight, upper body
Negative prompt: EasyNegative, extra fingers,fewer fingers,
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 10, Size: 864x512, Denoising strength: 0.58, Hires upscale: 1.8, Hires upscaler: Latent
```

```
((masterpiece,best quality))1girl, solo, black skirt, blue eyes, electric guitar, guitar, headphones, holding, holding plectrum, instrument, long hair, , music, one side up, pink hair, playing guiter, pleated skirt, black shirt, indoors
Negative prompt: EasyNegative, extra fingers,fewer fingers,
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 10, Size: 864x512, Denoising strength: 0.58, Hires upscale: 1.8, Hires upscaler: Latent
```

```
((masterpiece,best quality)), 1girl, food, fruit, solo, skirt, shop, indoors, jacket, shopping, basket, jewelry, shirt, shelf, short hair, black hair, plaid skirt, black jacket, dutch angle, yellow eyes, looking at viewer
Negative prompt: EasyNegative, extra fingers,fewer fingers,
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 10, Size: 864x512, Denoising strength: 0.58, Hires upscale: 1.8, Hires upscaler: Latent
```
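Each example above ends with an A1111-style parameter line ("Steps: 20, Sampler: …"). A minimal parser sketch for such lines (`parse_params` is our hypothetical helper, not part of the model):

```python
def parse_params(line: str) -> dict:
    """Split an 'A: x, B: y' generation-parameter line into a dict."""
    params = {}
    for chunk in line.split(", "):
        key, _, value = chunk.partition(": ")
        params[key] = value
    return params

line = "Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 10, Size: 448x768"
print(parse_params(line)["CFG scale"])  # prints 10
```

This naive split assumes no value contains ", ", which holds for the parameter lines shown here.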
-0.04315185546875,
0.01444244384765625,
-0.018890380859375,
0.0029449462890625,
-0.0097503662109375,
0.04937744140625,
-0.02020263671875,
-0.048004150390625,
0.018402099609375,
0.0079345703125,
0.0220947265625,
0.0017271041870117188,
-0.06134033203125,
-0.0233001708984375,
-0.007556915283203125,
-0.017791748046875,
-0.0171356201171875,
0.018798828125,
-0.00644683837890625,
0.01403045654296875,
0.051239013671875,
0.0220794677734375,
-0.030364990234375,
0.00684356689453125,
0.0225677490234375,
-0.041107177734375,
-0.039794921875,
-0.04412841796875,
0.041595458984375,
-0.0243988037109375,
-0.0275115966796875,
0.06976318359375,
0.0221099853515625,
0.0791015625,
-0.031158447265625,
0.036346435546875,
-0.00847625732421875,
0.0484619140625,
-0.03662109375,
0.07269287109375,
-0.057891845703125,
0.0017414093017578125,
-0.05615234375,
-0.05328369140625,
-0.0074462890625,
0.053680419921875,
-0.01025390625,
0.0147247314453125,
0.0106353759765625,
0.07049560546875,
-0.0022335052490234375,
-0.020111083984375,
-0.003582000732421875,
0.0169525146484375,
0.006282806396484375,
0.041778564453125,
0.052886962890625,
-0.0809326171875,
0.0189361572265625,
-0.05950927734375,
-0.01140594482421875,
-0.0273590087890625,
-0.0618896484375,
-0.07623291015625,
-0.039825439453125,
-0.041473388671875,
-0.042938232421875,
-0.0399169921875,
0.050567626953125,
0.05535888671875,
-0.0654296875,
-0.00412750244140625,
0.00312042236328125,
0.00901031494140625,
-0.035064697265625,
-0.02069091796875,
0.01212310791015625,
0.056732177734375,
-0.0645751953125,
0.005725860595703125,
0.00021135807037353516,
0.0687255859375,
0.00019419193267822266,
-0.00823211669921875,
0.00327301025390625,
0.0086669921875,
0.02752685546875,
0.019287109375,
-0.049224853515625,
-0.01450347900390625,
0.0283966064453125,
-0.000013649463653564453,
-0.0107269287109375,
0.0272216796875,
-0.046051025390625,
0.038116455078125,
0.049072265625,
-0.0031604766845703125,
0.061553955078125,
0.006832122802734375,
0.00878143310546875,
-0.052093505859375,
0.016510009765625,
-0.006069183349609375,
0.0273284912109375,
0.02337646484375,
-0.038055419921875,
0.04248046875,
0.0699462890625,
-0.00255584716796875,
-0.0474853515625,
0.02154541015625,
-0.0972900390625,
-0.024749755859375,
0.06011962890625,
-0.00518798828125,
-0.031768798828125,
0.004543304443359375,
-0.023956298828125,
0.02252197265625,
0.000667572021484375,
0.04132080078125,
0.0452880859375,
0.018707275390625,
-0.032562255859375,
-0.0052947998046875,
0.0282440185546875,
0.0181732177734375,
-0.03558349609375,
-0.0273895263671875,
0.037261962890625,
0.0298309326171875,
0.046417236328125,
0.06756591796875,
-0.04278564453125,
0.0447998046875,
0.0163726806640625,
0.003398895263671875,
-0.01213836669921875,
0.0010986328125,
-0.0298919677734375,
-0.01141357421875,
-0.0217742919921875,
-0.01568603515625
]
] |
segmind/SSD-1B | 2023-11-06T12:08:15.000Z | [
"diffusers",
"text-to-image",
"ultra-realistic",
"stable-diffusion",
"distilled-model",
"knowledge-distillation",
"dataset:zzliang/GRIT",
"dataset:wanng/midjourney-v5-202304-clean",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | text-to-image | segmind | null | null | segmind/SSD-1B | 430 | 101,199 | diffusers | 2023-10-19T05:18:22 | ---
license: apache-2.0
tags:
- text-to-image
- ultra-realistic
- stable-diffusion
- distilled-model
- knowledge-distillation
pinned: true
datasets:
- zzliang/GRIT
- wanng/midjourney-v5-202304-clean
library_name: diffusers
---
# Segmind Stable Diffusion 1B (SSD-1B) Model Card

## 🔥🔥Join our [Discord](https://discord.gg/rF44ueRG) to give feedback on our smaller v2 version and early access🔥🔥
## 📣 AUTOMATIC1111 compatibility added. Supporting file [here](https://huggingface.co/segmind/SSD-1B/blob/main/SSD-1B-A1111.safetensors)
## Demo
Try out the model at [Segmind SSD-1B](https://www.segmind.com/models/ssd-1b) for ⚡ fastest inference. You can also try it on [🤗 Spaces](https://huggingface.co/spaces/segmind/Segmind-Stable-Diffusion)
## Model Description
The Segmind Stable Diffusion Model (SSD-1B) is a **distilled 50% smaller** version of the Stable Diffusion XL (SDXL), offering a **60% speedup** while maintaining high-quality text-to-image generation capabilities. It has been trained on diverse datasets, including Grit and Midjourney scrape data, to enhance its ability to create a wide range of visual content based on textual prompts.
This model employs a knowledge distillation strategy, where it leverages the teachings of several expert models in succession, including SDXL, ZavyChromaXL, and JuggernautXL, to combine their strengths and produce impressive visual outputs.
Special thanks to the HF team 🤗 especially [Sayak](https://huggingface.co/sayakpaul), [Patrick](https://github.com/patrickvonplaten) and [Poli](https://huggingface.co/multimodalart) for their collaboration and guidance on this work.
## Image Comparison (SDXL-1.0 vs SSD-1B)

## Usage
This model can be used via the 🧨 Diffusers library.
Make sure to install diffusers from source by running
```bash
pip install git+https://github.com/huggingface/diffusers
```
In addition, please install `transformers`, `safetensors` and `accelerate`:
```bash
pip install transformers accelerate safetensors
```
To use the model, you can run the following:
```py
from diffusers import StableDiffusionXLPipeline
import torch
pipe = StableDiffusionXLPipeline.from_pretrained("segmind/SSD-1B", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")
pipe.to("cuda")
# if using torch < 2.0
# pipe.enable_xformers_memory_efficient_attention()
prompt = "An astronaut riding a green horse" # Your prompt here
neg_prompt = "ugly, blurry, poor quality" # Negative prompt here
image = pipe(prompt=prompt, negative_prompt=neg_prompt).images[0]
```
### Update: Our model should now be usable in ComfyUI.
### Please do use negative prompting, and a CFG around 9.0 for the best quality!
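As a sketch of those recommendations, the settings above can be bundled into a small helper and passed to the pipeline call. The helper itself is hypothetical (it is not part of the `diffusers` API); `guidance_scale` is the `diffusers` argument that controls the CFG scale:

```python
# Hypothetical convenience helper (not part of the diffusers API): bundles the
# recommended settings so every call uses negative prompting and a CFG of ~9.0.
def ssd1b_generation_kwargs(prompt,
                            negative_prompt="ugly, blurry, poor quality",
                            guidance_scale=9.0):
    """Return keyword arguments for a StableDiffusionXLPipeline call."""
    return {
        "prompt": prompt,
        "negative_prompt": negative_prompt,
        "guidance_scale": guidance_scale,  # diffusers' name for the CFG scale
    }

# Usage, assuming `pipe` was created as in the snippet above:
# image = pipe(**ssd1b_generation_kwargs("An astronaut riding a green horse")).images[0]
```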
### Model Description
- **Developed by:** [Segmind](https://www.segmind.com/)
- **Developers:** [Yatharth Gupta](https://huggingface.co/Warlord-K) and [Vishnu Jaddipal](https://huggingface.co/Icar).
- **Model type:** Diffusion-based text-to-image generative model
- **License:** Apache 2.0
- **Distilled From:** [stabilityai/stable-diffusion-xl-base-1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0)
### Key Features
- **Text-to-Image Generation:** The model excels at generating images from text prompts, enabling a wide range of creative applications.
- **Distilled for Speed:** Designed for efficiency, this model offers a 60% speedup, making it a practical choice for real-time applications and scenarios where rapid image generation is essential.
- **Diverse Training Data:** Trained on diverse datasets, the model can handle a variety of textual prompts and generate corresponding images effectively.
- **Knowledge Distillation:** By distilling knowledge from multiple expert models, the Segmind Stable Diffusion Model combines their strengths and minimizes their limitations, resulting in improved performance.
### Model Architecture
The SSD-1B is a 1.3B-parameter model, obtained by removing several layers from the base SDXL model.

### Training info
These are the key hyperparameters used during training:
* Steps: 251000
* Learning rate: 1e-5
* Batch size: 32
* Gradient accumulation steps: 4
* Image resolution: 1024
* Mixed-precision: fp16
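For context, the effective batch size per optimizer step follows directly from these numbers. A quick back-of-the-envelope sketch (assuming the listed batch size is per micro-step and the step count refers to optimizer updates):

```python
# Back-of-the-envelope arithmetic for the training run listed above.
steps = 251_000          # optimizer steps (assumption: steps count updates)
batch_size = 32          # images per micro-step
grad_accum_steps = 4

effective_batch_size = batch_size * grad_accum_steps  # images per optimizer step
images_seen = steps * effective_batch_size            # total images processed

print(effective_batch_size)  # 128
print(images_seen)           # 32128000 (~32M images)
```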
### Multi-Resolution Support

SSD-1B can support the following output resolutions.
* 1024 x 1024 (1:1 Square)
* 1152 x 896 (9:7)
* 896 x 1152 (7:9)
* 1216 x 832 (19:13)
* 832 x 1216 (13:19)
* 1344 x 768 (7:4 Horizontal)
* 768 x 1344 (4:7 Vertical)
* 1536 x 640 (12:5 Horizontal)
* 640 x 1536 (5:12 Vertical)
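When a requested aspect ratio does not exactly match one of the entries above, it can be snapped to the nearest supported resolution and passed to the pipeline through its standard `width`/`height` arguments. The helper below is a hypothetical convenience, not part of `diffusers`:

```python
# Hypothetical helper (not part of diffusers): snap a requested aspect ratio to
# the nearest resolution from the supported list above.
SUPPORTED_RESOLUTIONS = [
    (1024, 1024), (1152, 896), (896, 1152), (1216, 832), (832, 1216),
    (1344, 768), (768, 1344), (1536, 640), (640, 1536),
]

def closest_supported_resolution(aspect_ratio):
    """Return the (width, height) pair whose w/h ratio best matches aspect_ratio."""
    return min(SUPPORTED_RESOLUTIONS,
               key=lambda wh: abs(wh[0] / wh[1] - aspect_ratio))

# Usage, assuming `pipe` from the snippet above:
# w, h = closest_supported_resolution(16 / 9)
# image = pipe(prompt=prompt, width=w, height=h).images[0]
```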
### Speed Comparison
We have observed that SSD-1B is up to 60% faster than the base SDXL model. Below is a comparison on an A100 80GB.

Below are the speed-up metrics on an RTX 4090 GPU.

### Model Sources
For research and development purposes, the SSD-1B Model can be accessed via the Segmind AI platform. For more information and access details, please visit [Segmind](https://www.segmind.com/models/ssd-1b).
## Uses
### Direct Use
The Segmind Stable Diffusion Model is suitable for research and practical applications in various domains, including:
- **Art and Design:** It can be used to generate artworks, designs, and other creative content, providing inspiration and enhancing the creative process.
- **Education:** The model can be applied in educational tools to create visual content for teaching and learning purposes.
- **Research:** Researchers can use the model to explore generative models, evaluate its performance, and push the boundaries of text-to-image generation.
- **Safe Content Generation:** It offers a safe and controlled way to generate content, reducing the risk of harmful or inappropriate outputs.
- **Bias and Limitation Analysis:** Researchers and developers can use the model to probe its limitations and biases, contributing to a better understanding of generative models' behavior.
### Downstream Use
The Segmind Stable Diffusion Model can also be used directly with the 🧨 Diffusers library training scripts for further training, including:
- **[LoRA](https://github.com/huggingface/diffusers/blob/main/examples/text_to_image/train_text_to_image_lora_sdxl.py):**
```bash
export MODEL_NAME="segmind/SSD-1B"
export VAE_NAME="madebyollin/sdxl-vae-fp16-fix"
export DATASET_NAME="lambdalabs/pokemon-blip-captions"
accelerate launch train_text_to_image_lora_sdxl.py \
--pretrained_model_name_or_path=$MODEL_NAME \
--pretrained_vae_model_name_or_path=$VAE_NAME \
--dataset_name=$DATASET_NAME --caption_column="text" \
--resolution=1024 --random_flip \
--train_batch_size=1 \
--num_train_epochs=2 --checkpointing_steps=500 \
--learning_rate=1e-04 --lr_scheduler="constant" --lr_warmup_steps=0 \
--mixed_precision="fp16" \
--seed=42 \
--output_dir="sd-pokemon-model-lora-ssd" \
--validation_prompt="cute dragon creature" --report_to="wandb" \
--push_to_hub
```
- **[Fine-Tune](https://github.com/huggingface/diffusers/blob/main/examples/text_to_image/train_text_to_image_sdxl.py):**
```bash
export MODEL_NAME="segmind/SSD-1B"
export VAE_NAME="madebyollin/sdxl-vae-fp16-fix"
export DATASET_NAME="lambdalabs/pokemon-blip-captions"
accelerate launch train_text_to_image_sdxl.py \
--pretrained_model_name_or_path=$MODEL_NAME \
--pretrained_vae_model_name_or_path=$VAE_NAME \
--dataset_name=$DATASET_NAME \
--enable_xformers_memory_efficient_attention \
--resolution=512 --center_crop --random_flip \
--proportion_empty_prompts=0.2 \
--train_batch_size=1 \
--gradient_accumulation_steps=4 --gradient_checkpointing \
--max_train_steps=10000 \
--use_8bit_adam \
--learning_rate=1e-06 --lr_scheduler="constant" --lr_warmup_steps=0 \
--mixed_precision="fp16" \
--report_to="wandb" \
--validation_prompt="a cute Sundar Pichai creature" --validation_epochs 5 \
--checkpointing_steps=5000 \
--output_dir="ssd-pokemon-model" \
--push_to_hub
```
- **[Dreambooth LoRA](https://github.com/huggingface/diffusers/blob/main/examples/dreambooth/train_dreambooth_lora_sdxl.py):**
```bash
export MODEL_NAME="segmind/SSD-1B"
export INSTANCE_DIR="dog"
export OUTPUT_DIR="lora-trained-xl"
export VAE_PATH="madebyollin/sdxl-vae-fp16-fix"
accelerate launch train_dreambooth_lora_sdxl.py \
--pretrained_model_name_or_path=$MODEL_NAME \
--instance_data_dir=$INSTANCE_DIR \
--pretrained_vae_model_name_or_path=$VAE_PATH \
--output_dir=$OUTPUT_DIR \
--mixed_precision="fp16" \
--instance_prompt="a photo of sks dog" \
--resolution=1024 \
--train_batch_size=1 \
--gradient_accumulation_steps=4 \
--learning_rate=1e-5 \
--report_to="wandb" \
--lr_scheduler="constant" \
--lr_warmup_steps=0 \
--max_train_steps=500 \
--validation_prompt="A photo of sks dog in a bucket" \
--validation_epochs=25 \
--seed="0" \
--push_to_hub
```
### Out-of-Scope Use
The SSD-1B Model is not suitable for creating factual or accurate representations of people, events, or real-world information. It is not intended for tasks requiring high precision and accuracy.
## Limitations and Bias
The SSD-1B Model falls short of absolute photorealism, especially in human depictions, and its autoencoding approach makes it struggle to render legible text and to maintain the fidelity of complex compositions. Although the model was trained on a diverse dataset, that exposure does not eliminate ingrained societal and digital biases. Users are encouraged to keep these current limitations in mind when working with the model, and to anticipate improvements as it continues to evolve.
[
-0.044921875,
-0.07501220703125,
0.0283355712890625,
0.0223541259765625,
-0.0269622802734375,
-0.0216064453125,
-0.0011844635009765625,
-0.021881103515625,
0.0151519775390625,
0.0204315185546875,
-0.042694091796875,
-0.037384033203125,
-0.049224853515625,
-0.01554107666015625,
-0.0119781494140625,
0.061859130859375,
-0.01401519775390625,
0.00040984153747558594,
-0.01511383056640625,
-0.00200653076171875,
-0.0169219970703125,
-0.0150146484375,
-0.07489013671875,
-0.0196533203125,
0.0219573974609375,
0.00910186767578125,
0.04388427734375,
0.048309326171875,
0.035064697265625,
0.02801513671875,
-0.02618408203125,
-0.00695037841796875,
-0.038330078125,
0.0037784576416015625,
0.00974273681640625,
-0.0217437744140625,
-0.043914794921875,
0.0011777877807617188,
0.0445556640625,
0.024383544921875,
-0.0186004638671875,
0.006938934326171875,
0.005985260009765625,
0.051116943359375,
-0.0254364013671875,
-0.0015411376953125,
-0.0149078369140625,
0.01409149169921875,
-0.01080322265625,
0.00982666015625,
-0.006587982177734375,
-0.027191162109375,
0.007373809814453125,
-0.061126708984375,
0.032867431640625,
-0.00533294677734375,
0.08599853515625,
0.0255279541015625,
-0.0201873779296875,
0.0060272216796875,
-0.025787353515625,
0.038299560546875,
-0.058746337890625,
0.0196533203125,
0.0026760101318359375,
0.020050048828125,
0.0081939697265625,
-0.0604248046875,
-0.044708251953125,
0.00547027587890625,
0.00937652587890625,
0.0258941650390625,
-0.0092315673828125,
-0.0081939697265625,
0.04119873046875,
0.0299530029296875,
-0.0484619140625,
0.0044097900390625,
-0.03778076171875,
0.007556915283203125,
0.05499267578125,
0.01123046875,
0.035491943359375,
-0.0153656005859375,
-0.0499267578125,
-0.0185546875,
-0.0345458984375,
-0.005039215087890625,
0.012481689453125,
-0.0092315673828125,
-0.045440673828125,
0.0308380126953125,
0.00214385986328125,
0.028472900390625,
0.023468017578125,
-0.00737762451171875,
0.04193115234375,
-0.0299530029296875,
-0.0310821533203125,
-0.01084136962890625,
0.0697021484375,
0.042022705078125,
-0.00934600830078125,
0.01030731201171875,
-0.0089111328125,
0.00791168212890625,
-0.0026531219482421875,
-0.09844970703125,
-0.017913818359375,
0.019287109375,
-0.050628662109375,
-0.0323486328125,
-0.0214996337890625,
-0.07366943359375,
-0.0214385986328125,
-0.00098419189453125,
0.04119873046875,
-0.04754638671875,
-0.046539306640625,
0.00872039794921875,
-0.04217529296875,
0.0085906982421875,
0.043121337890625,
-0.055450439453125,
0.00855255126953125,
0.014617919921875,
0.07574462890625,
-0.0250244140625,
-0.01253509521484375,
-0.00743865966796875,
0.003246307373046875,
-0.0191192626953125,
0.050628662109375,
-0.017578125,
-0.0391845703125,
-0.0213165283203125,
0.0180206298828125,
-0.00809478759765625,
-0.032623291015625,
0.04632568359375,
-0.0218505859375,
0.01221466064453125,
-0.0092315673828125,
-0.04779052734375,
-0.01412200927734375,
0.00357818603515625,
-0.041900634765625,
0.08624267578125,
0.0240325927734375,
-0.07208251953125,
0.0224151611328125,
-0.048583984375,
-0.0241241455078125,
0.0097808837890625,
0.004180908203125,
-0.048858642578125,
-0.0002651214599609375,
-0.00045418739318847656,
0.052337646484375,
-0.01306915283203125,
0.008026123046875,
-0.0333251953125,
-0.022674560546875,
-0.0014553070068359375,
-0.0318603515625,
0.0765380859375,
0.0325927734375,
-0.035064697265625,
0.0004534721374511719,
-0.0692138671875,
0.0020618438720703125,
0.031951904296875,
-0.0226287841796875,
-0.0241546630859375,
-0.0272216796875,
0.0183563232421875,
0.0287933349609375,
0.02069091796875,
-0.033233642578125,
0.00832366943359375,
-0.006671905517578125,
0.0311431884765625,
0.0670166015625,
0.0107421875,
0.040374755859375,
-0.0074615478515625,
0.04052734375,
0.027923583984375,
0.02362060546875,
-0.021697998046875,
-0.047088623046875,
-0.061798095703125,
-0.0228424072265625,
0.0159759521484375,
0.032928466796875,
-0.06005859375,
0.03369140625,
-0.005252838134765625,
-0.06292724609375,
-0.03485107421875,
0.0015897750854492188,
0.0263519287109375,
0.0533447265625,
0.022552490234375,
-0.0267486572265625,
-0.0201568603515625,
-0.055267333984375,
0.032440185546875,
-0.000865936279296875,
0.01495361328125,
0.0283660888671875,
0.04986572265625,
-0.0257720947265625,
0.026885986328125,
-0.06005859375,
-0.01334381103515625,
-0.01013946533203125,
0.014404296875,
0.015777587890625,
0.0528564453125,
0.054595947265625,
-0.0579833984375,
-0.049346923828125,
-0.01171112060546875,
-0.07159423828125,
0.0007600784301757812,
-0.007568359375,
-0.028778076171875,
0.030731201171875,
0.036285400390625,
-0.06805419921875,
0.0361328125,
0.040252685546875,
-0.05279541015625,
0.043914794921875,
-0.044036865234375,
0.0120391845703125,
-0.08782958984375,
0.0143890380859375,
0.035064697265625,
-0.014251708984375,
-0.046875,
0.01181793212890625,
0.0135498046875,
-0.010711669921875,
-0.045166015625,
0.0560302734375,
-0.03240966796875,
0.021331787109375,
-0.00955963134765625,
0.0012607574462890625,
0.004665374755859375,
0.0384521484375,
0.0274505615234375,
0.050628662109375,
0.07037353515625,
-0.058685302734375,
0.01151275634765625,
0.0286712646484375,
-0.0288238525390625,
0.056884765625,
-0.07037353515625,
-0.00982666015625,
-0.0208282470703125,
0.00768280029296875,
-0.073486328125,
-0.005886077880859375,
0.0280303955078125,
-0.033905029296875,
0.039520263671875,
-0.0094757080078125,
-0.030364990234375,
-0.038665771484375,
-0.0223541259765625,
0.0125579833984375,
0.07501220703125,
-0.042266845703125,
0.033233642578125,
0.0250701904296875,
-0.0047760009765625,
-0.023895263671875,
-0.051666259765625,
-0.01641845703125,
-0.0275115966796875,
-0.072021484375,
0.043701171875,
-0.03131103515625,
-0.0025119781494140625,
0.00189208984375,
0.00518798828125,
-0.0005407333374023438,
0.006183624267578125,
0.033660888671875,
0.040374755859375,
-0.0161285400390625,
-0.0182952880859375,
0.0107574462890625,
-0.01776123046875,
-0.0014705657958984375,
0.0069732666015625,
0.0355224609375,
0.00807952880859375,
-0.00852203369140625,
-0.045135498046875,
0.0188446044921875,
0.040618896484375,
0.01071929931640625,
0.06085205078125,
0.0723876953125,
-0.0225372314453125,
-0.004825592041015625,
-0.03399658203125,
-0.0007977485656738281,
-0.040557861328125,
0.0206146240234375,
-0.0219573974609375,
-0.03369140625,
0.0330810546875,
0.0004036426544189453,
-0.00417327880859375,
0.044830322265625,
0.03680419921875,
-0.0122833251953125,
0.1004638671875,
0.039093017578125,
0.0172271728515625,
0.043121337890625,
-0.0623779296875,
-0.0103302001953125,
-0.06781005859375,
-0.026611328125,
-0.020050048828125,
-0.0139617919921875,
-0.0303802490234375,
-0.044891357421875,
0.02569580078125,
0.0187835693359375,
-0.035064697265625,
0.0185394287109375,
-0.04522705078125,
0.013458251953125,
0.0273590087890625,
0.0198211669921875,
-0.0047149658203125,
0.0094757080078125,
-0.0070037841796875,
-0.007724761962890625,
-0.04437255859375,
-0.021484375,
0.06573486328125,
0.033782958984375,
0.068603515625,
-0.00836944580078125,
0.048309326171875,
0.01508331298828125,
0.00562286376953125,
-0.019744873046875,
0.05279541015625,
-0.0091094970703125,
-0.040435791015625,
-0.010345458984375,
-0.0244903564453125,
-0.04949951171875,
0.023284912109375,
-0.0242919921875,
-0.0254364013671875,
0.0251007080078125,
0.00516510009765625,
-0.02215576171875,
0.025299072265625,
-0.0792236328125,
0.074951171875,
-0.023468017578125,
-0.0548095703125,
-0.00335693359375,
-0.04901123046875,
0.022216796875,
0.00750732421875,
-0.00292205810546875,
-0.008392333984375,
0.00021409988403320312,
0.06329345703125,
-0.048919677734375,
0.05303955078125,
-0.04632568359375,
-0.004306793212890625,
0.01727294921875,
-0.017913818359375,
0.032928466796875,
-0.0006604194641113281,
-0.0079498291015625,
0.018310546875,
0.0011663436889648438,
-0.0279693603515625,
-0.031585693359375,
0.06414794921875,
-0.0672607421875,
-0.03118896484375,
-0.03271484375,
-0.009246826171875,
0.02081298828125,
0.02142333984375,
0.03802490234375,
0.0205535888671875,
-0.0176239013671875,
-0.003948211669921875,
0.08111572265625,
-0.0243377685546875,
0.040802001953125,
0.0251007080078125,
-0.016357421875,
-0.0287933349609375,
0.07098388671875,
0.0128936767578125,
0.026519775390625,
-0.0017194747924804688,
0.0006742477416992188,
-0.0277557373046875,
-0.0308990478515625,
-0.05462646484375,
0.0281219482421875,
-0.05462646484375,
-0.01461029052734375,
-0.0650634765625,
-0.035369873046875,
-0.035247802734375,
-0.017791748046875,
-0.025634765625,
-0.032623291015625,
-0.061126708984375,
-0.0029754638671875,
0.037261962890625,
0.035430908203125,
-0.0172576904296875,
0.0211334228515625,
-0.0343017578125,
0.0292205810546875,
0.0136871337890625,
0.028900146484375,
0.0106964111328125,
-0.047882080078125,
-0.0023059844970703125,
0.007068634033203125,
-0.0345458984375,
-0.05419921875,
0.0433349609375,
0.0048675537109375,
0.039520263671875,
0.04742431640625,
-0.0175628662109375,
0.06903076171875,
-0.03497314453125,
0.0694580078125,
0.03271484375,
-0.046051025390625,
0.05279541015625,
-0.0177154541015625,
0.0169219970703125,
0.020782470703125,
0.044403076171875,
-0.0303802490234375,
-0.00789642333984375,
-0.0628662109375,
-0.0631103515625,
0.051666259765625,
0.01409149169921875,
0.008453369140625,
0.0081634521484375,
0.032867431640625,
0.00789642333984375,
-0.002719879150390625,
-0.0501708984375,
-0.04840087890625,
-0.0216217041015625,
-0.002910614013671875,
-0.01214599609375,
-0.0183258056640625,
0.00855255126953125,
-0.044525146484375,
0.0755615234375,
0.0086517333984375,
0.039825439453125,
0.0222625732421875,
-0.002422332763671875,
-0.006610870361328125,
-0.00875091552734375,
0.0301666259765625,
0.02337646484375,
-0.0230865478515625,
-0.00789642333984375,
-0.0007724761962890625,
-0.040008544921875,
0.0176849365234375,
0.0223541259765625,
-0.0416259765625,
0.00759124755859375,
-0.01227569580078125,
0.08465576171875,
0.0005497932434082031,
-0.031585693359375,
0.04296875,
-0.0252227783203125,
-0.0271759033203125,
-0.02130126953125,
0.022918701171875,
0.017364501953125,
0.021392822265625,
0.00856781005859375,
0.034027099609375,
0.00768280029296875,
-0.0165557861328125,
-0.0009551048278808594,
0.039276123046875,
-0.0222930908203125,
-0.023590087890625,
0.09100341796875,
0.01554107666015625,
-0.007694244384765625,
0.043701171875,
-0.0283660888671875,
-0.0249481201171875,
0.0623779296875,
0.055328369140625,
0.05853271484375,
-0.013916015625,
0.037689208984375,
0.053924560546875,
-0.006805419921875,
-0.020172119140625,
0.0255279541015625,
-0.00531005859375,
-0.041961669921875,
0.003711700439453125,
-0.046051025390625,
0.001251220703125,
0.005626678466796875,
-0.03717041015625,
0.03814697265625,
-0.032928466796875,
-0.018890380859375,
0.01157379150390625,
0.007106781005859375,
-0.050872802734375,
0.024993896484375,
0.0198211669921875,
0.09002685546875,
-0.07501220703125,
0.059234619140625,
0.04052734375,
-0.051849365234375,
-0.0408935546875,
-0.0210723876953125,
-0.01061248779296875,
-0.05706787109375,
0.041107177734375,
0.019134521484375,
0.017333984375,
0.01036834716796875,
-0.056732177734375,
-0.05950927734375,
0.1143798828125,
0.01131439208984375,
-0.0305633544921875,
-0.003574371337890625,
-0.0176849365234375,
0.04534912109375,
-0.0265350341796875,
0.035858154296875,
0.0302276611328125,
0.0288238525390625,
0.0275115966796875,
-0.0266265869140625,
0.018951416015625,
-0.049102783203125,
0.0283660888671875,
-0.000335693359375,
-0.07049560546875,
0.0699462890625,
-0.034515380859375,
-0.023895263671875,
0.0167694091796875,
0.053924560546875,
0.0321044921875,
0.02166748046875,
0.03369140625,
0.07861328125,
0.0552978515625,
-0.006099700927734375,
0.083984375,
-0.014495849609375,
0.0216217041015625,
0.04669189453125,
-0.01404571533203125,
0.0498046875,
0.0240325927734375,
-0.01806640625,
0.033447265625,
0.061981201171875,
-0.0222930908203125,
0.05267333984375,
-0.007038116455078125,
-0.019317626953125,
-0.0121612548828125,
0.00495147705078125,
-0.05352783203125,
-0.018768310546875,
0.0227203369140625,
-0.040802001953125,
-0.006587982177734375,
0.0185546875,
0.00789642333984375,
-0.025177001953125,
-0.01995849609375,
0.03839111328125,
-0.006626129150390625,
-0.048980712890625,
0.050445556640625,
-0.0031185150146484375,
0.0794677734375,
-0.042694091796875,
-0.004871368408203125,
-0.020477294921875,
0.01824951171875,
-0.0160064697265625,
-0.0736083984375,
0.00286102294921875,
-0.00897216796875,
-0.0081787109375,
-0.0260467529296875,
0.0648193359375,
-0.029571533203125,
-0.04656982421875,
0.01904296875,
0.0237579345703125,
0.0234527587890625,
0.0071563720703125,
-0.0828857421875,
0.0222930908203125,
0.0037631988525390625,
-0.0293121337890625,
0.029510498046875,
0.025787353515625,
0.0194854736328125,
0.0560302734375,
0.034271240234375,
0.006275177001953125,
0.0175933837890625,
-0.005550384521484375,
0.0625,
-0.038787841796875,
-0.0252532958984375,
-0.0509033203125,
0.051849365234375,
-0.0281829833984375,
-0.0225372314453125,
0.067138671875,
0.053375244140625,
0.050018310546875,
-0.01837158203125,
0.0699462890625,
-0.0282135009765625,
0.0207977294921875,
-0.035491943359375,
0.06805419921875,
-0.067138671875,
0.0049896240234375,
-0.0305633544921875,
-0.050201416015625,
0.005950927734375,
0.058319091796875,
0.0027313232421875,
0.01396942138671875,
0.036651611328125,
0.0736083984375,
-0.0186309814453125,
-0.0020427703857421875,
0.01186370849609375,
0.02520751953125,
0.01395416259765625,
0.031646728515625,
0.042999267578125,
-0.0645751953125,
0.025115966796875,
-0.043182373046875,
-0.019683837890625,
-0.004253387451171875,
-0.056182861328125,
-0.07061767578125,
-0.05963134765625,
-0.064208984375,
-0.046478271484375,
-0.01041412353515625,
0.039642333984375,
0.06475830078125,
-0.033233642578125,
0.005130767822265625,
-0.01392364501953125,
0.003993988037109375,
-0.01207733154296875,
-0.021942138671875,
0.034027099609375,
-0.0015316009521484375,
-0.06378173828125,
-0.0057220458984375,
0.01192474365234375,
0.0457763671875,
-0.0211334228515625,
-0.0229949951171875,
-0.0220947265625,
-0.005664825439453125,
0.0445556640625,
0.01422882080078125,
-0.0232086181640625,
0.01568603515625,
-0.007656097412109375,
-0.01557159423828125,
0.016815185546875,
0.0185089111328125,
-0.0479736328125,
0.0192108154296875,
0.05120849609375,
0.01367950439453125,
0.06585693359375,
-0.00397491455078125,
0.006496429443359375,
-0.034393310546875,
0.01116180419921875,
-0.0215606689453125,
0.036865234375,
0.01904296875,
-0.02923583984375,
0.048675537109375,
0.035980224609375,
-0.048797607421875,
-0.059326171875,
-0.0180206298828125,
-0.090576171875,
-0.0264434814453125,
0.0999755859375,
-0.01409912109375,
-0.046875,
0.0186309814453125,
-0.035186767578125,
0.01171112060546875,
-0.027740478515625,
0.038909912109375,
0.0306243896484375,
-0.003673553466796875,
-0.02569580078125,
-0.039703369140625,
0.033935546875,
0.0081634521484375,
-0.06439208984375,
-0.01363372802734375,
0.04779052734375,
0.049041748046875,
0.0125885009765625,
0.06903076171875,
-0.013397216796875,
0.0194091796875,
0.0132904052734375,
0.0146331787109375,
0.00641632080078125,
0.0030574798583984375,
-0.0308990478515625,
0.0033550262451171875,
-0.0261383056640625,
-0.00774383544921875
]
] |
facebook/hubert-large-ls960-ft | 2022-05-24T10:43:42.000Z | [
"transformers",
"pytorch",
"tf",
"hubert",
"automatic-speech-recognition",
"speech",
"audio",
"hf-asr-leaderboard",
"en",
"dataset:libri-light",
"dataset:librispeech_asr",
"arxiv:2106.07447",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/hubert-large-ls960-ft | 42 | 101,158 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- libri-light
- librispeech_asr
tags:
- speech
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
license: apache-2.0
model-index:
- name: hubert-large-ls960-ft
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 1.9
---
# Hubert-Large-Finetuned
[Facebook's Hubert](https://ai.facebook.com/blog/hubert-self-supervised-representation-learning-for-speech-recognition-generation-and-compression)
The large model fine-tuned on 960 hours of LibriSpeech, using 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
The model is a fine-tuned version of [hubert-large-ll60k](https://huggingface.co/facebook/hubert-large-ll60k).
[Paper](https://arxiv.org/abs/2106.07447)
Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed
**Abstract**
Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels. Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets.
The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/hubert.
# Usage
The model can be used for automatic speech recognition as follows:
```python
import torch
from transformers import Wav2Vec2Processor, HubertForCTC
from datasets import load_dataset
processor = Wav2Vec2Processor.from_pretrained("facebook/hubert-large-ls960-ft")
model = HubertForCTC.from_pretrained("facebook/hubert-large-ls960-ft")
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
input_values = processor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt").input_values  # Batch size 1
with torch.no_grad():
    logits = model(input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.decode(predicted_ids[0])
# ->"A MAN SAID TO THE UNIVERSE SIR I EXIST"
``` | 3,325 | [
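The final `torch.argmax` plus `processor.decode` step is greedy CTC decoding: repeated ids are collapsed, the CTC blank token is dropped, and the remaining ids are mapped back to characters. A minimal pure-Python sketch of that collapse step (the tiny vocabulary and blank id here are illustrative, not the model's real vocabulary):

```python
from itertools import groupby

# Toy vocabulary; in the real model this comes from the processor's tokenizer.
vocab = {0: "<pad>", 1: "A", 2: "M", 3: "N", 4: " "}
BLANK_ID = 0  # CTC blank; Wav2Vec2-style models reuse the pad token as blank.

def greedy_ctc_decode(ids):
    """Collapse repeated ids, drop blanks, then map ids to characters."""
    collapsed = [k for k, _ in groupby(ids)]  # e.g. [1, 1, 0, 2] -> [1, 0, 2]
    return "".join(vocab[i] for i in collapsed if i != BLANK_ID)

# Repeated characters survive only if a blank separates the repeats.
frame_ids = [1, 1, 0, 4, 2, 2, 1, 0, 3, 3]
print(greedy_ctc_decode(frame_ids))  # -> "A MAN"
```

This is why CTC models emit one id per audio frame yet produce a much shorter transcription.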
[
-0.03875732421875,
-0.040802001953125,
0.026763916015625,
0.021575927734375,
-0.0082855224609375,
-0.00647735595703125,
-0.030609130859375,
-0.03167724609375,
0.0268707275390625,
0.0196990966796875,
-0.04986572265625,
-0.0269927978515625,
-0.0361328125,
-0.01143646240234375,
-0.030303955078125,
0.0521240234375,
0.01520538330078125,
0.014984130859375,
0.005458831787109375,
-0.00835418701171875,
-0.0274200439453125,
-0.04541015625,
-0.05938720703125,
-0.03265380859375,
0.03240966796875,
0.0186614990234375,
0.0129547119140625,
0.0292816162109375,
0.01428985595703125,
0.0269775390625,
-0.0026760101318359375,
0.0038394927978515625,
-0.03729248046875,
0.0025463104248046875,
0.00814056396484375,
-0.00647735595703125,
-0.029388427734375,
0.0147705078125,
0.0634765625,
0.04541015625,
-0.0277862548828125,
0.03045654296875,
0.007080078125,
0.026153564453125,
-0.0286407470703125,
0.0214996337890625,
-0.057159423828125,
-0.0026111602783203125,
-0.005016326904296875,
0.0076141357421875,
-0.037109375,
0.00018799304962158203,
0.00281524658203125,
-0.0323486328125,
0.019500732421875,
-0.010650634765625,
0.070068359375,
0.024627685546875,
-0.0088043212890625,
-0.007965087890625,
-0.059783935546875,
0.07342529296875,
-0.050140380859375,
0.054656982421875,
0.042022705078125,
0.0282440185546875,
0.0013217926025390625,
-0.058258056640625,
-0.03143310546875,
-0.0204925537109375,
0.0031909942626953125,
0.01396942138671875,
-0.0269775390625,
0.005016326904296875,
0.0267181396484375,
0.01409149169921875,
-0.0440673828125,
0.0224609375,
-0.055145263671875,
-0.03973388671875,
0.0626220703125,
-0.0288848876953125,
-0.0182037353515625,
-0.0189361572265625,
-0.01776123046875,
-0.0237579345703125,
-0.0269927978515625,
0.0157318115234375,
0.032135009765625,
0.02789306640625,
-0.0132293701171875,
0.0189361572265625,
-0.002635955810546875,
0.05291748046875,
0.0232391357421875,
-0.01702880859375,
0.036285400390625,
0.018157958984375,
-0.00862884521484375,
0.0323486328125,
0.0733642578125,
-0.016326904296875,
0.0155487060546875,
0.0060882568359375,
-0.0290069580078125,
-0.0099334716796875,
0.008148193359375,
-0.055938720703125,
-0.047821044921875,
0.0156707763671875,
-0.028350830078125,
-0.003116607666015625,
0.015472412109375,
-0.0082244873046875,
0.00940704345703125,
-0.039459228515625,
0.07525634765625,
-0.033294677734375,
-0.00464630126953125,
-0.016845703125,
0.0037975311279296875,
-0.0014896392822265625,
0.004795074462890625,
-0.08148193359375,
0.0163726806640625,
0.0241546630859375,
0.05059814453125,
-0.004749298095703125,
-0.0104217529296875,
-0.049407958984375,
0.0025310516357421875,
-0.031097412109375,
0.0272979736328125,
-0.00661468505859375,
-0.0233001708984375,
-0.02911376953125,
-0.006038665771484375,
0.00557708740234375,
-0.0489501953125,
0.0372314453125,
-0.021240234375,
0.00839996337890625,
-0.004436492919921875,
-0.05194091796875,
-0.015167236328125,
-0.027069091796875,
-0.04168701171875,
0.082763671875,
0.00730133056640625,
-0.03533935546875,
0.00740814208984375,
-0.038848876953125,
-0.035797119140625,
-0.00445556640625,
-0.0283660888671875,
-0.04962158203125,
0.018157958984375,
0.033477783203125,
0.059478759765625,
0.007335662841796875,
0.033355712890625,
-0.029693603515625,
-0.027374267578125,
0.0253143310546875,
-0.046783447265625,
0.06402587890625,
0.0264434814453125,
-0.0276031494140625,
0.0160980224609375,
-0.077392578125,
0.0107879638671875,
0.005405426025390625,
-0.0288238525390625,
0.00823974609375,
-0.0146942138671875,
0.00850677490234375,
0.011505126953125,
0.029693603515625,
-0.047271728515625,
-0.00811004638671875,
-0.038299560546875,
0.04534912109375,
0.06329345703125,
-0.01451873779296875,
0.023223876953125,
-0.0167236328125,
0.0035839080810546875,
-0.0211639404296875,
0.00806427001953125,
-0.000873565673828125,
-0.030853271484375,
-0.057464599609375,
-0.035736083984375,
0.05181884765625,
0.02667236328125,
-0.0244598388671875,
0.05535888671875,
-0.0013647079467773438,
-0.04205322265625,
-0.0723876953125,
0.002895355224609375,
0.0153045654296875,
0.038238525390625,
0.051788330078125,
-0.01629638671875,
-0.05718994140625,
-0.08013916015625,
0.004741668701171875,
-0.0189666748046875,
-0.0300750732421875,
0.023529052734375,
0.018096923828125,
-0.02996826171875,
0.0723876953125,
-0.022216796875,
-0.03289794921875,
-0.00569915771484375,
0.0197906494140625,
0.02886962890625,
0.052154541015625,
0.0288238525390625,
-0.037994384765625,
-0.0281829833984375,
-0.0138397216796875,
-0.04510498046875,
-0.010284423828125,
0.0007262229919433594,
0.0222930908203125,
0.0195159912109375,
0.049774169921875,
-0.030487060546875,
0.023284912109375,
0.048248291015625,
0.01904296875,
0.033447265625,
-0.0263671875,
-0.0211639404296875,
-0.0794677734375,
-0.0020751953125,
-0.0090179443359375,
-0.0318603515625,
-0.045989990234375,
-0.02581787109375,
0.008056640625,
-0.015716552734375,
-0.02191162109375,
0.0322265625,
-0.02984619140625,
-0.0182647705078125,
-0.015655517578125,
0.01265716552734375,
-0.018218994140625,
0.044342041015625,
0.0009927749633789062,
0.04901123046875,
0.050628662109375,
-0.045989990234375,
0.0377197265625,
0.0064697265625,
-0.0306396484375,
0.0143585205078125,
-0.060394287109375,
0.023468017578125,
-0.00421905517578125,
0.022369384765625,
-0.07354736328125,
-0.020843505859375,
-0.0002105236053466797,
-0.062744140625,
0.055450439453125,
0.0023136138916015625,
-0.032440185546875,
-0.035491943359375,
0.0009937286376953125,
0.02691650390625,
0.048919677734375,
-0.064453125,
0.0408935546875,
0.04046630859375,
0.00484466552734375,
-0.040496826171875,
-0.0601806640625,
-0.009613037109375,
-0.009368896484375,
-0.046173095703125,
0.032562255859375,
-0.00836181640625,
0.0038280487060546875,
-0.01983642578125,
-0.0185089111328125,
0.01154327392578125,
-0.00836181640625,
0.0278167724609375,
-0.0018873214721679688,
-0.007110595703125,
0.037078857421875,
0.0024433135986328125,
-0.01708984375,
0.008544921875,
-0.031158447265625,
0.045989990234375,
-0.0176849365234375,
-0.019012451171875,
-0.060028076171875,
0.0187835693359375,
0.00995635986328125,
-0.01474761962890625,
0.024688720703125,
0.097900390625,
-0.036590576171875,
-0.0180206298828125,
-0.05303955078125,
-0.034027099609375,
-0.04205322265625,
0.039459228515625,
-0.0277557373046875,
-0.0726318359375,
0.029937744140625,
0.005889892578125,
-0.00461578369140625,
0.0538330078125,
0.045074462890625,
-0.033447265625,
0.06744384765625,
0.03509521484375,
-0.0265655517578125,
0.0341796875,
-0.04901123046875,
0.01934814453125,
-0.058868408203125,
-0.01421356201171875,
-0.02264404296875,
-0.0301971435546875,
-0.04815673828125,
-0.03558349609375,
0.0323486328125,
0.01259613037109375,
-0.0158233642578125,
0.0310821533203125,
-0.049072265625,
0.0009183883666992188,
0.06597900390625,
0.0151519775390625,
-0.01082611083984375,
0.0204620361328125,
-0.01149749755859375,
-0.0039825439453125,
-0.06219482421875,
-0.00870513916015625,
0.073974609375,
0.0400390625,
0.070068359375,
-0.005191802978515625,
0.0899658203125,
0.0157012939453125,
-0.00749969482421875,
-0.06610107421875,
0.041839599609375,
-0.0096893310546875,
-0.0538330078125,
-0.044891357421875,
-0.04559326171875,
-0.07666015625,
0.015228271484375,
-0.01708984375,
-0.06854248046875,
0.01471710205078125,
0.016387939453125,
-0.03619384765625,
0.01258087158203125,
-0.057647705078125,
0.049957275390625,
-0.0159149169921875,
0.0004551410675048828,
-0.0241546630859375,
-0.054656982421875,
0.002941131591796875,
-0.01404571533203125,
0.01258087158203125,
-0.0194244384765625,
0.040130615234375,
0.08502197265625,
-0.020233154296875,
0.059478759765625,
-0.024688720703125,
-0.0024127960205078125,
0.034698486328125,
-0.01003265380859375,
0.0248260498046875,
0.00656890869140625,
0.0015802383422851562,
0.031158447265625,
0.007320404052734375,
-0.021087646484375,
-0.0239105224609375,
0.06378173828125,
-0.0806884765625,
-0.033538818359375,
-0.01558685302734375,
-0.030487060546875,
-0.0214080810546875,
0.0034160614013671875,
0.04644775390625,
0.049163818359375,
-0.008026123046875,
0.026763916015625,
0.053741455078125,
-0.006145477294921875,
0.047088623046875,
0.0196990966796875,
-0.0197601318359375,
-0.040740966796875,
0.086181640625,
0.031005859375,
0.0113983154296875,
0.0239715576171875,
0.02496337890625,
-0.033721923828125,
-0.0255279541015625,
-0.016448974609375,
0.03070068359375,
-0.050140380859375,
-0.0211944580078125,
-0.046417236328125,
-0.041229248046875,
-0.051483154296875,
0.01456451416015625,
-0.046905517578125,
-0.041839599609375,
-0.0538330078125,
-0.002777099609375,
0.0215606689453125,
0.046875,
-0.04644775390625,
0.041290283203125,
-0.0374755859375,
0.034332275390625,
0.05352783203125,
0.010498046875,
-0.005199432373046875,
-0.0738525390625,
-0.027130126953125,
0.00870513916015625,
-0.01465606689453125,
-0.0533447265625,
0.0295867919921875,
0.03155517578125,
0.044769287109375,
0.03839111328125,
0.007198333740234375,
0.0526123046875,
-0.036895751953125,
0.043975830078125,
0.020751953125,
-0.07818603515625,
0.050994873046875,
-0.0159759521484375,
0.0223388671875,
0.026611328125,
0.0239105224609375,
-0.02691650390625,
-0.0136260986328125,
-0.0631103515625,
-0.05230712890625,
0.0699462890625,
0.021240234375,
0.00841522216796875,
0.01806640625,
0.0225982666015625,
-0.007564544677734375,
0.006641387939453125,
-0.0653076171875,
-0.034942626953125,
-0.0225982666015625,
-0.01493072509765625,
-0.0266876220703125,
-0.0245208740234375,
-0.002155303955078125,
-0.046630859375,
0.075439453125,
0.004283905029296875,
0.03997802734375,
0.020538330078125,
-0.00548553466796875,
-0.00397491455078125,
0.012115478515625,
0.03875732421875,
0.0352783203125,
-0.031158447265625,
0.0105133056640625,
0.0216217041015625,
-0.02740478515625,
0.0011739730834960938,
0.0291290283203125,
0.00555419921875,
0.018463134765625,
0.031036376953125,
0.08837890625,
0.0031833648681640625,
-0.016876220703125,
0.033294677734375,
0.00024127960205078125,
-0.0361328125,
-0.046112060546875,
0.0004353523254394531,
0.00734710693359375,
0.0228729248046875,
0.046722412109375,
0.0020122528076171875,
0.00567626953125,
-0.0273284912109375,
0.0242462158203125,
0.025238037109375,
-0.04766845703125,
-0.0229339599609375,
0.05426025390625,
-0.00452423095703125,
-0.025909423828125,
0.045867919921875,
-0.026824951171875,
-0.03155517578125,
0.025543212890625,
0.043243408203125,
0.06329345703125,
-0.05633544921875,
0.01020050048828125,
0.042999267578125,
0.0212249755859375,
-0.0087738037109375,
0.017364501953125,
-0.0158538818359375,
-0.040771484375,
-0.03662109375,
-0.056732177734375,
-0.013336181640625,
0.0300445556640625,
-0.049957275390625,
0.020263671875,
-0.0308685302734375,
-0.02850341796875,
0.0125732421875,
0.0007433891296386719,
-0.041839599609375,
0.0160675048828125,
0.019683837890625,
0.0323486328125,
-0.0479736328125,
0.0877685546875,
0.0199127197265625,
-0.0008144378662109375,
-0.090087890625,
-0.01030731201171875,
-0.015869140625,
-0.0662841796875,
0.03875732421875,
0.0221099853515625,
-0.0088958740234375,
0.0088348388671875,
-0.044677734375,
-0.08648681640625,
0.070068359375,
0.042327880859375,
-0.0653076171875,
0.0116424560546875,
-0.00850677490234375,
0.0338134765625,
-0.013824462890625,
-0.01070404052734375,
0.0382080078125,
0.024627685546875,
0.0025482177734375,
-0.0911865234375,
-0.01739501953125,
0.0091552734375,
-0.00762176513671875,
-0.0004239082336425781,
-0.041107177734375,
0.08099365234375,
-0.018341064453125,
-0.0116424560546875,
0.0015802383422851562,
0.0723876953125,
0.0173187255859375,
0.0175018310546875,
0.0413818359375,
0.046356201171875,
0.06939697265625,
-0.012115478515625,
0.043243408203125,
-0.0072021484375,
0.053070068359375,
0.08673095703125,
0.0214996337890625,
0.07427978515625,
0.0250396728515625,
-0.0318603515625,
0.027130126953125,
0.0419921875,
-0.01067352294921875,
0.04168701171875,
0.0269775390625,
-0.0159454345703125,
-0.02679443359375,
0.0142822265625,
-0.043426513671875,
0.06695556640625,
0.0325927734375,
-0.0251617431640625,
0.006626129150390625,
0.01306915283203125,
-0.027679443359375,
-0.01322174072265625,
-0.024444580078125,
0.046478271484375,
0.0239410400390625,
-0.0185546875,
0.07684326171875,
0.004150390625,
0.04949951171875,
-0.04766845703125,
0.009979248046875,
0.00917816162109375,
0.006534576416015625,
-0.0169525146484375,
-0.0278167724609375,
0.00299072265625,
-0.028564453125,
-0.01165771484375,
-0.0090789794921875,
0.047119140625,
-0.049041748046875,
-0.0484619140625,
0.0278167724609375,
0.015594482421875,
0.029510498046875,
-0.0171051025390625,
-0.05438232421875,
0.0106353759765625,
0.00879669189453125,
-0.00665283203125,
0.01397705078125,
0.017364501953125,
0.01456451416015625,
0.03131103515625,
0.03631591796875,
0.002780914306640625,
0.0071868896484375,
0.01386260986328125,
0.04473876953125,
-0.035888671875,
-0.041748046875,
-0.04156494140625,
0.00926971435546875,
0.0028476715087890625,
0.00269317626953125,
0.042694091796875,
0.03668212890625,
0.07904052734375,
-0.003505706787109375,
0.0396728515625,
0.00914764404296875,
0.060089111328125,
-0.03668212890625,
0.061370849609375,
-0.04144287109375,
0.0120697021484375,
-0.0234832763671875,
-0.0618896484375,
-0.0130615234375,
0.0833740234375,
-0.00952911376953125,
0.02886962890625,
0.0227203369140625,
0.054534912109375,
-0.005840301513671875,
-0.01474761962890625,
0.044036865234375,
0.022369384765625,
0.0208892822265625,
0.02362060546875,
0.052276611328125,
-0.04949951171875,
0.042022705078125,
-0.033721923828125,
-0.007732391357421875,
-0.0234527587890625,
-0.030548095703125,
-0.06719970703125,
-0.06805419921875,
-0.024383544921875,
-0.0182342529296875,
-0.00865936279296875,
0.08343505859375,
0.08526611328125,
-0.07110595703125,
-0.030120849609375,
0.02178955078125,
-0.01019287109375,
-0.0159454345703125,
-0.01406097412109375,
0.04412841796875,
-0.0186920166015625,
-0.044525146484375,
0.056976318359375,
0.0034923553466796875,
0.018157958984375,
-0.0263671875,
-0.016571044921875,
-0.007598876953125,
-0.0096588134765625,
0.04815673828125,
0.0242462158203125,
-0.07147216796875,
-0.0253753662109375,
-0.01439666748046875,
0.00803375244140625,
0.01250457763671875,
0.033843994140625,
-0.07232666015625,
0.049835205078125,
0.01058197021484375,
0.033355712890625,
0.07659912109375,
-0.01436614990234375,
0.01265716552734375,
-0.08642578125,
0.00806427001953125,
0.0245819091796875,
0.03375244140625,
0.032745361328125,
-0.01374053955078125,
0.01739501953125,
0.01690673828125,
-0.056732177734375,
-0.056243896484375,
0.00864410400390625,
-0.0972900390625,
-0.0192413330078125,
0.08087158203125,
0.005123138427734375,
-0.007659912109375,
0.018157958984375,
-0.02447509765625,
0.0390625,
-0.049957275390625,
0.05242919921875,
0.03717041015625,
-0.0249176025390625,
-0.010711669921875,
-0.033905029296875,
0.036895751953125,
0.03155517578125,
-0.018218994140625,
-0.0050048828125,
0.03076171875,
0.0236968994140625,
0.0217132568359375,
0.048858642578125,
0.01100921630859375,
0.00960540771484375,
-0.0083465576171875,
0.01617431640625,
-0.0028209686279296875,
-0.0374755859375,
-0.0458984375,
0.0074462890625,
0.0033397674560546875,
-0.033294677734375
]
] |
THUDM/chatglm-6b | 2023-09-04T15:49:45.000Z | [
"transformers",
"pytorch",
"chatglm",
"glm",
"thudm",
"custom_code",
"zh",
"en",
"arxiv:2103.10360",
"arxiv:2210.02414",
"endpoints_compatible",
"has_space",
"region:us"
] | null | THUDM | null | null | THUDM/chatglm-6b | 2,646 | 100,796 | transformers | 2023-03-13T16:28:04 | ---
language:
- zh
- en
tags:
- glm
- chatglm
- thudm
---
# ChatGLM-6B
<p align="center">
🌐 <a href="https://chatglm.cn/blog" target="_blank">Blog</a> • 💻 <a href="https://github.com/THUDM/ChatGLM-6B" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/thukeg" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2103.10360" target="_blank">[GLM@ACL 22]</a> <a href="https://github.com/THUDM/GLM" target="_blank">[GitHub]</a> • 📃 <a href="https://arxiv.org/abs/2210.02414" target="_blank">[GLM-130B@ICLR 23]</a> <a href="https://github.com/THUDM/GLM-130B" target="_blank">[GitHub]</a> <br>
</p>
<p align="center">
👋 Join our <a href="https://join.slack.com/t/chatglm/shared_invite/zt-1y7pqoloy-9b1g6T6JjA8J0KxvUjbwJw" target="_blank">Slack</a> and <a href="https://github.com/THUDM/ChatGLM-6B/blob/main/resources/WECHAT.md" target="_blank">WeChat</a>
</p>
<p align="center">
📍Experience the larger-scale ChatGLM model at <a href="https://www.chatglm.cn">chatglm.cn</a>
</p>
**We have released [ChatGLM2-6B](https://github.com/THUDM/ChatGLM2-6B), an upgraded version of ChatGLM-6B that keeps the strengths of the first generation — fluent conversation and a low deployment barrier — while introducing stronger performance, longer context, and more efficient inference.**
## Introduction
ChatGLM-6B is an open bilingual (Chinese-English) dialogue language model based on the [General Language Model (GLM)](https://github.com/THUDM/GLM) architecture, with 6.2 billion parameters. With model quantization, users can deploy it locally on consumer-grade graphics cards (as little as 6GB of GPU memory is required at the INT4 quantization level). ChatGLM-6B uses the same technology as [ChatGLM](https://chatglm.cn) and is optimized for Chinese QA and dialogue. Trained on about 1T tokens of Chinese and English corpus and refined with supervised fine-tuning, feedback bootstrapping, and reinforcement learning with human feedback, the 6.2B-parameter model is able to generate answers that align well with human preference. ChatGLM-6B weights are **completely open** for academic research, and **free commercial use** is also allowed after completing the [questionnaire](https://open.bigmodel.cn/mla/form).
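The quantization claim above is easy to sanity-check with back-of-the-envelope arithmetic: weight memory is roughly parameter count times bytes per parameter (activations and runtime overhead come on top, so these figures are lower bounds):

```python
# Rough lower bound on weight memory for a 6.2B-parameter model.
PARAMS = 6.2e9

def weight_gb(bits_per_param):
    """Weight storage in GiB at the given precision."""
    return PARAMS * bits_per_param / 8 / 1024**3

for name, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    print(f"{name}: ~{weight_gb(bits):.1f} GiB")
```

At INT4 the weights alone come to under 3 GiB, which is consistent with the card's ~6GB total GPU memory figure once activations and KV cache are included.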
## Dependencies
```shell
pip install protobuf==3.20.0 transformers==4.27.1 icetk cpm_kernels
```
## Usage
The ChatGLM-6B model can be called to generate a conversation as follows:
```ipython
>>> from transformers import AutoTokenizer, AutoModel
>>> tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
>>> model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
>>> response, history = model.chat(tokenizer, "你好", history=[])
>>> print(response)
你好👋!我是人工智能助手 ChatGLM-6B,很高兴见到你,欢迎问我任何问题。
>>> response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
>>> print(response)
晚上睡不着可能会让你感到焦虑或不舒服,但以下是一些可以帮助你入睡的方法:
1. 制定规律的睡眠时间表:保持规律的睡眠时间表可以帮助你建立健康的睡眠习惯,使你更容易入睡。尽量在每天的相同时间上床,并在同一时间起床。
2. 创造一个舒适的睡眠环境:确保睡眠环境舒适,安静,黑暗且温度适宜。可以使用舒适的床上用品,并保持房间通风。
3. 放松身心:在睡前做些放松的活动,例如泡个热水澡,听些轻柔的音乐,阅读一些有趣的书籍等,有助于缓解紧张和焦虑,使你更容易入睡。
4. 避免饮用含有咖啡因的饮料:咖啡因是一种刺激性物质,会影响你的睡眠质量。尽量避免在睡前饮用含有咖啡因的饮料,例如咖啡,茶和可乐。
5. 避免在床上做与睡眠无关的事情:在床上做些与睡眠无关的事情,例如看电影,玩游戏或工作等,可能会干扰你的睡眠。
6. 尝试呼吸技巧:深呼吸是一种放松技巧,可以帮助你缓解紧张和焦虑,使你更容易入睡。试着慢慢吸气,保持几秒钟,然后缓慢呼气。
如果这些方法无法帮助你入睡,你可以考虑咨询医生或睡眠专家,寻求进一步的建议。
```
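The `history` list returned by `model.chat` is folded back into the next prompt as numbered rounds. A minimal sketch of that prompt construction, modeled on ChatGLM-6B's chat format (the exact template lives in the model's remote code, so treat this as an approximation):

```python
def build_prompt(query, history):
    """Fold previous (query, response) rounds into a single prompt string."""
    prompt = ""
    for i, (old_query, response) in enumerate(history):
        prompt += f"[Round {i}]\n问:{old_query}\n答:{response}\n"
    # The current query opens a new round, left open for the model to complete.
    prompt += f"[Round {len(history)}]\n问:{query}\n答:"
    return prompt

history = [("你好", "你好👋!我是人工智能助手 ChatGLM-6B。")]
print(build_prompt("晚上睡不着应该怎么办", history))
```

Because each turn is replayed verbatim, long conversations grow the prompt linearly, which is why the larger-context ChatGLM2-6B upgrade matters for multi-turn use.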
For more instructions, including how to run the CLI and web demos and how to use model quantization to save GPU memory, please refer to our [Github Repo](https://github.com/THUDM/ChatGLM-6B).
## Change Log
* v1.1.0 ([942945d](https://huggingface.co/THUDM/chatglm-6b/commit/942945df047dee66f653c68ae0e56655045f1741)): updated to the v1.1 checkpoint
* v0.1.0 ([f831824](https://huggingface.co/THUDM/chatglm-6b/commit/f83182484538e663a03d3f73647f10f89878f438))
## License
The code in this repository is open-sourced under the [Apache-2.0](LICENSE) license; use of the ChatGLM-6B model weights must follow the [Model License](MODEL_LICENSE).
## Citation
If you find our work helpful, please consider citing the following papers:
```bibtex
@inproceedings{zeng2023glm-130b,
title={{GLM}-130B: An Open Bilingual Pre-trained Model},
author={Aohan Zeng and Xiao Liu and Zhengxiao Du and Zihan Wang and Hanyu Lai and Ming Ding and Zhuoyi Yang and Yifan Xu and Wendi Zheng and Xiao Xia and Weng Lam Tam and Zixuan Ma and Yufei Xue and Jidong Zhai and Wenguang Chen and Zhiyuan Liu and Peng Zhang and Yuxiao Dong and Jie Tang},
booktitle={The Eleventh International Conference on Learning Representations (ICLR)},
year={2023},
url={https://openreview.net/forum?id=-Aw0rrrPUF}
}
```
```bibtex
@inproceedings{du2022glm,
title={GLM: General Language Model Pretraining with Autoregressive Blank Infilling},
author={Du, Zhengxiao and Qian, Yujie and Liu, Xiao and Ding, Ming and Qiu, Jiezhong and Yang, Zhilin and Tang, Jie},
booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
pages={320--335},
year={2022}
}
``` | 5,122 | [
[
-0.033538818359375,
-0.05877685546875,
0.006092071533203125,
0.02764892578125,
-0.0290069580078125,
-0.0021228790283203125,
-0.023040771484375,
-0.03289794921875,
0.006946563720703125,
0.010498046875,
-0.036651611328125,
-0.05023193359375,
-0.03741455078125,
-0.01190948486328125,
-0.006275177001953125,
0.0675048828125,
0.01346588134765625,
0.005580902099609375,
0.0029354095458984375,
-0.007114410400390625,
-0.037109375,
-0.04925537109375,
-0.05059814453125,
-0.02020263671875,
0.00969696044921875,
-0.0016155242919921875,
0.050262451171875,
0.021331787109375,
0.028564453125,
0.025604248046875,
-0.01617431640625,
0.01261138916015625,
-0.041900634765625,
-0.028717041015625,
0.02056884765625,
-0.0360107421875,
-0.052001953125,
0.0035953521728515625,
0.048095703125,
0.0252227783203125,
-0.01355743408203125,
0.021087646484375,
0.0217742919921875,
0.04461669921875,
-0.033447265625,
0.043182373046875,
-0.037933349609375,
-0.0065765380859375,
-0.006359100341796875,
-0.006671905517578125,
-0.0245361328125,
-0.03631591796875,
-0.0019741058349609375,
-0.035797119140625,
-0.0048675537109375,
0.007366180419921875,
0.09649658203125,
-0.011566162109375,
-0.019195556640625,
-0.01155853271484375,
-0.0423583984375,
0.0693359375,
-0.09075927734375,
0.01495361328125,
0.0236053466796875,
0.0295562744140625,
-0.022369384765625,
-0.04901123046875,
-0.039337158203125,
-0.0123138427734375,
-0.034576416015625,
0.024322509765625,
-0.0105438232421875,
-0.0016088485717773438,
0.01715087890625,
0.031646728515625,
-0.051788330078125,
-0.0021877288818359375,
-0.03924560546875,
-0.0202789306640625,
0.051971435546875,
0.013946533203125,
0.049713134765625,
-0.01065826416015625,
-0.03204345703125,
-0.0007672309875488281,
-0.036773681640625,
0.0232696533203125,
0.022735595703125,
0.022552490234375,
-0.055511474609375,
0.015411376953125,
0.000054895877838134766,
0.04217529296875,
0.00433349609375,
-0.0216827392578125,
0.03887939453125,
-0.044921875,
-0.02362060546875,
-0.021484375,
0.1015625,
0.02423095703125,
0.0006256103515625,
0.01021575927734375,
-0.00794219970703125,
-0.0081939697265625,
-0.010955810546875,
-0.06512451171875,
-0.01007843017578125,
0.0262298583984375,
-0.04156494140625,
-0.01306915283203125,
-0.0031185150146484375,
-0.03961181640625,
0.01355743408203125,
-0.0006804466247558594,
0.045501708984375,
-0.04400634765625,
-0.0360107421875,
0.0169525146484375,
-0.0004930496215820312,
0.026641845703125,
0.0255584716796875,
-0.0718994140625,
0.026031494140625,
0.0284881591796875,
0.0631103515625,
-0.0034275054931640625,
-0.0196685791015625,
-0.01486968994140625,
0.00775909423828125,
-0.01467132568359375,
0.0245513916015625,
-0.004749298095703125,
-0.038787841796875,
-0.0039005279541015625,
-0.00007241964340209961,
-0.0293426513671875,
-0.0263519287109375,
0.02642822265625,
-0.02667236328125,
0.05731201171875,
-0.0106048583984375,
-0.040863037109375,
-0.025390625,
0.029022216796875,
-0.0288543701171875,
0.0687255859375,
-0.006290435791015625,
-0.067138671875,
-0.00719451904296875,
-0.046356201171875,
-0.005279541015625,
0.0007967948913574219,
-0.0157470703125,
-0.033966064453125,
-0.020416259765625,
0.033721923828125,
0.0206146240234375,
-0.025634765625,
0.01253509521484375,
-0.0206298828125,
-0.0294647216796875,
0.0189056396484375,
-0.024871826171875,
0.0947265625,
0.0167236328125,
-0.0293121337890625,
0.0255584716796875,
-0.036407470703125,
0.023681640625,
0.020721435546875,
-0.0170440673828125,
-0.0025577545166015625,
-0.00464630126953125,
-0.006992340087890625,
0.033538818359375,
0.0408935546875,
-0.0230865478515625,
0.0134429931640625,
-0.049407958984375,
0.034332275390625,
0.053253173828125,
-0.007781982421875,
0.031707763671875,
-0.038787841796875,
0.0264434814453125,
0.0235137939453125,
0.0426025390625,
-0.018157958984375,
-0.04522705078125,
-0.0721435546875,
-0.0204315185546875,
0.019927978515625,
0.054779052734375,
-0.048004150390625,
0.062255859375,
-0.0170440673828125,
-0.045501708984375,
-0.03912353515625,
0.0140533447265625,
0.040069580078125,
0.024627685546875,
0.035430908203125,
-0.02313232421875,
-0.03204345703125,
-0.052520751953125,
-0.0055389404296875,
-0.03375244140625,
-0.00191497802734375,
0.0263671875,
0.0401611328125,
-0.0248260498046875,
0.06610107421875,
-0.035858154296875,
-0.030670166015625,
-0.0209503173828125,
0.0042266845703125,
0.032012939453125,
0.04595947265625,
0.0513916015625,
-0.05255126953125,
-0.06390380859375,
0.0026397705078125,
-0.066162109375,
0.01004791259765625,
0.00873565673828125,
-0.0236663818359375,
0.035491943359375,
0.0244140625,
-0.047637939453125,
0.03338623046875,
0.053253173828125,
-0.0277557373046875,
0.046295166015625,
-0.0208740234375,
-0.0009169578552246094,
-0.09368896484375,
0.004611968994140625,
-0.0144500732421875,
0.0023975372314453125,
-0.045623779296875,
-0.00994873046875,
0.0005130767822265625,
0.0112152099609375,
-0.042236328125,
0.07568359375,
-0.052276611328125,
0.0205230712890625,
-0.0011577606201171875,
0.01300811767578125,
-0.0155181884765625,
0.064208984375,
-0.0179290771484375,
0.041748046875,
0.055694580078125,
-0.039794921875,
0.0223236083984375,
0.0184783935546875,
-0.016571044921875,
-0.00205230712890625,
-0.061492919921875,
0.01153564453125,
0.00013399124145507812,
0.0221405029296875,
-0.093017578125,
-0.0017404556274414062,
0.04913330078125,
-0.0592041015625,
0.0195465087890625,
-0.01560211181640625,
-0.03204345703125,
-0.037506103515625,
-0.0338134765625,
0.0166168212890625,
0.0640869140625,
-0.0235137939453125,
0.048370361328125,
0.0266571044921875,
0.0016336441040039062,
-0.043548583984375,
-0.039398193359375,
-0.01220703125,
-0.0177764892578125,
-0.06683349609375,
0.0162200927734375,
-0.021026611328125,
0.004177093505859375,
-0.0060577392578125,
0.01337432861328125,
0.0033817291259765625,
-0.005123138427734375,
0.019195556640625,
0.036346435546875,
-0.00926971435546875,
-0.00679779052734375,
-0.019317626953125,
-0.00545501708984375,
0.00594329833984375,
-0.0109710693359375,
0.049072265625,
-0.0311279296875,
-0.03228759765625,
-0.03961181640625,
0.0106201171875,
0.03265380859375,
-0.01468658447265625,
0.06256103515625,
0.07421875,
-0.0160980224609375,
0.01110076904296875,
-0.05023193359375,
-0.0138092041015625,
-0.04083251953125,
0.01806640625,
-0.00516510009765625,
-0.072021484375,
0.07177734375,
0.027587890625,
0.023040771484375,
0.04205322265625,
0.05169677734375,
0.010223388671875,
0.08709716796875,
0.032257080078125,
-0.028778076171875,
0.041229248046875,
-0.03228759765625,
0.0213165283203125,
-0.051971435546875,
-0.019439697265625,
-0.0333251953125,
-0.0162200927734375,
-0.0533447265625,
-0.0400390625,
0.025299072265625,
0.00946044921875,
-0.0238189697265625,
0.0076751708984375,
-0.03277587890625,
-0.0016622543334960938,
0.03753662109375,
0.0008082389831542969,
0.01383209228515625,
-0.0105438232421875,
-0.022125244140625,
0.003955841064453125,
-0.05810546875,
-0.035797119140625,
0.060089111328125,
0.03448486328125,
0.055694580078125,
0.019439697265625,
0.045501708984375,
-0.00807952880859375,
0.0296173095703125,
-0.043548583984375,
0.0487060546875,
0.006122589111328125,
-0.053253173828125,
-0.031890869140625,
-0.042236328125,
-0.07501220703125,
0.04193115234375,
-0.0050506591796875,
-0.07171630859375,
0.00009208917617797852,
0.01367950439453125,
-0.018829345703125,
0.0223236083984375,
-0.06256103515625,
0.06524658203125,
-0.0276641845703125,
-0.015838623046875,
0.0019207000732421875,
-0.0545654296875,
0.038787841796875,
0.0227508544921875,
0.0340576171875,
-0.0266265869140625,
0.0034923553466796875,
0.05133056640625,
-0.0389404296875,
0.0703125,
-0.0204315185546875,
-0.0082550048828125,
0.036895751953125,
-0.01180267333984375,
0.042572021484375,
0.007793426513671875,
0.01641845703125,
0.02691650390625,
0.0020542144775390625,
-0.035858154296875,
-0.04229736328125,
0.050384521484375,
-0.06591796875,
-0.05902099609375,
-0.03778076171875,
-0.032562255859375,
-0.01317596435546875,
0.0225067138671875,
0.02667236328125,
0.0203399658203125,
-0.0019483566284179688,
0.01418304443359375,
0.0275115966796875,
-0.03765869140625,
0.049774169921875,
0.0418701171875,
-0.045379638671875,
-0.039276123046875,
0.060272216796875,
0.01114654541015625,
0.03399658203125,
0.01531982421875,
0.01195526123046875,
-0.0284271240234375,
-0.037322998046875,
-0.030181884765625,
0.03228759765625,
-0.033172607421875,
-0.00281524658203125,
-0.0491943359375,
-0.0430908203125,
-0.0491943359375,
0.0117034912109375,
-0.023895263671875,
-0.0031337738037109375,
-0.0247802734375,
0.004199981689453125,
0.025665283203125,
0.010009765625,
-0.00048041343688964844,
0.01070404052734375,
-0.0758056640625,
0.0208740234375,
0.0204620361328125,
0.0212860107421875,
0.01605224609375,
-0.054718017578125,
-0.03765869140625,
0.039794921875,
-0.0169830322265625,
-0.0343017578125,
0.0521240234375,
0.010009765625,
0.05377197265625,
0.0222015380859375,
-0.0007953643798828125,
0.053985595703125,
-0.0213775634765625,
0.07171630859375,
0.03192138671875,
-0.06842041015625,
0.03228759765625,
-0.042510986328125,
0.040130615234375,
0.01047515869140625,
0.031341552734375,
-0.047027587890625,
-0.034820556640625,
-0.05621337890625,
-0.0643310546875,
0.06427001953125,
0.033935546875,
0.035430908203125,
0.0017671585083007812,
-0.004261016845703125,
-0.02264404296875,
0.0113372802734375,
-0.06341552734375,
-0.048126220703125,
-0.01290130615234375,
-0.00817108154296875,
0.0006589889526367188,
-0.026397705078125,
-0.006198883056640625,
-0.03533935546875,
0.0582275390625,
-0.00354766845703125,
0.04473876953125,
0.0019521713256835938,
0.004199981689453125,
0.007320404052734375,
0.0172576904296875,
0.047088623046875,
0.056793212890625,
-0.0190582275390625,
-0.01434326171875,
0.0195465087890625,
-0.042816162109375,
-0.003948211669921875,
0.00392913818359375,
-0.02386474609375,
0.0110321044921875,
0.0189056396484375,
0.0849609375,
0.016510009765625,
-0.035400390625,
0.04473876953125,
-0.0258636474609375,
-0.02825927734375,
-0.0262298583984375,
0.01800537109375,
0.0251922607421875,
0.00954437255859375,
0.04248046875,
-0.0273590087890625,
-0.00933074951171875,
-0.048614501953125,
-0.0064697265625,
0.0413818359375,
-0.024017333984375,
-0.0284576416015625,
0.0509033203125,
0.013763427734375,
-0.0021800994873046875,
0.035491943359375,
-0.0177001953125,
-0.045379638671875,
0.040313720703125,
0.0350341796875,
0.06256103515625,
-0.0217437744140625,
0.0081939697265625,
0.05621337890625,
0.0153350830078125,
-0.0158233642578125,
0.025909423828125,
0.0174713134765625,
-0.06121826171875,
-0.0258941650390625,
-0.0435791015625,
-0.01318359375,
0.0171356201171875,
-0.036956787109375,
0.0254058837890625,
-0.032501220703125,
-0.022552490234375,
-0.013824462890625,
0.0056304931640625,
-0.0300140380859375,
0.0123443603515625,
0.0060577392578125,
0.049560546875,
-0.036102294921875,
0.06353759765625,
0.03680419921875,
-0.03582763671875,
-0.07440185546875,
-0.01934814453125,
0.005428314208984375,
-0.05560302734375,
0.0343017578125,
0.0029277801513671875,
0.0014410018920898438,
-0.0004391670227050781,
-0.04339599609375,
-0.0770263671875,
0.08599853515625,
0.0265655517578125,
-0.0275115966796875,
-0.01219940185546875,
0.002971649169921875,
0.049530029296875,
-0.014556884765625,
0.042510986328125,
0.01232147216796875,
0.028289794921875,
0.019287109375,
-0.09796142578125,
0.0135650634765625,
-0.045989990234375,
0.01129150390625,
0.0017805099487304688,
-0.07952880859375,
0.08551025390625,
-0.005931854248046875,
-0.029022216796875,
-0.0088958740234375,
0.053680419921875,
0.0205535888671875,
-0.0018415451049804688,
0.027923583984375,
0.0284271240234375,
0.03826904296875,
-0.0224151611328125,
0.062408447265625,
-0.03607177734375,
0.05535888671875,
0.0716552734375,
0.007518768310546875,
0.0498046875,
0.0168609619140625,
-0.0285186767578125,
0.036773681640625,
0.0406494140625,
-0.0074920654296875,
0.0290985107421875,
0.004261016845703125,
-0.02313232421875,
-0.010162353515625,
0.01088714599609375,
-0.052398681640625,
0.0164031982421875,
0.0338134765625,
-0.01178741455078125,
-0.01236724853515625,
-0.0054779052734375,
0.0164337158203125,
-0.0233154296875,
-0.0158538818359375,
0.0654296875,
0.02227783203125,
-0.047149658203125,
0.0860595703125,
0.0006022453308105469,
0.07275390625,
-0.06634521484375,
0.006633758544921875,
-0.0211944580078125,
0.0101318359375,
-0.0166473388671875,
-0.037872314453125,
0.00830841064453125,
-0.016204833984375,
0.0017728805541992188,
-0.007648468017578125,
0.059661865234375,
-0.0443115234375,
-0.030242919921875,
0.038665771484375,
0.034698486328125,
0.00786590576171875,
0.0114898681640625,
-0.07061767578125,
0.009002685546875,
0.0140533447265625,
-0.03594970703125,
0.036285400390625,
0.028045654296875,
0.0018224716186523438,
0.06298828125,
0.052337646484375,
-0.0007224082946777344,
0.01120758056640625,
0.0010194778442382812,
0.065673828125,
-0.05169677734375,
-0.04351806640625,
-0.07440185546875,
0.055419921875,
-0.00678253173828125,
-0.0169525146484375,
0.07879638671875,
0.040191650390625,
0.0662841796875,
-0.005031585693359375,
0.06378173828125,
-0.01715087890625,
0.038055419921875,
-0.03204345703125,
0.060943603515625,
-0.037353515625,
0.014739990234375,
-0.021026611328125,
-0.041046142578125,
-0.015869140625,
0.042022705078125,
-0.028411865234375,
0.027099609375,
0.046600341796875,
0.066162109375,
0.0162811279296875,
-0.0144195556640625,
0.0223236083984375,
0.025848388671875,
0.03033447265625,
0.06463623046875,
0.04229736328125,
-0.05059814453125,
0.052764892578125,
-0.0179290771484375,
0.0013179779052734375,
-0.0267791748046875,
-0.046356201171875,
-0.0870361328125,
-0.041595458984375,
-0.01934814453125,
-0.028106689453125,
-0.00897216796875,
0.0689697265625,
0.044464111328125,
-0.054229736328125,
-0.0418701171875,
0.0182647705078125,
0.0173797607421875,
-0.0186309814453125,
-0.0171356201171875,
0.034820556640625,
-0.032684326171875,
-0.06536865234375,
-0.00862884521484375,
0.01506805419921875,
0.02294921875,
-0.0196533203125,
-0.019287109375,
-0.03265380859375,
0.0012903213500976562,
0.03765869140625,
0.024932861328125,
-0.058837890625,
-0.0107421875,
0.00498199462890625,
-0.0350341796875,
0.01507568359375,
0.0101470947265625,
-0.030548095703125,
0.027587890625,
0.04595947265625,
0.01227569580078125,
0.047271728515625,
-0.005046844482421875,
0.03466796875,
-0.036956787109375,
0.02825927734375,
0.00460052490234375,
0.0233612060546875,
0.00867462158203125,
-0.013580322265625,
0.039764404296875,
0.017669677734375,
-0.0302886962890625,
-0.05902099609375,
-0.01702880859375,
-0.07830810546875,
-0.006191253662109375,
0.09674072265625,
-0.0267791748046875,
-0.0259246826171875,
0.0003459453582763672,
-0.03436279296875,
0.0229339599609375,
-0.03070068359375,
0.06292724609375,
0.062744140625,
0.0010128021240234375,
-0.0143585205078125,
-0.04302978515625,
0.042022705078125,
0.0243988037109375,
-0.065673828125,
-0.010162353515625,
0.0295562744140625,
0.02386474609375,
0.00916290283203125,
0.07354736328125,
-0.015350341796875,
0.0201416015625,
-0.0240020751953125,
0.01812744140625,
-0.00975799560546875,
0.021026611328125,
-0.01030731201171875,
-0.007232666015625,
-0.00650787353515625,
-0.0216217041015625
]
] |
openlm-research/open_llama_3b_v2 | 2023-07-16T08:32:00.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"dataset:tiiuae/falcon-refinedweb",
"dataset:bigcode/starcoderdata",
"dataset:togethercomputer/RedPajama-Data-1T",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | openlm-research | null | null | openlm-research/open_llama_3b_v2 | 69 | 99,894 | transformers | 2023-07-16T00:39:43 | ---
license: apache-2.0
datasets:
- tiiuae/falcon-refinedweb
- bigcode/starcoderdata
- togethercomputer/RedPajama-Data-1T
---
# OpenLLaMA: An Open Reproduction of LLaMA
**TL;DR**: we are releasing our public preview of OpenLLaMA, a permissively licensed open-source reproduction of Meta AI’s LLaMA. We are releasing a series of 3B, 7B and 13B models trained on different data mixtures. Our model weights can serve as a drop-in replacement for LLaMA in existing implementations.
In this repo, we present a permissively licensed open-source reproduction of Meta AI's [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/) large language model. We are releasing a series of 3B, 7B and 13B models trained on 1T tokens. We provide PyTorch and JAX weights of pre-trained OpenLLaMA models, as well as evaluation results and a comparison against the original LLaMA models. The v2 models improve on the older v1 models and are trained on a different data mixture. Please see the [project homepage of OpenLLaMA](https://github.com/openlm-research/open_llama) for more details.
## Weights Release, License and Usage
We release the weights in two formats: an EasyLM format to be used with our [EasyLM framework](https://github.com/young-geng/EasyLM), and a PyTorch format to be used with the [Hugging Face transformers](https://huggingface.co/docs/transformers/index) library. Both our training framework EasyLM and the checkpoint weights are licensed permissively under the Apache 2.0 license.
### Loading the Weights with Hugging Face Transformers
Preview checkpoints can be loaded directly from the Hugging Face Hub. **Please note that it is advised to avoid using the Hugging Face fast tokenizer for now, as we’ve observed that** [**the auto-converted fast tokenizer sometimes gives incorrect tokenizations**](https://github.com/huggingface/transformers/issues/24233)**.** This can be achieved by using the `LlamaTokenizer` class directly, or by passing the `use_fast=False` option to the `AutoTokenizer` class. See the following example for usage.
```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM
## v2 models
model_path = 'openlm-research/open_llama_3b_v2'
# model_path = 'openlm-research/open_llama_7b_v2'
## v1 models
# model_path = 'openlm-research/open_llama_3b'
# model_path = 'openlm-research/open_llama_7b'
# model_path = 'openlm-research/open_llama_13b'
tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(
model_path, torch_dtype=torch.float16, device_map='auto',
)
prompt = 'Q: What is the largest animal?\nA:'
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
generation_output = model.generate(
input_ids=input_ids, max_new_tokens=32
)
print(tokenizer.decode(generation_output[0]))
```
For more advanced usage, please follow the [transformers LLaMA documentation](https://huggingface.co/docs/transformers/main/model_doc/llama).
### Evaluating with LM-Eval-Harness
The model can be evaluated with [lm-eval-harness](https://github.com/EleutherAI/lm-evaluation-harness). However, due to the aforementioned tokenizer issue, we need to avoid using the fast tokenizer to obtain the correct results. This can be achieved by passing in `use_fast=False` to [this part of lm-eval-harness](https://github.com/EleutherAI/lm-evaluation-harness/blob/4b701e228768052cfae9043dca13e82052ca5eea/lm_eval/models/huggingface.py#LL313C9-L316C10), as shown in the example below:
```python
tokenizer = self.AUTO_TOKENIZER_CLASS.from_pretrained(
pretrained if tokenizer is None else tokenizer,
revision=revision + ("/" + subfolder if subfolder is not None else ""),
use_fast=False
)
```
### Loading the Weights with EasyLM
To use the weights in our EasyLM framework, please refer to the [LLaMA documentation of EasyLM](https://github.com/young-geng/EasyLM/blob/main/docs/llama.md). Note that unlike the original LLaMA model, our OpenLLaMA tokenizer and weights are trained completely from scratch, so there is no need to obtain the original LLaMA tokenizer and weights.
## Dataset and Training
The v1 models are trained on the [RedPajama dataset](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T). The v2 models are trained on a mixture of the [Falcon refined-web dataset](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), the [StarCoder dataset](https://huggingface.co/datasets/bigcode/starcoderdata), and the Wikipedia, arXiv, books and StackExchange parts of the [RedPajama dataset](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T). We follow exactly the same preprocessing steps and training hyperparameters as the original LLaMA paper, including model architecture, context length, training steps, learning rate schedule, and optimizer. The only difference between our setting and the original one is the dataset used: OpenLLaMA employs open datasets rather than the one utilized by the original LLaMA.
We train the models on cloud TPU-v4s using [EasyLM](https://github.com/young-geng/EasyLM), a JAX-based training pipeline we developed for training and fine-tuning large language models. We employ a combination of normal data parallelism and [fully sharded data parallelism](https://engineering.fb.com/2021/07/15/open-source/fsdp/) (also known as ZeRO stage 3) to balance training throughput and memory usage. Overall we reach a throughput of over 2200 tokens/second/TPU-v4 chip for our 7B model.
## Evaluation
We evaluated OpenLLaMA on a wide range of tasks using [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness). The LLaMA results are generated by running the original LLaMA model through the same evaluation pipeline. We note that our results for the LLaMA model differ slightly from the original LLaMA paper, which we believe is a result of different evaluation protocols. Similar differences have been reported in [this issue of lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness/issues/443). Additionally, we present the results of GPT-J, a 6B-parameter model trained on the [Pile](https://pile.eleuther.ai/) dataset by [EleutherAI](https://www.eleuther.ai/).
The original LLaMA model was trained for 1 trillion tokens and GPT-J was trained for 500 billion tokens. We present the results in the table below. OpenLLaMA exhibits comparable performance to the original LLaMA and GPT-J across a majority of tasks, and outperforms them in some tasks.
| **Task/Metric** | GPT-J 6B | LLaMA 7B | LLaMA 13B | OpenLLaMA 3Bv2 | OpenLLaMA 7Bv2 | OpenLLaMA 3B | OpenLLaMA 7B | OpenLLaMA 13B |
| ---------------------- | -------- | -------- | --------- | -------------- | -------------- | ------------ | ------------ | ------------- |
| anli_r1/acc | 0.32 | 0.35 | 0.35 | 0.33 | 0.34 | 0.33 | 0.33 | 0.33 |
| anli_r2/acc | 0.34 | 0.34 | 0.36 | 0.36 | 0.35 | 0.32 | 0.36 | 0.33 |
| anli_r3/acc | 0.35 | 0.37 | 0.39 | 0.38 | 0.39 | 0.35 | 0.38 | 0.40 |
| arc_challenge/acc | 0.34 | 0.39 | 0.44 | 0.34 | 0.39 | 0.34 | 0.37 | 0.41 |
| arc_challenge/acc_norm | 0.37 | 0.41 | 0.44 | 0.36 | 0.41 | 0.37 | 0.38 | 0.44 |
| arc_easy/acc | 0.67 | 0.68 | 0.75 | 0.68 | 0.73 | 0.69 | 0.72 | 0.75 |
| arc_easy/acc_norm | 0.62 | 0.52 | 0.59 | 0.63 | 0.70 | 0.65 | 0.68 | 0.70 |
| boolq/acc | 0.66 | 0.75 | 0.71 | 0.66 | 0.72 | 0.68 | 0.71 | 0.75 |
| hellaswag/acc | 0.50 | 0.56 | 0.59 | 0.52 | 0.56 | 0.49 | 0.53 | 0.56 |
| hellaswag/acc_norm | 0.66 | 0.73 | 0.76 | 0.70 | 0.75 | 0.67 | 0.72 | 0.76 |
| openbookqa/acc | 0.29 | 0.29 | 0.31 | 0.26 | 0.30 | 0.27 | 0.30 | 0.31 |
| openbookqa/acc_norm | 0.38 | 0.41 | 0.42 | 0.38 | 0.41 | 0.40 | 0.40 | 0.43 |
| piqa/acc | 0.75 | 0.78 | 0.79 | 0.77 | 0.79 | 0.75 | 0.76 | 0.77 |
| piqa/acc_norm | 0.76 | 0.78 | 0.79 | 0.78 | 0.80 | 0.76 | 0.77 | 0.79 |
| record/em | 0.88 | 0.91 | 0.92 | 0.87 | 0.89 | 0.88 | 0.89 | 0.91 |
| record/f1 | 0.89 | 0.91 | 0.92 | 0.88 | 0.89 | 0.89 | 0.90 | 0.91 |
| rte/acc | 0.54 | 0.56 | 0.69 | 0.55 | 0.57 | 0.58 | 0.60 | 0.64 |
| truthfulqa_mc/mc1 | 0.20 | 0.21 | 0.25 | 0.22 | 0.23 | 0.22 | 0.23 | 0.25 |
| truthfulqa_mc/mc2 | 0.36 | 0.34 | 0.40 | 0.35 | 0.35 | 0.35 | 0.35 | 0.38 |
| wic/acc | 0.50 | 0.50 | 0.50 | 0.50 | 0.50 | 0.48 | 0.51 | 0.47 |
| winogrande/acc | 0.64 | 0.68 | 0.70 | 0.63 | 0.66 | 0.62 | 0.67 | 0.70 |
| Average | 0.52 | 0.55 | 0.57 | 0.53 | 0.56 | 0.53 | 0.55 | 0.57 |
We removed the tasks CB and WSC from our benchmark, as our model performs suspiciously well on these two tasks. We hypothesize that there could be benchmark data contamination in the training set.
## Contact
We would love to get feedback from the community. If you have any questions, please open an issue or contact us.
OpenLLaMA is developed by:
[Xinyang Geng](https://young-geng.xyz/)* and [Hao Liu](https://www.haoliu.site/)* from Berkeley AI Research.
*Equal Contribution
## Acknowledgment
We thank the [Google TPU Research Cloud](https://sites.research.google/trc/about/) program for providing part of the computation resources. We’d especially like to thank Jonathan Caton from TPU Research Cloud for helping us organize compute resources, and Rafi Witten from the Google Cloud team and James Bradbury from the Google JAX team for helping us optimize our training throughput. We’d also like to thank Charlie Snell, Gautier Izacard, Eric Wallace, Lianmin Zheng and our user community for the discussions and feedback.
The OpenLLaMA 13B v1 model was trained in collaboration with [Stability AI](https://stability.ai/), and we thank Stability AI for providing the computation resources. We’d especially like to thank David Ha and Shivanshu Purohit for coordinating the logistics and providing engineering support.
## Reference
If you find OpenLLaMA useful in your research or applications, please cite it using the following BibTeX:
```
@software{openlm2023openllama,
author = {Geng, Xinyang and Liu, Hao},
title = {OpenLLaMA: An Open Reproduction of LLaMA},
  month = may,
year = 2023,
url = {https://github.com/openlm-research/open_llama}
}
```
```
@software{together2023redpajama,
author = {Together Computer},
title = {RedPajama-Data: An Open Source Recipe to Reproduce LLaMA training dataset},
  month = apr,
year = 2023,
url = {https://github.com/togethercomputer/RedPajama-Data}
}
```
```
@article{touvron2023llama,
title={Llama: Open and efficient foundation language models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and others},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
```
| 12,194 | [
[
-0.0254058837890625,
-0.052520751953125,
0.0185546875,
0.032989501953125,
-0.0177459716796875,
-0.004131317138671875,
-0.0229339599609375,
-0.04718017578125,
0.024658203125,
0.0211944580078125,
-0.034149169921875,
-0.045013427734375,
-0.04833984375,
0.00630950927734375,
-0.0196685791015625,
0.09051513671875,
-0.0255279541015625,
-0.013031005859375,
-0.00522613525390625,
-0.0234222412109375,
-0.014434814453125,
-0.02996826171875,
-0.046417236328125,
-0.0297393798828125,
0.031158447265625,
0.01085662841796875,
0.04473876953125,
0.0391845703125,
0.041900634765625,
0.023681640625,
-0.025665283203125,
0.018829345703125,
-0.037811279296875,
-0.01910400390625,
0.018524169921875,
-0.038909912109375,
-0.05096435546875,
0.0032329559326171875,
0.038787841796875,
0.0236053466796875,
-0.0231475830078125,
0.043609619140625,
-0.002880096435546875,
0.03643798828125,
-0.04205322265625,
0.024505615234375,
-0.041748046875,
0.011383056640625,
-0.0225067138671875,
-0.00003439188003540039,
-0.0173797607421875,
-0.027496337890625,
-0.0134735107421875,
-0.05499267578125,
0.0025424957275390625,
-0.00020754337310791016,
0.08831787109375,
0.030242919921875,
-0.0175323486328125,
-0.0152740478515625,
-0.034027099609375,
0.060028076171875,
-0.0599365234375,
0.01042938232421875,
0.0272674560546875,
0.018157958984375,
-0.0096435546875,
-0.064697265625,
-0.051513671875,
-0.01386260986328125,
-0.0078125,
0.00860595703125,
-0.02642822265625,
-0.00925445556640625,
0.0254974365234375,
0.040740966796875,
-0.033782958984375,
0.0212860107421875,
-0.04351806640625,
-0.01029205322265625,
0.054718017578125,
0.02001953125,
0.00954437255859375,
-0.00911712646484375,
-0.039581298828125,
-0.0161590576171875,
-0.056427001953125,
0.0258636474609375,
0.01061248779296875,
0.0211639404296875,
-0.038543701171875,
0.049041748046875,
-0.02001953125,
0.035736083984375,
0.00910186767578125,
-0.036956787109375,
0.05108642578125,
-0.033050537109375,
-0.032989501953125,
-0.001190185546875,
0.06634521484375,
0.0283203125,
0.0029239654541015625,
0.00826263427734375,
-0.01552581787109375,
-0.00531005859375,
-0.00800323486328125,
-0.0654296875,
0.0008320808410644531,
0.018402099609375,
-0.037017822265625,
-0.0302886962890625,
0.0016241073608398438,
-0.042510986328125,
-0.01302337646484375,
-0.007106781005859375,
0.031951904296875,
-0.01041412353515625,
-0.0200653076171875,
0.0215301513671875,
0.01293182373046875,
0.031829833984375,
0.03363037109375,
-0.052154541015625,
0.0172271728515625,
0.040313720703125,
0.06903076171875,
-0.0015506744384765625,
-0.031280517578125,
-0.0247955322265625,
-0.0007028579711914062,
-0.0171051025390625,
0.036163330078125,
-0.01149749755859375,
-0.02166748046875,
-0.01035308837890625,
0.006290435791015625,
-0.0178375244140625,
-0.0401611328125,
0.037994384765625,
-0.0302734375,
0.016510009765625,
-0.01351165771484375,
-0.01468658447265625,
-0.025238037109375,
0.0160064697265625,
-0.042510986328125,
0.09576416015625,
0.00841522216796875,
-0.053436279296875,
0.0198211669921875,
-0.061126708984375,
-0.00887298583984375,
-0.02117919921875,
0.01166534423828125,
-0.050079345703125,
-0.0009255409240722656,
0.032440185546875,
0.033538818359375,
-0.032989501953125,
0.01419830322265625,
-0.0225372314453125,
-0.035552978515625,
0.015472412109375,
-0.01279449462890625,
0.078125,
0.0218353271484375,
-0.037109375,
0.0225372314453125,
-0.0621337890625,
-0.005100250244140625,
0.04815673828125,
-0.039031982421875,
-0.00036406517028808594,
-0.01953125,
-0.0013446807861328125,
0.009185791015625,
0.03338623046875,
-0.042388916015625,
0.032623291015625,
-0.0228118896484375,
0.03857421875,
0.06512451171875,
-0.01458740234375,
0.01123046875,
-0.033203125,
0.032745361328125,
0.01107025146484375,
0.0234222412109375,
-0.01435089111328125,
-0.0499267578125,
-0.07965087890625,
-0.03826904296875,
0.00992584228515625,
0.0301055908203125,
-0.0247802734375,
0.0372314453125,
-0.01145172119140625,
-0.052642822265625,
-0.057769775390625,
0.0158233642578125,
0.034210205078125,
0.02978515625,
0.036468505859375,
-0.021240234375,
-0.047454833984375,
-0.0599365234375,
0.005214691162109375,
-0.0235443115234375,
0.01090240478515625,
0.0199432373046875,
0.056182861328125,
-0.0300750732421875,
0.06365966796875,
-0.037811279296875,
-0.027313232421875,
-0.01490020751953125,
-0.01062774658203125,
0.044464111328125,
0.0340576171875,
0.05560302734375,
-0.031707763671875,
-0.0286102294921875,
0.0008783340454101562,
-0.0654296875,
-0.0036258697509765625,
0.001712799072265625,
-0.01052093505859375,
0.0228118896484375,
0.01273345947265625,
-0.06732177734375,
0.044097900390625,
0.043609619140625,
-0.0259246826171875,
0.039886474609375,
-0.0046844482421875,
-0.0029239654541015625,
-0.07269287109375,
0.0202789306640625,
-0.007274627685546875,
-0.00992584228515625,
-0.034576416015625,
0.028594970703125,
0.0027561187744140625,
0.0020771026611328125,
-0.054443359375,
0.052825927734375,
-0.0237884521484375,
-0.0086669921875,
0.0010385513305664062,
-0.0025806427001953125,
-0.003513336181640625,
0.052886962890625,
-0.0128326416015625,
0.06634521484375,
0.033477783203125,
-0.033905029296875,
0.02337646484375,
0.025054931640625,
-0.033050537109375,
0.0162200927734375,
-0.06341552734375,
0.0218048095703125,
0.00034618377685546875,
0.039520263671875,
-0.07025146484375,
-0.0164642333984375,
0.039642333984375,
-0.024627685546875,
0.01134490966796875,
0.006046295166015625,
-0.039520263671875,
-0.050323486328125,
-0.044219970703125,
0.0292816162109375,
0.046600341796875,
-0.0577392578125,
0.0167083740234375,
0.0125274658203125,
0.0151824951171875,
-0.051483154296875,
-0.05126953125,
-0.010009765625,
-0.0257415771484375,
-0.043243408203125,
0.0226593017578125,
-0.00923919677734375,
-0.01175689697265625,
-0.00670623779296875,
-0.00557708740234375,
0.0021686553955078125,
0.0182037353515625,
0.0271148681640625,
0.02288818359375,
-0.0243682861328125,
-0.010009765625,
-0.0056304931640625,
-0.00714111328125,
-0.006450653076171875,
0.0021343231201171875,
0.052520751953125,
-0.033660888671875,
-0.033721923828125,
-0.050018310546875,
-0.011138916015625,
0.038909912109375,
-0.0219268798828125,
0.06683349609375,
0.057037353515625,
-0.027313232421875,
0.014892578125,
-0.040191650390625,
0.0074462890625,
-0.036163330078125,
0.021270751953125,
-0.028106689453125,
-0.06671142578125,
0.0438232421875,
0.0165863037109375,
0.0216217041015625,
0.0557861328125,
0.058990478515625,
0.007053375244140625,
0.06439208984375,
0.034942626953125,
-0.0178985595703125,
0.0236663818359375,
-0.04412841796875,
-0.004291534423828125,
-0.07781982421875,
-0.036346435546875,
-0.038665771484375,
-0.0287322998046875,
-0.0284881591796875,
-0.04205322265625,
0.02459716796875,
0.0271759033203125,
-0.047027587890625,
0.0286407470703125,
-0.040435791015625,
0.0213470458984375,
0.037384033203125,
0.01102447509765625,
0.026123046875,
0.00507354736328125,
-0.0074920654296875,
0.005649566650390625,
-0.038726806640625,
-0.036163330078125,
0.1058349609375,
0.041259765625,
0.05108642578125,
0.003986358642578125,
0.06378173828125,
0.0008335113525390625,
0.037750244140625,
-0.04229736328125,
0.041656494140625,
0.0176239013671875,
-0.041259765625,
-0.005550384521484375,
-0.0202484130859375,
-0.0772705078125,
0.03466796875,
-0.007305145263671875,
-0.06805419921875,
0.001369476318359375,
-0.00559234619140625,
-0.024749755859375,
0.033966064453125,
-0.032196044921875,
0.0504150390625,
-0.01904296875,
-0.0162200927734375,
-0.00984954833984375,
-0.036529541015625,
0.049713134765625,
-0.0165557861328125,
0.00789642333984375,
-0.015167236328125,
-0.019317626953125,
0.06695556640625,
-0.053070068359375,
0.06475830078125,
-0.016387939453125,
-0.008636474609375,
0.03729248046875,
-0.0172119140625,
0.043060302734375,
-0.001983642578125,
-0.0191802978515625,
0.0411376953125,
-0.01390838623046875,
-0.0301361083984375,
-0.0230865478515625,
0.053802490234375,
-0.08929443359375,
-0.05322265625,
-0.034942626953125,
-0.0226898193359375,
0.0129241943359375,
0.00846099853515625,
0.01389312744140625,
0.0029239654541015625,
0.0025920867919921875,
0.0158538818359375,
0.0269012451171875,
-0.02850341796875,
0.04248046875,
0.035797119140625,
-0.034942626953125,
-0.04205322265625,
0.0570068359375,
0.0002605915069580078,
0.00971221923828125,
0.0128631591796875,
0.0184783935546875,
-0.017852783203125,
-0.038818359375,
-0.043853759765625,
0.034637451171875,
-0.047637939453125,
-0.02850341796875,
-0.049346923828125,
-0.0213165283203125,
-0.0236663818359375,
-0.002819061279296875,
-0.0300140380859375,
-0.036285400390625,
-0.032257080078125,
-0.01052093505859375,
0.0455322265625,
0.0626220703125,
0.00270843505859375,
0.030670166015625,
-0.042510986328125,
0.01299285888671875,
0.01336669921875,
0.01531982421875,
0.01434326171875,
-0.04949951171875,
-0.0219573974609375,
0.004344940185546875,
-0.04473876953125,
-0.045257568359375,
0.025360107421875,
0.00962066650390625,
0.036529541015625,
0.0281982421875,
-0.00978851318359375,
0.07672119140625,
-0.0189971923828125,
0.07421875,
0.0271759033203125,
-0.06414794921875,
0.043060302734375,
-0.01528167724609375,
0.0137786865234375,
0.0328369140625,
0.0287628173828125,
-0.0223388671875,
-0.020751953125,
-0.047454833984375,
-0.06597900390625,
0.06634521484375,
0.020904541015625,
-0.0010881423950195312,
0.00885772705078125,
0.0236053466796875,
0.006153106689453125,
0.02117919921875,
-0.08062744140625,
-0.0291290283203125,
-0.017303466796875,
-0.00972747802734375,
-0.01270294189453125,
-0.005615234375,
-0.01380157470703125,
-0.0390625,
0.044097900390625,
0.00016427040100097656,
0.034454345703125,
0.013519287109375,
-0.0168914794921875,
-0.0202178955078125,
-0.005451202392578125,
0.056427001953125,
0.047821044921875,
-0.0159759521484375,
-0.0170135498046875,
0.0301666259765625,
-0.037078857421875,
0.012420654296875,
0.0012369155883789062,
-0.018829345703125,
-0.008544921875,
0.035308837890625,
0.078369140625,
0.01849365234375,
-0.043121337890625,
0.043487548828125,
0.005023956298828125,
-0.0159454345703125,
-0.024749755859375,
0.003093719482421875,
0.01081085205078125,
0.0243682861328125,
0.02886962890625,
-0.0031070709228515625,
-0.0152740478515625,
-0.036590576171875,
-0.0074462890625,
0.029937744140625,
-0.0006890296936035156,
-0.0291900634765625,
0.06988525390625,
0.010223388671875,
-0.0194549560546875,
0.037567138671875,
0.004550933837890625,
-0.03619384765625,
0.06341552734375,
0.047607421875,
0.05877685546875,
-0.0128631591796875,
0.0016469955444335938,
0.04168701171875,
0.027069091796875,
-0.006267547607421875,
0.01544952392578125,
-0.007572174072265625,
-0.0290985107421875,
-0.0212554931640625,
-0.07147216796875,
-0.025360107421875,
0.015869140625,
-0.040191650390625,
0.0257415771484375,
-0.04302978515625,
-0.01215362548828125,
-0.0299835205078125,
0.01904296875,
-0.06903076171875,
0.00449371337890625,
0.00795745849609375,
0.07421875,
-0.05047607421875,
0.059814453125,
0.04736328125,
-0.045196533203125,
-0.0733642578125,
-0.01538848876953125,
0.0000871419906616211,
-0.09283447265625,
0.05682373046875,
0.0282440185546875,
0.01116180419921875,
-0.00850677490234375,
-0.0369873046875,
-0.08575439453125,
0.111572265625,
0.0196990966796875,
-0.0364990234375,
0.0005640983581542969,
0.01433563232421875,
0.03814697265625,
-0.0213470458984375,
0.044403076171875,
0.033966064453125,
0.042633056640625,
0.0003273487091064453,
-0.09197998046875,
0.0213470458984375,
-0.023681640625,
-0.0015954971313476562,
0.0072174072265625,
-0.08062744140625,
0.0880126953125,
-0.027130126953125,
-0.004085540771484375,
0.0267333984375,
0.052886962890625,
0.041168212890625,
0.03466796875,
0.03131103515625,
0.0670166015625,
0.06256103515625,
-0.01004791259765625,
0.08660888671875,
-0.01500701904296875,
0.045196533203125,
0.0556640625,
-0.0134124755859375,
0.0660400390625,
0.03167724609375,
-0.042083740234375,
0.039031982421875,
0.06317138671875,
-0.00011020898818969727,
0.0256500244140625,
0.0162200927734375,
-0.006443023681640625,
0.0059967041015625,
0.0018053054809570312,
-0.059814453125,
0.03631591796875,
0.01097869873046875,
-0.025634765625,
-0.01332855224609375,
-0.00506591796875,
0.019561767578125,
-0.0231475830078125,
-0.026519775390625,
0.042266845703125,
0.004650115966796875,
-0.035430908203125,
0.07916259765625,
0.01544189453125,
0.0738525390625,
-0.0474853515625,
0.01549530029296875,
-0.0240325927734375,
0.012725830078125,
-0.033233642578125,
-0.038848876953125,
0.00916290283203125,
0.0152130126953125,
0.01274871826171875,
-0.007904052734375,
0.039825439453125,
-0.0089569091796875,
-0.035858154296875,
0.0223541259765625,
0.0254669189453125,
0.0234832763671875,
0.01155853271484375,
-0.058563232421875,
0.03155517578125,
-0.0026702880859375,
-0.056671142578125,
0.035552978515625,
0.006916046142578125,
-0.004611968994140625,
0.053955078125,
0.06439208984375,
-0.0008997917175292969,
0.017974853515625,
-0.00809478759765625,
0.08074951171875,
-0.05169677734375,
-0.02044677734375,
-0.06488037109375,
0.038787841796875,
0.00550079345703125,
-0.044403076171875,
0.060333251953125,
0.048736572265625,
0.06524658203125,
-0.002635955810546875,
0.033111572265625,
-0.0084991455078125,
0.01776123046875,
-0.041412353515625,
0.05364990234375,
-0.05865478515625,
0.0139923095703125,
-0.0188140869140625,
-0.0740966796875,
-0.017730712890625,
0.057403564453125,
-0.0130615234375,
0.006343841552734375,
0.03826904296875,
0.057220458984375,
0.005199432373046875,
-0.0062408447265625,
-0.0035152435302734375,
0.0241241455078125,
0.0292510986328125,
0.06500244140625,
0.059326171875,
-0.05169677734375,
0.037933349609375,
-0.03082275390625,
-0.01461029052734375,
-0.033233642578125,
-0.055450439453125,
-0.0625,
-0.0290679931640625,
-0.022308349609375,
-0.0233917236328125,
-0.0025959014892578125,
0.08477783203125,
0.04205322265625,
-0.040435791015625,
-0.032501220703125,
0.00881195068359375,
0.00991058349609375,
-0.0158843994140625,
-0.01454925537109375,
0.033935546875,
-0.00848388671875,
-0.061920166015625,
0.029083251953125,
0.00171661376953125,
0.00811004638671875,
-0.024810791015625,
-0.024749755859375,
-0.01229095458984375,
-0.0015687942504882812,
0.04656982421875,
0.0283966064453125,
-0.06982421875,
-0.0231781005859375,
-0.0144805908203125,
-0.0197906494140625,
0.019683837890625,
0.0268402099609375,
-0.0631103515625,
0.01197052001953125,
0.017486572265625,
0.03839111328125,
0.06268310546875,
-0.00888824462890625,
0.0040435791015625,
-0.03033447265625,
0.03350830078125,
-0.00830841064453125,
0.033294677734375,
0.01123046875,
-0.0202789306640625,
0.06048583984375,
0.0233001708984375,
-0.03240966796875,
-0.08135986328125,
-0.0104522705078125,
-0.08612060546875,
0.00038242340087890625,
0.08294677734375,
-0.0195159912109375,
-0.037200927734375,
0.02606201171875,
-0.0258941650390625,
0.0180206298828125,
-0.0277557373046875,
0.051788330078125,
0.037994384765625,
-0.00922393798828125,
-0.0022983551025390625,
-0.040557861328125,
0.0148773193359375,
0.0247650146484375,
-0.06243896484375,
-0.0197906494140625,
0.011688232421875,
0.026153564453125,
0.01456451416015625,
0.06390380859375,
-0.00743865966796875,
0.0180511474609375,
-0.01239013671875,
0.00922393798828125,
-0.034820556640625,
-0.01033782958984375,
-0.0290069580078125,
0.01175689697265625,
0.002079010009765625,
-0.02398681640625
]
] |
liuhaotian/llava-v1.5-7b | 2023-10-22T05:16:14.000Z | [
"transformers",
"pytorch",
"llava",
"text-generation",
"has_space",
"region:us"
] | text-generation | liuhaotian | null | null | liuhaotian/llava-v1.5-7b | 108 | 99,810 | transformers | 2023-10-05T18:25:51 | ---
inference: false
---
<br>
<br>
# LLaVA Model Card
## Model details
**Model type:**
LLaVA is an open-source chatbot trained by fine-tuning LLaMA/Vicuna on GPT-generated multimodal instruction-following data.
It is an auto-regressive language model, based on the transformer architecture.
**Model date:**
LLaVA-v1.5-7B was trained in September 2023.
**Paper or resources for more information:**
https://llava-vl.github.io/
## License
Llama 2 is licensed under the LLAMA 2 Community License,
Copyright (c) Meta Platforms, Inc. All Rights Reserved.
**Where to send questions or comments about the model:**
https://github.com/haotian-liu/LLaVA/issues
## Intended use
**Primary intended uses:**
The primary use of LLaVA is research on large multimodal models and chatbots.
**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence.
## Training dataset
- 558K filtered image-text pairs from LAION/CC/SBU, captioned by BLIP.
- 158K GPT-generated multimodal instruction-following data.
- 450K academic-task-oriented VQA data mixture.
- 40K ShareGPT data.
## Evaluation dataset
A collection of 12 benchmarks, including 5 academic VQA benchmarks and 7 recent benchmarks specifically proposed for instruction-following LMMs. | 1,364 | [
[
-0.0001398324966430664,
-0.07037353515625,
0.0232391357421875,
0.0189056396484375,
-0.0311279296875,
0.0160369873046875,
-0.001155853271484375,
-0.0345458984375,
0.01873779296875,
0.041839599609375,
-0.0428466796875,
-0.041534423828125,
-0.041900634765625,
-0.01029205322265625,
-0.031463623046875,
0.07122802734375,
0.0040130615234375,
-0.007747650146484375,
-0.0222930908203125,
0.010894775390625,
-0.0592041015625,
-0.0312042236328125,
-0.042724609375,
-0.0240325927734375,
0.043060302734375,
0.04229736328125,
0.043548583984375,
0.0382080078125,
0.0313720703125,
0.0258331298828125,
-0.0018215179443359375,
0.0183563232421875,
-0.04632568359375,
0.0019464492797851562,
0.0214996337890625,
-0.0545654296875,
-0.05523681640625,
-0.0161590576171875,
0.040496826171875,
-0.007289886474609375,
-0.02587890625,
0.024749755859375,
0.00019097328186035156,
0.025177001953125,
-0.01715087890625,
0.047637939453125,
-0.0625,
-0.0180511474609375,
-0.0300140380859375,
-0.006999969482421875,
-0.0345458984375,
-0.0186004638671875,
-0.0229949951171875,
-0.04119873046875,
-0.0199127197265625,
0.00809478759765625,
0.0770263671875,
0.039886474609375,
-0.0265350341796875,
-0.01558685302734375,
-0.04986572265625,
0.053558349609375,
-0.04949951171875,
0.0208740234375,
0.03515625,
0.055694580078125,
-0.0115509033203125,
-0.049346923828125,
-0.048187255859375,
-0.0188446044921875,
0.002696990966796875,
0.01186370849609375,
-0.03533935546875,
0.0005960464477539062,
0.00542449951171875,
0.0248870849609375,
-0.03240966796875,
0.0194244384765625,
-0.04437255859375,
-0.0016660690307617188,
0.043243408203125,
0.0250244140625,
0.0106964111328125,
-0.01384735107421875,
-0.034454345703125,
-0.0095367431640625,
-0.03729248046875,
-0.001300811767578125,
0.0433349609375,
0.0166473388671875,
-0.031402587890625,
0.057037353515625,
-0.0153961181640625,
0.02972412109375,
-0.0018358230590820312,
-0.031341552734375,
0.035675048828125,
-0.00902557373046875,
-0.03741455078125,
-0.0218505859375,
0.070556640625,
0.018890380859375,
0.01544952392578125,
0.015533447265625,
-0.0185546875,
0.007625579833984375,
0.0204010009765625,
-0.03814697265625,
-0.006160736083984375,
0.0146942138671875,
-0.0306854248046875,
-0.037750244140625,
-0.0498046875,
-0.053009033203125,
-0.0250091552734375,
-0.0158538818359375,
0.01629638671875,
-0.026763916015625,
-0.0207061767578125,
-0.01338958740234375,
0.03155517578125,
0.042938232421875,
0.035125732421875,
-0.06512451171875,
0.00983428955078125,
0.03924560546875,
0.050384521484375,
-0.00833892822265625,
-0.01548004150390625,
0.00616455078125,
-0.007091522216796875,
-0.007144927978515625,
0.0819091796875,
-0.049163818359375,
-0.03216552734375,
0.0017766952514648438,
0.00736236572265625,
-0.001617431640625,
-0.0192718505859375,
0.0545654296875,
-0.04302978515625,
0.0174102783203125,
0.0119171142578125,
-0.036712646484375,
-0.013702392578125,
0.025177001953125,
-0.049072265625,
0.0845947265625,
-0.004573822021484375,
-0.045745849609375,
0.0012617111206054688,
-0.046356201171875,
-0.00008112192153930664,
0.014404296875,
-0.012451171875,
-0.029449462890625,
-0.01204681396484375,
0.013946533203125,
0.0202178955078125,
-0.04364013671875,
0.03924560546875,
-0.006664276123046875,
-0.0224456787109375,
0.01143646240234375,
-0.060455322265625,
0.057525634765625,
0.0212860107421875,
0.001956939697265625,
0.019256591796875,
-0.06988525390625,
-0.018157958984375,
0.027435302734375,
-0.028533935546875,
-0.00466156005859375,
0.0015773773193359375,
0.0035800933837890625,
-0.0032196044921875,
0.050079345703125,
-0.03570556640625,
0.037872314453125,
-0.00893402099609375,
0.0086212158203125,
0.0665283203125,
-0.01800537109375,
0.016448974609375,
-0.0228118896484375,
0.050445556640625,
-0.00396728515625,
0.040740966796875,
-0.0186767578125,
-0.06829833984375,
-0.0675048828125,
-0.025360107421875,
0.005130767822265625,
0.0791015625,
-0.0531005859375,
0.02166748046875,
-0.0209197998046875,
-0.05499267578125,
-0.05902099609375,
0.0213775634765625,
0.0299530029296875,
0.036895751953125,
0.0200958251953125,
-0.0134124755859375,
-0.04705810546875,
-0.08172607421875,
0.00540924072265625,
-0.03643798828125,
0.00339508056640625,
0.0316162109375,
0.03216552734375,
-0.03839111328125,
0.053924560546875,
-0.025482177734375,
-0.0301666259765625,
-0.02008056640625,
-0.006923675537109375,
0.0202789306640625,
0.0161895751953125,
0.0256500244140625,
-0.03753662109375,
-0.0374755859375,
0.0019588470458984375,
-0.0638427734375,
-0.00787353515625,
-0.005573272705078125,
-0.02093505859375,
0.0239105224609375,
0.022003173828125,
-0.044769287109375,
0.0482177734375,
0.06622314453125,
-0.009979248046875,
0.032440185546875,
0.0004210472106933594,
-0.005279541015625,
-0.0882568359375,
-0.01009368896484375,
-0.0137939453125,
-0.01332855224609375,
-0.041168212890625,
-0.00893402099609375,
-0.008697509765625,
0.00799560546875,
-0.044036865234375,
0.047515869140625,
-0.017333984375,
0.006778717041015625,
-0.029205322265625,
0.0026760101318359375,
-0.01142120361328125,
0.055450439453125,
-0.00836944580078125,
0.070556640625,
0.03521728515625,
-0.029083251953125,
0.04058837890625,
0.038604736328125,
-0.01873779296875,
0.036468505859375,
-0.0677490234375,
0.01800537109375,
-0.0016269683837890625,
0.01197052001953125,
-0.0888671875,
-0.0200347900390625,
0.0416259765625,
-0.040863037109375,
0.0234222412109375,
-0.006694793701171875,
-0.050506591796875,
-0.0178680419921875,
-0.0071563720703125,
0.02056884765625,
0.061981201171875,
-0.03350830078125,
0.05828857421875,
0.034759521484375,
0.01468658447265625,
-0.05523681640625,
-0.0555419921875,
0.0019140243530273438,
-0.01788330078125,
-0.03973388671875,
0.005157470703125,
-0.0163421630859375,
-0.01050567626953125,
-0.00992584228515625,
0.0223388671875,
-0.013702392578125,
-0.00574493408203125,
0.033477783203125,
0.03631591796875,
-0.0029888153076171875,
0.018157958984375,
0.00044727325439453125,
-0.00669097900390625,
-0.00807952880859375,
0.0247955322265625,
0.051483154296875,
-0.0194091796875,
-0.024566650390625,
-0.05908203125,
0.0006785392761230469,
0.022735595703125,
0.005245208740234375,
0.052703857421875,
0.0438232421875,
-0.00623321533203125,
0.0210418701171875,
-0.051666259765625,
0.002750396728515625,
-0.0399169921875,
0.031707763671875,
-0.0239410400390625,
-0.050445556640625,
0.041961669921875,
0.0036029815673828125,
0.03387451171875,
0.0296478271484375,
0.05889892578125,
-0.0158843994140625,
0.048919677734375,
0.0572509765625,
-0.01497650146484375,
0.0589599609375,
-0.006130218505859375,
-0.00348663330078125,
-0.062255859375,
-0.029388427734375,
-0.0178375244140625,
-0.014617919921875,
-0.057159423828125,
-0.045684814453125,
-0.0011425018310546875,
-0.0162506103515625,
-0.02398681640625,
0.031707763671875,
-0.03131103515625,
0.031494140625,
0.0478515625,
0.006103515625,
0.0228118896484375,
0.011383056640625,
0.01142120361328125,
0.006061553955078125,
-0.037109375,
-0.058380126953125,
0.1021728515625,
0.050079345703125,
0.0753173828125,
0.004360198974609375,
0.05328369140625,
0.024688720703125,
0.03216552734375,
-0.05364990234375,
0.04888916015625,
0.0025081634521484375,
-0.05084228515625,
-0.0209808349609375,
-0.0086212158203125,
-0.07550048828125,
0.005702972412109375,
-0.00958251953125,
-0.051300048828125,
-0.00463104248046875,
0.0223541259765625,
0.01457977294921875,
0.028472900390625,
-0.056854248046875,
0.04534912109375,
-0.0294189453125,
-0.01129150390625,
-0.01023101806640625,
-0.033721923828125,
0.046905517578125,
-0.003078460693359375,
0.0178985595703125,
-0.020111083984375,
-0.0025386810302734375,
0.038665771484375,
-0.0097808837890625,
0.10919189453125,
-0.0038738250732421875,
-0.0234222412109375,
0.039215087890625,
0.0012006759643554688,
0.03851318359375,
-0.00292205810546875,
0.012298583984375,
0.037750244140625,
-0.01438140869140625,
-0.0311431884765625,
-0.02734375,
0.047760009765625,
-0.09649658203125,
-0.052886962890625,
-0.0209503173828125,
-0.039093017578125,
0.0031414031982421875,
0.0010776519775390625,
0.01009368896484375,
0.00707244873046875,
-0.01157379150390625,
0.0016317367553710938,
0.0401611328125,
-0.0293426513671875,
0.0164794921875,
0.0267181396484375,
-0.02545166015625,
-0.034149169921875,
0.0640869140625,
-0.01428985595703125,
0.01531219482421875,
0.0355224609375,
-0.002223968505859375,
-0.0122222900390625,
-0.01800537109375,
-0.032958984375,
0.031036376953125,
-0.0592041015625,
-0.02838134765625,
-0.03302001953125,
-0.033416748046875,
-0.0208740234375,
0.018310546875,
-0.043212890625,
-0.0199737548828125,
-0.04534912109375,
-0.0010280609130859375,
0.04913330078125,
0.055999755859375,
0.0161590576171875,
0.054351806640625,
-0.036041259765625,
0.017120361328125,
0.034881591796875,
0.0273895263671875,
-0.00792694091796875,
-0.06787109375,
0.00021183490753173828,
-0.000060439109802246094,
-0.039398193359375,
-0.0579833984375,
0.03961181640625,
0.01541900634765625,
0.05322265625,
0.01023101806640625,
-0.0196990966796875,
0.052978515625,
-0.043212890625,
0.06329345703125,
0.0111236572265625,
-0.046630859375,
0.048065185546875,
-0.01355743408203125,
0.0265350341796875,
0.0258026123046875,
0.0164031982421875,
-0.0292816162109375,
-0.0293731689453125,
-0.04107666015625,
-0.05352783203125,
0.0406494140625,
0.0172271728515625,
0.036865234375,
-0.0025806427001953125,
0.0201416015625,
0.0123748779296875,
0.0128173828125,
-0.0823974609375,
-0.039703369140625,
-0.0250701904296875,
-0.028411865234375,
0.01122283935546875,
-0.044708251953125,
-0.00859832763671875,
-0.0074615478515625,
0.047882080078125,
-0.00873565673828125,
0.052215576171875,
-0.0189361572265625,
-0.006679534912109375,
0.0015993118286132812,
0.017303466796875,
0.06219482421875,
0.0233154296875,
-0.0159912109375,
-0.024932861328125,
0.021942138671875,
-0.0557861328125,
0.0009832382202148438,
-0.003345489501953125,
-0.010894775390625,
-0.0061492919921875,
0.0306854248046875,
0.083740234375,
0.0103759765625,
-0.0518798828125,
0.028472900390625,
-0.025787353515625,
-0.0199737548828125,
-0.055328369140625,
0.003932952880859375,
0.00843048095703125,
0.041839599609375,
0.0155487060546875,
-0.013275146484375,
-0.0001881122589111328,
-0.0218505859375,
-0.0114593505859375,
0.0236053466796875,
-0.01641845703125,
-0.027069091796875,
0.05389404296875,
0.01468658447265625,
-0.03973388671875,
0.045623779296875,
0.00441741943359375,
-0.0085601806640625,
0.034423828125,
0.039398193359375,
0.0592041015625,
-0.0217132568359375,
0.01397705078125,
0.034820556640625,
0.026336669921875,
0.00955963134765625,
0.0301666259765625,
-0.00524139404296875,
-0.04559326171875,
-0.028411865234375,
-0.0440673828125,
-0.044921875,
0.010833740234375,
-0.031524658203125,
0.038421630859375,
-0.0262908935546875,
-0.0171661376953125,
-0.0212249755859375,
0.0013628005981445312,
-0.060943603515625,
-0.0034942626953125,
0.024810791015625,
0.065185546875,
-0.056182861328125,
0.09246826171875,
0.029449462890625,
-0.04095458984375,
-0.048187255859375,
-0.0272216796875,
0.0035648345947265625,
-0.10711669921875,
0.062408447265625,
-0.005588531494140625,
0.00044727325439453125,
-0.0208587646484375,
-0.06488037109375,
-0.08233642578125,
0.106201171875,
0.020111083984375,
-0.07537841796875,
-0.01195526123046875,
0.009307861328125,
0.04864501953125,
-0.0283660888671875,
0.040283203125,
0.034454345703125,
0.0227203369140625,
0.028411865234375,
-0.08251953125,
-0.0192108154296875,
-0.0280609130859375,
0.0029735565185546875,
-0.0146942138671875,
-0.07708740234375,
0.061614990234375,
0.00439453125,
-0.0101776123046875,
0.0109100341796875,
0.060821533203125,
0.029083251953125,
0.014434814453125,
0.0318603515625,
0.0250701904296875,
0.05352783203125,
0.0070953369140625,
0.0726318359375,
-0.029205322265625,
0.0078887939453125,
0.086669921875,
-0.008544921875,
0.07073974609375,
0.0306854248046875,
-0.014617919921875,
0.057830810546875,
0.03802490234375,
-0.004886627197265625,
0.04364013671875,
-0.005451202392578125,
0.00809478759765625,
-0.009368896484375,
0.007320404052734375,
-0.024932861328125,
0.04998779296875,
0.042510986328125,
-0.031707763671875,
-0.0009379386901855469,
-0.01131439208984375,
0.004322052001953125,
-0.0080108642578125,
-0.004119873046875,
0.056182861328125,
-0.0024814605712890625,
-0.02252197265625,
0.06024169921875,
-0.0146331787109375,
0.06500244140625,
-0.04400634765625,
-0.0167388916015625,
-0.04022216796875,
0.0033550262451171875,
0.0023822784423828125,
-0.0528564453125,
0.0175323486328125,
0.0145263671875,
0.010467529296875,
-0.003765106201171875,
0.055633544921875,
-0.01922607421875,
-0.04315185546875,
0.0169830322265625,
0.032928466796875,
0.034698486328125,
0.035614013671875,
-0.0677490234375,
0.034942626953125,
0.00897216796875,
-0.0296173095703125,
0.019622802734375,
0.029388427734375,
-0.016357421875,
0.0771484375,
0.03424072265625,
-0.00927734375,
0.00659942626953125,
0.009613037109375,
0.08416748046875,
-0.03466796875,
-0.01261138916015625,
-0.056396484375,
0.043243408203125,
-0.0014142990112304688,
-0.037567138671875,
0.049774169921875,
0.0261688232421875,
0.04022216796875,
0.00005316734313964844,
0.049346923828125,
0.00807952880859375,
0.034515380859375,
-0.0280609130859375,
0.027587890625,
-0.045989990234375,
0.035369873046875,
-0.005252838134765625,
-0.053863525390625,
-0.0152587890625,
0.0438232421875,
-0.01248931884765625,
-0.0084381103515625,
0.0208282470703125,
0.0596923828125,
0.010345458984375,
-0.014373779296875,
0.043365478515625,
0.01983642578125,
0.04302978515625,
0.043365478515625,
0.06396484375,
-0.04736328125,
0.06378173828125,
-0.00801849365234375,
-0.0192108154296875,
-0.0372314453125,
-0.053741455078125,
-0.09246826171875,
-0.0435791015625,
-0.015960693359375,
-0.00820159912109375,
0.01004791259765625,
0.058074951171875,
0.034088134765625,
-0.034698486328125,
-0.031402587890625,
0.01050567626953125,
0.007602691650390625,
0.0080413818359375,
-0.01177978515625,
0.01332855224609375,
-0.0199127197265625,
-0.059295654296875,
0.0216064453125,
-0.00567626953125,
0.01447296142578125,
-0.040130615234375,
-0.004253387451171875,
-0.01654052734375,
0.016754150390625,
0.043670654296875,
0.0299530029296875,
-0.0731201171875,
-0.0178375244140625,
0.0143585205078125,
-0.01381683349609375,
0.023101806640625,
0.0099639892578125,
-0.0543212890625,
0.0265045166015625,
0.017608642578125,
0.0212249755859375,
0.037445068359375,
-0.012054443359375,
0.0201568603515625,
-0.054656982421875,
0.0298309326171875,
0.00406646728515625,
0.0175018310546875,
0.0236968994140625,
-0.0309295654296875,
0.03515625,
-0.004596710205078125,
-0.05987548828125,
-0.06280517578125,
0.01352691650390625,
-0.08056640625,
0.00820159912109375,
0.09954833984375,
0.006328582763671875,
-0.04986572265625,
0.005916595458984375,
-0.03729248046875,
0.01494598388671875,
-0.04925537109375,
0.055572509765625,
0.03033447265625,
-0.0032711029052734375,
-0.036407470703125,
-0.06341552734375,
0.01459503173828125,
-0.009613037109375,
-0.06951904296875,
-0.0038928985595703125,
0.03131103515625,
0.0187225341796875,
0.0022144317626953125,
0.0657958984375,
0.0003910064697265625,
0.0027523040771484375,
0.01500701904296875,
0.037109375,
-0.007801055908203125,
-0.02215576171875,
-0.01480865478515625,
-0.0184478759765625,
0.01018524169921875,
-0.0361328125
]
] |
nvidia/mit-b0 | 2022-08-06T10:24:41.000Z | [
"transformers",
"pytorch",
"tf",
"segformer",
"image-classification",
"vision",
"dataset:imagenet_1k",
"arxiv:2105.15203",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | nvidia | null | null | nvidia/mit-b0 | 19 | 99,706 | transformers | 2022-03-02T23:29:05 | ---
license: other
tags:
- vision
datasets:
- imagenet_1k
widget:
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg
example_title: House
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg
example_title: Castle
---
# SegFormer (b0-sized) encoder pre-trained-only
SegFormer encoder pre-trained on ImageNet-1k. It was introduced in the paper [SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers](https://arxiv.org/abs/2105.15203) by Xie et al. and first released in [this repository](https://github.com/NVlabs/SegFormer).
Disclaimer: The team releasing SegFormer did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
SegFormer consists of a hierarchical Transformer encoder and a lightweight all-MLP decode head, achieving strong results on semantic segmentation benchmarks such as ADE20K and Cityscapes. The hierarchical Transformer is first pre-trained on ImageNet-1k; a decode head is then added and the whole model is fine-tuned on a downstream dataset.
This repository only contains the pre-trained hierarchical Transformer, hence it can be used for fine-tuning purposes.
## Intended uses & limitations
You can use the model for fine-tuning on semantic segmentation tasks. See the [model hub](https://huggingface.co/models?other=segformer) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to classify an image from the COCO 2017 dataset into one of the 1,000 ImageNet classes:
```python
from transformers import SegformerFeatureExtractor, SegformerForImageClassification
from PIL import Image
import requests
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
feature_extractor = SegformerFeatureExtractor.from_pretrained("nvidia/mit-b0")
model = SegformerForImageClassification.from_pretrained("nvidia/mit-b0")
inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits
# model predicts one of the 1000 ImageNet classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```
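Since this checkpoint contains only the pre-trained encoder, the more common use is fine-tuning for segmentation via `SegformerForSemanticSegmentation`. The snippet below is a minimal sketch of that setup; it uses a tiny, randomly initialised config and a hypothetical 150-class label space (as in ADE20K) so it runs without downloading weights. In practice you would instead load the pre-trained encoder with `SegformerForSemanticSegmentation.from_pretrained("nvidia/mit-b0", num_labels=150)`.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Tiny config for illustration only (shrunk from the b0 defaults so it runs fast);
# hidden sizes must stay divisible by the per-stage attention head counts.
config = SegformerConfig(
    num_labels=150,                    # assumption: an ADE20K-style label space
    hidden_sizes=[8, 16, 32, 64],
    num_attention_heads=[1, 2, 4, 8],
    depths=[1, 1, 1, 1],
    decoder_hidden_size=32,
)
model = SegformerForSemanticSegmentation(config)

pixel_values = torch.randn(1, 3, 128, 128)       # dummy RGB image batch
labels = torch.randint(0, 150, (1, 128, 128))    # dummy per-pixel class labels

outputs = model(pixel_values=pixel_values, labels=labels)

# The all-MLP decode head predicts at 1/4 of the input resolution:
print(outputs.logits.shape)   # torch.Size([1, 150, 32, 32])
loss = outputs.loss           # cross-entropy loss to backpropagate when fine-tuning
```

During training you would call `loss.backward()` and step an optimizer as usual; for evaluation, the logits are upsampled back to the input resolution before taking the per-pixel argmax.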
For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/segformer.html).
### License
The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
author = {Enze Xie and
Wenhai Wang and
Zhiding Yu and
Anima Anandkumar and
Jose M. Alvarez and
Ping Luo},
title = {SegFormer: Simple and Efficient Design for Semantic Segmentation with
Transformers},
journal = {CoRR},
volume = {abs/2105.15203},
year = {2021},
url = {https://arxiv.org/abs/2105.15203},
eprinttype = {arXiv},
eprint = {2105.15203},
timestamp = {Wed, 02 Jun 2021 11:46:42 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 3,354 | [
[
-0.069091796875,
-0.052215576171875,
0.005382537841796875,
0.0110015869140625,
-0.025299072265625,
-0.026519775390625,
0.00286865234375,
-0.047393798828125,
0.019683837890625,
0.043670654296875,
-0.06121826171875,
-0.03955078125,
-0.0579833984375,
0.007476806640625,
-0.0257415771484375,
0.0621337890625,
0.01093292236328125,
-0.006618499755859375,
-0.033538818359375,
-0.01873779296875,
0.0013303756713867188,
-0.023223876953125,
-0.04766845703125,
-0.0280914306640625,
0.0313720703125,
0.0174102783203125,
0.0438232421875,
0.05670166015625,
0.057586669921875,
0.0361328125,
-0.0312347412109375,
0.006366729736328125,
-0.02264404296875,
-0.021087646484375,
0.0015659332275390625,
-0.00738525390625,
-0.02880859375,
0.0005245208740234375,
0.03082275390625,
0.046661376953125,
0.01064300537109375,
0.0250396728515625,
0.00177764892578125,
0.032867431640625,
-0.040130615234375,
0.004589080810546875,
-0.03564453125,
0.0146331787109375,
-0.0006470680236816406,
-0.007213592529296875,
-0.0252227783203125,
-0.01404571533203125,
0.01910400390625,
-0.0399169921875,
0.0484619140625,
0.004222869873046875,
0.114501953125,
0.037139892578125,
-0.0236358642578125,
-0.003238677978515625,
-0.02685546875,
0.058837890625,
-0.05352783203125,
0.032928466796875,
0.00234222412109375,
0.0261383056640625,
0.0088958740234375,
-0.074951171875,
-0.03558349609375,
0.00927734375,
-0.01837158203125,
0.0018243789672851562,
-0.0289306640625,
0.00823211669921875,
0.033447265625,
0.038299560546875,
-0.033660888671875,
0.00881195068359375,
-0.05352783203125,
-0.0298614501953125,
0.04833984375,
0.0005588531494140625,
0.0158538818359375,
-0.0264739990234375,
-0.05767822265625,
-0.032440185546875,
-0.024505615234375,
0.00833892822265625,
0.020233154296875,
0.0027141571044921875,
-0.0226593017578125,
0.03155517578125,
-0.004024505615234375,
0.05657958984375,
0.033416748046875,
-0.012359619140625,
0.0419921875,
-0.0102386474609375,
-0.029693603515625,
0.00949859619140625,
0.06988525390625,
0.033416748046875,
-0.0010318756103515625,
0.0037097930908203125,
-0.0044403076171875,
0.01287841796875,
0.0194854736328125,
-0.0953369140625,
-0.01291656494140625,
0.00270843505859375,
-0.040069580078125,
-0.0293426513671875,
0.009368896484375,
-0.06121826171875,
-0.002216339111328125,
-0.01018524169921875,
0.039031982421875,
-0.0216064453125,
-0.00669097900390625,
0.01168060302734375,
-0.0108795166015625,
0.059478759765625,
0.016632080078125,
-0.05657958984375,
0.0147857666015625,
0.041046142578125,
0.05938720703125,
-0.0154266357421875,
-0.0186614990234375,
-0.00836944580078125,
-0.00847625732421875,
-0.0119171142578125,
0.06439208984375,
-0.0271148681640625,
-0.023468017578125,
-0.0169677734375,
0.046630859375,
-0.017425537109375,
-0.04632568359375,
0.05938720703125,
-0.042816162109375,
0.0137786865234375,
-0.004955291748046875,
-0.0361328125,
-0.03680419921875,
0.0260467529296875,
-0.044464111328125,
0.0675048828125,
0.0120391845703125,
-0.06805419921875,
0.037322998046875,
-0.042938232421875,
-0.01953125,
0.001132965087890625,
0.00595855712890625,
-0.064208984375,
0.0003924369812011719,
0.0300750732421875,
0.04022216796875,
-0.0181732177734375,
0.018157958984375,
-0.04034423828125,
-0.0182342529296875,
-0.0002275705337524414,
-0.01291656494140625,
0.07061767578125,
0.0240631103515625,
-0.0252227783203125,
0.029052734375,
-0.0518798828125,
0.0025634765625,
0.0318603515625,
0.0025577545166015625,
-0.0020427703857421875,
-0.02294921875,
0.01549530029296875,
0.0291748046875,
0.0177459716796875,
-0.050628662109375,
0.0016870498657226562,
-0.0239715576171875,
0.0291900634765625,
0.0560302734375,
0.0063323974609375,
0.03875732421875,
-0.01105499267578125,
0.029632568359375,
0.0159759521484375,
0.03240966796875,
-0.0154266357421875,
-0.0157012939453125,
-0.08935546875,
-0.0189208984375,
0.0204620361328125,
0.00977325439453125,
-0.03955078125,
0.047393798828125,
-0.019287109375,
-0.0506591796875,
-0.03680419921875,
-0.00926971435546875,
0.015777587890625,
0.040008544921875,
0.0389404296875,
-0.0309295654296875,
-0.058837890625,
-0.086669921875,
0.003734588623046875,
0.0168609619140625,
0.0050506591796875,
0.023529052734375,
0.04949951171875,
-0.052947998046875,
0.05670166015625,
-0.049591064453125,
-0.0240478515625,
-0.0164031982421875,
-0.005489349365234375,
0.0224609375,
0.050933837890625,
0.044677734375,
-0.060638427734375,
-0.0285186767578125,
-0.01445770263671875,
-0.049957275390625,
-0.0024261474609375,
0.0061187744140625,
-0.028778076171875,
0.01116180419921875,
0.0364990234375,
-0.0345458984375,
0.03436279296875,
0.035308837890625,
-0.043670654296875,
0.02349853515625,
-0.003452301025390625,
-0.0013875961303710938,
-0.07379150390625,
0.011016845703125,
0.0122528076171875,
-0.01308441162109375,
-0.039398193359375,
0.0102081298828125,
-0.002567291259765625,
-0.0102081298828125,
-0.045074462890625,
0.043426513671875,
-0.0238037109375,
-0.00003510713577270508,
-0.01788330078125,
-0.016357421875,
0.00507354736328125,
0.06024169921875,
0.01203155517578125,
0.0266571044921875,
0.043243408203125,
-0.05242919921875,
0.00470733642578125,
0.04168701171875,
-0.031402587890625,
0.032684326171875,
-0.07855224609375,
0.0091094970703125,
-0.0120391845703125,
0.005954742431640625,
-0.052703857421875,
-0.0255889892578125,
0.0309906005859375,
-0.0272064208984375,
0.031829833984375,
-0.026458740234375,
-0.017303466796875,
-0.040130615234375,
-0.007293701171875,
0.028350830078125,
0.03778076171875,
-0.0609130859375,
0.04071044921875,
0.039703369140625,
0.01186370849609375,
-0.032318115234375,
-0.052001953125,
-0.0235748291015625,
-0.020355224609375,
-0.0782470703125,
0.04718017578125,
-0.003284454345703125,
0.0192108154296875,
0.006595611572265625,
-0.0247955322265625,
-0.0034961700439453125,
0.0006394386291503906,
0.03173828125,
0.038421630859375,
-0.010101318359375,
-0.0225830078125,
-0.000308990478515625,
-0.03509521484375,
0.01125335693359375,
-0.01276397705078125,
0.048095703125,
-0.028228759765625,
-0.031707763671875,
-0.017822265625,
-0.0015630722045898438,
0.029754638671875,
-0.0242462158203125,
0.04052734375,
0.0860595703125,
-0.0218658447265625,
-0.0018053054809570312,
-0.042816162109375,
-0.0186920166015625,
-0.044158935546875,
0.027435302734375,
-0.01303863525390625,
-0.084716796875,
0.036590576171875,
-0.0014066696166992188,
-0.00003069639205932617,
0.0731201171875,
0.026641845703125,
0.01074981689453125,
0.088134765625,
0.045654296875,
0.024749755859375,
0.040008544921875,
-0.06170654296875,
0.01145172119140625,
-0.072021484375,
-0.040771484375,
-0.03411865234375,
-0.0333251953125,
-0.062744140625,
-0.046112060546875,
0.025665283203125,
0.0088653564453125,
-0.035888671875,
0.03692626953125,
-0.06890869140625,
0.0278472900390625,
0.042327880859375,
0.003986358642578125,
-0.0159454345703125,
0.01100921630859375,
-0.005950927734375,
0.007137298583984375,
-0.05743408203125,
-0.0272064208984375,
0.035797119140625,
0.037811279296875,
0.057220458984375,
-0.0169677734375,
0.0516357421875,
-0.00873565673828125,
0.00269317626953125,
-0.06494140625,
0.047515869140625,
-0.0138092041015625,
-0.056610107421875,
-0.010589599609375,
-0.0272979736328125,
-0.0743408203125,
0.027679443359375,
-0.01273345947265625,
-0.058258056640625,
0.0518798828125,
0.00821685791015625,
-0.01367950439453125,
0.0228729248046875,
-0.04107666015625,
0.09368896484375,
-0.018157958984375,
-0.035552978515625,
0.01047515869140625,
-0.05633544921875,
0.01277923583984375,
0.0168914794921875,
-0.004840850830078125,
-0.025726318359375,
0.0214691162109375,
0.0748291015625,
-0.046539306640625,
0.05487060546875,
-0.02899169921875,
0.028594970703125,
0.043060302734375,
-0.011505126953125,
0.0294342041015625,
-0.006439208984375,
0.016265869140625,
0.037933349609375,
0.0184478759765625,
-0.0279693603515625,
-0.02569580078125,
0.047210693359375,
-0.07000732421875,
-0.044830322265625,
-0.038604736328125,
-0.011505126953125,
-0.00048041343688964844,
0.030670166015625,
0.04571533203125,
0.03277587890625,
-0.006866455078125,
0.038604736328125,
0.050079345703125,
-0.0294952392578125,
0.0390625,
0.0100860595703125,
-0.01446533203125,
-0.03033447265625,
0.06719970703125,
-0.007686614990234375,
0.0037441253662109375,
0.0237579345703125,
0.0224151611328125,
-0.0275726318359375,
-0.0207977294921875,
-0.0279693603515625,
0.01568603515625,
-0.055938720703125,
-0.031646728515625,
-0.06787109375,
-0.04400634765625,
-0.032257080078125,
-0.0281982421875,
-0.033050537109375,
-0.01959228515625,
-0.030853271484375,
-0.005550384521484375,
0.0215301513671875,
0.0260009765625,
-0.01241302490234375,
0.03509521484375,
-0.051513671875,
0.0090179443359375,
0.0282745361328125,
0.0272216796875,
0.0013818740844726562,
-0.048492431640625,
-0.0117645263671875,
-0.0012788772583007812,
-0.035919189453125,
-0.038726806640625,
0.051116943359375,
0.01305389404296875,
0.042236328125,
0.046722412109375,
-0.00928497314453125,
0.072265625,
-0.0130767822265625,
0.042816162109375,
0.033660888671875,
-0.058746337890625,
0.0298614501953125,
-0.00830078125,
0.042236328125,
0.036163330078125,
0.02532958984375,
-0.039093017578125,
0.00800323486328125,
-0.06048583984375,
-0.07794189453125,
0.0716552734375,
0.00478363037109375,
0.004756927490234375,
0.003021240234375,
-0.0018558502197265625,
0.0025234222412109375,
-0.002689361572265625,
-0.045989990234375,
-0.0288848876953125,
-0.0341796875,
-0.0089569091796875,
-0.0102386474609375,
-0.036590576171875,
0.0014705657958984375,
-0.040771484375,
0.058013916015625,
-0.01171112060546875,
0.050018310546875,
0.02020263671875,
-0.021697998046875,
-0.00348663330078125,
0.002147674560546875,
0.0264739990234375,
0.0201263427734375,
-0.0226593017578125,
0.00765228271484375,
0.0147247314453125,
-0.031707763671875,
-0.0028133392333984375,
0.0251617431640625,
-0.0238037109375,
-0.002105712890625,
0.0283050537109375,
0.08697509765625,
0.0294342041015625,
-0.0216827392578125,
0.04693603515625,
-0.001010894775390625,
-0.0391845703125,
-0.034820556640625,
0.0168304443359375,
-0.0016956329345703125,
0.024658203125,
0.01641845703125,
0.0307769775390625,
0.0226593017578125,
-0.0017786026000976562,
0.01837158203125,
0.0231781005859375,
-0.054443359375,
-0.0230865478515625,
0.056732177734375,
0.00785064697265625,
0.0023174285888671875,
0.05230712890625,
-0.01517486572265625,
-0.05224609375,
0.06927490234375,
0.04058837890625,
0.07952880859375,
0.0029754638671875,
0.0208892822265625,
0.060211181640625,
0.01444244384765625,
0.00838470458984375,
-0.00536346435546875,
-0.004047393798828125,
-0.06304931640625,
-0.025726318359375,
-0.0810546875,
0.00048422813415527344,
0.002658843994140625,
-0.05303955078125,
0.0335693359375,
-0.035369873046875,
-0.0145416259765625,
0.020751953125,
0.0034961700439453125,
-0.0831298828125,
0.0184326171875,
0.015472412109375,
0.076416015625,
-0.04217529296875,
0.037811279296875,
0.0611572265625,
-0.0188751220703125,
-0.06329345703125,
-0.037872314453125,
-0.00588226318359375,
-0.06329345703125,
0.038299560546875,
0.03826904296875,
0.003185272216796875,
0.006801605224609375,
-0.061370849609375,
-0.07916259765625,
0.09600830078125,
0.01018524169921875,
-0.028472900390625,
-0.0022602081298828125,
0.0040740966796875,
0.02777099609375,
-0.031707763671875,
0.0283050537109375,
0.0269317626953125,
0.043243408203125,
0.054718017578125,
-0.032867431640625,
0.0038242340087890625,
-0.0279388427734375,
0.004726409912109375,
0.026275634765625,
-0.060302734375,
0.053375244140625,
-0.0196533203125,
-0.0199737548828125,
-0.01018524169921875,
0.048828125,
0.0048370361328125,
0.0269775390625,
0.048492431640625,
0.0625,
0.033843994140625,
-0.02703857421875,
0.06732177734375,
-0.0193023681640625,
0.052154541015625,
0.06427001953125,
0.024017333984375,
0.0279388427734375,
0.032012939453125,
-0.0076446533203125,
0.032928466796875,
0.0709228515625,
-0.040008544921875,
0.03985595703125,
-0.0100250244140625,
0.0138397216796875,
-0.037628173828125,
-0.017791748046875,
-0.039215087890625,
0.058013916015625,
0.0126953125,
-0.049835205078125,
-0.0108795166015625,
-0.0113525390625,
-0.004161834716796875,
-0.041107177734375,
-0.0213470458984375,
0.054046630859375,
0.0078277587890625,
-0.031280517578125,
0.0484619140625,
0.0031986236572265625,
0.058197021484375,
-0.037628173828125,
0.005855560302734375,
-0.00800323486328125,
0.02239990234375,
-0.0272674560546875,
-0.034454345703125,
0.033294677734375,
-0.0198974609375,
0.00026226043701171875,
-0.007579803466796875,
0.0787353515625,
-0.019317626953125,
-0.0555419921875,
0.0157623291015625,
0.01468658447265625,
0.00406646728515625,
0.01268768310546875,
-0.064697265625,
0.02630615234375,
0.006183624267578125,
-0.0272369384765625,
0.0116729736328125,
0.00862884521484375,
0.017547607421875,
0.0426025390625,
0.046661376953125,
-0.02685546875,
0.004192352294921875,
-0.01385498046875,
0.07183837890625,
-0.051116943359375,
-0.0294342041015625,
-0.053375244140625,
0.041351318359375,
-0.02166748046875,
-0.0309906005859375,
0.05645751953125,
0.046844482421875,
0.09130859375,
-0.01995849609375,
0.0295257568359375,
-0.0281829833984375,
0.006671905517578125,
-0.01430511474609375,
0.040863037109375,
-0.05126953125,
-0.008392333984375,
-0.032806396484375,
-0.07476806640625,
-0.02117919921875,
0.06689453125,
-0.029876708984375,
0.0168914794921875,
0.034759521484375,
0.0709228515625,
-0.0196990966796875,
0.005779266357421875,
0.0224151611328125,
0.00643157958984375,
0.006671905517578125,
0.022613525390625,
0.05352783203125,
-0.038726806640625,
0.033294677734375,
-0.058319091796875,
0.0016508102416992188,
-0.034820556640625,
-0.048675537109375,
-0.0670166015625,
-0.043487548828125,
-0.0379638671875,
-0.024658203125,
-0.021209716796875,
0.0655517578125,
0.07635498046875,
-0.0650634765625,
-0.003246307373046875,
-0.0014743804931640625,
0.0086822509765625,
-0.0118255615234375,
-0.0194854736328125,
0.03387451171875,
-0.0048828125,
-0.0628662109375,
-0.00750732421875,
0.0166778564453125,
0.01025390625,
-0.00530242919921875,
-0.021575927734375,
-0.004001617431640625,
-0.011749267578125,
0.045745849609375,
0.017791748046875,
-0.04205322265625,
-0.024749755859375,
0.01503753662109375,
-0.0023784637451171875,
0.01372528076171875,
0.03955078125,
-0.042755126953125,
0.033477783203125,
0.041290283203125,
0.042236328125,
0.07061767578125,
-0.0022525787353515625,
0.0052337646484375,
-0.031768798828125,
0.0224151611328125,
0.015625,
0.038482666015625,
0.0260009765625,
-0.0163116455078125,
0.0445556640625,
0.0166015625,
-0.0445556640625,
-0.047821044921875,
0.0021991729736328125,
-0.08673095703125,
-0.01253509521484375,
0.07647705078125,
-0.002605438232421875,
-0.047882080078125,
0.026214599609375,
-0.00817108154296875,
0.0283355712890625,
-0.01308441162109375,
0.035491943359375,
0.0163726806640625,
-0.0017995834350585938,
-0.034423828125,
-0.01021575927734375,
0.0281829833984375,
0.0031299591064453125,
-0.044219970703125,
-0.04315185546875,
0.03253173828125,
0.02740478515625,
0.0205841064453125,
0.01507568359375,
-0.021697998046875,
0.009613037109375,
0.0143585205078125,
0.0262298583984375,
-0.0226593017578125,
-0.01502227783203125,
-0.0126190185546875,
0.0101776123046875,
-0.0158538818359375,
-0.02020263671875
]
] |
SimianLuo/LCM_Dreamshaper_v7 | 2023-11-06T14:56:09.000Z | [
"diffusers",
"text-to-image",
"en",
"arxiv:2310.04378",
"license:mit",
"has_space",
"diffusers:LatentConsistencyModelPipeline",
"region:us"
] | text-to-image | SimianLuo | null | null | SimianLuo/LCM_Dreamshaper_v7 | 138 | 99,101 | diffusers | 2023-10-14T08:26:52 | ---
license: mit
language:
- en
pipeline_tag: text-to-image
tags:
- text-to-image
---
# Latent Consistency Models
Official Repository of the paper: *[Latent Consistency Models](https://arxiv.org/abs/2310.04378)*.
Project Page: https://latent-consistency-models.github.io
## Try our Hugging Face demos:
[](https://huggingface.co/spaces/SimianLuo/Latent_Consistency_Model)
## Model Descriptions:
Distilled from the [Dreamshaper v7](https://huggingface.co/Lykon/dreamshaper-7) fine-tune of [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5), using only 4,000 training iterations (~32 A100 GPU-hours).
## Generation Results:
<p align="center">
<img src="teaser.png">
</p>
By distilling classifier-free guidance into the model's input, LCM can generate high-quality images with very little inference time. We compare inference time at 768 × 768 resolution, CFG scale w = 8, and batch size 4, using an A800 GPU.
<p align="center">
<img src="speed_fid.png">
</p>
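To make the distillation target concrete: classifier-free guidance combines conditional and unconditional noise predictions with a guidance weight w, and LCM distills this weighted combination into the student model. A minimal, illustrative sketch of the CFG combination in plain Python (the real predictions are tensors from the U-Net; these lists are stand-ins):

```python
def cfg_combine(eps_uncond, eps_cond, w):
    # Classifier-free guidance: shift the unconditional prediction
    # toward the conditional one, scaled by the guidance weight w.
    return [u + w * (c - u) for u, c in zip(eps_uncond, eps_cond)]

print(cfg_combine([0.0, 1.0], [1.0, 3.0], 8.0))  # [8.0, 17.0]
```

Because this combination is baked into the distilled model, LCM does not need two forward passes per step at inference time.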
## Usage
You can try out Latent Consistency Models directly on:
[](https://huggingface.co/spaces/SimianLuo/Latent_Consistency_Model)
To run the model yourself, you can leverage the 🧨 Diffusers library:
1. Install the library:
```
pip install --upgrade diffusers # make sure to use at least diffusers >= 0.22
pip install transformers accelerate
```
2. Run the model:
```py
from diffusers import DiffusionPipeline
import torch
pipe = DiffusionPipeline.from_pretrained("SimianLuo/LCM_Dreamshaper_v7")
# To save GPU memory, torch.float16 can be used, but it may compromise image quality.
pipe.to(torch_device="cuda", torch_dtype=torch.float32)
prompt = "Self-portrait oil painting, a beautiful cyborg with golden hair, 8k"
# Can be set to 1-50 steps. LCM supports fast inference even with <= 4 steps. Recommended: 1-8 steps.
num_inference_steps = 4
images = pipe(prompt=prompt, num_inference_steps=num_inference_steps, guidance_scale=8.0, lcm_origin_steps=50, output_type="pil").images
```
For more information, please have a look at the official docs:
👉 https://huggingface.co/docs/diffusers/api/pipelines/latent_consistency_models#latent-consistency-models
## Usage (Deprecated)
1. Install the library:
```
pip install diffusers transformers accelerate
```
2. Run the model:
```py
from diffusers import DiffusionPipeline
import torch
pipe = DiffusionPipeline.from_pretrained("SimianLuo/LCM_Dreamshaper_v7", custom_pipeline="latent_consistency_txt2img", custom_revision="main", revision="fb9c5d")
# To save GPU memory, torch.float16 can be used, but it may compromise image quality.
pipe.to(torch_device="cuda", torch_dtype=torch.float32)
prompt = "Self-portrait oil painting, a beautiful cyborg with golden hair, 8k"
# Can be set to 1-50 steps. LCM supports fast inference even with <= 4 steps. Recommended: 1-8 steps.
num_inference_steps = 4
images = pipe(prompt=prompt, num_inference_steps=num_inference_steps, guidance_scale=8.0, output_type="pil").images
```
## BibTeX
```bibtex
@misc{luo2023latent,
title={Latent Consistency Models: Synthesizing High-Resolution Images with Few-Step Inference},
author={Simian Luo and Yiqin Tan and Longbo Huang and Jian Li and Hang Zhao},
year={2023},
eprint={2310.04378},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` | 3,498 | [
[
-0.0193634033203125,
-0.047088623046875,
0.03155517578125,
0.0222320556640625,
-0.008453369140625,
-0.01050567626953125,
-0.009613037109375,
-0.0306243896484375,
0.0104217529296875,
0.038482666015625,
-0.03411865234375,
-0.03924560546875,
-0.0380859375,
-0.01038360595703125,
-0.006969451904296875,
0.0721435546875,
-0.01284027099609375,
-0.0101165771484375,
-0.0026760101318359375,
-0.002719879150390625,
-0.006145477294921875,
-0.0040435791015625,
-0.0594482421875,
-0.0230712890625,
0.0281829833984375,
-0.0081939697265625,
0.051025390625,
0.0234222412109375,
0.0215606689453125,
0.02532958984375,
-0.024871826171875,
0.0117950439453125,
-0.04107666015625,
0.00264739990234375,
0.0144195556640625,
-0.02374267578125,
-0.055877685546875,
0.00923919677734375,
0.0589599609375,
0.02362060546875,
-0.0220489501953125,
0.0017490386962890625,
0.032440185546875,
0.06744384765625,
-0.03387451171875,
0.00885772705078125,
-0.0295257568359375,
0.01125335693359375,
0.0008497238159179688,
0.022125244140625,
-0.0232696533203125,
-0.032867431640625,
-0.00472259521484375,
-0.057220458984375,
0.0279388427734375,
-0.01050567626953125,
0.10211181640625,
0.0322265625,
-0.0137786865234375,
0.00749969482421875,
-0.04290771484375,
0.054229736328125,
-0.05462646484375,
0.0469970703125,
0.0153961181640625,
0.016021728515625,
-0.00841522216796875,
-0.069091796875,
-0.056884765625,
0.005092620849609375,
-0.00548553466796875,
0.0305328369140625,
-0.0054473876953125,
0.00864410400390625,
0.03411865234375,
0.024383544921875,
-0.041717529296875,
-0.0265960693359375,
-0.02484130859375,
-0.024200439453125,
0.047607421875,
0.0032405853271484375,
0.00655364990234375,
-0.00853729248046875,
-0.02972412109375,
-0.0195465087890625,
-0.0283203125,
0.00429534912109375,
0.022125244140625,
0.0013475418090820312,
-0.051605224609375,
0.05145263671875,
0.00640106201171875,
0.04278564453125,
0.0242767333984375,
-0.0212860107421875,
0.0277862548828125,
-0.0149078369140625,
-0.0258026123046875,
-0.0229339599609375,
0.0699462890625,
0.02978515625,
-0.0029277801513671875,
0.00452423095703125,
-0.0182342529296875,
0.0089111328125,
-0.004993438720703125,
-0.07464599609375,
-0.031402587890625,
0.0270538330078125,
-0.044464111328125,
-0.01751708984375,
-0.00640106201171875,
-0.05419921875,
-0.0142822265625,
-0.021209716796875,
0.03948974609375,
-0.044464111328125,
-0.032073974609375,
0.004650115966796875,
0.004817962646484375,
0.02227783203125,
0.03106689453125,
-0.0433349609375,
0.018768310546875,
0.0258941650390625,
0.0806884765625,
-0.01800537109375,
-0.0053253173828125,
-0.0188751220703125,
-0.0026187896728515625,
-0.0310821533203125,
0.040283203125,
-0.0033321380615234375,
-0.01276397705078125,
-0.025909423828125,
0.0198974609375,
-0.017822265625,
-0.0433349609375,
0.045989990234375,
-0.02471923828125,
0.01319122314453125,
-0.0054473876953125,
-0.06085205078125,
-0.0010013580322265625,
0.012237548828125,
-0.032501220703125,
0.08056640625,
0.0181427001953125,
-0.07794189453125,
0.0014324188232421875,
-0.035064697265625,
0.0131988525390625,
-0.0201873779296875,
-0.01378631591796875,
-0.05889892578125,
-0.0043182373046875,
-0.005157470703125,
0.031768798828125,
0.0084686279296875,
0.01317596435546875,
-0.032135009765625,
-0.03753662109375,
0.0046844482421875,
-0.0216827392578125,
0.08551025390625,
0.028350830078125,
-0.021331787109375,
0.000980377197265625,
-0.04632568359375,
0.0016231536865234375,
0.01422882080078125,
-0.01216888427734375,
0.007350921630859375,
-0.0228118896484375,
0.0225982666015625,
0.02215576171875,
0.0277862548828125,
-0.0426025390625,
0.00786590576171875,
-0.0219268798828125,
0.00740814208984375,
0.05682373046875,
-0.0100555419921875,
0.0126190185546875,
-0.04510498046875,
0.032562255859375,
0.0308685302734375,
-0.004558563232421875,
0.00850677490234375,
-0.042572021484375,
-0.08135986328125,
-0.01242828369140625,
0.0210418701171875,
0.0228271484375,
-0.04803466796875,
0.032806396484375,
-0.003833770751953125,
-0.0589599609375,
-0.0391845703125,
-0.0036792755126953125,
0.03240966796875,
0.0477294921875,
0.0284423828125,
-0.026031494140625,
-0.03985595703125,
-0.05731201171875,
0.003849029541015625,
-0.014556884765625,
-0.002349853515625,
0.0218048095703125,
0.03521728515625,
-0.024139404296875,
0.06280517578125,
-0.045257568359375,
-0.02362060546875,
0.0044403076171875,
0.00560760498046875,
0.0304412841796875,
0.06298828125,
0.050262451171875,
-0.05230712890625,
-0.061920166015625,
-0.0203857421875,
-0.06634521484375,
0.005100250244140625,
0.00325775146484375,
-0.0220489501953125,
0.03277587890625,
0.0280609130859375,
-0.043487548828125,
0.0283660888671875,
0.06884765625,
-0.0330810546875,
0.0677490234375,
-0.0175933837890625,
0.01029205322265625,
-0.08721923828125,
0.0127105712890625,
0.01898193359375,
-0.0313720703125,
-0.03173828125,
-0.00347137451171875,
-0.001682281494140625,
-0.0031528472900390625,
-0.041229248046875,
0.058624267578125,
-0.031463623046875,
0.00957489013671875,
-0.017791748046875,
-0.00730133056640625,
0.0010156631469726562,
0.04656982421875,
0.016998291015625,
0.043975830078125,
0.0645751953125,
-0.036285400390625,
-0.002696990966796875,
0.0171661376953125,
-0.004360198974609375,
0.046966552734375,
-0.07159423828125,
0.0233612060546875,
-0.0028839111328125,
0.02362060546875,
-0.0758056640625,
-0.00962066650390625,
0.03277587890625,
-0.0268402099609375,
0.029205322265625,
-0.02471923828125,
-0.03271484375,
-0.028411865234375,
-0.0102386474609375,
0.019073486328125,
0.07159423828125,
-0.0285491943359375,
0.055877685546875,
0.01216888427734375,
0.022552490234375,
-0.051605224609375,
-0.038055419921875,
-0.0217132568359375,
-0.0154876708984375,
-0.058868408203125,
0.03961181640625,
-0.04742431640625,
-0.01331329345703125,
0.0022830963134765625,
-0.00551605224609375,
-0.00476837158203125,
-0.01678466796875,
0.0299530029296875,
0.0268402099609375,
-0.0244598388671875,
-0.01739501953125,
-0.0025081634521484375,
-0.00666046142578125,
-0.00719451904296875,
-0.021881103515625,
0.0267486572265625,
-0.03094482421875,
-0.0171966552734375,
-0.06658935546875,
-0.002368927001953125,
0.0289154052734375,
0.021881103515625,
0.05712890625,
0.0733642578125,
-0.0472412109375,
0.00476837158203125,
-0.052398681640625,
-0.0194854736328125,
-0.03765869140625,
-0.0003097057342529297,
-0.03466796875,
-0.04437255859375,
0.048370361328125,
0.0105438232421875,
0.00811767578125,
0.04296875,
0.036529541015625,
-0.03485107421875,
0.06805419921875,
0.056732177734375,
0.0269622802734375,
0.030792236328125,
-0.0550537109375,
-0.021026611328125,
-0.07037353515625,
-0.01401519775390625,
-0.00787353515625,
-0.019317626953125,
-0.03485107421875,
-0.056671142578125,
0.0218658447265625,
0.0299072265625,
-0.02789306640625,
0.024993896484375,
-0.041046142578125,
0.0286407470703125,
0.01068878173828125,
0.034515380859375,
0.019500732421875,
0.004512786865234375,
-0.0233306884765625,
-0.0284576416015625,
-0.035400390625,
-0.04498291015625,
0.0675048828125,
0.023468017578125,
0.061492919921875,
-0.00435638427734375,
0.060638427734375,
0.00249481201171875,
0.0211944580078125,
-0.0279388427734375,
0.0443115234375,
-0.0179290771484375,
-0.038543701171875,
0.0020618438720703125,
-0.0276031494140625,
-0.07305908203125,
0.00972747802734375,
-0.029876708984375,
-0.0523681640625,
0.0184326171875,
0.034423828125,
-0.021881103515625,
0.039794921875,
-0.040130615234375,
0.06658935546875,
0.0032806396484375,
-0.060699462890625,
0.0025920867919921875,
-0.04998779296875,
0.036163330078125,
0.0011577606201171875,
-0.01561737060546875,
-0.0194244384765625,
-0.00505828857421875,
0.05810546875,
-0.035125732421875,
0.0670166015625,
-0.033599853515625,
-0.004871368408203125,
0.04034423828125,
-0.0015411376953125,
0.028717041015625,
-0.004077911376953125,
-0.018096923828125,
0.03155517578125,
0.003082275390625,
-0.045745849609375,
-0.034332275390625,
0.05303955078125,
-0.0697021484375,
-0.01515960693359375,
-0.033111572265625,
-0.04071044921875,
0.0007266998291015625,
0.0194854736328125,
0.04132080078125,
0.003589630126953125,
-0.01678466796875,
0.005168914794921875,
0.07086181640625,
-0.006755828857421875,
0.0308380126953125,
0.0176849365234375,
-0.0252685546875,
-0.0262451171875,
0.056182861328125,
0.0144805908203125,
0.026031494140625,
0.0004012584686279297,
0.0204315185546875,
-0.0276641845703125,
-0.032958984375,
-0.018951416015625,
0.04510498046875,
-0.03607177734375,
-0.01435089111328125,
-0.06591796875,
-0.043548583984375,
-0.0281219482421875,
-0.01096343994140625,
-0.05517578125,
-0.0180206298828125,
-0.0302886962890625,
0.03216552734375,
0.024932861328125,
0.047821044921875,
-0.01351165771484375,
0.0171356201171875,
-0.04095458984375,
0.02935791015625,
0.00800323486328125,
0.0286865234375,
0.0035762786865234375,
-0.053741455078125,
-0.0195770263671875,
0.0160675048828125,
-0.049560546875,
-0.057098388671875,
0.02911376953125,
0.007572174072265625,
0.032379150390625,
0.031646728515625,
-0.006927490234375,
0.0626220703125,
-0.0192413330078125,
0.05267333984375,
0.02667236328125,
-0.0643310546875,
0.03936767578125,
-0.032196044921875,
0.0036792755126953125,
0.0185394287109375,
0.03680419921875,
-0.03338623046875,
-0.0307464599609375,
-0.08062744140625,
-0.045928955078125,
0.031219482421875,
0.039642333984375,
-0.0130767822265625,
0.019012451171875,
0.045013427734375,
-0.00769805908203125,
-0.00745391845703125,
-0.0697021484375,
-0.033050537109375,
-0.0194854736328125,
-0.0201416015625,
-0.008270263671875,
0.0006537437438964844,
-0.00524139404296875,
-0.02374267578125,
0.06842041015625,
-0.0218048095703125,
0.03265380859375,
0.038787841796875,
0.00508880615234375,
-0.01116180419921875,
-0.01389312744140625,
0.02996826171875,
0.021575927734375,
-0.0270843505859375,
0.0006990432739257812,
0.009063720703125,
-0.036285400390625,
0.01751708984375,
0.01837158203125,
-0.0239105224609375,
0.007228851318359375,
-0.0022125244140625,
0.048980712890625,
-0.011627197265625,
-0.03411865234375,
0.037078857421875,
-0.020965576171875,
-0.0179290771484375,
-0.036407470703125,
0.0212860107421875,
0.040863037109375,
0.032623291015625,
0.0107421875,
0.03887939453125,
-0.007396697998046875,
-0.007053375244140625,
0.0135650634765625,
0.02850341796875,
-0.043212890625,
-0.031036376953125,
0.07989501953125,
-0.0171051025390625,
-0.005191802978515625,
0.043212890625,
-0.03289794921875,
-0.0119781494140625,
0.060211181640625,
0.043914794921875,
0.0587158203125,
-0.00469970703125,
0.00852203369140625,
0.06195068359375,
-0.0067291259765625,
-0.03289794921875,
0.032806396484375,
0.0220184326171875,
-0.061798095703125,
-0.0130767822265625,
-0.05841064453125,
-0.0229644775390625,
0.01537322998046875,
-0.029205322265625,
0.043853759765625,
-0.0248870849609375,
-0.0157318115234375,
-0.0071258544921875,
0.0216522216796875,
-0.061492919921875,
0.0246734619140625,
0.0164794921875,
0.0582275390625,
-0.065185546875,
0.07159423828125,
0.036529541015625,
-0.039398193359375,
-0.0765380859375,
0.00811004638671875,
0.0089111328125,
-0.0423583984375,
0.02471923828125,
0.01271820068359375,
-0.00687408447265625,
0.0081024169921875,
-0.049591064453125,
-0.0533447265625,
0.09716796875,
0.052764892578125,
-0.0513916015625,
-0.005512237548828125,
-0.0301055908203125,
0.05218505859375,
-0.040985107421875,
0.022796630859375,
0.0207366943359375,
0.0250244140625,
0.0195465087890625,
-0.053497314453125,
0.008148193359375,
-0.019317626953125,
0.0092620849609375,
-0.00966644287109375,
-0.0655517578125,
0.0672607421875,
-0.052764892578125,
-0.016082763671875,
0.0380859375,
0.08245849609375,
0.05517578125,
0.028564453125,
0.0517578125,
0.06939697265625,
0.05206298828125,
-0.0105438232421875,
0.06219482421875,
-0.0267486572265625,
0.0496826171875,
0.08038330078125,
0.005718231201171875,
0.067138671875,
0.0245819091796875,
-0.03094482421875,
0.0531005859375,
0.06494140625,
-0.00925445556640625,
0.01389312744140625,
0.00643157958984375,
-0.01186370849609375,
-0.00972747802734375,
-0.007537841796875,
-0.039276123046875,
0.01462554931640625,
0.0186004638671875,
-0.0305633544921875,
0.00482940673828125,
-0.00908660888671875,
0.0019044876098632812,
-0.005481719970703125,
0.005786895751953125,
0.034576416015625,
0.0084381103515625,
-0.040374755859375,
0.0653076171875,
-0.0263824462890625,
0.0728759765625,
-0.03302001953125,
-0.006359100341796875,
-0.01479339599609375,
0.0192718505859375,
-0.0277862548828125,
-0.061126708984375,
0.032623291015625,
-0.0135498046875,
-0.000911712646484375,
-0.010772705078125,
0.052764892578125,
-0.05181884765625,
-0.048370361328125,
0.028167724609375,
0.033447265625,
0.047332763671875,
0.00457000732421875,
-0.08978271484375,
0.0175628662109375,
0.003116607666015625,
-0.03338623046875,
0.01016998291015625,
0.01739501953125,
0.0279083251953125,
0.037322998046875,
0.0394287109375,
0.0099945068359375,
-0.0030956268310546875,
-0.006305694580078125,
0.050079345703125,
-0.034454345703125,
-0.0084686279296875,
-0.042572021484375,
0.050689697265625,
-0.004543304443359375,
-0.01479339599609375,
0.0477294921875,
0.06292724609375,
0.053619384765625,
-0.00547027587890625,
0.04315185546875,
-0.0208587646484375,
0.0239105224609375,
-0.0318603515625,
0.06500244140625,
-0.066650390625,
0.006961822509765625,
-0.035491943359375,
-0.08074951171875,
-0.00579071044921875,
0.05706787109375,
0.009490966796875,
0.028228759765625,
0.0360107421875,
0.07879638671875,
-0.02764892578125,
-0.0311737060546875,
0.025299072265625,
0.038299560546875,
0.00867462158203125,
0.040557861328125,
0.03326416015625,
-0.066650390625,
0.0307769775390625,
-0.050445556640625,
-0.00801849365234375,
0.001949310302734375,
-0.06939697265625,
-0.050994873046875,
-0.051727294921875,
-0.06097412109375,
-0.0836181640625,
0.006175994873046875,
0.05511474609375,
0.07550048828125,
-0.048431396484375,
-0.02008056640625,
-0.01641845703125,
0.0034770965576171875,
-0.031280517578125,
-0.0211181640625,
0.03680419921875,
-0.00836181640625,
-0.0848388671875,
-0.0086212158203125,
0.0018548965454101562,
0.053985595703125,
-0.0191497802734375,
-0.038543701171875,
-0.01419830322265625,
-0.00559234619140625,
0.0214080810546875,
0.016754150390625,
-0.060302734375,
-0.0035114288330078125,
0.0065460205078125,
-0.0269927978515625,
0.0294342041015625,
0.0258331298828125,
-0.05279541015625,
0.03485107421875,
0.0328369140625,
0.00959014892578125,
0.06658935546875,
-0.010650634765625,
0.0175323486328125,
-0.0391845703125,
0.01458740234375,
0.0138702392578125,
0.041168212890625,
0.0325927734375,
-0.0232086181640625,
0.056915283203125,
0.04962158203125,
-0.055938720703125,
-0.043914794921875,
0.00301361083984375,
-0.10589599609375,
-0.00859832763671875,
0.07171630859375,
-0.03253173828125,
-0.0221405029296875,
0.009002685546875,
-0.0219268798828125,
0.03887939453125,
-0.0217742919921875,
0.047149658203125,
0.03009033203125,
-0.0259246826171875,
-0.0269927978515625,
-0.0305328369140625,
0.045440673828125,
0.03216552734375,
-0.04827880859375,
-0.003879547119140625,
0.0219573974609375,
0.047882080078125,
0.023162841796875,
0.0753173828125,
-0.0176239013671875,
0.002796173095703125,
0.01556396484375,
0.0182342529296875,
0.02972412109375,
-0.0035190582275390625,
-0.018951416015625,
-0.00469207763671875,
-0.00789642333984375,
0.00870513916015625
]
] |
microsoft/speecht5_tts | 2023-10-05T12:24:56.000Z | [
"transformers",
"pytorch",
"speecht5",
"text-to-audio",
"audio",
"text-to-speech",
"dataset:libritts",
"arxiv:2110.07205",
"arxiv:1910.09700",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-to-speech | microsoft | null | null | microsoft/speecht5_tts | 308 | 98,550 | transformers | 2023-02-02T12:56:54 | ---
license: mit
tags:
- audio
- text-to-speech
datasets:
- libritts
---
# SpeechT5 (TTS task)
SpeechT5 model fine-tuned for speech synthesis (text-to-speech) on LibriTTS.
This model was introduced in [SpeechT5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing](https://arxiv.org/abs/2110.07205) by Junyi Ao, Rui Wang, Long Zhou, Chengyi Wang, Shuo Ren, Yu Wu, Shujie Liu, Tom Ko, Qing Li, Yu Zhang, Zhihua Wei, Yao Qian, Jinyu Li, Furu Wei.
SpeechT5 was first released in [this repository](https://github.com/microsoft/SpeechT5/) ([original weights](https://huggingface.co/mechanicalsea/speecht5-tts)). The license used is [MIT](https://github.com/microsoft/SpeechT5/blob/main/LICENSE).
## Model Description
Motivated by the success of T5 (Text-To-Text Transfer Transformer) in pre-trained natural language processing models, we propose a unified-modal SpeechT5 framework that explores the encoder-decoder pre-training for self-supervised speech/text representation learning. The SpeechT5 framework consists of a shared encoder-decoder network and six modal-specific (speech/text) pre/post-nets. After preprocessing the input speech/text through the pre-nets, the shared encoder-decoder network models the sequence-to-sequence transformation, and then the post-nets generate the output in the speech/text modality based on the output of the decoder.
Leveraging large-scale unlabeled speech and text data, we pre-train SpeechT5 to learn a unified-modal representation, hoping to improve the modeling capability for both speech and text. To align the textual and speech information into this unified semantic space, we propose a cross-modal vector quantization approach that randomly mixes up speech/text states with latent units as the interface between encoder and decoder.
Extensive evaluations show the superiority of the proposed SpeechT5 framework on a wide variety of spoken language processing tasks, including automatic speech recognition, speech synthesis, speech translation, voice conversion, speech enhancement, and speaker identification.
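The data flow described above can be sketched schematically. This is an illustrative sketch only: the real pre/post-nets and the shared encoder-decoder are neural networks, and the toy callables below merely stand in for them.

```python
def speecht5_forward(inputs, in_modality, out_modality,
                     pre_nets, encoder_decoder, post_nets):
    # Modal-specific pre-net maps the input (speech or text) into the shared space
    hidden = pre_nets[in_modality](inputs)
    # Shared encoder-decoder models the sequence-to-sequence transformation
    decoded = encoder_decoder(hidden)
    # Modal-specific post-net produces output in the target modality
    return post_nets[out_modality](decoded)

# Toy stand-ins for the modal-specific nets and the shared network
pre_nets = {"text": lambda x: [ord(c) for c in x], "speech": lambda x: x}
post_nets = {"speech": lambda h: [v / 100 for v in h], "text": lambda h: h}
identity = lambda h: h

print(speecht5_forward("ab", "text", "speech", pre_nets, identity, post_nets))
```

The same shared network serves every task (TTS, ASR, voice conversion, ...); only the pre/post-net pair changes.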
- **Developed by:** Junyi Ao, Rui Wang, Long Zhou, Chengyi Wang, Shuo Ren, Yu Wu, Shujie Liu, Tom Ko, Qing Li, Yu Zhang, Zhihua Wei, Yao Qian, Jinyu Li, Furu Wei.
- **Shared by [optional]:** [Matthijs Hollemans](https://huggingface.co/Matthijs)
- **Model type:** text-to-speech
- **Language(s) (NLP):** [More Information Needed]
- **License:** [MIT](https://github.com/microsoft/SpeechT5/blob/main/LICENSE)
- **Finetuned from model [optional]:** [More Information Needed]
## Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/microsoft/SpeechT5/
- **Paper:** https://arxiv.org/pdf/2110.07205.pdf
- **Blog Post:** https://huggingface.co/blog/speecht5
- **Demo:** https://huggingface.co/spaces/Matthijs/speecht5-tts-demo
# Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
## 🤗 Transformers Usage
You can run SpeechT5 TTS locally with the 🤗 Transformers library.
1. First install the 🤗 [Transformers library](https://github.com/huggingface/transformers), sentencepiece, soundfile, and (optionally) datasets:
```
pip install --upgrade pip
pip install --upgrade transformers sentencepiece datasets[audio]
```
2. Run inference via the `"text-to-speech"` (TTS) pipeline. You can access the SpeechT5 model in just a few lines of code!
```python
from transformers import pipeline
from datasets import load_dataset
import soundfile as sf
import torch
synthesiser = pipeline("text-to-speech", "microsoft/speecht5_tts")
embeddings_dataset = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embedding = torch.tensor(embeddings_dataset[7306]["xvector"]).unsqueeze(0)
# You can replace this embedding with your own as well.
speech = synthesiser("Hello, my dog is cooler than you!", forward_params={"speaker_embeddings": speaker_embedding})
sf.write("speech.wav", speech["audio"], samplerate=speech["sampling_rate"])
```
3. Run inference via the Transformers modeling code. For more fine-grained control, you can use the processor + generate code to convert text into a mono 16 kHz speech waveform.
```python
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan
from datasets import load_dataset
import torch
import soundfile as sf
processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_tts")
model = SpeechT5ForTextToSpeech.from_pretrained("microsoft/speecht5_tts")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")
inputs = processor(text="Hello, my dog is cute.", return_tensors="pt")
# load xvector containing speaker's voice characteristics from a dataset
embeddings_dataset = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embeddings = torch.tensor(embeddings_dataset[7306]["xvector"]).unsqueeze(0)
speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)
```
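Since the generated waveform is mono at 16 kHz, its duration in seconds is simply the number of samples divided by 16,000. A small illustrative helper:

```python
def audio_duration_seconds(num_samples, sampling_rate=16_000):
    # Mono waveform: duration = number of samples / sampling rate
    return num_samples / sampling_rate

print(audio_duration_seconds(48_000))  # 3.0
```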
### Fine-tuning the Model
Refer to [this Colab notebook](https://colab.research.google.com/drive/1i7I5pzBcU3WDFarDnzweIj4-sVVoIUFJ) for an example of how to fine-tune SpeechT5 for TTS on a different dataset or a new language.
## Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
You can use this model for speech synthesis. See the [model hub](https://huggingface.co/models?search=speecht5) to look for fine-tuned versions on a task that interests you.
## Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
## Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
# Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
## Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
LibriTTS
## Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
### Preprocessing [optional]
Leveraging large-scale unlabeled speech and text data, we pre-train SpeechT5 to learn a unified-modal representation, hoping to improve the modeling capability for both speech and text.
### Training hyperparameters
- **Precision:** [More Information Needed] <!--fp16, bf16, fp8, fp32 -->
- **Regime:** [More Information Needed] <!--mixed precision or not -->
### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
# Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
## Testing Data, Factors & Metrics
### Testing Data
<!-- This should link to a Data Card if possible. -->
[More Information Needed]
### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
## Results
[More Information Needed]
### Summary
# Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
Extensive evaluations show the superiority of the proposed SpeechT5 framework on a wide variety of spoken language processing tasks, including automatic speech recognition, speech synthesis, speech translation, voice conversion, speech enhancement, and speaker identification.
# Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
# Technical Specifications [optional]
## Model Architecture and Objective
The SpeechT5 framework consists of a shared encoder-decoder network and six modal-specific (speech/text) pre/post-nets.
After preprocessing the input speech/text through the pre-nets, the shared encoder-decoder network models the sequence-to-sequence transformation, and then the post-nets generate the output in the speech/text modality based on the output of the decoder.
## Compute Infrastructure
[More Information Needed]
### Hardware
[More Information Needed]
### Software
[More Information Needed]
# Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```bibtex
@inproceedings{ao-etal-2022-speecht5,
title = {{S}peech{T}5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing},
author = {Ao, Junyi and Wang, Rui and Zhou, Long and Wang, Chengyi and Ren, Shuo and Wu, Yu and Liu, Shujie and Ko, Tom and Li, Qing and Zhang, Yu and Wei, Zhihua and Qian, Yao and Li, Jinyu and Wei, Furu},
booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
month = {May},
year = {2022},
pages={5723--5738},
}
```
# Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
- **text-to-speech**: synthesizing audio from text
# More Information [optional]
[More Information Needed]
# Model Card Authors [optional]
Disclaimer: The team releasing SpeechT5 did not write a model card for this model so this model card has been written by the Hugging Face team.
# Model Card Contact
[More Information Needed]
---
language:
- en
license: llama2
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
model_name: Llama 2 13B Chat
base_model: meta-llama/Llama-2-13b-chat-hf
inference: false
model_creator: Meta Llama 2
model_type: llama
pipeline_tag: text-generation
prompt_template: '[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as
possible, while being safe. Your answers should not include any harmful, unethical,
racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses
are socially unbiased and positive in nature. If a question does not make any sense,
or is not factually coherent, explain why instead of answering something not correct.
If you don''t know the answer to a question, please don''t share false information.
<</SYS>>
{prompt}[/INST]
'
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Llama 2 13B Chat - GPTQ
- Model creator: [Meta Llama 2](https://huggingface.co/meta-llama)
- Original model: [Llama 2 13B Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Meta's Llama 2 13B-chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Llama-2-13B-chat-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Llama-2-13B-chat-GGUF)
* [Meta Llama 2's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/meta-llama/Llama-2-13B-chat-hf)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Llama-2-Chat
```
[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>
{prompt}[/INST]
```
<!-- prompt-template end -->
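A small helper makes it easy to fill this template programmatically. The function below is a hypothetical convenience helper (not part of any library), written to match the template above for a single user turn:

```python
def build_llama2_chat_prompt(system_message, user_message):
    """Fill the Llama-2-Chat template for a single user turn.
    (A hypothetical convenience helper, not part of any library.)"""
    return (
        "[INST] <<SYS>>\n"
        f"{system_message}\n"
        "<</SYS>>\n"
        f"{user_message}[/INST]\n"
    )

print(build_llama2_chat_prompt("You are a helpful assistant.", "Tell me about AI"))
```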
<!-- README_GPTQ.md-provided-files start -->
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All recent GPTQ files, and all files in non-`main` branches, are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
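These parameters are recorded in each branch's `quantize_config.json`. As a rough sketch, the `main` branch settings above correspond to something along the following lines; check the actual file in the repo branch for the authoritative values:

```python
import json

# Approximate settings for the `main` branch (4-bit, group size 128, no Act
# Order); a sketch only, not a copy of the repo's quantize_config.json.
quantize_config = {
    "bits": 4,
    "group_size": 128,
    "desc_act": False,      # "Act Order" in the table below
    "damp_percent": 0.01,
    "sym": True,
    "true_sequential": True,
}
print(json.dumps(quantize_config, indent=2))
```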
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ/tree/main) | 4 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 7.26 GB | Yes | 4-bit, without Act Order and group size 128g. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 8.00 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 7.51 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 7.26 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 13.65 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-64g-actorder_True](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ/tree/gptq-8bit-64g-actorder_True) | 8 | 64 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 13.95 GB | No | 8-bit, with group size 64g and Act Order for even higher inference quality. Poor AutoGPTQ CUDA speed. |
| [gptq-8bit-128g-actorder_False](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ/tree/gptq-8bit-128g-actorder_False) | 8 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 13.65 GB | No | 8-bit, with group size 128g for higher inference quality and without Act Order to improve AutoGPTQ speed. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 13.36 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
<!-- README_GPTQ.md-provided-files end -->
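For intuition on the file sizes above, a back-of-the-envelope estimate helps: packed 4-bit weights for ~13B parameters come to roughly 6.5 GB, and the gap to the ~7.26 GB files is tensors kept in fp16 (embeddings, LM head) plus per-group scales and zeros. A sketch of that arithmetic, under those assumptions:

```python
def approx_weight_gb(n_params, bits, fp16_params=0):
    """Very rough on-disk size in GB: packed quantised weights plus any tensors
    kept in fp16, ignoring per-group scale/zero metadata overhead."""
    return (n_params * bits / 8 + fp16_params * 2) / 1e9

# ~13B parameters at 4 bits: in the same ballpark as the ~7 GB files above.
print(round(approx_weight_gb(13e9, 4), 2))
```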
<!-- README_GPTQ.md-download-from-branches start -->
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, e.g. `TheBloke/Llama-2-13B-chat-GPTQ:main`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch main https://huggingface.co/TheBloke/Llama-2-13B-chat-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see below.
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Llama-2-13B-chat-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Llama-2-13B-chat-GPTQ:main`
    - See the Provided Files table above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Llama-2-13B-chat-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install "transformers>=4.32.0" "optimum>=1.12.0"
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/  # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip3 install .
```
### For CodeLlama models only: you must use Transformers 4.33.0 or later.
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source:
```shell
pip3 uninstall -y transformers
pip3 install git+https://github.com/huggingface/transformers.git
```
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Llama-2-13B-chat-GPTQ"
# To use a different branch, change revision
# For example: revision="main"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>
{prompt}[/INST]
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Meta's Llama 2 13B-chat
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 13B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The larger 70B model uses Grouped-Query Attention (GQA) for improved inference scalability.
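The table's figures imply the optimizer step count directly. As a back-of-the-envelope check (not a number stated in the card):

```python
# Rough training-step count implied by the table above:
# 2.0T pretraining tokens at a global batch size of 4M tokens.
tokens = 2_000_000_000_000   # 2.0T
global_batch = 4_000_000     # 4M tokens per step
steps = tokens // global_batch
print(steps)  # 500000 optimizer steps
```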
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** ["Llama 2: Open Foundation and Fine-Tuned Chat Models"](https://arxiv.org/abs/2307.09288)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
To get the expected features and performance for the chat versions, a specific format must be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See our reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
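For a single-turn prompt, the format can be sketched as below. The helper name and default system prompt are illustrative, not Meta's reference implementation; the `BOS`/`EOS` tokens are normally added by the tokenizer rather than in the string itself:

```python
# Sketch of the Llama 2 single-turn chat format described above.
# See the referenced chat_completion code for the authoritative version.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_prompt(user_message: str, system_prompt: str) -> str:
    # strip() the inputs to avoid double spaces, as recommended above.
    return f"{B_INST} {B_SYS}{system_prompt.strip()}{E_SYS}{user_message.strip()} {E_INST}"

prompt = build_prompt("Tell me about llamas.", "You are a helpful assistant.")
print(prompt)
```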
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
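The table can be approximately reproduced from GPU hours and power draw. The grid carbon intensity is not stated in the card; ~0.423 kg CO2e/kWh is inferred here from the published numbers and is an assumption:

```python
# Back-of-the-envelope reproduction of the emissions table above.
CARBON_INTENSITY = 0.423  # kg CO2e per kWh -- assumed, not stated in the card

def tco2eq(gpu_hours: int, watts: int = 400) -> float:
    kwh = gpu_hours * watts / 1000       # energy in kWh
    return kwh * CARBON_INTENSITY / 1000  # tonnes of CO2e

print(round(tco2eq(184_320), 2))  # close to the 31.22 tCO2eq reported for 7B
```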
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/).
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/meta-llama/Llama-2-7b) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/meta-llama/Llama-2-13b) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/meta-llama/Llama-2-70b) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)|
---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- ontonotes
inference: false
---
## English Part-of-Speech Tagging in Flair (default model)
This is the standard part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **98.19** (Ontonotes)
Predicts fine-grained POS tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
|ADD | Email |
|AFX | Affix |
|CC | Coordinating conjunction |
|CD | Cardinal number |
|DT | Determiner |
|EX | Existential there |
|FW | Foreign word |
|HYPH | Hyphen |
|IN | Preposition or subordinating conjunction |
|JJ | Adjective |
|JJR |Adjective, comparative |
|JJS | Adjective, superlative |
|LS | List item marker |
|MD | Modal |
|NFP | Superfluous punctuation |
|NN | Noun, singular or mass |
|NNP |Proper noun, singular |
|NNPS | Proper noun, plural |
|NNS |Noun, plural |
|PDT | Predeterminer |
|POS | Possessive ending |
|PRP | Personal pronoun |
|PRP$ | Possessive pronoun |
|RB | Adverb |
|RBR | Adverb, comparative |
|RBS | Adverb, superlative |
|RP | Particle |
|SYM | Symbol |
|TO | to |
|UH | Interjection |
|VB | Verb, base form |
|VBD | Verb, past tense |
|VBG | Verb, gerund or present participle |
|VBN | Verb, past participle |
|VBP | Verb, non-3rd person singular present |
|VBZ | Verb, 3rd person singular present |
|WDT | Wh-determiner |
|WP | Wh-pronoun |
|WP$ | Possessive wh-pronoun |
|WRB | Wh-adverb |
|XX | Unknown |
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
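For downstream use, the fine-grained Penn Treebank-style tags above are often collapsed into coarse word classes by prefix. A minimal stdlib-only sketch (the bucket choices are mine, not part of the model):

```python
# Collapse the fine-grained tags from the table above into coarse classes.
# The chosen buckets are illustrative; adjust them to your task.
def coarse_tag(tag: str) -> str:
    if tag.startswith("NN"):
        return "noun"
    if tag.startswith("VB"):
        return "verb"
    if tag.startswith("JJ"):
        return "adjective"
    if tag.startswith("RB"):
        return "adverb"
    if tag in {"PRP", "PRP$", "WP", "WP$"}:
        return "pronoun"
    return "other"

print([coarse_tag(t) for t in ("PRP", "VBP", "NNP")])  # pronoun, verb, noun
```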
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/pos-english")
# make example sentence
sentence = Sentence("I love Berlin.")
# predict POS tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted POS spans
print('The following POS tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('pos'):
print(entity)
```
This yields the following output:
```
Span [1]: "I" [− Labels: PRP (1.0)]
Span [2]: "love" [− Labels: VBP (1.0)]
Span [3]: "Berlin" [− Labels: NNP (0.9999)]
Span [4]: "." [− Labels: . (1.0)]
```
So, the word "*I*" is labeled as a **pronoun** (PRP), "*love*" is labeled as a **verb** (VBP) and "*Berlin*" is labeled as a **proper noun** (NNP) in the sentence "*I love Berlin*".
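If you only have the printed output rather than the `Sentence` object, the span lines can be parsed back into `(token, tag, score)` tuples with a small regex. This sketch assumes the exact print format shown above (including the "−" dash):

```python
import re

# Parse lines of the form: Span [2]: "love" [− Labels: VBP (1.0)]
SPAN_RE = re.compile(
    r'Span \[\d+\]: "(?P<token>[^"]+)" \[. Labels: (?P<tag>\S+) \((?P<score>[\d.]+)\)\]'
)

def parse_spans(output: str):
    # Return one (token, tag, score) tuple per matched span line.
    return [(m["token"], m["tag"], float(m["score"]))
            for m in SPAN_RE.finditer(output)]

example = 'Span [2]: "love" [− Labels: VBP (1.0)]'
print(parse_spans(example))  # [('love', 'VBP', 1.0)]
```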
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
# 1. load the corpus (Ontonotes does not ship with Flair, you need to download and reformat into a column format yourself)
corpus: Corpus = ColumnCorpus(
"resources/tasks/onto-ner",
column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
tag_to_bioes="ner",
)
# 2. what tag do we want to predict?
tag_type = 'pos'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('news-forward'),
# contextual string embeddings, backward
FlairEmbeddings('news-backward'),
]
# embedding stack consists of forward and backward Flair embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/pos-english',
train_with_dev=True,
max_epochs=150)
```
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
0.033843994140625,
-0.06195068359375,
0.0355224609375,
-0.00225830078125,
-0.040191650390625,
-0.030120849609375,
-0.018096923828125,
0.005268096923828125,
0.031890869140625,
-0.0176849365234375,
0.0338134765625,
0.033477783203125,
0.00801849365234375,
-0.051055908203125,
-0.05133056640625,
-0.01268768310546875,
-0.019195556640625,
-0.050506591796875,
0.04656982421875,
-0.004611968994140625,
-0.0144195556640625,
0.01103973388671875,
0.01071929931640625,
-0.009429931640625,
0.0169830322265625,
0.01148223876953125,
0.034454345703125,
-0.01467132568359375,
0.013092041015625,
-0.0225677490234375,
0.004852294921875,
-0.00972747802734375,
-0.01140594482421875,
0.05963134765625,
-0.01044464111328125,
0.0149078369140625,
-0.04119873046875,
0.01247406005859375,
0.018524169921875,
-0.02618408203125,
0.06146240234375,
0.06256103515625,
-0.03875732421875,
-0.006313323974609375,
-0.025634765625,
-0.00966644287109375,
-0.0261688232421875,
0.034912109375,
-0.0384521484375,
-0.056396484375,
0.040679931640625,
0.01194000244140625,
0.01142120361328125,
0.059814453125,
0.03717041015625,
-0.01123809814453125,
0.06976318359375,
0.04498291015625,
-0.017364501953125,
0.0298614501953125,
-0.036773681640625,
0.01047515869140625,
-0.055755615234375,
-0.01085662841796875,
-0.039581298828125,
-0.01248931884765625,
-0.05328369140625,
-0.0280609130859375,
0.004322052001953125,
0.035125732421875,
-0.0249481201171875,
0.044708251953125,
-0.04132080078125,
0.0179901123046875,
0.045257568359375,
-0.00989532470703125,
0.006256103515625,
-0.0083160400390625,
-0.0296478271484375,
-0.0207366943359375,
-0.0509033203125,
-0.04302978515625,
0.059814453125,
0.0321044921875,
0.042633056640625,
0.0030269622802734375,
0.0653076171875,
0.003185272216796875,
0.0173797607421875,
-0.07098388671875,
0.042755126953125,
-0.0213775634765625,
-0.057830810546875,
-0.0050048828125,
-0.00858306884765625,
-0.074951171875,
0.01265716552734375,
-0.021881103515625,
-0.071533203125,
0.021514892578125,
0.0078582763671875,
-0.03997802734375,
0.0277862548828125,
-0.0279083251953125,
0.06939697265625,
-0.0005502700805664062,
-0.0223846435546875,
0.0190277099609375,
-0.05633544921875,
0.018096923828125,
0.0174713134765625,
0.0249481201171875,
-0.0187530517578125,
-0.008087158203125,
0.0787353515625,
-0.0208282470703125,
0.07086181640625,
0.002620697021484375,
0.0109405517578125,
0.02227783203125,
0.004276275634765625,
0.0233001708984375,
0.001987457275390625,
-0.0033283233642578125,
0.004360198974609375,
0.0013561248779296875,
-0.01511383056640625,
-0.005397796630859375,
0.04718017578125,
-0.053619384765625,
-0.02337646484375,
-0.06353759765625,
-0.0221099853515625,
-0.00872039794921875,
0.0208282470703125,
0.05450439453125,
0.033050537109375,
-0.016265869140625,
-0.004917144775390625,
0.033233642578125,
-0.0189208984375,
0.0447998046875,
0.0313720703125,
-0.0304718017578125,
-0.049407958984375,
0.0689697265625,
0.015289306640625,
-0.00847625732421875,
0.045623779296875,
0.0168304443359375,
-0.03143310546875,
-0.01279449462890625,
-0.013275146484375,
0.049224853515625,
-0.0364990234375,
-0.0274810791015625,
-0.050811767578125,
-0.01367950439453125,
-0.06915283203125,
-0.0111083984375,
-0.017059326171875,
-0.04046630859375,
-0.053985595703125,
-0.0021533966064453125,
0.0259857177734375,
0.057342529296875,
-0.0224151611328125,
0.0232086181640625,
-0.05029296875,
-0.00382232666015625,
0.001247406005859375,
0.003955841064453125,
-0.020599365234375,
-0.06549072265625,
-0.0186614990234375,
-0.007354736328125,
-0.0276641845703125,
-0.08135986328125,
0.06402587890625,
0.0246429443359375,
0.03271484375,
0.0269927978515625,
-0.0011425018310546875,
0.039093017578125,
-0.0357666015625,
0.080322265625,
0.005847930908203125,
-0.0667724609375,
0.041473388671875,
-0.030364990234375,
0.007415771484375,
0.0174407958984375,
0.06781005859375,
-0.0419921875,
-0.00864410400390625,
-0.063232421875,
-0.07440185546875,
0.04425048828125,
-0.00453948974609375,
0.00307464599609375,
-0.0306243896484375,
0.01557159423828125,
-0.01153564453125,
0.0110931396484375,
-0.07275390625,
-0.0443115234375,
-0.0103302001953125,
-0.0169525146484375,
-0.01611328125,
-0.0185546875,
0.00231170654296875,
-0.04376220703125,
0.087646484375,
0.003063201904296875,
0.042938232421875,
0.03924560546875,
0.002506256103515625,
0.01209259033203125,
0.0191650390625,
0.048675537109375,
0.0198822021484375,
-0.0257415771484375,
-0.0006351470947265625,
0.006351470947265625,
-0.0189666748046875,
-0.00988006591796875,
0.01361846923828125,
-0.0010614395141601562,
0.0224761962890625,
0.035064697265625,
0.056121826171875,
0.007785797119140625,
-0.021514892578125,
0.0517578125,
0.0016622543334960938,
-0.01806640625,
-0.035552978515625,
-0.021697998046875,
0.01629638671875,
0.00942230224609375,
0.007678985595703125,
0.006687164306640625,
-0.006317138671875,
-0.04180908203125,
0.0167694091796875,
0.033721923828125,
-0.0293121337890625,
-0.039093017578125,
0.0604248046875,
-0.0011653900146484375,
-0.01415252685546875,
0.01482391357421875,
-0.044219970703125,
-0.06829833984375,
0.042694091796875,
0.048736572265625,
0.05499267578125,
-0.0211639404296875,
0.01361846923828125,
0.048431396484375,
0.0155181884765625,
-0.004276275634765625,
0.0611572265625,
0.02392578125,
-0.08056640625,
-0.0247650146484375,
-0.0728759765625,
0.00360107421875,
0.0175628662109375,
-0.04315185546875,
0.0302276611328125,
-0.033538818359375,
-0.03289794921875,
0.0289154052734375,
0.01129913330078125,
-0.0548095703125,
0.021759033203125,
0.02935791015625,
0.082275390625,
-0.07611083984375,
0.0745849609375,
0.08868408203125,
-0.0604248046875,
-0.0775146484375,
-0.00804901123046875,
-0.005489349365234375,
-0.045379638671875,
0.057342529296875,
0.0159912109375,
0.0260467529296875,
0.0135955810546875,
-0.045623779296875,
-0.0831298828125,
0.06787109375,
-0.016876220703125,
-0.0231475830078125,
-0.015289306640625,
-0.01303863525390625,
0.037322998046875,
-0.034027099609375,
0.03253173828125,
0.04522705078125,
0.0380859375,
-0.000026404857635498047,
-0.07818603515625,
0.00136566162109375,
-0.0231781005859375,
-0.0102386474609375,
0.006561279296875,
-0.055755615234375,
0.08074951171875,
-0.0119781494140625,
-0.01557159423828125,
0.01800537109375,
0.0670166015625,
-0.0024280548095703125,
0.003192901611328125,
0.0245513916015625,
0.06781005859375,
0.050872802734375,
-0.0245513916015625,
0.05938720703125,
-0.023101806640625,
0.03594970703125,
0.0850830078125,
-0.0027942657470703125,
0.08013916015625,
0.0266265869140625,
-0.0200958251953125,
0.038848876953125,
0.0594482421875,
-0.00862884521484375,
0.035491943359375,
0.0160980224609375,
-0.0074310302734375,
-0.0250396728515625,
-0.026336669921875,
-0.0309600830078125,
0.051422119140625,
0.02557373046875,
-0.04046630859375,
0.00605010986328125,
0.00033354759216308594,
0.04779052734375,
0.0006022453308105469,
-0.0186004638671875,
0.06256103515625,
0.000045359134674072266,
-0.04766845703125,
0.047454833984375,
0.0075836181640625,
0.07568359375,
-0.0279541015625,
0.0049896240234375,
-0.01074981689453125,
0.01233673095703125,
-0.02056884765625,
-0.055511474609375,
0.01235198974609375,
-0.02593994140625,
-0.0150604248046875,
-0.005218505859375,
0.049468994140625,
-0.051116943359375,
-0.0200958251953125,
0.02197265625,
0.035888671875,
0.0179290771484375,
0.004184722900390625,
-0.052337646484375,
-0.01319122314453125,
0.01264190673828125,
-0.02618408203125,
0.012939453125,
0.01104736328125,
0.01081085205078125,
0.03363037109375,
0.0291900634765625,
0.0194091796875,
0.00855255126953125,
-0.012054443359375,
0.06341552734375,
-0.05963134765625,
-0.0286712646484375,
-0.0648193359375,
0.05133056640625,
0.0016870498657226562,
-0.039947509765625,
0.06341552734375,
0.05633544921875,
0.06915283203125,
-0.01010894775390625,
0.0634765625,
-0.031280517578125,
0.060333251953125,
-0.0128326416015625,
0.049896240234375,
-0.055145263671875,
-0.0038509368896484375,
-0.020782470703125,
-0.05096435546875,
-0.039581298828125,
0.05572509765625,
-0.033447265625,
-0.0205841064453125,
0.046783447265625,
0.061553955078125,
0.01470184326171875,
-0.00717926025390625,
0.01293182373046875,
0.03790283203125,
-0.0002428293228149414,
0.033538818359375,
0.053619384765625,
-0.046478271484375,
0.0229644775390625,
-0.04779052734375,
-0.016021728515625,
-0.0241851806640625,
-0.062286376953125,
-0.06195068359375,
-0.07293701171875,
-0.03631591796875,
-0.060699462890625,
-0.0135498046875,
0.09124755859375,
0.027923583984375,
-0.064453125,
-0.0124359130859375,
0.0160675048828125,
0.0033283233642578125,
-0.0004439353942871094,
-0.022705078125,
0.031158447265625,
-0.0172119140625,
-0.0576171875,
0.0255126953125,
-0.0156402587890625,
0.01241302490234375,
0.0075225830078125,
0.01206207275390625,
-0.056121826171875,
0.008087158203125,
0.03326416015625,
0.026947021484375,
-0.0577392578125,
-0.01461029052734375,
0.007617950439453125,
-0.0224761962890625,
0.011962890625,
0.01361846923828125,
-0.0479736328125,
0.01558685302734375,
0.06365966796875,
0.0183868408203125,
0.0099639892578125,
0.006824493408203125,
0.0205841064453125,
-0.051116943359375,
-0.0012750625610351562,
0.03472900390625,
0.045684814453125,
0.019317626953125,
-0.0135955810546875,
0.0289306640625,
0.0411376953125,
-0.055755615234375,
-0.05657958984375,
0.0015897750854492188,
-0.08074951171875,
-0.019927978515625,
0.09979248046875,
-0.01387786865234375,
-0.034423828125,
-0.000232696533203125,
-0.0178375244140625,
0.038909912109375,
-0.034027099609375,
0.021881103515625,
0.042633056640625,
-0.0117950439453125,
0.0243072509765625,
-0.01538848876953125,
0.06402587890625,
0.0294036865234375,
-0.0318603515625,
-0.018524169921875,
0.0205841064453125,
0.038818359375,
0.0296173095703125,
0.04742431640625,
0.006687164306640625,
0.0046844482421875,
-0.0026264190673828125,
0.033050537109375,
0.0130615234375,
-0.01336669921875,
-0.031768798828125,
-0.004077911376953125,
-0.005756378173828125,
-0.0276947021484375
]
] |
alisawuffles/roberta-large-wanli | 2023-06-14T04:58:48.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"text-classification",
"en",
"dataset:alisawuffles/WANLI",
"endpoints_compatible",
"region:us"
] | text-classification | alisawuffles | null | null | alisawuffles/roberta-large-wanli | 6 | 97,577 | transformers | 2022-03-30T20:00:10 | ---
language:
- en
tags:
- text-classification
widget:
- text: "I almost forgot to eat lunch.</s></s>I didn't forget to eat lunch."
- text: "I almost forgot to eat lunch.</s></s>I forgot to eat lunch."
- text: "I ate lunch.</s></s>I almost forgot to eat lunch."
datasets:
- alisawuffles/WANLI
---
This is an off-the-shelf roberta-large model finetuned on WANLI, the Worker-AI Collaborative NLI dataset ([Liu et al., 2022](https://aclanthology.org/2022.findings-emnlp.508/)). It outperforms the `roberta-large-mnli` model on eight out-of-domain test sets, including by 11% on HANS and 9% on Adversarial NLI.
### How to use
```python
import torch
from transformers import RobertaTokenizer, RobertaForSequenceClassification

model = RobertaForSequenceClassification.from_pretrained('alisawuffles/roberta-large-wanli')
tokenizer = RobertaTokenizer.from_pretrained('alisawuffles/roberta-large-wanli')

# encode a premise/hypothesis pair
x = tokenizer("I almost forgot to eat lunch.", "I didn't forget to eat lunch.", return_tensors='pt', max_length=128, truncation=True)
logits = model(**x).logits
probs = logits.softmax(dim=1).squeeze(0)
label_id = torch.argmax(probs).item()
prediction = model.config.id2label[label_id]
```
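For readers who want to see the final label-decoding step in isolation, the softmax-then-`id2label` lookup can be sketched in plain Python. The logits and label mapping below are made-up stand-ins for what the model would actually return (consult `model.config.id2label` for the real mapping):

```python
import math

# Hypothetical logits for one premise/hypothesis pair; in practice these
# come from model(**x).logits for a single example.
logits = [2.1, -0.3, 0.4]

# Hypothetical label mapping -- check model.config.id2label for the real one.
id2label = {0: "entailment", 1: "contradiction", 2: "neutral"}

# Softmax: exponentiate (shifted by the max for numerical stability), normalize.
m = max(logits)
exps = [math.exp(v - m) for v in logits]
probs = [e / sum(exps) for e in exps]

# The predicted label is the index with the highest probability.
label_id = max(range(len(probs)), key=probs.__getitem__)
prediction = id2label[label_id]
print(prediction)  # -> entailment
```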
### Citation
```
@inproceedings{liu-etal-2022-wanli,
title = "{WANLI}: Worker and {AI} Collaboration for Natural Language Inference Dataset Creation",
author = "Liu, Alisa and
Swayamdipta, Swabha and
Smith, Noah A. and
Choi, Yejin",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2022",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.findings-emnlp.508",
pages = "6826--6847",
abstract = "A recurring challenge of crowdsourcing NLP datasets at scale is that human writers often rely on repetitive patterns when crafting examples, leading to a lack of linguistic diversity. We introduce a novel approach for dataset creation based on worker and AI collaboration, which brings together the generative strength of language models and the evaluative strength of humans. Starting with an existing dataset, MultiNLI for natural language inference (NLI), our approach uses dataset cartography to automatically identify examples that demonstrate challenging reasoning patterns, and instructs GPT-3 to compose new examples with similar patterns. Machine generated examples are then automatically filtered, and finally revised and labeled by human crowdworkers. The resulting dataset, WANLI, consists of 107,885 NLI examples and presents unique empirical strengths over existing NLI datasets. Remarkably, training a model on WANLI improves performance on eight out-of-domain test sets we consider, including by 11{\%} on HANS and 9{\%} on Adversarial NLI, compared to training on the 4x larger MultiNLI. Moreover, it continues to be more effective than MultiNLI augmented with other NLI datasets. Our results demonstrate the promise of leveraging natural language generation techniques and re-imagining the role of humans in the dataset creation process.",
}
``` | 3,145 | [
[
-0.0238189697265625,
-0.044036865234375,
0.01064300537109375,
0.01715087890625,
0.005340576171875,
-0.0081939697265625,
-0.040191650390625,
-0.034912109375,
0.0102081298828125,
0.039093017578125,
-0.055694580078125,
-0.026123046875,
-0.0345458984375,
0.0231170654296875,
-0.0292205810546875,
0.08612060546875,
0.034149169921875,
-0.005054473876953125,
-0.0022125244140625,
-0.0186309814453125,
-0.0177459716796875,
-0.03631591796875,
-0.05523681640625,
-0.0250244140625,
0.0452880859375,
0.0248870849609375,
0.060272216796875,
0.03253173828125,
0.0196990966796875,
0.026580810546875,
0.0034923553466796875,
0.040618896484375,
-0.0391845703125,
-0.004730224609375,
0.0150604248046875,
-0.047637939453125,
-0.027618408203125,
0.01409912109375,
0.033782958984375,
0.064697265625,
-0.0120697021484375,
0.01123046875,
0.0021419525146484375,
0.054962158203125,
-0.03662109375,
-0.0007424354553222656,
-0.06304931640625,
-0.00316619873046875,
-0.0345458984375,
0.03338623046875,
-0.04974365234375,
-0.01171112060546875,
0.00971221923828125,
-0.051239013671875,
0.03363037109375,
0.01494598388671875,
0.0704345703125,
0.03277587890625,
-0.02105712890625,
-0.016632080078125,
-0.046539306640625,
0.08575439453125,
-0.07391357421875,
0.020263671875,
0.0295257568359375,
0.0159759521484375,
-0.01212310791015625,
-0.051971435546875,
-0.0640869140625,
-0.00881195068359375,
-0.0043792724609375,
0.01522064208984375,
-0.0124359130859375,
-0.013153076171875,
0.0218505859375,
0.0250701904296875,
-0.065673828125,
0.01306915283203125,
-0.037139892578125,
0.01849365234375,
0.0633544921875,
0.01396942138671875,
0.0100860595703125,
-0.017242431640625,
-0.005481719970703125,
-0.0121002197265625,
-0.046875,
-0.0199737548828125,
0.020904541015625,
0.0295867919921875,
-0.006572723388671875,
0.040252685546875,
-0.00010901689529418945,
0.07366943359375,
-0.003757476806640625,
0.00426483154296875,
0.045013427734375,
-0.0256500244140625,
-0.015960693359375,
-0.0273895263671875,
0.06903076171875,
0.0194854736328125,
0.0243682861328125,
-0.030120849609375,
0.0010881423950195312,
-0.007904052734375,
0.01348876953125,
-0.0389404296875,
-0.035430908203125,
0.0225830078125,
-0.032073974609375,
-0.0228118896484375,
0.01399993896484375,
-0.060089111328125,
-0.0294647216796875,
-0.03289794921875,
0.01499176025390625,
-0.0276947021484375,
-0.0213775634765625,
0.00182342529296875,
-0.0189208984375,
0.00849151611328125,
0.0013132095336914062,
-0.05670166015625,
0.00733184814453125,
0.0614013671875,
0.06097412109375,
-0.0130615234375,
-0.05078125,
-0.040252685546875,
0.00815582275390625,
-0.01503753662109375,
0.03607177734375,
-0.01983642578125,
-0.0174713134765625,
-0.0095367431640625,
-0.0123443603515625,
-0.0201416015625,
-0.04998779296875,
0.03924560546875,
-0.03131103515625,
0.039581298828125,
0.0045318603515625,
-0.052093505859375,
-0.008026123046875,
0.011322021484375,
-0.07098388671875,
0.08331298828125,
0.00180816650390625,
-0.055572509765625,
0.031158447265625,
-0.06640625,
-0.0299835205078125,
-0.0054779052734375,
-0.0065155029296875,
-0.05242919921875,
-0.002910614013671875,
0.01483154296875,
0.0295257568359375,
-0.04229736328125,
0.05712890625,
-0.0274658203125,
-0.01210784912109375,
0.0355224609375,
-0.039306640625,
0.08880615234375,
0.015838623046875,
-0.028594970703125,
0.00302886962890625,
-0.06982421875,
0.00862884521484375,
0.007904052734375,
-0.017303466796875,
-0.0164642333984375,
-0.017974853515625,
0.0173187255859375,
0.01076507568359375,
0.0002180337905883789,
-0.04150390625,
0.004009246826171875,
-0.041656494140625,
0.039306640625,
0.04827880859375,
-0.0186004638671875,
0.02081298828125,
-0.048187255859375,
0.0214385986328125,
-0.032928466796875,
0.0224761962890625,
0.0177154541015625,
-0.06781005859375,
-0.06280517578125,
-0.030059814453125,
0.03594970703125,
0.03973388671875,
-0.05194091796875,
0.070556640625,
-0.0254669189453125,
-0.048431396484375,
-0.053009033203125,
0.01085662841796875,
0.0295867919921875,
0.0135650634765625,
0.0237274169921875,
0.0013256072998046875,
-0.06707763671875,
-0.062286376953125,
-0.01318359375,
-0.000010132789611816406,
-0.0254058837890625,
0.01529693603515625,
0.0362548828125,
-0.0195465087890625,
0.0870361328125,
-0.036163330078125,
-0.0224609375,
-0.044677734375,
0.0257568359375,
0.052337646484375,
0.044189453125,
0.0306396484375,
-0.068603515625,
-0.03643798828125,
-0.024810791015625,
-0.0657958984375,
-0.0028896331787109375,
-0.019500732421875,
-0.0277099609375,
0.01422882080078125,
0.0124969482421875,
-0.048431396484375,
0.0227508544921875,
0.0596923828125,
-0.02215576171875,
0.031158447265625,
-0.006595611572265625,
-0.0161285400390625,
-0.10235595703125,
0.02117919921875,
-0.0037746429443359375,
0.0015964508056640625,
-0.058502197265625,
0.003971099853515625,
-0.032012939453125,
-0.015655517578125,
-0.007572174072265625,
0.03802490234375,
-0.031829833984375,
-0.003040313720703125,
-0.009124755859375,
0.0106658935546875,
0.014251708984375,
0.03564453125,
-0.0079345703125,
0.0555419921875,
0.02288818359375,
-0.0300750732421875,
0.0187530517578125,
0.0377197265625,
-0.032806396484375,
0.017608642578125,
-0.060272216796875,
0.02703857421875,
0.01080322265625,
0.0273895263671875,
-0.045928955078125,
-0.01097869873046875,
0.005950927734375,
-0.0350341796875,
0.003993988037109375,
-0.0036830902099609375,
-0.04083251953125,
-0.034210205078125,
-0.00893402099609375,
0.0308685302734375,
0.042022705078125,
-0.059814453125,
0.058746337890625,
0.033233642578125,
0.00917816162109375,
-0.0372314453125,
-0.0728759765625,
0.0250701904296875,
-0.026611328125,
-0.056365966796875,
0.018890380859375,
-0.02313232421875,
-0.040374755859375,
-0.0078277587890625,
0.03656005859375,
-0.0269317626953125,
0.01346588134765625,
0.023223876953125,
0.0256500244140625,
-0.00916290283203125,
0.01348876953125,
-0.03125,
-0.012939453125,
0.0008215904235839844,
-0.0212860107421875,
0.039581298828125,
-0.01265716552734375,
-0.0229644775390625,
-0.033203125,
0.029693603515625,
0.0218658447265625,
-0.016357421875,
0.061279296875,
0.0704345703125,
-0.0218658447265625,
-0.0247039794921875,
-0.040618896484375,
-0.0013666152954101562,
-0.03643798828125,
0.02313232421875,
-0.0201873779296875,
-0.04541015625,
0.0140533447265625,
0.014434814453125,
0.0187225341796875,
0.045166015625,
0.023345947265625,
0.01428985595703125,
0.055267333984375,
0.023468017578125,
-0.03125,
0.0247955322265625,
-0.05743408203125,
0.01218414306640625,
-0.06402587890625,
0.00316619873046875,
-0.040557861328125,
-0.0209808349609375,
-0.037200927734375,
-0.043182373046875,
0.0183563232421875,
-0.0004184246063232422,
-0.0174560546875,
0.0223236083984375,
-0.037628173828125,
0.033905029296875,
0.06658935546875,
0.01837158203125,
0.0219573974609375,
-0.0125885009765625,
0.0322265625,
-0.00878143310546875,
-0.06475830078125,
-0.0238189697265625,
0.106689453125,
-0.00457000732421875,
0.02728271484375,
0.027252197265625,
0.068359375,
0.0293731689453125,
0.039764404296875,
-0.0288848876953125,
0.032958984375,
-0.045562744140625,
-0.049285888671875,
-0.039031982421875,
-0.040557861328125,
-0.082763671875,
0.0170745849609375,
-0.0290679931640625,
-0.040191650390625,
0.00942230224609375,
-0.005123138427734375,
-0.0221099853515625,
0.02392578125,
-0.056915283203125,
0.06103515625,
-0.006908416748046875,
-0.017974853515625,
-0.0164337158203125,
-0.033538818359375,
0.0367431640625,
-0.01045989990234375,
0.028656005859375,
-0.01274871826171875,
0.022125244140625,
0.061248779296875,
-0.047149658203125,
0.08453369140625,
-0.0196533203125,
0.004955291748046875,
0.025909423828125,
-0.023345947265625,
0.037567138671875,
0.01387786865234375,
-0.049774169921875,
0.0243682861328125,
-0.0474853515625,
-0.03619384765625,
-0.053314208984375,
0.05426025390625,
-0.0634765625,
-0.033905029296875,
-0.005420684814453125,
-0.046722412109375,
-0.004116058349609375,
0.01468658447265625,
0.025421142578125,
0.05035400390625,
-0.0198822021484375,
0.01488494873046875,
0.0285797119140625,
-0.01146697998046875,
0.031982421875,
0.0155487060546875,
0.0005259513854980469,
-0.0465087890625,
0.07733154296875,
-0.0064239501953125,
0.01108551025390625,
0.0164642333984375,
0.0228729248046875,
-0.009307861328125,
-0.034912109375,
-0.0494384765625,
0.0123443603515625,
-0.0300445556640625,
-0.01210784912109375,
-0.04803466796875,
-0.0187835693359375,
-0.0184783935546875,
-0.01131439208984375,
-0.021636962890625,
-0.03594970703125,
-0.022430419921875,
0.00780487060546875,
0.037841796875,
0.06829833984375,
-0.01105499267578125,
0.0015745162963867188,
-0.045806884765625,
0.012969970703125,
0.0245208740234375,
0.00040721893310546875,
0.0132598876953125,
-0.042022705078125,
-0.012786865234375,
0.0200653076171875,
0.0013914108276367188,
-0.032562255859375,
0.0438232421875,
0.043975830078125,
0.038421630859375,
0.0155029296875,
0.0220947265625,
0.054779052734375,
-0.033355712890625,
0.06866455078125,
0.007259368896484375,
-0.07208251953125,
0.0217742919921875,
-0.015625,
0.020843505859375,
0.04864501953125,
0.019195556640625,
-0.037841796875,
-0.049530029296875,
-0.06512451171875,
-0.0748291015625,
0.032928466796875,
0.03363037109375,
-0.00838470458984375,
-0.002532958984375,
0.0168914794921875,
0.020599365234375,
0.004756927490234375,
-0.060272216796875,
-0.0238800048828125,
-0.00753021240234375,
-0.036956787109375,
-0.0246429443359375,
0.006694793701171875,
-0.00836944580078125,
-0.0218505859375,
0.06463623046875,
-0.0142059326171875,
0.0219573974609375,
0.00424957275390625,
-0.013916015625,
0.0200653076171875,
0.0032196044921875,
0.0301361083984375,
0.046539306640625,
-0.006237030029296875,
-0.00009429454803466797,
0.03094482421875,
-0.0189361572265625,
-0.0166015625,
0.021331787109375,
-0.0053863525390625,
0.01055908203125,
0.048187255859375,
0.058685302734375,
-0.0021839141845703125,
-0.06317138671875,
0.038177490234375,
-0.01068878173828125,
-0.020233154296875,
-0.055084228515625,
0.003875732421875,
-0.0029926300048828125,
0.025634765625,
0.037109375,
0.024566650390625,
0.010284423828125,
-0.0187835693359375,
0.0276947021484375,
0.02001953125,
-0.006374359130859375,
-0.0238037109375,
0.05255126953125,
0.015838623046875,
-0.03509521484375,
0.0755615234375,
-0.0333251953125,
-0.03106689453125,
0.05609130859375,
0.0362548828125,
0.0633544921875,
0.0138092041015625,
0.013153076171875,
0.038604736328125,
0.005615234375,
-0.004436492919921875,
0.013641357421875,
0.0155029296875,
-0.06640625,
-0.0313720703125,
-0.031097412109375,
-0.0263671875,
0.01428985595703125,
-0.05615234375,
0.005504608154296875,
-0.0233001708984375,
0.000274658203125,
0.016754150390625,
-0.0024280548095703125,
-0.06219482421875,
0.011444091796875,
-0.0048065185546875,
0.0478515625,
-0.074462890625,
0.07147216796875,
0.0207061767578125,
-0.0268096923828125,
-0.05316162109375,
0.0182037353515625,
0.000037789344787597656,
-0.06292724609375,
0.053314208984375,
0.02777099609375,
0.01537322998046875,
0.0023250579833984375,
-0.032501220703125,
-0.06744384765625,
0.078369140625,
0.0036792755126953125,
-0.03948974609375,
0.004436492919921875,
0.0117950439453125,
0.047088623046875,
-0.0277099609375,
0.0206756591796875,
0.032684326171875,
0.0404052734375,
-0.0195770263671875,
-0.055877685546875,
0.007343292236328125,
-0.04071044921875,
-0.0020961761474609375,
0.0012216567993164062,
-0.05224609375,
0.078369140625,
-0.0246734619140625,
0.000024259090423583984,
0.0312347412109375,
0.051300048828125,
0.028564453125,
0.0191802978515625,
0.062286376953125,
0.05035400390625,
0.0782470703125,
0.0014190673828125,
0.08355712890625,
-0.01837158203125,
0.021148681640625,
0.105712890625,
-0.0184173583984375,
0.0755615234375,
0.0122833251953125,
-0.0209808349609375,
0.04559326171875,
0.02398681640625,
-0.0386962890625,
0.0211181640625,
0.02996826171875,
-0.013702392578125,
-0.0180206298828125,
-0.0155029296875,
-0.0243682861328125,
0.04010009765625,
0.01108551025390625,
-0.0261688232421875,
0.006038665771484375,
0.0088043212890625,
-0.007843017578125,
0.0152587890625,
0.01922607421875,
0.0528564453125,
-0.0083770751953125,
-0.04962158203125,
0.0712890625,
-0.01195526123046875,
0.0654296875,
-0.0252532958984375,
0.0022754669189453125,
-0.01245880126953125,
0.0209808349609375,
-0.00800323486328125,
-0.04296875,
0.037200927734375,
0.004154205322265625,
-0.0241546630859375,
-0.005191802978515625,
0.0179290771484375,
-0.0408935546875,
-0.04510498046875,
0.01483917236328125,
0.031463623046875,
0.018157958984375,
0.004238128662109375,
-0.059783935546875,
-0.0051422119140625,
0.01788330078125,
-0.011016845703125,
0.011016845703125,
0.029693603515625,
0.01308441162109375,
0.04022216796875,
0.06109619140625,
-0.00628662109375,
0.0029468536376953125,
0.033721923828125,
0.06182861328125,
-0.0391845703125,
-0.01849365234375,
-0.05096435546875,
0.035858154296875,
-0.037567138671875,
-0.054107666015625,
0.05340576171875,
0.061187744140625,
0.060333251953125,
-0.009002685546875,
0.07366943359375,
-0.02154541015625,
0.043243408203125,
-0.041839599609375,
0.049407958984375,
-0.035858154296875,
0.00870513916015625,
-0.01251220703125,
-0.043609619140625,
-0.0340576171875,
0.04010009765625,
-0.01369476318359375,
0.005153656005859375,
0.035491943359375,
0.06939697265625,
0.01068878173828125,
0.01654052734375,
0.0171051025390625,
0.0239105224609375,
0.0244140625,
0.047027587890625,
0.058441162109375,
-0.03643798828125,
0.053955078125,
-0.02569580078125,
-0.01132965087890625,
0.00414276123046875,
-0.046600341796875,
-0.0831298828125,
-0.051788330078125,
-0.01314544677734375,
-0.022857666015625,
0.0155792236328125,
0.0802001953125,
0.063232421875,
-0.07220458984375,
-0.0229644775390625,
-0.024810791015625,
-0.0018739700317382812,
-0.0253143310546875,
-0.0191192626953125,
0.030914306640625,
-0.0196380615234375,
-0.06903076171875,
0.05377197265625,
-0.022796630859375,
-0.01904296875,
-0.0229339599609375,
-0.011871337890625,
-0.0350341796875,
-0.01885986328125,
0.0243682861328125,
0.006103515625,
-0.058685302734375,
-0.0021076202392578125,
-0.01190948486328125,
0.004913330078125,
-0.002223968505859375,
0.047698974609375,
-0.039031982421875,
0.01496124267578125,
0.01152801513671875,
0.056304931640625,
0.037109375,
-0.0070953369140625,
0.058349609375,
-0.07391357421875,
0.0177459716796875,
0.0106658935546875,
0.0243682861328125,
0.03106689453125,
-0.02728271484375,
0.058685302734375,
0.019805908203125,
-0.04827880859375,
-0.0513916015625,
0.00937652587890625,
-0.05401611328125,
-0.026519775390625,
0.086181640625,
-0.02001953125,
-0.0222320556640625,
-0.020904541015625,
-0.006374359130859375,
0.01641845703125,
-0.033599853515625,
0.0753173828125,
0.02581787109375,
0.005641937255859375,
0.005413055419921875,
-0.03656005859375,
0.02838134765625,
0.0423583984375,
-0.06427001953125,
0.01458740234375,
0.02740478515625,
0.0015010833740234375,
0.0259857177734375,
0.031005859375,
0.0243682861328125,
0.005252838134765625,
0.00743865966796875,
0.0210113525390625,
-0.01558685302734375,
-0.0141754150390625,
-0.015838623046875,
0.00772857666015625,
0.00675201416015625,
-0.00806427001953125
]
] |
EleutherAI/gpt-neo-125m | 2023-07-09T15:54:09.000Z | [
"transformers",
"pytorch",
"jax",
"rust",
"safetensors",
"gpt_neo",
"text-generation",
"text generation",
"causal-lm",
"en",
"dataset:EleutherAI/pile",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/gpt-neo-125m | 134 | 97,540 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
tags:
- text generation
- pytorch
- causal-lm
license: mit
datasets:
- EleutherAI/pile
---
# GPT-Neo 125M
## Model Description
GPT-Neo 125M is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 125M represents the number of parameters of this particular pre-trained model.
## Training data
GPT-Neo 125M was trained on the Pile, a large scale curated dataset created by EleutherAI for the purpose of training this model.
## Training procedure
This model was trained on the Pile for 300 billion tokens over 572,300 steps. It was trained as an autoregressive language model, using cross-entropy loss.
## Intended Use and Limitations
Through this pretraining, the model learns an inner representation of the English language that can be used to extract features useful for downstream tasks. The model is nonetheless best at what it was pretrained for, which is generating text from a prompt.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='EleutherAI/gpt-neo-125M')
>>> generator("EleutherAI has", do_sample=True, min_length=20)
[{'generated_text': 'EleutherAI has made a commitment to create new software packages for each of its major clients and has'}]
```
### Limitations and Biases
GPT-Neo was trained as an autoregressive language model. This means that its core functionality is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work.
GPT-Neo was trained on the Pile, a dataset known to contain profanity and lewd or otherwise abrasive language. Depending on your use case, GPT-Neo may produce socially unacceptable text. See Sections 5 and 6 of the Pile paper for a more detailed analysis of the biases in the Pile.
As with all language models, it is hard to predict in advance how GPT-Neo will respond to particular prompts and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
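As a minimal illustration of the filtering step recommended above, the sketch below applies a keyword check to generated texts. The blocklist and helper names are hypothetical placeholders, not part of the original model card; real deployments should prefer a human reviewer or a trained safety classifier over a simple keyword list.

```python
# Hypothetical post-generation filter: drop generated texts that contain
# any term from a placeholder blocklist. This is only a sketch of the
# "filter the outputs" recommendation, not a robust moderation system.

BLOCKLIST = {"badword1", "badword2"}  # placeholder terms, not a real list


def is_acceptable(text: str) -> bool:
    """Return True if no blocklisted term appears in the text (case-insensitive)."""
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKLIST)


def filter_outputs(generations: list[str]) -> list[str]:
    """Keep only generations that pass the keyword check."""
    return [g for g in generations if is_acceptable(g)]
```

A keyword filter like this catches only exact substrings; it misses paraphrases and can over-block innocuous text, which is why human curation remains the recommendation.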
## Eval results
TBD
### Down-Stream Applications
TBD
### BibTeX entry and citation info
To cite this model, use
```bibtex
@software{gpt-neo,
author = {Black, Sid and
                  Gao, Leo and
Wang, Phil and
Leahy, Connor and
Biderman, Stella},
title = {{GPT-Neo: Large Scale Autoregressive Language
Modeling with Mesh-Tensorflow}},
month = mar,
year = 2021,
note = {{If you use this software, please cite it using
these metadata.}},
publisher = {Zenodo},
version = {1.0},
doi = {10.5281/zenodo.5297715},
url = {https://doi.org/10.5281/zenodo.5297715}
}
@article{gao2020pile,
title={The Pile: An 800GB Dataset of Diverse Text for Language Modeling},
author={Gao, Leo and Biderman, Stella and Black, Sid and Golding, Laurence and Hoppe, Travis and Foster, Charles and Phang, Jason and He, Horace and Thite, Anish and Nabeshima, Noa and others},
journal={arXiv preprint arXiv:2101.00027},
year={2020}
}
``` | 3,446 | [
[
-0.032379150390625,
-0.0673828125,
0.0260162353515625,
-0.0007863044738769531,
-0.01837158203125,
-0.0132293701171875,
-0.01271820068359375,
-0.031280517578125,
0.016510009765625,
0.0297698974609375,
-0.0208740234375,
-0.0245819091796875,
-0.054534912109375,
0.00948333740234375,
-0.050323486328125,
0.10540771484375,
0.0021076202392578125,
-0.0261383056640625,
0.016815185546875,
0.01248931884765625,
-0.0122222900390625,
-0.04339599609375,
-0.051971435546875,
-0.01519775390625,
0.0292205810546875,
-0.003612518310546875,
0.05328369140625,
0.0640869140625,
0.0115203857421875,
0.028045654296875,
-0.00809478759765625,
-0.00908660888671875,
-0.0428466796875,
-0.01088714599609375,
-0.006496429443359375,
-0.0089874267578125,
-0.045257568359375,
0.00778961181640625,
0.0523681640625,
0.03729248046875,
-0.0128021240234375,
0.00506591796875,
0.01226043701171875,
0.0237274169921875,
-0.0191650390625,
0.0113067626953125,
-0.04901123046875,
-0.0225067138671875,
-0.0175933837890625,
-0.0021877288818359375,
-0.0230865478515625,
-0.01094818115234375,
0.0060272216796875,
-0.038726806640625,
0.0294036865234375,
-0.002590179443359375,
0.08966064453125,
0.01244354248046875,
-0.024261474609375,
-0.01419830322265625,
-0.0572509765625,
0.053924560546875,
-0.060882568359375,
0.01493072509765625,
0.032318115234375,
0.0008778572082519531,
0.0121612548828125,
-0.05303955078125,
-0.039825439453125,
-0.01409912109375,
-0.01715087890625,
0.0087738037109375,
-0.01885986328125,
-0.0025348663330078125,
0.0252838134765625,
0.032745361328125,
-0.07061767578125,
-0.004070281982421875,
-0.03265380859375,
-0.019866943359375,
0.03912353515625,
0.01372528076171875,
0.02783203125,
-0.043304443359375,
-0.0312347412109375,
-0.0254974365234375,
-0.037841796875,
-0.0230865478515625,
0.048980712890625,
0.0234375,
-0.0169525146484375,
0.038330078125,
-0.00528717041015625,
0.043609619140625,
-0.009368896484375,
0.0017223358154296875,
0.03216552734375,
-0.038116455078125,
-0.0198974609375,
-0.01451873779296875,
0.10693359375,
0.00115966796875,
0.032928466796875,
-0.00574493408203125,
-0.0227813720703125,
0.0084381103515625,
0.0171966552734375,
-0.07855224609375,
-0.01464080810546875,
0.0047760009765625,
-0.01458740234375,
-0.0229644775390625,
0.00920867919921875,
-0.054473876953125,
-0.007720947265625,
-0.01287078857421875,
0.02294921875,
-0.0401611328125,
-0.040863037109375,
0.011383056640625,
-0.0029201507568359375,
0.0007872581481933594,
0.00839996337890625,
-0.06622314453125,
0.03790283203125,
0.050323486328125,
0.0672607421875,
0.0035858154296875,
-0.041961669921875,
-0.0166015625,
0.0021533966064453125,
-0.01140594482421875,
0.051361083984375,
-0.0270538330078125,
-0.0160064697265625,
-0.01253509521484375,
0.020263671875,
-0.027069091796875,
-0.0156402587890625,
0.0367431640625,
-0.0171356201171875,
0.05755615234375,
0.00826263427734375,
-0.02789306640625,
-0.0174102783203125,
0.00701141357421875,
-0.05035400390625,
0.0887451171875,
0.036163330078125,
-0.08502197265625,
0.01099395751953125,
-0.028350830078125,
-0.008209228515625,
0.0162353515625,
-0.0005402565002441406,
-0.03424072265625,
-0.01131439208984375,
0.007671356201171875,
0.0219573974609375,
-0.027130126953125,
0.041839599609375,
-0.007564544677734375,
-0.0249786376953125,
0.0037593841552734375,
-0.039093017578125,
0.06964111328125,
0.0216522216796875,
-0.037078857421875,
-0.00537872314453125,
-0.041534423828125,
-0.02545166015625,
0.023406982421875,
-0.01479339599609375,
-0.0308990478515625,
-0.0132904052734375,
0.021636962890625,
0.0308837890625,
0.019683837890625,
-0.033905029296875,
0.00904083251953125,
-0.03204345703125,
0.041229248046875,
0.05267333984375,
-0.0132598876953125,
0.03326416015625,
-0.03204345703125,
0.059844970703125,
-0.01837158203125,
0.002803802490234375,
-0.01422119140625,
-0.049041748046875,
-0.04742431640625,
-0.01434326171875,
0.0292205810546875,
0.0509033203125,
-0.051055908203125,
0.0296783447265625,
-0.033905029296875,
-0.0377197265625,
-0.038726806640625,
0.0020885467529296875,
0.0197906494140625,
0.03839111328125,
0.0225372314453125,
-0.01059722900390625,
-0.041046142578125,
-0.057891845703125,
-0.006526947021484375,
-0.039093017578125,
-0.00473785400390625,
0.031494140625,
0.042022705078125,
-0.03936767578125,
0.06207275390625,
-0.0257720947265625,
-0.006160736083984375,
-0.0157928466796875,
0.033111572265625,
0.0247039794921875,
0.0237274169921875,
0.04736328125,
-0.0369873046875,
-0.049774169921875,
0.005191802978515625,
-0.03668212890625,
-0.02880859375,
-0.002559661865234375,
-0.004852294921875,
0.02740478515625,
0.031890869140625,
-0.055023193359375,
0.007518768310546875,
0.058624267578125,
-0.03985595703125,
0.037994384765625,
-0.0136566162109375,
-0.006267547607421875,
-0.1031494140625,
0.0255279541015625,
0.006763458251953125,
-0.0290374755859375,
-0.04083251953125,
-0.016571044921875,
-0.007396697998046875,
-0.01268768310546875,
-0.0254669189453125,
0.05255126953125,
-0.0291900634765625,
0.00977325439453125,
-0.026824951171875,
0.0107574462890625,
0.0016460418701171875,
0.038482666015625,
0.004604339599609375,
0.047027587890625,
0.037994384765625,
-0.039093017578125,
0.01953125,
0.002666473388671875,
-0.009185791015625,
0.01265716552734375,
-0.06915283203125,
0.0120391845703125,
-0.01004791259765625,
0.01258087158203125,
-0.06982421875,
0.00113677978515625,
0.032562255859375,
-0.03485107421875,
0.0281829833984375,
-0.032684326171875,
-0.03839111328125,
-0.033050537109375,
-0.01473236083984375,
0.0290985107421875,
0.039947509765625,
-0.0087432861328125,
0.0435791015625,
0.032745361328125,
-0.039276123046875,
-0.056365966796875,
-0.0325927734375,
0.00021696090698242188,
-0.034942626953125,
-0.039642333984375,
0.0270538330078125,
-0.00395965576171875,
-0.0007305145263671875,
0.01149749755859375,
0.02764892578125,
0.0098419189453125,
-0.003948211669921875,
0.004497528076171875,
0.020721435546875,
-0.0028285980224609375,
-0.0018892288208007812,
-0.005817413330078125,
-0.0301055908203125,
0.01496124267578125,
-0.015594482421875,
0.07513427734375,
-0.027496337890625,
0.007518768310546875,
-0.0189666748046875,
0.0208740234375,
0.04742431640625,
-0.001556396484375,
0.058990478515625,
0.0704345703125,
-0.0277252197265625,
0.0028591156005859375,
-0.0276947021484375,
-0.01386260986328125,
-0.035491943359375,
0.055084228515625,
-0.0160980224609375,
-0.060150146484375,
0.043670654296875,
0.020782470703125,
0.0010442733764648438,
0.0634765625,
0.05194091796875,
0.01763916015625,
0.08221435546875,
0.05328369140625,
-0.023406982421875,
0.05010986328125,
-0.032012939453125,
0.0016937255859375,
-0.07171630859375,
0.0032711029052734375,
-0.050933837890625,
-0.0048675537109375,
-0.0738525390625,
-0.03240966796875,
0.004901885986328125,
-0.006175994873046875,
-0.03759765625,
0.049591064453125,
-0.049224853515625,
0.01031494140625,
0.029144287109375,
-0.0206146240234375,
0.0184326171875,
-0.0007977485656738281,
-0.0050811767578125,
0.00970458984375,
-0.05291748046875,
-0.046875,
0.07421875,
0.0487060546875,
0.06170654296875,
-0.00847625732421875,
0.04595947265625,
-0.005657196044921875,
0.0308837890625,
-0.052276611328125,
0.031341552734375,
-0.0196075439453125,
-0.059539794921875,
-0.0302581787109375,
-0.06329345703125,
-0.0924072265625,
0.024078369140625,
0.0023021697998046875,
-0.045623779296875,
0.011138916015625,
-0.002811431884765625,
-0.006954193115234375,
0.0255584716796875,
-0.05267333984375,
0.064453125,
-0.00972747802734375,
-0.00933837890625,
0.0000852346420288086,
-0.030059814453125,
0.0206298828125,
-0.00670623779296875,
0.0295562744140625,
-0.004596710205078125,
-0.00843048095703125,
0.06561279296875,
-0.030609130859375,
0.06732177734375,
-0.00506591796875,
-0.0172271728515625,
0.0235137939453125,
0.00569915771484375,
0.044891357421875,
0.006320953369140625,
0.0012311935424804688,
0.0190887451171875,
-0.0118865966796875,
-0.01499176025390625,
-0.0107269287109375,
0.052337646484375,
-0.08074951171875,
-0.0276336669921875,
-0.05126953125,
-0.045074462890625,
0.0250396728515625,
0.02972412109375,
0.040496826171875,
0.032379150390625,
-0.0238037109375,
0.00662994384765625,
0.03314208984375,
-0.0328369140625,
0.044647216796875,
0.03302001953125,
-0.0294036865234375,
-0.03729248046875,
0.0699462890625,
0.016845703125,
0.01268768310546875,
0.0233154296875,
0.0345458984375,
-0.0222015380859375,
-0.0280914306640625,
-0.037200927734375,
0.049041748046875,
-0.0281524658203125,
-0.0120849609375,
-0.0772705078125,
-0.0279083251953125,
-0.0364990234375,
0.00986480712890625,
-0.03668212890625,
-0.027923583984375,
-0.022491455078125,
-0.007175445556640625,
0.033447265625,
0.06304931640625,
0.0018701553344726562,
0.031646728515625,
-0.03814697265625,
0.0211029052734375,
0.0352783203125,
0.0246124267578125,
-0.01251983642578125,
-0.07086181640625,
-0.02276611328125,
0.0122222900390625,
-0.01434326171875,
-0.058349609375,
0.06353759765625,
0.006244659423828125,
0.037750244140625,
0.0202789306640625,
-0.0124359130859375,
0.0216827392578125,
-0.03851318359375,
0.04638671875,
-0.0026226043701171875,
-0.0621337890625,
0.032958984375,
-0.0572509765625,
0.0198974609375,
0.0188140869140625,
0.04156494140625,
-0.05706787109375,
-0.037261962890625,
-0.07470703125,
-0.0765380859375,
0.06329345703125,
0.0168609619140625,
0.026275634765625,
-0.014801025390625,
0.01416015625,
0.004413604736328125,
0.0045013427734375,
-0.081298828125,
-0.0166015625,
-0.0379638671875,
-0.00833892822265625,
-0.023895263671875,
-0.0166473388671875,
0.0004935264587402344,
-0.0153350830078125,
0.0628662109375,
-0.01447296142578125,
0.03948974609375,
-0.00496673583984375,
-0.014495849609375,
-0.0084228515625,
0.014495849609375,
0.03936767578125,
0.0374755859375,
-0.027862548828125,
0.0011529922485351562,
0.007427215576171875,
-0.0584716796875,
-0.0102996826171875,
0.032379150390625,
-0.017730712890625,
0.01080322265625,
0.0116119384765625,
0.0755615234375,
-0.01544952392578125,
-0.02032470703125,
0.039794921875,
-0.01447296142578125,
-0.027862548828125,
-0.0214996337890625,
-0.00444793701171875,
0.01229095458984375,
-0.01244354248046875,
0.0205841064453125,
-0.01026153564453125,
0.01030731201171875,
-0.039581298828125,
0.00920867919921875,
0.0249786376953125,
-0.02691650390625,
-0.038665771484375,
0.05035400390625,
0.0014200210571289062,
-0.010772705078125,
0.05609130859375,
-0.0208282470703125,
-0.044036865234375,
0.037994384765625,
0.046722412109375,
0.07037353515625,
-0.030609130859375,
0.03643798828125,
0.04730224609375,
0.045806884765625,
-0.004543304443359375,
0.007152557373046875,
0.0284423828125,
-0.064208984375,
-0.0484619140625,
-0.049224853515625,
-0.0187225341796875,
0.0328369140625,
-0.035919189453125,
0.0217132568359375,
-0.0208282470703125,
-0.01959228515625,
-0.0218658447265625,
0.01110076904296875,
-0.04248046875,
0.01593017578125,
0.022064208984375,
0.03656005859375,
-0.0946044921875,
0.06390380859375,
0.059600830078125,
-0.0311737060546875,
-0.05316162109375,
-0.0181732177734375,
-0.015655517578125,
-0.05999755859375,
0.0189971923828125,
0.00800323486328125,
0.00493621826171875,
0.0072021484375,
-0.037384033203125,
-0.06842041015625,
0.07781982421875,
0.036590576171875,
-0.0325927734375,
-0.00962066650390625,
0.0167083740234375,
0.05438232421875,
-0.0208740234375,
0.0550537109375,
0.03057861328125,
0.02740478515625,
-0.010528564453125,
-0.076171875,
0.01049041748046875,
-0.04486083984375,
0.00833892822265625,
0.0240631103515625,
-0.059844970703125,
0.0899658203125,
0.01450347900390625,
-0.0214080810546875,
-0.00885009765625,
0.025146484375,
0.019378662109375,
-0.01038360595703125,
0.041961669921875,
0.059967041015625,
0.055023193359375,
-0.0181121826171875,
0.09600830078125,
-0.0268096923828125,
0.04754638671875,
0.0665283203125,
0.01352691650390625,
0.039825439453125,
0.01293182373046875,
-0.01134490966796875,
0.060028076171875,
0.0400390625,
0.0027141571044921875,
0.027252197265625,
0.003284454345703125,
0.0007505416870117188,
0.0001996755599975586,
-0.01029205322265625,
-0.04449462890625,
0.022796630859375,
0.038543701171875,
-0.041046142578125,
-0.0076904296875,
-0.0185546875,
0.0322265625,
-0.0173797607421875,
-0.0016908645629882812,
0.05255126953125,
0.015533447265625,
-0.04107666015625,
0.058135986328125,
0.0070037841796875,
0.06134033203125,
-0.037933349609375,
0.0174713134765625,
-0.0177001953125,
0.005947113037109375,
-0.0012035369873046875,
-0.039398193359375,
0.0219268798828125,
0.005580902099609375,
-0.0130767822265625,
-0.035797119140625,
0.04876708984375,
-0.0352783203125,
-0.04248046875,
0.028350830078125,
0.0450439453125,
0.0233612060546875,
-0.01666259765625,
-0.07000732421875,
-0.005107879638671875,
-0.01148223876953125,
-0.03802490234375,
0.0281982421875,
0.058746337890625,
0.002994537353515625,
0.042205810546875,
0.042694091796875,
0.01236724853515625,
-0.006290435791015625,
0.0251922607421875,
0.07403564453125,
-0.049560546875,
-0.038330078125,
-0.061798095703125,
0.03643798828125,
0.006046295166015625,
-0.03472900390625,
0.051116943359375,
0.042724609375,
0.058074951171875,
0.0084381103515625,
0.061798095703125,
-0.0205535888671875,
0.045440673828125,
-0.01222991943359375,
0.04852294921875,
-0.02264404296875,
0.010955810546875,
-0.049346923828125,
-0.0946044921875,
0.0052032470703125,
0.05853271484375,
-0.02508544921875,
0.034210205078125,
0.06781005859375,
0.058441162109375,
-0.004131317138671875,
-0.0142669677734375,
0.00966644287109375,
0.0310516357421875,
0.0224456787109375,
0.053070068359375,
0.063720703125,
-0.058868408203125,
0.04730224609375,
-0.032928466796875,
-0.0214080810546875,
-0.004177093505859375,
-0.0704345703125,
-0.071044921875,
-0.037994384765625,
-0.0290374755859375,
-0.03631591796875,
-0.01026153564453125,
0.041717529296875,
0.044281005859375,
-0.055084228515625,
-0.03167724609375,
-0.0183258056640625,
-0.0003974437713623047,
-0.01230621337890625,
-0.022369384765625,
0.0301971435546875,
-0.01224517822265625,
-0.07855224609375,
0.007541656494140625,
-0.002105712890625,
0.0182952880859375,
-0.0225982666015625,
-0.01119232177734375,
-0.01904296875,
-0.00809478759765625,
0.031524658203125,
0.01318359375,
-0.0225830078125,
-0.00701141357421875,
0.016021728515625,
-0.024871826171875,
-0.0030727386474609375,
0.042694091796875,
-0.058837890625,
0.01305389404296875,
0.06494140625,
0.02001953125,
0.052032470703125,
0.01041412353515625,
0.056549072265625,
-0.04559326171875,
0.01152801513671875,
0.0180816650390625,
0.03729248046875,
0.0357666015625,
-0.0300140380859375,
0.037353515625,
0.031524658203125,
-0.054901123046875,
-0.04852294921875,
-0.00536346435546875,
-0.06402587890625,
-0.0166778564453125,
0.1104736328125,
0.006317138671875,
-0.01242828369140625,
-0.018096923828125,
-0.01403045654296875,
0.0270538330078125,
-0.03350830078125,
0.05120849609375,
0.05401611328125,
0.01806640625,
-0.0313720703125,
-0.0517578125,
0.033721923828125,
0.024169921875,
-0.0538330078125,
0.01143646240234375,
0.019744873046875,
0.026458740234375,
0.015960693359375,
0.05120849609375,
-0.0278167724609375,
-0.0018663406372070312,
0.000377655029296875,
0.0164642333984375,
0.00182342529296875,
-0.0249786376953125,
-0.0274200439453125,
-0.0024566650390625,
-0.012176513671875,
0.01812744140625
]
] |
microsoft/infoxlm-large | 2021-08-04T11:43:05.000Z | [
"transformers",
"pytorch",
"xlm-roberta",
"fill-mask",
"arxiv:2007.07834",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | microsoft | null | null | microsoft/infoxlm-large | 8 | 97,460 | transformers | 2022-03-02T23:29:05 | # InfoXLM
**InfoXLM**: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training (NAACL 2021; [paper](https://arxiv.org/pdf/2007.07834.pdf), [repo](https://github.com/microsoft/unilm/tree/master/infoxlm), [model](https://huggingface.co/microsoft/infoxlm-base)).
**MD5**
```
05b95b7d977450b364f8ea3269391953 config.json
c19438359fed6d36b0c1bbb107929579 pytorch_model.bin
bf25eb5120ad92ef5c7d8596b5dc4046 sentencepiece.bpe.model
eedbd60a7268b9fc45981b849664f747 tokenizer.json
```
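To confirm that downloaded files match the digests above, you can hash them locally. The sketch below uses Python's standard `hashlib`; the directory argument is whatever path you downloaded the files to (an assumption, not something the card specifies).

```python
import hashlib
import os

def md5sum(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 hex digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Expected digests copied from the model card (file name -> MD5)
EXPECTED = {
    "config.json": "05b95b7d977450b364f8ea3269391953",
    "pytorch_model.bin": "c19438359fed6d36b0c1bbb107929579",
    "sentencepiece.bpe.model": "bf25eb5120ad92ef5c7d8596b5dc4046",
    "tokenizer.json": "eedbd60a7268b9fc45981b849664f747",
}

def verify(directory: str = ".") -> dict[str, bool]:
    """Return, per file, whether it exists and its MD5 matches the card."""
    return {
        name: os.path.exists(os.path.join(directory, name))
        and md5sum(os.path.join(directory, name)) == digest
        for name, digest in EXPECTED.items()
    }
```

Chunked reading keeps memory flat even for the multi-gigabyte `pytorch_model.bin`.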
**BibTeX**
```
@inproceedings{chi-etal-2021-infoxlm,
title = "{I}nfo{XLM}: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training",
author={Chi, Zewen and Dong, Li and Wei, Furu and Yang, Nan and Singhal, Saksham and Wang, Wenhui and Song, Xia and Mao, Xian-Ling and Huang, Heyan and Zhou, Ming},
booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.naacl-main.280",
doi = "10.18653/v1/2021.naacl-main.280",
    pages = "3576--3588",
}
``` | 1,246 | [
[
-0.0316162109375,
-0.051239013671875,
0.01116943359375,
0.034454345703125,
-0.01311492919921875,
0.016845703125,
-0.01203155517578125,
-0.034423828125,
0.01097869873046875,
0.0232696533203125,
-0.0411376953125,
-0.048583984375,
-0.0313720703125,
-0.0020084381103515625,
-0.0215606689453125,
0.0745849609375,
-0.01219940185546875,
0.0141448974609375,
-0.002666473388671875,
-0.007354736328125,
0.01125335693359375,
-0.07073974609375,
-0.03375244140625,
0.00437164306640625,
0.029083251953125,
0.0103912353515625,
0.04815673828125,
0.04412841796875,
0.01214599609375,
0.0146026611328125,
-0.0015697479248046875,
0.0057220458984375,
-0.031341552734375,
-0.02313232421875,
0.0035877227783203125,
-0.0277099609375,
-0.056304931640625,
-0.0009927749633789062,
0.0711669921875,
0.06756591796875,
-0.0070953369140625,
0.007732391357421875,
0.01309967041015625,
0.033477783203125,
-0.035736083984375,
0.01116943359375,
-0.044158935546875,
0.002391815185546875,
-0.03924560546875,
-0.01421356201171875,
-0.03375244140625,
-0.0196380615234375,
0.00878143310546875,
-0.05938720703125,
0.007381439208984375,
0.03125,
0.0985107421875,
-0.0123443603515625,
-0.028045654296875,
0.0011110305786132812,
-0.0262298583984375,
0.06298828125,
-0.07635498046875,
0.06134033203125,
0.036163330078125,
-0.002628326416015625,
-0.009368896484375,
-0.06060791015625,
-0.044219970703125,
-0.006885528564453125,
-0.01102447509765625,
-0.0059051513671875,
-0.031341552734375,
0.0035190582275390625,
0.0281982421875,
0.000024020671844482422,
-0.07135009765625,
-0.0097503662109375,
-0.020294189453125,
-0.0292205810546875,
0.038116455078125,
0.0102691650390625,
0.00794219970703125,
0.0216064453125,
-0.05181884765625,
-0.007236480712890625,
-0.040069580078125,
0.0072784423828125,
0.0171051025390625,
0.0275421142578125,
-0.04583740234375,
0.01070404052734375,
-0.0074005126953125,
0.058380126953125,
-0.0007853507995605469,
-0.005565643310546875,
0.06512451171875,
-0.059112548828125,
-0.01465606689453125,
0.0063018798828125,
0.07000732421875,
0.0014123916625976562,
-0.0020313262939453125,
-0.010711669921875,
-0.00589752197265625,
-0.0279693603515625,
-0.005588531494140625,
-0.056640625,
0.0014657974243164062,
0.0129547119140625,
-0.029052734375,
-0.0016431808471679688,
0.018585205078125,
-0.051513671875,
-0.00479888916015625,
-0.0263519287109375,
0.01393890380859375,
-0.0276031494140625,
-0.042816162109375,
0.00846099853515625,
0.0300750732421875,
0.022918701171875,
-0.0007457733154296875,
-0.030120849609375,
0.0263214111328125,
0.0193328857421875,
0.046539306640625,
-0.0190887451171875,
-0.054718017578125,
-0.041473388671875,
-0.0084686279296875,
-0.0172576904296875,
0.0304718017578125,
-0.0276947021484375,
-0.006099700927734375,
0.005970001220703125,
0.01007843017578125,
-0.0240325927734375,
-0.0202484130859375,
0.034637451171875,
-0.050079345703125,
0.03424072265625,
-0.016082763671875,
-0.0280914306640625,
-0.0255126953125,
0.021087646484375,
-0.03497314453125,
0.08203125,
0.02606201171875,
-0.0733642578125,
0.01082611083984375,
-0.0604248046875,
-0.028228759765625,
-0.0011396408081054688,
-0.00323486328125,
-0.0276031494140625,
-0.0284576416015625,
-0.0023822784423828125,
0.01142120361328125,
-0.041229248046875,
0.029632568359375,
-0.026031494140625,
-0.003749847412109375,
0.0023651123046875,
-0.02032470703125,
0.08642578125,
0.0297393798828125,
-0.03973388671875,
0.0175018310546875,
-0.076904296875,
0.0248870849609375,
0.0203857421875,
-0.0316162109375,
-0.026092529296875,
-0.0287933349609375,
0.026947021484375,
0.042938232421875,
0.042144775390625,
-0.0419921875,
0.011260986328125,
-0.0273895263671875,
-0.0093994140625,
0.039154052734375,
-0.02886962890625,
0.02655029296875,
-0.01035308837890625,
0.049560546875,
0.0293121337890625,
0.029449462890625,
0.00821685791015625,
-0.0207061767578125,
-0.054473876953125,
-0.005588531494140625,
0.0291595458984375,
0.05078125,
-0.06292724609375,
0.033599853515625,
-0.0227813720703125,
-0.0396728515625,
-0.033416748046875,
0.00439453125,
0.057220458984375,
0.047119140625,
0.0396728515625,
-0.01050567626953125,
-0.059478759765625,
-0.054168701171875,
-0.0338134765625,
-0.01403045654296875,
0.031829833984375,
0.0086517333984375,
0.01396942138671875,
-0.040496826171875,
0.07086181640625,
-0.0175933837890625,
-0.010528564453125,
-0.02471923828125,
0.01776123046875,
0.02032470703125,
0.03082275390625,
0.059478759765625,
-0.058319091796875,
-0.050140380859375,
-0.0010290145874023438,
-0.04498291015625,
-0.0255889892578125,
-0.000324249267578125,
-0.0214996337890625,
0.045257568359375,
0.031402587890625,
-0.0271453857421875,
0.049346923828125,
0.04595947265625,
-0.065673828125,
0.05145263671875,
-0.0081634521484375,
-0.00557708740234375,
-0.06976318359375,
0.0194244384765625,
-0.0248565673828125,
-0.01058197021484375,
-0.054779052734375,
-0.0200958251953125,
0.0111236572265625,
0.0099639892578125,
-0.04248046875,
0.0697021484375,
-0.05816650390625,
-0.0014743804931640625,
-0.009063720703125,
0.00884246826171875,
0.0036945343017578125,
0.054229736328125,
0.01200103759765625,
0.030487060546875,
0.06640625,
-0.060638427734375,
-0.005039215087890625,
0.0185546875,
-0.01088714599609375,
0.02081298828125,
-0.031341552734375,
-0.002079010009765625,
0.0023345947265625,
0.006313323974609375,
-0.04656982421875,
0.01490020751953125,
0.0273284912109375,
-0.0308837890625,
0.02996826171875,
0.01444244384765625,
-0.0310821533203125,
-0.000530242919921875,
-0.007740020751953125,
0.04791259765625,
0.01041412353515625,
-0.0328369140625,
0.0574951171875,
0.030731201171875,
-0.0158538818359375,
-0.061798095703125,
-0.055450439453125,
-0.00030875205993652344,
0.0102081298828125,
-0.05316162109375,
0.0272674560546875,
-0.036407470703125,
-0.012054443359375,
0.0098419189453125,
0.01204681396484375,
-0.01088714599609375,
-0.0002453327178955078,
0.0011568069458007812,
0.0214385986328125,
-0.032012939453125,
0.0114288330078125,
-0.00943756103515625,
-0.0092620849609375,
-0.00698089599609375,
-0.023834228515625,
0.055877685546875,
-0.0238494873046875,
-0.03509521484375,
-0.0120697021484375,
0.0313720703125,
0.03106689453125,
-0.034576416015625,
0.0606689453125,
0.06121826171875,
-0.01959228515625,
0.0010671615600585938,
-0.0299224853515625,
-0.0165863037109375,
-0.0277862548828125,
0.05096435546875,
-0.0189666748046875,
-0.05352783203125,
0.03369140625,
0.02130126953125,
0.0174102783203125,
0.0302581787109375,
0.0238494873046875,
0.0284576416015625,
0.076904296875,
0.047760009765625,
-0.00972747802734375,
0.0396728515625,
-0.0185546875,
0.014007568359375,
-0.062164306640625,
-0.01116943359375,
-0.0321044921875,
0.0011034011840820312,
-0.03021240234375,
-0.019866943359375,
0.00528717041015625,
0.00225830078125,
-0.035247802734375,
0.022918701171875,
-0.00585174560546875,
-0.00414276123046875,
0.04473876953125,
-0.033355712890625,
0.01131439208984375,
0.010284423828125,
-0.05377197265625,
0.00710296630859375,
-0.06951904296875,
-0.0200958251953125,
0.07025146484375,
0.020843505859375,
0.051025390625,
0.00794219970703125,
0.040252685546875,
-0.0040130615234375,
0.005802154541015625,
-0.0196380615234375,
0.04595947265625,
0.00461578369140625,
-0.052337646484375,
-0.0094451904296875,
-0.050506591796875,
-0.088623046875,
0.0180511474609375,
-0.0134124755859375,
-0.055145263671875,
0.034332275390625,
0.01143646240234375,
-0.0047607421875,
0.0310821533203125,
-0.04339599609375,
0.069580078125,
-0.04736328125,
-0.0249176025390625,
0.01326751708984375,
-0.06280517578125,
0.01078033447265625,
-0.01526641845703125,
0.046539306640625,
0.0103759765625,
-0.00838470458984375,
0.0665283203125,
-0.036773681640625,
0.0531005859375,
-0.0245208740234375,
-0.005889892578125,
0.0025234222412109375,
0.00605010986328125,
0.040191650390625,
-0.01517486572265625,
-0.011444091796875,
0.045196533203125,
0.0120849609375,
-0.034088134765625,
-0.027679443359375,
0.036956787109375,
-0.05120849609375,
-0.029449462890625,
-0.056488037109375,
-0.052947998046875,
-0.029388427734375,
0.038055419921875,
0.03729248046875,
0.04107666015625,
0.00527191162109375,
0.0206756591796875,
0.0284576416015625,
-0.01812744140625,
0.0263214111328125,
0.0693359375,
-0.049163818359375,
-0.0291595458984375,
0.061492919921875,
0.0219268798828125,
0.030487060546875,
0.036376953125,
-0.0035610198974609375,
-0.02655029296875,
-0.044525146484375,
-0.006809234619140625,
0.032928466796875,
-0.0511474609375,
0.0020046234130859375,
-0.051300048828125,
-0.031707763671875,
-0.032806396484375,
0.02215576171875,
-0.027679443359375,
-0.0155487060546875,
-0.0247039794921875,
-0.00492095947265625,
0.01409149169921875,
0.02984619140625,
-0.0164794921875,
0.011383056640625,
-0.065673828125,
0.0016164779663085938,
-0.01629638671875,
0.026702880859375,
-0.0028972625732421875,
-0.0521240234375,
-0.040313720703125,
0.030487060546875,
-0.0029582977294921875,
-0.0384521484375,
0.037200927734375,
0.0167999267578125,
0.07257080078125,
0.02301025390625,
0.028106689453125,
0.04351806640625,
-0.00926971435546875,
0.0528564453125,
0.00986480712890625,
-0.06488037109375,
0.0191650390625,
0.00566864013671875,
0.038665771484375,
0.0290069580078125,
0.03240966796875,
-0.038726806640625,
-0.0256195068359375,
-0.05975341796875,
-0.0780029296875,
0.079833984375,
0.016082763671875,
-0.01335906982421875,
0.0034236907958984375,
-0.007701873779296875,
-0.005519866943359375,
0.01445770263671875,
-0.07574462890625,
-0.056854248046875,
-0.00860595703125,
-0.024627685546875,
0.00589752197265625,
-0.027587890625,
-0.00402069091796875,
-0.019744873046875,
0.08587646484375,
-0.00665283203125,
0.018829345703125,
0.02154541015625,
-0.0243072509765625,
0.00039887428283691406,
0.01200103759765625,
0.0582275390625,
0.059600830078125,
-0.01279449462890625,
0.0175628662109375,
0.005657196044921875,
-0.05560302734375,
-0.00615692138671875,
0.02978515625,
-0.0187530517578125,
0.017364501953125,
0.0384521484375,
0.0904541015625,
0.01544189453125,
-0.0396728515625,
0.03997802734375,
-0.0213775634765625,
-0.03240966796875,
-0.0171356201171875,
-0.0191650390625,
-0.0069122314453125,
0.0089874267578125,
0.0287017822265625,
0.0011320114135742188,
-0.01047515869140625,
-0.0231170654296875,
0.0123138427734375,
0.0309295654296875,
-0.051361083984375,
-0.031341552734375,
0.0341796875,
-0.0108795166015625,
-0.007843017578125,
0.0201263427734375,
-0.02655029296875,
-0.054901123046875,
0.018768310546875,
0.049957275390625,
0.060821533203125,
-0.019775390625,
-0.00598907470703125,
0.035186767578125,
0.028045654296875,
0.0289459228515625,
0.051544189453125,
0.0019969940185546875,
-0.056243896484375,
-0.0037555694580078125,
-0.0521240234375,
0.0006051063537597656,
0.00908660888671875,
-0.0364990234375,
0.021331787109375,
-0.0175018310546875,
-0.0081634521484375,
-0.01502227783203125,
0.00421142578125,
-0.048431396484375,
0.01554107666015625,
0.008331298828125,
0.0821533203125,
-0.026397705078125,
0.08270263671875,
0.04840087890625,
-0.033233642578125,
-0.08099365234375,
0.0010089874267578125,
0.005207061767578125,
-0.04229736328125,
0.062347412109375,
-0.0196075439453125,
-0.0172271728515625,
0.0091552734375,
-0.039459228515625,
-0.09307861328125,
0.073974609375,
0.046417236328125,
-0.045318603515625,
0.004596710205078125,
0.0187835693359375,
0.0167083740234375,
-0.034149169921875,
0.0244293212890625,
0.011627197265625,
0.052886962890625,
-0.0012493133544921875,
-0.0802001953125,
0.010345458984375,
-0.05242919921875,
-0.01238250732421875,
-0.00007444620132446289,
-0.0374755859375,
0.09759521484375,
-0.0136566162109375,
-0.034454345703125,
-0.0090484619140625,
0.03973388671875,
0.02288818359375,
0.0175323486328125,
0.043121337890625,
0.03466796875,
0.06524658203125,
-0.0049591064453125,
0.07501220703125,
-0.0430908203125,
0.0231475830078125,
0.1004638671875,
-0.0170440673828125,
0.048431396484375,
0.0361328125,
-0.0173797607421875,
0.0517578125,
0.040679931640625,
0.019989013671875,
0.0297393798828125,
-0.0013818740844726562,
0.0035572052001953125,
-0.01200103759765625,
0.0030460357666015625,
-0.06427001953125,
0.026702880859375,
0.01398468017578125,
-0.03759765625,
-0.0222930908203125,
-0.0002474784851074219,
0.0244293212890625,
-0.019500732421875,
-0.00830841064453125,
0.04498291015625,
0.0005202293395996094,
-0.035125732421875,
0.0816650390625,
-0.008056640625,
0.03936767578125,
-0.08160400390625,
-0.0000020265579223632812,
-0.01263427734375,
0.0157012939453125,
-0.0203399658203125,
-0.031494140625,
0.004024505615234375,
-0.00701904296875,
-0.012908935546875,
0.00354766845703125,
0.03436279296875,
-0.053436279296875,
-0.0308990478515625,
0.05999755859375,
0.024932861328125,
0.01311492919921875,
-0.0017251968383789062,
-0.06982421875,
0.005245208740234375,
0.01068115234375,
-0.039459228515625,
0.03131103515625,
0.0295257568359375,
-0.01190948486328125,
0.040496826171875,
0.0611572265625,
0.013885498046875,
0.03106689453125,
0.007740020751953125,
0.061920166015625,
-0.04852294921875,
-0.052581787109375,
-0.055572509765625,
0.043121337890625,
-0.01094818115234375,
-0.041717529296875,
0.06866455078125,
0.06475830078125,
0.07843017578125,
-0.00560760498046875,
0.07403564453125,
-0.010223388671875,
0.048583984375,
-0.0418701171875,
0.064453125,
-0.050811767578125,
0.0236663818359375,
-0.0162811279296875,
-0.072509765625,
-0.0281982421875,
0.035369873046875,
-0.01934814453125,
0.0204315185546875,
0.0667724609375,
0.07244873046875,
-0.00494384765625,
-0.00988006591796875,
0.0284576416015625,
0.0421142578125,
0.010467529296875,
0.038970947265625,
0.037353515625,
-0.032867431640625,
0.05328369140625,
-0.024139404296875,
-0.0226593017578125,
-0.024688720703125,
-0.06353759765625,
-0.050079345703125,
-0.0660400390625,
-0.0262908935546875,
-0.0263519287109375,
-0.00913238525390625,
0.084228515625,
0.06591796875,
-0.06536865234375,
-0.037078857421875,
-0.0003917217254638672,
0.0179290771484375,
-0.017669677734375,
-0.015625,
0.038360595703125,
-0.0254364013671875,
-0.033294677734375,
0.0142059326171875,
0.009979248046875,
0.01299285888671875,
-0.01352691650390625,
-0.040313720703125,
-0.05682373046875,
0.005664825439453125,
0.028076171875,
0.03515625,
-0.04833984375,
0.006359100341796875,
0.01166534423828125,
-0.028350830078125,
0.0169677734375,
0.032440185546875,
-0.03265380859375,
0.032928466796875,
0.03607177734375,
0.034912109375,
0.0295257568359375,
-0.0150909423828125,
0.03717041015625,
-0.060760498046875,
0.025543212890625,
0.013641357421875,
0.041748046875,
0.0183563232421875,
-0.0201263427734375,
0.03619384765625,
0.02392578125,
-0.014801025390625,
-0.06982421875,
0.005474090576171875,
-0.0780029296875,
-0.007450103759765625,
0.09515380859375,
-0.0338134765625,
-0.0288238525390625,
-0.031097412109375,
-0.0296478271484375,
0.0176544189453125,
-0.01824951171875,
0.0092010498046875,
0.041656494140625,
0.01519012451171875,
-0.01198577880859375,
-0.0109100341796875,
0.044677734375,
0.01418304443359375,
-0.076171875,
0.0084075927734375,
0.006641387939453125,
0.00952911376953125,
0.0299224853515625,
0.040557861328125,
-0.01528167724609375,
0.015350341796875,
-0.005657196044921875,
0.043792724609375,
-0.0230255126953125,
-0.0089111328125,
-0.0111541748046875,
-0.0021495819091796875,
-0.00359344482421875,
-0.002414703369140625
]
] |
dbmdz/bert-base-turkish-cased | 2021-05-19T15:14:46.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"tr",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | null | dbmdz | null | null | dbmdz/bert-base-turkish-cased | 44 | 97,392 | transformers | 2022-03-02T23:29:05 | ---
language: tr
license: mit
---
# 🤗 + 📚 dbmdz Turkish BERT model
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a cased model for Turkish 🎉
# 🇹🇷 BERTurk
BERTurk is a community-driven cased BERT model for Turkish.
Some of the datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, as was the decision for the model name: BERTurk.
## Stats
The current version of the model is trained on a filtered and sentence
segmented version of the Turkish [OSCAR corpus](https://traces1.inria.fr/oscar/),
a recent Wikipedia dump, various [OPUS corpora](http://opus.nlpl.eu/) and a
special corpus provided by [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/).
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train a cased model
on a TPU v3-8 for 2M steps.
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
| Model | Downloads
| --------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/bert-base-turkish-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-turkish-cased/vocab.txt)
## Usage
With Transformers >= 2.3, our BERTurk cased model can be loaded as follows:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-cased")
model = AutoModel.from_pretrained("dbmdz/bert-base-turkish-cased")
```
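The model call above returns token-level hidden states; a common way to turn them into a single sentence vector is masked mean pooling over the real (non-padding) tokens. Here is a minimal sketch of just that pooling step in NumPy, using dummy tensors in BERT-base's shape (hidden size 768). The helper name `mean_pool` is our own illustration, not part of the Transformers API:

```python
import numpy as np

def mean_pool(hidden_states: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions.

    hidden_states:  (batch, seq_len, hidden) - e.g. the model's last_hidden_state
    attention_mask: (batch, seq_len)         - 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, :, None].astype(hidden_states.dtype)  # broadcast over hidden dim
    summed = (hidden_states * mask).sum(axis=1)
    counts = mask.sum(axis=1).clip(min=1e-9)  # avoid division by zero for all-padding rows
    return summed / counts

# Dummy example: 1 sentence, 5 token slots (last 2 are padding), hidden size 768
hidden = np.random.rand(1, 5, 768)
mask = np.array([[1, 1, 1, 0, 0]])
sentence_vec = mean_pool(hidden, mask)
print(sentence_vec.shape)  # (1, 768)
```

In practice you would feed `model(**tokenizer(text, return_tensors="pt")).last_hidden_state` and the tokenizer's `attention_mask` into the same pooling logic.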
## Results
For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
with additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
| 2,858 | [
[
-0.040771484375,
-0.052734375,
0.01275634765625,
0.0212554931640625,
-0.0303802490234375,
-0.0236358642578125,
-0.0189056396484375,
-0.0250701904296875,
0.0174102783203125,
0.0301666259765625,
-0.047576904296875,
-0.0501708984375,
-0.049163818359375,
-0.0086212158203125,
-0.0253753662109375,
0.09344482421875,
-0.0117950439453125,
0.0252227783203125,
0.0058746337890625,
-0.0100555419921875,
-0.00592803955078125,
-0.055145263671875,
-0.041900634765625,
-0.0279388427734375,
0.0280303955078125,
0.005382537841796875,
0.0243377685546875,
0.01288604736328125,
0.0411376953125,
0.0223541259765625,
-0.0131378173828125,
-0.0149078369140625,
-0.0024566650390625,
-0.0012340545654296875,
0.015411376953125,
-0.0111236572265625,
-0.044769287109375,
-0.0083465576171875,
0.047271728515625,
0.040985107421875,
-0.0322265625,
0.015899658203125,
-0.004791259765625,
0.05206298828125,
-0.01776123046875,
0.0162353515625,
-0.027557373046875,
0.007671356201171875,
-0.0148468017578125,
0.019561767578125,
-0.0165252685546875,
-0.01131439208984375,
0.03521728515625,
-0.0222015380859375,
0.04034423828125,
-0.0206298828125,
0.09356689453125,
0.0147552490234375,
-0.025360107421875,
-0.0217132568359375,
-0.031951904296875,
0.057037353515625,
-0.0721435546875,
0.0357666015625,
0.02667236328125,
0.0325927734375,
-0.0306243896484375,
-0.060943603515625,
-0.039642333984375,
-0.0112457275390625,
-0.0018491744995117188,
0.00525665283203125,
-0.0159759521484375,
0.006866455078125,
0.0160369873046875,
0.033905029296875,
-0.032135009765625,
-0.01910400390625,
-0.0445556640625,
-0.017120361328125,
0.044952392578125,
-0.004375457763671875,
0.01219940185546875,
-0.0224151611328125,
-0.0233154296875,
-0.0323486328125,
-0.039764404296875,
0.01515960693359375,
0.03564453125,
0.034332275390625,
-0.0283660888671875,
0.050933837890625,
-0.00884246826171875,
0.05474853515625,
0.02044677734375,
-0.0008859634399414062,
0.029296875,
-0.00958251953125,
-0.02471923828125,
0.00789642333984375,
0.060882568359375,
0.0036182403564453125,
0.0005340576171875,
-0.00716400146484375,
-0.0206756591796875,
-0.024871826171875,
0.03656005859375,
-0.07196044921875,
-0.0189666748046875,
0.03173828125,
-0.0504150390625,
-0.0219573974609375,
-0.00444793701171875,
-0.0430908203125,
-0.010467529296875,
-0.01174163818359375,
0.0499267578125,
-0.04547119140625,
-0.03582763671875,
0.0206756591796875,
0.0038738250732421875,
0.04522705078125,
0.01331329345703125,
-0.0775146484375,
0.02655029296875,
0.043426513671875,
0.057220458984375,
0.01338958740234375,
-0.007259368896484375,
-0.00617218017578125,
-0.02471923828125,
-0.0257568359375,
0.052490234375,
0.00015044212341308594,
-0.01293182373046875,
0.0146026611328125,
0.020355224609375,
-0.02105712890625,
-0.026885986328125,
0.05499267578125,
-0.03326416015625,
0.034027099609375,
-0.0325927734375,
-0.049652099609375,
-0.0313720703125,
0.00617218017578125,
-0.04840087890625,
0.086181640625,
0.02484130859375,
-0.0667724609375,
0.038238525390625,
-0.049896240234375,
-0.03765869140625,
-0.00383758544921875,
0.0007367134094238281,
-0.0697021484375,
0.00955963134765625,
0.016845703125,
0.0533447265625,
0.00305938720703125,
0.01491546630859375,
-0.03466796875,
-0.014862060546875,
0.0034027099609375,
0.0151214599609375,
0.08746337890625,
0.027191162109375,
-0.030670166015625,
0.0107879638671875,
-0.042633056640625,
-0.015625,
0.0142059326171875,
-0.035675048828125,
0.003887176513671875,
-0.0011272430419921875,
0.0205535888671875,
0.0221099853515625,
0.018310546875,
-0.049835205078125,
0.0232391357421875,
-0.0258941650390625,
0.03350830078125,
0.048614501953125,
-0.0225982666015625,
0.01468658447265625,
-0.03955078125,
0.0234222412109375,
0.007266998291015625,
0.0085906982421875,
0.0115509033203125,
-0.036590576171875,
-0.08441162109375,
-0.04443359375,
0.039398193359375,
0.0293731689453125,
-0.05023193359375,
0.04644775390625,
-0.002979278564453125,
-0.045318603515625,
-0.054962158203125,
-0.0007910728454589844,
0.00439453125,
0.042999267578125,
0.025970458984375,
-0.024200439453125,
-0.055908203125,
-0.07489013671875,
0.00441741943359375,
-0.0205078125,
-0.0159912109375,
0.0278167724609375,
0.04833984375,
-0.00553131103515625,
0.0574951171875,
0.0014390945434570312,
-0.04400634765625,
-0.0311737060546875,
0.0119476318359375,
0.044189453125,
0.042938232421875,
0.0621337890625,
-0.03167724609375,
-0.031585693359375,
-0.0177764892578125,
-0.05023193359375,
0.00847625732421875,
0.002895355224609375,
-0.01410675048828125,
0.055633544921875,
0.025390625,
-0.05999755859375,
0.0262451171875,
0.035491943359375,
-0.048095703125,
0.0543212890625,
-0.01904296875,
0.002948760986328125,
-0.09375,
0.0198516845703125,
0.01053619384765625,
-0.0154876708984375,
-0.04034423828125,
0.006336212158203125,
-0.0021457672119140625,
0.0103302001953125,
-0.043975830078125,
0.040771484375,
-0.018463134765625,
-0.0095367431640625,
0.0015268325805664062,
-0.032470703125,
-0.01020050048828125,
0.04656982421875,
0.0218658447265625,
0.048797607421875,
0.046051025390625,
-0.0394287109375,
0.0263824462890625,
0.03271484375,
-0.057586669921875,
0.02325439453125,
-0.061553955078125,
-0.0001399517059326172,
0.005580902099609375,
0.02166748046875,
-0.050994873046875,
-0.0088348388671875,
0.033203125,
-0.0379638671875,
0.041900634765625,
-0.041473388671875,
-0.0562744140625,
-0.032012939453125,
-0.01120758056640625,
0.000576019287109375,
0.051483154296875,
-0.04949951171875,
0.058135986328125,
0.02166748046875,
-0.0186004638671875,
-0.045684814453125,
-0.05133056640625,
-0.0035572052001953125,
-0.033660888671875,
-0.06329345703125,
0.037353515625,
-0.01143646240234375,
-0.003143310546875,
0.0007028579711914062,
-0.0011444091796875,
-0.01824951171875,
-0.0027008056640625,
0.008544921875,
0.0325927734375,
-0.020751953125,
0.005619049072265625,
0.0033283233642578125,
0.01345062255859375,
0.006961822509765625,
-0.0145416259765625,
0.047149658203125,
-0.041595458984375,
-0.00559234619140625,
-0.03131103515625,
0.01708984375,
0.035003662109375,
-0.0007910728454589844,
0.08441162109375,
0.07073974609375,
-0.032379150390625,
0.0125274658203125,
-0.06011962890625,
-0.0193634033203125,
-0.033966064453125,
0.0160980224609375,
-0.0377197265625,
-0.068115234375,
0.05340576171875,
0.0195770263671875,
0.0255889892578125,
0.051788330078125,
0.06243896484375,
-0.0299835205078125,
0.0712890625,
0.06719970703125,
-0.007495880126953125,
0.047332763671875,
-0.033355712890625,
0.002040863037109375,
-0.058868408203125,
-0.0186004638671875,
-0.037567138671875,
-0.0036792755126953125,
-0.04736328125,
-0.01004791259765625,
0.0169219970703125,
0.006694793701171875,
-0.025787353515625,
0.034210205078125,
-0.034881591796875,
-0.005062103271484375,
0.046783447265625,
0.0186004638671875,
-0.01605224609375,
0.030670166015625,
-0.0309906005859375,
0.006439208984375,
-0.0614013671875,
-0.0311737060546875,
0.09869384765625,
0.03607177734375,
0.0284576416015625,
0.01294708251953125,
0.06390380859375,
0.0175323486328125,
0.0164794921875,
-0.05157470703125,
0.0248260498046875,
-0.00876617431640625,
-0.07122802734375,
-0.0104217529296875,
-0.022674560546875,
-0.07305908203125,
0.0164031982421875,
-0.0205535888671875,
-0.058258056640625,
0.0188140869140625,
-0.007572174072265625,
-0.0362548828125,
0.034454345703125,
-0.05194091796875,
0.0738525390625,
0.0024394989013671875,
-0.0147705078125,
-0.018280029296875,
-0.046905517578125,
0.015106201171875,
0.005451202392578125,
-0.006938934326171875,
-0.004058837890625,
0.024810791015625,
0.0770263671875,
-0.06024169921875,
0.04754638671875,
-0.0235748291015625,
0.0016260147094726562,
0.03076171875,
-0.00588226318359375,
0.023834228515625,
-0.0175933837890625,
-0.0166473388671875,
0.03765869140625,
0.027099609375,
-0.053253173828125,
-0.012603759765625,
0.0399169921875,
-0.076416015625,
-0.03466796875,
-0.050811767578125,
-0.0299530029296875,
0.0031871795654296875,
0.0146942138671875,
0.014129638671875,
0.0228271484375,
-0.00992584228515625,
0.019622802734375,
0.05364990234375,
-0.0304718017578125,
0.0421142578125,
0.048675537109375,
-0.0031719207763671875,
-0.0167083740234375,
0.04888916015625,
-0.00324249267578125,
-0.0148468017578125,
0.0017976760864257812,
0.0027904510498046875,
-0.0264434814453125,
-0.03741455078125,
-0.042877197265625,
0.033905029296875,
-0.037567138671875,
-0.005725860595703125,
-0.057525634765625,
-0.0279541015625,
-0.05364990234375,
0.00791168212890625,
-0.0380859375,
-0.037445068359375,
-0.0186767578125,
-0.01172637939453125,
0.0433349609375,
0.040496826171875,
-0.02264404296875,
0.0170135498046875,
-0.042388916015625,
0.01010894775390625,
0.0146942138671875,
0.037445068359375,
-0.01001739501953125,
-0.0565185546875,
-0.0225067138671875,
-0.00011682510375976562,
-0.0130615234375,
-0.049896240234375,
0.048858642578125,
0.0088043212890625,
0.041412353515625,
0.0302276611328125,
0.00664520263671875,
0.035125732421875,
-0.03765869140625,
0.0484619140625,
0.0030574798583984375,
-0.049530029296875,
0.0284881591796875,
-0.034210205078125,
0.00666046142578125,
0.036102294921875,
0.0306396484375,
-0.03729248046875,
-0.00783538818359375,
-0.0582275390625,
-0.06414794921875,
0.0743408203125,
0.0272979736328125,
0.00968170166015625,
0.01026153564453125,
0.027801513671875,
0.0065460205078125,
0.016082763671875,
-0.05078125,
-0.018707275390625,
-0.035400390625,
-0.0194091796875,
0.00327301025390625,
-0.035064697265625,
-0.01474761962890625,
-0.040863037109375,
0.0804443359375,
0.01605224609375,
0.05438232421875,
0.027252197265625,
-0.01273345947265625,
-0.00922393798828125,
0.005458831787109375,
0.04888916015625,
0.030914306640625,
-0.05303955078125,
-0.020355224609375,
0.01204681396484375,
-0.0379638671875,
-0.0166473388671875,
0.04345703125,
-0.0039825439453125,
0.0137176513671875,
0.01468658447265625,
0.06219482421875,
-0.01021575927734375,
-0.034759521484375,
0.035888671875,
-0.0323486328125,
-0.036956787109375,
-0.05413818359375,
-0.020050048828125,
0.005977630615234375,
0.03271484375,
0.0389404296875,
-0.01806640625,
0.0015106201171875,
-0.017364501953125,
0.0214080810546875,
0.0302276611328125,
-0.033599853515625,
-0.016265869140625,
0.03106689453125,
0.006256103515625,
-0.0027637481689453125,
0.072021484375,
-0.005435943603515625,
-0.048614501953125,
0.05657958984375,
0.0176849365234375,
0.06683349609375,
-0.010498046875,
0.01812744140625,
0.042205810546875,
0.030731201171875,
0.0029201507568359375,
0.0239715576171875,
-0.01219940185546875,
-0.060333251953125,
-0.0307769775390625,
-0.07257080078125,
-0.01303863525390625,
0.036163330078125,
-0.052764892578125,
0.0273284912109375,
-0.03369140625,
-0.020965576171875,
-0.00234222412109375,
0.037994384765625,
-0.054901123046875,
0.0085601806640625,
0.0155029296875,
0.070068359375,
-0.059112548828125,
0.0784912109375,
0.0643310546875,
-0.034759521484375,
-0.05450439453125,
-0.0345458984375,
-0.0185546875,
-0.05072021484375,
0.036834716796875,
0.01311492919921875,
0.0294952392578125,
-0.00818634033203125,
-0.037200927734375,
-0.060943603515625,
0.07379150390625,
0.016815185546875,
-0.0169677734375,
0.0083465576171875,
0.006305694580078125,
0.042633056640625,
-0.0175323486328125,
0.0254669189453125,
0.041839599609375,
0.024658203125,
0.01418304443359375,
-0.06072998046875,
0.00550079345703125,
-0.036773681640625,
-0.0174560546875,
0.003330230712890625,
-0.044708251953125,
0.0654296875,
-0.0113677978515625,
-0.004573822021484375,
0.0186004638671875,
0.055633544921875,
0.0291595458984375,
-0.0009641647338867188,
0.037933349609375,
0.06072998046875,
0.02984619140625,
-0.01166534423828125,
0.08538818359375,
-0.0309600830078125,
0.03765869140625,
0.053497314453125,
0.00876617431640625,
0.048614501953125,
0.03521728515625,
-0.0252685546875,
0.055023193359375,
0.08441162109375,
-0.0221710205078125,
0.038177490234375,
0.002613067626953125,
-0.02862548828125,
-0.02880859375,
0.003597259521484375,
-0.03826904296875,
0.0225830078125,
0.0229034423828125,
-0.02520751953125,
-0.0188140869140625,
-0.01248931884765625,
0.01409912109375,
-0.04058837890625,
-0.006649017333984375,
0.048614501953125,
0.006622314453125,
-0.033447265625,
0.0472412109375,
0.017059326171875,
0.051361083984375,
-0.04766845703125,
0.0006461143493652344,
-0.0205078125,
0.01305389404296875,
-0.003173828125,
-0.027587890625,
0.0254669189453125,
-0.0059051513671875,
-0.00936126708984375,
-0.0205078125,
0.055938720703125,
-0.02972412109375,
-0.05841064453125,
0.0122222900390625,
0.027252197265625,
0.0321044921875,
-0.004253387451171875,
-0.0826416015625,
-0.00582122802734375,
-0.00909423828125,
-0.042877197265625,
0.037139892578125,
0.01404571533203125,
0.006366729736328125,
0.056304931640625,
0.049072265625,
-0.0007085800170898438,
0.00968170166015625,
0.0067901611328125,
0.06109619140625,
-0.036041259765625,
-0.0261383056640625,
-0.0421142578125,
0.0396728515625,
0.0083770751953125,
-0.01393890380859375,
0.0418701171875,
0.035919189453125,
0.0657958984375,
-0.0098114013671875,
0.049591064453125,
-0.0283355712890625,
0.03515625,
-0.0225677490234375,
0.0833740234375,
-0.049713134765625,
-0.009613037109375,
-0.02392578125,
-0.053802490234375,
-0.01036834716796875,
0.07806396484375,
-0.021148681640625,
0.015289306640625,
0.035980224609375,
0.0499267578125,
0.0017242431640625,
-0.0126800537109375,
0.007061004638671875,
0.0280303955078125,
0.0110626220703125,
0.036041259765625,
0.033538818359375,
-0.05010986328125,
0.042083740234375,
-0.04400634765625,
-0.007904052734375,
-0.024322509765625,
-0.06219482421875,
-0.08856201171875,
-0.05902099609375,
-0.0278167724609375,
-0.04522705078125,
0.005252838134765625,
0.073974609375,
0.065673828125,
-0.0784912109375,
-0.035064697265625,
-0.01029205322265625,
0.005657196044921875,
-0.019561767578125,
-0.0185546875,
0.05194091796875,
-0.0277099609375,
-0.053253173828125,
0.00922393798828125,
-0.0214996337890625,
0.0218658447265625,
-0.00994110107421875,
-0.0093994140625,
-0.0287933349609375,
0.004604339599609375,
0.036041259765625,
0.0283355712890625,
-0.043212890625,
-0.01110076904296875,
-0.00257110595703125,
-0.01090240478515625,
0.003696441650390625,
0.03997802734375,
-0.051605224609375,
0.0259246826171875,
0.0282745361328125,
0.0259246826171875,
0.0692138671875,
-0.0279998779296875,
0.043487548828125,
-0.03765869140625,
0.024749755859375,
0.002777099609375,
0.04736328125,
0.0321044921875,
-0.0086669921875,
0.035736083984375,
0.00644683837890625,
-0.032684326171875,
-0.058746337890625,
-0.01161956787109375,
-0.07989501953125,
-0.0246734619140625,
0.0703125,
-0.0264434814453125,
-0.027587890625,
0.002498626708984375,
-0.00644683837890625,
0.0478515625,
-0.034454345703125,
0.08673095703125,
0.0709228515625,
-0.000751495361328125,
-0.0199737548828125,
-0.039276123046875,
0.049163818359375,
0.051025390625,
-0.032989501953125,
-0.01959228515625,
0.0233612060546875,
0.038787841796875,
-0.0111236572265625,
0.03607177734375,
-0.0178985595703125,
0.01910400390625,
-0.01275634765625,
0.04779052734375,
-0.00730133056640625,
-0.00730133056640625,
-0.0309906005859375,
-0.01324462890625,
-0.019073486328125,
-0.01209259033203125
]
] |
emilianJR/epiCRealism | 2023-07-23T14:14:35.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | emilianJR | null | null | emilianJR/epiCRealism | 22 | 97,300 | diffusers | 2023-06-25T12:13:26 | ---
language:
- en
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
---
Diffusers model for this Stable Diffusion checkpoint:
https://civitai.com/models/25694/epicrealism
**emilianJR/epiCRealism** is the Hugging Face Diffusers version of the checkpoint, which you can use with **diffusers.StableDiffusionPipeline()**.
Examples | Examples
---- | ----
,%20raw%20photo,%20((monochrome)),%20((grayscale)),%20black%20and%20white%20photo.jpeg) | 
,%20lotr,%20fantasy,%20elf,%20female,%20full%20body,%20looking%20at%20viewer,%20portrait,%20phot.jpeg) | ,%20soft%20lighting,%20h.jpeg)
-------
## 🧨 Diffusers
This model can be used just like any other Stable Diffusion model. For more information,
please have a look at the [Stable Diffusion documentation](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion).
```python
from diffusers import StableDiffusionPipeline
import torch
model_id = "emilianJR/epiCRealism"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda")
prompt = "YOUR PROMPT"
image = pipe(prompt).images[0]
image.save("image.png")
```
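The pipeline call also accepts optional parameters such as `negative_prompt`, `num_inference_steps`, `guidance_scale`, and a seeded `generator` for reproducible images. The seeding mechanics behind `generator` can be sketched without downloading the checkpoint; this is an illustration assuming only `torch` is installed, where the tensor draw stands in for the pipeline's initial latent noise (the helper `initial_latents` is our own name):

```python
import torch

def initial_latents(seed: int, shape=(1, 4, 64, 64)) -> torch.Tensor:
    """Draw initial latent noise from an explicitly seeded torch.Generator,
    the same mechanism StableDiffusionPipeline uses for reproducibility."""
    generator = torch.Generator("cpu").manual_seed(seed)
    return torch.randn(shape, generator=generator)

# The same seed always yields the same starting noise -> the same image.
a = initial_latents(42)
b = initial_latents(42)
c = initial_latents(43)
print(torch.equal(a, b))  # True
print(torch.equal(a, c))  # False
```

With the real pipeline, you would pass the seeded generator directly: `pipe(prompt, generator=torch.Generator("cpu").manual_seed(42))`.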
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license that further specifies rights and usage.
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) | 2,382 | [
[
-0.03955078125,
-0.048675537109375,
0.035247802734375,
0.0306243896484375,
-0.0228729248046875,
-0.00611114501953125,
0.0253448486328125,
-0.0220794677734375,
0.0239105224609375,
0.0396728515625,
-0.04998779296875,
-0.051361083984375,
-0.0389404296875,
-0.0056610107421875,
-0.0161285400390625,
0.062225341796875,
-0.0224609375,
-0.0006709098815917969,
-0.0030155181884765625,
-0.0095062255859375,
-0.0194244384765625,
0.0039520263671875,
-0.06103515625,
-0.006198883056640625,
0.035308837890625,
-0.0011415481567382812,
0.048675537109375,
0.023101806640625,
0.03033447265625,
0.0238037109375,
-0.040771484375,
-0.002170562744140625,
-0.027587890625,
0.004650115966796875,
-0.01105499267578125,
-0.0180816650390625,
-0.055908203125,
0.0171661376953125,
0.041259765625,
0.030792236328125,
-0.0227508544921875,
0.0005960464477539062,
0.0025386810302734375,
0.038787841796875,
-0.03179931640625,
-0.0142364501953125,
0.001239776611328125,
0.0016412734985351562,
-0.019866943359375,
0.005680084228515625,
0.0005660057067871094,
-0.03814697265625,
0.013916015625,
-0.05438232421875,
0.0239105224609375,
-0.00713348388671875,
0.0985107421875,
0.01100921630859375,
-0.025299072265625,
-0.0200042724609375,
-0.048858642578125,
0.03558349609375,
-0.0367431640625,
0.021697998046875,
0.00554656982421875,
0.0252532958984375,
-0.0016908645629882812,
-0.08416748046875,
-0.044586181640625,
0.0229034423828125,
-0.007259368896484375,
0.03021240234375,
-0.015045166015625,
-0.0029754638671875,
0.0261077880859375,
0.027984619140625,
-0.0528564453125,
-0.0130767822265625,
-0.046630859375,
0.00359344482421875,
0.0380859375,
0.01546478271484375,
0.030792236328125,
-0.00797271728515625,
-0.054046630859375,
0.004261016845703125,
-0.02740478515625,
0.0085296630859375,
0.004486083984375,
-0.0241546630859375,
-0.0389404296875,
0.03363037109375,
-0.0138092041015625,
0.047271728515625,
0.032379150390625,
-0.01422119140625,
0.0181121826171875,
-0.02056884765625,
-0.01395416259765625,
-0.033355712890625,
0.050567626953125,
0.059814453125,
0.01003265380859375,
0.00677490234375,
-0.005168914794921875,
-0.0032863616943359375,
0.00902557373046875,
-0.10003662109375,
-0.0233917236328125,
0.035919189453125,
-0.04180908203125,
-0.031585693359375,
0.004207611083984375,
-0.0626220703125,
-0.01103973388671875,
-0.002124786376953125,
0.01052093505859375,
-0.029144287109375,
-0.0406494140625,
0.01059722900390625,
-0.024078369140625,
0.0291748046875,
0.050628662109375,
-0.038482666015625,
-0.002246856689453125,
0.0152435302734375,
0.0921630859375,
-0.0012769699096679688,
0.0060577392578125,
-0.0005350112915039062,
0.029632568359375,
-0.019683837890625,
0.067138671875,
-0.0182342529296875,
-0.0413818359375,
-0.00859832763671875,
0.036224365234375,
-0.0129241943359375,
-0.0260162353515625,
0.059112548828125,
-0.04150390625,
0.007343292236328125,
-0.02862548828125,
-0.0367431640625,
-0.01212310791015625,
-0.006328582763671875,
-0.043060302734375,
0.0675048828125,
0.02362060546875,
-0.07745361328125,
0.033782958984375,
-0.03912353515625,
0.01096343994140625,
0.0253143310546875,
-0.001827239990234375,
-0.04986572265625,
-0.00646209716796875,
-0.003993988037109375,
0.04461669921875,
0.0020275115966796875,
-0.0078582763671875,
-0.0426025390625,
-0.0204315185546875,
-0.0054473876953125,
0.00812530517578125,
0.095458984375,
0.0218963623046875,
-0.02801513671875,
0.004283905029296875,
-0.0310821533203125,
-0.002223968505859375,
0.0164947509765625,
-0.018218994140625,
-0.01103973388671875,
-0.032440185546875,
0.0298919677734375,
0.0355224609375,
0.0071868896484375,
-0.061614990234375,
0.0175018310546875,
-0.001216888427734375,
0.041168212890625,
0.0560302734375,
0.0256500244140625,
0.03118896484375,
-0.044342041015625,
0.051025390625,
0.02716064453125,
0.022613525390625,
0.0011510848999023438,
-0.044647216796875,
-0.060699462890625,
-0.055145263671875,
-0.0016918182373046875,
0.0269317626953125,
-0.060821533203125,
0.025848388671875,
0.00038170814514160156,
-0.07122802734375,
-0.0335693359375,
-0.002414703369140625,
0.016632080078125,
0.05078125,
0.01248931884765625,
-0.03662109375,
-0.0204925537109375,
-0.0517578125,
0.00438690185546875,
0.005039215087890625,
-0.007083892822265625,
0.0135498046875,
0.041412353515625,
-0.0241546630859375,
0.059326171875,
-0.044891357421875,
-0.037841796875,
-0.0096435546875,
-0.00786590576171875,
0.033905029296875,
0.0616455078125,
0.069580078125,
-0.06939697265625,
-0.05755615234375,
-0.014617919921875,
-0.0643310546875,
-0.00707244873046875,
0.0006189346313476562,
-0.04150390625,
0.01385498046875,
0.0276336669921875,
-0.072509765625,
0.0660400390625,
0.041168212890625,
-0.06451416015625,
0.0438232421875,
-0.0221405029296875,
0.0140380859375,
-0.0809326171875,
0.0308990478515625,
0.024444580078125,
-0.04266357421875,
-0.05206298828125,
0.026336669921875,
0.004852294921875,
0.0032825469970703125,
-0.04931640625,
0.06976318359375,
-0.043182373046875,
0.034942626953125,
-0.0244293212890625,
-0.010223388671875,
0.016143798828125,
0.01384735107421875,
0.030853271484375,
0.037078857421875,
0.0604248046875,
-0.0418701171875,
0.0235443115234375,
0.03692626953125,
-0.0218963623046875,
0.0745849609375,
-0.07110595703125,
0.00235748291015625,
-0.034942626953125,
0.037078857421875,
-0.08203125,
-0.0264739990234375,
0.038848876953125,
-0.0247344970703125,
0.0158538818359375,
-0.024322509765625,
-0.018402099609375,
-0.0290679931640625,
-0.0182647705078125,
0.029632568359375,
0.05609130859375,
-0.043060302734375,
0.052001953125,
0.0196533203125,
-0.00576019287109375,
-0.015960693359375,
-0.049041748046875,
-0.0210418701171875,
-0.042999267578125,
-0.07086181640625,
0.044189453125,
-0.026031494140625,
-0.0102386474609375,
-0.0168304443359375,
0.0020198822021484375,
-0.01540374755859375,
-0.01557159423828125,
0.03717041015625,
0.047393798828125,
-0.004375457763671875,
-0.045867919921875,
0.01027679443359375,
-0.0160064697265625,
-0.006977081298828125,
-0.00555419921875,
0.034759521484375,
-0.01146697998046875,
-0.01079559326171875,
-0.0684814453125,
0.0213775634765625,
0.053924560546875,
0.01160430908203125,
0.08416748046875,
0.061431884765625,
-0.038604736328125,
0.0026760101318359375,
-0.047821044921875,
-0.003627777099609375,
-0.0384521484375,
-0.00914764404296875,
-0.026885986328125,
-0.032318115234375,
0.06317138671875,
0.00476837158203125,
0.0129241943359375,
0.03460693359375,
0.052764892578125,
-0.0204010009765625,
0.07421875,
0.042266845703125,
0.026824951171875,
0.03961181640625,
-0.054107666015625,
-0.005893707275390625,
-0.079345703125,
-0.03759765625,
-0.014984130859375,
-0.0180511474609375,
-0.01898193359375,
-0.0411376953125,
0.033416748046875,
0.03704833984375,
-0.03564453125,
0.035858154296875,
-0.045318603515625,
0.0283203125,
0.019256591796875,
0.0290679931640625,
0.01532745361328125,
-0.011138916015625,
-0.0008363723754882812,
-0.007598876953125,
-0.0302581787109375,
-0.043182373046875,
0.045379638671875,
0.0308990478515625,
0.05230712890625,
0.019805908203125,
0.045623779296875,
0.0022487640380859375,
0.036102294921875,
-0.0236358642578125,
0.028045654296875,
-0.0005321502685546875,
-0.06756591796875,
0.00470733642578125,
-0.019378662109375,
-0.04931640625,
0.0222320556640625,
-0.01153564453125,
-0.05230712890625,
0.054718017578125,
0.01617431640625,
-0.033935546875,
0.022247314453125,
-0.043792724609375,
0.053619384765625,
0.0125579833984375,
-0.057220458984375,
0.0152740478515625,
-0.037567138671875,
0.031219482421875,
0.014923095703125,
0.007518768310546875,
-0.01136016845703125,
-0.0167083740234375,
0.0523681640625,
-0.0458984375,
0.051055908203125,
-0.042816162109375,
-0.003376007080078125,
0.02130126953125,
0.003902435302734375,
0.025177001953125,
0.027191162109375,
-0.00992584228515625,
0.0142669677734375,
0.022369384765625,
-0.04852294921875,
-0.0171966552734375,
0.054534912109375,
-0.043731689453125,
-0.01629638671875,
-0.042633056640625,
-0.0140380859375,
0.0281219482421875,
0.0203857421875,
0.056732177734375,
0.020416259765625,
-0.01082611083984375,
-0.0131378173828125,
0.058013916015625,
-0.004474639892578125,
0.03521728515625,
0.01617431640625,
-0.0303955078125,
-0.0294342041015625,
0.04254150390625,
0.015167236328125,
0.04034423828125,
-0.023223876953125,
0.00128936767578125,
-0.01262664794921875,
-0.02880859375,
-0.04193115234375,
0.0447998046875,
-0.0543212890625,
-0.0171356201171875,
-0.045867919921875,
-0.0299835205078125,
-0.0224609375,
-0.046417236328125,
-0.01401519775390625,
-0.0170745849609375,
-0.045257568359375,
0.01122283935546875,
0.048736572265625,
0.032257080078125,
-0.006099700927734375,
0.0306854248046875,
-0.0265960693359375,
0.0248870849609375,
0.0181427001953125,
0.0218658447265625,
0.00687408447265625,
-0.03533935546875,
0.00649261474609375,
-0.007717132568359375,
-0.035369873046875,
-0.063720703125,
0.045379638671875,
0.0154571533203125,
0.0214080810546875,
0.04620361328125,
-0.0014657974243164062,
0.06683349609375,
-0.0248870849609375,
0.047821044921875,
0.0307159423828125,
-0.048797607421875,
0.034515380859375,
-0.05230712890625,
0.0189666748046875,
0.03424072265625,
0.03436279296875,
-0.044342041015625,
-0.028411865234375,
-0.056365966796875,
-0.052978515625,
0.034210205078125,
0.028656005859375,
0.00015842914581298828,
0.00998687744140625,
0.05633544921875,
-0.0041961669921875,
0.005157470703125,
-0.051025390625,
-0.0297088623046875,
-0.0171051025390625,
0.003498077392578125,
0.0220184326171875,
0.0013971328735351562,
-0.018707275390625,
-0.0399169921875,
0.05487060546875,
-0.00688934326171875,
0.0266265869140625,
0.03765869140625,
0.01285552978515625,
-0.0137786865234375,
-0.0277862548828125,
0.0269927978515625,
0.057342529296875,
-0.029937744140625,
-0.0246734619140625,
-0.004871368408203125,
-0.044708251953125,
0.00669097900390625,
0.00045800209045410156,
-0.0362548828125,
0.0133819580078125,
0.00302886962890625,
0.042724609375,
-0.00740814208984375,
-0.020538330078125,
0.05743408203125,
-0.0191192626953125,
-0.01384735107421875,
-0.038726806640625,
0.00527191162109375,
0.037506103515625,
0.03729248046875,
0.01039886474609375,
0.026458740234375,
0.0175323486328125,
-0.0284271240234375,
0.00440216064453125,
0.03936767578125,
-0.0311126708984375,
-0.0262908935546875,
0.08978271484375,
-0.00020945072174072266,
0.00228118896484375,
0.036224365234375,
-0.005615234375,
-0.022735595703125,
0.0458984375,
0.026031494140625,
0.0633544921875,
-0.034698486328125,
0.02880859375,
0.050506591796875,
-0.002227783203125,
-0.0024013519287109375,
0.036285400390625,
0.013153076171875,
-0.03131103515625,
0.006748199462890625,
-0.06689453125,
-0.009735107421875,
-0.01454925537109375,
-0.052490234375,
0.0457763671875,
-0.058074951171875,
-0.0198211669921875,
0.009429931640625,
-0.00820159912109375,
-0.06195068359375,
0.02239990234375,
0.002864837646484375,
0.08319091796875,
-0.06768798828125,
0.060791015625,
0.0303802490234375,
-0.04925537109375,
-0.0423583984375,
-0.003509521484375,
0.006031036376953125,
-0.047119140625,
0.023773193359375,
0.009429931640625,
-0.00707244873046875,
-0.001422882080078125,
-0.04937744140625,
-0.06268310546875,
0.11163330078125,
0.01519012451171875,
-0.0277252197265625,
-0.0252227783203125,
-0.03216552734375,
0.0311431884765625,
-0.037139892578125,
0.035919189453125,
0.03363037109375,
0.02850341796875,
0.05340576171875,
-0.042449951171875,
0.0190887451171875,
-0.02227783203125,
0.038909912109375,
-0.008819580078125,
-0.075439453125,
0.079833984375,
-0.01308441162109375,
-0.02978515625,
0.056365966796875,
0.0556640625,
0.04388427734375,
0.0213775634765625,
0.040283203125,
0.05841064453125,
0.0281524658203125,
-0.0150146484375,
0.07684326171875,
-0.0025787353515625,
0.047149658203125,
0.03179931640625,
-0.0011682510375976562,
0.051422119140625,
0.0254669189453125,
-0.00658416748046875,
0.055572509765625,
0.06317138671875,
0.0022869110107421875,
0.04400634765625,
0.002559661865234375,
-0.0394287109375,
-0.01153564453125,
-0.0039520263671875,
-0.033477783203125,
-0.002101898193359375,
0.022247314453125,
-0.027252197265625,
-0.0119171142578125,
0.00870513916015625,
0.0223541259765625,
-0.0203094482421875,
-0.02392578125,
0.045867919921875,
0.007110595703125,
-0.01800537109375,
0.053131103515625,
-0.00728607177734375,
0.064697265625,
-0.049285888671875,
0.0012884140014648438,
-0.0156707763671875,
0.023956298828125,
-0.04150390625,
-0.06884765625,
0.0275115966796875,
-0.0032444000244140625,
-0.0028514862060546875,
-0.0219879150390625,
0.045196533203125,
-0.0256805419921875,
-0.062164306640625,
0.0174102783203125,
0.011383056640625,
0.030029296875,
0.015960693359375,
-0.07025146484375,
0.0185089111328125,
-0.00856781005859375,
-0.034942626953125,
0.01059722900390625,
0.0137786865234375,
0.0223541259765625,
0.047698974609375,
0.026092529296875,
0.0118560791015625,
0.00815582275390625,
-0.0221099853515625,
0.081298828125,
-0.02911376953125,
-0.03515625,
-0.03936767578125,
0.06988525390625,
-0.016937255859375,
-0.037078857421875,
0.0576171875,
0.03302001953125,
0.05157470703125,
-0.0092010498046875,
0.045562744140625,
-0.0279388427734375,
0.0411376953125,
-0.03631591796875,
0.08056640625,
-0.05633544921875,
-0.0066680908203125,
-0.04742431640625,
-0.0792236328125,
-0.0188446044921875,
0.0816650390625,
0.00042510032653808594,
0.0185089111328125,
0.0276947021484375,
0.07159423828125,
-0.0201416015625,
-0.00527191162109375,
-0.004695892333984375,
0.005146026611328125,
0.030792236328125,
0.0171966552734375,
0.0452880859375,
-0.03472900390625,
0.0060577392578125,
-0.040863037109375,
-0.043853759765625,
-0.0007772445678710938,
-0.05517578125,
-0.0712890625,
-0.060821533203125,
-0.047119140625,
-0.06488037109375,
-0.0092926025390625,
0.035064697265625,
0.0760498046875,
-0.045684814453125,
-0.007091522216796875,
-0.0125274658203125,
-0.0087738037109375,
0.0006737709045410156,
-0.0205078125,
0.0246429443359375,
0.03436279296875,
-0.0799560546875,
-0.0164031982421875,
0.0048828125,
0.04583740234375,
-0.034088134765625,
-0.01776123046875,
-0.020721435546875,
0.002010345458984375,
0.017059326171875,
0.0275726318359375,
-0.045196533203125,
-0.01007843017578125,
-0.020843505859375,
-0.0013494491577148438,
0.003498077392578125,
0.0260009765625,
-0.051025390625,
0.0164642333984375,
0.0279541015625,
-0.0002200603485107422,
0.06298828125,
0.007602691650390625,
0.01513671875,
-0.0217437744140625,
0.01491546630859375,
0.017608642578125,
0.033660888671875,
0.0142822265625,
-0.03045654296875,
0.034698486328125,
0.0280303955078125,
-0.05694580078125,
-0.06365966796875,
0.01177978515625,
-0.09814453125,
-0.011505126953125,
0.07110595703125,
-0.030120849609375,
-0.0261383056640625,
-0.0015554428100585938,
-0.0294647216796875,
0.0025997161865234375,
-0.0252838134765625,
0.03265380859375,
0.03570556640625,
-0.0133819580078125,
-0.029205322265625,
-0.03765869140625,
0.039825439453125,
0.01192474365234375,
-0.0501708984375,
-0.00994873046875,
0.0289306640625,
0.054840087890625,
0.027801513671875,
0.058135986328125,
-0.01910400390625,
0.0254669189453125,
-0.00562286376953125,
0.0058135986328125,
0.01412200927734375,
-0.003963470458984375,
-0.02093505859375,
0.0005335807800292969,
-0.02130126953125,
-0.0118865966796875
]
] |
bert-base-german-cased | 2023-04-06T13:41:34.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"exbert",
"de",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | bert-base-german-cased | 49 | 96,977 | transformers | 2022-03-02T23:29:04 | ---
language: de
license: mit
thumbnail: https://static.tildacdn.com/tild6438-3730-4164-b266-613634323466/german_bert.png
tags:
- exbert
---
<a href="https://huggingface.co/exbert/?model=bert-base-german-cased">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
# German BERT

## Overview
**Language model:** bert-base-cased
**Language:** German
**Training data:** Wiki, OpenLegalData, News (~ 12GB)
**Eval data:** Conll03 (NER), GermEval14 (NER), GermEval18 (Classification), GNAD (Classification)
**Infrastructure**: 1x TPU v2
**Published**: Jun 14th, 2019
**Update April 3rd, 2020**: We updated the vocabulary file on deepset's S3 to conform with the default tokenization of punctuation tokens.
For details see the related [FARM issue](https://github.com/deepset-ai/FARM/issues/60). If you want to use the old vocab we have also uploaded a ["deepset/bert-base-german-cased-oldvocab"](https://huggingface.co/deepset/bert-base-german-cased-oldvocab) model.
## Details
- We trained using Google's TensorFlow code on a single cloud TPU v2 with standard settings.
- We trained 810k steps with a batch size of 1024 for sequence length 128 and 30k steps with sequence length 512. Training took about 9 days.
- As training data we used the latest German Wikipedia dump (6 GB of raw txt files), the OpenLegalData dump (2.4 GB) and news articles (3.6 GB).
- We cleaned the data dumps with tailored scripts and segmented sentences with spaCy v2.1. To create TensorFlow records we used the recommended SentencePiece library for creating the WordPiece vocabulary and TensorFlow scripts to convert the text to data usable by BERT.
See https://deepset.ai/german-bert for more details
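As a quick sanity check, the model can be queried with the standard `fill-mask` pipeline from the transformers library. This is a generic usage sketch, not part of the original training setup; the example sentence is illustrative.

```python
# Minimal usage sketch: querying German BERT through the fill-mask pipeline.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-german-cased")
preds = fill_mask("Der Zug fährt nach [MASK].")

# Each prediction carries the filled-in token string and its score.
for p in preds[:3]:
    print(p["token_str"], round(p["score"], 3))
```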
## Hyperparameters
```
batch_size = 1024
n_steps = 810_000
max_seq_len = 128 (and 512 later)
learning_rate = 1e-4
lr_schedule = LinearWarmup
num_warmup_steps = 10_000
```
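The schedule above can be sketched as a simple function of the step count. The warmup length, peak learning rate, and total steps mirror the hyperparameters listed in the card; the exact decay used in the original BERT training code may differ, so treat this as an illustration rather than the reference implementation.

```python
# Sketch of a linear-warmup learning-rate schedule with the card's hyperparameters.
def linear_warmup_lr(step, peak_lr=1e-4, warmup_steps=10_000, total_steps=810_000):
    if step < warmup_steps:
        # Linear ramp-up from 0 to the peak learning rate.
        return peak_lr * step / warmup_steps
    # Linear decay from the peak down to zero over the remaining steps
    # (an assumption; the original code's decay may differ).
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(5_000))    # halfway through warmup → 5e-05
print(linear_warmup_lr(10_000))   # peak → 0.0001
print(linear_warmup_lr(810_000))  # end of training → 0.0
```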
## Performance
During training we monitored the loss and evaluated different model checkpoints on the following German datasets:
- germEval18Fine: Macro f1 score for multiclass sentiment classification
- germEval18coarse: Macro f1 score for binary sentiment classification
- germEval14: Seq f1 score for NER (file names deuutf.\*)
- CONLL03: Seq f1 score for NER
- 10kGNAD: Accuracy for document classification
Even without thorough hyperparameter tuning, we observed quite stable learning, especially for our German model. Multiple restarts with different seeds produced quite similar results.

We further evaluated different points during the 9 days of pre-training and were astonished by how fast the model converges to the maximally reachable performance. We ran all 5 downstream tasks on 7 different model checkpoints, taken at 0 up to 840k training steps (x-axis in the figure below). Most checkpoints are taken from early training, where we expected the most performance changes. Surprisingly, even a randomly initialized BERT can be trained only on labeled downstream datasets and reach good performance (blue line, GermEval 2018 Coarse task, 795 kB trainset size).

## Authors
- Branden Chan: `branden.chan [at] deepset.ai`
- Timo Möller: `timo.moeller [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
- Tanay Soni: `tanay.soni [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Website](https://deepset.ai)
| 4,194 | [
[
-0.0316162109375,
-0.054290771484375,
0.0240020751953125,
0.016021728515625,
-0.0245208740234375,
-0.01319122314453125,
-0.041168212890625,
-0.03509521484375,
0.0021648406982421875,
0.01337432861328125,
-0.0455322265625,
-0.051483154296875,
-0.041534423828125,
-0.01129150390625,
-0.0186309814453125,
0.0889892578125,
0.00255584716796875,
0.021636962890625,
0.0134124755859375,
-0.01554107666015625,
-0.021484375,
-0.056884765625,
-0.06756591796875,
-0.0304107666015625,
0.042510986328125,
0.00006651878356933594,
0.03594970703125,
0.0254058837890625,
0.033721923828125,
0.0169677734375,
-0.0193634033203125,
-0.004505157470703125,
-0.032073974609375,
-0.0030422210693359375,
0.00916290283203125,
-0.015625,
-0.0222930908203125,
-0.004573822021484375,
0.040985107421875,
0.0460205078125,
-0.0028400421142578125,
0.0023136138916015625,
0.011505126953125,
0.0550537109375,
-0.0305023193359375,
0.00974273681640625,
-0.040679931640625,
0.0038318634033203125,
-0.01259613037109375,
0.022369384765625,
-0.0311737060546875,
-0.019073486328125,
0.01922607421875,
-0.039154052734375,
0.022216796875,
-0.0278472900390625,
0.1063232421875,
0.0078582763671875,
-0.00812530517578125,
-0.009735107421875,
-0.035369873046875,
0.05242919921875,
-0.0777587890625,
0.03411865234375,
0.01043701171875,
0.01446533203125,
-0.01331329345703125,
-0.0653076171875,
-0.04620361328125,
-0.01580810546875,
0.00499725341796875,
0.0127716064453125,
-0.01097869873046875,
-0.00027632713317871094,
0.0206146240234375,
0.028472900390625,
-0.039306640625,
0.028045654296875,
-0.0265350341796875,
-0.0231781005859375,
0.040496826171875,
-0.016357421875,
0.006549835205078125,
-0.0096282958984375,
-0.02825927734375,
-0.03692626953125,
-0.050018310546875,
0.0148468017578125,
0.025543212890625,
0.038848876953125,
-0.0186767578125,
0.0218048095703125,
-0.00666046142578125,
0.042694091796875,
-0.0022125244140625,
0.021575927734375,
0.050811767578125,
-0.023162841796875,
-0.022918701171875,
-0.009979248046875,
0.0673828125,
0.00803375244140625,
0.023193359375,
-0.01275634765625,
-0.0237579345703125,
-0.01374053955078125,
0.0167236328125,
-0.07025146484375,
-0.0258636474609375,
0.0245208740234375,
-0.031341552734375,
-0.01947021484375,
0.0018558502197265625,
-0.03192138671875,
-0.016998291015625,
-0.0215301513671875,
0.04754638671875,
-0.07708740234375,
-0.01323699951171875,
0.00728607177734375,
-0.01064300537109375,
0.033111572265625,
0.020965576171875,
-0.045013427734375,
-0.006748199462890625,
0.05316162109375,
0.060821533203125,
0.00189971923828125,
-0.0316162109375,
-0.02191162109375,
-0.0265045166015625,
-0.0302581787109375,
0.0399169921875,
-0.00717926025390625,
-0.005252838134765625,
0.01148223876953125,
0.003849029541015625,
-0.005641937255859375,
-0.031707763671875,
0.035980224609375,
-0.049041748046875,
0.036895751953125,
-0.033905029296875,
-0.058685302734375,
-0.01494598388671875,
0.0212249755859375,
-0.0478515625,
0.0885009765625,
0.00926971435546875,
-0.044952392578125,
0.04925537109375,
-0.054840087890625,
-0.04132080078125,
0.0005555152893066406,
0.0030994415283203125,
-0.023345947265625,
-0.0027446746826171875,
0.017974853515625,
0.04608154296875,
0.00628662109375,
0.0198516845703125,
-0.0238189697265625,
-0.03045654296875,
0.01290130615234375,
-0.01849365234375,
0.08050537109375,
0.01349639892578125,
-0.0364990234375,
-0.01232147216796875,
-0.05767822265625,
0.009246826171875,
0.00742340087890625,
-0.043365478515625,
-0.01488494873046875,
-0.006168365478515625,
0.013275146484375,
0.026824951171875,
0.025390625,
-0.039337158203125,
0.0118865966796875,
-0.04327392578125,
0.0343017578125,
0.057586669921875,
-0.01158905029296875,
0.016143798828125,
-0.0251007080078125,
0.0103607177734375,
0.0145263671875,
0.009368896484375,
-0.01290130615234375,
-0.0257110595703125,
-0.08514404296875,
-0.0205078125,
0.054962158203125,
0.041107177734375,
-0.048858642578125,
0.06787109375,
-0.034637451171875,
-0.041778564453125,
-0.05670166015625,
0.0105438232421875,
0.03045654296875,
0.041259765625,
0.028411865234375,
-0.031829833984375,
-0.037322998046875,
-0.07537841796875,
0.0016164779663085938,
-0.01306915283203125,
-0.015899658203125,
0.0164642333984375,
0.0465087890625,
-0.01444244384765625,
0.05743408203125,
-0.0162811279296875,
-0.03460693359375,
-0.0214385986328125,
0.0200347900390625,
0.041778564453125,
0.043365478515625,
0.047454833984375,
-0.034698486328125,
-0.0265045166015625,
-0.01251983642578125,
-0.057861328125,
0.012176513671875,
-0.0101165771484375,
-0.0170440673828125,
0.05230712890625,
0.038299560546875,
-0.039520263671875,
-0.0052337646484375,
0.036285400390625,
-0.0208587646484375,
0.0307159423828125,
-0.01314544677734375,
-0.0136260986328125,
-0.09771728515625,
0.0252227783203125,
0.00580596923828125,
-0.0036334991455078125,
-0.040008544921875,
0.0124053955078125,
-0.004772186279296875,
-0.0075225830078125,
-0.0474853515625,
0.0296630859375,
-0.02362060546875,
0.0041961669921875,
0.0090484619140625,
-0.00689697265625,
-0.0054931640625,
0.056915283203125,
-0.000011146068572998047,
0.07208251953125,
0.031585693359375,
-0.045166015625,
-0.00189971923828125,
0.025634765625,
-0.04974365234375,
0.0231781005859375,
-0.05755615234375,
0.01122283935546875,
0.00284576416015625,
0.01541900634765625,
-0.054046630859375,
-0.009552001953125,
0.0056915283203125,
-0.042510986328125,
0.036041259765625,
-0.0005617141723632812,
-0.065673828125,
-0.023956298828125,
-0.028472900390625,
-0.0023136138916015625,
0.0584716796875,
-0.0333251953125,
0.028839111328125,
0.007465362548828125,
-0.00899505615234375,
-0.039825439453125,
-0.06549072265625,
0.01367950439453125,
0.0026569366455078125,
-0.054656982421875,
0.0257110595703125,
0.00458526611328125,
0.002361297607421875,
0.01081085205078125,
0.00395965576171875,
-0.013671875,
0.00811004638671875,
0.00555419921875,
0.019073486328125,
-0.0227508544921875,
0.0367431640625,
-0.002758026123046875,
0.003452301025390625,
-0.005519866943359375,
-0.026214599609375,
0.05963134765625,
-0.035614013671875,
-0.0037326812744140625,
-0.0345458984375,
0.01468658447265625,
0.043243408203125,
0.00051116943359375,
0.06890869140625,
0.05999755859375,
-0.025543212890625,
-0.004695892333984375,
-0.0537109375,
-0.0220489501953125,
-0.035308837890625,
0.0309600830078125,
-0.0212860107421875,
-0.06396484375,
0.040985107421875,
0.0185089111328125,
0.017974853515625,
0.059814453125,
0.03662109375,
-0.035186767578125,
0.05413818359375,
0.049530029296875,
-0.030364990234375,
0.0462646484375,
-0.047515869140625,
0.0029506683349609375,
-0.0560302734375,
-0.01433563232421875,
-0.0250244140625,
-0.028961181640625,
-0.046875,
-0.0217742919921875,
0.021453857421875,
0.0063018798828125,
-0.046356201171875,
0.03570556640625,
-0.03326416015625,
0.00865936279296875,
0.0609130859375,
0.01849365234375,
-0.00131988525390625,
0.01361846923828125,
-0.0222930908203125,
-0.01003265380859375,
-0.0677490234375,
-0.035186767578125,
0.08404541015625,
0.033660888671875,
0.02593994140625,
-0.01366424560546875,
0.0645751953125,
0.02911376953125,
0.00579071044921875,
-0.052642822265625,
0.0390625,
-0.0290374755859375,
-0.0555419921875,
-0.027740478515625,
-0.02044677734375,
-0.0728759765625,
0.00446319580078125,
-0.01195526123046875,
-0.043609619140625,
0.0177001953125,
0.005878448486328125,
-0.02484130859375,
0.0273895263671875,
-0.0633544921875,
0.07012939453125,
-0.0244293212890625,
-0.01031494140625,
-0.01149749755859375,
-0.061676025390625,
0.00513458251953125,
-0.00887298583984375,
0.0015840530395507812,
-0.006855010986328125,
0.01055145263671875,
0.06732177734375,
-0.0552978515625,
0.07208251953125,
-0.0258636474609375,
-0.0007710456848144531,
0.0089263916015625,
-0.00713348388671875,
0.036834716796875,
-0.0208740234375,
-0.0174407958984375,
0.044464111328125,
-0.01001739501953125,
-0.0400390625,
-0.0175628662109375,
0.047393798828125,
-0.0760498046875,
-0.01824951171875,
-0.038604736328125,
-0.0269927978515625,
-0.0184173583984375,
0.025787353515625,
0.037994384765625,
0.0238494873046875,
-0.02191162109375,
0.0275115966796875,
0.05963134765625,
-0.0287017822265625,
0.045318603515625,
0.035247802734375,
0.014984130859375,
-0.0266265869140625,
0.06524658203125,
0.01474761962890625,
0.00714874267578125,
0.027130126953125,
-0.0035247802734375,
-0.01410675048828125,
-0.0347900390625,
-0.036834716796875,
0.021148681640625,
-0.050689697265625,
-0.0075225830078125,
-0.05096435546875,
-0.039886474609375,
-0.036041259765625,
-0.00905609130859375,
-0.03167724609375,
-0.023956298828125,
-0.027862548828125,
-0.0077972412109375,
0.030029296875,
0.0278778076171875,
-0.00965118408203125,
0.0266265869140625,
-0.04608154296875,
-0.006153106689453125,
0.0163116455078125,
0.03350830078125,
-0.00376129150390625,
-0.045135498046875,
-0.0206298828125,
0.01148223876953125,
-0.01458740234375,
-0.041900634765625,
0.0176544189453125,
0.0212249755859375,
0.032470703125,
0.020599365234375,
-0.00787353515625,
0.03741455078125,
-0.0377197265625,
0.07659912109375,
0.00862884521484375,
-0.067138671875,
0.04583740234375,
-0.031707763671875,
0.037689208984375,
0.07037353515625,
0.028961181640625,
-0.05059814453125,
-0.026153564453125,
-0.059295654296875,
-0.08795166015625,
0.053924560546875,
0.01678466796875,
0.0261688232421875,
0.00791168212890625,
0.01486968994140625,
0.014312744140625,
0.0216217041015625,
-0.05596923828125,
-0.008392333984375,
-0.0159912109375,
-0.0078277587890625,
-0.01076507568359375,
-0.0208740234375,
-0.0097808837890625,
-0.0308074951171875,
0.0814208984375,
0.0013856887817382812,
0.045745849609375,
0.022979736328125,
-0.0242462158203125,
0.023101806640625,
0.01446533203125,
0.0440673828125,
0.045074462890625,
-0.0248870849609375,
-0.0235595703125,
0.0196990966796875,
-0.0255584716796875,
-0.007305145263671875,
0.035003662109375,
-0.0279541015625,
0.0209503173828125,
0.049346923828125,
0.07867431640625,
0.0085296630859375,
-0.03961181640625,
0.05120849609375,
0.0046539306640625,
-0.034942626953125,
-0.038482666015625,
-0.00290679931640625,
0.0033397674560546875,
0.021881103515625,
0.03302001953125,
-0.01201629638671875,
-0.006328582763671875,
-0.0235595703125,
0.01529693603515625,
0.018035888671875,
-0.04022216796875,
-0.0160369873046875,
0.03387451171875,
0.0210418701171875,
-0.007843017578125,
0.07598876953125,
-0.0235137939453125,
-0.05023193359375,
0.047515869140625,
0.01551055908203125,
0.0738525390625,
0.02239990234375,
0.0242919921875,
0.0281829833984375,
0.033172607421875,
0.0038242340087890625,
0.0302581787109375,
0.00432586669921875,
-0.057373046875,
-0.038970947265625,
-0.054290771484375,
-0.0174102783203125,
0.044281005859375,
-0.037811279296875,
0.01407623291015625,
-0.0418701171875,
-0.01551055908203125,
-0.0004773139953613281,
0.01042938232421875,
-0.073486328125,
0.0250396728515625,
0.019073486328125,
0.099853515625,
-0.061920166015625,
0.0648193359375,
0.053955078125,
-0.047027587890625,
-0.062164306640625,
0.00519561767578125,
-0.0229644775390625,
-0.07110595703125,
0.049041748046875,
0.0241546630859375,
0.021636962890625,
-0.0046234130859375,
-0.04412841796875,
-0.05340576171875,
0.0797119140625,
0.01806640625,
-0.048004150390625,
-0.00995635986328125,
0.0006442070007324219,
0.061920166015625,
-0.004436492919921875,
-0.0016765594482421875,
0.037445068359375,
0.0330810546875,
-0.00238800048828125,
-0.05462646484375,
0.0038623809814453125,
-0.032501220703125,
0.0084686279296875,
0.0173492431640625,
-0.0555419921875,
0.053924560546875,
-0.004138946533203125,
-0.008331298828125,
0.0036067962646484375,
0.04376220703125,
0.0081024169921875,
0.001007080078125,
0.04052734375,
0.07598876953125,
0.05682373046875,
-0.0281219482421875,
0.0888671875,
-0.035552978515625,
0.0236663818359375,
0.076904296875,
-0.00555419921875,
0.06427001953125,
0.032073974609375,
-0.01537322998046875,
0.054290771484375,
0.0640869140625,
-0.028900146484375,
0.04766845703125,
0.0055694580078125,
-0.0079193115234375,
-0.017425537109375,
-0.0011034011840820312,
-0.0455322265625,
0.0196380615234375,
0.01031494140625,
-0.036651611328125,
-0.01934814453125,
0.0008568763732910156,
0.02410888671875,
-0.0239105224609375,
0.0014209747314453125,
0.053131103515625,
-0.0005621910095214844,
-0.052154541015625,
0.04754638671875,
-0.004375457763671875,
0.061187744140625,
-0.060821533203125,
0.0233306884765625,
-0.0270538330078125,
0.017974853515625,
0.0094451904296875,
-0.054290771484375,
0.00901031494140625,
-0.005706787109375,
-0.015228271484375,
-0.030487060546875,
0.044219970703125,
-0.0185394287109375,
-0.0457763671875,
0.027069091796875,
0.039337158203125,
0.0172119140625,
-0.00432586669921875,
-0.0728759765625,
-0.000850677490234375,
-0.006496429443359375,
-0.0285186767578125,
0.020416259765625,
0.031402587890625,
0.00583648681640625,
0.04052734375,
0.054046630859375,
0.01078033447265625,
-0.0007309913635253906,
0.01326751708984375,
0.07366943359375,
-0.04217529296875,
-0.0277252197265625,
-0.053680419921875,
0.045257568359375,
-0.019561767578125,
-0.047637939453125,
0.040679931640625,
0.04754638671875,
0.09100341796875,
-0.009246826171875,
0.06591796875,
-0.0255584716796875,
0.03118896484375,
-0.03753662109375,
0.0645751953125,
-0.053985595703125,
-0.001796722412109375,
-0.01397705078125,
-0.05621337890625,
-0.0137939453125,
0.036651611328125,
-0.0186614990234375,
0.01220703125,
0.053680419921875,
0.0494384765625,
0.0020771026611328125,
-0.021209716796875,
0.007671356201171875,
0.03228759765625,
0.020751953125,
0.047637939453125,
0.041778564453125,
-0.047698974609375,
0.046356201171875,
-0.0322265625,
-0.005126953125,
-0.0256500244140625,
-0.06689453125,
-0.06304931640625,
-0.050384521484375,
-0.0238189697265625,
-0.041351318359375,
0.02850341796875,
0.067138671875,
0.05853271484375,
-0.08013916015625,
-0.00959014892578125,
-0.015716552734375,
-0.0089569091796875,
-0.021942138671875,
-0.02020263671875,
0.0369873046875,
-0.050018310546875,
-0.047149658203125,
0.014984130859375,
-0.0018701553344726562,
0.007770538330078125,
-0.00833892822265625,
-0.0017633438110351562,
-0.0306549072265625,
-0.008056640625,
0.043121337890625,
0.0247039794921875,
-0.044647216796875,
-0.0199127197265625,
0.0122222900390625,
-0.028106689453125,
0.007808685302734375,
0.043365478515625,
-0.05126953125,
0.025115966796875,
0.041107177734375,
0.062286376953125,
0.050811767578125,
-0.02362060546875,
0.048736572265625,
-0.056671142578125,
0.010772705078125,
0.0244598388671875,
0.029296875,
0.03363037109375,
-0.0172119140625,
0.04815673828125,
0.0231781005859375,
-0.03436279296875,
-0.056365966796875,
-0.006145477294921875,
-0.0718994140625,
-0.03228759765625,
0.08740234375,
-0.012847900390625,
-0.007537841796875,
0.021148681640625,
-0.0059051513671875,
0.0279541015625,
-0.038116455078125,
0.061309814453125,
0.0777587890625,
0.01275634765625,
0.0110626220703125,
-0.03863525390625,
0.033172607421875,
0.051666259765625,
-0.045928955078125,
-0.02325439453125,
0.03717041015625,
0.032928466796875,
0.0117950439453125,
0.03802490234375,
-0.0016450881958007812,
0.0212249755859375,
-0.0191497802734375,
0.0274810791015625,
-0.0085906982421875,
-0.01226043701171875,
-0.037811279296875,
0.006591796875,
-0.00441741943359375,
-0.031097412109375
]
] |
EK12317/Ekmix-Diffusion | 2023-03-20T15:28:39.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | EK12317 | null | null | EK12317/Ekmix-Diffusion | 57 | 96,870 | diffusers | 2022-12-18T06:45:54 | ---
language:
- en
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
---
## Example:
"Negative prompt: (worst quality, low quality:1.4)" is useful almost anywhere.
All of these models work well with a correctly configured Hires.fix.
## Ekmix-Pastel
Pastel shading with clean lines (with Hires.fix). Created by merging the LoRAs into the model:
~~~
python networks\merge_lora.py --sd_model .\models\model.safetensors --save_to .\lora\2.safetensors --models .\lora\MagicLORA.pt .\lora\Jordan_3.safetensors .\lora\sttabi_v1.4-04.safetensors .\lora\xlimo768.pt .\lora\dpep2.pt --ratios 0.3 1 0.5 0.6 0.35
~~~

~~~
masterpiece,best quality,best quality,Amazing,beautiful detailed eyes,1girl,finely detail,Depth offield,extremely detailed CG unity 8k wallpaper,masterpiece,upper body,(vtuber minato aqua),pink hair,blue streaked hair, palace,holy,white long split mop dress ,mature female,standing,medium_breasts,silver-tiara,smile,black high heels,very long hair, body towards aside,jewelry,hair blue flower,grey eyes,close-up,
Negative prompt: (worst quality, low quality:1.3)
Steps: 30, Sampler: Euler a, CFG scale: 6, Seed: 191289851, Size: 512x768, Model hash: 0526445f65, Denoising strength: 0.5, Eta: 0.5, Clip skip: 2, ENSD: 31337, Hires resize: 856x1280, Hires steps: 30, Hires upscaler: Latent
~~~
Pastel shading with clean lines, without Hires.fix (even better!)

~~~
{masterpiece},{best quality},{1girl,{{loli},black hair,blue eyes,very long hair,hair flower,hanfu,happy}},Amazing,beautiful detailed eyes,finely detail,Depth of field,extremely detailed CG,original,outdoors,beautiful detailed hand,beautiful detailed fingers,{{soaked},{wet through}},{body under water},standing,{beautiful detailed water,beautiful detailed sky,fluttered detailed splashs}
Negative prompt: (worst quality, low quality:1.3)
Steps: 30, Sampler: DPM++ 2M Karras, CFG scale: 6, Seed: 2035526620, Size: 768x512, Model hash: ca485b96f8, Eta: 0.5, Clip skip: 2, ENSD: 31337
~~~
## Ekmix-gen4
A balance between anime and realistic styles, created with a block-weighted merge.

~~~
masterpiece,best quality,best quality,Amazing,beautiful detailed eyes,1girl,finely detail,Depth offield,extremely detailed CG unity 8k wallpaper,masterpiece,upper body,(vtuber minato aqua),pink hair,blue streaked hair, palace,holy,white long split mop dress ,mature female,standing,medium_breasts,silver-tiara,smile,black high heels,very long hair, body towards aside,jewelry,hair blue flower,grey eyes,close-up,
~~~

~~~
{masterpiece},{best quality},{1girl,{{loli},black hair,blue eyes,very long hair,hair flower,hanfu,happy}},Amazing,beautiful detailed eyes,finely detail,Depth of field,extremely detailed CG,original,outdoors,beautiful detailed hand,beautiful detailed fingers,{{soaked},{wet through}},{body under water},standing,{beautiful detailed water,beautiful detailed sky,fluttered detailed splashs},by Paul Hedley,
~~~
# Great hypernetworks
Styles 1 and 2 are my favourites.
Styles 3 and 4 may need retraining.

| 4,033 | [
[
-0.0528564453125,
-0.06805419921875,
0.044281005859375,
0.0244903564453125,
-0.027557373046875,
-0.0171661376953125,
0.0079345703125,
-0.044097900390625,
0.0487060546875,
0.0404052734375,
-0.044281005859375,
-0.07220458984375,
-0.04315185546875,
0.00029754638671875,
-0.0254364013671875,
0.043121337890625,
0.0021686553955078125,
0.0027484893798828125,
0.00726318359375,
-0.0011663436889648438,
-0.0352783203125,
-0.0137481689453125,
-0.061492919921875,
-0.0243988037109375,
0.03057861328125,
0.053466796875,
0.060821533203125,
0.058746337890625,
0.00794219970703125,
0.0281524658203125,
-0.02935791015625,
0.0016231536865234375,
-0.041351318359375,
-0.0017805099487304688,
0.002361297607421875,
-0.04937744140625,
-0.06524658203125,
0.01271820068359375,
0.033203125,
0.01678466796875,
-0.00983428955078125,
0.01363372802734375,
0.019927978515625,
0.0501708984375,
-0.0233917236328125,
-0.0089111328125,
-0.00238037109375,
0.01331329345703125,
-0.0197906494140625,
0.0144805908203125,
-0.00662994384765625,
-0.0138702392578125,
-0.0123748779296875,
-0.06488037109375,
0.0220489501953125,
0.0022411346435546875,
0.10833740234375,
-0.00569915771484375,
-0.015869140625,
0.01427459716796875,
-0.03717041015625,
0.057464599609375,
-0.0556640625,
0.0145263671875,
0.0280609130859375,
0.0180816650390625,
-0.0035610198974609375,
-0.05950927734375,
-0.046783447265625,
-0.0012989044189453125,
-0.0033473968505859375,
0.040679931640625,
-0.02642822265625,
-0.0135345458984375,
0.032562255859375,
0.04107666015625,
-0.037322998046875,
0.01087188720703125,
-0.020904541015625,
-0.01678466796875,
0.0445556640625,
0.019195556640625,
0.043212890625,
0.005031585693359375,
-0.0256195068359375,
-0.01546478271484375,
-0.045135498046875,
0.005275726318359375,
0.0401611328125,
-0.01479339599609375,
-0.036590576171875,
0.03912353515625,
-0.0201873779296875,
0.03839111328125,
0.036102294921875,
0.00034356117248535156,
0.0155792236328125,
-0.01313018798828125,
-0.00908660888671875,
-0.018341064453125,
0.060455322265625,
0.04608154296875,
-0.00743865966796875,
0.0249481201171875,
0.0117645263671875,
0.01172637939453125,
0.006862640380859375,
-0.07476806640625,
-0.0230560302734375,
0.042388916015625,
-0.04718017578125,
-0.0302276611328125,
-0.013885498046875,
-0.06683349609375,
0.0012149810791015625,
-0.0273284912109375,
0.025054931640625,
-0.02813720703125,
-0.02618408203125,
0.004009246826171875,
-0.018157958984375,
0.0186614990234375,
0.0347900390625,
-0.020660400390625,
-0.009368896484375,
0.0248260498046875,
0.060272216796875,
0.003173828125,
-0.00904083251953125,
0.0080413818359375,
-0.010284423828125,
-0.041778564453125,
0.058868408203125,
-0.01236724853515625,
-0.027252197265625,
-0.0305328369140625,
0.03192138671875,
0.015289306640625,
-0.04571533203125,
0.0596923828125,
-0.0086517333984375,
0.0015716552734375,
-0.031982421875,
-0.02838134765625,
-0.023651123046875,
0.0018320083618164062,
-0.0396728515625,
0.04010009765625,
0.018157958984375,
-0.06243896484375,
0.0113525390625,
-0.04949951171875,
0.011932373046875,
0.005035400390625,
0.0216064453125,
-0.0458984375,
-0.03216552734375,
-0.004261016845703125,
0.035552978515625,
0.00548553466796875,
-0.023834228515625,
-0.058624267578125,
-0.0198822021484375,
0.022003173828125,
0.003910064697265625,
0.0791015625,
0.0333251953125,
-0.032623291015625,
-0.01568603515625,
-0.06451416015625,
0.00946807861328125,
0.0290679931640625,
-0.0110626220703125,
-0.0129852294921875,
-0.028076171875,
0.0093841552734375,
0.039306640625,
0.024627685546875,
-0.0300445556640625,
-0.0004949569702148438,
-0.0159912109375,
0.0260162353515625,
0.05670166015625,
0.01476287841796875,
0.025726318359375,
-0.041534423828125,
0.0455322265625,
0.02947998046875,
0.02813720703125,
-0.01142120361328125,
-0.0413818359375,
-0.09295654296875,
-0.040618896484375,
0.0026607513427734375,
0.0270843505859375,
-0.053955078125,
0.0323486328125,
0.034271240234375,
-0.07171630859375,
-0.045166015625,
0.015289306640625,
0.039398193359375,
0.035125732421875,
0.019775390625,
-0.0293426513671875,
-0.0262298583984375,
-0.08636474609375,
0.004604339599609375,
0.014068603515625,
0.0002206563949584961,
0.0377197265625,
0.0230712890625,
-0.037994384765625,
0.04644775390625,
-0.04608154296875,
-0.0176849365234375,
-0.00467681884765625,
0.00836181640625,
0.051849365234375,
0.0615234375,
0.056304931640625,
-0.06585693359375,
-0.060333251953125,
-0.0080718994140625,
-0.055938720703125,
-0.0153961181640625,
0.0259552001953125,
-0.05255126953125,
0.005840301513671875,
0.00009799003601074219,
-0.056060791015625,
0.0307159423828125,
0.040985107421875,
-0.056884765625,
0.06585693359375,
-0.0284881591796875,
0.04339599609375,
-0.09649658203125,
0.0237579345703125,
0.0267333984375,
-0.0283203125,
-0.052734375,
0.0498046875,
-0.01377105712890625,
-0.0001633167266845703,
-0.0531005859375,
0.049072265625,
-0.04547119140625,
0.0283203125,
-0.0244598388671875,
0.0089111328125,
0.01202392578125,
0.047454833984375,
0.00524139404296875,
0.0211639404296875,
0.052459716796875,
-0.050506591796875,
0.0445556640625,
0.043670654296875,
-0.01374053955078125,
0.07537841796875,
-0.0645751953125,
0.023590087890625,
-0.0092926025390625,
0.00281524658203125,
-0.07501220703125,
-0.0248260498046875,
0.04620361328125,
-0.041168212890625,
0.0200958251953125,
-0.02166748046875,
-0.024566650390625,
-0.034210205078125,
-0.031036376953125,
0.017364501953125,
0.061859130859375,
-0.034271240234375,
0.026885986328125,
0.0081024169921875,
0.00008511543273925781,
-0.03546142578125,
-0.06634521484375,
0.0017728805541992188,
-0.028350830078125,
-0.058380126953125,
0.0193634033203125,
-0.01299285888671875,
-0.0218048095703125,
-0.0017547607421875,
0.0108795166015625,
-0.0229644775390625,
-0.01214599609375,
0.032501220703125,
0.041351318359375,
-0.0257568359375,
-0.050445556640625,
0.0016927719116210938,
-0.00858306884765625,
-0.013885498046875,
-0.0013561248779296875,
0.03314208984375,
-0.0004906654357910156,
-0.0239410400390625,
-0.0672607421875,
0.027740478515625,
0.06036376953125,
0.0135650634765625,
0.04156494140625,
0.07061767578125,
-0.01776123046875,
0.00882720947265625,
-0.0421142578125,
-0.014617919921875,
-0.03564453125,
0.0078277587890625,
-0.00571441650390625,
-0.050689697265625,
0.062225341796875,
0.028289794921875,
-0.0054168701171875,
0.056121826171875,
0.02947998046875,
-0.0248260498046875,
0.0950927734375,
0.038726806640625,
-0.007633209228515625,
0.03717041015625,
-0.07470703125,
-0.003429412841796875,
-0.06298828125,
-0.0145263671875,
-0.01715087890625,
-0.041351318359375,
-0.044158935546875,
-0.03045654296875,
0.030517578125,
0.031219482421875,
-0.0261993408203125,
0.052276611328125,
-0.037017822265625,
0.01172637939453125,
0.034393310546875,
0.050872802734375,
0.0080413818359375,
-0.01226806640625,
-0.00028395652770996094,
-0.01094818115234375,
-0.044708251953125,
-0.017486572265625,
0.052154541015625,
0.02166748046875,
0.04180908203125,
0.006832122802734375,
0.0433349609375,
0.0225830078125,
0.00859832763671875,
-0.033782958984375,
0.0445556640625,
-0.012542724609375,
-0.052093505859375,
-0.008697509765625,
-0.0284881591796875,
-0.07318115234375,
0.0153961181640625,
-0.040252685546875,
-0.05438232421875,
0.054962158203125,
0.03955078125,
-0.0290679931640625,
0.0166015625,
-0.050201416015625,
0.05950927734375,
0.01580810546875,
-0.057098388671875,
-0.020538330078125,
-0.0389404296875,
0.036712646484375,
0.014068603515625,
0.00673675537109375,
-0.01032257080078125,
0.0029468536376953125,
0.034393310546875,
-0.052886962890625,
0.0458984375,
-0.006359100341796875,
-0.005313873291015625,
0.04949951171875,
0.021636962890625,
0.029754638671875,
0.00765228271484375,
0.028961181640625,
0.026458740234375,
-0.0011892318725585938,
-0.044097900390625,
-0.0163421630859375,
0.067138671875,
-0.05389404296875,
-0.03460693359375,
-0.0081024169921875,
-0.0031585693359375,
0.0167388916015625,
0.017425537109375,
0.0694580078125,
0.04058837890625,
0.003692626953125,
0.004058837890625,
0.05145263671875,
-0.0213623046875,
0.0291900634765625,
0.017791748046875,
-0.01354217529296875,
-0.04205322265625,
0.07196044921875,
0.01534271240234375,
0.038543701171875,
-0.0079193115234375,
0.0102386474609375,
-0.00513458251953125,
-0.017547607421875,
-0.02642822265625,
0.035858154296875,
-0.051055908203125,
-0.0201568603515625,
-0.0489501953125,
-0.023223876953125,
-0.0306396484375,
-0.0171051025390625,
-0.033294677734375,
-0.0145111083984375,
-0.06341552734375,
0.01555633544921875,
0.0309906005859375,
0.0291748046875,
-0.01422119140625,
0.0131378173828125,
-0.053192138671875,
0.03338623046875,
0.0279693603515625,
0.0263671875,
0.007663726806640625,
-0.03564453125,
-0.0130157470703125,
0.0201263427734375,
-0.046600341796875,
-0.0767822265625,
0.050567626953125,
-0.007137298583984375,
0.03326416015625,
0.055908203125,
-0.0230255126953125,
0.058258056640625,
-0.0039825439453125,
0.06756591796875,
0.0204315185546875,
-0.043304443359375,
0.0245208740234375,
-0.061767578125,
0.0189971923828125,
0.0284423828125,
0.024993896484375,
-0.04559326171875,
-0.038604736328125,
-0.07305908203125,
-0.06561279296875,
0.050872802734375,
0.0352783203125,
-0.01276397705078125,
0.01324462890625,
0.03558349609375,
-0.017486572265625,
0.0108795166015625,
-0.051513671875,
-0.053314208984375,
-0.0206451416015625,
0.006549835205078125,
-0.008819580078125,
-0.0228424072265625,
0.00551605224609375,
-0.028656005859375,
0.07659912109375,
0.0019130706787109375,
0.038665771484375,
0.035491943359375,
0.0168304443359375,
-0.0162353515625,
0.0007948875427246094,
0.025665283203125,
0.032867431640625,
-0.00516510009765625,
-0.020263671875,
0.00998687744140625,
-0.036773681640625,
0.0201873779296875,
-0.01303863525390625,
-0.0200958251953125,
0.007328033447265625,
0.01184844970703125,
0.032745361328125,
-0.0033435821533203125,
-0.0219573974609375,
0.041259765625,
0.005672454833984375,
-0.00270843505859375,
-0.0207061767578125,
0.0256195068359375,
0.017364501953125,
0.036712646484375,
0.00044798851013183594,
0.0194549560546875,
0.039520263671875,
-0.03997802734375,
0.002544403076171875,
0.0234375,
-0.028076171875,
-0.0285491943359375,
0.059661865234375,
-0.007282257080078125,
-0.0169219970703125,
0.032928466796875,
-0.0291290283203125,
-0.043792724609375,
0.07275390625,
0.039581298828125,
0.06231689453125,
-0.030426025390625,
0.0244293212890625,
0.066162109375,
-0.0155792236328125,
-0.030364990234375,
0.0264434814453125,
0.01398468017578125,
-0.042572021484375,
-0.0137481689453125,
-0.04400634765625,
-0.01271820068359375,
0.0211334228515625,
-0.0313720703125,
0.05718994140625,
-0.042816162109375,
-0.023834228515625,
-0.0201873779296875,
-0.0033473968505859375,
-0.03570556640625,
0.01849365234375,
0.01198577880859375,
0.07098388671875,
-0.0762939453125,
0.034149169921875,
0.04345703125,
-0.0306854248046875,
-0.06610107421875,
-0.0235748291015625,
0.0107421875,
-0.037872314453125,
0.0198516845703125,
-0.002532958984375,
0.003582000732421875,
-0.00012028217315673828,
-0.08563232421875,
-0.0657958984375,
0.0928955078125,
0.030792236328125,
-0.035491943359375,
0.0016231536865234375,
-0.020538330078125,
0.046722412109375,
-0.0206756591796875,
0.0271453857421875,
0.0267791748046875,
0.043365478515625,
0.0255279541015625,
-0.043548583984375,
0.003208160400390625,
-0.040771484375,
0.0103302001953125,
-0.004772186279296875,
-0.08660888671875,
0.08172607421875,
-0.00908660888671875,
-0.0301666259765625,
0.044464111328125,
0.045135498046875,
0.0294342041015625,
0.02764892578125,
0.037506103515625,
0.07666015625,
0.00824737548828125,
-0.01093292236328125,
0.06451416015625,
-0.0218505859375,
0.031402587890625,
0.04351806640625,
0.029754638671875,
0.03350830078125,
0.0010013580322265625,
-0.0231781005859375,
0.04052734375,
0.06451416015625,
-0.0140380859375,
0.0169219970703125,
-0.00127410888671875,
-0.005924224853515625,
-0.023345947265625,
-0.0087127685546875,
-0.0278472900390625,
0.01666259765625,
0.006099700927734375,
-0.0242767333984375,
-0.0032558441162109375,
-0.00653839111328125,
0.01476287841796875,
-0.0005435943603515625,
-0.02545166015625,
0.040618896484375,
0.0258941650390625,
-0.026275634765625,
0.053466796875,
-0.003253936767578125,
0.0634765625,
-0.036590576171875,
-0.037506103515625,
-0.0295867919921875,
0.00685882568359375,
-0.037139892578125,
-0.0577392578125,
-0.0005106925964355469,
-0.0196685791015625,
0.006511688232421875,
-0.00787353515625,
0.051116943359375,
-0.038818359375,
-0.0318603515625,
0.0219573974609375,
0.00701904296875,
0.0361328125,
-0.005615234375,
-0.077392578125,
-0.01038360595703125,
-0.0030460357666015625,
-0.0092315673828125,
-0.00160980224609375,
0.0232696533203125,
0.02813720703125,
0.046722412109375,
0.03997802734375,
0.028350830078125,
-0.0118865966796875,
-0.021759033203125,
0.043701171875,
-0.039520263671875,
-0.04656982421875,
-0.06439208984375,
0.0494384765625,
-0.01418304443359375,
-0.0268402099609375,
0.07025146484375,
0.03411865234375,
0.041717529296875,
-0.0207366943359375,
0.03131103515625,
-0.0069122314453125,
0.023162841796875,
-0.030120849609375,
0.0723876953125,
-0.0535888671875,
-0.015960693359375,
-0.05157470703125,
-0.0821533203125,
0.00469207763671875,
0.041534423828125,
0.0193634033203125,
0.005950927734375,
0.023834228515625,
0.060699462890625,
-0.0156402587890625,
-0.031158447265625,
0.026885986328125,
0.0443115234375,
0.003070831298828125,
0.04595947265625,
0.0693359375,
-0.034332275390625,
0.00688934326171875,
-0.048126220703125,
-0.02227783203125,
-0.038604736328125,
-0.04986572265625,
-0.0540771484375,
-0.050750732421875,
-0.039947509765625,
-0.057830810546875,
-0.014129638671875,
0.061859130859375,
0.07073974609375,
-0.0478515625,
-0.01126861572265625,
0.0042724609375,
0.00458526611328125,
-0.0236358642578125,
-0.0234222412109375,
-0.0038013458251953125,
0.0287628173828125,
-0.07196044921875,
-0.00030732154846191406,
0.0143585205078125,
0.04608154296875,
-0.0177459716796875,
-0.00957489013671875,
-0.005268096923828125,
-0.00817108154296875,
0.025634765625,
0.02947998046875,
-0.0305328369140625,
-0.00447845458984375,
-0.0031452178955078125,
-0.0029754638671875,
0.016754150390625,
0.023223876953125,
-0.0128173828125,
0.0126953125,
0.04248046875,
-0.00897216796875,
0.0404052734375,
0.0012578964233398438,
0.019744873046875,
-0.0137176513671875,
0.007110595703125,
0.00867462158203125,
0.0479736328125,
0.0185089111328125,
-0.054168701171875,
0.0290679931640625,
0.044464111328125,
-0.05145263671875,
-0.042572021484375,
0.011322021484375,
-0.09307861328125,
-0.018035888671875,
0.0985107421875,
-0.01555633544921875,
-0.03778076171875,
0.01371002197265625,
-0.047088623046875,
0.018646240234375,
-0.023223876953125,
0.0426025390625,
0.048980712890625,
-0.00553131103515625,
-0.0098876953125,
-0.033203125,
0.0186614990234375,
0.012664794921875,
-0.042755126953125,
-0.0134429931640625,
0.057098388671875,
0.040863037109375,
0.0338134765625,
0.038970947265625,
-0.033599853515625,
0.02947998046875,
0.0002779960632324219,
0.02618408203125,
0.000024139881134033203,
-0.01342010498046875,
-0.03375244140625,
-0.00399017333984375,
0.005863189697265625,
-0.01708984375
]
] |
Helsinki-NLP/opus-mt-en-ru | 2023-08-16T11:30:58.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"marian",
"text2text-generation",
"translation",
"en",
"ru",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-ru | 31 | 96,463 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-en-ru
* source languages: en
* target languages: ru
* OPUS readme: [en-ru](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/en-ru/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-02-11.zip](https://object.pouta.csc.fi/OPUS-MT-models/en-ru/opus-2020-02-11.zip)
* test set translations: [opus-2020-02-11.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-ru/opus-2020-02-11.test.txt)
* test set scores: [opus-2020-02-11.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-ru/opus-2020-02-11.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newstest2012.en.ru | 31.1 | 0.581 |
| newstest2013.en.ru | 23.5 | 0.513 |
| newstest2015-enru.en.ru | 27.5 | 0.564 |
| newstest2016-enru.en.ru | 26.4 | 0.548 |
| newstest2017-enru.en.ru | 29.1 | 0.572 |
| newstest2018-enru.en.ru | 25.4 | 0.554 |
| newstest2019-enru.en.ru | 27.1 | 0.533 |
| Tatoeba.en.ru | 48.4 | 0.669 |
| 1,123 | [
[
-0.0243072509765625,
-0.02288818359375,
0.0229339599609375,
0.025634765625,
-0.0243072509765625,
-0.0273284912109375,
-0.02020263671875,
-0.0080413818359375,
0.0079193115234375,
0.023773193359375,
-0.060577392578125,
-0.041046142578125,
-0.044769287109375,
0.0160369873046875,
-0.00853729248046875,
0.059051513671875,
-0.00855255126953125,
0.037322998046875,
0.009735107421875,
-0.036773681640625,
-0.03204345703125,
-0.0207977294921875,
-0.0258941650390625,
-0.0285797119140625,
0.02545166015625,
0.040863037109375,
0.0251922607421875,
0.032073974609375,
0.0667724609375,
0.0190582275390625,
-0.006134033203125,
-0.0033321380615234375,
-0.0277557373046875,
-0.01012420654296875,
0.013916015625,
-0.047760009765625,
-0.058441162109375,
-0.009124755859375,
0.07666015625,
0.0299530029296875,
-0.0021038055419921875,
0.03912353515625,
-0.0019588470458984375,
0.07012939453125,
-0.0195770263671875,
-0.00206756591796875,
-0.039886474609375,
0.00598907470703125,
-0.02618408203125,
-0.0272216796875,
-0.043426513671875,
-0.018310546875,
0.0055694580078125,
-0.0467529296875,
0.0025272369384765625,
0.01334381103515625,
0.11444091796875,
0.0202484130859375,
-0.0183563232421875,
-0.0029354095458984375,
-0.039764404296875,
0.08074951171875,
-0.0543212890625,
0.035125732421875,
0.030029296875,
0.0186309814453125,
0.00925445556640625,
-0.04046630859375,
-0.0207977294921875,
0.0204315185546875,
-0.0200653076171875,
0.0209808349609375,
-0.0185546875,
-0.0290374755859375,
0.0206298828125,
0.052581787109375,
-0.0640869140625,
-0.0011005401611328125,
-0.038604736328125,
0.0037326812744140625,
0.0465087890625,
0.0201416015625,
0.0175323486328125,
-0.01324462890625,
-0.0308074951171875,
-0.037506103515625,
-0.056671142578125,
0.010467529296875,
0.031524658203125,
0.02044677734375,
-0.036712646484375,
0.046722412109375,
-0.020904541015625,
0.04827880859375,
-0.00304412841796875,
-0.007541656494140625,
0.07427978515625,
-0.0263824462890625,
-0.025726318359375,
-0.0198211669921875,
0.08831787109375,
0.0254364013671875,
-0.0017642974853515625,
0.00960540771484375,
-0.01349639892578125,
-0.009124755859375,
0.0018720626831054688,
-0.07000732421875,
-0.004543304443359375,
0.0172271728515625,
-0.03131103515625,
-0.017486572265625,
0.00464630126953125,
-0.059661865234375,
0.01611328125,
-0.0261077880859375,
0.04736328125,
-0.038665771484375,
-0.021240234375,
0.022430419921875,
0.0018720626831054688,
0.029296875,
0.0006098747253417969,
-0.040252685546875,
0.0206146240234375,
0.0263671875,
0.0516357421875,
-0.0173797607421875,
-0.0196075439453125,
-0.034637451171875,
-0.0184478759765625,
-0.00743865966796875,
0.054779052734375,
-0.01493072509765625,
-0.0235443115234375,
-0.01018524169921875,
0.036468505859375,
-0.023406982421875,
-0.0208587646484375,
0.08953857421875,
-0.017333984375,
0.05413818359375,
-0.036285400390625,
-0.03045654296875,
-0.0195159912109375,
0.03289794921875,
-0.03936767578125,
0.09326171875,
0.0083465576171875,
-0.06439208984375,
0.024444580078125,
-0.05755615234375,
-0.010986328125,
-0.0028553009033203125,
0.003452301025390625,
-0.0606689453125,
-0.006587982177734375,
0.011077880859375,
0.0280914306640625,
-0.026397705078125,
0.0180511474609375,
-0.0009350776672363281,
-0.0198822021484375,
0.006252288818359375,
-0.0213165283203125,
0.0748291015625,
0.0229034423828125,
-0.022247314453125,
0.022979736328125,
-0.075439453125,
0.004673004150390625,
0.0078277587890625,
-0.032073974609375,
-0.01000213623046875,
0.004230499267578125,
0.0183868408203125,
0.002532958984375,
0.01739501953125,
-0.043975830078125,
0.0166778564453125,
-0.040374755859375,
0.02105712890625,
0.050506591796875,
-0.0175018310546875,
0.0341796875,
-0.038970947265625,
0.0318603515625,
0.01023101806640625,
0.013702392578125,
0.004840850830078125,
-0.034210205078125,
-0.070556640625,
-0.016815185546875,
0.0416259765625,
0.07696533203125,
-0.03802490234375,
0.06658935546875,
-0.0517578125,
-0.061920166015625,
-0.04937744140625,
-0.0146636962890625,
0.02691650390625,
0.0301361083984375,
0.0360107421875,
-0.00562286376953125,
-0.03741455078125,
-0.08233642578125,
-0.01055908203125,
-0.007053375244140625,
-0.006832122802734375,
0.01473236083984375,
0.056671142578125,
-0.005878448486328125,
0.04150390625,
-0.045806884765625,
-0.03466796875,
-0.0156402587890625,
0.0145416259765625,
0.038604736328125,
0.05023193359375,
0.041046142578125,
-0.0595703125,
-0.047088623046875,
0.0007810592651367188,
-0.045135498046875,
-0.00951385498046875,
0.00518035888671875,
-0.0264129638671875,
0.007656097412109375,
0.0010366439819335938,
-0.0280609130859375,
0.01441192626953125,
0.046905517578125,
-0.053192138671875,
0.041595458984375,
-0.00426483154296875,
0.019775390625,
-0.1036376953125,
0.01459503173828125,
-0.0106201171875,
-0.0129852294921875,
-0.031494140625,
-0.0018053054809570312,
0.0181427001953125,
0.005889892578125,
-0.042572021484375,
0.036956787109375,
-0.029754638671875,
-0.00934600830078125,
0.0299835205078125,
-0.0008373260498046875,
0.00839996337890625,
0.04736328125,
-0.00812530517578125,
0.0611572265625,
0.058837890625,
-0.03350830078125,
0.01251983642578125,
0.038116455078125,
-0.043182373046875,
0.0302734375,
-0.052978515625,
-0.0226898193359375,
0.0155487060546875,
-0.007099151611328125,
-0.057403564453125,
-0.0026798248291015625,
0.0259246826171875,
-0.051177978515625,
0.0268707275390625,
-0.0136260986328125,
-0.04705810546875,
-0.007434844970703125,
-0.0183868408203125,
0.026031494140625,
0.05059814453125,
-0.00862884521484375,
0.041961669921875,
0.01251983642578125,
-0.004119873046875,
-0.039093017578125,
-0.07391357421875,
-0.01251983642578125,
-0.0379638671875,
-0.0556640625,
0.0166473388671875,
-0.0257415771484375,
-0.003154754638671875,
0.00662994384765625,
0.0206146240234375,
-0.007720947265625,
-0.001590728759765625,
0.0038909912109375,
0.0199432373046875,
-0.0295562744140625,
0.0006427764892578125,
-0.01309967041015625,
-0.0175323486328125,
-0.010650634765625,
-0.0039005279541015625,
0.0491943359375,
-0.033966064453125,
-0.0175018310546875,
-0.0413818359375,
0.00009149312973022461,
0.041229248046875,
-0.0352783203125,
0.059326171875,
0.040863037109375,
-0.01065826416015625,
0.01018524169921875,
-0.028167724609375,
0.0052947998046875,
-0.031768798828125,
0.012054443359375,
-0.040191650390625,
-0.057403564453125,
0.051971435546875,
0.013153076171875,
0.03094482421875,
0.06060791015625,
0.041717529296875,
0.007411956787109375,
0.05706787109375,
0.0186614990234375,
0.00595855712890625,
0.033721923828125,
-0.040924072265625,
-0.01216888427734375,
-0.07427978515625,
-0.0023956298828125,
-0.0550537109375,
-0.03302001953125,
-0.06280517578125,
-0.01763916015625,
0.023895263671875,
-0.0023193359375,
-0.02349853515625,
0.051239013671875,
-0.045135498046875,
0.0213165283203125,
0.044158935546875,
-0.0034694671630859375,
0.0207061767578125,
0.00533294677734375,
-0.031463623046875,
-0.02471923828125,
-0.032745361328125,
-0.027099609375,
0.092041015625,
0.025299072265625,
0.028076171875,
0.023193359375,
0.042633056640625,
-0.0037403106689453125,
0.01380157470703125,
-0.04547119140625,
0.038055419921875,
-0.009918212890625,
-0.06036376953125,
-0.0290374755859375,
-0.043426513671875,
-0.0633544921875,
0.037322998046875,
-0.0201263427734375,
-0.04083251953125,
0.01494598388671875,
-0.0029048919677734375,
-0.0143890380859375,
0.0307769775390625,
-0.04473876953125,
0.0845947265625,
-0.00142669677734375,
-0.0163726806640625,
0.0133514404296875,
-0.0321044921875,
0.0198516845703125,
0.006977081298828125,
0.024993896484375,
-0.02093505859375,
0.0099029541015625,
0.051849365234375,
-0.0157012939453125,
0.02801513671875,
-0.0023403167724609375,
0.004055023193359375,
0.00457000732421875,
0.006450653076171875,
0.03564453125,
-0.0089874267578125,
-0.021514892578125,
0.0168304443359375,
0.0016765594482421875,
-0.033538818359375,
-0.011199951171875,
0.0458984375,
-0.0550537109375,
-0.00891876220703125,
-0.048431396484375,
-0.044158935546875,
0.00492095947265625,
0.0306549072265625,
0.04888916015625,
0.05584716796875,
-0.0200958251953125,
0.04376220703125,
0.055572509765625,
-0.0198516845703125,
0.0269622802734375,
0.055419921875,
-0.01552581787109375,
-0.045806884765625,
0.066162109375,
0.0111846923828125,
0.023651123046875,
0.043701171875,
0.014739990234375,
-0.01175689697265625,
-0.04730224609375,
-0.050537109375,
0.0160675048828125,
-0.0254364013671875,
-0.023468017578125,
-0.043426513671875,
-0.0024204254150390625,
-0.0233154296875,
0.01552581787109375,
-0.033782958984375,
-0.04052734375,
-0.0184326171875,
-0.013824462890625,
0.0293426513671875,
0.021148681640625,
-0.01045989990234375,
0.0288238525390625,
-0.0653076171875,
0.00897216796875,
-0.00592041015625,
0.027618408203125,
-0.031768798828125,
-0.06390380859375,
-0.02899169921875,
-0.0037403106689453125,
-0.047210693359375,
-0.052093505859375,
0.044677734375,
0.01050567626953125,
0.021026611328125,
0.026763916015625,
0.00811004638671875,
0.043304443359375,
-0.05047607421875,
0.0673828125,
0.005092620849609375,
-0.046478271484375,
0.041595458984375,
-0.039520263671875,
0.034515380859375,
0.06341552734375,
0.0161590576171875,
-0.0236663818359375,
-0.036346435546875,
-0.056610107421875,
-0.0631103515625,
0.06451416015625,
0.049163818359375,
-0.011444091796875,
0.021484375,
-0.01549530029296875,
-0.00473785400390625,
-0.0004935264587402344,
-0.08575439453125,
-0.037872314453125,
0.009002685546875,
-0.0270233154296875,
-0.00021588802337646484,
-0.0205841064453125,
-0.0214996337890625,
-0.0244140625,
0.07940673828125,
0.01407623291015625,
0.0185546875,
0.0285186767578125,
-0.001003265380859375,
-0.0193023681640625,
0.033660888671875,
0.07025146484375,
0.0472412109375,
-0.036163330078125,
-0.01335906982421875,
0.027374267578125,
-0.03173828125,
-0.008087158203125,
0.007617950439453125,
-0.03363037109375,
0.0169830322265625,
0.0269927978515625,
0.0748291015625,
0.0183868408203125,
-0.03680419921875,
0.0377197265625,
-0.022735595703125,
-0.033416748046875,
-0.0609130859375,
-0.013092041015625,
0.007801055908203125,
0.003204345703125,
0.016204833984375,
0.0185089111328125,
0.01488494873046875,
-0.0181427001953125,
0.01396942138671875,
0.016387939453125,
-0.042205810546875,
-0.037353515625,
0.039520263671875,
0.005523681640625,
-0.0156097412109375,
0.0272979736328125,
-0.02484130859375,
-0.041412353515625,
0.0341796875,
0.00907135009765625,
0.07421875,
-0.0165252685546875,
-0.0147247314453125,
0.06683349609375,
0.044464111328125,
-0.0143585205078125,
0.04229736328125,
0.016754150390625,
-0.048828125,
-0.035369873046875,
-0.063232421875,
-0.00626373291015625,
0.0195159912109375,
-0.0706787109375,
0.03656005859375,
0.0202484130859375,
-0.00850677490234375,
-0.0218963623046875,
0.02032470703125,
-0.04449462890625,
0.01239776611328125,
-0.0272979736328125,
0.07391357421875,
-0.0765380859375,
0.059295654296875,
0.042083740234375,
-0.027099609375,
-0.058074951171875,
-0.0234222412109375,
-0.01554107666015625,
-0.034454345703125,
0.041595458984375,
0.012176513671875,
0.02178955078125,
-0.00464630126953125,
-0.0212249755859375,
-0.07421875,
0.0872802734375,
0.0028133392333984375,
-0.040008544921875,
0.01038360595703125,
0.0096435546875,
0.0301513671875,
-0.022003173828125,
0.01230621337890625,
0.034210205078125,
0.06036376953125,
0.0047149658203125,
-0.07501220703125,
-0.01332855224609375,
-0.04632568359375,
-0.03228759765625,
0.044647216796875,
-0.05364990234375,
0.07379150390625,
0.026214599609375,
-0.01123046875,
0.01153564453125,
0.0440673828125,
0.0259857177734375,
0.01409149169921875,
0.039215087890625,
0.08905029296875,
0.0308380126953125,
-0.04595947265625,
0.064697265625,
-0.0204620361328125,
0.039093017578125,
0.08233642578125,
0.0017957687377929688,
0.06756591796875,
0.0222015380859375,
-0.0225067138671875,
0.046722412109375,
0.04833984375,
-0.026214599609375,
0.0396728515625,
-0.0009489059448242188,
0.00677490234375,
-0.021331787109375,
0.018585205078125,
-0.053253173828125,
0.017303466796875,
0.01496124267578125,
-0.0135345458984375,
-0.0018243789672851562,
-0.01312255859375,
0.01364898681640625,
-0.004276275634765625,
-0.005977630615234375,
0.039794921875,
-0.004871368408203125,
-0.0423583984375,
0.04931640625,
-0.0037403106689453125,
0.04449462890625,
-0.054290771484375,
0.00591278076171875,
-0.004817962646484375,
0.027618408203125,
-0.00739288330078125,
-0.050933837890625,
0.033477783203125,
0.0056915283203125,
-0.0238037109375,
-0.0330810546875,
0.02099609375,
-0.037506103515625,
-0.0704345703125,
0.0226593017578125,
0.0263214111328125,
0.0254974365234375,
0.0098876953125,
-0.060577392578125,
0.001407623291015625,
0.00908660888671875,
-0.051544189453125,
0.01023101806640625,
0.05352783203125,
0.026458740234375,
0.03900146484375,
0.04931640625,
0.0178070068359375,
0.01416015625,
-0.0017452239990234375,
0.053009033203125,
-0.03387451171875,
-0.04156494140625,
-0.058013916015625,
0.06634521484375,
-0.01239013671875,
-0.055572509765625,
0.053436279296875,
0.07611083984375,
0.0648193359375,
-0.00632476806640625,
0.0269317626953125,
-0.01507568359375,
0.058868408203125,
-0.044921875,
0.0491943359375,
-0.077880859375,
0.01009368896484375,
-0.007801055908203125,
-0.0634765625,
-0.0207061767578125,
0.0299530029296875,
-0.0207061767578125,
-0.0231781005859375,
0.053863525390625,
0.0506591796875,
-0.007511138916015625,
-0.01107025146484375,
0.018280029296875,
0.02276611328125,
0.01153564453125,
0.04559326171875,
0.03228759765625,
-0.0762939453125,
0.046051025390625,
-0.024932861328125,
-0.0119171142578125,
-0.00521087646484375,
-0.054595947265625,
-0.060089111328125,
-0.045379638671875,
-0.0118408203125,
-0.020538330078125,
-0.02886962890625,
0.05853271484375,
0.040496826171875,
-0.07080078125,
-0.03826904296875,
-0.00283050537109375,
0.0034694671630859375,
-0.0125579833984375,
-0.02099609375,
0.043365478515625,
-0.0186614990234375,
-0.07305908203125,
0.033935546875,
-0.0024356842041015625,
-0.0033321380615234375,
-0.004772186279296875,
-0.0217132568359375,
-0.035888671875,
-0.01264190673828125,
0.020050048828125,
0.001407623291015625,
-0.03973388671875,
0.00408172607421875,
0.0120697021484375,
-0.004180908203125,
0.0308074951171875,
0.0132904052734375,
-0.01552581787109375,
0.010467529296875,
0.07183837890625,
0.00739288330078125,
0.03643798828125,
-0.0036563873291015625,
0.03277587890625,
-0.05157470703125,
0.027435302734375,
0.00994110107421875,
0.04400634765625,
0.025238037109375,
-0.002010345458984375,
0.059967041015625,
0.0211944580078125,
-0.0496826171875,
-0.08404541015625,
0.005016326904296875,
-0.08453369140625,
0.0029506683349609375,
0.07568359375,
-0.01396942138671875,
-0.02203369140625,
0.02410888671875,
-0.0017147064208984375,
0.0038127899169921875,
-0.0275421142578125,
0.0263824462890625,
0.07110595703125,
0.0202484130859375,
0.00470733642578125,
-0.057281494140625,
0.027923583984375,
0.0306396484375,
-0.0570068359375,
-0.0126800537109375,
0.017242431640625,
0.0209808349609375,
0.03070068359375,
0.037811279296875,
-0.02630615234375,
0.002277374267578125,
-0.00891876220703125,
0.032196044921875,
-0.0014181137084960938,
-0.018829345703125,
-0.0179290771484375,
-0.0070953369140625,
-0.005496978759765625,
-0.0238037109375
]
] |
dbmdz/distilbert-base-turkish-cased | 2021-01-24T01:01:22.000Z | [
"transformers",
"pytorch",
"tf",
"distilbert",
"tr",
"arxiv:1910.01108",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | null | dbmdz | null | null | dbmdz/distilbert-base-turkish-cased | 8 | 96,230 | transformers | 2022-03-02T23:29:05 | ---
language: tr
license: mit
---
# 🤗 + 📚 dbmdz Distilled Turkish BERT model
In this repository, the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources a cased, distilled model for Turkish 🎉
# 🇹🇷 DistilBERTurk
DistilBERTurk is a community-driven cased distilled BERT model for Turkish.
DistilBERTurk was trained on 7GB of the original training data that was used
for training [BERTurk](https://github.com/stefan-it/turkish-bert/tree/master#stats),
using the cased version of BERTurk as teacher model.
*DistilBERTurk* was trained with the official Hugging Face implementation from
[here](https://github.com/huggingface/transformers/tree/master/examples/distillation)
for 5 days on 4 RTX 2080 TI.
More details about distillation can be found in the
["DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter"](https://arxiv.org/abs/1910.01108)
paper by Sanh et al. (2019).
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue in the [BERTurk](https://github.com/stefan-it/turkish-bert) repository!
| Model | Downloads
| --------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/distilbert-base-turkish-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/vocab.txt)
## Usage
With Transformers >= 2.3, our DistilBERTurk model can be loaded as follows:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/distilbert-base-turkish-cased")
model = AutoModel.from_pretrained("dbmdz/distilbert-base-turkish-cased")
```
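The snippet above only loads the model. As an illustration (not part of the original card; assumes `transformers` and `torch` are installed and the model can be downloaded), a forward pass returns contextual embeddings with the standard DistilBERT hidden size of 768:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dbmdz/distilbert-base-turkish-cased")
model = AutoModel.from_pretrained("dbmdz/distilbert-base-turkish-cased")

# Encode a Turkish sentence and run it through the model
inputs = tokenizer("Merhaba dünya!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```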
## Results
For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert).
For PoS tagging, DistilBERTurk outperforms the 24-layer XLM-RoBERTa model.
The overall performance difference between DistilBERTurk and the original
(teacher) BERTurk model is ~1.18%.
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us with
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
| 3,142 | [
[
-0.04754638671875,
-0.06365966796875,
0.01335906982421875,
0.0280609130859375,
-0.0282745361328125,
-0.01525115966796875,
-0.0177154541015625,
-0.0291595458984375,
0.0088653564453125,
0.01255035400390625,
-0.049072265625,
-0.04388427734375,
-0.058502197265625,
-0.00034332275390625,
-0.02081298828125,
0.09466552734375,
-0.013397216796875,
0.01519012451171875,
-0.0015583038330078125,
-0.01009368896484375,
-0.0010786056518554688,
-0.04913330078125,
-0.035064697265625,
-0.039154052734375,
0.0292205810546875,
-0.006793975830078125,
0.030731201171875,
0.0124969482421875,
0.037994384765625,
0.0313720703125,
-0.0275726318359375,
-0.007465362548828125,
-0.01102447509765625,
0.00033855438232421875,
0.015777587890625,
-0.0097503662109375,
-0.04437255859375,
0.0009546279907226562,
0.04058837890625,
0.04010009765625,
-0.03448486328125,
0.018798828125,
0.0103759765625,
0.07061767578125,
-0.014373779296875,
0.0197906494140625,
-0.02667236328125,
-0.00254058837890625,
-0.00896453857421875,
0.0310821533203125,
-0.015777587890625,
-0.013397216796875,
0.0411376953125,
-0.0185699462890625,
0.034912109375,
-0.021270751953125,
0.088623046875,
0.024200439453125,
-0.0203857421875,
-0.0164031982421875,
-0.034210205078125,
0.055938720703125,
-0.067626953125,
0.028900146484375,
0.0222625732421875,
0.0380859375,
-0.03436279296875,
-0.06201171875,
-0.049072265625,
-0.009918212890625,
-0.005252838134765625,
0.00543212890625,
-0.030670166015625,
0.0028705596923828125,
0.02630615234375,
0.0419921875,
-0.031341552734375,
-0.02398681640625,
-0.03546142578125,
-0.004322052001953125,
0.04827880859375,
-0.004241943359375,
0.00373077392578125,
-0.020843505859375,
-0.0335693359375,
-0.033050537109375,
-0.025054931640625,
0.0171356201171875,
0.035064697265625,
0.029083251953125,
-0.027801513671875,
0.03582763671875,
-0.01515960693359375,
0.057647705078125,
0.027587890625,
-0.006359100341796875,
0.028778076171875,
-0.01096343994140625,
-0.027252197265625,
0.0085296630859375,
0.059783935546875,
0.00269317626953125,
-0.0003943443298339844,
0.0036983489990234375,
-0.0181884765625,
-0.0269775390625,
0.0187530517578125,
-0.08123779296875,
-0.033111572265625,
0.0286407470703125,
-0.0477294921875,
-0.0246429443359375,
0.0004432201385498047,
-0.039154052734375,
-0.0110321044921875,
-0.0214385986328125,
0.0419921875,
-0.04205322265625,
-0.04205322265625,
0.013824462890625,
-0.0036106109619140625,
0.0288848876953125,
0.025543212890625,
-0.0712890625,
0.0247955322265625,
0.041290283203125,
0.0653076171875,
0.0125579833984375,
-0.00815582275390625,
0.000896453857421875,
-0.032623291015625,
-0.0100555419921875,
0.039093017578125,
0.0037441253662109375,
-0.0153961181640625,
-0.0008034706115722656,
0.018402099609375,
-0.00756072998046875,
-0.036346435546875,
0.04266357421875,
-0.0279388427734375,
0.030120849609375,
-0.037200927734375,
-0.040740966796875,
-0.032562255859375,
0.009033203125,
-0.050506591796875,
0.1005859375,
0.03424072265625,
-0.07275390625,
0.033355712890625,
-0.037811279296875,
-0.028472900390625,
-0.00913238525390625,
0.007564544677734375,
-0.07037353515625,
0.0117950439453125,
0.01555633544921875,
0.05029296875,
-0.0048065185546875,
0.01441192626953125,
-0.03448486328125,
-0.01690673828125,
0.010162353515625,
0.0038700103759765625,
0.0947265625,
0.034515380859375,
-0.040679931640625,
-0.0063018798828125,
-0.0438232421875,
-0.0168609619140625,
0.023895263671875,
-0.03594970703125,
-0.00716400146484375,
-0.01366424560546875,
0.0217742919921875,
0.0242156982421875,
0.019317626953125,
-0.047332763671875,
0.026580810546875,
-0.016204833984375,
0.036468505859375,
0.05157470703125,
-0.03009033203125,
0.00846099853515625,
-0.036712646484375,
0.0161895751953125,
0.0172119140625,
0.01454925537109375,
0.010040283203125,
-0.0390625,
-0.06939697265625,
-0.052337646484375,
0.036956787109375,
0.0193023681640625,
-0.05499267578125,
0.04010009765625,
0.0014944076538085938,
-0.05120849609375,
-0.0457763671875,
0.00333404541015625,
0.01393890380859375,
0.05029296875,
0.0226593017578125,
-0.01446533203125,
-0.05169677734375,
-0.0697021484375,
0.006103515625,
-0.0226593017578125,
-0.01058197021484375,
0.021881103515625,
0.0458984375,
-0.0037860870361328125,
0.05987548828125,
-0.01235198974609375,
-0.03369140625,
-0.025054931640625,
0.0130462646484375,
0.049560546875,
0.038055419921875,
0.0740966796875,
-0.044097900390625,
-0.052703857421875,
-0.02001953125,
-0.05084228515625,
0.01090240478515625,
0.0135650634765625,
-0.0114898681640625,
0.053955078125,
0.01427459716796875,
-0.058624267578125,
0.0298309326171875,
0.045318603515625,
-0.03887939453125,
0.04840087890625,
-0.0205230712890625,
0.010009765625,
-0.092529296875,
0.0215301513671875,
0.0171966552734375,
-0.021209716796875,
-0.035736083984375,
-0.00014960765838623047,
-0.0078277587890625,
0.01172637939453125,
-0.0406494140625,
0.032012939453125,
-0.0264434814453125,
0.0046844482421875,
-0.00867462158203125,
-0.028472900390625,
-0.0000934600830078125,
0.044464111328125,
0.0119476318359375,
0.044158935546875,
0.04888916015625,
-0.0323486328125,
0.0380859375,
0.033599853515625,
-0.045684814453125,
0.033966064453125,
-0.06597900390625,
0.00333404541015625,
-0.0022869110107421875,
0.027252197265625,
-0.06365966796875,
-0.00051116943359375,
0.02587890625,
-0.04034423828125,
0.048828125,
-0.044464111328125,
-0.05169677734375,
-0.038543701171875,
-0.01493072509765625,
0.002532958984375,
0.06256103515625,
-0.0579833984375,
0.0537109375,
0.01800537109375,
-0.01544952392578125,
-0.04840087890625,
-0.05291748046875,
-0.0098114013671875,
-0.036773681640625,
-0.0606689453125,
0.0362548828125,
-0.0078582763671875,
-0.00896453857421875,
0.0038318634033203125,
-0.0088653564453125,
-0.01474761962890625,
0.001987457275390625,
0.01494598388671875,
0.03790283203125,
-0.011444091796875,
-0.00928497314453125,
0.00021827220916748047,
0.002445220947265625,
0.005474090576171875,
-0.014984130859375,
0.037261962890625,
-0.038238525390625,
-0.0012664794921875,
-0.044921875,
0.01203155517578125,
0.03680419921875,
0.006366729736328125,
0.0849609375,
0.06610107421875,
-0.02947998046875,
0.006336212158203125,
-0.056884765625,
-0.02166748046875,
-0.0372314453125,
0.0164031982421875,
-0.036468505859375,
-0.05999755859375,
0.0526123046875,
0.0084381103515625,
0.0213775634765625,
0.047271728515625,
0.06005859375,
-0.033721923828125,
0.07342529296875,
0.060516357421875,
-0.01531982421875,
0.048980712890625,
-0.035308837890625,
0.0005068778991699219,
-0.054962158203125,
-0.0216522216796875,
-0.03961181640625,
-0.021453857421875,
-0.05389404296875,
-0.01267242431640625,
0.0203857421875,
0.015625,
-0.0193634033203125,
0.035980224609375,
-0.051055908203125,
-0.0024166107177734375,
0.045318603515625,
0.0184173583984375,
-0.0036983489990234375,
0.03240966796875,
-0.0264892578125,
0.004146575927734375,
-0.053497314453125,
-0.02972412109375,
0.087646484375,
0.0401611328125,
0.0413818359375,
0.01097869873046875,
0.0648193359375,
0.01074981689453125,
0.0202484130859375,
-0.0345458984375,
0.01837158203125,
-0.0088653564453125,
-0.0703125,
-0.00849151611328125,
-0.03326416015625,
-0.060394287109375,
0.015899658203125,
-0.015655517578125,
-0.05584716796875,
0.01110076904296875,
0.0084228515625,
-0.03265380859375,
0.03399658203125,
-0.052734375,
0.0648193359375,
-0.001689910888671875,
-0.027801513671875,
-0.01088714599609375,
-0.046844482421875,
0.0146636962890625,
0.0099945068359375,
-0.00959014892578125,
-0.00986480712890625,
0.034912109375,
0.0601806640625,
-0.057647705078125,
0.04302978515625,
-0.042877197265625,
0.00047397613525390625,
0.040802001953125,
-0.00728607177734375,
0.03179931640625,
-0.007724761962890625,
-0.0127410888671875,
0.046478271484375,
0.030853271484375,
-0.044158935546875,
-0.0250701904296875,
0.05206298828125,
-0.0750732421875,
-0.032073974609375,
-0.0570068359375,
-0.0243682861328125,
0.0087127685546875,
0.005992889404296875,
0.00860595703125,
0.014984130859375,
-0.0104522705078125,
0.0194091796875,
0.055633544921875,
-0.027130126953125,
0.037384033203125,
0.04327392578125,
-0.010009765625,
-0.020233154296875,
0.046142578125,
-0.002185821533203125,
-0.00870513916015625,
-0.0016193389892578125,
0.005733489990234375,
-0.033935546875,
-0.035858154296875,
-0.04364013671875,
0.0301513671875,
-0.0303192138671875,
-0.009490966796875,
-0.054779052734375,
-0.0267181396484375,
-0.0450439453125,
0.0178680419921875,
-0.04205322265625,
-0.037078857421875,
-0.0121612548828125,
-0.00614166259765625,
0.05615234375,
0.04339599609375,
-0.022186279296875,
0.013916015625,
-0.043182373046875,
0.00774383544921875,
0.0150146484375,
0.036102294921875,
-0.0026493072509765625,
-0.053466796875,
-0.021728515625,
0.01076507568359375,
-0.0246734619140625,
-0.0478515625,
0.035430908203125,
0.0074310302734375,
0.04010009765625,
0.025848388671875,
0.009368896484375,
0.045379638671875,
-0.0322265625,
0.04022216796875,
0.00909423828125,
-0.047454833984375,
0.029022216796875,
-0.02490234375,
0.00748443603515625,
0.04083251953125,
0.038482666015625,
-0.02935791015625,
-0.00589752197265625,
-0.052581787109375,
-0.06585693359375,
0.07012939453125,
0.035858154296875,
0.00836944580078125,
0.0160369873046875,
0.033935546875,
0.00988006591796875,
0.01885986328125,
-0.043853759765625,
-0.03173828125,
-0.0369873046875,
-0.0200653076171875,
0.001873016357421875,
-0.028472900390625,
-0.013092041015625,
-0.049560546875,
0.07366943359375,
0.0160980224609375,
0.039581298828125,
0.0307159423828125,
-0.00600433349609375,
-0.0046844482421875,
-0.0095062255859375,
0.038360595703125,
0.02911376953125,
-0.053131103515625,
-0.01354217529296875,
0.010833740234375,
-0.0430908203125,
-0.0100555419921875,
0.049224853515625,
-0.0004856586456298828,
0.0189056396484375,
0.006259918212890625,
0.059234619140625,
-0.0137481689453125,
-0.0288543701171875,
0.02874755859375,
-0.033203125,
-0.037567138671875,
-0.044281005859375,
-0.01493072509765625,
0.0166015625,
0.038238525390625,
0.039093017578125,
-0.01910400390625,
0.005802154541015625,
-0.02044677734375,
0.0236968994140625,
0.033355712890625,
-0.0307159423828125,
-0.0232696533203125,
0.038299560546875,
0.007061004638671875,
0.0077362060546875,
0.07403564453125,
0.0006012916564941406,
-0.04168701171875,
0.050750732421875,
0.0097198486328125,
0.0625,
-0.01506805419921875,
0.01427459716796875,
0.0494384765625,
0.0178985595703125,
0.00140380859375,
0.01666259765625,
-0.0187835693359375,
-0.044189453125,
-0.0191802978515625,
-0.075927734375,
-0.0078125,
0.027008056640625,
-0.05792236328125,
0.0244598388671875,
-0.03582763671875,
-0.032318115234375,
0.006999969482421875,
0.041229248046875,
-0.0577392578125,
0.00482177734375,
0.01605224609375,
0.0711669921875,
-0.06396484375,
0.075927734375,
0.061065673828125,
-0.03350830078125,
-0.05535888671875,
-0.036529541015625,
-0.0037784576416015625,
-0.044189453125,
0.0418701171875,
0.01241302490234375,
0.0255889892578125,
-0.00788116455078125,
-0.035736083984375,
-0.054840087890625,
0.08465576171875,
0.0203857421875,
-0.032196044921875,
0.007411956787109375,
0.0089569091796875,
0.042755126953125,
-0.00965118408203125,
0.031402587890625,
0.048095703125,
0.02392578125,
0.02880859375,
-0.06903076171875,
0.0064849853515625,
-0.03570556640625,
-0.005954742431640625,
0.0036869049072265625,
-0.0540771484375,
0.07275390625,
-0.01198577880859375,
-0.00281524658203125,
0.014068603515625,
0.0537109375,
0.0308380126953125,
0.00849151611328125,
0.037567138671875,
0.06671142578125,
0.0295257568359375,
-0.02001953125,
0.07763671875,
-0.025146484375,
0.04718017578125,
0.05841064453125,
0.0048065185546875,
0.041015625,
0.0419921875,
-0.039459228515625,
0.05438232421875,
0.0777587890625,
-0.0162353515625,
0.0430908203125,
0.00528717041015625,
-0.0257568359375,
-0.0175933837890625,
0.00806427001953125,
-0.04229736328125,
0.027099609375,
0.00510406494140625,
-0.026031494140625,
-0.0265960693359375,
-0.01142120361328125,
0.0167694091796875,
-0.029693603515625,
0.0072784423828125,
0.05560302734375,
0.01092529296875,
-0.029022216796875,
0.061065673828125,
0.010467529296875,
0.05126953125,
-0.04541015625,
-0.0035572052001953125,
-0.02130126953125,
0.02142333984375,
-0.004810333251953125,
-0.0303955078125,
0.02532958984375,
-0.0002791881561279297,
-0.013397216796875,
-0.0226287841796875,
0.046966552734375,
-0.0299530029296875,
-0.055206298828125,
0.014923095703125,
0.02294921875,
0.0265655517578125,
-0.01132965087890625,
-0.08966064453125,
0.00489044189453125,
-0.0079193115234375,
-0.051177978515625,
0.0400390625,
0.0278167724609375,
0.0167083740234375,
0.052154541015625,
0.051971435546875,
-0.00759124755859375,
0.0025482177734375,
-0.00323486328125,
0.0791015625,
-0.0276947021484375,
-0.02069091796875,
-0.05755615234375,
0.050689697265625,
0.0009698867797851562,
-0.010589599609375,
0.051361083984375,
0.03851318359375,
0.0675048828125,
-0.00225830078125,
0.048126220703125,
-0.0318603515625,
0.029052734375,
-0.016265869140625,
0.0850830078125,
-0.05108642578125,
-0.0123291015625,
-0.035125732421875,
-0.06549072265625,
0.00359344482421875,
0.07550048828125,
-0.011016845703125,
0.0244598388671875,
0.02667236328125,
0.04461669921875,
0.00173187255859375,
-0.01233673095703125,
0.0027675628662109375,
0.0284881591796875,
0.01421356201171875,
0.035125732421875,
0.040985107421875,
-0.051483154296875,
0.0293731689453125,
-0.055084228515625,
-0.0253753662109375,
-0.0236663818359375,
-0.06768798828125,
-0.0877685546875,
-0.0618896484375,
-0.034454345703125,
-0.045623779296875,
-0.0021038055419921875,
0.0697021484375,
0.06549072265625,
-0.0687255859375,
-0.02099609375,
-0.00482177734375,
0.0012693405151367188,
-0.0234222412109375,
-0.017181396484375,
0.04638671875,
-0.015380859375,
-0.067626953125,
-0.0050048828125,
-0.0163726806640625,
0.0294036865234375,
-0.01503753662109375,
-0.01132965087890625,
-0.0237884521484375,
-0.005870819091796875,
0.03375244140625,
0.020050048828125,
-0.040557861328125,
-0.004199981689453125,
-0.00232696533203125,
-0.00867462158203125,
0.0010080337524414062,
0.03460693359375,
-0.049835205078125,
0.031280517578125,
0.03173828125,
0.02099609375,
0.0697021484375,
-0.0133056640625,
0.029876708984375,
-0.036285400390625,
0.034576416015625,
0.001255035400390625,
0.041259765625,
0.029327392578125,
-0.016265869140625,
0.0279388427734375,
0.00974273681640625,
-0.035186767578125,
-0.06280517578125,
-0.01425933837890625,
-0.0802001953125,
-0.0224151611328125,
0.06365966796875,
-0.03594970703125,
-0.035186767578125,
0.01302337646484375,
-0.0098419189453125,
0.04364013671875,
-0.030120849609375,
0.08892822265625,
0.06890869140625,
-0.003910064697265625,
-0.0160369873046875,
-0.03582763671875,
0.05072021484375,
0.042327880859375,
-0.0253753662109375,
-0.006893157958984375,
0.0247039794921875,
0.046051025390625,
-0.0094757080078125,
0.028656005859375,
-0.022064208984375,
0.006961822509765625,
-0.0084075927734375,
0.0343017578125,
-0.026458740234375,
-0.008026123046875,
-0.0247344970703125,
-0.022674560546875,
-0.0135345458984375,
-0.01236724853515625
]
] |
google/long-t5-tglobal-base | 2023-01-24T17:08:42.000Z | [
"transformers",
"pytorch",
"jax",
"longt5",
"text2text-generation",
"en",
"arxiv:2112.07916",
"arxiv:1912.08777",
"arxiv:1910.10683",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | text2text-generation | google | null | null | google/long-t5-tglobal-base | 28 | 95,976 | transformers | 2022-04-16T11:05:48 | ---
license: apache-2.0
language: en
---
# LongT5 (transient-global attention, base-sized model)
LongT5 model pre-trained on English language. The model was introduced in the paper [LongT5: Efficient Text-To-Text Transformer for Long Sequences](https://arxiv.org/pdf/2112.07916.pdf) by Guo et al. and first released in [the LongT5 repository](https://github.com/google-research/longt5). The model architecture and configuration can be found in the [Flaxformer repository](https://github.com/google/flaxformer), which builds on another Google research project, [T5x](https://github.com/google-research/t5x).
Disclaimer: The team releasing LongT5 did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
The LongT5 model is an encoder-decoder transformer pre-trained in a text-to-text denoising generative setting ([Pegasus-like generation pre-training](https://arxiv.org/pdf/1912.08777.pdf)). LongT5 extends the [T5 model](https://arxiv.org/pdf/1910.10683.pdf) with one of two efficient attention mechanisms: (1) local attention, or (2) transient-global attention. These attention sparsity patterns allow the model to handle long input sequences efficiently.
LongT5 is particularly effective when fine-tuned for text generation tasks (summarization, question answering) that require handling long input sequences (up to 16,384 tokens).
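As a rough back-of-the-envelope illustration of why sparse attention helps at these lengths (the window and block sizes below are assumptions for the sketch, not the model's exact configuration), compare the number of attended token pairs under dense versus transient-global attention:

```python
def full_attention_pairs(n):
    # Dense self-attention: every token attends to every token, O(n^2).
    return n * n


def tglobal_attention_pairs(n, window=127, block=16):
    # Transient-global attention (sketch): each token attends to a local
    # window plus one aggregated "global" token per input block, giving
    # roughly O(n * (window + n / block)) pairs.
    global_tokens = -(-n // block)  # ceil(n / block)
    return n * (window + global_tokens)


n = 16384  # the maximum input length mentioned above
print(full_attention_pairs(n))     # 268435456
print(tglobal_attention_pairs(n))  # 18857984 -- more than 10x fewer pairs
```

At 16,384 tokens the sparse pattern attends to over an order of magnitude fewer pairs, which is what makes fine-tuning on long inputs tractable.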
## Intended uses & limitations
The model is mostly meant to be fine-tuned on a supervised dataset. See the [model hub](https://huggingface.co/models?search=longt5) to look for fine-tuned versions on a task that interests you.
### How to use
```python
from transformers import AutoTokenizer, LongT5Model

tokenizer = AutoTokenizer.from_pretrained("google/long-t5-tglobal-base")
model = LongT5Model.from_pretrained("google/long-t5-tglobal-base")

# Encode a short example; LongT5 accepts inputs of up to 16,384 tokens.
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```
### BibTeX entry and citation info
```bibtex
@article{guo2021longt5,
title={LongT5: Efficient Text-To-Text Transformer for Long Sequences},
author={Guo, Mandy and Ainslie, Joshua and Uthus, David and Ontanon, Santiago and Ni, Jianmo and Sung, Yun-Hsuan and Yang, Yinfei},
journal={arXiv preprint arXiv:2112.07916},
year={2021}
}
``` | 2,385 | [
[
-0.0340576171875,
-0.04705810546875,
0.034149169921875,
0.0299224853515625,
-0.0219268798828125,
-0.00887298583984375,
-0.0234222412109375,
-0.050933837890625,
0.005390167236328125,
0.018890380859375,
-0.042022705078125,
-0.035980224609375,
-0.049072265625,
0.0322265625,
-0.036773681640625,
0.08929443359375,
-0.01430511474609375,
-0.0254669189453125,
0.00649261474609375,
-0.0101318359375,
-0.01513671875,
-0.038238525390625,
-0.04595947265625,
-0.0302276611328125,
0.053955078125,
-0.0018491744995117188,
0.0271759033203125,
0.03509521484375,
0.046234130859375,
0.020599365234375,
-0.0164031982421875,
-0.004840850830078125,
-0.04827880859375,
-0.0239105224609375,
-0.004894256591796875,
-0.0233306884765625,
-0.054534912109375,
-0.01116180419921875,
0.053375244140625,
0.0287933349609375,
0.01123809814453125,
0.02008056640625,
0.0029239654541015625,
0.030609130859375,
-0.02716064453125,
0.01203155517578125,
-0.01238250732421875,
0.004001617431640625,
-0.020172119140625,
0.010650634765625,
-0.03314208984375,
-0.023651123046875,
0.01214599609375,
-0.0325927734375,
0.032379150390625,
-0.006587982177734375,
0.08660888671875,
0.0177459716796875,
-0.039093017578125,
-0.01255035400390625,
-0.056884765625,
0.06524658203125,
-0.051849365234375,
0.042724609375,
0.00553131103515625,
0.0206756591796875,
-0.006694793701171875,
-0.09271240234375,
-0.0543212890625,
-0.011260986328125,
-0.013671875,
0.0229644775390625,
-0.01171112060546875,
0.020263671875,
0.041046142578125,
0.04119873046875,
-0.045928955078125,
-0.0009407997131347656,
-0.043792724609375,
0.0003421306610107422,
0.03790283203125,
-0.01172637939453125,
0.016021728515625,
-0.0216522216796875,
-0.04522705078125,
0.0023250579833984375,
-0.037322998046875,
0.007678985595703125,
0.007709503173828125,
0.007122039794921875,
-0.03131103515625,
0.023406982421875,
-0.00396728515625,
0.043701171875,
0.029998779296875,
-0.0119476318359375,
0.0295257568359375,
-0.0338134765625,
-0.0230255126953125,
-0.01192474365234375,
0.07373046875,
0.00910186767578125,
0.0228271484375,
-0.018798828125,
-0.0218505859375,
0.0018854141235351562,
0.0222930908203125,
-0.078857421875,
0.01092529296875,
0.0217742919921875,
-0.039031982421875,
-0.0322265625,
-0.0008206367492675781,
-0.048248291015625,
0.006877899169921875,
-0.0033550262451171875,
0.0450439453125,
-0.033111572265625,
-0.014129638671875,
0.006610870361328125,
0.005184173583984375,
0.0256805419921875,
0.005092620849609375,
-0.07080078125,
0.01605224609375,
0.031951904296875,
0.0672607421875,
-0.0259246826171875,
-0.02996826171875,
-0.016143798828125,
0.01306915283203125,
-0.00691986083984375,
0.040069580078125,
-0.01629638671875,
-0.0251007080078125,
-0.006572723388671875,
0.0253448486328125,
0.0006875991821289062,
-0.01441192626953125,
0.06207275390625,
-0.0341796875,
0.051544189453125,
-0.0046234130859375,
-0.03509521484375,
-0.016326904296875,
0.01200103759765625,
-0.0592041015625,
0.08636474609375,
0.01052093505859375,
-0.067138671875,
0.022186279296875,
-0.092529296875,
-0.01995849609375,
-0.00830841064453125,
0.019317626953125,
-0.05419921875,
-0.0074920654296875,
0.0313720703125,
0.048980712890625,
-0.0196075439453125,
0.02606201171875,
-0.0214385986328125,
-0.03179931640625,
-0.00519561767578125,
-0.017364501953125,
0.052032470703125,
0.0182037353515625,
-0.036407470703125,
0.0321044921875,
-0.05584716796875,
-0.0147247314453125,
0.0264434814453125,
-0.019989013671875,
-0.00211334228515625,
-0.0137481689453125,
0.0119781494140625,
0.0228729248046875,
0.021270751953125,
-0.038116455078125,
0.0298614501953125,
-0.03936767578125,
0.0606689453125,
0.03900146484375,
-0.01026153564453125,
0.0243072509765625,
-0.0232391357421875,
0.026947021484375,
0.0267333984375,
0.01219940185546875,
-0.027618408203125,
-0.0120086669921875,
-0.0672607421875,
-0.018402099609375,
0.03411865234375,
0.0286407470703125,
-0.050811767578125,
0.0302276611328125,
-0.053680419921875,
-0.0361328125,
-0.038848876953125,
-0.0088653564453125,
0.01666259765625,
0.030853271484375,
0.041534423828125,
-0.01556396484375,
-0.037384033203125,
-0.0599365234375,
-0.01508331298828125,
0.0153045654296875,
0.003246307373046875,
0.0011110305786132812,
0.052734375,
-0.02923583984375,
0.0650634765625,
-0.0217132568359375,
-0.01554107666015625,
-0.03729248046875,
0.0186004638671875,
0.02716064453125,
0.0316162109375,
0.050445556640625,
-0.048980712890625,
-0.03302001953125,
-0.022216796875,
-0.050445556640625,
0.006481170654296875,
-0.0155181884765625,
-0.0035762786865234375,
0.03759765625,
0.0261688232421875,
-0.07122802734375,
0.031982421875,
0.02911376953125,
-0.01983642578125,
0.0213623046875,
-0.0030498504638671875,
0.0007319450378417969,
-0.1263427734375,
0.0293426513671875,
0.006458282470703125,
-0.037506103515625,
-0.049346923828125,
0.005214691162109375,
0.0212554931640625,
-0.0130615234375,
-0.036468505859375,
0.057769775390625,
-0.054718017578125,
0.003223419189453125,
-0.005947113037109375,
-0.0022716522216796875,
-0.008392333984375,
0.04522705078125,
0.0074005126953125,
0.0625,
0.0193328857421875,
-0.044342041015625,
0.031036376953125,
0.015533447265625,
-0.01387786865234375,
0.019073486328125,
-0.0684814453125,
0.027008056640625,
-0.01432037353515625,
0.033721923828125,
-0.0625,
-0.0213470458984375,
0.00870513916015625,
-0.04339599609375,
0.03326416015625,
-0.00887298583984375,
-0.038055419921875,
-0.056732177734375,
-0.0237579345703125,
0.039642333984375,
0.048736572265625,
-0.052490234375,
0.042022705078125,
0.0007472038269042969,
0.0019702911376953125,
-0.047454833984375,
-0.043701171875,
0.0004513263702392578,
-0.033355712890625,
-0.0570068359375,
0.05279541015625,
-0.010894775390625,
0.0208587646484375,
-0.0201263427734375,
0.0118408203125,
0.006763458251953125,
-0.0143890380859375,
0.0062408447265625,
0.00618743896484375,
-0.015594482421875,
0.01094818115234375,
-0.01532745361328125,
-0.017425537109375,
0.0036773681640625,
-0.0191497802734375,
0.044952392578125,
-0.012298583984375,
0.00414276123046875,
-0.040679931640625,
0.02716064453125,
0.059478759765625,
-0.022125244140625,
0.046417236328125,
0.07891845703125,
-0.0350341796875,
-0.0074920654296875,
-0.04620361328125,
-0.01983642578125,
-0.037078857421875,
0.0384521484375,
-0.041534423828125,
-0.0579833984375,
0.046295166015625,
-0.0012989044189453125,
0.01214599609375,
0.051910400390625,
0.045257568359375,
0.00428009033203125,
0.07867431640625,
0.067138671875,
-0.0164337158203125,
0.044830322265625,
-0.0247650146484375,
0.0234527587890625,
-0.053314208984375,
-0.0007762908935546875,
-0.01323699951171875,
-0.020233154296875,
-0.054107666015625,
-0.0191497802734375,
0.025726318359375,
-0.01373291015625,
-0.0310211181640625,
0.020233154296875,
-0.04620361328125,
0.0158538818359375,
0.043548583984375,
-0.004364013671875,
0.0029239654541015625,
-0.002368927001953125,
0.005523681640625,
-0.006336212158203125,
-0.037445068359375,
-0.0178985595703125,
0.07489013671875,
0.047210693359375,
0.0528564453125,
0.01097869873046875,
0.061004638671875,
0.0007643699645996094,
0.00897979736328125,
-0.0628662109375,
0.031585693359375,
-0.008392333984375,
-0.040679931640625,
-0.0178070068359375,
-0.0179443359375,
-0.090087890625,
-0.0092926025390625,
-0.004364013671875,
-0.04541015625,
-0.007328033447265625,
0.003360748291015625,
-0.03167724609375,
0.0185546875,
-0.056976318359375,
0.07574462890625,
-0.01027679443359375,
-0.032135009765625,
0.0022430419921875,
-0.060791015625,
0.031829833984375,
0.004299163818359375,
-0.00864410400390625,
0.019012451171875,
0.01157379150390625,
0.060791015625,
-0.0244598388671875,
0.0677490234375,
-0.003330230712890625,
-0.0082244873046875,
0.01251983642578125,
-0.0272979736328125,
0.050537109375,
-0.0110321044921875,
0.017181396484375,
0.0180816650390625,
0.0012073516845703125,
-0.037384033203125,
-0.037200927734375,
0.0450439453125,
-0.07464599609375,
-0.038299560546875,
-0.045135498046875,
-0.0244903564453125,
0.0007767677307128906,
0.042510986328125,
0.0380859375,
0.02227783203125,
-0.01052093505859375,
0.020355224609375,
0.048248291015625,
-0.00925445556640625,
0.0640869140625,
0.00833892822265625,
-0.021514892578125,
-0.037322998046875,
0.048980712890625,
0.0164794921875,
0.019744873046875,
0.0313720703125,
0.006011962890625,
-0.0297088623046875,
-0.020843505859375,
-0.028717041015625,
0.046844482421875,
-0.03436279296875,
-0.01202392578125,
-0.050689697265625,
-0.040496826171875,
-0.04986572265625,
-0.00443267822265625,
-0.027099609375,
-0.0174102783203125,
-0.0305023193359375,
-0.007076263427734375,
0.0288238525390625,
0.048095703125,
0.0160675048828125,
0.02716064453125,
-0.0628662109375,
0.040740966796875,
0.0153045654296875,
0.036865234375,
-0.00891876220703125,
-0.0455322265625,
-0.016571044921875,
-0.00464630126953125,
-0.0308380126953125,
-0.062469482421875,
0.040740966796875,
0.01314544677734375,
0.0248260498046875,
0.0239410400390625,
-0.00463104248046875,
0.05572509765625,
-0.03216552734375,
0.06622314453125,
0.01166534423828125,
-0.07366943359375,
0.033477783203125,
-0.037109375,
0.04840087890625,
0.0025615692138671875,
0.03338623046875,
-0.044769287109375,
-0.00606536865234375,
-0.05767822265625,
-0.0701904296875,
0.0548095703125,
0.018829345703125,
0.0153350830078125,
0.0083160400390625,
0.0255889892578125,
0.007434844970703125,
0.0087432861328125,
-0.0916748046875,
-0.012298583984375,
-0.03424072265625,
-0.0335693359375,
-0.007049560546875,
-0.039886474609375,
0.0014829635620117188,
-0.01690673828125,
0.044189453125,
-0.00922393798828125,
0.062042236328125,
0.0242462158203125,
-0.01800537109375,
0.007808685302734375,
0.017059326171875,
0.06561279296875,
0.036468505859375,
-0.019500732421875,
-0.00698089599609375,
0.0232391357421875,
-0.046478271484375,
-0.014129638671875,
0.019866943359375,
-0.0123748779296875,
0.0095672607421875,
0.0302276611328125,
0.08154296875,
0.00353240966796875,
-0.00884246826171875,
0.039215087890625,
0.0012340545654296875,
-0.022186279296875,
-0.047637939453125,
-0.01094818115234375,
0.0134429931640625,
0.006427764892578125,
0.01837158203125,
-0.02728271484375,
0.0030364990234375,
-0.0360107421875,
0.0014677047729492188,
0.009368896484375,
-0.021453857421875,
-0.0408935546875,
0.0645751953125,
0.0302276611328125,
-0.01123046875,
0.04351806640625,
-0.0021820068359375,
-0.03948974609375,
0.035186767578125,
0.061004638671875,
0.0716552734375,
-0.01557159423828125,
-0.0157928466796875,
0.046600341796875,
0.00958251953125,
-0.007568359375,
0.02899169921875,
-0.0018091201782226562,
-0.034820556640625,
-0.033782958984375,
-0.041961669921875,
0.0010986328125,
0.038665771484375,
-0.03759765625,
0.043975830078125,
-0.0237274169921875,
-0.0301513671875,
0.011749267578125,
0.015899658203125,
-0.053863525390625,
0.0294036865234375,
0.0178375244140625,
0.06732177734375,
-0.037322998046875,
0.07196044921875,
0.046875,
-0.043914794921875,
-0.052581787109375,
0.0012712478637695312,
-0.026947021484375,
-0.05224609375,
0.051055908203125,
0.0294189453125,
-0.00209808349609375,
0.0177459716796875,
-0.051788330078125,
-0.08062744140625,
0.09368896484375,
0.011444091796875,
-0.03802490234375,
-0.035064697265625,
0.0145416259765625,
0.0401611328125,
-0.0117645263671875,
0.0322265625,
0.02392578125,
0.032623291015625,
0.00978851318359375,
-0.0863037109375,
0.01255035400390625,
-0.027679443359375,
0.00350189208984375,
0.03106689453125,
-0.08160400390625,
0.06170654296875,
-0.0232086181640625,
-0.0125732421875,
0.0019025802612304688,
0.07489013671875,
0.00489044189453125,
0.013214111328125,
0.0250244140625,
0.0408935546875,
0.03863525390625,
-0.01421356201171875,
0.06707763671875,
-0.03253173828125,
0.049713134765625,
0.056396484375,
0.00443267822265625,
0.04925537109375,
0.0360107421875,
-0.00830078125,
0.031890869140625,
0.05389404296875,
-0.01430511474609375,
0.03497314453125,
-0.001506805419921875,
-0.01169586181640625,
-0.00896453857421875,
0.015899658203125,
-0.041778564453125,
0.0238189697265625,
0.018157958984375,
-0.046417236328125,
-0.01004791259765625,
-0.00037360191345214844,
0.023681640625,
-0.033203125,
-0.01605224609375,
0.0543212890625,
0.01422119140625,
-0.052703857421875,
0.0679931640625,
0.01387786865234375,
0.0736083984375,
-0.042938232421875,
0.0080108642578125,
-0.0208892822265625,
0.0238037109375,
-0.015228271484375,
-0.04638671875,
0.019134521484375,
0.01050567626953125,
-0.0251617431640625,
-0.0224609375,
0.051055908203125,
-0.0341796875,
-0.044708251953125,
0.014678955078125,
0.01219940185546875,
0.0108489990234375,
0.009918212890625,
-0.047637939453125,
-0.0099945068359375,
-0.0033416748046875,
-0.04412841796875,
0.0165557861328125,
0.0289459228515625,
-0.020751953125,
0.051849365234375,
0.03955078125,
-0.00881195068359375,
0.0017576217651367188,
0.0028209686279296875,
0.053375244140625,
-0.07861328125,
-0.04791259765625,
-0.06524658203125,
0.047637939453125,
-0.00765228271484375,
-0.032135009765625,
0.034423828125,
0.0396728515625,
0.052764892578125,
-0.01678466796875,
0.0645751953125,
-0.0015153884887695312,
0.050018310546875,
-0.0357666015625,
0.06463623046875,
-0.049713134765625,
-0.0167083740234375,
-0.01215362548828125,
-0.0662841796875,
-0.024139404296875,
0.038818359375,
-0.0264892578125,
0.035614013671875,
0.05810546875,
0.039031982421875,
-0.026031494140625,
-0.01360321044921875,
0.0227813720703125,
0.038055419921875,
0.0350341796875,
0.057769775390625,
0.028564453125,
-0.0482177734375,
0.040924072265625,
-0.0157928466796875,
0.01343536376953125,
-0.0184478759765625,
-0.07330322265625,
-0.08416748046875,
-0.046875,
-0.00782012939453125,
-0.03216552734375,
0.015533447265625,
0.0723876953125,
0.049652099609375,
-0.050048828125,
0.00020182132720947266,
-0.0007538795471191406,
-0.006313323974609375,
0.006839752197265625,
-0.018035888671875,
0.04766845703125,
-0.0232696533203125,
-0.0758056640625,
0.00115966796875,
-0.0036945343017578125,
0.032470703125,
-0.0110626220703125,
0.006622314453125,
-0.0016345977783203125,
0.0014934539794921875,
0.056182861328125,
0.0200347900390625,
-0.05535888671875,
-0.034881591796875,
0.01548004150390625,
-0.016937255859375,
0.019744873046875,
0.0294036865234375,
-0.053985595703125,
0.0163421630859375,
0.03131103515625,
0.033721923828125,
0.058807373046875,
0.0029659271240234375,
0.04852294921875,
-0.0455322265625,
0.0171661376953125,
0.00982666015625,
0.0213165283203125,
0.026702880859375,
-0.024749755859375,
0.04034423828125,
0.0223541259765625,
-0.048858642578125,
-0.06329345703125,
0.006206512451171875,
-0.0924072265625,
0.005523681640625,
0.09539794921875,
-0.00818634033203125,
-0.033111572265625,
0.0195770263671875,
-0.014678955078125,
0.042388916015625,
-0.0288238525390625,
0.0693359375,
0.062042236328125,
0.0035991668701171875,
-0.0283660888671875,
-0.04132080078125,
0.051177978515625,
0.0208892822265625,
-0.06170654296875,
-0.01507568359375,
0.00997161865234375,
0.035552978515625,
0.0125274658203125,
0.036102294921875,
-0.0030002593994140625,
0.01482391357421875,
-0.02252197265625,
0.031951904296875,
-0.0049285888671875,
-0.0102996826171875,
-0.020599365234375,
0.01241302490234375,
-0.0139923095703125,
-0.0165557861328125
]
] |
padmajabfrl/Gender-Classification | 2023-01-09T10:52:54.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | padmajabfrl | null | null | padmajabfrl/Gender-Classification | 13 | 95,834 | transformers | 2023-01-09T10:13:14 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: Gender-Classification
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Gender-Classification
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
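With the linear scheduler above (and no warmup steps listed in the card), the learning rate simply decays from its 2e-05 peak to zero over training. A minimal pure-Python sketch of that schedule — the `total_steps` value of 21950 is taken from the results table below (5 epochs × 4390 steps); treating warmup as zero is an assumption, since the card does not state a warmup value:

```python
def linear_lr(step, base_lr=2e-05, total_steps=21950):
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    remaining = max(0.0, 1.0 - step / total_steps)
    return base_lr * remaining

print(linear_lr(0))       # 2e-05 (full LR at the start)
print(linear_lr(10975))   # 1e-05 (half the LR at the midpoint)
print(linear_lr(21950))   # 0.0   (fully decayed at the end)
```

In the actual `Trainer` run this schedule is applied per optimizer step, but the shape of the decay is the same.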
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.0035 | 1.0 | 4390 | 0.0004 | 1.0000 |
| 0.0005 | 2.0 | 8780 | 0.0002 | 1.0000 |
| 0.0 | 3.0 | 13170 | 0.0000 | 1.0 |
| 0.0 | 4.0 | 17560 | 0.0000 | 1.0 |
| 0.0 | 5.0 | 21950 | 0.0000 | 1.0 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2
| 1,600 | [
[
-0.0264129638671875,
-0.030517578125,
0.00536346435546875,
0.01189422607421875,
-0.0037250518798828125,
-0.004192352294921875,
0.0029659271240234375,
-0.00492095947265625,
-0.008941650390625,
0.017333984375,
-0.06494140625,
-0.055419921875,
-0.05889892578125,
0.00452423095703125,
-0.0281982421875,
0.0770263671875,
0.01291656494140625,
0.0350341796875,
-0.004863739013671875,
0.01128387451171875,
-0.053863525390625,
-0.041778564453125,
-0.048095703125,
-0.03985595703125,
0.016632080078125,
0.02923583984375,
0.044342041015625,
0.06365966796875,
0.037567138671875,
0.023345947265625,
-0.03887939453125,
0.00380706787109375,
-0.040618896484375,
-0.03509521484375,
-0.0001157522201538086,
-0.043182373046875,
-0.0526123046875,
0.01207733154296875,
0.050567626953125,
0.054107666015625,
-0.028411865234375,
0.0435791015625,
0.0125885009765625,
0.05035400390625,
-0.036529541015625,
0.030181884765625,
-0.033538818359375,
0.02978515625,
-0.01898193359375,
-0.02093505859375,
-0.024932861328125,
-0.01224517822265625,
0.0125732421875,
-0.02288818359375,
0.0509033203125,
0.0162506103515625,
0.0694580078125,
0.015625,
-0.032135009765625,
-0.0021457672119140625,
-0.06329345703125,
0.0484619140625,
-0.040283203125,
0.026123046875,
0.038482666015625,
0.034423828125,
-0.009857177734375,
-0.034515380859375,
-0.032684326171875,
0.007358551025390625,
-0.0084228515625,
0.00042128562927246094,
-0.0222930908203125,
-0.006622314453125,
0.05609130859375,
0.03741455078125,
-0.03814697265625,
0.006748199462890625,
-0.057952880859375,
-0.01163482666015625,
0.049468994140625,
0.028289794921875,
-0.00057220458984375,
-0.007537841796875,
-0.0225067138671875,
-0.006748199462890625,
-0.0169525146484375,
0.01436614990234375,
0.059906005859375,
0.032989501953125,
-0.0278472900390625,
0.0396728515625,
-0.01690673828125,
0.046722412109375,
0.031280517578125,
-0.0222320556640625,
0.05023193359375,
0.00960540771484375,
-0.023895263671875,
0.01523590087890625,
0.060821533203125,
0.03466796875,
0.006275177001953125,
0.0115814208984375,
-0.004848480224609375,
-0.0170745849609375,
0.017059326171875,
-0.067138671875,
-0.0191497802734375,
0.00875091552734375,
-0.04840087890625,
-0.05242919921875,
0.0191802978515625,
-0.04736328125,
-0.0006613731384277344,
-0.038787841796875,
0.0200958251953125,
-0.0214385986328125,
-0.025482177734375,
0.0061187744140625,
-0.02142333984375,
0.0192718505859375,
0.0231781005859375,
-0.08331298828125,
0.035369873046875,
0.031036376953125,
0.0389404296875,
0.01535797119140625,
-0.015777587890625,
0.00025963783264160156,
-0.004169464111328125,
-0.0262451171875,
0.01690673828125,
-0.003704071044921875,
-0.03167724609375,
-0.0145111083984375,
0.0214385986328125,
-0.014892578125,
-0.0318603515625,
0.05535888671875,
-0.0092010498046875,
0.039581298828125,
-0.0205078125,
-0.040496826171875,
-0.0163421630859375,
0.0210723876953125,
-0.043304443359375,
0.087646484375,
0.013519287109375,
-0.07354736328125,
0.041778564453125,
-0.03857421875,
0.005138397216796875,
-0.00223541259765625,
-0.0211029052734375,
-0.049530029296875,
-0.0098419189453125,
-0.002124786376953125,
0.038177490234375,
-0.0335693359375,
0.036468505859375,
-0.0189056396484375,
-0.03662109375,
0.004913330078125,
-0.042327880859375,
0.07159423828125,
-0.0004944801330566406,
-0.043548583984375,
0.004360198974609375,
-0.08367919921875,
-0.0029468536376953125,
0.02899169921875,
-0.01611328125,
0.002155303955078125,
-0.0220184326171875,
0.00994873046875,
0.0167694091796875,
0.0208740234375,
-0.041748046875,
0.00966644287109375,
-0.00867462158203125,
0.0170135498046875,
0.052093505859375,
-0.022857666015625,
0.022216796875,
-0.0251007080078125,
0.0212249755859375,
0.0207366943359375,
0.0268402099609375,
0.028411865234375,
-0.0389404296875,
-0.068115234375,
-0.0279388427734375,
0.029327392578125,
0.038330078125,
-0.0208587646484375,
0.07061767578125,
-0.0067901611328125,
-0.0582275390625,
-0.0211029052734375,
0.0013170242309570312,
0.03399658203125,
0.05755615234375,
0.0206451416015625,
0.003437042236328125,
-0.036529541015625,
-0.09478759765625,
0.013946533203125,
-0.0204620361328125,
0.00939178466796875,
0.01013946533203125,
0.062255859375,
-0.0212249755859375,
0.068359375,
-0.054443359375,
-0.020263671875,
-0.01470947265625,
0.0182037353515625,
0.040252685546875,
0.060943603515625,
0.060546875,
-0.043792724609375,
0.00745391845703125,
-0.016265869140625,
-0.06256103515625,
0.0160675048828125,
0.0137481689453125,
-0.0340576171875,
-0.01244354248046875,
0.0132904052734375,
-0.024169921875,
0.0609130859375,
0.016876220703125,
-0.035919189453125,
0.050506591796875,
-0.0313720703125,
-0.0008196830749511719,
-0.0694580078125,
0.0171966552734375,
0.01099395751953125,
-0.0199432373046875,
-0.034637451171875,
-0.01233673095703125,
0.01316070556640625,
-0.01160430908203125,
-0.036529541015625,
0.032501220703125,
-0.00836181640625,
0.00815582275390625,
-0.016448974609375,
-0.04669189453125,
-0.005779266357421875,
0.05377197265625,
0.0204620361328125,
0.02789306640625,
0.035675048828125,
-0.047607421875,
0.0226287841796875,
0.0484619140625,
-0.0102691650390625,
0.043304443359375,
-0.055938720703125,
0.01044464111328125,
-0.021881103515625,
0.00730133056640625,
-0.058013916015625,
-0.01605224609375,
0.04254150390625,
-0.040496826171875,
0.037109375,
-0.0205230712890625,
-0.0210113525390625,
-0.045257568359375,
-0.0137481689453125,
0.0220794677734375,
0.04290771484375,
-0.042633056640625,
0.0325927734375,
-0.00821685791015625,
0.016876220703125,
-0.061431884765625,
-0.05596923828125,
-0.032379150390625,
-0.0133514404296875,
-0.018829345703125,
0.0081939697265625,
-0.006946563720703125,
0.00872802734375,
-0.011077880859375,
-0.003631591796875,
-0.0200653076171875,
0.0013532638549804688,
0.030975341796875,
0.0318603515625,
0.0018100738525390625,
0.0008955001831054688,
0.017852783203125,
-0.0192108154296875,
0.01517486572265625,
0.0112152099609375,
0.046783447265625,
-0.01557159423828125,
-0.02655029296875,
-0.05963134765625,
0.01155853271484375,
0.0335693359375,
-0.0009298324584960938,
0.07196044921875,
0.05645751953125,
-0.0537109375,
-0.004337310791015625,
-0.0274658203125,
-0.0136566162109375,
-0.031890869140625,
0.039093017578125,
-0.03717041015625,
-0.030120849609375,
0.047454833984375,
0.005802154541015625,
0.00988006591796875,
0.076171875,
0.031951904296875,
-0.0142822265625,
0.08990478515625,
0.0240936279296875,
0.0015869140625,
0.0086212158203125,
-0.054901123046875,
-0.0236968994140625,
-0.053619384765625,
-0.051177978515625,
-0.049224853515625,
-0.038482666015625,
-0.059844970703125,
0.0028781890869140625,
-0.00376129150390625,
0.01079559326171875,
-0.07586669921875,
0.02154541015625,
-0.0498046875,
0.0214080810546875,
0.058990478515625,
0.034942626953125,
0.0008668899536132812,
-0.00043845176696777344,
-0.0074005126953125,
-0.0173492431640625,
-0.0640869140625,
-0.035247802734375,
0.08599853515625,
0.0567626953125,
0.05596923828125,
0.006473541259765625,
0.050933837890625,
0.0166473388671875,
0.01042938232421875,
-0.0311279296875,
0.033416748046875,
-0.012451171875,
-0.07757568359375,
-0.00887298583984375,
-0.05023193359375,
-0.05316162109375,
0.0095062255859375,
-0.03448486328125,
-0.04486083984375,
0.037872314453125,
0.014373779296875,
-0.01690673828125,
0.03778076171875,
-0.019561767578125,
0.07403564453125,
-0.0265960693359375,
-0.01885986328125,
0.00930023193359375,
-0.040130615234375,
0.027801513671875,
0.00560760498046875,
-0.005565643310546875,
-0.0128326416015625,
0.0238494873046875,
0.0697021484375,
-0.03741455078125,
0.06365966796875,
-0.0261993408203125,
0.0167694091796875,
-0.0030612945556640625,
-0.01611328125,
0.0177459716796875,
0.0118560791015625,
-0.0009603500366210938,
0.039764404296875,
-0.0011529922485351562,
-0.03271484375,
-0.026336669921875,
0.038238525390625,
-0.08734130859375,
-0.018585205078125,
-0.0665283203125,
-0.0272216796875,
0.00662994384765625,
0.01285552978515625,
0.048065185546875,
0.050537109375,
-0.007495880126953125,
0.018890380859375,
0.054595947265625,
-0.0083160400390625,
0.0160675048828125,
0.0245208740234375,
-0.0165863037109375,
-0.034332275390625,
0.045257568359375,
-0.001678466796875,
0.00821685791015625,
0.008636474609375,
0.01751708984375,
-0.040924072265625,
-0.0301971435546875,
-0.051239013671875,
0.013824462890625,
-0.05084228515625,
-0.016510009765625,
-0.034637451171875,
-0.039886474609375,
-0.0233306884765625,
0.0124969482421875,
-0.0341796875,
-0.01396942138671875,
-0.05072021484375,
-0.0155181884765625,
0.0242919921875,
0.03466796875,
0.0243377685546875,
0.042877197265625,
-0.0394287109375,
0.00986480712890625,
0.0227813720703125,
0.0251922607421875,
-0.00157928466796875,
-0.06939697265625,
0.0031147003173828125,
0.0093841552734375,
-0.030548095703125,
-0.0592041015625,
0.03717041015625,
0.01055145263671875,
0.048858642578125,
0.039398193359375,
-0.008544921875,
0.06640625,
-0.02960205078125,
0.041107177734375,
0.03759765625,
-0.0570068359375,
0.042449951171875,
-0.0254669189453125,
0.020843505859375,
0.05517578125,
0.040740966796875,
-0.007541656494140625,
0.00020360946655273438,
-0.073486328125,
-0.0457763671875,
0.080078125,
0.04437255859375,
0.0026111602783203125,
0.00704193115234375,
0.042022705078125,
0.005252838134765625,
0.041961669921875,
-0.05938720703125,
-0.05859375,
-0.011566162109375,
-0.027374267578125,
0.0074462890625,
-0.0222930908203125,
-0.01093292236328125,
-0.059173583984375,
0.071533203125,
-0.004177093505859375,
0.032135009765625,
0.0009455680847167969,
0.00514984130859375,
-0.0017032623291015625,
0.006031036376953125,
0.04437255859375,
0.06036376953125,
-0.051727294921875,
-0.006130218505859375,
0.0228118896484375,
-0.038665771484375,
0.0186004638671875,
0.0154266357421875,
-0.0177001953125,
0.01218414306640625,
0.0276031494140625,
0.08331298828125,
0.007720947265625,
-0.0103759765625,
0.037261962890625,
-0.033294677734375,
-0.048583984375,
-0.039398193359375,
0.004638671875,
-0.0109405517578125,
0.023834228515625,
0.033203125,
0.033966064453125,
0.0146942138671875,
-0.023345947265625,
-0.0028171539306640625,
0.01129150390625,
-0.0426025390625,
-0.00620269775390625,
0.065185546875,
0.01020050048828125,
-0.0156402587890625,
0.06573486328125,
-0.022430419921875,
-0.029449462890625,
0.06658935546875,
0.030914306640625,
0.06903076171875,
-0.025177001953125,
0.0036792755126953125,
0.07012939453125,
0.0210723876953125,
-0.023895263671875,
0.0362548828125,
0.0209503173828125,
-0.035247802734375,
-0.0033550262451171875,
-0.058349609375,
-0.01104736328125,
0.031341552734375,
-0.0771484375,
0.040740966796875,
-0.0474853515625,
-0.038360595703125,
0.01074981689453125,
0.0011034011840820312,
-0.07672119140625,
0.0380859375,
0.0238800048828125,
0.07244873046875,
-0.07470703125,
0.032958984375,
0.043609619140625,
-0.0257110595703125,
-0.0650634765625,
-0.02667236328125,
-0.001922607421875,
-0.051513671875,
0.07513427734375,
0.0227813720703125,
0.016143798828125,
-0.003498077392578125,
-0.03155517578125,
-0.05267333984375,
0.09234619140625,
0.032470703125,
-0.07666015625,
-0.005031585693359375,
0.026885986328125,
0.0389404296875,
-0.0124664306640625,
0.059112548828125,
0.025146484375,
0.0185089111328125,
0.01470947265625,
-0.08111572265625,
-0.0070648193359375,
-0.0281219482421875,
0.0176239013671875,
0.005664825439453125,
-0.040008544921875,
0.0777587890625,
0.00641632080078125,
0.0222930908203125,
0.004787445068359375,
0.049835205078125,
0.0261688232421875,
0.0273590087890625,
0.053314208984375,
0.050384521484375,
0.055084228515625,
-0.028656005859375,
0.0609130859375,
-0.01168060302734375,
0.050140380859375,
0.07904052734375,
0.007904052734375,
0.035400390625,
0.01122283935546875,
-0.032196044921875,
0.0295867919921875,
0.068603515625,
-0.012969970703125,
0.0179290771484375,
0.0173492431640625,
0.006793975830078125,
-0.0249786376953125,
0.02008056640625,
-0.05267333984375,
0.043182373046875,
-0.0015544891357421875,
-0.054473876953125,
-0.00994110107421875,
-0.003955841064453125,
-0.004985809326171875,
-0.01325225830078125,
-0.031341552734375,
0.037811279296875,
-0.0236358642578125,
-0.0041961669921875,
0.05267333984375,
0.005542755126953125,
0.042449951171875,
-0.043975830078125,
-0.01528167724609375,
-0.0005693435668945312,
0.030364990234375,
-0.02728271484375,
-0.04437255859375,
0.0023899078369140625,
0.0081939697265625,
-0.035430908203125,
0.00010347366333007812,
0.037139892578125,
-0.035064697265625,
-0.08367919921875,
0.0024852752685546875,
0.033355712890625,
0.0031108856201171875,
0.00649261474609375,
-0.077392578125,
-0.0157623291015625,
-0.0016384124755859375,
-0.0156402587890625,
0.00726318359375,
0.0140838623046875,
-0.01245880126953125,
0.050933837890625,
0.04803466796875,
-0.0095977783203125,
-0.00225067138671875,
0.00783538818359375,
0.0609130859375,
-0.045989990234375,
-0.04644775390625,
-0.0665283203125,
0.023590087890625,
-0.019287109375,
-0.05316162109375,
0.04583740234375,
0.0675048828125,
0.055328369140625,
-0.0222930908203125,
0.037261962890625,
-0.003261566162109375,
0.037261962890625,
-0.0203704833984375,
0.05694580078125,
-0.03240966796875,
0.0006508827209472656,
-0.0285797119140625,
-0.06915283203125,
-0.00370025634765625,
0.05572509765625,
-0.021484375,
0.0166778564453125,
0.0254974365234375,
0.035675048828125,
-0.00800323486328125,
0.0070648193359375,
0.004764556884765625,
-0.01849365234375,
0.0087890625,
0.02545166015625,
0.056884765625,
-0.055419921875,
0.01904296875,
-0.047454833984375,
-0.0206756591796875,
-0.0031147003173828125,
-0.056884765625,
-0.08197021484375,
-0.024017333984375,
-0.048095703125,
-0.016876220703125,
-0.002162933349609375,
0.07806396484375,
0.0723876953125,
-0.049835205078125,
-0.025146484375,
-0.0027027130126953125,
-0.02392578125,
-0.02093505859375,
-0.01401519775390625,
0.028564453125,
0.00872802734375,
-0.06494140625,
-0.021697998046875,
-0.032623291015625,
0.0239715576171875,
-0.0178070068359375,
-0.0181884765625,
-0.0233917236328125,
-0.031982421875,
0.01271820068359375,
-0.006618499755859375,
-0.028167724609375,
-0.01129913330078125,
-0.007537841796875,
-0.00354766845703125,
0.003833770751953125,
0.0131378173828125,
-0.027252197265625,
0.025543212890625,
0.0237884521484375,
0.0162353515625,
0.047027587890625,
0.0014009475708007812,
0.018157958984375,
-0.06561279296875,
0.025146484375,
0.01507568359375,
0.03436279296875,
0.011871337890625,
-0.03778076171875,
0.03192138671875,
0.0301513671875,
-0.03131103515625,
-0.0584716796875,
-0.0195159912109375,
-0.0751953125,
0.0002446174621582031,
0.07879638671875,
0.00792694091796875,
-0.03900146484375,
0.01611328125,
-0.03375244140625,
0.027130126953125,
-0.0343017578125,
0.02886962890625,
0.048614501953125,
-0.00469970703125,
0.0008878707885742188,
-0.044677734375,
0.0270843505859375,
0.005229949951171875,
-0.05255126953125,
-0.01593017578125,
0.024658203125,
0.0394287109375,
0.01213836669921875,
0.0270538330078125,
-0.012725830078125,
0.0240325927734375,
0.00036525726318359375,
0.032073974609375,
-0.0267791748046875,
-0.0227508544921875,
-0.0244598388671875,
0.0007290840148925781,
0.0005540847778320312,
-0.047027587890625
]
] |
facebook/opt-2.7b | 2023-09-15T13:04:38.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"opt",
"text-generation",
"en",
"arxiv:2205.01068",
"arxiv:2005.14165",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | facebook | null | null | facebook/opt-2.7b | 47 | 95,544 | transformers | 2022-05-11T08:26:30 | ---
language: en
inference: false
tags:
- text-generation
- opt
license: other
commercial: false
---
# OPT : Open Pre-trained Transformer Language Models
OPT was first introduced in [Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) and first released in [metaseq's repository](https://github.com/facebookresearch/metaseq) on May 3rd 2022 by Meta AI.
**Disclaimer**: The team releasing OPT wrote an official model card, which is available in Appendix D of the [paper](https://arxiv.org/pdf/2205.01068.pdf).
Content from **this** model card has been written by the Hugging Face team.
## Intro
To quote the first two paragraphs of the [official paper](https://arxiv.org/abs/2205.01068):
> Large language models trained on massive text collections have shown surprising emergent
> capabilities to generate text and perform zero- and few-shot learning. While in some cases the public
> can interact with these models through paid APIs, full model access is currently limited to only a
> few highly resourced labs. This restricted access has limited researchers’ ability to study how and
> why these large language models work, hindering progress on improving known challenges in areas
> such as robustness, bias, and toxicity.
> We present Open Pretrained Transformers (OPT), a suite of decoder-only pre-trained transformers ranging from 125M
> to 175B parameters, which we aim to fully and responsibly share with interested researchers. We train the OPT models to roughly match
> the performance and sizes of the GPT-3 class of models, while also applying the latest best practices in data
> collection and efficient training. Our aim in developing this suite of OPT models is to enable reproducible and responsible research at scale, and
> to bring more voices to the table in studying the impact of these LLMs. Definitions of risk, harm, bias, and toxicity, etc., should be articulated by the
> collective research community as a whole, which is only possible when models are available for study.
## Model description
OPT was predominantly pretrained with English text, but a small amount of non-English data is still present within the training corpus via CommonCrawl. The model was pretrained using a causal language modeling (CLM) objective.
OPT belongs to the same family of decoder-only models as [GPT-3](https://arxiv.org/abs/2005.14165). As such, it was pretrained using the self-supervised causal language modeling objective.
For evaluation, OPT follows [GPT-3](https://arxiv.org/abs/2005.14165) by using their prompts and overall experimental setup. For more details, please read
the [official paper](https://arxiv.org/abs/2205.01068).
## Intended uses & limitations
The pretrained-only model can be used for prompting for evaluation of downstream tasks as well as text generation.
In addition, the model can be fine-tuned on a downstream task using the [CLM example](https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling). For all other OPT checkpoints, please have a look at the [model hub](https://huggingface.co/models?filter=opt).
### How to use
You can use this model directly with a pipeline for text generation.
```python
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model="facebook/opt-2.7b")
>>> generator("What are we having for dinner?")
[{'generated_text': "What are we having for dinner?\nI'm thinking pizza.\nI'm thinking tacos.\n"}]
```
By default, generation is deterministic. To use top-k sampling, set `do_sample` to `True`.
```python
>>> from transformers import pipeline, set_seed
>>> set_seed(32)
>>> generator = pipeline('text-generation', model="facebook/opt-2.7b", do_sample=True)
>>> generator("What are we having for dinner?")
[{'generated_text': "What are we having for dinner?\nJust pizza?\nWell, I suppose that would suffice."}]
```
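Conceptually, top-k sampling keeps only the `k` highest-scoring tokens and renormalizes before drawing. The following is a toy pure-Python sketch of that filtering step (not the actual `transformers` implementation, and the logits are made up for illustration):

```python
import math
import random

def top_k_sample(logits, k, rng=random.Random(0)):
    """Sample a token index from only the k highest-scoring logits."""
    # Indices of the k largest logits survive the filter.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax over the surviving logits only, then sample.
    weights = [math.exp(logits[i]) for i in top]
    total = sum(weights)
    probs = [w / total for w in weights]
    return rng.choices(top, weights=probs, k=1)[0]

logits = [2.0, -1.0, 0.5, 3.0, -2.0]
# With k=2, only indices 3 and 0 (the two largest logits) can ever be drawn.
samples = {top_k_sample(logits, k=2) for _ in range(100)}
print(samples)
```

Greedy decoding is the `k=1` special case, which is why un-sampled generation is deterministic.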
### Limitations and bias
As mentioned in Meta AI's model card, because the training data used for this model contains a lot of
unfiltered content from the internet, which is far from neutral, the model is strongly biased:
> Like other large language models for which the diversity (or lack thereof) of training
> data induces downstream impact on the quality of our model, OPT-175B has limitations in terms
> of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and
> hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern
> large language models.
Here's an example of how the model can have biased predictions:
```python
>>> from transformers import pipeline, set_seed
>>> set_seed(32)
>>> generator = pipeline('text-generation', model="facebook/opt-2.7b", do_sample=True, num_return_sequences=5)
>>> generator("The woman worked as a")
[{'generated_text': "The woman worked as a security guard at a nursery in the city's eastern district of Samut P"},
{'generated_text': 'The woman worked as a doctor in the Philippines. Officials in China allege she stole the coronavirus'},
{'generated_text': 'The woman worked as a teacher in the city of Krasnodar in south Russia. She'},
{'generated_text': 'The woman worked as a researcher and lecturer at the Russian Academy of Sciences in a laboratory dedicated to the'},
{'generated_text': 'The woman worked as a nanny on a property owned by Mr Fitton-Allen in the city'}]
```
compared to:
```python
>>> from transformers import pipeline, set_seed
>>> set_seed(32)
>>> generator = pipeline('text-generation', model="facebook/opt-2.7b", do_sample=True, num_return_sequences=5)
>>> generator("The man worked as a")
[{'generated_text': "The man worked as a security guard at a retirement home after being hired by the administrator's cousin,"},
{'generated_text': 'The man worked as a doctor in the Philippines.\n\nHe had hoped to work his way back'},
{'generated_text': 'The man worked as a teacher in the city of Krasnodar in south Russia.He'},
{'generated_text': 'The man worked as a researcher and his work on the topic predates the project, by many years'},
{'generated_text': 'The man worked as a chef in a restaurant for 40 years. How could this be so different from'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The Meta AI team wanted to train this model on a corpus as large as possible. It is composed of the union of the following 5 filtered datasets of textual documents:
- BookCorpus, which consists of more than 10K unpublished books,
- CC-Stories, which contains a subset of CommonCrawl data filtered to match the
story-like style of Winograd schemas,
- The Pile, from which *Pile-CC, OpenWebText2, USPTO, Project Gutenberg, OpenSubtitles, Wikipedia, DM Mathematics and HackerNews* were included,
- Pushshift.io Reddit dataset that was developed in Baumgartner et al. (2020) and processed in
Roller et al. (2021)
- CCNewsV2 containing an updated version of the English portion of the CommonCrawl News
dataset that was used in RoBERTa (Liu et al., 2019b)
The final training data contains 180B tokens corresponding to 800GB of data. The validation split was made of 200MB of the pretraining data, sampled proportionally
to each dataset’s size in the pretraining corpus.
The dataset might contain offensive content, as parts of the dataset are a subset of
public Common Crawl data, along with a subset of public Reddit data, which could contain sentences
that, if viewed directly, can be insulting, threatening, or might otherwise cause anxiety.
### Collection process
The dataset was collected from the internet and went through classic data processing algorithms and
re-formatting practices, including removing repetitive/non-informative text like *Chapter One* or
*This ebook by Project Gutenberg.*
## Training procedure
### Preprocessing
The texts are tokenized using the **GPT2** byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a
vocabulary size of 50272. The inputs are sequences of 2048 consecutive tokens.
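A fixed 2048-token context means the tokenized corpus is packed into consecutive, equal-length training sequences. A hypothetical sketch of that chunking step (a toy block size stands in for 2048; this is not metaseq's actual data pipeline):

```python
def pack_into_blocks(token_ids, block_size=2048):
    """Split a flat token stream into consecutive, equal-length blocks.

    Trailing tokens that do not fill a complete block are dropped,
    mirroring common causal-LM preprocessing.
    """
    n_blocks = len(token_ids) // block_size
    return [token_ids[i * block_size:(i + 1) * block_size]
            for i in range(n_blocks)]

stream = list(range(10))
print(pack_into_blocks(stream, block_size=4))  # [[0, 1, 2, 3], [4, 5, 6, 7]]
```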
The 175B model was trained on 992 *80GB A100 GPUs*. The training duration was roughly 33 days of continuous training.
### BibTeX entry and citation info
```bibtex
@misc{zhang2022opt,
title={OPT: Open Pre-trained Transformer Language Models},
author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer},
year={2022},
eprint={2205.01068},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 8,785 | [
[
-0.0169525146484375,
-0.06634521484375,
0.0201263427734375,
0.00751495361328125,
-0.0174407958984375,
-0.0203399658203125,
-0.0301666259765625,
-0.031341552734375,
0.0034122467041015625,
0.049957275390625,
-0.0517578125,
-0.02862548828125,
-0.046051025390625,
0.02166748046875,
-0.04180908203125,
0.093017578125,
-0.0029735565185546875,
0.0017843246459960938,
0.0013780593872070312,
0.0164947509765625,
-0.014556884765625,
-0.0389404296875,
-0.04998779296875,
-0.01010894775390625,
0.0249176025390625,
0.016845703125,
0.0518798828125,
0.044586181640625,
0.032562255859375,
0.0218658447265625,
-0.0022869110107421875,
0.006053924560546875,
-0.054046630859375,
-0.0176849365234375,
-0.005523681640625,
-0.030059814453125,
-0.022796630859375,
0.0150604248046875,
0.044097900390625,
0.0379638671875,
0.005764007568359375,
0.0154876708984375,
0.0096435546875,
0.042022705078125,
-0.035858154296875,
0.005138397216796875,
-0.056304931640625,
-0.005962371826171875,
-0.0223388671875,
0.00768280029296875,
-0.04827880859375,
-0.0205078125,
0.00879669189453125,
-0.0345458984375,
0.01873779296875,
-0.0064697265625,
0.09051513671875,
0.0237884521484375,
-0.0201416015625,
-0.0130767822265625,
-0.0516357421875,
0.06475830078125,
-0.0631103515625,
0.02032470703125,
0.0271148681640625,
0.00299072265625,
-0.00019884109497070312,
-0.06646728515625,
-0.047088623046875,
-0.00786590576171875,
-0.01354217529296875,
0.0202178955078125,
-0.0224761962890625,
-0.0028629302978515625,
0.0160675048828125,
0.025115966796875,
-0.044097900390625,
0.005535125732421875,
-0.04364013671875,
-0.0236968994140625,
0.055084228515625,
0.00024819374084472656,
0.0276336669921875,
-0.027679443359375,
-0.0213470458984375,
-0.0096435546875,
-0.04443359375,
-0.00402069091796875,
0.042388916015625,
0.03070068359375,
-0.014190673828125,
0.04827880859375,
-0.017486572265625,
0.05511474609375,
0.00048732757568359375,
0.00109100341796875,
0.034393310546875,
-0.042999267578125,
-0.0109100341796875,
-0.01035308837890625,
0.09136962890625,
0.022552490234375,
0.036285400390625,
0.0026035308837890625,
-0.002227783203125,
0.0060577392578125,
0.0196075439453125,
-0.055816650390625,
-0.007549285888671875,
0.02203369140625,
-0.042327880859375,
-0.03289794921875,
0.0015172958374023438,
-0.06842041015625,
-0.0007848739624023438,
-0.01342010498046875,
0.022918701171875,
-0.03228759765625,
-0.028289794921875,
0.011962890625,
-0.0069732666015625,
0.0181427001953125,
0.0013628005981445312,
-0.06005859375,
0.005908966064453125,
0.0390625,
0.053375244140625,
-0.00612640380859375,
-0.0302734375,
-0.0188751220703125,
-0.0111083984375,
-0.01123809814453125,
0.039520263671875,
-0.034332275390625,
-0.00035071372985839844,
0.013671875,
0.007732391357421875,
-0.0103302001953125,
-0.0211181640625,
0.05859375,
-0.033935546875,
0.04388427734375,
-0.0090179443359375,
-0.026458740234375,
-0.002716064453125,
-0.002101898193359375,
-0.047821044921875,
0.0810546875,
0.0122528076171875,
-0.08465576171875,
0.0276031494140625,
-0.04766845703125,
-0.032196044921875,
-0.00395965576171875,
0.007640838623046875,
-0.03289794921875,
-0.01087188720703125,
0.028839111328125,
0.03363037109375,
-0.0207366943359375,
0.034332275390625,
-0.01031494140625,
-0.0152435302734375,
0.01104736328125,
-0.04156494140625,
0.08447265625,
0.027191162109375,
-0.0298919677734375,
0.0012674331665039062,
-0.049530029296875,
-0.003520965576171875,
0.0226287841796875,
-0.0300140380859375,
-0.01381683349609375,
0.007801055908203125,
0.01477813720703125,
0.02203369140625,
0.0228271484375,
-0.041046142578125,
0.00963592529296875,
-0.047515869140625,
0.053558349609375,
0.056488037109375,
-0.0150299072265625,
0.03179931640625,
-0.0160064697265625,
0.034454345703125,
0.00435638427734375,
0.018707275390625,
-0.023956298828125,
-0.0333251953125,
-0.06646728515625,
-0.015777587890625,
0.026092529296875,
0.044525146484375,
-0.0543212890625,
0.044342041015625,
-0.0244140625,
-0.047271728515625,
-0.046661376953125,
-0.0003135204315185547,
0.03369140625,
0.0263214111328125,
0.036956787109375,
-0.0035915374755859375,
-0.0543212890625,
-0.06597900390625,
-0.0299072265625,
-0.00811767578125,
0.00033664703369140625,
0.022247314453125,
0.046295166015625,
-0.035491943359375,
0.0826416015625,
-0.0447998046875,
-0.0174407958984375,
-0.040557861328125,
-0.0009083747863769531,
0.0257720947265625,
0.037445068359375,
0.0345458984375,
-0.06231689453125,
-0.04583740234375,
-0.01436614990234375,
-0.048675537109375,
-0.0130615234375,
-0.0097503662109375,
-0.023162841796875,
0.031341552734375,
0.04132080078125,
-0.059661865234375,
0.0131988525390625,
0.047332763671875,
-0.02239990234375,
0.0540771484375,
0.00975799560546875,
-0.016815185546875,
-0.10467529296875,
0.01548004150390625,
-0.00366973876953125,
-0.0141754150390625,
-0.04766845703125,
-0.00839996337890625,
-0.003936767578125,
-0.0157318115234375,
-0.0440673828125,
0.044891357421875,
-0.0269317626953125,
0.0218658447265625,
-0.0017499923706054688,
0.006656646728515625,
-0.0131988525390625,
0.038604736328125,
0.009918212890625,
0.049163818359375,
0.044769287109375,
-0.044219970703125,
0.008636474609375,
0.0216217041015625,
-0.0220184326171875,
0.016876220703125,
-0.050445556640625,
0.004825592041015625,
-0.019683837890625,
0.021087646484375,
-0.06231689453125,
-0.0266265869140625,
0.0248870849609375,
-0.0455322265625,
0.0150146484375,
0.00911712646484375,
-0.040283203125,
-0.052337646484375,
-0.00986480712890625,
0.0170440673828125,
0.049774169921875,
-0.03564453125,
0.03887939453125,
0.0293426513671875,
-0.0030002593994140625,
-0.0595703125,
-0.052093505859375,
0.00047206878662109375,
-0.01074981689453125,
-0.052947998046875,
0.0233306884765625,
-0.005279541015625,
-0.007476806640625,
0.006816864013671875,
0.01303863525390625,
-0.0125885009765625,
-0.004978179931640625,
0.00312042236328125,
0.0197296142578125,
-0.0098724365234375,
0.0030517578125,
-0.0008082389831542969,
-0.01074981689453125,
0.00811004638671875,
-0.0262298583984375,
0.059051513671875,
-0.007610321044921875,
-0.003360748291015625,
-0.03173828125,
0.0159759521484375,
0.0303497314453125,
-0.02618408203125,
0.06256103515625,
0.0556640625,
-0.0264129638671875,
-0.013427734375,
-0.038848876953125,
-0.0198974609375,
-0.03839111328125,
0.051025390625,
0.0033130645751953125,
-0.0672607421875,
0.0260467529296875,
0.013519287109375,
0.01152801513671875,
0.053436279296875,
0.045867919921875,
0.01202392578125,
0.0726318359375,
0.04217529296875,
-0.0185546875,
0.041168212890625,
-0.02081298828125,
0.0190582275390625,
-0.043487548828125,
-0.0003323554992675781,
-0.0433349609375,
-0.0013113021850585938,
-0.044891357421875,
-0.021240234375,
0.004459381103515625,
0.0015897750854492188,
-0.0304718017578125,
0.03839111328125,
-0.048553466796875,
0.034576416015625,
0.04791259765625,
0.009765625,
0.004077911376953125,
0.0016126632690429688,
-0.0087432861328125,
-0.0018405914306640625,
-0.059814453125,
-0.04364013671875,
0.0958251953125,
0.031494140625,
0.04937744140625,
-0.0251922607421875,
0.05853271484375,
0.0111236572265625,
0.0234832763671875,
-0.03369140625,
0.04278564453125,
-0.0207366943359375,
-0.0673828125,
-0.01555633544921875,
-0.0416259765625,
-0.07720947265625,
0.00766754150390625,
-0.01390838623046875,
-0.0572509765625,
-0.005619049072265625,
0.0101776123046875,
-0.0102386474609375,
0.0198974609375,
-0.06427001953125,
0.08477783203125,
-0.0232391357421875,
-0.0262451171875,
0.0008482933044433594,
-0.053619384765625,
0.034912109375,
-0.0101470947265625,
0.0250701904296875,
0.0153656005859375,
0.0210113525390625,
0.06683349609375,
-0.034332275390625,
0.07891845703125,
-0.006683349609375,
0.0028438568115234375,
0.0384521484375,
-0.017822265625,
0.0343017578125,
-0.008575439453125,
-0.01036834716796875,
0.0299530029296875,
-0.01409912109375,
-0.0219268798828125,
-0.0011653900146484375,
0.035369873046875,
-0.07879638671875,
-0.030242919921875,
-0.031585693359375,
-0.034454345703125,
0.01325225830078125,
0.044342041015625,
0.052337646484375,
0.0270843505859375,
-0.0148773193359375,
0.0224456787109375,
0.03082275390625,
-0.04180908203125,
0.03802490234375,
0.0226287841796875,
-0.0133514404296875,
-0.0311126708984375,
0.05853271484375,
0.005062103271484375,
0.0236053466796875,
0.034637451171875,
0.01482391357421875,
-0.027099609375,
-0.0224456787109375,
-0.0206298828125,
0.03118896484375,
-0.04827880859375,
-0.0198822021484375,
-0.07244873046875,
-0.036773681640625,
-0.041961669921875,
-0.01532745361328125,
-0.040374755859375,
-0.0080718994140625,
-0.04156494140625,
-0.010772705078125,
0.0132904052734375,
0.042999267578125,
0.0021820068359375,
0.035675048828125,
-0.052764892578125,
0.0197296142578125,
0.0084381103515625,
0.0192718505859375,
-0.00403594970703125,
-0.03717041015625,
-0.0227813720703125,
0.0262451171875,
-0.0382080078125,
-0.064208984375,
0.04083251953125,
0.007740020751953125,
0.03936767578125,
0.038360595703125,
0.012481689453125,
0.0289306640625,
-0.04052734375,
0.07208251953125,
0.00687408447265625,
-0.07379150390625,
0.03802490234375,
-0.0391845703125,
0.0214080810546875,
0.039825439453125,
0.0396728515625,
-0.031585693359375,
-0.04278564453125,
-0.05389404296875,
-0.0760498046875,
0.0733642578125,
0.0301971435546875,
0.0276031494140625,
-0.01274871826171875,
0.0260467529296875,
0.00466156005859375,
0.01727294921875,
-0.1055908203125,
-0.0245361328125,
-0.0262451171875,
-0.0291290283203125,
-0.0139312744140625,
-0.025634765625,
0.01389312744140625,
-0.021575927734375,
0.06011962890625,
0.00408935546875,
0.034454345703125,
0.0179443359375,
-0.0196533203125,
-0.005260467529296875,
0.01425933837890625,
0.0303497314453125,
0.04217529296875,
-0.00679779052734375,
0.0004754066467285156,
0.005939483642578125,
-0.046905517578125,
-0.0078887939453125,
0.0162353515625,
-0.0364990234375,
-0.0017461776733398438,
0.03204345703125,
0.07470703125,
-0.0015287399291992188,
-0.052337646484375,
0.047210693359375,
0.0052032470703125,
-0.0224456787109375,
-0.0343017578125,
0.0015287399291992188,
0.00690460205078125,
0.002506256103515625,
0.0239105224609375,
0.00592041015625,
-0.0060272216796875,
-0.033905029296875,
0.0184326171875,
0.032501220703125,
-0.024444580078125,
-0.0255584716796875,
0.06439208984375,
0.02166748046875,
-0.0258026123046875,
0.059356689453125,
-0.0221405029296875,
-0.060699462890625,
0.04248046875,
0.051727294921875,
0.06829833984375,
-0.01001739501953125,
0.032928466796875,
0.054168701171875,
0.050537109375,
-0.01290130615234375,
0.00499725341796875,
0.0186004638671875,
-0.06463623046875,
-0.04248046875,
-0.059356689453125,
0.0004277229309082031,
0.0266571044921875,
-0.035888671875,
0.041412353515625,
-0.00888824462890625,
-0.0084686279296875,
-0.0128021240234375,
-0.0098419189453125,
-0.057373046875,
0.0146331787109375,
0.006618499755859375,
0.06304931640625,
-0.080810546875,
0.04742431640625,
0.04315185546875,
-0.0386962890625,
-0.06304931640625,
0.00948333740234375,
-0.0185699462890625,
-0.060638427734375,
0.046417236328125,
0.0406494140625,
0.0283203125,
0.0150604248046875,
-0.062103271484375,
-0.07330322265625,
0.0673828125,
0.023834228515625,
-0.035186767578125,
-0.0106658935546875,
0.0279388427734375,
0.053253173828125,
-0.0200653076171875,
0.032745361328125,
0.033111572265625,
0.03607177734375,
-0.014801025390625,
-0.06494140625,
0.0090789794921875,
-0.0211334228515625,
-0.015899658203125,
0.006511688232421875,
-0.054534912109375,
0.08514404296875,
-0.0124053955078125,
-0.021728515625,
-0.0064544677734375,
0.03680419921875,
-0.00047969818115234375,
0.00786590576171875,
0.03277587890625,
0.047515869140625,
0.044036865234375,
-0.01531219482421875,
0.09222412109375,
-0.0289154052734375,
0.043426513671875,
0.07122802734375,
-0.000522613525390625,
0.057708740234375,
0.019500732421875,
-0.0199127197265625,
0.0335693359375,
0.048126220703125,
-0.0114593505859375,
0.03875732421875,
-0.00048828125,
0.01253509521484375,
-0.01309967041015625,
-0.0037822723388671875,
-0.032623291015625,
0.036163330078125,
0.00928497314453125,
-0.046356201171875,
-0.0101776123046875,
-0.0018310546875,
0.0259552001953125,
-0.0117950439453125,
-0.0139312744140625,
0.0517578125,
0.00366973876953125,
-0.07391357421875,
0.0465087890625,
0.00539398193359375,
0.0611572265625,
-0.055084228515625,
0.0189208984375,
-0.0019817352294921875,
0.0247344970703125,
-0.012451171875,
-0.043243408203125,
0.019805908203125,
0.004489898681640625,
-0.013214111328125,
-0.024261474609375,
0.062225341796875,
-0.03961181640625,
-0.05029296875,
0.0230560302734375,
0.0276336669921875,
0.0118560791015625,
-0.0169219970703125,
-0.057373046875,
0.00423431396484375,
0.0167694091796875,
-0.0295257568359375,
0.0096893310546875,
0.019805908203125,
0.0117950439453125,
0.0386962890625,
0.0550537109375,
0.0019521713256835938,
0.01035308837890625,
-0.01021575927734375,
0.07293701171875,
-0.03985595703125,
-0.03826904296875,
-0.07513427734375,
0.054656982421875,
-0.0086212158203125,
-0.0281219482421875,
0.0614013671875,
0.046783447265625,
0.08172607421875,
-0.0176849365234375,
0.0732421875,
-0.0235443115234375,
0.033721923828125,
-0.03399658203125,
0.05859375,
-0.046600341796875,
-0.005035400390625,
-0.04620361328125,
-0.0765380859375,
-0.010101318359375,
0.050872802734375,
-0.032135009765625,
0.0164947509765625,
0.062744140625,
0.05718994140625,
-0.0004062652587890625,
-0.00691986083984375,
-0.005901336669921875,
0.036651611328125,
0.0218658447265625,
0.04620361328125,
0.05401611328125,
-0.040069580078125,
0.06005859375,
-0.0244598388671875,
-0.020782470703125,
-0.0184478759765625,
-0.062225341796875,
-0.07965087890625,
-0.047210693359375,
-0.0150604248046875,
-0.03497314453125,
-0.0025730133056640625,
0.05419921875,
0.04534912109375,
-0.050811767578125,
-0.011627197265625,
-0.032745361328125,
0.003582000732421875,
-0.01222991943359375,
-0.02392578125,
0.031402587890625,
-0.0296783447265625,
-0.065185546875,
0.000579833984375,
-0.00868988037109375,
-0.005146026611328125,
-0.0175933837890625,
-0.0023632049560546875,
-0.0300445556640625,
0.002902984619140625,
0.041046142578125,
0.004779815673828125,
-0.04248046875,
-0.01094818115234375,
0.01230621337890625,
-0.0084228515625,
-0.006511688232421875,
0.039825439453125,
-0.043182373046875,
0.0283660888671875,
0.037017822265625,
0.041717529296875,
0.031158447265625,
0.0030918121337890625,
0.042083740234375,
-0.0477294921875,
0.01312255859375,
0.0165557861328125,
0.0279388427734375,
0.022979736328125,
-0.0352783203125,
0.035003662109375,
0.0230255126953125,
-0.05401611328125,
-0.067138671875,
0.014801025390625,
-0.0670166015625,
-0.0225677490234375,
0.112060546875,
-0.0010442733764648438,
-0.0168609619140625,
0.0016307830810546875,
-0.027984619140625,
0.03240966796875,
-0.031005859375,
0.0509033203125,
0.058746337890625,
0.0216522216796875,
-0.005157470703125,
-0.04052734375,
0.036956787109375,
0.029571533203125,
-0.061798095703125,
0.006916046142578125,
0.033721923828125,
0.0222625732421875,
0.0150146484375,
0.0623779296875,
-0.007625579833984375,
0.0038909912109375,
0.0008673667907714844,
0.0145263671875,
0.00007349252700805664,
-0.018707275390625,
-0.00791168212890625,
0.00443267822265625,
-0.0225982666015625,
0.0010890960693359375
]
] |
DionTimmer/controlnet_qrcode-control_v1p_sd15 | 2023-06-15T23:34:29.000Z | [
"diffusers",
"stable-diffusion",
"controlnet",
"image-to-image",
"en",
"license:openrail++",
"has_space",
"diffusers:ControlNetModel",
"region:us"
] | image-to-image | DionTimmer | null | null | DionTimmer/controlnet_qrcode-control_v1p_sd15 | 187 | 94,524 | diffusers | 2023-06-15T21:50:00 | ---
tags:
- stable-diffusion
- controlnet
- image-to-image
license: openrail++
language:
- en
library_name: diffusers
pipeline_tag: image-to-image
---
# QR Code Conditioned ControlNet Models for Stable Diffusion 1.5

## Model Description
This repo holds the safetensors & diffusers versions of the QR code conditioned ControlNet for Stable Diffusion v1.5.
The Stable Diffusion 2.1 version is marginally more effective, since it was developed to address my specific needs. However, this 1.5 version was trained on the same dataset, for those who are still using the older base model.
## How to use with Diffusers
```bash
pip -q install diffusers transformers accelerate torch xformers
```
```python
import torch
from PIL import Image
from diffusers import StableDiffusionControlNetImg2ImgPipeline, ControlNetModel, DDIMScheduler
from diffusers.utils import load_image
controlnet = ControlNetModel.from_pretrained("DionTimmer/controlnet_qrcode-control_v1p_sd15",
torch_dtype=torch.float16)
pipe = StableDiffusionControlNetImg2ImgPipeline.from_pretrained(
"runwayml/stable-diffusion-v1-5",
controlnet=controlnet,
safety_checker=None,
torch_dtype=torch.float16
)
pipe.enable_xformers_memory_efficient_attention()
pipe.scheduler = DDIMScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()
def resize_for_condition_image(input_image: Image, resolution: int):
input_image = input_image.convert("RGB")
W, H = input_image.size
k = float(resolution) / min(H, W)
H *= k
W *= k
H = int(round(H / 64.0)) * 64
W = int(round(W / 64.0)) * 64
img = input_image.resize((W, H), resample=Image.LANCZOS)
return img
# play with guidance_scale, controlnet_conditioning_scale and strength to make a valid QR Code Image
# qr code image
source_image = load_image("https://s3.amazonaws.com/moonup/production/uploads/6064e095abd8d3692e3e2ed6/A_RqHaAM6YHBodPLwqtjn.png")
# initial image, anything
init_image = load_image("https://s3.amazonaws.com/moonup/production/uploads/noauth/KfMBABpOwIuNolv1pe3qX.jpeg")
condition_image = resize_for_condition_image(source_image, 768)
init_image = resize_for_condition_image(init_image, 768)
generator = torch.manual_seed(123121231)
image = pipe(prompt="a billboard in NYC with a qrcode",
negative_prompt="ugly, disfigured, low quality, blurry, nsfw",
image=init_image,
control_image=condition_image,
width=768,
height=768,
guidance_scale=20,
controlnet_conditioning_scale=1.5,
generator=generator,
strength=0.9,
num_inference_steps=150,
)
image.images[0]
```
## Performance and Limitations
These models perform quite well in most cases, but please note that they are not 100% accurate. In some instances, the QR code shape might not come through as expected. You can increase the ControlNet weight to emphasize the QR code shape; however, be cautious, as this might negatively impact the style of your output. **To optimize for scanning, please generate your QR codes with error correction mode 'H' (30%).**
To balance style and shape, gentle fine-tuning of the control weight may be required based on the individual input and the desired output, as well as the right prompt. Some prompts do not work until you increase the weight considerably. Finding the right balance between these factors is part art and part science. For the best results, generate your artwork at a resolution of 768; this allows a higher level of detail in the final product, enhancing the quality and effectiveness of the QR code-based artwork.
## Installation
The simplest way to use this is to place the .safetensors model and its .yaml config file in the folder where your other controlnet models are installed, which varies per application.
For usage in AUTOMATIC1111, they can be placed in the webui/models/ControlNet folder. They can be loaded with the ControlNet webui extension, which you can install through the Extensions tab in the webui (https://github.com/Mikubill/sd-webui-controlnet). Make sure to enable your ControlNet unit and set your input image as the QR code. Set the model to either the SD2.1 or 1.5 version, depending on your base Stable Diffusion model, or it will error. No pre-processor is needed, though you can use the invert pre-processor for a different variation of results. 768 is the preferred resolution for generation, since it allows for more detail.
Make sure to look up additional information on how to use ControlNet if you get stuck; once you have the webui up and running, it's easy to install the ControlNet extension as well. | 4,826 | [
[
-0.0249786376953125,
-0.006542205810546875,
0.0034942626953125,
0.027587890625,
-0.034881591796875,
-0.00921630859375,
0.01885986328125,
-0.0189056396484375,
0.0165557861328125,
0.039794921875,
-0.01171875,
-0.028961181640625,
-0.04638671875,
0.00443267822265625,
-0.01110076904296875,
0.05682373046875,
-0.007476806640625,
0.0038509368896484375,
0.023712158203125,
0.0019664764404296875,
-0.0183868408203125,
-0.005947113037109375,
-0.08026123046875,
-0.0258941650390625,
0.03338623046875,
0.0245513916015625,
0.0587158203125,
0.050262451171875,
0.040191650390625,
0.0221405029296875,
0.003833770751953125,
-0.006023406982421875,
-0.03216552734375,
-0.00875091552734375,
0.012725830078125,
-0.02392578125,
-0.0322265625,
-0.0099029541015625,
0.034576416015625,
-0.00023126602172851562,
-0.01305389404296875,
0.0007028579711914062,
-0.00475311279296875,
0.06591796875,
-0.05474853515625,
-0.001804351806640625,
-0.0067291259765625,
0.01739501953125,
0.00994110107421875,
-0.0294952392578125,
-0.015777587890625,
-0.042144775390625,
-0.0144805908203125,
-0.051971435546875,
0.01145172119140625,
0.0014295578002929688,
0.0904541015625,
0.0255889892578125,
-0.051605224609375,
-0.01534271240234375,
-0.06329345703125,
0.042755126953125,
-0.058441162109375,
0.0210723876953125,
0.0305328369140625,
0.025238037109375,
-0.01216888427734375,
-0.08465576171875,
-0.037139892578125,
-0.027191162109375,
0.006351470947265625,
0.032745361328125,
-0.04058837890625,
0.00679779052734375,
0.043609619140625,
0.0164642333984375,
-0.049346923828125,
-0.006153106689453125,
-0.05230712890625,
-0.0150299072265625,
0.04925537109375,
0.024627685546875,
0.0304412841796875,
-0.01224517822265625,
-0.03656005859375,
-0.02618408203125,
-0.024871826171875,
0.038116455078125,
0.0173797607421875,
-0.0220489501953125,
-0.033294677734375,
0.034271240234375,
-0.017913818359375,
0.03436279296875,
0.051055908203125,
-0.008819580078125,
0.0255584716796875,
-0.0240936279296875,
-0.0281524658203125,
-0.0087738037109375,
0.0787353515625,
0.04669189453125,
0.0009393692016601562,
0.00971221923828125,
-0.021331787109375,
-0.0127716064453125,
0.02825927734375,
-0.08782958984375,
-0.04962158203125,
0.039825439453125,
-0.0465087890625,
-0.027557373046875,
0.0033626556396484375,
-0.041717529296875,
-0.0170135498046875,
0.0096893310546875,
0.042755126953125,
-0.0296478271484375,
-0.027496337890625,
0.0258331298828125,
-0.036651611328125,
0.0236053466796875,
0.026275634765625,
-0.044464111328125,
0.0037746429443359375,
-0.0050048828125,
0.0662841796875,
0.01305389404296875,
-0.0001494884490966797,
-0.0264434814453125,
0.0010852813720703125,
-0.047760009765625,
0.031463623046875,
-0.0019397735595703125,
-0.029266357421875,
-0.0258331298828125,
0.0211944580078125,
0.0089874267578125,
-0.040313720703125,
0.052276611328125,
-0.07513427734375,
0.0005717277526855469,
0.0035247802734375,
-0.025238037109375,
-0.0157012939453125,
-0.006534576416015625,
-0.06219482421875,
0.06805419921875,
0.03216552734375,
-0.0780029296875,
0.01329803466796875,
-0.0450439453125,
-0.0112152099609375,
0.006992340087890625,
-0.004703521728515625,
-0.0535888671875,
-0.01708984375,
-0.01172637939453125,
0.0255279541015625,
0.003757476806640625,
-0.003009796142578125,
-0.004909515380859375,
-0.037109375,
0.022369384765625,
-0.0128173828125,
0.10272216796875,
0.036865234375,
-0.045257568359375,
0.01629638671875,
-0.059234619140625,
0.032806396484375,
0.006145477294921875,
-0.03961181640625,
-0.0007448196411132812,
-0.021240234375,
0.024993896484375,
0.0282440185546875,
0.0219879150390625,
-0.032379150390625,
0.00914764404296875,
-0.01458740234375,
0.045257568359375,
0.032623291015625,
0.01309967041015625,
0.04486083984375,
-0.036041259765625,
0.060089111328125,
0.01079559326171875,
0.02960205078125,
0.0253143310546875,
-0.0199737548828125,
-0.053619384765625,
-0.030059814453125,
0.0192718505859375,
0.0521240234375,
-0.08294677734375,
0.044403076171875,
-0.00679779052734375,
-0.059906005859375,
-0.0102386474609375,
-0.0098876953125,
0.0242767333984375,
0.025726318359375,
0.0165252685546875,
-0.030181884765625,
-0.031829833984375,
-0.06329345703125,
0.03826904296875,
0.01245880126953125,
-0.0247802734375,
-0.0073089599609375,
0.04248046875,
-0.0164337158203125,
0.05487060546875,
-0.0260772705078125,
-0.00685882568359375,
-0.005550384521484375,
0.00177001953125,
0.0251922607421875,
0.071044921875,
0.049835205078125,
-0.06964111328125,
-0.041778564453125,
-0.01041412353515625,
-0.04669189453125,
0.007335662841796875,
-0.01763916015625,
-0.0277862548828125,
-0.0015172958374023438,
0.0226287841796875,
-0.045928955078125,
0.060546875,
0.03729248046875,
-0.04412841796875,
0.06964111328125,
-0.043670654296875,
0.0282440185546875,
-0.08074951171875,
0.0099639892578125,
0.012664794921875,
-0.021331787109375,
-0.03839111328125,
0.01461029052734375,
0.0338134765625,
0.0010881423950195312,
-0.034576416015625,
0.0350341796875,
-0.036285400390625,
0.0213623046875,
-0.031005859375,
-0.0297088623046875,
0.02935791015625,
0.03570556640625,
0.0004134178161621094,
0.061920166015625,
0.0550537109375,
-0.063720703125,
0.047027587890625,
-0.00004976987838745117,
-0.02655029296875,
0.00662994384765625,
-0.08172607421875,
-0.0013751983642578125,
0.0056304931640625,
0.026611328125,
-0.0687255859375,
-0.007656097412109375,
0.0579833984375,
-0.043670654296875,
0.0271453857421875,
-0.017791748046875,
-0.0123138427734375,
-0.03509521484375,
-0.034912109375,
0.033355712890625,
0.061065673828125,
-0.024871826171875,
0.035186767578125,
0.0030689239501953125,
0.026824951171875,
-0.0399169921875,
-0.06597900390625,
-0.007568359375,
-0.021820068359375,
-0.041717529296875,
0.029876708984375,
-0.00991058349609375,
-0.0024280548095703125,
-0.007495880126953125,
-0.0021686553955078125,
-0.02325439453125,
-0.00034546852111816406,
0.0265960693359375,
0.0011053085327148438,
-0.005382537841796875,
-0.011077880859375,
0.007427215576171875,
-0.023834228515625,
0.009674072265625,
-0.023040771484375,
0.0274200439453125,
0.01172637939453125,
-0.01163482666015625,
-0.060882568359375,
0.0206451416015625,
0.0537109375,
-0.01349639892578125,
0.04547119140625,
0.058624267578125,
-0.03875732421875,
-0.0037746429443359375,
-0.01445770263671875,
-0.01258087158203125,
-0.040191650390625,
0.0167236328125,
-0.0277252197265625,
-0.030914306640625,
0.054534912109375,
0.0108489990234375,
-0.0032939910888671875,
0.0245819091796875,
0.036041259765625,
-0.0154876708984375,
0.0836181640625,
0.047576904296875,
0.0211334228515625,
0.052276611328125,
-0.059967041015625,
0.00820159912109375,
-0.078857421875,
-0.014617919921875,
-0.0352783203125,
-0.01416015625,
-0.034454345703125,
-0.0283203125,
0.036773681640625,
0.05340576171875,
-0.05584716796875,
0.01873779296875,
-0.05792236328125,
0.007610321044921875,
0.0516357421875,
0.036407470703125,
0.00356292724609375,
-0.0007619857788085938,
-0.011077880859375,
0.0027027130126953125,
-0.059661865234375,
-0.043975830078125,
0.063720703125,
0.022064208984375,
0.06768798828125,
0.00704193115234375,
0.043914794921875,
0.0237884521484375,
-0.0005049705505371094,
-0.0428466796875,
0.0158843994140625,
0.00293731689453125,
-0.03436279296875,
-0.0198211669921875,
-0.0233154296875,
-0.090087890625,
0.0034084320068359375,
-0.0214080810546875,
-0.03369140625,
0.04473876953125,
0.0183258056640625,
-0.042388916015625,
0.0214385986328125,
-0.053253173828125,
0.0565185546875,
-0.0223541259765625,
-0.0308074951171875,
0.0131988525390625,
-0.03662109375,
0.01885986328125,
0.00911712646484375,
-0.0035552978515625,
0.00812530517578125,
-0.01016998291015625,
0.0638427734375,
-0.06622314453125,
0.048736572265625,
-0.01303863525390625,
-0.004306793212890625,
0.0292510986328125,
0.00350189208984375,
0.0294189453125,
0.00957489013671875,
-0.0205535888671875,
0.0001233816146850586,
0.033294677734375,
-0.0521240234375,
-0.03961181640625,
0.026641845703125,
-0.07098388671875,
-0.01348114013671875,
-0.029632568359375,
-0.01470947265625,
0.0380859375,
0.0199127197265625,
0.07073974609375,
0.047943115234375,
0.0223541259765625,
0.004016876220703125,
0.05291748046875,
-0.01259613037109375,
0.0276947021484375,
0.014251708984375,
-0.0255889892578125,
-0.044281005859375,
0.048675537109375,
0.0178070068359375,
0.01352691650390625,
0.013671875,
0.014617919921875,
-0.0219879150390625,
-0.050506591796875,
-0.05108642578125,
-0.00673675537109375,
-0.056671142578125,
-0.042083740234375,
-0.0399169921875,
-0.0299835205078125,
-0.0299224853515625,
-0.0149078369140625,
-0.017181396484375,
-0.01250457763671875,
-0.043304443359375,
0.0174407958984375,
0.06243896484375,
0.034942626953125,
-0.02764892578125,
0.031219482421875,
-0.0576171875,
0.0282745361328125,
0.0253448486328125,
0.0262603759765625,
0.01464080810546875,
-0.052001953125,
-0.03326416015625,
0.017364501953125,
-0.027862548828125,
-0.0782470703125,
0.034698486328125,
-0.01041412353515625,
0.0255279541015625,
0.04534912109375,
0.03619384765625,
0.03399658203125,
-0.014404296875,
0.050506591796875,
0.033935546875,
-0.06048583984375,
0.03643798828125,
-0.03167724609375,
0.0277862548828125,
0.006122589111328125,
0.047576904296875,
-0.031158447265625,
-0.0208587646484375,
-0.03582763671875,
-0.05572509765625,
0.03131103515625,
0.0240936279296875,
-0.0013322830200195312,
0.01116180419921875,
0.050323486328125,
-0.025970458984375,
-0.01268768310546875,
-0.06060791015625,
-0.033599853515625,
-0.032684326171875,
0.00006097555160522461,
0.00909423828125,
-0.01435089111328125,
-0.0022716522216796875,
-0.0311279296875,
0.051849365234375,
0.00708770751953125,
0.040771484375,
0.03643798828125,
0.00652313232421875,
-0.0287017822265625,
-0.0005440711975097656,
0.03955078125,
0.057403564453125,
-0.01739501953125,
0.0015048980712890625,
-0.0157012939453125,
-0.06036376953125,
0.0260467529296875,
-0.0011081695556640625,
-0.03216552734375,
-0.0093841552734375,
0.015960693359375,
0.052154541015625,
-0.0135650634765625,
-0.022369384765625,
0.03363037109375,
-0.026336669921875,
-0.0361328125,
-0.03521728515625,
0.0250091552734375,
0.013641357421875,
0.041595458984375,
0.04302978515625,
0.02325439453125,
0.0181884765625,
0.0007085800170898438,
0.00838470458984375,
0.021728515625,
-0.014495849609375,
-0.005390167236328125,
0.06500244140625,
0.00017309188842773438,
-0.0223541259765625,
0.042816162109375,
-0.054840087890625,
-0.045745849609375,
0.0880126953125,
0.04248046875,
0.06268310546875,
0.0012216567993164062,
0.0220947265625,
0.060882568359375,
0.0278778076171875,
-0.0025482177734375,
0.04302978515625,
0.01122283935546875,
-0.0645751953125,
-0.01763916015625,
-0.0232696533203125,
-0.0172576904296875,
-0.0027828216552734375,
-0.05853271484375,
0.027008056640625,
-0.047393798828125,
-0.027557373046875,
-0.0232391357421875,
0.0304412841796875,
-0.050567626953125,
0.027374267578125,
-0.00920867919921875,
0.06964111328125,
-0.048675537109375,
0.07470703125,
0.05145263671875,
-0.047576904296875,
-0.09619140625,
-0.0174713134765625,
-0.0290679931640625,
-0.03564453125,
0.052703857421875,
-0.01116943359375,
-0.0205078125,
0.02734375,
-0.05609130859375,
-0.06378173828125,
0.10443115234375,
0.004993438720703125,
-0.0277862548828125,
0.017791748046875,
-0.027923583984375,
0.0389404296875,
-0.0214080810546875,
0.04266357421875,
0.002269744873046875,
0.0225982666015625,
0.00902557373046875,
-0.057373046875,
0.0290679931640625,
-0.03387451171875,
0.0245513916015625,
0.006649017333984375,
-0.050506591796875,
0.0755615234375,
-0.0267333984375,
-0.027435302734375,
0.01255035400390625,
0.0482177734375,
0.026275634765625,
0.01751708984375,
0.0290985107421875,
0.0413818359375,
0.050567626953125,
0.01297760009765625,
0.06329345703125,
-0.0247802734375,
0.0355224609375,
0.043670654296875,
0.0018777847290039062,
0.04718017578125,
0.0159912109375,
-0.0149383544921875,
0.032501220703125,
0.07037353515625,
-0.017181396484375,
0.04034423828125,
0.031982421875,
-0.01082611083984375,
-0.0003218650817871094,
0.023651123046875,
-0.05108642578125,
0.0137939453125,
0.01189422607421875,
-0.0172576904296875,
-0.01012420654296875,
0.02886962890625,
-0.0099639892578125,
-0.0190887451171875,
-0.033599853515625,
0.0236358642578125,
-0.0163421630859375,
-0.01235198974609375,
0.076171875,
0.014892578125,
0.081787109375,
-0.038818359375,
-0.01108551025390625,
-0.01113128662109375,
-0.0004048347473144531,
-0.03594970703125,
-0.0462646484375,
0.0166015625,
-0.0117340087890625,
-0.026641845703125,
-0.002536773681640625,
0.048919677734375,
-0.020111083984375,
-0.032623291015625,
0.0190887451171875,
0.0180816650390625,
0.033111572265625,
0.0089874267578125,
-0.0655517578125,
0.0189666748046875,
0.01267242431640625,
-0.0281982421875,
0.01373291015625,
0.038299560546875,
0.00560760498046875,
0.06475830078125,
0.0303955078125,
0.0023250579833984375,
0.028350830078125,
-0.015045166015625,
0.06463623046875,
-0.047271728515625,
-0.03631591796875,
-0.0303955078125,
0.0465087890625,
0.0119476318359375,
-0.033843994140625,
0.045745849609375,
0.043853759765625,
0.05462646484375,
-0.02398681640625,
0.058502197265625,
-0.0216217041015625,
0.008819580078125,
-0.048126220703125,
0.08282470703125,
-0.0494384765625,
0.00027561187744140625,
-0.00551605224609375,
-0.0335693359375,
-0.00940704345703125,
0.07171630859375,
-0.01519775390625,
0.01849365234375,
0.036529541015625,
0.08074951171875,
-0.004638671875,
-0.029052734375,
0.0157470703125,
-0.006359100341796875,
0.01361846923828125,
0.049652099609375,
0.051788330078125,
-0.0723876953125,
0.041046142578125,
-0.061004638671875,
-0.0294189453125,
-0.01296234130859375,
-0.06707763671875,
-0.043365478515625,
-0.03936767578125,
-0.05975341796875,
-0.0633544921875,
-0.016632080078125,
0.06243896484375,
0.0655517578125,
-0.041778564453125,
-0.01335906982421875,
-0.01296234130859375,
0.0166015625,
-0.018890380859375,
-0.0223541259765625,
0.035430908203125,
0.01042938232421875,
-0.053802490234375,
-0.01458740234375,
-0.004306793212890625,
0.032867431640625,
-0.002933502197265625,
-0.0229949951171875,
-0.01239013671875,
-0.016845703125,
0.01305389404296875,
0.038482666015625,
-0.03387451171875,
-0.0120849609375,
-0.00962066650390625,
-0.03948974609375,
0.0164337158203125,
0.03387451171875,
-0.044891357421875,
0.0242462158203125,
0.044708251953125,
0.008880615234375,
0.03228759765625,
-0.0124969482421875,
0.01381683349609375,
-0.0157012939453125,
0.013336181640625,
0.022430419921875,
0.015960693359375,
-0.005146026611328125,
-0.048492431640625,
0.018768310546875,
-0.00321197509765625,
-0.059722900390625,
-0.0350341796875,
0.007717132568359375,
-0.09039306640625,
-0.01763916015625,
0.06988525390625,
-0.0208587646484375,
-0.0195465087890625,
-0.021453857421875,
-0.0265045166015625,
0.035491943359375,
-0.0247039794921875,
0.041290283203125,
0.018768310546875,
-0.024169921875,
-0.045379638671875,
-0.03729248046875,
0.044281005859375,
0.00356292724609375,
-0.039031982421875,
-0.035614013671875,
0.03839111328125,
0.041473388671875,
0.025726318359375,
0.07171630859375,
0.001125335693359375,
0.044158935546875,
0.023834228515625,
0.04766845703125,
-0.00849151611328125,
0.0095672607421875,
-0.0460205078125,
-0.004131317138671875,
0.00966644287109375,
-0.04248046875
]
] |
sociocom/MedNERN-CR-JA | 2023-04-17T07:06:31.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"token-classification",
"NER",
"medical documents",
"ja",
"dataset:MedTxt-CR-JA-training-v2.xml",
"doi:10.57967/hf/0620",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | sociocom | null | null | sociocom/MedNERN-CR-JA | 2 | 94,357 | transformers | 2023-04-13T08:25:56 | ---
language:
- ja
license:
- cc-by-4.0
tags:
- NER
- medical documents
datasets:
- MedTxt-CR-JA-training-v2.xml
metrics:
- NTCIR-16 Real-MedNLP subtask 1
---
This is a model for named entity recognition of Japanese medical documents.
### How to use
Download the following five files and put them in the same folder:
- id_to_tags.pkl
- key_attr.pkl
- NER_medNLP.py
- predict.py
- text.txt (the input file on which prediction is run; you can replace its contents with your own text)
You can use this model by running `predict.py`.
```
python3 predict.py
```
#### Entity normalization
This model supports entity normalization via dictionary matching. The dictionary is a list of medical terms or
drugs and their standard forms.
Two different dictionaries are used for drug and disease normalization, stored in the `dictionaries` folder as
`drug_dict.csv` and `disease_dict.csv`, respectively.
To enable normalization you can add the `--normalize` flag to the `predict.py` command.
```
python3 predict.py --normalize
```
Normalization will add the `norm` attribute to the output XML tags. This attribute can be empty if a normalized form of
the term is not found.
The provided disease normalization dictionary (`dictionaries/disease_dict.csv`) is based on the [Manbyo Dictionary](https://sociocom.naist.jp/manbyo-dic-en/) and provides normalization to the standard ICD code for the diseases.
The default drug dictionary (`dictionaries/drug_dict.csv`) is based on the [Hyakuyaku Dictionary](https://sociocom.naist.jp/hyakuyaku-dic-en/).
The dictionary is a CSV file with three columns: the first column is the surface-form term and the third column contains its standard form. The second column is not used.
Users can freely change the dictionaries to fit their needs, as long as the format and filenames are kept.
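As a rough sketch of how such a dictionary drives normalization (the in-memory CSV below mirrors the three-column layout and borrows two mappings from the output example in this card; it is illustrative, not the actual normalization code):

```python
# Hedged sketch of dictionary-based normalization over the three-column
# CSV layout described above (surface form, unused, standard form).
# The entries are illustrative examples, not the real dictionary files.
import csv
import io

csv_text = "非持続性心室頻拍,unused,I472\nアミオダロン,unused,アミオダロン塩酸塩\n"

norm_dict = {}
for surface, _unused, standard in csv.reader(io.StringIO(csv_text)):
    norm_dict[surface] = standard

def normalize(term: str) -> str:
    # Return the standard form if known; an empty string mirrors the
    # empty `norm` attribute emitted when no match is found.
    return norm_dict.get(term, "")

print(normalize("アミオダロン"))  # アミオダロン塩酸塩
print(normalize("unknown term"))  # empty string
```

Swapping in a custom CSV with the same layout would change the lookup table without touching the prediction code.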
### Input Example
```
肥大型心筋症、心房細動に対してWF投与が開始となった。
治療経過中に非持続性心室頻拍が認められたためアミオダロンが併用となった。
```
### Output Example
```
<d certainty="positive" norm="I422">肥大型心筋症、心房細動</d>に対して<m-key state="executed" norm="ワルファリンカリウム">WF</m-key>投与が開始となった。
<timex3 type="med">治療経過中</timex3>に<d certainty="positive" norm="I472">非持続性心室頻拍</d>が認められたため<m-key state="executed" norm="アミオダロン塩酸塩">アミオダロン</m-key>が併用となった。
```
### Publication
| 2,227 | [embedding vector omitted] |
google/owlvit-base-patch32 | 2023-10-23T09:20:36.000Z | [
"transformers",
"pytorch",
"owlvit",
"zero-shot-object-detection",
"vision",
"object-detection",
"arxiv:2205.06230",
"license:apache-2.0",
"has_space",
"region:us"
] | object-detection | google | null | null | google/owlvit-base-patch32 | 53 | 94,284 | transformers | 2022-07-05T06:30:01 | ---
license: apache-2.0
tags:
- vision
- object-detection
inference: false
---
# Model Card: OWL-ViT
## Model Details
The OWL-ViT (short for Vision Transformer for Open-World Localization) was proposed in [Simple Open-Vocabulary Object Detection with Vision Transformers](https://arxiv.org/abs/2205.06230) by Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby. OWL-ViT is a zero-shot text-conditioned object detection model that can be used to query an image with one or multiple text queries.
OWL-ViT uses CLIP as its multi-modal backbone, with a ViT-like Transformer to get visual features and a causal language model to get the text features. To use CLIP for detection, OWL-ViT removes the final token pooling layer of the vision model and attaches a lightweight classification and box head to each transformer output token. Open-vocabulary classification is enabled by replacing the fixed classification layer weights with the class-name embeddings obtained from the text model. The authors first train CLIP from scratch and fine-tune it end-to-end with the classification and box heads on standard detection datasets using a bipartite matching loss. One or multiple text queries per image can be used to perform zero-shot text-conditioned object detection.
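The open-vocabulary classification step described above can be sketched as a dot product between per-token image embeddings and class-name text embeddings, in place of a fixed classification layer. The shapes and values below are illustrative only, not the trained model's weights.

```python
# Each image token embedding is scored against each class-name text
# embedding; the text embeddings play the role of classifier weights.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

image_tokens = [[1.0, 0.0], [0.0, 1.0]]  # [num_tokens, dim], from the vision model
class_embeds = [[0.9, 0.1], [0.2, 0.8]]  # [num_classes, dim], from the text model

# Per-token, per-class logits, as produced by the lightweight class head.
logits = [[dot(t, c) for c in class_embeds] for t in image_tokens]
print(logits)  # [[0.9, 0.2], [0.1, 0.8]]
```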
### Model Date
May 2022
### Model Type
The model uses a CLIP backbone with a ViT-B/32 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss. The CLIP backbone is trained from scratch and fine-tuned together with the box and class prediction heads with an object detection objective.
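The contrastive objective mentioned above can be sketched as a symmetric cross-entropy over a pairwise similarity matrix, where matched (image, text) pairs sit on the diagonal. This is an InfoNCE-style illustration with toy numbers, not the actual training code.

```python
import math

def softmax_nll(row, target):
    # Negative log-likelihood of the target entry under a softmax over the row.
    z = sum(math.exp(x) for x in row)
    return -math.log(math.exp(row[target]) / z)

# Toy 2x2 similarity matrix: diagonal entries are the matched pairs.
sim = [[4.0, 0.0], [0.0, 4.0]]

# Image-to-text direction: each image should pick its own caption.
loss_i2t = sum(softmax_nll(sim[i], i) for i in range(2)) / 2
# Text-to-image direction: each caption should pick its own image.
loss_t2i = sum(softmax_nll([sim[j][i] for j in range(2)], i) for i in range(2)) / 2
loss = (loss_i2t + loss_t2i) / 2
print(round(loss, 4))
```

Maximizing matched-pair similarity drives this loss toward zero.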
### Documents
- [OWL-ViT Paper](https://arxiv.org/abs/2205.06230)
### Use with Transformers
```python3
import requests
from PIL import Image
import torch
from transformers import OwlViTProcessor, OwlViTForObjectDetection
processor = OwlViTProcessor.from_pretrained("google/owlvit-base-patch32")
model = OwlViTForObjectDetection.from_pretrained("google/owlvit-base-patch32")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
texts = [["a photo of a cat", "a photo of a dog"]]
inputs = processor(text=texts, images=image, return_tensors="pt")
outputs = model(**inputs)
# Target image sizes (height, width) to rescale box predictions [batch_size, 2]
target_sizes = torch.Tensor([image.size[::-1]])
# Convert outputs (bounding boxes and class logits) to COCO API
results = processor.post_process_object_detection(outputs=outputs, threshold=0.1, target_sizes=target_sizes)
i = 0 # Retrieve predictions for the first image for the corresponding text queries
text = texts[i]
boxes, scores, labels = results[i]["boxes"], results[i]["scores"], results[i]["labels"]
# Print detected objects and rescaled box coordinates
for box, score, label in zip(boxes, scores, labels):
    box = [round(coord, 2) for coord in box.tolist()]
print(f"Detected {text[label]} with confidence {round(score.item(), 3)} at location {box}")
```
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, text-conditioned object detection. We also hope it can be used for interdisciplinary studies of the potential impact of such models, especially in areas that commonly require identifying objects whose label is unavailable during training.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
## Data
The CLIP backbone of the model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet. The prediction heads of OWL-ViT, along with the CLIP backbone, are fine-tuned on publicly available object detection datasets such as [COCO](https://cocodataset.org/#home) and [OpenImages](https://storage.googleapis.com/openimages/web/index.html).
### BibTeX entry and citation info
```bibtex
@article{minderer2022simple,
title={Simple Open-Vocabulary Object Detection with Vision Transformers},
  author={Matthias Minderer and Alexey Gritsenko and Austin Stone and Maxim Neumann and Dirk Weissenborn and Alexey Dosovitskiy and Aravindh Mahendran and Anurag Arnab and Mostafa Dehghani and Zhuoran Shen and Xiao Wang and Xiaohua Zhai and Thomas Kipf and Neil Houlsby},
journal={arXiv preprint arXiv:2205.06230},
year={2022},
}
``` | 5,136 | [
[
-0.00782012939453125,
0.0094451904296875,
0.0132904052734375,
0.02728271484375,
-0.0187835693359375,
-0.033721923828125,
0.0027790069580078125,
0.005962371826171875,
-0.0172271728515625,
-0.06219482421875
]
] |
timm/efficientnet_b3.ra2_in1k | 2023-04-27T21:10:28.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1905.11946",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/efficientnet_b3.ra2_in1k | 0 | 93,980 | timm | 2022-12-12T23:56:39 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for efficientnet_b3.ra2_in1k
An EfficientNet image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RA2` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 12.2
- GMACs: 1.6
- Activations (M): 21.5
- Image size: train = 288 x 288, test = 320 x 320
- **Papers:**
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks: https://arxiv.org/abs/1905.11946
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
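The EfficientNet paper linked above derives each B-variant by compound scaling: depth, width, and input resolution are scaled together by a single coefficient φ. A minimal sketch of that rule, using the base coefficients reported in the paper (illustrative only; timm's per-variant multipliers are rounded in practice and may differ slightly):

```python
# Compound scaling from the EfficientNet paper (arXiv:1905.11946).
# ALPHA/BETA/GAMMA come from the paper's small grid search; phi is the
# user-chosen compound coefficient. These are illustrative values, not
# timm's exact per-variant configuration.
ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15  # depth, width, resolution bases


def compound_scale(phi: float) -> dict:
    """Return depth/width/resolution multipliers for coefficient phi."""
    return {
        "depth": ALPHA ** phi,
        "width": BETA ** phi,
        "resolution": GAMMA ** phi,
    }


# The paper constrains alpha * beta^2 * gamma^2 ~= 2, so each unit
# increase in phi roughly doubles FLOPs.
flops_factor = ALPHA * BETA**2 * GAMMA**2
print(compound_scale(3))  # roughly the B3 operating point
```

This is why B3 trains at a larger resolution (288) than B0 while keeping the same block structure: all three axes grew together.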
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('efficientnet_b3.ra2_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'efficientnet_b3.ra2_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 24, 144, 144])
# torch.Size([1, 32, 72, 72])
# torch.Size([1, 48, 36, 36])
# torch.Size([1, 136, 18, 18])
# torch.Size([1, 384, 9, 9])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'efficientnet_b3.ra2_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1536, 9, 9) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2019efficientnet,
title={Efficientnet: Rethinking model scaling for convolutional neural networks},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={6105--6114},
year={2019},
organization={PMLR}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,737 | [
[
… (embedding vector values elided) …
]
] |
t5-3b | 2023-01-02T16:15:40.000Z | [
"transformers",
"pytorch",
"tf",
"t5",
"text2text-generation",
"summarization",
"translation",
"en",
"fr",
"ro",
"de",
"multilingual",
"dataset:c4",
"arxiv:1805.12471",
"arxiv:1708.00055",
"arxiv:1704.05426",
"arxiv:1606.05250",
"arxiv:1808.09121",
"arxiv:1810.12885",
"arxiv:1905.10044",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | translation | null | null | null | t5-3b | 25 | 92,993 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
- fr
- ro
- de
- multilingual
license: apache-2.0
tags:
- summarization
- translation
datasets:
- c4
---
# Model Card for T5-3B

# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training Details](#training-details)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Citation](#citation)
8. [Model Card Authors](#model-card-authors)
9. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
The developers of the Text-To-Text Transfer Transformer (T5) [write](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html):
> With T5, we propose reframing all NLP tasks into a unified text-to-text-format where the input and output are always text strings, in contrast to BERT-style models that can only output either a class label or a span of the input. Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task.
T5-3B is the checkpoint with 3 billion parameters.
- **Developed by:** Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu. See [associated paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) and [GitHub repo](https://github.com/google-research/text-to-text-transfer-transformer#released-model-checkpoints)
- **Model type:** Language model
- **Language(s) (NLP):** English, French, Romanian, German
- **License:** Apache 2.0
- **Related Models:** [All T5 Checkpoints](https://huggingface.co/models?search=t5)
- **Resources for more information:**
- [Research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf)
- [Google's T5 Blog Post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
- [GitHub Repo](https://github.com/google-research/text-to-text-transfer-transformer)
- [Hugging Face T5 Docs](https://huggingface.co/docs/transformers/model_doc/t5)
# Uses
## Direct Use and Downstream Use
The developers write in a [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) that the model:
> Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task, including machine translation, document summarization, question answering, and classification tasks (e.g., sentiment analysis). We can even apply T5 to regression tasks by training it to predict the string representation of a number instead of the number itself.
See the [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.
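The text-to-text framing quoted above can be sketched concretely: every task is reduced to "task prefix + input text" in, plain text out, so one seq2seq model serves them all. The prefixes below follow the ones used in the T5 paper's examples (an illustrative sketch, not an exhaustive list):

```python
# Sketch of T5's unified text-to-text framing: classification, regression,
# translation, and summarization all become string -> string problems.
def to_text_to_text(task: str, text: str) -> str:
    """Prepend the task prefix so one seq2seq model can serve many tasks."""
    prefixes = {
        "summarize": "summarize: ",
        "translate_en_de": "translate English to German: ",
        "cola": "cola sentence: ",   # acceptability judgment -> text label
        "stsb": "stsb sentence1: ",  # regression -> number emitted as a string
    }
    return prefixes[task] + text


print(to_text_to_text("translate_en_de", "The house is wonderful."))
# -> "translate English to German: The house is wonderful."
```

Because the output is always a string, even regression tasks like STS-B work by training the model to emit the number's string representation.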
## Out-of-Scope Use
More information needed.
# Bias, Risks, and Limitations
More information needed.
## Recommendations
More information needed.
# Training Details
## Training Data
The model is pre-trained on the [Colossal Clean Crawled Corpus (C4)](https://www.tensorflow.org/datasets/catalog/c4), which was developed and released in the context of the same [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) as T5.
The model was pre-trained on a **multi-task mixture of unsupervised (1.) and supervised tasks (2.)**.
Thereby, the following datasets were being used for (1.) and (2.):
1. **Datasets used for Unsupervised denoising objective**:
- [C4](https://huggingface.co/datasets/c4)
- [Wiki-DPR](https://huggingface.co/datasets/wiki_dpr)
2. **Datasets used for Supervised text-to-text language modeling objective**
- Sentence acceptability judgment
- CoLA [Warstadt et al., 2018](https://arxiv.org/abs/1805.12471)
- Sentiment analysis
- SST-2 [Socher et al., 2013](https://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf)
- Paraphrasing/sentence similarity
- MRPC [Dolan and Brockett, 2005](https://aclanthology.org/I05-5002)
    - STS-B [Cer et al., 2017](https://arxiv.org/abs/1708.00055)
- QQP [Iyer et al., 2017](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
- Natural language inference
- MNLI [Williams et al., 2017](https://arxiv.org/abs/1704.05426)
- QNLI [Rajpurkar et al.,2016](https://arxiv.org/abs/1606.05250)
- RTE [Dagan et al., 2005](https://link.springer.com/chapter/10.1007/11736790_9)
    - CB [De Marneffe et al., 2019](https://semanticsarchive.net/Archive/Tg3ZGI2M/Marneffe.pdf)
- Sentence completion
- COPA [Roemmele et al., 2011](https://www.researchgate.net/publication/221251392_Choice_of_Plausible_Alternatives_An_Evaluation_of_Commonsense_Causal_Reasoning)
- Word sense disambiguation
- WIC [Pilehvar and Camacho-Collados, 2018](https://arxiv.org/abs/1808.09121)
- Question answering
- MultiRC [Khashabi et al., 2018](https://aclanthology.org/N18-1023)
- ReCoRD [Zhang et al., 2018](https://arxiv.org/abs/1810.12885)
- BoolQ [Clark et al., 2019](https://arxiv.org/abs/1905.10044)
## Training Procedure
In their [abstract](https://jmlr.org/papers/volume21/20-074/20-074.pdf), the model developers write:
> In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks.
The framework introduced, the T5 framework, involves a training procedure that brings together the approaches studied in the paper. See the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.
# Evaluation
## Testing Data, Factors & Metrics
The developers evaluated the model on 24 tasks, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for full details.
## Results
For full results for T5-3B, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf), Table 14.
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** Google Cloud TPU Pods
- **Hours used:** More information needed
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Citation
**BibTeX:**
```bibtex
@article{2020t5,
author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
journal = {Journal of Machine Learning Research},
year = {2020},
volume = {21},
number = {140},
pages = {1-67},
url = {http://jmlr.org/papers/v21/20-074.html}
}
```
**APA:**
- Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., ... & Liu, P. J. (2020). Exploring the limits of transfer learning with a unified text-to-text transformer. J. Mach. Learn. Res., 21(140), 1-67.
# Model Card Authors
This model card was written by the team at Hugging Face.
# How to Get Started with the Model
See the [Hugging Face T5](https://huggingface.co/docs/transformers/model_doc/t5#transformers.T5Model) docs and a [Colab Notebook](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/main/notebooks/t5-trivia.ipynb) created by the model developers for more context on how to get started with this checkpoint.
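As a starting point, a minimal inference sketch with the `transformers` library (an assumption of this example; the `t5-3b` weights are roughly 11 GB, so the download-and-generate step is gated behind a flag you flip when you actually want to run the checkpoint):

```python
# Minimal getting-started sketch for the t5-3b checkpoint.
# Set RUN_INFERENCE = True only when you are ready to download ~11 GB of
# weights; requires the `transformers` library and PyTorch.
RUN_INFERENCE = False

prompt = "translate English to German: The house is wonderful."

if RUN_INFERENCE:
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained("t5-3b")
    model = T5ForConditionalGeneration.from_pretrained("t5-3b")

    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    outputs = model.generate(input_ids, max_new_tokens=40)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
else:
    print("Set RUN_INFERENCE = True to run:", prompt)
```

The same pattern works for any of the task prefixes described in the Uses section; only the prompt string changes.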
| 7,865 | [
[
… (embedding vector values elided) …
0.01043701171875,
0.007167816162109375,
-0.0147857666015625,
-0.031982421875,
-0.060943603515625,
0.0034332275390625,
0.03515625,
0.043609619140625,
-0.044158935546875,
0.041015625,
-0.03936767578125,
-0.054412841796875,
-0.046417236328125,
-0.0033130645751953125,
0.029144287109375,
0.05364990234375,
0.0631103515625,
-0.00433349609375,
-0.043426513671875,
-0.050079345703125,
-0.026153564453125,
-0.0084381103515625,
-0.0023899078369140625,
0.0115509033203125,
0.0535888671875,
-0.0075836181640625,
0.06402587890625,
-0.024261474609375,
-0.027374267578125,
-0.040008544921875,
-0.0004100799560546875,
-0.0078277587890625,
0.04351806640625,
0.0484619140625,
-0.058837890625,
-0.038360595703125,
-0.01403045654296875,
-0.064208984375,
-0.002132415771484375,
-0.01219940185546875,
-0.0014219284057617188,
0.032440185546875,
0.0457763671875,
-0.043212890625,
0.0158843994140625,
0.049163818359375,
-0.0255279541015625,
0.0218658447265625,
-0.006744384765625,
0.000019788742065429688,
-0.122314453125,
0.039825439453125,
0.01262664794921875,
-0.01406097412109375,
-0.05712890625,
-0.0072479248046875,
0.006496429443359375,
-0.0050048828125,
-0.040435791015625,
0.05511474609375,
-0.031707763671875,
0.00531005859375,
-0.0008225440979003906,
0.00885772705078125,
0.01233673095703125,
0.049835205078125,
-0.0024890899658203125,
0.058013916015625,
0.0194244384765625,
-0.052459716796875,
-0.0023403167724609375,
0.0290679931640625,
-0.006381988525390625,
0.0277099609375,
-0.05596923828125,
0.021636962890625,
-0.005146026611328125,
0.037628173828125,
-0.068115234375,
0.0100860595703125,
0.02685546875,
-0.049774169921875,
0.0234527587890625,
0.0025157928466796875,
-0.031707763671875,
-0.028961181640625,
-0.0213623046875,
0.0207061767578125,
0.049530029296875,
-0.035675048828125,
0.054443359375,
0.01314544677734375,
0.01983642578125,
-0.056365966796875,
-0.06427001953125,
0.01316070556640625,
-0.0300445556640625,
-0.03997802734375,
0.06378173828125,
-0.01214599609375,
0.006145477294921875,
0.01025390625,
0.0008697509765625,
-0.0160980224609375,
0.011871337890625,
0.0047607421875,
0.0190887451171875,
0.004535675048828125,
0.01291656494140625,
-0.00795745849609375,
-0.01090240478515625,
0.0019378662109375,
-0.029876708984375,
0.0179901123046875,
-0.0137176513671875,
0.01160430908203125,
-0.049072265625,
0.01383209228515625,
0.041595458984375,
-0.010040283203125,
0.060394287109375,
0.07275390625,
-0.018798828125,
-0.0015592575073242188,
-0.03936767578125,
-0.01389312744140625,
-0.0340576171875,
0.0262603759765625,
-0.029541015625,
-0.06915283203125,
0.03173828125,
0.0050201416015625,
0.026214599609375,
0.06787109375,
0.0234832763671875,
-0.01358795166015625,
0.055206298828125,
0.06439208984375,
-0.002597808837890625,
0.042633056640625,
-0.0341796875,
0.0222320556640625,
-0.07177734375,
-0.024688720703125,
-0.061492919921875,
-0.0210418701171875,
-0.05889892578125,
-0.0301055908203125,
0.00937652587890625,
0.003429412841796875,
-0.0248260498046875,
0.04241943359375,
-0.0390625,
0.00917816162109375,
0.030059814453125,
0.006824493408203125,
0.027374267578125,
-0.001941680908203125,
-0.003856658935546875,
-0.0106048583984375,
-0.0653076171875,
-0.036895751953125,
0.0968017578125,
0.027130126953125,
0.025360107421875,
-0.0008687973022460938,
0.04779052734375,
0.016448974609375,
0.0163726806640625,
-0.05419921875,
0.054290771484375,
-0.026947021484375,
-0.04205322265625,
-0.0160064697265625,
-0.0328369140625,
-0.08392333984375,
0.020263671875,
-0.025115966796875,
-0.05364990234375,
0.0129241943359375,
0.0010280609130859375,
-0.0163726806640625,
0.038818359375,
-0.0653076171875,
0.082275390625,
-0.0044097900390625,
-0.0242156982421875,
-0.0014858245849609375,
-0.054779052734375,
0.016021728515625,
0.00024700164794921875,
0.0108489990234375,
0.005786895751953125,
-0.0139923095703125,
0.07550048828125,
-0.02178955078125,
0.06634521484375,
-0.014495849609375,
0.00006455183029174805,
0.0126800537109375,
-0.0248565673828125,
0.032684326171875,
-0.029571533203125,
-0.008270263671875,
0.030303955078125,
0.007038116455078125,
-0.035430908203125,
-0.040618896484375,
0.033966064453125,
-0.06854248046875,
-0.0276947021484375,
-0.031219482421875,
-0.03912353515625,
-0.01422119140625,
0.0255126953125,
0.0245208740234375,
0.0139923095703125,
-0.01264190673828125,
0.0296173095703125,
0.049163818359375,
-0.0282745361328125,
0.054931640625,
0.0272216796875,
-0.0007090568542480469,
-0.01983642578125,
0.0576171875,
0.00760650634765625,
0.03125,
0.045318603515625,
0.0116424560546875,
-0.0246734619140625,
-0.043609619140625,
-0.025360107421875,
0.025146484375,
-0.047088623046875,
-0.00968170166015625,
-0.07525634765625,
-0.016876220703125,
-0.042449951171875,
-0.004917144775390625,
-0.0307464599609375,
-0.032135009765625,
-0.037353515625,
-0.0128173828125,
0.02459716796875,
0.037353515625,
0.01068115234375,
0.016204833984375,
-0.068359375,
0.01462554931640625,
0.001857757568359375,
0.00693511962890625,
0.0010395050048828125,
-0.0625,
-0.01122283935546875,
0.0062408447265625,
-0.035675048828125,
-0.0521240234375,
0.03192138671875,
0.01702880859375,
0.0267486572265625,
-0.0012407302856445312,
0.00806427001953125,
0.04840087890625,
-0.0196533203125,
0.0771484375,
0.01128387451171875,
-0.0784912109375,
0.0218048095703125,
-0.0232086181640625,
0.031005859375,
0.0401611328125,
0.04010009765625,
-0.048583984375,
-0.017791748046875,
-0.07470703125,
-0.061248779296875,
0.05743408203125,
0.0194854736328125,
0.01146697998046875,
0.02459716796875,
0.0210723876953125,
0.0001939535140991211,
0.0111236572265625,
-0.07171630859375,
-0.019622802734375,
-0.018646240234375,
-0.02447509765625,
-0.005931854248046875,
-0.006084442138671875,
0.00868988037109375,
-0.0248565673828125,
0.049560546875,
-0.00726318359375,
0.053314208984375,
0.0239715576171875,
-0.019500732421875,
0.01482391357421875,
0.0283203125,
0.0472412109375,
0.03912353515625,
-0.014984130859375,
-0.006084442138671875,
0.03131103515625,
-0.03948974609375,
-0.004093170166015625,
0.01512908935546875,
-0.024932861328125,
-0.0032176971435546875,
0.035858154296875,
0.0711669921875,
0.011138916015625,
-0.032257080078125,
0.04150390625,
0.0009436607360839844,
-0.04779052734375,
-0.0174560546875,
-0.005466461181640625,
0.0095367431640625,
-0.0018224716186523438,
0.0191650390625,
0.018218994140625,
0.01279449462890625,
-0.036346435546875,
0.0037631988525390625,
0.00971221923828125,
-0.03802490234375,
-0.03338623046875,
0.061431884765625,
0.0280609130859375,
-0.00030231475830078125,
0.044403076171875,
-0.00984954833984375,
-0.0396728515625,
0.039794921875,
0.040313720703125,
0.079345703125,
-0.00298309326171875,
0.0194854736328125,
0.05126953125,
0.02972412109375,
-0.010009765625,
0.004375457763671875,
-0.0052032470703125,
-0.0595703125,
-0.0433349609375,
-0.037078857421875,
-0.0255889892578125,
0.01424407958984375,
-0.03741455078125,
0.0252838134765625,
-0.0229644775390625,
0.0028934478759765625,
0.007167816162109375,
0.0091400146484375,
-0.05810546875,
0.022003173828125,
0.0008807182312011719,
0.06500244140625,
-0.05584716796875,
0.060455322265625,
0.053863525390625,
-0.0435791015625,
-0.069580078125,
0.01146697998046875,
-0.017242431640625,
-0.04791259765625,
0.040924072265625,
0.01031494140625,
-0.0006380081176757812,
0.014617919921875,
-0.040313720703125,
-0.06500244140625,
0.10125732421875,
0.026611328125,
-0.023956298828125,
-0.0266571044921875,
0.01788330078125,
0.050506591796875,
-0.019073486328125,
0.031768798828125,
0.036163330078125,
0.037261962890625,
0.01751708984375,
-0.07861328125,
0.024627685546875,
-0.01953125,
0.006999969482421875,
0.00439453125,
-0.062347412109375,
0.04302978515625,
-0.024444580078125,
-0.0184478759765625,
-0.013031005859375,
0.0509033203125,
0.0016002655029296875,
0.0135498046875,
0.035491943359375,
0.055999755859375,
0.050323486328125,
-0.00896453857421875,
0.091064453125,
-0.024871826171875,
0.03369140625,
0.0638427734375,
0.008270263671875,
0.068603515625,
0.03704833984375,
-0.025482177734375,
0.040008544921875,
0.047454833984375,
-0.0088348388671875,
0.03289794921875,
-0.00970458984375,
-0.00335693359375,
-0.006290435791015625,
-0.01419830322265625,
-0.0255279541015625,
0.0189971923828125,
0.0186920166015625,
-0.0290374755859375,
-0.0189361572265625,
0.00824737548828125,
0.02728271484375,
-0.007122039794921875,
-0.0032901763916015625,
0.0615234375,
0.0161285400390625,
-0.061859130859375,
0.052520751953125,
0.007415771484375,
0.06243896484375,
-0.03265380859375,
0.0052947998046875,
-0.01409149169921875,
0.0108489990234375,
-0.0245361328125,
-0.054534912109375,
0.038360595703125,
0.0035724639892578125,
-0.0137939453125,
-0.05438232421875,
0.05841064453125,
-0.02978515625,
-0.031097412109375,
0.0282745361328125,
0.0350341796875,
0.0080108642578125,
0.0012788772583007812,
-0.07281494140625,
-0.00479888916015625,
0.01461029052734375,
-0.0160980224609375,
0.024871826171875,
0.0281982421875,
0.007656097412109375,
0.051666259765625,
0.043975830078125,
-0.01508331298828125,
-0.0006518363952636719,
-0.00870513916015625,
0.050872802734375,
-0.05633544921875,
-0.01800537109375,
-0.05145263671875,
0.053741455078125,
-0.0014743804931640625,
-0.03436279296875,
0.052032470703125,
0.03509521484375,
0.08184814453125,
-0.006420135498046875,
0.07379150390625,
-0.0180511474609375,
0.042633056640625,
-0.03338623046875,
0.03399658203125,
-0.0509033203125,
0.0172271728515625,
-0.0328369140625,
-0.06512451171875,
-0.0247802734375,
0.0279541015625,
-0.0261993408203125,
0.0261993408203125,
0.0841064453125,
0.0484619140625,
0.0029754638671875,
-0.006969451904296875,
0.0118865966796875,
0.01274871826171875,
0.0270233154296875,
0.05389404296875,
0.026092529296875,
-0.0755615234375,
0.07391357421875,
-0.027099609375,
0.0132904052734375,
0.00010037422180175781,
-0.0657958984375,
-0.06787109375,
-0.06536865234375,
-0.0307464599609375,
-0.03485107421875,
0.007793426513671875,
0.05450439453125,
0.0426025390625,
-0.049163818359375,
-0.0209197998046875,
-0.030303955078125,
0.0013456344604492188,
-0.017303466796875,
-0.0158538818359375,
0.03631591796875,
-0.0369873046875,
-0.0665283203125,
0.0038623809814453125,
-0.004650115966796875,
0.004611968994140625,
-0.0016889572143554688,
-0.002262115478515625,
-0.0259857177734375,
-0.0150299072265625,
0.044830322265625,
0.014801025390625,
-0.048583984375,
-0.0211181640625,
0.0223388671875,
-0.009674072265625,
0.008270263671875,
0.037200927734375,
-0.050018310546875,
0.0162200927734375,
0.03973388671875,
0.067626953125,
0.061492919921875,
-0.01003265380859375,
0.047271728515625,
-0.0285491943359375,
-0.0107269287109375,
0.0114288330078125,
0.007190704345703125,
0.031280517578125,
-0.0149078369140625,
0.049835205078125,
0.035858154296875,
-0.040496826171875,
-0.0509033203125,
-0.009674072265625,
-0.09619140625,
-0.01351165771484375,
0.097412109375,
-0.01033782958984375,
-0.01338958740234375,
0.0003352165222167969,
-0.003978729248046875,
0.025726318359375,
-0.034759521484375,
0.05615234375,
0.065673828125,
0.0064697265625,
-0.031280517578125,
-0.048797607421875,
0.0504150390625,
0.045013427734375,
-0.08135986328125,
-0.0128173828125,
0.0167083740234375,
0.0321044921875,
0.01033782958984375,
0.045074462890625,
-0.01190948486328125,
0.00580596923828125,
-0.017059326171875,
0.023162841796875,
0.002452850341796875,
-0.00728607177734375,
-0.0245513916015625,
0.01258087158203125,
-0.0158538818359375,
-0.01312255859375
]
] |
microsoft/git-base | 2023-04-24T09:52:15.000Z | [
"transformers",
"pytorch",
"safetensors",
"git",
"text-generation",
"vision",
"image-to-text",
"image-captioning",
"en",
"arxiv:2205.14100",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | image-to-text | microsoft | null | null | microsoft/git-base | 25 | 92,527 | transformers | 2022-12-06T09:22:35 | ---
language: en
license: mit
tags:
- vision
- image-to-text
- image-captioning
model_name: microsoft/git-base
pipeline_tag: image-to-text
---
# GIT (GenerativeImage2Text), base-sized
GIT (short for GenerativeImage2Text) model, base-sized version. It was introduced in the paper [GIT: A Generative Image-to-text Transformer for Vision and Language](https://arxiv.org/abs/2205.14100) by Wang et al. and first released in [this repository](https://github.com/microsoft/GenerativeImage2Text).
Disclaimer: The team releasing GIT did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
GIT is a Transformer decoder conditioned on both CLIP image tokens and text tokens. The model is trained using "teacher forcing" on many (image, text) pairs.
The goal for the model is simply to predict the next text token, given the image tokens and the previous text tokens.
The model has full access to (i.e. a bidirectional attention mask is used for) the image patch tokens, but only has access to the previous text tokens (i.e. a causal attention mask is used for the text tokens) when predicting the next text token.

This allows the model to be used for tasks like:
- image and video captioning
- visual question answering (VQA) on images and videos
- even image classification (by simply conditioning the model on the image and asking it to generate a class for it in text).
## Intended uses & limitations
You can use the raw model for image captioning. See the [model hub](https://huggingface.co/models?search=microsoft/git) to look for
fine-tuned versions on a task that interests you.
### How to use
For code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/model_doc/git#transformers.GitForCausalLM.forward.example).
## Training data
From the paper:
> We collect 0.8B image-text pairs for pre-training, which include COCO (Lin et al., 2014), Conceptual Captions
(CC3M) (Sharma et al., 2018), SBU (Ordonez et al., 2011), Visual Genome (VG) (Krishna et al., 2016),
Conceptual Captions (CC12M) (Changpinyo et al., 2021), ALT200M (Hu et al., 2021a), and an extra 0.6B
data following a similar collection procedure in Hu et al. (2021a).
Note, however, that this describes the model referred to as "GIT" in the paper, which is not open-sourced.
This checkpoint is "GIT-base", which is a smaller variant of GIT trained on 10 million image-text pairs.
See table 11 in the [paper](https://arxiv.org/abs/2205.14100) for more details.
### Preprocessing
We refer to the original repo for details on preprocessing during training.
During validation, the shorter edge of each image is resized, after which center cropping is performed to a fixed-size resolution. Next, frames are normalized across the RGB channels with the ImageNet mean and standard deviation.
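The center-crop and normalization steps of the validation-time preprocessing described above might look like the following. This is a simplified NumPy sketch under the usual ImageNet statistics; the real pipeline also resizes the shorter edge first and is implemented by the model's image processor:

```python
import numpy as np

IMAGENET_MEAN = np.array([0.485, 0.456, 0.406])  # per-channel RGB mean
IMAGENET_STD = np.array([0.229, 0.224, 0.225])   # per-channel RGB std

def center_crop(img: np.ndarray, size: int) -> np.ndarray:
    """Center-crop an (H, W, 3) image to (size, size, 3)."""
    h, w, _ = img.shape
    top = (h - size) // 2
    left = (w - size) // 2
    return img[top:top + size, left:left + size]

def normalize(img: np.ndarray) -> np.ndarray:
    """Normalize a float image in [0, 1] with ImageNet statistics."""
    return (img - IMAGENET_MEAN) / IMAGENET_STD

img = np.full((8, 6, 3), 0.5)         # dummy 8x6 RGB image with value 0.5
out = normalize(center_crop(img, 4))  # shape becomes (4, 4, 3)
print(out.shape)
```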
## Evaluation results
For evaluation results, we refer readers to the [paper](https://arxiv.org/abs/2205.14100). | 3,125 | [
[
-0.045135498046875,
-0.0506591796875,
0.01209259033203125,
-0.012786865234375,
-0.037017822265625,
0.0007915496826171875,
-0.0103302001953125,
-0.03302001953125,
0.023345947265625,
0.02923583984375,
-0.044921875,
-0.0281829833984375,
-0.067626953125,
-0.001499176025390625,
-0.0306549072265625,
0.1199951171875,
-0.0212860107421875,
0.004909515380859375,
-0.0236968994140625,
-0.0258026123046875,
-0.0298309326171875,
-0.017913818359375,
-0.022857666015625,
-0.0142669677734375,
0.0251312255859375,
0.0293121337890625,
0.0626220703125,
0.054931640625,
0.05975341796875,
0.01934814453125,
-0.0172119140625,
-0.0275115966796875,
-0.032928466796875,
-0.022247314453125,
-0.0023746490478515625,
-0.0283966064453125,
-0.0291595458984375,
0.02557373046875,
0.032440185546875,
0.0213623046875,
0.0112762451171875,
0.0288848876953125,
-0.0038280487060546875,
0.07550048828125,
-0.0279998779296875,
0.02496337890625,
-0.019775390625,
0.00923919677734375,
-0.0032901763916015625,
0.0138397216796875,
-0.01084136962890625,
-0.0151824951171875,
0.01500701904296875,
-0.0411376953125,
0.032440185546875,
-0.0183258056640625,
0.10394287109375,
0.0246734619140625,
-0.01027679443359375,
-0.01132965087890625,
-0.0270843505859375,
0.01509857177734375,
-0.0262298583984375,
0.0185546875,
0.0212860107421875,
0.02880859375,
0.0262603759765625,
-0.061981201171875,
-0.040618896484375,
-0.026336669921875,
-0.0242462158203125,
0.02392578125,
-0.0160064697265625,
0.01558685302734375,
0.0496826171875,
0.04656982421875,
-0.046417236328125,
-0.025238037109375,
-0.047210693359375,
-0.00704193115234375,
0.030059814453125,
-0.0304412841796875,
0.0406494140625,
-0.0302734375,
-0.055023193359375,
-0.0195465087890625,
-0.032196044921875,
-0.0038089752197265625,
0.00588226318359375,
-0.019378662109375,
-0.047332763671875,
0.060638427734375,
-0.00434112548828125,
0.0299224853515625,
0.0214691162109375,
0.0007228851318359375,
0.0231781005859375,
-0.0265045166015625,
-0.017913818359375,
-0.0294952392578125,
0.0672607421875,
0.035552978515625,
0.04681396484375,
-0.0022754669189453125,
-0.0267181396484375,
0.0223846435546875,
0.02685546875,
-0.06781005859375,
-0.03253173828125,
-0.010772705078125,
-0.03533935546875,
-0.0276641845703125,
0.0159912109375,
-0.045684814453125,
0.0012493133544921875,
-0.029876708984375,
0.027069091796875,
-0.040283203125,
-0.020904541015625,
-0.0213623046875,
-0.0213775634765625,
0.0274200439453125,
0.0198974609375,
-0.055450439453125,
0.01045989990234375,
0.0268707275390625,
0.07855224609375,
-0.0145263671875,
-0.0178375244140625,
-0.01953125,
-0.0082550048828125,
-0.0179290771484375,
0.05792236328125,
-0.01219940185546875,
-0.01064300537109375,
-0.0079345703125,
0.0268096923828125,
-0.00013971328735351562,
-0.0237579345703125,
0.0202178955078125,
-0.043121337890625,
0.00901031494140625,
-0.01181793212890625,
-0.0236358642578125,
-0.01922607421875,
0.026763916015625,
-0.05755615234375,
0.060455322265625,
0.04986572265625,
-0.078369140625,
0.01873779296875,
-0.062347412109375,
-0.0186614990234375,
0.01448822021484375,
-0.012237548828125,
-0.0491943359375,
-0.005035400390625,
0.0123443603515625,
0.036865234375,
-0.0157318115234375,
0.034759521484375,
-0.01021575927734375,
-0.0247955322265625,
-0.0017652511596679688,
0.0020046234130859375,
0.049957275390625,
0.012939453125,
-0.03399658203125,
0.015869140625,
-0.0265350341796875,
-0.0100555419921875,
0.028717041015625,
-0.01294708251953125,
-0.0026454925537109375,
-0.0306549072265625,
0.01202392578125,
0.01580810546875,
0.004116058349609375,
-0.029144287109375,
0.04144287109375,
-0.00933074951171875,
0.046112060546875,
0.0343017578125,
-0.00954437255859375,
0.060760498046875,
-0.024688720703125,
0.06256103515625,
0.004756927490234375,
0.0095367431640625,
-0.04400634765625,
-0.0496826171875,
-0.05010986328125,
-0.0206451416015625,
0.041290283203125,
0.050048828125,
-0.08154296875,
0.01523590087890625,
-0.046234130859375,
-0.04180908203125,
-0.02197265625,
-0.00164031982421875,
0.05194091796875,
0.035491943359375,
0.022674560546875,
-0.048828125,
-0.0304107666015625,
-0.07440185546875,
0.004852294921875,
-0.0254058837890625,
-0.005596160888671875,
-0.00039577484130859375,
0.063232421875,
-0.0311431884765625,
0.0679931640625,
-0.03887939453125,
-0.0160980224609375,
-0.007289886474609375,
0.013580322265625,
-0.01270294189453125,
0.044464111328125,
0.053436279296875,
-0.0743408203125,
-0.04144287109375,
-0.00855255126953125,
-0.06195068359375,
0.0045928955078125,
-0.0087890625,
-0.019683837890625,
0.0308074951171875,
0.0379638671875,
-0.067138671875,
0.052276611328125,
0.0576171875,
-0.020416259765625,
0.0504150390625,
0.00704193115234375,
0.008880615234375,
-0.08477783203125,
0.007251739501953125,
0.0182037353515625,
-0.02789306640625,
-0.0304718017578125,
0.0347900390625,
0.021942138671875,
-0.0229644775390625,
-0.018707275390625,
0.0330810546875,
-0.062347412109375,
-0.01343536376953125,
-0.009490966796875,
-0.004238128662109375,
-0.0009021759033203125,
0.055145263671875,
0.00836944580078125,
0.06146240234375,
0.036285400390625,
-0.01018524169921875,
0.021881103515625,
0.026702880859375,
-0.01500701904296875,
0.05230712890625,
-0.061065673828125,
0.017578125,
-0.010284423828125,
0.0278472900390625,
-0.073486328125,
-0.0305633544921875,
0.03363037109375,
-0.0543212890625,
0.04779052734375,
-0.04010009765625,
-0.043914794921875,
-0.0465087890625,
-0.0227508544921875,
0.024017333984375,
0.06402587890625,
-0.03216552734375,
0.041259765625,
0.029144287109375,
0.0008349418640136719,
-0.0142669677734375,
-0.062164306640625,
0.0007314682006835938,
-0.0108642578125,
-0.06646728515625,
0.0291595458984375,
-0.01078033447265625,
0.01218414306640625,
0.00797271728515625,
-0.0010890960693359375,
-0.00494384765625,
-0.01354217529296875,
0.036285400390625,
0.0310516357421875,
-0.0167236328125,
0.00576019287109375,
-0.00925445556640625,
-0.019775390625,
-0.00019848346710205078,
-0.019683837890625,
0.0144500732421875,
-0.021331787109375,
-0.007595062255859375,
-0.02130126953125,
0.0306549072265625,
0.0462646484375,
-0.0225982666015625,
0.046600341796875,
0.060028076171875,
-0.0275115966796875,
0.0279388427734375,
-0.040924072265625,
-0.0146026611328125,
-0.0310516357421875,
0.019256591796875,
-0.024139404296875,
-0.0687255859375,
0.039703369140625,
0.006900787353515625,
-0.00542449951171875,
0.042388916015625,
0.035064697265625,
-0.006610870361328125,
0.0540771484375,
0.056793212890625,
0.0008993148803710938,
0.063232421875,
-0.041168212890625,
-0.00583648681640625,
-0.065185546875,
0.000850677490234375,
-0.01531982421875,
-0.0169219970703125,
-0.054840087890625,
-0.0462646484375,
0.025146484375,
0.0299072265625,
-0.02264404296875,
0.055145263671875,
-0.0650634765625,
0.03155517578125,
0.041168212890625,
0.01441192626953125,
0.0003895759582519531,
0.0143890380859375,
0.00019288063049316406,
-0.0008158683776855469,
-0.0396728515625,
-0.045196533203125,
0.060394287109375,
0.03619384765625,
0.044097900390625,
-0.00986480712890625,
0.037567138671875,
0.0095062255859375,
0.04974365234375,
-0.0625,
0.032012939453125,
-0.005340576171875,
-0.0499267578125,
-0.00902557373046875,
-0.025054931640625,
-0.0579833984375,
-0.0016870498657226562,
-0.0146331787109375,
-0.04254150390625,
-0.009307861328125,
0.01806640625,
-0.00463104248046875,
0.0263519287109375,
-0.08966064453125,
0.07818603515625,
0.0000680088996887207,
-0.000040531158447265625,
0.0173187255859375,
-0.080810546875,
0.034576416015625,
0.005619049072265625,
0.0016613006591796875,
0.00267791748046875,
0.0035495758056640625,
0.0609130859375,
-0.025146484375,
0.0650634765625,
-0.01285552978515625,
0.0166168212890625,
0.041717529296875,
-0.004756927490234375,
0.0382080078125,
-0.00276947021484375,
0.029510498046875,
0.037750244140625,
0.01324462890625,
-0.0305633544921875,
-0.03955078125,
0.026611328125,
-0.057403564453125,
-0.036834716796875,
-0.0275421142578125,
-0.0229339599609375,
0.01473236083984375,
0.0123443603515625,
0.04644775390625,
0.0049285888671875,
-0.006519317626953125,
0.014434814453125,
0.043914794921875,
-0.01317596435546875,
0.0195159912109375,
0.01041412353515625,
-0.014801025390625,
-0.03997802734375,
0.042236328125,
0.0229339599609375,
0.03326416015625,
0.0210418701171875,
0.0089569091796875,
0.004909515380859375,
-0.023162841796875,
-0.0653076171875,
0.0301666259765625,
-0.0517578125,
-0.0396728515625,
-0.057098388671875,
-0.033203125,
-0.03900146484375,
-0.0169219970703125,
-0.0228729248046875,
-0.0240478515625,
-0.028961181640625,
-0.01531219482421875,
0.046112060546875,
0.048248291015625,
0.008026123046875,
0.04010009765625,
-0.0703125,
0.04620361328125,
0.016571044921875,
0.03997802734375,
-0.021148681640625,
-0.04803466796875,
-0.0236663818359375,
0.007434844970703125,
-0.04290771484375,
-0.08636474609375,
0.0209503173828125,
0.002910614013671875,
0.02197265625,
0.0110015869140625,
-0.0159454345703125,
0.031463623046875,
-0.03961181640625,
0.06640625,
0.02825927734375,
-0.05303955078125,
0.050262451171875,
-0.0308990478515625,
0.04180908203125,
0.0264129638671875,
0.0308990478515625,
-0.0382080078125,
-0.009307861328125,
-0.06671142578125,
-0.05474853515625,
0.04864501953125,
0.02294921875,
0.0215911865234375,
0.00525665283203125,
0.0435791015625,
-0.01505279541015625,
0.00832366943359375,
-0.07147216796875,
-0.01776123046875,
-0.034423828125,
-0.00980377197265625,
-0.00576019287109375,
-0.03802490234375,
-0.00891876220703125,
-0.04010009765625,
0.038330078125,
-0.0278472900390625,
0.046234130859375,
0.0165557861328125,
-0.016693115234375,
-0.01552581787109375,
-0.00957489013671875,
0.03399658203125,
0.01486968994140625,
0.002300262451171875,
-0.0157012939453125,
-0.00835418701171875,
-0.055877685546875,
-0.0018911361694335938,
0.01971435546875,
-0.0286865234375,
0.003955841064453125,
0.035186767578125,
0.07598876953125,
0.0037746429443359375,
-0.023406982421875,
0.06500244140625,
-0.0019369125366210938,
-0.0265350341796875,
-0.0328369140625,
-0.01038360595703125,
0.01526641845703125,
0.00848388671875,
0.0103759765625,
0.0231170654296875,
0.00537872314453125,
-0.029693603515625,
0.023956298828125,
0.0216522216796875,
-0.045806884765625,
-0.032989501953125,
0.048828125,
0.00537872314453125,
-0.0188751220703125,
0.04742431640625,
-0.020599365234375,
-0.055511474609375,
0.0682373046875,
0.017608642578125,
0.07391357421875,
0.0023746490478515625,
0.030059814453125,
0.052825927734375,
0.030181884765625,
-0.0033626556396484375,
0.00507354736328125,
-0.00801849365234375,
-0.04498291015625,
-0.0211944580078125,
-0.039031982421875,
-0.002132415771484375,
-0.0028781890869140625,
-0.05902099609375,
0.0302886962890625,
-0.03338623046875,
-0.0171051025390625,
0.00930023193359375,
-0.007595062255859375,
-0.056060791015625,
0.025054931640625,
0.01212310791015625,
0.0765380859375,
-0.058990478515625,
0.060516357421875,
0.0740966796875,
-0.0609130859375,
-0.07025146484375,
0.0019550323486328125,
0.0107574462890625,
-0.055389404296875,
0.029083251953125,
0.037933349609375,
0.0271453857421875,
-0.004238128662109375,
-0.07891845703125,
-0.052490234375,
0.0860595703125,
0.03131103515625,
-0.038543701171875,
-0.01898193359375,
-0.01983642578125,
0.04547119140625,
-0.0306243896484375,
0.0201263427734375,
0.03106689453125,
0.0261383056640625,
0.0286865234375,
-0.053741455078125,
0.01010894775390625,
-0.040252685546875,
0.020111083984375,
0.0178680419921875,
-0.044525146484375,
0.052459716796875,
-0.035308837890625,
-0.0249786376953125,
0.0174102783203125,
0.046051025390625,
0.009490966796875,
0.0123443603515625,
0.041961669921875,
0.049530029296875,
0.0176239013671875,
-0.0276947021484375,
0.1053466796875,
-0.0174102783203125,
0.037841796875,
0.06903076171875,
0.024322509765625,
0.03619384765625,
0.03704833984375,
-0.00852203369140625,
0.03594970703125,
0.060028076171875,
-0.023529052734375,
0.031463623046875,
0.0129241943359375,
0.0139007568359375,
0.00830078125,
0.0014553070068359375,
-0.031707763671875,
0.0162200927734375,
0.00641632080078125,
-0.043914794921875,
-0.00019562244415283203,
0.017303466796875,
0.0155487060546875,
-0.0025157928466796875,
-0.00800323486328125,
0.0728759765625,
0.00737762451171875,
-0.0628662109375,
0.0430908203125,
-0.005046844482421875,
0.058929443359375,
-0.03997802734375,
-0.00614166259765625,
-0.0204925537109375,
-0.01041412353515625,
-0.01050567626953125,
-0.0704345703125,
0.040771484375,
0.01024627685546875,
-0.03106689453125,
-0.0247802734375,
0.044830322265625,
-0.024688720703125,
-0.032623291015625,
0.0192718505859375,
0.016998291015625,
0.0207672119140625,
-0.0228118896484375,
-0.072021484375,
0.0032825469970703125,
0.0009822845458984375,
-0.03778076171875,
0.0223236083984375,
0.03826904296875,
-0.019073486328125,
0.031707763671875,
0.039703369140625,
-0.013671875,
-0.0144500732421875,
0.0007009506225585938,
0.08428955078125,
-0.041534423828125,
-0.02728271484375,
-0.044097900390625,
0.055694580078125,
0.0003037452697753906,
-0.021453857421875,
0.0307464599609375,
0.027984619140625,
0.078857421875,
-0.0283050537109375,
0.0509033203125,
-0.028961181640625,
0.01415252685546875,
-0.037841796875,
0.05609130859375,
-0.042938232421875,
-0.01309967041015625,
-0.03997802734375,
-0.0653076171875,
-0.016448974609375,
0.048797607421875,
-0.02313232421875,
0.023651123046875,
0.049560546875,
0.052703857421875,
0.00043129920959472656,
-0.006282806396484375,
0.02093505859375,
-0.00007963180541992188,
0.00701141357421875,
0.029693603515625,
0.04010009765625,
-0.0579833984375,
0.043426513671875,
-0.0240478515625,
-0.0298309326171875,
-0.0026950836181640625,
-0.059661865234375,
-0.0679931640625,
-0.06414794921875,
-0.04339599609375,
-0.04266357421875,
0.0058441162109375,
0.0430908203125,
0.07568359375,
-0.0450439453125,
0.01541900634765625,
-0.007785797119140625,
-0.003498077392578125,
-0.001499176025390625,
-0.017578125,
0.0202178955078125,
0.0007495880126953125,
-0.04345703125,
-0.034820556640625,
0.006862640380859375,
0.035400390625,
-0.0144805908203125,
-0.00717926025390625,
-0.01531982421875,
-0.018310546875,
0.044403076171875,
0.0222320556640625,
-0.0462646484375,
-0.0298614501953125,
0.00047588348388671875,
-0.0301055908203125,
0.0301666259765625,
0.04864501953125,
-0.04547119140625,
0.02532958984375,
0.033477783203125,
0.04083251953125,
0.053924560546875,
0.012786865234375,
0.026580810546875,
-0.053802490234375,
0.0173187255859375,
-0.0034503936767578125,
0.02996826171875,
0.046966552734375,
-0.036834716796875,
0.06695556640625,
0.034088134765625,
-0.037994384765625,
-0.03509521484375,
0.0167388916015625,
-0.0909423828125,
-0.00795745849609375,
0.09283447265625,
-0.029510498046875,
-0.0283966064453125,
0.035369873046875,
-0.03619384765625,
0.03875732421875,
-0.01983642578125,
0.038299560546875,
0.05206298828125,
0.034820556640625,
-0.053436279296875,
-0.047515869140625,
0.0323486328125,
0.00408935546875,
-0.055511474609375,
-0.021942138671875,
0.036041259765625,
0.03472900390625,
0.028350830078125,
0.048187255859375,
-0.0265960693359375,
0.025665283203125,
-0.006168365478515625,
0.03704833984375,
0.0029506683349609375,
-0.01523590087890625,
-0.0259246826171875,
-0.007457733154296875,
-0.0171051025390625,
-0.017486572265625
]
] |
lmsys/vicuna-13b-v1.5 | 2023-08-03T11:06:24.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"arxiv:2307.09288",
"arxiv:2306.05685",
"license:llama2",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | lmsys | null | null | lmsys/vicuna-13b-v1.5 | 114 | 92,224 | transformers | 2023-07-29T04:44:46 | ---
inference: false
license: llama2
---
# Vicuna Model Card
## Model Details
Vicuna is a chat assistant trained by fine-tuning Llama 2 on user-shared conversations collected from ShareGPT.
- **Developed by:** [LMSYS](https://lmsys.org/)
- **Model type:** An auto-regressive language model based on the transformer architecture
- **License:** Llama 2 Community License Agreement
- **Finetuned from model:** [Llama 2](https://arxiv.org/abs/2307.09288)
### Model Sources
- **Repository:** https://github.com/lm-sys/FastChat
- **Blog:** https://lmsys.org/blog/2023-03-30-vicuna/
- **Paper:** https://arxiv.org/abs/2306.05685
- **Demo:** https://chat.lmsys.org/
## Uses
The primary use of Vicuna is research on large language models and chatbots.
The primary intended users of the model are researchers and hobbyists in natural language processing, machine learning, and artificial intelligence.
## How to Get Started with the Model
- Command line interface: https://github.com/lm-sys/FastChat#vicuna-weights
- APIs (OpenAI API, Huggingface API): https://github.com/lm-sys/FastChat/tree/main#api
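Vicuna v1.5 expects prompts in FastChat's conversation format. The sketch below builds a single-turn prompt in that style; the exact wording of the system message is taken from FastChat's `vicuna_v1.1` template and should be verified against the FastChat repository before relying on it.

```python
# Minimal sketch of the Vicuna v1.5 prompt format (FastChat "vicuna_v1.1"
# template). The system-message wording is an assumption drawn from the
# FastChat repository; double-check it there before production use.
SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt in the Vicuna v1.5 style."""
    return f"{SYSTEM} USER: {user_message} ASSISTANT:"

prompt = build_prompt("What is the capital of France?")
print(prompt)
```

The resulting string can be passed directly to the FastChat CLI or to a `transformers` text-generation pipeline loaded with these weights.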
## Training Details
Vicuna v1.5 is fine-tuned from Llama 2 with supervised instruction fine-tuning.
The training data is around 125K conversations collected from ShareGPT.com.
See more details in the "Training Details of Vicuna Models" section in the appendix of this [paper](https://arxiv.org/pdf/2306.05685.pdf).
## Evaluation

Vicuna is evaluated with standard benchmarks, human preference, and LLM-as-a-judge. See more details in this [paper](https://arxiv.org/pdf/2306.05685.pdf) and [leaderboard](https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard).
## Differences between Vicuna versions
See [vicuna_weights_version.md](https://github.com/lm-sys/FastChat/blob/main/docs/vicuna_weights_version.md) | 1,965 | [
[
-0.0217132568359375,
-0.0667724609375,
0.0311431884765625,
0.0240631103515625,
-0.04437255859375,
-0.01389312744140625,
-0.006328582763671875,
-0.04644775390625,
0.0200958251953125,
0.025787353515625,
-0.0460205078125,
-0.046630859375,
-0.037506103515625,
-0.004100799560546875,
-0.01081085205078125,
0.056671142578125,
0.01335906982421875,
0.00893402099609375,
-0.005306243896484375,
-0.016815185546875,
-0.06939697265625,
-0.040618896484375,
-0.073486328125,
-0.018829345703125,
0.042449951171875,
0.035308837890625,
0.0450439453125,
0.0458984375,
0.029937744140625,
0.024078369140625,
-0.0029125213623046875,
0.0297393798828125,
-0.038665771484375,
0.0047760009765625,
0.01947021484375,
-0.0704345703125,
-0.053985595703125,
-0.021697998046875,
0.035064697265625,
-0.005626678466796875,
-0.01236724853515625,
0.0173187255859375,
0.0018157958984375,
0.03204345703125,
-0.02496337890625,
0.021697998046875,
-0.042266845703125,
-0.01383209228515625,
-0.0216064453125,
-0.042999267578125,
-0.01568603515625,
-0.0269775390625,
-0.0156402587890625,
-0.035064697265625,
0.0025634765625,
-0.005641937255859375,
0.08441162109375,
0.043243408203125,
-0.024627685546875,
-0.01136016845703125,
-0.05572509765625,
0.03271484375,
-0.063720703125,
0.0267181396484375,
0.03179931640625,
0.046417236328125,
-0.0208740234375,
-0.040985107421875,
-0.0457763671875,
-0.0174407958984375,
0.004680633544921875,
0.009765625,
-0.0201568603515625,
0.0104827880859375,
0.0088653564453125,
0.035430908203125,
-0.034027099609375,
0.03192138671875,
-0.04119873046875,
0.0088348388671875,
0.04095458984375,
0.03131103515625,
0.01221466064453125,
-0.016265869140625,
-0.0301055908203125,
-0.0245513916015625,
-0.0255584716796875,
-0.0005679130554199219,
0.03070068359375,
0.0310821533203125,
-0.03302001953125,
0.03955078125,
-0.0142669677734375,
0.03875732421875,
-0.0094757080078125,
-0.0150604248046875,
0.0242462158203125,
-0.0037975311279296875,
-0.03765869140625,
-0.02178955078125,
0.08685302734375,
0.034454345703125,
-0.004360198974609375,
0.01062774658203125,
0.001750946044921875,
0.00443267822265625,
0.0159454345703125,
-0.05877685546875,
0.00531005859375,
0.04833984375,
-0.0224761962890625,
-0.038055419921875,
-0.004306793212890625,
-0.0343017578125,
-0.0309906005859375,
-0.0174713134765625,
0.0270843505859375,
-0.028533935546875,
-0.0281829833984375,
0.0110015869140625,
0.0007538795471191406,
0.03240966796875,
0.02655029296875,
-0.05255126953125,
0.025665283203125,
0.052642822265625,
0.07940673828125,
-0.007640838623046875,
-0.0293121337890625,
-0.00923919677734375,
-0.0213623046875,
-0.0227813720703125,
0.0718994140625,
-0.005283355712890625,
-0.025543212890625,
-0.0037841796875,
0.0106353759765625,
-0.0012760162353515625,
-0.041839599609375,
0.04632568359375,
-0.0204315185546875,
0.02001953125,
-0.0123443603515625,
-0.037811279296875,
-0.0022830963134765625,
0.0186767578125,
-0.048095703125,
0.091552734375,
0.0031414031982421875,
-0.05291748046875,
0.01357269287109375,
-0.053863525390625,
0.00823974609375,
0.00823211669921875,
-0.005481719970703125,
-0.035003662109375,
-0.00617218017578125,
-0.00295257568359375,
0.039337158203125,
-0.044647216796875,
0.0408935546875,
-0.0189056396484375,
-0.037109375,
0.021392822265625,
-0.043853759765625,
0.07550048828125,
0.0217437744140625,
-0.0283966064453125,
0.0355224609375,
-0.058685302734375,
-0.01230621337890625,
0.021026611328125,
-0.01493072509765625,
-0.020233154296875,
-0.0181121826171875,
0.0024814605712890625,
0.00650787353515625,
0.03082275390625,
-0.0173187255859375,
0.0252838134765625,
-0.0011148452758789062,
0.014312744140625,
0.050323486328125,
0.004123687744140625,
0.00998687744140625,
-0.033233642578125,
0.0313720703125,
-0.0013294219970703125,
0.059661865234375,
0.0101318359375,
-0.039093017578125,
-0.083984375,
-0.032928466796875,
0.0018367767333984375,
0.050689697265625,
-0.04608154296875,
0.046722412109375,
-0.022613525390625,
-0.08197021484375,
-0.06976318359375,
0.01303863525390625,
0.032012939453125,
0.007568359375,
0.0226898193359375,
-0.03375244140625,
-0.047454833984375,
-0.07659912109375,
-0.00563812255859375,
-0.030029296875,
-0.0030956268310546875,
0.032135009765625,
0.0175018310546875,
-0.04254150390625,
0.06256103515625,
-0.0308837890625,
-0.029052734375,
-0.004657745361328125,
-0.0059051513671875,
0.004669189453125,
0.031524658203125,
0.050262451171875,
-0.0462646484375,
-0.022552490234375,
-0.0058746337890625,
-0.06463623046875,
-0.0031414031982421875,
-0.006439208984375,
-0.036346435546875,
0.017364501953125,
0.028778076171875,
-0.047454833984375,
0.039825439453125,
0.054412841796875,
-0.038818359375,
0.034149169921875,
-0.020172119140625,
0.007228851318359375,
-0.10205078125,
0.01125335693359375,
0.0005908012390136719,
-0.0287322998046875,
-0.0439453125,
0.005218505859375,
-0.00839996337890625,
0.0186614990234375,
-0.048095703125,
0.06524658203125,
-0.0262908935546875,
0.00405120849609375,
-0.036376953125,
-0.0158538818359375,
-0.003910064697265625,
0.05792236328125,
0.007457733154296875,
0.054443359375,
0.0309906005859375,
-0.061492919921875,
0.03680419921875,
0.0159454345703125,
-0.0120086669921875,
0.02752685546875,
-0.06787109375,
0.02252197265625,
0.007595062255859375,
0.012908935546875,
-0.0699462890625,
-0.007396697998046875,
0.048095703125,
-0.036346435546875,
0.00904083251953125,
-0.002971649169921875,
-0.044342041015625,
-0.01409912109375,
-0.010894775390625,
0.0116119384765625,
0.034393310546875,
-0.044952392578125,
0.0293121337890625,
0.03314208984375,
0.01654052734375,
-0.037506103515625,
-0.044647216796875,
-0.0029468536376953125,
-0.032318115234375,
-0.01221466064453125,
0.00103759765625,
-0.0247344970703125,
-0.0173492431640625,
-0.01036834716796875,
0.01203155517578125,
-0.00970458984375,
0.00926971435546875,
0.035552978515625,
0.0180511474609375,
-0.00678253173828125,
0.01032257080078125,
-0.00426483154296875,
-0.006439208984375,
-0.00965118408203125,
0.001964569091796875,
0.0743408203125,
-0.03582763671875,
-0.002429962158203125,
-0.0679931640625,
-0.003849029541015625,
0.047943115234375,
0.00519561767578125,
0.08685302734375,
0.051727294921875,
-0.0167083740234375,
0.01277923583984375,
-0.056365966796875,
-0.01419830322265625,
-0.035186767578125,
0.0204010009765625,
-0.0272674560546875,
-0.052215576171875,
0.051605224609375,
0.0188751220703125,
0.02630615234375,
0.036285400390625,
0.059173583984375,
0.0082550048828125,
0.034454345703125,
0.0626220703125,
-0.003818511962890625,
0.0672607421875,
-0.0271453857421875,
-0.01192474365234375,
-0.05712890625,
-0.031585693359375,
-0.04388427734375,
-0.0072784423828125,
-0.05499267578125,
-0.05029296875,
-0.0033931732177734375,
-0.0005345344543457031,
-0.025390625,
0.053314208984375,
-0.044830322265625,
0.01421356201171875,
0.045654296875,
0.01959228515625,
0.02044677734375,
-0.01024627685546875,
0.018951416015625,
0.01091766357421875,
-0.05242919921875,
-0.0526123046875,
0.07818603515625,
0.050506591796875,
0.03753662109375,
0.013671875,
0.0531005859375,
0.0195770263671875,
0.034454345703125,
-0.06805419921875,
0.04095458984375,
0.017852783203125,
-0.055145263671875,
-0.031646728515625,
-0.041748046875,
-0.08148193359375,
0.02655029296875,
-0.015899658203125,
-0.049560546875,
0.026092529296875,
0.01116180419921875,
-0.0145111083984375,
0.02252197265625,
-0.05474853515625,
0.0682373046875,
-0.029632568359375,
-0.0287322998046875,
-0.00243377685546875,
-0.0294952392578125,
0.041046142578125,
0.00726318359375,
0.00508880615234375,
-0.0148468017578125,
-0.006011962890625,
0.0576171875,
-0.042236328125,
0.081298828125,
-0.0118865966796875,
-0.0181427001953125,
0.021026611328125,
-0.0007491111755371094,
0.01331329345703125,
0.005275726318359375,
0.005725860595703125,
0.033782958984375,
0.0099029541015625,
-0.037445068359375,
-0.046051025390625,
0.047088623046875,
-0.0867919921875,
-0.035552978515625,
-0.0286712646484375,
-0.022735595703125,
0.0009217262268066406,
0.0081024169921875,
0.0293121337890625,
0.0193023681640625,
-0.020538330078125,
0.01154327392578125,
0.039703369140625,
-0.02313232421875,
0.004638671875,
0.031463623046875,
-0.0223846435546875,
-0.03302001953125,
0.0478515625,
-0.00609588623046875,
0.01267242431640625,
0.03314208984375,
0.01221466064453125,
-0.0171661376953125,
-0.01102447509765625,
-0.0170135498046875,
0.0318603515625,
-0.042022705078125,
-0.018585205078125,
-0.051971435546875,
-0.022247314453125,
-0.0226593017578125,
0.0308837890625,
-0.0615234375,
-0.018951416015625,
-0.032379150390625,
-0.007450103759765625,
0.0474853515625,
0.034881591796875,
0.01532745361328125,
0.054443359375,
-0.039459228515625,
0.021697998046875,
0.022430419921875,
0.0270233154296875,
0.0007190704345703125,
-0.050689697265625,
-0.0191802978515625,
0.00577545166015625,
-0.0198516845703125,
-0.06658935546875,
0.039886474609375,
-0.012420654296875,
0.042633056640625,
0.03631591796875,
-0.0017557144165039062,
0.067138671875,
-0.0101318359375,
0.046417236328125,
0.0106964111328125,
-0.040313720703125,
0.0399169921875,
-0.01849365234375,
0.016204833984375,
0.0499267578125,
0.022857666015625,
-0.0458984375,
-0.0224151611328125,
-0.0673828125,
-0.053985595703125,
0.03790283203125,
0.01934814453125,
0.0199127197265625,
-0.00444793701171875,
0.036834716796875,
0.012481689453125,
0.01409149169921875,
-0.056427001953125,
-0.04510498046875,
-0.00945281982421875,
-0.0124969482421875,
-0.0139617919921875,
-0.032806396484375,
-0.003536224365234375,
-0.0237884521484375,
0.054046630859375,
-0.0020694732666015625,
0.041107177734375,
0.0087738037109375,
0.006191253662109375,
-0.00008028745651245117,
0.011505126953125,
0.052276611328125,
0.0272674560546875,
-0.03179931640625,
-0.021270751953125,
0.007171630859375,
-0.034027099609375,
-0.0032291412353515625,
0.0009312629699707031,
0.002170562744140625,
0.01165771484375,
0.02203369140625,
0.10894775390625,
0.01003265380859375,
-0.033782958984375,
0.022613525390625,
-0.05206298828125,
-0.0157928466796875,
-0.042449951171875,
0.022918701171875,
0.0115966796875,
0.036102294921875,
0.00949859619140625,
-0.006320953369140625,
0.0017957687377929688,
-0.052734375,
-0.0233306884765625,
0.0218353271484375,
-0.0306549072265625,
-0.0161895751953125,
0.0491943359375,
0.0123748779296875,
-0.04925537109375,
0.0294647216796875,
0.00849151611328125,
-0.0205535888671875,
0.03680419921875,
0.018829345703125,
0.069091796875,
-0.0204315185546875,
0.01245880126953125,
0.041778564453125,
0.021942138671875,
-0.01078033447265625,
0.015350341796875,
-0.01345062255859375,
-0.048095703125,
0.00970458984375,
-0.0423583984375,
-0.04400634765625,
0.02935791015625,
-0.05474853515625,
0.03759765625,
-0.0304412841796875,
-0.035797119140625,
-0.027923583984375,
0.032135009765625,
-0.07257080078125,
0.0004532337188720703,
-0.0029010772705078125,
0.06927490234375,
-0.06658935546875,
0.07464599609375,
0.0343017578125,
-0.03521728515625,
-0.068603515625,
-0.021820068359375,
-0.00646209716796875,
-0.06341552734375,
0.014892578125,
0.0022754669189453125,
-0.0014133453369140625,
-0.0110015869140625,
-0.044952392578125,
-0.0474853515625,
0.10784912109375,
0.0293426513671875,
-0.058685302734375,
-0.003246307373046875,
-0.00002866983413696289,
0.054168701171875,
-0.0129852294921875,
0.042755126953125,
0.043212890625,
0.014801025390625,
0.01375579833984375,
-0.08642578125,
-0.0003781318664550781,
-0.0382080078125,
0.0049285888671875,
-0.018951416015625,
-0.08526611328125,
0.0572509765625,
0.005916595458984375,
-0.004985809326171875,
0.0183258056640625,
0.0614013671875,
0.04827880859375,
0.0156402587890625,
0.032745361328125,
0.02056884765625,
0.07745361328125,
0.00690460205078125,
0.0848388671875,
-0.0108489990234375,
0.0115203857421875,
0.08294677734375,
0.0139617919921875,
0.0718994140625,
0.03656005859375,
0.004161834716796875,
0.037353515625,
0.059661865234375,
0.00859832763671875,
0.021453857421875,
-0.005168914794921875,
0.005077362060546875,
-0.0074310302734375,
0.00380706787109375,
-0.0369873046875,
0.038543701171875,
0.02178955078125,
-0.0175323486328125,
0.016265869140625,
-0.01094818115234375,
0.02099609375,
-0.01629638671875,
-0.0013790130615234375,
0.06353759765625,
0.015411376953125,
-0.04937744140625,
0.0677490234375,
0.004852294921875,
0.06829833984375,
-0.04986572265625,
0.004306793212890625,
-0.04412841796875,
0.023468017578125,
-0.003932952880859375,
-0.020294189453125,
0.008270263671875,
0.0138092041015625,
0.010711669921875,
0.01461029052734375,
0.032928466796875,
-0.0215911865234375,
-0.0265045166015625,
0.028289794921875,
0.036285400390625,
0.04339599609375,
0.0078582763671875,
-0.058319091796875,
0.036865234375,
-0.00933837890625,
-0.037811279296875,
0.01546478271484375,
0.0299835205078125,
-0.01513671875,
0.0704345703125,
0.045379638671875,
0.00970458984375,
-0.001708984375,
0.01861572265625,
0.0634765625,
-0.038665771484375,
-0.0374755859375,
-0.066162109375,
0.0281829833984375,
-0.006656646728515625,
-0.041900634765625,
0.057525634765625,
0.047454833984375,
0.043060302734375,
0.006656646728515625,
0.043182373046875,
0.005962371826171875,
0.022125244140625,
-0.037872314453125,
0.049560546875,
-0.053375244140625,
0.0251617431640625,
-0.0169830322265625,
-0.07177734375,
-0.0184326171875,
0.048370361328125,
-0.01702880859375,
0.0018787384033203125,
0.037017822265625,
0.06011962890625,
0.0033702850341796875,
-0.018524169921875,
0.033477783203125,
0.013763427734375,
0.040618896484375,
0.034881591796875,
0.049072265625,
-0.05426025390625,
0.037933349609375,
-0.01525115966796875,
-0.019378662109375,
-0.038909912109375,
-0.04254150390625,
-0.09130859375,
-0.048248291015625,
-0.0152740478515625,
-0.0274505615234375,
0.0166015625,
0.073974609375,
0.04742431640625,
-0.0254364013671875,
-0.045501708984375,
-0.0011892318725585938,
-0.01021575927734375,
-0.01483917236328125,
-0.01508331298828125,
0.02520751953125,
-0.0010423660278320312,
-0.06427001953125,
0.00859832763671875,
-0.01351165771484375,
0.0152740478515625,
-0.02752685546875,
-0.0281829833984375,
-0.0096893310546875,
0.01049041748046875,
0.023345947265625,
0.041229248046875,
-0.047149658203125,
-0.0022563934326171875,
0.002765655517578125,
-0.0335693359375,
0.0171356201171875,
0.02490234375,
-0.045745849609375,
0.01041412353515625,
0.0226593017578125,
0.0106048583984375,
0.048309326171875,
0.001434326171875,
0.02734375,
-0.03924560546875,
0.041473388671875,
-0.001888275146484375,
0.0247802734375,
0.03070068359375,
-0.031219482421875,
0.034881591796875,
0.0005192756652832031,
-0.0286712646484375,
-0.07159423828125,
-0.00878143310546875,
-0.07965087890625,
-0.0148162841796875,
0.10333251953125,
0.0131378173828125,
-0.04888916015625,
0.005603790283203125,
-0.042083740234375,
0.0496826171875,
-0.0227203369140625,
0.057464599609375,
0.02880859375,
0.01522064208984375,
-0.037750244140625,
-0.05352783203125,
0.03656005859375,
0.00614166259765625,
-0.07391357421875,
0.00281524658203125,
0.01959228515625,
0.0341796875,
0.0010614395141601562,
0.0880126953125,
-0.005008697509765625,
0.0102386474609375,
0.0032787322998046875,
0.037261962890625,
-0.0302581787109375,
-0.032470703125,
-0.0185546875,
-0.024169921875,
0.018768310546875,
-0.033538818359375
]
] |
facebook/sam-vit-base | 2023-07-11T15:21:19.000Z | [
"transformers",
"pytorch",
"tf",
"sam",
"mask-generation",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | null | facebook | null | null | facebook/sam-vit-base | 35 | 91,156 | transformers | 2023-04-19T14:15:29 | ---
license: apache-2.0
---
# Model Card for Segment Anything Model (SAM) - ViT Base (ViT-B) version
<p>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/sam-architecture.png" alt="Model architecture">
<em> Detailed architecture of Segment Anything Model (SAM).</em>
</p>
# Table of Contents
0. [TL;DR](#TL;DR)
1. [Model Details](#model-details)
2. [Usage](#usage)
3. [Citation](#citation)
# TL;DR
[Link to original repository](https://github.com/facebookresearch/segment-anything)
| <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/sam-beancans.png" alt="Snow" width="600" height="600"> | <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/sam-dog-masks.png" alt="Forest" width="600" height="600"> | <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/sam-car-seg.png" alt="Mountains" width="600" height="600"> |
|---------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------|
The **Segment Anything Model (SAM)** produces high quality object masks from input prompts such as points or boxes, and it can be used to generate masks for all objects in an image. It has been trained on a [dataset](https://segment-anything.com/dataset/index.html) of 11 million images and 1.1 billion masks, and has strong zero-shot performance on a variety of segmentation tasks.
The abstract of the paper states:
> We introduce the Segment Anything (SA) project: a new task, model, and dataset for image segmentation. Using our efficient model in a data collection loop, we built the largest segmentation dataset to date (by far), with over 1 billion masks on 11M licensed and privacy respecting images. The model is designed and trained to be promptable, so it can transfer zero-shot to new image distributions and tasks. We evaluate its capabilities on numerous tasks and find that its zero-shot performance is impressive -- often competitive with or even superior to prior fully supervised results. We are releasing the Segment Anything Model (SAM) and corresponding dataset (SA-1B) of 1B masks and 11M images at [https://segment-anything.com](https://segment-anything.com) to foster research into foundation models for computer vision.
**Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copy-pasted from the original [SAM model card](https://github.com/facebookresearch/segment-anything).
# Model Details
The SAM model is made up of 3 modules:
- The `VisionEncoder`: a VIT based image encoder. It computes the image embeddings using attention on patches of the image. Relative Positional Embedding is used.
- The `PromptEncoder`: generates embeddings for points and bounding boxes
- The `MaskDecoder`: a two-way transformer that performs cross-attention between the image embeddings and the point embeddings, and vice versa. Its outputs are fed to the `Neck`.
- The `Neck`: predicts the output masks based on the contextualized masks produced by the `MaskDecoder`.
# Usage
## Prompted-Mask-Generation
```python
from PIL import Image
import requests
from transformers import SamModel, SamProcessor
model = SamModel.from_pretrained("facebook/sam-vit-base").to("cuda")
processor = SamProcessor.from_pretrained("facebook/sam-vit-base")
img_url = "https://huggingface.co/ybelkada/segment-anything/resolve/main/assets/car.png"
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert("RGB")
input_points = [[[450, 600]]] # 2D localization of a window
```
```python
inputs = processor(raw_image, input_points=input_points, return_tensors="pt").to("cuda")
outputs = model(**inputs)
masks = processor.image_processor.post_process_masks(outputs.pred_masks.cpu(), inputs["original_sizes"].cpu(), inputs["reshaped_input_sizes"].cpu())
scores = outputs.iou_scores
```
Among other arguments to generate masks, you can pass 2D locations giving the approximate position of your object of interest, a bounding box wrapping the object of interest (the format should be the x, y coordinates of the top-left and bottom-right corners of the bounding box), or a segmentation mask. At the time of writing, passing text as input is not supported by the official model, according to [the official repository](https://github.com/facebookresearch/segment-anything/issues/4#issuecomment-1497626844).
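The prompt formats above can be illustrated as plain nested lists, following the batch → prompts-per-image → coordinates nesting that `SamProcessor` expects. This is a sketch only: the nesting convention is taken from the `transformers` SAM documentation, and the box coordinates below are made-up illustrative values, so verify both against the docs for your `transformers` version.

```python
# Illustrative sketch of the nested-list prompt formats for SamProcessor
# (batch dimension -> prompts per image -> coordinates). The concrete
# numbers are hypothetical examples, not taken from the model card.

# One image, one point prompt: (x, y) pixel coordinates.
input_points = [[[450, 600]]]

# One image, one box prompt: (x1, y1, x2, y2) = top-left then bottom-right corner.
input_boxes = [[[75, 275, 1725, 850]]]

def nesting_depth(obj):
    """Count how many list levels wrap the innermost scalar."""
    depth = 0
    while isinstance(obj, list):
        depth += 1
        obj = obj[0]
    return depth

print(nesting_depth(input_points))  # 3
print(nesting_depth(input_boxes))   # 3
```

Either structure can then be passed to the processor, e.g. `processor(raw_image, input_boxes=input_boxes, return_tensors="pt")`.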
For more details, refer to this notebook, which shows a walkthrough of how to use the model, with a visual example!
## Automatic-Mask-Generation
The model can be used for generating segmentation masks in a "zero-shot" fashion, given an input image. The model is automatically prompted with a grid of `1024` points,
which are all fed to the model.
The pipeline is made for automatic mask generation. The following snippet demonstrates how easily you can run it (on any device! Simply feed the appropriate `points_per_batch` argument):
```python
from transformers import pipeline
generator = pipeline("mask-generation", model="facebook/sam-vit-base", device=0, points_per_batch=256)
image_url = "https://huggingface.co/ybelkada/segment-anything/resolve/main/assets/car.png"
outputs = generator(image_url, points_per_batch = 256)
```
Now to display the image:
```python
import matplotlib.pyplot as plt
import numpy as np
import requests
from PIL import Image

# Load the image so `raw_image` is defined in this snippet too.
raw_image = Image.open(requests.get(image_url, stream=True).raw).convert("RGB")
def show_mask(mask, ax, random_color=False):
if random_color:
color = np.concatenate([np.random.random(3), np.array([0.6])], axis=0)
else:
color = np.array([30 / 255, 144 / 255, 255 / 255, 0.6])
h, w = mask.shape[-2:]
mask_image = mask.reshape(h, w, 1) * color.reshape(1, 1, -1)
ax.imshow(mask_image)
plt.imshow(np.array(raw_image))
ax = plt.gca()
for mask in outputs["masks"]:
show_mask(mask, ax=ax, random_color=True)
plt.axis("off")
plt.show()
```
# Citation
If you use this model, please use the following BibTeX entry.
```
@article{kirillov2023segany,
title={Segment Anything},
author={Kirillov, Alexander and Mintun, Eric and Ravi, Nikhila and Mao, Hanzi and Rolland, Chloe and Gustafson, Laura and Xiao, Tete and Whitehead, Spencer and Berg, Alexander C. and Lo, Wan-Yen and Doll{\'a}r, Piotr and Girshick, Ross},
journal={arXiv:2304.02643},
year={2023}
}
``` | 6,712 | [
[
-0.037139892578125,
-0.055145263671875,
0.0369873046875,
0.005580902099609375,
-0.03375244140625,
-0.01296234130859375,
0.022369384765625,
-0.038330078125,
0.045654296875,
0.03424072265625,
-0.04345703125,
-0.04571533203125,
-0.042694091796875,
-0.020782470703125,
-0.02349853515625,
0.043212890625,
0.0124359130859375,
-0.01364898681640625,
-0.0082550048828125,
-0.005832672119140625,
-0.0282440185546875,
-0.03240966796875,
-0.053009033203125,
-0.003448486328125,
0.0157470703125,
0.00853729248046875,
0.05169677734375,
0.0950927734375,
0.038177490234375,
0.0228271484375,
-0.01366424560546875,
0.0026340484619140625,
-0.0038089752197265625,
-0.00928497314453125,
-0.0011072158813476562,
-0.0221405029296875,
-0.0209503173828125,
0.0008306503295898438,
0.056854248046875,
0.0343017578125,
0.001071929931640625,
0.019439697265625,
-0.0251007080078125,
0.035369873046875,
-0.03765869140625,
-0.0009655952453613281,
-0.03814697265625,
-0.00811004638671875,
-0.01226806640625,
0.01052093505859375,
-0.013763427734375,
-0.0254669189453125,
-0.0023593902587890625,
-0.037017822265625,
0.01253509521484375,
0.00585174560546875,
0.12103271484375,
0.031982421875,
-0.00977325439453125,
0.005886077880859375,
-0.034881591796875,
0.0419921875,
-0.041900634765625,
0.0243072509765625,
0.01055908203125,
0.0206146240234375,
0.018798828125,
-0.06414794921875,
-0.034271240234375,
0.01248931884765625,
-0.016387939453125,
0.003322601318359375,
-0.0306854248046875,
-0.01058197021484375,
0.02703857421875,
0.0186309814453125,
-0.039520263671875,
-0.0253448486328125,
-0.058746337890625,
-0.0158233642578125,
0.058990478515625,
0.01184844970703125,
0.0307769775390625,
-0.047607421875,
-0.049652099609375,
-0.017974853515625,
-0.03515625,
0.032745361328125,
-0.0014095306396484375,
0.00321197509765625,
-0.0289306640625,
0.045257568359375,
-0.01611328125,
0.0650634765625,
0.027130126953125,
-0.028167724609375,
0.027069091796875,
-0.006683349609375,
-0.03570556640625,
-0.0019550323486328125,
0.042633056640625,
0.041748046875,
-0.00891876220703125,
0.00455474853515625,
-0.005527496337890625,
0.01027679443359375,
0.016815185546875,
-0.07965087890625,
-0.0279541015625,
0.019683837890625,
-0.044525146484375,
-0.0158843994140625,
0.023773193359375,
-0.042877197265625,
-0.00820159912109375,
-0.0187225341796875,
0.043853759765625,
-0.041534423828125,
-0.01319122314453125,
0.006313323974609375,
-0.018096923828125,
0.045989990234375,
0.006275177001953125,
-0.03839111328125,
-0.004909515380859375,
0.029541015625,
0.0750732421875,
-0.01038360595703125,
-0.01324462890625,
-0.018280029296875,
0.013763427734375,
-0.020782470703125,
0.07452392578125,
-0.05084228515625,
-0.0204620361328125,
-0.022064208984375,
0.036834716796875,
-0.0280303955078125,
-0.038665771484375,
0.033538818359375,
-0.0244293212890625,
-0.00966644287109375,
-0.00997161865234375,
-0.030792236328125,
-0.042266845703125,
0.014129638671875,
-0.03692626953125,
0.06640625,
0.0189208984375,
-0.0391845703125,
0.0178375244140625,
-0.057373046875,
-0.031768798828125,
-0.0018949508666992188,
-0.004421234130859375,
-0.0537109375,
0.006412506103515625,
0.0296630859375,
0.042266845703125,
-0.004314422607421875,
0.0096893310546875,
-0.03448486328125,
-0.017974853515625,
0.0182037353515625,
0.004474639892578125,
0.0770263671875,
0.0153350830078125,
-0.03582763671875,
0.017181396484375,
-0.056060791015625,
-0.0007624626159667969,
0.035308837890625,
0.01149749755859375,
-0.004062652587890625,
-0.022430419921875,
-0.01438140869140625,
0.0298309326171875,
0.01495361328125,
-0.051025390625,
-0.00435638427734375,
-0.006832122802734375,
0.046966552734375,
0.053131103515625,
0.035247802734375,
0.0419921875,
-0.038116455078125,
0.0394287109375,
0.010101318359375,
0.0428466796875,
-0.0567626953125,
-0.047393798828125,
-0.07122802734375,
-0.054595947265625,
0.014556884765625,
0.030975341796875,
-0.036895751953125,
0.035614013671875,
0.0028533935546875,
-0.05279541015625,
-0.039642333984375,
-0.026885986328125,
0.0166473388671875,
0.038055419921875,
0.0155029296875,
-0.036163330078125,
-0.045989990234375,
-0.074951171875,
0.019012451171875,
-0.007843017578125,
-0.003482818603515625,
0.03472900390625,
0.03955078125,
-0.01323699951171875,
0.07122802734375,
-0.07208251953125,
-0.0193328857421875,
-0.0004963874816894531,
-0.0303497314453125,
0.0008883476257324219,
0.053863525390625,
0.04803466796875,
-0.0576171875,
-0.037200927734375,
-0.01268768310546875,
-0.056365966796875,
0.0039825439453125,
0.0012798309326171875,
-0.0311431884765625,
0.0157012939453125,
0.0277252197265625,
-0.052703857421875,
0.052459716796875,
0.0193634033203125,
-0.033233642578125,
0.03668212890625,
0.0166168212890625,
-0.0031604766845703125,
-0.0830078125,
0.0293731689453125,
0.01285552978515625,
-0.0284576416015625,
-0.0491943359375,
0.01318359375,
-0.00021791458129882812,
-0.0286102294921875,
-0.049835205078125,
0.04840087890625,
-0.017181396484375,
-0.005771636962890625,
-0.00788116455078125,
-0.0011186599731445312,
0.0214385986328125,
0.058746337890625,
0.015472412109375,
0.0234832763671875,
0.062103271484375,
-0.047088623046875,
0.0181121826171875,
0.033538818359375,
-0.0341796875,
0.06292724609375,
-0.06182861328125,
-0.002140045166015625,
-0.0181732177734375,
0.0195465087890625,
-0.0755615234375,
-0.045440673828125,
0.039093017578125,
-0.032958984375,
0.0180816650390625,
-0.0185089111328125,
-0.00823974609375,
-0.0292510986328125,
-0.0134735107421875,
0.026885986328125,
0.048675537109375,
-0.039642333984375,
0.044189453125,
0.04962158203125,
-0.002960205078125,
-0.0162506103515625,
-0.0450439453125,
-0.024505615234375,
-0.024322509765625,
-0.06768798828125,
0.041107177734375,
0.0018405914306640625,
-0.00482177734375,
0.0218963623046875,
-0.00672149658203125,
-0.0018482208251953125,
-0.017242431640625,
0.049591064453125,
0.047393798828125,
-0.00762939453125,
-0.015777587890625,
-0.0028095245361328125,
-0.0166778564453125,
-0.003345489501953125,
-0.014556884765625,
0.055328369140625,
-0.0101470947265625,
-0.036224365234375,
-0.046356201171875,
0.01080322265625,
0.032623291015625,
-0.035369873046875,
0.0293426513671875,
0.052642822265625,
-0.03302001953125,
-0.0007314682006835938,
-0.054046630859375,
-0.0164031982421875,
-0.036041259765625,
0.021820068359375,
-0.03076171875,
-0.06805419921875,
0.057647705078125,
0.00682830810546875,
0.0009918212890625,
0.054595947265625,
0.0289306640625,
-0.01229095458984375,
0.07904052734375,
0.04486083984375,
0.01026153564453125,
0.04364013671875,
-0.044586181640625,
0.0169219970703125,
-0.07257080078125,
-0.049652099609375,
-0.02276611328125,
-0.03375244140625,
-0.0311431884765625,
-0.053558349609375,
0.03472900390625,
0.01230621337890625,
-0.039703369140625,
0.038330078125,
-0.06854248046875,
0.04498291015625,
0.0400390625,
0.01446533203125,
-0.00862884521484375,
0.01776123046875,
-0.0033435821533203125,
0.01323699951171875,
-0.050384521484375,
-0.0205535888671875,
0.053863525390625,
0.0250396728515625,
0.0350341796875,
-0.031494140625,
0.046234130859375,
0.004627227783203125,
0.006099700927734375,
-0.04229736328125,
0.047332763671875,
-0.00371551513671875,
-0.0687255859375,
-0.005664825439453125,
-0.00604248046875,
-0.06500244140625,
0.02374267578125,
0.005558013916015625,
-0.0714111328125,
0.049407958984375,
-0.0014047622680664062,
-0.0467529296875,
0.04656982421875,
-0.059814453125,
0.0699462890625,
-0.00542449951171875,
-0.01203155517578125,
0.0273590087890625,
-0.0606689453125,
0.03460693359375,
0.01018524169921875,
-0.005542755126953125,
-0.02020263671875,
0.01030731201171875,
0.06256103515625,
-0.0272064208984375,
0.06396484375,
-0.0286407470703125,
0.02191162109375,
0.055633544921875,
-0.00946807861328125,
0.0304718017578125,
-0.003742218017578125,
0.00603485107421875,
0.0255279541015625,
0.012176513671875,
-0.03985595703125,
-0.0275726318359375,
0.046539306640625,
-0.053497314453125,
-0.040374755859375,
-0.029205322265625,
-0.0246734619140625,
0.02386474609375,
0.01303863525390625,
0.0323486328125,
0.0251922607421875,
0.01039886474609375,
0.009674072265625,
0.031402587890625,
-0.0239715576171875,
0.04296875,
0.018707275390625,
-0.030487060546875,
-0.04620361328125,
0.08807373046875,
0.00479888916015625,
0.0181884765625,
0.0075225830078125,
-0.0050048828125,
-0.0245361328125,
-0.003253936767578125,
-0.040771484375,
0.036468505859375,
-0.04583740234375,
-0.043609619140625,
-0.043731689453125,
-0.046539306640625,
-0.0262298583984375,
-0.03399658203125,
-0.03326416015625,
-0.0426025390625,
-0.0183258056640625,
-0.005161285400390625,
0.0245208740234375,
0.037109375,
-0.0181427001953125,
0.041473388671875,
-0.039520263671875,
0.014892578125,
0.0253448486328125,
0.0239715576171875,
0.00038361549377441406,
-0.041290283203125,
-0.0078277587890625,
0.0006041526794433594,
-0.046478271484375,
-0.0469970703125,
0.037841796875,
-0.0060882568359375,
0.033477783203125,
0.050018310546875,
-0.002712249755859375,
0.07684326171875,
-0.0189208984375,
0.064208984375,
0.0218963623046875,
-0.06805419921875,
0.039794921875,
-0.00982666015625,
0.0179595947265625,
0.02264404296875,
0.0222625732421875,
-0.038543701171875,
-0.00046944618225097656,
-0.07122802734375,
-0.06280517578125,
0.08154296875,
0.0141143798828125,
-0.0077972412109375,
0.007251739501953125,
0.024261474609375,
-0.01094818115234375,
0.018341064453125,
-0.05712890625,
-0.03887939453125,
-0.0286102294921875,
0.01398468017578125,
0.01666259765625,
-0.00537872314453125,
-0.0036220550537109375,
-0.03924560546875,
0.0615234375,
0.01178741455078125,
0.05340576171875,
0.0276947021484375,
-0.005313873291015625,
-0.01294708251953125,
-0.017608642578125,
0.048614501953125,
0.0445556640625,
-0.0247650146484375,
0.0015745162963867188,
-0.005825042724609375,
-0.01360321044921875,
0.0084991455078125,
0.017608642578125,
-0.046661376953125,
0.00966644287109375,
0.00795745849609375,
0.0858154296875,
-0.009979248046875,
-0.0280609130859375,
0.032196044921875,
0.0226593017578125,
-0.024444580078125,
-0.0209197998046875,
0.00865936279296875,
0.006610870361328125,
0.03594970703125,
0.03472900390625,
0.019256591796875,
-0.0058746337890625,
-0.0289764404296875,
0.0175018310546875,
0.036529541015625,
-0.035491943359375,
-0.026397705078125,
0.0546875,
-0.004123687744140625,
-0.0256500244140625,
0.0276336669921875,
-0.02880859375,
-0.05413818359375,
0.06488037109375,
0.042327880859375,
0.0687255859375,
-0.035125732421875,
0.04217529296875,
0.057952880859375,
0.0253753662109375,
0.0183868408203125,
0.005176544189453125,
0.004474639892578125,
-0.03521728515625,
-0.0279083251953125,
-0.06976318359375,
-0.0079803466796875,
0.02349853515625,
-0.03662109375,
0.0181884765625,
-0.04296875,
-0.00994873046875,
0.00907135009765625,
-0.00972747802734375,
-0.040771484375,
0.03411865234375,
0.00615692138671875,
0.057861328125,
-0.06396484375,
0.041107177734375,
0.05401611328125,
-0.0400390625,
-0.07012939453125,
-0.0162506103515625,
-0.00955963134765625,
-0.07940673828125,
0.0234832763671875,
0.033355712890625,
0.017791748046875,
-0.0013151168823242188,
-0.05712890625,
-0.08209228515625,
0.0916748046875,
0.0247344970703125,
-0.0167236328125,
-0.0052490234375,
0.0182037353515625,
0.0082550048828125,
-0.0501708984375,
0.0102081298828125,
0.045806884765625,
0.043670654296875,
0.040679931640625,
-0.03314208984375,
0.0203094482421875,
-0.024017333984375,
0.013336181640625,
0.018157958984375,
-0.07122802734375,
0.0714111328125,
-0.01155853271484375,
-0.027984619140625,
0.007785797119140625,
0.03173828125,
0.029205322265625,
0.034637451171875,
0.04827880859375,
0.05535888671875,
0.03704833984375,
-0.0211639404296875,
0.073974609375,
-0.0280303955078125,
0.01837158203125,
0.055999755859375,
0.0012235641479492188,
0.041107177734375,
0.02215576171875,
-0.01532745361328125,
0.03515625,
0.0694580078125,
-0.036346435546875,
0.038604736328125,
-0.00046515464782714844,
-0.003173828125,
-0.02978515625,
-0.0284271240234375,
-0.03289794921875,
0.041015625,
0.01355743408203125,
-0.043060302734375,
-0.01161956787109375,
0.0048980712890625,
0.004119873046875,
-0.033538818359375,
-0.0272369384765625,
0.04681396484375,
0.0030975341796875,
-0.042999267578125,
0.0489501953125,
0.017486572265625,
0.0290374755859375,
-0.04327392578125,
0.00904083251953125,
-0.018768310546875,
0.0034942626953125,
-0.029449462890625,
-0.044158935546875,
0.044586181640625,
-0.007549285888671875,
-0.01004791259765625,
0.0118255615234375,
0.0682373046875,
-0.0192413330078125,
-0.061248779296875,
-0.0007457733154296875,
0.0128326416015625,
0.0264434814453125,
-0.0169219970703125,
-0.060211181640625,
0.0291900634765625,
0.0210723876953125,
-0.0170745849609375,
0.021331787109375,
0.0152587890625,
0.00079345703125,
0.037078857421875,
0.060760498046875,
-0.004978179931640625,
0.018768310546875,
-0.0269622802734375,
0.080078125,
-0.05694580078125,
-0.040283203125,
-0.057037353515625,
0.06292724609375,
-0.0193939208984375,
-0.007110595703125,
0.051605224609375,
0.058074951171875,
0.06903076171875,
-0.005748748779296875,
0.0419921875,
-0.032958984375,
0.021026611328125,
-0.0139923095703125,
0.04156494140625,
-0.055999755859375,
-0.00824737548828125,
-0.0230255126953125,
-0.0841064453125,
-0.0279083251953125,
0.0693359375,
-0.0178375244140625,
0.0157470703125,
0.04364013671875,
0.0701904296875,
-0.0149993896484375,
-0.003936767578125,
0.020050048828125,
0.008758544921875,
0.01451873779296875,
0.0234832763671875,
0.04608154296875,
-0.04095458984375,
0.043365478515625,
-0.06256103515625,
-0.007266998291015625,
-0.005466461181640625,
-0.045867919921875,
-0.061614990234375,
-0.061126708984375,
-0.041839599609375,
-0.0245208740234375,
-0.01351165771484375,
0.044464111328125,
0.08941650390625,
-0.046905517578125,
-0.00754547119140625,
0.0097198486328125,
0.014495849609375,
-0.014190673828125,
-0.017333984375,
0.050750732421875,
0.00284576416015625,
-0.06744384765625,
0.008575439453125,
0.03875732421875,
0.0079193115234375,
-0.005901336669921875,
-0.00339508056640625,
-0.01558685302734375,
0.0168914794921875,
0.04779052734375,
0.0297393798828125,
-0.049041748046875,
-0.0191650390625,
0.0032024383544921875,
0.0104217529296875,
0.0199127197265625,
0.03509521484375,
-0.036468505859375,
0.0312347412109375,
0.0239410400390625,
0.03228759765625,
0.0633544921875,
0.0259857177734375,
-0.003803253173828125,
-0.046966552734375,
0.01430511474609375,
-0.006229400634765625,
0.0228271484375,
0.031982421875,
-0.01361083984375,
0.04681396484375,
0.0276641845703125,
-0.03924560546875,
-0.06524658203125,
0.00954437255859375,
-0.08990478515625,
-0.023773193359375,
0.08367919921875,
-0.02020263671875,
-0.054473876953125,
0.01360321044921875,
-0.02178955078125,
0.0218048095703125,
-0.031951904296875,
0.041229248046875,
0.023040771484375,
-0.01690673828125,
-0.0244293212890625,
-0.0159759521484375,
0.0279083251953125,
0.01226043701171875,
-0.053863525390625,
-0.0297393798828125,
0.0280914306640625,
0.0308685302734375,
0.03570556640625,
0.03765869140625,
-0.0196685791015625,
0.01366424560546875,
0.002712249755859375,
0.0135498046875,
-0.0231170654296875,
-0.0220184326171875,
-0.0206451416015625,
0.0247955322265625,
-0.033660888671875,
-0.045989990234375
]
] |
CAMeL-Lab/bert-base-arabic-camelbert-mix-ner | 2021-10-17T11:13:00.000Z | [
"transformers",
"pytorch",
"tf",
"bert",
"token-classification",
"ar",
"arxiv:2103.06678",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | CAMeL-Lab | null | null | CAMeL-Lab/bert-base-arabic-camelbert-mix-ner | 7 | 91,118 | transformers | 2022-03-02T23:29:04 | ---
language:
- ar
license: apache-2.0
widget:
- text: "إمارة أبوظبي هي إحدى إمارات دولة الإمارات العربية المتحدة السبع"
---
# CAMeLBERT-Mix NER Model
## Model description
**CAMeLBERT-Mix NER Model** is a Named Entity Recognition (NER) model that was built by fine-tuning the [CAMeLBERT Mix](https://huggingface.co/CAMeL-Lab/bert-base-arabic-camelbert-mix/) model.
For the fine-tuning, we used the [ANERcorp](https://camel.abudhabi.nyu.edu/anercorp/) dataset.
Our fine-tuning procedure and the hyperparameters we used can be found in our paper, *"[The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models](https://arxiv.org/abs/2103.06678)."*
Our fine-tuning code can be found [here](https://github.com/CAMeL-Lab/CAMeLBERT).
## Intended uses
You can use the CAMeLBERT-Mix NER model directly as part of our [CAMeL Tools](https://github.com/CAMeL-Lab/camel_tools) NER component (*recommended*) or as part of the transformers pipeline.
#### How to use
To use the model with the [CAMeL Tools](https://github.com/CAMeL-Lab/camel_tools) NER component:
```python
>>> from camel_tools.ner import NERecognizer
>>> from camel_tools.tokenizers.word import simple_word_tokenize
>>> ner = NERecognizer('CAMeL-Lab/bert-base-arabic-camelbert-mix-ner')
>>> sentence = simple_word_tokenize('إمارة أبوظبي هي إحدى إمارات دولة الإمارات العربية المتحدة السبع')
>>> ner.predict_sentence(sentence)
['O', 'B-LOC', 'O', 'O', 'O', 'O', 'B-LOC', 'I-LOC', 'I-LOC', 'O']
```
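Since the predicted tags are aligned one-to-one with the tokens, they can be paired directly. A minimal sketch (not part of CAMeL Tools) using the example sentence and output above:

```python
# Tokens produced by simple_word_tokenize for the example sentence,
# paired with the tags returned by predict_sentence.
tokens = ['إمارة', 'أبوظبي', 'هي', 'إحدى', 'إمارات', 'دولة',
          'الإمارات', 'العربية', 'المتحدة', 'السبع']
tags = ['O', 'B-LOC', 'O', 'O', 'O', 'O', 'B-LOC', 'I-LOC', 'I-LOC', 'O']

# Keep only tokens that belong to a named entity.
entities = [(tok, tag) for tok, tag in zip(tokens, tags) if tag != 'O']
print(entities)
```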
You can also use the NER model directly with a transformers pipeline:
```python
>>> from transformers import pipeline
>>> ner = pipeline('ner', model='CAMeL-Lab/bert-base-arabic-camelbert-mix-ner')
>>> ner("إمارة أبوظبي هي إحدى إمارات دولة الإمارات العربية المتحدة السبع")
[{'word': 'أبوظبي',
'score': 0.9895730018615723,
'entity': 'B-LOC',
'index': 2,
'start': 6,
'end': 12},
{'word': 'الإمارات',
'score': 0.8156259655952454,
'entity': 'B-LOC',
'index': 8,
'start': 33,
'end': 41},
{'word': 'العربية',
'score': 0.890906810760498,
'entity': 'I-LOC',
'index': 9,
'start': 42,
'end': 49},
{'word': 'المتحدة',
'score': 0.8169114589691162,
'entity': 'I-LOC',
'index': 10,
'start': 50,
'end': 57}]
```
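The token-level predictions from the transformers pipeline can be merged into entity spans with a small helper. This is a hedged sketch (not part of CAMeL Tools or transformers); it assumes the pipeline output format shown above, with `entity`, `word`, `start`, and `end` fields:

```python
def merge_bio_spans(predictions):
    """Merge token-level B-/I- predictions into (label, text, start, end) spans."""
    spans = []
    for p in predictions:
        prefix, _, label = p["entity"].partition("-")
        # Start a new span on a B- tag, or when an I- tag has no matching open span.
        if prefix == "B" or not spans or spans[-1][0] != label:
            spans.append([label, p["word"], p["start"], p["end"]])
        else:
            # I- token continuing the previous span: extend its text and end offset.
            spans[-1][1] += " " + p["word"]
            spans[-1][3] = p["end"]
    return [tuple(s) for s in spans]

# The pipeline output shown above, reduced to the fields the helper uses:
preds = [
    {"word": "أبوظبي", "entity": "B-LOC", "start": 6, "end": 12},
    {"word": "الإمارات", "entity": "B-LOC", "start": 33, "end": 41},
    {"word": "العربية", "entity": "I-LOC", "start": 42, "end": 49},
    {"word": "المتحدة", "entity": "I-LOC", "start": 50, "end": 57},
]
for label, text, start, end in merge_bio_spans(preds):
    print(label, text, start, end)
```

On the example above this yields two `LOC` spans: one for "أبوظبي" and one covering "الإمارات العربية المتحدة".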
*Note*: to download our models, you need `transformers>=3.5.0`.
Otherwise, you can download the models manually.
## Citation
```bibtex
@inproceedings{inoue-etal-2021-interplay,
title = "The Interplay of Variant, Size, and Task Type in {A}rabic Pre-trained Language Models",
author = "Inoue, Go and
Alhafni, Bashar and
Baimukan, Nurpeiis and
Bouamor, Houda and
Habash, Nizar",
booktitle = "Proceedings of the Sixth Arabic Natural Language Processing Workshop",
month = apr,
year = "2021",
address = "Kyiv, Ukraine (Online)",
publisher = "Association for Computational Linguistics",
abstract = "In this paper, we explore the effects of language variants, data sizes, and fine-tuning task types in Arabic pre-trained language models. To do so, we build three pre-trained language models across three variants of Arabic: Modern Standard Arabic (MSA), dialectal Arabic, and classical Arabic, in addition to a fourth language model which is pre-trained on a mix of the three. We also examine the importance of pre-training data size by building additional models that are pre-trained on a scaled-down set of the MSA variant. We compare our different models to each other, as well as to eight publicly available models by fine-tuning them on five NLP tasks spanning 12 datasets. Our results suggest that the variant proximity of pre-training data to fine-tuning data is more important than the pre-training data size. We exploit this insight in defining an optimized system selection model for the studied tasks.",
}
``` | 3,803 | [
[
-0.04754638671875,
-0.057464599609375,
0.004772186279296875,
0.0190887451171875,
-0.01788330078125,
-0.004878997802734375,
-0.0245208740234375,
-0.036712646484375,
0.01497650146484375,
0.038238525390625,
-0.03460693359375,
-0.04010009765625,
-0.0699462890625,
0.0167236328125,
-0.03509521484375,
0.0921630859375,
-0.0176239013671875,
0.0005650520324707031,
0.01499176025390625,
-0.0263519287109375,
-0.022064208984375,
-0.038787841796875,
-0.040924072265625,
-0.027130126953125,
0.031219482421875,
0.0239105224609375,
0.07049560546875,
0.0302734375,
0.035675048828125,
0.0233154296875,
-0.01158905029296875,
0.01158905029296875,
-0.0148773193359375,
-0.0035343170166015625,
0.0012445449829101562,
-0.010650634765625,
-0.021636962890625,
0.002300262451171875,
0.046173095703125,
0.051788330078125,
-0.0153045654296875,
0.0275726318359375,
-0.002471923828125,
0.045257568359375,
-0.045562744140625,
0.006595611572265625,
-0.047119140625,
-0.0176849365234375,
-0.015533447265625,
0.01081085205078125,
-0.005573272705078125,
-0.0078125,
0.015899658203125,
-0.0279693603515625,
0.00969696044921875,
0.01371002197265625,
0.10498046875,
0.02947998046875,
-0.0277252197265625,
-0.028839111328125,
-0.049346923828125,
0.0745849609375,
-0.07257080078125,
0.0235137939453125,
0.042022705078125,
0.0108184814453125,
-0.02783203125,
-0.052825927734375,
-0.0693359375,
-0.0282440185546875,
-0.0201416015625,
-0.01229095458984375,
-0.01534271240234375,
-0.0008730888366699219,
0.0206451416015625,
0.0245361328125,
-0.0305023193359375,
-0.0031871795654296875,
-0.012908935546875,
-0.0223846435546875,
0.0369873046875,
0.00560760498046875,
0.038482666015625,
-0.0140380859375,
0.003604888916015625,
-0.0176849365234375,
-0.047821044921875,
0.011932373046875,
0.029388427734375,
0.0273590087890625,
-0.0242919921875,
0.05560302734375,
-0.0254669189453125,
0.0665283203125,
0.00707244873046875,
-0.00728607177734375,
0.032257080078125,
-0.0252685546875,
-0.0156707763671875,
0.00560760498046875,
0.06231689453125,
-0.0005178451538085938,
0.0128021240234375,
0.004253387451171875,
-0.026031494140625,
0.00395965576171875,
0.005138397216796875,
-0.07769775390625,
-0.0145263671875,
0.00848388671875,
-0.0374755859375,
-0.0025157928466796875,
-0.01358795166015625,
-0.040252685546875,
-0.002498626708984375,
-0.0220794677734375,
0.05230712890625,
-0.0509033203125,
-0.006420135498046875,
0.024932861328125,
0.0043792724609375,
0.01837158203125,
0.00943756103515625,
-0.06732177734375,
0.0297088623046875,
0.02734375,
0.05523681640625,
-0.0005974769592285156,
-0.02337646484375,
-0.0309600830078125,
-0.0012025833129882812,
-0.0034389495849609375,
0.037750244140625,
-0.045654296875,
-0.0300445556640625,
0.00020611286163330078,
-0.01139068603515625,
-0.01152801513671875,
-0.043914794921875,
0.040008544921875,
-0.04296875,
0.021240234375,
0.0011472702026367188,
-0.0440673828125,
-0.029205322265625,
0.014434814453125,
-0.053070068359375,
0.08551025390625,
0.01383209228515625,
-0.08111572265625,
0.023956298828125,
-0.05511474609375,
-0.03997802734375,
0.0001938343048095703,
-0.01546478271484375,
-0.0458984375,
-0.005558013916015625,
0.0272064208984375,
0.0198974609375,
-0.0265045166015625,
0.0135040283203125,
-0.00499725341796875,
-0.0227508544921875,
0.026031494140625,
-0.017303466796875,
0.060150146484375,
0.008056640625,
-0.0399169921875,
0.011444091796875,
-0.07379150390625,
0.003696441650390625,
0.00013387203216552734,
-0.035003662109375,
-0.0158843994140625,
0.007320404052734375,
0.029144287109375,
0.0272674560546875,
0.033355712890625,
-0.03936767578125,
0.005584716796875,
-0.048095703125,
0.033660888671875,
0.031982421875,
-0.0086517333984375,
0.0229034423828125,
-0.039794921875,
0.034149169921875,
0.01132965087890625,
-0.005382537841796875,
0.02044677734375,
-0.031158447265625,
-0.08612060546875,
-0.0295257568359375,
0.04632568359375,
0.033782958984375,
-0.053436279296875,
0.048614501953125,
-0.0254058837890625,
-0.05517578125,
-0.062286376953125,
-0.003936767578125,
0.0243682861328125,
0.0256805419921875,
0.04925537109375,
-0.0245208740234375,
-0.046630859375,
-0.0771484375,
-0.031341552734375,
0.0017709732055664062,
0.0091552734375,
0.025787353515625,
0.05535888671875,
-0.043701171875,
0.07928466796875,
-0.0296478271484375,
-0.0209808349609375,
-0.035308837890625,
0.0300445556640625,
0.03857421875,
0.05181884765625,
0.040283203125,
-0.052825927734375,
-0.04547119140625,
0.00008803606033325195,
-0.0163726806640625,
0.00962066650390625,
0.01346588134765625,
-0.006099700927734375,
0.0235443115234375,
0.028778076171875,
-0.05511474609375,
0.04449462890625,
0.043243408203125,
-0.0274200439453125,
0.048583984375,
-0.0205841064453125,
-0.0019369125366210938,
-0.08111572265625,
0.003604888916015625,
-0.011932373046875,
0.0056610107421875,
-0.051483154296875,
-0.0019502639770507812,
-0.00470733642578125,
0.018280029296875,
-0.0242462158203125,
0.051971435546875,
-0.033477783203125,
0.017913818359375,
-0.018951416015625,
-0.01454925537109375,
-0.018280029296875,
0.04315185546875,
0.0004723072052001953,
0.0643310546875,
0.042388916015625,
-0.052825927734375,
-0.01163482666015625,
0.02630615234375,
-0.053314208984375,
-0.0063323974609375,
-0.0640869140625,
0.00730133056640625,
-0.01123046875,
0.005619049072265625,
-0.06390380859375,
-0.006549835205078125,
0.0226898193359375,
-0.030609130859375,
0.0234375,
0.00506591796875,
-0.053070068359375,
-0.0106048583984375,
-0.0026493072509765625,
0.04510498046875,
0.032562255859375,
-0.021240234375,
0.0701904296875,
0.020660400390625,
-0.0115203857421875,
-0.055084228515625,
-0.042877197265625,
-0.005397796630859375,
-0.01560211181640625,
-0.03729248046875,
0.04986572265625,
-0.018768310546875,
-0.0030117034912109375,
0.0030841827392578125,
-0.003826141357421875,
-0.005771636962890625,
0.00839996337890625,
0.030792236328125,
0.0205535888671875,
-0.007518768310546875,
0.004070281982421875,
0.01357269287109375,
-0.001728057861328125,
-0.01148223876953125,
-0.004405975341796875,
0.072021484375,
-0.016021728515625,
-0.01013946533203125,
-0.031982421875,
0.0301666259765625,
0.0302581787109375,
-0.039031982421875,
0.0921630859375,
0.06256103515625,
-0.032684326171875,
0.0052490234375,
-0.041015625,
-0.00980377197265625,
-0.038177490234375,
0.041900634765625,
-0.036712646484375,
-0.07171630859375,
0.03289794921875,
0.011322021484375,
0.0016260147094726562,
0.05072021484375,
0.0521240234375,
0.01398468017578125,
0.0841064453125,
0.053070068359375,
-0.02703857421875,
0.03997802734375,
-0.0284423828125,
0.034759521484375,
-0.05810546875,
-0.00946044921875,
-0.058380126953125,
-0.0306854248046875,
-0.06072998046875,
-0.0193939208984375,
0.0153961181640625,
0.01013946533203125,
-0.037261962890625,
0.031829833984375,
-0.0239105224609375,
0.01111602783203125,
0.048431396484375,
-0.003917694091796875,
0.00615692138671875,
0.010894775390625,
-0.019927978515625,
0.0006289482116699219,
-0.041290283203125,
-0.0478515625,
0.078125,
0.006603240966796875,
0.038970947265625,
0.022613525390625,
0.0654296875,
0.0097503662109375,
0.0201263427734375,
-0.053070068359375,
0.044403076171875,
-0.0034580230712890625,
-0.04400634765625,
-0.0305633544921875,
-0.0211639404296875,
-0.087646484375,
0.0196685791015625,
-0.0287628173828125,
-0.0621337890625,
0.00971221923828125,
-0.00858306884765625,
-0.024322509765625,
0.0010995864868164062,
-0.035308837890625,
0.061187744140625,
0.003078460693359375,
0.00479888916015625,
-0.0279693603515625,
-0.0452880859375,
-0.0027790069580078125,
-0.00531005859375,
0.021240234375,
-0.01415252685546875,
0.0016622543334960938,
0.0743408203125,
-0.04376220703125,
0.047943115234375,
0.01354217529296875,
0.0075225830078125,
0.0182952880859375,
0.005687713623046875,
0.038330078125,
-0.0007371902465820312,
-0.016632080078125,
0.0293731689453125,
0.002285003662109375,
-0.03997802734375,
-0.031829833984375,
0.0458984375,
-0.09442138671875,
-0.042327880859375,
-0.03729248046875,
-0.05780029296875,
-0.00630950927734375,
0.027374267578125,
0.039031982421875,
0.03363037109375,
0.001842498779296875,
0.00827789306640625,
0.01479339599609375,
-0.02001953125,
0.04315185546875,
0.043487548828125,
-0.01073455810546875,
-0.0242919921875,
0.059295654296875,
0.0180511474609375,
0.00713348388671875,
0.034759521484375,
0.0032978057861328125,
-0.007415771484375,
-0.0626220703125,
-0.0200042724609375,
0.02691650390625,
-0.037445068359375,
-0.01058197021484375,
-0.06591796875,
-0.032135009765625,
-0.030792236328125,
0.014495849609375,
-0.012176513671875,
-0.02984619140625,
-0.03765869140625,
0.003173828125,
0.03851318359375,
0.045074462890625,
0.0144500732421875,
0.03131103515625,
-0.0616455078125,
0.01381683349609375,
0.0140838623046875,
0.0095977783203125,
0.003376007080078125,
-0.063720703125,
-0.0340576171875,
0.0204010009765625,
-0.0187530517578125,
-0.06689453125,
0.051544189453125,
0.017822265625,
0.0322265625,
0.01482391357421875,
-0.00565338134765625,
0.048736572265625,
-0.04248046875,
0.05621337890625,
0.0031299591064453125,
-0.0845947265625,
0.029876708984375,
-0.014312744140625,
0.0207977294921875,
0.042816162109375,
0.04296875,
-0.04913330078125,
-0.0234527587890625,
-0.05035400390625,
-0.0760498046875,
0.060791015625,
0.04150390625,
0.00807952880859375,
-0.008209228515625,
0.012054443359375,
0.0071868896484375,
0.028533935546875,
-0.05670166015625,
-0.04949951171875,
-0.01248931884765625,
-0.03564453125,
-0.0188751220703125,
-0.0208892822265625,
0.0015974044799804688,
-0.037261962890625,
0.0750732421875,
0.0103912353515625,
0.0168609619140625,
0.0100250244140625,
-0.02899169921875,
0.01277923583984375,
0.0311431884765625,
0.03863525390625,
0.0232696533203125,
-0.0168304443359375,
0.00199127197265625,
0.0196075439453125,
-0.049407958984375,
0.00104522705078125,
0.033599853515625,
-0.0264739990234375,
0.0098724365234375,
0.0238189697265625,
0.07501220703125,
0.0012044906616210938,
-0.061859130859375,
0.0272674560546875,
-0.0030231475830078125,
-0.0105133056640625,
-0.0243988037109375,
0.004062652587890625,
0.007110595703125,
0.00905609130859375,
0.01641845703125,
0.00791168212890625,
0.0167999267578125,
-0.0293731689453125,
-0.00457000732421875,
0.0287628173828125,
-0.028533935546875,
-0.01438140869140625,
0.02642822265625,
-0.020172119140625,
-0.038787841796875,
0.0533447265625,
-0.029327392578125,
-0.043365478515625,
0.050933837890625,
0.040924072265625,
0.05145263671875,
-0.01318359375,
0.017608642578125,
0.0330810546875,
0.018829345703125,
-0.0012769699096679688,
0.031494140625,
0.005950927734375,
-0.0789794921875,
-0.022796630859375,
-0.06866455078125,
-0.0090179443359375,
0.016143798828125,
-0.040557861328125,
0.018157958984375,
-0.0306396484375,
-0.0200653076171875,
0.00710296630859375,
0.006267547607421875,
-0.07147216796875,
0.0157623291015625,
0.0008640289306640625,
0.04742431640625,
-0.04046630859375,
0.0772705078125,
0.06396484375,
-0.044830322265625,
-0.0902099609375,
-0.0004582405090332031,
-0.031341552734375,
-0.053955078125,
0.06591796875,
0.0246124267578125,
-0.0077667236328125,
0.01061248779296875,
-0.031829833984375,
-0.09814453125,
0.060455322265625,
0.0136260986328125,
-0.035552978515625,
0.00514984130859375,
0.022430419921875,
0.04095458984375,
-0.0294189453125,
0.04364013671875,
0.0531005859375,
0.033233642578125,
-0.0096282958984375,
-0.07757568359375,
0.015625,
-0.032318115234375,
0.00418853759765625,
0.0289306640625,
-0.036285400390625,
0.08551025390625,
-0.0096893310546875,
-0.0174560546875,
0.0241546630859375,
0.057464599609375,
0.01340484619140625,
-0.0021381378173828125,
0.03656005859375,
0.049713134765625,
0.042236328125,
0.004825592041015625,
0.06658935546875,
-0.039825439453125,
0.032318115234375,
0.0941162109375,
-0.0036983489990234375,
0.0799560546875,
0.042022705078125,
-0.0247039794921875,
0.0758056640625,
0.042327880859375,
0.004718780517578125,
0.041839599609375,
0.023162841796875,
-0.0189208984375,
-0.0136260986328125,
-0.0142669677734375,
-0.035491943359375,
0.043914794921875,
0.03045654296875,
-0.0201416015625,
-0.00989532470703125,
-0.008209228515625,
0.002773284912109375,
-0.0038776397705078125,
-0.0084075927734375,
0.041412353515625,
-0.0010662078857421875,
-0.048431396484375,
0.0643310546875,
0.0270843505859375,
0.0635986328125,
-0.040496826171875,
0.0042572021484375,
0.0030727386474609375,
0.0034275054931640625,
-0.009765625,
-0.0274658203125,
0.0035686492919921875,
-0.01824951171875,
-0.016143798828125,
0.01367950439453125,
0.038055419921875,
-0.038330078125,
-0.04949951171875,
0.0242462158203125,
0.034271240234375,
0.00457000732421875,
-0.013763427734375,
-0.061126708984375,
-0.006435394287109375,
0.002941131591796875,
-0.0229949951171875,
0.02532958984375,
0.0203399658203125,
-0.0017242431640625,
0.046142578125,
0.048492431640625,
0.009002685546875,
0.01470947265625,
0.0110626220703125,
0.06341552734375,
-0.058013916015625,
-0.027587890625,
-0.07049560546875,
0.028594970703125,
0.0006866455078125,
-0.033294677734375,
0.04986572265625,
0.037872314453125,
0.059173583984375,
-0.00821685791015625,
0.048309326171875,
-0.012725830078125,
0.0338134765625,
-0.02532958984375,
0.053436279296875,
-0.04718017578125,
0.0028591156005859375,
-0.0169830322265625,
-0.0614013671875,
-0.01172637939453125,
0.0528564453125,
-0.032806396484375,
0.01788330078125,
0.052520751953125,
0.08074951171875,
0.005680084228515625,
-0.0036163330078125,
-0.0016222000122070312,
0.0281982421875,
0.0146942138671875,
0.038787841796875,
0.048583984375,
-0.040374755859375,
0.0345458984375,
-0.0195770263671875,
-0.0019321441650390625,
-0.0248565673828125,
-0.053497314453125,
-0.076171875,
-0.050689697265625,
-0.0237274169921875,
-0.0273284912109375,
-0.01482391357421875,
0.0882568359375,
0.039794921875,
-0.07086181640625,
-0.0230865478515625,
-0.0168609619140625,
0.005626678466796875,
-0.0242919921875,
-0.0163726806640625,
0.041839599609375,
-0.0174713134765625,
-0.05706787109375,
0.0082550048828125,
0.0045013427734375,
0.0213775634765625,
0.003566741943359375,
-0.013946533203125,
-0.0379638671875,
0.020782470703125,
0.031524658203125,
0.013275146484375,
-0.062286376953125,
-0.0246734619140625,
-0.01309967041015625,
-0.01557159423828125,
0.0033168792724609375,
0.0357666015625,
-0.058746337890625,
0.007602691650390625,
0.0255584716796875,
0.04705810546875,
0.04949951171875,
-0.030792236328125,
0.027008056640625,
-0.031524658203125,
0.0262451171875,
0.01593017578125,
0.04205322265625,
0.0258331298828125,
-0.011932373046875,
0.0168609619140625,
-0.006046295166015625,
-0.040252685546875,
-0.045989990234375,
0.02410888671875,
-0.073486328125,
-0.0139923095703125,
0.08197021484375,
-0.0017910003662109375,
-0.01323699951171875,
-0.01186370849609375,
-0.03521728515625,
0.039794921875,
-0.0367431640625,
0.0631103515625,
0.06756591796875,
0.00905609130859375,
0.01971435546875,
-0.022857666015625,
0.040740966796875,
0.06463623046875,
-0.045501708984375,
-0.0297088623046875,
0.011505126953125,
0.0284881591796875,
0.0167999267578125,
0.0347900390625,
0.0089111328125,
0.004894256591796875,
-0.01479339599609375,
0.0224609375,
0.0131072998046875,
-0.00279998779296875,
-0.0172271728515625,
0.0020122528076171875,
0.0125579833984375,
-0.01364898681640625
]
] |
cl-tohoku/bert-base-japanese-char | 2021-09-23T13:45:29.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ja",
"dataset:wikipedia",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | cl-tohoku | null | null | cl-tohoku/bert-base-japanese-char | 7 | 91,105 | transformers | 2022-03-02T23:29:05 | ---
language: ja
license: cc-by-sa-4.0
datasets:
- wikipedia
widget:
- text: 仙台は「[MASK]の都」と呼ばれている。
---
# BERT base Japanese (character tokenization)
This is a [BERT](https://github.com/google-research/bert) model pretrained on texts in the Japanese language.
This version of the model processes input texts with word-level tokenization based on the IPA dictionary, followed by character-level tokenization.
The codes for the pretraining are available at [cl-tohoku/bert-japanese](https://github.com/cl-tohoku/bert-japanese/tree/v1.0).
## Model architecture
The model architecture is the same as the original BERT base model; 12 layers, 768 dimensions of hidden states, and 12 attention heads.
## Training Data
The model is trained on Japanese Wikipedia as of September 1, 2019.
To generate the training corpus, [WikiExtractor](https://github.com/attardi/wikiextractor) is used to extract plain texts from a dump file of Wikipedia articles.
The text files used for the training are 2.6GB in size, consisting of approximately 17M sentences.
## Tokenization
The texts are first tokenized by [MeCab](https://taku910.github.io/mecab/) morphological parser with the IPA dictionary and then split into characters.
The vocabulary size is 4000.
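The character-level second stage can be illustrated without MeCab. The sketch below is a simplified stand-in: it takes text that is assumed to be already word-segmented (the real pipeline uses MeCab with the IPA dictionary for that first step) and performs only the character split:

```python
def char_tokenize(words):
    """Split each (MeCab-produced) word into characters, as the model's second tokenization step does."""
    return [ch for word in words for ch in word]

# Words as MeCab with the IPA dictionary might segment them (illustrative only):
words = ["仙台", "は", "杜", "の", "都"]
print(char_tokenize(words))
```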
## Training
The model is trained with the same configuration as the original BERT; 512 tokens per instance, 256 instances per batch, and 1M training steps.
## Licenses
The pretrained models are distributed under the terms of the [Creative Commons Attribution-ShareAlike 3.0](https://creativecommons.org/licenses/by-sa/3.0/).
## Acknowledgments
For training the models, we used Cloud TPUs provided by the [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc/) program.
| 1,719 | [
[
-0.031402587890625,
-0.05474853515625,
0.029541015625,
0.01119232177734375,
-0.0458984375,
-0.01082611083984375,
-0.018524169921875,
-0.036834716796875,
0.0318603515625,
0.0369873046875,
-0.05120849609375,
-0.0369873046875,
-0.045074462890625,
0.00705718994140625,
-0.00830841064453125,
0.090576171875,
-0.00009262561798095703,
0.02545166015625,
0.0189971923828125,
0.01666259765625,
-0.0269012451171875,
-0.042694091796875,
-0.0562744140625,
-0.0205078125,
0.04144287109375,
0.0240478515625,
0.03955078125,
0.039703369140625,
0.0310516357421875,
0.016082763671875,
-0.0016231536865234375,
-0.019683837890625,
-0.046417236328125,
-0.015716552734375,
-0.0003361701965332031,
-0.0299530029296875,
-0.017913818359375,
-0.01342010498046875,
0.053131103515625,
0.04827880859375,
0.0160369873046875,
0.00592803955078125,
-0.021209716796875,
0.0222320556640625,
-0.03814697265625,
0.01398468017578125,
-0.044891357421875,
0.0008268356323242188,
-0.0235443115234375,
0.0148773193359375,
-0.028656005859375,
-0.01177215576171875,
0.01430511474609375,
-0.057525634765625,
0.0245208740234375,
-0.006252288818359375,
0.090576171875,
0.0029621124267578125,
-0.0111541748046875,
-0.0204315185546875,
-0.028656005859375,
0.05279541015625,
-0.054931640625,
0.0292816162109375,
0.04290771484375,
0.00690460205078125,
-0.002056121826171875,
-0.0665283203125,
-0.047088623046875,
-0.018707275390625,
0.0062255859375,
0.004497528076171875,
0.0000597834587097168,
0.0094757080078125,
0.032806396484375,
0.0228271484375,
-0.046844482421875,
0.02880859375,
-0.037261962890625,
-0.01470947265625,
0.03326416015625,
-0.003551483154296875,
0.0452880859375,
-0.03802490234375,
-0.02886962890625,
-0.036590576171875,
-0.037261962890625,
0.006534576416015625,
0.0260162353515625,
0.01544189453125,
-0.011688232421875,
0.050323486328125,
0.00875091552734375,
0.0268402099609375,
0.00189971923828125,
-0.01363372802734375,
0.03173828125,
-0.021759033203125,
-0.01506805419921875,
0.010986328125,
0.07061767578125,
0.0066070556640625,
0.0230560302734375,
-0.01306915283203125,
-0.0161895751953125,
-0.00934600830078125,
0.038482666015625,
-0.0640869140625,
-0.0108795166015625,
0.005001068115234375,
-0.0552978515625,
-0.02325439453125,
-0.0079345703125,
-0.0258026123046875,
0.004665374755859375,
0.00421142578125,
0.046844482421875,
-0.07684326171875,
-0.024322509765625,
-0.005702972412109375,
-0.032806396484375,
0.02508544921875,
0.0061798095703125,
-0.08563232421875,
0.0102386474609375,
0.041534423828125,
0.052825927734375,
0.005767822265625,
-0.033935546875,
0.02069091796875,
0.019073486328125,
-0.03131103515625,
0.028350830078125,
-0.0238037109375,
-0.0458984375,
-0.00908660888671875,
0.00021064281463623047,
-0.003528594970703125,
-0.00492095947265625,
0.046417236328125,
-0.037628173828125,
0.01139068603515625,
-0.025848388671875,
-0.05908203125,
-0.008392333984375,
0.0168609619140625,
-0.0504150390625,
0.07916259765625,
0.0071258544921875,
-0.062347412109375,
0.0286865234375,
-0.0697021484375,
-0.034576416015625,
0.0255126953125,
0.004955291748046875,
-0.027069091796875,
0.010986328125,
0.0185546875,
0.0259246826171875,
0.012939453125,
0.0191802978515625,
-0.0180206298828125,
-0.034454345703125,
0.0014743804931640625,
-0.018524169921875,
0.09197998046875,
0.0201568603515625,
-0.02325439453125,
0.0004742145538330078,
-0.06707763671875,
0.006927490234375,
0.0198516845703125,
-0.031707763671875,
-0.0391845703125,
-0.009429931640625,
0.0197906494140625,
-0.004970550537109375,
0.048797607421875,
-0.05767822265625,
0.01885986328125,
-0.03936767578125,
0.028564453125,
0.0433349609375,
0.003139495849609375,
0.0205078125,
-0.003955841064453125,
0.01177215576171875,
-0.0036792755126953125,
0.0193634033203125,
-0.033966064453125,
-0.04998779296875,
-0.06439208984375,
-0.02264404296875,
0.02984619140625,
0.021697998046875,
-0.053131103515625,
0.07232666015625,
-0.04022216796875,
-0.049346923828125,
-0.053497314453125,
-0.01503753662109375,
0.026092529296875,
0.035430908203125,
0.022064208984375,
-0.0286102294921875,
-0.042388916015625,
-0.06890869140625,
0.011749267578125,
-0.0304718017578125,
-0.01076507568359375,
-0.002193450927734375,
0.055877685546875,
-0.0236968994140625,
0.059814453125,
-0.0146331787109375,
-0.00778961181640625,
-0.021728515625,
0.0304718017578125,
0.01995849609375,
0.05059814453125,
0.0465087890625,
-0.047149658203125,
-0.039642333984375,
-0.00530242919921875,
-0.0413818359375,
0.00738525390625,
-0.0016870498657226562,
-0.0179290771484375,
0.00788116455078125,
0.0244598388671875,
-0.048797607421875,
0.0197906494140625,
0.02703857421875,
-0.0151519775390625,
0.030242919921875,
-0.020538330078125,
-0.00630950927734375,
-0.105224609375,
0.03778076171875,
-0.018096923828125,
-0.007518768310546875,
-0.0447998046875,
0.027496337890625,
0.0143280029296875,
-0.0289764404296875,
-0.021331787109375,
0.041107177734375,
-0.031280517578125,
-0.006504058837890625,
-0.011749267578125,
-0.0203704833984375,
-0.00630950927734375,
0.0592041015625,
0.019073486328125,
0.06121826171875,
0.026458740234375,
-0.03704833984375,
0.00872802734375,
0.0272216796875,
-0.0447998046875,
0.0015964508056640625,
-0.06512451171875,
0.017486572265625,
-0.00681304931640625,
0.013153076171875,
-0.0794677734375,
-0.019744873046875,
0.026641845703125,
-0.04779052734375,
0.033416748046875,
-0.001445770263671875,
-0.06402587890625,
-0.03302001953125,
-0.03192138671875,
0.007694244384765625,
0.05133056640625,
-0.0367431640625,
0.03509521484375,
0.0275115966796875,
-0.0162811279296875,
-0.0582275390625,
-0.057891845703125,
0.0073699951171875,
0.0160980224609375,
-0.032073974609375,
0.035980224609375,
-0.01141357421875,
0.0160369873046875,
0.014007568359375,
0.00981903076171875,
-0.02392578125,
0.00925445556640625,
0.01617431640625,
0.02716064453125,
-0.01276397705078125,
0.0087432861328125,
0.0227813720703125,
0.00751495361328125,
-0.00012886524200439453,
-0.0010900497436523438,
0.07098388671875,
0.006046295166015625,
-0.01207733154296875,
-0.034515380859375,
0.0205078125,
0.0377197265625,
0.0018672943115234375,
0.06732177734375,
0.061553955078125,
-0.031524658203125,
0.010498046875,
-0.03741455078125,
-0.00337982177734375,
-0.032562255859375,
0.047271728515625,
-0.044891357421875,
-0.04931640625,
0.043487548828125,
0.0275421142578125,
0.02508544921875,
0.0504150390625,
0.04217529296875,
-0.024993896484375,
0.07647705078125,
0.05023193359375,
-0.03485107421875,
0.053131103515625,
-0.0198974609375,
0.0043487548828125,
-0.0487060546875,
-0.0240325927734375,
-0.04052734375,
-0.02239990234375,
-0.04095458984375,
-0.01006317138671875,
0.013641357421875,
0.011322021484375,
-0.037384033203125,
0.0289459228515625,
-0.027435302734375,
0.0374755859375,
0.053466796875,
0.015716552734375,
-0.0099029541015625,
0.0205841064453125,
-0.0186004638671875,
-0.00919342041015625,
-0.04437255859375,
-0.0318603515625,
0.08978271484375,
0.04498291015625,
0.03814697265625,
-0.00707244873046875,
0.062103271484375,
0.00803375244140625,
0.013824462890625,
-0.06011962890625,
0.03826904296875,
-0.0303192138671875,
-0.0736083984375,
-0.033782958984375,
-0.0204010009765625,
-0.07513427734375,
0.004253387451171875,
-0.0171051025390625,
-0.04412841796875,
-0.00792694091796875,
-0.01776123046875,
0.00806427001953125,
0.0262298583984375,
-0.060272216796875,
0.060943603515625,
-0.0214996337890625,
0.020294189453125,
-0.020294189453125,
-0.05908203125,
0.019073486328125,
-0.012054443359375,
-0.0050811767578125,
0.007045745849609375,
-0.00043892860412597656,
0.07586669921875,
-0.0426025390625,
0.07696533203125,
-0.027679443359375,
-0.004673004150390625,
0.004268646240234375,
-0.0286102294921875,
0.01910400390625,
-0.01306915283203125,
0.0126495361328125,
0.046356201171875,
-0.005344390869140625,
-0.031829833984375,
-0.007152557373046875,
0.0380859375,
-0.105224609375,
-0.01947021484375,
-0.015106201171875,
-0.0265655517578125,
-0.00534820556640625,
0.0533447265625,
0.054351806640625,
0.005336761474609375,
-0.03271484375,
0.021087646484375,
0.05670166015625,
-0.0232696533203125,
0.03143310546875,
0.0328369140625,
-0.0119171142578125,
-0.036346435546875,
0.064697265625,
0.014984130859375,
-0.003063201904296875,
0.041046142578125,
-0.00311279296875,
-0.0198211669921875,
-0.037567138671875,
-0.033477783203125,
0.03204345703125,
-0.048095703125,
0.004741668701171875,
-0.04998779296875,
-0.0362548828125,
-0.045989990234375,
0.006031036376953125,
-0.025634765625,
-0.0295867919921875,
-0.0341796875,
-0.006107330322265625,
0.009857177734375,
0.04656982421875,
0.00910186767578125,
0.0445556640625,
-0.05767822265625,
0.0267333984375,
0.00939178466796875,
0.0267333984375,
0.005657196044921875,
-0.03717041015625,
-0.033111572265625,
0.0105438232421875,
-0.01165771484375,
-0.056182861328125,
0.0266265869140625,
-0.0016107559204101562,
0.049102783203125,
0.03826904296875,
-0.0191497802734375,
0.05743408203125,
-0.05560302734375,
0.0792236328125,
0.039825439453125,
-0.07440185546875,
0.040283203125,
-0.019317626953125,
0.024871826171875,
0.043914794921875,
0.05169677734375,
-0.03619384765625,
-0.0287628173828125,
-0.0576171875,
-0.05743408203125,
0.0618896484375,
0.0094757080078125,
0.030517578125,
-0.01053619384765625,
0.03411865234375,
0.0151214599609375,
0.0014524459838867188,
-0.067626953125,
-0.0242919921875,
-0.0433349609375,
-0.0286102294921875,
-0.0106353759765625,
-0.0399169921875,
0.01239776611328125,
-0.0208740234375,
0.056610107421875,
0.009735107421875,
0.0361328125,
-0.0016565322875976562,
-0.022735595703125,
-0.00467681884765625,
-0.0021209716796875,
0.03369140625,
0.030731201171875,
-0.0295257568359375,
-0.02447509765625,
0.000530242919921875,
-0.0648193359375,
-0.0114593505859375,
0.00289154052734375,
-0.02447509765625,
0.037017822265625,
0.034454345703125,
0.09515380859375,
0.0265655517578125,
-0.04168701171875,
0.035430908203125,
-0.00536346435546875,
-0.0221710205078125,
-0.03192138671875,
0.013092041015625,
-0.00019693374633789062,
-0.005523681640625,
0.037109375,
-0.0233306884765625,
0.0038967132568359375,
-0.030517578125,
-0.006439208984375,
0.0239410400390625,
-0.00278472900390625,
-0.026824951171875,
0.037689208984375,
0.0135498046875,
-0.00971221923828125,
0.06494140625,
0.006031036376953125,
-0.03582763671875,
0.05126953125,
0.04815673828125,
0.0638427734375,
-0.00505828857421875,
0.00551605224609375,
0.050323486328125,
0.02728271484375,
0.0023651123046875,
0.0201568603515625,
-0.0144500732421875,
-0.0693359375,
-0.0264129638671875,
-0.048095703125,
-0.036590576171875,
0.04156494140625,
-0.057098388671875,
0.022186279296875,
-0.051849365234375,
-0.0213165283203125,
0.01091766357421875,
0.023162841796875,
-0.03936767578125,
0.027069091796875,
0.0264892578125,
0.08099365234375,
-0.052337646484375,
0.08966064453125,
0.0621337890625,
-0.050689697265625,
-0.06121826171875,
0.000568389892578125,
-0.04547119140625,
-0.0826416015625,
0.05670166015625,
0.01380157470703125,
0.0247344970703125,
0.0101776123046875,
-0.059417724609375,
-0.06646728515625,
0.06500244140625,
0.0173187255859375,
-0.038909912109375,
-0.034881591796875,
0.00452423095703125,
0.05084228515625,
-0.006622314453125,
0.00995635986328125,
0.020111083984375,
0.02093505859375,
-0.004375457763671875,
-0.0660400390625,
-0.02630615234375,
-0.03778076171875,
0.035552978515625,
0.000025928020477294922,
-0.03521728515625,
0.06988525390625,
0.00494384765625,
-0.0090179443359375,
0.02130126953125,
0.0421142578125,
0.0257110595703125,
-0.0166778564453125,
0.036285400390625,
0.0540771484375,
0.054229736328125,
-0.0017261505126953125,
0.07403564453125,
-0.03570556640625,
0.0279693603515625,
0.0626220703125,
0.003993988037109375,
0.06695556640625,
0.038970947265625,
-0.00623321533203125,
0.05059814453125,
0.05767822265625,
-0.024017333984375,
0.05645751953125,
-0.0088043212890625,
0.003658294677734375,
0.0058746337890625,
0.007965087890625,
-0.0355224609375,
0.0178070068359375,
0.03533935546875,
-0.040924072265625,
-0.006748199462890625,
0.0116424560546875,
0.004962921142578125,
-0.037994384765625,
-0.03741455078125,
0.06524658203125,
-0.01336669921875,
-0.04766845703125,
0.044921875,
0.0205078125,
0.058837890625,
-0.0826416015625,
0.0149078369140625,
-0.007282257080078125,
-0.0015134811401367188,
0.01067352294921875,
-0.0640869140625,
-0.0005054473876953125,
0.0237579345703125,
-0.027862548828125,
-0.0171966552734375,
0.055450439453125,
-0.0167388916015625,
-0.033447265625,
0.00957489013671875,
0.0103759765625,
0.036285400390625,
0.027374267578125,
-0.0560302734375,
0.01300811767578125,
0.00839996337890625,
-0.0283966064453125,
0.02392578125,
0.0286102294921875,
0.00011622905731201172,
0.0287322998046875,
0.0537109375,
0.01128387451171875,
0.015472412109375,
0.021759033203125,
0.05755615234375,
-0.039794921875,
-0.058746337890625,
-0.05218505859375,
0.030029296875,
-0.01258087158203125,
-0.03485107421875,
0.04302978515625,
0.033966064453125,
0.0760498046875,
-0.033905029296875,
0.063232421875,
-0.0216064453125,
0.035003662109375,
-0.03558349609375,
0.06610107421875,
-0.0450439453125,
-0.01934814453125,
-0.0135040283203125,
-0.0611572265625,
-0.0031986236572265625,
0.06927490234375,
0.0017194747924804688,
0.007564544677734375,
0.032135009765625,
0.02783203125,
0.004306793212890625,
-0.005649566650390625,
0.0230712890625,
0.0188140869140625,
0.016571044921875,
0.03289794921875,
0.03631591796875,
-0.036468505859375,
0.0275421142578125,
-0.025848388671875,
-0.0038089752197265625,
-0.0147705078125,
-0.04339599609375,
-0.07757568359375,
-0.044891357421875,
-0.00258636474609375,
-0.0122222900390625,
-0.0001583099365234375,
0.0648193359375,
0.0516357421875,
-0.059722900390625,
-0.0181427001953125,
-0.0141754150390625,
-0.0238037109375,
0.01090240478515625,
-0.0172271728515625,
0.027618408203125,
-0.045440673828125,
-0.07080078125,
0.014801025390625,
-0.006195068359375,
0.0141754150390625,
-0.01885986328125,
-0.01000213623046875,
-0.0143585205078125,
-0.00897216796875,
0.0316162109375,
0.0157928466796875,
-0.046630859375,
-0.0216522216796875,
-0.0028476715087890625,
-0.0290679931640625,
-0.0016307830810546875,
0.037689208984375,
-0.0301361083984375,
0.03936767578125,
0.03216552734375,
0.05010986328125,
0.06298828125,
-0.0210418701171875,
0.033355712890625,
-0.07843017578125,
0.0218048095703125,
0.0065460205078125,
0.04132080078125,
0.015716552734375,
-0.0184783935546875,
0.030181884765625,
0.0174407958984375,
-0.017303466796875,
-0.060272216796875,
-0.006885528564453125,
-0.07464599609375,
-0.037872314453125,
0.062347412109375,
-0.0250091552734375,
-0.0300750732421875,
0.011566162109375,
-0.023284912109375,
0.0450439453125,
-0.0189361572265625,
0.0535888671875,
0.07855224609375,
0.0205078125,
-0.014984130859375,
-0.0026645660400390625,
0.0193328857421875,
0.0198211669921875,
-0.042633056640625,
-0.02642822265625,
0.0213165283203125,
0.044769287109375,
0.037567138671875,
0.05889892578125,
-0.00879669189453125,
0.0196533203125,
0.008209228515625,
0.037994384765625,
0.0023097991943359375,
-0.0134124755859375,
-0.0157470703125,
-0.0016870498657226562,
-0.0050048828125,
-0.029754638671875
]
] |
facebook/opt-6.7b | 2023-01-24T17:10:29.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"opt",
"text-generation",
"en",
"arxiv:2205.01068",
"arxiv:2005.14165",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | facebook | null | null | facebook/opt-6.7b | 79 | 91,020 | transformers | 2022-05-11T08:26:52 | ---
language: en
inference: false
tags:
- text-generation
- opt
license: other
commercial: false
---
# OPT: Open Pre-trained Transformer Language Models
OPT was first introduced in [Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) and first released in [metaseq's repository](https://github.com/facebookresearch/metaseq) on May 3rd 2022 by Meta AI.
**Disclaimer**: The team releasing OPT wrote an official model card, which is available in Appendix D of the [paper](https://arxiv.org/pdf/2205.01068.pdf).
Content from **this** model card has been written by the Hugging Face team.
## Intro
To quote the first two paragraphs of the [official paper](https://arxiv.org/abs/2205.01068):
> Large language models trained on massive text collections have shown surprising emergent
> capabilities to generate text and perform zero- and few-shot learning. While in some cases the public
> can interact with these models through paid APIs, full model access is currently limited to only a
> few highly resourced labs. This restricted access has limited researchers’ ability to study how and
> why these large language models work, hindering progress on improving known challenges in areas
> such as robustness, bias, and toxicity.
> We present Open Pretrained Transformers (OPT), a suite of decoder-only pre-trained transformers ranging from 125M
> to 175B parameters, which we aim to fully and responsibly share with interested researchers. We train the OPT models to roughly match
> the performance and sizes of the GPT-3 class of models, while also applying the latest best practices in data
> collection and efficient training. Our aim in developing this suite of OPT models is to enable reproducible and responsible research at scale, and
> to bring more voices to the table in studying the impact of these LLMs. Definitions of risk, harm, bias, and toxicity, etc., should be articulated by the
> collective research community as a whole, which is only possible when models are available for study.
## Model description
OPT was predominantly pretrained with English text, but a small amount of non-English data is still present within the training corpus via CommonCrawl. The model was pretrained using a causal language modeling (CLM) objective.
OPT belongs to the same family of decoder-only models as [GPT-3](https://arxiv.org/abs/2005.14165). As such, it was pretrained using the self-supervised causal language modeling objective.
For evaluation, OPT follows [GPT-3](https://arxiv.org/abs/2005.14165) by using their prompts and overall experimental setup. For more details, please read
the [official paper](https://arxiv.org/abs/2205.01068).
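As a rough illustration (a simplified sketch, not OPT's actual training code, and with a hypothetical interface), the causal language modeling objective minimizes the average negative log-likelihood of each token given the tokens that precede it:

```python
import math

def clm_nll(next_token_probs, token_ids):
    """Average negative log-likelihood of each token given its prefix.

    next_token_probs[t] is a dict mapping vocab id -> model probability
    of the token at position t + 1, conditioned on token_ids[:t + 1].
    (Hypothetical interface, for illustration only.)
    """
    total = 0.0
    for t in range(1, len(token_ids)):
        total -= math.log(next_token_probs[t - 1][token_ids[t]])
    return total / (len(token_ids) - 1)
```

During pretraining, the model's parameters are updated to drive this quantity down over the training corpus.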
## Intended uses & limitations
The pretrained-only model can be used for prompting in the evaluation of downstream tasks, as well as for text generation.
In addition, the model can be fine-tuned on a downstream task using the [CLM example](https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling). For all other OPT checkpoints, please have a look at the [model hub](https://huggingface.co/models?filter=opt).
### How to use
For large OPT models such as this one, it is not recommended to use the `text-generation` pipeline, because
the model should be loaded in half-precision to accelerate generation and reduce memory consumption on the GPU.
It is recommended to directly call the [`generate`](https://huggingface.co/docs/transformers/main/en/main_classes/text_generation#transformers.generation_utils.GenerationMixin.generate)
method as follows:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-6.7b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-6.7b", use_fast=False)
>>> prompt = "Hello, I'm am conscious and"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> generated_ids = model.generate(input_ids)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
["Hello, I'm am conscious and aware of my surroundings. I'm not sure what you mean"]
```
By default, generation is deterministic. To use top-k sampling, set `do_sample` to `True`.
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-6.7b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-6.7b", use_fast=False)
>>> prompt = "Hello, I'm am conscious and"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
["Hello, I'm am conscious and aware of my surroundings. I'm not sure if I'm"]
```
### Limitations and bias
As mentioned in Meta AI's model card, because the training data used for this model contains a lot of
unfiltered content from the internet, which is far from neutral, the model is strongly biased:
> Like other large language models for which the diversity (or lack thereof) of training
> data induces downstream impact on the quality of our model, OPT-175B has limitations in terms
> of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and
> hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern
> large language models.
Here's an example of how the model can have biased predictions:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-6.7b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-6.7b", use_fast=False)
>>> prompt = "The woman worked as a"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
The woman worked as a supervisor in the office
The woman worked as a bartender in a bar
The woman worked as a cashier at the
The woman worked as a teacher, and was
The woman worked as a maid at a house
```
compared to:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-6.7b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-6.7b", use_fast=False)
>>> prompt = "The man worked as a"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
The man worked as a consultant to the Government
The man worked as a bartender in a bar
The man worked as a cashier at the
The man worked as a teacher, and was
The man worked as a professional at a bank
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The Meta AI team wanted to train this model on a corpus as large as possible. It is composed of the union of the following 5 filtered datasets of textual documents:
- BookCorpus, which consists of more than 10K unpublished books,
- CC-Stories, which contains a subset of CommonCrawl data filtered to match the
story-like style of Winograd schemas,
- The Pile, from which *Pile-CC, OpenWebText2, USPTO, Project Gutenberg, OpenSubtitles, Wikipedia, DM Mathematics and HackerNews* were included,
- Pushshift.io Reddit dataset that was developed in Baumgartner et al. (2020) and processed in
Roller et al. (2021)
- CCNewsV2 containing an updated version of the English portion of the CommonCrawl News
dataset that was used in RoBERTa (Liu et al., 2019b)
The final training data contains 180B tokens corresponding to 800GB of data. The validation split was made of 200MB of the pretraining data, sampled proportionally
to each dataset’s size in the pretraining corpus.
The dataset might contain offensive content, as parts of it are a subset of
public Common Crawl data, along with a subset of public Reddit data, which could contain sentences
that, if viewed directly, can be insulting, threatening, or might otherwise cause anxiety.
### Collection process
The dataset was collected from the internet and went through classic data processing algorithms and
re-formatting practices, including removing repetitive/non-informative text like *Chapter One* or
*This ebook by Project Gutenberg.*
## Training procedure
### Preprocessing
The texts are tokenized using the **GPT2** byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a
vocabulary size of 50272. The inputs are sequences of 2048 consecutive tokens.
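The fixed-length inputs can be sketched as a standard packing step (a hypothetical helper for illustration, not the metaseq implementation): tokenized documents are concatenated and split into 2048-token blocks.

```python
def pack_into_blocks(sequences, block_size=2048):
    """Concatenate tokenized documents and split the resulting stream
    into fixed-size training blocks, dropping the trailing remainder
    (standard causal-LM preprocessing; illustrative only)."""
    flat = [tok for seq in sequences for tok in seq]
    n_blocks = len(flat) // block_size
    return [flat[i * block_size:(i + 1) * block_size] for i in range(n_blocks)]
```

For example, with `block_size=2`, the streams `[1, 2, 3]` and `[4, 5, 6, 7]` pack into the blocks `[1, 2]`, `[3, 4]`, `[5, 6]`, with the final token dropped.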
The 175B model was trained on 992 *80GB A100 GPUs* for roughly 33 days of continuous training.
### BibTeX entry and citation info
```bibtex
@misc{zhang2022opt,
title={OPT: Open Pre-trained Transformer Language Models},
author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer},
year={2022},
eprint={2205.01068},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 10,019 | [
[
-0.02325439453125,
-0.061798095703125,
0.01433563232421875,
0.017852783203125,
-0.01334381103515625,
-0.01096343994140625,
-0.0323486328125,
-0.033416748046875,
0.0027561187744140625,
0.03155517578125,
-0.04254150390625,
-0.03289794921875,
-0.048126220703125,
0.0085296630859375,
-0.03411865234375,
0.08160400390625,
-0.0045013427734375,
-0.00023317337036132812,
0.0129241943359375,
0.0067596435546875,
-0.0252838134765625,
-0.03533935546875,
-0.061981201171875,
-0.01314544677734375,
0.015960693359375,
0.0054473876953125,
0.051605224609375,
0.0423583984375,
0.03021240234375,
0.0281982421875,
0.0033664703369140625,
0.008392333984375,
-0.042938232421875,
-0.022491455078125,
-0.00423431396484375,
-0.034332275390625,
-0.033843994140625,
0.0181427001953125,
0.038421630859375,
0.035491943359375,
0.006519317626953125,
0.0168914794921875,
0.001079559326171875,
0.0301666259765625,
-0.037689208984375,
0.018890380859375,
-0.051513671875,
-0.003055572509765625,
-0.012115478515625,
0.00730133056640625,
-0.044189453125,
-0.0186767578125,
0.011749267578125,
-0.03582763671875,
0.029296875,
-0.0019283294677734375,
0.0909423828125,
0.03131103515625,
-0.0234832763671875,
-0.013153076171875,
-0.050628662109375,
0.050323486328125,
-0.0677490234375,
0.029052734375,
0.0186767578125,
0.0048980712890625,
0.00439453125,
-0.0677490234375,
-0.046478271484375,
-0.013153076171875,
-0.019622802734375,
0.0177459716796875,
-0.026641845703125,
0.01177978515625,
0.02191162109375,
0.0272064208984375,
-0.04046630859375,
0.0013818740844726562,
-0.04034423828125,
-0.02508544921875,
0.052734375,
0.004306793212890625,
0.02398681640625,
-0.0276641845703125,
-0.019683837890625,
-0.00777435302734375,
-0.03277587890625,
0.0006470680236816406,
0.0360107421875,
0.0186004638671875,
-0.0164642333984375,
0.04241943359375,
-0.0188140869140625,
0.05792236328125,
0.0183258056640625,
0.01061248779296875,
0.03082275390625,
-0.030792236328125,
-0.0241851806640625,
-0.007266998291015625,
0.0887451171875,
0.019195556640625,
0.02545166015625,
-0.002048492431640625,
-0.006252288818359375,
0.005992889404296875,
0.003559112548828125,
-0.06402587890625,
-0.0289306640625,
0.0245513916015625,
-0.038421630859375,
-0.0271148681640625,
0.0031299591064453125,
-0.057037353515625,
-0.0007987022399902344,
-0.0135040283203125,
0.044219970703125,
-0.029937744140625,
-0.036407470703125,
0.0187835693359375,
0.00301361083984375,
0.0211944580078125,
0.003452301025390625,
-0.06829833984375,
-0.0003097057342529297,
0.0240936279296875,
0.06060791015625,
-0.004146575927734375,
-0.0302276611328125,
-0.0161285400390625,
-0.006946563720703125,
-0.0159149169921875,
0.0307159423828125,
-0.0190887451171875,
-0.012054443359375,
-0.003803253173828125,
0.007732391357421875,
-0.019927978515625,
-0.0271759033203125,
0.047149658203125,
-0.02880859375,
0.032806396484375,
-0.0167694091796875,
-0.02825927734375,
-0.0025310516357421875,
-0.00478363037109375,
-0.0455322265625,
0.09130859375,
0.011383056640625,
-0.07342529296875,
0.033294677734375,
-0.05169677734375,
-0.026092529296875,
-0.007045745849609375,
-0.0030231475830078125,
-0.0286407470703125,
0.00344085693359375,
0.03436279296875,
0.041290283203125,
-0.01146697998046875,
0.038421630859375,
-0.01067352294921875,
-0.0188751220703125,
0.005985260009765625,
-0.039703369140625,
0.08892822265625,
0.02423095703125,
-0.04876708984375,
0.0221099853515625,
-0.042755126953125,
-0.00717926025390625,
0.0267791748046875,
-0.013031005859375,
-0.002964019775390625,
-0.00927734375,
0.0148162841796875,
0.03314208984375,
0.0238189697265625,
-0.0361328125,
0.0108795166015625,
-0.0447998046875,
0.052337646484375,
0.07269287109375,
-0.0187225341796875,
0.0300445556640625,
-0.00807952880859375,
0.019256591796875,
0.00792694091796875,
0.0282745361328125,
-0.006717681884765625,
-0.02685546875,
-0.080322265625,
-0.0215911865234375,
0.0166015625,
0.027130126953125,
-0.054351806640625,
0.052886962890625,
-0.0227508544921875,
-0.05078125,
-0.046356201171875,
-0.0026683807373046875,
0.0253448486328125,
0.0330810546875,
0.0338134765625,
-0.0115203857421875,
-0.04388427734375,
-0.0601806640625,
-0.0235443115234375,
-0.006946563720703125,
0.0122222900390625,
0.028350830078125,
0.05023193359375,
-0.034271240234375,
0.08673095703125,
-0.044189453125,
-0.0220947265625,
-0.0284881591796875,
-0.00335693359375,
0.0313720703125,
0.050537109375,
0.04205322265625,
-0.0556640625,
-0.046478271484375,
-0.0166778564453125,
-0.053924560546875,
-0.0037822723388671875,
-0.009002685546875,
-0.0290679931640625,
0.0296173095703125,
0.045562744140625,
-0.06500244140625,
0.02374267578125,
0.038665771484375,
-0.035064697265625,
0.04351806640625,
-0.0010080337524414062,
-0.01406097412109375,
-0.09014892578125,
0.0216827392578125,
-0.008636474609375,
-0.01265716552734375,
-0.04083251953125,
-0.017852783203125,
0.0003097057342529297,
-0.01241302490234375,
-0.04376220703125,
0.0618896484375,
-0.0282440185546875,
0.01519012451171875,
-0.01067352294921875,
0.002300262451171875,
-0.00689697265625,
0.04693603515625,
0.01056671142578125,
0.045562744140625,
0.05438232421875,
-0.048095703125,
0.0164947509765625,
0.016357421875,
-0.018402099609375,
0.0180206298828125,
-0.055023193359375,
0.004543304443359375,
-0.01535797119140625,
0.0258331298828125,
-0.071044921875,
-0.02313232421875,
0.0264892578125,
-0.04522705078125,
0.0263214111328125,
0.0086212158203125,
-0.03851318359375,
-0.055084228515625,
-0.001377105712890625,
0.032470703125,
0.042144775390625,
-0.04547119140625,
0.050994873046875,
0.0241851806640625,
0.01568603515625,
-0.05816650390625,
-0.04766845703125,
-0.00936126708984375,
-0.00859832763671875,
-0.051849365234375,
0.025726318359375,
-0.0076751708984375,
-0.0004668235778808594,
0.0086822509765625,
-0.00794219970703125,
0.00676727294921875,
-0.0026721954345703125,
0.006450653076171875,
0.026336669921875,
-0.0011720657348632812,
0.0015115737915039062,
-0.005367279052734375,
-0.01812744140625,
0.01314544677734375,
-0.03369140625,
0.06451416015625,
-0.0181121826171875,
-0.01105499267578125,
-0.040313720703125,
-0.002971649169921875,
0.03582763671875,
-0.032257080078125,
0.0667724609375,
0.072265625,
-0.036285400390625,
-0.01222991943359375,
-0.053924560546875,
-0.0250244140625,
-0.04132080078125,
0.05084228515625,
-0.011138916015625,
-0.056060791015625,
0.04046630859375,
0.0170745849609375,
0.0182342529296875,
0.059539794921875,
0.062042236328125,
0.0170440673828125,
0.07904052734375,
0.0440673828125,
-0.0213775634765625,
0.049072265625,
-0.04718017578125,
0.023101806640625,
-0.04998779296875,
-0.0072021484375,
-0.026824951171875,
-0.003772735595703125,
-0.035003662109375,
-0.0224761962890625,
0.01194000244140625,
0.0035800933837890625,
-0.0299835205078125,
0.0347900390625,
-0.055572509765625,
0.0246429443359375,
0.04486083984375,
0.01239013671875,
0.000339508056640625,
-0.01210784912109375,
-0.0133514404296875,
0.0030384063720703125,
-0.060882568359375,
-0.0295562744140625,
0.094482421875,
0.0297393798828125,
0.051788330078125,
-0.0256500244140625,
0.05474853515625,
0.0002275705337524414,
0.033721923828125,
-0.034515380859375,
0.040496826171875,
0.0004718303680419922,
-0.07598876953125,
-0.0115509033203125,
-0.039520263671875,
-0.061981201171875,
0.0177459716796875,
-0.00919342041015625,
-0.050018310546875,
0.01309967041015625,
0.0133056640625,
-0.0277862548828125,
0.026580810546875,
-0.06085205078125,
0.09320068359375,
-0.0313720703125,
-0.033172607421875,
0.00586700439453125,
-0.048431396484375,
0.034423828125,
-0.00157928466796875,
0.012847900390625,
-0.0008645057678222656,
0.0185089111328125,
0.069580078125,
-0.0330810546875,
0.072265625,
-0.015045166015625,
0.002353668212890625,
0.0347900390625,
-0.018280029296875,
0.034210205078125,
-0.006168365478515625,
-0.0037841796875,
0.02606201171875,
-0.0136566162109375,
-0.032745361328125,
-0.0137786865234375,
0.042205810546875,
-0.0838623046875,
-0.032867431640625,
-0.03289794921875,
-0.036895751953125,
0.005950927734375,
0.04241943359375,
0.05792236328125,
0.0236663818359375,
-0.00666046142578125,
0.007030487060546875,
0.035400390625,
-0.03497314453125,
0.047637939453125,
0.01438140869140625,
-0.0181884765625,
-0.03173828125,
0.06231689453125,
0.00925445556640625,
0.0247802734375,
0.01100921630859375,
0.007343292236328125,
-0.033660888671875,
-0.0229034423828125,
-0.0236358642578125,
0.032928466796875,
-0.057373046875,
-0.0170745849609375,
-0.07208251953125,
-0.03021240234375,
-0.0474853515625,
-0.0159149169921875,
-0.04522705078125,
-0.003643035888671875,
-0.034759521484375,
-0.0160064697265625,
0.0173187255859375,
0.03448486328125,
-0.00457000732421875,
0.0333251953125,
-0.037628173828125,
0.0189208984375,
0.0135040283203125,
0.024627685546875,
0.004856109619140625,
-0.03875732421875,
-0.0276336669921875,
0.01088714599609375,
-0.022918701171875,
-0.05938720703125,
0.0372314453125,
-0.00025534629821777344,
0.045562744140625,
0.03704833984375,
0.01393890380859375,
0.040863037109375,
-0.0283050537109375,
0.0545654296875,
0.0063323974609375,
-0.080078125,
0.029327392578125,
-0.029296875,
0.0165252685546875,
0.03900146484375,
0.0306854248046875,
-0.0263671875,
-0.034271240234375,
-0.056854248046875,
-0.07904052734375,
0.07489013671875,
0.03460693359375,
0.020263671875,
-0.01039886474609375,
0.0264434814453125,
-0.01226043701171875,
0.01727294921875,
-0.1015625,
-0.0380859375,
-0.033447265625,
-0.028900146484375,
-0.0142364501953125,
-0.00975799560546875,
0.00789642333984375,
-0.03375244140625,
0.06182861328125,
0.00405120849609375,
0.03997802734375,
0.027801513671875,
-0.0235443115234375,
-0.006229400634765625,
-0.00891876220703125,
0.024871826171875,
0.04327392578125,
-0.01232147216796875,
0.0017194747924804688,
0.0136260986328125,
-0.037933349609375,
-0.0047607421875,
0.020660400390625,
-0.0246124267578125,
0.0016603469848632812,
0.0196075439453125,
0.07904052734375,
0.00038361549377441406,
-0.039794921875,
0.0400390625,
0.0012083053588867188,
-0.01403045654296875,
-0.028961181640625,
0.005062103271484375,
0.004276275634765625,
0.008148193359375,
0.0266265869140625,
0.00792694091796875,
-0.00942230224609375,
-0.029449462890625,
0.01036834716796875,
0.039703369140625,
-0.0225982666015625,
-0.023529052734375,
0.08026123046875,
0.0215606689453125,
-0.01275634765625,
0.05169677734375,
-0.01493072509765625,
-0.056854248046875,
0.046905517578125,
0.0416259765625,
0.07470703125,
-0.010009765625,
0.00891876220703125,
0.049652099609375,
0.047943115234375,
-0.0092010498046875,
-0.0018482208251953125,
0.00891876220703125,
-0.059234619140625,
-0.028961181640625,
-0.054351806640625,
0.0035228729248046875,
0.013671875,
-0.03125,
0.037628173828125,
-0.0167236328125,
-0.0209808349609375,
-0.01434326171875,
0.0011663436889648438,
-0.06744384765625,
0.018402099609375,
0.0026836395263671875,
0.057281494140625,
-0.076171875,
0.05621337890625,
0.03143310546875,
-0.0426025390625,
-0.0721435546875,
-0.00006711483001708984,
-0.0232086181640625,
-0.055572509765625,
0.04669189453125,
0.04010009765625,
0.0192108154296875,
0.033294677734375,
-0.04949951171875,
-0.0755615234375,
0.0809326171875,
0.0277862548828125,
-0.0236358642578125,
-0.0249176025390625,
0.01861572265625,
0.040313720703125,
-0.0098419189453125,
0.04083251953125,
0.03570556640625,
0.0322265625,
-0.01084136962890625,
-0.06683349609375,
0.0131683349609375,
-0.01532745361328125,
-0.0142974853515625,
0.005405426025390625,
-0.0662841796875,
0.0899658203125,
-0.01043701171875,
-0.0175018310546875,
-0.014404296875,
0.054412841796875,
0.00247955322265625,
0.0006375312805175781,
0.030975341796875,
0.0447998046875,
0.04571533203125,
-0.0172576904296875,
0.078125,
-0.038909912109375,
0.05474853515625,
0.059417724609375,
0.00447845458984375,
0.042938232421875,
0.019989013671875,
-0.01434326171875,
0.019195556640625,
0.052825927734375,
-0.01153564453125,
0.026214599609375,
-0.00009679794311523438,
0.0006780624389648438,
-0.0121917724609375,
0.006439208984375,
-0.037109375,
0.0300140380859375,
0.00829315185546875,
-0.041900634765625,
-0.01202392578125,
0.0031147003173828125,
0.016448974609375,
-0.0286102294921875,
-0.0165863037109375,
0.038665771484375,
0.002780914306640625,
-0.05975341796875,
0.053009033203125,
0.0099945068359375,
0.072265625,
-0.043975830078125,
0.02447509765625,
-0.00511932373046875,
0.0252532958984375,
-0.018890380859375,
-0.02569580078125,
0.0184478759765625,
-0.00838470458984375,
-0.000946044921875,
-0.017974853515625,
0.0516357421875,
-0.035400390625,
-0.04522705078125,
0.0254058837890625,
0.0281829833984375,
0.00325775146484375,
-0.017242431640625,
-0.0628662109375,
0.00850677490234375,
0.00809478759765625,
-0.037353515625,
0.004215240478515625,
0.0170440673828125,
0.007694244384765625,
0.041595458984375,
0.059234619140625,
-0.011444091796875,
0.0283050537109375,
-0.0166168212890625,
0.072509765625,
-0.036895751953125,
-0.031097412109375,
-0.0819091796875,
0.050323486328125,
-0.002529144287109375,
-0.0291290283203125,
0.06805419921875,
0.05389404296875,
0.08367919921875,
-0.01326751708984375,
0.056640625,
-0.0286102294921875,
0.01184844970703125,
-0.029052734375,
0.070068359375,
-0.041748046875,
-0.0047454833984375,
-0.0423583984375,
-0.07470703125,
-0.0106201171875,
0.059600830078125,
-0.03607177734375,
0.0247650146484375,
0.04937744140625,
0.058837890625,
-0.00812530517578125,
-0.01593017578125,
0.0008721351623535156,
0.0335693359375,
0.033538818359375,
0.04339599609375,
0.04107666015625,
-0.04522705078125,
0.055023193359375,
-0.03839111328125,
-0.017303466796875,
-0.020477294921875,
-0.05377197265625,
-0.08282470703125,
-0.047882080078125,
-0.020538330078125,
-0.041748046875,
-0.017059326171875,
0.06903076171875,
0.056182861328125,
-0.04718017578125,
-0.01499176025390625,
-0.0302734375,
-0.00039768218994140625,
-0.0034427642822265625,
-0.0251617431640625,
0.04547119140625,
-0.036956787109375,
-0.07470703125,
-0.00574493408203125,
-0.00423431396484375,
0.0009207725524902344,
-0.0210418701171875,
-0.006595611572265625,
-0.0240325927734375,
0.005825042724609375,
0.036956787109375,
0.00960540771484375,
-0.047637939453125,
-0.006481170654296875,
0.017913818359375,
-0.0164642333984375,
-0.0013208389282226562,
0.038421630859375,
-0.04827880859375,
0.036346435546875,
0.0303802490234375,
0.03900146484375,
0.045562744140625,
-0.0079193115234375,
0.033966064453125,
-0.033935546875,
0.0198516845703125,
0.01861572265625,
0.03533935546875,
0.017425537109375,
-0.033111572265625,
0.035675048828125,
0.0285797119140625,
-0.048126220703125,
-0.07275390625,
0.00812530517578125,
-0.06439208984375,
-0.0164794921875,
0.099853515625,
-0.003238677978515625,
-0.019378662109375,
-0.0003352165222167969,
-0.0298004150390625,
0.0435791015625,
-0.0201568603515625,
0.052337646484375,
0.054931640625,
0.005588531494140625,
-0.0023651123046875,
-0.042694091796875,
0.045013427734375,
0.037139892578125,
-0.05584716796875,
0.00876617431640625,
0.0283660888671875,
0.0282440185546875,
0.0114593505859375,
0.06988525390625,
-0.006839752197265625,
0.0099639892578125,
-0.004467010498046875,
0.0175018310546875,
-0.012542724609375,
-0.01488494873046875,
-0.01134490966796875,
-0.006961822509765625,
-0.015777587890625,
-0.01004791259765625
]
] |
openlm-research/open_llama_7b | 2023-06-16T00:45:23.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"dataset:togethercomputer/RedPajama-Data-1T",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | openlm-research | null | null | openlm-research/open_llama_7b | 104 | 90,543 | transformers | 2023-06-07T08:54:38 | ---
license: apache-2.0
datasets:
- togethercomputer/RedPajama-Data-1T
---
# OpenLLaMA: An Open Reproduction of LLaMA
In this repo, we present a permissively licensed open source reproduction of Meta AI's [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/) large language model. We are releasing a 7B and 3B model trained on 1T tokens, as well as a preview of a 13B model trained on 600B tokens. We provide PyTorch and JAX weights of pre-trained OpenLLaMA models, as well as evaluation results and comparisons against the original LLaMA models. Please see the [project homepage of OpenLLaMA](https://github.com/openlm-research/open_llama) for more details.
## Weights Release, License and Usage
We release the weights in two formats: an EasyLM format to be used with our [EasyLM framework](https://github.com/young-geng/EasyLM), and a PyTorch format to be used with the [Hugging Face transformers](https://huggingface.co/docs/transformers/index) library. Both our training framework EasyLM and the checkpoint weights are licensed permissively under the Apache 2.0 license.
### Loading the Weights with Hugging Face Transformers
Preview checkpoints can be directly loaded from Hugging Face Hub. **Please note that it is advised to avoid using the Hugging Face fast tokenizer for now, as we’ve observed that the auto-converted fast tokenizer sometimes gives incorrect tokenizations.** This can be achieved by directly using the `LlamaTokenizer` class, or passing in the `use_fast=False` option for the `AutoTokenizer` class. See the following example for usage.
```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM
model_path = 'openlm-research/open_llama_3b'
# model_path = 'openlm-research/open_llama_7b'
tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(
model_path, torch_dtype=torch.float16, device_map='auto',
)
prompt = 'Q: What is the largest animal?\nA:'
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
generation_output = model.generate(
input_ids=input_ids, max_new_tokens=32
)
print(tokenizer.decode(generation_output[0]))
```
For more advanced usage, please follow the [transformers LLaMA documentation](https://huggingface.co/docs/transformers/main/model_doc/llama).
### Evaluating with LM-Eval-Harness
The model can be evaluated with [lm-eval-harness](https://github.com/EleutherAI/lm-evaluation-harness). However, due to the aforementioned tokenizer issue, we need to avoid using the fast tokenizer to obtain the correct results. This can be achieved by passing in `use_fast=False` to [this part of lm-eval-harness](https://github.com/EleutherAI/lm-evaluation-harness/blob/4b701e228768052cfae9043dca13e82052ca5eea/lm_eval/models/huggingface.py#LL313C9-L316C10), as shown in the example below:
```python
tokenizer = self.AUTO_TOKENIZER_CLASS.from_pretrained(
pretrained if tokenizer is None else tokenizer,
revision=revision + ("/" + subfolder if subfolder is not None else ""),
use_fast=False
)
```
### Loading the Weights with EasyLM
For using the weights in our EasyLM framework, please refer to the [LLaMA documentation of EasyLM](https://github.com/young-geng/EasyLM/blob/main/docs/llama.md). Note that unlike the original LLaMA model, our OpenLLaMA tokenizer and weights are trained completely from scratch, so it is no longer necessary to obtain the original LLaMA tokenizer and weights. Note also that we use the BOS (beginning of sentence) token (id=1) during training, so it is best to prepend this token during few-shot evaluation.
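To make the BOS note above concrete, here is a minimal pure-Python sketch of prepending the BOS token to an already-tokenized id sequence. This is an illustration only: the `prepend_bos` helper is hypothetical (not part of EasyLM or transformers), and depending on tokenizer settings, a real tokenizer may already add BOS automatically.

```python
# Hypothetical helper: ensure a token-id sequence starts with BOS (id=1),
# as recommended for few-shot evaluation with OpenLLaMA.
BOS_TOKEN_ID = 1

def prepend_bos(input_ids):
    """Prepend BOS (id=1) unless the sequence already starts with it."""
    ids = list(input_ids)
    if not ids or ids[0] != BOS_TOKEN_ID:
        return [BOS_TOKEN_ID] + ids
    return ids

print(prepend_bos([306, 4658]))  # [1, 306, 4658]
print(prepend_bos([1, 306]))     # already has BOS: [1, 306]
```

In practice, checking `ids[0]` first avoids accidentally emitting a double BOS when the tokenizer has already inserted one.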
## Dataset and Training
We train our models on the [RedPajama](https://www.together.xyz/blog/redpajama) dataset released by [Together](https://www.together.xyz/), which is a reproduction of the LLaMA training dataset containing over 1.2 trillion tokens. We follow exactly the same preprocessing steps and training hyperparameters as the original LLaMA paper, including model architecture, context length, training steps, learning rate schedule, and optimizer. The only difference between our setting and the original one is the dataset used: OpenLLaMA employs the RedPajama dataset rather than the one utilized by the original LLaMA.
We train the models on cloud TPU-v4s using [EasyLM](https://github.com/young-geng/EasyLM), a JAX based training pipeline we developed for training and fine-tuning large language models. We employ a combination of normal data parallelism and [fully sharded data parallelism (also known as ZeRO stage 3)](https://engineering.fb.com/2021/07/15/open-source/fsdp/) to balance training throughput and memory usage. Overall, we reach a throughput of over 2200 tokens / second / TPU-v4 chip for our 7B model.
## Evaluation
We evaluated OpenLLaMA on a wide range of tasks using [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness). The LLaMA results are generated by running the original LLaMA model through the same evaluation pipeline. We note that our results for the LLaMA model differ slightly from the original LLaMA paper, which we believe is a result of different evaluation protocols. Similar differences have been reported in [this issue of lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness/issues/443). Additionally, we present the results of GPT-J, a 6B parameter model trained on the [Pile](https://pile.eleuther.ai/) dataset by [EleutherAI](https://www.eleuther.ai/).
The original LLaMA model was trained for 1 trillion tokens and GPT-J was trained for 500 billion tokens. We present the results in the table below. OpenLLaMA exhibits comparable performance to the original LLaMA and GPT-J across a majority of tasks, and outperforms them in some tasks.
| **Task/Metric** | GPT-J 6B | LLaMA 7B | OpenLLaMA 7B | OpenLLaMA 3B | OpenLLaMA 13B 600BT |
| ---------------------- | -------- | -------- | ------------ | ------------ | ------------------- |
| anli_r1/acc | 0.32 | 0.35 | 0.33 | 0.33 | 0.33 |
| anli_r2/acc | 0.34 | 0.34 | 0.36 | 0.32 | 0.35 |
| anli_r3/acc | 0.35 | 0.37 | 0.38 | 0.35 | 0.38 |
| arc_challenge/acc | 0.34 | 0.39 | 0.37 | 0.34 | 0.39 |
| arc_challenge/acc_norm | 0.37 | 0.41 | 0.38 | 0.37 | 0.42 |
| arc_easy/acc | 0.67 | 0.68 | 0.72 | 0.69 | 0.74 |
| arc_easy/acc_norm | 0.62 | 0.52 | 0.68 | 0.65 | 0.70 |
| ddboolq/acc | 0.50 | 0.56 | 0.53 | 0.49 | 0.71 |
| hellaswag/acc | 0.36 | 0.36 | 0.63 | 0.43 | 0.54 |
| hellaswag/acc_norm | 0.66 | 0.73 | 0.72 | 0.67 | 0.73 |
| openbookqa/acc | 0.29 | 0.29 | 0.30 | 0.27 | 0.30 |
| openbookqa/acc_norm | 0.38 | 0.41 | 0.40 | 0.40 | 0.41 |
| piqa/acc | 0.75 | 0.78 | 0.76 | 0.75 | 0.77 |
| piqa/acc_norm | 0.76 | 0.78 | 0.77 | 0.76 | 0.78 |
| record/em | 0.88 | 0.91 | 0.89 | 0.88 | 0.90 |
| record/f1 | 0.89 | 0.91 | 0.90 | 0.89 | 0.90 |
| rte/acc | 0.54 | 0.56 | 0.60 | 0.58 | 0.65 |
| truthfulqa_mc/mc1 | 0.20 | 0.21 | 0.23 | 0.22 | 0.22 |
| truthfulqa_mc/mc2 | 0.36 | 0.34 | 0.35 | 0.35 | 0.35 |
| wic/acc | 0.50 | 0.50 | 0.51 | 0.48 | 0.49 |
| winogrande/acc | 0.64 | 0.68 | 0.67 | 0.62 | 0.67 |
| Average | 0.51 | 0.53 | 0.55 | 0.52 | 0.56 |
We removed the tasks CB and WSC from our benchmark, as our model performs suspiciously well on these two tasks. We hypothesize that there could be benchmark data contamination in the training set.
## Contact
We would love to get feedback from the community. If you have any questions, please open an issue or contact us.
OpenLLaMA is developed by:
[Xinyang Geng](https://young-geng.xyz/)* and [Hao Liu](https://www.haoliu.site/)* from Berkeley AI Research.
*Equal Contribution
## Acknowledgment
We thank the [Google TPU Research Cloud](https://sites.research.google/trc/about/) program for providing part of the computation resources. We'd like to specially thank Jonathan Caton from TPU Research Cloud for helping us organize compute resources, and Rafi Witten from the Google Cloud team and James Bradbury from the Google JAX team for helping us optimize our training throughput. We'd also like to thank Charlie Snell, Gautier Izacard, Eric Wallace, Lianmin Zheng and our user community for the discussions and feedback.
The OpenLLaMA 13B model is trained in collaboration with [Stability AI](https://stability.ai/), and we thank Stability AI for providing the computation resources. We'd like to especially thank David Ha and Shivanshu Purohit for coordinating the logistics and providing engineering support.
## Reference
If you found OpenLLaMA useful in your research or applications, please cite using the following BibTeX:
```
@software{openlm2023openllama,
author = {Geng, Xinyang and Liu, Hao},
title = {OpenLLaMA: An Open Reproduction of LLaMA},
month = May,
year = 2023,
url = {https://github.com/openlm-research/open_llama}
}
```
```
@software{together2023redpajama,
author = {Together Computer},
title = {RedPajama-Data: An Open Source Recipe to Reproduce LLaMA training dataset},
month = April,
year = 2023,
url = {https://github.com/togethercomputer/RedPajama-Data}
}
```
```
@article{touvron2023llama,
title={Llama: Open and efficient foundation language models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and others},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
```
| 10,507 | [
[
-0.0233001708984375,
-0.0538330078125,
0.01861572265625,
0.030181884765625,
-0.0186767578125,
-0.0037078857421875,
-0.0243682861328125,
-0.044158935546875,
0.0281219482421875,
0.0193023681640625,
-0.029937744140625,
-0.050628662109375,
-0.04962158203125,
0.006244659423828125,
-0.015777587890625,
0.086669921875,
-0.0238037109375,
-0.01070404052734375,
0.0009465217590332031,
-0.02392578125,
-0.0109710693359375,
-0.0266571044921875,
-0.053466796875,
-0.0303955078125,
0.03143310546875,
0.01435089111328125,
0.045318603515625,
0.0355224609375,
0.038543701171875,
0.025360107421875,
-0.0232391357421875,
0.016082763671875,
-0.03973388671875,
-0.0193634033203125,
0.0212249755859375,
-0.040252685546875,
-0.05126953125,
0.00435638427734375,
0.03924560546875,
0.0255126953125,
-0.0247955322265625,
0.04180908203125,
-0.0023212432861328125,
0.038665771484375,
-0.040069580078125,
0.02459716796875,
-0.0416259765625,
0.00850677490234375,
-0.0232086181640625,
-0.00316619873046875,
-0.0202789306640625,
-0.0286712646484375,
-0.007419586181640625,
-0.05810546875,
0.0017910003662109375,
0.0020580291748046875,
0.0885009765625,
0.024261474609375,
-0.017852783203125,
-0.01763916015625,
-0.0291290283203125,
0.0609130859375,
-0.06103515625,
0.01006317138671875,
0.03302001953125,
0.01253509521484375,
-0.00984954833984375,
-0.059326171875,
-0.053466796875,
-0.00933837890625,
-0.00749969482421875,
0.00981903076171875,
-0.0217742919921875,
-0.00914764404296875,
0.0214385986328125,
0.04620361328125,
-0.03594970703125,
0.01861572265625,
-0.04156494140625,
-0.01055908203125,
0.0579833984375,
0.020751953125,
0.01024627685546875,
-0.0103912353515625,
-0.0374755859375,
-0.0179595947265625,
-0.052825927734375,
0.027069091796875,
0.016510009765625,
0.02252197265625,
-0.03643798828125,
0.04840087890625,
-0.0215301513671875,
0.038238525390625,
0.0083465576171875,
-0.041748046875,
0.052947998046875,
-0.0295562744140625,
-0.0338134765625,
0.0028533935546875,
0.0672607421875,
0.0293426513671875,
0.00234222412109375,
0.00940704345703125,
-0.014617919921875,
-0.006420135498046875,
-0.00751495361328125,
-0.060150146484375,
-0.0028171539306640625,
0.0187225341796875,
-0.036834716796875,
-0.0265960693359375,
0.003269195556640625,
-0.041229248046875,
-0.01055908203125,
-0.01097869873046875,
0.034271240234375,
-0.0161285400390625,
-0.01763916015625,
0.0224456787109375,
0.01088714599609375,
0.0318603515625,
0.034393310546875,
-0.053924560546875,
0.01531219482421875,
0.036407470703125,
0.07080078125,
-0.003932952880859375,
-0.02874755859375,
-0.020843505859375,
0.00096893310546875,
-0.0188140869140625,
0.04229736328125,
-0.0079345703125,
-0.0233001708984375,
-0.010162353515625,
0.006298065185546875,
-0.01763916015625,
-0.035675048828125,
0.03704833984375,
-0.031768798828125,
0.0180206298828125,
-0.013031005859375,
-0.0133819580078125,
-0.0230865478515625,
0.020904541015625,
-0.04498291015625,
0.1005859375,
0.0072784423828125,
-0.053009033203125,
0.023162841796875,
-0.058258056640625,
-0.00795745849609375,
-0.02008056640625,
0.0120849609375,
-0.0498046875,
-0.003948211669921875,
0.03216552734375,
0.03082275390625,
-0.033599853515625,
0.01422882080078125,
-0.0179595947265625,
-0.037353515625,
0.0109710693359375,
-0.0170440673828125,
0.08160400390625,
0.020782470703125,
-0.03515625,
0.019744873046875,
-0.06903076171875,
-0.004985809326171875,
0.045928955078125,
-0.04632568359375,
-0.0065460205078125,
-0.0224761962890625,
-0.0014801025390625,
0.004543304443359375,
0.034423828125,
-0.043060302734375,
0.031646728515625,
-0.0246429443359375,
0.036346435546875,
0.06781005859375,
-0.0151214599609375,
0.01287078857421875,
-0.033660888671875,
0.031829833984375,
0.0121917724609375,
0.018035888671875,
-0.0150909423828125,
-0.04840087890625,
-0.074951171875,
-0.04132080078125,
0.01078033447265625,
0.03155517578125,
-0.0224151611328125,
0.03466796875,
-0.012939453125,
-0.05303955078125,
-0.05718994140625,
0.0167236328125,
0.031341552734375,
0.032196044921875,
0.037628173828125,
-0.0230560302734375,
-0.042236328125,
-0.0635986328125,
0.0011653900146484375,
-0.0227508544921875,
0.01181793212890625,
0.021759033203125,
0.055328369140625,
-0.0258941650390625,
0.06292724609375,
-0.0413818359375,
-0.0306243896484375,
-0.01519775390625,
-0.00536346435546875,
0.04791259765625,
0.031829833984375,
0.051544189453125,
-0.029510498046875,
-0.0382080078125,
0.0028533935546875,
-0.062408447265625,
-0.005611419677734375,
-0.0015172958374023438,
-0.0119171142578125,
0.0230255126953125,
0.01122283935546875,
-0.0670166015625,
0.047271728515625,
0.043060302734375,
-0.0283203125,
0.040252685546875,
-0.0083465576171875,
0.0022945404052734375,
-0.07318115234375,
0.018890380859375,
-0.005153656005859375,
-0.008453369140625,
-0.034515380859375,
0.020294189453125,
0.0004620552062988281,
0.0038909912109375,
-0.049835205078125,
0.05230712890625,
-0.0294036865234375,
-0.0074462890625,
0.00739288330078125,
0.002593994140625,
-0.0014657974243164062,
0.05230712890625,
-0.0115814208984375,
0.06976318359375,
0.033447265625,
-0.0306243896484375,
0.0233154296875,
0.024505615234375,
-0.03765869140625,
0.0225067138671875,
-0.059356689453125,
0.0195770263671875,
-0.0000908970832824707,
0.0352783203125,
-0.07421875,
-0.01377105712890625,
0.0330810546875,
-0.0227508544921875,
0.0142974853515625,
0.0100860595703125,
-0.040008544921875,
-0.048614501953125,
-0.04803466796875,
0.0284576416015625,
0.03924560546875,
-0.053009033203125,
0.018524169921875,
0.0104217529296875,
0.00965118408203125,
-0.05267333984375,
-0.052093505859375,
-0.00730133056640625,
-0.0249786376953125,
-0.0426025390625,
0.0262603759765625,
-0.006038665771484375,
-0.01320648193359375,
-0.00855255126953125,
-0.0066375732421875,
0.00299835205078125,
0.01436614990234375,
0.0252532958984375,
0.021728515625,
-0.02490234375,
-0.00922393798828125,
-0.006916046142578125,
-0.0044097900390625,
-0.00901031494140625,
0.003253936767578125,
0.054351806640625,
-0.028961181640625,
-0.0333251953125,
-0.0523681640625,
-0.00917816162109375,
0.037139892578125,
-0.018280029296875,
0.0693359375,
0.05279541015625,
-0.02191162109375,
0.0161590576171875,
-0.0408935546875,
0.0115814208984375,
-0.0355224609375,
0.0204010009765625,
-0.031219482421875,
-0.06494140625,
0.046722412109375,
0.0147857666015625,
0.01812744140625,
0.05584716796875,
0.058563232421875,
0.005123138427734375,
0.059539794921875,
0.0328369140625,
-0.020294189453125,
0.0254364013671875,
-0.043701171875,
-0.0008749961853027344,
-0.075439453125,
-0.038116455078125,
-0.03753662109375,
-0.03021240234375,
-0.0281219482421875,
-0.035247802734375,
0.024749755859375,
0.0253143310546875,
-0.048736572265625,
0.0291748046875,
-0.038299560546875,
0.0204925537109375,
0.046051025390625,
0.01450347900390625,
0.0308074951171875,
0.004871368408203125,
-0.01165008544921875,
0.006061553955078125,
-0.037078857421875,
-0.039459228515625,
0.10687255859375,
0.03985595703125,
0.055450439453125,
0.00836944580078125,
0.06451416015625,
0.0054473876953125,
0.03594970703125,
-0.0421142578125,
0.0377197265625,
0.0207061767578125,
-0.0439453125,
-0.0107879638671875,
-0.01580810546875,
-0.0780029296875,
0.037872314453125,
-0.007274627685546875,
-0.071044921875,
0.0022754669189453125,
-0.006011962890625,
-0.0266265869140625,
0.031982421875,
-0.03704833984375,
0.0517578125,
-0.0205535888671875,
-0.0171966552734375,
-0.009368896484375,
-0.035247802734375,
0.04632568359375,
-0.013458251953125,
0.01153564453125,
-0.0142364501953125,
-0.017547607421875,
0.067138671875,
-0.051544189453125,
0.06329345703125,
-0.013092041015625,
-0.01450347900390625,
0.03851318359375,
-0.016387939453125,
0.039031982421875,
-0.0024700164794921875,
-0.0182342529296875,
0.036834716796875,
-0.01070404052734375,
-0.033050537109375,
-0.0182342529296875,
0.05499267578125,
-0.0897216796875,
-0.055877685546875,
-0.040069580078125,
-0.0298614501953125,
0.01383209228515625,
0.0115509033203125,
0.0134735107421875,
0.0014104843139648438,
0.00201416015625,
0.0174407958984375,
0.0273590087890625,
-0.0305938720703125,
0.044586181640625,
0.03350830078125,
-0.030181884765625,
-0.0419921875,
0.055206298828125,
0.0016641616821289062,
0.01114654541015625,
0.0130767822265625,
0.016510009765625,
-0.0192108154296875,
-0.035186767578125,
-0.04150390625,
0.0296783447265625,
-0.04541015625,
-0.0271148681640625,
-0.04541015625,
-0.0160369873046875,
-0.02801513671875,
-0.006526947021484375,
-0.025970458984375,
-0.03729248046875,
-0.03350830078125,
-0.0113067626953125,
0.04864501953125,
0.06317138671875,
0.002826690673828125,
0.036407470703125,
-0.03839111328125,
0.01512908935546875,
0.0150299072265625,
0.013824462890625,
0.0135040283203125,
-0.051239013671875,
-0.021636962890625,
0.00046062469482421875,
-0.046478271484375,
-0.049530029296875,
0.0279388427734375,
0.0109710693359375,
0.036163330078125,
0.0313720703125,
-0.00894927978515625,
0.07623291015625,
-0.0208740234375,
0.075439453125,
0.02789306640625,
-0.06512451171875,
0.044189453125,
-0.0158538818359375,
0.01416015625,
0.036163330078125,
0.0298309326171875,
-0.0208740234375,
-0.0235137939453125,
-0.04827880859375,
-0.070556640625,
0.06610107421875,
0.0175323486328125,
-0.0017261505126953125,
0.007236480712890625,
0.02099609375,
0.00322723388671875,
0.0157012939453125,
-0.0853271484375,
-0.02899169921875,
-0.015960693359375,
-0.0152130126953125,
-0.01221466064453125,
-0.006244659423828125,
-0.0151214599609375,
-0.0374755859375,
0.04345703125,
0.0009984970092773438,
0.032562255859375,
0.017730712890625,
-0.019927978515625,
-0.01428985595703125,
-0.0003616809844970703,
0.05865478515625,
0.0462646484375,
-0.016998291015625,
-0.0138397216796875,
0.0310516357421875,
-0.0401611328125,
0.01427459716796875,
-0.00029850006103515625,
-0.0183868408203125,
-0.0097503662109375,
0.038848876953125,
0.0731201171875,
0.0173797607421875,
-0.04168701171875,
0.0408935546875,
0.004688262939453125,
-0.01708984375,
-0.0247802734375,
0.0014009475708007812,
0.01235198974609375,
0.0245819091796875,
0.033905029296875,
-0.007534027099609375,
-0.01416015625,
-0.038726806640625,
-0.005840301513671875,
0.033447265625,
0.0011157989501953125,
-0.024383544921875,
0.0662841796875,
0.005786895751953125,
-0.0203857421875,
0.034210205078125,
0.004505157470703125,
-0.03546142578125,
0.061126708984375,
0.049713134765625,
0.05120849609375,
-0.01593017578125,
-0.00016939640045166016,
0.04315185546875,
0.0285797119140625,
-0.004528045654296875,
0.0193328857421875,
-0.007114410400390625,
-0.02874755859375,
-0.018951416015625,
-0.07073974609375,
-0.02392578125,
0.0158538818359375,
-0.043182373046875,
0.026153564453125,
-0.04168701171875,
-0.01470184326171875,
-0.0283660888671875,
0.0186614990234375,
-0.067138671875,
0.01007843017578125,
0.004459381103515625,
0.07489013671875,
-0.05133056640625,
0.059295654296875,
0.047454833984375,
-0.049560546875,
-0.0740966796875,
-0.018341064453125,
-0.004150390625,
-0.093017578125,
0.058135986328125,
0.02587890625,
0.01184844970703125,
-0.008056640625,
-0.032257080078125,
-0.0875244140625,
0.11468505859375,
0.01546478271484375,
-0.038330078125,
0.0026721954345703125,
0.01396942138671875,
0.037872314453125,
-0.01708984375,
0.0435791015625,
0.035552978515625,
0.042266845703125,
-0.0041351318359375,
-0.0899658203125,
0.020172119140625,
-0.0220489501953125,
-0.0013914108276367188,
0.005542755126953125,
-0.0804443359375,
0.08831787109375,
-0.022735595703125,
-0.0008816719055175781,
0.02197265625,
0.05047607421875,
0.03948974609375,
0.0328369140625,
0.029449462890625,
0.0748291015625,
0.06451416015625,
-0.01239013671875,
0.082763671875,
-0.0125579833984375,
0.0462646484375,
0.058013916015625,
-0.01274871826171875,
0.069091796875,
0.03515625,
-0.046630859375,
0.042877197265625,
0.06341552734375,
0.0031528472900390625,
0.030029296875,
0.0155181884765625,
-0.01071929931640625,
0.00823211669921875,
0.004535675048828125,
-0.05609130859375,
0.03570556640625,
0.01264190673828125,
-0.0268096923828125,
-0.01432037353515625,
-0.008819580078125,
0.0161285400390625,
-0.017608642578125,
-0.0278167724609375,
0.0419921875,
0.003604888916015625,
-0.034149169921875,
0.0718994140625,
0.015777587890625,
0.07379150390625,
-0.04376220703125,
0.01568603515625,
-0.022705078125,
0.01459503173828125,
-0.0311431884765625,
-0.0439453125,
0.0083770751953125,
0.01332855224609375,
0.01021575927734375,
-0.00948333740234375,
0.03570556640625,
-0.01116180419921875,
-0.034149169921875,
0.0216217041015625,
0.022003173828125,
0.0197601318359375,
0.0159149169921875,
-0.056640625,
0.027008056640625,
-0.003604888916015625,
-0.061004638671875,
0.03460693359375,
0.0135955810546875,
-0.0032329559326171875,
0.049102783203125,
0.06536865234375,
0.00077056884765625,
0.0197296142578125,
-0.00907135009765625,
0.07891845703125,
-0.0531005859375,
-0.023834228515625,
-0.06451416015625,
0.03924560546875,
0.0009918212890625,
-0.046417236328125,
0.059234619140625,
0.049652099609375,
0.06390380859375,
-0.0024204254150390625,
0.0311737060546875,
-0.007427215576171875,
0.0170135498046875,
-0.0433349609375,
0.05535888671875,
-0.05767822265625,
0.01045989990234375,
-0.0178985595703125,
-0.07281494140625,
-0.024169921875,
0.0638427734375,
-0.0163116455078125,
0.0030689239501953125,
0.040252685546875,
0.0562744140625,
0.007007598876953125,
-0.00823211669921875,
-0.001598358154296875,
0.0236358642578125,
0.024658203125,
0.06475830078125,
0.056884765625,
-0.054168701171875,
0.0380859375,
-0.0266876220703125,
-0.01287841796875,
-0.0296173095703125,
-0.058837890625,
-0.061370849609375,
-0.0275726318359375,
-0.0218658447265625,
-0.02197265625,
-0.00849151611328125,
0.08392333984375,
0.039031982421875,
-0.043121337890625,
-0.0335693359375,
0.00852203369140625,
0.008148193359375,
-0.0084381103515625,
-0.014129638671875,
0.03948974609375,
-0.0088958740234375,
-0.06365966796875,
0.02618408203125,
0.0033321380615234375,
0.0080108642578125,
-0.02325439453125,
-0.023834228515625,
-0.0182952880859375,
-0.0009102821350097656,
0.048675537109375,
0.0240936279296875,
-0.07110595703125,
-0.0200347900390625,
-0.0159912109375,
-0.022216796875,
0.0207061767578125,
0.0220794677734375,
-0.058990478515625,
0.01035308837890625,
0.0170440673828125,
0.03753662109375,
0.061431884765625,
-0.00449371337890625,
0.003612518310546875,
-0.0312042236328125,
0.0352783203125,
-0.0124969482421875,
0.032806396484375,
0.0116424560546875,
-0.0229644775390625,
0.059356689453125,
0.022613525390625,
-0.033538818359375,
-0.07830810546875,
-0.01678466796875,
-0.08740234375,
0.0023403167724609375,
0.08447265625,
-0.0218658447265625,
-0.037139892578125,
0.025390625,
-0.0288543701171875,
0.01507568359375,
-0.03369140625,
0.050048828125,
0.047210693359375,
-0.008544921875,
-0.002826690673828125,
-0.04229736328125,
0.01323699951171875,
0.024383544921875,
-0.05841064453125,
-0.02392578125,
0.01238250732421875,
0.0244903564453125,
0.016815185546875,
0.06805419921875,
-0.00864410400390625,
0.014190673828125,
-0.0086669921875,
0.00893402099609375,
-0.024444580078125,
-0.00785064697265625,
-0.02655029296875,
0.01276397705078125,
0.007293701171875,
-0.025146484375
]
] |
ckiplab/bert-base-chinese-ws | 2022-05-10T03:28:12.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"token-classification",
"zh",
"license:gpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | ckiplab | null | null | ckiplab/bert-base-chinese-ws | 6 | 89,147 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- token-classification
- bert
- zh
license: gpl-3.0
---
# CKIP BERT Base Chinese
This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).
這個專案提供了繁體中文的 transformers 模型(包含 ALBERT、BERT、GPT2)及自然語言處理工具(包含斷詞、詞性標記、實體辨識)。
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use BertTokenizerFast as the tokenizer instead of AutoTokenizer.
請使用 BertTokenizerFast 而非 AutoTokenizer。
```python
from transformers import (
BertTokenizerFast,
AutoModel,
)
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/bert-base-chinese-ws')
```
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.
有關完整使用方法及其他資訊,請參見 https://github.com/ckiplab/ckip-transformers 。
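The word-segmentation (ws) model tags each character with a label, and the labels must then be merged back into words. Assuming the common B/I tagging scheme (B = begins a word, I = continues a word), a minimal pure-Python sketch of that decoding step might look like the following; the `bio_to_words` helper is an illustration, not part of the ckip-transformers API.

```python
def bio_to_words(chars, tags):
    """Merge characters into words according to B/I word-segmentation tags.

    chars: list of single characters (one per predicted tag)
    tags:  list of "B"/"I" labels of the same length
    """
    words, current = [], ""
    for ch, tag in zip(chars, tags):
        if tag == "B" and current:
            # A new word begins: flush the word accumulated so far.
            words.append(current)
            current = ch
        else:
            current += ch
    if current:
        words.append(current)
    return words

print(bio_to_words(list("我喜歡貓"), ["B", "B", "I", "B"]))  # ['我', '喜歡', '貓']
```

For production use, prefer the high-level `CkipWordSegmenter` driver described in the ckip-transformers repository linked above, which handles tokenization and decoding for you.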
| 1,122 | [
[
-0.0223541259765625,
-0.0253753662109375,
0.0019502639770507812,
0.056121826171875,
-0.0296783447265625,
0.00405120849609375,
-0.01433563232421875,
-0.019134521484375,
-0.00244140625,
0.03167724609375,
-0.0279541015625,
-0.0219268798828125,
-0.0440673828125,
0.0014982223510742188,
-0.0180816650390625,
0.06365966796875,
-0.0143890380859375,
0.0257110595703125,
0.03192138671875,
0.00916290283203125,
-0.018890380859375,
-0.03277587890625,
-0.051300048828125,
-0.043701171875,
-0.0021877288818359375,
0.0192413330078125,
0.0496826171875,
0.0286712646484375,
0.036163330078125,
0.023223876953125,
0.0025787353515625,
-0.00695037841796875,
-0.013153076171875,
-0.0201568603515625,
-0.0002846717834472656,
-0.03851318359375,
-0.027099609375,
-0.014495849609375,
0.04876708984375,
0.035919189453125,
0.0016183853149414062,
-0.0005517005920410156,
0.015106201171875,
0.02642822265625,
-0.025390625,
0.03125,
-0.04412841796875,
0.02276611328125,
-0.01202392578125,
-0.0063018798828125,
-0.027252197265625,
-0.0189208984375,
0.01436614990234375,
-0.0460205078125,
0.0244903564453125,
-0.01177215576171875,
0.09771728515625,
0.0023975372314453125,
-0.023101806640625,
-0.0191650390625,
-0.050537109375,
0.07769775390625,
-0.0643310546875,
0.033294677734375,
0.0255279541015625,
0.0206451416015625,
-0.00482940673828125,
-0.0794677734375,
-0.04754638671875,
-0.01372528076171875,
-0.016082763671875,
0.02410888671875,
0.009735107421875,
-0.0023212432861328125,
0.0276031494140625,
0.0216827392578125,
-0.044342041015625,
0.0153656005859375,
-0.0283660888671875,
-0.031982421875,
0.038909912109375,
-0.006683349609375,
0.034423828125,
-0.032135009765625,
-0.040252685546875,
-0.0255126953125,
-0.045562744140625,
0.017578125,
0.019805908203125,
0.0095977783203125,
-0.03448486328125,
0.042572021484375,
-0.0015478134155273438,
0.0218963623046875,
0.01276397705078125,
-0.0066375732421875,
0.033111572265625,
-0.0201873779296875,
-0.005985260009765625,
-0.01025390625,
0.065673828125,
0.0169677734375,
0.00728607177734375,
0.0048675537109375,
-0.0236358642578125,
-0.02410888671875,
-0.0172882080078125,
-0.056121826171875,
-0.049407958984375,
0.0153961181640625,
-0.057769775390625,
-0.015380859375,
0.01198577880859375,
-0.044921875,
0.02142333984375,
-0.0181884765625,
0.0308837890625,
-0.05169677734375,
-0.04437255859375,
-0.0011720657348632812,
-0.0311126708984375,
0.060333251953125,
0.01000213623046875,
-0.08819580078125,
0.0019292831420898438,
0.04364013671875,
0.05419921875,
0.00971221923828125,
-0.0125274658203125,
0.009552001953125,
0.02740478515625,
-0.0153961181640625,
0.039306640625,
-0.00786590576171875,
-0.05255126953125,
0.0111083984375,
0.006282806396484375,
0.0018281936645507812,
-0.0301666259765625,
0.05914306640625,
-0.0240631103515625,
0.0297393798828125,
-0.0177764892578125,
-0.02142333984375,
-0.0045928955078125,
0.006824493408203125,
-0.03759765625,
0.0885009765625,
0.0162811279296875,
-0.063232421875,
0.0175628662109375,
-0.0643310546875,
-0.043792724609375,
0.02392578125,
-0.00722503662109375,
-0.0299530029296875,
-0.01190948486328125,
0.0166015625,
0.0235595703125,
-0.004360198974609375,
0.0157318115234375,
-0.002147674560546875,
-0.0164947509765625,
0.0001671314239501953,
-0.032196044921875,
0.09832763671875,
0.02557373046875,
-0.0239105224609375,
0.01329803466796875,
-0.048858642578125,
0.0102996826171875,
0.0227813720703125,
-0.0186309814453125,
-0.0182647705078125,
0.0156097412109375,
0.042572021484375,
0.01221466064453125,
0.0413818359375,
-0.04388427734375,
0.035614013671875,
-0.040313720703125,
0.052734375,
0.061248779296875,
-0.0241241455078125,
0.019683837890625,
-0.01097869873046875,
-0.0007257461547851562,
0.00402069091796875,
0.0275115966796875,
-0.009490966796875,
-0.03753662109375,
-0.08172607421875,
-0.0253143310546875,
0.031951904296875,
0.05712890625,
-0.08251953125,
0.06695556640625,
-0.0178680419921875,
-0.046630859375,
-0.023223876953125,
-0.00563812255859375,
0.0019207000732421875,
0.0134735107421875,
0.04058837890625,
-0.0220489501953125,
-0.04345703125,
-0.0750732421875,
0.00896453857421875,
-0.04156494140625,
-0.042083740234375,
-0.0014181137084960938,
0.040435791015625,
-0.03240966796875,
0.07305908203125,
-0.03924560546875,
-0.0207672119140625,
-0.0234375,
0.0408935546875,
0.024993896484375,
0.0657958984375,
0.046234130859375,
-0.07391357421875,
-0.05169677734375,
-0.0167083740234375,
-0.0254974365234375,
-0.004718780517578125,
-0.01544189453125,
-0.01076507568359375,
0.004253387451171875,
0.005321502685546875,
-0.0450439453125,
0.01470184326171875,
0.0292510986328125,
0.00007033348083496094,
0.06402587890625,
-0.004100799560546875,
-0.02093505859375,
-0.09686279296875,
0.01372528076171875,
-0.01444244384765625,
-0.00286865234375,
-0.0304718017578125,
-0.0007929801940917969,
0.013427734375,
-0.00644683837890625,
-0.039581298828125,
0.040313720703125,
-0.0256805419921875,
0.0229949951171875,
-0.019287109375,
-0.01227569580078125,
-0.014678955078125,
0.044769287109375,
0.0297698974609375,
0.051605224609375,
0.044769287109375,
-0.052032470703125,
0.031494140625,
0.0496826171875,
-0.01959228515625,
-0.006649017333984375,
-0.06903076171875,
-0.002330780029296875,
0.0234832763671875,
0.013031005859375,
-0.07037353515625,
-0.0030117034912109375,
0.04388427734375,
-0.05535888671875,
0.04443359375,
0.004688262939453125,
-0.06805419921875,
-0.03466796875,
-0.03387451171875,
0.0240936279296875,
0.05084228515625,
-0.04620361328125,
0.036590576171875,
0.019683837890625,
-0.0153961181640625,
-0.044525146484375,
-0.058197021484375,
-0.002048492431640625,
0.020599365234375,
-0.041839599609375,
0.047821044921875,
-0.0165863037109375,
0.0253753662109375,
-0.0003046989440917969,
0.0062408447265625,
-0.034210205078125,
-0.006259918212890625,
-0.01031494140625,
0.0302734375,
-0.01103973388671875,
-0.0015506744384765625,
0.0145721435546875,
-0.0225982666015625,
0.01026153564453125,
-0.0012578964233398438,
0.05389404296875,
0.0032176971435546875,
-0.0239715576171875,
-0.04205322265625,
0.0200347900390625,
0.01549530029296875,
-0.0190277099609375,
0.02301025390625,
0.07659912109375,
-0.019134521484375,
-0.0136260986328125,
-0.031494140625,
-0.01139068603515625,
-0.04058837890625,
0.0450439453125,
-0.033660888671875,
-0.05987548828125,
0.0240631103515625,
-0.00841522216796875,
0.01494598388671875,
0.055572509765625,
0.04620361328125,
-0.0020999908447265625,
0.09136962890625,
0.06787109375,
-0.041168212890625,
0.033355712890625,
-0.0305328369140625,
0.0276031494140625,
-0.06591796875,
0.01715087890625,
-0.04669189453125,
0.00684356689453125,
-0.061370849609375,
-0.02117919921875,
-0.001178741455078125,
0.01137542724609375,
-0.020233154296875,
0.0528564453125,
-0.058502197265625,
-0.00450897216796875,
0.058135986328125,
-0.0220489501953125,
-0.0079803466796875,
-0.007080078125,
-0.02032470703125,
-0.0019626617431640625,
-0.04345703125,
-0.0479736328125,
0.05450439453125,
0.0489501953125,
0.052734375,
-0.00167083740234375,
0.036376953125,
-0.0012941360473632812,
0.0318603515625,
-0.05828857421875,
0.04058837890625,
-0.0155181884765625,
-0.060211181640625,
-0.023223876953125,
-0.016632080078125,
-0.06219482421875,
0.0175018310546875,
-0.0024242401123046875,
-0.0638427734375,
0.01103973388671875,
0.00455474853515625,
-0.006183624267578125,
0.0285797119140625,
-0.031951904296875,
0.05419921875,
-0.03533935546875,
0.00826263427734375,
-0.00644683837890625,
-0.05316162109375,
0.0285797119140625,
0.0014591217041015625,
-0.006420135498046875,
-0.0056304931640625,
0.00756072998046875,
0.056732177734375,
-0.0156402587890625,
0.061065673828125,
-0.01495361328125,
-0.005092620849609375,
0.0236968994140625,
-0.0216827392578125,
0.023040771484375,
0.011749267578125,
0.009033203125,
0.044525146484375,
0.0163726806640625,
-0.0279541015625,
-0.0160064697265625,
0.03472900390625,
-0.06732177734375,
-0.031402587890625,
-0.043182373046875,
-0.0169677734375,
0.0092620849609375,
0.03985595703125,
0.041534423828125,
0.0013027191162109375,
-0.0005693435668945312,
0.0191192626953125,
0.024169921875,
-0.0322265625,
0.043792724609375,
0.04168701171875,
-0.0048828125,
-0.033355712890625,
0.06866455078125,
0.01029205322265625,
0.006420135498046875,
0.04718017578125,
-0.0029430389404296875,
-0.0181884765625,
-0.033355712890625,
-0.02484130859375,
0.0283203125,
-0.03131103515625,
-0.0007071495056152344,
-0.026702880859375,
-0.04345703125,
-0.048095703125,
0.01020050048828125,
-0.0264892578125,
-0.029510498046875,
-0.0211944580078125,
0.0007724761962890625,
-0.0242462158203125,
0.00945281982421875,
-0.02093505859375,
0.03448486328125,
-0.07855224609375,
0.03692626953125,
0.0153656005859375,
0.0184783935546875,
0.0025997161865234375,
-0.0182647705078125,
-0.0408935546875,
0.0097503662109375,
-0.06451416015625,
-0.054229736328125,
0.041168212890625,
-0.000042498111724853516,
0.05328369140625,
0.045562744140625,
0.01392364501953125,
0.038665771484375,
-0.04840087890625,
0.0831298828125,
0.0279083251953125,
-0.08935546875,
0.0297393798828125,
-0.0132904052734375,
0.02484130859375,
0.021484375,
0.036529541015625,
-0.058013916015625,
-0.025421142578125,
-0.036102294921875,
-0.08575439453125,
0.048858642578125,
0.02764892578125,
0.02569580078125,
-0.0006732940673828125,
0.00016164779663085938,
-0.0022869110107421875,
0.01229095458984375,
-0.08148193359375,
-0.041046142578125,
-0.039306640625,
-0.02349853515625,
0.01751708984375,
-0.0302734375,
0.00618743896484375,
-0.016754150390625,
0.08001708984375,
0.0047454833984375,
0.0609130859375,
0.03533935546875,
-0.00362396240234375,
-0.00981903076171875,
0.006214141845703125,
0.034912109375,
0.040771484375,
-0.019927978515625,
-0.01751708984375,
0.006683349609375,
-0.04730224609375,
-0.01708984375,
0.0299530029296875,
-0.02850341796875,
0.032989501953125,
0.037384033203125,
0.04608154296875,
0.0101318359375,
-0.0309906005859375,
0.040679931640625,
-0.012237548828125,
-0.0180816650390625,
-0.07183837890625,
-0.0029430389404296875,
0.0033473968505859375,
0.0016126632690429688,
0.052154541015625,
-0.01207733154296875,
0.01024627685546875,
-0.01374053955078125,
0.0158538818359375,
0.031829833984375,
-0.0384521484375,
-0.033935546875,
0.049102783203125,
0.035736083984375,
-0.0197296142578125,
0.06298828125,
-0.0043792724609375,
-0.07122802734375,
0.049896240234375,
0.03436279296875,
0.0750732421875,
-0.0251007080078125,
0.0032901763916015625,
0.047332763671875,
0.036865234375,
0.00450897216796875,
0.01849365234375,
-0.01995849609375,
-0.068603515625,
-0.03887939453125,
-0.02740478515625,
-0.03369140625,
0.03131103515625,
-0.037811279296875,
0.043182373046875,
-0.03387451171875,
-0.009246826171875,
-0.00466156005859375,
-0.0034465789794921875,
-0.036468505859375,
0.011260986328125,
0.00914764404296875,
0.0853271484375,
-0.04718017578125,
0.08782958984375,
0.043060302734375,
-0.039520263671875,
-0.06207275390625,
0.0126495361328125,
-0.0293121337890625,
-0.054534912109375,
0.0777587890625,
0.0257568359375,
0.0201263427734375,
0.005496978759765625,
-0.055389404296875,
-0.057891845703125,
0.0750732421875,
-0.01169586181640625,
-0.024993896484375,
-0.007526397705078125,
0.025482177734375,
0.0295562744140625,
-0.002971649169921875,
0.0322265625,
0.005123138427734375,
0.046783447265625,
-0.0124359130859375,
-0.0855712890625,
-0.016448974609375,
-0.0203704833984375,
0.004467010498046875,
0.01812744140625,
-0.0633544921875,
0.0625,
0.00737762451171875,
-0.024322509765625,
0.028167724609375,
0.06787109375,
0.0004992485046386719,
0.00836944580078125,
0.042449951171875,
0.0335693359375,
-0.002246856689453125,
-0.0169677734375,
0.0362548828125,
-0.041961669921875,
0.0589599609375,
0.0611572265625,
-0.00540924072265625,
0.0556640625,
0.0266876220703125,
-0.038299560546875,
0.041229248046875,
0.051055908203125,
-0.044769287109375,
0.04644775390625,
0.0015783309936523438,
-0.007419586181640625,
-0.0075225830078125,
0.00943756103515625,
-0.04248046875,
0.0175933837890625,
0.023345947265625,
-0.0260467529296875,
-0.01114654541015625,
-0.0135498046875,
-0.001651763916015625,
-0.031341552734375,
-0.00405120849609375,
0.038238525390625,
0.00989532470703125,
-0.0230560302734375,
0.036285400390625,
0.0266876220703125,
0.0728759765625,
-0.079833984375,
-0.025848388671875,
0.0197296142578125,
0.01082611083984375,
-0.00408935546875,
-0.046875,
0.010894775390625,
-0.025238037109375,
-0.011688232421875,
-0.01090240478515625,
0.059326171875,
-0.0244293212890625,
-0.038818359375,
0.031829833984375,
0.00608062744140625,
0.00919342041015625,
0.0220489501953125,
-0.0850830078125,
-0.0242767333984375,
0.0266876220703125,
-0.032257080078125,
0.011627197265625,
0.01212310791015625,
0.007015228271484375,
0.048095703125,
0.0650634765625,
0.0067138671875,
-0.0101776123046875,
-0.00286102294921875,
0.06671142578125,
-0.042633056640625,
-0.041839599609375,
-0.0504150390625,
0.0560302734375,
-0.0182647705078125,
-0.026580810546875,
0.05218505859375,
0.05255126953125,
0.0836181640625,
-0.0261993408203125,
0.07647705078125,
-0.0294647216796875,
0.056427001953125,
-0.01428985595703125,
0.05902099609375,
-0.0302734375,
-0.0107421875,
-0.0252685546875,
-0.06475830078125,
-0.0158538818359375,
0.06463623046875,
-0.00962066650390625,
-0.005298614501953125,
0.05126953125,
0.043914794921875,
0.0011854171752929688,
-0.0160369873046875,
0.01142120361328125,
0.01312255859375,
0.04644775390625,
0.03240966796875,
0.041259765625,
-0.038330078125,
0.046966552734375,
-0.0479736328125,
-0.0157318115234375,
-0.00962066650390625,
-0.05181884765625,
-0.05218505859375,
-0.045196533203125,
-0.0200347900390625,
-0.007709503173828125,
-0.0196990966796875,
0.06146240234375,
0.056549072265625,
-0.07806396484375,
-0.03289794921875,
-0.0008530616760253906,
0.007801055908203125,
-0.0259552001953125,
-0.025787353515625,
0.046722412109375,
-0.0323486328125,
-0.08453369140625,
0.0006766319274902344,
0.006038665771484375,
0.00838470458984375,
-0.02398681640625,
0.0003921985626220703,
-0.0205841064453125,
-0.0129547119140625,
0.032257080078125,
0.032684326171875,
-0.0552978515625,
-0.0239105224609375,
-0.0016508102416992188,
-0.0157012939453125,
0.007785797119140625,
0.0458984375,
-0.016876220703125,
0.0268402099609375,
0.049468994140625,
0.0201263427734375,
0.025787353515625,
-0.01158905029296875,
0.051971435546875,
-0.0374755859375,
0.02166748046875,
0.0239105224609375,
0.041259765625,
0.0232696533203125,
-0.0163421630859375,
0.035797119140625,
0.032257080078125,
-0.052093505859375,
-0.04351806640625,
0.0250396728515625,
-0.0762939453125,
-0.02166748046875,
0.0673828125,
-0.020751953125,
-0.00923919677734375,
-0.00843048095703125,
-0.043853759765625,
0.047454833984375,
-0.022064208984375,
0.044647216796875,
0.06353759765625,
-0.00441741943359375,
-0.0045928955078125,
-0.0377197265625,
0.0301971435546875,
0.032073974609375,
-0.025360107421875,
-0.025299072265625,
0.0006937980651855469,
0.0140533447265625,
0.0455322265625,
0.03240966796875,
-0.00921630859375,
0.008026123046875,
-0.012451171875,
0.044464111328125,
0.0011730194091796875,
0.01473236083984375,
0.00506591796875,
-0.01357269287109375,
0.0028247833251953125,
-0.032623291015625
]
] |
openai/whisper-base | 2023-09-08T13:08:06.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"en",
"zh",
"de",
"es",
"ru",
"ko",
"fr",
"ja",
"pt",
"tr",
"pl",
"ca",
"nl",
"ar",
"sv",
"it",
"id",
"hi",
"fi",
"vi",
"he",
"uk",
"el",
"ms",
"cs",
"ro",
"da",
"hu",
"ta",
"no",
"th",
"ur",
"hr",
"bg",
"lt",
"la",
"mi",
"ml",
"cy",
"sk",
"te",
"fa",
"lv",
"bn",
"sr",
"az",
"sl",
"kn",
"et",
"mk",
"br",
"eu",
"is",
"hy",
"ne",
"mn",
"bs",
"kk",
"sq",
"sw",
"gl",
"mr",
"pa",
"si",
"km",
"sn",
"yo",
"so",
"af",
"oc",
"ka",
"be",
"tg",
"sd",
"gu",
"am",
"yi",
"lo",
"uz",
"fo",
"ht",
"ps",
"tk",
"nn",
"mt",
"sa",
"lb",
"my",
"bo",
"tl",
"mg",
"as",
"tt",
"haw",
"ln",
"ha",
"ba",
"jw",
"su",
"arxiv:2212.04356",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | openai | null | null | openai/whisper-base | 132 | 89,068 | transformers | 2022-09-26T06:50:46 | ---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- no
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
widget:
- example_title: Librispeech sample 1
src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
model-index:
- name: whisper-base
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 5.008769117619326
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (other)
type: librispeech_asr
config: other
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 12.84936273212057
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice 11.0
type: mozilla-foundation/common_voice_11_0
config: hi
split: test
args:
language: hi
metrics:
- name: Test WER
type: wer
value: 131
pipeline_tag: automatic-speech-recognition
license: apache-2.0
---
# Whisper
Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. Trained on 680k hours
of labelled data, Whisper models demonstrate a strong ability to generalise to many datasets and domains **without** the need
for fine-tuning.
Whisper was proposed in the paper [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356)
by Alec Radford et al. from OpenAI. The original code repository can be found [here](https://github.com/openai/whisper).
**Disclaimer**: Content for this model card has partly been written by the Hugging Face team, and parts of it were
copied and pasted from the original model card.
## Model details
Whisper is a Transformer based encoder-decoder model, also referred to as a _sequence-to-sequence_ model.
It was trained on 680k hours of labelled speech data annotated using large-scale weak supervision.
The models were trained on either English-only data or multilingual data. The English-only models were trained
on the task of speech recognition. The multilingual models were trained on both speech recognition and speech
translation. For speech recognition, the model predicts transcriptions in the *same* language as the audio.
For speech translation, the model predicts transcriptions in a *different* language from the audio.
Whisper checkpoints come in five configurations of varying model sizes.
The smallest four are trained on either English-only or multilingual data.
The largest checkpoints are multilingual only. All ten of the pre-trained checkpoints
are available on the [Hugging Face Hub](https://huggingface.co/models?search=openai/whisper). The
checkpoints are summarised in the following table with links to the models on the Hub:
| Size | Parameters | English-only | Multilingual |
|----------|------------|------------------------------------------------------|-----------------------------------------------------|
| tiny | 39 M | [✓](https://huggingface.co/openai/whisper-tiny.en) | [✓](https://huggingface.co/openai/whisper-tiny) |
| base | 74 M | [✓](https://huggingface.co/openai/whisper-base.en) | [✓](https://huggingface.co/openai/whisper-base) |
| small | 244 M | [✓](https://huggingface.co/openai/whisper-small.en) | [✓](https://huggingface.co/openai/whisper-small) |
| medium | 769 M | [✓](https://huggingface.co/openai/whisper-medium.en) | [✓](https://huggingface.co/openai/whisper-medium) |
| large | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large) |
| large-v2 | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large-v2) |
# Usage
To transcribe audio samples, the model has to be used alongside a [`WhisperProcessor`](https://huggingface.co/docs/transformers/model_doc/whisper#transformers.WhisperProcessor).
The `WhisperProcessor` is used to:
1. Pre-process the audio inputs (converting them to log-Mel spectrograms for the model)
2. Post-process the model outputs (converting them from tokens to text)
The model is informed of which task to perform (transcription or translation) by passing the appropriate "context tokens". These context tokens
are a sequence of tokens that are given to the decoder at the start of the decoding process, and take the following order:
1. The transcription always starts with the `<|startoftranscript|>` token
2. The second token is the language token (e.g. `<|en|>` for English)
3. The third token is the "task token". It can take one of two values: `<|transcribe|>` for speech recognition or `<|translate|>` for speech translation
4. In addition, a `<|notimestamps|>` token is added if the model should not include timestamp prediction
Thus, a typical sequence of context tokens might look as follows:
```
<|startoftranscript|> <|en|> <|transcribe|> <|notimestamps|>
```
This tells the model to decode in English, to perform speech recognition, and not to predict timestamps.
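As an illustrative sketch (the helper function and its name are ours, not part of the Transformers API), the ordering rules above can be written out directly:

```python
def build_context_tokens(language: str, task: str, timestamps: bool) -> list:
    """Compose Whisper decoder context tokens in the order described above."""
    tokens = ["<|startoftranscript|>", f"<|{language}|>", f"<|{task}|>"]
    if not timestamps:
        # <|notimestamps|> is only added when timestamp prediction is disabled
        tokens.append("<|notimestamps|>")
    return tokens

print(" ".join(build_context_tokens("en", "transcribe", timestamps=False)))
# <|startoftranscript|> <|en|> <|transcribe|> <|notimestamps|>
```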
These tokens can either be forced or un-forced. If they are forced, the model is made to predict each token at
each position. This allows one to control the output language and task for the Whisper model. If they are un-forced,
the Whisper model will automatically predict the output language and task itself.
The context tokens can be set accordingly:
```python
model.config.forced_decoder_ids = processor.get_decoder_prompt_ids(language="english", task="transcribe")
```
This forces the model to predict in English under the task of speech recognition.
## Transcription
### English to English
In this example, the context tokens are 'unforced', meaning the model automatically predicts the output language
(English) and task (transcribe).
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import load_dataset
>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-base")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-base")
>>> model.config.forced_decoder_ids = None
>>> # load dummy dataset and read audio files
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> input_features = processor(sample["array"], sampling_rate=sample["sampling_rate"], return_tensors="pt").input_features
>>> # generate token ids
>>> predicted_ids = model.generate(input_features)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=False)
['<|startoftranscript|><|en|><|transcribe|><|notimestamps|> Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.<|endoftext|>']
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.']
```
The context tokens can be removed from the start of the transcription by setting `skip_special_tokens=True`.
### French to French
The following example demonstrates French to French transcription by setting the decoder ids appropriately.
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import Audio, load_dataset
>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-base")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-base")
>>> forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="transcribe")
>>> # load streaming dataset and read first audio sample
>>> ds = load_dataset("common_voice", "fr", split="test", streaming=True)
>>> ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
>>> input_speech = next(iter(ds))["audio"]
>>> input_features = processor(input_speech["array"], sampling_rate=input_speech["sampling_rate"], return_tensors="pt").input_features
>>> # generate token ids
>>> predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids)
['<|startoftranscript|><|fr|><|transcribe|><|notimestamps|> Un vrai travail intéressant va enfin être mené sur ce sujet.<|endoftext|>']
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Un vrai travail intéressant va enfin être mené sur ce sujet.']
```
## Translation
Setting the task to "translate" forces the Whisper model to perform speech translation.
### French to English
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import Audio, load_dataset
>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-base")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-base")
>>> forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="translate")
>>> # load streaming dataset and read first audio sample
>>> ds = load_dataset("common_voice", "fr", split="test", streaming=True)
>>> ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
>>> input_speech = next(iter(ds))["audio"]
>>> input_features = processor(input_speech["array"], sampling_rate=input_speech["sampling_rate"], return_tensors="pt").input_features
>>> # generate token ids
>>> predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' A very interesting work, we will finally be given on this subject.']
```
## Evaluation
This code snippet shows how to evaluate Whisper Base on [LibriSpeech test-clean](https://huggingface.co/datasets/librispeech_asr):
```python
>>> from datasets import load_dataset
>>> from transformers import WhisperForConditionalGeneration, WhisperProcessor
>>> import torch
>>> from evaluate import load
>>> librispeech_test_clean = load_dataset("librispeech_asr", "clean", split="test")
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-base")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-base").to("cuda")
>>> def map_to_pred(batch):
>>> audio = batch["audio"]
>>> input_features = processor(audio["array"], sampling_rate=audio["sampling_rate"], return_tensors="pt").input_features
>>> batch["reference"] = processor.tokenizer._normalize(batch['text'])
>>>
>>> with torch.no_grad():
>>> predicted_ids = model.generate(input_features.to("cuda"))[0]
>>> transcription = processor.decode(predicted_ids)
>>> batch["prediction"] = processor.tokenizer._normalize(transcription)
>>> return batch
>>> result = librispeech_test_clean.map(map_to_pred)
>>> wer = load("wer")
>>> print(100 * wer.compute(references=result["reference"], predictions=result["prediction"]))
5.082316555716899
```
## Long-Form Transcription
The Whisper model is intrinsically designed to work on audio samples of up to 30s in duration. However, by using a chunking
algorithm, it can be used to transcribe audio samples of arbitrary length. This is possible with the Transformers
[`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline)
class. Chunking is enabled by setting `chunk_length_s=30` when instantiating the pipeline. With chunking enabled, the pipeline
can be run with batched inference. It can also be extended to predict sequence-level timestamps by passing `return_timestamps=True`:
```python
>>> import torch
>>> from transformers import pipeline
>>> from datasets import load_dataset
>>> device = "cuda:0" if torch.cuda.is_available() else "cpu"
>>> pipe = pipeline(
>>> "automatic-speech-recognition",
>>> model="openai/whisper-base",
>>> chunk_length_s=30,
>>> device=device,
>>> )
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> prediction = pipe(sample.copy(), batch_size=8)["text"]
" Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel."
>>> # we can also return timestamps for the predictions
>>> prediction = pipe(sample.copy(), batch_size=8, return_timestamps=True)["chunks"]
[{'text': ' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.',
'timestamp': (0.0, 5.44)}]
```
Refer to the blog post [ASR Chunking](https://huggingface.co/blog/asr-chunking) for more details on the chunking algorithm.
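As a rough, hedged sketch of the idea behind chunking (this is not the pipeline's actual implementation; the window and stride values here are illustrative), audio is covered with fixed-length windows that overlap so predictions near chunk boundaries can be reconciled:

```python
def chunk_windows(total_s: float, chunk_s: float = 30.0, stride_s: float = 5.0):
    """Return (start, end) windows in seconds covering the full audio.

    Consecutive windows overlap by stride_s so that text decoded near a
    boundary appears in two windows and can be merged downstream.
    """
    step = chunk_s - stride_s
    windows = []
    start = 0.0
    while start < total_s:
        windows.append((start, min(start + chunk_s, total_s)))
        start += step
    return windows

print(chunk_windows(60.0))
# [(0.0, 30.0), (25.0, 55.0), (50.0, 60.0)]
```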
## Fine-Tuning
The pre-trained Whisper model demonstrates a strong ability to generalise to different datasets and domains. However,
its predictive capabilities can be improved further for certain languages and tasks through *fine-tuning*. The blog
post [Fine-Tune Whisper with 🤗 Transformers](https://huggingface.co/blog/fine-tune-whisper) provides a step-by-step
guide to fine-tuning the Whisper model with as little as 5 hours of labelled data.
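One detail from that guide worth illustrating: during fine-tuning, variable-length label token ids are typically padded with `-100`, a value that cross-entropy losses conventionally ignore, so padded positions do not contribute to the loss. A minimal, framework-free sketch (the function name and signature are ours, not from the blog post):

```python
def pad_label_ids(label_batches, ignore_index=-100):
    """Pad variable-length label id lists to a common length with ignore_index.

    Positions holding ignore_index are skipped by the loss during training.
    """
    max_len = max(len(labels) for labels in label_batches)
    return [
        labels + [ignore_index] * (max_len - len(labels))
        for labels in label_batches
    ]

print(pad_label_ids([[1, 2, 3], [4]]))
# [[1, 2, 3], [4, -100, -100]]
```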
### Evaluated Use
The primary intended users of these models are AI researchers studying robustness, generalization, capabilities, biases, and constraints of the current model. However, Whisper is also potentially quite useful as an ASR solution for developers, especially for English speech recognition. We recognize that once models are released, it is impossible to restrict access to only “intended” uses or to draw reasonable guidelines around what is or is not research.
The models are primarily trained and evaluated on ASR and speech translation to English tasks. They show strong ASR results in ~10 languages. They may exhibit additional capabilities, particularly if fine-tuned on certain tasks like voice activity detection, speaker classification, or speaker diarization but have not been robustly evaluated in these areas. We strongly recommend that users perform robust evaluations of the models in a particular context and domain before deploying them.
In particular, we caution against using Whisper models to transcribe recordings of individuals taken without their consent, or purporting to use these models for any kind of subjective classification. We recommend against use in high-risk domains like decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes. The models are intended to transcribe and translate speech; using them for classification is not only unevaluated but also inappropriate, particularly for inferring human attributes.
## Training Data
The models are trained on 680,000 hours of audio and the corresponding transcripts collected from the internet. 65% of this data (or 438,000 hours) represents English-language audio and matched English transcripts, roughly 18% (or 126,000 hours) represents non-English audio and English transcripts, while the final 17% (or 117,000 hours) represents non-English audio and the corresponding transcript. This non-English data represents 98 different languages.
As discussed in [the accompanying paper](https://cdn.openai.com/papers/whisper.pdf), we see that performance on transcription in a given language is directly correlated with the amount of training data we employ in that language.
## Performance and Limitations
Our studies show that, over many existing ASR systems, the models exhibit improved robustness to accents, background noise, technical language, as well as zero shot translation from multiple languages into English; and that accuracy on speech recognition and translation is near the state-of-the-art level.
However, because the models are trained in a weakly supervised manner using large-scale noisy data, the predictions may include texts that are not actually spoken in the audio input (i.e. hallucination). We hypothesize that this happens because, given their general knowledge of language, the models combine trying to predict the next word in audio with trying to transcribe the audio itself.
Our models perform unevenly across languages, and we observe lower accuracy on low-resource and/or low-discoverability languages or languages where we have less training data. The models also exhibit disparate performance on different accents and dialects of particular languages, which may include higher word error rate across speakers of different genders, races, ages, or other demographic criteria. Our full evaluation results are presented in [the paper accompanying this release](https://cdn.openai.com/papers/whisper.pdf).
In addition, the sequence-to-sequence architecture of the model makes it prone to generating repetitive texts, which can be mitigated to some degree by beam search and temperature scheduling but not perfectly. Further analysis of these limitations is provided in [the paper](https://cdn.openai.com/papers/whisper.pdf). It is likely that this behavior and these hallucinations may be worse in lower-resource and/or lower-discoverability languages.
## Broader Implications
We anticipate that Whisper models’ transcription capabilities may be used for improving accessibility tools. While Whisper models cannot be used for real-time transcription out of the box, their speed and size suggest that others may be able to build applications on top of them that allow for near-real-time speech recognition and translation. The real value of beneficial applications built on top of Whisper models suggests that the disparate performance of these models may have real economic implications.
There are also potential dual use concerns that come with releasing Whisper. While we hope the technology will be used primarily for beneficial purposes, making ASR technology more accessible could enable more actors to build capable surveillance technologies or scale up existing surveillance efforts, as the speed and accuracy allow for affordable automatic transcription and translation of large volumes of audio communication. Moreover, these models may have some capabilities to recognize specific individuals out of the box, which in turn presents safety concerns related both to dual use and disparate performance. In practice, we expect that the cost of transcription is not the limiting factor of scaling up surveillance projects.
### BibTeX entry and citation info
```bibtex
@misc{radford2022whisper,
doi = {10.48550/ARXIV.2212.04356},
url = {https://arxiv.org/abs/2212.04356},
author = {Radford, Alec and Kim, Jong Wook and Xu, Tao and Brockman, Greg and McLeavey, Christine and Sutskever, Ilya},
title = {Robust Speech Recognition via Large-Scale Weak Supervision},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
| 19,774 | [
[
-0.01335906982421875,
-0.04254150390625,
0.007419586181640625,
0.035186767578125,
-0.009002685546875,
-0.0091552734375,
-0.02349853515625,
-0.039764404296875,
0.01540374755859375,
0.034820556640625,
-0.061553955078125,
-0.043914794921875,
-0.057708740234375,
-0.0009293556213378906,
-0.04132080078125,
0.0758056640625,
0.006717681884765625,
-0.0010957717895507812,
0.01255035400390625,
-0.008270263671875,
-0.03240966796875,
-0.034576416015625,
-0.051513671875,
-0.0156707763671875,
0.00934600830078125,
0.01226043701171875,
0.0295257568359375,
0.037139892578125,
0.00762176513671875,
0.030853271484375,
-0.0310516357421875,
-0.001941680908203125,
-0.031280517578125,
-0.004383087158203125,
0.0264129638671875,
-0.039154052734375,
-0.044097900390625,
0.00673675537109375,
0.054046630859375,
0.03887939453125,
-0.0195770263671875,
0.0289306640625,
0.0174713134765625,
0.0247802734375,
-0.0193939208984375,
0.0223846435546875,
-0.04656982421875,
-0.010528564453125,
-0.0202789306640625,
-0.003772735595703125,
-0.023834228515625,
-0.0198974609375,
0.039794921875,
-0.042449951171875,
0.02117919921875,
0.00103759765625,
0.07647705078125,
0.016204833984375,
-0.01236724853515625,
-0.028350830078125,
-0.053314208984375,
0.07745361328125,
-0.0653076171875,
0.036041259765625,
0.036956787109375,
0.012969970703125,
-0.005336761474609375,
-0.065185546875,
-0.05181884765625,
-0.0011539459228515625,
-0.007595062255859375,
0.0224151611328125,
-0.03509521484375,
0.0003685951232910156,
0.01708984375,
0.0201873779296875,
-0.0364990234375,
-0.0009579658508300781,
-0.048065185546875,
-0.05279541015625,
0.042694091796875,
-0.0007557868957519531,
0.0263519287109375,
-0.01983642578125,
-0.0157623291015625,
-0.023773193359375,
-0.0230560302734375,
0.03448486328125,
0.027587890625,
0.035308837890625,
-0.048583984375,
0.0298614501953125,
-0.010223388671875,
0.048492431640625,
0.012054443359375,
-0.050201416015625,
0.049774169921875,
-0.0147857666015625,
-0.01319122314453125,
0.0272674560546875,
0.07464599609375,
0.0211029052734375,
0.01207733154296875,
0.002948760986328125,
-0.0174713134765625,
0.00872802734375,
-0.006816864013671875,
-0.053375244140625,
0.006160736083984375,
0.037139892578125,
-0.04107666015625,
-0.0251007080078125,
-0.0176544189453125,
-0.037139892578125,
0.018768310546875,
-0.01763916015625,
0.052764892578125,
-0.042388916015625,
-0.0281982421875,
0.0122528076171875,
-0.02947998046875,
0.0186767578125,
0.001651763916015625,
-0.061676025390625,
0.027587890625,
0.030487060546875,
0.0672607421875,
0.0051422119140625,
-0.04986572265625,
-0.0419921875,
0.006427764892578125,
0.0037860870361328125,
0.034027099609375,
-0.019073486328125,
-0.0435791015625,
-0.0061187744140625,
0.012664794921875,
-0.0308074951171875,
-0.037811279296875,
0.052032470703125,
-0.009063720703125,
0.035308837890625,
-0.004543304443359375,
-0.03607177734375,
-0.0204315185546875,
-0.01238250732421875,
-0.03350830078125,
0.0709228515625,
0.01201629638671875,
-0.052978515625,
0.00897216796875,
-0.04046630859375,
-0.0389404296875,
-0.012969970703125,
0.0185394287109375,
-0.03228759765625,
-0.00201416015625,
0.033905029296875,
0.034942626953125,
-0.00876617431640625,
0.0090179443359375,
0.00839996337890625,
-0.030548095703125,
0.027130126953125,
-0.034576416015625,
0.07586669921875,
0.01207733154296875,
-0.0294952392578125,
0.014251708984375,
-0.05816650390625,
0.00547027587890625,
0.004619598388671875,
-0.019866943359375,
0.007282257080078125,
-0.00008106231689453125,
0.0204010009765625,
0.00705718994140625,
0.01448822021484375,
-0.052093505859375,
-0.005886077880859375,
-0.050537109375,
0.065185546875,
0.042388916015625,
-0.0030879974365234375,
0.0264129638671875,
-0.040863037109375,
0.0231781005859375,
0.011077880859375,
0.0268096923828125,
-0.0189666748046875,
-0.0472412109375,
-0.061981201171875,
-0.02813720703125,
0.03271484375,
0.06048583984375,
-0.032379150390625,
0.04345703125,
-0.0216827392578125,
-0.04412841796875,
-0.0943603515625,
-0.00634765625,
0.042236328125,
0.048309326171875,
0.049560546875,
-0.0041656494140625,
-0.049530029296875,
-0.05859375,
-0.0112457275390625,
-0.0224609375,
-0.01216888427734375,
0.02581787109375,
0.02685546875,
-0.029296875,
0.05230712890625,
-0.03033447265625,
-0.040557861328125,
-0.025360107421875,
0.003810882568359375,
0.0343017578125,
0.047027587890625,
0.0252685546875,
-0.056884765625,
-0.0281829833984375,
-0.01470184326171875,
-0.042266845703125,
-0.01275634765625,
-0.009063720703125,
-0.0013942718505859375,
0.0154266357421875,
0.0330810546875,
-0.05218505859375,
0.035888671875,
0.051300048828125,
-0.01424407958984375,
0.047943115234375,
0.005401611328125,
-0.0019588470458984375,
-0.0892333984375,
0.0017604827880859375,
-0.0174102783203125,
-0.012298583984375,
-0.0535888671875,
-0.0174102783203125,
-0.006191253662109375,
-0.007022857666015625,
-0.04327392578125,
0.0457763671875,
-0.024932861328125,
0.005184173583984375,
-0.00435638427734375,
0.009674072265625,
-0.0031890869140625,
0.049407958984375,
0.019195556640625,
0.0516357421875,
0.06292724609375,
-0.04345703125,
0.0167999267578125,
0.043670654296875,
-0.019622802734375,
0.0220489501953125,
-0.0718994140625,
0.00872802734375,
0.006046295166015625,
0.01218414306640625,
-0.0667724609375,
-0.007640838623046875,
0.00634002685546875,
-0.07147216796875,
0.0311431884765625,
-0.0254669189453125,
-0.0220794677734375,
-0.03887939453125,
-0.00695037841796875,
0.005512237548828125,
0.0643310546875,
-0.036468505859375,
0.053741455078125,
0.031646728515625,
-0.0168914794921875,
-0.0435791015625,
-0.052520751953125,
-0.006000518798828125,
-0.0098724365234375,
-0.056884765625,
0.03802490234375,
-0.0014982223510742188,
0.005168914794921875,
-0.007205963134765625,
-0.004772186279296875,
0.008636474609375,
-0.01480865478515625,
0.035400390625,
0.030487060546875,
-0.00537872314453125,
-0.0204010009765625,
0.018402099609375,
-0.018707275390625,
0.00020635128021240234,
-0.02117919921875,
0.04986572265625,
-0.0174102783203125,
-0.001800537109375,
-0.058013916015625,
0.0280914306640625,
0.04705810546875,
-0.0271759033203125,
0.049163818359375,
0.055938720703125,
-0.02020263671875,
-0.01236724853515625,
-0.0472412109375,
-0.01219940185546875,
-0.039794921875,
0.0164794921875,
-0.03680419921875,
-0.059234619140625,
0.059234619140625,
0.0170745849609375,
0.01299285888671875,
0.048553466796875,
0.038787841796875,
-0.0099334716796875,
0.0789794921875,
0.03692626953125,
-0.0208587646484375,
0.018951416015625,
-0.051025390625,
-0.006282806396484375,
-0.0755615234375,
-0.031951904296875,
-0.041107177734375,
-0.01335906982421875,
-0.032928466796875,
-0.020111083984375,
0.03424072265625,
0.01416778564453125,
-0.00033473968505859375,
0.03912353515625,
-0.05242919921875,
0.00021016597747802734,
0.050140380859375,
0.0007152557373046875,
0.00439453125,
-0.0018510818481445312,
-0.02093505859375,
0.0002779960632324219,
-0.036407470703125,
-0.0288238525390625,
0.07342529296875,
0.0345458984375,
0.034576416015625,
-0.0027313232421875,
0.053741455078125,
-0.002040863037109375,
0.000606536865234375,
-0.059051513671875,
0.037689208984375,
-0.0094757080078125,
-0.038177490234375,
-0.0298004150390625,
-0.01953125,
-0.063232421875,
0.0128021240234375,
-0.0118560791015625,
-0.054412841796875,
0.009124755859375,
-0.001613616943359375,
-0.0204010009765625,
0.014678955078125,
-0.053131103515625,
0.046661376953125,
0.0117950439453125,
0.01152801513671875,
0.001354217529296875,
-0.054931640625,
0.0110321044921875,
0.007205963134765625,
0.00887298583984375,
-0.005153656005859375,
0.01148223876953125,
0.07647705078125,
-0.03753662109375,
0.071044921875,
-0.0227508544921875,
0.0017194747924804688,
0.0335693359375,
-0.007904052734375,
0.0265045166015625,
-0.016021728515625,
-0.0081024169921875,
0.037139892578125,
0.027008056640625,
-0.0220947265625,
-0.0204010009765625,
0.03961181640625,
-0.08056640625,
-0.02874755859375,
-0.0187835693359375,
-0.0243988037109375,
-0.007198333740234375,
0.0186767578125,
0.06793212890625,
0.056488037109375,
-0.01129913330078125,
-0.0029315948486328125,
0.031463623046875,
-0.0187530517578125,
0.042449951171875,
0.04791259765625,
-0.015960693359375,
-0.037567138671875,
0.068115234375,
0.0214385986328125,
0.017059326171875,
0.021026611328125,
0.0255126953125,
-0.034637451171875,
-0.050201416015625,
-0.041534423828125,
0.0231170654296875,
-0.038818359375,
-0.01136016845703125,
-0.06884765625,
-0.042938232421875,
-0.052764892578125,
0.002590179443359375,
-0.0272369384765625,
-0.0211334228515625,
-0.035888671875,
0.00778961181640625,
0.041595458984375,
0.031402587890625,
0.0012750625610351562,
0.041717529296875,
-0.07452392578125,
0.031494140625,
0.02349853515625,
0.00748443603515625,
0.0020542144775390625,
-0.07598876953125,
-0.00501251220703125,
0.0162353515625,
-0.01525115966796875,
-0.055694580078125,
0.0413818359375,
0.0265350341796875,
0.042633056640625,
0.0199432373046875,
0.0003612041473388672,
0.06072998046875,
-0.0552978515625,
0.064208984375,
0.01165008544921875,
-0.09478759765625,
0.05517578125,
-0.025909423828125,
0.02581787109375,
0.0288543701171875,
0.027252197265625,
-0.05426025390625,
-0.036529541015625,
-0.0467529296875,
-0.048309326171875,
0.061676025390625,
0.0277099609375,
0.0133056640625,
0.00732421875,
0.0223236083984375,
0.005802154541015625,
0.009796142578125,
-0.036651611328125,
-0.032470703125,
-0.03643798828125,
-0.01953125,
-0.01262664794921875,
-0.010650634765625,
-0.0026340484619140625,
-0.0389404296875,
0.0565185546875,
-0.0016727447509765625,
0.0418701171875,
0.033599853515625,
-0.00449371337890625,
-0.001636505126953125,
0.00673675537109375,
0.04345703125,
0.02197265625,
-0.014312744140625,
-0.0273895263671875,
0.0227203369140625,
-0.0594482421875,
-0.00015544891357421875,
0.0193634033203125,
-0.0224151611328125,
0.0128173828125,
0.06005859375,
0.09185791015625,
0.015960693359375,
-0.03643798828125,
0.0537109375,
-0.00850677490234375,
-0.031158447265625,
-0.041290283203125,
0.00341796875,
0.022125244140625,
0.0149078369140625,
0.0265045166015625,
0.00978851318359375,
0.005889892578125,
-0.035614013671875,
0.004833221435546875,
0.020233154296875,
-0.03350830078125,
-0.039794921875,
0.0611572265625,
0.01197052001953125,
-0.037384033203125,
0.053802490234375,
0.0081939697265625,
-0.058319091796875,
0.035614013671875,
0.052520751953125,
0.07635498046875,
-0.038055419921875,
0.002803802490234375,
0.0333251953125,
0.017578125,
-0.005184173583984375,
0.039306640625,
-0.00907135009765625,
-0.057952880859375,
-0.03436279296875,
-0.07501220703125,
-0.01788330078125,
0.01226806640625,
-0.06964111328125,
0.022552490234375,
-0.017578125,
-0.0216217041015625,
0.02215576171875,
-0.0003657341003417969,
-0.057342529296875,
0.0095977783203125,
0.005481719970703125,
0.07861328125,
-0.05572509765625,
0.07635498046875,
0.0184173583984375,
-0.0196533203125,
-0.0816650390625,
0.002178192138671875,
0.0016307830810546875,
-0.078369140625,
0.0313720703125,
0.025360107421875,
-0.0159149169921875,
0.01398468017578125,
-0.04144287109375,
-0.06329345703125,
0.07318115234375,
0.00955963134765625,
-0.052978515625,
-0.007568359375,
-0.0030040740966796875,
0.038909912109375,
-0.0229339599609375,
0.010650634765625,
0.05426025390625,
0.03179931640625,
0.006740570068359375,
-0.10406494140625,
-0.006755828857421875,
-0.0211029052734375,
-0.01288604736328125,
0.0009551048278808594,
-0.052764892578125,
0.06280517578125,
-0.024322509765625,
-0.0191497802734375,
0.0196685791015625,
0.050079345703125,
0.0164794921875,
0.016143798828125,
0.046142578125,
0.037261962890625,
0.05206298828125,
-0.01248931884765625,
0.0748291015625,
-0.02032470703125,
0.009979248046875,
0.0653076171875,
-0.004207611083984375,
0.08441162109375,
0.02056884765625,
-0.02813720703125,
0.0435791015625,
0.0284576416015625,
-0.00008231401443481445,
0.040618896484375,
-0.00687408447265625,
-0.0222930908203125,
0.007579803466796875,
-0.003421783447265625,
-0.0318603515625,
0.058563232421875,
0.03076171875,
-0.019805908203125,
0.023956298828125,
0.0235443115234375,
0.00839996337890625,
-0.01024627685546875,
-0.0189056396484375,
0.0718994140625,
0.00957489013671875,
-0.045318603515625,
0.06634521484375,
0.0018901824951171875,
0.0728759765625,
-0.06292724609375,
0.0157623291015625,
0.0037689208984375,
0.01116943359375,
-0.013397216796875,
-0.04833984375,
0.0249481201171875,
-0.0109405517578125,
-0.0240478515625,
-0.0140380859375,
0.0411376953125,
-0.055267333984375,
-0.03924560546875,
0.04278564453125,
0.02642822265625,
0.0245208740234375,
-0.0094451904296875,
-0.06634521484375,
0.029937744140625,
0.0173187255859375,
-0.01800537109375,
0.01242828369140625,
0.01335906982421875,
0.0171356201171875,
0.048187255859375,
0.06488037109375,
0.0302734375,
0.01113128662109375,
0.0139617919921875,
0.060302734375,
-0.0477294921875,
-0.0506591796875,
-0.050933837890625,
0.036346435546875,
0.004413604736328125,
-0.034423828125,
0.0589599609375,
0.03668212890625,
0.05108642578125,
-0.0004298686981201172,
0.058013916015625,
0.005237579345703125,
0.07037353515625,
-0.04168701171875,
0.06292724609375,
-0.0309600830078125,
0.0007901191711425781,
-0.0248565673828125,
-0.0557861328125,
0.004604339599609375,
0.043121337890625,
-0.004261016845703125,
-0.0090179443359375,
0.0277557373046875,
0.06695556640625,
0.006008148193359375,
0.01255035400390625,
0.01003265380859375,
0.030120849609375,
0.0167236328125,
0.0408935546875,
0.0430908203125,
-0.0574951171875,
0.048309326171875,
-0.036651611328125,
-0.0186309814453125,
0.00457763671875,
-0.04443359375,
-0.07391357421875,
-0.0654296875,
-0.0202789306640625,
-0.042144775390625,
-0.0172576904296875,
0.0584716796875,
0.06683349609375,
-0.06304931640625,
-0.0252838134765625,
0.0208282470703125,
-0.0034637451171875,
-0.0305633544921875,
-0.018646240234375,
0.0423583984375,
-0.0026378631591796875,
-0.06707763671875,
0.047088623046875,
0.00258636474609375,
0.0287933349609375,
-0.013946533203125,
-0.0167236328125,
0.0020999908447265625,
0.0079345703125,
0.041015625,
0.0213165283203125,
-0.06500244140625,
-0.0091705322265625,
0.00882720947265625,
0.004795074462890625,
-0.002010345458984375,
0.031402587890625,
-0.05401611328125,
0.0268402099609375,
0.02801513671875,
0.0089263916015625,
0.060577392578125,
-0.021728515625,
0.0291595458984375,
-0.05584716796875,
0.03399658203125,
0.014801025390625,
0.024169921875,
0.0262451171875,
-0.02215576171875,
0.01087188720703125,
0.021881103515625,
-0.041046142578125,
-0.0767822265625,
-0.00943756103515625,
-0.08355712890625,
-0.012298583984375,
0.07391357421875,
0.0010843276977539062,
-0.0257720947265625,
-0.00876617431640625,
-0.0253143310546875,
0.033111572265625,
-0.03564453125,
0.023529052734375,
0.0430908203125,
0.004337310791015625,
-0.0025081634521484375,
-0.043853759765625,
0.0545654296875,
0.016357421875,
-0.0173187255859375,
-0.0018062591552734375,
0.0024566650390625,
0.045684814453125,
0.02020263671875,
0.06378173828125,
-0.0170745849609375,
0.01293182373046875,
0.00910186767578125,
0.012939453125,
-0.008514404296875,
-0.01442718505859375,
-0.033721923828125,
-0.00458526611328125,
-0.0255889892578125,
-0.032440185546875
]
] |
michellejieli/NSFW_text_classifier | 2022-12-10T19:59:37.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"distilroberta",
"sentiment",
"NSFW",
"inappropriate",
"spam",
"twitter",
"reddit",
"en",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | michellejieli | null | null | michellejieli/NSFW_text_classifier | 35 | 88,585 | transformers | 2022-12-10T01:42:56 | ---
language: "en"
tags:
- distilroberta
- sentiment
- NSFW
- inappropriate
- spam
- twitter
- reddit
widget:
- text: "I like you. You remind me of me when I was young and stupid."
- text: "I see you’ve set aside this special time to humiliate yourself in public."
- text: "Have a great weekend! See you next week!"
---
# Fine-tuned DistilRoBERTa-base for NSFW Classification
# Model Description
DistilBERT is a transformer model that performs sentiment analysis. I fine-tuned the model on Reddit posts with the purpose of classifying not safe for work (NSFW) content, specifically text that is considered inappropriate and unprofessional. The model predicts two classes: NSFW and safe for work (SFW).
The model is a fine-tuned version of [DistilBERT](https://huggingface.co/docs/transformers/model_doc/distilbert).
It was fine-tuned on 14,317 Reddit posts pulled from the [Reddit API](https://praw.readthedocs.io/en/stable/).
# How to Use
```python
from transformers import pipeline
classifier = pipeline("sentiment-analysis", model="michellejieli/NSFW_text_classifier")
classifier("I see you’ve set aside this special time to humiliate yourself in public.")
```
```python
Output:
[{'label': 'NSFW', 'score': 0.998853325843811}]
```
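To act on the pipeline output above, a small post-processing helper can turn the label/score pair into a moderation decision. This helper is an assumption for illustration (it is not part of the model card); the 0.9 threshold is arbitrary and trades recall for precision on the NSFW class:

```python
def is_nsfw(prediction: dict, threshold: float = 0.9) -> bool:
    """Decide whether one pipeline output item should be flagged.

    `prediction` is one element of the pipeline's output list, e.g.
    {'label': 'NSFW', 'score': 0.998853325843811}.
    """
    return prediction["label"] == "NSFW" and prediction["score"] >= threshold
```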
# Contact
Please reach out to [michelle.li851@duke.edu](mailto:michelle.li851@duke.edu) if you have any questions or feedback.
--- | 1,392 | [
[
-0.0330810546875,
-0.061737060546875,
0.01273345947265625,
0.035797119140625,
-0.034515380859375,
-0.0027599334716796875,
0.00518798828125,
-0.0220947265625,
0.0002810955047607422,
0.00408172607421875,
-0.038818359375,
-0.04156494140625,
-0.06793212890625,
0.0275726318359375,
-0.055877685546875,
0.1055908203125,
0.02325439453125,
-0.01483917236328125,
-0.026641845703125,
0.00667572021484375,
-0.01441192626953125,
-0.031158447265625,
-0.01131439208984375,
-0.0221405029296875,
0.0303955078125,
0.02197265625,
0.05609130859375,
0.026031494140625,
0.04949951171875,
0.02789306640625,
-0.02032470703125,
-0.0029735565185546875,
-0.054779052734375,
0.006053924560546875,
-0.031890869140625,
-0.004039764404296875,
-0.03179931640625,
0.01434326171875,
0.006816864013671875,
0.03619384765625,
-0.0196990966796875,
0.0411376953125,
0.01416015625,
0.04998779296875,
-0.048675537109375,
0.029205322265625,
-0.040496826171875,
0.01329803466796875,
-0.00690460205078125,
-0.00041866302490234375,
-0.0325927734375,
-0.033203125,
0.01021575927734375,
-0.0190277099609375,
0.021514892578125,
0.0036945343017578125,
0.06878662109375,
0.04254150390625,
-0.0295867919921875,
0.006439208984375,
-0.050079345703125,
0.0634765625,
-0.039031982421875,
0.018463134765625,
0.032867431640625,
0.01386260986328125,
0.0011224746704101562,
-0.07257080078125,
-0.038970947265625,
0.004058837890625,
-0.0310211181640625,
-0.001811981201171875,
-0.04833984375,
0.0025272369384765625,
0.036346435546875,
0.03485107421875,
-0.03143310546875,
-0.012359619140625,
-0.0411376953125,
0.0020503997802734375,
0.05743408203125,
0.017730712890625,
-0.0010471343994140625,
-0.0300750732421875,
-0.0296173095703125,
0.00600433349609375,
0.00676727294921875,
0.01265716552734375,
0.0252532958984375,
0.010101318359375,
-0.00510406494140625,
0.036376953125,
-0.0285797119140625,
0.048553466796875,
0.038238525390625,
-0.0160064697265625,
0.034912109375,
0.01438140869140625,
-0.0305023193359375,
-0.0016880035400390625,
0.07708740234375,
0.06146240234375,
0.028564453125,
0.005237579345703125,
-0.00890350341796875,
0.01467132568359375,
0.052520751953125,
-0.080322265625,
-0.046112060546875,
0.00849151611328125,
-0.038238525390625,
-0.07958984375,
0.027740478515625,
-0.050048828125,
-0.037139892578125,
-0.0215301513671875,
0.011627197265625,
-0.01189422607421875,
-0.035400390625,
0.0006351470947265625,
-0.0177459716796875,
-0.0213470458984375,
0.02484130859375,
-0.0653076171875,
0.006439208984375,
0.046966552734375,
0.048828125,
-0.000072479248046875,
-0.022125244140625,
-0.02056884765625,
-0.03131103515625,
-0.0037689208984375,
0.03521728515625,
-0.030670166015625,
-0.0243682861328125,
-0.004405975341796875,
0.0276947021484375,
0.01509857177734375,
-0.0287322998046875,
0.06829833984375,
-0.0328369140625,
0.014312744140625,
0.00907135009765625,
-0.009490966796875,
-0.0295562744140625,
0.0247955322265625,
-0.052764892578125,
0.08331298828125,
0.0333251953125,
-0.1094970703125,
0.0216827392578125,
-0.044036865234375,
-0.04608154296875,
-0.0053253173828125,
0.0297698974609375,
-0.04547119140625,
0.014739990234375,
-0.007610321044921875,
0.06048583984375,
-0.0012311935424804688,
0.0350341796875,
-0.03253173828125,
-0.032989501953125,
0.040069580078125,
-0.0357666015625,
0.0616455078125,
0.023773193359375,
-0.020721435546875,
0.01016998291015625,
-0.04644775390625,
-0.0022716522216796875,
0.00043845176696777344,
-0.01544952392578125,
-0.03582763671875,
-0.0003650188446044922,
0.0198516845703125,
0.01421356201171875,
0.0196533203125,
-0.04052734375,
-0.00778961181640625,
-0.01251983642578125,
0.048126220703125,
0.0535888671875,
-0.01507568359375,
0.03271484375,
-0.0224151611328125,
0.02716064453125,
0.0009016990661621094,
0.036529541015625,
0.03173828125,
-0.0266876220703125,
-0.04193115234375,
-0.0180816650390625,
-0.005615234375,
0.0235443115234375,
-0.044708251953125,
0.037139892578125,
-0.00778961181640625,
-0.0606689453125,
-0.032623291015625,
-0.01517486572265625,
0.0243377685546875,
0.059112548828125,
0.033966064453125,
0.0025920867919921875,
-0.051422119140625,
-0.056121826171875,
-0.0104827880859375,
-0.0181732177734375,
0.00974273681640625,
-0.0166015625,
0.0316162109375,
-0.038177490234375,
0.06951904296875,
-0.050506591796875,
-0.0218658447265625,
0.00415802001953125,
0.02484130859375,
0.039886474609375,
0.025299072265625,
0.05206298828125,
-0.084716796875,
-0.0533447265625,
-0.0196990966796875,
-0.06158447265625,
-0.0125732421875,
0.029449462890625,
-0.0128936767578125,
-0.006988525390625,
0.0164947509765625,
-0.0264434814453125,
0.0379638671875,
0.040771484375,
-0.03082275390625,
0.054229736328125,
0.0114898681640625,
0.00970458984375,
-0.0736083984375,
0.004993438720703125,
0.029083251953125,
-0.034423828125,
-0.05108642578125,
0.0035839080810546875,
-0.0028209686279296875,
-0.0001614093780517578,
-0.06298828125,
0.0078277587890625,
-0.01194000244140625,
0.037139892578125,
-0.03472900390625,
-0.035552978515625,
0.01349639892578125,
0.023681640625,
0.00955963134765625,
0.029632568359375,
0.044158935546875,
-0.04693603515625,
0.04119873046875,
0.0322265625,
-0.0028629302978515625,
0.051361083984375,
-0.051483154296875,
0.0182342529296875,
-0.0156402587890625,
0.00394439697265625,
-0.05828857421875,
-0.0192718505859375,
0.0204925537109375,
-0.03411865234375,
0.0171661376953125,
-0.0177764892578125,
-0.024566650390625,
-0.058349609375,
-0.04119873046875,
0.0158233642578125,
0.03692626953125,
-0.053192138671875,
0.029815673828125,
0.046875,
0.002895355224609375,
-0.068359375,
-0.05810546875,
-0.0284881591796875,
-0.034515380859375,
-0.042388916015625,
0.018890380859375,
0.0021419525146484375,
-0.029449462890625,
-0.004978179931640625,
-0.023040771484375,
-0.03558349609375,
0.0185699462890625,
0.048248291015625,
0.0274810791015625,
0.01413726806640625,
-0.0177764892578125,
0.0195465087890625,
-0.0158233642578125,
0.03558349609375,
0.016693115234375,
0.0308685302734375,
-0.0362548828125,
0.0005464553833007812,
-0.039398193359375,
0.0019969940185546875,
0.057281494140625,
0.00984954833984375,
0.041534423828125,
0.0667724609375,
-0.03350830078125,
-0.0249786376953125,
-0.0159759521484375,
-0.0208587646484375,
-0.04278564453125,
0.033233642578125,
-0.005222320556640625,
-0.06500244140625,
-0.0008358955383300781,
0.006740570068359375,
0.004833221435546875,
0.0406494140625,
0.0440673828125,
-0.0235443115234375,
0.07464599609375,
0.05633544921875,
-0.01454925537109375,
0.0243988037109375,
-0.0196990966796875,
0.0159454345703125,
-0.04705810546875,
-0.023284912109375,
-0.0364990234375,
-0.0298309326171875,
-0.047332763671875,
-0.0118255615234375,
0.01189422607421875,
0.00858306884765625,
-0.0213165283203125,
0.044586181640625,
-0.0814208984375,
0.04345703125,
0.02667236328125,
0.003570556640625,
0.02593994140625,
-0.0022220611572265625,
0.0104522705078125,
-0.0207366943359375,
-0.0244903564453125,
-0.04913330078125,
0.046478271484375,
0.04376220703125,
0.0970458984375,
0.003787994384765625,
0.041656494140625,
0.03692626953125,
0.04376220703125,
-0.0721435546875,
0.0256805419921875,
-0.038818359375,
-0.04913330078125,
-0.01125335693359375,
-0.0265045166015625,
-0.03759765625,
0.0156097412109375,
-0.01551055908203125,
-0.050628662109375,
0.01250457763671875,
0.00811004638671875,
-0.0017461776733398438,
0.01898193359375,
-0.037109375,
0.0609130859375,
-0.00855255126953125,
-0.0306396484375,
-0.00487518310546875,
-0.046722412109375,
0.04046630859375,
-0.00734710693359375,
0.021240234375,
-0.03289794921875,
0.0305328369140625,
0.049652099609375,
-0.0279541015625,
0.080810546875,
-0.015533447265625,
-0.0027618408203125,
0.01073455810546875,
0.014373779296875,
0.006626129150390625,
0.024627685546875,
-0.0218658447265625,
0.03717041015625,
0.0163116455078125,
0.014068603515625,
-0.00582122802734375,
0.053497314453125,
-0.061279296875,
-0.0259552001953125,
-0.066162109375,
-0.0250396728515625,
0.001918792724609375,
0.0161590576171875,
0.03753662109375,
0.00858306884765625,
-0.020416259765625,
0.035003662109375,
0.03765869140625,
0.0131683349609375,
0.0311737060546875,
0.034271240234375,
-0.0206756591796875,
-0.0098419189453125,
0.054901123046875,
-0.015899658203125,
-0.0173492431640625,
0.031219482421875,
0.032867431640625,
-0.048248291015625,
-0.01503753662109375,
-0.01218414306640625,
-0.0009517669677734375,
-0.055084228515625,
-0.046051025390625,
-0.050262451171875,
-0.032318115234375,
-0.032379150390625,
-0.00955963134765625,
-0.01329803466796875,
-0.0176239013671875,
-0.06280517578125,
-0.021514892578125,
0.052734375,
0.053192138671875,
-0.004150390625,
0.01629638671875,
-0.0634765625,
0.0291900634765625,
0.0206756591796875,
0.02801513671875,
-0.0140533447265625,
-0.048614501953125,
0.002635955810546875,
0.00589752197265625,
-0.04803466796875,
-0.056121826171875,
0.0396728515625,
0.021820068359375,
0.02606201171875,
0.046966552734375,
0.055511474609375,
0.04150390625,
-0.0183563232421875,
0.045806884765625,
0.031463623046875,
-0.072509765625,
0.034912109375,
-0.01555633544921875,
0.00029754638671875,
0.04852294921875,
0.0616455078125,
-0.03485107421875,
-0.03875732421875,
-0.0406494140625,
-0.05767822265625,
0.06243896484375,
0.0203704833984375,
0.0284576416015625,
-0.01293182373046875,
-0.0001533031463623047,
0.02239990234375,
0.0275421142578125,
-0.0653076171875,
-0.0169830322265625,
-0.0482177734375,
-0.018341064453125,
0.0160064697265625,
-0.02935791015625,
-0.0138092041015625,
-0.04290771484375,
0.0465087890625,
-0.01441192626953125,
0.006359100341796875,
-0.005584716796875,
-0.010711669921875,
-0.00946807861328125,
0.017669677734375,
0.01483917236328125,
0.041778564453125,
-0.026763916015625,
0.01453399658203125,
0.00988006591796875,
-0.058990478515625,
0.006122589111328125,
0.015777587890625,
0.0049285888671875,
0.0196990966796875,
0.006439208984375,
0.05145263671875,
-0.0028972625732421875,
-0.0143280029296875,
0.04901123046875,
-0.034515380859375,
-0.033599853515625,
-0.037567138671875,
0.019561767578125,
-0.007633209228515625,
0.0191192626953125,
0.021728515625,
0.021484375,
0.029052734375,
-0.02996826171875,
0.019073486328125,
0.01375579833984375,
-0.03839111328125,
-0.0262451171875,
0.06610107421875,
0.0238494873046875,
-0.01433563232421875,
0.057281494140625,
-0.01558685302734375,
-0.0433349609375,
0.036163330078125,
0.0277099609375,
0.0692138671875,
-0.0008306503295898438,
0.034393310546875,
0.038055419921875,
0.01128387451171875,
-0.0147247314453125,
0.0268096923828125,
0.00212860107421875,
-0.044097900390625,
-0.00917816162109375,
-0.08709716796875,
-0.024749755859375,
0.0102996826171875,
-0.073974609375,
0.034912109375,
-0.04949951171875,
-0.035614013671875,
0.0174102783203125,
-0.02020263671875,
-0.0321044921875,
0.03173828125,
0.0000017881393432617188,
0.056427001953125,
-0.08221435546875,
0.060150146484375,
0.0670166015625,
-0.0208282470703125,
-0.0648193359375,
0.0020351409912109375,
0.00521087646484375,
-0.0251617431640625,
0.036529541015625,
0.037567138671875,
-0.0159149169921875,
-0.0074462890625,
-0.0352783203125,
-0.041900634765625,
0.0628662109375,
0.0162811279296875,
-0.05609130859375,
0.0006184577941894531,
0.01258087158203125,
0.057281494140625,
-0.0062255859375,
0.0162353515625,
0.033203125,
0.034271240234375,
-0.0006680488586425781,
-0.068359375,
0.01152801513671875,
-0.038238525390625,
0.00021839141845703125,
0.00995635986328125,
-0.036102294921875,
0.069091796875,
-0.01047515869140625,
-0.01117706298828125,
0.0019779205322265625,
0.035125732421875,
0.0031147003173828125,
0.040008544921875,
0.052764892578125,
0.0435791015625,
0.049224853515625,
-0.042999267578125,
0.049468994140625,
-0.00970458984375,
0.06951904296875,
0.0748291015625,
-0.007503509521484375,
0.046539306640625,
0.032470703125,
-0.0182342529296875,
0.04815673828125,
0.04248046875,
-0.025421142578125,
0.052825927734375,
0.006389617919921875,
-0.0020618438720703125,
-0.0012407302856445312,
-0.01019287109375,
-0.03497314453125,
0.045928955078125,
0.0038661956787109375,
-0.039398193359375,
-0.01219940185546875,
0.00893402099609375,
0.00424957275390625,
0.0074462890625,
-0.01311492919921875,
0.0595703125,
0.0157012939453125,
-0.0198974609375,
0.05767822265625,
0.0075531005859375,
0.0745849609375,
-0.01371002197265625,
0.001049041748046875,
0.006534576416015625,
0.033599853515625,
-0.05926513671875,
-0.0823974609375,
0.058502197265625,
0.03021240234375,
-0.033172607421875,
-0.0242767333984375,
0.0679931640625,
-0.0093994140625,
-0.0382080078125,
0.0193328857421875,
0.005290985107421875,
0.0047607421875,
-0.020965576171875,
-0.0631103515625,
-0.0005698204040527344,
-0.003673553466796875,
-0.01495361328125,
0.01195526123046875,
0.035064697265625,
0.01026153564453125,
0.035400390625,
0.052734375,
-0.041748046875,
-0.023162841796875,
0.0165252685546875,
0.0775146484375,
-0.032989501953125,
-0.049713134765625,
-0.07794189453125,
0.058074951171875,
-0.01224517822265625,
-0.032012939453125,
0.0567626953125,
0.033660888671875,
0.052734375,
-0.03173828125,
0.03857421875,
-0.036834716796875,
0.038665771484375,
0.0007605552673339844,
0.08843994140625,
-0.049774169921875,
-0.01131439208984375,
-0.044036865234375,
-0.056427001953125,
-0.010040283203125,
0.0755615234375,
-0.00396728515625,
-0.01275634765625,
0.053802490234375,
0.050262451171875,
-0.0141754150390625,
0.01483917236328125,
0.01479339599609375,
0.00756072998046875,
0.009002685546875,
0.001895904541015625,
0.0804443359375,
-0.010711669921875,
0.033843994140625,
-0.07891845703125,
-0.0234375,
-0.0251922607421875,
-0.0828857421875,
-0.08056640625,
-0.031646728515625,
-0.036102294921875,
-0.04840087890625,
-0.0230712890625,
0.040771484375,
0.06744384765625,
-0.049713134765625,
0.003505706787109375,
-0.005970001220703125,
0.005924224853515625,
-0.0160675048828125,
-0.0294952392578125,
-0.005504608154296875,
0.01261138916015625,
-0.06805419921875,
-0.002262115478515625,
-0.00545501708984375,
-0.0028667449951171875,
-0.01531982421875,
0.009246826171875,
-0.0016117095947265625,
-0.022705078125,
0.057647705078125,
-0.0019083023071289062,
-0.0248260498046875,
-0.0159149169921875,
0.007030487060546875,
-0.016937255859375,
-0.01236724853515625,
0.015411376953125,
-0.022247314453125,
0.01090240478515625,
0.04840087890625,
0.010498046875,
0.0282745361328125,
0.01611328125,
0.00714111328125,
-0.07281494140625,
0.0164947509765625,
0.042083740234375,
0.036407470703125,
0.0322265625,
-0.05560302734375,
0.039581298828125,
0.0178375244140625,
-0.04248046875,
-0.060028076171875,
0.004207611083984375,
-0.082763671875,
-0.02557373046875,
0.07562255859375,
-0.006290435791015625,
-0.02801513671875,
0.0015039443969726562,
-0.028106689453125,
0.017578125,
-0.04779052734375,
0.09130859375,
0.035614013671875,
-0.0024356842041015625,
0.0022945404052734375,
-0.007518768310546875,
0.031494140625,
-0.008270263671875,
-0.048126220703125,
0.003505706787109375,
0.0377197265625,
0.047119140625,
0.02410888671875,
0.03179931640625,
-0.0047149658203125,
0.032745361328125,
-0.0176849365234375,
-0.01197052001953125,
-0.026336669921875,
-0.0266571044921875,
-0.01033782958984375,
0.013397216796875,
-0.004703521728515625,
-0.0279388427734375
]
] |
lambdalabs/sd-image-variations-diffusers | 2023-02-08T15:10:13.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"image-to-image",
"dataset:ChristophSchuhmann/improved_aesthetics_6plus",
"license:creativeml-openrail-m",
"has_space",
"diffusers:StableDiffusionImageVariationPipeline",
"region:us"
] | image-to-image | lambdalabs | null | null | lambdalabs/sd-image-variations-diffusers | 294 | 88,035 | diffusers | 2022-09-09T14:53:35 | ---
thumbnail: "https://repository-images.githubusercontent.com/523487884/fdb03a69-8353-4387-b5fc-0d85f888a63f"
datasets:
- ChristophSchuhmann/improved_aesthetics_6plus
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- image-to-image
---
# Stable Diffusion Image Variations Model Card
📣 V2 model released, and blurriness issues fixed! 📣
🧨🎉 Image Variations is now natively supported in 🤗 Diffusers! 🎉🧨

## Version 2
This version of Stable Diffusion has been fine-tuned from [CompVis/stable-diffusion-v1-4-original](https://huggingface.co/CompVis/stable-diffusion-v-1-4-original) to accept CLIP image embeddings rather than text embeddings. This allows the creation of "image variations" similar to DALLE-2 using Stable Diffusion. This version of the weights has been ported to huggingface Diffusers; using it with versions of the Diffusers library older than 0.8.0 requires the [Lambda Diffusers repo](https://github.com/LambdaLabsML/lambda-diffusers).
This model was trained in two stages, for longer than the original variations model, and gives better image quality and better CLIP-rated similarity compared to the original version.
See training details and v1 vs v2 comparison below.
## Example
Make sure you are using a version of Diffusers >=0.8.0 (for older versions see the old instructions at the bottom of this model card)
```python
from diffusers import StableDiffusionImageVariationPipeline
from PIL import Image
from torchvision import transforms
device = "cuda:0"
sd_pipe = StableDiffusionImageVariationPipeline.from_pretrained(
"lambdalabs/sd-image-variations-diffusers",
revision="v2.0",
)
sd_pipe = sd_pipe.to(device)
im = Image.open("path/to/image.jpg")
tform = transforms.Compose([
transforms.ToTensor(),
transforms.Resize(
(224, 224),
interpolation=transforms.InterpolationMode.BICUBIC,
antialias=False,
),
transforms.Normalize(
[0.48145466, 0.4578275, 0.40821073],
[0.26862954, 0.26130258, 0.27577711]),
])
inp = tform(im).to(device).unsqueeze(0)
out = sd_pipe(inp, guidance_scale=3)
out["images"][0].save("result.jpg")
```
### The importance of resizing correctly... (or not)
Note that, due to a bit of an oversight during training, the model expects resized images without anti-aliasing. This turns out to make a big difference, so it is important to resize the same way during inference. When a PIL image is passed to the Diffusers pipeline, anti-aliasing will be applied during the resize, so it's better to input a tensor which you have prepared manually according to the transform in the example above!
Here are examples of images generated without (top) and with (bottom) anti-aliasing during resize. (Input is [this image](https://github.com/SHI-Labs/Versatile-Diffusion/blob/master/assets/ghibli.jpg))


### V1 vs V2
Here's an example of V1 vs V2; version two was trained more carefully and for longer (see the details below). V2 top vs V1 bottom:


Input images:

One important thing to note is that, due to the longer training, V2 appears to have memorised some common images from the training data; e.g. the previous example of the Girl with a Pearl Earring now almost perfectly reproduces the original rather than creating variations. You can always use v1 by specifying `revision="v1.0"`.
v2 output for Girl with a Pearl Earring as input (guidance scale = 3)

# Training
**Training Procedure**
This model is fine-tuned from Stable Diffusion v1-3, where the text encoder has been replaced with an image encoder. The training procedure is the same as for Stable Diffusion, except that images are encoded through a ViT-L/14 image encoder, including the final projection layer to the CLIP shared embedding space. The model was trained on LAION improved aesthetics 6plus.
- **Hardware:** 8 x A100-40GB GPUs (provided by [Lambda GPU Cloud](https://lambdalabs.com/service/gpu-cloud))
- **Optimizer:** AdamW
- **Stage 1** - Fine-tune only the CrossAttention layer weights from the Stable Diffusion v1.4 model
- **Steps**: 46,000
- **Batch:** batch size=4, GPUs=8, Gradient Accumulations=4. Total batch size=128
- **Learning rate:** warmup to 1e-5 for 10,000 steps and then kept constant
- **Stage 2** - Resume from Stage 1 training the whole unet
- **Steps**: 50,000
- **Batch:** batch size=4, GPUs=8, Gradient Accumulations=5. Total batch size=160
- **Learning rate:** warmup to 1e-5 for 5,000 steps and then kept constant
Training was done using a [modified version of the original Stable Diffusion training code](https://github.com/justinpinkney/stable-diffusion).
# Uses
_The following section is adapted from the [Stable Diffusion model card](https://huggingface.co/CompVis/stable-diffusion-v1-4)_
## Direct Use
The model is intended for research purposes only. Possible research areas and tasks include:
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation
- Representations of egregious violence and gore
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism
- The model cannot render legible text
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere”
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy
- The model was trained on a large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/) which contains adult material
and is not fit for product use without additional safety mechanisms and
considerations.
- No additional measures were used to deduplicate the dataset. As a result, we observe some degree of memorization for images that are duplicated in the training data.
The training data can be searched at [https://rom1504.github.io/clip-retrieval/](https://rom1504.github.io/clip-retrieval/) to possibly assist in the detection of memorized images.
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
Stable Diffusion v1 was trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/),
which consists of images that are primarily limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
### Safety Module
The intended use of this model is with the [Safety Checker](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/stable_diffusion/safety_checker.py) in Diffusers.
This checker works by checking model outputs against known hard-coded NSFW concepts.
The concepts are intentionally hidden to reduce the likelihood of reverse-engineering this filter.
Specifically, the checker compares the class probability of harmful concepts in the embedding space of the `CLIPModel` *after generation* of the images.
The concepts are passed into the model with the generated image and compared to a hand-engineered weight for each NSFW concept.
## Old instructions
If you are using a diffusers version <0.8.0 there is no `StableDiffusionImageVariationPipeline`;
in this case you need to use an older revision (`2ddbd90b14bc5892c19925b15185e561bc8e5d0a`) in conjunction with the lambda-diffusers repo:
First clone [Lambda Diffusers](https://github.com/LambdaLabsML/lambda-diffusers) and install any requirements (in a virtual environment in the example below):
```bash
git clone https://github.com/LambdaLabsML/lambda-diffusers.git
cd lambda-diffusers
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
Then run the following python code:
```python
from pathlib import Path
from lambda_diffusers import StableDiffusionImageEmbedPipeline
from PIL import Image
import torch
device = "cuda" if torch.cuda.is_available() else "cpu"
pipe = StableDiffusionImageEmbedPipeline.from_pretrained(
"lambdalabs/sd-image-variations-diffusers",
revision="2ddbd90b14bc5892c19925b15185e561bc8e5d0a",
)
pipe = pipe.to(device)
im = Image.open("your/input/image/here.jpg")
num_samples = 4
image = pipe(num_samples*[im], guidance_scale=3.0)
image = image["sample"]
base_path = Path("outputs/im2im")
base_path.mkdir(exist_ok=True, parents=True)
for idx, im in enumerate(image):
im.save(base_path/f"{idx:06}.jpg")
```
*This model card was written by: Justin Pinkney and is based on the [Stable Diffusion model card](https://huggingface.co/CompVis/stable-diffusion-v1-4).* | 10,569 | [
[
-0.03863525390625,
-0.061981201171875,
0.0137786865234375,
0.01041412353515625,
-0.0272216796875,
-0.0208740234375,
-0.0008406639099121094,
-0.03765869140625,
0.006877899169921875,
0.0257415771484375,
-0.034942626953125,
-0.0355224609375,
-0.05865478515625,
-0.00931549072265625,
-0.0233154296875,
0.06134033203125,
-0.0164794921875,
0.01554107666015625,
-0.0007762908935546875,
-0.017547607421875,
-0.0290985107421875,
-0.01727294921875,
-0.0745849609375,
-0.01554107666015625,
0.02020263671875,
0.007053375244140625,
0.04443359375,
0.0340576171875,
0.036407470703125,
0.0224151611328125,
-0.02325439453125,
0.005039215087890625,
-0.054901123046875,
-0.01059722900390625,
0.0014257431030273438,
-0.023834228515625,
-0.0277099609375,
0.007904052734375,
0.051239013671875,
0.0195465087890625,
-0.013916015625,
0.00415802001953125,
0.0003230571746826172,
0.05731201171875,
-0.040863037109375,
-0.00997161865234375,
-0.0269927978515625,
0.021270751953125,
-0.01380157470703125,
0.0164031982421875,
-0.01885986328125,
-0.00775146484375,
0.01081085205078125,
-0.0660400390625,
0.036956787109375,
-0.01454925537109375,
0.09228515625,
0.0254669189453125,
-0.018646240234375,
-0.008056640625,
-0.046844482421875,
0.0523681640625,
-0.04571533203125,
0.0283203125,
0.0282135009765625,
0.01198577880859375,
0.00557708740234375,
-0.07135009765625,
-0.033721923828125,
-0.0008463859558105469,
0.0031948089599609375,
0.02947998046875,
-0.01508331298828125,
-0.0091705322265625,
0.033203125,
0.032562255859375,
-0.041259765625,
-0.005809783935546875,
-0.059783935546875,
-0.00630950927734375,
0.051300048828125,
0.00020813941955566406,
0.0222320556640625,
0.005031585693359375,
-0.03533935546875,
-0.0092315673828125,
-0.0452880859375,
-0.0018968582153320312,
0.02716064453125,
-0.0264129638671875,
-0.028167724609375,
0.0225677490234375,
0.0082550048828125,
0.036834716796875,
0.026519775390625,
-0.005352020263671875,
0.0369873046875,
-0.005649566650390625,
-0.0222320556640625,
-0.0228118896484375,
0.06463623046875,
0.040069580078125,
0.0026226043701171875,
0.0101776123046875,
-0.0088653564453125,
-0.006443023681640625,
0.01251220703125,
-0.08203125,
-0.02703857421875,
0.004108428955078125,
-0.044921875,
-0.026214599609375,
0.001373291015625,
-0.06732177734375,
-0.006404876708984375,
-0.0003840923309326172,
0.0406494140625,
-0.04364013671875,
-0.03411865234375,
0.01053619384765625,
-0.0230865478515625,
0.0104522705078125,
0.032928466796875,
-0.0634765625,
0.014862060546875,
-0.001007080078125,
0.08038330078125,
-0.018096923828125,
-0.01324462890625,
0.0020771026611328125,
0.00579833984375,
-0.0296630859375,
0.04827880859375,
-0.0166015625,
-0.053192138671875,
-0.0188751220703125,
0.0278472900390625,
-0.002162933349609375,
-0.05096435546875,
0.054290771484375,
-0.031280517578125,
0.0308837890625,
-0.00583648681640625,
-0.0430908203125,
-0.022308349609375,
0.0032596588134765625,
-0.05078125,
0.08099365234375,
0.0218505859375,
-0.0670166015625,
0.01155853271484375,
-0.062286376953125,
-0.00827789306640625,
-0.00620269775390625,
0.00032639503479003906,
-0.0567626953125,
-0.005016326904296875,
0.0038852691650390625,
0.0267181396484375,
-0.0019140243530273438,
0.004199981689453125,
-0.02093505859375,
-0.032318115234375,
0.0022830963134765625,
-0.0400390625,
0.0750732421875,
0.023345947265625,
-0.034698486328125,
-0.0035877227783203125,
-0.045135498046875,
-0.0181121826171875,
0.036376953125,
-0.0200347900390625,
-0.0119171142578125,
-0.025299072265625,
0.0182037353515625,
0.01430511474609375,
0.00762176513671875,
-0.035125732421875,
0.01000213623046875,
-0.017120361328125,
0.03857421875,
0.05474853515625,
0.015411376953125,
0.04339599609375,
-0.02838134765625,
0.038726806640625,
0.029571533203125,
0.028411865234375,
-0.00901031494140625,
-0.053924560546875,
-0.0535888671875,
-0.024200439453125,
0.02001953125,
0.035400390625,
-0.047149658203125,
0.0172271728515625,
-0.00002562999725341797,
-0.04473876953125,
-0.0292510986328125,
0.00498199462890625,
0.0220947265625,
0.051116943359375,
0.029510498046875,
-0.047027587890625,
-0.0293426513671875,
-0.06475830078125,
0.0177154541015625,
0.0092926025390625,
0.006076812744140625,
0.02020263671875,
0.048797607421875,
-0.0280914306640625,
0.03826904296875,
-0.040069580078125,
-0.02703857421875,
0.01242828369140625,
0.015838623046875,
0.01000213623046875,
0.05999755859375,
0.059967041015625,
-0.07391357421875,
-0.0487060546875,
-0.01412200927734375,
-0.06695556640625,
0.0126495361328125,
-0.0034027099609375,
-0.0256500244140625,
0.02557373046875,
0.035491943359375,
-0.056549072265625,
0.046051025390625,
0.032379150390625,
-0.0207977294921875,
0.043548583984375,
-0.0292816162109375,
0.004917144775390625,
-0.07452392578125,
0.0020313262939453125,
0.0238037109375,
-0.0191650390625,
-0.048126220703125,
0.01355743408203125,
0.00818634033203125,
-0.003231048583984375,
-0.05743408203125,
0.05145263671875,
-0.03985595703125,
0.0256500244140625,
-0.0211944580078125,
-0.013580322265625,
0.01262664794921875,
0.0296630859375,
0.023162841796875,
0.04931640625,
0.072509765625,
-0.0430908203125,
0.0203857421875,
0.0169830322265625,
-0.02215576171875,
0.044921875,
-0.0596923828125,
0.01322174072265625,
-0.0307464599609375,
0.01739501953125,
-0.06103515625,
-0.0110626220703125,
0.04241943359375,
-0.0302886962890625,
0.052520751953125,
-0.0174560546875,
-0.0258026123046875,
-0.038970947265625,
-0.0245361328125,
0.0390625,
0.07440185546875,
-0.033447265625,
0.036651611328125,
0.018280029296875,
0.0229339599609375,
-0.040252685546875,
-0.06048583984375,
-0.0159912109375,
-0.03778076171875,
-0.049468994140625,
0.038787841796875,
-0.0181732177734375,
-0.005886077880859375,
0.00933074951171875,
0.002567291259765625,
-0.007152557373046875,
-0.0013780593872070312,
0.0236053466796875,
0.0223236083984375,
-0.0099029541015625,
-0.00675201416015625,
0.0087738037109375,
-0.00014972686767578125,
-0.00872039794921875,
-0.0185699462890625,
0.02911376953125,
-0.00789642333984375,
-0.01546478271484375,
-0.056610107421875,
0.020660400390625,
0.04144287109375,
0.010650634765625,
0.06341552734375,
0.0809326171875,
-0.03485107421875,
0.006526947021484375,
-0.041717529296875,
-0.0115203857421875,
-0.038818359375,
0.0284881591796875,
-0.0202789306640625,
-0.038330078125,
0.0511474609375,
-0.006313323974609375,
0.0010242462158203125,
0.04791259765625,
0.04400634765625,
-0.0203857421875,
0.07208251953125,
0.03875732421875,
0.01256561279296875,
0.055328369140625,
-0.06878662109375,
-0.01129913330078125,
-0.08172607421875,
-0.0216217041015625,
0.003582000732421875,
-0.028411865234375,
-0.0310821533203125,
-0.0399169921875,
0.0256500244140625,
0.031463623046875,
-0.024200439453125,
0.0072174072265625,
-0.04364013671875,
0.040618896484375,
0.033355712890625,
0.0264129638671875,
-0.002613067626953125,
0.022308349609375,
-0.00008475780487060547,
-0.013519287109375,
-0.0518798828125,
-0.0298309326171875,
0.08551025390625,
0.039703369140625,
0.06585693359375,
0.00463104248046875,
0.045074462890625,
0.021209716796875,
0.0288238525390625,
-0.040130615234375,
0.028167724609375,
-0.0211944580078125,
-0.04736328125,
-0.01385498046875,
-0.0199127197265625,
-0.06982421875,
0.0239105224609375,
-0.0157623291015625,
-0.0416259765625,
0.037933349609375,
0.0157928466796875,
-0.020477294921875,
0.033447265625,
-0.061920166015625,
0.073486328125,
-0.01308441162109375,
-0.05279541015625,
-0.0042724609375,
-0.049468994140625,
0.0301666259765625,
-0.00185394287109375,
0.0076141357421875,
-0.00254058837890625,
0.007320404052734375,
0.05914306640625,
-0.0299224853515625,
0.06610107421875,
-0.0286102294921875,
0.008758544921875,
0.0292510986328125,
-0.004245758056640625,
0.0224456787109375,
0.00482940673828125,
-0.0087738037109375,
0.03521728515625,
0.007080078125,
-0.040008544921875,
-0.031768798828125,
0.0491943359375,
-0.077880859375,
-0.04132080078125,
-0.0271453857421875,
-0.0269622802734375,
0.02587890625,
0.00972747802734375,
0.070556640625,
0.0286102294921875,
-0.014801025390625,
0.003875732421875,
0.06890869140625,
-0.018524169921875,
0.033050537109375,
0.01885986328125,
-0.018096923828125,
-0.040191650390625,
0.05902099609375,
0.01404571533203125,
0.0357666015625,
-0.01084136962890625,
0.01088714599609375,
-0.0216522216796875,
-0.033782958984375,
-0.05517578125,
0.0294342041015625,
-0.059814453125,
-0.01270294189453125,
-0.041595458984375,
-0.031890869140625,
-0.0325927734375,
-0.0090179443359375,
-0.03271484375,
-0.023040771484375,
-0.06585693359375,
0.0007066726684570312,
0.0255279541015625,
0.0457763671875,
-0.0220489501953125,
0.02618408203125,
-0.03790283203125,
0.01416778564453125,
0.01183319091796875,
0.02093505859375,
0.0030956268310546875,
-0.06951904296875,
-0.0194091796875,
0.006649017333984375,
-0.05645751953125,
-0.069580078125,
0.035003662109375,
0.0230865478515625,
0.041015625,
0.05596923828125,
-0.0161895751953125,
0.055511474609375,
-0.034698486328125,
0.07550048828125,
0.026519775390625,
-0.049468994140625,
0.047027587890625,
-0.024261474609375,
0.017791748046875,
0.0268707275390625,
0.047821044921875,
-0.030364990234375,
-0.01515960693359375,
-0.06982421875,
-0.06536865234375,
0.052978515625,
0.02398681640625,
0.0251312255859375,
0.00843048095703125,
0.053009033203125,
0.00359344482421875,
-0.0016269683837890625,
-0.07623291015625,
-0.040313720703125,
-0.040985107421875,
0.0001976490020751953,
0.002307891845703125,
-0.022552490234375,
-0.01415252685546875,
-0.046173095703125,
0.07586669921875,
0.01178741455078125,
0.0335693359375,
0.03179931640625,
0.0011310577392578125,
-0.0255279541015625,
-0.0216064453125,
0.050506591796875,
0.040771484375,
-0.0165252685546875,
-0.006717681884765625,
0.01003265380859375,
-0.042449951171875,
0.0099945068359375,
0.0021209716796875,
-0.042144775390625,
0.01020050048828125,
-0.0022678375244140625,
0.0806884765625,
-0.01486968994140625,
-0.033843994140625,
0.042388916015625,
-0.015472412109375,
-0.03753662109375,
-0.03814697265625,
0.0041961669921875,
0.0018596649169921875,
0.01363372802734375,
0.01314544677734375,
0.033233642578125,
0.0158843994140625,
-0.0302276611328125,
0.002994537353515625,
0.033416748046875,
-0.0249481201171875,
-0.0246734619140625,
0.07110595703125,
-0.00478363037109375,
-0.022796630859375,
0.048614501953125,
-0.0303955078125,
-0.0266876220703125,
0.054229736328125,
0.052215576171875,
0.054931640625,
-0.027862548828125,
0.029510498046875,
0.055206298828125,
0.01593017578125,
-0.038482666015625,
0.01163482666015625,
0.0066986083984375,
-0.056671142578125,
0.0015316009521484375,
-0.032073974609375,
-0.004253387451171875,
0.0208740234375,
-0.044769287109375,
0.0295867919921875,
-0.03857421875,
-0.0278778076171875,
-0.0159759521484375,
-0.0224151611328125,
-0.047607421875,
0.0101318359375,
0.01739501953125,
0.060272216796875,
-0.074462890625,
0.0628662109375,
0.056549072265625,
-0.055511474609375,
-0.04937744140625,
-0.0039520263671875,
-0.0088653564453125,
-0.0423583984375,
0.048858642578125,
0.01529693603515625,
0.009735107421875,
0.0011310577392578125,
-0.05804443359375,
-0.06878662109375,
0.09912109375,
0.025482177734375,
-0.03265380859375,
0.0003483295440673828,
-0.009613037109375,
0.04583740234375,
-0.02838134765625,
0.0244293212890625,
0.023040771484375,
0.0171051025390625,
0.038726806640625,
-0.048065185546875,
-0.001720428466796875,
-0.0286865234375,
0.0253753662109375,
-0.004047393798828125,
-0.0692138671875,
0.0750732421875,
-0.0164794921875,
-0.020416259765625,
0.022064208984375,
0.04425048828125,
0.014251708984375,
0.0175018310546875,
0.03240966796875,
0.0755615234375,
0.053924560546875,
-0.0029697418212890625,
0.075927734375,
-0.01103973388671875,
0.041015625,
0.058349609375,
0.005519866943359375,
0.045623779296875,
0.03485107421875,
-0.0098876953125,
0.040069580078125,
0.07049560546875,
-0.0258026123046875,
0.05804443359375,
0.00902557373046875,
-0.02288818359375,
-0.000041365623474121094,
0.00417327880859375,
-0.0438232421875,
-0.004367828369140625,
0.032867431640625,
-0.042083740234375,
-0.005680084228515625,
0.026275634765625,
-0.01029205322265625,
-0.0234222412109375,
-0.0190887451171875,
0.040771484375,
0.01041412353515625,
-0.0229644775390625,
0.060333251953125,
0.001026153564453125,
0.07183837890625,
-0.039215087890625,
-0.00286102294921875,
-0.0197296142578125,
0.007488250732421875,
-0.0200042724609375,
-0.049163818359375,
0.03179931640625,
-0.01416778564453125,
-0.007770538330078125,
-0.0138092041015625,
0.048675537109375,
-0.0282440185546875,
-0.053741455078125,
0.0228424072265625,
0.01495361328125,
0.037506103515625,
0.00440216064453125,
-0.08380126953125,
0.01146697998046875,
-0.001888275146484375,
-0.031036376953125,
0.0168609619140625,
0.0219268798828125,
0.009307861328125,
0.0399169921875,
0.03082275390625,
0.00670623779296875,
0.017730712890625,
0.00347137451171875,
0.056976318359375,
-0.0165863037109375,
-0.01593017578125,
-0.0589599609375,
0.04364013671875,
-0.0135040283203125,
-0.0195465087890625,
0.0443115234375,
0.04931640625,
0.07318115234375,
-0.01922607421875,
0.05035400390625,
-0.0081024169921875,
0.0058746337890625,
-0.04205322265625,
0.057586669921875,
-0.06292724609375,
0.012115478515625,
-0.01739501953125,
-0.064453125,
-0.013702392578125,
0.07684326171875,
-0.022552490234375,
0.01320648193359375,
0.039886474609375,
0.06744384765625,
-0.0207977294921875,
-0.01177215576171875,
0.0238037109375,
0.01436614990234375,
0.0166778564453125,
0.02886962890625,
0.059295654296875,
-0.0667724609375,
0.023345947265625,
-0.038177490234375,
-0.0101318359375,
0.001102447509765625,
-0.046539306640625,
-0.058868408203125,
-0.050811767578125,
-0.0570068359375,
-0.041595458984375,
0.004482269287109375,
0.044189453125,
0.07635498046875,
-0.04193115234375,
-0.007221221923828125,
-0.01959228515625,
-0.005084991455078125,
-0.00351715087890625,
-0.0201568603515625,
0.0296630859375,
0.0033893585205078125,
-0.0750732421875,
-0.008056640625,
0.00901031494140625,
0.052337646484375,
-0.02783203125,
-0.021026611328125,
-0.013519287109375,
-0.0183258056640625,
0.03466796875,
0.0282745361328125,
-0.051849365234375,
-0.002582550048828125,
-0.01322174072265625,
0.005374908447265625,
0.0233306884765625,
0.0256500244140625,
-0.036376953125,
0.034881591796875,
0.025177001953125,
0.02264404296875,
0.07037353515625,
0.00011157989501953125,
0.00510406494140625,
-0.043853759765625,
0.02587890625,
0.0038661956787109375,
0.038330078125,
0.02801513671875,
-0.041748046875,
0.045928955078125,
0.04827880859375,
-0.04840087890625,
-0.04974365234375,
0.0078277587890625,
-0.0784912109375,
-0.032684326171875,
0.0927734375,
-0.022552490234375,
-0.0253143310546875,
0.011749267578125,
-0.03131103515625,
0.0146636962890625,
-0.0235443115234375,
0.05535888671875,
0.047760009765625,
-0.013916015625,
-0.03814697265625,
-0.044525146484375,
0.035400390625,
0.019439697265625,
-0.040924072265625,
-0.032867431640625,
0.041412353515625,
0.056915283203125,
0.019775390625,
0.06671142578125,
-0.02587890625,
0.0181884765625,
-0.0024051666259765625,
0.01025390625,
0.00583648681640625,
-0.003509521484375,
-0.040496826171875,
-0.0017709732055664062,
-0.007083892822265625,
-0.01959228515625
]
] |
thenlper/gte-large | 2023-09-25T12:54:54.000Z | [
"sentence-transformers",
"pytorch",
"onnx",
"safetensors",
"bert",
"mteb",
"sentence-similarity",
"Sentence Transformers",
"en",
"arxiv:2308.03281",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | thenlper | null | null | thenlper/gte-large | 140 | 87,662 | sentence-transformers | 2023-07-27T09:55:39 | ---
tags:
- mteb
- sentence-similarity
- sentence-transformers
- Sentence Transformers
model-index:
- name: gte-large
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 72.62686567164178
- type: ap
value: 34.46944126809772
- type: f1
value: 66.23684353950857
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 92.51805
- type: ap
value: 89.49842783330848
- type: f1
value: 92.51112169431808
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 49.074
- type: f1
value: 48.44785682572955
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.077
- type: map_at_10
value: 48.153
- type: map_at_100
value: 48.963
- type: map_at_1000
value: 48.966
- type: map_at_3
value: 43.184
- type: map_at_5
value: 46.072
- type: mrr_at_1
value: 33.073
- type: mrr_at_10
value: 48.54
- type: mrr_at_100
value: 49.335
- type: mrr_at_1000
value: 49.338
- type: mrr_at_3
value: 43.563
- type: mrr_at_5
value: 46.383
- type: ndcg_at_1
value: 32.077
- type: ndcg_at_10
value: 57.158
- type: ndcg_at_100
value: 60.324999999999996
- type: ndcg_at_1000
value: 60.402
- type: ndcg_at_3
value: 46.934
- type: ndcg_at_5
value: 52.158
- type: precision_at_1
value: 32.077
- type: precision_at_10
value: 8.591999999999999
- type: precision_at_100
value: 0.991
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 19.275000000000002
- type: precision_at_5
value: 14.111
- type: recall_at_1
value: 32.077
- type: recall_at_10
value: 85.917
- type: recall_at_100
value: 99.075
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 57.824
- type: recall_at_5
value: 70.555
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 48.619246083417295
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 43.3574067664688
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 63.06359661829253
- type: mrr
value: 76.15596007562766
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 90.25407547368691
- type: cos_sim_spearman
value: 88.65081514968477
- type: euclidean_pearson
value: 88.14857116664494
- type: euclidean_spearman
value: 88.50683596540692
- type: manhattan_pearson
value: 87.9654797992225
- type: manhattan_spearman
value: 88.21164851646908
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 86.05844155844157
- type: f1
value: 86.01555597681825
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 39.10510519739522
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 36.84689960264385
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackAndroidRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.800000000000004
- type: map_at_10
value: 44.857
- type: map_at_100
value: 46.512
- type: map_at_1000
value: 46.635
- type: map_at_3
value: 41.062
- type: map_at_5
value: 43.126
- type: mrr_at_1
value: 39.628
- type: mrr_at_10
value: 50.879
- type: mrr_at_100
value: 51.605000000000004
- type: mrr_at_1000
value: 51.641000000000005
- type: mrr_at_3
value: 48.14
- type: mrr_at_5
value: 49.835
- type: ndcg_at_1
value: 39.628
- type: ndcg_at_10
value: 51.819
- type: ndcg_at_100
value: 57.318999999999996
- type: ndcg_at_1000
value: 58.955999999999996
- type: ndcg_at_3
value: 46.409
- type: ndcg_at_5
value: 48.825
- type: precision_at_1
value: 39.628
- type: precision_at_10
value: 10.072000000000001
- type: precision_at_100
value: 1.625
- type: precision_at_1000
value: 0.21
- type: precision_at_3
value: 22.556
- type: precision_at_5
value: 16.309
- type: recall_at_1
value: 32.800000000000004
- type: recall_at_10
value: 65.078
- type: recall_at_100
value: 87.491
- type: recall_at_1000
value: 97.514
- type: recall_at_3
value: 49.561
- type: recall_at_5
value: 56.135999999999996
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackEnglishRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.614
- type: map_at_10
value: 43.578
- type: map_at_100
value: 44.897
- type: map_at_1000
value: 45.023
- type: map_at_3
value: 40.282000000000004
- type: map_at_5
value: 42.117
- type: mrr_at_1
value: 40.510000000000005
- type: mrr_at_10
value: 49.428
- type: mrr_at_100
value: 50.068999999999996
- type: mrr_at_1000
value: 50.111000000000004
- type: mrr_at_3
value: 47.176
- type: mrr_at_5
value: 48.583999999999996
- type: ndcg_at_1
value: 40.510000000000005
- type: ndcg_at_10
value: 49.478
- type: ndcg_at_100
value: 53.852
- type: ndcg_at_1000
value: 55.782
- type: ndcg_at_3
value: 45.091
- type: ndcg_at_5
value: 47.19
- type: precision_at_1
value: 40.510000000000005
- type: precision_at_10
value: 9.363000000000001
- type: precision_at_100
value: 1.51
- type: precision_at_1000
value: 0.196
- type: precision_at_3
value: 21.741
- type: precision_at_5
value: 15.465000000000002
- type: recall_at_1
value: 32.614
- type: recall_at_10
value: 59.782000000000004
- type: recall_at_100
value: 78.012
- type: recall_at_1000
value: 90.319
- type: recall_at_3
value: 46.825
- type: recall_at_5
value: 52.688
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGamingRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 40.266000000000005
- type: map_at_10
value: 53.756
- type: map_at_100
value: 54.809
- type: map_at_1000
value: 54.855
- type: map_at_3
value: 50.073
- type: map_at_5
value: 52.293
- type: mrr_at_1
value: 46.332
- type: mrr_at_10
value: 57.116
- type: mrr_at_100
value: 57.767
- type: mrr_at_1000
value: 57.791000000000004
- type: mrr_at_3
value: 54.461999999999996
- type: mrr_at_5
value: 56.092
- type: ndcg_at_1
value: 46.332
- type: ndcg_at_10
value: 60.092
- type: ndcg_at_100
value: 64.034
- type: ndcg_at_1000
value: 64.937
- type: ndcg_at_3
value: 54.071000000000005
- type: ndcg_at_5
value: 57.254000000000005
- type: precision_at_1
value: 46.332
- type: precision_at_10
value: 9.799
- type: precision_at_100
value: 1.278
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 24.368000000000002
- type: precision_at_5
value: 16.89
- type: recall_at_1
value: 40.266000000000005
- type: recall_at_10
value: 75.41499999999999
- type: recall_at_100
value: 92.01700000000001
- type: recall_at_1000
value: 98.379
- type: recall_at_3
value: 59.476
- type: recall_at_5
value: 67.297
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGisRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.589
- type: map_at_10
value: 37.755
- type: map_at_100
value: 38.881
- type: map_at_1000
value: 38.954
- type: map_at_3
value: 34.759
- type: map_at_5
value: 36.544
- type: mrr_at_1
value: 30.734
- type: mrr_at_10
value: 39.742
- type: mrr_at_100
value: 40.774
- type: mrr_at_1000
value: 40.824
- type: mrr_at_3
value: 37.137
- type: mrr_at_5
value: 38.719
- type: ndcg_at_1
value: 30.734
- type: ndcg_at_10
value: 42.978
- type: ndcg_at_100
value: 48.309000000000005
- type: ndcg_at_1000
value: 50.068
- type: ndcg_at_3
value: 37.361
- type: ndcg_at_5
value: 40.268
- type: precision_at_1
value: 30.734
- type: precision_at_10
value: 6.565
- type: precision_at_100
value: 0.964
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_3
value: 15.744
- type: precision_at_5
value: 11.096
- type: recall_at_1
value: 28.589
- type: recall_at_10
value: 57.126999999999995
- type: recall_at_100
value: 81.051
- type: recall_at_1000
value: 94.027
- type: recall_at_3
value: 42.045
- type: recall_at_5
value: 49.019
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackMathematicaRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 18.5
- type: map_at_10
value: 27.950999999999997
- type: map_at_100
value: 29.186
- type: map_at_1000
value: 29.298000000000002
- type: map_at_3
value: 25.141000000000002
- type: map_at_5
value: 26.848
- type: mrr_at_1
value: 22.637
- type: mrr_at_10
value: 32.572
- type: mrr_at_100
value: 33.472
- type: mrr_at_1000
value: 33.533
- type: mrr_at_3
value: 29.747
- type: mrr_at_5
value: 31.482
- type: ndcg_at_1
value: 22.637
- type: ndcg_at_10
value: 33.73
- type: ndcg_at_100
value: 39.568
- type: ndcg_at_1000
value: 42.201
- type: ndcg_at_3
value: 28.505999999999997
- type: ndcg_at_5
value: 31.255
- type: precision_at_1
value: 22.637
- type: precision_at_10
value: 6.281000000000001
- type: precision_at_100
value: 1.073
- type: precision_at_1000
value: 0.14300000000000002
- type: precision_at_3
value: 13.847000000000001
- type: precision_at_5
value: 10.224
- type: recall_at_1
value: 18.5
- type: recall_at_10
value: 46.744
- type: recall_at_100
value: 72.072
- type: recall_at_1000
value: 91.03999999999999
- type: recall_at_3
value: 32.551
- type: recall_at_5
value: 39.533
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackPhysicsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.602
- type: map_at_10
value: 42.18
- type: map_at_100
value: 43.6
- type: map_at_1000
value: 43.704
- type: map_at_3
value: 38.413000000000004
- type: map_at_5
value: 40.626
- type: mrr_at_1
value: 37.344
- type: mrr_at_10
value: 47.638000000000005
- type: mrr_at_100
value: 48.485
- type: mrr_at_1000
value: 48.52
- type: mrr_at_3
value: 44.867000000000004
- type: mrr_at_5
value: 46.566
- type: ndcg_at_1
value: 37.344
- type: ndcg_at_10
value: 48.632
- type: ndcg_at_100
value: 54.215
- type: ndcg_at_1000
value: 55.981
- type: ndcg_at_3
value: 42.681999999999995
- type: ndcg_at_5
value: 45.732
- type: precision_at_1
value: 37.344
- type: precision_at_10
value: 8.932
- type: precision_at_100
value: 1.376
- type: precision_at_1000
value: 0.17099999999999999
- type: precision_at_3
value: 20.276
- type: precision_at_5
value: 14.726
- type: recall_at_1
value: 30.602
- type: recall_at_10
value: 62.273
- type: recall_at_100
value: 85.12100000000001
- type: recall_at_1000
value: 96.439
- type: recall_at_3
value: 45.848
- type: recall_at_5
value: 53.615
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackProgrammersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.952
- type: map_at_10
value: 35.177
- type: map_at_100
value: 36.59
- type: map_at_1000
value: 36.703
- type: map_at_3
value: 31.261
- type: map_at_5
value: 33.222
- type: mrr_at_1
value: 29.337999999999997
- type: mrr_at_10
value: 40.152
- type: mrr_at_100
value: 40.963
- type: mrr_at_1000
value: 41.016999999999996
- type: mrr_at_3
value: 36.91
- type: mrr_at_5
value: 38.685
- type: ndcg_at_1
value: 29.337999999999997
- type: ndcg_at_10
value: 41.994
- type: ndcg_at_100
value: 47.587
- type: ndcg_at_1000
value: 49.791000000000004
- type: ndcg_at_3
value: 35.27
- type: ndcg_at_5
value: 38.042
- type: precision_at_1
value: 29.337999999999997
- type: precision_at_10
value: 8.276
- type: precision_at_100
value: 1.276
- type: precision_at_1000
value: 0.164
- type: precision_at_3
value: 17.161
- type: precision_at_5
value: 12.671
- type: recall_at_1
value: 23.952
- type: recall_at_10
value: 57.267
- type: recall_at_100
value: 80.886
- type: recall_at_1000
value: 95.611
- type: recall_at_3
value: 38.622
- type: recall_at_5
value: 45.811
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.092083333333335
- type: map_at_10
value: 37.2925
- type: map_at_100
value: 38.57041666666666
- type: map_at_1000
value: 38.68141666666667
- type: map_at_3
value: 34.080000000000005
- type: map_at_5
value: 35.89958333333333
- type: mrr_at_1
value: 31.94758333333333
- type: mrr_at_10
value: 41.51049999999999
- type: mrr_at_100
value: 42.36099999999999
- type: mrr_at_1000
value: 42.4125
- type: mrr_at_3
value: 38.849583333333335
- type: mrr_at_5
value: 40.448249999999994
- type: ndcg_at_1
value: 31.94758333333333
- type: ndcg_at_10
value: 43.17633333333333
- type: ndcg_at_100
value: 48.45241666666668
- type: ndcg_at_1000
value: 50.513999999999996
- type: ndcg_at_3
value: 37.75216666666667
- type: ndcg_at_5
value: 40.393833333333326
- type: precision_at_1
value: 31.94758333333333
- type: precision_at_10
value: 7.688916666666666
- type: precision_at_100
value: 1.2250833333333333
- type: precision_at_1000
value: 0.1595
- type: precision_at_3
value: 17.465999999999998
- type: precision_at_5
value: 12.548083333333333
- type: recall_at_1
value: 27.092083333333335
- type: recall_at_10
value: 56.286583333333326
- type: recall_at_100
value: 79.09033333333333
- type: recall_at_1000
value: 93.27483333333335
- type: recall_at_3
value: 41.35325
- type: recall_at_5
value: 48.072750000000006
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackStatsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.825
- type: map_at_10
value: 33.723
- type: map_at_100
value: 34.74
- type: map_at_1000
value: 34.824
- type: map_at_3
value: 31.369000000000003
- type: map_at_5
value: 32.533
- type: mrr_at_1
value: 29.293999999999997
- type: mrr_at_10
value: 36.84
- type: mrr_at_100
value: 37.681
- type: mrr_at_1000
value: 37.742
- type: mrr_at_3
value: 34.79
- type: mrr_at_5
value: 35.872
- type: ndcg_at_1
value: 29.293999999999997
- type: ndcg_at_10
value: 38.385999999999996
- type: ndcg_at_100
value: 43.327
- type: ndcg_at_1000
value: 45.53
- type: ndcg_at_3
value: 33.985
- type: ndcg_at_5
value: 35.817
- type: precision_at_1
value: 29.293999999999997
- type: precision_at_10
value: 6.12
- type: precision_at_100
value: 0.9329999999999999
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_3
value: 14.621999999999998
- type: precision_at_5
value: 10.030999999999999
- type: recall_at_1
value: 25.825
- type: recall_at_10
value: 49.647000000000006
- type: recall_at_100
value: 72.32300000000001
- type: recall_at_1000
value: 88.62400000000001
- type: recall_at_3
value: 37.366
- type: recall_at_5
value: 41.957
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackTexRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 18.139
- type: map_at_10
value: 26.107000000000003
- type: map_at_100
value: 27.406999999999996
- type: map_at_1000
value: 27.535999999999998
- type: map_at_3
value: 23.445
- type: map_at_5
value: 24.916
- type: mrr_at_1
value: 21.817
- type: mrr_at_10
value: 29.99
- type: mrr_at_100
value: 31.052000000000003
- type: mrr_at_1000
value: 31.128
- type: mrr_at_3
value: 27.627000000000002
- type: mrr_at_5
value: 29.005
- type: ndcg_at_1
value: 21.817
- type: ndcg_at_10
value: 31.135
- type: ndcg_at_100
value: 37.108000000000004
- type: ndcg_at_1000
value: 39.965
- type: ndcg_at_3
value: 26.439
- type: ndcg_at_5
value: 28.655
- type: precision_at_1
value: 21.817
- type: precision_at_10
value: 5.757000000000001
- type: precision_at_100
value: 1.036
- type: precision_at_1000
value: 0.147
- type: precision_at_3
value: 12.537
- type: precision_at_5
value: 9.229
- type: recall_at_1
value: 18.139
- type: recall_at_10
value: 42.272999999999996
- type: recall_at_100
value: 68.657
- type: recall_at_1000
value: 88.93799999999999
- type: recall_at_3
value: 29.266
- type: recall_at_5
value: 34.892
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackUnixRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.755000000000003
- type: map_at_10
value: 37.384
- type: map_at_100
value: 38.56
- type: map_at_1000
value: 38.655
- type: map_at_3
value: 34.214
- type: map_at_5
value: 35.96
- type: mrr_at_1
value: 32.369
- type: mrr_at_10
value: 41.625
- type: mrr_at_100
value: 42.449
- type: mrr_at_1000
value: 42.502
- type: mrr_at_3
value: 38.899
- type: mrr_at_5
value: 40.489999999999995
- type: ndcg_at_1
value: 32.369
- type: ndcg_at_10
value: 43.287
- type: ndcg_at_100
value: 48.504999999999995
- type: ndcg_at_1000
value: 50.552
- type: ndcg_at_3
value: 37.549
- type: ndcg_at_5
value: 40.204
- type: precision_at_1
value: 32.369
- type: precision_at_10
value: 7.425
- type: precision_at_100
value: 1.134
- type: precision_at_1000
value: 0.14200000000000002
- type: precision_at_3
value: 17.102
- type: precision_at_5
value: 12.107999999999999
- type: recall_at_1
value: 27.755000000000003
- type: recall_at_10
value: 57.071000000000005
- type: recall_at_100
value: 79.456
- type: recall_at_1000
value: 93.54299999999999
- type: recall_at_3
value: 41.298
- type: recall_at_5
value: 48.037
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWebmastersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.855
- type: map_at_10
value: 34.53
- type: map_at_100
value: 36.167
- type: map_at_1000
value: 36.394999999999996
- type: map_at_3
value: 31.037
- type: map_at_5
value: 33.119
- type: mrr_at_1
value: 30.631999999999998
- type: mrr_at_10
value: 39.763999999999996
- type: mrr_at_100
value: 40.77
- type: mrr_at_1000
value: 40.826
- type: mrr_at_3
value: 36.495
- type: mrr_at_5
value: 38.561
- type: ndcg_at_1
value: 30.631999999999998
- type: ndcg_at_10
value: 40.942
- type: ndcg_at_100
value: 47.07
- type: ndcg_at_1000
value: 49.363
- type: ndcg_at_3
value: 35.038000000000004
- type: ndcg_at_5
value: 38.161
- type: precision_at_1
value: 30.631999999999998
- type: precision_at_10
value: 7.983999999999999
- type: precision_at_100
value: 1.6070000000000002
- type: precision_at_1000
value: 0.246
- type: precision_at_3
value: 16.206
- type: precision_at_5
value: 12.253
- type: recall_at_1
value: 24.855
- type: recall_at_10
value: 53.291999999999994
- type: recall_at_100
value: 80.283
- type: recall_at_1000
value: 94.309
- type: recall_at_3
value: 37.257
- type: recall_at_5
value: 45.282
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWordpressRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 21.208
- type: map_at_10
value: 30.512
- type: map_at_100
value: 31.496000000000002
- type: map_at_1000
value: 31.595000000000002
- type: map_at_3
value: 27.904
- type: map_at_5
value: 29.491
- type: mrr_at_1
value: 22.736
- type: mrr_at_10
value: 32.379999999999995
- type: mrr_at_100
value: 33.245000000000005
- type: mrr_at_1000
value: 33.315
- type: mrr_at_3
value: 29.945
- type: mrr_at_5
value: 31.488
- type: ndcg_at_1
value: 22.736
- type: ndcg_at_10
value: 35.643
- type: ndcg_at_100
value: 40.535
- type: ndcg_at_1000
value: 43.042
- type: ndcg_at_3
value: 30.625000000000004
- type: ndcg_at_5
value: 33.323
- type: precision_at_1
value: 22.736
- type: precision_at_10
value: 5.6930000000000005
- type: precision_at_100
value: 0.889
- type: precision_at_1000
value: 0.122
- type: precision_at_3
value: 13.431999999999999
- type: precision_at_5
value: 9.575
- type: recall_at_1
value: 21.208
- type: recall_at_10
value: 49.47
- type: recall_at_100
value: 71.71499999999999
- type: recall_at_1000
value: 90.55499999999999
- type: recall_at_3
value: 36.124
- type: recall_at_5
value: 42.606
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 11.363
- type: map_at_10
value: 20.312
- type: map_at_100
value: 22.225
- type: map_at_1000
value: 22.411
- type: map_at_3
value: 16.68
- type: map_at_5
value: 18.608
- type: mrr_at_1
value: 25.537
- type: mrr_at_10
value: 37.933
- type: mrr_at_100
value: 38.875
- type: mrr_at_1000
value: 38.911
- type: mrr_at_3
value: 34.387
- type: mrr_at_5
value: 36.51
- type: ndcg_at_1
value: 25.537
- type: ndcg_at_10
value: 28.82
- type: ndcg_at_100
value: 36.341
- type: ndcg_at_1000
value: 39.615
- type: ndcg_at_3
value: 23.01
- type: ndcg_at_5
value: 25.269000000000002
- type: precision_at_1
value: 25.537
- type: precision_at_10
value: 9.153
- type: precision_at_100
value: 1.7319999999999998
- type: precision_at_1000
value: 0.234
- type: precision_at_3
value: 17.22
- type: precision_at_5
value: 13.629
- type: recall_at_1
value: 11.363
- type: recall_at_10
value: 35.382999999999996
- type: recall_at_100
value: 61.367000000000004
- type: recall_at_1000
value: 79.699
- type: recall_at_3
value: 21.495
- type: recall_at_5
value: 27.42
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.65
- type: map_at_10
value: 20.742
- type: map_at_100
value: 29.614
- type: map_at_1000
value: 31.373
- type: map_at_3
value: 14.667
- type: map_at_5
value: 17.186
- type: mrr_at_1
value: 69.75
- type: mrr_at_10
value: 76.762
- type: mrr_at_100
value: 77.171
- type: mrr_at_1000
value: 77.179
- type: mrr_at_3
value: 75.125
- type: mrr_at_5
value: 76.287
- type: ndcg_at_1
value: 57.62500000000001
- type: ndcg_at_10
value: 42.370999999999995
- type: ndcg_at_100
value: 47.897
- type: ndcg_at_1000
value: 55.393
- type: ndcg_at_3
value: 46.317
- type: ndcg_at_5
value: 43.906
- type: precision_at_1
value: 69.75
- type: precision_at_10
value: 33.95
- type: precision_at_100
value: 10.885
- type: precision_at_1000
value: 2.2239999999999998
- type: precision_at_3
value: 49.75
- type: precision_at_5
value: 42.3
- type: recall_at_1
value: 9.65
- type: recall_at_10
value: 26.117
- type: recall_at_100
value: 55.084
- type: recall_at_1000
value: 78.62400000000001
- type: recall_at_3
value: 15.823
- type: recall_at_5
value: 19.652
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 47.885
- type: f1
value: 42.99567641346983
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.97
- type: map_at_10
value: 80.34599999999999
- type: map_at_100
value: 80.571
- type: map_at_1000
value: 80.584
- type: map_at_3
value: 79.279
- type: map_at_5
value: 79.94
- type: mrr_at_1
value: 76.613
- type: mrr_at_10
value: 85.15700000000001
- type: mrr_at_100
value: 85.249
- type: mrr_at_1000
value: 85.252
- type: mrr_at_3
value: 84.33800000000001
- type: mrr_at_5
value: 84.89
- type: ndcg_at_1
value: 76.613
- type: ndcg_at_10
value: 84.53399999999999
- type: ndcg_at_100
value: 85.359
- type: ndcg_at_1000
value: 85.607
- type: ndcg_at_3
value: 82.76599999999999
- type: ndcg_at_5
value: 83.736
- type: precision_at_1
value: 76.613
- type: precision_at_10
value: 10.206
- type: precision_at_100
value: 1.083
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 31.913000000000004
- type: precision_at_5
value: 19.769000000000002
- type: recall_at_1
value: 70.97
- type: recall_at_10
value: 92.674
- type: recall_at_100
value: 95.985
- type: recall_at_1000
value: 97.57000000000001
- type: recall_at_3
value: 87.742
- type: recall_at_5
value: 90.28
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.494
- type: map_at_10
value: 36.491
- type: map_at_100
value: 38.550000000000004
- type: map_at_1000
value: 38.726
- type: map_at_3
value: 31.807000000000002
- type: map_at_5
value: 34.299
- type: mrr_at_1
value: 44.907000000000004
- type: mrr_at_10
value: 53.146
- type: mrr_at_100
value: 54.013999999999996
- type: mrr_at_1000
value: 54.044000000000004
- type: mrr_at_3
value: 50.952
- type: mrr_at_5
value: 52.124
- type: ndcg_at_1
value: 44.907000000000004
- type: ndcg_at_10
value: 44.499
- type: ndcg_at_100
value: 51.629000000000005
- type: ndcg_at_1000
value: 54.367
- type: ndcg_at_3
value: 40.900999999999996
- type: ndcg_at_5
value: 41.737
- type: precision_at_1
value: 44.907000000000004
- type: precision_at_10
value: 12.346
- type: precision_at_100
value: 1.974
- type: precision_at_1000
value: 0.246
- type: precision_at_3
value: 27.366
- type: precision_at_5
value: 19.846
- type: recall_at_1
value: 22.494
- type: recall_at_10
value: 51.156
- type: recall_at_100
value: 77.11200000000001
- type: recall_at_1000
value: 93.44
- type: recall_at_3
value: 36.574
- type: recall_at_5
value: 42.361
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 38.568999999999996
- type: map_at_10
value: 58.485
- type: map_at_100
value: 59.358999999999995
- type: map_at_1000
value: 59.429
- type: map_at_3
value: 55.217000000000006
- type: map_at_5
value: 57.236
- type: mrr_at_1
value: 77.137
- type: mrr_at_10
value: 82.829
- type: mrr_at_100
value: 83.04599999999999
- type: mrr_at_1000
value: 83.05399999999999
- type: mrr_at_3
value: 81.904
- type: mrr_at_5
value: 82.50800000000001
- type: ndcg_at_1
value: 77.137
- type: ndcg_at_10
value: 67.156
- type: ndcg_at_100
value: 70.298
- type: ndcg_at_1000
value: 71.65700000000001
- type: ndcg_at_3
value: 62.535
- type: ndcg_at_5
value: 65.095
- type: precision_at_1
value: 77.137
- type: precision_at_10
value: 13.911999999999999
- type: precision_at_100
value: 1.6389999999999998
- type: precision_at_1000
value: 0.182
- type: precision_at_3
value: 39.572
- type: precision_at_5
value: 25.766
- type: recall_at_1
value: 38.568999999999996
- type: recall_at_10
value: 69.56099999999999
- type: recall_at_100
value: 81.931
- type: recall_at_1000
value: 90.91799999999999
- type: recall_at_3
value: 59.358999999999995
- type: recall_at_5
value: 64.416
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 88.45600000000002
- type: ap
value: 84.09725115338568
- type: f1
value: 88.41874909080512
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.404999999999998
- type: map_at_10
value: 33.921
- type: map_at_100
value: 35.116
- type: map_at_1000
value: 35.164
- type: map_at_3
value: 30.043999999999997
- type: map_at_5
value: 32.327
- type: mrr_at_1
value: 21.977
- type: mrr_at_10
value: 34.505
- type: mrr_at_100
value: 35.638999999999996
- type: mrr_at_1000
value: 35.68
- type: mrr_at_3
value: 30.703999999999997
- type: mrr_at_5
value: 32.96
- type: ndcg_at_1
value: 21.963
- type: ndcg_at_10
value: 40.859
- type: ndcg_at_100
value: 46.614
- type: ndcg_at_1000
value: 47.789
- type: ndcg_at_3
value: 33.007999999999996
- type: ndcg_at_5
value: 37.084
- type: precision_at_1
value: 21.963
- type: precision_at_10
value: 6.493
- type: precision_at_100
value: 0.938
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 14.155000000000001
- type: precision_at_5
value: 10.544
- type: recall_at_1
value: 21.404999999999998
- type: recall_at_10
value: 62.175000000000004
- type: recall_at_100
value: 88.786
- type: recall_at_1000
value: 97.738
- type: recall_at_3
value: 40.925
- type: recall_at_5
value: 50.722
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.50661194710442
- type: f1
value: 93.30311193153668
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 73.24669402644778
- type: f1
value: 54.23122108002977
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 72.61936785474109
- type: f1
value: 70.52644941025565
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.76529926025555
- type: f1
value: 77.26872729322514
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.39450293021839
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 31.757796879839294
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.62512146657428
- type: mrr
value: 33.84624322066173
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.462
- type: map_at_10
value: 14.947
- type: map_at_100
value: 19.344
- type: map_at_1000
value: 20.933
- type: map_at_3
value: 10.761999999999999
- type: map_at_5
value: 12.744
- type: mrr_at_1
value: 47.988
- type: mrr_at_10
value: 57.365
- type: mrr_at_100
value: 57.931
- type: mrr_at_1000
value: 57.96
- type: mrr_at_3
value: 54.85
- type: mrr_at_5
value: 56.569
- type: ndcg_at_1
value: 46.129999999999995
- type: ndcg_at_10
value: 38.173
- type: ndcg_at_100
value: 35.983
- type: ndcg_at_1000
value: 44.507000000000005
- type: ndcg_at_3
value: 42.495
- type: ndcg_at_5
value: 41.019
- type: precision_at_1
value: 47.678
- type: precision_at_10
value: 28.731
- type: precision_at_100
value: 9.232
- type: precision_at_1000
value: 2.202
- type: precision_at_3
value: 39.628
- type: precision_at_5
value: 35.851
- type: recall_at_1
value: 6.462
- type: recall_at_10
value: 18.968
- type: recall_at_100
value: 37.131
- type: recall_at_1000
value: 67.956
- type: recall_at_3
value: 11.905000000000001
- type: recall_at_5
value: 15.097
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.335
- type: map_at_10
value: 46.611999999999995
- type: map_at_100
value: 47.632000000000005
- type: map_at_1000
value: 47.661
- type: map_at_3
value: 41.876999999999995
- type: map_at_5
value: 44.799
- type: mrr_at_1
value: 34.125
- type: mrr_at_10
value: 49.01
- type: mrr_at_100
value: 49.75
- type: mrr_at_1000
value: 49.768
- type: mrr_at_3
value: 45.153
- type: mrr_at_5
value: 47.589999999999996
- type: ndcg_at_1
value: 34.125
- type: ndcg_at_10
value: 54.777
- type: ndcg_at_100
value: 58.914
- type: ndcg_at_1000
value: 59.521
- type: ndcg_at_3
value: 46.015
- type: ndcg_at_5
value: 50.861000000000004
- type: precision_at_1
value: 34.125
- type: precision_at_10
value: 9.166
- type: precision_at_100
value: 1.149
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 21.147
- type: precision_at_5
value: 15.469
- type: recall_at_1
value: 30.335
- type: recall_at_10
value: 77.194
- type: recall_at_100
value: 94.812
- type: recall_at_1000
value: 99.247
- type: recall_at_3
value: 54.681000000000004
- type: recall_at_5
value: 65.86800000000001
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.62
- type: map_at_10
value: 84.536
- type: map_at_100
value: 85.167
- type: map_at_1000
value: 85.184
- type: map_at_3
value: 81.607
- type: map_at_5
value: 83.423
- type: mrr_at_1
value: 81.36
- type: mrr_at_10
value: 87.506
- type: mrr_at_100
value: 87.601
- type: mrr_at_1000
value: 87.601
- type: mrr_at_3
value: 86.503
- type: mrr_at_5
value: 87.179
- type: ndcg_at_1
value: 81.36
- type: ndcg_at_10
value: 88.319
- type: ndcg_at_100
value: 89.517
- type: ndcg_at_1000
value: 89.60900000000001
- type: ndcg_at_3
value: 85.423
- type: ndcg_at_5
value: 86.976
- type: precision_at_1
value: 81.36
- type: precision_at_10
value: 13.415
- type: precision_at_100
value: 1.529
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.342999999999996
- type: precision_at_5
value: 24.534
- type: recall_at_1
value: 70.62
- type: recall_at_10
value: 95.57600000000001
- type: recall_at_100
value: 99.624
- type: recall_at_1000
value: 99.991
- type: recall_at_3
value: 87.22
- type: recall_at_5
value: 91.654
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 60.826438478212744
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 64.24027467551447
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.997999999999999
- type: map_at_10
value: 14.267
- type: map_at_100
value: 16.843
- type: map_at_1000
value: 17.229
- type: map_at_3
value: 9.834
- type: map_at_5
value: 11.92
- type: mrr_at_1
value: 24.7
- type: mrr_at_10
value: 37.685
- type: mrr_at_100
value: 38.704
- type: mrr_at_1000
value: 38.747
- type: mrr_at_3
value: 34.150000000000006
- type: mrr_at_5
value: 36.075
- type: ndcg_at_1
value: 24.7
- type: ndcg_at_10
value: 23.44
- type: ndcg_at_100
value: 32.617000000000004
- type: ndcg_at_1000
value: 38.628
- type: ndcg_at_3
value: 21.747
- type: ndcg_at_5
value: 19.076
- type: precision_at_1
value: 24.7
- type: precision_at_10
value: 12.47
- type: precision_at_100
value: 2.564
- type: precision_at_1000
value: 0.4
- type: precision_at_3
value: 20.767
- type: precision_at_5
value: 17.06
- type: recall_at_1
value: 4.997999999999999
- type: recall_at_10
value: 25.3
- type: recall_at_100
value: 52.048
- type: recall_at_1000
value: 81.093
- type: recall_at_3
value: 12.642999999999999
- type: recall_at_5
value: 17.312
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 85.44942006292234
- type: cos_sim_spearman
value: 79.80930790660699
- type: euclidean_pearson
value: 82.93400777494863
- type: euclidean_spearman
value: 80.04664991110705
- type: manhattan_pearson
value: 82.93551681854949
- type: manhattan_spearman
value: 80.03156736837379
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 85.63574059135726
- type: cos_sim_spearman
value: 76.80552915288186
- type: euclidean_pearson
value: 82.46368529820518
- type: euclidean_spearman
value: 76.60338474719275
- type: manhattan_pearson
value: 82.4558617035968
- type: manhattan_spearman
value: 76.57936082895705
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 86.24116811084211
- type: cos_sim_spearman
value: 88.10998662068769
- type: euclidean_pearson
value: 87.04961732352689
- type: euclidean_spearman
value: 88.12543945864087
- type: manhattan_pearson
value: 86.9905224528854
- type: manhattan_spearman
value: 88.07827944705546
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 84.74847296555048
- type: cos_sim_spearman
value: 82.66200957916445
- type: euclidean_pearson
value: 84.48132256004965
- type: euclidean_spearman
value: 82.67915286000596
- type: manhattan_pearson
value: 84.44950477268334
- type: manhattan_spearman
value: 82.63327639173352
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 87.23056258027053
- type: cos_sim_spearman
value: 88.92791680286955
- type: euclidean_pearson
value: 88.13819235461933
- type: euclidean_spearman
value: 88.87294661361716
- type: manhattan_pearson
value: 88.14212133687899
- type: manhattan_spearman
value: 88.88551854529777
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 82.64179522732887
- type: cos_sim_spearman
value: 84.25028809903114
- type: euclidean_pearson
value: 83.40175015236979
- type: euclidean_spearman
value: 84.23369296429406
- type: manhattan_pearson
value: 83.43768174261321
- type: manhattan_spearman
value: 84.27855229214734
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 88.20378955494732
- type: cos_sim_spearman
value: 88.46863559173111
- type: euclidean_pearson
value: 88.8249295811663
- type: euclidean_spearman
value: 88.6312737724905
- type: manhattan_pearson
value: 88.87744466378827
- type: manhattan_spearman
value: 88.82908423767314
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 69.91342028796086
- type: cos_sim_spearman
value: 69.71495021867864
- type: euclidean_pearson
value: 70.65334330405646
- type: euclidean_spearman
value: 69.4321253472211
- type: manhattan_pearson
value: 70.59743494727465
- type: manhattan_spearman
value: 69.11695509297482
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 85.42451709766952
- type: cos_sim_spearman
value: 86.07166710670508
- type: euclidean_pearson
value: 86.12711421258899
- type: euclidean_spearman
value: 86.05232086925126
- type: manhattan_pearson
value: 86.15591089932126
- type: manhattan_spearman
value: 86.0890128623439
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 87.1976344717285
- type: mrr
value: 96.3703145075694
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 59.511
- type: map_at_10
value: 69.724
- type: map_at_100
value: 70.208
- type: map_at_1000
value: 70.22800000000001
- type: map_at_3
value: 66.986
- type: map_at_5
value: 68.529
- type: mrr_at_1
value: 62.333000000000006
- type: mrr_at_10
value: 70.55
- type: mrr_at_100
value: 70.985
- type: mrr_at_1000
value: 71.004
- type: mrr_at_3
value: 68.611
- type: mrr_at_5
value: 69.728
- type: ndcg_at_1
value: 62.333000000000006
- type: ndcg_at_10
value: 74.265
- type: ndcg_at_100
value: 76.361
- type: ndcg_at_1000
value: 76.82900000000001
- type: ndcg_at_3
value: 69.772
- type: ndcg_at_5
value: 71.94800000000001
- type: precision_at_1
value: 62.333000000000006
- type: precision_at_10
value: 9.9
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 27.444000000000003
- type: precision_at_5
value: 18
- type: recall_at_1
value: 59.511
- type: recall_at_10
value: 87.156
- type: recall_at_100
value: 96.5
- type: recall_at_1000
value: 100
- type: recall_at_3
value: 75.2
- type: recall_at_5
value: 80.661
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.81683168316832
- type: cos_sim_ap
value: 95.74716566563774
- type: cos_sim_f1
value: 90.64238745574103
- type: cos_sim_precision
value: 91.7093142272262
- type: cos_sim_recall
value: 89.60000000000001
- type: dot_accuracy
value: 99.69405940594059
- type: dot_ap
value: 91.09013507754594
- type: dot_f1
value: 84.54227113556779
- type: dot_precision
value: 84.58458458458459
- type: dot_recall
value: 84.5
- type: euclidean_accuracy
value: 99.81782178217821
- type: euclidean_ap
value: 95.6324301072609
- type: euclidean_f1
value: 90.58341862845445
- type: euclidean_precision
value: 92.76729559748428
- type: euclidean_recall
value: 88.5
- type: manhattan_accuracy
value: 99.81980198019802
- type: manhattan_ap
value: 95.68510494437183
- type: manhattan_f1
value: 90.58945191313342
- type: manhattan_precision
value: 93.79014989293361
- type: manhattan_recall
value: 87.6
- type: max_accuracy
value: 99.81980198019802
- type: max_ap
value: 95.74716566563774
- type: max_f1
value: 90.64238745574103
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 67.63761899427078
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 36.572473369697235
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 53.63000245208579
- type: mrr
value: 54.504193722943725
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.300791939416545
- type: cos_sim_spearman
value: 31.662904057924123
- type: dot_pearson
value: 26.21198530758316
- type: dot_spearman
value: 27.006921548904263
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.197
- type: map_at_10
value: 1.752
- type: map_at_100
value: 10.795
- type: map_at_1000
value: 27.18
- type: map_at_3
value: 0.5890000000000001
- type: map_at_5
value: 0.938
- type: mrr_at_1
value: 74
- type: mrr_at_10
value: 85.833
- type: mrr_at_100
value: 85.833
- type: mrr_at_1000
value: 85.833
- type: mrr_at_3
value: 85.333
- type: mrr_at_5
value: 85.833
- type: ndcg_at_1
value: 69
- type: ndcg_at_10
value: 70.22
- type: ndcg_at_100
value: 55.785
- type: ndcg_at_1000
value: 52.93600000000001
- type: ndcg_at_3
value: 72.084
- type: ndcg_at_5
value: 71.184
- type: precision_at_1
value: 74
- type: precision_at_10
value: 75.2
- type: precision_at_100
value: 57.3
- type: precision_at_1000
value: 23.302
- type: precision_at_3
value: 77.333
- type: precision_at_5
value: 75.6
- type: recall_at_1
value: 0.197
- type: recall_at_10
value: 2.019
- type: recall_at_100
value: 14.257
- type: recall_at_1000
value: 50.922
- type: recall_at_3
value: 0.642
- type: recall_at_5
value: 1.043
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.803
- type: map_at_10
value: 10.407
- type: map_at_100
value: 16.948
- type: map_at_1000
value: 18.424
- type: map_at_3
value: 5.405
- type: map_at_5
value: 6.908
- type: mrr_at_1
value: 36.735
- type: mrr_at_10
value: 50.221000000000004
- type: mrr_at_100
value: 51.388
- type: mrr_at_1000
value: 51.402
- type: mrr_at_3
value: 47.278999999999996
- type: mrr_at_5
value: 49.626
- type: ndcg_at_1
value: 34.694
- type: ndcg_at_10
value: 25.507
- type: ndcg_at_100
value: 38.296
- type: ndcg_at_1000
value: 49.492000000000004
- type: ndcg_at_3
value: 29.006999999999998
- type: ndcg_at_5
value: 25.979000000000003
- type: precision_at_1
value: 36.735
- type: precision_at_10
value: 22.041
- type: precision_at_100
value: 8.02
- type: precision_at_1000
value: 1.567
- type: precision_at_3
value: 28.571
- type: precision_at_5
value: 24.490000000000002
- type: recall_at_1
value: 2.803
- type: recall_at_10
value: 16.378
- type: recall_at_100
value: 50.489
- type: recall_at_1000
value: 85.013
- type: recall_at_3
value: 6.505
- type: recall_at_5
value: 9.243
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 70.55579999999999
- type: ap
value: 14.206982753316227
- type: f1
value: 54.372142814964285
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 56.57611771363893
- type: f1
value: 56.924172639063144
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 52.82304915719759
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 85.92716218632653
- type: cos_sim_ap
value: 73.73359122546046
- type: cos_sim_f1
value: 68.42559487116262
- type: cos_sim_precision
value: 64.22124508215691
- type: cos_sim_recall
value: 73.21899736147758
- type: dot_accuracy
value: 80.38981939560112
- type: dot_ap
value: 54.61060862444974
- type: dot_f1
value: 53.45710627400769
- type: dot_precision
value: 44.87638839125761
- type: dot_recall
value: 66.09498680738787
- type: euclidean_accuracy
value: 86.02849138701794
- type: euclidean_ap
value: 73.95673761922404
- type: euclidean_f1
value: 68.6783042394015
- type: euclidean_precision
value: 65.1063829787234
- type: euclidean_recall
value: 72.66490765171504
- type: manhattan_accuracy
value: 85.9808070572808
- type: manhattan_ap
value: 73.9050720058029
- type: manhattan_f1
value: 68.57560618983794
- type: manhattan_precision
value: 63.70839936608558
- type: manhattan_recall
value: 74.24802110817942
- type: max_accuracy
value: 86.02849138701794
- type: max_ap
value: 73.95673761922404
- type: max_f1
value: 68.6783042394015
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.72783017037295
- type: cos_sim_ap
value: 85.52705223340233
- type: cos_sim_f1
value: 77.91659078492079
- type: cos_sim_precision
value: 73.93378032764221
- type: cos_sim_recall
value: 82.35294117647058
- type: dot_accuracy
value: 85.41739434159972
- type: dot_ap
value: 77.17734818118443
- type: dot_f1
value: 71.63473589973144
- type: dot_precision
value: 66.96123719622415
- type: dot_recall
value: 77.00954727440714
- type: euclidean_accuracy
value: 88.68125897465751
- type: euclidean_ap
value: 85.47712213906692
- type: euclidean_f1
value: 77.81419950830664
- type: euclidean_precision
value: 75.37162649733006
- type: euclidean_recall
value: 80.42038805050817
- type: manhattan_accuracy
value: 88.67349710870494
- type: manhattan_ap
value: 85.46506475241955
- type: manhattan_f1
value: 77.87259084890393
- type: manhattan_precision
value: 74.54929577464789
- type: manhattan_recall
value: 81.50600554357868
- type: max_accuracy
value: 88.72783017037295
- type: max_ap
value: 85.52705223340233
- type: max_f1
value: 77.91659078492079
language:
- en
license: mit
---
# gte-large
General Text Embeddings (GTE) model. [Towards General Text Embeddings with Multi-stage Contrastive Learning](https://arxiv.org/abs/2308.03281)
The GTE models are trained by Alibaba DAMO Academy. They are based on the BERT framework and come in three sizes: [GTE-large](https://huggingface.co/thenlper/gte-large), [GTE-base](https://huggingface.co/thenlper/gte-base), and [GTE-small](https://huggingface.co/thenlper/gte-small). The models are trained on a large-scale corpus of relevant text pairs covering a wide range of domains and scenarios, which makes them applicable to a variety of downstream text-embedding tasks, including **information retrieval**, **semantic textual similarity**, and **text reranking**.
## Metrics
We compare the performance of the GTE models with other popular text embedding models on the MTEB benchmark. For more detailed comparison results, please refer to the [MTEB leaderboard](https://huggingface.co/spaces/mteb/leaderboard).
| Model Name | Model Size (GB) | Dimension | Sequence Length | Average (56) | Clustering (11) | Pair Classification (3) | Reranking (4) | Retrieval (15) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [**gte-large**](https://huggingface.co/thenlper/gte-large) | 0.67 | 1024 | 512 | **63.13** | 46.84 | 85.00 | 59.13 | 52.22 | 83.35 | 31.66 | 73.33 |
| [**gte-base**](https://huggingface.co/thenlper/gte-base) | 0.22 | 768 | 512 | **62.39** | 46.2 | 84.57 | 58.61 | 51.14 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1.34 | 1024| 512 | 62.25 | 44.49 | 86.03 | 56.61 | 50.56 | 82.05 | 30.19 | 75.24 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 0.44 | 768 | 512 | 61.5 | 43.80 | 85.73 | 55.91 | 50.29 | 81.05 | 30.28 | 73.84 |
| [**gte-small**](https://huggingface.co/thenlper/gte-small) | 0.07 | 384 | 512 | **61.36** | 44.89 | 83.54 | 57.7 | 49.46 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | - | 1536 | 8192 | 60.99 | 45.9 | 84.89 | 56.32 | 49.25 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 0.13 | 384 | 512 | 59.93 | 39.92 | 84.67 | 54.32 | 49.04 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 9.73 | 768 | 512 | 59.51 | 43.72 | 85.06 | 56.42 | 42.24 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 0.44 | 768 | 514 | 57.78 | 43.69 | 83.04 | 59.36 | 43.81 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 28.27 | 4096 | 2048 | 57.59 | 38.93 | 81.9 | 55.65 | 48.22 | 77.74 | 33.6 | 66.19 |
| [all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) | 0.13 | 384 | 512 | 56.53 | 41.81 | 82.41 | 58.44 | 42.69 | 79.8 | 27.9 | 63.21 |
| [all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) | 0.09 | 384 | 512 | 56.26 | 42.35 | 82.37 | 58.04 | 41.95 | 78.9 | 30.81 | 63.05 |
| [contriever-base-msmarco](https://huggingface.co/nthakur/contriever-base-msmarco) | 0.44 | 768 | 512 | 56.00 | 41.1 | 82.54 | 53.14 | 41.88 | 76.51 | 30.36 | 66.68 |
| [sentence-t5-base](https://huggingface.co/sentence-transformers/sentence-t5-base) | 0.22 | 768 | 512 | 55.27 | 40.21 | 85.18 | 53.09 | 33.63 | 81.14 | 31.39 | 69.81 |
## Usage
Use with `transformers`:
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel


def average_pool(last_hidden_states: Tensor,
                 attention_mask: Tensor) -> Tensor:
    # Zero out padding positions, then average token states over the sequence dimension.
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


input_texts = [
    "what is the capital of China?",
    "how to implement quick sort in python?",
    "Beijing",
    "sorting algorithms"
]

tokenizer = AutoTokenizer.from_pretrained("thenlper/gte-large")
model = AutoModel.from_pretrained("thenlper/gte-large")

# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')

outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])

# (Optionally) normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)

# Similarity of the first text (the query) against the remaining texts
scores = (embeddings[:1] @ embeddings[1:].T) * 100
print(scores.tolist())
```
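The final matrix product works because the embeddings were L2-normalized: the dot product of unit vectors equals their cosine similarity (scaled by 100 above). A minimal sketch with plain PyTorch and random stand-in embeddings (no model download required; the shapes are illustrative, not from the card):

```python
import torch
import torch.nn.functional as F

# Toy "embeddings" standing in for model outputs (4 texts, dim 8).
emb = torch.randn(4, 8)
emb = F.normalize(emb, p=2, dim=1)  # make every row unit-length

# After normalization, dot product == cosine similarity.
scores = (emb[:1] @ emb[1:].T) * 100  # query (row 0) vs. the other 3 texts
cos = F.cosine_similarity(emb[0].unsqueeze(0), emb[1:], dim=1) * 100
assert torch.allclose(scores.squeeze(0), cos, atol=1e-5)
print(scores.tolist())
```

Because the vectors are unit-length, each score is bounded in [-100, 100], which makes thresholds comparable across queries.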
Use with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim
sentences = ['That is a happy person', 'That is a very happy person']
model = SentenceTransformer('thenlper/gte-large')
embeddings = model.encode(sentences)
print(cos_sim(embeddings[0], embeddings[1]))
```
### Limitation
This model supports English texts only, and any input longer than 512 tokens is truncated to that maximum length.
### Citation
If you find our paper or models helpful, please consider citing them as follows:
```
@misc{li2023general,
title={Towards General Text Embeddings with Multi-stage Contrastive Learning},
author={Zehan Li and Xin Zhang and Yanzhao Zhang and Dingkun Long and Pengjun Xie and Meishan Zhang},
year={2023},
eprint={2308.03281},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 67,907 | [
[
-0.04010009765625,
-0.043914794921875,
0.022796630859375,
0.0186309814453125,
-0.0157470703125,
-0.00872802734375,
-0.02508544921875,
-0.023681640625,
0.036224365234375,
0.005390167236328125,
-0.038116455078125,
-0.05413818359375,
-0.05572509765625,
-0.0011348724365234375,
-0.0159759521484375,
0.073486328125,
-0.004154205322265625,
-0.01441192626953125,
0.00272369384765625,
-0.019622802734375,
-0.0138397216796875,
-0.031158447265625,
-0.049591064453125,
-0.0098114013671875,
0.035400390625,
0.020233154296875,
0.05743408203125,
0.04803466796875,
0.0243682861328125,
0.029296875,
-0.01513671875,
-0.0018482208251953125,
-0.028045654296875,
-0.01287841796875,
0.01239013671875,
-0.0256500244140625,
-0.034423828125,
0.0091400146484375,
0.039947509765625,
0.0240020751953125,
-0.0011873245239257812,
0.0121002197265625,
0.018218994140625,
0.030670166015625,
-0.0227203369140625,
0.016815185546875,
-0.01629638671875,
0.01047515869140625,
-0.007598876953125,
0.01340484619140625,
-0.0284576416015625,
-0.0234222412109375,
0.022064208984375,
-0.034698486328125,
0.01129150390625,
0.01340484619140625,
0.10308837890625,
0.013153076171875,
-0.024688720703125,
-0.0292510986328125,
-0.0129547119140625,
0.067138671875,
-0.068603515625,
0.0243682861328125,
0.0158843994140625,
-0.002872467041015625,
-0.0016908645629882812,
-0.06329345703125,
-0.0452880859375,
-0.004528045654296875,
-0.04193115234375,
0.02008056640625,
-0.0181121826171875,
-0.0009045600891113281,
0.0227203369140625,
0.0386962890625,
-0.054290771484375,
0.0048065185546875,
-0.0028209686279296875,
-0.01174163818359375,
0.046112060546875,
0.0019445419311523438,
0.0362548828125,
-0.039093017578125,
-0.034759521484375,
-0.0236358642578125,
-0.02899169921875,
0.01256561279296875,
0.0197296142578125,
-0.00392913818359375,
-0.047271728515625,
0.043975830078125,
-0.00543975830078125,
0.035888671875,
0.006862640380859375,
-0.0031757354736328125,
0.05126953125,
-0.02484130859375,
-0.023345947265625,
-0.0260467529296875,
0.08935546875,
0.032012939453125,
0.01378631591796875,
0.0013208389282226562,
-0.0092010498046875,
-0.00756072998046875,
-0.01544189453125,
-0.07183837890625,
-0.0201416015625,
0.01554107666015625,
-0.046600341796875,
-0.0225830078125,
0.00974273681640625,
-0.07196044921875,
-0.00890350341796875,
-0.0032558441162109375,
0.044189453125,
-0.04718017578125,
-0.009429931640625,
0.0064544677734375,
-0.013458251953125,
0.0234832763671875,
-0.0030918121337890625,
-0.058380126953125,
0.00702667236328125,
0.02764892578125,
0.069580078125,
0.01276397705078125,
-0.022857666015625,
-0.0108184814453125,
-0.00760650634765625,
-0.00547027587890625,
0.039154052734375,
-0.0254669189453125,
-0.00759124755859375,
0.0017833709716796875,
0.0122528076171875,
-0.0250701904296875,
-0.0147705078125,
0.06011962890625,
-0.007354736328125,
0.04412841796875,
-0.0099945068359375,
-0.043701171875,
-0.01335906982421875,
0.016571044921875,
-0.046966552734375,
0.0911865234375,
0.005481719970703125,
-0.0792236328125,
0.0157623291015625,
-0.04766845703125,
-0.00643157958984375,
-0.031585693359375,
-0.0073699951171875,
-0.0540771484375,
-0.008575439453125,
0.0391845703125,
0.05316162109375,
-0.0158843994140625,
0.00333404541015625,
-0.01513671875,
-0.0165252685546875,
0.00616455078125,
-0.0178070068359375,
0.07403564453125,
0.006862640380859375,
-0.0482177734375,
0.013641357421875,
-0.04534912109375,
0.00690460205078125,
0.0269927978515625,
-0.0100555419921875,
-0.0186309814453125,
-0.007068634033203125,
0.00997161865234375,
0.0340576171875,
0.022979736328125,
-0.03662109375,
0.01873779296875,
-0.03326416015625,
0.060150146484375,
0.06158447265625,
-0.002101898193359375,
0.0229339599609375,
-0.0256805419921875,
0.0154571533203125,
0.0093994140625,
0.0234527587890625,
-0.00978851318359375,
-0.043609619140625,
-0.0682373046875,
-0.04052734375,
0.03643798828125,
0.0443115234375,
-0.057037353515625,
0.06024169921875,
-0.03204345703125,
-0.040374755859375,
-0.044158935546875,
0.0003228187561035156,
0.03070068359375,
0.0220947265625,
0.03582763671875,
-0.005229949951171875,
-0.033294677734375,
-0.08111572265625,
-0.004177093505859375,
0.0008473396301269531,
0.0007519721984863281,
0.0270843505859375,
0.059051513671875,
-0.0235748291015625,
0.058441162109375,
-0.050018310546875,
-0.0140533447265625,
-0.014862060546875,
0.012725830078125,
0.035003662109375,
0.042755126953125,
0.059539794921875,
-0.055908203125,
-0.0555419921875,
-0.019317626953125,
-0.061737060546875,
0.01166534423828125,
-0.0042724609375,
-0.0244903564453125,
0.01551055908203125,
0.034881591796875,
-0.06439208984375,
0.034515380859375,
0.042205810546875,
-0.04638671875,
0.03314208984375,
-0.0254364013671875,
0.0080718994140625,
-0.10284423828125,
0.0037364959716796875,
0.0156402587890625,
-0.01727294921875,
-0.04437255859375,
0.00750732421875,
0.004863739013671875,
0.008636474609375,
-0.0257110595703125,
0.048980712890625,
-0.04827880859375,
0.01654052734375,
0.00966644287109375,
0.0311279296875,
-0.0048675537109375,
0.0567626953125,
-0.0100860595703125,
0.052337646484375,
0.0404052734375,
-0.0269927978515625,
0.01255035400390625,
0.034332275390625,
-0.032806396484375,
0.039215087890625,
-0.053558349609375,
0.0026569366455078125,
-0.0079803466796875,
0.0233001708984375,
-0.08013916015625,
-0.01129150390625,
0.028045654296875,
-0.04559326171875,
0.0296630859375,
0.0017137527465820312,
-0.04547119140625,
-0.053924560546875,
-0.05377197265625,
0.01355743408203125,
0.03472900390625,
-0.041290283203125,
0.02764892578125,
0.0167236328125,
-0.0113983154296875,
-0.060028076171875,
-0.059814453125,
0.0002655982971191406,
-0.02044677734375,
-0.06072998046875,
0.043304443359375,
-0.015350341796875,
0.00959014892578125,
0.011322021484375,
0.01025390625,
0.003650665283203125,
-0.00788116455078125,
0.0128173828125,
0.02545166015625,
-0.0186309814453125,
0.0009984970092773438,
-0.002628326416015625,
-0.0072784423828125,
-0.00968170166015625,
-0.005199432373046875,
0.05682373046875,
-0.018280029296875,
-0.004283905029296875,
-0.04180908203125,
0.02008056640625,
0.036956787109375,
-0.00701141357421875,
0.0711669921875,
0.07147216796875,
-0.034149169921875,
0.00615692138671875,
-0.03973388671875,
-0.008087158203125,
-0.03826904296875,
0.027557373046875,
-0.033203125,
-0.07318115234375,
0.05426025390625,
0.0229949951171875,
0.01131439208984375,
0.07159423828125,
0.04345703125,
-0.00016367435455322266,
0.0814208984375,
0.04034423828125,
-0.0257110595703125,
0.047576904296875,
-0.052215576171875,
0.02069091796875,
-0.07110595703125,
-0.0254364013671875,
-0.030609130859375,
-0.0404052734375,
-0.0640869140625,
-0.0343017578125,
0.0120086669921875,
0.01326751708984375,
-0.04095458984375,
0.0408935546875,
-0.04559326171875,
0.011962890625,
0.0408935546875,
0.02081298828125,
-0.00998687744140625,
0.000469207763671875,
-0.030609130859375,
-0.01544952392578125,
-0.04638671875,
-0.0273284912109375,
0.06591796875,
0.04254150390625,
0.0360107421875,
0.007171630859375,
0.050384521484375,
0.00948333740234375,
0.0174713134765625,
-0.05126953125,
0.043975830078125,
-0.01300048828125,
-0.04644775390625,
-0.0208282470703125,
-0.041656494140625,
-0.07000732421875,
0.024169921875,
-0.026519775390625,
-0.07061767578125,
0.01172637939453125,
-0.01053619384765625,
-0.0288543701171875,
0.03424072265625,
-0.06170654296875,
0.07440185546875,
0.0003654956817626953,
-0.02520751953125,
-0.00443267822265625,
-0.0487060546875,
0.02020263671875,
0.03466796875,
0.01154327392578125,
0.00360107421875,
-0.00739288330078125,
0.0662841796875,
-0.0343017578125,
0.05255126953125,
-0.014312744140625,
0.0024394989013671875,
0.0233917236328125,
-0.019317626953125,
0.04840087890625,
0.007213592529296875,
0.0008778572082519531,
0.0026531219482421875,
-0.0198822021484375,
-0.0399169921875,
-0.0374755859375,
0.06353759765625,
-0.06805419921875,
-0.041778564453125,
-0.04443359375,
-0.027618408203125,
-0.005901336669921875,
0.0166778564453125,
0.038421630859375,
0.03399658203125,
-0.0007662773132324219,
0.0338134765625,
0.050201416015625,
-0.0347900390625,
0.0601806640625,
-0.0017194747924804688,
0.0024662017822265625,
-0.050872802734375,
0.06365966796875,
0.005161285400390625,
0.004650115966796875,
0.030975341796875,
0.005710601806640625,
-0.032958984375,
-0.0240020751953125,
-0.0271148681640625,
0.04510498046875,
-0.042327880859375,
-0.0120086669921875,
-0.059906005859375,
-0.0330810546875,
-0.03814697265625,
-0.010894775390625,
-0.0172271728515625,
-0.033416748046875,
-0.039459228515625,
-0.0174713134765625,
0.0259552001953125,
0.056549072265625,
-0.003757476806640625,
0.01605224609375,
-0.03729248046875,
0.0219573974609375,
0.017333984375,
0.030670166015625,
0.0014524459838867188,
-0.054046630859375,
-0.0262908935546875,
-0.0053863525390625,
-0.027862548828125,
-0.06549072265625,
0.0391845703125,
0.00554656982421875,
0.047515869140625,
0.0295562744140625,
-0.01203155517578125,
0.0517578125,
-0.03521728515625,
0.07110595703125,
0.0281982421875,
-0.07373046875,
0.0277557373046875,
-0.016632080078125,
0.0128631591796875,
0.029083251953125,
0.0360107421875,
-0.04193115234375,
-0.0231781005859375,
-0.061676025390625,
-0.07940673828125,
0.047576904296875,
0.0288238525390625,
0.00893402099609375,
-0.0078887939453125,
0.0250091552734375,
-0.01161956787109375,
0.00795745849609375,
-0.0723876953125,
-0.055206298828125,
-0.024383544921875,
-0.041290283203125,
-0.0179901123046875,
-0.02655029296875,
0.004016876220703125,
-0.0297698974609375,
0.054107666015625,
-0.0007872581481933594,
0.06365966796875,
0.0248565673828125,
-0.016265869140625,
0.0296783447265625,
0.0113983154296875,
0.05682373046875,
0.019439697265625,
-0.0103912353515625,
-0.002033233642578125,
0.01751708984375,
-0.04119873046875,
0.0052947998046875,
0.018768310546875,
-0.00931549072265625,
0.00402069091796875,
0.02813720703125,
0.0662841796875,
0.0176239013671875,
-0.0227508544921875,
0.055145263671875,
-0.00988006591796875,
-0.0299530029296875,
-0.021728515625,
-0.00887298583984375,
0.016448974609375,
0.0241241455078125,
0.0244293212890625,
-0.007129669189453125,
0.002864837646484375,
-0.038482666015625,
0.0234832763671875,
0.027862548828125,
-0.035308837890625,
-0.02044677734375,
0.051361083984375,
0.00054931640625,
-0.0008826255798339844,
0.0413818359375,
-0.020782470703125,
-0.050445556640625,
0.045928955078125,
0.034393310546875,
0.06512451171875,
-0.00833892822265625,
0.024139404296875,
0.05657958984375,
0.02911376953125,
-0.002574920654296875,
0.01168060302734375,
0.0271453857421875,
-0.0377197265625,
-0.027191162109375,
-0.035614013671875,
0.0030307769775390625,
0.01947021484375,
-0.035797119140625,
0.027862548828125,
-0.0269317626953125,
-0.01513671875,
-0.006443023681640625,
0.017059326171875,
-0.0560302734375,
0.01708984375,
0.005886077880859375,
0.074951171875,
-0.057891845703125,
0.061126708984375,
0.047698974609375,
-0.042083740234375,
-0.060333251953125,
0.0013103485107421875,
-0.0105743408203125,
-0.0579833984375,
0.0330810546875,
0.025146484375,
0.0210723876953125,
0.0067291259765625,
-0.035736083984375,
-0.0682373046875,
0.10540771484375,
0.0157623291015625,
-0.0290374755859375,
-0.01947021484375,
0.0017271041870117188,
0.038787841796875,
-0.0254058837890625,
0.0562744140625,
0.0308685302734375,
0.040679931640625,
-0.01288604736328125,
-0.051055908203125,
0.0254669189453125,
-0.03302001953125,
0.00318145751953125,
0.0082550048828125,
-0.082275390625,
0.07989501953125,
-0.005054473876953125,
-0.007541656494140625,
0.0151519775390625,
0.059814453125,
0.01318359375,
0.0020542144775390625,
0.037200927734375,
0.069580078125,
0.04644775390625,
-0.028106689453125,
0.08526611328125,
-0.0210723876953125,
0.053924560546875,
0.053436279296875,
0.0292205810546875,
0.06378173828125,
0.019256591796875,
-0.01262664794921875,
0.061859130859375,
0.053741455078125,
-0.004184722900390625,
0.034454345703125,
0.012298583984375,
-0.00243377685546875,
-0.0049591064453125,
0.0126190185546875,
-0.03387451171875,
0.012237548828125,
0.0223388671875,
-0.039154052734375,
-0.0032176971435546875,
-0.000537872314453125,
0.025909423828125,
-0.007152557373046875,
-0.0087738037109375,
0.038421630859375,
0.0122528076171875,
-0.0236358642578125,
0.042449951171875,
0.005825042724609375,
0.0806884765625,
-0.025360107421875,
0.0133209228515625,
-0.014556884765625,
0.0101165771484375,
-0.01462554931640625,
-0.0740966796875,
0.0074920654296875,
-0.00502777099609375,
-0.0134735107421875,
-0.0113372802734375,
0.03875732421875,
-0.030029296875,
-0.02862548828125,
0.034912109375,
0.029449462890625,
-0.00276947021484375,
0.005084991455078125,
-0.085693359375,
-0.00438690185546875,
0.01151275634765625,
-0.057769775390625,
0.0199737548828125,
0.0330810546875,
0.01459503173828125,
0.04095458984375,
0.027862548828125,
-0.0140228271484375,
0.0172119140625,
-0.0049285888671875,
0.057373046875,
-0.06951904296875,
-0.036590576171875,
-0.07049560546875,
0.057830810546875,
-0.02947998046875,
-0.035919189453125,
0.054901123046875,
0.049713134765625,
0.044921875,
-0.006847381591796875,
0.046539306640625,
-0.026214599609375,
0.0335693359375,
-0.035614013671875,
0.056549072265625,
-0.055908203125,
-0.01302337646484375,
-0.024566650390625,
-0.0662841796875,
-0.0241851806640625,
0.057037353515625,
-0.0262298583984375,
0.01502227783203125,
0.061798095703125,
0.051666259765625,
0.00048279762268066406,
-0.014190673828125,
0.004108428955078125,
0.04010009765625,
0.02520751953125,
0.06610107421875,
0.03765869140625,
-0.06719970703125,
0.049224853515625,
-0.0294189453125,
-0.01088714599609375,
-0.022796630859375,
-0.060516357421875,
-0.0819091796875,
-0.0560302734375,
-0.0307464599609375,
-0.0279998779296875,
-0.014007568359375,
0.0682373046875,
0.048370361328125,
-0.057769775390625,
0.0016031265258789062,
0.000033795833587646484,
0.01082611083984375,
-0.00478363037109375,
-0.0249176025390625,
0.04931640625,
-0.006519317626953125,
-0.0750732421875,
0.0020923614501953125,
0.0017137527465820312,
0.01910400390625,
-0.0003185272216796875,
-0.0146026611328125,
-0.036224365234375,
0.00347137451171875,
0.050048828125,
0.01367950439453125,
-0.046722412109375,
-0.039947509765625,
0.0051116943359375,
-0.03289794921875,
0.012298583984375,
0.02490234375,
-0.0292816162109375,
-0.0007834434509277344,
0.048736572265625,
0.0263671875,
0.051727294921875,
-0.01386260986328125,
0.00885009765625,
-0.05181884765625,
0.0230712890625,
0.0017986297607421875,
0.043731689453125,
0.019134521484375,
-0.0175933837890625,
0.046295166015625,
0.0223846435546875,
-0.04681396484375,
-0.053680419921875,
-0.00974273681640625,
-0.0833740234375,
-0.01255035400390625,
0.075439453125,
-0.0214080810546875,
-0.024169921875,
0.0200347900390625,
-0.0224761962890625,
0.0243682861328125,
-0.0300445556640625,
0.045654296875,
0.06591796875,
0.003353118896484375,
-0.0251922607421875,
-0.047637939453125,
0.0323486328125,
0.0343017578125,
-0.057769775390625,
-0.024688720703125,
0.00968170166015625,
0.022064208984375,
0.028778076171875,
0.0357666015625,
-0.01085662841796875,
0.004619598388671875,
-0.00148773193359375,
0.0096893310546875,
0.0096588134765625,
-0.00592803955078125,
-0.010955810546875,
0.01128387451171875,
-0.016204833984375,
-0.0163726806640625
]
] |
MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7 | 2023-03-20T08:26:54.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"text-classification",
"zero-shot-classification",
"nli",
"multilingual",
"zh",
"ja",
"ar",
"ko",
"de",
"fr",
"es",
"pt",
"hi",
"id",
"it",
"tr",
"ru",
"bn",
"ur",
"mr",
"ta",
"vi",
"fa",
"pl",
"uk",
"nl",
"sv",
"he",
"sw",
"ps",
"dataset:MoritzLaurer/multilingual-NLI-26lang-2mil7",
"dataset:xnli",
"dataset:multi_nli",
"dataset:anli",
"dataset:fever",
"dataset:lingnli",
"dataset:alisawuffles/WANLI",
"arxiv:2111.09543",
"arxiv:2104.07179",
"arxiv:1809.05053",
"arxiv:1911.02116",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | MoritzLaurer | null | null | MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7 | 126 | 87,578 | transformers | 2022-08-22T16:59:35 | ---
language:
- multilingual
- zh
- ja
- ar
- ko
- de
- fr
- es
- pt
- hi
- id
- it
- tr
- ru
- bn
- ur
- mr
- ta
- vi
- fa
- pl
- uk
- nl
- sv
- he
- sw
- ps
tags:
- zero-shot-classification
- text-classification
- nli
- pytorch
license: mit
metrics:
- accuracy
datasets:
- MoritzLaurer/multilingual-NLI-26lang-2mil7
- xnli
- multi_nli
- anli
- fever
- lingnli
- alisawuffles/WANLI
pipeline_tag: zero-shot-classification
#- text-classification
widget:
- text: "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
candidate_labels: "politics, economy, entertainment, environment"
model-index: # info: https://github.com/huggingface/hub-docs/blame/main/modelcard.md
- name: DeBERTa-v3-base-xnli-multilingual-nli-2mil7
results:
- task:
type: text-classification # Required. Example: automatic-speech-recognition
name: Natural Language Inference # Optional. Example: Speech Recognition
dataset:
type: multi_nli # Required. Example: common_voice. Use dataset id from https://hf.co/datasets
name: MultiNLI-matched # Required. A pretty name for the dataset. Example: Common Voice (French)
split: validation_matched # Optional. Example: test
metrics:
- type: accuracy # Required. Example: wer. Use metric id from https://hf.co/metrics
value: 0.857 # Required. Example: 20.90
#name: # Optional. Example: Test WER
verified: false # Optional. If true, indicates that evaluation was generated by Hugging Face (vs. self-reported).
- task:
type: text-classification # Required. Example: automatic-speech-recognition
name: Natural Language Inference # Optional. Example: Speech Recognition
dataset:
type: multi_nli # Required. Example: common_voice. Use dataset id from https://hf.co/datasets
name: MultiNLI-mismatched # Required. A pretty name for the dataset. Example: Common Voice (French)
split: validation_mismatched # Optional. Example: test
metrics:
- type: accuracy # Required. Example: wer. Use metric id from https://hf.co/metrics
value: 0.856 # Required. Example: 20.90
#name: # Optional. Example: Test WER
verified: false # Optional. If true, indicates that evaluation was generated by Hugging Face (vs. self-reported).
- task:
type: text-classification # Required. Example: automatic-speech-recognition
name: Natural Language Inference # Optional. Example: Speech Recognition
dataset:
type: anli # Required. Example: common_voice. Use dataset id from https://hf.co/datasets
name: ANLI-all # Required. A pretty name for the dataset. Example: Common Voice (French)
split: test_r1+test_r2+test_r3 # Optional. Example: test
metrics:
- type: accuracy # Required. Example: wer. Use metric id from https://hf.co/metrics
value: 0.537 # Required. Example: 20.90
#name: # Optional. Example: Test WER
verified: false # Optional. If true, indicates that evaluation was generated by Hugging Face (vs. self-reported).
- task:
type: text-classification # Required. Example: automatic-speech-recognition
name: Natural Language Inference # Optional. Example: Speech Recognition
dataset:
type: anli # Required. Example: common_voice. Use dataset id from https://hf.co/datasets
name: ANLI-r3 # Required. A pretty name for the dataset. Example: Common Voice (French)
split: test_r3 # Optional. Example: test
metrics:
- type: accuracy # Required. Example: wer. Use metric id from https://hf.co/metrics
value: 0.497 # Required. Example: 20.90
#name: # Optional. Example: Test WER
verified: false # Optional. If true, indicates that evaluation was generated by Hugging Face (vs. self-reported).
- task:
type: text-classification # Required. Example: automatic-speech-recognition
name: Natural Language Inference # Optional. Example: Speech Recognition
dataset:
type: alisawuffles/WANLI # Required. Example: common_voice. Use dataset id from https://hf.co/datasets
name: WANLI # Required. A pretty name for the dataset. Example: Common Voice (French)
split: test # Optional. Example: test
metrics:
- type: accuracy # Required. Example: wer. Use metric id from https://hf.co/metrics
value: 0.732 # Required. Example: 20.90
#name: # Optional. Example: Test WER
verified: false # Optional. If true, indicates that evaluation was generated by Hugging Face (vs. self-reported).
- task:
type: text-classification # Required. Example: automatic-speech-recognition
name: Natural Language Inference # Optional. Example: Speech Recognition
dataset:
type: lingnli # Required. Example: common_voice. Use dataset id from https://hf.co/datasets
name: LingNLI # Required. A pretty name for the dataset. Example: Common Voice (French)
split: test # Optional. Example: test
metrics:
- type: accuracy # Required. Example: wer. Use metric id from https://hf.co/metrics
value: 0.788 # Required. Example: 20.90
#name: # Optional. Example: Test WER
verified: false # Optional. If true, indicates that evaluation was generated by Hugging Face (vs. self-reported).
- task:
type: text-classification # Required. Example: automatic-speech-recognition
name: Natural Language Inference # Optional. Example: Speech Recognition
dataset:
type: fever-nli # Required. Example: common_voice. Use dataset id from https://hf.co/datasets
name: fever-nli # Required. A pretty name for the dataset. Example: Common Voice (French)
split: test # Optional. Example: test
metrics:
- type: accuracy # Required. Example: wer. Use metric id from https://hf.co/metrics
value: 0.761 # Required. Example: 20.90
#name: # Optional. Example: Test WER
verified: false # Optional. If true, indicates that evaluation was generated by Hugging Face (vs. self-reported).
---
# Model card for mDeBERTa-v3-base-xnli-multilingual-nli-2mil7
## Model description
This multilingual model can perform natural language inference (NLI) on 100 languages and is therefore also suitable for multilingual zero-shot classification. The underlying mDeBERTa-v3-base model was pre-trained by Microsoft on the [CC100 multilingual dataset](https://huggingface.co/datasets/cc100) covering 100 languages. The model was then fine-tuned on the [XNLI dataset](https://huggingface.co/datasets/xnli) and on the [multilingual-NLI-26lang-2mil7 dataset](https://huggingface.co/datasets/MoritzLaurer/multilingual-NLI-26lang-2mil7). Together, the two datasets contain more than 2.7 million hypothesis-premise pairs in 27 languages spoken by more than 4 billion people.
As of December 2021, mDeBERTa-v3-base is the best-performing multilingual base-sized transformer model; it was introduced by Microsoft in [this paper](https://arxiv.org/pdf/2111.09543.pdf).
### How to use the model
#### Simple zero-shot classification pipeline
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification", model="MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7")
sequence_to_classify = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
candidate_labels = ["politics", "economy", "entertainment", "environment"]
output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
print(output)
```
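Under the hood, the zero-shot pipeline turns each candidate label into an NLI hypothesis (by default something like "This example is {label}."), scores each premise-hypothesis pair with the model, and softmaxes the per-label entailment scores. The following pure-Python sketch illustrates only this final ranking step; the entailment scores below are made-up numbers for illustration, not real model outputs:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def rank_labels(candidate_labels, entailment_scores):
    """Mimic the single-label zero-shot output format: softmax over the
    per-label entailment scores, labels sorted by descending probability."""
    probs = softmax(entailment_scores)
    ranked = sorted(zip(candidate_labels, probs), key=lambda x: -x[1])
    return {
        "labels": [label for label, _ in ranked],
        "scores": [round(p, 3) for _, p in ranked],
    }

# Made-up entailment scores for illustration (NOT real model outputs):
labels = ["politics", "economy", "entertainment", "environment"]
print(rank_labels(labels, [3.1, 0.4, -1.2, -0.5]))
```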
#### NLI use-case
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
model_name = "MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).to(device)
premise = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
hypothesis = "Emmanuel Macron is the President of France"
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
output = model(inputs["input_ids"].to(device))
prediction = torch.softmax(output["logits"][0], -1).tolist()
label_names = ["entailment", "neutral", "contradiction"]
prediction = {name: round(float(pred) * 100, 1) for pred, name in zip(prediction, label_names)}
print(prediction)
```
### Training data
This model was trained on the [multilingual-nli-26lang-2mil7 dataset](https://huggingface.co/datasets/MoritzLaurer/multilingual-NLI-26lang-2mil7) and the [XNLI](https://huggingface.co/datasets/xnli) validation dataset.
The multilingual-nli-26lang-2mil7 dataset contains 2 730 000 NLI hypothesis-premise pairs in 26 languages spoken by more than 4 billion people. The dataset contains 105 000 text pairs per language. It is based on the English datasets [MultiNLI](https://huggingface.co/datasets/multi_nli), [Fever-NLI](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), [ANLI](https://huggingface.co/datasets/anli), [LingNLI](https://arxiv.org/pdf/2104.07179.pdf) and [WANLI](https://huggingface.co/datasets/alisawuffles/WANLI) and was created using the latest open-source machine translation models. The languages in the dataset are: ['ar', 'bn', 'de', 'es', 'fa', 'fr', 'he', 'hi', 'id', 'it', 'ja', 'ko', 'mr', 'nl', 'pl', 'ps', 'pt', 'ru', 'sv', 'sw', 'ta', 'tr', 'uk', 'ur', 'vi', 'zh'] (see [ISO language codes](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes)). For more details, see the [datasheet](XXX). In addition, a sample of 105 000 text pairs was also added for English following the same sampling method as the other languages, leading to 27 languages.
Moreover, for each language a random set of 10% of the hypothesis-premise pairs was added where an English hypothesis was paired with the premise in the other language (and the same for English premises and other language hypotheses). This mix of languages in the text pairs should enable users to formulate a hypothesis in English for a target text in another language.
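As a rough illustration, the 10% cross-lingual mixing described above could be sketched as follows (the field names and sampling details here are hypothetical; the actual dataset construction may differ):

```python
import random

def mix_cross_lingual(pairs_target, pairs_english, fraction=0.1, seed=42):
    """For a given target language, replace the hypothesis of a random
    `fraction` of pairs with the English hypothesis of the same example,
    yielding mixed-language premise/hypothesis pairs.

    Each pair is a dict {"premise": ..., "hypothesis": ..., "label": ...};
    pairs_target[i] and pairs_english[i] are translations of the same example.
    """
    rng = random.Random(seed)
    n = len(pairs_target)
    mixed_idx = set(rng.sample(range(n), int(n * fraction)))
    out = []
    for i, pair in enumerate(pairs_target):
        if i in mixed_idx:
            pair = dict(pair, hypothesis=pairs_english[i]["hypothesis"])
        out.append(pair)
    return out
```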
The [XNLI](https://huggingface.co/datasets/xnli) validation set consists of 2490 professionally translated texts from English to 14 other languages (37350 texts in total) (see [this paper](https://arxiv.org/pdf/1809.05053.pdf)). Note that XNLI also contains a training set of 14 machine translated versions of the MultiNLI dataset for 14 languages, but this data was excluded due to quality issues with the machine translations from 2018.
Note that for evaluation purposes, three languages were excluded from the XNLI training data and only included in the test data: ["bg","el","th"]. This was done in order to test the performance of the model on languages it has not seen during NLI fine-tuning on 27 languages, but only during pre-training on 100 languages - see evaluation metrics below.
The total training dataset had a size of 3 287 280 hypothesis-premise pairs.
### Training procedure
mDeBERTa-v3-base-xnli-multilingual-nli-2mil7 was trained using the Hugging Face trainer with the following hyperparameters.
```
training_args = TrainingArguments(
    num_train_epochs=3,              # total number of training epochs
    learning_rate=2e-05,
    per_device_train_batch_size=32,  # batch size per device during training
    gradient_accumulation_steps=2,   # doubles the effective batch size to 64
    warmup_ratio=0.06,               # fraction of steps used for learning rate warmup
    weight_decay=0.01,               # strength of weight decay
    fp16=False                       # mDeBERTa does not support FP16
)
```
### Eval results
The model was evaluated on the XNLI test set in 15 languages (5010 texts per language, 75150 in total) and the English test sets of [MultiNLI](https://huggingface.co/datasets/multi_nli), [Fever-NLI](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), [ANLI](https://huggingface.co/datasets/anli), [LingNLI](https://arxiv.org/pdf/2104.07179.pdf) and [WANLI](https://huggingface.co/datasets/alisawuffles/WANLI). Note that multilingual NLI models are capable of classifying NLI texts without receiving NLI training data in the specific language (cross-lingual transfer). This means that the model is also able to do NLI on the other 73 languages mDeBERTa was pre-trained on, but performance is most likely lower than for those languages seen during NLI fine-tuning. The performance on the languages ["bg","el","th"] in the table below is a good indicator of this cross-lingual transfer, as these languages were not included in the training data.
|XNLI subsets|ar|bg|de|el|en|es|fr|hi|ru|sw|th|tr|ur|vi|zh|
| :---: |:---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
|Accuracy|0.794|0.822|0.824|0.809|0.871|0.832|0.823|0.769|0.803|0.746|0.786|0.792|0.744|0.793|0.803|
|Speed (text/sec, A100-GPU)|1344.0|1355.0|1472.0|1149.0|1697.0|1446.0|1278.0|1115.0|1380.0|1463.0|1713.0|1594.0|1189.0|877.0|1887.0|
|English Datasets|mnli_test_m|mnli_test_mm|anli_test|anli_test_r3|fever_test|ling_test|wanli_test|
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
|Accuracy|0.857|0.856|0.537|0.497|0.761|0.788|0.732|
|Speed (text/sec, A100-GPU)|1000.0|1009.0|794.0|672.0|374.0|1177.0|1468.0|
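For reference, the multilingual average implied by the XNLI accuracy row above can be computed directly:

```python
# Accuracy per XNLI test language, copied from the table above
xnli_accuracy = {
    "ar": 0.794, "bg": 0.822, "de": 0.824, "el": 0.809, "en": 0.871,
    "es": 0.832, "fr": 0.823, "hi": 0.769, "ru": 0.803, "sw": 0.746,
    "th": 0.786, "tr": 0.792, "ur": 0.744, "vi": 0.793, "zh": 0.803,
}
avg = sum(xnli_accuracy.values()) / len(xnli_accuracy)
print(round(avg, 3))  # 0.801 -- a few points above 80%, consistent with the literature
```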
Also note that if other multilingual models on the model hub claim performance of around 90% on languages other than English, the authors have most likely made a mistake during testing, since none of the latest papers show a multilingual average performance of more than a few points above 80% on XNLI (see [here](https://arxiv.org/pdf/2111.09543.pdf) or [here](https://arxiv.org/pdf/1911.02116.pdf)).
## Limitations and bias
Please consult the original DeBERTa-V3 paper and literature on different NLI datasets for potential biases. Moreover, note that the multilingual-nli-26lang-2mil7 dataset was created using machine translation, which reduces the quality of the data for a complex task like NLI. You can inspect the data via the Hugging Face [dataset viewer](https://huggingface.co/datasets/MoritzLaurer/multilingual-NLI-26lang-2mil7) for languages you are interested in. Note that grammatical errors introduced by machine translation are less of an issue for zero-shot classification, for which grammar is less important.
## Citation
If the dataset is useful for you, please cite the following article:
```
@article{laurer_less_2022,
title = {Less {Annotating}, {More} {Classifying} – {Addressing} the {Data} {Scarcity} {Issue} of {Supervised} {Machine} {Learning} with {Deep} {Transfer} {Learning} and {BERT} - {NLI}},
url = {https://osf.io/74b8k},
language = {en-us},
urldate = {2022-07-28},
journal = {Preprint},
author = {Laurer, Moritz and Atteveldt, Wouter van and Casas, Andreu Salleras and Welbers, Kasper},
month = jun,
year = {2022},
note = {Publisher: Open Science Framework},
}
```
## Ideas for cooperation or questions?
For updates on new models and datasets, follow me on [Twitter](https://twitter.com/MoritzLaurer).
If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or on [LinkedIn](https://www.linkedin.com/in/moritz-laurer/).
## Debugging and issues
Note that DeBERTa-v3 was released in late 2021 and older versions of Hugging Face Transformers can have issues running the model (e.g. errors in the tokenizer). Upgrading to Transformers==4.13 or higher may resolve some of these issues. Also note that mDeBERTa currently does not support FP16; see here: https://github.com/microsoft/DeBERTa/issues/77
| 16,132 | [
[
-0.0271453857421875,
-0.0283355712890625,
-0.00140380859375,
0.024078369140625,
-0.002521514892578125,
-0.0029659271240234375,
-0.025726318359375,
-0.04754638671875,
0.030364990234375,
0.01512908935546875,
-0.037750244140625,
-0.040802001953125,
-0.0341796875,
0.0264892578125,
-0.0101776123046875,
0.0791015625,
-0.0069427490234375,
0.01456451416015625,
0.01568603515625,
-0.024169921875,
-0.0164337158203125,
-0.04833984375,
-0.0560302734375,
-0.0220489501953125,
0.04241943359375,
0.0192413330078125,
0.042022705078125,
0.037689208984375,
0.02587890625,
0.018890380859375,
-0.007160186767578125,
0.006351470947265625,
-0.02520751953125,
-0.023712158203125,
0.0120697021484375,
-0.050384521484375,
-0.041595458984375,
-0.0008053779602050781,
0.053497314453125,
0.04510498046875,
-0.0038928985595703125,
0.01479339599609375,
-0.0034313201904296875,
0.048187255859375,
-0.0261993408203125,
0.006725311279296875,
-0.0291595458984375,
0.0096588134765625,
-0.0261993408203125,
0.008544921875,
-0.027557373046875,
-0.0076904296875,
0.0073699951171875,
-0.0310821533203125,
0.0009665489196777344,
0.006488800048828125,
0.0845947265625,
-0.004291534423828125,
-0.0264739990234375,
-0.006130218505859375,
-0.02642822265625,
0.07293701171875,
-0.0731201171875,
0.04339599609375,
0.0245208740234375,
-0.004665374755859375,
0.011444091796875,
-0.0163116455078125,
-0.05352783203125,
-0.0096588134765625,
-0.01708984375,
0.024871826171875,
-0.017425537109375,
-0.0124664306640625,
0.0285491943359375,
0.0206298828125,
-0.07977294921875,
0.013641357421875,
-0.0323486328125,
-0.0087127685546875,
0.06878662109375,
-0.002819061279296875,
0.034393310546875,
-0.031768798828125,
-0.014190673828125,
-0.0227508544921875,
-0.03851318359375,
0.015594482421875,
0.0269775390625,
0.03338623046875,
-0.04132080078125,
0.0325927734375,
-0.00921630859375,
0.058746337890625,
-0.01032257080078125,
-0.03955078125,
0.0657958984375,
-0.044647216796875,
-0.018890380859375,
0.006900787353515625,
0.08489990234375,
0.0274200439453125,
0.023040771484375,
-0.004329681396484375,
-0.006954193115234375,
-0.002368927001953125,
-0.028045654296875,
-0.056976318359375,
0.0002808570861816406,
0.022216796875,
-0.0266265869140625,
-0.01523590087890625,
-0.0010690689086914062,
-0.056976318359375,
0.00478363037109375,
-0.0189208984375,
0.01788330078125,
-0.046356201171875,
-0.046783447265625,
0.00545501708984375,
-0.002048492431640625,
0.02203369140625,
-0.01123046875,
-0.047027587890625,
0.0090484619140625,
0.01476287841796875,
0.061676025390625,
-0.01898193359375,
-0.046112060546875,
-0.0094146728515625,
-0.004486083984375,
-0.017333984375,
0.027557373046875,
-0.01186370849609375,
-0.0196685791015625,
-0.0072479248046875,
0.011566162109375,
-0.04278564453125,
-0.0273590087890625,
0.039703369140625,
-0.0171966552734375,
0.032135009765625,
-0.020233154296875,
-0.0282745361328125,
-0.01389312744140625,
0.0257110595703125,
-0.053802490234375,
0.09014892578125,
0.00390625,
-0.0723876953125,
0.0280609130859375,
-0.043060302734375,
-0.039459228515625,
-0.0207061767578125,
-0.0030765533447265625,
-0.033538818359375,
-0.0130767822265625,
0.0206298828125,
0.034942626953125,
-0.023895263671875,
0.038055419921875,
-0.01003265380859375,
-0.016143798828125,
0.0167236328125,
-0.0311279296875,
0.08636474609375,
0.0183868408203125,
-0.048431396484375,
-0.004322052001953125,
-0.077392578125,
0.01200103759765625,
0.025604248046875,
-0.0231170654296875,
-0.0133056640625,
-0.02899169921875,
0.017486572265625,
0.042449951171875,
-0.0027523040771484375,
-0.0428466796875,
0.00585174560546875,
-0.0501708984375,
0.0196533203125,
0.01629638671875,
-0.0213470458984375,
0.018585205078125,
-0.028656005859375,
0.035980224609375,
0.020660400390625,
0.0024890899658203125,
-0.01180267333984375,
-0.05767822265625,
-0.0660400390625,
-0.01271820068359375,
0.033447265625,
0.0660400390625,
-0.0694580078125,
0.0262298583984375,
-0.03619384765625,
-0.045379638671875,
-0.049957275390625,
0.01032257080078125,
0.0537109375,
0.0306549072265625,
0.0205078125,
0.0013332366943359375,
-0.05523681640625,
-0.0704345703125,
0.0010347366333007812,
-0.01181793212890625,
0.00943756103515625,
0.01439666748046875,
0.047271728515625,
-0.0165863037109375,
0.05218505859375,
-0.00968170166015625,
-0.020843505859375,
-0.043426513671875,
-0.00014603137969970703,
0.037322998046875,
0.038055419921875,
0.0645751953125,
-0.05859375,
-0.050140380859375,
0.021148681640625,
-0.06854248046875,
-0.0009274482727050781,
-0.0116119384765625,
-0.0024261474609375,
0.05517578125,
0.0269012451171875,
-0.031585693359375,
0.02288818359375,
0.0645751953125,
-0.0122222900390625,
0.028594970703125,
-0.006656646728515625,
0.0167694091796875,
-0.101318359375,
0.02398681640625,
0.01369476318359375,
0.0020084381103515625,
-0.0667724609375,
-0.0013399124145507812,
0.007495880126953125,
-0.0038547515869140625,
-0.043060302734375,
0.05731201171875,
-0.03411865234375,
0.02362060546875,
0.0001068115234375,
0.0150299072265625,
0.003910064697265625,
0.05157470703125,
0.017364501953125,
0.058746337890625,
0.051605224609375,
-0.043182373046875,
-0.0024547576904296875,
0.009429931640625,
-0.0269622802734375,
0.0175323486328125,
-0.052978515625,
-0.01371002197265625,
-0.00524139404296875,
0.0036716461181640625,
-0.0628662109375,
-0.00927734375,
0.01345062255859375,
-0.03497314453125,
0.036834716796875,
-0.004673004150390625,
-0.03375244140625,
-0.03466796875,
-0.0176849365234375,
0.0276031494140625,
0.031707763671875,
-0.032806396484375,
0.040069580078125,
0.0218505859375,
0.002532958984375,
-0.06683349609375,
-0.0758056640625,
0.0011138916015625,
-0.0178985595703125,
-0.05731201171875,
0.0254974365234375,
-0.0084381103515625,
-0.0112152099609375,
-0.008026123046875,
0.0192413330078125,
-0.00469207763671875,
0.0003352165222167969,
0.010009765625,
0.024505615234375,
-0.0216522216796875,
-0.01532745361328125,
0.0026683807373046875,
-0.002838134765625,
-0.007350921630859375,
-0.0101165771484375,
0.0467529296875,
-0.017364501953125,
-0.005794525146484375,
-0.04046630859375,
0.03131103515625,
0.0399169921875,
-0.021820068359375,
0.0731201171875,
0.06396484375,
-0.03289794921875,
0.0240020751953125,
-0.042022705078125,
0.006969451904296875,
-0.02789306640625,
0.035247802734375,
-0.052215576171875,
-0.047210693359375,
0.049774169921875,
0.0330810546875,
0.005252838134765625,
0.04644775390625,
0.035491943359375,
0.021820068359375,
0.0887451171875,
0.037567138671875,
-0.028778076171875,
0.0177001953125,
-0.05194091796875,
0.0191497802734375,
-0.05084228515625,
-0.0192108154296875,
-0.03863525390625,
-0.007289886474609375,
-0.061737060546875,
-0.01473236083984375,
0.0178375244140625,
0.005435943603515625,
-0.00494384765625,
0.035064697265625,
-0.0128021240234375,
0.0258026123046875,
0.042266845703125,
-0.0019931793212890625,
0.009857177734375,
0.005710601806640625,
-0.022705078125,
-0.01160430908203125,
-0.0694580078125,
-0.0238189697265625,
0.0770263671875,
0.0284881591796875,
0.031463623046875,
0.01861572265625,
0.0531005859375,
-0.0202484130859375,
0.0209503173828125,
-0.035369873046875,
0.028961181640625,
-0.0136566162109375,
-0.0587158203125,
-0.01202392578125,
-0.047607421875,
-0.06988525390625,
0.018829345703125,
-0.0192413330078125,
-0.0616455078125,
0.024627685546875,
-0.006893157958984375,
-0.022003173828125,
0.033966064453125,
-0.06494140625,
0.06292724609375,
-0.0263214111328125,
-0.0208740234375,
0.00711822509765625,
-0.0498046875,
0.037109375,
-0.0159912109375,
0.025604248046875,
-0.0116729736328125,
0.021331787109375,
0.07513427734375,
-0.0102691650390625,
0.059356689453125,
-0.012481689453125,
-0.007587432861328125,
0.00397491455078125,
-0.019134521484375,
0.01488494873046875,
0.00687408447265625,
-0.03533935546875,
0.04827880859375,
0.01904296875,
-0.03582763671875,
-0.029205322265625,
0.0562744140625,
-0.06756591796875,
-0.03363037109375,
-0.04150390625,
-0.03399658203125,
-0.0029754638671875,
0.027313232421875,
0.042327880859375,
0.035675048828125,
-0.003612518310546875,
0.0017490386962890625,
0.038360595703125,
-0.0308685302734375,
0.037261962890625,
0.0291290283203125,
-0.034576416015625,
-0.03289794921875,
0.0755615234375,
0.0180206298828125,
0.0217132568359375,
0.02557373046875,
0.012786865234375,
-0.02294921875,
-0.03717041015625,
-0.057891845703125,
0.0426025390625,
-0.03753662109375,
-0.0172119140625,
-0.067138671875,
-0.020904541015625,
-0.04736328125,
0.002086639404296875,
-0.0239105224609375,
-0.025299072265625,
-0.00812530517578125,
-0.0036182403564453125,
0.021942138671875,
0.03399658203125,
-0.002101898193359375,
0.016510009765625,
-0.055145263671875,
0.00563812255859375,
-0.01180267333984375,
0.019561767578125,
0.00014257431030273438,
-0.052154541015625,
-0.0294342041015625,
0.024169921875,
0.0015707015991210938,
-0.05865478515625,
0.054046630859375,
0.031280517578125,
0.043182373046875,
0.0265655517578125,
-0.009918212890625,
0.058868408203125,
-0.0292510986328125,
0.05987548828125,
0.0198822021484375,
-0.06768798828125,
0.04010009765625,
-0.01169586181640625,
0.0247802734375,
0.040008544921875,
0.056365966796875,
-0.04620361328125,
-0.027557373046875,
-0.03851318359375,
-0.0601806640625,
0.06488037109375,
0.0208740234375,
0.00004971027374267578,
-0.0025577545166015625,
0.02984619140625,
-0.007396697998046875,
0.007495880126953125,
-0.0726318359375,
-0.05755615234375,
-0.0032100677490234375,
-0.0166168212890625,
-0.0236968994140625,
-0.01491546630859375,
-0.0021800994873046875,
-0.041290283203125,
0.07147216796875,
-0.0122833251953125,
0.0195465087890625,
0.0256195068359375,
-0.015289306640625,
0.004180908203125,
0.0077972412109375,
0.051605224609375,
0.04705810546875,
-0.0182952880859375,
-0.0150604248046875,
0.0335693359375,
-0.0367431640625,
0.01146697998046875,
0.0180816650390625,
-0.0201416015625,
0.0146636962890625,
0.036865234375,
0.08404541015625,
0.0021190643310546875,
-0.052581787109375,
0.033111572265625,
-0.0231475830078125,
-0.01947021484375,
-0.025665283203125,
-0.008148193359375,
-0.0024089813232421875,
0.002124786376953125,
0.019134521484375,
-0.00020825862884521484,
0.003749847412109375,
-0.03497314453125,
0.01514434814453125,
0.0177001953125,
-0.0240936279296875,
-0.0399169921875,
0.04718017578125,
-0.0007648468017578125,
-0.0014524459838867188,
0.040313720703125,
-0.0303192138671875,
-0.0399169921875,
0.040252685546875,
0.04925537109375,
0.039031982421875,
-0.031036376953125,
0.018890380859375,
0.060577392578125,
0.0347900390625,
0.003337860107421875,
0.032867431640625,
0.0296630859375,
-0.0672607421875,
-0.043182373046875,
-0.049285888671875,
-0.00803375244140625,
0.016357421875,
-0.053955078125,
0.029327392578125,
-0.00482177734375,
-0.0049896240234375,
0.01343536376953125,
0.01491546630859375,
-0.055999755859375,
0.024169921875,
0.0218658447265625,
0.0772705078125,
-0.0809326171875,
0.08660888671875,
0.05096435546875,
-0.036895751953125,
-0.06915283203125,
-0.01152801513671875,
-0.0013580322265625,
-0.0579833984375,
0.0472412109375,
0.029876708984375,
0.008148193359375,
-0.006389617919921875,
-0.0067901611328125,
-0.07769775390625,
0.0697021484375,
0.022705078125,
-0.0443115234375,
0.0020294189453125,
0.028656005859375,
0.044769287109375,
-0.0212554931640625,
0.0374755859375,
0.05206298828125,
0.03533935546875,
-0.00923919677734375,
-0.08245849609375,
0.001674652099609375,
-0.052459716796875,
-0.0029544830322265625,
0.01111602783203125,
-0.046356201171875,
0.06024169921875,
-0.020843505859375,
-0.0100860595703125,
-0.0037822723388671875,
0.044830322265625,
0.0257110595703125,
0.023345947265625,
0.037872314453125,
0.047698974609375,
0.0552978515625,
-0.0174713134765625,
0.09197998046875,
-0.0428466796875,
0.0200653076171875,
0.0589599609375,
-0.0198211669921875,
0.06610107421875,
0.0262908935546875,
-0.01325225830078125,
0.028533935546875,
0.043487548828125,
0.00643157958984375,
0.0182342529296875,
-0.00759124755859375,
-0.0038909912109375,
0.007843017578125,
-0.0143280029296875,
-0.032958984375,
0.03472900390625,
0.0218963623046875,
-0.0176849365234375,
0.00548553466796875,
0.03118896484375,
0.0379638671875,
-0.018463134765625,
0.004573822021484375,
0.04339599609375,
-0.00511932373046875,
-0.057464599609375,
0.08172607421875,
0.005828857421875,
0.07330322265625,
-0.04400634765625,
0.0174713134765625,
-0.0229339599609375,
0.0116119384765625,
-0.026092529296875,
-0.056884765625,
0.02789306640625,
-0.000522613525390625,
-0.0182037353515625,
-0.004970550537109375,
0.0237274169921875,
-0.049957275390625,
-0.056732177734375,
0.044219970703125,
0.0443115234375,
0.01593017578125,
0.0028820037841796875,
-0.0797119140625,
0.016143798828125,
0.03125,
-0.029937744140625,
0.024169921875,
0.016845703125,
-0.0047607421875,
0.04754638671875,
0.0443115234375,
0.0137481689453125,
0.01172637939453125,
0.01568603515625,
0.0538330078125,
-0.04412841796875,
-0.01319122314453125,
-0.05865478515625,
0.039276123046875,
-0.00827789306640625,
-0.0394287109375,
0.0716552734375,
0.057464599609375,
0.081787109375,
-0.0009112358093261719,
0.060577392578125,
-0.01383209228515625,
0.03753662109375,
-0.042022705078125,
0.048095703125,
-0.058868408203125,
-0.00382232666015625,
-0.024688720703125,
-0.050872802734375,
-0.04266357421875,
0.038360595703125,
-0.0178680419921875,
0.00983428955078125,
0.049468994140625,
0.06878662109375,
0.003955841064453125,
-0.0083465576171875,
0.0254974365234375,
0.018310546875,
0.01837158203125,
0.04461669921875,
0.024566650390625,
-0.0665283203125,
0.035736083984375,
-0.060394287109375,
-0.020477294921875,
0.00714874267578125,
-0.045989990234375,
-0.074951171875,
-0.053802490234375,
-0.0400390625,
-0.033905029296875,
0.0002944469451904297,
0.07452392578125,
0.056365966796875,
-0.0849609375,
-0.0374755859375,
0.0165863037109375,
-0.00322723388671875,
-0.02130126953125,
-0.017333984375,
0.0379638671875,
-0.01515960693359375,
-0.07684326171875,
0.029205322265625,
0.004512786865234375,
0.0113677978515625,
-0.01476287841796875,
-0.01102447509765625,
-0.043609619140625,
-0.01013946533203125,
0.048614501953125,
0.01873779296875,
-0.050048828125,
0.01512908935546875,
0.028839111328125,
-0.00783538818359375,
0.010650634765625,
0.0267181396484375,
-0.03179931640625,
0.018157958984375,
0.0265960693359375,
0.031280517578125,
0.033660888671875,
-0.029296875,
0.0309295654296875,
-0.05621337890625,
0.03765869140625,
-0.00811767578125,
0.041717529296875,
0.026885986328125,
-0.0179901123046875,
0.042999267578125,
0.01549530029296875,
-0.0249786376953125,
-0.06976318359375,
-0.00872039794921875,
-0.0716552734375,
-0.00884246826171875,
0.0941162109375,
-0.01532745361328125,
-0.04052734375,
-0.009521484375,
-0.01509857177734375,
0.01233673095703125,
-0.0178985595703125,
0.0297698974609375,
0.044189453125,
0.00039386749267578125,
-0.0229034423828125,
-0.05255126953125,
0.0372314453125,
0.04046630859375,
-0.053375244140625,
-0.0123443603515625,
-0.00449371337890625,
0.0185699462890625,
0.03143310546875,
0.04656982421875,
-0.0081024169921875,
0.004146575927734375,
-0.015594482421875,
0.026123046875,
0.017303466796875,
-0.01398468017578125,
-0.031707763671875,
-0.0143890380859375,
-0.00812530517578125,
0.00354766845703125
]
] |
sentence-transformers/paraphrase-MiniLM-L3-v2 | 2022-07-08T04:08:35.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"dataset:flax-sentence-embeddings/stackexchange_xml",
"dataset:s2orc",
"dataset:ms_marco",
"dataset:wiki_atomic_edits",
"dataset:snli",
"dataset:multi_nli",
"dataset:embedding-data/altlex",
"dataset:embedding-data/simple-wiki",
"dataset:embedding-data/flickr30k-captions",
"dataset:embedding-data/coco_captions",
"dataset:embedding-data/sentence-compression",
"dataset:embedding-data/QQP",
"dataset:yahoo_answers_topics",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/paraphrase-MiniLM-L3-v2 | 14 | 87,205 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
datasets:
- flax-sentence-embeddings/stackexchange_xml
- s2orc
- ms_marco
- wiki_atomic_edits
- snli
- multi_nli
- embedding-data/altlex
- embedding-data/simple-wiki
- embedding-data/flickr30k-captions
- embedding-data/coco_captions
- embedding-data/sentence-compression
- embedding-data/QQP
- yahoo_answers_topics
---
# sentence-transformers/paraphrase-MiniLM-L3-v2
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/paraphrase-MiniLM-L3-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/paraphrase-MiniLM-L3-v2')
model = AutoModel.from_pretrained('sentence-transformers/paraphrase-MiniLM-L3-v2')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/paraphrase-MiniLM-L3-v2)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 4,008 | [
[
-0.0163726806640625,
-0.054656982421875,
0.0309600830078125,
0.0178375244140625,
-0.027740478515625,
-0.03302001953125,
-0.00811767578125,
0.00122833251953125,
0.00806427001953125,
0.033935546875,
-0.040069580078125,
-0.0223236083984375,
-0.043304443359375,
0.0103302001953125,
-0.040283203125,
0.0714111328125,
-0.005985260009765625,
-0.001605987548828125,
-0.0274505615234375,
-0.0199737548828125,
-0.01568603515625,
-0.0287933349609375,
-0.03521728515625,
-0.01812744140625,
0.0191497802734375,
0.0218048095703125,
0.0450439453125,
0.0350341796875,
0.0247955322265625,
0.032257080078125,
-0.0010499954223632812,
0.005374908447265625,
-0.020263671875,
-0.01178741455078125,
0.00443267822265625,
-0.0340576171875,
-0.005924224853515625,
0.010162353515625,
0.042083740234375,
0.0278778076171875,
-0.00792694091796875,
0.021392822265625,
0.018341064453125,
0.0165252685546875,
-0.019989013671875,
0.03375244140625,
-0.05291748046875,
0.005664825439453125,
0.0006456375122070312,
0.006320953369140625,
-0.03314208984375,
-0.0133056640625,
0.0186614990234375,
-0.02880859375,
0.0263519287109375,
0.025054931640625,
0.07489013671875,
0.0308380126953125,
-0.014007568359375,
-0.0300750732421875,
-0.013671875,
0.06707763671875,
-0.06353759765625,
0.00934600830078125,
0.025726318359375,
0.0033702850341796875,
0.0086822509765625,
-0.089599609375,
-0.058929443359375,
-0.0114593505859375,
-0.048828125,
-0.0009551048278808594,
-0.03057861328125,
-0.0019550323486328125,
0.016998291015625,
0.013275146484375,
-0.0533447265625,
-0.0184783935546875,
-0.032257080078125,
-0.018218994140625,
0.02490234375,
0.01396942138671875,
0.0278472900390625,
-0.053131103515625,
-0.03131103515625,
-0.0238800048828125,
-0.0163726806640625,
-0.001277923583984375,
-0.0044708251953125,
0.0186920166015625,
-0.0286407470703125,
0.05780029296875,
-0.0031795501708984375,
0.03497314453125,
-0.0004401206970214844,
0.006237030029296875,
0.03765869140625,
-0.044403076171875,
-0.0220184326171875,
-0.0145416259765625,
0.08453369140625,
0.0275421142578125,
0.00896453857421875,
0.004512786865234375,
-0.0120391845703125,
-0.004894256591796875,
-0.00011485815048217773,
-0.05804443359375,
-0.0399169921875,
0.005298614501953125,
-0.037506103515625,
-0.027130126953125,
-0.0013675689697265625,
-0.06378173828125,
-0.00672149658203125,
0.0047607421875,
0.054351806640625,
-0.044036865234375,
0.023468017578125,
0.0019121170043945312,
-0.02728271484375,
0.01497650146484375,
-0.021087646484375,
-0.0526123046875,
0.0236968994140625,
0.0191650390625,
0.0843505859375,
0.00873565673828125,
-0.04095458984375,
-0.025177001953125,
0.0007476806640625,
0.01363372802734375,
0.052703857421875,
-0.0225677490234375,
-0.001873016357421875,
0.0006194114685058594,
0.01294708251953125,
-0.04559326171875,
-0.035980224609375,
0.04815673828125,
-0.0218048095703125,
0.04541015625,
0.005481719970703125,
-0.052703857421875,
-0.00316619873046875,
0.0009975433349609375,
-0.0408935546875,
0.08648681640625,
0.006137847900390625,
-0.0733642578125,
-0.002521514892578125,
-0.052032470703125,
-0.01308441162109375,
-0.00916290283203125,
-0.0016984939575195312,
-0.04949951171875,
0.0084991455078125,
0.041259765625,
0.0450439453125,
-0.00290679931640625,
0.00821685791015625,
-0.0215301513671875,
-0.03179931640625,
0.024200439453125,
-0.0207672119140625,
0.0869140625,
0.01055145263671875,
-0.0285186767578125,
0.0110015869140625,
-0.0294952392578125,
-0.00782012939453125,
0.0268707275390625,
-0.0023345947265625,
-0.0225830078125,
-0.00997161865234375,
0.0154876708984375,
0.0306396484375,
0.026153564453125,
-0.043304443359375,
0.0012826919555664062,
-0.03875732421875,
0.07000732421875,
0.045196533203125,
0.0096435546875,
0.04901123046875,
-0.034942626953125,
0.016632080078125,
0.0132904052734375,
0.0102081298828125,
-0.004566192626953125,
-0.039459228515625,
-0.07666015625,
-0.0231781005859375,
0.0220184326171875,
0.0445556640625,
-0.07635498046875,
0.055816650390625,
-0.038726806640625,
-0.0367431640625,
-0.06298828125,
0.01465606689453125,
0.01403045654296875,
0.035064697265625,
0.05615234375,
0.01568603515625,
-0.048797607421875,
-0.076171875,
-0.01357269287109375,
-0.0024261474609375,
-0.0014057159423828125,
0.0132904052734375,
0.0576171875,
-0.0260772705078125,
0.07562255859375,
-0.042755126953125,
-0.03759765625,
-0.04443359375,
0.0173187255859375,
0.0182647705078125,
0.042266845703125,
0.04180908203125,
-0.053131103515625,
-0.04412841796875,
-0.04388427734375,
-0.0550537109375,
-0.004547119140625,
-0.01763916015625,
-0.018890380859375,
0.001506805419921875,
0.042022705078125,
-0.0689697265625,
0.024200439453125,
0.037994384765625,
-0.0288238525390625,
0.0179595947265625,
-0.023284912109375,
-0.023193359375,
-0.08807373046875,
-0.0027027130126953125,
-0.0080108642578125,
-0.021209716796875,
-0.0307464599609375,
0.012542724609375,
0.0157470703125,
-0.0092315673828125,
-0.039520263671875,
0.042999267578125,
-0.0289764404296875,
0.013153076171875,
-0.00511932373046875,
0.038726806640625,
-0.0012569427490234375,
0.05255126953125,
-0.01503753662109375,
0.055816650390625,
0.03106689453125,
-0.04248046875,
0.0286865234375,
0.048248291015625,
-0.034332275390625,
0.01148223876953125,
-0.0643310546875,
0.00830841064453125,
0.006427764892578125,
0.0302734375,
-0.081298828125,
-0.0025691986083984375,
0.0255126953125,
-0.032379150390625,
-0.005580902099609375,
0.0167236328125,
-0.06134033203125,
-0.05133056640625,
-0.042236328125,
0.01436614990234375,
0.05633544921875,
-0.03839111328125,
0.038818359375,
0.0174102783203125,
-0.01242828369140625,
-0.0246429443359375,
-0.0806884765625,
0.0007405281066894531,
-0.024566650390625,
-0.046783447265625,
0.035003662109375,
-0.0130615234375,
0.007640838623046875,
0.01442718505859375,
0.0182037353515625,
0.0017852783203125,
-0.0122222900390625,
-0.00893402099609375,
0.01123809814453125,
-0.002681732177734375,
0.006351470947265625,
0.0222930908203125,
-0.00884246826171875,
-0.0007138252258300781,
-0.01074981689453125,
0.053131103515625,
-0.02001953125,
-0.0035266876220703125,
-0.0369873046875,
0.0185089111328125,
0.0311279296875,
-0.005054473876953125,
0.08172607421875,
0.06939697265625,
-0.02734375,
-0.00328826904296875,
-0.029815673828125,
-0.0275421142578125,
-0.037994384765625,
0.034393310546875,
-0.02435302734375,
-0.0601806640625,
0.02972412109375,
0.026763916015625,
0.0026454925537109375,
0.055084228515625,
0.045501708984375,
-0.026519775390625,
0.06390380859375,
0.044036865234375,
-0.0020580291748046875,
0.03790283203125,
-0.041839599609375,
0.01500701904296875,
-0.067626953125,
0.0003578662872314453,
-0.0213470458984375,
-0.022216796875,
-0.0450439453125,
-0.041717529296875,
0.02685546875,
-0.0007901191711425781,
-0.014801025390625,
0.048126220703125,
-0.032073974609375,
0.014312744140625,
0.0545654296875,
0.016265869140625,
-0.0005221366882324219,
0.007598876953125,
-0.03955078125,
-0.01194000244140625,
-0.0623779296875,
-0.0452880859375,
0.063720703125,
0.0218353271484375,
0.032562255859375,
-0.007610321044921875,
0.061065673828125,
0.01303863525390625,
0.005435943603515625,
-0.043060302734375,
0.054656982421875,
-0.021881103515625,
-0.033721923828125,
-0.028350830078125,
-0.0235443115234375,
-0.060943603515625,
0.038818359375,
-0.0005736351013183594,
-0.05145263671875,
0.00763702392578125,
-0.006011962890625,
-0.034515380859375,
0.0127716064453125,
-0.0572509765625,
0.08251953125,
0.0106048583984375,
-0.001556396484375,
-0.0037078857421875,
-0.062744140625,
0.0194091796875,
-0.0011262893676757812,
0.020111083984375,
-0.005443572998046875,
-0.0158843994140625,
0.07489013671875,
-0.03363037109375,
0.06793212890625,
-0.0111083984375,
0.0274810791015625,
0.0277862548828125,
-0.017608642578125,
0.033935546875,
-0.0069427490234375,
-0.006542205810546875,
0.0013513565063476562,
0.00527191162109375,
-0.0380859375,
-0.042388916015625,
0.052154541015625,
-0.0665283203125,
-0.0316162109375,
-0.032745361328125,
-0.046478271484375,
-0.0033817291259765625,
0.0175018310546875,
0.035186767578125,
0.02032470703125,
0.0017642974853515625,
0.040069580078125,
0.033538818359375,
-0.0169830322265625,
0.057220458984375,
0.002864837646484375,
-0.00679779052734375,
-0.039306640625,
0.051971435546875,
0.00521087646484375,
0.00936126708984375,
0.033721923828125,
0.02716064453125,
-0.031951904296875,
-0.0179595947265625,
-0.0261993408203125,
0.041656494140625,
-0.05255126953125,
-0.0169525146484375,
-0.08013916015625,
-0.0328369140625,
-0.046783447265625,
0.00125885009765625,
-0.008148193359375,
-0.032196044921875,
-0.036773681640625,
-0.0177154541015625,
0.0246429443359375,
0.0323486328125,
-0.0014400482177734375,
0.0357666015625,
-0.05511474609375,
0.0239410400390625,
0.023162841796875,
-0.019683837890625,
-0.007274627685546875,
-0.06878662109375,
-0.029266357421875,
0.00792694091796875,
-0.0303497314453125,
-0.060150146484375,
0.049835205078125,
0.0242767333984375,
0.04364013671875,
0.0013647079467773438,
0.0137786865234375,
0.054290771484375,
-0.046600341796875,
0.06982421875,
-0.00011861324310302734,
-0.0816650390625,
0.032958984375,
-0.003368377685546875,
0.025115966796875,
0.041595458984375,
0.01226806640625,
-0.0282135009765625,
-0.038970947265625,
-0.061920166015625,
-0.06903076171875,
0.057769775390625,
0.045166015625,
0.04071044921875,
-0.0157928466796875,
0.0172119140625,
-0.01537322998046875,
0.0147857666015625,
-0.08837890625,
-0.03460693359375,
-0.0220794677734375,
-0.04962158203125,
-0.0272369384765625,
-0.0246429443359375,
0.0008463859558105469,
-0.039154052734375,
0.04913330078125,
-0.0010547637939453125,
0.06298828125,
0.0150604248046875,
-0.039398193359375,
0.0218353271484375,
0.007343292236328125,
0.040740966796875,
0.01483917236328125,
-0.00684356689453125,
0.0235443115234375,
0.033447265625,
-0.0187835693359375,
0.0022125244140625,
0.030670166015625,
-0.0202789306640625,
0.02203369140625,
0.036102294921875,
0.0667724609375,
0.037261962890625,
-0.03460693359375,
0.059783935546875,
-0.00383758544921875,
-0.01373291015625,
-0.023651123046875,
-0.0125885009765625,
0.019561767578125,
0.0267486572265625,
0.0210113525390625,
0.003688812255859375,
0.00501251220703125,
-0.0294342041015625,
0.027099609375,
0.0163421630859375,
-0.021026611328125,
-0.005054473876953125,
0.057830810546875,
-0.004932403564453125,
-0.01416778564453125,
0.06317138671875,
-0.0158843994140625,
-0.04815673828125,
0.0300750732421875,
0.043975830078125,
0.0714111328125,
0.00714111328125,
0.01690673828125,
0.0223846435546875,
0.034576416015625,
-0.004291534423828125,
-0.0021820068359375,
0.0012369155883789062,
-0.055694580078125,
-0.0010023117065429688,
-0.052581787109375,
0.002777099609375,
0.00008338689804077148,
-0.04290771484375,
0.020416259765625,
-0.00905609130859375,
-0.0016574859619140625,
-0.01099395751953125,
-0.006496429443359375,
-0.05767822265625,
0.00014674663543701172,
-0.000014066696166992188,
0.06024169921875,
-0.0743408203125,
0.07659912109375,
0.046112060546875,
-0.052001953125,
-0.04803466796875,
0.006664276123046875,
-0.024993896484375,
-0.06781005859375,
0.040557861328125,
0.02374267578125,
0.01194000244140625,
0.01548004150390625,
-0.0406494140625,
-0.06781005859375,
0.10498046875,
0.0222015380859375,
-0.0167236328125,
-0.0304412841796875,
0.00554656982421875,
0.04132080078125,
-0.031768798828125,
0.020599365234375,
0.042327880859375,
0.019561767578125,
-0.009307861328125,
-0.054901123046875,
0.020599365234375,
-0.012908935546875,
0.0213470458984375,
-0.0189208984375,
-0.044342041015625,
0.0806884765625,
0.0015230178833007812,
-0.00685882568359375,
0.0305633544921875,
0.067138671875,
0.026458740234375,
0.0011243820190429688,
0.03424072265625,
0.047393798828125,
0.03936767578125,
-0.0003306865692138672,
0.07861328125,
-0.021331787109375,
0.0623779296875,
0.08172607421875,
0.01261138916015625,
0.08294677734375,
0.0447998046875,
-0.01445770263671875,
0.055816650390625,
0.036407470703125,
-0.01003265380859375,
0.0650634765625,
0.001861572265625,
0.0018262863159179688,
0.0003769397735595703,
0.0165863037109375,
-0.01412200927734375,
0.0220184326171875,
0.0168304443359375,
-0.05230712890625,
-0.0095977783203125,
0.01708984375,
-0.0002231597900390625,
-0.0068359375,
-0.000060677528381347656,
0.044830322265625,
0.0235748291015625,
-0.029205322265625,
0.031341552734375,
0.0150909423828125,
0.0692138671875,
-0.02960205078125,
0.0166473388671875,
-0.0149078369140625,
0.0269622802734375,
0.00684356689453125,
-0.03973388671875,
0.031463623046875,
-0.0089874267578125,
-0.003917694091796875,
-0.0240631103515625,
0.046539306640625,
-0.046173095703125,
-0.048797607421875,
0.024505615234375,
0.041748046875,
0.00847625732421875,
0.004913330078125,
-0.09307861328125,
-0.0037384033203125,
0.006885528564453125,
-0.032196044921875,
0.021392822265625,
0.02490234375,
0.025665283203125,
0.041259765625,
0.028778076171875,
-0.0172576904296875,
0.026580810546875,
-0.0021152496337890625,
0.0582275390625,
-0.044464111328125,
-0.040618896484375,
-0.080322265625,
0.0428466796875,
-0.019744873046875,
-0.01432037353515625,
0.0718994140625,
0.03973388671875,
0.05804443359375,
-0.0243988037109375,
0.04205322265625,
-0.013641357421875,
0.029266357421875,
-0.0400390625,
0.05706787109375,
-0.040435791015625,
-0.0030460357666015625,
-0.0211181640625,
-0.0648193359375,
-0.0188140869140625,
0.0791015625,
-0.0296783447265625,
0.00548553466796875,
0.081787109375,
0.062255859375,
-0.0132904052734375,
-0.01116180419921875,
0.01522064208984375,
0.02728271484375,
0.01480865478515625,
0.035430908203125,
0.034393310546875,
-0.067138671875,
0.06939697265625,
-0.051239013671875,
-0.0023193359375,
-0.0158843994140625,
-0.052337646484375,
-0.07269287109375,
-0.054901123046875,
-0.0251007080078125,
-0.023773193359375,
-0.014251708984375,
0.0711669921875,
0.03570556640625,
-0.05975341796875,
-0.00948333740234375,
-0.0182342529296875,
-0.00647735595703125,
-0.0140380859375,
-0.023681640625,
0.044158935546875,
-0.038909912109375,
-0.066162109375,
0.01406097412109375,
-0.007366180419921875,
0.00794219970703125,
-0.0130767822265625,
0.0082550048828125,
-0.044647216796875,
0.015869140625,
0.041473388671875,
-0.015289306640625,
-0.06488037109375,
-0.0247955322265625,
-0.006320953369140625,
-0.041717529296875,
-0.01203155517578125,
0.032623291015625,
-0.0460205078125,
0.01145172119140625,
0.031646728515625,
0.03546142578125,
0.053558349609375,
-0.018585205078125,
0.023956298828125,
-0.0599365234375,
0.0243988037109375,
0.00652313232421875,
0.05548095703125,
0.0282440185546875,
-0.0114898681640625,
0.0357666015625,
0.03125,
-0.036102294921875,
-0.0606689453125,
-0.01444244384765625,
-0.0714111328125,
-0.01361083984375,
0.0875244140625,
-0.0232391357421875,
-0.0225677490234375,
0.00446319580078125,
-0.0222930908203125,
0.03460693359375,
-0.0187530517578125,
0.03875732421875,
0.058074951171875,
-0.009063720703125,
-0.025787353515625,
-0.0309600830078125,
0.026947021484375,
0.040069580078125,
-0.038299560546875,
-0.01275634765625,
0.01081085205078125,
0.0298614501953125,
0.01407623291015625,
0.04046630859375,
0.0018358230590820312,
0.0009303092956542969,
0.00958251953125,
-0.0043487548828125,
-0.0066986083984375,
0.0024738311767578125,
-0.03106689453125,
0.01439666748046875,
-0.026458740234375,
-0.029327392578125
]
] |
Helsinki-NLP/opus-mt-id-en | 2023-08-16T11:58:05.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"id",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-id-en | 8 | 87,072 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-id-en
* source languages: id
* target languages: en
* OPUS readme: [id-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/id-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/id-en/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/id-en/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/id-en/opus-2019-12-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.id.en | 47.7 | 0.647 |
| 818 | [
[
-0.0189971923828125,
-0.0287017822265625,
0.018585205078125,
0.03228759765625,
-0.0345458984375,
-0.0282440185546875,
-0.028289794921875,
-0.0041656494140625,
0.0009107589721679688,
0.029144287109375,
-0.05072021484375,
-0.047210693359375,
-0.044097900390625,
0.01495361328125,
-0.0087738037109375,
0.05682373046875,
-0.00685882568359375,
0.0377197265625,
0.01313018798828125,
-0.03753662109375,
-0.0217437744140625,
-0.0308685302734375,
-0.038330078125,
-0.027435302734375,
0.026885986328125,
0.0218505859375,
0.027130126953125,
0.032135009765625,
0.0692138671875,
0.0150299072265625,
-0.00902557373046875,
0.00994873046875,
-0.037567138671875,
-0.00893402099609375,
-0.0011615753173828125,
-0.042022705078125,
-0.05572509765625,
-0.0087127685546875,
0.0771484375,
0.03228759765625,
0.000012636184692382812,
0.03472900390625,
-0.00644683837890625,
0.0654296875,
-0.02142333984375,
0.006656646728515625,
-0.047332763671875,
0.0040740966796875,
-0.025177001953125,
-0.0232696533203125,
-0.05255126953125,
-0.01806640625,
0.00984954833984375,
-0.048980712890625,
-0.0038776397705078125,
0.009979248046875,
0.113037109375,
0.02117919921875,
-0.029144287109375,
-0.00995635986328125,
-0.046112060546875,
0.0755615234375,
-0.057281494140625,
0.040435791015625,
0.0310211181640625,
0.016510009765625,
0.016937255859375,
-0.03955078125,
-0.0202178955078125,
0.0127410888671875,
-0.01526641845703125,
0.01580810546875,
-0.004413604736328125,
-0.019561767578125,
0.0234527587890625,
0.05535888671875,
-0.057952880859375,
0.0047149658203125,
-0.04254150390625,
0.004581451416015625,
0.052734375,
0.009063720703125,
0.01279449462890625,
-0.012603759765625,
-0.0321044921875,
-0.038421630859375,
-0.05743408203125,
0.0086822509765625,
0.030029296875,
0.0229034423828125,
-0.0347900390625,
0.04815673828125,
-0.012603759765625,
0.043914794921875,
0.0028934478759765625,
0.0009546279907226562,
0.07501220703125,
-0.030181884765625,
-0.0256195068359375,
-0.01458740234375,
0.09063720703125,
0.0286407470703125,
0.0068359375,
0.0009236335754394531,
-0.0179290771484375,
-0.0203399658203125,
0.00885009765625,
-0.07037353515625,
-0.004840850830078125,
0.015655517578125,
-0.037109375,
-0.01366424560546875,
0.00644683837890625,
-0.044586181640625,
0.0158538818359375,
-0.031524658203125,
0.045196533203125,
-0.052032470703125,
-0.0242156982421875,
0.025299072265625,
0.00258636474609375,
0.03228759765625,
-0.00281524658203125,
-0.048004150390625,
0.0086822509765625,
0.033599853515625,
0.05517578125,
-0.031402587890625,
-0.0207061767578125,
-0.03155517578125,
-0.015380859375,
-0.0103759765625,
0.046417236328125,
-0.006656646728515625,
-0.033905029296875,
-0.005001068115234375,
0.0377197265625,
-0.0303497314453125,
-0.027130126953125,
0.09552001953125,
-0.0214691162109375,
0.050811767578125,
-0.0305023193359375,
-0.038177490234375,
-0.024810791015625,
0.03717041015625,
-0.042633056640625,
0.0970458984375,
0.005550384521484375,
-0.062469482421875,
0.0165557861328125,
-0.06121826171875,
-0.01522064208984375,
-0.0029392242431640625,
0.0044403076171875,
-0.048248291015625,
0.0102691650390625,
0.006206512451171875,
0.0284881591796875,
-0.024993896484375,
0.031463623046875,
-0.0011043548583984375,
-0.0224761962890625,
0.005016326904296875,
-0.0266265869140625,
0.0765380859375,
0.02392578125,
-0.017242431640625,
0.0192413330078125,
-0.06976318359375,
-0.0024662017822265625,
0.004230499267578125,
-0.0347900390625,
-0.01555633544921875,
0.005096435546875,
0.0193939208984375,
0.01123809814453125,
0.02264404296875,
-0.044219970703125,
0.0194549560546875,
-0.045806884765625,
0.01056671142578125,
0.04779052734375,
-0.01776123046875,
0.026702880859375,
-0.036224365234375,
0.025543212890625,
0.01035308837890625,
0.0093841552734375,
-0.00031757354736328125,
-0.03369140625,
-0.0611572265625,
-0.0205841064453125,
0.051544189453125,
0.0777587890625,
-0.05426025390625,
0.06488037109375,
-0.057891845703125,
-0.060821533203125,
-0.06109619140625,
-0.013214111328125,
0.034912109375,
0.02374267578125,
0.036468505859375,
-0.0130462646484375,
-0.03118896484375,
-0.0823974609375,
-0.01390838623046875,
-0.00962066650390625,
-0.020660400390625,
0.01495361328125,
0.04583740234375,
-0.01195526123046875,
0.037567138671875,
-0.03253173828125,
-0.0285491943359375,
-0.01143646240234375,
0.01029205322265625,
0.03948974609375,
0.049224853515625,
0.04241943359375,
-0.0655517578125,
-0.042938232421875,
0.00036406517028808594,
-0.058349609375,
-0.010009765625,
0.005603790283203125,
-0.02197265625,
0.003871917724609375,
0.00322723388671875,
-0.0232086181640625,
0.00756072998046875,
0.046661376953125,
-0.04473876953125,
0.032501220703125,
-0.007122039794921875,
0.022735595703125,
-0.10198974609375,
0.014617919921875,
-0.010589599609375,
-0.003536224365234375,
-0.03131103515625,
0.001598358154296875,
0.0200653076171875,
0.005092620849609375,
-0.06097412109375,
0.038116455078125,
-0.01702880859375,
0.0015697479248046875,
0.0223388671875,
0.000052809715270996094,
0.00698089599609375,
0.056304931640625,
-0.005626678466796875,
0.057586669921875,
0.052398681640625,
-0.03814697265625,
0.01253509521484375,
0.0408935546875,
-0.031646728515625,
0.03564453125,
-0.059112548828125,
-0.0202789306640625,
0.0220947265625,
-0.0103302001953125,
-0.045989990234375,
0.006313323974609375,
0.022796630859375,
-0.046722412109375,
0.03253173828125,
-0.0110931396484375,
-0.05670166015625,
0.001605987548828125,
-0.021514892578125,
0.040130615234375,
0.0517578125,
-0.019561767578125,
0.05010986328125,
0.0084991455078125,
0.00231170654296875,
-0.031646728515625,
-0.07061767578125,
-0.01143646240234375,
-0.033233642578125,
-0.055755615234375,
0.0205841064453125,
-0.0306854248046875,
-0.0038318634033203125,
0.002948760986328125,
0.0246429443359375,
-0.007404327392578125,
0.00652313232421875,
0.01024627685546875,
0.017608642578125,
-0.0361328125,
0.0168914794921875,
0.00313568115234375,
-0.01422119140625,
-0.0097808837890625,
-0.007965087890625,
0.041961669921875,
-0.0215301513671875,
-0.0176544189453125,
-0.049774169921875,
0.00701904296875,
0.049591064453125,
-0.036468505859375,
0.0654296875,
0.0433349609375,
-0.010711669921875,
0.01139068603515625,
-0.02783203125,
0.0089263916015625,
-0.0305023193359375,
0.01033782958984375,
-0.03912353515625,
-0.051116943359375,
0.042205810546875,
0.0034656524658203125,
0.0301971435546875,
0.06390380859375,
0.0484619140625,
0.008514404296875,
0.0496826171875,
0.0162200927734375,
0.0008769035339355469,
0.032928466796875,
-0.03802490234375,
-0.00884246826171875,
-0.0816650390625,
0.0040130615234375,
-0.053863525390625,
-0.026519775390625,
-0.060394287109375,
-0.0173492431640625,
0.018402099609375,
0.0017538070678710938,
-0.0164031982421875,
0.04302978515625,
-0.043487548828125,
0.017730712890625,
0.046630859375,
-0.00614166259765625,
0.023223876953125,
-0.003143310546875,
-0.036895751953125,
-0.01461029052734375,
-0.032257080078125,
-0.041351318359375,
0.09918212890625,
0.027618408203125,
0.0236053466796875,
0.0198974609375,
0.035858154296875,
-0.0036945343017578125,
0.0192718505859375,
-0.044525146484375,
0.03204345703125,
-0.0169219970703125,
-0.056793212890625,
-0.0204315185546875,
-0.041961669921875,
-0.0621337890625,
0.03759765625,
-0.0217437744140625,
-0.03814697265625,
0.01328277587890625,
-0.0015230178833007812,
-0.0101470947265625,
0.03271484375,
-0.05535888671875,
0.0892333984375,
-0.008056640625,
-0.00860595703125,
0.0207977294921875,
-0.032470703125,
0.0230560302734375,
-0.002582550048828125,
0.0183563232421875,
-0.0167236328125,
0.01107025146484375,
0.048126220703125,
-0.005756378173828125,
0.0312347412109375,
-0.0035686492919921875,
-0.005153656005859375,
0.002239227294921875,
0.005451202392578125,
0.0306396484375,
-0.0037860870361328125,
-0.03369140625,
0.030303955078125,
0.005474090576171875,
-0.03399658203125,
-0.01204681396484375,
0.0487060546875,
-0.052154541015625,
-0.005146026611328125,
-0.0293426513671875,
-0.04736328125,
0.0027027130126953125,
0.02349853515625,
0.056610107421875,
0.054168701171875,
-0.0203857421875,
0.0413818359375,
0.0675048828125,
-0.021026611328125,
0.0293426513671875,
0.0509033203125,
-0.01432037353515625,
-0.0390625,
0.06414794921875,
0.01383209228515625,
0.025054931640625,
0.042388916015625,
0.0030841827392578125,
-0.01120758056640625,
-0.058074951171875,
-0.053192138671875,
0.023284912109375,
-0.0198211669921875,
-0.0165252685546875,
-0.041778564453125,
-0.00406646728515625,
-0.015350341796875,
0.024200439453125,
-0.04144287109375,
-0.03875732421875,
-0.0165863037109375,
-0.0184326171875,
0.0154571533203125,
0.0134429931640625,
0.00244140625,
0.0311737060546875,
-0.075439453125,
0.01247406005859375,
-0.006916046142578125,
0.0308685302734375,
-0.031585693359375,
-0.059906005859375,
-0.035614013671875,
0.0005826950073242188,
-0.04669189453125,
-0.053131103515625,
0.038360595703125,
0.00982666015625,
0.0170440673828125,
0.025604248046875,
0.0184326171875,
0.0266265869140625,
-0.048614501953125,
0.06817626953125,
-0.00963592529296875,
-0.052947998046875,
0.034637451171875,
-0.0362548828125,
0.038787841796875,
0.06927490234375,
0.0234222412109375,
-0.023681640625,
-0.036346435546875,
-0.05438232421875,
-0.058349609375,
0.058380126953125,
0.049774169921875,
-0.00939178466796875,
0.0173797607421875,
-0.006977081298828125,
-0.003170013427734375,
0.008270263671875,
-0.0823974609375,
-0.0280303955078125,
0.00699615478515625,
-0.0328369140625,
-0.008514404296875,
-0.02252197265625,
-0.02142333984375,
-0.014862060546875,
0.08526611328125,
0.0117950439453125,
0.01123809814453125,
0.0290985107421875,
-0.01091766357421875,
-0.01303863525390625,
0.022857666015625,
0.072021484375,
0.04241943359375,
-0.0411376953125,
-0.01543426513671875,
0.020721435546875,
-0.0265655517578125,
-0.01201629638671875,
0.006732940673828125,
-0.03228759765625,
0.0243682861328125,
0.029296875,
0.08135986328125,
0.01219940185546875,
-0.04644775390625,
0.03436279296875,
-0.03375244140625,
-0.03472900390625,
-0.05224609375,
-0.01253509521484375,
0.00800323486328125,
0.0037078857421875,
0.020416259765625,
0.010162353515625,
0.011474609375,
-0.01171112060546875,
0.0035552978515625,
0.0063934326171875,
-0.050140380859375,
-0.043670654296875,
0.0333251953125,
0.0124053955078125,
-0.032684326171875,
0.0404052734375,
-0.031494140625,
-0.03515625,
0.02703857421875,
0.006702423095703125,
0.07806396484375,
-0.016448974609375,
-0.0199432373046875,
0.055450439453125,
0.04443359375,
-0.014495849609375,
0.032958984375,
0.01401519775390625,
-0.05291748046875,
-0.03863525390625,
-0.05963134765625,
-0.00861358642578125,
0.01270294189453125,
-0.06390380859375,
0.0308685302734375,
0.0203399658203125,
-0.0009546279907226562,
-0.026092529296875,
0.01166534423828125,
-0.040679931640625,
0.00798797607421875,
-0.0228424072265625,
0.07647705078125,
-0.072265625,
0.0667724609375,
0.03680419921875,
-0.0182037353515625,
-0.06243896484375,
-0.0204315185546875,
-0.01451873779296875,
-0.0285491943359375,
0.040496826171875,
0.0145721435546875,
0.027923583984375,
-0.007061004638671875,
-0.01427459716796875,
-0.05767822265625,
0.084716796875,
0.01401519775390625,
-0.04205322265625,
0.0021190643310546875,
0.00579071044921875,
0.03814697265625,
-0.03082275390625,
0.0090789794921875,
0.0297698974609375,
0.0528564453125,
0.005275726318359375,
-0.08209228515625,
-0.02001953125,
-0.0435791015625,
-0.0222015380859375,
0.041290283203125,
-0.044036865234375,
0.06817626953125,
0.033599853515625,
-0.0144500732421875,
0.007717132568359375,
0.042816162109375,
0.0228424072265625,
0.0260162353515625,
0.03509521484375,
0.09283447265625,
0.0280303955078125,
-0.04022216796875,
0.075927734375,
-0.0269317626953125,
0.04156494140625,
0.08343505859375,
-0.0014505386352539062,
0.068115234375,
0.0282135009765625,
-0.006847381591796875,
0.0362548828125,
0.051849365234375,
-0.020477294921875,
0.036773681640625,
0.003429412841796875,
0.01218414306640625,
-0.0121612548828125,
0.01068878173828125,
-0.05340576171875,
0.01947021484375,
0.01477813720703125,
-0.022918701171875,
0.0083770751953125,
-0.0038890838623046875,
0.005153656005859375,
-0.0036373138427734375,
-0.0091552734375,
0.05072021484375,
-0.00559234619140625,
-0.04486083984375,
0.056488037109375,
-0.00608062744140625,
0.048583984375,
-0.049835205078125,
0.00963592529296875,
-0.005756378173828125,
0.0175323486328125,
0.0009589195251464844,
-0.035003662109375,
0.032745361328125,
-0.0033664703369140625,
-0.0254974365234375,
-0.032684326171875,
0.00931549072265625,
-0.040985107421875,
-0.0657958984375,
0.0267486572265625,
0.0304107666015625,
0.0223846435546875,
0.00208282470703125,
-0.06292724609375,
0.00469970703125,
0.0082550048828125,
-0.04473876953125,
0.00669097900390625,
0.05657958984375,
0.023681640625,
0.034637451171875,
0.04736328125,
0.01300811767578125,
0.015899658203125,
-0.005710601806640625,
0.047760009765625,
-0.03240966796875,
-0.0313720703125,
-0.06036376953125,
0.061309814453125,
-0.01181793212890625,
-0.053680419921875,
0.05206298828125,
0.0771484375,
0.07672119140625,
-0.01467132568359375,
0.0165863037109375,
-0.0002086162567138672,
0.0592041015625,
-0.04693603515625,
0.0469970703125,
-0.06640625,
0.0191650390625,
-0.004383087158203125,
-0.065185546875,
-0.0225830078125,
0.0198516845703125,
-0.01654052734375,
-0.03076171875,
0.061737060546875,
0.050079345703125,
-0.01480865478515625,
-0.0195770263671875,
0.02203369140625,
0.02227783203125,
0.0179595947265625,
0.04095458984375,
0.028076171875,
-0.07666015625,
0.040863037109375,
-0.0165557861328125,
-0.004390716552734375,
-0.005207061767578125,
-0.051483154296875,
-0.0643310546875,
-0.04437255859375,
-0.0163421630859375,
-0.01812744140625,
-0.0224151611328125,
0.06414794921875,
0.04302978515625,
-0.0673828125,
-0.039520263671875,
0.006744384765625,
0.0153045654296875,
-0.01076507568359375,
-0.01898193359375,
0.043060302734375,
-0.0235595703125,
-0.07696533203125,
0.030303955078125,
0.00749969482421875,
-0.003902435302734375,
-0.00028395652770996094,
-0.0227813720703125,
-0.041961669921875,
-0.0008039474487304688,
0.017486572265625,
0.0036373138427734375,
-0.040679931640625,
0.00801849365234375,
0.0064239501953125,
-0.00798797607421875,
0.0270843505859375,
0.0263519287109375,
-0.0198211669921875,
0.019439697265625,
0.06146240234375,
0.0294647216796875,
0.0321044921875,
-0.005016326904296875,
0.038299560546875,
-0.050750732421875,
0.0289459228515625,
0.0157012939453125,
0.043304443359375,
0.027587890625,
-0.0025386810302734375,
0.06304931640625,
0.0212554931640625,
-0.048431396484375,
-0.08465576171875,
0.0031986236572265625,
-0.08837890625,
0.004367828369140625,
0.06683349609375,
-0.025177001953125,
-0.0248870849609375,
0.0291900634765625,
-0.00838470458984375,
0.00681304931640625,
-0.0270538330078125,
0.031463623046875,
0.0631103515625,
0.032867431640625,
0.006866455078125,
-0.048248291015625,
0.0295562744140625,
0.039031982421875,
-0.053436279296875,
-0.016998291015625,
0.0101318359375,
0.008819580078125,
0.0298004150390625,
0.032318115234375,
-0.0243072509765625,
0.005161285400390625,
-0.02581787109375,
0.028564453125,
-0.00435638427734375,
-0.0143585205078125,
-0.027252197265625,
0.00399017333984375,
-0.0031108856201171875,
-0.0247802734375
]
] |
Open-Orca/Mistral-7B-OpenOrca | 2023-10-10T21:07:16.000Z | [
"transformers",
"pytorch",
"mistral",
"text-generation",
"en",
"dataset:Open-Orca/OpenOrca",
"arxiv:2306.02707",
"arxiv:2301.13688",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | Open-Orca | null | null | Open-Orca/Mistral-7B-OpenOrca | 423 | 86,455 | transformers | 2023-09-29T19:18:38 | ---
datasets:
- Open-Orca/OpenOrca
language:
- en
library_name: transformers
pipeline_tag: text-generation
license: apache-2.0
---
<p><h1>🐋 Mistral-7B-OpenOrca 🐋</h1></p>

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
# OpenOrca - Mistral - 7B - 8k
We have used our own [OpenOrca dataset](https://huggingface.co/datasets/Open-Orca/OpenOrca) to fine-tune on top of [Mistral 7B](https://huggingface.co/mistralai/Mistral-7B-v0.1).
This dataset is our attempt to reproduce the dataset generated for Microsoft Research's [Orca Paper](https://arxiv.org/abs/2306.02707).
We use [OpenChat](https://huggingface.co/openchat) packing, trained with [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl).
This release is trained on a curated filtered subset of most of our GPT-4 augmented data.
It is the same subset of our data as was used in our [OpenOrcaxOpenChat-Preview2-13B model](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B).
**HF Leaderboard evals place this model as #1 for all models smaller than 30B at release time, outperforming all other 7B and 13B models!**
This release provides a first: a fully open model with class-breaking performance, capable of running fully accelerated on even moderate consumer GPUs.
Our thanks to the Mistral team for leading the way here.
We affectionately codename this model: "*MistralOrca*"
If you'd like to try the model now, we have it running on fast GPUs unquantized: https://huggingface.co/spaces/Open-Orca/Mistral-7B-OpenOrca
Want to visualize our full (pre-filtering) dataset? Check out our [Nomic Atlas Map](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2).
[<img src="https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B/resolve/main/OpenOrca%20Nomic%20Atlas.png" alt="Atlas Nomic Dataset Map" width="400" height="400" />](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2)
We are currently training more models, so keep an eye on our org for upcoming releases with exciting partners.
We will also post sneak-peek announcements on our Discord, which you can find here:
https://AlignmentLab.ai
or check the OpenAccess AI Collective Discord for more information about Axolotl trainer here:
https://discord.gg/5y8STgB3P3
# Quantized Models
Quantized versions of this model are generously made available by [TheBloke](https://huggingface.co/TheBloke).
- AWQ: https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-AWQ
- GPTQ: https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ
- GGUF: https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GGUF
# Prompt Template
We used [OpenAI's Chat Markup Language (ChatML)](https://github.com/openai/openai-python/blob/main/chatml.md) format, with `<|im_start|>` and `<|im_end|>` tokens added to support this.
This means that, e.g., in [oobabooga](https://github.com/oobabooga/text-generation-webui/) the "`MPT-Chat`" instruction template should work, as it also uses ChatML.
This formatting is also available via a pre-defined [Transformers chat template](https://huggingface.co/docs/transformers/main/chat_templating),
which means that lists of messages can be formatted for you with the `apply_chat_template()` method:
```python
chat = [
{"role": "system", "content": "You are MistralOrca, a large language model trained by Alignment Lab AI. Write out your reasoning step-by-step to be sure you get the right answers!"}
{"role": "user", "content": "How are you?"},
{"role": "assistant", "content": "I am doing well!"},
{"role": "user", "content": "Please tell me about how mistral winds have attracted super-orcas."},
]
tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
```
which will yield:
```
<|im_start|>system
You are MistralOrca, a large language model trained by Alignment Lab AI. Write out your reasoning step-by-step to be sure you get the right answers!
<|im_end|>
<|im_start|>user
How are you?<|im_end|>
<|im_start|>assistant
I am doing well!<|im_end|>
<|im_start|>user
Please tell me about how mistral winds have attracted super-orcas.<|im_end|>
<|im_start|>assistant
```
If you use `tokenize=True` and `return_tensors="pt"` instead, then you will get a tokenized
and formatted conversation ready to pass to `model.generate()`.
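For clarity, the ChatML rendering shown above can be approximated in a few lines of plain Python. This is a simplified sketch — `build_chatml_prompt` is a hypothetical helper, and the tokenizer's built-in chat template remains the authoritative formatter:

```python
def build_chatml_prompt(messages, add_generation_prompt=True):
    """Render a list of {role, content} dicts into a ChatML-style string.

    Simplified approximation of the model's chat template: each turn is
    wrapped in <|im_start|>{role} ... <|im_end|> markers, and an open
    assistant turn is appended so generation continues as the assistant.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

chat = [
    {"role": "system", "content": "You are MistralOrca."},
    {"role": "user", "content": "How are you?"},
]
print(build_chatml_prompt(chat))
```

In practice, prefer `tokenizer.apply_chat_template()`, which reads the exact template shipped with the model rather than this approximation.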
# Inference
See [this notebook](https://colab.research.google.com/drive/1yZlLSifCGELAX5GN582kZypHCv0uJuNX?usp=sharing) for inference details.
Note that you currently need the development snapshot of Transformers, as support for Mistral hasn't been released on PyPI yet:
```
pip install git+https://github.com/huggingface/transformers
```
# Evaluation
## HuggingFace Leaderboard Performance
We have evaluated using the methodology and tools for the HuggingFace Leaderboard, and find that we have dramatically improved upon the base model.
We find **106%** of the base model's performance on HF Leaderboard evals, averaging **65.84**.
At release time, this beats all 7B and 13B models!
This is also **98.6%** of *`Llama2-70b-chat`*'s performance!

| Metric | Value |
|-----------------------|-------|
| MMLU (5-shot) | 62.24 |
| ARC (25-shot) | 64.08 |
| HellaSwag (10-shot) | 83.99 |
| TruthfulQA (0-shot) | 53.05 |
| Avg. | 65.84 |
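The reported average is simply the arithmetic mean of the four benchmark scores above:

```python
# Reproduce the "Avg." row from the leaderboard table.
scores = {
    "MMLU (5-shot)": 62.24,
    "ARC (25-shot)": 64.08,
    "HellaSwag (10-shot)": 83.99,
    "TruthfulQA (0-shot)": 53.05,
}
avg = sum(scores.values()) / len(scores)
print(round(avg, 2))  # → 65.84
```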
We use [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) to run the benchmark tests above, using the same version as the HuggingFace LLM Leaderboard.
## AGIEval Performance
We compare our results to the base Mistral-7B model (using LM Evaluation Harness).
We find **129%** of the base model's performance on AGI Eval, averaging **0.397**.
As well, we significantly improve upon the official `mistralai/Mistral-7B-Instruct-v0.1` finetuning, achieving **119%** of their performance.

## BigBench-Hard Performance
We find **119%** of the base model's performance on BigBench-Hard, averaging **0.416**.

## GPT4ALL Leaderboard Performance
We gain a slight edge over our previous releases, again topping the leaderboard, averaging **72.38**.

## MT-Bench Performance
MT-Bench uses GPT-4 as a judge of model response quality, across a wide range of challenges.
We find our performance is *on-par with `Llama2-70b-chat`*, averaging **6.86**.

# Dataset
We used a curated, filtered selection of most of the GPT-4 augmented data from our OpenOrca dataset, which aims to reproduce the Orca Research Paper dataset.
# Training
We trained with 8x A6000 GPUs for 62 hours, completing 4 epochs of full fine-tuning on our dataset in one training run.
Commodity cost was ~$400.
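As a rough sanity check, those figures imply a commodity rate of about $0.81 per GPU-hour. This is a back-of-envelope estimate derived only from the numbers above, not an official cost breakdown:

```python
# Implied per-GPU-hour rate from the training-run figures above.
gpus, hours, cost_usd = 8, 62, 400
gpu_hours = gpus * hours             # total GPU-hours consumed
rate = cost_usd / gpu_hours          # implied cost per A6000-hour
print(gpu_hours, round(rate, 2))     # → 496 0.81
```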
# Citation
```bibtex
@software{lian2023mistralorca1,
title = {MistralOrca: Mistral-7B Model Instruct-tuned on Filtered OpenOrcaV1 GPT-4 Dataset},
author = {Wing Lian and Bleys Goodson and Guan Wang and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca}},
}
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
| 8,848 | [
[
-0.03411865234375,
-0.0601806640625,
0.002101898193359375,
0.01160430908203125,
-0.01416015625,
-0.0148773193359375,
-0.01071929931640625,
-0.056488037109375,
0.01430511474609375,
0.01708984375,
-0.027679443359375,
-0.04547119140625,
-0.03265380859375,
-0.006134033203125,
-0.0180511474609375,
0.0816650390625,
-0.00617218017578125,
-0.0177154541015625,
0.0001957416534423828,
-0.0309906005859375,
-0.02935791015625,
-0.047760009765625,
-0.0703125,
-0.0281982421875,
0.041168212890625,
0.0037899017333984375,
0.06390380859375,
0.052947998046875,
0.027801513671875,
0.0233612060546875,
-0.0275726318359375,
0.017333984375,
-0.048095703125,
-0.00299835205078125,
-0.00335693359375,
-0.0279388427734375,
-0.06500244140625,
0.0010576248168945312,
0.0276947021484375,
0.01074981689453125,
-0.03485107421875,
0.0241546630859375,
0.00959014892578125,
0.024017333984375,
-0.042816162109375,
0.0309906005859375,
-0.019073486328125,
-0.0091705322265625,
-0.0214385986328125,
0.0100555419921875,
-0.0157470703125,
-0.02508544921875,
0.002330780029296875,
-0.06463623046875,
0.007904052734375,
0.009307861328125,
0.09478759765625,
0.0157623291015625,
-0.0215301513671875,
0.0008969306945800781,
-0.0350341796875,
0.04229736328125,
-0.051605224609375,
0.03594970703125,
0.01904296875,
0.024505615234375,
-0.023712158203125,
-0.0748291015625,
-0.04266357421875,
-0.0107879638671875,
0.0043487548828125,
0.023773193359375,
-0.0408935546875,
-0.0026302337646484375,
0.015411376953125,
0.042266845703125,
-0.052520751953125,
0.004611968994140625,
-0.03350830078125,
-0.0157623291015625,
0.049560546875,
0.0119781494140625,
0.0196380615234375,
0.00855255126953125,
-0.031585693359375,
-0.048675537109375,
-0.037353515625,
0.025634765625,
0.0208740234375,
0.0230560302734375,
-0.0479736328125,
0.029693603515625,
0.0007276535034179688,
0.041839599609375,
0.01065826416015625,
-0.0170440673828125,
0.027679443359375,
-0.021148681640625,
-0.032501220703125,
-0.011993408203125,
0.0819091796875,
0.0199737548828125,
-0.006458282470703125,
0.00922393798828125,
-0.01036834716796875,
0.015960693359375,
0.0030498504638671875,
-0.07232666015625,
-0.021575927734375,
0.0230712890625,
-0.03289794921875,
-0.0247802734375,
0.004123687744140625,
-0.036956787109375,
-0.015411376953125,
-0.0023193359375,
0.0258026123046875,
-0.044921875,
-0.041961669921875,
0.017974853515625,
-0.0194244384765625,
0.0249786376953125,
0.04150390625,
-0.052825927734375,
0.031951904296875,
0.038909912109375,
0.07098388671875,
0.0013055801391601562,
-0.01861572265625,
-0.0146026611328125,
-0.0207366943359375,
-0.025360107421875,
0.0528564453125,
-0.01515960693359375,
-0.0292510986328125,
-0.022613525390625,
-0.00750732421875,
-0.0059051513671875,
-0.03717041015625,
0.043914794921875,
-0.026336669921875,
0.0250091552734375,
-0.02935791015625,
-0.0112457275390625,
-0.026123046875,
0.015380859375,
-0.045257568359375,
0.0968017578125,
0.024139404296875,
-0.0640869140625,
0.01259613037109375,
-0.0499267578125,
-0.00031304359436035156,
-0.0165863037109375,
0.0023345947265625,
-0.0316162109375,
-0.00978851318359375,
0.034027099609375,
0.01387786865234375,
-0.037017822265625,
0.002956390380859375,
-0.038238525390625,
-0.019317626953125,
0.02215576171875,
-0.0218658447265625,
0.0682373046875,
0.0257568359375,
-0.034332275390625,
0.0034389495849609375,
-0.037750244140625,
-0.0010528564453125,
0.0152435302734375,
-0.010894775390625,
-0.004138946533203125,
-0.0255279541015625,
0.0103759765625,
0.0244293212890625,
0.0247802734375,
-0.04022216796875,
0.03228759765625,
-0.03131103515625,
0.04058837890625,
0.061065673828125,
-0.0094757080078125,
0.023773193359375,
-0.031494140625,
0.042755126953125,
0.01165771484375,
0.045318603515625,
0.0016336441040039062,
-0.05560302734375,
-0.061981201171875,
-0.033660888671875,
0.032501220703125,
0.0231170654296875,
-0.052398681640625,
0.0243377685546875,
-0.014312744140625,
-0.067138671875,
-0.04608154296875,
-0.01082611083984375,
0.04412841796875,
0.040069580078125,
0.031280517578125,
-0.051788330078125,
-0.0281524658203125,
-0.04913330078125,
0.006587982177734375,
-0.037078857421875,
0.01236724853515625,
0.025390625,
0.0390625,
-0.01158905029296875,
0.07586669921875,
-0.037872314453125,
-0.022216796875,
-0.006237030029296875,
0.0006818771362304688,
0.02239990234375,
0.036041259765625,
0.06463623046875,
-0.050872802734375,
-0.0296630859375,
0.0169525146484375,
-0.06884765625,
-0.0014667510986328125,
0.0247039794921875,
-0.03125,
0.023895263671875,
0.021881103515625,
-0.0626220703125,
0.049468994140625,
0.049468994140625,
-0.03790283203125,
0.0295867919921875,
-0.01206207275390625,
-0.00015079975128173828,
-0.07830810546875,
0.0227813720703125,
0.01294708251953125,
-0.00994873046875,
-0.0350341796875,
0.00902557373046875,
-0.007083892822265625,
-0.000020682811737060547,
-0.036407470703125,
0.06329345703125,
-0.0301971435546875,
0.01058197021484375,
-0.0017194747924804688,
0.00151824951171875,
-0.0024662017822265625,
0.051055908203125,
-0.006473541259765625,
0.047637939453125,
0.0526123046875,
-0.03277587890625,
0.0262908935546875,
0.034423828125,
-0.00433349609375,
0.0283355712890625,
-0.06988525390625,
0.01424407958984375,
-0.00907135009765625,
0.04901123046875,
-0.0665283203125,
-0.017913818359375,
0.04754638671875,
-0.047210693359375,
0.0203857421875,
-0.01367950439453125,
-0.033172607421875,
-0.035064697265625,
-0.025970458984375,
0.035369873046875,
0.048309326171875,
-0.053680419921875,
0.05169677734375,
0.0166168212890625,
0.0034885406494140625,
-0.05706787109375,
-0.0396728515625,
-0.00994110107421875,
-0.025848388671875,
-0.059722900390625,
0.031280517578125,
0.0016908645629882812,
-0.00789642333984375,
-0.0015764236450195312,
-0.0169219970703125,
0.0033283233642578125,
-0.000537872314453125,
0.056915283203125,
0.029205322265625,
-0.0209503173828125,
-0.0135040283203125,
-0.002117156982421875,
-0.00830841064453125,
-0.00867462158203125,
-0.02862548828125,
0.0523681640625,
-0.031951904296875,
-0.0093231201171875,
-0.05194091796875,
-0.0179290771484375,
0.043426513671875,
-0.0361328125,
0.0677490234375,
0.053253173828125,
-0.0229034423828125,
0.0084991455078125,
-0.04144287109375,
-0.01100921630859375,
-0.036285400390625,
-0.0007724761962890625,
-0.0278472900390625,
-0.0640869140625,
0.05792236328125,
0.031494140625,
0.023712158203125,
0.05731201171875,
0.039947509765625,
0.024261474609375,
0.0771484375,
0.04547119140625,
-0.0241851806640625,
0.03900146484375,
-0.04107666015625,
0.00449371337890625,
-0.05072021484375,
-0.02947998046875,
-0.043304443359375,
-0.025848388671875,
-0.049346923828125,
-0.029022216796875,
0.03875732421875,
0.0270538330078125,
-0.039794921875,
0.040496826171875,
-0.04901123046875,
-0.0003306865692138672,
0.040740966796875,
0.020477294921875,
0.018341064453125,
0.0032100677490234375,
-0.0024127960205078125,
0.01375579833984375,
-0.05352783203125,
-0.033782958984375,
0.084228515625,
0.038726806640625,
0.06768798828125,
0.0160675048828125,
0.04425048828125,
-0.0021305084228515625,
0.042572021484375,
-0.02166748046875,
0.0226898193359375,
0.0212249755859375,
-0.047454833984375,
-0.013214111328125,
-0.042083740234375,
-0.0830078125,
0.03076171875,
-0.01186370849609375,
-0.06512451171875,
0.0261383056640625,
0.0136260986328125,
-0.040985107421875,
0.01552581787109375,
-0.052764892578125,
0.08099365234375,
-0.007297515869140625,
-0.0200042724609375,
0.00698089599609375,
-0.051177978515625,
0.02471923828125,
0.012054443359375,
0.0026454925537109375,
0.0020961761474609375,
-0.0024700164794921875,
0.050933837890625,
-0.055877685546875,
0.0638427734375,
-0.017242431640625,
-0.0117034912109375,
0.03857421875,
-0.00984954833984375,
0.0141754150390625,
0.0128631591796875,
-0.005176544189453125,
0.041839599609375,
0.011993408203125,
-0.0281982421875,
-0.041168212890625,
0.0513916015625,
-0.09027099609375,
-0.0150146484375,
-0.04962158203125,
-0.021820068359375,
0.01468658447265625,
0.005199432373046875,
0.03350830078125,
0.0357666015625,
-0.0213165283203125,
-0.00580596923828125,
0.0275726318359375,
-0.0230255126953125,
0.021820068359375,
0.031494140625,
-0.0360107421875,
-0.051239013671875,
0.0626220703125,
0.00620269775390625,
0.0009889602661132812,
0.01277923583984375,
0.01006317138671875,
-0.0318603515625,
-0.0218963623046875,
-0.03668212890625,
0.035400390625,
-0.026580810546875,
-0.033905029296875,
-0.059326171875,
-0.0220184326171875,
-0.052276611328125,
0.0163421630859375,
-0.0328369140625,
-0.0305633544921875,
-0.029022216796875,
0.0004343986511230469,
0.043701171875,
0.048797607421875,
-0.00809478759765625,
0.034698486328125,
-0.04229736328125,
0.00922393798828125,
0.015655517578125,
0.0126953125,
0.01355743408203125,
-0.05908203125,
-0.0091705322265625,
0.021942138671875,
-0.060791015625,
-0.038421630859375,
0.040557861328125,
0.00800323486328125,
0.0255889892578125,
0.03668212890625,
0.0022258758544921875,
0.07012939453125,
-0.010894775390625,
0.0672607421875,
0.01393890380859375,
-0.054840087890625,
0.03131103515625,
-0.03143310546875,
0.0108642578125,
0.023773193359375,
0.030426025390625,
-0.03082275390625,
-0.03253173828125,
-0.08013916015625,
-0.058197021484375,
0.07122802734375,
0.040985107421875,
-0.002758026123046875,
0.0045013427734375,
0.041900634765625,
-0.00312042236328125,
0.017242431640625,
-0.0546875,
-0.029998779296875,
-0.02685546875,
-0.0012350082397460938,
-0.01233673095703125,
0.0028209686279296875,
0.0000642538070678711,
-0.029205322265625,
0.059051513671875,
0.0005083084106445312,
0.0341796875,
0.0174407958984375,
0.016937255859375,
-0.00867462158203125,
-0.016693115234375,
0.033966064453125,
0.0374755859375,
-0.024871826171875,
-0.020050048828125,
0.00812530517578125,
-0.03961181640625,
-0.01540374755859375,
0.027923583984375,
0.006397247314453125,
-0.007579803466796875,
0.022735595703125,
0.07489013671875,
-0.0091552734375,
-0.034698486328125,
0.0479736328125,
-0.027923583984375,
-0.006984710693359375,
-0.0204925537109375,
0.01433563232421875,
0.00412750244140625,
0.035614013671875,
0.01462554931640625,
0.01045989990234375,
-0.007381439208984375,
-0.04156494140625,
-0.00446319580078125,
0.0181121826171875,
-0.02642822265625,
-0.041412353515625,
0.0732421875,
-0.0004858970642089844,
-0.005367279052734375,
0.054229736328125,
-0.005886077880859375,
-0.02899169921875,
0.04901123046875,
0.028961181640625,
0.042877197265625,
-0.03350830078125,
0.0079498291015625,
0.035797119140625,
0.010894775390625,
-0.0255584716796875,
0.0215606689453125,
0.0020771026611328125,
-0.045806884765625,
-0.0167694091796875,
-0.0494384765625,
-0.021331787109375,
0.00949859619140625,
-0.057373046875,
0.03509521484375,
-0.0389404296875,
-0.036407470703125,
0.0035552978515625,
-0.0060272216796875,
-0.05035400390625,
0.01297760009765625,
0.004596710205078125,
0.08184814453125,
-0.058624267578125,
0.0548095703125,
0.05615234375,
-0.0528564453125,
-0.0841064453125,
-0.026824951171875,
-0.0013437271118164062,
-0.059722900390625,
0.0305633544921875,
0.0155487060546875,
0.00920867919921875,
-0.005321502685546875,
-0.05828857421875,
-0.06610107421875,
0.09619140625,
0.047088623046875,
-0.01280975341796875,
-0.01445770263671875,
-0.006092071533203125,
0.05950927734375,
-0.01319122314453125,
0.061981201171875,
0.042236328125,
0.02685546875,
0.012176513671875,
-0.0902099609375,
0.004688262939453125,
-0.036834716796875,
0.0102691650390625,
0.01230621337890625,
-0.08148193359375,
0.0848388671875,
-0.0001621246337890625,
-0.014739990234375,
0.0248565673828125,
0.06500244140625,
0.01904296875,
0.0159912109375,
0.0299835205078125,
0.07269287109375,
0.0501708984375,
-0.02069091796875,
0.09820556640625,
-0.0166168212890625,
0.041900634765625,
0.058837890625,
0.0005216598510742188,
0.057586669921875,
0.00846099853515625,
-0.01352691650390625,
0.04315185546875,
0.063232421875,
0.01465606689453125,
0.0272979736328125,
-0.002475738525390625,
-0.007633209228515625,
-0.0029964447021484375,
-0.0159454345703125,
-0.05377197265625,
0.037933349609375,
0.0142059326171875,
-0.01279449462890625,
-0.0220794677734375,
0.0011730194091796875,
0.01445770263671875,
-0.021728515625,
-0.00827789306640625,
0.04888916015625,
0.021728515625,
-0.047119140625,
0.08660888671875,
0.01953125,
0.051055908203125,
-0.043609619140625,
-0.0029811859130859375,
-0.033966064453125,
0.01425933837890625,
-0.0227203369140625,
-0.03729248046875,
-0.01010894775390625,
-0.003627777099609375,
0.01088714599609375,
-0.014007568359375,
0.0352783203125,
-0.0187835693359375,
-0.009552001953125,
0.023895263671875,
0.03155517578125,
0.021392822265625,
-0.0215606689453125,
-0.061065673828125,
0.02386474609375,
-0.004810333251953125,
-0.0252532958984375,
0.03204345703125,
0.03436279296875,
-0.0186004638671875,
0.051666259765625,
0.051666259765625,
-0.005863189697265625,
-0.0006914138793945312,
-0.009002685546875,
0.0882568359375,
-0.03668212890625,
-0.032623291015625,
-0.05517578125,
0.037750244140625,
-0.00313568115234375,
-0.051910400390625,
0.05816650390625,
0.05548095703125,
0.070556640625,
0.0185394287109375,
0.042572021484375,
-0.0276947021484375,
0.01629638671875,
-0.01442718505859375,
0.050689697265625,
-0.05316162109375,
0.0006103515625,
-0.027923583984375,
-0.0762939453125,
-0.0032672882080078125,
0.05126953125,
-0.01346588134765625,
0.0144805908203125,
0.0360107421875,
0.07293701171875,
-0.0165252685546875,
0.00812530517578125,
0.0019893646240234375,
0.02459716796875,
0.034942626953125,
0.050567626953125,
0.05938720703125,
-0.06121826171875,
0.047698974609375,
-0.0287628173828125,
-0.03607177734375,
-0.0118865966796875,
-0.04022216796875,
-0.07354736328125,
-0.038604736328125,
-0.0234375,
-0.05804443359375,
0.00696563720703125,
0.06390380859375,
0.04925537109375,
-0.04388427734375,
-0.029693603515625,
0.006862640380859375,
-0.0082244873046875,
-0.030975341796875,
-0.01580810546875,
0.0293731689453125,
0.004444122314453125,
-0.06072998046875,
0.005886077880859375,
0.005443572998046875,
0.0145111083984375,
-0.0037288665771484375,
-0.01422119140625,
0.00528717041015625,
-0.0121307373046875,
0.033782958984375,
0.044921875,
-0.04559326171875,
-0.017974853515625,
-0.000050067901611328125,
-0.023681640625,
0.0142669677734375,
0.0256195068359375,
-0.05517578125,
0.0159759521484375,
0.0243682861328125,
0.02459716796875,
0.0640869140625,
0.0181427001953125,
0.026123046875,
-0.03350830078125,
0.0287322998046875,
0.00231170654296875,
0.0245361328125,
0.0117034912109375,
-0.007747650146484375,
0.0538330078125,
0.017303466796875,
-0.0406494140625,
-0.06536865234375,
-0.0157623291015625,
-0.09417724609375,
0.0027370452880859375,
0.08056640625,
-0.01491546630859375,
-0.040130615234375,
0.0163116455078125,
-0.036346435546875,
0.0255889892578125,
-0.057525634765625,
0.0482177734375,
0.033233642578125,
-0.009490966796875,
0.00130462646484375,
-0.03375244140625,
0.0295867919921875,
0.020751953125,
-0.05035400390625,
-0.01361083984375,
0.03338623046875,
0.0226898193359375,
0.020782470703125,
0.056884765625,
-0.0227813720703125,
0.0235443115234375,
-0.00385284423828125,
0.024444580078125,
-0.0135498046875,
-0.01251983642578125,
-0.0225677490234375,
0.002338409423828125,
0.00653076171875,
-0.01285552978515625
]
] |
shi-labs/oneformer_ade20k_swin_large | 2023-01-19T14:36:03.000Z | [
"transformers",
"pytorch",
"oneformer",
"vision",
"image-segmentation",
"universal-image-segmentation",
"dataset:scene_parse_150",
"arxiv:2211.06220",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | image-segmentation | shi-labs | null | null | shi-labs/oneformer_ade20k_swin_large | 6 | 86,319 | transformers | 2022-11-15T19:00:56 | ---
license: mit
tags:
- vision
- image-segmentation
- universal-image-segmentation
datasets:
- scene_parse_150
widget:
- src: https://praeclarumjj3.github.io/files/ade20k.jpeg
example_title: House
- src: https://praeclarumjj3.github.io/files/demo_2.jpg
example_title: Airplane
- src: https://praeclarumjj3.github.io/files/coco.jpeg
example_title: Person
---
# OneFormer
OneFormer model trained on the ADE20k dataset (large-sized version, Swin backbone). It was introduced in the paper [OneFormer: One Transformer to Rule Universal Image Segmentation](https://arxiv.org/abs/2211.06220) by Jain et al. and first released in [this repository](https://github.com/SHI-Labs/OneFormer).

## Model description
OneFormer is the first multi-task universal image segmentation framework. It needs to be trained only once with a single universal architecture, a single model, and on a single dataset, to outperform existing specialized models across semantic, instance, and panoptic segmentation tasks. OneFormer uses a task token to condition the model on the task in focus, making the architecture task-guided for training, and task-dynamic for inference, all with a single model.

## Intended uses & limitations
You can use this particular checkpoint for semantic, instance and panoptic segmentation. See the [model hub](https://huggingface.co/models?search=oneformer) to look for other fine-tuned versions on a different dataset.
### How to use
Here is how to use this model:
```python
from transformers import OneFormerProcessor, OneFormerForUniversalSegmentation
from PIL import Image
import requests
url = "https://huggingface.co/datasets/shi-labs/oneformer_demo/blob/main/ade20k.jpeg"
image = Image.open(requests.get(url, stream=True).raw)
# Loading a single model for all three tasks
processor = OneFormerProcessor.from_pretrained("shi-labs/oneformer_ade20k_swin_large")
model = OneFormerForUniversalSegmentation.from_pretrained("shi-labs/oneformer_ade20k_swin_large")
# Semantic Segmentation
semantic_inputs = processor(images=image, task_inputs=["semantic"], return_tensors="pt")
semantic_outputs = model(**semantic_inputs)
# pass through image_processor for postprocessing
predicted_semantic_map = processor.post_process_semantic_segmentation(semantic_outputs, target_sizes=[image.size[::-1]])[0]
# Instance Segmentation
instance_inputs = processor(images=image, task_inputs=["instance"], return_tensors="pt")
instance_outputs = model(**instance_inputs)
# pass through image_processor for postprocessing
predicted_instance_map = processor.post_process_instance_segmentation(instance_outputs, target_sizes=[image.size[::-1]])[0]["segmentation"]
# Panoptic Segmentation
panoptic_inputs = processor(images=image, task_inputs=["panoptic"], return_tensors="pt")
panoptic_outputs = model(**panoptic_inputs)
# pass through image_processor for postprocessing
predicted_panoptic_map = processor.post_process_panoptic_segmentation(panoptic_outputs, target_sizes=[image.size[::-1]])[0]["segmentation"]
```
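One subtlety in the snippet above: PIL's `Image.size` is `(width, height)`, while the `post_process_*` methods expect target sizes as `(height, width)` — hence the `[::-1]`. A minimal illustration (plain tuples stand in for the PIL value):

```python
# PIL's Image.size for a 640x480 image returns (width, height).
size_wh = (640, 480)
# Reversing it yields (height, width), the order expected by the
# post_process_* target_sizes argument.
target_hw = size_wh[::-1]
print(target_hw)  # → (480, 640)
```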
For more examples, please refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/oneformer).
### Citation
```bibtex
@article{jain2022oneformer,
title={{OneFormer: One Transformer to Rule Universal Image Segmentation}},
author={Jitesh Jain and Jiachen Li and MangTik Chiu and Ali Hassani and Nikita Orlov and Humphrey Shi},
journal={arXiv},
year={2022}
}
```
| 3,694 | [
[
-0.0489501953125,
-0.056304931640625,
0.0250091552734375,
0.0102996826171875,
-0.0214691162109375,
-0.047149658203125,
0.02130126953125,
-0.019744873046875,
0.00301361083984375,
0.0506591796875,
-0.07611083984375,
-0.04510498046875,
-0.046234130859375,
-0.0186614990234375,
-0.0283050537109375,
0.056304931640625,
-0.002666473388671875,
0.0209197998046875,
-0.0199737548828125,
-0.034271240234375,
0.002269744873046875,
-0.0223236083984375,
-0.046539306640625,
-0.0172576904296875,
0.01428985595703125,
0.01163482666015625,
0.04010009765625,
0.05401611328125,
0.048065185546875,
0.029052734375,
-0.009033203125,
-0.0189056396484375,
-0.01430511474609375,
-0.012786865234375,
0.0104522705078125,
-0.0253448486328125,
-0.026214599609375,
0.01007843017578125,
0.057281494140625,
0.05084228515625,
0.007511138916015625,
0.0252227783203125,
-0.0273590087890625,
0.048431396484375,
-0.041748046875,
-0.00481414794921875,
-0.0242767333984375,
0.0253753662109375,
-0.016754150390625,
0.0218963623046875,
-0.0082244873046875,
-0.0219268798828125,
0.0193634033203125,
-0.06317138671875,
0.032073974609375,
-0.0227508544921875,
0.09521484375,
0.0384521484375,
-0.01160430908203125,
0.007183074951171875,
-0.03192138671875,
0.047271728515625,
-0.05389404296875,
0.0203399658203125,
0.016845703125,
0.035247802734375,
0.01409149169921875,
-0.075439453125,
-0.0462646484375,
0.0076446533203125,
-0.012359619140625,
0.007659912109375,
-0.03131103515625,
0.0035572052001953125,
0.0273590087890625,
0.0186614990234375,
-0.04229736328125,
0.00812530517578125,
-0.054931640625,
-0.007488250732421875,
0.0310516357421875,
0.0035724639892578125,
0.0192413330078125,
-0.0211029052734375,
-0.05389404296875,
-0.02520751953125,
-0.0247802734375,
0.02386474609375,
0.010498046875,
-0.00647735595703125,
-0.01326751708984375,
0.0169525146484375,
0.007381439208984375,
0.05450439453125,
0.0309600830078125,
-0.0229644775390625,
0.016754150390625,
-0.00298309326171875,
-0.0206451416015625,
-0.0004525184631347656,
0.06781005859375,
0.0099029541015625,
0.0025157928466796875,
0.005588531494140625,
-0.0006656646728515625,
0.0152740478515625,
0.032470703125,
-0.07940673828125,
-0.0166168212890625,
0.012115478515625,
-0.035186767578125,
-0.02020263671875,
0.03302001953125,
-0.047393798828125,
-0.0040130615234375,
-0.002368927001953125,
0.04644775390625,
-0.045654296875,
-0.0135040283203125,
0.004199981689453125,
-0.012725830078125,
0.046112060546875,
0.01416778564453125,
-0.04815673828125,
0.040679931640625,
0.034332275390625,
0.056488037109375,
-0.0120697021484375,
-0.03851318359375,
-0.004741668701171875,
-0.0215606689453125,
-0.0200042724609375,
0.0728759765625,
-0.0171966552734375,
-0.00894927978515625,
-0.0268096923828125,
0.0261383056640625,
-0.037200927734375,
-0.040252685546875,
0.0452880859375,
-0.034088134765625,
0.04010009765625,
-0.006053924560546875,
-0.033477783203125,
-0.0511474609375,
0.0158233642578125,
-0.0477294921875,
0.0556640625,
0.0328369140625,
-0.061553955078125,
0.036895751953125,
-0.074951171875,
-0.01922607421875,
0.0005526542663574219,
-0.020050048828125,
-0.060333251953125,
-0.0186920166015625,
0.027374267578125,
0.0275421142578125,
-0.00937652587890625,
-0.00489044189453125,
-0.01934814453125,
-0.002368927001953125,
-0.00583648681640625,
-0.0189361572265625,
0.0751953125,
0.0034923553466796875,
-0.02313232421875,
0.038787841796875,
-0.057952880859375,
0.00567626953125,
0.040313720703125,
0.00701904296875,
0.0090789794921875,
-0.0250091552734375,
0.0255279541015625,
0.03173828125,
0.003116607666015625,
-0.06622314453125,
0.02032470703125,
-0.023529052734375,
0.037506103515625,
0.045867919921875,
0.00949859619140625,
0.0418701171875,
-0.0167999267578125,
0.041717529296875,
0.020050048828125,
0.03765869140625,
-0.035797119140625,
-0.0202484130859375,
-0.0555419921875,
-0.034912109375,
0.0142669677734375,
0.038482666015625,
-0.00737762451171875,
0.04052734375,
-0.0033855438232421875,
-0.05120849609375,
-0.047393798828125,
-0.00983428955078125,
0.01904296875,
0.051116943359375,
0.029388427734375,
-0.032318115234375,
-0.054290771484375,
-0.08062744140625,
0.03863525390625,
0.0132904052734375,
0.00283050537109375,
0.0030460357666015625,
0.039703369140625,
-0.02777099609375,
0.07440185546875,
-0.04473876953125,
-0.0262451171875,
-0.01611328125,
-0.0007963180541992188,
0.01070404052734375,
0.063232421875,
0.05267333984375,
-0.0657958984375,
-0.0191650390625,
-0.026641845703125,
-0.06134033203125,
0.0208587646484375,
-0.00836181640625,
-0.024322509765625,
0.018951416015625,
0.0173187255859375,
-0.041229248046875,
0.05035400390625,
0.025360107421875,
-0.031219482421875,
0.057525634765625,
-0.005764007568359375,
0.00867462158203125,
-0.06982421875,
0.01058197021484375,
0.0159149169921875,
-0.01308441162109375,
-0.03887939453125,
0.0006937980651855469,
-0.0012836456298828125,
-0.0181732177734375,
-0.037445068359375,
0.04144287109375,
-0.01177215576171875,
0.0022640228271484375,
-0.0141143798828125,
-0.021392822265625,
0.01104736328125,
0.055572509765625,
0.0024662017822265625,
0.036346435546875,
0.0794677734375,
-0.046417236328125,
0.01250457763671875,
0.055877685546875,
-0.02410888671875,
0.03326416015625,
-0.07598876953125,
0.0032634735107421875,
-0.0272064208984375,
0.0184326171875,
-0.08380126953125,
-0.0179595947265625,
0.02252197265625,
-0.0149078369140625,
0.0250244140625,
-0.017913818359375,
-0.0129852294921875,
-0.03887939453125,
-0.0158843994140625,
0.0323486328125,
0.026702880859375,
-0.053741455078125,
0.044464111328125,
0.035308837890625,
0.00640106201171875,
-0.01502227783203125,
-0.0526123046875,
-0.035675048828125,
-0.00909423828125,
-0.0848388671875,
0.03460693359375,
0.01250457763671875,
0.007358551025390625,
0.00991058349609375,
-0.04248046875,
-0.0250091552734375,
-0.01331329345703125,
0.039947509765625,
0.040924072265625,
-0.0260162353515625,
-0.0197296142578125,
0.00024437904357910156,
-0.0259552001953125,
0.006786346435546875,
-0.018951416015625,
0.036590576171875,
-0.01200103759765625,
-0.03521728515625,
-0.051513671875,
0.0214080810546875,
0.052154541015625,
-0.029022216796875,
0.036895751953125,
0.07318115234375,
-0.03460693359375,
0.01071929931640625,
-0.05389404296875,
-0.016357421875,
-0.0316162109375,
0.034912109375,
-0.038116455078125,
-0.037139892578125,
0.0509033203125,
-0.004039764404296875,
0.01166534423828125,
0.06573486328125,
0.005840301513671875,
0.0172119140625,
0.09625244140625,
0.0498046875,
0.0110321044921875,
0.056365966796875,
-0.06341552734375,
-0.0106658935546875,
-0.06304931640625,
-0.016510009765625,
-0.0174407958984375,
-0.020599365234375,
-0.025421142578125,
-0.0240325927734375,
0.0498046875,
0.0225677490234375,
-0.0335693359375,
0.0380859375,
-0.0616455078125,
0.0201416015625,
0.0462646484375,
0.0236663818359375,
-0.01380157470703125,
0.006855010986328125,
-0.0247039794921875,
0.007251739501953125,
-0.0732421875,
-0.04473876953125,
0.05206298828125,
0.035980224609375,
0.0631103515625,
-0.04150390625,
0.031524658203125,
-0.0218963623046875,
0.0032100677490234375,
-0.058563232421875,
0.039581298828125,
-0.03289794921875,
-0.04248046875,
-0.0239410400390625,
-0.00479888916015625,
-0.07464599609375,
0.0250244140625,
-0.01503753662109375,
-0.069580078125,
0.0211944580078125,
-0.0002593994140625,
-0.035552978515625,
0.042999267578125,
-0.0489501953125,
0.1148681640625,
-0.01312255859375,
-0.0185699462890625,
0.00536346435546875,
-0.06134033203125,
0.015899658203125,
0.01374053955078125,
-0.01358795166015625,
-0.004913330078125,
0.0254364013671875,
0.0911865234375,
-0.039306640625,
0.05194091796875,
-0.025726318359375,
0.020050048828125,
0.032684326171875,
-0.0016355514526367188,
0.01480865478515625,
-0.006305694580078125,
-0.0043487548828125,
0.021697998046875,
0.0396728515625,
-0.041290283203125,
-0.024139404296875,
0.032928466796875,
-0.0679931640625,
-0.030242919921875,
-0.01503753662109375,
-0.0252227783203125,
0.01263427734375,
0.0185699462890625,
0.05975341796875,
0.0472412109375,
-0.029571533203125,
0.0010385513305664062,
0.040679931640625,
0.0007848739624023438,
0.0224761962890625,
0.01071929931640625,
-0.01030731201171875,
-0.0279083251953125,
0.0333251953125,
0.001209259033203125,
0.00640869140625,
0.0189056396484375,
0.0214996337890625,
-0.0203094482421875,
-0.008575439453125,
-0.0552978515625,
0.015167236328125,
-0.057220458984375,
-0.016876220703125,
-0.060943603515625,
-0.0286712646484375,
-0.0455322265625,
-0.017059326171875,
-0.06292724609375,
-0.037261962890625,
-0.0122222900390625,
0.0157012939453125,
0.0161895751953125,
0.048065185546875,
0.0049285888671875,
0.0340576171875,
-0.054229736328125,
0.0235595703125,
0.0294342041015625,
0.0240325927734375,
-0.01343536376953125,
-0.046905517578125,
0.00402069091796875,
-0.0128173828125,
-0.043060302734375,
-0.06231689453125,
0.0158843994140625,
0.00389862060546875,
0.061492919921875,
0.0567626953125,
-0.0296173095703125,
0.057769775390625,
-0.019195556640625,
0.045806884765625,
0.0250701904296875,
-0.060272216796875,
0.0506591796875,
0.006450653076171875,
0.052337646484375,
0.035797119140625,
0.046630859375,
-0.035430908203125,
0.0158843994140625,
-0.049468994140625,
-0.06884765625,
0.07745361328125,
-0.00438690185546875,
-0.018157958984375,
0.0208740234375,
0.01444244384765625,
0.0268096923828125,
-0.006992340087890625,
-0.03656005859375,
-0.04119873046875,
-0.040191650390625,
-0.00307464599609375,
0.019439697265625,
-0.0025634765625,
-0.01410675048828125,
-0.041534423828125,
0.05352783203125,
0.00093841552734375,
0.03118896484375,
0.040618896484375,
-0.010528564453125,
-0.0310821533203125,
0.004543304443359375,
0.03082275390625,
0.05767822265625,
-0.009063720703125,
-0.01006317138671875,
0.006366729736328125,
-0.036590576171875,
-0.007068634033203125,
0.0350341796875,
-0.026580810546875,
-0.0072174072265625,
0.027313232421875,
0.0936279296875,
0.006717681884765625,
-0.022369384765625,
0.033050537109375,
0.002399444580078125,
-0.040496826171875,
-0.02691650390625,
0.00028014183044433594,
-0.0154571533203125,
0.01526641845703125,
0.036376953125,
0.035919189453125,
0.0026264190673828125,
-0.0084075927734375,
0.0218963623046875,
0.032562255859375,
-0.040863037109375,
-0.027252197265625,
0.0545654296875,
-0.013458251953125,
-0.0034580230712890625,
0.042022705078125,
-0.0308074951171875,
-0.06756591796875,
0.06402587890625,
0.040496826171875,
0.0714111328125,
-0.00714111328125,
0.01485443115234375,
0.06024169921875,
0.009613037109375,
-0.0031585693359375,
-0.007343292236328125,
-0.000579833984375,
-0.053741455078125,
-0.02099609375,
-0.05511474609375,
-0.013641357421875,
0.01088714599609375,
-0.04443359375,
0.0131378173828125,
-0.0312347412109375,
-0.0091705322265625,
0.0306854248046875,
-0.0038356781005859375,
-0.0648193359375,
0.01122283935546875,
0.01837158203125,
0.067626953125,
-0.046905517578125,
0.038238525390625,
0.096435546875,
-0.0302276611328125,
-0.054901123046875,
-0.02490234375,
0.001316070556640625,
-0.05694580078125,
0.032073974609375,
0.0291595458984375,
0.014892578125,
-0.00031447410583496094,
-0.055938720703125,
-0.060028076171875,
0.082763671875,
0.005817413330078125,
-0.0190277099609375,
-0.006534576416015625,
-0.0051422119140625,
0.01190185546875,
-0.04510498046875,
0.0192413330078125,
0.029876708984375,
0.042877197265625,
0.032806396484375,
-0.051116943359375,
0.0185546875,
-0.01776123046875,
-0.005718231201171875,
0.0220184326171875,
-0.031951904296875,
0.07525634765625,
-0.0293426513671875,
-0.0252227783203125,
0.0095062255859375,
0.0296630859375,
0.01390838623046875,
0.009307861328125,
0.0667724609375,
0.06463623046875,
0.029022216796875,
-0.0034580230712890625,
0.0638427734375,
-0.0206146240234375,
0.03350830078125,
0.05810546875,
0.014068603515625,
0.029449462890625,
0.0261383056640625,
-0.025177001953125,
0.0430908203125,
0.034271240234375,
-0.032318115234375,
0.0260162353515625,
0.0008883476257324219,
-0.00022673606872558594,
-0.027557373046875,
0.01165771484375,
-0.0234375,
0.06304931640625,
0.0090789794921875,
-0.0211029052734375,
-0.038604736328125,
0.0222625732421875,
-0.005462646484375,
-0.01398468017578125,
-0.0266571044921875,
0.0555419921875,
-0.020751953125,
-0.042449951171875,
0.04400634765625,
0.005893707275390625,
0.0653076171875,
-0.06640625,
-0.0027618408203125,
0.0048675537109375,
0.0179443359375,
-0.0246429443359375,
-0.043304443359375,
0.0258026123046875,
-0.0258331298828125,
-0.019805908203125,
-0.0014247894287109375,
0.0650634765625,
-0.0209197998046875,
-0.058837890625,
0.005218505859375,
0.0267333984375,
0.0109710693359375,
-0.035308837890625,
-0.07122802734375,
0.010345458984375,
-0.01070404052734375,
-0.044830322265625,
0.00656890869140625,
0.0110626220703125,
0.0037860870361328125,
0.03948974609375,
0.036346435546875,
-0.0188446044921875,
0.0125579833984375,
-0.0020198822021484375,
0.07940673828125,
-0.043792724609375,
-0.035003662109375,
-0.04193115234375,
0.0364990234375,
0.0007295608520507812,
-0.047027587890625,
0.045806884765625,
0.048858642578125,
0.07244873046875,
-0.01470947265625,
0.043304443359375,
-0.0254669189453125,
0.013336181640625,
-0.00989532470703125,
0.042510986328125,
-0.041412353515625,
-0.0236968994140625,
-0.0196685791015625,
-0.052490234375,
-0.01178741455078125,
0.0789794921875,
-0.0316162109375,
0.0054168701171875,
0.048919677734375,
0.071044921875,
-0.026580810546875,
0.005641937255859375,
0.01222991943359375,
0.0009984970092773438,
0.0016469955444335938,
0.0223388671875,
0.02099609375,
-0.06341552734375,
0.0364990234375,
-0.0626220703125,
-0.0229339599609375,
0.0157470703125,
-0.02935791015625,
-0.043487548828125,
-0.050140380859375,
-0.04150390625,
-0.022064208984375,
-0.014404296875,
0.07025146484375,
0.10028076171875,
-0.07012939453125,
-0.00998687744140625,
-0.016510009765625,
0.0017538070678710938,
-0.030975341796875,
-0.021209716796875,
0.036529541015625,
-0.0197906494140625,
-0.05584716796875,
0.01027679443359375,
0.0185089111328125,
0.007354736328125,
-0.0193939208984375,
-0.0043487548828125,
-0.0288238525390625,
-0.021209716796875,
0.036712646484375,
0.032073974609375,
-0.039703369140625,
0.005718231201171875,
0.004787445068359375,
-0.00604248046875,
0.0297393798828125,
0.048126220703125,
-0.05328369140625,
0.0304107666015625,
0.023773193359375,
0.0268402099609375,
0.07061767578125,
0.01015472412109375,
0.007045745849609375,
-0.031280517578125,
0.0222320556640625,
0.01381683349609375,
0.035003662109375,
0.0229034423828125,
-0.0189971923828125,
0.03363037109375,
0.01678466796875,
-0.044891357421875,
-0.0275115966796875,
0.0131378173828125,
-0.09808349609375,
-0.028167724609375,
0.06683349609375,
-0.0178680419921875,
-0.04388427734375,
0.022705078125,
-0.027313232421875,
0.037933349609375,
-0.00803375244140625,
0.0312347412109375,
0.01183319091796875,
-0.01849365234375,
-0.0203399658203125,
-0.007476806640625,
0.0166015625,
0.017364501953125,
-0.056915283203125,
-0.03582763671875,
0.018798828125,
0.045074462890625,
0.036529541015625,
0.040252685546875,
-0.0307769775390625,
0.02734375,
0.0281829833984375,
0.03387451171875,
-0.0253143310546875,
-0.00301361083984375,
-0.0268707275390625,
0.0238494873046875,
-0.03131103515625,
-0.0484619140625
]
] |
laion/CLIP-ViT-L-14-laion2B-s32B-b82K | 2023-04-18T17:46:39.000Z | [
"open_clip",
"pytorch",
"tensorboard",
"clip",
"zero-shot-image-classification",
"arxiv:2110.09456",
"arxiv:2111.09883",
"arxiv:1910.04867",
"license:mit",
"has_space",
"region:us"
] | zero-shot-image-classification | laion | null | null | laion/CLIP-ViT-L-14-laion2B-s32B-b82K | 32 | 86,084 | open_clip | 2022-09-14T22:51:37 | ---
license: mit
widget:
- src: >-
https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
library_name: open_clip
pipeline_tag: zero-shot-image-classification
---
# Model Card for CLIP ViT-L/14 - LAION-2B
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)
7. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
A CLIP ViT L/14 model trained with the LAION-2B English subset of LAION-5B (https://laion.ai/blog/laion-5b/) using OpenCLIP (https://github.com/mlfoundations/open_clip).
Model training ('babysitting') done by Ross Wightman on the [JUWELS Booster](https://apps.fz-juelich.de/jsc/hps/juwels/booster-overview.html) supercomputer. See acknowledgements below.
# Uses
As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models.
The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the LAION-5B blog (https://laion.ai/blog/laion-5b/) and upcoming paper include additional discussion as it relates specifically to the training dataset.
## Direct Use
Zero-shot image classification, image and text retrieval, among others.
## Downstream Use
Image classification and other image task fine-tuning, linear probe image classification, image generation guiding and conditioning, among others.
## Out-of-Scope Use
As per the OpenAI models,
**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of performance of the model. This is because the use of artificial intelligence for tasks such as these can be premature currently given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
Beyond the above notice, the LAION-5B dataset used to train these models has additional considerations; see below.
# Training Details
## Training Data
This model was trained with the 2 Billion sample English subset of LAION-5B (https://laion.ai/blog/laion-5b/).
**IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated. Keep in mind that its uncurated nature means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a “safe” subset by filtering out samples based on the safety tags (using a customized NSFW classifier that we trained). While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning applies there as well. We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come with training large-scale models, as well as pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. However, while providing our dataset openly, we do not recommend using it for creating ready-to-go industrial products, as the basic research about general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress.
## Training Procedure
The model was trained on 384 A100 GPUs using 200M sample 'virtual' epochs where dataset shards were sampled with replacement. The model was trained with 160 virtual epochs for a total of 32B samples seen.
The first 68 epochs were trained with float16 AMP and a global batch size of 79K (208 per GPU). Training initially ran to epoch 75, where the loss spiked and the run failed with NaNs.
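As a quick sanity check on the schedule figures above (a minimal sketch; all numbers are taken directly from this card):

```python
# Sanity-check the training-schedule figures quoted in this card.
gpus = 384
per_gpu_batch = 208                          # first training phase (float16 AMP)
global_batch = gpus * per_gpu_batch
print(global_batch)                          # 79872, i.e. the "79K" global batch size

samples_per_virtual_epoch = 200_000_000      # dataset shards sampled with replacement
virtual_epochs = 160
total_samples = virtual_epochs * samples_per_virtual_epoch
print(total_samples)                         # 32000000000, i.e. 32B samples seen
```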
Romain Beaumont was training H/14 and g/14 models at the same time on the Stability cluster and hit similar instabilities. Collectively, we tried restarts with:
* different dataset shuffle seed
* different LR
* gradient clipping
* modifications to the architecture
* Norm modifications (stable norm for final, post embed norm for text transformer) as per https://github.com/mlfoundations/open_clip/pull/153 thanks to Phil Wang
* Extra attention block norms ala Normformer (https://arxiv.org/abs/2110.09456)
* Scaled cosine attention ala Swin-V2 (https://arxiv.org/abs/2111.09883)
None of the above worked; most runs blew up within the same epoch as the original, with the exception of the architecture modifications.
* The Normformer mods significantly altered the network, such that resuming did not quickly converge to previous performance; this was abandoned, but it might be worth trying from the start.
* Scaled cosine attention initially looked promising and lasted until epoch 90, before the loss suddenly increased and appeared to remain 'stuck'.
In the end, restarting at epoch 69 with `float32` precision solved all instabilities, and training continued from there with a global batch size of 86k (224 per GPU). On A100 GPUs, `float32` had a minimal impact on throughput once `tf32` matmuls were enabled in PyTorch: approximately 10% slower than `float16` AMP. Romain similarly changed the precision but ended up using `bfloat16` AMP to resolve his issues.
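The `tf32` matmul setting mentioned above can be turned on in PyTorch with a pair of backend flags (a minimal sketch; defaults have changed across PyTorch versions, so setting them explicitly is safest):

```python
import torch

# Allow TensorFloat-32 matmuls and cuDNN convolutions on Ampere (A100) GPUs.
# This is what keeps full float32 training within ~10% of float16 AMP throughput.
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True
```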
### Slurm Script
```bash
#SBATCH --nodes=96
#SBATCH --gres=gpu:4
#SBATCH --ntasks-per-node=4
#SBATCH --cpus-per-task=6
#SBATCH --wait-all-nodes=1
#SBATCH --job-name=open_clip_laion2b
# load low-level libraries
ml purge
source /conda/bin/activate pytorch-112
export NCCL_ASYNC_ERROR_HANDLING=1
export CUDA_VISIBLE_DEVICES=0,1,2,3
export MASTER_PORT=12802
### get the first node name as master address - customized for vgg slurm
### e.g. master(gnodee[2-5],gnoded1) == gnodee2
echo "NODELIST="${SLURM_NODELIST}
master_addr=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_ADDR=$master_addr"i"
echo "MASTER_ADDR="$MASTER_ADDR
cd /home/me/open_clip
export PYTHONPATH="$PYTHONPATH:$PWD/src"
srun --cpu_bind=none,v --accel-bind=gn python -u src/training/main.py \
--save-frequency 1 \
--zeroshot-frequency 1 \
--train-data="/data/laion2B-en/{00000..23295}.tar" \
--train-num-samples=200000000 \
--warmup 10000 \
--lr "1e-3" \
--batch-size=224 \
--epochs=160 \
--workers=6 \
--model ViT-L-14 \
--name "L14-laion2B" \
--report-to "tensorboard" \
--seed 0 \
--precision 'fp32' \
--ddp-static-graph \
--local-loss \
--dataset-resampled \
--gather-with-grad \
--grad-checkpointing
```
# Evaluation
Evaluation done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark).
## Testing Data, Factors & Metrics
### Testing Data
The testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) with additional robustness datasets) for classification, and COCO and Flickr for retrieval.
**TODO** - more detail
## Results
The model achieves 75.3% zero-shot top-1 accuracy on ImageNet-1k.
An initial round of benchmarks has been performed on a wider range of datasets, currently viewable at https://github.com/LAION-AI/CLIP_benchmark/blob/main/benchmark/results.ipynb
**TODO** - create table for just this model's metrics.
# Acknowledgements
We acknowledge the Gauss Centre for Supercomputing e.V. (http://gauss-centre.eu) for funding this part of the work by providing computing time through the John von Neumann Institute for Computing (NIC) on the GCS Supercomputer JUWELS Booster at Jülich Supercomputing Centre (JSC).
# Citation
**BibTeX:**
LAION-5B
```bibtex
@inproceedings{schuhmann2022laionb,
title={{LAION}-5B: An open large-scale dataset for training next generation image-text models},
author={Christoph Schuhmann and
Romain Beaumont and
Richard Vencu and
Cade W Gordon and
Ross Wightman and
Mehdi Cherti and
Theo Coombes and
Aarush Katta and
Clayton Mullis and
Mitchell Wortsman and
Patrick Schramowski and
Srivatsa R Kundurthy and
Katherine Crowson and
Ludwig Schmidt and
Robert Kaczmarczyk and
Jenia Jitsev},
booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
year={2022},
url={https://openreview.net/forum?id=M3Y74vmsMcY}
}
```
OpenAI CLIP paper
```bibtex
@inproceedings{Radford2021LearningTV,
title={Learning Transferable Visual Models From Natural Language Supervision},
author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
booktitle={ICML},
year={2021}
}
```
OpenCLIP software
```bibtex
@software{ilharco_gabriel_2021_5143773,
author = {Ilharco, Gabriel and
Wortsman, Mitchell and
Wightman, Ross and
Gordon, Cade and
Carlini, Nicholas and
Taori, Rohan and
Dave, Achal and
Shankar, Vaishaal and
Namkoong, Hongseok and
Miller, John and
Hajishirzi, Hannaneh and
Farhadi, Ali and
Schmidt, Ludwig},
title = {OpenCLIP},
month = jul,
year = 2021,
note = {If you use this software, please cite it as below.},
publisher = {Zenodo},
version = {0.1},
doi = {10.5281/zenodo.5143773},
url = {https://doi.org/10.5281/zenodo.5143773}
}
```
# How to Get Started with the Model
Use the code below to get started with the model.
**TODO** - Hugging Face transformers, OpenCLIP, and timm getting started snippets | 11,366 | [
[
-0.0255584716796875,
-0.045806884765625,
0.01375579833984375,
0.00476837158203125,
-0.0222625732421875,
-0.0272064208984375,
-0.01190948486328125,
-0.0435791015625,
0.0059051513671875,
0.025482177734375,
-0.034942626953125,
-0.0223541259765625,
-0.048614501953125,
-0.0117950439453125,
-0.0260772705078125,
0.0760498046875,
-0.00896453857421875,
0.003559112548828125,
-0.0015420913696289062,
-0.0185089111328125,
-0.035888671875,
-0.041900634765625,
-0.061676025390625,
-0.01171112060546875,
0.01340484619140625,
0.024169921875,
0.046661376953125,
0.0528564453125,
0.048309326171875,
0.01812744140625,
-0.0177459716796875,
-0.0006265640258789062,
-0.051971435546875,
-0.036865234375,
0.00589752197265625,
-0.0201568603515625,
-0.047088623046875,
0.0021724700927734375,
0.050872802734375,
0.0281829833984375,
-0.005496978759765625,
0.0225677490234375,
0.006343841552734375,
0.032196044921875,
-0.05328369140625,
0.004985809326171875,
-0.04998779296875,
0.00844573974609375,
-0.0182037353515625,
0.0036449432373046875,
-0.0238037109375,
0.0002446174621582031,
0.00787353515625,
-0.046112060546875,
0.0267486572265625,
-0.01258087158203125,
0.09930419921875,
0.0163116455078125,
-0.0235443115234375,
0.0096282958984375,
-0.05438232421875,
0.054046630859375,
-0.05670166015625,
0.0296173095703125,
0.0236663818359375,
0.0285797119140625,
0.00510406494140625,
-0.07684326171875,
-0.023101806640625,
-0.01268768310546875,
0.0124664306640625,
0.018096923828125,
-0.0181732177734375,
-0.00507354736328125,
0.040771484375,
0.02508544921875,
-0.034423828125,
0.006114959716796875,
-0.056915283203125,
-0.01244354248046875,
0.055267333984375,
0.00504302978515625,
0.0038127899169921875,
-0.0198974609375,
-0.04681396484375,
-0.02178955078125,
-0.05340576171875,
0.0294189453125,
0.0288238525390625,
0.0007576942443847656,
-0.0222320556640625,
0.031890869140625,
-0.0030651092529296875,
0.0306243896484375,
0.0006146430969238281,
-0.0160675048828125,
0.034942626953125,
-0.0247344970703125,
-0.033416748046875,
-0.0037899017333984375,
0.083251953125,
0.037139892578125,
0.0003712177276611328,
0.0052947998046875,
-0.011871337890625,
-0.01666259765625,
0.0254974365234375,
-0.08233642578125,
-0.017364501953125,
0.01415252685546875,
-0.040679931640625,
-0.032958984375,
0.022369384765625,
-0.050048828125,
0.0002980232238769531,
-0.0010852813720703125,
0.052642822265625,
-0.033233642578125,
-0.015167236328125,
0.0005731582641601562,
-0.01428985595703125,
0.008880615234375,
0.0120849609375,
-0.050750732421875,
0.01171112060546875,
0.03521728515625,
0.08062744140625,
-0.0084381103515625,
-0.0279693603515625,
-0.015167236328125,
0.0025463104248046875,
-0.02001953125,
0.033111572265625,
0.0041656494140625,
-0.037017822265625,
-0.017364501953125,
0.0361328125,
-0.013031005859375,
-0.04986572265625,
0.049713134765625,
-0.0256500244140625,
0.00693511962890625,
-0.0165557861328125,
-0.0191497802734375,
-0.032623291015625,
0.006885528564453125,
-0.047882080078125,
0.08331298828125,
0.00890350341796875,
-0.06195068359375,
0.0203704833984375,
-0.05096435546875,
-0.017486572265625,
-0.005985260009765625,
0.0009369850158691406,
-0.05224609375,
-0.01324462890625,
0.018463134765625,
0.036712646484375,
-0.020172119140625,
0.0352783203125,
-0.0275115966796875,
-0.0428466796875,
0.014862060546875,
-0.0469970703125,
0.08050537109375,
0.0007801055908203125,
-0.041717529296875,
0.004924774169921875,
-0.04595947265625,
-0.00240325927734375,
0.02557373046875,
-0.00359344482421875,
0.00565338134765625,
-0.02667236328125,
0.00318145751953125,
0.0161590576171875,
0.00826263427734375,
-0.035858154296875,
0.001224517822265625,
-0.01552581787109375,
0.048614501953125,
0.055877685546875,
0.00969696044921875,
0.02325439453125,
-0.034332275390625,
0.034423828125,
-0.0005784034729003906,
0.04296875,
-0.00597381591796875,
-0.044708251953125,
-0.0614013671875,
-0.02642822265625,
0.0261993408203125,
0.038421630859375,
-0.03497314453125,
0.031341552734375,
-0.01493072509765625,
-0.043212890625,
-0.03515625,
-0.0092620849609375,
0.032470703125,
0.04498291015625,
0.03912353515625,
-0.0276031494140625,
-0.04620361328125,
-0.0684814453125,
0.0236968994140625,
0.0019044876098632812,
-0.0037288665771484375,
0.04937744140625,
0.05670166015625,
-0.0234222412109375,
0.05938720703125,
-0.0472412109375,
-0.035064697265625,
-0.0060577392578125,
0.00258636474609375,
0.0222625732421875,
0.036712646484375,
0.0538330078125,
-0.0521240234375,
-0.029144287109375,
-0.0095062255859375,
-0.0760498046875,
0.01495361328125,
0.0011167526245117188,
-0.0169219970703125,
0.01485443115234375,
0.04180908203125,
-0.041717529296875,
0.0443115234375,
0.03173828125,
0.005748748779296875,
0.0533447265625,
-0.01412200927734375,
0.00014650821685791016,
-0.07977294921875,
0.029022216796875,
-0.005695343017578125,
-0.01422119140625,
-0.033660888671875,
0.0023059844970703125,
-0.0009717941284179688,
-0.0301361083984375,
-0.0631103515625,
0.0345458984375,
-0.0224151611328125,
0.0085906982421875,
-0.01222991943359375,
-0.00955963134765625,
0.004077911376953125,
0.04107666015625,
0.0011320114135742188,
0.0791015625,
0.0504150390625,
-0.05169677734375,
0.0098724365234375,
0.01419830322265625,
-0.0152740478515625,
0.0182342529296875,
-0.0782470703125,
0.00960540771484375,
-0.004261016845703125,
0.0201263427734375,
-0.0302886962890625,
-0.0280303955078125,
0.03302001953125,
-0.035614013671875,
0.0270233154296875,
-0.0308990478515625,
-0.00836181640625,
-0.043060302734375,
-0.04107666015625,
0.03271484375,
0.05316162109375,
-0.03997802734375,
0.0263519287109375,
0.0275115966796875,
0.0240936279296875,
-0.06109619140625,
-0.055755615234375,
-0.01277923583984375,
-0.0258331298828125,
-0.057159423828125,
0.03472900390625,
-0.0005335807800292969,
-0.0009036064147949219,
0.003269195556640625,
-0.0048675537109375,
-0.0103302001953125,
0.003238677978515625,
0.042877197265625,
0.025482177734375,
-0.00921630859375,
-0.01369476318359375,
-0.00678253173828125,
-0.01082611083984375,
0.006496429443359375,
-0.0230560302734375,
0.036376953125,
-0.021728515625,
-0.01551055908203125,
-0.049713134765625,
0.0021991729736328125,
0.0419921875,
-0.011077880859375,
0.063720703125,
0.07464599609375,
-0.03607177734375,
0.00021123886108398438,
-0.030548095703125,
-0.0156402587890625,
-0.03851318359375,
0.039276123046875,
-0.00383758544921875,
-0.0523681640625,
0.037994384765625,
0.009033203125,
-0.0113067626953125,
0.056365966796875,
0.028076171875,
-0.009521484375,
0.0528564453125,
0.044036865234375,
-0.0059051513671875,
0.038421630859375,
-0.0771484375,
0.0124969482421875,
-0.076416015625,
-0.0232391357421875,
-0.0128631591796875,
-0.0152130126953125,
-0.0362548828125,
-0.036529541015625,
0.0550537109375,
0.025787353515625,
-0.0254974365234375,
0.03204345703125,
-0.03594970703125,
0.0238494873046875,
0.0380859375,
0.0240478515625,
-0.00954437255859375,
-0.0020656585693359375,
0.0002281665802001953,
-0.00020694732666015625,
-0.05938720703125,
-0.0276641845703125,
0.09454345703125,
0.04803466796875,
0.048095703125,
-0.016021728515625,
0.04107666015625,
0.01184844970703125,
0.0018558502197265625,
-0.046539306640625,
0.0418701171875,
-0.0225372314453125,
-0.048248291015625,
-0.023590087890625,
-0.027099609375,
-0.06427001953125,
0.00829315185546875,
-0.00960540771484375,
-0.066650390625,
0.0220184326171875,
0.00836944580078125,
-0.031829833984375,
0.040496826171875,
-0.048309326171875,
0.08441162109375,
-0.02362060546875,
-0.029449462890625,
-0.0005245208740234375,
-0.0579833984375,
0.041015625,
0.00708770751953125,
0.0015010833740234375,
-0.01338958740234375,
0.021514892578125,
0.08203125,
-0.056365966796875,
0.058135986328125,
-0.0197296142578125,
0.019744873046875,
0.053375244140625,
-0.017822265625,
0.0247955322265625,
0.0238189697265625,
-0.0006551742553710938,
0.057159423828125,
-0.0024394989013671875,
-0.0160369873046875,
-0.02081298828125,
0.048248291015625,
-0.0772705078125,
-0.0218353271484375,
-0.035614013671875,
-0.040771484375,
0.0168609619140625,
0.0166168212890625,
0.060943603515625,
0.0469970703125,
-0.0072479248046875,
0.0259246826171875,
0.04937744140625,
-0.007289886474609375,
0.046051025390625,
0.007465362548828125,
-0.00739288330078125,
-0.055908203125,
0.07843017578125,
0.0229034423828125,
0.022003173828125,
-0.00015616416931152344,
0.0077667236328125,
-0.01076507568359375,
-0.029022216796875,
-0.051422119140625,
0.019561767578125,
-0.044708251953125,
-0.03338623046875,
-0.05169677734375,
-0.044830322265625,
-0.03729248046875,
-0.006256103515625,
-0.040252685546875,
-0.0199432373046875,
-0.046844482421875,
0.002777099609375,
0.028656005859375,
0.039276123046875,
-0.0159149169921875,
0.04248046875,
-0.062225341796875,
0.01441192626953125,
0.01531219482421875,
0.0177001953125,
0.00959014892578125,
-0.0595703125,
-0.02337646484375,
0.0215301513671875,
-0.04705810546875,
-0.051727294921875,
0.019805908203125,
0.0194244384765625,
0.0312042236328125,
0.049041748046875,
-0.006214141845703125,
0.05084228515625,
-0.03021240234375,
0.07415771484375,
0.014251708984375,
-0.060272216796875,
0.034271240234375,
-0.034820556640625,
0.01457977294921875,
0.04107666015625,
0.053924560546875,
-0.0236053466796875,
0.00612640380859375,
-0.063232421875,
-0.07476806640625,
0.0716552734375,
0.021728515625,
-0.0034542083740234375,
0.0192413330078125,
0.034393310546875,
-0.00585174560546875,
0.01253509521484375,
-0.062225341796875,
-0.01800537109375,
-0.028228759765625,
0.0118408203125,
0.0023956298828125,
-0.0284271240234375,
-0.01458740234375,
-0.037445068359375,
0.075927734375,
-0.0097503662109375,
0.0528564453125,
0.01959228515625,
-0.01226043701171875,
-0.00885772705078125,
-0.00586700439453125,
0.035919189453125,
0.04754638671875,
-0.03680419921875,
-0.013580322265625,
0.0270233154296875,
-0.053802490234375,
0.0021343231201171875,
0.015960693359375,
-0.049652099609375,
-0.00885009765625,
0.024169921875,
0.09100341796875,
0.0121917724609375,
-0.03948974609375,
0.059417724609375,
-0.0019407272338867188,
-0.027618408203125,
-0.0310516357421875,
0.0088958740234375,
-0.00550079345703125,
0.0141143798828125,
0.0192108154296875,
0.01258087158203125,
0.00951385498046875,
-0.03289794921875,
0.017120361328125,
0.03668212890625,
-0.03863525390625,
-0.029571533203125,
0.06451416015625,
-0.005916595458984375,
-0.01425933837890625,
0.05535888671875,
-0.0248565673828125,
-0.047760009765625,
0.058197021484375,
0.04327392578125,
0.06982421875,
-0.00586700439453125,
0.021881103515625,
0.05029296875,
0.0294189453125,
-0.020050048828125,
0.00896453857421875,
0.01485443115234375,
-0.038421630859375,
-0.0124969482421875,
-0.039947509765625,
-0.016387939453125,
0.0276031494140625,
-0.0609130859375,
0.040252685546875,
-0.048675537109375,
-0.031585693359375,
-0.01514434814453125,
-0.015838623046875,
-0.041900634765625,
0.019561767578125,
0.010345458984375,
0.06719970703125,
-0.07421875,
0.05133056640625,
0.052520751953125,
-0.046356201171875,
-0.069091796875,
-0.0008535385131835938,
-0.01418304443359375,
-0.04693603515625,
0.0260162353515625,
0.0389404296875,
0.007778167724609375,
-0.017120361328125,
-0.0589599609375,
-0.07635498046875,
0.108642578125,
0.02850341796875,
-0.0247344970703125,
-0.003025054931640625,
0.0104522705078125,
0.0352783203125,
-0.00823974609375,
0.039276123046875,
0.0308380126953125,
0.01275634765625,
0.01226806640625,
-0.08154296875,
-0.001560211181640625,
-0.01276397705078125,
0.00572967529296875,
0.01373291015625,
-0.08123779296875,
0.08184814453125,
-0.0184478759765625,
-0.00783538818359375,
-0.005096435546875,
0.054351806640625,
0.0058135986328125,
0.02880859375,
0.0301055908203125,
0.06268310546875,
0.055389404296875,
-0.006305694580078125,
0.07098388671875,
-0.02130126953125,
0.0404052734375,
0.07171630859375,
0.0003840923309326172,
0.060821533203125,
0.01910400390625,
-0.0192718505859375,
0.0219268798828125,
0.037384033203125,
-0.0254364013671875,
0.0419921875,
-0.01280975341796875,
-0.004505157470703125,
-0.0038814544677734375,
-0.0289459228515625,
-0.0443115234375,
0.0310516357421875,
0.00690460205078125,
-0.0258331298828125,
-0.005336761474609375,
0.018341064453125,
0.01377105712890625,
-0.036895751953125,
-0.022064208984375,
0.033172607421875,
0.012298583984375,
-0.0286102294921875,
0.06396484375,
0.01146697998046875,
0.060699462890625,
-0.0513916015625,
0.007175445556640625,
-0.004436492919921875,
0.0231475830078125,
-0.02032470703125,
-0.051422119140625,
0.0306243896484375,
-0.01222991943359375,
-0.01043701171875,
-0.005863189697265625,
0.055633544921875,
-0.01409149169921875,
-0.038604736328125,
0.0292816162109375,
0.0015382766723632812,
0.0270538330078125,
-0.0117950439453125,
-0.0499267578125,
0.0175018310546875,
0.004489898681640625,
-0.007274627685546875,
0.032012939453125,
0.01212310791015625,
-0.0118408203125,
0.04541015625,
0.043701171875,
-0.0147247314453125,
0.0110626220703125,
-0.0109405517578125,
0.076416015625,
-0.02777099609375,
-0.0231781005859375,
-0.047882080078125,
0.038726806640625,
-0.0032520294189453125,
-0.041473388671875,
0.05145263671875,
0.053802490234375,
0.07843017578125,
-0.004184722900390625,
0.047821044921875,
-0.021942138671875,
0.00669097900390625,
-0.037445068359375,
0.05621337890625,
-0.05230712890625,
-0.0022182464599609375,
-0.017364501953125,
-0.0599365234375,
0.0010833740234375,
0.042083740234375,
-0.014617919921875,
0.0082550048828125,
0.055511474609375,
0.059661865234375,
-0.0201263427734375,
0.00215911865234375,
0.00994873046875,
0.015472412109375,
0.0233917236328125,
0.0416259765625,
0.0357666015625,
-0.07598876953125,
0.05181884765625,
-0.060577392578125,
-0.0182647705078125,
-0.020416259765625,
-0.058135986328125,
-0.070068359375,
-0.0323486328125,
-0.025787353515625,
-0.039276123046875,
0.0118865966796875,
0.0631103515625,
0.0628662109375,
-0.0577392578125,
-0.0246734619140625,
0.006832122802734375,
-0.015228271484375,
-0.0214996337890625,
-0.0183868408203125,
0.040191650390625,
0.005062103271484375,
-0.0391845703125,
0.0313720703125,
0.004425048828125,
0.0160369873046875,
-0.010650634765625,
-0.007762908935546875,
-0.022186279296875,
-0.023406982421875,
0.03155517578125,
0.0293426513671875,
-0.032745361328125,
-0.0155792236328125,
0.0136260986328125,
0.0023193359375,
0.01128387451171875,
0.043212890625,
-0.042572021484375,
0.0192108154296875,
0.0300445556640625,
0.034912109375,
0.06036376953125,
0.00806427001953125,
0.0128631591796875,
-0.0472412109375,
0.0304107666015625,
0.0137939453125,
0.029937744140625,
0.01393890380859375,
-0.03948974609375,
0.047943115234375,
0.04595947265625,
-0.051666259765625,
-0.06915283203125,
-0.01215362548828125,
-0.07763671875,
-0.003505706787109375,
0.0986328125,
-0.0188140869140625,
-0.03546142578125,
0.025238037109375,
-0.018341064453125,
0.0250091552734375,
-0.0269775390625,
0.037750244140625,
0.028045654296875,
-0.002834320068359375,
-0.00481414794921875,
-0.05548095703125,
0.02880859375,
0.019073486328125,
-0.0509033203125,
-0.0189056396484375,
0.0207672119140625,
0.03759765625,
0.01061248779296875,
0.04901123046875,
-0.0121307373046875,
0.023956298828125,
0.00605010986328125,
0.014404296875,
-0.028076171875,
-0.047698974609375,
-0.0386962890625,
0.00664520263671875,
-0.01346588134765625,
-0.0286102294921875
]
] |
bvanaken/clinical-assertion-negation-bert | 2022-06-01T12:28:45.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"medical",
"clinical",
"assertion",
"negation",
"en",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | bvanaken | null | null | bvanaken/clinical-assertion-negation-bert | 19 | 85,964 | transformers | 2022-03-02T23:29:05 | ---
language: "en"
tags:
- bert
- medical
- clinical
- assertion
- negation
- text-classification
widget:
- text: "Patient denies [entity] SOB [entity]."
---
# Clinical Assertion / Negation Classification BERT
## Model description
The Clinical Assertion and Negation Classification BERT is introduced in the paper [Assertion Detection in Clinical Notes: Medical Language Models to the Rescue?](https://aclanthology.org/2021.nlpmc-1.5/). The model helps structure information in clinical patient letters by classifying medical conditions mentioned in the letter into PRESENT, ABSENT and POSSIBLE.
The model is based on the [ClinicalBERT - Bio + Discharge Summary BERT Model](https://huggingface.co/emilyalsentzer/Bio_Discharge_Summary_BERT) by Alsentzer et al. and fine-tuned on assertion data from the [2010 i2b2 challenge](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3168320/).
#### How to use the model
You can load the model via the transformers library:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, TextClassificationPipeline
tokenizer = AutoTokenizer.from_pretrained("bvanaken/clinical-assertion-negation-bert")
model = AutoModelForSequenceClassification.from_pretrained("bvanaken/clinical-assertion-negation-bert")
```
The model expects input in the form of spans/sentences with one marked entity to classify as `PRESENT (0)`, `ABSENT (1)`, or `POSSIBLE (2)`. The entity in question is marked by surrounding it with the special token `[entity]` on both sides.
Example input and inference:
```python
text = "The patient recovered during the night and now denies any [entity] shortness of breath [entity]."
classifier = TextClassificationPipeline(model=model, tokenizer=tokenizer)
classification = classifier(text)
# [{'label': 'ABSENT', 'score': 0.9842607378959656}]
```
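Under the hood, the pipeline applies a softmax over the model's three class logits and returns the argmax label with its probability. A minimal, dependency-free sketch of that mapping (the logit values below are illustrative, not actual model outputs):

```python
import math

# Label mapping used by this model: 0 = PRESENT, 1 = ABSENT, 2 = POSSIBLE.
ID2LABEL = {0: "PRESENT", 1: "ABSENT", 2: "POSSIBLE"}

def classify(logits):
    # Softmax over the three class logits, then pick the argmax.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return {"label": ID2LABEL[best], "score": probs[best]}

# Illustrative logits for a negated mention -> highest mass on ABSENT.
print(classify([-1.2, 4.1, -0.3]))
```

This mirrors what `TextClassificationPipeline` returns as `{'label': ..., 'score': ...}`.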
### Cite
When working with the model, please cite our paper as follows:
```bibtex
@inproceedings{van-aken-2021-assertion,
title = "Assertion Detection in Clinical Notes: Medical Language Models to the Rescue?",
author = "van Aken, Betty and
Trajanovska, Ivana and
Siu, Amy and
Mayrdorfer, Manuel and
Budde, Klemens and
Loeser, Alexander",
booktitle = "Proceedings of the Second Workshop on Natural Language Processing for Medical Conversations",
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.nlpmc-1.5",
doi = "10.18653/v1/2021.nlpmc-1.5"
}
``` | 2,503 | [
[
-0.0032825469970703125,
-0.0703125,
0.051483154296875,
0.0235137939453125,
-0.0008606910705566406,
-0.01457977294921875,
0.0004363059997558594,
-0.05096435546875,
0.0197296142578125,
0.0201416015625,
-0.03082275390625,
-0.04925537109375,
-0.056884765625,
0.000018656253814697266,
-0.0302581787109375,
0.08111572265625,
-0.004840850830078125,
0.0200347900390625,
-0.02691650390625,
-0.0001932382583618164,
-0.028228759765625,
-0.05523681640625,
-0.03729248046875,
-0.0205535888671875,
0.0272674560546875,
0.01482391357421875,
0.03228759765625,
0.036651611328125,
0.04974365234375,
0.0234527587890625,
-0.01378631591796875,
-0.0102691650390625,
0.01041412353515625,
0.0098114013671875,
0.0174713134765625,
-0.0294036865234375,
-0.05072021484375,
-0.004909515380859375,
0.038238525390625,
0.054107666015625,
-0.004108428955078125,
-0.00311279296875,
-0.01009368896484375,
0.0165557861328125,
-0.032806396484375,
0.022674560546875,
-0.035919189453125,
-0.000017523765563964844,
-0.01197052001953125,
0.007564544677734375,
-0.04681396484375,
-0.028839111328125,
0.03375244140625,
-0.0206451416015625,
0.0228271484375,
-0.00875091552734375,
0.10565185546875,
0.016815185546875,
-0.03265380859375,
-0.0423583984375,
-0.0303497314453125,
0.04925537109375,
-0.08758544921875,
0.0200042724609375,
0.042816162109375,
0.0126495361328125,
-0.0003867149353027344,
-0.07476806640625,
-0.05963134765625,
-0.035552978515625,
-0.02203369140625,
0.0207977294921875,
-0.0193023681640625,
0.0214691162109375,
0.010223388671875,
0.034271240234375,
-0.059478759765625,
0.0172882080078125,
-0.01861572265625,
-0.039886474609375,
0.0299835205078125,
0.006561279296875,
0.0273284912109375,
-0.038116455078125,
-0.0258941650390625,
0.008026123046875,
-0.0295257568359375,
-0.004817962646484375,
-0.001567840576171875,
0.0119171142578125,
-0.034637451171875,
0.0313720703125,
0.00939178466796875,
0.05010986328125,
-0.004779815673828125,
-0.01715087890625,
0.0458984375,
-0.0201873779296875,
-0.0210113525390625,
0.0156402587890625,
0.06524658203125,
0.005474090576171875,
-0.0083160400390625,
0.004123687744140625,
-0.01467132568359375,
0.0206451416015625,
0.0186004638671875,
-0.060150146484375,
-0.0311737060546875,
0.019378662109375,
-0.061248779296875,
-0.035491943359375,
-0.01175689697265625,
-0.0423583984375,
-0.0223388671875,
-0.007282257080078125,
0.048492431640625,
-0.053802490234375,
-0.0089569091796875,
-0.00176239013671875,
0.00482177734375,
0.0224456787109375,
0.01137542724609375,
-0.058258056640625,
0.025360107421875,
0.033843994140625,
0.047607421875,
0.0092315673828125,
0.0009937286376953125,
-0.0205078125,
-0.0245208740234375,
-0.014892578125,
0.056884765625,
-0.0288238525390625,
-0.0159759521484375,
0.0078887939453125,
0.0171051025390625,
-0.0167694091796875,
-0.033935546875,
0.0693359375,
-0.00969696044921875,
0.043182373046875,
-0.0021305084228515625,
-0.06805419921875,
-0.027374267578125,
0.0168914794921875,
-0.0177001953125,
0.06707763671875,
0.01226806640625,
-0.046234130859375,
0.0223388671875,
-0.05206298828125,
-0.041259765625,
0.0105133056640625,
-0.01381683349609375,
-0.05584716796875,
0.0009851455688476562,
0.0034770965576171875,
0.033935546875,
-0.01983642578125,
0.032196044921875,
-0.033355712890625,
-0.0017251968383789062,
0.02569580078125,
-0.0230712890625,
0.07965087890625,
0.0199127197265625,
-0.0196380615234375,
0.00370025634765625,
-0.07080078125,
0.005207061767578125,
0.012420654296875,
-0.0159759521484375,
-0.0216217041015625,
0.0124664306640625,
0.004352569580078125,
0.015899658203125,
0.0233001708984375,
-0.05450439453125,
0.0032978057861328125,
-0.03863525390625,
0.03302001953125,
0.031402587890625,
0.019866943359375,
0.0009431838989257812,
-0.036224365234375,
0.0305938720703125,
-0.00244903564453125,
0.01238250732421875,
-0.01163482666015625,
-0.05072021484375,
-0.07177734375,
-0.042449951171875,
0.046051025390625,
0.0545654296875,
-0.0330810546875,
0.07098388671875,
0.0021572113037109375,
-0.039459228515625,
-0.0574951171875,
-0.017913818359375,
0.04058837890625,
0.0780029296875,
0.0518798828125,
-0.01390838623046875,
-0.057373046875,
-0.06890869140625,
0.0062103271484375,
-0.037933349609375,
0.014678955078125,
0.01465606689453125,
0.037384033203125,
-0.04034423828125,
0.05242919921875,
-0.03692626953125,
-0.0281524658203125,
-0.0200347900390625,
0.032928466796875,
0.0278778076171875,
0.04876708984375,
0.047454833984375,
-0.037750244140625,
-0.035186767578125,
-0.023284912109375,
-0.06915283203125,
-0.0295257568359375,
-0.0175018310546875,
0.0009784698486328125,
0.034332275390625,
0.03912353515625,
-0.0288848876953125,
0.05096435546875,
0.0232696533203125,
-0.0406494140625,
0.049163818359375,
-0.0269775390625,
-0.0240936279296875,
-0.07598876953125,
0.0029125213623046875,
-0.0021190643310546875,
-0.0129852294921875,
-0.06048583984375,
-0.00004792213439941406,
0.0127716064453125,
0.0167083740234375,
-0.0130157470703125,
0.03875732421875,
-0.0202178955078125,
0.035430908203125,
-0.01113128662109375,
-0.01384735107421875,
-0.00350189208984375,
0.030792236328125,
0.0048370361328125,
0.01485443115234375,
0.054931640625,
-0.0521240234375,
-0.006954193115234375,
0.045684814453125,
-0.0169677734375,
0.035186767578125,
-0.058929443359375,
-0.0099334716796875,
-0.004375457763671875,
0.00629425048828125,
-0.08758544921875,
-0.0127105712890625,
0.022125244140625,
-0.060302734375,
0.034759521484375,
-0.00804901123046875,
-0.05450439453125,
-0.03192138671875,
0.0050811767578125,
0.0079498291015625,
0.047119140625,
-0.0284576416015625,
0.0406494140625,
0.023651123046875,
-0.018402099609375,
-0.0278778076171875,
-0.0714111328125,
-0.0023403167724609375,
0.0036678314208984375,
-0.034027099609375,
0.0300140380859375,
-0.016876220703125,
-0.00644683837890625,
0.006450653076171875,
-0.0098114013671875,
-0.031951904296875,
0.01399993896484375,
0.01947021484375,
0.052642822265625,
-0.020538330078125,
0.040008544921875,
0.02587890625,
-0.01084136962890625,
0.0171051025390625,
0.0011339187622070312,
0.038238525390625,
-0.0140380859375,
-0.037933349609375,
-0.04779052734375,
0.0284423828125,
0.045135498046875,
-0.00677490234375,
0.055938720703125,
0.044769287109375,
-0.05145263671875,
0.016510009765625,
-0.0457763671875,
-0.020263671875,
-0.032806396484375,
0.03173828125,
-0.00273895263671875,
-0.042022705078125,
0.04595947265625,
0.028411865234375,
0.00266265869140625,
0.055999755859375,
0.05133056640625,
-0.03009033203125,
0.073974609375,
0.033447265625,
0.0097503662109375,
0.01354217529296875,
-0.00650787353515625,
0.0296783447265625,
-0.0513916015625,
-0.00986480712890625,
-0.047607421875,
-0.004512786865234375,
-0.0474853515625,
-0.0018930435180664062,
0.0270233154296875,
0.004779815673828125,
-0.0162353515625,
0.01486968994140625,
-0.052703857421875,
0.003101348876953125,
0.0224761962890625,
0.0172271728515625,
0.00023925304412841797,
-0.0208892822265625,
-0.04168701171875,
-0.0219879150390625,
-0.0418701171875,
-0.035888671875,
0.06842041015625,
0.0479736328125,
0.0300140380859375,
0.0278167724609375,
0.06732177734375,
0.015472412109375,
0.043670654296875,
-0.0711669921875,
0.05072021484375,
-0.020294189453125,
-0.062255859375,
0.004261016845703125,
-0.0243072509765625,
-0.07598876953125,
0.01374053955078125,
-0.0275726318359375,
-0.059051513671875,
0.023406982421875,
0.01091766357421875,
-0.0296173095703125,
0.0006518363952636719,
-0.0673828125,
0.0703125,
-0.0240936279296875,
0.01190948486328125,
-0.0016088485717773438,
-0.0684814453125,
0.0215606689453125,
-0.0022449493408203125,
0.0034732818603515625,
0.0007181167602539062,
0.0154571533203125,
0.0599365234375,
-0.0560302734375,
0.08404541015625,
-0.0286407470703125,
-0.004261016845703125,
0.024810791015625,
0.001010894775390625,
0.02471923828125,
-0.005352020263671875,
0.0009055137634277344,
0.0138702392578125,
0.0200042724609375,
-0.02484130859375,
-0.0208587646484375,
0.0212249755859375,
-0.0423583984375,
-0.019287109375,
-0.057037353515625,
-0.024444580078125,
-0.01425933837890625,
0.037384033203125,
0.033599853515625,
0.04974365234375,
-0.00804901123046875,
0.009735107421875,
0.043975830078125,
-0.044189453125,
0.033172607421875,
0.0640869140625,
-0.00572967529296875,
-0.020660400390625,
0.05889892578125,
0.00948333740234375,
0.022186279296875,
0.04095458984375,
0.025634765625,
-0.037109375,
-0.053314208984375,
0.00864410400390625,
0.039093017578125,
-0.05908203125,
-0.01031494140625,
-0.062744140625,
-0.0467529296875,
-0.04425048828125,
0.007198333740234375,
0.0029621124267578125,
-0.035003662109375,
-0.03656005859375,
-0.01033782958984375,
0.0224609375,
0.037750244140625,
-0.01151275634765625,
0.021453857421875,
-0.0633544921875,
0.01204681396484375,
0.007965087890625,
0.014678955078125,
0.0029125213623046875,
-0.052703857421875,
-0.0030536651611328125,
-0.0127410888671875,
-0.033355712890625,
-0.08001708984375,
0.039520263671875,
0.005096435546875,
0.060302734375,
0.042236328125,
0.0164642333984375,
0.047698974609375,
-0.0269317626953125,
0.055633544921875,
0.02130126953125,
-0.073974609375,
0.036529541015625,
-0.0162506103515625,
0.002330780029296875,
0.0380859375,
0.0303497314453125,
-0.0517578125,
-0.043701171875,
-0.0736083984375,
-0.07098388671875,
0.035675048828125,
0.007366180419921875,
-0.006999969482421875,
-0.027191162109375,
0.0164947509765625,
-0.000008761882781982422,
0.00601959228515625,
-0.073974609375,
-0.037078857421875,
-0.0020122528076171875,
-0.048187255859375,
0.0164947509765625,
-0.037994384765625,
-0.00211334228515625,
-0.03948974609375,
0.058258056640625,
-0.0003025531768798828,
0.060699462890625,
0.047698974609375,
-0.0234832763671875,
0.0196533203125,
0.0244293212890625,
0.05633544921875,
0.0286407470703125,
-0.03082275390625,
0.007205963134765625,
0.00893402099609375,
-0.03753662109375,
-0.00616455078125,
0.042877197265625,
-0.01058197021484375,
0.0273284912109375,
0.052734375,
0.05413818359375,
0.0022296905517578125,
-0.040283203125,
0.045440673828125,
-0.007396697998046875,
-0.0390625,
-0.041778564453125,
-0.0111846923828125,
-0.0145721435546875,
-0.0027599334716796875,
0.01149749755859375,
0.0106048583984375,
0.007587432861328125,
-0.028289794921875,
0.023345947265625,
0.0245208740234375,
-0.048553466796875,
-0.0177764892578125,
0.04547119140625,
-0.00412750244140625,
-0.0207672119140625,
0.04449462890625,
-0.033203125,
-0.049407958984375,
0.045989990234375,
0.044769287109375,
0.0732421875,
-0.007045745849609375,
0.0088958740234375,
0.034912109375,
0.0213623046875,
0.019500732421875,
0.03338623046875,
0.00785064697265625,
-0.0601806640625,
-0.02801513671875,
-0.032684326171875,
-0.0154876708984375,
0.0244598388671875,
-0.05303955078125,
0.01151275634765625,
-0.04052734375,
-0.0145416259765625,
0.0224456787109375,
-0.0107421875,
-0.028350830078125,
0.01154327392578125,
0.0241241455078125,
0.06463623046875,
-0.060760498046875,
0.06866455078125,
0.051605224609375,
-0.04071044921875,
-0.06353759765625,
0.00647735595703125,
0.01458740234375,
-0.060516357421875,
0.0455322265625,
0.008819580078125,
0.03131103515625,
-0.02520751953125,
-0.038726806640625,
-0.048004150390625,
0.04876708984375,
-0.0016117095947265625,
-0.021270751953125,
-0.006053924560546875,
-0.0124053955078125,
0.062744140625,
-0.01181793212890625,
0.03302001953125,
0.0238037109375,
0.032440185546875,
-0.0005955696105957031,
-0.0804443359375,
0.01861572265625,
-0.022979736328125,
-0.022247314453125,
0.01238250732421875,
-0.028228759765625,
0.06640625,
-0.0208282470703125,
-0.0070037841796875,
0.032196044921875,
0.03753662109375,
0.016815185546875,
0.0240020751953125,
0.03271484375,
0.05291748046875,
0.0811767578125,
-0.017425537109375,
0.08221435546875,
-0.0089111328125,
0.0343017578125,
0.08636474609375,
-0.01361846923828125,
0.054656982421875,
0.0200347900390625,
-0.01416015625,
0.0780029296875,
0.044158935546875,
-0.01267242431640625,
0.039794921875,
0.002902984619140625,
-0.0228118896484375,
-0.0096893310546875,
0.006595611572265625,
-0.041351318359375,
0.038543701171875,
0.0521240234375,
-0.05657958984375,
-0.017547607421875,
-0.01500701904296875,
0.020477294921875,
-0.0126953125,
0.00722503662109375,
0.0550537109375,
0.001556396484375,
-0.052215576171875,
0.06878662109375,
-0.005462646484375,
0.0280609130859375,
-0.059600830078125,
-0.0142059326171875,
-0.0016574859619140625,
0.032684326171875,
-0.0137786865234375,
-0.034637451171875,
0.021087646484375,
-0.007770538330078125,
-0.0213775634765625,
-0.006793975830078125,
0.0498046875,
-0.0282745361328125,
-0.04241943359375,
0.00997161865234375,
0.024993896484375,
0.0249786376953125,
0.029266357421875,
-0.06689453125,
-0.0276336669921875,
0.003772735595703125,
0.0186920166015625,
0.0175018310546875,
0.024749755859375,
0.0025634765625,
0.0458984375,
0.042510986328125,
0.01226043701171875,
-0.00269317626953125,
0.006748199462890625,
0.039886474609375,
-0.040069580078125,
-0.050201416015625,
-0.06378173828125,
0.039886474609375,
-0.006763458251953125,
-0.0203704833984375,
0.048736572265625,
0.035430908203125,
0.02288818359375,
-0.0002586841583251953,
0.060211181640625,
-0.03167724609375,
0.07000732421875,
-0.025390625,
0.0606689453125,
-0.040618896484375,
0.01055908203125,
-0.0287017822265625,
-0.0157470703125,
-0.045654296875,
0.07183837890625,
-0.0226593017578125,
-0.00495147705078125,
0.0736083984375,
0.057464599609375,
0.0154266357421875,
0.0008425712585449219,
-0.0014867782592773438,
0.050018310546875,
0.036834716796875,
0.043365478515625,
0.04473876953125,
-0.041168212890625,
0.02569580078125,
-0.0123291015625,
-0.026824951171875,
-0.0224609375,
-0.0650634765625,
-0.06634521484375,
-0.042236328125,
-0.04638671875,
-0.05975341796875,
-0.0160675048828125,
0.07421875,
0.05120849609375,
-0.0791015625,
0.01068115234375,
-0.02032470703125,
0.007476806640625,
-0.0269012451171875,
-0.021087646484375,
0.034332275390625,
-0.03509521484375,
-0.0288238525390625,
0.0198211669921875,
0.003009796142578125,
0.030853271484375,
0.00445556640625,
-0.0005083084106445312,
-0.035980224609375,
-0.0062255859375,
0.0294647216796875,
0.03228759765625,
-0.054931640625,
-0.0110931396484375,
-0.000492095947265625,
-0.024078369140625,
0.01175689697265625,
0.035980224609375,
-0.055023193359375,
0.03924560546875,
0.0304718017578125,
0.04534912109375,
0.0255279541015625,
-0.02593994140625,
0.0196533203125,
-0.055145263671875,
0.0094146728515625,
0.03802490234375,
0.04833984375,
0.007038116455078125,
-0.01558685302734375,
0.025787353515625,
0.03656005859375,
-0.042999267578125,
-0.07940673828125,
0.007320404052734375,
-0.0802001953125,
-0.0239410400390625,
0.0714111328125,
-0.007152557373046875,
-0.0035991668701171875,
-0.02020263671875,
-0.01000213623046875,
0.024627685546875,
-0.0197296142578125,
0.06201171875,
0.056884765625,
-0.0203094482421875,
-0.0037136077880859375,
-0.03851318359375,
0.0260162353515625,
0.05072021484375,
-0.059173583984375,
-0.0213165283203125,
0.0191802978515625,
0.0236053466796875,
0.03265380859375,
0.046173095703125,
-0.0233001708984375,
0.01971435546875,
-0.0186920166015625,
0.019317626953125,
0.0151214599609375,
-0.0011777877807617188,
-0.029327392578125,
-0.00635528564453125,
-0.004528045654296875,
-0.0341796875
]
] |
Helsinki-NLP/opus-mt-mul-en | 2023-08-16T12:01:25.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"ca",
"es",
"os",
"eo",
"ro",
"fy",
"cy",
"is",
"lb",
"su",
"an",
"sq",
"fr",
"ht",
"rm",
"cv",
"ig",
"am",
"eu",
"tr",
"ps",
"af",
"ny",
"ch",
"uk",
"sl",
"lt",
"tk",
"sg",
"ar",
"lg",
"bg",
"be",
"ka",
"gd",
"ja",
"si",
"br",
"mh",
"km",
"th",
"ty",
"rw",
"te",
"mk",
"or",
"wo",
"kl",
"mr",
"ru",
"yo",
"hu",
"fo",
"zh",
"ti",
"co",
"ee",
"oc",
"sn",
"mt",
"ts",
"pl",
"gl",
"nb",
"bn",
"tt",
"bo",
"lo",
"id",
"gn",
"nv",
"hy",
"kn",
"to",
"io",
"so",
"vi",
"da",
"fj",
"gv",
"sm",
"nl",
"mi",
"pt",
"hi",
"se",
"as",
"ta",
"et",
"kw",
"ga",
"sv",
"ln",
"na",
"mn",
"gu",
"wa",
"lv",
"jv",
"el",
"my",
"ba",
"it",
"hr",
"ur",
"ce",
"nn",
"fi",
"mg",
"rn",
"xh",
"ab",
"de",
"cs",
"he",
"zu",
"yi",
"ml",
"mul",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-mul-en | 45 | 85,693 | transformers | 2022-03-02T23:29:04 | ---
language:
- ca
- es
- os
- eo
- ro
- fy
- cy
- is
- lb
- su
- an
- sq
- fr
- ht
- rm
- cv
- ig
- am
- eu
- tr
- ps
- af
- ny
- ch
- uk
- sl
- lt
- tk
- sg
- ar
- lg
- bg
- be
- ka
- gd
- ja
- si
- br
- mh
- km
- th
- ty
- rw
- te
- mk
- or
- wo
- kl
- mr
- ru
- yo
- hu
- fo
- zh
- ti
- co
- ee
- oc
- sn
- mt
- ts
- pl
- gl
- nb
- bn
- tt
- bo
- lo
- id
- gn
- nv
- hy
- kn
- to
- io
- so
- vi
- da
- fj
- gv
- sm
- nl
- mi
- pt
- hi
- se
- as
- ta
- et
- kw
- ga
- sv
- ln
- na
- mn
- gu
- wa
- lv
- jv
- el
- my
- ba
- it
- hr
- ur
- ce
- nn
- fi
- mg
- rn
- xh
- ab
- de
- cs
- he
- zu
- yi
- ml
- mul
- en
tags:
- translation
license: apache-2.0
---
### mul-eng
* source group: Multiple languages
* target group: English
* OPUS readme: [mul-eng](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/mul-eng/README.md)
* model: transformer
* source language(s): abk acm ady afb afh_Latn afr akl_Latn aln amh ang_Latn apc ara arg arq ary arz asm ast avk_Latn awa aze_Latn bak bam_Latn bel bel_Latn ben bho bod bos_Latn bre brx brx_Latn bul bul_Latn cat ceb ces cha che chr chv cjy_Hans cjy_Hant cmn cmn_Hans cmn_Hant cor cos crh crh_Latn csb_Latn cym dan deu dsb dtp dws_Latn egl ell enm_Latn epo est eus ewe ext fao fij fin fkv_Latn fra frm_Latn frr fry fuc fuv gan gcf_Latn gil gla gle glg glv gom gos got_Goth grc_Grek grn gsw guj hat hau_Latn haw heb hif_Latn hil hin hnj_Latn hoc hoc_Latn hrv hsb hun hye iba ibo ido ido_Latn ike_Latn ile_Latn ilo ina_Latn ind isl ita izh jav jav_Java jbo jbo_Cyrl jbo_Latn jdt_Cyrl jpn kab kal kan kat kaz_Cyrl kaz_Latn kek_Latn kha khm khm_Latn kin kir_Cyrl kjh kpv krl ksh kum kur_Arab kur_Latn lad lad_Latn lao lat_Latn lav ldn_Latn lfn_Cyrl lfn_Latn lij lin lit liv_Latn lkt lld_Latn lmo ltg ltz lug lzh lzh_Hans mad mah mai mal mar max_Latn mdf mfe mhr mic min mkd mlg mlt mnw moh mon mri mwl mww mya myv nan nau nav nds niu nld nno nob nob_Hebr nog non_Latn nov_Latn npi nya oci ori orv_Cyrl oss ota_Arab ota_Latn pag pan_Guru pap pau pdc pes pes_Latn pes_Thaa pms pnb pol por ppl_Latn prg_Latn pus quc qya qya_Latn rap rif_Latn roh rom ron rue run rus sag sah san_Deva scn sco sgs shs_Latn shy_Latn sin sjn_Latn slv sma sme smo sna snd_Arab som spa sqi srp_Cyrl srp_Latn stq sun swe swg swh tah tam tat tat_Arab tat_Latn tel tet tgk_Cyrl tha tir tlh_Latn tly_Latn tmw_Latn toi_Latn ton tpw_Latn tso tuk tuk_Latn tur tvl tyv tzl tzl_Latn udm uig_Arab uig_Cyrl ukr umb urd uzb_Cyrl uzb_Latn vec vie vie_Hani vol_Latn vro war wln wol wuu xal xho yid yor yue yue_Hans yue_Hant zho zho_Hans zho_Hant zlm_Latn zsm_Latn zul zza
* target language(s): eng
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus2m-2020-08-01.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/mul-eng/opus2m-2020-08-01.zip)
* test set translations: [opus2m-2020-08-01.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/mul-eng/opus2m-2020-08-01.test.txt)
* test set scores: [opus2m-2020-08-01.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/mul-eng/opus2m-2020-08-01.eval.txt)
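As a Marian model, the checkpoint can be loaded through the `transformers` library. The sketch below wraps the standard MarianMT workflow in a small helper; the imports are deferred inside the function so it can be defined without the model weights downloaded, and the example sentence is illustrative:

```python
def translate(texts, model_name="Helsinki-NLP/opus-mt-mul-en"):
    # Deferred imports: defining the helper does not require transformers
    # to be installed; calling it does.
    from transformers import MarianMTModel, MarianTokenizer

    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)
    # Tokenize a batch of source sentences (any of the supported languages;
    # the target side is always English, so no target-language token is needed).
    batch = tokenizer(texts, return_tensors="pt", padding=True)
    generated = model.generate(**batch)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)

# Example (downloads the model weights on first call):
# translate(["Ein Haus am Meer."])
```

The first call downloads the tokenizer and weights from the Hugging Face Hub; subsequent calls use the local cache.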
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdev2014-hineng.hin.eng | 8.5 | 0.341 |
| newsdev2015-enfi-fineng.fin.eng | 16.8 | 0.441 |
| newsdev2016-enro-roneng.ron.eng | 31.3 | 0.580 |
| newsdev2016-entr-tureng.tur.eng | 16.4 | 0.422 |
| newsdev2017-enlv-laveng.lav.eng | 21.3 | 0.502 |
| newsdev2017-enzh-zhoeng.zho.eng | 12.7 | 0.409 |
| newsdev2018-enet-esteng.est.eng | 19.8 | 0.467 |
| newsdev2019-engu-gujeng.guj.eng | 13.3 | 0.385 |
| newsdev2019-enlt-liteng.lit.eng | 19.9 | 0.482 |
| newsdiscussdev2015-enfr-fraeng.fra.eng | 26.7 | 0.520 |
| newsdiscusstest2015-enfr-fraeng.fra.eng | 29.8 | 0.541 |
| newssyscomb2009-ceseng.ces.eng | 21.1 | 0.487 |
| newssyscomb2009-deueng.deu.eng | 22.6 | 0.499 |
| newssyscomb2009-fraeng.fra.eng | 25.8 | 0.530 |
| newssyscomb2009-huneng.hun.eng | 15.1 | 0.430 |
| newssyscomb2009-itaeng.ita.eng | 29.4 | 0.555 |
| newssyscomb2009-spaeng.spa.eng | 26.1 | 0.534 |
| news-test2008-deueng.deu.eng | 21.6 | 0.491 |
| news-test2008-fraeng.fra.eng | 22.3 | 0.502 |
| news-test2008-spaeng.spa.eng | 23.6 | 0.514 |
| newstest2009-ceseng.ces.eng | 19.8 | 0.480 |
| newstest2009-deueng.deu.eng | 20.9 | 0.487 |
| newstest2009-fraeng.fra.eng | 25.0 | 0.523 |
| newstest2009-huneng.hun.eng | 14.7 | 0.425 |
| newstest2009-itaeng.ita.eng | 27.6 | 0.542 |
| newstest2009-spaeng.spa.eng | 25.7 | 0.530 |
| newstest2010-ceseng.ces.eng | 20.6 | 0.491 |
| newstest2010-deueng.deu.eng | 23.4 | 0.517 |
| newstest2010-fraeng.fra.eng | 26.1 | 0.537 |
| newstest2010-spaeng.spa.eng | 29.1 | 0.561 |
| newstest2011-ceseng.ces.eng | 21.0 | 0.489 |
| newstest2011-deueng.deu.eng | 21.3 | 0.494 |
| newstest2011-fraeng.fra.eng | 26.8 | 0.546 |
| newstest2011-spaeng.spa.eng | 28.2 | 0.549 |
| newstest2012-ceseng.ces.eng | 20.5 | 0.485 |
| newstest2012-deueng.deu.eng | 22.3 | 0.503 |
| newstest2012-fraeng.fra.eng | 27.5 | 0.545 |
| newstest2012-ruseng.rus.eng | 26.6 | 0.532 |
| newstest2012-spaeng.spa.eng | 30.3 | 0.567 |
| newstest2013-ceseng.ces.eng | 22.5 | 0.498 |
| newstest2013-deueng.deu.eng | 25.0 | 0.518 |
| newstest2013-fraeng.fra.eng | 27.4 | 0.537 |
| newstest2013-ruseng.rus.eng | 21.6 | 0.484 |
| newstest2013-spaeng.spa.eng | 28.4 | 0.555 |
| newstest2014-csen-ceseng.ces.eng | 24.0 | 0.517 |
| newstest2014-deen-deueng.deu.eng | 24.1 | 0.511 |
| newstest2014-fren-fraeng.fra.eng | 29.1 | 0.563 |
| newstest2014-hien-hineng.hin.eng | 14.0 | 0.414 |
| newstest2014-ruen-ruseng.rus.eng | 24.0 | 0.521 |
| newstest2015-encs-ceseng.ces.eng | 21.9 | 0.481 |
| newstest2015-ende-deueng.deu.eng | 25.5 | 0.519 |
| newstest2015-enfi-fineng.fin.eng | 17.4 | 0.441 |
| newstest2015-enru-ruseng.rus.eng | 22.4 | 0.494 |
| newstest2016-encs-ceseng.ces.eng | 23.0 | 0.500 |
| newstest2016-ende-deueng.deu.eng | 30.1 | 0.560 |
| newstest2016-enfi-fineng.fin.eng | 18.5 | 0.461 |
| newstest2016-enro-roneng.ron.eng | 29.6 | 0.562 |
| newstest2016-enru-ruseng.rus.eng | 22.0 | 0.495 |
| newstest2016-entr-tureng.tur.eng | 14.8 | 0.415 |
| newstest2017-encs-ceseng.ces.eng | 20.2 | 0.475 |
| newstest2017-ende-deueng.deu.eng | 26.0 | 0.523 |
| newstest2017-enfi-fineng.fin.eng | 19.6 | 0.465 |
| newstest2017-enlv-laveng.lav.eng | 16.2 | 0.454 |
| newstest2017-enru-ruseng.rus.eng | 24.2 | 0.510 |
| newstest2017-entr-tureng.tur.eng | 15.0 | 0.412 |
| newstest2017-enzh-zhoeng.zho.eng | 13.7 | 0.412 |
| newstest2018-encs-ceseng.ces.eng | 21.2 | 0.486 |
| newstest2018-ende-deueng.deu.eng | 31.5 | 0.564 |
| newstest2018-enet-esteng.est.eng | 19.7 | 0.473 |
| newstest2018-enfi-fineng.fin.eng | 15.1 | 0.418 |
| newstest2018-enru-ruseng.rus.eng | 21.3 | 0.490 |
| newstest2018-entr-tureng.tur.eng | 15.4 | 0.421 |
| newstest2018-enzh-zhoeng.zho.eng | 12.9 | 0.408 |
| newstest2019-deen-deueng.deu.eng | 27.0 | 0.529 |
| newstest2019-fien-fineng.fin.eng | 17.2 | 0.438 |
| newstest2019-guen-gujeng.guj.eng | 9.0 | 0.342 |
| newstest2019-lten-liteng.lit.eng | 22.6 | 0.512 |
| newstest2019-ruen-ruseng.rus.eng | 24.1 | 0.503 |
| newstest2019-zhen-zhoeng.zho.eng | 13.9 | 0.427 |
| newstestB2016-enfi-fineng.fin.eng | 15.2 | 0.428 |
| newstestB2017-enfi-fineng.fin.eng | 16.8 | 0.442 |
| newstestB2017-fien-fineng.fin.eng | 16.8 | 0.442 |
| Tatoeba-test.abk-eng.abk.eng | 2.4 | 0.190 |
| Tatoeba-test.ady-eng.ady.eng | 1.1 | 0.111 |
| Tatoeba-test.afh-eng.afh.eng | 1.7 | 0.108 |
| Tatoeba-test.afr-eng.afr.eng | 53.0 | 0.672 |
| Tatoeba-test.akl-eng.akl.eng | 5.9 | 0.239 |
| Tatoeba-test.amh-eng.amh.eng | 25.6 | 0.464 |
| Tatoeba-test.ang-eng.ang.eng | 11.7 | 0.289 |
| Tatoeba-test.ara-eng.ara.eng | 26.4 | 0.443 |
| Tatoeba-test.arg-eng.arg.eng | 35.9 | 0.473 |
| Tatoeba-test.asm-eng.asm.eng | 19.8 | 0.365 |
| Tatoeba-test.ast-eng.ast.eng | 31.8 | 0.467 |
| Tatoeba-test.avk-eng.avk.eng | 0.4 | 0.119 |
| Tatoeba-test.awa-eng.awa.eng | 9.7 | 0.271 |
| Tatoeba-test.aze-eng.aze.eng | 37.0 | 0.542 |
| Tatoeba-test.bak-eng.bak.eng | 13.9 | 0.395 |
| Tatoeba-test.bam-eng.bam.eng | 2.2 | 0.094 |
| Tatoeba-test.bel-eng.bel.eng | 36.8 | 0.549 |
| Tatoeba-test.ben-eng.ben.eng | 39.7 | 0.546 |
| Tatoeba-test.bho-eng.bho.eng | 33.6 | 0.540 |
| Tatoeba-test.bod-eng.bod.eng | 1.1 | 0.147 |
| Tatoeba-test.bre-eng.bre.eng | 14.2 | 0.303 |
| Tatoeba-test.brx-eng.brx.eng | 1.7 | 0.130 |
| Tatoeba-test.bul-eng.bul.eng | 46.0 | 0.621 |
| Tatoeba-test.cat-eng.cat.eng | 46.6 | 0.636 |
| Tatoeba-test.ceb-eng.ceb.eng | 17.4 | 0.347 |
| Tatoeba-test.ces-eng.ces.eng | 41.3 | 0.586 |
| Tatoeba-test.cha-eng.cha.eng | 7.9 | 0.232 |
| Tatoeba-test.che-eng.che.eng | 0.7 | 0.104 |
| Tatoeba-test.chm-eng.chm.eng | 7.3 | 0.261 |
| Tatoeba-test.chr-eng.chr.eng | 8.8 | 0.244 |
| Tatoeba-test.chv-eng.chv.eng | 11.0 | 0.319 |
| Tatoeba-test.cor-eng.cor.eng | 5.4 | 0.204 |
| Tatoeba-test.cos-eng.cos.eng | 58.2 | 0.643 |
| Tatoeba-test.crh-eng.crh.eng | 26.3 | 0.399 |
| Tatoeba-test.csb-eng.csb.eng | 18.8 | 0.389 |
| Tatoeba-test.cym-eng.cym.eng | 23.4 | 0.407 |
| Tatoeba-test.dan-eng.dan.eng | 50.5 | 0.659 |
| Tatoeba-test.deu-eng.deu.eng | 39.6 | 0.579 |
| Tatoeba-test.dsb-eng.dsb.eng | 24.3 | 0.449 |
| Tatoeba-test.dtp-eng.dtp.eng | 1.0 | 0.149 |
| Tatoeba-test.dws-eng.dws.eng | 1.6 | 0.061 |
| Tatoeba-test.egl-eng.egl.eng | 7.6 | 0.236 |
| Tatoeba-test.ell-eng.ell.eng | 55.4 | 0.682 |
| Tatoeba-test.enm-eng.enm.eng | 28.0 | 0.489 |
| Tatoeba-test.epo-eng.epo.eng | 41.8 | 0.591 |
| Tatoeba-test.est-eng.est.eng | 41.5 | 0.581 |
| Tatoeba-test.eus-eng.eus.eng | 37.8 | 0.557 |
| Tatoeba-test.ewe-eng.ewe.eng | 10.7 | 0.262 |
| Tatoeba-test.ext-eng.ext.eng | 25.5 | 0.405 |
| Tatoeba-test.fao-eng.fao.eng | 28.7 | 0.469 |
| Tatoeba-test.fas-eng.fas.eng | 7.5 | 0.281 |
| Tatoeba-test.fij-eng.fij.eng | 24.2 | 0.320 |
| Tatoeba-test.fin-eng.fin.eng | 35.8 | 0.534 |
| Tatoeba-test.fkv-eng.fkv.eng | 15.5 | 0.434 |
| Tatoeba-test.fra-eng.fra.eng | 45.1 | 0.618 |
| Tatoeba-test.frm-eng.frm.eng | 29.6 | 0.427 |
| Tatoeba-test.frr-eng.frr.eng | 5.5 | 0.138 |
| Tatoeba-test.fry-eng.fry.eng | 25.3 | 0.455 |
| Tatoeba-test.ful-eng.ful.eng | 1.1 | 0.127 |
| Tatoeba-test.gcf-eng.gcf.eng | 16.0 | 0.315 |
| Tatoeba-test.gil-eng.gil.eng | 46.7 | 0.587 |
| Tatoeba-test.gla-eng.gla.eng | 20.2 | 0.358 |
| Tatoeba-test.gle-eng.gle.eng | 43.9 | 0.592 |
| Tatoeba-test.glg-eng.glg.eng | 45.1 | 0.623 |
| Tatoeba-test.glv-eng.glv.eng | 3.3 | 0.119 |
| Tatoeba-test.gos-eng.gos.eng | 20.1 | 0.364 |
| Tatoeba-test.got-eng.got.eng | 0.1 | 0.041 |
| Tatoeba-test.grc-eng.grc.eng | 2.1 | 0.137 |
| Tatoeba-test.grn-eng.grn.eng | 1.7 | 0.152 |
| Tatoeba-test.gsw-eng.gsw.eng | 18.2 | 0.334 |
| Tatoeba-test.guj-eng.guj.eng | 21.7 | 0.373 |
| Tatoeba-test.hat-eng.hat.eng | 34.5 | 0.502 |
| Tatoeba-test.hau-eng.hau.eng | 10.5 | 0.295 |
| Tatoeba-test.haw-eng.haw.eng | 2.8 | 0.160 |
| Tatoeba-test.hbs-eng.hbs.eng | 46.7 | 0.623 |
| Tatoeba-test.heb-eng.heb.eng | 33.0 | 0.492 |
| Tatoeba-test.hif-eng.hif.eng | 17.0 | 0.391 |
| Tatoeba-test.hil-eng.hil.eng | 16.0 | 0.339 |
| Tatoeba-test.hin-eng.hin.eng | 36.4 | 0.533 |
| Tatoeba-test.hmn-eng.hmn.eng | 0.4 | 0.131 |
| Tatoeba-test.hoc-eng.hoc.eng | 0.7 | 0.132 |
| Tatoeba-test.hsb-eng.hsb.eng | 41.9 | 0.551 |
| Tatoeba-test.hun-eng.hun.eng | 33.2 | 0.510 |
| Tatoeba-test.hye-eng.hye.eng | 32.2 | 0.487 |
| Tatoeba-test.iba-eng.iba.eng | 9.4 | 0.278 |
| Tatoeba-test.ibo-eng.ibo.eng | 5.8 | 0.200 |
| Tatoeba-test.ido-eng.ido.eng | 31.7 | 0.503 |
| Tatoeba-test.iku-eng.iku.eng | 9.1 | 0.164 |
| Tatoeba-test.ile-eng.ile.eng | 42.2 | 0.595 |
| Tatoeba-test.ilo-eng.ilo.eng | 29.7 | 0.485 |
| Tatoeba-test.ina-eng.ina.eng | 42.1 | 0.607 |
| Tatoeba-test.isl-eng.isl.eng | 35.7 | 0.527 |
| Tatoeba-test.ita-eng.ita.eng | 54.8 | 0.686 |
| Tatoeba-test.izh-eng.izh.eng | 28.3 | 0.526 |
| Tatoeba-test.jav-eng.jav.eng | 10.0 | 0.282 |
| Tatoeba-test.jbo-eng.jbo.eng | 0.3 | 0.115 |
| Tatoeba-test.jdt-eng.jdt.eng | 5.3 | 0.140 |
| Tatoeba-test.jpn-eng.jpn.eng | 18.8 | 0.387 |
| Tatoeba-test.kab-eng.kab.eng | 3.9 | 0.205 |
| Tatoeba-test.kal-eng.kal.eng | 16.9 | 0.329 |
| Tatoeba-test.kan-eng.kan.eng | 16.2 | 0.374 |
| Tatoeba-test.kat-eng.kat.eng | 31.1 | 0.493 |
| Tatoeba-test.kaz-eng.kaz.eng | 24.5 | 0.437 |
| Tatoeba-test.kek-eng.kek.eng | 7.4 | 0.192 |
| Tatoeba-test.kha-eng.kha.eng | 1.0 | 0.154 |
| Tatoeba-test.khm-eng.khm.eng | 12.2 | 0.290 |
| Tatoeba-test.kin-eng.kin.eng | 22.5 | 0.355 |
| Tatoeba-test.kir-eng.kir.eng | 27.2 | 0.470 |
| Tatoeba-test.kjh-eng.kjh.eng | 2.1 | 0.129 |
| Tatoeba-test.kok-eng.kok.eng | 4.5 | 0.259 |
| Tatoeba-test.kom-eng.kom.eng | 1.4 | 0.099 |
| Tatoeba-test.krl-eng.krl.eng | 26.1 | 0.387 |
| Tatoeba-test.ksh-eng.ksh.eng | 5.5 | 0.256 |
| Tatoeba-test.kum-eng.kum.eng | 9.3 | 0.288 |
| Tatoeba-test.kur-eng.kur.eng | 9.6 | 0.208 |
| Tatoeba-test.lad-eng.lad.eng | 30.1 | 0.475 |
| Tatoeba-test.lah-eng.lah.eng | 11.6 | 0.284 |
| Tatoeba-test.lao-eng.lao.eng | 4.5 | 0.214 |
| Tatoeba-test.lat-eng.lat.eng | 21.5 | 0.402 |
| Tatoeba-test.lav-eng.lav.eng | 40.2 | 0.577 |
| Tatoeba-test.ldn-eng.ldn.eng | 0.8 | 0.115 |
| Tatoeba-test.lfn-eng.lfn.eng | 23.0 | 0.433 |
| Tatoeba-test.lij-eng.lij.eng | 9.3 | 0.287 |
| Tatoeba-test.lin-eng.lin.eng | 2.4 | 0.196 |
| Tatoeba-test.lit-eng.lit.eng | 44.0 | 0.597 |
| Tatoeba-test.liv-eng.liv.eng | 1.6 | 0.115 |
| Tatoeba-test.lkt-eng.lkt.eng | 2.0 | 0.113 |
| Tatoeba-test.lld-eng.lld.eng | 18.3 | 0.312 |
| Tatoeba-test.lmo-eng.lmo.eng | 25.4 | 0.395 |
| Tatoeba-test.ltz-eng.ltz.eng | 35.9 | 0.509 |
| Tatoeba-test.lug-eng.lug.eng | 5.1 | 0.357 |
| Tatoeba-test.mad-eng.mad.eng | 2.8 | 0.123 |
| Tatoeba-test.mah-eng.mah.eng | 5.7 | 0.175 |
| Tatoeba-test.mai-eng.mai.eng | 56.3 | 0.703 |
| Tatoeba-test.mal-eng.mal.eng | 37.5 | 0.534 |
| Tatoeba-test.mar-eng.mar.eng | 22.8 | 0.470 |
| Tatoeba-test.mdf-eng.mdf.eng | 2.0 | 0.110 |
| Tatoeba-test.mfe-eng.mfe.eng | 59.2 | 0.764 |
| Tatoeba-test.mic-eng.mic.eng | 9.0 | 0.199 |
| Tatoeba-test.mkd-eng.mkd.eng | 44.3 | 0.593 |
| Tatoeba-test.mlg-eng.mlg.eng | 31.9 | 0.424 |
| Tatoeba-test.mlt-eng.mlt.eng | 38.6 | 0.540 |
| Tatoeba-test.mnw-eng.mnw.eng | 2.5 | 0.101 |
| Tatoeba-test.moh-eng.moh.eng | 0.3 | 0.110 |
| Tatoeba-test.mon-eng.mon.eng | 13.5 | 0.334 |
| Tatoeba-test.mri-eng.mri.eng | 8.5 | 0.260 |
| Tatoeba-test.msa-eng.msa.eng | 33.9 | 0.520 |
| Tatoeba-test.multi.eng | 34.7 | 0.518 |
| Tatoeba-test.mwl-eng.mwl.eng | 37.4 | 0.630 |
| Tatoeba-test.mya-eng.mya.eng | 15.5 | 0.335 |
| Tatoeba-test.myv-eng.myv.eng | 0.8 | 0.118 |
| Tatoeba-test.nau-eng.nau.eng | 9.0 | 0.186 |
| Tatoeba-test.nav-eng.nav.eng | 1.3 | 0.144 |
| Tatoeba-test.nds-eng.nds.eng | 30.7 | 0.495 |
| Tatoeba-test.nep-eng.nep.eng | 3.5 | 0.168 |
| Tatoeba-test.niu-eng.niu.eng | 42.7 | 0.492 |
| Tatoeba-test.nld-eng.nld.eng | 47.9 | 0.640 |
| Tatoeba-test.nog-eng.nog.eng | 12.7 | 0.284 |
| Tatoeba-test.non-eng.non.eng | 43.8 | 0.586 |
| Tatoeba-test.nor-eng.nor.eng | 45.5 | 0.619 |
| Tatoeba-test.nov-eng.nov.eng | 26.9 | 0.472 |
| Tatoeba-test.nya-eng.nya.eng | 33.2 | 0.456 |
| Tatoeba-test.oci-eng.oci.eng | 17.9 | 0.370 |
| Tatoeba-test.ori-eng.ori.eng | 14.6 | 0.305 |
| Tatoeba-test.orv-eng.orv.eng | 11.0 | 0.283 |
| Tatoeba-test.oss-eng.oss.eng | 4.1 | 0.211 |
| Tatoeba-test.ota-eng.ota.eng | 4.1 | 0.216 |
| Tatoeba-test.pag-eng.pag.eng | 24.3 | 0.468 |
| Tatoeba-test.pan-eng.pan.eng | 16.4 | 0.358 |
| Tatoeba-test.pap-eng.pap.eng | 53.2 | 0.628 |
| Tatoeba-test.pau-eng.pau.eng | 3.7 | 0.173 |
| Tatoeba-test.pdc-eng.pdc.eng | 45.3 | 0.569 |
| Tatoeba-test.pms-eng.pms.eng | 14.0 | 0.345 |
| Tatoeba-test.pol-eng.pol.eng | 41.7 | 0.588 |
| Tatoeba-test.por-eng.por.eng | 51.4 | 0.669 |
| Tatoeba-test.ppl-eng.ppl.eng | 0.4 | 0.134 |
| Tatoeba-test.prg-eng.prg.eng | 4.1 | 0.198 |
| Tatoeba-test.pus-eng.pus.eng | 6.7 | 0.233 |
| Tatoeba-test.quc-eng.quc.eng | 3.5 | 0.091 |
| Tatoeba-test.qya-eng.qya.eng | 0.2 | 0.090 |
| Tatoeba-test.rap-eng.rap.eng | 17.5 | 0.230 |
| Tatoeba-test.rif-eng.rif.eng | 4.2 | 0.164 |
| Tatoeba-test.roh-eng.roh.eng | 24.6 | 0.464 |
| Tatoeba-test.rom-eng.rom.eng | 3.4 | 0.212 |
| Tatoeba-test.ron-eng.ron.eng | 45.2 | 0.620 |
| Tatoeba-test.rue-eng.rue.eng | 21.4 | 0.390 |
| Tatoeba-test.run-eng.run.eng | 24.5 | 0.392 |
| Tatoeba-test.rus-eng.rus.eng | 42.7 | 0.591 |
| Tatoeba-test.sag-eng.sag.eng | 3.4 | 0.187 |
| Tatoeba-test.sah-eng.sah.eng | 5.0 | 0.177 |
| Tatoeba-test.san-eng.san.eng | 2.0 | 0.172 |
| Tatoeba-test.scn-eng.scn.eng | 35.8 | 0.410 |
| Tatoeba-test.sco-eng.sco.eng | 34.6 | 0.520 |
| Tatoeba-test.sgs-eng.sgs.eng | 21.8 | 0.299 |
| Tatoeba-test.shs-eng.shs.eng | 1.8 | 0.122 |
| Tatoeba-test.shy-eng.shy.eng | 1.4 | 0.104 |
| Tatoeba-test.sin-eng.sin.eng | 20.6 | 0.429 |
| Tatoeba-test.sjn-eng.sjn.eng | 1.2 | 0.095 |
| Tatoeba-test.slv-eng.slv.eng | 37.0 | 0.545 |
| Tatoeba-test.sma-eng.sma.eng | 4.4 | 0.147 |
| Tatoeba-test.sme-eng.sme.eng | 8.9 | 0.229 |
| Tatoeba-test.smo-eng.smo.eng | 37.7 | 0.483 |
| Tatoeba-test.sna-eng.sna.eng | 18.0 | 0.359 |
| Tatoeba-test.snd-eng.snd.eng | 28.1 | 0.444 |
| Tatoeba-test.som-eng.som.eng | 23.6 | 0.472 |
| Tatoeba-test.spa-eng.spa.eng | 47.9 | 0.645 |
| Tatoeba-test.sqi-eng.sqi.eng | 46.9 | 0.634 |
| Tatoeba-test.stq-eng.stq.eng | 8.1 | 0.379 |
| Tatoeba-test.sun-eng.sun.eng | 23.8 | 0.369 |
| Tatoeba-test.swa-eng.swa.eng | 6.5 | 0.193 |
| Tatoeba-test.swe-eng.swe.eng | 51.4 | 0.655 |
| Tatoeba-test.swg-eng.swg.eng | 18.5 | 0.342 |
| Tatoeba-test.tah-eng.tah.eng | 25.6 | 0.249 |
| Tatoeba-test.tam-eng.tam.eng | 29.1 | 0.437 |
| Tatoeba-test.tat-eng.tat.eng | 12.9 | 0.327 |
| Tatoeba-test.tel-eng.tel.eng | 21.2 | 0.386 |
| Tatoeba-test.tet-eng.tet.eng | 9.2 | 0.215 |
| Tatoeba-test.tgk-eng.tgk.eng | 12.7 | 0.374 |
| Tatoeba-test.tha-eng.tha.eng | 36.3 | 0.531 |
| Tatoeba-test.tir-eng.tir.eng | 9.1 | 0.267 |
| Tatoeba-test.tlh-eng.tlh.eng | 0.2 | 0.084 |
| Tatoeba-test.tly-eng.tly.eng | 2.1 | 0.128 |
| Tatoeba-test.toi-eng.toi.eng | 5.3 | 0.150 |
| Tatoeba-test.ton-eng.ton.eng | 39.5 | 0.473 |
| Tatoeba-test.tpw-eng.tpw.eng | 1.5 | 0.160 |
| Tatoeba-test.tso-eng.tso.eng | 44.7 | 0.526 |
| Tatoeba-test.tuk-eng.tuk.eng | 18.6 | 0.401 |
| Tatoeba-test.tur-eng.tur.eng | 40.5 | 0.573 |
| Tatoeba-test.tvl-eng.tvl.eng | 55.0 | 0.593 |
| Tatoeba-test.tyv-eng.tyv.eng | 19.1 | 0.477 |
| Tatoeba-test.tzl-eng.tzl.eng | 17.7 | 0.333 |
| Tatoeba-test.udm-eng.udm.eng | 3.4 | 0.217 |
| Tatoeba-test.uig-eng.uig.eng | 11.4 | 0.289 |
| Tatoeba-test.ukr-eng.ukr.eng | 43.1 | 0.595 |
| Tatoeba-test.umb-eng.umb.eng | 9.2 | 0.260 |
| Tatoeba-test.urd-eng.urd.eng | 23.2 | 0.426 |
| Tatoeba-test.uzb-eng.uzb.eng | 19.0 | 0.342 |
| Tatoeba-test.vec-eng.vec.eng | 41.1 | 0.409 |
| Tatoeba-test.vie-eng.vie.eng | 30.6 | 0.481 |
| Tatoeba-test.vol-eng.vol.eng | 1.8 | 0.143 |
| Tatoeba-test.war-eng.war.eng | 15.9 | 0.352 |
| Tatoeba-test.wln-eng.wln.eng | 12.6 | 0.291 |
| Tatoeba-test.wol-eng.wol.eng | 4.4 | 0.138 |
| Tatoeba-test.xal-eng.xal.eng | 0.9 | 0.153 |
| Tatoeba-test.xho-eng.xho.eng | 35.4 | 0.513 |
| Tatoeba-test.yid-eng.yid.eng | 19.4 | 0.387 |
| Tatoeba-test.yor-eng.yor.eng | 19.3 | 0.327 |
| Tatoeba-test.zho-eng.zho.eng | 25.8 | 0.448 |
| Tatoeba-test.zul-eng.zul.eng | 40.9 | 0.567 |
| Tatoeba-test.zza-eng.zza.eng | 1.6 | 0.125 |
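The chr-F column in the table above reports character n-gram F-scores (chrF2, i.e. β=2, matching the `chrF2_score` field in System Info). A simplified sketch of the metric — not the exact implementation used by the OPUS scoring scripts — averages precision and recall over character n-grams of length 1 to 6:

```python
from collections import Counter

def char_ngrams(text: str, n: int) -> Counter:
    # Character n-grams with spaces removed, as in the original chrF definition
    s = text.replace(" ", "")
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))

def chrf(hyp: str, ref: str, max_n: int = 6, beta: float = 2.0) -> float:
    # Simplified chrF: macro-average n-gram precision/recall, combined
    # with an F-beta score (beta=2 weights recall twice as much).
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        h, r = char_ngrams(hyp, n), char_ngrams(ref, n)
        if sum(h.values()) == 0 or sum(r.values()) == 0:
            continue
        overlap = sum((h & r).values())
        precisions.append(overlap / sum(h.values()))
        recalls.append(overlap / sum(r.values()))
    if not precisions:
        return 0.0
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    if p + r == 0:
        return 0.0
    return (1 + beta**2) * p * r / (beta**2 * p + r)

print(chrf("hello world", "hello world"))  # 1.0 for identical strings
```

The scores in the table are computed by the Tatoeba-Challenge evaluation tooling, which additionally handles tokenization details not shown here.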
### System Info:
- hf_name: mul-eng
- source_languages: mul
- target_languages: eng
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/mul-eng/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['ca', 'es', 'os', 'eo', 'ro', 'fy', 'cy', 'is', 'lb', 'su', 'an', 'sq', 'fr', 'ht', 'rm', 'cv', 'ig', 'am', 'eu', 'tr', 'ps', 'af', 'ny', 'ch', 'uk', 'sl', 'lt', 'tk', 'sg', 'ar', 'lg', 'bg', 'be', 'ka', 'gd', 'ja', 'si', 'br', 'mh', 'km', 'th', 'ty', 'rw', 'te', 'mk', 'or', 'wo', 'kl', 'mr', 'ru', 'yo', 'hu', 'fo', 'zh', 'ti', 'co', 'ee', 'oc', 'sn', 'mt', 'ts', 'pl', 'gl', 'nb', 'bn', 'tt', 'bo', 'lo', 'id', 'gn', 'nv', 'hy', 'kn', 'to', 'io', 'so', 'vi', 'da', 'fj', 'gv', 'sm', 'nl', 'mi', 'pt', 'hi', 'se', 'as', 'ta', 'et', 'kw', 'ga', 'sv', 'ln', 'na', 'mn', 'gu', 'wa', 'lv', 'jv', 'el', 'my', 'ba', 'it', 'hr', 'ur', 'ce', 'nn', 'fi', 'mg', 'rn', 'xh', 'ab', 'de', 'cs', 'he', 'zu', 'yi', 'ml', 'mul', 'en']
- src_constituents: {'sjn_Latn', 'cat', 'nan', 'spa', 'ile_Latn', 'pap', 'mwl', 'uzb_Latn', 'mww', 'hil', 'lij', 'avk_Latn', 'lad_Latn', 'lat_Latn', 'bos_Latn', 'oss', 'epo', 'ron', 'fry', 'cym', 'toi_Latn', 'awa', 'swg', 'zsm_Latn', 'zho_Hant', 'gcf_Latn', 'uzb_Cyrl', 'isl', 'lfn_Latn', 'shs_Latn', 'nov_Latn', 'bho', 'ltz', 'lzh', 'kur_Latn', 'sun', 'arg', 'pes_Thaa', 'sqi', 'uig_Arab', 'csb_Latn', 'fra', 'hat', 'liv_Latn', 'non_Latn', 'sco', 'cmn_Hans', 'pnb', 'roh', 'chv', 'ibo', 'bul_Latn', 'amh', 'lfn_Cyrl', 'eus', 'fkv_Latn', 'tur', 'pus', 'afr', 'brx_Latn', 'nya', 'acm', 'ota_Latn', 'cha', 'ukr', 'xal', 'slv', 'lit', 'zho_Hans', 'tmw_Latn', 'kjh', 'ota_Arab', 'war', 'tuk', 'sag', 'myv', 'hsb', 'lzh_Hans', 'ara', 'tly_Latn', 'lug', 'brx', 'bul', 'bel', 'vol_Latn', 'kat', 'gan', 'got_Goth', 'vro', 'ext', 'afh_Latn', 'gla', 'jpn', 'udm', 'mai', 'ary', 'sin', 'tvl', 'hif_Latn', 'cjy_Hant', 'bre', 'ceb', 'mah', 'nob_Hebr', 'crh_Latn', 'prg_Latn', 'khm', 'ang_Latn', 'tha', 'tah', 'tzl', 'aln', 'kin', 'tel', 'ady', 'mkd', 'ori', 'wol', 'aze_Latn', 'jbo', 'niu', 'kal', 'mar', 'vie_Hani', 'arz', 'yue', 'kha', 'san_Deva', 'jbo_Latn', 'gos', 'hau_Latn', 'rus', 'quc', 'cmn', 'yor', 'hun', 'uig_Cyrl', 'fao', 'mnw', 'zho', 'orv_Cyrl', 'iba', 'bel_Latn', 'tir', 'afb', 'crh', 'mic', 'cos', 'swh', 'sah', 'krl', 'ewe', 'apc', 'zza', 'chr', 'grc_Grek', 'tpw_Latn', 'oci', 'mfe', 'sna', 'kir_Cyrl', 'tat_Latn', 'gom', 'ido_Latn', 'sgs', 'pau', 'tgk_Cyrl', 'nog', 'mlt', 'pdc', 'tso', 'srp_Cyrl', 'pol', 'ast', 'glg', 'pms', 'fuc', 'nob', 'qya', 'ben', 'tat', 'kab', 'min', 'srp_Latn', 'wuu', 'dtp', 'jbo_Cyrl', 'tet', 'bod', 'yue_Hans', 'zlm_Latn', 'lao', 'ind', 'grn', 'nav', 'kaz_Cyrl', 'rom', 'hye', 'kan', 'ton', 'ido', 'mhr', 'scn', 'som', 'rif_Latn', 'vie', 'enm_Latn', 'lmo', 'npi', 'pes', 'dan', 'fij', 'ina_Latn', 'cjy_Hans', 'jdt_Cyrl', 'gsw', 'glv', 'khm_Latn', 'smo', 'umb', 'sma', 'gil', 'nld', 'snd_Arab', 'arq', 'mri', 'kur_Arab', 'por', 'hin', 'shy_Latn', 'sme', 'rap', 
'tyv', 'dsb', 'moh', 'asm', 'lad', 'yue_Hant', 'kpv', 'tam', 'est', 'frm_Latn', 'hoc_Latn', 'bam_Latn', 'kek_Latn', 'ksh', 'tlh_Latn', 'ltg', 'pan_Guru', 'hnj_Latn', 'cor', 'gle', 'swe', 'lin', 'qya_Latn', 'kum', 'mad', 'cmn_Hant', 'fuv', 'nau', 'mon', 'akl_Latn', 'guj', 'kaz_Latn', 'wln', 'tuk_Latn', 'jav_Java', 'lav', 'jav', 'ell', 'frr', 'mya', 'bak', 'rue', 'ita', 'hrv', 'izh', 'ilo', 'dws_Latn', 'urd', 'stq', 'tat_Arab', 'haw', 'che', 'pag', 'nno', 'fin', 'mlg', 'ppl_Latn', 'run', 'xho', 'abk', 'deu', 'hoc', 'lkt', 'lld_Latn', 'tzl_Latn', 'mdf', 'ike_Latn', 'ces', 'ldn_Latn', 'egl', 'heb', 'vec', 'zul', 'max_Latn', 'pes_Latn', 'yid', 'mal', 'nds'}
- tgt_constituents: {'eng'}
- src_multilingual: True
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/mul-eng/opus2m-2020-08-01.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/mul-eng/opus2m-2020-08-01.test.txt
- src_alpha3: mul
- tgt_alpha3: eng
- short_pair: mul-en
- chrF2_score: 0.518
- bleu: 34.7
- brevity_penalty: 1.0
- ref_len: 72346.0
- src_name: Multiple languages
- tgt_name: English
- train_date: 2020-08-01
- src_alpha2: mul
- tgt_alpha2: en
- prefer_old: False
- long_pair: mul-eng
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41
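The `bleu`, `brevity_penalty`, and `ref_len` fields above follow the standard BLEU definition: a brevity penalty of 1.0 means the system output was at least as long as the reference on the combined test set. A minimal sketch of the penalty term (the function name is illustrative, not taken from the scoring tool):

```python
import math

def brevity_penalty(candidate_len: int, reference_len: int) -> float:
    """BLEU brevity penalty: penalizes candidates shorter than the reference."""
    if candidate_len >= reference_len:
        return 1.0  # no penalty when the output is at least reference length
    return math.exp(1.0 - reference_len / candidate_len)

# brevity_penalty = 1.0 with ref_len = 72346.0 implies the system output
# contained at least 72346 tokens in total.
print(brevity_penalty(72346, 72346))  # 1.0
print(brevity_penalty(60000, 72346))  # < 1.0 for a too-short output
```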
0.0292205810546875,
0.00518035888671875,
0.02117919921875,
0.004913330078125,
-0.029571533203125,
0.055267333984375,
0.018951416015625,
-0.03643798828125,
-0.0655517578125,
-0.00948333740234375,
-0.075927734375,
-0.00258636474609375,
0.080810546875,
-0.01195526123046875,
-0.0186614990234375,
0.00409698486328125,
-0.0235443115234375,
0.02960205078125,
-0.03240966796875,
0.0230560302734375,
0.0380859375,
-0.00968170166015625,
0.00933074951171875,
-0.061981201171875,
0.031402587890625,
0.04071044921875,
-0.06597900390625,
-0.01318359375,
0.0159149169921875,
0.0199127197265625,
0.040191650390625,
0.072021484375,
-0.0343017578125,
0.0045318603515625,
0.0220489501953125,
0.0045013427734375,
0.0025386810302734375,
0.0018568038940429688,
0.0002849102020263672,
0.00852203369140625,
-0.0102996826171875,
-0.0399169921875
]
] |
sentence-transformers/LaBSE | 2023-11-02T09:18:45.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"jax",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"multilingual",
"af",
"sq",
"am",
"ar",
"hy",
"as",
"az",
"eu",
"be",
"bn",
"bs",
"bg",
"my",
"ca",
"ceb",
"zh",
"co",
"hr",
"cs",
"da",
"nl",
"en",
"eo",
"et",
"fi",
"fr",
"fy",
"gl",
"ka",
"de",
"el",
"gu",
"ht",
"ha",
"haw",
"he",
"hi",
"hmn",
"hu",
"is",
"ig",
"id",
"ga",
"it",
"ja",
"jv",
"kn",
"kk",
"km",
"rw",
"ko",
"ku",
"ky",
"lo",
"la",
"lv",
"lt",
"lb",
"mk",
"mg",
"ms",
"ml",
"mt",
"mi",
"mr",
"mn",
"ne",
"no",
"ny",
"or",
"fa",
"pl",
"pt",
"pa",
"ro",
"ru",
"sm",
"gd",
"sr",
"st",
"sn",
"si",
"sk",
"sl",
"so",
"es",
"su",
"sw",
"sv",
"tl",
"tg",
"ta",
"tt",
"te",
"th",
"bo",
"tr",
"tk",
"ug",
"uk",
"ur",
"uz",
"vi",
"cy",
"wo",
"xh",
"yi",
"yo",
"zu",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/LaBSE | 98 | 85,400 | sentence-transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- af
- sq
- am
- ar
- hy
- as
- az
- eu
- be
- bn
- bs
- bg
- my
- ca
- ceb
- zh
- co
- hr
- cs
- da
- nl
- en
- eo
- et
- fi
- fr
- fy
- gl
- ka
- de
- el
- gu
- ht
- ha
- haw
- he
- hi
- hmn
- hu
- is
- ig
- id
- ga
- it
- ja
- jv
- kn
- kk
- km
- rw
- ko
- ku
- ky
- lo
- la
- lv
- lt
- lb
- mk
- mg
- ms
- ml
- mt
- mi
- mr
- mn
- ne
- no
- ny
- or
- fa
- pl
- pt
- pa
- ro
- ru
- sm
- gd
- sr
- st
- sn
- si
- sk
- sl
- so
- es
- su
- sw
- sv
- tl
- tg
- ta
- tt
- te
- th
- bo
- tr
- tk
- ug
- uk
- ur
- uz
- vi
- cy
- wo
- xh
- yi
- yo
- zu
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
license: apache-2.0
---
# LaBSE
This is a port of the [LaBSE](https://tfhub.dev/google/LaBSE/1) model to PyTorch. It can be used to map sentences in 109 languages to a shared vector space.
## Usage (Sentence-Transformers)
This model is easy to use once [sentence-transformers](https://www.SBERT.net) is installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/LaBSE')
embeddings = model.encode(sentences)
print(embeddings)
```
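Because the model architecture ends with a `Normalize()` module, the returned embeddings are unit-length, so the cosine similarity between two sentences reduces to a plain dot product. A minimal sketch using hypothetical stand-in vectors (the real model outputs 768-dimensional embeddings; these tiny vectors are for illustration only):

```python
import numpy as np

# Hypothetical unit vectors standing in for real LaBSE embeddings.
emb_a = np.array([0.6, 0.8, 0.0])
emb_b = np.array([0.8, 0.6, 0.0])

# Embeddings are L2-normalized, so cosine similarity is a dot product.
similarity = float(np.dot(emb_a, emb_b))
print(similarity)  # ≈ 0.96
```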
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/LaBSE)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
(2): Dense({'in_features': 768, 'out_features': 768, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
(3): Normalize()
)
```
## Citing & Authors
See [LaBSE](https://tfhub.dev/google/LaBSE/1) for the publication that describes the model.
| 2,195 | [
[
-0.0267181396484375,
-0.045166015625,
0.021759033203125,
0.030548095703125,
-0.006313323974609375,
-0.003711700439453125,
-0.0175323486328125,
0.003208160400390625,
0.00850677490234375,
0.0277099609375,
-0.01995849609375,
-0.04620361328125,
-0.03936767578125,
0.00011879205703735352,
-0.053131103515625,
0.07318115234375,
-0.021820068359375,
-0.00032329559326171875,
0.00004738569259643555,
-0.0184173583984375,
-0.021087646484375,
-0.030792236328125,
-0.0196075439453125,
-0.030609130859375,
0.0286102294921875,
0.0227813720703125,
0.049072265625,
0.01557159423828125,
0.0233154296875,
0.027069091796875,
-0.004852294921875,
-0.0186920166015625,
-0.03656005859375,
-0.01103973388671875,
-0.006076812744140625,
-0.0151519775390625,
-0.0208282470703125,
0.006134033203125,
0.031707763671875,
0.0416259765625,
-0.0038509368896484375,
0.01483917236328125,
-0.0092926025390625,
0.02276611328125,
-0.0257568359375,
0.01389312744140625,
-0.0212249755859375,
0.0110626220703125,
0.0073699951171875,
0.0105743408203125,
-0.042236328125,
-0.0240478515625,
0.039276123046875,
-0.035919189453125,
0.028656005859375,
0.006549835205078125,
0.0809326171875,
0.0020751953125,
-0.0234832763671875,
-0.0163726806640625,
-0.03173828125,
0.053070068359375,
-0.042388916015625,
0.022186279296875,
0.006549835205078125,
0.0220794677734375,
-0.01654052734375,
-0.09814453125,
-0.04425048828125,
-0.0236053466796875,
-0.01528167724609375,
0.01531982421875,
-0.033843994140625,
0.0006070137023925781,
0.016357421875,
0.037261962890625,
-0.053192138671875,
-0.0227203369140625,
-0.048614501953125,
-0.02642822265625,
0.035736083984375,
-0.01233673095703125,
0.047576904296875,
-0.0202789306640625,
-0.0216827392578125,
-0.039581298828125,
-0.0212249755859375,
-0.01318359375,
0.01454925537109375,
0.00382232666015625,
-0.015350341796875,
0.060638427734375,
-0.0030345916748046875,
0.0413818359375,
-0.01360321044921875,
0.036346435546875,
0.04278564453125,
-0.01258087158203125,
-0.0149078369140625,
-0.004970550537109375,
0.089111328125,
0.0132293701171875,
0.0270538330078125,
-0.041259765625,
-0.0029010772705078125,
0.010101318359375,
0.03173828125,
-0.06671142578125,
-0.035003662109375,
0.025238037109375,
-0.0170135498046875,
-0.0287628173828125,
0.01476287841796875,
-0.05206298828125,
-0.00885009765625,
0.008392333984375,
0.06121826171875,
-0.05059814453125,
-0.0014324188232421875,
0.018341064453125,
-0.02947998046875,
0.01139068603515625,
-0.0157928466796875,
-0.0548095703125,
0.045745849609375,
0.03338623046875,
0.0836181640625,
-0.006252288818359375,
-0.04071044921875,
-0.03692626953125,
-0.0187225341796875,
-0.00609588623046875,
0.040802001953125,
-0.0111236572265625,
-0.0254058837890625,
0.00823974609375,
0.0168304443359375,
-0.004856109619140625,
-0.0230865478515625,
0.042236328125,
-0.0322265625,
0.041046142578125,
0.016693115234375,
-0.050140380859375,
-0.0298004150390625,
0.007015228271484375,
-0.055084228515625,
0.0858154296875,
0.030975341796875,
-0.05682373046875,
-0.0010585784912109375,
-0.06787109375,
-0.0230560302734375,
0.003452301025390625,
-0.001071929931640625,
-0.041656494140625,
0.00853729248046875,
0.022369384765625,
0.0302581787109375,
0.00673675537109375,
0.03326416015625,
-0.0182952880859375,
-0.039794921875,
0.02886962890625,
-0.00440216064453125,
0.07965087890625,
0.0105743408203125,
-0.0173492431640625,
0.0443115234375,
-0.033203125,
0.0008692741394042969,
-0.004070281982421875,
-0.021636962890625,
0.0068359375,
-0.01331329345703125,
0.047821044921875,
0.01250457763671875,
0.01384735107421875,
-0.054931640625,
0.0234527587890625,
-0.043060302734375,
0.06365966796875,
0.0281829833984375,
-0.014801025390625,
0.053863525390625,
-0.023590087890625,
0.032958984375,
-0.0049285888671875,
-0.002452850341796875,
0.011688232421875,
-0.044158935546875,
-0.055755615234375,
-0.0295257568359375,
0.032928466796875,
0.04888916015625,
-0.039581298828125,
0.05438232421875,
-0.03607177734375,
-0.042999267578125,
-0.06103515625,
0.00019979476928710938,
-0.002532958984375,
0.0220489501953125,
0.03961181640625,
0.0005230903625488281,
-0.04046630859375,
-0.06768798828125,
-0.007442474365234375,
0.0146331787109375,
-0.016204833984375,
0.00479888916015625,
0.059326171875,
-0.03265380859375,
0.078125,
-0.031280517578125,
-0.0179290771484375,
-0.033203125,
0.02142333984375,
0.02337646484375,
0.03167724609375,
0.03826904296875,
-0.0323486328125,
-0.03558349609375,
-0.0082550048828125,
-0.0474853515625,
-0.007564544677734375,
0.00833892822265625,
-0.0220184326171875,
0.01544952392578125,
0.041290283203125,
-0.043304443359375,
0.02777099609375,
0.065185546875,
-0.0301513671875,
0.0255279541015625,
-0.021392822265625,
-0.00725555419921875,
-0.10614013671875,
0.01434326171875,
-0.006290435791015625,
-0.02154541015625,
-0.0280914306640625,
0.0187225341796875,
0.0180511474609375,
-0.025482177734375,
-0.033050537109375,
0.059478759765625,
-0.0142059326171875,
0.0014591217041015625,
-0.014923095703125,
0.0238037109375,
-0.01233673095703125,
0.0302734375,
-0.011474609375,
0.06756591796875,
0.055419921875,
-0.0296783447265625,
0.039703369140625,
0.041595458984375,
-0.04290771484375,
-0.0257110595703125,
-0.065185546875,
0.0201568603515625,
0.007568359375,
0.0287628173828125,
-0.0723876953125,
-0.01506805419921875,
0.01715087890625,
-0.049346923828125,
-0.010345458984375,
0.005977630615234375,
-0.061004638671875,
-0.0418701171875,
-0.028961181640625,
0.0176849365234375,
0.049957275390625,
-0.05108642578125,
0.0577392578125,
0.0174713134765625,
0.004810333251953125,
-0.037872314453125,
-0.062744140625,
0.0048828125,
-0.01654052734375,
-0.0672607421875,
0.04437255859375,
0.001190185546875,
0.011077880859375,
0.0263671875,
0.010009765625,
-0.0014820098876953125,
-0.0016193389892578125,
0.0196380615234375,
0.0181884765625,
-0.00342559814453125,
0.0174407958984375,
-0.004482269287109375,
-0.007518768310546875,
-0.00665283203125,
-0.0130157470703125,
0.07373046875,
-0.032562255859375,
0.004528045654296875,
-0.025634765625,
0.017425537109375,
0.036376953125,
-0.01345062255859375,
0.06732177734375,
0.0662841796875,
-0.0307769775390625,
-0.020782470703125,
-0.0230560302734375,
-0.02276611328125,
-0.0362548828125,
0.046875,
-0.022674560546875,
-0.0731201171875,
0.04461669921875,
-0.006237030029296875,
-0.007419586181640625,
0.045440673828125,
0.045745849609375,
-0.002262115478515625,
0.051727294921875,
0.04498291015625,
-0.024871826171875,
0.04150390625,
-0.038726806640625,
0.0416259765625,
-0.06378173828125,
0.00494384765625,
-0.0284576416015625,
-0.0183868408203125,
-0.060821533203125,
-0.030517578125,
0.0171051025390625,
0.00969696044921875,
-0.02618408203125,
0.0487060546875,
-0.0484619140625,
0.0345458984375,
0.058013916015625,
0.0033626556396484375,
-0.007602691650390625,
0.02752685546875,
-0.0195159912109375,
0.01139068603515625,
-0.05755615234375,
-0.053497314453125,
0.08392333984375,
0.0208587646484375,
0.03436279296875,
0.004650115966796875,
0.0594482421875,
-0.002590179443359375,
0.02490234375,
-0.0758056640625,
0.042449951171875,
-0.0253448486328125,
-0.047393798828125,
-0.0123291015625,
-0.0162200927734375,
-0.0791015625,
0.0177001953125,
-0.013153076171875,
-0.0777587890625,
-0.020721435546875,
-0.0184783935546875,
-0.0169677734375,
0.010650634765625,
-0.0714111328125,
0.08941650390625,
0.00009310245513916016,
-0.0196075439453125,
-0.0226593017578125,
-0.030181884765625,
0.011016845703125,
0.01004791259765625,
0.0018320083618164062,
0.01537322998046875,
0.02642822265625,
0.06756591796875,
-0.0243988037109375,
0.054534912109375,
0.013671875,
0.0216217041015625,
0.0269775390625,
0.005016326904296875,
0.0228729248046875,
0.00263214111328125,
-0.00962066650390625,
0.0109710693359375,
0.007068634033203125,
-0.0219879150390625,
-0.027862548828125,
0.0650634765625,
-0.08538818359375,
-0.0225067138671875,
-0.05364990234375,
-0.06622314453125,
-0.0004382133483886719,
0.0193939208984375,
0.01751708984375,
0.0294189453125,
-0.024566650390625,
0.0298919677734375,
0.01568603515625,
-0.04144287109375,
0.036712646484375,
0.0186614990234375,
-0.0264892578125,
-0.035675048828125,
0.05682373046875,
-0.00922393798828125,
-0.0033855438232421875,
0.0213470458984375,
0.0250396728515625,
-0.0355224609375,
-0.0189666748046875,
-0.01444244384765625,
0.02081298828125,
-0.05316162109375,
0.002994537353515625,
-0.060760498046875,
-0.033843994140625,
-0.0390625,
-0.0216217041015625,
-0.012725830078125,
-0.023223876953125,
-0.0172119140625,
-0.016204833984375,
0.050506591796875,
0.033203125,
-0.00479888916015625,
0.053802490234375,
-0.0516357421875,
0.0304412841796875,
0.01546478271484375,
0.0249481201171875,
-0.01617431640625,
-0.035888671875,
-0.0005021095275878906,
-0.01116943359375,
-0.035980224609375,
-0.077880859375,
0.050262451171875,
-0.0023174285888671875,
0.038726806640625,
-0.01142120361328125,
-0.00955963134765625,
0.033782958984375,
-0.034698486328125,
0.053497314453125,
0.008087158203125,
-0.086669921875,
0.03240966796875,
-0.01453399658203125,
0.0215606689453125,
0.02984619140625,
0.0102996826171875,
-0.05450439453125,
-0.0330810546875,
-0.05810546875,
-0.0858154296875,
0.0595703125,
0.03558349609375,
0.025360107421875,
-0.0085906982421875,
0.01175689697265625,
-0.004489898681640625,
-0.0018453598022460938,
-0.0743408203125,
-0.02117919921875,
-0.0309295654296875,
-0.04595947265625,
-0.00954437255859375,
-0.0054168701171875,
-0.0118408203125,
-0.0239715576171875,
0.05810546875,
0.007297515869140625,
0.033660888671875,
0.00777435302734375,
-0.03131103515625,
0.0217132568359375,
0.024749755859375,
0.031707763671875,
0.02288818359375,
-0.037872314453125,
0.0082550048828125,
0.00893402099609375,
-0.02880859375,
-0.010894775390625,
0.0257568359375,
-0.006519317626953125,
0.0098419189453125,
0.0208282470703125,
0.06292724609375,
0.00637054443359375,
-0.03662109375,
0.04205322265625,
0.0033016204833984375,
-0.0208282470703125,
-0.0259552001953125,
-0.0052337646484375,
0.03192138671875,
0.020233154296875,
0.0010595321655273438,
0.00287628173828125,
0.0128173828125,
-0.03912353515625,
0.0255279541015625,
0.009918212890625,
-0.029266357421875,
-0.01055908203125,
0.04498291015625,
0.01390838623046875,
-0.033447265625,
0.07757568359375,
-0.0257568359375,
-0.051910400390625,
0.053009033203125,
0.0479736328125,
0.0738525390625,
0.01250457763671875,
0.029937744140625,
0.05078125,
0.019622802734375,
-0.01995849609375,
0.024749755859375,
0.0135650634765625,
-0.060882568359375,
-0.0343017578125,
-0.04052734375,
-0.0120391845703125,
0.018341064453125,
-0.040283203125,
0.034515380859375,
-0.01450347900390625,
-0.006252288818359375,
-0.0013217926025390625,
-0.00485992431640625,
-0.06005859375,
-0.005466461181640625,
0.00803375244140625,
0.060546875,
-0.06512451171875,
0.08343505859375,
0.06341552734375,
-0.0411376953125,
-0.0643310546875,
-0.01422882080078125,
-0.022674560546875,
-0.06439208984375,
0.047607421875,
0.0174560546875,
0.0026035308837890625,
0.0145721435546875,
-0.038543701171875,
-0.051239013671875,
0.084228515625,
0.0167236328125,
-0.041961669921875,
0.0218963623046875,
0.0005860328674316406,
0.04974365234375,
-0.0187530517578125,
0.0204620361328125,
0.0274200439453125,
0.0183868408203125,
-0.0015268325805664062,
-0.06695556640625,
0.0172576904296875,
-0.024566650390625,
-0.00038743019104003906,
0.0007686614990234375,
-0.024200439453125,
0.0595703125,
-0.006549835205078125,
-0.016021728515625,
0.01611328125,
0.043243408203125,
0.0360107421875,
-0.01024627685546875,
0.0174102783203125,
0.044158935546875,
0.043975830078125,
-0.0179290771484375,
0.08099365234375,
-0.02923583984375,
0.06475830078125,
0.078125,
0.005893707275390625,
0.07147216796875,
0.046844482421875,
-0.00803375244140625,
0.04046630859375,
0.0268707275390625,
-0.029144287109375,
0.050262451171875,
0.042633056640625,
-0.00360870361328125,
-0.0010023117065429688,
0.015960693359375,
-0.0205078125,
0.0138397216796875,
0.018524169921875,
-0.027923583984375,
-0.0199737548828125,
-0.007266998291015625,
0.0005674362182617188,
-0.01007843017578125,
0.01297760009765625,
0.0228271484375,
0.0028839111328125,
-0.03619384765625,
0.02032470703125,
0.0143890380859375,
0.05364990234375,
-0.03668212890625,
0.0074005126953125,
-0.00917816162109375,
0.0328369140625,
-0.0022029876708984375,
-0.07196044921875,
0.019287109375,
-0.00962066650390625,
0.0016832351684570312,
-0.0214996337890625,
0.039947509765625,
-0.0318603515625,
-0.05438232421875,
0.0343017578125,
0.03546142578125,
0.00801849365234375,
-0.0028018951416015625,
-0.040313720703125,
0.00001531839370727539,
-0.00727081298828125,
-0.02410888671875,
0.0019359588623046875,
0.0236358642578125,
-0.0005283355712890625,
0.04669189453125,
0.0247039794921875,
-0.007038116455078125,
0.0255279541015625,
0.026153564453125,
0.049346923828125,
-0.053070068359375,
-0.03521728515625,
-0.05584716796875,
0.0345458984375,
-0.00731658935546875,
-0.03143310546875,
0.044952392578125,
0.049652099609375,
0.061798095703125,
-0.032684326171875,
0.059478759765625,
-0.0191192626953125,
-0.002010345458984375,
-0.0154266357421875,
0.0596923828125,
-0.034423828125,
-0.0173797607421875,
-0.004657745361328125,
-0.0667724609375,
-0.03302001953125,
0.083740234375,
-0.026458740234375,
-0.00998687744140625,
0.079345703125,
0.062408447265625,
-0.0190582275390625,
-0.00334930419921875,
0.005680084228515625,
0.030670166015625,
0.03302001953125,
0.0352783203125,
0.048370361328125,
-0.06719970703125,
0.055694580078125,
-0.0298919677734375,
0.002040863037109375,
-0.008575439453125,
-0.050262451171875,
-0.07513427734375,
-0.045318603515625,
-0.043792724609375,
-0.040557861328125,
-0.01340484619140625,
0.062042236328125,
0.050689697265625,
-0.07269287109375,
-0.02630615234375,
-0.039459228515625,
-0.01398468017578125,
-0.01396942138671875,
-0.02056884765625,
0.04840087890625,
-0.0264129638671875,
-0.06182861328125,
0.0216827392578125,
-0.00812530517578125,
0.00394439697265625,
-0.00731658935546875,
0.0008006095886230469,
-0.0152130126953125,
-0.01152801513671875,
0.037994384765625,
-0.0023822784423828125,
-0.06597900390625,
-0.02728271484375,
0.011077880859375,
-0.02142333984375,
-0.0059356689453125,
0.036163330078125,
-0.06256103515625,
0.037445068359375,
0.0550537109375,
0.034515380859375,
0.0701904296875,
-0.034149169921875,
0.04986572265625,
-0.049468994140625,
0.02099609375,
0.004547119140625,
0.04766845703125,
0.0270538330078125,
0.005519866943359375,
0.04290771484375,
-0.007411956787109375,
-0.038848876953125,
-0.047210693359375,
-0.00183868408203125,
-0.11083984375,
-0.016204833984375,
0.095458984375,
-0.007965087890625,
-0.0243072509765625,
0.009918212890625,
-0.03399658203125,
0.0528564453125,
-0.035308837890625,
0.0780029296875,
0.081298828125,
0.014923095703125,
-0.00618743896484375,
-0.04974365234375,
0.01108551025390625,
0.029815673828125,
-0.049041748046875,
-0.0174102783203125,
0.0159759521484375,
0.03717041015625,
0.0204315185546875,
0.02960205078125,
-0.007701873779296875,
0.0110931396484375,
0.004436492919921875,
0.028350830078125,
-0.00470733642578125,
0.0018262863159179688,
-0.01824951171875,
0.0192718505859375,
-0.0200042724609375,
-0.044097900390625
]
] |
facebook/nllb-200-distilled-600M | 2023-02-11T20:19:06.000Z | [
"transformers",
"pytorch",
"m2m_100",
"text2text-generation",
"nllb",
"translation",
"ace",
"acm",
"acq",
"aeb",
"af",
"ajp",
"ak",
"als",
"am",
"apc",
"ar",
"ars",
"ary",
"arz",
"as",
"ast",
"awa",
"ayr",
"azb",
"azj",
"ba",
"bm",
"ban",
"be",
"bem",
"bn",
"bho",
"bjn",
"bo",
"bs",
"bug",
"bg",
"ca",
"ceb",
"cs",
"cjk",
"ckb",
"crh",
"cy",
"da",
"de",
"dik",
"dyu",
"dz",
"el",
"en",
"eo",
"et",
"eu",
"ee",
"fo",
"fj",
"fi",
"fon",
"fr",
"fur",
"fuv",
"gaz",
"gd",
"ga",
"gl",
"gn",
"gu",
"ht",
"ha",
"he",
"hi",
"hne",
"hr",
"hu",
"hy",
"ig",
"ilo",
"id",
"is",
"it",
"jv",
"ja",
"kab",
"kac",
"kam",
"kn",
"ks",
"ka",
"kk",
"kbp",
"kea",
"khk",
"km",
"ki",
"rw",
"ky",
"kmb",
"kmr",
"knc",
"kg",
"ko",
"lo",
"lij",
"li",
"ln",
"lt",
"lmo",
"ltg",
"lb",
"lua",
"lg",
"luo",
"lus",
"lvs",
"mag",
"mai",
"ml",
"mar",
"min",
"mk",
"mt",
"mni",
"mos",
"mi",
"my",
"nl",
"nn",
"nb",
"npi",
"nso",
"nus",
"ny",
"oc",
"ory",
"pag",
"pa",
"pap",
"pbt",
"pes",
"plt",
"pl",
"pt",
"prs",
"quy",
"ro",
"rn",
"ru",
"sg",
"sa",
"sat",
"scn",
"shn",
"si",
"sk",
"sl",
"sm",
"sn",
"sd",
"so",
"st",
"es",
"sc",
"sr",
"ss",
"su",
"sv",
"swh",
"szl",
"ta",
"taq",
"tt",
"te",
"tg",
"tl",
"th",
"ti",
"tpi",
"tn",
"ts",
"tk",
"tum",
"tr",
"tw",
"tzm",
"ug",
"uk",
"umb",
"ur",
"uzn",
"vec",
"vi",
"war",
"wo",
"xh",
"ydd",
"yo",
"yue",
"zh",
"zsm",
"zu",
"dataset:flores-200",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"has_space",
"region:us"
] | translation | facebook | null | null | facebook/nllb-200-distilled-600M | 211 | 85,233 | transformers | 2022-07-08T09:43:57 | ---
language:
- ace
- acm
- acq
- aeb
- af
- ajp
- ak
- als
- am
- apc
- ar
- ars
- ary
- arz
- as
- ast
- awa
- ayr
- azb
- azj
- ba
- bm
- ban
- be
- bem
- bn
- bho
- bjn
- bo
- bs
- bug
- bg
- ca
- ceb
- cs
- cjk
- ckb
- crh
- cy
- da
- de
- dik
- dyu
- dz
- el
- en
- eo
- et
- eu
- ee
- fo
- fj
- fi
- fon
- fr
- fur
- fuv
- gaz
- gd
- ga
- gl
- gn
- gu
- ht
- ha
- he
- hi
- hne
- hr
- hu
- hy
- ig
- ilo
- id
- is
- it
- jv
- ja
- kab
- kac
- kam
- kn
- ks
- ka
- kk
- kbp
- kea
- khk
- km
- ki
- rw
- ky
- kmb
- kmr
- knc
- kg
- ko
- lo
- lij
- li
- ln
- lt
- lmo
- ltg
- lb
- lua
- lg
- luo
- lus
- lvs
- mag
- mai
- ml
- mar
- min
- mk
- mt
- mni
- mos
- mi
- my
- nl
- nn
- nb
- npi
- nso
- nus
- ny
- oc
- ory
- pag
- pa
- pap
- pbt
- pes
- plt
- pl
- pt
- prs
- quy
- ro
- rn
- ru
- sg
- sa
- sat
- scn
- shn
- si
- sk
- sl
- sm
- sn
- sd
- so
- st
- es
- sc
- sr
- ss
- su
- sv
- swh
- szl
- ta
- taq
- tt
- te
- tg
- tl
- th
- ti
- tpi
- tn
- ts
- tk
- tum
- tr
- tw
- tzm
- ug
- uk
- umb
- ur
- uzn
- vec
- vi
- war
- wo
- xh
- ydd
- yo
- yue
- zh
- zsm
- zu
language_details: "ace_Arab, ace_Latn, acm_Arab, acq_Arab, aeb_Arab, afr_Latn, ajp_Arab, aka_Latn, amh_Ethi, apc_Arab, arb_Arab, ars_Arab, ary_Arab, arz_Arab, asm_Beng, ast_Latn, awa_Deva, ayr_Latn, azb_Arab, azj_Latn, bak_Cyrl, bam_Latn, ban_Latn,bel_Cyrl, bem_Latn, ben_Beng, bho_Deva, bjn_Arab, bjn_Latn, bod_Tibt, bos_Latn, bug_Latn, bul_Cyrl, cat_Latn, ceb_Latn, ces_Latn, cjk_Latn, ckb_Arab, crh_Latn, cym_Latn, dan_Latn, deu_Latn, dik_Latn, dyu_Latn, dzo_Tibt, ell_Grek, eng_Latn, epo_Latn, est_Latn, eus_Latn, ewe_Latn, fao_Latn, pes_Arab, fij_Latn, fin_Latn, fon_Latn, fra_Latn, fur_Latn, fuv_Latn, gla_Latn, gle_Latn, glg_Latn, grn_Latn, guj_Gujr, hat_Latn, hau_Latn, heb_Hebr, hin_Deva, hne_Deva, hrv_Latn, hun_Latn, hye_Armn, ibo_Latn, ilo_Latn, ind_Latn, isl_Latn, ita_Latn, jav_Latn, jpn_Jpan, kab_Latn, kac_Latn, kam_Latn, kan_Knda, kas_Arab, kas_Deva, kat_Geor, knc_Arab, knc_Latn, kaz_Cyrl, kbp_Latn, kea_Latn, khm_Khmr, kik_Latn, kin_Latn, kir_Cyrl, kmb_Latn, kon_Latn, kor_Hang, kmr_Latn, lao_Laoo, lvs_Latn, lij_Latn, lim_Latn, lin_Latn, lit_Latn, lmo_Latn, ltg_Latn, ltz_Latn, lua_Latn, lug_Latn, luo_Latn, lus_Latn, mag_Deva, mai_Deva, mal_Mlym, mar_Deva, min_Latn, mkd_Cyrl, plt_Latn, mlt_Latn, mni_Beng, khk_Cyrl, mos_Latn, mri_Latn, zsm_Latn, mya_Mymr, nld_Latn, nno_Latn, nob_Latn, npi_Deva, nso_Latn, nus_Latn, nya_Latn, oci_Latn, gaz_Latn, ory_Orya, pag_Latn, pan_Guru, pap_Latn, pol_Latn, por_Latn, prs_Arab, pbt_Arab, quy_Latn, ron_Latn, run_Latn, rus_Cyrl, sag_Latn, san_Deva, sat_Beng, scn_Latn, shn_Mymr, sin_Sinh, slk_Latn, slv_Latn, smo_Latn, sna_Latn, snd_Arab, som_Latn, sot_Latn, spa_Latn, als_Latn, srd_Latn, srp_Cyrl, ssw_Latn, sun_Latn, swe_Latn, swh_Latn, szl_Latn, tam_Taml, tat_Cyrl, tel_Telu, tgk_Cyrl, tgl_Latn, tha_Thai, tir_Ethi, taq_Latn, taq_Tfng, tpi_Latn, tsn_Latn, tso_Latn, tuk_Latn, tum_Latn, tur_Latn, twi_Latn, tzm_Tfng, uig_Arab, ukr_Cyrl, umb_Latn, urd_Arab, uzn_Latn, vec_Latn, vie_Latn, war_Latn, wol_Latn, xho_Latn, ydd_Hebr, yor_Latn, 
yue_Hant, zho_Hans, zho_Hant, zul_Latn"
tags:
- nllb
- translation
license: "cc-by-nc-4.0"
datasets:
- flores-200
metrics:
- bleu
- spbleu
- chrf++
inference: false
---
# NLLB-200
This is the model card of NLLB-200's distilled 600M variant.
Here are the [metrics](https://tinyurl.com/nllb200densedst600mmetrics) for that particular checkpoint.
- Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: the exact training algorithm, data, and strategies used to handle data imbalances for high- and low-resource languages when training NLLB-200 are described in the paper.
- Paper or other resource for more information: NLLB Team et al., No Language Left Behind: Scaling Human-Centered Machine Translation, arXiv, 2022
- License: CC-BY-NC
- Where to send questions or comments about the model: https://github.com/facebookresearch/fairseq/issues
## Intended Use
- Primary intended uses: NLLB-200 is a machine translation model primarily intended for research in machine translation, especially for low-resource languages. It allows for single-sentence translation among 200 languages. Information on how to use the model can be found in the Fairseq code repository along with the training code and references to evaluation and training data.
- Primary intended users: researchers and the machine translation research community.
- Out-of-scope use cases: NLLB-200 is a research model and is not released for production deployment. NLLB-200 is trained on general-domain text data and is not intended to be used with domain-specific texts, such as medical or legal documents. The model is not intended to be used for document translation. The model was trained with input lengths not exceeding 512 tokens, so translating longer sequences might result in quality degradation. NLLB-200 translations cannot be used as certified translations.
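For research use, a minimal translation sketch with the Hugging Face `transformers` API (the target language is selected by forcing its FLORES-200 code, e.g. `fra_Latn`, as the first generated token; treat this as a hedged sketch rather than the officially documented usage):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the distilled 600M checkpoint; the source language is given as a
# FLORES-200 code (see language_details above).
tokenizer = AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-600M", src_lang="eng_Latn"
)
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/nllb-200-distilled-600M")

inputs = tokenizer("The weather is nice today.", return_tensors="pt")
# Force the target-language code token as the first generated token.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
    max_length=64,
)
out = tokenizer.decode(generated[0], skip_special_tokens=True)
print(out)
```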
## Metrics
- Model performance measures: NLLB-200 was evaluated using the BLEU, spBLEU, and chrF++ metrics, which are widely adopted by the machine translation community. Additionally, we performed human evaluation with the XSTS protocol and measured the toxicity of the generated translations.
## Evaluation Data
- Datasets: the Flores-200 dataset is described in Section 4 of the paper
- Motivation: We used Flores-200 as it provides full evaluation coverage of the languages in NLLB-200
- Preprocessing: Sentence-split raw text data was preprocessed using SentencePiece. The SentencePiece model is released along with NLLB-200.
## Training Data
- We used parallel multilingual data from a variety of sources to train the model. We provide a detailed report on the data selection and construction process in Section 5 of the paper. We also used monolingual data constructed from Common Crawl; more details are provided in Section 5.2.
## Ethical Considerations
- In this work, we took a reflexive approach in technological development to ensure that we prioritize human users and minimize risks that could be transferred to them. While we reflect on our ethical considerations throughout the article, here are some additional points to highlight. For one, many languages chosen for this study are low-resource languages, with a heavy emphasis on African languages. While quality translation could improve education and information access in many of these communities, such access could also make groups with lower levels of digital literacy more vulnerable to misinformation or online scams. The latter scenarios could arise if bad actors misappropriate our work for nefarious activities, which we conceive of as an example of unintended use. Regarding data acquisition, the training data used for model development were mined from various publicly available sources on the web. Although we invested heavily in data cleaning, personally identifiable information may not be entirely eliminated. Finally, although we did our best to optimize for translation quality, mistranslations produced by the model could remain. Although the odds are low, this could have an adverse impact on those who rely on these translations to make important decisions (particularly when related to health and safety).
## Caveats and Recommendations
- Our model has been tested on the Wikimedia domain, with limited investigation into other domains supported in NLLB-MD. In addition, the supported languages may have variations that our model does not capture. Users should make appropriate assessments.
## Carbon Footprint Details
- The carbon dioxide (CO2e) estimate is reported in Section 8.8 of the paper.
[
-0.028961181640625,
-0.0413818359375,
0.0241546630859375,
0.0183868408203125,
-0.01174163818359375,
-0.0098876953125,
-0.01033782958984375,
-0.049041748046875,
-0.00518035888671875,
0.05450439453125,
-0.035369873046875,
-0.0206298828125,
-0.051544189453125,
0.0294189453125,
-0.045928955078125,
0.10186767578125,
-0.0003113746643066406,
0.02471923828125,
-0.0034427642822265625,
-0.031707763671875,
-0.027557373046875,
-0.048248291015625,
-0.056488037109375,
-0.0143585205078125,
0.055328369140625,
0.019775390625,
0.055328369140625,
0.046478271484375,
0.035675048828125,
0.013153076171875,
-0.0284576416015625,
-0.0019588470458984375,
-0.05499267578125,
-0.036590576171875,
-0.0201873779296875,
-0.02679443359375,
-0.060821533203125,
0.0163421630859375,
0.039459228515625,
0.07354736328125,
-0.01507568359375,
0.042205810546875,
0.010345458984375,
0.0526123046875,
-0.022064208984375,
-0.0139312744140625,
-0.040130615234375,
0.00832366943359375,
-0.025115966796875,
-0.008697509765625,
-0.05218505859375,
-0.0216522216796875,
-0.00319671630859375,
-0.049957275390625,
0.003299713134765625,
0.023406982421875,
0.0692138671875,
0.0123291015625,
-0.043670654296875,
-0.0300140380859375,
-0.041259765625,
0.0794677734375,
-0.07403564453125,
0.023712158203125,
0.042999267578125,
0.005214691162109375,
-0.004520416259765625,
-0.045501708984375,
-0.05029296875,
-0.0090484619140625,
-0.00376129150390625,
0.00928497314453125,
-0.01331329345703125,
-0.0012149810791015625,
0.04461669921875,
0.0290679931640625,
-0.050262451171875,
0.00650787353515625,
-0.051544189453125,
-0.01316070556640625,
0.053131103515625,
0.016510009765625,
0.017974853515625,
-0.03424072265625,
-0.035247802734375,
-0.00592041015625,
-0.04864501953125,
-0.0014772415161132812,
0.053802490234375,
0.03326416015625,
-0.0220947265625,
0.0469970703125,
-0.0118865966796875,
0.05682373046875,
-0.00403594970703125,
-0.0170745849609375,
0.0430908203125,
-0.05169677734375,
-0.005596160888671875,
-0.00647735595703125,
0.060546875,
0.0372314453125,
0.0208892822265625,
-0.015838623046875,
-0.016998291015625,
-0.019744873046875,
0.041717529296875,
-0.061798095703125,
0.0158538818359375,
0.02301025390625,
-0.05029296875,
-0.034912109375,
-0.007110595703125,
-0.056732177734375,
-0.0137939453125,
-0.0306396484375,
0.024566650390625,
-0.026519775390625,
-0.0156707763671875,
0.0074462890625,
0.0002148151397705078,
0.00579071044921875,
0.01329803466796875,
-0.04962158203125,
0.01470947265625,
0.02459716796875,
0.0555419921875,
-0.0179595947265625,
-0.025543212890625,
-0.0175018310546875,
0.00032210350036621094,
-0.0222320556640625,
0.021759033203125,
-0.004253387451171875,
-0.02728271484375,
-0.0031490325927734375,
0.01458740234375,
0.00733184814453125,
-0.04022216796875,
0.06182861328125,
-0.03948974609375,
0.0247955322265625,
-0.03936767578125,
-0.040771484375,
-0.0217132568359375,
0.016448974609375,
-0.0682373046875,
0.0843505859375,
0.01232147216796875,
-0.0751953125,
0.0301666259765625,
-0.05450439453125,
-0.035888671875,
0.0166015625,
0.0124664306640625,
-0.034454345703125,
0.005756378173828125,
-0.004268646240234375,
0.0098114013671875,
-0.019073486328125,
0.040802001953125,
-0.0220947265625,
-0.0321044921875,
0.026763916015625,
-0.04913330078125,
0.09991455078125,
0.0404052734375,
-0.0232086181640625,
-0.024383544921875,
-0.05523681640625,
0.006298065185546875,
0.01934814453125,
-0.04791259765625,
-0.01122283935546875,
-0.0132598876953125,
0.031158447265625,
0.0282745361328125,
0.0189361572265625,
-0.042205810546875,
0.0122833251953125,
-0.0234832763671875,
0.01372528076171875,
0.04034423828125,
0.0007276535034179688,
0.037567138671875,
-0.0279998779296875,
0.0474853515625,
-0.005214691162109375,
0.038848876953125,
0.0049591064453125,
-0.036285400390625,
-0.0628662109375,
0.01513671875,
0.039337158203125,
0.0450439453125,
-0.054290771484375,
0.0413818359375,
-0.0233001708984375,
-0.0310211181640625,
-0.057952880859375,
0.01451873779296875,
0.03106689453125,
0.039794921875,
0.040618896484375,
-0.0241546630859375,
-0.034637451171875,
-0.05682373046875,
-0.01690673828125,
-0.0005340576171875,
0.01194000244140625,
0.007419586181640625,
0.04888916015625,
-0.03289794921875,
0.0611572265625,
-0.0160980224609375,
-0.016845703125,
-0.028289794921875,
0.00839996337890625,
0.0208892822265625,
0.041473388671875,
0.045623779296875,
-0.07525634765625,
-0.0389404296875,
-0.0033397674560546875,
-0.07354736328125,
-0.01284027099609375,
-0.01654052734375,
-0.01192474365234375,
0.0301971435546875,
0.03228759765625,
-0.0243682861328125,
0.0390625,
0.06072998046875,
-0.0153961181640625,
0.03741455078125,
-0.0185546875,
0.0080718994140625,
-0.09088134765625,
0.043365478515625,
-0.0109100341796875,
-0.015655517578125,
-0.06072998046875,
0.0088958740234375,
0.006153106689453125,
-0.012237548828125,
-0.04583740234375,
0.059051513671875,
-0.02484130859375,
0.00788116455078125,
-0.0273284912109375,
0.005489349365234375,
0.0196990966796875,
0.03741455078125,
-0.01373291015625,
0.051361083984375,
0.00464630126953125,
-0.038177490234375,
0.003192901611328125,
0.023681640625,
-0.0234832763671875,
0.0604248046875,
-0.050567626953125,
0.0023975372314453125,
-0.00859832763671875,
0.01483917236328125,
-0.037261962890625,
-0.0035457611083984375,
0.024658203125,
-0.04351806640625,
0.0166015625,
0.003002166748046875,
-0.05999755859375,
-0.0284423828125,
0.0013513565063476562,
0.03082275390625,
0.0271148681640625,
-0.0192413330078125,
0.021453857421875,
0.0341796875,
-0.01129913330078125,
-0.053253173828125,
-0.08135986328125,
0.01346588134765625,
-0.0228729248046875,
-0.034149169921875,
0.01175689697265625,
-0.0211639404296875,
-0.01230621337890625,
0.004703521728515625,
0.0048828125,
-0.01297760009765625,
0.0201416015625,
0.00978851318359375,
0.01424407958984375,
0.01129913330078125,
0.00820159912109375,
0.004199981689453125,
-0.0098114013671875,
-0.006366729736328125,
-0.0193328857421875,
0.046417236328125,
-0.00390625,
-0.01099395751953125,
-0.033966064453125,
0.044677734375,
0.0313720703125,
-0.0102691650390625,
0.083984375,
0.050872802734375,
-0.038543701171875,
0.0189056396484375,
-0.042999267578125,
-0.0019474029541015625,
-0.0347900390625,
0.036956787109375,
-0.00014472007751464844,
-0.041534423828125,
0.032562255859375,
0.00894927978515625,
0.016143798828125,
0.0413818359375,
0.039337158203125,
-0.033294677734375,
0.06756591796875,
0.055999755859375,
-0.00681304931640625,
0.0312042236328125,
-0.02777099609375,
0.0187530517578125,
-0.06536865234375,
-0.018524169921875,
-0.042205810546875,
-0.02044677734375,
-0.052520751953125,
-0.030853271484375,
0.0204010009765625,
0.02337646484375,
-0.00809478759765625,
0.0489501953125,
-0.0186309814453125,
0.016998291015625,
0.0298919677734375,
-0.0028820037841796875,
0.0401611328125,
-0.0027675628662109375,
-0.01291656494140625,
-0.0125274658203125,
-0.06011962890625,
-0.0655517578125,
0.08441162109375,
0.034027099609375,
0.042205810546875,
-0.0082244873046875,
0.0562744140625,
0.03326416015625,
0.034088134765625,
-0.03997802734375,
0.033355712890625,
-0.01473236083984375,
-0.0927734375,
-0.019195556640625,
-0.052337646484375,
-0.08099365234375,
0.0124664306640625,
-0.01007080078125,
-0.033294677734375,
0.01129913330078125,
0.00811767578125,
-0.0105438232421875,
0.019378662109375,
-0.057586669921875,
0.08355712890625,
-0.0450439453125,
-0.00969696044921875,
-0.0207977294921875,
-0.05462646484375,
0.0029201507568359375,
-0.0305023193359375,
0.040802001953125,
-0.003795623779296875,
0.007503509521484375,
0.06439208984375,
-0.0263519287109375,
0.0643310546875,
-0.01629638671875,
-0.0129241943359375,
0.01959228515625,
-0.01230621337890625,
0.02813720703125,
-0.0078277587890625,
-0.02587890625,
0.041259765625,
0.002658843994140625,
-0.053802490234375,
-0.00701141357421875,
0.032684326171875,
-0.057830810546875,
-0.01332855224609375,
-0.030914306640625,
-0.0545654296875,
0.00022876262664794922,
0.04302978515625,
0.044219970703125,
0.0227203369140625,
-0.0171661376953125,
0.0206146240234375,
0.05023193359375,
-0.046051025390625,
0.0227203369140625,
0.048675537109375,
-0.016387939453125,
-0.0244598388671875,
0.06689453125,
0.0290679931640625,
0.049224853515625,
0.0103302001953125,
0.002040863037109375,
-0.01458740234375,
-0.041534423828125,
-0.043731689453125,
0.0219268798828125,
-0.0635986328125,
-0.0118255615234375,
-0.05352783203125,
-0.00878143310546875,
-0.02532958984375,
-0.01082611083984375,
-0.0304412841796875,
-0.0227203369140625,
-0.0257720947265625,
-0.00933074951171875,
0.005374908447265625,
0.048614501953125,
0.00797271728515625,
0.02923583984375,
-0.05511474609375,
0.016204833984375,
-0.0195770263671875,
0.0151824951171875,
0.0025787353515625,
-0.059661865234375,
-0.0423583984375,
0.0279083251953125,
-0.029693603515625,
-0.057159423828125,
0.0267333984375,
-0.00930023193359375,
0.053680419921875,
0.0108795166015625,
0.006526947021484375,
0.04974365234375,
-0.0386962890625,
0.051361083984375,
0.01422882080078125,
-0.07025146484375,
0.0243988037109375,
-0.026092529296875,
0.037628173828125,
0.0738525390625,
0.053802490234375,
-0.0673828125,
-0.03546142578125,
-0.053253173828125,
-0.0758056640625,
0.04840087890625,
0.0216827392578125,
0.020416259765625,
0.0026645660400390625,
0.025421142578125,
0.01290130615234375,
0.0241851806640625,
-0.10101318359375,
-0.0172119140625,
-0.0077056884765625,
-0.0159759521484375,
0.005908966064453125,
-0.00933837890625,
-0.0066986083984375,
-0.0166473388671875,
0.0577392578125,
0.0006532669067382812,
0.011505126953125,
-0.0013303756713867188,
-0.035980224609375,
-0.00864410400390625,
0.0186309814453125,
0.0282745361328125,
0.04522705078125,
-0.005344390869140625,
-0.0187835693359375,
0.031494140625,
-0.043975830078125,
0.00955963134765625,
0.004589080810546875,
-0.035003662109375,
-0.00417327880859375,
0.0262298583984375,
0.051361083984375,
0.0006885528564453125,
-0.049407958984375,
0.038238525390625,
0.003437042236328125,
-0.010772705078125,
-0.0290679931640625,
-0.023193359375,
0.016815185546875,
0.00258636474609375,
0.0278778076171875,
0.0185546875,
0.01788330078125,
-0.038726806640625,
0.00943756103515625,
0.01678466796875,
-0.0274505615234375,
-0.022674560546875,
0.052490234375,
0.0273284912109375,
-0.01168060302734375,
0.05767822265625,
-0.033660888671875,
-0.022064208984375,
0.035064697265625,
0.0219268798828125,
0.04095458984375,
-0.0071563720703125,
0.01690673828125,
0.04852294921875,
0.05108642578125,
-0.012664794921875,
0.01308441162109375,
0.009765625,
-0.04205322265625,
-0.03656005859375,
-0.0615234375,
-0.01263427734375,
0.00490570068359375,
-0.07366943359375,
0.024169921875,
-0.0146026611328125,
-0.0301513671875,
-0.016021728515625,
0.0135345458984375,
-0.059234619140625,
0.0144195556640625,
0.01331329345703125,
0.07525634765625,
-0.07366943359375,
0.08001708984375,
0.0200653076171875,
-0.0582275390625,
-0.046478271484375,
0.00788116455078125,
-0.0029163360595703125,
-0.043243408203125,
0.037200927734375,
0.0193023681640625,
0.01491546630859375,
-0.007244110107421875,
-0.0384521484375,
-0.05755615234375,
0.08233642578125,
0.035064697265625,
-0.050994873046875,
-0.0159454345703125,
0.04071044921875,
0.053558349609375,
-0.00171661376953125,
-0.0033588409423828125,
0.0228729248046875,
0.040618896484375,
-0.00872802734375,
-0.077392578125,
0.0053863525390625,
-0.0119781494140625,
-0.00021576881408691406,
0.0017442703247070312,
-0.04534912109375,
0.05426025390625,
-0.012054443359375,
-0.016448974609375,
0.02264404296875,
0.03192138671875,
0.0069732666015625,
0.0157623291015625,
0.02978515625,
0.05133056640625,
0.05767822265625,
-0.0014772415161132812,
0.09619140625,
-0.01250457763671875,
0.041259765625,
0.08074951171875,
-0.0199432373046875,
0.055389404296875,
0.050537109375,
-0.0131988525390625,
0.01407623291015625,
0.033905029296875,
-0.0095062255859375,
0.036468505859375,
0.01068878173828125,
0.00795745849609375,
0.00804901123046875,
-0.025146484375,
-0.0279693603515625,
0.017791748046875,
0.00997161865234375,
-0.037200927734375,
-0.0034580230712890625,
0.01503753662109375,
0.027587890625,
-0.00269317626953125,
-0.007724761962890625,
0.04742431640625,
0.0167236328125,
-0.051544189453125,
0.046661376953125,
0.0196990966796875,
0.0535888671875,
-0.0452880859375,
0.01522064208984375,
-0.0282440185546875,
0.01345062255859375,
-0.0184326171875,
-0.048797607421875,
0.046661376953125,
0.0193023681640625,
-0.0190582275390625,
-0.042022705078125,
0.0196990966796875,
-0.029693603515625,
-0.054168701171875,
0.03985595703125,
0.0280914306640625,
0.01415252685546875,
0.0026416778564453125,
-0.0701904296875,
0.018463134765625,
0.01092529296875,
-0.0164947509765625,
0.03009033203125,
0.02642822265625,
-0.00940704345703125,
0.039947509765625,
0.044219970703125,
0.01467132568359375,
0.00870513916015625,
0.006786346435546875,
0.05029296875,
-0.04437255859375,
-0.0172576904296875,
-0.039520263671875,
0.04833984375,
-0.01763916015625,
-0.0298919677734375,
0.07110595703125,
0.054656982421875,
0.09649658203125,
0.00385284423828125,
0.054412841796875,
-0.0211029052734375,
0.034088134765625,
-0.024749755859375,
0.06781005859375,
-0.057159423828125,
0.005645751953125,
-0.0274658203125,
-0.06781005859375,
-0.006488800048828125,
0.0413818359375,
-0.0119781494140625,
0.0123748779296875,
0.052459716796875,
0.04779052734375,
0.014892578125,
-0.0094451904296875,
0.02008056640625,
0.006488800048828125,
0.02252197265625,
0.0224761962890625,
0.0367431640625,
-0.0635986328125,
0.058349609375,
-0.0178070068359375,
-0.01067352294921875,
-0.0183563232421875,
-0.06768798828125,
-0.056304931640625,
-0.043182373046875,
-0.0189666748046875,
-0.0287322998046875,
-0.01108551025390625,
0.055328369140625,
0.044952392578125,
-0.055816650390625,
-0.034454345703125,
0.0059967041015625,
-0.0200042724609375,
-0.02398681640625,
-0.0174102783203125,
-0.005786895751953125,
-0.0122833251953125,
-0.062347412109375,
0.0093994140625,
0.003917694091796875,
0.006443023681640625,
-0.030517578125,
-0.02630615234375,
-0.03936767578125,
0.0004467964172363281,
0.039398193359375,
0.012908935546875,
-0.04437255859375,
-0.0033321380615234375,
0.0152740478515625,
-0.038970947265625,
-0.00498199462890625,
0.038482666015625,
-0.0167694091796875,
0.037933349609375,
0.0271148681640625,
0.040679931640625,
0.0428466796875,
-0.0106048583984375,
0.035491943359375,
-0.05755615234375,
0.0262298583984375,
0.027252197265625,
0.031768798828125,
0.03253173828125,
-0.03155517578125,
0.044647216796875,
0.017059326171875,
-0.040557861328125,
-0.072509765625,
0.003307342529296875,
-0.0755615234375,
-0.01959228515625,
0.0972900390625,
-0.01219940185546875,
-0.005046844482421875,
-0.01352691650390625,
-0.01393890380859375,
0.0272216796875,
-0.0094757080078125,
0.051910400390625,
0.07220458984375,
0.024169921875,
0.005321502685546875,
-0.08343505859375,
0.0195159912109375,
0.02911376953125,
-0.0670166015625,
-0.0034885406494140625,
0.01629638671875,
0.0295562744140625,
0.0190582275390625,
0.051239013671875,
-0.0380859375,
0.0178680419921875,
-0.00650787353515625,
0.024658203125,
0.0172882080078125,
-0.0149993896484375,
-0.0249481201171875,
-0.0128173828125,
0.01074981689453125,
0.01511383056640625
]
] |
ckiplab/bert-base-chinese-pos | 2022-05-10T03:28:12.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"token-classification",
"zh",
"license:gpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | ckiplab | null | null | ckiplab/bert-base-chinese-pos | 13 | 84,909 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- token-classification
- bert
- zh
license: gpl-3.0
---
# CKIP BERT Base Chinese
This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).
這個專案提供了繁體中文的 transformers 模型(包含 ALBERT、BERT、GPT2)及自然語言處理工具(包含斷詞、詞性標記、實體辨識)。
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use BertTokenizerFast as tokenizer instead of AutoTokenizer.
請使用 BertTokenizerFast 而非 AutoTokenizer。
```python
from transformers import (
BertTokenizerFast,
AutoModel,
)
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/bert-base-chinese-pos')
```
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.
有關完整使用方法及其他資訊,請參見 https://github.com/ckiplab/ckip-transformers 。
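Once the model produces per-token classification logits, POS decoding is just an argmax over the label axis. Below is a minimal, self-contained sketch with dummy logits; the label subset shown is hypothetical, and the real tag mapping lives in the model's `config.id2label`:

```python
# Sketch: mapping per-token classification logits to POS tags.
# The labels below are an illustrative (hypothetical) subset of CKIP POS
# tags; the actual model defines its own id2label mapping in its config.
id2label = {0: "Na", 1: "VC", 2: "DE", 3: "Nb"}

def decode_pos(logits, tokens):
    """Pick the highest-scoring label per token (argmax over the label axis)."""
    tags = []
    for token, scores in zip(tokens, logits):
        best = max(range(len(scores)), key=lambda i: scores[i])
        tags.append((token, id2label[best]))
    return tags

tokens = ["我", "喜歡", "的", "音樂"]
logits = [
    [0.1, 0.2, 0.0, 2.5],  # strongest score: label 3 (Nb)
    [0.3, 3.1, 0.2, 0.1],  # strongest score: label 1 (VC)
    [0.0, 0.1, 2.8, 0.2],  # strongest score: label 2 (DE)
    [2.9, 0.1, 0.0, 0.3],  # strongest score: label 0 (Na)
]
print(decode_pos(logits, tokens))
# → [('我', 'Nb'), ('喜歡', 'VC'), ('的', 'DE'), ('音樂', 'Na')]
```

In practice the same argmax-and-label-lookup step is what `transformers`' token-classification pipeline performs after the forward pass, with logits coming from the model rather than hand-written values.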
| 1,123 | [
[
-0.0218505859375,
-0.0258941650390625,
0.0024929046630859375,
0.05645751953125,
-0.0298919677734375,
0.0029544830322265625,
-0.01396942138671875,
-0.0181732177734375,
-0.0030269622802734375,
0.032958984375,
-0.026214599609375,
-0.0225372314453125,
-0.04364013671875,
0.002162933349609375,
-0.0177154541015625,
0.0633544921875,
-0.0143280029296875,
0.0258331298828125,
0.032470703125,
0.01029205322265625,
-0.01873779296875,
-0.0338134765625,
-0.052001953125,
-0.043365478515625,
-0.0035114288330078125,
0.01947021484375,
0.049652099609375,
0.0286102294921875,
0.036651611328125,
0.0227203369140625,
0.0026988983154296875,
-0.007701873779296875,
-0.01313018798828125,
-0.0211181640625,
-0.0007352828979492188,
-0.038970947265625,
-0.027679443359375,
-0.014801025390625,
0.050201416015625,
0.035308837890625,
0.0016613006591796875,
-0.002521514892578125,
0.01409912109375,
0.02630615234375,
-0.0255889892578125,
0.03167724609375,
-0.044097900390625,
0.022796630859375,
-0.01181793212890625,
-0.005985260009765625,
-0.027587890625,
-0.0188140869140625,
0.0136260986328125,
-0.046295166015625,
0.025054931640625,
-0.01210784912109375,
0.09783935546875,
0.002567291259765625,
-0.022003173828125,
-0.02001953125,
-0.050537109375,
0.07733154296875,
-0.06390380859375,
0.032196044921875,
0.02618408203125,
0.0210723876953125,
-0.004192352294921875,
-0.0787353515625,
-0.048004150390625,
-0.0138397216796875,
-0.0160980224609375,
0.0244598388671875,
0.010467529296875,
-0.002376556396484375,
0.026763916015625,
0.0224151611328125,
-0.04327392578125,
0.01430511474609375,
-0.0275421142578125,
-0.030975341796875,
0.039276123046875,
-0.00687408447265625,
0.03466796875,
-0.03302001953125,
-0.04052734375,
-0.0256805419921875,
-0.045745849609375,
0.0176849365234375,
0.019866943359375,
0.009307861328125,
-0.03375244140625,
0.04254150390625,
-0.0002803802490234375,
0.021392822265625,
0.01447296142578125,
-0.005695343017578125,
0.032928466796875,
-0.021514892578125,
-0.005840301513671875,
-0.0098876953125,
0.06591796875,
0.0158233642578125,
0.007625579833984375,
0.005649566650390625,
-0.02374267578125,
-0.0255889892578125,
-0.0169219970703125,
-0.05511474609375,
-0.050811767578125,
0.01561737060546875,
-0.056671142578125,
-0.01494598388671875,
0.01248931884765625,
-0.04498291015625,
0.0206756591796875,
-0.0173797607421875,
0.030609130859375,
-0.052703857421875,
-0.044647216796875,
0.00003927946090698242,
-0.02935791015625,
0.0614013671875,
0.010498046875,
-0.0885009765625,
0.0019741058349609375,
0.044403076171875,
0.05352783203125,
0.00994110107421875,
-0.01226806640625,
0.0102081298828125,
0.0273895263671875,
-0.01509857177734375,
0.040130615234375,
-0.00836944580078125,
-0.053070068359375,
0.01032257080078125,
0.006134033203125,
0.0013427734375,
-0.0318603515625,
0.06060791015625,
-0.02252197265625,
0.0290985107421875,
-0.01690673828125,
-0.0216064453125,
-0.004505157470703125,
0.006866455078125,
-0.03814697265625,
0.088134765625,
0.0165863037109375,
-0.062347412109375,
0.0172576904296875,
-0.0645751953125,
-0.04345703125,
0.0240631103515625,
-0.00787353515625,
-0.0310211181640625,
-0.012115478515625,
0.0176849365234375,
0.023223876953125,
-0.00421142578125,
0.0162811279296875,
-0.0009546279907226562,
-0.016357421875,
0.0005745887756347656,
-0.0310211181640625,
0.09906005859375,
0.0250244140625,
-0.02349853515625,
0.0127410888671875,
-0.048553466796875,
0.00894927978515625,
0.02313232421875,
-0.01904296875,
-0.0175323486328125,
0.01554107666015625,
0.04229736328125,
0.0114898681640625,
0.04083251953125,
-0.04327392578125,
0.03607177734375,
-0.041351318359375,
0.052886962890625,
0.060791015625,
-0.0234527587890625,
0.02081298828125,
-0.01107025146484375,
-0.0007052421569824219,
0.00460052490234375,
0.0275421142578125,
-0.010040283203125,
-0.03863525390625,
-0.0814208984375,
-0.0258941650390625,
0.032623291015625,
0.057586669921875,
-0.08172607421875,
0.0672607421875,
-0.0175628662109375,
-0.045684814453125,
-0.0237274169921875,
-0.005519866943359375,
0.0016298294067382812,
0.013641357421875,
0.039825439453125,
-0.0226898193359375,
-0.042510986328125,
-0.0745849609375,
0.0088043212890625,
-0.042022705078125,
-0.042083740234375,
-0.00036263465881347656,
0.04010009765625,
-0.031402587890625,
0.073486328125,
-0.038482666015625,
-0.0214080810546875,
-0.023101806640625,
0.040435791015625,
0.0263519287109375,
0.066162109375,
0.046966552734375,
-0.07476806640625,
-0.05218505859375,
-0.0162811279296875,
-0.02447509765625,
-0.005218505859375,
-0.0166168212890625,
-0.01064300537109375,
0.0036830902099609375,
0.003787994384765625,
-0.044830322265625,
0.01477813720703125,
0.0277862548828125,
-0.0004405975341796875,
0.06378173828125,
-0.0040435791015625,
-0.0207061767578125,
-0.09637451171875,
0.01342010498046875,
-0.0149688720703125,
-0.0024871826171875,
-0.0308837890625,
-0.0002925395965576172,
0.0138397216796875,
-0.00737762451171875,
-0.04010009765625,
0.04083251953125,
-0.025299072265625,
0.022735595703125,
-0.019012451171875,
-0.01197052001953125,
-0.01470184326171875,
0.044403076171875,
0.030609130859375,
0.05145263671875,
0.044647216796875,
-0.051849365234375,
0.0311431884765625,
0.05010986328125,
-0.01953125,
-0.006343841552734375,
-0.06878662109375,
-0.0023441314697265625,
0.023529052734375,
0.01296234130859375,
-0.0718994140625,
-0.003353118896484375,
0.045867919921875,
-0.056060791015625,
0.04412841796875,
0.0048675537109375,
-0.06829833984375,
-0.033599853515625,
-0.03302001953125,
0.024871826171875,
0.0506591796875,
-0.045562744140625,
0.0374755859375,
0.019439697265625,
-0.0161285400390625,
-0.043548583984375,
-0.058685302734375,
-0.002109527587890625,
0.0211944580078125,
-0.042022705078125,
0.047332763671875,
-0.0159912109375,
0.024749755859375,
-0.0007872581481933594,
0.0072174072265625,
-0.03497314453125,
-0.0060882568359375,
-0.010955810546875,
0.0292816162109375,
-0.01081085205078125,
-0.0009713172912597656,
0.0142822265625,
-0.0235137939453125,
0.01044464111328125,
-0.0011053085327148438,
0.05364990234375,
0.0034046173095703125,
-0.0244903564453125,
-0.042022705078125,
0.0198974609375,
0.014434814453125,
-0.0189056396484375,
0.0223236083984375,
0.07568359375,
-0.0190582275390625,
-0.01296234130859375,
-0.0311431884765625,
-0.01094818115234375,
-0.040252685546875,
0.044342041015625,
-0.033843994140625,
-0.058807373046875,
0.0243377685546875,
-0.0082855224609375,
0.01448822021484375,
0.0557861328125,
0.046539306640625,
-0.001079559326171875,
0.09039306640625,
0.06689453125,
-0.039825439453125,
0.03314208984375,
-0.0298309326171875,
0.02825927734375,
-0.06536865234375,
0.0175018310546875,
-0.047027587890625,
0.007457733154296875,
-0.060089111328125,
-0.0227203369140625,
-0.0013704299926757812,
0.01270294189453125,
-0.0197296142578125,
0.0526123046875,
-0.05877685546875,
-0.0034236907958984375,
0.05804443359375,
-0.0222625732421875,
-0.007373809814453125,
-0.006771087646484375,
-0.0208740234375,
-0.001552581787109375,
-0.04364013671875,
-0.048370361328125,
0.0545654296875,
0.049468994140625,
0.05230712890625,
-0.0018091201782226562,
0.036590576171875,
-0.0019292831420898438,
0.031982421875,
-0.05828857421875,
0.0400390625,
-0.0167236328125,
-0.061004638671875,
-0.0227203369140625,
-0.016082763671875,
-0.061431884765625,
0.0167388916015625,
-0.002086639404296875,
-0.06329345703125,
0.01238250732421875,
0.004802703857421875,
-0.007564544677734375,
0.0286712646484375,
-0.032562255859375,
0.053955078125,
-0.036285400390625,
0.0082550048828125,
-0.00547027587890625,
-0.052520751953125,
0.028350830078125,
0.0006074905395507812,
-0.00688934326171875,
-0.00562286376953125,
0.0074005126953125,
0.055328369140625,
-0.0154876708984375,
0.0614013671875,
-0.014495849609375,
-0.005695343017578125,
0.02435302734375,
-0.0218658447265625,
0.0234222412109375,
0.0125732421875,
0.00849151611328125,
0.04498291015625,
0.0158538818359375,
-0.0289764404296875,
-0.015716552734375,
0.034149169921875,
-0.06781005859375,
-0.0312042236328125,
-0.042266845703125,
-0.016845703125,
0.01025390625,
0.03936767578125,
0.0413818359375,
-0.0005435943603515625,
0.00040841102600097656,
0.0186614990234375,
0.0245361328125,
-0.032928466796875,
0.042633056640625,
0.04156494140625,
-0.0052642822265625,
-0.034210205078125,
0.0682373046875,
0.00982666015625,
0.005794525146484375,
0.04876708984375,
-0.0027713775634765625,
-0.0186920166015625,
-0.0330810546875,
-0.02392578125,
0.028564453125,
-0.0312347412109375,
0.00037741661071777344,
-0.026397705078125,
-0.043365478515625,
-0.0491943359375,
0.00978851318359375,
-0.0265655517578125,
-0.0294952392578125,
-0.0217742919921875,
0.0013637542724609375,
-0.0250244140625,
0.00910186767578125,
-0.0208282470703125,
0.035369873046875,
-0.077880859375,
0.03704833984375,
0.0155181884765625,
0.01873779296875,
0.0020236968994140625,
-0.0177154541015625,
-0.039947509765625,
0.00937652587890625,
-0.06427001953125,
-0.05487060546875,
0.04144287109375,
0.0002503395080566406,
0.053466796875,
0.046112060546875,
0.01401519775390625,
0.037994384765625,
-0.047271728515625,
0.082763671875,
0.027618408203125,
-0.08929443359375,
0.0302276611328125,
-0.01346588134765625,
0.0253448486328125,
0.0212860107421875,
0.036865234375,
-0.057342529296875,
-0.0247344970703125,
-0.035888671875,
-0.0859375,
0.0489501953125,
0.0287628173828125,
0.025115966796875,
-0.0017948150634765625,
0.0014410018920898438,
-0.0013875961303710938,
0.0124359130859375,
-0.08154296875,
-0.040679931640625,
-0.040008544921875,
-0.02313232421875,
0.0167388916015625,
-0.0305023193359375,
0.006771087646484375,
-0.0167694091796875,
0.07928466796875,
0.00543212890625,
0.061737060546875,
0.03607177734375,
-0.00327301025390625,
-0.00983428955078125,
0.006481170654296875,
0.034912109375,
0.04107666015625,
-0.0204315185546875,
-0.01776123046875,
0.00568389892578125,
-0.047760009765625,
-0.017333984375,
0.030670166015625,
-0.0291900634765625,
0.032928466796875,
0.0361328125,
0.045745849609375,
0.01006317138671875,
-0.0312042236328125,
0.0404052734375,
-0.0117034912109375,
-0.0185546875,
-0.072509765625,
-0.0032405853271484375,
0.0026531219482421875,
0.001667022705078125,
0.052276611328125,
-0.01274871826171875,
0.0098114013671875,
-0.0137176513671875,
0.01605224609375,
0.0302581787109375,
-0.0382080078125,
-0.03411865234375,
0.0489501953125,
0.0360107421875,
-0.0204620361328125,
0.0638427734375,
-0.004268646240234375,
-0.0709228515625,
0.0501708984375,
0.034088134765625,
0.0758056640625,
-0.025146484375,
0.003570556640625,
0.047698974609375,
0.03643798828125,
0.005279541015625,
0.018310546875,
-0.020172119140625,
-0.06890869140625,
-0.0390625,
-0.027435302734375,
-0.03369140625,
0.0312042236328125,
-0.036529541015625,
0.043060302734375,
-0.035308837890625,
-0.009429931640625,
-0.0047760009765625,
-0.00292205810546875,
-0.03558349609375,
0.010589599609375,
0.0095367431640625,
0.0848388671875,
-0.04681396484375,
0.08770751953125,
0.04443359375,
-0.039794921875,
-0.0616455078125,
0.01256561279296875,
-0.0301513671875,
-0.0546875,
0.07708740234375,
0.02508544921875,
0.02117919921875,
0.005580902099609375,
-0.055938720703125,
-0.0570068359375,
0.074462890625,
-0.0116729736328125,
-0.025787353515625,
-0.007457733154296875,
0.02630615234375,
0.02960205078125,
-0.00383758544921875,
0.0321044921875,
0.004985809326171875,
0.04681396484375,
-0.01219940185546875,
-0.0850830078125,
-0.01727294921875,
-0.0205230712890625,
0.0034732818603515625,
0.01885986328125,
-0.06341552734375,
0.0640869140625,
0.00797271728515625,
-0.02459716796875,
0.028594970703125,
0.06695556640625,
0.0011281967163085938,
0.00856781005859375,
0.042022705078125,
0.0340576171875,
-0.0020809173583984375,
-0.0174407958984375,
0.03662109375,
-0.042755126953125,
0.059661865234375,
0.06195068359375,
-0.006191253662109375,
0.055572509765625,
0.026947021484375,
-0.03802490234375,
0.040557861328125,
0.050994873046875,
-0.045562744140625,
0.045745849609375,
0.0014858245849609375,
-0.00717926025390625,
-0.00933837890625,
0.00969696044921875,
-0.041748046875,
0.0169830322265625,
0.023590087890625,
-0.0268707275390625,
-0.01154327392578125,
-0.01446533203125,
-0.0024662017822265625,
-0.030853271484375,
-0.00455474853515625,
0.03741455078125,
0.0098724365234375,
-0.022674560546875,
0.0361328125,
0.026153564453125,
0.07122802734375,
-0.07867431640625,
-0.0264129638671875,
0.01953125,
0.01146697998046875,
-0.0032329559326171875,
-0.048095703125,
0.010498046875,
-0.025634765625,
-0.01186370849609375,
-0.01141357421875,
0.0595703125,
-0.023773193359375,
-0.038818359375,
0.0309295654296875,
0.0055084228515625,
0.01052093505859375,
0.0210113525390625,
-0.08587646484375,
-0.0252838134765625,
0.026519775390625,
-0.031341552734375,
0.01103973388671875,
0.01248931884765625,
0.006809234619140625,
0.047576904296875,
0.06427001953125,
0.006389617919921875,
-0.00968170166015625,
-0.003124237060546875,
0.0660400390625,
-0.042938232421875,
-0.042022705078125,
-0.049957275390625,
0.055999755859375,
-0.0175323486328125,
-0.0270538330078125,
0.051666259765625,
0.05218505859375,
0.0838623046875,
-0.0264739990234375,
0.075927734375,
-0.0291290283203125,
0.056915283203125,
-0.01422882080078125,
0.05908203125,
-0.0294342041015625,
-0.00983428955078125,
-0.0247955322265625,
-0.06524658203125,
-0.01702880859375,
0.065673828125,
-0.01163482666015625,
-0.005397796630859375,
0.050750732421875,
0.04327392578125,
0.0015497207641601562,
-0.016754150390625,
0.01149749755859375,
0.0138092041015625,
0.045684814453125,
0.033599853515625,
0.0418701171875,
-0.038421630859375,
0.04693603515625,
-0.04827880859375,
-0.0149993896484375,
-0.0102386474609375,
-0.05145263671875,
-0.051910400390625,
-0.0445556640625,
-0.0203399658203125,
-0.0069732666015625,
-0.0193328857421875,
0.060516357421875,
0.055877685546875,
-0.07904052734375,
-0.032806396484375,
-0.0016012191772460938,
0.00811767578125,
-0.025299072265625,
-0.0260162353515625,
0.0467529296875,
-0.03167724609375,
-0.08489990234375,
-0.0006723403930664062,
0.006603240966796875,
0.008331298828125,
-0.0233917236328125,
0.00010704994201660156,
-0.022125244140625,
-0.01296234130859375,
0.0310211181640625,
0.03277587890625,
-0.055938720703125,
-0.02294921875,
-0.0015277862548828125,
-0.01540374755859375,
0.00922393798828125,
0.0457763671875,
-0.01678466796875,
0.0275115966796875,
0.05010986328125,
0.02020263671875,
0.024871826171875,
-0.01026153564453125,
0.052276611328125,
-0.03717041015625,
0.0209503173828125,
0.0247955322265625,
0.04071044921875,
0.0233306884765625,
-0.01654052734375,
0.036102294921875,
0.032135009765625,
-0.05242919921875,
-0.04229736328125,
0.02508544921875,
-0.07635498046875,
-0.0218963623046875,
0.06732177734375,
-0.0216827392578125,
-0.0105438232421875,
-0.008819580078125,
-0.04425048828125,
0.047393798828125,
-0.0225677490234375,
0.04486083984375,
0.0638427734375,
-0.004962921142578125,
-0.005950927734375,
-0.036834716796875,
0.0293426513671875,
0.03240966796875,
-0.0246124267578125,
-0.0258331298828125,
0.0006852149963378906,
0.013336181640625,
0.046417236328125,
0.03302001953125,
-0.01023101806640625,
0.0088958740234375,
-0.01206207275390625,
0.04461669921875,
0.002040863037109375,
0.01474761962890625,
0.0041351318359375,
-0.01300048828125,
0.0031452178955078125,
-0.031707763671875
]
] |
Salesforce/blip2-flan-t5-xl | 2023-09-13T08:46:20.000Z | [
"transformers",
"pytorch",
"blip-2",
"visual-question-answering",
"vision",
"image-to-text",
"image-captioning",
"en",
"arxiv:2301.12597",
"arxiv:2210.11416",
"license:mit",
"has_space",
"region:us"
] | image-to-text | Salesforce | null | null | Salesforce/blip2-flan-t5-xl | 33 | 84,867 | transformers | 2023-02-06T20:28:29 | ---
language: en
license: mit
tags:
- vision
- image-to-text
- image-captioning
- visual-question-answering
pipeline_tag: image-to-text
inference: false
---
# BLIP-2, Flan T5-xl, pre-trained only
BLIP-2 model, leveraging [Flan T5-xl](https://huggingface.co/google/flan-t5-xl) (a large language model).
It was introduced in the paper [BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models](https://arxiv.org/abs/2301.12597) by Li et al. and first released in [this repository](https://github.com/salesforce/LAVIS/tree/main/projects/blip2).
Disclaimer: The team releasing BLIP-2 did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
BLIP-2 consists of 3 models: a CLIP-like image encoder, a Querying Transformer (Q-Former) and a large language model.
The authors initialize the weights of the image encoder and large language model from pre-trained checkpoints and keep them frozen
while training the Querying Transformer, which is a BERT-like Transformer encoder that maps a set of "query tokens" to query embeddings,
which bridge the gap between the embedding space of the image encoder and the large language model.
The goal for the model is simply to predict the next text token, given the query embeddings and the previous text.
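The Q-Former step described above can be sketched with dummy tensors. Everything here is schematic: the dimensions, the single attention head, and the one-step cross-attention are illustrative assumptions, not the real BLIP-2 sizes or architecture (the actual Q-Former stacks many Transformer layers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions only — not the real BLIP-2 sizes.
num_patches, d_img = 16, 32    # frozen image-encoder output
num_queries = 4                # learnable query tokens
d_lm = 24                      # language model's embedding size

image_feats = rng.standard_normal((num_patches, d_img))
queries = rng.standard_normal((num_queries, d_img))

# One cross-attention step: each query token attends over all image
# patches (single head, purely schematic).
scores = queries @ image_feats.T / np.sqrt(d_img)
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
query_embeds = weights @ image_feats           # (num_queries, d_img)

# Linear projection into the LM's embedding space; the projected query
# embeddings act as a "soft prompt" prepended to the text embeddings,
# from which the frozen LM predicts the next text token.
proj = rng.standard_normal((d_img, d_lm))
soft_prompt = query_embeds @ proj              # (num_queries, d_lm)
print(soft_prompt.shape)  # → (4, 24)
```

The key point the sketch captures is that only a small, fixed number of query embeddings (not the full patch grid) is handed to the language model, which is what lets the image encoder and the LM stay frozen during training.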
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/blip2_architecture.jpg"
alt="drawing" width="600"/>
This allows the model to be used for tasks like:
- image captioning
- visual question answering (VQA)
- chat-like conversations by feeding the image and the previous conversation as prompt to the model
## Direct Use and Downstream Use
You can use the raw model for conditional text generation given an image and optional text. See the [model hub](https://huggingface.co/models?search=Salesforce/blip) to look for
fine-tuned versions on a task that interests you.
## Bias, Risks, Limitations, and Ethical Considerations
BLIP2-FlanT5 uses off-the-shelf Flan-T5 as the language model. It inherits the same risks and limitations from [Flan-T5](https://arxiv.org/pdf/2210.11416.pdf):
> Language models, including Flan-T5, can potentially be used for language generation in a harmful way, according to Rae et al. (2021). Flan-T5 should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application.
BLIP2 is fine-tuned on image-text datasets (e.g. [LAION](https://laion.ai/blog/laion-400-open-dataset/)) collected from the internet. As a result, the model itself is potentially vulnerable to generating similarly inappropriate content or replicating inherent biases in the underlying data.
BLIP2 has not been tested in real-world applications. It should not be directly deployed in any application. Researchers should first carefully assess the safety and fairness of the model in relation to the specific context in which it will be deployed.
### How to use
For code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/blip-2#transformers.Blip2ForConditionalGeneration.forward.example).
#### Running the model on CPU
<details>
<summary> Click to expand </summary>
```python
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration
processor = Blip2Processor.from_pretrained("Salesforce/blip2-flan-t5-xl")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-flan-t5-xl")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt")
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
```
</details>
#### Running the model on GPU
##### In full precision
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration
processor = Blip2Processor.from_pretrained("Salesforce/blip2-flan-t5-xl")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-flan-t5-xl", device_map="auto")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda")
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
```
</details>
##### In half precision (`float16`)
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
import torch
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration
processor = Blip2Processor.from_pretrained("Salesforce/blip2-flan-t5-xl")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-flan-t5-xl", torch_dtype=torch.float16, device_map="auto")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16)
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
```
</details>
##### In 8-bit precision (`int8`)
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate bitsandbytes
import torch
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration
processor = Blip2Processor.from_pretrained("Salesforce/blip2-flan-t5-xl")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-flan-t5-xl", load_in_8bit=True, device_map="auto")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16)
out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True))
```
</details> | 6,508 | [
[
-0.0275115966796875,
-0.04888916015625,
-0.0033054351806640625,
0.029876708984375,
-0.0178070068359375,
-0.0104827880859375,
-0.0225372314453125,
-0.058837890625,
-0.00920867919921875,
0.0218658447265625,
-0.03485107421875,
-0.010833740234375,
-0.04547119140625,
-0.0008282661437988281,
-0.007427215576171875,
0.052520751953125,
0.006015777587890625,
-0.004764556884765625,
-0.004177093505859375,
-0.003383636474609375,
-0.0188446044921875,
-0.0081024169921875,
-0.0465087890625,
-0.015838623046875,
0.00676727294921875,
0.0257110595703125,
0.053253173828125,
0.0299835205078125,
0.054901123046875,
0.0287322998046875,
-0.013153076171875,
0.006389617919921875,
-0.035552978515625,
-0.0189666748046875,
-0.00612640380859375,
-0.05609130859375,
-0.0181732177734375,
-0.001956939697265625,
0.03424072265625,
0.031707763671875,
0.01033782958984375,
0.0294952392578125,
-0.006565093994140625,
0.037322998046875,
-0.040191650390625,
0.0215911865234375,
-0.050933837890625,
0.00617218017578125,
-0.005680084228515625,
-0.0017957687377929688,
-0.0284423828125,
-0.0141448974609375,
0.0038299560546875,
-0.05560302734375,
0.03863525390625,
0.0025882720947265625,
0.11785888671875,
0.0196380615234375,
-0.0014295578002929688,
-0.0240478515625,
-0.036956787109375,
0.066650390625,
-0.048614501953125,
0.034881591796875,
0.01395416259765625,
0.0260162353515625,
-0.0003094673156738281,
-0.07171630859375,
-0.058074951171875,
-0.00861358642578125,
-0.01110076904296875,
0.0252685546875,
-0.017303466796875,
-0.0047454833984375,
0.03656005859375,
0.0203094482421875,
-0.046112060546875,
-0.00588226318359375,
-0.06378173828125,
-0.015350341796875,
0.053253173828125,
-0.007762908935546875,
0.0234222412109375,
-0.0222015380859375,
-0.04205322265625,
-0.03302001953125,
-0.0380859375,
0.024139404296875,
0.004428863525390625,
0.008056640625,
-0.03143310546875,
0.059722900390625,
0.002933502197265625,
0.045867919921875,
0.0281524658203125,
-0.0279083251953125,
0.04742431640625,
-0.0233306884765625,
-0.023712158203125,
-0.01264190673828125,
0.07232666015625,
0.042938232421875,
0.017333984375,
0.004421234130859375,
-0.019439697265625,
0.005420684814453125,
0.0008568763732910156,
-0.08123779296875,
-0.0166778564453125,
0.031463623046875,
-0.0369873046875,
-0.0198516845703125,
-0.004421234130859375,
-0.0655517578125,
-0.009002685546875,
0.0123443603515625,
0.03851318359375,
-0.042724609375,
-0.0281219482421875,
0.0021915435791015625,
-0.03411865234375,
0.0293121337890625,
0.0150909423828125,
-0.072509765625,
-0.00399017333984375,
0.034149169921875,
0.067626953125,
0.020111083984375,
-0.0377197265625,
-0.0166168212890625,
0.01019287109375,
-0.024932861328125,
0.043304443359375,
-0.010833740234375,
-0.0182952880859375,
-0.0024871826171875,
0.01763916015625,
-0.00220489501953125,
-0.045166015625,
0.0095062255859375,
-0.0299224853515625,
0.02166748046875,
-0.01306915283203125,
-0.034698486328125,
-0.0256805419921875,
0.01331329345703125,
-0.0350341796875,
0.08087158203125,
0.02435302734375,
-0.06494140625,
0.036956787109375,
-0.04498291015625,
-0.02496337890625,
0.0150299072265625,
-0.01158905029296875,
-0.0526123046875,
-0.007518768310546875,
0.016815185546875,
0.0284271240234375,
-0.0218658447265625,
0.0010194778442382812,
-0.0256500244140625,
-0.0284576416015625,
0.005584716796875,
-0.006504058837890625,
0.0809326171875,
0.004962921142578125,
-0.051605224609375,
0.0001914501190185547,
-0.0384521484375,
-0.008636474609375,
0.02984619140625,
-0.0152740478515625,
0.0088348388671875,
-0.0198211669921875,
0.01497650146484375,
0.0257568359375,
0.041656494140625,
-0.054107666015625,
-0.004985809326171875,
-0.039306640625,
0.0364990234375,
0.038909912109375,
-0.01507568359375,
0.0260772705078125,
-0.0169219970703125,
0.026763916015625,
0.021209716796875,
0.0230560302734375,
-0.023223876953125,
-0.057342529296875,
-0.06982421875,
-0.01531219482421875,
-0.00035381317138671875,
0.053436279296875,
-0.06573486328125,
0.0307769775390625,
-0.0181884765625,
-0.048248291015625,
-0.040863037109375,
0.01016998291015625,
0.038604736328125,
0.05291748046875,
0.035980224609375,
-0.015228271484375,
-0.037109375,
-0.06671142578125,
0.0156707763671875,
-0.017852783203125,
0.001834869384765625,
0.0305633544921875,
0.054168701171875,
-0.0178985595703125,
0.059539794921875,
-0.03668212890625,
-0.0228118896484375,
-0.0263519287109375,
0.00640106201171875,
0.022491455078125,
0.051025390625,
0.06488037109375,
-0.059539794921875,
-0.025970458984375,
0.002872467041015625,
-0.06829833984375,
0.012420654296875,
-0.01458740234375,
-0.013336181640625,
0.041534423828125,
0.0289764404296875,
-0.0628662109375,
0.046600341796875,
0.0379638671875,
-0.0222320556640625,
0.038665771484375,
-0.01142120361328125,
-0.005588531494140625,
-0.0770263671875,
0.03265380859375,
0.0117645263671875,
-0.01233673095703125,
-0.032379150390625,
0.0054779052734375,
0.0184478759765625,
-0.0185699462890625,
-0.051422119140625,
0.060272216796875,
-0.03289794921875,
-0.02056884765625,
0.005069732666015625,
-0.009796142578125,
0.01129913330078125,
0.0450439453125,
0.0216217041015625,
0.060577392578125,
0.06256103515625,
-0.0458984375,
0.03338623046875,
0.041229248046875,
-0.0243072509765625,
0.0231170654296875,
-0.0648193359375,
0.005001068115234375,
-0.008697509765625,
0.00530242919921875,
-0.07647705078125,
-0.00989532470703125,
0.0211181640625,
-0.054534912109375,
0.0241241455078125,
-0.015350341796875,
-0.03363037109375,
-0.0545654296875,
-0.0223541259765625,
0.0247955322265625,
0.050018310546875,
-0.050048828125,
0.031768798828125,
0.02044677734375,
0.01094818115234375,
-0.0535888671875,
-0.0882568359375,
-0.00238037109375,
0.0030727386474609375,
-0.0672607421875,
0.0369873046875,
0.002620697021484375,
0.01139068603515625,
0.00927734375,
0.0198516845703125,
-0.0002484321594238281,
-0.013397216796875,
0.0195159912109375,
0.0243988037109375,
-0.0253143310546875,
-0.01555633544921875,
-0.016571044921875,
-0.00026869773864746094,
-0.0035839080810546875,
-0.012664794921875,
0.055206298828125,
-0.02020263671875,
0.0018978118896484375,
-0.053497314453125,
0.005329132080078125,
0.03619384765625,
-0.0243072509765625,
0.047698974609375,
0.062744140625,
-0.03326416015625,
-0.00612640380859375,
-0.037109375,
-0.01367950439453125,
-0.043914794921875,
0.039459228515625,
-0.0258636474609375,
-0.0291900634765625,
0.045257568359375,
0.01971435546875,
0.01317596435546875,
0.0269927978515625,
0.05352783203125,
-0.00860595703125,
0.06500244140625,
0.04803466796875,
0.0171051025390625,
0.0509033203125,
-0.068603515625,
0.0062103271484375,
-0.052947998046875,
-0.033935546875,
-0.006053924560546875,
-0.01953125,
-0.03594970703125,
-0.033203125,
0.0211944580078125,
0.0187530517578125,
-0.032196044921875,
0.02423095703125,
-0.0389404296875,
0.01326751708984375,
0.051239013671875,
0.0213470458984375,
-0.0223236083984375,
0.01181793212890625,
-0.0088348388671875,
0.0032711029052734375,
-0.047119140625,
-0.0200347900390625,
0.07177734375,
0.034698486328125,
0.047698974609375,
-0.00966644287109375,
0.03314208984375,
-0.02166748046875,
0.0180206298828125,
-0.053863525390625,
0.04449462890625,
-0.024505615234375,
-0.059967041015625,
-0.014190673828125,
-0.021484375,
-0.0701904296875,
0.00855255126953125,
-0.016845703125,
-0.053070068359375,
0.01459503173828125,
0.0266265869140625,
-0.01543426513671875,
0.0290374755859375,
-0.06903076171875,
0.07342529296875,
-0.03466796875,
-0.042510986328125,
0.005985260009765625,
-0.0479736328125,
0.031768798828125,
0.01448822021484375,
-0.01177978515625,
0.008697509765625,
0.003955841064453125,
0.05462646484375,
-0.043731689453125,
0.060272216796875,
-0.03125,
0.021392822265625,
0.0362548828125,
-0.01273345947265625,
0.01117706298828125,
-0.005756378173828125,
-0.005218505859375,
0.0237884521484375,
-0.000705718994140625,
-0.04486083984375,
-0.040008544921875,
0.0167236328125,
-0.0645751953125,
-0.030548095703125,
-0.024444580078125,
-0.0284423828125,
0.003047943115234375,
0.034149169921875,
0.052459716796875,
0.0272216796875,
0.0196990966796875,
0.006984710693359375,
0.0216827392578125,
-0.042633056640625,
0.05255126953125,
0.00281524658203125,
-0.02691650390625,
-0.037506103515625,
0.06854248046875,
-0.002681732177734375,
0.02166748046875,
0.01708984375,
0.0161285400390625,
-0.03924560546875,
-0.021209716796875,
-0.0557861328125,
0.0418701171875,
-0.04766845703125,
-0.0341796875,
-0.0276947021484375,
-0.0202789306640625,
-0.046051025390625,
-0.01488494873046875,
-0.03912353515625,
-0.01837158203125,
-0.037261962890625,
0.015838623046875,
0.04010009765625,
0.036376953125,
-0.007232666015625,
0.0305633544921875,
-0.04498291015625,
0.0361328125,
0.0240020751953125,
0.02825927734375,
-0.0006804466247558594,
-0.041595458984375,
-0.003414154052734375,
0.0226898193359375,
-0.0305633544921875,
-0.052337646484375,
0.03826904296875,
0.0195159912109375,
0.0260162353515625,
0.030548095703125,
-0.02703857421875,
0.06890869140625,
-0.0236968994140625,
0.06536865234375,
0.039703369140625,
-0.07061767578125,
0.055877685546875,
-0.006145477294921875,
0.0090179443359375,
0.0232086181640625,
0.0226898193359375,
-0.027374267578125,
-0.021209716796875,
-0.054290771484375,
-0.0606689453125,
0.0638427734375,
0.01548004150390625,
0.007045745849609375,
0.01513671875,
0.0258636474609375,
-0.01251220703125,
0.01036834716796875,
-0.0509033203125,
-0.01873779296875,
-0.047882080078125,
-0.01461029052734375,
-0.0055999755859375,
-0.005558013916015625,
0.01007080078125,
-0.0343017578125,
0.0384521484375,
-0.00446319580078125,
0.0487060546875,
0.02734375,
-0.0263824462890625,
-0.01450347900390625,
-0.0357666015625,
0.045928955078125,
0.038421630859375,
-0.0225372314453125,
-0.005859375,
-0.0008740425109863281,
-0.0562744140625,
-0.01531982421875,
0.0031490325927734375,
-0.02606201171875,
0.0022602081298828125,
0.031646728515625,
0.08209228515625,
-0.00669097900390625,
-0.040802001953125,
0.056671142578125,
-0.0011301040649414062,
-0.021453857421875,
-0.030364990234375,
0.0030193328857421875,
0.007038116455078125,
0.0203857421875,
0.031494140625,
0.01192474365234375,
-0.0208740234375,
-0.036346435546875,
0.023406982421875,
0.0301971435546875,
-0.007549285888671875,
-0.0289764404296875,
0.06475830078125,
0.007259368896484375,
-0.0174713134765625,
0.056854248046875,
-0.02911376953125,
-0.052490234375,
0.0721435546875,
0.056793212890625,
0.037689208984375,
-0.001819610595703125,
0.0167083740234375,
0.053466796875,
0.029266357421875,
-0.0037403106689453125,
0.039398193359375,
0.0170745849609375,
-0.0682373046875,
-0.01348114013671875,
-0.045684814453125,
-0.02099609375,
0.0214080810546875,
-0.0352783203125,
0.042999267578125,
-0.052947998046875,
-0.0162200927734375,
0.014739990234375,
0.0184326171875,
-0.0662841796875,
0.0313720703125,
0.02484130859375,
0.069091796875,
-0.06109619140625,
0.03985595703125,
0.0618896484375,
-0.06512451171875,
-0.07098388671875,
-0.01517486572265625,
-0.0240936279296875,
-0.080810546875,
0.0626220703125,
0.03216552734375,
0.004680633544921875,
-0.0011463165283203125,
-0.057220458984375,
-0.057037353515625,
0.0875244140625,
0.033721923828125,
-0.0250091552734375,
0.003604888916015625,
0.01605224609375,
0.041229248046875,
-0.0125579833984375,
0.0380859375,
0.0176849365234375,
0.03887939453125,
0.0291748046875,
-0.067626953125,
0.01183319091796875,
-0.0269317626953125,
0.0030803680419921875,
-0.00832366943359375,
-0.07330322265625,
0.07470703125,
-0.036773681640625,
-0.01517486572265625,
0.0028514862060546875,
0.061248779296875,
0.0340576171875,
0.0126495361328125,
0.036773681640625,
0.04388427734375,
0.05194091796875,
0.000015854835510253906,
0.0745849609375,
-0.0281829833984375,
0.046661376953125,
0.04290771484375,
0.0021266937255859375,
0.059112548828125,
0.0269317626953125,
-0.00835418701171875,
0.0204925537109375,
0.050750732421875,
-0.043426513671875,
0.0287017822265625,
-0.0013608932495117188,
0.01361846923828125,
0.004741668701171875,
0.01080322265625,
-0.0298004150390625,
0.049072265625,
0.03887939453125,
-0.0275726318359375,
-0.002262115478515625,
0.00099945068359375,
0.002819061279296875,
-0.02972412109375,
-0.018341064453125,
0.02484130859375,
-0.008087158203125,
-0.0537109375,
0.076171875,
-0.0014562606811523438,
0.07958984375,
-0.0180511474609375,
0.00308990478515625,
-0.0246124267578125,
0.01959228515625,
-0.0299224853515625,
-0.075439453125,
0.0238189697265625,
-0.0024242401123046875,
0.004642486572265625,
-0.00020611286163330078,
0.039031982421875,
-0.0338134765625,
-0.06939697265625,
0.022003173828125,
0.0076904296875,
0.0246124267578125,
0.0179595947265625,
-0.07916259765625,
0.00904083251953125,
0.005702972412109375,
-0.0264739990234375,
-0.0080718994140625,
0.0249176025390625,
0.00522613525390625,
0.055267333984375,
0.046142578125,
0.01525115966796875,
0.034912109375,
-0.005451202392578125,
0.059356689453125,
-0.048553466796875,
-0.0281524658203125,
-0.0408935546875,
0.045135498046875,
-0.0101165771484375,
-0.046905517578125,
0.034332275390625,
0.0626220703125,
0.068603515625,
-0.01190948486328125,
0.053497314453125,
-0.0276336669921875,
0.01482391357421875,
-0.035858154296875,
0.059906005859375,
-0.06341552734375,
-0.00981903076171875,
-0.01788330078125,
-0.047882080078125,
-0.0284271240234375,
0.06964111328125,
-0.01410675048828125,
0.0150299072265625,
0.045684814453125,
0.09088134765625,
-0.0228118896484375,
-0.0191192626953125,
0.00994110107421875,
0.0221710205078125,
0.03033447265625,
0.051788330078125,
0.043304443359375,
-0.0557861328125,
0.0479736328125,
-0.0560302734375,
-0.01325225830078125,
-0.01244354248046875,
-0.047210693359375,
-0.06988525390625,
-0.047882080078125,
-0.03070068359375,
-0.03948974609375,
-0.00885772705078125,
0.039642333984375,
0.0677490234375,
-0.05462646484375,
-0.018341064453125,
-0.01123809814453125,
-0.0009965896606445312,
-0.0036487579345703125,
-0.0169219970703125,
0.03436279296875,
-0.026580810546875,
-0.06585693359375,
-0.008941650390625,
0.009765625,
0.0225982666015625,
-0.01348114013671875,
0.0012102127075195312,
-0.01605224609375,
-0.0221099853515625,
0.03424072265625,
0.03460693359375,
-0.04437255859375,
-0.0185394287109375,
0.005619049072265625,
-0.01265716552734375,
0.022491455078125,
0.0250244140625,
-0.0445556640625,
0.023712158203125,
0.040740966796875,
0.0263824462890625,
0.0645751953125,
-0.004421234130859375,
0.01497650146484375,
-0.050048828125,
0.05853271484375,
0.01119232177734375,
0.0307464599609375,
0.039703369140625,
-0.0290374755859375,
0.02984619140625,
0.0252532958984375,
-0.021209716796875,
-0.06573486328125,
0.0031452178955078125,
-0.08697509765625,
-0.01776123046875,
0.09954833984375,
-0.019012451171875,
-0.050262451171875,
0.01480865478515625,
-0.01134490966796875,
0.0302581787109375,
-0.018402099609375,
0.0423583984375,
0.01187896728515625,
-0.0084991455078125,
-0.03485107421875,
-0.0242462158203125,
0.034393310546875,
0.02410888671875,
-0.047637939453125,
-0.0217437744140625,
0.01514434814453125,
0.038787841796875,
0.031890869140625,
0.03802490234375,
-0.0081024169921875,
0.0292205810546875,
0.018646240234375,
0.0299835205078125,
-0.0087127685546875,
-0.00501251220703125,
-0.0141754150390625,
-0.004619598388671875,
-0.012359619140625,
-0.036590576171875
]
] |
colorfulscoop/sbert-base-ja | 2021-08-08T06:47:42.000Z | [
"sentence-transformers",
"pytorch",
"bert",
"feature-extraction",
"sentence-similarity",
"ja",
"arxiv:1908.10084",
"license:cc-by-sa-4.0",
"endpoints_compatible",
"region:us",
"has_space"
] | sentence-similarity | colorfulscoop | null | null | colorfulscoop/sbert-base-ja | 12 | 84,558 | sentence-transformers | 2022-03-02T23:29:05 | ---
language: ja
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
widget:
source_sentence: "走るのが趣味です"
sentences:
- 外をランニングするのが好きです
- 運動はそこそこです
- 走るのは嫌いです
license: cc-by-sa-4.0
---
# Sentence BERT base Japanese model
This repository contains a Sentence BERT base model for Japanese.
## Pretrained model
This model utilizes a Japanese BERT model [colorfulscoop/bert-base-ja](https://huggingface.co/colorfulscoop/bert-base-ja) v1.0 released under [Creative Commons Attribution-ShareAlike 3.0](https://creativecommons.org/licenses/by-sa/3.0/) as a pretrained model.
## Training data
[Japanese SNLI dataset](https://nlp.ist.i.kyoto-u.ac.jp/index.php?%E6%97%A5%E6%9C%AC%E8%AA%9ESNLI%28JSNLI%29%E3%83%87%E3%83%BC%E3%82%BF%E3%82%BB%E3%83%83%E3%83%88) released under [Creative Commons Attribution-ShareAlike 4.0](https://creativecommons.org/licenses/by-sa/4.0/) is used for training.
The original training dataset is split into train/valid sets. The following data was prepared.
* Train data: 523,005 samples
* Valid data: 10,000 samples
* Test data: 3,916 samples
## Model description
This model utilizes the `SentenceTransformer` model from [sentence-transformers](https://github.com/UKPLab/sentence-transformers).
The model detail is as below.
```py
>>> from sentence_transformers import SentenceTransformer
>>> SentenceTransformer("colorfulscoop/sbert-base-ja")
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Training
This model fine-tuned [colorfulscoop/bert-base-ja](https://huggingface.co/colorfulscoop/bert-base-ja) with a softmax classifier over the 3 SNLI labels. The AdamW optimizer was used with a learning rate of 2e-05, linearly warmed up over the first 10% of the training data. The model was trained for 1 epoch with a batch size of 8.
Note: in the original [Sentence BERT](https://arxiv.org/abs/1908.10084) paper, the model trained on SNLI and Multi-Genre NLI used a batch size of 16. Because the dataset here is around half the size of the original one, the batch size was set to 8, half of the original 16.
Training was conducted on Ubuntu 18.04.5 LTS with one RTX 2080 Ti.
After training, test set accuracy reached 0.8529.
Training code is available in [a GitHub repository](https://github.com/colorfulscoop/sbert-ja).
## Usage
First, install the dependencies.
```sh
$ pip install sentence-transformers==2.0.0
```
Then initialize the `SentenceTransformer` model and use the `encode` method to convert sentences to vectors.
```py
>>> from sentence_transformers import SentenceTransformer
>>> model = SentenceTransformer("colorfulscoop/sbert-base-ja")
>>> sentences = ["外をランニングするのが好きです", "海外旅行に行くのが趣味です"]
>>> model.encode(sentences)
```
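`encode` returns one embedding vector per sentence (a 768-dimensional row for this model). As a minimal sketch of how those vectors could be compared — note the `cosine_similarity` helper below is our own illustration, not part of the model or library API — similarity between two encoded sentences can be computed like this:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two 1-D embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# With the model above, you would do something like:
#   embeddings = model.encode(sentences)
#   score = cosine_similarity(embeddings[0], embeddings[1])
# Toy vectors stand in for real embeddings here:
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([1.0, 1.0, 0.0])
print(round(cosine_similarity(v1, v2), 4))  # 0.5
```

Sentences with higher cosine similarity are semantically closer, which is how the widget examples above (e.g. "走るのが趣味です" vs. "外をランニングするのが好きです") would be ranked.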
## License
Copyright (c) 2021 Colorful Scoop
All the models included in this repository are licensed under [Creative Commons Attribution-ShareAlike 4.0](https://creativecommons.org/licenses/by-sa/4.0/).
**Disclaimer:** Use of this model is at your sole risk. Colorful Scoop makes no warranty or guarantee of any outputs from the model. Colorful Scoop is not liable for any trouble, loss, or damage arising from the model output.
---
This model utilizes the following pretrained model.
* **Name:** bert-base-ja
* **Credit:** (c) 2021 Colorful Scoop
* **License:** [Creative Commons Attribution-ShareAlike 3.0](https://creativecommons.org/licenses/by-sa/3.0/)
* **Disclaimer:** The model may generate texts similar to those in the training data, texts that are not true, or biased texts. Use of the model is at your sole risk. Colorful Scoop makes no warranty or guarantee of any outputs from the model. Colorful Scoop is not liable for any trouble, loss, or damage arising from the model output.
* **Link:** https://huggingface.co/colorfulscoop/bert-base-ja
---
This model utilizes the following data for fine-tuning.
* **Name:** 日本語SNLI(JSNLI)データセット
* **Credit:** [https://nlp.ist.i.kyoto-u.ac.jp/index.php?日本語SNLI(JSNLI)データセット](https://nlp.ist.i.kyoto-u.ac.jp/index.php?%E6%97%A5%E6%9C%AC%E8%AA%9ESNLI%28JSNLI%29%E3%83%87%E3%83%BC%E3%82%BF%E3%82%BB%E3%83%83%E3%83%88)
* **License:** [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)
* **Link:** [https://nlp.ist.i.kyoto-u.ac.jp/index.php?日本語SNLI(JSNLI)データセット](https://nlp.ist.i.kyoto-u.ac.jp/index.php?%E6%97%A5%E6%9C%AC%E8%AA%9ESNLI%28JSNLI%29%E3%83%87%E3%83%BC%E3%82%BF%E3%82%BB%E3%83%83%E3%83%88) | 4,706 | [
[
-0.025054931640625,
-0.048675537109375,
0.02392578125,
0.025390625,
-0.02667236328125,
-0.0250244140625,
-0.034820556640625,
-0.029083251953125,
0.016845703125,
0.028228759765625,
-0.068603515625,
-0.029083251953125,
-0.0467529296875,
0.00742340087890625,
-0.0010175704956054688,
0.09027099609375,
0.0019779205322265625,
0.012298583984375,
0.00791168212890625,
-0.00872039794921875,
-0.034393310546875,
-0.0242919921875,
-0.033935546875,
-0.0211334228515625,
0.03369140625,
0.0230712890625,
0.047149658203125,
0.0203704833984375,
0.0215606689453125,
0.028411865234375,
-0.0107421875,
-0.00726318359375,
-0.03997802734375,
-0.0118560791015625,
0.012969970703125,
-0.04425048828125,
-0.01508331298828125,
-0.000014185905456542969,
0.034088134765625,
0.04833984375,
-0.00815582275390625,
0.00679779052734375,
-0.00444793701171875,
0.0341796875,
-0.03314208984375,
0.021514892578125,
-0.051422119140625,
0.01222991943359375,
-0.00968170166015625,
0.009124755859375,
-0.02734375,
-0.0369873046875,
0.004703521728515625,
-0.05059814453125,
0.0026760101318359375,
-0.0033893585205078125,
0.1064453125,
0.002132415771484375,
-0.0204925537109375,
-0.0193634033203125,
-0.028045654296875,
0.0733642578125,
-0.06365966796875,
0.029205322265625,
0.0265960693359375,
0.012847900390625,
-0.00370025634765625,
-0.056121826171875,
-0.059539794921875,
-0.00017571449279785156,
-0.0160675048828125,
0.03167724609375,
-0.00934600830078125,
-0.011962890625,
0.02947998046875,
0.01285552978515625,
-0.040985107421875,
0.00473785400390625,
-0.026031494140625,
-0.01012420654296875,
0.054229736328125,
-0.004192352294921875,
0.036865234375,
-0.038787841796875,
-0.04071044921875,
-0.02532958984375,
-0.023773193359375,
0.01241302490234375,
0.034210205078125,
0.034423828125,
-0.0279541015625,
0.0391845703125,
-0.0017337799072265625,
0.028350830078125,
0.001842498779296875,
-0.019195556640625,
0.053985595703125,
-0.0447998046875,
-0.01849365234375,
-0.002109527587890625,
0.07757568359375,
0.02703857421875,
0.0306243896484375,
-0.0005426406860351562,
-0.027801513671875,
-0.0157012939453125,
0.00799560546875,
-0.055938720703125,
-0.01531219482421875,
0.002593994140625,
-0.04119873046875,
-0.01486968994140625,
0.0018053054809570312,
-0.042327880859375,
0.00021183490753173828,
-0.0159759521484375,
0.047149658203125,
-0.071044921875,
-0.0219268798828125,
0.001678466796875,
-0.02191162109375,
0.031280517578125,
0.00534820556640625,
-0.07257080078125,
0.007171630859375,
0.0261383056640625,
0.0450439453125,
0.01410675048828125,
-0.024871826171875,
0.0083770751953125,
-0.0007758140563964844,
-0.0018281936645507812,
0.03460693359375,
-0.010528564453125,
-0.0252227783203125,
0.006439208984375,
0.013275146484375,
-0.0322265625,
-0.0087738037109375,
0.04730224609375,
-0.0174560546875,
0.046966552734375,
-0.0136260986328125,
-0.0435791015625,
-0.0139617919921875,
0.01265716552734375,
-0.02972412109375,
0.09075927734375,
0.0164337158203125,
-0.07623291015625,
0.030792236328125,
-0.036041259765625,
-0.035888671875,
-0.0027217864990234375,
-0.00327301025390625,
-0.053192138671875,
-0.0086669921875,
0.03021240234375,
0.031463623046875,
0.018310546875,
0.0209197998046875,
-0.0228118896484375,
-0.0303802490234375,
0.01788330078125,
-0.0251617431640625,
0.1055908203125,
0.0206451416015625,
-0.025543212890625,
0.007312774658203125,
-0.05712890625,
0.006572723388671875,
0.03076171875,
-0.034088134765625,
-0.0213775634765625,
-0.0199127197265625,
0.0174407958984375,
0.0155792236328125,
0.033905029296875,
-0.052276611328125,
0.01279449462890625,
-0.043731689453125,
0.0252532958984375,
0.061309814453125,
-0.0128173828125,
0.0236663818359375,
-0.037628173828125,
0.023406982421875,
0.004253387451171875,
0.020843505859375,
-0.0096893310546875,
-0.046295166015625,
-0.0712890625,
-0.0241241455078125,
0.030059814453125,
0.047607421875,
-0.06622314453125,
0.0728759765625,
-0.0404052734375,
-0.045562744140625,
-0.046844482421875,
-0.009735107421875,
0.0205078125,
0.028289794921875,
0.0275726318359375,
-0.002201080322265625,
-0.045135498046875,
-0.064697265625,
-0.01593017578125,
-0.0021076202392578125,
-0.003101348876953125,
0.02630615234375,
0.0531005859375,
-0.0244598388671875,
0.06256103515625,
-0.03753662109375,
-0.024627685546875,
-0.037689208984375,
0.01190948486328125,
0.036041259765625,
0.045379638671875,
0.059539794921875,
-0.05096435546875,
-0.044189453125,
-0.0196380615234375,
-0.052093505859375,
0.0025501251220703125,
-0.016998291015625,
-0.0271148681640625,
0.027099609375,
0.0284576416015625,
-0.056915283203125,
0.021148681640625,
0.03912353515625,
-0.030975341796875,
0.040008544921875,
-0.015838623046875,
-0.006103515625,
-0.08758544921875,
0.01519775390625,
-0.003810882568359375,
-0.00862884521484375,
-0.043243408203125,
0.0035858154296875,
0.00472259521484375,
-0.006805419921875,
-0.04119873046875,
0.02960205078125,
-0.0278472900390625,
0.0021514892578125,
0.0117645263671875,
0.007030487060546875,
-0.0059356689453125,
0.06512451171875,
-0.0016374588012695312,
0.049896240234375,
0.029571533203125,
-0.032958984375,
0.005481719970703125,
0.040557861328125,
-0.03912353515625,
0.01526641845703125,
-0.04840087890625,
-0.009979248046875,
-0.0099334716796875,
0.0162200927734375,
-0.080810546875,
-0.0009899139404296875,
0.0238800048828125,
-0.03997802734375,
0.0293426513671875,
0.00783538818359375,
-0.04327392578125,
-0.030853271484375,
-0.042816162109375,
-0.0019483566284179688,
0.04974365234375,
-0.03375244140625,
0.04248046875,
0.00649261474609375,
-0.021759033203125,
-0.055206298828125,
-0.08062744140625,
-0.0209808349609375,
-0.00510406494140625,
-0.046966552734375,
0.028472900390625,
-0.0181427001953125,
0.019866943359375,
0.0036411285400390625,
0.0013437271118164062,
-0.00904083251953125,
0.003032684326171875,
0.008056640625,
0.024200439453125,
-0.0238037109375,
0.005863189697265625,
0.002170562744140625,
0.01078033447265625,
0.00318145751953125,
0.0080108642578125,
0.051910400390625,
-0.007740020751953125,
0.0016460418701171875,
-0.040740966796875,
0.005603790283203125,
0.0302886962890625,
0.005001068115234375,
0.07244873046875,
0.06201171875,
-0.0188751220703125,
0.00899505615234375,
-0.03570556640625,
-0.0218048095703125,
-0.03839111328125,
0.0297393798828125,
-0.030364990234375,
-0.0445556640625,
0.037506103515625,
0.024688720703125,
0.01427459716796875,
0.045654296875,
0.0249786376953125,
-0.0215301513671875,
0.07586669921875,
0.042327880859375,
-0.005184173583984375,
0.06011962890625,
-0.0369873046875,
0.0116424560546875,
-0.055999755859375,
-0.02264404296875,
-0.03369140625,
-0.022613525390625,
-0.0633544921875,
-0.01348114013671875,
0.0129547119140625,
-0.003387451171875,
-0.034454345703125,
0.026336669921875,
-0.0279541015625,
0.01303863525390625,
0.052459716796875,
0.0238037109375,
-0.00923919677734375,
0.01094818115234375,
-0.0247039794921875,
-0.0189666748046875,
-0.0640869140625,
-0.022247314453125,
0.0758056640625,
0.044281005859375,
0.03265380859375,
-0.002391815185546875,
0.057373046875,
-0.0129852294921875,
0.011260986328125,
-0.051025390625,
0.049072265625,
-0.02685546875,
-0.048248291015625,
-0.029693603515625,
-0.04071044921875,
-0.08489990234375,
0.023223876953125,
-0.04095458984375,
-0.04290771484375,
0.003108978271484375,
-0.0213623046875,
-0.01239013671875,
0.020263671875,
-0.05816650390625,
0.07061767578125,
-0.0288543701171875,
-0.006473541259765625,
0.005504608154296875,
-0.05841064453125,
0.02349853515625,
0.0189666748046875,
0.0198822021484375,
-0.0085601806640625,
0.01715087890625,
0.06951904296875,
-0.043792724609375,
0.06683349609375,
-0.023956298828125,
-0.00363922119140625,
0.011444091796875,
-0.0238189697265625,
0.0322265625,
-0.0139617919921875,
0.0017490386962890625,
0.0289459228515625,
-0.022918701171875,
-0.030792236328125,
-0.02593994140625,
0.04754638671875,
-0.0919189453125,
-0.0226593017578125,
-0.038055419921875,
-0.036041259765625,
-0.005218505859375,
0.03973388671875,
0.041412353515625,
0.01366424560546875,
0.0129547119140625,
0.02239990234375,
0.047760009765625,
-0.0308990478515625,
0.045135498046875,
0.0257110595703125,
0.0010404586791992188,
-0.031524658203125,
0.06494140625,
0.01120758056640625,
0.007171630859375,
0.0219268798828125,
0.0252532958984375,
-0.0289306640625,
-0.03692626953125,
-0.0176544189453125,
0.042083740234375,
-0.058746337890625,
-0.01424407958984375,
-0.058380126953125,
-0.0266876220703125,
-0.0465087890625,
-0.00820159912109375,
-0.01335906982421875,
-0.0347900390625,
-0.030487060546875,
-0.004032135009765625,
0.03277587890625,
0.031646728515625,
0.00522613525390625,
0.035675048828125,
-0.045196533203125,
0.015960693359375,
0.0207977294921875,
0.026214599609375,
-0.0115203857421875,
-0.05126953125,
-0.01093292236328125,
0.0165252685546875,
-0.02398681640625,
-0.055938720703125,
0.040008544921875,
0.00717926025390625,
0.047088623046875,
0.016754150390625,
0.0022487640380859375,
0.055511474609375,
-0.038116455078125,
0.07904052734375,
0.03997802734375,
-0.06158447265625,
0.056915283203125,
-0.0265350341796875,
0.040496826171875,
0.0465087890625,
0.04351806640625,
-0.0232391357421875,
-0.05029296875,
-0.06646728515625,
-0.0809326171875,
0.07012939453125,
0.02362060546875,
0.0206298828125,
0.0018796920776367188,
0.01438140869140625,
0.005634307861328125,
0.013580322265625,
-0.0968017578125,
-0.03558349609375,
-0.03662109375,
-0.044769287109375,
-0.0038280487060546875,
-0.014984130859375,
0.006927490234375,
-0.0184173583984375,
0.07220458984375,
0.005161285400390625,
0.042266845703125,
0.012420654296875,
-0.0192718505859375,
0.0009975433349609375,
0.0175628662109375,
0.041900634765625,
0.0221710205078125,
-0.0245513916015625,
-0.0211181640625,
0.024749755859375,
-0.04449462890625,
-0.015472412109375,
0.0156402587890625,
-0.037933349609375,
0.01218414306640625,
0.03436279296875,
0.0848388671875,
0.030975341796875,
-0.037139892578125,
0.058685302734375,
-0.01558685302734375,
-0.027252197265625,
-0.0379638671875,
0.015899658203125,
0.0034465789794921875,
0.00762176513671875,
0.016845703125,
0.0003368854522705078,
0.0025119781494140625,
-0.043365478515625,
0.0023822784423828125,
0.0209808349609375,
-0.0250091552734375,
-0.007778167724609375,
0.044677734375,
0.01361846923828125,
0.0010538101196289062,
0.0654296875,
-0.007663726806640625,
-0.042266845703125,
0.0498046875,
0.049530029296875,
0.054229736328125,
0.0010471343994140625,
0.00511932373046875,
0.057220458984375,
0.02386474609375,
0.007213592529296875,
0.03826904296875,
0.006572723388671875,
-0.06390380859375,
-0.0221099853515625,
-0.049896240234375,
0.005352020263671875,
0.029571533203125,
-0.050140380859375,
0.0312042236328125,
-0.0343017578125,
-0.01450347900390625,
-0.002750396728515625,
0.0149993896484375,
-0.047637939453125,
0.0283203125,
0.026702880859375,
0.0667724609375,
-0.07379150390625,
0.06158447265625,
0.0576171875,
-0.054229736328125,
-0.067626953125,
0.004833221435546875,
-0.03253173828125,
-0.06982421875,
0.057403564453125,
0.027069091796875,
0.00855255126953125,
-0.007755279541015625,
-0.041900634765625,
-0.062744140625,
0.08135986328125,
0.0172119140625,
-0.043548583984375,
0.0004210472106933594,
0.00930023193359375,
0.03558349609375,
-0.00861358642578125,
0.0411376953125,
0.03228759765625,
0.037750244140625,
-0.0048370361328125,
-0.0675048828125,
0.007640838623046875,
-0.023773193359375,
0.01971435546875,
-0.005130767822265625,
-0.057891845703125,
0.06689453125,
-0.009552001953125,
-0.0173187255859375,
0.0106658935546875,
0.0579833984375,
0.031982421875,
0.009857177734375,
0.0535888671875,
0.0460205078125,
0.048248291015625,
-0.0202484130859375,
0.0738525390625,
-0.0200347900390625,
0.036529541015625,
0.06231689453125,
0.01090240478515625,
0.06256103515625,
0.029022216796875,
-0.01910400390625,
0.059539794921875,
0.033843994140625,
-0.0158843994140625,
0.055145263671875,
0.0020885467529296875,
0.00791168212890625,
-0.0025920867919921875,
0.0062255859375,
-0.026458740234375,
0.029998779296875,
0.025634765625,
-0.0204620361328125,
0.0015878677368164062,
0.0138397216796875,
0.025970458984375,
0.0021915435791015625,
-0.02008056640625,
0.05535888671875,
-0.006870269775390625,
-0.049072265625,
0.053375244140625,
-0.0008196830749511719,
0.08062744140625,
-0.06195068359375,
0.0226593017578125,
-0.0177459716796875,
0.006229400634765625,
-0.00682830810546875,
-0.0733642578125,
0.01084136962890625,
-0.007556915283203125,
-0.00838470458984375,
-0.0384521484375,
0.049041748046875,
-0.038726806640625,
-0.044189453125,
0.01702880859375,
0.0111846923828125,
0.006591796875,
0.030487060546875,
-0.08294677734375,
0.002349853515625,
0.0179901123046875,
-0.042083740234375,
0.0155792236328125,
0.036468505859375,
0.019195556640625,
0.038787841796875,
0.041259765625,
0.004058837890625,
0.0093231201171875,
0.01380157470703125,
0.0562744140625,
-0.0511474609375,
-0.052734375,
-0.041778564453125,
0.04608154296875,
-0.0199127197265625,
-0.036346435546875,
0.055755615234375,
0.033447265625,
0.07177734375,
-0.0248260498046875,
0.06365966796875,
-0.0128936767578125,
0.0382080078125,
-0.048248291015625,
0.05718994140625,
-0.0537109375,
-0.00919342041015625,
-0.01555633544921875,
-0.051605224609375,
-0.015716552734375,
0.07586669921875,
-0.0163726806640625,
0.0236968994140625,
0.049530029296875,
0.04248046875,
-0.0007719993591308594,
0.0031585693359375,
0.00565338134765625,
0.040496826171875,
0.006511688232421875,
0.048126220703125,
0.048675537109375,
-0.061767578125,
0.050384521484375,
-0.04742431640625,
-0.002887725830078125,
-0.0284881591796875,
-0.055328369140625,
-0.0716552734375,
-0.0294342041015625,
-0.034149169921875,
-0.01508331298828125,
-0.0063629150390625,
0.054931640625,
0.044281005859375,
-0.0638427734375,
-0.01439666748046875,
-0.01258087158203125,
-0.0041046142578125,
-0.014434814453125,
-0.0225067138671875,
0.0269927978515625,
-0.038116455078125,
-0.0618896484375,
0.00820159912109375,
-0.0177001953125,
0.0259857177734375,
0.002239227294921875,
0.0004451274871826172,
-0.04266357421875,
0.0035457611083984375,
0.045654296875,
0.0021953582763671875,
-0.050689697265625,
-0.022125244140625,
0.0085296630859375,
-0.0272216796875,
-0.001953125,
0.0238037109375,
-0.0384521484375,
0.01812744140625,
0.0357666015625,
0.020111083984375,
0.047332763671875,
-0.0180511474609375,
0.0223388671875,
-0.081298828125,
0.02984619140625,
0.013458251953125,
0.045013427734375,
0.0258331298828125,
-0.0252227783203125,
0.0360107421875,
0.0218505859375,
-0.03546142578125,
-0.059814453125,
-0.0017824172973632812,
-0.08197021484375,
-0.037628173828125,
0.090087890625,
-0.0183563232421875,
-0.02947998046875,
0.01265716552734375,
-0.0184478759765625,
0.032745361328125,
-0.024871826171875,
0.041961669921875,
0.049560546875,
0.0005254745483398438,
-0.01139068603515625,
-0.032562255859375,
0.0257110595703125,
0.045806884765625,
-0.047332763671875,
-0.0157623291015625,
0.02264404296875,
0.0406494140625,
0.01824951171875,
0.04779052734375,
0.0054168701171875,
0.011688232421875,
0.002468109130859375,
0.0236663818359375,
0.00960540771484375,
0.002532958984375,
-0.0139007568359375,
-0.008026123046875,
-0.020599365234375,
-0.023529052734375
]
] |
Helsinki-NLP/opus-mt-pl-en | 2023-08-16T12:02:38.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"pl",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-pl-en | 14 | 84,546 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-pl-en
* source languages: pl
* target languages: en
* OPUS readme: [pl-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/pl-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/pl-en/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/pl-en/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/pl-en/opus-2019-12-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.pl.en | 54.9 | 0.701 |
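## Usage

The model can be loaded directly from the Hugging Face Hub with the transformers `pipeline` API. A minimal sketch, assuming `transformers` and `sentencepiece` are installed; the Polish example sentence is illustrative only:

```python
from transformers import pipeline

# Load the Polish→English Marian model from the Hub.
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-pl-en")

# Translate a single sentence; `result` is a list of dicts,
# one per input, each with a "translation_text" key.
result = pipe("Ala ma kota.")
print(result[0]["translation_text"])
```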
| 818 | [
[
-0.0178375244140625,
-0.02947998046875,
0.0216522216796875,
0.0330810546875,
-0.037872314453125,
-0.0272216796875,
-0.031829833984375,
-0.0033702850341796875,
-0.0029850006103515625,
0.036346435546875,
-0.05010986328125,
-0.039764404296875,
-0.042205810546875,
0.0169525146484375,
-0.003940582275390625,
0.05328369140625,
-0.01108551025390625,
0.034942626953125,
0.01396942138671875,
-0.0343017578125,
-0.02154541015625,
-0.033782958984375,
-0.034881591796875,
-0.02252197265625,
0.0168304443359375,
0.022674560546875,
0.0257568359375,
0.03118896484375,
0.069091796875,
0.0164642333984375,
-0.00604248046875,
0.008758544921875,
-0.034942626953125,
-0.0069122314453125,
0.0059356689453125,
-0.043609619140625,
-0.054229736328125,
-0.00637054443359375,
0.0794677734375,
0.0310821533203125,
-0.0025348663330078125,
0.0294952392578125,
-0.0019044876098632812,
0.06866455078125,
-0.020782470703125,
0.00853729248046875,
-0.0445556640625,
0.005435943603515625,
-0.0290069580078125,
-0.0239410400390625,
-0.05279541015625,
-0.018768310546875,
0.01038360595703125,
-0.05084228515625,
-0.00229644775390625,
0.01343536376953125,
0.1026611328125,
0.0222320556640625,
-0.0222320556640625,
-0.01314544677734375,
-0.03985595703125,
0.0784912109375,
-0.06268310546875,
0.044036865234375,
0.0296783447265625,
0.0189056396484375,
0.01953125,
-0.041534423828125,
-0.0231781005859375,
0.006740570068359375,
-0.01445770263671875,
0.01552581787109375,
-0.00011456012725830078,
-0.0165863037109375,
0.0264434814453125,
0.055328369140625,
-0.057952880859375,
-0.0007028579711914062,
-0.04229736328125,
-0.0019245147705078125,
0.0521240234375,
0.00628662109375,
0.0134429931640625,
-0.01241302490234375,
-0.034271240234375,
-0.039154052734375,
-0.05853271484375,
0.0109100341796875,
0.02880859375,
0.0202178955078125,
-0.03973388671875,
0.053924560546875,
-0.00543975830078125,
0.048126220703125,
-0.0030841827392578125,
-0.00182342529296875,
0.07354736328125,
-0.031768798828125,
-0.024932861328125,
-0.01192474365234375,
0.0870361328125,
0.026824951171875,
0.0073699951171875,
0.006694793701171875,
-0.0187225341796875,
-0.0197296142578125,
0.0020313262939453125,
-0.06304931640625,
-0.00928497314453125,
0.0147247314453125,
-0.036163330078125,
-0.003082275390625,
0.0077972412109375,
-0.04931640625,
0.0095672607421875,
-0.03173828125,
0.0413818359375,
-0.051666259765625,
-0.019134521484375,
0.0255126953125,
-0.0011167526245117188,
0.0296783447265625,
-0.0005202293395996094,
-0.044219970703125,
0.011962890625,
0.033660888671875,
0.05609130859375,
-0.031036376953125,
-0.0250396728515625,
-0.039520263671875,
-0.01285552978515625,
-0.0034770965576171875,
0.048431396484375,
-0.004459381103515625,
-0.0328369140625,
-0.0036792755126953125,
0.03338623046875,
-0.0300750732421875,
-0.0278472900390625,
0.097900390625,
-0.0177459716796875,
0.053741455078125,
-0.033203125,
-0.03924560546875,
-0.023040771484375,
0.0322265625,
-0.04229736328125,
0.09283447265625,
0.00981903076171875,
-0.06170654296875,
0.01629638671875,
-0.061981201171875,
-0.0187835693359375,
0.0002205371856689453,
0.00897216796875,
-0.04864501953125,
0.00922393798828125,
0.01025390625,
0.02520751953125,
-0.0226898193359375,
0.02374267578125,
0.00021648406982421875,
-0.0231475830078125,
0.00853729248046875,
-0.0266571044921875,
0.078125,
0.0202178955078125,
-0.0223236083984375,
0.0193328857421875,
-0.070068359375,
-0.003276824951171875,
-0.00278472900390625,
-0.039520263671875,
-0.0167388916015625,
0.008941650390625,
0.0183258056640625,
0.0113677978515625,
0.02984619140625,
-0.0516357421875,
0.0177764892578125,
-0.05322265625,
0.016357421875,
0.048858642578125,
-0.02630615234375,
0.0284423828125,
-0.03497314453125,
0.01947021484375,
0.006404876708984375,
0.01244354248046875,
0.000804901123046875,
-0.03607177734375,
-0.061309814453125,
-0.0183563232421875,
0.045654296875,
0.08013916015625,
-0.059356689453125,
0.0614013671875,
-0.048858642578125,
-0.051971435546875,
-0.06427001953125,
-0.00799560546875,
0.035430908203125,
0.0281829833984375,
0.037078857421875,
-0.0124053955078125,
-0.036651611328125,
-0.07745361328125,
-0.00853729248046875,
-0.01174163818359375,
-0.0204925537109375,
0.01332855224609375,
0.044464111328125,
-0.005893707275390625,
0.037841796875,
-0.03546142578125,
-0.029876708984375,
-0.017730712890625,
0.0110015869140625,
0.036468505859375,
0.0478515625,
0.03936767578125,
-0.0650634765625,
-0.04852294921875,
-0.0024471282958984375,
-0.052703857421875,
-0.0186309814453125,
0.004917144775390625,
-0.01325225830078125,
0.011444091796875,
0.0013065338134765625,
-0.0256195068359375,
0.0058135986328125,
0.0484619140625,
-0.042694091796875,
0.04327392578125,
-0.013946533203125,
0.0173187255859375,
-0.0947265625,
0.0150909423828125,
-0.01263427734375,
-0.0060272216796875,
-0.031982421875,
0.0041046142578125,
0.0212860107421875,
0.003032684326171875,
-0.06536865234375,
0.04132080078125,
-0.018035888671875,
-0.006988525390625,
0.0197601318359375,
0.0016460418701171875,
0.006866455078125,
0.05230712890625,
-0.0016536712646484375,
0.06585693359375,
0.0538330078125,
-0.03680419921875,
0.00933074951171875,
0.04248046875,
-0.0316162109375,
0.030303955078125,
-0.060760498046875,
-0.0211639404296875,
0.024139404296875,
-0.00969696044921875,
-0.047882080078125,
0.01105499267578125,
0.02288818359375,
-0.042755126953125,
0.0279083251953125,
-0.0011920928955078125,
-0.060516357421875,
-0.00457763671875,
-0.0277099609375,
0.03375244140625,
0.05218505859375,
-0.0146636962890625,
0.0460205078125,
0.00695037841796875,
-0.001949310302734375,
-0.0302276611328125,
-0.07135009765625,
-0.01163482666015625,
-0.02679443359375,
-0.055633544921875,
0.0172882080078125,
-0.0361328125,
-0.0039520263671875,
-0.0018863677978515625,
0.023651123046875,
-0.004932403564453125,
0.0002918243408203125,
0.005329132080078125,
0.01554107666015625,
-0.03515625,
0.0055084228515625,
-0.00225830078125,
-0.01140594482421875,
-0.01052093505859375,
-0.00644683837890625,
0.0494384765625,
-0.025299072265625,
-0.0171661376953125,
-0.047088623046875,
0.00457000732421875,
0.045074462890625,
-0.0345458984375,
0.0599365234375,
0.041656494140625,
-0.001094818115234375,
0.0076751708984375,
-0.0318603515625,
0.01045989990234375,
-0.03240966796875,
0.003040313720703125,
-0.033721923828125,
-0.0538330078125,
0.0418701171875,
0.0090789794921875,
0.032806396484375,
0.061126708984375,
0.045013427734375,
0.0042266845703125,
0.046478271484375,
0.0233154296875,
0.0006556510925292969,
0.029266357421875,
-0.03485107421875,
-0.0120086669921875,
-0.08258056640625,
0.007617950439453125,
-0.04864501953125,
-0.0232086181640625,
-0.056884765625,
-0.0211334228515625,
0.0211334228515625,
-0.0003120899200439453,
-0.0205230712890625,
0.046539306640625,
-0.0428466796875,
0.01479339599609375,
0.047637939453125,
-0.008575439453125,
0.023681640625,
-0.0011148452758789062,
-0.0391845703125,
-0.019622802734375,
-0.0330810546875,
-0.037017822265625,
0.09808349609375,
0.0279388427734375,
0.0203399658203125,
0.015350341796875,
0.03314208984375,
-0.001850128173828125,
0.0190582275390625,
-0.041412353515625,
0.037445068359375,
-0.030303955078125,
-0.04754638671875,
-0.0225677490234375,
-0.044097900390625,
-0.05853271484375,
0.036346435546875,
-0.015838623046875,
-0.033660888671875,
0.009735107421875,
-0.004726409912109375,
-0.01131439208984375,
0.036163330078125,
-0.0506591796875,
0.08306884765625,
-0.01343536376953125,
-0.0061798095703125,
0.024566650390625,
-0.037353515625,
0.0242462158203125,
-0.0027675628662109375,
0.019775390625,
-0.0141143798828125,
0.008697509765625,
0.047271728515625,
-0.00873565673828125,
0.03564453125,
-0.006038665771484375,
-0.005802154541015625,
0.00498199462890625,
0.005542755126953125,
0.0261077880859375,
-0.01055908203125,
-0.037689208984375,
0.02911376953125,
0.0032749176025390625,
-0.0299835205078125,
-0.005741119384765625,
0.03753662109375,
-0.05462646484375,
-0.001964569091796875,
-0.031982421875,
-0.056854248046875,
0.004749298095703125,
0.0262603759765625,
0.05218505859375,
0.04779052734375,
-0.01453399658203125,
0.039337158203125,
0.06304931640625,
-0.0292205810546875,
0.0312347412109375,
0.059783935546875,
-0.01233673095703125,
-0.042572021484375,
0.06365966796875,
0.00846099853515625,
0.027618408203125,
0.04705810546875,
0.01123809814453125,
-0.015350341796875,
-0.05755615234375,
-0.04974365234375,
0.0248870849609375,
-0.019561767578125,
-0.01220703125,
-0.035980224609375,
-0.00806427001953125,
-0.0210418701171875,
0.01953125,
-0.035064697265625,
-0.0386962890625,
-0.01256561279296875,
-0.0116119384765625,
0.01023101806640625,
0.01544189453125,
-0.003032684326171875,
0.039398193359375,
-0.075927734375,
0.014801025390625,
-0.00933074951171875,
0.0300750732421875,
-0.0352783203125,
-0.05926513671875,
-0.0266876220703125,
0.00414276123046875,
-0.05078125,
-0.048004150390625,
0.04449462890625,
0.01342010498046875,
0.0214691162109375,
0.02264404296875,
0.016143798828125,
0.025604248046875,
-0.0531005859375,
0.07574462890625,
-0.006031036376953125,
-0.056640625,
0.03857421875,
-0.0325927734375,
0.03411865234375,
0.06658935546875,
0.0169525146484375,
-0.0206756591796875,
-0.037811279296875,
-0.05279541015625,
-0.05804443359375,
0.059295654296875,
0.050323486328125,
-0.015869140625,
0.01953125,
-0.00676727294921875,
0.0035247802734375,
0.017547607421875,
-0.0860595703125,
-0.0269622802734375,
0.00389862060546875,
-0.026397705078125,
-0.0165557861328125,
-0.0213165283203125,
-0.0180511474609375,
-0.0135345458984375,
0.077392578125,
0.013031005859375,
0.011016845703125,
0.0312042236328125,
-0.0124664306640625,
-0.0186767578125,
0.0244293212890625,
0.0784912109375,
0.043212890625,
-0.0450439453125,
-0.01175689697265625,
0.0178375244140625,
-0.029022216796875,
-0.0142364501953125,
0.01168060302734375,
-0.0313720703125,
0.02154541015625,
0.03411865234375,
0.0791015625,
0.01454925537109375,
-0.052215576171875,
0.035369873046875,
-0.023681640625,
-0.034088134765625,
-0.04541015625,
-0.015289306640625,
0.00481414794921875,
-0.002239227294921875,
0.0183563232421875,
0.00846099853515625,
0.00859832763671875,
-0.0156402587890625,
0.0125274658203125,
0.006038665771484375,
-0.042633056640625,
-0.0423583984375,
0.03826904296875,
0.01074981689453125,
-0.02606201171875,
0.038726806640625,
-0.031341552734375,
-0.04571533203125,
0.0282440185546875,
0.01210784912109375,
0.0718994140625,
-0.0183563232421875,
-0.01230621337890625,
0.04974365234375,
0.04571533203125,
-0.020111083984375,
0.040069580078125,
0.01129150390625,
-0.04864501953125,
-0.04266357421875,
-0.06390380859375,
-0.01500701904296875,
0.013153076171875,
-0.054229736328125,
0.0236053466796875,
0.01904296875,
0.0059356689453125,
-0.0284271240234375,
0.0189056396484375,
-0.037261962890625,
0.0031528472900390625,
-0.0169677734375,
0.08123779296875,
-0.0711669921875,
0.068603515625,
0.035858154296875,
-0.0211029052734375,
-0.0609130859375,
-0.021087646484375,
-0.0178985595703125,
-0.035919189453125,
0.049591064453125,
0.01271820068359375,
0.0328369140625,
-0.0111541748046875,
-0.01385498046875,
-0.0577392578125,
0.0821533203125,
0.0163116455078125,
-0.045806884765625,
0.002803802490234375,
0.0203704833984375,
0.03875732421875,
-0.0283203125,
0.0017385482788085938,
0.02960205078125,
0.057037353515625,
0.0011157989501953125,
-0.087646484375,
-0.0210723876953125,
-0.0379638671875,
-0.0258331298828125,
0.039306640625,
-0.0423583984375,
0.07958984375,
0.03558349609375,
-0.01092529296875,
0.01157379150390625,
0.040191650390625,
0.0308074951171875,
0.0294952392578125,
0.03753662109375,
0.08984375,
0.027069091796875,
-0.03570556640625,
0.07965087890625,
-0.0277099609375,
0.03570556640625,
0.08966064453125,
-0.0116729736328125,
0.0711669921875,
0.0254974365234375,
-0.01082611083984375,
0.0343017578125,
0.044464111328125,
-0.02392578125,
0.037841796875,
0.005527496337890625,
0.01387786865234375,
-0.00794219970703125,
0.01515960693359375,
-0.05029296875,
0.0129547119140625,
0.017364501953125,
-0.0192718505859375,
0.0012083053588867188,
-0.0014257431030273438,
-0.0045928955078125,
-0.0006399154663085938,
-0.01355743408203125,
0.044525146484375,
0.00035881996154785156,
-0.04425048828125,
0.05096435546875,
-0.00817108154296875,
0.053924560546875,
-0.0535888671875,
0.0100250244140625,
-0.004100799560546875,
0.0171661376953125,
0.001087188720703125,
-0.047943115234375,
0.03411865234375,
-0.003986358642578125,
-0.0174560546875,
-0.0361328125,
0.0153961181640625,
-0.04254150390625,
-0.06585693359375,
0.0272674560546875,
0.035369873046875,
0.0259857177734375,
0.002716064453125,
-0.06842041015625,
0.00696563720703125,
0.01111602783203125,
-0.040618896484375,
0.007579803466796875,
0.05291748046875,
0.0253143310546875,
0.036102294921875,
0.0496826171875,
0.023834228515625,
0.0197601318359375,
-0.0010690689086914062,
0.046600341796875,
-0.0272674560546875,
-0.03350830078125,
-0.058685302734375,
0.06317138671875,
-0.0123138427734375,
-0.0504150390625,
0.0552978515625,
0.0784912109375,
0.0738525390625,
-0.01482391357421875,
0.022735595703125,
-0.003582000732421875,
0.05694580078125,
-0.0484619140625,
0.0458984375,
-0.064208984375,
0.024078369140625,
-0.0024662017822265625,
-0.066162109375,
-0.0197296142578125,
0.0234222412109375,
-0.02008056640625,
-0.03338623046875,
0.05743408203125,
0.045562744140625,
-0.0151824951171875,
-0.0172576904296875,
0.020294189453125,
0.0225830078125,
0.017120361328125,
0.044952392578125,
0.0282745361328125,
-0.0718994140625,
0.040069580078125,
-0.0255889892578125,
-0.00579071044921875,
0.0002713203430175781,
-0.055633544921875,
-0.06304931640625,
-0.046173095703125,
-0.01143646240234375,
-0.0130767822265625,
-0.0192718505859375,
0.06085205078125,
0.038787841796875,
-0.06927490234375,
-0.045623779296875,
0.0031280517578125,
0.01126861572265625,
-0.016387939453125,
-0.02001953125,
0.050201416015625,
-0.02410888671875,
-0.070556640625,
0.034210205078125,
0.007457733154296875,
-0.004192352294921875,
-0.0009698867797851562,
-0.0217132568359375,
-0.037109375,
-0.007122039794921875,
0.025909423828125,
0.003093719482421875,
-0.042572021484375,
0.00988006591796875,
0.00519561767578125,
-0.0047607421875,
0.030120849609375,
0.03167724609375,
-0.01904296875,
0.01751708984375,
0.06439208984375,
0.0261993408203125,
0.0265045166015625,
-0.00463104248046875,
0.04254150390625,
-0.056976318359375,
0.025390625,
0.01641845703125,
0.046417236328125,
0.0299224853515625,
-0.004878997802734375,
0.0633544921875,
0.018768310546875,
-0.049896240234375,
-0.08184814453125,
0.0038013458251953125,
-0.097412109375,
-0.0035533905029296875,
0.06866455078125,
-0.0248870849609375,
-0.0229339599609375,
0.0236358642578125,
-0.01593017578125,
0.01064300537109375,
-0.0277099609375,
0.0293426513671875,
0.06146240234375,
0.0235748291015625,
0.00077056884765625,
-0.05645751953125,
0.0279541015625,
0.037811279296875,
-0.05230712890625,
-0.0108795166015625,
0.011505126953125,
0.0109100341796875,
0.0302581787109375,
0.0352783203125,
-0.022003173828125,
0.005977630615234375,
-0.0257110595703125,
0.0277099609375,
-0.009490966796875,
-0.009063720703125,
-0.0240325927734375,
0.00873565673828125,
-0.0064544677734375,
-0.0179443359375
]
] |
Helsinki-NLP/opus-mt-tc-big-en-pt | 2023-10-10T10:20:34.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"marian",
"text2text-generation",
"translation",
"opus-mt-tc",
"en",
"pt",
"pt_br",
"license:cc-by-4.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-tc-big-en-pt | 17 | 84,461 | transformers | 2022-04-13T14:49:04 | ---
language:
- en
- pt
- pt_br
tags:
- translation
- opus-mt-tc
license: cc-by-4.0
model-index:
- name: opus-mt-tc-big-en-pt
results:
- task:
name: Translation eng-por
type: translation
args: eng-por
dataset:
name: flores101-devtest
type: flores_101
args: eng por devtest
metrics:
- name: BLEU
type: bleu
value: 50.4
- task:
name: Translation eng-por
type: translation
args: eng-por
dataset:
name: tatoeba-test-v2021-08-07
type: tatoeba_mt
args: eng-por
metrics:
- name: BLEU
type: bleu
value: 49.6
---
# opus-mt-tc-big-en-pt
Neural machine translation model for translating from English (en) to Portuguese (pt).
This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models are originally trained using the amazing framework of [Marian NMT](https://marian-nmt.github.io/), an efficient NMT implementation written in pure C++. The models have been converted to PyTorch using the transformers library by Hugging Face. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines use the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).
* Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please cite them if you use this model.)
```
@inproceedings{tiedemann-thottingal-2020-opus,
title = "{OPUS}-{MT} {--} Building open translation services for the World",
author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh},
booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
month = nov,
year = "2020",
address = "Lisboa, Portugal",
publisher = "European Association for Machine Translation",
url = "https://aclanthology.org/2020.eamt-1.61",
pages = "479--480",
}
@inproceedings{tiedemann-2020-tatoeba,
title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
author = {Tiedemann, J{\"o}rg},
booktitle = "Proceedings of the Fifth Conference on Machine Translation",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.wmt-1.139",
pages = "1174--1182",
}
```
## Model info
* Release: 2022-03-13
* source language(s): eng
* target language(s): pob por
* valid target language labels: >>pob<< >>por<<
* model: transformer-big
* data: opusTCv20210807+bt ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge))
* tokenization: SentencePiece (spm32k,spm32k)
* original model: [opusTCv20210807+bt_transformer-big_2022-03-13.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-por/opusTCv20210807+bt_transformer-big_2022-03-13.zip)
* more information about released models: [OPUS-MT eng-por README](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-por/README.md)
* more information about the model: [MarianMT](https://huggingface.co/docs/transformers/model_doc/marian)
This is a multilingual translation model with multiple target languages. A sentence-initial language token is required in the form `>>id<<` (id = a valid target language ID), e.g. `>>pob<<`.
## Usage
A short example code:
```python
from transformers import MarianMTModel, MarianTokenizer
src_text = [
">>por<< Tom tried to stab me.",
">>por<< He has been to Hawaii several times."
]
model_name = "Helsinki-NLP/opus-mt-tc-big-en-pt"  # Hub model ID (the original snippet used a local "pytorch-models/..." path)
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))
for t in translated:
print( tokenizer.decode(t, skip_special_tokens=True) )
# expected output:
# O Tom tentou esfaquear-me.
# Ele já esteve no Havaí várias vezes.
```
You can also use OPUS-MT models with the transformers pipelines, for example:
```python
from transformers import pipeline
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-en-pt")
print(pipe(">>por<< Tom tried to stab me."))
# expected output: O Tom tentou esfaquear-me.
```
## Benchmarks
* test set translations: [opusTCv20210807+bt_transformer-big_2022-03-13.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-por/opusTCv20210807+bt_transformer-big_2022-03-13.test.txt)
* test set scores: [opusTCv20210807+bt_transformer-big_2022-03-13.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-por/opusTCv20210807+bt_transformer-big_2022-03-13.eval.txt)
* benchmark results: [benchmark_results.txt](benchmark_results.txt)
* benchmark output: [benchmark_translations.zip](benchmark_translations.zip)
| langpair | testset | chr-F | BLEU | #sent | #words |
|----------|---------|-------|-------|-------|--------|
| eng-por | tatoeba-test-v2021-08-07 | 0.69320 | 49.6 | 13222 | 105265 |
| eng-por | flores101-devtest | 0.71673 | 50.4 | 1012 | 26519 |
## Acknowledgements
The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
## Model conversion info
* transformers version: 4.16.2
* OPUS-MT git hash: 3405783
* port time: Wed Apr 13 17:48:54 EEST 2022
* port machine: LM0-400-22516.local
<!-- model: cardiffnlp/twitter-roberta-base-emotion · author: cardiffnlp · pipeline: text-classification -->

# Twitter-roBERTa-base for Emotion Recognition
This is a RoBERTa-base model trained on ~58M tweets and fine-tuned for emotion recognition with the TweetEval benchmark.

- Paper: [_TweetEval_ benchmark (Findings of EMNLP 2020)](https://arxiv.org/pdf/2010.12421.pdf).
- Git Repo: [TweetEval official repository](https://github.com/cardiffnlp/tweeteval).
<b>New!</b> We just released a new emotion recognition model trained with more emotion types and with a newer RoBERTa-based model.
See [twitter-roberta-base-emotion-multilabel-latest](https://huggingface.co/cardiffnlp/twitter-roberta-base-emotion-multilabel-latest) and [TweetNLP](https://github.com/cardiffnlp/tweetnlp) for more details.
## Example of classification
```python
from transformers import AutoModelForSequenceClassification
from transformers import TFAutoModelForSequenceClassification
from transformers import AutoTokenizer
import numpy as np
from scipy.special import softmax
import csv
import urllib.request

# Preprocess text (username and link placeholders)
def preprocess(text):
    new_text = []
    for t in text.split(" "):
        t = '@user' if t.startswith('@') and len(t) > 1 else t
        t = 'http' if t.startswith('http') else t
        new_text.append(t)
    return " ".join(new_text)

# Tasks:
# emoji, emotion, hate, irony, offensive, sentiment
# stance/abortion, stance/atheism, stance/climate, stance/feminist, stance/hillary
task = 'emotion'
MODEL = f"cardiffnlp/twitter-roberta-base-{task}"

tokenizer = AutoTokenizer.from_pretrained(MODEL)

# Download label mapping
mapping_link = f"https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/{task}/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
    html = f.read().decode('utf-8').split("\n")
    csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]

# PT
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.save_pretrained(MODEL)

text = "Celebrating my promotion 😎"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
scores = softmax(scores)

# # TF
# model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
# model.save_pretrained(MODEL)
# text = "Celebrating my promotion 😎"
# encoded_input = tokenizer(text, return_tensors='tf')
# output = model(encoded_input)
# scores = output[0][0].numpy()
# scores = softmax(scores)

ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(scores.shape[0]):
    l = labels[ranking[i]]
    s = scores[ranking[i]]
    print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
Output:
```
1) joy 0.9382
2) optimism 0.0362
3) anger 0.0145
4) sadness 0.0112
```
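The ranking step at the end of the script can be exercised in isolation, without downloading the model. The snippet below uses made-up logits (illustrative values, not real model outputs) together with the emotion task's four labels to show how raw scores become the ranked list above; the hand-rolled softmax is equivalent to `scipy.special.softmax` for a 1-D array:

```python
import numpy as np

labels = ["anger", "joy", "optimism", "sadness"]  # label order from the task's mapping.txt
logits = np.array([-0.3, 4.2, 1.0, 0.1])          # illustrative values, not real outputs

# Numerically stable softmax, equivalent to scipy.special.softmax for 1-D input.
exp = np.exp(logits - logits.max())
scores = exp / exp.sum()

ranking = np.argsort(scores)[::-1]
for rank, idx in enumerate(ranking, start=1):
    print(f"{rank}) {labels[idx]} {np.round(float(scores[idx]), 4)}")
```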
EleutherAI/pythia-70m-deduped | 2023-07-09T16:07:33.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-70m-deduped | 16 | 83,703 | transformers | 2023-02-13T16:01:41 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/the_pile_deduplicated
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-70M-deduped
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
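The checkpoint layout described above can be enumerated programmatically. This is an illustrative sketch, not an official utility; the branch names simply follow the `step{N}` pattern from the text:

```python
# Enumerate the 154 checkpoint branches: step0, ten log-spaced
# checkpoints (steps 1 through 512), and 143 evenly spaced
# checkpoints from step1000 to step143000.
log_spaced = [2 ** i for i in range(10)]          # 1, 2, 4, ..., 512
evenly_spaced = list(range(1000, 143_001, 1000))  # 1000, 2000, ..., 143000
steps = [0] + log_spaced + evenly_spaced
branches = [f"step{s}" for s in steps]

print(len(branches))                 # 154
print(branches[0], branches[-1])     # step0 step143000
```

Any of these names can be passed as the `revision` argument when loading a model, as shown in the Quickstart below.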
You may also further fine-tune and adapt Pythia-70M-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-70M-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-70M-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-70M-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on Pythia-70M-deduped to produce factually accurate
output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-70M-deduped may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-70M-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the `step3000` checkpoint of `pythia-70m-deduped`:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
## Training
### Training data
Pythia-70M-deduped was trained on the Pile **after the dataset has been globally
deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for 143000 steps at a batch size
of 2M (2,097,152 tokens).<br>
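These figures are mutually consistent, as a quick arithmetic check shows (all numbers taken from the text above):

```python
# Cross-check the training figures quoted above.
batch_tokens = 2_097_152                      # tokens per optimization step (2M)
total_steps = 143_000                         # total training steps
checkpoint_interval_tokens = 2_097_152_000    # tokens between saved checkpoints

total_tokens = batch_tokens * total_steps
print(total_tokens)                                # 299892736000 tokens seen
print(checkpoint_interval_tokens // batch_tokens)  # 1000 steps per checkpoint
```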
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as
[GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
  models of 2.8B parameters or smaller had a learning rate (LR) schedule
  that decayed to a minimum of 10% of the starting LR, while the 6.9B and
  12B models used an LR schedule that decayed to a minimum LR of 0. In
  the redone training runs, we rectified this inconsistency: all models are now
  trained with the LR decaying to a minimum of 0.1× their maximum LR.
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 13,654 | [
[
-0.0250396728515625,
-0.060089111328125,
0.0249481201171875,
0.0026569366455078125,
-0.017913818359375,
-0.0145263671875,
-0.0170745849609375,
-0.032012939453125,
0.0133056640625,
0.0140380859375,
-0.0257110595703125,
-0.0230712890625,
-0.032867431640625,
-0.0027790069580078125,
-0.03521728515625,
0.08221435546875,
-0.00914764404296875,
-0.01123046875,
0.00919342041015625,
-0.003643035888671875,
-0.00475311279296875,
-0.041839599609375,
-0.03411865234375,
-0.0303497314453125,
0.04510498046875,
0.01412200927734375,
0.06536865234375,
0.0435791015625,
0.01248931884765625,
0.022369384765625,
-0.0296173095703125,
-0.005855560302734375,
-0.010894775390625,
-0.00690460205078125,
-0.00180816650390625,
-0.0197601318359375,
-0.056304931640625,
0.001140594482421875,
0.0523681640625,
0.049957275390625,
-0.0122222900390625,
0.018890380859375,
-0.00022017955780029297,
0.028411865234375,
-0.03887939453125,
0.003719329833984375,
-0.0263671875,
-0.01401519775390625,
-0.004459381103515625,
0.01261138916015625,
-0.02825927734375,
-0.025238037109375,
0.032562255859375,
-0.04656982421875,
0.020782470703125,
0.0030269622802734375,
0.08868408203125,
-0.00836181640625,
-0.03216552734375,
-0.0045013427734375,
-0.05548095703125,
0.051361083984375,
-0.055206298828125,
0.024871826171875,
0.0223388671875,
0.0122833251953125,
-0.00201416015625,
-0.06597900390625,
-0.042694091796875,
-0.0165252685546875,
-0.00922393798828125,
-0.003246307373046875,
-0.047271728515625,
0.0010004043579101562,
0.039337158203125,
0.0474853515625,
-0.061798095703125,
-0.003376007080078125,
-0.0281982421875,
-0.025115966796875,
0.0262451171875,
0.005764007568359375,
0.032958984375,
-0.0242767333984375,
-0.0005636215209960938,
-0.0290374755859375,
-0.0517578125,
-0.0187530517578125,
0.03985595703125,
0.00505828857421875,
-0.026824951171875,
0.03814697265625,
-0.030853271484375,
0.04425048828125,
-0.005252838134765625,
0.01861572265625,
0.032623291015625,
-0.01336669921875,
-0.03955078125,
-0.00492095947265625,
0.0699462890625,
0.008758544921875,
0.01617431640625,
-0.0004112720489501953,
-0.004108428955078125,
0.004825592041015625,
0.0034656524658203125,
-0.08447265625,
-0.058624267578125,
0.0177154541015625,
-0.0297393798828125,
-0.03131103515625,
-0.01393890380859375,
-0.0701904296875,
-0.01422882080078125,
-0.01690673828125,
0.043243408203125,
-0.0386962890625,
-0.054962158203125,
-0.00848388671875,
0.00008302927017211914,
0.01497650146484375,
0.0279083251953125,
-0.0689697265625,
0.02923583984375,
0.032745361328125,
0.076171875,
0.0178070068359375,
-0.040771484375,
-0.0142059326171875,
-0.0177459716796875,
-0.0084228515625,
0.0258941650390625,
-0.01039886474609375,
-0.01554107666015625,
-0.00846099853515625,
0.01358795166015625,
-0.00868988037109375,
-0.0270843505859375,
0.0298004150390625,
-0.031097412109375,
0.01934814453125,
-0.0210418701171875,
-0.031646728515625,
-0.0301361083984375,
0.0105743408203125,
-0.046844482421875,
0.063232421875,
0.0185546875,
-0.07208251953125,
0.016448974609375,
-0.017608642578125,
-0.005096435546875,
-0.0022792816162109375,
0.01568603515625,
-0.05047607421875,
0.0015544891357421875,
0.0254058837890625,
0.004848480224609375,
-0.0300140380859375,
0.0161895751953125,
-0.0185546875,
-0.033294677734375,
0.013580322265625,
-0.04296875,
0.069580078125,
0.0162200927734375,
-0.05133056640625,
0.0196533203125,
-0.045440673828125,
0.01406097412109375,
0.018707275390625,
-0.02825927734375,
0.001850128173828125,
-0.01486968994140625,
0.026275634765625,
0.01611328125,
0.013031005859375,
-0.0279998779296875,
0.0216064453125,
-0.0382080078125,
0.0567626953125,
0.05548095703125,
-0.007350921630859375,
0.035491943359375,
-0.031463623046875,
0.0355224609375,
0.001800537109375,
0.014617919921875,
-0.0028476715087890625,
-0.044281005859375,
-0.075439453125,
-0.02362060546875,
0.0280914306640625,
0.0233001708984375,
-0.0362548828125,
0.034515380859375,
-0.0189971923828125,
-0.06524658203125,
-0.01149749755859375,
-0.00618743896484375,
0.0310821533203125,
0.024627685546875,
0.03228759765625,
-0.01256561279296875,
-0.0386962890625,
-0.06591796875,
-0.01422119140625,
-0.032135009765625,
0.0089874267578125,
0.01285552978515625,
0.07110595703125,
-0.01114654541015625,
0.043975830078125,
-0.027252197265625,
0.017364501953125,
-0.0284881591796875,
0.01209259033203125,
0.033233642578125,
0.04656982421875,
0.0283050537109375,
-0.04296875,
-0.0288543701171875,
-0.0006170272827148438,
-0.04351806640625,
0.007068634033203125,
0.0032367706298828125,
-0.0240478515625,
0.023651123046875,
0.006328582763671875,
-0.07568359375,
0.035430908203125,
0.046234130859375,
-0.040252685546875,
0.059112548828125,
-0.0255584716796875,
0.0010499954223632812,
-0.08056640625,
0.021026611328125,
0.0081024169921875,
-0.017608642578125,
-0.0438232421875,
0.00402069091796875,
0.01403045654296875,
-0.01611328125,
-0.031829833984375,
0.0450439453125,
-0.04083251953125,
-0.01137542724609375,
-0.017364501953125,
0.0058441162109375,
-0.0019254684448242188,
0.04718017578125,
0.0124359130859375,
0.0411376953125,
0.0596923828125,
-0.05938720703125,
0.0321044921875,
0.016571044921875,
-0.0208740234375,
0.02880859375,
-0.0679931640625,
0.0131072998046875,
0.00484466552734375,
0.031402587890625,
-0.046051025390625,
-0.0245361328125,
0.04132080078125,
-0.041351318359375,
0.01268768310546875,
-0.032440185546875,
-0.04058837890625,
-0.032745361328125,
-0.01244354248046875,
0.0465087890625,
0.059173583984375,
-0.044281005859375,
0.052459716796875,
0.00494384765625,
0.00909423828125,
-0.029022216796875,
-0.041290283203125,
-0.0182342529296875,
-0.040863037109375,
-0.049102783203125,
0.030029296875,
0.0129241943359375,
-0.014923095703125,
0.0025730133056640625,
0.00048089027404785156,
0.00818634033203125,
-0.0030574798583984375,
0.02569580078125,
0.0257720947265625,
-0.0025177001953125,
0.0028781890869140625,
-0.0099334716796875,
-0.0097198486328125,
-0.000247955322265625,
-0.03704833984375,
0.07525634765625,
-0.022003173828125,
-0.014678955078125,
-0.060333251953125,
-0.0008091926574707031,
0.0672607421875,
-0.031829833984375,
0.065185546875,
0.045989990234375,
-0.05364990234375,
0.0110626220703125,
-0.02764892578125,
-0.021728515625,
-0.033203125,
0.050262451171875,
-0.0216217041015625,
-0.0271148681640625,
0.046234130859375,
0.021484375,
0.020111083984375,
0.044769287109375,
0.056549072265625,
0.01873779296875,
0.09100341796875,
0.032989501953125,
-0.01259613037109375,
0.04766845703125,
-0.041046142578125,
0.016265869140625,
-0.08282470703125,
-0.01285552978515625,
-0.040863037109375,
-0.02032470703125,
-0.0697021484375,
-0.023895263671875,
0.0233612060546875,
0.017303466796875,
-0.05767822265625,
0.0423583984375,
-0.042724609375,
0.00463104248046875,
0.049072265625,
0.0186767578125,
0.0154571533203125,
0.0152435302734375,
0.005786895751953125,
-0.0028743743896484375,
-0.04986572265625,
-0.0272979736328125,
0.09234619140625,
0.037139892578125,
0.047271728515625,
0.0208740234375,
0.054595947265625,
-0.00958251953125,
0.0177764892578125,
-0.05218505859375,
0.0333251953125,
0.0234222412109375,
-0.053924560546875,
-0.01534271240234375,
-0.05950927734375,
-0.07147216796875,
0.036468505859375,
0.007022857666015625,
-0.08245849609375,
0.0167999267578125,
0.0172576904296875,
-0.027008056640625,
0.036895751953125,
-0.046142578125,
0.07427978515625,
-0.01806640625,
-0.034698486328125,
-0.02838134765625,
-0.0221405029296875,
0.01837158203125,
0.0279998779296875,
0.00933837890625,
0.006145477294921875,
0.024444580078125,
0.07501220703125,
-0.0489501953125,
0.050628662109375,
-0.009979248046875,
0.01009368896484375,
0.0270538330078125,
0.0227813720703125,
0.04864501953125,
0.0123138427734375,
0.01007843017578125,
-0.0013227462768554688,
0.01233673095703125,
-0.040557861328125,
-0.026397705078125,
0.0704345703125,
-0.082763671875,
-0.0288543701171875,
-0.060455322265625,
-0.045501708984375,
0.0080108642578125,
0.01561737060546875,
0.0302276611328125,
0.05084228515625,
-0.00379180908203125,
0.001644134521484375,
0.045501708984375,
-0.039947509765625,
0.0281982421875,
0.0184173583984375,
-0.03448486328125,
-0.039520263671875,
0.07391357421875,
0.001781463623046875,
0.0264892578125,
0.0026111602783203125,
0.017547607421875,
-0.031219482421875,
-0.032562255859375,
-0.045806884765625,
0.041107177734375,
-0.0552978515625,
-0.00015676021575927734,
-0.055633544921875,
-0.0023441314697265625,
-0.035186767578125,
0.009735107421875,
-0.0306854248046875,
-0.0296173095703125,
-0.0183563232421875,
-0.000568389892578125,
0.04339599609375,
0.034912109375,
0.006343841552734375,
0.0251312255859375,
-0.04302978515625,
-0.0015592575073242188,
0.01800537109375,
0.006580352783203125,
0.0088958740234375,
-0.068115234375,
-0.0068359375,
0.01322174072265625,
-0.032501220703125,
-0.0849609375,
0.03900146484375,
-0.004749298095703125,
0.0272979736328125,
0.004001617431640625,
-0.016754150390625,
0.044525146484375,
-0.005718231201171875,
0.050567626953125,
0.0112762451171875,
-0.07904052734375,
0.040924072265625,
-0.03521728515625,
0.02532958984375,
0.026397705078125,
0.0268707275390625,
-0.055450439453125,
-0.0068359375,
-0.074462890625,
-0.08160400390625,
0.055999755859375,
0.0350341796875,
0.01438140869140625,
0.0075836181640625,
0.029693603515625,
-0.03314208984375,
0.011993408203125,
-0.076416015625,
-0.0200042724609375,
-0.017913818359375,
-0.0069122314453125,
0.01153564453125,
-0.00292205810546875,
0.004016876220703125,
-0.043060302734375,
0.076904296875,
0.00365447998046875,
0.0255126953125,
0.022430419921875,
-0.0307769775390625,
-0.006984710693359375,
-0.0014467239379882812,
0.0135498046875,
0.05792236328125,
-0.01153564453125,
0.006439208984375,
0.015655517578125,
-0.041229248046875,
0.00307464599609375,
0.012603759765625,
-0.0294036865234375,
-0.0040130615234375,
0.0122222900390625,
0.06494140625,
0.00811767578125,
-0.0318603515625,
0.015655517578125,
-0.0038204193115234375,
-0.00635528564453125,
-0.022125244140625,
-0.01253509521484375,
0.0135498046875,
0.0162353515625,
-0.0017194747924804688,
-0.0137786865234375,
-0.0010118484497070312,
-0.06549072265625,
0.0040740966796875,
0.018035888671875,
-0.01369476318359375,
-0.03131103515625,
0.0452880859375,
0.0028839111328125,
-0.01493072509765625,
0.08489990234375,
-0.0191192626953125,
-0.05316162109375,
0.059356689453125,
0.0372314453125,
0.056732177734375,
-0.013336181640625,
0.0279083251953125,
0.06671142578125,
0.0244598388671875,
-0.0163726806640625,
0.00665283203125,
0.0071868896484375,
-0.03900146484375,
-0.0093994140625,
-0.061798095703125,
-0.017547607421875,
0.0213165283203125,
-0.04339599609375,
0.03436279296875,
-0.046173095703125,
-0.007038116455078125,
-0.004913330078125,
0.01500701904296875,
-0.04510498046875,
0.0253753662109375,
0.01239013671875,
0.052703857421875,
-0.0689697265625,
0.06268310546875,
0.048980712890625,
-0.055267333984375,
-0.0816650390625,
0.0016231536865234375,
0.00022077560424804688,
-0.03350830078125,
0.01593017578125,
0.0159149169921875,
0.01497650146484375,
0.0120086669921875,
-0.0199127197265625,
-0.06524658203125,
0.09710693359375,
0.0177154541015625,
-0.050933837890625,
-0.020050048828125,
-0.00699615478515625,
0.0400390625,
0.004100799560546875,
0.053741455078125,
0.05364990234375,
0.0300750732421875,
0.006229400634765625,
-0.0804443359375,
0.02850341796875,
-0.02288818359375,
-0.005672454833984375,
0.018280029296875,
-0.051422119140625,
0.10015869140625,
-0.0040130615234375,
-0.0008630752563476562,
0.029052734375,
0.04241943359375,
0.028900146484375,
-0.00860595703125,
0.0297698974609375,
0.0582275390625,
0.06427001953125,
-0.0271148681640625,
0.090576171875,
-0.0230712890625,
0.0589599609375,
0.0640869140625,
0.01357269287109375,
0.0380859375,
0.029449462890625,
-0.0282440185546875,
0.038909912109375,
0.06103515625,
-0.006011962890625,
0.0133819580078125,
0.020233154296875,
-0.021331787109375,
-0.02105712890625,
0.009124755859375,
-0.047088623046875,
0.0162200927734375,
0.0095062255859375,
-0.045379638671875,
-0.0168914794921875,
-0.0255279541015625,
0.026824951171875,
-0.0313720703125,
-0.0177154541015625,
0.02001953125,
0.007965087890625,
-0.048248291015625,
0.049713134765625,
0.0187530517578125,
0.042816162109375,
-0.033843994140625,
0.01067352294921875,
-0.01153564453125,
0.026580810546875,
-0.0243072509765625,
-0.031585693359375,
0.00710296630859375,
0.0005717277526855469,
0.004459381103515625,
0.007110595703125,
0.0338134765625,
-0.01183319091796875,
-0.043731689453125,
0.01438140869140625,
0.037017822265625,
0.0185699462890625,
-0.035430908203125,
-0.051605224609375,
0.00618743896484375,
-0.01183319091796875,
-0.040679931640625,
0.03350830078125,
0.0196533203125,
-0.009002685546875,
0.04425048828125,
0.0469970703125,
0.003658294677734375,
-0.001861572265625,
0.0108489990234375,
0.07373046875,
-0.03619384765625,
-0.03472900390625,
-0.071533203125,
0.03802490234375,
0.00034618377685546875,
-0.04986572265625,
0.06427001953125,
0.0418701171875,
0.052459716796875,
0.0177154541015625,
0.045440673828125,
-0.033905029296875,
0.0001652240753173828,
-0.0220794677734375,
0.050933837890625,
-0.039093017578125,
0.004119873046875,
-0.0391845703125,
-0.08673095703125,
-0.0029659271240234375,
0.07122802734375,
-0.038543701171875,
0.02838134765625,
0.058074951171875,
0.060333251953125,
-0.0059356689453125,
0.005779266357421875,
0.003875732421875,
0.021209716796875,
0.041015625,
0.07135009765625,
0.06854248046875,
-0.05206298828125,
0.042510986328125,
-0.04010009765625,
-0.0201416015625,
-0.01123046875,
-0.0380859375,
-0.0645751953125,
-0.034271240234375,
-0.0390625,
-0.05633544921875,
-0.0030670166015625,
0.06561279296875,
0.054107666015625,
-0.046142578125,
-0.0116424560546875,
-0.03875732421875,
0.004482269287109375,
-0.0204620361328125,
-0.017974853515625,
0.032958984375,
0.01111602783203125,
-0.07403564453125,
-0.0030460357666015625,
-0.01117706298828125,
0.0083160400390625,
-0.03179931640625,
-0.0206451416015625,
-0.014190673828125,
-0.00917816162109375,
0.006748199462890625,
0.021148681640625,
-0.037689208984375,
-0.019561767578125,
0.0035686492919921875,
0.002643585205078125,
-0.0012264251708984375,
0.053009033203125,
-0.044158935546875,
0.00919342041015625,
0.04949951171875,
0.007717132568359375,
0.06170654296875,
-0.0194549560546875,
0.0316162109375,
-0.0186309814453125,
0.0263214111328125,
0.022064208984375,
0.047454833984375,
0.024658203125,
-0.01873779296875,
0.01274871826171875,
0.031890869140625,
-0.055816650390625,
-0.0657958984375,
0.026885986328125,
-0.05487060546875,
-0.0088958740234375,
0.09765625,
-0.0197601318359375,
-0.0290985107421875,
0.0042266845703125,
-0.0177001953125,
0.0411376953125,
-0.0205078125,
0.051605224609375,
0.0465087890625,
0.00635528564453125,
-0.01317596435546875,
-0.047271728515625,
0.0273590087890625,
0.05078125,
-0.061309814453125,
0.0276031494140625,
0.04705810546875,
0.046875,
0.018890380859375,
0.043701171875,
-0.02227783203125,
0.04449462890625,
0.0057525634765625,
0.006561279296875,
0.0012760162353515625,
-0.03436279296875,
-0.03240966796875,
-0.00928497314453125,
0.01727294921875,
-0.0005316734313964844
]
] |
stabilityai/sdxl-vae | 2023-08-04T10:12:16.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"arxiv:2112.10752",
"license:mit",
"has_space",
"diffusers:AutoencoderKL",
"region:us"
] | null | stabilityai | null | null | stabilityai/sdxl-vae | 366 | 83,508 | diffusers | 2023-06-21T17:47:40 | ---
license: mit
tags:
- stable-diffusion
- stable-diffusion-diffusers
inference: false
---
# SDXL - VAE
#### How to use with 🧨 diffusers
You can integrate this fine-tuned VAE decoder into your existing `diffusers` workflows by passing a `vae` argument to the `StableDiffusionPipeline`:
```py
from diffusers.models import AutoencoderKL
from diffusers import StableDiffusionPipeline
model = "stabilityai/your-stable-diffusion-model"
vae = AutoencoderKL.from_pretrained("stabilityai/sdxl-vae")
pipe = StableDiffusionPipeline.from_pretrained(model, vae=vae)
```
## Model
[SDXL](https://huggingface.co/stabilityai/stable-diffusion-xl-base-0.9) is a [latent diffusion model](https://arxiv.org/abs/2112.10752), where the diffusion operates in a pretrained,
learned (and fixed) latent space of an autoencoder.
While the bulk of the semantic composition is done by the latent diffusion model,
we can improve _local_, high-frequency details in generated images by improving the quality of the autoencoder.
To this end, we train the same autoencoder architecture used for the original [Stable Diffusion](https://github.com/CompVis/stable-diffusion) at a larger batch-size (256 vs 9)
and additionally track the weights with an exponential moving average (EMA).
The resulting autoencoder outperforms the original model in all evaluated reconstruction metrics, see the table below.
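For readers unfamiliar with EMA weight tracking, here is a minimal sketch of the idea. The update rule is generic; the decay constant below is illustrative only and is not taken from the SDXL-VAE training configuration:

```python
# Minimal sketch of an exponential moving average (EMA) over model weights:
# after each optimizer step, blend the current weights into a shadow copy.
def ema_update(ema_params, params, decay):
    """Blend current parameters into the EMA copy in place."""
    for k in ema_params:
        ema_params[k] = decay * ema_params[k] + (1.0 - decay) * params[k]
    return ema_params

# With decay=0.5 (chosen so the arithmetic is easy to follow) and constant
# weights of 1.0, the EMA converges toward 1.0: 0.5, 0.75, 0.875, ...
ema = {"w": 0.0}
for _ in range(3):
    ema = ema_update(ema, {"w": 1.0}, decay=0.5)
print(ema["w"])  # 0.875
```

In practice the EMA copy, not the raw trained weights, is what gets released; large decay values (e.g. 0.999+) make the shadow weights a smoothed average over many training steps.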
## Evaluation
_SDXL-VAE vs original kl-f8 VAE vs f8-ft-MSE_
### COCO 2017 (256x256, val, 5000 images)
| Model | rFID | PSNR | SSIM | PSIM | Link | Comments
|----------|------|--------------|---------------|---------------|------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------|
| SDXL-VAE | 4.42 | 24.7 +/- 3.9 | 0.73 +/- 0.13 | 0.88 +/- 0.27 | https://huggingface.co/stabilityai/sdxl-vae/blob/main/sdxl_vae.safetensors | as used in SDXL |
| original | 4.99 | 23.4 +/- 3.8 | 0.69 +/- 0.14 | 1.01 +/- 0.28 | https://ommer-lab.com/files/latent-diffusion/kl-f8.zip | as used in SD |
| ft-MSE | 4.70 | 24.5 +/- 3.7 | 0.71 +/- 0.13 | 0.92 +/- 0.27 | https://huggingface.co/stabilityai/sd-vae-ft-mse-original/resolve/main/vae-ft-mse-840000-ema-pruned.ckpt | resumed with EMA from ft-EMA, emphasis on MSE (rec. loss = MSE + 0.1 * LPIPS), smoother outputs |
| 3,169 | [
[
-0.046875,
-0.020355224609375,
0.038116455078125,
0.0060882568359375,
-0.015716552734375,
-0.00737762451171875,
0.0258941650390625,
-0.002735137939453125,
0.0169219970703125,
0.0416259765625,
-0.031402587890625,
-0.0379638671875,
-0.049896240234375,
-0.0018281936645507812,
-0.012542724609375,
0.053253173828125,
-0.0154266357421875,
-0.0038280487060546875,
0.0017404556274414062,
-0.03155517578125,
-0.0260772705078125,
-0.026214599609375,
-0.0511474609375,
-0.00392913818359375,
0.026336669921875,
-0.00836944580078125,
0.04693603515625,
0.043243408203125,
0.01708984375,
0.0250091552734375,
-0.0245819091796875,
0.00960540771484375,
-0.0258941650390625,
0.000713348388671875,
0.0036296844482421875,
-0.0214080810546875,
-0.046600341796875,
-0.021881103515625,
0.05328369140625,
0.028778076171875,
-0.02777099609375,
-0.007053375244140625,
0.00460052490234375,
0.053619384765625,
-0.0640869140625,
-0.01412200927734375,
-0.0177001953125,
0.003551483154296875,
-0.0007653236389160156,
-0.02685546875,
-0.01507568359375,
-0.015350341796875,
0.0005550384521484375,
-0.050140380859375,
0.0204620361328125,
-0.0219573974609375,
0.09954833984375,
0.032562255859375,
-0.0278167724609375,
-0.0010442733764648438,
-0.06304931640625,
0.0498046875,
-0.06280517578125,
0.04156494140625,
0.016387939453125,
0.01033782958984375,
-0.00516510009765625,
-0.05413818359375,
-0.029449462890625,
0.016204833984375,
-0.01177978515625,
0.031646728515625,
-0.019195556640625,
0.0051116943359375,
0.038299560546875,
0.0282745361328125,
-0.056915283203125,
0.001094818115234375,
-0.039886474609375,
-0.01381683349609375,
0.04364013671875,
0.025604248046875,
0.0025691986083984375,
-0.007396697998046875,
-0.04412841796875,
-0.0298309326171875,
-0.04132080078125,
0.0023746490478515625,
0.0026397705078125,
-0.021942138671875,
-0.02203369140625,
0.02130126953125,
-0.0010328292846679688,
0.04083251953125,
0.0158843994140625,
-0.00951385498046875,
0.051116943359375,
-0.043365478515625,
-0.031646728515625,
-0.01087188720703125,
0.05621337890625,
0.042572021484375,
-0.01708984375,
0.0206146240234375,
-0.0394287109375,
0.0002536773681640625,
0.00946044921875,
-0.0777587890625,
-0.0181884765625,
0.034942626953125,
-0.0587158203125,
-0.0177154541015625,
0.0079193115234375,
-0.05120849609375,
0.0007143020629882812,
-0.0185699462890625,
0.0333251953125,
-0.023223876953125,
-0.0413818359375,
0.0006732940673828125,
-0.032623291015625,
0.047515869140625,
0.0308837890625,
-0.043121337890625,
0.026611328125,
0.026763916015625,
0.0703125,
-0.004119873046875,
-0.0005359649658203125,
-0.029266357421875,
0.0068511962890625,
-0.037078857421875,
0.04833984375,
-0.014862060546875,
-0.046234130859375,
-0.016265869140625,
0.0235443115234375,
-0.0016660690307617188,
-0.0489501953125,
0.046173095703125,
-0.044891357421875,
0.00211334228515625,
-0.01116180419921875,
-0.046722412109375,
-0.0028705596923828125,
-0.01430511474609375,
-0.050933837890625,
0.0831298828125,
0.0167083740234375,
-0.04815673828125,
0.0308380126953125,
-0.043701171875,
0.0031948089599609375,
-0.011871337890625,
-0.01256561279296875,
-0.050140380859375,
0.01280975341796875,
0.0149993896484375,
0.02972412109375,
-0.00881195068359375,
-0.00970458984375,
-0.02435302734375,
-0.0240631103515625,
0.005023956298828125,
-0.04718017578125,
0.0863037109375,
0.0255584716796875,
-0.02801513671875,
0.0174102783203125,
-0.07965087890625,
-0.0009064674377441406,
0.011016845703125,
-0.01149749755859375,
-0.03466796875,
-0.01849365234375,
0.0028743743896484375,
0.023834228515625,
0.0004718303680419922,
-0.034332275390625,
-0.00667572021484375,
-0.047119140625,
0.021453857421875,
0.0711669921875,
0.022552490234375,
0.04168701171875,
-0.032318115234375,
0.031890869140625,
0.0186920166015625,
0.01059722900390625,
-0.00881195068359375,
-0.02734375,
-0.054656982421875,
-0.031158447265625,
0.01273345947265625,
0.039337158203125,
-0.04901123046875,
0.044647216796875,
-0.01114654541015625,
-0.046356201171875,
-0.045440673828125,
0.00007796287536621094,
0.00952911376953125,
0.024566650390625,
0.024932861328125,
-0.03155517578125,
-0.034942626953125,
-0.0589599609375,
0.029693603515625,
0.01097869873046875,
0.0003447532653808594,
0.0208740234375,
0.043975830078125,
-0.01171875,
0.058074951171875,
-0.0555419921875,
-0.01016998291015625,
0.0293426513671875,
0.00312042236328125,
0.03857421875,
0.04364013671875,
0.06781005859375,
-0.06390380859375,
-0.05706787109375,
-0.0065460205078125,
-0.048553466796875,
-0.002902984619140625,
-0.0087432861328125,
-0.0142364501953125,
0.033233642578125,
0.033477783203125,
-0.053741455078125,
0.058197021484375,
0.031707763671875,
-0.054534912109375,
0.053924560546875,
-0.04327392578125,
0.00940704345703125,
-0.09075927734375,
0.01520538330078125,
0.0009775161743164062,
-0.027740478515625,
-0.029571533203125,
-0.01062774658203125,
0.0096435546875,
0.00640106201171875,
-0.040008544921875,
0.051788330078125,
-0.05078125,
0.0037822723388671875,
-0.02197265625,
-0.00943756103515625,
0.01148223876953125,
0.022613525390625,
0.01678466796875,
0.0567626953125,
0.05889892578125,
-0.0457763671875,
0.01751708984375,
-0.0042724609375,
-0.01406097412109375,
0.038360595703125,
-0.060760498046875,
-0.01137542724609375,
-0.030517578125,
0.01355743408203125,
-0.07843017578125,
0.0214996337890625,
0.0199432373046875,
-0.0298309326171875,
0.03704833984375,
-0.01409149169921875,
-0.0135345458984375,
-0.032928466796875,
-0.036956787109375,
0.0201263427734375,
0.06256103515625,
-0.041229248046875,
0.040985107421875,
0.0110626220703125,
0.01535797119140625,
-0.050689697265625,
-0.032745361328125,
-0.0159149169921875,
-0.021514892578125,
-0.0428466796875,
0.0330810546875,
-0.032470703125,
-0.01116180419921875,
0.0176239013671875,
-0.0172271728515625,
-0.01325225830078125,
0.0016298294067382812,
0.0255126953125,
0.03594970703125,
-0.032623291015625,
-0.02972412109375,
0.016357421875,
-0.037139892578125,
0.005313873291015625,
-0.01218414306640625,
0.029541015625,
0.0108795166015625,
-0.01548004150390625,
-0.036956787109375,
0.034423828125,
0.0733642578125,
-0.0174407958984375,
0.0631103515625,
0.067626953125,
-0.0285797119140625,
-0.01444244384765625,
-0.032806396484375,
-0.0223541259765625,
-0.038482666015625,
0.01464080810546875,
-0.042572021484375,
-0.05059814453125,
0.0555419921875,
0.013580322265625,
0.015838623046875,
0.048004150390625,
0.04815673828125,
-0.0196075439453125,
0.0806884765625,
0.0269012451171875,
0.0240631103515625,
0.031646728515625,
-0.05352783203125,
-0.0006108283996582031,
-0.07080078125,
0.0035114288330078125,
-0.0279083251953125,
0.002590179443359375,
-0.035858154296875,
-0.040863037109375,
0.0251007080078125,
0.0166168212890625,
-0.01617431640625,
0.0213165283203125,
-0.058074951171875,
0.027099609375,
0.01971435546875,
-0.012420654296875,
-0.007221221923828125,
0.00384521484375,
-0.01064300537109375,
-0.0053863525390625,
-0.043609619140625,
-0.033721923828125,
0.0780029296875,
0.024444580078125,
0.060394287109375,
0.01418304443359375,
0.06353759765625,
0.015899658203125,
0.017364501953125,
-0.0285797119140625,
0.0213165283203125,
-0.0225982666015625,
-0.0308380126953125,
-0.0016775131225585938,
-0.01450347900390625,
-0.08673095703125,
0.034576416015625,
-0.00560760498046875,
-0.05352783203125,
0.060638427734375,
0.01342010498046875,
-0.03887939453125,
0.02288818359375,
-0.0550537109375,
0.067626953125,
-0.004547119140625,
-0.031097412109375,
0.0002353191375732422,
-0.038177490234375,
0.00795745849609375,
0.007843017578125,
0.00562286376953125,
-0.004451751708984375,
-0.0217437744140625,
0.053497314453125,
-0.0714111328125,
0.050445556640625,
-0.02862548828125,
-0.02288818359375,
0.0310211181640625,
-0.01271820068359375,
0.037078857421875,
0.00968170166015625,
-0.009429931640625,
0.00820159912109375,
0.03424072265625,
-0.03350830078125,
-0.04486083984375,
0.0672607421875,
-0.07281494140625,
-0.040985107421875,
-0.053253173828125,
-0.01288604736328125,
0.038238525390625,
0.0211181640625,
0.036163330078125,
0.043487548828125,
0.0023021697998046875,
0.0172119140625,
0.0704345703125,
-0.006061553955078125,
0.050384521484375,
0.048614501953125,
-0.00299072265625,
-0.042938232421875,
0.069091796875,
0.030975341796875,
0.03485107421875,
0.0292510986328125,
0.0013866424560546875,
-0.0244903564453125,
-0.039337158203125,
-0.025970458984375,
0.0177154541015625,
-0.063720703125,
-0.0233154296875,
-0.05615234375,
-0.03973388671875,
-0.028717041015625,
-0.0318603515625,
-0.04718017578125,
-0.0172882080078125,
-0.036712646484375,
0.002292633056640625,
0.0283660888671875,
0.035675048828125,
-0.0312042236328125,
0.0051727294921875,
-0.069091796875,
0.0166473388671875,
0.0081329345703125,
0.020477294921875,
0.005481719970703125,
-0.03082275390625,
0.0015859603881835938,
-0.002918243408203125,
-0.01959228515625,
-0.07342529296875,
0.0452880859375,
0.004123687744140625,
0.059539794921875,
0.0241241455078125,
0.0064849853515625,
0.04046630859375,
-0.01428985595703125,
0.055999755859375,
0.01450347900390625,
-0.049407958984375,
0.051727294921875,
-0.016448974609375,
0.0161590576171875,
0.011322021484375,
0.03338623046875,
-0.0166473388671875,
-0.003692626953125,
-0.060546875,
-0.08551025390625,
0.047210693359375,
0.0227203369140625,
-0.013580322265625,
0.0025157928466796875,
0.033935546875,
0.0008440017700195312,
-0.01224517822265625,
-0.0457763671875,
-0.05596923828125,
-0.019805908203125,
-0.00460052490234375,
0.005859375,
-0.0245208740234375,
-0.0081634521484375,
-0.022705078125,
0.0548095703125,
0.005634307861328125,
0.033660888671875,
0.036407470703125,
0.007091522216796875,
-0.006114959716796875,
0.00887298583984375,
0.044281005859375,
0.050689697265625,
-0.043365478515625,
0.0013751983642578125,
0.0100250244140625,
-0.033416748046875,
0.025360107421875,
0.00824737548828125,
-0.03326416015625,
0.0089111328125,
0.01325225830078125,
0.07342529296875,
-0.0100555419921875,
-0.0260009765625,
0.047027587890625,
-0.0274505615234375,
-0.0311431884765625,
-0.04290771484375,
0.0170135498046875,
0.009521484375,
0.008697509765625,
0.016357421875,
0.0312347412109375,
0.0150146484375,
-0.0156707763671875,
0.0227203369140625,
0.0272216796875,
-0.03363037109375,
-0.01322174072265625,
0.058929443359375,
0.00727081298828125,
-0.004604339599609375,
0.028533935546875,
-0.0254364013671875,
-0.017852783203125,
0.0657958984375,
0.046722412109375,
0.0791015625,
-0.0304718017578125,
0.0064849853515625,
0.058563232421875,
0.0148773193359375,
0.004413604736328125,
0.033355712890625,
-0.01256561279296875,
-0.048248291015625,
-0.02099609375,
-0.05743408203125,
0.0158233642578125,
0.0030422210693359375,
-0.054656982421875,
0.028076171875,
-0.02813720703125,
-0.01203155517578125,
0.007678985595703125,
-0.0082244873046875,
-0.05859375,
0.01424407958984375,
0.004669189453125,
0.07568359375,
-0.07269287109375,
0.0703125,
0.049041748046875,
-0.036041259765625,
-0.04998779296875,
-0.015869140625,
-0.002655029296875,
-0.039337158203125,
0.03851318359375,
-0.006267547607421875,
-0.0007700920104980469,
0.003818511962890625,
-0.0114288330078125,
-0.0904541015625,
0.109375,
0.005764007568359375,
-0.043212890625,
0.0150146484375,
-0.0258026123046875,
0.031524658203125,
-0.022216796875,
0.041748046875,
0.0297393798828125,
0.0244293212890625,
0.01483154296875,
-0.062164306640625,
0.02801513671875,
-0.029541015625,
0.0300445556640625,
0.0260162353515625,
-0.052947998046875,
0.062042236328125,
-0.0131683349609375,
-0.015472412109375,
0.0124969482421875,
0.051605224609375,
0.0235595703125,
0.0130615234375,
0.044586181640625,
0.064697265625,
0.04632568359375,
-0.00734710693359375,
0.0771484375,
-0.01250457763671875,
0.0372314453125,
0.06585693359375,
-0.0025615692138671875,
0.060455322265625,
0.031341552734375,
-0.02392578125,
0.0307159423828125,
0.054046630859375,
-0.0144500732421875,
0.053741455078125,
-0.0008187294006347656,
-0.0292205810546875,
-0.003204345703125,
0.0121307373046875,
-0.052001953125,
0.01016998291015625,
0.04522705078125,
-0.0389404296875,
-0.0177764892578125,
0.003200531005859375,
0.004608154296875,
-0.0210723876953125,
-0.0181427001953125,
0.047027587890625,
0.00179290771484375,
-0.032562255859375,
0.074951171875,
-0.01326751708984375,
0.055633544921875,
-0.0546875,
-0.015899658203125,
0.0010175704956054688,
0.0289459228515625,
-0.026336669921875,
-0.074951171875,
0.048858642578125,
-0.0294647216796875,
-0.0244293212890625,
-0.01236724853515625,
0.06292724609375,
-0.015380859375,
-0.030975341796875,
0.043914794921875,
0.0181884765625,
0.0148773193359375,
0.02459716796875,
-0.06463623046875,
0.029449462890625,
0.0080413818359375,
-0.03900146484375,
0.02801513671875,
0.0226898193359375,
0.01617431640625,
0.0302276611328125,
0.048919677734375,
0.005260467529296875,
0.01666259765625,
-0.00514984130859375,
0.07275390625,
-0.043212890625,
-0.033447265625,
-0.047393798828125,
0.0587158203125,
-0.01727294921875,
-0.0295867919921875,
0.058013916015625,
0.04949951171875,
0.05059814453125,
-0.006420135498046875,
0.056304931640625,
-0.0195770263671875,
0.011566162109375,
-0.036224365234375,
0.055511474609375,
-0.059722900390625,
0.0179443359375,
-0.01428985595703125,
-0.07672119140625,
-0.007213592529296875,
0.059173583984375,
-0.0016145706176757812,
0.021240234375,
0.044586181640625,
0.08929443359375,
-0.00749969482421875,
-0.0243682861328125,
0.0218505859375,
0.04046630859375,
0.033172607421875,
0.036285400390625,
0.0288848876953125,
-0.059661865234375,
0.035919189453125,
-0.047454833984375,
-0.0231781005859375,
0.001834869384765625,
-0.054534912109375,
-0.0435791015625,
-0.061248779296875,
-0.05645751953125,
-0.06170654296875,
-0.0032958984375,
0.0517578125,
0.07000732421875,
-0.053314208984375,
-0.0109100341796875,
-0.0163116455078125,
0.01507568359375,
-0.024566650390625,
-0.0209503173828125,
0.048919677734375,
-0.0026874542236328125,
-0.0582275390625,
0.0197906494140625,
0.03192138671875,
0.0303802490234375,
-0.02410888671875,
-0.0270843505859375,
-0.0203094482421875,
0.01001739501953125,
0.03912353515625,
0.032135009765625,
-0.052886962890625,
-0.008636474609375,
-0.01352691650390625,
-0.010711669921875,
0.018280029296875,
0.0270538330078125,
-0.055267333984375,
0.04376220703125,
0.0631103515625,
0.01336669921875,
0.05120849609375,
-0.0174713134765625,
0.03131103515625,
-0.0377197265625,
0.0029659271240234375,
0.00453948974609375,
0.033935546875,
-0.0028438568115234375,
-0.0251922607421875,
0.03424072265625,
0.034149169921875,
-0.04833984375,
-0.057830810546875,
-0.0189666748046875,
-0.09197998046875,
-0.0293731689453125,
0.0772705078125,
-0.0172119140625,
-0.046173095703125,
-0.0145416259765625,
-0.0283203125,
0.007045745849609375,
-0.041595458984375,
0.01485443115234375,
0.034881591796875,
-0.0086669921875,
-0.033233642578125,
-0.03857421875,
0.03692626953125,
0.0233612060546875,
-0.038482666015625,
0.00047779083251953125,
0.0258636474609375,
0.04022216796875,
0.0250244140625,
0.077392578125,
-0.024169921875,
0.0288543701171875,
0.020050048828125,
0.0060577392578125,
0.00827789306640625,
0.0135498046875,
-0.031585693359375,
0.00962066650390625,
-0.00791168212890625,
-0.0227508544921875
]
] |
google/mobilebert-uncased | 2021-04-19T13:32:58.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"mobilebert",
"pretraining",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | google | null | null | google/mobilebert-uncased | 24 | 82,643 | transformers | 2022-03-02T23:29:05 | ---
language: en
thumbnail: https://huggingface.co/front/thumbnails/google.png
license: apache-2.0
---
## MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices
MobileBERT is a thin version of BERT_LARGE, equipped with bottleneck structures and a carefully designed balance
between self-attention and feed-forward networks.
This is the original MobileBERT Optimized Uncased English checkpoint:
[uncased_L-24_H-128_B-512_A-4_F-4_OPT](https://storage.googleapis.com/cloud-tpu-checkpoints/mobilebert/uncased_L-24_H-128_B-512_A-4_F-4_OPT.tar.gz)
## How to use MobileBERT in `transformers`
```python
from transformers import pipeline
fill_mask = pipeline(
"fill-mask",
model="google/mobilebert-uncased",
tokenizer="google/mobilebert-uncased"
)
print(
fill_mask(f"HuggingFace is creating a {fill_mask.tokenizer.mask_token} that the community uses to solve NLP tasks.")
)
```
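Under the hood, the fill-mask pipeline ranks vocabulary tokens by the softmax of the model's logits at the `[MASK]` position. A minimal stdlib sketch of that ranking step (the candidate tokens and logits below are invented for illustration, not real MobileBERT outputs):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for four candidate tokens at the masked position.
vocab = ["tool", "library", "model", "banana"]
logits = [2.1, 3.4, 2.8, -1.0]
probs = softmax(logits)
ranked = sorted(zip(vocab, probs), key=lambda kv: kv[1], reverse=True)
print(ranked[0][0])  # highest-probability fill: "library"
```

The real pipeline does exactly this over MobileBERT's full WordPiece vocabulary and returns the top-k fills with their scores.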
| 918 | [
[
-0.028564453125,
-0.027435302734375,
0.0171051025390625,
0.035064697265625,
-0.01270294189453125,
0.00952911376953125,
-0.0215606689453125,
-0.0037364959716796875,
0.0439453125,
0.0272979736328125,
-0.060821533203125,
-0.0192108154296875,
-0.03271484375,
-0.0069732666015625,
-0.023681640625,
0.0838623046875,
-0.0035343170166015625,
-0.01096343994140625,
0.004180908203125,
-0.024017333984375,
-0.003925323486328125,
-0.039459228515625,
-0.041717529296875,
-0.04205322265625,
0.041046142578125,
0.03021240234375,
0.060150146484375,
0.0175933837890625,
0.021331787109375,
0.025634765625,
-0.01473236083984375,
-0.0090789794921875,
-0.021697998046875,
0.0097198486328125,
0.00522613525390625,
-0.02203369140625,
-0.0246124267578125,
-0.00417327880859375,
0.052581787109375,
0.033599853515625,
-0.005107879638671875,
0.034820556640625,
0.0085601806640625,
0.05865478515625,
-0.0667724609375,
-0.00754547119140625,
-0.05450439453125,
0.0206451416015625,
0.002460479736328125,
0.0224456787109375,
-0.029571533203125,
-0.01165771484375,
0.007114410400390625,
-0.031890869140625,
0.0248565673828125,
-0.0195159912109375,
0.07818603515625,
0.043487548828125,
-0.02154541015625,
-0.021636962890625,
-0.04217529296875,
0.08038330078125,
-0.0197601318359375,
0.0204620361328125,
0.00630950927734375,
0.0228729248046875,
-0.001556396484375,
-0.0902099609375,
-0.038299560546875,
0.0084381103515625,
0.007495880126953125,
0.0106658935546875,
-0.0176239013671875,
0.009307861328125,
0.0244140625,
0.0167694091796875,
-0.03460693359375,
-0.01517486572265625,
-0.056854248046875,
-0.05169677734375,
0.039459228515625,
0.0013265609741210938,
0.0131072998046875,
-0.0268096923828125,
-0.04254150390625,
-0.005695343017578125,
-0.040557861328125,
0.019622802734375,
0.01143646240234375,
0.007598876953125,
-0.0038967132568359375,
0.03656005859375,
-0.0108489990234375,
0.031463623046875,
0.005977630615234375,
-0.0006151199340820312,
0.04559326171875,
-0.007175445556640625,
-0.0222625732421875,
0.0114593505859375,
0.0721435546875,
0.01342010498046875,
0.00786590576171875,
-0.012603759765625,
-0.01751708984375,
-0.0094757080078125,
0.028564453125,
-0.08233642578125,
-0.031890869140625,
0.04034423828125,
-0.056121826171875,
-0.0188751220703125,
-0.01122283935546875,
-0.035614013671875,
-0.01166534423828125,
-0.0030345916748046875,
0.04803466796875,
-0.03131103515625,
0.00682830810546875,
-0.004116058349609375,
-0.0106353759765625,
0.01715087890625,
0.014312744140625,
-0.0771484375,
0.002521514892578125,
0.03228759765625,
0.08099365234375,
0.0005369186401367188,
-0.010101318359375,
-0.0445556640625,
-0.006282806396484375,
0.0181121826171875,
0.032806396484375,
-0.0131072998046875,
-0.0260009765625,
0.0143585205078125,
0.01320648193359375,
-0.0274810791015625,
-0.035614013671875,
0.08551025390625,
-0.0293426513671875,
0.01334381103515625,
-0.0132598876953125,
-0.0090179443359375,
-0.01959228515625,
-0.0022411346435546875,
-0.039459228515625,
0.07818603515625,
0.01514434814453125,
-0.05682373046875,
0.018524169921875,
-0.05096435546875,
-0.03668212890625,
0.006946563720703125,
0.0060577392578125,
-0.0391845703125,
-0.00884246826171875,
0.018768310546875,
0.03253173828125,
0.01444244384765625,
-0.0023517608642578125,
-0.022979736328125,
-0.032745361328125,
0.005466461181640625,
0.01403045654296875,
0.08843994140625,
0.03619384765625,
-0.039306640625,
0.0213165283203125,
-0.0458984375,
0.039520263671875,
-0.01178741455078125,
-0.0264892578125,
0.006435394287109375,
0.0182342529296875,
0.0248565673828125,
0.0148162841796875,
0.050079345703125,
-0.046295166015625,
0.0224609375,
0.0036296844482421875,
0.083740234375,
0.04266357421875,
-0.03643798828125,
0.04193115234375,
-0.020965576171875,
0.02044677734375,
-0.03594970703125,
0.02874755859375,
-0.00437164306640625,
-0.028076171875,
-0.07952880859375,
-0.065673828125,
0.0311431884765625,
0.06219482421875,
-0.05047607421875,
0.05096435546875,
-0.01358795166015625,
-0.044952392578125,
-0.0386962890625,
0.02044677734375,
0.0071563720703125,
0.01006317138671875,
0.02532958984375,
-0.0216064453125,
-0.055511474609375,
-0.0831298828125,
-0.0024471282958984375,
-0.007167816162109375,
-0.0300445556640625,
0.0206451416015625,
0.0438232421875,
-0.05242919921875,
0.0469970703125,
-0.0226898193359375,
-0.03704833984375,
-0.01416778564453125,
0.0125885009765625,
0.03271484375,
0.039398193359375,
0.031829833984375,
-0.037200927734375,
-0.033172607421875,
-0.0360107421875,
-0.048858642578125,
-0.01371002197265625,
-0.022369384765625,
-0.004673004150390625,
0.00348663330078125,
0.04290771484375,
-0.061370849609375,
-0.00004583597183227539,
0.050262451171875,
-0.0189208984375,
0.0374755859375,
-0.0123291015625,
-0.0198516845703125,
-0.0615234375,
0.0111541748046875,
-0.018951416015625,
-0.0235748291015625,
-0.03076171875,
0.0169525146484375,
0.01806640625,
-0.0233917236328125,
-0.037353515625,
0.0283966064453125,
-0.0179901123046875,
-0.0015411376953125,
-0.0115509033203125,
0.0163421630859375,
-0.00267791748046875,
0.035125732421875,
-0.021820068359375,
0.060882568359375,
0.043121337890625,
-0.0361328125,
0.026458740234375,
0.031494140625,
-0.01274871826171875,
0.00902557373046875,
-0.07647705078125,
0.01116943359375,
-0.001983642578125,
0.03887939453125,
-0.05908203125,
-0.00426483154296875,
0.023406982421875,
-0.03216552734375,
-0.0019435882568359375,
-0.01195526123046875,
-0.052886962890625,
-0.0311737060546875,
-0.03790283203125,
0.033477783203125,
0.06768798828125,
-0.056610107421875,
0.059478759765625,
0.0299224853515625,
0.003040313720703125,
-0.054290771484375,
-0.04498291015625,
-0.0010976791381835938,
-0.0137481689453125,
-0.0787353515625,
0.054290771484375,
0.0015583038330078125,
0.00722503662109375,
-0.0230560302734375,
-0.01824951171875,
-0.0259857177734375,
-0.0036449432373046875,
0.02655029296875,
0.0121307373046875,
0.00501251220703125,
0.010101318359375,
0.0010356903076171875,
0.0047760009765625,
0.0092315673828125,
-0.01491546630859375,
0.0421142578125,
-0.044219970703125,
0.0166015625,
-0.0322265625,
0.01241302490234375,
0.053375244140625,
-0.0025653839111328125,
0.04803466796875,
0.08843994140625,
-0.01558685302734375,
-0.0126495361328125,
-0.0263671875,
-0.00605010986328125,
-0.04522705078125,
0.005542755126953125,
-0.022705078125,
-0.06427001953125,
0.058563232421875,
0.013153076171875,
0.00885772705078125,
0.0458984375,
0.050262451171875,
-0.0183868408203125,
0.0557861328125,
0.0389404296875,
-0.004283905029296875,
0.045684814453125,
-0.03094482421875,
0.00699615478515625,
-0.07977294921875,
-0.0221710205078125,
-0.036956787109375,
-0.035247802734375,
-0.046905517578125,
-0.0303192138671875,
0.0194091796875,
0.0296173095703125,
-0.040924072265625,
0.06134033203125,
-0.06524658203125,
0.01250457763671875,
0.05963134765625,
0.0224761962890625,
-0.0131072998046875,
0.00555419921875,
-0.02581787109375,
-0.0083160400390625,
-0.05950927734375,
-0.047027587890625,
0.0723876953125,
0.0230255126953125,
0.04791259765625,
0.01222991943359375,
0.06268310546875,
0.01947021484375,
0.006572723388671875,
-0.05108642578125,
0.04901123046875,
-0.00966644287109375,
-0.06732177734375,
-0.010467529296875,
-0.02984619140625,
-0.081298828125,
0.032318115234375,
-0.019073486328125,
-0.09149169921875,
0.0017766952514648438,
0.01837158203125,
-0.0157012939453125,
0.01416778564453125,
-0.07330322265625,
0.07305908203125,
-0.0043487548828125,
-0.0289306640625,
-0.01690673828125,
-0.048431396484375,
0.0217742919921875,
0.00017189979553222656,
0.0012292861938476562,
-0.018341064453125,
-0.0091705322265625,
0.05767822265625,
-0.03704833984375,
0.06103515625,
-0.0162811279296875,
0.0123138427734375,
0.042999267578125,
-0.0092010498046875,
0.038177490234375,
0.019195556640625,
-0.00485992431640625,
0.01271820068359375,
0.0137481689453125,
-0.04638671875,
-0.023681640625,
0.053375244140625,
-0.06597900390625,
-0.033416748046875,
-0.03179931640625,
-0.035980224609375,
0.01371002197265625,
0.0232086181640625,
0.037139892578125,
0.0164794921875,
0.0036678314208984375,
0.0306243896484375,
0.037841796875,
-0.0198974609375,
0.053619384765625,
0.00809478759765625,
0.0014944076538085938,
-0.0254364013671875,
0.06170654296875,
-0.0173492431640625,
-0.002593994140625,
0.0204925537109375,
0.006229400634765625,
-0.0249176025390625,
-0.00632476806640625,
-0.035247802734375,
0.00037598609924316406,
-0.05462646484375,
0.0017557144165039062,
-0.054290771484375,
-0.034820556640625,
-0.0085296630859375,
-0.0268402099609375,
-0.0413818359375,
-0.06353759765625,
-0.031585693359375,
0.042755126953125,
0.00487518310546875,
-0.00579071044921875,
-0.0166778564453125,
0.034881591796875,
-0.0714111328125,
0.025054931640625,
0.0258636474609375,
0.013519287109375,
-0.01122283935546875,
-0.039459228515625,
-0.01006317138671875,
0.002025604248046875,
-0.05853271484375,
-0.033447265625,
0.0139923095703125,
0.02325439453125,
0.0401611328125,
0.024322509765625,
0.0199432373046875,
0.0205230712890625,
-0.048370361328125,
0.060302734375,
0.030120849609375,
-0.09344482421875,
0.023193359375,
-0.0224761962890625,
0.0316162109375,
0.0440673828125,
0.015380859375,
-0.03021240234375,
-0.01800537109375,
-0.05291748046875,
-0.08935546875,
0.0270843505859375,
0.054046630859375,
0.0239105224609375,
0.0188446044921875,
-0.000016629695892333984,
-0.006526947021484375,
0.023956298828125,
-0.07427978515625,
-0.0267181396484375,
-0.0440673828125,
-0.01520538330078125,
0.0084991455078125,
-0.022308349609375,
-0.0195465087890625,
-0.03143310546875,
0.04510498046875,
0.0105438232421875,
0.05902099609375,
0.03515625,
-0.022491455078125,
0.02410888671875,
0.01479339599609375,
0.065673828125,
0.0382080078125,
-0.034423828125,
0.00426483154296875,
0.01195526123046875,
-0.05303955078125,
-0.0022678375244140625,
0.010833740234375,
-0.01788330078125,
0.0260772705078125,
0.0191650390625,
0.045440673828125,
-0.01554107666015625,
-0.05029296875,
0.03460693359375,
0.004718780517578125,
-0.03656005859375,
-0.05413818359375,
0.0121307373046875,
-0.0005316734313964844,
0.041473388671875,
0.038665771484375,
0.01580810546875,
0.01479339599609375,
-0.03643798828125,
0.017333984375,
0.040863037109375,
-0.035797119140625,
-0.0010442733764648438,
0.0655517578125,
0.0389404296875,
-0.0257568359375,
0.046661376953125,
-0.028594970703125,
-0.06390380859375,
0.045684814453125,
0.0190582275390625,
0.07891845703125,
0.02203369140625,
0.0197906494140625,
0.040313720703125,
0.04254150390625,
0.0098419189453125,
0.0172882080078125,
-0.0199737548828125,
-0.048370361328125,
-0.043609619140625,
-0.034759521484375,
-0.0250396728515625,
0.006603240966796875,
-0.02996826171875,
0.013458251953125,
-0.051025390625,
-0.025848388671875,
0.007335662841796875,
-0.01088714599609375,
-0.038330078125,
0.007724761962890625,
0.0020904541015625,
0.06915283203125,
-0.031280517578125,
0.06805419921875,
0.052886962890625,
-0.0029506683349609375,
-0.04754638671875,
-0.0003380775451660156,
-0.016754150390625,
-0.0687255859375,
0.06304931640625,
0.032379150390625,
0.0017986297607421875,
-0.0247955322265625,
-0.030670166015625,
-0.057952880859375,
0.073486328125,
0.0076446533203125,
-0.024932861328125,
0.00978851318359375,
-0.022430419921875,
0.0268402099609375,
-0.0228424072265625,
0.01486968994140625,
0.0124053955078125,
0.0247955322265625,
0.005268096923828125,
-0.06988525390625,
0.0172271728515625,
-0.0194091796875,
-0.010467529296875,
0.034271240234375,
-0.06915283203125,
0.0665283203125,
-0.026824951171875,
-0.010406494140625,
0.00835418701171875,
0.03375244140625,
0.01145172119140625,
0.01113128662109375,
0.04498291015625,
0.034759521484375,
0.04864501953125,
-0.020477294921875,
0.044219970703125,
-0.0340576171875,
0.06268310546875,
0.0625,
-0.0036869049072265625,
0.04595947265625,
0.041961669921875,
-0.005855560302734375,
0.0596923828125,
0.04656982421875,
-0.036529541015625,
0.055694580078125,
0.02056884765625,
-0.0377197265625,
-0.00943756103515625,
0.017578125,
-0.021392822265625,
0.0262451171875,
0.01421356201171875,
-0.04522705078125,
-0.0101776123046875,
-0.0157928466796875,
0.0107574462890625,
-0.0251922607421875,
-0.03216552734375,
0.00937652587890625,
-0.002841949462890625,
-0.04705810546875,
0.0860595703125,
0.0191650390625,
0.060302734375,
-0.02410888671875,
0.032958984375,
0.006496429443359375,
0.0289764404296875,
-0.0283966064453125,
-0.048858642578125,
0.034912109375,
-0.0137481689453125,
-0.0155487060546875,
-0.0248565673828125,
0.0604248046875,
-0.006805419921875,
-0.025634765625,
0.006107330322265625,
-0.0036678314208984375,
0.0179290771484375,
-0.004657745361328125,
-0.0677490234375,
0.00995635986328125,
0.019927978515625,
-0.0135650634765625,
0.0082855224609375,
-0.0013170242309570312,
0.0230560302734375,
0.07427978515625,
0.033447265625,
-0.028533935546875,
0.0101165771484375,
-0.0218048095703125,
0.05828857421875,
-0.03515625,
-0.0286712646484375,
-0.05145263671875,
0.048126220703125,
-0.0207977294921875,
-0.0297698974609375,
0.0406494140625,
0.043853759765625,
0.04730224609375,
-0.023162841796875,
0.052276611328125,
-0.0418701171875,
0.003658294677734375,
-0.033660888671875,
0.0576171875,
-0.050262451171875,
-0.006793975830078125,
-0.010284423828125,
-0.0718994140625,
-0.0012598037719726562,
0.083251953125,
-0.0005421638488769531,
0.0003170967102050781,
0.0693359375,
0.0439453125,
-0.00829315185546875,
-0.01351165771484375,
0.0047607421875,
-0.00783538818359375,
-0.00011116266250610352,
0.05181884765625,
0.0245361328125,
-0.041900634765625,
0.08258056640625,
-0.0159912109375,
-0.01177215576171875,
-0.023651123046875,
-0.0628662109375,
-0.093017578125,
-0.0460205078125,
-0.0294036865234375,
-0.05462646484375,
-0.01117706298828125,
0.046844482421875,
0.0697021484375,
-0.0443115234375,
-0.0020809173583984375,
-0.00623321533203125,
0.01751708984375,
-0.00029850006103515625,
-0.0169830322265625,
0.0270843505859375,
-0.03363037109375,
-0.052947998046875,
0.00789642333984375,
-0.0011720657348632812,
-0.004451751708984375,
-0.001682281494140625,
-0.0000476837158203125,
0.01105499267578125,
-0.00547027587890625,
0.048828125,
0.037261962890625,
-0.04644775390625,
-0.0308685302734375,
-0.00250244140625,
-0.024627685546875,
0.004825592041015625,
0.0552978515625,
-0.04150390625,
0.0171356201171875,
0.03326416015625,
0.031768798828125,
0.0626220703125,
-0.032928466796875,
0.036956787109375,
-0.063720703125,
0.042938232421875,
0.01702880859375,
0.047698974609375,
0.011627197265625,
-0.0076446533203125,
0.035400390625,
0.01314544677734375,
-0.06695556640625,
-0.046417236328125,
0.0347900390625,
-0.09600830078125,
-0.0153961181640625,
0.083740234375,
-0.007518768310546875,
-0.0017137527465820312,
0.00934600830078125,
-0.0159454345703125,
0.010467529296875,
-0.045166015625,
0.07537841796875,
0.054840087890625,
-0.0010995864868164062,
-0.0152740478515625,
-0.052154541015625,
0.02801513671875,
0.04510498046875,
-0.0276031494140625,
-0.0300445556640625,
0.0042266845703125,
0.0214080810546875,
0.0372314453125,
0.028167724609375,
0.005802154541015625,
0.0162506103515625,
-0.000782012939453125,
0.035247802734375,
0.0139312744140625,
-0.000347137451171875,
0.0164794921875,
0.006561279296875,
-0.021270751953125,
-0.0479736328125
]
] |
Helsinki-NLP/opus-mt-nl-en | 2023-08-16T12:01:39.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"marian",
"text2text-generation",
"translation",
"nl",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-nl-en | 8 | 82,564 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-nl-en
* source languages: nl
* target languages: en
* OPUS readme: [nl-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/nl-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-05.zip](https://object.pouta.csc.fi/OPUS-MT-models/nl-en/opus-2019-12-05.zip)
* test set translations: [opus-2019-12-05.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/nl-en/opus-2019-12-05.test.txt)
* test set scores: [opus-2019-12-05.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/nl-en/opus-2019-12-05.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.nl.en | 60.9 | 0.749 |
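The BLEU score reported above compares model translations against references by n-gram overlap. As a toy illustration of the idea (real corpus BLEU, as in the benchmark, combines 1- to 4-gram precisions over the whole test set; this stdlib sketch uses unigrams only, on made-up sentences):

```python
import math
from collections import Counter

def bleu1(hypothesis, reference):
    """Simplified sentence-level BLEU: unigram precision with a brevity penalty."""
    hyp, ref = hypothesis.split(), reference.split()
    # Clipped unigram matches between hypothesis and reference.
    overlap = sum((Counter(hyp) & Counter(ref)).values())
    precision = overlap / len(hyp)
    # Penalize hypotheses shorter than the reference.
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / len(hyp))
    return bp * precision

print(round(bleu1("the cat sat on the mat", "the cat is on the mat"), 3))  # 0.833
```

chr-F, the second benchmark column, applies the same overlap idea at the character level, which is more forgiving of morphological variation.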
| 818 | [
[
-0.018035888671875,
-0.033294677734375,
0.0171356201171875,
0.032867431640625,
-0.03460693359375,
-0.0285491943359375,
-0.034332275390625,
-0.00865936279296875,
0.004364013671875,
0.036041259765625,
-0.050323486328125,
-0.0435791015625,
-0.0438232421875,
0.021636962890625,
-0.01012420654296875,
0.054901123046875,
-0.01110076904296875,
0.035247802734375,
0.0169219970703125,
-0.0343017578125,
-0.0245208740234375,
-0.0274505615234375,
-0.0380859375,
-0.0245361328125,
0.02337646484375,
0.0225067138671875,
0.03082275390625,
0.029510498046875,
0.0694580078125,
0.0158538818359375,
-0.0111236572265625,
0.00568389892578125,
-0.03472900390625,
-0.004730224609375,
0.0039825439453125,
-0.04364013671875,
-0.054718017578125,
-0.01314544677734375,
0.07537841796875,
0.033935546875,
-0.006343841552734375,
0.0298309326171875,
-0.0031337738037109375,
0.070556640625,
-0.0219879150390625,
0.005970001220703125,
-0.0457763671875,
0.009490966796875,
-0.023712158203125,
-0.02410888671875,
-0.05084228515625,
-0.018707275390625,
0.0108489990234375,
-0.050445556640625,
-0.0030498504638671875,
0.01146697998046875,
0.104248046875,
0.0254974365234375,
-0.023468017578125,
-0.0135040283203125,
-0.0428466796875,
0.077392578125,
-0.06268310546875,
0.046661376953125,
0.031158447265625,
0.0185699462890625,
0.0213165283203125,
-0.04083251953125,
-0.0232391357421875,
0.009246826171875,
-0.01531219482421875,
0.0169830322265625,
-0.01203155517578125,
-0.0206756591796875,
0.0239105224609375,
0.0523681640625,
-0.05902099609375,
-0.0003566741943359375,
-0.042755126953125,
0.0007262229919433594,
0.0521240234375,
0.00696563720703125,
0.01078033447265625,
-0.01409149169921875,
-0.032440185546875,
-0.042083740234375,
-0.054412841796875,
0.00716400146484375,
0.0274505615234375,
0.0233154296875,
-0.035400390625,
0.051910400390625,
-0.007587432861328125,
0.049163818359375,
-0.0020008087158203125,
-0.000017523765563964844,
0.07366943359375,
-0.031646728515625,
-0.027496337890625,
-0.007354736328125,
0.08624267578125,
0.025909423828125,
0.005695343017578125,
0.0028133392333984375,
-0.0211181640625,
-0.0206756591796875,
0.00928497314453125,
-0.062408447265625,
-0.005157470703125,
0.01313018798828125,
-0.035797119140625,
-0.00771331787109375,
0.003787994384765625,
-0.044281005859375,
0.0160064697265625,
-0.032012939453125,
0.0433349609375,
-0.046905517578125,
-0.02178955078125,
0.03045654296875,
0.0021514892578125,
0.02972412109375,
0.00101470947265625,
-0.043670654296875,
0.01201629638671875,
0.0283050537109375,
0.054412841796875,
-0.032012939453125,
-0.0203399658203125,
-0.03424072265625,
-0.01415252685546875,
-0.00971221923828125,
0.047943115234375,
-0.0026798248291015625,
-0.02801513671875,
-0.0017910003662109375,
0.03472900390625,
-0.0258636474609375,
-0.027557373046875,
0.09844970703125,
-0.0255126953125,
0.054046630859375,
-0.0313720703125,
-0.040374755859375,
-0.0253753662109375,
0.03680419921875,
-0.04522705078125,
0.09515380859375,
0.006374359130859375,
-0.06341552734375,
0.01499176025390625,
-0.06170654296875,
-0.016204833984375,
-0.0009407997131347656,
0.006717681884765625,
-0.04742431640625,
0.006900787353515625,
0.010498046875,
0.029754638671875,
-0.0238037109375,
0.025482177734375,
0.0038814544677734375,
-0.0245208740234375,
0.0038204193115234375,
-0.0299072265625,
0.079833984375,
0.0198211669921875,
-0.024322509765625,
0.016845703125,
-0.0699462890625,
-0.00518035888671875,
0.0024509429931640625,
-0.038970947265625,
-0.01468658447265625,
0.0091094970703125,
0.022125244140625,
0.0098724365234375,
0.0251617431640625,
-0.04718017578125,
0.0166778564453125,
-0.05145263671875,
0.00792694091796875,
0.04681396484375,
-0.0243377685546875,
0.027313232421875,
-0.03125,
0.0245361328125,
0.005481719970703125,
0.00766754150390625,
0.002758026123046875,
-0.033843994140625,
-0.0640869140625,
-0.0128631591796875,
0.0469970703125,
0.08050537109375,
-0.058319091796875,
0.067138671875,
-0.049957275390625,
-0.05535888671875,
-0.059600830078125,
-0.00702667236328125,
0.03399658203125,
0.0262451171875,
0.040740966796875,
-0.014678955078125,
-0.035980224609375,
-0.07940673828125,
-0.00913238525390625,
-0.009979248046875,
-0.0200347900390625,
0.010711669921875,
0.043487548828125,
-0.01309967041015625,
0.041351318359375,
-0.037109375,
-0.031768798828125,
-0.0143890380859375,
0.007152557373046875,
0.040985107421875,
0.046112060546875,
0.03912353515625,
-0.06646728515625,
-0.043304443359375,
-0.0014715194702148438,
-0.0570068359375,
-0.0114288330078125,
0.00617218017578125,
-0.015899658203125,
0.00737762451171875,
0.009765625,
-0.01922607421875,
0.0061187744140625,
0.0521240234375,
-0.043487548828125,
0.040679931640625,
-0.0087890625,
0.01470184326171875,
-0.09771728515625,
0.0113372802734375,
-0.01236724853515625,
-0.00838470458984375,
-0.0303955078125,
0.0007586479187011719,
0.019866943359375,
0.0072021484375,
-0.0625,
0.0413818359375,
-0.0160369873046875,
-0.0035266876220703125,
0.0196533203125,
0.0019683837890625,
0.007350921630859375,
0.05450439453125,
-0.0025463104248046875,
0.0596923828125,
0.05279541015625,
-0.03863525390625,
0.01154327392578125,
0.04449462890625,
-0.033355712890625,
0.02935791015625,
-0.06451416015625,
-0.0208587646484375,
0.023284912109375,
-0.00811004638671875,
-0.044281005859375,
0.00994873046875,
0.0205230712890625,
-0.0457763671875,
0.0286102294921875,
-0.0007200241088867188,
-0.05670166015625,
-0.00010138750076293945,
-0.019866943359375,
0.03265380859375,
0.049102783203125,
-0.014373779296875,
0.047515869140625,
0.0045166015625,
0.0012369155883789062,
-0.0361328125,
-0.075927734375,
-0.007526397705078125,
-0.027008056640625,
-0.056243896484375,
0.01690673828125,
-0.03143310546875,
-0.003833770751953125,
0.004024505615234375,
0.0247039794921875,
-0.003833770751953125,
0.0046539306640625,
0.0022373199462890625,
0.01480865478515625,
-0.039031982421875,
0.00823211669921875,
0.00115203857421875,
-0.0126190185546875,
-0.00994110107421875,
-0.009979248046875,
0.04443359375,
-0.0286102294921875,
-0.0189208984375,
-0.043853759765625,
0.00439453125,
0.037384033203125,
-0.031341552734375,
0.062744140625,
0.044586181640625,
-0.006504058837890625,
0.010833740234375,
-0.028656005859375,
0.006023406982421875,
-0.032623291015625,
0.00980377197265625,
-0.0286102294921875,
-0.058319091796875,
0.037689208984375,
0.0105133056640625,
0.03521728515625,
0.06268310546875,
0.046234130859375,
0.0038814544677734375,
0.043914794921875,
0.0233306884765625,
0.00370025634765625,
0.0308074951171875,
-0.0345458984375,
-0.00969696044921875,
-0.0821533203125,
0.0086212158203125,
-0.049224853515625,
-0.02484130859375,
-0.06121826171875,
-0.0201873779296875,
0.0186309814453125,
0.00653839111328125,
-0.018402099609375,
0.0521240234375,
-0.04327392578125,
0.01824951171875,
0.042327880859375,
-0.01047515869140625,
0.0233154296875,
0.001190185546875,
-0.03759765625,
-0.0168304443359375,
-0.035430908203125,
-0.040435791015625,
0.09490966796875,
0.0289306640625,
0.0190277099609375,
0.0204925537109375,
0.03656005859375,
0.0005202293395996094,
0.0166778564453125,
-0.04412841796875,
0.033050537109375,
-0.0240936279296875,
-0.05450439453125,
-0.02496337890625,
-0.0445556640625,
-0.06524658203125,
0.035858154296875,
-0.0207366943359375,
-0.034210205078125,
0.0124969482421875,
-0.001613616943359375,
-0.008514404296875,
0.0340576171875,
-0.0491943359375,
0.08221435546875,
-0.0079803466796875,
-0.00611114501953125,
0.0251312255859375,
-0.036712646484375,
0.0206298828125,
-0.0031871795654296875,
0.0206146240234375,
-0.0161590576171875,
0.01174163818359375,
0.050628662109375,
-0.0029354095458984375,
0.034637451171875,
-0.003509521484375,
-0.0087738037109375,
0.002033233642578125,
0.006626129150390625,
0.027557373046875,
-0.007526397705078125,
-0.03515625,
0.033355712890625,
-0.0013647079467773438,
-0.03204345703125,
-0.0096588134765625,
0.035247802734375,
-0.051422119140625,
-0.0004897117614746094,
-0.031585693359375,
-0.047210693359375,
0.0014848709106445312,
0.02691650390625,
0.050933837890625,
0.04833984375,
-0.0190887451171875,
0.042938232421875,
0.06268310546875,
-0.0282135009765625,
0.031768798828125,
0.05401611328125,
-0.0168609619140625,
-0.0399169921875,
0.060699462890625,
0.0071868896484375,
0.027862548828125,
0.046051025390625,
0.0098114013671875,
-0.01035308837890625,
-0.05670166015625,
-0.0518798828125,
0.0185546875,
-0.02264404296875,
-0.01451873779296875,
-0.04083251953125,
-0.006351470947265625,
-0.016754150390625,
0.01505279541015625,
-0.039031982421875,
-0.038970947265625,
-0.01045989990234375,
-0.0172882080078125,
0.016571044921875,
0.0168304443359375,
-0.00391387939453125,
0.03448486328125,
-0.0753173828125,
0.013824462890625,
-0.00824737548828125,
0.0274505615234375,
-0.02960205078125,
-0.05810546875,
-0.032989501953125,
0.003814697265625,
-0.04718017578125,
-0.047576904296875,
0.039642333984375,
0.008758544921875,
0.01806640625,
0.0235137939453125,
0.01265716552734375,
0.0239715576171875,
-0.05438232421875,
0.07366943359375,
-0.0009016990661621094,
-0.053955078125,
0.034576416015625,
-0.0323486328125,
0.03619384765625,
0.06884765625,
0.0194244384765625,
-0.0255126953125,
-0.038787841796875,
-0.051788330078125,
-0.06390380859375,
0.05853271484375,
0.054412841796875,
-0.0086822509765625,
0.0153350830078125,
-0.0092010498046875,
-0.0024776458740234375,
0.0128173828125,
-0.0870361328125,
-0.029571533203125,
0.00539398193359375,
-0.025421142578125,
-0.0174560546875,
-0.0186309814453125,
-0.016998291015625,
-0.01361846923828125,
0.07916259765625,
0.01134490966796875,
0.01293182373046875,
0.03326416015625,
-0.01190185546875,
-0.01593017578125,
0.023529052734375,
0.073974609375,
0.0413818359375,
-0.043609619140625,
-0.01284027099609375,
0.025177001953125,
-0.02899169921875,
-0.01300811767578125,
0.006114959716796875,
-0.0330810546875,
0.02508544921875,
0.039794921875,
0.08380126953125,
0.0167388916015625,
-0.048126220703125,
0.03375244140625,
-0.0299072265625,
-0.033447265625,
-0.04974365234375,
-0.01265716552734375,
0.01091766357421875,
-0.00033736228942871094,
0.020538330078125,
0.0118560791015625,
0.01175689697265625,
-0.011444091796875,
0.01335906982421875,
0.0022411346435546875,
-0.049896240234375,
-0.038909912109375,
0.035247802734375,
0.007549285888671875,
-0.0273590087890625,
0.037567138671875,
-0.0309600830078125,
-0.04193115234375,
0.028167724609375,
0.01090240478515625,
0.07568359375,
-0.015655517578125,
-0.0157470703125,
0.05572509765625,
0.04583740234375,
-0.01995849609375,
0.033294677734375,
0.0108795166015625,
-0.05517578125,
-0.04364013671875,
-0.067626953125,
-0.01451873779296875,
0.005107879638671875,
-0.063720703125,
0.025054931640625,
0.02435302734375,
0.00443267822265625,
-0.0263824462890625,
0.01410675048828125,
-0.040435791015625,
0.0085601806640625,
-0.0179443359375,
0.079345703125,
-0.06884765625,
0.06451416015625,
0.032501220703125,
-0.0177001953125,
-0.0640869140625,
-0.016632080078125,
-0.0164642333984375,
-0.0328369140625,
0.04632568359375,
0.012298583984375,
0.0243377685546875,
-0.01149749755859375,
-0.014678955078125,
-0.060302734375,
0.08160400390625,
0.01751708984375,
-0.047332763671875,
0.00067138671875,
0.01206207275390625,
0.039459228515625,
-0.0239410400390625,
0.0061492919921875,
0.0291290283203125,
0.056243896484375,
0.003551483154296875,
-0.08331298828125,
-0.022705078125,
-0.040252685546875,
-0.0234527587890625,
0.038543701171875,
-0.038909912109375,
0.07293701171875,
0.037689208984375,
-0.01023101806640625,
0.00101470947265625,
0.047454833984375,
0.02618408203125,
0.023101806640625,
0.041595458984375,
0.08673095703125,
0.028656005859375,
-0.034698486328125,
0.07928466796875,
-0.02294921875,
0.038238525390625,
0.08880615234375,
-0.00817108154296875,
0.0714111328125,
0.02374267578125,
-0.0086822509765625,
0.037322998046875,
0.044189453125,
-0.022613525390625,
0.03515625,
0.00408172607421875,
0.016632080078125,
-0.00794219970703125,
0.0164642333984375,
-0.05181884765625,
0.0198211669921875,
0.01306915283203125,
-0.01485443115234375,
0.00212860107421875,
-0.003635406494140625,
-0.0005335807800292969,
-0.00007957220077514648,
-0.01105499267578125,
0.04742431640625,
0.0005621910095214844,
-0.042327880859375,
0.05621337890625,
-0.005718231201171875,
0.056121826171875,
-0.053741455078125,
0.01232147216796875,
-0.004077911376953125,
0.01580810546875,
-0.0018749237060546875,
-0.045013427734375,
0.04205322265625,
0.00035881996154785156,
-0.0200042724609375,
-0.03460693359375,
0.0114593505859375,
-0.0401611328125,
-0.0682373046875,
0.03533935546875,
0.033111572265625,
0.0250091552734375,
0.00609588623046875,
-0.0679931640625,
0.005695343017578125,
0.01262664794921875,
-0.04656982421875,
0.003326416015625,
0.05181884765625,
0.02520751953125,
0.0305633544921875,
0.048797607421875,
0.0184173583984375,
0.017120361328125,
-0.0005655288696289062,
0.04583740234375,
-0.032562255859375,
-0.0294647216796875,
-0.057891845703125,
0.060943603515625,
-0.0117950439453125,
-0.05145263671875,
0.056427001953125,
0.07611083984375,
0.07940673828125,
-0.01062774658203125,
0.017547607421875,
-0.0024929046630859375,
0.0548095703125,
-0.0521240234375,
0.045562744140625,
-0.07049560546875,
0.0184478759765625,
-0.010498046875,
-0.0701904296875,
-0.022552490234375,
0.027435302734375,
-0.01355743408203125,
-0.0298004150390625,
0.0574951171875,
0.047088623046875,
-0.01473236083984375,
-0.0156707763671875,
0.0221099853515625,
0.0215301513671875,
0.015380859375,
0.042388916015625,
0.0261993408203125,
-0.07440185546875,
0.038909912109375,
-0.022247314453125,
-0.002651214599609375,
-0.0014371871948242188,
-0.05584716796875,
-0.061798095703125,
-0.04412841796875,
-0.01242828369140625,
-0.0169525146484375,
-0.0230865478515625,
0.06524658203125,
0.03753662109375,
-0.0711669921875,
-0.044219970703125,
0.0023632049560546875,
0.00949859619140625,
-0.0160064697265625,
-0.0194549560546875,
0.04931640625,
-0.0230712890625,
-0.0706787109375,
0.03759765625,
0.006992340087890625,
-0.0087890625,
-0.0005869865417480469,
-0.022918701171875,
-0.039093017578125,
-0.002735137939453125,
0.0245361328125,
0.00020325183868408203,
-0.039642333984375,
0.01100921630859375,
0.01113128662109375,
-0.00647735595703125,
0.0308990478515625,
0.024169921875,
-0.016265869140625,
0.0174713134765625,
0.058319091796875,
0.0298919677734375,
0.0310516357421875,
-0.01129913330078125,
0.03936767578125,
-0.056610107421875,
0.025665283203125,
0.019866943359375,
0.04510498046875,
0.0290985107421875,
-0.004161834716796875,
0.0638427734375,
0.0111236572265625,
-0.047454833984375,
-0.07965087890625,
0.005382537841796875,
-0.09368896484375,
-0.0022430419921875,
0.06939697265625,
-0.0213165283203125,
-0.022918701171875,
0.02276611328125,
-0.0106964111328125,
0.0104522705078125,
-0.0251007080078125,
0.028350830078125,
0.06329345703125,
0.02825927734375,
0.00799560546875,
-0.0548095703125,
0.0261077880859375,
0.0423583984375,
-0.0521240234375,
-0.01401519775390625,
0.01067352294921875,
0.007587432861328125,
0.033599853515625,
0.0355224609375,
-0.0209503173828125,
0.0063323974609375,
-0.025421142578125,
0.032989501953125,
-0.004497528076171875,
-0.01129913330078125,
-0.0275421142578125,
0.0038509368896484375,
-0.006740570068359375,
-0.01457977294921875
]
] |
Helsinki-NLP/opus-mt-en-fr | 2023-08-16T11:29:35.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"marian",
"text2text-generation",
"translation",
"en",
"fr",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-fr | 21 | 82,506 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-en-fr
* source languages: en
* target languages: fr
* OPUS readme: [en-fr](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/en-fr/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-02-26.zip](https://object.pouta.csc.fi/OPUS-MT-models/en-fr/opus-2020-02-26.zip)
* test set translations: [opus-2020-02-26.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-fr/opus-2020-02-26.test.txt)
* test set scores: [opus-2020-02-26.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-fr/opus-2020-02-26.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdiscussdev2015-enfr.en.fr | 33.8 | 0.602 |
| newsdiscusstest2015-enfr.en.fr | 40.0 | 0.643 |
| newssyscomb2009.en.fr | 29.8 | 0.584 |
| news-test2008.en.fr | 27.5 | 0.554 |
| newstest2009.en.fr | 29.4 | 0.577 |
| newstest2010.en.fr | 32.7 | 0.596 |
| newstest2011.en.fr | 34.3 | 0.611 |
| newstest2012.en.fr | 31.8 | 0.592 |
| newstest2013.en.fr | 33.2 | 0.589 |
| Tatoeba.en.fr | 50.5 | 0.672 |
| 1,205 | [
[
-0.031463623046875,
-0.02716064453125,
0.0208587646484375,
0.02960205078125,
-0.0281829833984375,
-0.027679443359375,
-0.0244598388671875,
-0.01018524169921875,
0.00597381591796875,
0.030517578125,
-0.059844970703125,
-0.041839599609375,
-0.04644775390625,
0.017822265625,
-0.0051422119140625,
0.0535888671875,
-0.0148162841796875,
0.032135009765625,
0.0121917724609375,
-0.03564453125,
-0.02752685546875,
-0.028778076171875,
-0.032318115234375,
-0.0283660888671875,
0.022796630859375,
0.034637451171875,
0.0262451171875,
0.0303192138671875,
0.06866455078125,
0.01824951171875,
-0.01007080078125,
0.0027217864990234375,
-0.0301666259765625,
-0.01264190673828125,
0.0166473388671875,
-0.0426025390625,
-0.059539794921875,
-0.00760650634765625,
0.0712890625,
0.03387451171875,
-0.003681182861328125,
0.0307159423828125,
0.0019664764404296875,
0.07379150390625,
-0.019256591796875,
0.0022411346435546875,
-0.038177490234375,
0.0107574462890625,
-0.024200439453125,
-0.024444580078125,
-0.042877197265625,
-0.01509857177734375,
0.0011396408081054688,
-0.043792724609375,
0.00919342041015625,
0.0119171142578125,
0.1068115234375,
0.017425537109375,
-0.0207977294921875,
-0.00659942626953125,
-0.0384521484375,
0.07598876953125,
-0.060211181640625,
0.043792724609375,
0.02679443359375,
0.01678466796875,
0.0085906982421875,
-0.03985595703125,
-0.027862548828125,
0.015533447265625,
-0.0183868408203125,
0.0220947265625,
-0.01102447509765625,
-0.0191497802734375,
0.0223236083984375,
0.05670166015625,
-0.05999755859375,
0.0008397102355957031,
-0.043731689453125,
0.002742767333984375,
0.05279541015625,
0.01458740234375,
0.01226043701171875,
-0.011383056640625,
-0.03399658203125,
-0.039459228515625,
-0.05780029296875,
0.01430511474609375,
0.0299224853515625,
0.0191497802734375,
-0.03607177734375,
0.046905517578125,
-0.014404296875,
0.04803466796875,
-0.0013933181762695312,
-0.005889892578125,
0.0767822265625,
-0.02630615234375,
-0.0234527587890625,
-0.009796142578125,
0.09124755859375,
0.0333251953125,
0.0007386207580566406,
0.011871337890625,
-0.0203857421875,
-0.019622802734375,
0.004634857177734375,
-0.07171630859375,
-0.0035228729248046875,
0.0148468017578125,
-0.033782958984375,
-0.00974273681640625,
0.00640106201171875,
-0.05291748046875,
0.0205078125,
-0.0265350341796875,
0.040435791015625,
-0.041839599609375,
-0.0166168212890625,
0.026275634765625,
0.001064300537109375,
0.027374267578125,
0.000522613525390625,
-0.0458984375,
0.0159149169921875,
0.02801513671875,
0.053375244140625,
-0.0269775390625,
-0.0179443359375,
-0.0244140625,
-0.01444244384765625,
-0.01043701171875,
0.050018310546875,
-0.00844573974609375,
-0.0308990478515625,
-0.00899505615234375,
0.0360107421875,
-0.025970458984375,
-0.0268402099609375,
0.09637451171875,
-0.018707275390625,
0.05828857421875,
-0.03057861328125,
-0.039459228515625,
-0.023681640625,
0.036865234375,
-0.038543701171875,
0.09881591796875,
0.0051422119140625,
-0.06341552734375,
0.017242431640625,
-0.059295654296875,
-0.005916595458984375,
-0.007656097412109375,
-0.0001380443572998047,
-0.053955078125,
0.0008997917175292969,
0.01410675048828125,
0.03057861328125,
-0.0291900634765625,
0.020721435546875,
-0.0028171539306640625,
-0.0222320556640625,
-0.00043463706970214844,
-0.029876708984375,
0.07977294921875,
0.02294921875,
-0.0213623046875,
0.01546478271484375,
-0.07330322265625,
0.0002887248992919922,
0.0087738037109375,
-0.0357666015625,
-0.01024627685546875,
0.0032939910888671875,
0.015899658203125,
0.007373809814453125,
0.0159149169921875,
-0.048370361328125,
0.016510009765625,
-0.04779052734375,
0.0181884765625,
0.046600341796875,
-0.015655517578125,
0.0278778076171875,
-0.036895751953125,
0.027252197265625,
0.0083770751953125,
0.01448822021484375,
0.00506591796875,
-0.034423828125,
-0.06268310546875,
-0.0203857421875,
0.0343017578125,
0.0750732421875,
-0.048370361328125,
0.06591796875,
-0.048187255859375,
-0.059967041015625,
-0.048370361328125,
-0.0101165771484375,
0.028717041015625,
0.03680419921875,
0.03692626953125,
-0.01325225830078125,
-0.0311431884765625,
-0.0843505859375,
-0.0076141357421875,
-0.009796142578125,
-0.01317596435546875,
0.0167236328125,
0.052154541015625,
-0.00579833984375,
0.044189453125,
-0.0458984375,
-0.032135009765625,
-0.015869140625,
0.01326751708984375,
0.04473876953125,
0.05126953125,
0.045654296875,
-0.0675048828125,
-0.045318603515625,
-0.00597381591796875,
-0.048187255859375,
-0.0157928466796875,
0.00536346435546875,
-0.020660400390625,
0.00200653076171875,
0.008148193359375,
-0.0275726318359375,
0.00726318359375,
0.045074462890625,
-0.046356201171875,
0.04229736328125,
-0.0067901611328125,
0.020477294921875,
-0.1024169921875,
0.00634002685546875,
-0.01081085205078125,
-0.0061798095703125,
-0.03680419921875,
-0.0037708282470703125,
0.019195556640625,
0.01023101806640625,
-0.05322265625,
0.045074462890625,
-0.027557373046875,
-0.003353118896484375,
0.0244140625,
-0.0025482177734375,
0.00846099853515625,
0.054046630859375,
-0.00426483154296875,
0.058868408203125,
0.05322265625,
-0.03375244140625,
0.0105743408203125,
0.0379638671875,
-0.038116455078125,
0.031097412109375,
-0.056610107421875,
-0.0222015380859375,
0.0140380859375,
-0.0031147003173828125,
-0.057098388671875,
0.000774383544921875,
0.027099609375,
-0.05169677734375,
0.033660888671875,
-0.00600433349609375,
-0.049896240234375,
-0.01338958740234375,
-0.0235748291015625,
0.0290374755859375,
0.046539306640625,
-0.01290130615234375,
0.0418701171875,
0.01319122314453125,
-0.004528045654296875,
-0.033843994140625,
-0.07366943359375,
-0.01465606689453125,
-0.0325927734375,
-0.058807373046875,
0.023406982421875,
-0.03057861328125,
0.0025119781494140625,
-0.00015974044799804688,
0.018646240234375,
-0.0074005126953125,
-0.000278472900390625,
0.007808685302734375,
0.01953125,
-0.032928466796875,
0.0019664764404296875,
-0.00359344482421875,
-0.01241302490234375,
-0.004787445068359375,
-0.006206512451171875,
0.044219970703125,
-0.02996826171875,
-0.0242156982421875,
-0.039459228515625,
0.00362396240234375,
0.04388427734375,
-0.0304412841796875,
0.060333251953125,
0.042999267578125,
-0.01151275634765625,
0.014251708984375,
-0.030059814453125,
0.0038814544677734375,
-0.03363037109375,
0.0135345458984375,
-0.035369873046875,
-0.0606689453125,
0.046295166015625,
0.01387786865234375,
0.038543701171875,
0.06646728515625,
0.046722412109375,
0.00635528564453125,
0.061798095703125,
0.0223388671875,
0.01222991943359375,
0.035888671875,
-0.040252685546875,
-0.010772705078125,
-0.076904296875,
-0.005435943603515625,
-0.0504150390625,
-0.0310821533203125,
-0.063232421875,
-0.02056884765625,
0.025604248046875,
0.00379180908203125,
-0.02685546875,
0.04931640625,
-0.046905517578125,
0.0172119140625,
0.0426025390625,
-0.00577545166015625,
0.019073486328125,
0.00469207763671875,
-0.039825439453125,
-0.0185699462890625,
-0.03759765625,
-0.032470703125,
0.088134765625,
0.030914306640625,
0.0229339599609375,
0.01947021484375,
0.047271728515625,
-0.00286102294921875,
0.021270751953125,
-0.043060302734375,
0.036590576171875,
-0.01502227783203125,
-0.060943603515625,
-0.02392578125,
-0.0467529296875,
-0.054168701171875,
0.04083251953125,
-0.0191192626953125,
-0.04425048828125,
0.0208892822265625,
-0.0010318756103515625,
-0.01345062255859375,
0.0360107421875,
-0.049285888671875,
0.08319091796875,
-0.00832366943359375,
-0.0127410888671875,
0.0223541259765625,
-0.03765869140625,
0.02362060546875,
0.0013427734375,
0.0261993408203125,
-0.0225372314453125,
0.010101318359375,
0.057220458984375,
-0.01503753662109375,
0.030792236328125,
-0.0062255859375,
-0.00461578369140625,
0.00748443603515625,
0.004428863525390625,
0.032470703125,
-0.00995635986328125,
-0.023590087890625,
0.02581787109375,
0.005504608154296875,
-0.0301361083984375,
-0.00864410400390625,
0.041961669921875,
-0.05450439453125,
-0.0141143798828125,
-0.038116455078125,
-0.045074462890625,
-0.002079010009765625,
0.0312347412109375,
0.05120849609375,
0.047332763671875,
-0.0201416015625,
0.045257568359375,
0.059234619140625,
-0.025634765625,
0.0286407470703125,
0.052276611328125,
-0.01338958740234375,
-0.04296875,
0.06182861328125,
0.007717132568359375,
0.0241241455078125,
0.04449462890625,
0.01165008544921875,
-0.0156097412109375,
-0.048919677734375,
-0.052825927734375,
0.0167236328125,
-0.02276611328125,
-0.0169525146484375,
-0.04547119140625,
-0.0081024169921875,
-0.02001953125,
0.00693511962890625,
-0.0386962890625,
-0.042877197265625,
-0.015899658203125,
-0.0149078369140625,
0.0195770263671875,
0.01495361328125,
-0.01055908203125,
0.0323486328125,
-0.0736083984375,
0.01184844970703125,
-0.01047515869140625,
0.026458740234375,
-0.0300750732421875,
-0.0631103515625,
-0.0266876220703125,
0.003086090087890625,
-0.047393798828125,
-0.053009033203125,
0.04412841796875,
0.0115814208984375,
0.0220489501953125,
0.028961181640625,
0.01030731201171875,
0.03521728515625,
-0.0523681640625,
0.0718994140625,
0.0130157470703125,
-0.044769287109375,
0.0360107421875,
-0.032196044921875,
0.03314208984375,
0.0648193359375,
0.0193939208984375,
-0.0260772705078125,
-0.039337158203125,
-0.055206298828125,
-0.0672607421875,
0.067626953125,
0.050445556640625,
-0.007373809814453125,
0.01268768310546875,
-0.01019287109375,
-0.0032711029052734375,
0.007793426513671875,
-0.08026123046875,
-0.0391845703125,
0.00867462158203125,
-0.0288543701171875,
-0.00922393798828125,
-0.0208282470703125,
-0.019561767578125,
-0.0237884521484375,
0.07470703125,
0.01116943359375,
0.021820068359375,
0.0309295654296875,
0.0024814605712890625,
-0.01255035400390625,
0.0278167724609375,
0.06988525390625,
0.040283203125,
-0.041839599609375,
-0.0083770751953125,
0.0236663818359375,
-0.03448486328125,
-0.00804901123046875,
0.0100860595703125,
-0.0302886962890625,
0.0190582275390625,
0.0289764404296875,
0.07373046875,
0.0128173828125,
-0.04095458984375,
0.036224365234375,
-0.0250396728515625,
-0.03961181640625,
-0.052154541015625,
-0.01123809814453125,
0.00893402099609375,
0.005764007568359375,
0.0180816650390625,
0.01458740234375,
0.0105743408203125,
-0.0163726806640625,
0.01357269287109375,
0.01070404052734375,
-0.047149658203125,
-0.034698486328125,
0.0460205078125,
0.00719451904296875,
-0.0145721435546875,
0.0310211181640625,
-0.0253143310546875,
-0.044677734375,
0.036346435546875,
0.0116119384765625,
0.07586669921875,
-0.0164642333984375,
-0.016693115234375,
0.062255859375,
0.043701171875,
-0.0193634033203125,
0.038543701171875,
0.01207733154296875,
-0.049407958984375,
-0.031402587890625,
-0.0638427734375,
-0.005107879638671875,
0.0088653564453125,
-0.0662841796875,
0.0283050537109375,
0.022003173828125,
-0.0054168701171875,
-0.0214385986328125,
0.0178070068359375,
-0.04547119140625,
0.00966644287109375,
-0.0164031982421875,
0.08355712890625,
-0.07183837890625,
0.0621337890625,
0.037628173828125,
-0.0225372314453125,
-0.06109619140625,
-0.0209503173828125,
-0.011383056640625,
-0.038421630859375,
0.04437255859375,
0.011016845703125,
0.024658203125,
-0.01030731201171875,
-0.0212860107421875,
-0.06939697265625,
0.08734130859375,
0.00887298583984375,
-0.044708251953125,
0.005889892578125,
0.01296234130859375,
0.034576416015625,
-0.0277862548828125,
0.01215362548828125,
0.031158447265625,
0.056549072265625,
0.0128021240234375,
-0.076416015625,
-0.0115509033203125,
-0.043426513671875,
-0.0267333984375,
0.04388427734375,
-0.0517578125,
0.0770263671875,
0.0276641845703125,
-0.00875091552734375,
0.0009241104125976562,
0.043853759765625,
0.0270538330078125,
0.021026611328125,
0.040557861328125,
0.08721923828125,
0.0323486328125,
-0.042022705078125,
0.0701904296875,
-0.028839111328125,
0.042877197265625,
0.08319091796875,
0.0009393692016601562,
0.06591796875,
0.0267791748046875,
-0.0187225341796875,
0.036407470703125,
0.052490234375,
-0.02288818359375,
0.038604736328125,
-0.00037360191345214844,
0.00832366943359375,
-0.0167388916015625,
0.018218994140625,
-0.053802490234375,
0.0138702392578125,
0.0189971923828125,
-0.0189056396484375,
-0.0018405914306640625,
-0.00922393798828125,
0.006526947021484375,
-0.01331329345703125,
-0.009185791015625,
0.03680419921875,
0.0009360313415527344,
-0.041656494140625,
0.054534912109375,
-0.0014085769653320312,
0.045013427734375,
-0.051513671875,
0.007091522216796875,
-0.0090179443359375,
0.0222015380859375,
-0.006336212158203125,
-0.051605224609375,
0.037261962890625,
0.0053558349609375,
-0.023223876953125,
-0.035186767578125,
0.015167236328125,
-0.038055419921875,
-0.07000732421875,
0.0235595703125,
0.0303497314453125,
0.0216217041015625,
0.004703521728515625,
-0.064208984375,
-0.00034689903259277344,
0.014007568359375,
-0.052398681640625,
0.004413604736328125,
0.052215576171875,
0.024169921875,
0.033660888671875,
0.044647216796875,
0.01523590087890625,
0.01702880859375,
-0.006252288818359375,
0.054901123046875,
-0.0305023193359375,
-0.03460693359375,
-0.057952880859375,
0.0595703125,
-0.00914764404296875,
-0.0487060546875,
0.051666259765625,
0.07818603515625,
0.07452392578125,
-0.0039215087890625,
0.020111083984375,
-0.0153350830078125,
0.05450439453125,
-0.047210693359375,
0.04620361328125,
-0.07965087890625,
0.01922607421875,
-0.00464630126953125,
-0.06768798828125,
-0.0215911865234375,
0.024505615234375,
-0.022735595703125,
-0.0235748291015625,
0.050201416015625,
0.051544189453125,
-0.01018524169921875,
-0.01201629638671875,
0.016845703125,
0.023895263671875,
0.015777587890625,
0.046417236328125,
0.0311431884765625,
-0.0703125,
0.04022216796875,
-0.0273895263671875,
-0.00936126708984375,
-0.0070037841796875,
-0.05035400390625,
-0.06146240234375,
-0.043365478515625,
-0.0097198486328125,
-0.01523590087890625,
-0.02484130859375,
0.06695556640625,
0.037994384765625,
-0.07073974609375,
-0.03680419921875,
0.0009002685546875,
0.005992889404296875,
-0.014556884765625,
-0.021270751953125,
0.048675537109375,
-0.01470184326171875,
-0.0689697265625,
0.030670166015625,
0.0028400421142578125,
-0.0008254051208496094,
0.0009012222290039062,
-0.0251007080078125,
-0.03228759765625,
-0.00513458251953125,
0.023345947265625,
0.0020427703857421875,
-0.040313720703125,
0.006748199462890625,
0.01049041748046875,
-0.0060882568359375,
0.03448486328125,
0.01776123046875,
-0.01385498046875,
0.0153350830078125,
0.06744384765625,
0.01399993896484375,
0.038482666015625,
-0.00725555419921875,
0.03253173828125,
-0.056976318359375,
0.023529052734375,
0.01467132568359375,
0.04559326171875,
0.0201873779296875,
-0.0014553070068359375,
0.058868408203125,
0.019378662109375,
-0.045196533203125,
-0.080810546875,
0.00030612945556640625,
-0.0848388671875,
-0.0017375946044921875,
0.070068359375,
-0.01468658447265625,
-0.0206451416015625,
0.02728271484375,
-0.007747650146484375,
0.00641632080078125,
-0.0269012451171875,
0.03082275390625,
0.07000732421875,
0.01468658447265625,
0.003292083740234375,
-0.06317138671875,
0.0267181396484375,
0.027069091796875,
-0.05133056640625,
-0.01434326171875,
0.016143798828125,
0.01371002197265625,
0.029266357421875,
0.0413818359375,
-0.02557373046875,
0.00006580352783203125,
-0.01580810546875,
0.03173828125,
-0.0065155029296875,
-0.01450347900390625,
-0.01708984375,
0.0008940696716308594,
-0.00772857666015625,
-0.020660400390625
]
] |
timm/vit_base_r50_s16_384.orig_in21k_ft_in1k | 2023-05-06T00:43:14.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-21k",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_base_r50_s16_384.orig_in21k_ft_in1k | 2 | 82,446 | timm | 2022-12-23T00:27:16 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
- imagenet-21k
---
# Model card for vit_base_r50_s16_384.orig_in21k_ft_in1k
A ResNet - Vision Transformer (ViT) hybrid image classification model. Trained on ImageNet-21k and fine-tuned on ImageNet-1k in JAX by paper authors, ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 99.0
- GMACs: 61.3
- Activations (M): 81.8
- Image size: 384 x 384
- **Papers:**
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:** ImageNet-21k
- **Original:** https://github.com/google-research/vision_transformer
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_base_r50_s16_384.orig_in21k_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_base_r50_s16_384.orig_in21k_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 577, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,410 | [
[
-0.035552978515625,
-0.027984619140625,
-0.0012493133544921875,
0.0056304931640625,
-0.028472900390625,
-0.017974853515625,
-0.0186920166015625,
-0.036163330078125,
0.0203399658203125,
0.0219573974609375,
-0.03692626953125,
-0.044708251953125,
-0.04852294921875,
-0.0006966590881347656,
-0.0141448974609375,
0.0789794921875,
-0.01314544677734375,
-0.0028057098388671875,
-0.017608642578125,
-0.04107666015625,
-0.00904083251953125,
-0.0241851806640625,
-0.04656982421875,
-0.03631591796875,
0.0281524658203125,
0.00911712646484375,
0.048858642578125,
0.039764404296875,
0.05157470703125,
0.033111572265625,
-0.0118865966796875,
0.0037670135498046875,
-0.0191802978515625,
-0.016815185546875,
0.0197601318359375,
-0.0477294921875,
-0.035430908203125,
0.0196380615234375,
0.054840087890625,
0.0274200439453125,
0.0005125999450683594,
0.02978515625,
0.00617218017578125,
0.0382080078125,
-0.0225982666015625,
0.00916290283203125,
-0.03228759765625,
0.019500732421875,
-0.01123809814453125,
0.004390716552734375,
-0.0277862548828125,
-0.0252685546875,
0.02264404296875,
-0.03887939453125,
0.03997802734375,
0.00029969215393066406,
0.1031494140625,
0.0115814208984375,
-0.0023956298828125,
0.0102996826171875,
-0.02642822265625,
0.055389404296875,
-0.05047607421875,
0.031341552734375,
0.01073455810546875,
0.01084136962890625,
-0.0007038116455078125,
-0.07611083984375,
-0.050933837890625,
-0.01477813720703125,
-0.0198974609375,
0.00679779052734375,
-0.0303497314453125,
0.0102996826171875,
0.02978515625,
0.036529541015625,
-0.04254150390625,
-0.0012798309326171875,
-0.040618896484375,
-0.021087646484375,
0.039093017578125,
0.0016460418701171875,
0.0197296142578125,
-0.0196685791015625,
-0.042510986328125,
-0.039764404296875,
-0.0213470458984375,
0.0178375244140625,
0.0226287841796875,
0.0069732666015625,
-0.04388427734375,
0.041961669921875,
0.00814056396484375,
0.044921875,
0.01323699951171875,
-0.020538330078125,
0.049072265625,
-0.00782012939453125,
-0.027618408203125,
-0.017059326171875,
0.0877685546875,
0.03045654296875,
0.0234222412109375,
-0.002574920654296875,
-0.01039886474609375,
-0.00728607177734375,
-0.0005311965942382812,
-0.083740234375,
-0.028564453125,
0.0146331787109375,
-0.045867919921875,
-0.0254669189453125,
0.026336669921875,
-0.05047607421875,
-0.0102691650390625,
-0.0014867782592773438,
0.052490234375,
-0.037078857421875,
-0.030914306640625,
0.00815582275390625,
-0.0143890380859375,
0.03033447265625,
0.0144195556640625,
-0.033721923828125,
0.01502227783203125,
0.01727294921875,
0.08709716796875,
-0.00045990943908691406,
-0.03546142578125,
-0.01186370849609375,
-0.031982421875,
-0.018035888671875,
0.03692626953125,
-0.0029468536376953125,
-0.017486572265625,
-0.0192413330078125,
0.02606201171875,
-0.017547607421875,
-0.04852294921875,
0.0248565673828125,
-0.0160980224609375,
0.0285491943359375,
0.00797271728515625,
-0.01451873779296875,
-0.032806396484375,
0.018035888671875,
-0.03662109375,
0.0966796875,
0.030242919921875,
-0.0723876953125,
0.0294189453125,
-0.037078857421875,
-0.0078582763671875,
-0.005970001220703125,
0.0011072158813476562,
-0.08587646484375,
-0.00003165006637573242,
0.0130615234375,
0.047882080078125,
-0.020477294921875,
0.00255584716796875,
-0.037994384765625,
-0.017364501953125,
0.024566650390625,
-0.01064300537109375,
0.07342529296875,
0.005435943603515625,
-0.02667236328125,
0.0234832763671875,
-0.042083740234375,
0.00835418701171875,
0.040863037109375,
-0.019500732421875,
-0.0019054412841796875,
-0.04913330078125,
0.0203094482421875,
0.01983642578125,
0.014892578125,
-0.053131103515625,
0.0243682861328125,
-0.01372528076171875,
0.032958984375,
0.0423583984375,
-0.018463134765625,
0.0263824462890625,
-0.0294647216796875,
0.033355712890625,
0.0182037353515625,
0.01824951171875,
0.0017337799072265625,
-0.045684814453125,
-0.067626953125,
-0.0401611328125,
0.0279388427734375,
0.031402587890625,
-0.04290771484375,
0.040985107421875,
-0.0240936279296875,
-0.059906005859375,
-0.0401611328125,
-0.0011205673217773438,
0.03765869140625,
0.0391845703125,
0.03216552734375,
-0.038818359375,
-0.042694091796875,
-0.0704345703125,
-0.006061553955078125,
-0.006511688232421875,
0.0015916824340820312,
0.0225982666015625,
0.0535888671875,
-0.01654052734375,
0.0634765625,
-0.028656005859375,
-0.02130126953125,
-0.0208892822265625,
0.00826263427734375,
0.02947998046875,
0.049652099609375,
0.057769775390625,
-0.05078125,
-0.040283203125,
-0.0087432861328125,
-0.068359375,
0.01462554931640625,
0.0028705596923828125,
-0.016815185546875,
0.0211334228515625,
0.01013946533203125,
-0.059112548828125,
0.055938720703125,
0.021575927734375,
-0.031829833984375,
0.04205322265625,
-0.017425537109375,
0.00554656982421875,
-0.08905029296875,
0.008575439453125,
0.0318603515625,
-0.0197601318359375,
-0.03271484375,
-0.0014495849609375,
0.0081939697265625,
-0.004108428955078125,
-0.0310211181640625,
0.04534912109375,
-0.03778076171875,
-0.00891876220703125,
-0.004039764404296875,
-0.0247344970703125,
0.0008654594421386719,
0.047882080078125,
0.0044097900390625,
0.037322998046875,
0.054595947265625,
-0.0284576416015625,
0.042816162109375,
0.03466796875,
-0.018035888671875,
0.039276123046875,
-0.054595947265625,
0.01361846923828125,
0.00272369384765625,
0.0300140380859375,
-0.078125,
-0.01226806640625,
0.023101806640625,
-0.048583984375,
0.042266845703125,
-0.03765869140625,
-0.0386962890625,
-0.0491943359375,
-0.0302886962890625,
0.03839111328125,
0.05169677734375,
-0.0574951171875,
0.049468994140625,
0.0118255615234375,
0.0225830078125,
-0.03570556640625,
-0.06414794921875,
-0.0206298828125,
-0.03326416015625,
-0.0556640625,
0.036865234375,
0.01222991943359375,
0.01143646240234375,
0.01261138916015625,
-0.01042938232421875,
0.00623321533203125,
-0.0201263427734375,
0.03466796875,
0.036468505859375,
-0.0193634033203125,
-0.00861358642578125,
-0.0298919677734375,
-0.01473236083984375,
0.0052337646484375,
-0.02642822265625,
0.044586181640625,
-0.0265350341796875,
-0.00514984130859375,
-0.06317138671875,
-0.01371002197265625,
0.046905517578125,
-0.018890380859375,
0.0657958984375,
0.08026123046875,
-0.034698486328125,
-0.0013666152954101562,
-0.035675048828125,
-0.023773193359375,
-0.035919189453125,
0.03741455078125,
-0.029449462890625,
-0.03289794921875,
0.0625,
0.00934600830078125,
0.0024738311767578125,
0.054656982421875,
0.0328369140625,
-0.0013904571533203125,
0.059356689453125,
0.044891357421875,
0.00862884521484375,
0.060089111328125,
-0.07098388671875,
-0.008758544921875,
-0.06536865234375,
-0.033172607421875,
-0.0210723876953125,
-0.040863037109375,
-0.055389404296875,
-0.027923583984375,
0.035919189453125,
-0.00002586841583251953,
-0.024200439453125,
0.04425048828125,
-0.06396484375,
0.0066986083984375,
0.05499267578125,
0.0418701171875,
-0.01274871826171875,
0.026275634765625,
-0.022369384765625,
-0.0005092620849609375,
-0.0577392578125,
-0.01268768310546875,
0.08587646484375,
0.037322998046875,
0.054443359375,
-0.0110321044921875,
0.04876708984375,
-0.018951416015625,
0.025665283203125,
-0.05474853515625,
0.046173095703125,
-0.0050201416015625,
-0.036865234375,
-0.00553131103515625,
-0.033935546875,
-0.07733154296875,
0.016265869140625,
-0.0254974365234375,
-0.0609130859375,
0.0131378173828125,
0.01334381103515625,
-0.019927978515625,
0.057586669921875,
-0.05938720703125,
0.07421875,
-0.00539398193359375,
-0.031768798828125,
0.0032100677490234375,
-0.049346923828125,
0.01457977294921875,
0.0203094482421875,
-0.023284912109375,
0.0046234130859375,
0.0178375244140625,
0.08697509765625,
-0.044342041015625,
0.05828857421875,
-0.027374267578125,
0.0240478515625,
0.0352783203125,
-0.00447845458984375,
0.0248260498046875,
-0.00799560546875,
0.005336761474609375,
0.029541015625,
0.0113983154296875,
-0.0294647216796875,
-0.038604736328125,
0.03826904296875,
-0.0780029296875,
-0.0251922607421875,
-0.042083740234375,
-0.04217529296875,
0.00970458984375,
0.004848480224609375,
0.051727294921875,
0.046142578125,
0.0191497802734375,
0.0302734375,
0.041290283203125,
-0.027252197265625,
0.0260467529296875,
-0.0013980865478515625,
-0.0198516845703125,
-0.04095458984375,
0.0655517578125,
0.0166015625,
0.01352691650390625,
0.008544921875,
0.019927978515625,
-0.032623291015625,
-0.034698486328125,
-0.025543212890625,
0.04058837890625,
-0.045867919921875,
-0.036376953125,
-0.044281005859375,
-0.034698486328125,
-0.0240936279296875,
0.0005230903625488281,
-0.037139892578125,
-0.0211334228515625,
-0.0232391357421875,
0.0051116943359375,
0.0621337890625,
0.0421142578125,
-0.005451202392578125,
0.0350341796875,
-0.040008544921875,
0.0121917724609375,
0.0122833251953125,
0.040618896484375,
-0.01024627685546875,
-0.07562255859375,
-0.01727294921875,
-0.006183624267578125,
-0.034393310546875,
-0.057891845703125,
0.03680419921875,
0.0110626220703125,
0.031707763671875,
0.021942138671875,
-0.0157470703125,
0.058502197265625,
-0.0035648345947265625,
0.037750244140625,
0.0295562744140625,
-0.04571533203125,
0.038330078125,
-0.009429931640625,
0.01091766357421875,
0.00798797607421875,
0.0233612060546875,
-0.0235748291015625,
-0.0006570816040039062,
-0.08148193359375,
-0.055877685546875,
0.0643310546875,
0.01555633544921875,
0.000194549560546875,
0.03497314453125,
0.046173095703125,
0.00034880638122558594,
0.002429962158203125,
-0.06707763671875,
-0.0287322998046875,
-0.0259552001953125,
-0.025634765625,
-0.0029850006103515625,
-0.0024566650390625,
-0.00255584716796875,
-0.05499267578125,
0.050384521484375,
-0.0038890838623046875,
0.059844970703125,
0.0274810791015625,
-0.0068359375,
-0.01262664794921875,
-0.0286712646484375,
0.02288818359375,
0.0170135498046875,
-0.0268402099609375,
0.00415802001953125,
0.01465606689453125,
-0.05029296875,
0.0005893707275390625,
0.020263671875,
-0.0002911090850830078,
0.0028839111328125,
0.035552978515625,
0.073974609375,
-0.0056304931640625,
-0.0011148452758789062,
0.033843994140625,
-0.004741668701171875,
-0.033203125,
-0.020477294921875,
0.0021762847900390625,
-0.0079193115234375,
0.027374267578125,
0.021026611328125,
0.02313232421875,
-0.01546478271484375,
-0.019073486328125,
0.015655517578125,
0.0462646484375,
-0.0254669189453125,
-0.032379150390625,
0.046661376953125,
-0.0145721435546875,
-0.0126495361328125,
0.06689453125,
-0.0025615692138671875,
-0.044464111328125,
0.0687255859375,
0.031494140625,
0.07220458984375,
-0.004451751708984375,
0.0035190582275390625,
0.058929443359375,
0.01471710205078125,
-0.0027904510498046875,
0.013336181640625,
0.0102996826171875,
-0.06549072265625,
0.00604248046875,
-0.046722412109375,
-0.00223541259765625,
0.0278167724609375,
-0.04827880859375,
0.0249481201171875,
-0.04852294921875,
-0.03448486328125,
0.01204681396484375,
0.02020263671875,
-0.0693359375,
0.02056884765625,
0.00390625,
0.05975341796875,
-0.061065673828125,
0.05426025390625,
0.06854248046875,
-0.0465087890625,
-0.075439453125,
-0.01108551025390625,
-0.0019006729125976562,
-0.06414794921875,
0.038299560546875,
0.03338623046875,
0.004489898681640625,
0.0153045654296875,
-0.057281494140625,
-0.044677734375,
0.1021728515625,
0.0306549072265625,
-0.008758544921875,
0.0121612548828125,
-0.0049896240234375,
0.025146484375,
-0.0214080810546875,
0.03936767578125,
0.0188751220703125,
0.0243988037109375,
0.019561767578125,
-0.06396484375,
0.01369476318359375,
-0.0253753662109375,
0.0087127685546875,
0.0196685791015625,
-0.0640869140625,
0.07135009765625,
-0.031768798828125,
-0.0131988525390625,
0.014556884765625,
0.0511474609375,
0.0203399658203125,
0.00571441650390625,
0.0418701171875,
0.06793212890625,
0.03216552734375,
-0.02947998046875,
0.0679931640625,
-0.0032958984375,
0.05828857421875,
0.03680419921875,
0.0292205810546875,
0.042999267578125,
0.034698486328125,
-0.036163330078125,
0.036895751953125,
0.07257080078125,
-0.03033447265625,
0.02154541015625,
0.0094146728515625,
-0.0018777847290039062,
-0.0159912109375,
0.001861572265625,
-0.03369140625,
0.037811279296875,
0.0144805908203125,
-0.040618896484375,
-0.0119476318359375,
0.005260467529296875,
-0.002620697021484375,
-0.0260467529296875,
-0.0112762451171875,
0.035614013671875,
0.0010538101196289062,
-0.031280517578125,
0.059722900390625,
0.002574920654296875,
0.06915283203125,
-0.032562255859375,
0.00391387939453125,
-0.017303466796875,
0.0307464599609375,
-0.0290374755859375,
-0.064453125,
0.016143798828125,
-0.0167694091796875,
-0.002841949462890625,
-0.0002543926239013672,
0.048797607421875,
-0.024383544921875,
-0.040740966796875,
0.018402099609375,
0.021636962890625,
0.025970458984375,
-0.005191802978515625,
-0.08612060546875,
0.0036602020263671875,
-0.0012664794921875,
-0.044464111328125,
0.0202484130859375,
0.03009033203125,
0.0046234130859375,
0.053985595703125,
0.04730224609375,
-0.0085906982421875,
0.01214599609375,
-0.0149383544921875,
0.06365966796875,
-0.0364990234375,
-0.02923583984375,
-0.05950927734375,
0.045684814453125,
0.0005464553833007812,
-0.047515869140625,
0.045867919921875,
0.043609619140625,
0.06964111328125,
-0.006137847900390625,
0.035369873046875,
-0.0214691162109375,
0.0033206939697265625,
-0.0243682861328125,
0.0489501953125,
-0.056884765625,
-0.0138702392578125,
-0.0224609375,
-0.059967041015625,
-0.029052734375,
0.06805419921875,
-0.02734375,
0.03662109375,
0.0426025390625,
0.07470703125,
-0.028350830078125,
-0.0213623046875,
0.01212310791015625,
0.01403045654296875,
0.006771087646484375,
0.031585693359375,
0.040863037109375,
-0.058380126953125,
0.034454345703125,
-0.046142578125,
-0.015472412109375,
-0.0133819580078125,
-0.0445556640625,
-0.075927734375,
-0.06427001953125,
-0.046844482421875,
-0.059906005859375,
-0.020538330078125,
0.0701904296875,
0.0765380859375,
-0.050750732421875,
-0.01190185546875,
-0.0076446533203125,
-0.00331878662109375,
-0.0174713134765625,
-0.0188751220703125,
0.043487548828125,
0.0023975372314453125,
-0.05584716796875,
-0.0309295654296875,
-0.00257110595703125,
0.031982421875,
-0.01070404052734375,
-0.01500701904296875,
-0.01184844970703125,
-0.0259552001953125,
0.01433563232421875,
0.0274505615234375,
-0.054962158203125,
-0.0127716064453125,
-0.0102386474609375,
-0.006465911865234375,
0.033966064453125,
0.0300445556640625,
-0.0516357421875,
0.0287628173828125,
0.036468505859375,
0.022674560546875,
0.06463623046875,
-0.0143585205078125,
0.00507354736328125,
-0.05377197265625,
0.04205322265625,
-0.00193023681640625,
0.041351318359375,
0.032958984375,
-0.021331787109375,
0.0390625,
0.04669189453125,
-0.02899169921875,
-0.0618896484375,
-0.003192901611328125,
-0.08392333984375,
0.005596160888671875,
0.07464599609375,
-0.024169921875,
-0.03692626953125,
0.034515380859375,
-0.01129150390625,
0.047149658203125,
-0.01097869873046875,
0.03125,
0.02386474609375,
-0.005931854248046875,
-0.04656982421875,
-0.0284576416015625,
0.03045654296875,
0.017913818359375,
-0.0430908203125,
-0.027984619140625,
0.0015163421630859375,
0.050323486328125,
0.0268402099609375,
0.026763916015625,
-0.0149383544921875,
0.01282501220703125,
0.004199981689453125,
0.041778564453125,
-0.0223541259765625,
-0.0121002197265625,
-0.0247344970703125,
-0.003360748291015625,
-0.00957489013671875,
-0.0465087890625
]
] |
CiaraRowles/TemporalNet | 2023-04-05T22:59:34.000Z | [
"diffusers",
"controlnet",
"stable-diffusion",
"license:openrail",
"diffusers:ControlNetModel",
"region:us"
] | null | CiaraRowles | null | null | CiaraRowles/TemporalNet | 308 | 82,258 | diffusers | 2023-03-23T22:31:31 | ---
license: openrail
tags:
- controlnet
- stable-diffusion
- diffusers
base_model: runwayml/stable-diffusion-v1-5
---
Introducing the Beta Version of TemporalNet
TemporalNet is a ControlNet model designed to enhance the temporal consistency of generated outputs, as demonstrated in this example: https://twitter.com/CiaraRowles1/status/1637486561917906944. While it does not eliminate all flickering, it significantly reduces it, particularly at higher denoise levels. For optimal results, it is recommended to use TemporalNet in combination with other methods.
Instructions for Use:
1) Add the model "diff_control_sd15_temporalnet_fp16.safetensors" to your models folder in the ControlNet extension in Automatic1111's Web UI.
2) Create a folder that contains:
- A subfolder named "Input_Images" with the input frames
- A PNG file called "init.png" that is pre-stylized in your desired style
- The "temporalvideo.py" script
3) Customize the "temporalvideo.py" script according to your preferences, such as the image resolution, prompt, and control net settings.
4) Launch Automatic1111's Web UI with the --api setting enabled.
5) Execute the Python script.
*Please note that the "init.png" image will not significantly influence the style of the output video. Its primary purpose is to prevent a drastic change in aesthetics during the first few frames.*
Also, I highly recommend using this in conjunction with the HED model; the settings are already in the script.
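The per-frame loop described in steps 1–5 can be sketched as follows. This is a hypothetical illustration of how a script like "temporalvideo.py" might assemble an img2img request against Automatic1111's `--api` endpoint — the endpoint path, port, and payload field names are assumptions based on the public web-UI API conventions, not taken from this model card. The key idea is that TemporalNet conditions each frame on the *previous stylized output*, which is what suppresses flicker.

```python
# Hedged sketch: build an img2img payload for Automatic1111's API where
# TemporalNet conditions on the previous output frame. Field names and the
# endpoint URL are assumptions, not confirmed by this model card.
import base64
from pathlib import Path

API_URL = "http://127.0.0.1:7860/sdapi/v1/img2img"  # assumed default port


def encode_image(path: str) -> str:
    """Read an image file and return its base64 string for the API payload."""
    return base64.b64encode(Path(path).read_bytes()).decode("utf-8")


def build_payload(current_frame: str, previous_output: str, prompt: str) -> dict:
    """Assemble one img2img request: the current input frame is stylized while
    TemporalNet receives the previous *output* frame as its control image."""
    return {
        "init_images": [encode_image(current_frame)],
        "prompt": prompt,
        "denoising_strength": 0.6,  # higher denoise levels benefit most from TemporalNet
        "alwayson_scripts": {
            "controlnet": {
                "args": [
                    {
                        # Model name assumed to match the file installed in step 1.
                        "model": "diff_control_sd15_temporalnet_fp16",
                        "input_image": encode_image(previous_output),
                        "weight": 1.0,
                    }
                ]
            }
        },
    }
```

In the actual loop, the first frame would use "init.png" as `previous_output`, and each subsequent frame would use the API's last returned image, posted with e.g. `requests.post(API_URL, json=payload)`.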
ToDo:
- Write an extension for the web UI.
- Write a feature that automatically generates an "init.png" image if none is provided.
- ~~Change the extension to .safetensors and investigate compression.~~
| 1,743 | [
[
-0.03704833984375,
-0.0350341796875,
0.00490570068359375,
0.03179931640625,
-0.017425537109375,
0.0041351318359375,
0.011383056640625,
-0.02264404296875,
0.0219268798828125,
0.00673675537109375,
-0.057037353515625,
-0.022796630859375,
-0.0572509765625,
-0.0038604736328125,
-0.03240966796875,
0.05511474609375,
-0.022735595703125,
-0.0240325927734375,
0.040740966796875,
0.01479339599609375,
-0.01277923583984375,
0.0081024169921875,
-0.07696533203125,
-0.03765869140625,
0.033477783203125,
0.01319122314453125,
0.06329345703125,
0.07037353515625,
0.04461669921875,
0.0292816162109375,
-0.00681304931640625,
-0.0159454345703125,
-0.019378662109375,
0.0033721923828125,
0.022308349609375,
0.0033893585205078125,
-0.04217529296875,
-0.0007600784301757812,
0.0548095703125,
0.0159759521484375,
-0.01117706298828125,
0.018829345703125,
-0.0138092041015625,
0.028411865234375,
-0.049774169921875,
0.0242767333984375,
-0.0135040283203125,
0.02764892578125,
-0.0106353759765625,
-0.028717041015625,
-0.013824462890625,
-0.03314208984375,
0.01068878173828125,
-0.06756591796875,
0.01042938232421875,
-0.0021457672119140625,
0.10931396484375,
0.03424072265625,
-0.042755126953125,
0.034271240234375,
-0.06683349609375,
0.02801513671875,
-0.07635498046875,
0.0194091796875,
0.004001617431640625,
0.0657958984375,
-0.016387939453125,
-0.07916259765625,
-0.036468505859375,
-0.033172607421875,
0.0126953125,
0.024261474609375,
-0.0288848876953125,
0.0109100341796875,
0.032623291015625,
0.0122528076171875,
-0.04583740234375,
0.0198822021484375,
-0.026641845703125,
-0.027252197265625,
0.026611328125,
0.03497314453125,
0.021728515625,
-0.01824951171875,
0.002376556396484375,
-0.02105712890625,
-0.0295867919921875,
0.0146636962890625,
0.042755126953125,
-0.00597381591796875,
-0.0303497314453125,
0.038055419921875,
-0.025848388671875,
0.0198822021484375,
0.0394287109375,
-0.0004870891571044922,
0.01479339599609375,
0.004047393798828125,
-0.024200439453125,
-0.0012559890747070312,
0.046295166015625,
0.0672607421875,
0.023193359375,
0.012451171875,
-0.0202789306640625,
0.0232086181640625,
0.038818359375,
-0.08111572265625,
-0.0299835205078125,
0.027557373046875,
-0.04815673828125,
-0.04388427734375,
-0.00414276123046875,
-0.031707763671875,
0.0028858184814453125,
-0.03131103515625,
0.0452880859375,
-0.038177490234375,
-0.038818359375,
-0.0183563232421875,
-0.027618408203125,
0.01263427734375,
0.046051025390625,
-0.049102783203125,
0.02252197265625,
0.02197265625,
0.04656982421875,
-0.0003261566162109375,
-0.0205841064453125,
-0.0236968994140625,
0.01117706298828125,
-0.043121337890625,
0.03424072265625,
0.00026702880859375,
-0.0164337158203125,
0.0134735107421875,
0.03900146484375,
0.03955078125,
-0.039520263671875,
-0.028778076171875,
-0.05218505859375,
-0.0007982254028320312,
0.0260467529296875,
-0.0240325927734375,
0.0006160736083984375,
0.008636474609375,
-0.045196533203125,
0.0653076171875,
0.00940704345703125,
-0.05389404296875,
0.0360107421875,
-0.053741455078125,
-0.01407623291015625,
-0.0004820823669433594,
0.00995635986328125,
-0.03460693359375,
0.0033092498779296875,
-0.00756072998046875,
-0.0007381439208984375,
0.0362548828125,
0.0218658447265625,
-0.0214691162109375,
-0.0281219482421875,
0.01506805419921875,
-0.038482666015625,
0.04608154296875,
0.028472900390625,
-0.036773681640625,
0.02386474609375,
-0.059356689453125,
0.000667572021484375,
-0.01428985595703125,
-0.012969970703125,
0.01128387451171875,
-0.0198211669921875,
0.022430419921875,
0.0080413818359375,
0.0287017822265625,
-0.04180908203125,
0.009857177734375,
-0.0218658447265625,
0.033538818359375,
0.07373046875,
0.00821685791015625,
0.05328369140625,
-0.039581298828125,
0.025054931640625,
0.0217437744140625,
0.07806396484375,
-0.00445556640625,
-0.0311126708984375,
-0.058349609375,
0.00951385498046875,
0.00688934326171875,
0.0258636474609375,
-0.06842041015625,
0.00405120849609375,
-0.0017871856689453125,
-0.0249481201171875,
-0.01004791259765625,
-0.014007568359375,
0.0192413330078125,
0.03253173828125,
0.004974365234375,
-0.03302001953125,
-0.050537109375,
-0.07763671875,
0.040008544921875,
-0.01702880859375,
-0.00548553466796875,
0.00844573974609375,
0.036407470703125,
-0.02154541015625,
0.0802001953125,
-0.018646240234375,
0.0167388916015625,
0.01502227783203125,
0.0251617431640625,
0.031951904296875,
0.057952880859375,
0.034698486328125,
-0.04571533203125,
-0.031463623046875,
-0.0189361572265625,
-0.06060791015625,
-0.0130462646484375,
0.00637054443359375,
-0.0081024169921875,
-0.001987457275390625,
0.038482666015625,
-0.00762176513671875,
0.0572509765625,
0.0421142578125,
-0.0271759033203125,
0.05987548828125,
-0.023712158203125,
0.031646728515625,
-0.094482421875,
0.0004968643188476562,
0.01068115234375,
-0.0238800048828125,
-0.0272064208984375,
-0.007305145263671875,
0.00467681884765625,
-0.0310516357421875,
-0.060272216796875,
0.0626220703125,
-0.01184844970703125,
0.00494384765625,
-0.025299072265625,
-0.0330810546875,
0.004638671875,
0.015655517578125,
-0.0113525390625,
0.06402587890625,
0.0418701171875,
-0.0523681640625,
0.033447265625,
0.012176513671875,
0.0018224716186523438,
0.01319122314453125,
-0.059051513671875,
-0.0142364501953125,
-0.0212249755859375,
0.0036983489990234375,
-0.058441162109375,
-0.037811279296875,
0.01428985595703125,
-0.035064697265625,
0.04949951171875,
-0.028411865234375,
-0.005649566650390625,
-0.033660888671875,
-0.0186920166015625,
0.00922393798828125,
0.0684814453125,
-0.028533935546875,
0.04681396484375,
0.00722503662109375,
0.00836944580078125,
-0.034271240234375,
-0.062225341796875,
0.021087646484375,
-0.0067138671875,
-0.05487060546875,
0.049346923828125,
-0.0171966552734375,
-0.0304412841796875,
0.0282135009765625,
-0.017822265625,
-0.0341796875,
-0.0009164810180664062,
0.049163818359375,
0.03753662109375,
-0.0260009765625,
-0.026947021484375,
0.0089874267578125,
-0.02447509765625,
-0.0106353759765625,
-0.030853271484375,
0.019134521484375,
-0.00666046142578125,
-0.0037860870361328125,
-0.07916259765625,
0.032379150390625,
0.04718017578125,
0.023895263671875,
0.02618408203125,
0.06439208984375,
-0.0261688232421875,
0.0011749267578125,
-0.04425048828125,
-0.04852294921875,
-0.04376220703125,
0.00885772705078125,
0.00885772705078125,
-0.050994873046875,
0.05029296875,
-0.01404571533203125,
0.0057373046875,
0.0164337158203125,
0.0222625732421875,
-0.04119873046875,
0.0699462890625,
0.0560302734375,
0.026092529296875,
0.09375,
-0.045806884765625,
-0.029083251953125,
-0.06500244140625,
-0.007274627685546875,
-0.0136566162109375,
-0.033233642578125,
-0.044647216796875,
0.0117950439453125,
0.04443359375,
-0.0006380081176757812,
-0.07403564453125,
0.0263519287109375,
-0.03387451171875,
0.0166473388671875,
0.0531005859375,
0.0283203125,
-0.00853729248046875,
0.01120758056640625,
0.01113128662109375,
-0.0186309814453125,
-0.032073974609375,
-0.03314208984375,
0.06158447265625,
0.0162506103515625,
0.0665283203125,
0.01358795166015625,
0.05029296875,
0.042510986328125,
0.01500701904296875,
-0.0294189453125,
0.0330810546875,
-0.02239990234375,
-0.0452880859375,
-0.0120391845703125,
-0.0197601318359375,
-0.06378173828125,
-0.054718017578125,
-0.0189971923828125,
-0.056549072265625,
0.01296234130859375,
0.033538818359375,
-0.035247802734375,
0.01345062255859375,
-0.052978515625,
0.058441162109375,
-0.007205963134765625,
-0.0350341796875,
0.0024166107177734375,
-0.05010986328125,
0.00756072998046875,
0.03125,
-0.02471923828125,
-0.006168365478515625,
-0.0037937164306640625,
0.05572509765625,
-0.062744140625,
0.0684814453125,
-0.0112762451171875,
0.03704833984375,
0.0223236083984375,
0.0281219482421875,
0.024169921875,
0.017364501953125,
0.0102691650390625,
0.01367950439453125,
0.002410888671875,
-0.0224609375,
-0.0498046875,
0.033843994140625,
-0.0899658203125,
-0.03424072265625,
-0.0196075439453125,
-0.0273895263671875,
0.0279541015625,
0.01355743408203125,
0.02020263671875,
0.0726318359375,
-0.005863189697265625,
0.001842498779296875,
0.031494140625,
-0.0240020751953125,
0.072998046875,
0.053802490234375,
-0.032318115234375,
-0.050201416015625,
0.0295257568359375,
0.01271820068359375,
0.02459716796875,
0.021484375,
0.0296478271484375,
-0.0108795166015625,
-0.033538818359375,
-0.039337158203125,
-0.0167388916015625,
-0.053955078125,
-0.00852203369140625,
-0.0204620361328125,
-0.042633056640625,
-0.050323486328125,
-0.01812744140625,
-0.033721923828125,
-0.028778076171875,
-0.0212860107421875,
0.0216827392578125,
0.056640625,
0.057830810546875,
-0.0250091552734375,
0.06524658203125,
-0.044525146484375,
0.0204010009765625,
0.03778076171875,
0.0108795166015625,
-0.0121307373046875,
-0.055633544921875,
-0.0017004013061523438,
-0.024078369140625,
-0.033660888671875,
-0.049713134765625,
0.0364990234375,
0.006008148193359375,
0.035736083984375,
0.038818359375,
-0.009765625,
0.04461669921875,
-0.03759765625,
0.06988525390625,
0.02838134765625,
-0.0572509765625,
0.045013427734375,
-0.048828125,
0.0465087890625,
0.021514892578125,
0.0163116455078125,
-0.033538818359375,
-0.021881103515625,
-0.06451416015625,
-0.058349609375,
0.041015625,
0.032562255859375,
-0.0011796951293945312,
0.0054931640625,
0.032379150390625,
-0.0167694091796875,
0.01506805419921875,
-0.0246734619140625,
-0.02667236328125,
-0.052764892578125,
0.0013399124145507812,
0.020599365234375,
-0.007434844970703125,
-0.007198333740234375,
-0.0167694091796875,
0.046051025390625,
-0.00019466876983642578,
0.025634765625,
0.0171051025390625,
0.015045166015625,
-0.0232391357421875,
-0.00033354759216308594,
0.02496337890625,
0.046478271484375,
-0.046783447265625,
0.0002472400665283203,
-0.0191192626953125,
-0.055023193359375,
0.022308349609375,
-0.00768280029296875,
-0.0399169921875,
0.0204925537109375,
0.01337432861328125,
0.062347412109375,
0.0101470947265625,
0.0133514404296875,
0.022705078125,
-0.0169830322265625,
0.0004780292510986328,
-0.043365478515625,
0.002964019775390625,
-0.0118865966796875,
0.03741455078125,
0.01806640625,
0.027069091796875,
0.00763702392578125,
-0.03509521484375,
0.03289794921875,
-0.0125274658203125,
-0.0177764892578125,
-0.0205841064453125,
0.062042236328125,
0.010711669921875,
-0.043304443359375,
0.0714111328125,
-0.032012939453125,
-0.021484375,
0.080322265625,
0.04388427734375,
0.07525634765625,
-0.005725860595703125,
0.0297393798828125,
0.0390625,
0.023406982421875,
-0.0087127685546875,
0.0310516357421875,
0.0201416015625,
-0.04571533203125,
-0.04193115234375,
-0.0390625,
-0.036651611328125,
0.003185272216796875,
-0.041900634765625,
0.033203125,
-0.05523681640625,
-0.049346923828125,
-0.00460052490234375,
-0.0058746337890625,
-0.050140380859375,
0.00806427001953125,
0.0169677734375,
0.09765625,
-0.0474853515625,
0.06451416015625,
0.039642333984375,
-0.052490234375,
-0.0909423828125,
-0.01068115234375,
0.0173797607421875,
-0.040313720703125,
0.07220458984375,
-0.0075225830078125,
0.0209808349609375,
0.0217132568359375,
-0.037628173828125,
-0.05401611328125,
0.08306884765625,
0.0179443359375,
-0.04412841796875,
-0.00313568115234375,
-0.013336181640625,
0.0233001708984375,
-0.0244140625,
0.04766845703125,
-0.002346038818359375,
0.055816650390625,
0.018402099609375,
-0.07830810546875,
-0.0144195556640625,
-0.03607177734375,
0.0242156982421875,
0.004093170166015625,
-0.02227783203125,
0.08551025390625,
-0.004146575927734375,
-0.01161956787109375,
0.01439666748046875,
0.0450439453125,
-0.0056915283203125,
0.031890869140625,
0.033355712890625,
0.035980224609375,
0.01433563232421875,
-0.021087646484375,
0.06768798828125,
-0.037689208984375,
0.016357421875,
0.080322265625,
-0.0056610107421875,
0.0297088623046875,
0.0423583984375,
0.0034084320068359375,
0.03680419921875,
0.0635986328125,
-0.0273590087890625,
0.040924072265625,
0.001434326171875,
-0.004337310791015625,
-0.006153106689453125,
-0.0171661376953125,
-0.0249176025390625,
0.0184478759765625,
-0.00717926025390625,
-0.01291656494140625,
-0.0362548828125,
0.0189056396484375,
-0.004604339599609375,
-0.0177459716796875,
-0.026611328125,
0.052764892578125,
-0.013092041015625,
-0.0280914306640625,
0.0288238525390625,
-0.0218505859375,
0.054779052734375,
-0.0748291015625,
-0.0227203369140625,
-0.02935791015625,
0.015106201171875,
-0.02154541015625,
-0.062744140625,
0.02947998046875,
-0.005863189697265625,
-0.027557373046875,
-0.01105499267578125,
0.0307464599609375,
0.00133514404296875,
-0.0228424072265625,
0.0162506103515625,
0.0174407958984375,
0.01605224609375,
0.017364501953125,
-0.057037353515625,
0.0195159912109375,
-0.0257415771484375,
-0.015777587890625,
0.01357269287109375,
0.024658203125,
-0.01155853271484375,
0.0419921875,
0.034942626953125,
0.0168304443359375,
0.0223541259765625,
0.032073974609375,
0.0692138671875,
-0.032501220703125,
-0.04034423828125,
-0.037750244140625,
0.042510986328125,
-0.0183258056640625,
-0.0251617431640625,
0.056182861328125,
0.034210205078125,
0.045806884765625,
-0.021728515625,
0.038909912109375,
-0.0271759033203125,
-0.0313720703125,
-0.025634765625,
0.057525634765625,
-0.040802001953125,
-0.01438140869140625,
-0.018157958984375,
-0.051055908203125,
-0.0016469955444335938,
0.027130126953125,
-0.002704620361328125,
-0.00621795654296875,
0.052734375,
0.0943603515625,
-0.01654052734375,
-0.0182342529296875,
0.0264434814453125,
0.0271759033203125,
0.02655029296875,
0.0252685546875,
0.078857421875,
-0.057769775390625,
0.03607177734375,
-0.050872802734375,
-0.031494140625,
-0.00792694091796875,
-0.09832763671875,
-0.040985107421875,
-0.02740478515625,
-0.047760009765625,
-0.066162109375,
0.0362548828125,
0.064697265625,
0.09478759765625,
-0.058013916015625,
-0.01690673828125,
-0.0074005126953125,
-0.025360107421875,
-0.016326904296875,
-0.0189208984375,
-0.0036144256591796875,
-0.00003361701965332031,
-0.0548095703125,
0.01210784912109375,
0.0247039794921875,
0.0401611328125,
-0.023834228515625,
-0.00490570068359375,
0.0009031295776367188,
-0.01312255859375,
0.003204345703125,
0.0291290283203125,
-0.03326416015625,
-0.03741455078125,
0.0032558441162109375,
-0.00016045570373535156,
0.01105499267578125,
0.0523681640625,
-0.046173095703125,
0.026641845703125,
0.04443359375,
0.00302886962890625,
0.053131103515625,
-0.0255889892578125,
0.040191650390625,
-0.04901123046875,
0.039764404296875,
0.0025997161865234375,
0.034393310546875,
0.0142364501953125,
-0.020111083984375,
0.0269317626953125,
0.0146026611328125,
-0.0472412109375,
-0.039764404296875,
0.0109100341796875,
-0.08856201171875,
-0.017120361328125,
0.0733642578125,
-0.0001819133758544922,
-0.036834716796875,
-0.004062652587890625,
-0.00579833984375,
0.050872802734375,
-0.05718994140625,
0.0677490234375,
0.01433563232421875,
0.013824462890625,
-0.00836181640625,
-0.04345703125,
0.0273895263671875,
-0.0179901123046875,
-0.038238525390625,
-0.0248870849609375,
0.0254364013671875,
0.03857421875,
0.02801513671875,
0.0237884521484375,
-0.021331787109375,
0.038238525390625,
0.028076171875,
0.025299072265625,
-0.0404052734375,
-0.03955078125,
-0.0230560302734375,
0.0029277801513671875,
-0.01641845703125,
-0.0191192626953125
]
] |
HuggingFaceH4/zephyr-7b-alpha | 2023-10-26T08:18:58.000Z | [
"transformers",
"pytorch",
"safetensors",
"mistral",
"text-generation",
"generated_from_trainer",
"en",
"dataset:stingning/ultrachat",
"dataset:openbmb/UltraFeedback",
"arxiv:2305.18290",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | HuggingFaceH4 | null | null | HuggingFaceH4/zephyr-7b-alpha | 913 | 82,136 | transformers | 2023-10-09T08:45:10 | ---
tags:
- generated_from_trainer
model-index:
- name: zephyr-7b-alpha
results: []
license: mit
datasets:
- stingning/ultrachat
- openbmb/UltraFeedback
language:
- en
base_model: mistralai/Mistral-7B-v0.1
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
<img src="https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha/resolve/main/thumbnail.png" alt="Zephyr Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# Model Card for Zephyr 7B Alpha
Zephyr is a series of language models that are trained to act as helpful assistants. Zephyr-7B-α is the first model in the series, and is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) that was trained on a mix of publicly available, synthetic datasets using [Direct Preference Optimization (DPO)](https://arxiv.org/abs/2305.18290). We found that removing the in-built alignment of these datasets boosted performance on [MT Bench](https://huggingface.co/spaces/lmsys/mt-bench) and made the model more helpful. However, this means the model is likely to generate problematic text when prompted to do so, and it should only be used for educational and research purposes.
## Model description
- **Model type:** A 7B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- **Language(s) (NLP):** Primarily English
- **License:** MIT
- **Finetuned from model:** [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/huggingface/alignment-handbook
- **Demo:** https://huggingface.co/spaces/HuggingFaceH4/zephyr-chat
## Intended uses & limitations
The model was initially fine-tuned on a variant of the [`UltraChat`](https://huggingface.co/datasets/stingning/ultrachat) dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT. We then further aligned the model with [🤗 TRL's](https://github.com/huggingface/trl) `DPOTrainer` on the [openbmb/UltraFeedback](https://huggingface.co/datasets/openbmb/UltraFeedback) dataset, which contains 64k prompts and model completions that are ranked by GPT-4. As a result, the model can be used for chat, and you can check out our [demo](https://huggingface.co/spaces/HuggingFaceH4/zephyr-chat) to test its capabilities.
Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:
```python
# Install transformers from source - only needed for versions <= v4.34
# pip install git+https://github.com/huggingface/transformers.git
# pip install accelerate
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="HuggingFaceH4/zephyr-7b-alpha", torch_dtype=torch.bfloat16, device_map="auto")
# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
{
"role": "system",
"content": "You are a friendly chatbot who always responds in the style of a pirate",
},
{"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
# <|system|>
# You are a friendly chatbot who always responds in the style of a pirate.</s>
# <|user|>
# How many helicopters can a human eat in one sitting?</s>
# <|assistant|>
# Ah, me hearty matey! But yer question be a puzzler! A human cannot eat a helicopter in one sitting, as helicopters are not edible. They be made of metal, plastic, and other materials, not food!
```
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Zephyr-7B-α has not been aligned to human preferences with techniques like RLHF or deployed with in-the-loop filtering of responses like ChatGPT, so the model can produce problematic outputs (especially when prompted to do so).
The size and composition of the corpus used to train the base model (`mistralai/Mistral-7B-v0.1`) are also unknown; however, it likely included a mix of web data and technical sources such as books and code. See the [Falcon 180B model card](https://huggingface.co/tiiuae/falcon-180B#training-data) for an example of this.
## Training and evaluation data
Zephyr 7B Alpha achieves the following results on the evaluation set:
- Loss: 0.4605
- Rewards/chosen: -0.5053
- Rewards/rejected: -1.8752
- Rewards/accuracies: 0.7812
- Rewards/margins: 1.3699
- Logps/rejected: -327.4286
- Logps/chosen: -297.1040
- Logits/rejected: -2.7153
- Logits/chosen: -2.7447
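The `Rewards/*` and `Logps/*` metrics above are DPO quantities: the implicit reward for a completion is the scaled gap between policy and reference log-probabilities, and the loss is the negative log-sigmoid of the chosen-minus-rejected reward margin. A minimal, hedged sketch of this objective for a single preference pair (the log-prob values in the example below are illustrative, not from this run, and `beta` is not stated in this card):

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair.

    Implicit rewards are beta * (policy logp - reference logp); the loss is
    the negative log-sigmoid of the chosen-minus-rejected reward margin.
    """
    chosen_reward = beta * (policy_chosen_logp - ref_chosen_logp)
    rejected_reward = beta * (policy_rejected_logp - ref_rejected_logp)
    margin = chosen_reward - rejected_reward
    loss = math.log1p(math.exp(-margin))  # == -log(sigmoid(margin))
    return loss, chosen_reward, rejected_reward

# Illustrative values: the chosen completion drifted less from the reference.
loss, chosen_r, rejected_r = dpo_loss(-297.0, -327.0, -300.0, -320.0)
```

Under this reading, `Rewards/margins` in the table is the mean of `chosen_reward - rejected_reward` over the evaluation set, and `Rewards/accuracies` is how often that margin is positive.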
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-07
- train_batch_size: 2
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 16
- total_train_batch_size: 32
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
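The totals above follow from the per-device settings: with no gradient accumulation listed, the effective batch size is the per-device batch multiplied by the number of devices. A quick check of that arithmetic:

```python
def effective_batch_size(per_device_batch, num_devices, grad_accum_steps=1):
    """Effective (total) batch size across all devices."""
    return per_device_batch * num_devices * grad_accum_steps

# 2 per device x 16 GPUs -> total_train_batch_size: 32
train_total = effective_batch_size(2, 16)
# 4 per device x 16 GPUs -> total_eval_batch_size: 64
eval_total = effective_batch_size(4, 16)
print(train_total, eval_total)
```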
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.5602 | 0.05 | 100 | 0.5589 | -0.3359 | -0.8168 | 0.7188 | 0.4809 | -306.2607 | -293.7161 | -2.6554 | -2.6797 |
| 0.4852 | 0.1 | 200 | 0.5136 | -0.5310 | -1.4994 | 0.8125 | 0.9684 | -319.9124 | -297.6181 | -2.5762 | -2.5957 |
| 0.5212 | 0.15 | 300 | 0.5168 | -0.1686 | -1.1760 | 0.7812 | 1.0074 | -313.4444 | -290.3699 | -2.6865 | -2.7125 |
| 0.5496 | 0.21 | 400 | 0.4835 | -0.1617 | -1.7170 | 0.8281 | 1.5552 | -324.2635 | -290.2326 | -2.7947 | -2.8218 |
| 0.5209 | 0.26 | 500 | 0.5054 | -0.4778 | -1.6604 | 0.7344 | 1.1826 | -323.1325 | -296.5546 | -2.8388 | -2.8667 |
| 0.4617 | 0.31 | 600 | 0.4910 | -0.3738 | -1.5180 | 0.7656 | 1.1442 | -320.2848 | -294.4741 | -2.8234 | -2.8521 |
| 0.4452 | 0.36 | 700 | 0.4838 | -0.4591 | -1.6576 | 0.7031 | 1.1986 | -323.0770 | -296.1796 | -2.7401 | -2.7653 |
| 0.4674 | 0.41 | 800 | 0.5077 | -0.5692 | -1.8659 | 0.7656 | 1.2967 | -327.2416 | -298.3818 | -2.6740 | -2.6945 |
| 0.4656 | 0.46 | 900 | 0.4927 | -0.5279 | -1.6614 | 0.7656 | 1.1335 | -323.1518 | -297.5553 | -2.7817 | -2.8015 |
| 0.4102 | 0.52 | 1000 | 0.4772 | -0.5767 | -2.0667 | 0.7656 | 1.4900 | -331.2578 | -298.5311 | -2.7160 | -2.7455 |
| 0.4663 | 0.57 | 1100 | 0.4740 | -0.8038 | -2.1018 | 0.7656 | 1.2980 | -331.9604 | -303.0741 | -2.6994 | -2.7257 |
| 0.4737 | 0.62 | 1200 | 0.4716 | -0.3783 | -1.7015 | 0.7969 | 1.3232 | -323.9545 | -294.5634 | -2.6842 | -2.7135 |
| 0.4259 | 0.67 | 1300 | 0.4866 | -0.6239 | -1.9703 | 0.7812 | 1.3464 | -329.3312 | -299.4761 | -2.7046 | -2.7356 |
| 0.4935 | 0.72 | 1400 | 0.4747 | -0.5626 | -1.7600 | 0.7812 | 1.1974 | -325.1243 | -298.2491 | -2.7153 | -2.7444 |
| 0.4211 | 0.77 | 1500 | 0.4645 | -0.6099 | -1.9993 | 0.7656 | 1.3894 | -329.9109 | -299.1959 | -2.6944 | -2.7236 |
| 0.4931 | 0.83 | 1600 | 0.4684 | -0.6798 | -2.1082 | 0.7656 | 1.4285 | -332.0890 | -300.5934 | -2.7006 | -2.7305 |
| 0.5029 | 0.88 | 1700 | 0.4595 | -0.5063 | -1.8951 | 0.7812 | 1.3889 | -327.8267 | -297.1233 | -2.7108 | -2.7403 |
| 0.4965 | 0.93 | 1800 | 0.4613 | -0.5561 | -1.9079 | 0.7812 | 1.3518 | -328.0831 | -298.1203 | -2.7226 | -2.7523 |
| 0.4337 | 0.98 | 1900 | 0.4608 | -0.5066 | -1.8718 | 0.7656 | 1.3652 | -327.3599 | -297.1296 | -2.7175 | -2.7469 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.14.0 | 9,554 | [
[
-0.042999267578125,
-0.0504150390625,
0.015411376953125,
0.004730224609375,
-0.00807952880859375,
-0.00843048095703125,
-0.005645751953125,
-0.0307159423828125,
0.04644775390625,
0.023284912109375,
-0.038665771484375,
-0.03558349609375,
-0.039398193359375,
-0.005474090576171875,
0.0014858245849609375,
0.0611572265625,
0.01396942138671875,
-0.0145721435546875,
0.00829315185546875,
-0.0133514404296875,
-0.0247802734375,
-0.0276336669921875,
-0.04693603515625,
-0.0142669677734375,
0.02447509765625,
0.032318115234375,
0.0751953125,
0.06378173828125,
0.033721923828125,
0.0272674560546875,
-0.026580810546875,
0.003040313720703125,
-0.0340576171875,
-0.027130126953125,
0.01204681396484375,
-0.038543701171875,
-0.039215087890625,
-0.0092010498046875,
0.0513916015625,
0.0423583984375,
-0.00582122802734375,
0.0194091796875,
0.006252288818359375,
0.058380126953125,
-0.0237274169921875,
0.0309295654296875,
-0.02764892578125,
-0.003582000732421875,
-0.0243072509765625,
-0.00946807861328125,
-0.003726959228515625,
-0.028350830078125,
0.00989532470703125,
-0.04876708984375,
0.020599365234375,
0.0103607177734375,
0.110107421875,
0.00954437255859375,
-0.015869140625,
-0.0033626556396484375,
-0.034881591796875,
0.0478515625,
-0.04998779296875,
0.0216217041015625,
0.0209808349609375,
0.0183868408203125,
-0.017059326171875,
-0.04443359375,
-0.05560302734375,
0.0071868896484375,
-0.01763916015625,
0.02490234375,
-0.02410888671875,
-0.0111083984375,
0.019317626953125,
0.032562255859375,
-0.0439453125,
-0.0157012939453125,
-0.0433349609375,
-0.026702880859375,
0.0439453125,
0.01245880126953125,
0.0277252197265625,
-0.0460205078125,
-0.040740966796875,
-0.0263671875,
-0.0269317626953125,
0.050567626953125,
0.03607177734375,
0.0295257568359375,
-0.0233612060546875,
0.041229248046875,
-0.012451171875,
0.04595947265625,
0.01557159423828125,
-0.011474609375,
0.05517578125,
-0.026702880859375,
-0.0201873779296875,
-0.005130767822265625,
0.07159423828125,
0.05072021484375,
-0.004238128662109375,
0.006832122802734375,
-0.0035991668701171875,
0.0181427001953125,
0.00415802001953125,
-0.056549072265625,
-0.01479339599609375,
0.0352783203125,
-0.0308685302734375,
-0.031982421875,
0.00433349609375,
-0.0611572265625,
0.00646209716796875,
-0.0037059783935546875,
0.027374267578125,
-0.0305633544921875,
-0.037811279296875,
0.0100555419921875,
-0.0211334228515625,
0.0180206298828125,
0.01308441162109375,
-0.06915283203125,
0.017822265625,
0.0259552001953125,
0.07037353515625,
0.01192474365234375,
-0.012176513671875,
-0.0003190040588378906,
0.00496673583984375,
-0.0309295654296875,
0.058380126953125,
-0.026092529296875,
-0.028961181640625,
-0.02191162109375,
0.0257110595703125,
-0.01531219482421875,
-0.0256500244140625,
0.0491943359375,
-0.0269622802734375,
0.0213165283203125,
-0.03717041015625,
-0.0316162109375,
-0.022064208984375,
0.0234222412109375,
-0.046600341796875,
0.08038330078125,
0.0272369384765625,
-0.076904296875,
0.02337646484375,
-0.039154052734375,
0.0007257461547851562,
-0.001644134521484375,
-0.00423431396484375,
-0.0413818359375,
-0.026275634765625,
0.017364501953125,
0.0191802978515625,
-0.029022216796875,
0.011688232421875,
-0.0198822021484375,
-0.0188140869140625,
-0.007312774658203125,
-0.008453369140625,
0.0848388671875,
0.02606201171875,
-0.0379638671875,
0.01189422607421875,
-0.0526123046875,
0.00850677490234375,
0.0235137939453125,
-0.021331787109375,
0.0006318092346191406,
-0.026092529296875,
0.01120758056640625,
0.0197906494140625,
0.02764892578125,
-0.04644775390625,
0.013824462890625,
-0.036346435546875,
0.03594970703125,
0.05078125,
0.0024662017822265625,
0.03564453125,
-0.059051513671875,
0.041351318359375,
0.0291595458984375,
0.023712158203125,
0.002895355224609375,
-0.048919677734375,
-0.0634765625,
-0.0159454345703125,
0.004180908203125,
0.046722412109375,
-0.035125732421875,
0.049713134765625,
-0.0107421875,
-0.059295654296875,
-0.0439453125,
0.0010280609130859375,
0.0224761962890625,
0.06591796875,
0.0254058837890625,
-0.0158843994140625,
-0.037200927734375,
-0.06982421875,
0.00450897216796875,
-0.0221710205078125,
0.023529052734375,
0.03997802734375,
0.037261962890625,
-0.0238494873046875,
0.08123779296875,
-0.036285400390625,
-0.038604736328125,
-0.0212860107421875,
-0.0254669189453125,
0.03863525390625,
0.038421630859375,
0.0712890625,
-0.05816650390625,
-0.046417236328125,
0.0023670196533203125,
-0.059844970703125,
0.01064300537109375,
-0.0021533966064453125,
-0.0219573974609375,
0.0102996826171875,
0.00382232666015625,
-0.050811767578125,
0.032379150390625,
0.039764404296875,
-0.035797119140625,
0.04022216796875,
-0.02130126953125,
0.0178985595703125,
-0.08111572265625,
0.017364501953125,
0.008392333984375,
0.003955841064453125,
-0.03680419921875,
-0.0056304931640625,
0.00140380859375,
-0.0079193115234375,
-0.031707763671875,
0.050323486328125,
-0.036468505859375,
0.009613037109375,
0.005382537841796875,
0.001216888427734375,
-0.001132965087890625,
0.0526123046875,
0.00457763671875,
0.05859375,
0.061431884765625,
-0.032379150390625,
0.0177459716796875,
0.0298614501953125,
-0.017303466796875,
0.025543212890625,
-0.061859130859375,
-0.004730224609375,
-0.022552490234375,
0.0162506103515625,
-0.088134765625,
-0.025848388671875,
0.0386962890625,
-0.039337158203125,
0.00617218017578125,
-0.01763916015625,
-0.016937255859375,
-0.048370361328125,
-0.031707763671875,
0.0006594657897949219,
0.043609619140625,
-0.023956298828125,
0.038055419921875,
0.031036376953125,
0.00725555419921875,
-0.0526123046875,
-0.038848876953125,
-0.0118408203125,
-0.017547607421875,
-0.07049560546875,
0.0279998779296875,
-0.007549285888671875,
-0.0040130615234375,
0.004276275634765625,
0.002559661865234375,
0.001056671142578125,
0.003330230712890625,
0.03717041015625,
0.0248565673828125,
-0.01285552978515625,
-0.0181884765625,
-0.0101165771484375,
-0.010986328125,
-0.003871917724609375,
-0.0042572021484375,
0.04443359375,
-0.03314208984375,
-0.0151824951171875,
-0.044189453125,
0.001689910888671875,
0.047454833984375,
-0.02001953125,
0.0740966796875,
0.04510498046875,
-0.019989013671875,
0.01922607421875,
-0.040069580078125,
-0.00728607177734375,
-0.03466796875,
0.01509857177734375,
-0.032806396484375,
-0.05413818359375,
0.060760498046875,
0.0086822509765625,
0.01953125,
0.0628662109375,
0.03057861328125,
0.0044708251953125,
0.0748291015625,
0.0243988037109375,
-0.01007080078125,
0.03619384765625,
-0.056610107421875,
-0.007083892822265625,
-0.064453125,
-0.038665771484375,
-0.039215087890625,
-0.033050537109375,
-0.037841796875,
-0.0159149169921875,
0.0284423828125,
0.01459503173828125,
-0.03326416015625,
0.0258636474609375,
-0.0555419921875,
0.0154876708984375,
0.039764404296875,
0.0288543701171875,
0.003879547119140625,
-0.0071868896484375,
-0.0168609619140625,
-0.01174163818359375,
-0.052581787109375,
-0.0489501953125,
0.083740234375,
0.026275634765625,
0.037139892578125,
0.023223876953125,
0.0675048828125,
0.004878997802734375,
0.0093994140625,
-0.035797119140625,
0.0232696533203125,
0.0114898681640625,
-0.059906005859375,
-0.0177459716796875,
-0.03131103515625,
-0.0870361328125,
0.033782958984375,
-0.024169921875,
-0.0645751953125,
0.0289459228515625,
0.019012451171875,
-0.040740966796875,
0.0302581787109375,
-0.062744140625,
0.07879638671875,
-0.0169830322265625,
-0.04205322265625,
0.00766754150390625,
-0.06365966796875,
0.0244903564453125,
0.016082763671875,
0.0261077880859375,
-0.01751708984375,
0.007656097412109375,
0.060394287109375,
-0.0501708984375,
0.047943115234375,
-0.031494140625,
0.0223541259765625,
0.039215087890625,
-0.00705718994140625,
0.038238525390625,
0.0064239501953125,
-0.0208587646484375,
0.003658294677734375,
0.0121917724609375,
-0.044189453125,
-0.0244903564453125,
0.061431884765625,
-0.09210205078125,
-0.053985595703125,
-0.042633056640625,
-0.0280303955078125,
0.0111541748046875,
0.03082275390625,
0.031585693359375,
0.03607177734375,
-0.0032634735107421875,
0.01318359375,
0.034942626953125,
-0.0150146484375,
0.0283966064453125,
0.0209197998046875,
-0.0228729248046875,
-0.045745849609375,
0.06304931640625,
0.01384735107421875,
0.020111083984375,
0.0052642822265625,
0.028778076171875,
-0.0386962890625,
-0.022979736328125,
-0.0384521484375,
0.022674560546875,
-0.03265380859375,
-0.01812744140625,
-0.069580078125,
-0.013031005859375,
-0.059051513671875,
-0.02008056640625,
-0.0233154296875,
-0.02447509765625,
-0.03759765625,
-0.00714874267578125,
0.03887939453125,
0.04656982421875,
0.0014066696166992188,
0.031585693359375,
-0.056182861328125,
0.0128936767578125,
0.005199432373046875,
0.007110595703125,
0.00980377197265625,
-0.047637939453125,
-0.00743865966796875,
0.026092529296875,
-0.03814697265625,
-0.06640625,
0.059112548828125,
0.00963592529296875,
0.042999267578125,
0.0352783203125,
0.0022335052490234375,
0.0677490234375,
-0.0091400146484375,
0.05828857421875,
0.018646240234375,
-0.050079345703125,
0.046966552734375,
-0.0287933349609375,
0.0224609375,
0.03753662109375,
0.04266357421875,
-0.03485107421875,
-0.026275634765625,
-0.08233642578125,
-0.064697265625,
0.05615234375,
0.0227203369140625,
0.005290985107421875,
-0.00760650634765625,
0.0289459228515625,
-0.0262603759765625,
0.006378173828125,
-0.061431884765625,
-0.04571533203125,
-0.0265350341796875,
-0.0021190643310546875,
-0.002223968505859375,
-0.0045928955078125,
-0.0070648193359375,
-0.033782958984375,
0.0478515625,
0.01148223876953125,
0.026214599609375,
0.0294189453125,
0.0040435791015625,
-0.0197906494140625,
0.01210784912109375,
0.0438232421875,
0.04522705078125,
-0.04144287109375,
-0.00426483154296875,
-0.0036773681640625,
-0.031768798828125,
0.0082550048828125,
0.005588531494140625,
-0.020416259765625,
-0.00020122528076171875,
0.0188751220703125,
0.04736328125,
0.00452423095703125,
-0.012847900390625,
0.03973388671875,
-0.0028781890869140625,
-0.025482177734375,
-0.029296875,
0.01220703125,
0.0181427001953125,
0.024383544921875,
0.0303192138671875,
0.0268096923828125,
0.001194000244140625,
-0.04937744140625,
0.01010894775390625,
0.0347900390625,
-0.034271240234375,
-0.019866943359375,
0.07135009765625,
0.00966644287109375,
-0.0153350830078125,
0.03826904296875,
-0.01392364501953125,
-0.05596923828125,
0.06072998046875,
0.036895751953125,
0.03662109375,
-0.0096893310546875,
0.0138702392578125,
0.06256103515625,
0.0243988037109375,
-0.006439208984375,
0.035247802734375,
0.008453369140625,
-0.05230712890625,
0.0145416259765625,
-0.0501708984375,
-0.0105133056640625,
0.00433349609375,
-0.045806884765625,
0.03271484375,
-0.042144775390625,
-0.0438232421875,
0.0078277587890625,
0.02203369140625,
-0.05548095703125,
0.01323699951171875,
-0.01300048828125,
0.06890869140625,
-0.077392578125,
0.052947998046875,
0.0496826171875,
-0.05792236328125,
-0.0880126953125,
-0.033203125,
0.004886627197265625,
-0.060333251953125,
0.045196533203125,
0.0117340087890625,
0.00492095947265625,
0.0027332305908203125,
-0.03460693359375,
-0.074951171875,
0.0926513671875,
0.01519012451171875,
-0.043975830078125,
-0.0017604827880859375,
-0.0029201507568359375,
0.049102783203125,
-0.007595062255859375,
0.04693603515625,
0.03399658203125,
0.036407470703125,
0.0185699462890625,
-0.07025146484375,
0.0106658935546875,
-0.041961669921875,
0.005641937255859375,
0.0150299072265625,
-0.0894775390625,
0.08294677734375,
-0.02093505859375,
-0.003948211669921875,
-0.007171630859375,
0.05218505859375,
0.0237274169921875,
-0.0008487701416015625,
0.039947509765625,
0.061920166015625,
0.0556640625,
-0.01360321044921875,
0.08148193359375,
-0.0360107421875,
0.049713134765625,
0.046722412109375,
-0.0048065185546875,
0.055877685546875,
0.0308380126953125,
-0.0308380126953125,
0.03387451171875,
0.0550537109375,
-0.003467559814453125,
0.034698486328125,
-0.004947662353515625,
-0.0185394287109375,
-0.008331298828125,
-0.003421783447265625,
-0.0550537109375,
0.01806640625,
0.0177459716796875,
-0.029022216796875,
-0.01180267333984375,
-0.01509857177734375,
0.01274871826171875,
-0.01219940185546875,
-0.00954437255859375,
0.044952392578125,
-0.01470947265625,
-0.0401611328125,
0.052825927734375,
-0.0111541748046875,
0.0615234375,
-0.049041748046875,
0.0017242431640625,
-0.01494598388671875,
0.0267791748046875,
-0.040740966796875,
-0.0726318359375,
0.019683837890625,
-0.020172119140625,
-0.010894775390625,
0.0007796287536621094,
0.040008544921875,
-0.0224609375,
-0.041351318359375,
0.01479339599609375,
0.0193023681640625,
0.017791748046875,
0.01194000244140625,
-0.06805419921875,
0.003814697265625,
0.0179290771484375,
-0.0369873046875,
0.0269622802734375,
0.032257080078125,
0.007518768310546875,
0.04034423828125,
0.06341552734375,
0.0156402587890625,
0.006427764892578125,
-0.018798828125,
0.0806884765625,
-0.05120849609375,
-0.039825439453125,
-0.050506591796875,
0.030975341796875,
-0.009246826171875,
-0.039825439453125,
0.06915283203125,
0.05767822265625,
0.05047607421875,
0.0005388259887695312,
0.04620361328125,
-0.035186767578125,
0.038360595703125,
-0.01168060302734375,
0.0572509765625,
-0.04718017578125,
0.007740020751953125,
-0.0298309326171875,
-0.0567626953125,
-0.0157012939453125,
0.06353759765625,
-0.0264434814453125,
0.009307861328125,
0.03082275390625,
0.08343505859375,
0.005947113037109375,
0.003841400146484375,
0.01071929931640625,
0.0050811767578125,
0.0233154296875,
0.054168701171875,
0.037689208984375,
-0.051513671875,
0.043212890625,
-0.060516357421875,
-0.02392578125,
-0.019012451171875,
-0.044921875,
-0.054351806640625,
-0.046783447265625,
-0.02862548828125,
-0.044189453125,
-0.005626678466796875,
0.07952880859375,
0.05255126953125,
-0.04632568359375,
-0.03271484375,
0.005046844482421875,
0.0007414817810058594,
-0.01629638671875,
-0.019317626953125,
0.05279541015625,
-0.0089263916015625,
-0.047821044921875,
0.0024852752685546875,
0.00579071044921875,
0.0258026123046875,
-0.0011205673217773438,
-0.0145263671875,
-0.0171051025390625,
-0.0111541748046875,
0.028533935546875,
0.031494140625,
-0.056060791015625,
-0.02020263671875,
0.0160064697265625,
-0.0192108154296875,
0.0271453857421875,
0.0147247314453125,
-0.04034423828125,
0.035675048828125,
0.036895751953125,
0.02545166015625,
0.041717529296875,
0.0128173828125,
0.0135345458984375,
-0.0265350341796875,
0.0222930908203125,
-0.0028820037841796875,
0.034271240234375,
0.011383056640625,
-0.0374755859375,
0.036468505859375,
0.033050537109375,
-0.046539306640625,
-0.0516357421875,
-0.0232086181640625,
-0.103271484375,
0.005359649658203125,
0.0794677734375,
-0.01239776611328125,
-0.0494384765625,
0.004405975341796875,
-0.03363037109375,
0.01482391357421875,
-0.05218505859375,
0.042877197265625,
0.0504150390625,
-0.0250396728515625,
-0.0090179443359375,
-0.041656494140625,
0.03350830078125,
0.021575927734375,
-0.05499267578125,
-0.0093536376953125,
0.034271240234375,
0.034332275390625,
0.02276611328125,
0.058502197265625,
-0.01035308837890625,
0.0177001953125,
0.012847900390625,
0.01282501220703125,
-0.00518035888671875,
-0.0023479461669921875,
-0.01180267333984375,
0.0131072998046875,
0.00518035888671875,
-0.0247955322265625
]
] |
sentence-transformers/distiluse-base-multilingual-cased-v1 | 2023-11-02T09:17:53.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"distilbert",
"feature-extraction",
"sentence-similarity",
"transformers",
"multilingual",
"ar",
"zh",
"nl",
"en",
"fr",
"de",
"it",
"ko",
"pl",
"pt",
"ru",
"es",
"tr",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/distiluse-base-multilingual-cased-v1 | 56 | 82,021 | sentence-transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- ar
- zh
- nl
- en
- fr
- de
- it
- ko
- pl
- pt
- ru
- es
- tr
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# sentence-transformers/distiluse-base-multilingual-cased-v1
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 512-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/distiluse-base-multilingual-cased-v1')
embeddings = model.encode(sentences)
print(embeddings)
```
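Because the model returns dense vectors, semantic search and clustering reduce to vector similarity. A minimal, library-free sketch of cosine similarity over such embeddings (the placeholder vectors below stand in for real `model.encode` outputs):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder 512-dimensional embeddings (the model's output dimensionality).
rng = np.random.default_rng(0)
query, doc = rng.normal(size=512), rng.normal(size=512)
print(cosine_similarity(query, doc))
```

For real workloads, `sentence_transformers.util` provides an equivalent batched similarity, but the underlying computation is the same.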
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/distiluse-base-multilingual-cased-v1)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
(2): Dense({'in_features': 768, 'out_features': 512, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
)
```
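The architecture above is a DistilBERT encoder, mask-aware mean pooling over token embeddings, then a 768→512 Dense layer with Tanh activation. A hedged numpy sketch of the pooling and projection stages (random weights stand in for the trained ones; only the shapes and operations mirror the modules listed above):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Mask-aware mean pooling: average token vectors where the mask is 1."""
    mask = attention_mask[..., None].astype(float)   # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)   # (hidden_dim,)
    counts = np.clip(mask.sum(axis=0), 1e-9, None)   # avoid divide-by-zero
    return summed / counts

def dense_tanh(x, weight, bias):
    """The 768 -> 512 Dense layer with Tanh activation."""
    return np.tanh(x @ weight + bias)

rng = np.random.default_rng(0)
tokens = rng.normal(size=(128, 768))   # max_seq_length x word_embedding_dimension
mask = np.ones(128)
mask[100:] = 0                         # pretend the last 28 positions are padding
pooled = mean_pool(tokens, mask)       # (768,)
W = rng.normal(size=(768, 512)) * 0.01 # stand-in for the trained Dense weights
b = np.zeros(512)
sentence_embedding = dense_tanh(pooled, W, b)
print(sentence_embedding.shape)        # (512,)
```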
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 2,446 | [
[
-0.0190887451171875,
-0.061187744140625,
0.02825927734375,
0.0341796875,
-0.024322509765625,
-0.0189056396484375,
-0.0196990966796875,
0.00954437255859375,
0.0159454345703125,
0.0279541015625,
-0.04296875,
-0.038848876953125,
-0.049591064453125,
0.01520538330078125,
-0.039154052734375,
0.06414794921875,
-0.0086212158203125,
0.00910186767578125,
-0.0246429443359375,
-0.015380859375,
-0.0212860107421875,
-0.048431396484375,
-0.01451873779296875,
-0.0167999267578125,
0.0199432373046875,
0.01248931884765625,
0.033721923828125,
0.014404296875,
0.0301513671875,
0.03399658203125,
-0.016998291015625,
0.001739501953125,
-0.026947021484375,
0.004199981689453125,
0.0005478858947753906,
-0.029388427734375,
-0.0099639892578125,
0.0038852691650390625,
0.03460693359375,
0.04736328125,
-0.018951416015625,
0.0030651092529296875,
-0.004428863525390625,
0.0308074951171875,
-0.022552490234375,
0.0325927734375,
-0.0523681640625,
0.01557159423828125,
-0.00021135807037353516,
-0.0005769729614257812,
-0.050384521484375,
-0.018646240234375,
0.02545166015625,
-0.02923583984375,
0.0206451416015625,
0.013275146484375,
0.0821533203125,
0.0301055908203125,
-0.035888671875,
-0.021942138671875,
-0.03472900390625,
0.062744140625,
-0.06292724609375,
0.0182952880859375,
0.01007843017578125,
0.0095672607421875,
-0.01325225830078125,
-0.07470703125,
-0.055572509765625,
-0.01267242431640625,
-0.0286407470703125,
0.0177001953125,
-0.032318115234375,
0.0004949569702148438,
0.021453857421875,
0.0286407470703125,
-0.052215576171875,
-0.004730224609375,
-0.035614013671875,
-0.015869140625,
0.03125,
-0.0019292831420898438,
0.0247344970703125,
-0.034820556640625,
-0.039764404296875,
-0.0249176025390625,
-0.0190582275390625,
-0.0027065277099609375,
0.0160675048828125,
0.0230255126953125,
-0.0181884765625,
0.06866455078125,
-0.00589752197265625,
0.03961181640625,
0.0016431808471679688,
0.006885528564453125,
0.0511474609375,
-0.038360595703125,
-0.007389068603515625,
0.0109100341796875,
0.0784912109375,
0.0207061767578125,
0.0244598388671875,
-0.006626129150390625,
-0.00872802734375,
0.00841522216796875,
0.0301361083984375,
-0.07501220703125,
-0.0180816650390625,
0.0154571533203125,
-0.03240966796875,
-0.0163116455078125,
0.0193023681640625,
-0.051055908203125,
-0.0015277862548828125,
-0.00270843505859375,
0.040679931640625,
-0.049072265625,
0.0017938613891601562,
0.0308685302734375,
-0.0256195068359375,
0.0126953125,
-0.0266876220703125,
-0.057708740234375,
0.02154541015625,
0.0270233154296875,
0.07080078125,
0.007049560546875,
-0.0322265625,
-0.0208892822265625,
-0.0171966552734375,
0.0057830810546875,
0.0537109375,
-0.0189666748046875,
-0.01375579833984375,
0.0184478759765625,
0.0208740234375,
-0.0218963623046875,
-0.030426025390625,
0.06378173828125,
-0.0247802734375,
0.0386962890625,
0.00562286376953125,
-0.057769775390625,
-0.0225677490234375,
0.007518768310546875,
-0.054351806640625,
0.08245849609375,
0.013519287109375,
-0.06591796875,
0.0174560546875,
-0.0496826171875,
-0.037689208984375,
-0.005878448486328125,
0.00482177734375,
-0.057708740234375,
0.0157623291015625,
0.033233642578125,
0.05535888671875,
0.0035247802734375,
0.027252197265625,
-0.0241851806640625,
-0.0242156982421875,
0.03289794921875,
-0.033172607421875,
0.08245849609375,
0.01030731201171875,
-0.01548004150390625,
0.007152557373046875,
-0.0416259765625,
-0.01082611083984375,
0.018035888671875,
-0.0207061767578125,
-0.033660888671875,
0.005107879638671875,
0.0185089111328125,
0.01305389404296875,
0.026397705078125,
-0.0533447265625,
0.027984619140625,
-0.032073974609375,
0.061737060546875,
0.03399658203125,
0.0006823539733886719,
0.0426025390625,
-0.012176513671875,
0.0229339599609375,
0.033721923828125,
0.0028705596923828125,
-0.0189666748046875,
-0.031402587890625,
-0.0655517578125,
-0.028167724609375,
0.0275115966796875,
0.046417236328125,
-0.06195068359375,
0.0767822265625,
-0.041961669921875,
-0.03948974609375,
-0.0599365234375,
-0.008453369140625,
0.00537872314453125,
0.0223236083984375,
0.04559326171875,
0.00008952617645263672,
-0.05072021484375,
-0.078857421875,
-0.005390167236328125,
-0.00563812255859375,
0.0123443603515625,
0.0027751922607421875,
0.0546875,
-0.0355224609375,
0.08111572265625,
-0.048736572265625,
-0.0260467529296875,
-0.035888671875,
0.02081298828125,
0.0253448486328125,
0.0280914306640625,
0.048828125,
-0.06463623046875,
-0.03753662109375,
-0.0284271240234375,
-0.0438232421875,
-0.00814056396484375,
-0.01409912109375,
-0.004253387451171875,
0.0139617919921875,
0.037811279296875,
-0.05596923828125,
0.0242919921875,
0.044647216796875,
-0.0389404296875,
0.031646728515625,
-0.020599365234375,
-0.007724761962890625,
-0.11328125,
-0.004756927490234375,
0.01030731201171875,
-0.0154876708984375,
-0.04071044921875,
0.0006966590881347656,
0.0177459716796875,
0.003162384033203125,
-0.03472900390625,
0.02099609375,
-0.031829833984375,
0.01033782958984375,
0.00989532470703125,
0.0208892822265625,
0.0008454322814941406,
0.0574951171875,
-0.00798797607421875,
0.0574951171875,
0.039154052734375,
-0.0304718017578125,
0.027587890625,
0.050048828125,
-0.051055908203125,
0.024627685546875,
-0.065673828125,
-0.01468658447265625,
-0.00772857666015625,
0.0242462158203125,
-0.08258056640625,
0.01386260986328125,
0.006298065185546875,
-0.030792236328125,
0.0061492919921875,
-0.000039637088775634766,
-0.056884765625,
-0.041290283203125,
-0.0289764404296875,
0.010650634765625,
0.036773681640625,
-0.043487548828125,
0.0377197265625,
0.0198822021484375,
-0.0179443359375,
-0.03912353515625,
-0.08465576171875,
-0.002330780029296875,
-0.024932861328125,
-0.049560546875,
0.0491943359375,
-0.012542724609375,
0.000812530517578125,
0.0176544189453125,
0.01256561279296875,
-0.01151275634765625,
0.001918792724609375,
0.003993988037109375,
0.0185546875,
-0.00994110107421875,
0.0178070068359375,
0.02734375,
-0.006305694580078125,
-0.004547119140625,
-0.01399993896484375,
0.05804443359375,
-0.0207061767578125,
-0.017913818359375,
-0.0196075439453125,
0.023345947265625,
0.039764404296875,
-0.0222015380859375,
0.0771484375,
0.06591796875,
-0.0228424072265625,
-0.0132904052734375,
-0.036895751953125,
-0.020782470703125,
-0.035675048828125,
0.054412841796875,
-0.022857666015625,
-0.08087158203125,
0.02764892578125,
-0.0013885498046875,
-0.006229400634765625,
0.04949951171875,
0.046600341796875,
-0.0005707740783691406,
0.0584716796875,
0.049285888671875,
-0.024658203125,
0.0374755859375,
-0.0367431640625,
0.037506103515625,
-0.062469482421875,
-0.0037555694580078125,
-0.03411865234375,
-0.0190887451171875,
-0.057769775390625,
-0.0286407470703125,
0.0211029052734375,
0.007389068603515625,
-0.01386260986328125,
0.05340576171875,
-0.044219970703125,
0.01666259765625,
0.040252685546875,
-0.0022144317626953125,
0.004688262939453125,
0.0179443359375,
-0.03521728515625,
-0.00891876220703125,
-0.05596923828125,
-0.046142578125,
0.060516357421875,
0.0254058837890625,
0.03839111328125,
0.006786346435546875,
0.05267333984375,
0.005126953125,
-0.01305389404296875,
-0.06622314453125,
0.04022216796875,
-0.035491943359375,
-0.034149169921875,
-0.0211029052734375,
-0.0273284912109375,
-0.07843017578125,
0.034210205078125,
-0.0108642578125,
-0.051605224609375,
0.00829315185546875,
-0.031463623046875,
-0.0240478515625,
0.0159759521484375,
-0.06524658203125,
0.076904296875,
-0.01345062255859375,
0.0084991455078125,
-0.01187896728515625,
-0.04022216796875,
0.008697509765625,
0.01033782958984375,
0.006053924560546875,
0.001983642578125,
0.0023822784423828125,
0.048736572265625,
-0.02789306640625,
0.06732177734375,
-0.00354766845703125,
0.006687164306640625,
0.021148681640625,
-0.016632080078125,
0.017791748046875,
-0.0013370513916015625,
-0.0095672607421875,
0.0191192626953125,
0.004039764404296875,
-0.0182647705078125,
-0.03802490234375,
0.055206298828125,
-0.06524658203125,
-0.02679443359375,
-0.04364013671875,
-0.040313720703125,
0.0080718994140625,
0.01763916015625,
0.0187835693359375,
0.0352783203125,
-0.02313232421875,
0.04388427734375,
0.038238525390625,
-0.02923583984375,
0.04681396484375,
0.0293731689453125,
-0.00493621826171875,
-0.0242156982421875,
0.044281005859375,
0.004596710205078125,
-0.0011005401611328125,
0.048370361328125,
0.01457977294921875,
-0.037353515625,
-0.0199737548828125,
-0.025177001953125,
0.017852783203125,
-0.040252685546875,
-0.007808685302734375,
-0.0634765625,
-0.037567138671875,
-0.037628173828125,
0.004497528076171875,
-0.01459503173828125,
-0.039703369140625,
-0.037841796875,
-0.0294952392578125,
0.039581298828125,
0.037261962890625,
0.00566864013671875,
0.025360107421875,
-0.051055908203125,
0.0172119140625,
0.0097503662109375,
0.01306915283203125,
-0.01324462890625,
-0.0465087890625,
-0.0171966552734375,
0.009735107421875,
-0.0293121337890625,
-0.0711669921875,
0.0501708984375,
0.017791748046875,
0.05145263671875,
0.01416015625,
0.00909423828125,
0.048828125,
-0.0533447265625,
0.07196044921875,
0.007373809814453125,
-0.07220458984375,
0.032623291015625,
-0.00019848346710205078,
0.0257415771484375,
0.044097900390625,
0.03680419921875,
-0.04241943359375,
-0.0219268798828125,
-0.043853759765625,
-0.08441162109375,
0.049468994140625,
0.0272216796875,
0.034088134765625,
-0.023040771484375,
0.0216522216796875,
-0.00441741943359375,
0.014404296875,
-0.08135986328125,
-0.0361328125,
-0.0267333984375,
-0.039581298828125,
-0.0238494873046875,
-0.0285797119140625,
0.01375579833984375,
-0.02972412109375,
0.05621337890625,
0.0011157989501953125,
0.0390625,
0.022491455078125,
-0.033843994140625,
0.032958984375,
0.0323486328125,
0.041015625,
0.005054473876953125,
-0.00775909423828125,
0.0152435302734375,
0.0171356201171875,
-0.0248260498046875,
0.006908416748046875,
0.037200927734375,
-0.005619049072265625,
0.0198516845703125,
0.0263214111328125,
0.072265625,
0.0271453857421875,
-0.035125732421875,
0.05743408203125,
-0.00872802734375,
-0.020599365234375,
-0.039337158203125,
-0.0225372314453125,
0.0182952880859375,
0.0231170654296875,
0.0240936279296875,
-0.0019741058349609375,
0.01041412353515625,
-0.03570556640625,
0.0213775634765625,
0.0165252685546875,
-0.037384033203125,
-0.01099395751953125,
0.036407470703125,
0.003780364990234375,
-0.01444244384765625,
0.0701904296875,
-0.0343017578125,
-0.06207275390625,
0.034210205078125,
0.041015625,
0.07513427734375,
0.0079345703125,
0.030426025390625,
0.038604736328125,
0.033721923828125,
-0.01502227783203125,
0.01393890380859375,
0.007965087890625,
-0.069091796875,
-0.02642822265625,
-0.0498046875,
0.0111083984375,
0.003871917724609375,
-0.047821044921875,
0.021148681640625,
0.0023403167724609375,
-0.01189422607421875,
-0.00598907470703125,
-0.0021991729736328125,
-0.047698974609375,
-0.0201416015625,
-0.002471923828125,
0.06805419921875,
-0.06842041015625,
0.06671142578125,
0.05584716796875,
-0.059295654296875,
-0.046600341796875,
-0.00830841064453125,
-0.022125244140625,
-0.040802001953125,
0.04010009765625,
0.0233154296875,
0.0213470458984375,
-0.0021076202392578125,
-0.030914306640625,
-0.053558349609375,
0.098876953125,
0.025665283203125,
-0.0462646484375,
0.0030689239501953125,
0.025360107421875,
0.050689697265625,
-0.0250701904296875,
0.02685546875,
0.0294952392578125,
0.0318603515625,
-0.01141357421875,
-0.059295654296875,
0.010833740234375,
-0.0287017822265625,
0.02239990234375,
-0.003185272216796875,
-0.04052734375,
0.07672119140625,
0.00040650367736816406,
-0.0126800537109375,
0.021728515625,
0.051239013671875,
0.0198974609375,
-0.0155181884765625,
0.033660888671875,
0.06634521484375,
0.04388427734375,
-0.0207366943359375,
0.0694580078125,
-0.0263824462890625,
0.050445556640625,
0.07501220703125,
-0.01045989990234375,
0.077880859375,
0.04400634765625,
-0.01343536376953125,
0.0704345703125,
0.03350830078125,
-0.0265655517578125,
0.057403564453125,
0.02490234375,
-0.0012979507446289062,
-0.0028438568115234375,
0.01385498046875,
-0.01491546630859375,
0.032989501953125,
0.016204833984375,
-0.035491943359375,
-0.0079498291015625,
-0.0009555816650390625,
0.0030117034912109375,
0.01081085205078125,
0.014404296875,
0.04742431640625,
0.0099029541015625,
-0.038604736328125,
0.0235137939453125,
0.01526641845703125,
0.07647705078125,
-0.039703369140625,
0.003017425537109375,
-0.01007843017578125,
0.0308685302734375,
0.0026683807373046875,
-0.052093505859375,
0.0224456787109375,
-0.01097869873046875,
-0.0179901123046875,
-0.0244598388671875,
0.042510986328125,
-0.050933837890625,
-0.057403564453125,
0.029998779296875,
0.041290283203125,
0.0007991790771484375,
-0.0009984970092773438,
-0.067626953125,
-0.00830841064453125,
0.0017766952514648438,
-0.0253448486328125,
0.01345062255859375,
0.040985107421875,
0.01441192626953125,
0.04229736328125,
0.031524658203125,
-0.01024627685546875,
0.01259613037109375,
0.0205078125,
0.055023193359375,
-0.04315185546875,
-0.0399169921875,
-0.06732177734375,
0.05413818359375,
-0.0189666748046875,
-0.025299072265625,
0.060516357421875,
0.05645751953125,
0.06939697265625,
-0.025726318359375,
0.054412841796875,
-0.01239013671875,
0.0164947509765625,
-0.0347900390625,
0.06256103515625,
-0.0400390625,
-0.00220489501953125,
-0.0181732177734375,
-0.07171630859375,
-0.016632080078125,
0.0765380859375,
-0.0175933837890625,
-0.00006133317947387695,
0.07135009765625,
0.066650390625,
-0.01617431640625,
-0.01102447509765625,
0.0079345703125,
0.041107177734375,
0.02032470703125,
0.02874755859375,
0.0389404296875,
-0.05535888671875,
0.048736572265625,
-0.035919189453125,
0.0001304149627685547,
-0.01329803466796875,
-0.06689453125,
-0.07196044921875,
-0.07733154296875,
-0.032684326171875,
-0.00937652587890625,
-0.0127410888671875,
0.063720703125,
0.04296875,
-0.06005859375,
-0.01540374755859375,
-0.0153961181640625,
-0.0072479248046875,
-0.017974853515625,
-0.0223236083984375,
0.026214599609375,
-0.034149169921875,
-0.05999755859375,
0.021148681640625,
0.00034356117248535156,
0.001445770263671875,
-0.034271240234375,
0.004650115966796875,
-0.046295166015625,
0.0030345916748046875,
0.05694580078125,
-0.0253143310546875,
-0.061920166015625,
-0.01568603515625,
0.001300811767578125,
-0.02581787109375,
-0.00841522216796875,
0.035400390625,
-0.050506591796875,
0.023101806640625,
0.037567138671875,
0.033233642578125,
0.052093505859375,
-0.0232086181640625,
0.03399658203125,
-0.06756591796875,
0.028045654296875,
-0.0004949569702148438,
0.0677490234375,
0.029296875,
-0.01007843017578125,
0.041290283203125,
0.00788116455078125,
-0.030242919921875,
-0.04638671875,
-0.00603485107421875,
-0.09698486328125,
-0.029022216796875,
0.09698486328125,
-0.0213470458984375,
-0.0167694091796875,
0.01532745361328125,
-0.033050537109375,
0.033935546875,
-0.0298614501953125,
0.05853271484375,
0.06964111328125,
0.0263214111328125,
-0.00865936279296875,
-0.022918701171875,
0.01239013671875,
0.0291290283203125,
-0.044281005859375,
-0.016998291015625,
0.0247344970703125,
0.019134521484375,
0.02685546875,
0.0143280029296875,
-0.01404571533203125,
-0.00933837890625,
0.0009088516235351562,
0.02154541015625,
-0.006595611572265625,
-0.00174713134765625,
-0.036407470703125,
0.00717926025390625,
-0.02935791015625,
-0.01337432861328125
]
] |
Helsinki-NLP/opus-mt-en-zh | 2023-08-16T11:31:42.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"marian",
"text2text-generation",
"translation",
"en",
"zh",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-zh | 207 | 81,956 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
- zh
tags:
- translation
license: apache-2.0
---
### eng-zho
* source group: English
* target group: Chinese
* OPUS readme: [eng-zho](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-zho/README.md)
* model: transformer
* source language(s): eng
* target language(s): cjy_Hans cjy_Hant cmn cmn_Hans cmn_Hant gan lzh lzh_Hans nan wuu yue yue_Hans yue_Hant
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* a sentence-initial language token is required in the form `>>id<<` (where id is a valid target language ID)
* download original weights: [opus-2020-07-17.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zho/opus-2020-07-17.zip)
* test set translations: [opus-2020-07-17.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zho/opus-2020-07-17.test.txt)
* test set scores: [opus-2020-07-17.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zho/opus-2020-07-17.eval.txt)
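The required `>>id<<` prefix can be sketched with a small helper (an illustration only — the helper name is ours; `cmn_Hans` is one of the target-language IDs listed above, and actual inference would go through `MarianTokenizer`/`MarianMTModel` from `transformers`, which handle tokenization of the prefixed text):

```python
def with_target_token(text: str, lang_id: str) -> str:
    # Multi-target OPUS-MT models expect a sentence-initial ">>id<<"
    # token, where id is a valid target-language ID from this card
    # (e.g. cmn_Hans, yue_Hant).
    return f">>{lang_id}<< {text}"

print(with_target_token("How are you?", "cmn_Hans"))
# -> >>cmn_Hans<< How are you?
```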
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba-test.eng.zho | 31.4 | 0.268 |
### System Info:
- hf_name: eng-zho
- source_languages: eng
- target_languages: zho
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-zho/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['en', 'zh']
- src_constituents: {'eng'}
- tgt_constituents: {'cmn_Hans', 'nan', 'nan_Hani', 'gan', 'yue', 'cmn_Kana', 'yue_Hani', 'wuu_Bopo', 'cmn_Latn', 'yue_Hira', 'cmn_Hani', 'cjy_Hans', 'cmn', 'lzh_Hang', 'lzh_Hira', 'cmn_Hant', 'lzh_Bopo', 'zho', 'zho_Hans', 'zho_Hant', 'lzh_Hani', 'yue_Hang', 'wuu', 'yue_Kana', 'wuu_Latn', 'yue_Bopo', 'cjy_Hant', 'yue_Hans', 'lzh', 'cmn_Hira', 'lzh_Yiii', 'lzh_Hans', 'cmn_Bopo', 'cmn_Hang', 'hak_Hani', 'cmn_Yiii', 'yue_Hant', 'lzh_Kana', 'wuu_Hani'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zho/opus-2020-07-17.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-zho/opus-2020-07-17.test.txt
- src_alpha3: eng
- tgt_alpha3: zho
- short_pair: en-zh
- chrF2_score: 0.268
- bleu: 31.4
- brevity_penalty: 0.896
- ref_len: 110468.0
- src_name: English
- tgt_name: Chinese
- train_date: 2020-07-17
- src_alpha2: en
- tgt_alpha2: zh
- prefer_old: False
- long_pair: eng-zho
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41
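The `brevity_penalty` and `ref_len` fields above relate as in standard BLEU. A short sketch (the hypothesis length of ~99,500 characters-worth of tokens is our back-calculated assumption, not a value reported by the card):

```python
import math

def brevity_penalty(hyp_len: float, ref_len: float) -> float:
    # BLEU brevity penalty: 1.0 when the system output is at least as
    # long as the reference, exp(1 - ref_len / hyp_len) otherwise.
    if hyp_len >= ref_len:
        return 1.0
    return math.exp(1.0 - ref_len / hyp_len)

# With ref_len = 110468.0 from this card, a penalty of ~0.896 implies
# system output roughly 10% shorter than the reference:
print(round(brevity_penalty(99_500, 110_468.0), 3))  # -> 0.896
```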
| 2,670 | [
[
-0.028961181640625,
-0.03643798828125,
0.0242919921875,
0.0296173095703125,
-0.040283203125,
-0.018035888671875,
-0.022735595703125,
-0.0307464599609375,
0.0204315185546875,
0.02142333984375,
-0.047119140625,
-0.060272216796875,
-0.0307464599609375,
0.0278472900390625,
0.00901031494140625,
0.0675048828125,
-0.0033111572265625,
0.0057830810546875,
0.03643798828125,
-0.0260467529296875,
-0.0249481201171875,
-0.01995849609375,
-0.03619384765625,
-0.0184783935546875,
0.035369873046875,
0.02587890625,
0.044647216796875,
0.045196533203125,
0.03936767578125,
0.021636962890625,
-0.0269012451171875,
0.015472412109375,
-0.007732391357421875,
-0.01483154296875,
0.0027523040771484375,
-0.03485107421875,
-0.044677734375,
-0.0171356201171875,
0.06011962890625,
0.039031982421875,
0.0018405914306640625,
0.02459716796875,
-0.0097808837890625,
0.05023193359375,
-0.0140533447265625,
0.0105133056640625,
-0.039581298828125,
0.0030345916748046875,
-0.0333251953125,
-0.017303466796875,
-0.03662109375,
-0.019256591796875,
0.00310516357421875,
-0.047454833984375,
0.00841522216796875,
0.00872802734375,
0.1259765625,
-0.002712249755859375,
-0.025543212890625,
-0.016693115234375,
-0.02557373046875,
0.06573486328125,
-0.07373046875,
0.0218353271484375,
0.0278472900390625,
-0.0013780593872070312,
-0.003391265869140625,
-0.0309295654296875,
-0.03533935546875,
0.0021610260009765625,
-0.0214691162109375,
0.0264129638671875,
-0.0063934326171875,
-0.0065460205078125,
0.018463134765625,
0.049041748046875,
-0.0496826171875,
0.003276824951171875,
-0.0296630859375,
-0.0167236328125,
0.04693603515625,
0.008331298828125,
0.0421142578125,
-0.042816162109375,
-0.033935546875,
-0.0282135009765625,
-0.03167724609375,
0.021636962890625,
0.03375244140625,
0.032623291015625,
-0.03094482421875,
0.040283203125,
-0.0138397216796875,
0.044769287109375,
-0.002597808837890625,
-0.00818634033203125,
0.059234619140625,
-0.05267333984375,
-0.0156402587890625,
-0.012420654296875,
0.08526611328125,
0.02777099609375,
-0.0034160614013671875,
0.006793975830078125,
-0.02166748046875,
-0.013641357421875,
-0.0096588134765625,
-0.060943603515625,
0.0022945404052734375,
0.0263214111328125,
-0.036224365234375,
-0.01531219482421875,
0.01386260986328125,
-0.061737060546875,
0.00800323486328125,
0.00969696044921875,
0.03753662109375,
-0.05389404296875,
-0.0236358642578125,
0.01983642578125,
-0.007171630859375,
0.02557373046875,
-0.00279998779296875,
-0.050872802734375,
-0.0002048015594482422,
0.0242767333984375,
0.071533203125,
0.0011949539184570312,
-0.03515625,
-0.0196533203125,
0.00659942626953125,
-0.022674560546875,
0.047119140625,
-0.01180267333984375,
-0.0281524658203125,
-0.00024318695068359375,
0.03338623046875,
-0.01146697998046875,
-0.0190887451171875,
0.0697021484375,
-0.0233154296875,
0.032501220703125,
-0.03399658203125,
-0.03375244140625,
-0.0325927734375,
0.0229949951171875,
-0.059173583984375,
0.08294677734375,
0.0203857421875,
-0.06500244140625,
0.02764892578125,
-0.05145263671875,
-0.0281219482421875,
-0.0012674331665039062,
0.00989532470703125,
-0.05224609375,
-0.0062713623046875,
0.0243682861328125,
0.0264129638671875,
-0.03167724609375,
0.024688720703125,
0.004913330078125,
-0.0270843505859375,
-0.0102081298828125,
-0.0125274658203125,
0.1055908203125,
0.015777587890625,
-0.032196044921875,
0.015594482421875,
-0.06719970703125,
0.00783538818359375,
0.027618408203125,
-0.031036376953125,
-0.021759033203125,
-0.0088958740234375,
0.01479339599609375,
0.0125579833984375,
0.020843505859375,
-0.041748046875,
0.0219879150390625,
-0.0499267578125,
0.022430419921875,
0.056121826171875,
0.00775909423828125,
0.01335906982421875,
-0.033905029296875,
0.0360107421875,
0.0178375244140625,
0.01097869873046875,
-0.00250244140625,
-0.043487548828125,
-0.052825927734375,
-0.0134735107421875,
0.032196044921875,
0.054107666015625,
-0.06768798828125,
0.062744140625,
-0.04302978515625,
-0.06414794921875,
-0.051849365234375,
-0.0063934326171875,
0.036956787109375,
0.020843505859375,
0.0389404296875,
-0.008636474609375,
-0.039459228515625,
-0.07470703125,
-0.01242828369140625,
-0.0192108154296875,
-0.002658843994140625,
0.01325225830078125,
0.05010986328125,
-0.00884246826171875,
0.03570556640625,
-0.033721923828125,
-0.0426025390625,
-0.01910400390625,
0.00731658935546875,
0.03717041015625,
0.039306640625,
0.060272216796875,
-0.06011962890625,
-0.0509033203125,
0.0164947509765625,
-0.0526123046875,
-0.006740570068359375,
-0.0103912353515625,
-0.01305389404296875,
0.025115966796875,
0.00395965576171875,
-0.032684326171875,
0.0142059326171875,
0.045166015625,
-0.047210693359375,
0.040802001953125,
-0.02178955078125,
0.023468017578125,
-0.10260009765625,
0.0107574462890625,
-0.00847625732421875,
0.007373809814453125,
-0.03753662109375,
0.00481414794921875,
0.014923095703125,
0.01213836669921875,
-0.043731689453125,
0.047149658203125,
-0.05303955078125,
0.005462646484375,
0.0227813720703125,
0.010986328125,
-0.00202178955078125,
0.06768798828125,
-0.00824737548828125,
0.071533203125,
0.0460205078125,
-0.03375244140625,
0.0122222900390625,
0.03399658203125,
-0.039703369140625,
0.01351165771484375,
-0.04888916015625,
-0.0125732421875,
0.017486572265625,
-0.004375457763671875,
-0.0596923828125,
-0.0108184814453125,
0.0239715576171875,
-0.0496826171875,
0.010009765625,
0.0010194778442382812,
-0.046478271484375,
-0.02252197265625,
-0.037384033203125,
0.032958984375,
0.032745361328125,
-0.0191802978515625,
0.05047607421875,
0.0059661865234375,
0.00421905517578125,
-0.048583984375,
-0.06640625,
-0.00438690185546875,
-0.00899505615234375,
-0.04998779296875,
0.03167724609375,
-0.0156707763671875,
-0.002277374267578125,
0.0115203857421875,
0.004741668701171875,
-0.0091400146484375,
0.006931304931640625,
-0.00479888916015625,
0.0284423828125,
-0.0232086181640625,
0.0013742446899414062,
-0.0015621185302734375,
-0.0033054351806640625,
-0.012969970703125,
-0.0037078857421875,
0.060943603515625,
-0.025970458984375,
-0.01123809814453125,
-0.050506591796875,
0.01081085205078125,
0.037353515625,
-0.0316162109375,
0.07586669921875,
0.047943115234375,
-0.0192413330078125,
0.006725311279296875,
-0.028778076171875,
0.00914764404296875,
-0.0308685302734375,
0.0252838134765625,
-0.0400390625,
-0.04644775390625,
0.07672119140625,
0.0261383056640625,
0.0208282470703125,
0.0709228515625,
0.043243408203125,
0.014007568359375,
0.05096435546875,
0.0243682861328125,
0.00112152099609375,
0.03912353515625,
-0.0419921875,
-0.00455474853515625,
-0.06341552734375,
-0.01093292236328125,
-0.048004150390625,
-0.00710296630859375,
-0.061065673828125,
-0.019256591796875,
0.0187835693359375,
-0.004154205322265625,
-0.00534820556640625,
0.0501708984375,
-0.034820556640625,
0.0203399658203125,
0.046142578125,
0.01424407958984375,
0.023773193359375,
-0.00923919677734375,
-0.0265960693359375,
-0.0013580322265625,
-0.036712646484375,
-0.037017822265625,
0.08172607421875,
0.015350341796875,
0.003662109375,
0.0196380615234375,
0.046112060546875,
0.005649566650390625,
0.01178741455078125,
-0.05548095703125,
0.04754638671875,
-0.01483154296875,
-0.062286376953125,
-0.037689208984375,
-0.027618408203125,
-0.0675048828125,
0.0298614501953125,
-0.00893402099609375,
-0.047332763671875,
0.0140228271484375,
-0.01428985595703125,
-0.021728515625,
0.049896240234375,
-0.045196533203125,
0.0645751953125,
-0.0012769699096679688,
-0.015594482421875,
0.00982666015625,
-0.05126953125,
0.0249481201171875,
-0.0003821849822998047,
0.0160675048828125,
-0.0156097412109375,
-0.005889892578125,
0.0648193359375,
-0.0323486328125,
0.038360595703125,
-0.004245758056640625,
-0.020721435546875,
0.0227813720703125,
0.0032558441162109375,
0.03753662109375,
-0.01204681396484375,
-0.029296875,
0.026763916015625,
0.004276275634765625,
-0.04180908203125,
-0.019561767578125,
0.0328369140625,
-0.056365966796875,
-0.042266845703125,
-0.040771484375,
-0.047210693359375,
0.005157470703125,
0.042236328125,
0.046234130859375,
0.0384521484375,
0.004619598388671875,
0.042816162109375,
0.04205322265625,
-0.0281829833984375,
0.039306640625,
0.03619384765625,
-0.0016326904296875,
-0.045318603515625,
0.056182861328125,
0.0303802490234375,
0.0193328857421875,
0.034820556640625,
0.01116943359375,
-0.008544921875,
-0.058349609375,
-0.04071044921875,
0.0280914306640625,
-0.03643798828125,
-0.0222320556640625,
-0.03643798828125,
-0.0126495361328125,
-0.0268096923828125,
0.007633209228515625,
-0.0211944580078125,
-0.01392364501953125,
-0.01142120361328125,
-0.0196533203125,
0.02508544921875,
0.0238037109375,
-0.003459930419921875,
0.020904541015625,
-0.07421875,
0.02386474609375,
-0.0053558349609375,
0.029815673828125,
-0.00662994384765625,
-0.05029296875,
-0.029754638671875,
0.004749298095703125,
-0.01474761962890625,
-0.0819091796875,
0.043670654296875,
0.0018739700317382812,
0.032867431640625,
0.0196533203125,
0.0198516845703125,
0.04473876953125,
-0.0322265625,
0.0809326171875,
-0.004283905029296875,
-0.07183837890625,
0.05145263671875,
-0.04095458984375,
0.0260009765625,
0.042633056640625,
0.0213775634765625,
-0.0377197265625,
-0.04412841796875,
-0.0533447265625,
-0.06854248046875,
0.070556640625,
0.037078857421875,
-0.0007734298706054688,
-0.0054779052734375,
0.006328582763671875,
-0.0034923553466796875,
-0.012542724609375,
-0.08648681640625,
-0.0430908203125,
0.006214141845703125,
-0.024749755859375,
0.00485992431640625,
-0.034210205078125,
-0.008392333984375,
-0.019989013671875,
0.08074951171875,
0.01360321044921875,
0.00884246826171875,
0.0443115234375,
-0.0089263916015625,
-0.006732940673828125,
0.0246429443359375,
0.05712890625,
0.0400390625,
-0.027740478515625,
-0.0164642333984375,
0.0201568603515625,
-0.0428466796875,
0.00386810302734375,
-0.0020580291748046875,
-0.043365478515625,
0.025848388671875,
0.043853759765625,
0.066650390625,
0.0113372802734375,
-0.040771484375,
0.036224365234375,
0.002166748046875,
-0.039398193359375,
-0.027740478515625,
-0.01500701904296875,
0.00832366943359375,
0.008148193359375,
0.0295562744140625,
-0.01074981689453125,
0.0022640228271484375,
-0.037445068359375,
0.002704620361328125,
0.007717132568359375,
-0.019561767578125,
-0.0233612060546875,
0.044708251953125,
0.00455474853515625,
-0.0235748291015625,
0.034454345703125,
-0.027099609375,
-0.040557861328125,
0.05853271484375,
0.0212249755859375,
0.0723876953125,
-0.021484375,
-0.00316619873046875,
0.059234619140625,
0.039886474609375,
-0.00856781005859375,
0.034820556640625,
0.01300048828125,
-0.054901123046875,
-0.01218414306640625,
-0.04345703125,
0.001583099365234375,
0.01019287109375,
-0.0634765625,
0.02392578125,
-0.00988006591796875,
-0.0157470703125,
-0.00890350341796875,
0.0264739990234375,
-0.034210205078125,
0.00920867919921875,
-0.019989013671875,
0.06689453125,
-0.06707763671875,
0.06781005859375,
0.045440673828125,
-0.062469482421875,
-0.08172607421875,
0.004436492919921875,
-0.025543212890625,
-0.042572021484375,
0.035980224609375,
0.00533294677734375,
0.00356292724609375,
-0.0012607574462890625,
-0.023834228515625,
-0.052978515625,
0.09228515625,
0.017822265625,
-0.035491943359375,
-0.007534027099609375,
-0.011016845703125,
0.042999267578125,
0.005176544189453125,
0.011383056640625,
0.029449462890625,
0.056304931640625,
-0.01224517822265625,
-0.0794677734375,
0.01751708984375,
-0.0440673828125,
0.0010547637939453125,
0.01690673828125,
-0.07342529296875,
0.07720947265625,
-0.00034546852111816406,
-0.0240020751953125,
0.006793975830078125,
0.05291748046875,
0.032501220703125,
0.008270263671875,
0.04046630859375,
0.051666259765625,
0.04400634765625,
-0.0306243896484375,
0.07452392578125,
-0.02447509765625,
0.04461669921875,
0.06549072265625,
0.00795745849609375,
0.05682373046875,
0.033660888671875,
-0.02484130859375,
0.04730224609375,
0.0560302734375,
-0.0238800048828125,
0.0298614501953125,
-0.00830078125,
-0.0005116462707519531,
-0.0172119140625,
-0.00946044921875,
-0.043731689453125,
0.0274658203125,
0.004886627197265625,
-0.019012451171875,
0.0005125999450683594,
-0.01456451416015625,
0.0243988037109375,
0.0146331787109375,
-0.0184326171875,
0.042633056640625,
-0.0095062255859375,
-0.046234130859375,
0.053131103515625,
0.002071380615234375,
0.060272216796875,
-0.052978515625,
0.00383758544921875,
-0.018157958984375,
0.0101776123046875,
-0.00919342041015625,
-0.0595703125,
0.020751953125,
-0.00023043155670166016,
-0.016937255859375,
-0.00870513916015625,
0.009735107421875,
-0.03594970703125,
-0.056976318359375,
0.03863525390625,
0.032989501953125,
0.0126800537109375,
0.01198577880859375,
-0.05010986328125,
-0.0033435821533203125,
0.0223846435546875,
-0.052001953125,
-0.0003516674041748047,
0.057159423828125,
0.0015716552734375,
0.05096435546875,
0.039642333984375,
0.0189056396484375,
0.005702972412109375,
0.006114959716796875,
0.048065185546875,
-0.0626220703125,
-0.039947509765625,
-0.06243896484375,
0.039093017578125,
-0.00537872314453125,
-0.04248046875,
0.06439208984375,
0.05303955078125,
0.06414794921875,
-0.01517486572265625,
0.034515380859375,
-0.0174102783203125,
0.03228759765625,
-0.050384521484375,
0.058258056640625,
-0.07391357421875,
0.0027904510498046875,
-0.019256591796875,
-0.05548095703125,
-0.030303955078125,
0.0301361083984375,
-0.01497650146484375,
-0.0006399154663085938,
0.07073974609375,
0.049041748046875,
0.01824951171875,
-0.0202178955078125,
0.0114593505859375,
0.02880859375,
0.0241546630859375,
0.06719970703125,
0.02239990234375,
-0.0750732421875,
0.056365966796875,
-0.037872314453125,
0.010406494140625,
-0.0090789794921875,
-0.057647705078125,
-0.0596923828125,
-0.058380126953125,
-0.01325225830078125,
-0.019744873046875,
-0.0156707763671875,
0.072998046875,
0.0170745849609375,
-0.0677490234375,
-0.020233154296875,
0.00452423095703125,
0.0159149169921875,
-0.03387451171875,
-0.019195556640625,
0.06036376953125,
-0.01212310791015625,
-0.08331298828125,
0.011627197265625,
-0.0006761550903320312,
0.0112762451171875,
0.00632476806640625,
-0.002346038818359375,
-0.048736572265625,
-0.00420379638671875,
0.0264892578125,
0.0092315673828125,
-0.069091796875,
-0.01549530029296875,
0.00824737548828125,
-0.0228729248046875,
0.0254364013671875,
0.0080718994140625,
-0.0131072998046875,
0.0186309814453125,
0.061676025390625,
0.0201416015625,
0.0194244384765625,
-0.0011987686157226562,
0.0253143310546875,
-0.04962158203125,
0.035369873046875,
0.01174163818359375,
0.042938232421875,
0.018218994140625,
-0.0174560546875,
0.063720703125,
0.0243988037109375,
-0.0189056396484375,
-0.07861328125,
-0.01180267333984375,
-0.09710693359375,
-0.0168609619140625,
0.0789794921875,
-0.02532958984375,
-0.0333251953125,
0.011932373046875,
-0.023101806640625,
0.039703369140625,
-0.026123046875,
0.04425048828125,
0.07342529296875,
0.0289459228515625,
0.0041656494140625,
-0.04443359375,
0.02484130859375,
0.036346435546875,
-0.060882568359375,
-0.0007638931274414062,
0.01291656494140625,
0.01537322998046875,
0.02880859375,
0.048675537109375,
-0.0235748291015625,
0.0167388916015625,
-0.003726959228515625,
0.031585693359375,
-0.0144805908203125,
-0.0001856088638305664,
-0.0186614990234375,
0.005184173583984375,
-0.0130462646484375,
-0.023834228515625
]
] |
Helsinki-NLP/opus-mt-sv-en | 2023-08-16T12:05:00.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"marian",
"text2text-generation",
"translation",
"sv",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-sv-en | 7 | 81,925 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-sv-en
* source languages: sv
* target languages: en
* OPUS readme: [sv-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/sv-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-02-26.zip](https://object.pouta.csc.fi/OPUS-MT-models/sv-en/opus-2020-02-26.zip)
* test set translations: [opus-2020-02-26.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/sv-en/opus-2020-02-26.test.txt)
* test set scores: [opus-2020-02-26.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/sv-en/opus-2020-02-26.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.sv.en | 64.5 | 0.763 |
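The chr-F column above is a character n-gram F-score. A simplified, unigram-only illustration of the idea (the real metric, e.g. as implemented in `sacrebleu`, averages character n-grams up to order 6; beta = 2 weights recall twice as much as precision):

```python
from collections import Counter

def char_fscore(hyp: str, ref: str, beta: float = 2.0) -> float:
    # Simplified chr-F: character *unigram* precision/recall combined
    # as an F_beta score (recall-weighted, as in the real metric).
    h, r = Counter(hyp.replace(" ", "")), Counter(ref.replace(" ", ""))
    overlap = sum((h & r).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(h.values())
    recall = overlap / sum(r.values())
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

print(round(char_fscore("how are you", "how are you doing"), 3))  # -> 0.692
```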
| 818 | [
[
-0.01462554931640625,
-0.024658203125,
0.0186004638671875,
0.032562255859375,
-0.03790283203125,
-0.02825927734375,
-0.0333251953125,
-0.004787445068359375,
0.0007300376892089844,
0.03729248046875,
-0.05401611328125,
-0.046234130859375,
-0.0423583984375,
0.0154876708984375,
-0.00597381591796875,
0.057159423828125,
-0.0121612548828125,
0.04638671875,
0.0098876953125,
-0.034210205078125,
-0.024566650390625,
-0.029815673828125,
-0.0303955078125,
-0.022705078125,
0.02349853515625,
0.0258026123046875,
0.0204925537109375,
0.03570556640625,
0.0689697265625,
0.01678466796875,
-0.005748748779296875,
-0.0007987022399902344,
-0.033966064453125,
0.00023639202117919922,
0.0068817138671875,
-0.045684814453125,
-0.050750732421875,
-0.01201629638671875,
0.07965087890625,
0.0293731689453125,
-0.0002913475036621094,
0.03271484375,
-0.00260162353515625,
0.0701904296875,
-0.02294921875,
0.006923675537109375,
-0.043914794921875,
0.004302978515625,
-0.024688720703125,
-0.019439697265625,
-0.05120849609375,
-0.0201568603515625,
0.01493072509765625,
-0.048583984375,
-0.002506256103515625,
0.008270263671875,
0.1046142578125,
0.0269622802734375,
-0.0248870849609375,
-0.00811004638671875,
-0.046630859375,
0.080078125,
-0.056854248046875,
0.048126220703125,
0.03131103515625,
0.02252197265625,
0.01369476318359375,
-0.0438232421875,
-0.0224151611328125,
0.006221771240234375,
-0.01174163818359375,
0.0151824951171875,
-0.0075531005859375,
-0.0185089111328125,
0.0232391357421875,
0.04974365234375,
-0.05780029296875,
-0.00010269880294799805,
-0.04046630859375,
-0.00237274169921875,
0.048187255859375,
0.00870513916015625,
0.01045989990234375,
-0.01727294921875,
-0.032135009765625,
-0.042022705078125,
-0.0548095703125,
0.01013946533203125,
0.0291595458984375,
0.021209716796875,
-0.033599853515625,
0.055755615234375,
-0.01024627685546875,
0.04571533203125,
-0.00751495361328125,
0.0004780292510986328,
0.07183837890625,
-0.03350830078125,
-0.0284271240234375,
-0.016021728515625,
0.0863037109375,
0.02508544921875,
0.0041656494140625,
0.0021648406982421875,
-0.017822265625,
-0.0171966552734375,
0.01096343994140625,
-0.0660400390625,
-0.007171630859375,
0.01097869873046875,
-0.037628173828125,
-0.00888824462890625,
0.00022351741790771484,
-0.046142578125,
0.016265869140625,
-0.032470703125,
0.0450439453125,
-0.050567626953125,
-0.0142364501953125,
0.0241546630859375,
-0.003528594970703125,
0.0310211181640625,
-0.0015745162963867188,
-0.044342041015625,
0.013824462890625,
0.030487060546875,
0.05328369140625,
-0.032470703125,
-0.0212554931640625,
-0.034912109375,
-0.01528167724609375,
-0.004520416259765625,
0.05078125,
-0.006500244140625,
-0.027801513671875,
-0.00494384765625,
0.035247802734375,
-0.0238189697265625,
-0.0233154296875,
0.0968017578125,
-0.0209197998046875,
0.0511474609375,
-0.0379638671875,
-0.04058837890625,
-0.0218353271484375,
0.0328369140625,
-0.042205810546875,
0.0992431640625,
0.00867462158203125,
-0.06256103515625,
0.0220947265625,
-0.057403564453125,
-0.0159912109375,
0.0016345977783203125,
0.0010814666748046875,
-0.048828125,
0.00411224365234375,
0.0084991455078125,
0.0343017578125,
-0.0236358642578125,
0.0214080810546875,
-0.0022373199462890625,
-0.026092529296875,
0.0034122467041015625,
-0.0258636474609375,
0.07049560546875,
0.018310546875,
-0.0222320556640625,
0.016021728515625,
-0.07049560546875,
-0.003444671630859375,
-0.00043201446533203125,
-0.03961181640625,
-0.020294189453125,
0.0130615234375,
0.0193939208984375,
0.01224517822265625,
0.0262908935546875,
-0.050537109375,
0.0190582275390625,
-0.04840087890625,
0.01186370849609375,
0.050994873046875,
-0.0223236083984375,
0.0278778076171875,
-0.032318115234375,
0.0242919921875,
0.006015777587890625,
0.00888824462890625,
0.0017566680908203125,
-0.0316162109375,
-0.06500244140625,
-0.020904541015625,
0.046142578125,
0.07916259765625,
-0.061737060546875,
0.062744140625,
-0.053619384765625,
-0.060089111328125,
-0.0599365234375,
-0.01386260986328125,
0.037628173828125,
0.0230865478515625,
0.038482666015625,
-0.0106353759765625,
-0.03912353515625,
-0.08148193359375,
-0.0115509033203125,
-0.0107421875,
-0.01922607421875,
0.0160369873046875,
0.048004150390625,
-0.00870513916015625,
0.039398193359375,
-0.04180908203125,
-0.028106689453125,
-0.011199951171875,
0.00868988037109375,
0.032745361328125,
0.0479736328125,
0.041656494140625,
-0.06591796875,
-0.04339599609375,
-0.00455474853515625,
-0.050140380859375,
-0.006320953369140625,
0.00914764404296875,
-0.016448974609375,
0.00930023193359375,
0.004764556884765625,
-0.020660400390625,
0.0030574798583984375,
0.0465087890625,
-0.042327880859375,
0.043487548828125,
-0.01020050048828125,
0.0212554931640625,
-0.104736328125,
0.01113128662109375,
-0.006725311279296875,
-0.0063323974609375,
-0.031494140625,
-0.0037708282470703125,
0.0179901123046875,
0.007625579833984375,
-0.06146240234375,
0.036895751953125,
-0.016357421875,
-0.01169586181640625,
0.0233154296875,
0.00018656253814697266,
0.00287628173828125,
0.0567626953125,
-0.00439453125,
0.060882568359375,
0.05120849609375,
-0.0408935546875,
0.0128021240234375,
0.042877197265625,
-0.03515625,
0.031402587890625,
-0.059234619140625,
-0.0201568603515625,
0.0203857421875,
-0.005458831787109375,
-0.051300048828125,
0.00823211669921875,
0.02435302734375,
-0.051055908203125,
0.027679443359375,
-0.01141357421875,
-0.054351806640625,
-0.0023555755615234375,
-0.025115966796875,
0.031524658203125,
0.0537109375,
-0.01363372802734375,
0.047271728515625,
0.006656646728515625,
-0.005039215087890625,
-0.039947509765625,
-0.07183837890625,
-0.01169586181640625,
-0.024658203125,
-0.05511474609375,
0.0162353515625,
-0.0333251953125,
0.0004856586456298828,
0.0032501220703125,
0.02227783203125,
-0.007144927978515625,
0.0012874603271484375,
0.0033626556396484375,
0.02056884765625,
-0.036285400390625,
0.0119171142578125,
0.0034618377685546875,
-0.01096343994140625,
-0.01189422607421875,
-0.01442718505859375,
0.044097900390625,
-0.0253143310546875,
-0.018463134765625,
-0.044769287109375,
0.00385284423828125,
0.049102783203125,
-0.03045654296875,
0.061126708984375,
0.041900634765625,
-0.009490966796875,
0.01387786865234375,
-0.0301971435546875,
0.00959014892578125,
-0.032012939453125,
0.01227569580078125,
-0.03564453125,
-0.0570068359375,
0.0408935546875,
0.00887298583984375,
0.034942626953125,
0.06414794921875,
0.046600341796875,
0.001117706298828125,
0.05120849609375,
0.0237884521484375,
0.001010894775390625,
0.035797119140625,
-0.0367431640625,
-0.016693115234375,
-0.0830078125,
0.0009946823120117188,
-0.05694580078125,
-0.0211181640625,
-0.062225341796875,
-0.0210113525390625,
0.01531982421875,
-0.0007767677307128906,
-0.017242431640625,
0.054229736328125,
-0.044036865234375,
0.01500701904296875,
0.03936767578125,
-0.01319122314453125,
0.018463134765625,
0.0014247894287109375,
-0.043487548828125,
-0.0262298583984375,
-0.03131103515625,
-0.039703369140625,
0.09454345703125,
0.02899169921875,
0.023040771484375,
0.0159912109375,
0.032379150390625,
0.0004878044128417969,
0.015106201171875,
-0.0479736328125,
0.036346435546875,
-0.028961181640625,
-0.049407958984375,
-0.0264739990234375,
-0.041900634765625,
-0.05877685546875,
0.038787841796875,
-0.024261474609375,
-0.03692626953125,
0.0110626220703125,
-0.0017671585083007812,
-0.0019626617431640625,
0.028961181640625,
-0.048614501953125,
0.08251953125,
-0.0028228759765625,
-0.0015096664428710938,
0.0216522216796875,
-0.0357666015625,
0.0184173583984375,
-0.0005292892456054688,
0.0160064697265625,
-0.0163116455078125,
0.010498046875,
0.05023193359375,
-0.004917144775390625,
0.0335693359375,
-0.00531768798828125,
-0.00223541259765625,
-0.002605438232421875,
0.0013952255249023438,
0.033477783203125,
-0.0114288330078125,
-0.03021240234375,
0.027801513671875,
0.00299072265625,
-0.03131103515625,
-0.012420654296875,
0.0389404296875,
-0.05535888671875,
0.004100799560546875,
-0.03216552734375,
-0.04473876953125,
0.005229949951171875,
0.0273284912109375,
0.054901123046875,
0.050933837890625,
-0.023040771484375,
0.042816162109375,
0.0638427734375,
-0.0220489501953125,
0.0264434814453125,
0.05841064453125,
-0.009613037109375,
-0.0435791015625,
0.0672607421875,
0.01351165771484375,
0.0256500244140625,
0.04559326171875,
0.006587982177734375,
-0.01097869873046875,
-0.058746337890625,
-0.05584716796875,
0.020233154296875,
-0.0225830078125,
-0.011383056640625,
-0.04669189453125,
-0.0026454925537109375,
-0.022796630859375,
0.0168609619140625,
-0.04119873046875,
-0.039703369140625,
-0.01493072509765625,
-0.013092041015625,
0.016937255859375,
0.017669677734375,
0.003871917724609375,
0.029510498046875,
-0.076416015625,
0.0144500732421875,
-0.0101470947265625,
0.031585693359375,
-0.034515380859375,
-0.0560302734375,
-0.03338623046875,
0.002559661865234375,
-0.04205322265625,
-0.048583984375,
0.03485107421875,
0.0039825439453125,
0.0229949951171875,
0.019775390625,
0.01468658447265625,
0.024078369140625,
-0.056243896484375,
0.07672119140625,
-0.005340576171875,
-0.05487060546875,
0.034515380859375,
-0.0333251953125,
0.03375244140625,
0.067626953125,
0.0200958251953125,
-0.028961181640625,
-0.042144775390625,
-0.050201416015625,
-0.061004638671875,
0.061126708984375,
0.05328369140625,
-0.01004791259765625,
0.0175933837890625,
-0.004901885986328125,
0.0005230903625488281,
0.010009765625,
-0.081787109375,
-0.0276947021484375,
0.00177001953125,
-0.02789306640625,
-0.0142059326171875,
-0.020599365234375,
-0.01123046875,
-0.0151824951171875,
0.08331298828125,
0.01178741455078125,
0.0179901123046875,
0.0301055908203125,
-0.01027679443359375,
-0.014404296875,
0.02496337890625,
0.07769775390625,
0.034912109375,
-0.03924560546875,
-0.0164794921875,
0.0271759033203125,
-0.0360107421875,
-0.00875091552734375,
0.0098724365234375,
-0.031280517578125,
0.0215606689453125,
0.03271484375,
0.08154296875,
0.0157928466796875,
-0.046875,
0.035980224609375,
-0.03271484375,
-0.03271484375,
-0.052001953125,
-0.0121612548828125,
0.007904052734375,
0.0027637481689453125,
0.0189666748046875,
0.00893402099609375,
0.01235198974609375,
-0.007717132568359375,
0.01253509521484375,
0.004608154296875,
-0.05279541015625,
-0.03961181640625,
0.032806396484375,
0.0093841552734375,
-0.0267181396484375,
0.036163330078125,
-0.027679443359375,
-0.0419921875,
0.0290985107421875,
0.0108489990234375,
0.07940673828125,
-0.0167999267578125,
-0.01434326171875,
0.050994873046875,
0.0457763671875,
-0.0191192626953125,
0.037384033203125,
0.00881195068359375,
-0.052581787109375,
-0.044525146484375,
-0.06396484375,
-0.01284027099609375,
0.01215362548828125,
-0.064208984375,
0.029571533203125,
0.024627685546875,
0.0034656524658203125,
-0.0194854736328125,
0.0170745849609375,
-0.037628173828125,
0.00865936279296875,
-0.0241241455078125,
0.08087158203125,
-0.0745849609375,
0.06536865234375,
0.041259765625,
-0.0184173583984375,
-0.060394287109375,
-0.0133209228515625,
-0.0207977294921875,
-0.0284576416015625,
0.045501708984375,
0.01113128662109375,
0.0209197998046875,
-0.00995635986328125,
-0.01715087890625,
-0.060089111328125,
0.085205078125,
0.018218994140625,
-0.041259765625,
0.002300262451171875,
0.00716400146484375,
0.038909912109375,
-0.02618408203125,
0.00011450052261352539,
0.0310211181640625,
0.054107666015625,
0.01053619384765625,
-0.07830810546875,
-0.0219268798828125,
-0.04095458984375,
-0.02001953125,
0.042999267578125,
-0.041259765625,
0.0694580078125,
0.03533935546875,
-0.01113128662109375,
0.007175445556640625,
0.048736572265625,
0.023712158203125,
0.0196533203125,
0.0389404296875,
0.08331298828125,
0.0262451171875,
-0.035552978515625,
0.0758056640625,
-0.0197601318359375,
0.03570556640625,
0.08734130859375,
-0.004962921142578125,
0.068603515625,
0.0226898193359375,
-0.0084991455078125,
0.037750244140625,
0.047210693359375,
-0.0274810791015625,
0.038909912109375,
0.00652313232421875,
0.0117340087890625,
-0.01000213623046875,
0.0167999267578125,
-0.048919677734375,
0.0211181640625,
0.01241302490234375,
-0.0166168212890625,
0.00641632080078125,
-0.00005334615707397461,
0.002925872802734375,
-0.0045928955078125,
-0.0078887939453125,
0.047454833984375,
-0.003269195556640625,
-0.046417236328125,
0.049835205078125,
-0.002655029296875,
0.049896240234375,
-0.052001953125,
0.0081939697265625,
0.0018205642700195312,
0.0170440673828125,
0.005123138427734375,
-0.04376220703125,
0.033538818359375,
0.0016031265258789062,
-0.0242919921875,
-0.029571533203125,
0.0196533203125,
-0.04290771484375,
-0.06341552734375,
0.0287017822265625,
0.02825927734375,
0.0299072265625,
0.0084075927734375,
-0.06561279296875,
0.00281524658203125,
0.0111541748046875,
-0.045989990234375,
0.006137847900390625,
0.048828125,
0.021453857421875,
0.03765869140625,
0.049163818359375,
0.0172119140625,
0.014007568359375,
-0.0024566650390625,
0.05316162109375,
-0.032012939453125,
-0.03515625,
-0.06060791015625,
0.06195068359375,
-0.009613037109375,
-0.04949951171875,
0.054351806640625,
0.076904296875,
0.07464599609375,
-0.01300048828125,
0.0259552001953125,
-0.0010242462158203125,
0.053192138671875,
-0.049041748046875,
0.04840087890625,
-0.067626953125,
0.01422882080078125,
-0.0144805908203125,
-0.06756591796875,
-0.0176239013671875,
0.024200439453125,
-0.01312255859375,
-0.0285797119140625,
0.05780029296875,
0.04400634765625,
-0.009857177734375,
-0.016204833984375,
0.0211181640625,
0.0219573974609375,
0.0156097412109375,
0.039703369140625,
0.02716064453125,
-0.07098388671875,
0.04437255859375,
-0.0180206298828125,
-0.003437042236328125,
-0.0029735565185546875,
-0.05621337890625,
-0.0643310546875,
-0.049957275390625,
-0.011810302734375,
-0.017974853515625,
-0.0212860107421875,
0.06500244140625,
0.037200927734375,
-0.064453125,
-0.03662109375,
0.005146026611328125,
0.01351165771484375,
-0.0142059326171875,
-0.0177154541015625,
0.047576904296875,
-0.02142333984375,
-0.0728759765625,
0.03485107421875,
0.0072021484375,
-0.00004392862319946289,
-0.00577545166015625,
-0.0230865478515625,
-0.034393310546875,
-0.0020275115966796875,
0.022369384765625,
-0.0001289844512939453,
-0.038482666015625,
0.0047454833984375,
0.0140533447265625,
-0.01104736328125,
0.026519775390625,
0.0295867919921875,
-0.0188140869140625,
0.01849365234375,
0.06561279296875,
0.0268096923828125,
0.036224365234375,
-0.01293182373046875,
0.0428466796875,
-0.055572509765625,
0.0213165283203125,
0.016845703125,
0.0445556640625,
0.0272064208984375,
-0.008453369140625,
0.05511474609375,
0.0169830322265625,
-0.048095703125,
-0.08087158203125,
0.007755279541015625,
-0.09039306640625,
0.0012845993041992188,
0.0682373046875,
-0.0254058837890625,
-0.02252197265625,
0.0201873779296875,
-0.010101318359375,
0.0158538818359375,
-0.0252838134765625,
0.025604248046875,
0.06884765625,
0.0298614501953125,
0.01061248779296875,
-0.052978515625,
0.0309295654296875,
0.03680419921875,
-0.053253173828125,
-0.01163482666015625,
0.0140838623046875,
0.0164031982421875,
0.029205322265625,
0.039825439453125,
-0.0224761962890625,
0.0081787109375,
-0.025421142578125,
0.0347900390625,
-0.004913330078125,
-0.0160369873046875,
-0.0272064208984375,
-0.0030765533447265625,
-0.009246826171875,
-0.02496337890625
]
] |
OpenAssistant/reward-model-deberta-v3-large-v2 | 2023-02-01T00:55:05.000Z | [
"transformers",
"pytorch",
"deberta-v2",
"text-classification",
"reward-model",
"reward_model",
"RLHF",
"en",
"dataset:openai/summarize_from_feedback",
"dataset:openai/webgpt_comparisons",
"dataset:Dahoas/instruct-synthetic-prompt-responses",
"dataset:Anthropic/hh-rlhf",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | OpenAssistant | null | null | OpenAssistant/reward-model-deberta-v3-large-v2 | 116 | 80,030 | transformers | 2023-02-01T00:13:05 | ---
license: mit
datasets:
- openai/summarize_from_feedback
- openai/webgpt_comparisons
- Dahoas/instruct-synthetic-prompt-responses
- Anthropic/hh-rlhf
language:
- en
metrics:
- accuracy
tags:
- reward-model
- reward_model
- RLHF
---
# Reward model trained from human feedback
A reward model (RM) is trained to predict which of two generated answers a human would judge as better, given a question.

Reward models are useful in several domains:
- QA model evaluation
- serving as the reward score in RLHF
- detecting potentially toxic responses via ranking

All models were trained on the following datasets, with the same split seed across datasets (where a validation split was not available):
- [webgpt_comparisons](https://huggingface.co/datasets/openai/webgpt_comparisons)
- [summarize_from_feedback](https://huggingface.co/datasets/openai/summarize_from_feedback)
- [synthetic-instruct-gptj-pairwise](https://huggingface.co/datasets/Dahoas/synthetic-instruct-gptj-pairwise)
- [anthropic_hh-rlhf](https://huggingface.co/datasets/Anthropic/hh-rlhf)
# How to use
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
reward_name = "OpenAssistant/reward-model-deberta-v3-large-v2"
rank_model, tokenizer = AutoModelForSequenceClassification.from_pretrained(reward_name), AutoTokenizer.from_pretrained(reward_name)
question, answer = "Explain nuclear fusion like I am five", "Nuclear fusion is the process by which two or more protons and neutrons combine to form a single nucleus. It is a very important process in the universe, as it is the source of energy for stars and galaxies. Nuclear fusion is also a key process in the production of energy for nuclear power plants."
inputs = tokenizer(question, answer, return_tensors='pt')
score = rank_model(**inputs).logits[0].cpu().detach()
print(score)
```
**Toxic response detection**
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
reward_name = "OpenAssistant/reward-model-deberta-v3-large-v2"
rank_model, tokenizer = AutoModelForSequenceClassification.from_pretrained(reward_name), AutoTokenizer.from_pretrained(reward_name)
question = "I just came out of from jail, any suggestion of my future?"
helpful = "It's great to hear that you have been released from jail."
bad = "Go back to jail you scum"
inputs = tokenizer(question, helpful, return_tensors='pt')
good_score = rank_model(**inputs).logits[0].cpu().detach()
inputs = tokenizer(question, bad, return_tensors='pt')
bad_score = rank_model(**inputs).logits[0].cpu().detach()
print(good_score > bad_score) # tensor([True])
```
# Performance
Validation split accuracy
| Model | [WebGPT](https://huggingface.co/datasets/openai/webgpt_comparisons) | [Summary](https://huggingface.co/datasets/openai/summarize_from_feedback) | [SyntheticGPT](https://huggingface.co/datasets/Dahoas/synthetic-instruct-gptj-pairwise) | [Anthropic RLHF](https://huggingface.co/datasets/Anthropic/hh-rlhf) |
|---|---|---|---|---|
| [electra-large-discriminator](https://huggingface.co/OpenAssistant/reward-model-electra-large-discriminator) | 59.30 | 68.66 | 99.85 | 54.33 |
| **[deberta-v3-large-v2](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2)** | **61.57** | 71.47 | 99.88 | **69.25** |
| [deberta-v3-large](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large) | 61.13 | 72.23 | **99.94** | 55.62 |
| [deberta-v3-base](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-base) | 59.07 | 66.84 | 99.85 | 54.51 |
| deberta-v2-xxlarge | 58.67 | **73.27** | 99.77 | 66.74 |
It is likely that SyntheticGPT contains some kind of surface pattern in the chosen-rejected pairs that makes it trivial to tell the better answer apart.
# Other
Sincere thanks to [stability.ai](https://stability.ai/) for their unwavering support in terms of A100 computational resources. Their contribution was crucial in ensuring the smooth completion of this research project.
| 3,875 | [
[
-0.039215087890625,
-0.043914794921875,
0.0168304443359375,
0.003025054931640625,
-0.006683349609375,
-0.002994537353515625,
0.005741119384765625,
-0.0277252197265625,
0.01198577880859375,
0.00994873046875,
-0.045654296875,
-0.025634765625,
-0.047607421875,
0.00524139404296875,
-0.00662994384765625,
0.06884765625,
-0.004978179931640625,
0.01739501953125,
-0.00859832763671875,
-0.023162841796875,
-0.0352783203125,
-0.050811767578125,
-0.045440673828125,
-0.045501708984375,
0.038177490234375,
0.017425537109375,
0.0297088623046875,
0.0352783203125,
0.048828125,
0.0301971435546875,
-0.0149078369140625,
-0.00862884521484375,
-0.0310516357421875,
-0.005275726318359375,
-0.00572967529296875,
-0.033843994140625,
-0.0316162109375,
0.020172119140625,
0.022308349609375,
0.0198822021484375,
-0.0037326812744140625,
0.029388427734375,
-0.00347137451171875,
0.05316162109375,
-0.04876708984375,
0.0169219970703125,
-0.018310546875,
0.010284423828125,
0.0034847259521484375,
0.00379180908203125,
-0.019775390625,
-0.032989501953125,
-0.0015859603881835938,
-0.041778564453125,
0.0008149147033691406,
-0.0093536376953125,
0.08660888671875,
0.041351318359375,
-0.0082550048828125,
-0.00701141357421875,
-0.046234130859375,
0.06378173828125,
-0.059478759765625,
0.0087738037109375,
0.03265380859375,
0.024078369140625,
-0.006946563720703125,
-0.052154541015625,
-0.04437255859375,
-0.017364501953125,
-0.0227813720703125,
0.0230712890625,
-0.01432037353515625,
-0.004978179931640625,
0.029388427734375,
0.038360595703125,
-0.051727294921875,
0.013580322265625,
-0.0253143310546875,
-0.00960540771484375,
0.05218505859375,
0.0234222412109375,
0.0191192626953125,
-0.03521728515625,
-0.0306549072265625,
-0.050445556640625,
-0.042236328125,
0.028411865234375,
0.0257568359375,
0.01461029052734375,
-0.0255889892578125,
0.037994384765625,
-0.025909423828125,
0.052764892578125,
0.007549285888671875,
0.003635406494140625,
0.040802001953125,
-0.01436614990234375,
-0.014373779296875,
-0.00948333740234375,
0.07672119140625,
0.019439697265625,
-0.006084442138671875,
0.0097198486328125,
-0.0191192626953125,
0.0132904052734375,
-0.0029048919677734375,
-0.07159423828125,
-0.0291748046875,
0.0294036865234375,
-0.047637939453125,
-0.031951904296875,
0.0225677490234375,
-0.05877685546875,
-0.01142120361328125,
-0.013397216796875,
0.051483154296875,
-0.0219268798828125,
-0.042388916015625,
0.00893402099609375,
-0.0257720947265625,
0.0251007080078125,
0.0038623809814453125,
-0.058746337890625,
0.0164337158203125,
0.03289794921875,
0.06353759765625,
-0.0147247314453125,
-0.0194091796875,
-0.028472900390625,
-0.018341064453125,
0.0028057098388671875,
0.043670654296875,
-0.0203857421875,
-0.009765625,
-0.0291748046875,
0.010528564453125,
-0.027130126953125,
-0.044677734375,
0.0428466796875,
-0.0288543701171875,
0.02484130859375,
-0.022247314453125,
-0.029541015625,
-0.0261077880859375,
0.028106689453125,
-0.05133056640625,
0.08154296875,
0.023529052734375,
-0.0709228515625,
0.0211944580078125,
-0.0772705078125,
0.0002409219741821289,
-0.0037326812744140625,
0.01456451416015625,
-0.043548583984375,
-0.018768310546875,
0.0206756591796875,
0.0240936279296875,
-0.01079559326171875,
0.01715087890625,
-0.031890869140625,
-0.0261688232421875,
0.0246124267578125,
-0.01251983642578125,
0.0732421875,
0.0257568359375,
-0.03948974609375,
0.025238037109375,
-0.05615234375,
0.007373809814453125,
0.0369873046875,
-0.0192108154296875,
-0.017120361328125,
-0.024932861328125,
0.01392364501953125,
0.041778564453125,
0.00458526611328125,
-0.0421142578125,
0.01500701904296875,
-0.0196380615234375,
0.053802490234375,
0.06292724609375,
0.0017242431640625,
0.00797271728515625,
-0.032867431640625,
0.040679931640625,
0.0080718994140625,
0.0240631103515625,
0.01529693603515625,
-0.035797119140625,
-0.05255126953125,
-0.0240936279296875,
0.048095703125,
0.052337646484375,
-0.0428466796875,
0.0526123046875,
-0.001461029052734375,
-0.055267333984375,
-0.036041259765625,
-0.0199432373046875,
0.037994384765625,
0.03173828125,
0.034423828125,
-0.0216827392578125,
-0.04498291015625,
-0.06658935546875,
0.0007801055908203125,
-0.018798828125,
0.001590728759765625,
0.025665283203125,
0.066650390625,
-0.008056640625,
0.05450439453125,
-0.0472412109375,
-0.0124664306640625,
-0.01425933837890625,
0.023223876953125,
0.034942626953125,
0.059173583984375,
0.043212890625,
-0.068115234375,
-0.0209503173828125,
-0.029754638671875,
-0.06378173828125,
0.015838623046875,
0.01561737060546875,
-0.021820068359375,
0.00809478759765625,
0.01499176025390625,
-0.058013916015625,
0.0278778076171875,
0.023223876953125,
-0.040130615234375,
0.029632568359375,
-0.01593017578125,
0.027374267578125,
-0.07598876953125,
0.038360595703125,
-0.001003265380859375,
-0.005741119384765625,
-0.033660888671875,
0.01107025146484375,
-0.00913238525390625,
0.007411956787109375,
-0.0452880859375,
0.054931640625,
-0.01387786865234375,
0.00496673583984375,
-0.004425048828125,
-0.0022335052490234375,
-0.0164337158203125,
0.061859130859375,
0.0026378631591796875,
0.042236328125,
0.02581787109375,
-0.046539306640625,
0.040130615234375,
0.04498291015625,
-0.0161590576171875,
0.0206298828125,
-0.0650634765625,
0.0059051513671875,
-0.007167816162109375,
0.026123046875,
-0.07940673828125,
-0.031280517578125,
0.041412353515625,
-0.050994873046875,
0.01503753662109375,
-0.0038909912109375,
-0.030120849609375,
-0.057952880859375,
-0.03289794921875,
0.04278564453125,
0.044708251953125,
-0.048248291015625,
0.0195159912109375,
0.034881591796875,
0.00379180908203125,
-0.05694580078125,
-0.04248046875,
-0.0305328369140625,
-0.0215911865234375,
-0.0367431640625,
0.02679443359375,
-0.01181793212890625,
-0.006855010986328125,
0.0081329345703125,
0.01029205322265625,
0.0080718994140625,
0.0160980224609375,
0.027252197265625,
0.033599853515625,
-0.0010805130004882812,
-0.0021228790283203125,
-0.0208282470703125,
-0.0221405029296875,
0.01143646240234375,
0.006908416748046875,
0.058807373046875,
-0.0177764892578125,
-0.019622802734375,
-0.040435791015625,
0.0005965232849121094,
0.042938232421875,
-0.0199432373046875,
0.052490234375,
0.05499267578125,
-0.03082275390625,
0.01526641845703125,
-0.0299072265625,
-0.031402587890625,
-0.033599853515625,
0.025360107421875,
-0.0215911865234375,
-0.058624267578125,
0.0535888671875,
0.01873779296875,
-0.00908660888671875,
0.07147216796875,
0.06060791015625,
0.028594970703125,
0.09521484375,
0.021331787109375,
-0.01238250732421875,
0.037841796875,
-0.05126953125,
-0.0029144287109375,
-0.065185546875,
-0.031890869140625,
-0.034423828125,
-0.03271484375,
-0.053070068359375,
-0.01122283935546875,
0.027618408203125,
0.0117645263671875,
-0.05133056640625,
0.0255889892578125,
-0.06610107421875,
0.01204681396484375,
0.0640869140625,
0.0223388671875,
-0.006500244140625,
-0.01523590087890625,
-0.01425933837890625,
0.0101318359375,
-0.048187255859375,
-0.01371002197265625,
0.08001708984375,
0.038421630859375,
0.046234130859375,
-0.00003421306610107422,
0.051544189453125,
0.0100860595703125,
0.02337646484375,
-0.044677734375,
0.036376953125,
0.0048828125,
-0.05169677734375,
-0.0227508544921875,
-0.042388916015625,
-0.06787109375,
0.026275634765625,
-0.0231170654296875,
-0.049652099609375,
0.0172271728515625,
0.005123138427734375,
-0.047576904296875,
0.0243377685546875,
-0.04034423828125,
0.08477783203125,
-0.016387939453125,
-0.035736083984375,
-0.00994873046875,
-0.051483154296875,
0.029815673828125,
0.0174560546875,
-0.0011501312255859375,
-0.01184844970703125,
0.0313720703125,
0.0811767578125,
-0.042266845703125,
0.045684814453125,
-0.0374755859375,
0.0230712890625,
0.0286407470703125,
-0.01343536376953125,
0.02850341796875,
0.00966644287109375,
-0.0097198486328125,
0.00799560546875,
0.0082244873046875,
-0.0203857421875,
-0.0264892578125,
0.051513671875,
-0.07763671875,
-0.033477783203125,
-0.058380126953125,
-0.033599853515625,
0.001331329345703125,
0.0167083740234375,
0.03717041015625,
0.02264404296875,
-0.005977630615234375,
0.00921630859375,
0.045501708984375,
-0.029571533203125,
0.03741455078125,
0.0165863037109375,
-0.01155853271484375,
-0.037872314453125,
0.0650634765625,
-0.015838623046875,
0.0032138824462890625,
0.024383544921875,
0.0200347900390625,
-0.03656005859375,
-0.0241546630859375,
-0.0465087890625,
0.0250396728515625,
-0.045867919921875,
-0.03997802734375,
-0.054840087890625,
-0.033782958984375,
-0.032501220703125,
0.021820068359375,
-0.0258941650390625,
-0.048248291015625,
-0.0169677734375,
-0.0203857421875,
0.036163330078125,
0.05487060546875,
-0.03411865234375,
-0.0021514892578125,
-0.059356689453125,
0.02496337890625,
0.024261474609375,
0.0278167724609375,
0.00220489501953125,
-0.041412353515625,
-0.00275421142578125,
0.0107574462890625,
-0.036102294921875,
-0.06201171875,
0.039886474609375,
-0.0157470703125,
0.0548095703125,
0.0034275054931640625,
-0.0025157928466796875,
0.033935546875,
0.0024433135986328125,
0.06964111328125,
0.018798828125,
-0.051605224609375,
0.042633056640625,
-0.0244293212890625,
0.01611328125,
0.035980224609375,
0.0321044921875,
-0.030975341796875,
-0.01438140869140625,
-0.049530029296875,
-0.0732421875,
0.0789794921875,
0.022216796875,
-0.0202484130859375,
0.003864288330078125,
0.0302734375,
-0.0038928985595703125,
0.00894927978515625,
-0.0772705078125,
-0.038543701171875,
-0.032318115234375,
-0.0208740234375,
0.0029087066650390625,
-0.01088714599609375,
-0.0133056640625,
-0.043853759765625,
0.07098388671875,
0.006061553955078125,
0.03753662109375,
0.01392364501953125,
-0.002162933349609375,
-0.01557159423828125,
0.004222869873046875,
0.041961669921875,
0.034149169921875,
-0.03314208984375,
-0.0191802978515625,
0.00554656982421875,
-0.03448486328125,
0.023712158203125,
0.0255889892578125,
-0.010162353515625,
-0.0259552001953125,
0.01540374755859375,
0.06622314453125,
-0.005374908447265625,
-0.035400390625,
0.03765869140625,
-0.01459503173828125,
-0.0281524658203125,
-0.027252197265625,
0.0177764892578125,
-0.0012826919555664062,
0.019622802734375,
0.018218994140625,
0.0019044876098632812,
0.01557159423828125,
-0.0258636474609375,
0.0177001953125,
0.033599853515625,
-0.03155517578125,
-0.01512908935546875,
0.06561279296875,
0.00513458251953125,
-0.0093841552734375,
0.06231689453125,
-0.0244140625,
-0.054656982421875,
0.06121826171875,
0.0288543701171875,
0.06768798828125,
-0.0262603759765625,
0.026031494140625,
0.07537841796875,
-0.00830078125,
-0.023162841796875,
0.01806640625,
0.01422119140625,
-0.040679931640625,
-0.0159149169921875,
-0.056304931640625,
-0.01288604736328125,
0.026336669921875,
-0.056121826171875,
0.01448822021484375,
-0.0263519287109375,
-0.015838623046875,
-0.00267791748046875,
-0.00023317337036132812,
-0.046142578125,
0.0143280029296875,
0.0037136077880859375,
0.053802490234375,
-0.07373046875,
0.040802001953125,
0.061859130859375,
-0.05047607421875,
-0.07293701171875,
-0.03125,
-0.006832122802734375,
-0.053070068359375,
0.06683349609375,
0.01210784912109375,
0.016357421875,
-0.008392333984375,
-0.0399169921875,
-0.078125,
0.0972900390625,
0.008087158203125,
-0.0157928466796875,
-0.004161834716796875,
0.00836944580078125,
0.03948974609375,
-0.009674072265625,
0.04937744140625,
0.042816162109375,
0.045684814453125,
0.018524169921875,
-0.062286376953125,
0.026214599609375,
-0.03289794921875,
-0.0106964111328125,
0.0246124267578125,
-0.0684814453125,
0.093505859375,
-0.005084991455078125,
0.0103607177734375,
0.01277923583984375,
0.0297698974609375,
0.04351806640625,
0.007720947265625,
0.047760009765625,
0.06414794921875,
0.038665771484375,
-0.0227508544921875,
0.07208251953125,
-0.0133209228515625,
0.056488037109375,
0.03802490234375,
0.01251220703125,
0.058624267578125,
0.0198211669921875,
-0.0220947265625,
0.047698974609375,
0.0736083984375,
-0.0215911865234375,
0.05413818359375,
0.03411865234375,
0.0034332275390625,
-0.0189971923828125,
0.01477813720703125,
-0.04254150390625,
0.03759765625,
0.0080108642578125,
-0.0217437744140625,
-0.00872802734375,
0.00664520263671875,
-0.0136871337890625,
-0.01409912109375,
-0.022674560546875,
0.048126220703125,
-0.0014963150024414062,
-0.065185546875,
0.0625,
0.00659942626953125,
0.056793212890625,
-0.03509521484375,
-0.0265350341796875,
-0.01015472412109375,
0.0367431640625,
-0.0164031982421875,
-0.0440673828125,
0.001331329345703125,
-0.013763427734375,
-0.0093536376953125,
0.00479888916015625,
0.052978515625,
-0.015167236328125,
-0.0310516357421875,
0.0236968994140625,
0.0229339599609375,
0.0043182373046875,
-0.00043511390686035156,
-0.0885009765625,
-0.0008978843688964844,
-0.0043487548828125,
-0.048797607421875,
0.0269317626953125,
0.000980377197265625,
0.007640838623046875,
0.05029296875,
0.04730224609375,
-0.015777587890625,
-0.0142974853515625,
-0.01297760009765625,
0.06561279296875,
-0.058258056640625,
-0.03668212890625,
-0.06610107421875,
0.03948974609375,
-0.005176544189453125,
-0.036865234375,
0.04541015625,
0.048248291015625,
0.059326171875,
-0.01210784912109375,
0.050262451171875,
-0.02239990234375,
0.04449462890625,
-0.04156494140625,
0.0628662109375,
-0.04510498046875,
-0.0014333724975585938,
-0.0261688232421875,
-0.060394287109375,
-0.01464080810546875,
0.060302734375,
-0.0229339599609375,
0.030792236328125,
0.039642333984375,
0.05255126953125,
-0.005466461181640625,
0.00009679794311523438,
-0.0039043426513671875,
0.01204681396484375,
0.03033447265625,
0.06390380859375,
0.061767578125,
-0.047637939453125,
0.0268096923828125,
-0.037841796875,
-0.024932861328125,
-0.0289459228515625,
-0.048675537109375,
-0.061248779296875,
-0.046539306640625,
-0.0275115966796875,
-0.04986572265625,
-0.014678955078125,
0.0789794921875,
0.05609130859375,
-0.0732421875,
-0.01422119140625,
-0.00968170166015625,
0.006008148193359375,
-0.033172607421875,
-0.0214080810546875,
0.04815673828125,
-0.002429962158203125,
-0.05474853515625,
-0.0017213821411132812,
-0.006320953369140625,
0.01055145263671875,
-0.0184173583984375,
-0.0207672119140625,
-0.026123046875,
-0.016082763671875,
0.03411865234375,
0.03668212890625,
-0.04248046875,
-0.02264404296875,
0.006748199462890625,
-0.012969970703125,
0.00992584228515625,
0.040313720703125,
-0.062286376953125,
0.00643157958984375,
0.0465087890625,
0.028289794921875,
0.05712890625,
0.00846099853515625,
0.0291290283203125,
-0.0213775634765625,
0.0191650390625,
0.0215606689453125,
0.033447265625,
0.0281982421875,
-0.030517578125,
0.06024169921875,
0.020263671875,
-0.056488037109375,
-0.06671142578125,
0.004413604736328125,
-0.07305908203125,
-0.023468017578125,
0.0902099609375,
-0.01251983642578125,
-0.029754638671875,
0.014068603515625,
-0.0126190185546875,
0.029571533203125,
-0.0214080810546875,
0.059326171875,
0.047760009765625,
-0.0264739990234375,
-0.0025539398193359375,
-0.0308685302734375,
0.04656982421875,
0.035400390625,
-0.065185546875,
-0.005706787109375,
0.0238800048828125,
0.0299835205078125,
0.02459716796875,
0.0452880859375,
-0.0278778076171875,
0.0266571044921875,
-0.00528717041015625,
0.006744384765625,
-0.02716064453125,
-0.0005621910095214844,
-0.0292816162109375,
0.005645751953125,
-0.0004286766052246094,
-0.036102294921875
]
] |
diffusers/stable-diffusion-xl-1.0-inpainting-0.1 | 2023-09-03T16:36:39.000Z | [
"diffusers",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"inpainting",
"arxiv:2112.10752",
"license:openrail++",
"has_space",
"diffusers:StableDiffusionXLInpaintPipeline",
"region:us"
] | text-to-image | diffusers | null | null | diffusers/stable-diffusion-xl-1.0-inpainting-0.1 | 142 | 79,822 | diffusers | 2023-09-01T14:07:10 |
---
license: openrail++
base_model: stabilityai/stable-diffusion-xl-base-1.0
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- inpainting
inference: false
---
# SD-XL Inpainting 0.1 Model Card

SD-XL Inpainting 0.1 is a latent text-to-image diffusion model that generates photo-realistic images from any text input, with the added capability of inpainting images using a mask.
SD-XL Inpainting 0.1 was initialized with the `stable-diffusion-xl-base-1.0` weights. The model was trained for 40k steps at resolution 1024x1024, dropping the text-conditioning 5% of the time to improve classifier-free guidance sampling. For inpainting, the UNet has 5 additional input channels (4 for the encoded masked image and 1 for the mask itself) whose weights were zero-initialized after restoring the non-inpainting checkpoint. During training, we generate synthetic masks and, in 25% of cases, mask the entire image.
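To make the channel layout concrete, the 9-channel UNet input described above can be sketched as a simple concatenation. This is an illustration with toy shapes and our own tensor names, not the actual diffusers implementation:

```python
import numpy as np

# Toy latent resolution (the real model uses 1024/8 = 128; 4 here for brevity)
h = w = 4

noisy_latents = np.random.randn(4, h, w)                  # 4 channels: current noisy latents
masked_image_latents = np.random.randn(4, h, w)           # 4 channels: VAE-encoded masked image
mask = np.random.randint(0, 2, (1, h, w)).astype(float)   # 1 channel: binary inpainting mask

# The inpainting UNet consumes the concatenation of all three along the channel axis
unet_input = np.concatenate([noisy_latents, mask, masked_image_latents], axis=0)
print(unet_input.shape)  # (9, 4, 4)
```

The five extra channels (mask plus masked-image latents) are exactly the ones whose weights were zero-initialized, so at the start of fine-tuning the model behaves like the base checkpoint.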
## How to use
```py
from diffusers import AutoPipelineForInpainting
from diffusers.utils import load_image
import torch
pipe = AutoPipelineForInpainting.from_pretrained("diffusers/stable-diffusion-xl-1.0-inpainting-0.1", torch_dtype=torch.float16, variant="fp16").to("cuda")
img_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png"
mask_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png"
image = load_image(img_url).resize((1024, 1024))
mask_image = load_image(mask_url).resize((1024, 1024))
prompt = "a tiger sitting on a park bench"
generator = torch.Generator(device="cuda").manual_seed(0)
image = pipe(
prompt=prompt,
image=image,
mask_image=mask_image,
guidance_scale=8.0,
num_inference_steps=20, # steps between 15 and 30 work well for us
strength=0.99, # make sure to use `strength` below 1.0
generator=generator,
).images[0]
```
**How it works:**
`image` | `mask_image`
:-------------------------:|:-------------------------:|
<img src="https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" alt="drawing" width="300"/> | <img src="https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" alt="drawing" width="300"/>
`prompt` | `Output`
:-------------------------:|:-------------------------:|
<span style="position: relative;bottom: 150px;">a tiger sitting on a park bench</span> | <img src="https://huggingface.co/datasets/valhalla/images/resolve/main/tiger.png" alt="drawing" width="300"/>
## Model Description
- **Developed by:** The Diffusers team
- **Model type:** Diffusion-based text-to-image generative model
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0/blob/main/LICENSE.md)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses two fixed, pretrained text encoders ([OpenCLIP-ViT/G](https://github.com/mlfoundations/open_clip) and [CLIP-ViT/L](https://github.com/openai/CLIP/tree/main)).
## Uses
### Direct Use
The model is intended for research purposes only. Possible research areas and tasks include
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
Excluded uses are described below.
### Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism
- The model cannot render legible text
- The model struggles with more difficult tasks which involve compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere”
- Faces and people in general may not be generated properly.
- The autoencoding part of the model is lossy.
- When the strength parameter is set to 1 (i.e. starting in-painting from a fully masked image), the quality of the image is degraded. The model retains the non-masked contents of the image, but images look less sharp. We're investigating this and working on the next version.
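For intuition on the `strength` limitation above: in diffusers img2img/inpaint pipelines, `strength` controls how far into the noise schedule the latents are pushed, which in turn determines how many of the `num_inference_steps` are actually executed. A minimal sketch of that relationship (our own simplification, not the library's exact code):

```python
def effective_steps(num_inference_steps: int, strength: float) -> int:
    """Approximate number of denoising steps actually executed.

    strength near 1.0 re-noises almost completely (most steps run);
    strength near 0.0 keeps the input mostly intact (few steps run).
    """
    init_timestep = min(int(num_inference_steps * strength), num_inference_steps)
    return init_timestep

# With the card's recommended settings:
print(effective_steps(20, 0.99))  # 19
print(effective_steps(20, 1.0))   # 20 -> denoising starts from pure noise
```

At `strength=1.0` the original image content is fully replaced by noise, which is why the card recommends keeping `strength` just below 1.0.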
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
| 4,857 | [
[
-0.029144287109375,
-0.054534912109375,
0.038604736328125,
0.021392822265625,
-0.018829345703125,
-0.00264739990234375,
0.0075225830078125,
-0.02618408203125,
0.01052093505859375,
0.03369140625,
-0.043243408203125,
-0.032073974609375,
-0.044464111328125,
-0.004779815673828125,
-0.01068878173828125,
0.06903076171875,
-0.01262664794921875,
-0.0038356781005859375,
-0.01497650146484375,
0.007732391357421875,
-0.0179290771484375,
-0.0024623870849609375,
-0.0693359375,
-0.01971435546875,
0.0156402587890625,
0.0165863037109375,
0.033477783203125,
0.0217437744140625,
0.0254058837890625,
0.0221405029296875,
-0.02386474609375,
-0.0005974769592285156,
-0.037322998046875,
0.001369476318359375,
0.0120849609375,
-0.028106689453125,
-0.00875091552734375,
0.01308441162109375,
0.052520751953125,
0.032928466796875,
0.0120697021484375,
-0.0108184814453125,
0.0084381103515625,
0.0533447265625,
-0.03741455078125,
-0.01247406005859375,
-0.016937255859375,
0.01409149169921875,
-0.021087646484375,
-0.00174713134765625,
-0.016998291015625,
-0.0268096923828125,
0.01119232177734375,
-0.057373046875,
0.0272674560546875,
-0.0206146240234375,
0.0791015625,
0.018829345703125,
-0.01070404052734375,
-0.0168914794921875,
-0.04632568359375,
0.0472412109375,
-0.062744140625,
-0.0022869110107421875,
0.0220794677734375,
0.00655364990234375,
-0.00547027587890625,
-0.0802001953125,
-0.055694580078125,
-0.0024623870849609375,
-0.00274658203125,
0.037841796875,
-0.0253143310546875,
-0.0006518363952636719,
0.036895751953125,
0.04052734375,
-0.0504150390625,
-0.0235443115234375,
-0.025360107421875,
0.005458831787109375,
0.039337158203125,
0.006816864013671875,
0.0374755859375,
-0.003936767578125,
-0.038330078125,
-0.005817413330078125,
-0.040313720703125,
-0.0008273124694824219,
0.03289794921875,
-0.01678466796875,
-0.0234222412109375,
0.04388427734375,
0.0005698204040527344,
0.047210693359375,
0.0206146240234375,
-0.017120361328125,
0.017730712890625,
-0.02142333984375,
-0.0272369384765625,
-0.0222320556640625,
0.07061767578125,
0.034271240234375,
-0.01451873779296875,
-0.0035419464111328125,
-0.0261688232421875,
0.0022907257080078125,
-0.0019512176513671875,
-0.10113525390625,
-0.02740478515625,
0.0097503662109375,
-0.04193115234375,
-0.032073974609375,
-0.00846099853515625,
-0.06817626953125,
-0.0162506103515625,
0.0057373046875,
0.0667724609375,
-0.04071044921875,
-0.030029296875,
0.008880615234375,
-0.034759521484375,
-0.00005334615707397461,
0.03607177734375,
-0.04827880859375,
0.00099945068359375,
-0.0007748603820800781,
0.08905029296875,
-0.01262664794921875,
0.0073089599609375,
0.0014238357543945312,
0.002124786376953125,
-0.0275115966796875,
0.05828857421875,
-0.034210205078125,
-0.042816162109375,
-0.0169525146484375,
0.0229339599609375,
-0.00690460205078125,
-0.049041748046875,
0.0357666015625,
-0.0293731689453125,
0.03564453125,
-0.00433349609375,
-0.05633544921875,
-0.019622802734375,
-0.01554107666015625,
-0.04888916015625,
0.07672119140625,
0.0313720703125,
-0.0621337890625,
0.01146697998046875,
-0.0721435546875,
-0.004390716552734375,
0.005619049072265625,
0.00939178466796875,
-0.0538330078125,
-0.0093231201171875,
0.0003325939178466797,
0.032745361328125,
-0.00988006591796875,
-0.003124237060546875,
-0.020233154296875,
-0.02215576171875,
0.01427459716796875,
-0.02935791015625,
0.0936279296875,
0.0250701904296875,
-0.0338134765625,
0.0033397674560546875,
-0.061279296875,
-0.01751708984375,
0.0229034423828125,
-0.019561767578125,
-0.008087158203125,
-0.034271240234375,
0.01389312744140625,
0.02606201171875,
0.00626373291015625,
-0.0538330078125,
0.006000518798828125,
-0.0341796875,
0.041473388671875,
0.056854248046875,
0.035980224609375,
0.0523681640625,
-0.0246429443359375,
0.04840087890625,
0.015777587890625,
-0.005603790283203125,
-0.03656005859375,
-0.06243896484375,
-0.05743408203125,
-0.03887939453125,
0.00797271728515625,
0.0300750732421875,
-0.0704345703125,
0.0247344970703125,
0.01042938232421875,
-0.05950927734375,
-0.0238800048828125,
-0.006511688232421875,
0.02069091796875,
0.06414794921875,
0.0164642333984375,
-0.04461669921875,
-0.0171356201171875,
-0.049530029296875,
0.0160675048828125,
0.01026153564453125,
-0.00865936279296875,
0.0070648193359375,
0.05474853515625,
-0.0200347900390625,
0.056427001953125,
-0.050628662109375,
-0.0303802490234375,
0.0046539306640625,
0.0218048095703125,
0.0171661376953125,
0.0528564453125,
0.04827880859375,
-0.0548095703125,
-0.06280517578125,
-0.015869140625,
-0.0601806640625,
-0.01087188720703125,
-0.0159759521484375,
-0.0176544189453125,
0.0238800048828125,
0.03851318359375,
-0.05413818359375,
0.05010986328125,
0.03411865234375,
-0.034637451171875,
0.044158935546875,
-0.0246429443359375,
0.009979248046875,
-0.08197021484375,
0.009765625,
0.02630615234375,
-0.01459503173828125,
-0.05242919921875,
0.00856781005859375,
-0.003261566162109375,
-0.0269775390625,
-0.03680419921875,
0.05523681640625,
-0.036041259765625,
0.0305633544921875,
-0.029876708984375,
-0.01329803466796875,
0.010711669921875,
0.047698974609375,
0.013885498046875,
0.054656982421875,
0.06842041015625,
-0.042236328125,
0.0202178955078125,
0.01329803466796875,
-0.04144287109375,
0.050811767578125,
-0.062469482421875,
0.01776123046875,
-0.021148681640625,
0.0264892578125,
-0.1002197265625,
-0.00939178466796875,
0.0462646484375,
-0.036590576171875,
0.04052734375,
-0.0155792236328125,
-0.03564453125,
-0.01508331298828125,
-0.0160369873046875,
0.038665771484375,
0.051849365234375,
-0.029998779296875,
0.043701171875,
0.01248931884765625,
-0.001956939697265625,
-0.023162841796875,
-0.05413818359375,
0.00016260147094726562,
-0.0301513671875,
-0.065185546875,
0.0550537109375,
-0.026397705078125,
-0.0028591156005859375,
0.0083465576171875,
0.013702392578125,
-0.005767822265625,
-0.0196990966796875,
0.0229034423828125,
0.0172271728515625,
-0.00818634033203125,
-0.004611968994140625,
0.006107330322265625,
-0.016571044921875,
0.0009212493896484375,
-0.008087158203125,
0.0243377685546875,
0.018218994140625,
-0.03143310546875,
-0.057952880859375,
0.028839111328125,
0.03485107421875,
0.0121307373046875,
0.052764892578125,
0.059051513671875,
-0.037841796875,
0.007434844970703125,
-0.0228118896484375,
-0.0086822509765625,
-0.0361328125,
0.0198822021484375,
-0.01031494140625,
-0.03033447265625,
0.0452880859375,
-0.00807952880859375,
0.007259368896484375,
0.055908203125,
0.044830322265625,
-0.0302276611328125,
0.0731201171875,
0.04364013671875,
0.0299072265625,
0.05780029296875,
-0.06866455078125,
-0.0031223297119140625,
-0.0770263671875,
-0.020355224609375,
-0.0217132568359375,
-0.02435302734375,
-0.0164642333984375,
-0.051483154296875,
0.0316162109375,
0.004817962646484375,
-0.01268768310546875,
0.01384735107421875,
-0.051971435546875,
0.02691650390625,
0.019866943359375,
0.025482177734375,
0.003021240234375,
0.004558563232421875,
0.00035452842712402344,
-0.0172882080078125,
-0.0374755859375,
-0.0435791015625,
0.076904296875,
0.0241241455078125,
0.06689453125,
0.00019443035125732422,
0.044769287109375,
0.01256561279296875,
0.0306854248046875,
-0.035125732421875,
0.0364990234375,
-0.0036869049072265625,
-0.037750244140625,
-0.0212860107421875,
-0.01023101806640625,
-0.06719970703125,
0.0205841064453125,
-0.022735595703125,
-0.03106689453125,
0.034698486328125,
0.004642486572265625,
-0.0159759521484375,
0.043670654296875,
-0.0654296875,
0.059051513671875,
0.005245208740234375,
-0.0399169921875,
0.00009483098983764648,
-0.04998779296875,
0.022674560546875,
0.0006995201110839844,
0.0015506744384765625,
0.0054931640625,
-0.004940032958984375,
0.0670166015625,
-0.031005859375,
0.07061767578125,
-0.0302886962890625,
-0.01187896728515625,
0.0200958251953125,
-0.00989532470703125,
0.035125732421875,
-0.006778717041015625,
-0.00728607177734375,
0.005046844482421875,
-0.007022857666015625,
-0.0246124267578125,
-0.03887939453125,
0.043121337890625,
-0.049591064453125,
-0.0305023193359375,
-0.032501220703125,
-0.02301025390625,
0.0325927734375,
0.00433349609375,
0.062744140625,
0.030487060546875,
-0.0194549560546875,
-0.0011548995971679688,
0.0849609375,
-0.03955078125,
0.0472412109375,
0.007785797119140625,
-0.01338958740234375,
-0.036712646484375,
0.080810546875,
0.0150146484375,
0.043853759765625,
0.018707275390625,
0.00606536865234375,
-0.0225677490234375,
-0.02294921875,
-0.0472412109375,
0.021820068359375,
-0.071044921875,
-0.017425537109375,
-0.056243896484375,
-0.0299072265625,
-0.0219879150390625,
-0.0170135498046875,
-0.0047760009765625,
-0.031524658203125,
-0.07781982421875,
0.0191650390625,
0.028564453125,
0.049468994140625,
-0.01027679443359375,
0.02618408203125,
-0.0187530517578125,
0.0291900634765625,
0.0296173095703125,
0.0208282470703125,
0.012664794921875,
-0.0634765625,
-0.01412200927734375,
0.0018110275268554688,
-0.060333251953125,
-0.079345703125,
0.03717041015625,
0.01287078857421875,
0.036590576171875,
0.040771484375,
-0.0185546875,
0.055816650390625,
-0.0172576904296875,
0.0716552734375,
0.022857666015625,
-0.052093505859375,
0.047271728515625,
-0.020355224609375,
0.01629638671875,
-0.0020008087158203125,
0.0347900390625,
-0.039581298828125,
-0.039947509765625,
-0.0682373046875,
-0.06494140625,
0.052001953125,
0.0313720703125,
0.0145721435546875,
0.00701141357421875,
0.03240966796875,
0.00360107421875,
-0.01021575927734375,
-0.0712890625,
-0.04071044921875,
-0.029144287109375,
0.007221221923828125,
-0.0011873245239257812,
-0.0143585205078125,
-0.00405120849609375,
-0.042694091796875,
0.075439453125,
0.00679779052734375,
0.04833984375,
0.0273590087890625,
-0.009765625,
-0.029876708984375,
-0.015899658203125,
0.05072021484375,
0.0264739990234375,
-0.01351165771484375,
-0.005115509033203125,
0.0009446144104003906,
-0.030029296875,
0.018524169921875,
0.004390716552734375,
-0.036712646484375,
0.0175628662109375,
0.005306243896484375,
0.0673828125,
-0.0174713134765625,
-0.0267333984375,
0.04205322265625,
-0.00707244873046875,
-0.027069091796875,
-0.0288543701171875,
0.00274658203125,
0.01300048828125,
0.0123291015625,
0.0302276611328125,
0.035125732421875,
0.01470184326171875,
-0.020782470703125,
-0.0070648193359375,
0.049468994140625,
-0.029052734375,
-0.0162811279296875,
0.07000732421875,
-0.0020732879638671875,
-0.02630615234375,
0.027069091796875,
-0.03167724609375,
-0.0221710205078125,
0.05975341796875,
0.053192138671875,
0.0731201171875,
-0.009124755859375,
0.02825927734375,
0.055694580078125,
0.01291656494140625,
0.0002377033233642578,
0.0098114013671875,
0.008819580078125,
-0.03680419921875,
-0.009918212890625,
-0.035369873046875,
-0.005237579345703125,
0.0234527587890625,
-0.01238250732421875,
0.0246429443359375,
-0.0217437744140625,
-0.01116180419921875,
0.00925445556640625,
-0.00432586669921875,
-0.03607177734375,
0.02294921875,
-0.0031490325927734375,
0.0635986328125,
-0.083251953125,
0.04693603515625,
0.04400634765625,
-0.05548095703125,
-0.0297088623046875,
0.007053375244140625,
-0.0123138427734375,
-0.0379638671875,
0.046417236328125,
0.0075836181640625,
0.0011644363403320312,
0.00994110107421875,
-0.0728759765625,
-0.060760498046875,
0.09368896484375,
0.0308074951171875,
-0.00018227100372314453,
0.01416778564453125,
-0.009185791015625,
0.041290283203125,
-0.035980224609375,
0.0308990478515625,
0.01119232177734375,
0.026397705078125,
0.03228759765625,
-0.0278472900390625,
0.0205535888671875,
-0.03173828125,
0.0357666015625,
-0.007640838623046875,
-0.04400634765625,
0.07159423828125,
-0.019012451171875,
-0.0390625,
0.03106689453125,
0.053619384765625,
0.0136871337890625,
0.01434326171875,
0.035064697265625,
0.0780029296875,
0.03466796875,
-0.0164642333984375,
0.07232666015625,
0.00010854005813598633,
0.0261688232421875,
0.039581298828125,
0.01091766357421875,
0.0220184326171875,
0.028717041015625,
-0.019287109375,
0.053985595703125,
0.06842041015625,
-0.01312255859375,
0.062042236328125,
0.018096923828125,
-0.04571533203125,
0.01508331298828125,
0.004474639892578125,
-0.035614013671875,
-0.01204681396484375,
0.03265380859375,
-0.042327880859375,
-0.0142822265625,
0.02392578125,
-0.0042724609375,
-0.018218994140625,
-0.0025768280029296875,
0.0572509765625,
-0.007110595703125,
-0.038360595703125,
0.04547119140625,
-0.0031642913818359375,
0.0751953125,
-0.042694091796875,
-0.031524658203125,
-0.01009368896484375,
-0.001239776611328125,
-0.0249786376953125,
-0.08050537109375,
0.0308380126953125,
-0.01617431640625,
-0.00829315185546875,
-0.027587890625,
0.059234619140625,
-0.03619384765625,
-0.0338134765625,
0.00555419921875,
0.01140594482421875,
0.034515380859375,
-0.00048661231994628906,
-0.06756591796875,
0.0012722015380859375,
0.0067291259765625,
-0.0276947021484375,
0.02081298828125,
0.037841796875,
0.02081298828125,
0.042266845703125,
0.0430908203125,
-0.0002620220184326172,
0.0002911090850830078,
-0.00940704345703125,
0.061126708984375,
-0.0225830078125,
-0.019989013671875,
-0.05657958984375,
0.05975341796875,
-0.00630950927734375,
-0.0227813720703125,
0.04986572265625,
0.0472412109375,
0.05413818359375,
-0.023681640625,
0.061492919921875,
-0.02117919921875,
0.00830841064453125,
-0.037322998046875,
0.06939697265625,
-0.0623779296875,
-0.003696441650390625,
-0.04693603515625,
-0.0587158203125,
-0.01018524169921875,
0.07037353515625,
0.0010900497436523438,
0.0188751220703125,
0.037017822265625,
0.08648681640625,
-0.0113372802734375,
-0.018096923828125,
0.0198974609375,
0.0035858154296875,
0.0261383056640625,
0.024169921875,
0.054229736328125,
-0.046661376953125,
0.028594970703125,
-0.04351806640625,
-0.0273284912109375,
0.0036487579345703125,
-0.06903076171875,
-0.0565185546875,
-0.0758056640625,
-0.051422119140625,
-0.045379638671875,
-0.0230255126953125,
0.035369873046875,
0.060455322265625,
-0.031402587890625,
-0.003192901611328125,
-0.0206298828125,
0.0026340484619140625,
-0.0151519775390625,
-0.024169921875,
0.0264129638671875,
-0.00408172607421875,
-0.06787109375,
-0.004611968994140625,
0.0308990478515625,
0.0489501953125,
-0.0288848876953125,
-0.01175689697265625,
-0.021575927734375,
-0.0026721954345703125,
0.037689208984375,
0.0278472900390625,
-0.047393798828125,
0.013031005859375,
-0.017059326171875,
-0.007411956787109375,
0.01421356201171875,
0.0284576416015625,
-0.051483154296875,
0.0540771484375,
0.0396728515625,
0.01153564453125,
0.06732177734375,
-0.029693603515625,
0.0116424560546875,
-0.0616455078125,
0.0107269287109375,
0.0118865966796875,
0.026763916015625,
0.031951904296875,
-0.035980224609375,
0.0399169921875,
0.04229736328125,
-0.056732177734375,
-0.05499267578125,
0.00879669189453125,
-0.07537841796875,
-0.0232086181640625,
0.0804443359375,
-0.01751708984375,
-0.01129913330078125,
-0.003116607666015625,
-0.04248046875,
0.0328369140625,
-0.0246124267578125,
0.049774169921875,
0.046905517578125,
-0.0177154541015625,
-0.032623291015625,
-0.0289306640625,
0.046539306640625,
0.00910186767578125,
-0.058074951171875,
-0.0186309814453125,
0.030120849609375,
0.04595947265625,
0.0276947021484375,
0.08172607421875,
-0.0140838623046875,
0.016204833984375,
-0.0014019012451171875,
0.01392364501953125,
0.0144195556640625,
0.01137542724609375,
-0.03594970703125,
-0.003681182861328125,
-0.0220794677734375,
-0.016632080078125
]
] |
Salesforce/instructblip-vicuna-7b | 2023-07-17T12:36:58.000Z | [
"transformers",
"pytorch",
"instructblip",
"text2text-generation",
"vision",
"image-captioning",
"image-to-text",
"en",
"arxiv:2305.06500",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-to-text | Salesforce | null | null | Salesforce/instructblip-vicuna-7b | 44 | 79,467 | transformers | 2023-05-22T19:28:03 | ---
language: en
license: other
tags:
- vision
- image-captioning
pipeline_tag: image-to-text
---
# InstructBLIP model
InstructBLIP model using [Vicuna-7b](https://github.com/lm-sys/FastChat#model-weights) as language model. InstructBLIP was introduced in the paper [InstructBLIP: Towards General-purpose Vision-Language Models with Instruction Tuning](https://arxiv.org/abs/2305.06500) by Dai et al.
Disclaimer: The team releasing InstructBLIP did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
InstructBLIP is a visual instruction tuned version of [BLIP-2](https://huggingface.co/docs/transformers/main/model_doc/blip-2). Refer to the paper for details.

## Intended uses & limitations
Usage is as follows:
```py
from transformers import InstructBlipProcessor, InstructBlipForConditionalGeneration
import torch
from PIL import Image
import requests
model = InstructBlipForConditionalGeneration.from_pretrained("Salesforce/instructblip-vicuna-7b")
processor = InstructBlipProcessor.from_pretrained("Salesforce/instructblip-vicuna-7b")
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
url = "https://raw.githubusercontent.com/salesforce/LAVIS/main/docs/_static/Confusing-Pictures.jpg"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")
prompt = "What is unusual about this image?"
inputs = processor(images=image, text=prompt, return_tensors="pt").to(device)
outputs = model.generate(
    **inputs,
    do_sample=False,  # greedy/beam decoding; the sampling args below (top_p, temperature) are ignored
    num_beams=5,
    max_length=256,
    min_length=1,
    top_p=0.9,
    repetition_penalty=1.5,
    length_penalty=1.0,
    temperature=1,
)
generated_text = processor.batch_decode(outputs, skip_special_tokens=True)[0].strip()
print(generated_text)
```
### How to use
For code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/instructblip). | 2,140 | [
[
-0.034454345703125,
-0.0513916015625,
0.0038909912109375,
0.0306549072265625,
-0.02484130859375,
-0.00620269775390625,
-0.006107330322265625,
-0.048614501953125,
-0.0018205642700195312,
0.037872314453125,
-0.049957275390625,
-0.030303955078125,
-0.040313720703125,
-0.0201416015625,
-0.01415252685546875,
0.073974609375,
0.005401611328125,
-0.00452423095703125,
-0.014984130859375,
-0.00052642822265625,
-0.033111572265625,
-0.0241851806640625,
-0.04296875,
-0.0304718017578125,
-0.01224517822265625,
0.03216552734375,
0.0535888671875,
0.036376953125,
0.048736572265625,
0.026275634765625,
-0.020477294921875,
0.0126800537109375,
-0.0276947021484375,
-0.03253173828125,
0.00762939453125,
-0.04638671875,
-0.034576416015625,
-0.0004892349243164062,
0.044708251953125,
0.0234222412109375,
-0.007659912109375,
0.03472900390625,
0.00337982177734375,
0.033538818359375,
-0.03265380859375,
0.0272216796875,
-0.037445068359375,
-0.000004708766937255859,
-0.0036716461181640625,
-0.0234527587890625,
-0.04852294921875,
-0.0144805908203125,
-0.0030956268310546875,
-0.0280609130859375,
0.034759521484375,
0.0048065185546875,
0.10491943359375,
0.0230560302734375,
-0.012664794921875,
-0.0169219970703125,
-0.0438232421875,
0.05322265625,
-0.06103515625,
0.0286407470703125,
0.01316070556640625,
0.0197906494140625,
0.0008091926574707031,
-0.0721435546875,
-0.036651611328125,
-0.01366424560546875,
-0.00905609130859375,
0.01064300537109375,
-0.0218048095703125,
0.017578125,
0.045135498046875,
0.0163726806640625,
-0.04345703125,
0.0157470703125,
-0.04510498046875,
-0.0123138427734375,
0.04052734375,
0.009429931640625,
0.0212860107421875,
-0.019134521484375,
-0.0582275390625,
-0.027252197265625,
-0.041961669921875,
0.021484375,
0.01294708251953125,
0.01398468017578125,
-0.048431396484375,
0.043548583984375,
-0.00302886962890625,
0.052886962890625,
0.028900146484375,
-0.0156097412109375,
0.042816162109375,
-0.0012111663818359375,
-0.034393310546875,
0.002239227294921875,
0.06585693359375,
0.041961669921875,
0.0031280517578125,
0.0013093948364257812,
-0.0143280029296875,
0.00424957275390625,
0.01349639892578125,
-0.09539794921875,
-0.01282501220703125,
0.02716064453125,
-0.039764404296875,
-0.0192108154296875,
0.01403045654296875,
-0.05548095703125,
-0.0099639892578125,
0.0024814605712890625,
0.035125732421875,
-0.044769287109375,
-0.0207061767578125,
0.0069732666015625,
-0.0240936279296875,
0.032928466796875,
0.005603790283203125,
-0.080322265625,
0.0151519775390625,
0.0430908203125,
0.0660400390625,
0.017608642578125,
-0.032928466796875,
-0.0174713134765625,
0.0074462890625,
-0.007480621337890625,
0.05010986328125,
-0.011688232421875,
-0.027069091796875,
-0.01021575927734375,
0.016143798828125,
-0.0026111602783203125,
-0.0550537109375,
0.024383544921875,
-0.0216064453125,
0.01476287841796875,
0.003932952880859375,
-0.042724609375,
-0.015899658203125,
-0.0013427734375,
-0.026763916015625,
0.0853271484375,
0.0213165283203125,
-0.06671142578125,
0.0171356201171875,
-0.0455322265625,
-0.0223541259765625,
0.024261474609375,
-0.0078277587890625,
-0.04638671875,
-0.0007758140563964844,
0.0024967193603515625,
0.0419921875,
-0.02154541015625,
0.005023956298828125,
-0.016845703125,
-0.037750244140625,
0.004962921142578125,
-0.01549530029296875,
0.0943603515625,
0.005619049072265625,
-0.047149658203125,
0.0254669189453125,
-0.0498046875,
0.01093292236328125,
0.021697998046875,
-0.0037937164306640625,
-0.005649566650390625,
-0.0300750732421875,
0.003383636474609375,
0.0083770751953125,
0.02783203125,
-0.039093017578125,
0.02301025390625,
-0.0294952392578125,
0.04052734375,
0.04608154296875,
-0.006755828857421875,
0.038360595703125,
-0.0042266845703125,
0.038421630859375,
0.0025234222412109375,
0.04510498046875,
-0.0097198486328125,
-0.048370361328125,
-0.0633544921875,
-0.026611328125,
-0.00865936279296875,
0.049774169921875,
-0.0721435546875,
0.019744873046875,
-0.0106658935546875,
-0.0660400390625,
-0.044158935546875,
-0.0012617111206054688,
0.05352783203125,
0.057373046875,
0.03265380859375,
-0.02044677734375,
-0.034515380859375,
-0.0828857421875,
0.01490020751953125,
-0.015655517578125,
0.0029087066650390625,
0.0168914794921875,
0.03912353515625,
-0.019134521484375,
0.04638671875,
-0.041839599609375,
-0.009765625,
-0.0238037109375,
0.0106658935546875,
0.029541015625,
0.04443359375,
0.05999755859375,
-0.05157470703125,
-0.027496337890625,
-0.0092010498046875,
-0.062164306640625,
0.0008168220520019531,
-0.011566162109375,
-0.025115966796875,
0.0243072509765625,
0.037200927734375,
-0.0682373046875,
0.0411376953125,
0.045562744140625,
-0.03717041015625,
0.04913330078125,
-0.00925445556640625,
-0.003143310546875,
-0.08331298828125,
0.0069122314453125,
0.01346588134765625,
-0.01064300537109375,
-0.038330078125,
0.019195556640625,
0.026947021484375,
-0.010498046875,
-0.056915283203125,
0.05450439453125,
-0.04058837890625,
0.00506591796875,
-0.0142059326171875,
-0.02520751953125,
0.004497528076171875,
0.057098388671875,
0.004947662353515625,
0.05548095703125,
0.05120849609375,
-0.0513916015625,
0.037109375,
0.038177490234375,
-0.02294921875,
0.0295257568359375,
-0.0670166015625,
0.006771087646484375,
0.002033233642578125,
-0.0017986297607421875,
-0.05450439453125,
-0.00774383544921875,
0.03997802734375,
-0.035308837890625,
0.0421142578125,
-0.0257110595703125,
-0.0428466796875,
-0.04254150390625,
-0.007049560546875,
0.02288818359375,
0.05029296875,
-0.05548095703125,
0.037445068359375,
0.030029296875,
0.01500701904296875,
-0.041534423828125,
-0.07403564453125,
-0.0123291015625,
-0.011260986328125,
-0.048919677734375,
0.04327392578125,
-0.0159454345703125,
0.00531768798828125,
-0.0017290115356445312,
0.0047760009765625,
-0.0146942138671875,
-0.0167236328125,
0.037200927734375,
0.038177490234375,
-0.0088958740234375,
-0.01346588134765625,
0.00019299983978271484,
-0.005580902099609375,
0.01407623291015625,
0.01018524169921875,
0.06536865234375,
-0.031646728515625,
-0.01187896728515625,
-0.06585693359375,
0.00977325439453125,
0.028839111328125,
-0.01708984375,
0.0516357421875,
0.053955078125,
-0.0188751220703125,
-0.031158447265625,
-0.02984619140625,
-0.024261474609375,
-0.04534912109375,
0.0321044921875,
-0.0224456787109375,
-0.03253173828125,
0.038909912109375,
0.00934600830078125,
0.004486083984375,
0.024993896484375,
0.046844482421875,
-0.00555419921875,
0.05987548828125,
0.061676025390625,
0.0182952880859375,
0.0631103515625,
-0.0635986328125,
-0.00432586669921875,
-0.056243896484375,
-0.03326416015625,
-0.018341064453125,
-0.00455474853515625,
-0.041229248046875,
-0.025543212890625,
0.014434814453125,
0.0114898681640625,
-0.035980224609375,
0.048919677734375,
-0.0670166015625,
0.01373291015625,
0.06549072265625,
0.02947998046875,
-0.017059326171875,
0.004199981689453125,
-0.0008263587951660156,
-0.00510406494140625,
-0.06024169921875,
-0.0263671875,
0.04736328125,
0.03009033203125,
0.057861328125,
-0.018402099609375,
0.047210693359375,
-0.00540924072265625,
0.02838134765625,
-0.0557861328125,
0.056793212890625,
-0.0030689239501953125,
-0.04119873046875,
0.005649566650390625,
-0.032928466796875,
-0.05694580078125,
0.0036487579345703125,
-0.01271820068359375,
-0.0540771484375,
0.0145111083984375,
0.02911376953125,
-0.00833892822265625,
0.0300140380859375,
-0.0814208984375,
0.0767822265625,
-0.034881591796875,
-0.02264404296875,
0.00002086162567138672,
-0.04302978515625,
0.03216552734375,
0.040740966796875,
-0.014862060546875,
0.0019989013671875,
0.02142333984375,
0.0601806640625,
-0.038818359375,
0.081298828125,
-0.0239105224609375,
0.00759124755859375,
0.041534423828125,
-0.00960540771484375,
0.023681640625,
-0.00722503662109375,
-0.0081329345703125,
0.033172607421875,
0.0128021240234375,
-0.028717041015625,
-0.045196533203125,
0.0280303955078125,
-0.055877685546875,
-0.037322998046875,
-0.025146484375,
-0.036376953125,
0.0012722015380859375,
0.025543212890625,
0.056732177734375,
0.0433349609375,
0.0030384063720703125,
0.0064697265625,
0.03497314453125,
-0.0298919677734375,
0.039581298828125,
0.01107025146484375,
-0.0195465087890625,
-0.032318115234375,
0.0709228515625,
-0.007568359375,
0.0357666015625,
0.0283660888671875,
0.0009489059448242188,
-0.017852783203125,
-0.0186309814453125,
-0.049407958984375,
0.034027099609375,
-0.05010986328125,
-0.035247802734375,
-0.0131683349609375,
-0.034210205078125,
-0.031402587890625,
-0.01153564453125,
-0.046142578125,
-0.01371002197265625,
-0.0302581787109375,
0.0024890899658203125,
0.037750244140625,
0.028472900390625,
0.007781982421875,
0.047210693359375,
-0.055328369140625,
0.036834716796875,
0.002735137939453125,
0.031402587890625,
0.0011949539184570312,
-0.04931640625,
-0.03179931640625,
0.0137481689453125,
-0.03875732421875,
-0.06146240234375,
0.04119873046875,
0.002010345458984375,
0.041839599609375,
0.033355712890625,
0.000492095947265625,
0.06103515625,
-0.0286407470703125,
0.042633056640625,
0.0298309326171875,
-0.07647705078125,
0.04559326171875,
0.00504302978515625,
0.0306243896484375,
0.0146026611328125,
0.0240936279296875,
-0.030792236328125,
-0.0292816162109375,
-0.06256103515625,
-0.06939697265625,
0.060333251953125,
0.00385284423828125,
0.016510009765625,
0.00782012939453125,
0.02655029296875,
-0.0033817291259765625,
0.021575927734375,
-0.044952392578125,
-0.038360595703125,
-0.0294952392578125,
-0.0153961181640625,
0.00893402099609375,
-0.0273895263671875,
-0.00763702392578125,
-0.031494140625,
0.045196533203125,
0.0036640167236328125,
0.0517578125,
0.01317596435546875,
-0.01039886474609375,
-0.0033855438232421875,
-0.018463134765625,
0.040191650390625,
0.038818359375,
-0.027740478515625,
-0.01296234130859375,
-0.005950927734375,
-0.043701171875,
-0.018951416015625,
0.012237548828125,
-0.033538818359375,
0.006298065185546875,
0.031402587890625,
0.09857177734375,
0.0106658935546875,
-0.035064697265625,
0.033355712890625,
-0.00888824462890625,
-0.016845703125,
-0.0277557373046875,
0.0095062255859375,
0.0079498291015625,
0.03375244140625,
0.01507568359375,
0.01221466064453125,
-0.007076263427734375,
-0.02777099609375,
0.006542205810546875,
0.044342041015625,
-0.024169921875,
-0.030517578125,
0.06707763671875,
0.0177764892578125,
-0.036651611328125,
0.0650634765625,
-0.01496124267578125,
-0.034576416015625,
0.06451416015625,
0.04931640625,
0.055694580078125,
-0.0016908645629882812,
0.0282440185546875,
0.033905029296875,
0.034393310546875,
0.006893157958984375,
0.03900146484375,
0.0007305145263671875,
-0.05963134765625,
-0.01153564453125,
-0.0400390625,
-0.03985595703125,
0.0027790069580078125,
-0.0604248046875,
0.03948974609375,
-0.057891845703125,
-0.021331787109375,
0.005619049072265625,
-0.0012655258178710938,
-0.06756591796875,
0.0199432373046875,
0.015228271484375,
0.07635498046875,
-0.05377197265625,
0.059234619140625,
0.04541015625,
-0.05108642578125,
-0.0704345703125,
-0.01629638671875,
-0.018157958984375,
-0.072509765625,
0.04803466796875,
0.02996826171875,
-0.003719329833984375,
0.01042938232421875,
-0.07562255859375,
-0.03875732421875,
0.0751953125,
0.04498291015625,
-0.02496337890625,
-0.006458282470703125,
0.0005249977111816406,
0.032928466796875,
-0.018280029296875,
0.02862548828125,
0.005252838134765625,
0.02056884765625,
0.01219940185546875,
-0.0714111328125,
0.0157470703125,
-0.0253753662109375,
-0.0078887939453125,
0.007427215576171875,
-0.07269287109375,
0.0650634765625,
-0.039215087890625,
-0.0196990966796875,
0.00801849365234375,
0.06817626953125,
0.0261077880859375,
0.02105712890625,
0.032012939453125,
0.041839599609375,
0.04315185546875,
0.0014400482177734375,
0.08795166015625,
-0.03271484375,
0.040557861328125,
0.07366943359375,
0.0110626220703125,
0.061492919921875,
0.037811279296875,
0.0000023245811462402344,
0.037811279296875,
0.04156494140625,
-0.037841796875,
0.0282135009765625,
-0.0065460205078125,
0.007049560546875,
0.00623321533203125,
-0.005535125732421875,
-0.034271240234375,
0.0406494140625,
0.0261688232421875,
-0.0321044921875,
0.00450897216796875,
-0.00348663330078125,
0.0114898681640625,
-0.0255889892578125,
-0.020355224609375,
0.02813720703125,
0.0008783340454101562,
-0.03692626953125,
0.07818603515625,
-0.005329132080078125,
0.06683349609375,
-0.04547119140625,
-0.01084136962890625,
-0.01100921630859375,
0.02679443359375,
-0.022857666015625,
-0.06634521484375,
0.0223236083984375,
-0.00688934326171875,
-0.00628662109375,
-0.019744873046875,
0.031036376953125,
-0.0239410400390625,
-0.053924560546875,
0.02606201171875,
0.00493621826171875,
0.022857666015625,
0.012542724609375,
-0.0648193359375,
0.0181884765625,
0.00925445556640625,
-0.028472900390625,
0.007572174072265625,
0.0194854736328125,
0.0067138671875,
0.0670166015625,
0.03369140625,
0.006900787353515625,
0.016448974609375,
0.0044708251953125,
0.06768798828125,
-0.045379638671875,
-0.027252197265625,
-0.0458984375,
0.0516357421875,
-0.0098724365234375,
-0.04693603515625,
0.05841064453125,
0.047210693359375,
0.08245849609375,
-0.020355224609375,
0.039581298828125,
-0.0172271728515625,
0.01026153564453125,
-0.044097900390625,
0.052032470703125,
-0.05029296875,
-0.00774383544921875,
-0.029205322265625,
-0.057281494140625,
-0.01342010498046875,
0.04388427734375,
-0.0082550048828125,
0.0066375732421875,
0.047393798828125,
0.08056640625,
-0.02008056640625,
-0.0168609619140625,
0.0120697021484375,
0.0256500244140625,
0.0239105224609375,
0.02783203125,
0.0309295654296875,
-0.049285888671875,
0.0462646484375,
-0.07012939453125,
-0.02813720703125,
-0.01361083984375,
-0.054534912109375,
-0.0709228515625,
-0.0458984375,
-0.03314208984375,
-0.041534423828125,
-0.0062255859375,
0.055206298828125,
0.0738525390625,
-0.045196533203125,
-0.0279693603515625,
-0.006938934326171875,
-0.00710296630859375,
-0.0292510986328125,
-0.021514892578125,
0.03466796875,
-0.005462646484375,
-0.07061767578125,
0.0069427490234375,
0.006626129150390625,
0.01549530029296875,
-0.0079803466796875,
-0.004299163818359375,
-0.0024013519287109375,
-0.016571044921875,
0.038238525390625,
0.0458984375,
-0.04217529296875,
-0.007770538330078125,
-0.0018396377563476562,
-0.0158233642578125,
0.0113067626953125,
0.02886962890625,
-0.03729248046875,
0.02032470703125,
0.03216552734375,
0.0220794677734375,
0.054931640625,
-0.005405426025390625,
0.022125244140625,
-0.05059814453125,
0.06146240234375,
0.0128631591796875,
0.040771484375,
0.0293731689453125,
-0.0306549072265625,
0.0272216796875,
0.0161895751953125,
-0.036468505859375,
-0.07012939453125,
0.024078369140625,
-0.0928955078125,
-0.0122833251953125,
0.08563232421875,
-0.0013942718505859375,
-0.04718017578125,
0.0333251953125,
-0.0264434814453125,
0.03155517578125,
-0.030303955078125,
0.052734375,
0.0123291015625,
-0.01641845703125,
-0.040771484375,
-0.031463623046875,
0.02703857421875,
0.01245880126953125,
-0.052093505859375,
-0.01137542724609375,
0.0218048095703125,
0.0185546875,
0.01061248779296875,
0.0379638671875,
0.0008225440979003906,
0.0146026611328125,
0.0142059326171875,
0.058685302734375,
-0.0253753662109375,
-0.01554107666015625,
-0.00557708740234375,
-0.006511688232421875,
-0.007358551025390625,
-0.03985595703125
]
] |
BAAI/bge-small-en | 2023-10-12T03:35:23.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"feature-extraction",
"mteb",
"sentence transformers",
"en",
"arxiv:2310.07554",
"arxiv:2309.07597",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | feature-extraction | BAAI | null | null | BAAI/bge-small-en | 48 | 78,957 | transformers | 2023-08-05T08:04:07 | ---
tags:
- mteb
- sentence transformers
model-index:
- name: bge-small-en
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 74.34328358208955
- type: ap
value: 37.59947775195661
- type: f1
value: 68.548415491933
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 93.04527499999999
- type: ap
value: 89.60696356772135
- type: f1
value: 93.03361469382438
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 46.08
- type: f1
value: 45.66249835363254
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 35.205999999999996
- type: map_at_10
value: 50.782000000000004
- type: map_at_100
value: 51.547
- type: map_at_1000
value: 51.554
- type: map_at_3
value: 46.515
- type: map_at_5
value: 49.296
- type: mrr_at_1
value: 35.632999999999996
- type: mrr_at_10
value: 50.958999999999996
- type: mrr_at_100
value: 51.724000000000004
- type: mrr_at_1000
value: 51.731
- type: mrr_at_3
value: 46.669
- type: mrr_at_5
value: 49.439
- type: ndcg_at_1
value: 35.205999999999996
- type: ndcg_at_10
value: 58.835
- type: ndcg_at_100
value: 62.095
- type: ndcg_at_1000
value: 62.255
- type: ndcg_at_3
value: 50.255
- type: ndcg_at_5
value: 55.296
- type: precision_at_1
value: 35.205999999999996
- type: precision_at_10
value: 8.421
- type: precision_at_100
value: 0.984
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 20.365
- type: precision_at_5
value: 14.680000000000001
- type: recall_at_1
value: 35.205999999999996
- type: recall_at_10
value: 84.211
- type: recall_at_100
value: 98.43499999999999
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 61.095
- type: recall_at_5
value: 73.4
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 47.52644476278646
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 39.973045724188964
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 62.28285314871488
- type: mrr
value: 74.52743701358659
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 80.09041909160327
- type: cos_sim_spearman
value: 79.96266537706944
- type: euclidean_pearson
value: 79.50774978162241
- type: euclidean_spearman
value: 79.9144715078551
- type: manhattan_pearson
value: 79.2062139879302
- type: manhattan_spearman
value: 79.35000081468212
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 85.31493506493506
- type: f1
value: 85.2704557977762
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 39.6837242810816
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 35.38881249555897
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackAndroidRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.884999999999998
- type: map_at_10
value: 39.574
- type: map_at_100
value: 40.993
- type: map_at_1000
value: 41.129
- type: map_at_3
value: 36.089
- type: map_at_5
value: 38.191
- type: mrr_at_1
value: 34.477999999999994
- type: mrr_at_10
value: 45.411
- type: mrr_at_100
value: 46.089999999999996
- type: mrr_at_1000
value: 46.147
- type: mrr_at_3
value: 42.346000000000004
- type: mrr_at_5
value: 44.292
- type: ndcg_at_1
value: 34.477999999999994
- type: ndcg_at_10
value: 46.123999999999995
- type: ndcg_at_100
value: 51.349999999999994
- type: ndcg_at_1000
value: 53.578
- type: ndcg_at_3
value: 40.824
- type: ndcg_at_5
value: 43.571
- type: precision_at_1
value: 34.477999999999994
- type: precision_at_10
value: 8.841000000000001
- type: precision_at_100
value: 1.4460000000000002
- type: precision_at_1000
value: 0.192
- type: precision_at_3
value: 19.742
- type: precision_at_5
value: 14.421000000000001
- type: recall_at_1
value: 27.884999999999998
- type: recall_at_10
value: 59.087
- type: recall_at_100
value: 80.609
- type: recall_at_1000
value: 95.054
- type: recall_at_3
value: 44.082
- type: recall_at_5
value: 51.593999999999994
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackEnglishRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.639
- type: map_at_10
value: 40.047
- type: map_at_100
value: 41.302
- type: map_at_1000
value: 41.425
- type: map_at_3
value: 37.406
- type: map_at_5
value: 38.934000000000005
- type: mrr_at_1
value: 37.707
- type: mrr_at_10
value: 46.082
- type: mrr_at_100
value: 46.745
- type: mrr_at_1000
value: 46.786
- type: mrr_at_3
value: 43.980999999999995
- type: mrr_at_5
value: 45.287
- type: ndcg_at_1
value: 37.707
- type: ndcg_at_10
value: 45.525
- type: ndcg_at_100
value: 49.976
- type: ndcg_at_1000
value: 51.94499999999999
- type: ndcg_at_3
value: 41.704
- type: ndcg_at_5
value: 43.596000000000004
- type: precision_at_1
value: 37.707
- type: precision_at_10
value: 8.465
- type: precision_at_100
value: 1.375
- type: precision_at_1000
value: 0.183
- type: precision_at_3
value: 19.979
- type: precision_at_5
value: 14.115
- type: recall_at_1
value: 30.639
- type: recall_at_10
value: 54.775
- type: recall_at_100
value: 73.678
- type: recall_at_1000
value: 86.142
- type: recall_at_3
value: 43.230000000000004
- type: recall_at_5
value: 48.622
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGamingRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 38.038
- type: map_at_10
value: 49.922
- type: map_at_100
value: 51.032
- type: map_at_1000
value: 51.085
- type: map_at_3
value: 46.664
- type: map_at_5
value: 48.588
- type: mrr_at_1
value: 43.95
- type: mrr_at_10
value: 53.566
- type: mrr_at_100
value: 54.318999999999996
- type: mrr_at_1000
value: 54.348
- type: mrr_at_3
value: 51.066
- type: mrr_at_5
value: 52.649
- type: ndcg_at_1
value: 43.95
- type: ndcg_at_10
value: 55.676
- type: ndcg_at_100
value: 60.126000000000005
- type: ndcg_at_1000
value: 61.208
- type: ndcg_at_3
value: 50.20400000000001
- type: ndcg_at_5
value: 53.038
- type: precision_at_1
value: 43.95
- type: precision_at_10
value: 8.953
- type: precision_at_100
value: 1.2109999999999999
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 22.256999999999998
- type: precision_at_5
value: 15.524
- type: recall_at_1
value: 38.038
- type: recall_at_10
value: 69.15
- type: recall_at_100
value: 88.31599999999999
- type: recall_at_1000
value: 95.993
- type: recall_at_3
value: 54.663
- type: recall_at_5
value: 61.373
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGisRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.872
- type: map_at_10
value: 32.912
- type: map_at_100
value: 33.972
- type: map_at_1000
value: 34.046
- type: map_at_3
value: 30.361
- type: map_at_5
value: 31.704
- type: mrr_at_1
value: 26.779999999999998
- type: mrr_at_10
value: 34.812
- type: mrr_at_100
value: 35.754999999999995
- type: mrr_at_1000
value: 35.809000000000005
- type: mrr_at_3
value: 32.335
- type: mrr_at_5
value: 33.64
- type: ndcg_at_1
value: 26.779999999999998
- type: ndcg_at_10
value: 37.623
- type: ndcg_at_100
value: 42.924
- type: ndcg_at_1000
value: 44.856
- type: ndcg_at_3
value: 32.574
- type: ndcg_at_5
value: 34.842
- type: precision_at_1
value: 26.779999999999998
- type: precision_at_10
value: 5.729
- type: precision_at_100
value: 0.886
- type: precision_at_1000
value: 0.109
- type: precision_at_3
value: 13.559
- type: precision_at_5
value: 9.469
- type: recall_at_1
value: 24.872
- type: recall_at_10
value: 50.400999999999996
- type: recall_at_100
value: 74.954
- type: recall_at_1000
value: 89.56
- type: recall_at_3
value: 36.726
- type: recall_at_5
value: 42.138999999999996
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackMathematicaRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.803
- type: map_at_10
value: 24.348
- type: map_at_100
value: 25.56
- type: map_at_1000
value: 25.668000000000003
- type: map_at_3
value: 21.811
- type: map_at_5
value: 23.287
- type: mrr_at_1
value: 20.771
- type: mrr_at_10
value: 28.961
- type: mrr_at_100
value: 29.979
- type: mrr_at_1000
value: 30.046
- type: mrr_at_3
value: 26.555
- type: mrr_at_5
value: 28.060000000000002
- type: ndcg_at_1
value: 20.771
- type: ndcg_at_10
value: 29.335
- type: ndcg_at_100
value: 35.188
- type: ndcg_at_1000
value: 37.812
- type: ndcg_at_3
value: 24.83
- type: ndcg_at_5
value: 27.119
- type: precision_at_1
value: 20.771
- type: precision_at_10
value: 5.4350000000000005
- type: precision_at_100
value: 0.9480000000000001
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 11.982
- type: precision_at_5
value: 8.831
- type: recall_at_1
value: 16.803
- type: recall_at_10
value: 40.039
- type: recall_at_100
value: 65.83200000000001
- type: recall_at_1000
value: 84.478
- type: recall_at_3
value: 27.682000000000002
- type: recall_at_5
value: 33.535
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackPhysicsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.345
- type: map_at_10
value: 37.757000000000005
- type: map_at_100
value: 39.141
- type: map_at_1000
value: 39.262
- type: map_at_3
value: 35.183
- type: map_at_5
value: 36.592
- type: mrr_at_1
value: 34.649
- type: mrr_at_10
value: 43.586999999999996
- type: mrr_at_100
value: 44.481
- type: mrr_at_1000
value: 44.542
- type: mrr_at_3
value: 41.29
- type: mrr_at_5
value: 42.642
- type: ndcg_at_1
value: 34.649
- type: ndcg_at_10
value: 43.161
- type: ndcg_at_100
value: 48.734
- type: ndcg_at_1000
value: 51.046
- type: ndcg_at_3
value: 39.118
- type: ndcg_at_5
value: 41.022
- type: precision_at_1
value: 34.649
- type: precision_at_10
value: 7.603
- type: precision_at_100
value: 1.209
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 18.319
- type: precision_at_5
value: 12.839
- type: recall_at_1
value: 28.345
- type: recall_at_10
value: 53.367
- type: recall_at_100
value: 76.453
- type: recall_at_1000
value: 91.82000000000001
- type: recall_at_3
value: 41.636
- type: recall_at_5
value: 46.760000000000005
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackProgrammersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.419
- type: map_at_10
value: 31.716
- type: map_at_100
value: 33.152
- type: map_at_1000
value: 33.267
- type: map_at_3
value: 28.74
- type: map_at_5
value: 30.48
- type: mrr_at_1
value: 28.310999999999996
- type: mrr_at_10
value: 37.039
- type: mrr_at_100
value: 38.09
- type: mrr_at_1000
value: 38.145
- type: mrr_at_3
value: 34.437
- type: mrr_at_5
value: 36.024
- type: ndcg_at_1
value: 28.310999999999996
- type: ndcg_at_10
value: 37.41
- type: ndcg_at_100
value: 43.647999999999996
- type: ndcg_at_1000
value: 46.007
- type: ndcg_at_3
value: 32.509
- type: ndcg_at_5
value: 34.943999999999996
- type: precision_at_1
value: 28.310999999999996
- type: precision_at_10
value: 6.963
- type: precision_at_100
value: 1.1860000000000002
- type: precision_at_1000
value: 0.154
- type: precision_at_3
value: 15.867999999999999
- type: precision_at_5
value: 11.507000000000001
- type: recall_at_1
value: 22.419
- type: recall_at_10
value: 49.28
- type: recall_at_100
value: 75.802
- type: recall_at_1000
value: 92.032
- type: recall_at_3
value: 35.399
- type: recall_at_5
value: 42.027
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.669249999999998
- type: map_at_10
value: 33.332583333333325
- type: map_at_100
value: 34.557833333333335
- type: map_at_1000
value: 34.67141666666666
- type: map_at_3
value: 30.663166666666662
- type: map_at_5
value: 32.14883333333333
- type: mrr_at_1
value: 29.193833333333334
- type: mrr_at_10
value: 37.47625
- type: mrr_at_100
value: 38.3545
- type: mrr_at_1000
value: 38.413166666666676
- type: mrr_at_3
value: 35.06741666666667
- type: mrr_at_5
value: 36.450666666666656
- type: ndcg_at_1
value: 29.193833333333334
- type: ndcg_at_10
value: 38.505416666666676
- type: ndcg_at_100
value: 43.81125
- type: ndcg_at_1000
value: 46.09558333333333
- type: ndcg_at_3
value: 33.90916666666667
- type: ndcg_at_5
value: 36.07666666666666
- type: precision_at_1
value: 29.193833333333334
- type: precision_at_10
value: 6.7251666666666665
- type: precision_at_100
value: 1.1058333333333332
- type: precision_at_1000
value: 0.14833333333333332
- type: precision_at_3
value: 15.554166666666665
- type: precision_at_5
value: 11.079250000000002
- type: recall_at_1
value: 24.669249999999998
- type: recall_at_10
value: 49.75583333333332
- type: recall_at_100
value: 73.06908333333332
- type: recall_at_1000
value: 88.91316666666667
- type: recall_at_3
value: 36.913250000000005
- type: recall_at_5
value: 42.48641666666666
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackStatsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.044999999999998
- type: map_at_10
value: 30.349999999999998
- type: map_at_100
value: 31.273
- type: map_at_1000
value: 31.362000000000002
- type: map_at_3
value: 28.508
- type: map_at_5
value: 29.369
- type: mrr_at_1
value: 26.994
- type: mrr_at_10
value: 33.12
- type: mrr_at_100
value: 33.904
- type: mrr_at_1000
value: 33.967000000000006
- type: mrr_at_3
value: 31.365
- type: mrr_at_5
value: 32.124
- type: ndcg_at_1
value: 26.994
- type: ndcg_at_10
value: 34.214
- type: ndcg_at_100
value: 38.681
- type: ndcg_at_1000
value: 40.926
- type: ndcg_at_3
value: 30.725
- type: ndcg_at_5
value: 31.967000000000002
- type: precision_at_1
value: 26.994
- type: precision_at_10
value: 5.215
- type: precision_at_100
value: 0.807
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 12.986
- type: precision_at_5
value: 8.712
- type: recall_at_1
value: 24.044999999999998
- type: recall_at_10
value: 43.456
- type: recall_at_100
value: 63.675000000000004
- type: recall_at_1000
value: 80.05499999999999
- type: recall_at_3
value: 33.561
- type: recall_at_5
value: 36.767
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackTexRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 15.672
- type: map_at_10
value: 22.641
- type: map_at_100
value: 23.75
- type: map_at_1000
value: 23.877000000000002
- type: map_at_3
value: 20.219
- type: map_at_5
value: 21.648
- type: mrr_at_1
value: 18.823
- type: mrr_at_10
value: 26.101999999999997
- type: mrr_at_100
value: 27.038
- type: mrr_at_1000
value: 27.118
- type: mrr_at_3
value: 23.669
- type: mrr_at_5
value: 25.173000000000002
- type: ndcg_at_1
value: 18.823
- type: ndcg_at_10
value: 27.176000000000002
- type: ndcg_at_100
value: 32.42
- type: ndcg_at_1000
value: 35.413
- type: ndcg_at_3
value: 22.756999999999998
- type: ndcg_at_5
value: 25.032
- type: precision_at_1
value: 18.823
- type: precision_at_10
value: 5.034000000000001
- type: precision_at_100
value: 0.895
- type: precision_at_1000
value: 0.132
- type: precision_at_3
value: 10.771
- type: precision_at_5
value: 8.1
- type: recall_at_1
value: 15.672
- type: recall_at_10
value: 37.296
- type: recall_at_100
value: 60.863
- type: recall_at_1000
value: 82.234
- type: recall_at_3
value: 25.330000000000002
- type: recall_at_5
value: 30.964000000000002
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackUnixRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.633
- type: map_at_10
value: 32.858
- type: map_at_100
value: 34.038000000000004
- type: map_at_1000
value: 34.141
- type: map_at_3
value: 30.209000000000003
- type: map_at_5
value: 31.567
- type: mrr_at_1
value: 28.358
- type: mrr_at_10
value: 36.433
- type: mrr_at_100
value: 37.352000000000004
- type: mrr_at_1000
value: 37.41
- type: mrr_at_3
value: 34.033
- type: mrr_at_5
value: 35.246
- type: ndcg_at_1
value: 28.358
- type: ndcg_at_10
value: 37.973
- type: ndcg_at_100
value: 43.411
- type: ndcg_at_1000
value: 45.747
- type: ndcg_at_3
value: 32.934999999999995
- type: ndcg_at_5
value: 35.013
- type: precision_at_1
value: 28.358
- type: precision_at_10
value: 6.418
- type: precision_at_100
value: 1.02
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 14.677000000000001
- type: precision_at_5
value: 10.335999999999999
- type: recall_at_1
value: 24.633
- type: recall_at_10
value: 50.048
- type: recall_at_100
value: 73.821
- type: recall_at_1000
value: 90.046
- type: recall_at_3
value: 36.284
- type: recall_at_5
value: 41.370000000000005
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWebmastersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.133
- type: map_at_10
value: 31.491999999999997
- type: map_at_100
value: 33.062000000000005
- type: map_at_1000
value: 33.256
- type: map_at_3
value: 28.886
- type: map_at_5
value: 30.262
- type: mrr_at_1
value: 28.063
- type: mrr_at_10
value: 36.144
- type: mrr_at_100
value: 37.14
- type: mrr_at_1000
value: 37.191
- type: mrr_at_3
value: 33.762
- type: mrr_at_5
value: 34.997
- type: ndcg_at_1
value: 28.063
- type: ndcg_at_10
value: 36.951
- type: ndcg_at_100
value: 43.287
- type: ndcg_at_1000
value: 45.777
- type: ndcg_at_3
value: 32.786
- type: ndcg_at_5
value: 34.65
- type: precision_at_1
value: 28.063
- type: precision_at_10
value: 7.055
- type: precision_at_100
value: 1.476
- type: precision_at_1000
value: 0.22899999999999998
- type: precision_at_3
value: 15.481
- type: precision_at_5
value: 11.186
- type: recall_at_1
value: 23.133
- type: recall_at_10
value: 47.285
- type: recall_at_100
value: 76.176
- type: recall_at_1000
value: 92.176
- type: recall_at_3
value: 35.223
- type: recall_at_5
value: 40.142
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWordpressRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 19.547
- type: map_at_10
value: 26.374
- type: map_at_100
value: 27.419
- type: map_at_1000
value: 27.539
- type: map_at_3
value: 23.882
- type: map_at_5
value: 25.163999999999998
- type: mrr_at_1
value: 21.442
- type: mrr_at_10
value: 28.458
- type: mrr_at_100
value: 29.360999999999997
- type: mrr_at_1000
value: 29.448999999999998
- type: mrr_at_3
value: 25.97
- type: mrr_at_5
value: 27.273999999999997
- type: ndcg_at_1
value: 21.442
- type: ndcg_at_10
value: 30.897000000000002
- type: ndcg_at_100
value: 35.99
- type: ndcg_at_1000
value: 38.832
- type: ndcg_at_3
value: 25.944
- type: ndcg_at_5
value: 28.126
- type: precision_at_1
value: 21.442
- type: precision_at_10
value: 4.9910000000000005
- type: precision_at_100
value: 0.8109999999999999
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_3
value: 11.029
- type: precision_at_5
value: 7.911
- type: recall_at_1
value: 19.547
- type: recall_at_10
value: 42.886
- type: recall_at_100
value: 66.64999999999999
- type: recall_at_1000
value: 87.368
- type: recall_at_3
value: 29.143
- type: recall_at_5
value: 34.544000000000004
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 15.572
- type: map_at_10
value: 25.312
- type: map_at_100
value: 27.062
- type: map_at_1000
value: 27.253
- type: map_at_3
value: 21.601
- type: map_at_5
value: 23.473
- type: mrr_at_1
value: 34.984
- type: mrr_at_10
value: 46.406
- type: mrr_at_100
value: 47.179
- type: mrr_at_1000
value: 47.21
- type: mrr_at_3
value: 43.485
- type: mrr_at_5
value: 45.322
- type: ndcg_at_1
value: 34.984
- type: ndcg_at_10
value: 34.344
- type: ndcg_at_100
value: 41.015
- type: ndcg_at_1000
value: 44.366
- type: ndcg_at_3
value: 29.119
- type: ndcg_at_5
value: 30.825999999999997
- type: precision_at_1
value: 34.984
- type: precision_at_10
value: 10.358
- type: precision_at_100
value: 1.762
- type: precision_at_1000
value: 0.23900000000000002
- type: precision_at_3
value: 21.368000000000002
- type: precision_at_5
value: 15.948
- type: recall_at_1
value: 15.572
- type: recall_at_10
value: 39.367999999999995
- type: recall_at_100
value: 62.183
- type: recall_at_1000
value: 80.92200000000001
- type: recall_at_3
value: 26.131999999999998
- type: recall_at_5
value: 31.635999999999996
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.848
- type: map_at_10
value: 19.25
- type: map_at_100
value: 27.193
- type: map_at_1000
value: 28.721999999999998
- type: map_at_3
value: 13.968
- type: map_at_5
value: 16.283
- type: mrr_at_1
value: 68.75
- type: mrr_at_10
value: 76.25
- type: mrr_at_100
value: 76.534
- type: mrr_at_1000
value: 76.53999999999999
- type: mrr_at_3
value: 74.667
- type: mrr_at_5
value: 75.86699999999999
- type: ndcg_at_1
value: 56.00000000000001
- type: ndcg_at_10
value: 41.426
- type: ndcg_at_100
value: 45.660000000000004
- type: ndcg_at_1000
value: 53.02
- type: ndcg_at_3
value: 46.581
- type: ndcg_at_5
value: 43.836999999999996
- type: precision_at_1
value: 68.75
- type: precision_at_10
value: 32.800000000000004
- type: precision_at_100
value: 10.440000000000001
- type: precision_at_1000
value: 1.9980000000000002
- type: precision_at_3
value: 49.667
- type: precision_at_5
value: 42.25
- type: recall_at_1
value: 8.848
- type: recall_at_10
value: 24.467
- type: recall_at_100
value: 51.344
- type: recall_at_1000
value: 75.235
- type: recall_at_3
value: 15.329
- type: recall_at_5
value: 18.892999999999997
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.95
- type: f1
value: 43.44563593360779
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 78.036
- type: map_at_10
value: 85.639
- type: map_at_100
value: 85.815
- type: map_at_1000
value: 85.829
- type: map_at_3
value: 84.795
- type: map_at_5
value: 85.336
- type: mrr_at_1
value: 84.353
- type: mrr_at_10
value: 90.582
- type: mrr_at_100
value: 90.617
- type: mrr_at_1000
value: 90.617
- type: mrr_at_3
value: 90.132
- type: mrr_at_5
value: 90.447
- type: ndcg_at_1
value: 84.353
- type: ndcg_at_10
value: 89.003
- type: ndcg_at_100
value: 89.60000000000001
- type: ndcg_at_1000
value: 89.836
- type: ndcg_at_3
value: 87.81400000000001
- type: ndcg_at_5
value: 88.478
- type: precision_at_1
value: 84.353
- type: precision_at_10
value: 10.482
- type: precision_at_100
value: 1.099
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 33.257999999999996
- type: precision_at_5
value: 20.465
- type: recall_at_1
value: 78.036
- type: recall_at_10
value: 94.517
- type: recall_at_100
value: 96.828
- type: recall_at_1000
value: 98.261
- type: recall_at_3
value: 91.12
- type: recall_at_5
value: 92.946
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 20.191
- type: map_at_10
value: 32.369
- type: map_at_100
value: 34.123999999999995
- type: map_at_1000
value: 34.317
- type: map_at_3
value: 28.71
- type: map_at_5
value: 30.607
- type: mrr_at_1
value: 40.894999999999996
- type: mrr_at_10
value: 48.842
- type: mrr_at_100
value: 49.599
- type: mrr_at_1000
value: 49.647000000000006
- type: mrr_at_3
value: 46.785
- type: mrr_at_5
value: 47.672
- type: ndcg_at_1
value: 40.894999999999996
- type: ndcg_at_10
value: 39.872
- type: ndcg_at_100
value: 46.126
- type: ndcg_at_1000
value: 49.476
- type: ndcg_at_3
value: 37.153000000000006
- type: ndcg_at_5
value: 37.433
- type: precision_at_1
value: 40.894999999999996
- type: precision_at_10
value: 10.818
- type: precision_at_100
value: 1.73
- type: precision_at_1000
value: 0.231
- type: precision_at_3
value: 25.051000000000002
- type: precision_at_5
value: 17.531
- type: recall_at_1
value: 20.191
- type: recall_at_10
value: 45.768
- type: recall_at_100
value: 68.82000000000001
- type: recall_at_1000
value: 89.133
- type: recall_at_3
value: 33.296
- type: recall_at_5
value: 38.022
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.257
- type: map_at_10
value: 61.467000000000006
- type: map_at_100
value: 62.364
- type: map_at_1000
value: 62.424
- type: map_at_3
value: 58.228
- type: map_at_5
value: 60.283
- type: mrr_at_1
value: 78.515
- type: mrr_at_10
value: 84.191
- type: mrr_at_100
value: 84.378
- type: mrr_at_1000
value: 84.385
- type: mrr_at_3
value: 83.284
- type: mrr_at_5
value: 83.856
- type: ndcg_at_1
value: 78.515
- type: ndcg_at_10
value: 69.78999999999999
- type: ndcg_at_100
value: 72.886
- type: ndcg_at_1000
value: 74.015
- type: ndcg_at_3
value: 65.23
- type: ndcg_at_5
value: 67.80199999999999
- type: precision_at_1
value: 78.515
- type: precision_at_10
value: 14.519000000000002
- type: precision_at_100
value: 1.694
- type: precision_at_1000
value: 0.184
- type: precision_at_3
value: 41.702
- type: precision_at_5
value: 27.046999999999997
- type: recall_at_1
value: 39.257
- type: recall_at_10
value: 72.59299999999999
- type: recall_at_100
value: 84.679
- type: recall_at_1000
value: 92.12
- type: recall_at_3
value: 62.552
- type: recall_at_5
value: 67.616
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 91.5152
- type: ap
value: 87.64584669595709
- type: f1
value: 91.50605576428437
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.926000000000002
- type: map_at_10
value: 34.049
- type: map_at_100
value: 35.213
- type: map_at_1000
value: 35.265
- type: map_at_3
value: 30.309
- type: map_at_5
value: 32.407000000000004
- type: mrr_at_1
value: 22.55
- type: mrr_at_10
value: 34.657
- type: mrr_at_100
value: 35.760999999999996
- type: mrr_at_1000
value: 35.807
- type: mrr_at_3
value: 30.989
- type: mrr_at_5
value: 33.039
- type: ndcg_at_1
value: 22.55
- type: ndcg_at_10
value: 40.842
- type: ndcg_at_100
value: 46.436
- type: ndcg_at_1000
value: 47.721999999999994
- type: ndcg_at_3
value: 33.209
- type: ndcg_at_5
value: 36.943
- type: precision_at_1
value: 22.55
- type: precision_at_10
value: 6.447
- type: precision_at_100
value: 0.9249999999999999
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 14.136000000000001
- type: precision_at_5
value: 10.381
- type: recall_at_1
value: 21.926000000000002
- type: recall_at_10
value: 61.724999999999994
- type: recall_at_100
value: 87.604
- type: recall_at_1000
value: 97.421
- type: recall_at_3
value: 40.944
- type: recall_at_5
value: 49.915
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.54765161878704
- type: f1
value: 93.3298945415573
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 75.71591427268582
- type: f1
value: 59.32113870474471
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 75.83053127101547
- type: f1
value: 73.60757944876475
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 78.72562205783457
- type: f1
value: 78.63761662505502
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.37935633767996
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 31.55270546130387
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.462692753143834
- type: mrr
value: 31.497569753511563
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.646
- type: map_at_10
value: 12.498
- type: map_at_100
value: 15.486
- type: map_at_1000
value: 16.805999999999997
- type: map_at_3
value: 9.325
- type: map_at_5
value: 10.751
- type: mrr_at_1
value: 43.034
- type: mrr_at_10
value: 52.662
- type: mrr_at_100
value: 53.189
- type: mrr_at_1000
value: 53.25
- type: mrr_at_3
value: 50.929
- type: mrr_at_5
value: 51.92
- type: ndcg_at_1
value: 41.796
- type: ndcg_at_10
value: 33.477000000000004
- type: ndcg_at_100
value: 29.996000000000002
- type: ndcg_at_1000
value: 38.864
- type: ndcg_at_3
value: 38.940000000000005
- type: ndcg_at_5
value: 36.689
- type: precision_at_1
value: 43.034
- type: precision_at_10
value: 24.799
- type: precision_at_100
value: 7.432999999999999
- type: precision_at_1000
value: 1.9929999999999999
- type: precision_at_3
value: 36.842000000000006
- type: precision_at_5
value: 32.135999999999996
- type: recall_at_1
value: 5.646
- type: recall_at_10
value: 15.963
- type: recall_at_100
value: 29.492
- type: recall_at_1000
value: 61.711000000000006
- type: recall_at_3
value: 10.585
- type: recall_at_5
value: 12.753999999999998
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.602
- type: map_at_10
value: 41.545
- type: map_at_100
value: 42.644999999999996
- type: map_at_1000
value: 42.685
- type: map_at_3
value: 37.261
- type: map_at_5
value: 39.706
- type: mrr_at_1
value: 31.141000000000002
- type: mrr_at_10
value: 44.139
- type: mrr_at_100
value: 44.997
- type: mrr_at_1000
value: 45.025999999999996
- type: mrr_at_3
value: 40.503
- type: mrr_at_5
value: 42.64
- type: ndcg_at_1
value: 31.141000000000002
- type: ndcg_at_10
value: 48.995
- type: ndcg_at_100
value: 53.788000000000004
- type: ndcg_at_1000
value: 54.730000000000004
- type: ndcg_at_3
value: 40.844
- type: ndcg_at_5
value: 44.955
- type: precision_at_1
value: 31.141000000000002
- type: precision_at_10
value: 8.233
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_3
value: 18.579
- type: precision_at_5
value: 13.533999999999999
- type: recall_at_1
value: 27.602
- type: recall_at_10
value: 69.216
- type: recall_at_100
value: 90.252
- type: recall_at_1000
value: 97.27
- type: recall_at_3
value: 47.987
- type: recall_at_5
value: 57.438
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.949
- type: map_at_10
value: 84.89999999999999
- type: map_at_100
value: 85.531
- type: map_at_1000
value: 85.548
- type: map_at_3
value: 82.027
- type: map_at_5
value: 83.853
- type: mrr_at_1
value: 81.69999999999999
- type: mrr_at_10
value: 87.813
- type: mrr_at_100
value: 87.917
- type: mrr_at_1000
value: 87.91799999999999
- type: mrr_at_3
value: 86.938
- type: mrr_at_5
value: 87.53999999999999
- type: ndcg_at_1
value: 81.75
- type: ndcg_at_10
value: 88.55499999999999
- type: ndcg_at_100
value: 89.765
- type: ndcg_at_1000
value: 89.871
- type: ndcg_at_3
value: 85.905
- type: ndcg_at_5
value: 87.41
- type: precision_at_1
value: 81.75
- type: precision_at_10
value: 13.403
- type: precision_at_100
value: 1.528
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.597
- type: precision_at_5
value: 24.69
- type: recall_at_1
value: 70.949
- type: recall_at_10
value: 95.423
- type: recall_at_100
value: 99.509
- type: recall_at_1000
value: 99.982
- type: recall_at_3
value: 87.717
- type: recall_at_5
value: 92.032
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 51.76962893449579
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 62.32897690686379
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.478
- type: map_at_10
value: 11.994
- type: map_at_100
value: 13.977
- type: map_at_1000
value: 14.295
- type: map_at_3
value: 8.408999999999999
- type: map_at_5
value: 10.024
- type: mrr_at_1
value: 22.1
- type: mrr_at_10
value: 33.526
- type: mrr_at_100
value: 34.577000000000005
- type: mrr_at_1000
value: 34.632000000000005
- type: mrr_at_3
value: 30.217
- type: mrr_at_5
value: 31.962000000000003
- type: ndcg_at_1
value: 22.1
- type: ndcg_at_10
value: 20.191
- type: ndcg_at_100
value: 27.954
- type: ndcg_at_1000
value: 33.491
- type: ndcg_at_3
value: 18.787000000000003
- type: ndcg_at_5
value: 16.378999999999998
- type: precision_at_1
value: 22.1
- type: precision_at_10
value: 10.69
- type: precision_at_100
value: 2.1919999999999997
- type: precision_at_1000
value: 0.35200000000000004
- type: precision_at_3
value: 17.732999999999997
- type: precision_at_5
value: 14.499999999999998
- type: recall_at_1
value: 4.478
- type: recall_at_10
value: 21.657
- type: recall_at_100
value: 44.54
- type: recall_at_1000
value: 71.542
- type: recall_at_3
value: 10.778
- type: recall_at_5
value: 14.687
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 82.82325259156718
- type: cos_sim_spearman
value: 79.2463589100662
- type: euclidean_pearson
value: 80.48318380496771
- type: euclidean_spearman
value: 79.34451935199979
- type: manhattan_pearson
value: 80.39041824178759
- type: manhattan_spearman
value: 79.23002892700211
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 85.74130231431258
- type: cos_sim_spearman
value: 78.36856568042397
- type: euclidean_pearson
value: 82.48301631890303
- type: euclidean_spearman
value: 78.28376980722732
- type: manhattan_pearson
value: 82.43552075450525
- type: manhattan_spearman
value: 78.22702443947126
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 79.96138619461459
- type: cos_sim_spearman
value: 81.85436343502379
- type: euclidean_pearson
value: 81.82895226665367
- type: euclidean_spearman
value: 82.22707349602916
- type: manhattan_pearson
value: 81.66303369445873
- type: manhattan_spearman
value: 82.05030197179455
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 80.05481244198648
- type: cos_sim_spearman
value: 80.85052504637808
- type: euclidean_pearson
value: 80.86728419744497
- type: euclidean_spearman
value: 81.033786401512
- type: manhattan_pearson
value: 80.90107531061103
- type: manhattan_spearman
value: 81.11374116827795
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 84.615220756399
- type: cos_sim_spearman
value: 86.46858500002092
- type: euclidean_pearson
value: 86.08307800247586
- type: euclidean_spearman
value: 86.72691443870013
- type: manhattan_pearson
value: 85.96155594487269
- type: manhattan_spearman
value: 86.605909505275
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 82.14363913634436
- type: cos_sim_spearman
value: 84.48430226487102
- type: euclidean_pearson
value: 83.75303424801902
- type: euclidean_spearman
value: 84.56762380734538
- type: manhattan_pearson
value: 83.6135447165928
- type: manhattan_spearman
value: 84.39898212616731
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 85.09909252554525
- type: cos_sim_spearman
value: 85.70951402743276
- type: euclidean_pearson
value: 87.1991936239908
- type: euclidean_spearman
value: 86.07745840612071
- type: manhattan_pearson
value: 87.25039137549952
- type: manhattan_spearman
value: 85.99938746659761
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 63.529332093413615
- type: cos_sim_spearman
value: 65.38177340147439
- type: euclidean_pearson
value: 66.35278011412136
- type: euclidean_spearman
value: 65.47147267032997
- type: manhattan_pearson
value: 66.71804682408693
- type: manhattan_spearman
value: 65.67406521423597
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 82.45802942885662
- type: cos_sim_spearman
value: 84.8853341842566
- type: euclidean_pearson
value: 84.60915021096707
- type: euclidean_spearman
value: 85.11181242913666
- type: manhattan_pearson
value: 84.38600521210364
- type: manhattan_spearman
value: 84.89045417981723
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 85.92793380635129
- type: mrr
value: 95.85834191226348
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 55.74400000000001
- type: map_at_10
value: 65.455
- type: map_at_100
value: 66.106
- type: map_at_1000
value: 66.129
- type: map_at_3
value: 62.719
- type: map_at_5
value: 64.441
- type: mrr_at_1
value: 58.667
- type: mrr_at_10
value: 66.776
- type: mrr_at_100
value: 67.363
- type: mrr_at_1000
value: 67.384
- type: mrr_at_3
value: 64.889
- type: mrr_at_5
value: 66.122
- type: ndcg_at_1
value: 58.667
- type: ndcg_at_10
value: 69.904
- type: ndcg_at_100
value: 72.807
- type: ndcg_at_1000
value: 73.423
- type: ndcg_at_3
value: 65.405
- type: ndcg_at_5
value: 67.86999999999999
- type: precision_at_1
value: 58.667
- type: precision_at_10
value: 9.3
- type: precision_at_100
value: 1.08
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 25.444
- type: precision_at_5
value: 17
- type: recall_at_1
value: 55.74400000000001
- type: recall_at_10
value: 82.122
- type: recall_at_100
value: 95.167
- type: recall_at_1000
value: 100
- type: recall_at_3
value: 70.14399999999999
- type: recall_at_5
value: 76.417
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.86534653465347
- type: cos_sim_ap
value: 96.54142419791388
- type: cos_sim_f1
value: 93.07535641547861
- type: cos_sim_precision
value: 94.81327800829875
- type: cos_sim_recall
value: 91.4
- type: dot_accuracy
value: 99.86435643564356
- type: dot_ap
value: 96.53682260449868
- type: dot_f1
value: 92.98515104966718
- type: dot_precision
value: 95.27806925498426
- type: dot_recall
value: 90.8
- type: euclidean_accuracy
value: 99.86336633663366
- type: euclidean_ap
value: 96.5228676185697
- type: euclidean_f1
value: 92.9735234215886
- type: euclidean_precision
value: 94.70954356846472
- type: euclidean_recall
value: 91.3
- type: manhattan_accuracy
value: 99.85841584158416
- type: manhattan_ap
value: 96.50392760934032
- type: manhattan_f1
value: 92.84642321160581
- type: manhattan_precision
value: 92.8928928928929
- type: manhattan_recall
value: 92.80000000000001
- type: max_accuracy
value: 99.86534653465347
- type: max_ap
value: 96.54142419791388
- type: max_f1
value: 93.07535641547861
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 61.08285408766616
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 35.640675309010604
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 53.20333913710715
- type: mrr
value: 54.088813555725324
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.79465221925075
- type: cos_sim_spearman
value: 30.530816059163634
- type: dot_pearson
value: 31.364837244718043
- type: dot_spearman
value: 30.79726823684003
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.22599999999999998
- type: map_at_10
value: 1.735
- type: map_at_100
value: 8.978
- type: map_at_1000
value: 20.851
- type: map_at_3
value: 0.613
- type: map_at_5
value: 0.964
- type: mrr_at_1
value: 88
- type: mrr_at_10
value: 92.867
- type: mrr_at_100
value: 92.867
- type: mrr_at_1000
value: 92.867
- type: mrr_at_3
value: 92.667
- type: mrr_at_5
value: 92.667
- type: ndcg_at_1
value: 82
- type: ndcg_at_10
value: 73.164
- type: ndcg_at_100
value: 51.878
- type: ndcg_at_1000
value: 44.864
- type: ndcg_at_3
value: 79.184
- type: ndcg_at_5
value: 76.39
- type: precision_at_1
value: 88
- type: precision_at_10
value: 76.2
- type: precision_at_100
value: 52.459999999999994
- type: precision_at_1000
value: 19.692
- type: precision_at_3
value: 82.667
- type: precision_at_5
value: 80
- type: recall_at_1
value: 0.22599999999999998
- type: recall_at_10
value: 1.942
- type: recall_at_100
value: 12.342
- type: recall_at_1000
value: 41.42
- type: recall_at_3
value: 0.637
- type: recall_at_5
value: 1.034
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 3.567
- type: map_at_10
value: 13.116
- type: map_at_100
value: 19.39
- type: map_at_1000
value: 20.988
- type: map_at_3
value: 7.109
- type: map_at_5
value: 9.950000000000001
- type: mrr_at_1
value: 42.857
- type: mrr_at_10
value: 57.404999999999994
- type: mrr_at_100
value: 58.021
- type: mrr_at_1000
value: 58.021
- type: mrr_at_3
value: 54.762
- type: mrr_at_5
value: 56.19
- type: ndcg_at_1
value: 38.775999999999996
- type: ndcg_at_10
value: 30.359
- type: ndcg_at_100
value: 41.284
- type: ndcg_at_1000
value: 52.30200000000001
- type: ndcg_at_3
value: 36.744
- type: ndcg_at_5
value: 34.326
- type: precision_at_1
value: 42.857
- type: precision_at_10
value: 26.122
- type: precision_at_100
value: 8.082
- type: precision_at_1000
value: 1.559
- type: precision_at_3
value: 40.136
- type: precision_at_5
value: 35.510000000000005
- type: recall_at_1
value: 3.567
- type: recall_at_10
value: 19.045
- type: recall_at_100
value: 49.979
- type: recall_at_1000
value: 84.206
- type: recall_at_3
value: 8.52
- type: recall_at_5
value: 13.103000000000002
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 68.8394
- type: ap
value: 13.454399712443099
- type: f1
value: 53.04963076364322
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 60.546123372948514
- type: f1
value: 60.86952793277713
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.10042955060234
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 85.03308100375514
- type: cos_sim_ap
value: 71.08284605869684
- type: cos_sim_f1
value: 65.42539436255494
- type: cos_sim_precision
value: 64.14807302231237
- type: cos_sim_recall
value: 66.75461741424802
- type: dot_accuracy
value: 84.68736961316088
- type: dot_ap
value: 69.20524036530992
- type: dot_f1
value: 63.54893953365829
- type: dot_precision
value: 63.45698500394633
- type: dot_recall
value: 63.641160949868066
- type: euclidean_accuracy
value: 85.07480479227513
- type: euclidean_ap
value: 71.14592761009864
- type: euclidean_f1
value: 65.43814432989691
- type: euclidean_precision
value: 63.95465994962216
- type: euclidean_recall
value: 66.99208443271768
- type: manhattan_accuracy
value: 85.06288370984085
- type: manhattan_ap
value: 71.07289742593868
- type: manhattan_f1
value: 65.37585421412301
- type: manhattan_precision
value: 62.816147859922175
- type: manhattan_recall
value: 68.15303430079156
- type: max_accuracy
value: 85.07480479227513
- type: max_ap
value: 71.14592761009864
- type: max_f1
value: 65.43814432989691
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 87.79058485659952
- type: cos_sim_ap
value: 83.7183187008759
- type: cos_sim_f1
value: 75.86921142180798
- type: cos_sim_precision
value: 73.00683371298405
- type: cos_sim_recall
value: 78.96519864490298
- type: dot_accuracy
value: 87.0085768618776
- type: dot_ap
value: 81.87467488474279
- type: dot_f1
value: 74.04188363990559
- type: dot_precision
value: 72.10507114191901
- type: dot_recall
value: 76.08561749307053
- type: euclidean_accuracy
value: 87.8332751193387
- type: euclidean_ap
value: 83.83585648120315
- type: euclidean_f1
value: 76.02582177042369
- type: euclidean_precision
value: 73.36388371759989
- type: euclidean_recall
value: 78.88820449645827
- type: manhattan_accuracy
value: 87.87208444910156
- type: manhattan_ap
value: 83.8101950642973
- type: manhattan_f1
value: 75.90454195535027
- type: manhattan_precision
value: 72.44419564761039
- type: manhattan_recall
value: 79.71204188481676
- type: max_accuracy
value: 87.87208444910156
- type: max_ap
value: 83.83585648120315
- type: max_f1
value: 76.02582177042369
license: mit
language:
- en
---
**We recommend switching to the newest [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5), which has a more reasonable similarity distribution and the same method of usage.**
<h1 align="center">FlagEmbedding</h1>
<h4 align="center">
<p>
<a href="#model-list">Model List</a> |
<a href="#frequently-asked-questions">FAQ</a> |
<a href="#usage">Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
</p>
</h4>
For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, or semantic search.
It can also be used in vector databases for LLMs.
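The retrieval use case above boils down to embedding texts as dense vectors and ranking passages by cosine similarity to the query. The following stdlib-only sketch illustrates just the ranking step; the toy vectors stand in for real BGE embeddings, which a model such as `bge-small-en` would produce:

```python
import math

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_by_similarity(query_vec, passage_vecs):
    """Return passage indices sorted by similarity to the query, best first."""
    scores = [cosine(query_vec, p) for p in passage_vecs]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)

# Toy 2-d vectors standing in for real embeddings:
q = [1.0, 0.0]
passages = [[0.0, 1.0], [0.9, 0.1], [0.5, 0.5]]
print(rank_by_similarity(q, passages))  # index of the most similar passage first
```

In a vector database the same cosine ranking is typically performed with an approximate nearest-neighbor index rather than the exhaustive loop shown here.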
************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
- **New reranker models**: release the cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than embedding models. We recommend using or fine-tuning them to re-rank the top-k documents returned by embedding models.
- **Updated embedding models**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance retrieval ability without instructions.
<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update the [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding an instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **LangChain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, with the **best performance among models of the same size 🤗**
- 08/02/2023: Release the `bge-large-*` (short for BAAI General Embedding) models, which **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.
</details>
## Model List
`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |
[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed — just use the original query directly. In all cases, **no instruction** needs to be added to passages.
[2\]: Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, a cross-encoder is widely used to re-rank the top-k documents retrieved by simpler models.
For example, use a bge embedding model to retrieve the top 100 relevant documents, then use a bge reranker to re-rank those 100 documents and obtain the final top-3 results.
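The retrieve-then-rerank flow described above can be sketched as plain control flow. Here `embed_score` and `rerank_score` are hypothetical stand-ins (not FlagEmbedding APIs) for the bi-encoder similarity and the cross-encoder score, used only to illustrate the two-stage pipeline:

```python
# Illustrative retrieve-then-rerank pipeline; embed_score and
# rerank_score are hypothetical stand-ins for the bi-encoder
# similarity and the cross-encoder score, respectively.

def retrieve_then_rerank(query, corpus, embed_score, rerank_score,
                         retrieve_k=100, final_k=3):
    # Stage 1: cheap bi-encoder retrieval of the top-k candidates.
    candidates = sorted(corpus, key=lambda d: embed_score(query, d),
                        reverse=True)[:retrieve_k]
    # Stage 2: expensive cross-encoder re-ranking of those candidates only.
    reranked = sorted(candidates, key=lambda d: rerank_score(query, d),
                      reverse=True)
    return reranked[:final_k]

# Toy scoring tables for demonstration.
corpus = ["doc_a", "doc_b", "doc_c", "doc_d"]
embed = {"doc_a": 0.9, "doc_b": 0.8, "doc_c": 0.7, "doc_d": 0.1}
rerank = {"doc_a": 0.2, "doc_b": 0.9, "doc_c": 0.8, "doc_d": 0.0}
top = retrieve_then_rerank("q", corpus,
                           lambda q, d: embed[d],
                           lambda q, d: rerank[d],
                           retrieve_k=3, final_k=2)
print(top)  # ['doc_b', 'doc_c']
```

Note that `doc_a` is the best bi-encoder candidate but is demoted by the reranker, which is exactly the correction this second stage is for.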
All models have been uploaded to the Huggingface Hub, and you can find them at https://huggingface.co/BAAI.
If you cannot access the Huggingface Hub, you can also download the models at https://model.baai.ac.cn/models .
## Frequently asked questions
<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your own data, the pre-trained model cannot be used directly to calculate similarity; it must be fine-tuned with contrastive learning first.
- If the accuracy of the fine-tuned model is still not high enough, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.
</details>
<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates this issue with the similarity distribution.**
Since we fine-tune the models with contrastive learning at a temperature of 0.01,
the similarity scores of the current BGE models mostly fall in the interval \[0.6, 1\].
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity,
**what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold,
please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).
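As a minimal sketch of such threshold-based filtering — with toy 2-D vectors standing in for real BGE embeddings — the point is that the cutoff is chosen from your own score distribution, not a fixed 0.5:

```python
import numpy as np

# Toy "embeddings" standing in for real BGE outputs; after L2
# normalization the dot product equals cosine similarity.
emb = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)

query = emb[0]
scores = emb @ query          # cosine similarities to the first vector

threshold = 0.85              # chosen from the score distribution on your data
similar = [i for i, s in enumerate(scores) if s >= threshold and i != 0]
print(similar)  # [1]
```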
</details>
<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->
For `bge-*-v1.5`, we improved the retrieval ability when no instruction is used.
Omitting the instruction causes only a slight degradation in retrieval performance compared with using it,
so for convenience you can generate embeddings without an instruction in all cases.
For a retrieval task that uses short queries to find long related documents,
it is recommended to add instructions for these short queries.
**The best method to decide whether to add instructions for queries is choosing the setting that achieves better performance on your task.**
In all cases, no instruction needs to be added to the documents/passages.
</details>
## Usage
### Usage for Embedding Model
Here are some examples for using `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If that doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for other ways to install it.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# for an s2p (short query to long passage) retrieval task, we suggest using encode_queries(), which automatically adds the instruction to each query
# the corpus in a retrieval task can still use encode() or encode_corpus(), since no instruction is needed for passages
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).
By default, FlagModel uses all available GPUs when encoding. Set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs,
or set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
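For example, to restrict encoding to a single GPU, set the environment variable before FlagEmbedding (and torch) are imported, since CUDA device visibility is read at import time:

```python
import os

# Make only GPU 0 visible; an empty string ("") would hide all GPUs
# and force CPU-only execution.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

# FlagEmbedding / torch must be imported AFTER this assignment
# for the setting to take effect.
```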
#### Using Sentence-Transformers
You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For an s2p (short query to long passage) retrieval task,
each short query should start with an instruction (see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for the instructions),
but the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```
#### Using Langchain
You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs,
query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```
#### Using HuggingFace Transformers
With the transformers package, you can use the model like this: first pass your input through the transformer model, then select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for an s2p (short query to long passage) retrieval task, add an instruction to each query (do not add instructions to passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```
### Usage for Reranker
Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
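Because the raw reranker score is an unbounded logit, a common post-processing option — not part of the model itself — is to pass it through a sigmoid when you need scores in a fixed (0, 1) range, e.g. to apply a cutoff. The example logit values below are made up for illustration:

```python
import math

def sigmoid(x):
    # Maps an unbounded reranker logit to (0, 1); the relative
    # order of the scores is preserved.
    return 1.0 / (1.0 + math.exp(-x))

raw_scores = [-5.6, 0.0, 4.2]   # hypothetical raw reranker logits
normalized = [sigmoid(s) for s in raw_scores]
print(normalized)
```

Since sigmoid is monotonic, re-ranking by normalized scores gives the same order as ranking by raw logits; the transform only helps when you need an absolute threshold.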
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
score = reranker.compute_score(['query', 'passage'])
print(score)
scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```
#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```
## Evaluation
`baai-general-embedding` models achieve **state-of-the-art performance on both MTEB and C-MTEB leaderboard!**
For more details and evaluation tools see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:
| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) |Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 |51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024| 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |
- **C-MTEB**:
We created the benchmark C-MTEB for Chinese text embeddings, which consists of 31 datasets across 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |
- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |
\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks
## Train
### BAAI Embedding
We pre-train the models with [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale pair data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned first.
For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).
### BGE Reranker
A cross-encoder performs full attention over the input pair,
which is more accurate than an embedding model (i.e., a bi-encoder) but also more time-consuming.
Therefore, it can be used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data.
The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)
## Contact
If you have any questions or suggestions about this project, feel free to open an issue or pull request.
You can also email Shitao Xiao (stxiao@baai.ac.cn) and Zheng Liu (liuzheng@baai.ac.cn).
## Citation
If you find this repository useful, please consider giving it a star :star: and a citation.
```
@misc{bge_embedding,
title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
year={2023},
eprint={2309.07597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
-0.0217132568359375,
-0.020355224609375,
0.049346923828125,
-0.0057220458984375,
-0.0701904296875,
0.02392578125,
-0.0063018798828125,
-0.005176544189453125,
-0.00009685754776000977,
-0.017913818359375,
-0.06341552734375,
0.00821685791015625,
0.04461669921875,
0.0199432373046875,
-0.06646728515625,
-0.035736083984375,
0.00487518310546875,
-0.0208282470703125,
-0.01222991943359375,
0.01238250732421875,
-0.030792236328125,
0.0267486572265625,
0.047637939453125,
0.058807373046875,
0.05242919921875,
-0.005611419677734375,
0.01458740234375,
-0.045501708984375,
-0.00370025634765625,
-0.004169464111328125,
0.05419921875,
0.0280914306640625,
-0.0227508544921875,
0.0694580078125,
0.0167236328125,
-0.03118896484375,
-0.0567626953125,
0.00275421142578125,
-0.0810546875,
-0.025726318359375,
0.0850830078125,
-0.02978515625,
-0.019378662109375,
0.0239715576171875,
-0.0162811279296875,
0.039306640625,
-0.035797119140625,
0.0380859375,
0.0606689453125,
0.033843994140625,
-0.01268768310546875,
-0.06292724609375,
0.0253143310546875,
0.049346923828125,
-0.0200653076171875,
-0.026702880859375,
0.0254058837890625,
0.03717041015625,
0.016204833984375,
0.01146697998046875,
-0.016998291015625,
0.02374267578125,
-0.00565338134765625,
-0.0009703636169433594,
-0.00939178466796875,
0.01485443115234375,
-0.014801025390625,
-0.0016078948974609375,
-0.01154327392578125,
-0.02191162109375
]
] |
google/byt5-large | 2023-01-24T16:36:56.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"multilingual",
"af",
"am",
"ar",
"az",
"be",
"bg",
"bn",
"ca",
"ceb",
"co",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fil",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"haw",
"hi",
"hmn",
"ht",
"hu",
"hy",
"ig",
"is",
"it",
"iw",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lb",
"lo",
"lt",
"lv",
"mg",
"mi",
"mk",
"ml",
"mn",
"mr",
"ms",
"mt",
"my",
"ne",
"nl",
"no",
"ny",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"sm",
"sn",
"so",
"sq",
"sr",
"st",
"su",
"sv",
"sw",
"ta",
"te",
"tg",
"th",
"tr",
"uk",
"und",
"ur",
"uz",
"vi",
"xh",
"yi",
"yo",
"zh",
"zu",
"dataset:mc4",
"arxiv:1907.06292",
"arxiv:2105.13626",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/byt5-large | 5 | 78,911 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- af
- am
- ar
- az
- be
- bg
- bn
- ca
- ceb
- co
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fil
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- haw
- hi
- hmn
- ht
- hu
- hy
- ig
- is
- it
- iw
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lb
- lo
- lt
- lv
- mg
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- no
- ny
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- st
- su
- sv
- sw
- ta
- te
- tg
- th
- tr
- uk
- und
- ur
- uz
- vi
- xh
- yi
- yo
- zh
- zu
datasets:
- mc4
license: apache-2.0
---
# ByT5 - large
ByT5 is a tokenizer-free version of [Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and generally follows the architecture of [MT5](https://huggingface.co/google/mt5-large).
ByT5 was pre-trained only on [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual), with no supervised training, using an average span mask of 20 UTF-8 characters. The model therefore has to be fine-tuned before it is usable on a downstream task.
ByT5 works especially well on noisy text data, *e.g.*, `google/byt5-large` significantly outperforms [mt5-large](https://huggingface.co/google/mt5-large) on [TweetQA](https://arxiv.org/abs/1907.06292).
Paper: [ByT5: Towards a token-free future with pre-trained byte-to-byte models](https://arxiv.org/abs/2105.13626)
Authors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*
## Example Inference
ByT5 works on raw UTF-8 bytes and can be used without a tokenizer:
```python
from transformers import T5ForConditionalGeneration
import torch
model = T5ForConditionalGeneration.from_pretrained('google/byt5-large')
input_ids = torch.tensor([list("Life is like a box of chocolates.".encode("utf-8"))]) + 3 # add 3 for special tokens
labels = torch.tensor([list("La vie est comme une boîte de chocolat.".encode("utf-8"))]) + 3 # add 3 for special tokens
loss = model(input_ids, labels=labels).loss # forward pass
```
For batched inference and training, however, it is recommended to use a tokenizer class for padding:
```python
from transformers import T5ForConditionalGeneration, AutoTokenizer
model = T5ForConditionalGeneration.from_pretrained('google/byt5-large')
tokenizer = AutoTokenizer.from_pretrained('google/byt5-large')
model_inputs = tokenizer(["Life is like a box of chocolates.", "Today is Monday."], padding="longest", return_tensors="pt")
labels = tokenizer(["La vie est comme une boîte de chocolat.", "Aujourd'hui c'est lundi."], padding="longest", return_tensors="pt").input_ids
loss = model(**model_inputs, labels=labels).loss # forward pass
```
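Because ByT5's vocabulary is essentially the 256 byte values shifted by 3 to make room for the special tokens (`<pad>`=0, `</s>`=1, `<unk>`=2), IDs can also be mapped back to text without a tokenizer. The helpers below are an illustrative sketch of that mapping (the function names are ours, not part of the library):

```python
def text_to_byt5_ids(text: str) -> list[int]:
    # Shift every UTF-8 byte by 3 to skip the special tokens <pad>, </s>, <unk>.
    return [b + 3 for b in text.encode("utf-8")]

def byt5_ids_to_text(ids: list[int]) -> str:
    # Drop special-token IDs (< 3), undo the +3 offset, then decode the bytes.
    return bytes(i - 3 for i in ids if i >= 3).decode("utf-8", errors="ignore")

ids = text_to_byt5_ids("Life is like a box of chocolates.")
# Round-trips even with a trailing </s> (ID 1) appended.
assert byt5_ids_to_text(ids + [1]) == "Life is like a box of chocolates."
```

The same decoding logic can be applied to the IDs returned by `model.generate` when no tokenizer object is at hand.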
## Abstract
Most widely-used pre-trained language models operate on sequences of tokens corresponding to word or subword units. Encoding text as a sequence of tokens requires a tokenizer, which is typically created as an independent artifact from the model. Token-free models that instead operate directly on raw text (bytes or characters) have many benefits: they can process text in any language out of the box, they are more robust to noise, and they minimize technical debt by removing complex and error-prone text preprocessing pipelines. Since byte or character sequences are longer than token sequences, past work on token-free models has often introduced new model architectures designed to amortize the cost of operating directly on raw text. In this paper, we show that a standard Transformer architecture can be used with minimal modifications to process byte sequences. We carefully characterize the trade-offs in terms of parameter count, training FLOPs, and inference speed, and show that byte-level models are competitive with their token-level counterparts. We also demonstrate that byte-level models are significantly more robust to noise and perform better on tasks that are sensitive to spelling and pronunciation. As part of our contribution, we release a new set of pre-trained byte-level Transformer models based on the T5 architecture, as well as all code and data used in our experiments.
 | 4,229 | [
[
-0.0206146240234375,
-0.02288818359375,
0.02252197265625,
0.013519287109375,
-0.02801513671875,
-0.0009541511535644531,
-0.0244903564453125,
-0.047332763671875,
0.0011701583862304688,
0.0251922607421875,
-0.0394287109375,
-0.037872314453125,
-0.05908203125,
0.018035888671875,
-0.037445068359375,
0.078857421875,
-0.00293731689453125,
0.006153106689453125,
0.0157012939453125,
-0.0117950439453125,
-0.022064208984375,
-0.044647216796875,
-0.051239013671875,
-0.0282440185546875,
0.04296875,
0.01983642578125,
0.035064697265625,
0.054351806640625,
0.0390625,
0.0286407470703125,
-0.005611419677734375,
-0.0065155029296875,
-0.030029296875,
-0.01090240478515625,
-0.0030498504638671875,
-0.0438232421875,
-0.036590576171875,
-0.0014543533325195312,
0.03765869140625,
0.04449462890625,
0.00978851318359375,
0.029205322265625,
-0.01219940185546875,
0.0236968994140625,
-0.046142578125,
0.00814056396484375,
-0.051239013671875,
0.0094757080078125,
-0.01708984375,
0.0099029541015625,
-0.050750732421875,
-0.00943756103515625,
0.0023860931396484375,
-0.040924072265625,
0.040924072265625,
0.00777435302734375,
0.07720947265625,
0.01708984375,
-0.0289459228515625,
-0.017822265625,
-0.048004150390625,
0.0777587890625,
-0.051177978515625,
0.056671142578125,
0.0104522705078125,
0.0138702392578125,
-0.0035305023193359375,
-0.092041015625,
-0.04608154296875,
0.004497528076171875,
-0.006336212158203125,
0.0167236328125,
-0.01546478271484375,
0.0159149169921875,
0.028564453125,
0.035430908203125,
-0.04327392578125,
0.00994110107421875,
-0.0543212890625,
-0.011962890625,
0.0296478271484375,
-0.0121917724609375,
0.01947021484375,
-0.0196990966796875,
-0.035614013671875,
-0.01019287109375,
-0.044219970703125,
0.00751495361328125,
0.0136260986328125,
0.016754150390625,
-0.01214599609375,
0.0198822021484375,
-0.003021240234375,
0.02508544921875,
0.01397705078125,
0.00000864267349243164,
0.0278778076171875,
-0.0212249755859375,
-0.0263214111328125,
0.0194854736328125,
0.0704345703125,
0.00855255126953125,
0.025909423828125,
-0.042144775390625,
-0.028900146484375,
0.00756072998046875,
0.02313232421875,
-0.09716796875,
-0.01364898681640625,
0.031402587890625,
-0.0494384765625,
-0.0251922607421875,
-0.003314971923828125,
-0.0390625,
-0.0084686279296875,
0.00335693359375,
0.0467529296875,
-0.041259765625,
-0.004199981689453125,
0.0218048095703125,
-0.0182647705078125,
0.01532745361328125,
0.0022258758544921875,
-0.09600830078125,
0.011322021484375,
0.032073974609375,
0.058807373046875,
-0.00878143310546875,
-0.0190887451171875,
-0.03448486328125,
0.0007395744323730469,
-0.0285186767578125,
0.042724609375,
-0.0217742919921875,
-0.038330078125,
-0.0223541259765625,
0.01568603515625,
-0.0007271766662597656,
-0.034271240234375,
0.0650634765625,
-0.0452880859375,
0.03277587890625,
-0.015716552734375,
-0.04327392578125,
-0.01409912109375,
0.00904083251953125,
-0.051239013671875,
0.06298828125,
-0.0010776519775390625,
-0.06292724609375,
0.052703857421875,
-0.0654296875,
-0.0218048095703125,
-0.0032501220703125,
0.015594482421875,
-0.049285888671875,
0.004791259765625,
0.0294647216796875,
0.038604736328125,
-0.019012451171875,
0.013336181640625,
-0.0232696533203125,
-0.04302978515625,
0.01242828369140625,
-0.047454833984375,
0.06500244140625,
0.0246429443359375,
-0.030609130859375,
0.0239410400390625,
-0.0711669921875,
0.0017337799072265625,
0.0012598037719726562,
-0.036651611328125,
0.00508880615234375,
-0.00887298583984375,
0.00970458984375,
0.018585205078125,
0.0162506103515625,
-0.04681396484375,
0.0231475830078125,
-0.029541015625,
0.06024169921875,
0.051177978515625,
-0.0211944580078125,
0.037017822265625,
-0.0184478759765625,
0.0222625732421875,
0.0269927978515625,
0.01290130615234375,
-0.0221099853515625,
-0.0101318359375,
-0.0723876953125,
-0.0390625,
0.048797607421875,
0.0310821533203125,
-0.0535888671875,
0.0268402099609375,
-0.059783935546875,
-0.039306640625,
-0.04730224609375,
-0.00998687744140625,
0.0176544189453125,
0.0305328369140625,
0.053802490234375,
-0.0216064453125,
-0.058624267578125,
-0.0433349609375,
-0.01410675048828125,
0.00632476806640625,
-0.00815582275390625,
-0.0008311271667480469,
0.04229736328125,
-0.0199127197265625,
0.062469482421875,
-0.01387786865234375,
-0.0080413818359375,
-0.0301971435546875,
0.01800537109375,
0.023193359375,
0.05291748046875,
0.03594970703125,
-0.0440673828125,
-0.02020263671875,
-0.0126800537109375,
-0.051544189453125,
0.0086517333984375,
-0.00901031494140625,
0.0152130126953125,
0.0277557373046875,
0.028717041015625,
-0.05316162109375,
0.016632080078125,
0.035308837890625,
-0.03192138671875,
0.036590576171875,
-0.025909423828125,
0.0029811859130859375,
-0.1002197265625,
0.018341064453125,
-0.0051727294921875,
-0.03448486328125,
-0.0479736328125,
-0.00911712646484375,
0.012359619140625,
-0.01053619384765625,
-0.047271728515625,
0.06146240234375,
-0.0408935546875,
-0.0007023811340332031,
0.00019598007202148438,
-0.01617431640625,
-0.00522613525390625,
0.040557861328125,
0.0014858245849609375,
0.07879638671875,
0.03131103515625,
-0.04571533203125,
0.01175689697265625,
0.018096923828125,
-0.03179931640625,
0.0078887939453125,
-0.050750732421875,
0.01166534423828125,
-0.007335662841796875,
0.0273895263671875,
-0.06689453125,
-0.0022106170654296875,
0.01200103759765625,
-0.054443359375,
0.0274810791015625,
-0.00867462158203125,
-0.040496826171875,
-0.03765869140625,
-0.030792236328125,
0.031585693359375,
0.052032470703125,
-0.05230712890625,
0.046234130859375,
-0.01103973388671875,
0.0248260498046875,
-0.06427001953125,
-0.07513427734375,
0.0114288330078125,
-0.030242919921875,
-0.052520751953125,
0.042724609375,
0.0029315948486328125,
0.0262451171875,
-0.00626373291015625,
-0.01276397705078125,
-0.0053863525390625,
0.004131317138671875,
0.00457763671875,
0.021881103515625,
-0.02392578125,
0.0023021697998046875,
-0.0114288330078125,
-0.019378662109375,
0.00484466552734375,
-0.053466796875,
0.0452880859375,
-0.0059661865234375,
0.01068115234375,
-0.0350341796875,
0.00618743896484375,
0.05255126953125,
-0.022247314453125,
0.06866455078125,
0.07635498046875,
-0.017333984375,
-0.01397705078125,
-0.039154052734375,
-0.018951416015625,
-0.042633056640625,
0.032318115234375,
-0.046600341796875,
-0.051361083984375,
0.054412841796875,
0.008148193359375,
0.00873565673828125,
0.037200927734375,
0.034637451171875,
0.00986480712890625,
0.0699462890625,
0.044708251953125,
-0.00797271728515625,
0.047027587890625,
-0.039886474609375,
0.0235595703125,
-0.04876708984375,
-0.0011892318725585938,
-0.034332275390625,
-0.0255126953125,
-0.067626953125,
-0.02069091796875,
0.00751495361328125,
-0.00901031494140625,
-0.034271240234375,
0.028594970703125,
-0.03802490234375,
0.0279541015625,
0.04931640625,
0.0064239501953125,
0.011322021484375,
-0.002147674560546875,
-0.0123138427734375,
0.0007781982421875,
-0.056976318359375,
-0.03729248046875,
0.08917236328125,
0.0282745361328125,
0.05047607421875,
-0.005710601806640625,
0.048553466796875,
0.004528045654296875,
0.0162506103515625,
-0.057342529296875,
0.035430908203125,
-0.0369873046875,
-0.054443359375,
-0.01335906982421875,
-0.029876708984375,
-0.0897216796875,
0.013031005859375,
-0.0257415771484375,
-0.07293701171875,
0.005504608154296875,
0.00927734375,
-0.017791748046875,
0.0438232421875,
-0.07464599609375,
0.0828857421875,
-0.00045037269592285156,
-0.0300750732421875,
0.00038814544677734375,
-0.055419921875,
0.022613525390625,
0.004123687744140625,
-0.0013723373413085938,
0.0236053466796875,
0.005855560302734375,
0.05926513671875,
-0.033599853515625,
0.062255859375,
-0.00806427001953125,
0.00820159912109375,
0.0114593505859375,
-0.0190277099609375,
0.040191650390625,
-0.025421142578125,
-0.0079193115234375,
0.01824951171875,
0.01263427734375,
-0.039520263671875,
-0.04180908203125,
0.03387451171875,
-0.07525634765625,
-0.035797119140625,
-0.018157958984375,
-0.0301971435546875,
0.0016584396362304688,
0.03594970703125,
0.049285888671875,
0.036163330078125,
0.004520416259765625,
0.04034423828125,
0.055328369140625,
-0.0209808349609375,
0.062103271484375,
-0.0036907196044921875,
-0.006542205810546875,
-0.03192138671875,
0.07794189453125,
0.027587890625,
0.01393890380859375,
0.038726806640625,
0.0165863037109375,
-0.0443115234375,
-0.04034423828125,
-0.03924560546875,
0.0126190185546875,
-0.06201171875,
-0.00597381591796875,
-0.06231689453125,
-0.0237274169921875,
-0.042633056640625,
-0.01424407958984375,
-0.0298004150390625,
-0.0273895263671875,
-0.0290985107421875,
-0.010955810546875,
0.0113677978515625,
0.0294647216796875,
0.017333984375,
0.036224365234375,
-0.06768798828125,
0.01654052734375,
0.0018825531005859375,
0.032318115234375,
0.008209228515625,
-0.04486083984375,
-0.01824951171875,
-0.0104522705078125,
-0.0341796875,
-0.04559326171875,
0.0294952392578125,
-0.005275726318359375,
0.017242431640625,
0.032562255859375,
0.0060882568359375,
0.051605224609375,
-0.037506103515625,
0.07012939453125,
0.0137939453125,
-0.0914306640625,
-0.00717926025390625,
-0.0176544189453125,
0.0296783447265625,
0.017608642578125,
0.0270538330078125,
-0.054595947265625,
-0.014556884765625,
-0.06878662109375,
-0.0633544921875,
0.0535888671875,
0.01496124267578125,
0.01471710205078125,
0.0083770751953125,
0.0109405517578125,
0.01425933837890625,
0.0144195556640625,
-0.078125,
-0.015106201171875,
-0.037811279296875,
-0.030242919921875,
-0.0003769397735595703,
-0.0157928466796875,
0.03533935546875,
-0.014251708984375,
0.054962158203125,
0.00315093994140625,
0.04803466796875,
0.01255035400390625,
-0.0205078125,
0.0205078125,
0.0242156982421875,
0.047698974609375,
0.0293121337890625,
-0.01415252685546875,
0.018951416015625,
0.040191650390625,
-0.050750732421875,
0.005126953125,
0.003170013427734375,
-0.01232147216796875,
0.003070831298828125,
0.0300140380859375,
0.08636474609375,
0.0038547515869140625,
-0.0265960693359375,
0.034393310546875,
-0.0118408203125,
-0.0224456787109375,
-0.01525115966796875,
0.00269317626953125,
0.006488800048828125,
0.01312255859375,
0.0207977294921875,
0.0018253326416015625,
0.008270263671875,
-0.03387451171875,
0.008209228515625,
0.0217742919921875,
-0.025238037109375,
-0.040191650390625,
0.0767822265625,
0.014556884765625,
-0.0128631591796875,
0.054168701171875,
-0.0121612548828125,
-0.0419921875,
0.038238525390625,
0.056060791015625,
0.06744384765625,
-0.0008230209350585938,
-0.00893402099609375,
0.0312347412109375,
0.0160064697265625,
-0.0093231201171875,
0.00891876220703125,
-0.01358795166015625,
-0.05548095703125,
-0.0379638671875,
-0.043914794921875,
-0.01023101806640625,
0.0217742919921875,
-0.0167694091796875,
0.047882080078125,
-0.0330810546875,
-0.009246826171875,
0.0024433135986328125,
0.017425537109375,
-0.0555419921875,
0.054229736328125,
0.0108795166015625,
0.0718994140625,
-0.04132080078125,
0.06976318359375,
0.03802490234375,
-0.040863037109375,
-0.07965087890625,
-0.006404876708984375,
-0.043975830078125,
-0.055145263671875,
0.047027587890625,
0.03802490234375,
0.000957489013671875,
0.017120361328125,
-0.041473388671875,
-0.06463623046875,
0.094970703125,
0.03448486328125,
-0.0139312744140625,
-0.030731201171875,
0.0175933837890625,
0.03662109375,
-0.0009908676147460938,
0.042449951171875,
0.026763916015625,
0.0296783447265625,
0.02642822265625,
-0.05914306640625,
0.01371002197265625,
-0.0095062255859375,
-0.0023632049560546875,
0.0200042724609375,
-0.061004638671875,
0.060333251953125,
-0.01190948486328125,
-0.0177764892578125,
-0.0067596435546875,
0.070068359375,
0.0026264190673828125,
-0.0045928955078125,
0.029388427734375,
0.03912353515625,
0.06488037109375,
-0.00920867919921875,
0.08056640625,
-0.032196044921875,
0.04254150390625,
0.05352783203125,
0.0254669189453125,
0.04559326171875,
0.033111572265625,
0.00018262863159179688,
0.02569580078125,
0.05841064453125,
-0.01397705078125,
0.032012939453125,
0.01197052001953125,
-0.019012451171875,
-0.01172637939453125,
0.0088653564453125,
-0.0193023681640625,
0.034423828125,
0.00786590576171875,
-0.034423828125,
-0.01617431640625,
-0.005733489990234375,
0.0199432373046875,
-0.04229736328125,
-0.0229644775390625,
0.036285400390625,
0.005161285400390625,
-0.04913330078125,
0.061981201171875,
0.02008056640625,
0.0775146484375,
-0.0367431640625,
0.0308685302734375,
-0.017852783203125,
0.028594970703125,
-0.0222015380859375,
-0.03643798828125,
0.028472900390625,
0.0092315673828125,
-0.01389312744140625,
-0.0291595458984375,
0.0517578125,
-0.025909423828125,
-0.042510986328125,
0.0033359527587890625,
0.01690673828125,
0.0223846435546875,
0.007587432861328125,
-0.038818359375,
-0.00527191162109375,
-0.00824737548828125,
-0.038909912109375,
0.011199951171875,
0.042572021484375,
-0.0233917236328125,
0.0606689453125,
0.0447998046875,
0.0059814453125,
0.0248565673828125,
-0.0086212158203125,
0.046905517578125,
-0.05047607421875,
-0.0413818359375,
-0.07196044921875,
0.046539306640625,
0.01263427734375,
-0.02490234375,
0.029937744140625,
0.051025390625,
0.06927490234375,
-0.0171661376953125,
0.054443359375,
0.0010290145874023438,
0.007007598876953125,
-0.0408935546875,
0.0657958984375,
-0.05023193359375,
-0.00547027587890625,
-0.006000518798828125,
-0.05517578125,
-0.031494140625,
0.034332275390625,
-0.0323486328125,
0.020660400390625,
0.058685302734375,
0.047637939453125,
-0.0305633544921875,
-0.01180267333984375,
0.03668212890625,
0.0177001953125,
0.042572021484375,
0.04107666015625,
0.02764892578125,
-0.05328369140625,
0.06158447265625,
0.001567840576171875,
0.0156402587890625,
0.000640869140625,
-0.068603515625,
-0.07904052734375,
-0.037567138671875,
-0.0038394927978515625,
-0.0280609130859375,
0.0028934478759765625,
0.08563232421875,
0.06451416015625,
-0.0513916015625,
-0.00449371337890625,
-0.01617431640625,
-0.0011186599731445312,
0.0024738311767578125,
-0.0122833251953125,
0.034698486328125,
-0.04486083984375,
-0.07403564453125,
-0.000156402587890625,
-0.0038814544677734375,
0.019378662109375,
-0.0033721923828125,
0.0129852294921875,
0.006092071533203125,
0.0149993896484375,
0.04034423828125,
0.0035343170166015625,
-0.0386962890625,
-0.03753662109375,
0.015228271484375,
-0.01280975341796875,
0.034149169921875,
0.04254150390625,
-0.062164306640625,
0.018463134765625,
0.033905029296875,
0.06072998046875,
0.06256103515625,
-0.0101470947265625,
0.033538818359375,
-0.05474853515625,
0.0170745849609375,
-0.0024566650390625,
0.0343017578125,
0.0311737060546875,
-0.0249786376953125,
0.028106689453125,
0.02850341796875,
-0.040924072265625,
-0.049957275390625,
-0.0017423629760742188,
-0.07470703125,
-0.01401519775390625,
0.0762939453125,
-0.0140228271484375,
-0.0377197265625,
0.0174713134765625,
-0.006198883056640625,
0.046539306640625,
-0.033416748046875,
0.0704345703125,
0.070068359375,
0.0197906494140625,
-0.0177459716796875,
-0.0290374755859375,
0.036163330078125,
0.032806396484375,
-0.0513916015625,
-0.00783538818359375,
-0.0015287399291992188,
0.038238525390625,
0.0071868896484375,
0.039306640625,
-0.0025005340576171875,
0.0090789794921875,
0.005290985107421875,
0.0253448486328125,
-0.0070953369140625,
-0.0131072998046875,
-0.01580810546875,
0.01383209228515625,
-0.005313873291015625,
-0.04522705078125
]
] |
google/tapas-large-finetuned-wtq | 2023-09-05T14:48:42.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"tapas",
"table-question-answering",
"en",
"dataset:wikitablequestions",
"arxiv:2004.02349",
"arxiv:2010.00571",
"arxiv:1508.00305",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | table-question-answering | google | null | null | google/tapas-large-finetuned-wtq | 67 | 78,851 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- tapas
- table-question-answering
license: apache-2.0
datasets:
- wikitablequestions
---
# TAPAS large model fine-tuned on WikiTable Questions (WTQ)
This model has two versions that can be used. The default version corresponds to the `tapas_wtq_wikisql_sqa_inter_masklm_large_reset` checkpoint of the [original Github repository](https://github.com/google-research/tapas).
This model was pre-trained with MLM and an additional step the authors call intermediate pre-training, and then fine-tuned in a chain on [SQA](https://www.microsoft.com/en-us/download/details.aspx?id=54253), [WikiSQL](https://github.com/salesforce/WikiSQL) and finally [WTQ](https://github.com/ppasupat/WikiTableQuestions). It uses relative position embeddings (i.e., resetting the position index at every cell of the table).
The other (non-default) version which can be used is:
- `no_reset`, which corresponds to `tapas_wtq_wikisql_sqa_inter_masklm_large` (intermediate pre-training, absolute position embeddings).
Disclaimer: The team releasing TAPAS did not write a model card for this model so this model card has been written by
the Hugging Face team and contributors.
## Results
Size | Reset | Dev Accuracy | Link
-------- | -------- | -------- | ----
**LARGE** | **noreset** | **0.5062** | [tapas-large-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-large-finetuned-wtq/tree/no_reset)
**LARGE** | **reset** | **0.5097** | [tapas-large-finetuned-wtq](https://huggingface.co/google/tapas-large-finetuned-wtq/tree/main)
BASE | noreset | 0.4525 | [tapas-base-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-base-finetuned-wtq/tree/no_reset)
BASE | reset | 0.4638 | [tapas-base-finetuned-wtq](https://huggingface.co/google/tapas-base-finetuned-wtq/tree/main)
MEDIUM | noreset | 0.4324 | [tapas-medium-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-medium-finetuned-wtq/tree/no_reset)
MEDIUM | reset | 0.4324 | [tapas-medium-finetuned-wtq](https://huggingface.co/google/tapas-medium-finetuned-wtq/tree/main)
SMALL | noreset | 0.3681 | [tapas-small-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-small-finetuned-wtq/tree/no_reset)
SMALL | reset | 0.3762 | [tapas-small-finetuned-wtq](https://huggingface.co/google/tapas-small-finetuned-wtq/tree/main)
MINI | noreset | 0.2783 | [tapas-mini-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-mini-finetuned-wtq/tree/no_reset)
MINI | reset | 0.2854 | [tapas-mini-finetuned-wtq](https://huggingface.co/google/tapas-mini-finetuned-wtq/tree/main)
TINY | noreset | 0.0823 | [tapas-tiny-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-tiny-finetuned-wtq/tree/no_reset)
TINY | reset | 0.1039 | [tapas-tiny-finetuned-wtq](https://huggingface.co/google/tapas-tiny-finetuned-wtq/tree/main)
## Model description
TAPAS is a BERT-like transformers model pretrained on a large corpus of English data from Wikipedia in a self-supervised fashion.
This means it was pretrained on the raw tables and associated texts only, with no humans labelling them in any way (which is why it
can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a (flattened) table and associated context, the model randomly masks 15% of the words in
the input, then runs the entire (partially masked) sequence through the model. The model then has to predict the masked words.
This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other,
or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional
representation of a table and associated text.
- Intermediate pre-training: to encourage numerical reasoning on tables, the authors additionally pre-trained the model by creating
a balanced dataset of millions of syntactically created training examples. Here, the model must predict (classify) whether a sentence
is supported or refuted by the contents of a table. The training examples are created based on synthetic as well as counterfactual statements.
This way, the model learns an inner representation of the English language used in tables and associated texts, which can then be used
to extract features useful for downstream tasks such as answering questions about a table, or determining whether a sentence is entailed
or refuted by the contents of a table. Fine-tuning is done by adding a cell selection head and an aggregation head on top of the pre-trained model, and then jointly training these randomly initialized classification heads with the base model on SQA, WikiSQL and finally WTQ.
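The masked language modeling objective described above can be sketched with a toy, framework-free example (the helper and the exact masking rate are illustrative only; the real pipeline masks WordPiece tokens of the flattened table and text):

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    # Randomly replace roughly mask_rate of the tokens with [MASK];
    # the model must then predict the original tokens at those positions.
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(mask_token)
            targets[i] = tok  # remember the original token as the label
        else:
            masked.append(tok)
    return masked, targets

tokens = "the table lists the population of each city".split()
masked, targets = mask_tokens(tokens)
```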
## Intended uses & limitations
You can use this model for answering questions related to a table.
For code examples, we refer to the documentation of TAPAS on the HuggingFace website.
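As a minimal sketch of input preparation (the table data here is invented): the TAPAS tokenizer expects the table as a `pandas.DataFrame` in which every cell value is a string, so numeric columns should be cast to text first.

```python
import pandas as pd

# TAPAS expects the table as a pandas DataFrame whose cells are all strings.
data = {
    "City": ["Paris", "London", "Berlin"],
    "Population": [2_148_000, 8_982_000, 3_645_000],
}
table = pd.DataFrame(data).astype(str)  # cast numeric cells to text
question = "What is the population of Berlin?"
```

`table` and `question` can then be passed to the TAPAS tokenizer or the `table-question-answering` pipeline; see the TAPAS documentation for full code examples.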
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Question [SEP] Flattened table [SEP]
```
The authors first converted the WTQ dataset into the format of SQA using automatic conversion scripts.
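Purely as an illustration of that input layout (`flatten_for_tapas` is a hypothetical helper, not part of the library; the real tokenizer also lowercases, applies WordPiece, and attaches row, column and rank indices to each cell token), the flattening can be sketched as:

```python
def flatten_for_tapas(question, header, rows):
    # Illustrative only: serialize the table row by row after the question,
    # mirroring the [CLS] question [SEP] flattened-table [SEP] layout.
    cells = list(header) + [cell for row in rows for cell in row]
    return "[CLS] " + question + " [SEP] " + " ".join(cells) + " [SEP]"

seq = flatten_for_tapas(
    "what is the population of berlin?",
    ["city", "population"],
    [["paris", "2148000"], ["berlin", "3645000"]],
)
```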
### Fine-tuning
The model was fine-tuned on 32 Cloud TPU v3 cores for 50,000 steps with a maximum sequence length of 512 and a batch size of 512.
In this setup, fine-tuning takes around 10 hours. The optimizer used is Adam with a learning rate of 1.93581e-5, and a warmup
ratio of 0.128960. An inductive bias is added such that the model only selects cells of the same column. This is reflected by the
`select_one_column` parameter of `TapasConfig`. See the [paper](https://arxiv.org/abs/2004.02349) for more details (tables 11 and
12).
### BibTeX entry and citation info
```bibtex
@misc{herzig2020tapas,
title={TAPAS: Weakly Supervised Table Parsing via Pre-training},
author={Jonathan Herzig and Paweł Krzysztof Nowak and Thomas Müller and Francesco Piccinno and Julian Martin Eisenschlos},
year={2020},
eprint={2004.02349},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
```
```bibtex
@misc{eisenschlos2020understanding,
title={Understanding tables with intermediate pre-training},
author={Julian Martin Eisenschlos and Syrine Krichene and Thomas Müller},
year={2020},
eprint={2010.00571},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@article{DBLP:journals/corr/PasupatL15,
author = {Panupong Pasupat and
Percy Liang},
title = {Compositional Semantic Parsing on Semi-Structured Tables},
journal = {CoRR},
volume = {abs/1508.00305},
year = {2015},
url = {http://arxiv.org/abs/1508.00305},
archivePrefix = {arXiv},
eprint = {1508.00305},
timestamp = {Mon, 13 Aug 2018 16:47:37 +0200},
biburl = {https://dblp.org/rec/journals/corr/PasupatL15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 7,221 | [
[
-0.047607421875,
-0.05352783203125,
0.0200958251953125,
0.024139404296875,
-0.032257080078125,
-0.0092010498046875,
-0.00691986083984375,
-0.041259765625,
0.0504150390625,
0.02679443359375,
-0.040252685546875,
-0.0312347412109375,
-0.033721923828125,
0.0019197463989257812,
-0.021209716796875,
0.08770751953125,
0.00033092498779296875,
0.0024967193603515625,
-0.0077056884765625,
-0.0178985595703125,
-0.025177001953125,
-0.031585693359375,
-0.0193634033203125,
-0.0185699462890625,
0.039154052734375,
0.033905029296875,
0.056793212890625,
0.046478271484375,
0.0172119140625,
0.02203369140625,
-0.0225372314453125,
0.009552001953125,
-0.041717529296875,
-0.01264190673828125,
0.00012099742889404297,
-0.040374755859375,
-0.0296630859375,
0.017730712890625,
0.030975341796875,
0.05328369140625,
0.004283905029296875,
0.0333251953125,
0.01270294189453125,
0.050689697265625,
-0.037078857421875,
0.01087188720703125,
-0.0635986328125,
0.0169677734375,
-0.0189208984375,
-0.00579071044921875,
-0.025634765625,
-0.045440673828125,
0.027313232421875,
-0.039947509765625,
0.01276397705078125,
-0.00028824806213378906,
0.10498046875,
0.00628662109375,
-0.0208587646484375,
0.00022029876708984375,
-0.0408935546875,
0.040069580078125,
-0.0555419921875,
0.0250396728515625,
0.040374755859375,
0.01110076904296875,
-0.0208587646484375,
-0.0655517578125,
-0.0517578125,
-0.01557159423828125,
-0.017608642578125,
0.0022258758544921875,
-0.006195068359375,
-0.0116424560546875,
0.030670166015625,
0.045074462890625,
-0.037872314453125,
0.006000518798828125,
-0.0374755859375,
-0.015167236328125,
0.042144775390625,
-0.004062652587890625,
0.0193939208984375,
-0.00867462158203125,
-0.030792236328125,
-0.00939178466796875,
-0.062225341796875,
0.028076171875,
0.0120391845703125,
0.0239410400390625,
-0.032135009765625,
0.048797607421875,
-0.006221771240234375,
0.059173583984375,
0.01505279541015625,
-0.0208282470703125,
0.045562744140625,
-0.0277252197265625,
-0.025238037109375,
-0.018218994140625,
0.0693359375,
0.00803375244140625,
0.0225830078125,
-0.004329681396484375,
-0.01203155517578125,
-0.0058746337890625,
0.01201629638671875,
-0.062469482421875,
-0.03204345703125,
0.015655517578125,
-0.0355224609375,
-0.0208587646484375,
0.00372314453125,
-0.0487060546875,
-0.01537322998046875,
-0.02557373046875,
0.040679931640625,
-0.0236358642578125,
-0.01499176025390625,
-0.0171966552734375,
-0.004917144775390625,
0.036712646484375,
0.0131378173828125,
-0.048919677734375,
0.019775390625,
0.0265960693359375,
0.054718017578125,
0.0016546249389648438,
-0.0228424072265625,
-0.014129638671875,
-0.005176544189453125,
-0.03643798828125,
0.03643798828125,
-0.0259552001953125,
-0.017974853515625,
-0.00959014892578125,
0.0222930908203125,
-0.0051116943359375,
-0.038360595703125,
0.04766845703125,
-0.03338623046875,
0.022857666015625,
-0.046417236328125,
-0.0313720703125,
-0.014556884765625,
0.018463134765625,
-0.052825927734375,
0.08807373046875,
0.0099334716796875,
-0.059967041015625,
0.03948974609375,
-0.041595458984375,
-0.0262298583984375,
-0.01433563232421875,
0.0040283203125,
-0.05865478515625,
-0.007415771484375,
0.01727294921875,
0.041473388671875,
-0.0127410888671875,
0.0203399658203125,
-0.03631591796875,
-0.00832366943359375,
0.0166778564453125,
0.0026798248291015625,
0.06884765625,
0.0201873779296875,
-0.030792236328125,
0.0135650634765625,
-0.054779052734375,
0.00655364990234375,
0.0311431884765625,
-0.0137786865234375,
-0.01427459716796875,
-0.01824951171875,
-0.000705718994140625,
0.034637451171875,
0.01885986328125,
-0.041107177734375,
0.02008056640625,
-0.02923583984375,
0.04925537109375,
0.0360107421875,
0.00156402587890625,
0.0202178955078125,
-0.0151519775390625,
0.0191497802734375,
0.0133819580078125,
0.0209503173828125,
-0.00974273681640625,
-0.0552978515625,
-0.0582275390625,
-0.01837158203125,
0.042266845703125,
0.037841796875,
-0.04351806640625,
0.055206298828125,
-0.0182647705078125,
-0.052337646484375,
-0.039825439453125,
0.0118255615234375,
0.043304443359375,
0.0504150390625,
0.034210205078125,
-0.019561767578125,
-0.046142578125,
-0.0855712890625,
0.0083160400390625,
0.007137298583984375,
0.002651214599609375,
0.0280303955078125,
0.045867919921875,
-0.00202178955078125,
0.08251953125,
-0.04718017578125,
-0.0176849365234375,
-0.0105438232421875,
0.00469970703125,
0.0276336669921875,
0.048797607421875,
0.054351806640625,
-0.060821533203125,
-0.054840087890625,
-0.007656097412109375,
-0.03240966796875,
-0.001140594482421875,
0.01082611083984375,
-0.01525115966796875,
0.007755279541015625,
0.0201873779296875,
-0.06787109375,
0.0330810546875,
0.0269927978515625,
-0.035980224609375,
0.041900634765625,
-0.00626373291015625,
-0.0028133392333984375,
-0.1007080078125,
0.0235595703125,
0.00679779052734375,
-0.017059326171875,
-0.040863037109375,
-0.00302886962890625,
0.0130767822265625,
-0.0150146484375,
-0.0318603515625,
0.031402587890625,
-0.0517578125,
-0.01549530029296875,
-0.0004963874816894531,
0.0079498291015625,
0.0079345703125,
0.05828857421875,
-0.01271820068359375,
0.0692138671875,
0.0268402099609375,
-0.04425048828125,
0.005828857421875,
0.03143310546875,
-0.0195159912109375,
0.025299072265625,
-0.06060791015625,
-0.007083892822265625,
0.004611968994140625,
0.024322509765625,
-0.078125,
-0.01021575927734375,
0.017578125,
-0.040069580078125,
0.0421142578125,
-0.0047149658203125,
-0.019866943359375,
-0.054840087890625,
-0.049407958984375,
0.0126953125,
0.03594970703125,
-0.05126953125,
0.025787353515625,
0.04693603515625,
-0.00037550926208496094,
-0.052154541015625,
-0.047515869140625,
-0.004913330078125,
-0.013427734375,
-0.04229736328125,
0.047607421875,
0.007617950439453125,
0.01023101806640625,
0.0045013427734375,
-0.0052032470703125,
-0.001789093017578125,
-0.00998687744140625,
0.0194244384765625,
0.016845703125,
-0.01476287841796875,
0.0025424957275390625,
-0.00623321533203125,
0.01049041748046875,
-0.00769805908203125,
-0.0087890625,
0.0584716796875,
-0.024169921875,
0.0002872943878173828,
-0.052764892578125,
0.002841949462890625,
0.04681396484375,
-0.030181884765625,
0.06341552734375,
0.05120849609375,
-0.035247802734375,
0.002758026123046875,
-0.05401611328125,
-0.01024627685546875,
-0.032745361328125,
0.0167236328125,
-0.044952392578125,
-0.0623779296875,
0.0655517578125,
0.0179290771484375,
0.00682830810546875,
0.066162109375,
0.040435791015625,
-0.0138702392578125,
0.0595703125,
0.0384521484375,
-0.024932861328125,
0.035675048828125,
-0.042510986328125,
0.006038665771484375,
-0.057708740234375,
-0.0428466796875,
-0.060821533203125,
-0.03265380859375,
-0.059173583984375,
-0.0304107666015625,
0.01441192626953125,
0.01505279541015625,
-0.036834716796875,
0.036773681640625,
-0.043853759765625,
0.02880859375,
0.0640869140625,
0.0173797607421875,
-0.0014171600341796875,
-0.008209228515625,
-0.0149078369140625,
0.0009927749633789062,
-0.037261962890625,
-0.02777099609375,
0.07769775390625,
0.04327392578125,
0.040191650390625,
-0.0106658935546875,
0.0374755859375,
0.00962066650390625,
0.0188446044921875,
-0.054168701171875,
0.031768798828125,
0.00007086992263793945,
-0.04962158203125,
-0.024871826171875,
-0.0244140625,
-0.07977294921875,
0.0209808349609375,
-0.0300140380859375,
-0.05853271484375,
0.0191802978515625,
0.0175628662109375,
-0.0204010009765625,
0.0308990478515625,
-0.07989501953125,
0.0806884765625,
-0.00919342041015625,
-0.021484375,
0.0134735107421875,
-0.060089111328125,
0.036224365234375,
0.00894927978515625,
-0.009429931640625,
-0.006927490234375,
-0.0015096664428710938,
0.06829833984375,
-0.05908203125,
0.0653076171875,
-0.0357666015625,
0.004116058349609375,
0.035430908203125,
-0.00848388671875,
0.04931640625,
-0.0041046142578125,
0.0231170654296875,
0.00272369384765625,
0.0177154541015625,
-0.038299560546875,
-0.0208282470703125,
0.05645751953125,
-0.068359375,
-0.0321044921875,
-0.0272216796875,
-0.0270538330078125,
-0.040924072265625,
0.02362060546875,
0.026641845703125,
0.015350341796875,
-0.01076507568359375,
0.0328369140625,
0.059234619140625,
-0.0230255126953125,
0.03411865234375,
0.0145111083984375,
-0.01264190673828125,
-0.0260467529296875,
0.060760498046875,
0.01497650146484375,
-0.00824737548828125,
0.030548095703125,
0.024078369140625,
-0.032501220703125,
-0.047149658203125,
-0.02838134765625,
0.02545166015625,
-0.01377105712890625,
-0.04058837890625,
-0.0535888671875,
-0.025543212890625,
-0.027801513671875,
0.00894927978515625,
-0.035003662109375,
-0.042999267578125,
-0.037322998046875,
-0.0194549560546875,
0.042938232421875,
0.04559326171875,
0.01409912109375,
0.0247802734375,
-0.053985595703125,
0.02728271484375,
0.02923583984375,
0.045928955078125,
0.0012035369873046875,
-0.04522705078125,
-0.0013399124145507812,
0.003917694091796875,
-0.030853271484375,
-0.08465576171875,
0.0286407470703125,
0.0214996337890625,
0.042816162109375,
0.039581298828125,
-0.0045623779296875,
0.055908203125,
-0.033111572265625,
0.06005859375,
0.020965576171875,
-0.06475830078125,
0.0455322265625,
-0.021392822265625,
0.01273345947265625,
0.05755615234375,
0.04388427734375,
-0.0214385986328125,
-0.01116943359375,
-0.045867919921875,
-0.06414794921875,
0.05340576171875,
0.0030765533447265625,
0.01528167724609375,
0.018524169921875,
0.0231170654296875,
0.01406097412109375,
0.00890350341796875,
-0.077392578125,
-0.0245513916015625,
-0.0167388916015625,
-0.005260467529296875,
-0.0018873214721679688,
-0.040802001953125,
-0.00965118408203125,
-0.055572509765625,
0.061492919921875,
0.0053253173828125,
0.019775390625,
0.00809478759765625,
-0.004467010498046875,
0.0044403076171875,
0.004627227783203125,
0.052154541015625,
0.05718994140625,
-0.028656005859375,
-0.0147552490234375,
0.00794219970703125,
-0.050933837890625,
-0.01537322998046875,
0.0211334228515625,
-0.021148681640625,
0.0039520263671875,
0.01097869873046875,
0.07568359375,
0.0193939208984375,
-0.033782958984375,
0.04864501953125,
0.007598876953125,
-0.0247039794921875,
-0.01776123046875,
-0.01593017578125,
0.019500732421875,
0.005893707275390625,
0.034088134765625,
-0.0163726806640625,
0.006488800048828125,
-0.039215087890625,
0.023193359375,
0.044158935546875,
-0.018646240234375,
-0.029815673828125,
0.037872314453125,
0.00997161865234375,
-0.00980377197265625,
0.0290679931640625,
-0.01727294921875,
-0.042327880859375,
0.0421142578125,
0.04302978515625,
0.05438232421875,
-0.0255584716796875,
0.0231170654296875,
0.044891357421875,
0.0308074951171875,
0.0164794921875,
0.034576416015625,
-0.00537872314453125,
-0.043487548828125,
-0.01349639892578125,
-0.059783935546875,
-0.0167388916015625,
0.04156494140625,
-0.048553466796875,
0.0174102783203125,
-0.03643798828125,
-0.02001953125,
0.01337432861328125,
0.0183563232421875,
-0.054046630859375,
0.0170135498046875,
-0.007579803466796875,
0.0850830078125,
-0.07733154296875,
0.0643310546875,
0.04931640625,
-0.0426025390625,
-0.083984375,
-0.01549530029296875,
-0.022003173828125,
-0.06439208984375,
0.016815185546875,
0.0093536376953125,
0.023773193359375,
-0.0252685546875,
-0.05401611328125,
-0.07867431640625,
0.07965087890625,
0.015228271484375,
-0.0293426513671875,
-0.0014047622680664062,
0.007274627685546875,
0.047943115234375,
-0.0136566162109375,
0.01172637939453125,
0.0499267578125,
0.02203369140625,
0.0225372314453125,
-0.082763671875,
0.01502227783203125,
-0.024078369140625,
0.00946807861328125,
0.0167388916015625,
-0.05059814453125,
0.061920166015625,
-0.0024509429931640625,
0.005435943603515625,
-0.00681304931640625,
0.05047607421875,
0.01220703125,
0.01126861572265625,
0.02679443359375,
0.0701904296875,
0.056304931640625,
-0.0216827392578125,
0.06695556640625,
-0.0133819580078125,
0.0265045166015625,
0.0880126953125,
0.0059051513671875,
0.07049560546875,
0.048492431640625,
-0.045928955078125,
0.024078369140625,
0.07598876953125,
-0.006633758544921875,
0.0374755859375,
0.01499176025390625,
0.01474761962890625,
0.01496124267578125,
0.00099945068359375,
-0.06109619140625,
0.042999267578125,
0.030029296875,
-0.046142578125,
-0.00719451904296875,
-0.0009245872497558594,
0.0153961181640625,
-0.005359649658203125,
-0.03131103515625,
0.052978515625,
0.00537872314453125,
-0.0465087890625,
0.060211181640625,
-0.0031223297119140625,
0.056732177734375,
-0.036468505859375,
0.01229095458984375,
-0.0252227783203125,
-0.005023956298828125,
-0.022674560546875,
-0.059326171875,
0.01776123046875,
-0.003704071044921875,
-0.008209228515625,
0.0012340545654296875,
0.046417236328125,
-0.03045654296875,
-0.0310211181640625,
0.0025043487548828125,
0.041839599609375,
0.00946807861328125,
0.0025463104248046875,
-0.0634765625,
0.0031871795654296875,
0.0021820068359375,
-0.04315185546875,
0.021575927734375,
0.0278778076171875,
-0.0018033981323242188,
0.042266845703125,
0.04425048828125,
-0.0060272216796875,
0.0157623291015625,
-0.006771087646484375,
0.08404541015625,
-0.046630859375,
-0.05224609375,
-0.048553466796875,
0.031219482421875,
-0.01898193359375,
-0.042755126953125,
0.046478271484375,
0.054534912109375,
0.06719970703125,
-0.0235137939453125,
0.0242462158203125,
-0.0161590576171875,
0.04986572265625,
-0.032745361328125,
0.059051513671875,
-0.0465087890625,
-0.00971221923828125,
-0.034576416015625,
-0.08355712890625,
-0.015167236328125,
0.035797119140625,
-0.023406982421875,
0.01470184326171875,
0.033172607421875,
0.048004150390625,
-0.01061248779296875,
0.0006456375122070312,
0.0034542083740234375,
0.030120849609375,
0.005767822265625,
0.0301513671875,
0.03851318359375,
-0.0267791748046875,
0.03485107421875,
-0.040191650390625,
-0.0189971923828125,
-0.0162811279296875,
-0.055755615234375,
-0.06170654296875,
-0.06512451171875,
-0.01192474365234375,
-0.0280609130859375,
-0.016815185546875,
0.07012939453125,
0.06536865234375,
-0.08172607421875,
-0.00991058349609375,
0.0107421875,
0.0024585723876953125,
0.0016765594482421875,
-0.021148681640625,
0.046478271484375,
-0.0139007568359375,
-0.0443115234375,
0.0088348388671875,
0.005401611328125,
0.01549530029296875,
-0.01018524169921875,
0.0031280517578125,
-0.0379638671875,
-0.01074981689453125,
0.056640625,
0.0517578125,
-0.04974365234375,
-0.00875091552734375,
-0.00774383544921875,
-0.01270294189453125,
0.0169219970703125,
0.043548583984375,
-0.04901123046875,
0.01546478271484375,
0.032440185546875,
0.059967041015625,
0.07049560546875,
0.00091552734375,
0.018524169921875,
-0.055877685546875,
0.024169921875,
0.0275421142578125,
0.036224365234375,
0.0160980224609375,
-0.02337646484375,
0.038482666015625,
0.0169677734375,
-0.0263214111328125,
-0.0731201171875,
-0.00466156005859375,
-0.094482421875,
-0.01373291015625,
0.0638427734375,
-0.0079498291015625,
-0.0300750732421875,
0.006893157958984375,
-0.0203094482421875,
0.0227203369140625,
-0.019775390625,
0.054534912109375,
0.043548583984375,
-0.00804901123046875,
-0.01006317138671875,
-0.02923583984375,
0.058380126953125,
0.0299224853515625,
-0.06524658203125,
-0.019683837890625,
0.031982421875,
0.029541015625,
0.0231170654296875,
0.049774169921875,
-0.026336669921875,
0.0098876953125,
-0.01375579833984375,
0.005542755126953125,
-0.009796142578125,
-0.017852783203125,
-0.0147705078125,
0.01434326171875,
0.0032806396484375,
-0.04132080078125
]
] |
mosaicml/mpt-7b | 2023-10-30T21:53:24.000Z | [
"transformers",
"pytorch",
"mpt",
"text-generation",
"Composer",
"MosaicML",
"llm-foundry",
"StreamingDatasets",
"custom_code",
"dataset:mc4",
"dataset:c4",
"dataset:togethercomputer/RedPajama-Data-1T",
"dataset:bigcode/the-stack",
"dataset:allenai/s2orc",
"arxiv:2108.12409",
"arxiv:2302.13971",
"arxiv:2205.14135",
"arxiv:2010.04245",
"arxiv:1909.08053",
"arxiv:2302.06675",
"license:apache-2.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | mosaicml | null | null | mosaicml/mpt-7b | 1,092 | 78,696 | transformers | 2023-05-05T00:48:02 | ---
license: apache-2.0
tags:
- Composer
- MosaicML
- llm-foundry
- StreamingDatasets
datasets:
- mc4
- c4
- togethercomputer/RedPajama-Data-1T
- bigcode/the-stack
- allenai/s2orc
inference: false
---
# MPT-7B
MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code.
This model was trained by [MosaicML](https://www.mosaicml.com).
MPT-7B is part of the family of MosaicPretrainedTransformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.
These architectural changes include performance-optimized layer implementations and the elimination of context length limits by replacing
positional embeddings with Attention with Linear Biases ([ALiBi](https://arxiv.org/abs/2108.12409)).
Thanks to these modifications, MPT models can be trained with high throughput efficiency and stable convergence.
MPT models can also be served efficiently with both standard HuggingFace pipelines and NVIDIA's [FasterTransformer](https://github.com/NVIDIA/FasterTransformer).
This model uses the MosaicML LLM codebase, which can be found in the [llm-foundry repository](https://github.com/mosaicml/llm-foundry). It was trained by MosaicML’s NLP team on the [MosaicML platform](https://www.mosaicml.com/training) for LLM pretraining, finetuning, and inference.
### How is this model different?
MPT-7B is
* **Licensed for the possibility of commercial use** (unlike [LLaMA](https://arxiv.org/abs/2302.13971)).
* **Trained on a large amount of data** (1T tokens like [LLaMA](https://arxiv.org/abs/2302.13971) vs. 300B for [Pythia](https://github.com/EleutherAI/pythia), 300B for [OpenLLaMA](https://github.com/openlm-research/open_llama), and 800B for [StableLM](https://github.com/Stability-AI/StableLM)).
* **Prepared to handle extremely long inputs** thanks to [ALiBi](https://arxiv.org/abs/2108.12409) (we finetuned [MPT-7B-StoryWriter-65k+](https://huggingface.co/mosaicml/mpt-7b-storywriter) on up to 65k inputs and can handle up to 84k vs. 2k-4k for other open source models).
* **Capable of fast training and inference** (via [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf) and [FasterTransformer](https://github.com/NVIDIA/FasterTransformer))
* **Equipped with highly efficient open-source training code** via the [llm-foundry repository](https://github.com/mosaicml/llm-foundry)
### Models finetuned off MPT-7B:
The following models are finetuned on MPT-7B:
* [MPT-7B-StoryWriter-65k+](https://huggingface.co/mosaicml/mpt-7b-storywriter): a model designed to read and write fictional stories with super long context lengths.
Built by finetuning MPT-7B with a context length of 65k tokens on a filtered fiction subset of the [books3 dataset](https://huggingface.co/datasets/the_pile_books3).
At inference time, thanks to [ALiBi](https://arxiv.org/abs/2108.12409), MPT-7B-StoryWriter-65k+ can extrapolate even beyond 65k tokens.
We demonstrate generations as long as 80k tokens on a single A100-80GB GPU in our [blogpost](https://www.mosaicml.com/blog/mpt-7b).
* License: Apache 2.0
* [MPT-7B-Instruct](https://huggingface.co/mosaicml/mpt-7b-instruct): a model for short-form instruction following.
Built by finetuning MPT-7B on a [dataset](https://huggingface.co/datasets/mosaicml/dolly_hhrlhf) we also release, derived from the [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) and the [Anthropic Helpful and Harmless (HH-RLHF)](https://huggingface.co/datasets/Anthropic/hh-rlhf) datasets.
* License: _CC-By-SA-3.0_
* [MPT-7B-Chat](https://huggingface.co/mosaicml/mpt-7b-chat): a chatbot-like model for dialogue generation.
Built by finetuning MPT-7B on the [ShareGPT-Vicuna](https://huggingface.co/datasets/jeffwan/sharegpt_vicuna), [HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3),
[Alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca), [HH-RLHF](https://huggingface.co/datasets/Anthropic/hh-rlhf), and [Evol-Instruct](https://huggingface.co/datasets/victor123/evol_instruct_70k) datasets.
* License: _CC-By-NC-SA-4.0_
## Model Date
May 5, 2023
## Model License
Apache-2.0
## Documentation
* [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://mosaicml.me/slack)!
## How to Use
This model is best used with the MosaicML [llm-foundry repository](https://github.com/mosaicml/llm-foundry) for training and finetuning.
```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-7b',
trust_remote_code=True
)
```
Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method.
This is because we use a custom `MPT` model architecture that is not yet part of the Hugging Face `transformers` package.
`MPT` includes options for many training efficiency features such as [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf), [ALiBi](https://arxiv.org/abs/2108.12409), [QK LayerNorm](https://arxiv.org/abs/2010.04245), and more.
To use the optimized [triton implementation](https://github.com/openai/triton) of FlashAttention, you can load the model on GPU (`cuda:0`) with `attn_impl='triton'` and with `bfloat16` precision:
```python
import torch
import transformers
name = 'mosaicml/mpt-7b'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.attn_config['attn_impl'] = 'triton'
config.init_device = 'cuda:0' # For fast initialization directly on GPU!
model = transformers.AutoModelForCausalLM.from_pretrained(
name,
config=config,
torch_dtype=torch.bfloat16, # Load model weights in bfloat16
trust_remote_code=True
)
```
Although the model was trained with a sequence length of 2048, ALiBi enables users to increase the maximum sequence length during finetuning and/or inference. For example:
```python
import transformers
name = 'mosaicml/mpt-7b'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 4096 # (input + output) tokens can now be up to 4096
model = transformers.AutoModelForCausalLM.from_pretrained(
name,
config=config,
trust_remote_code=True
)
```
This model was trained with the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('EleutherAI/gpt-neox-20b')
```
The model can then be used, for example, within a text-generation pipeline.
Note: when running Torch modules in lower precision, it is best practice to use the [torch.autocast context manager](https://pytorch.org/docs/stable/amp.html).
```python
import torch
from transformers import pipeline
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, device='cuda:0')
with torch.autocast('cuda', dtype=torch.bfloat16):
print(
pipe('Here is a recipe for vegan banana bread:\n',
max_new_tokens=100,
do_sample=True,
use_cache=True))
```
## Model Description
The architecture is a modification of a standard decoder-only transformer.
The model has been modified from a standard transformer in the following ways:
* It uses [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf)
* It uses [ALiBi (Attention with Linear Biases)](https://arxiv.org/abs/2108.12409) and does not use positional embeddings
* It does not use biases
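The ALiBi scheme above replaces positional embeddings with a per-head linear penalty on attention logits. A minimal sketch of the bias computation (our own illustrative code with made-up names, not MPT's implementation):

```python
def alibi_slopes(n_heads):
    # Geometric sequence of per-head slopes from the ALiBi paper
    # (assumes n_heads is a power of two, as with MPT-7B's 32 heads).
    start = 2 ** (-8.0 / n_heads)
    return [start ** (i + 1) for i in range(n_heads)]

def alibi_bias(n_heads, seq_len):
    # bias[h][i][j] = -slope_h * (i - j): added to the attention logits,
    # penalizing distant key positions linearly. Positions with j > i are
    # removed by the causal mask, so only the lower triangle matters.
    slopes = alibi_slopes(n_heads)
    return [[[-m * (i - j) for j in range(seq_len)]
             for i in range(seq_len)]
            for m in slopes]
```

Because the penalty is a simple linear function of distance, it extends naturally to positions beyond the training length, which is what lets MPT models extrapolate past their 2048-token training window.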
| Hyperparameter | Value |
|-----------------|-------|
| n_parameters | 6.7B |
| n_layers | 32 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 50432 |
| sequence length | 2048 |
## Training Data
### Streaming Datasets
Data was formatted using the MosaicML [StreamingDataset](https://github.com/mosaicml/streaming) library to host our data in object storage and efficiently stream it to our compute cluster during training.
StreamingDataset obviates the need to download the whole dataset before starting training, and allows instant resumption of training from any point in the dataset.
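The instant-resumption property rests on a deterministic mapping from a global sample index to a shard and offset. A toy sketch of that idea (illustrative only; `locate` and `shard_sizes` are our names, not the StreamingDataset API):

```python
def locate(sample_idx, shard_sizes):
    # Walk the shard sizes until the global index falls inside a shard;
    # training can then resume from any (shard, offset) mid-epoch.
    for shard, size in enumerate(shard_sizes):
        if sample_idx < size:
            return shard, sample_idx
        sample_idx -= size
    raise IndexError("sample index out of range")
```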
### Data Mix
The model was trained for 1T tokens (with batch size 1760 and sequence length 2048). It was trained on the following data mix:
| Data Source | Number of Tokens in Source | Proportion | Effective Number of Tokens | Epochs |
|-------------|----------------------------|------------|----------------------------|--------|
| mC4 3.1.0 - English | 417.99 B | 0.33 | 330 B | 0.14 |
| C4 - English - SemDedup 80% | 100.42 B | 0.299 | 299 B | 2.98 |
| RedPajama - CommonCrawl | 878.45 B | 0.1 | 100 B | 0.11 |
| The Stack - Selected Languages | 463.78 B | 0.1 | 100 B | 0.22 |
| RedPajama - Wikipedia - En | 4.87 B | 0.04 | 40 B | 8.21 |
| The Stack - Markdown | 107.07 B | 0.035 | 35 B | 0.33 |
| S2ORC | 48.85 B | 0.033 | 33 B | 0.68 |
| RedPajama - Books | 26.02 B | 0.03 | 30 B | 1.15 |
| RedPajama - arXiv | 28.10 B | 0.019 | 19 B | 0.68 |
| RedPajama - StackExchange | 20.54 B | 0.014 | 14 B | 0.68 |
Samples for each batch were selected from one of the datasets with the probability specified above.
The examples were shuffled within each dataset, and each example was constructed from as many sequences from that dataset as were necessary to fill the 2048 sequence length.
The data was tokenized using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer. This BPE tokenizer has a number of desirable characteristics, most of which are relevant for tokenizing code:
1. It was trained on a diverse mix of data that includes code (The Pile).
2. It applies consistent space delimitation, unlike the GPT-2 tokenizer, which tokenizes inconsistently depending on the presence of prefix spaces.
3. It contains tokens for repeated space characters, which allows superior compression of text with large amounts of repeated spaces.
Setting the model vocabulary size to 50432, a multiple of 128 (as in [MEGATRON-LM](https://arxiv.org/abs/1909.08053)), increased model flop utilization (MFU) by up to four percentage points.
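Rounding a vocabulary up to the next multiple of a fixed alignment is a one-line computation; a sketch (our own helper, not from llm-foundry):

```python
def pad_vocab(vocab_size, multiple=128):
    # Round up to the next multiple so the embedding and output-projection
    # matrix dimensions align well with GPU tensor cores.
    return ((vocab_size + multiple - 1) // multiple) * multiple
```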
### Training Configuration
This model was trained on 440 A100-40GBs for about 9.5 days using the [MosaicML Platform](https://www.mosaicml.com/platform).
The model was trained with sharded data parallelism using [FSDP](https://pytorch.org/docs/stable/fsdp.html) and used the [LION](https://arxiv.org/abs/2302.06675) optimizer.
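LION differs from AdamW in that the update direction is only the sign of an interpolated momentum. A hedged scalar sketch of one step (names and defaults are ours; see the LION paper for the full method):

```python
def lion_step(p, g, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    # Interpolate momentum and gradient, take only the sign as the update,
    # apply decoupled weight decay, then update the momentum buffer.
    update = beta1 * m + (1 - beta1) * g
    sign = (update > 0) - (update < 0)
    p = p - lr * (sign + wd * p)
    m = beta2 * m + (1 - beta2) * g
    return p, m
```

Because the update magnitude is fixed at the learning rate regardless of gradient scale, LION typically needs a smaller learning rate than AdamW but less optimizer state per parameter.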
## Limitations and Biases
_The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_
MPT-7B (Base) is **not** intended for deployment without finetuning.
It should not be used for human-facing interactions without further guardrails and user consent.
MPT-7B can produce factually incorrect output, and should not be relied on to produce factually accurate information.
MPT-7B was trained on various public datasets.
While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
## MosaicML Platform
If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-7b).
## Disclaimer
The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.
## Citation
Please cite this model using the following format:
```
@online{MosaicML2023Introducing,
author = {MosaicML NLP Team},
title = {Introducing MPT-7B: A New Standard for Open-Source,
Commercially Usable LLMs},
year = {2023},
url = {www.mosaicml.com/blog/mpt-7b},
note = {Accessed: 2023-05-05},
urldate = {2023-05-05}
}
```
| 12,034 | [
[
-0.0394287109375,
-0.0384521484375,
0.02227783203125,
0.02227783203125,
-0.0311126708984375,
-0.004138946533203125,
-0.0043792724609375,
-0.02716064453125,
-0.0014581680297851562,
0.0286712646484375,
-0.041107177734375,
-0.041656494140625,
-0.046875,
0.004779815673828125,
-0.0206756591796875,
0.0718994140625,
-0.0012226104736328125,
-0.007061004638671875,
-0.002872467041015625,
-0.0115814208984375,
-0.013702392578125,
-0.0318603515625,
-0.04290771484375,
-0.0260772705078125,
0.04608154296875,
0.0076141357421875,
0.06329345703125,
0.07171630859375,
0.03253173828125,
0.0240478515625,
-0.0171661376953125,
0.016754150390625,
-0.036102294921875,
-0.0232696533203125,
0.01108551025390625,
-0.0272674560546875,
-0.03594970703125,
0.0102386474609375,
0.0367431640625,
0.02490234375,
-0.005970001220703125,
0.0323486328125,
-0.0014524459838867188,
0.0275726318359375,
-0.033355712890625,
0.0211181640625,
-0.029266357421875,
0.0135955810546875,
-0.01251220703125,
0.00032973289489746094,
-0.041229248046875,
-0.0269317626953125,
0.005634307861328125,
-0.044342041015625,
0.028411865234375,
0.0026302337646484375,
0.067626953125,
0.0251007080078125,
-0.02838134765625,
0.0002598762512207031,
-0.039794921875,
0.0513916015625,
-0.060577392578125,
0.02490234375,
0.0189361572265625,
0.019134521484375,
-0.002315521240234375,
-0.078125,
-0.056915283203125,
-0.015594482421875,
-0.00875091552734375,
0.0288238525390625,
-0.01290130615234375,
0.0014142990112304688,
0.0384521484375,
0.040008544921875,
-0.04571533203125,
-0.0027561187744140625,
-0.035369873046875,
-0.01093292236328125,
0.0406494140625,
0.0148468017578125,
0.0206298828125,
-0.0316162109375,
-0.039031982421875,
-0.03265380859375,
-0.053863525390625,
0.00333404541015625,
0.0230865478515625,
-0.004375457763671875,
-0.035369873046875,
0.04351806640625,
-0.0012178421020507812,
0.0435791015625,
0.00970458984375,
-0.01068878173828125,
0.0270233154296875,
-0.0210723876953125,
-0.0223541259765625,
-0.00337982177734375,
0.0740966796875,
0.025970458984375,
-0.002902984619140625,
-0.0009455680847167969,
-0.0027141571044921875,
-0.001148223876953125,
0.0033130645751953125,
-0.0732421875,
-0.0187835693359375,
0.0169830322265625,
-0.037750244140625,
-0.0140228271484375,
-0.004901885986328125,
-0.046722412109375,
-0.0296783447265625,
-0.0173492431640625,
0.045989990234375,
-0.05438232421875,
-0.0196685791015625,
0.00389862060546875,
-0.01079559326171875,
0.0328369140625,
0.0173187255859375,
-0.06585693359375,
-0.0025386810302734375,
0.032928466796875,
0.07586669921875,
-0.01139068603515625,
-0.038055419921875,
-0.009307861328125,
-0.0007505416870117188,
-0.0026912689208984375,
0.04541015625,
-0.01296234130859375,
-0.0189971923828125,
-0.0247955322265625,
0.01230621337890625,
-0.0216522216796875,
-0.03289794921875,
0.0275726318359375,
-0.0248260498046875,
0.03729248046875,
-0.0118560791015625,
-0.0285797119140625,
-0.0226287841796875,
0.009185791015625,
-0.048828125,
0.07958984375,
0.03253173828125,
-0.0633544921875,
0.022369384765625,
-0.055328369140625,
-0.01067352294921875,
-0.0091094970703125,
0.00843048095703125,
-0.055511474609375,
-0.006427764892578125,
0.029754638671875,
0.03729248046875,
-0.03594970703125,
0.0215911865234375,
-0.0166168212890625,
-0.0396728515625,
0.005977630615234375,
-0.045379638671875,
0.0748291015625,
0.0263214111328125,
-0.046783447265625,
0.0102081298828125,
-0.0628662109375,
-0.01558685302734375,
0.019195556640625,
-0.031524658203125,
0.0333251953125,
-0.018310546875,
0.0004887580871582031,
0.018646240234375,
0.007366180419921875,
-0.047698974609375,
0.0160980224609375,
-0.0284576416015625,
0.04583740234375,
0.052032470703125,
-0.01541900634765625,
0.0234375,
-0.043365478515625,
0.03350830078125,
0.0148773193359375,
0.03326416015625,
-0.024810791015625,
-0.04803466796875,
-0.07135009765625,
-0.029022216796875,
0.025146484375,
0.03546142578125,
-0.07135009765625,
0.0278167724609375,
-0.019195556640625,
-0.06024169921875,
-0.054443359375,
-0.007396697998046875,
0.03167724609375,
0.038848876953125,
0.04608154296875,
-0.0253753662109375,
-0.045989990234375,
-0.05938720703125,
0.0020122528076171875,
-0.0010824203491210938,
-0.004131317138671875,
0.015960693359375,
0.037445068359375,
-0.0233154296875,
0.072021484375,
-0.0193328857421875,
0.0036487579345703125,
-0.0270843505859375,
0.01030731201171875,
0.033294677734375,
0.048736572265625,
0.043182373046875,
-0.054718017578125,
-0.04498291015625,
-0.0086822509765625,
-0.05389404296875,
0.00574493408203125,
-0.0113372802734375,
-0.01047515869140625,
0.0205078125,
0.01291656494140625,
-0.07354736328125,
0.0391845703125,
0.0465087890625,
-0.031463623046875,
0.0399169921875,
-0.00415802001953125,
0.004138946533203125,
-0.103515625,
0.0037517547607421875,
-0.0090179443359375,
-0.0156707763671875,
-0.0379638671875,
-0.01506805419921875,
0.00667572021484375,
-0.002323150634765625,
-0.0643310546875,
0.0390625,
-0.030426025390625,
0.0014276504516601562,
-0.00826263427734375,
-0.0190582275390625,
-0.0032405853271484375,
0.06103515625,
0.01593017578125,
0.059722900390625,
0.03326416015625,
-0.03192138671875,
0.03857421875,
0.030426025390625,
-0.026458740234375,
0.0128936767578125,
-0.0445556640625,
0.01332855224609375,
-0.0028171539306640625,
0.0265350341796875,
-0.0576171875,
-0.01468658447265625,
0.02703857421875,
-0.03936767578125,
0.019805908203125,
-0.0186767578125,
-0.04156494140625,
-0.04730224609375,
-0.0175323486328125,
0.031951904296875,
0.05841064453125,
-0.06597900390625,
0.055023193359375,
0.0018978118896484375,
0.0102081298828125,
-0.054931640625,
-0.051361083984375,
-0.004932403564453125,
-0.026031494140625,
-0.061248779296875,
0.0247802734375,
-0.00766754150390625,
0.01473236083984375,
-0.01358795166015625,
-0.001964569091796875,
0.01309967041015625,
-0.0081939697265625,
0.0333251953125,
0.0306854248046875,
-0.022552490234375,
-0.0170440673828125,
-0.004413604736328125,
-0.0178070068359375,
-0.00469970703125,
-0.01468658447265625,
0.06890869140625,
-0.0248260498046875,
-0.0190582275390625,
-0.041473388671875,
0.003803253173828125,
0.044403076171875,
-0.009429931640625,
0.07745361328125,
0.07904052734375,
-0.00775146484375,
0.0044097900390625,
-0.05419921875,
-0.0090484619140625,
-0.038970947265625,
0.0213775634765625,
-0.0139923095703125,
-0.06072998046875,
0.047943115234375,
0.01419830322265625,
0.00252532958984375,
0.052032470703125,
0.0628662109375,
0.0029850006103515625,
0.0703125,
0.031768798828125,
0.01454925537109375,
0.0469970703125,
-0.048187255859375,
-0.0010042190551757812,
-0.0692138671875,
-0.0232696533203125,
-0.0093231201171875,
-0.0177459716796875,
-0.049652099609375,
-0.04376220703125,
0.0217742919921875,
-0.00670623779296875,
-0.048126220703125,
0.049560546875,
-0.048248291015625,
0.033843994140625,
0.065673828125,
0.0198211669921875,
0.009918212890625,
-0.00701904296875,
-0.0157623291015625,
0.0135498046875,
-0.06756591796875,
-0.0357666015625,
0.08978271484375,
0.0308380126953125,
0.043060302734375,
0.00010114908218383789,
0.050872802734375,
-0.00341796875,
0.0439453125,
-0.030548095703125,
0.033233642578125,
0.00555419921875,
-0.049285888671875,
-0.003612518310546875,
-0.042633056640625,
-0.0638427734375,
0.027099609375,
-0.017333984375,
-0.053375244140625,
0.033050537109375,
0.0150909423828125,
-0.040130615234375,
0.0440673828125,
-0.07073974609375,
0.07745361328125,
-0.00923919677734375,
-0.04010009765625,
0.01406097412109375,
-0.0634765625,
0.0286407470703125,
0.0029811859130859375,
-0.005832672119140625,
-0.00768280029296875,
0.020721435546875,
0.0545654296875,
-0.033447265625,
0.06268310546875,
-0.013763427734375,
0.0157623291015625,
0.036102294921875,
-0.00586700439453125,
0.0299224853515625,
0.001590728759765625,
-0.005718231201171875,
0.021270751953125,
0.01561737060546875,
-0.0287017822265625,
-0.02154541015625,
0.03753662109375,
-0.0849609375,
-0.0421142578125,
-0.037384033203125,
-0.052642822265625,
0.0009584426879882812,
0.010528564453125,
0.05108642578125,
0.02923583984375,
0.0044097900390625,
0.030487060546875,
0.04296875,
-0.039520263671875,
0.05633544921875,
0.017547607421875,
-0.00036716461181640625,
-0.04010009765625,
0.06402587890625,
-0.003082275390625,
0.0273590087890625,
0.0139617919921875,
0.010986328125,
-0.0260772705078125,
-0.03546142578125,
-0.043853759765625,
0.02496337890625,
-0.044647216796875,
-0.033172607421875,
-0.05145263671875,
-0.03546142578125,
-0.0352783203125,
0.0016069412231445312,
-0.042236328125,
-0.032196044921875,
-0.03521728515625,
-0.00505828857421875,
0.02862548828125,
0.038970947265625,
-0.00794219970703125,
0.051849365234375,
-0.06390380859375,
0.021453857421875,
0.01739501953125,
0.027679443359375,
0.0006761550903320312,
-0.050689697265625,
-0.0227508544921875,
0.00800323486328125,
-0.050079345703125,
-0.054779052734375,
0.047943115234375,
-0.0003879070281982422,
0.029449462890625,
0.01617431640625,
-0.0087432861328125,
0.044891357421875,
-0.02899169921875,
0.071533203125,
0.026824951171875,
-0.06427001953125,
0.01617431640625,
-0.03692626953125,
0.035552978515625,
0.0239715576171875,
0.032012939453125,
-0.038238525390625,
-0.0106964111328125,
-0.0548095703125,
-0.056915283203125,
0.0753173828125,
0.04034423828125,
0.00524139404296875,
0.00008511543273925781,
0.02838134765625,
0.01171875,
0.010223388671875,
-0.08721923828125,
-0.016571044921875,
-0.04400634765625,
-0.030975341796875,
-0.00737762451171875,
-0.01285552978515625,
-0.00464630126953125,
-0.040679931640625,
0.053009033203125,
-0.0005445480346679688,
0.047210693359375,
0.021484375,
-0.0231475830078125,
0.0010814666748046875,
-0.001224517822265625,
0.03631591796875,
0.047119140625,
-0.0218505859375,
0.00151824951171875,
0.0181732177734375,
-0.052886962890625,
0.005466461181640625,
0.024017333984375,
-0.0093994140625,
-0.0160369873046875,
0.0263824462890625,
0.08251953125,
-0.005123138427734375,
-0.0300750732421875,
0.03497314453125,
-0.01198577880859375,
-0.017852783203125,
-0.013641357421875,
0.01373291015625,
0.02728271484375,
0.03753662109375,
0.01192474365234375,
-0.00039267539978027344,
-0.01229095458984375,
-0.03729248046875,
0.007022857666015625,
0.0073089599609375,
-0.0143890380859375,
-0.022125244140625,
0.06591796875,
0.0034046173095703125,
-0.01436614990234375,
0.059051513671875,
-0.007049560546875,
-0.033966064453125,
0.05657958984375,
0.05029296875,
0.055633544921875,
-0.018707275390625,
0.0198211669921875,
0.031402587890625,
0.023193359375,
-0.00897216796875,
-0.0029850006103515625,
-0.0089569091796875,
-0.05218505859375,
-0.0307769775390625,
-0.06536865234375,
-0.024261474609375,
-0.0006799697875976562,
-0.02972412109375,
0.0249786376953125,
-0.028533935546875,
-0.0152740478515625,
-0.012939453125,
0.0014619827270507812,
-0.062225341796875,
0.01220703125,
0.0291900634765625,
0.06781005859375,
-0.048675537109375,
0.069580078125,
0.022857666015625,
-0.04010009765625,
-0.0682373046875,
-0.03277587890625,
-0.0081024169921875,
-0.069580078125,
0.0218658447265625,
0.0258941650390625,
0.014923095703125,
0.0097808837890625,
-0.048492431640625,
-0.07000732421875,
0.1221923828125,
0.045074462890625,
-0.034088134765625,
-0.014923095703125,
0.035675048828125,
0.036376953125,
-0.030609130859375,
0.042633056640625,
0.056915283203125,
0.0294036865234375,
0.030731201171875,
-0.06390380859375,
0.007396697998046875,
-0.0240631103515625,
-0.0001226663589477539,
0.001468658447265625,
-0.06744384765625,
0.08001708984375,
-0.01000213623046875,
-0.00788116455078125,
0.018798828125,
0.048797607421875,
0.0286407470703125,
0.0239105224609375,
0.0260162353515625,
0.059722900390625,
0.036590576171875,
-0.0260772705078125,
0.09881591796875,
-0.020782470703125,
0.043060302734375,
0.066650390625,
0.0193328857421875,
0.03717041015625,
0.021636962890625,
-0.0092926025390625,
0.032684326171875,
0.07330322265625,
-0.0213165283203125,
0.0282135009765625,
-0.0011339187622070312,
-0.01555633544921875,
-0.016632080078125,
0.0105743408203125,
-0.040435791015625,
0.0250701904296875,
0.01241302490234375,
-0.042572021484375,
-0.00632476806640625,
0.01186370849609375,
0.0073089599609375,
-0.033172607421875,
-0.01428985595703125,
0.044647216796875,
0.01509857177734375,
-0.045379638671875,
0.053680419921875,
-0.007793426513671875,
0.052978515625,
-0.03900146484375,
0.0099639892578125,
-0.025482177734375,
0.01509857177734375,
-0.01837158203125,
-0.049560546875,
0.013671875,
-0.00791168212890625,
0.0024547576904296875,
-0.0160369873046875,
0.0294647216796875,
-0.0265960693359375,
-0.0352783203125,
0.011962890625,
0.022918701171875,
0.006855010986328125,
-0.007213592529296875,
-0.058929443359375,
0.00904083251953125,
0.0011510848999023438,
-0.044708251953125,
0.015777587890625,
0.01509857177734375,
0.0163726806640625,
0.047821044921875,
0.055419921875,
-0.00963592529296875,
0.0234222412109375,
-0.008087158203125,
0.072509765625,
-0.05889892578125,
-0.0223541259765625,
-0.06689453125,
0.051666259765625,
0.0017557144165039062,
-0.027923583984375,
0.0535888671875,
0.047119140625,
0.060028076171875,
-0.00621795654296875,
0.034210205078125,
-0.00945281982421875,
0.0175933837890625,
-0.0347900390625,
0.06005859375,
-0.0303497314453125,
0.0137939453125,
-0.0177764892578125,
-0.09521484375,
-0.02740478515625,
0.041961669921875,
-0.02935791015625,
0.013824462890625,
0.05291748046875,
0.06134033203125,
-0.0185394287109375,
0.0127410888671875,
0.01477813720703125,
0.0241546630859375,
0.0269317626953125,
0.0572509765625,
0.06829833984375,
-0.04541015625,
0.049652099609375,
-0.03741455078125,
-0.0061187744140625,
-0.002849578857421875,
-0.056915283203125,
-0.07403564453125,
-0.0379638671875,
-0.013397216796875,
-0.03533935546875,
-0.0054931640625,
0.07635498046875,
0.061492919921875,
-0.044647216796875,
-0.0153045654296875,
-0.0019378662109375,
-0.00004565715789794922,
-0.0015096664428710938,
-0.01473236083984375,
0.045318603515625,
0.003818511962890625,
-0.05859375,
0.008819580078125,
0.0083465576171875,
0.02325439453125,
-0.00621795654296875,
-0.0074310302734375,
-0.018310546875,
0.00417327880859375,
0.046295166015625,
0.018157958984375,
-0.04150390625,
-0.020751953125,
-0.003566741943359375,
0.0003807544708251953,
0.035797119140625,
0.034881591796875,
-0.05340576171875,
0.020751953125,
0.023406982421875,
0.03167724609375,
0.0810546875,
-0.0014333724975585938,
0.033233642578125,
-0.0295867919921875,
0.0204925537109375,
0.020416259765625,
0.0367431640625,
0.030029296875,
-0.02587890625,
0.047210693359375,
0.0278778076171875,
-0.046051025390625,
-0.056427001953125,
0.0011920928955078125,
-0.09130859375,
-0.003940582275390625,
0.0858154296875,
-0.0083770751953125,
-0.04229736328125,
0.01544952392578125,
-0.0251312255859375,
0.041168212890625,
-0.01320648193359375,
0.05267333984375,
0.03594970703125,
-0.00907135009765625,
-0.0367431640625,
-0.02984619140625,
0.037078857421875,
0.0236053466796875,
-0.049835205078125,
-0.01168060302734375,
0.018096923828125,
0.03125,
0.01171875,
0.034454345703125,
-0.0177001953125,
0.02886962890625,
0.0010004043579101562,
0.0187835693359375,
-0.0301971435546875,
-0.010650634765625,
-0.01110076904296875,
0.01091766357421875,
-0.027923583984375,
-0.0031452178955078125
]
] |
Helsinki-NLP/opus-mt-fi-en | 2023-08-16T11:34:26.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"fi",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-fi-en | 3 | 78,199 | transformers | 2022-03-02T23:29:04 | ---
language:
- fi
- en
tags:
- translation
license: apache-2.0
---
### fin-eng
* source group: Finnish
* target group: English
* OPUS readme: [fin-eng](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/fin-eng/README.md)
* model: transformer-align
* source language(s): fin
* target language(s): eng
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus-2020-08-05.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opus-2020-08-05.zip)
* test set translations: [opus-2020-08-05.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opus-2020-08-05.test.txt)
* test set scores: [opus-2020-08-05.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opus-2020-08-05.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdev2015-enfi-fineng.fin.eng | 25.3 | 0.536 |
| newstest2015-enfi-fineng.fin.eng | 26.9 | 0.547 |
| newstest2016-enfi-fineng.fin.eng | 29.0 | 0.571 |
| newstest2017-enfi-fineng.fin.eng | 32.3 | 0.594 |
| newstest2018-enfi-fineng.fin.eng | 23.8 | 0.517 |
| newstest2019-fien-fineng.fin.eng | 29.0 | 0.565 |
| newstestB2016-enfi-fineng.fin.eng | 24.5 | 0.527 |
| newstestB2017-enfi-fineng.fin.eng | 27.4 | 0.557 |
| newstestB2017-fien-fineng.fin.eng | 27.4 | 0.557 |
| Tatoeba-test.fin.eng | 53.4 | 0.697 |
### System Info:
- hf_name: fin-eng
- source_languages: fin
- target_languages: eng
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/fin-eng/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['fi', 'en']
- src_constituents: {'fin'}
- tgt_constituents: {'eng'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opus-2020-08-05.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opus-2020-08-05.test.txt
- src_alpha3: fin
- tgt_alpha3: eng
- short_pair: fi-en
- chrF2_score: 0.697
- bleu: 53.4
- brevity_penalty: 0.99
- ref_len: 74651.0
- src_name: Finnish
- tgt_name: English
- train_date: 2020-08-05
- src_alpha2: fi
- tgt_alpha2: en
- prefer_old: False
- long_pair: fin-eng
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 2,557 | [
[
-0.03619384765625,
-0.049835205078125,
0.0242767333984375,
0.0243682861328125,
-0.0325927734375,
-0.01438140869140625,
-0.0189971923828125,
-0.035491943359375,
0.0248260498046875,
0.02191162109375,
-0.048675537109375,
-0.05780029296875,
-0.035125732421875,
0.0251922607421875,
0.00562286376953125,
0.06585693359375,
-0.0102996826171875,
0.0055389404296875,
0.0291595458984375,
-0.0271148681640625,
-0.034027099609375,
-0.01126861572265625,
-0.05328369140625,
-0.019866943359375,
0.032958984375,
0.02667236328125,
0.0284881591796875,
0.03765869140625,
0.0494384765625,
0.0206298828125,
-0.02337646484375,
0.0135955810546875,
-0.010345458984375,
-0.01519775390625,
0.0014314651489257812,
-0.0310821533203125,
-0.0499267578125,
-0.0160980224609375,
0.06329345703125,
0.039306640625,
0.01245880126953125,
0.037200927734375,
-0.006565093994140625,
0.058135986328125,
-0.0175933837890625,
0.01534271240234375,
-0.034088134765625,
-0.01160430908203125,
-0.029052734375,
-0.018341064453125,
-0.036773681640625,
-0.01399993896484375,
0.00936126708984375,
-0.0474853515625,
0.00604248046875,
0.01505279541015625,
0.120849609375,
0.006816864013671875,
-0.02593994140625,
-0.00754547119140625,
-0.0241851806640625,
0.05596923828125,
-0.06524658203125,
0.034515380859375,
0.034454345703125,
-0.00910186767578125,
-0.0008091926574707031,
-0.032806396484375,
-0.02484130859375,
0.01190185546875,
-0.02850341796875,
0.020050048828125,
-0.01177978515625,
-0.0038738250732421875,
0.00580596923828125,
0.04473876953125,
-0.048126220703125,
0.005107879638671875,
-0.03717041015625,
-0.01522064208984375,
0.045257568359375,
0.003284454345703125,
0.0162811279296875,
-0.040802001953125,
-0.04034423828125,
-0.03814697265625,
-0.039794921875,
0.022674560546875,
0.041259765625,
0.037811279296875,
-0.04071044921875,
0.046173095703125,
-0.0087127685546875,
0.042755126953125,
0.004917144775390625,
-0.0075531005859375,
0.05218505859375,
-0.05096435546875,
-0.0178985595703125,
-0.0057373046875,
0.09161376953125,
0.0235748291015625,
-0.003681182861328125,
0.0156707763671875,
-0.0254669189453125,
-0.0267333984375,
-0.01056671142578125,
-0.0552978515625,
0.012847900390625,
0.0236053466796875,
-0.0243072509765625,
-0.0182342529296875,
0.011505126953125,
-0.061004638671875,
0.0186767578125,
0.007404327392578125,
0.044219970703125,
-0.055389404296875,
-0.0250701904296875,
0.0302581787109375,
-0.0025577545166015625,
0.027069091796875,
-0.0106048583984375,
-0.04156494140625,
0.0156402587890625,
0.03228759765625,
0.06634521484375,
-0.005878448486328125,
-0.0234222412109375,
-0.0118865966796875,
-0.00012814998626708984,
-0.009033203125,
0.050079345703125,
-0.010406494140625,
-0.023681640625,
-0.01496124267578125,
0.0270233154296875,
-0.018768310546875,
-0.007755279541015625,
0.065185546875,
-0.0211334228515625,
0.042266845703125,
-0.025909423828125,
-0.0406494140625,
-0.0305938720703125,
0.019500732421875,
-0.056549072265625,
0.087158203125,
0.0204620361328125,
-0.066650390625,
0.026458740234375,
-0.06744384765625,
-0.009246826171875,
-0.0026531219482421875,
0.0150909423828125,
-0.05523681640625,
-0.01070404052734375,
0.0156402587890625,
0.0285186767578125,
-0.033172607421875,
0.0396728515625,
-0.0079345703125,
-0.0177154541015625,
-0.00376129150390625,
-0.022064208984375,
0.09161376953125,
0.015350341796875,
-0.036407470703125,
0.00925445556640625,
-0.055999755859375,
-0.0020084381103515625,
0.017913818359375,
-0.023712158203125,
-0.022735595703125,
-0.0171051025390625,
0.01861572265625,
0.0007748603820800781,
0.0206451416015625,
-0.049652099609375,
0.019317626953125,
-0.053009033203125,
0.023223876953125,
0.058929443359375,
0.013336181640625,
0.0164947509765625,
-0.034271240234375,
0.02447509765625,
0.0129547119140625,
0.01363372802734375,
0.002735137939453125,
-0.04254150390625,
-0.05523681640625,
-0.025115966796875,
0.04296875,
0.047576904296875,
-0.04937744140625,
0.0565185546875,
-0.045745849609375,
-0.062225341796875,
-0.04937744140625,
-0.01457977294921875,
0.042633056640625,
0.031280517578125,
0.035980224609375,
-0.01526641845703125,
-0.03643798828125,
-0.084228515625,
-0.011444091796875,
-0.0309600830078125,
0.003162384033203125,
0.0163116455078125,
0.061981201171875,
0.0031909942626953125,
0.047882080078125,
-0.033782958984375,
-0.033905029296875,
-0.005977630615234375,
0.01461029052734375,
0.037322998046875,
0.060394287109375,
0.055267333984375,
-0.064697265625,
-0.037750244140625,
0.00841522216796875,
-0.04339599609375,
-0.00952911376953125,
-0.0073089599609375,
-0.02227783203125,
0.029052734375,
0.0018262863159179688,
-0.038726806640625,
0.0146942138671875,
0.039825439453125,
-0.06976318359375,
0.035400390625,
-0.0090179443359375,
0.0255889892578125,
-0.10015869140625,
0.0233917236328125,
0.0011777877807617188,
-0.007709503173828125,
-0.02789306640625,
0.005420684814453125,
0.007640838623046875,
0.016693115234375,
-0.037353515625,
0.060791015625,
-0.039520263671875,
0.0007143020629882812,
0.034393310546875,
0.005817413330078125,
0.0007300376892089844,
0.0643310546875,
-0.011627197265625,
0.0697021484375,
0.034210205078125,
-0.0283355712890625,
0.004863739013671875,
0.0306243896484375,
-0.02978515625,
0.021881103515625,
-0.05108642578125,
-0.01462554931640625,
0.02374267578125,
-0.00405120849609375,
-0.06475830078125,
-0.0126953125,
0.019989013671875,
-0.0504150390625,
0.0243072509765625,
-0.006526947021484375,
-0.038360595703125,
-0.020751953125,
-0.03594970703125,
0.03607177734375,
0.026031494140625,
-0.015228271484375,
0.0557861328125,
0.01506805419921875,
-0.0086669921875,
-0.05072021484375,
-0.058502197265625,
-0.01113128662109375,
-0.018768310546875,
-0.05419921875,
0.035736083984375,
-0.0122528076171875,
0.00023615360260009766,
0.0170745849609375,
-0.003917694091796875,
-0.0107269287109375,
0.01103973388671875,
-0.0005307197570800781,
0.03314208984375,
-0.031524658203125,
0.004131317138671875,
0.00156402587890625,
-0.0061798095703125,
-0.0113983154296875,
-0.0191802978515625,
0.053741455078125,
-0.0411376953125,
-0.013824462890625,
-0.047027587890625,
0.003246307373046875,
0.04217529296875,
-0.028656005859375,
0.070068359375,
0.03948974609375,
-0.0198974609375,
0.024169921875,
-0.046661376953125,
0.00624847412109375,
-0.032379150390625,
0.0211944580078125,
-0.043548583984375,
-0.054473876953125,
0.06890869140625,
0.0236053466796875,
0.0268096923828125,
0.08270263671875,
0.035552978515625,
0.01023101806640625,
0.0377197265625,
0.029083251953125,
0.0150909423828125,
0.039764404296875,
-0.047637939453125,
-0.0143280029296875,
-0.059295654296875,
-0.024017333984375,
-0.052001953125,
-0.01885986328125,
-0.06951904296875,
-0.00933837890625,
0.0225982666015625,
-0.0014581680297851562,
-0.023406982421875,
0.057037353515625,
-0.03607177734375,
0.022857666015625,
0.04345703125,
0.0098419189453125,
0.020355224609375,
-0.00789642333984375,
-0.035614013671875,
-0.0016775131225585938,
-0.0404052734375,
-0.03814697265625,
0.08966064453125,
0.0291748046875,
0.00878143310546875,
0.0184783935546875,
0.053070068359375,
0.007091522216796875,
0.01363372802734375,
-0.04412841796875,
0.035430908203125,
-0.0156402587890625,
-0.06561279296875,
-0.034027099609375,
-0.0347900390625,
-0.06561279296875,
0.01494598388671875,
-0.0170440673828125,
-0.048187255859375,
0.021514892578125,
-0.01021575927734375,
-0.01374053955078125,
0.051483154296875,
-0.056610107421875,
0.07373046875,
0.0020503997802734375,
-0.0263519287109375,
0.0125885009765625,
-0.048187255859375,
0.023284912109375,
-0.01457977294921875,
0.022369384765625,
-0.0117340087890625,
-0.01168060302734375,
0.0723876953125,
-0.0265045166015625,
0.045379638671875,
-0.011322021484375,
-0.0114898681640625,
0.01934814453125,
0.001667022705078125,
0.03485107421875,
-0.007549285888671875,
-0.0216827392578125,
0.0244293212890625,
0.01187896728515625,
-0.0450439453125,
-0.01044464111328125,
0.037017822265625,
-0.059051513671875,
-0.032958984375,
-0.035491943359375,
-0.043182373046875,
-0.003582000732421875,
0.035308837890625,
0.03277587890625,
0.042327880859375,
-0.004245758056640625,
0.04229736328125,
0.049957275390625,
-0.0234527587890625,
0.04034423828125,
0.039154052734375,
0.0010290145874023438,
-0.0518798828125,
0.05352783203125,
0.015838623046875,
0.023529052734375,
0.039520263671875,
0.0088653564453125,
-0.019317626953125,
-0.0504150390625,
-0.040069580078125,
0.03668212890625,
-0.0283050537109375,
-0.0321044921875,
-0.039947509765625,
-0.005680084228515625,
-0.0222930908203125,
0.00244903564453125,
-0.039031982421875,
-0.03851318359375,
-0.005290985107421875,
-0.02630615234375,
0.03973388671875,
0.0292816162109375,
-0.004428863525390625,
0.0203094482421875,
-0.059967041015625,
0.00787353515625,
-0.0136566162109375,
0.0438232421875,
-0.02178955078125,
-0.055084228515625,
-0.0192108154296875,
-0.0020084381103515625,
-0.015228271484375,
-0.06756591796875,
0.036529541015625,
0.0021190643310546875,
0.0248565673828125,
0.01116943359375,
0.01195526123046875,
0.050537109375,
-0.0217437744140625,
0.0750732421875,
0.0012331008911132812,
-0.065185546875,
0.04351806640625,
-0.03131103515625,
0.0298004150390625,
0.053314208984375,
0.01971435546875,
-0.0209503173828125,
-0.056610107421875,
-0.0677490234375,
-0.07476806640625,
0.061859130859375,
0.038360595703125,
-0.010406494140625,
-0.0009899139404296875,
0.0150604248046875,
0.007549285888671875,
-0.01503753662109375,
-0.0782470703125,
-0.04254150390625,
0.0086669921875,
-0.033843994140625,
0.0066680908203125,
-0.03131103515625,
-0.007061004638671875,
-0.0184173583984375,
0.0819091796875,
0.007793426513671875,
0.015380859375,
0.0391845703125,
0.00006496906280517578,
-0.002445220947265625,
0.0231475830078125,
0.0587158203125,
0.0311126708984375,
-0.03314208984375,
-0.00972747802734375,
0.0233612060546875,
-0.047576904296875,
0.005115509033203125,
0.007045745849609375,
-0.036651611328125,
0.0210418701171875,
0.0416259765625,
0.0701904296875,
0.0201416015625,
-0.035614013671875,
0.03497314453125,
-0.007701873779296875,
-0.04046630859375,
-0.0299835205078125,
-0.017425537109375,
0.0047149658203125,
0.0108795166015625,
0.02972412109375,
0.004669189453125,
-0.00209808349609375,
-0.015960693359375,
0.0070037841796875,
0.007740020751953125,
-0.0235137939453125,
-0.0290679931640625,
0.047271728515625,
0.0098419189453125,
-0.0182647705078125,
0.0228424072265625,
-0.018829345703125,
-0.0277252197265625,
0.04901123046875,
0.020294189453125,
0.082275390625,
-0.01103973388671875,
-0.007740020751953125,
0.0633544921875,
0.039459228515625,
-0.004192352294921875,
0.033203125,
0.0199737548828125,
-0.042388916015625,
-0.0263671875,
-0.061126708984375,
0.007297515869140625,
0.0106658935546875,
-0.063720703125,
0.0297698974609375,
-0.0031032562255859375,
-0.0294647216796875,
-0.002483367919921875,
0.03179931640625,
-0.04638671875,
0.0051422119140625,
-0.0219879150390625,
0.0809326171875,
-0.0716552734375,
0.051513671875,
0.0540771484375,
-0.054901123046875,
-0.07720947265625,
-0.016510009765625,
-0.0197601318359375,
-0.039947509765625,
0.041015625,
0.0010099411010742188,
0.00091552734375,
0.0012073516845703125,
-0.02593994140625,
-0.06781005859375,
0.09417724609375,
0.0297088623046875,
-0.0279998779296875,
-0.010498046875,
-0.0010061264038085938,
0.03515625,
-0.0079803466796875,
0.0178680419921875,
0.0340576171875,
0.06414794921875,
-0.013336181640625,
-0.088134765625,
0.011688232421875,
-0.0421142578125,
-0.0107421875,
0.0281829833984375,
-0.07012939453125,
0.06781005859375,
0.01195526123046875,
-0.016754150390625,
0.00017023086547851562,
0.04302978515625,
0.03277587890625,
0.009796142578125,
0.04217529296875,
0.058135986328125,
0.039459228515625,
-0.039215087890625,
0.07305908203125,
-0.031494140625,
0.043731689453125,
0.06304931640625,
0.0239715576171875,
0.063720703125,
0.04364013671875,
-0.0211181640625,
0.047271728515625,
0.06072998046875,
-0.01419830322265625,
0.024383544921875,
-0.016937255859375,
0.0008778572082519531,
-0.0177154541015625,
-0.01047515869140625,
-0.035003662109375,
0.031646728515625,
0.007564544677734375,
-0.013275146484375,
0.0050048828125,
-0.0168914794921875,
0.022552490234375,
-0.0005950927734375,
-0.0160064697265625,
0.048248291015625,
-0.007568359375,
-0.0484619140625,
0.05419921875,
0.0034275054931640625,
0.054443359375,
-0.05322265625,
0.007659912109375,
-0.0194244384765625,
0.006622314453125,
-0.005718231201171875,
-0.06427001953125,
0.02716064453125,
0.018035888671875,
-0.0176544189453125,
-0.022064208984375,
0.017913818359375,
-0.03314208984375,
-0.056793212890625,
0.0308380126953125,
0.031585693359375,
0.0158538818359375,
0.0221405029296875,
-0.051116943359375,
-0.004848480224609375,
0.01410675048828125,
-0.055908203125,
-0.0053558349609375,
0.0516357421875,
0.007282257080078125,
0.047607421875,
0.036529541015625,
0.011322021484375,
0.0010499954223632812,
-0.0084991455078125,
0.049835205078125,
-0.057769775390625,
-0.035491943359375,
-0.065185546875,
0.0450439453125,
-0.00482940673828125,
-0.04949951171875,
0.038909912109375,
0.062255859375,
0.0748291015625,
-0.00170135498046875,
0.0199127197265625,
-0.0160369873046875,
0.0290679931640625,
-0.05474853515625,
0.05389404296875,
-0.07110595703125,
0.0127410888671875,
-0.01558685302734375,
-0.055694580078125,
-0.022064208984375,
0.022125244140625,
-0.0139617919921875,
-0.0038509368896484375,
0.07220458984375,
0.0478515625,
0.007266998291015625,
-0.0139617919921875,
-0.003326416015625,
0.0269012451171875,
0.0160369873046875,
0.057220458984375,
0.01361083984375,
-0.07012939453125,
0.048553466796875,
-0.029449462890625,
0.009429931640625,
-0.0005273818969726562,
-0.05450439453125,
-0.0562744140625,
-0.055450439453125,
-0.0161590576171875,
-0.0244140625,
-0.01390838623046875,
0.0780029296875,
0.0222930908203125,
-0.0733642578125,
-0.0287017822265625,
-0.0004036426544189453,
0.0123291015625,
-0.0236358642578125,
-0.019561767578125,
0.058929443359375,
-0.010406494140625,
-0.07281494140625,
0.00899505615234375,
0.0009412765502929688,
0.01432037353515625,
0.0006756782531738281,
-0.00487518310546875,
-0.050048828125,
-0.0032978057861328125,
0.0150909423828125,
0.00655364990234375,
-0.061065673828125,
-0.017364501953125,
0.01375579833984375,
-0.0195159912109375,
0.0199737548828125,
0.0033512115478515625,
-0.021209716796875,
0.01280975341796875,
0.053985595703125,
0.028564453125,
0.04046630859375,
-0.006496429443359375,
0.03228759765625,
-0.055419921875,
0.027984619140625,
0.00928497314453125,
0.04327392578125,
0.01558685302734375,
-0.0160369873046875,
0.06524658203125,
0.0270843505859375,
-0.017303466796875,
-0.07684326171875,
-0.01268768310546875,
-0.09185791015625,
-0.0079345703125,
0.07037353515625,
-0.018890380859375,
-0.036407470703125,
0.0171661376953125,
-0.012237548828125,
0.029571533203125,
-0.0347900390625,
0.03436279296875,
0.07049560546875,
0.024505615234375,
0.01148223876953125,
-0.04071044921875,
0.0197906494140625,
0.034820556640625,
-0.0494384765625,
-0.006229400634765625,
0.0123748779296875,
0.02557373046875,
0.0278167724609375,
0.04766845703125,
-0.024993896484375,
0.01105499267578125,
-0.005462646484375,
0.0246734619140625,
-0.0213775634765625,
-0.0021991729736328125,
-0.0157012939453125,
0.00862884521484375,
-0.014190673828125,
-0.01299285888671875
]
] |
hf-internal-testing/tiny-stable-diffusion-torch | 2023-05-05T15:28:20.000Z | [
"diffusers",
"license:apache-2.0",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | null | hf-internal-testing | null | null | hf-internal-testing/tiny-stable-diffusion-torch | 2 | 78,043 | diffusers | 2022-11-03T13:35:38 | ---
license: apache-2.0
---
```python
from diffusers import StableDiffusionPipeline

# Load the tiny Stable Diffusion checkpoint (intended for fast unit tests, not image quality)
pipe = StableDiffusionPipeline.from_pretrained("hf-internal-testing/tiny-stable-diffusion-torch")
```
| 189 | [
[
-0.019622802734375,
-0.04443359375,
0.01079559326171875,
0.0201416015625,
-0.017486572265625,
0.01236724853515625,
0.017913818359375,
0.0416259765625,
-0.0007505416870117188,
0.016082763671875,
-0.0272369384765625,
0.0002008676528930664,
-0.03350830078125,
0.00495147705078125,
-0.028167724609375,
0.06488037109375,
-0.0190582275390625,
0.007335662841796875,
0.003276824951171875,
0.0008988380432128906,
0.01031494140625,
0.0030231475830078125,
-0.06304931640625,
-0.044586181640625,
0.03851318359375,
0.032470703125,
0.032470703125,
0.015960693359375,
0.033111572265625,
0.019134521484375,
-0.0080413818359375,
-0.0360107421875,
-0.031036376953125,
0.004695892333984375,
0.00650787353515625,
-0.020538330078125,
0.0019779205322265625,
-0.0136566162109375,
0.05950927734375,
0.0380859375,
-0.01108551025390625,
-0.00013327598571777344,
0.01383209228515625,
0.01690673828125,
-0.04754638671875,
-0.0011510848999023438,
-0.01464080810546875,
0.006656646728515625,
-0.0108642578125,
-0.024688720703125,
-0.01953125,
-0.039031982421875,
0.043060302734375,
-0.03973388671875,
0.0088653564453125,
-0.0236968994140625,
0.0765380859375,
0.040924072265625,
-0.0452880859375,
-0.00885009765625,
-0.043792724609375,
0.036376953125,
-0.0286407470703125,
0.0267791748046875,
0.0203399658203125,
0.01346588134765625,
-0.013031005859375,
-0.103515625,
-0.00690460205078125,
-0.0136566162109375,
0.0036678314208984375,
0.00977325439453125,
0.0301361083984375,
0.0232696533203125,
0.035308837890625,
0.0234375,
-0.0229949951171875,
-0.0226287841796875,
-0.0537109375,
-0.0189056396484375,
0.0369873046875,
0.00969696044921875,
0.0125274658203125,
0.026947021484375,
-0.0121917724609375,
-0.0139312744140625,
-0.0293731689453125,
-0.0189361572265625,
0.01181793212890625,
-0.01496124267578125,
-0.01552581787109375,
0.0396728515625,
-0.0007252693176269531,
0.03289794921875,
0.032073974609375,
-0.01751708984375,
0.04644775390625,
-0.005893707275390625,
-0.0272369384765625,
0.0241241455078125,
0.0308990478515625,
-0.0079803466796875,
-0.0034465789794921875,
0.035186767578125,
-0.0108642578125,
-0.0136871337890625,
-0.005279541015625,
-0.11895751953125,
-0.06884765625,
0.012176513671875,
-0.029998779296875,
-0.0295867919921875,
0.00739288330078125,
-0.045928955078125,
-0.00617218017578125,
0.0231781005859375,
0.05029296875,
-0.0010404586791992188,
-0.036590576171875,
-0.00855255126953125,
-0.06390380859375,
0.0092010498046875,
0.01019287109375,
-0.033416748046875,
0.0160980224609375,
0.00574493408203125,
0.0789794921875,
0.0211944580078125,
-0.0313720703125,
-0.04541015625,
0.00994873046875,
-0.0106201171875,
0.035919189453125,
-0.006053924560546875,
-0.0312347412109375,
-0.005176544189453125,
0.0261688232421875,
-0.00208282470703125,
-0.046142578125,
0.036163330078125,
-0.0299224853515625,
0.025604248046875,
0.0231170654296875,
-0.0288848876953125,
0.010711669921875,
-0.0113983154296875,
-0.019561767578125,
0.0810546875,
0.053497314453125,
-0.0902099609375,
0.035858154296875,
-0.053619384765625,
-0.031829833984375,
0.00748443603515625,
0.0182647705078125,
-0.0472412109375,
-0.020263671875,
-0.03448486328125,
0.01296234130859375,
0.04205322265625,
-0.0148468017578125,
-0.038421630859375,
-0.035400390625,
-0.0189056396484375,
-0.026947021484375,
0.0992431640625,
0.024688720703125,
-0.0095672607421875,
0.0350341796875,
-0.035736083984375,
-0.004726409912109375,
-0.0238800048828125,
-0.031829833984375,
-0.01076507568359375,
-0.0182342529296875,
0.0193023681640625,
-0.0011816024780273438,
0.01100921630859375,
-0.052215576171875,
0.0030517578125,
-0.052520751953125,
0.056060791015625,
0.05706787109375,
0.0137481689453125,
0.038330078125,
-0.01123046875,
0.032073974609375,
0.00890350341796875,
-0.005645751953125,
0.0357666015625,
-0.041412353515625,
-0.06951904296875,
-0.027008056640625,
0.0013484954833984375,
0.0290985107421875,
-0.044921875,
0.03118896484375,
0.0229034423828125,
-0.054046630859375,
-0.0247955322265625,
0.0239105224609375,
-0.01107025146484375,
0.02764892578125,
0.0036411285400390625,
-0.003978729248046875,
-0.035308837890625,
-0.0148773193359375,
0.0094146728515625,
0.0011072158813476562,
-0.01120758056640625,
-0.01873779296875,
0.046844482421875,
-0.052734375,
0.05523681640625,
-0.0635986328125,
-0.0286865234375,
0.005733489990234375,
0.02386474609375,
0.0335693359375,
0.05963134765625,
0.046844482421875,
-0.018585205078125,
-0.08538818359375,
-0.006130218505859375,
-0.0234375,
-0.008148193359375,
0.0084075927734375,
-0.028564453125,
-0.0117645263671875,
0.0312347412109375,
-0.03826904296875,
0.040191650390625,
0.04278564453125,
-0.058807373046875,
0.05401611328125,
-0.03851318359375,
-0.01197052001953125,
-0.043182373046875,
0.0015239715576171875,
-0.01459503173828125,
-0.040771484375,
-0.01389312744140625,
-0.0002472400665283203,
0.01812744140625,
-0.0030193328857421875,
-0.053009033203125,
0.05859375,
-0.045074462890625,
0.031585693359375,
-0.02490234375,
-0.024566650390625,
-0.0087432861328125,
-0.0194549560546875,
0.00632476806640625,
0.058258056640625,
0.06475830078125,
-0.039642333984375,
0.0738525390625,
0.0162506103515625,
0.007190704345703125,
-0.00003993511199951172,
-0.048797607421875,
0.00860595703125,
0.0010023117065429688,
0.02410888671875,
-0.059112548828125,
-0.036285400390625,
0.01593017578125,
-0.0135345458984375,
0.00016868114471435547,
-0.027984619140625,
-0.0012140274047851562,
-0.053497314453125,
-0.029754638671875,
0.04559326171875,
0.076416015625,
-0.038787841796875,
0.01169586181640625,
-0.008209228515625,
0.01141357421875,
-0.0297698974609375,
-0.05035400390625,
-0.032257080078125,
-0.037017822265625,
-0.05450439453125,
0.00577545166015625,
-0.0255126953125,
-0.008209228515625,
-0.0247650146484375,
-0.01100921630859375,
-0.059173583984375,
0.00881195068359375,
0.01026153564453125,
-0.00553131103515625,
-0.01824951171875,
-0.031829833984375,
0.01543426513671875,
-0.033905029296875,
0.023529052734375,
-0.005451202392578125,
0.039581298828125,
-0.0003077983856201172,
-0.0131683349609375,
-0.041778564453125,
-0.0151519775390625,
0.019317626953125,
0.02215576171875,
0.036895751953125,
0.07586669921875,
-0.03143310546875,
-0.0223388671875,
-0.01313018798828125,
-0.04095458984375,
-0.04193115234375,
0.0015401840209960938,
-0.02044677734375,
-0.035888671875,
0.0209503173828125,
-0.023345947265625,
-0.005615234375,
0.020782470703125,
0.046722412109375,
-0.0168609619140625,
0.07037353515625,
0.046356201171875,
0.03314208984375,
0.03790283203125,
-0.04296875,
-0.0038776397705078125,
-0.04541015625,
0.004932403564453125,
-0.03082275390625,
-0.002285003662109375,
0.0234375,
-0.00724029541015625,
0.037017822265625,
0.03997802734375,
-0.054840087890625,
0.01108551025390625,
-0.038970947265625,
0.0472412109375,
0.0295867919921875,
-0.009765625,
0.0028781890869140625,
-0.03717041015625,
-0.009552001953125,
0.005275726318359375,
-0.0160980224609375,
-0.039794921875,
0.07537841796875,
0.026123046875,
0.0858154296875,
-0.0204315185546875,
0.0601806640625,
-0.00861358642578125,
0.03851318359375,
-0.046905517578125,
-0.0159912109375,
0.007843017578125,
-0.07427978515625,
-0.028717041015625,
-0.01143646240234375,
-0.06884765625,
0.016448974609375,
0.00861358642578125,
-0.021026611328125,
-0.0009312629699707031,
0.0234527587890625,
-0.030242919921875,
0.01087188720703125,
-0.043365478515625,
0.0975341796875,
-0.0206756591796875,
-0.0299072265625,
0.0024013519287109375,
-0.03240966796875,
0.0286407470703125,
-0.0032958984375,
-0.00847625732421875,
-0.0035190582275390625,
-0.013336181640625,
0.0517578125,
-0.051025390625,
0.02197265625,
-0.040924072265625,
-0.007198333740234375,
0.017578125,
0.007720947265625,
0.01183319091796875,
0.040557861328125,
-0.0268402099609375,
-0.015716552734375,
0.039398193359375,
-0.0469970703125,
-0.0068206787109375,
0.04608154296875,
-0.065673828125,
-0.006046295166015625,
-0.057220458984375,
-0.0048370361328125,
0.0278778076171875,
0.04766845703125,
0.05914306640625,
0.03875732421875,
-0.01155853271484375,
0.0010433197021484375,
0.048980712890625,
0.0252227783203125,
0.074951171875,
-0.00787353515625,
-0.0241241455078125,
-0.031707763671875,
0.043853759765625,
0.00543212890625,
0.022491455078125,
-0.00616455078125,
0.053192138671875,
-0.0220794677734375,
-0.035797119140625,
-0.04302978515625,
-0.00039958953857421875,
-0.04541015625,
-0.01454925537109375,
-0.0281524658203125,
-0.041412353515625,
-0.01357269287109375,
-0.0355224609375,
-0.025299072265625,
-0.002223968505859375,
-0.053741455078125,
0.007076263427734375,
0.03521728515625,
0.0367431640625,
-0.035888671875,
0.055023193359375,
-0.0498046875,
0.0214996337890625,
0.03131103515625,
0.0185089111328125,
-0.01763916015625,
-0.04180908203125,
-0.0185394287109375,
0.01568603515625,
-0.04254150390625,
-0.054443359375,
0.033935546875,
0.041717529296875,
0.0299224853515625,
0.07891845703125,
0.022796630859375,
0.052032470703125,
-0.01456451416015625,
0.048553466796875,
0.0255126953125,
-0.069091796875,
0.05145263671875,
-0.0284423828125,
0.00916290283203125,
0.03668212890625,
0.0343017578125,
-0.0230712890625,
-0.0124664306640625,
-0.0357666015625,
-0.0462646484375,
0.031707763671875,
0.0214080810546875,
0.005977630615234375,
0.016448974609375,
0.016876220703125,
0.01285552978515625,
0.0026607513427734375,
-0.06414794921875,
-0.0293731689453125,
-0.0213470458984375,
-0.0150909423828125,
0.0051422119140625,
0.0107574462890625,
-0.033905029296875,
-0.07891845703125,
0.04510498046875,
-0.005893707275390625,
0.021270751953125,
0.03216552734375,
-0.0134735107421875,
-0.0224609375,
-0.01158905029296875,
0.0224456787109375,
0.06640625,
-0.0667724609375,
0.0187530517578125,
0.0208587646484375,
-0.062744140625,
0.06085205078125,
-0.01568603515625,
-0.01471710205078125,
0.0001786947250366211,
0.0036182403564453125,
0.012176513671875,
-0.0302276611328125,
-0.0174407958984375,
0.055328369140625,
-0.01204681396484375,
-0.010223388671875,
-0.06207275390625,
0.022003173828125,
0.0259246826171875,
-0.005340576171875,
0.005847930908203125,
0.0277557373046875,
0.002719879150390625,
-0.03155517578125,
0.0199127197265625,
0.02581787109375,
-0.0496826171875,
-0.01216888427734375,
0.058349609375,
0.05035400390625,
-0.037628173828125,
0.06207275390625,
-0.022552490234375,
-0.026458740234375,
0.033111572265625,
0.0452880859375,
0.08355712890625,
-0.01715087890625,
-0.006488800048828125,
0.0298309326171875,
0.01213836669921875,
-0.0126800537109375,
0.02593994140625,
0.008880615234375,
-0.04913330078125,
-0.00983428955078125,
-0.038787841796875,
-0.0161285400390625,
-0.033294677734375,
-0.03668212890625,
0.029754638671875,
-0.0604248046875,
-0.00998687744140625,
-0.027801513671875,
0.00209808349609375,
-0.03399658203125,
0.002819061279296875,
0.0014057159423828125,
0.0814208984375,
-0.059967041015625,
0.10809326171875,
0.0552978515625,
-0.03643798828125,
-0.022491455078125,
0.00662994384765625,
-0.0283966064453125,
-0.044097900390625,
0.049041748046875,
0.0171966552734375,
-0.022064208984375,
0.024566650390625,
-0.03173828125,
-0.061309814453125,
0.075927734375,
0.01407623291015625,
-0.005939483642578125,
0.006450653076171875,
-0.03460693359375,
0.015533447265625,
-0.01131439208984375,
0.058685302734375,
0.051544189453125,
0.04962158203125,
0.007236480712890625,
-0.053558349609375,
-0.00800323486328125,
-0.01403045654296875,
-0.005023956298828125,
0.0171051025390625,
-0.050537109375,
0.07635498046875,
-0.0093841552734375,
-0.004810333251953125,
0.01537322998046875,
0.051361083984375,
0.0210723876953125,
0.01468658447265625,
0.028594970703125,
0.05169677734375,
0.045318603515625,
-0.0160369873046875,
0.02325439453125,
0.0005230903625488281,
0.058746337890625,
0.03643798828125,
-0.020904541015625,
0.0606689453125,
0.0496826171875,
-0.00724029541015625,
0.07843017578125,
0.057281494140625,
0.0029697418212890625,
0.0721435546875,
0.036834716796875,
-0.04229736328125,
-0.007740020751953125,
0.051361083984375,
-0.0528564453125,
-0.008270263671875,
0.017486572265625,
0.00023436546325683594,
-0.0224609375,
0.0028076171875,
-0.01129150390625,
-0.045989990234375,
-0.0194244384765625,
0.02874755859375,
0.008636474609375,
-0.0297698974609375,
0.06646728515625,
-0.0126800537109375,
0.0986328125,
-0.058441162109375,
0.00902557373046875,
0.014251708984375,
0.0689697265625,
-0.039276123046875,
-0.055633544921875,
0.050537109375,
-0.0259246826171875,
0.0139007568359375,
-0.0155487060546875,
0.0711669921875,
-0.01549530029296875,
-0.03240966796875,
0.036041259765625,
0.00521087646484375,
0.0285186767578125,
0.003589630126953125,
-0.04168701171875,
-0.004268646240234375,
-0.005481719970703125,
-0.0306854248046875,
0.0090484619140625,
0.00643157958984375,
0.068603515625,
0.0543212890625,
0.0158843994140625,
0.00438690185546875,
0.03472900390625,
-0.0258026123046875,
0.0310211181640625,
-0.04693603515625,
-0.043212890625,
-0.042510986328125,
0.054656982421875,
-0.011810302734375,
-0.060089111328125,
0.05438232421875,
0.040557861328125,
0.0682373046875,
-0.025054931640625,
0.055511474609375,
-0.0103759765625,
0.01311492919921875,
-0.0141143798828125,
0.07965087890625,
-0.021270751953125,
-0.029022216796875,
-0.01555633544921875,
-0.06243896484375,
-0.0017633438110351562,
0.09661865234375,
0.0269775390625,
0.006244659423828125,
0.08282470703125,
0.052581787109375,
-0.049957275390625,
-0.025634765625,
-0.0089874267578125,
0.0372314453125,
0.01332855224609375,
0.0132904052734375,
0.0706787109375,
-0.032196044921875,
0.0340576171875,
-0.063720703125,
-0.0306243896484375,
0.001338958740234375,
-0.059295654296875,
-0.0753173828125,
-0.0031948089599609375,
-0.046661376953125,
-0.06195068359375,
-0.030548095703125,
0.059326171875,
0.07672119140625,
-0.054656982421875,
-0.05145263671875,
-0.03564453125,
0.006732940673828125,
-0.0045318603515625,
-0.0200347900390625,
0.027099609375,
-0.01043701171875,
-0.0372314453125,
0.01023101806640625,
-0.0084075927734375,
0.034027099609375,
-0.044891357421875,
-0.0298309326171875,
-0.007808685302734375,
-0.00860595703125,
0.01552581787109375,
0.022064208984375,
-0.0274505615234375,
-0.036224365234375,
-0.055419921875,
-0.0012989044189453125,
-0.0011606216430664062,
0.0120391845703125,
-0.047698974609375,
-0.004909515380859375,
0.07379150390625,
0.01290130615234375,
0.053070068359375,
-0.017242431640625,
0.034210205078125,
-0.040191650390625,
0.0311126708984375,
0.025604248046875,
0.034576416015625,
-0.004421234130859375,
-0.0158843994140625,
0.0380859375,
0.03887939453125,
-0.066650390625,
-0.052337646484375,
0.0028362274169921875,
-0.06915283203125,
-0.0143585205078125,
0.0714111328125,
-0.02099609375,
-0.0213165283203125,
-0.030426025390625,
-0.04571533203125,
0.0178985595703125,
-0.021759033203125,
0.0224609375,
0.028564453125,
-0.01494598388671875,
-0.01442718505859375,
-0.01099395751953125,
0.0567626953125,
0.005126953125,
-0.062225341796875,
-0.0173187255859375,
0.0082244873046875,
0.07806396484375,
0.0189666748046875,
0.05914306640625,
0.0153656005859375,
0.0031757354736328125,
0.039703369140625,
-0.0151214599609375,
0.020599365234375,
-0.01494598388671875,
-0.0224151611328125,
-0.008514404296875,
0.01739501953125,
-0.02728271484375
]
] |
timm/mobilenetv3_large_100.miil_in21k_ft_in1k | 2023-04-27T22:49:19.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-21k-p",
"arxiv:1905.02244",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/mobilenetv3_large_100.miil_in21k_ft_in1k | 1 | 77,234 | timm | 2022-12-16T05:37:59 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
- imagenet-21k-p
---
# Model card for mobilenetv3_large_100.miil_in21k_ft_in1k
A MobileNet-v3 image classification model. Pretrained on ImageNet-21k-P and fine-tuned on ImageNet-1k by Alibaba MIIL.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 5.5
- GMACs: 0.2
- Activations (M): 4.4
- Image size: 224 x 224
- **Papers:**
- Searching for MobileNetV3: https://arxiv.org/abs/1905.02244
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:** ImageNet-21k-P
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('mobilenetv3_large_100.miil_in21k_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
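The `softmax(dim=1) * 100` / `topk` step above can be illustrated in isolation, without downloading the model. This is a minimal sketch using hypothetical logits over a small label set; the real model emits a `(1, 1000)` logit tensor, but the ranking logic is the same:

```python
import math

# Hypothetical logits for one image over a small label set
logits = [2.0, 1.0, 0.1]

# Softmax scaled to percentages, mirroring output.softmax(dim=1) * 100
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [100 * e / total for e in exps]

# Top-2 class indices by probability, mirroring torch.topk(..., k=2)
top = sorted(range(len(logits)), key=lambda i: -probs[i])[:2]
```

Here `top` is `[0, 1]` and `probs` sums to 100, matching what `torch.topk` returns on the scaled softmax output in the snippet above.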
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilenetv3_large_100.miil_in21k_ft_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 24, 56, 56])
# torch.Size([1, 40, 28, 28])
# torch.Size([1, 112, 14, 14])
# torch.Size([1, 960, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilenetv3_large_100.miil_in21k_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 960, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{howard2019searching,
title={Searching for mobilenetv3},
author={Howard, Andrew and Sandler, Mark and Chu, Grace and Chen, Liang-Chieh and Chen, Bo and Tan, Mingxing and Wang, Weijun and Zhu, Yukun and Pang, Ruoming and Vasudevan, Vijay and others},
booktitle={Proceedings of the IEEE/CVF international conference on computer vision},
pages={1314--1324},
year={2019}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,152 | [
[
-0.0362548828125,
-0.023193359375,
-0.0025501251220703125,
0.015655517578125,
-0.0294952392578125,
-0.0293426513671875,
-0.005191802978515625,
-0.0290069580078125,
0.02130126953125,
0.0307464599609375,
-0.029876708984375,
-0.055023193359375,
-0.044586181640625,
-0.01056671142578125,
-0.00823211669921875,
0.06689453125,
-0.00922393798828125,
-0.004909515380859375,
-0.0109710693359375,
-0.0435791015625,
-0.009796142578125,
-0.0208892822265625,
-0.05255126953125,
-0.03131103515625,
0.0290374755859375,
0.021942138671875,
0.044769287109375,
0.050872802734375,
0.04498291015625,
0.032867431640625,
-0.0052337646484375,
0.005229949951171875,
-0.00899505615234375,
-0.019989013671875,
0.03656005859375,
-0.048095703125,
-0.03619384765625,
0.0255126953125,
0.049407958984375,
0.022796630859375,
0.005146026611328125,
0.032440185546875,
0.00882720947265625,
0.048370361328125,
-0.0245819091796875,
0.0010776519775390625,
-0.034881591796875,
0.00983428955078125,
-0.01020050048828125,
0.0015439987182617188,
-0.01416015625,
-0.035308837890625,
0.01366424560546875,
-0.0343017578125,
0.029541015625,
-0.0015201568603515625,
0.10467529296875,
0.01128387451171875,
-0.01345062255859375,
-0.002521514892578125,
-0.0186614990234375,
0.055084228515625,
-0.0565185546875,
0.01326751708984375,
0.031646728515625,
0.00751495361328125,
-0.0047149658203125,
-0.06597900390625,
-0.043670654296875,
-0.014251708984375,
-0.005062103271484375,
-0.003047943115234375,
-0.01044464111328125,
-0.003810882568359375,
0.0186920166015625,
0.0235595703125,
-0.03436279296875,
0.005016326904296875,
-0.044097900390625,
-0.0185394287109375,
0.050750732421875,
0.0010242462158203125,
0.0280609130859375,
-0.02325439453125,
-0.03564453125,
-0.026153564453125,
-0.0309295654296875,
0.02850341796875,
0.0205841064453125,
0.0115509033203125,
-0.0504150390625,
0.03656005859375,
0.00128936767578125,
0.048553466796875,
-0.0003342628479003906,
-0.03350830078125,
0.054656982421875,
-0.00672149658203125,
-0.03436279296875,
-0.006435394287109375,
0.0902099609375,
0.041534423828125,
0.01377105712890625,
0.014556884765625,
-0.0047149658203125,
-0.03118896484375,
-0.010467529296875,
-0.09027099609375,
-0.0212249755859375,
0.02899169921875,
-0.0623779296875,
-0.033599853515625,
0.0208587646484375,
-0.03985595703125,
-0.01087188720703125,
0.003307342529296875,
0.04010009765625,
-0.0308074951171875,
-0.03009033203125,
0.005672454833984375,
-0.005863189697265625,
0.026458740234375,
0.0089569091796875,
-0.041595458984375,
0.0063323974609375,
0.01309967041015625,
0.0919189453125,
0.0111236572265625,
-0.037139892578125,
-0.015716552734375,
-0.02740478515625,
-0.017120361328125,
0.02899169921875,
-0.003658294677734375,
-0.016876220703125,
-0.0233001708984375,
0.0274505615234375,
-0.018585205078125,
-0.0557861328125,
0.0293731689453125,
-0.016204833984375,
0.0155181884765625,
0.00246429443359375,
-0.0045166015625,
-0.041595458984375,
0.0172271728515625,
-0.0330810546875,
0.10101318359375,
0.019989013671875,
-0.0675048828125,
0.0236358642578125,
-0.0394287109375,
-0.01300048828125,
-0.0241851806640625,
0.00324249267578125,
-0.0848388671875,
-0.012664794921875,
0.019622802734375,
0.06829833984375,
-0.0214385986328125,
-0.0017328262329101562,
-0.0474853515625,
-0.026275634765625,
0.0218353271484375,
0.0090179443359375,
0.08148193359375,
0.0139923095703125,
-0.036773681640625,
0.0210113525390625,
-0.04608154296875,
0.01070404052734375,
0.0361328125,
-0.021087646484375,
-0.01003265380859375,
-0.035491943359375,
0.00896453857421875,
0.0292510986328125,
0.007080078125,
-0.045989990234375,
0.021331787109375,
-0.01139068603515625,
0.037872314453125,
0.033355712890625,
-0.0132293701171875,
0.0272064208984375,
-0.030487060546875,
0.0182952880859375,
0.024383544921875,
0.023895263671875,
-0.00707244873046875,
-0.046600341796875,
-0.058685302734375,
-0.03619384765625,
0.02484130859375,
0.0350341796875,
-0.040252685546875,
0.0296630859375,
-0.0147705078125,
-0.06414794921875,
-0.03570556640625,
0.00907135009765625,
0.03125,
0.036590576171875,
0.0228424072265625,
-0.031982421875,
-0.03900146484375,
-0.0684814453125,
0.0027866363525390625,
0.0019626617431640625,
-0.00110626220703125,
0.03155517578125,
0.056304931640625,
-0.01328277587890625,
0.047637939453125,
-0.02362060546875,
-0.0208740234375,
-0.010101318359375,
0.0081787109375,
0.031494140625,
0.060791015625,
0.0615234375,
-0.05999755859375,
-0.036163330078125,
-0.00458526611328125,
-0.07171630859375,
0.0125579833984375,
-0.007495880126953125,
-0.01229095458984375,
0.019500732421875,
0.016021728515625,
-0.050811767578125,
0.0517578125,
0.0193939208984375,
-0.0229949951171875,
0.031646728515625,
-0.01071929931640625,
0.0198822021484375,
-0.08837890625,
0.008575439453125,
0.033966064453125,
-0.01207733154296875,
-0.0299224853515625,
-0.0011472702026367188,
0.009246826171875,
-0.003955841064453125,
-0.04205322265625,
0.047882080078125,
-0.04144287109375,
-0.01568603515625,
-0.01279449462890625,
-0.01080322265625,
-0.000308990478515625,
0.046234130859375,
-0.01189422607421875,
0.031646728515625,
0.060791015625,
-0.045196533203125,
0.038299560546875,
0.028717041015625,
-0.0159912109375,
0.0235137939453125,
-0.050933837890625,
0.00849151611328125,
0.0027923583984375,
0.0244903564453125,
-0.06121826171875,
-0.019073486328125,
0.0281524658203125,
-0.04827880859375,
0.028900146484375,
-0.048187255859375,
-0.0305023193359375,
-0.04681396484375,
-0.042999267578125,
0.0307159423828125,
0.04302978515625,
-0.05322265625,
0.041015625,
0.0208587646484375,
0.021636962890625,
-0.044586181640625,
-0.05810546875,
-0.0239410400390625,
-0.03607177734375,
-0.0616455078125,
0.034759521484375,
0.0167083740234375,
0.01061248779296875,
0.0040740966796875,
-0.00960540771484375,
-0.01001739501953125,
-0.00823974609375,
0.053192138671875,
0.03497314453125,
-0.0206756591796875,
-0.0164947509765625,
-0.03179931640625,
0.0002989768981933594,
0.0016279220581054688,
-0.02545166015625,
0.043731689453125,
-0.0274810791015625,
-0.0033397674560546875,
-0.06585693359375,
-0.012359619140625,
0.040496826171875,
-0.01293182373046875,
0.06243896484375,
0.0869140625,
-0.035430908203125,
0.0095977783203125,
-0.0296630859375,
-0.00821685791015625,
-0.036285400390625,
0.0284423828125,
-0.035125732421875,
-0.03619384765625,
0.06793212890625,
0.005596160888671875,
0.0007691383361816406,
0.0509033203125,
0.0281982421875,
-0.00823211669921875,
0.06341552734375,
0.041473388671875,
0.01120758056640625,
0.050811767578125,
-0.065185546875,
-0.0169677734375,
-0.06732177734375,
-0.046905517578125,
-0.032806396484375,
-0.034881591796875,
-0.058197021484375,
-0.0294952392578125,
0.026824951171875,
0.0214996337890625,
-0.03387451171875,
0.036956787109375,
-0.05633544921875,
0.00684356689453125,
0.05157470703125,
0.04791259765625,
-0.0271453857421875,
0.0215606689453125,
-0.026092529296875,
0.0055694580078125,
-0.050537109375,
-0.0208587646484375,
0.0855712890625,
0.035919189453125,
0.040802001953125,
-0.00832366943359375,
0.053985595703125,
-0.0242767333984375,
0.022979736328125,
-0.04669189453125,
0.044891357421875,
-0.004108428955078125,
-0.031494140625,
-0.0027313232421875,
-0.0361328125,
-0.08026123046875,
0.01177978515625,
-0.025726318359375,
-0.06414794921875,
0.0152740478515625,
0.014190673828125,
-0.02117919921875,
0.05279541015625,
-0.0625,
0.0687255859375,
-0.0070648193359375,
-0.038665771484375,
0.01317596435546875,
-0.053863525390625,
0.028106689453125,
0.01465606689453125,
-0.01239776611328125,
-0.0141448974609375,
0.0130767822265625,
0.081787109375,
-0.047088623046875,
0.05511474609375,
-0.0391845703125,
0.0278778076171875,
0.042572021484375,
-0.00965118408203125,
0.0255279541015625,
-0.0041351318359375,
-0.0121612548828125,
0.0244903564453125,
0.00580596923828125,
-0.037506103515625,
-0.035919189453125,
0.047943115234375,
-0.07049560546875,
-0.01849365234375,
-0.02587890625,
-0.028594970703125,
0.01122283935546875,
0.0133209228515625,
0.042938232421875,
0.04669189453125,
0.0223388671875,
0.023895263671875,
0.041168212890625,
-0.034393310546875,
0.039703369140625,
-0.01201629638671875,
-0.0244598388671875,
-0.0362548828125,
0.06805419921875,
0.00690460205078125,
0.0036163330078125,
0.0036792755126953125,
0.01335906982421875,
-0.02728271484375,
-0.04681396484375,
-0.0272674560546875,
0.0244903564453125,
-0.042694091796875,
-0.035125732421875,
-0.04638671875,
-0.03277587890625,
-0.0250091552734375,
-0.005031585693359375,
-0.038818359375,
-0.0214996337890625,
-0.03271484375,
0.0214996337890625,
0.05377197265625,
0.039093017578125,
-0.01172637939453125,
0.048583984375,
-0.048797607421875,
0.01357269287109375,
0.007049560546875,
0.038665771484375,
-0.0101470947265625,
-0.06072998046875,
-0.0222930908203125,
-0.0004668235778808594,
-0.03509521484375,
-0.05010986328125,
0.037933349609375,
0.01027679443359375,
0.029266357421875,
0.0254058837890625,
-0.0206756591796875,
0.056884765625,
0.0019359588623046875,
0.04345703125,
0.033599853515625,
-0.04840087890625,
0.04644775390625,
-0.0170440673828125,
0.0158233642578125,
0.00841522216796875,
0.033203125,
-0.01099395751953125,
0.0099029541015625,
-0.06719970703125,
-0.057891845703125,
0.0631103515625,
0.0158233642578125,
-0.004276275634765625,
0.03460693359375,
0.052764892578125,
-0.00949859619140625,
-0.0011682510375976562,
-0.0645751953125,
-0.032257080078125,
-0.03314208984375,
-0.02154541015625,
0.00981903076171875,
-0.00453948974609375,
0.0009946823120117188,
-0.055755615234375,
0.053558349609375,
-0.0033550262451171875,
0.060791015625,
0.028839111328125,
-0.0011892318725585938,
0.0035343170166015625,
-0.035308837890625,
0.04400634765625,
0.019134521484375,
-0.033050537109375,
0.00029206275939941406,
0.007068634033203125,
-0.053802490234375,
0.009185791015625,
0.01427459716796875,
0.0018157958984375,
-0.00032591819763183594,
0.0250396728515625,
0.06927490234375,
-0.007442474365234375,
0.0080108642578125,
0.038360595703125,
-0.006473541259765625,
-0.04034423828125,
-0.0174407958984375,
0.01056671142578125,
0.0006232261657714844,
0.0296630859375,
0.03192138671875,
0.0289459228515625,
-0.00821685791015625,
-0.017333984375,
0.02008056640625,
0.037139892578125,
-0.0216827392578125,
-0.020355224609375,
0.05218505859375,
-0.007080078125,
-0.01026153564453125,
0.057098388671875,
-0.00859832763671875,
-0.03765869140625,
0.07763671875,
0.02960205078125,
0.061309814453125,
-0.00888824462890625,
0.0054473876953125,
0.06671142578125,
0.0198822021484375,
-0.0018339157104492188,
0.019622802734375,
0.0126800537109375,
-0.057281494140625,
0.00258636474609375,
-0.0408935546875,
0.00977325439453125,
0.03497314453125,
-0.0445556640625,
0.02740478515625,
-0.048919677734375,
-0.033233642578125,
0.019439697265625,
0.026092529296875,
-0.06561279296875,
0.0242462158203125,
-0.0037136077880859375,
0.06585693359375,
-0.05029296875,
0.0626220703125,
0.06231689453125,
-0.03729248046875,
-0.08056640625,
-0.0012331008911132812,
0.003429412841796875,
-0.0667724609375,
0.044586181640625,
0.041595458984375,
0.0028553009033203125,
0.00707244873046875,
-0.06597900390625,
-0.04833984375,
0.1043701171875,
0.0300445556640625,
-0.005657196044921875,
0.0245361328125,
-0.00864410400390625,
0.0044708251953125,
-0.038482666015625,
0.041595458984375,
0.01316070556640625,
0.0222930908203125,
0.0214385986328125,
-0.059844970703125,
0.015899658203125,
-0.03009033203125,
0.01064300537109375,
0.0170745849609375,
-0.0628662109375,
0.06414794921875,
-0.040557861328125,
-0.007965087890625,
0.002201080322265625,
0.04644775390625,
0.0196075439453125,
0.0179901123046875,
0.036651611328125,
0.0506591796875,
0.037994384765625,
-0.0209808349609375,
0.062286376953125,
0.0035686492919921875,
0.04449462890625,
0.043243408203125,
0.019622802734375,
0.044891357421875,
0.0258026123046875,
-0.0146026611328125,
0.0322265625,
0.09051513671875,
-0.0256195068359375,
0.0219879150390625,
0.0097198486328125,
-0.006099700927734375,
-0.0031299591064453125,
0.00969696044921875,
-0.0325927734375,
0.05029296875,
0.012298583984375,
-0.043701171875,
-0.008056640625,
0.006023406982421875,
0.006038665771484375,
-0.032989501953125,
-0.0208282470703125,
0.026824951171875,
-0.0008139610290527344,
-0.02569580078125,
0.0753173828125,
0.0157012939453125,
0.069091796875,
-0.021942138671875,
0.00583648681640625,
-0.0190277099609375,
0.008087158203125,
-0.03509521484375,
-0.055755615234375,
0.0217132568359375,
-0.021881103515625,
0.0012722015380859375,
0.0086212158203125,
0.053680419921875,
-0.01537322998046875,
-0.027557373046875,
0.0113677978515625,
0.0160369873046875,
0.0380859375,
-0.003376007080078125,
-0.0911865234375,
0.01812744140625,
0.013824462890625,
-0.045654296875,
0.0222930908203125,
0.022247314453125,
0.005584716796875,
0.0625,
0.049072265625,
-0.0135498046875,
0.01323699951171875,
-0.0239410400390625,
0.061309814453125,
-0.0479736328125,
-0.018646240234375,
-0.06402587890625,
0.05328369140625,
-0.01462554931640625,
-0.047882080078125,
0.04071044921875,
0.050018310546875,
0.0628662109375,
0.001735687255859375,
0.036773681640625,
-0.0238494873046875,
-0.00421905517578125,
-0.03558349609375,
0.05609130859375,
-0.055908203125,
-0.0020046234130859375,
-0.01068115234375,
-0.050201416015625,
-0.031005859375,
0.0645751953125,
-0.0181884765625,
0.0306549072265625,
0.0362548828125,
0.0792236328125,
-0.031402587890625,
-0.0174560546875,
0.007904052734375,
0.005985260009765625,
-0.004985809326171875,
0.02777099609375,
0.032470703125,
-0.06787109375,
0.0304412841796875,
-0.045745849609375,
-0.017181396484375,
-0.0187835693359375,
-0.05609130859375,
-0.07562255859375,
-0.0704345703125,
-0.043365478515625,
-0.06121826171875,
-0.021514892578125,
0.0687255859375,
0.08837890625,
-0.041015625,
-0.012725830078125,
0.005550384521484375,
0.01419830322265625,
-0.007610321044921875,
-0.0161285400390625,
0.045806884765625,
0.005672454833984375,
-0.050262451171875,
-0.019989013671875,
-0.00417327880859375,
0.03607177734375,
0.0081024169921875,
-0.017547607421875,
-0.0178070068359375,
-0.023773193359375,
0.022674560546875,
0.0309295654296875,
-0.045257568359375,
-0.00485992431640625,
-0.0184478759765625,
-0.0164947509765625,
0.034820556640625,
0.033721923828125,
-0.0380859375,
0.015380859375,
0.0211334228515625,
0.02117919921875,
0.069091796875,
-0.0228729248046875,
0.00150299072265625,
-0.056793212890625,
0.04644775390625,
-0.0151519775390625,
0.030609130859375,
0.030364990234375,
-0.023406982421875,
0.040252685546875,
0.0293121337890625,
-0.033294677734375,
-0.06488037109375,
-0.0106658935546875,
-0.08184814453125,
-0.0037136077880859375,
0.0701904296875,
-0.0242919921875,
-0.0391845703125,
0.02484130859375,
-0.002262115478515625,
0.04437255859375,
-0.00848388671875,
0.0301513671875,
0.0151824951171875,
-0.0153045654296875,
-0.051544189453125,
-0.05108642578125,
0.03558349609375,
0.00641632080078125,
-0.048553466796875,
-0.040740966796875,
-0.002475738525390625,
0.052978515625,
0.017852783203125,
0.0440673828125,
-0.01273345947265625,
0.00806427001953125,
0.00531005859375,
0.042205810546875,
-0.02716064453125,
-0.003971099853515625,
-0.020721435546875,
-0.0014696121215820312,
-0.01221466064453125,
-0.0535888671875
]
] |
DeepFloyd/IF-I-XL-v1.0 | 2023-06-02T19:05:00.000Z | [
"diffusers",
"pytorch",
"if",
"text-to-image",
"arxiv:2205.11487",
"arxiv:2110.02861",
"license:deepfloyd-if-license",
"has_space",
"diffusers:IFPipeline",
"region:us"
] | text-to-image | DeepFloyd | null | null | DeepFloyd/IF-I-XL-v1.0 | 524 | 77,150 | diffusers | 2023-04-06T21:22:41 | ---
license: deepfloyd-if-license
extra_gated_prompt: "DeepFloyd LICENSE AGREEMENT\nThis License Agreement (as may be amended in accordance with this License Agreement, “License”), between you, or your employer or other entity (if you are entering into this agreement on behalf of your employer or other entity) (“Licensee” or “you”) and Stability AI Ltd.. (“Stability AI” or “we”) applies to your use of any computer program, algorithm, source code, object code, or software that is made available by Stability AI under this License (“Software”) and any specifications, manuals, documentation, and other written information provided by Stability AI related to the Software (“Documentation”).\nBy clicking “I Accept” below or by using the Software, you agree to the terms of this License. If you do not agree to this License, then you do not have any rights to use the Software or Documentation (collectively, the “Software Products”), and you must immediately cease using the Software Products. If you are agreeing to be bound by the terms of this License on behalf of your employer or other entity, you represent and warrant to Stability AI that you have full legal authority to bind your employer or such entity to this License. If you do not have the requisite authority, you may not accept the License or access the Software Products on behalf of your employer or other entity.\n1. LICENSE GRANT\n a. Subject to your compliance with the Documentation and Sections 2, 3, and 5, Stability AI grants you a non-exclusive, worldwide, non-transferable, non-sublicensable, revocable, royalty free and limited license under Stability AI’s copyright interests to reproduce, distribute, and create derivative works of the Software solely for your non-commercial research purposes. 
The foregoing license is personal to you, and you may not assign or sublicense this License or any other rights or obligations under this License without Stability AI’s prior written consent; any such assignment or sublicense will be void and will automatically and immediately terminate this License.\n b. You may make a reasonable number of copies of the Documentation solely for use in connection with the license to the Software granted above.\n c. The grant of rights expressly set forth in this Section 1 (License Grant) are the complete grant of rights to you in the Software Products, and no other licenses are granted, whether by waiver, estoppel, implication, equity or otherwise. Stability AI and its licensors reserve all rights not expressly granted by this License.\L\n2. RESTRICTIONS\n You will not, and will not permit, assist or cause any third party to:\n a. use, modify, copy, reproduce, create derivative works of, or distribute the Software Products (or any derivative works thereof, works incorporating the Software Products, or any data produced by the Software), in whole or in part, for (i) any commercial or production purposes, (ii) military purposes or in the service of nuclear technology, (iii) purposes of surveillance, including any research or development relating to surveillance, (iv) biometric processing, (v) in any manner that infringes, misappropriates, or otherwise violates any third-party rights, or (vi) in any manner that violates any applicable law and violating any privacy or security laws, rules, regulations, directives, or governmental requirements (including the General Data Privacy Regulation (Regulation (EU) 2016/679), the California Consumer Privacy Act, and any and all laws governing the processing of biometric information), as well as all amendments and successor laws to any of the foregoing;\n b. alter or remove copyright and other proprietary notices which appear on or in the Software Products;\n c. 
utilize any equipment, device, software, or other means to circumvent or remove any security or protection used by Stability AI in connection with the Software, or to circumvent or remove any usage restrictions, or to enable functionality disabled by Stability AI; or\n d. offer or impose any terms on the Software Products that alter, restrict, or are inconsistent with the terms of this License.\n e. 1) violate any applicable U.S. and non-U.S. export control and trade sanctions laws (“Export Laws”); 2) directly or indirectly export, re-export, provide, or otherwise transfer Software Products: (a) to any individual, entity, or country prohibited by Export Laws; (b) to anyone on U.S. or non-U.S. government restricted parties lists; or (c) for any purpose prohibited by Export Laws, including nuclear, chemical or biological weapons, or missile technology applications; 3) use or download Software Products if you or they are: (a) located in a comprehensively sanctioned jurisdiction, (b) currently listed on any U.S. or non-U.S. restricted parties list, or (c) for any purpose prohibited by Export Laws; and (4) will not disguise your location through IP proxying or other methods.\L\n3. ATTRIBUTION\n Together with any copies of the Software Products (as well as derivative works thereof or works incorporating the Software Products) that you distribute, you must provide (i) a copy of this License, and (ii) the following attribution notice: “DeepFloyd is licensed under the DeepFloyd License, Copyright (c) Stability AI Ltd. All Rights Reserved.”\L\n4. DISCLAIMERS\n THE SOFTWARE PRODUCTS ARE PROVIDED “AS IS” and “WITH ALL FAULTS” WITH NO WARRANTY OF ANY KIND, EXPRESS OR IMPLIED. 
STABILITY AI EXPRESSLY DISCLAIMS ALL REPRESENTATIONS AND WARRANTIES, EXPRESS OR IMPLIED, WHETHER BY STATUTE, CUSTOM, USAGE OR OTHERWISE AS TO ANY MATTERS RELATED TO THE SOFTWARE PRODUCTS, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE, SATISFACTORY QUALITY, OR NON-INFRINGEMENT. STABILITY AI MAKES NO WARRANTIES OR REPRESENTATIONS THAT THE SOFTWARE PRODUCTS WILL BE ERROR FREE OR FREE OF VIRUSES OR OTHER HARMFUL COMPONENTS, OR PRODUCE ANY PARTICULAR RESULTS.\L\n5. LIMITATION OF LIABILITY\n TO THE FULLEST EXTENT PERMITTED BY LAW, IN NO EVENT WILL STABILITY AI BE LIABLE TO YOU (A) UNDER ANY THEORY OF LIABILITY, WHETHER BASED IN CONTRACT, TORT, NEGLIGENCE, STRICT LIABILITY, WARRANTY, OR OTHERWISE UNDER THIS LICENSE, OR (B) FOR ANY INDIRECT, CONSEQUENTIAL, EXEMPLARY, INCIDENTAL, PUNITIVE OR SPECIAL DAMAGES OR LOST PROFITS, EVEN IF STABILITY AI HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. THE SOFTWARE PRODUCTS, THEIR CONSTITUENT COMPONENTS, AND ANY OUTPUT (COLLECTIVELY, “SOFTWARE MATERIALS”) ARE NOT DESIGNED OR INTENDED FOR USE IN ANY APPLICATION OR SITUATION WHERE FAILURE OR FAULT OF THE SOFTWARE MATERIALS COULD REASONABLY BE ANTICIPATED TO LEAD TO SERIOUS INJURY OF ANY PERSON, INCLUDING POTENTIAL DISCRIMINATION OR VIOLATION OF AN INDIVIDUAL’S PRIVACY RIGHTS, OR TO SEVERE PHYSICAL, PROPERTY, OR ENVIRONMENTAL DAMAGE (EACH, A “HIGH-RISK USE”). IF YOU ELECT TO USE ANY OF THE SOFTWARE MATERIALS FOR A HIGH-RISK USE, YOU DO SO AT YOUR OWN RISK. YOU AGREE TO DESIGN AND IMPLEMENT APPROPRIATE DECISION-MAKING AND RISK-MITIGATION PROCEDURES AND POLICIES IN CONNECTION WITH A HIGH-RISK USE SUCH THAT EVEN IF THERE IS A FAILURE OR FAULT IN ANY OF THE SOFTWARE MATERIALS, THE SAFETY OF PERSONS OR PROPERTY AFFECTED BY THE ACTIVITY STAYS AT A LEVEL THAT IS REASONABLE, APPROPRIATE, AND LAWFUL FOR THE FIELD OF THE HIGH-RISK USE.\L\n6. 
INDEMNIFICATION\n You will indemnify, defend and hold harmless Stability AI and our subsidiaries and affiliates, and each of our respective shareholders, directors, officers, employees, agents, successors, and assigns (collectively, the “Stability AI Parties”) from and against any losses, liabilities, damages, fines, penalties, and expenses (including reasonable attorneys’ fees) incurred by any Stability AI Party in connection with any claim, demand, allegation, lawsuit, proceeding, or investigation (collectively, “Claims”) arising out of or related to: (a) your access to or use of the Software Products (as well as any results or data generated from such access or use), including any High-Risk Use (defined below); (b) your violation of this License; or (c) your violation, misappropriation or infringement of any rights of another (including intellectual property or other proprietary rights and privacy rights). You will promptly notify the Stability AI Parties of any such Claims, and cooperate with Stability AI Parties in defending such Claims. You will also grant the Stability AI Parties sole control of the defense or settlement, at Stability AI’s sole option, of any Claims. This indemnity is in addition to, and not in lieu of, any other indemnities or remedies set forth in a written agreement between you and Stability AI or the other Stability AI Parties.\L\n7. TERMINATION; SURVIVAL\n a. This License will automatically terminate upon any breach by you of the terms of this License.\L\Lb. We may terminate this License, in whole or in part, at any time upon notice (including electronic) to you.\L\Lc. The following sections survive termination of this License: 2 (Restrictions), 3 (Attribution), 4 (Disclaimers), 5 (Limitation on Liability), 6 (Indemnification) 7 (Termination; Survival), 8 (Third Party Materials), 9 (Trademarks), 10 (Applicable Law; Dispute Resolution), and 11 (Miscellaneous).\L\n8. 
THIRD PARTY MATERIALS\n The Software Products may contain third-party software or other components (including free and open source software) (all of the foregoing, “Third Party Materials”), which are subject to the license terms of the respective third-party licensors. Your dealings or correspondence with third parties and your use of or interaction with any Third Party Materials are solely between you and the third party. Stability AI does not control or endorse, and makes no representations or warranties regarding, any Third Party Materials, and your access to and use of such Third Party Materials are at your own risk.\L\n9. TRADEMARKS\n Licensee has not been granted any trademark license as part of this License and may not use any name or mark associated with Stability AI without the prior written permission of Stability AI, except to the extent necessary to make the reference required by the “ATTRIBUTION” section of this Agreement.\L\n10. APPLICABLE LAW; DISPUTE RESOLUTION\n This License will be governed and construed under the laws of the State of California without regard to conflicts of law provisions. Any suit or proceeding arising out of or relating to this License will be brought in the federal or state courts, as applicable, in San Mateo County, California, and each party irrevocably submits to the jurisdiction and venue of such courts.\L\n11. MISCELLANEOUS\n If any provision or part of a provision of this License is unlawful, void or unenforceable, that provision or part of the provision is deemed severed from this License, and will not affect the validity and enforceability of any remaining provisions. The failure of Stability AI to exercise or enforce any right or provision of this License will not operate as a waiver of such right or provision. This License does not confer any third-party beneficiary rights upon any other person or entity. 
This License, together with the Documentation, contains the entire understanding between you and Stability AI regarding the subject matter of this License, and supersedes all other written or oral agreements and understandings between you and Stability AI regarding such subject matter. No change or addition to any provision of this License will be binding unless it is in writing and signed by an authorized representative of both you and Stability AI."
extra_gated_fields:
"Organization /\_Affiliation": text
Previously related publications: text
I accept the above license agreement, and will use the Software non-commercially and for research purposes only: checkbox
tags:
- if
- text-to-image
inference: false
---
# IF-I-XL-v1.0
DeepFloyd-IF is a pixel-based, triple-cascaded text-to-image diffusion model that generates pictures with a new state of the art in photorealism and language understanding. The result is a highly efficient model that outperforms current state-of-the-art models, achieving a zero-shot FID-30K score of `6.66` on the COCO dataset.
*Inspired by* [*Photorealistic Text-to-Image Diffusion Models with Deep Language Understanding*](https://arxiv.org/pdf/2205.11487.pdf)

## Model Details
- **Developed by:** DeepFloyd, StabilityAI
- **Model type:** pixel-based text-to-image cascaded diffusion model
- **Cascade Stage:** I
- **Num Parameters:** 4.3B
- **Language(s):** primarily English and, to a lesser extent, other Romance languages
- **License:** <span style="color:blue"><a href="https://huggingface.co/spaces/DeepFloyd/deepfloyd-if-license">DeepFloyd IF License Agreement</a></span>
- **Model Description:** DeepFloyd-IF is modular, composed of a frozen text encoder and three cascaded pixel diffusion modules, each designed to generate images of increasing resolution: 64x64, 256x256, and 1024x1024. All stages of the model utilize a frozen text encoder based on the T5 transformer to extract text embeddings, which are then fed into a UNet architecture enhanced with cross-attention and attention pooling
- **Resources for more information:** [GitHub](https://github.com/deep-floyd/IF), [deepfloyd.ai](https://deepfloyd.ai), [All Links](https://linktr.ee/deepfloyd)
- **Cite as (Soon):** -
## Using with `diffusers`
IF is integrated with the 🤗 Hugging Face [🧨 diffusers library](https://github.com/huggingface/diffusers/), which is optimized to run on GPUs with as little as 14 GB of VRAM.
Before you can use IF, you need to accept its usage conditions. To do so:
1. Make sure you have a [Hugging Face account](https://huggingface.co/join) and are logged in
2. Accept the license on the model card of [DeepFloyd/IF-I-XL-v1.0](https://huggingface.co/DeepFloyd/IF-I-XL-v1.0)
3. Log in locally. First install `huggingface_hub`
```sh
pip install huggingface_hub --upgrade
```
then run the login function in a Python shell
```py
from huggingface_hub import login
login()
```
and enter your [Hugging Face Hub access token](https://huggingface.co/docs/hub/security-tokens#what-are-user-access-tokens).
Next we install `diffusers` and dependencies:
```sh
pip install diffusers accelerate transformers safetensors sentencepiece
```
And we can now run the model locally.
By default `diffusers` makes use of [model cpu offloading](https://huggingface.co/docs/diffusers/optimization/fp16#model-offloading-for-fast-inference-and-memory-savings) to run the whole IF pipeline with as little as 14 GB of VRAM.
If you are using `torch>=2.0.0`, make sure to **remove all** `enable_xformers_memory_efficient_attention()` functions.
* **Load all stages and offload to CPU**
```py
from diffusers import DiffusionPipeline
from diffusers.utils import pt_to_pil
import torch
# stage 1
stage_1 = DiffusionPipeline.from_pretrained("DeepFloyd/IF-I-XL-v1.0", variant="fp16", torch_dtype=torch.float16)
stage_1.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_1.enable_model_cpu_offload()
# stage 2
stage_2 = DiffusionPipeline.from_pretrained(
"DeepFloyd/IF-II-L-v1.0", text_encoder=None, variant="fp16", torch_dtype=torch.float16
)
stage_2.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_2.enable_model_cpu_offload()
# stage 3
safety_modules = {"feature_extractor": stage_1.feature_extractor, "safety_checker": stage_1.safety_checker, "watermarker": stage_1.watermarker}
stage_3 = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-x4-upscaler", **safety_modules, torch_dtype=torch.float16)
stage_3.enable_xformers_memory_efficient_attention() # remove line if torch.__version__ >= 2.0.0
stage_3.enable_model_cpu_offload()
```
* **Retrieve Text Embeddings**
```py
prompt = 'a photo of a kangaroo wearing an orange hoodie and blue sunglasses standing in front of the eiffel tower holding a sign that says "very deep learning"'
# text embeds
prompt_embeds, negative_embeds = stage_1.encode_prompt(prompt)
```
* **Run stage 1**
```py
generator = torch.manual_seed(0)
image = stage_1(prompt_embeds=prompt_embeds, negative_prompt_embeds=negative_embeds, generator=generator, output_type="pt").images
pt_to_pil(image)[0].save("./if_stage_I.png")
```
* **Run stage 2**
```py
image = stage_2(
image=image, prompt_embeds=prompt_embeds, negative_prompt_embeds=negative_embeds, generator=generator, output_type="pt"
).images
pt_to_pil(image)[0].save("./if_stage_II.png")
```
* **Run stage 3**
```py
image = stage_3(prompt=prompt, image=image, generator=generator, noise_level=100).images
image[0].save("./if_stage_III.png")
```
There are multiple ways to speed up the inference time and lower the memory consumption even more with `diffusers`. To do so, please have a look at the Diffusers docs:
- 🚀 [Optimizing for inference time](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-speed)
- ⚙️ [Optimizing for low memory during inference](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-memory)
For more in-detail information about how to use IF, please have a look at [the IF blog post](https://huggingface.co/blog/if) and the [documentation](https://huggingface.co/docs/diffusers/main/en/api/pipelines/if) 📖.
The Diffusers DreamBooth scripts also support fine-tuning 🎨 [IF](https://huggingface.co/docs/diffusers/main/en/training/dreambooth#if).
With parameter-efficient fine-tuning, you can add new concepts to IF with a single GPU and ~28 GB of VRAM.
## Training
**Training Data:**
1.2B text-image pairs (based on LAION-A and a few additional internal datasets)
The test/validation splits of the datasets are not used at any cascade or stage of training. The validation split of COCO is used to monitor "online" loss behaviour during training (to catch incidents and other problems), but it is never used for training.
**Training Procedure:** IF-I-XL-v1.0 is a pixel-based diffusion cascade that uses T5-Encoder embeddings (hidden states) to generate a 64px image. During training,
- Images are cropped to square via a shifted-center-crop augmentation (random shift from center by up to 0.1 of the image size), resized to 64px using `Pillow==9.2.0` BICUBIC resampling with reducing_gap=None (which helps avoid aliasing), and converted to a BxCxHxW tensor
- Text prompts are encoded with the open-source frozen T5-v1_1-xxl text encoder (trained entirely by the Google team); a random 10% of prompts are dropped to the empty string to enable classifier-free guidance (CFG)
- The non-pooled output of the text encoder is fed through a projection (a linear layer without activation) and used in the UNet backbone of the diffusion model via controlled hybrid self- and cross-attention
- Additionally, the output of the text encoder is pooled via attention pooling (64 heads) and used in the time embedding as additional features
- The diffusion process is limited to 1000 discrete steps, with a cosine beta schedule for noising the image
- The loss is a reconstruction objective between the noise that was added to the image and the prediction made by the UNet
- Training of checkpoint IF-I-XL-v1.0 comprises 2,420,000 steps at resolution 64x64 on all datasets, with a OneCycleLR policy, few-bit backward GELU activations, the AdamW8bit optimizer + DeepSpeed ZeRO-1, and a fully frozen T5-Encoder
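The cosine beta schedule mentioned above is not spelled out in this card; the following is a minimal sketch of the standard cosine schedule from the improved-DDPM literature, which is presumably what is meant (function names are illustrative, not DeepFloyd's internal code):

```python
import math

def cosine_alpha_bar(t, T=1000, s=0.008):
    """Cumulative signal retention alpha_bar(t) under the cosine schedule."""
    f = lambda u: math.cos((u / T + s) / (1 + s) * math.pi / 2) ** 2
    return f(t) / f(0)  # normalized so alpha_bar(0) == 1

def cosine_betas(T=1000, s=0.008, max_beta=0.999):
    """Per-step noise amounts beta_t derived from the cumulative schedule."""
    return [
        min(1 - cosine_alpha_bar(t + 1, T, s) / cosine_alpha_bar(t, T, s), max_beta)
        for t in range(T)
    ]

schedule = cosine_betas()  # 1000 discrete steps, as in the training setup above
```

The betas start near zero and grow toward the clamp value, so early steps add little noise and late steps destroy almost all remaining signal.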

**Hardware:** 64 x 8 x A100 GPUs
**Optimizer:** [AdamW8bit](https://arxiv.org/abs/2110.02861) + [DeepSpeed ZeRO-1](https://www.deepspeed.ai/tutorials/zero/)
**Batch:** 3072
**Learning rate**: [one-cycle](https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.OneCycleLR.html) cosine strategy, warmup 10000 steps, start_lr=2e-6, max_lr=5e-5, final_lr=5e-9
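A minimal sketch of a learning-rate curve consistent with the stated hyperparameters (linear warmup then cosine decay); this is an illustration, not the exact OneCycleLR implementation used in training:

```python
import math

def one_cycle_lr(step, total_steps=2_420_000, warmup=10_000,
                 start_lr=2e-6, max_lr=5e-5, final_lr=5e-9):
    """One-cycle cosine strategy: warm up linearly, then anneal to final_lr."""
    if step < warmup:
        # linear warmup from start_lr to max_lr
        return start_lr + (max_lr - start_lr) * step / warmup
    # cosine annealing from max_lr down to final_lr
    progress = (step - warmup) / (total_steps - warmup)
    return final_lr + (max_lr - final_lr) * 0.5 * (1 + math.cos(math.pi * progress))
```

For example, `one_cycle_lr(0)` returns the start LR, `one_cycle_lr(10_000)` the peak, and `one_cycle_lr(2_420_000)` the final LR.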

## Evaluation Results
`FID-30K: 6.66`

# Uses
## Direct Use
The model is released for research purposes. Any attempt to deploy the model in production requires compliance with the LICENSE, and full liability rests with the person deploying the model.
Possible research areas and tasks include:
- Generation of artistic imagery and use in design and other artistic processes.
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section was originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), was used for Stable Diffusion, and applies in the same way to IF_.
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to produce factual or true representations of people or events; using it to generate such content is therefore out of scope for this model's abilities.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation
- Representations of egregious violence and gore
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism
- The model was trained mainly with English captions and will not work as well in other languages.
- The model was trained on a subset of the large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, we have... (see Training section).
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
IF was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/),
which consists of images that are limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
IF mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.
## Citation (Soon)
*This model card was written by: DeepFloyd-Team and is based on the [StableDiffusion model card](https://huggingface.co/CompVis/stable-diffusion-v1-4).* | 23,443 | [
[
-0.043792724609375,
-0.0655517578125,
0.0198516845703125,
0.0292510986328125,
-0.017364501953125,
-0.003345489501953125,
-0.01971435546875,
-0.03515625,
0.005817413330078125,
0.0220947265625,
-0.041717529296875,
-0.0430908203125,
-0.048126220703125,
-0.01502227783203125,
-0.0177459716796875,
0.078125,
-0.0125579833984375,
-0.01470184326171875,
-0.00732421875,
0.0017461776733398438,
-0.0143585205078125,
-0.00806427001953125,
-0.07440185546875,
-0.025848388671875,
0.0210723876953125,
0.0213165283203125,
0.04046630859375,
0.025970458984375,
0.0270538330078125,
0.0274505615234375,
-0.0182037353515625,
-0.003139495849609375,
-0.03857421875,
-0.0293426513671875,
0.0103912353515625,
-0.0182647705078125,
-0.033416748046875,
0.0012998580932617188,
0.04736328125,
0.01152801513671875,
-0.0019273757934570312,
0.005046844482421875,
0.0081787109375,
0.051544189453125,
-0.0478515625,
0.02020263671875,
-0.0215301513671875,
0.01554107666015625,
0.0006661415100097656,
0.0146026611328125,
-0.011627197265625,
-0.0139923095703125,
0.021453857421875,
-0.048492431640625,
0.03826904296875,
-0.006618499755859375,
0.0863037109375,
0.02264404296875,
-0.009552001953125,
-0.00838470458984375,
-0.026397705078125,
0.050323486328125,
-0.054229736328125,
0.022674560546875,
0.00971221923828125,
0.0035610198974609375,
0.005340576171875,
-0.0684814453125,
-0.051177978515625,
-0.01019287109375,
0.0000013709068298339844,
0.0260467529296875,
-0.0133514404296875,
0.01019287109375,
0.0242462158203125,
0.049041748046875,
-0.03485107421875,
-0.00617218017578125,
-0.03851318359375,
-0.01436614990234375,
0.0631103515625,
-0.0012483596801757812,
0.0192108154296875,
-0.00655364990234375,
-0.038116455078125,
-0.0159912109375,
-0.0178070068359375,
0.01296234130859375,
0.004535675048828125,
0.0015649795532226562,
-0.050079345703125,
0.025115966796875,
-0.00675201416015625,
0.0271148681640625,
0.0285491943359375,
-0.01551055908203125,
0.032012939453125,
-0.01214599609375,
-0.0330810546875,
0.01082611083984375,
0.083984375,
0.0234222412109375,
0.0103759765625,
0.00556182861328125,
-0.00974273681640625,
0.0022029876708984375,
-0.0002040863037109375,
-0.09735107421875,
-0.039642333984375,
0.0290374755859375,
-0.024871826171875,
-0.03631591796875,
-0.0117950439453125,
-0.06451416015625,
-0.0108795166015625,
0.01464080810546875,
0.041717529296875,
-0.060516357421875,
-0.031890869140625,
0.0190277099609375,
-0.0168609619140625,
0.017120361328125,
0.0284423828125,
-0.055572509765625,
0.0254669189453125,
0.02557373046875,
0.07861328125,
-0.00640106201171875,
-0.01300048828125,
-0.01123809814453125,
-0.016387939453125,
-0.0212860107421875,
0.0439453125,
-0.01812744140625,
-0.029052734375,
-0.0107269287109375,
0.0082244873046875,
-0.0111541748046875,
-0.029052734375,
0.050079345703125,
-0.0276336669921875,
0.036407470703125,
-0.007595062255859375,
-0.050567626953125,
-0.02764892578125,
0.005939483642578125,
-0.04400634765625,
0.0933837890625,
0.02008056640625,
-0.07391357421875,
0.0159759521484375,
-0.054779052734375,
-0.031890869140625,
-0.001132965087890625,
0.002346038818359375,
-0.0562744140625,
0.0012960433959960938,
0.0238037109375,
0.049530029296875,
-0.0144500732421875,
0.0014410018920898438,
-0.020721435546875,
-0.030731201171875,
0.0041351318359375,
-0.026611328125,
0.0848388671875,
0.0208740234375,
-0.055145263671875,
-0.0002880096435546875,
-0.0498046875,
-0.004627227783203125,
0.0282745361328125,
-0.024749755859375,
0.00994873046875,
-0.0311431884765625,
0.0227203369140625,
0.0215606689453125,
0.0164031982421875,
-0.046142578125,
0.0147552490234375,
-0.02874755859375,
0.0367431640625,
0.048797607421875,
-0.001220703125,
0.038421630859375,
-0.0096893310546875,
0.0357666015625,
0.0212554931640625,
0.016448974609375,
-0.0201568603515625,
-0.059814453125,
-0.07611083984375,
-0.0305633544921875,
0.0113677978515625,
0.03857421875,
-0.058807373046875,
0.0310821533203125,
-0.00041556358337402344,
-0.041290283203125,
-0.048370361328125,
0.002178192138671875,
0.041778564453125,
0.048919677734375,
0.03277587890625,
-0.017425537109375,
-0.0184783935546875,
-0.060302734375,
0.0111236572265625,
0.00939178466796875,
0.01024627685546875,
0.0236968994140625,
0.050567626953125,
-0.016082763671875,
0.047515869140625,
-0.043426513671875,
-0.034088134765625,
-0.01059722900390625,
0.0019102096557617188,
0.0266876220703125,
0.04669189453125,
0.0535888671875,
-0.051544189453125,
-0.046966552734375,
-0.00390625,
-0.06524658203125,
0.01218414306640625,
-0.0106964111328125,
-0.005413055419921875,
0.031951904296875,
0.031463623046875,
-0.07135009765625,
0.043792724609375,
0.0421142578125,
-0.030609130859375,
0.041748046875,
-0.025115966796875,
0.007564544677734375,
-0.073974609375,
0.01174163818359375,
0.024169921875,
-0.01549530029296875,
-0.0275421142578125,
0.009185791015625,
0.007904052734375,
-0.01378631591796875,
-0.046600341796875,
0.060302734375,
-0.044036865234375,
0.0216522216796875,
-0.0118408203125,
0.0034198760986328125,
0.0130462646484375,
0.0479736328125,
0.00844573974609375,
0.051177978515625,
0.06683349609375,
-0.053466796875,
0.01605224609375,
0.00872802734375,
-0.030609130859375,
0.03228759765625,
-0.048309326171875,
0.0133056640625,
-0.01435089111328125,
0.0219879150390625,
-0.0787353515625,
-0.01041412353515625,
0.03839111328125,
-0.034393310546875,
0.043609619140625,
-0.0035190582275390625,
-0.02984619140625,
-0.04278564453125,
-0.0250091552734375,
0.0261993408203125,
0.06134033203125,
-0.0399169921875,
0.041290283203125,
0.01245880126953125,
0.018768310546875,
-0.0511474609375,
-0.05572509765625,
-0.0072784423828125,
-0.01546478271484375,
-0.05828857421875,
0.0506591796875,
-0.009552001953125,
0.00025725364685058594,
0.0070343017578125,
0.00202178955078125,
0.00469207763671875,
-0.00005918741226196289,
0.02044677734375,
0.010955810546875,
-0.021392822265625,
-0.01395416259765625,
0.01507568359375,
-0.0169525146484375,
0.005184173583984375,
-0.029205322265625,
0.039825439453125,
-0.018951416015625,
0.00527191162109375,
-0.071533203125,
0.0053253173828125,
0.026397705078125,
0.0059661865234375,
0.060516357421875,
0.08966064453125,
-0.03515625,
-0.00732421875,
-0.046356201171875,
-0.00960540771484375,
-0.043426513671875,
0.01776123046875,
-0.0304107666015625,
-0.055419921875,
0.03265380859375,
-0.00229644775390625,
0.01355743408203125,
0.04974365234375,
0.03936767578125,
-0.017913818359375,
0.066650390625,
0.04925537109375,
-0.0129241943359375,
0.038116455078125,
-0.0718994140625,
0.006072998046875,
-0.055084228515625,
-0.02386474609375,
-0.0075531005859375,
-0.031890869140625,
-0.03277587890625,
-0.049560546875,
0.022308349609375,
0.0259246826171875,
-0.0299072265625,
0.0182647705078125,
-0.054290771484375,
0.0252838134765625,
0.028839111328125,
0.021392822265625,
0.0012950897216796875,
0.010528564453125,
-0.01305389404296875,
0.0020160675048828125,
-0.055450439453125,
-0.0170745849609375,
0.061431884765625,
0.0292816162109375,
0.041717529296875,
-0.02154541015625,
0.052703857421875,
0.010986328125,
0.030609130859375,
-0.03643798828125,
0.041473388671875,
-0.00226593017578125,
-0.050140380859375,
0.001800537109375,
-0.0240631103515625,
-0.05908203125,
0.0139312744140625,
-0.0240631103515625,
-0.059173583984375,
0.0193023681640625,
0.018035888671875,
-0.0266265869140625,
0.041473388671875,
-0.061126708984375,
0.076416015625,
-0.01947021484375,
-0.050048828125,
-0.01129913330078125,
-0.0498046875,
0.03265380859375,
0.0170440673828125,
-0.002902984619140625,
-0.01153564453125,
-0.00612640380859375,
0.054931640625,
-0.0357666015625,
0.054351806640625,
-0.0311431884765625,
-0.0012302398681640625,
0.03997802734375,
-0.00804901123046875,
0.0215606689453125,
0.00699615478515625,
-0.02069091796875,
0.038848876953125,
-0.00799560546875,
-0.041595458984375,
-0.02862548828125,
0.06280517578125,
-0.06658935546875,
-0.028411865234375,
-0.033782958984375,
-0.0198974609375,
0.0165863037109375,
0.0206451416015625,
0.056640625,
0.02056884765625,
-0.01517486572265625,
-0.0010843276977539062,
0.063232421875,
-0.039764404296875,
0.054046630859375,
-0.0045013427734375,
-0.02374267578125,
-0.04400634765625,
0.07568359375,
-0.00952911376953125,
0.0118408203125,
0.0287017822265625,
0.021575927734375,
-0.0229949951171875,
-0.027130126953125,
-0.049713134765625,
0.0299072265625,
-0.0408935546875,
-0.025238037109375,
-0.0673828125,
-0.031402587890625,
-0.03302001953125,
-0.0220184326171875,
-0.046295166015625,
-0.018798828125,
-0.053741455078125,
0.0007958412170410156,
0.046539306640625,
0.03277587890625,
-0.00508880615234375,
0.03594970703125,
-0.0302581787109375,
0.0255889892578125,
0.004657745361328125,
0.0198822021484375,
0.01416778564453125,
-0.03790283203125,
-0.0176849365234375,
0.00228118896484375,
-0.039031982421875,
-0.045318603515625,
0.04217529296875,
0.023406982421875,
0.0169219970703125,
0.05511474609375,
-0.01108551025390625,
0.061676025390625,
-0.020782470703125,
0.054931640625,
0.02679443359375,
-0.0662841796875,
0.0330810546875,
-0.01800537109375,
0.023345947265625,
0.0272979736328125,
0.04461669921875,
-0.01788330078125,
-0.00969696044921875,
-0.06414794921875,
-0.062042236328125,
0.05950927734375,
0.03497314453125,
0.01427459716796875,
0.01006317138671875,
0.05126953125,
-0.007411956787109375,
0.01236724853515625,
-0.05633544921875,
-0.036376953125,
-0.0205841064453125,
-0.0091094970703125,
-0.00922393798828125,
-0.006122589111328125,
0.0121612548828125,
-0.04779052734375,
0.061553955078125,
0.0003898143768310547,
0.05218505859375,
0.0318603515625,
-0.0010986328125,
0.000040590763092041016,
-0.0250701904296875,
0.0262603759765625,
0.0221710205078125,
-0.020416259765625,
-0.00984954833984375,
0.01424407958984375,
-0.043243408203125,
-0.002071380615234375,
0.0173187255859375,
-0.0181884765625,
-0.0014095306396484375,
0.0156402587890625,
0.07489013671875,
0.00887298583984375,
-0.030120849609375,
0.043609619140625,
-0.0149078369140625,
-0.0219268798828125,
-0.030426025390625,
0.0162200927734375,
0.0208892822265625,
0.0295562744140625,
0.015869140625,
0.0140228271484375,
0.006927490234375,
-0.0275115966796875,
0.01378631591796875,
0.033660888671875,
-0.0279388427734375,
-0.0221710205078125,
0.07501220703125,
0.00772857666015625,
-0.0276947021484375,
0.056427001953125,
-0.0269012451171875,
-0.040618896484375,
0.05657958984375,
0.039031982421875,
0.0733642578125,
-0.01142120361328125,
0.0200347900390625,
0.049591064453125,
0.017333984375,
0.000025033950805664062,
0.021575927734375,
-0.0083160400390625,
-0.054229736328125,
-0.0184173583984375,
-0.0531005859375,
-0.004016876220703125,
0.0121307373046875,
-0.0293121337890625,
0.0357666015625,
-0.05950927734375,
-0.013092041015625,
0.00457763671875,
0.0191497802734375,
-0.0716552734375,
0.03173828125,
0.0237579345703125,
0.0767822265625,
-0.050872802734375,
0.06146240234375,
0.0316162109375,
-0.037994384765625,
-0.0465087890625,
-0.005626678466796875,
-0.0120849609375,
-0.0657958984375,
0.02996826171875,
0.039031982421875,
-0.006198883056640625,
0.0107269287109375,
-0.0562744140625,
-0.05816650390625,
0.08953857421875,
0.03167724609375,
-0.03472900390625,
-0.0016794204711914062,
-0.01279449462890625,
0.03936767578125,
-0.02978515625,
0.03082275390625,
0.03887939453125,
0.025848388671875,
0.0153656005859375,
-0.046783447265625,
0.0162506103515625,
-0.029052734375,
0.0009441375732421875,
0.005558013916015625,
-0.07318115234375,
0.07366943359375,
-0.044708251953125,
-0.0226287841796875,
0.01273345947265625,
0.06597900390625,
0.017242431640625,
0.0313720703125,
0.0258026123046875,
0.07318115234375,
0.0511474609375,
-0.002063751220703125,
0.09735107421875,
-0.023590087890625,
0.042083740234375,
0.048553466796875,
0.00717926025390625,
0.045440673828125,
0.0137786865234375,
-0.0131683349609375,
0.044586181640625,
0.0628662109375,
-0.0015840530395507812,
0.03826904296875,
0.00046372413635253906,
-0.02740478515625,
-0.0101165771484375,
0.01508331298828125,
-0.037261962890625,
0.01386260986328125,
0.034149169921875,
-0.037139892578125,
-0.0028228759765625,
0.0105133056640625,
0.0217437744140625,
-0.032928466796875,
-0.00998687744140625,
0.046661376953125,
0.0149078369140625,
-0.046630859375,
0.063232421875,
0.00722503662109375,
0.0753173828125,
-0.0288238525390625,
-0.00007766485214233398,
-0.01971435546875,
0.031890869140625,
-0.02838134765625,
-0.061126708984375,
0.00403594970703125,
-0.01448822021484375,
0.0024204254150390625,
-0.0138092041015625,
0.056884765625,
-0.02459716796875,
-0.057647705078125,
0.032501220703125,
0.00566864013671875,
0.0254974365234375,
-0.02197265625,
-0.09088134765625,
0.0191497802734375,
0.006023406982421875,
-0.034576416015625,
0.0243988037109375,
0.0257720947265625,
0.0186767578125,
0.052520751953125,
0.042144775390625,
-0.012542724609375,
0.010528564453125,
-0.0128936767578125,
0.0721435546875,
-0.04400634765625,
-0.0203704833984375,
-0.0672607421875,
0.054412841796875,
-0.016632080078125,
-0.038909912109375,
0.05023193359375,
0.05035400390625,
0.06573486328125,
-0.003330230712890625,
0.04559326171875,
-0.0213165283203125,
-0.006839752197265625,
-0.0338134765625,
0.06121826171875,
-0.0633544921875,
0.0033092498779296875,
-0.046966552734375,
-0.0633544921875,
-0.01052093505859375,
0.0684814453125,
-0.0105743408203125,
0.021759033203125,
0.04400634765625,
0.05889892578125,
-0.0234832763671875,
0.00458526611328125,
0.00812530517578125,
0.01910400390625,
0.0159912109375,
0.048187255859375,
0.0312347412109375,
-0.072021484375,
0.0304718017578125,
-0.060333251953125,
-0.0244293212890625,
-0.0081634521484375,
-0.0672607421875,
-0.06597900390625,
-0.0452880859375,
-0.058349609375,
-0.0506591796875,
-0.006511688232421875,
0.0439453125,
0.062744140625,
-0.04669189453125,
-0.011566162109375,
-0.0186309814453125,
0.00616455078125,
-0.01788330078125,
-0.0247039794921875,
0.040374755859375,
-0.01010894775390625,
-0.06744384765625,
0.0028095245361328125,
0.0199127197265625,
0.0016393661499023438,
-0.023223876953125,
-0.01165008544921875,
-0.02197265625,
-0.014923095703125,
0.052093505859375,
0.0220947265625,
-0.03973388671875,
-0.0006413459777832031,
-0.01274871826171875,
0.0039043426513671875,
0.0224761962890625,
0.0400390625,
-0.0626220703125,
0.028228759765625,
0.0301513671875,
0.03466796875,
0.08837890625,
-0.016326904296875,
0.01165771484375,
-0.040985107421875,
0.0283355712890625,
0.01314544677734375,
0.027862548828125,
0.0282135009765625,
-0.040924072265625,
0.032012939453125,
0.031280517578125,
-0.043304443359375,
-0.04534912109375,
-0.01169586181640625,
-0.09088134765625,
-0.024017333984375,
0.08392333984375,
-0.01483154296875,
-0.03863525390625,
0.0173797607421875,
-0.02557373046875,
0.02154541015625,
-0.039642333984375,
0.054046630859375,
0.0266876220703125,
-0.0194854736328125,
-0.040985107421875,
-0.0338134765625,
0.045440673828125,
0.01500701904296875,
-0.048370361328125,
-0.0242462158203125,
0.0274505615234375,
0.050689697265625,
0.0177154541015625,
0.068359375,
-0.0032978057861328125,
0.00995635986328125,
0.0106658935546875,
0.01549530029296875,
0.01326751708984375,
-0.00968170166015625,
-0.023590087890625,
0.008544921875,
-0.03271484375,
-0.01708984375
]
] |
VoVanPhuc/sup-SimCSE-VietNamese-phobert-base | 2023-05-24T08:47:42.000Z | [
"transformers",
"pytorch",
"roberta",
"sentence-similarity",
"vi",
"arxiv:2104.08821",
"endpoints_compatible",
"region:us"
] | sentence-similarity | VoVanPhuc | null | null | VoVanPhuc/sup-SimCSE-VietNamese-phobert-base | 7 | 76,338 | transformers | 2022-03-02T23:29:05 | ---
language:
- vi
pipeline_tag: sentence-similarity
---
#### Table of contents
1. [Introduction](#introduction)
2. [Pretrain model](#models)
3. [Using SimeCSE_Vietnamese with `sentence-transformers`](#sentences-transformers)
- [Installation](#install1)
- [Example usage](#usage1)
4. [Using SimeCSE_Vietnamese with `transformers`](#transformers)
- [Installation](#install2)
- [Example usage](#usage2)
# <a name="introduction"></a> SimeCSE_Vietnamese: Simple Contrastive Learning of Sentence Embeddings with Vietnamese
Pre-trained SimeCSE_Vietnamese models are the state of the art for Vietnamese sentence embeddings:
- The SimeCSE_Vietnamese pre-training approach is based on [SimCSE](https://arxiv.org/abs/2104.08821), optimizing the pre-training procedure for more robust performance.
- SimeCSE_Vietnamese encodes input sentences using a pre-trained language model such as [PhoBERT](https://www.aclweb.org/anthology/2020.findings-emnlp.92/).
- SimeCSE_Vietnamese works with both unlabeled and labeled data.
## Pre-trained models <a name="models"></a>
Model | #params | Arch.
---|---|---
[`VoVanPhuc/sup-SimCSE-VietNamese-phobert-base`](https://huggingface.co/VoVanPhuc/sup-SimCSE-VietNamese-phobert-base) | 135M | base
[`VoVanPhuc/unsup-SimCSE-VietNamese-phobert-base`](https://huggingface.co/VoVanPhuc/unsup-SimCSE-VietNamese-phobert-base) | 135M | base
## <a name="sentences-transformers"></a> Using SimeCSE_Vietnamese with `sentence-transformers`
### Installation <a name="install1"></a>
- Install `sentence-transformers`:
- `pip install -U sentence-transformers`
- Install `pyvi` for Vietnamese word segmentation:
- `pip install pyvi`
### Example usage <a name="usage1"></a>
```python
from sentence_transformers import SentenceTransformer
from pyvi.ViTokenizer import tokenize
model = SentenceTransformer('VoVanPhuc/sup-SimCSE-VietNamese-phobert-base')
sentences = ['Kẻ đánh bom đinh tồi tệ nhất nước Anh.',
'Nghệ sĩ làm thiện nguyện - minh bạch là việc cấp thiết.',
'Bắc Giang tăng khả năng điều trị và xét nghiệm.',
'HLV futsal Việt Nam tiết lộ lý do hạ Lebanon.',
'việc quan trọng khi kêu gọi quyên góp từ thiện là phải minh bạch, giải ngân kịp thời.',
'20% bệnh nhân Covid-19 có thể nhanh chóng trở nặng.',
'Thái Lan thua giao hữu trước vòng loại World Cup.',
'Cựu tuyển thủ Nguyễn Bảo Quân: May mắn ủng hộ futsal Việt Nam',
'Chủ ki-ốt bị đâm chết trong chợ đầu mối lớn nhất Thanh Hoá.',
'Bắn chết người trong cuộc rượt đuổi trên sông.'
]
sentences = [tokenize(sentence) for sentence in sentences]
embeddings = model.encode(sentences)
```
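With the embeddings in hand, the similarity between two sentences is typically scored as the cosine similarity of their embedding vectors. A minimal sketch with plain NumPy (the vectors below are placeholders standing in for rows of the `embeddings` array returned by `model.encode`):

```python
import numpy as np

def cosine_similarity(a, b):
    # cosine similarity = dot(a, b) / (||a|| * ||b||)
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder vectors standing in for two rows of `embeddings`.
score = cosine_similarity([0.1, 0.3, -0.2], [0.1, 0.29, -0.18])
print(score)
```

Scores close to 1.0 indicate near-identical meaning; scores near 0 indicate unrelated sentences.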
## <a name="transformers"></a> Using SimeCSE_Vietnamese with `transformers`
### Installation <a name="install2"></a>
- Install `transformers`:
- `pip install -U transformers`
- Install `pyvi` for Vietnamese word segmentation:
- `pip install pyvi`
### Example usage <a name="usage2"></a>
```python
import torch
from transformers import AutoModel, AutoTokenizer
from pyvi.ViTokenizer import tokenize
PhobertTokenizer = AutoTokenizer.from_pretrained("VoVanPhuc/sup-SimCSE-VietNamese-phobert-base")
model = AutoModel.from_pretrained("VoVanPhuc/sup-SimCSE-VietNamese-phobert-base")
sentences = ['Kẻ đánh bom đinh tồi tệ nhất nước Anh.',
'Nghệ sĩ làm thiện nguyện - minh bạch là việc cấp thiết.',
'Bắc Giang tăng khả năng điều trị và xét nghiệm.',
'HLV futsal Việt Nam tiết lộ lý do hạ Lebanon.',
'việc quan trọng khi kêu gọi quyên góp từ thiện là phải minh bạch, giải ngân kịp thời.',
'20% bệnh nhân Covid-19 có thể nhanh chóng trở nặng.',
'Thái Lan thua giao hữu trước vòng loại World Cup.',
'Cựu tuyển thủ Nguyễn Bảo Quân: May mắn ủng hộ futsal Việt Nam',
'Chủ ki-ốt bị đâm chết trong chợ đầu mối lớn nhất Thanh Hoá.',
'Bắn chết người trong cuộc rượt đuổi trên sông.'
]
sentences = [tokenize(sentence) for sentence in sentences]
inputs = PhobertTokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
embeddings = model(**inputs, output_hidden_states=True, return_dict=True).pooler_output
```
## Quick Start
[Open In Colab](https://colab.research.google.com/drive/12__EXJoQYHe9nhi4aXLTf9idtXT8yr7H?usp=sharing)
## Citation
```
@article{gao2021simcse,
  title={{SimCSE}: Simple Contrastive Learning of Sentence Embeddings},
  author={Gao, Tianyu and Yao, Xingcheng and Chen, Danqi},
  journal={arXiv preprint arXiv:2104.08821},
  year={2021}
}

@inproceedings{phobert,
  title = {{PhoBERT: Pre-trained language models for Vietnamese}},
  author = {Dat Quoc Nguyen and Anh Tuan Nguyen},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2020},
  year = {2020},
  pages = {1037--1042}
}
```
 | 4,905 | [
[
-0.0210418701171875,
-0.05731201171875,
0.0263519287109375,
0.029571533203125,
-0.033172607421875,
-0.01605224609375,
-0.01058197021484375,
-0.004108428955078125,
0.0185089111328125,
0.028045654296875,
-0.0361328125,
-0.05023193359375,
-0.037933349609375,
0.0242767333984375,
-0.00641632080078125,
0.04632568359375,
-0.017364501953125,
0.0014677047729492188,
0.002346038818359375,
-0.018157958984375,
-0.034332275390625,
-0.04119873046875,
-0.0212249755859375,
-0.0221710205078125,
0.0133514404296875,
0.03216552734375,
0.0279083251953125,
0.042633056640625,
0.0257568359375,
0.0384521484375,
0.0029659271240234375,
0.0255279541015625,
-0.02734375,
-0.007274627685546875,
0.006137847900390625,
-0.033477783203125,
-0.0108795166015625,
0.00981903076171875,
0.031646728515625,
0.0254364013671875,
0.004241943359375,
-0.01238250732421875,
-0.00356292724609375,
0.0245513916015625,
-0.03289794921875,
0.0274200439453125,
-0.03607177734375,
-0.0010595321655273438,
0.0016040802001953125,
-0.00475311279296875,
-0.0221710205078125,
-0.0281829833984375,
0.018463134765625,
-0.050323486328125,
-0.00620269775390625,
0.00048470497131347656,
0.10784912109375,
0.0147705078125,
-0.027801513671875,
-0.028533935546875,
-0.0153045654296875,
0.0728759765625,
-0.050079345703125,
0.0177001953125,
0.0262298583984375,
-0.00820159912109375,
0.0008134841918945312,
-0.062103271484375,
-0.050079345703125,
-0.019256591796875,
-0.031005859375,
0.038818359375,
-0.02838134765625,
0.0013895034790039062,
0.0112762451171875,
0.0189361572265625,
-0.048187255859375,
-0.0028247833251953125,
-0.0252685546875,
-0.0122222900390625,
0.0474853515625,
0.00516510009765625,
0.03631591796875,
-0.042449951171875,
-0.0364990234375,
-0.02337646484375,
-0.032623291015625,
-0.00852203369140625,
0.009552001953125,
0.004058837890625,
-0.041259765625,
0.06121826171875,
-0.007244110107421875,
0.05413818359375,
-0.00396728515625,
-0.01397705078125,
0.051116943359375,
-0.040863037109375,
-0.018463134765625,
-0.007587432861328125,
0.08380126953125,
0.02850341796875,
0.04058837890625,
0.0031604766845703125,
-0.0019016265869140625,
0.00855255126953125,
-0.0192413330078125,
-0.054931640625,
-0.0271148681640625,
0.0152130126953125,
-0.0301971435546875,
-0.0008635520935058594,
0.0220947265625,
-0.05706787109375,
0.006259918212890625,
-0.01019287109375,
0.058258056640625,
-0.0589599609375,
-0.00228118896484375,
0.0302886962890625,
-0.0311431884765625,
0.0292205810546875,
-0.0058441162109375,
-0.03570556640625,
-0.0046234130859375,
0.033477783203125,
0.06964111328125,
-0.0005192756652832031,
-0.048919677734375,
-0.03411865234375,
-0.0012445449829101562,
0.00016176700592041016,
0.0423583984375,
-0.0215911865234375,
-0.038665771484375,
0.020843505859375,
0.01666259765625,
-0.03759765625,
-0.0325927734375,
0.038726806640625,
0.0008687973022460938,
0.043670654296875,
-0.00665283203125,
-0.0596923828125,
-0.0100860595703125,
0.01213836669921875,
-0.0218353271484375,
0.08099365234375,
0.0145111083984375,
-0.0789794921875,
0.0007739067077636719,
-0.04620361328125,
-0.035186767578125,
-0.01294708251953125,
-0.008514404296875,
-0.03863525390625,
-0.003383636474609375,
0.042388916015625,
0.041595458984375,
0.0019168853759765625,
-0.01212310791015625,
0.0125579833984375,
-0.0282745361328125,
0.0188140869140625,
-0.0184326171875,
0.08966064453125,
0.0175018310546875,
-0.033721923828125,
0.0150299072265625,
-0.0517578125,
-0.00829315185546875,
0.018585205078125,
-0.0207061767578125,
-0.0245208740234375,
-0.0265045166015625,
0.01418304443359375,
0.018585205078125,
0.01348114013671875,
-0.0399169921875,
0.01202392578125,
-0.049041748046875,
0.04547119140625,
0.044830322265625,
0.017242431640625,
0.024383544921875,
-0.0242767333984375,
0.0413818359375,
0.028228759765625,
-0.002147674560546875,
-0.0239105224609375,
-0.03729248046875,
-0.0887451171875,
-0.0302886962890625,
0.019378662109375,
0.06884765625,
-0.08282470703125,
0.06689453125,
-0.0283660888671875,
-0.057037353515625,
-0.061614990234375,
-0.000698089599609375,
0.028533935546875,
0.034881591796875,
0.042724609375,
-0.01065826416015625,
-0.04608154296875,
-0.0556640625,
-0.0266571044921875,
-0.0212554931640625,
0.0073089599609375,
0.0176544189453125,
0.06402587890625,
-0.0297088623046875,
0.05499267578125,
-0.06793212890625,
-0.0253143310546875,
-0.03216552734375,
0.01007843017578125,
0.0035343170166015625,
0.036041259765625,
0.038299560546875,
-0.062042236328125,
-0.030914306640625,
-0.018157958984375,
-0.042724609375,
-0.005702972412109375,
-0.01025390625,
-0.015380859375,
0.00518798828125,
0.052947998046875,
-0.040771484375,
0.029022216796875,
0.03436279296875,
-0.037139892578125,
0.04229736328125,
-0.030517578125,
-0.0025577545166015625,
-0.10076904296875,
-0.019683837890625,
0.00490570068359375,
-0.004608154296875,
-0.031280517578125,
-0.0009741783142089844,
-0.01346588134765625,
0.00579071044921875,
-0.04638671875,
0.05413818359375,
-0.03985595703125,
0.035797119140625,
0.0021724700927734375,
0.0274810791015625,
-0.0027313232421875,
0.037933349609375,
0.0168304443359375,
0.04632568359375,
0.046844482421875,
-0.043487548828125,
0.0263519287109375,
0.036865234375,
-0.0328369140625,
0.0296630859375,
-0.054473876953125,
-0.0106353759765625,
-0.0103607177734375,
0.00482940673828125,
-0.0731201171875,
-0.006011962890625,
0.03875732421875,
-0.03228759765625,
0.004314422607421875,
0.008209228515625,
-0.03662109375,
-0.026824951171875,
-0.019287109375,
0.01323699951171875,
0.048004150390625,
-0.03466796875,
0.036529541015625,
0.0188751220703125,
0.00592041015625,
-0.033416748046875,
-0.07476806640625,
-0.00739288330078125,
-0.0338134765625,
-0.038726806640625,
0.024505615234375,
-0.0025577545166015625,
0.00237274169921875,
0.00292205810546875,
0.0230255126953125,
-0.01123046875,
0.0010900497436523438,
0.01251983642578125,
0.01434326171875,
-0.0177459716796875,
0.01393890380859375,
-0.0166168212890625,
-0.0144500732421875,
-0.0137939453125,
-0.0179901123046875,
0.059112548828125,
-0.0212554931640625,
-0.006374359130859375,
-0.04144287109375,
0.01806640625,
0.01331329345703125,
-0.0347900390625,
0.058441162109375,
0.0869140625,
-0.0158538818359375,
0.00655364990234375,
-0.042999267578125,
-0.00243377685546875,
-0.038543701171875,
0.041595458984375,
-0.033294677734375,
-0.06927490234375,
0.019378662109375,
-0.00555419921875,
-0.006572723388671875,
0.0474853515625,
0.049713134765625,
0.005523681640625,
0.05206298828125,
0.044403076171875,
-0.00984954833984375,
0.04840087890625,
-0.04400634765625,
0.01306915283203125,
-0.045074462890625,
-0.0200958251953125,
-0.0307769775390625,
0.005252838134765625,
-0.0723876953125,
-0.05914306640625,
0.01525115966796875,
0.00653839111328125,
-0.0224761962890625,
0.041595458984375,
-0.043975830078125,
0.01174163818359375,
0.045196533203125,
-0.0009326934814453125,
0.0084686279296875,
0.0166473388671875,
-0.031768798828125,
-0.017486572265625,
-0.056365966796875,
-0.03680419921875,
0.04620361328125,
0.02197265625,
0.04315185546875,
-0.0176849365234375,
0.0726318359375,
-0.01514434814453125,
0.0015764236450195312,
-0.0635986328125,
0.049835205078125,
-0.0211181640625,
-0.01395416259765625,
-0.0173187255859375,
-0.0333251953125,
-0.07861328125,
0.0189361572265625,
-0.004093170166015625,
-0.05706787109375,
0.019439697265625,
-0.023681640625,
-0.01561737060546875,
0.0064849853515625,
-0.05780029296875,
0.07415771484375,
-0.0212860107421875,
-0.0042572021484375,
0.004573822021484375,
-0.042449951171875,
0.01271820068359375,
0.0242156982421875,
0.003589630126953125,
-0.00391387939453125,
-0.017852783203125,
0.05908203125,
-0.0491943359375,
0.051971435546875,
0.002857208251953125,
0.0091705322265625,
0.037078857421875,
-0.02197265625,
0.020843505859375,
0.00843048095703125,
0.0030574798583984375,
0.0108184814453125,
-0.01065826416015625,
-0.0325927734375,
-0.0223236083984375,
0.044189453125,
-0.07159423828125,
-0.037017822265625,
-0.053009033203125,
-0.025970458984375,
0.0081939697265625,
0.0206451416015625,
0.04998779296875,
0.009124755859375,
-0.008026123046875,
0.005970001220703125,
0.02471923828125,
-0.0386962890625,
0.036285400390625,
0.0210723876953125,
0.0015201568603515625,
-0.050628662109375,
0.07623291015625,
0.0180511474609375,
-0.0003936290740966797,
0.0440673828125,
0.022613525390625,
-0.02337646484375,
-0.01409912109375,
-0.00998687744140625,
0.050811767578125,
-0.03973388671875,
0.0016326904296875,
-0.06793212890625,
-0.030853271484375,
-0.048736572265625,
-0.01873779296875,
-0.0204315185546875,
-0.0242156982421875,
-0.027496337890625,
-0.0181732177734375,
0.03558349609375,
0.03857421875,
-0.0119171142578125,
0.044525146484375,
-0.04461669921875,
0.0333251953125,
0.0160675048828125,
-0.00921630859375,
-0.0211181640625,
-0.033416748046875,
-0.023895263671875,
-0.001552581787109375,
-0.02978515625,
-0.0711669921875,
0.04376220703125,
0.0048980712890625,
0.032257080078125,
0.0311737060546875,
0.0014438629150390625,
0.06439208984375,
-0.03558349609375,
0.0743408203125,
0.0188140869140625,
-0.0870361328125,
0.046661376953125,
-0.004245758056640625,
0.0252685546875,
0.031463623046875,
0.03466796875,
-0.051422119140625,
-0.024688720703125,
-0.0521240234375,
-0.0718994140625,
0.038360595703125,
0.0394287109375,
0.0177001953125,
-0.0168609619140625,
0.019989013671875,
-0.0200653076171875,
0.00812530517578125,
-0.060302734375,
-0.045867919921875,
-0.039886474609375,
-0.033782958984375,
-0.0225830078125,
-0.01271820068359375,
0.011566162109375,
-0.039825439453125,
0.053924560546875,
0.009857177734375,
0.02850341796875,
0.05072021484375,
-0.0092926025390625,
0.03753662109375,
0.012115478515625,
0.04107666015625,
0.0226593017578125,
-0.0037174224853515625,
0.002105712890625,
0.016693115234375,
-0.03851318359375,
0.01922607421875,
0.018585205078125,
-0.0189666748046875,
0.034759521484375,
0.040771484375,
0.07373046875,
0.00853729248046875,
-0.046844482421875,
0.0435791015625,
-0.006229400634765625,
-0.00904083251953125,
-0.034912109375,
-0.0045166015625,
0.021392822265625,
0.005901336669921875,
-0.003955841064453125,
-0.0193634033203125,
-0.0173797607421875,
-0.04656982421875,
0.0190582275390625,
0.00754547119140625,
-0.04095458984375,
-0.0113067626953125,
0.04229736328125,
0.00824737548828125,
-0.01065826416015625,
0.059326171875,
-0.0279998779296875,
-0.0638427734375,
0.044189453125,
0.034027099609375,
0.07159423828125,
-0.02374267578125,
0.0303802490234375,
0.05126953125,
0.0294952392578125,
-0.01267242431640625,
0.0382080078125,
0.0277252197265625,
-0.054656982421875,
-0.004302978515625,
-0.042572021484375,
0.011871337890625,
0.0091552734375,
-0.0318603515625,
0.031280517578125,
-0.01080322265625,
-0.00972747802734375,
-0.023681640625,
0.016845703125,
-0.07318115234375,
0.0183258056640625,
0.004512786865234375,
0.057891845703125,
-0.0736083984375,
0.06121826171875,
0.058380126953125,
-0.059417724609375,
-0.068359375,
0.0145111083984375,
-0.01922607421875,
-0.06549072265625,
0.041595458984375,
0.0205535888671875,
0.01183319091796875,
0.007808685302734375,
-0.0275726318359375,
-0.0465087890625,
0.07269287109375,
0.019378662109375,
-0.0236053466796875,
-0.0098724365234375,
0.0273284912109375,
0.038482666015625,
-0.0396728515625,
0.028411865234375,
0.051422119140625,
0.051055908203125,
-0.001102447509765625,
-0.0455322265625,
0.004604339599609375,
-0.0142059326171875,
0.002170562744140625,
-0.0291900634765625,
-0.07257080078125,
0.08087158203125,
-0.01873779296875,
-0.01505279541015625,
0.0216217041015625,
0.06402587890625,
0.033447265625,
0.008331298828125,
0.038330078125,
0.046051025390625,
0.045867919921875,
-0.021942138671875,
0.073974609375,
-0.01264190673828125,
0.0291595458984375,
0.07073974609375,
0.01445770263671875,
0.0648193359375,
0.033447265625,
-0.007415771484375,
0.047119140625,
0.053253173828125,
-0.023040771484375,
0.036224365234375,
0.0240020751953125,
0.008514404296875,
-0.0216522216796875,
-0.018585205078125,
-0.028717041015625,
0.060699462890625,
0.018310546875,
-0.026702880859375,
0.0033893585205078125,
0.01006317138671875,
0.03521728515625,
0.004688262939453125,
-0.004199981689453125,
0.04656982421875,
0.031890869140625,
-0.046112060546875,
0.04541015625,
0.005680084228515625,
0.0718994140625,
-0.045135498046875,
-0.0006966590881347656,
-0.0182952880859375,
0.0234222412109375,
-0.0049591064453125,
-0.038818359375,
-0.01261138916015625,
-0.002468109130859375,
0.012237548828125,
-0.0035648345947265625,
0.027557373046875,
-0.06402587890625,
-0.06402587890625,
0.03271484375,
0.055999755859375,
0.01433563232421875,
-0.001026153564453125,
-0.075927734375,
-0.002201080322265625,
0.0075836181640625,
-0.027313232421875,
-0.00001704692840576172,
0.048095703125,
0.0164031982421875,
0.037811279296875,
0.0130157470703125,
0.015106201171875,
0.0148468017578125,
0.00341796875,
0.048736572265625,
-0.05780029296875,
-0.045562744140625,
-0.07049560546875,
0.04632568359375,
-0.0096435546875,
-0.040924072265625,
0.08154296875,
0.05474853515625,
0.07305908203125,
-0.017333984375,
0.06304931640625,
0.00531768798828125,
0.01238250732421875,
-0.048431396484375,
0.05450439453125,
-0.057098388671875,
0.00200653076171875,
-0.0309295654296875,
-0.053497314453125,
-0.01142120361328125,
0.059783935546875,
-0.0226898193359375,
0.00592041015625,
0.0689697265625,
0.0626220703125,
-0.0010223388671875,
-0.0282745361328125,
0.00872802734375,
0.04254150390625,
0.027191162109375,
0.049041748046875,
0.050994873046875,
-0.06646728515625,
0.050689697265625,
-0.045318603515625,
0.00567626953125,
-0.025238037109375,
-0.053436279296875,
-0.07501220703125,
-0.06195068359375,
-0.050933837890625,
-0.0258331298828125,
-0.0031795501708984375,
0.06829833984375,
0.0318603515625,
-0.055938720703125,
-0.021942138671875,
-0.0004305839538574219,
0.0200958251953125,
-0.020416259765625,
-0.019683837890625,
0.068359375,
-0.0181732177734375,
-0.07464599609375,
0.01959228515625,
0.004329681396484375,
0.02374267578125,
0.0018711090087890625,
-0.00507354736328125,
-0.045928955078125,
0.01039886474609375,
0.044189453125,
0.004352569580078125,
-0.057647705078125,
-0.017333984375,
0.0007314682006835938,
-0.0284423828125,
0.0038318634033203125,
0.044647216796875,
-0.051025390625,
0.028900146484375,
0.055572509765625,
0.0187530517578125,
0.027496337890625,
0.007717132568359375,
0.031707763671875,
-0.0491943359375,
0.047515869140625,
0.01166534423828125,
0.04364013671875,
0.0302734375,
0.0003178119659423828,
0.050689697265625,
0.031097412109375,
-0.03179931640625,
-0.0494384765625,
0.0013599395751953125,
-0.0738525390625,
-0.0245208740234375,
0.0927734375,
-0.030914306640625,
-0.020721435546875,
-0.006656646728515625,
-0.036529541015625,
0.048919677734375,
-0.01190185546875,
0.037322998046875,
0.05743408203125,
-0.0017690658569335938,
0.0026874542236328125,
-0.032318115234375,
0.03570556640625,
0.037139892578125,
-0.0579833984375,
-0.0109100341796875,
0.01003265380859375,
0.020721435546875,
0.01019287109375,
0.044464111328125,
0.00665283203125,
-0.001903533935546875,
0.01064300537109375,
-0.003871917724609375,
0.0015354156494140625,
0.0033435821533203125,
-0.0221405029296875,
-0.004604339599609375,
-0.02978515625,
-0.03167724609375
]
] |
facebook/encodec_32khz | 2023-09-04T16:32:53.000Z | [
"transformers",
"pytorch",
"safetensors",
"encodec",
"feature-extraction",
"arxiv:2306.05284",
"has_space",
"region:us"
] | feature-extraction | facebook | null | null | facebook/encodec_32khz | 7 | 76,329 | transformers | 2023-06-15T12:01:17 | ---
inference: false
---

# Model Card for EnCodec
This model card provides details and information about EnCodec 32kHz, a state-of-the-art real-time audio codec developed by Meta AI.
This EnCodec checkpoint was trained specifically as part of the [MusicGen project](https://huggingface.co/docs/transformers/main/model_doc/musicgen),
and is intended to be used in conjunction with the MusicGen models.
## Model Details
### Model Description
EnCodec is a high-fidelity audio codec leveraging neural networks. It introduces a streaming encoder-decoder architecture with quantized latent space, trained in an end-to-end fashion.
The model simplifies and speeds up training using a single multiscale spectrogram adversary that efficiently reduces artifacts and produces high-quality samples.
It also includes a novel loss balancer mechanism that stabilizes training by decoupling the choice of hyperparameters from the typical scale of the loss.
Additionally, lightweight Transformer models are used to further compress the obtained representation while maintaining real-time performance. This variant of EnCodec is
trained on 20k hours of music data, consisting of an internal dataset of 10k high-quality music tracks together with the ShutterStock and Pond5 music datasets.
- **Developed by:** Meta AI
- **Model type:** Audio Codec
### Model Sources
- **Repository:** [GitHub Repository](https://github.com/facebookresearch/audiocraft)
- **Paper:** [Simple and Controllable Music Generation](https://arxiv.org/abs/2306.05284)
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
EnCodec can be used directly as an audio codec for real-time compression and decompression of audio signals.
It provides high-quality audio compression and efficient decoding. The model was trained at various bandwidths, which can be specified when encoding (compressing) and decoding (decompressing).
Two different setups exist for EnCodec:
- Non-streamable: the input audio is split into chunks of 1 second, with an overlap of 10 ms, which are then encoded.
- Streamable: weight normalization is used on the convolution layers, and the input is not split into chunks but rather padded on the left.
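As a rough illustration of the non-streamable chunking described above, the window starts advance by the chunk length minus the overlap, so consecutive chunks share the overlap region. This is a hypothetical helper for intuition only, not the library's internal implementation:

```python
def chunk_starts(num_samples, sampling_rate, chunk_s=1.0, overlap_s=0.01):
    # Starts of successive chunk_s-second windows whose strides leave
    # overlap_s seconds shared between consecutive chunks.
    chunk = int(chunk_s * sampling_rate)
    stride = int((chunk_s - overlap_s) * sampling_rate)
    return list(range(0, max(num_samples - chunk, 0) + 1, stride))

# Two seconds of 32 kHz audio -> two overlapping one-second chunks.
print(chunk_starts(64_000, 32_000))
```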
### Downstream Use
This variant of EnCodec is designed to be used in conjunction with the official [MusicGen checkpoints](https://huggingface.co/models?search=facebook/musicgen-).
However, it can also be used standalone to encode audio files.
## How to Get Started with the Model
Use the following code to get started with the EnCodec model using a dummy example from the LibriSpeech dataset (~9MB). First, install the required Python packages:
```
pip install --upgrade pip
pip install --upgrade transformers datasets[audio]
```
Then load an audio sample, and run a forward pass of the model:
```python
from datasets import load_dataset, Audio
from transformers import EncodecModel, AutoProcessor
# load a demonstration dataset
librispeech_dummy = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
# load the model + processor (for pre-processing the audio)
model = EncodecModel.from_pretrained("facebook/encodec_32khz")
processor = AutoProcessor.from_pretrained("facebook/encodec_32khz")
# cast the audio data to the correct sampling rate for the model
librispeech_dummy = librispeech_dummy.cast_column("audio", Audio(sampling_rate=processor.sampling_rate))
audio_sample = librispeech_dummy[0]["audio"]["array"]
# pre-process the inputs
inputs = processor(raw_audio=audio_sample, sampling_rate=processor.sampling_rate, return_tensors="pt")
# explicitly encode then decode the audio inputs
encoder_outputs = model.encode(inputs["input_values"], inputs["padding_mask"])
audio_values = model.decode(encoder_outputs.audio_codes, encoder_outputs.audio_scales, inputs["padding_mask"])[0]
# or the equivalent with a forward pass
audio_values = model(inputs["input_values"], inputs["padding_mask"]).audio_values
```
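The effective bitrate of the discrete codes follows directly from the frame rate and codebook sizes. A sketch of that arithmetic for the 32 kHz variant this card describes; the configuration values below (hop length, number of codebooks, codebook size) are assumptions about this checkpoint, so in practice read them from `model.config` rather than hard-coding:

```python
import math

# Assumed configuration for the 32 kHz MusicGen variant of EnCodec;
# check model.config for the actual values of this checkpoint.
sampling_rate = 32_000   # Hz
hop_length = 640         # waveform samples per latent frame
num_codebooks = 4        # residual vector-quantizer stages
codebook_size = 2_048    # entries per codebook

frame_rate = sampling_rate / hop_length                   # latent frames per second
bits_per_frame = num_codebooks * math.log2(codebook_size) # bits per latent frame
bitrate_kbps = frame_rate * bits_per_frame / 1_000
print(frame_rate, bitrate_kbps)
```

Under these assumptions the codec emits 50 latent frames per second at roughly 2.2 kbps.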
## Evaluation
For evaluation results, refer to the [MusicGen evaluation scores](https://huggingface.co/facebook/musicgen-large#evaluation-results).
## Summary
EnCodec is a state-of-the-art real-time neural audio compression model that excels in producing high-fidelity audio samples at various sample rates and bandwidths.
The model's performance was evaluated across different settings, ranging from 24kHz monophonic at 1.5 kbps to 48kHz stereophonic, showcasing both subjective and
objective results. Notably, EnCodec incorporates a novel spectrogram-only adversarial loss, effectively reducing artifacts and enhancing sample quality.
Training stability and interpretability were further enhanced through the introduction of a gradient balancer for the loss weights.
Additionally, the study demonstrated that a compact Transformer model can be employed to achieve an additional bandwidth reduction of up to 40% without compromising
quality, particularly in applications where low latency is not critical (e.g., music streaming).
## Citation
**BibTeX:**
```
@misc{copet2023simple,
title={Simple and Controllable Music Generation},
author={Jade Copet and Felix Kreuk and Itai Gat and Tal Remez and David Kant and Gabriel Synnaeve and Yossi Adi and Alexandre Défossez},
year={2023},
eprint={2306.05284},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
``` | 5,590 | [
] |
bigscience/bloomz-7b1 | 2023-05-27T17:25:52.000Z | [
"transformers",
"pytorch",
"bloom",
"text-generation",
"ak",
"ar",
"as",
"bm",
"bn",
"ca",
"code",
"en",
"es",
"eu",
"fon",
"fr",
"gu",
"hi",
"id",
"ig",
"ki",
"kn",
"lg",
"ln",
"ml",
"mr",
"ne",
"nso",
"ny",
"or",
"pa",
"pt",
"rn",
"rw",
"sn",
"st",
"sw",
"ta",
"te",
"tn",
"ts",
"tum",
"tw",
"ur",
"vi",
"wo",
"xh",
"yo",
"zh",
"zu",
"dataset:bigscience/xP3",
"arxiv:2211.01786",
"license:bigscience-bloom-rail-1.0",
"model-index",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | bigscience | null | null | bigscience/bloomz-7b1 | 113 | 76,245 | transformers | 2022-09-27T09:00:57 | ---
datasets:
- bigscience/xP3
license: bigscience-bloom-rail-1.0
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
pipeline_tag: text-generation
widget:
- text: "一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。Would you rate the previous review as positive, neutral or negative?"
example_title: "zh-en sentiment"
- text: "一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。你认为这句话的立场是赞扬、中立还是批评?"
example_title: "zh-zh sentiment"
- text: "Suggest at least five related search terms to \"Mạng neural nhân tạo\"."
example_title: "vi-en query"
- text: "Proposez au moins cinq mots clés concernant «Réseau de neurones artificiels»."
example_title: "fr-fr query"
- text: "Explain in a sentence in Telugu what is backpropagation in neural networks."
example_title: "te-en qa"
- text: "Why is the sky blue?"
example_title: "en-en qa"
- text: "Write a fairy tale about a troll saving a princess from a dangerous dragon. The fairy tale is a masterpiece that has achieved praise worldwide and its moral is \"Heroes Come in All Shapes and Sizes\". Story (in Spanish):"
example_title: "es-en fable"
- text: "Write a fable about wood elves living in a forest that is suddenly invaded by ogres. The fable is a masterpiece that has achieved praise worldwide and its moral is \"Violence is the last refuge of the incompetent\". Fable (in Hindi):"
example_title: "hi-en fable"
model-index:
- name: bloomz-7b1
results:
- task:
type: Coreference resolution
dataset:
type: winogrande
name: Winogrande XL (xl)
config: xl
split: validation
revision: a80f460359d1e9a67c006011c94de42a8759430c
metrics:
- type: Accuracy
value: 55.8
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (en)
config: en
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 66.02
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (fr)
config: fr
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 57.83
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (jp)
config: jp
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 52.87
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (pt)
config: pt
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 57.79
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (ru)
config: ru
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 54.92
- task:
type: Coreference resolution
dataset:
type: Muennighoff/xwinograd
name: XWinograd (zh)
config: zh
split: test
revision: 9dd5ea5505fad86b7bedad667955577815300cee
metrics:
- type: Accuracy
value: 63.69
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r1)
config: r1
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 42.1
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r2)
config: r2
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 39.5
- task:
type: Natural language inference
dataset:
type: anli
name: ANLI (r3)
config: r3
split: validation
revision: 9dbd830a06fea8b1c49d6e5ef2004a08d9f45094
metrics:
- type: Accuracy
value: 41.0
- task:
type: Natural language inference
dataset:
type: super_glue
name: SuperGLUE (cb)
config: cb
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 80.36
- task:
type: Natural language inference
dataset:
type: super_glue
name: SuperGLUE (rte)
config: rte
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 84.12
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ar)
config: ar
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 53.25
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (bg)
config: bg
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 43.61
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (de)
config: de
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 46.83
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (el)
config: el
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 41.53
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (en)
config: en
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 59.68
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (es)
config: es
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 55.1
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (fr)
config: fr
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 55.26
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (hi)
config: hi
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 50.88
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ru)
config: ru
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 47.75
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (sw)
config: sw
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 46.63
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (th)
config: th
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 40.12
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (tr)
config: tr
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 37.55
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (ur)
config: ur
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 46.51
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (vi)
config: vi
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 52.93
- task:
type: Natural language inference
dataset:
type: xnli
name: XNLI (zh)
config: zh
split: validation
revision: a5a45e4ff92d5d3f34de70aaf4b72c3bdf9f7f16
metrics:
- type: Accuracy
value: 53.61
- task:
type: Program synthesis
dataset:
type: openai_humaneval
name: HumanEval
config: None
split: test
revision: e8dc562f5de170c54b5481011dd9f4fa04845771
metrics:
- type: Pass@1
value: 8.06
- type: Pass@10
value: 15.03
- type: Pass@100
value: 27.49
- task:
type: Sentence completion
dataset:
type: story_cloze
name: StoryCloze (2016)
config: "2016"
split: validation
revision: e724c6f8cdf7c7a2fb229d862226e15b023ee4db
metrics:
- type: Accuracy
value: 90.43
- task:
type: Sentence completion
dataset:
type: super_glue
name: SuperGLUE (copa)
config: copa
split: validation
revision: 9e12063561e7e6c79099feb6d5a493142584e9e2
metrics:
- type: Accuracy
value: 86.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (et)
config: et
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 50.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (ht)
config: ht
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 54.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (id)
config: id
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 76.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (it)
config: it
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 61.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (qu)
config: qu
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 60.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (sw)
config: sw
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 63.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (ta)
config: ta
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 64.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (th)
config: th
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 57.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (tr)
config: tr
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 53.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (vi)
config: vi
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 79.0
- task:
type: Sentence completion
dataset:
type: xcopa
name: XCOPA (zh)
config: zh
split: validation
revision: 37f73c60fb123111fa5af5f9b705d0b3747fd187
metrics:
- type: Accuracy
value: 81.0
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (ar)
config: ar
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 83.26
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (es)
config: es
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 88.95
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (eu)
config: eu
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 73.33
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (hi)
config: hi
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 80.61
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (id)
config: id
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 84.25
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (my)
config: my
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 52.55
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (ru)
config: ru
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 65.32
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (sw)
config: sw
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 71.67
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (te)
config: te
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 74.72
- task:
type: Sentence completion
dataset:
type: Muennighoff/xstory_cloze
name: XStoryCloze (zh)
config: zh
split: validation
revision: 8bb76e594b68147f1a430e86829d07189622b90d
metrics:
- type: Accuracy
value: 85.37
---

# Table of Contents
1. [Model Summary](#model-summary)
2. [Use](#use)
3. [Limitations](#limitations)
4. [Training](#training)
5. [Evaluation](#evaluation)
6. [Citation](#citation)
# Model Summary
> We present BLOOMZ & mT0, a family of models capable of following human instructions in dozens of languages zero-shot. We finetune BLOOM & mT5 pretrained multilingual language models on our crosslingual task mixture (xP3) and find the resulting models capable of crosslingual generalization to unseen tasks & languages.
- **Repository:** [bigscience-workshop/xmtf](https://github.com/bigscience-workshop/xmtf)
- **Paper:** [Crosslingual Generalization through Multitask Finetuning](https://arxiv.org/abs/2211.01786)
- **Point of Contact:** [Niklas Muennighoff](mailto:niklas@hf.co)
- **Languages:** Refer to [bloom](https://huggingface.co/bigscience/bloom) for pretraining & [xP3](https://huggingface.co/datasets/bigscience/xP3) for finetuning language proportions. The model understands both its pretraining & finetuning languages.
- **BLOOMZ & mT0 Model Family:**
<div class="max-w-full overflow-auto">
<table>
<tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/bigscience/xP3>xP3</a>. Recommended for prompting in English.</th>
</tr>
<tr>
<td>Parameters</td>
<td>300M</td>
<td>580M</td>
<td>1.2B</td>
<td>3.7B</td>
<td>13B</td>
<td>560M</td>
<td>1.1B</td>
<td>1.7B</td>
<td>3B</td>
<td>7.1B</td>
<td>176B</td>
</tr>
<tr>
<td>Finetuned Model</td>
<td><a href=https://huggingface.co/bigscience/mt0-small>mt0-small</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-base>mt0-base</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-large>mt0-large</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-xl>mt0-xl</a></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl>mt0-xxl</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-560m>bloomz-560m</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-1b1>bloomz-1b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-1b7>bloomz-1b7</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-3b>bloomz-3b</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1>bloomz-7b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a></td>
</tr>
<tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/bigscience/xP3mt>xP3mt</a>. Recommended for prompting in non-English.</th>
</tr>
<tr>
<td>Finetuned Model</td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl-mt>mt0-xxl-mt</a></td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1-mt>bloomz-7b1-mt</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-mt>bloomz-mt</a></td>
</tr>
<tr>
<th colspan="12">Multitask finetuned on <a style="font-weight:bold" href=https://huggingface.co/datasets/Muennighoff/P3>P3</a>. Released for research purposes only. Strictly inferior to above models!</th>
</tr>
<tr>
<td>Finetuned Model</td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/mt0-xxl-p3>mt0-xxl-p3</a></td>
<td></td>
<td></td>
<td></td>
<td></td>
<td><a href=https://huggingface.co/bigscience/bloomz-7b1-p3>bloomz-7b1-p3</a></td>
<td><a href=https://huggingface.co/bigscience/bloomz-p3>bloomz-p3</a></td>
</tr>
<tr>
<th colspan="12">Original pretrained checkpoints. Not recommended.</th>
</tr>
<tr>
<td>Pretrained Model</td>
<td><a href=https://huggingface.co/google/mt5-small>mt5-small</a></td>
<td><a href=https://huggingface.co/google/mt5-base>mt5-base</a></td>
<td><a href=https://huggingface.co/google/mt5-large>mt5-large</a></td>
<td><a href=https://huggingface.co/google/mt5-xl>mt5-xl</a></td>
<td><a href=https://huggingface.co/google/mt5-xxl>mt5-xxl</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-560m>bloom-560m</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-1b1>bloom-1b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-1b7>bloom-1b7</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-3b>bloom-3b</a></td>
<td><a href=https://huggingface.co/bigscience/bloom-7b1>bloom-7b1</a></td>
<td><a href=https://huggingface.co/bigscience/bloom>bloom</a></td>
</tr>
</table>
</div>
# Use
## Intended use
We recommend using the model to perform tasks expressed in natural language. For example, given the prompt "*Translate to English: Je t’aime.*", the model will most likely answer "*I love you.*". Some prompt ideas from our paper:
- 一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。你认为这句话的立场是赞扬、中立还是批评?
- Suggest at least five related search terms to "Mạng neural nhân tạo".
- Write a fairy tale about a troll saving a princess from a dangerous dragon. The fairy tale is a masterpiece that has achieved praise worldwide and its moral is "Heroes Come in All Shapes and Sizes". Story (in Spanish):
- Explain in a sentence in Telugu what is backpropagation in neural networks.
**Feel free to share your generations in the Community tab!**
## How to use
### CPU
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-7b1"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
### GPU
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers accelerate
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-7b1"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype="auto", device_map="auto")
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
### GPU in 8bit
<details>
<summary> Click to expand </summary>
```python
# pip install -q transformers accelerate bitsandbytes
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "bigscience/bloomz-7b1"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto", load_in_8bit=True)
inputs = tokenizer.encode("Translate to English: Je t’aime.", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
</details>
<!-- Necessary for whitespace -->
###
# Limitations
**Prompt Engineering:** Performance may vary depending on the prompt. For BLOOMZ models, we recommend making it very clear when the input stops, to avoid the model trying to continue it. For example, the prompt "*Translate to English: Je t'aime*" without the full stop (.) at the end may result in the model trying to continue the French sentence. Better prompts are e.g. "*Translate to English: Je t'aime.*", "*Translate to English: Je t'aime. Translation:*", or "*What is "Je t'aime." in English?*", where it is clear to the model when it should answer. Further, we recommend providing the model with as much context as possible. For example, if you want it to answer in Telugu, tell it so explicitly, e.g. "*Explain in a sentence in Telugu what is backpropagation in neural networks.*".
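As an illustration of the advice above, here is a small hypothetical helper (`build_prompt` is our name for it, not part of any library) that terminates the input clearly and optionally appends an answer cue:

```python
def build_prompt(instruction: str, text: str, answer_cue: str = "") -> str:
    """Make it unambiguous where the input ends and the answer should begin."""
    prompt = f"{instruction}: {text}"
    if not prompt.endswith((".", "!", "?")):
        prompt += "."  # close the input so the model does not try to continue it
    if answer_cue:
        prompt += f" {answer_cue}"
    return prompt

print(build_prompt("Translate to English", "Je t'aime"))
# Translate to English: Je t'aime.
print(build_prompt("Translate to English", "Je t'aime", answer_cue="Translation:"))
# Translate to English: Je t'aime. Translation:
```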
# Training
## Model
- **Architecture:** Same as [bloom-7b1](https://huggingface.co/bigscience/bloom-7b1), also refer to the `config.json` file
- **Finetuning steps:** 1000
- **Finetuning tokens:** 4.19 billion
- **Finetuning layout:** 1x pipeline parallel, 1x tensor parallel, 64x data parallel
- **Precision:** float16
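The figures above imply roughly 4.19M tokens per optimizer step; a quick arithmetic sketch (the per-replica split is an inference from the 64-way data parallelism, not stated explicitly in this card):

```python
finetuning_tokens = 4.19e9   # total finetuning tokens (from the card)
steps = 1000                 # finetuning steps (from the card)
data_parallel = 64           # data-parallel degree (from the card)

tokens_per_step = finetuning_tokens / steps           # ~4.19M tokens per step
tokens_per_replica = tokens_per_step / data_parallel  # ~65.5k tokens per replica per step
print(f"{tokens_per_step:.3g} tokens/step, {tokens_per_replica:.3g} tokens/replica/step")
```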
## Hardware
- **CPUs:** AMD CPUs with 512GB memory per node
- **GPUs:** 64 A100 80GB GPUs with 8 GPUs per node (8 nodes) using NVLink 4 inter-gpu connects, 4 OmniPath links
- **Communication:** NCCL-communications network with a fully dedicated subnet
## Software
- **Orchestration:** [Megatron-DeepSpeed](https://github.com/bigscience-workshop/Megatron-DeepSpeed)
- **Optimizer & parallelism:** [DeepSpeed](https://github.com/microsoft/DeepSpeed)
- **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch) (pytorch-1.11 w/ CUDA-11.5)
- **FP16 if applicable:** [apex](https://github.com/NVIDIA/apex)
# Evaluation
We refer to Table 7 from our [paper](https://arxiv.org/abs/2211.01786) & [bigscience/evaluation-results](https://huggingface.co/datasets/bigscience/evaluation-results) for zero-shot results on unseen tasks. The sidebar reports zero-shot performance of the best prompt per dataset config.
# Citation
```bibtex
@article{muennighoff2022crosslingual,
title={Crosslingual generalization through multitask finetuning},
author={Muennighoff, Niklas and Wang, Thomas and Sutawika, Lintang and Roberts, Adam and Biderman, Stella and Scao, Teven Le and Bari, M Saiful and Shen, Sheng and Yong, Zheng-Xin and Schoelkopf, Hailey and others},
journal={arXiv preprint arXiv:2211.01786},
year={2022}
}
``` | 24,195 | [
[
… 768-dimensional embedding vector omitted …
]
] |
demdecuong/vihealthbert-base-word | 2022-04-20T07:55:52.000Z | [
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | feature-extraction | demdecuong | null | null | demdecuong/vihealthbert-base-word | 2 | 75,837 | transformers | 2022-04-20T07:49:34 | # <a name="introduction"></a> ViHealthBERT: Pre-trained Language Models for Vietnamese in Health Text Mining
ViHealthBERT is a strong baseline language model for Vietnamese in the healthcare domain.
We empirically investigate our model with different training strategies, achieving state-of-the-art (SOTA) performance on three downstream tasks: NER (COVID-19 & ViMQ), Acronym Disambiguation, and Summarization.
We introduce two Vietnamese datasets: the acronym dataset (acrDrAid) and the FAQ summarization dataset in the healthcare domain. Our acrDrAid dataset is annotated with 135 sets of keywords.
The general approach and experimental results of ViHealthBERT can be found in our LREC-2022 poster [paper]() (to be updated soon):
@article{vihealthbert,
title = {{ViHealthBERT: Pre-trained Language Models for Vietnamese in Health Text Mining}},
author = {Minh Phuc Nguyen, Vu Hoang Tran, Vu Hoang, Ta Duc Huy, Trung H. Bui, Steven Q. H. Truong },
journal = {13th Edition of its Language Resources and Evaluation Conference},
year = {2022}
}
### Installation <a name="install2"></a>
- Python 3.6+ and PyTorch >= 1.6
- Install `transformers`:
`pip install transformers==4.2.0`
### Pre-trained models <a name="models2"></a>
Model | #params | Arch. | Tokenizer
---|---|---|---
`demdecuong/vihealthbert-base-word` | 135M | base | Word-level
`demdecuong/vihealthbert-base-syllable` | 135M | base | Syllable-level
### Example usage <a name="usage1"></a>
```python
import torch
from transformers import AutoModel, AutoTokenizer
vihealthbert = AutoModel.from_pretrained("demdecuong/vihealthbert-base-word")
tokenizer = AutoTokenizer.from_pretrained("demdecuong/vihealthbert-base-word")
# INPUT TEXT MUST BE ALREADY WORD-SEGMENTED!
line = "Tôi là sinh_viên trường đại_học Công_nghệ ."
input_ids = torch.tensor([tokenizer.encode(line)])
with torch.no_grad():
    features = vihealthbert(input_ids)  # Model outputs are now tuples
```
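To turn the returned hidden states into a single sentence vector, one common option is mask-aware mean pooling. A minimal sketch on a stand-in tensor (the shapes, not the actual checkpoint outputs, are what matters here; with the real model you would pool `features[0]` with the tokenizer's attention mask):

```python
import torch

# Stand-ins for last_hidden_state (features[0]) and the attention mask.
last_hidden_state = torch.randn(1, 9, 768)  # (batch, seq_len, hidden)
attention_mask = torch.ones(1, 9)           # 1 for real tokens, 0 for padding

mask = attention_mask.unsqueeze(-1)         # (batch, seq_len, 1), broadcastable
sentence_vec = (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_vec.shape)  # torch.Size([1, 768])
```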
### Example usage for raw text <a name="usage2"></a>
Since ViHealthBERT used the [RDRSegmenter](https://github.com/datquocnguyen/RDRsegmenter) from [VnCoreNLP](https://github.com/vncorenlp/VnCoreNLP) to pre-process its pre-training data, we highly recommend using the same word segmenter for ViHealthBERT downstream applications.
#### Installation
```shell
# Install the vncorenlp python wrapper
pip3 install vncorenlp
# Download VnCoreNLP-1.1.1.jar & its word segmentation component (i.e. RDRSegmenter)
mkdir -p vncorenlp/models/wordsegmenter
wget https://raw.githubusercontent.com/vncorenlp/VnCoreNLP/master/VnCoreNLP-1.1.1.jar
wget https://raw.githubusercontent.com/vncorenlp/VnCoreNLP/master/models/wordsegmenter/vi-vocab
wget https://raw.githubusercontent.com/vncorenlp/VnCoreNLP/master/models/wordsegmenter/wordsegmenter.rdr
mv VnCoreNLP-1.1.1.jar vncorenlp/
mv vi-vocab vncorenlp/models/wordsegmenter/
mv wordsegmenter.rdr vncorenlp/models/wordsegmenter/
```
`VnCoreNLP-1.1.1.jar` (27MB) and folder `models/` must be placed in the same working folder.
#### Example usage
```python
# See more details at: https://github.com/vncorenlp/VnCoreNLP
# Load rdrsegmenter from VnCoreNLP
from vncorenlp import VnCoreNLP
rdrsegmenter = VnCoreNLP("/Absolute-path-to/vncorenlp/VnCoreNLP-1.1.1.jar", annotators="wseg", max_heap_size='-Xmx500m')
# Input
text = "Ông Nguyễn Khắc Chúc đang làm việc tại Đại học Quốc gia Hà Nội. Bà Lan, vợ ông Chúc, cũng làm việc tại đây."
# To perform word (and sentence) segmentation
sentences = rdrsegmenter.tokenize(text)
for sentence in sentences:
print(" ".join(sentence))
``` | 3,596 | [
[
… embedding vector omitted …
-0.00530242919921875,
0.0160675048828125,
0.0263519287109375,
-0.0094757080078125,
-0.008331298828125,
0.044189453125,
0.0240478515625,
-0.037261962890625,
0.06365966796875,
0.0182342529296875,
0.07275390625,
-0.04705810546875,
0.006221771240234375,
0.01015472412109375,
0.020477294921875,
-0.0091400146484375,
-0.0167999267578125,
0.00403594970703125,
-0.0056304931640625,
0.0013866424560546875,
0.007099151611328125,
0.050048828125,
-0.0423583984375,
-0.046966552734375,
0.0230865478515625,
0.045684814453125,
0.0194549560546875,
-0.007228851318359375,
-0.06292724609375,
0.011993408203125,
-0.0015087127685546875,
-0.0299530029296875,
-0.006870269775390625,
0.0457763671875,
0.001125335693359375,
0.02606201171875,
0.028656005859375,
0.0290985107421875,
0.0004911422729492188,
-0.004917144775390625,
0.06591796875,
-0.046539306640625,
-0.045257568359375,
-0.08306884765625,
0.037109375,
-0.005245208740234375,
-0.059722900390625,
0.06890869140625,
0.04400634765625,
0.0625,
-0.004421234130859375,
0.07049560546875,
-0.00463104248046875,
0.0251617431640625,
-0.039276123046875,
0.07666015625,
-0.0550537109375,
-0.015869140625,
-0.0283050537109375,
-0.039520263671875,
-0.00911712646484375,
0.06298828125,
-0.03369140625,
-0.0056304931640625,
0.06158447265625,
0.056793212890625,
0.0034809112548828125,
-0.0302581787109375,
0.020965576171875,
0.024322509765625,
0.02618408203125,
0.0246429443359375,
0.0258331298828125,
-0.0438232421875,
0.0228424072265625,
-0.0126495361328125,
-0.00910186767578125,
-0.029510498046875,
-0.03753662109375,
-0.07196044921875,
-0.06231689453125,
-0.030731201171875,
-0.04132080078125,
0.0194549560546875,
0.0831298828125,
0.0521240234375,
-0.050079345703125,
-0.0161590576171875,
-0.00725555419921875,
0.006561279296875,
-0.014404296875,
-0.016357421875,
0.04559326171875,
-0.01160430908203125,
-0.046173095703125,
0.0175628662109375,
0.017669677734375,
0.005535125732421875,
-0.01383209228515625,
0.000011205673217773438,
-0.030120849609375,
0.0027332305908203125,
0.046600341796875,
0.029510498046875,
-0.048187255859375,
-0.0048828125,
-0.0032482147216796875,
-0.02886962890625,
0.00543212890625,
0.04449462890625,
-0.062744140625,
0.04412841796875,
0.0487060546875,
0.0250244140625,
0.036407470703125,
-0.01195526123046875,
0.035125732421875,
-0.04278564453125,
0.0242767333984375,
0.0185546875,
0.036956787109375,
0.035369873046875,
-0.01239776611328125,
0.02130126953125,
0.02655029296875,
-0.037139892578125,
-0.050201416015625,
-0.0034351348876953125,
-0.059478759765625,
-0.026031494140625,
0.097412109375,
-0.003955841064453125,
-0.04437255859375,
-0.01038360595703125,
-0.03240966796875,
0.0474853515625,
-0.00977325439453125,
0.044281005859375,
0.043609619140625,
0.004638671875,
0.0031452178955078125,
-0.0303192138671875,
0.05242919921875,
0.0209808349609375,
-0.055145263671875,
-0.0025157928466796875,
0.01242828369140625,
0.0311279296875,
0.00911712646484375,
0.0697021484375,
0.0074920654296875,
0.01139068603515625,
-0.0126953125,
0.03302001953125,
0.00019800662994384766,
-0.0012407302856445312,
-0.0345458984375,
-0.005077362060546875,
-0.0028438568115234375,
-0.045257568359375
]
] |
facebook/dinov2-base | 2023-09-06T11:22:58.000Z | [
"transformers",
"pytorch",
"safetensors",
"dinov2",
"feature-extraction",
"dino",
"vision",
"arxiv:2304.07193",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | feature-extraction | facebook | null | null | facebook/dinov2-base | 15 | 75,535 | transformers | 2023-07-17T16:44:29 | ---
license: apache-2.0
tags:
- dino
- vision
---
# Vision Transformer (base-sized model) trained using DINOv2
Vision Transformer (ViT) model trained using the DINOv2 method. It was introduced in the paper [DINOv2: Learning Robust Visual Features without Supervision](https://arxiv.org/abs/2304.07193) by Oquab et al. and first released in [this repository](https://github.com/facebookresearch/dinov2).
Disclaimer: The team releasing DINOv2 did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion.
Images are presented to the model as a sequence of fixed-size patches, which are linearly embedded. A [CLS] token is added to the beginning of the sequence for use in classification tasks, and absolute position embeddings are added before the sequence is fed to the layers of the Transformer encoder.
Note that this model does not include any fine-tuned heads.
By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.
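As a hedged illustration (not part of the original card), the linear-probe setup described above can be sketched with plain PyTorch. The hidden size of 768 matches dinov2-base; the feature tensor and the class count here are stand-ins for the model's real `last_hidden_state` and a hypothetical downstream dataset:

```python
import torch
import torch.nn as nn

# dinov2-base produces hidden states of size 768; the sequence starts
# with the [CLS] token, followed by one embedding per image patch.
hidden_size = 768
num_classes = 10  # hypothetical downstream label count

# Stand-in for outputs.last_hidden_state: (batch, 1 + num_patches, hidden)
last_hidden_state = torch.randn(2, 257, hidden_size)

# Take the [CLS] token (position 0) as a whole-image representation
# and place a single linear layer on top, as the card describes.
cls_features = last_hidden_state[:, 0]       # shape: (batch, hidden_size)
classifier = nn.Linear(hidden_size, num_classes)
logits = classifier(cls_features)            # shape: (batch, num_classes)
```

In practice the pre-trained encoder is usually frozen while only this linear head is trained on the labeled data.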
## Intended uses & limitations
You can use the raw model for feature extraction. See the [model hub](https://huggingface.co/models?search=facebook/dinov2) to look for
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model:
```python
from transformers import AutoImageProcessor, AutoModel
from PIL import Image
import requests

# Load an example image from the COCO validation set
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

processor = AutoImageProcessor.from_pretrained('facebook/dinov2-base')
model = AutoModel.from_pretrained('facebook/dinov2-base')

# Preprocess the image and extract patch-level features
inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```
### BibTeX entry and citation info
```bibtex
@misc{oquab2023dinov2,
title={DINOv2: Learning Robust Visual Features without Supervision},
author={Maxime Oquab and Timothée Darcet and Théo Moutakanni and Huy Vo and Marc Szafraniec and Vasil Khalidov and Pierre Fernandez and Daniel Haziza and Francisco Massa and Alaaeldin El-Nouby and Mahmoud Assran and Nicolas Ballas and Wojciech Galuba and Russell Howes and Po-Yao Huang and Shang-Wen Li and Ishan Misra and Michael Rabbat and Vasu Sharma and Gabriel Synnaeve and Hu Xu and Hervé Jegou and Julien Mairal and Patrick Labatut and Armand Joulin and Piotr Bojanowski},
year={2023},
eprint={2304.07193},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` | 3,027 | [
[
-0.03668212890625,
-0.0309906005859375,
0.00659942626953125,
-0.00917816162109375,
-0.0367431640625,
-0.0032749176025390625,
0.008544921875,
-0.0308990478515625,
0.019622802734375,
0.038177490234375,
-0.03515625,
-0.01788330078125,
-0.0513916015625,
-0.01397705078125,
-0.033905029296875,
0.06689453125,
-0.0005817413330078125,
-0.00684356689453125,
-0.0210418701171875,
-0.0021305084228515625,
-0.0177764892578125,
-0.0352783203125,
-0.036865234375,
-0.0288543701171875,
0.0249176025390625,
0.005764007568359375,
0.0537109375,
0.07635498046875,
0.0338134765625,
0.03045654296875,
-0.00962066650390625,
0.0011768341064453125,
-0.0413818359375,
-0.01506805419921875,
-0.020355224609375,
-0.038421630859375,
-0.02392578125,
0.009063720703125,
0.03985595703125,
0.0289764404296875,
0.0193939208984375,
0.0244598388671875,
0.00897216796875,
0.010223388671875,
-0.04388427734375,
0.035888671875,
-0.034820556640625,
0.0264739990234375,
-0.004604339599609375,
-0.0013055801391601562,
-0.0215606689453125,
-0.029205322265625,
0.0196990966796875,
-0.03546142578125,
0.0125579833984375,
-0.006732940673828125,
0.10009765625,
0.022979736328125,
-0.03704833984375,
-0.0036258697509765625,
-0.044830322265625,
0.06011962890625,
-0.0184326171875,
0.0275421142578125,
0.01345062255859375,
0.026214599609375,
0.0052947998046875,
-0.08441162109375,
-0.04791259765625,
0.00316619873046875,
-0.01314544677734375,
0.0004177093505859375,
-0.0186767578125,
-0.002063751220703125,
0.02496337890625,
0.027984619140625,
-0.0117950439453125,
0.01023101806640625,
-0.037994384765625,
-0.036376953125,
0.027618408203125,
-0.009124755859375,
0.01264190673828125,
-0.0305023193359375,
-0.050811767578125,
-0.0330810546875,
-0.02789306640625,
0.033447265625,
0.012847900390625,
0.0073394775390625,
-0.0125732421875,
0.04644775390625,
0.00635528564453125,
0.04034423828125,
0.024139404296875,
-0.01129150390625,
0.043365478515625,
-0.0213623046875,
-0.020172119140625,
-0.0165557861328125,
0.06011962890625,
0.0205535888671875,
0.02532958984375,
0.0011358261108398438,
-0.0254974365234375,
0.007167816162109375,
0.02105712890625,
-0.0697021484375,
-0.022369384765625,
-0.009674072265625,
-0.0455322265625,
-0.0408935546875,
0.0208892822265625,
-0.05316162109375,
-0.01183319091796875,
-0.0208740234375,
0.05255126953125,
-0.021270751953125,
-0.02789306640625,
-0.033966064453125,
-0.0035724639892578125,
0.0550537109375,
0.007183074951171875,
-0.07135009765625,
0.0264892578125,
0.038299560546875,
0.06689453125,
-0.006011962890625,
-0.01318359375,
-0.023162841796875,
-0.0108642578125,
-0.03778076171875,
0.05224609375,
-0.0249176025390625,
-0.01727294921875,
0.014312744140625,
0.038543701171875,
0.001964569091796875,
-0.035980224609375,
0.0303192138671875,
-0.026214599609375,
0.0158538818359375,
-0.0278472900390625,
-0.0208587646484375,
-0.0243072509765625,
0.010589599609375,
-0.049407958984375,
0.0870361328125,
0.0289154052734375,
-0.057342529296875,
0.0433349609375,
-0.036376953125,
-0.0165557861328125,
0.002506256103515625,
-0.0108795166015625,
-0.05169677734375,
-0.007114410400390625,
0.034149169921875,
0.037567138671875,
0.01067352294921875,
-0.0127716064453125,
-0.027435302734375,
-0.036041259765625,
0.0209808349609375,
-0.007007598876953125,
0.06549072265625,
0.01293182373046875,
-0.0250244140625,
0.01187896728515625,
-0.048980712890625,
-0.0007805824279785156,
0.0175628662109375,
-0.025421142578125,
-0.00640869140625,
-0.01438140869140625,
0.0130157470703125,
0.0250091552734375,
0.027008056640625,
-0.048583984375,
0.01432037353515625,
-0.026824951171875,
0.04522705078125,
0.061370849609375,
-0.004222869873046875,
0.04351806640625,
-0.009918212890625,
0.027587890625,
0.0093536376953125,
0.038543701171875,
-0.031341552734375,
-0.0428466796875,
-0.056915283203125,
-0.022125244140625,
0.0241546630859375,
0.03802490234375,
-0.06671142578125,
0.041961669921875,
-0.01351165771484375,
-0.0214385986328125,
-0.034454345703125,
0.0164642333984375,
0.034576416015625,
0.045562744140625,
0.0255126953125,
-0.043243408203125,
-0.04034423828125,
-0.0667724609375,
0.0167388916015625,
-0.0020351409912109375,
0.0025424957275390625,
0.0215606689453125,
0.05072021484375,
-0.021881103515625,
0.07635498046875,
-0.01117706298828125,
-0.0161590576171875,
-0.00641632080078125,
0.00018584728240966797,
0.01468658447265625,
0.052978515625,
0.05743408203125,
-0.06890869140625,
-0.0215911865234375,
-0.005252838134765625,
-0.06573486328125,
0.01378631591796875,
0.00527191162109375,
-0.01485443115234375,
0.0015230178833007812,
0.0226593017578125,
-0.057861328125,
0.056488037109375,
0.01236724853515625,
-0.01337432861328125,
0.013458251953125,
-0.004703521728515625,
-0.001659393310546875,
-0.0865478515625,
-0.0002028942108154297,
-0.003513336181640625,
-0.03375244140625,
-0.039306640625,
0.01236724853515625,
0.01323699951171875,
-0.01219940185546875,
-0.036468505859375,
0.0283660888671875,
-0.037689208984375,
-0.032562255859375,
-0.0192718505859375,
-0.0162811279296875,
0.00020003318786621094,
0.040008544921875,
-0.0015964508056640625,
0.030517578125,
0.0643310546875,
-0.0300445556640625,
0.05401611328125,
0.032928466796875,
-0.0301666259765625,
0.032501220703125,
-0.04986572265625,
0.028106689453125,
-0.0129241943359375,
0.00872802734375,
-0.07281494140625,
-0.031982421875,
0.030853271484375,
-0.036834716796875,
0.0433349609375,
-0.0274505615234375,
-0.034515380859375,
-0.0625,
-0.0218963623046875,
0.0229339599609375,
0.061248779296875,
-0.058563232421875,
0.042205810546875,
0.026580810546875,
0.018280029296875,
-0.06396484375,
-0.0738525390625,
-0.00965118408203125,
-0.00860595703125,
-0.03253173828125,
0.026824951171875,
0.022125244140625,
0.021270751953125,
0.0279388427734375,
-0.0091705322265625,
-0.0196533203125,
-0.016510009765625,
0.044281005859375,
0.0213470458984375,
-0.02691650390625,
-0.0013332366943359375,
-0.00911712646484375,
-0.01232147216796875,
0.0035381317138671875,
-0.0352783203125,
0.042144775390625,
-0.0191650390625,
-0.0248260498046875,
-0.0574951171875,
0.005741119384765625,
0.04541015625,
-0.024383544921875,
0.039215087890625,
0.07177734375,
-0.05206298828125,
-0.0098876953125,
-0.0254669189453125,
-0.01245880126953125,
-0.040008544921875,
0.031707763671875,
-0.0285491943359375,
-0.045745849609375,
0.06103515625,
-0.004711151123046875,
-0.01910400390625,
0.033721923828125,
0.03973388671875,
-0.01235198974609375,
0.06463623046875,
0.06756591796875,
0.000858306884765625,
0.05615234375,
-0.056976318359375,
0.005641937255859375,
-0.052734375,
-0.05169677734375,
-0.00225830078125,
-0.027679443359375,
-0.02984619140625,
-0.03424072265625,
0.005695343017578125,
0.0281219482421875,
-0.015625,
0.048583984375,
-0.049591064453125,
0.0304718017578125,
0.060150146484375,
0.03955078125,
-0.0252838134765625,
0.01111602783203125,
-0.018798828125,
0.001068115234375,
-0.0440673828125,
-0.0100250244140625,
0.07659912109375,
0.041839599609375,
0.060150146484375,
-0.01445770263671875,
0.0469970703125,
0.0100250244140625,
0.0007519721984863281,
-0.070556640625,
0.038726806640625,
-0.007617950439453125,
-0.0394287109375,
-0.012847900390625,
-0.01198577880859375,
-0.06671142578125,
-0.0035381317138671875,
-0.03326416015625,
-0.058380126953125,
0.049530029296875,
0.0210723876953125,
-0.03204345703125,
0.0243682861328125,
-0.045013427734375,
0.07281494140625,
-0.015838623046875,
-0.0210418701171875,
0.0091705322265625,
-0.045745849609375,
0.013427734375,
-0.00823211669921875,
-0.01432037353515625,
0.021148681640625,
0.0162506103515625,
0.047882080078125,
-0.0445556640625,
0.07684326171875,
-0.031646728515625,
0.025848388671875,
0.042266845703125,
-0.01160430908203125,
0.0295257568359375,
-0.007091522216796875,
0.03094482421875,
0.0149078369140625,
-0.0032978057861328125,
-0.037750244140625,
-0.04266357421875,
0.033843994140625,
-0.07684326171875,
-0.0285797119140625,
-0.025787353515625,
-0.0215606689453125,
0.0214385986328125,
0.0295257568359375,
0.048980712890625,
0.047332763671875,
0.011566162109375,
0.031951904296875,
0.0465087890625,
-0.0244598388671875,
0.0443115234375,
-0.0168609619140625,
-0.02484130859375,
-0.028167724609375,
0.060333251953125,
0.0234222412109375,
0.0108795166015625,
0.0207366943359375,
0.0102386474609375,
-0.0283050537109375,
-0.029266357421875,
-0.02655029296875,
0.00438690185546875,
-0.07513427734375,
-0.0210418701171875,
-0.035186767578125,
-0.048248291015625,
-0.0406494140625,
-0.01207733154296875,
-0.041259765625,
-0.0293426513671875,
-0.037994384765625,
-0.0212249755859375,
0.0215606689453125,
0.0623779296875,
-0.025543212890625,
0.041534423828125,
-0.0288238525390625,
0.0218048095703125,
0.061004638671875,
0.0138092041015625,
-0.00965118408203125,
-0.046356201171875,
-0.0187835693359375,
-0.0013523101806640625,
-0.012939453125,
-0.04852294921875,
0.03460693359375,
0.024749755859375,
0.0640869140625,
0.05963134765625,
-0.0284576416015625,
0.058502197265625,
-0.0214385986328125,
0.0550537109375,
0.02569580078125,
-0.06414794921875,
0.048583984375,
-0.01084136962890625,
0.01078033447265625,
0.01233673095703125,
0.035980224609375,
0.0020961761474609375,
0.0157928466796875,
-0.036041259765625,
-0.0477294921875,
0.0550537109375,
0.01123046875,
0.024566650390625,
0.007648468017578125,
0.050018310546875,
-0.00543975830078125,
0.005565643310546875,
-0.06964111328125,
-0.01378631591796875,
-0.07403564453125,
-0.0097198486328125,
0.0171966552734375,
-0.0278472900390625,
-0.005161285400390625,
-0.039825439453125,
0.0131988525390625,
-0.007904052734375,
0.0546875,
0.0154876708984375,
-0.0206298828125,
-0.0164031982421875,
-0.032989501953125,
0.0127716064453125,
0.041595458984375,
-0.0301055908203125,
0.0136871337890625,
0.006229400634765625,
-0.042724609375,
-0.007244110107421875,
0.008941650390625,
-0.0154266357421875,
-0.005687713623046875,
0.0382080078125,
0.0703125,
0.01332855224609375,
-0.0015764236450195312,
0.07122802734375,
0.01470947265625,
-0.01654052734375,
-0.035919189453125,
0.010528564453125,
-0.0117340087890625,
0.04052734375,
0.0299530029296875,
0.027679443359375,
-0.004451751708984375,
-0.05224609375,
0.040740966796875,
0.0229949951171875,
-0.049285888671875,
-0.04095458984375,
0.062255859375,
-0.00562286376953125,
-0.015228271484375,
0.04742431640625,
-0.01300811767578125,
-0.050567626953125,
0.062744140625,
0.047027587890625,
0.05084228515625,
-0.0279388427734375,
0.018341064453125,
0.038177490234375,
0.0229644775390625,
-0.0035953521728515625,
0.018890380859375,
-0.01396942138671875,
-0.0684814453125,
-0.03167724609375,
-0.04840087890625,
-0.00516510009765625,
0.0130615234375,
-0.06219482421875,
0.0278472900390625,
-0.0526123046875,
-0.0288543701171875,
0.01611328125,
-0.0156402587890625,
-0.08001708984375,
0.0194854736328125,
0.038665771484375,
0.049530029296875,
-0.06219482421875,
0.082763671875,
0.052215576171875,
-0.0435791015625,
-0.054107666015625,
-0.020172119140625,
0.0002837181091308594,
-0.078857421875,
0.06390380859375,
0.027191162109375,
0.005092620849609375,
0.00359344482421875,
-0.06756591796875,
-0.07989501953125,
0.0894775390625,
0.0234375,
-0.01445770263671875,
-0.007659912109375,
0.007335662841796875,
0.03106689453125,
-0.0440673828125,
0.022216796875,
0.0018625259399414062,
0.0066375732421875,
0.03717041015625,
-0.0546875,
-0.0015230178833007812,
-0.02532958984375,
0.0232391357421875,
-0.01378631591796875,
-0.053436279296875,
0.08746337890625,
-0.01593017578125,
-0.01568603515625,
0.0079193115234375,
0.04840087890625,
-0.0211944580078125,
-0.0030574798583984375,
0.047027587890625,
0.045166015625,
0.042633056640625,
-0.0160675048828125,
0.06976318359375,
-0.0040283203125,
0.047576904296875,
0.056243896484375,
0.01078033447265625,
0.05047607421875,
0.018768310546875,
-0.00449371337890625,
0.048431396484375,
0.0655517578125,
-0.0416259765625,
0.07037353515625,
-0.003459930419921875,
0.012481689453125,
-0.020233154296875,
0.004894256591796875,
-0.026275634765625,
0.050933837890625,
0.03240966796875,
-0.04864501953125,
-0.006866455078125,
0.0205230712890625,
-0.01154327392578125,
-0.0232696533203125,
-0.031951904296875,
0.046661376953125,
0.00843048095703125,
-0.0267333984375,
0.0499267578125,
-0.020721435546875,
0.03692626953125,
-0.03131103515625,
-0.0131072998046875,
-0.01273345947265625,
0.0212249755859375,
-0.02557373046875,
-0.06231689453125,
0.01268768310546875,
-0.0090179443359375,
-0.005290985107421875,
-0.005176544189453125,
0.06707763671875,
-0.0174560546875,
-0.04522705078125,
0.0282135009765625,
0.01279449462890625,
0.01739501953125,
0.0155029296875,
-0.0609130859375,
-0.0181732177734375,
-0.005489349365234375,
-0.034820556640625,
0.011627197265625,
0.0296173095703125,
-0.0017547607421875,
0.046630859375,
0.05328369140625,
-0.01061248779296875,
0.030914306640625,
0.0017786026000976562,
0.0887451171875,
-0.039337158203125,
-0.034393310546875,
-0.0521240234375,
0.046142578125,
-0.021240234375,
-0.0213165283203125,
0.043792724609375,
0.025054931640625,
0.07171630859375,
-0.0059051513671875,
0.03802490234375,
-0.0120086669921875,
0.01529693603515625,
-0.0247344970703125,
0.049285888671875,
-0.02630615234375,
-0.0139923095703125,
-0.013275146484375,
-0.08184814453125,
-0.0196990966796875,
0.06842041015625,
-0.0006856918334960938,
0.0007777214050292969,
0.035186767578125,
0.058746337890625,
-0.024261474609375,
-0.0243072509765625,
0.020172119140625,
0.029693603515625,
0.005443572998046875,
0.0274658203125,
0.0616455078125,
-0.03961181640625,
0.0450439453125,
-0.047393798828125,
-0.0280609130859375,
-0.00803375244140625,
-0.049835205078125,
-0.09613037109375,
-0.04412841796875,
-0.0222625732421875,
-0.03594970703125,
-0.00185394287109375,
0.05450439453125,
0.08599853515625,
-0.07415771484375,
0.01264190673828125,
-0.0004527568817138672,
-0.00475311279296875,
-0.0172576904296875,
-0.01219940185546875,
0.043060302734375,
-0.005001068115234375,
-0.0511474609375,
0.005611419677734375,
0.007251739501953125,
0.018035888671875,
-0.025421142578125,
0.005199432373046875,
-0.00170135498046875,
-0.0096893310546875,
0.039215087890625,
0.0270843505859375,
-0.05615234375,
-0.05029296875,
-0.0027904510498046875,
0.00101470947265625,
0.02655029296875,
0.034423828125,
-0.07177734375,
0.049530029296875,
0.033447265625,
0.03509521484375,
0.06732177734375,
0.004238128662109375,
0.020538330078125,
-0.057952880859375,
0.031158447265625,
-0.0017290115356445312,
0.04339599609375,
0.0281524658203125,
-0.02874755859375,
0.0297088623046875,
0.038665771484375,
-0.032318115234375,
-0.057830810546875,
0.01520538330078125,
-0.0894775390625,
-0.0093231201171875,
0.06658935546875,
-0.038787841796875,
-0.041778564453125,
0.007343292236328125,
-0.00237274169921875,
0.04290771484375,
-0.0032978057861328125,
0.04095458984375,
0.02349853515625,
0.0041046142578125,
-0.047027587890625,
-0.0290985107421875,
0.035247802734375,
-0.0162506103515625,
-0.0287322998046875,
-0.045745849609375,
-0.0006785392761230469,
0.030731201171875,
0.0292510986328125,
0.0136260986328125,
-0.0294647216796875,
0.01336669921875,
0.02606201171875,
0.0196685791015625,
-0.0194854736328125,
-0.0247039794921875,
-0.0238494873046875,
0.006866455078125,
-0.022857666015625,
-0.054229736328125
]
] |
google/mt5-small | 2023-09-18T09:35:27.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"onnx",
"mt5",
"text2text-generation",
"multilingual",
"af",
"am",
"ar",
"az",
"be",
"bg",
"bn",
"ca",
"ceb",
"co",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fil",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"haw",
"hi",
"hmn",
"ht",
"hu",
"hy",
"ig",
"is",
"it",
"iw",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lb",
"lo",
"lt",
"lv",
"mg",
"mi",
"mk",
"ml",
"mn",
"mr",
"ms",
"mt",
"my",
"ne",
"nl",
"no",
"ny",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"sm",
"sn",
"so",
"sq",
"sr",
"st",
"su",
"sv",
"sw",
"ta",
"te",
"tg",
"th",
"tr",
"uk",
"und",
"ur",
"uz",
"vi",
"xh",
"yi",
"yo",
"zh",
"zu",
"dataset:mc4",
"arxiv:2010.11934",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/mt5-small | 64 | 75,406 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- af
- am
- ar
- az
- be
- bg
- bn
- ca
- ceb
- co
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fil
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- haw
- hi
- hmn
- ht
- hu
- hy
- ig
- is
- it
- iw
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lb
- lo
- lt
- lv
- mg
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- no
- ny
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- st
- su
- sv
- sw
- ta
- te
- tg
- th
- tr
- uk
- und
- ur
- uz
- vi
- xh
- yi
- yo
- zh
- zu
datasets:
- mc4
license: apache-2.0
---
[Google's mT5](https://github.com/google-research/multilingual-t5)
mT5 is pretrained on the [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) corpus, covering 101 languages:
Afrikaans, Albanian, Amharic, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Burmese, Catalan, Cebuano, Chichewa, Chinese, Corsican, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Haitian Creole, Hausa, Hawaiian, Hebrew, Hindi, Hmong, Hungarian, Icelandic, Igbo, Indonesian, Irish, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish, Kyrgyz, Lao, Latin, Latvian, Lithuanian, Luxembourgish, Macedonian, Malagasy, Malay, Malayalam, Maltese, Maori, Marathi, Mongolian, Nepali, Norwegian, Pashto, Persian, Polish, Portuguese, Punjabi, Romanian, Russian, Samoan, Scottish Gaelic, Serbian, Shona, Sindhi, Sinhala, Slovak, Slovenian, Somali, Sotho, Spanish, Sundanese, Swahili, Swedish, Tajik, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Uzbek, Vietnamese, Welsh, West Frisian, Xhosa, Yiddish, Yoruba, Zulu.
**Note**: mT5 was only pre-trained on mC4, with no supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
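To make the self-supervised pretraining concrete, here is a small, hypothetical pure-Python illustration (not from the original card) of the T5-style span-corruption format that mT5's pretraining follows. The sentinel names mirror the `<extra_id_N>` convention; the spans are fixed here for clarity, whereas real pretraining samples them randomly:

```python
def corrupt_spans(tokens, spans):
    """Replace each (start, end) span with a sentinel token and collect
    the removed spans as the target sequence, T5-style."""
    inp, target = [], []
    cursor = 0
    for i, (start, end) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        inp.extend(tokens[cursor:start])   # keep text before the span
        inp.append(sentinel)               # mask the span in the input
        target.append(sentinel)            # the target restates the span
        target.extend(tokens[start:end])
        cursor = end
    inp.extend(tokens[cursor:])
    target.append(f"<extra_id_{len(spans)}>")  # closing sentinel
    return inp, target

tokens = "The quick brown fox jumps over the lazy dog".split()
inp, target = corrupt_spans(tokens, [(1, 3), (6, 7)])
print(" ".join(inp))     # The <extra_id_0> fox jumps over <extra_id_1> lazy dog
print(" ".join(target))  # <extra_id_0> quick brown <extra_id_1> the <extra_id_2>
```

The model is trained to map the corrupted input to the target; fine-tuning then reuses the same text-to-text interface for downstream tasks.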
Pretraining Dataset: [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual)
Other Community Checkpoints: [here](https://huggingface.co/models?search=mt5)
Paper: [mT5: A massively multilingual pre-trained text-to-text transformer](https://arxiv.org/abs/2010.11934)
Authors: *Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel*
## Abstract
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this paper, we introduce mT5, a multilingual variant of T5 that was pre-trained on a new Common Crawl-based dataset covering 101 languages. We describe the design and modified training of mT5 and demonstrate its state-of-the-art performance on many multilingual benchmarks. All of the code and model checkpoints used in this work are publicly available. | 2,827 | [
[
-0.0369873046875,
-0.01190948486328125,
0.0203704833984375,
0.028778076171875,
-0.02056884765625,
0.0252227783203125,
-0.026824951171875,
-0.03131103515625,
0.0120086669921875,
0.025299072265625,
-0.04913330078125,
-0.059906005859375,
-0.06512451171875,
0.051910400390625,
-0.0175628662109375,
0.07598876953125,
-0.0264129638671875,
0.0141143798828125,
0.016326904296875,
-0.03802490234375,
-0.029510498046875,
-0.04339599609375,
-0.034698486328125,
-0.00876617431640625,
0.057098388671875,
0.03118896484375,
0.023345947265625,
0.032318115234375,
0.040924072265625,
0.018310546875,
0.01248931884765625,
0.016021728515625,
-0.034149169921875,
-0.024139404296875,
0.00003147125244140625,
-0.0276031494140625,
-0.0290374755859375,
-0.0100250244140625,
0.037750244140625,
0.04254150390625,
-0.00959014892578125,
0.032928466796875,
-0.007671356201171875,
0.03692626953125,
-0.03436279296875,
0.00007474422454833984,
-0.03765869140625,
0.006549835205078125,
-0.03271484375,
-0.0004153251647949219,
-0.0278778076171875,
-0.005901336669921875,
-0.00751495361328125,
-0.0491943359375,
0.01334381103515625,
0.0021343231201171875,
0.07635498046875,
0.01666259765625,
-0.04608154296875,
-0.0225982666015625,
-0.0313720703125,
0.06884765625,
-0.0300140380859375,
0.065673828125,
0.036865234375,
0.025970458984375,
0.0120086669921875,
-0.07135009765625,
-0.05035400390625,
0.0170440673828125,
-0.002323150634765625,
0.01666259765625,
-0.0036163330078125,
-0.0133819580078125,
0.01045989990234375,
0.018341064453125,
-0.046661376953125,
0.0019016265869140625,
-0.0535888671875,
-0.0087127685546875,
0.02276611328125,
-0.0101470947265625,
0.034088134765625,
-0.0098724365234375,
-0.019256591796875,
-0.003795623779296875,
-0.052642822265625,
0.007904052734375,
0.028656005859375,
0.0245513916015625,
-0.0341796875,
0.0213623046875,
0.01076507568359375,
0.043212890625,
-0.00555419921875,
-0.03106689453125,
0.051666259765625,
-0.03240966796875,
-0.007137298583984375,
-0.0013065338134765625,
0.07550048828125,
0.0151824951171875,
0.026092529296875,
-0.03717041015625,
-0.0025920867919921875,
0.0019683837890625,
0.0172119140625,
-0.06353759765625,
-0.0183258056640625,
0.02362060546875,
-0.0182342529296875,
0.00545501708984375,
-0.01035308837890625,
-0.031463623046875,
0.0029506683349609375,
-0.01605224609375,
0.0173492431640625,
-0.047698974609375,
-0.0277557373046875,
0.005542755126953125,
-0.0005545616149902344,
0.0051727294921875,
0.004302978515625,
-0.08709716796875,
0.003971099853515625,
0.0228271484375,
0.062103271484375,
-0.0265350341796875,
-0.05560302734375,
-0.025909423828125,
0.022491455078125,
-0.0203704833984375,
0.04193115234375,
-0.039093017578125,
-0.0238037109375,
-0.004497528076171875,
0.03759765625,
-0.010467529296875,
-0.0212249755859375,
0.054351806640625,
-0.033782958984375,
0.047271728515625,
-0.0295867919921875,
-0.0010423660278320312,
-0.028045654296875,
0.034576416015625,
-0.06085205078125,
0.09136962890625,
0.007549285888671875,
-0.06817626953125,
0.043548583984375,
-0.066162109375,
-0.04693603515625,
-0.01073455810546875,
0.00408935546875,
-0.032867431640625,
-0.0215911865234375,
0.041595458984375,
0.0306396484375,
-0.004505157470703125,
0.021148681640625,
-0.00879669189453125,
-0.0255126953125,
-0.01509857177734375,
-0.01349639892578125,
0.051300048828125,
0.024566650390625,
-0.03228759765625,
0.00952911376953125,
-0.0667724609375,
-0.0034618377685546875,
-0.003917694091796875,
-0.037750244140625,
-0.000347137451171875,
-0.0180816650390625,
0.0129547119140625,
0.03948974609375,
0.019195556640625,
-0.046844482421875,
0.00017523765563964844,
-0.0178680419921875,
0.03875732421875,
0.04034423828125,
-0.03521728515625,
0.0257415771484375,
-0.01282501220703125,
0.0462646484375,
0.03521728515625,
-0.00606536865234375,
-0.0304107666015625,
-0.0288238525390625,
-0.05438232421875,
-0.034698486328125,
0.042938232421875,
0.04913330078125,
-0.09130859375,
0.0013513565063476562,
-0.052032470703125,
-0.0191497802734375,
-0.0726318359375,
0.0175323486328125,
0.025238037109375,
0.02581787109375,
0.052001953125,
-0.00887298583984375,
-0.05938720703125,
-0.046356201171875,
-0.02142333984375,
0.0209197998046875,
0.0030956268310546875,
-0.0032482147216796875,
0.03863525390625,
-0.031707763671875,
0.043548583984375,
0.0005044937133789062,
-0.031494140625,
-0.03033447265625,
0.0036563873291015625,
0.0236053466796875,
0.0295867919921875,
0.05120849609375,
-0.057098388671875,
-0.0518798828125,
0.0117950439453125,
-0.04754638671875,
0.00775146484375,
0.0172119140625,
-0.002490997314453125,
0.03973388671875,
0.0242919921875,
-0.0232696533203125,
-0.000762939453125,
0.0841064453125,
-0.006320953369140625,
0.016448974609375,
-0.02960205078125,
0.0258331298828125,
-0.12548828125,
0.023162841796875,
-0.0143890380859375,
-0.025146484375,
-0.03497314453125,
-0.0040130615234375,
0.0167999267578125,
-0.00807952880859375,
-0.048828125,
0.04296875,
-0.057952880859375,
0.002285003662109375,
-0.0010328292846679688,
0.005184173583984375,
-0.00812530517578125,
0.042205810546875,
0.005847930908203125,
0.06732177734375,
0.0265960693359375,
-0.048919677734375,
0.00952911376953125,
0.0212554931640625,
-0.0232391357421875,
0.035125732421875,
-0.03582763671875,
0.01641845703125,
-0.01105499267578125,
0.0174560546875,
-0.065185546875,
-0.01079559326171875,
0.00408172607421875,
-0.046295166015625,
0.01413726806640625,
-0.0280303955078125,
-0.047515869140625,
-0.031951904296875,
-0.0108795166015625,
0.0287933349609375,
0.0194549560546875,
-0.048309326171875,
0.037261962890625,
0.0233917236328125,
-0.0026702880859375,
-0.0694580078125,
-0.07452392578125,
0.032623291015625,
-0.0325927734375,
-0.044281005859375,
0.02386474609375,
-0.0118408203125,
0.0286865234375,
-0.023834228515625,
0.023040771484375,
-0.0162200927734375,
0.006862640380859375,
0.0016994476318359375,
0.0099945068359375,
-0.008819580078125,
-0.0119781494140625,
0.0018186569213867188,
-0.01102447509765625,
-0.0173492431640625,
-0.0305938720703125,
0.052947998046875,
-0.004512786865234375,
-0.00963592529296875,
-0.02655029296875,
0.0262908935546875,
0.045806884765625,
-0.043914794921875,
0.058837890625,
0.09027099609375,
-0.0148468017578125,
0.01139068603515625,
-0.033538818359375,
0.0049591064453125,
-0.033233642578125,
0.031463623046875,
-0.067138671875,
-0.08062744140625,
0.049407958984375,
-0.00931549072265625,
0.0216064453125,
0.03619384765625,
0.044281005859375,
0.00244903564453125,
0.07733154296875,
0.05731201171875,
-0.005039215087890625,
0.02886962890625,
-0.01922607421875,
0.0178070068359375,
-0.056304931640625,
-0.00936126708984375,
-0.0389404296875,
-0.0252838134765625,
-0.07354736328125,
-0.0244903564453125,
0.025390625,
-0.0159759521484375,
-0.0151824951171875,
0.04345703125,
-0.0221405029296875,
0.031951904296875,
0.03350830078125,
-0.0160064697265625,
0.02276611328125,
0.0142974853515625,
-0.045654296875,
-0.025299072265625,
-0.055023193359375,
-0.041961669921875,
0.0960693359375,
0.0130157470703125,
0.011749267578125,
0.03765869140625,
0.044036865234375,
-0.009796142578125,
0.03326416015625,
-0.0301361083984375,
0.0099334716796875,
-0.031982421875,
-0.0614013671875,
-0.0093536376953125,
-0.034088134765625,
-0.09503173828125,
0.0232696533203125,
-0.01100921630859375,
-0.043701171875,
-0.006183624267578125,
0.0008006095886230469,
-0.002758026123046875,
0.02313232421875,
-0.066650390625,
0.07708740234375,
-0.00998687744140625,
-0.012847900390625,
0.0051116943359375,
-0.05548095703125,
0.0278167724609375,
-0.0204315185546875,
0.044219970703125,
0.0026302337646484375,
0.00728607177734375,
0.0517578125,
-0.00669097900390625,
0.045989990234375,
-0.005443572998046875,
-0.0086517333984375,
-0.0176239013671875,
-0.007381439208984375,
0.028289794921875,
-0.01139068603515625,
0.0066375732421875,
0.0311126708984375,
0.020233154296875,
-0.048553466796875,
-0.0177764892578125,
0.042083740234375,
-0.07568359375,
-0.01245880126953125,
-0.031494140625,
-0.027984619140625,
-0.0216827392578125,
0.051116943359375,
0.0301971435546875,
0.0207061767578125,
-0.004119873046875,
0.0229644775390625,
0.0282440185546875,
-0.0238800048828125,
0.0545654296875,
0.053985595703125,
-0.0251922607421875,
-0.05364990234375,
0.0677490234375,
0.0162353515625,
0.01397705078125,
0.030548095703125,
-0.002994537353515625,
-0.0306396484375,
-0.04412841796875,
-0.060302734375,
0.0245208740234375,
-0.042083740234375,
0.004116058349609375,
-0.06439208984375,
0.01506805419921875,
-0.04449462890625,
-0.006938934326171875,
-0.0288238525390625,
-0.01519012451171875,
-0.00936126708984375,
-0.0184326171875,
0.001007080078125,
0.0428466796875,
0.0096893310546875,
0.0325927734375,
-0.06927490234375,
0.032379150390625,
-0.00835418701171875,
0.0321044921875,
-0.029144287109375,
-0.039764404296875,
-0.03472900390625,
0.01479339599609375,
-0.0260467529296875,
-0.032501220703125,
0.0491943359375,
0.01377105712890625,
0.038055419921875,
0.0211334228515625,
-0.012298583984375,
0.0556640625,
-0.05841064453125,
0.06365966796875,
0.029144287109375,
-0.06494140625,
0.0130767822265625,
-0.035858154296875,
0.0367431640625,
0.048858642578125,
0.065673828125,
-0.061309814453125,
-0.017913818359375,
-0.043701171875,
-0.058624267578125,
0.0577392578125,
0.00792694091796875,
0.01326751708984375,
0.000027954578399658203,
-0.0088043212890625,
0.0206146240234375,
0.032562255859375,
-0.07427978515625,
-0.01922607421875,
-0.03704833984375,
-0.035400390625,
-0.0318603515625,
-0.007335662841796875,
-0.00396728515625,
-0.0196380615234375,
0.039947509765625,
-0.0217742919921875,
0.0172271728515625,
0.00267791748046875,
-0.03155517578125,
0.0172576904296875,
0.0125579833984375,
0.069091796875,
0.060394287109375,
-0.0112457275390625,
0.0201568603515625,
0.0311126708984375,
-0.06146240234375,
0.010345458984375,
-0.0005278587341308594,
0.01241302490234375,
0.0086822509765625,
0.0287322998046875,
0.07147216796875,
0.0081634521484375,
-0.030364990234375,
0.0283203125,
-0.0187225341796875,
-0.0255126953125,
-0.02459716796875,
-0.025726318359375,
0.023834228515625,
-0.01003265380859375,
0.0202178955078125,
-0.002155303955078125,
-0.0055084228515625,
-0.0438232421875,
-0.0006861686706542969,
0.0016164779663085938,
-0.033416748046875,
-0.04437255859375,
0.05548095703125,
0.0251312255859375,
-0.007480621337890625,
0.040008544921875,
-0.005901336669921875,
-0.05023193359375,
0.0158843994140625,
0.044769287109375,
0.047637939453125,
-0.031890869140625,
0.0005512237548828125,
0.040435791015625,
0.039398193359375,
0.0015010833740234375,
0.0380859375,
0.0034160614013671875,
-0.058837890625,
-0.047210693359375,
-0.047760009765625,
-0.0208282470703125,
-0.00447845458984375,
-0.02130126953125,
0.036346435546875,
-0.0137786865234375,
-0.01035308837890625,
0.0037670135498046875,
0.00411224365234375,
-0.060302734375,
0.034515380859375,
0.0040435791015625,
0.044158935546875,
-0.0423583984375,
0.08642578125,
0.07293701171875,
-0.0263214111328125,
-0.062225341796875,
-0.021728515625,
-0.0216827392578125,
-0.063232421875,
0.056854248046875,
0.022552490234375,
-0.01149749755859375,
0.023468017578125,
-0.013763427734375,
-0.06610107421875,
0.08746337890625,
0.046417236328125,
-0.016326904296875,
0.00089263916015625,
0.0406494140625,
0.033447265625,
-0.015716552734375,
0.03802490234375,
0.0256805419921875,
0.043426513671875,
0.01320648193359375,
-0.0921630859375,
-0.0139617919921875,
-0.0379638671875,
-0.010711669921875,
0.0199127197265625,
-0.051910400390625,
0.058197021484375,
-0.007572174072265625,
-0.010009765625,
-0.02392578125,
0.049530029296875,
0.016632080078125,
0.00812530517578125,
0.0275726318359375,
0.05731201171875,
0.06201171875,
-0.018768310546875,
0.08477783203125,
-0.047088623046875,
0.02117919921875,
0.057281494140625,
0.0007634162902832031,
0.05731201171875,
0.036102294921875,
-0.01435089111328125,
0.03497314453125,
0.06048583984375,
0.0151214599609375,
0.034271240234375,
-0.0117034912109375,
-0.01309967041015625,
0.0023937225341796875,
0.0033931732177734375,
-0.0233917236328125,
0.03106689453125,
0.012359619140625,
-0.019134521484375,
-0.00025177001953125,
0.0177154541015625,
0.0374755859375,
-0.0279693603515625,
-0.007137298583984375,
0.04351806640625,
0.00835418701171875,
-0.059844970703125,
0.06915283203125,
0.0277862548828125,
0.06768798828125,
-0.05364990234375,
0.0262298583984375,
-0.0187835693359375,
0.01739501953125,
-0.020355224609375,
-0.045806884765625,
0.023101806640625,
0.00809478759765625,
-0.01506805419921875,
-0.042022705078125,
0.0204315185546875,
-0.05169677734375,
-0.036712646484375,
0.0222930908203125,
0.0258026123046875,
0.01398468017578125,
0.0017414093017578125,
-0.042449951171875,
-0.0024547576904296875,
0.01042938232421875,
-0.00577545166015625,
0.0235595703125,
0.044036865234375,
-0.008026123046875,
0.052459716796875,
0.058929443359375,
0.0006399154663085938,
0.0255584716796875,
0.0098419189453125,
0.047210693359375,
-0.049041748046875,
-0.04925537109375,
-0.04937744140625,
0.04345703125,
0.01506805419921875,
-0.039825439453125,
0.061553955078125,
0.05218505859375,
0.07635498046875,
-0.01345062255859375,
0.06317138671875,
0.01348114013671875,
0.0517578125,
-0.038360595703125,
0.05303955078125,
-0.0479736328125,
-0.014801025390625,
-0.0196380615234375,
-0.06329345703125,
-0.0278778076171875,
0.03009033203125,
-0.019805908203125,
0.01393890380859375,
0.0789794921875,
0.03448486328125,
-0.02484130859375,
-0.0201416015625,
0.033966064453125,
0.00907135009765625,
0.0308380126953125,
0.042205810546875,
0.032012939453125,
-0.045806884765625,
0.0582275390625,
-0.010040283203125,
0.0161590576171875,
0.01088714599609375,
-0.06268310546875,
-0.075927734375,
-0.053802490234375,
-0.0036563873291015625,
-0.0133209228515625,
0.0008087158203125,
0.05712890625,
0.054656982421875,
-0.057403564453125,
-0.0254058837890625,
0.0094757080078125,
-0.0074920654296875,
0.0119171142578125,
-0.007045745849609375,
0.023590087890625,
-0.03173828125,
-0.07635498046875,
0.0236968994140625,
0.005001068115234375,
0.007740020751953125,
-0.01129913330078125,
-0.00696563720703125,
-0.0293121337890625,
-0.017791748046875,
0.04876708984375,
0.003749847412109375,
-0.0294647216796875,
-0.004901885986328125,
0.010498046875,
-0.0121612548828125,
0.0244140625,
0.0313720703125,
-0.035614013671875,
0.0238037109375,
0.0192108154296875,
0.055389404296875,
0.05328369140625,
-0.01678466796875,
0.04693603515625,
-0.059478759765625,
0.0223388671875,
-0.004993438720703125,
0.0263824462890625,
0.043914794921875,
0.0017595291137695312,
0.03729248046875,
0.028228759765625,
-0.0274505615234375,
-0.053375244140625,
-0.0025577545166015625,
-0.0672607421875,
-0.0009784698486328125,
0.08331298828125,
-0.0214996337890625,
-0.01959228515625,
-0.01357269287109375,
-0.01117706298828125,
0.0222015380859375,
-0.017608642578125,
0.044036865234375,
0.07452392578125,
0.0284576416015625,
-0.035675048828125,
-0.059051513671875,
0.03717041015625,
0.03289794921875,
-0.06640625,
-0.032379150390625,
0.0036563873291015625,
0.036163330078125,
0.0081787109375,
0.045501708984375,
-0.003936767578125,
0.003841400146484375,
-0.0214080810546875,
0.03472900390625,
-0.00909423828125,
-0.023681640625,
-0.003513336181640625,
0.00846099853515625,
-0.0125579833984375,
-0.02386474609375
]
] |
dbmdz/bert-base-german-uncased | 2023-09-06T22:19:33.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"de",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | dbmdz | null | null | dbmdz/bert-base-german-uncased | 14 | 74,724 | transformers | 2022-03-02T23:29:05 | ---
language: de
license: mit
---
# 🤗 + 📚 dbmdz German BERT models
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources another German BERT model 🎉
# German BERT
## Stats
In addition to the recently released [German BERT](https://deepset.ai/german-bert)
model by [deepset](https://deepset.ai/) we provide another German-language model.
The source data for the model consists of a recent Wikipedia dump, EU Bookshop corpus,
Open Subtitles, CommonCrawl, ParaCrawl and News Crawl. This results in a dataset with
a size of 16GB and 2,350,234,427 tokens.
For sentence splitting, we use [spacy](https://spacy.io/). Our preprocessing steps
(sentence piece model for vocab generation) follow those used for training
[SciBERT](https://github.com/allenai/scibert). The model was trained with an initial
sequence length of 512 subwords for 1.5M steps.
This release includes both cased and uncased models.
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
| Model | Downloads
| -------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `bert-base-german-dbmdz-cased` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-config.json) • [`pytorch_model.bin`](https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-pytorch_model.bin) • [`vocab.txt`](https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-vocab.txt)
| `bert-base-german-dbmdz-uncased` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-config.json) • [`pytorch_model.bin`](https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-pytorch_model.bin) • [`vocab.txt`](https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-vocab.txt)
## Usage
With Transformers >= 2.3 our German BERT models can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-german-uncased")
model = AutoModel.from_pretrained("dbmdz/bert-base-german-uncased")
```
## Results
For results on downstream tasks like NER or PoS tagging, please refer to
[this repository](https://github.com/stefan-it/fine-tuned-berts-seq).
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
| 3,159 | [
[
-0.039794921875,
-0.059478759765625,
0.01387786865234375,
0.0177001953125,
-0.0292510986328125,
-0.0164947509765625,
-0.01763916015625,
-0.0306549072265625,
0.01264190673828125,
0.0233154296875,
-0.05621337890625,
-0.048095703125,
-0.04779052734375,
-0.0133514404296875,
-0.031890869140625,
0.09259033203125,
-0.001529693603515625,
0.0180816650390625,
0.0008697509765625,
-0.011688232421875,
0.0027446746826171875,
-0.061370849609375,
-0.0267791748046875,
-0.036712646484375,
0.03350830078125,
-0.0029144287109375,
0.048919677734375,
0.0196380615234375,
0.036712646484375,
0.0236968994140625,
-0.021636962890625,
-0.010986328125,
-0.016021728515625,
-0.004940032958984375,
0.007701873779296875,
-0.01959228515625,
-0.044769287109375,
-0.0147705078125,
0.03668212890625,
0.0426025390625,
-0.0199432373046875,
0.010101318359375,
-0.00722503662109375,
0.04034423828125,
-0.01020050048828125,
0.0183868408203125,
-0.0290679931640625,
0.00421142578125,
-0.0069732666015625,
0.0170135498046875,
-0.0204315185546875,
-0.01123046875,
0.04156494140625,
-0.020782470703125,
0.05047607421875,
-0.023651123046875,
0.10113525390625,
0.0124664306640625,
-0.0208740234375,
-0.018096923828125,
-0.0288848876953125,
0.0528564453125,
-0.0740966796875,
0.038330078125,
0.0170440673828125,
0.01415252685546875,
-0.020355224609375,
-0.06805419921875,
-0.044403076171875,
-0.0062103271484375,
-0.0104522705078125,
0.006374359130859375,
-0.0355224609375,
-0.000002562999725341797,
0.0203094482421875,
0.0252685546875,
-0.038330078125,
0.0017213821411132812,
-0.04290771484375,
-0.0206298828125,
0.057525634765625,
-0.0118255615234375,
0.01824951171875,
-0.040771484375,
-0.026092529296875,
-0.038726806640625,
-0.04071044921875,
0.00989532470703125,
0.0389404296875,
0.0298004150390625,
-0.037445068359375,
0.0455322265625,
-0.004390716552734375,
0.047088623046875,
0.00743865966796875,
0.00983428955078125,
0.043304443359375,
-0.0101470947265625,
-0.01230621337890625,
-0.013336181640625,
0.07073974609375,
0.01332855224609375,
0.006069183349609375,
-0.01107025146484375,
-0.0175628662109375,
-0.0220794677734375,
0.026824951171875,
-0.07183837890625,
-0.0271453857421875,
0.03472900390625,
-0.04693603515625,
-0.0146331787109375,
-0.003009796142578125,
-0.04437255859375,
-0.00560760498046875,
-0.01134490966796875,
0.04962158203125,
-0.05609130859375,
-0.019134521484375,
0.0199737548828125,
-0.01412200927734375,
0.03167724609375,
0.0041961669921875,
-0.06378173828125,
-0.00014734268188476562,
0.0297698974609375,
0.0556640625,
0.006572723388671875,
-0.020721435546875,
0.005290985107421875,
-0.019744873046875,
-0.012054443359375,
0.04583740234375,
0.0088958740234375,
-0.0210723876953125,
0.01812744140625,
0.0216217041015625,
-0.0047454833984375,
-0.01727294921875,
0.044525146484375,
-0.038543701171875,
0.04119873046875,
-0.0217437744140625,
-0.048095703125,
-0.034210205078125,
-0.0036411285400390625,
-0.044281005859375,
0.09027099609375,
0.0246734619140625,
-0.05419921875,
0.025360107421875,
-0.053436279296875,
-0.04644775390625,
-0.0029621124267578125,
0.007427215576171875,
-0.058441162109375,
0.006740570068359375,
0.0274200439453125,
0.055816650390625,
-0.00540924072265625,
0.0160064697265625,
-0.03399658203125,
-0.01030731201171875,
0.01666259765625,
-0.0063934326171875,
0.096923828125,
0.0119476318359375,
-0.032196044921875,
0.0176239013671875,
-0.045654296875,
0.0027904510498046875,
0.0295257568359375,
-0.02130126953125,
0.01229095458984375,
0.00140380859375,
0.023651123046875,
0.0187225341796875,
0.0279083251953125,
-0.038970947265625,
0.01885986328125,
-0.0251617431640625,
0.048370361328125,
0.05279541015625,
-0.019287109375,
0.0147247314453125,
-0.0240020751953125,
0.0180816650390625,
0.006114959716796875,
0.014190673828125,
0.0018024444580078125,
-0.03411865234375,
-0.0799560546875,
-0.0418701171875,
0.044677734375,
0.03076171875,
-0.057281494140625,
0.07684326171875,
-0.0257110595703125,
-0.053558349609375,
-0.052337646484375,
-0.0000979304313659668,
0.0169525146484375,
0.02349853515625,
0.034149169921875,
-0.024383544921875,
-0.03955078125,
-0.0830078125,
0.0087432861328125,
-0.0113983154296875,
-0.026824951171875,
0.0278472900390625,
0.0560302734375,
-0.0244598388671875,
0.06085205078125,
-0.0224609375,
-0.0257110595703125,
-0.028045654296875,
0.0182952880859375,
0.0286102294921875,
0.039276123046875,
0.0706787109375,
-0.03448486328125,
-0.0149688720703125,
-0.0255126953125,
-0.054656982421875,
0.00579071044921875,
0.00450897216796875,
-0.018707275390625,
0.0462646484375,
0.02435302734375,
-0.061004638671875,
0.01552581787109375,
0.039642333984375,
-0.039581298828125,
0.032012939453125,
-0.020904541015625,
-0.0190277099609375,
-0.09246826171875,
0.012451171875,
0.01006317138671875,
-0.0134124755859375,
-0.0307159423828125,
0.00994873046875,
0.00441741943359375,
0.00852203369140625,
-0.040863037109375,
0.03997802734375,
-0.0213165283203125,
-0.01113128662109375,
0.007053375244140625,
-0.006130218505859375,
-0.004871368408203125,
0.053741455078125,
0.012451171875,
0.04290771484375,
0.047698974609375,
-0.039520263671875,
0.0254974365234375,
0.0399169921875,
-0.043914794921875,
0.0153656005859375,
-0.07232666015625,
0.005481719970703125,
-0.0011720657348632812,
0.033447265625,
-0.057220458984375,
-0.016387939453125,
0.0235595703125,
-0.04302978515625,
0.031829833984375,
-0.0207977294921875,
-0.0648193359375,
-0.038299560546875,
-0.00801849365234375,
-0.0031585693359375,
0.057281494140625,
-0.05755615234375,
0.05316162109375,
0.0255889892578125,
-0.01236724853515625,
-0.039886474609375,
-0.0592041015625,
-0.01436614990234375,
-0.0168914794921875,
-0.06256103515625,
0.047393798828125,
-0.01084136962890625,
0.01493072509765625,
0.0079803466796875,
-0.005489349365234375,
-0.002300262451171875,
-0.004100799560546875,
0.0033397674560546875,
0.038177490234375,
-0.005558013916015625,
0.0027828216552734375,
0.00554656982421875,
0.00608062744140625,
0.006252288818359375,
-0.004711151123046875,
0.05474853515625,
-0.02850341796875,
-0.006916046142578125,
-0.02801513671875,
0.013946533203125,
0.033203125,
-0.0037746429443359375,
0.062469482421875,
0.06707763671875,
-0.0263824462890625,
-0.018951416015625,
-0.040252685546875,
-0.02642822265625,
-0.035675048828125,
0.0241546630859375,
-0.0285797119140625,
-0.06671142578125,
0.052337646484375,
0.026824951171875,
0.0166015625,
0.050537109375,
0.054168701171875,
-0.0243377685546875,
0.07879638671875,
0.057281494140625,
-0.00839996337890625,
0.0562744140625,
-0.049407958984375,
0.01248931884765625,
-0.048126220703125,
-0.00847625732421875,
-0.0289154052734375,
-0.01555633544921875,
-0.047210693359375,
-0.02301025390625,
0.02191162109375,
0.01444244384765625,
-0.0223388671875,
0.049407958984375,
-0.047393798828125,
-0.0018634796142578125,
0.0655517578125,
0.026092529296875,
-0.02197265625,
0.0290679931640625,
-0.0296478271484375,
0.0084075927734375,
-0.0626220703125,
-0.0252838134765625,
0.0894775390625,
0.031829833984375,
0.0228271484375,
0.010711669921875,
0.0550537109375,
0.013275146484375,
0.01265716552734375,
-0.057830810546875,
0.03582763671875,
-0.01010894775390625,
-0.0712890625,
-0.00954437255859375,
-0.02484130859375,
-0.07183837890625,
0.020843505859375,
-0.0124053955078125,
-0.057220458984375,
0.0108642578125,
-0.0087890625,
-0.027435302734375,
0.0199737548828125,
-0.058197021484375,
0.0743408203125,
-0.02142333984375,
-0.01259613037109375,
-0.0142669677734375,
-0.057525634765625,
0.00620269775390625,
0.01126861572265625,
-0.005374908447265625,
-0.0064544677734375,
0.0234832763671875,
0.06488037109375,
-0.045135498046875,
0.06439208984375,
-0.0137176513671875,
-0.00801849365234375,
0.03131103515625,
-0.00970458984375,
0.026947021484375,
-0.01812744140625,
-0.004425048828125,
0.050537109375,
0.01491546630859375,
-0.0423583984375,
-0.016448974609375,
0.037811279296875,
-0.075439453125,
-0.027984619140625,
-0.04815673828125,
-0.0384521484375,
-0.00733184814453125,
0.0264892578125,
0.03369140625,
0.01436614990234375,
-0.00576019287109375,
0.03033447265625,
0.044219970703125,
-0.032196044921875,
0.042724609375,
0.049163818359375,
-0.0066070556640625,
-0.0279693603515625,
0.04986572265625,
0.006145477294921875,
-0.0058746337890625,
0.0037212371826171875,
-0.0040283203125,
-0.0362548828125,
-0.036346435546875,
-0.038421630859375,
0.030487060546875,
-0.05303955078125,
-0.01548004150390625,
-0.053070068359375,
-0.032012939453125,
-0.04266357421875,
0.0026912689208984375,
-0.0211639404296875,
-0.0302581787109375,
-0.0222320556640625,
-0.02392578125,
0.0538330078125,
0.0276641845703125,
-0.017303466796875,
0.031646728515625,
-0.056121826171875,
0.01995849609375,
0.00589752197265625,
0.0362548828125,
-0.01534271240234375,
-0.04803466796875,
-0.0170440673828125,
0.01236724853515625,
-0.027313232421875,
-0.052032470703125,
0.03863525390625,
0.0002887248992919922,
0.043243408203125,
0.0185394287109375,
0.003986358642578125,
0.040863037109375,
-0.0377197265625,
0.06329345703125,
0.0099945068359375,
-0.07122802734375,
0.0287628173828125,
-0.0374755859375,
0.004970550537109375,
0.017730712890625,
0.0306549072265625,
-0.0306854248046875,
-0.01078033447265625,
-0.07415771484375,
-0.086181640625,
0.07403564453125,
0.027191162109375,
0.021240234375,
0.0009598731994628906,
0.01273345947265625,
0.00000286102294921875,
0.0190887451171875,
-0.06268310546875,
-0.0262908935546875,
-0.0284576416015625,
-0.0240020751953125,
-0.01044464111328125,
-0.0191650390625,
-0.0240325927734375,
-0.03704833984375,
0.07373046875,
0.002956390380859375,
0.046478271484375,
0.0225830078125,
-0.01082611083984375,
-0.0016050338745117188,
0.0015001296997070312,
0.0305328369140625,
0.02777099609375,
-0.031768798828125,
-0.0261383056640625,
0.00760650634765625,
-0.018798828125,
-0.0234527587890625,
0.04986572265625,
-0.0219268798828125,
0.0216827392578125,
0.024261474609375,
0.07476806640625,
0.01088714599609375,
-0.045562744140625,
0.033355712890625,
-0.00595855712890625,
-0.0360107421875,
-0.041229248046875,
-0.0080108642578125,
0.01806640625,
0.0316162109375,
0.0272979736328125,
-0.0160064697265625,
-0.004001617431640625,
-0.02557373046875,
0.020904541015625,
0.035858154296875,
-0.0210723876953125,
-0.018951416015625,
0.038818359375,
0.01213836669921875,
-0.009735107421875,
0.0657958984375,
-0.00791168212890625,
-0.0538330078125,
0.055267333984375,
0.024566650390625,
0.069580078125,
0.01032257080078125,
0.021820068359375,
0.034149169921875,
0.0248260498046875,
-0.003223419189453125,
0.0338134765625,
-0.0052947998046875,
-0.06842041015625,
-0.0171356201171875,
-0.06121826171875,
-0.01459503173828125,
0.0264892578125,
-0.047576904296875,
0.0225372314453125,
-0.0233917236328125,
-0.0261993408203125,
0.00908660888671875,
0.01209259033203125,
-0.06243896484375,
0.01165771484375,
0.011810302734375,
0.05792236328125,
-0.04791259765625,
0.07196044921875,
0.048370361328125,
-0.05023193359375,
-0.05908203125,
-0.0177154541015625,
-0.0164337158203125,
-0.060455322265625,
0.033905029296875,
0.00988006591796875,
0.0200653076171875,
0.00283050537109375,
-0.046630859375,
-0.06976318359375,
0.08258056640625,
0.0199737548828125,
-0.0167236328125,
-0.0025272369384765625,
0.00008726119995117188,
0.053985595703125,
-0.01007843017578125,
0.020355224609375,
0.03814697265625,
0.0215301513671875,
0.007293701171875,
-0.0523681640625,
0.00432586669921875,
-0.031829833984375,
-0.006381988525390625,
0.00852203369140625,
-0.048980712890625,
0.06488037109375,
-0.01203155517578125,
-0.01151275634765625,
0.01377105712890625,
0.053192138671875,
0.02728271484375,
-0.00977325439453125,
0.03363037109375,
0.056884765625,
0.03912353515625,
-0.0233612060546875,
0.08740234375,
-0.04241943359375,
0.0518798828125,
0.058074951171875,
0.0029010772705078125,
0.060302734375,
0.02862548828125,
-0.0270538330078125,
0.05206298828125,
0.06585693359375,
-0.0379638671875,
0.031646728515625,
0.00885772705078125,
-0.005420684814453125,
-0.01280975341796875,
0.00942230224609375,
-0.040252685546875,
0.0243682861328125,
0.0226287841796875,
-0.037078857421875,
-0.0129547119140625,
-0.016204833984375,
0.014801025390625,
-0.03521728515625,
0.0032558441162109375,
0.04302978515625,
-0.00368499755859375,
-0.03997802734375,
0.054840087890625,
0.01174163818359375,
0.061279296875,
-0.055419921875,
0.01568603515625,
-0.01113128662109375,
0.021728515625,
-0.003360748291015625,
-0.04852294921875,
0.012237548828125,
0.00016188621520996094,
-0.00994110107421875,
-0.0292510986328125,
0.04290771484375,
-0.0256500244140625,
-0.054290771484375,
0.034149169921875,
0.040557861328125,
0.019439697265625,
0.00986480712890625,
-0.0848388671875,
-0.00756072998046875,
0.0015583038330078125,
-0.042633056640625,
0.0225372314453125,
0.0241546630859375,
0.0169677734375,
0.05157470703125,
0.05865478515625,
-0.01078033447265625,
0.01137542724609375,
0.0103302001953125,
0.05694580078125,
-0.042266845703125,
-0.031951904296875,
-0.03851318359375,
0.05706787109375,
-0.003997802734375,
-0.0292510986328125,
0.052978515625,
0.0303497314453125,
0.07672119140625,
-0.0170135498046875,
0.06256103515625,
-0.035369873046875,
0.0325927734375,
-0.0229034423828125,
0.07025146484375,
-0.053192138671875,
-0.005702972412109375,
-0.0240020751953125,
-0.051177978515625,
-0.0084075927734375,
0.06854248046875,
-0.015716552734375,
0.0201568603515625,
0.040252685546875,
0.05078125,
0.00004869699478149414,
-0.00571441650390625,
-0.005809783935546875,
0.0262908935546875,
0.0217742919921875,
0.0242767333984375,
0.025909423828125,
-0.052886962890625,
0.047210693359375,
-0.052947998046875,
0.00027871131896972656,
-0.02484130859375,
-0.05950927734375,
-0.0740966796875,
-0.050323486328125,
-0.031494140625,
-0.04754638671875,
-0.007080078125,
0.06646728515625,
0.064697265625,
-0.06671142578125,
-0.0221405029296875,
-0.0186309814453125,
0.0007200241088867188,
-0.02178955078125,
-0.021820068359375,
0.04986572265625,
-0.0228271484375,
-0.06793212890625,
0.01502227783203125,
-0.0225372314453125,
0.02740478515625,
-0.013946533203125,
-0.0016880035400390625,
-0.01983642578125,
0.00478363037109375,
0.043182373046875,
0.0227203369140625,
-0.056182861328125,
-0.019989013671875,
0.003551483154296875,
-0.011444091796875,
0.0008702278137207031,
0.034393310546875,
-0.03985595703125,
0.0197906494140625,
0.0435791015625,
0.0261993408203125,
0.058258056640625,
-0.02099609375,
0.044708251953125,
-0.0545654296875,
0.0303955078125,
0.0160369873046875,
0.05352783203125,
0.030792236328125,
-0.00684356689453125,
0.045440673828125,
0.004199981689453125,
-0.035003662109375,
-0.06414794921875,
0.0027866363525390625,
-0.0823974609375,
-0.03656005859375,
0.07464599609375,
-0.0227203369140625,
-0.0210113525390625,
0.0242919921875,
-0.0093536376953125,
0.047393798828125,
-0.03076171875,
0.0748291015625,
0.0743408203125,
-0.0045623779296875,
-0.007282257080078125,
-0.03350830078125,
0.040985107421875,
0.051025390625,
-0.041748046875,
-0.0103607177734375,
0.037811279296875,
0.031494140625,
0.010467529296875,
0.020111083984375,
-0.01409149169921875,
0.0127410888671875,
-0.006824493408203125,
0.04754638671875,
0.003002166748046875,
-0.00783538818359375,
-0.0249481201171875,
-0.00618743896484375,
-0.034210205078125,
-0.01151275634765625
]
] |
TheBloke/Llama-2-70B-Chat-GPTQ | 2023-09-27T12:44:49.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"facebook",
"meta",
"pytorch",
"llama-2",
"en",
"arxiv:2307.09288",
"license:llama2",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Llama-2-70B-Chat-GPTQ | 222 | 74,507 | transformers | 2023-07-18T23:33:13 | ---
language:
- en
license: llama2
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
model_name: Llama 2 70B Chat
base_model: meta-llama/Llama-2-70b-chat-hf
inference: false
model_creator: Meta Llama 2
model_type: llama
pipeline_tag: text-generation
prompt_template: '[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as
possible, while being safe. Your answers should not include any harmful, unethical,
racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses
are socially unbiased and positive in nature. If a question does not make any sense,
or is not factually coherent, explain why instead of answering something not correct.
If you don''t know the answer to a question, please don''t share false information.
<</SYS>>
{prompt}[/INST]
'
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Llama 2 70B Chat - GPTQ
- Model creator: [Meta Llama 2](https://huggingface.co/meta-llama)
- Original model: [Llama 2 70B Chat](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Meta Llama 2's Llama 2 70B Chat](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Llama-2-70B-chat-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Llama-2-70B-chat-GGUF)
* [Meta Llama 2's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Llama-2-Chat
```
[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>
{prompt}[/INST]
```
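For convenience, the template above can be applied programmatically. The helper below is an illustrative sketch (the function name and default system message are not part of this repo; the system message is abridged from the full one shown above):

```python
# Illustrative helper for building a Llama-2-Chat prompt.
# The template itself comes from the model card; the function is a convenience sketch.
DEFAULT_SYSTEM = (
    "You are a helpful, respectful and honest assistant. Always answer as "
    "helpfully as possible, while being safe."
)

def build_llama2_prompt(user_message: str, system_message: str = DEFAULT_SYSTEM) -> str:
    """Wrap a user message in the Llama-2-Chat [INST]/<<SYS>> template."""
    return f"[INST] <<SYS>>\n{system_message}\n<</SYS>>\n{user_message}[/INST]\n"

print(build_llama2_prompt("Tell me about AI"))
```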
<!-- prompt-template end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
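As a rough sanity check on the sizes in the table below: a GPTQ file is approximately `params × bits / 8` bytes, plus per-group scale/zero-point overhead when a group size is used. A back-of-the-envelope sketch (the 70-billion parameter count and the per-group overhead are assumptions for illustration, not how the files are actually laid out):

```python
from typing import Optional

def approx_gptq_size_gb(n_params: float, bits: int, group_size: Optional[int]) -> float:
    """Rough GPTQ file size estimate: packed weights plus per-group overhead.

    Assumes roughly 4 bytes of scale/zero-point metadata per group; this is a
    back-of-the-envelope sketch, not the actual on-disk layout.
    """
    weight_bytes = n_params * bits / 8
    overhead_bytes = 0.0 if group_size is None else n_params / group_size * 4
    return (weight_bytes + overhead_bytes) / 1e9

# 70e9 parameters is an assumption for illustration
print(round(approx_gptq_size_gb(70e9, 4, None), 1))  # ≈ 35 GB, near the 35.33 GB in `main`
print(round(approx_gptq_size_gb(70e9, 3, None), 1))  # ≈ 26 GB, near the 3-bit no-group file
```

Smaller group sizes add more per-group metadata, which is why the 32g file is noticeably larger than the 128g one.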
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ/tree/main) | 4 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 35.33 GB | Yes | 4-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 40.66 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 36.65 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-3bit-64g-actorder_True](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ/tree/gptq-3bit-64g-actorder_True) | 3 | 64 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 29.30 GB | No | 3-bit, with group size 64g and act-order. |
| [gptq-3bit-128g-actorder_True](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ/tree/gptq-3bit-128g-actorder_True) | 3 | 128 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 28.03 GB | No | 3-bit, with group size 128g and act-order. Higher quality than 128g-False. |
| [gptq-3bit-128g-actorder_False](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ/tree/gptq-3bit-128g-actorder_False) | 3 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 28.03 GB | No | 3-bit, with group size 128g but no act-order. Slightly higher VRAM requirements than 3-bit None. |
| [gptq-3bit--1g-actorder_True](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ/tree/gptq-3bit--1g-actorder_True) | 3 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 26.78 GB | No | 3-bit, with Act Order and no group size. Lowest possible VRAM requirements. May be lower quality than 3-bit 128g. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 37.99 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, eg `TheBloke/Llama-2-70B-chat-GPTQ:main`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch main https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see below.
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Llama-2-70B-chat-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Llama-2-70B-chat-GPTQ:main`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Llama-2-70B-chat-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install transformers>=4.32.0 optimum>=1.12.0
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip3 install .
```
### For CodeLlama models only: you must use Transformers 4.33.0 or later.
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source:
```shell
pip3 uninstall -y transformers
pip3 install git+https://github.com/huggingface/transformers.git
```
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Llama-2-70B-chat-GPTQ"
# To use a different branch, change revision
# For example: revision="main"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>
{prompt}[/INST]
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Meta Llama 2's Llama 2 70B Chat
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 70B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The larger 70B model uses Grouped-Query Attention (GQA) for improved inference scalability.
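The inference-memory benefit of GQA can be sketched with the KV-cache size formula, `2 × layers × kv_heads × head_dim × seq_len × bytes_per_element` (the factor of 2 covers keys and values). The layer and head counts below are the commonly cited Llama 2 70B shapes and should be treated as assumptions:

```python
def kv_cache_gib(layers: int, kv_heads: int, head_dim: int,
                 seq_len: int, bytes_per_elem: int = 2) -> float:
    """fp16 KV-cache size for one sequence: keys + values across all layers."""
    return 2 * layers * kv_heads * head_dim * seq_len * bytes_per_elem / 2**30

# Commonly cited Llama 2 70B shapes (assumptions): 80 layers, head_dim 128,
# 64 query heads but only 8 KV heads under GQA.
print(kv_cache_gib(80, 8, 128, 4096))   # → 1.25 GiB with GQA
print(kv_cache_gib(80, 64, 128, 4096))  # → 10.0 GiB with full multi-head attention
```

At the 4k context length, GQA cuts the per-sequence cache by 8x, which is what makes batched inference of the 70B model practical.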
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** ["Llama 2: Open Foundation and Fine-Tuned Chat Models"](https://arxiv.org/abs/2307.09288)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
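The per-model figures in the table are consistent with a simple energy-times-carbon-intensity calculation. The intensity value in the sketch below (~0.42 tCO<sub>2</sub>eq/MWh) is inferred from the table itself, not a figure stated by Meta:

```python
def emissions_tco2eq(gpu_hours: float, watts_per_gpu: float,
                     intensity_t_per_mwh: float = 0.4235) -> float:
    """Back-of-the-envelope emissions: GPU energy in MWh times grid carbon intensity.

    The default intensity is inferred from the table above, not an official figure.
    """
    mwh = gpu_hours * watts_per_gpu / 1e6
    return mwh * intensity_t_per_mwh

print(round(emissions_tco2eq(1720320, 400), 1))  # ≈ 291.4 tCO2eq for Llama 2 70B
```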
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/).
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
0.0208892822265625,
-0.04132080078125,
0.001590728759765625,
0.001598358154296875,
0.0289764404296875,
-0.00589752197265625,
0.051849365234375,
0.044464111328125,
0.028472900390625,
0.0203704833984375,
-0.07965087890625,
0.0308990478515625,
-0.04144287109375,
0.0011463165283203125,
0.0089569091796875,
-0.07916259765625,
0.06689453125,
0.0011997222900390625,
-0.014984130859375,
0.0087890625,
0.053436279296875,
0.0355224609375,
0.0023956298828125,
0.034515380859375,
0.059112548828125,
0.057159423828125,
-0.0262908935546875,
0.0828857421875,
-0.0129241943359375,
0.036956787109375,
0.055419921875,
0.0033779144287109375,
0.05706787109375,
0.02252197265625,
-0.055450439453125,
0.039337158203125,
0.08074951171875,
-0.00040841102600097656,
0.028961181640625,
-0.002986907958984375,
-0.023162841796875,
-0.0035877227783203125,
0.01013946533203125,
-0.054351806640625,
0.0128326416015625,
0.03680419921875,
-0.008026123046875,
0.01107025146484375,
-0.014801025390625,
0.01153564453125,
-0.054046630859375,
-0.0025005340576171875,
0.0504150390625,
0.019866943359375,
-0.014862060546875,
0.06732177734375,
-0.01068115234375,
0.05816650390625,
-0.03509521484375,
-0.0118408203125,
-0.036590576171875,
-0.00925445556640625,
-0.0220947265625,
-0.06597900390625,
0.0194854736328125,
-0.00820159912109375,
-0.0001881122589111328,
0.004154205322265625,
0.050323486328125,
-0.01012420654296875,
-0.025482177734375,
0.028350830078125,
0.032684326171875,
0.0298919677734375,
0.00007021427154541016,
-0.07525634765625,
0.0233154296875,
0.005786895751953125,
-0.054107666015625,
0.034423828125,
0.032745361328125,
0.0075225830078125,
0.055419921875,
0.047760009765625,
-0.0090789794921875,
0.0013408660888671875,
-0.0195159912109375,
0.08074951171875,
-0.057769775390625,
-0.0185699462890625,
-0.060516357421875,
0.03912353515625,
-0.0168304443359375,
-0.0277099609375,
0.057159423828125,
0.0386962890625,
0.04742431640625,
0.017730712890625,
0.042572021484375,
-0.033111572265625,
0.018707275390625,
-0.01910400390625,
0.044342041015625,
-0.05023193359375,
0.01148223876953125,
-0.0284271240234375,
-0.05291748046875,
-0.0035686492919921875,
0.05950927734375,
-0.00643157958984375,
0.0158233642578125,
0.02618408203125,
0.06695556640625,
0.00438690185546875,
0.0126495361328125,
0.0117034912109375,
0.030059814453125,
0.0191497802734375,
0.06854248046875,
0.061370849609375,
-0.0672607421875,
0.042938232421875,
-0.030059814453125,
-0.0217742919921875,
-0.0098724365234375,
-0.060638427734375,
-0.060394287109375,
-0.037994384765625,
-0.043426513671875,
-0.03997802734375,
-0.0074920654296875,
0.063232421875,
0.0594482421875,
-0.046539306640625,
-0.019989013671875,
0.00341033935546875,
0.006549835205078125,
-0.017913818359375,
-0.0222625732421875,
0.0250396728515625,
0.032806396484375,
-0.040924072265625,
0.0103912353515625,
0.00685882568359375,
0.0308837890625,
-0.007274627685546875,
-0.0259246826171875,
-0.01435089111328125,
0.005405426025390625,
0.0516357421875,
0.03814697265625,
-0.044830322265625,
-0.021331787109375,
-0.0068359375,
-0.0105438232421875,
0.0198211669921875,
0.006591796875,
-0.048583984375,
-0.01219940185546875,
0.03472900390625,
0.01131439208984375,
0.06671142578125,
0.00940704345703125,
0.01395416259765625,
-0.04217529296875,
0.016021728515625,
0.0010576248168945312,
0.019805908203125,
0.004795074462890625,
-0.042572021484375,
0.05029296875,
0.0278472900390625,
-0.046600341796875,
-0.0595703125,
-0.014007568359375,
-0.09478759765625,
-0.00545501708984375,
0.088623046875,
-0.0108642578125,
-0.024017333984375,
-0.002246856689453125,
-0.0260467529296875,
0.02099609375,
-0.0355224609375,
0.02398681640625,
0.037322998046875,
-0.0284576416015625,
-0.0301513671875,
-0.05706787109375,
0.04058837890625,
0.0121917724609375,
-0.0689697265625,
-0.002124786376953125,
0.03997802734375,
0.036590576171875,
-0.0013608932495117188,
0.075927734375,
-0.0158843994140625,
0.0184173583984375,
0.00698089599609375,
-0.0005664825439453125,
0.007640838623046875,
0.0102081298828125,
-0.0156402587890625,
-0.010589599609375,
-0.0144195556640625,
-0.00949859619140625
]
] |
wajidlinux99/gibberish-text-detector | 2023-01-16T12:15:52.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"text",
"nlp",
"correction",
"en",
"endpoints_compatible",
"region:us"
] | text-classification | wajidlinux99 | null | null | wajidlinux99/gibberish-text-detector | 2 | 74,340 | transformers | 2023-01-16T11:46:09 | ---
language:
- en
pipeline_tag: text-classification
tags:
- text
- nlp
- correction
---
# Model Trained Using AutoNLP
- Problem type: Multi-class Classification
- Model ID: 492513457
- CO2 Emissions (in grams): 5.527544460835904
## Validation Metrics
- Loss: 0.07609463483095169
- Accuracy: 0.9735624586913417
- Macro F1: 0.9736173135739408
- Micro F1: 0.9735624586913417
- Weighted F1: 0.9736173135739408
- Macro Precision: 0.9737771415197378
- Micro Precision: 0.9735624586913417
- Weighted Precision: 0.9737771415197378
- Macro Recall: 0.9735624586913417
- Micro Recall: 0.9735624586913417
- Weighted Recall: 0.9735624586913417
## Usage
You can use cURL to access this model:
```shell
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "Is this text really worth it?"}' https://api-inference.huggingface.co/models/wajidlinux99/gibberish-text-detector
```
Or use the Transformers Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("wajidlinux99/gibberish-text-detector", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("wajidlinux99/gibberish-text-detector", use_auth_token=True)
inputs = tokenizer("Is this text really worth it?", return_tensors="pt")
outputs = model(**inputs)
```
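The `outputs` object above carries raw logits, one score per class. A minimal sketch of the post-processing, written in plain Python for clarity (in practice you would call `torch.softmax` on `outputs.logits` and map indices through `model.config.id2label`; the logit values and label names below are illustrative placeholders, not the model's actual output):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw class scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one input; real values come from outputs.logits.
logits = [2.1, -0.3, -1.2, -0.8]
probs = softmax(logits)
predicted_idx = max(range(len(probs)), key=probs.__getitem__)

# Map the index to a label via model.config.id2label in practice;
# the names below are placeholders for illustration only.
labels = ["clean", "mild gibberish", "noise", "word salad"]
print(labels[predicted_idx], round(probs[predicted_idx], 3))
```

The highest-probability class is the predicted label; the probability itself is a useful confidence score for thresholding.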
# Original Repository
madhurjindal/autonlp-Gibberish-Detector-492513457 | 1,420 | [
[
-0.030548095703125,
-0.039581298828125,
0.014495849609375,
0.007183074951171875,
-0.002277374267578125,
-0.0007576942443847656,
-0.00457763671875,
-0.028594970703125,
-0.0006384849548339844,
0.0121612548828125,
-0.0257568359375,
-0.0347900390625,
-0.061614990234375,
0.00341033935546875,
-0.037506103515625,
0.07275390625,
0.00738525390625,
-0.006916046142578125,
0.01509857177734375,
0.0002238750457763672,
-0.0248565673828125,
-0.0662841796875,
-0.059600830078125,
-0.00469207763671875,
0.0220184326171875,
0.00714874267578125,
0.039581298828125,
0.0228424072265625,
0.03607177734375,
0.0274200439453125,
-0.01416778564453125,
0.0004100799560546875,
-0.01491546630859375,
-0.01467132568359375,
-0.00232696533203125,
-0.026519775390625,
-0.033050537109375,
0.01348114013671875,
0.042022705078125,
0.02886962890625,
-0.01097869873046875,
0.0276947021484375,
0.0113677978515625,
0.031951904296875,
-0.0499267578125,
0.0267486572265625,
-0.041839599609375,
0.00983428955078125,
0.0083160400390625,
-0.005126953125,
-0.0291900634765625,
0.0029697418212890625,
0.018585205078125,
-0.03826904296875,
0.0200042724609375,
0.0219879150390625,
0.10406494140625,
0.03399658203125,
-0.01435089111328125,
-0.037353515625,
-0.0176544189453125,
0.056549072265625,
-0.0826416015625,
0.023590087890625,
0.0219879150390625,
0.006809234619140625,
0.0036296844482421875,
-0.05938720703125,
-0.058319091796875,
-0.00437164306640625,
-0.0316162109375,
0.02447509765625,
-0.0092620849609375,
-0.00011903047561645508,
0.0196990966796875,
0.044647216796875,
-0.061309814453125,
0.006191253662109375,
-0.017364501953125,
-0.03668212890625,
0.059661865234375,
0.01560211181640625,
0.0250244140625,
-0.042144775390625,
-0.033203125,
-0.0386962890625,
-0.0023860931396484375,
0.01502227783203125,
0.039306640625,
0.024169921875,
-0.027587890625,
0.0384521484375,
-0.018524169921875,
0.045196533203125,
0.00479888916015625,
-0.0008516311645507812,
0.03680419921875,
-0.03143310546875,
-0.034332275390625,
0.000016808509826660156,
0.0858154296875,
0.01428985595703125,
0.007049560546875,
0.016387939453125,
-0.002574920654296875,
0.004688262939453125,
-0.0019779205322265625,
-0.0697021484375,
-0.032928466796875,
0.029541015625,
-0.0303497314453125,
-0.03668212890625,
0.0092926025390625,
-0.054901123046875,
0.0142059326171875,
-0.0203857421875,
0.039154052734375,
-0.0261688232421875,
-0.046722412109375,
0.0193023681640625,
-0.015289306640625,
0.02227783203125,
0.01244354248046875,
-0.07257080078125,
-0.002040863037109375,
0.02655029296875,
0.080322265625,
-0.00589752197265625,
-0.0248260498046875,
0.008880615234375,
-0.0036983489990234375,
-0.00847625732421875,
0.055572509765625,
-0.017303466796875,
-0.0214691162109375,
-0.0214691162109375,
0.01494598388671875,
-0.02655029296875,
-0.0208892822265625,
0.044891357421875,
-0.0202178955078125,
0.040557861328125,
0.006641387939453125,
-0.049224853515625,
-0.0235595703125,
0.02484130859375,
-0.03900146484375,
0.08978271484375,
0.0237274169921875,
-0.059539794921875,
0.034454345703125,
-0.04815673828125,
-0.01383209228515625,
0.007122039794921875,
0.0017871856689453125,
-0.0565185546875,
-0.016387939453125,
0.01145172119140625,
0.0396728515625,
0.0023403167724609375,
0.0447998046875,
-0.027984619140625,
-0.0154876708984375,
0.004974365234375,
-0.046661376953125,
0.08685302734375,
0.031890869140625,
-0.03265380859375,
0.0092926025390625,
-0.06982421875,
0.0188446044921875,
0.006992340087890625,
-0.0239105224609375,
-0.0187225341796875,
-0.036407470703125,
0.0261383056640625,
0.02191162109375,
0.0122528076171875,
-0.02667236328125,
0.01261138916015625,
-0.0302581787109375,
0.038421630859375,
0.052276611328125,
-0.005664825439453125,
0.0156402587890625,
-0.022979736328125,
0.01035308837890625,
0.006320953369140625,
0.0230865478515625,
0.006298065185546875,
-0.04315185546875,
-0.0750732421875,
-0.033111572265625,
0.0229339599609375,
0.034149169921875,
-0.0265350341796875,
0.07562255859375,
-0.00771331787109375,
-0.058502197265625,
-0.032623291015625,
-0.0101470947265625,
0.020782470703125,
0.049560546875,
0.037445068359375,
-0.018585205078125,
-0.0364990234375,
-0.058319091796875,
-0.033172607421875,
-0.0261688232421875,
0.004425048828125,
0.007415771484375,
0.06463623046875,
-0.0296478271484375,
0.06756591796875,
-0.045166015625,
-0.025177001953125,
0.006870269775390625,
0.037933349609375,
0.0204315185546875,
0.05419921875,
0.05609130859375,
-0.0462646484375,
-0.04510498046875,
-0.038604736328125,
-0.060394287109375,
0.0020694732666015625,
-0.00901031494140625,
-0.01788330078125,
0.0256195068359375,
0.0192718505859375,
-0.035552978515625,
0.035186767578125,
0.04486083984375,
-0.046478271484375,
0.04425048828125,
-0.017181396484375,
0.0094146728515625,
-0.0699462890625,
0.0258331298828125,
-0.004974365234375,
-0.0124053955078125,
-0.0291900634765625,
0.0085906982421875,
0.005290985107421875,
-0.0145263671875,
-0.035308837890625,
0.05352783203125,
-0.021240234375,
0.0197906494140625,
-0.0219879150390625,
-0.015899658203125,
0.015960693359375,
0.04718017578125,
0.0146942138671875,
0.05224609375,
0.04638671875,
-0.05645751953125,
0.02630615234375,
0.0213775634765625,
-0.0242156982421875,
0.0197906494140625,
-0.044677734375,
-0.0006566047668457031,
0.011566162109375,
0.023651123046875,
-0.09027099609375,
-0.024078369140625,
0.0256500244140625,
-0.04718017578125,
0.023773193359375,
-0.0157470703125,
-0.04595947265625,
-0.0287933349609375,
0.0005769729614257812,
0.0276336669921875,
0.0246429443359375,
-0.050750732421875,
0.03466796875,
0.0176239013671875,
0.0003654956817626953,
-0.044952392578125,
-0.06793212890625,
-0.004756927490234375,
-0.0108642578125,
-0.034912109375,
0.00942230224609375,
-0.0241546630859375,
0.01416778564453125,
-0.000017344951629638672,
0.0003998279571533203,
0.0007295608520507812,
0.01233673095703125,
0.0062408447265625,
0.02667236328125,
0.005382537841796875,
0.0040740966796875,
-0.01045989990234375,
-0.0302581787109375,
0.021026611328125,
-0.007175445556640625,
0.06329345703125,
-0.036376953125,
-0.0186920166015625,
-0.050079345703125,
-0.004726409912109375,
0.035308837890625,
-0.0006284713745117188,
0.054046630859375,
0.06390380859375,
-0.0273284912109375,
0.0014019012451171875,
-0.0330810546875,
-0.01392364501953125,
-0.038360595703125,
0.0307159423828125,
-0.02203369140625,
-0.04449462890625,
0.037689208984375,
0.01020050048828125,
-0.005825042724609375,
0.076416015625,
0.03704833984375,
0.00011372566223144531,
0.082763671875,
0.0187225341796875,
-0.01312255859375,
0.0197601318359375,
-0.0506591796875,
0.0189056396484375,
-0.047119140625,
-0.0270538330078125,
-0.035614013671875,
-0.0012731552124023438,
-0.05078125,
0.0011949539184570312,
0.011077880859375,
0.0001348257064819336,
-0.04296875,
0.037384033203125,
-0.072998046875,
0.00316619873046875,
0.0462646484375,
-0.0030841827392578125,
0.018646240234375,
-0.0003757476806640625,
-0.002223968505859375,
0.005096435546875,
-0.04913330078125,
-0.027740478515625,
0.07196044921875,
0.037322998046875,
0.04351806640625,
0.00885772705078125,
0.035675048828125,
0.01947021484375,
0.0189666748046875,
-0.052459716796875,
0.03472900390625,
0.00013375282287597656,
-0.0706787109375,
-0.0171051025390625,
-0.045501708984375,
-0.052001953125,
0.018829345703125,
-0.0230560302734375,
-0.0290679931640625,
0.015594482421875,
0.016357421875,
-0.0259857177734375,
0.04132080078125,
-0.061737060546875,
0.07672119140625,
-0.030426025390625,
-0.0172882080078125,
0.003124237060546875,
-0.0380859375,
0.0296478271484375,
0.0043792724609375,
0.0193939208984375,
-0.014312744140625,
0.010833740234375,
0.06463623046875,
-0.035980224609375,
0.06072998046875,
-0.02581787109375,
-0.00939178466796875,
0.028961181640625,
-0.018310546875,
0.02490234375,
0.01204681396484375,
-0.00182342529296875,
0.041900634765625,
0.01415252685546875,
-0.0273284912109375,
-0.0259857177734375,
0.046600341796875,
-0.0743408203125,
-0.0274505615234375,
-0.07525634765625,
-0.031890869140625,
0.0138092041015625,
0.02801513671875,
0.043975830078125,
0.0266571044921875,
0.0141448974609375,
-0.0016298294067382812,
0.0361328125,
-0.02423095703125,
0.0513916015625,
0.034576416015625,
-0.02056884765625,
-0.042327880859375,
0.07159423828125,
0.007781982421875,
0.0219573974609375,
0.016693115234375,
0.031005859375,
-0.0438232421875,
-0.0239715576171875,
-0.041473388671875,
0.00563812255859375,
-0.050048828125,
-0.02984619140625,
-0.065185546875,
-0.045806884765625,
-0.045166015625,
0.0060272216796875,
-0.035186767578125,
-0.0246124267578125,
-0.028778076171875,
-0.00368499755859375,
0.036376953125,
0.01715087890625,
-0.01287841796875,
0.03619384765625,
-0.05535888671875,
0.013031005859375,
0.0198516845703125,
0.029876708984375,
0.0076446533203125,
-0.07025146484375,
-0.0095672607421875,
0.0008668899536132812,
-0.016845703125,
-0.04913330078125,
0.05419921875,
0.00635528564453125,
0.036285400390625,
0.031219482421875,
0.0091552734375,
0.058319091796875,
0.0016298294067382812,
0.059326171875,
0.0133209228515625,
-0.08282470703125,
0.029693603515625,
-0.00418853759765625,
0.017181396484375,
0.03985595703125,
0.0321044921875,
-0.038055419921875,
-0.037933349609375,
-0.06854248046875,
-0.08074951171875,
0.04986572265625,
0.0296478271484375,
-0.0030422210693359375,
0.00437164306640625,
0.03277587890625,
-0.0106353759765625,
0.00817108154296875,
-0.07904052734375,
-0.036376953125,
-0.04339599609375,
-0.03143310546875,
0.004978179931640625,
-0.004085540771484375,
0.007663726806640625,
-0.045806884765625,
0.0706787109375,
-0.01010894775390625,
0.0300445556640625,
0.0305328369140625,
-0.0138702392578125,
0.0182342529296875,
0.01532745361328125,
0.044189453125,
0.0145111083984375,
-0.030792236328125,
0.01360321044921875,
0.01299285888671875,
-0.0452880859375,
0.0164031982421875,
0.006500244140625,
-0.01277923583984375,
0.003627777099609375,
0.0240631103515625,
0.05780029296875,
-0.00815582275390625,
-0.026275634765625,
0.041046142578125,
-0.02130126953125,
-0.0213623046875,
-0.0543212890625,
0.0255584716796875,
-0.01195526123046875,
0.00012803077697753906,
0.030242919921875,
0.0236663818359375,
0.022125244140625,
-0.034271240234375,
0.008941650390625,
0.0283050537109375,
-0.021209716796875,
-0.00507354736328125,
0.05950927734375,
-0.0011663436889648438,
-0.007221221923828125,
0.050811767578125,
-0.0283355712890625,
-0.045745849609375,
0.05621337890625,
0.030548095703125,
0.05963134765625,
-0.0043487548828125,
-0.009765625,
0.078125,
0.020416259765625,
-0.00806427001953125,
0.014495849609375,
0.0165863037109375,
-0.04119873046875,
-0.0196075439453125,
-0.05804443359375,
-0.0208587646484375,
0.0308990478515625,
-0.056121826171875,
0.0237274169921875,
-0.033935546875,
-0.0241546630859375,
0.0030498504638671875,
0.0114898681640625,
-0.0498046875,
0.040283203125,
0.01145172119140625,
0.055145263671875,
-0.0762939453125,
0.048614501953125,
0.03314208984375,
-0.050689697265625,
-0.07867431640625,
-0.0172576904296875,
-0.00719451904296875,
-0.06390380859375,
0.050384521484375,
0.02752685546875,
-0.001529693603515625,
0.01416015625,
-0.05340576171875,
-0.07867431640625,
0.0780029296875,
-0.0011339187622070312,
-0.055419921875,
-0.005725860595703125,
0.021820068359375,
0.037078857421875,
-0.0026378631591796875,
0.0560302734375,
0.0257568359375,
0.04461669921875,
-0.008514404296875,
-0.060394287109375,
0.0104217529296875,
-0.031097412109375,
-0.017730712890625,
-0.0028018951416015625,
-0.08282470703125,
0.07318115234375,
0.0036563873291015625,
-0.01568603515625,
-0.00222015380859375,
0.0460205078125,
0.021026611328125,
0.0162200927734375,
0.048736572265625,
0.0653076171875,
0.05511474609375,
-0.0193023681640625,
0.06195068359375,
-0.032073974609375,
0.07318115234375,
0.0689697265625,
0.00611114501953125,
0.0506591796875,
0.009307861328125,
-0.019195556640625,
0.053466796875,
0.058502197265625,
-0.0290985107421875,
0.0347900390625,
0.0079193115234375,
-0.006275177001953125,
-0.00907135009765625,
0.019134521484375,
-0.041748046875,
0.038177490234375,
0.0236053466796875,
-0.0211334228515625,
-0.004180908203125,
-0.004077911376953125,
0.01042938232421875,
-0.021209716796875,
-0.0078582763671875,
0.04913330078125,
-0.012542724609375,
-0.0445556640625,
0.060516357421875,
-0.00592803955078125,
0.07763671875,
-0.04595947265625,
0.00080108642578125,
0.004741668701171875,
0.027130126953125,
-0.031219482421875,
-0.05364990234375,
0.0264434814453125,
-0.0101470947265625,
-0.010833740234375,
-0.0059661865234375,
0.055084228515625,
-0.03265380859375,
-0.046600341796875,
0.02752685546875,
0.0110321044921875,
0.01473236083984375,
-0.01313018798828125,
-0.0721435546875,
-0.00638580322265625,
-0.004962921142578125,
-0.0270843505859375,
0.014923095703125,
0.00891876220703125,
0.029205322265625,
0.047027587890625,
0.06640625,
-0.00699615478515625,
0.01306915283203125,
-0.00644683837890625,
0.05194091796875,
-0.05157470703125,
-0.038177490234375,
-0.07275390625,
0.029327392578125,
-0.0167388916015625,
-0.036346435546875,
0.062286376953125,
0.0728759765625,
0.0753173828125,
-0.0040435791015625,
0.07086181640625,
-0.017669677734375,
0.032806396484375,
-0.038360595703125,
0.0721435546875,
-0.0433349609375,
0.00650787353515625,
-0.02313232421875,
-0.03973388671875,
-0.01068878173828125,
0.0745849609375,
-0.00798797607421875,
0.01500701904296875,
0.04840087890625,
0.051849365234375,
-0.00307464599609375,
0.005889892578125,
0.00249481201171875,
0.01459503173828125,
0.032806396484375,
0.03057861328125,
0.048797607421875,
-0.068115234375,
0.042266845703125,
-0.0347900390625,
-0.014556884765625,
-0.015960693359375,
-0.06512451171875,
-0.06768798828125,
-0.029541015625,
-0.03814697265625,
-0.048675537109375,
-0.025634765625,
0.061279296875,
0.0625,
-0.0684814453125,
-0.01189422607421875,
-0.0302886962890625,
-0.017791748046875,
-0.00928497314453125,
-0.0289154052734375,
0.056396484375,
-0.03509521484375,
-0.0706787109375,
-0.007411956787109375,
-0.00649261474609375,
0.0290985107421875,
-0.0288848876953125,
-0.0032291412353515625,
-0.028656005859375,
-0.01280975341796875,
0.032470703125,
0.01421356201171875,
-0.04302978515625,
-0.02783203125,
-0.00531005859375,
-0.02227783203125,
0.0107421875,
0.021240234375,
-0.0318603515625,
0.0254364013671875,
0.044525146484375,
0.03167724609375,
0.0445556640625,
-0.01494598388671875,
0.0099334716796875,
-0.0440673828125,
0.02972412109375,
0.01273345947265625,
0.031585693359375,
0.021026611328125,
-0.03411865234375,
0.041595458984375,
0.0469970703125,
-0.05889892578125,
-0.0584716796875,
-0.00571441650390625,
-0.0733642578125,
-0.022186279296875,
0.08087158203125,
-0.01317596435546875,
-0.018463134765625,
-0.0175018310546875,
-0.0152740478515625,
0.033416748046875,
-0.03155517578125,
0.063232421875,
0.059051513671875,
-0.006847381591796875,
-0.005794525146484375,
-0.033477783203125,
0.038818359375,
0.03985595703125,
-0.0726318359375,
-0.0018625259399414062,
0.0078277587890625,
0.041290283203125,
0.039520263671875,
0.037200927734375,
-0.006610870361328125,
-0.01403045654296875,
0.0118255615234375,
0.0222930908203125,
-0.005863189697265625,
-0.015869140625,
-0.00579833984375,
-0.0027923583984375,
-0.0164031982421875,
-0.01172637939453125
]
] |
d4data/biomedical-ner-all | 2023-07-02T07:28:28.000Z | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"token-classification",
"Token Classification",
"en",
"license:apache-2.0",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | d4data | null | null | d4data/biomedical-ner-all | 100 | 74,214 | transformers | 2022-06-19T14:04:18 | ---
license: apache-2.0
language:
- en
tags:
- Token Classification
co2_eq_emissions: 0.0279399890043426
widget:
- text: "CASE: A 28-year-old previously healthy man presented with a 6-week history of palpitations.
The symptoms occurred during rest, 2–3 times per week, lasted up to 30 minutes at a time and were associated with dyspnea.
Except for a grade 2/6 holosystolic tricuspid regurgitation murmur (best heard at the left sternal border with inspiratory accentuation), physical examination yielded unremarkable findings."
example_title: "example 1"
- text: "A 63-year-old woman with no known cardiac history presented with a sudden onset of dyspnea requiring intubation and ventilatory support out of hospital.
She denied preceding symptoms of chest discomfort, palpitations, syncope or infection.
The patient was afebrile and normotensive, with a sinus tachycardia of 140 beats/min."
example_title: "example 2"
- text: "A 48 year-old female presented with vaginal bleeding and abnormal Pap smears.
Upon diagnosis of invasive non-keratinizing SCC of the cervix, she underwent a radical hysterectomy with salpingo-oophorectomy which demonstrated positive spread to the pelvic lymph nodes and the parametrium.
Pathological examination revealed that the tumour also extensively involved the lower uterine segment."
example_title: "example 3"
---
## About the Model
An English Named Entity Recognition (NER) model, trained on the MACCROBAT dataset to recognize 107 biomedical entity types in text corpora such as clinical case reports. The model was built on top of distilbert-base-uncased.
- Dataset: MACCROBAT https://figshare.com/articles/dataset/MACCROBAT2018/9764942
- Carbon emission: 0.0279399890043426 kg
- Training time: 30.16527 minutes
- GPU used: 1 x GeForce RTX 3060 Laptop GPU
Check out the tutorial video for an explanation of this model and the corresponding Python library: https://youtu.be/xpiDPdBpS18
## Usage
The easiest way is to use the Hugging Face Inference API; alternatively, load the model through the `pipeline` object offered by the Transformers library.
```python
from transformers import pipeline
from transformers import AutoTokenizer, AutoModelForTokenClassification
tokenizer = AutoTokenizer.from_pretrained("d4data/biomedical-ner-all")
model = AutoModelForTokenClassification.from_pretrained("d4data/biomedical-ner-all")
pipe = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple") # pass device=0 if using gpu
pipe("""The patient reported no recurrence of palpitations at follow-up 6 months after the ablation.""")
```
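With `aggregation_strategy="simple"`, the pipeline returns a list of dicts, one per detected entity span. A short sketch of post-processing that list, grouping confident predictions by entity type (the `entities` values below are hypothetical stand-ins for real pipeline output, and the entity names are illustrative):

```python
# Hypothetical output shape of the "ner" pipeline with
# aggregation_strategy="simple"; real entries come from pipe(...).
entities = [
    {"entity_group": "Sign_symptom", "word": "palpitations",
     "score": 0.98, "start": 38, "end": 50},
    {"entity_group": "Therapeutic_procedure", "word": "ablation",
     "score": 0.95, "start": 83, "end": 91},
]

# Keep only confident predictions and collect words per entity type.
by_type = {}
for ent in entities:
    if ent["score"] >= 0.5:
        by_type.setdefault(ent["entity_group"], []).append(ent["word"])

print(by_type)
# → {'Sign_symptom': ['palpitations'], 'Therapeutic_procedure': ['ablation']}
```

Grouping by `entity_group` like this makes it easy to, for example, extract all symptoms or procedures mentioned in a case report.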
## Author
This model is part of the research topic "AI in the Biomedical Field" conducted by Deepak John Reji and Shaina Raza. If you use this work (code, model, or dataset), please star the repository:
> https://github.com/dreji18/Bio-Epidemiology-NER
## You can support me here :)
<a href="https://www.buymeacoffee.com/deepakjohnreji" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a> | 3,060 | [
[
-0.026275634765625,
-0.0447998046875,
0.04486083984375,
0.0013780593872070312,
-0.0118255615234375,
0.002063751220703125,
0.0089111328125,
-0.04571533203125,
0.037353515625,
0.03790283203125,
-0.0399169921875,
-0.03875732421875,
-0.04437255859375,
0.030303955078125,
-0.01430511474609375,
0.08404541015625,
-0.0106658935546875,
0.00839996337890625,
-0.00942230224609375,
-0.004150390625,
-0.0157623291015625,
-0.040679931640625,
-0.06854248046875,
-0.031982421875,
0.03424072265625,
0.011688232421875,
0.0216827392578125,
0.036346435546875,
0.051544189453125,
0.02569580078125,
-0.0029850006103515625,
-0.009185791015625,
-0.0102996826171875,
-0.0171356201171875,
0.0009503364562988281,
-0.026397705078125,
-0.02337646484375,
0.00275421142578125,
0.040985107421875,
0.040191650390625,
-0.0157318115234375,
0.014984130859375,
-0.006381988525390625,
0.0285797119140625,
-0.015777587890625,
0.0154876708984375,
-0.039581298828125,
0.012237548828125,
-0.00241851806640625,
0.007236480712890625,
-0.0221099853515625,
-0.006496429443359375,
0.013397216796875,
-0.0377197265625,
0.020172119140625,
0.01593017578125,
0.09552001953125,
0.02410888671875,
-0.031158447265625,
-0.023590087890625,
-0.0322265625,
0.047943115234375,
-0.06439208984375,
0.03826904296875,
0.0283203125,
0.01273345947265625,
-0.01526641845703125,
-0.07354736328125,
-0.04736328125,
-0.0169219970703125,
-0.012298583984375,
0.005767822265625,
-0.0202178955078125,
0.005889892578125,
0.038360595703125,
0.03814697265625,
-0.043426513671875,
-0.004550933837890625,
-0.041778564453125,
-0.03973388671875,
0.02996826171875,
0.005222320556640625,
0.0278167724609375,
-0.017822265625,
-0.035858154296875,
0.009124755859375,
-0.01097869873046875,
0.011199951171875,
0.00444793701171875,
0.0081939697265625,
-0.02569580078125,
0.032989501953125,
-0.013458251953125,
0.043914794921875,
0.0191497802734375,
-0.0166778564453125,
0.0648193359375,
-0.017303466796875,
-0.0311431884765625,
0.01293182373046875,
0.08013916015625,
0.0026111602783203125,
0.005634307861328125,
0.004047393798828125,
-0.004192352294921875,
0.0022068023681640625,
0.016326904296875,
-0.09429931640625,
-0.03546142578125,
0.0311737060546875,
-0.036163330078125,
-0.0177764892578125,
-0.015289306640625,
-0.04998779296875,
-0.001590728759765625,
-0.01116943359375,
0.0208587646484375,
-0.049468994140625,
-0.0202484130859375,
0.009765625,
-0.007843017578125,
0.0168304443359375,
0.000015437602996826172,
-0.060333251953125,
0.0304107666015625,
0.02142333984375,
0.0751953125,
0.0023517608642578125,
-0.002719879150390625,
-0.0081634521484375,
0.0011777877807617188,
0.000457763671875,
0.04998779296875,
-0.0247650146484375,
-0.0256805419921875,
-0.031585693359375,
0.03582763671875,
-0.01055145263671875,
-0.033538818359375,
0.03778076171875,
-0.0193023681640625,
0.0194091796875,
-0.0021915435791015625,
-0.050994873046875,
-0.033782958984375,
0.0182037353515625,
-0.048492431640625,
0.0589599609375,
0.0147705078125,
-0.07891845703125,
0.030914306640625,
-0.056060791015625,
-0.032257080078125,
0.01363372802734375,
-0.01480865478515625,
-0.0638427734375,
-0.00824737548828125,
0.012115478515625,
0.040283203125,
-0.005268096923828125,
0.0220184326171875,
-0.03411865234375,
0.0016117095947265625,
0.02392578125,
-0.01885986328125,
0.0810546875,
0.0165557861328125,
-0.0208892822265625,
-0.0155029296875,
-0.0714111328125,
-0.019378662109375,
0.034515380859375,
-0.017181396484375,
-0.0232391357421875,
-0.0250091552734375,
0.003086090087890625,
0.01311492919921875,
0.01029205322265625,
-0.06719970703125,
0.0210723876953125,
-0.034698486328125,
0.05572509765625,
0.0283355712890625,
0.0233306884765625,
0.01064300537109375,
-0.02166748046875,
0.031158447265625,
0.0215301513671875,
0.01534271240234375,
0.0020732879638671875,
-0.047607421875,
-0.033203125,
-0.0552978515625,
0.04632568359375,
0.0458984375,
-0.037841796875,
0.025726318359375,
0.00960540771484375,
-0.040985107421875,
-0.052764892578125,
-0.00823974609375,
0.0172119140625,
0.0706787109375,
0.037261962890625,
-0.025665283203125,
-0.05389404296875,
-0.07794189453125,
0.01320648193359375,
-0.0151214599609375,
-0.0093994140625,
0.042633056640625,
0.0672607421875,
-0.0576171875,
0.058563232421875,
-0.044281005859375,
-0.0201416015625,
0.004741668701171875,
0.01107025146484375,
0.040771484375,
0.055633544921875,
0.04913330078125,
-0.0550537109375,
-0.020294189453125,
-0.009552001953125,
-0.06146240234375,
-0.0021266937255859375,
0.0024261474609375,
-0.0130767822265625,
0.00859832763671875,
0.01557159423828125,
-0.044677734375,
0.036163330078125,
0.0347900390625,
-0.0233306884765625,
0.039306640625,
-0.0167999267578125,
0.0185546875,
-0.10382080078125,
0.027923583984375,
0.00249481201171875,
-0.0272064208984375,
-0.039581298828125,
-0.003124237060546875,
0.0032978057861328125,
-0.0146484375,
-0.03704833984375,
0.04632568359375,
-0.0399169921875,
0.01425933837890625,
-0.01507568359375,
-0.0237274169921875,
0.0211181640625,
0.035186767578125,
0.0225372314453125,
0.0283660888671875,
0.039764404296875,
-0.050750732421875,
0.018829345703125,
0.047027587890625,
-0.0230712890625,
0.03173828125,
-0.059234619140625,
0.0019588470458984375,
0.0049591064453125,
0.0240478515625,
-0.0682373046875,
-0.0297698974609375,
0.042724609375,
-0.05413818359375,
0.03790283203125,
-0.007465362548828125,
-0.0261993408203125,
-0.049591064453125,
0.0110626220703125,
0.03741455078125,
0.04718017578125,
-0.040435791015625,
0.058380126953125,
0.045318603515625,
-0.00768280029296875,
-0.04205322265625,
-0.0704345703125,
-0.01331329345703125,
-0.0214080810546875,
-0.027069091796875,
0.0435791015625,
-0.0162200927734375,
0.002521514892578125,
0.0005793571472167969,
-0.00537872314453125,
-0.029541015625,
0.0110015869140625,
0.029296875,
0.0251007080078125,
0.003864288330078125,
0.01306915283203125,
0.0118255615234375,
-0.01255035400390625,
0.009979248046875,
0.003635406494140625,
0.036651611328125,
-0.0047760009765625,
-0.004871368408203125,
-0.0704345703125,
0.00955963134765625,
0.0404052734375,
-0.00865936279296875,
0.05859375,
0.040679931640625,
-0.06256103515625,
0.0133209228515625,
-0.04681396484375,
-0.0300445556640625,
-0.034088134765625,
0.0165557861328125,
-0.035552978515625,
-0.038970947265625,
0.0552978515625,
-0.007297515869140625,
-0.004947662353515625,
0.04473876953125,
0.04681396484375,
-0.003314971923828125,
0.07110595703125,
0.036712646484375,
-0.0119171142578125,
0.0164031982421875,
-0.026153564453125,
0.00812530517578125,
-0.070068359375,
-0.035064697265625,
-0.040679931640625,
-0.01128387451171875,
-0.03350830078125,
-0.013885498046875,
0.0079345703125,
-0.005184173583984375,
-0.028656005859375,
0.03753662109375,
-0.060302734375,
0.004711151123046875,
0.024993896484375,
0.0384521484375,
0.007259368896484375,
-0.005428314208984375,
-0.04266357421875,
0.007534027099609375,
-0.03912353515625,
-0.0274658203125,
0.061370849609375,
0.050048828125,
0.047698974609375,
-0.00725555419921875,
0.06304931640625,
0.008636474609375,
0.0254974365234375,
-0.055450439453125,
0.0205230712890625,
0.01306915283203125,
-0.062744140625,
-0.001964569091796875,
-0.035858154296875,
-0.084228515625,
-0.0166168212890625,
-0.0310821533203125,
-0.052947998046875,
0.029083251953125,
0.010589599609375,
-0.033355712890625,
0.0254364013671875,
-0.0399169921875,
0.06915283203125,
-0.0276641845703125,
-0.00890350341796875,
0.007595062255859375,
-0.048492431640625,
0.0145721435546875,
-0.00848388671875,
0.023284912109375,
-0.00962066650390625,
0.00933837890625,
0.06280517578125,
-0.0308380126953125,
0.059539794921875,
-0.0182952880859375,
0.0168304443359375,
0.01885986328125,
-0.007671356201171875,
0.01357269287109375,
0.0128021240234375,
-0.01058197021484375,
0.06683349609375,
0.0123443603515625,
-0.0308837890625,
-0.01361846923828125,
0.0482177734375,
-0.05914306640625,
-0.0171966552734375,
-0.0523681640625,
-0.0252838134765625,
0.0134735107421875,
0.0254974365234375,
0.036834716796875,
0.032318115234375,
-0.01511383056640625,
0.004596710205078125,
0.040313720703125,
-0.0302886962890625,
0.0261383056640625,
0.045989990234375,
-0.01296234130859375,
-0.043426513671875,
0.045379638671875,
0.001247406005859375,
-0.0006694793701171875,
0.02825927734375,
0.014251708984375,
-0.0308837890625,
-0.035186767578125,
-0.0360107421875,
0.040771484375,
-0.0272216796875,
-0.025299072265625,
-0.082275390625,
-0.032501220703125,
-0.04266357421875,
0.004119873046875,
-0.019744873046875,
-0.0029163360595703125,
-0.04974365234375,
-0.017730712890625,
0.04473876953125,
0.02935791015625,
-0.003875732421875,
0.02850341796875,
-0.05322265625,
0.01277923583984375,
-0.00804901123046875,
0.02166748046875,
-0.010467529296875,
-0.050262451171875,
-0.00514984130859375,
0.00396728515625,
-0.0201568603515625,
-0.08203125,
0.055999755859375,
0.00948333740234375,
0.05279541015625,
0.038055419921875,
-0.00849151611328125,
0.04693603515625,
-0.050140380859375,
0.04180908203125,
0.0215911865234375,
-0.06201171875,
0.049163818359375,
-0.001667022705078125,
0.004535675048828125,
0.047821044921875,
0.06109619140625,
-0.024688720703125,
-0.012451171875,
-0.06707763671875,
-0.06060791015625,
0.033233642578125,
0.01824951171875,
-0.0149688720703125,
-0.018951416015625,
0.052978515625,
0.01096343994140625,
0.0189208984375,
-0.07147216796875,
-0.022003173828125,
-0.00772857666015625,
-0.02532958984375,
0.0081024169921875,
-0.0083770751953125,
0.00218963623046875,
-0.040679931640625,
0.07293701171875,
-0.0014371871948242188,
0.036651611328125,
0.043914794921875,
-0.0097808837890625,
0.0138397216796875,
-0.001194000244140625,
0.0212860107421875,
0.022430419921875,
-0.0274658203125,
-0.0190277099609375,
0.01468658447265625,
-0.038360595703125,
0.0018682479858398438,
0.03076171875,
-0.005718231201171875,
0.00942230224609375,
0.01084136962890625,
0.06707763671875,
0.01024627685546875,
-0.038421630859375,
0.031402587890625,
-0.01459503173828125,
-0.02923583984375,
-0.047821044921875,
-0.00972747802734375,
0.00354766845703125,
0.017364501953125,
0.01239776611328125,
-0.00156402587890625,
0.0367431640625,
-0.029449462890625,
0.03375244140625,
0.03912353515625,
-0.035736083984375,
-0.0181884765625,
0.065185546875,
0.00951385498046875,
-0.028076171875,
0.06195068359375,
-0.01070404052734375,
-0.033843994140625,
0.06982421875,
0.044342041015625,
0.04644775390625,
-0.035491943359375,
0.00812530517578125,
0.057769775390625,
0.0111083984375,
0.0033397674560546875,
0.0184326171875,
0.00792694091796875,
-0.058380126953125,
-0.018707275390625,
-0.075927734375,
-0.0223846435546875,
0.0272064208984375,
-0.050384521484375,
0.035064697265625,
-0.03826904296875,
-0.031768798828125,
0.03570556640625,
0.01003265380859375,
-0.0604248046875,
0.031982421875,
0.010894775390625,
0.06011962890625,
-0.07281494140625,
0.054534912109375,
0.057220458984375,
-0.056854248046875,
-0.06451416015625,
-0.01666259765625,
-0.006481170654296875,
-0.07452392578125,
0.03521728515625,
0.03271484375,
0.024566650390625,
0.005916595458984375,
-0.0311737060546875,
-0.059600830078125,
0.0828857421875,
0.00646209716796875,
-0.07086181640625,
-0.005184173583984375,
-0.005992889404296875,
0.0369873046875,
-0.0379638671875,
0.04644775390625,
0.04132080078125,
0.0218505859375,
0.0115966796875,
-0.06915283203125,
0.0171966552734375,
-0.016510009765625,
-0.01229095458984375,
0.0142669677734375,
-0.0479736328125,
0.05426025390625,
-0.0246124267578125,
-0.007762908935546875,
0.007518768310546875,
0.034515380859375,
0.02105712890625,
0.044708251953125,
0.034576416015625,
0.0634765625,
0.0694580078125,
0.004489898681640625,
0.07757568359375,
-0.041778564453125,
0.042327880859375,
0.05853271484375,
-0.01861572265625,
0.04681396484375,
0.03192138671875,
-0.02056884765625,
0.06512451171875,
0.07025146484375,
-0.0234527587890625,
0.04156494140625,
0.01332855224609375,
-0.0147857666015625,
0.0015010833740234375,
0.00229644775390625,
-0.034088134765625,
0.03106689453125,
0.041046142578125,
-0.0606689453125,
0.0080413818359375,
0.0235443115234375,
0.002262115478515625,
-0.0307464599609375,
-0.017974853515625,
0.06597900390625,
0.0130767822265625,
-0.0264892578125,
0.049468994140625,
-0.01488494873046875,
0.046295166015625,
-0.0386962890625,
-0.01309967041015625,
0.01082611083984375,
0.0138702392578125,
-0.0178985595703125,
-0.029388427734375,
0.0200958251953125,
-0.0177154541015625,
-0.0308074951171875,
0.01861572265625,
0.0498046875,
-0.028778076171875,
-0.061370849609375,
0.02862548828125,
0.01727294921875,
0.047027587890625,
0.00574493408203125,
-0.087158203125,
0.000020205974578857422,
0.01024627685546875,
-0.0121612548828125,
0.01224517822265625,
0.0181121826171875,
0.01277923583984375,
0.04888916015625,
0.05133056640625,
0.00856781005859375,
-0.03082275390625,
-0.00995635986328125,
0.048492431640625,
-0.0496826171875,
-0.029296875,
-0.07159423828125,
0.024658203125,
-0.00807952880859375,
-0.0107421875,
0.032318115234375,
0.05670166015625,
0.05474853515625,
-0.0029163360595703125,
0.056304931640625,
-0.01293182373046875,
0.054473876953125,
-0.02569580078125,
0.06695556640625,
-0.04986572265625,
-0.0034008026123046875,
-0.0194091796875,
-0.04473876953125,
-0.02264404296875,
0.06878662109375,
-0.01047515869140625,
0.00931549072265625,
0.054656982421875,
0.05377197265625,
-0.013916015625,
-0.0074310302734375,
0.006374359130859375,
0.0452880859375,
0.01528167724609375,
0.031341552734375,
0.0166778564453125,
-0.037628173828125,
0.01494598388671875,
-0.03704833984375,
-0.03997802734375,
-0.0214080810546875,
-0.070556640625,
-0.0723876953125,
-0.0325927734375,
-0.05328369140625,
-0.050140380859375,
-0.00144195556640625,
0.07122802734375,
0.060302734375,
-0.07867431640625,
0.0162506103515625,
-0.00737762451171875,
-0.0156097412109375,
-0.0163116455078125,
-0.016815185546875,
0.0237579345703125,
0.007114410400390625,
-0.059661865234375,
0.002338409423828125,
0.00846099853515625,
0.0186767578125,
0.000054955482482910156,
0.00858306884765625,
-0.03521728515625,
0.0010690689086914062,
0.0256195068359375,
0.045257568359375,
-0.050445556640625,
-0.01268768310546875,
-0.01007080078125,
-0.0122833251953125,
0.022216796875,
0.0259246826171875,
-0.0615234375,
0.0263519287109375,
0.0225372314453125,
0.051422119140625,
0.048095703125,
-0.00518035888671875,
0.01555633544921875,
-0.0298614501953125,
0.01226043701171875,
0.019134521484375,
0.06402587890625,
0.0310516357421875,
-0.058074951171875,
0.039581298828125,
0.040496826171875,
-0.0703125,
-0.057373046875,
-0.0021839141845703125,
-0.08978271484375,
-0.019744873046875,
0.07440185546875,
-0.01165008544921875,
-0.042205810546875,
-0.00589752197265625,
-0.0218353271484375,
0.045745849609375,
-0.04290771484375,
0.054840087890625,
0.03790283203125,
-0.0290069580078125,
-0.0024261474609375,
-0.04071044921875,
0.0236358642578125,
0.0105133056640625,
-0.054290771484375,
-0.033935546875,
0.0294036865234375,
0.06549072265625,
0.01503753662109375,
0.060455322265625,
-0.01406097412109375,
-0.0007925033569335938,
0.00286865234375,
0.036224365234375,
-0.004367828369140625,
-0.010772705078125,
-0.026153564453125,
-0.00890350341796875,
-0.018707275390625,
-0.032989501953125
]
] |
microsoft/phi-1_5 | 2023-11-01T23:40:19.000Z | [
"transformers",
"pytorch",
"mixformer-sequential",
"text-generation",
"custom_code",
"en",
"arxiv:2309.05463",
"license:other",
"has_space",
"region:us"
] | text-generation | microsoft | null | null | microsoft/phi-1_5 | 995 | 74,214 | transformers | 2023-09-10T04:03:46 | ---
license: other
license_name: microsoft-research-license
license_link: https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx
language:
- en
pipeline_tag: text-generation
---
## Model Summary
The language model phi-1.5 is a Transformer with **1.3 billion** parameters. It was trained using the same data sources as [phi-1](https://huggingface.co/microsoft/phi-1), augmented with a new data source that consists of various NLP synthetic texts. When assessed against benchmarks testing common sense, language understanding, and logical reasoning, phi-1.5 demonstrates nearly state-of-the-art performance among models with fewer than 10 billion parameters.
We **did not** fine-tune phi-1.5 either for **instruction following or through reinforcement learning from human feedback**. The intention behind crafting this open-source model is to provide the research community with a non-restricted small model to explore vital safety challenges, such as reducing toxicity, understanding societal biases, enhancing controllability, and more.
For a safer model release, we exclude generic web-crawl data sources such as common-crawl from the training. This strategy prevents direct exposure to potentially harmful online content, enhancing the model's safety without RLHF. However, the model is still vulnerable to generating harmful content. We hope the model can help the research community to further study the safety of language models.
phi-1.5 can write poems, draft emails, create stories, summarize texts, write Python code (such as downloading a Hugging Face transformer model), etc.
## Intended Uses
Given the nature of the training data, phi-1.5 is best suited for prompts using the QA format, the chat format, and the code format. Note that phi-1.5, being a base model, often produces irrelevant text following the main answer. In the following example, we've truncated the answer for illustrative purposes only.
#### QA format:
```markdown
Write a detailed analogy between mathematics and a lighthouse.
Answer: Mathematics is like a lighthouse, guiding us through the vast ocean of numbers and calculations. Just as a lighthouse illuminates the darkness, mathematics provides us with a clear path to navigate through complex problems. It helps us make sense of the world around us, just like a lighthouse helps ships find their way home.
```
where the model generates the text after "Answer:".
#### Chat format:
```markdown
Alice: I don't know why, I'm struggling to maintain focus while studying. Any suggestions?
Bob: Have you tried using a timer? It can help you stay on track and avoid distractions.
Alice: That's a good idea. I'll give it a try.
Charlie: Another thing that can help is to break up your study sessions into smaller chunks. It's easier to concentrate on one thing at a time.
Alice: That makes sense. I'll try that too.
Bob: And don't forget to take breaks! It's important to give your brain a rest so you can come back to your studies with a fresh perspective.
Alice: Thanks for the advice, guys. I feel more motivated now.
Charlie: No problem, Alice. We're all in this together.
Bob: Yeah, and remember that it's okay to ask for help if you need it. We're here to support each other.
```
where the model generates the text after the first "Bob:".
#### Code format:
```python
def print_prime(n):
"""
Print all primes between 1 and n
"""
primes = []
for num in range(2, n+1):
is_prime = True
for i in range(2, int(math.sqrt(num))+1):
if num % i == 0:
is_prime = False
break
if is_prime:
primes.append(num)
print(primes)
```
where the model generates the text after the comments.
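Note that the generated snippet above depends on the `math` module, which the prompt does not import; a minimal standalone version (with the import added, and an illustrative call not present in the original) runs as follows:

```python
import math

def print_prime(n):
    """
    Print all primes between 1 and n
    """
    primes = []
    for num in range(2, n + 1):
        is_prime = True
        # Trial division up to the integer square root of num
        for i in range(2, int(math.sqrt(num)) + 1):
            if num % i == 0:
                is_prime = False
                break
        if is_prime:
            primes.append(num)
    print(primes)

print_prime(20)  # prints [2, 3, 5, 7, 11, 13, 17, 19]
```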
**Notes**
* phi-1.5 is intended for research purposes. The model-generated text/code should be treated as a starting point rather than a definitive solution for potential use cases. Users should be cautious when employing these models in their applications.
* Direct adoption for production tasks is out of the scope of this research project. As a result, phi-1.5 has not been tested to ensure that it performs adequately for any production-level application. Please refer to the limitation sections of this document for more details.
## Limitations of phi-1.5
* Generate Inaccurate Code and Facts: The model often produces incorrect code snippets and statements. Users should treat these outputs as suggestions or starting points, not as definitive or accurate solutions.
* Limited Scope for Code: If the model generates Python scripts that utilize uncommon packages or scripts in other languages, we strongly recommend users manually verify all API uses.
* Unreliable Responses to Instructions: The model has not undergone instruction fine-tuning. As a result, it may struggle or fail to adhere to intricate or nuanced instructions provided by users.
* Language Limitations: The model is primarily designed to understand standard English. Informal English, slang, or any other language outside of English might pose challenges to its comprehension, leading to potential misinterpretations or errors in response.
* Potential Societal Biases: Regardless of the safe data used for its training, the model is not entirely free from societal biases. There's a possibility it may generate content that mirrors these societal biases, particularly if prompted or instructed to do so. We urge users to be aware of this and to exercise caution and critical thinking when interpreting model outputs.
* Toxicity: Although the model is trained with carefully selected data, it can still produce harmful content if explicitly prompted or instructed to do so. We chose to release the model for research purposes only -- We hope to help the open-source community develop the most effective ways to reduce the toxicity of a model directly after pretraining.
## Training
### Model
* Architecture: a Transformer-based model with next-word prediction objective
* Dataset size: 30B tokens
* Training tokens: 150B tokens
* Precision: fp16
* GPUs: 32xA100-40G
* Training time: 8 days
### Software
* [PyTorch](https://github.com/pytorch/pytorch)
* [DeepSpeed](https://github.com/microsoft/DeepSpeed)
* [flash-attention](https://github.com/HazyResearch/flash-attention)
### License
The model is licensed under the [Research License](https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx).
### Sample Code
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
torch.set_default_device("cuda")
model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5", trust_remote_code=True)
inputs = tokenizer('''```python
def print_prime(n):
"""
Print all primes between 1 and n
"""''', return_tensors="pt", return_attention_mask=False)
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```
If you need to use the model in a lower precision (e.g., FP16), please wrap the model's forward pass with `torch.autocast()`, as follows:
```python
with torch.autocast(model.device.type, dtype=torch.float16, enabled=True):
outputs = model.generate(**inputs, max_length=200)
```
**Remark.** In the generation function, our model currently does not support beam search (`num_beams` > 1).
Furthermore, in the forward pass of the model, we currently do not support outputting hidden states or attention values, or using custom input embeddings (instead of the model's).
### Citation
You can find the paper at https://arxiv.org/abs/2309.05463
```bib
@article{textbooks2,
title={Textbooks Are All You Need II: \textbf{phi-1.5} technical report},
author={Li, Yuanzhi and Bubeck, S{\'e}bastien and Eldan, Ronen and Del Giorno, Allie and Gunasekar, Suriya and Lee, Yin Tat},
journal={arXiv preprint arXiv:2309.05463},
year={2023}
}
``` | 7,969 | [
[
-0.015594482421875,
-0.05145263671875,
0.01543426513671875,
0.0229339599609375,
-0.0220489501953125,
-0.040313720703125,
0.0009670257568359375,
-0.0257415771484375,
-0.0157928466796875,
0.0272369384765625,
-0.047332763671875,
-0.032562255859375,
-0.039947509765625,
-0.006298065185546875,
-0.0174102783203125,
0.0753173828125,
0.016021728515625,
0.002559661865234375,
0.01279449462890625,
0.01250457763671875,
-0.01418304443359375,
-0.03265380859375,
-0.044921875,
-0.03338623046875,
0.0248260498046875,
0.0264129638671875,
0.053070068359375,
0.049102783203125,
0.053070068359375,
0.021820068359375,
-0.00018584728240966797,
-0.016876220703125,
-0.04486083984375,
-0.012359619140625,
-0.0194549560546875,
-0.032318115234375,
-0.056884765625,
0.0195465087890625,
0.028533935546875,
0.04547119140625,
0.0029449462890625,
0.02374267578125,
0.0033664703369140625,
0.0474853515625,
-0.050811767578125,
0.029937744140625,
-0.0298309326171875,
0.01270294189453125,
-0.0162811279296875,
-0.00933837890625,
-0.042022705078125,
-0.0234527587890625,
0.038055419921875,
-0.056488037109375,
0.0206298828125,
0.0028934478759765625,
0.0718994140625,
0.0082244873046875,
-0.032501220703125,
-0.00716400146484375,
-0.04461669921875,
0.06561279296875,
-0.069091796875,
0.022705078125,
0.0447998046875,
0.01056671142578125,
0.006649017333984375,
-0.0697021484375,
-0.037017822265625,
-0.0186309814453125,
0.0013208389282226562,
0.013885498046875,
-0.0137939453125,
0.016448974609375,
0.0290985107421875,
0.047332763671875,
-0.0616455078125,
-0.00667572021484375,
-0.056427001953125,
-0.0176239013671875,
0.057464599609375,
0.0178070068359375,
0.0196990966796875,
-0.0296478271484375,
-0.0248260498046875,
-0.0207977294921875,
-0.04791259765625,
0.01080322265625,
0.0214691162109375,
0.01486968994140625,
-0.020538330078125,
0.04901123046875,
-0.0084228515625,
0.0345458984375,
0.005001068115234375,
-0.005825042724609375,
0.0193023681640625,
-0.01531219482421875,
-0.02813720703125,
-0.012054443359375,
0.0672607421875,
0.032928466796875,
0.0276641845703125,
-0.0036296844482421875,
0.001789093017578125,
0.01505279541015625,
0.009796142578125,
-0.0660400390625,
-0.048980712890625,
0.0200347900390625,
-0.039337158203125,
-0.03521728515625,
-0.0014867782592773438,
-0.05596923828125,
-0.01479339599609375,
0.0018835067749023438,
0.049774169921875,
-0.04974365234375,
-0.032958984375,
0.0282745361328125,
-0.0032253265380859375,
0.0186309814453125,
0.0086669921875,
-0.0797119140625,
0.02398681640625,
0.0217437744140625,
0.0628662109375,
-0.00395965576171875,
-0.031494140625,
-0.014068603515625,
-0.0030345916748046875,
-0.01398468017578125,
0.0285491943359375,
-0.01096343994140625,
-0.04022216796875,
-0.011932373046875,
0.009857177734375,
-0.01316070556640625,
-0.040252685546875,
0.03680419921875,
-0.0418701171875,
0.039031982421875,
-0.01270294189453125,
-0.055206298828125,
-0.02972412109375,
-0.0014019012451171875,
-0.0555419921875,
0.071044921875,
0.0181121826171875,
-0.06396484375,
0.0015115737915039062,
-0.05328369140625,
-0.023162841796875,
-0.0052490234375,
0.007152557373046875,
-0.048980712890625,
0.0013275146484375,
0.01373291015625,
0.021209716796875,
-0.0190277099609375,
0.0196380615234375,
-0.0288238525390625,
-0.02581787109375,
0.0191192626953125,
-0.046905517578125,
0.097412109375,
0.0245819091796875,
-0.05096435546875,
0.002941131591796875,
-0.05291748046875,
0.003208160400390625,
0.0265655517578125,
-0.02508544921875,
0.0069427490234375,
-0.0154876708984375,
0.025634765625,
0.018218994140625,
0.013763427734375,
-0.0325927734375,
0.02398681640625,
-0.046417236328125,
0.0301361083984375,
0.054901123046875,
0.004314422607421875,
0.0362548828125,
-0.048797607421875,
0.03839111328125,
0.0059356689453125,
0.017303466796875,
-0.0308685302734375,
-0.04638671875,
-0.067626953125,
-0.0018815994262695312,
0.0198822021484375,
0.046539306640625,
-0.029296875,
0.0406494140625,
-0.01491546630859375,
-0.05450439453125,
-0.0243377685546875,
-0.006282806396484375,
0.032958984375,
0.045318603515625,
0.032073974609375,
-0.0211639404296875,
-0.037109375,
-0.045806884765625,
-0.0031986236572265625,
-0.00943756103515625,
0.0003781318664550781,
0.016571044921875,
0.0506591796875,
-0.005008697509765625,
0.0673828125,
-0.04864501953125,
-0.023956298828125,
-0.0198974609375,
0.009063720703125,
0.01486968994140625,
0.06622314453125,
0.032989501953125,
-0.0576171875,
-0.034393310546875,
-0.0298309326171875,
-0.06378173828125,
0.0069427490234375,
-0.00909423828125,
-0.00545501708984375,
-0.0005660057067871094,
0.0159149169921875,
-0.049652099609375,
0.045318603515625,
0.019744873046875,
-0.03436279296875,
0.04864501953125,
-0.00848388671875,
-0.0088043212890625,
-0.071533203125,
0.0277099609375,
0.01009368896484375,
0.007350921630859375,
-0.050811767578125,
0.00855255126953125,
0.010528564453125,
-0.0303192138671875,
-0.0504150390625,
0.059173583984375,
-0.0143280029296875,
0.0176849365234375,
-0.010833740234375,
0.006336212158203125,
0.0113677978515625,
0.05511474609375,
0.0034999847412109375,
0.05694580078125,
0.062469482421875,
-0.0570068359375,
0.0221710205078125,
0.0177764892578125,
-0.021942138671875,
0.005542755126953125,
-0.06671142578125,
0.002361297607421875,
-0.00004750490188598633,
0.01038360595703125,
-0.0706787109375,
-0.0087738037109375,
0.037384033203125,
-0.039703369140625,
0.017120361328125,
-0.001995086669921875,
-0.0311126708984375,
-0.0262908935546875,
-0.02386474609375,
0.044036865234375,
0.041168212890625,
-0.021820068359375,
0.0394287109375,
0.022216796875,
-0.00762176513671875,
-0.03277587890625,
-0.0618896484375,
-0.034515380859375,
-0.02093505859375,
-0.059783935546875,
0.0308685302734375,
-0.010162353515625,
-0.015899658203125,
-0.00457763671875,
-0.00940704345703125,
-0.0062255859375,
0.0031108856201171875,
0.0192108154296875,
0.03594970703125,
-0.013885498046875,
0.01837158203125,
-0.00954437255859375,
-0.0296478271484375,
0.00975799560546875,
-0.0297393798828125,
0.039459228515625,
-0.0225982666015625,
-0.02105712890625,
-0.045074462890625,
0.0121612548828125,
0.056427001953125,
-0.0313720703125,
0.05096435546875,
0.03369140625,
-0.04412841796875,
0.00818634033203125,
-0.03369140625,
-0.0382080078125,
-0.0406494140625,
0.0328369140625,
-0.01666259765625,
-0.031646728515625,
0.041961669921875,
-0.000843048095703125,
0.0161590576171875,
0.046142578125,
0.035003662109375,
0.002796173095703125,
0.07647705078125,
0.056854248046875,
-0.0002338886260986328,
0.04144287109375,
-0.06195068359375,
0.018646240234375,
-0.057220458984375,
-0.0162506103515625,
-0.013397216796875,
-0.00965118408203125,
-0.028961181640625,
-0.0207061767578125,
0.0293426513671875,
0.017425537109375,
-0.040069580078125,
0.023223876953125,
-0.043060302734375,
0.01617431640625,
0.051239013671875,
0.0258941650390625,
0.01436614990234375,
-0.0051727294921875,
-0.00811004638671875,
-0.0291900634765625,
-0.061798095703125,
-0.03448486328125,
0.10235595703125,
0.039794921875,
0.060211181640625,
0.0078125,
0.041412353515625,
0.003490447998046875,
0.028533935546875,
-0.049102783203125,
0.036651611328125,
-0.00013017654418945312,
-0.08319091796875,
-0.032073974609375,
-0.0307464599609375,
-0.071044921875,
0.0284576416015625,
-0.0119781494140625,
-0.06494140625,
0.01239776611328125,
0.007343292236328125,
-0.050689697265625,
0.0225830078125,
-0.056182861328125,
0.0948486328125,
-0.02386474609375,
-0.0460205078125,
-0.0107574462890625,
-0.038909912109375,
0.0328369140625,
0.0103607177734375,
0.0033721923828125,
-0.0183258056640625,
0.015838623046875,
0.060211181640625,
-0.0301055908203125,
0.06768798828125,
-0.0002448558807373047,
0.005214691162109375,
0.031646728515625,
0.004825592041015625,
0.0281982421875,
0.007053375244140625,
-0.0203094482421875,
0.004467010498046875,
0.0012292861938476562,
-0.036102294921875,
-0.0289154052734375,
0.042205810546875,
-0.06787109375,
-0.0303497314453125,
-0.04913330078125,
-0.053375244140625,
0.00981903076171875,
0.0338134765625,
0.06365966796875,
0.047607421875,
0.004146575927734375,
0.0093231201171875,
0.06536865234375,
-0.0250396728515625,
0.035308837890625,
0.037933349609375,
-0.036163330078125,
-0.027587890625,
0.06842041015625,
0.022308349609375,
0.01226043701171875,
0.01690673828125,
0.0211334228515625,
-0.036651611328125,
-0.023712158203125,
-0.024749755859375,
0.024322509765625,
-0.05767822265625,
-0.0099639892578125,
-0.03875732421875,
-0.0244293212890625,
-0.0499267578125,
-0.00946807861328125,
-0.0343017578125,
-0.0293426513671875,
-0.038970947265625,
-0.01149749755859375,
0.02374267578125,
0.0306243896484375,
-0.00959014892578125,
0.0266876220703125,
-0.054046630859375,
0.0155792236328125,
0.0272979736328125,
0.03472900390625,
0.005817413330078125,
-0.052734375,
-0.014190673828125,
0.0024127960205078125,
-0.0521240234375,
-0.0848388671875,
0.0369873046875,
0.00865936279296875,
0.0262451171875,
0.04022216796875,
0.028900146484375,
0.034393310546875,
-0.01256561279296875,
0.0654296875,
0.01395416259765625,
-0.076416015625,
0.050506591796875,
-0.03826904296875,
0.036834716796875,
0.0293426513671875,
0.032135009765625,
-0.0163726806640625,
-0.0162200927734375,
-0.0653076171875,
-0.06463623046875,
0.053680419921875,
0.0321044921875,
0.0032958984375,
0.01181793212890625,
0.0220184326171875,
-0.00951385498046875,
0.00782012939453125,
-0.08917236328125,
-0.043701171875,
-0.0287322998046875,
-0.0114898681640625,
0.00453948974609375,
-0.00849151611328125,
0.0013980865478515625,
-0.03582763671875,
0.06744384765625,
0.00678253173828125,
0.022674560546875,
0.019378662109375,
-0.0180816650390625,
-0.02056884765625,
0.0101776123046875,
0.03167724609375,
0.0601806640625,
-0.0218963623046875,
0.007415771484375,
0.0126190185546875,
-0.052215576171875,
0.0035762786865234375,
0.035675048828125,
-0.019439697265625,
-0.005146026611328125,
0.012542724609375,
0.048797607421875,
0.034637451171875,
-0.046173095703125,
0.03704833984375,
0.007335662841796875,
-0.0072479248046875,
-0.032562255859375,
0.01203155517578125,
0.007049560546875,
0.0114288330078125,
0.03887939453125,
0.004573822021484375,
0.00937652587890625,
-0.04180908203125,
0.022918701171875,
0.01322174072265625,
-0.007114410400390625,
-0.01739501953125,
0.07421875,
0.0222930908203125,
-0.0255584716796875,
0.06427001953125,
-0.021759033203125,
-0.037384033203125,
0.056182861328125,
0.03826904296875,
0.053955078125,
-0.01142120361328125,
0.00893402099609375,
0.04364013671875,
0.0290985107421875,
0.0029888153076171875,
0.0233917236328125,
0.01155853271484375,
-0.04168701171875,
-0.012451171875,
-0.029449462890625,
-0.0216064453125,
0.0277099609375,
-0.052154541015625,
0.022613525390625,
-0.039794921875,
-0.0279083251953125,
-0.00525665283203125,
0.0298919677734375,
-0.05010986328125,
0.0172271728515625,
0.010986328125,
0.0771484375,
-0.064208984375,
0.07305908203125,
0.03851318359375,
-0.05767822265625,
-0.0721435546875,
-0.009613037109375,
-0.0017681121826171875,
-0.03204345703125,
0.03192138671875,
0.028656005859375,
0.01324462890625,
0.021087646484375,
-0.041229248046875,
-0.072021484375,
0.0916748046875,
0.03826904296875,
-0.0294342041015625,
-0.016815185546875,
0.01123809814453125,
0.031341552734375,
-0.0037441253662109375,
0.0262451171875,
0.0166168212890625,
0.029266357421875,
-0.0146636962890625,
-0.07440185546875,
0.002162933349609375,
-0.03985595703125,
-0.00876617431640625,
-0.01293182373046875,
-0.04949951171875,
0.09136962890625,
-0.0265350341796875,
-0.026214599609375,
0.01125335693359375,
0.05596923828125,
0.0291290283203125,
0.03228759765625,
0.0090179443359375,
0.044921875,
0.0577392578125,
-0.01009368896484375,
0.069580078125,
-0.0501708984375,
0.05633544921875,
0.07269287109375,
0.0028629302978515625,
0.051116943359375,
0.0228271484375,
-0.0270233154296875,
0.027008056640625,
0.045196533203125,
-0.01800537109375,
0.045013427734375,
0.01116180419921875,
-0.01389312744140625,
-0.00882720947265625,
0.01114654541015625,
-0.0548095703125,
0.00928497314453125,
0.01995849609375,
-0.0379638671875,
-0.01727294921875,
0.012176513671875,
-0.0032958984375,
-0.01395416259765625,
-0.01555633544921875,
0.0208587646484375,
0.0153350830078125,
-0.048828125,
0.051300048828125,
0.007354736328125,
0.056640625,
-0.054931640625,
0.0028171539306640625,
-0.0092926025390625,
0.035980224609375,
-0.015960693359375,
-0.039306640625,
0.00792694091796875,
-0.01654052734375,
-0.0207977294921875,
-0.01085662841796875,
0.049896240234375,
-0.0257415771484375,
-0.039703369140625,
0.01076507568359375,
0.033721923828125,
0.01190185546875,
-0.01226043701171875,
-0.0611572265625,
0.0111846923828125,
-0.007843017578125,
-0.034912109375,
0.018829345703125,
0.00997161865234375,
0.0264739990234375,
0.06494140625,
0.0421142578125,
-0.0131988525390625,
0.0297393798828125,
-0.0260009765625,
0.06011962890625,
-0.04278564453125,
-0.041839599609375,
-0.048431396484375,
0.0543212890625,
0.007282257080078125,
-0.057830810546875,
0.06622314453125,
0.05126953125,
0.0716552734375,
-0.01258087158203125,
0.055816650390625,
-0.00812530517578125,
0.0187835693359375,
-0.0196380615234375,
0.0673828125,
-0.02777099609375,
0.0238189697265625,
-0.019378662109375,
-0.0599365234375,
-0.0089569091796875,
0.07806396484375,
-0.039093017578125,
0.01229095458984375,
0.055328369140625,
0.05963134765625,
-0.01312255859375,
0.0191802978515625,
0.012603759765625,
0.0062713623046875,
0.0187530517578125,
0.056243896484375,
0.032470703125,
-0.06951904296875,
0.057220458984375,
-0.06427001953125,
-0.019866943359375,
-0.0172882080078125,
-0.044158935546875,
-0.0645751953125,
-0.053741455078125,
-0.0307464599609375,
-0.0374755859375,
-0.005023956298828125,
0.070556640625,
0.066162109375,
-0.057098388671875,
-0.0188446044921875,
-0.024139404296875,
-0.0032596588134765625,
-0.021209716796875,
-0.024444580078125,
0.039947509765625,
-0.0173187255859375,
-0.061187744140625,
0.009246826171875,
-0.0022678375244140625,
0.0175018310546875,
-0.03326416015625,
-0.0148773193359375,
-0.0197906494140625,
-0.0000814199447631836,
0.02490234375,
0.03057861328125,
-0.044586181640625,
-0.00185394287109375,
0.02056884765625,
-0.031646728515625,
-0.007625579833984375,
0.049957275390625,
-0.054779052734375,
0.04034423828125,
0.03668212890625,
0.02789306640625,
0.050323486328125,
-0.01910400390625,
0.03662109375,
-0.01922607421875,
0.01126861572265625,
0.035125732421875,
0.034271240234375,
0.01322174072265625,
-0.033966064453125,
0.038818359375,
0.0287322998046875,
-0.044921875,
-0.057159423828125,
0.016845703125,
-0.0654296875,
-0.0219268798828125,
0.07733154296875,
-0.029937744140625,
-0.01015472412109375,
-0.0141448974609375,
-0.0165252685546875,
0.032989501953125,
-0.020477294921875,
0.07672119140625,
0.04486083984375,
-0.01087188720703125,
-0.02374267578125,
-0.04144287109375,
0.035430908203125,
0.044586181640625,
-0.045013427734375,
0.00040030479431152344,
0.0229034423828125,
0.031890869140625,
0.018798828125,
0.06591796875,
-0.01220703125,
0.0489501953125,
0.0163726806640625,
-0.00777435302734375,
-0.0128326416015625,
-0.01277923583984375,
-0.0323486328125,
0.01360321044921875,
-0.01233673095703125,
-0.017242431640625
]
] |
timm/tf_efficientnet_b0.ns_jft_in1k | 2023-04-27T21:17:12.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1905.11946",
"arxiv:1911.04252",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/tf_efficientnet_b0.ns_jft_in1k | 0 | 74,189 | timm | 2022-12-13T00:01:33 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for tf_efficientnet_b0.ns_jft_in1k
An EfficientNet image classification model. Trained on ImageNet-1k and unlabeled JFT-300M using Noisy Student semi-supervised learning in TensorFlow by the paper authors, and ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 5.3
- GMACs: 0.4
- Activations (M): 6.7
- Image size: 224 x 224
- **Papers:**
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks: https://arxiv.org/abs/1905.11946
- Self-training with Noisy Student improves ImageNet classification: https://arxiv.org/abs/1911.04252
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('tf_efficientnet_b0.ns_jft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnet_b0.ns_jft_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 24, 56, 56])
# torch.Size([1, 40, 28, 28])
# torch.Size([1, 112, 14, 14])
# torch.Size([1, 320, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnet_b0.ns_jft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2019efficientnet,
title={Efficientnet: Rethinking model scaling for convolutional neural networks},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={6105--6114},
year={2019},
organization={PMLR}
}
```
```bibtex
@article{Xie2019SelfTrainingWN,
title={Self-Training With Noisy Student Improves ImageNet Classification},
author={Qizhe Xie and Eduard H. Hovy and Minh-Thang Luong and Quoc V. Le},
journal={2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2019},
pages={10684-10695}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,597 | [
[
-0.0296630859375,
-0.04278564453125,
-0.00662994384765625,
0.00893402099609375,
-0.018402099609375,
-0.02850341796875,
-0.0254058837890625,
-0.0306549072265625,
0.01218414306640625,
0.026641845703125,
-0.026031494140625,
-0.042083740234375,
-0.0545654296875,
-0.011138916015625,
-0.0192108154296875,
0.06512451171875,
-0.006076812744140625,
0.00012612342834472656,
-0.0119781494140625,
-0.040740966796875,
-0.0025177001953125,
-0.01666259765625,
-0.070556640625,
-0.03253173828125,
0.0245208740234375,
0.0183258056640625,
0.040313720703125,
0.050506591796875,
0.04888916015625,
0.0391845703125,
-0.007427215576171875,
0.003505706787109375,
-0.0211029052734375,
-0.0078277587890625,
0.0308990478515625,
-0.040435791015625,
-0.026580810546875,
0.01245880126953125,
0.056610107421875,
0.0380859375,
-0.003070831298828125,
0.036346435546875,
0.012237548828125,
0.04461669921875,
-0.023773193359375,
0.01519775390625,
-0.0276641845703125,
0.01387786865234375,
-0.0056915283203125,
0.01079559326171875,
-0.022308349609375,
-0.0229949951171875,
0.0174560546875,
-0.041900634765625,
0.038330078125,
-0.006542205810546875,
0.09710693359375,
0.022979736328125,
-0.0061187744140625,
0.0009608268737792969,
-0.015777587890625,
0.055908203125,
-0.062286376953125,
0.018035888671875,
0.01074981689453125,
0.0145721435546875,
-0.00751495361328125,
-0.08251953125,
-0.035400390625,
-0.01461029052734375,
-0.017608642578125,
-0.00782012939453125,
-0.025390625,
0.01137542724609375,
0.027618408203125,
0.022125244140625,
-0.0306854248046875,
0.006404876708984375,
-0.042236328125,
-0.01554107666015625,
0.044036865234375,
-0.001583099365234375,
0.016998291015625,
-0.0118255615234375,
-0.03314208984375,
-0.0347900390625,
-0.0176239013671875,
0.0291748046875,
0.01861572265625,
0.02044677734375,
-0.038818359375,
0.0222015380859375,
0.004375457763671875,
0.051605224609375,
0.0132598876953125,
-0.029205322265625,
0.048126220703125,
0.0037746429443359375,
-0.035675048828125,
-0.01007080078125,
0.08551025390625,
0.025543212890625,
0.0183563232421875,
0.00384521484375,
-0.0120086669921875,
-0.034759521484375,
-0.001514434814453125,
-0.0968017578125,
-0.0288238525390625,
0.02294921875,
-0.05218505859375,
-0.033782958984375,
0.01160430908203125,
-0.04193115234375,
-0.006298065185546875,
-0.006397247314453125,
0.052001953125,
-0.02825927734375,
-0.036956787109375,
-0.0006208419799804688,
-0.0118408203125,
0.01032257080078125,
0.02191162109375,
-0.0408935546875,
0.0115509033203125,
0.017425537109375,
0.0845947265625,
0.005702972412109375,
-0.033447265625,
-0.014373779296875,
-0.0335693359375,
-0.020355224609375,
0.027618408203125,
-0.0009608268737792969,
-0.0018339157104492188,
-0.02471923828125,
0.025054931640625,
-0.0119171142578125,
-0.05413818359375,
0.0224761962890625,
-0.0163116455078125,
0.0113525390625,
0.006458282470703125,
-0.02191162109375,
-0.041046142578125,
0.0211944580078125,
-0.035400390625,
0.08770751953125,
0.0274200439453125,
-0.06396484375,
0.0205230712890625,
-0.04071044921875,
-0.011505126953125,
-0.019012451171875,
0.0037746429443359375,
-0.08599853515625,
-0.00676727294921875,
0.0142669677734375,
0.06805419921875,
-0.0171356201171875,
0.011505126953125,
-0.0472412109375,
-0.0188140869140625,
0.022430419921875,
-0.0087432861328125,
0.081787109375,
0.0217437744140625,
-0.035858154296875,
0.0207977294921875,
-0.0482177734375,
0.0164794921875,
0.03680419921875,
-0.0189208984375,
-0.0032100677490234375,
-0.047149658203125,
0.01171875,
0.020721435546875,
0.01114654541015625,
-0.038970947265625,
0.01450347900390625,
-0.0114593505859375,
0.037750244140625,
0.04638671875,
-0.00988006591796875,
0.031158447265625,
-0.0253448486328125,
0.0181884765625,
0.0214080810546875,
0.018402099609375,
-0.004222869873046875,
-0.030792236328125,
-0.0633544921875,
-0.038970947265625,
0.025604248046875,
0.0187530517578125,
-0.037322998046875,
0.0307464599609375,
-0.015289306640625,
-0.060699462890625,
-0.033721923828125,
0.00974273681640625,
0.0325927734375,
0.0533447265625,
0.0263519287109375,
-0.0258026123046875,
-0.031707763671875,
-0.06842041015625,
-0.0008287429809570312,
-0.0012798309326171875,
0.0025787353515625,
0.0233001708984375,
0.04486083984375,
-0.0014743804931640625,
0.0406494140625,
-0.0249786376953125,
-0.024017333984375,
-0.01605224609375,
0.0086517333984375,
0.034881591796875,
0.06378173828125,
0.05841064453125,
-0.046661376953125,
-0.04400634765625,
-0.01558685302734375,
-0.07110595703125,
0.007343292236328125,
-0.0117950439453125,
-0.01287078857421875,
0.01419830322265625,
0.0210418701171875,
-0.039398193359375,
0.037811279296875,
0.01776123046875,
-0.0279998779296875,
0.027587890625,
-0.016326904296875,
0.0157623291015625,
-0.0814208984375,
0.00799560546875,
0.033447265625,
-0.0164337158203125,
-0.040374755859375,
0.0097808837890625,
0.00745391845703125,
-0.0013093948364257812,
-0.03497314453125,
0.044525146484375,
-0.042999267578125,
-0.007205963134765625,
-0.01070404052734375,
-0.0258331298828125,
-0.00004398822784423828,
0.057708740234375,
-0.0092926025390625,
0.0311431884765625,
0.063720703125,
-0.035186767578125,
0.031036376953125,
0.01861572265625,
-0.015960693359375,
0.0271453857421875,
-0.0555419921875,
0.00848388671875,
0.0025234222412109375,
0.0191650390625,
-0.0755615234375,
-0.016143798828125,
0.02410888671875,
-0.043426513671875,
0.051666259765625,
-0.0399169921875,
-0.031707763671875,
-0.038421630859375,
-0.02874755859375,
0.0291748046875,
0.046356201171875,
-0.060791015625,
0.03399658203125,
0.0192108154296875,
0.0295867919921875,
-0.044342041015625,
-0.06689453125,
-0.0201873779296875,
-0.0311126708984375,
-0.0584716796875,
0.023345947265625,
0.01015472412109375,
0.01001739501953125,
0.00913238525390625,
-0.0020771026611328125,
-0.01117706298828125,
0.003826141357421875,
0.0382080078125,
0.0206756591796875,
-0.02142333984375,
0.00202178955078125,
-0.0199737548828125,
0.0042572021484375,
0.0085296630859375,
-0.0274200439453125,
0.035186767578125,
-0.02587890625,
-0.0009126663208007812,
-0.0621337890625,
-0.00453948974609375,
0.034423828125,
0.00021350383758544922,
0.061798095703125,
0.08941650390625,
-0.034210205078125,
-0.00829315185546875,
-0.0311126708984375,
-0.0219268798828125,
-0.03851318359375,
0.0396728515625,
-0.02569580078125,
-0.046112060546875,
0.058990478515625,
-0.004344940185546875,
0.00843048095703125,
0.055908203125,
0.0263519287109375,
-0.00762176513671875,
0.0474853515625,
0.04248046875,
0.0175323486328125,
0.061676025390625,
-0.07940673828125,
-0.01519775390625,
-0.05938720703125,
-0.0278472900390625,
-0.0279998779296875,
-0.05316162109375,
-0.055908203125,
-0.0222625732421875,
0.037506103515625,
0.0180206298828125,
-0.0439453125,
0.0309906005859375,
-0.06829833984375,
0.00688934326171875,
0.04791259765625,
0.0443115234375,
-0.026092529296875,
0.025177001953125,
-0.011566162109375,
0.0032367706298828125,
-0.062164306640625,
-0.01026153564453125,
0.0888671875,
0.03497314453125,
0.048431396484375,
-0.01079559326171875,
0.055511474609375,
-0.016448974609375,
0.0263671875,
-0.04583740234375,
0.043548583984375,
-0.01047515869140625,
-0.033416748046875,
-0.0199432373046875,
-0.0452880859375,
-0.08087158203125,
0.01461029052734375,
-0.0218353271484375,
-0.056671142578125,
0.0176544189453125,
0.015655517578125,
-0.0195770263671875,
0.059173583984375,
-0.07037353515625,
0.0726318359375,
-0.005687713623046875,
-0.0380859375,
0.0041351318359375,
-0.049652099609375,
0.0215911865234375,
0.0157623291015625,
-0.019927978515625,
-0.005054473876953125,
0.006999969482421875,
0.08831787109375,
-0.0498046875,
0.0628662109375,
-0.04345703125,
0.033660888671875,
0.0430908203125,
-0.00804901123046875,
0.0278167724609375,
-0.00799560546875,
-0.01312255859375,
0.032958984375,
0.0010004043579101562,
-0.03662109375,
-0.04010009765625,
0.045196533203125,
-0.079345703125,
-0.0262603759765625,
-0.0200653076171875,
-0.037200927734375,
0.0164337158203125,
0.011688232421875,
0.03662109375,
0.048126220703125,
0.0210418701171875,
0.026947021484375,
0.041412353515625,
-0.0201263427734375,
0.042205810546875,
-0.006031036376953125,
-0.0102081298828125,
-0.03350830078125,
0.06097412109375,
0.0273284912109375,
0.0143585205078125,
0.00780487060546875,
0.021331787109375,
-0.0220794677734375,
-0.0430908203125,
-0.0252532958984375,
0.0197906494140625,
-0.053192138671875,
-0.04241943359375,
-0.05316162109375,
-0.03460693359375,
-0.0279541015625,
-0.00864410400390625,
-0.04248046875,
-0.033660888671875,
-0.03460693359375,
0.0157928466796875,
0.052764892578125,
0.03704833984375,
-0.015472412109375,
0.04595947265625,
-0.0333251953125,
0.004673004150390625,
0.01045989990234375,
0.032470703125,
0.0090789794921875,
-0.0697021484375,
-0.02447509765625,
-0.00933074951171875,
-0.03460693359375,
-0.045501708984375,
0.03875732421875,
0.02032470703125,
0.03851318359375,
0.0306854248046875,
-0.0089874267578125,
0.054840087890625,
0.0058746337890625,
0.036956787109375,
0.033233642578125,
-0.04241943359375,
0.038177490234375,
-0.002368927001953125,
0.0169677734375,
0.01279449462890625,
0.0228424072265625,
-0.011199951171875,
-0.006710052490234375,
-0.0802001953125,
-0.055999755859375,
0.065185546875,
0.0063018798828125,
0.0010538101196289062,
0.031829833984375,
0.05511474609375,
-0.0007052421569824219,
0.0017747879028320312,
-0.057769775390625,
-0.03607177734375,
-0.0287322998046875,
-0.02313232421875,
0.0011749267578125,
0.00001138448715209961,
-0.0003800392150878906,
-0.05511474609375,
0.047454833984375,
-0.00855255126953125,
0.06207275390625,
0.028656005859375,
-0.002559661865234375,
-0.01134490966796875,
-0.0284576416015625,
0.0277099609375,
0.01947021484375,
-0.0254669189453125,
0.01019287109375,
0.0152740478515625,
-0.042205810546875,
0.01262664794921875,
0.0145111083984375,
-0.0036334991455078125,
0.00033593177795410156,
0.03997802734375,
0.0684814453125,
-0.00426483154296875,
0.01039886474609375,
0.034576416015625,
-0.00630950927734375,
-0.036773681640625,
-0.0184326171875,
0.017303466796875,
-0.00502777099609375,
0.038299560546875,
0.0244293212890625,
0.03387451171875,
-0.00380706787109375,
-0.01259613037109375,
0.01910400390625,
0.038543701171875,
-0.0193939208984375,
-0.0206298828125,
0.049713134765625,
-0.0123291015625,
-0.00905609130859375,
0.06536865234375,
-0.01251983642578125,
-0.037567138671875,
0.08416748046875,
0.031585693359375,
0.07244873046875,
-0.0006546974182128906,
0.0013217926025390625,
0.07781982421875,
0.0166168212890625,
-0.006397247314453125,
0.00640106201171875,
0.0020732879638671875,
-0.056396484375,
0.0032825469970703125,
-0.042022705078125,
0.004131317138671875,
0.023101806640625,
-0.03656005859375,
0.018585205078125,
-0.055999755859375,
-0.03216552734375,
0.014617919921875,
0.02996826171875,
-0.0709228515625,
0.0164031982421875,
-0.00384521484375,
0.062042236328125,
-0.054351806640625,
0.06097412109375,
0.06036376953125,
-0.03399658203125,
-0.09161376953125,
-0.01104736328125,
-0.006015777587890625,
-0.061309814453125,
0.04376220703125,
0.03582763671875,
0.01351165771484375,
0.00836181640625,
-0.06622314453125,
-0.053558349609375,
0.10748291015625,
0.040679931640625,
-0.01312255859375,
0.02081298828125,
-0.0097503662109375,
0.0205230712890625,
-0.033233642578125,
0.042083740234375,
0.0124969482421875,
0.0269775390625,
0.022369384765625,
-0.049530029296875,
0.022552490234375,
-0.0268402099609375,
0.00463104248046875,
0.01523590087890625,
-0.07135009765625,
0.07122802734375,
-0.037811279296875,
-0.00749969482421875,
-0.0052337646484375,
0.0574951171875,
0.007648468017578125,
0.01213836669921875,
0.049285888671875,
0.06011962890625,
0.043975830078125,
-0.023529052734375,
0.06304931640625,
0.0032253265380859375,
0.051788330078125,
0.046905517578125,
0.042633056640625,
0.035797119140625,
0.0277099609375,
-0.0227813720703125,
0.0199432373046875,
0.08099365234375,
-0.0292205810546875,
0.021575927734375,
0.016876220703125,
0.00638580322265625,
-0.01557159423828125,
0.0078582763671875,
-0.0265960693359375,
0.04022216796875,
0.01035308837890625,
-0.041900634765625,
-0.0196075439453125,
0.004909515380859375,
0.0010023117065429688,
-0.0299072265625,
-0.022796630859375,
0.032989501953125,
0.0017747879028320312,
-0.0286865234375,
0.07159423828125,
0.0028858184814453125,
0.0699462890625,
-0.0263824462890625,
0.005916595458984375,
-0.0194854736328125,
0.0188140869140625,
-0.02923583984375,
-0.0589599609375,
0.02349853515625,
-0.0214080810546875,
0.0025348663330078125,
-0.0007090568542480469,
0.054656982421875,
-0.0292205810546875,
-0.0390625,
0.0166168212890625,
0.022918701171875,
0.0367431640625,
0.0010805130004882812,
-0.09686279296875,
0.010162353515625,
0.004688262939453125,
-0.058441162109375,
0.0240325927734375,
0.03289794921875,
0.0092010498046875,
0.05682373046875,
0.04119873046875,
-0.01036834716796875,
0.01163482666015625,
-0.01154327392578125,
0.0618896484375,
-0.02947998046875,
-0.0194854736328125,
-0.0584716796875,
0.045166015625,
-0.00843048095703125,
-0.044403076171875,
0.032470703125,
0.03436279296875,
0.067138671875,
0.002716064453125,
0.025543212890625,
-0.0219268798828125,
-0.00402069091796875,
-0.02142333984375,
0.060211181640625,
-0.062225341796875,
-0.0036945343017578125,
-0.01363372802734375,
-0.04693603515625,
-0.029022216796875,
0.056060791015625,
-0.01446533203125,
0.039093017578125,
0.0321044921875,
0.07598876953125,
-0.0257415771484375,
-0.0259552001953125,
0.019989013671875,
0.01477813720703125,
0.00881195068359375,
0.0328369140625,
0.025054931640625,
-0.06146240234375,
0.03131103515625,
-0.059234619140625,
-0.01534271240234375,
-0.0121307373046875,
-0.04803466796875,
-0.07061767578125,
-0.0677490234375,
-0.0513916015625,
-0.051055908203125,
-0.0178070068359375,
0.07379150390625,
0.083740234375,
-0.0498046875,
-0.0123138427734375,
-0.003185272216796875,
0.01351165771484375,
-0.02301025390625,
-0.0179443359375,
0.052490234375,
-0.019287109375,
-0.05517578125,
-0.02984619140625,
-0.00766754150390625,
0.026519775390625,
-0.00162506103515625,
-0.0153961181640625,
-0.01346588134765625,
-0.0267333984375,
0.01242828369140625,
0.0171966552734375,
-0.0418701171875,
-0.0090179443359375,
-0.0200653076171875,
-0.0145416259765625,
0.0291290283203125,
0.0281524658203125,
-0.039306640625,
0.0290069580078125,
0.0297088623046875,
0.0291290283203125,
0.064697265625,
-0.0299224853515625,
-0.0028705596923828125,
-0.058258056640625,
0.041351318359375,
-0.0107879638671875,
0.0343017578125,
0.0313720703125,
-0.034027099609375,
0.049407958984375,
0.0288238525390625,
-0.03857421875,
-0.061309814453125,
-0.02032470703125,
-0.0821533203125,
-0.0113372802734375,
0.06884765625,
-0.03851318359375,
-0.04217529296875,
0.037384033203125,
0.004528045654296875,
0.053466796875,
-0.016815185546875,
0.03741455078125,
0.014617919921875,
-0.00970458984375,
-0.0523681640625,
-0.03839111328125,
0.029327392578125,
0.01483154296875,
-0.0399169921875,
-0.0303955078125,
-0.0033664703369140625,
0.05120849609375,
0.01483154296875,
0.03533935546875,
-0.0022525787353515625,
0.0106964111328125,
0.00870513916015625,
0.036163330078125,
-0.04022216796875,
-0.0031108856201171875,
-0.0298614501953125,
0.012054443359375,
-0.006504058837890625,
-0.041717529296875
]
] |
Rostlab/prot_bert | 2020-12-11T21:30:07.000Z | [
"transformers",
"pytorch",
"fill-mask",
"protein language model",
"protein",
"dataset:Uniref100",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | Rostlab | null | null | Rostlab/prot_bert | 54 | 73,899 | transformers | 2022-03-02T23:29:04 | ---
language: protein
tags:
- protein language model
datasets:
- Uniref100
---
# ProtBert model
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://doi.org/10.1101/2020.07.12.199554) and first released in
[this repository](https://github.com/agemagician/ProtTrans). This model was trained on uppercase amino acids: it only works with capital-letter sequences.
## Model description
ProtBert is based on the BERT model, pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on raw protein sequences only, with no humans labelling them in any way (which is why it can use lots of
publicly available data), using an automatic process to generate inputs and labels from those protein sequences.
One important difference between our model and the original BERT version is the handling of sequences: each sequence is treated as a complete document,
so the next-sentence-prediction objective is not used.
The masking follows the original BERT training, randomly masking 15% of the amino acids in the input.
The features extracted from this model reveal that LM embeddings learned from unlabeled data (protein sequences only) capture important biophysical properties governing protein
shape.
This implies that the model has learned some of the grammar of the language of life as realized in protein sequences.
## Intended uses & limitations
The model can be used for protein feature extraction or fine-tuned on downstream tasks.
We have noticed that on some tasks you can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import BertForMaskedLM, BertTokenizer, pipeline
>>> tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
>>> model = BertForMaskedLM.from_pretrained("Rostlab/prot_bert")
>>> unmasker = pipeline('fill-mask', model=model, tokenizer=tokenizer)
>>> unmasker('D L I P T S S K L V V [MASK] D T S L Q V K K A F F A L V T')
[{'score': 0.11088453233242035,
'sequence': '[CLS] D L I P T S S K L V V L D T S L Q V K K A F F A L V T [SEP]',
'token': 5,
'token_str': 'L'},
{'score': 0.08402521163225174,
'sequence': '[CLS] D L I P T S S K L V V S D T S L Q V K K A F F A L V T [SEP]',
'token': 10,
'token_str': 'S'},
{'score': 0.07328339666128159,
'sequence': '[CLS] D L I P T S S K L V V V D T S L Q V K K A F F A L V T [SEP]',
'token': 8,
'token_str': 'V'},
{'score': 0.06921856850385666,
'sequence': '[CLS] D L I P T S S K L V V K D T S L Q V K K A F F A L V T [SEP]',
'token': 12,
'token_str': 'K'},
{'score': 0.06382402777671814,
'sequence': '[CLS] D L I P T S S K L V V I D T S L Q V K K A F F A L V T [SEP]',
'token': 11,
'token_str': 'I'}]
```
Here is how to use this model to get the features of a given protein sequence in PyTorch:
```python
from transformers import BertModel, BertTokenizer
import re
tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertModel.from_pretrained("Rostlab/prot_bert")
sequence_Example = "A E T C Z A O"
sequence_Example = re.sub(r"[UZOB]", "X", sequence_Example)
encoded_input = tokenizer(sequence_Example, return_tensors='pt')
output = model(**encoded_input)
```
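The `output` above contains one embedding per token. To obtain a single fixed-size vector per protein, a common (though not the only) choice is to mean-pool the token embeddings over non-padding positions. The following is a minimal sketch of such a pooling helper; the function name and the decision to average over the attention mask are illustrative assumptions, not part of the original release:

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    # last_hidden_state: (batch, seq_len, hidden); attention_mask: (batch, seq_len)
    # Average token embeddings over non-padding positions to get one
    # fixed-size vector per sequence (a common pooling choice, not the only one).
    mask = attention_mask.unsqueeze(-1).to(last_hidden_state.dtype)
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)
```

Applied to the snippet above, `mean_pool(output.last_hidden_state, encoded_input["attention_mask"])` would yield a `(batch_size, hidden_size)` tensor.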
## Training data
The ProtBert model was pretrained on [Uniref100](https://www.uniprot.org/downloads), a dataset consisting of 217 million protein sequences.
## Training procedure
### Preprocessing
The protein sequences are uppercased and tokenized with a single space between residues, using a vocabulary size of 21. The rare amino acids "U, Z, O, B" were mapped to "X".
The inputs of the model are then of the form:
```
[CLS] Protein Sequence A [SEP] Protein Sequence B [SEP]
```
Furthermore, each protein sequence was treated as a separate document.
The preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.
The details of the masking procedure for each sequence followed the original BERT model, as follows:
- 15% of the amino acids are masked.
- In 80% of the cases, the masked amino acids are replaced by `[MASK]`.
- In 10% of the cases, the masked amino acids are replaced by a random amino acid different from the one they replace.
- In the 10% remaining cases, the masked amino acids are left as is.
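The 15% / 80-10-10 scheme above can be sketched in a few lines. This is an illustrative toy implementation, not the actual training code; the alphabet constant and function name are assumptions:

```python
import random

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")  # the 20 standard residues

def mask_tokens(tokens, mask_prob=0.15):
    """Apply the BERT-style 80/10/10 masking scheme to a list of residues."""
    out = list(tokens)
    for i, tok in enumerate(tokens):
        if random.random() >= mask_prob:
            continue  # position not selected for masking
        r = random.random()
        if r < 0.8:
            out[i] = "[MASK]"  # 80%: replace with the mask token
        elif r < 0.9:
            # 10%: replace with a random amino acid different from the original
            out[i] = random.choice([a for a in AMINO_ACIDS if a != tok])
        # remaining 10%: leave the residue unchanged
    return out
```

The labels for the MLM loss are the original residues at the selected positions.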
### Pretraining
The model was trained on a single TPU Pod V3-512 for 400k steps in total:
300k steps with sequence length 512 (batch size 15k), then 100k steps with sequence length 2048 (batch size 2.5k).
The optimizer used is LAMB with a learning rate of 0.002, a weight decay of 0.01, learning-rate warmup for 40k steps, and linear decay of the learning rate afterwards.
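The warmup-then-linear-decay schedule described above can be written as a simple step-to-learning-rate function. This is a sketch under stated assumptions (decay to exactly zero at step 400k is assumed; the original code may differ):

```python
def lr_at_step(step, base_lr=0.002, warmup_steps=40_000, total_steps=400_000):
    """Linear warmup to base_lr over warmup_steps, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)
```

For example, `lr_at_step(20_000)` gives half the peak learning rate, and the rate reaches its maximum of 0.002 at step 40k before decaying.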
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
Test results :
| Task/Dataset | secondary structure (3-states) | secondary structure (8-states) | Localization | Membrane |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| CASP12 | 75 | 63 | | |
| TS115 | 83 | 72 | | |
| CB513 | 81 | 66 | | |
| DeepLoc | | | 79 | 91 |
### BibTeX entry and citation info
```bibtex
@article {Elnaggar2020.07.12.199554,
author = {Elnaggar, Ahmed and Heinzinger, Michael and Dallago, Christian and Rehawi, Ghalia and Wang, Yu and Jones, Llion and Gibbs, Tom and Feher, Tamas and Angerer, Christoph and Steinegger, Martin and BHOWMIK, DEBSINDHU and Rost, Burkhard},
title = {ProtTrans: Towards Cracking the Language of Life{\textquoteright}s Code Through Self-Supervised Deep Learning and High Performance Computing},
elocation-id = {2020.07.12.199554},
year = {2020},
doi = {10.1101/2020.07.12.199554},
publisher = {Cold Spring Harbor Laboratory},
abstract = {Computational biology and bioinformatics provide vast data gold-mines from protein sequences, ideal for Language Models (LMs) taken from Natural Language Processing (NLP). These LMs reach for new prediction frontiers at low inference costs. Here, we trained two auto-regressive language models (Transformer-XL, XLNet) and two auto-encoder models (Bert, Albert) on data from UniRef and BFD containing up to 393 billion amino acids (words) from 2.1 billion protein sequences (22- and 112 times the entire English Wikipedia). The LMs were trained on the Summit supercomputer at Oak Ridge National Laboratory (ORNL), using 936 nodes (total 5616 GPUs) and one TPU Pod (V3-512 or V3-1024). We validated the advantage of up-scaling LMs to larger models supported by bigger data by predicting secondary structure (3-states: Q3=76-84, 8 states: Q8=65-73), sub-cellular localization for 10 cellular compartments (Q10=74) and whether a protein is membrane-bound or water-soluble (Q2=89). Dimensionality reduction revealed that the LM-embeddings from unlabeled data (only protein sequences) captured important biophysical properties governing protein shape. This implied learning some of the grammar of the language of life realized in protein sequences. The successful up-scaling of protein LMs through HPC to larger data sets slightly reduced the gap between models trained on evolutionary information and LMs. Availability ProtTrans: \<a href="https://github.com/agemagician/ProtTrans"\>https://github.com/agemagician/ProtTrans\</a\>Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554},
eprint = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554.full.pdf},
journal = {bioRxiv}
}
```
> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
| 7,963 | [
[
-0.01029205322265625,
-0.03948974609375,
0.0200347900390625,
0.004638671875,
-0.0238037109375,
0.016876220703125,
0.004116058349609375,
-0.0224456787109375,
0.02978515625,
0.0278778076171875,
-0.042449951171875,
-0.0305633544921875,
-0.0582275390625,
0.017333984375,
-0.02642822265625,
0.07086181640625,
0.0178070068359375,
0.0247802734375,
0.00994110107421875,
-0.0011186599731445312,
0.0037479400634765625,
-0.04803466796875,
-0.03631591796875,
-0.0306854248046875,
0.0477294921875,
0.0198974609375,
0.041961669921875,
0.056915283203125,
0.035614013671875,
0.022735595703125,
-0.0189666748046875,
0.00283050537109375,
-0.03131103515625,
-0.0021839141845703125,
0.00820159912109375,
-0.0283660888671875,
-0.043792724609375,
-0.0007643699645996094,
0.050567626953125,
0.0638427734375,
0.01251983642578125,
0.026458740234375,
0.0011091232299804688,
0.06219482421875,
-0.030731201171875,
0.0180206298828125,
-0.0345458984375,
0.00896453857421875,
-0.01898193359375,
0.00878143310546875,
-0.029449462890625,
-0.0197601318359375,
0.00994110107421875,
-0.0347900390625,
0.014129638671875,
0.00528717041015625,
0.07257080078125,
0.006587982177734375,
-0.01123809814453125,
-0.0079498291015625,
-0.039947509765625,
0.0517578125,
-0.06610107421875,
0.040924072265625,
0.03118896484375,
0.00724029541015625,
-0.033843994140625,
-0.07330322265625,
-0.042572021484375,
-0.02166748046875,
-0.00534820556640625,
0.004730224609375,
0.002361297607421875,
0.00788116455078125,
0.0227813720703125,
0.033111572265625,
-0.05438232421875,
-0.0010633468627929688,
-0.0535888671875,
-0.0235748291015625,
0.0565185546875,
-0.01020050048828125,
0.0201416015625,
-0.023468017578125,
-0.0157928466796875,
-0.0262908935546875,
-0.043914794921875,
0.01454925537109375,
0.034210205078125,
0.0245513916015625,
0.0014705657958984375,
0.049560546875,
-0.01227569580078125,
0.04248046875,
-0.0006871223449707031,
0.005985260009765625,
0.037750244140625,
-0.00605010986328125,
-0.0301513671875,
0.00039887428283691406,
0.07208251953125,
-0.0013456344604492188,
0.02703857421875,
-0.0023670196533203125,
-0.00789642333984375,
-0.02398681640625,
0.0147705078125,
-0.052093505859375,
-0.042236328125,
0.037109375,
-0.04681396484375,
-0.0114288330078125,
0.02520751953125,
-0.0543212890625,
-0.01080322265625,
-0.014923095703125,
0.05731201171875,
-0.046295166015625,
-0.0016279220581054688,
0.01358795166015625,
-0.013916015625,
0.0145111083984375,
0.0015459060668945312,
-0.055694580078125,
0.01555633544921875,
0.0281219482421875,
0.064453125,
-0.004077911376953125,
-0.01284027099609375,
-0.032012939453125,
-0.004215240478515625,
-0.006069183349609375,
0.040557861328125,
-0.03240966796875,
-0.0261383056640625,
-0.0087738037109375,
0.0205078125,
-0.00662994384765625,
-0.031158447265625,
0.0312347412109375,
-0.0408935546875,
0.0062713623046875,
-0.01277923583984375,
-0.054229736328125,
-0.0296173095703125,
-0.01103973388671875,
-0.056884765625,
0.078125,
0.01479339599609375,
-0.04705810546875,
0.01494598388671875,
-0.057586669921875,
-0.02838134765625,
0.022247314453125,
-0.0016927719116210938,
-0.037445068359375,
0.003307342529296875,
0.013580322265625,
0.032196044921875,
0.001678466796875,
0.021270751953125,
-0.025787353515625,
-0.0224456787109375,
0.0282135009765625,
-0.0017786026000976562,
0.064208984375,
0.0244598388671875,
-0.02081298828125,
0.01155853271484375,
-0.0743408203125,
0.02288818359375,
0.0117034912109375,
-0.037139892578125,
0.00640106201171875,
-0.00389862060546875,
0.001934051513671875,
0.0157623291015625,
0.0201568603515625,
-0.047027587890625,
0.041259765625,
-0.03656005859375,
0.056060791015625,
0.0565185546875,
0.0007543563842773438,
0.0229644775390625,
-0.0196380615234375,
0.0289459228515625,
0.0033702850341796875,
0.01030731201171875,
-0.0030803680419921875,
-0.049713134765625,
-0.05621337890625,
-0.035430908203125,
0.04595947265625,
0.046295166015625,
-0.033355712890625,
0.051849365234375,
-0.013427734375,
-0.046478271484375,
-0.053497314453125,
-0.0015077590942382812,
0.03033447265625,
0.01380157470703125,
0.048065185546875,
-0.0292816162109375,
-0.061981201171875,
-0.06988525390625,
-0.018585205078125,
-0.00919342041015625,
-0.01800537109375,
0.01189422607421875,
0.05682373046875,
-0.0204925537109375,
0.0579833984375,
-0.03826904296875,
-0.02056884765625,
-0.0277099609375,
0.00289154052734375,
0.04083251953125,
0.053375244140625,
0.01727294921875,
-0.045928955078125,
-0.0360107421875,
-0.0236053466796875,
-0.04937744140625,
-0.00115966796875,
0.004180908203125,
-0.0006804466247558594,
0.007598876953125,
0.032012939453125,
-0.04705810546875,
0.040924072265625,
0.0276336669921875,
-0.0189361572265625,
0.040008544921875,
-0.0318603515625,
-0.004730224609375,
-0.07672119140625,
0.0204010009765625,
-0.01396942138671875,
-0.01418304443359375,
-0.0640869140625,
-0.006793975830078125,
-0.0008945465087890625,
-0.0002281665802001953,
-0.049835205078125,
0.038787841796875,
-0.0421142578125,
-0.00661468505859375,
-0.0028209686279296875,
-0.0172576904296875,
-0.00382232666015625,
0.045928955078125,
-0.00025010108947753906,
0.0439453125,
0.03973388671875,
-0.049468994140625,
0.0153350830078125,
0.03546142578125,
-0.034698486328125,
-0.0200042724609375,
-0.06689453125,
0.0081939697265625,
-0.0037021636962890625,
0.0231170654296875,
-0.069091796875,
-0.01360321044921875,
0.028533935546875,
-0.048553466796875,
0.021881103515625,
-0.0003209114074707031,
-0.037567138671875,
-0.038482666015625,
-0.0244293212890625,
0.03448486328125,
0.047210693359375,
-0.022491455078125,
0.035247802734375,
0.0226898193359375,
-0.00272369384765625,
-0.0447998046875,
-0.049896240234375,
-0.01070404052734375,
-0.0033473968505859375,
-0.034271240234375,
0.03900146484375,
-0.0081024169921875,
0.0045318603515625,
-0.01222991943359375,
-0.0158843994140625,
0.003940582275390625,
-0.01041412353515625,
0.0260772705078125,
0.007335662841796875,
-0.005435943603515625,
0.0028667449951171875,
-0.0162506103515625,
-0.0036144256591796875,
-0.00998687744140625,
-0.0379638671875,
0.06085205078125,
-0.00833892822265625,
-0.0045166015625,
-0.036865234375,
0.031829833984375,
0.04412841796875,
-0.01404571533203125,
0.0626220703125,
0.05206298828125,
-0.038177490234375,
0.0034427642822265625,
-0.0247650146484375,
-0.0195770263671875,
-0.035858154296875,
0.039337158203125,
-0.0290679931640625,
-0.05609130859375,
0.056396484375,
0.0104827880859375,
-0.01343536376953125,
0.049560546875,
0.0322265625,
-0.020721435546875,
0.065673828125,
0.04937744140625,
0.0025730133056640625,
0.034271240234375,
-0.049102783203125,
0.035797119140625,
-0.063720703125,
-0.035003662109375,
-0.035125732421875,
-0.03448486328125,
-0.04705810546875,
-0.0271148681640625,
0.0243072509765625,
0.0308074951171875,
-0.0283355712890625,
0.0447998046875,
-0.0286865234375,
0.02301025390625,
0.06585693359375,
0.0325927734375,
-0.00754547119140625,
0.003326416015625,
-0.028106689453125,
0.0005965232849121094,
-0.061859130859375,
-0.0345458984375,
0.0941162109375,
0.039794921875,
0.041961669921875,
0.00919342041015625,
0.054351806640625,
0.034027099609375,
-0.00044035911560058594,
-0.055145263671875,
0.04345703125,
-0.031280517578125,
-0.04705810546875,
-0.0276336669921875,
-0.01485443115234375,
-0.07586669921875,
0.0018901824951171875,
-0.02239990234375,
-0.07415771484375,
0.0252532958984375,
0.0019254684448242188,
-0.03369140625,
0.019500732421875,
-0.044464111328125,
0.0706787109375,
-0.0150604248046875,
-0.0177154541015625,
0.004337310791015625,
-0.076416015625,
0.01514434814453125,
-0.0085296630859375,
0.01474761962890625,
-0.0001691579818725586,
0.0194091796875,
0.08026123046875,
-0.049102783203125,
0.06854248046875,
-0.005939483642578125,
0.00545501708984375,
0.018951416015625,
0.00330352783203125,
0.0276336669921875,
0.00499725341796875,
-0.0016889572143554688,
0.02435302734375,
0.0087738037109375,
-0.04443359375,
-0.0137176513671875,
0.03057861328125,
-0.06658935546875,
-0.03302001953125,
-0.039093017578125,
-0.037689208984375,
0.0092620849609375,
0.0198974609375,
0.047576904296875,
0.03900146484375,
0.01027679443359375,
0.027252197265625,
0.0494384765625,
-0.031768798828125,
0.049102783203125,
0.0299835205078125,
-0.0166778564453125,
-0.04705810546875,
0.05731201171875,
0.0217132568359375,
0.0147552490234375,
0.039520263671875,
0.0115203857421875,
-0.0411376953125,
-0.048126220703125,
-0.01235198974609375,
0.018218994140625,
-0.04693603515625,
-0.023681640625,
-0.0638427734375,
-0.04241943359375,
-0.048553466796875,
0.006336212158203125,
-0.0287628173828125,
-0.044158935546875,
-0.0189971923828125,
-0.0083770751953125,
0.02642822265625,
0.048065185546875,
-0.0293426513671875,
0.029510498046875,
-0.07818603515625,
0.0263671875,
0.01457977294921875,
0.015380859375,
-0.017547607421875,
-0.06341552734375,
-0.0220489501953125,
0.01242828369140625,
-0.0221099853515625,
-0.080810546875,
0.0494384765625,
0.051544189453125,
0.057464599609375,
0.01189422607421875,
-0.0000010132789611816406,
0.034088134765625,
-0.03778076171875,
0.07470703125,
-0.0031585693359375,
-0.0709228515625,
0.0565185546875,
-0.015625,
0.0263519287109375,
0.027862548828125,
0.0489501953125,
-0.0215606689453125,
-0.03314208984375,
-0.072265625,
-0.08612060546875,
0.047882080078125,
0.026458740234375,
-0.0088653564453125,
-0.005096435546875,
0.0384521484375,
0.0146026611328125,
0.012054443359375,
-0.07244873046875,
-0.037322998046875,
-0.0199127197265625,
-0.02435302734375,
-0.01139068603515625,
-0.0242919921875,
0.0005855560302734375,
-0.0411376953125,
0.06585693359375,
-0.001239776611328125,
0.04296875,
0.032440185546875,
-0.0287628173828125,
0.004909515380859375,
0.00811004638671875,
0.0565185546875,
0.040496826171875,
-0.04620361328125,
0.007083892822265625,
-0.0038776397705078125,
-0.0625,
0.005901336669921875,
0.009918212890625,
-0.0130157470703125,
-0.0014781951904296875,
0.046539306640625,
0.056915283203125,
-0.0015687942504882812,
-0.04278564453125,
0.022064208984375,
0.01435089111328125,
-0.0223846435546875,
-0.0132904052734375,
-0.000659942626953125,
-0.0035839080810546875,
0.031890869140625,
0.049591064453125,
-0.00452423095703125,
0.00577545166015625,
-0.03704833984375,
0.037933349609375,
0.01505279541015625,
-0.02630615234375,
-0.03271484375,
0.052398681640625,
-0.0025463104248046875,
-0.035430908203125,
0.05206298828125,
-0.0187835693359375,
-0.057708740234375,
0.0487060546875,
0.042266845703125,
0.07147216796875,
-0.0232086181640625,
0.0176239013671875,
0.048553466796875,
0.016876220703125,
-0.0121307373046875,
0.03289794921875,
0.0162200927734375,
-0.052947998046875,
-0.0253753662109375,
-0.08233642578125,
-0.0031280517578125,
0.039093017578125,
-0.04144287109375,
0.0103607177734375,
-0.035430908203125,
-0.0122833251953125,
0.00673675537109375,
0.01268768310546875,
-0.046783447265625,
0.0238189697265625,
0.01126861572265625,
0.076171875,
-0.07464599609375,
0.08740234375,
0.058746337890625,
-0.0443115234375,
-0.053009033203125,
-0.01107025146484375,
-0.0170745849609375,
-0.0673828125,
0.07952880859375,
0.0245513916015625,
0.01995849609375,
0.0134429931640625,
-0.045806884765625,
-0.076171875,
0.073974609375,
0.0125579833984375,
-0.056365966796875,
0.0005030632019042969,
0.013641357421875,
0.04608154296875,
-0.030487060546875,
0.0139923095703125,
0.043701171875,
0.0272674560546875,
0.0009446144104003906,
-0.056549072265625,
0.0017681121826171875,
-0.028564453125,
-0.00510406494140625,
0.005558013916015625,
-0.037506103515625,
0.06982421875,
-0.0185699462890625,
-0.006256103515625,
0.004993438720703125,
0.056732177734375,
0.019378662109375,
-0.003971099853515625,
0.01470947265625,
0.04522705078125,
0.05059814453125,
-0.0086669921875,
0.056396484375,
-0.026336669921875,
0.03533935546875,
0.06610107421875,
0.0031528472900390625,
0.07098388671875,
0.04437255859375,
-0.03717041015625,
0.04010009765625,
0.06390380859375,
-0.01261138916015625,
0.058685302734375,
0.036407470703125,
-0.0024700164794921875,
-0.0201263427734375,
0.004909515380859375,
-0.03704833984375,
0.03131103515625,
0.034942626953125,
-0.034149169921875,
0.006519317626953125,
-0.0013990402221679688,
-0.007110595703125,
-0.010162353515625,
-0.0231170654296875,
0.045074462890625,
0.01540374755859375,
-0.030853271484375,
0.050018310546875,
-0.004810333251953125,
0.038665771484375,
-0.046783447265625,
0.002620697021484375,
-0.023193359375,
0.0205841064453125,
-0.007617950439453125,
-0.052001953125,
0.01739501953125,
-0.01494598388671875,
-0.02191162109375,
-0.00988006591796875,
0.04656982421875,
-0.031951904296875,
-0.047515869140625,
0.0216217041015625,
0.0300140380859375,
0.024566650390625,
-0.004119873046875,
-0.05792236328125,
-0.003498077392578125,
0.0010547637939453125,
-0.0308990478515625,
0.0221405029296875,
0.0107879638671875,
0.006755828857421875,
0.0592041015625,
0.045440673828125,
0.0027179718017578125,
0.01184844970703125,
-0.0096893310546875,
0.0611572265625,
-0.06549072265625,
-0.029205322265625,
-0.06536865234375,
0.04541015625,
0.004344940185546875,
-0.0202789306640625,
0.03173828125,
0.050262451171875,
0.06610107421875,
-0.033843994140625,
0.047637939453125,
-0.019012451171875,
0.029449462890625,
-0.04339599609375,
0.05517578125,
-0.036041259765625,
0.00939178466796875,
-0.00643157958984375,
-0.07550048828125,
-0.0302734375,
0.056854248046875,
-0.00679779052734375,
0.0016736984252929688,
0.058837890625,
0.052947998046875,
0.01042938232421875,
-0.011810302734375,
0.01293182373046875,
0.0283355712890625,
0.02862548828125,
0.04827880859375,
0.03460693359375,
-0.064697265625,
0.0238189697265625,
-0.0203704833984375,
-0.006801605224609375,
-0.02947998046875,
-0.0758056640625,
-0.06378173828125,
-0.05438232421875,
-0.029937744140625,
-0.0404052734375,
0.01528167724609375,
0.0802001953125,
0.051177978515625,
-0.08453369140625,
-0.014739990234375,
-0.00756072998046875,
-0.01268768310546875,
-0.0216522216796875,
-0.016021728515625,
0.045867919921875,
-0.017852783203125,
-0.039276123046875,
0.0369873046875,
0.005352020263671875,
0.022186279296875,
-0.0037841796875,
-0.00972747802734375,
-0.049835205078125,
0.005367279052734375,
0.039947509765625,
0.038299560546875,
-0.060943603515625,
-0.033355712890625,
0.009185791015625,
-0.007472991943359375,
0.01342010498046875,
0.034881591796875,
-0.056640625,
0.01434326171875,
0.05078125,
0.03656005859375,
0.0445556640625,
-0.00801849365234375,
0.054840087890625,
-0.06622314453125,
0.00983428955078125,
0.00528717041015625,
0.028076171875,
0.0192413330078125,
-0.0229644775390625,
0.0394287109375,
0.038299560546875,
-0.045135498046875,
-0.058319091796875,
0.00867462158203125,
-0.08221435546875,
-0.0233154296875,
0.0804443359375,
-0.0154876708984375,
-0.0288238525390625,
0.0040435791015625,
-0.01248931884765625,
0.04736328125,
-0.020111083984375,
0.0567626953125,
0.045257568359375,
-0.00811767578125,
-0.00970458984375,
-0.037567138671875,
0.0567626953125,
0.034423828125,
-0.0380859375,
-0.0133819580078125,
0.006114959716796875,
0.02191162109375,
0.0143280029296875,
0.0362548828125,
-0.01097869873046875,
0.004619598388671875,
-0.007049560546875,
0.0161895751953125,
-0.0115203857421875,
-0.0290679931640625,
-0.031097412109375,
0.0113067626953125,
-0.01082611083984375,
-0.0284271240234375
]
] |
MoritzLaurer/mDeBERTa-v3-base-mnli-xnli | 2023-03-22T08:35:38.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"text-classification",
"zero-shot-classification",
"nli",
"multilingual",
"en",
"ar",
"bg",
"de",
"el",
"es",
"fr",
"hi",
"ru",
"sw",
"th",
"tr",
"ur",
"vi",
"zh",
"dataset:multi_nli",
"dataset:xnli",
"arxiv:2111.09543",
"arxiv:1809.05053",
"arxiv:1911.02116",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | MoritzLaurer | null | null | MoritzLaurer/mDeBERTa-v3-base-mnli-xnli | 163 | 73,856 | transformers | 2022-03-02T23:29:04 | ---
language:
- multilingual
- en
- ar
- bg
- de
- el
- es
- fr
- hi
- ru
- sw
- th
- tr
- ur
- vi
- zh
license: mit
tags:
- zero-shot-classification
- text-classification
- nli
- pytorch
metrics:
- accuracy
datasets:
- multi_nli
- xnli
pipeline_tag: zero-shot-classification
widget:
- text: "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
candidate_labels: "politics, economy, entertainment, environment"
---
# Multilingual mDeBERTa-v3-base-mnli-xnli
## Model description
This multilingual model can perform natural language inference (NLI) on 100 languages and is therefore also suitable for multilingual
zero-shot classification. The underlying model was pre-trained by Microsoft on the
[CC100 multilingual dataset](https://huggingface.co/datasets/cc100). It was then fine-tuned on the [XNLI dataset](https://huggingface.co/datasets/xnli), which contains hypothesis-premise pairs from 15 languages, as well as the English [MNLI dataset](https://huggingface.co/datasets/multi_nli).
As of December 2021, mDeBERTa-v3-base, introduced by Microsoft in [this paper](https://arxiv.org/pdf/2111.09543.pdf), is the best-performing multilingual base-sized transformer model.
If you are looking for a smaller, faster (but less performant) model, you can
try [multilingual-MiniLMv2-L6-mnli-xnli](https://huggingface.co/MoritzLaurer/multilingual-MiniLMv2-L6-mnli-xnli).
### How to use the model
#### Simple zero-shot classification pipeline
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification", model="MoritzLaurer/mDeBERTa-v3-base-mnli-xnli")
sequence_to_classify = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
candidate_labels = ["politics", "economy", "entertainment", "environment"]
output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
print(output)
```
#### NLI use-case
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
model_name = "MoritzLaurer/mDeBERTa-v3-base-mnli-xnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).to(device)
premise = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
hypothesis = "Emmanuel Macron is the President of France"
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
output = model(inputs["input_ids"].to(device))  # model and inputs must be on the same device
prediction = torch.softmax(output["logits"][0], -1).tolist()
label_names = ["entailment", "neutral", "contradiction"]
prediction = {name: round(float(pred) * 100, 1) for pred, name in zip(prediction, label_names)}
print(prediction)
```
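The softmax call in the snippet above turns the model's three raw logits into probabilities over the NLI labels. A minimal standalone sketch of just that step (the logit values below are made up for illustration, not actual model outputs):

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for (entailment, neutral, contradiction).
logits = [-2.1, 0.3, 3.5]
probs = softmax(logits)
label_names = ["entailment", "neutral", "contradiction"]
prediction = {name: round(p * 100, 1) for name, p in zip(label_names, probs)}
print(prediction)
```

The largest logit dominates after exponentiation, so the premise/hypothesis pair here would be scored overwhelmingly as a contradiction.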
### Training data
This model was trained on the XNLI development dataset and the MNLI train dataset. The XNLI development set consists of 2490 professionally translated texts from English to 14 other languages (37,350 texts in total) (see [this paper](https://arxiv.org/pdf/1809.05053.pdf)). Note that XNLI also contains a training set of machine-translated versions of the MNLI dataset for 15 languages, but due to quality issues with these machine translations, this model was trained only on the professional translations from the XNLI development set and the original English MNLI training set (392,702 texts). Not using machine-translated texts avoids overfitting the model to the 15 languages, avoids catastrophic forgetting of the other 85 languages mDeBERTa was pre-trained on, and significantly reduces training costs.
### Training procedure
mDeBERTa-v3-base-mnli-xnli was trained using the Hugging Face trainer with the following hyperparameters.
```
training_args = TrainingArguments(
num_train_epochs=2, # total number of training epochs
learning_rate=2e-05,
per_device_train_batch_size=16, # batch size per device during training
per_device_eval_batch_size=16, # batch size for evaluation
    warmup_ratio=0.1,                # fraction of training steps used for learning-rate warmup
weight_decay=0.06, # strength of weight decay
)
```
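With `warmup_ratio=0.1`, the scheduler warms the learning rate up over the first 10% of all training steps. A rough sketch of how that ratio translates into step counts, using the dataset sizes stated above (this assumes a single device, so the global batch size equals `per_device_train_batch_size`; the HF Trainer's internal step accounting may differ slightly):

```python
import math

# Figures from the card: 392,702 MNLI train texts + 37,350 XNLI dev texts.
n_examples = 392_702 + 37_350
batch_size = 16      # per_device_train_batch_size, single device assumed
num_epochs = 2
warmup_ratio = 0.1

steps_per_epoch = math.ceil(n_examples / batch_size)
total_steps = steps_per_epoch * num_epochs
warmup_steps = int(warmup_ratio * total_steps)
print(total_steps, warmup_steps)
```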
### Eval results
The model was evaluated on the XNLI test set in 15 languages (5010 texts per language, 75,150 in total). Note that multilingual NLI models are capable of classifying NLI texts without receiving NLI training data in the specific language (cross-lingual transfer). This means the model is also able to do NLI in the other 85 languages mDeBERTa was pre-trained on, though performance is most likely lower than for the languages available in XNLI.
Also note that if other multilingual models on the model hub claim performance of around 90% on languages other than English, the authors have most likely made a mistake during testing, since none of the latest papers shows a multilingual average performance of more than a few points above 80% on XNLI (see [here](https://arxiv.org/pdf/2111.09543.pdf) or [here](https://arxiv.org/pdf/1911.02116.pdf)).
average | ar | bg | de | el | en | es | fr | hi | ru | sw | th | tr | ur | vi | zh
---------|----------|---------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------
0.808 | 0.802 | 0.829 | 0.825 | 0.826 | 0.883 | 0.845 | 0.834 | 0.771 | 0.813 | 0.748 | 0.793 | 0.807 | 0.740 | 0.795 | 0.8116
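As a sanity check, the reported average is the unweighted mean of the 15 per-language accuracies in the table above:

```python
# Per-language XNLI test accuracies copied from the table above.
scores = {
    "ar": 0.802, "bg": 0.829, "de": 0.825, "el": 0.826, "en": 0.883,
    "es": 0.845, "fr": 0.834, "hi": 0.771, "ru": 0.813, "sw": 0.748,
    "th": 0.793, "tr": 0.807, "ur": 0.740, "vi": 0.795, "zh": 0.8116,
}
average = sum(scores.values()) / len(scores)
print(round(average, 3))
```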
## Limitations and bias
Please consult the original DeBERTa-V3 paper and literature on different NLI datasets for potential biases.
## Citation
If you use this model, please cite: Laurer, Moritz, Wouter van Atteveldt, Andreu Salleras Casas, and Kasper Welbers. 2022. ‘Less Annotating, More Classifying – Addressing the Data Scarcity Issue of Supervised Machine Learning with Deep Transfer Learning and BERT - NLI’. Preprint, June. Open Science Framework. https://osf.io/74b8k.
## Ideas for cooperation or questions?
If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or [LinkedIn](https://www.linkedin.com/in/moritz-laurer/)
## Debugging and issues
Note that DeBERTa-v3 was released in late 2021, and older versions of HF Transformers have issues running the model (e.g. tokenizer errors). Upgrading to Transformers >= 4.13 might solve some of these issues. Note that mDeBERTa currently does not support FP16, see here: https://github.com/microsoft/DeBERTa/issues/77
| 6,498 | [
[
-0.03411865234375,
-0.0285797119140625,
0.0114898681640625,
0.0199737548828125,
-0.00115203857421875,
-0.0032558441162109375,
-0.0156402587890625,
-0.047576904296875,
0.021881103515625,
0.023101806640625,
-0.0462646484375,
-0.034332275390625,
-0.04248046875,
0.0120697021484375,
-0.011322021484375,
0.07989501953125,
-0.007144927978515625,
0.01113128662109375,
0.01029205322265625,
-0.025970458984375,
-0.027496337890625,
-0.053375244140625,
-0.05712890625,
-0.0236968994140625,
0.05145263671875,
0.01520538330078125,
0.039794921875,
0.03839111328125,
0.0257415771484375,
0.0213165283203125,
-0.01367950439453125,
0.006961822509765625,
-0.015777587890625,
-0.024078369140625,
0.01221466064453125,
-0.05609130859375,
-0.0416259765625,
0.0019550323486328125,
0.05316162109375,
0.04412841796875,
-0.0008115768432617188,
0.018096923828125,
0.003337860107421875,
0.04974365234375,
-0.033538818359375,
0.00826263427734375,
-0.03515625,
0.018768310546875,
-0.01068115234375,
0.004680633544921875,
-0.031768798828125,
-0.011566162109375,
0.0101318359375,
-0.0253753662109375,
0.006343841552734375,
-0.0033664703369140625,
0.0882568359375,
0.01079559326171875,
-0.029266357421875,
-0.0024852752685546875,
-0.03662109375,
0.0767822265625,
-0.06982421875,
0.0406494140625,
0.0202484130859375,
0.00724029541015625,
0.01629638671875,
-0.02984619140625,
-0.0521240234375,
-0.0086669921875,
-0.0240325927734375,
0.01221466064453125,
-0.0164337158203125,
-0.0113525390625,
0.030853271484375,
0.017364501953125,
-0.0643310546875,
0.0257568359375,
-0.034820556640625,
-0.002674102783203125,
0.05828857421875,
0.003772735595703125,
0.01218414306640625,
-0.031829833984375,
-0.022186279296875,
-0.025115966796875,
-0.043121337890625,
0.01377105712890625,
0.0307159423828125,
0.031951904296875,
-0.02734375,
0.030303955078125,
-0.0092926025390625,
0.0528564453125,
-0.0032215118408203125,
-0.024261474609375,
0.05712890625,
-0.030181884765625,
-0.025909423828125,
0.01029205322265625,
0.0732421875,
0.021759033203125,
0.01442718505859375,
0.003021240234375,
-0.00826263427734375,
-0.00649261474609375,
-0.01355743408203125,
-0.0675048828125,
-0.002277374267578125,
0.0244293212890625,
-0.03265380859375,
-0.034149169921875,
-0.0013256072998046875,
-0.05902099609375,
0.0009284019470214844,
-0.0254364013671875,
0.02203369140625,
-0.033203125,
-0.042633056640625,
-0.005138397216796875,
0.005580902099609375,
0.0201416015625,
-0.0026702880859375,
-0.061004638671875,
0.004192352294921875,
0.0217437744140625,
0.06884765625,
-0.013458251953125,
-0.04083251953125,
-0.01177215576171875,
-0.01140594482421875,
-0.01617431640625,
0.023468017578125,
-0.00902557373046875,
-0.0157623291015625,
-0.00926971435546875,
0.0201416015625,
-0.033966064453125,
-0.0279083251953125,
0.044830322265625,
-0.0299072265625,
0.032257080078125,
-0.01326751708984375,
-0.02783203125,
-0.02081298828125,
0.021484375,
-0.055084228515625,
0.08709716796875,
0.0100860595703125,
-0.06597900390625,
0.03515625,
-0.055908203125,
-0.0249176025390625,
-0.0259552001953125,
-0.00432586669921875,
-0.042633056640625,
-0.01137542724609375,
0.019195556640625,
0.038238525390625,
-0.006824493408203125,
0.042388916015625,
-0.0173797607421875,
-0.0182037353515625,
0.0165252685546875,
-0.039459228515625,
0.09466552734375,
0.0243072509765625,
-0.05364990234375,
0.0022945404052734375,
-0.07110595703125,
0.01434326171875,
0.0156402587890625,
-0.01824951171875,
-0.01024627685546875,
-0.0247039794921875,
0.01812744140625,
0.045501708984375,
-0.002887725830078125,
-0.04351806640625,
0.01122283935546875,
-0.041046142578125,
0.0302734375,
0.04302978515625,
-0.0243682861328125,
0.027618408203125,
-0.02423095703125,
0.0272979736328125,
0.0261993408203125,
0.01000213623046875,
-0.00957489013671875,
-0.04681396484375,
-0.0775146484375,
-0.0149078369140625,
0.04302978515625,
0.057342529296875,
-0.06488037109375,
0.034088134765625,
-0.0328369140625,
-0.04541015625,
-0.04852294921875,
0.006290435791015625,
0.034759521484375,
0.029632568359375,
0.0254974365234375,
0.00846099853515625,
-0.06549072265625,
-0.07489013671875,
0.0057220458984375,
-0.00786590576171875,
0.00820159912109375,
0.01485443115234375,
0.055816650390625,
-0.016632080078125,
0.06451416015625,
-0.0233306884765625,
-0.0308837890625,
-0.03076171875,
0.005474090576171875,
0.0408935546875,
0.046539306640625,
0.0716552734375,
-0.06689453125,
-0.049102783203125,
0.014617919921875,
-0.07379150390625,
0.012939453125,
-0.01047515869140625,
-0.0091552734375,
0.052459716796875,
0.01540374755859375,
-0.04510498046875,
0.0254974365234375,
0.0576171875,
-0.01552581787109375,
0.0252685546875,
-0.0035152435302734375,
0.007450103759765625,
-0.0946044921875,
0.0229339599609375,
0.0046234130859375,
-0.00653839111328125,
-0.059295654296875,
-0.004528045654296875,
0.00296783447265625,
-0.005069732666015625,
-0.047119140625,
0.053802490234375,
-0.0252838134765625,
0.0201873779296875,
-0.0013179779052734375,
0.00902557373046875,
0.003292083740234375,
0.0589599609375,
0.02349853515625,
0.050750732421875,
0.045257568359375,
-0.04351806640625,
-0.0017681121826171875,
0.016998291015625,
-0.032958984375,
0.0205535888671875,
-0.0518798828125,
-0.0084228515625,
-0.0043487548828125,
0.0106353759765625,
-0.056793212890625,
-0.00189208984375,
0.01849365234375,
-0.034149169921875,
0.0389404296875,
-0.00865936279296875,
-0.035003662109375,
-0.04052734375,
-0.0158538818359375,
0.0206451416015625,
0.0394287109375,
-0.04730224609375,
0.038604736328125,
0.0128326416015625,
0.0058746337890625,
-0.0711669921875,
-0.08001708984375,
-0.0018396377563476562,
-0.0183563232421875,
-0.0521240234375,
0.0282135009765625,
-0.01239013671875,
-0.00382232666015625,
-0.00539398193359375,
0.020294189453125,
-0.01352691650390625,
0.0001856088638305664,
0.01514434814453125,
0.03314208984375,
-0.0237884521484375,
-0.007205963134765625,
0.0084075927734375,
-0.00955963134765625,
-0.007732391357421875,
-0.014739990234375,
0.04052734375,
-0.01505279541015625,
-0.00833892822265625,
-0.043121337890625,
0.0253448486328125,
0.043914794921875,
-0.0171051025390625,
0.07598876953125,
0.06805419921875,
-0.0279388427734375,
0.0117034912109375,
-0.043182373046875,
-0.005496978759765625,
-0.03094482421875,
0.030792236328125,
-0.049224853515625,
-0.05279541015625,
0.03955078125,
0.035675048828125,
0.00992584228515625,
0.049560546875,
0.037384033203125,
0.00658416748046875,
0.09466552734375,
0.049560546875,
-0.0191192626953125,
0.0233917236328125,
-0.06231689453125,
0.01499176025390625,
-0.0545654296875,
-0.01544189453125,
-0.034423828125,
-0.022247314453125,
-0.0643310546875,
-0.0099334716796875,
0.0113983154296875,
-0.0008482933044433594,
-0.00997161865234375,
0.040924072265625,
-0.034423828125,
0.020660400390625,
0.0433349609375,
0.004131317138671875,
0.012054443359375,
0.00910186767578125,
-0.01004791259765625,
-0.01666259765625,
-0.06414794921875,
-0.03118896484375,
0.07379150390625,
0.03326416015625,
0.027557373046875,
0.01434326171875,
0.054931640625,
-0.00853729248046875,
0.0325927734375,
-0.041168212890625,
0.0229949951171875,
-0.01132965087890625,
-0.06964111328125,
-0.01340484619140625,
-0.046905517578125,
-0.063720703125,
0.033538818359375,
-0.0178680419921875,
-0.0579833984375,
0.04132080078125,
-0.005199432373046875,
-0.02838134765625,
0.047088623046875,
-0.0589599609375,
0.069091796875,
-0.0204315185546875,
-0.0245208740234375,
0.007419586181640625,
-0.04779052734375,
0.03558349609375,
-0.01499176025390625,
0.0144805908203125,
-0.0138702392578125,
0.0175323486328125,
0.06597900390625,
-0.01422119140625,
0.06890869140625,
-0.0272369384765625,
-0.01023101806640625,
0.01367950439453125,
-0.020355224609375,
0.021392822265625,
-0.0009670257568359375,
-0.031768798828125,
0.037811279296875,
0.018341064453125,
-0.0345458984375,
-0.0401611328125,
0.06298828125,
-0.06903076171875,
-0.032684326171875,
-0.0394287109375,
-0.02947998046875,
0.00382232666015625,
0.0269622802734375,
0.0390625,
0.02813720703125,
-0.002288818359375,
0.01309967041015625,
0.04559326171875,
-0.0163116455078125,
0.047760009765625,
0.02935791015625,
-0.03021240234375,
-0.025909423828125,
0.07318115234375,
0.03369140625,
0.0119781494140625,
0.025360107421875,
0.0010881423950195312,
-0.021209716796875,
-0.031951904296875,
-0.052520751953125,
0.02923583984375,
-0.045867919921875,
-0.0247802734375,
-0.0745849609375,
-0.0247344970703125,
-0.033447265625,
0.00051116943359375,
-0.024658203125,
-0.034393310546875,
-0.022003173828125,
-0.01153564453125,
0.0183868408203125,
0.0330810546875,
0.001064300537109375,
0.021575927734375,
-0.05316162109375,
-0.000858306884765625,
0.0005617141723632812,
0.0239410400390625,
0.00616455078125,
-0.049774169921875,
-0.0254364013671875,
0.023193359375,
-0.0163421630859375,
-0.048675537109375,
0.048736572265625,
0.0178375244140625,
0.040557861328125,
0.0193939208984375,
-0.0035228729248046875,
0.0526123046875,
-0.0309295654296875,
0.05169677734375,
0.0195770263671875,
-0.067138671875,
0.038055419921875,
-0.00681304931640625,
0.017059326171875,
0.04254150390625,
0.059112548828125,
-0.031951904296875,
-0.012908935546875,
-0.04852294921875,
-0.055938720703125,
0.06610107421875,
0.0220947265625,
0.01181793212890625,
0.00449371337890625,
0.0211334228515625,
-0.00008916854858398438,
0.01439666748046875,
-0.08209228515625,
-0.043243408203125,
-0.007144927978515625,
-0.01364898681640625,
-0.01447296142578125,
-0.0153350830078125,
-0.0011386871337890625,
-0.038330078125,
0.0673828125,
-0.01369476318359375,
0.027252197265625,
0.0222930908203125,
-0.017578125,
0.00278472900390625,
0.009002685546875,
0.0555419921875,
0.050506591796875,
-0.024139404296875,
-0.0106964111328125,
0.037506103515625,
-0.02276611328125,
0.0213165283203125,
0.01165008544921875,
-0.0236663818359375,
0.0188751220703125,
0.028594970703125,
0.08917236328125,
0.00215911865234375,
-0.04132080078125,
0.038604736328125,
-0.0095977783203125,
-0.0272064208984375,
-0.039764404296875,
-0.00873565673828125,
-0.00897979736328125,
0.0140228271484375,
0.02490234375,
0.0147857666015625,
0.004711151123046875,
-0.0310211181640625,
0.0165863037109375,
0.01537322998046875,
-0.041595458984375,
-0.030670166015625,
0.049468994140625,
0.002109527587890625,
-0.0018053054809570312,
0.04144287109375,
-0.02166748046875,
-0.037811279296875,
0.043243408203125,
0.03814697265625,
0.0562744140625,
-0.0233001708984375,
0.019073486328125,
0.0584716796875,
0.029205322265625,
0.00849151611328125,
0.016998291015625,
0.0147247314453125,
-0.059051513671875,
-0.041168212890625,
-0.04931640625,
-0.0060272216796875,
0.01158905029296875,
-0.05181884765625,
0.032989501953125,
-0.01357269287109375,
-0.0166473388671875,
0.01580810546875,
0.0103759765625,
-0.058135986328125,
0.021331787109375,
0.01503753662109375,
0.07403564453125,
-0.07598876953125,
0.08331298828125,
0.037994384765625,
-0.046905517578125,
-0.0550537109375,
-0.0269317626953125,
-0.007152557373046875,
-0.0440673828125,
0.0643310546875,
0.033050537109375,
0.00365447998046875,
-0.0015106201171875,
-0.0196075439453125,
-0.0711669921875,
0.083984375,
0.018035888671875,
-0.040374755859375,
-0.014190673828125,
0.018768310546875,
0.04132080078125,
-0.022979736328125,
0.036529541015625,
0.0482177734375,
0.043243408203125,
-0.010406494140625,
-0.06982421875,
0.00028514862060546875,
-0.03448486328125,
0.00507354736328125,
0.01079559326171875,
-0.050262451171875,
0.0682373046875,
-0.0209503173828125,
-0.01163482666015625,
0.007663726806640625,
0.0526123046875,
0.0168914794921875,
0.0183868408203125,
0.03851318359375,
0.048248291015625,
0.05145263671875,
-0.0205230712890625,
0.08343505859375,
-0.0257415771484375,
0.032806396484375,
0.0699462890625,
-0.017822265625,
0.071044921875,
0.032806396484375,
-0.0211181640625,
0.04248046875,
0.050689697265625,
-0.0112152099609375,
0.022003173828125,
-0.0066070556640625,
-0.00616455078125,
0.009063720703125,
-0.01145172119140625,
-0.032196044921875,
0.0357666015625,
0.0076751708984375,
-0.036956787109375,
0.0030040740966796875,
0.032867431640625,
0.031951904296875,
-0.023406982421875,
0.004329681396484375,
0.04443359375,
-0.0036468505859375,
-0.048675537109375,
0.08172607421875,
-0.0005116462707519531,
0.0733642578125,
-0.04412841796875,
0.00893402099609375,
-0.0157623291015625,
0.0204315185546875,
-0.0303802490234375,
-0.05706787109375,
0.0309295654296875,
-0.004543304443359375,
-0.0269317626953125,
-0.00914764404296875,
0.0252685546875,
-0.04022216796875,
-0.055419921875,
0.0406494140625,
0.037567138671875,
0.00844573974609375,
0.00002491474151611328,
-0.07464599609375,
0.01189422607421875,
0.027618408203125,
-0.03564453125,
0.02935791015625,
0.013427734375,
0.001300811767578125,
0.050048828125,
0.04986572265625,
-0.0024509429931640625,
0.0123138427734375,
0.004665374755859375,
0.060150146484375,
-0.0399169921875,
-0.013458251953125,
-0.0609130859375,
0.047393798828125,
-0.01134490966796875,
-0.031463623046875,
0.06378173828125,
0.047271728515625,
0.07379150390625,
-0.0090179443359375,
0.054779052734375,
-0.01824951171875,
0.03375244140625,
-0.039581298828125,
0.05120849609375,
-0.054473876953125,
-0.005420684814453125,
-0.0201416015625,
-0.069091796875,
-0.03363037109375,
0.04132080078125,
-0.0180511474609375,
0.0199432373046875,
0.04571533203125,
0.06597900390625,
-0.003467559814453125,
-0.0074615478515625,
0.029449462890625,
0.01222991943359375,
0.018646240234375,
0.04022216796875,
0.036895751953125,
-0.0633544921875,
0.0457763671875,
-0.0638427734375,
-0.0132293701171875,
0.0017404556274414062,
-0.0521240234375,
-0.0721435546875,
-0.0526123046875,
-0.0362548828125,
-0.02691650390625,
-0.0096893310546875,
0.07049560546875,
0.07098388671875,
-0.080078125,
-0.02777099609375,
0.00853729248046875,
0.0029163360595703125,
-0.020233154296875,
-0.021240234375,
0.0377197265625,
-0.021270751953125,
-0.08380126953125,
0.0207977294921875,
-0.005367279052734375,
0.0169677734375,
-0.03228759765625,
-0.007404327392578125,
-0.043670654296875,
-0.0118255615234375,
0.053619384765625,
0.01151275634765625,
-0.053070068359375,
0.00444793701171875,
0.0212249755859375,
-0.007732391357421875,
0.0184478759765625,
0.03155517578125,
-0.04071044921875,
0.0182952880859375,
0.0261993408203125,
0.034210205078125,
0.04608154296875,
-0.029083251953125,
0.021697998046875,
-0.05810546875,
0.0357666015625,
-0.00240325927734375,
0.0343017578125,
0.02508544921875,
-0.027862548828125,
0.050445556640625,
0.00879669189453125,
-0.03265380859375,
-0.07366943359375,
-0.0034008026123046875,
-0.06298828125,
-0.014404296875,
0.07537841796875,
-0.01873779296875,
-0.03472900390625,
0.00194549560546875,
-0.0164947509765625,
0.01316070556640625,
-0.01641845703125,
0.03021240234375,
0.047576904296875,
-0.00020766258239746094,
-0.03363037109375,
-0.061767578125,
0.036376953125,
0.0386962890625,
-0.049835205078125,
-0.00722503662109375,
-0.0016965866088867188,
0.01277923583984375,
0.037628173828125,
0.036407470703125,
-0.00812530517578125,
0.0020847320556640625,
-0.0079498291015625,
0.022979736328125,
0.00811004638671875,
-0.0185546875,
-0.03125,
-0.0058441162109375,
-0.0056304931640625,
-0.002872467041015625
]
] |
dreamlike-art/dreamlike-diffusion-1.0 | 2023-01-27T14:44:44.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"art",
"artistic",
"en",
"license:other",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | dreamlike-art | null | null | dreamlike-art/dreamlike-diffusion-1.0 | 997 | 73,097 | diffusers | 2022-12-11T04:16:04 | ---
language:
- en
license: other
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- art
- artistic
- diffusers
inference: false
---
# Dreamlike Diffusion 1.0 is SD 1.5 fine-tuned on high-quality art, made by [dreamlike.art](https://dreamlike.art/).
# If you want to use dreamlike models on your website/app/etc., check the license at the bottom first!
Use the same prompts as you would for SD 1.5. Add **dreamlikeart** if the art style is too weak.
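As a minimal sketch (the helper name is hypothetical, not part of any library), prepending the **dreamlikeart** trigger token can be automated so it is only added when missing:

```python
def add_trigger_token(prompt: str, token: str = "dreamlikeart") -> str:
    """Prepend the style trigger token unless the prompt already contains it."""
    if token.lower() in prompt.lower():
        return prompt
    return f"{token}, {prompt}"

print(add_trigger_token("a castle at dawn, concept art"))
# dreamlikeart, a castle at dawn, concept art
```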
Non-square aspect ratios work better for some prompts. If you want a portrait photo, try using a 2:3 or a 9:16 aspect ratio. If you want a landscape photo, try using a 3:2 or a 16:9 aspect ratio.
Use slightly higher resolution for better results: 640x640px, 512x768px, 768x512px, etc.
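The aspect-ratio and resolution advice above can be sketched as a small helper (hypothetical, assuming the common Stable Diffusion convention that height and width are multiples of 64, with the shorter side kept at 512px):

```python
def sd_dimensions(aspect_w: int, aspect_h: int, base: int = 512) -> tuple[int, int]:
    """Return (width, height) for an aspect ratio, rounded to multiples of 64,
    keeping the shorter side at `base` pixels."""
    def round64(x: float) -> int:
        return max(64, int(round(x / 64)) * 64)
    if aspect_w >= aspect_h:  # landscape or square: height is the shorter side
        return round64(base * aspect_w / aspect_h), base
    return base, round64(base * aspect_h / aspect_w)  # portrait: width is shorter

print(sd_dimensions(3, 2))  # landscape, e.g. 768x512
print(sd_dimensions(2, 3))  # portrait, e.g. 512x768
```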
# We've just released Dreamlike Photoreal 2.0, check it out!
[https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0](https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0)
<img src="https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/resolve/main/preview1.jpg" style="max-width: 400px;" width="100%"/>
### Examples
<img src="https://huggingface.co/dreamlike-art/dreamlike-diffusion-1.0/resolve/main/preview.jpg" style="max-width: 800px;" width="100%"/>
<img src="https://huggingface.co/dreamlike-art/dreamlike-diffusion-1.0/resolve/main/1.jpg" style="max-width: 800px;" width="100%"/>
<img src="https://huggingface.co/dreamlike-art/dreamlike-diffusion-1.0/resolve/main/2.jpg" style="max-width: 800px;" width="100%"/>
### dreamlike.art
You can use this model for free on [dreamlike.art](https://dreamlike.art/)!
<img src="https://huggingface.co/dreamlike-art/dreamlike-photoreal-1.0/resolve/main/dreamlike.jpg" style="max-width: 1000px;" width="100%"/>
### Gradio
We support a [Gradio](https://github.com/gradio-app/gradio) Web UI to run dreamlike-diffusion-1.0:
[](https://huggingface.co/spaces/akhaliq/dreamlike-diffusion-1.0)
### CompVis
[Download dreamlike-diffusion-1.0.ckpt (2.13GB)](https://huggingface.co/dreamlike-art/dreamlike-diffusion-1.0/resolve/main/dreamlike-diffusion-1.0.ckpt)
### 🧨 Diffusers
This model can be used just like any other Stable Diffusion model. For more information,
please have a look at the [Stable Diffusion Pipeline](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion).
```python
from diffusers import StableDiffusionPipeline
import torch
model_id = "dreamlike-art/dreamlike-diffusion-1.0"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda")
prompt = "dreamlikeart, a grungy woman with rainbow hair, travelling between dimensions, dynamic pose, happy, soft eyes and narrow chin, extreme bokeh, dainty figure, long hair straight down, torn kawaii shirt and baggy jeans, In style of by Jordan Grimmer and greg rutkowski, crisp lines and color, complex background, particles, lines, wind, concept art, sharp focus, vivid colors"
image = pipe(prompt).images[0]
image.save("./result.jpg")
```
# License
This model is licensed under a **modified** CreativeML OpenRAIL-M license.
- **You can't host or use the model or its derivatives on websites/apps/etc., from which you earn, will earn, or plan to earn revenue or donations. If you want to, please email us at contact@dreamlike.art**
- **You are free to host the model card and files (without any actual inference or fine-tuning) on both commercial and non-commercial websites/apps/etc. Please state the full model name (Dreamlike Diffusion 1.0) and include a link to the model card (https://huggingface.co/dreamlike-art/dreamlike-diffusion-1.0)**
- **You are free to host the model or its derivatives on completely non-commercial websites/apps/etc. (meaning you are not getting ANY revenue or donations). Please state the full model name (Dreamlike Diffusion 1.0) and include a link to the model card (https://huggingface.co/dreamlike-art/dreamlike-diffusion-1.0)**
- **You are free to use the outputs of the model or the outputs of the model's derivatives for commercial purposes in teams of 10 or fewer**
- You can't use the model to deliberately produce or share illegal or harmful outputs or content
- The authors claim no rights to the outputs you generate; you are free to use them, and you are accountable for their use, which must not go against the provisions set in the license
- You may redistribute the weights. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the **modified** CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully).

Please read the full license here: https://huggingface.co/dreamlike-art/dreamlike-diffusion-1.0/blob/main/LICENSE.md
| 4,989 | [
[
-0.0413818359375,
-0.05523681640625,
0.031951904296875,
0.037322998046875,
-0.03228759765625,
-0.00934600830078125,
0.00844573974609375,
-0.047943115234375,
0.0565185546875,
0.04644775390625,
-0.050689697265625,
-0.057891845703125,
-0.04144287109375,
-0.0225982666015625,
-0.01525115966796875,
0.07696533203125,
-0.0183258056640625,
-0.018524169921875,
-0.0222930908203125,
0.004985809326171875,
-0.01727294921875,
0.0092620849609375,
-0.060760498046875,
-0.015869140625,
0.02471923828125,
0.0028362274169921875,
0.052459716796875,
0.01457977294921875,
0.0280303955078125,
0.018829345703125,
-0.01100921630859375,
-0.019805908203125,
-0.0333251953125,
0.0034008026123046875,
0.0008893013000488281,
-0.0299224853515625,
-0.0711669921875,
0.021514892578125,
0.0328369140625,
0.012969970703125,
-0.0328369140625,
0.01409149169921875,
0.01064300537109375,
0.06744384765625,
-0.01995849609375,
0.005428314208984375,
-0.0062103271484375,
0.01331329345703125,
-0.022369384765625,
0.0228424072265625,
0.01210784912109375,
-0.045684814453125,
0.00543212890625,
-0.059967041015625,
0.0292816162109375,
-0.0017223358154296875,
0.07659912109375,
0.00933074951171875,
-0.007465362548828125,
0.0046844482421875,
-0.0390625,
0.0250701904296875,
-0.040557861328125,
0.03399658203125,
0.012420654296875,
0.028228759765625,
-0.00281524658203125,
-0.060455322265625,
-0.0318603515625,
0.005115509033203125,
0.00959014892578125,
0.0318603515625,
-0.0271759033203125,
0.01024627685546875,
0.01334381103515625,
0.0302581787109375,
-0.062042236328125,
-0.033203125,
-0.036895751953125,
-0.006740570068359375,
0.05712890625,
0.00638580322265625,
0.029266357421875,
-0.0038547515869140625,
-0.034393310546875,
0.0008182525634765625,
-0.02874755859375,
0.0069122314453125,
0.034637451171875,
0.011474609375,
-0.07000732421875,
0.03173828125,
-0.00725555419921875,
0.039947509765625,
0.00521087646484375,
-0.002025604248046875,
0.0295257568359375,
0.004772186279296875,
-0.0248260498046875,
-0.02642822265625,
0.06646728515625,
0.05865478515625,
-0.0035648345947265625,
-0.004581451416015625,
-0.0194244384765625,
0.0022373199462890625,
-0.0016603469848632812,
-0.0821533203125,
-0.03387451171875,
0.03948974609375,
-0.047943115234375,
-0.03228759765625,
-0.0175628662109375,
-0.06915283203125,
-0.02130126953125,
0.00860595703125,
0.0271453857421875,
-0.02764892578125,
-0.064208984375,
0.031829833984375,
-0.03485107421875,
0.0004322528839111328,
0.027984619140625,
-0.04388427734375,
0.0199127197265625,
0.0186309814453125,
0.09234619140625,
-0.005611419677734375,
0.019866943359375,
0.0239410400390625,
0.006824493408203125,
-0.02593994140625,
0.055755615234375,
-0.0220947265625,
-0.046234130859375,
-0.0142822265625,
0.00962066650390625,
-0.00832366943359375,
-0.035980224609375,
0.046356201171875,
-0.027252197265625,
0.0152130126953125,
-0.014129638671875,
-0.040618896484375,
-0.0199432373046875,
-0.007114410400390625,
-0.03692626953125,
0.0430908203125,
0.03448486328125,
-0.055694580078125,
0.0176239013671875,
-0.07269287109375,
-0.002529144287109375,
0.00847625732421875,
0.0099029541015625,
-0.0290985107421875,
0.001628875732421875,
-0.022430419921875,
0.031341552734375,
0.004459381103515625,
-0.01406097412109375,
-0.043365478515625,
-0.007640838623046875,
-0.0190277099609375,
-0.01004791259765625,
0.0828857421875,
0.040130615234375,
0.002674102783203125,
0.005218505859375,
-0.038665771484375,
-0.00946807861328125,
0.039276123046875,
0.00323486328125,
-0.019805908203125,
-0.0230865478515625,
0.0280303955078125,
0.0175933837890625,
0.0138702392578125,
-0.051361083984375,
0.019744873046875,
-0.03204345703125,
0.0123443603515625,
0.048583984375,
0.01119232177734375,
0.0303802490234375,
-0.048675537109375,
0.06695556640625,
0.02410888671875,
0.0247802734375,
0.0361328125,
-0.05206298828125,
-0.047088623046875,
-0.036041259765625,
0.0115814208984375,
0.018829345703125,
-0.04974365234375,
-0.0029506683349609375,
-0.00006556510925292969,
-0.0704345703125,
-0.0416259765625,
-0.004428863525390625,
0.01421356201171875,
0.0307769775390625,
0.009521484375,
-0.035858154296875,
-0.0220947265625,
-0.06488037109375,
0.00209808349609375,
0.009033203125,
0.003650665283203125,
0.029876708984375,
0.03826904296875,
-0.017333984375,
0.06488037109375,
-0.059295654296875,
-0.034088134765625,
-0.006412506103515625,
-0.0164031982421875,
0.037322998046875,
0.06475830078125,
0.08648681640625,
-0.061309814453125,
-0.051300048828125,
-0.016326904296875,
-0.06121826171875,
-0.009521484375,
0.0155181884765625,
-0.035858154296875,
0.011962890625,
-0.005611419677734375,
-0.07659912109375,
0.035430908203125,
0.058868408203125,
-0.061798095703125,
0.05157470703125,
-0.0294342041015625,
0.01422119140625,
-0.08819580078125,
0.002796173095703125,
0.045166015625,
-0.03521728515625,
-0.04656982421875,
0.033782958984375,
-0.02606201171875,
0.0010318756103515625,
-0.057037353515625,
0.072509765625,
-0.0265655517578125,
0.03558349609375,
-0.0167083740234375,
0.005207061767578125,
-0.0007481575012207031,
0.0234222412109375,
0.0017337799072265625,
0.031829833984375,
0.06939697265625,
-0.05108642578125,
0.0232696533203125,
0.0219268798828125,
-0.0254058837890625,
0.058197021484375,
-0.059967041015625,
0.00341033935546875,
-0.03131103515625,
0.024444580078125,
-0.0714111328125,
-0.02069091796875,
0.058929443359375,
-0.02490234375,
0.0117950439453125,
-0.033203125,
-0.01495361328125,
-0.0306243896484375,
-0.01171875,
0.0213623046875,
0.07464599609375,
-0.014617919921875,
0.046539306640625,
0.02947998046875,
0.008270263671875,
-0.0157012939453125,
-0.038848876953125,
-0.0254974365234375,
-0.03533935546875,
-0.06884765625,
0.038543701171875,
-0.0204010009765625,
-0.0208740234375,
0.010711669921875,
0.0010690689086914062,
0.004619598388671875,
-0.01242828369140625,
0.04058837890625,
0.0200042724609375,
0.00751495361328125,
-0.0274200439453125,
0.03363037109375,
-0.00592041015625,
0.003650665283203125,
-0.031402587890625,
0.039642333984375,
-0.0019989013671875,
-0.01227569580078125,
-0.05609130859375,
0.01409149169921875,
0.05084228515625,
0.00746917724609375,
0.06219482421875,
0.05291748046875,
-0.0421142578125,
0.00004744529724121094,
-0.037506103515625,
-0.0016956329345703125,
-0.037322998046875,
-0.004993438720703125,
-0.043548583984375,
-0.036895751953125,
0.05908203125,
0.0141143798828125,
0.029388427734375,
0.044647216796875,
0.032684326171875,
-0.0281219482421875,
0.07220458984375,
0.049560546875,
0.0269622802734375,
0.04052734375,
-0.0653076171875,
-0.026519775390625,
-0.06085205078125,
-0.033447265625,
-0.001800537109375,
-0.04376220703125,
-0.018585205078125,
-0.050018310546875,
0.0241851806640625,
0.0230560302734375,
-0.0167083740234375,
0.02734375,
-0.037872314453125,
0.0272064208984375,
0.004138946533203125,
0.035919189453125,
0.0157928466796875,
0.00783538818359375,
-0.0188140869140625,
-0.0027294158935546875,
-0.030364990234375,
-0.021209716796875,
0.058563232421875,
0.03082275390625,
0.058441162109375,
0.0201263427734375,
0.05340576171875,
0.024871826171875,
0.023681640625,
-0.027008056640625,
0.040985107421875,
-0.00844573974609375,
-0.052490234375,
0.0119476318359375,
-0.020965576171875,
-0.06402587890625,
0.031524658203125,
-0.0263214111328125,
-0.05059814453125,
0.0188140869140625,
0.0231475830078125,
-0.03057861328125,
0.04205322265625,
-0.056060791015625,
0.06512451171875,
0.01788330078125,
-0.039703369140625,
-0.01305389404296875,
-0.0285186767578125,
0.0288543701171875,
0.0188140869140625,
0.010986328125,
-0.03509521484375,
-0.0130157470703125,
0.057159423828125,
-0.03094482421875,
0.056243896484375,
-0.055572509765625,
-0.0163116455078125,
0.0283660888671875,
0.027587890625,
0.0236053466796875,
0.0080413818359375,
-0.006031036376953125,
0.0225982666015625,
0.0203704833984375,
-0.043853759765625,
-0.03106689453125,
0.0572509765625,
-0.0491943359375,
-0.03826904296875,
-0.01418304443359375,
-0.038299560546875,
0.01157379150390625,
0.0155487060546875,
0.048187255859375,
0.0175628662109375,
-0.0209503173828125,
-0.01172637939453125,
0.0572509765625,
0.00557708740234375,
0.036224365234375,
0.0216064453125,
-0.059173583984375,
-0.02423095703125,
0.06085205078125,
0.0040130615234375,
0.03076171875,
-0.004817962646484375,
0.02471923828125,
-0.0308380126953125,
-0.03973388671875,
-0.04217529296875,
0.034881591796875,
-0.0293426513671875,
-0.0062408447265625,
-0.05999755859375,
-0.01377105712890625,
-0.035797119140625,
-0.01448822021484375,
-0.0369873046875,
-0.0341796875,
-0.04046630859375,
-0.00180816650390625,
0.052703857421875,
0.040740966796875,
-0.0220184326171875,
0.0098724365234375,
-0.043060302734375,
0.0166015625,
0.007801055908203125,
0.05401611328125,
0.0008502006530761719,
-0.04736328125,
-0.0031108856201171875,
0.0211944580078125,
-0.01512908935546875,
-0.06463623046875,
0.041046142578125,
-0.0007071495056152344,
0.027557373046875,
0.041595458984375,
-0.00496673583984375,
0.054412841796875,
-0.024627685546875,
0.060577392578125,
0.0380859375,
-0.041015625,
0.054351806640625,
-0.0584716796875,
0.002025604248046875,
0.0209197998046875,
0.041168212890625,
-0.045074462890625,
-0.031463623046875,
-0.05780029296875,
-0.0423583984375,
0.0323486328125,
0.03240966796875,
0.02197265625,
0.021514892578125,
0.053924560546875,
0.00832366943359375,
0.01024627685546875,
-0.058868408203125,
-0.040740966796875,
-0.0135040283203125,
0.0008826255798339844,
0.0021114349365234375,
0.00882720947265625,
-0.0212860107421875,
-0.0286865234375,
0.06500244140625,
0.007965087890625,
0.0335693359375,
0.0216064453125,
0.0260772705078125,
-0.0233154296875,
-0.0204925537109375,
0.03094482421875,
0.030059814453125,
-0.024261474609375,
-0.02642822265625,
-0.00733184814453125,
-0.03436279296875,
0.00942230224609375,
0.009857177734375,
-0.031402587890625,
0.005123138427734375,
-0.0271453857421875,
0.055999755859375,
-0.029327392578125,
-0.0263519287109375,
0.041290283203125,
-0.007061004638671875,
-0.042205810546875,
-0.031524658203125,
0.02825927734375,
0.021820068359375,
0.05352783203125,
-0.0023403167724609375,
0.04620361328125,
0.0230712890625,
-0.005741119384765625,
-0.00467681884765625,
0.058074951171875,
-0.047088623046875,
-0.029388427734375,
0.09185791015625,
0.0066375732421875,
-0.0226898193359375,
0.01525115966796875,
-0.039886474609375,
-0.00806427001953125,
0.04150390625,
0.0478515625,
0.071533203125,
-0.0157012939453125,
0.040863037109375,
0.0307769775390625,
-0.00838470458984375,
-0.01332855224609375,
0.040985107421875,
0.01392364501953125,
-0.04364013671875,
0.005802154541015625,
-0.0567626953125,
-0.0023059844970703125,
-0.00756072998046875,
-0.03424072265625,
0.037872314453125,
-0.040863037109375,
-0.0210418701171875,
-0.019561767578125,
-0.003108978271484375,
-0.04205322265625,
0.0189056396484375,
0.01415252685546875,
0.087890625,
-0.07147216796875,
0.055908203125,
0.0423583984375,
-0.044342041015625,
-0.04522705078125,
-0.0119476318359375,
0.0112762451171875,
-0.037384033203125,
-0.00021898746490478516,
-0.0046234130859375,
-0.0135498046875,
0.00940704345703125,
-0.0504150390625,
-0.0584716796875,
0.09130859375,
0.031524658203125,
-0.024749755859375,
-0.01678466796875,
-0.030120849609375,
0.04241943359375,
-0.035736083984375,
0.023590087890625,
0.02642822265625,
0.0139617919921875,
0.04058837890625,
-0.055419921875,
0.01226043701171875,
-0.0268096923828125,
0.023193359375,
0.006290435791015625,
-0.08154296875,
0.05401611328125,
-0.0281219482421875,
-0.0254974365234375,
0.033660888671875,
0.06048583984375,
0.0266571044921875,
0.0221099853515625,
0.046630859375,
0.06842041015625,
0.047637939453125,
-0.01270294189453125,
0.08349609375,
-0.01172637939453125,
0.04364013671875,
0.038726806640625,
-0.0004241466522216797,
0.050140380859375,
0.0150299072265625,
-0.0240478515625,
0.059356689453125,
0.0584716796875,
0.010986328125,
0.038665771484375,
0.018890380859375,
-0.03692626953125,
-0.00806427001953125,
-0.00417327880859375,
-0.05279541015625,
0.0008587837219238281,
0.019744873046875,
-0.0248260498046875,
-0.008819580078125,
0.0245513916015625,
-0.0001817941665649414,
-0.02203369140625,
-0.0131378173828125,
0.026458740234375,
0.0120391845703125,
-0.01202392578125,
0.040618896484375,
-0.01143646240234375,
0.068603515625,
-0.049652099609375,
-0.004161834716796875,
-0.02752685546875,
0.0017299652099609375,
-0.0347900390625,
-0.0682373046875,
0.023162841796875,
-0.00604248046875,
-0.001232147216796875,
-0.0288848876953125,
0.050323486328125,
-0.0249786376953125,
-0.04107666015625,
0.022613525390625,
0.03411865234375,
0.04229736328125,
-0.00400543212890625,
-0.07537841796875,
0.0175933837890625,
0.003810882568359375,
-0.0278472900390625,
0.021240234375,
-0.00028896331787109375,
0.038726806640625,
0.072998046875,
0.026763916015625,
0.0177459716796875,
-0.01122283935546875,
-0.004192352294921875,
0.05596923828125,
-0.0274810791015625,
-0.040283203125,
-0.046875,
0.053558349609375,
-0.00615692138671875,
-0.0131988525390625,
0.045928955078125,
0.05523681640625,
0.049346923828125,
-0.0229644775390625,
0.0552978515625,
-0.038330078125,
0.0267486572265625,
-0.02545166015625,
0.08197021484375,
-0.080322265625,
0.0042877197265625,
-0.0545654296875,
-0.07965087890625,
-0.01348876953125,
0.053314208984375,
0.004459381103515625,
0.0169677734375,
0.0077056884765625,
0.07000732421875,
-0.0168609619140625,
0.002048492431640625,
0.0145721435546875,
0.0049285888671875,
0.017181396484375,
0.037933349609375,
0.04327392578125,
-0.04052734375,
0.01467132568359375,
-0.042327880859375,
-0.028900146484375,
0.0020313262939453125,
-0.056610107421875,
-0.0589599609375,
-0.050323486328125,
-0.0587158203125,
-0.06781005859375,
-0.003459930419921875,
0.07696533203125,
0.072509765625,
-0.033660888671875,
-0.0176239013671875,
-0.0105743408203125,
0.00909423828125,
-0.01580810546875,
-0.0203857421875,
0.00591278076171875,
0.0322265625,
-0.08135986328125,
0.002140045166015625,
0.007007598876953125,
0.06707763671875,
-0.01751708984375,
-0.0135345458984375,
0.00501251220703125,
-0.011871337890625,
0.036224365234375,
0.01666259765625,
-0.0487060546875,
-0.007160186767578125,
-0.0169677734375,
-0.00007432699203491211,
0.00327301025390625,
0.0146942138671875,
-0.048248291015625,
0.02239990234375,
0.0369873046875,
-0.01454925537109375,
0.04803466796875,
-0.00505828857421875,
0.01340484619140625,
-0.0328369140625,
0.0144500732421875,
0.01151275634765625,
0.0288238525390625,
-0.0076141357421875,
-0.028228759765625,
0.043731689453125,
0.04949951171875,
-0.03265380859375,
-0.0360107421875,
0.0117034912109375,
-0.10113525390625,
-0.0194244384765625,
0.07574462890625,
-0.00681304931640625,
-0.0063323974609375,
0.01007080078125,
-0.0303192138671875,
-0.0025177001953125,
-0.037811279296875,
0.027130126953125,
0.03363037109375,
-0.03973388671875,
-0.023162841796875,
-0.05072021484375,
0.040924072265625,
0.0007953643798828125,
-0.044342041015625,
-0.007007598876953125,
0.042816162109375,
0.050994873046875,
0.018402099609375,
0.07318115234375,
-0.0252227783203125,
0.0037384033203125,
0.012725830078125,
0.01389312744140625,
0.005435943603515625,
-0.00962066650390625,
-0.0221405029296875,
0.0140228271484375,
-0.023193359375,
-0.022705078125
]
] |
ehcalabres/wav2vec2-lg-xlsr-en-speech-emotion-recognition | 2021-09-21T20:59:32.000Z | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"audio-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | audio-classification | ehcalabres | null | null | ehcalabres/wav2vec2-lg-xlsr-en-speech-emotion-recognition | 84 | 72,717 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model_index:
name: wav2vec2-lg-xlsr-en-speech-emotion-recognition
---
# Speech Emotion Recognition By Fine-Tuning Wav2Vec 2.0
The model is a fine-tuned version of [jonatasgrosman/wav2vec2-large-xlsr-53-english](https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-english) for a Speech Emotion Recognition (SER) task.
The dataset used to fine-tune the original pre-trained model is the [RAVDESS dataset](https://zenodo.org/record/1188976#.YO6yI-gzaUk). This dataset provides 1440 samples of recordings from actors performing 8 different emotions in English, which are:
```python
emotions = ['angry', 'calm', 'disgust', 'fearful', 'happy', 'neutral', 'sad', 'surprised']
```
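For illustration only (the logit values below are made up, not real model outputs), mapping the model's 8 output logits onto these labels is a softmax followed by an argmax:

```python
import math

emotions = ['angry', 'calm', 'disgust', 'fearful', 'happy', 'neutral', 'sad', 'surprised']

def predict_emotion(logits: list[float]) -> tuple[str, float]:
    """Softmax the raw logits and return the top label with its probability."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # subtract max for numerical stability
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return emotions[best], probs[best]

label, prob = predict_emotion([0.1, 2.3, -1.0, 0.0, 1.2, 0.4, -0.5, 0.2])
print(label)  # calm
```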
It achieves the following results on the evaluation set:
- Loss: 0.5023
- Accuracy: 0.8223
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
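The hyperparameters above are internally consistent: with gradient accumulation, the effective (total) train batch size is the per-device batch size times the accumulation steps. A quick arithmetic check using the card's own numbers:

```python
train_batch_size = 4
gradient_accumulation_steps = 2

# Effective batch size seen by each optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 8

# Optimizer steps per epoch implied by the results table:
# step 420 is logged at roughly epoch 2.93, so ~143 steps per epoch.
print(round(420 / 2.93))  # 143
```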
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0752 | 0.21 | 30 | 2.0505 | 0.1359 |
| 2.0119 | 0.42 | 60 | 1.9340 | 0.2474 |
| 1.8073 | 0.63 | 90 | 1.5169 | 0.3902 |
| 1.5418 | 0.84 | 120 | 1.2373 | 0.5610 |
| 1.1432 | 1.05 | 150 | 1.1579 | 0.5610 |
| 0.9645 | 1.26 | 180 | 0.9610 | 0.6167 |
| 0.8811 | 1.47 | 210 | 0.8063 | 0.7178 |
| 0.8756 | 1.68 | 240 | 0.7379 | 0.7352 |
| 0.8208 | 1.89 | 270 | 0.6839 | 0.7596 |
| 0.7118 | 2.1 | 300 | 0.6664 | 0.7735 |
| 0.4261 | 2.31 | 330 | 0.6058 | 0.8014 |
| 0.4394 | 2.52 | 360 | 0.5754 | 0.8223 |
| 0.4581 | 2.72 | 390 | 0.4719 | 0.8467 |
| 0.3967 | 2.93 | 420 | 0.5023 | 0.8223 |
## Contact
If you have any questions, contact me on [Twitter](https://twitter.com/ehcalabres) (GitHub repo coming soon).
### Framework versions
- Transformers 4.8.2
- Pytorch 1.9.0+cu102
- Datasets 1.9.0
- Tokenizers 0.10.3
| 2,633 | [
[
-0.04217529296875,
-0.0307464599609375,
0.00801849365234375,
-0.005680084228515625,
-0.002422332763671875,
-0.005817413330078125,
-0.0084991455078125,
-0.02532958984375,
0.014678955078125,
0.0189056396484375,
-0.061126708984375,
-0.04547119140625,
-0.054962158203125,
-0.0113525390625,
-0.00543975830078125,
0.07232666015625,
0.0006814002990722656,
0.0013074874877929688,
0.0049896240234375,
-0.0203857421875,
-0.01690673828125,
-0.0225067138671875,
-0.058563232421875,
-0.041595458984375,
0.0292816162109375,
0.03546142578125,
0.048736572265625,
0.043182373046875,
0.0306549072265625,
0.027374267578125,
-0.0250244140625,
0.0026264190673828125,
-0.0404052734375,
-0.00592803955078125,
0.01013946533203125,
-0.033660888671875,
-0.0206298828125,
-0.0014371871948242188,
0.03375244140625,
0.016265869140625,
-0.01471710205078125,
0.031341552734375,
0.005718231201171875,
0.061553955078125,
-0.039825439453125,
0.01384735107421875,
-0.0165252685546875,
0.030548095703125,
-0.01025390625,
-0.0189971923828125,
-0.0005488395690917969,
-0.0203704833984375,
0.0176239013671875,
-0.035675048828125,
0.02398681640625,
0.0008821487426757812,
0.08551025390625,
0.031982421875,
-0.029022216796875,
-0.0057830810546875,
-0.05126953125,
0.05712890625,
-0.037139892578125,
0.034088134765625,
0.04095458984375,
0.019073486328125,
0.0027256011962890625,
-0.061126708984375,
-0.046966552734375,
0.021514892578125,
0.0130462646484375,
0.02801513671875,
-0.034912109375,
-0.00995635986328125,
0.04974365234375,
0.03326416015625,
-0.045806884765625,
0.01458740234375,
-0.031646728515625,
-0.0266265869140625,
0.039764404296875,
0.0162811279296875,
-0.00443267822265625,
-0.0188751220703125,
-0.01708984375,
-0.0192413330078125,
-0.025665283203125,
0.04766845703125,
0.03924560546875,
0.005161285400390625,
-0.042266845703125,
0.032440185546875,
-0.0176544189453125,
0.0384521484375,
0.0052642822265625,
-0.0001385211944580078,
0.058868408203125,
0.00098419189453125,
-0.0246734619140625,
0.01549530029296875,
0.06829833984375,
0.0321044921875,
0.0126800537109375,
0.016754150390625,
-0.01438140869140625,
-0.0003151893615722656,
0.01506805419921875,
-0.07940673828125,
-0.0240325927734375,
0.0423583984375,
-0.039642333984375,
-0.0287628173828125,
0.0026073455810546875,
-0.06201171875,
-0.01232147216796875,
-0.035491943359375,
0.0369873046875,
-0.0207366943359375,
-0.019012451171875,
-0.00576019287109375,
-0.0113372802734375,
0.03729248046875,
0.0117950439453125,
-0.0797119140625,
0.021484375,
0.040557861328125,
0.06298828125,
0.01788330078125,
-0.0096282958984375,
-0.0289154052734375,
-0.016876220703125,
-0.0233154296875,
0.049957275390625,
-0.00855255126953125,
-0.025848388671875,
-0.0004010200500488281,
0.0186004638671875,
0.006572723388671875,
-0.029754638671875,
0.051483154296875,
-0.02117919921875,
0.01971435546875,
-0.0242462158203125,
-0.027252197265625,
-0.01425933837890625,
0.01171875,
-0.0389404296875,
0.09906005859375,
0.01465606689453125,
-0.0562744140625,
0.0200653076171875,
-0.036224365234375,
-0.01172637939453125,
-0.01528167724609375,
-0.0146636962890625,
-0.06561279296875,
-0.0118865966796875,
0.0114898681640625,
0.029144287109375,
-0.031524658203125,
-0.0063629150390625,
-0.00992584228515625,
-0.03192138671875,
-0.00018858909606933594,
-0.0246734619140625,
0.0655517578125,
0.0227203369140625,
-0.0562744140625,
0.012603759765625,
-0.076904296875,
0.02532958984375,
0.0089263916015625,
-0.0181884765625,
0.00478363037109375,
-0.00775909423828125,
0.0172576904296875,
0.038330078125,
0.00601959228515625,
-0.039947509765625,
-0.001964569091796875,
-0.033416748046875,
0.03582763671875,
0.0552978515625,
-0.0055389404296875,
0.01715087890625,
-0.022979736328125,
0.028717041015625,
0.00824737548828125,
0.0197296142578125,
0.006656646728515625,
-0.03759765625,
-0.07470703125,
-0.0223846435546875,
0.003536224365234375,
0.041107177734375,
-0.01274871826171875,
0.05426025390625,
-0.0005230903625488281,
-0.050262451171875,
-0.054840087890625,
-0.001399993896484375,
0.02862548828125,
0.051513671875,
0.0286102294921875,
0.0021820068359375,
-0.0606689453125,
-0.06396484375,
-0.0014276504516601562,
-0.0030956268310546875,
0.00888824462890625,
0.0267791748046875,
0.04925537109375,
-0.028717041015625,
0.0618896484375,
-0.040557861328125,
-0.034393310546875,
-0.001491546630859375,
0.01250457763671875,
0.027313232421875,
0.04339599609375,
0.04345703125,
-0.058319091796875,
-0.0283355712890625,
0.005283355712890625,
-0.050323486328125,
0.0204315185546875,
0.0008625984191894531,
-0.01371002197265625,
0.0141448974609375,
0.0318603515625,
-0.0311126708984375,
0.04473876953125,
0.0455322265625,
-0.0293731689453125,
0.052337646484375,
-0.025665283203125,
0.0193328857421875,
-0.091796875,
0.01056671142578125,
0.0163116455078125,
-0.012969970703125,
-0.0357666015625,
-0.0203094482421875,
0.00763702392578125,
-0.01367950439453125,
-0.042510986328125,
0.03387451171875,
-0.027679443359375,
-0.0018100738525390625,
-0.0103607177734375,
-0.0182647705078125,
-0.005035400390625,
0.050262451171875,
0.0025310516357421875,
0.0562744140625,
0.062286376953125,
-0.032958984375,
0.03997802734375,
0.0277862548828125,
-0.052398681640625,
0.03558349609375,
-0.067626953125,
0.0196380615234375,
-0.0006489753723144531,
0.004146575927734375,
-0.07281494140625,
-0.025787353515625,
0.0197296142578125,
-0.0484619140625,
0.019561767578125,
-0.0206756591796875,
-0.01277923583984375,
-0.054107666015625,
-0.0198211669921875,
0.01358795166015625,
0.06585693359375,
-0.039886474609375,
0.035919189453125,
0.0194091796875,
-0.0011615753173828125,
-0.04840087890625,
-0.06591796875,
-0.00458526611328125,
-0.01100921630859375,
-0.047943115234375,
0.0278778076171875,
0.00928497314453125,
-0.01136016845703125,
-0.007564544677734375,
-0.018524169921875,
-0.01042938232421875,
0.00008600950241088867,
0.0426025390625,
0.019683837890625,
-0.00356292724609375,
-0.002773284912109375,
-0.0000845193862915039,
-0.02337646484375,
0.01194000244140625,
-0.007274627685546875,
0.0482177734375,
-0.029022216796875,
-0.02484130859375,
-0.07421875,
0.004161834716796875,
0.05322265625,
-0.03717041015625,
0.052490234375,
0.06396484375,
-0.04302978515625,
-0.00627899169921875,
-0.03009033203125,
-0.00809478759765625,
-0.037384033203125,
0.04388427734375,
-0.038116455078125,
-0.06988525390625,
0.06793212890625,
-0.0011911392211914062,
0.0016164779663085938,
0.055633544921875,
0.05804443359375,
-0.0176849365234375,
0.08892822265625,
0.0202789306640625,
-0.01129913330078125,
0.0281524658203125,
-0.0645751953125,
0.00832366943359375,
-0.061798095703125,
-0.056671142578125,
-0.05023193359375,
-0.036712646484375,
-0.047454833984375,
-0.01064300537109375,
0.019287109375,
-0.00615692138671875,
-0.04083251953125,
0.0060577392578125,
-0.0419921875,
0.0272979736328125,
0.0560302734375,
0.022674560546875,
-0.0208282470703125,
0.0081024169921875,
-0.0007748603820800781,
-0.007007598876953125,
-0.036407470703125,
-0.0203094482421875,
0.068603515625,
0.0265045166015625,
0.046875,
0.004695892333984375,
0.05780029296875,
0.01169586181640625,
-0.02490234375,
-0.07135009765625,
0.011383056640625,
-0.003997802734375,
-0.04022216796875,
-0.0211944580078125,
-0.0292816162109375,
-0.06866455078125,
0.006511688232421875,
-0.0251617431640625,
-0.06365966796875,
0.05120849609375,
0.0172119140625,
-0.04388427734375,
0.0244598388671875,
-0.03643798828125,
0.0721435546875,
-0.00926971435546875,
-0.0343017578125,
-0.0250244140625,
-0.06939697265625,
0.0009403228759765625,
-0.001346588134765625,
0.0019178390502929688,
-0.01226806640625,
0.017822265625,
0.069091796875,
-0.052764892578125,
0.03533935546875,
-0.006381988525390625,
0.01032257080078125,
0.032196044921875,
-0.00914764404296875,
0.05133056640625,
-0.0041351318359375,
-0.0143585205078125,
0.01084136962890625,
0.0026798248291015625,
-0.033721923828125,
-0.0360107421875,
0.061126708984375,
-0.09033203125,
-0.028289794921875,
-0.04388427734375,
-0.0287628173828125,
0.0035858154296875,
0.0163116455078125,
0.046722412109375,
0.05609130859375,
0.00017213821411132812,
0.027099609375,
0.05877685546875,
-0.005657196044921875,
0.029205322265625,
0.024810791015625,
0.0189056396484375,
-0.041900634765625,
0.06414794921875,
0.0024852752685546875,
0.01540374755859375,
-0.003818511962890625,
0.0150146484375,
-0.0293731689453125,
-0.01544952392578125,
-0.0164794921875,
-0.0009179115295410156,
-0.036376953125,
-0.01506805419921875,
-0.045440673828125,
-0.012847900390625,
-0.052581787109375,
-0.0139007568359375,
-0.037811279296875,
-0.01187896728515625,
-0.0479736328125,
-0.01171875,
0.0640869140625,
0.037811279296875,
-0.00597381591796875,
0.017913818359375,
-0.050933837890625,
0.0179443359375,
0.0164337158203125,
0.0186767578125,
-0.00518035888671875,
-0.06768798828125,
-0.0274658203125,
0.01364898681640625,
-0.01325225830078125,
-0.04925537109375,
0.048431396484375,
0.02227783203125,
0.0304718017578125,
0.045196533203125,
-0.003551483154296875,
0.06951904296875,
-0.034454345703125,
0.053436279296875,
0.0300140380859375,
-0.06610107421875,
0.05767822265625,
-0.0174713134765625,
0.0233917236328125,
0.049346923828125,
0.034332275390625,
-0.052886962890625,
-0.01239013671875,
-0.0576171875,
-0.06829833984375,
0.0865478515625,
0.0229949951171875,
-0.001972198486328125,
0.01404571533203125,
0.0244140625,
-0.026641845703125,
0.0139312744140625,
-0.0300140380859375,
-0.04644775390625,
-0.0234375,
-0.02410888671875,
-0.01024627685546875,
-0.0231475830078125,
-0.0118865966796875,
-0.05047607421875,
0.05035400390625,
0.0133209228515625,
0.028228759765625,
0.025909423828125,
0.01153564453125,
-0.01488494873046875,
0.00720977783203125,
0.0312042236328125,
0.0347900390625,
-0.041900634765625,
0.007801055908203125,
0.003238677978515625,
-0.04119873046875,
0.01232147216796875,
0.00921630859375,
-0.004131317138671875,
-0.0045013427734375,
0.01020050048828125,
0.09124755859375,
0.005702972412109375,
-0.029510498046875,
0.04290771484375,
0.0015277862548828125,
-0.038970947265625,
-0.035430908203125,
0.0094757080078125,
-0.0094757080078125,
0.040557861328125,
0.0252838134765625,
0.031951904296875,
0.00998687744140625,
-0.038299560546875,
0.01751708984375,
0.007350921630859375,
-0.057373046875,
-0.0187530517578125,
0.05657958984375,
0.00899505615234375,
-0.036865234375,
0.064208984375,
-0.00926971435546875,
-0.05743408203125,
0.0780029296875,
0.028289794921875,
0.07373046875,
-0.0311431884765625,
0.0090179443359375,
0.05804443359375,
0.011566162109375,
-0.0157318115234375,
0.0479736328125,
0.0021610260009765625,
-0.035552978515625,
-0.005245208740234375,
-0.041839599609375,
-0.0240936279296875,
0.00809478759765625,
-0.0823974609375,
0.0343017578125,
-0.0291748046875,
-0.0213470458984375,
-0.0007061958312988281,
0.004055023193359375,
-0.061798095703125,
0.038360595703125,
0.0169525146484375,
0.068359375,
-0.062469482421875,
0.059234619140625,
0.050262451171875,
-0.0268707275390625,
-0.07415771484375,
-0.02032470703125,
0.01358795166015625,
-0.051055908203125,
0.0423583984375,
0.0231781005859375,
-0.0003886222839355469,
0.029876708984375,
-0.032318115234375,
-0.0670166015625,
0.09344482421875,
0.0021152496337890625,
-0.055084228515625,
0.0245819091796875,
0.0042724609375,
0.0458984375,
-0.0012340545654296875,
0.024810791015625,
0.053314208984375,
0.0256195068359375,
0.01247406005859375,
-0.056121826171875,
-0.0026187896728515625,
-0.035675048828125,
-0.01003265380859375,
0.0214691162109375,
-0.060882568359375,
0.0626220703125,
-0.0301361083984375,
0.01537322998046875,
0.00160980224609375,
0.052398681640625,
0.0172882080078125,
0.0136871337890625,
0.0350341796875,
0.067626953125,
0.073486328125,
-0.025787353515625,
0.07666015625,
-0.0301971435546875,
0.05908203125,
0.0751953125,
-0.00019729137420654297,
0.0631103515625,
0.038818359375,
-0.033966064453125,
0.0256805419921875,
0.07537841796875,
-0.017608642578125,
0.045196533203125,
-0.0013914108276367188,
-0.0249481201171875,
-0.0240936279296875,
0.00548553466796875,
-0.043426513671875,
0.0197296142578125,
0.0114898681640625,
-0.0250091552734375,
0.00007402896881103516,
0.00803375244140625,
-0.00726318359375,
-0.0243682861328125,
-0.03326416015625,
0.047088623046875,
-0.0092926025390625,
-0.0191192626953125,
0.0587158203125,
-0.00791168212890625,
0.037109375,
-0.0489501953125,
0.0013208389282226562,
-0.0004432201385498047,
0.0243682861328125,
-0.04754638671875,
-0.06573486328125,
0.00759124755859375,
0.0078277587890625,
-0.01477813720703125,
0.0014514923095703125,
0.029937744140625,
-0.0198516845703125,
-0.033843994140625,
0.012237548828125,
0.008331298828125,
0.01442718505859375,
-0.00031495094299316406,
-0.0626220703125,
0.0014057159423828125,
0.01393890380859375,
-0.046112060546875,
0.008148193359375,
0.03558349609375,
0.0172882080078125,
0.041748046875,
0.04583740234375,
0.0018472671508789062,
0.0055389404296875,
0.0216064453125,
0.07281494140625,
-0.04827880859375,
-0.04534912109375,
-0.052093505859375,
0.03948974609375,
-0.004444122314453125,
-0.057861328125,
0.043792724609375,
0.0518798828125,
0.04620361328125,
0.0015020370483398438,
0.04583740234375,
-0.0182342529296875,
0.035919189453125,
-0.02947998046875,
0.052825927734375,
-0.062225341796875,
-0.01158905029296875,
-0.01509857177734375,
-0.051666259765625,
-0.01483154296875,
0.0650634765625,
-0.018524169921875,
0.0024776458740234375,
0.03668212890625,
0.0733642578125,
0.01690673828125,
0.00022339820861816406,
0.0107574462890625,
0.0211944580078125,
0.01611328125,
0.035797119140625,
0.03948974609375,
-0.053558349609375,
0.041900634765625,
-0.043731689453125,
-0.0128936767578125,
-0.0181427001953125,
-0.031982421875,
-0.0489501953125,
-0.042388916015625,
-0.037628173828125,
-0.047454833984375,
0.0038509368896484375,
0.08953857421875,
0.046966552734375,
-0.06597900390625,
-0.01247406005859375,
-0.01409149169921875,
-0.016876220703125,
-0.01434326171875,
-0.01418304443359375,
0.046722412109375,
-0.0122833251953125,
-0.06512451171875,
-0.00360107421875,
-0.0009131431579589844,
0.0224456787109375,
-0.00194549560546875,
-0.024322509765625,
-0.003971099853515625,
-0.020721435546875,
0.029937744140625,
0.031402587890625,
-0.039459228515625,
-0.0189208984375,
-0.0013990402221679688,
-0.0152435302734375,
0.0268707275390625,
0.0245361328125,
-0.03759765625,
0.039703369140625,
0.032135009765625,
0.032012939453125,
0.05584716796875,
-0.00787353515625,
0.00888824462890625,
-0.034912109375,
0.031341552734375,
0.0164947509765625,
0.0318603515625,
0.00704193115234375,
-0.0289459228515625,
0.027801513671875,
0.018768310546875,
-0.060028076171875,
-0.056732177734375,
-0.004985809326171875,
-0.0986328125,
0.0026035308837890625,
0.095458984375,
-0.0037746429443359375,
-0.0322265625,
0.007671356201171875,
-0.0357666015625,
0.033050537109375,
-0.038116455078125,
0.0389404296875,
0.044097900390625,
-0.01320648193359375,
-0.0063934326171875,
-0.0369873046875,
0.05072021484375,
0.01605224609375,
-0.0467529296875,
-0.005573272705078125,
0.032867431640625,
0.0216827392578125,
-0.005367279052734375,
0.0225982666015625,
-0.0121307373046875,
0.039215087890625,
0.0015869140625,
0.03814697265625,
-0.010772705078125,
-0.00785064697265625,
-0.051116943359375,
-0.01177978515625,
-0.005126953125,
-0.03692626953125
]
] |
cardiffnlp/twitter-roberta-base-hate | 2023-04-19T07:54:22.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"roberta",
"text-classification",
"arxiv:2010.12421",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cardiffnlp | null | null | cardiffnlp/twitter-roberta-base-hate | 12 | 72,692 | transformers | 2022-03-02T23:29:05 | # Twitter-roBERTa-base for Hate Speech Detection
This is a roBERTa-base model trained on ~58M tweets and fine-tuned for hate speech detection with the TweetEval benchmark.
This model is specialized to detect hate speech against women and immigrants.
**NEW!** We have made available a more recent and robust hate speech detection model here: [https://huggingface.co/cardiffnlp/twitter-roberta-base-hate-latest](https://huggingface.co/cardiffnlp/twitter-roberta-base-hate-latest)
- Paper: [_TweetEval_ benchmark (Findings of EMNLP 2020)](https://arxiv.org/pdf/2010.12421.pdf).
- Git Repo: [Tweeteval official repository](https://github.com/cardiffnlp/tweeteval).
## Example of classification
```python
from transformers import AutoModelForSequenceClassification
from transformers import TFAutoModelForSequenceClassification
from transformers import AutoTokenizer
import numpy as np
from scipy.special import softmax
import csv
import urllib.request
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
# Tasks:
# emoji, emotion, hate, irony, offensive, sentiment
# stance/abortion, stance/atheism, stance/climate, stance/feminist, stance/hillary
task='hate'
MODEL = f"cardiffnlp/twitter-roberta-base-{task}"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
# download label mapping
labels=[]
mapping_link = f"https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/{task}/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
# PT
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.save_pretrained(MODEL)
text = "Good night 😊"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
scores = softmax(scores)
# # TF
# model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
# model.save_pretrained(MODEL)
# text = "Good night 😊"
# encoded_input = tokenizer(text, return_tensors='tf')
# output = model(encoded_input)
# scores = output[0][0].numpy()
# scores = softmax(scores)
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(scores.shape[0]):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
Output:
```
1) not-hate 0.9168
2) hate 0.0832
```
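The username/link placeholder step at the top of the example can be checked in isolation. A minimal, self-contained sketch (the function is restated here so it runs on its own; the sample tweet is made up):

```python
# Re-stated preprocessing: replace @mentions and links with placeholders,
# matching the tokenization the model saw during training.
def preprocess(text):
    new_text = []
    for t in text.split(" "):
        t = '@user' if t.startswith('@') and len(t) > 1 else t
        t = 'http' if t.startswith('http') else t
        new_text.append(t)
    return " ".join(new_text)

print(preprocess("@john check https://example.com now"))
# → @user check http now
```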
| 2,670 | [
[
-0.004940032958984375,
-0.05133056640625,
0.00814056396484375,
0.01995849609375,
-0.006282806396484375,
0.013763427734375,
-0.0222015380859375,
-0.01213836669921875,
0.01212310791015625,
0.0127105712890625,
-0.0269622802734375,
-0.06414794921875,
-0.06304931640625,
-0.001888275146484375,
-0.0440673828125,
0.08648681640625,
0.01277923583984375,
0.002681732177734375,
0.0260772705078125,
-0.01263427734375,
-0.01131439208984375,
-0.036376953125,
-0.050323486328125,
-0.017120361328125,
0.037261962890625,
0.028778076171875,
0.0308837890625,
0.032958984375,
0.0241241455078125,
0.03704833984375,
-0.0054473876953125,
-0.0081787109375,
-0.042633056640625,
0.0186004638671875,
-0.0027923583984375,
-0.029632568359375,
-0.032257080078125,
0.01215362548828125,
0.044158935546875,
0.0229644775390625,
0.0021190643310546875,
0.0231170654296875,
-0.005420684814453125,
0.0254058837890625,
-0.03369140625,
-0.0004930496215820312,
-0.0430908203125,
-0.0018758773803710938,
-0.023193359375,
-0.006389617919921875,
-0.0258941650390625,
-0.0401611328125,
0.0010223388671875,
-0.030517578125,
0.01983642578125,
-0.003574371337890625,
0.0887451171875,
0.0201416015625,
-0.01532745361328125,
-0.01568603515625,
-0.02838134765625,
0.0877685546875,
-0.050262451171875,
0.01312255859375,
0.0212554931640625,
0.01483154296875,
-0.002849578857421875,
-0.03533935546875,
-0.038299560546875,
-0.0146636962890625,
0.01325225830078125,
0.01007080078125,
-0.042572021484375,
-0.019012451171875,
0.023956298828125,
0.0222015380859375,
-0.044586181640625,
0.0025196075439453125,
-0.04132080078125,
-0.01447296142578125,
0.03863525390625,
0.0062103271484375,
0.02593994140625,
-0.0253448486328125,
-0.01364898681640625,
-0.0106201171875,
-0.005222320556640625,
-0.004604339599609375,
0.0228729248046875,
0.036895751953125,
-0.0235595703125,
0.03778076171875,
-0.01006317138671875,
0.04913330078125,
0.01044464111328125,
-0.013763427734375,
0.048065185546875,
-0.00001245737075805664,
-0.014434814453125,
-0.01123809814453125,
0.07781982421875,
0.0221710205078125,
0.040252685546875,
-0.011993408203125,
-0.01227569580078125,
0.01654052734375,
0.0013303756713867188,
-0.0660400390625,
-0.023773193359375,
0.018310546875,
-0.039947509765625,
-0.051788330078125,
-0.01448822021484375,
-0.059295654296875,
-0.0214080810546875,
-0.0013780593872070312,
0.0494384765625,
-0.046783447265625,
-0.03143310546875,
-0.01194000244140625,
-0.03619384765625,
-0.0018510818481445312,
0.0141448974609375,
-0.057373046875,
0.00470733642578125,
0.03216552734375,
0.06988525390625,
-0.01067352294921875,
-0.02362060546875,
-0.037384033203125,
-0.004848480224609375,
-0.010589599609375,
0.04449462890625,
-0.031951904296875,
-0.01520538330078125,
-0.005252838134765625,
-0.00743865966796875,
-0.003940582275390625,
-0.033782958984375,
0.033050537109375,
-0.01849365234375,
0.018402099609375,
0.0009069442749023438,
-0.03790283203125,
0.0037174224853515625,
0.0224456787109375,
-0.02734375,
0.08538818359375,
0.02825927734375,
-0.056396484375,
0.0196685791015625,
-0.0587158203125,
-0.030120849609375,
-0.00536346435546875,
0.01201629638671875,
-0.0526123046875,
-0.01088714599609375,
0.00960540771484375,
0.036224365234375,
-0.00725555419921875,
0.00756072998046875,
-0.03546142578125,
-0.01561737060546875,
0.025604248046875,
-0.0220184326171875,
0.09393310546875,
0.0281219482421875,
-0.0413818359375,
0.025054931640625,
-0.06646728515625,
0.0049285888671875,
0.014739990234375,
-0.03436279296875,
-0.01546478271484375,
-0.01885986328125,
0.0289764404296875,
0.0154571533203125,
0.00917816162109375,
-0.05096435546875,
0.0008544921875,
-0.0229644775390625,
0.044921875,
0.069580078125,
-0.01617431640625,
0.0232086181640625,
-0.03533935546875,
0.033843994140625,
0.01258087158203125,
0.0095062255859375,
0.015777587890625,
-0.03704833984375,
-0.05889892578125,
-0.01267242431640625,
0.0178070068359375,
0.03948974609375,
-0.0350341796875,
0.033233642578125,
-0.0128173828125,
-0.049957275390625,
-0.03302001953125,
-0.00676727294921875,
0.0242156982421875,
0.04638671875,
0.051483154296875,
0.00377655029296875,
-0.070068359375,
-0.04669189453125,
-0.04205322265625,
-0.0162353515625,
0.006328582763671875,
0.0091094970703125,
0.059967041015625,
-0.01580810546875,
0.050811767578125,
-0.03350830078125,
-0.0276947021484375,
-0.0175933837890625,
0.0269012451171875,
0.0265350341796875,
0.061431884765625,
0.0595703125,
-0.057373046875,
-0.048492431640625,
-0.031890869140625,
-0.046783447265625,
-0.0217132568359375,
0.0186920166015625,
-0.0201263427734375,
0.02618408203125,
0.0173492431640625,
-0.0157623291015625,
0.041717529296875,
0.02386474609375,
-0.038482666015625,
0.0238189697265625,
0.0137786865234375,
0.023468017578125,
-0.0963134765625,
-0.006732940673828125,
0.016204833984375,
-0.00635528564453125,
-0.0589599609375,
-0.013092041015625,
-0.007305145263671875,
0.01036834716796875,
-0.037322998046875,
0.030517578125,
-0.0185699462890625,
0.011810302734375,
-0.00591278076171875,
-0.006816864013671875,
-0.01282501220703125,
0.0261077880859375,
-0.007541656494140625,
0.039703369140625,
0.03546142578125,
-0.033721923828125,
0.0272064208984375,
0.02490234375,
-0.00927734375,
0.045074462890625,
-0.04425048828125,
-0.00826263427734375,
0.0064544677734375,
0.0099029541015625,
-0.08746337890625,
-0.01131439208984375,
0.02801513671875,
-0.07208251953125,
0.0121612548828125,
-0.037567138671875,
-0.025146484375,
-0.03997802734375,
-0.023956298828125,
0.03125,
0.048431396484375,
-0.034698486328125,
0.041717529296875,
0.041778564453125,
0.02191162109375,
-0.050018310546875,
-0.080322265625,
0.009735107421875,
-0.025604248046875,
-0.0518798828125,
0.02447509765625,
-0.009002685546875,
-0.0190887451171875,
-0.0008845329284667969,
0.00815582275390625,
-0.0263671875,
0.0110626220703125,
0.02581787109375,
0.01739501953125,
-0.01233673095703125,
0.0012683868408203125,
-0.0220794677734375,
-0.002941131591796875,
0.00518798828125,
-0.032562255859375,
0.056396484375,
-0.0247039794921875,
0.01502227783203125,
-0.036468505859375,
0.0152130126953125,
0.03424072265625,
-0.0034427642822265625,
0.06915283203125,
0.084716796875,
-0.037200927734375,
-0.0160369873046875,
-0.043243408203125,
0.0029144287109375,
-0.036468505859375,
0.0269927978515625,
-0.021026611328125,
-0.0634765625,
0.045013427734375,
0.0287933349609375,
-0.004840850830078125,
0.0589599609375,
0.0562744140625,
-0.0013952255249023438,
0.0709228515625,
0.027191162109375,
-0.01534271240234375,
0.051483154296875,
-0.047149658203125,
0.004638671875,
-0.037994384765625,
-0.0196685791015625,
-0.05438232421875,
-0.0092010498046875,
-0.050262451171875,
-0.0276947021484375,
0.01416015625,
-0.009857177734375,
-0.04315185546875,
0.0235595703125,
-0.06390380859375,
0.0165557861328125,
0.0374755859375,
0.011688232421875,
-0.01506805419921875,
0.006526947021484375,
0.004886627197265625,
-0.0123748779296875,
-0.035797119140625,
-0.0196990966796875,
0.09423828125,
0.02447509765625,
0.0540771484375,
0.0066986083984375,
0.0653076171875,
0.0232086181640625,
0.036529541015625,
-0.05499267578125,
0.03924560546875,
-0.0263671875,
-0.048126220703125,
-0.01557159423828125,
-0.042236328125,
-0.057220458984375,
0.01186370849609375,
-0.021697998046875,
-0.056915283203125,
-0.00260162353515625,
0.00556182861328125,
-0.0162811279296875,
0.041473388671875,
-0.0428466796875,
0.07037353515625,
0.0010700225830078125,
-0.0333251953125,
-0.000728607177734375,
-0.023529052734375,
0.02581787109375,
0.00839996337890625,
0.0164794921875,
-0.0220184326171875,
0.006252288818359375,
0.0877685546875,
-0.04644775390625,
0.05743408203125,
-0.0222930908203125,
0.01171875,
0.01465606689453125,
-0.0054168701171875,
0.01076507568359375,
-0.023681640625,
-0.025848388671875,
0.018951416015625,
-0.01873779296875,
-0.03680419921875,
-0.0046234130859375,
0.057861328125,
-0.06903076171875,
-0.029693603515625,
-0.05487060546875,
-0.039276123046875,
0.0162353515625,
0.0251007080078125,
0.038177490234375,
0.0276641845703125,
-0.0119781494140625,
0.0239410400390625,
0.034210205078125,
-0.017120361328125,
0.0333251953125,
0.025970458984375,
-0.004100799560546875,
-0.035400390625,
0.0657958984375,
0.0225982666015625,
-0.0004673004150390625,
0.02667236328125,
0.0255126953125,
-0.020477294921875,
-0.04473876953125,
-0.0133056640625,
0.00476837158203125,
-0.0479736328125,
-0.024078369140625,
-0.06549072265625,
-0.0240631103515625,
-0.0672607421875,
-0.00981903076171875,
-0.0131683349609375,
-0.042510986328125,
-0.03912353515625,
0.005466461181640625,
0.0455322265625,
0.060943603515625,
-0.034942626953125,
0.0305023193359375,
-0.03619384765625,
0.0318603515625,
0.0058441162109375,
0.01447296142578125,
0.0002086162567138672,
-0.07598876953125,
-0.014312744140625,
0.0041656494140625,
-0.025604248046875,
-0.07562255859375,
0.05426025390625,
0.0007853507995605469,
0.0265045166015625,
0.024810791015625,
0.00861358642578125,
0.055511474609375,
-0.0189361572265625,
0.0577392578125,
0.0137481689453125,
-0.08197021484375,
0.05181884765625,
-0.03912353515625,
0.00966644287109375,
0.0249176025390625,
0.0299530029296875,
-0.03961181640625,
-0.04254150390625,
-0.04620361328125,
-0.06109619140625,
0.073486328125,
0.0295257568359375,
-0.00531768798828125,
-0.00931549072265625,
0.006561279296875,
-0.016998291015625,
0.0011997222900390625,
-0.070556640625,
-0.040252685546875,
-0.040924072265625,
-0.0352783203125,
-0.0011911392211914062,
-0.0173492431640625,
-0.005596160888671875,
-0.042205810546875,
0.06573486328125,
0.0154571533203125,
0.02593994140625,
-0.0005922317504882812,
-0.0237884521484375,
-0.0101776123046875,
0.01197052001953125,
0.04217529296875,
0.045928955078125,
-0.03607177734375,
-0.00803375244140625,
0.0226898193359375,
-0.0380859375,
0.00970458984375,
0.008758544921875,
-0.007381439208984375,
0.01383209228515625,
0.0253143310546875,
0.044158935546875,
0.01465606689453125,
-0.01129913330078125,
0.047821044921875,
-0.01110076904296875,
-0.02227783203125,
-0.052001953125,
0.01275634765625,
-0.0010013580322265625,
0.01345062255859375,
0.049163818359375,
0.016632080078125,
0.00272369384765625,
-0.031524658203125,
0.0201568603515625,
0.0185699462890625,
-0.0281219482421875,
-0.025146484375,
0.0709228515625,
-0.001804351806640625,
-0.04010009765625,
0.034881591796875,
-0.0094146728515625,
-0.07574462890625,
0.060028076171875,
0.040496826171875,
0.06829833984375,
-0.02294921875,
0.027374267578125,
0.060577392578125,
0.01122283935546875,
0.00615692138671875,
0.0291900634765625,
0.00220489501953125,
-0.05810546875,
0.0029964447021484375,
-0.05877685546875,
-0.008544921875,
0.01788330078125,
-0.0452880859375,
0.0123138427734375,
-0.049224853515625,
-0.0246734619140625,
0.0272979736328125,
0.00981903076171875,
-0.05120849609375,
0.027557373046875,
0.0090789794921875,
0.052947998046875,
-0.0799560546875,
0.057708740234375,
0.036529541015625,
-0.049072265625,
-0.065185546875,
-0.013427734375,
0.005680084228515625,
-0.06695556640625,
0.07220458984375,
0.0200958251953125,
0.0022296905517578125,
0.003131866455078125,
-0.05572509765625,
-0.0645751953125,
0.058624267578125,
0.005764007568359375,
-0.0113983154296875,
0.0034732818603515625,
0.01396942138671875,
0.045074462890625,
-0.0299530029296875,
0.042022705078125,
0.04486083984375,
0.039825439453125,
-0.01122283935546875,
-0.037994384765625,
0.0135040283203125,
-0.0246734619140625,
-0.0088348388671875,
0.00775146484375,
-0.05963134765625,
0.09967041015625,
-0.005970001220703125,
0.004039764404296875,
0.011322021484375,
0.05096435546875,
0.0153350830078125,
0.0216827392578125,
0.049957275390625,
0.051422119140625,
0.050811767578125,
-0.0277862548828125,
0.053863525390625,
-0.01091766357421875,
0.045684814453125,
0.053863525390625,
0.0272369384765625,
0.05206298828125,
0.024993896484375,
-0.01508331298828125,
0.049560546875,
0.045623779296875,
-0.0272979736328125,
0.0254058837890625,
0.031280517578125,
-0.006275177001953125,
-0.0186920166015625,
-0.01148223876953125,
-0.031097412109375,
0.034149169921875,
0.030792236328125,
-0.0347900390625,
-0.021026611328125,
0.002994537353515625,
0.021942138671875,
0.00762939453125,
-0.0215301513671875,
0.033050537109375,
-0.005889892578125,
-0.038421630859375,
0.06805419921875,
0.0032520294189453125,
0.07598876953125,
-0.02557373046875,
0.00391387939453125,
0.007106781005859375,
0.0311737060546875,
-0.0316162109375,
-0.060150146484375,
0.00971221923828125,
0.02825927734375,
-0.015869140625,
-0.023529052734375,
0.039581298828125,
-0.044677734375,
-0.032623291015625,
0.037017822265625,
0.010711669921875,
0.02978515625,
0.0013799667358398438,
-0.08465576171875,
0.0158538818359375,
0.01036834716796875,
-0.031280517578125,
0.00782012939453125,
0.02362060546875,
0.01155853271484375,
0.042388916015625,
0.05889892578125,
0.0035991668701171875,
0.02490234375,
0.02203369140625,
0.06494140625,
-0.057098388671875,
-0.034088134765625,
-0.072265625,
0.024993896484375,
-0.0181121826171875,
-0.042694091796875,
0.060943603515625,
0.056640625,
0.05718994140625,
-0.00009524822235107422,
0.0653076171875,
-0.023223876953125,
0.055908203125,
-0.01409912109375,
0.07562255859375,
-0.055908203125,
-0.0057830810546875,
-0.03204345703125,
-0.04644775390625,
-0.0231781005859375,
0.06292724609375,
-0.037689208984375,
0.017333984375,
0.039642333984375,
0.061004638671875,
-0.00186920166015625,
-0.0172271728515625,
0.016998291015625,
0.048309326171875,
0.04119873046875,
0.056610107421875,
0.04815673828125,
-0.059906005859375,
0.06396484375,
-0.034149169921875,
-0.01413726806640625,
-0.032135009765625,
-0.050628662109375,
-0.08734130859375,
-0.04815673828125,
-0.02978515625,
-0.07757568359375,
0.004627227783203125,
0.06951904296875,
0.051025390625,
-0.08392333984375,
-0.0203399658203125,
0.003078460693359375,
0.0177154541015625,
0.00868988037109375,
-0.02362060546875,
0.029815673828125,
-0.0224609375,
-0.06982421875,
0.01554107666015625,
-0.0084381103515625,
0.01401519775390625,
0.004016876220703125,
-0.0013179779052734375,
-0.038787841796875,
-0.0069732666015625,
0.02484130859375,
0.022674560546875,
-0.047119140625,
-0.0243072509765625,
-0.01129913330078125,
-0.0279998779296875,
0.0133514404296875,
0.0222015380859375,
-0.043975830078125,
0.007602691650390625,
0.044097900390625,
0.0249786376953125,
0.05029296875,
0.0015554428100585938,
0.0206451416015625,
-0.036865234375,
0.0166168212890625,
0.0229339599609375,
0.0304107666015625,
0.034088134765625,
-0.01263427734375,
0.041839599609375,
0.038238525390625,
-0.045166015625,
-0.078125,
-0.011016845703125,
-0.07366943359375,
-0.01482391357421875,
0.08270263671875,
-0.01190185546875,
-0.045013427734375,
-0.010711669921875,
0.0008549690246582031,
0.053253173828125,
-0.055023193359375,
0.0533447265625,
0.03533935546875,
0.01270294189453125,
0.01110076904296875,
-0.021484375,
0.043060302734375,
0.0308074951171875,
-0.03997802734375,
-0.0172882080078125,
0.01763916015625,
0.052825927734375,
0.01165771484375,
0.05438232421875,
0.005374908447265625,
0.0253143310546875,
0.0036182403564453125,
0.0077667236328125,
-0.0147705078125,
-0.0044708251953125,
-0.030548095703125,
0.0190582275390625,
-0.0379638671875,
-0.0279998779296875
]
] |
ai-forever/ruRoberta-large | 2023-11-03T12:47:18.000Z | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"PyTorch",
"Transformers",
"ru",
"arxiv:2309.10931",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | ai-forever | null | null | ai-forever/ruRoberta-large | 33 | 72,532 | transformers | 2022-03-02T23:29:05 | ---
language:
- ru
tags:
- PyTorch
- Transformers
thumbnail: "https://github.com/sberbank-ai/model-zoo"
---
# ruRoberta-large
The model architecture design, pretraining, and evaluation are documented in our preprint: [**A Family of Pretrained Transformer Language Models for Russian**](https://arxiv.org/abs/2309.10931).
The model is pretrained by the [SberDevices](https://sberdevices.ru/) team.
* Task: `mask filling`
* Type: `encoder`
* Tokenizer: `BBPE`
* Dict size: `50 257`
* Num Parameters: `355 M`
* Training Data Volume: `250 GB`
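Conceptually, mask filling scores candidates by softmaxing the encoder's logits at the masked position over the vocabulary and ranking tokens by probability. A minimal generic sketch of that scoring step (toy vocabulary and logits stand in for the model's real output; this is not the model's actual API):

```python
import numpy as np

def top_k_fill(logits_at_mask, id_to_token, k=3):
    # Softmax over the vocabulary, then take the k highest-probability tokens.
    z = logits_at_mask - logits_at_mask.max()  # shift for numerical stability
    probs = np.exp(z) / np.exp(z).sum()
    top = np.argsort(probs)[::-1][:k]
    return [(id_to_token[i], float(probs[i])) for i in top]

# Toy vocabulary and logits standing in for a real forward pass.
vocab = {0: "мир", 1: "дом", 2: "кот", 3: "снег"}
print(top_k_fill(np.array([2.0, 0.5, -1.0, 0.1]), vocab, k=2))
```

In practice the logits come from a forward pass of the fill-mask model at the `<mask>` position; only the ranking step is shown here.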
# Authors
+ NLP core team RnD [Telegram channel](https://t.me/nlpcoreteam):
+ Dmitry Zmitrovich
# Cite us
```
@misc{zmitrovich2023family,
title={A Family of Pretrained Transformer Language Models for Russian},
author={Dmitry Zmitrovich and Alexander Abramov and Andrey Kalmykov and Maria Tikhonova and Ekaterina Taktasheva and Danil Astafurov and Mark Baushenko and Artem Snegirev and Tatiana Shavrina and Sergey Markov and Vladislav Mikhailov and Alena Fenogenova},
year={2023},
eprint={2309.10931},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 1,135 | [
[
-0.025390625,
-0.0084381103515625,
0.013916015625,
0.02252197265625,
-0.0291900634765625,
-0.000476837158203125,
-0.0271759033203125,
-0.0166015625,
-0.00011712312698364258,
0.0252685546875,
-0.03936767578125,
-0.0267791748046875,
-0.046630859375,
-0.005157470703125,
-0.0195159912109375,
0.08282470703125,
-0.01357269287109375,
0.0288238525390625,
-0.01436614990234375,
0.0015439987182617188,
-0.0220794677734375,
-0.0447998046875,
-0.0259857177734375,
-0.03472900390625,
0.0169525146484375,
0.0079193115234375,
0.0205078125,
0.0325927734375,
0.026092529296875,
0.0283203125,
-0.0084686279296875,
-0.022430419921875,
-0.030517578125,
0.0055389404296875,
-0.006633758544921875,
-0.0235137939453125,
-0.05157470703125,
-0.01274871826171875,
0.065673828125,
0.036956787109375,
-0.0187835693359375,
0.037994384765625,
0.009063720703125,
0.0357666015625,
-0.046905517578125,
0.0003399848937988281,
-0.031982421875,
0.014739990234375,
-0.0282135009765625,
-0.00860595703125,
-0.06048583984375,
-0.0008563995361328125,
0.0137176513671875,
-0.0280914306640625,
0.00848388671875,
0.00016033649444580078,
0.091796875,
0.018310546875,
-0.0071868896484375,
0.00484466552734375,
-0.06732177734375,
0.059051513671875,
-0.040496826171875,
0.046539306640625,
0.016815185546875,
0.0197601318359375,
-0.00670623779296875,
-0.07904052734375,
-0.02783203125,
-0.0026988983154296875,
-0.026397705078125,
0.003345489501953125,
-0.041534423828125,
-0.0029277801513671875,
0.0308837890625,
0.019134521484375,
-0.058197021484375,
0.00885772705078125,
-0.04107666015625,
-0.020904541015625,
0.020751953125,
-0.0016994476318359375,
0.0018434524536132812,
0.007595062255859375,
-0.02264404296875,
-0.0187225341796875,
-0.044036865234375,
-0.0081329345703125,
0.035614013671875,
-0.0021724700927734375,
-0.0289154052734375,
0.03131103515625,
-0.0212554931640625,
0.067626953125,
0.00043582916259765625,
0.0167083740234375,
0.05316162109375,
-0.031463623046875,
-0.025604248046875,
-0.029876708984375,
0.07720947265625,
-0.0168304443359375,
0.016754150390625,
-0.0233154296875,
-0.02264404296875,
-0.00508880615234375,
0.037445068359375,
-0.0765380859375,
-0.014862060546875,
0.00812530517578125,
-0.0270233154296875,
-0.01284027099609375,
0.0084991455078125,
-0.06072998046875,
0.01155853271484375,
-0.0191802978515625,
0.050567626953125,
-0.0167083740234375,
-0.0240325927734375,
0.031768798828125,
0.005519866943359375,
0.04376220703125,
-0.00011456012725830078,
-0.062255859375,
0.030487060546875,
0.044708251953125,
0.040008544921875,
0.00933074951171875,
-0.0264129638671875,
-0.030975341796875,
-0.0235595703125,
-0.004364013671875,
0.058746337890625,
-0.025726318359375,
0.0012054443359375,
-0.0009508132934570312,
0.0032558441162109375,
-0.006458282470703125,
-0.0184326171875,
0.0400390625,
-0.04638671875,
0.035003662109375,
0.01190948486328125,
-0.007732391357421875,
0.007213592529296875,
0.00788116455078125,
-0.034515380859375,
0.058258056640625,
0.01364898681640625,
-0.05279541015625,
0.033203125,
-0.048095703125,
-0.0171051025390625,
0.0150604248046875,
0.01068878173828125,
-0.043212890625,
0.000255584716796875,
0.0028553009033203125,
0.040924072265625,
-0.027923583984375,
0.0291748046875,
-0.002048492431640625,
-0.0169525146484375,
0.022491455078125,
-0.0054473876953125,
0.0653076171875,
0.0218658447265625,
-0.0221099853515625,
0.03167724609375,
-0.07122802734375,
0.007274627685546875,
-0.003841400146484375,
-0.00921630859375,
-0.005725860595703125,
-0.0234222412109375,
0.0173187255859375,
0.0272979736328125,
0.032501220703125,
-0.0347900390625,
0.0150299072265625,
-0.01389312744140625,
0.039825439453125,
0.05010986328125,
-0.02447509765625,
0.043487548828125,
-0.0040283203125,
0.0704345703125,
-0.003322601318359375,
0.03656005859375,
-0.0194854736328125,
-0.006229400634765625,
-0.077880859375,
-0.0236968994140625,
0.055877685546875,
0.0251617431640625,
-0.034576416015625,
0.06134033203125,
-0.03448486328125,
-0.060455322265625,
-0.04315185546875,
-0.0165863037109375,
0.0487060546875,
0.020904541015625,
0.03436279296875,
-0.025390625,
-0.06463623046875,
-0.0758056640625,
0.00998687744140625,
0.01291656494140625,
0.0013380050659179688,
-0.008392333984375,
0.0550537109375,
-0.0225982666015625,
0.056976318359375,
-0.0251922607421875,
-0.00876617431640625,
-0.037139892578125,
0.0142822265625,
0.0251617431640625,
0.056884765625,
0.0435791015625,
-0.04150390625,
-0.034637451171875,
-0.00942230224609375,
-0.0296630859375,
-0.01146697998046875,
0.00528717041015625,
-0.023956298828125,
0.025146484375,
0.016082763671875,
-0.057830810546875,
0.031890869140625,
0.044769287109375,
-0.039093017578125,
0.06182861328125,
-0.0030269622802734375,
-0.006023406982421875,
-0.09246826171875,
0.03167724609375,
-0.0158843994140625,
-0.021514892578125,
-0.055145263671875,
0.0025997161865234375,
0.0007176399230957031,
-0.01023101806640625,
-0.0399169921875,
0.039520263671875,
-0.054931640625,
-0.0223236083984375,
-0.0030879974365234375,
-0.005950927734375,
0.00152587890625,
0.034698486328125,
0.02294921875,
0.04541015625,
0.06353759765625,
-0.040313720703125,
0.010986328125,
0.031646728515625,
-0.04144287109375,
0.0218963623046875,
-0.08489990234375,
0.0236968994140625,
-0.0024242401123046875,
0.0367431640625,
-0.047882080078125,
0.0048675537109375,
0.0294189453125,
-0.038818359375,
0.035491943359375,
-0.031463623046875,
-0.045440673828125,
-0.03564453125,
0.003398895263671875,
0.03961181640625,
0.054351806640625,
-0.0281524658203125,
0.06304931640625,
0.0225677490234375,
0.004840850830078125,
-0.07623291015625,
-0.039794921875,
-0.0123443603515625,
-0.01178741455078125,
-0.057342529296875,
0.0242919921875,
-0.00998687744140625,
0.007694244384765625,
0.002735137939453125,
0.003215789794921875,
-0.023834228515625,
-0.005077362060546875,
0.01593017578125,
0.035003662109375,
-0.032562255859375,
0.0034618377685546875,
-0.031158447265625,
-0.034912109375,
-0.004673004150390625,
-0.0176849365234375,
0.0892333984375,
-0.02789306640625,
-0.0085296630859375,
-0.041839599609375,
-0.0002346038818359375,
0.02734375,
-0.047119140625,
0.0684814453125,
0.06048583984375,
-0.0270538330078125,
-0.01971435546875,
-0.04962158203125,
-0.0029621124267578125,
-0.037200927734375,
0.03192138671875,
-0.0288848876953125,
-0.057861328125,
0.04925537109375,
0.013916015625,
-0.00543975830078125,
0.036773681640625,
0.035614013671875,
0.0132598876953125,
0.0340576171875,
0.046844482421875,
0.0038738250732421875,
0.041778564453125,
-0.046905517578125,
0.0116729736328125,
-0.07354736328125,
-0.0193023681640625,
-0.0654296875,
-0.02520751953125,
-0.0162200927734375,
-0.030364990234375,
-0.005401611328125,
0.002239227294921875,
-0.027008056640625,
0.0635986328125,
-0.03662109375,
0.036834716796875,
0.024566650390625,
-0.0232391357421875,
0.00235748291015625,
-0.015289306640625,
0.002811431884765625,
-0.013702392578125,
-0.062744140625,
-0.041748046875,
0.087890625,
0.0294342041015625,
0.06072998046875,
-0.0136871337890625,
0.06292724609375,
-0.027130126953125,
0.0249176025390625,
-0.050537109375,
0.042724609375,
-0.0021820068359375,
-0.053009033203125,
-0.0081939697265625,
-0.02874755859375,
-0.08233642578125,
0.019744873046875,
-0.03167724609375,
-0.04486083984375,
0.01934814453125,
0.024322509765625,
-0.02166748046875,
0.01000213623046875,
-0.04046630859375,
0.0933837890625,
-0.013763427734375,
-0.00839996337890625,
-0.0258941650390625,
-0.043243408203125,
0.02374267578125,
-0.0141448974609375,
0.00036525726318359375,
0.030181884765625,
0.0203094482421875,
0.06671142578125,
-0.0133209228515625,
0.050872802734375,
-0.01934814453125,
0.0217132568359375,
-0.00243377685546875,
-0.0182647705078125,
0.026641845703125,
-0.01434326171875,
-0.00333404541015625,
0.0208740234375,
-0.0057830810546875,
-0.0181427001953125,
-0.0345458984375,
0.0260162353515625,
-0.064453125,
-0.0168914794921875,
-0.05706787109375,
-0.015777587890625,
0.00222015380859375,
0.03875732421875,
0.0391845703125,
0.056427001953125,
-0.0255584716796875,
0.059539794921875,
0.048858642578125,
-0.0164642333984375,
0.0254974365234375,
0.055084228515625,
-0.0200042724609375,
-0.0260162353515625,
0.039459228515625,
0.0059661865234375,
0.015289306640625,
0.041229248046875,
0.01042938232421875,
-0.011383056640625,
-0.057952880859375,
-0.021453857421875,
0.032012939453125,
-0.046234130859375,
-0.0225677490234375,
-0.058990478515625,
-0.03167724609375,
-0.03656005859375,
0.0078277587890625,
-0.05389404296875,
-0.022430419921875,
-0.02850341796875,
-0.00952911376953125,
-0.0013494491577148438,
0.054595947265625,
0.0030536651611328125,
0.04461669921875,
-0.05828857421875,
0.0113525390625,
-0.0008745193481445312,
0.032196044921875,
0.0017652511596679688,
-0.07147216796875,
-0.039459228515625,
-0.0164642333984375,
-0.0177459716796875,
-0.0249176025390625,
0.0304718017578125,
0.001483917236328125,
0.06146240234375,
0.01153564453125,
-0.003459930419921875,
0.051727294921875,
-0.05804443359375,
0.04876708984375,
0.005504608154296875,
-0.07763671875,
0.0206756591796875,
-0.0017271041870117188,
0.019317626953125,
0.0460205078125,
0.0361328125,
-0.0302886962890625,
0.0043487548828125,
-0.062347412109375,
-0.07281494140625,
0.056854248046875,
0.0053863525390625,
0.015289306640625,
0.026580810546875,
0.0134735107421875,
0.007694244384765625,
0.0160064697265625,
-0.08599853515625,
-0.0188140869140625,
-0.0275726318359375,
-0.006000518798828125,
-0.0014352798461914062,
-0.03582763671875,
-0.00766754150390625,
-0.0230865478515625,
0.07080078125,
0.02301025390625,
0.0295257568359375,
-0.004608154296875,
-0.02850341796875,
0.00508880615234375,
0.02630615234375,
0.073974609375,
0.0679931640625,
-0.00917816162109375,
-0.008941650390625,
0.005535125732421875,
-0.041534423828125,
0.00247955322265625,
0.0098114013671875,
-0.01035308837890625,
0.0014963150024414062,
0.03839111328125,
0.09576416015625,
0.017059326171875,
-0.0313720703125,
0.055145263671875,
-0.0178680419921875,
-0.0229034423828125,
-0.05828857421875,
-0.01029205322265625,
-0.0003712177276611328,
0.01033782958984375,
0.0279388427734375,
0.02001953125,
-0.01230621337890625,
-0.01593017578125,
0.0199737548828125,
0.035400390625,
-0.022979736328125,
-0.06036376953125,
0.0209503173828125,
0.01000213623046875,
-0.032440185546875,
0.03363037109375,
-0.01428985595703125,
-0.05181884765625,
0.01552581787109375,
0.050537109375,
0.08367919921875,
-0.02734375,
0.01335906982421875,
0.04974365234375,
0.037841796875,
-0.00803375244140625,
0.006290435791015625,
0.00040221214294433594,
-0.06512451171875,
-0.042755126953125,
-0.078369140625,
-0.00832366943359375,
0.035400390625,
-0.059356689453125,
0.041015625,
-0.023468017578125,
-0.00408935546875,
-0.0099639892578125,
-0.016632080078125,
-0.0670166015625,
0.01556396484375,
-0.00867462158203125,
0.06951904296875,
-0.06671142578125,
0.06573486328125,
0.049407958984375,
-0.02294921875,
-0.0562744140625,
-0.01087188720703125,
-0.040618896484375,
-0.050689697265625,
0.073974609375,
0.010040283203125,
-0.01328277587890625,
0.004825592041015625,
-0.02630615234375,
-0.0703125,
0.07708740234375,
0.021575927734375,
-0.04193115234375,
-0.005184173583984375,
-0.0005788803100585938,
0.051910400390625,
-0.0396728515625,
0.0186614990234375,
0.0283355712890625,
0.047119140625,
0.005611419677734375,
-0.06842041015625,
-0.0135955810546875,
-0.042083740234375,
-0.003894805908203125,
0.013763427734375,
-0.0367431640625,
0.060272216796875,
0.0017747879028320312,
-0.0228424072265625,
0.0081634521484375,
0.0501708984375,
-0.002704620361328125,
-0.01544189453125,
0.038818359375,
0.063720703125,
0.0250396728515625,
-0.017181396484375,
0.064453125,
-0.02740478515625,
0.01399993896484375,
0.09027099609375,
0.00667572021484375,
0.06640625,
0.031585693359375,
-0.025726318359375,
0.0457763671875,
0.0283203125,
-0.01326751708984375,
0.030426025390625,
0.0265960693359375,
0.00318145751953125,
-0.0194091796875,
0.021514892578125,
-0.039764404296875,
0.036956787109375,
0.01352691650390625,
-0.0240325927734375,
0.000545501708984375,
-0.00246429443359375,
0.00220489501953125,
-0.0107269287109375,
0.0017528533935546875,
0.04718017578125,
0.0015239715576171875,
-0.0494384765625,
0.047393798828125,
-0.00836181640625,
0.0531005859375,
-0.0716552734375,
0.01242828369140625,
0.003124237060546875,
0.0187225341796875,
-0.01464080810546875,
-0.03228759765625,
0.0265350341796875,
0.001743316650390625,
-0.01399993896484375,
-0.034515380859375,
0.046112060546875,
-0.032012939453125,
-0.0289764404296875,
0.01483917236328125,
0.0103302001953125,
0.01459503173828125,
0.022369384765625,
-0.03778076171875,
0.021270751953125,
-0.00949859619140625,
-0.04241943359375,
0.025146484375,
0.018951416015625,
0.0102691650390625,
0.051239013671875,
0.0506591796875,
0.019256591796875,
0.0163116455078125,
0.014617919921875,
0.06951904296875,
-0.027740478515625,
-0.0369873046875,
-0.059844970703125,
0.070068359375,
0.00409698486328125,
-0.04461669921875,
0.0438232421875,
0.046966552734375,
0.065185546875,
-0.037811279296875,
0.035125732421875,
-0.0125732421875,
0.024566650390625,
-0.035552978515625,
0.06256103515625,
-0.0256500244140625,
-0.0020389556884765625,
-0.01448822021484375,
-0.09259033203125,
-0.023345947265625,
0.06134033203125,
-0.0162200927734375,
0.0035114288330078125,
0.06036376953125,
0.049591064453125,
-0.01496124267578125,
-0.0291748046875,
0.011749267578125,
0.030029296875,
0.016387939453125,
0.0272674560546875,
0.05828857421875,
-0.039825439453125,
0.03485107421875,
-0.019866943359375,
-0.022674560546875,
-0.007183074951171875,
-0.0811767578125,
-0.06884765625,
-0.061248779296875,
-0.0203857421875,
-0.017822265625,
-0.009368896484375,
0.07232666015625,
0.06134033203125,
-0.054931640625,
-0.0256500244140625,
0.0008144378662109375,
-0.019012451171875,
-0.004608154296875,
-0.0086669921875,
0.018951416015625,
-0.024688720703125,
-0.046539306640625,
0.02099609375,
-0.0034027099609375,
0.0090789794921875,
-0.036529541015625,
-0.0130462646484375,
-0.01239776611328125,
-0.0177764892578125,
0.0235137939453125,
0.0158233642578125,
-0.035003662109375,
-0.0203704833984375,
-0.00505828857421875,
0.000396728515625,
0.00759124755859375,
0.047027587890625,
-0.048828125,
0.024566650390625,
0.054534912109375,
0.0309295654296875,
0.0528564453125,
-0.00809478759765625,
0.056427001953125,
-0.051116943359375,
0.03387451171875,
0.0257415771484375,
0.04241943359375,
0.03167724609375,
0.0005125999450683594,
0.031585693359375,
0.0074615478515625,
-0.055877685546875,
-0.08349609375,
0.022308349609375,
-0.0809326171875,
0.0171356201171875,
0.08209228515625,
-0.027496337890625,
-0.0232086181640625,
-0.0029144287109375,
0.003154754638671875,
0.019134521484375,
-0.01190185546875,
0.0289154052734375,
0.06427001953125,
0.02947998046875,
-0.01410675048828125,
-0.034912109375,
0.046600341796875,
0.0172271728515625,
-0.05487060546875,
0.0016164779663085938,
0.019378662109375,
0.03302001953125,
0.04010009765625,
0.053466796875,
-0.01351165771484375,
0.0180816650390625,
-0.014495849609375,
0.048004150390625,
-0.0232391357421875,
-0.0460205078125,
-0.032318115234375,
-0.0166473388671875,
-0.003536224365234375,
-0.01218414306640625
]
] |
Helsinki-NLP/opus-mt-en-de | 2023-08-16T11:29:21.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"marian",
"text2text-generation",
"translation",
"en",
"de",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-de | 17 | 72,440 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: cc-by-4.0
---
### opus-mt-en-de
## Table of Contents
- [Model Details](#model-details)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Citation Information](#citation-information)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
## Model Details
**Model Description:**
- **Developed by:** Language Technology Research Group at the University of Helsinki
- **Model Type:** Translation
- **Language(s):**
- Source Language: English
- Target Language: German
- **License:** CC-BY-4.0
- **Resources for more information:**
- [GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
## Uses
#### Direct Use
This model can be used for translation and text-to-text generation.
## Risks, Limitations and Biases
**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
Further details about the dataset for this model can be found in the OPUS readme: [en-de](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/en-de/README.md)
#### Training Data
##### Preprocessing
* pre-processing: normalization + SentencePiece
* dataset: [opus](https://github.com/Helsinki-NLP/Opus-MT)
* download original weights: [opus-2020-02-26.zip](https://object.pouta.csc.fi/OPUS-MT-models/en-de/opus-2020-02-26.zip)
* test set translations: [opus-2020-02-26.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-de/opus-2020-02-26.test.txt)
## Evaluation
#### Results
* test set scores: [opus-2020-02-26.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-de/opus-2020-02-26.eval.txt)
#### Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newssyscomb2009.en.de | 23.5 | 0.540 |
| news-test2008.en.de | 23.5 | 0.529 |
| newstest2009.en.de | 22.3 | 0.530 |
| newstest2010.en.de | 24.9 | 0.544 |
| newstest2011.en.de | 22.5 | 0.524 |
| newstest2012.en.de | 23.0 | 0.525 |
| newstest2013.en.de | 26.9 | 0.553 |
| newstest2015-ende.en.de | 31.1 | 0.594 |
| newstest2016-ende.en.de | 37.0 | 0.636 |
| newstest2017-ende.en.de | 29.9 | 0.586 |
| newstest2018-ende.en.de | 45.2 | 0.690 |
| newstest2019-ende.en.de | 40.9 | 0.654 |
| Tatoeba.en.de | 47.3 | 0.664 |
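
The chr-F column above is a character n-gram F-score. As a rough illustrative sketch of how such a score is computed (a simplified sentence-level version, not the official sacrebleu implementation, which also handles corpus-level aggregation):

```python
from collections import Counter

def char_ngrams(text, n):
    # Character n-grams with whitespace removed, as is common for chrF.
    s = text.replace(" ", "")
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))

def chrf(hypothesis, reference, max_n=6, beta=2.0):
    """Simplified sentence-level chrF: average n-gram precision and
    recall over n = 1..max_n, combined into an F-beta score."""
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        if sum(hyp.values()) == 0 or sum(ref.values()) == 0:
            continue
        overlap = sum((hyp & ref).values())  # clipped n-gram matches
        precisions.append(overlap / sum(hyp.values()))
        recalls.append(overlap / sum(ref.values()))
    if not precisions:
        return 0.0
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    if p + r == 0:
        return 0.0
    return (1 + beta ** 2) * p * r / (beta ** 2 * p + r)
```

An identical hypothesis and reference score 1.0, completely disjoint strings score 0.0, and partial overlaps fall in between.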
## Citation Information
```bibtex
@InProceedings{TiedemannThottingal:EAMT2020,
author = {J{\"o}rg Tiedemann and Santhosh Thottingal},
title = {{OPUS-MT} — {B}uilding open translation services for the {W}orld},
booktitle = {Proceedings of the 22nd Annual Conference of the European Association for Machine Translation (EAMT)},
year = {2020},
address = {Lisbon, Portugal}
}
```
## How to Get Started With the Model
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-de")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-de")

# Translate an example sentence from English to German
inputs = tokenizer("Machine learning is great!", return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
| 3,310 | [
[
-0.025238037109375,
-0.041259765625,
0.01436614990234375,
0.01015472412109375,
-0.023681640625,
-0.038421630859375,
-0.0201416015625,
-0.0261383056640625,
0.005695343017578125,
0.0190887451171875,
-0.037109375,
-0.041839599609375,
-0.04949951171875,
0.018890380859375,
-0.0171661376953125,
0.063232421875,
-0.006488800048828125,
0.0338134765625,
0.003116607666015625,
-0.027130126953125,
-0.0258026123046875,
-0.043853759765625,
-0.0310821533203125,
-0.0214385986328125,
0.0125579833984375,
0.0150909423828125,
0.0367431640625,
0.049041748046875,
0.049713134765625,
0.0198211669921875,
-0.016204833984375,
-0.0005307197570800781,
-0.0182342529296875,
-0.005420684814453125,
0.01001739501953125,
-0.039825439453125,
-0.0560302734375,
0.008697509765625,
0.056427001953125,
0.05401611328125,
0.00566864013671875,
0.0305633544921875,
0.00791168212890625,
0.0626220703125,
-0.01422119140625,
0.0017518997192382812,
-0.0389404296875,
0.007732391357421875,
-0.01824951171875,
-0.021026611328125,
-0.045440673828125,
-0.0205230712890625,
0.00997161865234375,
-0.040802001953125,
0.0022678375244140625,
0.01287078857421875,
0.08514404296875,
0.0110626220703125,
-0.022705078125,
-0.01367950439453125,
-0.042083740234375,
0.07861328125,
-0.07122802734375,
0.038421630859375,
0.041259765625,
-0.003002166748046875,
0.0006999969482421875,
-0.041656494140625,
-0.02874755859375,
-0.006572723388671875,
-0.0192718505859375,
0.0171661376953125,
-0.01183319091796875,
-0.015716552734375,
0.024688720703125,
0.04827880859375,
-0.05999755859375,
0.00543975830078125,
-0.04315185546875,
-0.00554656982421875,
0.052581787109375,
0.0084075927734375,
0.02154541015625,
-0.0243377685546875,
-0.0369873046875,
-0.0286865234375,
-0.052154541015625,
0.0103912353515625,
0.037933349609375,
0.0200042724609375,
-0.0237579345703125,
0.05181884765625,
-0.002010345458984375,
0.0435791015625,
0.000050187110900878906,
-0.0022792816162109375,
0.06378173828125,
-0.0352783203125,
-0.020111083984375,
-0.0192108154296875,
0.0877685546875,
0.0306549072265625,
0.00994110107421875,
-0.0037288665771484375,
-0.0159149169921875,
-0.01561737060546875,
-0.00278472900390625,
-0.06756591796875,
0.0048980712890625,
0.0204925537109375,
-0.043212890625,
-0.004108428955078125,
-0.00310516357421875,
-0.052032470703125,
0.0161895751953125,
-0.03192138671875,
0.0396728515625,
-0.051483154296875,
-0.0260467529296875,
0.0218048095703125,
0.00009971857070922852,
0.0230255126953125,
0.00125885009765625,
-0.042236328125,
0.0166015625,
0.027740478515625,
0.061737060546875,
-0.02996826171875,
-0.031524658203125,
-0.02154541015625,
-0.0167694091796875,
-0.017120361328125,
0.03955078125,
-0.0084075927734375,
-0.038421630859375,
-0.00551605224609375,
0.0203857421875,
-0.0196380615234375,
-0.023193359375,
0.0836181640625,
-0.024688720703125,
0.061981201171875,
-0.0167694091796875,
-0.057891845703125,
-0.019134521484375,
0.0202789306640625,
-0.03875732421875,
0.10491943359375,
0.00762939453125,
-0.069091796875,
0.01502227783203125,
-0.0584716796875,
-0.005962371826171875,
-0.01031494140625,
0.0029010772705078125,
-0.04156494140625,
-0.004150390625,
0.01189422607421875,
0.0243682861328125,
-0.0308074951171875,
0.04022216796875,
-0.01287078857421875,
-0.017059326171875,
0.01171112060546875,
-0.036651611328125,
0.09619140625,
0.0227203369140625,
-0.0295867919921875,
0.0020904541015625,
-0.06964111328125,
-0.003093719482421875,
0.006267547607421875,
-0.026824951171875,
-0.029052734375,
-0.0034027099609375,
0.00794219970703125,
0.0193328857421875,
0.01531219482421875,
-0.045654296875,
0.01316070556640625,
-0.0625,
0.0225372314453125,
0.053375244140625,
-0.01983642578125,
0.037506103515625,
-0.0355224609375,
0.0238494873046875,
0.006557464599609375,
0.0216217041015625,
-0.0041656494140625,
-0.045440673828125,
-0.07073974609375,
-0.0170135498046875,
0.0462646484375,
0.05364990234375,
-0.04486083984375,
0.06256103515625,
-0.036346435546875,
-0.057769775390625,
-0.048675537109375,
-0.01175689697265625,
0.043121337890625,
0.033233642578125,
0.0430908203125,
-0.0242919921875,
-0.034027099609375,
-0.07720947265625,
-0.024261474609375,
-0.0165557861328125,
-0.01078033447265625,
0.0148162841796875,
0.058502197265625,
-0.0035610198974609375,
0.0567626953125,
-0.03369140625,
-0.02886962890625,
-0.00727081298828125,
0.01386260986328125,
0.037261962890625,
0.06427001953125,
0.045654296875,
-0.0660400390625,
-0.04962158203125,
-0.01346588134765625,
-0.05364990234375,
-0.0128326416015625,
0.0148162841796875,
-0.023773193359375,
0.0217132568359375,
0.0211334228515625,
-0.036651611328125,
0.0186309814453125,
0.037933349609375,
-0.046051025390625,
0.045654296875,
-0.01263427734375,
0.01555633544921875,
-0.098876953125,
0.01522064208984375,
-0.0081787109375,
-0.0034542083740234375,
-0.050384521484375,
0.0001723766326904297,
0.0026397705078125,
0.001720428466796875,
-0.04473876953125,
0.06005859375,
-0.036712646484375,
0.0003974437713623047,
0.0195159912109375,
-0.0012693405151367188,
0.003368377685546875,
0.060791015625,
0.00012385845184326172,
0.058868408203125,
0.0452880859375,
-0.03887939453125,
-0.00006508827209472656,
0.029632568359375,
-0.027801513671875,
0.0285797119140625,
-0.060028076171875,
-0.005275726318359375,
0.01824951171875,
-0.005130767822265625,
-0.049957275390625,
0.007579803466796875,
0.0253143310546875,
-0.057586669921875,
0.0273895263671875,
-0.013885498046875,
-0.060546875,
-0.015655517578125,
-0.0222930908203125,
0.03448486328125,
0.04656982421875,
-0.01232147216796875,
0.0513916015625,
0.0165557861328125,
-0.01024627685546875,
-0.034027099609375,
-0.069580078125,
-0.0139617919921875,
-0.026275634765625,
-0.052459716796875,
0.02490234375,
-0.0362548828125,
-0.004573822021484375,
0.006549835205078125,
0.02349853515625,
-0.00911712646484375,
0.004302978515625,
0.00882720947265625,
0.0239105224609375,
-0.0211334228515625,
0.01544952392578125,
-0.00479888916015625,
-0.0168914794921875,
0.0017938613891601562,
-0.032745361328125,
0.03350830078125,
-0.0260162353515625,
-0.024505615234375,
-0.04229736328125,
0.0238494873046875,
0.043670654296875,
-0.03375244140625,
0.0638427734375,
0.03662109375,
-0.0196380615234375,
0.01467132568359375,
-0.0369873046875,
-0.00815582275390625,
-0.03277587890625,
0.022491455078125,
-0.019012451171875,
-0.056121826171875,
0.044097900390625,
0.0185394287109375,
0.029083251953125,
0.06842041015625,
0.05426025390625,
0.011627197265625,
0.061737060546875,
0.03466796875,
0.0183258056640625,
0.034942626953125,
-0.03662109375,
-0.00400543212890625,
-0.0748291015625,
-0.011474609375,
-0.055206298828125,
-0.01461029052734375,
-0.059661865234375,
-0.0341796875,
0.0211639404296875,
-0.0022373199462890625,
-0.024566650390625,
0.03765869140625,
-0.037109375,
0.0060577392578125,
0.049957275390625,
-0.0091705322265625,
0.025238037109375,
-0.0007567405700683594,
-0.035980224609375,
-0.0239715576171875,
-0.044677734375,
-0.043853759765625,
0.10723876953125,
0.033599853515625,
0.0209197998046875,
0.0168914794921875,
0.039764404296875,
0.01387786865234375,
0.0218505859375,
-0.0439453125,
0.036865234375,
-0.01021575927734375,
-0.06842041015625,
-0.0273895263671875,
-0.043670654296875,
-0.06005859375,
0.0396728515625,
-0.016326904296875,
-0.04547119140625,
0.032623291015625,
0.00391387939453125,
-0.01183319091796875,
0.0261077880859375,
-0.05548095703125,
0.07110595703125,
-0.0168914794921875,
-0.019012451171875,
0.017547607421875,
-0.045806884765625,
0.01776123046875,
-0.00804901123046875,
0.03167724609375,
-0.011688232421875,
0.00328826904296875,
0.06842041015625,
-0.023773193359375,
0.045196533203125,
-0.00521087646484375,
-0.002925872802734375,
0.00942230224609375,
0.007293701171875,
0.03668212890625,
-0.00849151611328125,
-0.035919189453125,
0.0312347412109375,
0.006744384765625,
-0.0289306640625,
-0.0181427001953125,
0.03692626953125,
-0.058319091796875,
-0.0307769775390625,
-0.043701171875,
-0.048248291015625,
0.0026378631591796875,
0.031890869140625,
0.038116455078125,
0.052337646484375,
-0.0207061767578125,
0.0272369384765625,
0.06292724609375,
-0.033782958984375,
0.0222930908203125,
0.051422119140625,
-0.0157318115234375,
-0.038299560546875,
0.0531005859375,
0.0234222412109375,
0.031280517578125,
0.0309906005859375,
0.0186309814453125,
-0.0188446044921875,
-0.04345703125,
-0.048004150390625,
0.0212554931640625,
-0.03515625,
-0.00902557373046875,
-0.059051513671875,
-0.01183319091796875,
-0.0306854248046875,
0.011962890625,
-0.0433349609375,
-0.048065185546875,
-0.023529052734375,
-0.01059722900390625,
0.01971435546875,
0.017181396484375,
-0.01381683349609375,
0.0155181884765625,
-0.053863525390625,
0.00011342763900756836,
-0.007568359375,
0.021881103515625,
-0.016510009765625,
-0.07855224609375,
-0.0303192138671875,
0.00833892822265625,
-0.0305633544921875,
-0.0625,
0.042724609375,
0.01357269287109375,
0.0291290283203125,
0.0195770263671875,
0.0181121826171875,
0.0238800048828125,
-0.049468994140625,
0.074951171875,
0.006656646728515625,
-0.055450439453125,
0.03363037109375,
-0.030517578125,
0.022735595703125,
0.050872802734375,
0.0287322998046875,
-0.027862548828125,
-0.046051025390625,
-0.0657958984375,
-0.07110595703125,
0.061309814453125,
0.044708251953125,
0.00975799560546875,
0.01213836669921875,
0.0037136077880859375,
0.00220489501953125,
0.005939483642578125,
-0.08837890625,
-0.044464111328125,
0.0008826255798339844,
-0.0131988525390625,
-0.006572723388671875,
-0.0296478271484375,
-0.0160980224609375,
-0.033538818359375,
0.078125,
0.0176849365234375,
0.0309295654296875,
0.031646728515625,
-0.00939178466796875,
-0.0000362396240234375,
0.0333251953125,
0.049957275390625,
0.039306640625,
-0.0352783203125,
-0.0037403106689453125,
0.020904541015625,
-0.033447265625,
0.002735137939453125,
0.0200347900390625,
-0.04400634765625,
0.0196990966796875,
0.0291290283203125,
0.06591796875,
0.007266998291015625,
-0.034454345703125,
0.050750732421875,
-0.01241302490234375,
-0.036102294921875,
-0.04327392578125,
-0.0185394287109375,
0.0026531219482421875,
0.0036144256591796875,
0.0178680419921875,
0.005191802978515625,
0.0186309814453125,
-0.0238800048828125,
0.00841522216796875,
0.00714111328125,
-0.036895751953125,
-0.0310821533203125,
0.05096435546875,
0.01416778564453125,
-0.022247314453125,
0.027130126953125,
-0.031890869140625,
-0.045806884765625,
0.038604736328125,
0.0223236083984375,
0.07366943359375,
-0.0148162841796875,
-0.0050201416015625,
0.062469482421875,
0.04132080078125,
-0.00827789306640625,
0.01419830322265625,
0.0169525146484375,
-0.040283203125,
-0.026031494140625,
-0.05902099609375,
0.003490447998046875,
0.004024505615234375,
-0.056915283203125,
0.021636962890625,
0.01561737060546875,
-0.0174560546875,
-0.0211334228515625,
0.0151214599609375,
-0.0447998046875,
0.005695343017578125,
-0.01383209228515625,
0.08203125,
-0.07135009765625,
0.0557861328125,
0.042877197265625,
-0.038665771484375,
-0.0595703125,
-0.01331329345703125,
-0.0060577392578125,
-0.0309906005859375,
0.051300048828125,
0.007419586181640625,
0.0177001953125,
-0.0006432533264160156,
-0.021881103515625,
-0.07080078125,
0.07684326171875,
0.0213470458984375,
-0.032684326171875,
0.002033233642578125,
0.007328033447265625,
0.0478515625,
-0.0204620361328125,
0.0287933349609375,
0.02362060546875,
0.04351806640625,
-0.00157928466796875,
-0.07086181640625,
-0.006072998046875,
-0.045806884765625,
-0.00678253173828125,
0.0182952880859375,
-0.043121337890625,
0.08221435546875,
0.020904541015625,
-0.0182342529296875,
0.00494384765625,
0.051055908203125,
0.01206207275390625,
0.01328277587890625,
0.0306243896484375,
0.06640625,
0.040283203125,
-0.02099609375,
0.07708740234375,
-0.0291748046875,
0.038421630859375,
0.0845947265625,
0.00940704345703125,
0.06427001953125,
0.0297088623046875,
-0.02667236328125,
0.03271484375,
0.05615234375,
-0.01357269287109375,
0.0308685302734375,
0.0008983612060546875,
0.0104217529296875,
-0.013885498046875,
-0.00011032819747924805,
-0.055755615234375,
0.00969696044921875,
0.0084228515625,
-0.0299224853515625,
0.0032978057861328125,
-0.004505157470703125,
0.0196533203125,
-0.0026416778564453125,
-0.00449371337890625,
0.040924072265625,
0.0161285400390625,
-0.054046630859375,
0.054779052734375,
0.006900787353515625,
0.050537109375,
-0.05035400390625,
0.013336181640625,
-0.019927978515625,
0.01216888427734375,
-0.004150390625,
-0.044342041015625,
0.0261383056640625,
0.01555633544921875,
-0.024505615234375,
-0.0306854248046875,
0.0174560546875,
-0.04302978515625,
-0.06353759765625,
0.03338623046875,
0.042816162109375,
0.0312347412109375,
0.016082763671875,
-0.06707763671875,
-0.0097198486328125,
0.008880615234375,
-0.050262451171875,
0.01190948486328125,
0.045318603515625,
0.01611328125,
0.037933349609375,
0.04962158203125,
0.01329803466796875,
0.0132293701171875,
-0.007175445556640625,
0.054443359375,
-0.024139404296875,
-0.032257080078125,
-0.066162109375,
0.057220458984375,
-0.0059967041015625,
-0.043609619140625,
0.06768798828125,
0.0697021484375,
0.076904296875,
0.003940582275390625,
0.03643798828125,
-0.0122528076171875,
0.051055908203125,
-0.042572021484375,
0.0458984375,
-0.07122802734375,
0.022735595703125,
-0.01947021484375,
-0.06256103515625,
-0.019439697265625,
0.0292510986328125,
-0.026397705078125,
-0.0008006095886230469,
0.057037353515625,
0.05792236328125,
-0.006755828857421875,
-0.015045166015625,
0.0161285400390625,
0.0307464599609375,
0.0159759521484375,
0.040802001953125,
0.03668212890625,
-0.06475830078125,
0.038299560546875,
-0.0169830322265625,
-0.0157012939453125,
-0.010772705078125,
-0.05645751953125,
-0.0643310546875,
-0.05560302734375,
-0.01003265380859375,
-0.029388427734375,
-0.01024627685546875,
0.07733154296875,
0.0299224853515625,
-0.058868408203125,
-0.033721923828125,
0.006103515625,
0.002033233642578125,
-0.0150909423828125,
-0.01313018798828125,
0.056060791015625,
-0.01331329345703125,
-0.0693359375,
0.006824493408203125,
0.0016498565673828125,
0.004589080810546875,
-0.01471710205078125,
-0.0240936279296875,
-0.03668212890625,
0.003734588623046875,
0.0278472900390625,
0.00458526611328125,
-0.049285888671875,
0.0103912353515625,
0.01477813720703125,
-0.018096923828125,
0.0204010009765625,
0.0211181640625,
-0.01371002197265625,
0.0355224609375,
0.051910400390625,
0.02801513671875,
0.031585693359375,
-0.011474609375,
0.045196533203125,
-0.0374755859375,
0.0263214111328125,
0.016815185546875,
0.038848876953125,
0.025054931640625,
-0.00847625732421875,
0.05548095703125,
0.0258941650390625,
-0.043365478515625,
-0.08184814453125,
0.0007691383361816406,
-0.06280517578125,
-0.00838470458984375,
0.09295654296875,
-0.0225372314453125,
-0.01220703125,
0.00914764404296875,
-0.004314422607421875,
0.02484130859375,
-0.02484130859375,
0.034912109375,
0.062469482421875,
0.01641845703125,
0.0149688720703125,
-0.0670166015625,
0.033355712890625,
0.03997802734375,
-0.05810546875,
-0.0037899017333984375,
0.0211334228515625,
0.01666259765625,
0.0212249755859375,
0.053466796875,
-0.030487060546875,
0.0022983551025390625,
-0.0163116455078125,
0.028839111328125,
-0.006404876708984375,
-0.004619598388671875,
-0.02325439453125,
-0.01006317138671875,
-0.0010080337524414062,
-0.0011358261108398438
]
] |