---
title: MBartTranslator
emoji: 
colorFrom: pink
colorTo: indigo
sdk: gradio
sdk_version: 3.15.0
app_file: app.py
pinned: false
---

# mBART-50

mBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the "Multilingual Denoising Pretraining" objective. It was introduced in the paper [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401).

## Model description

mBART-50 is a multilingual Sequence-to-Sequence model. It was introduced to show that multilingual translation models can be created through multilingual fine-tuning.
Instead of fine-tuning on a single translation direction, a pre-trained model is fine-tuned on many directions simultaneously. mBART-50 was created by extending the original mBART model with 25 additional languages, yielding a multilingual machine translation model covering 50 languages.
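As a sketch of how a many-to-many mBART-50 checkpoint can be used for translation with the Hugging Face `transformers` library (the checkpoint name, sentence, and language codes below are illustrative assumptions, not necessarily what this Space's `app.py` uses):

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

# Assumed checkpoint: the many-to-many fine-tuned mBART-50 from the Hub.
model_name = "facebook/mbart-large-50-many-to-many-mmt"
model = MBartForConditionalGeneration.from_pretrained(model_name)
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)

# Translate French -> English: set the source language on the tokenizer,
# then force the target language code as the first generated token.
tokenizer.src_lang = "fr_XX"
inputs = tokenizer("Le chat dort sur le canapé.", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
)
translation = tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
print(translation)
```

Because the model was fine-tuned on many directions at once, any of the 50 supported language codes can be substituted for `fr_XX` and `en_XX`.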

## Docker with GPU

```bash
docker run -it --gpus all -p 7860:7860 --platform=linux/amd64 \
	registry.hf.space/wall-e-zz-mbarttranslator:latest python app.py
```

## Docker with CPU

```bash
docker run -it -p 7860:7860 --platform=linux/amd64 \
	registry.hf.space/wall-e-zz-mbarttranslator:latest python app.py
```