Update README.md
README.md (CHANGED)
@@ -7,6 +7,28 @@ language:
 metrics:
 - bleu
 - chrf
-- comet
 pipeline_tag: translation
 ---
# Model info

This model was distilled from a Tatoeba-MT teacher, [Tatoeba-MT-models/kor-eng/opusTCv20210807-sepvoc_transformer-big_2022-07-28](https://object.pouta.csc.fi/Tatoeba-MT-models/kor-eng/opusTCv20210807-sepvoc_transformer-big_2022-07-28.zip), which was trained on the [Tatoeba](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/data) dataset.

We used [OpusDistillery](https://github.com/Helsinki-NLP/OpusDistillery) to train a new student model with the tiny architecture and a regular transformer decoder.
For training data, we used [Tatoeba](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/data).
The configuration file fed into OpusDistillery can be found [here](https://github.com/Helsinki-NLP/OpusDistillery/blob/main/configs/hplt/config.hplt.kor-eng.yml).

## How to run

```
>>> from transformers import pipeline

>>> pipe = pipeline("translation", model="odegiber/ko-en", max_length=256)
>>> pipe("2017년 말, 시미노프는 쇼핑 텔레비젼 채널인 QVC에 출연했다.")
[{'translation_text': 'At the end of 2017, Siminof appeared on the shopping television channel QVC.'}]
```

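The pipeline call above is the usage shown on this card. If you prefer to load the checkpoint explicitly, a standard seq2seq checkpoint can usually also be driven through `AutoTokenizer` and `AutoModelForSeq2SeqLM`; the sketch below assumes this model follows that convention (only the `odegiber/ko-en` id comes from the example above, the rest is generic `transformers` usage):

```
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumption: odegiber/ko-en is a standard encoder-decoder checkpoint
# that works with the generic seq2seq auto classes.
tokenizer = AutoTokenizer.from_pretrained("odegiber/ko-en")
model = AutoModelForSeq2SeqLM.from_pretrained("odegiber/ko-en")

# Translate a small batch of Korean sentences to English.
sentences = ["2017년 말, 시미노프는 쇼핑 텔레비젼 채널인 QVC에 출연했다."]
inputs = tokenizer(sentences, return_tensors="pt", padding=True)
outputs = model.generate(**inputs, max_length=256)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```
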
## Benchmarks

| testset   | BLEU | chr-F |
|-----------|------|-------|
| flores200 | 20.3 | 50.3  |
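
The card does not state how these scores were obtained. As one possible way to reproduce a BLEU/chr-F style evaluation, the sketch below runs the pipeline over FLORES-200 and scores the output with `sacrebleu`; the `facebook/flores` dataset id, its config name, and its column names are assumptions, not details taken from this card:

```
import sacrebleu
from datasets import load_dataset
from transformers import pipeline

pipe = pipeline("translation", model="odegiber/ko-en", max_length=256)

# Assumption: FLORES-200 devtest loaded from the facebook/flores dataset;
# the config and column names below may differ from what was actually used.
flores = load_dataset("facebook/flores", "kor_Hang-eng_Latn", split="devtest")
sources = flores["sentence_kor_Hang"]
references = flores["sentence_eng_Latn"]

hypotheses = [out["translation_text"] for out in pipe(sources)]

print("BLEU :", sacrebleu.corpus_bleu(hypotheses, [references]).score)
print("chr-F:", sacrebleu.corpus_chrf(hypotheses, [references]).score)
```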