---
datasets:
- Helsinki-NLP/tatoeba
language:
- ko
- en
metrics:
- bleu
- chrf
pipeline_tag: translation
library_name: transformers
---
# Model info
Forked from [odegiber/ko-en](https://huggingface.co/odegiber/ko-en), with a `.tflite` version of the model weights added.
This model was distilled from a Tatoeba-MT teacher, [Tatoeba-MT-models/kor-eng/opusTCv20210807-sepvoc_transformer-big_2022-07-28](https://object.pouta.csc.fi/Tatoeba-MT-models/kor-eng/opusTCv20210807-sepvoc_transformer-big_2022-07-28.zip), which was trained on the [Tatoeba](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/data) dataset.
We used [OpusDistillery](https://github.com/Helsinki-NLP/OpusDistillery) to train a new student with the tiny architecture and a regular transformer decoder.
For training data, we used [Tatoeba](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/data).
The configuration file fed into OpusDistillery can be found [here](https://github.com/Helsinki-NLP/OpusDistillery/blob/main/configs/hplt/config.hplt.kor-eng.yml).
## How to run
```python
>>> from transformers import pipeline
>>> pipe = pipeline("translation", model="odegiber/ko-en", max_length=256)
>>> pipe("2017년 말, 시미노프는 쇼핑 텔레비젼 채널인 QVC에 출연했다.")
[{'translation_text': 'At the end of 2017, Siminof appeared on the shopping television channel QVC.'}]
```
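This fork also ships a `.tflite` export of the weights (mentioned above). Below is a minimal sketch of loading it with the TensorFlow Lite interpreter; the repo id `raphaelmerx/ko-en` and the filename `model.tflite` are assumptions, so check this repository's file listing for the actual values:
```python
from huggingface_hub import hf_hub_download
import tensorflow as tf

# Assumed repo id and filename -- substitute the values from this
# repository's "Files and versions" tab.
tflite_path = hf_hub_download(repo_id="raphaelmerx/ko-en", filename="model.tflite")

# Load the exported graph and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path=tflite_path)
interpreter.allocate_tensors()

# Inspect the expected input/output tensor shapes before wiring up a tokenizer.
print(interpreter.get_input_details())
print(interpreter.get_output_details())
```
Note that tokenization and decoding still have to be handled outside the interpreter, e.g. with the tokenizer from the parent `odegiber/ko-en` repository.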
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| flores200 | 20.3 | 50.3 |
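
The exact FLORES-200 split and post-processing behind these numbers are not specified here, but scores of this kind can be computed with [sacrebleu](https://github.com/mjpost/sacrebleu). A minimal sketch, with placeholder hypothesis and reference strings:
```python
import sacrebleu

# Placeholder system outputs and references; in practice these would be the
# model's translations of the FLORES-200 Korean source sentences and the
# English references for the same split.
hypotheses = ["At the end of 2017, Siminoff appeared on the shopping television channel QVC."]
# sacrebleu expects references as a list of reference streams:
# one inner list per reference set, aligned with the hypotheses.
references = [["In late 2017, Siminoff appeared on QVC, a shopping television channel."]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
chrf = sacrebleu.corpus_chrf(hypotheses, references)
print(f"BLEU: {bleu.score:.1f}  chrF: {chrf.score:.1f}")
```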