How to use from the Transformers library
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="Nextcloud-AI/opus-mt-fi-en")
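On transformers v4.x, where the "translation" pipeline is still supported, the pipeline can be called directly on Finnish text. A minimal sketch; the Finnish sample sentence is illustrative and not from the model card:

```python
# Assumes transformers v4.x ("translation" pipelines were dropped in v5).
from transformers import pipeline

pipe = pipeline("translation", model="Nextcloud-AI/opus-mt-fi-en")

# "Hyvää huomenta" ("Good morning") is an illustrative Finnish input.
result = pipe("Hyvää huomenta")
print(result[0]["translation_text"])
```

The pipeline returns a list of dicts, one per input, each with a `translation_text` key.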
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Nextcloud-AI/opus-mt-fi-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Nextcloud-AI/opus-mt-fi-en")
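With the tokenizer and model loaded directly, translation goes through `model.generate`. A minimal sketch, where the Finnish input sentence is an illustrative example, not from the model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Nextcloud-AI/opus-mt-fi-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Nextcloud-AI/opus-mt-fi-en")

# Tokenize the Finnish source text (illustrative example sentence).
inputs = tokenizer("Hyvää huomenta", return_tensors="pt")

# Generate English output token ids, then decode them back to a string.
output_ids = model.generate(**inputs)
translation = tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0]
print(translation)
```

`batch_decode` with `skip_special_tokens=True` strips padding and end-of-sequence tokens from the decoded text.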
Language pair: fin-eng
Benchmarks

| testset | BLEU | chr-F |
|---------|------|-------|
| newsdev2015-enfi-fineng.fin.eng | 25.3 | 0.536 |
| newstest2015-enfi-fineng.fin.eng | 26.9 | 0.547 |
| newstest2016-enfi-fineng.fin.eng | 29.0 | 0.571 |
| newstest2017-enfi-fineng.fin.eng | 32.3 | 0.594 |
| newstest2018-enfi-fineng.fin.eng | 23.8 | 0.517 |
| newstest2019-fien-fineng.fin.eng | 29.0 | 0.565 |
| newstestB2016-enfi-fineng.fin.eng | 24.5 | 0.527 |
| newstestB2017-enfi-fineng.fin.eng | 27.4 | 0.557 |
| newstestB2017-fien-fineng.fin.eng | 27.4 | 0.557 |
| Tatoeba-test.fin.eng | 53.4 | 0.697 |
