Dataset: somosnlp-hackathon-2022/neutral-es
How to use feserrm/mbart-neutralization with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("feserrm/mbart-neutralization")
model = AutoModelForSeq2SeqLM.from_pretrained("feserrm/mbart-neutralization")
```
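Once loaded, the model can be used like any other seq2seq checkpoint; a minimal inference sketch (the Spanish example sentence and generation settings are illustrative assumptions, not taken from the model card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("feserrm/mbart-neutralization")
model = AutoModelForSeq2SeqLM.from_pretrained("feserrm/mbart-neutralization")

# Illustrative gendered Spanish input (an assumption; any input text works)
text = "Los trabajadores de la empresa"

# Tokenize, generate a neutralized rewrite, and decode it back to text
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64)
neutral = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(neutral)
```

The exact output depends on the fine-tuned checkpoint; the point is only the tokenize → generate → decode pattern.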
Disclaimer: this model is part of a practical exercise carried out for the University course "Machine Translation" of the Master's Degree in Language Processing and Applied AI to Linguistics at Universidad de La Rioja. This model is a fine-tuned version of facebook/mbart-large-50 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0108
- Bleu: 98.1545
- Gen Len: 18.8229
Training results:
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|---|---|---|---|---|---|
| No log | 1.0 | 440 | 0.0220 | 98.1628 | 18.8229 |
| 0.2273 | 2.0 | 880 | 0.0108 | 98.1545 | 18.8229 |
Base model: facebook/mbart-large-50