OdiaGenAI/facebook-nllb-200-3.3B-finetuned-odia
For the T2T task of the Workshop on Asian Translation (2025), these are fine-tuned models with NLLB-200-3.3B as the base model, trained on the WAT data plus 100k Samanantar pairs.
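A minimal inference sketch for the checkpoint named above, using the standard Hugging Face `transformers` NLLB interface. The FLORES-200 language codes (`eng_Latn`, `ory_Orya`) are assumptions for an English-to-Odia direction; adjust them for other directions covered by these models.

```python
def translate(text, src_lang="eng_Latn", tgt_lang="ory_Orya"):
    """Translate `text` with the fine-tuned NLLB checkpoint.

    Sketch only: the language codes above are assumed for an
    English->Odia direction, not confirmed by the model card.
    """
    # Lazy import: the 3.3B checkpoint download is large, so nothing
    # heavy happens until the function is actually called.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    model_id = "OdiaGenAI/facebook-nllb-200-3.3B-finetuned-odia"
    tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang=src_lang)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

    inputs = tokenizer(text, return_tensors="pt")
    # NLLB selects the target language by forcing its language token
    # as the first generated token.
    out = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt_lang),
        max_new_tokens=128,
    )
    return tokenizer.batch_decode(out, skip_special_tokens=True)[0]


if __name__ == "__main__":
    print(translate("How are you today?"))
```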
Note: For WAT2025, Challenge set: BLEU 56.40, RIBES 0.916177; Evaluation set: BLEU 62.90, RIBES 0.903659
Note: For WAT2025, Challenge set: BLEU 44.20, RIBES 0.775824; Evaluation set: BLEU 43.20, RIBES 0.708217
Note: For WAT2025, Challenge set: BLEU 50.10, RIBES 0.830882; Evaluation set: BLEU 49.50, RIBES 0.804158
Note: For WAT2025, Challenge set: BLEU 56.90, RIBES 0.870254; Evaluation set: BLEU 45.10, RIBES 0.831282
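The scores above are BLEU (n-gram precision against a reference, scaled to 0-100) and RIBES (a rank-correlation-based metric sensitive to word order). As a toy illustration of what BLEU measures, here is a stdlib-only sentence-level BLEU sketch; the official WAT numbers come from the shared-task evaluation tooling, not from code like this.

```python
import math
from collections import Counter


def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU with uniform n-gram weights and brevity penalty.

    Unsmoothed sketch for illustration; real evaluations use tooling
    with smoothing and standardized tokenization.
    """
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        c_counts = Counter(ngrams(cand, n))
        r_counts = Counter(ngrams(ref, n))
        # Clipped overlap: each candidate n-gram counts at most as
        # often as it appears in the reference.
        overlap = sum((c_counts & r_counts).values())
        total = len(cand) - n + 1
        if total <= 0 or overlap == 0:
            return 0.0  # no smoothing in this sketch
        precisions.append(overlap / total)
    # Brevity penalty discourages translations shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)


# A perfect match scores 100 on the conventional 0-100 scale.
print(round(100 * bleu("the cat sat on the mat", "the cat sat on the mat"), 2))  # 100.0
```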