Tags: Transformers · Safetensors · mt5 · text2text-generation
How to use with the Transformers library
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("halfrot/sft-mt5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("halfrot/sft-mt5-base")
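As a minimal sketch of how the loaded model could then be queried, the snippet below runs a single generation step; the source sentence and generation settings are illustrative assumptions and are not specified by this model card.

# Example inference sketch (assumed input and settings, not from the model card)
inputs = tokenizer("Das ist ein kleines Haus.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))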

The trained SFT policy for the machine translation (MT) task in the paper "ALaRM: Align Language Models via Hierarchical Rewards Modeling".

Check out our project page for more information.

Model size: 0.6B params · Tensor type: F32 (Safetensors)

Dataset used to train halfrot/sft-mt5-base

Paper for halfrot/sft-mt5-base: "ALaRM: Align Language Models via Hierarchical Rewards Modeling"