TranslateGemma Technical Report (paper 2601.09012)
Text-only (no-vision) conversion of google/translategemma-4b-it, saved in FP16 (safetensors).
The tokenizer is taken from google/gemma-3-1b-it.
This repo contains a converted Gemma3ForCausalLM checkpoint extracted from the language component of the original multimodal model:
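Conceptually, the conversion keeps only the language-model tensors and renames them to the layout `Gemma3ForCausalLM` expects. A minimal sketch of that key-remapping step, assuming the multimodal state dict prefixes language weights with `language_model.` and stores vision weights under `vision_tower.` / `multi_modal_projector.` (the exact prefixes depend on the `transformers` version, so verify against the actual checkpoint):

```python
def remap_key(key):
    """Map a multimodal checkpoint key to its text-only Gemma3ForCausalLM name.

    Returns None for vision-tower / projector keys, which are dropped.
    The "language_model." prefix is an assumption about the checkpoint layout.
    """
    prefix = "language_model."
    if key.startswith(prefix):
        return key[len(prefix):]
    return None

# Illustrative keys only, not the real checkpoint contents:
old_keys = [
    "language_model.model.embed_tokens.weight",
    "vision_tower.vision_model.embeddings.patch_embedding.weight",
    "language_model.lm_head.weight",
]
kept = {new: old for old in old_keys if (new := remap_key(old)) is not None}
```

After remapping, the surviving tensors can be cast to float16 and saved as safetensors.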
- Base: google/translategemma-4b-it
- Architecture: Gemma3ForCausalLM (text-only)
- Weights: float16

System prompt:

```python
SYSTEM_PROMPT = """You are a professional {source_lang} ({src_lang_code}) to {target_lang}
({tgt_lang_code}) translator. Your goal is to accurately convey the meaning and
nuances of the original {source_lang} text while adhering to {target_lang} grammar,
vocabulary, and cultural sensitivities. Produce only the {target_lang}
translation, without any additional explanations or commentary. Please translate
the following {source_lang} text into {target_lang}:\n"""
```
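The placeholders in the prompt are ordinary `str.format` fields. A quick sketch of filling them for an English-to-German pair (the language pair here is just an example):

```python
# Abbreviated copy of the prompt above; use the full text in practice.
SYSTEM_PROMPT = ("You are a professional {source_lang} ({src_lang_code}) to {target_lang} "
                 "({tgt_lang_code}) translator. Please translate the following "
                 "{source_lang} text into {target_lang}:\n")

system_text = SYSTEM_PROMPT.format(
    source_lang="English", src_lang_code="en",
    target_lang="German", tgt_lang_code="de",
)
```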
(Prompt format from the google/translategemma-4b-it page, similar to google/gemma-3-1b-it.)
```python
messages = [
    {
        "role": "system",
        "content": SYSTEM_PROMPT,
    },
    {
        "role": "user",
        "content": <TEXT_TO_BE_TRANSLATED>,
    },
]
```
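For convenience, the two-message structure above can be wrapped in a small helper (hypothetical, not part of this repo) that drops the user text into place:

```python
def build_messages(system_prompt, text):
    """Build the chat messages expected by the template: system prompt + user text."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": text},
    ]

msgs = build_messages("You are a professional English (en) to German (de) translator.",
                      "Hello, world!")
```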
Example with Unsloth (`model_name` stands for the local path or Hub ID of this repo):

```python
from unsloth import FastModel

model, tokenizer = FastModel.from_pretrained(model_name)  # model_name: path or Hub ID of this repo
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
```
Base model: google/translategemma-4b-it