# T5 English → German Translator
This repository hosts a fine-tuned **T5 model** for **English → German translation**. The model, training notebook, and interactive demo are maintained by [@chinesemusk](https://huggingface.co/chinesemusk).
---
## Model Information
- **Architecture**: T5-small (Text-to-Text Transfer Transformer)
- **Task**: English → German Translation (seq2seq)
- **Tokenizer**: SentencePiece (`spiece.model` + `tokenizer.json`)
- **Training Code**: Available in this [Google Colab / GitHub notebook](https://github.com/Deon62/Eng-German-Translator-model/blob/main/translator.ipynb)
- **Demo**: Interactive UI hosted via Gradio in my Hugging Face Space: [Kahnwald Translator Demo](https://huggingface.co/spaces/chinesemusk/Kahnwald)
---
## Use the Model
Load and run translations with just a few lines:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "chinesemusk/t5-en-de-translator"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# T5 expects a task prefix before the source text
text = "This is an example."
inputs = tokenizer(f"translate English to German: {text}", return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_length=60)

print("EN:", text)
print("DE:", tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Try It Live
Don't want to code? Try the model directly in your browser via the [Kahnwald Gradio demo](https://huggingface.co/spaces/chinesemusk/Kahnwald): enter text, select the direction (English → German or German → English), and get translations instantly.
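For scripted use, the snippet above can be wrapped in a small helper. A minimal sketch using the `transformers` `pipeline` API (the `build_prompt` and `translate` helper names are illustrative, not part of this repo):

```python
def build_prompt(text: str, src: str = "English", tgt: str = "German") -> str:
    """Build the T5 task prefix for a given translation direction."""
    return f"translate {src} to {tgt}: {text}"

def translate(text: str, src: str = "English", tgt: str = "German") -> str:
    """Translate via the transformers pipeline (downloads the model on first call)."""
    from transformers import pipeline  # imported lazily so the helper is cheap to load
    translator = pipeline("text2text-generation", model="chinesemusk/t5-en-de-translator")
    return translator(build_prompt(text, src, tgt), max_length=60)[0]["generated_text"]

print(build_prompt("This is an example."))
# translate English to German: This is an example.
```

Keeping the prefix construction in its own function makes it easy to reuse for both directions.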
## Purpose & Limitations
**Purpose**: Educational and prototyping usage: learn how translation fine-tuning works and test small-scale translation tasks.

**Limitations**:
- Fine-tuned on a small dataset slice, so quality may vary on long or complex sentences.
- Not designed for production-level accuracy or large-scale deployment.
- Direction "German → English" works but may produce less accurate results since the model was only lightly fine-tuned for that direction.
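The reverse direction uses the same checkpoint with the task prefix flipped. A minimal sketch (the `reverse_prompt` helper name is illustrative):

```python
def reverse_prompt(text: str) -> str:
    """Flip the T5 task prefix for the lightly fine-tuned German -> English direction."""
    return f"translate German to English: {text}"

# Usage with the model and tokenizer loaded as in the snippet above:
# inputs = tokenizer(reverse_prompt("Das ist ein Beispiel."), return_tensors="pt", truncation=True)
# outputs = model.generate(**inputs, max_length=60)

print(reverse_prompt("Das ist ein Beispiel."))
# translate German to English: Das ist ein Beispiel.
```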
## Acknowledgments
- Model built using the Hugging Face `transformers`, `datasets`, and `evaluate` libraries.
- Huge thanks to the original T5 authors (Google Research).
- Demo powered by Gradio in a Hugging Face Space.
## References
- Training Notebook: [translator.ipynb on GitHub](https://github.com/Deon62/Eng-German-Translator-model/blob/main/translator.ipynb)
- Gradio Demo Space: [Kahnwald](https://huggingface.co/spaces/chinesemusk/Kahnwald)