# My ByT5 Fine-Tuned Model
## Model description
This is a fine-tuned version of google/byt5-small.
It was trained on my custom dataset for transliterating Yiddish characters to Latin characters.
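ByT5 is tokenizer-free: it operates directly on the raw UTF-8 bytes of the input, with token ids offset by 3 so that ids 0–2 are reserved for `<pad>`, `</s>`, and `<unk>`. A minimal sketch of that byte-to-id mapping (the helper name `byt5_token_ids` and the sample word are illustrative, not part of this model's API):

```python
def byt5_token_ids(text: str) -> list[int]:
    """Map text to ByT5-style token ids: UTF-8 byte value + 3.

    Ids 0-2 are reserved for the <pad>, </s>, and <unk> special tokens.
    """
    return [b + 3 for b in text.encode("utf-8")]

# ASCII characters are one byte each; 'a' (byte 97) becomes id 100.
print(byt5_token_ids("a"))

# Hebrew-script Yiddish characters are two UTF-8 bytes each,
# so a four-letter word yields eight token ids.
print(byt5_token_ids("שלום"))
```

This is why ByT5 handles Yiddish without any vocabulary changes: every Hebrew-script character is just a short sequence of bytes the model already covers.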
## Intended uses & limitations
- Intended use: transliterating Yiddish text (Hebrew script) into Latin characters
## Training data
Transliterated Wiktionary data and Bible verses were used for model development.
## How to use
Load with 🤗 Transformers:

```python
from transformers import pipeline

# ByT5 is an encoder-decoder model, so use the text2text-generation pipeline
transliterator = pipeline("text2text-generation", model="shoowadoo/galkhesnet")

# Single best transliteration
transliterator("Your Yiddish word here")[0]["generated_text"]

# Several candidate transliterations via beam search
transliterator("Your Yiddish word here", num_beams=10, num_return_sequences=10, max_length=100)
```