My ByT5 Fine-Tuned Model

Model description

This is a fine-tuned version of google/byt5-small.
It was fine-tuned on a custom dataset to transliterate Yiddish text from the Hebrew script into Latin characters.

Intended uses & limitations

  • Intended use: transliteration

Training data

The model was trained on transliterated Wiktionary entries and Bible verses.

How to use

Load with 🤗 Transformers:

from transformers import pipeline

# ByT5 is a sequence-to-sequence model, so use the text2text-generation task
transliterator = pipeline("text2text-generation", model="shoowadoo/galkhesnet")
transliterator("Your Yiddish word here")[0]["generated_text"]

# Return several candidate transliterations with beam search
transliterator("Your Yiddish word here", num_beams=10, num_return_sequences=10, max_length=100)
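Under the hood, ByT5 operates directly on UTF-8 bytes rather than a learned subword vocabulary, which is what lets it accept Hebrew-script Yiddish input without a language-specific tokenizer: ids 0–2 are reserved for special tokens and each byte b maps to id b + 3. A minimal sketch of that mapping (the helper `byt5_token_ids` is illustrative, not part of the model's API):

```python
# ByT5 tokenizes raw UTF-8 bytes: token id = byte value + 3, since ids 0-2
# are reserved for <pad>, </s>, and <unk>. No vocabulary file is needed,
# which is why the model handles Hebrew-script input out of the box.

def byt5_token_ids(text: str) -> list[int]:
    """Return the byte-level token ids ByT5's tokenizer would produce (without EOS)."""
    return [b + 3 for b in text.encode("utf-8")]

word = "שלום"  # Hebrew-script input, as in Yiddish
ids = byt5_token_ids(word)
print(len(word), len(ids))  # 4 characters -> 8 byte-level tokens
```

Note that sequence lengths are counted in bytes, not characters, so Hebrew-script input (two bytes per letter) consumes `max_length` faster than ASCII output does.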
Model size

0.3B parameters, stored as Safetensors (F32 tensors).
