xiulinyang/GPT-TR-5k
Language: Turkish
A Turkish language model trained on the OPUS12 parallel corpus with a vocabulary size of 5k.
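Since the checkpoint is finetuned from GPT-2, it should load through the standard `transformers` causal-LM interface. A minimal sketch (the prompt string and generation settings are illustrative assumptions, not taken from the model card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "xiulinyang/GPT-TR-5k"

def generate(prompt: str, max_new_tokens: int = 40) -> str:
    """Continue a Turkish prompt with the 5k-vocab checkpoint."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example prompt ("Hello world" in Turkish); any Turkish text works.
    print(generate("Merhaba dünya"))
```

With only 5k subword types, the tokenizer will split Turkish text into shorter pieces than a standard 50k-vocab GPT-2 tokenizer would, so expect longer token sequences per sentence.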
Model tree for xiulinyang/GPT-TR-5k
Base model: openai-community/gpt2 (this model is one of its finetunes)
Collection including xiulinyang/GPT-TR-5k:
Parallel_multilingual_LM_varying_vocab (21 items, updated Apr 28, 2025)