fzengin18/multrenizer
Languages: Turkish, English
Tags: tokenizers, tokenizer, unigram, turkish, english, bilingual, sentencepiece
Paper: arXiv:2508.08424
License: apache-2.0
Repository size: 1.84 MB
History: 2 commits, 1 contributor (fzengin18)
Latest commit: 8306520 (verified), "Upload folder using huggingface_hub", 15 days ago
Files:
.gitattributes             1.52 kB    initial commit                         15 days ago
README.md                  31 Bytes   initial commit                         15 days ago
special_tokens_map.json    16.2 kB    Upload folder using huggingface_hub    15 days ago
tokenizer.json             1.8 MB     Upload folder using huggingface_hub    15 days ago
tokenizer_config.json      16.9 kB    Upload folder using huggingface_hub    15 days ago
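The repository's tags identify this as a Unigram (SentencePiece-style) tokenizer for Turkish and English, with the trained vocabulary stored in tokenizer.json. As a rough illustration of what such a vocabulary is used for, here is a minimal sketch of Unigram segmentation via Viterbi search; the toy vocabulary and its log-probability scores below are hypothetical and not taken from this model.

```python
import math

# Hypothetical toy vocabulary of (piece, log-probability) entries; a real
# tokenizer.json for a Unigram model stores thousands of these.
vocab = {
    "▁": -3.0, "▁the": -2.0, "t": -4.0, "h": -4.0, "e": -4.0,
    "▁kedi": -2.5, "▁cat": -2.5, "ke": -3.5, "di": -3.5,
}

def unigram_segment(text):
    """Viterbi search for the highest-scoring segmentation of `text`
    into vocabulary pieces (the core of a Unigram/SentencePiece model)."""
    # SentencePiece marks word boundaries with the "▁" meta symbol.
    text = "▁" + text.replace(" ", "▁")
    n = len(text)
    best = [(-math.inf, 0)] * (n + 1)  # (best score, backpointer) per position
    best[0] = (0.0, 0)
    for end in range(1, n + 1):
        for start in range(end):
            piece = text[start:end]
            if piece in vocab and best[start][0] + vocab[piece] > best[end][0]:
                best[end] = (best[start][0] + vocab[piece], start)
    # Follow backpointers to recover the winning pieces.
    pieces, pos = [], n
    while pos > 0:
        start = best[pos][1]
        pieces.append(text[start:pos])
        pos = start
    return pieces[::-1]

print(unigram_segment("the kedi"))  # → ['▁the', '▁kedi']
```

A production tokenizer does the same search over the scores shipped in tokenizer.json, plus normalization and the special tokens declared in special_tokens_map.json.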