fzengin18/multrenizer
Library: Transformers
Datasets: wikimedia/wikipedia, Helsinki-NLP/opus-100
Languages: Turkish, English
Tags: tokenizer, tokenizers, unigram, turkish, english, bilingual
arXiv: 2508.08424
License: apache-2.0
Branch: main (repository size 1.86 MB)
1 contributor, 7 commits
Latest commit by fzengin18: "Add dataset metadata to model card" (e1fe4bd, verified, 1 day ago)
.gitattributes            1.52 kB   initial commit                        1 day ago
LICENSE                   10.8 kB   Upload LICENSE with huggingface_hub   1 day ago
README.md                 14.9 kB   Add dataset metadata to model card    1 day ago
special_tokens_map.json   16.2 kB   Upload folder using huggingface_hub   1 day ago
tokenizer.json            1.8 MB    Upload folder using huggingface_hub   1 day ago
tokenizer_config.json     16.9 kB   Upload folder using huggingface_hub   1 day ago
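Since the repository ships a standalone `tokenizer.json` plus `tokenizer_config.json` and `special_tokens_map.json`, it can be loaded with the `tokenizers` library or via `transformers.AutoTokenizer`. Below is a minimal sketch of how a Unigram tokenizer of this kind works, assuming the `tokenizers` package is installed; the toy vocabulary is purely illustrative and is not the actual multrenizer vocabulary, which lives in the repository's `tokenizer.json`.

```python
from tokenizers import Tokenizer, models

# Toy Unigram vocabulary: (token, log-probability) pairs.
# Illustrative only; the real multrenizer vocabulary is in tokenizer.json.
vocab = [("<unk>", -10.0), ("he", -1.0), ("llo", -1.5), ("hello", -0.5)]
tok = Tokenizer(models.Unigram(vocab, unk_id=0))

# Viterbi segmentation picks the highest-scoring split: the single token
# "hello" (-0.5) beats "he" + "llo" (-2.5).
print(tok.encode("hello").tokens)

# Loading the published tokenizer itself (requires network access):
#   from transformers import AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("fzengin18/multrenizer")
```

The same `Tokenizer.from_file("tokenizer.json")` call would load the repository's file directly after a local download.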