esenergun/wikitext_tokenizer
Library: Transformers · arXiv: 1910.09700
Branch: main · 2.03 MB · 1 contributor · History: 2 commits
Latest commit: "Upload tokenizer" by esenergun (797d57c, verified, almost 2 years ago)
.gitattributes            1.52 kB    initial commit      almost 2 years ago
README.md                 5.18 kB    Upload tokenizer    almost 2 years ago
merges.txt                280 kB     Upload tokenizer    almost 2 years ago
special_tokens_map.json   99 Bytes   Upload tokenizer    almost 2 years ago
tokenizer.json            1.27 MB    Upload tokenizer    almost 2 years ago
tokenizer_config.json     440 Bytes  Upload tokenizer    almost 2 years ago
vocab.json                479 kB     Upload tokenizer    almost 2 years ago