hash-map/got_tokenizer
Tags: sentence-transformers, English
License: mit
Branch: main
Repository size: 1.24 MB — 1 contributor, 3 commits
Latest commit: d6a544f (verified) by hash-map — "Update README.md", 3 months ago
File               Size       Last commit         When
.gitattributes     1.52 kB    initial commit      3 months ago
README.md          71 Bytes   Update README.md    3 months ago
icefire_spm.model  743 kB     Upload 3 files      3 months ago
icefire_spm.vocab  492 kB     Upload 3 files      3 months ago
usage.py           767 Bytes  Upload 3 files      3 months ago