mkshing/novelai-tokenizer-v1
Languages: English, Japanese
Tags: tokenizer, novelai, sentencepiece
License: gpl-2.0
File: special_tokens_map.json (branch: main)
Commit 66107eb by mkshing ("Upload 3 files", almost 3 years ago)
91 Bytes
{
  "eos_token": "<|endoftext|>",
  "pad_token": "<|pad|>",
  "unk_token": "<|unknown|>"
}
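The file maps each special-token role the tokenizer reserves (end-of-sequence, padding, unknown) to its literal token string. A minimal sketch of reading these entries with the standard-library `json` module (the JSON string below is the file's contents inlined for a self-contained example; in practice you would open a downloaded copy of `special_tokens_map.json`):

```python
import json

# Contents of special_tokens_map.json as shipped in this repo.
SPECIAL_TOKENS_JSON = (
    '{"eos_token": "<|endoftext|>", '
    '"pad_token": "<|pad|>", '
    '"unk_token": "<|unknown|>"}'
)

# Each key names a role; each value is the literal token string.
special_tokens = json.loads(SPECIAL_TOKENS_JSON)

print(special_tokens["eos_token"])  # <|endoftext|>
print(special_tokens["pad_token"])  # <|pad|>
print(special_tokens["unk_token"])  # <|unknown|>
```

When the repo is loaded through the Hugging Face `transformers` library (e.g. `AutoTokenizer.from_pretrained`), this file is what backs attributes such as `tokenizer.eos_token` and `tokenizer.pad_token`.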