Transformers
example-custom-tokenizer / added_tokens.json
{
"<|file_separator|>": 32000
}
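The file above maps each added token string to its vocabulary id: here, `<|file_separator|>` is assigned id 32000. A minimal sketch of how such a mapping is consumed, using only the standard library (the inlined JSON string mirrors this file; in practice the file is read from the tokenizer directory):

```python
import json

# Contents of added_tokens.json: added token string -> vocabulary id.
added_tokens_json = '{"<|file_separator|>": 32000}'
added_tokens = json.loads(added_tokens_json)

# Forward lookup: token string to id.
token_id = added_tokens["<|file_separator|>"]
print(token_id)  # 32000

# Reverse lookup: id to token string, as a tokenizer would use when decoding.
id_to_token = {v: k for k, v in added_tokens.items()}
print(id_to_token[32000])  # <|file_separator|>
```

When a Transformers tokenizer is loaded, entries from `added_tokens.json` extend the base vocabulary, so the token is encoded and decoded as a single unit rather than being split by the underlying tokenization model.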