neu_text_classification / added_tokens.json
hung20gg · Upload tokenizer · commit 90a1a67 · 22 Bytes
{
"<mask>": 64000
}
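The file maps each added special token to its vocabulary id: `<mask>` is assigned id 64000, i.e. it sits just past a 64000-entry base vocabulary (ids 0..63999). A minimal sketch of reading such a mapping with Python's `json` module (the inlined string mirrors the file above; this is illustrative parsing, not the Hugging Face loading code itself):

```python
import json

# Contents of added_tokens.json: added special tokens -> vocabulary ids.
added_tokens_json = '{\n  "<mask>": 64000\n}'

added_tokens = json.loads(added_tokens_json)
mask_id = added_tokens["<mask>"]
print(mask_id)  # 64000
```

In a transformers-style tokenizer, tokens listed here are appended after the base vocabulary, so id 64000 implies the base tokenizer has exactly 64000 entries before `<mask>` is added.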