model2 / special_tokens_map.json

Commit History

add tokenizer
331734d

idsedykh committed on