test_tokenizer / special_tokens_map.json
{
  "bos_token": "</s>",
  "eos_token": "</s>",
  "unk_token": "<unk>",
  "pad_token": "<pad>",
  "mask_token": "<mask>",
  "additional_special_tokens": ["[BOS]", "[EOS]", "[UNK]", "[PAD]", "[MASK]"]
}
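As a minimal sketch of how this map is typically consumed, the snippet below gathers every special token it declares (the five named roles plus the `additional_special_tokens` list) using only the standard library. Note that in this file `bos_token` and `eos_token` are both `</s>`, so the same string fills two roles. The variable names here are illustrative, not part of any library API.

```python
import json

# Verbatim content of this repo's special_tokens_map.json.
raw = ('{"bos_token": "</s>", "eos_token": "</s>", "unk_token": "<unk>", '
       '"pad_token": "<pad>", "mask_token": "<mask>", '
       '"additional_special_tokens": ["[BOS]", "[EOS]", "[UNK]", "[PAD]", "[MASK]"]}')
special_tokens_map = json.loads(raw)

# The five named roles, in the order they appear in the file.
named_roles = [v for k, v in special_tokens_map.items()
               if k != "additional_special_tokens"]

# All special token strings a tokenizer would treat as atomic units.
all_special_tokens = named_roles + special_tokens_map["additional_special_tokens"]

print(all_special_tokens)
# Deduplicated: "</s>" serves as both bos_token and eos_token.
unique_special_tokens = sorted(set(all_special_tokens))
print(unique_special_tokens)
```

In the Hugging Face ecosystem this file is read when a tokenizer is loaded (e.g. via `AutoTokenizer.from_pretrained`), which exposes the same strings through attributes like `tokenizer.bos_token`; the stdlib-only version above just shows what data the file carries.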