roberta-fakeddit / tokenizer_config.json
{
  "do_lower_case": true,
  "max_len": 512,
  "bos_token": "<s>",
  "eos_token": "</s>",
  "unk_token": "<unk>",
  "sep_token": "</s>",
  "pad_token": "<pad>",
  "cls_token": "<s>",
  "mask_token": "<mask>"
}
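As a minimal sketch of how this config can be inspected, the snippet below parses the JSON with the standard library and checks the RoBERTa-style special-token layout it encodes: the separator token doubles as the end-of-sequence token (`</s>`), and the classifier token doubles as the beginning-of-sequence token (`<s>`). The inlined `CONFIG` string simply mirrors the file above; in practice the `transformers` library reads this file automatically when loading the tokenizer.

```python
import json

# Contents of tokenizer_config.json, inlined for illustration.
CONFIG = """
{"do_lower_case": true, "max_len": 512, "bos_token": "<s>",
 "eos_token": "</s>", "unk_token": "<unk>", "sep_token": "</s>",
 "pad_token": "<pad>", "cls_token": "<s>", "mask_token": "<mask>"}
"""

config = json.loads(CONFIG)

# RoBERTa reuses </s> as both separator and end-of-sequence,
# and <s> as both classifier and beginning-of-sequence token.
assert config["sep_token"] == config["eos_token"] == "</s>"
assert config["cls_token"] == config["bos_token"] == "<s>"

# Sequences longer than max_len tokens must be truncated.
print(config["max_len"])
```

Note that `max_len` here caps model input length (512 tokens, matching RoBERTa's position embeddings), not character length of the raw text.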