Heavy_roformer / tokenizer_config.json
{
  "do_lower_case": false,
  "do_basic_tokenize": true,
  "never_split": null,
  "unk_token": "?",
  "sep_token": "|",
  "pad_token": "$",
  "cls_token": "*",
  "mask_token": ".",
  "tokenize_chinese_chars": false,
  "strip_accents": null,
  "model_max_len": 128.0,
  "padding_side": "right",
  "tokenizer_class": "BertTokenizer"
}
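To illustrate what this config encodes, here is a minimal sketch (plain Python, standard library only) that parses the file and pulls out its special-token mapping. Note two details worth checking before relying on this config: the special tokens are remapped to single punctuation characters (`?`, `|`, `$`, `*`, `.`) instead of the usual `[UNK]`/`[SEP]`/`[PAD]`/`[CLS]`/`[MASK]` strings, and the key is spelled `model_max_len` rather than the `model_max_length` key that `transformers` tokenizers normally read, so the 128-token limit may be silently ignored when the tokenizer is loaded.

```python
import json

# Raw contents of tokenizer_config.json as shipped in the repo.
raw = """{"do_lower_case": false, "do_basic_tokenize": true, "never_split": null,
"unk_token": "?", "sep_token": "|", "pad_token": "$", "cls_token": "*",
"mask_token": ".", "tokenize_chinese_chars": false, "strip_accents": null,
"model_max_len": 128.0, "padding_side": "right",
"tokenizer_class": "BertTokenizer"}"""

config = json.loads(raw)

# Collect the special-token remapping: every key ending in "_token".
special_tokens = {k: v for k, v in config.items() if k.endswith("_token")}
print(special_tokens)

# "model_max_len" is not the standard "model_max_length" key, so the
# intended limit (128) may need to be passed explicitly when loading.
print(int(config["model_max_len"]))
```

If the limit does get dropped on load, one workaround (hypothetical call, assuming a standard `transformers` setup) is to pass it explicitly, e.g. `BertTokenizer.from_pretrained(path, model_max_length=128)`.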