SMILES_tokenizer / tokenizer_config.json
Commit 231d457 (nepp1d0): add tokenizer
{
  "model_max_length": 512,
  "unk_token": "[UNK]",
  "pad_token": "[PAD]",
  "cls_token": "[CLS]",
  "sep_token": "[SEP]",
  "mask_token": "[MASK]",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
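A `tokenizer_config.json` like this is read by the `transformers` library when a tokenizer is loaded: `tokenizer_class` tells it to instantiate a `PreTrainedTokenizerFast`, `model_max_length` caps input sequence length, and the `*_token` entries name the special tokens. As a minimal sketch, the snippet below parses the config shown above and pulls out those fields (the config string is copied verbatim from this file; no network access or model repo is assumed):

```python
import json

# The tokenizer_config.json contents shown above, verbatim.
raw = (
    '{"model_max_length": 512, "unk_token": "[UNK]", "pad_token": "[PAD]", '
    '"cls_token": "[CLS]", "sep_token": "[SEP]", "mask_token": "[MASK]", '
    '"tokenizer_class": "PreTrainedTokenizerFast"}'
)
config = json.loads(raw)

# The *_token entries are the special tokens the fast tokenizer maps
# to reserved vocabulary ids; everything else is loading metadata.
special_tokens = {k: v for k, v in config.items() if k.endswith("_token")}

print(config["tokenizer_class"], config["model_max_length"])
print(special_tokens)
```

In practice one would not parse this file by hand: `AutoTokenizer.from_pretrained(...)` with the repo id (presumably `nepp1d0/SMILES_tokenizer`, judging by the path above — an assumption) downloads and applies this config automatically.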