calculator_model_test / tokenizer_config.json
dec0dedd
Training in progress, step 240
569d1e7 verified
{
"backend": "tokenizers",
"cls_token": "[CLS]",
"eos_token": "[EOS]",
"model_max_length": 1000000000000000019884624838656,
"pad_token": "[PAD]",
"tokenizer_class": "TokenizersBackend"
}
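As a minimal sketch of what this config encodes: the file declares the special tokens (`[CLS]`, `[EOS]`, `[PAD]`), the backend (`tokenizers`), and a `model_max_length` sentinel so large it effectively means "no length limit". The snippet below parses the JSON with only the standard library; in practice this file is consumed for you when loading the tokenizer (for example via `AutoTokenizer.from_pretrained` in the `transformers` library), so reading it by hand is purely illustrative.

```python
import json

# The tokenizer_config.json above, reproduced inline for illustration.
config_text = """
{
  "backend": "tokenizers",
  "cls_token": "[CLS]",
  "eos_token": "[EOS]",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "TokenizersBackend"
}
"""

config = json.loads(config_text)

# Special tokens used for padding and sequence boundaries.
print(config["pad_token"], config["cls_token"], config["eos_token"])

# The huge model_max_length acts as an "unbounded" sentinel rather
# than a real context-length limit.
print(config["model_max_length"] > 10**30)
```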