calculator_model_test / tokenizer_config.json
Training in progress, step 240
bf2f2e0 verified
{
  "backend": "tokenizers",
  "cls_token": "[CLS]",
  "eos_token": "[EOS]",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "TokenizersBackend"
}
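For reference, the fields above can be inspected with only the standard library. This is a minimal sketch that inlines the config shown here rather than fetching it from the repo; note that the huge `model_max_length` is the `transformers` sentinel `int(1e30)` (the float rounding of 10^30), meaning "effectively unbounded":

```python
import json

# tokenizer_config.json as shown above, inlined for illustration
CONFIG_TEXT = """
{
  "backend": "tokenizers",
  "cls_token": "[CLS]",
  "eos_token": "[EOS]",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "TokenizersBackend"
}
"""

config = json.loads(CONFIG_TEXT)

# Python's json module parses arbitrarily large integers exactly,
# so the sentinel survives the round trip unchanged.
print(config["tokenizer_class"])
print(config["model_max_length"] == int(1e30))
```

When a real limit is needed (e.g. for truncation), downstream code typically overrides `model_max_length` rather than relying on this sentinel.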