Sam-1-large / tokenizer_config.json
{
  "tokenizer_class": "GPT2Tokenizer",
  "model_max_length": 1024,
  "pad_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "bos_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>",
  "additional_special_tokens": [
    "<think>",
    "</think>"
  ]
}
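
A minimal sketch of how a config like this is typically consumed with the transformers library. The Hub repo id "Keeby-smilyai/Sam-1-large" is inferred from the page header and may not be the exact path; the sample text is likewise illustrative.

from transformers import AutoTokenizer

# Repo id inferred from the page header; treat as an assumption.
tokenizer = AutoTokenizer.from_pretrained("Keeby-smilyai/Sam-1-large")

# "<think>" and "</think>" are registered as additional special tokens,
# so each encodes to a single token id instead of being split into
# GPT-2 BPE subwords.
ids = tokenizer("<think>reasoning goes here</think>").input_ids
print(tokenizer.convert_ids_to_tokens(ids))

# pad/eos/bos/unk all map to "<|endoftext|>", following GPT-2 convention.
print(tokenizer.pad_token, tokenizer.eos_token)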