CosmicFish-300M / tokenizer_config.json
{
"tokenizer_class": "GPT2Tokenizer",
"vocab_size": 50257,
"model_max_length": 2048,
"bos_token": "<|endoftext|>",
"eos_token": "<|endoftext|>",
"unk_token": "<|endoftext|>",
"pad_token": "<|endoftext|>",
"add_prefix_space": false,
"do_lower_case": false
}
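
This config declares a standard GPT-2 BPE tokenizer (50,257-token vocabulary, 2048-token maximum sequence length) and reuses the single "<|endoftext|>" token for all special roles: beginning-of-sequence, end-of-sequence, unknown, and padding. Below is a minimal sketch of loading and using it with the Hugging Face transformers library; the repo id "akkiisfrommars/CosmicFish-300M" is an assumption inferred from the page header, not confirmed by the file itself.

# Minimal sketch of loading this tokenizer with transformers.
# Assumption: the repo id is "akkiisfrommars/CosmicFish-300M" (inferred from the header).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("akkiisfrommars/CosmicFish-300M")

# Per the config above, every special token is the GPT-2 "<|endoftext|>" token,
# so padding and end-of-sequence share the same id.
assert tokenizer.eos_token == "<|endoftext|>"
assert tokenizer.pad_token == tokenizer.eos_token

# Encode and decode a sample string; model_max_length (2048) caps
# the sequence length when truncation=True is passed.
ids = tokenizer("Hello, world!", truncation=True)["input_ids"]
print(ids)
print(tokenizer.decode(ids))

Reusing "<|endoftext|>" as the pad token is a common choice for GPT-2-style models, which were trained without a dedicated padding token; when batching, the attention mask (rather than a distinct pad id) tells the model which positions to ignore.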