CosmicFish-90M / vocab_info.json
{
  "note": "This model uses GPT-2 tokenizer. Please use: tokenizer = GPT2Tokenizer.from_pretrained('gpt2')",
  "vocab_size": 50257,
  "encoding": "gpt2"
}