morbid0.0.5 / tokenizer_config.json
Morbid.AI v0.0.5 - Exam-augmented actuarial intelligence with PDF Q/A
bdd62a1 verified
{
  "model_max_length": 2048,
  "tokenizer_class": "LlamaTokenizer",
  "use_fast": false,
  "special_tokens": {
    "bos_token": "<s>",
    "eos_token": "</s>",
    "unk_token": "<unk>",
    "pad_token": "<pad>"
  }
}
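As a sanity check, the configuration can be parsed with Python's standard `json` module. This is a minimal standalone sketch that uses only the file content shown above, not the hosted repo, so no download or `transformers` install is required:

```python
import json

# tokenizer_config.json content, verbatim from the file above.
CONFIG_TEXT = """\
{
  "model_max_length": 2048,
  "tokenizer_class": "LlamaTokenizer",
  "use_fast": false,
  "special_tokens": {
    "bos_token": "<s>",
    "eos_token": "</s>",
    "unk_token": "<unk>",
    "pad_token": "<pad>"
  }
}
"""

config = json.loads(CONFIG_TEXT)

# Maximum sequence length the tokenizer will accept before truncation.
print(config["model_max_length"])             # → 2048
# Slow (SentencePiece-based) tokenizer, since "use_fast" is false.
print(config["tokenizer_class"])              # → LlamaTokenizer
# Special tokens are nested under "special_tokens" in this particular file.
print(config["special_tokens"]["eos_token"])  # → </s>
```

Note that when loading through `transformers.AutoTokenizer`, the library reads these fields itself; the direct parse above is only for inspecting the file.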