Inelly4-Blaze / generation_config.json
{
  "bos_token_id": 151643,
  "eos_token_id": 151643,
  "max_length": 32768,
  "max_new_tokens": 2048,
  "pad_token_id": 151654,
  "transformers_version": "5.5.0"
}
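As a sketch of how these fields fit together: the file sets the beginning-of-sequence and end-of-sequence tokens to the same id (151643), a distinct padding token (151654), a hard cap of 32768 tokens on total sequence length, and a default of 2048 newly generated tokens per call. In a transformers workflow this file is normally consumed via `GenerationConfig.from_pretrained(...)`; the snippet below instead parses the JSON directly with the standard library, so it runs without the model repo present. The inlined `config_text` is a verbatim copy of the file above.

```python
import json

# Verbatim contents of the generation_config.json shown above.
config_text = """
{
  "bos_token_id": 151643,
  "eos_token_id": 151643,
  "max_length": 32768,
  "max_new_tokens": 2048,
  "pad_token_id": 151654,
  "transformers_version": "5.5.0"
}
"""

config = json.loads(config_text)

# BOS and EOS share one token id; PAD is separate.
assert config["bos_token_id"] == config["eos_token_id"] == 151643
assert config["pad_token_id"] != config["eos_token_id"]

# max_new_tokens must fit inside the overall max_length budget.
assert config["max_new_tokens"] <= config["max_length"]

print(config["max_new_tokens"])  # -> 2048
```

Note that `max_length` bounds the prompt plus the generated continuation, while `max_new_tokens` bounds only the continuation; when both are set, transformers gives precedence to `max_new_tokens`.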