forestwhy-9B / generation_config.json
{
  "_from_model_config": true,
  "eos_token_id": 248044,
  "transformers_version": "5.2.0",
  "use_cache": true
}
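As a minimal sketch of how these fields are consumed, the snippet below parses the JSON above with the standard library and reads out the two generation-relevant keys (`eos_token_id`, the token that stops generation, and `use_cache`, which enables the KV cache). In practice, `transformers` would load this file automatically via `GenerationConfig.from_pretrained`; the inline string here is just the file content reproduced for a self-contained example.

```python
import json

# Content of generation_config.json as uploaded (reproduced from the page above).
raw = """
{
  "_from_model_config": true,
  "eos_token_id": 248044,
  "transformers_version": "5.2.0",
  "use_cache": true
}
"""

config = json.loads(raw)

# eos_token_id marks the end-of-sequence token that halts generation;
# use_cache turns on key/value caching for faster autoregressive decoding.
print(config["eos_token_id"])  # 248044
print(config["use_cache"])     # True
```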