bbbbbbbbbb / generation_config.json (Qwen2ForCausalLM)
{
  "eos_token_id": [
    151643
  ],
  "max_length": 32768,
  "max_new_tokens": 2048,
  "pad_token_id": 151654,
  "transformers_version": "4.56.1"
}
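For context, a minimal sketch of how a decoding loop might consume these fields. This is illustrative only: the `should_stop` helper and the inlined copy of the config are not part of the file or of the transformers library, and a real generation loop handles `max_length` (prompt plus new tokens) and padding as well.

```python
import json

# The generation_config.json above, inlined for a self-contained example.
config_text = """
{
  "eos_token_id": [151643],
  "max_length": 32768,
  "max_new_tokens": 2048,
  "pad_token_id": 151654,
  "transformers_version": "4.56.1"
}
"""
config = json.loads(config_text)

# eos_token_id may be a single int or a list of ints; normalize to a set
# so the stop check works either way.
raw_eos = config["eos_token_id"]
eos_ids = set(raw_eos) if isinstance(raw_eos, list) else {raw_eos}

def should_stop(token_id: int, num_generated: int) -> bool:
    """Stop when an EOS token is produced or max_new_tokens is reached."""
    return token_id in eos_ids or num_generated >= config["max_new_tokens"]
```

With this config, generation halts as soon as token 151643 appears, or after 2048 new tokens, whichever comes first; `pad_token_id` (151654) is used only to pad batched sequences, never as a stop signal.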