FocusUI-7B / generation_config.json
{
  "attn_implementation": "flash_attention_2",
  "do_sample": true,
  "eos_token_id": [
    151666
  ],
  "pad_token_id": 151643,
  "repetition_penalty": 1.05,
  "temperature": 1e-06,
  "transformers_version": "4.57.0"
}
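A quick stdlib-only sanity check of this config: the sketch below parses the JSON above and highlights that a temperature of 1e-06 makes sampling effectively greedy even though `do_sample` is true (the config text is inlined here for illustration; nothing beyond Python's `json` module is assumed).

```python
import json

# Contents of generation_config.json, inlined verbatim for this sketch.
config_text = """
{
  "attn_implementation": "flash_attention_2",
  "do_sample": true,
  "eos_token_id": [151666],
  "pad_token_id": 151643,
  "repetition_penalty": 1.05,
  "temperature": 1e-06,
  "transformers_version": "4.57.0"
}
"""

cfg = json.loads(config_text)

# do_sample is on, but a near-zero temperature sharpens the softmax so much
# that generation is effectively deterministic (greedy).
effectively_greedy = cfg["do_sample"] and cfg["temperature"] < 1e-3
print(effectively_greedy)  # → True
print(cfg["eos_token_id"])  # → [151666]
```

In practice, a library such as `transformers` reads these fields at generation time; the single-element `eos_token_id` list means decoding stops only on token 151666, while `pad_token_id` 151643 is used for batch padding.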