MINT-CoT-7B / generation_config.json
{
  "attn_implementation": "flash_attention_2",
  "bos_token_id": 151643,
  "do_sample": true,
  "eos_token_id": [
    151645,
    151643
  ],
  "pad_token_id": 151643,
  "transformers_version": "4.49.0.dev0",
  "use_cache": null
}
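
For context, a minimal sketch (not part of the repository) of how transformers consumes this file: the fields above are loaded into a GenerationConfig and applied during model.generate(). The repository id "xy06/MINT-CoT-7B" is assumed from the page header and may differ; the token ids are consistent with Qwen2-style tokenizers (151643 = <|endoftext|>, 151645 = <|im_end|>).

# Minimal sketch, assuming the repository id "xy06/MINT-CoT-7B".
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

repo_id = "xy06/MINT-CoT-7B"  # assumed repository id

# Reads generation_config.json from the repo; do_sample, eos_token_id, and
# pad_token_id above become attributes of this object.
gen_config = GenerationConfig.from_pretrained(repo_id)
print(gen_config.do_sample, gen_config.eos_token_id)  # True [151645, 151643]

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Sampling-based decoding, stopping on either eos token id listed in the config.
inputs = tokenizer("Solve: 12 * 7 = ?", return_tensors="pt")
outputs = model.generate(**inputs, generation_config=gen_config, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))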