Olmo-3-7B-Think / generation_config.json
Commit 5d89684: Define "do_sample" explicitly in generation_config.json (#6)
{
  "_from_model_config": true,
  "eos_token_id": [
    100265,
    100257
  ],
  "transformers_version": "4.54.0",
  "temperature": 0.6,
  "top_p": 0.95,
  "max_new_tokens": 32768,
  "do_sample": true
}
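As a rough illustration of what these fields control: with `do_sample: true`, decoding draws each token from the model's distribution after temperature scaling (0.6 sharpens the distribution) and nucleus/top-p filtering (keep the smallest set of tokens covering 0.95 probability), stopping at either EOS token id or after 32768 new tokens. The sketch below is a minimal pure-Python reimplementation of one sampling step, not the actual `transformers` decoding code; the `sample_token` helper and the example logits are illustrative.

```python
import json
import math
import random

# The config file above, parsed as plain JSON.
config = json.loads("""{
  "_from_model_config": true,
  "eos_token_id": [100265, 100257],
  "transformers_version": "4.54.0",
  "temperature": 0.6,
  "top_p": 0.95,
  "max_new_tokens": 32768,
  "do_sample": true
}""")

def sample_token(logits, temperature, top_p, rng):
    """Illustrative temperature + top-p sampling step (not the transformers internals)."""
    # Temperature scaling: dividing logits by T < 1 sharpens the distribution.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]
    # Nucleus (top-p) filtering: keep the smallest set of highest-probability
    # tokens whose cumulative probability reaches top_p.
    probs.sort(key=lambda t: t[1], reverse=True)
    kept, cum = [], 0.0
    for i, p in probs:
        kept.append((i, p))
        cum += p
        if cum >= top_p:
            break
    # Renormalize the kept set and draw one token from it.
    z = sum(p for _, p in kept)
    r = rng.random() * z
    for i, p in kept:
        r -= p
        if r <= 0:
            return i
    return kept[-1][0]
```

With a sharply peaked distribution, temperature 0.6 and top-p 0.95 keep only the dominant token, so sampling behaves almost greedily; with flatter logits, several candidates survive the nucleus cut and the choice is stochastic.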