File size: 122 Bytes

{
  "do_sample": true,
  "max_new_tokens": 256,
  "temperature": 0.7,
  "top_p": 0.9,
  "transformers_version": "4.45.0"
}
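The three sampling fields above interact: `do_sample: true` enables stochastic decoding, `temperature` rescales the logits before softmax, and `top_p` restricts sampling to the smallest set of tokens whose cumulative probability reaches 0.9 (nucleus sampling). The sketch below illustrates that combination on a plain list of logits; it is a simplified stand-in for how a decoder applies these settings, not the transformers library's implementation, and `sample_next_token` is a hypothetical helper name.

```python
import math
import random

def sample_next_token(logits, temperature=0.7, top_p=0.9, rng=None):
    """Pick one token id from raw logits using temperature scaling plus
    nucleus (top-p) filtering, mirroring the config fields above.
    Illustrative sketch only, not the transformers implementation."""
    rng = rng or random.Random()
    # Temperature-scale the logits, then take a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep the smallest set of tokens whose cumulative probability >= top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the kept nucleus and draw one sample.
    mass = sum(probs[i] for i in kept)
    r = rng.random() * mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]
```

With a dominant logit the nucleus collapses to a single token, so the draw is effectively deterministic; with flat logits any of the tokens can be returned, which is the behavior `do_sample: true` buys over greedy decoding.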