---
license: apache-2.0
base_model:
- allenai/Olmo-3-1125-32B
- allenai/Olmo-3.1-32B-Think
- allenai/Olmo-3.1-32B-Instruct
language:
- en
library_name: transformers
---
# Model Card for Olmo-3.1-32B-ThinkInstruct

<img alt="Logo for Olmo 3.1 32B Think/Instruct model" src="olmo-instruct.png" width="307px" style="margin-left: auto; margin-right: auto; display: block;">

## Model Details
AllenAI released a pretty good pair of Think and Instruct models built on two very different branches off of Olmo 3. I merged them using my usual bespoke Fourier-interpolation process, but kept the last two layers and the lm_head from Think; otherwise the merged model would forget to close its think tag.
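The actual Fourier-interpolation process is bespoke and not shown here; the sketch below illustrates only the general shape of such a merge, using plain linear interpolation as a stand-in and hypothetical parameter names. The layer-index prefixes (`model.layers.62.`, `model.layers.63.`) and `lm_head.` key are assumptions about the checkpoint layout, not verified against the real state dicts.

```python
import torch

def merge_state_dicts(think, instruct, alpha=0.5,
                      keep_from_think=("model.layers.62.", "model.layers.63.", "lm_head.")):
    """Blend two state dicts, keeping selected tensors verbatim from `think`.

    Stand-in sketch: real merge uses a bespoke Fourier interpolation,
    here replaced by torch.lerp for illustration.
    """
    merged = {}
    for key, tensor in think.items():
        if any(key.startswith(prefix) for prefix in keep_from_think):
            # Keep Think's final layers and lm_head untouched so the
            # merged model still reliably closes its think tag.
            merged[key] = tensor.clone()
        else:
            # Elementwise interpolation between the two checkpoints.
            merged[key] = torch.lerp(tensor, instruct[key], alpha)
    return merged

# Tiny dummy state dicts stand in for the real 32B checkpoints.
think_sd = {"model.layers.0.w": torch.zeros(2), "lm_head.weight": torch.ones(2)}
instruct_sd = {"model.layers.0.w": torch.ones(2), "lm_head.weight": torch.zeros(2)}
merged_sd = merge_state_dicts(think_sd, instruct_sd, alpha=0.5)
```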
The result is completely coherent, and it tends to think a lot less than the standard Think model.
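Since completions put the reasoning inside a think tag before the final answer, a minimal helper to separate the two can be handy. This is a sketch assuming the standard `<think>…</think>` delimiters; the function name is hypothetical.

```python
def split_think(text, open_tag="<think>", close_tag="</think>"):
    """Split a completion into (reasoning, answer) on the think tags."""
    start = text.find(open_tag)
    end = text.find(close_tag)
    if start == -1 or end == -1:
        # No (or unclosed) think block: treat everything as the answer.
        return "", text.strip()
    reasoning = text[start + len(open_tag):end].strip()
    answer = text[end + len(close_tag):].strip()
    return reasoning, answer
```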
## License

This model is licensed under Apache 2.0.