ibm-granite/granite-4.0-micro
Text Generation • Updated • 317k • 271
4b is the new 7b. i originally meant to make separate sub-3b and sub-4b collections, but in the end i decided to combine them into a single collection of text-to-text LLMs
Note traditional transformer model
Note h stands for hybrid: uses both transformer and Mamba layers
Note biggest glow-up compared to SmolLM2
Note just an updated version of phi-4-mini-reasoning. phi-4.5-mini when???
Note better than 2511
Note literally the end-game 4b parameter model.
Note literally the end-game 4b parameter model, now with reasoning!
Note wow! a tiny MoE!