Quantized LFM2.5 (collection)
Verified models. Compatible with vLLM.
This is LFM2.5-1.2B-Thinking quantized to MXFP4 with llm-compressor. The model is compatible with vLLM (tested with v0.14.0; MXFP4 support is still experimental) and was verified on an NVIDIA L4 GPU in Google Colab.
Base model
LiquidAI/LFM2.5-1.2B-Base
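As a sketch, a checkpoint like this can be served with vLLM's OpenAI-compatible server. The repository id below is a placeholder (the exact name of the quantized repo is not stated here), and the flags shown are standard vLLM options, not settings confirmed by this card:

```shell
# Install a vLLM version known to work with this quantization
# (v0.14.0 was the version tested for this model).
pip install "vllm>=0.14.0"

# Serve the quantized checkpoint. Replace the repo id below with the
# actual MXFP4 model from this collection (placeholder shown).
vllm serve <your-org>/LFM2.5-1.2B-Thinking-MXFP4
```

Once the server is up, it exposes an OpenAI-compatible endpoint on port 8000 by default, so any OpenAI client can be pointed at `http://localhost:8000/v1` to query the model.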