Quantized LFM2.5 (collection)
Verified models. Compatible with vLLM.
This is LFM2.5-1.2B-Thinking quantized to FP8 with llm-compressor. The model is compatible with vLLM (tested with vLLM v0.14.0 on an L4 GPU in Google Colab).
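A minimal sketch of how such an FP8 checkpoint can be produced with llm-compressor. The card does not state the exact recipe, so the `FP8_DYNAMIC` scheme, the source repo id, and the output directory below are assumptions:

```python
# Sketch: FP8 quantization with llm-compressor (recipe details are assumptions).
from transformers import AutoModelForCausalLM, AutoTokenizer
from llmcompressor import oneshot
from llmcompressor.modifiers.quantization import QuantizationModifier

MODEL_ID = "LiquidAI/LFM2.5-1.2B-Thinking"  # assumed source repo id
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Dynamic FP8 needs no calibration data; quantize Linear layers, skip lm_head.
recipe = QuantizationModifier(
    targets="Linear", scheme="FP8_DYNAMIC", ignore=["lm_head"]
)
oneshot(model=model, recipe=recipe)

SAVE_DIR = "LFM2.5-1.2B-Thinking-FP8"  # hypothetical output directory
model.save_pretrained(SAVE_DIR, save_compressed=True)
tokenizer.save_pretrained(SAVE_DIR)
```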
Base model: LiquidAI/LFM2.5-1.2B-Base
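A short vLLM usage sketch for the quantized checkpoint; the model path below is a placeholder for this repo's id, and the sampling parameters are illustrative:

```python
# Sketch: loading the FP8 checkpoint with vLLM (v0.14.0 tested per the card).
from vllm import LLM, SamplingParams

llm = LLM(model="LFM2.5-1.2B-Thinking-FP8")  # placeholder: local dir or HF repo id
params = SamplingParams(temperature=0.7, max_tokens=256)

outputs = llm.generate(["Explain FP8 quantization in one sentence."], params)
print(outputs[0].outputs[0].text)
```

The same checkpoint can also be served over an OpenAI-compatible API with `vllm serve <model>`.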