---
license: apache-2.0
base_model:
  - LiquidAI/LFM2.5-1.2B-Instruct
tags:
  - llm-compressor
---

This is LiquidAI/LFM2.5-1.2B-Instruct quantized to FP8 with llm-compressor. The model is compatible with vLLM (tested with v0.13.0 on an RTX 4090).
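A minimal serving sketch with vLLM's OpenAI-compatible server. The model identifier below is a placeholder for this repository's Hugging Face ID (substitute the actual repo name or a local path); the port and sampling settings are illustrative defaults, not values from this card.

```shell
# Serve the FP8 checkpoint with vLLM (v0.13.0 was tested by the author).
# Replace <this-repo-id> with the Hugging Face ID of this model.
vllm serve <this-repo-id> --port 8000

# Query the OpenAI-compatible chat endpoint once the server is up.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "<this-repo-id>",
        "messages": [{"role": "user", "content": "Hello!"}],
        "max_tokens": 64
      }'
```

An FP8 checkpoint of a 1.2B model fits comfortably in the 24 GB of an RTX 4090; vLLM reads the llm-compressor quantization config from the checkpoint automatically, so no extra quantization flags are needed.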

## How to Support My Work

"buy me a kofi"

Subscribe to The Kaitchup. This helps me a lot to continue quantizing and evaluating models for free.