---
license: apache-2.0
base_model:
  - LiquidAI/LFM2.5-1.2B-JP
tags:
  - llm-compressor
---

This is LiquidAI/LFM2.5-1.2B-JP quantized to FP8 with llm-compressor. The model is compatible with vLLM (tested with v0.13.0 on an RTX 4090).
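As a minimal sketch, the checkpoint can be served with vLLM's OpenAI-compatible server. The model ID below is a placeholder for this repository's ID, and `--max-model-len` is an illustrative setting, not a requirement:

```shell
# Serve the FP8 checkpoint with vLLM (v0.13.0 was tested per the note above).
# vLLM reads the quantization config saved by llm-compressor from the
# checkpoint itself, so no extra quantization flag is needed.
vllm serve <your-username>/LFM2.5-1.2B-JP-FP8 \
  --max-model-len 4096
```

Once the server is up, it exposes the standard OpenAI-compatible endpoints (e.g. `/v1/chat/completions`) on port 8000 by default.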

## How to Support My Work

"buy me a kofi" Subscribe to The Kaitchup. This helps me a lot to continue quantizing and evaluating models for free.