---
license: mit
base_model:
- zai-org/GLM-4.7-Flash
tags:
- llm-compressor
---
This is zai-org/GLM-4.7-Flash quantized to FP8 with llm-compressor. The model is compatible with vLLM (tested with v0.14.0) and was run on an L4 GPU (Google Colab).
- Developed by: The Kaitchup
- License: mit
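As a minimal sketch, an FP8 checkpoint like this one can be served offline with vLLM's Python API. The repository id below is a placeholder, assumed to match this card; replace it with the model's actual Hub id. Running this requires a GPU with FP8-capable kernels (e.g. the L4 mentioned above) and the model weights downloaded locally or from the Hub.

```python
from vllm import LLM, SamplingParams

# Placeholder Hub id -- substitute the real repository name for this card.
llm = LLM(model="The-Kaitchup/GLM-4.7-Flash-FP8", max_model_len=4096)

# Greedy-ish sampling, capped at 128 new tokens.
params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(["Explain FP8 quantization in one sentence."], params)
print(outputs[0].outputs[0].text)
```

vLLM detects the llm-compressor quantization format from the checkpoint's config, so no extra quantization flag is needed at load time.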