---
license: apache-2.0
base_model:
- LLM360/K2-V2-Instruct
tags:
- llmcompressor
---
This is LLM360/K2-V2-Instruct quantized with LLM Compressor using the recipe in the `recipe.yaml` file. The model is compatible with vLLM (tested with v0.12.0 on an RTX Pro 6000).
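Since the checkpoint is vLLM-compatible, it can be served like any other Hugging Face model. A minimal sketch (the `<this-repo-id>` placeholder stands for this repository's Hugging Face ID, which is not stated here; `--max-model-len` is an optional assumption to bound memory use):

```
# Serve the quantized checkpoint with vLLM (v0.12.0 was the version tested).
# Replace <this-repo-id> with this repository's Hugging Face ID.
vllm serve <this-repo-id> --max-model-len 4096
```

This starts vLLM's OpenAI-compatible server, so any OpenAI-style client can then send chat completions to it.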
- Developed by: The Kaitchup
- License: Apache 2.0
## How to Support My Work
Subscribe to The Kaitchup. This helps me a lot to continue quantizing and evaluating models for free. Or you can buy me a coffee on Ko-fi.