Converted to bf16, then quantized to q8_0.
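A minimal sketch of how such a conversion is typically done with llama.cpp's GGUF tooling (assuming a local llama.cpp checkout; the directory name `original-model/` and the output filenames are placeholders):

```shell
# Sketch, assuming llama.cpp is cloned and built locally.
# Step 1: convert the original Hugging Face weights to a bf16 GGUF file.
python convert_hf_to_gguf.py original-model/ \
    --outtype bf16 \
    --outfile model-bf16.gguf

# Step 2: quantize the bf16 GGUF to q8_0 (8-bit).
./llama-quantize model-bf16.gguf model-q8_0.gguf Q8_0
```

Going through bf16 first preserves the full dynamic range of the original weights before the single lossy q8_0 quantization step.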
Downloads last month: 8
Hardware compatibility: 8-bit