This model was quantized with llm-compressor and can be served with vLLM or SGLang.
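A minimal sketch of loading the checkpoint for offline inference with vLLM's `LLM` API. This assumes a CUDA GPU with enough memory for the 24B W8A8 (int8 weight and activation) checkpoint; the prompt and sampling parameters are illustrative only.

```python
# Sketch: offline inference on the W8A8 checkpoint with vLLM.
# Assumes a CUDA GPU large enough for the quantized 24B model;
# vLLM detects the compressed-tensors quantization from the model config.
from vllm import LLM, SamplingParams

llm = LLM(model="jiangchengchengNLP/Mistral-Small-3.2-24B-Instruct-W8A8")
params = SamplingParams(temperature=0.7, max_tokens=256)

outputs = llm.generate(["Explain W8A8 quantization in one sentence."], params)
print(outputs[0].outputs[0].text)
```

The same checkpoint can also be exposed over an OpenAI-compatible endpoint with `vllm serve jiangchengchengNLP/Mistral-Small-3.2-24B-Instruct-W8A8`.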
Model tree for jiangchengchengNLP/Mistral-Small-3.2-24B-Instruct-W8A8
- Base model: mistralai/Mistral-Small-3.1-24B-Base-2503