Original model: Devstral-Small-2-24B-Instruct-2512 by mistralai
Available ExLlamaV3 (release v0.0.17) quantizations:
| Type | Size | CLI |
|---|---|---|
| H8-4.0BPW | 14.05 GB | Copy-paste the line / Download the batch file |
| H8-6.0BPW | 19.61 GB | Copy-paste the line / Download the batch file |
| H8-8.0BPW | 25.17 GB | Copy-paste the line / Download the batch file |
Requirements: a Python installation with the `huggingface-hub` module to use the CLI.
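The copy-paste lines themselves are not reproduced in this card, but as a sketch, a quantization can be fetched with the `huggingface-cli` tool from `huggingface-hub`. The repository id is taken from the model tree below; the assumption that each BPW variant lives on its own branch named after its type is mine:

```shell
# Install the CLI (any recent huggingface-hub release should work)
pip install -U huggingface-hub

# Download one quantization, e.g. the H8-6.0BPW variant (~19.6 GB).
# Assumption: each variant is stored on a branch named after its BPW type.
huggingface-cli download DeathGodlike/Devstral-Small-2-24B-Instruct-2512_EXL3 \
  --revision H8-6.0BPW \
  --local-dir ./Devstral-Small-2-EXL3-6.0bpw
```

Swap the `--revision` value for `H8-4.0BPW` or `H8-8.0BPW` to pick a different size.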
Licensing: the license for the provided quantized models is inherited from the original model (see the source above).
Model tree for DeathGodlike/Devstral-Small-2-24B-Instruct-2512_EXL3
Base model
mistralai/Mistral-Small-3.1-24B-Base-2503