---
license: apache-2.0
base_model: mistralai/Mistral-Small-3.1-24B-Instruct-2503
base_model_relation: quantized
quantized_by: turboderp
tags:
- exl3
---
EXL3 quants of [Mistral-Small-3.1-24B-Instruct-2503](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503)
- [2.00 bits per weight](https://huggingface.co/turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3/tree/2.0bpw)
- [2.50 bits per weight](https://huggingface.co/turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3/tree/2.5bpw)
- [3.00 bits per weight](https://huggingface.co/turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3/tree/3.0bpw)
- [3.50 bits per weight](https://huggingface.co/turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3/tree/3.5bpw)
- [4.00 bits per weight](https://huggingface.co/turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3/tree/4.0bpw)
- [5.00 bits per weight](https://huggingface.co/turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3/tree/5.0bpw)
- [6.00 bits per weight](https://huggingface.co/turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3/tree/6.0bpw)
- [8.00 bits per weight / H8](https://huggingface.co/turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3/tree/8.0bpw_H8)
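
Each link above points at a separate branch of this repository, so a single quant can be fetched with `huggingface-cli` by passing the branch name as `--revision` (a sketch; the local directory name is an arbitrary choice, and any of the bpw branches listed above can be substituted):

```shell
# Download only the 4.0 bpw quant by selecting its branch as the revision
huggingface-cli download turboderp/Mistral-Small-3.1-24B-Instruct-2503-exl3 \
    --revision 4.0bpw \
    --local-dir Mistral-Small-3.1-24B-Instruct-2503-exl3-4.0bpw
```

Cloning the repository without a branch argument would otherwise pull every quant at once.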