---
base_model:
- ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4
base_model_relation: quantized
pipeline_tag: text-generation
library_name: safetensors
tags:
- exl3
- 4-bit
- 6-bit
- 8-bit
---
# Original model: [Mistral-Small-24B-ArliAI-RPMax-v1.4](https://huggingface.co/ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4) by [ArliAI](https://huggingface.co/ArliAI)
## Available [ExLlamaV3](https://github.com/turboderp-org/exllamav3) 0.0.16 quants
| Type | Size | CLI |
|------|------|-----|
| [H8-4.0BPW](https://huggingface.co/DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3/tree/H8-4.0BPW) | 13.16 GB | [Batch file (run it, or copy the command from it)](https://huggingface.co/DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3/resolve/H8-4.0BPW/Download~ArliAI_Mistral-Small-24B-ArliAI-RPMax-v1.4_H8-4.0BPW_EXL3.bat) |
| [H8-6.0BPW](https://huggingface.co/DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3/tree/H8-6.0BPW) | 18.72 GB | [Batch file (run it, or copy the command from it)](https://huggingface.co/DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3/resolve/H8-6.0BPW/Download~ArliAI_Mistral-Small-24B-ArliAI-RPMax-v1.4_H8-6.0BPW_EXL3.bat) |
| [H8-8.0BPW](https://huggingface.co/DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3/tree/H8-8.0BPW) | 24.27 GB | [Batch file (run it, or copy the command from it)](https://huggingface.co/DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3/resolve/H8-8.0BPW/Download~ArliAI_Mistral-Small-24B-ArliAI-RPMax-v1.4_H8-8.0BPW_EXL3.bat) |
***Requirements: a Python installation with the `huggingface-hub` package to use the CLI downloads.***
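
As a reference, here is a minimal sketch of what such a download command looks like; the branch name (`H8-6.0BPW`) and local directory below are examples, and any of the quant branches in the table works the same way:

```bat
:: Requires a Python installation with huggingface-hub (pip install huggingface-hub)
:: Downloads one quant branch (here H8-6.0BPW) into a local folder
huggingface-cli download DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3 --revision H8-6.0BPW --local-dir Mistral-Small-24B-ArliAI-RPMax-v1.4_H8-6.0BPW_EXL3
```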
### Licensing: the license for these quantized models is derived from the original model (see the source link above).