---
base_model:
- Vortex5/Starry-Shadow-12B
base_model_relation: quantized
pipeline_tag: text-generation
library_name: safetensors
tags:
- exl3
- 4-bit
- 6-bit
- 8-bit
---
# Original model: [Starry-Shadow-12B](https://huggingface.co/Vortex5/Starry-Shadow-12B) by [Vortex5](https://huggingface.co/Vortex5)

## Available [ExLlamaV3](https://github.com/turboderp-org/exllamav3) (release v0.0.18) quantizations
| Type | Size | CLI |
|------|------|-----|
| [H8-4.0BPW](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/tree/H8-4.0BPW) | 7.49 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/resolve/H8-4.0BPW/Download~Vortex5_Starry-Shadow-12B_H8-4.0BPW_EXL3.bat) |
| [H8-6.0BPW](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/tree/H8-6.0BPW) | 10.22 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/resolve/H8-6.0BPW/Download~Vortex5_Starry-Shadow-12B_H8-6.0BPW_EXL3.bat) |
| [H8-8.0BPW](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/tree/H8-8.0BPW) | 12.95 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/resolve/H8-8.0BPW/Download~Vortex5_Starry-Shadow-12B_H8-8.0BPW_EXL3.bat) |
***Requirements: a Python installation with the `huggingface-hub` package is needed to use the CLI.***
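As an alternative to the batch files above, the quantizations can be fetched directly with the `huggingface-cli` tool from `huggingface-hub`. A minimal sketch (the revision name must match one of the branches in the table; the local directory name is an arbitrary choice):

```shell
# Install the Hugging Face Hub CLI (once)
pip install -U huggingface-hub

# Download one quantization branch, e.g. the 6.0 bpw variant,
# into a local folder of your choosing
huggingface-cli download DeathGodlike/Starry-Shadow-12B_EXL3 \
  --revision H8-6.0BPW \
  --local-dir Starry-Shadow-12B_H8-6.0BPW_EXL3
```

Point your ExLlamaV3-compatible loader at the resulting folder to use the model.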
### Licensing: the license for the provided quantized models is inherited from the original model (see the source link above).