---

base_model:
- Vortex5/Starry-Shadow-12B
base_model_relation: quantized
pipeline_tag: text-generation
library_name: safetensors
tags:
- exl3
- 4-bit
- 6-bit
- 8-bit
---


# Original model: [Starry-Shadow-12B](https://huggingface.co/Vortex5/Starry-Shadow-12B) by [Vortex5](https://huggingface.co/Vortex5)

## Available [ExLlamaV3](https://github.com/turboderp-org/exllamav3) (release v0.0.18) quantizations

| Type | Size | CLI |
|------|------|---------|
| [H8-4.0BPW](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/tree/H8-4.0BPW) | 7.49 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/resolve/H8-4.0BPW/Download~Vortex5_Starry-Shadow-12B_H8-4.0BPW_EXL3.bat) |
| [H8-6.0BPW](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/tree/H8-6.0BPW) | 10.22 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/resolve/H8-6.0BPW/Download~Vortex5_Starry-Shadow-12B_H8-6.0BPW_EXL3.bat) |
| [H8-8.0BPW](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/tree/H8-8.0BPW) | 12.95 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/Starry-Shadow-12B_EXL3/resolve/H8-8.0BPW/Download~Vortex5_Starry-Shadow-12B_H8-8.0BPW_EXL3.bat) |

***Requirements: a Python installation with the `huggingface-hub` package to use the CLI.***
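As an alternative to the batch files, the quantizations can also be fetched directly with the `huggingface-cli download` command from the `huggingface-hub` package (a minimal sketch; the `--local-dir` path is an example, adjust it to your setup):

```shell
# Install the CLI if needed
pip install huggingface-hub

# Download one quantization branch, e.g. the 6.0 bpw variant
huggingface-cli download DeathGodlike/Starry-Shadow-12B_EXL3 \
    --revision H8-6.0BPW \
    --local-dir Starry-Shadow-12B_H8-6.0BPW_EXL3
```

Each branch (`H8-4.0BPW`, `H8-6.0BPW`, `H8-8.0BPW`) is a complete, self-contained copy of the model at that bitrate, so download only the one you intend to load.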

### Licensing: The license for the provided quantized models is inherited from the original model (see the source link above).