## LLAMA-3_8B_Unaligned_Alpha is available in the following quantizations:

Censorship level: <b>Low - Medium</b>

- Original: [FP16](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha)
- GGUF: [Static Quants](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_GGUF) | [iMatrix_GGUF](https://huggingface.co/bartowski/LLAMA-3_8B_Unaligned_Alpha-GGUF)
- EXL2: [2.6 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_2.6bpw) | [3.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_3.0bpw) | [3.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_3.5bpw) | [4.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_4.0bpw) | [4.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_4.5bpw) | [5.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_5.0bpw) | [5.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_5.5bpw) | [6.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_6.0bpw) | [6.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_6.5bpw) | [7.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_7.0bpw) | [7.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_7.5bpw) | [8.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_8.0bpw)

### Support