---

<div align="center">
<b style="font-size: 40px;">LLAMA-3_8B_Unaligned_Alpha</b>
</div>

<img src="https://i.imgur.com/Kpk1PgZ.png" alt="LLAMA-3_8B_Unaligned_Alpha_GGUF" style="width: 50%; min-width: 400px; display: block; margin: auto;">

# Current status:

## Intermediate checkpoint of this model:

- (Can still be decent for merges, fairly uncensored): [LLAMA-3_8B_Unaligned_Alpha](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha)
- Roleplay merge example: [LLAMA-3_8B_Unaligned_Alpha_RP_Soup](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_RP_Soup)

# Model instruction template: (Can use either ChatML or Llama-3)
# ChatML
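Since either ChatML or Llama-3 formatting works with this model, a minimal sketch of both prompt layouts may be useful. `build_prompt` is a hypothetical helper (not part of this repo); the special-token strings follow the public ChatML and Llama-3 instruct conventions, so verify them against this model's tokenizer config before relying on them:

```python
def build_prompt(system: str, user: str, template: str = "chatml") -> str:
    """Assemble a single-turn prompt in ChatML or Llama-3 instruct layout.

    Hypothetical helper for illustration; token strings are the commonly
    published conventions, not taken from this repository.
    """
    if template == "chatml":
        return (
            f"<|im_start|>system\n{system}<|im_end|>\n"
            f"<|im_start|>user\n{user}<|im_end|>\n"
            f"<|im_start|>assistant\n"
        )
    if template == "llama3":
        return (
            f"<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
            f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
            f"<|start_header_id|>assistant<|end_header_id|>\n\n"
        )
    raise ValueError(f"unknown template: {template}")
```

Both layouts end with the assistant header open, so generation continues as the assistant turn.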

- FP16: soon...
- EXL2: soon...
- GGUF: soon...

## LLAMA-3_8B_Unaligned_Alpha is available at the following quantizations:

Censorship level: <b>Low - Medium</b>

- Original: [FP16](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha)
- GGUF: [Static Quants](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_GGUF) | [iMatrix_GGUF](https://huggingface.co/bartowski/LLAMA-3_8B_Unaligned_Alpha-GGUF)
- EXL2: [2.6 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_2.6bpw) | [3.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_3.0bpw) | [3.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_3.5bpw) | [4.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_4.0bpw) | [4.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_4.5bpw) | [5.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_5.0bpw) | [5.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_5.5bpw) | [6.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_6.0bpw) | [6.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_6.5bpw) | [7.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_7.0bpw) | [7.5 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_7.5bpw) | [8.0 bpw](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_Alpha_EXL2_8.0bpw)
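When choosing an EXL2 bits-per-weight (or a GGUF quant) from the list above, a rough back-of-envelope estimate is: weight size ≈ parameter count × bpw / 8 bytes. This sketch is an assumption-laden lower bound only; it ignores the KV cache, activations, and runtime overhead, so real VRAM use will be higher:

```python
def approx_weight_gib(n_params: float, bpw: float) -> float:
    """Rough size of the quantized weights alone, in GiB.

    Back-of-envelope only: excludes KV cache, activations, and
    runtime overhead, so actual VRAM use is noticeably higher.
    """
    return n_params * bpw / 8 / 2**30

# e.g. an 8B model at 4.0 bpw needs roughly 3.7 GiB just for the weights
print(f"{approx_weight_gib(8e9, 4.0):.1f} GiB")
```

By the same arithmetic, the 8.0 bpw variant needs about twice the weight memory of the 4.0 bpw one, which is the usual size/quality trade-off when picking a quant.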

### Support
<img src="https://i.imgur.com/0lHHN95.png" alt="GPUs too expensive" style="width: 10%; min-width: 100px; display: block; margin: 0;">