Staticaliza committed on
Commit d925ff5 · verified · 1 Parent(s): e136bf1

Update README.md

Files changed (1): README.md +13 -3
README.md CHANGED
@@ -2,7 +2,17 @@
  license: apache-2.0
  pipeline_tag: text-generation
  ---
- # x: y
-
- :)
- Credits to [ArliAI/QwQ-32B-ArliAI-RpR-v4](https://huggingface.co/ArliAI/QwQ-32B-ArliAI-RpR-v4),
+ # Statica-Reasoning-8B-Distilled-GPTQ-Int4
+
+ <img src="https://huggingface.co/Statical-Workspace/Storage/resolve/main/DepressedGirl.png" alt="icon" height="500" width="500">
+
+ A low-restriction reasoning model for creative roleplay and conversation, based on [ArliAI/QwQ-32B-ArliAI-RpR-v4](https://huggingface.co/ArliAI/QwQ-32B-ArliAI-RpR-v4) and [huihui-ai/DeepSeek-R1-0528-Qwen3-8B-abliterated](https://huggingface.co/huihui-ai/DeepSeek-R1-0528-Qwen3-8B-abliterated).
+
+ This is a distilled model quantized to 4-bit GPTQ (W4A16) that can run on GPU.
+
+ # vLLM Use
+
+ ```python
+
+ ```
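
The `vLLM Use` code block added by this commit is empty. A minimal sketch of what it might contain is below; note that the repository id, the `quantization` choice, and the sampling settings are assumptions for illustration, not values stated in the commit.

```python
# Minimal vLLM usage sketch for a GPTQ (W4A16) checkpoint.
# Assumptions: vLLM is installed with GPU support, and the model is
# published under the hypothetical repository id below (inferred from
# the model name in the README; verify before use).
MODEL_ID = "Staticaliza/Statica-Reasoning-8B-Distilled-GPTQ-Int4"

def main():
    # Import inside the function so the sketch only needs vLLM when run.
    from vllm import LLM, SamplingParams

    # vLLM auto-detects GPTQ quantization from the checkpoint config,
    # but it can also be requested explicitly.
    llm = LLM(model=MODEL_ID, quantization="gptq")
    params = SamplingParams(temperature=0.7, max_tokens=256)

    # Generate a single completion and print the text of the first output.
    outputs = llm.generate(["Hello! Introduce yourself."], params)
    print(outputs[0].outputs[0].text)

if __name__ == "__main__":
    main()
```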