## Model Description

This is **Ex0bit/GLM-4.7-Flash-PRISM**.

**GLM-4.7-Flash-PRISM:** Unrestricted GLM-4.7-Flash Model Access

Access GLM-4.7-Flash-PRISM, an abliterated version of ZAI's efficient 30B-A3B MoE model with over-refusal mechanisms removed.

**What You Get:**

- **30B-A3B MoE Architecture** — Lightweight yet powerful Mixture-of-Experts model with 30 billion total parameters and ~3 billion active per token for fast, efficient inference
- **PRISM (Projected Refusal Isolation via Subspace Modification)** — State-of-the-art abliteration technique that removes over-refusal behaviors while preserving capabilities
- **128K Context Window** — Extended context for complex tasks and large codebases
- **Interleaved & Preserved Thinking** — Multi-turn reasoning that persists across conversations with per-turn thinking control
- **Strong In-Class Benchmarks** — 91.6% AIME 2025, 79.5% τ²-Bench, 59.2% SWE-bench Verified, 75.2% GPQA
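The efficiency claim behind the 30B-A3B architecture can be sanity-checked with simple arithmetic: only the routed experts run per token, so per-token compute is roughly that of a ~3B dense model while total capacity stays at 30B. A minimal sketch:

```python
# Back-of-envelope view of the 30B-A3B MoE efficiency claim.
# Only ~3B of the 30B total parameters are active per forward pass
# (the "A3B" in "30B-A3B"), so per-token compute cost is close to
# that of a 3B dense model.
TOTAL_PARAMS = 30e9   # all experts combined
ACTIVE_PARAMS = 3e9   # parameters routed per token

active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"{active_fraction:.0%} of parameters active per token")  # 10%
```

This is why MoE models of this size can serve noticeably faster than dense models with comparable total parameter counts.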
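Per-turn thinking control is typically exposed through the chat template when serving via an OpenAI-compatible endpoint. The sketch below builds such a request payload; the `chat_template_kwargs` / `enable_thinking` keys follow the convention used by other GLM releases served with vLLM and are an assumption here, not a documented API of this model:

```python
# Hypothetical per-turn thinking control via an OpenAI-compatible request
# payload. The "chat_template_kwargs"/"enable_thinking" keys mirror the
# convention of other GLM releases and are an assumption for this model.

def build_request(messages, enable_thinking=True):
    """Build a chat-completion payload with per-turn thinking control."""
    return {
        "model": "Ex0bit/GLM-4.7-Flash-PRISM",
        "messages": messages,
        # Disabling thinking skips the reasoning trace for this turn only;
        # earlier turns' preserved thinking is unaffected.
        "chat_template_kwargs": {"enable_thinking": enable_thinking},
    }

fast_turn = build_request(
    [{"role": "user", "content": "Summarize this in one sentence."}],
    enable_thinking=False,
)
print(fast_turn["chat_template_kwargs"])  # {'enable_thinking': False}
```

When serving behind vLLM or a similar gateway, this dictionary would be sent as the JSON body of a `/v1/chat/completions` request, with the thinking flag toggled independently on each turn.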