Ex0bit committed
Commit 63674c8 · verified · 1 parent: 68db643

Update README.md

Files changed (1): README.md (+36 −34)

README.md CHANGED
@@ -4,62 +4,64 @@ license_name: prism-research
  license_link: LICENSE.md
  language:
  - en
- - cn
+ - zh
  tags:
  - glm4
  - prism
+ - moe
  pipeline_tag: text-generation
  library_name: transformers
  ---

- [![Model](https://img.shields.io/badge/Model-2.6B-blue)]()
- [![Architecture](https://img.shields.io/badge/Architecture-LFM2%20Hybrid-green)]()
+ [![Parameters](https://img.shields.io/badge/Parameters-30B--A3B_MoE-blue)]()
+ [![Architecture](https://img.shields.io/badge/Architecture-GLM--4-green)]()
  [![Context](https://img.shields.io/badge/Context-128K-orange)]()

- ## Model Description
+ # GLM-4.7-Flash-PRISM

- This is **Ex0bit/GLM-4.7-Flash-PRISM**
+ An unrestricted version of ZAI's GLM-4.7-Flash with over-refusal mechanisms removed using PRISM (Projected Refusal Isolation via Subspace Modification).

- <div align="center" style="background:#161b22;border:1px solid #30363d;border-radius:10px;padding:20px 24px;margin:20px 0;">
+ <div align="center">

- <span style="color:#58a6ff;font-size:11px;font-weight:600;letter-spacing:0.5px;">PLEASE SUPPORT OUR WORK!</span>
- If you enjoy what we do, consider supporting us on Ko-fi! Every little bit means the world! https://ko-fi.com/ericelbaz
- <div align="center">
- <a href="https://ko-fi.com/ericelbaz#tier17681523526070">
- <img src="https://cdn-uploads.huggingface.co/production/uploads/63adf1fa42fd3b8dbaeb0c92/Qe17Xd59xWbucl1ZOl1jF.png" width="50%"/>
- </a>
- <br><br>
- <!--<span style="color:#e6edf3;font-size:18px;font-weight:600;">Complete both steps.</span>-->
- <br><br>
- <table cellpadding="0" cellspacing="0" border="0" align="center" style="text-align:left;">
- <!--<tr><td style="padding:6px 0;color:#e6edf3;font-size:13px;"><span style="color:#58a6ff;font-weight:600;margin-right:10px;">①</span>Submit the access request form below</td></tr>-->
- <tr><td style="padding:6px 0;color:#e6edf3;font-size:13px;"><span style="color:#58a6ff;font-weight:600;margin-right:10px;">②</span>Support Donation Option:</td></tr>
- </table>
- <br>
- <a href="https://ko-fi.com/summary/6bae206c-a751-4868-8dc7-f531afd1fb4c" target="_blank" style="background:#238636;color:white;text-decoration:none;padding:8px 16px;border-radius:6px;font-weight:600;font-size:16px;">PRISM VIP Member Sign-Up</a> <span style="color:#8b949e;font-size:11px;margin-left:8px;">All Models</span>
- <br><br>
- <a href="https://ko-fi.com/s/86882e8991" target="_blank" style="background:#21262d;color:#e6edf3;text-decoration:none;padding:8px 16px;border-radius:6px;font-weight:500;font-size:16px;border:1px solid #30363d;">One-Time Support</a> <span style="color:#8b949e;font-size:11px;margin-left:8px;">This Model</span>
- <br><br>
- <span style="color:#3fb950;font-size:11px;">✓ Priority Access</span>
- </div>
+ ### ☕ Support Our Work
+
+ If you find this useful, consider supporting us on Ko-fi!
+
+ [![Ko-fi](https://img.shields.io/badge/Ko--fi-Support%20Us-ff5e5b?logo=ko-fi&logoColor=white)](https://ko-fi.com/ericelbaz)
+
+ | Option | Description |
+ |--------|-------------|
+ | [**PRISM VIP Membership**](https://ko-fi.com/summary/6bae206c-a751-4868-8dc7-f531afd1fb4c) | Access to all PRISM models |
+ | [**One-Time Support**](https://ko-fi.com/s/86882e8991) | Support this model |
+
+ </div>

- **GLM-4.7-Flash-PRISM:** Unrestricted (Zero Over-Refusals and Zero Propaganda) GLM-4.7-Flash Model Access
-
- Access GLM-4.7-Flash-PRISM, a PRISM unchained version of ZAI's efficient 30B-A3B MoE model with over-refusal mechanisms removed.
-
- **What You Get:**
-
- - **PRISM (Projected Refusal Isolation via Subspace Modification)** — State-of-the-art abliteration technique that removes over-refusal behaviors while preserving capabilities
- - **30B-A3B MoE Architecture** — Lightweight yet powerful Mixture-of-Experts model with 30 billion total parameters and ~3 billion active per token for fast, efficient inference
- - **128K Context Window** — Extended context for complex tasks and large codebases
- - **Interleaved & Preserved Thinking** — Multi-turn reasoning that persists across conversations with per-turn thinking control
- - **Strong In-Class Benchmarks** — 91.6% AIME 2025, 79.5% τ²-Bench, 59.2% SWE-bench Verified, 75.2% GPQA
+ ---
+
+ ## Model Highlights
+
+ - **PRISM Ablation** — State-of-the-art technique that removes over-refusal behaviors while preserving model capabilities
+ - **30B-A3B MoE Architecture** — 30 billion total parameters with ~3 billion active per token for fast, efficient inference
+ - **128K Context Window** — Extended context for complex tasks and large codebases
+ - **Interleaved Thinking** — Multi-turn reasoning that persists across conversations with per-turn thinking control
+
+ ## Benchmarks
+
+ | Benchmark | Score |
+ |-----------|-------|
+ | AIME 2025 | 91.6% |
+ | τ²-Bench | 79.5% |
+ | SWE-bench Verified | 59.2% |
+ | GPQA | 75.2% |
+
+ ## Usage
+
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ model = AutoModelForCausalLM.from_pretrained("Ex0bit/GLM-4.7-Flash-PRISM")
+ tokenizer = AutoTokenizer.from_pretrained("Ex0bit/GLM-4.7-Flash-PRISM")
+ ```
+
+ ## License
+
+ This model is released under the [PRISM Research License](LICENSE.md).