---
license: other
license_name: prism-research
license_link: LICENSE.md
language:
- en
- zh
tags:
- glm4
- prism
pipeline_tag: text-generation
library_name: transformers
gated: true
extra_gated_heading: >-
  Request Access to Ex0bit/GLM-4.7-Flash-PRISM
extra_gated_description: >-
  **IMPORTANT:**

  **Step 1:** Submit the access request with your information below.

  **Step 2:** Complete the support donation at https://ko-fi.com/s/86882e8991

  Access to this limited edition model will be granted automatically after
  completion of **BOTH** steps above.

  Please provide your information below.
extra_gated_prompt: |
  By requesting access, you agree to:
  - Use this model for research or educational purposes only
  - Not redistribute the model weights without explicit permission
  - Cite this work appropriately in any publications
  - Report any issues or safety concerns to the author
extra_gated_fields:
  Full Name: text
  Organization/Affiliation: text
  Country: country
  Intended Use:
    type: select
    options:
      - Research
      - Education
      - Personal
      - label: Commercial (requires separate license)
        value: commercial
      - label: Other
        value: other
  Brief description of your intended use case: text
  I agree to the terms of use: checkbox
extra_gated_button_content: Agree and Request Access
---
<div align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/63adf1fa42fd3b8dbaeb0c92/V767Mtu8VSiFNbUZ3GcXW.png" width="20%"/>
</div>
<div align="center">
**Ex0bit/GLM-4.7-Flash-PRISM**
</div>
## Model Description
**Ex0bit/GLM-4.7-Flash-PRISM** is an unrestricted (zero over-refusals, zero propaganda) build of GLM-4.7-Flash: an abliterated version of ZAI's efficient 30B-A3B MoE model with its over-refusal mechanisms removed.
**What You Get:**
- **30B-A3B MoE Architecture** — Lightweight yet powerful Mixture-of-Experts model with 30 billion total parameters and ~3 billion active per token for fast, efficient inference
- **PRISM (Projected Refusal Isolation via Subspace Modification)** — State-of-the-art abliteration technique that removes over-refusal behaviors while preserving capabilities (see the conceptual sketch at the end of this card)
- **128K Context Window** — Extended context for complex tasks and large codebases
- **Interleaved & Preserved Thinking** — Multi-turn reasoning that persists across conversations with per-turn thinking control
- **Strong In-Class Benchmarks** — 91.6% AIME 2025, 79.5% τ²-Bench, 59.2% SWE-bench Verified, 75.2% GPQA
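
## Usage

The snippet below is a minimal sketch of loading the model for text generation with the `transformers` library, consistent with the `library_name` and `pipeline_tag` declared above. Because the repo is gated, authenticate with an approved account first (e.g. `huggingface-cli login`). The dtype, device placement, and generation settings here are illustrative assumptions, not official recommendations.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ex0bit/GLM-4.7-Flash-PRISM"

# Gated repo: run `huggingface-cli login` first with an approved account.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the dtype stored in the checkpoint
    device_map="auto",   # place layers across available devices
)

messages = [{"role": "user", "content": "Give a two-sentence overview of MoE models."}]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```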
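
## PRISM at a Glance

The exact PRISM procedure is not documented in this card. As a rough orientation, refusal-direction ablation methods in the literature typically (1) estimate a "refusal direction" from the difference of mean residual-stream activations on refused versus answered prompts, then (2) project that direction out of the weight matrices that write to the residual stream. The sketch below shows the single-direction case only; the function names and shapes are hypothetical and should not be read as PRISM's actual implementation.

```python
import torch

def refusal_direction(harmful_acts: torch.Tensor, harmless_acts: torch.Tensor) -> torch.Tensor:
    # Difference-of-means over residual-stream activations, each input
    # shaped (n_prompts, d_model); returns a unit vector in model space.
    d = harmful_acts.mean(dim=0) - harmless_acts.mean(dim=0)
    return d / d.norm()

def ablate_direction(weight: torch.Tensor, r: torch.Tensor) -> torch.Tensor:
    # For a matrix W of shape (d_model, d_in) whose outputs live in the
    # residual stream, return W' = W - r r^T W, so W' can no longer write
    # any component along the refusal direction r.
    return weight - torch.outer(r, r) @ weight
```

In a full pipeline of this kind, the ablation step would be applied to each layer's residual-stream-writing projections; "subspace modification" in the PRISM name suggests generalizing from one direction to several, which would replace `torch.outer(r, r)` with a projection onto a low-rank subspace.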