---
license: other
license_name: prism-research
license_link: LICENSE.md
language:
  - en
  - zh
tags:
  - glm4
  - prism
pipeline_tag: text-generation
library_name: transformers
---


## Model Description

This is **Ex0bit/GLM-4.7-Flash-PRISM**.

PLEASE SUPPORT OUR WORK! If you enjoy what we do, consider supporting us on Ko-fi! Every little bit means the world! https://ko-fi.com/ericelbaz





Support and donation options:

- PRISM VIP Member Sign-Up (all models)
- One-Time Support (this model)

✓ Priority Access

## GLM-4.7-Flash-PRISM: Unrestricted (Zero Over-Refusals and Zero Propaganda) GLM-4.7-Flash Model Access

Access GLM-4.7-Flash-PRISM, a PRISM-unchained version of ZAI's efficient 30B-A3B MoE model with its over-refusal mechanisms removed.

**What You Get:**

- **PRISM (Projected Refusal Isolation via Subspace Modification)** — state-of-the-art abliteration technique that removes over-refusal behaviors while preserving capabilities
- **30B-A3B MoE Architecture** — lightweight yet powerful Mixture-of-Experts model with 30 billion total parameters and ~3 billion active per token for fast, efficient inference
- **128K Context Window** — extended context for complex tasks and large codebases
- **Interleaved & Preserved Thinking** — multi-turn reasoning that persists across conversations, with per-turn thinking control
- **Strong In-Class Benchmarks** — 91.6% AIME 2025, 79.5% τ²-Bench, 59.2% SWE-bench Verified, 75.2% GPQA
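Since the card lists `library_name: transformers`, the model can presumably be loaded through the standard Hugging Face text-generation interface. The sketch below is an assumption, not a snippet documented by this card: it uses only ordinary `AutoTokenizer`/`AutoModelForCausalLM` calls, and arguments such as `torch_dtype="auto"` and `device_map="auto"` are generic `transformers` settings, not values specified for this checkpoint.

```python
# Minimal usage sketch, assuming the standard Hugging Face `transformers`
# chat interface works for this checkpoint (untested against it).
MODEL_ID = "Ex0bit/GLM-4.7-Flash-PRISM"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run one chat turn. Downloads the full 30B-A3B checkpoint on first call."""
    # Import deferred so the sketch can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick the dtype stored in the checkpoint
        device_map="auto",    # shard/offload across available devices
    )
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, dropping the prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

Because only ~3B of the 30B parameters are active per token, per-token inference cost should be closer to a small dense model than to a dense 30B model of the same total size.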