---
title: Matrix.Corp
emoji: 🦁‍🔥
colorFrom: indigo
colorTo: pink
sdk: static
pinned: true
---
# Matrix.Corp
Independent AI research organization building specialized models, frontier agentic systems, and new intelligence paradigms.
**HuggingFace:** [Matrix-Corp](https://huggingface.co/Matrix-Corp) · **Founded by:** [Zandy-Wandy](https://huggingface.co/Zandy-Wandy) · **GitHub:** [zapgaming](https://github.com/zapgaming)
---
## Status Legend
| Badge | Meaning |
|---|---|
| 🟢 Released | Weights available, ready to use |
| 🟡 Preview | Architecture published, training planned |
| 🔴 Planned | Design complete, not yet built |
| 🩵 Long-Term | Vision defined, major research ahead |
| 🟣 Closed | Proprietary weights |
| ⬛ Deprecated | Cancelled or superseded |
---
## Models
### 🚀 Zenith – Reasoning + Emotional Intelligence
**Status:** 🟡 Preview · **Target:** Tenstorrent Blackhole p300a
Transformer models with a built-in EQ Engine, a dedicated emotional-intelligence layer that sits alongside the reasoning stack. Ring Attention (32K context), MoE (12 experts, top-2 routing), Ollama + vLLM compatible.
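To make the routing scheme concrete, here is a minimal, self-contained sketch of top-2 gating over 12 experts. This is an illustration only, not Zenith's implementation; all function names and values are hypothetical.

```rust
// Illustrative top-2 MoE routing over 12 experts (hypothetical sketch,
// not Zenith's actual code). Per token: softmax the gate logits, keep
// the two highest-probability experts, renormalize their weights.

/// Numerically stable softmax over gate logits.
fn softmax(logits: &[f32]) -> Vec<f32> {
    let max = logits.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = logits.iter().map(|&x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

/// Pick the top-k experts and renormalize their gate weights to sum to 1.
fn top_k_route(gate_logits: &[f32], k: usize) -> Vec<(usize, f32)> {
    let probs = softmax(gate_logits);
    let mut indexed: Vec<(usize, f32)> = probs.into_iter().enumerate().collect();
    indexed.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    indexed.truncate(k);
    let total: f32 = indexed.iter().map(|&(_, p)| p).sum();
    indexed.into_iter().map(|(i, p)| (i, p / total)).collect()
}

fn main() {
    // 12 gate logits, one per expert; only the top-2 experts run per token.
    let gate_logits = [0.1, 2.0, -1.0, 0.5, 3.0, 0.0, -0.5, 1.0, 0.2, -2.0, 0.7, 0.3];
    for (expert, weight) in top_k_route(&gate_logits, 2) {
        println!("expert {expert} weight {weight:.3}");
    }
}
```

The token's output is then the weight-blended sum of the two selected experts' outputs, which is what keeps active parameters far below total parameters.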
| Model | Params | Base | Link |
|---|---|---|---|
| Zenith-7B-V1 | 7B | Qwen2.5-Coder-7B | [→](https://huggingface.co/Matrix-Corp/Zenith-7b-V1) |
| Zenith-28B-V1 | 28B | Qwen3.5-27B (Opus 4.6 distilled) | [→](https://huggingface.co/Matrix-Corp/Zenith-28b-p300-V1) |
| Zenith-32B-V1 | 32B | DeepSeek-R1-Distill-Qwen-32B | [→](https://huggingface.co/Matrix-Corp/Zenith-32b-V1-Tenstorrent-Blackhole-p300) |
| Zenith-70B-V1 | 70B | DeepSeek-R1-Distill-Llama-70B | [→](https://huggingface.co/Matrix-Corp/Zenith-70b-V1-Tenstorrent-Blackhole-p300) |
[View Zenith Collection →](https://huggingface.co/collections/Matrix-Corp/zenith-v1)
---
### 🔬 Vortex Scientific – Deep Science Reasoning
**Status:** 🟡 Preview · **Target:** MacBook M2/M3 + Nvidia 4060
Built from scratch with no base model. Custom 50K-token science tokenizer. Hybrid SSM + attention architecture with four domain-specific modules: Equation/LaTeX, Numerical, Citation, and Molecular/Periodic Table.
| Model | Params | Link |
|---|---|---|
| Vortex-7B-V1 | 7B | [→](https://huggingface.co/Matrix-Corp/Vortex-7b-V1) |
| Vortex-13B-V1 | 13B | [→](https://huggingface.co/Matrix-Corp/Vortex-13b-V1) |
[View Vortex Collection →](https://huggingface.co/collections/Matrix-Corp/vortex-v1)
---
### 🌿 Touch Grass – Music AI
**Status:** 🟡 Preview · **Target:** Any hardware
LoRA fine-tune on Qwen3.5 built for musicians. Tab & Chord Module, Music Theory Engine, Ear Training, EQ Adapter (4 emotional modes), Songwriting Module.
| Model | Params | Base | Link |
|---|---|---|---|
| TouchGrass-3B | 3B | Qwen3.5-3B-Instruct | [→](https://huggingface.co/Matrix-Corp/TouchGrass-3b) |
| TouchGrass-7B | 7B | Qwen3.5-7B-Instruct | [→](https://huggingface.co/Matrix-Corp/TouchGrass-7b) |
[View Touch Grass Collection →](https://huggingface.co/collections/Matrix-Corp/touch-grass)
---
### 🔗 Matrix Lattice – Frontier Agentic MoE
**Status:** 🟢 Released · 🟣 Closed Source · **Target:** 4–32× H100 / Tenstorrent p300a
**Shipped.** Our largest and most capable system. Frontier-scale mixture-of-experts with 17 custom intelligence modules including: EQ Engine V2, Multi-Agent Coordination Layer (MACL), Hierarchical Context Compression Engine (HCCE), Causal Reasoning Graph, Long-Horizon Task Planner, Confidence Calibration Head, Safety Reasoning Module (SRM), and more. 1M token context across all tiers.
| Model | Total Params | Active Params | Experts | Context | Link |
|---|---|---|---|---|---|
| Lattice-120B | 120B | ~22B | 64 top-4 | 1M | [→](https://huggingface.co/Matrix-Corp/Lattice-120B-V1) |
| Lattice-430B | 430B | ~38B | 128 top-4 | 1M | [→](https://huggingface.co/Matrix-Corp/Lattice-430B-V1) |
| Lattice-671B | 671B | ~47B | 256 top-4 | 1M | [→](https://huggingface.co/Matrix-Corp/Lattice-671B-V1) |
[View Lattice Collection →](https://huggingface.co/collections/Matrix-Corp/lattice-v1)
---
### 🩸 Matrix ECHO – Living Error Memory
**Status:** 🔴 Build In Progress · 🟢 Open Source · **Language:** Rust
**The model that remembers how it was wrong.**
ECHO is a 27B coding-focused LLM built on `Jackrong/Qwen3.5-27B-Claude-4.6-Opus-Reasoning-Distilled`, running fully in Rust via HuggingFace `candle`. Every correction it receives crystallizes into a **Scar**, a typed, weighted memory object stored in a live petgraph lattice.
Before every response, ECHO scans its Scar lattice for similar past mistakes. The more it is corrected, the harder it is to fool. Mistakes are not erased; they become assets.
**Core loop:**
```
prompt → pre-scan Scar lattice → inject caution context → generate → correction → new Scar forms
```
**Scar types:** Factual · Logical · Contextual · Hallucination · Overconfidence
**Domain Weakness Map:** ECHO tracks which topics it is systematically weak in and automatically suppresses confidence in high-risk domains.
**OpenAI-compatible API:** drop-in via `POST /v1/chat/completions`. Corrections via `POST /v1/echo/correct`.
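The correction loop and pre-scan can be sketched in Rust as follows. This is a simplified illustration under stated assumptions: the real system stores Scars in a petgraph lattice, whereas this toy version uses a flat `Vec` and keyword overlap for similarity; every type and field name here is hypothetical.

```rust
use std::collections::HashMap;

// Hypothetical sketch of ECHO's Scar objects and the pre-scan step.
// Simplification: a flat Vec + keyword overlap instead of a petgraph
// lattice with typed edges and learned similarity.

#[derive(Debug, Clone, PartialEq)]
enum ScarType { Factual, Logical, Contextual, Hallucination, Overconfidence }

#[derive(Debug, Clone)]
struct Scar {
    kind: ScarType,
    weight: f32,           // would grow each time the same mistake recurs
    keywords: Vec<String>, // what the mistake was about
    lesson: String,        // the correction that formed the Scar
}

struct ScarLattice {
    scars: Vec<Scar>,
    domain_weakness: HashMap<String, u32>, // domain -> correction count
}

impl ScarLattice {
    fn new() -> Self {
        Self { scars: Vec::new(), domain_weakness: HashMap::new() }
    }

    /// A correction crystallizes into a new Scar and updates the weakness map.
    fn correct(&mut self, kind: ScarType, domain: &str, keywords: &[&str], lesson: &str) {
        self.scars.push(Scar {
            kind,
            weight: 1.0,
            keywords: keywords.iter().map(|s| s.to_lowercase()).collect(),
            lesson: lesson.to_string(),
        });
        *self.domain_weakness.entry(domain.to_string()).or_insert(0) += 1;
    }

    /// Pre-scan: find past Scars whose keywords overlap the prompt, so their
    /// lessons can be injected as caution context before generation.
    fn pre_scan(&self, prompt: &str) -> Vec<&Scar> {
        let prompt = prompt.to_lowercase();
        self.scars
            .iter()
            .filter(|s| s.keywords.iter().any(|k| prompt.contains(k.as_str())))
            .collect()
    }
}

fn main() {
    let mut lattice = ScarLattice::new();
    lattice.correct(
        ScarType::Factual,
        "chemistry",
        &["helium", "valence"],
        "Helium has zero valence electrons available for bonding.",
    );
    // A later prompt touching the same keywords surfaces the Scar's lesson.
    for scar in lattice.pre_scan("What is the valence of helium?") {
        println!("caution [{:?}]: {}", scar.kind, scar.lesson);
    }
}
```

The matched lessons would be prepended to the generation context as caution text, which is the "inject caution context" step in the core loop above.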
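As a concrete illustration, a correction request might look like the body below. The endpoint path comes from the description above; the field names are assumptions for illustration only, not a documented schema.

```json
{
  "model": "ECHO-27B-V1",
  "scar_type": "factual",
  "domain": "chemistry",
  "original_response": "Helium commonly forms covalent bonds.",
  "correction": "Helium is inert and does not form covalent bonds under normal conditions."
}
```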
| Model | Params | Base | Language |
|---|---|---|---|
| ECHO-27B-V1 | 27B | Qwen3.5-27B (Opus 4.6 distilled) | Rust + candle |
[View ECHO Collection →](https://huggingface.co/collections/Matrix-Corp/echo-v1)
---
### 🎨 Matrix Voxel – 3D Generation
**Status:** 🔴 Planned · **Target:** A100 40GB
Flow-matching DiT backbone (~2.3B) with task-specific decoder heads. Generates 3D meshes, environments, printable models, and NeRF/Gaussian Splatting outputs.
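For readers unfamiliar with flow matching, here is a toy 1D illustration of the sampling loop such a backbone uses: integrate a learned velocity field from noise (t = 0) to data (t = 1). The closure standing in for the model, and all names here, are hypothetical; a real DiT predicts the velocity from the noisy sample and timestep.

```rust
// Toy illustration of flow-matching generation (hypothetical sketch).
// A flow-matching model learns a velocity field v(x, t); sampling
// integrates dx/dt = v(x, t) from t = 0 (noise) to t = 1 (data).

/// Integrate dx/dt = v(x, t) with simple Euler steps.
fn euler_sample(mut x: f32, steps: usize, velocity: impl Fn(f32, f32) -> f32) -> f32 {
    let dt = 1.0 / steps as f32;
    for i in 0..steps {
        let t = i as f32 * dt;
        x += dt * velocity(x, t);
    }
    x
}

fn main() {
    // For the linear path x_t = (1 - t) * x0 + t * x1, the true velocity is
    // the constant x1 - x0; with a perfect model, sampling recovers x1
    // (up to float rounding).
    let x0 = -1.0_f32; // "noise" sample
    let x1 = 3.0_f32;  // "data" target
    let out = euler_sample(x0, 100, |_x, _t| x1 - x0);
    println!("reached {out}");
}
```

In the real model the same loop runs over full latent tensors, with the DiT evaluated once per Euler step.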
| Model | Task | Outputs | License |
|---|---|---|---|
| Voxel Atlas | World/environment gen | .vox, .obj, .usd | 🟢 Open |
| Voxel Forge | 3D mesh & assets | .obj, .glb, .fbx, .usdz | 🟢 Open |
| Voxel Cast | 3D printable | .stl, .step, .3mf | 🟢 Open |
| Voxel Lens | NeRF / Gaussian Splatting | .ply (3DGS) | 🟢 Open |
| Voxel Prime | Unified all-in-one | All formats | 🟣 Closed |
---
### 🔷 Matrix Vexa – Crystalline Intelligence Substrate
**Status:** 🔴 Paused · 🟢 Open Source
Vexa is not a model. It is a new intelligence paradigm: a living lattice of **Glyphs** (structured meaning objects) that grows through **Crystallization** instead of training. It bootstraps in about 10 minutes on any CPU; no GPU is required. Knowledge never goes stale: three background threads continuously update the lattice from the web, from interactions, and through decay.
The full paradigm definition and build prompt are complete. The build is paused and will resume.
[View Vexa Collection →](https://huggingface.co/collections/Matrix-Corp/vexa-v1)
---
### ⬛ ~~Kairiq – Critical Moment Intelligence Module~~
**Status:** ⬛ Deprecated
A Lume-native intelligence-amplifier module designed to wrap Matrix models. Deprecated: the custom Lume language runtime exceeded practical build complexity. Its core ideas (pre-scan, confidence suppression, domain routing) were absorbed into ECHO.
---
## Paradigms
| Name | Type | Status |
|---|---|---|
| Crystalline Intelligence (Vexa) | Non-neural knowledge substrate | 🔴 Paused |
| Living Error Memory (ECHO) | Scar-based mistake crystallization | 🔴 Build In Progress |
| Ferric Attention | Ownership-typed attention mechanism | 🩵 Research concept |
---
## Reserved Names
These names are allocated to specific projects. Not available for other uses.
| Name | Allocated To |
|---|---|
| Vexa | Crystalline Intelligence Substrate |
| ECHO | Living Error Memory LLM |
| Axiom | Future extreme reasoning model (planned) |
| Lume | Declarative-relational language for Vexa |
---
## Licensing
| Model Family | License |
|---|---|
| Zenith | Apache 2.0 |
| Vortex | Apache 2.0 |
| Touch Grass | Apache 2.0 |
| Matrix Lattice | Proprietary |
| Matrix ECHO | Apache 2.0 |
| Matrix Voxel (open tiers) | Apache 2.0 |
| Matrix Voxel Prime | Proprietary |
| Vexa | Apache 2.0 |
---
*Matrix.Corp – building intelligence that knows its own limits.*