# Matrix.Corp

Welcome to **Matrix.Corp** – an independent AI research organization pushing the boundaries of what intelligence can be. We build specialized models, novel architectures, and entirely new paradigms for real-world use cases.

---

## Our Philosophy

We believe intelligence should be **purpose-built, accessible, and honest**. Every project in our ecosystem is designed from the ground up for a specific domain, user, and hardware target. We focus on novel architectures, hardware-aware optimization, emotional intelligence, and – with Vexa – a completely new computational paradigm that goes beyond AI entirely.

---

## Projects & Model Families
### Vexa – Crystalline Intelligence Substrate
*NOT AN AI MODEL – A new computational paradigm*
*Runs on any laptop · Fully open source · Lume language*

> **Vexa is not a neural network. It is a Crystalline Intelligence Substrate** – a living, self-updating lattice of Glyphs that crystallizes knowledge in 10 minutes on any laptop, learns from the live web in real time, and runs on Ollama. No gradient descent. No GPU cluster. No frozen knowledge.

**Core components:**
- **Glyph Lattice** – structured meaning objects, not weights. Every Glyph has identity, typed relations, confidence, source references, and a decay function (see the sketch after this list)
- **Crystallization** – replaces training. 5-phase pipeline, 10 min, any 8-core CPU, no GPU needed
- **Real-Time Learning** – 3 live threads: Web Crystallizer, Interaction Crystallizer, Decay Monitor. Never goes stale
- **Lume** – declarative-relational programming language where meaning is a first-class citizen
- **Vexa Bridge** – Ollama / vLLM / HuggingFace compatible adapter
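Since Vexa is still in the design phase, here is a minimal, purely hypothetical sketch of what a Glyph record could look like. The field names, the half-life parameter, and the exponential decay rule are our illustrative assumptions, not a published schema:

```python
# Hypothetical Glyph sketch: identity, typed relations, confidence,
# source references, and a decay function. All names are illustrative.
from dataclasses import dataclass, field
import math
import time

@dataclass
class Glyph:
    glyph_id: str                                                   # identity
    relations: dict[str, list[str]] = field(default_factory=dict)  # typed relations
    confidence: float = 1.0
    sources: list[str] = field(default_factory=list)                # provenance references
    created_at: float = field(default_factory=time.time)
    half_life_days: float = 30.0                                    # assumed decay rate

    def current_confidence(self, now: float | None = None) -> float:
        """Confidence decays exponentially unless something like a Decay Monitor refreshes it."""
        now = time.time() if now is None else now
        age_days = (now - self.created_at) / 86_400
        return self.confidence * math.exp(-math.log(2) * age_days / self.half_life_days)

g = Glyph("python.language", {"is_a": ["programming_language"]})
print(round(g.current_confidence(g.created_at + 60 * 86_400), 3))  # 0.25 after two half-lives
```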
| Component | Repo | Status |
|---|---|---|
| Vexa-V1 (Max density, ~10B Glyphs) | Matrix-Corp/Vexa-V1 | 🔴 Planned |
| Vexa-Micro-V1 (Laptop, ~10M Glyphs, 4GB RAM) | Matrix-Corp/Vexa-Micro-V1 | 🔴 Planned |
| Lume Language Spec + Parser | Matrix-Corp/Lume-Language-Spec | 🔴 Planned |
| Vexa Bridge (Ollama/vLLM/HF adapter) | Matrix-Corp/Vexa-Bridge | 🔴 Planned |
| Vexa Crystallizer Engine | Matrix-Corp/Vexa-Crystallizer | 🔴 Planned |

**Glyph Lattice density tiers (one architecture, scaled by density):**

| Tier | Glyphs | RAM | Equivalent | Use Case |
|---|---|---|---|---|
| Nano | ~1M | 2GB | ~1B LLM | IoT, edge devices |
| Micro | ~10M | 4GB | ~7B LLM | Laptops, Raspberry Pi |
| Core | ~100M | 8GB | ~13B LLM | Consumer GPU |
| Dense | ~1B | 16GB | ~30B LLM | Workstation |
| Max | ~10B | 40GB | ~70B LLM | A100 / p300a |

**Collection:** Matrix-Corp/vexa-v1 *(coming soon)*
**Status:** 🔴 Planned – paradigm definition complete, implementation starting

---
### Zenith – Reasoning & Emotional Intelligence
*Optimized for Tenstorrent Blackhole p300a hardware*

High-performance reasoning models built for the Tenstorrent p300a accelerator.

| Model | Params | Base Model | Status | Use Case |
|---|---|---|---|---|
| [Zenith-7B-V1](https://huggingface.co/Matrix-Corp/Zenith-7b-V1) | 7B | Qwen2.5-Coder-7B | 🟡 Preview | Code generation, fast inference |
| [Zenith-28B-V1](https://huggingface.co/Matrix-Corp/Zenith-28b-p300-V1) | 28B | Qwen3.5-27B (Claude Opus 4.6 distilled) | 🟡 Preview | Nuanced reasoning, EQ-aware conversations |
| [Zenith-32B-V1](https://huggingface.co/Matrix-Corp/Zenith-32b-p300-V1) | 32B | DeepSeek-R1-Distill-Qwen-32B | 🟡 Preview | Mathematical & structured reasoning |
| [Zenith-70B-V1](https://huggingface.co/Matrix-Corp/Zenith-70b-p300-V1) | 70B | DeepSeek-R1-Distill-Llama-70B | 🟡 Preview | Maximum capability, multi-card setup |

**Key features:**
- EQ Engine V1 – frustration detection + 8-emotion classification, fully integrated as core architecture
- Ring Attention – 32K context on limited memory
- Mixture of Experts – 12 experts, top-2 routing (see the sketch after this list)
- p300a optimized – TP=8/PP=4 maps 1:1 to all 32 RISC-V cores
- Ollama + vLLM compatible
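To make the routing concrete, here is a hedged sketch of top-2 Mixture-of-Experts dispatch with 12 experts in plain PyTorch. The module structure and layer sizes are our illustrative assumptions, not the published Zenith implementation:

```python
# Illustrative top-2 MoE routing: each token is sent to its 2 best
# experts out of 12, weighted by renormalized router scores.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    def __init__(self, d_model: int = 512, n_experts: int = 12, d_ff: int = 2048):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)   # scores every token per expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, d_model)
        logits = self.router(x)                           # (tokens, n_experts)
        weights, idx = logits.topk(2, dim=-1)             # keep the 2 best experts
        weights = F.softmax(weights, dim=-1)              # renormalize over the pair
        out = torch.zeros_like(x)
        for k in range(2):                                # dispatch expert-by-expert
            for e in range(len(self.experts)):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k, None] * self.experts[e](x[mask])
        return out

tokens = torch.randn(4, 512)
print(Top2MoE()(tokens).shape)  # torch.Size([4, 512])
```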
**Collection:** [Zenith V1](https://huggingface.co/collections/Matrix-Corp/zenith-v1)
**Timeline:** Trained weights + real benchmarks expected in 3–6 months

---

### Vortex – Scientific Reasoning

From-scratch models built for scientific reasoning across Physics, Mathematics, and more.

- Laptop-first – runs on MacBook Pro M2/M3 and Nvidia 4060
- School science project

**Collection:** [Vortex V1](https://huggingface.co/collections/Matrix-Corp/vortex-v1)
**Timeline:** Training data pipeline in progress, weights TBD

---
### Touch Grass – Music Assistant

Fine-tuned music assistant helping users learn instruments and understand music theory.

- Music EQ adapter – frustration detection tuned for music learners
- Genre & music history knowledge

**Collection:** [Touch Grass](https://huggingface.co/collections/Matrix-Corp/touch-grass)
**Timeline:** Architecture published, training planned

---
### Matrix Lattice – Frontier Agentic MoE
*Inference provider deployment · Closed source · Long-term roadmap*

Flagship frontier agentic + multimodal MoE family. Designed for inference provider deployment (Novita, Hyperbolic, Together, Fireworks). OpenAI-compatible API – a hedged client sketch follows the table below. 17 custom modules including EQ Engine V2, Multi-Agent Coordination Layer, Hierarchical Context Compression, and Safety Reasoning Module.

| Model | Total Params | Active Params | Context | Status |
|---|---|---|---|---|
| Lattice-120B | 120B | ~22B | 1M tokens | 🩵 Long-Term Roadmap |
| Lattice-430B | 430B | ~38B | 1M tokens | 🩵 Long-Term Roadmap |
| Lattice-671B | 671B | ~47B | 1M tokens | 🩵 Long-Term Roadmap |
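Because the API is OpenAI-compatible, any standard client should work once a provider hosts the models. Everything below is a placeholder – the endpoint URL and model id are assumptions, as no Lattice deployment exists yet:

```python
# Hedged sketch: calling a future Lattice deployment through an
# OpenAI-compatible endpoint. base_url and model id are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-provider.com/v1",  # hypothetical provider endpoint
    api_key="YOUR_API_KEY",
)

resp = client.chat.completions.create(
    model="matrix-corp/lattice-120b",                # placeholder model id
    messages=[{"role": "user", "content": "Outline a three-step research plan."}],
)
print(resp.choices[0].message.content)
```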
**Status:** 🩵 Long-Term Roadmap · 🟣 Closed Source

---
### Matrix Voxel – 3D Generation
*Flow Matching · Triplane Latent · A100 40GB*

3D generation model family sharing a flow-matching DiT backbone with task-specific decoder heads – see the sampling sketch after the table below. 4 specialist models open source, Voxel Prime closed source API-only.

| Model | Task | Outputs | Status |
|---|---|---|---|
| Voxel Atlas | World/environment generation | .vox, .obj, .usd | 🔴 Planned · 🟢 Open Source |
| Voxel Forge | 3D mesh & asset generation | .obj, .glb, .fbx, .usdz | 🔴 Planned · 🟢 Open Source |
| Voxel Cast | 3D printable generation | .stl, .step, .3mf | 🔴 Planned · 🟢 Open Source |
| Voxel Lens | NeRF / Gaussian Splatting | .ply (3DGS), NeRF weights | 🔴 Planned · 🟢 Open Source |
| Voxel Prime | All-in-one unified | All formats + pipeline mode | 🔴 Planned · 🟣 Closed Source |
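For intuition, flow-matching generation integrates a learned velocity field from noise toward data. This toy sketch uses a stand-in MLP rather than the (unreleased) triplane DiT backbone; the latent dimension and step count are arbitrary assumptions:

```python
# Toy flow-matching sampler: Euler-integrate dx/dt = v(x, t) from
# Gaussian noise at t=0 to a "latent" at t=1. Purely illustrative.
import torch
import torch.nn as nn

velocity = nn.Sequential(nn.Linear(8 + 1, 64), nn.SiLU(), nn.Linear(64, 8))  # v(x, t)

@torch.no_grad()
def sample(n: int = 4, steps: int = 50) -> torch.Tensor:
    x = torch.randn(n, 8)                  # start from Gaussian noise
    dt = 1.0 / steps
    for i in range(steps):                 # Euler integration step
        t = torch.full((n, 1), i * dt)
        x = x + dt * velocity(torch.cat([x, t], dim=-1))
    return x                               # endpoint at t=1

print(sample().shape)  # torch.Size([4, 8])
```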
**Status:** 🔴 Planned – architecture complete

---

## Status Legend

| Status | Meaning |
|---|---|
| 🟢 Released | Trained weights available, benchmarks published |
| 🟡 Preview | Architecture and code published, training in progress or planned |
| 🔴 Planned | Design complete, build not yet started |
| 🩵 Long-Term Roadmap | Vision defined, significant research and compute required |
| 🟣 Closed Source | Architecture may be public, weights and training are proprietary |

---
We build for **accessible, affordable hardware** – not just cloud GPUs:

| Series | Target Hardware | Why |
|---|---|---|
| Vexa | Any 8-core laptop – CPU only | Crystalline Intelligence for everyone, no GPU required |
| Zenith | Tenstorrent p300a | High-performance AI at a fraction of Nvidia cost |
| Vortex | MacBook + 4060 laptop | Science AI for researchers and students anywhere |
| Touch Grass | Any hardware | Music assistance should be universally accessible |
| Matrix Lattice | 16–32× H100 / p300a | Frontier inference for providers |
| Matrix Voxel | A100 40GB | 3D generation on a single card |

---

## Reserved Project Names

These names are allocated to specific future projects – not available for other uses:

- **Vexa** – Crystalline Intelligence Substrate (this project)
- **Axiom** – Future extreme reasoning model (planned)

---

## Collections
- [Zenith V1](https://huggingface.co/collections/Matrix-Corp/zenith-v1) – All Zenith series models
- [Vortex V1](https://huggingface.co/collections/Matrix-Corp/vortex-v1) – All Vortex series models
- [Touch Grass](https://huggingface.co/collections/Matrix-Corp/touch-grass) – Touch Grass music assistant models
- Matrix-Corp/vexa-v1 *(coming soon)* – Vexa Crystalline Intelligence components
- Matrix-Corp/voxel-v1 *(coming soon)* – Matrix Voxel 3D generation models

---

*Building the future of intelligence – one paradigm at a time.*