### 🔷 Vexa – Crystalline Intelligence Substrate
*NOT AN AI MODEL – A new computational paradigm*
*Runs on any laptop · Fully open source · Lume language*
**Status: 🔴 Spec complete – Build in progress**

> **Vexa is not a neural network. It is a Crystalline Intelligence Substrate** – a living, self-updating lattice of Glyphs that crystallizes knowledge in 10 minutes on any laptop, learns from the live web in real time, and runs on Ollama. No gradient descent. No GPU cluster. No frozen knowledge.

| Component | Repo | Status |
|---|---|---|
| Vexa Bridge (Ollama/vLLM/HF adapter) | Matrix-Corp/Vexa-Bridge | 🔴 Planned |
| Vexa Crystallizer Engine | Matrix-Corp/Vexa-Crystallizer | 🔴 Planned |

**Glyph Lattice density tiers:**

| Tier | Glyphs | RAM | Equivalent | Use Case |
|---|---|---|---|---|
| Max | ~10B | 40GB | ~70B LLM | A100 / p300a |
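
A quick back-of-envelope check on the Max tier, as a sketch only: if the 40GB budget is dominated by per-Glyph state (an assumption, since the lattice format is unpublished), the row implies roughly 4 bytes per Glyph.

```python
# Rough arithmetic behind the Max tier row (~10B Glyphs in 40GB).
# Assumption: RAM is dominated by per-Glyph lattice state.
glyphs = 10e9
ram_bytes = 40 * 1024**3
print(f"{ram_bytes / glyphs:.1f} bytes per Glyph")  # -> 4.3 bytes per Glyph
```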

**Collection:** Matrix-Corp/vexa-v1 *(coming soon)*

---

### ⚡ Kairiq – Critical Moment Intelligence Module
*Universal plug-in module · Built entirely in Lume · Slap-on adapters*
**Status: 🔴 Architecture complete – Build starting**

> **Kairiq is not a model. It is a universal intelligence amplifier** – a plug-in module built entirely in `.lume` files that attaches to any Matrix.Corp model and elevates it to elite benchmark performance. A Kairiq-enhanced 32B competes with a vanilla 70B. Not because it knows more – because it deploys what it knows at exactly the right moment, the right depth, and the right priority.

**The Three KQ Dimensions:**

- ⏱ **Temporal Acuity (T)** – sensing rhythm, pacing, momentum. Knowing when a reasoning path is collapsing before it wastes compute
- ⚡ **Tension Sensing (X)** – reading pressure gradients. Identifying load-bearing sub-problems where failure cascades everywhere
- 👑 **Imperial Hierarchy (H)** – the rank and weight of every sub-problem. What overrides what. The actual question beneath the stated question

**Three-layer pipeline:** Kairiq Gate → Kairiq Router → Base Model → Kairiq Verifier
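
To make the three layers concrete, here is a minimal Python sketch of the pipeline shape. Every name in it (`KQSignal`, `gate`, `route`, the scoring heuristics) is an invented placeholder: the real behavior is specified in `.lume` files that are not yet published, so this shows only the control flow, not Kairiq itself.

```python
# Hypothetical shape of the Kairiq pipeline: Gate -> Router -> Base Model -> Verifier.
# All names and heuristics here are illustrative placeholders, not the real module.
from dataclasses import dataclass
from typing import Callable

@dataclass
class KQSignal:
    temporal: float  # T -- is the current reasoning path still productive?
    tension: float   # X -- how load-bearing is this sub-problem?

def gate(prompt: str) -> KQSignal:
    """Layer 1: score the request before the base model sees it (toy heuristics)."""
    return KQSignal(
        temporal=0.9 if len(prompt) < 500 else 0.5,
        tension=1.0 if "prove" in prompt.lower() else 0.3,
    )

def route(signal: KQSignal) -> dict:
    """Layer 2: turn KQ scores into a decoding plan."""
    return {
        "max_thinking_tokens": 4096 if signal.tension > 0.7 else 512,
        "early_stop": signal.temporal < 0.6,  # kill collapsing paths (T)
    }

def run(prompt: str, base_model: Callable[[str, dict], str]) -> str:
    signal = gate(prompt)
    plan = route(signal)
    draft = base_model(prompt, plan)
    # Layer 3: high-tension answers get one verification pass.
    if signal.tension > 0.7:
        return base_model(f"Verify and fix if needed:\n{draft}", plan)
    return draft

# Works with any stand-in callable in place of a real model:
print(run("Prove that 41 is prime.", lambda p, plan: f"[model output for: {p[:24]}...]"))
```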

**Benchmark targets:**

| Benchmark | KQ Dimension | Expected Gain |
|---|---|---|
| MMLU | H – ranks actual question instantly | +4–6% |
| HumanEval | T – kills dead reasoning paths early | +6–9% |
| MATH | X – slow-deep on load-bearing steps | +5–8% |
| ARC | H – no reasoning overkill on simple problems | +3–5% |
| HellaSwag | T – reads narrative momentum correctly | +4–7% |

**Written entirely in `.lume`. Python `adapter.py` bridge for any existing model. Zero changes to base model required.**

| Component | Repo | Status |
|---|---|---|
| Kairiq Module V1 | Matrix-Corp/Kairiq-Module-V1 | 🔴 Planned |
| Lume Kairiq Spec | Matrix-Corp/Lume-Kairiq-Spec | 🔴 Planned |
| Zenith-32B-Kairiq-V1 | Matrix-Corp/Zenith-32B-Kairiq-V1 | 🔴 Planned |

**Collection:** Matrix-Corp/kairiq-v1 *(coming soon)*

---

### 🏔 Zenith – Reasoning & Emotional Intelligence
*Optimized for Tenstorrent Blackhole p300a hardware*

High-performance reasoning models built for the Tenstorrent p300a accelerator (dual-chip, 32 RISC-V cores, 64GB GDDR6). Ring Attention (32K context), Mixture of Experts, and EQ Engine V1 for emotional intelligence.

| Model | Parameters | Base Model | Status |
|---|---|---|---|
| [Zenith-7B-V1](https://huggingface.co/Matrix-Corp/Zenith-7b-V1) | 7B | Qwen2.5-Coder-7B | 🟡 Preview |
| [Zenith-28B-V1](https://huggingface.co/Matrix-Corp/Zenith-28b-p300-V1) | 28B | Qwen3.5-27B (Claude Opus 4.6 distilled) | 🟡 Preview |
| [Zenith-32B-V1](https://huggingface.co/Matrix-Corp/Zenith-32b-p300-V1) | 32B | DeepSeek-R1-Distill-Qwen-32B | 🟡 Preview |
| [Zenith-70B-V1](https://huggingface.co/Matrix-Corp/Zenith-70b-p300-V1) | 70B | DeepSeek-R1-Distill-Llama-70B | 🟡 Preview |

**Key features:** EQ Engine V1 · Ring Attention 32K · MoE 12 experts top-2 · TP=8/PP=4 · Ollama + vLLM compatible
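
The TP=8/PP=4 claim is simple arithmetic worth spelling out: eight tensor-parallel ranks per pipeline stage times four stages covers the p300a's 32 cores exactly. A minimal sketch, with a purely illustrative mesh layout (Tenstorrent's actual runtime assigns cores differently):

```python
# 8-way tensor parallelism x 4 pipeline stages = 32 RISC-V cores, 1:1.
TP, PP, CORES = 8, 4, 32
assert TP * PP == CORES

# Illustrative device mesh: stage i owns one contiguous block of 8 cores.
mesh = [[stage * TP + rank for rank in range(TP)] for stage in range(PP)]
for stage, cores in enumerate(mesh):
    print(f"pipeline stage {stage}: cores {cores}")
```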

**Collection:** [Zenith V1](https://huggingface.co/collections/Matrix-Corp/zenith-v1)

---

### 🔬 Vortex Scientific – Deep Science Reasoning
*Optimized for Apple Silicon & Nvidia 4060 · Built from scratch – no base model*

From-scratch models for scientific reasoning across Physics, Mathematics, Chemistry, Biology, Earth Science, Space Science, and Zoology. Hybrid SSM + attention architecture with a custom 50K science tokenizer and 4 specialized science modules.

| Model | Parameters | Architecture | Status |
|---|---|---|---|
| [Vortex-7B-V1](https://huggingface.co/Matrix-Corp/Vortex-7b-V1) | 7B | Hybrid SSM + Attention (60% SSM) | 🟡 Preview |
| [Vortex-13B-V1](https://huggingface.co/Matrix-Corp/Vortex-13b-V1) | 13B | Hybrid SSM + Attention (50% SSM) | 🟡 Preview |

**Key features:** No base model · Custom science tokenizer · Equation/LaTeX · Molecular/Periodic Table modules · Laptop-first
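
One way to read the "60% SSM" figure is as an interleaving ratio over the layer stack. The sketch below is an assumption about layout (the actual Vortex interleaving is unpublished): it spaces attention blocks evenly through a mostly-SSM stack so the stated fraction holds.

```python
# Hypothetical hybrid stack: place attention layers at evenly spaced depths,
# everything else is SSM. Illustrative only -- not the published Vortex layout.
def hybrid_layout(n_layers: int, ssm_fraction: float) -> list[str]:
    n_attn = n_layers - round(n_layers * ssm_fraction)
    attn_at = {round((i + 1) * n_layers / (n_attn + 1)) for i in range(n_attn)}
    return ["attn" if i in attn_at else "ssm" for i in range(n_layers)]

layers = hybrid_layout(32, 0.60)  # 60% SSM, as listed for Vortex-7B-V1
print(layers.count("ssm"), "SSM /", layers.count("attn"), "attention")  # 19 / 13
```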

**Collection:** [Vortex V1](https://huggingface.co/collections/Matrix-Corp/vortex-v1)

---

### 🌿 Touch Grass – Music AI Assistant
*Ultra lightweight · Runs on anything*

Fine-tuned music assistant for learning instruments, music theory, songwriting, ear training, and music history. Warm, encouraging, beginner-friendly.

| Model | Parameters | Base Model | Status |
|---|---|---|---|
| [TouchGrass-3B](https://huggingface.co/Matrix-Corp/TouchGrass-3b) | 3B | Qwen3.5-3B-Instruct | 🟡 Preview |
| [TouchGrass-7B](https://huggingface.co/Matrix-Corp/TouchGrass-7b) | 7B | Qwen3.5-7B-Instruct | 🟡 Preview |

**Key features:** All instruments · Tab & chord generation · Music theory engine · Ear training · Songwriting assistant · Music EQ adapter
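
For a flavor of "structured, musically validated output," here is a tiny standalone example of the kind of music-theory computation involved: rendering a Roman-numeral progression in any major key. This is ordinary music theory in plain Python, not TouchGrass code (the models themselves are fine-tuned chat models).

```python
# Render a Roman-numeral chord progression in a given major key.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the major scale
DEGREES = {"I": (0, ""), "ii": (1, "m"), "iii": (2, "m"), "IV": (3, ""),
           "V": (4, ""), "vi": (5, "m"), "vii°": (6, "dim")}

def progression(key: str, numerals: list[str]) -> list[str]:
    tonic = NOTES.index(key)
    return [NOTES[(tonic + MAJOR[d]) % 12] + q
            for d, q in (DEGREES[n] for n in numerals)]

print(progression("G", ["I", "V", "vi", "IV"]))  # ['G', 'D', 'Em', 'C']
```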

**Collection:** [Touch Grass](https://huggingface.co/collections/Matrix-Corp/touch-grass)

---

### 🌐 Matrix Lattice – Frontier Agentic MoE
*Inference provider deployment · Closed source · Long-term roadmap*
**Status: 🩵 Long-Term Roadmap · Spec complete**

Flagship frontier agentic + multimodal MoE family with 17 custom modules including EQ Engine V2, Multi-Agent Coordination Layer (MACL), Hierarchical Context Compression Engine (HCCE), and Safety Reasoning Module. Designed for inference provider deployment (Novita, Hyperbolic, Together, Fireworks).

| Model | Total Params | Active Params | Context | Status |
|---|---|---|---|---|

---

### 🎨 Matrix Voxel – 3D Generation
*Flow Matching · Triplane Latent · A100 40GB*
**Status: 🔴 Planned – Architecture complete**

3D generation model family sharing a flow-matching DiT backbone (~2.3B) with task-specific decoder heads.
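
For readers unfamiliar with the backbone's training objective, here is a generic flow-matching training step over a latent tensor. This is the textbook technique the section names, not Matrix Voxel's code; the "triplane" latent shape and the tiny stand-in network are placeholders.

```python
import torch

def flow_matching_loss(model: torch.nn.Module, x1: torch.Tensor) -> torch.Tensor:
    """Regress a velocity field onto the straight noise->data path."""
    x0 = torch.randn_like(x1)             # noise endpoint
    t = torch.rand(x1.shape[0], 1, 1, 1)  # per-sample time in [0, 1]
    xt = (1 - t) * x0 + t * x1            # point on the linear path
    return torch.nn.functional.mse_loss(model(xt, t), x1 - x0)

class TinyVelocityNet(torch.nn.Module):
    """Stand-in for the DiT backbone: predicts velocity at (x, t)."""
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(4, 3, kernel_size=3, padding=1)

    def forward(self, x, t):
        t_map = t.expand(-1, 1, x.shape[2], x.shape[3])  # crude time conditioning
        return self.conv(torch.cat([x, t_map], dim=1))

latents = torch.randn(4, 3, 32, 32)  # placeholder "triplane" latent batch
print(flow_matching_loss(TinyVelocityNet(), latents).item())
```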

| Model | Task | Outputs | Status |
|---|---|---|---|
| Voxel Atlas | World/environment generation | .vox, .obj, .usd | 🔴 Planned · 🟢 Open |
| Voxel Forge | 3D mesh & asset generation | .obj, .glb, .fbx, .usdz | 🔴 Planned · 🟢 Open |
| Voxel Cast | 3D printable generation | .stl, .step, .3mf | 🔴 Planned · 🟢 Open |
| Voxel Lens | NeRF / Gaussian Splatting | .ply (3DGS), NeRF weights | 🔴 Planned · 🟢 Open |
| Voxel Prime | All-in-one unified | All formats | 🔴 Planned · 🟣 Closed |

---

### ⚖️ Matrix Axiom – Extreme Reasoning Code Intelligence
*Reserved · Design in progress*
**Status: 🔴 Reserved – Design starting**

> Axiom is Matrix.Corp's planned extreme reasoning coding model: a hybrid architecture on a proven base with structurally enforced pre-code reasoning, a self-verify + rewrite loop, an internal multi-agent trident (Architect · Artisan · Auditor), and a custom semantic code tokenizer. Targets #1 on HumanEval. Any hardware. Kairiq-enhanced at launch.

*Full spec coming soon.*

---

| Status | Meaning |
|---|---|
| 🟢 Released | Trained weights available, benchmarks published |
| 🟡 Preview | Architecture published, training in progress or planned |
| 🔴 Planned | Design complete, build not yet started |
| 🩵 Long-Term Roadmap | Vision defined, significant research and compute required |
| 🟣 Closed Source | Weights and training are proprietary |

---

We build for **accessible, affordable hardware** – not just cloud GPUs:

| Series | Target Hardware |
|---|---|
| Vexa | Any 8-core laptop – CPU only |
| Kairiq | Any – universal plug-in |
| Zenith | Tenstorrent Blackhole p300a |
| Vortex | MacBook M2/M3 + Nvidia 4060 laptop |
| Touch Grass | Any hardware |
| Axiom | Any / multi-target |
| Matrix Lattice | 16–32× H100 / p300a |
| Matrix Voxel | A100 40GB |

---

## Reserved Project Names

- **Vexa** – Crystalline Intelligence Substrate
- **Kairiq** – Critical Moment Intelligence Module
- **Axiom** – Extreme Reasoning Code Intelligence

---

## Collections

- [Zenith V1](https://huggingface.co/collections/Matrix-Corp/zenith-v1)
- [Vortex V1](https://huggingface.co/collections/Matrix-Corp/vortex-v1)
- [Touch Grass](https://huggingface.co/collections/Matrix-Corp/touch-grass)
- Matrix-Corp/vexa-v1 *(coming soon)*
- Matrix-Corp/kairiq-v1 *(coming soon)*
- Matrix-Corp/voxel-v1 *(coming soon)*

---