SparseTech committed
Commit a336a4f · verified · Parent: 84fec84

Update README.md

Files changed (1): README.md +23 -8
README.md CHANGED
@@ -1,10 +1,25 @@
  ---
- title: README
- emoji: 🔥
- colorFrom: indigo
- colorTo: indigo
- sdk: static
- pinned: false
- ---

- Edit this `README.md` markdown file to author your organization card.
+ # ⚡️ SparseTech
+
+ **Redefining LLM reliability for edge AI through variance reduction, sparse knowledge distillation, and probability-domain manifold correction.**
+
+ Standard benchmarks measure whether a model is correct; SparseTech measures whether a model is *reliable*. We believe that in agentic and edge AI, **hallucinations live in variance**. Our mission is to crush stochastic variance and stabilize reasoning without relying on massive, server-side inference ensembles.
+
+
  ---
+
+ ## 📚 Foundational Research
+ We believe models should be built on a rigorous axiomatic framework. Ours was developed throughout early 2026; you can read the core methodology here:
+
+ ### The Core Theory
+ * **[Hallucinations Live in Variance (Jan 2026)](https://huggingface.co/papers/2601.07058)**
+   *Introduces Semantic Stability (SS) and Paraphrase Consistency (PC@k) as the true metrics for LLM reliability.*
+
+ ### The Distillation Framework
+ * **[Sparse Knowledge Distillation: A Mathematical Framework... (Jan 2026)](https://huggingface.co/papers/2601.03195)**
+ * **[Multi-Teacher Ensemble Distillation (Jan 2026)](https://huggingface.co/papers/2601.09165)**
+ * **[Recursive Meta-Distillation: An Axiomatic Framework... (Jan 2026)](https://huggingface.co/papers/2601.13100)**
+ * **[Adaptive Weighting in Knowledge Distillation (Jan 2026)](https://huggingface.co/papers/2601.17910)**
+
+ ### Advanced Manifold Correction
+ * **[Post-Training Probability Manifold Correction via Structured SVD Pruning... (Feb 2026)](https://huggingface.co/papers/2602.00372)**