EXOROBOURII committed on
Commit 4a98e60 · verified · 1 Parent(s): cb6f945

Update README.md

Files changed (1)
  1. README.md +65 -7
README.md CHANGED
@@ -1,10 +1,68 @@
  ---
- title: README
- emoji: 🔥
- colorFrom: green
- colorTo: indigo
- sdk: static
- pinned: false
  ---
 
- Edit this `README.md` markdown file to author your organization card.
+ # Exorobourii Research Lab
+
+ > **"What we observe is not nature itself, but nature exposed to our method of questioning."** — Werner Heisenberg
+
+ Exorobourii is a research initiative dedicated to **Mechanistic Interpretability** and **Efficient Intelligence**. We believe technology should be a glass box, not a black box. We build instruments to measure the internal physics of AI, and engineering frameworks to optimize it for ethical and ecological sustainability.
+
+ ## 📡 The Mission
+
+ Current AI development faces an "Observability Crisis". We are building engines that are faster and more powerful, but we rely on a dashboard that only has a speedometer (`val_loss`).
+
+ Our work focuses on three pillars:
+ 1. **Observability:** Developing the **VSM Protocol** to act as a "mechanistic stethoscope" for Transformer attention.
+ 2. **Efficiency:** Engineering **Nano-LLMs** (Project Janus) that achieve "Super-Chinchilla" performance by eliminating structural redundancy.
+ 3. **Sustainability:** Reducing the computational cost of intelligence through **Vector Space Homeostasis**.
+
16
  ---
+
+ ## 🔬 Key Research Initiatives
+
+ ### 1. The VSM Protocol
+ *A Framework for Quantifying and Guiding Attention Head Specialization.*
+
+ The VSM Protocol treats the Transformer architecture as a physical system for spectral processing. We utilize two novel metrics to track the evolution of a model's "mind" from initialization to convergence:
+
+ * **$\sigma_p$ (Coherence):** Measures the focus/entropy of attention heads.
+ * **$\sigma_a$ (Novelty/Agreement):** Measures the degree of cross-head specialization.
+
+ Our research has quantified the "Untrained Symmetry" phenomenon (Softmax Collapse) and mapped the "Diagonally Oppositional" trajectory of healthy learning.
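The exact formulas for $\sigma_p$ and $\sigma_a$ are defined in the VSM report; as a rough illustration of the two quantities, a minimal sketch (assuming an entropy-based coherence score and a pairwise-similarity agreement score, both hypothetical stand-ins) might look like:

```python
import numpy as np

def sigma_p(attn):
    """Coherence proxy: 1 minus the normalized entropy of each head's
    attention rows, averaged over heads and query positions.
    attn: (heads, queries, keys) array; each row sums to 1.
    NOTE: hypothetical stand-in for the VSM definition."""
    eps = 1e-12
    ent = -(attn * np.log(attn + eps)).sum(axis=-1)   # per-row entropy
    ent_norm = ent / np.log(attn.shape[-1])           # scale to [0, 1]
    return float(1.0 - ent_norm.mean())               # 1 = sharply focused

def sigma_a(attn):
    """Agreement proxy: mean pairwise cosine similarity between the heads'
    flattened attention maps; low values indicate specialized heads.
    NOTE: hypothetical stand-in for the VSM definition."""
    flat = attn.reshape(attn.shape[0], -1)
    flat = flat / np.linalg.norm(flat, axis=1, keepdims=True)
    sim = flat @ flat.T
    mask = ~np.eye(attn.shape[0], dtype=bool)
    return float(sim[mask].mean())

# A freshly initialized model tends toward near-uniform attention
# ("Untrained Symmetry"): coherence near 0, agreement near 1.
uniform = np.full((4, 8, 8), 1.0 / 8)
print(f"sigma_p={sigma_p(uniform):.3f}, sigma_a={sigma_a(uniform):.3f}")
```

Under this reading, the "Untrained Symmetry" regime sits at near-zero coherence and near-unit agreement, and healthy training moves the pair diagonally toward high $\sigma_p$ and low $\sigma_a$.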
+
+ ### 2. Project Janus
+ *Engineering Efficient Nano-LLMs via Feature Orthogonality.*
+
+ Project Janus is an attempt to solve "Attentional Collapse": the tendency of small models (Nano-LLMs) to learn redundant features due to limited capacity.
+
+ By implementing **Vector Space Homeostasis** (a diversity pressure term $\lambda_{div}$ in the loss function) and a **Trapezoidal Pressure Schedule**, we force the model to maintain feature orthogonality.
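The precise homeostasis term and schedule are specified in the Janus report; as an illustration of the general shape (an orthogonality penalty weighted by a trapezoid-scheduled $\lambda_{div}$, with all function names and constants hypothetical), it could be sketched as:

```python
import numpy as np

def diversity_penalty(head_features):
    """Orthogonality pressure: mean squared off-diagonal cosine similarity
    between per-head feature vectors. head_features: (heads, dim).
    NOTE: hypothetical form of the homeostasis term."""
    z = head_features / np.linalg.norm(head_features, axis=1, keepdims=True)
    gram = z @ z.T                        # pairwise cosine similarities
    off = gram - np.eye(gram.shape[0])    # zero out the diagonal
    return float((off ** 2).mean())

def trapezoid_lambda(step, total_steps, lam_max=0.1, warmup=0.1, cooldown=0.1):
    """Trapezoidal pressure schedule: ramp lambda_div up over the first
    `warmup` fraction of training, hold at lam_max, then ramp it back
    down over the final `cooldown` fraction."""
    t = step / total_steps
    if t < warmup:
        return lam_max * t / warmup
    if t > 1.0 - cooldown:
        return lam_max * max(0.0, 1.0 - t) / cooldown
    return lam_max

# Per-step loss (sketch): total = task_loss + lambda_div * diversity_penalty(H)
lam = trapezoid_lambda(step=5_000, total_steps=10_000)
```

The trapezoid ramps pressure in gently during early feature formation, holds it through the middle of training, and releases it near convergence so the task loss has the final word.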
+
+ **Key Results (Janus v3 vs. Baseline):**
+ * **Architecture:** 40M parameters (Llama-style chassis).
+ * **Performance:** 9.2% reduction in loss on logical coherence tasks.
+ * **Efficiency:** Matched baseline loss with **28% less structural redundancy**.
+ * **Generalization:** 0.91-point improvement in perplexity on WikiText-103.
+
+ ![Efficiency Gap Chart - Janus Loss vs Baseline](path/to/efficiency_gap_chart.png)
+
  ---
 
+ ## 🛠️ Usage & Citation
+
+ We believe in open science. Our protocols and model weights are released here to encourage the community to move beyond black-box optimization.
+
+ ### BibTeX
+
+ If you use the VSM Protocol or Janus methodology in your research, please cite:
+
+ ```bibtex
+ @techreport{belanger2025vsm,
+   title={The VSM (Vector-Space-Mapping) Protocol: A Framework for Quantifying and Guiding Attention Head Specialization in Transformers},
+   author={Belanger, Jonathan R.},
+   institution={Exorobourii},
+   year={2025}
+ }
+
+ @techreport{belanger2025janus,
+   title={Project Janus: Engineering Efficient Nano-LLMs via Feature Orthogonality and Vector Space Homeostasis},
+   author={Belanger, Jonathan R.},
+   institution={Exorobourii},
+   year={2025}
+ }
+ ```