---
language: en
license: afl-3.0
tags:
- vector-symbolic-architecture
- holographic-reduced-representations
- sparse-distributed-memory
- semantic-hashing
- cognitive-architecture
- memory-systems
- no-gpu
library_name: numpy
pipeline_tag: feature-extraction
---

# Holographic Neural Mesh (HNM) v3.0

A **deterministic sparse semantic substrate** for cognitive systems. No GPU required.

## Model Description

HNM is **not** a language model or an embedding-model replacement. It is a **cognitive memory layer** providing:

- ✅ Constant-time encoding (O(1) regardless of corpus size)
- ✅ 99% structural sparsity (102× FLOPS reduction)
- ✅ Deterministic representations (same input → same output, always)
- ✅ Semantic discrimination (negation, role reversal, synonyms)
- ✅ Associative binding/unbinding (key-value memory)
- ✅ Pure NumPy (no GPU, no PyTorch, no TensorFlow)
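
The determinism and sparsity claims can be illustrated with a minimal NumPy sketch. This is an illustration of the *properties* only, not HNM's actual encoder: the `encode` function and its hashing scheme are invented for the example (a hash carries no semantics), but it shows what "same input → same output" and "99% structural sparsity" mean concretely.

```python
import hashlib
import numpy as np

def encode(text: str, dim: int = 4096, active_ratio: float = 0.01) -> np.ndarray:
    """Toy encoder: deterministic projection + top-1% sparsification."""
    # Seed from a cryptographic hash so the same input always maps to the
    # same pattern across processes and machines (Python's built-in hash()
    # is salted per process and would NOT be deterministic).
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
    dense = np.random.default_rng(seed).normal(size=dim)
    # Keep only the top 1% of components by magnitude -> 99% structural sparsity.
    k = int(dim * active_ratio)
    pattern = np.zeros(dim)
    top = np.argsort(np.abs(dense))[-k:]
    pattern[top] = dense[top]
    return pattern

a = encode("Machine learning is fascinating")
b = encode("Machine learning is fascinating")
assert np.array_equal(a, b)       # same input -> same output, always
assert np.count_nonzero(a) == 40  # 1% of 4096 dimensions active
```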
| 30 |
+
|
| 31 |
+
### What HNM IS
|
| 32 |
+
|
| 33 |
+
| β
HNM Is | β HNM Is Not |
|
| 34 |
+
|-----------|---------------|
|
| 35 |
+
| Semantic memory substrate | Language model |
|
| 36 |
+
| Symbolic binding engine | Next-token predictor |
|
| 37 |
+
| Deterministic cognitive layer | Embedding model replacement |
|
| 38 |
+
| Associative recall system | Foundation model |

## Intended Uses

- **Agent memory backbone**: Long-term memory for LLM agents
- **Semantic routing**: Content-addressable dispatch
- **Variable binding**: Compositional reasoning substrate
- **Edge deployment**: Cognitive processing without a GPU
- **Distributed consensus**: Deterministic state for multi-agent systems

## How to Use

```python
from hnm_v3 import HolographicNeuralMeshV3, HNMConfig

# Initialize
hnm = HolographicNeuralMeshV3(HNMConfig())

# Encode text
pattern, stats = hnm.forward("Machine learning is fascinating")
print(f"Latency: {stats['inference_time_ms']:.2f}ms, Sparsity: {1 - stats['active_ratio']:.1%}")

# Semantic similarity
sim = hnm.similarity("I am happy", "I feel joyful")      # ~0.87
sim = hnm.similarity("dog bites man", "man bites dog")   # ~0.52 (role reversal detected)

# Memory storage and retrieval
hnm.encode_and_store("Deep learning uses neural networks")
hnm.encode_and_store("The stock market crashed today")
results = hnm.search("Tell me about neural networks", top_k=3)

# Associative binding
bound = hnm.bind("capital of France", "Paris")
recovered = hnm.unbind(bound, "capital of France")  # ≈ Paris vector
```

## Benchmarks

### Semantic Discrimination (7/7 Pass)

| Test | Pair | Score | Target | Status |
|------|------|-------|--------|--------|
| Negation | "alive" / "not alive" | 0.48 | < 0.50 | ✅ |
| Role Reversal | "dog bites man" / "man bites dog" | 0.52 | < 0.70 | ✅ |
| Paraphrase | "happy" / "joyful" | 0.87 | > 0.70 | ✅ |
| Unrelated | "neural networks" / "fishing" | 0.01 | < 0.30 | ✅ |
|
| 85 |
+
### Scaling (Constant Time)
|
| 86 |
+
|
| 87 |
+
| Corpus Size | TF-IDF | BM25 | HNM |
|
| 88 |
+
|-------------|--------|------|-----|
|
| 89 |
+
| 20 docs | 0.03ms | 0.04ms | 1.78ms |
|
| 90 |
+
| 2,000 docs | 2.45ms | 3.60ms | **1.62ms** |
|
| 91 |
+
| **100Γ growth** | **78Γ slower** | **98Γ slower** | **0.9Γ slower** |

### Resource Efficiency

| Metric | Value |
|--------|-------|
| Sparsity | 99% |
| FLOPS reduction | 102× |
| Inference latency | ~3.5ms |
| GPU required | **No** |

## Architecture

```
Input → Semantic Encoder → Holographic Projection → Interference Layers (×8) → Memory
              ↓                     ↓                         ↓                    ↓
      Word vectors +         Complex pattern          FFT + Phase mixing    Cleanup memory +
      Negation handling      Phase = semantics        99% sparsification    Iterative decoding
```

**Key Components:**
- **Dual-channel encoding**: Semantic (meaning) + Structural (order)
- **Circular convolution binding**: `bind(key, value)` → `unbind(bound, key) ≈ value`
- **Cleanup memory**: Iterative extraction from superposition
- **Hierarchical storage**: 16 slots with saturation monitoring
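
The circular-convolution binding above follows Plate's HRR scheme and can be sketched directly in NumPy. This is a standalone illustration, not the HNM implementation: HNM binds text inputs, whereas here `key` and `value` are raw random vectors.

```python
import numpy as np

def bind(key: np.ndarray, value: np.ndarray) -> np.ndarray:
    # Circular convolution, computed in O(d log d) via the FFT.
    return np.real(np.fft.ifft(np.fft.fft(key) * np.fft.fft(value)))

def unbind(bound: np.ndarray, key: np.ndarray) -> np.ndarray:
    # Circular correlation (convolution with the key's involution) undoes bind.
    return np.real(np.fft.ifft(np.fft.fft(bound) * np.conj(np.fft.fft(key))))

d = 1024
rng = np.random.default_rng(42)
key = rng.normal(0.0, 1.0 / np.sqrt(d), d)    # HRR convention: elements ~ N(0, 1/d)
value = rng.normal(0.0, 1.0 / np.sqrt(d), d)

bound = bind(key, value)
recovered = unbind(bound, key)

# The recovered vector is a *noisy* copy of `value`; this is where cleanup
# memory comes in, snapping it to the nearest stored item.
cos = recovered @ value / (np.linalg.norm(recovered) * np.linalg.norm(value))
assert cos > 0.5  # far above chance (~0 for unrelated 1024-d vectors)
```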

## Limitations

1. **Semantic priors are hand-crafted**: ~40 word clusters cover common vocabulary, not open-domain language
2. **Similarity is ordinal, not metric**: Scores are internally consistent but not calibrated against SBERT
3. **This is a substrate, not a system**: HNM provides memory and binding, not language generation

## Citation

```bibtex
@software{stone2024hnm,
  author    = {Stone, Kent},
  title     = {Holographic Neural Mesh: A Deterministic Sparse Semantic Substrate},
  year      = {2024},
  publisher = {JARVIS Cognitive Systems},
  url       = {https://huggingface.co/jarvis-cognitive/hnm-v3}
}
```

## Lineage

HNM builds on established cognitive-architecture research:
- **Holographic Reduced Representations** (Plate, 1995)
- **Sparse Distributed Memory** (Kanerva, 1988)
- **Hyperdimensional Computing** (Kanerva, 2009)
- **Vector Symbolic Architectures** (Kleyko et al., 2023)

## Contact

Kent Stone - JARVIS Cognitive Systems