# Holographic Neural Mesh (HNM) v3.0
A deterministic sparse semantic substrate for cognitive systems. No GPU required.
## Model Description
HNM is not a language model or embedding model replacement. It is a cognitive memory layer providing:
- ✅ Constant-time encoding (O(1) regardless of corpus size)
- ✅ 99% structural sparsity (102× FLOPS reduction)
- ✅ Deterministic representations (same input → same output, always)
- ✅ Semantic discrimination (negation, role reversal, synonyms)
- ✅ Associative binding/unbinding (key-value memory)
- ✅ Pure NumPy (no GPU, no PyTorch, no TensorFlow)
## What HNM Is
| ✅ HNM Is | ❌ HNM Is Not |
|---|---|
| Semantic memory substrate | Language model |
| Symbolic binding engine | Next-token predictor |
| Deterministic cognitive layer | Embedding model replacement |
| Associative recall system | Foundation model |
## Intended Uses
- Agent memory backbone: Long-term memory for LLM agents
- Semantic routing: Content-addressable dispatch
- Variable binding: Compositional reasoning substrate
- Edge deployment: Cognitive processing without GPU
- Distributed consensus: Deterministic state for multi-agent systems
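As a concrete illustration of the semantic-routing use case, content-addressable dispatch can be sketched in plain NumPy: keep one prototype pattern per route and send each query to the route with the highest cosine similarity. The route names and random prototype vectors below are hypothetical stand-ins; in practice the prototypes would come from `hnm.forward(...)` on each route's description.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed: deterministic, mirroring HNM's guarantee
D = 1024

# Hypothetical handler prototypes, one pattern vector per route.
routes = {
    "billing": rng.standard_normal(D),
    "support": rng.standard_normal(D),
    "sales": rng.standard_normal(D),
}

def cosine(a, b):
    """Cosine similarity between two pattern vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def dispatch(query_pattern):
    """Content-addressable dispatch: pick the route whose prototype is most similar."""
    return max(routes, key=lambda name: cosine(routes[name], query_pattern))

# A query pattern close to the "support" prototype routes there.
query = routes["support"] + 0.1 * rng.standard_normal(D)
print(dispatch(query))
```

The same nearest-prototype scheme generalizes to the agent-memory and multi-agent cases: because encoding is deterministic, every node computes identical prototypes and therefore identical routing decisions.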
## How to Use
```python
from hnm_v3 import HolographicNeuralMeshV3, HNMConfig

# Initialize
hnm = HolographicNeuralMeshV3(HNMConfig())

# Encode text
pattern, stats = hnm.forward("Machine learning is fascinating")
print(f"Latency: {stats['inference_time_ms']:.2f}ms, Sparsity: {1 - stats['active_ratio']:.1%}")

# Semantic similarity
sim = hnm.similarity("I am happy", "I feel joyful")      # ~0.87
sim = hnm.similarity("dog bites man", "man bites dog")   # ~0.52 (role reversal detected)

# Memory storage and retrieval
hnm.encode_and_store("Deep learning uses neural networks")
hnm.encode_and_store("The stock market crashed today")
results = hnm.search("Tell me about neural networks", top_k=3)

# Associative binding
bound = hnm.bind("capital of France", "Paris")
recovered = hnm.unbind(bound, "capital of France")  # ≈ "Paris" vector
```
## Benchmarks
### Semantic Discrimination (7/7 pass; selected results shown)
| Test | Pair | Score | Target | Status |
|---|---|---|---|---|
| Negation | "alive" / "not alive" | 0.48 | < 0.50 | ✅ |
| Role Reversal | "dog bites man" / "man bites dog" | 0.52 | < 0.70 | ✅ |
| Paraphrase | "happy" / "joyful" | 0.87 | > 0.70 | ✅ |
| Unrelated | "neural networks" / "fishing" | 0.01 | < 0.30 | ✅ |
### Scaling (Constant Time)
| Corpus Size | TF-IDF | BM25 | HNM |
|---|---|---|---|
| 20 docs | 0.03ms | 0.04ms | 1.78ms |
| 2,000 docs | 2.45ms | 3.60ms | 1.62ms |
| 100× growth | 78× slower | 98× slower | 0.9× (effectively constant) |
### Resource Efficiency
| Metric | Value |
|---|---|
| Sparsity | 99% |
| FLOPS reduction | 102× |
| Inference latency | ~3.5ms |
| GPU required | No |
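The FLOPS figure follows directly from the sparsity level: if only ~1% of coefficients are active, the dense-to-sparse work ratio is roughly 1/(1 - 0.99) ≈ 100×, and the measured 102× corresponds to ~0.98% active. As a quick sanity check:

```python
# Back-of-the-envelope: 99% sparsity means ~1% of coefficients do work,
# so the dense-to-sparse FLOPS ratio is about 1 / (1 - sparsity) ~= 100x.
# The card's measured 102x implies ~0.98% of coefficients are active.
sparsity = 0.99
reduction = 1 / (1 - sparsity)
print(f"~{reduction:.0f}x FLOPS reduction")
```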
## Architecture
```
Input → Semantic Encoder → Holographic Projection → Interference Layers (×8) → Memory
              │                     │                        │                    │
        Word vectors +       Complex pattern          FFT + phase mixing    Cleanup memory +
        negation handling    (phase = semantics)      99% sparsification    iterative decoding
```
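The projection-and-sparsification stage can be illustrated with a short NumPy sketch (an illustration of the idea, not the actual implementation): transform a pattern into the frequency domain, then keep only the top 1% of coefficients by magnitude, which yields the stated 99% structural sparsity.

```python
import numpy as np

def sparsify_pattern(x, keep_ratio=0.01):
    """Illustrative projection + sparsification: FFT into the frequency
    domain, then zero out all but the top `keep_ratio` fraction of
    coefficients by magnitude (99% structural sparsity by default)."""
    spectrum = np.fft.fft(x)
    k = max(1, int(len(x) * keep_ratio))
    top = np.argsort(np.abs(spectrum))[-k:]   # indices of the k largest magnitudes
    sparse = np.zeros_like(spectrum)
    sparse[top] = spectrum[top]
    return sparse

rng = np.random.default_rng(0)
pattern = rng.standard_normal(4096)
sparse = sparsify_pattern(pattern)
active = np.count_nonzero(sparse) / len(sparse)
print(f"active ratio: {active:.2%}")  # ~1% active -> ~99% sparse
```

Because the FFT and the top-k selection are both deterministic, the same input always produces the same sparse pattern, consistent with the determinism guarantee above.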
Key Components:
- Dual-channel encoding: Semantic (meaning) + Structural (order)
- Circular convolution binding: `bind(key, value)` then `unbind(bound, key) ≈ value`
- Cleanup memory: Iterative extraction from superposition
- Hierarchical storage: 16 slots with saturation monitoring
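The binding component follows the standard Holographic Reduced Representations formulation (Plate, 1995): circular convolution is an elementwise product in the frequency domain, and unbinding inverts it. The sketch below uses exact spectral division for unbinding; whether HNM uses this exact inverse or the approximate involution is not specified here, so treat it as illustrative.

```python
import numpy as np

def bind(key, value):
    """Circular convolution (HRR binding): elementwise product of the spectra."""
    return np.real(np.fft.ifft(np.fft.fft(key) * np.fft.fft(value)))

def unbind(bound, key):
    """Exact unbinding via spectral division (HRRs often use the
    approximate involution instead; this sketch takes the exact inverse)."""
    return np.real(np.fft.ifft(np.fft.fft(bound) / np.fft.fft(key)))

rng = np.random.default_rng(0)
D = 2048
key, value = rng.standard_normal(D), rng.standard_normal(D)

recovered = unbind(bind(key, value), key)
sim = recovered @ value / (np.linalg.norm(recovered) * np.linalg.norm(value))
print(f"cosine(recovered, value) = {sim:.3f}")  # ~1.0
```

When several key-value pairs are superposed into one trace, unbinding returns a noisy version of each value; that is where the cleanup memory's iterative extraction comes in.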
## Limitations
- Semantic priors are hand-crafted: ~40 word clusters cover common vocabulary but not open-domain language
- Similarity is ordinal, not metric: Scores are internally consistent but not calibrated to SBERT
- This is a substrate, not a system: Provides memory/binding, not language generation
## Citation
```bibtex
@software{stone2024hnm,
  author    = {Stone, Kent},
  title     = {Holographic Neural Mesh: A Deterministic Sparse Semantic Substrate},
  year      = {2024},
  publisher = {JARVIS Cognitive Systems},
  url       = {https://huggingface.co/jarvis-cognitive/hnm-v3}
}
```
## Lineage
HNM builds on established cognitive architecture research:
- Holographic Reduced Representations (Plate, 1995)
- Sparse Distributed Memory (Kanerva, 1988)
- Hyperdimensional Computing (Kanerva, 2009)
- Vector Symbolic Architectures (Kleyko et al., 2023)
## Contact
Kent Stone - JARVIS Cognitive Systems