---
license: mit
tags:
  - spatial-database
  - memory
  - embeddings
  - ai
  - vector-search
  - rust
library_name: arms-core
pipeline_tag: feature-extraction
---

# ARMS - Attention Reasoning Memory Store

> **Position IS Relationship** - A Spatial Memory Fabric for AI Systems

ARMS is a spatial memory fabric that enables AI systems to store and retrieve computed states at their native dimensional coordinates. Unlike traditional databases, which require relationships to be declared explicitly (foreign keys) or learned approximately (nearest-neighbor indexes), ARMS operates on a fundamental principle: **proximity defines connection**.

![ARMS Architecture](paper/figures/fig01_architecture.jpg)

## The Problem

Current AI memory approaches all lose information:

- **Extended context**: Expensive, doesn't scale beyond training length
- **RAG retrieval**: Retrieves text, requires recomputation of attention
- **Vector databases**: Treat all data as unstructured point clouds
- **External memory**: Key-value stores with explicit indexing

![Traditional vs ARMS](paper/figures/fig06_traditional_vs_arms.jpg)

## The ARMS Insight

```
Traditional:  State β†’ Project β†’ Index β†’ Retrieve β†’ Reconstruct
              (lossy at each step)

ARMS:         State β†’ Store AT coordinates β†’ Retrieve β†’ Inject directly
              (native representation preserved)
```

## The Five Primitives

Everything in ARMS reduces to five operations:

![Five Primitives](paper/figures/fig03_primitives.jpg)

| Primitive | Type | Purpose |
|-----------|------|---------|
| **Point** | `Vec<f32>` | Any dimensionality |
| **Proximity** | `fn(a, b) -> f32` | How related? |
| **Merge** | `fn(points) -> point` | Compose together |
| **Place** | `fn(point, data) -> id` | Exist in space |
| **Near** | `fn(point, k) -> ids` | What's related? |
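
To make the primitives concrete, a **Merge** can be sketched as a centroid over points. The centroid rule below is an illustrative assumption for demonstration, not necessarily the composition `arms-core` uses:

```rust
// Illustrative Merge primitive: fn(points) -> point, realized as a centroid.
// Centroid merging is an assumption for this sketch, not the crate's API.

fn merge(points: &[Vec<f32>]) -> Vec<f32> {
    let dims = points[0].len();
    let n = points.len() as f32;
    let mut out = vec![0.0f32; dims];
    for p in points {
        for (o, v) in out.iter_mut().zip(p) {
            *o += v;
        }
    }
    // Divide each accumulated coordinate by the point count.
    out.iter_mut().for_each(|o| *o /= n);
    out
}

fn main() {
    // The centroid of (0, 2) and (2, 0) is (1, 1).
    let merged = merge(&[vec![0.0, 2.0], vec![2.0, 0.0]]);
    assert_eq!(merged, vec![1.0, 1.0]);
}
```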

## Quick Start

```rust
use arms_core::{Arms, ArmsConfig, Point};

// Create ARMS with default config
let mut arms = Arms::new(ArmsConfig::new(768));

// Place a point in the space
let point = Point::new(vec![0.1; 768]);
let id = arms.place(point, b"my data".to_vec()).unwrap();

// Find nearby points
let query = Point::new(vec![0.1; 768]);
let neighbors = arms.near(&query, 5).unwrap();
```

## Hexagonal Architecture

ARMS follows a hexagonal (ports-and-adapters) architecture. The core domain contains pure math with no I/O. Ports define trait contracts. Adapters provide swappable implementations.

![Hexagonal Architecture](paper/figures/fig02_hexagonal.jpg)

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                         ARMS                                β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  CORE (pure math, no I/O)                                   β”‚
β”‚    Point, Id, Blob, Proximity, Merge                        β”‚
β”‚                                                             β”‚
β”‚  PORTS (trait contracts)                                    β”‚
β”‚    Place, Near, Latency                                     β”‚
β”‚                                                             β”‚
β”‚  ADAPTERS (swappable implementations)                       β”‚
β”‚    Storage: Memory, NVMe (planned)                          β”‚
β”‚    Index: Flat, HAT (see arms-hat crate)                    β”‚
β”‚                                                             β”‚
β”‚  ENGINE (orchestration)                                     β”‚
β”‚    Arms - the main entry point                              β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```
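
As a sketch of what the port layer might look like, the `Place` and `Near` contracts can be expressed as traits, with a flat in-memory scan as the simplest adapter. The trait names follow the table of primitives above, but these signatures and the `FlatMemory` adapter are hypothetical, not `arms-core`'s actual definitions:

```rust
// Hypothetical sketch of ARMS-style ports as Rust traits.
// Names and signatures are illustrative, not arms-core's real API.

type Id = u64;

trait Place {
    /// Store a payload at a coordinate; return its id.
    fn place(&mut self, point: Vec<f32>, data: Vec<u8>) -> Id;
}

trait Near {
    /// Return the ids of the k points closest to `point`.
    fn near(&self, point: &[f32], k: usize) -> Vec<Id>;
}

/// A minimal in-memory adapter implementing both ports with a flat scan.
struct FlatMemory {
    points: Vec<(Id, Vec<f32>, Vec<u8>)>,
    next_id: Id,
}

impl Place for FlatMemory {
    fn place(&mut self, point: Vec<f32>, data: Vec<u8>) -> Id {
        let id = self.next_id;
        self.next_id += 1;
        self.points.push((id, point, data));
        id
    }
}

impl Near for FlatMemory {
    fn near(&self, point: &[f32], k: usize) -> Vec<Id> {
        // Score every stored point by squared Euclidean distance,
        // then return the ids of the k closest.
        let mut scored: Vec<(f32, Id)> = self
            .points
            .iter()
            .map(|(id, p, _)| {
                let d: f32 = p.iter().zip(point).map(|(a, b)| (a - b) * (a - b)).sum();
                (d, *id)
            })
            .collect();
        scored.sort_by(|a, b| a.0.partial_cmp(&b.0).unwrap());
        scored.into_iter().take(k).map(|(_, id)| id).collect()
    }
}

fn main() {
    let mut store = FlatMemory { points: Vec::new(), next_id: 0 };
    let a = store.place(vec![0.0, 0.0], b"a".to_vec());
    let _b = store.place(vec![10.0, 10.0], b"b".to_vec());
    // A query near the origin should find point `a` first.
    assert_eq!(store.near(&[0.1, 0.1], 1), vec![a]);
}
```

Because the ports are traits, swapping the flat scan for a smarter index (such as the HAT adapter) changes only the `Near` implementation, not the callers.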

## The Hippocampus Analogy

ARMS functions as an artificial hippocampus for AI systems:

![Hippocampus Analogy](paper/figures/fig05_hippocampus.jpg)

| Hippocampus | ARMS |
|-------------|------|
| Encodes episodic memories | Stores attention states |
| Spatial navigation | High-dimensional proximity |
| Pattern completion | Near queries |
| Memory consolidation | Merge operations |
| Place cells | Points at coordinates |

## Ecosystem

![ARMS Ecosystem](paper/figures/fig07_ecosystem.jpg)

### Related Crates

- [`arms-hat`](https://crates.io/crates/arms-hat) - Hierarchical Attention Tree index adapter (100% recall, 70x faster than HNSW)

### Planned Adapters

- `arms-nvme` - Persistent storage via memory-mapped files
- `arms-distributed` - Sharded storage across machines
- `arms-gpu` - CUDA-accelerated similarity search
- `arms-py` - Python bindings

## Proximity Functions

Built-in proximity measures:

- **Cosine** - Angle between vectors (semantic similarity)
- **Euclidean** - Straight-line distance
- **DotProduct** - Raw dot product
- **Manhattan** - L1 distance
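
The four measures above can be written directly over slices. These are plain-Rust sketches of the underlying math, independent of the crate's own implementations:

```rust
// Plain-Rust sketches of the four built-in proximity measures.

fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn cosine(a: &[f32], b: &[f32]) -> f32 {
    // Cosine of the angle between a and b: dot product over norms.
    dot(a, b) / (dot(a, a).sqrt() * dot(b, b).sqrt())
}

fn euclidean(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| (x - y) * (x - y)).sum::<f32>().sqrt()
}

fn manhattan(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| (x - y).abs()).sum()
}

fn main() {
    let a = [3.0, 0.0];
    let b = [0.0, 4.0];
    assert_eq!(dot(&a, &b), 0.0);       // orthogonal vectors
    assert_eq!(cosine(&a, &b), 0.0);    // 90-degree angle
    assert_eq!(euclidean(&a, &b), 5.0); // 3-4-5 triangle
    assert_eq!(manhattan(&a, &b), 7.0); // |3| + |4|
}
```

Note the distinction in orientation: cosine and dot product grow with similarity, while Euclidean and Manhattan grow with distance.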

## Installation

```toml
[dependencies]
arms-core = "0.1"
```

## Paper

The research paper is available in the [`paper/`](paper/) directory.

**ARMS: A Spatial Memory Fabric for AI Systems**
Andrew Young, 2026

## License

MIT License - see [LICENSE](LICENSE)

## Citation

If you use ARMS in research, please cite:

```bibtex
@article{young2026arms,
  author = {Young, Andrew},
  title = {ARMS: A Spatial Memory Fabric for AI Systems},
  journal = {arXiv preprint},
  year = {2026},
  url = {https://github.com/automate-capture/arms}
}
```

## Author

Andrew Young - [andrew@automate-capture.com](mailto:andrew@automate-capture.com)