---
tags:
- memory-augmented
- decision-trees
- associative-memory
- classics-revival
- experimental
license: apache-2.0
library_name: pytorch
---
# Memory Forest - The Classics Revival
**Associative Memory with Learned Routing Through Decision Trees**
**Experimental Research Code**: functional but unoptimized; expect rough edges
## What Is This?
Memory Forest combines decision tree routing with associative hash buckets to create a scalable memory system. Instead of searching all memory linearly, learned decision trees route queries to the most relevant memory buckets.
**Core Innovation**: Trees learn to route based on retrieval success, creating adaptive memory organization that gets better over time.
## Architecture Highlights
- **Learned Routing**: Decision trees adapt based on retrieval performance
- **Associative Storage**: Hash buckets with similarity-based clustering
- **Ensemble Retrieval**: Multiple trees vote on best memories
- **Eviction Policies**: Automatic memory management
- **Scalable Design**: O(log n) routing instead of O(n) search
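
The routing idea above can be pictured with a minimal sketch. The class and names below (`TreeRouter`, `route`) are illustrative assumptions, not the repo's actual API: each internal node holds a hyperplane, and a query descends by the sign of its projection, reaching one of `2**depth` leaf buckets in O(depth) dot products instead of scanning every stored item.

```python
import torch

class TreeRouter:
    """Illustrative sketch (not the repo's API): one decision tree that
    routes vectors to buckets via random hyperplane splits."""

    def __init__(self, input_dim: int, depth: int, seed: int = 0):
        g = torch.Generator().manual_seed(seed)
        self.depth = depth
        # One hyperplane per internal node, stored heap-style by level.
        self.planes = torch.randn(2**depth - 1, input_dim, generator=g)

    def route(self, x: torch.Tensor) -> int:
        node = 0
        for _ in range(self.depth):
            go_right = torch.dot(self.planes[node], x) > 0
            node = 2 * node + (2 if go_right else 1)  # heap child index
        return node - (2**self.depth - 1)  # leaf position == bucket id

router = TreeRouter(input_dim=64, depth=4)
bucket = router.route(torch.randn(64))  # one of 16 buckets
```

In Memory Forest the hyperplanes are additionally adapted from retrieval feedback rather than staying random, which is what makes the organization improve over time.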
## Quick Start
```python
import torch
from memory_forest import MemoryForest

# Create memory system
forest = MemoryForest(
    input_dim=64,
    num_trees=3,
    max_depth=4,
    bucket_size=32,
)

# Store some memories
features = torch.randn(100, 64)
forest.store(features)

# Retrieve similar items
query = torch.randn(5, 64)
results = forest.retrieve(query, top_k=3)
```
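
The ensemble side of retrieval (multiple trees voting on the best memories) can be sketched as follows. The function name and shapes are assumptions for illustration: each tree routes the query to one bucket, the candidates are pooled, and cosine similarity ranks the final results.

```python
import torch

def ensemble_retrieve(query, trees, buckets, top_k=3):
    """Sketch (assumed shapes, not the repo's API): pool candidates from
    each tree's chosen bucket, then rank by cosine similarity."""
    candidates = [buckets[route(query)] for route in trees]  # route -> bucket id
    pool = torch.cat(candidates, dim=0)                      # (n_candidates, dim)
    sims = torch.cosine_similarity(pool, query.unsqueeze(0), dim=1)
    top = sims.topk(min(top_k, pool.shape[0]))
    return pool[top.indices], top.values
```

Because each tree only exposes one bucket, the candidate pool stays small even when total memory is large; disagreement between trees widens the pool, which is the "voting" effect.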
## Current Status
- **Working**: Hierarchical storage, associative clustering, tree routing, ensemble retrieval
- **Rough Edges**: Adaptive learning can be overly aggressive; bucket utilization needs optimization
- **Still Missing**: Learning rate scheduling, advanced eviction policies, distributed routing
- **Performance**: Strong retrieval quality (0.95+ similarity); the learning components still need tuning
- **Memory Usage**: Not optimized, expect high RAM usage
- **Speed**: Baseline implementation, significant optimization possible
## Mathematical Foundation
The decision trees learn routing functions f: ℝ^d → {0, 1, …, B−1}, where B is the number of buckets. Trees are updated from retrieval success using simple gradient-free optimization:
```python
success_rate = retrieved_similarity / max_possible_similarity
# update is proportional to success along the routing path taken
tree_update = learning_rate * success_rate * path_taken
```
Associative buckets use learnable hash functions with Hebbian-style updates for similarity clustering.
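A Hebbian-style update of this kind can be sketched in a few lines. The function below is an illustrative assumption (not the repo's implementation): after an item lands in a bucket, the bucket's hash projection is nudged toward the item's direction and renormalized, so similar items increasingly hash to the same bucket.

```python
import torch

def hebbian_hash_update(hash_weights, item, bucket_id, lr=0.05):
    """Sketch (assumed API): pull the chosen bucket's hash projection
    toward the stored item, Hebbian-style, then renormalize."""
    direction = item / (item.norm() + 1e-8)
    hash_weights[bucket_id] = hash_weights[bucket_id] + lr * direction
    hash_weights[bucket_id] = hash_weights[bucket_id] / (
        hash_weights[bucket_id].norm() + 1e-8
    )
    return hash_weights
```

Renormalizing keeps the projections bounded, which plays the same stabilizing role that weight decay does in standard Hebbian learning.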
## Research Applications
- **Large-scale retrieval systems**
- **Adaptive memory architectures**
- **Decision tree optimization**
- **Associative memory research**
- **Hierarchical clustering**
## Installation
```bash
pip install torch numpy
# Download memory_forest.py from this repo
```
## The Classics Revival Collection
Memory Forest is part of a larger exploration of foundational algorithms enhanced with modern neural techniques:
- Evolutionary Turing Machine
- Hebbian Bloom Filter
- Hopfield Decision Graph
- Liquid Bayes Chain
- Liquid State Space Model
- **Memory Forest** ← You are here
## Citation
```bibtex
@misc{memoryforest2025,
  title={Memory Forest: Associative Memory with Learned Routing},
  author={Jae Parker 𓅸 1990two},
  year={2025},
  note={Part of The Classics Revival Collection}
}
```