---
tags:
- bayesian-inference
- liquid-networks
- uncertainty-quantification
- classics-revival
- experimental
license: apache-2.0
library_name: pytorch
---
# Liquid Bayes Chain - The Classics Revival
**Probabilistic Control of Continuous Dynamics with Bayesian Feedback**
**Experimental Research Code** - Functional but unoptimized, expect rough edges
## What Is This?
Liquid Bayes Chain combines liquid neural networks with Bayesian inference to create a system where probabilistic confidence directly modulates continuous dynamics. The network's liquid state evolves based on Bayesian uncertainty, creating adaptive exploration-exploitation behavior.
**Core Innovation**: Bayesian confidence estimates control liquid time constants and dynamics, creating a feedback loop between probabilistic reasoning and continuous neural evolution.
## Architecture Highlights
- **Confidence-Modulated Dynamics**: Bayesian uncertainty controls liquid evolution speed
- **Adaptive Time Constants**: Neural dynamics adjust based on confidence levels
- **Probabilistic Feedback Loop**: Continuous dynamics inform Bayesian updates
- **Multi-Step Chain Processing**: Sequential confidence-guided evolution steps
- **Uncertainty Quantification**: Full probabilistic output with confidence measures
- **Exploration-Exploitation Balance**: High confidence → stability, low confidence → exploration
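The exploration-exploitation balance above can be sketched as a simple interpolation of the time constant by confidence. This is an illustrative assumption; `tau_min` and `tau_max` are hypothetical hyperparameters, not names from the repo:

```python
def modulated_tau(confidence, tau_min=0.5, tau_max=5.0):
    """Map confidence in [0, 1] to a liquid time constant.

    High confidence -> large tau (slow, stable dynamics);
    low confidence -> small tau (fast adaptation / exploration).
    tau_min and tau_max are assumed hyperparameters for illustration.
    """
    return tau_min + (tau_max - tau_min) * confidence
```

At full confidence the dynamics relax at the slowest rate (`tau_max`); at zero confidence they adapt at the fastest (`tau_min`).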
## Quick Start
```python
import torch
from liquid_bayes import LiquidBayesChain

# Create liquid-Bayesian system
model = LiquidBayesChain(
    input_dim=32,
    state_dim=64,
    output_dim=10,
    num_chain_steps=4
)

# Process input with uncertainty quantification
batch_size = 8
input_signal = torch.randn(batch_size, 32)  # matches input_dim=32
output = model(input_signal, return_chain_states=True)

# Get uncertainty information
uncertainty_info = model.predict_with_uncertainty(input_signal)
print(f"Confidence: {uncertainty_info['confidence'].mean():.3f}")
```
## Current Status
- **Working**: Liquid dynamics, Bayesian networks, confidence modulation, chain evolution, uncertainty quantification
- **Rough Edges**: No benchmarking on standard tasks, chain length optimization needed
- **Still Missing**: Advanced Bayesian structures, variational inference, distributed chain processing
- **Performance**: Good convergence on toy problems, needs validation on real tasks
- **Memory Usage**: Moderate, scales with chain length and state dimension
- **Speed**: Sequential chain processing, parallelization opportunities exist
## Mathematical Foundation
The liquid dynamics evolve according to:
```
dx/dt = -x/τ(confidence) + W_rec·σ(x) + W_in·u + noise(1-confidence)
```
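As a sketch, one forward-Euler step of this ODE might look like the following; the τ range, noise scale, and step size `dt` are assumed hyperparameters, not values confirmed from the repo:

```python
import torch

def liquid_step(x, u, confidence, W_rec, W_in, dt=0.05,
                tau_min=0.5, tau_max=5.0):
    """One Euler step of the confidence-modulated liquid ODE.

    tau grows with confidence (stability); the noise term shrinks
    with confidence (less exploration). Illustrative sketch only.
    """
    tau = tau_min + (tau_max - tau_min) * confidence          # tau(confidence)
    noise = 0.1 * (1.0 - confidence) * torch.randn_like(x)    # noise(1 - confidence)
    dx = -x / tau + torch.tanh(x) @ W_rec.T + u @ W_in.T + noise
    return x + dt * dx
```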
Bayesian confidence estimation uses:
```
P(belief|evidence) ∝ P(evidence|belief) × P(belief)
confidence = 1 - H(P(belief|evidence))
```
where H is the Shannon entropy of the posterior. High confidence leads to stable dynamics (large τ), while low confidence increases exploration through noise injection and faster adaptation.
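A minimal sketch of this confidence measure, assuming the entropy is normalized by its maximum log(K) over K classes so the result lands in [0, 1] (the text leaves the scaling implicit):

```python
import torch
import torch.nn.functional as F

def entropy_confidence(logits):
    """confidence = 1 - H(p) / H_max, with H the Shannon entropy.

    Normalizing by log(K), the maximum entropy over K outcomes,
    is an assumption made for illustration.
    """
    p = F.softmax(logits, dim=-1)
    entropy = -(p * torch.log(p.clamp_min(1e-12))).sum(dim=-1)
    max_entropy = torch.log(torch.tensor(float(logits.shape[-1])))
    return 1.0 - entropy / max_entropy
```

A uniform posterior yields confidence near 0; a sharply peaked one yields confidence near 1.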
The chain processes through multiple steps:
```
x_{t+1} = LiquidEvolution(x_t, u, confidence_t)
confidence_{t+1} = BayesianUpdate(x_{t+1})
```
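The recursion above can be sketched as a plain loop; `liquid_step` and `bayes_confidence` here are assumed callables standing in for the model's internal modules, and the neutral starting confidence of 0.5 is an assumption:

```python
import torch

def run_chain(x0, u, num_steps, liquid_step, bayes_confidence):
    """Alternate liquid evolution and Bayesian confidence updates."""
    x, conf = x0, torch.tensor(0.5)       # start from neutral confidence
    states = []
    for _ in range(num_steps):
        x = liquid_step(x, u, conf)       # x_{t+1} = LiquidEvolution(x_t, u, confidence_t)
        conf = bayes_confidence(x)        # confidence_{t+1} = BayesianUpdate(x_{t+1})
        states.append((x, conf))
    return x, conf, states
```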
## Research Applications
- **Adaptive control systems with uncertainty**
- **Robotics with confidence-aware planning**
- **Financial modeling with risk adaptation**
- **Autonomous systems requiring exploration-exploitation**
- **Scientific computing with adaptive dynamics**
## Installation
```bash
pip install torch numpy scipy
# Download liquid_bayes.py from this repo
```
## The Classics Revival Collection
Liquid Bayes Chain is part of a larger exploration of foundational algorithms enhanced with modern neural techniques:
- Evolutionary Turing Machine
- Hebbian Bloom Filter
- Hopfield Decision Graph
- **Liquid Bayes Chain** ← You are here
- Liquid State Space Model
- Möbius Markov Chain
- Memory Forest
## Citation
```bibtex
@misc{liquidbayes2025,
  title={Liquid Bayes Chain: Probabilistic Control of Continuous Dynamics},
  author={Jae Parker 𓅸 1990two},
  year={2025},
  note={Part of The Classics Revival Collection}
}
```