# MPKNet V6 - Bio-Inspired Visual Classification
A lightweight neural network inspired by the primate Lateral Geniculate Nucleus (LGN), implementing parallel Magnocellular (M), Parvocellular (P), and Koniocellular (K) pathways with Fibonacci-stride spatial sampling.
## Architecture
MPKNet V6 uses three parallel pathways with biologically-motivated stride ratios (2:3:5):
- P pathway (stride 2): Fine detail and edges, analogous to Parvocellular neurons (~80% of LGN)
- K pathway (stride 3): Context signals that generate gating modulation, analogous to Koniocellular neurons (~10% of LGN)
- M pathway (stride 5): Global structure and coarse features, analogous to Magnocellular neurons (~10% of LGN)
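The 2:3:5 stride ratio means each pathway samples the same input at a different spatial resolution. A minimal sketch of this parallel sampling (channel count and kernel size here are illustrative assumptions, not the actual MPKNet layer configuration):

```python
import torch
import torch.nn as nn

# Illustrative first-layer convolutions; 16 channels and 3x3 kernels are assumptions
p_conv = nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1)  # fine detail
k_conv = nn.Conv2d(3, 16, kernel_size=3, stride=3, padding=1)  # context / gating
m_conv = nn.Conv2d(3, 16, kernel_size=3, stride=5, padding=1)  # coarse structure

x = torch.randn(1, 3, 224, 224)
p, k, m = p_conv(x), k_conv(x), m_conv(x)
print(p.shape[-1], k.shape[-1], m.shape[-1])  # 112 75 45
```

For a 224x224 input this yields 112x112, 75x75, and 45x45 feature maps, so the three streams trade spatial detail for receptive-field coverage.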
The K-gating mechanism dynamically modulates P and M pathways via learned sigmoid gates, inspired by cross-stream modulation in biological vision.
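The gating idea can be sketched as follows: the K stream emits sigmoid gates that multiplicatively scale another pathway's features, resized to match its resolution. This is a hypothetical sketch of the mechanism described above, not the actual MPKNet implementation (the `KGate` name and the 1x1 projection are assumptions):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KGate(nn.Module):
    """Sketch of K-gating: the K stream produces sigmoid gates that
    multiplicatively modulate a target pathway's feature maps."""
    def __init__(self, ch):
        super().__init__()
        self.to_gate = nn.Conv2d(ch, ch, kernel_size=1)  # learned gate projection

    def forward(self, k_feat, target_feat):
        gate = torch.sigmoid(self.to_gate(k_feat))       # gate values in (0, 1)
        gate = F.interpolate(gate, size=target_feat.shape[-2:],
                             mode="bilinear", align_corners=False)
        return target_feat * gate                        # elementwise modulation

k_feat = torch.randn(1, 16, 75, 75)    # K pathway features (stride 3)
p_feat = torch.randn(1, 16, 112, 112)  # P pathway features (stride 2)
gated_p = KGate(16)(k_feat, p_feat)    # K-modulated P features
```

Because the gate values lie in (0, 1), the K stream can only attenuate, never amplify, the target pathway in this sketch.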
## Results
| Dataset | Classes | Accuracy | Parameters |
|---|---|---|---|
| Kvasir-v2 (GI endoscopy) | 8 | 89.2% | 0.21M |
| CIFAR-10 | 10 | 89.4% | 0.54M |
| CIFAR-100 | 100 | 58.8% | 0.22M |
| ImageNet-100 | 100 | 60.8% | 0.54M |
No pretraining. No augmentation. 161x fewer parameters than MobileNetV3-Small.
## Usage
```python
import torch

from mpknet_v6 import BinocularMPKNetV6
from mpknet_components import count_params

# Load model
model = BinocularMPKNetV6(num_classes=8, ch=48, use_stereo=True)
state_dict = torch.load("v6_kvasir_best.pth", map_location="cpu", weights_only=True)
model.load_state_dict(state_dict)
model.eval()
print(f"Parameters: {count_params(model)}")

# Inference
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)
    pred = logits.argmax(dim=1)
```
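To interpret the logits from the usage example, a standard softmax yields per-class probabilities (this step is generic PyTorch, not MPKNet-specific; the random logits below stand in for the model's output):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(1, 8)        # stand-in for model(x) output (8 Kvasir-v2 classes)
probs = F.softmax(logits, dim=1)  # convert logits to per-class probabilities
pred = int(probs.argmax(dim=1))   # predicted class index
confidence = float(probs.max())   # probability assigned to the predicted class
```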
## Files
- `v6_kvasir_best.pth` - Trained weights (Kvasir-v2, 8 classes, 2.1 MB)
- `mpknet_v6.py` - Model architecture
- `mpknet_components.py` - Shared components (RGCLayer, BinocularPreMPK, StereoDisparity, StridedMonocularBlock)
## Citation
D.J. Lougen, "MPKNet: Bio-Inspired Visual Classification with Parallel LGN Pathways", 2025.
## License
PolyForm Small Business License 1.0.0 - free for organizations with less than $100K in revenue, non-profits, and educational use.