# Evolved Qwen Hybrid v3
An experimental hybrid AI that integrates a Large Language Model with biologically inspired neuromorphic memory. The model combines structural SNN neurogenesis with Hebbian fast-weights to achieve rapid, on-the-fly learning from user interactions.
## 🧬 Neuromorphic Architecture
- SNN Reservoir (SELSC Engine): A 10,000-neuron liquid-state machine that captures structural context.
- Dopamine-Modulated STDP: Synaptic plasticity in the SNN is gated by a "Dopamine" signal derived from sentiment analysis and novelty (Surprise Factor).
- Hebbian Fast-Weights: Dynamic shadow matrices attached to the LLM's FFN layers, allowing for near-instantaneous association learning without backpropagation.
- Tiny Reward Expert: A prioritized linear expert that internalizes human feedback into a predictive reward model.
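The Hebbian fast-weight idea above can be sketched as follows. This is a minimal illustration, not the repository's implementation: the shapes, learning rate, and decay constant are made-up assumptions, and plain NumPy stands in for the JAX engine.

```python
import numpy as np

def hebbian_fast_update(F, x, y, lr=0.1, decay=0.95):
    """One Hebbian fast-weight step: decay old associations, then
    strengthen the outer product of the current input/output pair."""
    return decay * F + lr * np.outer(y, x)

def ffn_with_fast_weights(x, W, F):
    """FFN output plus the fast-weight 'shadow matrix' contribution."""
    return np.tanh(W @ x + F @ x)

d_in, d_out = 8, 8
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(d_out, d_in))  # slow (pretrained) weights, frozen
F = np.zeros((d_out, d_in))                    # fast weights start empty

x = rng.normal(size=d_in)
y = ffn_with_fast_weights(x, W, F)
F = hebbian_fast_update(F, x, y)  # association stored instantly, no backprop
```

The key property is that the update is a single local outer-product, so an association becomes retrievable on the very next forward pass within the session, while the slow weights `W` stay untouched.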
## 🚀 Quickstart
Clone the repository:

```bash
git clone https://huggingface.co/Specialgfhdhdh/learning-qwen
cd learning-qwen
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Run the interactive chat:

```bash
python run_hybrid.py
```
## 🧠 How it Learns
As you converse with the AI, the SELSC Engine builds a biological representation of the context. If you praise the AI, its Dopamine level rises, strengthening current synaptic patterns. If you correct it, Dopamine drops, suppressing incorrect associations. The Hebbian Layer ensures that these associations are immediately accessible in future turns of the same session.
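The dopamine gating described above can be sketched like this. The sentiment/surprise weighting and the baseline value are illustrative assumptions; the actual signal derivation lives in the repository's engine.

```python
import numpy as np

def dopamine_signal(sentiment, surprise, baseline=0.5):
    """Combine user sentiment (-1..1) and novelty/surprise (0..1) into a
    plasticity gate in [0, 1]; illustrative weighting, not the repo's formula."""
    return np.clip(baseline + 0.4 * sentiment + 0.2 * surprise, 0.0, 1.0)

def gated_stdp(w, pre, post, dopamine, lr=0.05, baseline=0.5):
    """Hebbian/STDP-style weight change scaled by (dopamine - baseline):
    praise (high dopamine) strengthens co-active synapses,
    correction (low dopamine) suppresses them."""
    return w + lr * (dopamine - baseline) * np.outer(post, pre)

pre = np.array([1.0, 0.0, 1.0])   # presynaptic activity
post = np.array([0.0, 1.0])       # postsynaptic activity
w = np.zeros((2, 3))

w_praise = gated_stdp(w, pre, post, dopamine_signal(+1.0, 0.0))   # strengthens
w_correct = gated_stdp(w, pre, post, dopamine_signal(-1.0, 0.0))  # suppresses
```

Note the sign flip around the baseline: the same co-activation pattern is either reinforced or weakened depending on whether the dopamine signal sits above or below it.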
## Credits
Built using Qwen-2.5-0.5B-Instruct as the cognitive core and a custom JAX-powered neuromorphic engine for structural memory.