I wrote a note on something I’ve been experimenting with: EqPropMomentum
It’s a new optimizer:
take Equilibrium Propagation gradients, then update parameters with classical momentum instead of plain vanilla gradient steps.
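The core update is just heavy-ball momentum applied to whatever gradient estimate EqProp hands you. A minimal sketch (the `grad` here is a stand-in for an EqProp contrastive-phase gradient estimate, not an actual EqProp implementation; names and hyperparameters are illustrative):

```python
import numpy as np

def momentum_step(theta, grad, velocity, lr=0.01, mu=0.9):
    """Classical (heavy-ball) momentum update.

    theta:    parameter array
    grad:     gradient estimate (in EqPropMomentum this would come
              from the EqProp free/nudged phase difference)
    velocity: running momentum buffer, same shape as theta
    """
    velocity = mu * velocity - lr * grad
    theta = theta + velocity
    return theta, velocity

# toy usage on a quadratic loss 0.5 * theta^2 (so grad = theta)
theta = np.array([1.0])
v = np.zeros_like(theta)
for _ in range(100):
    grad = theta  # stand-in for an EqProp gradient estimate
    theta, v = momentum_step(theta, grad, v)
```

The point is that the velocity buffer averages over successive gradient estimates, which is exactly what helps when the per-step estimates are noisy, as EqProp’s tend to be.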
Why I cared:
predictive coding / EqProp-style methods are interesting because they move away from standard backprop assumptions, but they often feel slow, noisy, and hard to scale.
So this was my attempt at a small practical bridge: keep the energy-based flavor, improve the optimization behavior.
I put together the intuition, math, code, and experiments here: https://teendifferent.substack.com/p/the-revival-of-predictive-coding
Would love feedback from anyone working on predictive coding, biologically plausible learning, or energy-based training ✌️