# Phase-Induced Coherence-Gated Gradient Descent
This repository contains checkpoint files for the research prototype Phase-Induced Coherence-Gated Gradient Descent (PIC-GD).
The project tests whether neural network training improves when latent representations are treated as complex signals and the training signal is modulated by phase coherence between model and reference representations.
## Included Checkpoints
- `baseline_best.pt`
- `complex_best.pt`
- `align_best.pt`
- `full_best.pt`
- `run_config.json`
The checkpoint names correspond to these training variants:
| File | Variant |
|---|---|
| `baseline_best.pt` | Standard real-valued baseline |
| `complex_best.pt` | Complex latent representation only |
| `align_best.pt` | Complex latent representation with amplitude and phase alignment losses |
| `full_best.pt` | Alignment losses plus coherence-gated gradient scaling |
## Training Setup
These checkpoints were produced from the MedleyDB sample experiment configuration stored in run_config.json.
Key settings:
- dataset: `medleydb_sample`
- batch size: 16
- epochs: 20
- learning rate: 1e-3
- hidden dim: 128
- embed dim: 64
- sample rate: 44100 Hz
- segment length: 2.0 s
- gate warmup: 5 epochs
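`run_config.json` presumably mirrors the settings above; a hedged sketch of its likely shape (the field names are illustrative assumptions, not verified against the actual file):

```json
{
  "dataset": "medleydb_sample",
  "batch_size": 16,
  "epochs": 20,
  "learning_rate": 0.001,
  "hidden_dim": 128,
  "embed_dim": 64,
  "sample_rate": 44100,
  "segment_length": 2.0,
  "gate_warmup_epochs": 5
}
```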
The current evaluation protocol is same-track segment classification: training and evaluation segments are drawn from the same tracks, so results do not measure held-out-track or held-out-artist generalization.
## Method Summary
Latent activations are modeled as complex signals with an amplitude and a phase. A per-sample coherence score measures how well the model's phases align with the reference phases, and the gradient update for each sample is scaled by that score.
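A plausible sketch of these quantities in LaTeX (the exact definitions live in the project code; the mean-phasor coherence and the multiplicative gate below are assumptions, not the verified formulas):

```latex
% Latent activation as a complex signal (amplitude a, phase \phi):
z = a \, e^{i\phi}

% Assumed coherence score: mean resultant length of the phase
% differences between model and reference representations:
C = \left| \frac{1}{N} \sum_{n=1}^{N}
      e^{i\left(\phi_n^{\mathrm{model}} - \phi_n^{\mathrm{ref}}\right)} \right|,
\qquad 0 \le C \le 1

% Assumed coherence-gated update: the gate g(C)
% (e.g. g(C) = C after warmup) scales the per-sample gradient:
\theta \leftarrow \theta - \eta \, g(C) \, \nabla_\theta \mathcal{L}
```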
The full method combines:
- complex latent representation
- amplitude alignment loss
- phase alignment loss
- coherence-gated per-sample training signal
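The coherence gate in the last bullet can be illustrated with a small self-contained sketch; the mean-phasor coherence and the linear warmup blend below are assumptions, not the project's exact implementation:

```python
import cmath

def coherence(model_phases, ref_phases):
    """Mean resultant length of the phase differences: 1.0 means perfectly
    phase-locked, near 0.0 means incoherent. (Assumed form of the score.)"""
    phasors = [cmath.exp(1j * (pm - pr))
               for pm, pr in zip(model_phases, ref_phases)]
    return abs(sum(phasors) / len(phasors))

def gated_gradient(grad, c, warmup_frac=1.0):
    """Scale a per-sample gradient by the coherence score c, blended in
    gradually during gate warmup (warmup_frac ramps from 0 to 1)."""
    gate = (1.0 - warmup_frac) + warmup_frac * c
    return [g * gate for g in grad]

# Identical phases -> coherence 1.0, gradient passes through unchanged.
c_hi = coherence([0.1, 0.5, 1.2], [0.1, 0.5, 1.2])
# Phase differences spread 120 degrees apart -> coherence near 0.0.
c_lo = coherence([0.0, 2.094, 4.189], [0.0, 0.0, 0.0])
```

With aligned phases the gate passes gradients through essentially unchanged; with scattered phases it suppresses them, so incoherent samples contribute less to the update.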
## Loading
These files are raw PyTorch state dict checkpoints. Load them with the corresponding model definition from the project code:
- baseline: `BaselineModel`
- complex / align / full: `PhaseModel`
Example:

```python
import torch

# Raw state dict; instantiate the matching model class before loading it.
state = torch.load("full_best.pt", map_location="cpu")
```
You will also need the architecture definitions from the source repository:

- `phase_coherence_test.py`
- `datasets.py`
## Limitations
- Research prototype, not production-ready.
- Checkpoints are architecture-specific PyTorch state dicts.
- The current MedleyDB setup evaluates same-track learning only.
- Results should be interpreted as exploratory evidence rather than a finalized benchmark.
## Source
Project repository: `jzgdev/phase-induced-coherence-gated-gradient-descent`
## License
MIT