Research Publications

Peer-reviewed papers and preprints from the Phronesis Analytics research team.

Preprint — Under Review

Spectral Alignment: Engineering the Fourier Path to Generalization in Neural Networks

Nathan Rigoni · Phronesis Analytics · March 12, 2026

We present Spectral Alignment, a framework that transforms the "grokking" phase transition from a stochastic wait into an engineered process by targeting the spectral utilization gap: the unused high-frequency bandwidth that standard gradient descent fails to recruit due to spectral bias. We introduce Fourier Gradient Projection (FGP) and Prescribed Fourier Frequency Training (PFFT), which steer token embedding gradients toward capacity-optimal near-Nyquist modes. Across 42 experimental runs (14 variants × 3 seeds), prescribing near-Nyquist modes {30, 35, 40, 45, 48} for p = 97 modular addition accelerates generalization by 92.7% (57 vs. 782 epochs-to-grokking) and reduces the memorization phase by 97.9% (9 vs. 451 epochs). We introduce the Natural Ordering Condition (NOC), which predicts exactly when Fourier steering is beneficial and when it is destructive. We also introduce the Spectral Transformer (ST-1), which achieves a terminal BPC of 0.026 versus the baseline's 1.864 on character-level TinyStories—a 71.5× gap—via a rapid "spectral snap" analogous to the grokking transition.
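The paper's exact FGP operator is not reproduced here, but the core idea of steering embedding gradients toward a prescribed set of Fourier modes can be sketched as a spectral projection. The sketch below is illustrative only: it assumes projection along the token axis of a (p × d) embedding-gradient matrix, and the function name `project_gradient_to_modes` is hypothetical, not from the paper.

```python
import numpy as np

def project_gradient_to_modes(grad, modes):
    """Zero out all Fourier components of a gradient except a prescribed
    set of modes, projecting along the token axis (axis 0).

    grad  : (p, d) array, gradient w.r.t. the token embedding table
    modes : frequency indices to keep (all others are suppressed)
    """
    p = grad.shape[0]
    spectrum = np.fft.rfft(grad, axis=0)        # (p//2 + 1, d) complex modes
    keep = np.zeros(spectrum.shape[0], dtype=bool)
    keep[list(modes)] = True
    spectrum[~keep] = 0.0                       # retain only prescribed modes
    return np.fft.irfft(spectrum, n=p, axis=0)  # back to token space, (p, d)

# Illustration: p = 97 modular-addition vocabulary with the near-Nyquist
# mode set {30, 35, 40, 45, 48} named in the abstract.
rng = np.random.default_rng(0)
g = rng.standard_normal((97, 16))
g_proj = project_gradient_to_modes(g, modes={30, 35, 40, 45, 48})
```

For p = 97 the real FFT yields frequency indices 0 through 48, so the prescribed set sits at the top of the available band, consistent with the "near-Nyquist" framing in the abstract.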