Mitigating Exponential Mixed Frequency Growth through Frequency Selection
arXiv:2508.10533v5 Announce Type: replace-cross
Abstract: Angle encoding has emerged as a popular feature map for embedding classical data into quantum models, naturally generating truncated Fourier series with universal function approximation capabilities. Despite this expressive capability, practical training faces significant challenges. Through controlled experiments with white-box target functions, we demonstrate that training failures can occur even when all established parameter sufficiency conditions are satisfied. Building on the redundancy-gradient framework of Duffy and Jastrzebski, we provide systematic experimental evidence that non-unique frequencies dominate the gradient landscape and crowd out target frequencies -- a burden that grows exponentially with encoding depth under unary encoding. Small-angle initialization mitigates this in one-dimensional settings but fails to scale to higher dimensions, where even ternary encoding -- which minimizes per-frequency redundancy -- faces intractable combinatorial growth of unique frequency tuples regardless of initialization or optimizer choice. We introduce frequency selection as a principled solution that restricts the model spectrum to only those frequencies present in the target function. For two-dimensional targets, frequency selection achieves near-optimal performance (median $R^2 \approx 0.95$) where dense approaches struggle, and remains tractable at high-frequency magnitudes where dense approaches fail entirely (median $R^2 \approx 0.85$). Validation on a real-world dataset confirms the approach transfers beyond synthetic settings.
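The core idea of frequency selection — fit only the frequencies actually present in the target's truncated Fourier series instead of a dense spectrum — can be illustrated with a classical analogue. The sketch below is not the paper's quantum model (which uses angle-encoded circuits trained by gradient descent); it is a plain least-squares fit of a sparse Fourier basis to a hypothetical white-box 1-D target, with made-up frequencies and coefficients chosen purely for illustration.

```python
import numpy as np

# Classical illustration of frequency selection (not the paper's quantum model):
# restrict the fitted basis to the frequencies present in the target function.
rng = np.random.default_rng(0)

# Hypothetical white-box target: a sparse truncated Fourier series.
coeffs = {1: (0.8, -0.3), 3: (0.5, 0.2), 7: (-0.4, 0.6)}  # k -> (a_k, b_k)

def target(x):
    return sum(a * np.cos(k * x) + b * np.sin(k * x)
               for k, (a, b) in coeffs.items())

def design(x, freqs):
    # Feature matrix with a constant column plus [cos(kx), sin(kx)] per frequency.
    cols = [np.ones_like(x)]
    for k in freqs:
        cols += [np.cos(k * x), np.sin(k * x)]
    return np.stack(cols, axis=1)

x = rng.uniform(-np.pi, np.pi, 200)
y = target(x)

# "Frequency selection": fit only the target's own frequencies {1, 3, 7}.
A = design(x, sorted(coeffs))
w, *_ = np.linalg.lstsq(A, y, rcond=None)
r2 = 1.0 - np.sum((A @ w - y) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"selected-spectrum R^2: {r2:.4f}")
```

Because the restricted basis exactly spans the noiseless target, the fit recovers it essentially perfectly; the dense alternative would instead carry every frequency up to a cutoff, and in higher dimensions the number of frequency tuples in that dense spectrum grows combinatorially, which is the burden the abstract describes.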