Activation Saturation and Floquet Spectrum Collapse in Neural ODEs
arXiv:2604.00543v1 Announce Type: cross
Abstract: We prove that activation saturation imposes a structural dynamical limitation on autonomous Neural ODEs $\dot{h}=f_\theta(h)$ with saturating activations ($\tanh$, sigmoid, etc.): if $q$ hidden layers of the MLP $f_\theta$ satisfy $|\sigma'|\le\delta$ on a region~$U$, the input Jacobian is attenuated as $\|Df_\theta(x)\|\le C(U)$ (for activations with $\sup_{x}|\sigma'(x)|\le 1$, e.g.\ $\tanh$ and sigmoid, this reduces to $C_W\delta^q$), forcing every Floquet (Lyapunov) exponent along any $T$-periodic orbit $\gamma\subset U$ into the interval $[-C(U),\;C(U)]$. This is a collapse of the Floquet spectrum: as saturation deepens ($\delta\to 0$), all exponents are driven to zero, precluding both strong contraction and chaotic sensitivity. The obstruction is structural -- it constrains the learned vector field at inference time, independent of training quality. As a secondary contribution, for activations with $\sigma'>0$, a saturation-weighted spectral factorisation yields a refined bound $\widetilde{C}(U)\le C(U)$ whose improvement is amplified exponentially in~$T$ at the flow level. All results are numerically illustrated on the Stuart--Landau oscillator; the bounds provide a theoretical explanation for the empirically observed failure of $\tanh$-NODEs on the Morris--Lecar neuron model.
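The layer-wise attenuation bound $\|Df_\theta(x)\|\le C_W\delta^q$ is easy to check numerically. Below is a minimal sketch (not the paper's code) for a hypothetical two-hidden-layer $\tanh$ MLP with random weights: $\delta$ is taken as the largest $|\tanh'|$ over the pre-activations at a strongly saturated input, $C_W$ as the product of the spectral norms of the weight matrices, and the Jacobian is assembled via the chain rule. The bound then holds by submultiplicativity of the operator norm, since each derivative diagonal has norm at most $\delta$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vector field f(x) = W2 tanh(W1 tanh(W0 x)), q = 2 hidden layers
d = 4
W0 = rng.normal(size=(d, d)) / np.sqrt(d)
W1 = rng.normal(size=(d, d)) / np.sqrt(d)
W2 = rng.normal(size=(d, d)) / np.sqrt(d)

def jacobian(x):
    """Df(x) = W2 . diag(tanh'(a1)) . W1 . diag(tanh'(a0)) . W0, tanh' = 1 - tanh^2."""
    a0 = W0 @ x
    a1 = W1 @ np.tanh(a0)
    D0 = np.diag(1.0 - np.tanh(a0) ** 2)
    D1 = np.diag(1.0 - np.tanh(a1) ** 2)
    return W2 @ D1 @ W1 @ D0 @ W0

# Input pushed deep into a saturated region U: large pre-activations, small |tanh'|
x = 10.0 * np.ones(d)
a0 = W0 @ x
a1 = W1 @ np.tanh(a0)
delta = max(np.max(1.0 - np.tanh(a0) ** 2),
            np.max(1.0 - np.tanh(a1) ** 2))          # sup of |sigma'| at x
C_W = (np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)
       * np.linalg.norm(W0, 2))                      # product of spectral norms

J_norm = np.linalg.norm(jacobian(x), 2)
print(f"||Df|| = {J_norm:.3e}  vs  bound C_W * delta^2 = {C_W * delta**2:.3e}")
```

Both Floquet exponents of any periodic orbit confined to such a region are then trapped in $[-C_W\delta^q, C_W\delta^q]$, since the monodromy matrix is a product of matrices whose logarithmic norms obey the same bound.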