Exact Gaussian Moment Matching for Residual Networks: a Second-Order Method
arXiv:2601.22307v2 Announce Type: replace
Abstract: We study the problem of propagating the mean and covariance of a general multivariate Gaussian distribution through a deep (residual) neural network using layer-by-layer moment matching. We close a long-standing gap by deriving exact moment matching for the probit, GeLU, ReLU (as a limit of GeLU), Heaviside (as a limit of probit), and sine activation functions, for both feedforward and generalized residual layers. On random networks, we find orders-of-magnitude improvements in the KL divergence error metric, up to a millionfold, over popular alternatives. On a variational Bayes neural network, we show that our method attains hundredfold improvements in KL divergence from Monte Carlo ground truth over a state-of-the-art deterministic inference method. We also give a smooth-distance error bound showing that, under regularity assumptions, moment matching removes the leading low-variance errors and propagates higher-order local accuracy through the layers of a network.
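To illustrate the kind of closed-form moment propagation the abstract describes, here is a minimal sketch for the univariate case: the exact mean and variance of ReLU(X) for a Gaussian input X, checked against Monte Carlo. These are the classical univariate formulas (mean = μΦ(μ/σ) + σφ(μ/σ), with φ and Φ the standard normal pdf and cdf), not the paper's multivariate residual-layer derivation; function and variable names are illustrative.

```python
import math
import random

def relu_moments(mu, sigma):
    """Exact mean and variance of ReLU(X) for X ~ N(mu, sigma^2).

    Classical univariate closed forms, shown here as a sketch of
    moment matching; the paper's method handles the full multivariate
    covariance through residual layers.
    """
    a = mu / sigma
    phi = math.exp(-0.5 * a * a) / math.sqrt(2.0 * math.pi)  # standard normal pdf at a
    Phi = 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))         # standard normal cdf at a
    mean = mu * Phi + sigma * phi
    second = (mu * mu + sigma * sigma) * Phi + mu * sigma * phi
    return mean, second - mean * mean

# Monte Carlo check of the closed form.
random.seed(0)
mu, sigma = 0.5, 1.0
samples = [max(random.gauss(mu, sigma), 0.0) for _ in range(200_000)]
mc_mean = sum(samples) / len(samples)
mc_var = sum((s - mc_mean) ** 2 for s in samples) / len(samples)

mean, var = relu_moments(mu, sigma)
```

In a layer-by-layer scheme of the kind the abstract describes, such closed-form moments replace sampling entirely: each layer maps an input (mean, covariance) pair to an output pair, which is why the deterministic pass can be compared against Monte Carlo ground truth in KL divergence.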