Universality in Deep Neural Networks: An approach via the Lindeberg exchange principle

arXiv:2605.02771v1 Announce Type: cross

Abstract: We consider the infinite-width limit of a fully connected deep neural network with general weights, and we prove general quantitative bounds on the $2$-Wasserstein distance between the network and its infinite-width Gaussian limit, under appropriate regularity assumptions on the activation function. Our main tool is a Lindeberg principle for deep neural networks, which we use to successively replace the weights in each layer by Gaussian random variables.
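As a rough numerical illustration of the exchange idea (not the paper's actual proof), the sketch below compares the scalar output of a small fully connected network with non-Gaussian (Rademacher) weights against the same architecture with Gaussian weights, using an empirical 1-Wasserstein distance as a crude stand-in for the 2-Wasserstein metric in the abstract. All names (`net_output`, `w1`, the chosen widths and sample counts) are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def net_output(x, widths, sampler):
    """One draw of a deep net's scalar output.

    Weights are i.i.d. from `sampler` (mean 0, variance 1),
    with the usual 1/sqrt(fan_in) scaling and tanh activation.
    """
    h = x
    for n_out in widths:
        W = sampler((n_out, h.size))
        h = np.tanh(W @ h / np.sqrt(h.size))
    # scalar readout layer
    w = sampler((1, h.size))
    return (w @ h / np.sqrt(h.size)).item()

def w1(a, b):
    """Empirical 1-Wasserstein distance between two equal-size samples."""
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

# Two weight laws with the same mean and variance: the Lindeberg
# principle suggests the output laws should agree as width grows.
rademacher = lambda shape: rng.choice([-1.0, 1.0], size=shape)
gaussian = lambda shape: rng.standard_normal(shape)

x = np.ones(8)  # fixed input; dimension chosen arbitrarily
dists = []
for width in (8, 32, 128):
    out_r = np.array([net_output(x, (width, width), rademacher)
                      for _ in range(1000)])
    out_g = np.array([net_output(x, (width, width), gaussian)
                      for _ in range(1000)])
    dists.append(w1(out_r, out_g))

# Distances typically shrink as the width grows, consistent with
# a quantitative universality bound (up to Monte Carlo noise).
print(dists)
```

This only probes the marginal law of a single output coordinate; the paper's bounds concern the full network and its Gaussian-process limit, and the successive layer-by-layer weight replacement happens inside the proof rather than in any simulation.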
