cs.LG, stat.ML

Beyond ReLU: How Activations Affect Neural Kernels and Random Wide Networks

arXiv:2506.22429v2 Announce Type: replace
Abstract: In recent years, the neural tangent kernel (NTK) and the neural network Gaussian process (NNGP) kernel have given theoreticians tractable limiting cases of fully connected neural networks. However, the p…
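For context on the objects the abstract names: for an infinitely wide fully connected ReLU network, both kernels admit a well-known closed-form layer recursion, with the activation expectation given by the arc-cosine kernel (Cho & Saul) and the NTK built from the NNGP recursion (Jacot et al.). Below is a minimal NumPy sketch of that standard ReLU recursion, not the paper's own code; the function name relu_nngp_ntk and the variance parameters sw2, sb2 are illustrative choices.

import numpy as np

def relu_nngp_ntk(x1, x2, depth=3, sw2=2.0, sb2=0.0):
    """Scalar NNGP and NTK values for a pair of inputs under an
    infinitely wide ReLU MLP (standard recursion; a sketch only)."""
    d = x1.shape[0]
    # Layer-0 (input) kernel entries, with weight/bias variances sw2/sb2
    k11 = sb2 + sw2 * (x1 @ x1) / d
    k22 = sb2 + sw2 * (x2 @ x2) / d
    k12 = sb2 + sw2 * (x1 @ x2) / d
    ntk = k12
    for _ in range(depth):
        # Angle between the two inputs under the current-layer kernel
        cos_t = np.clip(k12 / np.sqrt(k11 * k22), -1.0, 1.0)
        theta = np.arccos(cos_t)
        # Closed-form Gaussian expectations for ReLU (arc-cosine kernel):
        # E[phi(u)phi(v)] and E[phi'(u)phi'(v)]
        ev = (np.sqrt(k11 * k22) / (2 * np.pi)) * (np.sin(theta) + (np.pi - theta) * cos_t)
        ed = (np.pi - theta) / (2 * np.pi)
        k12_new = sb2 + sw2 * ev
        # Diagonal entries: E[relu(u)^2] = k/2 for u ~ N(0, k)
        k11 = sb2 + sw2 * k11 / 2
        k22 = sb2 + sw2 * k22 / 2
        # NTK recursion: Theta^{l+1} = K^{l+1} + Theta^l * sw2 * E[phi'phi']
        ntk = k12_new + ntk * sw2 * ed
        k12 = k12_new
    return k12, ntk

# Example: kernel values for two unit vectors in R^8
rng = np.random.default_rng(0)
a, b = rng.normal(size=8), rng.normal(size=8)
a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
nngp, ntk = relu_nngp_ntk(a, b, depth=3)
print(nngp, ntk)

Changing the activation changes only the two Gaussian expectations in the loop, which is the lever the paper's title refers to when it asks how activations shape the resulting kernels.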