cs.LG, math.OC

Lower Bounds and Proximally Anchored SGD for Non-Convex Minimization Under Unbounded Variance

arXiv:2604.16620v1 Announce Type: new
Abstract: Analysis of Stochastic Gradient Descent (SGD) and its variants typically relies on the assumption of uniformly bounded variance, a condition that frequently fails in practical non-convex settings, such a…
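Since the abstract is truncated, the paper's exact algorithm is not visible here; the sketch below is a generic illustration of the "proximal anchoring" idea the title names, not the authors' method. It runs plain SGD but adds a proximal pull `lam * (x - anchor)` toward a periodically refreshed anchor point, which is one common way to stabilize stochastic steps when gradient noise may be large. All names (`prox_anchored_sgd`, `anchor_every`, `lam`) are illustrative assumptions.

```python
import numpy as np

def prox_anchored_sgd(grad_fn, x0, lr=0.1, lam=1.0, steps=100,
                      anchor_every=10, seed=0):
    """Generic SGD with a proximal anchor term (illustrative sketch only).

    Each step uses the stochastic gradient plus the gradient of a
    proximal penalty (lam/2) * ||x - anchor||^2; the anchor is reset
    to the current iterate every `anchor_every` steps.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    anchor = x.copy()
    for k in range(steps):
        g = grad_fn(x, rng)              # stochastic gradient oracle
        g = g + lam * (x - anchor)       # proximal pull toward the anchor
        x = x - lr * g
        if (k + 1) % anchor_every == 0:
            anchor = x.copy()            # refresh the anchor point
    return x

# Toy non-trivial test: f(x) = 0.5 * ||x||^2 with additive Gaussian
# gradient noise, so the true minimizer is the origin.
def noisy_grad(x, rng):
    return x + rng.standard_normal(x.shape)

x_final = prox_anchored_sgd(noisy_grad, np.ones(5))
print(np.linalg.norm(x_final))
```

On the toy quadratic the iterates settle near the origin despite the gradient noise; the anchor term damps the steps without requiring a uniform bound on the noise magnitude at any single iterate.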