A Near-optimal SQ Lower Bound for Smoothed Agnostic Learning of Boolean Halfspaces
arXiv:2605.02350v1 Announce Type: new
Abstract: We study the complexity of smoothed agnostic learning of halfspaces on $\{\pm 1\}^n$ under the uniform distribution in the model of \citet{KM25}, where each input coordinate is independently flipped with probability $\sigma \in (0, 1/2)$. We show that $L^1$ polynomial regression achieves complexity $\tilde{O}(n^{O(\log(1/\varepsilon)/\sigma)})$, and we prove a nearly matching Statistical Query lower bound of $n^{\Omega(\log(1+\sigma/\varepsilon^2)/\sigma)}$. This complements the recent work of \citet{DK26}, which established analogous bounds in the continuous setting under Gaussian marginals.