Sample Complexity of Stochastic Optimization with Integer Variables

arXiv:2605.07239v1

Abstract: We establish sample complexity results for stochastic optimization over the integers, with a view to understanding its complexity relative to the corresponding continuous optimization problem. We show that integer optimization can sometimes require strictly more samples and sometimes strictly fewer, depending on the structure of the objective and constraints.

1. For Lipschitz objectives over subsets of the $\ell_\infty$ ball, the statistical complexity of general stochastic mixed-integer, nonlinear, nonconvex optimization is exactly the same as that of stochastic linear optimization with only bound constraints.

2. For Lipschitz objectives over subsets of the $\ell_2$ ball, we show that integer optimization can require a strictly *smaller* sample size than the continuous setting in a certain regime. To obtain this result, we also establish tight sample complexity results for nonconvex continuous stochastic optimization which, to the best of our knowledge, do not appear in prior work.

3. For strongly convex, smooth objectives, integer optimization has higher statistical complexity than the continuous setting. In particular, we show that integer optimization requires $\Omega(1/\epsilon^2)$ samples to report an $\epsilon$-approximate solution, compared to the well-known $O(1/\epsilon)$ sample complexity from the continuous optimization literature.
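To make the setting concrete: the sample complexity question asks how many i.i.d. draws of the random parameter are needed so that minimizing the *empirical* average objective yields a near-minimizer of the *true* expected objective. A standard way to operationalize this is sample average approximation (SAA). The sketch below is a generic illustration of SAA over an integer feasible set, not the paper's construction; the objective $f(x,\xi)=(x-\xi)^2$, the candidate range, and the sample count are all chosen for the example.

```python
import random


def saa_integer_minimize(f, candidates, num_samples, sample, seed=0):
    """Sample average approximation over a finite integer candidate set.

    Draws `num_samples` i.i.d. realizations of the random parameter, then
    minimizes the empirical mean of f(x, xi) by enumerating the integer
    candidates. This is an illustrative sketch of the general framework,
    not the algorithm analyzed in the paper.
    """
    rng = random.Random(seed)
    xis = [sample(rng) for _ in range(num_samples)]
    best_x, best_val = None, float("inf")
    for x in candidates:
        # Empirical estimate of E[f(x, xi)] at this candidate point.
        val = sum(f(x, xi) for xi in xis) / num_samples
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val


# Hypothetical instance: f(x, xi) = (x - xi)^2 with xi ~ N(2.3, 1),
# feasible set {-5, ..., 5} ∩ Z. The true minimizer over the integers
# is 2 (the integer nearest the mean 2.3).
x_star, v_star = saa_integer_minimize(
    f=lambda x, xi: (x - xi) ** 2,
    candidates=range(-5, 6),
    num_samples=2000,
    sample=lambda rng: rng.gauss(2.3, 1.0),
)
```

With 2000 samples the empirical mean concentrates tightly around 2.3, so the SAA minimizer matches the true integer minimizer. The paper's results can be read as characterizing, for each problem class, how large `num_samples` must be (as a function of the accuracy $\epsilon$) for this kind of guarantee to hold.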
