Nonmonotone subgradient methods based on a local descent lemma

arXiv:2510.19341v2 Announce Type: replace-cross

Abstract: In this paper we present a nonmonotone line search subgradient algorithm tailored to upper-$\mathcal{C}^2$ functions. This is a family of nonsmooth, nonconvex functions satisfying a local, nonsmooth version of the descent lemma, which makes them amenable to line searches. We prove subsequential convergence of the proposed algorithm to a stationary point of the optimization problem. Our framework covers the setting of various subgradient algorithms, including Newton and quasi-Newton methods. In addition, we propose a specification of the general scheme, named the Self-adaptive Nonmonotone Subgradient Method (SNSM), which automatically updates the parameters of the line search. Particular attention is paid to the minimum sum-of-squares clustering problem, for which we provide a concrete implementation of SNSM. We conclude with numerical experiments that exhibit the advantages of SNSM over some known algorithms.
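To illustrate the general idea (not the paper's SNSM itself), the following is a minimal sketch of a nonmonotone Armijo-type line-search subgradient method: the sufficient-decrease test is taken against the maximum objective value over a recent window rather than the last iterate, so temporary increases are tolerated. The step rule, parameter values, and the toy objective below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nonmonotone_subgradient(f, subgrad, x0, t0=1.0, sigma=1e-4,
                            beta=0.5, memory=5, max_iter=200, tol=1e-8):
    """Generic nonmonotone line-search subgradient sketch (illustrative only).

    f       : objective (may be nonsmooth)
    subgrad : oracle returning one subgradient of f at x
    memory  : window length for the nonmonotone reference value
    """
    x = np.asarray(x0, dtype=float)
    history = [f(x)]
    for _ in range(max_iter):
        g = subgrad(x)
        if np.linalg.norm(g) < tol:
            break  # approximate stationarity
        # Nonmonotone reference: worst value over the last `memory` iterates.
        f_ref = max(history[-memory:])
        t = t0
        # Backtrack until the sufficient-decrease test holds w.r.t. f_ref.
        while f(x - t * g) > f_ref - sigma * t * np.dot(g, g):
            t *= beta
            if t < 1e-12:
                break
        x = x - t * g
        history.append(f(x))
    return x, history

# Toy nonsmooth example: f(x) = |x0| + 0.5 * x1^2, minimized at the origin.
f = lambda x: abs(x[0]) + 0.5 * x[1] ** 2
subgrad = lambda x: np.array([np.sign(x[0]), x[1]])

x_star, hist = nonmonotone_subgradient(f, subgrad, [2.0, -3.0])
```

Relative to a monotone line search, the `f_ref` comparison lets the method accept steps that momentarily worsen the objective, which can help escape narrow regions on nonconvex landscapes; the self-adaptive parameter updates of SNSM are not modeled here.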
