$(\alpha,\beta)$-Stability for Boosting Vector-Valued Prediction

arXiv:2602.18866v2 Announce Type: replace-cross

Abstract: Despite the widespread use of boosting in structured prediction, a general theoretical understanding of aggregation beyond scalar prediction remains incomplete. We study vector-valued prediction under a target divergence and identify a geometric stability property under which aggregation amplifies weak guarantees into strong ones. We formalize this property as $(\alpha,\beta)$-stability by geometric median and show how it supports a boosting framework based on exponential reweighting and geometric-median aggregation. For vector-valued prediction, we characterize this stability property under several natural divergences: $\ell_1$ and $\ell_2$ distances for unconstrained vector-valued prediction, and TV, Hellinger, and KL for density estimation over finite probability vectors. Building on these results, we propose \geomedboost, a generic boosting framework. Under a weak learner condition and $(\alpha,\beta)$-stability, we obtain exponential decay of the empirical divergence error, which then yields population guarantees through a generalization bound.
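To make the two ingredients named in the abstract concrete, here is a minimal Python sketch of (i) geometric-median aggregation of vector-valued predictions via Weiszfeld's algorithm and (ii) a boosting loop that exponentially reweights examples by their divergence error. This is an illustrative sketch of the general recipe, not the paper's \geomedboost procedure; the `weak_learner` and `divergence` callables, the step size `eta`, and the number of rounds are hypothetical placeholders.

```python
import numpy as np

def geometric_median(points, n_iter=200, tol=1e-9):
    """Weiszfeld's algorithm: approximate the point minimizing the sum of
    Euclidean distances to the rows of `points`."""
    y = points.mean(axis=0)
    for _ in range(n_iter):
        d = np.maximum(np.linalg.norm(points - y, axis=1), tol)  # guard against /0
        w = 1.0 / d
        y_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < tol:
            return y_new
        y = y_new
    return y

def boost_and_aggregate(weak_learner, divergence, X, Y, n_rounds=10, eta=1.0):
    """Sketch: exponential reweighting of examples, then geometric-median
    aggregation of the per-round vector-valued predictions.

    weak_learner(X, Y, sample_w) -> callable h mapping an input to a vector
    divergence(pred, target)     -> nonnegative scalar error
    """
    n = len(X)
    sample_w = np.full(n, 1.0 / n)
    learners = []
    for _ in range(n_rounds):
        h = weak_learner(X, Y, sample_w)                      # fit on the weighted sample
        learners.append(h)
        errs = np.array([divergence(h(x), y) for x, y in zip(X, Y)])
        sample_w *= np.exp(eta * errs)                        # upweight poorly fit examples
        sample_w /= sample_w.sum()

    def aggregated(x):
        preds = np.stack([h(x) for h in learners])            # (n_rounds, d) predictions
        return geometric_median(preds)                        # robust vector combination
    return aggregated
```

The geometric median is the natural robust replacement for averaging in the vector-valued setting: it is where the abstract's $(\alpha,\beta)$-stability enters, since that property governs how aggregating many weakly accurate predictions by geometric median controls the divergence of the combined prediction.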
