cs.LG

PENEX: AdaBoost-Inspired Neural Network Regularization

arXiv:2510.02107v4 Announce Type: replace
Abstract: AdaBoost sequentially fits so-called weak learners to minimize an exponential loss, which penalizes misclassified data points more severely than other loss functions like cross-entropy. Paradoxically…
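The severity gap between the two losses can be sketched numerically. Writing the margin as m = y·f(x), the standard margin-based forms are exp(-m) for the exponential loss and log(1 + exp(-m)) for the logistic (cross-entropy) loss; this minimal comparison is illustrative and not taken from the paper itself:

```python
import math

def exp_loss(margin):
    # AdaBoost's exponential loss: exp(-m), with m = y * f(x)
    return math.exp(-margin)

def ce_loss(margin):
    # Logistic / binary cross-entropy loss on the same margin
    return math.log1p(math.exp(-margin))

# Misclassified points have negative margins; the exponential loss
# grows exponentially there, while cross-entropy grows only linearly.
for m in [2.0, 0.0, -2.0, -4.0]:
    print(f"margin={m:+.1f}  exp={exp_loss(m):8.3f}  ce={ce_loss(m):8.3f}")
```

At margin -4 the exponential loss is already more than an order of magnitude larger than the cross-entropy loss, which is the sense in which it penalizes misclassified points more severely.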