Dynamic Hyperparameter Importance for Efficient Multi-Objective Optimization

arXiv:2601.03166v2 Announce Type: replace

Abstract: Choosing a suitable ML model is a complex task that can depend on several objectives, e.g., accuracy, fairness, or energy consumption. In practice, this requires trading off multiple, often competing, objectives through multi-objective optimization (MOO). However, existing MOO methods typically treat all hyperparameters as equally important, disregarding that hyperparameter importance (HPI) can vary significantly across objectives. We propose a novel dynamic optimization approach that prioritizes the most influential hyperparameters based on the varying objective trade-offs encountered during the search, thereby accelerating empirical convergence. We advance prior work on HPI for MOO from post-hoc analysis to direct, dynamic integration within the optimization loop, using the recent HPI method HyperSHAP. To this end, we leverage the objective weightings naturally produced by the MOO algorithm ParEGO and reduce the configuration space by fixing the unimportant hyperparameters, allowing the search to focus on the important ones. Finally, we evaluate our method on diverse tasks from PyMOO and YAHPO-Gym. For HPO, integrating HPI yields up to 24% improvement in final Pareto front quality, while on synthetic data, integrating HPI achieves 2x better final results.
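The core mechanism described above, combining per-objective HPI scores with ParEGO's scalarization weights and then fixing the least important hyperparameters, can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the dictionary-based search space, and the simple top-k thresholding rule are all assumptions.

```python
def weighted_importance(hpi_per_objective, weights):
    """Combine per-objective HPI scores (e.g., from HyperSHAP) using the
    scalarization weights that ParEGO draws at each iteration."""
    hps = next(iter(hpi_per_objective.values()))
    return {
        hp: sum(w * hpi_per_objective[obj][hp] for obj, w in weights.items())
        for hp in hps
    }

def reduce_space(space, incumbent, combined_hpi, keep_top=2):
    """Fix the least important hyperparameters at the incumbent's values
    (collapsing their range to a point), so the search focuses on the
    `keep_top` most influential ones."""
    ranked = sorted(combined_hpi, key=combined_hpi.get, reverse=True)
    keep = set(ranked[:keep_top])
    return {
        hp: space[hp] if hp in keep else (incumbent[hp], incumbent[hp])
        for hp in space
    }

# Toy example: two objectives over three hyperparameters (values invented).
hpi = {
    "accuracy": {"lr": 0.6, "depth": 0.3, "batch": 0.1},
    "energy":   {"lr": 0.2, "depth": 0.1, "batch": 0.7},
}
weights = {"accuracy": 0.8, "energy": 0.2}  # one ParEGO weight draw

combined = weighted_importance(hpi, weights)
reduced = reduce_space(
    space={"lr": (1e-5, 1e-1), "depth": (1, 12), "batch": (16, 512)},
    incumbent={"lr": 1e-3, "depth": 6, "batch": 64},
    combined_hpi=combined,
)
# "lr" and "depth" keep their full ranges; "batch" is fixed at 64.
```

Because ParEGO samples a fresh weight vector each iteration, the combined importance, and hence which hyperparameters are fixed, changes dynamically as the search explores different regions of the Pareto front.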
