Differentially Private Hyperparameter Tuning using Local Bayesian Optimization
arXiv:2502.06044v3 Announce Type: replace
Abstract: Hyperparameter tuning is a key component of machine learning procedures, but when validation data contain sensitive user information, search mechanisms can leak private information through the selected configuration. Existing differentially private hyperparameter tuning methods often rely on near-random search, while prior differentially private Bayesian optimization approaches are typically global and therefore scale poorly with the hyperparameter dimensionality. We study differentially private hyperparameter tuning using local Bayesian optimization, focusing on settings where the validation objective is available only through noisy black-box evaluations and gradients are unavailable or impractical to compute. We introduce DP-GIBO, a differentially private local Bayesian optimization framework that privately approximates gradients using a Gaussian process surrogate. Under suitable conditions, we prove that DP-GIBO converges to a locally optimal hyperparameter configuration up to a privacy-dependent error, with dimensional dependence that is polynomial rather than exponential. Empirically, we show that DP-GIBO provides scalable private hyperparameter tuning across multiple tasks, substantially outperforming non-private random search and global Bayesian optimization baselines in moderate-to-high-dimensional hyperparameter spaces.
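
To make the gradient-privatization idea concrete, below is a minimal Python sketch of one DP-GIBO-style iteration under stated assumptions: fit a Gaussian process surrogate to noisy local evaluations of the validation objective, read off the gradient of the GP posterior mean at the current configuration, and privatize that gradient with clipping plus Gaussian-mechanism noise before taking a descent step. Every name, kernel choice, sampling scheme, and noise calibration here is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel k(a,b) = variance * exp(-||a-b||^2 / (2 l^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def posterior_mean_grad(x, X, y, lengthscale=0.5, noise=1e-2):
    """Gradient of the GP posterior mean at x (zero prior mean, RBF kernel).

    m(x) = k(x, X) K^{-1} y, so with alpha = K^{-1} y,
    grad m(x) = sum_i alpha_i * d/dx k(x, x_i)
              = sum_i alpha_i * (-(x - x_i) / l^2) * k(x, x_i).
    """
    K = rbf_kernel(X, X, lengthscale) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, y)                    # K^{-1} y
    k = rbf_kernel(x[None, :], X, lengthscale)[0]    # k(x, x_i) for each i
    grads = -(x[None, :] - X) / lengthscale ** 2 * k[:, None]
    return grads.T @ alpha

def dp_gibo_step(x, evaluate, eps, delta, clip=1.0, eta=0.1,
                 n_local=8, spread=0.1, rng=None):
    """One illustrative DP-GIBO-style iteration (assumed interface).

    `evaluate` returns a noisy validation score at a configuration; `clip`
    is the assumed sensitivity bound used to calibrate the Gaussian mechanism.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Local design: sample evaluation points around the current configuration.
    X = x + spread * rng.standard_normal((n_local, x.size))
    y = np.array([evaluate(xi) for xi in X])
    # Surrogate gradient from the GP posterior mean.
    g = posterior_mean_grad(x, X, y)
    # Clip to bound sensitivity, then add Gaussian-mechanism noise
    # (standard (eps, delta) calibration; real sensitivity analysis
    # would depend on how one evaluation influences the fitted GP).
    g = g / max(1.0, np.linalg.norm(g) / clip)
    sigma = clip * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    g_priv = g + sigma * rng.standard_normal(g.size)
    return x - eta * g_priv  # descend the privatized surrogate gradient
```

The local design step is what distinguishes this from global Bayesian optimization: the GP is fit only near the current iterate, so the surrogate (and hence the per-step cost) does not need to cover the full hyperparameter space, which is consistent with the polynomial rather than exponential dimensional dependence claimed in the abstract.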