cs.LG

DP-λCGD: Efficient Noise Correlation for Differentially Private Model Training

arXiv:2601.22334v2 Announce Type: replace
Abstract: Differentially private stochastic gradient descent (DP-SGD) is the gold standard for training machine learning models with formal differential privacy guarantees. Several recent extensions improve it…
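The baseline the abstract refers to, DP-SGD, clips each per-sample gradient to a fixed L2 norm and adds Gaussian noise to the averaged update. A minimal sketch of one such step (not the paper's DP-λCGD method; all names and parameters here are illustrative):

```python
import numpy as np

def dp_sgd_step(params, per_sample_grads, clip_norm, noise_multiplier, lr, rng):
    """One DP-SGD update: clip per-sample gradients, average, add Gaussian noise.

    per_sample_grads: array of shape (batch_size, dim), one gradient per example.
    noise_multiplier: noise std is noise_multiplier * clip_norm (before averaging).
    """
    # Clip each per-sample gradient to L2 norm <= clip_norm.
    norms = np.linalg.norm(per_sample_grads, axis=1, keepdims=True)
    factors = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * factors

    # Average clipped gradients and add isotropic Gaussian noise.
    batch_size = per_sample_grads.shape[0]
    noise = rng.normal(0.0, noise_multiplier * clip_norm,
                       size=per_sample_grads.shape[1]) / batch_size
    noisy_grad = clipped.mean(axis=0) + noise

    return params - lr * noisy_grad
```

Extensions of the kind the abstract describes typically replace the independent per-step noise draws above with correlated noise across steps, trading extra bookkeeping for better utility at the same privacy budget.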