Tight Auditing of Differential Privacy in MST and AIM

arXiv:2604.18352v1 (announce type: cross)

Abstract: State-of-the-art Differentially Private (DP) synthetic data generators such as MST and AIM are widely used, yet tightly auditing their privacy guarantees remains challenging. We introduce a Gaussian Differential Privacy (GDP)-based auditing framework that measures privacy via the full false-positive/false-negative tradeoff curve. Applied to MST and AIM under worst-case settings, our method provides the first tight audits in the strong-privacy regime. For $(\epsilon,\delta)=(1,10^{-2})$, we obtain an empirical $\mu_{emp}\approx0.43$ against an implied $\mu=0.45$, showing a small theory-practice gap. Our code is publicly available: https://github.com/sassoftware/dpmm.
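To make the two $\mu$ values in the abstract concrete, here is a minimal sketch (not from the paper's repository) of the standard GDP relations the audit relies on, assuming the Dong–Roth–Su conversion between $(\epsilon,\delta)$-DP and $\mu$-GDP. The `mu_for_eps_delta` and `mu_from_tradeoff` helpers are illustrative names; the paper's implied $\mu=0.45$ may be derived from the mechanisms' own internal accounting, so this generic conversion need not reproduce it exactly.

```python
import math
from statistics import NormalDist

N = NormalDist()  # standard normal: N.cdf = Phi, N.inv_cdf = Phi^{-1}

def delta_for_mu(eps: float, mu: float) -> float:
    # delta(eps) achieved by a mu-GDP mechanism (Dong, Roth & Su conversion):
    # delta = Phi(-eps/mu + mu/2) - e^eps * Phi(-eps/mu - mu/2)
    return N.cdf(-eps / mu + mu / 2) - math.exp(eps) * N.cdf(-eps / mu - mu / 2)

def mu_for_eps_delta(eps: float, delta: float) -> float:
    # Invert delta_for_mu by bisection; for fixed eps, delta is increasing in mu.
    lo, hi = 1e-6, 20.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if delta_for_mu(eps, mid) < delta:
            lo = mid
        else:
            hi = mid
    return hi

def mu_from_tradeoff(fpr: float, fnr: float) -> float:
    # Empirical mu implied by one (FPR, FNR) point on the attack's
    # false-positive/false-negative tradeoff curve:
    # mu_emp = Phi^{-1}(1 - FPR) - Phi^{-1}(FNR)
    return N.inv_cdf(1 - fpr) - N.inv_cdf(fnr)

mu_implied = mu_for_eps_delta(1.0, 1e-2)  # implied mu for (eps, delta) = (1, 1e-2)
```

For a $\mu$-GDP mechanism, the symmetric point of the tradeoff curve sits at $\mathrm{FPR}=\mathrm{FNR}=\Phi(-\mu/2)$, so `mu_from_tradeoff` recovers $\mu$ exactly there; an audit estimates FPR/FNR empirically and reads off $\mu_{emp}$ the same way.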
