Diverse Image Priors for Black-box Data-free Knowledge Distillation
arXiv:2604.25794v1 Announce Type: new
Abstract: Knowledge distillation (KD) is a vital mechanism for transferring expertise from complex teacher networks to efficient student models. However, in decentralized or secure AI ecosystems, privacy regula…
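As background to the distillation setting the abstract describes, the following is a minimal sketch of the standard temperature-softened distillation loss (in the style of Hinton et al.), not the method proposed in this paper; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients keep a comparable magnitude across T."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# The loss is zero when the student matches the teacher exactly,
# and positive otherwise.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([3.0, 1.0, 0.0], [1.0, 2.0, 3.0]))
```

In the data-free, black-box setting the paper targets, the student has no access to the teacher's training data (and often only to its output probabilities), so inputs for this loss must be synthesized rather than drawn from a dataset.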