Federated Knowledge Distillation for Multi-Model-Architecture Lithography Hotspot Detection

arXiv:2501.04066v2 Announce Type: replace Abstract: Lithography Hotspot Detection (LHD) operates on a special type of multimedia data, and its training data often requires stronger privacy protection than conventional multimedia data; federated learning offers a promising solution to this challenge. However, existing approaches rely solely on either parameter aggregation or Knowledge Distillation (KD), failing to fully exploit the potential of collaborative learning. To address this, we propose FedKD-hybrid, a novel framework that synergizes the strengths of both paradigms. Specifically, FedKD-hybrid utilizes a public dataset to facilitate consensus: clients exchange both the parameters of agreed-upon layers and the logits computed on that dataset. This hybrid information is aggregated to refine local models, enhancing knowledge transfer. Extensive experiments on ICCAD-2012 and real-world FAB datasets demonstrate that FedKD-hybrid consistently outperforms state-of-the-art methods in both effectiveness and robustness.
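The abstract only sketches the communication round, so the snippet below illustrates one plausible reading of the hybrid exchange: clients upload the parameters of an agreed-upon layer plus logits on a shared public set, the server averages both, and each client loads the averaged weights and distils the averaged logits back into its own model. All names (`ClientModel`, `num_clients`, the toy data, and the training hyperparameters) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a FedKD-hybrid-style round, under the assumptions stated above.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

class ClientModel(nn.Module):
    """Toy per-client model: a shared ('agreed-upon') layer plus a private head."""
    def __init__(self):
        super().__init__()
        self.shared = nn.Linear(16, 8)   # agreed-upon layer: its parameters are exchanged
        self.head = nn.Linear(8, 2)      # private layer: never leaves the client
    def forward(self, x):
        return self.head(torch.relu(self.shared(x)))

num_clients = 3
clients = [ClientModel() for _ in range(num_clients)]
public_x = torch.randn(32, 16)           # shared public dataset used for consensus

def local_train(model, steps=5):
    """Placeholder local training on synthetic private data."""
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    x, y = torch.randn(64, 16), torch.randint(0, 2, (64,))
    for _ in range(steps):
        opt.zero_grad()
        F.cross_entropy(model(x), y).backward()
        opt.step()

for rnd in range(2):                      # two communication rounds
    # 1) Each client trains locally, then uploads shared-layer weights and
    #    logits computed on the public dataset.
    for m in clients:
        local_train(m)
    shared_states = [m.shared.state_dict() for m in clients]
    public_logits = [m(public_x).detach() for m in clients]

    # 2) Server aggregates both kinds of knowledge: FedAvg-style averaging of
    #    the agreed-upon parameters, and averaging of the public-set logits.
    avg_shared = {k: torch.stack([s[k] for s in shared_states]).mean(0)
                  for k in shared_states[0]}
    avg_logits = torch.stack(public_logits).mean(0)

    # 3) Each client loads the averaged shared layer and distils the
    #    aggregated logits into its own model on the public dataset.
    for m in clients:
        m.shared.load_state_dict(copy.deepcopy(avg_shared))
        opt = torch.optim.SGD(m.parameters(), lr=0.05)
        for _ in range(5):
            opt.zero_grad()
            kd = F.kl_div(F.log_softmax(m(public_x), dim=1),
                          F.softmax(avg_logits, dim=1),
                          reduction="batchmean")
            kd.backward()
            opt.step()
    print(f"round {rnd}: shared layer averaged, aggregated logits distilled back")
```

In this reading, the parameter average carries structural knowledge of the agreed-upon layers while the logit average carries dataset-level consensus, which is what lets the scheme combine the strengths of aggregation- and KD-based federated learning.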
