Federated Learning with Hypergradient-based Online Update of Aggregation Weights

arXiv:2605.00458v1 Announce Type: new Abstract: Federated learning on mobile and Internet of Things devices requires not only the ability to handle heterogeneity in clients' data distributions but also high adaptability to varying communication environments. We propose FedHAW (Federated Learning with Hypergradient-based update of Aggregation Weights), which performs online updates of the aggregation weights. FedHAW updates the aggregation weights using the hypergradient, i.e., the gradient of the objective function with respect to the weights, which can be computed with low computational overhead. Simulation results show that the proposed method achieves high generalization performance in heterogeneous environments and high robustness to communication errors.
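The abstract does not spell out the algorithm, but the core idea can be sketched under simple assumptions: if the global model is a weighted average of client models, the hypergradient of a loss with respect to each aggregation weight is just the inner product of the loss gradient with that client's parameters. The setup below (a toy quadratic validation loss, three synthetic clients, a softmax parametrization to keep weights on the simplex) is entirely hypothetical and not from the paper.

```python
import numpy as np


def aggregate(client_params, weights):
    """Global model as the weighted average of client parameter vectors."""
    return np.tensordot(weights, client_params, axes=1)


def hypergradient(client_params, grad_theta):
    """dL(sum_k w_k theta_k)/dw_k = <dL/dtheta, theta_k> for each client k."""
    return client_params @ grad_theta


def softmax(z):
    z = z - z.max()  # for numerical stability
    e = np.exp(z)
    return e / e.sum()


# Toy setup (assumed, not from the paper): 3 clients, 2-dim parameters,
# and a quadratic validation loss L(theta) = ||theta - target||^2.
rng = np.random.default_rng(0)
client_params = rng.normal(size=(3, 2))
target = np.array([1.0, -1.0])
logits = np.zeros(3)  # weights = softmax(logits) stays on the simplex
lr = 0.5

for _ in range(200):
    w = softmax(logits)
    theta = aggregate(client_params, w)
    grad_theta = 2.0 * (theta - target)            # dL/dtheta
    hg = hypergradient(client_params, grad_theta)  # dL/dw, cheap to compute
    # Chain rule through the softmax: dL/dlogits = J_softmax^T dL/dw.
    jac = np.diag(w) - np.outer(w, w)
    logits -= lr * (jac @ hg)

weights = softmax(logits)
```

The hypergradient here costs one dot product per client, consistent with the abstract's claim of low computational overhead; the softmax reparametrization is one common way (an assumption here) to keep the aggregation weights nonnegative and summing to one during the online update.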
