A PAC-Bayesian Analysis of Channel-Induced Degradation in Edge Inference
arXiv:2601.10915v2 Announce Type: replace-cross
Abstract: In the emerging paradigm of edge learning, neural networks (NNs) are partitioned across distributed edge devices that collaboratively perform inference via wireless transmission. However, deploying NNs for edge inference over wireless channels inevitably degrades performance, because the exact channel realizations encountered at inference time are unknown during training. In this paper, we establish a theoretical framework to evaluate and bound this performance degradation. Inspired by statistical learning theory, we define a wireless generalization error that characterizes the gap between the empirical performance achieved during training and the expected inference performance under the true stochastic channel. To enable theoretical analysis, we introduce an augmented NN model that incorporates channel statistics directly into the weight space. Leveraging the PAC-Bayesian framework, we derive a high-probability bound on this error, which provides theoretical guarantees for wireless inference performance. Furthermore, we propose a channel-aware training algorithm that minimizes a tractable surrogate objective based on the derived bound. Simulations demonstrate that the proposed algorithm effectively improves wireless inference performance and model robustness under various channel conditions.
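The channel-aware training idea described above can be illustrated with a minimal sketch: a model is split between a device-side and a server-side part, and the training objective averages the loss over sampled channel realizations rather than assuming an ideal link. The Rayleigh-fading-plus-AWGN channel model, the split-linear architecture, and all function names here are hypothetical illustrations; the abstract does not specify the paper's actual channel statistics or network architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical split model: the edge device computes features z = W1 @ x,
# transmits z over a wireless channel, and the server computes w2 @ z_received.
W1 = rng.normal(size=(4, 8))
w2 = rng.normal(size=4)

def channel(z, snr_db=10.0):
    """Apply a Rayleigh block-fading gain plus AWGN to transmitted features.

    This is an illustrative stochastic channel model, not the one from the
    paper; any channel distribution could be substituted here.
    """
    h = rng.rayleigh(scale=1.0 / np.sqrt(2.0))  # random fading gain
    noise_std = 10.0 ** (-snr_db / 20.0)        # noise level from target SNR
    return h * z + noise_std * rng.normal(size=z.shape)

def channel_aware_loss(x, y, n_samples=256):
    """Monte Carlo surrogate of the expected loss over the stochastic channel.

    Averaging over sampled channel realizations during training is a simple
    stand-in for minimizing the expectation that the generalization bound
    controls, rather than training on a single fixed (ideal) channel.
    """
    z = W1 @ x
    losses = [(w2 @ channel(z) - y) ** 2 for _ in range(n_samples)]
    return float(np.mean(losses))

x = rng.normal(size=8)
y = 1.0
print(channel_aware_loss(x, y))  # empirical loss averaged over channel draws
```

In a full training loop, this surrogate would be minimized over (W1, w2) by gradient descent, so the learned weights account for channel randomness instead of overfitting to one channel realization.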