RFPrompt: Prompt-Based Expert Adaptation of the Large Wireless Model for Modulation Classification

arXiv:2605.03279v1 Announce Type: new

Abstract: Automatic modulation classification (AMC) in real-world deployments demands robustness to distribution shifts arising from hardware impairments, unseen propagation environments, and recording conditions never encountered during training. Although wireless foundation models offer a promising starting point for robust RF representation learning, an important open question is how to adapt them efficiently to out-of-distribution (OOD) downstream tasks without overwriting the structure learned during large-scale pre-training. In this paper, we investigate prompt-based adaptation as a general mechanism for OOD transfer in wireless foundation models. We propose RFPrompt, a parameter-efficient framework that introduces learnable deep prompt tokens while keeping the pretrained backbone frozen, enabling task-specific adaptation with minimal trainable parameters. We instantiate and evaluate this approach on the Large Wireless Model (LWM), a mixture-of-experts wireless foundation model, and study its behavior under both standard and OOD modulation-classification settings. Results show that prompt-based adaptation consistently improves robustness under distribution shift and limited supervision, particularly on real-world over-the-air IQ data, while preserving strong parameter efficiency. These findings suggest that prompt learning is a practical and effective strategy for adapting wireless foundation models to challenging downstream RF environments.
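The core mechanism the abstract describes — learnable deep prompt tokens prepended at every layer of a frozen backbone — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the dimensions, the stand-in "transformer block" (a single weight matrix with tanh), and all variable names are illustrative assumptions.

```python
import numpy as np

# Hypothetical dimensions, not from the paper:
# D = embedding dim, L = layers, T = input tokens, P = prompt tokens per layer.
D, L, T, P = 64, 4, 128, 8

rng = np.random.default_rng(0)

# Frozen backbone: one weight matrix per layer, standing in for a full
# transformer block. These parameters are never updated during adaptation.
backbone = [rng.standard_normal((D, D)) / np.sqrt(D) for _ in range(L)]

# Trainable deep prompts: P learnable token embeddings injected at each layer.
# Only these would receive gradients during task-specific fine-tuning.
prompts = [np.zeros((P, D)) for _ in range(L)]

def forward(x):
    """Per layer: prepend that layer's prompts, apply the frozen block,
    then drop the prompt slots so the sequence length stays T."""
    h = x
    for W, p in zip(backbone, prompts):
        h = np.concatenate([p, h], axis=0)  # (P + T, D)
        h = np.tanh(h @ W)                  # frozen "block"
        h = h[P:]                           # keep only the original positions
    return h

x = rng.standard_normal((T, D))
out = forward(x)

trainable = sum(p.size for p in prompts)
frozen = sum(W.size for W in backbone)
print(out.shape, trainable, frozen)
```

The parameter count makes the efficiency argument concrete: with these toy sizes, only 2,048 prompt parameters are trainable against 16,384 frozen backbone parameters, and the gap grows far larger at foundation-model scale.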
