ML-Based Real-Time Downlink Performance Prediction in Standalone 5G NR Using Smartphones
arXiv:2604.09632v1 Announce Type: cross
Abstract: We propose a machine learning (ML)-based framework for downlink performance prediction in 5G networks using real-time measurements from commercial off-the-shelf (COTS) user equipment (UE). Our experimental platform integrates the srsRAN 5G New Radio (NR) stack deployed on a Dell desktop serving as the 5G next-generation NodeB (gNB), operating at 3.4 GHz. Two Google Pixel 7a smartphones collect physical layer characteristics such as channel quality indicator (CQI), modulation and coding scheme (MCS), bit rate, transmission time interval (TTI), and block error rate (BLER), which serve as predictors during model training. We use commercial-grade traffic generation tools, including Ookla, for stationary and mobility measurements under line-of-sight (LOS) and non-line-of-sight (NLOS) conditions. The test data include sessions against global Ookla servers (e.g., in the USA, Portugal, Ghana, Egypt, and Japan), iperf TCP/UDP traffic, and YouTube video streaming sessions. To analyze inter-user interference, we also include scenarios with multiple UEs at the same location. We evaluate the predictive performance of five supervised regression models: linear regression, decision tree regression, random forest regression, extreme gradient boosting (XGBoost), and light gradient boosting machine (LightGBM). Our results demonstrate that throughput and BLER can be accurately predicted using COTS hardware and standard ML techniques in diverse real-world 5G scenarios.
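The abstract does not include the authors' code; the following is a minimal sketch of how the five named regression models could be trained on PHY-layer predictors to estimate downlink throughput. The CSV file name, column names, feature set, and train/test split are assumptions for illustration only, not details from the paper.

```python
# Minimal sketch (not the authors' implementation): fit the five supervised
# regression models named in the abstract on hypothetical per-interval UE
# measurements (CQI, MCS, TTI, BLER) and compare their throughput predictions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, r2_score
from xgboost import XGBRegressor
from lightgbm import LGBMRegressor

# Hypothetical dataset: one row per measurement interval, with PHY-layer
# predictors and the observed downlink throughput as the target.
df = pd.read_csv("ue_measurements.csv")  # assumed file and schema
X = df[["cqi", "mcs", "tti", "bler"]]
y = df["throughput_mbps"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

models = {
    "Linear Regression": LinearRegression(),
    "Decision Tree": DecisionTreeRegressor(random_state=42),
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=42),
    "XGBoost": XGBRegressor(n_estimators=200, random_state=42),
    "LightGBM": LGBMRegressor(n_estimators=200, random_state=42),
}

# Train each model and report simple held-out regression metrics.
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: MAE={mean_absolute_error(y_test, pred):.3f}, "
          f"R^2={r2_score(y_test, pred):.3f}")
```

The same loop could be pointed at a BLER column as the target to mirror the abstract's second prediction task; the paper's actual feature engineering and evaluation protocol may differ.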