Neural CDEs as Correctors for Learned Time Series Models

arXiv:2512.12116v3 (replacement)

Abstract: Learned time-series models, whether continuous or discrete, are widely used for forecasting the states of dynamical systems but suffer from error accumulation in multi-step forecasts. To address this issue, we propose a Predictor-Corrector framework in which the Predictor is a learned time-series model that generates multi-step forecasts and the Corrector is a neural controlled differential equation that corrects the forecast errors. The Corrector works with irregularly sampled time series and is compatible with both continuous- and discrete-time Predictors. We further introduce two regularization strategies that improve the Corrector's extrapolation performance and accelerate its training. We also provide theoretical guarantees on the stability and convergence of the proposed framework. Experiments on synthetic, physics-based, and real-world datasets show that the proposed framework consistently improves forecasting performance across diverse Predictors, including neural ordinary differential equations, ContiFormer, and DLinear, demonstrating its predictor-agnostic nature.
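To make the Predictor-Corrector structure concrete, here is a minimal, self-contained sketch on a toy linear system. Everything in it is our own illustrative assumption, not the paper's method: the "Predictor" is a deliberately biased one-step model rolled out for a multi-step forecast, and the "Corrector" integrates a hand-specified linear controlled differential equation dz = a·z dt + b dX driven by the forecast path X(t), in place of a learned neural CDE vector field.

```python
import math

# Toy illustration of the Predictor-Corrector idea (an assumption-laden
# sketch, NOT the paper's implementation). The biased predictor, the
# linear CDE vector field, and the coefficients a, b, c below are all
# hand-picked toy values rather than learned neural-network weights.

def predictor_step(x, dt):
    # biased "learned" model: decays at rate 0.4 instead of the true 0.5
    return x * math.exp(-0.4 * dt)

def rollout(step, x0, n, dt):
    # multi-step forecast by repeatedly applying the one-step Predictor;
    # this is where one-step errors accumulate
    xs = [x0]
    for _ in range(n):
        xs.append(step(xs[-1], dt))
    return xs

def cde_correct(forecast, ts, a, b, c, z0=0.0):
    # Euler discretization of the controlled ODE dz = a*z dt + b dX,
    # driven by increments of the forecast path X. Because each step
    # uses its own dt and dX, irregular time stamps ts are handled
    # naturally -- a key property of CDE-style correctors.
    z = z0
    corrected = [forecast[0]]
    for k in range(1, len(ts)):
        dt = ts[k] - ts[k - 1]
        dX = forecast[k] - forecast[k - 1]
        z = z + a * z * dt + b * dX
        corrected.append(forecast[k] + c * z)
    return corrected

# Demo: true system x(t) = exp(-0.5 t). For this linear toy problem the
# choice a=-0.5, b=0.25, c=1 cancels the Predictor's bias exactly in
# continuous time, so the corrected rollout tracks the truth closely.
dt, n = 0.1, 20
ts = [k * dt for k in range(n + 1)]
truth = [math.exp(-0.5 * t) for t in ts]
forecast = rollout(predictor_step, 1.0, n, dt)
corrected = cde_correct(forecast, ts, a=-0.5, b=0.25, c=1.0)
err_raw = max(abs(f - t) for f, t in zip(forecast, truth))
err_cor = max(abs(g - t) for g, t in zip(corrected, truth))
```

In the paper the corrector's vector field is a neural network trained on forecast errors; here it is solved in closed form purely to show the mechanism: the correction state z is driven by the forecast path itself, so the corrector needs no access to the true trajectory at inference time.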
