Neural CDEs as Correctors for Learned Time Series Models
- URL: http://arxiv.org/abs/2512.12116v2
- Date: Fri, 19 Dec 2025 20:58:15 GMT
- Title: Neural CDEs as Correctors for Learned Time Series Models
- Authors: Muhammad Bilal Shahid, Prajwal Koirla, Cody Fleming
- Abstract summary: We propose a Predictor-Corrector mechanism where the Predictor is any learned time-series model and the Corrector is a neural controlled differential equation. The proposed Corrector works with irregularly sampled time series and continuous- and discrete-time Predictors. We evaluate our Corrector with diverse Predictors on synthetic, physics simulation, and real-world forecasting datasets.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learned time-series models, whether continuous- or discrete-time, are widely used to forecast the states of a dynamical system. Such models generate multi-step forecasts either directly, by predicting the full horizon at once, or iteratively, by feeding back their own predictions at each step. In both cases, the multi-step forecasts are prone to errors. To address this, we propose a Predictor-Corrector mechanism where the Predictor is any learned time-series model and the Corrector is a neural controlled differential equation. The Predictor forecasts, and the Corrector predicts the errors of the forecasts. Adding these errors to the forecasts improves forecast performance. The proposed Corrector works with irregularly sampled time series and continuous- and discrete-time Predictors. Additionally, we introduce two regularization strategies to improve the extrapolation performance of the Corrector with accelerated training. We evaluate our Corrector with diverse Predictors, e.g., neural ordinary differential equations, Contiformer, and DLinear, on synthetic, physics simulation, and real-world forecasting datasets. The experiments demonstrate that the Predictor-Corrector mechanism consistently improves performance compared to the Predictor alone.
Related papers
- Error Adjustment Based on Spatiotemporal Correlation Fusion for Traffic Forecasting [22.37553946699755]
A common assumption when training such forecasting models via mean error estimation is that the errors across time steps and spatial positions are unrelated. This paper proposes Spatiotemporally Autocorrelated Error Adjustment (SAEA), a novel and general framework designed to systematically adjust autocorrelated prediction errors in traffic forecasting.
arXiv Detail & Related papers (2025-10-25T23:48:50Z)
- Unsupervised Anomaly Prediction with N-BEATS and Graph Neural Network in Multi-variate Semiconductor Process Time Series [1.0874100424278175]
Anomaly prediction in semiconductor fabrication presents several critical challenges. The complex interdependencies between variables complicate both anomaly prediction and root-cause analysis. This paper proposes two novel approaches to advance the field from anomaly detection to anomaly prediction.
arXiv Detail & Related papers (2025-10-23T16:33:52Z)
- SynCast: Synergizing Contradictions in Precipitation Nowcasting via Diffusion Sequential Preference Optimization [62.958457694151384]
We introduce preference optimization into precipitation nowcasting for the first time, motivated by the success of reinforcement learning from human feedback in large language models. In the first stage, the framework focuses on reducing the false alarm ratio (FAR), training the model to effectively suppress false alarms.
arXiv Detail & Related papers (2025-10-22T16:11:22Z)
- Adaptive Conformal Prediction Intervals Over Trajectory Ensembles [50.31074512684758]
Future trajectories play an important role across domains such as autonomous driving, hurricane forecasting, and epidemic modeling. We propose a unified framework based on conformal prediction that transforms sampled trajectories into calibrated prediction intervals with theoretical coverage guarantees.
arXiv Detail & Related papers (2025-08-18T21:14:07Z)
- HopCast: Calibration of Autoregressive Dynamics Models [0.0]
This work introduces an alternative Predictor-Corrector approach named HopCast that uses Modern Hopfield Networks (MHN) to learn the errors of a deterministic Predictor. The Corrector predicts a set of errors for the Predictor's output based on a context state at any timestep during autoregression. The calibration and prediction performances are evaluated across a set of dynamical systems.
arXiv Detail & Related papers (2025-01-27T23:59:23Z)
- Loss Shaping Constraints for Long-Term Time Series Forecasting [79.3533114027664]
We present a Constrained Learning approach for long-term time series forecasting that respects a user-defined upper bound on the loss at each time-step.
We propose a practical Primal-Dual algorithm to tackle it and demonstrate that it achieves competitive average performance on time series benchmarks while shaping the errors across the predicted window.
arXiv Detail & Related papers (2024-02-14T18:20:44Z)
- Predictive Churn with the Set of Good Models [61.00058053669447]
This paper explores connections between two seemingly unrelated concepts of predictive inconsistency. The first, known as predictive multiplicity, occurs when models that perform similarly produce conflicting predictions for individual samples. The second concept, predictive churn, examines the differences in individual predictions before and after model updates.
arXiv Detail & Related papers (2024-02-12T16:15:25Z)
- ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
arXiv Detail & Related papers (2024-02-02T10:34:13Z)
- Fine-grained Forecasting Models Via Gaussian Process Blurring Effect [6.472434306724611]
Time series forecasting is a challenging task due to the existence of complex and dynamic temporal dependencies.
Using more training data is one way to improve the accuracy, but this source is often limited.
We build on successful denoising approaches for image generation by advocating an end-to-end forecasting and denoising paradigm.
arXiv Detail & Related papers (2023-12-21T20:25:16Z)
- Performative Time-Series Forecasting [64.03865043422597]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective. We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts. We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z)
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
arXiv Detail & Related papers (2023-02-23T18:57:14Z)
- All-Clear Flare Prediction Using Interval-based Time Series Classifiers [0.21028463367241026]
An all-clear flare prediction is a type of solar flare forecasting that puts more emphasis on predicting non-flaring instances.
Finding the right balance between avoiding false negatives (misses) and reducing the false positives (false alarms) is often challenging.
arXiv Detail & Related papers (2021-05-03T22:40:05Z)
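The miss/false-alarm trade-off highlighted in the entry above can be made concrete with standard confusion-matrix metrics. The counts below are invented for illustration, and the true skill statistic (TSS) is a metric commonly used in flare forecasting rather than one named in the abstract snippet.

```python
def rates(tp, fp, fn, tn):
    """Forecast-verification rates from confusion-matrix counts."""
    pod = tp / (tp + fn)    # probability of detection (1 - miss rate)
    far = fp / (tp + fp)    # false alarm ratio
    pofd = fp / (fp + tn)   # probability of false detection
    tss = pod - pofd        # true skill statistic
    return pod, far, tss

# A conservative forecaster: few false alarms, but many misses.
print(rates(tp=20, fp=5, fn=30, tn=945))
# A liberal forecaster: few misses, but many false alarms.
print(rates(tp=45, fp=120, fn=5, tn=830))
```

Lowering the detection threshold raises POD at the cost of FAR (and vice versa); all-clear prediction deliberately weights the first forecaster's failure mode, misses, as the costlier one.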
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.