Learning to Forecast Dynamical Systems from Streaming Data
- URL: http://arxiv.org/abs/2109.09703v2
- Date: Tue, 21 Sep 2021 14:38:26 GMT
- Title: Learning to Forecast Dynamical Systems from Streaming Data
- Authors: Dimitris Giannakis, Amelia Henriksen, Joel A. Tropp, and Rachel Ward
- Abstract summary: This paper proposes a streaming algorithm for KAF that only requires a single pass over the training data.
Computational experiments demonstrate that the streaming KAF method can successfully forecast several classes of dynamical systems.
The overall methodology may have wider interest as a new template for streaming kernel regression.
- Score: 3.6136161812301744
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kernel analog forecasting (KAF) is a powerful methodology for data-driven,
non-parametric forecasting of dynamically generated time series data. This
approach has a rigorous foundation in Koopman operator theory and it produces
good forecasts in practice, but it suffers from the heavy computational costs
common to kernel methods. This paper proposes a streaming algorithm for KAF
that only requires a single pass over the training data. This algorithm
dramatically reduces the costs of training and prediction without sacrificing
forecasting skill. Computational experiments demonstrate that the streaming KAF
method can successfully forecast several classes of dynamical systems
(periodic, quasi-periodic, and chaotic) in both data-scarce and data-rich
regimes. The overall methodology may have wider interest as a new template for
streaming kernel regression.
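The single-pass idea in the abstract can be illustrated with a generic streaming kernel-regression sketch: random Fourier features approximate a Gaussian kernel, and one pass over the data stream accumulates the normal-equation statistics needed for a ridge solve. This is a minimal illustration under our own assumptions (feature count, bandwidth, regularization are all hypothetical choices), not the paper's KAF algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_rff(dim, n_features=200, bandwidth=1.0):
    """Random Fourier feature map approximating a Gaussian (RBF) kernel."""
    W = rng.normal(scale=1.0 / bandwidth, size=(n_features, dim))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return lambda x: np.sqrt(2.0 / n_features) * np.cos(W @ x + b)

def streaming_fit(stream, phi, n_features=200, reg=1e-3):
    """One pass: accumulate A = reg*I + sum z z^T and c = sum y*z, then solve A w = c."""
    A = reg * np.eye(n_features)
    c = np.zeros(n_features)
    for x, y in stream:          # each (x, y) pair is seen exactly once
        z = phi(x)
        A += np.outer(z, z)
        c += y * z
    return np.linalg.solve(A, c)

# Toy periodic system: forecast x[i+1] from the delay pair (x[i], x[i-1]).
dt = 0.1
xs = np.sin(np.arange(0.0, 60.0, dt))
phi = make_rff(dim=2)
stream = ((np.array([xs[i], xs[i - 1]]), xs[i + 1]) for i in range(1, len(xs) - 1))
w = streaming_fit(stream, phi)

pred = phi(np.array([xs[300], xs[299]])) @ w   # one-step forecast at i = 300
```

Because the sufficient statistics are accumulated incrementally, memory cost depends only on the feature count, not on the number of training samples, which is the property a streaming method needs.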
Related papers
- Data-Augmented Predictive Deep Neural Network: Enhancing the extrapolation capabilities of non-intrusive surrogate models [0.5735035463793009]
We propose a new deep learning framework, where kernel dynamic mode decomposition (KDMD) is employed to evolve the dynamics of the latent space generated by the encoder part of a convolutional autoencoder (CAE)
After adding the KDMD-decoder-extrapolated data into the original data set, we train the CAE along with a feed-forward deep neural network using the augmented data.
The trained network can predict future states outside the training time interval at any out-of-training parameter samples.
arXiv Detail & Related papers (2024-10-17T09:26:14Z) - Physics-guided Active Sample Reweighting for Urban Flow Prediction [75.24539704456791]
Urban flow prediction is a nuanced spatio-temporal modeling task that estimates the throughput of transportation services like buses, taxis, and ride-sharing vehicles.
Some recent prediction solutions bring remedies with the notion of physics-guided machine learning (PGML)
We develop a discretized physics-guided network (PN), and propose a data-aware framework Physics-guided Active Sample Reweighting (P-GASR)
arXiv Detail & Related papers (2024-07-18T15:44:23Z) - Online Distributional Regression [0.0]
Large-scale streaming data are common in modern machine learning applications.
Many fields, such as supply chain management, weather and meteorology, have pivoted towards using probabilistic forecasts.
We present a methodology for online estimation of regularized, linear distributional models.
arXiv Detail & Related papers (2024-06-26T16:04:49Z) - Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) to affine the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z) - Stabilizing Machine Learning Prediction of Dynamics: Noise and
Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
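The idea that many small input-noise realizations act as a deterministic regularizer has a classical analogue: for a linear model, the expected squared loss under Gaussian input noise equals the clean loss plus a ridge penalty. The sketch below checks this identity numerically; it illustrates the general principle only and is not an implementation of LMNT.

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(size=5)          # fixed linear model f(x) = w @ x
x = rng.normal(size=5)
y = 0.3
sigma = 0.1                     # input-noise scale

# Monte Carlo estimate of E_eps[(w @ (x + eps) - y)^2], eps ~ N(0, sigma^2 I)
eps = rng.normal(scale=sigma, size=(200_000, 5))
mc = np.mean(((x + eps) @ w - y) ** 2)

# Deterministic equivalent: clean loss plus the ridge penalty sigma^2 * ||w||^2
closed = (w @ x - y) ** 2 + sigma**2 * (w @ w)
```

Replacing sampled noise with its closed-form expectation is the kind of deterministic approximation the abstract describes, here in its simplest possible setting.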
arXiv Detail & Related papers (2022-11-09T23:40:52Z) - Scalable computation of prediction intervals for neural networks via
matrix sketching [79.44177623781043]
Existing algorithms for uncertainty estimation require modifying the model architecture and training procedure.
This work proposes a new algorithm that can be applied to a given trained neural network and produces approximate prediction intervals.
arXiv Detail & Related papers (2022-05-06T13:18:31Z) - A Meta-learning Approach to Reservoir Computing: Time Series Prediction
with Limited Data [0.0]
We present a data-driven approach to automatically extract an appropriate model structure from experimentally observed processes.
We demonstrate our approach on a simple benchmark problem, where it beats state-of-the-art meta-learning techniques.
arXiv Detail & Related papers (2021-10-07T18:23:14Z) - Incorporating Causal Graphical Prior Knowledge into Predictive Modeling
via Simple Data Augmentation [92.96204497841032]
Causal graphs (CGs) are compact representations of the knowledge of the data generating processes behind the data distributions.
We propose a model-agnostic data augmentation method that allows us to exploit the prior knowledge of the conditional independence (CI) relations.
We experimentally show that the proposed method is effective in improving the prediction accuracy, especially in the small-data regime.
arXiv Detail & Related papers (2021-02-27T06:13:59Z) - Data-driven geophysical forecasting: Simple, low-cost, and accurate
baselines with kernel methods [0.6875312133832078]
We show that when the kernel of these emulators is also learned from data, the resulting data-driven models are faster than equation-based models.
We see significant improvements over climatology and persistence based forecast techniques.
arXiv Detail & Related papers (2021-02-13T19:57:33Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
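Modeling observed dynamics as a (forced) linear system is in the spirit of dynamic mode decomposition: fit a linear operator A mapping each snapshot to the next by least squares, then forecast by iterating A. The sketch below shows only the unforced, exact-DMD core on a toy rotation; the stochastically forced ensemble variant in the paper is more elaborate.

```python
import numpy as np

# Toy data: a 2-D rotation, x[k+1] = R x[k]  (a near-periodic system)
theta = 0.2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = np.empty((2, 100))
X[:, 0] = [1.0, 0.0]
for k in range(99):
    X[:, k + 1] = R @ X[:, k]

# DMD core: least-squares fit of A such that X[:, 1:] ≈ A X[:, :-1]
A = X[:, 1:] @ np.linalg.pinv(X[:, :-1])

# Forecast 10 steps ahead from the last snapshot by iterating A
x = X[:, -1]
for _ in range(10):
    x = A @ x
```

The interpretability claim in the abstract comes from exactly this structure: the eigenvalues of A directly encode the oscillation frequencies and growth rates of the identified dynamics.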
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Supervised learning from noisy observations: Combining machine-learning
techniques with data assimilation [0.6091702876917281]
We show how to optimally combine forecast models and their inherent uncertainty with incoming noisy observations.
We show that the obtained forecast model has remarkably good forecast skill while being computationally cheap once trained.
Going beyond the task of forecasting, we show that our method can be used to generate reliable ensembles for probabilistic forecasting as well as to learn effective model closure in multi-scale systems.
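Optimally combining a forecast with a noisy incoming observation is, in the simplest scalar case, the Kalman analysis step: each source is weighted by its inverse variance. A minimal sketch of that building block (not the paper's method, which couples data assimilation with machine-learned forecast models):

```python
def analysis(x_f, P, y, R):
    """Scalar Kalman analysis: blend forecast x_f (variance P) with observation y (variance R)."""
    K = P / (P + R)                 # Kalman gain: how much to trust the observation
    x_a = x_f + K * (y - x_f)       # analysis mean
    P_a = (1.0 - K) * P             # analysis variance, P*R/(P+R) < min(P, R)
    return x_a, P_a

# Example: uncertain forecast (P = 0.5) meets a more precise observation (R = 0.25)
x_a, P_a = analysis(x_f=1.0, P=0.5, y=1.4, R=0.25)
```

The analysis variance is always smaller than both input variances, which is why blending a cheap trained forecast model with observations can retain good skill at low cost.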
arXiv Detail & Related papers (2020-07-14T22:29:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.