A projected nonlinear state-space model for forecasting time series
signals
- URL: http://arxiv.org/abs/2311.13247v1
- Date: Wed, 22 Nov 2023 09:05:37 GMT
- Title: A projected nonlinear state-space model for forecasting time series
signals
- Authors: Christian Donner, Anuj Mishra, Hideaki Shimazaki
- Abstract summary: We propose a fast algorithm to learn and forecast nonlinear dynamics from noisy time series data.
A key feature of the proposed model is kernel functions applied to projected lines, enabling fast capture of nonlinearities in the latent dynamics.
- Score: 0.6537685198688538
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning and forecasting stochastic time series is essential in various
scientific fields. However, despite the proposals of nonlinear filters and
deep-learning methods, it remains challenging to capture nonlinear dynamics
from a few noisy samples and predict future trajectories with uncertainty
estimates while maintaining computational efficiency. Here, we propose a fast
algorithm to learn and forecast nonlinear dynamics from noisy time series data.
A key feature of the proposed model is kernel functions applied to projected
lines, enabling fast and efficient capture of nonlinearities in the latent
dynamics. Through empirical case studies and benchmarking, the model
demonstrates its effectiveness in learning and forecasting complex nonlinear
dynamics, offering a valuable tool for researchers and practitioners in time
series analysis.
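The abstract's key idea, latent dynamics built from kernel functions applied to one-dimensional projections of the state, can be sketched as follows. This is an illustrative assumption of the model structure, not the authors' implementation: the dimensions, RBF kernel choice, and random weights are hypothetical placeholders for learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 2   # latent state dimension (illustrative)
P = 4   # number of projection directions ("lines")
K = 5   # kernel centers per projected line

W = rng.standard_normal((P, D))               # projection directions
centers = np.linspace(-2.0, 2.0, K)           # kernel centers on each line
lengthscale = 0.5
coef = 0.1 * rng.standard_normal((P, K, D))   # weights (learned in the paper; random here)

def transition(x):
    """One latent step: add a sum of RBF features of the projected state."""
    proj = W @ x                                                            # (P,)
    feats = np.exp(-0.5 * ((proj[:, None] - centers) / lengthscale) ** 2)   # (P, K)
    return x + np.einsum('pk,pkd->d', feats, coef)

# Forecast a short trajectory from an initial latent state.
x = np.array([0.5, -0.3])
traj = [x]
for _ in range(10):
    x = transition(x)
    traj.append(x)
traj = np.stack(traj)
print(traj.shape)  # (11, 2)
```

Because each kernel acts on a scalar projection rather than the full state, evaluating the nonlinearity stays cheap as the latent dimension grows, which is consistent with the paper's emphasis on fast capture of nonlinearities.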
Related papers
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Safe Active Learning for Time-Series Modeling with Gaussian Processes [7.505622158856545]
Learning time-series models is useful for many applications, such as simulation and forecasting.
In this study, we consider the problem of actively learning time-series models while taking given safety constraints into account.
The proposed approach generates data appropriate for time series model learning, i.e. input and output trajectories, by dynamically exploring the input space.
arXiv Detail & Related papers (2024-02-09T09:40:33Z)
- Gaussian process learning of nonlinear dynamics [0.0]
We propose a new method that learns nonlinear dynamics through a Bayesian inference of characterizing model parameters.
We will discuss the applicability of the proposed method to several typical scenarios for dynamical systems.
arXiv Detail & Related papers (2023-12-19T14:27:26Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, however, these forecasts can exhibit artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Efficient learning of nonlinear prediction models with time-series privileged information [11.679648862014655]
We show that for prediction in linear-Gaussian dynamical systems, a LuPI learner with access to intermediate time series data is never worse than any unbiased classical learner.
We propose algorithms based on random features and representation learning for the case when this map is unknown.
arXiv Detail & Related papers (2022-09-15T05:56:36Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- RNN with Particle Flow for Probabilistic Spatio-temporal Forecasting [30.277213545837924]
Classical statistical models often fall short in handling the complexity and high non-linearity present in time-series data.
In this work, we consider the time-series data as a random realization from a nonlinear state-space model.
We use particle flow as the tool for approximating the posterior distribution of the states, as it is shown to be highly effective in complex, high-dimensional settings.
arXiv Detail & Related papers (2021-06-10T21:49:23Z)
- Meta-Learning for Koopman Spectral Analysis with Short Time-series [49.41640137945938]
Existing methods require long time-series for training neural networks.
We propose a meta-learning method for estimating embedding functions from unseen short time-series.
We experimentally demonstrate that the proposed method achieves better performance in terms of eigenvalue estimation and future prediction.
arXiv Detail & Related papers (2021-02-09T07:19:19Z)
- Neural Dynamic Mode Decomposition for End-to-End Modeling of Nonlinear Dynamics [49.41640137945938]
We propose a neural dynamic mode decomposition for estimating a lift function based on neural networks.
With our proposed method, the forecast error is backpropagated through the neural networks and the spectral decomposition.
Our experiments demonstrate the effectiveness of our proposed method in terms of eigenvalue estimation and forecast performance.
arXiv Detail & Related papers (2020-12-11T08:34:26Z)
- Improved Predictive Deep Temporal Neural Networks with Trend Filtering [22.352437268596674]
We propose a new prediction framework based on deep neural networks and trend filtering.
We reveal that the predictive performance of deep temporal neural networks improves when the training data is temporally processed by a trend filtering.
arXiv Detail & Related papers (2020-10-16T08:29:36Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.