Time Series Forecasting Using Manifold Learning
- URL: http://arxiv.org/abs/2110.03625v2
- Date: Fri, 8 Oct 2021 14:26:29 GMT
- Title: Time Series Forecasting Using Manifold Learning
- Authors: Panagiotis Papaioannou, Ronen Talmon, Daniela di Serafino,
Constantinos Siettos
- Abstract summary: We address a three-tier numerical framework based on manifold learning for the forecasting of high-dimensional time series.
At the first step, we embed the time series into a reduced low-dimensional space using a nonlinear manifold learning algorithm.
At the second step, we construct reduced-order regression models on the manifold to forecast the embedded dynamics.
At the final step, we lift the embedded time series back to the original high-dimensional space.
- Score: 6.316185724124034
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We address a three-tier numerical framework based on manifold learning for
the forecasting of high-dimensional time series. At the first step, we embed
the time series into a reduced low-dimensional space using a nonlinear manifold
learning algorithm such as Locally Linear Embedding and Diffusion Maps. At the
second step, we construct reduced-order regression models on the manifold, in
particular Multivariate Autoregressive (MVAR) and Gaussian Process Regression
(GPR) models, to forecast the embedded dynamics. At the final step, we lift the
embedded time series back to the original high-dimensional space using Radial
Basis Functions interpolation and Geometric Harmonics. For our illustrations,
we test the forecasting performance of the proposed numerical scheme with four
sets of time series: three synthetic stochastic ones resembling EEG signals
produced from linear and nonlinear stochastic models with different model
orders, and one real-world data set containing daily time series of 10 key
foreign exchange rates (FOREX) spanning the time period 03/09/2001-29/10/2020.
The forecasting performance of the proposed numerical scheme is assessed using
the combinations of manifold learning, modelling and lifting approaches. We
also provide a comparison with the Principal Component Analysis algorithm as
well as with the naive random walk model and the MVAR and GPR models trained
and implemented directly in the high-dimensional space.
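Below is a minimal, self-contained sketch of the three-tier pipeline on toy data, assuming scikit-learn's LocallyLinearEmbedding for the reduction step, an ordinary-least-squares MVAR model on the embedded coordinates, and SciPy's RBFInterpolator for the lifting step. The toy series, the hyperparameters (n_neighbors, n_components, lag order p, RBF kernel) and the library choices are illustrative assumptions, not the paper's settings; Diffusion Maps, GPR and Geometric Harmonics could be substituted at the corresponding stages.

```python
# Sketch of the three-tier scheme: (1) nonlinear embedding, (2) reduced-order
# regression on the manifold, (3) lifting back to the original space.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
X = np.cumsum(rng.standard_normal((1000, 50)), axis=0)   # toy high-dimensional series, T x D

# Step 1: embed into a low-dimensional space (LLE here; Diffusion Maps is the
# other option named in the abstract).
lle = LocallyLinearEmbedding(n_neighbors=20, n_components=3)
Y = lle.fit_transform(X)                                  # T x d embedded coordinates

# Step 2: reduced-order MVAR(p) model on the embedded coordinates, fitted by
# ordinary least squares (GPR is the alternative regressor named in the abstract).
p = 2
Z = np.hstack([Y[p - k - 1 : len(Y) - k - 1] for k in range(p)])  # lagged predictors [t-1, ..., t-p]
A, *_ = np.linalg.lstsq(Z, Y[p:], rcond=None)                      # (d*p) x d coefficient matrix

z_last = np.hstack([Y[len(Y) - k - 1] for k in range(p)])          # most recent p embedded states
y_next = z_last @ A                                                # one-step-ahead forecast on the manifold

# Step 3: lift the forecast back to the original space with RBF interpolation
# trained on (embedded point -> high-dimensional point) pairs; Geometric
# Harmonics is the alternative lifting operator named in the abstract.
lift = RBFInterpolator(Y, X, neighbors=50, kernel="thin_plate_spline")
x_next = lift(y_next[None, :])                                     # forecast in the original D-dim space
print(x_next.shape)                                                # (1, 50)
```

The same skeleton admits the paper's other combinations: a Gaussian Process Regression forecaster (e.g. sklearn.gaussian_process.GaussianProcessRegressor fitted per embedded coordinate) in place of the MVAR step, and Geometric Harmonics in place of RBF interpolation for the lifting, although the latter has no off-the-shelf SciPy/scikit-learn implementation.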
Related papers
- Time Series Analysis by State Space Learning [44.99833362998488]
Time series analysis by state-space models is widely used in forecasting and extracting unobservable components like level, slope, and seasonality, along with explanatory variables.
Our research introduces State Space Learning (SSL), a novel framework that leverages statistical learning to provide a comprehensive approach to time series modeling and forecasting.
arXiv Detail & Related papers (2024-08-17T07:04:26Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Efficient Interpretable Nonlinear Modeling for Multiple Time Series [5.448070998907116]
This paper proposes an efficient nonlinear modeling approach for multiple time series.
It incorporates nonlinear interactions among different time-series variables.
Experimental results show that the proposed algorithm improves the identification of the support of the VAR coefficients in a parsimonious manner.
arXiv Detail & Related papers (2023-09-29T11:42:59Z)
- Hybrid State Space-based Learning for Sequential Data Prediction with Joint Optimization [0.0]
We introduce a hybrid model that mitigates, via a joint mechanism, the domain-specific feature engineering issues of conventional nonlinear prediction models.
We achieve this by introducing novel state space representations for the base models, which are then combined to provide a full state space representation of the hybrid model or ensemble.
Thanks to this novel combination and joint optimization, we demonstrate significant improvements on well-known real-life competition datasets.
arXiv Detail & Related papers (2023-09-19T12:00:28Z)
- Sequential Monte Carlo Learning for Time Series Structure Discovery [17.964180907602657]
We present a novel structure learning algorithm that integrates sequential Monte Carlo and involutive MCMC for highly effective posterior inference.
Our method can be used both in "online" settings, where new data is incorporated sequentially in time, and in "offline" settings, by using nested subsets of historical data to anneal the posterior.
arXiv Detail & Related papers (2023-07-13T16:38:01Z)
- Gait Recognition in the Wild with Multi-hop Temporal Switch [81.35245014397759]
Gait recognition in the wild is a more practical problem that has attracted the attention of the multimedia and computer vision communities.
This paper presents a novel multi-hop temporal switch method to achieve effective temporal modeling of gait patterns in real-world scenes.
arXiv Detail & Related papers (2022-09-01T10:46:09Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Time varying regression with hidden linear dynamics [74.9914602730208]
We revisit a model for time-varying linear regression that assumes the unknown parameters evolve according to a linear dynamical system.
Counterintuitively, we show that when the underlying dynamics are stable, the parameters of this model can be estimated from data by combining just two ordinary least squares estimates.
arXiv Detail & Related papers (2021-12-29T23:37:06Z)
- Time Series Forecasting with Ensembled Stochastic Differential Equations Driven by Lévy Noise [2.3076895420652965]
We use a collection of SDEs equipped with neural networks to predict the long-term trend of noisy time series.
Our contributions are, first, the use of phase space reconstruction to extract the intrinsic dimension of the time series data (a generic delay-embedding sketch illustrating this step appears after this list).
Second, we explore SDEs driven by α-stable Lévy motion to model the time series data and solve the problem through neural network approximation.
arXiv Detail & Related papers (2021-11-25T16:49:01Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance across a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
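As a generic illustration of the phase space reconstruction step mentioned in the Lévy-noise SDE entry above, the following is a minimal Takens delay-embedding sketch; the embedding dimension m and delay tau are illustrative choices, not parameters taken from that paper, and in practice they are typically selected via average mutual information (tau) and false nearest neighbours (m).

```python
# Generic Takens delay embedding (phase space reconstruction) for a scalar series.
import numpy as np

def delay_embed(x, m, tau):
    """Return an (len(x) - (m-1)*tau) x m matrix whose rows are
    [x[t], x[t-tau], ..., x[t-(m-1)*tau]]."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[(m - 1 - k) * tau : (m - 1 - k) * tau + n] for k in range(m)])

rng = np.random.default_rng(1)
t = np.linspace(0.0, 60.0, 3000)
x = np.sin(t) + 0.1 * rng.standard_normal(t.shape)   # noisy toy signal
E = delay_embed(x, m=3, tau=25)
print(E.shape)                                       # (2950, 3)
```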
This list is automatically generated from the titles and abstracts of the papers on this site.