Deep Switching State Space Model (DS$^3$M) for Nonlinear Time Series Forecasting with Regime Switching
- URL: http://arxiv.org/abs/2106.02329v1
- Date: Fri, 4 Jun 2021 08:25:47 GMT
- Title: Deep Switching State Space Model (DS$^3$M) for Nonlinear Time Series Forecasting with Regime Switching
- Authors: Xiuqin Xu, Ying Chen
- Abstract summary: We propose a deep switching state space model (DS$^3$M) for efficient inference and forecasting of nonlinear time series.
The switching among regimes is captured by both discrete and continuous latent variables with recurrent neural networks.
- Score: 3.3970049571884204
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a deep switching state space model (DS$^3$M) for efficient
inference and forecasting of nonlinear time series that switch irregularly
among various regimes. The switching among regimes is captured by both discrete
and continuous latent variables with recurrent neural networks. The model is
estimated with variational inference using a reparameterization trick. We test
the approach on a variety of simulated and real datasets. In all cases, DS$^3$M
achieves competitive performance compared to several state-of-the-art methods
(e.g. GRU, SRNN, DSARF, SNLDS), with superior forecasting accuracy, convincing
interpretability of the discrete latent variables, and powerful representation
of the continuous latent variables for different kinds of time series.
Specifically, DS$^3$M improves the MAPE by 0.09\% to 15.71\% compared with the
second-best performing alternative models.
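To make the abstract's modeling idea concrete, the following is a minimal sketch of a switching state space model with a discrete regime variable and a continuous latent state, together with the Gaussian reparameterization trick mentioned for variational estimation. The number of regimes, the linear per-regime dynamics, the noise scales, and the omission of the recurrent networks are all simplifying assumptions for illustration; this is not the authors' DS$^3$M implementation.

```python
# Illustrative sketch only (not the DS^3M architecture): a switching state
# space model with a discrete regime d_t and a continuous latent state z_t,
# plus the Gaussian reparameterization trick used in variational estimation.
import numpy as np

rng = np.random.default_rng(0)

K, D_Z, D_X, T = 2, 3, 1, 300   # regimes, latent dim, observation dim, length

# Markov transition matrix for the discrete regime d_t (assumed values).
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])

# Regime-specific linear dynamics for z_t and a shared linear emission for x_t
# (DS^3M would use learned, recurrent parameterizations instead).
A = np.stack([0.90 * np.eye(D_Z), -0.70 * np.eye(D_Z)])
C = rng.normal(size=(D_X, D_Z))

def simulate():
    """Generate discrete regimes d, continuous latents z, and observations x."""
    d = np.zeros(T, dtype=int)
    z = np.zeros((T, D_Z))
    x = np.zeros((T, D_X))
    for t in range(1, T):
        d[t] = rng.choice(K, p=P[d[t - 1]])                   # regime switch
        z[t] = A[d[t]] @ z[t - 1] + 0.10 * rng.normal(size=D_Z)
        x[t] = C @ z[t] + 0.05 * rng.normal(size=D_X)
    return d, z, x

def reparameterize(mu, log_var):
    """Sample z = mu + sigma * eps. In an autodiff framework this transform
    makes the sample differentiable with respect to mu and log_var."""
    eps = rng.normal(size=np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps

d, z, x = simulate()
print("fraction of time in regime 1:", (d == 1).mean())
print("reparameterized sample:", reparameterize(np.zeros(D_Z), -2.0 * np.ones(D_Z)))
```

In DS$^3$M itself, the abstract states only that recurrent neural networks help capture the switching through the discrete and continuous latent variables and that estimation uses variational inference with a reparameterization trick; the fixed linear maps above merely stand in for those learned components.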
Related papers
- Efficient Interpretable Nonlinear Modeling for Multiple Time Series [5.448070998907116]
This paper proposes an efficient nonlinear modeling approach for multiple time series.
It incorporates nonlinear interactions among different time-series variables.
Experimental results show that the proposed algorithm improves the identification of the support of the VAR coefficients in a parsimonious manner.
arXiv Detail & Related papers (2023-09-29T11:42:59Z) - Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series data are recorded over a short time period, which results in a large gap between deep models and the limited, noisy time series data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - Gait Recognition in the Wild with Multi-hop Temporal Switch [81.35245014397759]
Gait recognition in the wild is a more practical problem that has attracted attention from the multimedia and computer vision communities.
This paper presents a novel multi-hop temporal switch method to achieve effective temporal modeling of gait patterns in real-world scenes.
arXiv Detail & Related papers (2022-09-01T10:46:09Z) - Triformer: Triangular, Variable-Specific Attentions for Long Sequence Multivariate Time Series Forecasting--Full Version [50.43914511877446]
We propose a triangular, variable-specific attention to ensure high efficiency and accuracy.
We show that Triformer outperforms state-of-the-art methods w.r.t. both accuracy and efficiency.
arXiv Detail & Related papers (2022-04-28T20:41:49Z) - Interpretable Latent Variables in Deep State Space Models [4.884336328409872]
We introduce a new version of deep state-space models (DSSMs) that combines a recurrent neural network with a state-space framework to forecast time series data.
The model estimates the observed series as functions of latent variables that evolve non-linearly through time.
arXiv Detail & Related papers (2022-03-03T23:10:58Z) - Warped Dynamic Linear Models for Time Series of Counts [1.3515965758160216]
We introduce a novel semiparametric methodology for count time series by warping a Gaussian DLM.
We leverage these results to produce customized and efficient algorithms for inference and forecasting.
arXiv Detail & Related papers (2021-10-27T21:44:00Z) - Deep Explicit Duration Switching Models for Time Series [84.33678003781908]
We propose a flexible model that is capable of identifying both state- and time-dependent switching dynamics.
State-dependent switching is enabled by a recurrent state-to-switch connection.
An explicit duration count variable is used to improve the time-dependent switching behavior.
arXiv Detail & Related papers (2021-10-26T17:35:21Z) - Dynamic Gaussian Mixture based Deep Generative Model For Robust Forecasting on Sparse Multivariate Time Series [43.86737761236125]
We propose a novel generative model, which tracks the transition of latent clusters, instead of isolated feature representations.
It is characterized by a newly designed dynamic Gaussian mixture distribution, which captures the dynamics of clustering structures.
A structured inference network is also designed for enabling inductive analysis.
arXiv Detail & Related papers (2021-03-03T04:10:07Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - On Multivariate Singular Spectrum Analysis and its Variants [23.517864567789353]
We introduce and analyze a variant of multivariate singular spectrum analysis (mSSA), a popular time series method.
We establish that the prediction mean squared error for both imputation and out-of-sample forecasting scales effectively as $1 / \sqrt{\min(N, T)\, T}$ (restated in the note after this list).
On benchmark datasets, our variant of mSSA performs competitively with state-of-the-art neural-network time series methods.
arXiv Detail & Related papers (2020-06-24T03:17:01Z) - Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)
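As a small supplement to the mSSA entry above, the quoted error rate can be written out as the display below. Reading $N$ as the number of time series and $T$ as the number of observations per series follows the usual mSSA setup and should be treated as an assumption here rather than a detail taken from this summary.

```latex
% Prediction mean squared error rate quoted in the mSSA entry, with the
% symbols N (number of time series) and T (observations per series) being
% an assumed reading of the summary rather than something it defines.
\[
  \mathrm{MSE}_{\text{imputation / forecasting}}
  \;=\; O\!\left( \frac{1}{\sqrt{\min(N, T)\, T}} \right)
\]
% The rate vanishes as either the number of series or their length grows.
```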
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.