Deep Switching Auto-Regressive Factorization: Application to Time Series
Forecasting
- URL: http://arxiv.org/abs/2009.05135v1
- Date: Thu, 10 Sep 2020 20:15:59 GMT
- Title: Deep Switching Auto-Regressive Factorization: Application to Time Series
Forecasting
- Authors: Amirreza Farnoosh, Bahar Azari, Sarah Ostadabbas
- Abstract summary: DSARF approximates high dimensional data by a product between time dependent weights and spatially dependent factors.
DSARF is different from the state-of-the-art techniques in that it parameterizes the weights in terms of a deep switching vector auto-regressive factorization.
Our experiments attest to the superior performance of DSARF in terms of long- and short-term prediction error when compared with state-of-the-art methods.
- Score: 16.934920617960085
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce deep switching auto-regressive factorization (DSARF), a deep
generative model for spatio-temporal data with the capability to unravel
recurring patterns in the data and perform robust short- and long-term
predictions. Similar to other factor analysis methods, DSARF approximates high
dimensional data by a product between time dependent weights and spatially
dependent factors. These weights and factors are in turn represented in terms
of lower dimensional latent variables that are inferred using stochastic
variational inference. DSARF differs from state-of-the-art techniques
in that it parameterizes the weights in terms of a deep switching vector
auto-regressive likelihood governed by a Markovian prior, which is able to
capture the non-linear inter-dependencies among weights to characterize
multimodal temporal dynamics. This results in a flexible hierarchical deep
generative factor analysis model that can be extended to (i) provide a
collection of potentially interpretable states abstracted from the process
dynamics, and (ii) perform short- and long-term vector time series prediction
in a complex multi-relational setting. Our extensive experiments, which include
simulated data and real data from a wide range of applications such as climate
change, weather forecasting, traffic, infectious disease spread and nonlinear
physical systems, attest to the superior performance of DSARF in terms of long-
and short-term prediction error when compared with state-of-the-art methods.
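The factorization described in the abstract can be sketched in a few lines of NumPy. Everything below is an illustrative assumption (toy shapes, random parameters, a hand-rolled generative loop), not the authors' implementation: data Y (time by space) is approximated as a product of time-dependent weights W and spatially dependent factors F, where W evolves under a switching vector auto-regressive process whose discrete state follows a Markov chain.

```python
import numpy as np

# Hypothetical toy dimensions, chosen only for illustration.
rng = np.random.default_rng(0)
T, D, K, S = 100, 20, 3, 2  # time steps, spatial dims, factors, switching states

F = rng.normal(size=(K, D))                 # spatially dependent factors
A = rng.normal(scale=0.3, size=(S, K, K))   # one AR transition matrix per state
P = np.array([[0.95, 0.05],                 # Markovian prior over state switches
              [0.05, 0.95]])

# Switching vector auto-regressive weights:
#   w_t = A[z_t] @ w_{t-1} + noise, with z_t a Markov chain over S states.
W = np.zeros((T, K))
W[0] = rng.normal(size=K)
z = 0
for t in range(1, T):
    z = rng.choice(S, p=P[z])               # sample the next discrete state
    W[t] = A[z] @ W[t - 1] + 0.1 * rng.normal(size=K)

# Factor-analysis approximation: high dimensional data as weights x factors.
Y = W @ F
assert Y.shape == (T, D)
```

In the paper these weights and factors are themselves represented by lower dimensional latent variables inferred with stochastic variational inference; the sketch above only illustrates the generative structure of the factorization.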
Related papers
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z) - Parsimony or Capability? Decomposition Delivers Both in Long-term Time Series Forecasting [46.63798583414426]
Long-term time series forecasting (LTSF) represents a critical frontier in time series analysis.
Our study demonstrates, through both analytical and empirical evidence, that decomposition is key to containing excessive model inflation.
Remarkably, by tailoring decomposition to the intrinsic dynamics of time series data, our proposed model outperforms existing benchmarks.
arXiv Detail & Related papers (2024-01-22T13:15:40Z) - Discovering Predictable Latent Factors for Time Series Forecasting [39.08011991308137]
We develop a novel framework for inferring the intrinsic latent factors implied by the observable time series.
We introduce three characteristics, i.e., predictability, sufficiency, and identifiability, and model them via powerful deep latent dynamics models.
Empirical results on multiple real datasets show the efficiency of our method for different kinds of time series forecasting.
arXiv Detail & Related papers (2023-03-18T14:37:37Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Flow-based Spatio-Temporal Structured Prediction of Motion Dynamics [21.24885597341643]
Conditional Normalizing Flows (CNFs) are flexible generative models capable of representing complicated distributions with high dimensionality and interdimensional correlations.
We propose MotionFlow as a novel approach that autoregressively normalizes the output on the temporal input features.
We apply our method to different tasks, including motion prediction, time series forecasting, and binary segmentation.
arXiv Detail & Related papers (2021-04-09T14:30:35Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Deep Markov Spatio-Temporal Factorization [16.125473644303852]
Deep Markov Spatio-Temporal Factorization (DMSTF) is a generative model for dynamical analysis of spatio-temporal data.
DMSTF learns a low dimensional spatial latent to generatively parameterize spatial factors or their functional forms.
This results in a flexible family of generative factor analysis models that can be extended to perform time series clustering or factor analysis in the presence of a control signal.
arXiv Detail & Related papers (2020-03-22T01:27:44Z) - Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z) - Multivariate Probabilistic Time Series Forecasting via Conditioned
Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that it improves over the state-of-the-art for standard metrics on many real-world data sets.
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.