An Interpretable and Efficient Infinite-Order Vector Autoregressive
Model for High-Dimensional Time Series
- URL: http://arxiv.org/abs/2209.01172v4
- Date: Sat, 24 Feb 2024 22:53:59 GMT
- Authors: Yao Zheng
- Abstract summary: This paper proposes a novel sparse infinite-order VAR model for high-dimensional time series.
The temporal and cross-sectional structures of the VARMA-type dynamics captured by this model can be interpreted separately.
Greater statistical efficiency and interpretability can be achieved with little loss of temporal information.
- Score: 1.4939176102916187
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As a special infinite-order vector autoregressive (VAR) model, the vector
autoregressive moving average (VARMA) model can capture much richer temporal
patterns than the widely used finite-order VAR model. However, its practicality
has long been hindered by its non-identifiability, computational
intractability, and difficulty of interpretation, especially for
high-dimensional time series. This paper proposes a novel sparse infinite-order
VAR model for high-dimensional time series, which avoids all of the above drawbacks
while inheriting essential temporal patterns of the VARMA model. As another
attractive feature, the temporal and cross-sectional structures of the
VARMA-type dynamics captured by this model can be interpreted separately, since
they are characterized by different sets of parameters. This separation
naturally motivates the sparsity assumption on the parameters determining the
cross-sectional dependence. As a result, greater statistical efficiency and
interpretability can be achieved with little loss of temporal information. We
introduce two $\ell_1$-regularized estimation methods for the proposed model,
which can be efficiently implemented via block coordinate descent algorithms,
and derive the corresponding nonasymptotic error bounds. A consistent model
order selection method based on the Bayesian information criterion is also
developed. The merit of the proposed approach is supported by simulation
studies and a real-world macroeconomic data analysis.
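The estimation strategy the abstract describes, ℓ1-regularized least squares solved by coordinate descent, can be illustrated with a minimal equation-by-equation lasso fit of a finite-order VAR(p). This is a generic sketch under simplifying assumptions, not the paper's estimator: the lag order, penalty level, and all function names below are hypothetical.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator used in coordinate-descent lasso."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for min_b (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, d = X.shape
    b = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    r = y - X @ b                        # current residual
    for _ in range(n_iter):
        for j in range(d):
            r += X[:, j] * b[j]          # remove j-th contribution
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]          # add updated contribution back
    return b

def fit_sparse_var(Y, p=2, lam=0.1):
    """Fit a sparse VAR(p) equation by equation with the lasso.

    Y is a (T, N) array of the observed series. Returns coefficient
    matrices A_1, ..., A_p stacked as a (p, N, N) array.
    """
    T, N = Y.shape
    # Lagged design matrix: row t holds [y_{t-1}, ..., y_{t-p}].
    X = np.hstack([Y[p - k : T - k] for k in range(1, p + 1)])
    A = np.zeros((p, N, N))
    for i in range(N):                   # one lasso per equation
        b = lasso_cd(X, Y[p:, i], lam)
        A[:, i, :] = b.reshape(p, N)
    return A
```

The equation-by-equation decomposition is what makes coordinate-descent approaches attractive in high dimensions: each of the N regressions can be solved (and parallelized) independently.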
Related papers
- Efficient Interpretable Nonlinear Modeling for Multiple Time Series [5.448070998907116]
This paper proposes an efficient nonlinear modeling approach for multiple time series.
It incorporates nonlinear interactions among different time-series variables.
Experimental results show that the proposed algorithm improves the identification of the support of the VAR coefficients in a parsimonious manner.
arXiv Detail & Related papers (2023-09-29T11:42:59Z)
- ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model-class namely "Denoising Diffusion Probabilistic Models" or DDPMs for chirographic data.
Our model, named "ChiroDiff", being non-autoregressive, learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z)
- Low-Rank Constraints for Fast Inference in Structured Models [110.38427965904266]
This work demonstrates a simple approach to reduce the computational and memory complexity of a large class of structured models.
Experiments with neural parameterized structured models for language modeling, polyphonic music modeling, unsupervised grammar induction, and video modeling show that our approach matches the accuracy of standard models at large state spaces.
arXiv Detail & Related papers (2022-01-08T00:47:50Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Flow-based Spatio-Temporal Structured Prediction of Motion Dynamics [21.24885597341643]
Conditional Normalizing Flows (CNFs) are flexible generative models capable of representing complicated distributions with high dimensionality and interdimensional correlations.
We propose MotionFlow, a novel approach that autoregressively normalizes the output on the temporal input features.
We apply our method to different tasks, including prediction, motion prediction, time series forecasting, and binary segmentation.
arXiv Detail & Related papers (2021-04-09T14:30:35Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Time Adaptive Gaussian Model [0.913755431537592]
Our model is a generalization of state-of-the-art methods for the inference of temporal graphical models.
It performs pattern recognition by clustering data points in time, and it finds probabilistic (and possibly causal) relationships among the observed variables.
arXiv Detail & Related papers (2021-02-02T00:28:14Z)
- Autoregressive Score Matching [113.4502004812927]
We propose autoregressive conditional score models (AR-CSM), which parameterize the joint distribution in terms of the derivatives of univariate log-conditionals (scores).
For AR-CSM models, the divergence between data and model distributions can be computed and optimized efficiently, requiring no expensive sampling or adversarial training.
We show with extensive experimental results that it can be applied to density estimation on synthetic data, image generation, image denoising, and training latent variable models with implicit encoders.
arXiv Detail & Related papers (2020-10-24T07:01:24Z)
- Goal-directed Generation of Discrete Structures with Conditional Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short Python expressions which evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
- Predicting Multidimensional Data via Tensor Learning [0.0]
We develop a model that retains the intrinsic multidimensional structure of the dataset.
To estimate the model parameters, an Alternating Least Squares algorithm is developed.
The proposed model is able to outperform benchmark models present in the forecasting literature.
arXiv Detail & Related papers (2020-02-11T11:57:07Z)
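The Alternating Least Squares idea mentioned in the last entry can be illustrated generically: fixing one factor of a low-rank decomposition turns the problem into an ordinary least-squares solve for the other. The sketch below factors a plain matrix rather than the paper's multidimensional tensor model; the function name and parameters are hypothetical.

```python
import numpy as np

def als_low_rank(M, rank=2, n_iter=50, reg=1e-6):
    """Alternating least squares for a rank-`rank` factorization M ≈ U @ V.T.

    Each step fixes one factor and solves the resulting least-squares
    problem in closed form; `reg` is a small ridge term for numerical
    stability of the normal equations.
    """
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    I = reg * np.eye(rank)
    for _ in range(n_iter):
        # Fix V, solve for U:  U = M V (V^T V + reg*I)^{-1}
        U = M @ V @ np.linalg.inv(V.T @ V + I)
        # Fix U, solve for V:  V = M^T U (U^T U + reg*I)^{-1}
        V = M.T @ U @ np.linalg.inv(U.T @ U + I)
    return U, V
```

Each subproblem is convex given the other factor, which is why ALS iterations monotonically decrease the reconstruction error even though the joint problem is non-convex.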
This list is automatically generated from the titles and abstracts of the papers in this site.