Sparse Tucker Decomposition and Graph Regularization for High-Dimensional Time Series Forecasting
- URL: http://arxiv.org/abs/2601.00377v1
- Date: Thu, 01 Jan 2026 15:56:35 GMT
- Title: Sparse Tucker Decomposition and Graph Regularization for High-Dimensional Time Series Forecasting
- Authors: Sijia Xia, Michael K. Ng, Xiongjun Zhang
- Abstract summary: We propose a sparse Tucker decomposition method with graph regularization for high-dimensional vector autoregressive time series. The two proposed regularization techniques can be shown to yield more accurate parameter estimation. A proximal alternating linearized minimization algorithm is designed to solve the resulting model.
- Score: 18.099606410441734
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Existing vector autoregressive methods for multivariate time series analysis use low-rank matrix approximation or Tucker decomposition to reduce dimensionality and mitigate the over-parameterization issue. In this paper, we propose a sparse Tucker decomposition method with graph regularization for high-dimensional vector autoregressive time series. By stacking the time-series transition matrices into a third-order tensor, the sparse Tucker decomposition is employed to characterize important interactions within the transition tensor and reduce the number of parameters. Moreover, graph regularization is employed to measure the local consistency of the response, predictor and temporal factor matrices in the vector autoregressive model. The two proposed regularization techniques can be shown to yield more accurate parameter estimation. A non-asymptotic error bound for the estimator of the proposed method is established, which is lower than those of existing matrix- or tensor-based methods. A proximal alternating linearized minimization algorithm is designed to solve the resulting model, and its global convergence is established under very mild conditions. Extensive numerical experiments on synthetic data and real-world datasets are carried out to verify the superior performance of the proposed method over existing state-of-the-art methods.
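The construction described in the abstract can be illustrated numerically. The sketch below is an illustrative approximation only, not the paper's algorithm: it stacks the transition matrices of a VAR(P) model into a third-order tensor, computes a plain (non-sparse) truncated-HOSVD Tucker approximation, and evaluates a graph-Laplacian penalty of the form tr(UᵀLU), the standard way to measure local consistency of a factor matrix. All function names are hypothetical.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a third-order tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(A, ranks):
    """Truncated HOSVD: a plain (non-sparse) Tucker approximation
    A ~ G x_1 U1 x_2 U2 x_3 U3, with Un the leading left singular
    vectors of the mode-n unfolding."""
    factors = [np.linalg.svd(unfold(A, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    G = A
    for mode, U in enumerate(factors):
        # Mode-n product of G with U^T shrinks mode `mode` to rank r.
        G = np.moveaxis(np.tensordot(U.T, np.moveaxis(G, mode, 0), axes=1), 0, mode)
    return G, factors

def tucker_to_tensor(G, factors):
    """Reassemble the full tensor from the Tucker core and factor matrices."""
    T = G
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)
    return T

def graph_penalty(U, L):
    """Graph-Laplacian local-consistency penalty tr(U^T L U)."""
    return np.trace(U.T @ L @ U)

# Stack P transition matrices of a VAR(P) model into an N x N x P tensor.
N, P = 6, 3
rng = np.random.default_rng(0)
A = np.stack([rng.standard_normal((N, N)) for _ in range(P)], axis=2)
G, Us = hosvd(A, (4, 4, P))        # truncated Tucker ranks (4, 4, P)
A_hat = tucker_to_tensor(G, Us)    # low-parameter approximation of A
```

Truncating the Tucker ranks trades reconstruction accuracy for a large reduction in parameters (from N²P entries to a small core plus three thin factor matrices), which is the dimension-reduction effect the abstract refers to; with full ranks the HOSVD reconstruction is exact.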
Related papers
- Multi-Dimensional Visual Data Recovery: Scale-Aware Tensor Modeling and Accelerated Randomized Computation [51.65236537605077]
We propose a new type of network compression optimization technique, fully-connected tensor network (FCTN) compression. FCTN has significant advantages in correlation characterization and transpositional invariance, and has notable achievements in multi-dimensional data processing and analysis. We derive efficient algorithms with guarantees to solve the formulated models.
arXiv Detail & Related papers (2026-02-13T14:56:37Z) - Revisiting Trace Norm Minimization for Tensor Tucker Completion: A Direct Multilinear Rank Learning Approach [22.740653766104153]
This paper shows that trace norm-based formulations in Tucker completion are inefficient in multilinear rank minimization.
We propose a new interpretation of Tucker format such that trace norm minimization is applied to the factor matrices of the equivalent representation.
Numerical results are presented to show that the proposed algorithm exhibits significantly improved performance in terms of multilinear rank learning.
arXiv Detail & Related papers (2024-09-08T15:44:00Z) - Spatiotemporal Regularized Tucker Decomposition Approach for Traffic Data Imputation [0.0]
In intelligent transportation systems, traffic data imputation, which estimates missing values from partially observed data, is an unavoidable and challenging task.
Previous studies have not fully considered traffic data's multidimensionality and correlations, which are vital to data recovery, especially in scenarios with high missing rates.
arXiv Detail & Related papers (2023-05-11T04:42:35Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Graph Polynomial Convolution Models for Node Classification of Non-Homophilous Graphs [52.52570805621925]
We investigate efficient learning from higher-order graph convolution and learning directly from the adjacency matrix for node classification.
We show that the resulting models lead to new graph filters with a residual scaling parameter.
We demonstrate that the proposed methods obtain improved accuracy for node classification of non-homophilous graphs.
arXiv Detail & Related papers (2022-09-12T04:46:55Z) - Modelling matrix time series via a tensor CP-decomposition [7.900118935012717]
We propose to model matrix time series based on a tensor CP-decomposition.
We show that all the component coefficients in the CP-decomposition are estimated consistently, with error rates that depend on the relative sizes of the time series dimensions and the sample size.
arXiv Detail & Related papers (2021-12-31T13:02:06Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Alternating minimization algorithms for graph regularized tensor completion [8.26185178671935]
We consider a Canonical Polyadic (CP) decomposition approach to low-rank tensor completion (LRTC).
The usage of graph regularization entails benefits in the learning accuracy of LRTC, but at the same time, induces coupling graph Laplacian terms.
We propose efficient alternating minimization algorithms by leveraging the block structure of the underlying CP decomposition-based model.
arXiv Detail & Related papers (2020-08-28T23:20:49Z) - Understanding Implicit Regularization in Over-Parameterized Single Index Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method significantly outperforms state-of-the-art methods.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)
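Several entries above (the matrix-time-series CP model and the graph-regularized CP completion work) build on the Canonical Polyadic format. As a hedged illustration only, not any cited paper's actual algorithm, the following minimal alternating-least-squares (ALS) sketch fits a rank-R CP decomposition of a third-order tensor; all names are hypothetical.

```python
import numpy as np

def cp_als(T, rank, n_iter=500, seed=0):
    """Fit a rank-`rank` CP decomposition of a third-order tensor by
    alternating least squares: T[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r]."""
    rng = np.random.default_rng(seed)
    dims = T.shape
    F = [rng.standard_normal((d, rank)) for d in dims]
    for _ in range(n_iter):
        for mode in range(3):
            # Khatri-Rao product of the two fixed factors, taken in
            # increasing mode order to match the unfolding below.
            others = [F[m] for m in range(3) if m != mode]
            kr = (others[0][:, None, :] * others[1][None, :, :]).reshape(-1, rank)
            # Mode-n unfolding of T (C-order flattening of the other modes).
            X = np.moveaxis(T, mode, 0).reshape(dims[mode], -1)
            # Least-squares update of the free factor.
            F[mode] = np.linalg.lstsq(kr, X.T, rcond=None)[0].T
    return F

# Recover the factors of an exactly rank-2 tensor.
rng = np.random.default_rng(1)
true = [rng.standard_normal((d, 2)) for d in (5, 4, 3)]
T = np.einsum('ir,jr,kr->ijk', *true)
F = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', *F)
```

Each ALS sweep solves three linear least-squares problems, one per factor; the graph-regularized variants in the papers above add coupled Laplacian terms to these subproblems, which is what motivates their more careful block-structured algorithms.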
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.