Generalized Temporal Tensor Decomposition with Rank-revealing Latent-ODE
- URL: http://arxiv.org/abs/2502.06164v1
- Date: Mon, 10 Feb 2025 05:27:11 GMT
- Title: Generalized Temporal Tensor Decomposition with Rank-revealing Latent-ODE
- Authors: Panqi Chen, Lei Cheng, Jianlong Li, Weichang Li, Weiqing Liu, Jiang Bian, Shikai Fang
- Abstract summary: We propose Generalized temporal tensor decomposition with Rank-rEvealing latenT-ODE (GRET).
Our approach encodes continuous spatial indexes as learnable Fourier features and employs neural ODEs in latent space to learn the temporal trajectories of factors.
GRET significantly outperforms existing methods in prediction performance and robustness against noise.
- Score: 17.61798738261815
- Abstract: Tensor decomposition is a fundamental tool for analyzing multi-dimensional data by learning low-rank factors to represent high-order interactions. While recent works on temporal tensor decomposition have made significant progress by incorporating continuous timestamps in latent factors, they still struggle with general tensor data with continuous indexes not only in the temporal mode but also in other modes, such as spatial coordinates in climate data. Additionally, the problem of determining the tensor rank remains largely unexplored in temporal tensor models. To address these limitations, we propose \underline{G}eneralized temporal tensor decomposition with \underline{R}ank-r\underline{E}vealing laten\underline{T}-ODE (GRET). Our approach encodes continuous spatial indexes as learnable Fourier features and employs neural ODEs in latent space to learn the temporal trajectories of factors. To automatically reveal the rank of temporal tensors, we introduce a rank-revealing Gaussian-Gamma prior over the factor trajectories. We develop an efficient variational inference scheme with an analytical evidence lower bound, enabling sampling-free optimization. Through extensive experiments on both synthetic and real-world datasets, we demonstrate that GRET not only reveals the underlying ranks of temporal tensors but also significantly outperforms existing methods in prediction performance and robustness against noise.
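The paper's code is not reproduced here, but the two ingredients named in the abstract are easy to sketch. Below is a minimal, illustrative PyTorch sketch of (i) learnable Fourier features for continuous indexes and (ii) a latent ODE that evolves factor trajectories; all shapes, layer sizes, and the fixed-step Euler integrator are assumptions, and the rank-revealing Gaussian-Gamma prior and the analytical ELBO are not shown.
```python
# Minimal sketch (not the authors' code): learnable Fourier features plus a
# latent ODE for factor trajectories, the two ingredients the abstract names.
import torch
import torch.nn as nn

class FourierFeatures(nn.Module):
    """Map continuous indexes x in R^d to [cos(2*pi*xB), sin(2*pi*xB)], B learnable."""
    def __init__(self, in_dim: int, num_freqs: int):
        super().__init__()
        self.B = nn.Parameter(torch.randn(in_dim, num_freqs))

    def forward(self, x):                          # x: (batch, in_dim)
        proj = 2 * torch.pi * x @ self.B           # (batch, num_freqs)
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1)

class LatentODE(nn.Module):
    """dz/dt = f(z), integrated with fixed-step Euler for simplicity."""
    def __init__(self, latent_dim: int, hidden: int = 64):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(latent_dim, hidden), nn.Tanh(),
                               nn.Linear(hidden, latent_dim))

    def forward(self, z0, t):                      # z0: (batch, d); t: (T,) increasing
        zs, z = [z0], z0
        for k in range(1, len(t)):
            z = z + (t[k] - t[k - 1]) * self.f(z)  # Euler step
            zs.append(z)
        return torch.stack(zs)                     # (T, batch, d): factor trajectories

# Toy usage: embed 2-D spatial coordinates, then evolve a rank-4 latent factor.
coords = torch.rand(8, 2)                          # 8 spatial locations
feats = FourierFeatures(2, 16)(coords)             # (8, 32)
z0 = nn.Linear(32, 4)(feats)                       # project features to factor space
traj = LatentODE(4)(z0, torch.linspace(0.0, 1.0, 10))
print(traj.shape)                                  # torch.Size([10, 8, 4])
```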
Related papers
- Irregular Tensor Low-Rank Representation for Hyperspectral Image Representation [71.69331824668954]
Spectral variations pose a common challenge in analyzing hyperspectral images (HSI)
Low-rank tensor representation has emerged as a robust strategy, leveraging inherent correlations within HSI data.
We propose a novel model for irregular tensor low-rank representation tailored to efficiently model irregular 3D cubes.
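For readers unfamiliar with the underlying machinery, a generic rank-R CP factorization fitted by alternating least squares illustrates what "low-rank tensor representation" means; this is a textbook sketch, not the paper's irregular-tensor model.
```python
# Textbook CP-ALS for a 3-way tensor: X ~ sum_r a_r (outer) b_r (outer) c_r.
import numpy as np

def unfold(X, mode):                       # mode-n matricization (C order)
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def khatri_rao(A, B):                      # column-wise Kronecker product
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def cp_als(X, rank, iters=50):
    """Alternating least squares for a rank-R CP model of a 3-way tensor."""
    F = [np.random.randn(s, rank) for s in X.shape]
    for _ in range(iters):
        for n in range(3):
            i, j = [m for m in range(3) if m != n]
            gram = (F[i].T @ F[i]) * (F[j].T @ F[j])   # Hadamard of Grams
            F[n] = unfold(X, n) @ khatri_rao(F[i], F[j]) @ np.linalg.pinv(gram)
    return F

# Sanity check: an exactly rank-2 tensor is typically recovered to high accuracy.
A, B, C = (np.random.randn(s, 2) for s in (6, 5, 4))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
F = cp_als(X, rank=2)
print(np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', *F)))  # close to zero
```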
arXiv Detail & Related papers (2024-10-24T02:56:22Z) - A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
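The "fully-conjugate" claim rests on Gamma-Poisson conjugacy. A minimal illustration of that conjugate update follows (a single update, not the full non-stationary PGDS with its time-varying transition matrices).
```python
# Gamma-Poisson conjugacy, the building block of fully-conjugate Gibbs samplers:
# with x_i ~ Poisson(lam) and lam ~ Gamma(a, b) (shape/rate), the posterior is
# Gamma(a + sum x, b + n). Illustrative only, not the PGDS model itself.
import numpy as np

rng = np.random.default_rng(0)
a, b = 1.0, 1.0                      # Gamma(shape, rate) prior
x = rng.poisson(lam=3.5, size=200)   # observed counts

a_post, b_post = a + x.sum(), b + len(x)
lam_samples = rng.gamma(shape=a_post, scale=1.0 / b_post, size=5000)
print(lam_samples.mean())            # concentrates near the true rate 3.5
```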
arXiv Detail & Related papers (2024-02-26T04:39:01Z) - Nonparametric Factor Trajectory Learning for Dynamic Tensor Decomposition [20.55025648415664]
We propose NONparametric FActor Trajectory learning for dynamic tensor decomposition (NONFAT).
We use a second-level GP to sample the entry values and to capture the temporal relationship between the entities.
We have shown the advantage of our method in several real-world applications.
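The basic ingredient here is a GP prior over factor trajectories. Below is a minimal sketch of drawing one smooth trajectory from an RBF-kernel GP; it is illustrative only, as NONFAT adds a second-level GP over the entry values.
```python
# Drawing a smooth latent-factor trajectory u(t) from a GP with an RBF kernel,
# the basic ingredient of GP-based trajectory models like the one above.
import numpy as np

def rbf_kernel(t, lengthscale=0.2, var=1.0):
    d = t[:, None] - t[None, :]
    return var * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 100)
K = rbf_kernel(t) + 1e-8 * np.eye(len(t))            # jitter for stability
traj = rng.multivariate_normal(np.zeros(len(t)), K)  # one factor trajectory
print(traj.shape)                                    # (100,)
```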
arXiv Detail & Related papers (2022-07-06T05:33:00Z) - Truncated tensor Schatten p-norm based approach for spatiotemporal traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating data imputation with the alternating direction method of multipliers.
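The workhorse inside such ADMM solvers is a proximal step on singular values. For the plain nuclear norm (the Schatten p-norm with p = 1) that step is singular-value soft-thresholding, sketched below for a single matrix; the paper's truncated Schatten p-norm generalizes this step.
```python
# Singular-value soft-thresholding: the proximal operator of tau * nuclear norm,
# i.e. the p = 1 member of the Schatten p-norm family used above.
import numpy as np

def svt(M, tau):
    """Soft-threshold the singular values of M by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(2)
M = rng.standard_normal((20, 30))
s_before = np.linalg.svd(M, compute_uv=False)
s_after = np.linalg.svd(svt(M, tau=4.0), compute_uv=False)
print((s_before > 1e-10).sum(), "->", (s_after > 1e-10).sum())  # rank drops
```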
arXiv Detail & Related papers (2022-05-19T08:37:56Z) - Low-Rank Hankel Tensor Completion for Traffic Speed Estimation [7.346671461427793]
We propose a purely data-driven and model-free solution to the traffic state estimation problem.
By imposing a low-rank assumption on this tensor structure, we can approximately characterize both global patterns and the unknown complex local dynamics.
We conduct numerical experiments on both synthetic simulation data and real-world high-resolution data, and our results demonstrate the effectiveness and superiority of the proposed model.
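The "model-free" structure comes from Hankelization: lifting a series into a Hankel array whose low rank captures global patterns. A minimal 1-D sketch follows; the paper builds a higher-order Hankel tensor from spatiotemporal data.
```python
# Hankel lifting of a 1-D series: the embedded matrix of a single sinusoid is
# exactly rank 2, which is the low-rank structure such models exploit.
import numpy as np

def hankelize(x, window):
    n = len(x) - window + 1
    return np.stack([x[i:i + window] for i in range(n)])  # (n, window)

t = np.arange(100)
x = np.sin(0.2 * t)                        # a single sinusoid
H = hankelize(x, window=20)
print(np.linalg.matrix_rank(H, tol=1e-8))  # 2: one sinusoid -> rank-2 Hankel
```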
arXiv Detail & Related papers (2021-05-21T00:08:06Z) - Time-Aware Tensor Decomposition for Missing Entry Prediction [14.61218681943499]
Given a time-evolving tensor with missing entries, how can we effectively factorize it for precisely predicting the missing entries?
We propose TATD (Time-Aware Decomposition), a novel tensor decomposition method for real-world temporal tensors.
We show that TATD provides the state-of-the-art accuracy for decomposing temporal tensors.
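The "time-aware" part amounts to regularizing the temporal factor so that nearby timestamps get similar embeddings. The simplest finite-difference variant of that idea is sketched below; TATD's actual regularizer is kernel-based and density-aware, so this is only a stand-in.
```python
# Generic temporal-smoothness penalty on the time-mode factor of a tensor
# decomposition: squared first differences, added to the reconstruction loss.
# An illustrative stand-in for TATD's kernel-based, density-aware regularizer.
import numpy as np

def smoothness_penalty(U_time):
    # U_time: (T, rank) temporal factor matrix
    return np.sum((U_time[1:] - U_time[:-1]) ** 2)

U = np.cumsum(np.random.default_rng(3).normal(size=(50, 4)), axis=0)
print(smoothness_penalty(U))  # value to be weighted into the training loss
```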
arXiv Detail & Related papers (2020-12-16T10:52:34Z) - Direction Matters: On the Implicit Bias of Stochastic Gradient Descent with Moderate Learning Rate [105.62979485062756]
This paper attempts to characterize the particular regularization effect of SGD in the moderate learning rate regime.
We show that SGD converges along the large eigenvalue directions of the data matrix, while GD goes after the small eigenvalue directions.
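The directional picture is easiest to see on a quadratic loss 0.5 w^T H w, where full-batch GD contracts the error along eigen-direction i by |1 - lr * lambda_i| per step, so the learning-rate regime decides which directions converge first. The check below is a standard background fact, not the paper's SGD analysis.
```python
# Per-direction contraction factors of gradient descent on a quadratic:
# error along eigen-direction i shrinks by |1 - lr * lam_i| each step.
import numpy as np

lams = np.array([10.0, 1.0, 0.1])      # eigenvalues of the data matrix
for lr in (0.01, 0.19):                # small vs. moderate learning rate
    print(lr, np.abs(1 - lr * lams))
# lr=0.01 -> [0.9, 0.99, 0.999]: the large-eigenvalue direction converges fastest
# lr=0.19 -> [0.9, 0.81, 0.981]: the ordering changes in the moderate-lr regime
```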
arXiv Detail & Related papers (2020-11-04T21:07:52Z) - Spatio-Temporal Tensor Sketching via Adaptive Sampling [15.576219771198389]
We propose SkeTenSmooth, a novel tensor factorization framework that uses adaptive sampling to compress tensor slices in a temporally streaming fashion.
Experiments on the New York City Yellow Taxi data show that SkeTenSmooth greatly reduces the memory cost and random sampling rate.
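A hedged sketch of the adaptive-sampling idea: keep an incoming slice of the stream only when it deviates enough from the last kept slice, so the effective sampling rate rises when the data changes quickly. The threshold rule here is an illustrative stand-in for SkeTenSmooth's actual criterion.
```python
# Adaptive slice sampling for a streaming tensor: store a time slice only when
# it differs enough from the last stored one (illustrative criterion).
import numpy as np

def adaptive_sample(slices, threshold):
    kept, last = [], None
    for t, s in enumerate(slices):
        if last is None or np.linalg.norm(s - last) > threshold:
            kept.append((t, s))          # keep only informative slices
            last = s
    return kept

rng = np.random.default_rng(4)
stream = [np.ones((5, 5)) * (t // 10) + 0.01 * rng.normal(size=(5, 5))
          for t in range(50)]            # piecewise-constant stream + noise
print(len(adaptive_sample(stream, threshold=1.0)), "of", len(stream), "kept")
```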
arXiv Detail & Related papers (2020-06-21T23:55:10Z) - On Learning Rates and Schrödinger Operators [105.32118775014015]
We present a general theoretical analysis of the effect of the learning rate.
We find that the learning rate tends to zero for a broad class of non-neural functions.
arXiv Detail & Related papers (2020-04-15T09:52:37Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
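The compression comes from the tensor-train (TT) format. Below is a minimal TT-SVD, which factors a dense tensor into a train of 3-way cores by sequential truncated SVDs; this is the generic format the convolutional module builds on, not the module itself.
```python
# TT-SVD: sequential truncated SVDs turn a dense tensor into a "train" of
# 3-way cores. Generic sketch of the tensor-train format, not the paper's
# convolutional module.
import numpy as np

def tt_svd(X, max_rank):
    dims, cores, r = X.shape, [], 1
    M = X.reshape(dims[0], -1)
    for d in dims[:-1]:
        U, s, Vt = np.linalg.svd(M.reshape(r * d, -1), full_matrices=False)
        k = min(max_rank, len(s))
        cores.append(U[:, :k].reshape(r, d, k))
        M = s[:k, None] * Vt[:k]          # carry the remainder forward
        r = k
    cores.append(M.reshape(r, dims[-1], 1))
    return cores

# With ranks left untruncated the factorization is exact:
X = np.random.default_rng(5).normal(size=(4, 5, 6))
G = tt_svd(X, max_rank=30)
Xr = np.einsum('aib,bjc,ckd->ijk', G[0], G[1], G[2])
print(np.linalg.norm(X - Xr))             # ~1e-14
```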
arXiv Detail & Related papers (2020-02-21T05:00:01Z) - Grassmannian Optimization for Online Tensor Completion and Tracking with the t-SVD [10.137631021498109]
We show the t-SVD is a specialization of the well-studied block-term decomposition for third-order tensors.
We present an algorithm under this model that can track changing free submodules from incomplete streaming 2-D data.
Our results are competitive in accuracy but much faster in compute time than state-of-the-art tensor completion algorithms on real applications.
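The t-SVD itself reduces to ordinary matrix SVDs after an FFT along the third mode. A minimal numpy sketch of that core step with an exact reconstruction check follows; the streaming/tracking algorithm is not shown.
```python
# Core step of the t-SVD: FFT along the third mode, a matrix SVD per frontal
# slice in the transform domain, then an inverse FFT to return.
import numpy as np

def t_svd_slices(X):
    """Slice-wise SVDs of a 3-way tensor in the FFT domain."""
    Xf = np.fft.fft(X, axis=2)
    return [np.linalg.svd(Xf[:, :, k], full_matrices=False)
            for k in range(X.shape[2])]

X = np.random.default_rng(6).normal(size=(4, 3, 5))
slices = t_svd_slices(X)
# Invert: matmul per Fourier slice, then an inverse FFT along the third mode.
Xf_rec = np.stack([U @ np.diag(s) @ Vt for U, s, Vt in slices], axis=2)
X_rec = np.fft.ifft(Xf_rec, axis=2).real
print(np.linalg.norm(X - X_rec))  # ~1e-14: the factorization is exact
```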
arXiv Detail & Related papers (2020-01-30T15:56:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.