Time-Aware Tensor Decomposition for Missing Entry Prediction
- URL: http://arxiv.org/abs/2012.08855v1
- Date: Wed, 16 Dec 2020 10:52:34 GMT
- Title: Time-Aware Tensor Decomposition for Missing Entry Prediction
- Authors: Dawon Ahn, Jun-Gi Jang, U Kang
- Abstract summary: Given a time-evolving tensor with missing entries, how can we effectively factorize it for precisely predicting the missing entries?
We propose TATD (Time-Aware Tensor Decomposition), a novel tensor decomposition method for real-world temporal tensors.
We show that TATD provides state-of-the-art accuracy for decomposing temporal tensors.
- Score: 14.61218681943499
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Given a time-evolving tensor with missing entries, how can we effectively
factorize it for precisely predicting the missing entries? Tensor factorization
has been extensively utilized for analyzing various multi-dimensional
real-world data. However, existing models for tensor factorization have
disregarded the temporal property of the data, even though most real-world data
are closely related to time. Moreover, they do not address the accuracy
degradation caused by the sparsity of time slices. The essential problems of how
to exploit the temporal property and how to handle the sparsity of time slices
remain unresolved. In this paper, we propose TATD
(Time-Aware Tensor Decomposition), a novel tensor decomposition method for
real-world temporal tensors. TATD is designed to exploit temporal dependency
and time-varying sparsity of real-world temporal tensors. We propose a new
smoothing regularization with a Gaussian kernel for modeling time dependency.
Moreover, we improve the performance of TATD by considering time-varying
sparsity. We design an alternating optimization scheme suitable for temporal
tensor factorization with our smoothing regularization. Extensive experiments
show that TATD provides state-of-the-art accuracy for decomposing temporal
tensors.
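As a rough sketch of the smoothing regularization described above, the following Python snippet penalizes each time-slice factor for deviating from a Gaussian-kernel weighted average of its temporal neighbors. The function name, window size, and density-based weighting are illustrative assumptions, not TATD's actual implementation.

```python
import numpy as np

def gaussian_smoothing_penalty(time_factor, sigma=1.0, window=3, density=None):
    """Penalty encouraging each time-slice factor to stay close to a
    Gaussian-kernel weighted average of its temporal neighbors (sketch)."""
    n = time_factor.shape[0]
    penalty = 0.0
    for t in range(n):
        idx = np.arange(max(0, t - window), min(n, t + window + 1))
        w = np.exp(-0.5 * ((idx - t) / sigma) ** 2)  # Gaussian kernel weights
        w /= w.sum()
        smoothed = w @ time_factor[idx]              # weighted neighbor average
        # Hypothetical sparsity weighting: smooth sparse slices more strongly.
        scale = 1.0 if density is None else 1.0 / (density[t] + 1e-8)
        penalty += scale * np.sum((time_factor[t] - smoothed) ** 2)
    return penalty

# In an alternating scheme, this penalty would be added to the reconstruction
# loss while each factor matrix is updated in turn.
reg = gaussian_smoothing_penalty(np.random.randn(100, 10), sigma=2.0)
```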
Related papers
- Distributed Stochastic Gradient Descent with Staleness: A Stochastic Delay Differential Equation Based Framework [56.82432591933544]
Distributed stochastic gradient descent (SGD) has attracted considerable recent attention due to its potential for scaling computational resources, reducing training time, and helping protect user privacy in machine learning.
This paper analyzes the run time and staleness of distributed SGD based on stochastic delay differential equations (SDDEs) and an approximation of gradient arrivals.
Interestingly, it is shown that increasing the number of activated workers does not necessarily accelerate distributed SGD due to staleness.
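A toy illustration of that staleness effect (a simplified sketch on a quadratic objective with a fixed per-worker delay, not the paper's SDDE framework):

```python
import numpy as np

def stale_sgd(num_workers, steps=500, lr=0.05):
    """Minimize f(x) = 0.5 * ||x||^2 with gradients computed from parameters
    that are (num_workers - 1) updates old, mimicking asynchronous staleness."""
    x = np.ones(10)
    history = [x.copy()]
    for k in range(steps):
        stale_x = history[max(0, k - (num_workers - 1))]  # stale parameter read
        x = x - lr * stale_x   # gradient of 0.5*||x||^2 at stale_x is stale_x
        history.append(x.copy())
    return np.linalg.norm(x)

for workers in (1, 4, 16, 64):
    # Per-update progress degrades (and can diverge) as staleness grows.
    print(workers, stale_sgd(workers))
```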
arXiv Detail & Related papers (2024-06-17T02:56:55Z) - Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm.
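A minimal sketch of the graph smoothness ingredient named above, the standard Laplacian quadratic form tr(U^T L U); the paper's alignment of this regularizer with a weighted tensor nuclear norm is theoretical and not reproduced here:

```python
import numpy as np

def graph_smoothness(U, adjacency):
    """tr(U^T L U): penalizes factor rows of adjacent nodes for differing."""
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency
    return np.trace(U.T @ laplacian @ U)

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph on 3 nodes
U = np.random.randn(3, 2)
print(graph_smoothness(U, A))
```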
arXiv Detail & Related papers (2023-10-04T02:55:10Z) - SWoTTeD: An Extension of Tensor Decomposition to Temporal Phenotyping [0.0]
We propose SWoTTeD (Sliding Window for Temporal Decomposition), a novel method to discover hidden temporal patterns.
We validate our proposal using both synthetic and real-world datasets, and we present an original use case using data from the Greater Paris University Hospital.
The results show that SWoTTeD achieves at least as accurate reconstruction as recent state-of-the-art tensor decomposition models.
arXiv Detail & Related papers (2023-10-02T13:42:11Z) - A Time-aware tensor decomposition for tracking evolving patterns [0.7958824725263767]
Time-evolving data sets can often be arranged as a higher-order tensor with one of the modes being the time mode.
While tensor factorizations have been successfully used to capture the underlying patterns in such higher-order data sets, the temporal aspect is often ignored.
We propose temporal PARAFAC2: a PARAFAC2-based tensor factorization method with temporal regularization to extract gradually evolving patterns from temporal data.
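The temporal-regularization idea can be sketched as a penalty on successive slice factors (a simplified stand-in; temporal PARAFAC2's actual regularizer and coupling constraints are more involved):

```python
import numpy as np

def temporal_smoothness(Bs):
    """Sum of squared differences between consecutive PARAFAC2-style
    slice factors B_t, discouraging abrupt pattern changes over time."""
    return sum(np.sum((Bs[t] - Bs[t - 1]) ** 2) for t in range(1, len(Bs)))

Bs = [np.random.randn(5, 3) for _ in range(10)]  # one factor matrix per time slice
print(temporal_smoothness(Bs))
```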
arXiv Detail & Related papers (2023-08-14T13:13:50Z) - TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [80.56913334060404]
Time series analysis is of immense importance in applications, such as weather forecasting, anomaly detection, and action recognition.
Previous methods attempt to model temporal variations directly from the 1D time series.
We ravel out the complex temporal variations into multiple intraperiod- and interperiod-variations.
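One way to picture intraperiod- and interperiod-variations is to fold a 1D series into a 2D array along its dominant FFT period (an illustrative sketch; the paper's model handles several periods with learned 2D backbones):

```python
import numpy as np

def fold_by_dominant_period(series):
    """Reshape a 1D series into (num_periods, period) using the strongest
    FFT frequency; variation within a row is intraperiod, across rows interperiod."""
    spectrum = np.abs(np.fft.rfft(series))
    spectrum[0] = 0.0                        # ignore the DC component
    cycles = int(np.argmax(spectrum))        # dominant number of cycles
    period = max(1, len(series) // cycles)
    usable = (len(series) // period) * period
    return series[:usable].reshape(-1, period)

t = np.arange(400)
x = np.sin(2 * np.pi * t / 25) + 0.1 * np.random.randn(400)
print(fold_by_dominant_period(x).shape)      # -> (16, 25)
```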
arXiv Detail & Related papers (2022-10-05T12:19:51Z) - Nonparametric Factor Trajectory Learning for Dynamic Tensor Decomposition [20.55025648415664]
We propose NONparametric FActor Trajectory learning for dynamic tensor decomposition (NONFAT).
We use a second-level GP to sample the entry values and to capture the temporal relationship between the entities.
We have shown the advantage of our method in several real-world applications.
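As a generic illustration of GP-based factor trajectories (not NONFAT's two-level construction), one can sample a smooth time-varying factor from a GP with an RBF kernel:

```python
import numpy as np

def sample_gp_trajectory(times, lengthscale=1.0, jitter=1e-6, seed=0):
    """Draw one smooth function from GP(0, RBF) evaluated at `times`,
    usable as a time-varying factor for one entity."""
    d = times[:, None] - times[None, :]
    K = np.exp(-0.5 * (d / lengthscale) ** 2) + jitter * np.eye(len(times))
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(np.zeros(len(times)), K)

traj = sample_gp_trajectory(np.linspace(0, 10, 50), lengthscale=2.0)
print(traj.shape)  # (50,)
```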
arXiv Detail & Related papers (2022-07-06T05:33:00Z) - Truncated tensor Schatten p-norm based approach for spatiotemporal traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating the alternating direction method of multipliers (ADMM) into our data imputation algorithm.
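For the norm in the title, a minimal matrix sketch of a truncated Schatten p-norm, which sums the p-th powers of all but the largest r singular values (the paper applies the idea tensor-wise and solves with ADMM):

```python
import numpy as np

def truncated_schatten_p(X, r=1, p=0.5):
    """Sum of the p-th powers of all but the r largest singular values,
    a nonconvex low-rank surrogate that spares the dominant structure."""
    s = np.linalg.svd(X, compute_uv=False)   # singular values, descending
    return np.sum(s[r:] ** p)

X = np.random.randn(20, 15)
print(truncated_schatten_p(X, r=2, p=0.5))
```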
arXiv Detail & Related papers (2022-05-19T08:37:56Z) - MTC: Multiresolution Tensor Completion from Partial and Coarse Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Tensor Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z) - Multi-version Tensor Completion for Time-delayed Spatio-temporal Data [50.762087239885936]
Real-world spatio-temporal data is often incomplete or inaccurate due to various data loading delays.
We propose a low-rank tensor model to predict the updates over time.
We obtain up to 27.2% lower root mean squared error compared to the best baseline method.
arXiv Detail & Related papers (2021-05-11T19:55:56Z) - Low-Rank and Sparse Enhanced Tucker Decomposition for Tensor Completion [3.498620439731324]
We introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion.
Our model possesses a sparse regularization term to promote a sparse core tensor, which is beneficial for tensor data compression.
Notably, our model can handle different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties appearing in tensors.
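The sparse-core idea can be sketched with an HOSVD-style Tucker factorization followed by L1 soft-thresholding of the core (a crude stand-in for the paper's unified regularized model):

```python
import numpy as np

def soft_threshold(x, lam):
    """Elementwise shrinkage that drives small core entries to exactly zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_core_hosvd(X, ranks, lam=0.1):
    """HOSVD-style Tucker factors, then L1 shrinkage on the core."""
    factors = []
    for mode, r in enumerate(ranks):
        unfolded = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
        factors.append(U[:, :r])
    core = X
    for mode, U in enumerate(factors):
        # Mode-`mode` product of the core with U^T.
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return soft_threshold(core, lam), factors

X = np.random.randn(6, 7, 8)
core, factors = sparse_core_hosvd(X, ranks=(3, 3, 3))
print(core.shape, (core == 0).mean())  # sparse 3x3x3 core
```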
arXiv Detail & Related papers (2020-10-01T12:45:39Z) - Spatio-Temporal Tensor Sketching via Adaptive Sampling [15.576219771198389]
We propose SkeTenSmooth, a novel tensor factorization framework that uses adaptive sampling to compress tensor slices in a temporally streaming fashion.
Experiments on the New York City Yellow Taxi data show that SkeTenSmooth greatly reduces the memory cost and outperforms random sampling at the same rate.
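A minimal sketch of the adaptive-sampling intuition, with a hypothetical thresholding rule rather than SkeTenSmooth's actual criterion: keep an incoming slice only when it deviates enough from a running summary of recent slices.

```python
import numpy as np

def adaptive_slice_sampler(slices, threshold=0.5):
    """Stream over time slices; store a slice only if it differs enough from a
    running summary of kept slices, compressing smooth stretches of the stream."""
    kept, summary = [], None
    for t, s in enumerate(slices):
        if summary is None or np.linalg.norm(s - summary) > threshold:
            kept.append((t, s))
            summary = s if summary is None else 0.5 * (summary + s)
    return kept

stream = [np.full((4, 4), np.sin(t / 10.0)) for t in range(100)]
kept = adaptive_slice_sampler(stream, threshold=1.0)
print(f"kept {len(kept)} of 100 slices")
```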
arXiv Detail & Related papers (2020-06-21T23:55:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.