Tensor train rank minimization with nonlocal self-similarity for tensor completion
- URL: http://arxiv.org/abs/2004.14273v1
- Date: Wed, 29 Apr 2020 15:39:39 GMT
- Title: Tensor train rank minimization with nonlocal self-similarity for tensor completion
- Authors: Meng Ding, Ting-Zhu Huang, Xi-Le Zhao, Michael K. Ng, Tian-Hui Ma
- Abstract summary: The tensor train (TT) rank has received increasing attention in tensor completion due to its ability to capture the global correlation of high-order tensors.
For third-order visual data, direct TT rank minimization has not exploited the potential of the TT rank for high-order tensors.
We propose a TT rank minimization with nonlocal self-similarity for tensor completion by simultaneously exploring the spatial, temporal/spectral, and nonlocal redundancy in visual data.
- Score: 27.727973182796678
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The tensor train (TT) rank has received increasing attention in tensor
completion due to its ability to capture the global correlation of high-order
tensors ($\textrm{order} > 3$). For third-order visual data, direct TT rank
minimization has not exploited the potential of the TT rank for high-order tensors.
TT rank minimization accompanied by \emph{ket augmentation}, which
transforms a lower-order tensor (e.g., visual data) into a higher-order tensor,
suffers from serious block artifacts. To tackle this issue, we suggest the TT
rank minimization with nonlocal self-similarity for tensor completion by
simultaneously exploring the spatial, temporal/spectral, and nonlocal
redundancy in visual data. More precisely, the TT rank minimization is
performed on a higher-order tensor, called a group, formed by stacking similar
cubes, which naturally and fully exploits the ability of the TT rank for
high-order tensors. Moreover, a perturbation analysis for the TT low-rankness
of each group is established. We develop an alternating direction method of
multipliers tailored for the specific structure to solve the proposed model.
Extensive experiments demonstrate that the proposed method is superior to
several existing state-of-the-art methods in terms of both qualitative and
quantitative measures.
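To make the ket augmentation step above concrete, here is a minimal sketch (our illustration, not code from the paper) of the lower-to-higher-order casting it performs, assuming a square grayscale image whose side is a power of two; the hard 2x2 block boundaries this casting introduces are the source of the block artifacts mentioned in the abstract.

```python
# Minimal sketch of ket augmentation (KA) -- our illustration, not the paper's
# code. Assumption: a square grayscale image with side 2^n. KA casts the image
# into an order-n tensor whose every mode has size 4: mode 1 indexes the four
# pixels of the finest 2x2 block, mode 2 the arrangement of those blocks, etc.
import numpy as np

def ket_augment(img):
    """Cast a (2^n, 2^n) image into an order-n tensor of shape (4, ..., 4)."""
    n = int(np.log2(img.shape[0]))
    assert img.shape == (2 ** n, 2 ** n), "side must be a power of two"
    # Factor rows and columns into n binary axes each: (i_1..i_n, j_1..j_n),
    # where i_1/j_1 are the most significant (coarsest) bits.
    T = img.reshape((2,) * n + (2,) * n)
    # Pair fine-to-coarse: mode 1 = (i_n, j_n), ..., mode n = (i_1, j_1).
    order = [ax for k in range(n - 1, -1, -1) for ax in (k, n + k)]
    return T.transpose(order).reshape((4,) * n)

img = np.arange(16.0).reshape(4, 4)  # toy 4x4 image, n = 2
T = ket_augment(img)                 # order-2 tensor of shape (4, 4)
```

A color image would get one extra mode of size 3; the paper's observation is that minimizing the TT rank of this augmented tensor ignores the artificial block seams, which motivates the nonlocal grouping sketched next.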
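And here is a rough sketch of the grouping-plus-TT idea itself (again ours, not the authors' released code): nonlocally similar cubes of a third-order tensor are stacked into a fourth-order group, which is then compressed by a plain TT-SVD. The cube size, search stride, group size, and uniform rank cap are illustrative assumptions, the input is assumed fully observed, and the actual model solves a completion problem with ADMM rather than a one-shot truncation.

```python
# Rough sketch of nonlocal grouping + TT low-rankness -- our illustration under
# illustrative choices (cube = 8, stride 4, 16 cubes per group, uniform TT rank
# cap). It only shows why a group of similar cubes is a natural higher-order,
# TT-low-rank object; the paper's completion model is solved with ADMM instead.
import numpy as np

def group_similar_cubes(X, ref_idx, cube=8, num_similar=16):
    """Stack the cubes of X most similar to a reference cube into a 4th-order group."""
    d1, d2, _ = X.shape
    i0, j0 = ref_idx
    ref = X[i0:i0 + cube, j0:j0 + cube, :]
    candidates = []
    for i in range(0, d1 - cube + 1, cube // 2):
        for j in range(0, d2 - cube + 1, cube // 2):
            c = X[i:i + cube, j:j + cube, :]
            candidates.append((np.linalg.norm(c - ref), c))
    candidates.sort(key=lambda t: t[0])  # smallest distance = most similar
    # Group shape: (cube, cube, d3, num_similar) = spatial x spectral x nonlocal.
    return np.stack([c for _, c in candidates[:num_similar]], axis=-1)

def tt_svd(G, max_rank):
    """TT-SVD: sequential truncated SVDs give TT cores with ranks <= max_rank."""
    dims, cores, r_prev = G.shape, [], 1
    C = G.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        C = C.reshape(r_prev * dims[k], -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        C = s[:r, None] * Vt[:r]  # carry the remainder to the next unfolding
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

X = np.random.rand(64, 64, 31)               # e.g., a hyperspectral cube
G = group_similar_cubes(X, ref_idx=(28, 28))  # 4th-order group of similar cubes
cores = tt_svd(G, max_rank=8)                 # low-TT-rank surrogate of the group
```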
Related papers
- Irregular Tensor Low-Rank Representation for Hyperspectral Image Representation [71.69331824668954]
Low-rank tensor representation is an important approach to alleviate spectral variations.
Previous low-rank representation methods can only be applied to regular data cubes.
We propose a novel irregular low-rank representation method that can efficiently model irregular 3D cubes.
arXiv Detail & Related papers (2024-10-24T02:56:22Z)
- A Novel Tensor Factorization-Based Method with Robustness to Inaccurate Rank Estimation [9.058215418134209]
We propose a new tensor norm with a dual low-rank constraint, which utilizes the low-rank prior and rank information at the same time.
It is proven theoretically that the resulting tensor completion model can effectively avoid performance degradation caused by inaccurate rank estimation.
Based on this, the total cost at each iteration of the optimization algorithm is reduced to $\mathcal{O}(n^3\log n + kn^3)$ from $\mathcal{O}(n^4)$ achieved with standard methods.
arXiv Detail & Related papers (2023-05-19T06:26:18Z)
- Low-Rank Tensor Function Representation for Multi-Dimensional Data Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Extensive experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods; a toy sketch of the low-rank tensor-function idea appears after this list.
arXiv Detail & Related papers (2022-12-01T04:00:38Z)
- Truncated tensor Schatten p-norm based approach for spatiotemporal traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating the alternating direction method of multipliers (ADMM).
arXiv Detail & Related papers (2022-05-19T08:37:56Z)
- Multi-mode Tensor Train Factorization with Spatial-spectral Regularization for Remote Sensing Images Recovery [1.3272510644778104]
We propose a novel low-MTT-rank tensor completion model via multi-mode TT factorization and spatial-spectral smoothness regularization.
We show that the proposed MTTD3R method outperforms the compared methods in terms of visual and quantitative measures.
arXiv Detail & Related papers (2022-05-05T07:36:08Z)
- Efficient Tensor Completion via Element-wise Weighted Low-rank Tensor Train with Overlapping Ket Augmentation [18.438177637687357]
We propose a novel tensor completion approach via the element-wise weighted technique.
We specifically consider the recovery quality of edge elements from adjacent blocks.
Our experimental results demonstrate that the proposed algorithm TWMac-TT outperforms several other competing tensor completion methods.
arXiv Detail & Related papers (2021-09-13T06:50:37Z)
- MTC: Multiresolution Tensor Completion from Partial and Coarse Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z)
- Optimal High-order Tensor SVD via Tensor-Train Orthogonal Iteration [10.034394572576922]
We propose a new algorithm to estimate the low tensor-train rank structure from the noisy high-order tensor observation.
The merits of the proposed TTOI are illustrated through applications to estimation and dimension reduction of high-order Markov processes.
arXiv Detail & Related papers (2020-10-06T05:18:24Z)
- T-Basis: a Compact Representation for Neural Networks [89.86997385827055]
We introduce T-Basis, a concept for a compact representation of a set of tensors, each of an arbitrary shape, which is often seen in Neural Networks.
We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates at acceptable performance drops.
arXiv Detail & Related papers (2020-07-13T19:03:22Z)
- Tensor completion via nonconvex tensor ring rank minimization with guaranteed convergence [16.11872681638052]
In recent studies, the tensor ring (TR) rank has shown high effectiveness in tensor completion.
A recently proposed TR rank relaxation penalizes the weighted sum of nuclear norms of the TR unfolding matrices, treating every singular value equally.
In this paper, we propose to use the logdet-based function as a nonconvex smooth relaxation of the TR rank, which better promotes low-rankness in the solution.
arXiv Detail & Related papers (2020-05-14T03:13:17Z)
- Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)
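As promised in the Low-Rank Tensor Function Representation entry above, here is a toy sketch of the low-rank tensor-function idea under our own assumptions: each mode factor is a small fixed sine expansion rather than the learned factors of the paper, but it shows how a rank-R separable function is defined at every real coordinate and can therefore be sampled at any resolution.

```python
# Toy sketch of a low-rank tensor function (our assumptions, not LRTFR's model):
# f(x, y, z) = sum_{r=1}^{R} u_r(x) * v_r(y) * w_r(z), with each factor function
# a fixed random sine expansion. R acts as a "function rank", and f is defined
# off any meshgrid, so it can be sampled at arbitrary resolution.
import numpy as np

rng = np.random.default_rng(0)
R, F = 4, 6                        # function rank and number of frequencies
coef = rng.normal(size=(3, R, F))  # coefficients: one (R, F) set per mode
freq = np.arange(1, F + 1)

def factor(c, t):
    """Evaluate R factor functions at 1-D coordinates t; returns (R, len(t))."""
    return c @ np.sin(np.outer(freq, t))

def f(x, y, z):
    """Sample the rank-R separable function on the outer grid of x, y, z."""
    U, V, W = factor(coef[0], x), factor(coef[1], y), factor(coef[2], z)
    return np.einsum('ri,rj,rk->ijk', U, V, W)  # sum over the rank index r

coarse = f(*(np.linspace(0, 1, 16),) * 3)  # a 16^3 sampling of the field
fine = f(*(np.linspace(0, 1, 64),) * 3)    # same field at 64^3, no refitting
```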