Efficient Tensor Completion via Element-wise Weighted Low-rank Tensor
Train with Overlapping Ket Augmentation
- URL: http://arxiv.org/abs/2109.05736v1
- Date: Mon, 13 Sep 2021 06:50:37 GMT
- Title: Efficient Tensor Completion via Element-wise Weighted Low-rank Tensor
Train with Overlapping Ket Augmentation
- Authors: Yang Zhang, Yao Wang, Zhi Han, Xi'ai Chen, Yandong Tang
- Abstract summary: We propose a novel tensor completion approach via the element-wise weighted technique.
We specifically consider the recovery quality of edge elements from adjacent blocks.
Our experimental results demonstrate that the proposed algorithm TWMac-TT outperforms several other competing tensor completion methods.
- Score: 18.438177637687357
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, there have been an increasing number of applications of
tensor completion based on the tensor train (TT) format because of its
efficiency and effectiveness in dealing with higher-order tensor data. However,
existing tensor completion methods using TT decomposition have two obvious
drawbacks. One is that they only consider mode weights according to the degree
of mode balance, even though some elements are recovered better in an
unbalanced mode. The other is that serious blocking artifacts appear when the
missing element rate is relatively large. To remedy these two issues, in this
work, we propose a novel tensor completion approach via the element-wise
weighted technique. Accordingly, a novel formulation for tensor completion and
an efficient optimization algorithm, called tensor completion by parallel
weighted matrix factorization via tensor train (TWMac-TT), are proposed. In
addition, we specifically consider the recovery quality of edge elements from
adjacent blocks. Different from traditional reshaping and ket augmentation, we
utilize a new tensor augmentation technique called overlapping ket
augmentation, which can further avoid blocking artifacts. We then conduct
extensive performance evaluations on synthetic data and several real image data
sets. Our experimental results demonstrate that the proposed algorithm TWMac-TT
outperforms several other competing tensor completion methods.
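For intuition, here is a minimal 2D sketch of the overlapping idea, assuming a simple slide-and-fold construction with an overlap-and-average inverse; the function names, block size, and stride below are illustrative, not the paper's exact ket-augmentation scheme (which folds into a higher-order, TT-ready tensor):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def overlapping_ket_augmentation(image, block=4, stride=2):
    """Fold a 2D array into a 4th-order tensor of overlapping blocks.

    With stride < block, adjacent blocks share (block - stride) rows/columns,
    so edge elements of each block also appear inside its neighbours.
    (Block size and stride are illustrative choices, not the paper's settings.)
    """
    patches = sliding_window_view(image, (block, block))[::stride, ::stride]
    # patches has shape (n_rows, n_cols, block, block): one block per grid cell
    return patches

def overlap_average_reconstruct(patches, shape, block=4, stride=2):
    """Invert the augmentation by averaging overlapping contributions,
    which smooths block boundaries and suppresses blocking artifacts."""
    out = np.zeros(shape)
    count = np.zeros(shape)
    n_rows, n_cols = patches.shape[:2]
    for i in range(n_rows):
        for j in range(n_cols):
            r, c = i * stride, j * stride
            out[r:r + block, c:c + block] += patches[i, j]
            count[r:r + block, c:c + block] += 1
    return out / np.maximum(count, 1)

img = np.arange(64, dtype=float).reshape(8, 8)
aug = overlapping_ket_augmentation(img)            # shape (3, 3, 4, 4)
rec = overlap_average_reconstruct(aug, img.shape)  # recovers img exactly
```

Because every block shares its border rows and columns with its neighbours, edge elements are reconstructed from several blocks at once, which is a plausible mechanism for the reduced blocking artifacts the abstract describes.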
Related papers
- Scalable CP Decomposition for Tensor Learning using GPU Tensor Cores [47.87810316745786]
We propose a compression-based tensor decomposition framework, namely the exascale-tensor, to support exascale tensor decomposition.
Compared to the baselines, the exascale-tensor supports 8,000x larger tensors and achieves a speedup of up to 6.95x.
We also apply our method to two real-world applications, including gene analysis and tensor layer neural networks.
arXiv Detail & Related papers (2023-11-22T21:04:59Z)
- A Novel Tensor Factorization-Based Method with Robustness to Inaccurate Rank Estimation [9.058215418134209]
We propose a new tensor norm with a dual low-rank constraint, which utilizes the low-rank prior and rank information at the same time.
It is proven theoretically that the resulting tensor completion model can effectively avoid performance degradation caused by inaccurate rank estimation.
Based on this, the total cost at each iteration of the optimization algorithm is reduced to $\mathcal{O}(n^3\log n + kn^3)$ from $\mathcal{O}(n^4)$ achieved with standard methods.
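To make the reduction concrete, a back-of-the-envelope comparison with illustrative values $n = 10^3$ and $k = 10$ (and a natural logarithm; none of these choices come from the paper):

```latex
n^4 = 10^{12},
\qquad
n^3\log n + kn^3 = 10^9\ln(10^3) + 10\cdot 10^9 \approx 1.7\times 10^{10},
```

i.e., roughly a 60-fold saving per iteration at this scale.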
arXiv Detail & Related papers (2023-05-19T06:26:18Z)
- Low-Rank Tensor Function Representation for Multi-Dimensional Data Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Extensive experiments substantiate the superiority and versatility of our method as compared with state-of-the-art methods.
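As a structural illustration only (the paper learns its factor functions; the random-feature factors, rank, and feature count below are hypothetical), a CP-style low-rank tensor function factorizes $f(x,y,z)\approx\sum_{r} g^x_r(x)\,g^y_r(y)\,g^z_r(z)$ and can therefore be sampled off any fixed meshgrid:

```python
import numpy as np

rng = np.random.default_rng(0)
R, F = 3, 16  # functional rank and random-feature count (illustrative)

def make_factor():
    """A smooth 1D function built from random cosine features."""
    w, b, c = rng.normal(size=F), rng.uniform(0, np.pi, F), rng.normal(size=F)
    return lambda t: np.cos(np.outer(t, w) + b) @ c

gx = [make_factor() for _ in range(R)]
gy = [make_factor() for _ in range(R)]
gz = [make_factor() for _ in range(R)]

def f(x, y, z):
    """Evaluate the rank-R tensor function at arbitrary coordinates."""
    return sum(gx[r](x) * gy[r](y) * gz[r](z) for r in range(R))

# Because the factors are functions, sampling is not tied to any meshgrid:
t_coarse = np.linspace(0.0, 1.0, 7)
t_fine = np.linspace(0.0, 1.0, 1000)
coarse_vals = f(t_coarse, t_coarse, t_coarse)  # 7 samples along the diagonal
fine_vals = f(t_fine, t_fine, t_fine)          # 1000 samples, same function
```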
arXiv Detail & Related papers (2022-12-01T04:00:38Z)
- Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z)
- A high-order tensor completion algorithm based on Fully-Connected Tensor Network weighted optimization [8.229028597459752]
We propose a new tensor completion method named fully connected tensor network weighted optimization (FCTN-WOPT).
The algorithm composes the completed tensor from factors initialised by the FCTN decomposition.
The results show the superior performance of FCTN-WOPT when applied to higher-order tensor completion.
arXiv Detail & Related papers (2022-04-04T13:46:32Z)
- JULIA: Joint Multi-linear and Nonlinear Identification for Tensor Completion [46.27248186328502]
This paper proposes JULIA, a Joint mUlti-linear and nonLinear IdentificAtion framework for large-scale tensor completion.
Experiments on six real large-scale tensors demonstrate that JULIA outperforms many existing tensor completion algorithms.
arXiv Detail & Related papers (2022-01-31T20:18:41Z)
- Robust M-estimation-based Tensor Ring Completion: a Half-quadratic Minimization Approach [14.048989759890475]
We develop a robust approach to tensor ring completion that uses an M-estimator as its error statistic.
We present two HQ-based algorithms based on truncated singular value decomposition and matrix factorization.
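A minimal sketch of the half-quadratic (HQ) loop, assuming a Welsch M-estimator and a plain truncated-SVD projection standing in for the paper's tensor ring factor updates (all names and parameters below are illustrative):

```python
import numpy as np

def welsch_weights(residual, sigma=1.0):
    """HQ weights for the Welsch M-estimator: large residuals
    (likely outliers) receive exponentially small weight."""
    return np.exp(-(residual / sigma) ** 2)

def hq_weighted_factorization(X, mask, rank=5, iters=50, sigma=1.0):
    """Alternate HQ weight updates with weighted low-rank fitting."""
    L = np.where(mask, X, 0.0)  # initialize with the observed entries
    for _ in range(iters):
        W = mask * welsch_weights(X - L, sigma)   # HQ step: update weights
        blended = W * X + (1 - W) * L             # weighted least-squares step
        U, s, Vt = np.linalg.svd(blended, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # truncated-SVD projection
    return L

# Illustrative call on a synthetic partially observed matrix with outliers
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5)) @ rng.normal(size=(5, 50))
mask = rng.random(X.shape) < 0.4
L_hat = hq_weighted_factorization(X, mask, rank=5, sigma=2.0)
```

Each iteration re-weights the residuals so that suspected outliers contribute little to the subsequent low-rank fit.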
arXiv Detail & Related papers (2021-06-19T04:37:50Z)
- MTC: Multiresolution Tensor Completion from Partial and Coarse Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z)
- Multi-version Tensor Completion for Time-delayed Spatio-temporal Data [50.762087239885936]
Real-world spatio-temporal data is often incomplete or inaccurate due to various data loading delays.
We propose a low-rank tensor model to predict the updates over time.
We obtain up to 27.2% lower root mean-squared-error compared to the best baseline method.
arXiv Detail & Related papers (2021-05-11T19:55:56Z)
- Tensor Completion via Tensor Networks with a Tucker Wrapper [28.83358353043287]
We propose to solve low-rank tensor completion (LRTC) via tensor networks with a Tucker wrapper.
A two-level alternating least squares method is then employed to update the unknown factors.
Numerical simulations show that the proposed algorithm is comparable with state-of-the-art methods.
arXiv Detail & Related papers (2020-10-29T17:54:01Z)
- Tensor completion via nonconvex tensor ring rank minimization with guaranteed convergence [16.11872681638052]
In recent studies, the tensor ring (TR) rank has shown high effectiveness in tensor completion.
A recently proposed TR rank surrogate is based on a weighted sum of nuclear norms, which penalizes all singular values equally.
In this paper, we propose to use the logdet-based function as a nonconvex smooth relaxation of the TR rank.
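For intuition, a generic logdet surrogate (the smoothing parameter $\varepsilon$ and this exact form are illustrative, not taken from the paper) contrasts with the nuclear norm as follows:

```latex
\|\mathbf{X}\|_* = \sum_i \sigma_i(\mathbf{X})
\qquad \text{vs.} \qquad
\phi_{\varepsilon}(\mathbf{X}) = \sum_i \log\bigl(\sigma_i(\mathbf{X}) + \varepsilon\bigr),
\quad \varepsilon > 0.
```

Since each logdet term's derivative $1/(\sigma_i + \varepsilon)$ decays as $\sigma_i$ grows, dominant singular values are penalized less than under the nuclear norm's uniform weighting.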
arXiv Detail & Related papers (2020-05-14T03:13:17Z)