Multi-mode Tensor Train Factorization with Spatial-spectral
Regularization for Remote Sensing Images Recovery
- URL: http://arxiv.org/abs/2205.03380v1
- Date: Thu, 5 May 2022 07:36:08 GMT
- Title: Multi-mode Tensor Train Factorization with Spatial-spectral
Regularization for Remote Sensing Images Recovery
- Authors: Gaohang Yu, Shaochun Wan, Liqun Qi, Yanwei Xu
- Abstract summary: We propose a novel low-MTT-rank tensor completion model via multi-mode TT factorization and spatial-spectral smoothness regularization.
We show that the proposed MTTD3R method outperforms the compared methods in terms of visual and quantitative measures.
- Score: 1.3272510644778104
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Tensor train (TT) factorization and corresponding TT rank, which can well
express the low-rankness and mode correlations of higher-order tensors, have
attracted much attention in recent years. However, TT factorization based
methods are generally not sufficient to characterize the low-rankness along each
mode of a third-order tensor. Motivated by this, we generalize the tensor train
factorization to the mode-k tensor train factorization and introduce a
corresponding multi-mode tensor train (MTT) rank. Then, we propose a novel
low-MTT-rank tensor completion model via multi-mode TT factorization and
spatial-spectral smoothness regularization. To tackle the proposed model, we
develop an efficient proximal alternating minimization (PAM) algorithm.
Extensive numerical experiments on visual data demonstrate that the
proposed MTTD3R method outperforms the compared methods in terms of visual and
quantitative measures.
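The construction can be made concrete with a short sketch: for each mode k, permute that mode to the front and run an ordinary TT-SVD sweep, collecting the resulting TT ranks as the mode-k component of the MTT rank. The NumPy sketch below is only a plausible reading of that definition, not the authors' code; the function names and tolerance-based rank truncation are illustrative assumptions, and the full MTTD3R method additionally couples these factorizations with spatial-spectral smoothness terms and the PAM solver.

```python
import numpy as np

def tt_ranks(x, tol=1e-6):
    """Estimate the TT ranks of a tensor with a plain TT-SVD sweep:
    unfold, SVD, count singular values above tol * s_max, and carry the
    remainder into the next unfolding."""
    dims = x.shape
    ranks = []
    core = x.reshape(dims[0], -1)
    for n in range(x.ndim - 1):
        _, s, vt = np.linalg.svd(core, full_matrices=False)
        r = int(np.sum(s > tol * s[0]))
        ranks.append(r)
        if n < x.ndim - 2:
            core = (np.diag(s[:r]) @ vt[:r]).reshape(r * dims[n + 1], -1)
    return ranks

def multi_mode_tt_ranks(x, tol=1e-6):
    """Mode-k TT ranks: move mode k to the front, then apply TT-SVD.
    The collection over all k plays the role of the MTT rank."""
    return {k: tt_ranks(np.moveaxis(x, k, 0), tol) for k in range(x.ndim)}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # synthetic low-rank third-order tensor (e.g. a small spectral cube)
    a = rng.standard_normal((40, 3))
    b = rng.standard_normal((50, 3))
    c = rng.standard_normal((30, 3))
    x = np.einsum("ir,jr,kr->ijk", a, b, c)
    print(multi_mode_tt_ranks(x))   # small TT ranks along every mode
```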
Related papers
- Handling The Non-Smooth Challenge in Tensor SVD: A Multi-Objective Tensor Recovery Framework [15.16222081389267]
We introduce a novel tensor recovery model with a learnable tensor nuclear norm to address the challenge of non-smooth changes in tensor data.
We develop a new optimization algorithm named the Alternating Proximal Multiplier Method (APMM) to iteratively solve the proposed tensor completion model.
In addition, we propose a multi-objective tensor recovery framework based on APMM to efficiently explore the correlations of tensor data across its various dimensions.
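For context, the fixed (non-learnable) tensor nuclear norm that this line of work generalizes is commonly defined via the t-SVD: take an FFT along the third mode and sum the matrix nuclear norms of the frontal slices. A minimal sketch is below; the 1/n3 scaling follows one common convention, and the learnable norm and APMM solver of the paper are not reproduced here.

```python
import numpy as np

def tsvd_nuclear_norm(x):
    """t-SVD-based tensor nuclear norm of a third-order tensor:
    FFT along mode 3, then the sum of matrix nuclear norms of the
    frontal slices, scaled by 1/n3 (one common convention)."""
    xf = np.fft.fft(x, axis=2)
    n3 = x.shape[2]
    return sum(np.linalg.norm(xf[:, :, k], ord="nuc") for k in range(n3)) / n3
```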
arXiv Detail & Related papers (2023-11-23T12:16:33Z)
- Multi-View Clustering via Semi-non-negative Tensor Factorization [120.87318230985653]
We develop a novel multi-view clustering method based on semi-non-negative tensor factorization (Semi-NTF).
Our model directly considers the between-view relationship and exploits the between-view complementary information.
In addition, we provide an optimization algorithm for the proposed method and prove mathematically that the algorithm always converges to the stationary KKT point.
arXiv Detail & Related papers (2023-03-29T14:54:19Z)
- Truncated tensor Schatten p-norm based approach for spatiotemporal traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases defined according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating the alternating direction method of multipliers (ADMM) with data imputation.
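As a point of reference, the (matrix) truncated Schatten p-norm penalizes only the smaller singular values: with the r largest left out, it is typically taken as the sum of the p-th powers of the remaining singular values. The sketch below illustrates that quantity; the exact truncation, weighting, and tensor extension over unfoldings used in the cited paper may differ.

```python
import numpy as np

def truncated_schatten_p(m, p=0.5, r=2):
    """Sum of sigma_i**p over all singular values except the r largest:
    a common form of the truncated Schatten p-norm (to the p-th power)."""
    s = np.linalg.svd(m, compute_uv=False)   # singular values, descending
    return float(np.sum(s[r:] ** p))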
arXiv Detail & Related papers (2022-05-19T08:37:56Z)
- Multi-Tensor Network Representation for High-Order Tensor Completion [25.759851542474447]
This work studies the problem of high-dimensional data (referred to as tensors) completion from partially observed samplings.
We consider that a tensor is a superposition of multiple low-rank components.
In this paper, we propose a fundamental tensor decomposition framework: Multi-Tensor Network decomposition (MTNR).
arXiv Detail & Related papers (2021-09-09T03:50:19Z)
- MTC: Multiresolution Tensor Completion from Partial and Coarse Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z)
- Spectral Tensor Train Parameterization of Deep Learning Layers [136.4761580842396]
We study low-rank parameterizations of weight matrices with embedded spectral properties in the Deep Learning context.
We show the effects of neural network compression in the classification setting, and of both compression and improved training stability in the generative adversarial training setting.
arXiv Detail & Related papers (2021-03-07T00:15:44Z)
- Tensor Train Random Projection [0.0]
This work proposes a novel tensor train random projection (TTRP) method for dimension reduction.
Our TTRP is systematically constructed through a tensor train representation with TT-ranks equal to one.
Based on the tensor train format, this new random projection method can speed up the dimension reduction procedure for high-dimensional datasets.
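With all TT-ranks equal to one, the TT-format projection reduces (up to ordering and scaling conventions) to a Kronecker product of small random matrices, so it can be applied mode by mode without ever forming the large projection matrix. The sketch below illustrates that idea; it is not the authors' exact construction, and the Gaussian entries and 1/sqrt(m) scaling are illustrative assumptions.

```python
import numpy as np

def tt_rank1_projection(x, out_dims, rng=None):
    """Project a long vector, reshaped to a tensor x of shape (n1, ..., nd),
    down to m1*m2*...*md entries by contracting a small Gaussian matrix
    along each mode (Kronecker-structured random projection)."""
    rng = np.random.default_rng() if rng is None else rng
    in_dims = x.shape
    y = x
    for k, (m, n) in enumerate(zip(out_dims, in_dims)):
        a = rng.standard_normal((m, n)) / np.sqrt(m)   # illustrative scaling
        y = np.moveaxis(np.tensordot(a, y, axes=([1], [k])), 0, k)
    return y.reshape(-1)

# usage: reduce a 4096-dimensional vector (16*16*16) to 64 dimensions (4*4*4)
v = np.random.default_rng(1).standard_normal((16, 16, 16))
print(tt_rank1_projection(v, (4, 4, 4)).shape)   # (64,)
```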
arXiv Detail & Related papers (2020-10-21T07:31:45Z)
- Enhanced nonconvex low-rank approximation of tensor multi-modes for tensor completion [1.3406858660972554]
We propose a novel low-rank approximation of tensor multi-modes (LRATM) model.
A block successive upper-bound minimization (BSUM) based algorithm is designed to efficiently solve the proposed model.
Numerical results on three types of public multi-dimensional datasets show that our algorithm can recover a variety of low-rank tensors.
arXiv Detail & Related papers (2020-05-28T08:53:54Z)
- Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)
- Tensor train rank minimization with nonlocal self-similarity for tensor completion [27.727973182796678]
The tensor train (TT) rank has received increasing attention in tensor completion due to its ability to capture the global correlation of high-order tensors.
For third-order visual data, direct TT rank minimization has not exploited the potential of TT rank for high-order tensors.
We propose a TT rank minimization with nonlocal self-similarity for tensor completion by simultaneously exploring the spatial, temporal/spectral, and nonlocal redundancy in visual data.
arXiv Detail & Related papers (2020-04-29T15:39:39Z)
- Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks.
To alleviate the exponential cost of learning a separate parameter for every feature interaction, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
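The implicit representation can be sketched as follows: the weight tensor over all feature interactions is stored in CP form, and a prediction contracts the per-mode feature maps with the CP factors rank by rank, so the full tensor is never materialized. The factor shapes, rank, and random feature maps below are illustrative assumptions rather than the paper's exact model.

```python
import numpy as np

def cp_predict(factors, feats):
    """Prediction with a weight tensor stored in CP form.
    factors[m]: (d_m, R) CP factor for mode m; feats[m]: (d_m,) feature
    map for mode m.  y = sum_r prod_m <factors[m][:, r], feats[m]>."""
    prod = np.ones(factors[0].shape[1])
    for a, phi in zip(factors, feats):
        prod *= phi @ a            # inner product with every rank-1 factor
    return float(prod.sum())

# usage: 3 features, each mapped to a 5-dimensional vector, CP rank 4
rng = np.random.default_rng(2)
factors = [rng.standard_normal((5, 4)) for _ in range(3)]
feats = [rng.standard_normal(5) for _ in range(3)]
print(cp_predict(factors, feats))
```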
arXiv Detail & Related papers (2020-01-27T22:38:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.