Self-Supervised Nonlinear Transform-Based Tensor Nuclear Norm for
Multi-Dimensional Image Recovery
- URL: http://arxiv.org/abs/2105.14320v1
- Date: Sat, 29 May 2021 14:56:51 GMT
- Title: Self-Supervised Nonlinear Transform-Based Tensor Nuclear Norm for
Multi-Dimensional Image Recovery
- Authors: Yi-Si Luo, Xi-Le Zhao, Tai-Xiang Jiang, Yi Chang, Michael K. Ng, and
Chao Li
- Abstract summary: We propose a multilayer neural network to learn a nonlinear transform via the observed tensor data under self-supervision.
The proposed network makes use of low-rank representation of transformed tensors and data-fitting between the observed tensor and the reconstructed tensor to construct the nonlinear transformation.
- Score: 27.34643415429293
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we study multi-dimensional image recovery. Recently,
transform-based tensor nuclear norm minimization methods have been considered to
capture low-rank tensor structures to recover third-order tensors in
multi-dimensional image processing applications. The main characteristic of
such methods is to perform a linear transform along the third mode of
third-order tensors and then minimize the tensor nuclear norm of the
transformed tensor, so that the underlying low-rank tensors can be recovered.
The main aim of this paper is to propose a nonlinear multilayer neural network
to learn a nonlinear transform via the observed tensor data under
self-supervision. The proposed network makes use of the low-rank representation of
transformed tensors and the data fit between the observed tensor and the
reconstructed tensor to construct the nonlinear transformation. Extensive
experimental results on tensor completion, background subtraction, robust
tensor completion, and snapshot compressive imaging are presented to
demonstrate that the performance of the proposed method is better than that of
state-of-the-art methods.
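For context, the linear transform-based tensor nuclear norm that this work generalizes can be sketched in a few lines. This is a minimal NumPy sketch assuming the standard DFT-along-mode-3 t-SVD definition; the function name `transform_tnn` is a choice made here for illustration, and the 1/n3 scaling used in some formulations is omitted:

```python
import numpy as np

def transform_tnn(X, transform=None):
    """Transform-based tensor nuclear norm of a third-order tensor X (n1 x n2 x n3).

    Applies a transform along the third mode (the DFT by default, as in the
    classical t-SVD framework), then sums the matrix nuclear norms of the
    transformed frontal slices.
    """
    # Transform along mode 3: FFT unless a custom transform is supplied.
    Xt = np.fft.fft(X, axis=2) if transform is None else transform(X)
    # Sum the nuclear norms (sums of singular values) of the frontal slices.
    return sum(np.linalg.norm(Xt[:, :, k], ord="nuc") for k in range(X.shape[2]))
```

The step this paper takes beyond the linear baseline is to replace the fixed transform with a multilayer neural network learned under self-supervision from the observed tensor itself, combining a low-rank term on the transformed tensor with a data-fitting term.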
Related papers
- Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
Theoretically, we show that the proposed graph smoothness regularization aligns with a weighted tensor nuclear norm.
arXiv Detail & Related papers (2023-10-04T02:55:10Z) - A Theory of Topological Derivatives for Inverse Rendering of Geometry [87.49881303178061]
We introduce a theoretical framework for differentiable surface evolution that allows discrete topology changes through the use of topological derivatives.
We validate the proposed theory with optimization of closed curves in 2D and surfaces in 3D to lend insights into limitations of current methods.
arXiv Detail & Related papers (2023-08-19T00:55:55Z) - Tensor Factorization via Transformed Tensor-Tensor Product for Image
Alignment [3.0969191504482243]
We study the problem of aligning a batch of linearly correlated images, where the observed images are deformed by some unknown domain transformations.
By stacking these images as the frontal slices of a third-order tensor, we propose to explore the low-rankness of the underlying tensor.
arXiv Detail & Related papers (2022-12-12T05:52:26Z) - Low-Rank Tensor Function Representation for Multi-Dimensional Data
Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Extensive experiments substantiate the superiority and versatility of our method as compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-12-01T04:00:38Z) - Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z) - 2D+3D facial expression recognition via embedded tensor manifold
regularization [16.98176664818354]
We propose a novel approach via embedded tensor manifold regularization for 2D+3D facial expression recognition (FERETMR).
We establish the first-order optimality condition in terms of stationary points, and then design a block coordinate descent (BCD) algorithm with convergence analysis.
Numerical results on the BU-3DFE and Bosphorus databases demonstrate the effectiveness of our proposed approach.
arXiv Detail & Related papers (2022-01-29T06:11:00Z) - Revisiting Transformation Invariant Geometric Deep Learning: Are Initial
Representations All You Need? [80.86819657126041]
We show that transformation-invariant and distance-preserving initial representations are sufficient to achieve transformation invariance.
Specifically, we realize transformation-invariant and distance-preserving initial point representations by modifying multi-dimensional scaling.
We prove that TinvNN can strictly guarantee transformation invariance, being general and flexible enough to be combined with the existing neural networks.
arXiv Detail & Related papers (2021-12-23T03:52:33Z) - Nonlinear Transform Induced Tensor Nuclear Norm for Tensor Completion [12.788874164701785]
We propose a low-rank tensor completion (LRTC) model based on the NTTNN, together with a proximal alternating minimization (PAM) algorithm with theoretical convergence guarantees.
Our method qualitatively and quantitatively outperforms state-of-the-art linear transform-based tensor nuclear norm (TNN) methods.
arXiv Detail & Related papers (2021-10-17T09:25:37Z) - Understanding Deflation Process in Over-parametrized Tensor
Decomposition [17.28303004783945]
We study the training dynamics for gradient flow on over-parametrized tensor decomposition problems.
Empirically, such training process often first fits larger components and then discovers smaller components.
arXiv Detail & Related papers (2021-06-11T18:51:36Z) - Anomaly Detection with Tensor Networks [2.3895981099137535]
We exploit the memory and computational efficiency of tensor networks to learn a linear transformation over a space with a dimension exponential in the number of original features.
We produce competitive results on image datasets, despite not exploiting the locality of images.
arXiv Detail & Related papers (2020-06-03T20:41:30Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.