Multi-mode Core Tensor Factorization based Low-Rankness and Its
Applications to Tensor Completion
- URL: http://arxiv.org/abs/2012.01918v2
- Date: Sun, 16 May 2021 07:13:54 GMT
- Title: Multi-mode Core Tensor Factorization based Low-Rankness and Its
Applications to Tensor Completion
- Authors: Haijin Zeng
- Abstract summary: Low-rank tensor completion is widely used in computer vision and machine learning.
This paper develops a multi-modal core tensor factorization (MCTF) method together with a tensor low-rankness measure and a better nonconvex relaxation form of it (NonMCTF).
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Low-rank tensor completion has been widely used in computer vision and
machine learning. This paper develops a kind of multi-modal core tensor
factorization (MCTF) method together with a tensor low-rankness measure and a
better nonconvex relaxation form of it (NonMCTF). The proposed models encode
low-rank insights for general tensors provided by Tucker and T-SVD, and thus
are expected to simultaneously model spectral low-rankness in multiple
orientations and accurately restore the data of intrinsic low-rank structure
based on few observed entries. Furthermore, we study the MCTF and NonMCTF
regularization minimization problem, and design an effective BSUM algorithm to
solve them. This efficient solver can extend MCTF to various tasks, such as
tensor completion. A series of experiments, including hyperspectral image
(HSI), video and MRI completion, confirm the superior performance of the
proposed method.
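The abstract describes the MCTF/NonMCTF models only at a high level, so the following NumPy sketch is an illustrative stand-in rather than the authors' algorithm: it enforces low rank on every mode-k unfolding of a partially observed tensor and keeps the observed entries fixed, which captures the "spectral low-rankness in multiple orientations" idea in its simplest form. The ranks, iteration count, and function names are assumptions made for the example.

```python
# A minimal NumPy sketch of multi-mode low-rank tensor completion.
# NOTE: this is NOT the authors' MCTF/NonMCTF model or their BSUM solver;
# it is a generic baseline that projects every mode-k unfolding onto a
# low-rank matrix and re-imposes the observed entries at each iteration.
# Ranks, iteration counts, and names are illustrative assumptions.
import numpy as np


def unfold(tensor, mode):
    """Mode-`mode` unfolding: move that axis to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)


def fold(matrix, mode, shape):
    """Inverse of `unfold` for a tensor of the given `shape`."""
    rest = tuple(s for i, s in enumerate(shape) if i != mode)
    return np.moveaxis(matrix.reshape((shape[mode],) + rest), 0, mode)


def low_rank_project(matrix, rank):
    """Best rank-`rank` approximation via truncated SVD."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    return (u[:, :rank] * s[:rank]) @ vt[:rank]


def multi_mode_complete(observed, mask, ranks, n_iters=100):
    """Fill missing entries by averaging low-rank projections of all unfoldings.

    observed : tensor holding the known values (anything at missing positions)
    mask     : boolean tensor, True where an entry is observed
    ranks    : target rank for each mode-k unfolding
    """
    x = np.where(mask, observed, observed[mask].mean())
    for _ in range(n_iters):
        projections = [
            fold(low_rank_project(unfold(x, k), ranks[k]), k, x.shape)
            for k in range(x.ndim)
        ]
        x = np.mean(projections, axis=0)   # combine the per-mode estimates
        x = np.where(mask, observed, x)    # keep observed entries fixed
    return x


# Tiny usage example: a rank-(2,2,2) Tucker tensor with ~60% missing entries.
rng = np.random.default_rng(0)
core = rng.standard_normal((2, 2, 2))
factors = [rng.standard_normal((10, 2)) for _ in range(3)]
truth = np.einsum("abc,ia,jb,kc->ijk", core, *factors)
mask = rng.random(truth.shape) < 0.4
recovered = multi_mode_complete(np.where(mask, truth, 0.0), mask, ranks=(2, 2, 2))
print("relative error:", np.linalg.norm(recovered - truth) / np.linalg.norm(truth))
```

Averaging the per-mode low-rank projections is only the crudest way to combine the unfoldings; MCTF/NonMCTF instead minimize a dedicated low-rankness measure and its nonconvex relaxation with a BSUM solver, as stated in the abstract.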
Related papers
- Handling The Non-Smooth Challenge in Tensor SVD: A Multi-Objective Tensor Recovery Framework [15.16222081389267]
We introduce a novel tensor recovery model with a learnable tensor nuclear norm to address the challenge of non-smooth changes in tensor data.
We develop a new optimization algorithm named the Alternating Proximal Multiplier Method (APMM) to iteratively solve the proposed tensor completion model.
In addition, we propose a multi-objective tensor recovery framework based on APMM to efficiently explore the correlations of tensor data across its various dimensions.
arXiv Detail & Related papers (2023-11-23T12:16:33Z) - Tensorized LSSVMs for Multitask Regression [48.844191210894245]
Multitask learning (MTL) can utilize the relatedness between multiple tasks for performance improvement.
A new MTL method is proposed by leveraging low-rank tensor analysis and tensorized Least Squares Support Vector Machines (tLSSVM-MTL).
arXiv Detail & Related papers (2023-03-04T16:36:03Z) - Low-Rank Tensor Function Representation for Multi-Dimensional Data
Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-12-01T04:00:38Z) - Truncated tensor Schatten p-norm based approach for spatiotemporal
traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating the alternating direction method of multipliers (ADMM) into the imputation procedure.
arXiv Detail & Related papers (2022-05-19T08:37:56Z) - Tensor Full Feature Measure and Its Nonconvex Relaxation Applications to
Tensor Recovery [1.8899300124593645]
We propose a new tensor sparsity measure called the Full Feature Measure (FFM).
It can simultaneously describe the feature information of each dimension, and connect the Tucker rank with the tensor tube rank.
Two efficient models based on FFM are proposed, and two Alternating Direction Method of Multipliers (ADMM) algorithms are developed to solve the proposed models.
arXiv Detail & Related papers (2021-09-25T01:44:34Z) - MTC: Multiresolution Tensor Completion from Partial and Coarse
Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z) - Spectral Tensor Train Parameterization of Deep Learning Layers [136.4761580842396]
We study low-rank parameterizations of weight matrices with embedded spectral properties in the Deep Learning context.
We show the effects of neural network compression in the classification setting and both compression and improved stability training in the generative adversarial training setting.
arXiv Detail & Related papers (2021-03-07T00:15:44Z) - Enhanced nonconvex low-rank approximation of tensor multi-modes for
tensor completion [1.3406858660972554]
We propose a novel low-rank approximation of tensor multi-modes (LRATM) model.
A block successive upper-bound minimization (BSUM) based algorithm is designed to efficiently solve the proposed model.
Numerical results on three types of public multi-dimensional datasets show that our algorithm can recover a variety of low-rank tensors.
arXiv Detail & Related papers (2020-05-28T08:53:54Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z) - Tensor denoising and completion based on ordinal observations [11.193504036335503]
We consider the problem of low-rank tensor estimation from possibly incomplete, ordinal-valued observations.
We propose a multi-linear cumulative link model, develop a rank-constrained M-estimator, and obtain theoretical accuracy guarantees.
We show that the proposed estimator is minimax optimal under the class of low-rank models.
arXiv Detail & Related papers (2020-02-16T07:09:56Z)
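Several of the entries above, as well as the main paper, rely on spectral low-rankness measures computed from mode-k unfoldings (for example the truncated Schatten p-norm and FFM). The sketch below evaluates two generic surrogates of that kind, the sum of nuclear norms of the unfoldings and a truncated Schatten p-norm of one unfolding; the measures actually proposed in those papers differ in their weighting and relaxation, so this is only an illustrative baseline.

```python
# Illustrative surrogates for multi-mode tensor low-rankness (not the exact
# measures from the papers above): the sum of nuclear norms of all mode-k
# unfoldings (a common Tucker-rank surrogate) and a truncated Schatten p-norm
# that skips the `r` largest singular values of one unfolding.
import numpy as np


def mode_singular_values(tensor, mode):
    """Singular values of the mode-`mode` unfolding of `tensor`."""
    unfolding = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
    return np.linalg.svd(unfolding, compute_uv=False)


def sum_of_nuclear_norms(tensor):
    """Sum over all modes of the nuclear norm of each unfolding."""
    return sum(mode_singular_values(tensor, k).sum() for k in range(tensor.ndim))


def truncated_schatten_p(tensor, mode, p=0.5, r=1):
    """Truncated Schatten p-norm of one unfolding: sum of sigma_i**p for i > r."""
    sigma = mode_singular_values(tensor, mode)
    return float(np.sum(sigma[r:] ** p))


# Usage: both measures are small for low-rank tensors and large for generic ones.
rng = np.random.default_rng(1)
t = rng.standard_normal((8, 9, 10))
print(sum_of_nuclear_norms(t), truncated_schatten_p(t, mode=0))
```

Truncating the largest singular values or raising them to a power p < 1 is what makes such relaxations nonconvex, which is why the papers above pair them with ADMM- or BSUM-style solvers.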