Graph Regularized Nonnegative Tensor Ring Decomposition for Multiway
Representation Learning
- URL: http://arxiv.org/abs/2010.05657v1
- Date: Mon, 12 Oct 2020 12:54:20 GMT
- Title: Graph Regularized Nonnegative Tensor Ring Decomposition for Multiway
Representation Learning
- Authors: Yuyuan Yu, Guoxu Zhou, Ning Zheng, Shengli Xie and Qibin Zhao
- Abstract summary: Nonnegative tensor ring (NTR) decomposition and graph regularized NTR (GNTR) decomposition are proposed.
The proposed algorithms can extract parts-based bases with rich colors and rich lines from tensor objects, providing a more interpretable and meaningful representation.
- Score: 38.70369173200596
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tensor ring (TR) decomposition is a powerful tool for exploiting the low-rank
nature of multiway data and has demonstrated great potential in a variety of
important applications. In this paper, nonnegative tensor ring (NTR)
decomposition and graph regularized NTR (GNTR) decomposition are proposed,
where the former equips TR decomposition with local feature extraction by
imposing nonnegativity on the core tensors and the latter is additionally able
to capture manifold geometry information of tensor data, both of which significantly
extend the applications of TR decomposition for nonnegative multiway
representation learning. Accelerated proximal gradient based methods are
derived for NTR and GNTR. The experimental results demonstrate that the proposed
algorithms can extract parts-based bases with rich colors and rich lines from
tensor objects, which provide a more interpretable and meaningful representation,
and hence yield better performance than the state-of-the-art tensor based
methods in clustering and classification tasks.
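For concreteness, the sketch below shows the tensor ring format that NTR builds on: a full tensor is rebuilt from a closed chain of third-order cores, and keeping every core entrywise nonnegative yields a nonnegative reconstruction. This is only an illustrative sketch of the model structure (the function name and the toy sizes and ranks are assumptions), not the accelerated proximal gradient algorithm derived in the paper.

```python
import numpy as np

def tr_reconstruct(cores):
    """Rebuild a full tensor from tensor ring (TR) cores.

    Each core has shape (r_k, n_k, r_{k+1}), and the last rank loops
    back to the first (r_{N+1} = r_1), so the chain closes into a ring.
    """
    full = cores[0]                                      # (r_1, n_1, r_2)
    for core in cores[1:]:
        # Contract the trailing rank mode with the next core's leading rank mode.
        full = np.tensordot(full, core, axes=([-1], [0]))
    # Close the ring by tracing out the first and last rank modes.
    return np.trace(full, axis1=0, axis2=full.ndim - 1)

# Toy example with nonnegative cores, as in NTR: entrywise nonnegativity of the
# cores guarantees an entrywise nonnegative reconstruction.
rng = np.random.default_rng(0)
shape, ranks = (4, 5, 6), (2, 3, 2)
cores = [np.abs(rng.standard_normal((ranks[k], shape[k], ranks[(k + 1) % 3])))
         for k in range(3)]
X = tr_reconstruct(cores)
print(X.shape, X.min() >= 0)   # (4, 5, 6) True
```

The graph-regularized variant (GNTR) would additionally add a graph Laplacian penalty on a factor so that samples that are neighbors on the data manifold stay close in the learned representation; the actual update rules follow the paper's accelerated proximal gradient derivation and are not reproduced here.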
Related papers
- Scalable and Robust Tensor Ring Decomposition for Large-scale Data [12.02023514105999]
We propose a scalable and robust TR decomposition algorithm capable of handling large-scale tensor data with missing entries and gross corruptions.
We first develop a novel auto-weighted steepest descent method that can adaptively fill the missing entries and identify the outliers during the decomposition process.
arXiv Detail & Related papers (2023-05-15T22:08:47Z)
- Deep Graph Neural Networks via Posteriori-Sampling-based Node-Adaptive Residual Module [65.81781176362848]
Graph Neural Networks (GNNs) can learn from graph-structured data through neighborhood information aggregation.
As the number of layers increases, node representations become indistinguishable, which is known as over-smoothing.
We propose a Posterior-Sampling-based, Node-distinguish Residual module (PSNR).
arXiv Detail & Related papers (2023-05-09T12:03:42Z)
- Multi-View Clustering via Semi-non-negative Tensor Factorization [120.87318230985653]
We develop a novel multi-view clustering method based on semi-non-negative tensor factorization (Semi-NTF).
Our model directly considers the between-view relationship and exploits the between-view complementary information.
In addition, we provide an optimization algorithm for the proposed method and prove mathematically that the algorithm always converges to the stationary KKT point.
arXiv Detail & Related papers (2023-03-29T14:54:19Z)
- Fast Learnings of Coupled Nonnegative Tensor Decomposition Using Optimal Gradient and Low-rank Approximation [7.265645216663691]
We introduce a novel coupled nonnegative CANDECOMP/PARAFAC decomposition algorithm optimized by the alternating gradient method (CoNCPD-APG)
By integrating low-rank approximation with the proposed CoNCPD-APG method, the proposed algorithm can significantly decrease the computational burden without compromising decomposition quality.
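To give a flavor of what an accelerated proximal-gradient update on a nonnegative factor looks like, here is a generic Nesterov-accelerated projected-gradient sketch for a single CP factor subproblem. It is a minimal illustration under assumed names (`apg_nonneg_factor`, `Xn`, `K`), not the CoNCPD-APG algorithm itself, which additionally couples factors across tensors and exploits a low-rank approximation of the data.

```python
import numpy as np

def apg_nonneg_factor(Xn, K, A0, iters=50):
    """Solve one CP factor subproblem by an accelerated proximal gradient scheme:
        min_{A >= 0}  0.5 * || Xn - A @ K.T ||_F^2
    Xn : mode-n unfolding of the data tensor; K : Khatri-Rao product of the
    other factor matrices. Generic sketch only, not the paper's exact update.
    """
    KtK, XK = K.T @ K, Xn @ K
    L = np.linalg.norm(KtK, 2)                 # Lipschitz constant of the gradient
    A, Y, t = A0.copy(), A0.copy(), 1.0
    for _ in range(iters):
        grad = Y @ KtK - XK                    # gradient of the quadratic at Y
        A_next = np.maximum(Y - grad / L, 0.0) # proximal step = projection onto A >= 0
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = A_next + ((t - 1.0) / t_next) * (A_next - A)  # Nesterov extrapolation
        A, t = A_next, t_next
    return A
```

The Nesterov extrapolation step is what distinguishes an accelerated scheme from plain projected gradient descent and is the main source of the speed-up such methods report.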
arXiv Detail & Related papers (2023-02-10T08:49:36Z)
- OrthoReg: Improving Graph-regularized MLPs via Orthogonality
Regularization [66.30021126251725]
Graph Neural Networks (GNNs) currently dominate the modeling of graph-structured data.
Graph-regularized MLPs (GR-MLPs) implicitly inject graph structure information into model weights, but their performance can hardly match that of GNNs in most tasks.
We show that GR-MLPs suffer from dimensional collapse, a phenomenon in which the largest few eigenvalues dominate the embedding space.
We propose OrthoReg, a novel GR-MLP model to mitigate the dimensional collapse issue.
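For intuition about dimensional collapse, a common remedy is to push the covariance of the learned embeddings toward the identity so that no single direction dominates; the sketch below shows such an orthogonality regularizer. This is an illustrative form only, with an assumed function name, and may differ from the exact OrthoReg objective.

```python
import numpy as np

def orthogonality_regularizer(Z):
    """Illustrative orthogonality penalty on node embeddings Z (n_nodes x d).

    Pushes the feature (column) covariance toward the identity so that no single
    direction dominates, countering dimensional collapse. Generic form; the
    paper's exact OrthoReg loss may differ.
    """
    Z = Z - Z.mean(axis=0, keepdims=True)       # center the features
    cov = (Z.T @ Z) / Z.shape[0]                # d x d feature covariance
    return np.linalg.norm(cov - np.eye(cov.shape[0]), ord='fro') ** 2

# A collapsed (rank-1) embedding is penalized far more than a well-spread one.
rng = np.random.default_rng(0)
collapsed = rng.standard_normal((256, 1)) @ rng.standard_normal((1, 16))
spread = rng.standard_normal((256, 16))
print(orthogonality_regularizer(collapsed) > orthogonality_regularizer(spread))  # True
```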
arXiv Detail & Related papers (2023-01-31T21:20:48Z)
- Low-Rank Tensor Function Representation for Multi-Dimensional Data
Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-12-01T04:00:38Z)
- Fast Hypergraph Regularized Nonnegative Tensor Ring Factorization Based
on Low-Rank Approximation [19.43953011889585]
Nonnegative tensor ring (NTR) decomposition equipped with manifold learning has become a promising model to exploit the multi-dimensional structure.
In this paper, we introduce hypergraphs into the NTR framework to further enhance feature extraction.
To reduce computational complexity and suppress noise, we apply a low-rank approximation trick to accelerate HGNTR.
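To make the hypergraph ingredient concrete, the sketch below builds the commonly used normalized hypergraph Laplacian from an incidence matrix; a hypergraph regularizer of the form trace(V^T L V) on a factor matrix V would use such an L. How HGNTR constructs its hyperedges and applies the penalty is specified in the paper itself, so the names and defaults here are assumptions.

```python
import numpy as np

def hypergraph_laplacian(H, w=None):
    """Normalized hypergraph Laplacian L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.

    H : (n_vertices, n_edges) incidence matrix, H[v, e] = 1 if vertex v is in edge e.
    w : optional hyperedge weights (defaults to 1). Standard construction used
    for hypergraph regularization; HGNTR's own hyperedge construction may differ.
    """
    n_v, n_e = H.shape
    w = np.ones(n_e) if w is None else np.asarray(w, dtype=float)
    dv = H @ w                              # weighted vertex degrees
    de = H.sum(axis=0)                      # hyperedge degrees
    Dv_isqrt = np.diag(1.0 / np.sqrt(dv))
    Theta = Dv_isqrt @ H @ np.diag(w) @ np.diag(1.0 / de) @ H.T @ Dv_isqrt
    return np.eye(n_v) - Theta

# Three vertices, two hyperedges: {0, 1} and {0, 1, 2}.
H = np.array([[1.0, 1.0],
              [1.0, 1.0],
              [0.0, 1.0]])
L = hypergraph_laplacian(H)
print(np.allclose(L, L.T))   # symmetric positive semidefinite regularizer
```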
arXiv Detail & Related papers (2021-09-06T09:24:51Z)
- Tensor completion via nonconvex tensor ring rank minimization with
guaranteed convergence [16.11872681638052]
In recent studies, the tensor ring (TR) rank has shown high effectiveness in tensor completion.
A recently proposed TR rank surrogate captures this structure via a weighted sum that penalizes all singular values equally.
In this paper, we propose to use the logdet-based function as a nonconvex relaxation of the TR rank.
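As a rough illustration of why a logdet-type surrogate penalizes singular values non-uniformly, the snippet below evaluates the common surrogate sum_i log(sigma_i + eps) on a matrix unfolding and contrasts it with the nuclear norm; the exact relaxation and algorithm used in the cited paper may differ.

```python
import numpy as np

def logdet_surrogate(M, eps=1e-3):
    """Log-det rank surrogate sum_i log(sigma_i + eps) for a matrix unfolding M.

    Unlike the nuclear norm sum_i sigma_i, which penalizes every singular value
    equally, the logarithm shrinks large (informative) singular values less than
    small (noise-like) ones. Illustrative form; the cited paper's exact
    relaxation may differ.
    """
    s = np.linalg.svd(M, compute_uv=False)
    return np.log(s + eps).sum()

M = np.diag([10.0, 1.0, 0.01])
print(logdet_surrogate(M))                        # grows slowly with the largest singular value
print(np.linalg.svd(M, compute_uv=False).sum())   # nuclear norm grows linearly with it
```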
arXiv Detail & Related papers (2020-05-14T03:13:17Z)
- Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)
- A Unified Framework for Coupled Tensor Completion [42.19293115131073]
Coupled tensor decomposition reveals the joint data structure by incorporating prior knowledge that comes from the latent coupled factors.
TR decomposition has powerful expressive ability and has achieved success in some multi-dimensional data processing applications.
The proposed method is validated in numerical experiments on synthetic data, and results on real-world data demonstrate its superiority over state-of-the-art methods in terms of recovery accuracy.
arXiv Detail & Related papers (2020-01-09T02:15:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.