Provable Tensor Completion with Graph Information
- URL: http://arxiv.org/abs/2310.02543v1
- Date: Wed, 4 Oct 2023 02:55:10 GMT
- Title: Provable Tensor Completion with Graph Information
- Authors: Kaidong Wang, Yao Wang, Xiuwu Liao, Shaojie Tang, Can Yang and Deyu
Meng
- Abstract summary: We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm.
- Score: 49.08648842312456
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs, depicting the interrelations between variables, have been widely
used as effective side information for accurate data recovery in various
matrix- and tensor-recovery applications. In this paper, we study the tensor
completion problem with graph information. Current research on
graph-regularized tensor completion tends to be task-specific, lacking
generality and systematic approaches. Furthermore, a recovery theory to ensure
performance remains absent. Moreover, these approaches overlook the dynamic
aspects of graphs, treating them as static akin to matrices, even though graphs
could exhibit dynamism in tensor-related scenarios. To confront these
challenges, we introduce a pioneering framework in this paper that
systematically formulates a novel model, theory, and algorithm for solving the
dynamic graph regularized tensor completion problem. For the model, we
establish a rigorous mathematical representation of the dynamic graph, based on
which we derive a new tensor-oriented graph smoothness regularization. By
integrating this regularization into a tensor decomposition model based on
transformed t-SVD, we develop a comprehensive model simultaneously capturing
the low-rank and similarity structure of the tensor. In terms of theory, we
showcase the alignment between the proposed graph smoothness regularization and
a weighted tensor nuclear norm. Subsequently, we establish assurances of
statistical consistency for our model, effectively bridging a gap in the
theoretical examination of the problem involving tensor recovery with graph
information. In terms of the algorithm, we develop a highly effective solution
with guaranteed convergence for the resulting model. To demonstrate the
strengths of our model relative to established ones, we provide in-depth
numerical experiments on synthetic data as well as real-world datasets.
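The model's two ingredients, a transformed t-SVD low-rank term and a graph smoothness regularizer, can be illustrated with a minimal proximal-gradient sketch. This is not the authors' algorithm: the unweighted tensor nuclear norm proximal step (t-SVT under the FFT), the slice-wise Laplacian penalty, and all names and step sizes are illustrative assumptions.

```python
import numpy as np

def t_svt(X, tau):
    """Prox of the tensor nuclear norm under the t-SVD: FFT along the
    third mode, soft-threshold singular values of each frontal slice."""
    Xf = np.fft.fft(X, axis=2)
    out = np.zeros_like(Xf)
    for k in range(X.shape[2]):
        U, s, Vt = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        out[:, :, k] = (U * np.maximum(s - tau, 0.0)) @ Vt
    return np.real(np.fft.ifft(out, axis=2))

def complete(M, mask, L, lam=1.0, gamma=0.1, step=0.5, iters=200):
    """Sketch of graph-regularized tensor completion:
        min_X  lam * ||X||_TNN + (gamma / 2) * sum_k tr(X_k^T L X_k)
        s.t.   X matches M on observed entries (mask == 1)."""
    X = mask * M
    for _ in range(iters):
        grad = gamma * np.einsum('ij,jkl->ikl', L, X)  # graph smoothness
        X = t_svt(X - step * grad, step * lam)          # low-rank prox
        X = mask * M + (1 - mask) * X                   # data consistency
    return X
```

Here the Laplacian L encodes similarity along the first tensor mode; the paper's weighted tensor nuclear norm would replace the plain threshold with weight-dependent shrinkage.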
Related papers
- Advective Diffusion Transformers for Topological Generalization in Graph
Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Interpretable and Scalable Graphical Models for Complex Spatio-temporal Processes [3.469001874498102]
This thesis focuses on data with complex spatio-temporal structure and on probabilistic graphical models that learn that structure in an interpretable and scalable manner.
Practical applications of the methodology are considered using real datasets.
These include brain-connectivity analysis, space weather forecasting using solar imaging data, longitudinal analysis of public opinion using Twitter data, and mining of mental-health-related issues using TalkLife data.
arXiv Detail & Related papers (2023-01-15T05:39:30Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
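As a generic illustration of the positive-definite-matrix geometry invoked above, the following sketches one affine-invariant Riemannian gradient step for the Gaussian negative log-likelihood. This is a textbook construction under assumed names, not the paper's penalized elliptical estimator.

```python
import numpy as np

def sym_fun(S, f):
    """Apply a scalar function to the eigenvalues of a symmetric matrix."""
    w, V = np.linalg.eigh(S)
    return (V * f(w)) @ V.T

def riemannian_step(S, C, eta):
    """One gradient step on the SPD manifold (affine-invariant metric)
    for f(S) = log det S + tr(S^{-1} C). The Riemannian gradient is
    S - C; the update moves along the geodesic from S."""
    R = sym_fun(S, np.sqrt)                          # S^{1/2}
    Rinv = sym_fun(S, lambda w: 1.0 / np.sqrt(w))    # S^{-1/2}
    inner = Rinv @ (S - C) @ Rinv                    # symmetric
    return R @ sym_fun(inner, lambda w: np.exp(-eta * w)) @ R
```

Iterating this step from the identity converges to the sample covariance C, the unpenalized MLE; low-rank constraints would restrict the iterates to a fixed-rank submanifold.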
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- Hyperbolic Graph Embedding with Enhanced Semi-Implicit Variational Inference [48.63194907060615]
We build off of semi-implicit graph variational auto-encoders to capture higher-order statistics in a low-dimensional graph latent representation.
We incorporate hyperbolic geometry in the latent space through a Poincare embedding to efficiently represent graphs exhibiting hierarchical structure.
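The Poincaré embedding mentioned above relies on the standard closed-form geodesic distance of the Poincaré ball; a minimal sketch (not taken from the paper's code):

```python
import numpy as np

def poincare_dist(u, v, eps=1e-12):
    """Geodesic distance between two points inside the unit ball in the
    Poincare ball model of hyperbolic space."""
    diff = np.dot(u - v, u - v)
    denom = (1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v))
    return np.arccosh(1.0 + 2.0 * diff / (denom + eps))
```

Distances grow without bound as points approach the boundary, which is what lets hierarchical (tree-like) graphs embed with low distortion in few dimensions.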
arXiv Detail & Related papers (2020-10-31T05:48:34Z)
- Low-Rank and Sparse Enhanced Tucker Decomposition for Tensor Completion [3.498620439731324]
We introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion.
Our model includes a sparse regularization term to promote a sparse core tensor, which is beneficial for tensor data compression.
Notably, our model can handle different types of real-world datasets, since it exploits the potential periodicity and inherent correlation properties appearing in tensors.
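A low-rank Tucker model with a sparsity-promoting core, in the spirit of (but not identical to) the summarized model, can be sketched as a truncated HOSVD followed by soft-thresholding the core; the 3-way restriction, ranks, and threshold are assumptions.

```python
import numpy as np

def unfold(T, mode):
    """Mode-m matricization of a 3-way tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def sparse_tucker(T, ranks, tau=0.0):
    """Truncated HOSVD, then soft-threshold the core to promote
    sparsity (a stand-in for the paper's sparse-core regularization)."""
    U = []
    for m, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(T, m), full_matrices=False)
        U.append(u[:, :r])
    # Core: contract T with the factor matrices on every mode.
    G = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])
    G = np.sign(G) * np.maximum(np.abs(G) - tau, 0.0)
    T_hat = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])
    return T_hat, G, U
```

With tau=0 this reduces to the plain truncated HOSVD, which reconstructs a tensor exactly when its multilinear ranks match the requested ranks.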
arXiv Detail & Related papers (2020-10-01T12:45:39Z)
- Lossless Compression of Structured Convolutional Models via Lifting [14.63152363481139]
We introduce a simple and efficient technique to detect the symmetries and compress the neural models without loss of any information.
We demonstrate through experiments that such compression can lead to significant speedups of structured convolutional models.
arXiv Detail & Related papers (2020-07-13T08:02:27Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs, which are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
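A multi-relational graph convolution of the kind TGCN generalizes can be sketched as one layer that stacks one adjacency matrix per relation into a tensor; the row normalization, per-relation weights, and ReLU here are assumptions, not the paper's exact architecture.

```python
import numpy as np

def tgcn_layer(A, X, W):
    """One multi-relational graph convolution.
    A: (R, N, N) adjacency tensor, one slice per relation.
    X: (N, F_in) node features.  W: (R, F_in, F_out) per-relation weights."""
    out = np.zeros((X.shape[0], W.shape[2]))
    for r in range(A.shape[0]):
        deg = np.maximum(A[r].sum(axis=1), 1e-9)
        A_hat = A[r] / deg[:, None]     # row-normalized adjacency
        out += A_hat @ X @ W[r]         # aggregate + transform per relation
    return np.maximum(out, 0.0)         # ReLU
```

Stacking such layers mixes information across both graph neighborhoods and relations, which is the source of the robustness gains described above.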
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
- Partially Observed Dynamic Tensor Response Regression [17.930417764563106]
In modern data science, dynamic tensor data is prevailing in numerous applications.
We develop a regression model with partially observed dynamic tensor sparsity as a predictor.
We illustrate the efficacy of our proposed method using simulations, and two real applications.
arXiv Detail & Related papers (2020-02-22T17:14:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.