Joint Network Topology Inference via Structured Fusion Regularization
- URL: http://arxiv.org/abs/2103.03471v2
- Date: Mon, 8 Mar 2021 06:21:00 GMT
- Title: Joint Network Topology Inference via Structured Fusion Regularization
- Authors: Yanli Yuan, De Wen Soh, Xiao Yang, Kun Guo, Tony Q. S. Quek
- Abstract summary: Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
- Score: 70.30364652829164
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Joint network topology inference represents a canonical problem of jointly
learning multiple graph Laplacian matrices from heterogeneous graph signals. In
such a problem, a widely employed assumption is that of a simple common
component shared among multiple networks. In practice, however, multiple
networks may exhibit a more intricate topological pattern that simultaneously
comprises sparse, homogeneous, and heterogeneous components. In this paper, we
propose a general graph estimator based on a novel structured fusion
regularization that enables us to jointly learn multiple graph Laplacian
matrices with such complex topological patterns, and that enjoys both high
computational efficiency and rigorous theoretical guarantees. Moreover, in the
proposed regularization term, the topological pattern among networks is
characterized by a Gram matrix, endowing our graph estimator with the
flexibility to model different types of topological patterns through different
choices of the Gram matrix. Computationally, the regularization term couples
the parameters together, which makes the formulated optimization problem
difficult to solve directly; we therefore develop a computationally scalable
algorithm based on the alternating direction method of multipliers (ADMM) to
solve it efficiently. Theoretically, we provide an analysis of the proposed
graph estimator that establishes a non-asymptotic bound on the estimation error
in the high-dimensional setting and reflects the effect of several key factors
on the convergence rate of our algorithm. Finally, the superior performance of
the proposed method is illustrated through simulated and real data examples.
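To make the formulation concrete, the following is a minimal sketch of joint graph learning with a Gram-matrix-weighted fusion penalty. It is not the authors' estimator: it optimizes nonnegative weight matrices rather than Laplacian-constrained variables, substitutes a standard smoothness-plus-log-degree-barrier data term for the paper's objective, and runs projected gradient descent instead of ADMM; the function names (fit_joint_graphs, project_weights) and all hyper-parameter values are illustrative assumptions.

```python
import numpy as np

def pairwise_sq_dists(X):
    """Z[i, j] = squared Euclidean distance between the signal vectors of nodes i and j.
    X has shape (n_nodes, n_observations)."""
    sq = np.sum(X ** 2, axis=1)
    return sq[:, None] + sq[None, :] - 2.0 * X @ X.T

def project_weights(W):
    """Project onto symmetric, non-negative weight matrices with zero diagonal."""
    W = 0.5 * (W + W.T)
    W = np.maximum(W, 0.0)
    np.fill_diagonal(W, 0.0)
    return W

def fit_joint_graphs(signals, gram, alpha=1.0, beta=0.1, gamma=1.0,
                     step=1e-2, iters=2000, eps=1e-10):
    """Jointly estimate K graph weight matrices with block-coordinate projected
    gradient steps. Per graph k, the (assumed) objective is
        <W_k, Z_k> - alpha * sum_i log(deg_i(W_k)) + (beta / 2) * ||W_k||_F^2
        + gamma * sum_{l != k} gram[k, l] * ||W_k - W_l||_F^2,
    where Z_k holds pairwise signal distances (smoothness of signals on graph k)
    and the Gram matrix `gram` sets how strongly each pair of graphs is fused."""
    K, n = len(signals), signals[0].shape[0]
    Z = []
    for X in signals:
        Zk = pairwise_sq_dists(X)
        Z.append(Zk / Zk.mean())                     # scale-free distances
    W = [project_weights(np.ones((n, n))) for _ in range(K)]
    for _ in range(iters):
        for k in range(K):
            deg = W[k].sum(axis=1) + eps
            # W[k][i, j] contributes to both deg_i and deg_j under symmetry.
            grad = (Z[k]
                    - alpha * (1.0 / deg[:, None] + 1.0 / deg[None, :])
                    + beta * W[k])
            for l in range(K):
                if l != k:
                    grad = grad + 2.0 * gamma * gram[k, l] * (W[k] - W[l])
            W[k] = project_weights(W[k] - step * grad)
    return W

# Toy usage: two networks whose signals share most of their structure.
rng = np.random.default_rng(0)
base = rng.standard_normal((20, 200))
X1 = base + 0.1 * rng.standard_normal((20, 200))
X2 = base + 0.1 * rng.standard_normal((20, 200))
gram = np.array([[1.0, 0.8],
                 [0.8, 1.0]])   # only the off-diagonal entries couple the graphs here
W1, W2 = fit_joint_graphs([X1, X2], gram)
print("relative difference between the learned graphs:",
      np.linalg.norm(W1 - W2) / np.linalg.norm(W1))
```

In this toy setting, driving the off-diagonal Gram entries toward zero decouples the graphs, while larger values pull the estimated topologies together; richer choices of the Gram matrix would play the role of the structured patterns (sparse, homogeneous, and heterogeneous components) discussed in the abstract.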
Related papers
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Random Feature Approximation for Online Nonlinear Graph Topology
Identification [7.992550355579789]
We propose a kernel-based algorithm for graph topology estimation.
We exploit the fact that real-world networks often exhibit sparse topologies.
The experiments conducted on real and synthetic data show that the proposed method outperforms its competitors.
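As background for the random-feature idea named in that entry's title, here is a generic sketch of random Fourier features (Rahimi and Recht, 2007), which replace a Gaussian kernel with an explicit finite-dimensional feature map so that kernel-based estimators can be updated online at cost linear in the number of features. This is the textbook construction, not the cited paper's algorithm; the function name and parameter values are illustrative.

```python
import numpy as np

def random_fourier_features(X, n_features=500, gamma=0.5, seed=0):
    """Map rows of X to features whose inner products approximate the Gaussian
    kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Spectral sampling: for this kernel the frequencies are N(0, 2 * gamma * I).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Sanity check: the explicit feature map reproduces the kernel matrix closely.
X = np.random.default_rng(1).standard_normal((50, 5))
Phi = random_fourier_features(X)
approx = Phi @ Phi.T
exact = np.exp(-0.5 * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
print("max abs kernel approximation error:", np.abs(approx - exact).max())
```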
arXiv Detail & Related papers (2021-10-19T12:48:12Z) - Fractal Structure and Generalization Properties of Stochastic
Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded in terms of the 'complexity' of the fractal structure that underlies its invariant measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z) - Joint Inference of Multiple Graphs from Matrix Polynomials [34.98220454543502]
Inferring graph structure from observations on the nodes is an important and popular network science task.
We study the problem of jointly inferring multiple graphs from the observation of signals at their nodes.
We propose a convex optimization method along with sufficient conditions that guarantee the recovery of the true graphs.
arXiv Detail & Related papers (2020-10-16T02:45:15Z) - The multilayer random dot product graph [6.722870980553432]
We present a comprehensive extension of the latent position network model known as the random dot product graph.
We propose a method for jointly embedding submatrices into a suitable latent space.
Empirical improvements in link prediction over single graph embeddings are exhibited in a cyber-security example.
arXiv Detail & Related papers (2020-07-20T20:31:39Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
arXiv Detail & Related papers (2020-03-30T15:37:50Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCN).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)