Joint inference of multiple graphs with hidden variables from stationary
graph signals
- URL: http://arxiv.org/abs/2110.03666v1
- Date: Tue, 5 Oct 2021 21:31:36 GMT
- Title: Joint inference of multiple graphs with hidden variables from stationary
graph signals
- Authors: Samuel Rey, Andrei Buciulea, Madeline Navarro, Santiago Segarra, and
Antonio G. Marques
- Abstract summary: We introduce a joint graph topology inference method that models the influence of the hidden variables.
Under the assumption that the observed signals are stationary on the sought graphs and that the graphs are closely related, the joint estimation of multiple networks allows us to exploit these relationships to improve the quality of the learned graphs.
- Score: 19.586429684209843
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning graphs from sets of nodal observations represents a prominent
problem formally known as graph topology inference. However, current approaches
are limited in that they typically focus on inferring single networks and assume
that observations from all nodes are available. First, many contemporary setups
involve multiple related networks, and second, it is often the case that only a
subset of nodes is observed while the rest remain hidden. Motivated by these
facts, we introduce a joint graph topology inference method that models the
influence of the hidden variables. Under the assumptions that the observed
signals are stationary on the sought graphs and the graphs are closely related,
the joint estimation of multiple networks allows us to exploit such
relationships to improve the quality of the learned graphs. Moreover, we
confront the challenging problem of modeling the influence of the hidden nodes
to minimize their detrimental effect. To obtain an amenable approach, we take
advantage of the particular structure of the setup at hand and leverage the
similarity between the different graphs, which affects both the observed and
the hidden nodes. To test the proposed method, numerical simulations over
synthetic and real-world graphs are provided.
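To make the setup concrete, below is a minimal sketch (not the authors' algorithm) of joint topology inference from stationary signals: each graph is estimated from its sample covariance by penalizing deviations from the commutativity that stationarity implies, promoting sparsity, and coupling the graphs through a similarity penalty. It assumes full observability, so the paper's hidden-variable modeling is not reproduced; the function name, weights, and constraints are illustrative.
```python
# Illustrative sketch only: commutativity + sparsity + similarity, no hidden nodes.
import numpy as np
import cvxpy as cp

def joint_graph_inference(covs, alpha=1.0, beta=5.0, gamma=50.0):
    """covs: list of K (N x N) sample covariances of stationary graph signals."""
    K, N = len(covs), covs[0].shape[0]
    S = [cp.Variable((N, N), symmetric=True) for _ in range(K)]

    obj, constraints = 0, []
    for k in range(K):
        obj += alpha * cp.norm1(S[k])                                 # sparse graphs
        # Stationarity: the covariance (approximately) commutes with the shift operator.
        obj += gamma * cp.sum_squares(covs[k] @ S[k] - S[k] @ covs[k])
        constraints += [cp.diag(S[k]) == 0, S[k] >= 0,                # hollow, non-negative
                        cp.sum(S[k][0, :]) >= 1]                      # rule out S = 0
    for k in range(K):                                                # couple the K graphs
        for l in range(k + 1, K):
            obj += beta * cp.norm1(S[k] - S[l])

    cp.Problem(cp.Minimize(obj), constraints).solve()
    return [s.value for s in S]
```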
Related papers
- Joint graph learning from Gaussian observations in the presence of
hidden nodes [26.133725549667734]
We propose a joint graph learning method that takes into account the presence of hidden (latent) variables.
We exploit the structure resulting from these modeling assumptions to formulate a convex optimization problem.
We compare the proposed algorithm with different baselines and evaluate its performance over synthetic and real-world graphs.
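As a rough illustration of the kind of convex program involved, here is a generic latent-variable Gaussian graphical model sketch (sparse-plus-low-rank splitting of the observed precision matrix); it is not the authors' formulation, and the penalties and names are illustrative.
```python
# Generic latent-variable sketch, not the paper's exact problem.
import cvxpy as cp

def latent_variable_ggm(C, lam=0.1, gamma=1.0):
    """C: (No x No) sample covariance of the observed nodes."""
    No = C.shape[0]
    S = cp.Variable((No, No), symmetric=True)   # sparse term: observed conditional structure
    L = cp.Variable((No, No), symmetric=True)   # low-rank term: effect of the hidden nodes
    K = S - L                                   # marginal precision of the observed nodes
    obj = -cp.log_det(K) + cp.trace(C @ K) + lam * cp.norm1(S) + gamma * cp.trace(L)
    prob = cp.Problem(cp.Minimize(obj), [L >> 0])
    prob.solve(solver=cp.SCS)
    return S.value, L.value
```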
arXiv Detail & Related papers (2022-12-04T13:03:41Z) - GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase it models the distribution of features associated with the nodes of the given graph; in the second, it complements these with edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z) - Joint Network Topology Inference via a Shared Graphon Model [24.077455621015552]
We consider the problem of estimating the topology of multiple networks from nodal observations.
We adopt a graphon as our random graph model, which is a nonparametric model from which graphs of potentially different sizes can be drawn.
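For intuition, the following is a small sketch of drawing graphs of different sizes from a shared graphon; the particular graphon and sampling routine are illustrative choices, not taken from the paper.
```python
import numpy as np

def sample_from_graphon(w, n, seed=None):
    """Draw an undirected graph with n nodes from graphon w: [0,1]^2 -> [0,1]."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)                      # latent node positions
    probs = w(u[:, None], u[None, :])            # edge probabilities w(u_i, u_j)
    upper = rng.uniform(size=(n, n)) < probs     # Bernoulli draws
    A = np.triu(upper, k=1).astype(int)          # keep the upper triangle only
    return A + A.T                               # symmetrize; no self-loops

# Example: two related graphs of different sizes drawn from the same graphon.
w = lambda x, y: 0.8 * np.exp(-3.0 * np.abs(x - y))   # an illustrative graphon
A_small = sample_from_graphon(w, 20, seed=0)
A_large = sample_from_graphon(w, 50, seed=1)
```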
arXiv Detail & Related papers (2022-09-17T02:38:58Z) - Discovering the Representation Bottleneck of Graph Neural Networks from
Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
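The two constructions can be sketched as follows; Euclidean distances and a Gaussian kernel are assumed here purely for illustration.
```python
import numpy as np

def knn_graph(X, k=5):
    """K-nearest-neighbor graph from node features X (n x d), Euclidean metric."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                   # exclude self-loops
    A = np.zeros_like(d)
    idx = np.argsort(d, axis=1)[:, :k]            # k closest neighbors per node
    rows = np.repeat(np.arange(X.shape[0]), k)
    A[rows, idx.ravel()] = 1.0
    return np.maximum(A, A.T)                     # symmetrize

def fc_graph(X, sigma=1.0):
    """Fully connected graph with Gaussian-kernel edge weights."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    W = np.exp(-d**2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    return W
```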
arXiv Detail & Related papers (2022-05-15T11:38:14Z) - Graphon-aided Joint Estimation of Multiple Graphs [24.077455621015552]
We consider the problem of estimating the topology of multiple networks from nodal observations.
We adopt a graphon as our random graph model, which is a nonparametric model from which graphs of potentially different sizes can be drawn.
arXiv Detail & Related papers (2022-02-11T15:20:44Z) - Self-Supervised Graph Learning with Proximity-based Views and Channel
Contrast [4.761137180081091]
Graph neural networks (GNNs) use neighborhood aggregation as a core component that results in feature smoothing among nodes in proximity.
To tackle this problem, we strengthen the graph with two additional graph views, in which nodes are directly linked to those with the most similar features or local structures.
We propose a method that aims to maximize the agreement between representations across generated views and the original graph.
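A minimal sketch of "maximizing agreement between views" is given below, using a generic InfoNCE-style objective on two sets of node embeddings; the paper's channel-level contrast is not reproduced, and the temperature and normalization are illustrative.
```python
import numpy as np

def info_nce_agreement(Z1, Z2, tau=0.5):
    """InfoNCE-style agreement between two views' node embeddings (n x d each).
    Node i in view 1 is pulled toward node i in view 2 and pushed from the rest."""
    Z1 = Z1 / np.linalg.norm(Z1, axis=1, keepdims=True)
    Z2 = Z2 / np.linalg.norm(Z2, axis=1, keepdims=True)
    logits = Z1 @ Z2.T / tau                       # pairwise cosine similarities
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))            # loss to minimize (maximizes agreement)
```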
arXiv Detail & Related papers (2021-06-07T15:38:36Z) - Joint Inference of Multiple Graphs from Matrix Polynomials [34.98220454543502]
Inferring graph structure from observations on the nodes is an important and popular network science task.
We study the problem of jointly inferring multiple graphs from the observation of signals at their nodes.
We propose a convex optimization method along with sufficient conditions that guarantee the recovery of the true graphs.
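The matrix-polynomial model behind this line of work can be illustrated as below: signals filtered by a polynomial of the graph shift operator have a covariance that is itself a polynomial of the shift and therefore commutes with it, which is the property such recovery methods exploit. The graph and filter taps here are arbitrary examples, not the paper's algorithm.
```python
import numpy as np

rng = np.random.default_rng(0)
N = 10
A = (rng.uniform(size=(N, N)) < 0.3).astype(float)
S = np.triu(A, 1); S = S + S.T                       # symmetric adjacency (shift operator)

h = np.array([1.0, 0.7, 0.3])                        # filter taps
H = sum(c * np.linalg.matrix_power(S, l) for l, c in enumerate(h))

C = H @ H.T                                          # covariance of x = H w, w white noise
print(np.linalg.norm(C @ S - S @ C))                 # ~0: covariance commutes with S
```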
arXiv Detail & Related papers (2020-10-16T02:45:15Z) - Spectral Embedding of Graph Networks [76.27138343125985]
We introduce an unsupervised graph embedding that trades off local node similarity and connectivity, and global structure.
The embedding is based on a generalized graph Laplacian, whose eigenvectors compactly capture both network structure and neighborhood proximity in a single representation.
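A bare-bones spectral embedding sketch is shown below; it uses the standard symmetric normalized Laplacian rather than the generalized Laplacian of the paper, so it only conveys the general idea.
```python
import numpy as np

def spectral_embedding(A, dim=2):
    """Embed nodes with eigenvectors of the symmetric normalized Laplacian."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg, dtype=float)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    L = np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L)                 # eigenvalues in ascending order
    return vecs[:, 1:dim + 1]                      # skip the trivial first eigenvector
```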
arXiv Detail & Related papers (2020-09-30T04:59:10Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration observed when stacking many layers to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
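The decoupled, multi-hop propagation underlying this idea can be sketched as follows; DAGNN learns per-node combination weights adaptively, whereas this simplified version uses fixed uniform weights for illustration.
```python
import numpy as np

def multi_hop_propagate(A, X, hops=10, weights=None):
    """Collect 0..hops-hop smoothed features and combine them.
    Uniform weights stand in for DAGNN's learned, adaptive combination."""
    deg = A.sum(axis=1) + 1.0                        # degrees with self-loops
    A_hat = (A + np.eye(A.shape[0])) / np.sqrt(deg[:, None] * deg[None, :])
    reps, H = [X], X
    for _ in range(hops):
        H = A_hat @ H                                # one more hop of smoothing
        reps.append(H)
    w = np.ones(hops + 1) / (hops + 1) if weights is None else np.asarray(weights)
    return sum(w_l * R for w_l, R in zip(w, reps))   # weighted combination over hops
```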
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph
Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z) - Interpretable Deep Graph Generation with Node-Edge Co-Disentanglement [55.2456981313287]
We propose a new disentanglement enhancement framework for deep generative models for attributed graphs.
A novel variational objective is proposed to disentangle the above three types of latent factors, with novel architecture for node and edge deconvolutions.
Within each type, individual-factor-wise disentanglement is further enhanced, which is shown to be a generalization of the existing framework for images.
arXiv Detail & Related papers (2020-06-09T16:33:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.