Joint graph learning from Gaussian observations in the presence of
hidden nodes
- URL: http://arxiv.org/abs/2212.01816v1
- Date: Sun, 4 Dec 2022 13:03:41 GMT
- Title: Joint graph learning from Gaussian observations in the presence of
hidden nodes
- Authors: Samuel Rey, Madeline Navarro, Andrei Buciulea, Santiago Segarra, and
Antonio G. Marques
- Abstract summary: We propose a joint graph learning method that takes into account the presence of hidden (latent) variables.
We exploit the structure resulting from the previous considerations to propose a convex optimization problem.
We compare the proposed algorithm with different baselines and evaluate its performance over synthetic and real-world graphs.
- Score: 26.133725549667734
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph learning problems are typically approached by focusing on learning the
topology of a single graph when signals from all nodes are available. However,
many contemporary setups involve multiple related networks and, moreover, it is
often the case that only a subset of nodes is observed while the rest remain
hidden. Motivated by this, we propose a joint graph learning method that takes
into account the presence of hidden (latent) variables. Intuitively, the
presence of the hidden nodes renders the inference task ill-posed and
challenging to solve, so we overcome this detrimental influence by harnessing
the similarity of the estimated graphs. To that end, we assume that the
observed signals are drawn from a Gaussian Markov random field with latent
variables and we carefully model the graph similarity among hidden (latent)
nodes. Then, we exploit the structure resulting from the previous
considerations to propose a convex optimization problem that solves the joint
graph learning task by providing a regularized maximum likelihood estimator.
Finally, we compare the proposed algorithm with different baselines and
evaluate its performance over synthetic and real-world graphs.
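As a rough illustration of the per-graph ingredient only (not the paper's estimator), the sketch below draws signals from two similar Gaussian Markov random fields, inverts each sample covariance as an unregularized maximum likelihood estimate of the precision matrix, and then crudely couples the two estimates by shrinking them toward their average. The shrinkage step is a hypothetical stand-in for the similarity regularization; the paper's convex, latent-variable-aware joint program is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gmrf(precision, n, rng):
    """Draw n zero-mean samples from a Gaussian MRF with the given precision matrix."""
    cov = np.linalg.inv(precision)
    return rng.multivariate_normal(np.zeros(len(cov)), cov, size=n)

# Two similar chain graphs on 4 nodes: shared backbone, one perturbed edge weight
theta1 = np.eye(4) + 0.3 * (np.eye(4, k=1) + np.eye(4, k=-1))
theta2 = theta1.copy()
theta2[0, 1] = theta2[1, 0] = 0.1

X1 = sample_gmrf(theta1, 5000, rng)
X2 = sample_gmrf(theta2, 5000, rng)

# Per-graph (unregularized) MLE: invert each sample covariance
S1, S2 = np.cov(X1.T), np.cov(X2.T)
hat1, hat2 = np.linalg.inv(S1), np.linalg.inv(S2)

# Toy coupling: shrink each estimate toward the average, mimicking the
# effect of a graph-similarity penalty in a joint estimator
gamma = 0.5
avg = 0.5 * (hat1 + hat2)
joint1 = (1 - gamma) * hat1 + gamma * avg
joint2 = (1 - gamma) * hat2 + gamma * avg
```

With enough samples, the off-diagonal entries of `hat1` and `hat2` recover the chain's edge weights; the coupled estimates trade per-graph fidelity for agreement between the two graphs.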
Related papers
- Online Network Inference from Graph-Stationary Signals with Hidden Nodes [31.927912288598467]
We present a novel method for online graph estimation that accounts for the presence of hidden nodes.
We then formulate a convex optimization problem for graph learning from streaming, incomplete graph signals.
arXiv Detail & Related papers (2024-09-13T12:09:09Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]

We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
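For intuition only, the Gumbel-Softmax relaxation mentioned above can be sketched in plain NumPy. This is the generic relaxation for sampling from a categorical distribution, not NodeFormer's kernelized operator; `tau` is the temperature parameter.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Draw a relaxed (continuous) sample from a categorical distribution
    using the Gumbel-Softmax (Concrete) trick."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via inverse transform sampling
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    e = np.exp(y - y.max())  # numerically stable softmax
    return e / e.sum()

sample = gumbel_softmax(np.array([2.0, 0.5, 0.1]), tau=0.5)
# sample is a probability vector; as tau -> 0 it concentrates on one category
```

The relaxation keeps sampling differentiable with respect to the logits, which is what makes it usable inside an end-to-end trained structure learner.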
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Graphon-aided Joint Estimation of Multiple Graphs [24.077455621015552]
We consider the problem of estimating the topology of multiple networks from nodal observations.
We adopt a graphon as our random graph model, which is a nonparametric model from which graphs of potentially different sizes can be drawn.
arXiv Detail & Related papers (2022-02-11T15:20:44Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Joint inference of multiple graphs with hidden variables from stationary graph signals [19.586429684209843]
We introduce a joint graph topology inference method that models the influence of the hidden variables.
Under the assumption that the observed signals are stationary on the sought graphs, the joint estimation of multiple networks allows us to exploit the relationships among them.
arXiv Detail & Related papers (2021-10-05T21:31:36Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- Learning Graphs from Smooth Signals under Moment Uncertainty [23.868075779606425]
We consider the problem of inferring the graph structure from a given set of graph signals.
Traditional graph learning models do not take distributional uncertainty in the observed signals into account.
arXiv Detail & Related papers (2021-05-12T06:47:34Z)
- A Thorough View of Exact Inference in Graphs from the Degree-4 Sum-of-Squares Hierarchy [37.34153902687548]
We tackle the problem of exactly recovering an unknown ground-truth binary labeling of the nodes from a single corrupted observation of each edge.
We apply a hierarchy of relaxations, known as the sum-of-squares hierarchy, to the problem.
We show that the solution of the dual of the relaxed problem is related to finding edge weights of the Johnson and Kneser graphs.
arXiv Detail & Related papers (2021-02-16T08:36:19Z)
- Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method yields graphs with a clear cluster structure, and we jointly learn a clustering algorithm for solving clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
- Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
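The edge-to-node correspondence described above can be seen in a few lines of plain Python. This is the generic line-graph construction for simple undirected graphs, not the paper's neural model; `line_graph` is a helper name introduced here for illustration.

```python
from itertools import combinations

def line_graph(edges):
    """Build the line graph L(G): each edge of G becomes a node of L(G),
    and two such nodes are adjacent iff the original edges share an endpoint."""
    nodes = [tuple(sorted(e)) for e in edges]  # edges of G = nodes of L(G)
    adj = {n: set() for n in nodes}
    for a, b in combinations(nodes, 2):
        if set(a) & set(b):  # shared endpoint in the original graph
            adj[a].add(b)
            adj[b].add(a)
    return adj

# Path graph 1-2-3: its line graph has nodes (1, 2) and (2, 3) joined by one edge
L = line_graph([(1, 2), (2, 3)])
# L == {(1, 2): {(2, 3)}, (2, 3): {(1, 2)}}
```

Predicting whether an edge exists in the original graph then amounts to classifying the corresponding node of the line graph.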
arXiv Detail & Related papers (2020-10-20T05:54:31Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.