Hyperbolic Graph Embedding with Enhanced Semi-Implicit Variational Inference
- URL: http://arxiv.org/abs/2011.00194v2
- Date: Wed, 10 Mar 2021 21:48:07 GMT
- Title: Hyperbolic Graph Embedding with Enhanced Semi-Implicit Variational Inference
- Authors: Ali Lotfi Rezaabad, Rahi Kalantari, Sriram Vishwanath, Mingyuan Zhou,
Jonathan Tamir
- Abstract summary: We build off of semi-implicit graph variational auto-encoders to capture higher-order statistics in a low-dimensional graph latent representation.
We incorporate hyperbolic geometry in the latent space through a Poincaré embedding to efficiently represent graphs exhibiting hierarchical structure.
- Score: 48.63194907060615
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Efficient modeling of relational data arising in physical, social, and
information sciences is challenging due to complicated dependencies within the
data. In this work, we build off of semi-implicit graph variational
auto-encoders to capture higher-order statistics in a low-dimensional graph
latent representation. We incorporate hyperbolic geometry in the latent space
through a Poincaré embedding to efficiently represent graphs exhibiting
hierarchical structure. To address the naive posterior latent distribution
assumptions in classical variational inference, we use semi-implicit
hierarchical variational Bayes to implicitly capture posteriors of given graph
data, which may exhibit heavy tails, multiple modes, skewness, and highly
correlated latent structures. We show that the existing semi-implicit
variational inference objective provably reduces information in the observed
graph. Based on this observation, we estimate and add an additional mutual
information term to the semi-implicit variational inference learning objective
to capture rich correlations arising between the input and latent spaces. We
show that the inclusion of this regularization term in conjunction with the
Poincare embedding boosts the quality of learned high-level representations and
enables more flexible and faithful graphical modeling. We experimentally
demonstrate that our approach outperforms existing graph variational
auto-encoders both in Euclidean and in hyperbolic spaces for edge link
prediction and node classification.
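Two ingredients of the approach, the Poincaré exponential map that moves latent codes into hyperbolic space and semi-implicit sampling through an auxiliary noise variable, can be sketched in a few lines. Everything below (`exp_map_zero`, `toy_mu_net`, the dimension and noise scale) is an illustrative stand-in under stated assumptions, not the authors' implementation:

```python
import math
import random

def exp_map_zero(v):
    """Poincare-ball exponential map at the origin (curvature -1):
    exp_0(v) = tanh(||v||) * v / ||v||.
    Maps any Euclidean tangent vector strictly inside the unit ball."""
    norm = math.sqrt(sum(c * c for c in v))
    if norm < 1e-12:
        return [0.0] * len(v)
    scale = math.tanh(norm) / norm
    return [scale * c for c in v]

def toy_mu_net(x, eps):
    """Stand-in for a trained MLP that mixes data x with auxiliary noise eps."""
    return [math.tanh(xi + ei) for xi, ei in zip(x, eps)]

def semi_implicit_sample(x, mu_net, sigma=0.1):
    """Semi-implicit sampling: draw auxiliary noise eps, push (x, eps)
    through a network to get a noise-dependent mean, then sample a
    Gaussian around that mean.  Marginalizing over eps makes the
    posterior implicit: it can be heavy-tailed, skewed, or multi-modal
    even though each conditional slice is Gaussian.  The result is then
    projected into the Poincare ball."""
    eps = [random.gauss(0.0, 1.0) for _ in x]
    mu = mu_net(x, eps)
    z_euclidean = [m + sigma * random.gauss(0.0, 1.0) for m in mu]
    return exp_map_zero(z_euclidean)

# every sample lands strictly inside the unit ball
z = semi_implicit_sample([0.5, -1.0], toy_mu_net)
```

The mutual-information regularizer described in the abstract would be estimated from many such (x, z) pairs; it is omitted here to keep the sketch short.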
Related papers
- Self-Supervised Conditional Distribution Learning on Graphs [15.730933577970687]
We present an end-to-end graph representation learning model to align the conditional distributions of weakly and strongly augmented features over the original features.
This alignment effectively reduces the risk of disrupting intrinsic semantic information through graph-structured data augmentation.
arXiv Detail & Related papers (2024-11-20T07:26:36Z)
- PAC Learnability under Explanation-Preserving Graph Perturbations [15.83659369727204]
Graph neural networks (GNNs) operate over graphs, enabling the model to leverage the complex relationships and dependencies in graph-structured data.
A graph explanation is a subgraph that is an "almost sufficient" statistic of the input graph with respect to its classification label.
This work considers two methods for leveraging such perturbation invariances in the design and training of GNNs.
arXiv Detail & Related papers (2024-02-07T17:23:15Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
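In the plain (non-advective) case, the graph diffusion equations behind this line of work reduce to heat flow dx/dt = -Lx with the combinatorial Laplacian L = D - A. A minimal sketch with explicit Euler steps on an illustrative three-node path graph (the advective transport term of the actual paper is omitted):

```python
def laplacian_step(adj, x, dt):
    """One explicit Euler step of dx/dt = -L x with L = D - A."""
    n = len(adj)
    out = []
    for i in range(n):
        deg = sum(adj[i])
        lx = deg * x[i] - sum(adj[i][j] * x[j] for j in range(n))
        out.append(x[i] - dt * lx)
    return out

def diffuse(adj, x, steps=50, dt=0.1):
    """Integrate the heat equation.  dt must satisfy dt < 2 / lambda_max(L)
    for stability; total mass sum(x) is conserved on undirected graphs."""
    for _ in range(steps):
        x = laplacian_step(adj, x, dt)
    return x

# heat placed on one end of a path graph 0-1-2 spreads toward uniformity
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
x = diffuse(adj, [1.0, 0.0, 0.0])
```

On this tiny graph the state converges to the uniform vector (1/3, 1/3, 1/3), the stationary point of the diffusion.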
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Graph Out-of-Distribution Generalization with Controllable Data Augmentation [51.17476258673232]
Graph Neural Network (GNN) has demonstrated extraordinary performance in classifying graph properties.
Due to the selection bias of training and testing data, distribution deviation is widespread.
We propose OOD calibration to measure the distribution deviation of virtual samples.
arXiv Detail & Related papers (2023-08-16T13:10:27Z)
- Addressing Heterophily in Node Classification with Graph Echo State Networks [11.52174067809364]
We address the challenges of heterophilic graphs with Graph Echo State Network (GESN) for node classification.
GESN is a reservoir computing model for graphs, where node embeddings are computed by an untrained message-passing function.
Our experiments show that reservoir models achieve accuracy better than or comparable to that of most fully trained deep models.
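The reservoir idea behind GESN, node embeddings computed by a fixed, untrained message-passing map, can be sketched as follows. The weight scaling, the constant input encoding, and all names here are illustrative assumptions rather than the paper's exact formulation:

```python
import math
import random

def gesn_embeddings(adj, dim=4, iters=20, rho=0.9, seed=0):
    """Reservoir-style node embeddings: iterate a FIXED, randomly
    initialized message-passing map
        x_v <- tanh(u_v + sum_{u in N(v)} W x_u)
    toward a stable state.  No weight is ever trained; only a readout on
    top of the embeddings would be.  W is scaled by dimension and maximum
    degree to keep the iteration contractive (echo state property)."""
    rng = random.Random(seed)
    n = len(adj)
    max_deg = max(sum(row) for row in adj)
    scale = rho / (dim * max_deg)  # keeps the aggregate bounded by rho
    W = [[rng.uniform(-scale, scale) for _ in range(dim)] for _ in range(dim)]
    x = [[0.0] * dim for _ in range(n)]
    for _ in range(iters):
        new_x = []
        for v in range(n):
            agg = [0.0] * dim
            for u in range(n):
                if adj[v][u]:
                    for k in range(dim):
                        agg[k] += sum(W[k][j] * x[u][j] for j in range(dim))
            # constant input 1.0 per node stands in for node features
            new_x.append([math.tanh(1.0 + a) for a in agg])
        x = new_x
    return x

# path graph 0-1-2: symmetric end nodes get identical embeddings,
# while the center node (different neighborhood) gets a different one
emb = gesn_embeddings([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
```

Because nothing is trained, computing embeddings costs only a few sparse matrix products, which is the efficiency argument for reservoir models.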
arXiv Detail & Related papers (2023-05-14T19:42:31Z)
- Graph-in-Graph (GiG): Learning interpretable latent graphs in non-Euclidean domain for biological and healthcare applications [52.65389473899139]
Graphs are a powerful tool for representing and analyzing unstructured, non-Euclidean data ubiquitous in the healthcare domain.
Recent works have shown that considering relationships between input data samples has a positive regularizing effect on the downstream task.
We propose Graph-in-Graph (GiG), a neural network architecture for protein classification and brain imaging applications.
arXiv Detail & Related papers (2022-04-01T10:01:37Z)
- Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method leads to well-formed clusters with respect to the observed layers.
The aggregated representative graph is then used to solve the associated clustering problem.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.