Graph Auto-Encoder Via Neighborhood Wasserstein Reconstruction
- URL: http://arxiv.org/abs/2202.09025v1
- Date: Fri, 18 Feb 2022 05:08:04 GMT
- Title: Graph Auto-Encoder Via Neighborhood Wasserstein Reconstruction
- Authors: Mingyue Tang, Carl Yang, Pan Li
- Abstract summary: We propose a novel graph decoder to reconstruct the entire neighborhood information regarding both proximity and structure via Neighborhood Wasserstein Reconstruction (NWR).
NWR jointly predicts its node degree and neighbor feature distribution, where the distribution prediction adopts an optimal-transport loss based on the Wasserstein distance.
Experiments on both synthetic and real-world network datasets show that NWR is much more advantageous in structure-oriented graph mining tasks, while also achieving competitive performance in proximity-oriented ones.
- Score: 19.684222592220614
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have drawn significant research attention
recently, mostly under the setting of semi-supervised learning. When
task-agnostic representations are preferred or supervision is simply
unavailable, the auto-encoder framework comes in handy with a natural graph
reconstruction objective for unsupervised GNN training. However, existing graph
auto-encoders are designed to reconstruct the direct links, so GNNs trained in
this way are only optimized towards proximity-oriented graph mining tasks, and
will fall short when the topological structures matter. In this work, we
revisit the graph encoding process of GNNs which essentially learns to encode
the neighborhood information of each node into an embedding vector, and propose
a novel graph decoder to reconstruct the entire neighborhood information
regarding both proximity and structure via Neighborhood Wasserstein
Reconstruction (NWR). Specifically, from the GNN embedding of each node, NWR
jointly predicts its node degree and neighbor feature distribution, where the
distribution prediction adopts an optimal-transport loss based on the
Wasserstein distance. Extensive experiments on both synthetic and real-world
network datasets show that the unsupervised node representations learned with
NWR are much more advantageous in structure-oriented graph mining tasks, while
also achieving competitive performance in proximity-oriented ones.
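To make the decoding objective concrete, the following is a minimal PyTorch sketch of the two heads the abstract describes: a degree regressor and a neighbor-feature-distribution predictor trained with an entropy-regularized Sinkhorn approximation of the Wasserstein distance. All names (`NWRDecoder`, `sinkhorn_wasserstein`), layer sizes, the set size `q`, and the Sinkhorn solver are illustrative assumptions, not the authors' released code.

```python
import torch

def sinkhorn_wasserstein(x, y, eps=0.1, n_iters=100):
    # Entropy-regularized optimal transport between two point clouds with
    # uniform weights; returns <plan, cost>, an approximation of squared W2.
    # eps and n_iters are illustrative; log-domain Sinkhorn is more stable.
    cost = torch.cdist(x, y, p=2) ** 2             # pairwise squared distances
    n, m = cost.shape
    a = torch.full((n,), 1.0 / n, device=x.device)
    b = torch.full((m,), 1.0 / m, device=x.device)
    K = torch.exp(-cost / eps)                     # Gibbs kernel
    u = torch.ones(n, device=x.device)
    for _ in range(n_iters):                       # Sinkhorn fixed-point updates
        v = b / (K.t() @ u)
        u = a / (K @ v)
    plan = u[:, None] * K * v[None, :]             # diag(u) K diag(v)
    return (plan * cost).sum()

class NWRDecoder(torch.nn.Module):
    # Hypothetical decoder head: from one node embedding, predict (i) the
    # node degree and (ii) q vectors standing in for a sample of the
    # neighbor feature distribution.
    def __init__(self, emb_dim, feat_dim, q=8, hidden=64):
        super().__init__()
        self.q, self.feat_dim = q, feat_dim
        self.degree_head = torch.nn.Sequential(
            torch.nn.Linear(emb_dim, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, 1))
        self.neighbor_head = torch.nn.Sequential(
            torch.nn.Linear(emb_dim, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, q * feat_dim))

    def loss(self, z, degree, neighbor_feats):
        # z: (emb_dim,); neighbor_feats: (deg, feat_dim) observed features
        deg_loss = (self.degree_head(z).squeeze(-1) - degree) ** 2
        pred = self.neighbor_head(z).view(self.q, self.feat_dim)
        return deg_loss + sinkhorn_wasserstein(pred, neighbor_feats)
```

In practice the predicted set size would have to relate to the predicted degree; the paper's exact parameterization is not reproduced here.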
Related papers
- Commute Graph Neural Networks [7.143879014059894] (2024-06-30)
We introduce Commute Graph Neural Networks (CGNN), an approach that seamlessly integrates node-wise commute time into the message passing scheme.
Commute time is computed efficiently through a newly formulated digraph Laplacian, which enables CGNN to directly capture the mutual, asymmetric relationships in digraphs.
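For context, the commute time between two nodes is the expected round-trip length of a random walk. The sketch below computes it for an undirected graph via the classical Laplacian-pseudoinverse identity; CGNN's digraph Laplacian is not reproduced here, so this is only a hedged illustration of the quantity being injected.

```python
import numpy as np

def commute_times(adj):
    # Classical identity for undirected graphs:
    #   C(u, v) = vol(G) * (L+[u, u] + L+[v, v] - 2 * L+[u, v]),
    # where L+ is the pseudoinverse of the graph Laplacian.
    deg = adj.sum(axis=1)
    lap_pinv = np.linalg.pinv(np.diag(deg) - adj)
    d = np.diag(lap_pinv)
    return deg.sum() * (d[:, None] + d[None, :] - 2 * lap_pinv)

# 4-node path graph: commute time grows with distance, e.g. C(0,1)=6, C(0,3)=18
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
print(commute_times(A).round(1))
```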
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396] (2024-01-28)
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain more comprehensive node embedding representations.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104] (2023-05-22)
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
- Geodesic Graph Neural Network for Efficient Graph Representation Learning [34.047527874184134] (2022-10-06)
We propose an efficient GNN framework called Geodesic GNN (GDGNN), which injects conditional relationships between nodes into the model without labeling.
Conditioned on the geodesic representations, GDGNN is able to generate node, link, and graph representations that carry much richer structural information than plain GNNs.
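Geodesic here means shortest-path. Below is a minimal sketch of the underlying quantity (not the GDGNN model itself), assuming an unweighted adjacency-list graph:

```python
from collections import deque

def geodesic_distances(adj_list, source):
    # BFS shortest-path (geodesic) distances from `source` in an
    # unweighted graph; -1 marks unreachable nodes.
    dist = [-1] * len(adj_list)
    dist[source] = 0
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj_list[u]:
            if dist[v] == -1:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Cycle 0-1-2-3-0: node 2 is two hops from node 0
print(geodesic_distances([[1, 3], [0, 2], [1, 3], [2, 0]], 0))  # [0, 1, 2, 1]
```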
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085] (2022-08-21)
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644] (2022-01-22)
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
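As a hedged illustration of what "band-pass filters defined via the geometric scattering transform" are, the sketch below builds the standard dyadic diffusion wavelets Psi_j = P^(2^(j-1)) - P^(2^j) from the lazy random-walk operator P; the paper's exact hybrid architecture and filter choices are not reproduced.

```python
import numpy as np

def diffusion_wavelets(adj, num_scales=3):
    # Lazy random-walk operator P = (I + A D^-1) / 2 (assumes no isolated
    # nodes), then dyadic band-pass wavelets Psi_j = P^(2^(j-1)) - P^(2^j)
    # for j = 1..num_scales, built by repeated squaring.
    n = adj.shape[0]
    P = 0.5 * (np.eye(n) + adj / adj.sum(axis=0))
    powers = [P]                                  # P^1, P^2, P^4, ...
    for _ in range(num_scales):
        powers.append(powers[-1] @ powers[-1])
    return [powers[j] - powers[j + 1] for j in range(num_scales)]
```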
- Structure Enhanced Graph Neural Networks for Link Prediction [6.872826041648584] (2022-01-14)
We propose Structure Enhanced Graph neural network (SEG) for link prediction.
SEG incorporates surrounding topological information of target nodes into an ordinary GNN model.
Experiments on the OGB link prediction datasets demonstrate that SEG achieves state-of-the-art results.
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357] (2020-10-06)
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386] (2020-09-08)
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes in a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN).
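A minimal sketch of one flat-graph-to-super-graph coarsening step, assuming a community assignment is already given (HC-GNN's actual community detection and multi-level message passing are not shown):

```python
import numpy as np

def coarsen(adj, assignment):
    # One coarsening level: nodes sharing a community label collapse into
    # a super node; super-edge weights aggregate all cross edges.
    n, k = adj.shape[0], assignment.max() + 1
    S = np.zeros((n, k))
    S[np.arange(n), assignment] = 1.0      # one-hot node-to-community matrix
    super_adj = S.T @ adj @ S              # sum edge weights between communities
    np.fill_diagonal(super_adj, 0)         # discard intra-community self-loops
    return super_adj
```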
- Binarized Graph Neural Network [65.20589262811677] (2020-04-19)
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
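Binary embeddings are typically trained with a sign function in the forward pass and a straight-through gradient in the backward pass; the sketch below shows that standard trick, which may differ from BGN's exact scheme:

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    # sign() forward, straight-through (identity, clipped) backward: the
    # standard trick for learning binary representations end to end.
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * (x.abs() <= 1).float()  # zero gradient outside [-1, 1]

z = torch.randn(4, 8, requires_grad=True)   # continuous node embeddings
z_bin = BinarizeSTE.apply(z)                # binarized embeddings (sign of z)
z_bin.sum().backward()                      # gradients still reach z
```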
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847] (2020-03-08)
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
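The core object in this graph-signal-processing view is the polynomial graph filter H(S)x = sum_k h_k S^k x, whose permutation equivariance can be checked directly; a minimal sketch under that standard definition:

```python
import numpy as np

def graph_filter(S, x, h):
    # Polynomial graph filter H(S) x = sum_k h[k] * S^k @ x, where S is a
    # graph shift operator (adjacency or Laplacian) and x a graph signal.
    out = np.zeros_like(x)
    z = x.copy()
    for hk in h:
        out += hk * z      # accumulate h_k * S^k x
        z = S @ z          # advance to S^(k+1) x
    return out

# Permutation equivariance: relabeling nodes commutes with filtering
S = np.random.rand(5, 5); S = (S + S.T) / 2
x = np.random.rand(5)
P = np.eye(5)[np.random.permutation(5)]
h = [0.5, 0.3, 0.2]
assert np.allclose(P @ graph_filter(S, x, h), graph_filter(P @ S @ P.T, P @ x, h))
```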
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.