WGCN: Graph Convolutional Networks with Weighted Structural Features
- URL: http://arxiv.org/abs/2104.14060v1
- Date: Thu, 29 Apr 2021 00:50:06 GMT
- Title: WGCN: Graph Convolutional Networks with Weighted Structural Features
- Authors: Yunxiang Zhao and Jianzhong Qi and Qingwei Liu and Rui Zhang
- Abstract summary: Graph convolutional networks (GCNs) learn nodes' representations by capturing structural information.
We propose a GCN model with weighted structural features named WGCN.
Experiments show that WGCN outperforms the baseline models consistently by up to 17.07% in terms of accuracy on five benchmark datasets.
- Score: 25.64794159605137
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph structural information such as topologies or connectivities provides
valuable guidance for graph convolutional networks (GCNs) to learn nodes'
representations. Existing GCN models that capture nodes' structural information
weight in- and out-neighbors equally or differentiate in- and out-neighbors
globally without considering nodes' local topologies. We observe that in- and
out-neighbors contribute differently to nodes with different local topologies.
To explore the directional structural information for different nodes, we
propose a GCN model with weighted structural features, named WGCN. WGCN first
captures nodes' structural fingerprints via a direction and degree aware Random
Walk with Restart algorithm, where the walk is guided by both edge direction
and nodes' in- and out-degrees. Then, the interactions between nodes'
structural fingerprints are used as the weighted node structural features. To
further capture nodes' high-order dependencies and graph geometry, WGCN embeds
graphs into a latent space to obtain nodes' latent neighbors and geometrical
relationships. Based on nodes' geometrical relationships in the latent space,
WGCN differentiates latent, in-, and out-neighbors with an attention-based
geometrical aggregation. Experiments on transductive node classification tasks
show that WGCN outperforms the baseline models consistently by up to 17.07% in
terms of accuracy on five benchmark datasets.
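A minimal sketch of the fingerprint step described in the abstract, assuming a particular direction- and degree-aware transition matrix (a convex mix of out- and in-edge walks controlled by a parameter beta) and a cosine interaction between fingerprints; the function names, beta, and the cosine choice are illustrative stand-ins, not the authors' exact formulation:

```python
# Sketch (not the authors' code) of a direction- and degree-aware Random Walk
# with Restart (RWR) that produces node "structural fingerprints", plus a
# pairwise interaction between fingerprints used as structural weights.
import numpy as np

def rwr_fingerprints(A, restart=0.15, beta=0.5, iters=50):
    """A: dense directed adjacency (n x n), A[i, j] = 1 for edge i -> j.
    beta mixes walking along out-edges vs. in-edges (assumed parameterization)."""
    n = A.shape[0]
    out_deg = A.sum(axis=1, keepdims=True) + 1e-9
    in_deg = A.sum(axis=0, keepdims=True).T + 1e-9
    # Direction-aware transition: combine out-edge and in-edge moves,
    # each normalized by the corresponding degree.
    P = beta * (A / out_deg) + (1.0 - beta) * (A.T / in_deg)
    P = P / (P.sum(axis=1, keepdims=True) + 1e-9)   # make rows stochastic
    F = np.eye(n)                                   # one restart vector per node
    for _ in range(iters):                          # fixed-point iteration of RWR
        F = (1.0 - restart) * F @ P + restart * np.eye(n)
    return F                                        # row i = fingerprint of node i

def structural_weights(F):
    """Pairwise interaction between fingerprints (cosine similarity here)."""
    F_norm = F / (np.linalg.norm(F, axis=1, keepdims=True) + 1e-9)
    return F_norm @ F_norm.T

A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)
S = structural_weights(rwr_fingerprints(A))
print(np.round(S, 2))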
Related papers
- A Variational Edge Partition Model for Supervised Graph Representation
Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into communities, the community-specific GNNs, and a GNN-based predictor that combines them for the end classification task.
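A minimal sketch of the edge-partition idea, assuming the per-edge community weights are simply given (random here) rather than produced by the paper's variational inference network; the per-community GCN-style propagation and the summation over communities are illustrative:

```python
# Sketch: each edge's weight is split across K overlapping communities, and a
# GCN-style propagation is run per community before the outputs are summed.
import numpy as np

rng = np.random.default_rng(0)
n, d, K = 6, 4, 3
A = (rng.random((n, n)) < 0.3).astype(float)
np.fill_diagonal(A, 0)
X = rng.standard_normal((n, d))

# Assumed stand-in: per-edge community weights that sum to 1 over communities.
logits = rng.standard_normal((K, n, n))
edge_comm = np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)

W = [rng.standard_normal((d, d)) * 0.1 for _ in range(K)]   # one weight matrix per community
H = np.zeros((n, d))
for k in range(K):
    A_k = A * edge_comm[k]                                  # community-specific weighted edges
    A_hat = A_k + np.eye(n)                                 # add self-loops
    D_inv = np.diag(1.0 / (A_hat.sum(axis=1) + 1e-9))
    H += np.maximum(D_inv @ A_hat @ X @ W[k], 0)            # ReLU(GCN_k(X)), summed over communities
print(H.shape)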
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- Structure Enhanced Graph Neural Networks for Link Prediction [6.872826041648584]
We propose Structure Enhanced Graph neural network (SEG) for link prediction.
SEG incorporates surrounding topological information of target nodes into an ordinary GNN model.
Experiments on the OGB link prediction datasets demonstrate that SEG achieves state-of-the-art results.
arXiv Detail & Related papers (2022-01-14T03:49:30Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as directed or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
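A minimal sketch of one way to learn directionality, assuming separate aggregation along out- and in-edges mixed by a learnable sigmoid gate; the gate, the mean aggregation, and the function names are illustrative assumptions, not necessarily the paper's architecture:

```python
# Sketch: aggregate along out-edges and in-edges separately and mix them with
# a scalar gate that would be learned during training.
import numpy as np

def directional_layer(A, X, W_out, W_in, gate_logit):
    out_norm = A / (A.sum(axis=1, keepdims=True) + 1e-9)     # mean over out-neighbors
    in_norm = A.T / (A.T.sum(axis=1, keepdims=True) + 1e-9)  # mean over in-neighbors
    g = 1.0 / (1.0 + np.exp(-gate_logit))                    # sigmoid gate in [0, 1]
    H = g * (out_norm @ X @ W_out) + (1.0 - g) * (in_norm @ X @ W_in)
    return np.maximum(H, 0)                                  # ReLU

rng = np.random.default_rng(1)
n, d = 5, 3
A = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A, 0)
X = rng.standard_normal((n, d))
H = directional_layer(A, X, rng.standard_normal((d, d)),
                      rng.standard_normal((d, d)), gate_logit=0.0)
print(H.shape)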
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Node-wise Localization of Graph Neural Networks [52.04194209002702]
Graph neural networks (GNNs) emerge as a powerful family of representation learning models on graphs.
We propose a node-wise localization of GNNs by accounting for both global and local aspects of the graph.
We conduct extensive experiments on four benchmark graphs, and consistently obtain promising performance surpassing the state-of-the-art GNNs.
arXiv Detail & Related papers (2021-10-27T10:02:03Z)
- Feature Correlation Aggregation: on the Path to Better Graph Neural Networks [37.79964911718766]
Prior to the introduction of Graph Neural Networks (GNNs), modeling and analyzing irregular data, particularly graphs, was thought to be the Achilles' heel of deep learning.
This paper introduces a central-node permutation-variant function through a frustratingly simple and innocent-looking modification to the core operation of a GNN.
A tangible boost in performance is observed: the model surpasses previous state-of-the-art results by a significant margin while employing fewer parameters.
arXiv Detail & Related papers (2021-09-20T05:04:26Z)
- Curvature Graph Neural Network [8.477559786537919]
We introduce discrete graph curvature (the Ricci curvature) to quantify the strength of the structural connection between pairs of nodes.
We propose Curvature Graph Neural Network (CGNN), which effectively improves the adaptive locality ability of GNNs.
The experimental results on synthetic datasets show that CGNN effectively exploits the topology structure information.
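A minimal sketch of curvature-weighted aggregation; for brevity it substitutes the cheap augmented Forman curvature F(u, v) = 4 - deg(u) - deg(v) + 3 * #triangles(u, v) for the Ricci curvature used by CGNN, and the sigmoid rescaling into positive edge weights is an illustrative choice rather than the paper's mechanism:

```python
# Sketch: compute a discrete curvature per edge of an undirected, unweighted
# graph and turn it into edge weights for message passing.
import numpy as np

def forman_curvature_weights(A):
    """A: symmetric 0/1 adjacency. Returns (curvature per edge, edge weights)."""
    deg = A.sum(axis=1)
    tri = (A @ A) * A                    # tri[u, v] = #common neighbors on edge (u, v)
    curv = (4.0 - deg[:, None] - deg[None, :] + 3.0 * tri) * A
    # More negative curvature -> smaller weight (sigmoid rescaling, assumed choice).
    weights = A / (1.0 + np.exp(-curv))
    return curv, weights

A = np.array([[0, 1, 1, 1, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [1, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
curv, W = forman_curvature_weights(A)
print(np.round(curv, 1))   # curvature on existing edges, zero elsewhere
print(np.round(W, 2))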
arXiv Detail & Related papers (2021-06-30T00:56:03Z)
- SPAGAN: Shortest Path Graph Attention Network [187.75441278910708]
Graph convolutional networks (GCN) have recently demonstrated their potential in analyzing non-grid structure data that can be represented as graphs.
We propose a novel GCN model, which we term the Shortest Path Graph Attention Network (SPAGAN).
arXiv Detail & Related papers (2021-01-10T03:18:34Z)
- Node Similarity Preserving Graph Convolutional Networks [51.520749924844054]
Graph Neural Networks (GNNs) explore the graph structure and node features by aggregating and transforming information within node neighborhoods.
We propose SimP-GCN that can effectively and efficiently preserve node similarity while exploiting graph structure.
We validate the effectiveness of SimP-GCN on seven benchmark datasets, including three assortative and four disassortative graphs.
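A minimal sketch of one way to preserve feature similarity alongside graph structure, assuming a kNN graph built from cosine similarity of node features and a fixed mixing coefficient with the original adjacency; SimP-GCN learns this balance rather than fixing it, so the details below are illustrative:

```python
# Sketch: propagate features over both the structural graph and a
# feature-similarity kNN graph, then mix the two.
import numpy as np

def knn_graph(X, k=2):
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-9)
    S = Xn @ Xn.T
    np.fill_diagonal(S, -np.inf)                       # no self-edges
    A_f = np.zeros_like(S)
    for i in range(S.shape[0]):
        A_f[i, np.argsort(S[i])[-k:]] = 1.0            # keep the k most similar nodes
    return np.maximum(A_f, A_f.T)                      # symmetrize

def row_norm(A):
    A_hat = A + np.eye(A.shape[0])                     # self-loops, then row-normalize
    return A_hat / A_hat.sum(axis=1, keepdims=True)

rng = np.random.default_rng(2)
n, d = 6, 4
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.maximum(A, A.T)
np.fill_diagonal(A, 0)
X = rng.standard_normal((n, d))
alpha = 0.7                                            # structure vs. feature-similarity balance (assumed)
H = alpha * row_norm(A) @ X + (1 - alpha) * row_norm(knn_graph(X)) @ X
print(H.shape)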
arXiv Detail & Related papers (2020-11-19T04:18:01Z)
- Unifying Graph Convolutional Neural Networks and Label Propagation [73.82013612939507]
We study the relationship between LPA and GCN in terms of two aspects: feature/label smoothing and feature/label influence.
Based on our theoretical analysis, we propose an end-to-end model that unifies GCN and LPA for node classification.
Our model can also be seen as learning attention weights based on node labels, which is more task-oriented than existing feature-based attention models.
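A minimal sketch of the label propagation (LPA) side of the unification: labels diffuse over the row-normalized adjacency while labeled nodes stay clamped to their known values. This is standard LPA, not the paper's combined GCN-LPA model:

```python
# Sketch: iterative label propagation for transductive node classification.
import numpy as np

def label_propagation(A, Y0, labeled_mask, alpha=0.9, iters=50):
    """A: symmetric adjacency; Y0: (n x C) one-hot labels (zero rows for unlabeled nodes)."""
    A_hat = A + np.eye(A.shape[0])
    P = A_hat / A_hat.sum(axis=1, keepdims=True)         # row-normalized propagation matrix
    Y = Y0.copy()
    for _ in range(iters):
        Y = alpha * P @ Y + (1 - alpha) * Y0             # smooth, then pull toward the seeds
        Y[labeled_mask] = Y0[labeled_mask]               # clamp known labels
    return Y.argmax(axis=1)

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
Y0 = np.zeros((5, 2))
Y0[0, 0] = 1                                             # node 0 labeled class 0
Y0[4, 1] = 1                                             # node 4 labeled class 1
mask = np.array([True, False, False, False, True])
print(label_propagation(A, Y0, labeled_mask=mask))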
arXiv Detail & Related papers (2020-02-17T03:23:13Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.