Deformable Graph Convolutional Networks
- URL: http://arxiv.org/abs/2112.14438v1
- Date: Wed, 29 Dec 2021 07:55:29 GMT
- Title: Deformable Graph Convolutional Networks
- Authors: Jinyoung Park, Sungdong Yoo, Jihwan Park, Hyunwoo J. Kim
- Abstract summary: Graph neural networks (GNNs) have significantly improved representation power for graph-structured data.
In this paper, we propose Deformable Graph Convolutional Networks (Deformable GCNs) that adaptively perform convolution in multiple latent spaces.
Our framework simultaneously learns the node positional embeddings to determine the relations between nodes in an end-to-end fashion.
- Score: 12.857403315970231
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have significantly improved the representation
power for graph-structured data. Despite the recent success of GNNs, the
graph convolution in most GNNs has two limitations. Since the graph
convolution is performed in a small local neighborhood of the input graph, it
is inherently incapable of capturing long-range dependencies between distant
nodes. In addition, when a node has neighbors that belong to different classes,
i.e., heterophily, the aggregated messages from them often negatively affect
representation learning. To address the two common problems of graph
convolution, in this paper, we propose Deformable Graph Convolutional Networks
(Deformable GCNs) that adaptively perform convolution in multiple latent spaces
and capture short/long-range dependencies between nodes. Separated from node
representations (features), our framework simultaneously learns the node
positional embeddings (coordinates) to determine the relations between nodes in
an end-to-end fashion. Depending on node position, the convolution kernels are
deformed by deformation vectors and apply different transformations to its
neighbor nodes. Our extensive experiments demonstrate that Deformable GCNs
flexibly handles the heterophily and achieve the best performance in node
classification tasks on six heterophilic graph datasets.
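To make the mechanism concrete, below is a minimal, illustrative PyTorch sketch of a deformable graph convolution as the abstract describes it: latent kernel points are shifted per node by learned deformation vectors, and each neighbor's message is routed through kernel-specific transforms according to its relative position. All names, the dense-adjacency layout, and the softmax-based kernel assignment are our simplifications, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeformableGraphConv(nn.Module):
    """Illustrative deformable graph convolution (not the reference code).

    K kernel points live in the latent coordinate space and are shifted
    per node by learned deformation vectors; each neighbor's message is
    routed through kernel-specific transforms according to how close its
    relation vector lies to each deformed kernel point.
    """

    def __init__(self, feat_dim, pos_dim, out_dim, num_kernels=4):
        super().__init__()
        self.kernel_points = nn.Parameter(torch.randn(num_kernels, pos_dim))
        self.deform = nn.Linear(pos_dim, num_kernels * pos_dim)
        self.transforms = nn.Parameter(
            torch.randn(num_kernels, feat_dim, out_dim) * 0.01)

    def forward(self, x, pos, adj):
        # x: (N, feat_dim) features; pos: (N, pos_dim) latent coordinates;
        # adj: (N, N) dense adjacency, 1 where j is a neighbor of i.
        N, K = x.size(0), self.kernel_points.size(0)
        offsets = self.deform(pos).view(N, K, -1)              # deformation vectors
        kernels = self.kernel_points + offsets                 # (N, K, pos_dim)
        rel = pos.unsqueeze(0) - pos.unsqueeze(1)              # (N, N, pos_dim)
        diff = rel.unsqueeze(2) - kernels.unsqueeze(1)         # (N, N, K, pos_dim)
        aff = F.softmax(-diff.pow(2).sum(-1), dim=2)           # soft kernel assignment
        msgs = torch.einsum('nf,kfo->nko', x, self.transforms) # per-kernel transform
        w = aff * adj.unsqueeze(-1)                            # keep true neighbors only
        return torch.einsum('ijk,jko->io', w, msgs)            # (N, out_dim)
```

In the full model, `pos` would itself be a learned positional embedding trained end-to-end alongside the node features, per the abstract.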
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
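A toy sketch of the calibrate-then-convolve idea (our reading of the summary, not CoCN itself): score nodes, sort them into a sequence, and apply an ordinary 1-D Euclidean convolution. CoCN makes the permutation differentiable; the hard argsort below is a non-differentiable stand-in for brevity.

```python
import torch
import torch.nn as nn

class PermuteThenConv(nn.Module):
    """Toy sketch: score nodes, sort them into a sequence, then run a
    standard 1-D convolution over the resulting node ordering."""

    def __init__(self, dim, out_dim, kernel_size=3):
        super().__init__()
        self.score = nn.Linear(dim, 1)
        self.conv = nn.Conv1d(dim, out_dim, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                      # x: (N, dim) node features
        order = self.score(x).squeeze(-1).argsort()
        seq = x[order].t().unsqueeze(0)        # (1, dim, N) sequence layout
        return self.conv(seq).squeeze(0).t()   # (N, out_dim), in sorted order
```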
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Transfer Entropy in Graph Convolutional Neural Networks [0.0]
Graph Convolutional Networks (GCN) are Graph Neural Networks where the convolutions are applied over a graph.
In this study, we address two important challenges related to GCNs, one of which is oversmoothing: the degradation of the discriminative capacity of nodes as a result of repeated aggregations.
We propose a new strategy for addressing these challenges in GCNs based on Transfer Entropy (TE), which measures the amount of directed transfer of information between two time-varying nodes.
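For reference, the standard (Schreiber) definition of transfer entropy from a process X to a process Y, with history length one for brevity, is:

```latex
T_{X \to Y} \;=\; \sum_{y_{t+1},\, y_t,\, x_t}
  p(y_{t+1}, y_t, x_t)\,
  \log \frac{p(y_{t+1} \mid y_t, x_t)}{p(y_{t+1} \mid y_t)}
```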
arXiv Detail & Related papers (2024-06-08T20:09:17Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
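A minimal sketch of the stated idea: learn a separate weight matrix per degree group. The group boundaries and the mean aggregation are our illustrative choices, not the paper's exact recipe.

```python
import torch
import torch.nn as nn

class DegreeStratifiedLayer(nn.Module):
    """Sketch: nodes are bucketed by degree and each bucket gets its own
    learned weight matrix applied after neighbor aggregation."""

    def __init__(self, dim, out_dim, boundaries=(2.0, 8.0)):
        super().__init__()
        self.register_buffer('boundaries', torch.tensor(boundaries))
        self.weights = nn.ModuleList(
            nn.Linear(dim, out_dim) for _ in range(len(boundaries) + 1))

    def forward(self, x, adj):                            # adj: (N, N) dense 0/1
        deg = adj.sum(dim=1)
        group = torch.bucketize(deg, self.boundaries)     # degree group per node
        agg = adj @ x / deg.clamp(min=1).unsqueeze(-1)    # mean over neighbors
        out = torch.stack([w(agg) for w in self.weights]) # (G, N, out_dim)
        return out[group, torch.arange(x.size(0))]        # pick each node's group
```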
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
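A sketch of the kernelized part: positive random features approximate the softmax kernel so that all-pair propagation never materializes the N x N attention matrix. The Gumbel-Softmax sampling step that NodeFormer adds on top is omitted here, and all names are ours.

```python
import torch

def kernelized_all_pair_propagation(q, k, v, num_feats=64):
    """Approximate softmax(q k^T) v with positive random features so the
    N x N attention matrix is never formed (linear memory in N)."""
    d = q.size(-1)
    q, k = q / d ** 0.25, k / d ** 0.25            # usual attention temperature
    w = torch.randn(num_feats, d)                  # shared random projections
    phi = lambda x: torch.exp(x @ w.t() - x.pow(2).sum(-1, keepdim=True) / 2)
    qf, kf = phi(q), phi(k)                        # (N, m) positive features
    numer = qf @ (kf.t() @ v)                      # (N, d_v) without N x N cost
    denom = (qf @ kf.sum(dim=0, keepdim=True).t()).clamp(min=1e-6)
    return numer / denom                           # row-normalized propagation
```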
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Structural Imbalance Aware Graph Augmentation Learning [2.793446335600599]
Graphs are often structurally imbalanced, that is, only a few hub nodes have a denser local structure and higher influence.
This paper proposes a selective graph augmentation method (SAug) to solve this problem.
Extensive experiments demonstrate that SAug can significantly improve the backbone GNNs and achieve superior performance to its competitors.
arXiv Detail & Related papers (2023-03-24T02:13:32Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
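One simple way to realize "learned directionality" (our illustration, not the paper's model) is to propagate over a learned convex combination of the adjacency matrix and its transpose, letting the model interpolate between in-edges, out-edges, and effectively undirected use.

```python
import torch
import torch.nn as nn

class AdaptiveDirectionConv(nn.Module):
    """Sketch: mix A and A^T with a learned coefficient before propagating."""

    def __init__(self, dim, out_dim):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(0.0))
        self.linear = nn.Linear(dim, out_dim)

    def forward(self, x, adj):                 # adj: (N, N) directed 0/1
        a = torch.sigmoid(self.alpha)
        mix = a * adj + (1 - a) * adj.t()      # learned direction preference
        deg = mix.sum(1, keepdim=True).clamp(min=1e-6)
        return (mix / deg) @ self.linear(x)    # row-normalized propagation
```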
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Graph Transformer Networks: Learning Meta-path Graphs to Improve GNNs [20.85042364993559]
We propose Graph Transformer Networks (GTNs) that generate new graph structures and include useful connections for tasks.
Fast Graph Transformer Networks (FastGTNs) are 230x faster and use 100x less memory than GTNs.
We extend graph transformations to the semantic proximity of nodes allowing non-local operations beyond meta-paths.
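A sketch of meta-path graph generation in the spirit of GTNs: softly select an adjacency matrix per step from the edge-type set and chain the selections by matrix multiplication, yielding a learned meta-path adjacency. GTN's 1x1-convolution selection is simplified here to a plain softmax over edge types.

```python
import torch
import torch.nn as nn

class MetaPathLayer(nn.Module):
    """Sketch: compose softly selected edge-type graphs into a meta-path graph."""

    def __init__(self, num_edge_types, path_len=2):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(path_len, num_edge_types))

    def forward(self, adjs):                       # adjs: (T, N, N), one per type
        out = None
        for step in self.logits.softmax(dim=-1):   # (T,) soft edge-type choice
            a = torch.einsum('t,tij->ij', step, adjs)
            out = a if out is None else out @ a    # compose along the path
        return out                                 # (N, N) meta-path adjacency
```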
arXiv Detail & Related papers (2021-06-11T07:56:55Z)
- Factorizable Graph Convolutional Networks [90.59836684458905]
We introduce a novel graph convolutional network (GCN) that explicitly disentangles intertwined relations encoded in a graph.
FactorGCN takes a simple graph as input, and disentangles it into several factorized graphs.
We evaluate the proposed FactorGCN both qualitatively and quantitatively on the synthetic and real-world datasets.
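A hedged sketch of the disentangling step as the summary describes it: gate each edge once per latent factor, aggregate on each factor graph separately, and concatenate the factor-wise outputs. The gating form and normalization are our simplifications of FactorGCN's idea.

```python
import torch
import torch.nn as nn

class FactorizedConv(nn.Module):
    """Sketch: split one input graph into per-factor gated graphs and
    aggregate on each factor separately."""

    def __init__(self, dim, out_dim, num_factors=3):
        super().__init__()
        self.gate = nn.Linear(2 * dim, num_factors)
        self.proj = nn.Linear(dim, out_dim * num_factors)
        self.num_factors = num_factors

    def forward(self, x, adj):                        # adj: (N, N) dense 0/1
        N = x.size(0)
        pair = torch.cat([x.unsqueeze(1).expand(N, N, -1),
                          x.unsqueeze(0).expand(N, N, -1)], dim=-1)
        gates = torch.sigmoid(self.gate(pair)) * adj.unsqueeze(-1)  # (N, N, F)
        h = self.proj(x).view(N, self.num_factors, -1)              # (N, F, out)
        out = torch.einsum('ijf,jfo->ifo', gates, h)                # per-factor agg
        return out.reshape(N, -1)                                   # concat factors
```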
arXiv Detail & Related papers (2020-10-12T03:01:40Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
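The bilinear part can be computed without enumerating neighbor pairs, using the elementwise identity sum_{i<j} a_i * a_j = ((sum_i a_i)^2 - sum_i a_i^2) / 2; the sketch below (our simplification of BGNN, not the reference code) does exactly that on top of a mean aggregator.

```python
import torch
import torch.nn as nn

class BilinearAggregator(nn.Module):
    """Sketch: augment the weighted sum of neighbor features with the
    average elementwise product over all neighbor pairs."""

    def __init__(self, dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(dim, out_dim)

    def forward(self, x, adj):                # adj: (N, N) dense 0/1
        h = self.linear(x)                    # transform, then aggregate
        s = adj @ h                           # sum of neighbor features
        sq = adj @ h.pow(2)                   # sum of squared features
        pairwise = 0.5 * (s.pow(2) - sq)      # sum over pairs h_i * h_j
        deg = adj.sum(1, keepdim=True).clamp(min=1)
        num_pairs = (deg * (deg - 1) / 2).clamp(min=1)
        return s / deg + pairwise / num_pairs # mean linear + mean bilinear term
```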
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.