BiGCN: A Bi-directional Low-Pass Filtering Graph Neural Network
- URL: http://arxiv.org/abs/2101.05519v1
- Date: Thu, 14 Jan 2021 09:41:00 GMT
- Authors: Zhixian Chen, Tengfei Ma, Zhihua Jin, Yangqiu Song, Yang Wang
- Abstract summary: Many graph convolutional networks can be regarded as low-pass filters for graph signals.
We propose a new model, BiGCN, which represents a graph neural network as a bi-directional low-pass filter.
Our model outperforms previous graph neural networks in the tasks of node classification and link prediction on most of the benchmark datasets.
- Score: 35.97496022085212
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional networks have achieved great success on graph-structured
data. Many graph convolutional networks can be regarded as low-pass filters for
graph signals. In this paper, we propose a new model, BiGCN, which represents a
graph neural network as a bi-directional low-pass filter. Specifically, we not
only consider the original graph structure information but also the latent
correlation between features, so BiGCN can filter signals along both
the original graph and a latent feature-connection graph. Our model outperforms
previous graph neural networks in the tasks of node classification and link
prediction on most of the benchmark datasets, especially when we add noise to
the node features.
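The abstract describes filtering node features in two directions: over the original graph structure and over a latent graph of feature correlations. The following is a minimal sketch of that bi-directional low-pass idea, not the authors' implementation; the GCN-style symmetric normalization and the correlation-based construction of the feature graph are assumptions made here for illustration.

```python
import numpy as np

def normalized_low_pass(A):
    """Symmetric-normalized propagation matrix D^{-1/2}(A + I)D^{-1/2},
    a standard low-pass graph filter in GCN-style models."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def bidirectional_filter(A, X, k=2):
    """Smooth features X (n nodes x d features) along the original graph A
    (left multiplication) and along a latent feature-correlation graph
    (right multiplication), applied k times."""
    P_graph = normalized_low_pass(A)          # n x n node-graph filter
    C = np.corrcoef(X, rowvar=False)          # d x d feature correlations
    C = np.nan_to_num(np.abs(C))              # magnitudes as edge weights
    P_feat = normalized_low_pass(C)           # d x d feature-graph filter
    for _ in range(k):
        X = P_graph @ X @ P_feat              # filter in both directions
    return X
```

Left multiplication averages each node's features with its neighbors' (the usual GCN direction); right multiplication additionally mixes correlated features with each other, which is one plausible reading of the "latent feature-connection graph" in the abstract.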
Related papers
- Saliency-Aware Regularized Graph Neural Network [39.82009838086267]
We propose the Saliency-Aware Regularized Graph Neural Network (SAR-GNN) for graph classification.
We first estimate the global node saliency by measuring the semantic similarity between the compact graph representation and node features.
Then the learned saliency distribution is leveraged to regularize the neighborhood aggregation of the backbone.
arXiv Detail & Related papers (2024-01-01T13:44:16Z)
- Signed Graph Neural Networks: A Frequency Perspective [14.386571627652975]
Graph convolutional networks (GCNs) are designed for unsigned graphs containing only positive links.
We propose two different signed graph neural networks, one keeps only low-frequency information and one also retains high-frequency information.
arXiv Detail & Related papers (2022-08-15T16:42:18Z)
- KGNN: Harnessing Kernel-based Networks for Semi-supervised Graph Classification [13.419578861488226]
We propose a Kernel-based Graph Neural Network (KGNN) for semi-supervised graph classification.
We show that KGNN achieves impressive performance over competitive baselines.
arXiv Detail & Related papers (2022-05-21T10:03:46Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Beyond Low-pass Filtering: Graph Convolutional Networks with Automatic Filtering [61.315598419655224]
We propose Automatic Graph Convolutional Networks (AutoGCN) to capture the full spectrum of graph signals.
While it is based on graph spectral theory, our AutoGCN is also localized in space and has a spatial form.
arXiv Detail & Related papers (2021-07-10T04:11:25Z)
- Natural Graph Networks [80.77570956520482]
We show that the more general concept of naturality is sufficient for a graph network to be well-defined.
We define global and local natural graph networks, the latter of which are as scalable as conventional message passing graph neural networks.
arXiv Detail & Related papers (2020-07-16T14:19:06Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- Customized Graph Neural Networks [38.30640892828196]
Graph Neural Networks (GNNs) have greatly advanced the task of graph classification.
We propose a novel customized graph neural network framework, i.e., Customized-GNN.
The proposed framework is general and can be applied to numerous existing graph neural network models.
arXiv Detail & Related papers (2020-05-22T05:22:24Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.