Beyond Graph Convolutional Network: An Interpretable
Regularizer-centered Optimization Framework
- URL: http://arxiv.org/abs/2301.04318v1
- Date: Wed, 11 Jan 2023 05:51:33 GMT
- Title: Beyond Graph Convolutional Network: An Interpretable
Regularizer-centered Optimization Framework
- Authors: Shiping Wang, Zhihao Wu, Yuhong Chen, Yong Chen
- Abstract summary: Graph convolutional networks (GCNs) have been attracting widespread attention due to their encouraging performance and powerful generalization ability.
In this paper, we derive an interpretable regularizer-centered optimization framework, in which by building appropriate regularizers we can interpret most GCNs.
Under the proposed framework, we devise a dual-regularizer graph convolutional network (dubbed tsGCN) to capture topological and semantic structures from graph data.
- Score: 12.116373546916078
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional networks (GCNs) have been attracting widespread
attention due to their encouraging performance and powerful generalization
ability. However, few works provide a general view to interpret various GCNs
and guide GCNs' designs. In this paper, by revisiting the original GCN, we
derive an interpretable regularizer-centered optimization framework, in which
by building appropriate regularizers we can interpret most GCNs, such as APPNP,
JKNet, DAGNN, and GNN-LF/HF. Further, under the proposed framework, we devise a
dual-regularizer graph convolutional network (dubbed tsGCN) to capture
topological and semantic structures from graph data. Since the derived learning
rule for tsGCN contains the inverse of a large matrix and is therefore
time-consuming, we leverage the Woodbury matrix identity and low-rank
approximation tricks to successfully reduce the high computational complexity
of computing infinite-order graph convolutions. Extensive experiments on eight
public datasets demonstrate that tsGCN achieves superior performance against
a number of state-of-the-art competitors on classification tasks.
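The complexity reduction rests on a standard linear-algebra fact. Below is a minimal sketch of the Woodbury trick, assuming the matrix to invert takes the form gamma*I + U U^T for some rank-k factor U; the factor and scalar here are illustrative placeholders, not tsGCN's actual quantities.

```python
import numpy as np

def woodbury_solve(gamma, U, B):
    """Solve (gamma * I + U @ U.T) X = B without forming the n x n inverse.

    Woodbury identity:
        (gamma*I + U U^T)^{-1} = I/gamma - U (gamma*I_k + U^T U)^{-1} U^T / gamma,
    so only a k x k system is solved; cost falls from O(n^3) to
    O(n k^2 + k^3) when the rank k is much smaller than n.
    """
    k = U.shape[1]
    small = gamma * np.eye(k) + U.T @ U            # k x k, not n x n
    return (B - U @ np.linalg.solve(small, U.T @ B)) / gamma

# Sanity check against the dense solve on a small random instance.
rng = np.random.default_rng(0)
U, B = rng.standard_normal((500, 8)), rng.standard_normal((500, 3))
dense = np.linalg.solve(0.5 * np.eye(500) + U @ U.T, B)
assert np.allclose(woodbury_solve(0.5, U, B), dense)
```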
Related papers
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with the size of the graphs on a logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - T-GAE: Transferable Graph Autoencoder for Network Alignment [79.89704126746204]
T-GAE is a graph autoencoder framework that leverages the transferability and stability of GNNs to achieve efficient network alignment without retraining.
Our experiments demonstrate that T-GAE outperforms the state-of-the-art optimization method and the best GNN approach by up to 38.7% and 50.8%, respectively.
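At its core, network alignment with a shared encoder reduces to matching node embeddings across two graphs. A hedged sketch of that final matching step follows; the one-line mean-aggregation encoder is a stand-in, not T-GAE's architecture.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def encode(A, X):
    """Stand-in encoder: one round of mean neighborhood aggregation."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    return (A @ X) / deg

def align(A1, X1, A2, X2):
    """Match nodes of graph 1 to nodes of graph 2 by embedding similarity."""
    Z1, Z2 = encode(A1, X1), encode(A2, X2)
    cost = -Z1 @ Z2.T                       # higher similarity = lower cost
    rows, cols = linear_sum_assignment(cost)
    return dict(zip(rows, cols))            # node i in G1 -> cols[i] in G2
```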
arXiv Detail & Related papers (2023-10-05T02:58:29Z) - Graph Contrastive Learning with Generative Adversarial Network [35.564028359355596]
Graph generative adversarial networks (GANs) learn the distribution of views for Graph Contrastive Learning (GCL).
We present GACN, a novel Generative Adversarial Contrastive learning Network for graph representation learning.
We show that GACN is able to generate high-quality augmented views for GCL and is superior to twelve state-of-the-art baseline methods.
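Whatever produces the views (here, a GAN rather than hand-crafted augmentations), GCL typically scores them with an InfoNCE-style contrastive loss. A generic sketch, not GACN's exact objective:

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE between two views; matching rows are positives, rest negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                             # cosine similarities
    shift = sim.max(axis=1, keepdims=True)            # for numerical stability
    log_prob = sim - shift - np.log(np.exp(sim - shift).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))                # pull positive pairs together
```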
arXiv Detail & Related papers (2023-08-01T13:28:24Z) - Multi-scale Graph Convolutional Networks with Self-Attention [2.66512000865131]
Graph convolutional networks (GCNs) have achieved remarkable learning ability for dealing with various graph structural data.
The over-smoothing phenomenon, a crucial issue of GCNs, remains to be investigated and solved.
We propose two novel multi-scale GCN frameworks by incorporating the self-attention mechanism and multi-scale information into the design of GCNs.
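The general recipe, propagating features at several hop counts and letting attention weigh the scales per node, can be sketched as follows; this is an illustrative simplification with a random stand-in attention vector, not the paper's exact architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_scale_gcn(S, X, num_scales=3, seed=0):
    """Mix k-hop propagations S^k X with per-node attention over the scales.

    S is a normalized adjacency, e.g. D^{-1/2} (A + I) D^{-1/2}; deeper
    scales see wider neighborhoods, and attention picks the useful mix.
    """
    scales, H = [], X
    for _ in range(num_scales):
        H = S @ H                            # one more hop of propagation
        scales.append(H)
    stack = np.stack(scales, axis=1)         # (n, num_scales, d)
    a = np.random.default_rng(seed).standard_normal(X.shape[1])  # stand-in weights
    att = softmax(stack @ a, axis=1)         # (n, num_scales) per-node scores
    return np.einsum('ns,nsd->nd', att, stack)
```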
arXiv Detail & Related papers (2021-12-04T04:41:24Z) - SLGCN: Structure Learning Graph Convolutional Networks for Graphs under
Heterophily [5.619890178124606]
We propose structure learning graph convolutional networks (SLGCNs) to alleviate the issue from two aspects.
Specifically, we design an efficient spectral clustering with anchors (ESC-ANCH) approach to efficiently aggregate feature representations from all similar nodes.
Experimental results on a wide range of benchmark datasets illustrate that the proposed SLGCNs outperform the state-of-the-art GNN counterparts.
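What makes the anchor trick efficient is that the n x n similarity graph S = Z L^{-1} Z^T over all nodes is never materialized; aggregation factors through m anchors with m much smaller than n. A generic anchor-graph sketch (ESC-ANCH's precise construction may differ):

```python
import numpy as np

def anchor_aggregate(Z, X):
    """Aggregate features over the implicit anchor graph S = Z L^{-1} Z^T.

    Z: (n, m) nonnegative node-to-anchor affinities (rows sum to 1);
    L = diag(Z.sum(axis=0)) normalizes anchor degrees. Computing S @ X
    as Z @ (L^{-1} (Z^T X)) costs O(n*m*d) instead of O(n^2*d).
    """
    inv_anchor_deg = 1.0 / Z.sum(axis=0).clip(min=1e-12)     # (m,)
    return Z @ (inv_anchor_deg[:, None] * (Z.T @ X))
```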
arXiv Detail & Related papers (2021-05-28T13:00:38Z) - Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can significantly improve the performance of GNNs, with larger gains on noisier datasets.
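The mechanism in one picture: a small parameterized scorer assigns each edge a keep-probability, and a penalty on the sum of those probabilities (roughly, the expected edge count) pushes task-irrelevant edges toward zero. A hedged single-forward-pass sketch; PTDNet itself uses a deeper scorer and a differentiable sparsification scheme.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def prune_edges(edges, H, w, penalty_coef=1e-3):
    """Gate each edge with a learned score and penalize the kept-edge mass.

    edges: (E, 2) int array of (src, dst) pairs; H: (n, d) node features;
    w: (2*d,) stand-in scorer weights standing in for an MLP.
    """
    pair_feats = np.concatenate([H[edges[:, 0]], H[edges[:, 1]]], axis=1)
    gates = sigmoid(pair_feats @ w)                # keep-probability per edge
    sparsity_penalty = penalty_coef * gates.sum()  # ~ expected number of edges
    return gates, sparsity_penalty
```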
arXiv Detail & Related papers (2020-11-13T18:53:21Z) - Bi-GCN: Binary Graph Convolutional Network [57.733849700089955]
We propose a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and input node features.
Our Bi-GCN can reduce the memory consumption by an average of 30x for both the network parameters and input data, and accelerate the inference speed by an average of 47x.
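The memory figure follows from swapping float32 tensors for sign bits plus one scale per tensor. The usual binarization step looks like this; a generic sketch, as Bi-GCN's exact per-tensor scaling and back-propagation scheme differ.

```python
import numpy as np

def binarize(W):
    """Approximate W by alpha * sign(W) with alpha = mean(|W|).

    Sign bits take 1/32 of float32 storage, and products against
    binary matrices can run as XNOR/popcount on suitable hardware.
    """
    alpha = np.abs(W).mean()
    return alpha, np.where(W >= 0, 1, -1).astype(np.int8)

alpha, B = binarize(np.random.default_rng(0).standard_normal((4, 4)))
W_approx = alpha * B          # dequantized weights used in the forward pass
```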
arXiv Detail & Related papers (2020-10-15T07:26:23Z) - DeeperGCN: All You Need to Train Deeper GCNs [66.64739331859226]
Graph Convolutional Networks (GCNs) have been drawing significant attention with the power of representation learning on graphs.
Unlike Convolutional Neural Networks (CNNs), which are able to take advantage of stacking very deep layers, GCNs suffer from vanishing gradient, over-smoothing and over-fitting issues when going deeper.
This paper proposes DeeperGCN that is capable of successfully and reliably training very deep GCNs.
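A key ingredient for reliable depth, in DeeperGCN and related work, is a residual block with pre-activation normalization, so the identity path carries gradients through many layers. A simplified sketch, omitting DeeperGCN's generalized aggregation functions:

```python
import numpy as np

def residual_gcn_block(S, H, W):
    """One pre-activation residual graph conv block: H + S @ relu(norm(H)) @ W.

    The untouched identity shortcut is what keeps gradients alive when
    dozens of such blocks are stacked, unlike plain GCN layers.
    """
    Z = (H - H.mean(axis=0)) / (H.std(axis=0) + 1e-5)   # feature-wise norm
    Z = np.maximum(Z, 0.0)                              # pre-activation ReLU
    return H + S @ Z @ W                                # residual connection
```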
arXiv Detail & Related papers (2020-06-13T23:00:22Z) - Scattering GCN: Overcoming Oversmoothness in Graph Convolutional
Networks [0.0]
Graph convolutional networks (GCNs) have shown promising results in processing graph data by extracting structure-aware features.
Here, we propose to augment conventional GCNs with geometric scattering transforms and residual convolutions.
The former enables band-pass filtering of graph signals, thus alleviating the so-called oversmoothing often encountered in GCNs.
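Concretely, the band-pass filters arise from diffusion wavelets on the lazy random walk P = (I + A D^{-1}) / 2: the difference Psi_j = P^(2^(j-1)) - P^(2^j) keeps a band of frequencies that pure low-pass GCN filters smooth away. A small first-order scattering sketch:

```python
import numpy as np

def scattering_features(A, X, num_scales=3):
    """First-order geometric scattering |Psi_j X| over dyadic scales.

    Psi_j = P^(2^(j-1)) - P^(2^j) is band-pass, so these features keep
    mid-frequency content that repeated low-pass smoothing washes out.
    """
    n = A.shape[0]
    P = 0.5 * (np.eye(n) + A / A.sum(axis=0).clip(min=1))   # lazy walk A D^{-1}
    feats, P_low = [], P                                    # P_low = P^(2^(j-1))
    for _ in range(num_scales):
        P_high = P_low @ P_low                              # square to P^(2^j)
        feats.append(np.abs((P_low - P_high) @ X))          # |Psi_j X|
        P_low = P_high
    return np.concatenate(feats, axis=1)                    # (n, num_scales*d)
```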
arXiv Detail & Related papers (2020-03-18T18:03:08Z) - Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning
via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
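One natural way to marry the two models is through the kernel: propagate features with a GCN-style operator S, then run standard GP regression on the induced linear kernel. A hedged closed-form sketch for the posterior mean; GPGC's actual kernel construction may differ.

```python
import numpy as np

def gp_posterior_mean(S, X, train_idx, y_train, noise=0.1):
    """GP regression with kernel K = (S X)(S X)^T built from propagation.

    Posterior mean at all nodes: K[:, tr] (K[tr, tr] + noise*I)^{-1} y.
    """
    Z = S @ X                                  # GCN-style propagated features
    K = Z @ Z.T                                # linear kernel on Z
    K_tt = K[np.ix_(train_idx, train_idx)]     # train-train block
    alpha = np.linalg.solve(K_tt + noise * np.eye(len(train_idx)), y_train)
    return K[:, train_idx] @ alpha             # predictions for every node
```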
arXiv Detail & Related papers (2020-02-26T10:02:32Z)