EIGNN: Efficient Infinite-Depth Graph Neural Networks
- URL: http://arxiv.org/abs/2202.10720v1
- Date: Tue, 22 Feb 2022 08:16:58 GMT
- Title: EIGNN: Efficient Infinite-Depth Graph Neural Networks
- Authors: Juncheng Liu, Kenji Kawaguchi, Bryan Hooi, Yiwei Wang, Xiaokui Xiao
- Abstract summary: Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications. However, their inherently finite number of aggregation layers limits their ability to capture long-range dependencies.
Motivated by this limitation, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
- Score: 51.97361378423152
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) are widely used for modelling graph-structured
data in numerous applications. However, with their inherently finite
aggregation layers, existing GNN models may not be able to effectively capture
long-range dependencies in the underlying graphs. Motivated by this limitation,
we propose a GNN model with infinite depth, which we call Efficient
Infinite-Depth Graph Neural Networks (EIGNN), to efficiently capture very
long-range dependencies. We theoretically derive a closed-form solution of
EIGNN which makes training an infinite-depth GNN model tractable. We then
further show that we can achieve more efficient computation for training EIGNN
by using eigendecomposition. The empirical results of comprehensive experiments
on synthetic and real-world datasets show that EIGNN has a better ability to
capture long-range dependencies than recent baselines, and consistently
achieves state-of-the-art performance. Furthermore, we show that our model is
also more robust against both noise and adversarial perturbations on node
features.
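As a rough, self-contained illustration of the closed-form idea described in the abstract (not the paper's exact model), the NumPy sketch below treats infinite-depth propagation as the Neumann series Z = sum_{k>=0} (gamma*S)^k X = (I - gamma*S)^{-1} X over a symmetrically normalized adjacency S, and applies the inverse through the eigendecomposition of S. The simplified propagation rule, the parameter gamma, and the function names are illustrative assumptions; the actual EIGNN additionally learns a constrained weight matrix and is trained end-to-end.

```python
import numpy as np

def sym_norm_adj(A):
    """S = D^{-1/2} A D^{-1/2}: symmetrically normalized adjacency."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    return (A * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

def infinite_depth_propagation(A, X, gamma=0.9):
    """Closed-form limit Z = sum_{k>=0} (gamma*S)^k X = (I - gamma*S)^{-1} X.
    S is symmetric with eigenvalues in [-1, 1], so the series converges for
    0 < gamma < 1, and the inverse is applied via S's eigendecomposition."""
    S = sym_norm_adj(A)
    lam, U = np.linalg.eigh(S)            # S = U diag(lam) U^T
    scale = 1.0 / (1.0 - gamma * lam)     # eigenvalues of (I - gamma*S)^{-1}
    return U @ (scale[:, None] * (U.T @ X))

# Toy usage: a 5-node path graph with 2-dimensional node features.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
X = np.random.randn(5, 2)
Z = infinite_depth_propagation(A, X)
```

In this simplified view the eigendecomposition is paid once, after which each propagation reduces to two dense matrix products, which is in the spirit of the efficiency gain the abstract attributes to eigendecomposition-based training.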
Related papers
- Search to Capture Long-range Dependency with Stacking GNNs for Graph Classification [41.84399177525008]
Shallow GNNs are more common due to the well-known over-smoothing problem that deeper GNNs face.
We propose a novel approach based on neural architecture search (NAS), dubbed LRGNN (Long-Range Graph Neural Networks).
arXiv Detail & Related papers (2023-02-17T03:40:17Z)
- LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation [51.552170474958736]
We propose to capture long-distance dependency in graphs by shallower models instead of deeper models, which leads to a much more efficient model, LazyGNN, for graph representation learning.
LazyGNN is compatible with existing scalable approaches (such as sampling methods) for further acceleration through the development of mini-batch LazyGNN.
Comprehensive experiments demonstrate its superior prediction performance and scalability on large-scale benchmarks.
arXiv Detail & Related papers (2023-02-03T02:33:07Z)
- Reducing Over-smoothing in Graph Neural Networks Using Relational Embeddings [0.15619750966454563]
We propose a new, simple, and efficient method to alleviate the over-smoothing problem in GNNs.
Our method can be used in combination with other methods to give the best performance.
arXiv Detail & Related papers (2023-01-07T19:26:04Z)
- MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We identify two weaknesses of implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and an inability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI), which models multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
arXiv Detail & Related papers (2022-10-15T18:18:55Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from the resulting unified framework (UGNN), to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Implicit Graph Neural Networks [46.0589136729616]
We propose a graph learning framework called Implicit Graph Neural Networks (IGNN).
IGNNs consistently capture long-range dependencies and outperform state-of-the-art GNN models; a generic sketch of this fixed-point style of layer appears after this list.
arXiv Detail & Related papers (2020-09-14T06:04:55Z)
- Efficient Robustness Certificates for Discrete Data: Sparsity-Aware Randomized Smoothing for Graphs, Images and More [85.52940587312256]
We propose a model-agnostic certificate based on the randomized smoothing framework which subsumes earlier work and is tight, efficient, and sparsity-aware.
We show the effectiveness of our approach on a wide variety of models, datasets, and tasks -- specifically highlighting its use for Graph Neural Networks.
arXiv Detail & Related papers (2020-08-29T10:09:02Z)
- Efficient Probabilistic Logic Reasoning with Graph Neural Networks [63.099999467118245]
Markov Logic Networks (MLNs) can be used to address many knowledge graph problems.
Inference in MLNs is computationally intensive, making industrial-scale applications of MLNs very difficult.
We propose a graph neural network (GNN) variant, named ExpressGNN, which strikes a balance between representation power and model simplicity.
arXiv Detail & Related papers (2020-01-29T23:34:36Z)
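Several of the related papers above (notably Implicit Graph Neural Networks and MGNNI) define layers implicitly through a fixed-point equation rather than a fixed number of propagation steps. The snippet below is a generic, hypothetical sketch of that pattern, not either paper's exact parameterization: it iterates Z <- ReLU(S Z W + U B) and assumes W is scaled so the update is a contraction; all function and variable names are illustrative.

```python
import numpy as np

def implicit_gnn_fixed_point(S, U, W, B, n_iters=100, tol=1e-6):
    """Solve Z = relu(S @ Z @ W + U @ B) by fixed-point iteration.
    S: (n, n) normalized adjacency; U: (n, d) node features;
    W: (h, h) recurrent weight, assumed scaled so the update is a contraction;
    B: (d, h) input projection."""
    Z = np.zeros((U.shape[0], W.shape[0]))
    UB = U @ B                               # input term, constant across iterations
    for _ in range(n_iters):
        Z_next = np.maximum(S @ Z @ W + UB, 0.0)
        if np.linalg.norm(Z_next - Z) < tol:  # stop once the iterate has converged
            return Z_next
        Z = Z_next
    return Z
```

In the published models, contraction conditions of this kind are enforced explicitly (for example by norm constraints on the weight matrix), which guarantees a unique fixed point and allows gradients to be obtained via implicit differentiation instead of backpropagating through every iteration.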