RAN-GNNs: breaking the capacity limits of graph neural networks
- URL: http://arxiv.org/abs/2103.15565v1
- Date: Mon, 29 Mar 2021 12:34:36 GMT
- Title: RAN-GNNs: breaking the capacity limits of graph neural networks
- Authors: Diego Valsesia, Giulia Fracastoro, Enrico Magli
- Abstract summary: Graph neural networks have become a staple in problems addressing learning and analysis of data defined over graphs.
However, several results suggest an inherent difficulty in extracting better performance by increasing the number of layers; recent works attribute this to the need to consider multiple neighborhood sizes at the same time and adaptively tune them.
We show that employing a randomly-wired architecture can be a more effective way to increase the capacity of the network and obtain richer representations.
- Score: 43.66682619000099
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks have become a staple in problems addressing learning
and analysis of data defined over graphs. However, several results suggest an
inherent difficulty in extracting better performance by increasing the number
of layers. Recent works attribute this to a phenomenon peculiar to the
extraction of node features in graph-based tasks, i.e., the need to consider
multiple neighborhood sizes at the same time and adaptively tune them. In this
paper, we investigate the recently proposed randomly wired architectures in the
context of graph neural networks. Instead of building deeper networks by
stacking many layers, we prove that employing a randomly-wired architecture can
be a more effective way to increase the capacity of the network and obtain
richer representations. We show that such architectures behave like an ensemble
of paths, which are able to merge contributions from receptive fields of varied
size. Moreover, these receptive fields can also be modulated to be wider or
narrower through the trainable weights over the paths. We also provide
extensive experimental evidence of the superior performance of randomly wired
architectures over multiple tasks and four graph convolution definitions, using
recent benchmarking frameworks that address the reliability of previous
testing methodologies.
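The "ensemble of paths" view above can be made concrete with a short sketch. The following PyTorch code is a minimal illustration, assuming a GCN-style convolution, Erdős–Rényi-style random wiring, sigmoid-gated scalar weights on the incoming paths, and an averaged readout over sink nodes; these design choices and all class names are illustrative assumptions, not the authors' exact architecture.
```python
import torch
import torch.nn as nn

class SimpleGraphConv(nn.Module):
    # One GCN-style convolution: H' = ReLU(A_hat @ H @ W), where A_hat
    # is a pre-normalized (N x N) adjacency matrix with self-loops.
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, a_hat, h):
        return torch.relu(a_hat @ self.lin(h))

class RandomlyWiredGNN(nn.Module):
    # Layers are the nodes of a random DAG; an edge i -> j feeds the
    # output of layer i into layer j. Each layer merges its incoming
    # activations with trainable sigmoid-gated scalar weights, so the
    # whole network acts as a weighted ensemble of paths whose
    # receptive fields have different sizes.
    def __init__(self, dim, num_layers=8, edge_prob=0.4, seed=0):
        super().__init__()
        rng = torch.Generator().manual_seed(seed)
        self.preds = []
        self.convs = nn.ModuleList()
        self.gates = nn.ParameterList()
        for j in range(num_layers):
            preds = [i for i in range(j)
                     if torch.rand(1, generator=rng).item() < edge_prob]
            if j > 0 and not preds:          # keep every layer reachable
                preds = [j - 1]
            self.preds.append(preds)
            self.convs.append(SimpleGraphConv(dim))
            self.gates.append(nn.Parameter(torch.zeros(max(len(preds), 1))))

    def forward(self, a_hat, x):
        outs = []
        for j, conv in enumerate(self.convs):
            if self.preds[j]:                # weighted merge of predecessors
                w = torch.sigmoid(self.gates[j])
                h = sum(wi * outs[i] for wi, i in zip(w, self.preds[j]))
            else:                            # source node reads the input
                h = x
            outs.append(conv(a_hat, h))
        # Readout: average the sinks (nodes no other node consumes).
        used = {i for ps in self.preds for i in ps}
        sinks = [o for j, o in enumerate(outs) if j not in used]
        return torch.stack(sinks).mean(dim=0)

# Toy run on a 5-node graph with 16-dim features; an identity matrix
# stands in for a properly normalized adjacency.
model = RandomlyWiredGNN(dim=16)
print(model(torch.eye(5), torch.randn(5, 16)).shape)  # torch.Size([5, 16])
```
Because each DAG node merges predecessors drawn from different depths, one forward pass mixes receptive fields of several sizes, and the gates let training widen or narrow that mixture, matching the behavior the abstract describes.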
Related papers
- AGNN: Alternating Graph-Regularized Neural Networks to Alleviate Over-Smoothing [29.618952407794776]
We propose an Alternating Graph-regularized Neural Network (AGNN) composed of a Graph Convolutional Layer (GCL) and a Graph Embedding Layer (GEL).
GEL is derived from a graph-regularized optimization containing a Laplacian embedding term, which can alleviate the over-smoothing problem.
AGNN is evaluated via extensive experiments, including performance comparisons with several multi-layer and multi-order graph neural networks.
arXiv Detail & Related papers (2023-04-14T09:20:03Z)
- Building Shortcuts between Distant Nodes with Biaffine Mapping for Graph Convolutional Networks [18.160610500658183]
We introduce a biaffine technique to improve the expressiveness of graph convolutional networks with a shallow architecture.
Our method learns direct dependencies between nodes and their long-distance neighbors, so that a single hop of message passing suffices to capture rich information for node representations (a hedged sketch of this idea follows this entry).
arXiv Detail & Related papers (2023-02-17T06:39:47Z)
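As a rough illustration of the biaffine idea, the sketch below scores every node pair with a learned bilinear form and uses the softmax-normalized scores as a dense "shortcut" graph; the exact formulation in the paper may differ, and all names here are assumptions.
```python
import torch
import torch.nn as nn

class BiaffineShortcut(nn.Module):
    # A learned bilinear form h_i^T W h_j scores every node pair; the
    # normalized scores act as a dense shortcut graph, so one hop of
    # message passing can reach distant neighbors directly.
    def __init__(self, dim):
        super().__init__()
        self.w = nn.Parameter(torch.randn(dim, dim) / dim ** 0.5)

    def forward(self, h):                    # h: (N, dim) node features
        scores = h @ self.w @ h.t()          # (N, N) pairwise scores
        attn = torch.softmax(scores, dim=-1)
        return attn @ h                      # one hop aggregates all nodes

layer = BiaffineShortcut(dim=16)
print(layer(torch.randn(7, 16)).shape)       # torch.Size([7, 16])
```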
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensembling training scheme, named EnGCN, to address these issues.
Our proposed method has achieved new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)
- Mastering Spatial Graph Prediction of Road Networks [18.321172168775472]
We propose a graph-based framework that simulates the addition of sequences of graph edges.
In particular, given a partially generated graph associated with a satellite image, an RL agent nominates modifications that maximize a cumulative reward.
arXiv Detail & Related papers (2022-10-03T11:26:09Z)
- Tree Decomposed Graph Neural Network [11.524511007436791]
We propose a tree decomposition method to disentangle neighborhoods in different layers to alleviate feature smoothing.
We also characterize the multi-hop dependency via graph diffusion within our tree decomposition formulation to construct the Tree Decomposed Graph Neural Network (TDGNN).
Comprehensive experiments demonstrate the superior performance of TDGNN on both homophily and heterophily networks.
arXiv Detail & Related papers (2021-08-25T02:47:16Z)
- Improving Graph Neural Networks with Simple Architecture Design [7.057970273958933]
We introduce several key design strategies for graph neural networks.
We present a simple and shallow model, the Feature Selection Graph Neural Network (FSGNN).
We show that the proposed model outperforms other state-of-the-art GNN models and achieves up to 64% improvement in accuracy on node classification tasks.
arXiv Detail & Related papers (2021-05-17T06:46:01Z)
- Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of routing every input through the same fixed path, DG-Net aggregates features dynamically at each node, giving the network greater representational ability (a hedged sketch of this idea follows this entry).
arXiv Detail & Related papers (2020-10-02T16:50:26Z)
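To make the dynamic aggregation concrete, here is a hedged sketch of a single DG-Net-style node: the weights that merge incoming edge features are predicted from the inputs themselves, so each sample is routed through the DAG differently. A fully connected block stands in for the convolutional blocks of the original, and the gate design is an assumption.
```python
import torch
import torch.nn as nn

class DynamicAggregationNode(nn.Module):
    # Merges k incoming feature streams with input-dependent weights,
    # then applies this node's block. The gating layer is illustrative.
    def __init__(self, dim, num_inputs):
        super().__init__()
        self.gate = nn.Linear(num_inputs * dim, num_inputs)
        self.block = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())

    def forward(self, inputs):               # list of k (batch, dim) tensors
        stacked = torch.stack(inputs, dim=1)              # (batch, k, dim)
        w = torch.softmax(self.gate(stacked.flatten(1)), dim=-1)
        merged = (w.unsqueeze(-1) * stacked).sum(dim=1)   # per-sample mix
        return self.block(merged)

node = DynamicAggregationNode(dim=16, num_inputs=3)
print(node([torch.randn(4, 16) for _ in range(3)]).shape)  # torch.Size([4, 16])
```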
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Performance often deteriorates as more layers are stacked; several recent studies attribute this to the over-smoothing issue.
We propose the Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields (a hedged sketch of this idea follows this entry).
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
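The adaptive receptive-field idea admits a compact sketch: propagate features over several hops with no weights in between, then let each node learn how much to trust each hop. This follows the spirit of the DAGNN entry above, but the details below are assumptions.
```python
import torch
import torch.nn as nn

class AdaptiveDepthAggregation(nn.Module):
    # Collects the 0..K-hop propagations of the features and combines
    # them with per-node, per-hop sigmoid gates learned from the data.
    def __init__(self, dim, hops=5):
        super().__init__()
        self.score = nn.Linear(dim, 1)
        self.hops = hops

    def forward(self, a_hat, h):             # a_hat: normalized adjacency
        hs = [h]
        for _ in range(self.hops):
            hs.append(a_hat @ hs[-1])        # one more hop of propagation
        stacked = torch.stack(hs, dim=1)     # (N, hops + 1, dim)
        gates = torch.sigmoid(self.score(stacked))  # (N, hops + 1, 1)
        return (gates * stacked).sum(dim=1)  # adaptive receptive field

agg = AdaptiveDepthAggregation(dim=16)
print(agg(torch.eye(6), torch.randn(6, 16)).shape)  # torch.Size([6, 16])
```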
- Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient on the selected set of candidates (a hedged sketch of such a predictor follows this entry).
arXiv Detail & Related papers (2020-04-17T19:12:39Z)
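A minimal sketch of such a predictor, under the assumption that each candidate sub-network is encoded as a graph of operation nodes with fixed-size features and regressed to a scalar accuracy; the real system's encoding and training setup are not shown here.
```python
import torch
import torch.nn as nn

class ArchPerformancePredictor(nn.Module):
    # Two GCN-style layers over the architecture graph, a mean readout,
    # and a linear head that estimates accuracy, so candidates can be
    # ranked without fully training each one.
    def __init__(self, op_dim, hidden=32):
        super().__init__()
        self.g1 = nn.Linear(op_dim, hidden)
        self.g2 = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, a_hat, ops):           # ops: (num_nodes, op_dim)
        h = torch.relu(a_hat @ self.g1(ops))
        h = torch.relu(a_hat @ self.g2(h))
        return self.out(h.mean(dim=0))       # scalar accuracy estimate

pred = ArchPerformancePredictor(op_dim=8)
print(pred(torch.eye(5), torch.randn(5, 8)).shape)  # torch.Size([1])
```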
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator (a hedged sketch follows this entry).
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
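A hedged sketch of the affine skip connection pattern: the output of an arbitrary graph convolution is summed with a node-wise affine map of its input, roughly H' = GraphConv(A_hat, H) + (H W + b). The wrapper and the toy convolution below are illustrative, not the paper's code.
```python
import torch
import torch.nn as nn

class TinyGraphConv(nn.Module):
    # Minimal GCN-style operator, used only to exercise the wrapper.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        return torch.relu(a_hat @ self.lin(h))

class AffineSkipConnection(nn.Module):
    # Wraps any graph convolution and adds a learned affine map of the
    # input features: a skip path that is affine rather than identity.
    def __init__(self, conv, in_dim, out_dim):
        super().__init__()
        self.conv = conv
        self.affine = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        return self.conv(a_hat, h) + self.affine(h)

layer = AffineSkipConnection(TinyGraphConv(8, 8), 8, 8)
print(layer(torch.eye(4), torch.randn(4, 8)).shape)  # torch.Size([4, 8])
```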
This list is automatically generated from the titles and abstracts of the papers on this site.