Bridging the Gap between Spatial and Spectral Domains: A Unified
Framework for Graph Neural Networks
- URL: http://arxiv.org/abs/2107.10234v5
- Date: Mon, 18 Sep 2023 21:40:20 GMT
- Title: Bridging the Gap between Spatial and Spectral Domains: A Unified
Framework for Graph Neural Networks
- Authors: Zhiqian Chen, Fanglan Chen, Lei Zhang, Taoran Ji, Kaiqun Fu, Liang
Zhao, Feng Chen, Lingfei Wu, Charu Aggarwal and Chang-Tien Lu
- Abstract summary: Graph neural networks (GNNs) are designed to deal with graph-structured data that classical deep learning does not easily manage.
The purpose of this study is to establish a unified framework that integrates GNNs based on spectral graph and approximation theory.
- Score: 61.17075071853949
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning's performance has been widely recognized in recent
years. Graph neural networks (GNNs) are designed to deal with graph-structured
data that classical deep learning does not easily manage. Since most GNNs were
built on distinct theories, direct comparisons are impossible. Prior research has
primarily concentrated on categorizing existing models, with little attention
paid to their intrinsic connections. The purpose of this study is to establish
a unified framework that integrates GNNs based on spectral graph and
approximation theory. The framework incorporates a strong integration between
spatial- and spectral-based GNNs while tightly associating approaches that
exist within each respective domain.
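The unification the abstract describes rests on a standard fact from spectral graph theory: a polynomial filter of the graph Laplacian can be evaluated either spectrally (by filtering eigenvalues) or spatially (by repeated neighborhood aggregation), and both views produce the same output. The following minimal NumPy sketch is not from the paper; the graph, features, and filter coefficients are all illustrative.

```python
import numpy as np

# Illustrative 4-node undirected graph (adjacency matrix).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A                        # combinatorial graph Laplacian

X = np.random.default_rng(0).normal(size=(4, 2))   # node features
coeffs = [0.5, -0.3, 0.1]        # illustrative polynomial filter coefficients

# Spatial view: sum_k c_k L^k X, i.e. repeated neighborhood aggregation.
out_spatial = sum(c * np.linalg.matrix_power(L, k) @ X
                  for k, c in enumerate(coeffs))

# Spectral view: U p(Lambda) U^T X, i.e. filtering the Laplacian eigenvalues.
lam, U = np.linalg.eigh(L)
p_lam = sum(c * lam**k for k, c in enumerate(coeffs))
out_spectral = U @ np.diag(p_lam) @ U.T @ X

# The two computations agree (up to floating-point error).
assert np.allclose(out_spatial, out_spectral)
```

The spatial form never touches the eigendecomposition, which is why polynomial spectral GNNs scale to large graphs: each power of L is just another round of message passing.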
Related papers
- Semantic Graph Neural Network with Multi-measure Learning for
Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are vulnerable to the complex underlying structure of the graph.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
These operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Deep Ensembles for Graphs with Higher-order Dependencies [13.164412455321907]
Graph neural networks (GNNs) continue to achieve state-of-the-art performance on many graph learning tasks.
We show that the tendency of traditional graph representations to underfit each node's neighborhood causes existing GNNs to generalize poorly.
We propose a novel Deep Graph Ensemble (DGE) which captures neighborhood variance by training an ensemble of GNNs on different neighborhood subspaces of the same node.
arXiv Detail & Related papers (2022-05-27T14:01:08Z)
- EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural
Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
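The defining property of an even-polynomial filter is that it uses only even powers of the graph operator, so no information ever propagates over an odd number of hops. A minimal NumPy sketch of that property (not EvenNet's actual architecture; the path graph and coefficients are illustrative):

```python
import numpy as np

# Illustrative path graph 0-1-2-3 (adjacency matrix).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)                    # one-hot features: row i shows what node i receives

# Even-polynomial filter: only 0-hop and 2-hop terms (illustrative coefficients).
coeffs = {0: 1.0, 2: 0.5}
H = sum(c * np.linalg.matrix_power(A, k) for k, c in coeffs.items()) @ X

# Node 0 receives nothing from its 1-hop (node 1) or 3-hop (node 3) neighbors.
assert H[0, 1] == 0.0 and H[0, 3] == 0.0
# It does receive from its 2-hop neighbor (node 2) and from itself.
assert H[0, 2] == 0.5
```

Skipping odd hops is what makes the filter's response invariant to flipping the sign of the adjacency spectrum, which is the intuition behind its robustness to homophily changes.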
arXiv Detail & Related papers (2022-05-27T10:48:14Z)
- Deep Architecture Connectivity Matters for Its Convergence: A
Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- AdaGNN: A multi-modal latent representation meta-learner for GNNs based
on AdaBoosting [0.38073142980733]
Graph Neural Networks (GNNs) focus on extracting intrinsic network features.
We propose boosting-based meta learner for GNNs.
AdaGNN performs exceptionally well for applications with rich and diverse node neighborhood information.
arXiv Detail & Related papers (2021-08-14T03:07:26Z)
- Architectural Implications of Graph Neural Networks [17.01480604968118]
Graph neural networks (GNNs) represent an emerging line of deep learning models that operate on graph structures.
GNNs are not as well understood in the systems and architecture community as their counterparts such as multi-layer perceptrons and convolutional neural networks.
arXiv Detail & Related papers (2020-09-02T03:36:24Z)
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs' ability to preserve graph structures.
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
- Bridging the Gap between Spatial and Spectral Domains: A Survey on Graph
Neural Networks [52.76042362922247]
Graph neural networks (GNNs) are designed to handle non-Euclidean graph structures.
Existing GNNs are presented using various techniques, making direct comparison and cross-reference more complex.
We organize existing GNNs into spatial and spectral domains, as well as expose the connections within each domain.
arXiv Detail & Related papers (2020-02-27T01:15:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.