Bridging the Gap between Spatial and Spectral Domains: A Survey on Graph
Neural Networks
- URL: http://arxiv.org/abs/2002.11867v4
- Date: Wed, 21 Jul 2021 15:54:42 GMT
- Title: Bridging the Gap between Spatial and Spectral Domains: A Survey on Graph
Neural Networks
- Authors: Zhiqian Chen, Fanglan Chen, Lei Zhang, Taoran Ji, Kaiqun Fu, Liang
Zhao, Feng Chen, Lingfei Wu, Charu Aggarwal and Chang-Tien Lu
- Abstract summary: Graph neural networks (GNNs) are designed to handle non-Euclidean graph-structured data.
Existing GNNs are presented using a variety of techniques, which makes direct comparison and cross-referencing difficult.
We organize existing GNNs into the spatial and spectral domains and expose the connections within each domain.
- Score: 52.76042362922247
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning's success has been widely recognized in a variety of machine
learning tasks, including image classification, audio recognition, and natural
language processing. As an extension of deep learning beyond these domains,
graph neural networks (GNNs) are designed to handle non-Euclidean
graph-structured data, which is intractable for previous deep learning techniques.
Existing GNNs are presented using a variety of techniques, which makes direct
comparison and cross-referencing difficult. Although existing studies categorize
GNNs into spatial-based and spectral-based techniques, there has not been a
thorough examination of the relationship between the two. To close this gap, this
study presents a single framework that systematically incorporates most GNNs. We
organize existing GNNs into the spatial and spectral domains and expose the
connections within each domain. A review of spectral graph theory and
approximation theory then establishes a strong relationship across the spatial
and spectral domains.
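This relationship can be made concrete with a standard derivation (a minimal sketch using the common Chebyshev polynomial approximation; the notation here is illustrative and not quoted from the paper). Writing the normalized Laplacian as L = I - D^{-1/2} A D^{-1/2} = U \Lambda U^\top, a spectral convolution of a graph signal x with a filter g_\theta is

    g_\theta \star x = U \, g_\theta(\Lambda) \, U^\top x .

Approximating g_\theta by a K-th order polynomial, e.g. in the Chebyshev basis T_k, gives

    g_\theta \star x \approx \sum_{k=0}^{K} \theta_k \, T_k(\tilde{L}) \, x , \quad \tilde{L} = \frac{2}{\lambda_{\max}} L - I ,

which involves only powers of L and therefore only K-hop neighborhoods: the approximated spectral filter acts as a localized spatial aggregation, and GCN is recovered at K = 1 under further simplifications.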
Related papers
- Rethinking Spectral Graph Neural Networks with Spatially Adaptive Filtering [31.595664867365322]
Spectral Graph Neural Networks (GNNs) are well-founded in the spectral domain, but their practical reliance on approximation implies a profound linkage to the spatial domain.
We establish a theoretical connection between spectral filtering and spatial aggregation, unveiling an intrinsic interaction in which spectral filtering implicitly transforms the original graph into an adapted new graph.
We propose a novel Spatially Adaptive Filtering (SAF) framework, which leverages the adapted graph produced by spectral filtering for an auxiliary non-local aggregation.
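As a rough, self-contained illustration of this idea (a minimal NumPy sketch under our own assumptions; the function name and the combination rule are hypothetical, not the SAF implementation), a spectral filter response of the Laplacian is a dense matrix that can be read as an adapted, non-local graph and used for an auxiliary aggregation next to the usual local one:

    import numpy as np

    def adapted_graph_aggregation(A, X, h=lambda lam: np.exp(-lam), alpha=0.5):
        # Toy sketch: a spectral filter h of the normalized Laplacian yields a
        # dense propagation matrix that acts like an adapted, non-local graph.
        n = len(A)
        d = np.maximum(A.sum(axis=1), 1e-12)
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
        L = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt      # normalized Laplacian
        lam, U = np.linalg.eigh(L)                       # spectral decomposition
        H = U @ np.diag(h(lam)) @ U.T                    # filter response = adapted graph
        local = D_inv_sqrt @ A @ D_inv_sqrt @ X          # standard local aggregation
        non_local = H @ X                                # auxiliary non-local aggregation
        return alpha * local + (1 - alpha) * non_local   # combine the two views

The point of the sketch is only that H is dense, so every node can influence every other node's update, which is what makes the auxiliary aggregation non-local.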
arXiv Detail & Related papers (2024-01-17T09:12:31Z)
- The Evolution of Distributed Systems for Graph Neural Networks and their Origin in Graph Processing and Deep Learning: A Survey [17.746899445454048]
Graph Neural Networks (GNNs) are an emerging research field.
GNNs can be applied to various domains including recommendation systems, computer vision, natural language processing, biology and chemistry.
We aim to fill a gap in the literature by summarizing and categorizing important methods and techniques for large-scale GNN solutions.
arXiv Detail & Related papers (2023-05-23T09:22:33Z)
- A Survey on Spectral Graph Neural Networks [42.469584005389414]
We summarize the recent development of spectral GNNs, including model, theory, and application.
We first discuss the connection between spatial GNNs and spectral GNNs, which shows that spectral GNNs can capture global information and have better interpretability.
In addition, we review major theoretical results and applications of spectral GNNs, followed by a quantitative experiment to benchmark some popular spectral GNNs.
arXiv Detail & Related papers (2023-02-11T09:16:46Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
Its relation-aware operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
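For reference, the two construction schemes named above can be sketched in a few lines of NumPy (a generic illustration; the helper names knn_graph and fc_graph are ours, not code from the paper):

    import numpy as np

    def knn_graph(points, k=5):
        # K-nearest-neighbor graph: link each point to its k closest neighbors.
        dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        np.fill_diagonal(dist, np.inf)                  # exclude self-loops
        nbrs = np.argsort(dist, axis=1)[:, :k]          # k nearest neighbors per node
        A = np.zeros((len(points), len(points)))
        rows = np.repeat(np.arange(len(points)), k)
        A[rows, nbrs.ravel()] = 1.0
        return np.maximum(A, A.T)                       # symmetrize

    def fc_graph(points, sigma=1.0):
        # Fully-connected graph: every pair is linked, weighted by a Gaussian kernel.
        dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        A = np.exp(-dist ** 2 / (2 * sigma ** 2))
        np.fill_diagonal(A, 0.0)                        # drop self-loops
        return A

KNN graphs keep interactions sparse and short-range, while FC graphs expose all pairwise interactions, which is one way to probe different ranges of node interaction.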
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- AdaGNN: A multi-modal latent representation meta-learner for GNNs based on AdaBoosting [0.38073142980733]
Graph Neural Networks (GNNs) focus on extracting intrinsic network features.
We propose a boosting-based meta learner for GNNs.
AdaGNN performs exceptionally well for applications with rich and diverse node neighborhood information.
arXiv Detail & Related papers (2021-08-14T03:07:26Z)
- Bridging the Gap between Spatial and Spectral Domains: A Unified Framework for Graph Neural Networks [61.17075071853949]
Graph neural networks (GNNs) are designed to deal with graph-structured data that classical deep learning does not easily manage.
The purpose of this study is to establish a unified framework that integrates GNNs based on spectral graph and approximation theory.
arXiv Detail & Related papers (2021-07-21T17:34:33Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address the limitations of standard convolutions by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Node Masking: Making Graph Neural Networks Generalize and Scale Better [71.51292866945471]
Graph Neural Networks (GNNs) have received a lot of interest in recent times.
In this paper, we utilize some theoretical tools to better visualize the operations performed by state-of-the-art spatial GNNs.
We introduce a simple concept, Node Masking, that allows them to generalize and scale better.
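The summary above gives no mechanism, so purely as a generic illustration of what masking nodes during message passing can look like (our assumption; the paper's exact scheme may differ):

    import numpy as np

    def masked_aggregation(A, X, mask_rate=0.2, rng=np.random.default_rng(0)):
        # Generic node-masking sketch: randomly drop a fraction of nodes from the
        # aggregation step so the model cannot over-rely on any particular node.
        n = len(A)
        keep = rng.random(n) >= mask_rate               # nodes kept in this step
        A_masked = A * keep[None, :]                    # masked nodes send no messages
        deg = np.maximum(A_masked.sum(axis=1, keepdims=True), 1e-12)
        return (A_masked @ X) / deg                     # mean over unmasked neighbors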
arXiv Detail & Related papers (2020-01-17T06:26:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.