DepWiGNN: A Depth-wise Graph Neural Network for Multi-hop Spatial Reasoning in Text
- URL: http://arxiv.org/abs/2310.12557v2
- Date: Fri, 8 Mar 2024 14:16:55 GMT
- Title: DepWiGNN: A Depth-wise Graph Neural Network for Multi-hop Spatial Reasoning in Text
- Authors: Shuaiyi Li, Yang Deng, Wai Lam
- Abstract summary: We propose a novel Depth-Wise Graph Neural Network (DepWiGNN) to handle multi-hop spatial reasoning.
Specifically, we design a novel node memory scheme and aggregate the information over the depth dimension instead of the breadth dimension of the graph.
Experimental results on two challenging multi-hop spatial reasoning datasets show that DepWiGNN outperforms existing spatial reasoning methods.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spatial reasoning in text plays a crucial role in various real-world
applications. Existing approaches for spatial reasoning typically infer spatial
relations from pure text, which overlooks the gap between natural language and
symbolic structures. Graph neural networks (GNNs) have showcased exceptional
proficiency in inducing and aggregating symbolic structures. However, classical
GNNs face challenges in handling multi-hop spatial reasoning due to the
over-smoothing issue, i.e., the performance decreases substantially as the
number of graph layers increases. To cope with these challenges, we propose a
novel Depth-Wise Graph Neural Network (DepWiGNN). Specifically, we design a
novel node memory scheme and aggregate the information over the depth dimension
instead of the breadth dimension of the graph, which enables the model to
capture long dependencies without stacking multiple layers. Experimental
results on two challenging multi-hop spatial reasoning datasets show that
DepWiGNN outperforms existing spatial reasoning methods. Comparisons with
three other GNNs further demonstrate its superiority in capturing long
dependencies in the graph.
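To make the depth-wise idea concrete, here is a minimal sketch, not the authors' implementation: it assumes a BFS shortest-path helper and a GRU as the path encoder (the paper's node memory scheme is more elaborate), and shows how a single pass over the depth dimension can summarize a dependency that breadth-wise stacking would need many layers to reach.

```python
# Minimal sketch of depth-wise aggregation; names (DepthWiseAggregator,
# bfs_path) are illustrative, not the paper's API.
import torch
import torch.nn as nn
from collections import deque

def bfs_path(adj, src, dst):
    """Shortest path from src to dst over an adjacency list, or None."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None

class DepthWiseAggregator(nn.Module):
    """Aggregates features along paths (depth) instead of neighborhoods (breadth)."""
    def __init__(self, dim):
        super().__init__()
        self.rnn = nn.GRU(dim, dim, batch_first=True)  # one choice of path encoder

    def forward(self, x, adj, pairs):
        # x: (num_nodes, dim) node features; pairs: (src, dst) queries.
        memory = {}
        for s, t in pairs:
            path = bfs_path(adj, s, t)
            if path is None:
                continue
            seq = x[path].unsqueeze(0)    # (1, path_len, dim)
            _, h = self.rnn(seq)          # summarize the whole path in one pass
            memory[(s, t)] = h.squeeze()  # long dependency captured without stacking
        return memory

# Toy usage: a 5-node chain, where breadth-wise GNNs would need 4 layers.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
x = torch.randn(5, 16)
agg = DepthWiseAggregator(16)
mem = agg(x, adj, pairs=[(0, 4)])
print(mem[(0, 4)].shape)  # torch.Size([16])
```

Because the path is consumed as a sequence, the receptive field grows with path length rather than layer count, which is the property the abstract attributes to DepWiGNN.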
Related papers
- Layer-wise training for self-supervised learning on graphs
End-to-end training of graph neural networks (GNN) on large graphs presents several memory and computational challenges.
We propose Layer-wise Regularized Graph Infomax, an algorithm to train GNNs layer by layer in a self-supervised manner; a minimal training-loop sketch follows this entry.
arXiv Detail & Related papers (2023-09-04T10:23:39Z)
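As a rough illustration of the layer-by-layer regime described above, here is a hedged sketch: the GraphLayer module and the DGI-style loss are my own placeholders, not the Layer-wise Regularized Graph Infomax objective itself.

```python
# Hypothetical sketch of layer-wise self-supervised GNN training: each layer is
# optimized with an infomax-style loss while earlier layers stay frozen.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphLayer(nn.Module):
    def __init__(self, din, dout):
        super().__init__()
        self.lin = nn.Linear(din, dout)
    def forward(self, x, adj):
        return torch.relu(adj @ self.lin(x))   # dense-adjacency aggregation

def infomax_style_loss(layer, x, adj):
    # Contrast real embeddings with embeddings of feature-shuffled inputs,
    # scored against a mean-pooled graph summary (DGI-like placeholder).
    h = layer(x, adj)
    h_bad = layer(x[torch.randperm(x.size(0))], adj)
    s = torch.sigmoid(h.mean(dim=0))
    return -(F.logsigmoid(h @ s).mean() + F.logsigmoid(-(h_bad @ s)).mean())

def train_layerwise(x, adj, dims, epochs=100):
    layers, inp = [], x
    for dout in dims:                          # one optimization stage per layer
        layer = GraphLayer(inp.size(1), dout)
        opt = torch.optim.Adam(layer.parameters(), lr=1e-3)
        for _ in range(epochs):
            opt.zero_grad()
            infomax_style_loss(layer, inp, adj).backward()
            opt.step()
        layers.append(layer)
        with torch.no_grad():                  # freeze; outputs feed the next layer
            inp = layer(inp, adj)
    return layers
```

The point of the pattern is that each stage backpropagates through only one layer, which keeps memory usage flat as depth grows.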
- GPINN: Physics-informed Neural Network with Graph Embedding
This work proposes a Physics-informed Neural Network framework with Graph Embedding (GPINN) to perform PINN on graphs.
The method integrates topological data into the neural network's computations, which significantly boosts PINN performance; a small sketch of the idea follows this entry.
arXiv Detail & Related papers (2023-06-16T12:03:39Z)
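The graph-embedding idea can be pictured as follows; this is a speculative sketch in which the topological input is a Laplacian positional encoding, one common choice, rather than GPINN's specific construction.

```python
# Hedged sketch: augment a plain PINN-style MLP input with graph positional
# features, here Laplacian eigenvectors of the problem's topology.
import numpy as np
import torch
import torch.nn as nn

def laplacian_positional_encoding(adj, k):
    """First k nontrivial Laplacian eigenvectors as per-node coordinates."""
    lap = np.diag(adj.sum(axis=1)) - adj
    vals, vecs = np.linalg.eigh(lap)
    return torch.tensor(vecs[:, 1:k + 1], dtype=torch.float32)

class GraphEmbeddedPINN(nn.Module):
    def __init__(self, in_dim, pe_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + pe_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )
    def forward(self, coords, pe):
        # coords: physical inputs (e.g., space/time); pe: topology features.
        return self.net(torch.cat([coords, pe], dim=-1))

adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # 3-node path graph
pe = laplacian_positional_encoding(adj, k=2)
model = GraphEmbeddedPINN(in_dim=2, pe_dim=2)
u = model(torch.rand(3, 2), pe)   # one prediction per graph node
```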
- Feature Expansion for Graph Neural Networks
We decompose graph neural networks into determined feature spaces and trainable weights.
We theoretically find that the feature space tends to be linearly correlated due to repeated aggregations; a toy demonstration follows this entry.
Motivated by these findings, we propose 1) feature subspaces flattening and 2) structural principal components to expand the feature space.
arXiv Detail & Related papers (2023-05-10T13:45:57Z)
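The linear-correlation finding can be checked numerically with a toy experiment of my own construction (not the paper's proof): stack repeated aggregations of a feature vector and inspect the singular values of the result.

```python
# Repeatedly applying a normalized aggregation matrix makes the resulting
# feature columns nearly linearly dependent.
import numpy as np

rng = np.random.default_rng(0)
n = 50
adj = (rng.random((n, n)) < 0.1).astype(float)
adj = np.maximum(adj, adj.T)                            # undirected toy graph
a_hat = adj / np.maximum(adj.sum(1, keepdims=True), 1)  # row-normalized aggregation

x = rng.standard_normal(n)
feats = np.stack([np.linalg.matrix_power(a_hat, k) @ x for k in range(12)], axis=1)
s = np.linalg.svd(feats, compute_uv=False)
print(s / s[0])  # trailing ratios collapse: the columns become nearly dependent
```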
- Automatic Relation-aware Graph Network Proliferation
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
The searched operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets covering four graph learning tasks demonstrate that GNNs produced by our method are superior to current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs; both constructions are sketched after this entry.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
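For reference, the two constructions named in that entry look roughly like this; the code is my own minimal illustration for points with spatial coordinates.

```python
# KNN vs. fully-connected graph construction for coordinate data (e.g., atoms).
import numpy as np

def knn_graph(coords, k):
    """Directed KNN edges: each point connects to its k nearest neighbors."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-loops
    nbrs = np.argsort(d, axis=1)[:, :k]
    return [(i, int(j)) for i in range(len(coords)) for j in nbrs[i]]

def fc_graph(n):
    """Fully-connected: every ordered pair of distinct nodes is an edge."""
    return [(i, j) for i in range(n) for j in range(n) if i != j]

coords = np.random.default_rng(0).random((6, 3))
print(len(knn_graph(coords, k=2)), len(fc_graph(6)))  # 12 edges vs. 30 edges
```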
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network, named ACE-HGNN, to adaptively learn the optimal curvature according to the input graph and downstream tasks; the sketch after this entry shows the curvature-parameterized distance such methods build on.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, together with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
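The primitive underneath adaptive-curvature methods can be sketched as a Poincare-ball distance with a learnable curvature parameter; this hypothetical fragment is not ACE-HGNN's actual curvature-search procedure, just the differentiable-geometry ingredient.

```python
# Poincare-ball distance whose curvature c is a learnable parameter, so the
# geometry itself can be optimized alongside the model.
import torch
import torch.nn as nn

class LearnableCurvatureDistance(nn.Module):
    def __init__(self, c_init=1.0):
        super().__init__()
        self.log_c = nn.Parameter(torch.tensor(float(c_init)).log())  # keeps c > 0

    def forward(self, x, y):
        c = self.log_c.exp()
        sq = lambda t: (t * t).sum(-1)
        num = 2 * c * sq(x - y)
        den = (1 - c * sq(x)) * (1 - c * sq(y))
        # Standard Poincare-ball distance for curvature -c.
        return torch.acosh(1 + num / den.clamp_min(1e-9)) / c.sqrt()

dist = LearnableCurvatureDistance()
x, y = torch.rand(2, 4) * 0.3, torch.rand(2, 4) * 0.3  # points inside the ball
print(dist(x, y))  # differentiable w.r.t. both the points and the curvature
```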
- DPGNN: Dual-Perception Graph Neural Network for Representation Learning
Graph neural networks (GNNs) have drawn increasing attention in recent years and achieved remarkable performance in many graph-based tasks.
Most existing GNNs are based on the message-passing paradigm to iteratively aggregate neighborhood information in a single topology space.
We present a novel message-passing paradigm built on multi-step message sources, node-specific message output, and multi-space message interaction.
arXiv Detail & Related papers (2021-10-15T05:47:26Z)
- Reasoning Graph Networks for Kinship Verification: from Star-shaped to Hierarchical
We investigate the problem of facial kinship verification by learning hierarchical reasoning graph networks.
We develop a Star-shaped Reasoning Graph Network (S-RGN) and extend it to a Hierarchical Reasoning Graph Network (H-RGN) for more powerful and flexible reasoning capacity.
arXiv Detail & Related papers (2021-09-06T03:16:56Z)
- Hierarchical graph neural nets can capture long-range interactions
We study hierarchical message passing models that leverage a multi-resolution representation of a given graph.
This facilitates learning of features that span large receptive fields without loss of local information.
We introduce Hierarchical Graph Net (HGNet), which for any two connected nodes guarantees the existence of message-passing paths of at most logarithmic length; a toy illustration of the hierarchy idea follows this entry.
arXiv Detail & Related papers (2021-07-15T16:24:22Z)
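Why a hierarchy yields logarithmic paths can be seen with a toy construction of my own (not HGNet's actual coarsening): add a balanced binary tree of auxiliary "super-nodes" over the leaves, and any two leaves are within about 2*log2(n) hops.

```python
# Multi-resolution hierarchy shortens message-passing paths.
from collections import deque

def hierarchy_edges(n):
    """Edges of a balanced binary tree whose leaves are nodes 0..n-1."""
    edges, level, next_id = [], list(range(n)), n
    while len(level) > 1:
        parents = []
        for i in range(0, len(level), 2):
            for child in level[i:i + 2]:
                edges.append((next_id, child))
            parents.append(next_id)
            next_id += 1
        level = parents
    return edges

def hops(edges, src, dst):
    """BFS hop count between two nodes."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    dist, q = {src: 0}, deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist[dst]

edges = hierarchy_edges(64)
print(hops(edges, 0, 63))  # 12 hops == 2*log2(64), vs. 63 hops on a plain chain
```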
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon (sketched after this entry).
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
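The Bernoulli-from-graphon sampling step, plus the growing-graph schedule, can be sketched as follows; the particular graphon kernel is my own toy choice, not the paper's.

```python
# Sample graphs of growing size from a graphon W: [0,1]^2 -> [0,1].
import numpy as np

def sample_graph(w, n, rng):
    """n-node graph: latents u_i ~ U[0,1]; edge (i,j) present w.p. W(u_i, u_j)."""
    u = rng.random(n)
    probs = w(u[:, None], u[None, :])
    adj = (rng.random((n, n)) < probs).astype(float)
    adj = np.triu(adj, 1)
    return adj + adj.T                      # undirected, no self-loops

w = lambda x, y: 0.8 * np.exp(-3 * np.abs(x - y))   # toy graphon kernel
rng = np.random.default_rng(0)
for n in (32, 64, 128):                     # successively grow the graph
    adj = sample_graph(w, n, rng)
    # ... train/fine-tune the same GNN weights on this larger sample ...
    print(n, adj.sum() / (n * (n - 1)))     # edge density stays comparable
```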
This list is automatically generated from the titles and abstracts of the papers on this site.