Feature Expansion for Graph Neural Networks
- URL: http://arxiv.org/abs/2305.06142v2
- Date: Sat, 27 May 2023 17:26:04 GMT
- Title: Feature Expansion for Graph Neural Networks
- Authors: Jiaqi Sun, Lin Zhang, Guangyi Chen, Kun Zhang, Peng Xu, Yujiu Yang
- Abstract summary: We decompose graph neural networks into determined feature spaces and trainable weights.
We theoretically find that the feature space tends to be linearly correlated due to repeated aggregations.
Motivated by these findings, we propose 1) feature subspaces flattening and 2) structural principal components to expand the feature space.
- Score: 26.671557021142572
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks aim to learn representations for graph-structured data
and show impressive performance, particularly in node classification. Recently,
many methods have studied the representations of GNNs from the perspective of
optimization goals and spectral graph theory. However, the feature space that
dominates representation learning has not been systematically studied in graph
neural networks. In this paper, we propose to fill this gap by analyzing the
feature space of both spatial and spectral models. We decompose graph neural
networks into determined feature spaces and trainable weights, which allows the feature space to be studied explicitly via matrix space analysis. In particular, we theoretically find that the feature space tends to
be linearly correlated due to repeated aggregations. Motivated by these
findings, we propose 1) feature subspaces flattening and 2) structural
principal components to expand the feature space. Extensive experiments verify the effectiveness of the proposed, more comprehensive feature space, which achieves inference time comparable to the baselines and converges efficiently.
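To make the two expansions concrete, here is a minimal numpy sketch, assuming a symmetrically normalized adjacency `A_hat` and node features `X`; the function name and the eigenvector-based choice of structural components are illustrative simplifications, not the paper's exact construction.

```python
import numpy as np

def expanded_features(A_hat, X, hops=2, n_struct=4):
    """Expand the GNN feature space in two ways:
    1) flattening: keep every hop's feature subspace instead of only
       the deepest one, since deep subspaces grow linearly correlated;
    2) structural principal components: append leading eigenvectors of
       A_hat, which carry structure that aggregation alone cannot add."""
    subspaces, H = [X], X
    for _ in range(hops):
        H = A_hat @ H                         # one aggregation step
        subspaces.append(H)
    flat = np.concatenate(subspaces, axis=1)  # subspace flattening

    eigvals, eigvecs = np.linalg.eigh(A_hat)  # A_hat is symmetric
    top = np.argsort(-np.abs(eigvals))[:n_struct]
    struct = eigvecs[:, top]                  # structural components

    return np.concatenate([flat, struct], axis=1)

# Toy usage on a 4-node path graph with self-loops and symmetric
# normalization; a trainable weight matrix applied to the result then
# plays the role of the learnable part of the decomposition.
A = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], dtype=float)
A_loop = A + np.eye(4)
d = 1.0 / np.sqrt(A_loop.sum(axis=1))
A_hat = d[:, None] * A_loop * d[None, :]
X = np.random.default_rng(0).normal(size=(4, 3))
print(expanded_features(A_hat, X).shape)  # (4, 13): 3*(2+1) + 4
```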
Related papers
- DepWiGNN: A Depth-wise Graph Neural Network for Multi-hop Spatial Reasoning in Text [52.699307699505646]
We propose a novel Depth-Wise Graph Neural Network (DepWiGNN) to handle multi-hop spatial reasoning.
Specifically, we design a novel node memory scheme and aggregate the information over the depth dimension instead of the breadth dimension of the graph.
Experimental results on two challenging multi-hop spatial reasoning datasets show that DepWiGNN outperforms existing spatial reasoning methods.
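As a rough, hedged illustration of aggregating over depth rather than breadth, the numpy sketch below stores one state per hop and pools across the hop axis; the max-pool stands in for DepWiGNN's node memory scheme and is an assumption, not the paper's design.

```python
import numpy as np

def depthwise_aggregate(A, X, hops=3):
    """Keep one representation per hop depth 1..hops for every node,
    then aggregate along the DEPTH axis, instead of folding ever-wider
    breadth-wise neighborhoods into a single state."""
    A_norm = A / np.maximum(A.sum(axis=1, keepdims=True), 1e-12)
    per_hop, H = [], X
    for _ in range(hops):
        H = A_norm @ H                 # reach one hop further
        per_hop.append(H)
    stack = np.stack(per_hop)          # (hops, n_nodes, d): depth axis
    return stack.max(axis=0)           # pool over depth (assumed choice)

A = np.array([[0,1,1],[1,0,0],[1,0,0]], dtype=float)
X = np.eye(3)
print(depthwise_aggregate(A, X))
```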
arXiv Detail & Related papers (2023-10-19T08:07:22Z)
- Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z)
- GPINN: Physics-informed Neural Network with Graph Embedding [1.6607142366834016]
This work proposes a Physics-Informed Neural Network (PINN) framework with Graph Embedding (GPINN) to apply PINNs on graphs.
The method integrates topological data into the neural network's computations, which significantly boosts the performance of the PINN.
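The summary leaves the mechanism open; one plausible reading, sketched below under stated assumptions, is to concatenate per-node topological features (here, Laplacian eigenvectors) to the coordinate input of a PINN-style network. The embedding choice and the input layout are hypothetical, not the GPINN architecture.

```python
import numpy as np

def laplacian_features(A, k=2):
    """Toy topological features: the k smallest nontrivial eigenvectors
    of the combinatorial graph Laplacian, one row per node."""
    L = np.diag(A.sum(axis=1)) - A
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:k + 1]

# Assumed setup: 1-D collocation points that live on graph nodes; the
# PINN input becomes [coordinate, topological features].
A = np.array([[0,1,0],[1,0,1],[0,1,0]], dtype=float)
coords = np.array([[0.0], [0.5], [1.0]])
pinn_inputs = np.concatenate([coords, laplacian_features(A)], axis=1)
print(pinn_inputs.shape)  # (3, 3)
```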
arXiv Detail & Related papers (2023-06-16T12:03:39Z)
- Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy [103.74640329539389]
We propose a deep feature selection (FS) method that simultaneously conducts feature selection and differentiable $k$-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability of learning $k$-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets.
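A minimal sketch of the differentiability problem and one workaround: replace hard nearest-neighbor selection with a temperature-controlled softmax over distances. Note this swaps in a softmax relaxation where the paper uses Optimal Transport, so it illustrates the goal, not the method.

```python
import numpy as np

def soft_neighbor_weights(X, temp=0.1):
    """Differentiable surrogate for a k-NN graph: pairwise distances
    become row-stochastic neighbor weights. Gradients flow through X,
    and as temp -> 0 each row concentrates on the nearest neighbors."""
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(D, np.inf)                  # exclude self-loops
    D = D - D.min(axis=1, keepdims=True)         # stabilize the exponent
    W = np.exp(-D / temp)                        # exp(-inf) -> 0 on diagonal
    return W / W.sum(axis=1, keepdims=True)

X = np.random.default_rng(1).normal(size=(5, 4))
print(soft_neighbor_weights(X).round(3))
```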
arXiv Detail & Related papers (2023-05-21T08:15:55Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
However, the performance of Euclidean models in graph-related learning remains limited by the representational capacity of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to overcome the limitations of conventional convolutions by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Graph Networks with Spectral Message Passing [1.0742675209112622]
We introduce the Spectral Graph Network, which applies message passing to both the spatial and spectral domains.
Our results show that the Spectral GN promotes efficient training, reaching high performance with fewer training iterations despite having more parameters.
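A hedged sketch of message passing in both domains, assuming a fixed low-pass spectral filter and a simple convex mix; the actual Spectral GN learns its spatial and spectral blocks, so treat the filter `g` and the coefficient `alpha` as illustrative assumptions.

```python
import numpy as np

def spectral_spatial_step(A, H, alpha=0.5):
    """One combined step: a spatial pass through the adjacency plus a
    spectral pass that filters H in the graph Fourier basis."""
    n = A.shape[0]
    L = np.diag(A.sum(axis=1)) - A
    lam, U = np.linalg.eigh(L)             # eigenbasis = graph Fourier basis
    spatial = (A + np.eye(n)) @ H          # aggregate self + neighbors
    g = 1.0 / (1.0 + lam)                  # assumed low-pass filter g(lam)
    spectral = U @ (g[:, None] * (U.T @ H))
    return alpha * spatial + (1.0 - alpha) * spectral

A = np.array([[0,1,1,0],[1,0,0,1],[1,0,0,1],[0,1,1,0]], dtype=float)
H = np.random.default_rng(2).normal(size=(4, 2))
print(spectral_spatial_step(A, H).shape)  # (4, 2)
```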
arXiv Detail & Related papers (2020-12-31T21:33:17Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
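A minimal numpy sketch of the adaptive idea, assuming the paper's decoupled design: transform features once, propagate 0..K hops, then gate each hop with a learned per-node retention score. The gating details here are simplified from DAGNN, not copied from it.

```python
import numpy as np

def dagnn_style_propagate(A_hat, Z, s, K=4):
    """Propagate transformed features Z over 0..K hops and combine the
    hop states with sigmoid retention scores, letting each node choose
    how large a receptive field it actually uses."""
    hop_states, H = [Z], Z
    for _ in range(K):
        H = A_hat @ H
        hop_states.append(H)
    stack = np.stack(hop_states)                # (K+1, n, d)
    gates = 1.0 / (1.0 + np.exp(-(stack @ s)))  # (K+1, n, 1) per-hop gates
    return (gates * stack).sum(axis=0)          # adaptive mixture of hops

rng = np.random.default_rng(3)
A = np.array([[0,1,0],[1,0,1],[0,1,0]], dtype=float) + np.eye(3)
d = 1.0 / np.sqrt(A.sum(axis=1))
A_hat = d[:, None] * A * d[None, :]
Z = rng.normal(size=(3, 2))   # stands in for MLP(X)
s = rng.normal(size=(2, 1))   # learnable scoring vector (assumed shape)
print(dagnn_style_propagate(A_hat, Z, s).shape)  # (3, 2)
```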
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.