LEReg: Empower Graph Neural Networks with Local Energy Regularization
- URL: http://arxiv.org/abs/2203.10565v1
- Date: Sun, 20 Mar 2022 14:38:05 GMT
- Title: LEReg: Empower Graph Neural Networks with Local Energy Regularization
- Authors: Xiaojun Ma, Hanyue Chen, Guojie Song
- Abstract summary: Graph Neural Networks (GNNs) map the adjacency matrix and node features to node representations by message passing through edges on each convolution layer.
Existing GNNs treat all parts of the graph uniformly, which makes it difficult to adaptively pass the most informative message for each unique part.
We propose two regularization terms that consider message passing locally: (1) Intra-Energy Reg and (2) Inter-Energy Reg.
- Score: 20.663228831150725
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Research on analyzing graphs with Graph Neural Networks (GNNs) has
been receiving more and more attention because of the great expressive power of
graphs. GNNs map the adjacency matrix and node features to node representations
by message passing through edges on each convolution layer. However, the
message passed through GNNs is not always beneficial for all parts in a graph.
Specifically, as the data distribution varies over the graph, the receptive
field (the farthest nodes from which a node can obtain information) needed to
gather information also varies. Existing GNNs treat all parts
of the graph uniformly, which makes it difficult to adaptively pass the most
informative message for each unique part. To solve this problem, we propose two
regularization terms that consider message passing locally: (1) Intra-Energy
Reg and (2) Inter-Energy Reg. Through experiments and theoretical discussion,
we first show that the speed of smoothing varies enormously across different
parts and that the topology of each part shapes how it smooths. With Intra-Energy
Reg, we strengthen the message passing within each part, which is beneficial
for getting more useful information. With Inter-Energy Reg, we improve the
ability of GNNs to distinguish different nodes. With the proposed two
regularization terms, GNNs are able to filter the most useful information
adaptively, learn more robustly and gain higher expressiveness. Moreover, the
proposed LEReg can be easily applied to other GNN models in a plug-and-play
fashion. Extensive experiments on several benchmarks verify that GNNs
with LEReg outperform or match the state-of-the-art methods. Their effectiveness
and efficiency are further illustrated with detailed visualization experiments.
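The abstract does not spell out the regularizers, but a minimal sketch of what Dirichlet-energy-style intra/inter terms could look like is below; the hard node partition `parts` and every function name here are illustrative assumptions, not the authors' released code.

```python
# Hypothetical sketch of local energy regularization (not the paper's code).
# Dirichlet energy tr(H^T L H) measures how non-smooth embeddings H are over
# a graph with Laplacian L = D - A; LEReg-style terms apply it per part.
import torch

def dirichlet_energy(H, A):
    L = torch.diag(A.sum(dim=1)) - A          # unnormalized Laplacian
    return torch.trace(H.T @ L @ H)

def intra_energy_reg(H, A, parts):
    # Strengthen message passing within each part: penalize high energy there.
    return sum(dirichlet_energy(H[p], A[p][:, p]) for p in parts)

def inter_energy_reg(H, parts):
    # Keep different parts distinguishable: reward distance between part means.
    means = torch.stack([H[p].mean(dim=0) for p in parts])
    return -torch.cdist(means, means).mean()
```

A combined objective would then be something like `task_loss + a * intra_energy_reg(H, A, parts) + b * inter_energy_reg(H, parts)` for tunable coefficients `a` and `b`.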
Related papers
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with the size of the graphs in the logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z)
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs), which combine
spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter
approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
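As a rough illustration of a layer combining a spatially parametrized filter with a spectrally parametrized one (the dense eigendecomposition here is purely for exposition; the actual S$^2$GNN design differs):

```python
# Illustrative spatio-spectral layer, not the authors' implementation.
import torch

def s2_layer(X, A, W_spatial, theta):
    # Spatial filter: one hop of degree-normalized message passing.
    deg = A.sum(dim=1).clamp(min=1.0)
    spatial = (A / deg.unsqueeze(1)) @ X @ W_spatial

    # Spectral filter: a learnable function theta of Laplacian eigenvalues,
    # e.g. theta = lambda lam: torch.exp(-lam).
    L = torch.diag(A.sum(dim=1)) - A
    evals, evecs = torch.linalg.eigh(L)
    spectral = evecs @ torch.diag(theta(evals)) @ evecs.T @ X
    return torch.relu(spatial + spectral)     # W_spatial must keep X's width
```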
- Self-Attention Empowered Graph Convolutional Network for Structure Learning and Node Embedding [5.164875580197953]
In representation learning on graph-structured data, many popular graph neural networks (GNNs) fail to capture long-range dependencies.
This paper proposes a novel graph learning framework called the graph convolutional network with self-attention (GCN-SA).
The proposed scheme exhibits an exceptional generalization capability in node-level representation learning.
arXiv Detail & Related papers (2024-03-06T05:00:31Z)
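A hedged sketch of the core idea above: use self-attention scores to rewire the graph so that feature-similar but distant nodes can exchange messages. `Wq`, `Wk`, and the top-k rewiring rule are illustrative choices, not the paper's exact design.

```python
# Illustrative attention-based rewiring for long-range dependencies.
import torch
import torch.nn.functional as F

def attention_adjacency(X, Wq, Wk, top_k=8):
    Q, K = X @ Wq, X @ Wk
    scores = Q @ K.T / (Q.shape[1] ** 0.5)    # dense pairwise attention
    vals, idx = scores.topk(top_k, dim=1)     # keep each node's top-k edges
    A_att = torch.zeros_like(scores)
    A_att.scatter_(1, idx, F.softmax(vals, dim=1))
    return A_att  # blend with the original adjacency before a GCN layer
```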
- GGNNs : Generalizing GNNs using Residual Connections and Weighted Message Passing [0.0]
GNNs excel at capturing relationships and patterns within graphs, enabling effective learning and prediction tasks.
It is commonly believed that the generalizing power of GNNs is attributed to the message-passing mechanism between layers.
Our technique builds on these results and modifies the message-passing mechanism in two ways: by weighting messages before accumulating them at each node, and by adding residual connections.
arXiv Detail & Related papers (2023-11-26T22:22:38Z)
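Schematically, the two modifications just named, per-edge message weighting and residual connections, might look as follows; the sigmoid weighting and all names are my assumptions.

```python
# Illustrative weighted message passing with a residual connection.
import torch

def weighted_residual_layer(X, A, W, edge_logits):
    # edge_logits: learnable (n, n) scores, masked by the adjacency A.
    A_w = A * torch.sigmoid(edge_logits)      # weigh messages per edge
    deg = A_w.sum(dim=1).clamp(min=1e-6)
    msg = (A_w / deg.unsqueeze(1)) @ X @ W    # accumulate weighted messages
    return X + torch.relu(msg)                # residual; W must be square
```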
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
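A schematic reading of "one parameter per relation": each edge type contributes messages scaled by a single learnable score, plus a weighted self-loop. The normalization and nonlinearity here are illustrative guesses.

```python
# Illustrative relation-embedding layer with one scalar per edge type.
import torch

def re_gnn_layer(X, rel_adjs, rel_scores, self_score, W):
    # rel_adjs: one (n, n) adjacency per relation; rel_scores: shape (R,).
    out = torch.sigmoid(self_score) * X       # weighted self-loop connection
    for A_r, s_r in zip(rel_adjs, rel_scores):
        deg = A_r.sum(dim=1).clamp(min=1.0)
        out = out + torch.sigmoid(s_r) * (A_r / deg.unsqueeze(1)) @ X
    return torch.relu(out @ W)
```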
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
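The two graph constructions compared above can be sketched directly from point coordinates:

```python
# K-nearest-neighbor vs. fully-connected graph construction (NumPy sketch).
import numpy as np

def knn_graph(points, k):
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # no self-edges
    A = np.zeros_like(d)
    rows = np.arange(len(points))[:, None]
    A[rows, np.argsort(d, axis=1)[:, :k]] = 1.0
    return np.maximum(A, A.T)                 # symmetrize

def fc_graph(points):
    n = len(points)
    return np.ones((n, n)) - np.eye(n)        # every distinct pair is linked
```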
- Meta-Weight Graph Neural Network: Push the Limits Beyond Global Homophily [24.408557217909316]
Graph Neural Networks (GNNs) show strong expressive power on graph data mining.
However, not all graphs are homophilic, and even within the same graph, the distribution may vary significantly.
We propose Meta Weight Graph Neural Network (MWGNN) to adaptively construct graph convolution layers for different nodes.
arXiv Detail & Related papers (2022-03-19T09:27:38Z)
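One simplified reading of node-adaptive convolution: each node mixes several propagation depths with weights predicted from its own features. The actual meta-weight generator in MWGNN is more involved.

```python
# Illustrative node-adaptive mixing of propagation depths.
import torch

def node_adaptive_layer(X, A_norm, W, alpha_mlp, hops=3):
    # alpha_mlp: e.g. torch.nn.Linear(d, hops + 1) over node features.
    props = [X]                               # 0-hop features
    for _ in range(hops):
        props.append(A_norm @ props[-1])      # 1..hops-hop propagation
    P = torch.stack(props, dim=1)             # (n, hops + 1, d)
    alpha = torch.softmax(alpha_mlp(X), dim=1)  # per-node mixing weights
    return torch.relu((alpha.unsqueeze(-1) * P).sum(dim=1) @ W)
```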
- Graph Feature Gating Networks [31.20878472589719]
We propose a general graph feature gating network (GFGN) based on the graph signal denoising problem.
We also introduce three graph filters under GFGN to allow different levels of contributions from feature dimensions.
arXiv Detail & Related papers (2021-05-10T16:33:58Z)
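A minimal sketch of dimension-wise gating between a node's raw and graph-smoothed features, in the spirit of the summary above (the paper's concrete filters differ):

```python
# Illustrative per-dimension feature gating for graph signal denoising.
import torch

def gated_denoise(X, A_norm, gate_mlp):
    smoothed = A_norm @ X                     # one step of graph smoothing
    g = torch.sigmoid(gate_mlp(X))            # (n, d) gates in [0, 1]
    return g * smoothed + (1.0 - g) * X       # each dimension picks its blend
```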
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
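A toy sketch of the proximity-preserving idea: propagate random node features so that nearby nodes end up with correlated representations. This is schematic only, not the paper's stochastic message passing scheme.

```python
# Illustrative proximity-aware augmentation with random node features.
import torch

def stochastic_augment(X, A_norm, dim=16, steps=2, seed=0):
    torch.manual_seed(seed)
    E = torch.randn(X.shape[0], dim)          # random node identifiers
    for _ in range(steps):
        E = A_norm @ E                        # smooth them over the graph
    return torch.cat([X, E], dim=1)           # append to original features
```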
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order
Weisfeiler-Lehman tests, are inefficient as they cannot exploit the sparsity of
the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structure-related features
for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
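A small self-contained sketch of one distance-encoding variant: BFS shortest-path distances to a target node set, capped and one-hot encoded (illustrative only):

```python
# Illustrative distance encoding via breadth-first search.
from collections import deque

def distance_encoding(adj_list, targets, max_dist=4):
    # adj_list: dict node -> neighbor list; targets: nodes to measure from.
    dist = {t: 0 for t in targets}
    queue = deque(targets)
    while queue:
        u = queue.popleft()
        for v in adj_list[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    # One-hot over {0, ..., max_dist}; unreachable nodes share the cap bucket.
    return {v: [1.0 if i == min(dist.get(v, max_dist), max_dist) else 0.0
                for i in range(max_dist + 1)]
            for v in adj_list}
```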
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.