GGNNs: Generalizing GNNs using Residual Connections and Weighted Message Passing
- URL: http://arxiv.org/abs/2311.15448v1
- Date: Sun, 26 Nov 2023 22:22:38 GMT
- Title: GGNNs: Generalizing GNNs using Residual Connections and Weighted Message Passing
- Authors: Abhinav Raghuvanshi and Kushal Sokke Malleshappa
- Abstract summary: GNNs excel at capturing relationships and patterns within graphs, enabling effective learning and prediction tasks.
It is commonly believed that the generalizing power of GNNs is attributed to the message-passing mechanism between layers.
Our technique builds on these results, modifying the message-passing mechanism in two ways: by weighting messages before they are accumulated at each node, and by adding residual connections.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many real-world phenomena can be modeled as graphs, making
graph-structured data both ubiquitous and valuable. GNNs excel at capturing
the relationships and patterns within these graphs, enabling effective
learning and prediction tasks. GNNs are constructed from Multi-Layer
Perceptrons (MLPs) and incorporate additional layers for message passing,
which facilitates the flow of features among nodes. The generalizing power
of GNNs is commonly attributed to this message-passing mechanism between
layers, in which nodes exchange information with their neighbors and thereby
capture and propagate information across the graph. Our technique builds on
these results and modifies the message-passing mechanism in two ways: by
weighting messages before they are accumulated at each node, and by adding
residual connections. These two mechanisms yield significant improvements in
learning and faster convergence.
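The two modifications admit a short sketch. Below is a minimal PyTorch layer
written under stated assumptions rather than from the paper's exact
equations: each incoming message is scaled by a learned scalar gate before
accumulation, and an identity skip connection is added to the layer output.
The gate design, update MLP, and activation are illustrative choices.

```python
import torch
import torch.nn as nn

class WeightedResidualLayer(nn.Module):
    """One message-passing layer with weighted message accumulation and a
    residual connection. The scalar sigmoid gate and the identity skip are
    illustrative assumptions, not the paper's exact formulation."""

    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(dim, dim)         # per-edge message transform
        self.gate = nn.Linear(2 * dim, 1)      # scores a message from its endpoints
        self.update = nn.Linear(2 * dim, dim)  # combines node state with the aggregate

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, dim]; edge_index: [2, num_edges], rows are (src, dst)
        src, dst = edge_index
        m = self.msg(x[src])                                           # raw messages
        w = torch.sigmoid(self.gate(torch.cat([x[src], x[dst]], -1)))  # per-edge weight
        agg = torch.zeros_like(x)
        agg.index_add_(0, dst, w * m)          # weighted accumulation at each node
        out = self.update(torch.cat([x, agg], -1))
        return torch.relu(out) + x             # residual connection

# Toy usage: 4 nodes of dimension 16, connected in a directed cycle.
x = torch.randn(4, 16)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
h = WeightedResidualLayer(16)(x, edge_index)   # shape: [4, 16]
```

A softmax over each node's incoming gates would be an equally plausible
reading of "weighting the messages"; the sigmoid gate is used here only
because it needs no per-node normalization.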
Related papers
- Harnessing Collective Structure Knowledge in Data Augmentation for Graph Neural Networks [25.12261412297796]
Graph neural networks (GNNs) have achieved state-of-the-art performance in graph representation learning.
We propose a novel approach, namely the collective structure knowledge-augmented graph neural network (CoS-GNN).
arXiv Detail & Related papers (2024-05-17T08:50:00Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have a powerful capability to embed the rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Graph Ordering Attention Networks [22.468776559433614]
Graph Neural Networks (GNNs) have been successfully used in many problems involving graph-structured data.
We introduce the Graph Ordering Attention (GOAT) layer, a novel GNN component that captures interactions between nodes in a neighborhood.
The GOAT layer demonstrates increased performance in modeling graph metrics that capture complex information.
arXiv Detail & Related papers (2022-04-11T18:13:19Z)
- Generalizing Aggregation Functions in GNNs: High-Capacity GNNs via Nonlinear Neighborhood Aggregators [14.573383849211773]
Graph neural networks (GNNs) have achieved great success in many graph learning tasks.
Existing GNNs mainly adopt either linear neighborhood aggregation (mean, sum) or a max aggregator in their message propagation.
We rethink the message propagation mechanism in GNNs and aim to develop general nonlinear aggregators for neighborhood information aggregation.
arXiv Detail & Related papers (2022-02-18T11:49:59Z)
- DPGNN: Dual-Perception Graph Neural Network for Representation Learning [21.432960458513826]
Graph neural networks (GNNs) have drawn increasing attention in recent years and achieved remarkable performance in many graph-based tasks.
Most existing GNNs are based on the message-passing paradigm to iteratively aggregate neighborhood information in a single topology space.
We present a novel message-passing paradigm based on multi-step message sources, node-specific message outputs, and multi-space message interaction.
arXiv Detail & Related papers (2021-10-15T05:47:26Z)
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
A majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta-path-free message passing scheme that packs up heterogeneous node features with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes of a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN).
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
- Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that combines the sampling procedure and message passing of GNNs into a single learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.