FDGATII : Fast Dynamic Graph Attention with Initial Residual and
Identity Mapping
- URL: http://arxiv.org/abs/2110.11464v1
- Date: Thu, 21 Oct 2021 20:19:17 GMT
- Title: FDGATII : Fast Dynamic Graph Attention with Initial Residual and
Identity Mapping
- Authors: Gayan K. Kulatilleke, Marius Portmann, Ryan Ko, Shekhar S. Chandra
- Abstract summary: We propose a novel graph neural network, FDGATII, inspired by the attention mechanism's ability to focus on selective information.
By using sparse dynamic attention, FDGATII is inherently parallelizable in design, whilst efficient in operation.
We show that FDGATII outperforms GAT- and GCN-based benchmarks in accuracy and performance on fully supervised tasks.
- Score: 0.39373541926236766
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While Graph Neural Networks have gained popularity in multiple domains,
graph-structured input remains a major challenge due to (a) over-smoothing, (b)
noisy neighbours (heterophily), and (c) the suspended animation problem. To
address all these problems simultaneously, we propose a novel graph neural
network FDGATII, inspired by the attention mechanism's ability to focus on
selective information, supplemented with two feature-preserving mechanisms.
FDGATII combines Initial Residuals and Identity Mapping with the more
expressive dynamic self-attention to handle the noise prevalent in the
neighbourhoods of heterophilic datasets. By using sparse dynamic attention,
FDGATII is inherently parallelizable in design, whilst efficient in operation;
it is thus theoretically able to scale to arbitrary graphs with ease. Our approach
has been extensively evaluated on 7 datasets. We show that FDGATII outperforms
GAT- and GCN-based benchmarks in accuracy and performance on fully supervised
tasks, obtaining state-of-the-art results on Chameleon and Cornell datasets
with zero domain-specific graph pre-processing, and demonstrate its versatility
and fairness.
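To make the mechanism concrete, here is a minimal PyTorch sketch of an FDGATII-style layer, assuming a dense adjacency for readability: GATv2-style dynamic attention handles the aggregation, while initial-residual and identity-mapping terms (in the style of GCNII) preserve the input features. The class name, the hyperparameters alpha and beta, and the dense formulation are illustrative assumptions, not the authors' implementation, which uses sparse attention.

```python
# Hedged sketch of an FDGATII-style layer (NOT the authors' code):
# dynamic (GATv2-style) self-attention + initial residual + identity mapping.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FDGATIILayerSketch(nn.Module):
    def __init__(self, dim, alpha=0.1, beta=0.5):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)   # shared feature transform
        self.a = nn.Linear(dim, 1, bias=False)     # attention scorer
        self.W2 = nn.Linear(dim, dim, bias=False)  # identity-mapped transform
        self.alpha = alpha                         # initial-residual weight
        self.beta = beta                           # identity-mapping weight

    def forward(self, h, h0, adj):
        # h: (N, d) current features; h0: (N, d) layer-0 features;
        # adj: (N, N) adjacency with self-loops (1 = edge, 0 = no edge).
        Wh = self.W(h)
        # Dynamic attention (GATv2 style): the nonlinearity comes *before*
        # the scorer, so neighbour rankings can differ per query node.
        pair = Wh.unsqueeze(1) + Wh.unsqueeze(0)          # (N, N, d)
        e = self.a(F.leaky_relu(pair, 0.2)).squeeze(-1)   # (N, N)
        att = torch.softmax(e.masked_fill(adj == 0, float('-inf')), dim=-1)
        agg = att @ Wh                                    # attended aggregation
        # Initial residual: anchor every layer to the input features,
        # countering over-smoothing in deep stacks.
        mixed = (1 - self.alpha) * agg + self.alpha * h0
        # Identity mapping: keep the layer close to an identity function.
        return F.relu((1 - self.beta) * mixed + self.beta * self.W2(mixed))
```

A deep stack of such layers would reuse the same h0 at every depth; that anchoring is what lets GCNII-style models grow deep without washing out node features.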
Related papers
- Revisiting Graph Neural Networks on Graph-level Tasks: Comprehensive Experiments, Analysis, and Improvements [54.006506479865344]
We propose a unified evaluation framework for graph-level Graph Neural Networks (GNNs).
This framework provides a standardized setting to evaluate GNNs across diverse datasets.
We also propose a novel GNN model with enhanced expressivity and generalization capabilities.
arXiv Detail & Related papers (2025-01-01T08:48:53Z)
- Understanding When and Why Graph Attention Mechanisms Work via Node Classification [8.098074314691125]
We show that graph attention mechanisms can enhance classification performance when structure noise exceeds feature noise.
We propose a novel multi-layer Graph Attention Network (GAT) architecture that significantly outperforms single-layer GATs in achieving perfect node classification.
arXiv Detail & Related papers (2024-12-20T02:22:38Z)
- Dual-Frequency Filtering Self-aware Graph Neural Networks for Homophilic and Heterophilic Graphs [60.82508765185161]
We propose Dual-Frequency Filtering Self-aware Graph Neural Networks (DFGNN).
DFGNN integrates low-pass and high-pass filters to extract smooth and detailed topological features.
It dynamically adjusts filtering ratios to accommodate both homophilic and heterophilic graphs.
arXiv Detail & Related papers (2024-11-18T04:57:05Z)
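The dual-frequency mechanism above is concrete enough to sketch. Below is a rough PyTorch illustration, not the DFGNN authors' code: the low-pass component is a neighbourhood average, the high-pass component is its complement, and a learned per-node gate stands in for the dynamically adjusted filtering ratio; the class name and gating form are assumptions.

```python
# Hedged sketch of dual-frequency graph filtering (assumed form, not DFGNN's code).
import torch
import torch.nn as nn

class DualFrequencySketch(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, 1)  # learned per-node mixing ratio

    def forward(self, x, a_hat):
        # x: (N, d) features; a_hat: (N, N) normalized adjacency with self-loops.
        low = a_hat @ x    # low-pass: smooth signal, favoured under homophily
        high = x - low     # high-pass: detail signal, favoured under heterophily
        g = torch.sigmoid(self.gate(torch.cat([low, high], dim=-1)))
        return g * low + (1 - g) * high
```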
- SIG: Efficient Self-Interpretable Graph Neural Network for Continuous-time Dynamic Graphs [34.269958289295516]
We aim to predict future links within the dynamic graph while simultaneously providing causal explanations for these predictions.
To tackle these challenges, we propose a novel causal inference model, namely the Independent and Confounded Causal Model (ICCM).
Our proposed model significantly outperforms existing methods across link prediction accuracy, explanation quality, and robustness to shortcut features.
arXiv Detail & Related papers (2024-05-29T13:09:33Z)
- Representation Learning on Heterophilic Graph with Directional Neighborhood Attention [8.493802098034255]
The Graph Attention Network (GAT) is one of the most popular Graph Neural Network (GNN) architectures.
GAT lacks the ability to capture long-range and global graph information, leading to unsatisfactory performance on some datasets.
We propose Directional Graph Attention Network (DGAT) to combine the feature-based attention with the global directional information extracted from the graph topology.
arXiv Detail & Related papers (2024-03-03T10:59:16Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Beyond the Gates of Euclidean Space: Temporal-Discrimination-Fusions and Attention-based Graph Neural Network for Human Activity Recognition [5.600003119721707]
Human activity recognition (HAR) through wearable devices has received much interest due to its numerous applications in fitness tracking, wellness screening, and supported living.
Traditional deep learning (DL) has set the state-of-the-art performance for the HAR domain.
We propose an approach based on Graph Neural Networks (GNNs) for structuring the input representation and exploiting the relations among the samples.
arXiv Detail & Related papers (2022-06-10T03:04:23Z)
- Multi-hop Attention Graph Neural Network [70.21119504298078]
Multi-hop Attention Graph Neural Network (MAGNA) is a principled way to incorporate multi-hop context information into every layer of attention computation.
We show that MAGNA captures large-scale structural information in every layer, and has a low-pass effect that eliminates noisy high-frequency information from graph data.
arXiv Detail & Related papers (2020-09-29T22:41:19Z)
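MAGNA's central operation, attention diffusion, admits a short sketch: each layer aggregates over a geometrically weighted series of attention-matrix powers, sum_k alpha*(1-alpha)^k * A^k, rather than one-hop attention alone. The dense truncated form below is illustrative; the paper computes the diffusion more efficiently, and the function name and hop cutoff are assumptions.

```python
# Hedged sketch of MAGNA-style attention diffusion (dense, truncated).
import torch

def multi_hop_attention(att, hops=3, alpha=0.15):
    # att: (N, N) row-stochastic one-hop attention matrix.
    # Returns sum_{k=0..hops} alpha * (1 - alpha)**k * att^k, a multi-hop
    # attention matrix whose geometric decay acts as a low-pass filter.
    diffused = torch.zeros_like(att)
    power = torch.eye(att.size(0))
    for k in range(hops + 1):
        diffused = diffused + alpha * (1 - alpha) ** k * power
        power = power @ att
    return diffused
```

The geometric decay damps high-frequency components while spreading context across hops, matching the low-pass effect noted in the summary.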
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
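A minimal sketch of the adaptive mechanism described above, under the usual reading of DAGNN: transformation and propagation are decoupled, transformed features are propagated over k hops, and a learned retention score selects per node how much of each receptive-field size to keep. Names and the exact scoring form are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of DAGNN-style adaptive propagation (assumed form).
import torch
import torch.nn as nn

class AdaptivePropagationSketch(nn.Module):
    def __init__(self, dim, k=10):
        super().__init__()
        self.k = k
        self.score = nn.Linear(dim, 1)  # retention score per node per hop

    def forward(self, z, a_hat):
        # z: (N, d) MLP-transformed features; a_hat: (N, N) normalized adjacency.
        reps = [z]
        for _ in range(self.k):
            reps.append(a_hat @ reps[-1])      # widen the receptive field
        H = torch.stack(reps, dim=1)           # (N, k+1, d)
        s = torch.sigmoid(self.score(H))       # (N, k+1, 1)
        return (s * H).sum(dim=1)              # adaptive combination over hops
```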
- Graphs, Entities, and Step Mixture [11.162937043309478]
We propose a new graph neural network that considers both edge-based neighborhood relationships and node-based entity features.
Through extensive experiments, we show that the proposed GESM achieves state-of-the-art or comparable performance on eight benchmark graph datasets.
arXiv Detail & Related papers (2020-05-18T06:57:02Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)