How to Find Your Friendly Neighborhood: Graph Attention Design with
Self-Supervision
- URL: http://arxiv.org/abs/2204.04879v1
- Date: Mon, 11 Apr 2022 05:45:09 GMT
- Title: How to Find Your Friendly Neighborhood: Graph Attention Design with
Self-Supervision
- Authors: Dongkwan Kim and Alice Oh
- Abstract summary: We propose a self-supervised graph attention network (SuperGAT) for noisy graphs.
We exploit two attention forms compatible with a self-supervised task to predict edges.
By encoding edges, SuperGAT learns more expressive attention for distinguishing mislinked neighbors.
- Score: 16.86132592140062
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The attention mechanism in graph neural networks is designed to
assign larger weights to important neighbor nodes for better representation.
However, what graph attention learns is not well understood, particularly when
graphs are noisy. In this paper, we propose a self-supervised graph attention
network (SuperGAT), an improved graph attention model for noisy graphs.
Specifically, we exploit two attention forms compatible with a self-supervised
task of predicting edges, whose presence and absence carry inherent information
about the importance of the relationships between nodes. By encoding edges,
SuperGAT learns more expressive attention for distinguishing mislinked
neighbors. We find that two graph characteristics, homophily and average
degree, influence the effectiveness of attention forms and self-supervision.
Our recipe therefore provides guidance on which attention design to use when
those two characteristics are known. Experiments on 17 real-world datasets
demonstrate that the recipe generalizes across 15 of them and that models
designed by the recipe outperform the baselines.
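Below is a minimal single-head sketch of this design, assuming PyTorch; the class, method, and variable names are ours, and the paper's multi-head layers, negative-edge sampler, and hybrid attention variants are omitted. The key mechanism is that the attention logits double as edge-prediction logits for the self-supervised loss.

```python
# A sketch of SuperGAT-style self-supervised attention (illustrative,
# not the authors' implementation).
import torch
import torch.nn.functional as F
from torch import nn

class SelfSupervisedAttention(nn.Module):
    def __init__(self, in_dim, out_dim, form="DP"):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Parameter(torch.randn(2 * out_dim))  # used by the "GO" form
        self.form = form

    def edge_logits(self, z, edges):
        src, dst = edges
        if self.form == "GO":  # original GAT scorer: a^T [W h_i || W h_j]
            return F.leaky_relu(torch.cat([z[src], z[dst]], dim=-1) @ self.a)
        # "DP": scaled dot product between the two endpoint representations
        return (z[src] * z[dst]).sum(-1) / z.size(-1) ** 0.5

    def forward(self, x, edge_index, neg_edge_index=None):
        z = self.W(x)
        e = self.edge_logits(z, edge_index)
        # Normalize logits into attention weights over each node's in-edges.
        alpha, dst = torch.zeros_like(e), edge_index[1]
        for n in dst.unique():
            alpha[dst == n] = F.softmax(e[dst == n], dim=0)
        out = torch.zeros_like(z).index_add_(
            0, dst, alpha.unsqueeze(-1) * z[edge_index[0]])
        # Self-supervision: logits of present edges should be high,
        # logits of sampled absent (negative) edges low.
        ssl_loss = None
        if neg_edge_index is not None:
            neg = self.edge_logits(z, neg_edge_index)
            ssl_loss = F.binary_cross_entropy_with_logits(
                torch.cat([e, neg]),
                torch.cat([torch.ones_like(e), torch.zeros_like(neg)]))
        return out, ssl_loss
```

The recipe keys on two measurable graph characteristics; assuming node labels are available and the edge list stores each undirected edge in both directions, they can be computed as:

```python
def graph_characteristics(edge_index, labels):
    # Edge homophily: fraction of edges joining same-class endpoints.
    src, dst = edge_index
    homophily = (labels[src] == labels[dst]).float().mean().item()
    avg_degree = edge_index.size(1) / labels.size(0)  # in-edges per node
    return homophily, avg_degree
```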
Related papers
- InstructG2I: Synthesizing Images from Multimodal Attributed Graphs [50.852150521561676]
We propose a graph context-conditioned diffusion model called InstructG2I.
InstructG2I first exploits the graph structure and multimodal information to conduct informative neighbor sampling.
A Graph-QFormer encoder adaptively encodes the graph nodes into an auxiliary set of graph prompts to guide the denoising process.
arXiv Detail & Related papers (2024-10-09T17:56:15Z)
- Neighbor Overlay-Induced Graph Attention Network [5.792501481702088]
Graph neural networks (GNNs) have garnered significant attention due to their ability to represent graph data.
This study proposes a neighbor overlay-induced graph attention network (NO-GAT) built on two ideas.
Empirical studies on graph benchmark datasets indicate that the proposed NO-GAT consistently outperforms state-of-the-art models.
arXiv Detail & Related papers (2024-08-16T15:01:28Z)
- Self-attention Dual Embedding for Graphs with Heterophily [6.803108335002346]
Many real-world graphs are heterophilic, which leads to much lower classification accuracy with standard GNNs.
We design a novel GNN which is effective for both heterophilic and homophilic graphs.
We evaluate our algorithm on real-world graphs containing thousands to millions of nodes and show that we achieve state-of-the-art results.
arXiv Detail & Related papers (2023-05-28T09:38:28Z)
- Causally-guided Regularization of Graph Attention Improves Generalizability [69.09877209676266]
We introduce CAR, a general-purpose regularization framework for graph attention networks.
CAR aligns the attention mechanism with the causal effects of active interventions on graph connectivity.
For graphs at social-media-network scale, a CAR-guided graph rewiring approach could combine the scalability of graph convolutional methods with the higher performance of graph attention.
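As a hedged sketch of what such an alignment could look like (our guess at the mechanics, not the CAR implementation): score each edge by how much intervening on it, i.e. removing it, changes the task loss, then penalize attention weights that disagree with those intervention effects.

```python
# Illustrative only: align attention with edge-intervention effects.
import torch

def car_style_penalty(att_weights, loss_with_edge, loss_without_edge):
    # Per-edge causal-effect proxy: loss increase when the edge is removed.
    effect = loss_without_edge - loss_with_edge
    # Penalize attention that is anti-correlated with the intervention effects.
    return -torch.corrcoef(torch.stack([att_weights, effect]))[0, 1]
```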
arXiv Detail & Related papers (2022-10-20T01:29:10Z)
- On Classification Thresholds for Graph Attention with Edge Features [26.01769042481568]
We analyze, theoretically and empirically, graph attention networks and their ability to correctly label nodes in a classic classification task.
We consider a general graph attention mechanism that takes random edge features as input to determine the attention coefficients.
arXiv Detail & Related papers (2022-10-18T17:32:18Z)
- Graph Attention Retrospective [14.52271219759284]
Graph-based learning is a rapidly growing sub-field of machine learning with applications in social networks, citation networks, and bioinformatics.
In this paper, we theoretically study the behaviour of graph attention networks.
We show that in an "easy" regime, where the distance between the means of the Gaussians is large enough, graph attention is able to distinguish inter-class from intra-class edges.
In the "hard" regime, we show that every attention mechanism fails to distinguish intra-class from inter-class edges.
arXiv Detail & Related papers (2022-02-26T04:58:36Z)
- Edge but not Least: Cross-View Graph Pooling [76.71497833616024]
This paper presents a cross-view graph pooling (Co-Pooling) method to better exploit crucial graph structure information.
Through cross-view interaction, edge-view pooling and node-view pooling seamlessly reinforce each other to learn more informative graph-level representations.
arXiv Detail & Related papers (2021-09-24T08:01:23Z)
- Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object, and SPair-71k).
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z)
- Graph Decoupling Attention Markov Networks for Semi-supervised Graph Node Classification [38.52231889960877]
Graph neural networks (GNNs) have become ubiquitous in graph learning tasks such as node classification.
In this paper, we consider the label dependency of graph nodes and propose a decoupling attention mechanism to learn both hard and soft attention.
arXiv Detail & Related papers (2021-04-28T11:44:13Z)
- Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In the prevailing formalism, a link prediction problem is converted into a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Link prediction in the original graph can therefore be solved equivalently as node classification on the corresponding line graph rather than as graph classification; a minimal sketch of this reduction follows.
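The sketch below uses networkx to show the reduction; the graph and candidate link are made up for illustration.

```python
# Each edge of G becomes a node of its line graph L(G); a candidate link
# (u, v) then maps to a single line-graph node to classify.
import networkx as nx

G = nx.Graph([("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")])
u, v = "b", "d"                      # candidate link to predict
H = G.copy()
H.add_edge(u, v)                     # add the candidate edge, then take L(H)
LH = nx.line_graph(H)
target = (u, v) if (u, v) in LH else (v, u)
print(sorted(LH[target]))            # line-graph neighbors used to classify it
```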
arXiv Detail & Related papers (2020-10-20T05:54:31Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)