On Classification Thresholds for Graph Attention with Edge Features
- URL: http://arxiv.org/abs/2210.10014v1
- Date: Tue, 18 Oct 2022 17:32:18 GMT
- Title: On Classification Thresholds for Graph Attention with Edge Features
- Authors: Kimon Fountoulakis, Dake He, Silvio Lattanzi, Bryan Perozzi, Anton
Tsitsulin, Shenghao Yang
- Abstract summary: We analyze, theoretically and empirically, graph attention networks and their ability to correctly label nodes in a classic classification task.
We consider a general graph attention mechanism that takes random edge features as input to determine the attention coefficients.
- Score: 26.01769042481568
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years we have seen the rise of graph neural networks for
prediction tasks on graphs. One of the dominant architectures is graph
attention, due to its ability to make predictions using weighted edge features
and not only node features. In this paper we analyze, theoretically and
empirically, graph attention networks and their ability to correctly label
nodes in a classic classification task. More specifically, we study the
performance of graph attention on the classic contextual stochastic block model
(CSBM). In the CSBM the node and edge features are obtained from a mixture of
Gaussians and the edges from a stochastic block model. We consider a general
graph attention mechanism that takes random edge features as input to determine
the attention coefficients. We study two cases. In the first, when the edge
features are noisy, we prove that the majority of the attention coefficients
are uniform up to a constant factor. This allows us to prove that graph
attention with edge features is no better than simple graph convolution for
achieving perfect node classification. In the second, we prove that when the
edge features are clean, graph attention can distinguish intra-class from
inter-class edges, and this makes graph attention better than classic graph
convolution.
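To make the setup concrete, the following is a minimal sketch of the model the abstract describes: sampling a two-class CSBM (node features from a Gaussian mixture, edges from a stochastic block model) and computing softmax-normalized attention coefficients from noisy edge scores. The function names, the 1-D features, and the product-based edge score are illustrative assumptions, not the paper's exact mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_csbm(n=100, p=0.5, q=0.1, mu=1.0, sigma=1.0):
    """Sample a toy two-class CSBM (illustrative, not the paper's exact model).

    Labels are +/-1; node features come from a Gaussian mixture with
    means +/-mu; an edge appears with probability p within a class and
    q across classes.
    """
    labels = rng.choice([-1, 1], size=n)
    feats = mu * labels + sigma * rng.standard_normal(n)  # 1-D node features
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            prob = p if labels[i] == labels[j] else q
            if rng.random() < prob:
                edges.append((i, j))
    return labels, feats, edges

def attention_coeffs(feats, edges, edge_noise=0.0):
    """Toy attention: score each edge from a (possibly noisy) edge feature,
    then softmax-normalize the scores over each node's neighbourhood."""
    scores = {}
    for (i, j) in edges:
        e_ij = feats[i] * feats[j] + edge_noise * rng.standard_normal()
        scores[(i, j)] = e_ij
        scores[(j, i)] = e_ij
    nbrs = {}
    for (i, j) in scores:
        nbrs.setdefault(i, []).append(j)
    coeffs = {}
    for i, js in nbrs.items():
        s = np.array([scores[(i, j)] for j in js])
        w = np.exp(s - s.max())  # numerically stable softmax
        w /= w.sum()
        for j, wj in zip(js, w):
            coeffs[(i, j)] = wj
    return coeffs
```

When `edge_noise` is large, the softmax scores concentrate around their mean and the coefficients over a neighbourhood approach the uniform distribution, which is the regime in which the paper shows attention cannot beat plain convolution.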
Related papers
- Random Geometric Graph Alignment with Graph Neural Networks [8.08963638000146]
We show that a graph neural network can recover an unknown one-to-one mapping between the vertices of two graphs.
We also prove that our conditions on the noise level are tight up to logarithmic factors.
We demonstrate that when the noise level is at least constant, this direct matching fails to achieve perfect recovery, while the graph neural network can tolerate a noise level growing as fast as a power of the size of the graph.
arXiv Detail & Related papers (2024-02-12T00:18:25Z) - How to Find Your Friendly Neighborhood: Graph Attention Design with
Self-Supervision [16.86132592140062]
We propose a self-supervised graph attention network (SuperGAT) for noisy graphs.
We exploit two attention forms compatible with a self-supervised task to predict edges.
By encoding edges, SuperGAT learns more expressive attention in distinguishing mislinked neighbors.
arXiv Detail & Related papers (2022-04-11T05:45:09Z) - Graph Attention Retrospective [14.52271219759284]
Graph-based learning is a rapidly growing sub-field of machine learning with applications in social networks, citation networks, and bioinformatics.
In this paper, we theoretically study the behaviour of graph attention networks.
We show that in an "easy" regime, where the distance between the means of the Gaussians is large enough, graph attention is able to distinguish inter-class from intra-class edges.
In the "hard" regime, we show that every attention mechanism fails to distinguish intra-class from inter-class edges.
arXiv Detail & Related papers (2022-02-26T04:58:36Z) - Neighborhood Random Walk Graph Sampling for Regularized Bayesian Graph
Convolutional Neural Networks [0.6236890292833384]
In this paper, we propose a novel algorithm called Bayesian Graph Convolutional Network using Neighborhood Random Walk Sampling (BGCN-NRWS)
BGCN-NRWS uses a Markov Chain Monte Carlo (MCMC) based graph sampling algorithm utilizing graph structure, reduces overfitting by using a variational inference layer, and yields consistently competitive classification results compared to the state-of-the-art in semi-supervised node classification.
arXiv Detail & Related papers (2021-12-14T20:58:27Z) - Edge but not Least: Cross-View Graph Pooling [76.71497833616024]
This paper presents a cross-view graph pooling (Co-Pooling) method to better exploit crucial graph structure information.
Through cross-view interaction, edge-view pooling and node-view pooling seamlessly reinforce each other to learn more informative graph-level representations.
arXiv Detail & Related papers (2021-09-24T08:01:23Z) - Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object and SPair-71k)
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z) - Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
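The line-graph reduction described above (each original edge becomes a line-graph node, and two such nodes are adjacent exactly when the underlying edges share an endpoint) can be sketched in a few lines. This is the generic graph-theoretic construction, not the paper's implementation:

```python
def line_graph(edges):
    """Build the line graph of an undirected graph given as an edge list.

    Each original edge becomes a node of the line graph; two line-graph
    nodes are adjacent iff their original edges share an endpoint.
    """
    nodes = sorted({tuple(sorted(e)) for e in edges})  # line-graph nodes
    lg_edges = []
    for a in range(len(nodes)):
        for b in range(a + 1, len(nodes)):
            if set(nodes[a]) & set(nodes[b]):  # shared endpoint
                lg_edges.append((nodes[a], nodes[b]))
    return nodes, lg_edges
```

For example, the triangle on vertices {0, 1, 2} maps to a line graph that is itself a triangle, while a 3-vertex path maps to a single line-graph edge; predicting a link in the original graph then amounts to classifying the corresponding line-graph node.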
arXiv Detail & Related papers (2020-10-20T05:54:31Z) - Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z) - CatGCN: Graph Convolutional Networks with Categorical Node Features [99.555850712725]
CatGCN is tailored for graph learning when the node features are categorical.
We train CatGCN in an end-to-end fashion and demonstrate it on semi-supervised node classification.
arXiv Detail & Related papers (2020-09-11T09:25:17Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.