Optimality of Message-Passing Architectures for Sparse Graphs
- URL: http://arxiv.org/abs/2305.10391v2
- Date: Sat, 16 Dec 2023 03:28:06 GMT
- Title: Optimality of Message-Passing Architectures for Sparse Graphs
- Authors: Aseem Baranwal and Kimon Fountoulakis and Aukosh Jagannath
- Abstract summary: We study the node classification problem on feature-decorated graphs in the sparse setting, i.e., when the expected degree of a node is $O(1)$ in the number of nodes.
We introduce a notion of Bayes optimality for node classification tasks, called local Bayes optimality.
We show that the optimal message-passing architecture interpolates between a standard MLP in the regime of low graph signal and a typical convolution in the regime of high graph signal.
- Score: 13.96547777184641
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the node classification problem on feature-decorated graphs in the
sparse setting, i.e., when the expected degree of a node is $O(1)$ in the
number of nodes, in the fixed-dimensional asymptotic regime, i.e., the
dimension of the feature data is fixed while the number of nodes is large. Such
graphs are typically known to be locally tree-like. We introduce a notion of
Bayes optimality for node classification tasks, called asymptotic local Bayes
optimality, and compute the optimal classifier according to this criterion for
a fairly general statistical data model with arbitrary distributions of the
node features and edge connectivity. The optimal classifier is implementable
using a message-passing graph neural network architecture. We then compute the
generalization error of this classifier and compare its performance against
existing learning methods theoretically on a well-studied statistical model
with naturally identifiable signal-to-noise ratios (SNRs) in the data. We find
that the optimal message-passing architecture interpolates between a standard
MLP in the regime of low graph signal and a typical convolution in the regime
of high graph signal. Furthermore, we prove a corresponding non-asymptotic
result.
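As a rough illustration of the interpolation result, a single layer that blends an MLP with a graph convolution can be sketched as below. The function name, the scalar blend `alpha`, and the weight matrices are illustrative assumptions, not the paper's exact optimal classifier (which derives the combination from the data model's likelihoods and SNRs).

```python
import numpy as np

def blended_layer(X, adj, W_self, W_neigh, alpha):
    """One layer that interpolates between an MLP (alpha = 0) and a
    graph convolution (alpha = 1) by scaling the neighbour term."""
    self_term = X @ W_self            # MLP part: transform each node's own features
    neigh_term = adj @ (X @ W_neigh)  # convolution part: sum transformed neighbour features
    return self_term + alpha * neigh_term
```

At `alpha = 0` the output depends only on a node's own features; at `alpha = 1` each node also aggregates its neighbours, as in a standard graph convolution.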
Related papers
- Sparse Decomposition of Graph Neural Networks [20.768412002413843]
We propose an approach to reduce the number of nodes that are included during aggregation.
We achieve this through a sparse decomposition, learning to approximate node representations using a weighted sum of linearly transformed features.
We demonstrate via extensive experiments that our method outperforms other baselines designed for inference speedup.
arXiv Detail & Related papers (2024-10-25T17:52:16Z)
- Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have improved the graph matching accuracy remarkably.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Addressing Heterophily in Node Classification with Graph Echo State Networks [11.52174067809364]
We address the challenges of heterophilic graphs with Graph Echo State Network (GESN) for node classification.
GESN is a reservoir computing model for graphs, where node embeddings are computed by an untrained message-passing function.
Our experiments show that reservoir models are able to achieve better or comparable accuracy with respect to most fully trained deep models.
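The reservoir computation described above can be sketched roughly as a fixed number of untrained message-passing iterations; the initialisation scales, matrix names, and iteration count here are illustrative assumptions rather than GESN's exact formulation (which runs the recursion to a fixed point under a stability constraint on the recurrent weights).

```python
import numpy as np

def reservoir_embeddings(X, adj, hidden=8, iters=10, seed=0):
    """Untrained message passing: h_v <- tanh(W_in x_v + W_hat * sum of neighbour h_u).
    Only a readout on top of H would be trained (e.g. by ridge regression)."""
    rng = np.random.default_rng(seed)
    W_in = rng.normal(scale=0.5, size=(hidden, X.shape[1]))
    W_hat = rng.normal(scale=0.1, size=(hidden, hidden))  # small scale for stability
    H = np.zeros((X.shape[0], hidden))
    for _ in range(iters):
        H = np.tanh(X @ W_in.T + (adj @ H) @ W_hat.T)
    return H
```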
arXiv Detail & Related papers (2023-05-14T19:42:31Z)
- Beyond kNN: Adaptive, Sparse Neighborhood Graphs via Optimal Transport [0.1933681537640272]
Nearest neighbour graphs are widely used to capture the geometry or topology of a dataset.
One of the most common strategies to construct such a graph is based on selecting a fixed number k of nearest neighbours (kNN) for each point.
We propose a simple approach to construct an adaptive neighbourhood graph from a single parameter, based on quadratically regularised optimal transport.
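In a simplified, single-query view, quadratically regularised optimal transport assigns neighbourhood weights by a Euclidean projection onto the probability simplex, which is naturally sparse: small regularisation keeps only the closest points, large regularisation spreads mass more widely. The function names and the parameter `eps` below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex (sort-based)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1)[0][-1]
    theta = (css[rho] - 1) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def adaptive_weights(dists, eps):
    """argmin_p <p, dists> + (eps/2)||p||^2 over the simplex, solved in
    closed form as the projection of -dists/eps onto the simplex."""
    return project_simplex(-np.asarray(dists, dtype=float) / eps)
```

For example, with distances `[0.1, 0.2, 0.5, 1.0]`, `eps = 0.05` puts all weight on the nearest point, while `eps = 1.0` supports three neighbours, so the neighbourhood size adapts to one parameter rather than a fixed k.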
arXiv Detail & Related papers (2022-08-01T04:24:58Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve the performance of semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and have shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationships between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadically independent (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.