Neural-prior stochastic block model
- URL: http://arxiv.org/abs/2303.09995v2
- Date: Wed, 6 Sep 2023 13:36:52 GMT
- Title: Neural-prior stochastic block model
- Authors: O. Duranthon, L. Zdeborová
- Abstract summary: We propose to model the communities as being determined by the node attributes rather than the opposite.
We propose an algorithm, stemming from statistical physics, based on a combination of belief propagation and approximate message passing.
The proposed model and algorithm can be used as a benchmark for both theory and algorithms.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The stochastic block model (SBM) is widely studied as a benchmark for graph
clustering aka community detection. In practice, graph data often come with
node attributes that bear additional information about the communities.
Previous works modeled such data by considering that the node attributes are
generated from the node community memberships. In this work, motivated by a
recent surge of works in signal processing using deep neural networks as
priors, we propose to model the communities as being determined by the node
attributes rather than the opposite. We define the corresponding model; we call
it the neural-prior SBM. We propose an algorithm, stemming from statistical
physics, based on a combination of belief propagation and approximate message
passing. We analyze the performance of the algorithm as well as the
Bayes-optimal performance. We identify detectability and exact recovery phase
transitions, as well as an algorithmically hard region. The proposed model and
algorithm can be used as a benchmark for both theory and algorithms. To
illustrate this, we compare the optimal performances to the performance of
simple graph neural networks.
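The generative process is simple enough to sketch. The following is a minimal simulation of the model under an assumed instance of the neural prior (a single-layer sign readout of Gaussian attributes); the sizes, degrees, and prior architecture are illustrative choices, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 50           # nodes, attribute dimension (illustrative values)
c_in, c_out = 8.0, 2.0    # average intra-/inter-community degrees (illustrative)

# Node attributes: i.i.d. Gaussian features.
X = rng.standard_normal((n, d))

# Neural prior: communities are a deterministic function of the attributes,
# here sign(X w) with a random teacher vector w (the simplest such prior).
w = rng.standard_normal(d)
s = np.sign(X @ w)        # community labels in {-1, +1}

# SBM graph given the communities: the edge probability depends only on
# whether the two endpoints share a community.
P = np.where(np.equal.outer(s, s), c_in / n, c_out / n)
A = np.triu(rng.random((n, n)) < P, k=1)
A = A | A.T               # symmetric adjacency, no self-loops
```

Inference runs in the opposite direction, recovering s from A and X, which is where the combination of belief propagation and approximate message passing operates.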
Related papers
- Sparse Decomposition of Graph Neural Networks [20.768412002413843]
We propose an approach to reduce the number of nodes that are included during aggregation.
We achieve this through a sparse decomposition, learning to approximate node representations using a weighted sum of linearly transformed features.
We demonstrate via extensive experiments that our method outperforms other baselines designed for inference speedup.
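A hedged sketch of the stated decomposition: each node representation is approximated by a weighted sum of linearly transformed features of a small subset of nodes. The subset and weights below are random stand-ins for the quantities the method learns.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d_in, d_out, k = 200, 16, 8, 5   # k = nodes kept per representation (assumed)

X = rng.standard_normal((n, d_in))
W = rng.standard_normal((d_in, d_out))              # shared linear transform

H = np.zeros((n, d_out))
for v in range(n):
    support = rng.choice(n, size=k, replace=False)  # stand-in for a learned sparse support
    alpha = rng.standard_normal(k)                  # stand-in for learned weights
    H[v] = alpha @ (X[support] @ W)                 # weighted sum of transformed features
```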
arXiv Detail & Related papers (2024-10-25T17:52:16Z)
- Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have remarkably improved graph matching accuracy.
We propose a graph neural network (GNN) based approach that combines the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z)
- Generalizing Backpropagation for Gradient-Based Interpretability [103.2998254573497]
We show that the gradient of a model is a special case of a more general formulation using semirings.
This observation allows us to generalize the backpropagation algorithm to efficiently compute other interpretable statistics.
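A toy illustration of the semiring view, not the paper's implementation: on a computation DAG, the gradient is the sum over paths of the product of local derivatives, i.e., a (+, *) semiring computation; swapping the semiring yields other statistics, e.g., (max, *) gives the single most influential path. The graph and derivative values below are made up.

```python
# Local derivatives dv/du on a tiny DAG  x -> {a, b} -> L.
edges = {("x", "a"): 2.0, ("x", "b"): -1.0, ("a", "L"): 0.5, ("b", "L"): 3.0}
paths = [[("x", "a"), ("a", "L")], [("x", "b"), ("b", "L")]]

def path_weight(path):
    w = 1.0
    for e in path:
        w *= edges[e]                           # multiply local derivatives along the path
    return w

grad = sum(path_weight(p) for p in paths)       # (+, *) semiring: ordinary dL/dx = -2.0
best = max(path_weight(p) for p in paths)       # (max, *) semiring: top-path statistic = 1.0
```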
arXiv Detail & Related papers (2023-07-06T15:19:53Z)
- Optimality of Message-Passing Architectures for Sparse Graphs [13.96547777184641]
We study the node classification problem on feature-decorated graphs in the sparse setting, i.e., when the expected degree of a node is $O(1)$ in the number of nodes.
We introduce a notion of Bayes optimality for node classification tasks, called local Bayes optimality.
We show that the optimal message-passing architecture interpolates between a standard MLP in the regime of low graph signal and a typical convolution in the regime of high graph signal.
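A minimal sketch of what such an interpolation could look like, assuming a single mixing weight `lam` standing in for the graph-signal strength; this is an illustration, not the paper's derived architecture.

```python
import numpy as np

def interpolating_layer(A, X, W, lam):
    """lam = 0: plain MLP on node features; lam = 1: mean-neighbor convolution."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1)   # avoid division by zero
    return np.tanh((1 - lam) * (X @ W) + lam * ((A @ X) / deg) @ W)
```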
arXiv Detail & Related papers (2023-05-17T17:31:20Z)
- Towards Better Out-of-Distribution Generalization of Neural Algorithmic Reasoning Tasks [51.8723187709964]
We study the OOD generalization of neural algorithmic reasoning tasks.
The goal is to learn an algorithm from input-output pairs using deep neural networks.
arXiv Detail & Related papers (2022-11-01T18:33:20Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
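A hedged sketch of the complexity reduction: instead of all-pairs messages (O(n^2)), each node aggregates from a small set of sampled senders. Sampling is uniform here; in the paper the samples and weights are predicted dynamically per node.

```python
import numpy as np

rng = np.random.default_rng(7)
n, d, k = 4096, 64, 9                      # feature-map positions, channels, samples per node
X = rng.standard_normal((n, d))

idx = rng.integers(0, n, size=(n, k))      # k sampled senders per node (uniform stand-in)
w = rng.random((n, k))
w /= w.sum(axis=1, keepdims=True)          # normalized message weights
H = np.einsum("nk,nkd->nd", w, X[idx])     # O(n k d) aggregation instead of O(n^2 d)
```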
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Invertible Neural Networks for Graph Prediction [22.140275054568985]
In this work, we address conditional generation using deep invertible neural networks.
We adopt an end-to-end training approach since our objective is to address prediction and generation in the forward and backward processes at once.
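A minimal additive-coupling sketch of the invertible idea: the same weights run forward for prediction and are exactly inverted for generation. This is a generic RealNVP-style block used as an illustration, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 8
W = rng.standard_normal((d // 2, d // 2)) * 0.1

def forward(x):
    x1, x2 = x[: d // 2], x[d // 2 :]
    return np.concatenate([x1, x2 + np.tanh(x1 @ W)])   # additive coupling

def inverse(y):
    y1, y2 = y[: d // 2], y[d // 2 :]
    return np.concatenate([y1, y2 - np.tanh(y1 @ W)])   # exact inverse

x = rng.standard_normal(d)
assert np.allclose(inverse(forward(x)), x)              # invertibility check
```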
arXiv Detail & Related papers (2022-06-02T17:28:33Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
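A sketch of the low-rank trick mentioned above: if the propagation matrix factors as S ≈ U Vᵀ with rank r << n, propagation costs O(nrd) instead of O(n²d). The factors below are random placeholders for learned ones.

```python
import numpy as np

rng = np.random.default_rng(5)
n, d, r = 2000, 32, 16
X = rng.standard_normal((n, d))
U = rng.standard_normal((n, r)) / np.sqrt(r)   # placeholder learned factor
V = rng.standard_normal((n, r)) / np.sqrt(n)   # placeholder learned factor

H = U @ (V.T @ X)   # low-rank propagation: the n x n matrix is never materialized
```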
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Semi-Supervised Node Classification on Graphs: Markov Random Fields vs. Graph Neural Networks [38.760186021633146]
Semi-supervised node classification on graph-structured data has many applications such as fraud detection, fake account and review detection, user's private attribute inference in social networks, and community detection.
Various methods such as pairwise Markov Random Fields (pMRF) and graph neural networks were developed for semi-supervised node classification.
pMRF is more efficient than graph neural networks, but existing pMRF-based methods are less accurate due to a key limitation: they assume a constant edge potential for all edges.
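A hedged sketch of the constant-potential limitation, using a linearized belief-propagation update for concreteness (one assumed pMRF variant, not the paper's exact method): the same iteration runs with one shared coupling or with per-edge weights.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
A = np.triu(rng.random((n, n)) < 0.05, k=1).astype(float)
A = A + A.T                               # sparse symmetric adjacency
prior = rng.random(n) - 0.5               # prior scores from labeled nodes

def linearized_bp(coupling, iters=20):
    p = prior.copy()
    for _ in range(iters):
        p = prior + coupling @ p          # propagate beliefs over edges
    return p

p_const = linearized_bp(0.1 * A)          # one constant edge potential for all edges
W = np.triu(A, 1) * (0.05 + 0.1 * rng.random((n, n)))
p_edge = linearized_bp(W + W.T)           # per-edge potentials (learned in practice)
```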
arXiv Detail & Related papers (2020-12-24T03:46:08Z)
- Neuralizing Efficient Higher-order Belief Propagation [19.436520792345064]
We propose to combine graphical model inference with graph neural networks to learn better node and graph representations.
We derive an efficient approximate sum-product loopy belief propagation inference algorithm for higher-order PGMs.
Our model indeed captures higher-order information, substantially outperforming state-of-the-art $k$-order graph neural networks on molecular datasets.
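A compact sum-product step for one higher-order (here 3-ary) factor, as a hedged illustration of the quantity such an algorithm must compute efficiently: the factor-to-variable message marginalizes the factor table against the incoming messages. Table and messages are random.

```python
import numpy as np

rng = np.random.default_rng(6)
q = 2                                    # binary variables
F = rng.random((q, q, q))                # potential table of a 3-ary factor f(x1, x2, x3)
m2, m3 = rng.random(q), rng.random(q)    # incoming variable-to-factor messages

# Factor-to-variable message: m(x1) = sum_{x2, x3} F(x1, x2, x3) m2(x2) m3(x3)
m_to_x1 = np.einsum("abc,b,c->a", F, m2, m3)
m_to_x1 /= m_to_x1.sum()                 # normalize
```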
arXiv Detail & Related papers (2020-10-19T07:51:31Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
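A minimal sketch of a dyadic-independence (edge-independent) distribution of the kind described: with block-level parameters, every edge is an independent Bernoulli determined by its endpoints' blocks, which is what makes sampling scale. The block parameters below are made up.

```python
import numpy as np

rng = np.random.default_rng(4)
n, B = 1000, 3
block = rng.integers(0, B, size=n)       # block membership
theta = np.array([[-4.0, -6.0, -7.0],    # log-odds per block pair (assumed values)
                  [-6.0, -4.5, -6.5],
                  [-7.0, -6.5, -3.5]])

logits = theta[block][:, block]          # n x n edge log-odds
P = 1.0 / (1.0 + np.exp(-logits))        # independent edge probabilities
A = np.triu(rng.random((n, n)) < P, k=1)
A = A | A.T                              # symmetric sample, no self-loops
```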
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.