Curved Markov Chain Monte Carlo for Network Learning
- URL: http://arxiv.org/abs/2110.03413v2
- Date: Mon, 11 Oct 2021 05:24:14 GMT
- Title: Curved Markov Chain Monte Carlo for Network Learning
- Authors: John Sigbeku, Emil Saucan, and Anthea Monod
- Abstract summary: We present a geometrically enhanced Markov chain Monte Carlo sampler for networks based on a discrete curvature measure defined on graphs.
We show that integrating curvature into the sampler results in faster convergence to a wide range of network statistics, as demonstrated on deterministic networks drawn from real-world data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a geometrically enhanced Markov chain Monte Carlo sampler for
networks based on a discrete curvature measure defined on graphs. Specifically,
we incorporate the concept of graph Forman curvature into sampling procedures
on both the nodes and edges of a network explicitly, via the transition
probability of the Markov chain, as well as implicitly, via the target
stationary distribution, which gives a novel, curved Markov chain Monte Carlo
approach to learning networks. We show that integrating curvature into the
sampler results in faster convergence to a wide range of network statistics, as
demonstrated on deterministic networks drawn from real-world data.
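To make the construction concrete, the following is a minimal sketch, not the authors' exact algorithm: it computes the combinatorial Forman curvature of each edge of an unweighted graph, F(u, v) = 4 - deg(u) - deg(v), and uses it to bias the proposal distribution of a Metropolis-Hastings random walk on the nodes. The helper names, the softmax temperature beta, the uniform target distribution, and the use of networkx are assumptions made for illustration; the paper itself also folds curvature into the target stationary distribution and samples edges as well as nodes.

# Minimal illustrative sketch (assumptions as stated above; not the paper's
# exact construction): Forman curvature on an unweighted graph and a
# curvature-biased Metropolis-Hastings random walk over its nodes.
import math
import random

import networkx as nx


def forman_curvature(G):
    # Combinatorial Forman curvature of an edge in an unweighted graph:
    # F(u, v) = 4 - deg(u) - deg(v).
    return {(u, v): 4 - G.degree(u) - G.degree(v) for u, v in G.edges()}


def curved_mh_step(G, current, curvature, beta=1.0):
    # One Metropolis-Hastings step whose proposal favours neighbours reached
    # through high-curvature edges; the uniform target over nodes and the
    # softmax temperature `beta` are illustrative assumptions.
    def edge_weight(u, v):
        f = curvature[(u, v)] if (u, v) in curvature else curvature[(v, u)]
        return math.exp(beta * f)

    def proposal(u):
        nbrs = list(G.neighbors(u))
        weights = [edge_weight(u, v) for v in nbrs]
        total = sum(weights)
        return nbrs, [w / total for w in weights]

    nbrs, probs = proposal(current)
    proposed = random.choices(nbrs, weights=probs)[0]
    q_forward = probs[nbrs.index(proposed)]

    back_nbrs, back_probs = proposal(proposed)
    q_backward = back_probs[back_nbrs.index(current)]

    # Uniform target distribution, so the acceptance ratio reduces to the
    # ratio of reverse and forward proposal probabilities.
    accept = min(1.0, q_backward / q_forward)
    return proposed if random.random() < accept else current


G = nx.karate_club_graph()
F = forman_curvature(G)
node, visits = 0, []
for _ in range(5000):
    node = curved_mh_step(G, node, F, beta=0.5)
    visits.append(node)

Biasing proposals toward (or away from) high-curvature edges is only one of several ways curvature could enter the chain; the Metropolis-Hastings acceptance step keeps the walk reversible with respect to whichever target distribution is chosen.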
Related papers
- Symmetry-driven embedding of networks in hyperbolic space [0.4779196219827508]
Hyperbolic models can reproduce the heavy-tailed degree distribution, high clustering, and hierarchical structure of empirical networks.
Current algorithms for finding the hyperbolic coordinates of networks, however, do not quantify uncertainty in the inferred coordinates.
We present BIGUE, a Markov chain Monte Carlo algorithm that samples the posterior distribution of a Bayesian hyperbolic random graph model.
arXiv Detail & Related papers (2024-06-15T18:44:02Z)
- Ai-Sampler: Adversarial Learning of Markov kernels with involutive maps [28.229819253644862]
We propose a method to parameterize and train transition kernels of Markov chains to achieve efficient sampling and good mixing.
This training procedure minimizes the total variation distance between the stationary distribution of the chain and the empirical distribution of the data.
arXiv Detail & Related papers (2024-06-04T17:00:14Z)
- Revealing Decurve Flows for Generalized Graph Propagation [108.80758541147418]
This study addresses the limitations of the traditional analysis of message-passing, central to graph learning, by defining generalized propagation with directed and weighted graphs.
We include a preliminary exploration of learned propagation patterns in datasets, a first in the field.
arXiv Detail & Related papers (2024-02-13T14:13:17Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Generative Flow Networks: a Markov Chain Perspective [93.9910025411313]
We propose a new perspective for GFlowNets using Markov chains, showing a unifying view for GFlowNets regardless of the nature of the state space.
Positioning GFlowNets under the same theoretical framework as MCMC methods also allows us to identify the similarities between both frameworks.
arXiv Detail & Related papers (2023-07-04T01:28:02Z)
- A Bayesian Take on Gaussian Process Networks [1.7188280334580197]
This work implements Monte Carlo and Markov Chain Monte Carlo methods to sample from the posterior distribution of network structures.
We show that our method outperforms state-of-the-art algorithms in recovering the graphical structure of the network.
arXiv Detail & Related papers (2023-06-20T08:38:31Z)
- Graph Condensation via Receptive Field Distribution Matching [61.71711656856704]
This paper focuses on creating a small graph to represent the original graph, so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to synthesize a small graph whose receptive fields share a similar distribution.
arXiv Detail & Related papers (2022-06-28T02:10:05Z)
- Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over directed acyclic graphs (DAGs) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Merge-split Markov chain Monte Carlo for community detection [0.0]
We present a Markov chain Monte Carlo scheme based on merges and splits of groups that is capable of efficiently sampling from the posterior distribution of network partitions.
We demonstrate how schemes based on the move of single nodes between groups systematically fail at correctly sampling from the posterior distribution even on small networks.
arXiv Detail & Related papers (2020-03-16T08:26:35Z)