Learning Networks from Gaussian Graphical Models and Gaussian Free
Fields
- URL: http://arxiv.org/abs/2308.02344v1
- Date: Fri, 4 Aug 2023 14:18:39 GMT
- Title: Learning Networks from Gaussian Graphical Models and Gaussian Free
Fields
- Authors: Subhro Ghosh, Soumendu Sundar Mukherjee, Hoang-Son Tran, Ujan
Gangopadhyay
- Abstract summary: We propose a novel estimator for the weighted network (equivalently, its Laplacian) from repeated measurements of a GFF on the network.
We demonstrate the effectiveness of our estimator with concrete recovery guarantees and bounds on the required sample complexity.
In the setting of networks growing with sample size, we show that for Erdos-Renyi random graphs $G(d,p)$ above the connectivity threshold, network recovery takes place with high probability.
- Score: 2.294014185517203
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We investigate the problem of estimating the structure of a weighted network
from repeated measurements of a Gaussian Graphical Model (GGM) on the network.
In this vein, we consider GGMs whose covariance structures align with the
geometry of the weighted network on which they are based. Such GGMs have been
of longstanding interest in statistical physics, and are referred to as the
Gaussian Free Field (GFF). In recent years, they have attracted considerable
interest in machine learning and theoretical computer science. In this
work, we propose a novel estimator for the weighted network (equivalently, its
Laplacian) from repeated measurements of a GFF on the network, based on the
Fourier analytic properties of the Gaussian distribution. In this pursuit, our
approach exploits complex-valued statistics constructed from observed data,
which are of interest in their own right. We demonstrate the effectiveness of
our estimator with concrete recovery guarantees and bounds on the required
sample complexity. In particular, we show that the proposed statistic achieves
the parametric rate of estimation for fixed network size. In the setting of
networks growing with sample size, we show that for Erdos-Renyi random
graphs $G(d,p)$ above the connectivity threshold, network recovery takes
place with high probability as soon as the sample size $n$
satisfies $n \gg d^4 \log d \cdot p^{-2}$.
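The abstract does not spell out the Fourier-analytic estimator itself, so as a point of reference the sketch below uses the textbook plug-in baseline instead: sample a massive GFF whose precision matrix is $L + m I$ (the small mass $m$ makes the singular Laplacian invertible), then invert the empirical covariance to recover $L$. The function names, the toy graph, and the mass parameter are illustrative assumptions, not the authors' method.

```python
import numpy as np

def graph_laplacian(weights):
    # Weighted graph Laplacian L = D - W from a symmetric weight matrix.
    return np.diag(weights.sum(axis=1)) - weights

def sample_gff(L, n_samples, mass=0.1, seed=0):
    # A massive GFF on the graph: zero-mean Gaussian vectors with
    # precision L + mass*I, i.e. covariance (L + mass*I)^{-1}.
    rng = np.random.default_rng(seed)
    cov = np.linalg.inv(L + mass * np.eye(L.shape[0]))
    return rng.multivariate_normal(np.zeros(L.shape[0]), cov, size=n_samples)

def plugin_laplacian(samples, mass=0.1):
    # Naive plug-in estimator: invert the empirical covariance and
    # subtract the known mass term to recover the Laplacian.
    emp_cov = np.cov(samples, rowvar=False)
    return np.linalg.inv(emp_cov) - mass * np.eye(samples.shape[1])

# Toy weighted path graph on 4 nodes.
W = np.array([[0., 1., 0., 0.],
              [1., 0., 2., 0.],
              [0., 2., 0., 1.],
              [0., 0., 1., 0.]])
L = graph_laplacian(W)
X = sample_gff(L, n_samples=50_000)
L_hat = plugin_laplacian(X)
rel_err = np.linalg.norm(L_hat - L) / np.linalg.norm(L)
```

For a fixed 4-node network this plug-in baseline converges at the parametric rate, but its cost and sample demands scale poorly as the network grows, which is the regime the paper's $n \gg d^4 \log d \cdot p^{-2}$ bound addresses.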
Related papers
- Investigating Parameter-Efficiency of Hybrid QuGANs Based on Geometric Properties of Generated Sea Route Graphs [3.9456729020535013]
We use quantum-classical hybrid generative adversarial networks (QuGANs) to artificially generate graphs of shipping routes.
We compare hybrid QuGANs with classical Generative Adversarial Networks (GANs).
Our results indicate that QuGANs are indeed able to quickly learn and represent underlying geometric properties and distributions.
arXiv Detail & Related papers (2025-01-15T09:08:05Z) - Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs).
We prove a generalization gap between the optimal empirical risk and the optimal statistical risk of this GNN.
The most important observation is that the generalization capability can be realized with one large graph instead of being limited to the size of the graph as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z) - Generalization of Graph Neural Networks is Robust to Model Mismatch [84.01980526069075]
Graph neural networks (GNNs) have demonstrated their effectiveness in various tasks supported by their generalization capabilities.
In this paper, we examine GNNs that operate on geometric graphs generated from manifold models.
Our analysis reveals the robustness of the GNN generalization in the presence of such model mismatch.
arXiv Detail & Related papers (2024-08-25T16:00:44Z) - A Bayesian Take on Gaussian Process Networks [1.7188280334580197]
This work implements Monte Carlo and Markov Chain Monte Carlo methods to sample from the posterior distribution of network structures.
We show that our method outperforms state-of-the-art algorithms in recovering the graphical structure of the network.
arXiv Detail & Related papers (2023-06-20T08:38:31Z) - Is Stochastic Gradient Descent Near Optimal? [0.0]
We show that stochastic gradient descent achieves small expected error with bounds on both the number of samples and the total number of queries.
This suggests that SGD nearly achieves the information-theoretic sample complexity bounds of Jeon & Van Roy (arXiv:2203.00246) in a computationally efficient manner.
arXiv Detail & Related papers (2022-09-18T18:26:43Z) - Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z) - Fractal Gaussian Networks: A sparse random graph model based on Gaussian
Multiplicative Chaos [12.096252285460814]
We propose a novel network model, called Fractal Gaussian Network (FGN).
FGN embodies well-defined and analytically tractable fractal structures.
arXiv Detail & Related papers (2020-08-07T08:37:36Z) - Fitting the Search Space of Weight-sharing NAS with Graph Convolutional
Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient in the selected set of candidates.
arXiv Detail & Related papers (2020-04-17T19:12:39Z) - Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning
via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-02-26T10:02:32Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.