Fractal Gaussian Networks: A sparse random graph model based on Gaussian Multiplicative Chaos
- URL: http://arxiv.org/abs/2008.03038v2
- Date: Thu, 13 Jan 2022 15:45:58 GMT
- Title: Fractal Gaussian Networks: A sparse random graph model based on Gaussian Multiplicative Chaos
- Authors: Subhroshekhar Ghosh, Krishnakumar Balasubramanian, Xiaochuan Yang
- Abstract summary: We propose a novel network model, called Fractal Gaussian Network (FGN), which embodies well-defined and analytically tractable fractal structures.
- Score: 12.096252285460814
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel stochastic network model, called Fractal Gaussian Network
(FGN), that embodies well-defined and analytically tractable fractal
structures. Such fractal structures have been empirically observed in diverse
applications. FGNs interpolate continuously between the popular purely random
geometric graphs (a.k.a. the Poisson Boolean network), and random graphs with
increasingly fractal behavior. In fact, they form a parametric family of sparse
random geometric graphs that are parametrized by a fractality parameter which
governs the strength of the fractal structure. FGNs are driven by the latent
spatial geometry of Gaussian Multiplicative Chaos (GMC), a canonical model of
fractality in its own right. We asymptotically characterize the expected number
of edges, triangles, cliques and hub-and-spoke motifs in FGNs, unveiling a
distinct pattern in their scaling with the size parameter of the network. We
then examine the natural question of detecting the presence of fractality and
the problem of parameter estimation based on observed network data, in addition
to fundamental properties of the FGN as a random graph model. We also explore
fractality in community structures by unveiling a natural stochastic block
model in the setting of FGNs. Finally, we substantiate our results with
phenomenological analysis of the FGN in the context of available scientific
literature for fractality in networks, including applications to real-world
massive network data.
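To make the construction concrete, the following is a minimal, illustrative FGN-style sampler, not the authors' exact construction: the log-correlated Gaussian field behind the GMC is approximated crudely by summing independent noise layers at dyadic scales, and all names and constants (`gamma` as the fractality parameter, the grid size, the connection radius) are illustrative assumptions.

```python
import numpy as np

# A crude FGN-style sampler (illustrative only). A log-correlated Gaussian
# field is approximated by summing independent noise layers at dyadic scales;
# exponentiating it gives a GMC-like density, from which node locations are
# drawn and connected within a fixed radius (a Boolean/geometric rule).

rng = np.random.default_rng(0)

def gmc_density(n_grid=256, gamma=0.5, n_scales=7):
    """Multiscale approximation of a normalized GMC density on a grid."""
    field = np.zeros((n_grid, n_grid))
    for j in range(n_scales):
        block = 2 ** j                        # noise refreshed at dyadic scales
        coarse = rng.standard_normal((n_grid // block, n_grid // block))
        field += np.kron(coarse, np.ones((block, block)))
    field /= np.sqrt(n_scales)                # keep variance comparable in scale
    weights = np.exp(gamma * field - 0.5 * gamma ** 2 * field.var())
    return weights / weights.sum()

def sample_fgn(n_nodes=500, radius=0.05, gamma=0.5, n_grid=256):
    """Draw node locations from the density; connect pairs within `radius`."""
    density = gmc_density(n_grid, gamma)
    idx = rng.choice(n_grid * n_grid, size=n_nodes, p=density.ravel())
    xy = np.column_stack(np.unravel_index(idx, (n_grid, n_grid))) / n_grid
    xy += rng.uniform(0, 1 / n_grid, size=xy.shape)   # jitter inside grid cells
    dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    adj = dist < radius
    np.fill_diagonal(adj, False)
    return xy, adj

xy, adj = sample_fgn()
print("nodes:", len(xy), "edges:", int(adj.sum()) // 2)
```

Setting `gamma = 0` makes the density uniform, so the sampler degenerates to an ordinary random geometric (Poisson Boolean) graph, mirroring the interpolation described in the abstract.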
Related papers
- Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs).
We prove a generalization gap between the optimal empirical risk and the optimal statistical risk of this GNN.
Most importantly, we observe that this generalization capability can be realized with a single large graph, instead of being limited by the size of the graph as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z)
- Information-Theoretic Thresholds for Planted Dense Cycles [52.076657911275525]
We study a random graph model for small-world networks, which are ubiquitous in the social and biological sciences.
For both detection and recovery of the planted dense cycle, we characterize the information-theoretic thresholds in terms of $n$, $\tau$, and an edge-wise signal-to-noise ratio $\lambda$.
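As a concrete illustration, here is a minimal sketch of one plausible reading of such a model: vertices receive a hidden circular order, pairs within a band of width about $\tau n$ along the cycle connect with elevated probability, and all other pairs with a baseline probability. The paper's exact parametrization (and its definition of $\lambda$) may differ; all constants below are illustrative.

```python
import numpy as np

# Plant a dense band around a hidden cycle inside background noise: pairs
# within roughly tau * n of each other along the hidden circular order connect
# with probability q, all other pairs with baseline probability p.

rng = np.random.default_rng(1)

def planted_dense_cycle(n=400, tau=0.05, p=0.02, q=0.5):
    order = rng.permutation(n)                  # hidden placement on the cycle
    pos = np.empty(n, dtype=int)
    pos[order] = np.arange(n)                   # vertex -> position on the cycle
    i, j = np.triu_indices(n, k=1)
    dist = np.abs(pos[i] - pos[j])
    dist = np.minimum(dist, n - dist)           # circular distance
    prob = np.where(dist <= tau * n / 2, q, p)  # dense band around the cycle
    coin = rng.random(len(prob)) < prob
    adj = np.zeros((n, n), dtype=bool)
    adj[i, j] = coin
    adj |= adj.T                                # symmetrize
    return adj, order

adj, order = planted_dense_cycle()
print("edge density: %.4f" % (adj.sum() / (adj.shape[0] * (adj.shape[0] - 1))))
```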
arXiv Detail & Related papers (2024-02-01T03:39:01Z)
- Learning Networks from Gaussian Graphical Models and Gaussian Free Fields [2.294014185517203]
We propose a novel estimator for the weighted network (equivalently, its Laplacian) from repeated measurements of a GFF on the network.
We demonstrate the effectiveness of our estimator with concrete recovery guarantees and bounds on the required sample complexity.
In the setting of networks growing with sample size, we show that for Erdős-Rényi random graphs $G(d,p)$ above the connectivity threshold, network recovery takes place with high probability.
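A minimal sketch of the plug-in idea behind such an estimator, relying only on the standard fact that a GFF pinned at a vertex has covariance equal to the inverse of the grounded Laplacian; the paper's estimator and its recovery guarantees are more refined than this toy version.

```python
import numpy as np

# Toy Laplacian recovery from repeated GFF samples: for a GFF pinned at
# vertex 0, the covariance of the remaining coordinates equals the inverse of
# the grounded Laplacian, so inverting the empirical covariance estimates the
# Laplacian and hence the edge weights.

rng = np.random.default_rng(2)

def grounded_laplacian(weights):
    L = np.diag(weights.sum(axis=1)) - weights
    return L[1:, 1:]                           # ground (pin) vertex 0

# Toy weighted network on 5 vertices.
W = np.array([[0, 2, 0, 0, 1],
              [2, 0, 3, 0, 0],
              [0, 3, 0, 1, 0],
              [0, 0, 1, 0, 2],
              [1, 0, 0, 2, 0]], dtype=float)
Lg = grounded_laplacian(W)

# Repeated measurements: i.i.d. Gaussian vectors with covariance Lg^{-1}.
m = 20000
samples = rng.multivariate_normal(np.zeros(4), np.linalg.inv(Lg), size=m)
cov_hat = samples.T @ samples / m
L_hat = np.linalg.inv(cov_hat)                 # plug-in Laplacian estimate

print(np.round(L_hat, 2))                      # compare against Lg below
print(np.round(Lg, 2))
```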
arXiv Detail & Related papers (2023-08-04T14:18:39Z)
- Torsion Graph Neural Networks [21.965704710488232]
We propose TorGNN, an analytic-torsion-enhanced graph neural network model.
In TorGNN, a local simplicial complex is identified for each edge, and its analytic torsion is calculated.
We find that TorGNN achieves superior performance on both tasks and outperforms various state-of-the-art models.
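For intuition, here is a heavily hedged sketch of the per-edge quantity described above, computed for a toy complex under one common convention, $\log T = \tfrac{1}{2}\sum_q (-1)^q\, q \log \det'(\Delta_q)$, where $\det'$ is the product of nonzero eigenvalues of the $q$-th combinatorial Laplacian. TorGNN's precise local-complex construction and torsion convention follow the paper, not this toy.

```python
import numpy as np

# Analytic torsion of a 1-complex (a graph): with boundary map B1, the edge
# Laplacian is Delta_1 = B1^T B1, and under the convention above only the
# q = 1 term contributes, giving T = det'(Delta_1)^(-1/2).

def pseudo_det(M, tol=1e-9):
    """Product of the nonzero eigenvalues of a symmetric matrix."""
    eig = np.linalg.eigvalsh(M)
    eig = eig[eig > tol]
    return eig.prod() if len(eig) else 1.0

def analytic_torsion_graph(edges, n_vertices):
    """Analytic torsion of the 1-complex given by an edge list."""
    B1 = np.zeros((n_vertices, len(edges)))  # vertex-edge boundary map
    for k, (u, v) in enumerate(edges):
        B1[u, k], B1[v, k] = -1.0, 1.0
    delta1 = B1.T @ B1                       # edge Laplacian (q = 1)
    return pseudo_det(delta1) ** -0.5        # only the q = 1 term survives

# Toy local complex around an edge: its endpoints plus one common neighbor.
triangle = [(0, 1), (1, 2), (0, 2)]
print("torsion of a triangle:", analytic_torsion_graph(triangle, 3))
```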
arXiv Detail & Related papers (2023-06-23T15:02:23Z)
- Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose to approximate the joint posterior over both the structure and the parameters of a Bayesian Network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
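A minimal sketch of the objects involved: points sampled on a simple manifold (the unit circle here), a geometric graph built with a shrinking Gaussian kernel, and a polynomial convolutional filter $h(L) = \sum_k h_k L^k$ applied to a signal. The kernel bandwidth and filter taps are illustrative assumptions, not the paper's.

```python
import numpy as np

# Build a geometric graph from manifold samples and apply a polynomial graph
# filter; the paper's non-asymptotic bounds quantify how such filtered graph
# signals converge to their manifold counterparts as n grows.

rng = np.random.default_rng(3)

def filtered_signal(n, taps=(1.0, -0.5, 0.25)):
    theta = np.sort(rng.uniform(0, 2 * np.pi, n))  # samples on the circle
    pts = np.column_stack([np.cos(theta), np.sin(theta)])
    d2 = ((pts[:, None] - pts[None, :]) ** 2).sum(axis=-1)
    eps = 2.0 / np.sqrt(n)                         # shrinking kernel bandwidth
    W = np.exp(-d2 / eps)
    np.fill_diagonal(W, 0.0)
    L = np.eye(n) - W / W.sum(axis=1, keepdims=True)   # random-walk Laplacian
    x = np.cos(3 * theta)                          # a smooth manifold signal
    y = sum(h * np.linalg.matrix_power(L, k) @ x for k, h in enumerate(taps))
    return y

# Filter output at growing graph sizes; as functions of the angle theta, the
# outputs should look increasingly alike.
for n in (100, 400, 1600):
    y = filtered_signal(n)
    print(n, round(float(np.abs(y).max()), 4))
```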
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
In sum, we treat the manifold model as the limit of large graphs and construct MNNs, while graph neural networks can be recovered by discretizing MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z)
- Curvature Graph Generative Adversarial Networks [31.763904668737304]
Generative adversarial networks (GANs) are widely used for generalized and robust learning on graph data.
Existing GAN-based graph representation methods generate negative samples by random walks or traversal in discrete space.
CurvGAN consistently and significantly outperforms the state-of-the-art methods across multiple tasks.
arXiv Detail & Related papers (2022-03-03T10:00:32Z)
- Convergence and Stability of Graph Convolutional Networks on Large Random Graphs [22.387735135790706]
We study properties of Graph Convolutional Networks (GCNs) by analyzing their behavior on standard models of random graphs.
We first study the convergence of GCNs to their continuous counterpart as the number of nodes grows.
We then analyze the stability of GCNs to small deformations of the random graph model.
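A minimal sketch of the kind of experiment this suggests: fix a one-layer GCN and run it on graphs of growing size drawn from one random graph model (a smooth graphon, used here as an assumed stand-in for the paper's standard models), watching an output statistic stabilize as the number of nodes grows.

```python
import numpy as np

# Run a fixed one-layer GCN on graphs of growing size sampled from one smooth
# graphon; a summary statistic of the output should stabilize as n grows,
# illustrating convergence toward a continuous counterpart.

rng = np.random.default_rng(4)

def graphon(x, y):
    return 0.8 * np.exp(-3.0 * np.abs(x - y))      # smooth, illustrative graphon

def gcn_output_mean(n):
    x = np.sort(rng.uniform(0, 1, n))              # latent node positions
    P = graphon(x[:, None], x[None, :])
    A = np.triu((rng.random((n, n)) < P), k=1).astype(float)
    A = A + A.T                                    # simple undirected graph
    A_norm = A / (A.sum(axis=1, keepdims=True) + 1e-12)  # row-normalized adjacency
    feats = np.column_stack([np.sin(np.pi * x), x])      # features from positions
    theta = np.array([[1.0, -0.5], [0.5, 1.0]])          # fixed GCN weights
    out = np.maximum(A_norm @ feats @ theta, 0.0)        # one GCN layer + ReLU
    return out.mean(axis=0)

for n in (200, 800, 3200):
    print(n, np.round(gcn_output_mean(n), 3))      # values should converge in n
```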
arXiv Detail & Related papers (2020-06-02T18:36:19Z)
- Understanding Graph Neural Networks with Generalized Geometric Scattering Transforms [67.88675386638043]
The scattering transform is a multilayered wavelet-based deep learning architecture that acts as a model of convolutional neural networks.
We introduce windowed and non-windowed geometric scattering transforms for graphs based upon a very general class of asymmetric wavelets.
We show that these asymmetric graph scattering transforms have many of the same theoretical guarantees as their symmetric counterparts.
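A minimal sketch of non-windowed geometric scattering with asymmetric wavelets of this kind, using the lazy random walk matrix $P = \frac{1}{2}(I + AD^{-1})$ (asymmetric in general) and diffusion wavelets $\Psi_0 = I - P$, $\Psi_j = P^{2^{j-1}} - P^{2^j}$. Normalizations and the exact coefficient set vary across papers, so treat the details below as assumptions.

```python
import numpy as np
import itertools

# Non-windowed geometric scattering: first- and second-order coefficients are
# node-aggregated moduli of wavelet responses built from powers of the
# (asymmetric) lazy random walk matrix P.

def scattering_features(A, x, J=3):
    n = len(A)
    P = 0.5 * (np.eye(n) + A / A.sum(axis=0, keepdims=True))  # lazy walk A D^{-1}
    powers = {0: np.eye(n)}
    for k in range(1, 2 ** J + 1):
        powers[k] = powers[k - 1] @ P
    psi = [powers[0] - powers[1]]                              # Psi_0 = I - P
    psi += [powers[2 ** (j - 1)] - powers[2 ** j] for j in range(1, J + 1)]
    feats = [np.abs(W @ x).sum() for W in psi]                 # first order
    for W1, W2 in itertools.product(psi, repeat=2):            # second order
        feats.append(np.abs(W2 @ np.abs(W1 @ x)).sum())
    return np.array(feats)

# Toy example: a path graph on 6 nodes with a spike signal.
A = np.diag(np.ones(5), 1)
A = A + A.T
x = np.zeros(6)
x[2] = 1.0
print(scattering_features(A, x, J=2))
```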
arXiv Detail & Related papers (2019-11-14T17:23:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.