Critical Phenomena in Complex Networks: from Scale-free to Random
Networks
- URL: http://arxiv.org/abs/2008.02319v2
- Date: Wed, 7 Apr 2021 19:43:03 GMT
- Title: Critical Phenomena in Complex Networks: from Scale-free to Random
Networks
- Authors: Alexander I. Nesterov and Pablo Héctor Mata Villafuerte
- Abstract summary: We study critical phenomena in a class of configuration network models with hidden variables controlling links between pairs of nodes.
We find analytical expressions for the average node degree, the expected number of edges, and the Landau and Helmholtz free energies, as a function of the temperature and number of nodes.
- Score: 77.34726150561087
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Within the conventional statistical physics framework, we study critical
phenomena in a class of configuration network models with hidden variables
controlling links between pairs of nodes. We find analytical expressions for
the average node degree, the expected number of edges, and the Landau and
Helmholtz free energies, as a function of the temperature and number of nodes.
We show that the network's temperature is a parameter that controls the average
node degree in the whole network and the transition from unconnected graphs to
a power-law degree (scale-free) and random graphs. With increasing temperature,
the degree distribution is changed from power-law degree distribution, for
lower temperatures, to a Poisson-like distribution for high temperatures. We
also show that phase transition in the so-called Type A networks leads to
fundamental structural changes in the network topology. Below the critical
temperature, the graph is completely disconnected. Above the critical
temperature, the graph becomes connected, and a giant component appears.
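The abstract does not spell out the connection probability. A common way to realize such a hidden-variable ensemble with a temperature parameter is to draw an "energy" eps_i for each node and connect pairs with a Fermi-Dirac-like probability p_ij = 1 / (1 + exp((eps_i + eps_j - mu)/T)). The sketch below illustrates only that generic mechanism; the functional form, the chemical potential mu, and the exponential hidden-variable distribution are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def sample_graph(n, temperature, mu=0.1, rng=None):
    """Sample an undirected graph from a temperature-controlled
    hidden-variable ensemble (illustrative assumption, see lead-in)."""
    rng = np.random.default_rng(rng)
    eps = rng.exponential(scale=1.0, size=n)                 # hidden variables
    energy = eps[:, None] + eps[None, :]                     # eps_i + eps_j
    x = np.clip((energy - mu) / temperature, -50.0, 50.0)    # avoid exp overflow
    p = 1.0 / (1.0 + np.exp(x))                              # Fermi-Dirac-like link probability
    upper = np.triu(rng.random((n, n)) < p, k=1)             # independent edges, i < j only
    return upper | upper.T                                   # symmetric, no self-loops

if __name__ == "__main__":
    for T in (0.05, 0.5, 5.0):
        A = sample_graph(n=2000, temperature=T, rng=42)
        k = A.sum(axis=1)
        print(f"T={T:>4}: <k> = {k.mean():.2f}, k_max = {k.max()}")
```

At low temperature the rule is effectively a sharp threshold on eps_i + eps_j, so few pairs are linked and the graph is sparse and strongly heterogeneous; at high temperature p_ij flattens toward a constant, the expected number of edges grows, and the degrees become homogeneous (binomial/Poisson-like), qualitatively mirroring the temperature-driven transition described in the abstract. Whether the low-temperature regime is exactly scale-free depends on the hidden-variable distribution and on the precise form of p_ij used in the paper.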
Related papers
- Transfer Entropy in Graph Convolutional Neural Networks [0.0]
Graph Convolutional Networks (GCN) are Graph Neural Networks where the convolutions are applied over a graph.
In this study, we address two important challenges related to GCNs, one of which is oversmoothing: the degradation of the discriminative capacity of nodes as a result of repeated aggregations.
We propose a new strategy for addressing these challenges in GCNs based on Transfer Entropy (TE), which measures the amount of directed transfer of information between two time-varying nodes (see the transfer-entropy sketch after this list).
arXiv Detail & Related papers (2024-06-08T20:09:17Z) - Network Centralities in Quantum Entanglement Distribution due to User
Preferences [5.243460995467895]
This paper studies the centralities of the network when the link layer topology of entanglements is driven by usage patterns of peer-to-peer connections.
It shows that the edge centralities (measured as usage of individual edges during entanglement distribution) of the entangled graph follow power law distributions.
These findings will help in quantum resource management, e.g., quantum technology with high reliability and lower decoherence time may be allocated to edges with high centralities.
arXiv Detail & Related papers (2023-08-16T07:00:09Z) - Geometric Graph Filters and Neural Networks: Limit Properties and
Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z) - Addressing Heterophily in Node Classification with Graph Echo State
Networks [11.52174067809364]
We address the challenges of heterophilic graphs with Graph Echo State Network (GESN) for node classification.
GESN is a reservoir computing model for graphs, where node embeddings are computed by an untrained message-passing function.
Our experiments show that reservoir models achieve accuracy that is better than or comparable to that of most fully trained deep models.
arXiv Detail & Related papers (2023-05-14T19:42:31Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Effects of Graph Convolutions in Deep Networks [8.937905773981702]
We present a rigorous theoretical understanding of the effects of graph convolutions in multi-layer networks.
We show that a single graph convolution expands the regime of the distance between the means where multi-layer networks can classify the data.
We provide both theoretical and empirical insights into the performance of graph convolutions placed in different combinations among the layers of a network.
arXiv Detail & Related papers (2022-04-20T08:24:43Z) - Superradiant phase transition in complex networks [62.997667081978825]
We consider a superradiant phase transition problem for the Dicke-Ising model.
We examine regular, random, and scale-free network structures.
arXiv Detail & Related papers (2020-12-05T17:40:53Z) - Spectral Embedding of Graph Networks [76.27138343125985]
We introduce an unsupervised graph embedding that trades off local node similarity and connectivity, and global structure.
The embedding is based on a generalized graph Laplacian, whose eigenvectors compactly capture both network structure and neighborhood proximity in a single representation.
arXiv Detail & Related papers (2020-09-30T04:59:10Z) - Asymptotic entropy of the Gibbs state of complex networks [68.8204255655161]
The Gibbs state is obtained from the Laplacian, normalized Laplacian, or adjacency matrix associated with a graph.
We calculated the entropy of the Gibbs state for a few classes of graphs and studied its behavior with changing graph order and temperature.
Our results show that the behavior of the Gibbs entropy as a function of temperature differs for real networks when compared to random Erdős–Rényi graphs (see the Gibbs-entropy sketch after this list).
arXiv Detail & Related papers (2020-03-18T18:01:28Z) - Potential energy of complex networks: a novel perspective [0.0]
We present a novel characterization of complex networks, based on the potential of an associated Schrödinger equation.
Crucial information is retained in the reconstructed potential, which provides a compact representation of the properties of the network structure.
arXiv Detail & Related papers (2020-02-11T17:13:07Z)
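For the Transfer Entropy entry above, the quantity in question is TE between two time-varying node signals. As a point of reference only, here is a generic plug-in (histogram) estimator of TE from X to Y for two scalar time series; the function name, the binning scheme, and the one-step history length are illustrative assumptions, not taken from that paper.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Histogram estimate of transfer entropy TE_{X -> Y} (in nats):
    sum over states of p(y_{t+1}, y_t, x_t) * log[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ].
    Generic plug-in estimator for illustration; the paper's estimator may differ."""
    x = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    y = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    y_next, y_now, x_now = y[1:], y[:-1], x[:-1]

    def joint_prob(*series):
        # Empirical joint distribution over the discretized series.
        counts = {}
        for state in zip(*series):
            counts[state] = counts.get(state, 0) + 1
        total = sum(counts.values())
        return {s: c / total for s, c in counts.items()}

    p_xyz = joint_prob(y_next, y_now, x_now)   # p(y_{t+1}, y_t, x_t)
    p_yz = joint_prob(y_now, x_now)            # p(y_t, x_t)
    p_xy = joint_prob(y_next, y_now)           # p(y_{t+1}, y_t)
    p_y = joint_prob(y_now)                    # p(y_t)

    te = 0.0
    for (yn, yc, xc), p in p_xyz.items():
        cond_full = p / p_yz[(yc, xc)]            # p(y_{t+1} | y_t, x_t)
        cond_self = p_xy[(yn, yc)] / p_y[(yc,)]   # p(y_{t+1} | y_t)
        te += p * np.log(cond_full / cond_self)
    return te

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)        # y is driven by past x
    print(transfer_entropy(x, y), transfer_entropy(y, x))  # first value should be larger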
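```

For the Gibbs-state entry above, the quantities named in the summary are standard: the Gibbs state of a graph matrix L is rho = exp(-beta*L) / Tr exp(-beta*L), and its von Neumann entropy is S = -Tr(rho ln rho). Below is a minimal sketch of that computation for the combinatorial Laplacian of an Erdős–Rényi graph; the choice of matrix, temperatures, and graph parameters are illustrative and not taken from that paper.

```python
import numpy as np

def gibbs_entropy(adjacency, beta=1.0):
    """Von Neumann entropy of the Gibbs state rho = exp(-beta * L) / Z,
    where L is the combinatorial graph Laplacian.  The paper also considers
    the normalized Laplacian and the adjacency matrix; this is a plain
    textbook computation for illustration."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency
    eigvals = np.linalg.eigvalsh(laplacian)   # L is symmetric
    weights = np.exp(-beta * eigvals)
    probs = weights / weights.sum()           # eigenvalues of rho
    probs = probs[probs > 0]
    return float(-np.sum(probs * np.log(probs)))

if __name__ == "__main__":
    # Erdos-Renyi graph G(n, p) as a point of comparison.
    rng = np.random.default_rng(0)
    n, p = 200, 0.05
    upper = np.triu(rng.random((n, n)) < p, k=1)
    A = (upper | upper.T).astype(float)
    for beta in (0.1, 1.0, 10.0):
        print(f"beta={beta:>4}: S = {gibbs_entropy(A, beta):.3f}")
```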