Asymptotic entropy of the Gibbs state of complex networks
- URL: http://arxiv.org/abs/2003.08411v2
- Date: Mon, 11 Jan 2021 15:46:23 GMT
- Title: Asymptotic entropy of the Gibbs state of complex networks
- Authors: Adam Glos, Aleksandra Krawiec, Łukasz Pawela
- Abstract summary: The Gibbs state is obtained from the Laplacian, normalized Laplacian, or adjacency matrix associated with a graph.
We calculated the entropy of the Gibbs state for a few classes of graphs and studied its behavior with changing graph order and temperature.
Our results show that the behavior of the Gibbs entropy as a function of temperature differs between the real networks considered and random Erdős-Rényi graphs.
- Score: 68.8204255655161
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work we study the entropy of the Gibbs state corresponding to a
graph. The Gibbs state is obtained from the Laplacian, normalized Laplacian, or
adjacency matrix associated with a graph. We calculated the entropy of the
Gibbs state for a few classes of graphs and studied its behavior with changing
graph order and temperature. We illustrate our analytical results with
numerical simulations for the Erdős-Rényi, Watts-Strogatz, Barabási-Albert,
and Chung-Lu graph models and a few real-world graphs. Our results show that
the behavior of the Gibbs entropy as a function of temperature differs between
the real networks considered and random Erdős-Rényi graphs.
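The Gibbs state of the abstract is the density matrix ρ(β) = e^{-βH}/Tr e^{-βH}, where H is one of the matrices associated with the graph, and its von Neumann entropy S = -Tr(ρ ln ρ) depends only on the spectrum of H. A minimal sketch of this computation (not the authors' code; assumes NumPy is available):

```python
import numpy as np

def gibbs_entropy(H, beta):
    """von Neumann entropy of rho = exp(-beta*H) / Tr exp(-beta*H).

    H is a symmetric matrix (e.g. a graph Laplacian); since rho shares
    H's eigenvectors, the entropy follows from H's eigenvalues alone.
    """
    evals = np.linalg.eigvalsh(H)
    # Shift the spectrum for numerical stability before exponentiating;
    # the shift cancels in the normalization.
    w = np.exp(-beta * (evals - evals.min()))
    p = w / w.sum()              # eigenvalues of the Gibbs state
    p = p[p > 0]                 # avoid log(0)
    return float(-np.sum(p * np.log(p)))

# Laplacian of the path graph P3 (degrees on the diagonal, -1 per edge).
L = np.array([[ 1, -1,  0],
              [-1,  2, -1],
              [ 0, -1,  1]], dtype=float)

print(gibbs_entropy(L, beta=0.0))   # infinite temperature: ln(3) ~ 1.0986
print(gibbs_entropy(L, beta=50.0))  # low temperature: entropy tends to 0
```

At β = 0 the state is maximally mixed and the entropy is ln n; as β grows, the state concentrates on the Laplacian's kernel and the entropy decays toward zero for a connected graph, which is the temperature dependence the paper analyzes.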
Related papers
- Theoretical Insights into Line Graph Transformation on Graph Learning [3.0574700762497744]
Line graph transformation has been widely studied in graph theory, where each node in a line graph corresponds to an edge in the original graph.
This has inspired a series of graph neural networks (GNNs) applied to transformed line graphs, which have proven effective in various graph representation learning tasks.
In this study, we focus on two types of graphs known to be challenging for the Weisfeiler-Leman (WL) tests: Cai-Fürer-Immerman (CFI) graphs and strongly regular graphs.
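The line-graph construction this paper studies is concrete: each edge of G becomes a node of L(G), and two such nodes are adjacent exactly when the original edges share an endpoint. An illustrative pure-Python sketch of this transformation (not code from the paper):

```python
from itertools import combinations

def line_graph(edges):
    """Return the edge list of the line graph L(G).

    Each edge of G becomes a node of L(G); two nodes of L(G) are
    adjacent iff the corresponding edges of G share an endpoint.
    """
    edges = [tuple(sorted(e)) for e in edges]
    return [(e, f) for e, f in combinations(edges, 2)
            if set(e) & set(f)]

# The path 0-1-2 has two edges sharing node 1, so L(G) is a single edge.
print(line_graph([(0, 1), (1, 2)]))   # [((0, 1), (1, 2))]
# The triangle K3 maps to itself: every pair of its 3 edges shares a node.
print(len(line_graph([(0, 1), (1, 2), (0, 2)])))   # 3
```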
arXiv Detail & Related papers (2024-10-21T16:04:50Z)
- Almost Surely Asymptotically Constant Graph Neural Networks [7.339728196535312]
We show that the output converges to a constant function, which upper-bounds what these classifiers can uniformly express.
Results apply to a broad class of random graph models.
We empirically validate these findings, observing that the convergence phenomenon appears not only on random graphs but also on some real-world graphs.
arXiv Detail & Related papers (2024-03-06T17:40:26Z)
- Graph Neural Networks with a Distribution of Parametrized Graphs [27.40566674759208]
We introduce latent variables to parameterize and generate multiple graphs.
We obtain the maximum likelihood estimate of the network parameters in an Expectation-Maximization framework.
arXiv Detail & Related papers (2023-10-25T06:38:24Z)
- Deep Graph-level Anomaly Detection by Glocal Knowledge Distillation [61.39364567221311]
Graph-level anomaly detection (GAD) describes the problem of detecting graphs that are abnormal in their structure and/or the features of their nodes.
One of the challenges in GAD is to devise graph representations that enable the detection of both locally- and globally-anomalous graphs.
We introduce a novel deep anomaly detection approach for GAD that learns rich global and local normal pattern information by joint random distillation of graph and node representations.
arXiv Detail & Related papers (2021-12-19T05:04:53Z)
- Non-separable Spatio-temporal Graph Kernels via SPDEs [90.62347738138594]
We provide graph kernels for principled spatio-temporal modelling on graphs.
By providing novel tools for modelling on graphs, we outperform pre-existing graph kernels in real-world applications.
arXiv Detail & Related papers (2021-11-16T14:53:19Z)
- Average scattering entropy of quantum graphs [0.0]
We propose a methodology that associates with a graph a scattering entropy, which we call the average scattering entropy.
It is defined by taking into account the period of the scattering amplitude which we calculate using the Green's function procedure.
arXiv Detail & Related papers (2021-01-13T18:22:51Z)
- Graph and graphon neural network stability [122.06927400759021]
Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
arXiv Detail & Related papers (2020-10-23T16:55:56Z)
- Hamiltonian systems, Toda lattices, Solitons, Lax Pairs on weighted Z-graded graphs [62.997667081978825]
We identify conditions which allow one to lift one dimensional solutions to solutions on graphs.
We show that even for a simple example of a topologically interesting graph the corresponding non-trivial Lax pairs and associated unitary transformations do not lift to a Lax pair on the Z-graded graph.
arXiv Detail & Related papers (2020-08-11T17:58:13Z)
- Critical Phenomena in Complex Networks: from Scale-free to Random Networks [77.34726150561087]
We study critical phenomena in a class of configuration network models with hidden variables controlling links between pairs of nodes.
We find analytical expressions for the average node degree, the expected number of edges, and the Landau and Helmholtz free energies, as a function of the temperature and number of nodes.
arXiv Detail & Related papers (2020-08-05T18:57:38Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.