Complex networks with tuneable dimensions as a universality playground
- URL: http://arxiv.org/abs/2006.10421v3
- Date: Wed, 21 Apr 2021 09:34:47 GMT
- Title: Complex networks with tuneable dimensions as a universality playground
- Authors: Ana P. Millán, Giacomo Gori, Federico Battiston, Tilman Enss, and Nicolò Defenu
- Abstract summary: We discuss the role of a fundamental network parameter for universality, the spectral dimension.
By explicit computation we prove that the spectral dimension for this model can be tuned continuously from $1$ to infinity.
We propose our model as a tool to probe universal behaviour on inhomogeneous structures and comment on the possibility that the universal behaviour of correlated models on such networks mimics that of continuous field theories in fractional Euclidean dimensions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Universality is one of the key concepts in understanding critical phenomena.
However, for interacting inhomogeneous systems described by complex networks a
clear understanding of the relevant parameters for universality is still
missing. Here we discuss the role of a fundamental network parameter for
universality, the spectral dimension. For this purpose, we construct a complex
network model where the probability of a bond between two nodes is proportional
to a power law of the nodes' distances. By explicit computation we prove that
the spectral dimension for this model can be tuned continuously from $1$ to
infinity, and we discuss related network connectivity measures. We propose our
model as a tool to probe universal behaviour on inhomogeneous structures and
comment on the possibility that the universal behaviour of correlated models on
such networks mimics that of continuous field theories in fractional
Euclidean dimensions.
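To make the construction concrete, here is a minimal Python sketch (using numpy and networkx) of such a network: nodes sit on a ring, and every pair at chord distance $r$ is linked with probability proportional to $r^{-\alpha}$. The spectral dimension $d_s$ is then read off from the infrared part of the graph-Laplacian spectrum, using the standard relation that the cumulative eigenvalue count scales as $N(\lambda) \sim \lambda^{d_s/2}$ for $\lambda \to 0$. The nearest-neighbour backbone, the prefactor `c`, the eigenvalue cut-off, and all names are illustrative assumptions, not the authors' construction or code.

```python
import numpy as np
import networkx as nx

def long_range_ring(n, alpha, c=1.0, seed=0):
    """Ring of n nodes; each pair (i, j) is linked with probability
    proportional to a power law of their ring distance, p ~ c * r**(-alpha).
    (Illustrative: the prefactor c and the cap at p = 1 are assumptions.)"""
    rng = np.random.default_rng(seed)
    g = nx.cycle_graph(n)  # nearest-neighbour backbone keeps the graph connected
    for i in range(n):
        for j in range(i + 2, n):
            r = min(j - i, n - (j - i))  # chord distance on the ring
            if r > 1 and rng.random() < min(1.0, c * r ** (-alpha)):
                g.add_edge(i, j)
    return g

def spectral_dimension(g, lam_max=0.1):
    """Estimate d_s from the low-lying Laplacian spectrum: the cumulative
    eigenvalue count is expected to scale as N(lam) ~ lam**(d_s / 2)."""
    lam = np.sort(nx.laplacian_spectrum(g))
    lam = lam[lam > 1e-10]        # drop the zero mode
    small = lam[lam < lam_max]    # keep only the infrared part of the spectrum
    counts = np.arange(1, small.size + 1)
    slope, _ = np.polyfit(np.log(small), np.log(counts), 1)
    return 2.0 * slope

g = long_range_ring(n=1000, alpha=2.5)
print(f"estimated spectral dimension: {spectral_dimension(g):.2f}")
```

Large $\alpha$ leaves the graph essentially one-dimensional ($d_s \to 1$), while decreasing $\alpha$ adds enough long-range bonds to push the spectral dimension upward without bound, which is the continuous tuning the abstract refers to.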
Related papers
- Going Beyond Neural Network Feature Similarity: The Network Feature Complexity and Its Interpretation Using Category Theory [64.06519549649495]
We provide the definition of what we call functionally equivalent features.
These features produce equivalent output under certain transformations.
We propose an efficient algorithm named Iterative Feature Merging.
arXiv Detail & Related papers (2023-10-10T16:27:12Z)
- Joint Group Invariant Functions on Data-Parameter Domain Induce Universal Neural Networks [14.45619075342763]
We present a systematic method to induce a generalized neural network and its right inverse operator, called the ridgelet transform.
Since the ridgelet transform is an inverse, it can describe the arrangement of parameters for the network to represent a target function.
We present a new simple proof of the universality by using Schur's lemma in a unified manner covering a wide class of networks.
arXiv Detail & Related papers (2023-10-05T13:30:37Z)
- Interpolation, Approximation and Controllability of Deep Neural Networks [18.311411538309425]
We consider two properties that arise from supervised learning, namely universal interpolation and universal approximation.
We give a characterisation of universal equivalence, showing that it holds for essentially any architecture with non-linearity.
arXiv Detail & Related papers (2023-09-12T07:29:47Z)
- Towards Understanding Theoretical Advantages of Complex-Reaction Networks [77.34726150561087]
We show that a class of functions can be approximated by a complex-reaction network using a polynomial number of parameters.
For empirical risk minimization, our theoretical result shows that the critical point set of complex-reaction networks is a proper subset of that of real-valued networks.
arXiv Detail & Related papers (2021-08-15T10:13:49Z)
- Full network nonlocality [68.8204255655161]
We introduce the concept of full network nonlocality, which describes correlations that necessitate all links in a network to distribute nonlocal resources.
We show that the most well-known network Bell test does not witness full network nonlocality.
More generally, we point out that established methods for analysing local and theory-independent correlations in networks can be combined in order to deduce sufficient conditions for full network nonlocality.
arXiv Detail & Related papers (2021-05-19T18:00:02Z)
- A Functional Perspective on Learning Symmetric Functions with Neural Networks [48.80300074254758]
We study the learning and representation of neural networks defined on measures.
We establish approximation and generalization bounds under different choices of regularization.
The resulting models can be learned efficiently and enjoy generalization guarantees that extend across input sizes.
arXiv Detail & Related papers (2020-08-16T16:34:33Z)
- Emergent entanglement structures and self-similarity in quantum spin chains [0.0]
We introduce an experimentally accessible network representation for many-body quantum states based on entanglement between all pairs of its constituents.
We illustrate the power of this representation by applying it to a paradigmatic spin chain model, the XX model, and showing that it brings to light new phenomena.
arXiv Detail & Related papers (2020-07-14T12:13:29Z)
- Universal Approximation Power of Deep Residual Neural Networks via Nonlinear Control Theory [9.210074587720172]
We explain the universal approximation capabilities of deep residual neural networks through geometric nonlinear control.
Inspired by recent work establishing links between residual networks and control systems, we provide a general sufficient condition for a residual network to have the power of universal approximation.
arXiv Detail & Related papers (2020-07-12T14:53:30Z)
- Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators [72.62940905965267]
Invertible neural networks based on coupling flows (CF-INNs) have various machine learning applications such as image synthesis and representation learning.
Are CF-INNs universal approximators for invertible functions?
We prove a general theorem showing the equivalence of universality for certain diffeomorphism classes (see the minimal coupling-layer sketch after this list).
arXiv Detail & Related papers (2020-06-20T02:07:37Z)
- Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work generalizes neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network has the desired properties and shows performance competitive with state-of-the-art solvers.
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
- Potential energy of complex networks: a novel perspective [0.0]
We present a novel characterization of complex networks, based on the potential of an associated Schrödinger equation.
Crucial information is retained in the reconstructed potential, which provides a compact representation of the properties of the network structure.
arXiv Detail & Related papers (2020-02-11T17:13:07Z)
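As flagged in the coupling-flow entry above, here is a minimal RealNVP-style affine coupling layer in numpy: the input is split in two halves, one half passes through unchanged and parametrises a scale and shift of the other half, so the layer is invertible in closed form no matter how expressive the internal networks are. The two-layer `mlp`, the parameter shapes, and all names are illustrative assumptions, not code or notation from the cited paper.

```python
import numpy as np

def mlp(x, w1, b1, w2, b2):
    """Tiny two-layer network used for the scale and shift functions."""
    return np.tanh(x @ w1 + b1) @ w2 + b2

def coupling_forward(x, params):
    """Affine coupling layer: (x1, x2) -> (x1, x2 * exp(s(x1)) + t(x1))."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s = mlp(x1, *params["scale"])
    t = mlp(x1, *params["shift"])
    return np.concatenate([x1, x2 * np.exp(s) + t], axis=-1)

def coupling_inverse(y, params):
    """Exact inverse: the untouched half reproduces s and t, so x2 is recovered."""
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    s = mlp(y1, *params["scale"])
    t = mlp(y1, *params["shift"])
    return np.concatenate([y1, (y2 - t) * np.exp(-s)], axis=-1)

rng = np.random.default_rng(0)
d, h = 2, 8  # half-dimension and hidden width (illustrative sizes)
params = {
    "scale": (rng.normal(size=(d, h)), np.zeros(h), rng.normal(size=(h, d)), np.zeros(d)),
    "shift": (rng.normal(size=(d, h)), np.zeros(h), rng.normal(size=(h, d)), np.zeros(d)),
}
x = rng.normal(size=(5, 2 * d))
assert np.allclose(coupling_inverse(coupling_forward(x, params), params), x)
```

Stacking such layers with alternating splits gives the coupling-flow invertible networks (CF-INNs) whose universality that paper investigates.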
This list is automatically generated from the titles and abstracts of the papers on this site.