Structural Balance and Random Walks on Complex Networks with Complex
Weights
- URL: http://arxiv.org/abs/2307.01813v1
- Date: Tue, 4 Jul 2023 16:39:52 GMT
- Title: Structural Balance and Random Walks on Complex Networks with Complex
Weights
- Authors: Yu Tian, Renaud Lambiotte
- Abstract summary: Recent years have seen increasing interest in extending the tools of network science to networks whose edge weights are complex numbers.
Here, we focus on the case when the weight matrix is Hermitian, a reasonable assumption in many applications.
We introduce a classification of complex-weighted networks based on the notion of structural balance, and illustrate the shared spectral properties within each type.
- Score: 13.654842079699458
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Complex numbers define the relationship between entities in many situations.
A canonical example would be the off-diagonal terms in a Hamiltonian matrix in
quantum physics. Recent years have seen increasing interest in extending the
tools of network science to networks whose edge weights are complex numbers. Here, we
focus on the case when the weight matrix is Hermitian, a reasonable assumption
in many applications, and investigate both structural and dynamical properties
of the complex-weighted networks. Building on concepts from signed graphs, we
introduce a classification of complex-weighted networks based on the notion of
structural balance, and illustrate the shared spectral properties within each
type. We then apply the results to characterise the dynamics of random walks on
complex-weighted networks, where local consensus can be achieved asymptotically
when the graph is structurally balanced, while global consensus will be
obtained when it is strictly unbalanced. Finally, we explore potential
applications of our findings by generalising the notion of cut, and propose an
associated spectral clustering algorithm. We also provide a further
characterisation of the magnetic Laplacian, which associates directed networks
with complex-weighted ones. The performance of the algorithm is verified on both
synthetic and real networks.
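As a rough illustration of the classification described above, the sketch below (Python/NumPy) tests structural balance of a small Hermitian-weighted network by checking whether the smallest eigenvalue of the complex-weighted Laplacian L = D - W (with D the diagonal matrix of row sums of |W|) is zero, the criterion familiar from signed graphs; this criterion, the toy network and all names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def complex_laplacian(W):
    """Laplacian L = D - W of a Hermitian complex-weighted network,
    where D is the diagonal matrix of row sums of |W|."""
    D = np.diag(np.abs(W).sum(axis=1))
    return D - W

def is_structurally_balanced(W, tol=1e-10):
    """Heuristic balance test: for Hermitian W, L = D - W is positive
    semidefinite, and (by the signed-graph criterion assumed here) its
    smallest eigenvalue is zero exactly when the network can be switched
    to one with non-negative real weights."""
    lam_min = np.linalg.eigvalsh(complex_laplacian(W)).min()
    return lam_min < tol

# Toy example: a 3-cycle with unit-modulus weights e^{i*2pi/3}; the phases
# multiply to 1 around the cycle, so the network is structurally balanced.
w = np.exp(2j * np.pi / 3)
W = np.array([[0, w, np.conj(w)],
              [np.conj(w), 0, w],
              [w, np.conj(w), 0]])
print(is_structurally_balanced(W))  # expected: True

# One natural choice of random-walk operator is P = D^{-1} W; per the
# abstract, such walks reach local consensus asymptotically on
# structurally balanced networks.
P = np.diag(1.0 / np.abs(W).sum(axis=1)) @ W
```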
Related papers
- DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained
Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z)
- Learning Coherent Clusters in Weakly-Connected Network Systems [7.766921168069532]
We propose a structure-preserving model-reduction methodology for large-scale dynamic networks with tightly-connected components.
We provide an upper bound on the approximation error when the network graph is randomly generated from a weighted block model.
arXiv Detail & Related papers (2022-11-28T13:32:25Z)
- Neural Network Complexity of Chaos and Turbulence [0.0]
We consider the relative complexity of chaos and turbulence from the perspective of deep neural networks.
We analyze a set of classification problems, where the network has to distinguish images of fluid profiles in the turbulent regime from other classes of images, such as profiles in the chaotic regime.
We quantify the complexity of the computation performed by the network via the intrinsic dimensionality of the internal feature representations.
arXiv Detail & Related papers (2022-11-24T13:21:36Z)
- Deep Architecture Connectivity Matters for Its Convergence: A
Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Spectral Complexity-scaled Generalization Bound of Complex-valued Neural
Networks [78.64167379726163]
This paper is the first work to prove a generalization bound for complex-valued neural networks.
We conduct experiments by training complex-valued convolutional neural networks on different datasets.
arXiv Detail & Related papers (2021-12-07T03:25:25Z)
- Towards Understanding Theoretical Advantages of Complex-Reaction
Networks [77.34726150561087]
We show that a class of functions can be approximated by a complex-reaction network using a polynomial number of parameters.
For empirical risk minimization, our theoretical result shows that the critical point set of complex-reaction networks is a proper subset of that of real-valued networks.
arXiv Detail & Related papers (2021-08-15T10:13:49Z)
- Network configuration theory for all networks [0.0]
Entangled quantum networks provide great flexibility and scalability for quantum information processing and the quantum Internet.
Our goal in this work is to explore new characterizations of arbitrary networks in terms of theory-independent configurations.
arXiv Detail & Related papers (2021-07-13T04:55:19Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- Emergent complex quantum networks in continuous-variables non-Gaussian
states [0.0]
We study a class of continuous-variable quantum states that present both multipartite entanglement and non-Gaussian statistics.
In particular, the states are built from an initial imprinted cluster state created via Gaussian entangling operations.
arXiv Detail & Related papers (2020-12-31T13:58:37Z)
- Emergent entanglement structures and self-similarity in quantum spin
chains [0.0]
We introduce an experimentally accessible network representation for many-body quantum states based on entanglement between all pairs of its constituents.
We illustrate the power of this representation by applying it to a paradigmatic spin chain model, the XX model, and showing that it brings to light new phenomena.
arXiv Detail & Related papers (2020-07-14T12:13:29Z)
- Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
arXiv Detail & Related papers (2020-05-16T04:44:29Z)
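For the composite-kernel aggregation summarised in the last entry above, a minimal sketch (Python/NumPy): it composes the neighbour-based kernel (the adjacency matrix) with a feature-similarity kernel via an elementwise product; the Gaussian kernel form, the product composition and all names are illustrative assumptions rather than the paper's definition.

```python
import numpy as np

def composite_aggregation(X, A, W_feat, gamma=1.0):
    """Aggregate node features X with a composite kernel: the
    neighbour-based kernel (adjacency A) multiplied elementwise by a
    feature-similarity kernel, here a Gaussian kernel on linearly
    transformed features Z = X @ W_feat."""
    Z = X @ W_feat                                    # learnable feature map
    sq_dists = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    K_feat = np.exp(-gamma * sq_dists)                # feature-similarity kernel
    K = A * K_feat                                    # compose the two kernels
    K = K / np.maximum(K.sum(axis=1, keepdims=True), 1e-12)  # row-normalise
    return K @ X                                      # aggregated node features
```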
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.