LayerPlexRank: Exploring Node Centrality and Layer Influence through Algebraic Connectivity in Multiplex Networks
- URL: http://arxiv.org/abs/2405.05576v1
- Date: Thu, 09 May 2024 06:52:24 GMT
- Title: LayerPlexRank: Exploring Node Centrality and Layer Influence through Algebraic Connectivity in Multiplex Networks
- Authors: Hao Ren, Jiaojiao Jiang
- Abstract summary: This paper introduces LayerPlexRank, an algorithm that simultaneously assesses node centrality and layer influence in multiplex networks.
We substantiate the utility of LayerPlexRank with theoretical analyses and empirical validations on varied real-world datasets.
- Score: 4.130399938456945
- Abstract: As the calculation of centrality in complex networks becomes increasingly vital across technological, biological, and social systems, precise and scalable ranking methods are essential for understanding these networks. This paper introduces LayerPlexRank, an algorithm that simultaneously assesses node centrality and layer influence in multiplex networks using algebraic connectivity metrics. This method enhances the robustness of the ranking algorithm by effectively assessing structural changes across layers using random walk, considering the overall connectivity of the graph. We substantiate the utility of LayerPlexRank with theoretical analyses and empirical validations on varied real-world datasets, contrasting it with established centrality measures.
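The abstract does not spell out the algorithm itself, so the following is only a minimal illustrative sketch of the idea it describes, not the authors' implementation: assuming each layer of the multiplex is a networkx graph over a shared node set, layer influence is proxied by each layer's algebraic connectivity (the second-smallest Laplacian eigenvalue), and node centrality by a PageRank-style random walk on the influence-weighted aggregate of the layers. All function and parameter names (layerplexrank_sketch, alpha, tol) are illustrative assumptions.

```python
# Illustrative sketch only: layer influence from per-layer algebraic connectivity,
# node centrality from a random walk on the influence-weighted aggregate network.
import numpy as np
import networkx as nx

def layerplexrank_sketch(layers, alpha=0.85, tol=1e-10, max_iter=1000):
    """layers: list of networkx Graphs sharing the same node set (a multiplex)."""
    nodes = sorted(layers[0].nodes())
    n = len(nodes)

    # Layer influence: algebraic connectivity of each layer (0 if disconnected).
    layer_influence = np.array([
        nx.algebraic_connectivity(g) if nx.is_connected(g) else 0.0
        for g in layers
    ])
    if layer_influence.sum() == 0:
        layer_influence = np.ones(len(layers))
    layer_influence = layer_influence / layer_influence.sum()

    # Influence-weighted aggregate adjacency matrix.
    A = sum(w * nx.to_numpy_array(g, nodelist=nodes)
            for w, g in zip(layer_influence, layers))

    # Column-stochastic transition matrix; isolated nodes keep a zero column
    # and are handled by the uniform teleportation term below.
    col_sums = A.sum(axis=0)
    col_sums[col_sums == 0] = 1.0
    P = A / col_sums

    # PageRank-style power iteration for node centrality.
    x = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        x_new = alpha * (P @ x) + (1 - alpha) / n
        if np.abs(x_new - x).sum() < tol:
            break
        x = x_new
    return dict(zip(nodes, x)), layer_influence

# Toy multiplex: two layers over the same four nodes.
g1 = nx.Graph([(0, 1), (1, 2), (2, 3)])
g2 = nx.Graph([(0, 2), (1, 3), (0, 3)])
node_rank, layer_rank = layerplexrank_sketch([g1, g2])
print(node_rank, layer_rank)
```

In this sketch the layer weights and node scores are computed in two separate passes; the paper's stated contribution is to assess both jointly, so this is a simplification meant only to make the ingredients (algebraic connectivity, random walk, multiplex structure) concrete.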
Related papers
- Cost-Effective Community-Hierarchy-Based Mutual Voting Approach for Influence Maximization in Complex Networks [54.366995393644586]
Real-world applications usually place high demands on balancing the time and accuracy of influential-node identification.
This article proposes a novel approach called Cost-Effective Community-Hierarchy-Based Mutual Voting for influence maximization in complex networks.
The proposed approach outperforms 16 state-of-the-art techniques on the balance between time complexity and accuracy of influential-node identification.
arXiv Detail & Related papers (2024-09-21T06:32:28Z) - Rank Diminishing in Deep Neural Networks [71.03777954670323]
The rank of a neural network measures the information flowing across its layers.
It is an instance of a key structural condition that applies across broad domains of machine learning.
For neural networks, however, the intrinsic mechanism that yields low-rank structures remains vague and unclear.
arXiv Detail & Related papers (2022-06-13T12:03:32Z) - Decomposing neural networks as mappings of correlation functions [57.52754806616669]
We study the mapping between probability distributions implemented by a deep feed-forward network.
We identify essential statistics in the data, as well as different information representations that can be used by neural networks.
arXiv Detail & Related papers (2022-02-10T09:30:31Z) - Fast Topological Clustering with Wasserstein Distance [0.0]
We propose a novel and computationally practical topological clustering method that clusters complex networks with intricate topology.
Such networks are aggregated into clusters through a centroid-based clustering strategy based on both their topological and geometric structure.
The proposed method is demonstrated to be effective using both simulated networks and measured functional brain networks.
arXiv Detail & Related papers (2021-11-30T21:02:53Z) - A Modular Framework for Centrality and Clustering in Complex Networks [0.6423239719448168]
In this paper, we study two such important network analysis techniques, namely centrality and clustering.
An information-flow based model is adopted for clustering, which itself builds upon an information theoretic measure for computing centrality.
Our clustering naturally inherits the flexibility to accommodate edge directionality, as well as different interpretations and interplay between edge weights and node degrees.
arXiv Detail & Related papers (2021-11-23T03:01:29Z) - Subspace Clustering Based Analysis of Neural Networks [7.451579925406617]
We learn affinity graphs from the latent structure of a given neural network layer trained over a set of inputs.
We then use tools from Community Detection to quantify structures present in the input.
We analyze the learned affinity graphs of the final convolutional layer of the network and demonstrate how an input's local neighbourhood affects its classification by the network.
arXiv Detail & Related papers (2021-07-02T22:46:40Z) - Learning Structures for Deep Neural Networks [99.8331363309895]
We propose to adopt the efficient coding principle, rooted in information theory and developed in computational neuroscience.
We show that sparse coding can effectively maximize the entropy of the output signals.
Our experiments on a public image classification dataset demonstrate that using the structure learned from scratch by our proposed algorithm, one can achieve a classification accuracy comparable to the best expert-designed structure.
arXiv Detail & Related papers (2021-05-27T12:27:24Z) - Hierarchical Graph Neural Networks [0.0]
This paper aims to connect the dots between the traditional Neural Network and the Graph Neural Network architectures.
A Hierarchical Graph Neural Network architecture is proposed, supplementing the original input network layer with the hierarchy of auxiliary network layers.
It enables simultaneous learning of the individual node features along with the aggregated network features at variable resolution and uses them to improve the convergence and stability of the individual node feature learning.
arXiv Detail & Related papers (2021-05-07T16:47:18Z) - Classical and quantum random-walk centrality measures in multilayer networks [0.0]
Classifying the importance of nodes and node-layers is an important aspect of the study of multilayer networks.
It is common to calculate various centrality measures, which allow one to rank nodes and node-layers according to a variety of structural features.
We apply our framework to a variety of synthetic and real-world multilayer networks, and we identify marked differences between classical and quantum centrality measures. (A small sketch of the classical random-walk case appears after this list.)
arXiv Detail & Related papers (2020-12-13T21:23:29Z) - On the use of local structural properties for improving the efficiency of hierarchical community detection methods [77.34726150561087]
We study how local structural network properties can be used as proxies to improve the efficiency of hierarchical community detection.
We also check the performance impact of network prunings as an ancillary tactic to make hierarchical community detection more efficient.
arXiv Detail & Related papers (2020-09-15T00:16:12Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
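Relating to the "Classical and quantum random-walk centrality measures in multilayer networks" entry above, the sketch below shows only the classical case and only as a generic illustration: the stationary distribution of a random walk on a supra-adjacency matrix ranks node-layer pairs, which can then be marginalized into node and layer scores. The uniform inter-layer coupling of strength `omega` and all names are assumptions for the example, not details taken from that paper.

```python
# Illustrative classical random-walk occupation centrality on a two-layer multiplex.
import numpy as np
import networkx as nx

def supra_adjacency(layers, nodes, omega=1.0):
    """Block matrix: intra-layer adjacencies on the diagonal, omega*I between layers."""
    n, L = len(nodes), len(layers)
    S = np.zeros((n * L, n * L))
    for a, g in enumerate(layers):
        S[a*n:(a+1)*n, a*n:(a+1)*n] = nx.to_numpy_array(g, nodelist=nodes)
    for a in range(L):
        for b in range(L):
            if a != b:
                S[a*n:(a+1)*n, b*n:(b+1)*n] = omega * np.eye(n)
    return S

def occupation_centrality(S):
    """Stationary distribution of the walk with P[i, j] proportional to S[i, j];
    assumes omega > 0 so every row of S is nonzero."""
    P = S / S.sum(axis=1, keepdims=True)           # row-stochastic transition matrix
    vals, vecs = np.linalg.eig(P.T)                # left eigenvector for eigenvalue 1
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return np.abs(pi) / np.abs(pi).sum()

nodes = [0, 1, 2, 3]
g1 = nx.Graph([(0, 1), (1, 2), (2, 3)])
g2 = nx.Graph([(0, 2), (1, 3), (0, 3)])
S = supra_adjacency([g1, g2], nodes, omega=0.5)
pi = occupation_centrality(S)                      # one entry per (layer, node) pair
node_scores = pi.reshape(2, len(nodes)).sum(axis=0)   # marginalize over layers
layer_scores = pi.reshape(2, len(nodes)).sum(axis=1)  # marginalize over nodes
print(node_scores, layer_scores)
```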