Big Networks: A Survey
- URL: http://arxiv.org/abs/2008.03638v1
- Date: Sun, 9 Aug 2020 03:40:20 GMT
- Title: Big Networks: A Survey
- Authors: Hayat Dino Bedru, Shuo Yu, Xinru Xiao, Da Zhang, Liangtian Wan, He
Guo, Feng Xia
- Abstract summary: This paper introduces a new network science concept called the big network.
Big networks are generally large-scale with complicated, higher-order inner structures.
We first introduce the structural characteristics of big networks at three levels: micro, meso, and macro.
We then discuss some state-of-the-art advanced topics in big network analysis.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A network is a typical expressive form for representing complex systems
in terms of vertices and links, in which the pattern of interactions among the
components of the network is intricate. A network can be static, not changing
over time, or dynamic, evolving through time. The complexity of network analysis
changes under the new circumstance of explosively increasing network sizes. In
this paper, we introduce a new network science concept called the big network.
Big networks are generally large-scale with complicated, higher-order inner
structures. This paper proposes a guideline framework that gives insight into
the major topics of network science from the viewpoint of big networks. We first
introduce the structural characteristics of big networks at three levels: micro,
meso, and macro. We then discuss some state-of-the-art advanced topics in big
network analysis. Big network models and related approaches, including ranking
methods, partition approaches, and network embedding algorithms, are
systematically introduced. Some typical applications in big networks are then
reviewed, such as community detection, link prediction, and recommendation.
Moreover, we pinpoint some critical open issues that need further investigation.
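The survey discusses applications such as link prediction only at a high level; the sketch below is a minimal, hypothetical illustration of one classic similarity-based link-prediction heuristic (Jaccard neighborhood overlap), not a method from the paper itself. The toy graph and node names are assumptions for demonstration.

```python
from itertools import combinations

# Toy undirected graph as adjacency sets (hypothetical example).
# Edges: a-b, a-c, a-d, b-c, c-d, d-e.
graph = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c", "e"},
    "e": {"d"},
}

def jaccard_scores(adj):
    """Score every non-adjacent node pair by the Jaccard similarity
    of their neighborhoods: |N(u) & N(v)| / |N(u) | N(v)|."""
    scores = {}
    for u, v in combinations(sorted(adj), 2):
        if v in adj[u]:
            continue  # already linked; nothing to predict
        union = adj[u] | adj[v]
        if union:
            scores[(u, v)] = len(adj[u] & adj[v]) / len(union)
    return scores

scores = jaccard_scores(graph)
best = max(scores, key=scores.get)  # ('b', 'd'): they share neighbors a and c
```

Pairs with the largest score are predicted as the most likely missing links; here `("b", "d")` wins because b and d share two of their three combined neighbors.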
Related papers
- Riemannian Residual Neural Networks [58.925132597945634]
We show how to extend the residual neural network (ResNet) to general Riemannian manifolds.
ResNets have become ubiquitous in machine learning due to their beneficial learning properties, excellent empirical results, and easy-to-incorporate nature when building varied neural networks.
arXiv Detail & Related papers (2023-10-16T02:12:32Z)
- CS-MLGCN: Multiplex Graph Convolutional Networks for Community Search in Multiplex Networks [0.0]
We propose a query-driven graph convolutional network in multiplex networks, CS-MLGCN, that can capture flexible community structures.
Experiments on real-world graphs with ground-truth communities validate the quality of the solutions we obtain.
arXiv Detail & Related papers (2022-10-17T07:47:19Z)
- TeKo: Text-Rich Graph Neural Networks with External Knowledge [75.91477450060808]
We propose a novel text-rich graph neural network with external knowledge (TeKo)
We first present a flexible heterogeneous semantic network that incorporates high-quality entities.
We then introduce two types of external knowledge, namely structured triplets and unstructured entity descriptions.
arXiv Detail & Related papers (2022-06-15T02:33:10Z)
- Rank Diminishing in Deep Neural Networks [71.03777954670323]
The rank of a neural network measures the information flowing across its layers.
It is an instance of a key structural condition that applies across broad domains of machine learning.
For neural networks, however, the intrinsic mechanism that yields low-rank structures remains unclear.
arXiv Detail & Related papers (2022-06-13T12:03:32Z)
- Understanding the network formation pattern for better link prediction [4.8334761517444855]
We propose a novel method named Link prediction using Multiple Order Local Information (MOLI)
MOLI exploits the local information from neighbors at different distances, with parameters that can be set in a prior-driven way based on prior knowledge.
We show that MOLI outperforms the other 11 widely used link prediction algorithms on 11 different types of simulated and real-world networks.
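MOLI's exact scoring function is not given in this summary; the following is a generic multi-order sketch in the same spirit, weighting walk counts between two nodes at increasing distances (a Katz-style score). The adjacency matrix, the weights `betas`, and the node indices are all hypothetical choices for illustration.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def multi_order_score(A, u, v, betas):
    """Weighted sum of walk counts of length 1..len(betas) between u and v.
    Powers of the adjacency matrix A count walks of each length."""
    score, P = 0.0, A
    for b in betas:
        score += b * P[u][v]
        P = matmul(P, A)  # next-higher-order walk counts
    return score

# Toy graph: path 0-1-2-3 plus a shortcut edge 0-2.
A = [
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
]

# Score the candidate link (0, 3) with hypothetical decaying weights.
s = multi_order_score(A, 0, 3, [1.0, 0.5, 0.25])
```

Nodes 0 and 3 are not directly linked, but one length-2 walk (0-2-3) and one length-3 walk (0-1-2-3) contribute, so the decaying weights reward closer-range evidence more, as the summary describes.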
arXiv Detail & Related papers (2021-10-17T15:30:04Z)
- Learning low-rank latent mesoscale structures in networks [1.1470070927586016]
We present a new approach for describing low-rank mesoscale structures in networks.
We use several synthetic network models and empirical friendship, collaboration, and protein--protein interaction (PPI) networks.
We show how to denoise a corrupted network by using only the latent motifs that one learns directly from the corrupted network.
arXiv Detail & Related papers (2021-02-13T18:54:49Z)
- NetReAct: Interactive Learning for Network Summarization [60.18513812680714]
We present NetReAct, a novel interactive network summarization algorithm which supports the visualization of networks induced by text corpora to perform sensemaking.
We show how NetReAct is successful in generating high-quality summaries and visualizations that reveal hidden patterns better than other non-trivial baselines.
arXiv Detail & Related papers (2020-12-22T03:56:26Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Detecting Communities in Heterogeneous Multi-Relational Networks: A Message Passing based Approach [89.19237792558687]
Communities are a common characteristic of networks, including social, biological, and computer and information networks.
We propose an efficient message passing based algorithm to simultaneously detect communities for all homogeneous networks.
arXiv Detail & Related papers (2020-04-06T17:36:24Z)
- Quasi-Equivalence of Width and Depth of Neural Networks [10.365556153676538]
We investigate whether the design of artificial neural networks should have a directional preference.
Inspired by the De Morgan law, we establish a quasi-equivalence between the width and depth of ReLU networks.
Based on our findings, a deep network has a wide equivalent, subject to an arbitrarily small error.
arXiv Detail & Related papers (2020-02-06T21:17:32Z)
- Emergence of Network Motifs in Deep Neural Networks [0.35911228556176483]
We show that network science tools can be successfully applied to the study of artificial neural networks.
In particular, we study the emergence of network motifs in multi-layer perceptrons.
arXiv Detail & Related papers (2019-12-27T17:05:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.