Network representation learning systematic review: ancestors and current
development state
- URL: http://arxiv.org/abs/2109.07583v1
- Date: Tue, 14 Sep 2021 14:44:44 GMT
- Title: Network representation learning systematic review: ancestors and current
development state
- Authors: Amina Amara, Mohamed Ali Hadj Taieb, Mohamed Ben Aouicha
- Abstract summary: We present a systematic survey of network representation learning, known as network embedding, from birth to the current development state.
We also provide formal definitions of the basic concepts required to understand network representation learning.
The most commonly used downstream tasks for evaluating embeddings, their evaluation metrics, and popular datasets are highlighted.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Real-world information networks increasingly arise across various
disciplines, including online social networks and citation networks. These
network data are generally characterized by sparseness, nonlinearity, and
heterogeneity, which pose distinct challenges for network analytics tasks that
aim to capture the inherent properties of network data. Artificial intelligence
and machine learning have recently been leveraged as powerful systems to learn
insights from network data and address these challenges. As part of machine
learning techniques, graph embedding approaches were originally conceived for
graphs constructed from feature-represented datasets, such as image datasets,
in which links between nodes are explicitly defined. These traditional
approaches cannot cope with the challenges of network data. As a new learning
paradigm, network representation learning has been proposed to map a real-world
information network into a low-dimensional space while preserving the inherent
properties of the network. In this paper, we present a systematic,
comprehensive survey of network representation learning, also known as network
embedding, from its birth to the current state of development. Through this
survey, we provide a comprehensive view of the reasons behind the emergence of
network embedding and of the types of settings and models used in the network
embedding pipeline. We introduce a brief history of representation learning and
of word representation learning, the ancestor of network embedding. We also
provide formal definitions of the basic concepts required to understand network
representation learning, followed by a description of the network embedding
pipeline. The most commonly used downstream tasks for evaluating embeddings,
their evaluation metrics, and popular datasets are highlighted. Finally, we
present the open-source libraries for network embedding.
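As a concrete illustration of mapping a network into a low-dimensional space while preserving its structure, here is a minimal spectral-embedding sketch in NumPy. This is a classic shallow embedding technique, not the specific pipeline surveyed above, and the toy graph is invented purely for illustration:

```python
import numpy as np

# A toy undirected graph: two triangles joined by a single bridge edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6

# Build the adjacency matrix A.
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt

# Embed each node using the eigenvectors of the d smallest
# non-trivial eigenvalues (classic spectral embedding).
d = 2
eigvals, eigvecs = np.linalg.eigh(L)     # eigenvalues in ascending order
embeddings = eigvecs[:, 1:d + 1]         # skip the trivial first eigenvector

print(embeddings.shape)  # (6, 2)
```

Nodes in the same triangle receive similar coordinates, so downstream tasks such as node clustering or classification can operate directly on the low-dimensional vectors.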
Related papers
- Coding schemes in neural networks learning classification tasks [52.22978725954347]
We investigate fully-connected, wide neural networks learning classification tasks.
We show that the networks acquire strong, data-dependent features.
Surprisingly, the nature of the internal representations depends crucially on the neuronal nonlinearity.
arXiv Detail & Related papers (2024-06-24T14:50:05Z)
- Leveraging advances in machine learning for the robust classification and interpretation of networks [0.0]
Simulation approaches involve selecting a suitable network generative model such as Erdős–Rényi or small-world.
We utilize advances in interpretable machine learning to classify simulated networks by our generative models based on various network attributes.
arXiv Detail & Related papers (2024-03-20T00:24:23Z)
- Network representation learning: A macro and micro view [9.221196170951702]
We conduct a comprehensive review of current literature on network representation learning.
Existing algorithms can be categorized into three groups: shallow embedding models, heterogeneous network embedding models, and graph neural network based models.
One advantage of the survey is that we systematically study the theoretical foundations underlying the different categories of algorithms.
arXiv Detail & Related papers (2021-11-21T08:58:51Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach to dynamic network representation learning based on the Temporal Graph Network.
We also provide a benchmark pipeline for evaluating temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network based contrastive learning model is proposed to learn informative embeddings from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- NetReAct: Interactive Learning for Network Summarization [60.18513812680714]
We present NetReAct, a novel interactive network summarization algorithm that supports the visualization of networks induced by text corpora to perform sensemaking.
We show how NetReAct is successful in generating high-quality summaries and visualizations that reveal hidden patterns better than other non-trivial baselines.
arXiv Detail & Related papers (2020-12-22T03:56:26Z)
- How Researchers Use Diagrams in Communicating Neural Network Systems [5.064404027153093]
This paper reports on a study into the use of neural network system diagrams.
We find a high diversity of usage, perception, and preference in both the creation and interpretation of diagrams.
Considering the interview data alongside existing guidance, we propose guidelines aiming to improve the way in which neural network system diagrams are constructed.
arXiv Detail & Related papers (2020-08-28T10:21:03Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges, which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and offers adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Neural networks adapting to datasets: learning network size and topology [77.34726150561087]
We introduce a flexible setup allowing for a neural network to learn both its size and topology during the course of a gradient-based training.
The resulting network has the structure of a graph tailored to the particular learning task and dataset.
arXiv Detail & Related papers (2020-06-22T12:46:44Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
- Temporal Network Representation Learning via Historical Neighborhoods Aggregation [28.397309507168128]
We propose the Embedding via Historical Neighborhoods Aggregation (EHNA) algorithm.
We first propose a temporal random walk that can identify relevant nodes in historical neighborhoods.
Then we apply a deep learning model which uses a custom attention mechanism to induce node embeddings.
arXiv Detail & Related papers (2020-03-30T04:18:48Z)
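The temporal random walk idea in the entry above can be sketched minimally in plain Python. This is a generic illustration of time-respecting walks, not the EHNA algorithm itself, and the toy edge list and function names are invented for the example:

```python
import random

# Hypothetical timestamped edges: (source, destination, time).
edges = [
    (0, 1, 1), (1, 2, 2), (1, 3, 3), (2, 3, 4), (3, 0, 5),
]

# Adjacency list keyed by source node, keeping edge timestamps.
adj = {}
for u, v, t in edges:
    adj.setdefault(u, []).append((v, t))

def temporal_random_walk(start, length, rng=random):
    """Random walk that only follows edges with non-decreasing
    timestamps, so every step respects the temporal order of
    interactions in the network."""
    walk, t_now, node = [start], float("-inf"), start
    for _ in range(length):
        candidates = [(v, t) for v, t in adj.get(node, []) if t >= t_now]
        if not candidates:
            break  # no time-respecting continuation from this node
        node, t_now = rng.choice(candidates)
        walk.append(node)
    return walk

random.seed(0)
walk = temporal_random_walk(0, 4)
print(walk)
```

Walks sampled this way only visit nodes reachable through chronologically consistent interaction sequences; an embedding model can then treat them the way skip-gram models treat word sequences.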
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.