Network representation learning: A macro and micro view
- URL: http://arxiv.org/abs/2111.10772v1
- Date: Sun, 21 Nov 2021 08:58:51 GMT
- Title: Network representation learning: A macro and micro view
- Authors: Xueyi Liu, Jie Tang
- Abstract summary: We conduct a comprehensive review of current literature on network representation learning.
Existing algorithms can be categorized into three groups: shallow embedding models, heterogeneous network embedding models, and graph neural network based models.
One advantage of the survey is that we systematically study the theoretical foundations underlying the different categories of algorithms.
- Score: 9.221196170951702
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: A graph is a universal data structure that is widely used to organize
real-world data. Various real-world networks, such as transportation, social,
and academic networks, can be represented by graphs. Recent years have witnessed
rapid development in representing the vertices of a network in a
low-dimensional vector space, referred to as network representation learning.
Representation learning can facilitate the design of new algorithms on the
graph data. In this survey, we conduct a comprehensive review of current
literature on network representation learning. Existing algorithms can be
categorized into three groups: shallow embedding models, heterogeneous network
embedding models, and graph neural network based models. We review state-of-the-art
algorithms for each category and discuss the essential differences between
these algorithms. One advantage of the survey is that we systematically study
the theoretical foundations underlying the different categories of
algorithms, which offers deep insights for a better understanding of the
development of the network representation learning field.
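As a concrete illustration of the shallow embedding family discussed in the survey, the sketch below embeds the nodes of a toy graph by factorizing a random-walk co-occurrence matrix, loosely in the spirit of DeepWalk-style methods. The toy graph, the walk hyperparameters, and the log-count surrogate for a PMI matrix are all illustrative assumptions, not the survey's own algorithm.

```python
import numpy as np

# Toy undirected graph (two triangles joined at nodes 2 and 3), as adjacency lists.
graph = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}

rng = np.random.default_rng(0)

def random_walks(graph, num_walks=200, walk_len=6):
    """Generate fixed-length uniform random walks starting from every node."""
    walks = []
    for _ in range(num_walks):
        for start in graph:
            walk = [start]
            for _ in range(walk_len - 1):
                walk.append(rng.choice(graph[walk[-1]]))
            walks.append(walk)
    return walks

def shallow_embed(graph, dim=2, window=2):
    """Shallow embedding: factorize a walk co-occurrence matrix via truncated SVD."""
    n = len(graph)
    C = np.zeros((n, n))
    for walk in random_walks(graph):
        for i, u in enumerate(walk):
            for v in walk[max(0, i - window): i + window + 1]:
                if u != v:
                    C[u, v] += 1
    # Shifted log co-occurrence as a crude PMI surrogate, then keep top-dim factors.
    M = np.log1p(C)
    U, S, _ = np.linalg.svd(M)
    return U[:, :dim] * np.sqrt(S[:dim])

emb = shallow_embed(graph)
d = lambda a, b: np.linalg.norm(emb[a] - emb[b])
# Nodes in the same triangle (0 and 1) should land closer than nodes across it (0 and 5).
print(d(0, 1) < d(0, 5))
```

This mirrors the theoretical observation, studied in the survey's category analysis, that random-walk shallow embeddings are closely related to matrix factorization.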
Related papers
- A Comprehensive Survey on Deep Graph Representation Learning [26.24869157855632]
Graph representation learning aims to encode high-dimensional sparse graph-structured data into low-dimensional dense vectors.
Traditional methods have limited model capacity, which restricts learning performance.
Deep graph representation learning has shown great potential and advantages over shallow (traditional) methods.
arXiv Detail & Related papers (2023-04-11T08:23:52Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Network Representation Learning: From Preprocessing, Feature Extraction to Node Embedding [9.844802841686105]
Network representation learning (NRL) advances the conventional graph mining of social networks, knowledge graphs, and complex biomedical and physics information networks.
This survey paper reviews the design principles and the different node embedding techniques for network representation learning over homogeneous networks.
arXiv Detail & Related papers (2021-10-14T17:46:37Z) - Network representation learning systematic review: ancestors and current development state [1.0312968200748116]
We present a systematic survey of network representation learning, known as network embedding, from birth to the current development state.
We also provide formal definitions of the basic concepts required to understand network representation learning.
Most commonly used downstream tasks to evaluate embeddings, their evaluation metrics and popular datasets are highlighted.
arXiv Detail & Related papers (2021-09-14T14:44:44Z) - Graph Neural Networks: Methods, Applications, and Opportunities [1.2183405753834562]
This article provides a comprehensive survey of graph neural networks (GNNs) in each learning setting.
The approaches for each learning task are analyzed from both theoretical as well as empirical standpoints.
Various applications and benchmark datasets are also provided, along with open challenges still plaguing the general applicability of GNNs.
arXiv Detail & Related papers (2021-08-24T13:46:19Z) - Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model on a real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z) - Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z) - Machine Learning on Graphs: A Model and Comprehensive Taxonomy [22.73365477040205]
We bridge the gap between graph neural networks, network embedding and graph regularization models.
Specifically, we propose a Graph Decoder Model (GRAPHEDM), which generalizes popular algorithms for semi-supervised learning on graphs.
arXiv Detail & Related papers (2020-05-07T18:00:02Z) - Deep Learning for Learning Graph Representations [58.649784596090385]
Mining graph data has become a popular research topic in computer science.
The huge amount of network data has posed great challenges for efficient analysis.
This motivates the advent of graph representation learning, which maps a graph into a low-dimensional vector space.
arXiv Detail & Related papers (2020-01-02T02:13:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.