Network Representation Learning: From Traditional Feature Learning to Deep Learning
- URL: http://arxiv.org/abs/2103.04339v1
- Date: Sun, 7 Mar 2021 12:31:33 GMT
- Title: Network Representation Learning: From Traditional Feature Learning to Deep Learning
- Authors: Ke Sun, Lei Wang, Bo Xu, Wenhong Zhao, Shyh Wei Teng, Feng Xia
- Abstract summary: Network representation learning (NRL) is an effective graph analytics technique that enables users to gain a deep understanding of the hidden characteristics of graph data.
It has been successfully applied to many real-world tasks related to network science, such as social network data processing, biological information processing, and recommender systems.
- Score: 13.18795412068976
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Network representation learning (NRL) is an effective graph analytics
technique that enables users to gain a deep understanding of the hidden
characteristics of graph data. It has been successfully applied to many
real-world tasks related to network science, such as social network data
processing, biological information processing, and recommender systems. Deep
learning is a powerful tool for learning data features. However, generalizing
deep learning to graph-structured data is non-trivial, because graphs differ
from regular data such as images, which carry spatial structure, and sounds,
which carry temporal structure. Recently, researchers have proposed many deep
learning-based methods for NRL. In this survey, we trace classical NRL from
traditional feature learning methods to deep learning-based models, analyze the
relationships between them, and summarize the latest progress. Finally, we
discuss open issues in NRL and point out future directions for the field.
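To make the graph-versus-grid contrast concrete, below is a minimal sketch of one widely used deep NRL building block, a single graph convolution layer in the style of Kipf and Welling's GCN. This is an illustrative example, not code from the survey; the toy graph, features, and weights are assumptions.

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One graph convolution layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).

    The normalized adjacency mixes each node's features with its
    neighbors', which is how convolution is generalized from grids
    (images, audio) to irregular graph structure.
    """
    a_hat = adj + np.eye(adj.shape[0])          # add self-loops
    deg = a_hat.sum(axis=1)                     # node degrees
    d_inv_sqrt = np.diag(deg ** -0.5)           # D^-1/2
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt    # symmetric normalization
    return np.maximum(0, a_norm @ features @ weights)  # ReLU activation

# Toy 4-node path graph (0-1-2-3), 3-d input features, 2-d embeddings.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
features = rng.normal(size=(4, 3))
weights = rng.normal(size=(3, 2))
embeddings = gcn_layer(adj, features, weights)
print(embeddings.shape)  # (4, 2): one learned representation per node
```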
Related papers
- Multiway Multislice PHATE: Visualizing Hidden Dynamics of RNNs through Training [6.326396282553267]
Recurrent neural networks (RNNs) are a widely used tool for sequential data analysis; however, they are still often seen as black boxes of computation.
Here, we present Multiway Multislice PHATE (MM-PHATE), a novel method for visualizing the evolution of RNNs' hidden states.
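As a rough sketch of the data such a method operates on (our toy construction, not MM-PHATE itself): hidden states are recorded across training epochs, time steps, and hidden units, giving a multiway tensor that can then be embedded for visualization.

```python
import numpy as np

rng = np.random.default_rng(4)

def rnn_hidden_states(x, w_in, w_rec):
    """Run a vanilla RNN over sequence x and return all hidden states."""
    h = np.zeros(w_rec.shape[0])
    states = []
    for x_t in x:
        h = np.tanh(w_in * x_t + w_rec @ h)
        states.append(h.copy())
    return np.array(states)  # (timesteps, units)

timesteps, units, epochs = 20, 8, 5
x = rng.normal(size=timesteps)
w_in = rng.normal(size=units)
w_rec = 0.5 * rng.normal(size=(units, units))

# Record hidden states after each (mock) training epoch into an
# (epochs, timesteps, units) tensor -- the multiway structure that a
# method like MM-PHATE embeds for visualization.
tensor = []
for epoch in range(epochs):
    w_rec -= 0.01 * rng.normal(size=w_rec.shape)  # stand-in for a training update
    tensor.append(rnn_hidden_states(x, w_in, w_rec))
tensor = np.stack(tensor)
print(tensor.shape)  # (5, 20, 8)
```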
arXiv Detail & Related papers (2024-06-04T05:05:27Z)
- Graph Neural Networks for Tabular Data Learning: A Survey with Taxonomy and Directions [10.753191494611892]
We dive into Tabular Data Learning (TDL) using Graph Neural Networks (GNNs).
GNNs have garnered significant interest and application across various Tabular Data Learning domains.
This survey serves as a resource for researchers and practitioners, offering a thorough understanding of GNNs' role in revolutionizing TDL.
arXiv Detail & Related papers (2023-11-25T10:46:06Z)
- Exploring Causal Learning through Graph Neural Networks: An In-depth Review [12.936700685252145]
We introduce a novel taxonomy that encompasses various state-of-the-art GNN methods employed in studying causality.
GNNs are further categorized based on their applications in the causality domain.
This review also touches upon the application of causal learning across diverse sectors.
arXiv Detail & Related papers (2023-06-24T10:21:11Z)
- Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-01-30T12:38:31Z)
- Deep networks for system identification: a Survey [56.34005280792013]
System identification learns mathematical descriptions of dynamic systems from input-output data.
The main aim of the identified model is to predict new data from previous observations.
We discuss architectures commonly adopted in the literature, like feedforward, convolutional, and recurrent networks.
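As a toy illustration of the basic task (ours, not from the survey), the sketch below identifies a second-order linear ARX model from simulated input-output data by least squares; the system coefficients are made up.

```python
import numpy as np

# Simulate a toy second-order system:
# y[t] = 0.6*y[t-1] - 0.2*y[t-2] + 0.5*u[t-1] + noise
rng = np.random.default_rng(1)
T = 500
u = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.6 * y[t-1] - 0.2 * y[t-2] + 0.5 * u[t-1] + 0.01 * rng.normal()

# Build the ARX regressor matrix from past outputs and inputs
# (rows correspond to t = 2..T-1, columns to y[t-1], y[t-2], u[t-1]).
phi = np.column_stack([y[1:T-1], y[0:T-2], u[1:T-1]])
target = y[2:T]

# Least-squares estimate of the ARX coefficients.
theta, *_ = np.linalg.lstsq(phi, target, rcond=None)
print(theta)  # approximately [0.6, -0.2, 0.5]

# One-step-ahead prediction of new data from previous observations.
y_pred = phi @ theta
```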
arXiv Detail & Related papers (2022-06-18T08:41:42Z)
- Piecewise Linear Neural Networks and Deep Learning [27.02556725989978]
PieceWise Linear Neural Networks (PWLNNs) have proven successful in various fields, most recently in deep learning.
In 1977, the canonical representation pioneered work on shallow PWLNNs learned by incremental design.
In 2010, the Rectified Linear Unit (ReLU) spurred the widespread adoption of PWLNNs in deep learning.
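To see why ReLU networks are piecewise linear, consider this small sketch with made-up weights (ours, not the paper's): a one-hidden-layer ReLU network of a scalar input changes slope only where some hidden unit's pre-activation crosses zero.

```python
import numpy as np

# A tiny one-hidden-layer ReLU network with scalar input and output.
w1 = np.array([1.0, -2.0, 0.5])   # hidden weights (3 units)
b1 = np.array([0.0, 1.0, -1.0])   # hidden biases
w2 = np.array([1.0, 0.5, -2.0])   # output weights

def f(x):
    return w2 @ np.maximum(0.0, w1 * x + b1)

# Each hidden unit kinks where w1*x + b1 = 0, so f is piecewise linear
# with at most 3 breakpoints.
breakpoints = sorted(-b1 / w1)
print("breakpoints:", breakpoints)

# Verify: the slope of f is constant within each region between breakpoints.
regions = [-3.0] + breakpoints + [3.0]
for lo, hi in zip(regions[:-1], regions[1:]):
    mid = (lo + hi) / 2
    slope = (f(mid + 1e-4) - f(mid)) / 1e-4
    print(f"region ({lo:.2f}, {hi:.2f}): slope ~ {slope:.3f}")
```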
arXiv Detail & Related papers (2021-07-19T13:57:13Z)
- Reasoning-Modulated Representations [85.08205744191078]
We study a common setting where our task is not purely opaque: we often have access to information about the underlying system (e.g., that observations must obey certain laws of physics).
Our approach paves the way for a new class of data-efficient representation learning methods.
arXiv Detail & Related papers (2021-06-12T13:05:11Z)
- What can linearized neural networks actually say about generalization? [67.83999394554621]
In certain infinitely-wide neural networks, the neural tangent kernel (NTK) theory fully characterizes generalization.
We show that the linear approximations can indeed rank the learning complexity of certain tasks for neural networks.
Our work provides concrete examples of novel deep learning phenomena which can inspire future theoretical research.
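For intuition (a toy sketch of ours, not the paper's experiments): a network is linearized around its initialization via the first-order Taylor expansion f_lin(x; θ) = f(x; θ0) + ∇_θ f(x; θ0) · (θ − θ0), and the example below checks numerically that the approximation is tight for small parameter updates.

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x, theta):
    """Tiny two-layer network: theta packs w1 (4x1) and w2 (1x4)."""
    w1, w2 = theta[:4].reshape(4, 1), theta[4:].reshape(1, 4)
    return float(w2 @ np.tanh(w1 @ x))

def grad_theta(x, theta, eps=1e-6):
    """Numerical gradient of f with respect to the parameters."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta); d[i] = eps
        g[i] = (f(x, theta + d) - f(x, theta - d)) / (2 * eps)
    return g

x = np.array([[0.7]])
theta0 = rng.normal(size=8)          # initialization
delta = 0.01 * rng.normal(size=8)    # small parameter update
g = grad_theta(x, theta0)

exact = f(x, theta0 + delta)
linear = f(x, theta0) + g @ delta    # first-order (NTK-style) approximation
print(exact, linear)                 # nearly identical for small updates
```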
arXiv Detail & Related papers (2021-06-12T13:05:11Z)
- Node2Seq: Towards Trainable Convolutions in Graph Neural Networks [59.378148590027735]
We propose a graph network layer, known as Node2Seq, to learn node embeddings with explicitly trainable weights for different neighboring nodes.
For a target node, our method sorts its neighboring nodes via an attention mechanism and then employs 1D convolutional neural networks (CNNs) to enable explicit weights for information aggregation.
In addition, we propose to incorporate non-local information for feature learning in an adaptive manner based on the attention scores.
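The described aggregation can be sketched as follows (our NumPy paraphrase with made-up dimensions, not the authors' implementation): score the neighbors against the target node, sort them by score, then apply position-specific 1D convolution weights over the sorted sequence.

```python
import numpy as np

def node2seq_aggregate(target, neighbors, conv_kernel):
    """Sketch of Node2Seq-style aggregation for one target node.

    target:      (d,) feature vector of the target node
    neighbors:   (k, d) feature matrix of its k neighbors
    conv_kernel: (k, d) learnable position-specific weights (a 1D conv
                 of kernel size k applied once over the sorted sequence)
    """
    # 1. Attention: score each neighbor against the target, then softmax.
    scores = neighbors @ target
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()

    # 2. Sort neighbors by attention score (descending) to form a sequence.
    order = np.argsort(-attn)
    seq = neighbors[order]

    # 3. 1D convolution over the sequence: each position gets its own
    #    explicit weight vector, unlike the shared weights of mean pooling.
    return (conv_kernel * seq).sum(axis=0)

rng = np.random.default_rng(3)
target = rng.normal(size=8)
neighbors = rng.normal(size=(5, 8))
kernel = rng.normal(size=(5, 8))
print(node2seq_aggregate(target, neighbors, kernel).shape)  # (8,)
```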
arXiv Detail & Related papers (2021-01-06T03:05:37Z)
- A Practical Tutorial on Graph Neural Networks [49.919443059032226]
Graph neural networks (GNNs) have recently grown in popularity in the field of artificial intelligence (AI).
This tutorial exposes the power and novelty of GNNs to AI practitioners.
arXiv Detail & Related papers (2020-10-11T12:36:17Z)
- Lifelong Learning of Graph Neural Networks for Open-World Node Classification [3.364554138758565]
Real-world graphs often evolve over time, and even new classes may arise.
We model these challenges as an instance of lifelong learning.
In this work, we systematically analyze the influence of implicit and explicit knowledge.
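One ingredient of such an open-world setup can be sketched simply (a toy illustration of ours, not the paper's method): when a previously unseen class appears, the classifier's output layer is expanded in place while the weights for known classes are retained.

```python
import numpy as np

rng = np.random.default_rng(5)

class ExpandableClassifier:
    """Linear classifier whose output layer grows as new classes arrive."""

    def __init__(self, dim):
        self.dim = dim
        self.w = np.zeros((0, dim))  # one weight row per known class
        self.classes = []

    def observe(self, label):
        # Open-world step: expand the output layer for a previously unseen
        # class while keeping the already-learned rows intact.
        if label not in self.classes:
            self.classes.append(label)
            self.w = np.vstack([self.w, 0.01 * rng.normal(size=self.dim)])

    def predict(self, x):
        return self.classes[int(np.argmax(self.w @ x))]

clf = ExpandableClassifier(dim=4)
for label in ["paper", "author", "paper", "venue"]:  # classes arrive over time
    clf.observe(label)
print(len(clf.classes), clf.w.shape)    # 3 (3, 4)
print(clf.predict(rng.normal(size=4)))  # predicts one of the known classes
```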
arXiv Detail & Related papers (2020-06-25T14:03:31Z)