Self-Supervised Learning of Contextual Embeddings for Link Prediction in
Heterogeneous Networks
- URL: http://arxiv.org/abs/2007.11192v3
- Date: Sun, 21 Mar 2021 20:42:38 GMT
- Title: Self-Supervised Learning of Contextual Embeddings for Link Prediction in
Heterogeneous Networks
- Authors: Ping Wang, Khushbu Agarwal, Colby Ham, Sutanay Choudhury, Chandan K.
Reddy
- Abstract summary: We develop SLiCE, a framework that bridges static representation learning methods, which use global information from the entire graph, with localized attention-driven mechanisms to learn contextual node representations.
We first pre-train our model in a self-supervised manner by introducing higher-order semantic associations and masking nodes.
We also interpret the semantic association matrix and demonstrate its utility and relevance in making successful link predictions between heterogeneous nodes in the network.
- Score: 11.540329725077843
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Representation learning methods for heterogeneous networks produce a
low-dimensional vector embedding for each node that is typically fixed for all
tasks involving the node. Many of the existing methods focus on obtaining a
static vector representation for a node in a way that is agnostic to the
downstream application where it is being used. In practice, however, downstream
tasks such as link prediction require specific contextual information that can
be extracted from the subgraphs related to the nodes provided as input to the
task. To tackle this challenge, we develop SLiCE, a framework that bridges static
representation learning methods, which use global information from the entire graph,
with localized attention-driven mechanisms to learn contextual node
representations. We first pre-train our model in a self-supervised manner by
introducing higher-order semantic associations and masking nodes, and then
fine-tune our model for a specific link prediction task. Instead of training
node representations by aggregating information from all semantic neighbors
connected via metapaths, we automatically learn the composition of different
metapaths that characterize the context for a specific task without the need
for any pre-defined metapaths. SLiCE significantly outperforms both static and
contextual embedding learning methods on several publicly available benchmark
network datasets. We also interpret the semantic association matrix and demonstrate
its utility and relevance in making successful link predictions between
heterogeneous nodes in the network.
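
The abstract describes a two-stage recipe: self-supervised pre-training with masked nodes over sampled context subgraphs, followed by fine-tuning for link prediction. The following is a minimal, illustrative PyTorch sketch of that recipe; it is not the authors' SLiCE implementation, and the subgraph-as-sequence convention, model sizes, and helper names are assumptions made for the example.

    # Illustrative sketch only (not the authors' SLiCE code): a context subgraph is
    # flattened into a node-ID sequence, the encoder is pre-trained by recovering
    # masked node identities, and the same encoder is then fine-tuned to score links.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ContextSubgraphEncoder(nn.Module):
        """Self-attention encoder over node-ID sequences sampled from context subgraphs."""

        def __init__(self, num_nodes: int, dim: int = 128, heads: int = 4, layers: int = 2):
            super().__init__()
            self.mask_id = num_nodes                      # extra ID reserved for [MASK]
            self.embed = nn.Embedding(num_nodes + 1, dim)
            layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=layers)
            self.node_head = nn.Linear(dim, num_nodes)    # recovers masked node identities
            self.link_head = nn.Linear(2 * dim, 1)        # scores a (source, target) pair

        def forward(self, node_ids: torch.Tensor) -> torch.Tensor:
            return self.encoder(self.embed(node_ids))     # (batch, seq_len, dim)

        def pretrain_loss(self, node_ids: torch.Tensor, mask_prob: float = 0.15) -> torch.Tensor:
            # Self-supervised objective: mask random nodes and predict what they were.
            mask = torch.rand_like(node_ids, dtype=torch.float) < mask_prob
            logits = self.node_head(self(node_ids.masked_fill(mask, self.mask_id)))
            return F.cross_entropy(logits[mask], node_ids[mask])

        def link_logit(self, node_ids: torch.Tensor) -> torch.Tensor:
            # Assumed convention: positions 0 and 1 hold the candidate source and target,
            # and the remaining positions hold their sampled context subgraph.
            h = self(node_ids)
            return self.link_head(torch.cat([h[:, 0], h[:, 1]], dim=-1)).squeeze(-1)

    # Toy usage: pre-train first, then fine-tune for link prediction.
    model = ContextSubgraphEncoder(num_nodes=1000)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    subgraphs = torch.randint(0, 1000, (32, 8))           # 32 sampled context sequences
    opt.zero_grad(); model.pretrain_loss(subgraphs).backward(); opt.step()
    labels = torch.randint(0, 2, (32,)).float()           # 1 = link exists, 0 = no link
    loss = F.binary_cross_entropy_with_logits(model.link_logit(subgraphs), labels)
    opt.zero_grad(); loss.backward(); opt.step()

In the paper, the composition of metapaths is captured by a learned semantic association matrix; in this sketch a plain transformer's self-attention stands in for that mechanism.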
Related papers
- Domain-adaptive Message Passing Graph Neural Network [67.35534058138387]
Cross-network node classification (CNNC) aims to classify nodes in a label-deficient target network by transferring the knowledge from a source network with abundant labels.
We propose a domain-adaptive message passing graph neural network (DM-GNN), which integrates graph neural network (GNN) with conditional adversarial domain adaptation.
arXiv Detail & Related papers (2023-08-31T05:26:08Z) - BSAL: A Framework of Bi-component Structure and Attribute Learning for
Link Prediction [33.488229191263564]
We propose a bi-component structural and attribute learning framework (BSAL) that is designed to adaptively leverage information from topology and feature spaces.
BSAL constructs a semantic topology via the node attributes and then learns embeddings with respect to the semantic view.
It provides a flexible and easy-to-implement solution to adaptively incorporate the information carried by the node attributes.
arXiv Detail & Related papers (2022-04-18T03:12:13Z) - SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z) - Topic-aware latent models for representation learning on networks [5.304857921982132]
We introduce TNE, a generic framework that enriches node embeddings obtained from random walk-based approaches with topic-based information.
We evaluate our methodology in two downstream tasks: node classification and link prediction.
arXiv Detail & Related papers (2021-11-10T08:52:52Z) - Network Representation Learning: From Preprocessing, Feature Extraction
to Node Embedding [9.844802841686105]
Network representation learning (NRL) advances the conventional graph mining of social networks, knowledge graphs, and complex biomedical and physics information networks.
This survey paper reviews the design principles and the different node embedding techniques for network representation learning over homogeneous networks.
arXiv Detail & Related papers (2021-10-14T17:46:37Z) - Temporal Graph Network Embedding with Causal Anonymous Walks
Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We also provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z) - Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN) that explore the implicit semantics by learning latent semantic paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z) - Towards Efficient Scene Understanding via Squeeze Reasoning [71.1139549949694]
We propose a novel framework called Squeeze Reasoning.
Instead of propagating information on the spatial map, we first learn to squeeze the input feature into a channel-wise global vector.
We show that our approach can be modularized as an end-to-end trained block and can be easily plugged into existing networks.
arXiv Detail & Related papers (2020-11-06T12:17:01Z) - Progressive Graph Convolutional Networks for Semi-Supervised Node
Classification [97.14064057840089]
Graph convolutional networks have been successful in addressing graph-based tasks such as semi-supervised node classification.
We propose a method to automatically build compact and task-specific graph convolutional networks.
arXiv Detail & Related papers (2020-03-27T08:32:16Z) - Self-Supervised Graph Representation Learning via Global Context
Prediction [31.07584920486755]
This paper introduces a novel self-supervised strategy for graph representation learning by exploiting natural supervision provided by the data itself.
We randomly select pairs of nodes in a graph and train a well-designed neural net to predict the contextual position of one node relative to the other.
Our underlying hypothesis is that the representations learned from such within-graph context would capture the global topology of the graph and finely characterize the similarity and differentiation between nodes.
arXiv Detail & Related papers (2020-03-03T15:46:01Z)
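
The last related paper above describes a concrete pretext task: randomly select node pairs and train a network to predict the contextual position of one node relative to the other. A minimal, self-contained sketch of that idea (with an assumed toy graph, distance bucketing, and model sizes, not the paper's actual setup) could look like this:

    # Minimal sketch of the global-context-prediction pretext task, under toy assumptions:
    # sample node pairs and train a small network to predict the contextual position of
    # one node relative to the other, here a bucketed shortest-path distance.
    import networkx as nx
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    G = nx.karate_club_graph()                            # stand-in graph; any graph works
    num_nodes, dim, num_buckets = G.number_of_nodes(), 64, 4

    embed = nn.Embedding(num_nodes, dim)
    mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, num_buckets))
    opt = torch.optim.Adam(list(embed.parameters()) + list(mlp.parameters()), lr=1e-3)

    def bucket(u: int, v: int) -> int:
        """Map shortest-path distance to a small set of relative-position classes."""
        return min(nx.shortest_path_length(G, u, v), num_buckets - 1)

    for _ in range(100):                                  # self-supervised training loop
        u, v = torch.randint(0, num_nodes, (2,)).tolist()
        target = torch.tensor([bucket(u, v)])
        pair = torch.cat([embed(torch.tensor([u])), embed(torch.tensor([v]))], dim=-1)
        loss = F.cross_entropy(mlp(pair), target)
        opt.zero_grad(); loss.backward(); opt.step()
    # The learned `embed` table can then be reused for downstream node and link tasks.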