GAHNE: Graph-Aggregated Heterogeneous Network Embedding
- URL: http://arxiv.org/abs/2012.12517v1
- Date: Wed, 23 Dec 2020 07:11:30 GMT
- Title: GAHNE: Graph-Aggregated Heterogeneous Network Embedding
- Authors: Xiaohe Li, Lijie Wen, Chen Qian, Jianmin Wang
- Abstract summary: Heterogeneous network embedding aims to embed nodes into low-dimensional vectors which capture rich intrinsic information of heterogeneous networks.
Existing models either depend on manually designing meta-paths, ignore mutual effects between different semantics, or omit some aspects of information from global networks.
In the GAHNE model, we develop several mechanisms that can aggregate semantic representations from different single-type sub-networks as well as fuse the global information into final embeddings.
- Score: 32.44836376873812
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world networks are often composed of different types of nodes and edges
with rich semantics, widely known as heterogeneous information networks (HINs).
Heterogeneous network embedding aims to embed nodes into low-dimensional
vectors which capture rich intrinsic information of heterogeneous networks.
However, existing models either depend on manually designing meta-paths, ignore
mutual effects between different semantics, or omit some aspects of information
from global networks. To address these limitations, we propose a novel
Graph-Aggregated Heterogeneous Network Embedding (GAHNE), which is designed to
extract the semantics of HINs as comprehensively as possible to improve the
results of downstream tasks based on graph convolutional neural networks. In
the GAHNE model, we develop several mechanisms that can aggregate semantic
representations from different single-type sub-networks as well as fuse the
global information into final embeddings. Extensive experiments on three
real-world HIN datasets show that our proposed model consistently outperforms
the existing state-of-the-art methods.
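To make the aggregation-and-fusion idea concrete, the following PyTorch sketch runs a GCN-style layer on each single-type sub-network (one normalized adjacency per edge type) and fuses the resulting semantic representations with attention. It is a minimal illustration under assumed names (SemanticAggregator, adjs), not the authors' released implementation, and it omits GAHNE's global-information fusion step.

```python
# Minimal sketch (assumed names): per-edge-type GCN propagation followed by
# attention-based fusion of the resulting semantic representations.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticAggregator(nn.Module):
    def __init__(self, in_dim, out_dim, num_edge_types):
        super().__init__()
        # One linear transform per single-type sub-network.
        self.gcn_weights = nn.ModuleList(
            [nn.Linear(in_dim, out_dim, bias=False) for _ in range(num_edge_types)]
        )
        # Attention vector that scores each semantic representation.
        self.attn = nn.Linear(out_dim, 1, bias=False)

    def forward(self, x, adjs):
        # x:    [num_nodes, in_dim] shared node features
        # adjs: list of normalized adjacency matrices, one per edge type
        semantic = []
        for adj, lin in zip(adjs, self.gcn_weights):
            h = torch.relu(adj @ lin(x))          # GCN layer on one sub-network
            semantic.append(h)
        h_stack = torch.stack(semantic, dim=1)    # [num_nodes, num_types, out_dim]
        scores = self.attn(torch.tanh(h_stack))   # [num_nodes, num_types, 1]
        alpha = F.softmax(scores, dim=1)          # weight each semantic view
        return (alpha * h_stack).sum(dim=1)       # fused embedding per node
```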
Related papers
- Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
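The one-parameter-per-relation idea can be sketched as a standard GCN-style update in which each relation's messages are scaled by a single learnable scalar, with another scalar for the self-loop. The names below are illustrative assumptions, not the RE-GNN reference code.

```python
# Sketch (assumed names): messages from each relation are scaled by one
# learnable scalar; the self-loop gets its own scalar.
import torch
import torch.nn as nn

class RelationScaledLayer(nn.Module):
    def __init__(self, dim, num_relations):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.rel_weight = nn.Parameter(torch.ones(num_relations))  # one per relation
        self.self_weight = nn.Parameter(torch.ones(1))             # self-loop weight

    def forward(self, x, adjs_by_relation):
        # adjs_by_relation: list of [N, N] normalized adjacencies, one per relation
        out = self.self_weight * x
        for r, adj in enumerate(adjs_by_relation):
            out = out + self.rel_weight[r] * (adj @ x)
        return torch.relu(self.lin(out))
```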
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Pay Attention to Relations: Multi-embeddings for Attributed Multiplex Networks [0.0]
RAHMeN is a novel unified relation-aware embedding framework for attributed heterogeneous multiplex networks.
Our model incorporates node attributes, motif-based features, relation-based GCN approaches, and relational self-attention to learn embeddings of nodes.
We evaluate our model on four real-world datasets from Amazon, Twitter, YouTube, and Tissue PPIs in both transductive and inductive settings.
arXiv Detail & Related papers (2022-03-03T18:31:29Z)
- DeHIN: A Decentralized Framework for Embedding Large-scale Heterogeneous Information Networks [64.62314068155997]
We present the Decentralized Embedding Framework for Heterogeneous Information Network (DeHIN) in this paper.
DeHIN presents a context-preserving partition mechanism that innovatively formulates a large HIN as a hypergraph.
Our framework then adopts a decentralized strategy to efficiently partition HINs using a tree-like pipeline.
arXiv Detail & Related papers (2022-01-08T04:08:36Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Heterogeneous Graph Neural Network with Multi-view Representation Learning [16.31723570596291]
We propose a Heterogeneous Graph Neural Network with Multi-View Representation Learning (MV-HetGNN) for heterogeneous graph embedding.
The proposed model consists of node feature transformation, view-specific ego graph encoding and auto multi-view fusion to thoroughly learn complex structural and semantic information for generating comprehensive node representations.
Extensive experiments on three real-world heterogeneous graph datasets show that the proposed MV-HetGNN model consistently outperforms all the state-of-the-art GNN baselines in various downstream tasks.
arXiv Detail & Related papers (2021-08-31T07:18:48Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
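One plausible way to instantiate such a contrastive instance pair is to score the agreement between a target node's embedding and the readout of its sampled local subgraph, treating mismatched node-subgraph pairs as negatives; the discriminator and sampling details below are illustrative assumptions, not necessarily the paper's exact formulation.

```python
# Sketch (assumed setup): a bilinear discriminator scores node-vs-subgraph pairs;
# low agreement on a node's own local context can serve as an anomaly signal.
import torch
import torch.nn as nn

class PairDiscriminator(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.bilinear = nn.Bilinear(dim, dim, 1)

    def forward(self, node_emb, subgraph_readout):
        return torch.sigmoid(self.bilinear(node_emb, subgraph_readout))

dim = 64
disc = PairDiscriminator(dim)
target = torch.randn(8, dim)            # embeddings of 8 target nodes
own_ctx = torch.randn(8, dim)           # readouts of their own local subgraphs
other_ctx = own_ctx[torch.randperm(8)]  # mismatched subgraphs as negatives
loss = nn.functional.binary_cross_entropy(
    disc(target, own_ctx), torch.ones(8, 1)
) + nn.functional.binary_cross_entropy(
    disc(target, other_ctx), torch.zeros(8, 1)
)
```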
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Multi-View Dynamic Heterogeneous Information Network Embedding [3.8093526291513347]
We propose a novel framework for incorporating temporal information into HIN embedding, denoted as Multi-View Dynamic HIN Embedding (MDHNE).
Our proposed MDHNE applies recurrent neural networks (RNNs) to incorporate the evolving patterns of complex network structure and semantic relationships between nodes into latent embedding spaces.
Our model outperforms state-of-the-art baselines on three real-world dynamic datasets for different network mining tasks.
arXiv Detail & Related papers (2020-11-12T12:33:29Z)
- Layer-stacked Attention for Heterogeneous Network Embedding [0.0]
Layer-stacked ATTention Embedding (LATTE) is an architecture that automatically decomposes higher-order meta relations at each layer.
LATTE offers a more interpretable aggregation scheme for nodes of different types at different neighborhood ranges.
In both transductive and inductive node classification tasks, LATTE can achieve state-of-the-art performance compared to existing approaches.
arXiv Detail & Related papers (2020-09-17T05:13:41Z)
- A Multi-Semantic Metapath Model for Large Scale Heterogeneous Network Representation Learning [52.83948119677194]
We propose a multi-semantic metapath (MSM) model for large-scale heterogeneous representation learning.
Specifically, we generate multi-semantic metapath-based random walks to construct the heterogeneous neighborhood, which handles the unbalanced distributions.
We conduct systematic evaluations of the proposed framework on two challenging datasets: Amazon and Alibaba.
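The core primitive here, a metapath-guided random walk, is small enough to sketch in plain Python; the graph layout and type names are made up for illustration, and MSM's combination of multiple metapath semantics and its downstream training are not shown.

```python
# Sketch (assumed data layout): a random walk constrained to follow a cyclic
# node-type pattern, e.g. author-paper-author-paper-...
import random

def metapath_walk(graph, start, metapath, walk_length):
    # graph: dict mapping node -> list of (neighbor, neighbor_type) pairs
    # metapath: cyclic node-type pattern, e.g. ["author", "paper"]
    walk = [start]
    while len(walk) < walk_length:
        wanted = metapath[len(walk) % len(metapath)]
        candidates = [n for n, t in graph[walk[-1]] if t == wanted]
        if not candidates:  # dead end: stop the walk early
            break
        walk.append(random.choice(candidates))
    return walk
```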
arXiv Detail & Related papers (2020-07-19T22:50:20Z)
- Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that integrates the sampling procedure and message passing of GNNs into a combined learning process.
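As a toy illustration of this idea, the sketch below lets a policy assign each node a number of aggregation hops and then reads that node's representation from the corresponding round of neighborhood averaging; in Policy-GNN the policy is learned with deep reinforcement learning rather than the hand-written stand-in used here, and all names are assumptions.

```python
# Toy sketch (assumed names): per-node choice of aggregation depth, then
# neighborhood averaging repeated that many times.
import numpy as np

def propagate(adj_norm, x, max_hops):
    """Return [x, Ax, A^2 x, ...] up to max_hops rounds of averaging."""
    outs = [x]
    for _ in range(max_hops):
        outs.append(adj_norm @ outs[-1])
    return outs

def policy(node_feature):
    # Placeholder: Policy-GNN learns this decision with deep RL; an arbitrary
    # rule stands in here purely to make the sketch runnable.
    return 1 if node_feature.sum() > 0 else 2

adj_norm = np.array([[0, .5, .5], [1, 0, 0], [1, 0, 0]])  # row-normalized adjacency
x = np.array([[1., 0.], [0., 1.], [-1., 0.]])             # toy node features
hops = [policy(row) for row in x]                         # per-node depth choice
per_hop = propagate(adj_norm, x, max_hops=2)
embeddings = np.stack([per_hop[k][i] for i, k in enumerate(hops)])
```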
arXiv Detail & Related papers (2020-06-26T17:03:06Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns (a rough sketch of this temporal component appears below).
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
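As a rough sketch of the temporal half of this last approach, the snippet below runs a GRU over per-snapshot node embeddings and pools the temporal states with attention; the hierarchical attention that produces the snapshot embeddings themselves is not reproduced, and the module names are assumptions rather than DyHATR's actual code.

```python
# Sketch (assumed names): GRU over snapshot embeddings, then attention over
# time steps to produce the final dynamic node embedding.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalEncoder(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.attn = nn.Linear(dim, 1)

    def forward(self, snapshot_embs):
        # snapshot_embs: [num_nodes, num_snapshots, dim], one row per time step
        h, _ = self.gru(snapshot_embs)            # temporal states per snapshot
        alpha = F.softmax(self.attn(h), dim=1)    # attention over time steps
        return (alpha * h).sum(dim=1)             # final dynamic node embedding
```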