GCN for HIN via Implicit Utilization of Attention and Meta-paths
- URL: http://arxiv.org/abs/2007.02643v1
- Date: Mon, 6 Jul 2020 11:09:40 GMT
- Title: GCN for HIN via Implicit Utilization of Attention and Meta-paths
- Authors: Di Jin, Zhizhi Yu, Dongxiao He, Carl Yang, Philip S. Yu and Jiawei Han
- Abstract summary: Heterogeneous information network (HIN) embedding aims to map the structure and semantic information in a HIN to distributed representations.
We propose a novel neural network method that implicitly utilizes attention and meta-paths.
We first use the multi-layer graph convolutional network (GCN) framework, which performs a discriminative aggregation at each layer.
We then give an effective relaxation and improvement by introducing a new propagation operation that can be separated from aggregation.
- Score: 104.24467864133942
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous information network (HIN) embedding, aiming to map the
structure and semantic information in a HIN to distributed representations, has
drawn considerable research attention. Graph neural networks for HIN embeddings
typically adopt a hierarchical attention (including node-level and
meta-path-level attentions) to capture the information from meta-path-based
neighbors. However, this complicated attention structure often fails to fulfill its
intended role of selecting meta-paths because of severe overfitting. Moreover, when
propagating information, these methods do not distinguish direct (one-hop) meta-paths
from indirect (multi-hop) ones, even though, from the perspective of network science,
direct relationships are often believed to be more essential and should be used only
to model direct (one-hop) information propagation. To address these limitations, we
propose a novel neural network method that implicitly utilizes attention and
meta-paths, which relieves the severe overfitting brought by
the current over-parameterized attention mechanisms on HIN. We first use the
multi-layer graph convolutional network (GCN) framework, which performs a
discriminative aggregation at each layer while stacking the information propagation of
directly linked meta-paths layer by layer, thereby realizing the function of attention
for selecting meta-paths in an indirect way. We then give an effective relaxation and
improvement by introducing a new propagation operation that can be separated from
aggregation. That is, we first model the
whole propagation process with well-defined probabilistic diffusion dynamics,
and then introduce a random graph-based constraint that reduces noise as the number of
layers increases. Extensive experiments demonstrate the
superiority of the new approach over state-of-the-art methods.
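To make the first idea more concrete, a minimal PyTorch-style sketch is given below: each layer performs a discriminative (relation-weighted) aggregation over one-hop relations only, with a learnable softmax weight per relation, so that stacking layers implicitly composes and weights multi-hop meta-paths. This is an illustration under our own assumptions rather than the authors' architecture; the names (RelationGCNLayer, ImplicitMetaPathGCN, adjs) are hypothetical, and the diffusion-based propagation and random-graph constraint described above are not modeled here.

```python
# Minimal sketch: per-layer relation-weighted aggregation over one-hop relations,
# with stacking implicitly composing multi-hop meta-paths. Names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationGCNLayer(nn.Module):
    """One layer: discriminative (relation-weighted) aggregation over one-hop relations."""

    def __init__(self, in_dim: int, out_dim: int, num_relations: int):
        super().__init__()
        self.linears = nn.ModuleList(nn.Linear(in_dim, out_dim) for _ in range(num_relations))
        # One scalar score per relation; softmax over relations acts as an implicit
        # "which one-hop relation matters at this layer" selector.
        self.rel_score = nn.Parameter(torch.zeros(num_relations))

    def forward(self, x, adjs):
        # adjs[r]: row-normalized adjacency of one-hop relation r, shape (N, N);
        # x: node features, shape (N, in_dim).
        w = F.softmax(self.rel_score, dim=0)
        out = sum(w[r] * adjs[r] @ self.linears[r](x) for r in range(len(adjs)))
        return out


class ImplicitMetaPathGCN(nn.Module):
    """Stacking layers composes one-hop relations into multi-hop meta-paths implicitly."""

    def __init__(self, in_dim: int, hid_dim: int, num_classes: int,
                 num_relations: int, num_layers: int = 2):
        super().__init__()
        dims = [in_dim] + [hid_dim] * (num_layers - 1) + [num_classes]
        self.layers = nn.ModuleList(
            RelationGCNLayer(dims[i], dims[i + 1], num_relations) for i in range(num_layers)
        )

    def forward(self, x, adjs):
        for i, layer in enumerate(self.layers):
            x = layer(x, adjs)
            if i < len(self.layers) - 1:
                x = F.relu(x)
        return x  # class logits; pair with cross-entropy loss outside
```

In this sketch the effective weight on a length-L meta-path is the product of the per-layer relation weights, which is one way the role of meta-path-level attention can emerge without an explicit, over-parameterized attention module.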
Related papers
- Sparse Explanations of Neural Networks Using Pruned Layer-Wise Relevance Propagation [1.593690982728631]
We present a modification of the widely used explanation method layer-wise relevance propagation.
Our approach enforces sparsity directly by pruning the relevance propagation for the different layers.
We show that our modification indeed leads to noise reduction and concentrates relevance on the most important features compared to the baseline.
arXiv Detail & Related papers (2024-04-22T15:16:59Z) - Prototype-Enhanced Hypergraph Learning for Heterogeneous Information
Networks [22.564818600608838]
We introduce a novel prototype-enhanced hypergraph learning approach for node classification in Heterogeneous Information Networks.
Our method captures higher-order relationships among nodes and extracts semantic information without relying on metapaths.
arXiv Detail & Related papers (2023-09-22T09:51:15Z) - Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Simplicial Attention Networks [0.0]
We introduce a proper self-attention mechanism able to process data components at different layers.
We learn how to weight both upper and lower neighborhoods of the given topological domain in a totally task-oriented fashion.
The proposed approach compares favorably with other methods when applied to different (inductive and transductive) tasks.
arXiv Detail & Related papers (2022-03-14T20:47:31Z) - Decomposing neural networks as mappings of correlation functions [57.52754806616669]
We study the mapping between probability distributions implemented by a deep feed-forward network.
We identify essential statistics in the data, as well as different information representations that can be used by neural networks.
arXiv Detail & Related papers (2022-02-10T09:30:31Z) - Joint-bone Fusion Graph Convolutional Network for Semi-supervised
Skeleton Action Recognition [65.78703941973183]
We propose a novel correlation-driven joint-bone fusion graph convolutional network (CD-JBF-GCN) as an encoder and use a pose prediction head as a decoder.
Specifically, the CD-JBF-GCN can explore the motion transmission between the joint stream and the bone stream.
The pose-prediction-based auto-encoder in the self-supervised training stage allows the network to learn motion representations from unlabeled data.
arXiv Detail & Related papers (2022-02-08T16:03:15Z) - SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z) - DisenHAN: Disentangled Heterogeneous Graph Attention Network for
Recommendation [11.120241862037911]
Heterogeneous information networks have been widely used to alleviate sparsity and cold-start problems in recommender systems.
We propose a novel disentangled heterogeneous graph attention network DisenHAN for top-$N$ recommendation.
arXiv Detail & Related papers (2021-06-21T06:26:10Z) - Spatio-Temporal Inception Graph Convolutional Networks for
Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z) - Interpretable and Efficient Heterogeneous Graph Convolutional Network [27.316334213279973]
We propose an interpretable and efficient Heterogeneous Graph Convolutional Network (ie-HGCN) to learn the representations of objects in Heterogeneous Information Networks (HINs).
ie-HGCN can automatically extract useful meta-paths for each object from all possible meta-paths within a length limit.
It can also reduce the computational cost by avoiding intermediate HIN transformation and neighborhood attention.
arXiv Detail & Related papers (2020-05-27T06:06:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.