Learning the Implicit Semantic Representation on Graph-Structured Data
- URL: http://arxiv.org/abs/2101.06471v1
- Date: Sat, 16 Jan 2021 16:18:43 GMT
- Title: Learning the Implicit Semantic Representation on Graph-Structured Data
- Authors: Likang Wu, Zhi Li, Hongke Zhao, Qi Liu, Jun Wang, Mengdi Zhang, Enhong Chen
- Abstract summary: Existing representation learning methods in graph convolutional networks mainly describe the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN), which explore implicit semantics by learning latent semantic-paths in graphs.
- Score: 57.670106959061634
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Existing representation learning methods in graph convolutional networks are
mainly designed by describing the neighborhood of each node as a perceptual whole, while
the implicit semantic associations behind the highly complex interactions of graphs remain
largely unexploited. In this paper, we propose Semantic Graph Convolutional Networks
(SGCN), which explore implicit semantics by learning latent semantic-paths in graphs.
Previous work has explored graph semantics via meta-paths; however, these methods rely
mainly on explicit heterogeneous information, which is hard to obtain for a large amount of
graph-structured data. SGCN is the first to break through this restriction, leveraging
semantic-paths dynamically and automatically during the node aggregation process. To
evaluate our idea, we conduct extensive experiments on several standard datasets, and the
empirical results show the superior performance of our model.
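As a rough illustration of the idea described in the abstract, the sketch below aggregates node features over latent multi-hop paths and learns one weight per path length. It is a hedged simplification, not the authors' SGCN implementation; the class name `LatentPathAggregation`, the `max_path_len` parameter, and the softmax weighting scheme are all illustrative assumptions.

```python
# Minimal sketch (not the authors' released code): aggregate over latent
# multi-hop "semantic paths" with learned per-path-length weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentPathAggregation(nn.Module):
    def __init__(self, in_dim, out_dim, max_path_len=3):
        super().__init__()
        self.max_path_len = max_path_len
        self.proj = nn.Linear(in_dim, out_dim)
        # One learnable score per path length; softmax turns the scores into
        # a distribution over latent path patterns.
        self.path_logits = nn.Parameter(torch.zeros(max_path_len))

    def forward(self, x, adj):
        # x: [N, in_dim] node features, adj: [N, N] adjacency (with self-loops)
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        a_norm = adj / deg                      # row-normalised adjacency
        hop = self.proj(x)
        per_path = []
        for _ in range(self.max_path_len):
            hop = a_norm @ hop                  # one more hop along the path
            per_path.append(hop)
        weights = F.softmax(self.path_logits, dim=0)   # [max_path_len]
        stacked = torch.stack(per_path, dim=0)         # [L, N, out_dim]
        return torch.einsum("l,lnd->nd", weights, stacked)

# Toy usage
x = torch.randn(5, 8)
adj = torch.eye(5)
adj[0, 1] = adj[1, 0] = adj[1, 2] = adj[2, 1] = 1.0
layer = LatentPathAggregation(8, 16)
print(layer(x, adj).shape)   # torch.Size([5, 16])
```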
Related papers
- Unveiling Global Interactive Patterns across Graphs: Towards Interpretable Graph Neural Networks [31.29616732552006]
Graph Neural Networks (GNNs) have emerged as a prominent framework for graph mining.
This paper proposes a novel intrinsically interpretable scheme for graph classification.
Global Interactive Pattern (GIP) learning introduces learnable global interactive patterns to explicitly interpret decisions.
arXiv Detail & Related papers (2024-07-02T06:31:13Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
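One plausible reading of the DGNN entry above is a pair of decoupled encoders, one for attributes and one for structure, tied together by a consistency term. The sketch below shows that reading only; the module names and the MSE-based consistency loss are assumptions, not details from the paper.

```python
# Minimal sketch (assumption, not the DGNN paper's code): decoupled attribute
# and structure encoders with a simple consistency loss between them.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecoupledEncoders(nn.Module):
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.attr_enc = nn.Linear(in_dim, hid_dim)     # attribute branch (ignores the graph)
        self.struct_enc = nn.Linear(in_dim, hid_dim)   # structure branch (uses the adjacency)

    def forward(self, x, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        z_attr = self.attr_enc(x)
        z_struct = (adj / deg) @ self.struct_enc(x)    # one-hop mean aggregation
        # Consistency term: the two views of a node should not drift apart.
        consistency = F.mse_loss(F.normalize(z_attr, dim=1),
                                 F.normalize(z_struct, dim=1))
        return torch.cat([z_attr, z_struct], dim=1), consistency
```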
- Mitigating Semantic Confusion from Hostile Neighborhood for Graph Active Learning [38.5372139056485]
Graph Active Learning (GAL) aims to find the most informative nodes in graphs for annotation to maximize the Graph Neural Networks (GNNs) performance.
GAL strategies may introduce semantic confusion to the selected training set, particularly when graphs are noisy.
We present Semantic-aware Active learning framework for Graphs (SAG) to mitigate the semantic confusion problem.
arXiv Detail & Related papers (2023-08-17T07:06:54Z)
- Text Enriched Sparse Hyperbolic Graph Convolutional Networks [21.83127488157701]
Graph Neural Networks (GNNs) and their hyperbolic variants provide a promising approach to encode such networks in a low-dimensional latent space.
We propose Text Enriched Sparse Hyperbolic Graph Convolution Network (TESH-GCN) to capture the graph's metapath structures using semantic signals.
Our model outperforms the current state-of-the-art approaches by a large margin on the task of link prediction.
arXiv Detail & Related papers (2022-07-06T00:23:35Z)
- Meta Propagation Networks for Graph Few-shot Semi-supervised Learning [39.96930762034581]
We propose a novel network architecture, equipped with a meta-learning algorithm, to solve the graph few-shot semi-supervised learning problem.
In essence, our framework Meta-PN infers high-quality pseudo labels on unlabeled nodes via a meta-learned label propagation strategy.
Our approach offers easy and substantial performance gains compared to existing techniques on various benchmark datasets.
arXiv Detail & Related papers (2021-12-18T00:11:56Z)
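The Meta-PN entry above builds on label propagation to produce pseudo labels for unlabeled nodes. The sketch below shows plain personalized label propagation only; the meta-learned propagation strategy that distinguishes Meta-PN is not reproduced, and `alpha` and `num_steps` are illustrative defaults rather than values from the paper.

```python
# Minimal sketch of the label-propagation step that pseudo-labeling builds on.
import torch

def propagate_pseudo_labels(adj, y_onehot, labeled_mask, alpha=0.9, num_steps=10):
    """adj: [N, N] adjacency; y_onehot: [N, C] with zero rows for unlabeled
    nodes; labeled_mask: [N] bool tensor marking labeled nodes."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    a_norm = adj / deg
    y = y_onehot.clone()
    for _ in range(num_steps):
        y = alpha * (a_norm @ y) + (1 - alpha) * y_onehot  # personalized propagation
        y[labeled_mask] = y_onehot[labeled_mask]           # keep true labels fixed
    return y.argmax(dim=1)                                 # hard pseudo labels
```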
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the limitations of existing heterogeneous GNNs.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
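The SHGNN entry above names a three-stage pipeline (feature propagation, tree-attention aggregation, meta-path fusion). The sketch below compresses this into per-meta-path mean aggregation followed by attention-based fusion across meta-paths; it is a simplified stand-in under assumed shapes, not the SHGNN architecture, and the tree-attention aggregator is deliberately omitted.

```python
# Minimal sketch (assumption): aggregate along each meta-path, then fuse the
# per-meta-path views with a learned attention weight.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaPathFusion(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)
        self.att = nn.Linear(out_dim, 1)   # scores each meta-path view per node

    def forward(self, x, metapath_adjs):
        # metapath_adjs: list of [N, N] adjacency matrices, one per meta-path
        views = []
        for adj in metapath_adjs:
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
            views.append((adj / deg) @ self.proj(x))       # aggregation per meta-path
        views = torch.stack(views, dim=1)                  # [N, P, out_dim]
        scores = F.softmax(self.att(torch.tanh(views)), dim=1)  # [N, P, 1]
        return (scores * views).sum(dim=1)                 # fused node representation
```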
- Heterogeneous Graph Representation Learning with Relation Awareness [45.14314180743549]
We propose a Relation-aware Heterogeneous Graph Neural Network, namely R-HGNN, to learn node representations on heterogeneous graphs at a fine-grained level.
A dedicated graph convolution component is first designed to learn unique node representations from each relation-specific graph.
A cross-relation message passing module is developed to improve the interactions of node representations across different relations.
arXiv Detail & Related papers (2021-05-24T07:01:41Z)
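For the R-HGNN entry above, a minimal way to picture "relation-specific convolution plus cross-relation interaction" is sketched below; the concatenation-based fusion is an assumption standing in for the paper's cross-relation message passing module.

```python
# Minimal sketch (assumption): per-relation graph convolution followed by a
# simple cross-relation fusion step.
import torch
import torch.nn as nn

class RelationAwareConv(nn.Module):
    def __init__(self, in_dim, out_dim, num_relations):
        super().__init__()
        self.rel_proj = nn.ModuleList(
            [nn.Linear(in_dim, out_dim) for _ in range(num_relations)])
        self.fuse = nn.Linear(num_relations * out_dim, out_dim)

    def forward(self, x, rel_adjs):
        # rel_adjs: list of [N, N] adjacency matrices, one per relation type
        rel_reps = []
        for proj, adj in zip(self.rel_proj, rel_adjs):
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
            rel_reps.append((adj / deg) @ proj(x))   # relation-specific representation
        # Cross-relation interaction, reduced here to concatenation + projection.
        return self.fuse(torch.cat(rel_reps, dim=1))
```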
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
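The AGE entry above hinges on generating fake neighbor nodes as enhanced negative samples. The sketch below shows only that adversarial ingredient under assumed shapes: a generator mapping noise to fake neighbor embeddings and a bilinear discriminator scoring (node, neighbor) pairs. The encoder, the training loop, and the three concrete models are not shown.

```python
# Minimal sketch (assumption): generator of fake neighbor embeddings plus a
# pair discriminator, the adversarial pieces described in the AGE summary.
import torch
import torch.nn as nn

class NeighborGenerator(nn.Module):
    def __init__(self, noise_dim, emb_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(noise_dim, emb_dim), nn.ReLU(),
                                 nn.Linear(emb_dim, emb_dim))

    def forward(self, noise):
        return self.net(noise)          # fake neighbor embeddings

class PairDiscriminator(nn.Module):
    def __init__(self, emb_dim):
        super().__init__()
        self.score = nn.Bilinear(emb_dim, emb_dim, 1)

    def forward(self, node_emb, neigh_emb):
        # High score = "plausible real neighbor", low score = "fake neighbor".
        return torch.sigmoid(self.score(node_emb, neigh_emb))
```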
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
The majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta path-free message passing scheme that packs up heterogeneous node features with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z)
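The message passing scheme described in the entry above packs node features together with the features of the connecting edges. A minimal, assumed version of a single such step is sketched below, applied once for one-hop neighbors (higher-order neighbors would reuse the same step); the names and the mean aggregation are illustrative.

```python
# Minimal sketch (assumption): one meta-path-free message passing step that
# concatenates a neighbor's node features with the incident edge's features.
import torch
import torch.nn as nn

class EdgePackedMessagePassing(nn.Module):
    def __init__(self, node_dim, edge_dim, out_dim):
        super().__init__()
        self.msg = nn.Linear(node_dim + edge_dim, out_dim)

    def forward(self, x, edge_index, edge_attr):
        # x: [N, node_dim]; edge_index: [2, E] (src, dst); edge_attr: [E, edge_dim]
        src, dst = edge_index
        messages = self.msg(torch.cat([x[src], edge_attr], dim=1))  # [E, out_dim]
        out = torch.zeros(x.size(0), messages.size(1))
        out.index_add_(0, dst, messages)        # sum messages at destination nodes
        deg = torch.bincount(dst, minlength=x.size(0)).clamp(min=1).unsqueeze(1)
        return out / deg                        # mean aggregation
```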
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
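Contrastive pre-training frameworks such as the GCC entry above typically optimize an InfoNCE-style objective between two views of the same instance. The function below is a generic version of that loss, not GCC's released code; the temperature value and the toy usage are illustrative assumptions, and GCC's subgraph sampling and encoder are not shown.

```python
# Minimal sketch of an InfoNCE-style contrastive loss over two views of the
# same batch of (sub)graph instances.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.07):
    """z1, z2: [B, D] embeddings of two views of the same B instances."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature          # [B, B] similarity matrix
    labels = torch.arange(z1.size(0))           # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

# Toy usage
loss = info_nce(torch.randn(4, 16), torch.randn(4, 16))
print(loss.item())
```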
This list is automatically generated from the titles and abstracts of the papers on this site.