Enhancing Heterogeneous Knowledge Graph Completion with a Novel GAT-based Approach
- URL: http://arxiv.org/abs/2408.02456v1
- Date: Mon, 5 Aug 2024 13:28:51 GMT
- Title: Enhancing Heterogeneous Knowledge Graph Completion with a Novel GAT-based Approach
- Authors: Wanxu Wei, Yitong Song, Bin Yao
- Abstract summary: We propose a novel GAT-based knowledge graph completion method for heterogeneous knowledge graphs.
GATH incorporates two separate attention network modules that work synergistically to predict the missing entities.
Our model improves Hits@10 and MRR by 5.2% and 5.2% on the FB15K-237 dataset, and by 4.5% and 14.6% on the WN18RR dataset, respectively.
- Score: 3.8357926394952306
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graphs (KGs) play a vital role in enhancing search results and recommendation systems. With the rapid increase in the size of KGs, they are becoming inaccurate and incomplete. This problem can be addressed by knowledge graph completion methods, among which graph attention network (GAT)-based methods stand out due to their superior performance. However, existing GAT-based knowledge graph completion methods often suffer from overfitting when dealing with heterogeneous knowledge graphs, primarily due to the unbalanced number of samples. Additionally, these methods perform poorly when predicting the tail (head) entity that shares the same relation and head (tail) entity with others. To solve these problems, we propose GATH, a novel GAT-based method designed for heterogeneous KGs. GATH incorporates two separate attention network modules that work synergistically to predict the missing entities. We also introduce novel encoding and feature transformation approaches, enabling GATH to perform robustly in scenarios with imbalanced samples. Comprehensive experiments are conducted to evaluate GATH's performance. Compared with the existing SOTA GAT-based model on the Hits@10 and MRR metrics, our model improves performance by 5.2% and 5.2% on the FB15K-237 dataset, and by 4.5% and 14.6% on the WN18RR dataset, respectively.
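As background for the attention modules mentioned in the abstract, here is a minimal sketch (assumptions throughout; this is not the paper's GATH implementation) of how a standard GAT-style layer scores a node's neighbours before aggregating their features:

```python
# Minimal GAT-style neighbourhood attention, illustrative only.
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_aggregate(h_i, neighbor_feats, W, a):
    """Aggregate neighbour features with attention weights.

    h_i: (d,) features of the centre entity
    neighbor_feats: (n, d) features of its neighbours
    W: (d, d_out) shared linear transform
    a: (2 * d_out,) attention vector
    """
    z_i = h_i @ W                        # transform the centre node
    z_nb = neighbor_feats @ W            # transform the neighbours
    # unnormalised logits e_ij = LeakyReLU(a^T [z_i || z_j])
    pairs = np.concatenate([np.tile(z_i, (len(z_nb), 1)), z_nb], axis=1)
    logits = leaky_relu(pairs @ a)
    alpha = softmax(logits)              # normalise over the neighbourhood
    return alpha @ z_nb                  # attention-weighted aggregation

rng = np.random.default_rng(0)
h_i = rng.normal(size=4)
nb = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 4))
a = rng.normal(size=8)
out = gat_aggregate(h_i, nb, W, a)
print(out.shape)  # (4,)
```

GATH reportedly uses two such attention modules plus custom encoding and feature transforms; those specifics are not reproduced here.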
Related papers
- Revisiting, Benchmarking and Understanding Unsupervised Graph Domain Adaptation [31.106636947179005]
Unsupervised Graph Domain Adaptation involves the transfer of knowledge from a label-rich source graph to an unlabeled target graph.
We present the first comprehensive benchmark for unsupervised graph domain adaptation named GDABench.
We observe that the performance of current UGDA models varies significantly across different datasets and adaptation scenarios.
arXiv Detail & Related papers (2024-07-09T06:44:09Z)
- KERMIT: Knowledge Graph Completion of Enhanced Relation Modeling with Inverse Transformation [19.31783654838732]
We use large language models to generate coherent descriptions, bridging the semantic gap between queries and answers.
We also utilize inverse relations to create a symmetric graph, thereby providing augmented training samples for KGC.
Our approach achieves a 4.2% improvement in Hit@1 on WN18RR and a 3.4% improvement in Hit@3 on FB15k-237, demonstrating superior performance.
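The inverse-relation augmentation described above can be sketched as follows: every triple (h, r, t) also yields (t, r⁻¹, h), doubling the training set. The `_inv` naming scheme below is an assumption for illustration, not KERMIT's actual convention.

```python
def add_inverse_triples(triples):
    """Return the original triples plus their inverses.

    Each relation r gets a synthetic inverse relation named f"{r}_inv"
    (hypothetical naming, chosen for this sketch).
    """
    augmented = list(triples)
    for head, rel, tail in triples:
        augmented.append((tail, f"{rel}_inv", head))
    return augmented

triples = [("Paris", "capital_of", "France")]
print(add_inverse_triples(triples))
# [('Paris', 'capital_of', 'France'), ('France', 'capital_of_inv', 'Paris')]
```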
arXiv Detail & Related papers (2023-09-26T09:03:25Z)
- Challenging the Myth of Graph Collaborative Filtering: a Reasoned and Reproducibility-driven Analysis [50.972595036856035]
We present code that successfully replicates results from six popular and recent graph recommendation models.
We compare these graph models with traditional collaborative filtering models that historically performed well in offline evaluations.
By investigating the information flow from users' neighborhoods, we aim to identify which models are influenced by intrinsic features in the dataset structure.
arXiv Detail & Related papers (2023-08-01T09:31:44Z)
- Adaptive Depth Graph Attention Networks [19.673509341792606]
The graph attention network (GAT) is considered the most advanced learning architecture for graph representation.
We find that the main factor limiting the accuracy of the GAT model as the number of layers increases is the oversquashing phenomenon.
We propose ADGAT, a GAT variant that adaptively selects the number of layers based on the sparsity of the graph.
arXiv Detail & Related papers (2023-01-16T05:22:29Z)
- Learnable Graph Convolutional Attention Networks [7.465923786151107]
Graph Neural Networks (GNNs) compute the message exchange between nodes by either aggregating uniformly (convolving) the features of all the neighboring nodes, or by applying a non-uniform score (attending) to the features.
Recent works have shown the strengths and weaknesses of the resulting GNN architectures, respectively, GCNs and GATs.
We introduce the graph convolutional attention layer (CAT), which relies on convolutions to compute the attention scores.
Our results demonstrate that L-CAT is able to efficiently combine different GNN layers along the network, outperforming competing methods in a wide
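The core idea of a convolutional attention layer can be sketched roughly as follows: attention logits are computed from convolved (neighbourhood-averaged) features rather than from raw node features. The details here (mean convolution, dot-product scores) are simplifications for illustration, not the paper's exact formulation.

```python
# Illustrative convolutional-attention layer: attend using convolved features.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cat_layer(X, adj):
    """X: (n, d) node features; adj: (n, n) 0/1 adjacency with self-loops."""
    deg = adj.sum(axis=1, keepdims=True)
    conv = (adj @ X) / deg                 # mean-convolved features
    out = np.zeros_like(X)
    for i in range(len(X)):
        nbrs = np.flatnonzero(adj[i])
        logits = conv[nbrs] @ conv[i]      # scores from convolved features
        alpha = softmax(logits)
        out[i] = alpha @ X[nbrs]           # attend over the raw features
    return out

X = np.eye(3)
adj = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=float)
print(cat_layer(X, adj).shape)  # (3, 3)
```

L-CAT additionally learns to interpolate between GCN-like and GAT-like behaviour per layer; that mechanism is not shown here.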
arXiv Detail & Related papers (2022-11-21T21:08:58Z)
- Single-Pass Contrastive Learning Can Work for Both Homophilic and Heterophilic Graph [60.28340453547902]
Graph contrastive learning (GCL) techniques typically require two forward passes for a single instance to construct the contrastive loss.
Existing GCL approaches fail to provide strong performance guarantees.
We implement the Single-Pass Graph Contrastive Learning method (SP-GCL).
Empirically, the features learned by the SP-GCL can match or outperform existing strong baselines with significantly less computational overhead.
arXiv Detail & Related papers (2022-11-20T07:18:56Z)
- Resisting Graph Adversarial Attack via Cooperative Homophilous Augmentation [60.50994154879244]
Recent studies show that Graph Neural Networks are vulnerable and easily fooled by small perturbations.
In this work, we focus on an emerging but critical attack, the Graph Injection Attack (GIA).
We propose a general defense framework CHAGNN against GIA through cooperative homophilous augmentation of graph data and model.
arXiv Detail & Related papers (2022-11-15T11:44:31Z)
- Make Heterophily Graphs Better Fit GNN: A Graph Rewiring Approach [43.41163711340362]
We propose a method named Deep Heterophily Graph Rewiring (DHGR) to rewire graphs by adding homophilic edges and pruning heterophilic edges.
To the best of our knowledge, it is the first work studying graph rewiring for heterophily graphs.
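A simplified illustration of homophily-based rewiring in this spirit: edges between same-label nodes are kept or added, edges between different-label nodes are pruned. Using ground-truth labels directly is a simplification made for this sketch; DHGR learns a similarity signal instead.

```python
def rewire(edges, labels, candidate_pairs):
    """Prune heterophilic edges and add homophilic candidate edges."""
    kept = {(u, v) for (u, v) in edges if labels[u] == labels[v]}
    added = {(u, v) for (u, v) in candidate_pairs if labels[u] == labels[v]}
    return sorted(kept | added)

labels = {0: "a", 1: "a", 2: "b"}
edges = [(0, 2), (1, 2)]           # both heterophilic -> pruned
candidates = [(0, 1)]              # homophilic -> added
print(rewire(edges, labels, candidates))  # [(0, 1)]
```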
arXiv Detail & Related papers (2022-09-17T06:55:21Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It integrates high-order reasoning into a graph convolutional network, namely HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
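The perturbation loop described above can be sketched in bare-bones form: node features are pushed along the loss gradient for a few steps before the model update. The quadratic toy loss and manual gradient below stand in for a real GNN loss and autograd, and the step sizes are arbitrary assumptions.

```python
# FLAG-style gradient-ascent perturbation of node features, illustrative only.
import numpy as np

def flag_perturb(X, loss_grad, step=0.01, n_steps=3):
    """Return an adversarially perturbed copy of the node features X.

    loss_grad(X) must return dLoss/dX; the perturbation ascends the loss.
    """
    delta = np.zeros_like(X)
    for _ in range(n_steps):
        g = loss_grad(X + delta)
        delta += step * np.sign(g)     # FGSM-style ascent step
    return X + delta

# Toy loss: 0.5 * ||X||^2, so dLoss/dX = X.
X = np.ones((2, 2))
X_adv = flag_perturb(X, loss_grad=lambda Z: Z)
print(X_adv)  # each entry grows by n_steps * step = 0.03
```

The actual FLAG method additionally reuses these gradient computations for "free" adversarial training and constrains the perturbation norm, which this sketch omits.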
arXiv Detail & Related papers (2020-10-19T21:51:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.