UniHR: Hierarchical Representation Learning for Unified Knowledge Graph Link Prediction
- URL: http://arxiv.org/abs/2411.07019v1
- Date: Mon, 11 Nov 2024 14:22:42 GMT
- Title: UniHR: Hierarchical Representation Learning for Unified Knowledge Graph Link Prediction
- Authors: Zhiqiang Liu, Mingyang Chen, Yin Hua, Zhuo Chen, Ziqi Liu, Lei Liang, Huajun Chen, Wen Zhang
- Abstract summary: We propose a Unified Hierarchical Representation learning framework (UniHR) for unified knowledge graph link prediction.
It consists of a unified Hierarchical Data Representation (HiDR) module and a unified Hierarchical Structure Learning (HiSL) module that serves as the graph encoder.
We show that UniHR outperforms baselines designed for one specific kind of KG, indicating the strong generalization capability of the HiDR form and the effectiveness of the HiSL module.
- Abstract: Beyond-triple fact representations including hyper-relational facts with auxiliary key-value pairs, temporal facts with additional timestamps, and nested facts implying relationships between facts, are gaining significant attention. However, existing link prediction models are usually designed for one specific type of facts, making it difficult to generalize to other fact representations. To overcome this limitation, we propose a Unified Hierarchical Representation learning framework (UniHR) for unified knowledge graph link prediction. It consists of a unified Hierarchical Data Representation (HiDR) module and a unified Hierarchical Structure Learning (HiSL) module as a graph encoder. The HiDR module unifies hyper-relational KGs, temporal KGs, and nested factual KGs into triple-based representations. Then HiSL incorporates intra-fact and inter-fact message passing, focusing on enhancing the semantic information within individual facts and enriching the structural information between facts. Experimental results across 7 datasets from 3 types of KGs demonstrate that our UniHR outperforms baselines designed for one specific kind of KG, indicating the strong generalization capability of the HiDR form and the effectiveness of the HiSL module. Code and data are available at https://github.com/Lza12a/UniHR.
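As a rough illustration of the kind of triple-based unification HiDR performs (a sketch only, not the authors' implementation): one common way to reduce hyper-relational, temporal, and nested facts to plain triples is to reify each fact as an explicit fact node and attach its components to that node. The helper names and auxiliary relations below ("has_subject", "qualifier:", "occurs_at", "precedes") are hypothetical; UniHR's actual HiDR construction may differ, so see the released code for the exact scheme.

```python
# Minimal reification sketch: beyond-triple facts -> plain triples via fact nodes.
# All relation names and the fact-node naming convention are illustrative assumptions.

from itertools import count

_fact_ids = count()

def reify_hyper_relational(subj, rel, obj, qualifiers):
    """Turn (subj, rel, obj, {qual_rel: qual_val, ...}) into plain triples."""
    f = f"fact_{next(_fact_ids)}"                      # hypothetical fact node
    triples = [
        (f, "has_subject", subj),
        (f, "has_relation", rel),
        (f, "has_object", obj),
    ]
    for q_rel, q_val in qualifiers.items():            # auxiliary key-value pairs
        triples.append((f, f"qualifier:{q_rel}", q_val))
    return f, triples

def reify_temporal(subj, rel, obj, timestamp):
    """A temporal fact is treated as a fact with a single time qualifier."""
    return reify_hyper_relational(subj, rel, obj, {"occurs_at": timestamp})

def reify_nested(fact_node_a, meta_relation, fact_node_b):
    """A nested fact relates two already-reified facts via their fact nodes."""
    return (fact_node_a, meta_relation, fact_node_b)

# Example: a hyper-relational fact, a temporal fact, and a nested (fact-to-fact) link.
f1, t1 = reify_hyper_relational("Einstein", "educated_at", "ETH_Zurich",
                                {"academic_degree": "BSc"})
f2, t2 = reify_temporal("Einstein", "won", "Nobel_Prize_in_Physics", "1921")
nested = reify_nested(f1, "precedes", f2)
print(t1 + t2 + [nested])
```

Once all three fact types are expressed as ordinary triples over entity, relation, and fact nodes, a single graph encoder such as HiSL can be applied uniformly, passing messages within each reified fact and between facts.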
Related papers
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z) - HyperFormer: Enhancing Entity and Relation Interaction for Hyper-Relational Knowledge Graph Completion [25.399684403558553]
Hyper-relational knowledge graphs (HKGs) extend standard knowledge graphs by attaching attribute-value qualifiers to triples.
We propose HyperFormer, a model that considers local-level sequential information, encoding the content of the entities, relations, and qualifiers of a triple.
arXiv Detail & Related papers (2023-08-12T09:31:43Z) - Few-shot Link Prediction on N-ary Facts [70.8150181683017]
Link Prediction on Hyper-relational Facts (LPHFs) is the task of predicting a missing element in a hyper-relational fact.
Few-Shot Link Prediction on Hyper-relational Facts (FSLPHFs) aims to predict a missing entity in a hyper-relational fact with limited support instances.
arXiv Detail & Related papers (2023-05-10T12:44:00Z) - Learning Representations for Hyper-Relational Knowledge Graphs [35.380689788802776]
We design a framework to learn representations for hyper-relational facts using multiple aggregators.
Experiments demonstrate the effectiveness of our framework across multiple datasets.
We conduct an ablation study that validates the importance of the various components in our framework.
arXiv Detail & Related papers (2022-08-30T15:02:14Z) - DHGE: Dual-View Hyper-Relational Knowledge Graph Embedding for Link Prediction and Entity Typing [1.2932412290302255]
We propose a dual-view hyper-relational KG structure (DH-KG) that contains a hyper-relational instance view for entities and a hyper-relational ontology view for concepts that are abstracted hierarchically from the entities.
This paper defines link prediction and entity typing tasks on DH-KG for the first time and constructs two DH-KG datasets, JW44K-6K, extracted from Wikidata, and HTDM based on medical data.
arXiv Detail & Related papers (2022-07-18T12:44:59Z) - Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand the historical facts.
A temporal KG (TKG) is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z) - Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graphs (KGs) play an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at a fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z) - Graph Information Bottleneck [77.21967740646784]
Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features.
Inheriting from the general Information Bottleneck (IB), Graph Information Bottleneck (GIB) aims to learn the minimal sufficient representation for a given task.
We show that our proposed models are more robust than state-of-the-art graph defense models.
arXiv Detail & Related papers (2020-10-24T07:13:00Z) - Message Passing for Hyper-Relational Knowledge Graphs [7.733963597282456]
We propose StarE, a message passing graph encoder capable of modeling such hyper-relational knowledge graphs.
StarE can encode an arbitrary amount of additional information (qualifiers) along with the main triple while keeping the semantic roles of qualifiers and triples intact.
Our experiments demonstrate that a StarE-based link prediction model outperforms existing approaches across multiple benchmarks; a simplified sketch of this qualifier-aware encoding follows below.
arXiv Detail & Related papers (2020-09-22T22:38:54Z)
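For intuition on how a qualifier-aware encoder of this kind can fold qualifiers into the main triple, here is a deliberately simplified sketch (not the authors' code): qualifier (relation, value) pairs are composed, pooled, and mixed into the main relation embedding before ordinary message passing. The element-wise composition, the mean pooling, and the mixing weight `alpha` are illustrative assumptions; the actual StarE operators differ in detail.

```python
# Toy qualifier-aware relation enrichment, loosely in the spirit of StarE.
# Composition, pooling, and the mixing weight are illustrative choices only.

import numpy as np

def enrich_relation(rel_emb, qualifier_embs, alpha=0.8):
    """rel_emb: (d,) main-triple relation embedding.
    qualifier_embs: list of (qual_rel_emb, qual_val_emb) pairs, each of shape (d,)."""
    if not qualifier_embs:
        return rel_emb                                   # plain triple: unchanged
    composed = [qr * qv for qr, qv in qualifier_embs]    # compose each qualifier pair
    pooled = np.mean(composed, axis=0)                   # permutation-invariant pooling
    return alpha * rel_emb + (1.0 - alpha) * pooled      # keep the main relation dominant

# Example usage with random embeddings of dimension 8.
d = 8
rng = np.random.default_rng(0)
rel = rng.normal(size=d)
quals = [(rng.normal(size=d), rng.normal(size=d)) for _ in range(2)]
print(enrich_relation(rel, quals).shape)   # (8,)
```

The enriched relation embedding can then be used in any standard message passing or scoring function, which is why this style of encoder degrades gracefully to plain triples when no qualifiers are present.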
This list is automatically generated from the titles and abstracts of the papers in this site.