Joint Language Semantic and Structure Embedding for Knowledge Graph Completion
- URL: http://arxiv.org/abs/2209.08721v1
- Date: Mon, 19 Sep 2022 02:41:02 GMT
- Title: Joint Language Semantic and Structure Embedding for Knowledge Graph Completion
- Authors: Jianhao Shen, Chenguang Wang, Linyuan Gong, Dawn Song
- Abstract summary: We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
- Score: 66.15933600765835
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The task of completing knowledge triplets has broad downstream applications.
Both structural and semantic information play an important role in knowledge
graph completion. Unlike previous approaches that rely on either the structures
or semantics of the knowledge graphs, we propose to jointly embed the semantics
in the natural language description of the knowledge triplets with their
structure information. Our method embeds knowledge graphs for the completion
task via fine-tuning pre-trained language models with respect to a
probabilistic structured loss, where the forward pass of the language models
captures semantics and the loss reconstructs structures. Our extensive
experiments on a variety of knowledge graph benchmarks have demonstrated the
state-of-the-art performance of our method. We also show that our method can
significantly improve the performance in a low-resource regime, thanks to the
better use of semantics. The code and datasets are available at
https://github.com/pkusjh/LASS.
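To make the described setup concrete, below is a minimal sketch of joint semantic-structure embedding under stated assumptions; it is not the authors' implementation (see the repository above for that). It assumes a Hugging Face `bert-base-uncased` encoder with mean pooling, and substitutes a margin-based TransE-style loss for the paper's probabilistic structured loss; all names and examples are illustrative.
```python
# Minimal sketch: the LM forward pass captures semantics, while a
# TransE-style margin loss (a stand-in for the paper's probabilistic
# structured loss) reconstructs structure. Illustrative only.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    """Semantics: mean-pooled LM embeddings from the forward pass."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state           # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()  # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)           # (B, H)

def structured_loss(h, r, t, h_neg, t_neg, margin=1.0):
    """Structure: positive triplets should satisfy h + r ≈ t."""
    pos = torch.norm(h + r - t, p=2, dim=-1)
    neg = torch.norm(h_neg + r - t_neg, p=2, dim=-1)
    return F.relu(margin + pos - neg).mean()

# Toy step: one positive triplet and one corrupted negative.
h, r, t = embed(["Barack Obama"]), embed(["born in"]), embed(["Honolulu"])
h_neg, t_neg = embed(["Paris"]), embed(["Paris"])
structured_loss(h, r, t, h_neg, t_neg).backward()  # fine-tunes the LM end to end
```
In a full pipeline, negatives would be sampled by corrupting gold triplets, and link prediction would rank candidate entities by the same distance.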
Related papers
- CodeKGC: Code Language Model for Generative Knowledge Graph Construction [46.220237225553234]
Large generative language models trained on structured data such as code have demonstrated impressive capability in natural language understanding for structural prediction and reasoning tasks.
We develop schema-aware prompts that effectively utilize the semantic structure within the knowledge graph; a toy illustration follows this entry.
Experimental results indicate that the proposed approach can obtain better performance on benchmark datasets compared with baselines.
arXiv Detail & Related papers (2023-04-18T15:12:34Z) - Schema-aware Reference as Prompt Improves Data-Efficient Knowledge Graph
- Schema-aware Reference as Prompt Improves Data-Efficient Knowledge Graph Construction [57.854498238624366]
We propose a retrieval-augmented approach, which retrieves schema-aware Reference As Prompt (RAP) for data-efficient knowledge graph construction.
RAP can dynamically leverage schema and knowledge inherited from human-annotated and weakly supervised data as a prompt for each sample.
arXiv Detail & Related papers (2022-10-19T16:40:28Z) - Semantic TrueLearn: Using Semantic Knowledge Graphs in Recommendation
Systems [22.387120578306277]
This work aims to advance towards building a state-aware educational recommendation system that incorporates semantic relatedness.
We introduce a novel learner model that exploits this semantic relatedness between knowledge components in learning resources using the Wikipedia link graph.
Our experiments with a large dataset demonstrate that this new semantic version of the TrueLearn algorithm achieves statistically significant improvements in predictive performance.
arXiv Detail & Related papers (2021-12-08T16:23:27Z) - JAKET: Joint Pre-training of Knowledge Graph and Language Understanding [73.43768772121985]
We propose a novel joint pre-training framework, JAKET, to model both the knowledge graph and language.
The knowledge and language modules provide essential information to assist each other.
Our design enables the pre-trained model to easily adapt to unseen knowledge graphs in new domains.
arXiv Detail & Related papers (2020-10-02T05:53:36Z) - CoLAKE: Contextualized Language and Knowledge Embedding [81.90416952762803]
We propose the Contextualized Language and Knowledge Embedding (CoLAKE).
CoLAKE jointly learns contextualized representations for both language and knowledge with an extended training objective.
We conduct experiments on knowledge-driven tasks, knowledge probing tasks, and language understanding tasks.
arXiv Detail & Related papers (2020-10-01T11:39:32Z) - Structure-Augmented Text Representation Learning for Efficient Knowledge
Graph Completion [53.31911669146451]
Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks.
These graphs are usually incomplete, which motivates their automatic completion.
Graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings (a minimal TransE sketch follows this entry).
Textual encoding approaches, e.g., KG-BERT, instead encode the text of graph triples into triple-level contextualized representations.
arXiv Detail & Related papers (2020-04-30T13:50:34Z) - Exploiting Structured Knowledge in Text via Graph-Guided Representation
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks that learn over raw text with guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme (a toy sketch follows this entry).
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)