Open Relation Modeling: Learning to Define Relations between Entities
- URL: http://arxiv.org/abs/2108.09241v1
- Date: Fri, 20 Aug 2021 16:03:23 GMT
- Title: Open Relation Modeling: Learning to Define Relations between Entities
- Authors: Jie Huang, Kevin Chen-Chuan Chang, Jinjun Xiong, Wen-mei Hwu
- Abstract summary: We propose to teach machines to generate definition-like relation descriptions by letting them learn from definitions of entities.
Specifically, we fine-tune Pre-trained Language Models (PLMs) to produce definitions conditioned on extracted entity pairs.
We show that PLMs can select interpretable and informative reasoning paths by confidence estimation, and the selected path can guide PLMs to generate better relation descriptions.
- Score: 24.04238065663009
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Relations between entities can be represented by different instances, e.g., a
sentence containing both entities or a fact in a Knowledge Graph (KG). However,
these instances may not capture the general relations between entities well,
may be difficult for humans to understand, or may not even be found due to the
incompleteness of the knowledge source.
In this paper, we introduce the Open Relation Modeling task - given two
entities, generate a coherent sentence describing the relation between them. To
solve this task, we propose to teach machines to generate definition-like
relation descriptions by letting them learn from definitions of entities.
Specifically, we fine-tune Pre-trained Language Models (PLMs) to produce
definitions conditioned on extracted entity pairs. To help PLMs reason between
entities and provide additional relational knowledge to PLMs for open relation
modeling, we incorporate reasoning paths in KGs and include a reasoning path
selection mechanism. We show that PLMs can select interpretable and informative
reasoning paths by confidence estimation, and the selected path can guide PLMs
to generate better relation descriptions. Experimental results show that our
model can generate concise but informative relation descriptions that capture
the representative characteristics of entities and relations.
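The path-selection idea above can be sketched in a few lines: score each candidate KG reasoning path by the model's confidence in its verbalization, then condition generation on the highest-scoring path. The snippet below is an illustrative toy, not the authors' code: the entities, paths, and token probabilities are invented, and a real system would score paths with a fine-tuned PLM rather than the unigram table used here.

```python
# Toy sketch of confidence-based reasoning-path selection.
# Confidence is approximated as the average token log-probability
# the (toy) language model assigns to the verbalized path.
import math

# Hypothetical candidate reasoning paths between (Paris, Seine),
# each verbalized as a sequence of entity/relation tokens.
CANDIDATE_PATHS = [
    ["Paris", "located_on", "Seine"],
    ["Paris", "capital_of", "France", "has_river", "Seine"],
]

# Toy unigram "language model": token -> probability (made-up values).
TOKEN_PROB = {
    "Paris": 0.2, "Seine": 0.1, "France": 0.15,
    "located_on": 0.05, "capital_of": 0.3, "has_river": 0.02,
}

def path_confidence(path):
    """Average token log-probability; higher means the model is more
    confident that the verbalized path is plausible."""
    logps = [math.log(TOKEN_PROB.get(tok, 1e-6)) for tok in path]
    return sum(logps) / len(logps)

def select_path(paths):
    """Pick the candidate path the model is most confident about;
    the selected path would then condition the relation-description
    generator."""
    return max(paths, key=path_confidence)

best = select_path(CANDIDATE_PATHS)
print(" -> ".join(best))
```

Length-normalizing the log-probability (averaging rather than summing) keeps longer, more informative paths from being penalized merely for having more hops, which is why the two-hop path can win here despite containing more tokens.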
Related papers
- Two Heads Are Better Than One: Integrating Knowledge from Knowledge Graphs and Large Language Models for Entity Alignment [31.70064035432789]
We propose a Large Language Model-enhanced Entity Alignment framework (LLMEA).
LLMEA identifies candidate alignments for a given entity by considering both embedding similarities between entities across Knowledge Graphs and edit distances to a virtual equivalent entity.
Experiments conducted on three public datasets reveal that LLMEA surpasses leading baseline models.
arXiv Detail & Related papers (2024-01-30T12:41:04Z)
- Empowering Language Models with Knowledge Graph Reasoning for Question Answering [117.79170629640525]
We propose the knOwledge REasOning empowered Language Model (OREO-LM).
OREO-LM consists of a novel Knowledge Interaction Layer that can be flexibly plugged into existing Transformer-based LMs.
We show a significant performance gain, achieving state-of-the-art results in the Closed-Book setting.
arXiv Detail & Related papers (2022-11-15T18:26:26Z)
- DKG: A Descriptive Knowledge Graph for Explaining Relationships between Entities [34.14526494269527]
We propose Descriptive Knowledge Graph (DKG) - an open and interpretable form of modeling relationships between entities.
To construct DKGs, we propose a self-supervised learning method to extract relation descriptions.
Experiments demonstrate that our system can extract and generate high-quality relation descriptions.
arXiv Detail & Related papers (2022-05-21T01:16:04Z)
- Learning Relation-Specific Representations for Few-shot Knowledge Graph Completion [24.880078645503417]
We propose a Relation-Specific Context Learning (RSCL) framework, which exploits graph contexts of triples to capture semantic information of relations and entities simultaneously.
Experimental results on two public datasets demonstrate that RSCL outperforms state-of-the-art FKGC methods.
arXiv Detail & Related papers (2022-03-22T11:45:48Z)
- Learning to Compose Visual Relations [100.45138490076866]
We propose to represent each relation as an unnormalized density (an energy-based model).
We show that such a factorized decomposition allows the model to both generate and edit scenes with multiple sets of relations more faithfully.
arXiv Detail & Related papers (2021-11-17T18:51:29Z)
- Unsupervised Knowledge Graph Alignment by Probabilistic Reasoning and Semantic Embedding [22.123001954919893]
We propose an iterative framework named PRASE which is based on probabilistic reasoning and semantic embedding.
The PRASE framework is compatible with different embedding-based models, and our experiments on multiple datasets have demonstrated its state-of-the-art performance.
arXiv Detail & Related papers (2021-05-12T11:27:46Z)
- ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning [97.10875695679499]
We propose ERICA, a novel contrastive learning framework applied in the pre-training phase to obtain a deeper understanding of entities and their relations in text.
Experimental results demonstrate that our proposed ERICA framework achieves consistent improvements on several document-level language understanding tasks.
arXiv Detail & Related papers (2020-12-30T03:35:22Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Relation Extraction with Contextualized Relation Embedding (CRE) [6.030060645424665]
This paper proposes an architecture for the relation extraction task that integrates semantic information with knowledge base modeling.
We present a model architecture that internalizes KB modeling in relation extraction.
The proposed CRE model achieves state-of-the-art performance on datasets derived from The New York Times Annotated Corpus and FreeBase.
arXiv Detail & Related papers (2020-11-19T05:19:46Z)
- HittER: Hierarchical Transformers for Knowledge Graph Embeddings [85.93509934018499]
We propose HittER to learn representations of entities and relations in a complex knowledge graph.
Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets.
We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
arXiv Detail & Related papers (2020-08-28T18:58:15Z)
- Relational Message Passing for Knowledge Graph Completion [78.47976646383222]
We propose a relational message passing method for knowledge graph completion.
It passes relational messages among edges iteratively to aggregate neighborhood information.
Results show our method outperforms state-of-the-art knowledge graph completion methods by a large margin.
arXiv Detail & Related papers (2020-02-17T03:33:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.