ZS-BERT: Towards Zero-Shot Relation Extraction with Attribute
Representation Learning
- URL: http://arxiv.org/abs/2104.04697v1
- Date: Sat, 10 Apr 2021 06:53:41 GMT
- Title: ZS-BERT: Towards Zero-Shot Relation Extraction with Attribute
Representation Learning
- Authors: Chih-Yao Chen, Cheng-Te Li
- Abstract summary: We formulate the zero-shot relation extraction problem by incorporating the text description of seen and unseen relations.
We propose a novel multi-task learning model, zero-shot BERT (ZS-BERT), to directly predict unseen relations without hand-crafted attribute labeling and multiple pairwise classifications.
Experiments conducted on two well-known datasets show that ZS-BERT outperforms existing methods by at least a 13.54% improvement in F1 score.
- Score: 10.609715843964263
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While relation extraction is an essential task in knowledge acquisition and
representation, and newly generated relations are common in the real world, less
effort has been made to predict unseen relations that cannot be observed at the
training stage. In this paper, we formulate the zero-shot relation extraction
problem by incorporating the text description of seen and unseen relations. We
propose a novel multi-task learning model, zero-shot BERT (ZS-BERT), to
directly predict unseen relations without hand-crafted attribute labeling and
multiple pairwise classifications. Given training instances consisting of input
sentences and the descriptions of their relations, ZS-BERT learns two functions
that project sentences and relation descriptions into an embedding space by
jointly minimizing the distances between them and classifying seen relations.
By generating the embeddings of unseen relations and newly arriving sentences with
these two functions, we use nearest neighbor search to obtain the prediction
of unseen relations. Experiments conducted on two well-known datasets show that
ZS-BERT outperforms existing methods by at least a 13.54% improvement in F1
score.
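The inference step described in the abstract (nearest neighbor search between a sentence embedding and the embeddings of unseen relation descriptions) can be sketched roughly as follows; the toy 4-d vectors and relation names are invented placeholders, not outputs of the actual ZS-BERT projection functions.

```python
import numpy as np

def nearest_relation(sentence_emb, relation_embs, relation_names):
    """Return the relation whose description embedding is closest
    (by cosine similarity) to the sentence embedding."""
    s = sentence_emb / np.linalg.norm(sentence_emb)
    R = relation_embs / np.linalg.norm(relation_embs, axis=1, keepdims=True)
    sims = R @ s                      # cosine similarity to each relation
    return relation_names[int(np.argmax(sims))]

# Toy 4-d embeddings standing in for the two learned projection functions.
relations = ["founded_by", "located_in"]
rel_embs = np.array([[0.9, 0.1, 0.0, 0.1],
                     [0.0, 0.8, 0.5, 0.1]])
sent_emb = np.array([0.85, 0.2, 0.05, 0.1])   # a sentence about a founder
print(nearest_relation(sent_emb, rel_embs, relations))  # founded_by
```

In practice the distance metric and embedding dimensionality follow the trained model; cosine similarity is used here only as a common default.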
Related papers
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose a Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Open Set Relation Extraction via Unknown-Aware Training [72.10462476890784]
We propose an unknown-aware training method, regularizing the model by dynamically synthesizing negative instances.
Inspired by text adversarial attacks, we adaptively apply small but critical perturbations to original training instances.
Experimental results show that this method achieves SOTA unknown relation detection without compromising the classification of known relations.
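The "small but critical perturbations" mentioned above can be illustrated with an FGSM-style step against a linear scorer; the scorer `w`, the instance `x`, and the step size `eps` are hypothetical stand-ins for illustration, not the paper's actual model.

```python
import numpy as np

def synthesize_negative(x, w, eps=0.5):
    """Perturb instance x in the direction that most lowers the known-relation
    score w @ x (an FGSM-style step), yielding a hard negative near the
    decision boundary."""
    grad = w                      # gradient of the linear score w @ x w.r.t. x
    return x - eps * np.sign(grad)

w = np.array([1.0, -0.5, 0.2])   # toy linear scorer for one known relation
x = np.array([0.9, -0.4, 0.3])   # a confidently scored positive instance
x_neg = synthesize_negative(x, w)
print(w @ x, w @ x_neg)          # the perturbed score is lower than the original
```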
arXiv Detail & Related papers (2023-06-08T05:45:25Z)
- Sentence-Level Relation Extraction via Contrastive Learning with Descriptive Relation Prompts [1.5736899098702974]
We propose a new paradigm, Contrastive Learning with Descriptive Relation Prompts (CTL-), to jointly consider entity information, relational knowledge, and entity type restrictions.
CTL- obtains a competitive F1-score of 76.7% on TACRED.
The newly presented paradigm achieves F1-scores of 85.8% and 91.6% on TACREV and Re-TACRED respectively, both state-of-the-art results.
arXiv Detail & Related papers (2023-04-11T02:15:13Z)
- Relation-dependent Contrastive Learning with Cluster Sampling for Inductive Relation Prediction [30.404149577013595]
We introduce Relation-dependent Contrastive Learning (ReCoLe) for inductive relation prediction.
The GNN-based encoder is optimized by contrastive learning, which ensures satisfactory performance on long-tail relations.
Experimental results suggest that ReCoLe outperforms state-of-the-art methods on commonly used inductive datasets.
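Contrastive optimization of an encoder, as the summary above mentions, is typically driven by a loss such as InfoNCE; the sketch below is a generic NumPy illustration with an assumed temperature `tau`, not ReCoLe's exact objective or its cluster sampling.

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    """InfoNCE loss: pulls the positive toward the anchor and pushes
    the negatives away; low when the positive is the closest sample."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(anchor, positive)]
                      + [cos(anchor, n) for n in negatives]) / tau
    logits = logits - logits.max()              # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])

anchor = np.array([1.0, 0.0])
positive = np.array([0.9, 0.1])
negatives = [np.array([0.0, 1.0]), np.array([-1.0, 0.0])]
print(info_nce(anchor, positive, negatives))    # small: the positive is close
```

Swapping the positive with a negative makes the loss grow, which is the gradient signal that shapes the encoder's embedding space.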
arXiv Detail & Related papers (2022-11-22T13:30:49Z)
- Prompt-based Zero-shot Relation Extraction with Semantic Knowledge Augmentation [3.154631846975021]
In relation triplet extraction, recognizing unseen relations for which there are no training instances is a challenging task.
We propose a prompt-based model with semantic knowledge augmentation (ZS-SKA) to recognize unseen relations under the zero-shot setting.
arXiv Detail & Related papers (2021-12-08T19:34:27Z)
- D-REX: Dialogue Relation Extraction with Explanations [65.3862263565638]
This work focuses on extracting explanations that indicate that a relation exists while using only partially labeled data.
We propose our model-agnostic framework, D-REX, a policy-guided semi-supervised algorithm that explains and ranks relations.
We find that about 90% of the time, human annotators prefer D-REX's explanations over a strong BERT-based joint relation extraction and explanation model.
arXiv Detail & Related papers (2021-09-10T22:30:48Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
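One common reading of a "relation prototype" is the centroid of the embeddings of a relation's instances, with classification by nearest prototype; the sketch below uses that assumption and toy 2-d embeddings, which may differ from the paper's implicit-factor construction.

```python
import numpy as np

def relation_prototypes(embeddings, labels):
    """Compute one prototype per relation as the mean (centroid)
    of that relation's instance embeddings."""
    protos = {}
    for rel in set(labels):
        idx = [i for i, lab in enumerate(labels) if lab == rel]
        protos[rel] = np.mean(embeddings[idx], axis=0)
    return protos

def classify(query, protos):
    """Assign the relation whose prototype is nearest in Euclidean distance."""
    return min(protos, key=lambda r: np.linalg.norm(query - protos[r]))

embs = np.array([[1.0, 0.0], [0.9, 0.1],   # instances of "capital_of"
                 [0.0, 1.0], [0.1, 0.9]])  # instances of "born_in"
labels = ["capital_of", "capital_of", "born_in", "born_in"]
protos = relation_prototypes(embs, labels)
print(classify(np.array([0.8, 0.2]), protos))  # capital_of
```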
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an EGA mechanism is introduced to guide the attention to filter out information causing confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z)
- Relation of the Relations: A New Paradigm of the Relation Extraction Problem [52.21210549224131]
We propose a new paradigm of Relation Extraction (RE) that considers as a whole the predictions of all relations in the same context.
We develop a data-driven approach that does not require hand-crafted rules but learns by itself the relation of relations (RoR) using Graph Neural Networks and a relation matrix transformer.
Experiments show that our model outperforms the state-of-the-art approaches by +1.12% on the ACE05 dataset and +2.55% on SemEval 2018 Task 7.2.
arXiv Detail & Related papers (2020-06-05T22:25:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.