Sequence Generation with Label Augmentation for Relation Extraction
- URL: http://arxiv.org/abs/2212.14266v1
- Date: Thu, 29 Dec 2022 11:28:05 GMT
- Title: Sequence Generation with Label Augmentation for Relation Extraction
- Authors: Bo Li, Dingyao Yu, Wei Ye, Jinglei Zhang, Shikun Zhang
- Abstract summary: We propose Relation Extraction with Label Augmentation (RELA), a Seq2Seq model with automatic label augmentation for relation extraction.
Experimental results show RELA achieves competitive results compared with previous methods on four RE datasets.
- Score: 17.38986046630852
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sequence generation demonstrates promising performance in recent information
extraction efforts, by incorporating large-scale pre-trained Seq2Seq models.
This paper investigates the merits of employing sequence generation in relation
extraction, finding that with relation names or synonyms as generation targets,
their textual semantics and the correlation (in terms of word sequence pattern)
among them affect model performance. We then propose Relation Extraction with
Label Augmentation (RELA), a Seq2Seq model with automatic label augmentation
for RE. By label augmentation, we mean producing semantically related synonyms for
each relation name as the generation target. Besides, we present an in-depth
analysis of the Seq2Seq model's behavior when dealing with RE. Experimental
results show that RELA achieves competitive results compared with previous
methods on four RE datasets.
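The abstract's core idea, using relation names plus synonyms as generation targets, can be illustrated with a minimal sketch. The relation labels, synonym lists, and helper names below are hypothetical placeholders, not the paper's actual vocabulary or model; the sketch only shows how an augmented label set lets a generated phrase be mapped back to a relation.

```python
# Hypothetical sketch of the label-augmentation idea: each relation name is
# expanded into semantically related synonyms, and any generated phrase that
# matches one of them decodes back to that relation. The relations and
# synonyms here are illustrative, not from the paper.

RELATION_SYNONYMS = {
    "org:founded_by": ["founded by", "established by", "created by"],
    "per:employee_of": ["employee of", "works for", "member of"],
}

def augmented_targets(relation):
    """All generation targets for a relation: its verbalized name plus synonyms."""
    name = relation.split(":", 1)[1].replace("_", " ")
    return [name] + RELATION_SYNONYMS.get(relation, [])

def decode_relation(generated_text):
    """Map a phrase emitted by a Seq2Seq decoder back to a relation label."""
    text = generated_text.strip().lower()
    for relation in RELATION_SYNONYMS:
        if text in augmented_targets(relation):
            return relation
    return "no_relation"

print(decode_relation("established by"))  # -> org:founded_by
```

In a real Seq2Seq setup the decoder would be trained to emit any phrase in the augmented target set, so the richer textual semantics of the synonyms act as extra supervision for each relation class.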
Related papers
- InfeRE: Step-by-Step Regex Generation via Chain of Inference [15.276963928784047]
In this paper, we propose a new paradigm called InfeRE, which decomposes the generation of expressions into chains of step-by-step inference.
We evaluate InfeRE on two publicly available datasets, NL-RX-Turk and KB13, and compare the results with state-of-the-art approaches and the popular tree-based generation approach TRANX.
arXiv Detail & Related papers (2023-08-08T04:37:41Z)
- SequenceMatch: Imitation Learning for Autoregressive Sequence Modelling with Backtracking [60.109453252858806]
A maximum-likelihood (MLE) objective does not match a downstream use-case of autoregressively generating high-quality sequences.
We formulate sequence generation as an imitation learning (IL) problem.
This allows us to minimize a variety of divergences between the distribution of sequences generated by an autoregressive model and sequences from a dataset.
Our resulting method, SequenceMatch, can be implemented without adversarial training or architectural changes.
arXiv Detail & Related papers (2023-06-08T17:59:58Z)
- GCRE-GPT: A Generative Model for Comparative Relation Extraction [47.69464882382656]
Given comparative text, comparative relation extraction aims to extract two targets in comparison and the aspect they are compared for.
Existing solutions formulate this task as a sequence labeling task, to extract targets and aspects.
We show that comparative relations can be directly extracted with high accuracy by a generative model.
arXiv Detail & Related papers (2023-03-15T13:15:22Z)
- Mutual Exclusivity Training and Primitive Augmentation to Induce Compositionality [84.94877848357896]
Recent datasets expose the lack of systematic generalization ability in standard sequence-to-sequence models.
We analyze this behavior of seq2seq models and identify two contributing factors: a lack of mutual exclusivity bias and the tendency to memorize whole examples.
We show substantial empirical improvements using standard sequence-to-sequence models on two widely-used compositionality datasets.
arXiv Detail & Related papers (2022-11-28T17:36:41Z)
- DORE: Document Ordered Relation Extraction based on Generative Framework [56.537386636819626]
This paper investigates the root cause of the underwhelming performance of the existing generative DocRE models.
We propose to generate a symbolic and ordered sequence from the relation matrix, which is deterministic and easier for the model to learn.
Experimental results on four datasets show that our proposed method can improve the performance of the generative DocRE models.
arXiv Detail & Related papers (2022-10-28T11:18:10Z)
- Automatically Generating Counterfactuals for Relation Extraction [18.740447044960796]
Relation extraction (RE) is a fundamental task in natural language processing.
Current deep neural models have achieved high accuracy but are easily affected by spurious correlations.
We develop a novel approach to derive contextual counterfactuals for entities.
arXiv Detail & Related papers (2022-02-22T04:46:10Z)
- Minimize Exposure Bias of Seq2Seq Models in Joint Entity and Relation Extraction [57.22929457171352]
Joint entity and relation extraction aims to extract relation triplets from plain text directly.
We propose a novel Sequence-to-Unordered-Multi-Tree (Seq2UMTree) model to minimize the effects of exposure bias.
arXiv Detail & Related papers (2020-09-16T06:53:34Z)
- On the Discrepancy between Density Estimation and Sequence Generation [92.70116082182076]
Log-likelihood is highly correlated with BLEU when we consider models within the same family.
We observe no correlation between rankings of models across different families.
arXiv Detail & Related papers (2020-02-17T20:13:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.