Towards Better Document-level Relation Extraction via Iterative Inference
- URL: http://arxiv.org/abs/2211.14470v1
- Date: Sat, 26 Nov 2022 03:57:34 GMT
- Title: Towards Better Document-level Relation Extraction via Iterative Inference
- Authors: Liang Zhang, Jinsong Su, Yidong Chen, Zhongjian Miao, Zijun Min, Qingguo Hu, Xiaodong Shi
- Abstract summary: Document-level relation extraction (RE) aims to extract the relations between entities from the input document.
We propose a novel document-level RE model with iterative inference.
Experimental results on three commonly-used datasets show that our model consistently outperforms other competitive baselines.
- Score: 29.62043809208398
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Document-level relation extraction (RE) aims to extract the relations between
entities in an input document, which usually contains many hard-to-predict
entity pairs whose relations can only be determined through relational
inference. Existing methods usually predict the relations of all entity pairs
in the input document in a single pass, ignoring the fact that the predictions
for some entity pairs depend heavily on the predicted results of other pairs.
To address this issue, in this paper, we propose a novel document-level RE
model with iterative inference. Our model is mainly composed of two modules:
1) a base module that provides preliminary relation predictions for entity
pairs; 2) an inference module that refines these preliminary predictions by
iteratively handling hard-to-predict entity pairs, which depend on other pairs,
in an easy-to-hard manner. Unlike previous methods, which consider only the
feature information of entity pairs, our inference module is equipped with two
Extended Cross Attention units, allowing it to exploit both the feature
information and the previous predictions of entity pairs during relational
inference. Furthermore, we adopt a two-stage strategy to train our model. In
the first stage, we train only the base module. In the second stage, we train
the whole model, introducing contrastive learning to enhance the training of
the inference module. Experimental results on three commonly-used datasets
show that our model consistently outperforms other competitive baselines.
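The easy-to-hard refinement described above can be illustrated with a minimal NumPy sketch. This is a toy approximation, not the paper's method: the learned Extended Cross Attention units are replaced with plain dot-product attention, and the function names, confidence threshold, and update rule are all hypothetical. Hard pairs (low-confidence predictions) attend over both the features and the current predictions of easy pairs, matching the abstract's point that inference should use predictions as well as features.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def iterative_refine(pair_feats, base_logits, n_iters=3, conf_thresh=0.9):
    """Refine relation predictions in an easy-to-hard manner.

    pair_feats:  (n_pairs, d) feature vectors for entity pairs
    base_logits: (n_pairs, n_relations) preliminary scores from a base module
    """
    preds = softmax(base_logits)
    for _ in range(n_iters):
        conf = preds.max(axis=-1)
        easy = conf >= conf_thresh          # confidently predicted pairs
        hard = ~easy                        # pairs still needing inference
        if easy.all() or not easy.any():
            break
        # Hard pairs attend over both the features and the current
        # predictions of easy pairs (a toy stand-in for the paper's
        # Extended Cross Attention units).
        q = np.concatenate([pair_feats[hard], preds[hard]], axis=-1)
        kv = np.concatenate([pair_feats[easy], preds[easy]], axis=-1)
        attn = softmax(q @ kv.T / np.sqrt(kv.shape[-1]), axis=-1)
        context = attn @ preds[easy]        # evidence gathered from easy pairs
        preds[hard] = softmax(np.log(preds[hard] + 1e-9) + context, axis=-1)
    return preds
```

In this sketch, confident pairs are frozen and only hard pairs are updated each round, so pairs resolved in one iteration can serve as evidence in the next, mirroring the iterative dependency the abstract describes.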
Related papers
- Entity or Relation Embeddings? An Analysis of Encoding Strategies for Relation Extraction [19.019881161010474]
Relation extraction is essentially a text classification problem, which can be tackled by fine-tuning a pre-trained language model (LM).
Existing approaches therefore solve the problem in an indirect way: they fine-tune an LM to learn embeddings of the head and tail entities, and then predict the relationship from these entity embeddings.
Our hypothesis in this paper is that relation extraction models can be improved by capturing relationships in a more direct way.
arXiv Detail & Related papers (2023-12-18T09:58:19Z) - ProtoEM: A Prototype-Enhanced Matching Framework for Event Relation Extraction [69.74158631862652]
Event Relation Extraction (ERE) aims to extract multiple kinds of relations among events in texts.
Existing methods simply categorize event relations into different classes, which inadequately captures the intrinsic semantics of these relations.
We propose a Prototype-Enhanced Matching (ProtoEM) framework for the joint extraction of multiple kinds of event relations.
arXiv Detail & Related papers (2023-09-22T14:26:06Z) - A Simple yet Effective Relation Information Guided Approach for Few-Shot Relation Extraction [22.60428265210431]
Few-Shot Relation Extraction aims at predicting the relation for a pair of entities in a sentence by training with a few labelled examples in each relation.
Some recent works have introduced relation information to assist model learning based on Prototype Network.
We argue that relation information can be introduced more explicitly and effectively into the model.
arXiv Detail & Related papers (2022-05-19T13:03:01Z) - Document-Level Relation Extraction with Sentences Importance Estimation and Focusing [52.069206266557266]
Document-level relation extraction (DocRE) aims to determine the relation between two entities from a document of multiple sentences.
We propose a Sentence Importance Estimation and Focusing (SIEF) framework for DocRE, where we design a sentence importance score and a sentence focusing loss.
Experimental results on two domains show that our SIEF not only improves overall performance, but also makes DocRE models more robust.
arXiv Detail & Related papers (2022-04-27T03:20:07Z) - Document-level Relation Extraction with Context Guided Mention Integration and Inter-pair Reasoning [18.374097786748834]
Document-level Relation Extraction (DRE) aims to recognize the relations between two entities in a document.
Few previous studies have investigated mention integration, which may be problematic.
We propose two novel techniques, Context Guided Mention Integration and Inter-pair Reasoning.
arXiv Detail & Related papers (2022-01-13T08:00:23Z) - Link Prediction on N-ary Relational Data Based on Relatedness Evaluation [61.61555159755858]
We propose a method called NaLP to conduct link prediction on n-ary relational data.
We represent each n-ary relational fact as a set of its role and role-value pairs.
Experimental results validate the effectiveness and merits of the proposed methods.
arXiv Detail & Related papers (2021-04-21T09:06:54Z) - Paired Examples as Indirect Supervision in Latent Decision Models [109.76417071249945]
We introduce a way to leverage paired examples that provide stronger cues for learning latent decisions.
We apply our method to improve compositional question answering using neural module networks on the DROP dataset.
arXiv Detail & Related papers (2021-04-05T03:58:30Z) - Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z) - A Frustratingly Easy Approach for Entity and Relation Extraction [25.797992240847833]
We present a simple pipelined approach for entity and relation extraction.
We establish the new state-of-the-art on standard benchmarks (ACE04, ACE05 and SciERC).
Our approach essentially builds on two independent encoders and merely uses the entity model to construct the input for the relation model.
arXiv Detail & Related papers (2020-10-24T07:14:01Z) - Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an EGA mechanism is introduced to guide the attention to filter out information causing confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.