Document-Level Relation Extraction with Reconstruction
- URL: http://arxiv.org/abs/2012.11384v1
- Date: Mon, 21 Dec 2020 14:29:31 GMT
- Title: Document-Level Relation Extraction with Reconstruction
- Authors: Wang Xu, Kehai Chen and Tiejun Zhao
- Abstract summary: We propose a novel encoder-classifier-reconstructor model for document-level relation extraction (DocRE).
The reconstructor reconstructs the ground-truth path dependencies from the graph representation, ensuring that the proposed DocRE model pays more attention to encoding entity pairs with relationships during training.
Experimental results on a large-scale DocRE dataset show that the proposed model can significantly improve the accuracy of relation extraction over a strong heterogeneous graph-based baseline.
- Score: 28.593318203728963
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In document-level relation extraction (DocRE), graph structure is generally
used to encode relation information in the input document to classify the
relation category between each entity pair, and has greatly advanced the DocRE
task over the past several years. However, the learned graph representation
universally models relation information between all entity pairs regardless of
whether there are relationships between these entity pairs. Thus, entity
pairs without relationships disperse the attention of the encoder-classifier
DocRE model away from those with relationships, which may further hinder the
improvement of DocRE. To alleviate this issue, we propose a novel
encoder-classifier-reconstructor model for DocRE. The reconstructor manages to
reconstruct the ground-truth path dependencies from the graph representation,
to ensure that the proposed DocRE model pays more attention to encoding
entity pairs with relationships during training. Furthermore, the
reconstructor is regarded as a relationship indicator to assist relation
classification during inference, which can further improve the performance of
the DocRE model.
Experimental results on a large-scale DocRE dataset show that the proposed
model can significantly improve the accuracy of relation extraction over a
strong heterogeneous graph-based baseline.
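As a rough illustration of how such an encoder-classifier-reconstructor could be wired together, the sketch below is a toy PyTorch module. The feed-forward "encoder" standing in for the heterogeneous graph encoder, the pair-feature dimensions, the inference threshold, and the collapse of path reconstruction into a single binary "does a dependency path exist" score are all assumptions for illustration; the paper itself reconstructs the ground-truth path dependencies from the graph representation.
```python
# Minimal sketch of an encoder-classifier-reconstructor for DocRE.
# All module sizes, the simplified encoder, the binary reconstruction loss,
# and the inference threshold are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EncoderClassifierReconstructor(nn.Module):
    def __init__(self, pair_dim: int = 512, hidden_dim: int = 256, num_relations: int = 97):
        super().__init__()
        # Encoder: stands in for the graph-based document encoder; it maps a
        # feature vector for each entity pair to a shared representation.
        self.encoder = nn.Sequential(nn.Linear(pair_dim, hidden_dim), nn.ReLU())
        # Classifier: predicts the relation category of each entity pair.
        self.classifier = nn.Linear(hidden_dim, num_relations)
        # Reconstructor: scores whether a ground-truth path dependency exists
        # for the pair, i.e. acts as a binary relationship indicator here.
        self.reconstructor = nn.Linear(hidden_dim, 1)

    def forward(self, pair_features: torch.Tensor):
        h = self.encoder(pair_features)                  # (num_pairs, hidden_dim)
        rel_logits = self.classifier(h)                  # (num_pairs, num_relations)
        path_logits = self.reconstructor(h).squeeze(-1)  # (num_pairs,)
        return rel_logits, path_logits


def joint_loss(rel_logits, path_logits, rel_labels, has_path):
    """Classification loss plus a reconstruction term, so pairs that actually
    hold a relationship receive extra training signal."""
    cls_loss = F.cross_entropy(rel_logits, rel_labels)
    rec_loss = F.binary_cross_entropy_with_logits(path_logits, has_path)
    return cls_loss + rec_loss


def predict(rel_logits, path_logits, no_relation_idx: int = 0, threshold: float = 0.5):
    """At inference the reconstructor score serves as a relationship indicator:
    pairs it deems unrelated are mapped to the no-relation class."""
    preds = rel_logits.argmax(dim=-1)
    preds[torch.sigmoid(path_logits) < threshold] = no_relation_idx
    return preds
```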
Related papers
- Relation Rectification in Diffusion Model [64.84686527988809]
We introduce a novel task termed Relation Rectification, aiming to refine the model to accurately represent a given relationship it initially fails to generate.
We propose an innovative solution utilizing a Heterogeneous Graph Convolutional Network (HGCN).
The lightweight HGCN adjusts the text embeddings generated by the text encoder, ensuring the accurate reflection of the textual relation in the embedding space.
arXiv Detail & Related papers (2024-03-29T15:54:36Z)
- A Semantic Mention Graph Augmented Model for Document-Level Event Argument Extraction [12.286432133599355]
Document-level Event Argument Extraction (DEAE) aims to identify arguments and their specific roles from an unstructured document.
Advanced approaches to DEAE utilize prompt-based methods to guide pre-trained language models (PLMs) in extracting arguments from input documents.
In this paper, we propose a semantic mention Graph Augmented Model (GAM) to address these two problems.
arXiv Detail & Related papers (2024-03-12T08:58:07Z)
- Document-Level Relation Extraction with Relation Correlation Enhancement [10.684005956288347]
Document-level relation extraction (DocRE) is a task that focuses on identifying relations between entities within a document.
Existing DocRE models often overlook the correlation between relations and lack a quantitative analysis of relation correlations.
We propose a relation graph method, which aims to explicitly exploit the interdependency among relations.
arXiv Detail & Related papers (2023-10-06T10:59:00Z)
- Improving Long Tailed Document-Level Relation Extraction via Easy Relation Augmentation and Contrastive Learning [66.83982926437547]
We argue that mitigating the long-tailed distribution problem is crucial for DocRE in the real-world scenario.
Motivated by the long-tailed distribution problem, we propose an Easy Relation Augmentation (ERA) method for improving DocRE.
arXiv Detail & Related papers (2022-05-21T06:15:11Z)
- Document-Level Relation Extraction with Sentences Importance Estimation and Focusing [52.069206266557266]
Document-level relation extraction (DocRE) aims to determine the relation between two entities from a document of multiple sentences.
We propose a Sentence Importance Estimation and Focusing (SIEF) framework for DocRE, where we design a sentence importance score and a sentence focusing loss.
Experimental results on two domains show that our SIEF not only improves overall performance, but also makes DocRE models more robust.
arXiv Detail & Related papers (2022-04-27T03:20:07Z)
- A Masked Image Reconstruction Network for Document-level Relation Extraction [3.276435438007766]
Document-level relation extraction requires inference over multiple sentences to extract complex relational triples.
We propose a novel Document-level Relation Extraction model based on a Masked Image Reconstruction network (DRE-MIR).
We evaluate our model on three public document-level relation extraction datasets.
arXiv Detail & Related papers (2022-04-21T02:41:21Z)
- Does Recommend-Revise Produce Reliable Annotations? An Analysis on Missing Instances in DocRED [60.39125850987604]
We show that the recommend-revise scheme results in false negative samples and an obvious bias towards popular entities and relations.
The relabeled dataset is released to serve as a more reliable test set for document RE models.
arXiv Detail & Related papers (2022-04-17T11:29:01Z)
- Mention-centered Graph Neural Network for Document-level Relation Extraction [2.724649366608364]
We build cross-sentence dependencies by inferring compositional relations between inter-sentence mentions.
Experiments show the connections between different mentions are crucial to document-level relation extraction.
arXiv Detail & Related papers (2021-03-15T08:19:44Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an EGA mechanism is introduced to guide the attention to filter out information causing confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z)
- Reasoning with Latent Structure Refinement for Document-Level Relation Extraction [20.308845516900426]
We propose a novel model that empowers the relational reasoning across sentences by automatically inducing the latent document-level graph.
Specifically, our model achieves an F1 score of 59.05 on a large-scale document-level dataset (DocRED).
arXiv Detail & Related papers (2020-05-13T13:36:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.