Document-level Relation Extraction with Relation Correlations
- URL: http://arxiv.org/abs/2212.10171v1
- Date: Tue, 20 Dec 2022 11:17:52 GMT
- Title: Document-level Relation Extraction with Relation Correlations
- Authors: Ridong Han, Tao Peng, Benyou Wang, Lu Liu, Xiang Wan
- Abstract summary: Document-level relation extraction faces two overlooked challenges: the long-tail problem and the multi-label problem.
We analyze the co-occurrence correlation of relations and introduce it into the DocRE task for the first time.
- Score: 15.997345900917058
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Document-level relation extraction faces two overlooked challenges: the
long-tail problem and the multi-label problem. Previous work focuses mainly on
obtaining better contextual representations for entity pairs, hardly addressing
the above challenges. In this paper, we analyze the co-occurrence correlation of
relations and introduce it into the DocRE task for the first time. We argue that
the correlations can not only transfer knowledge between data-rich relations
and data-scarce ones to assist in the training of tailed relations, but also
reflect the semantic distance that guides the classifier to identify semantically
close relations for multi-label entity pairs. Specifically, we use relation
embeddings as a medium and propose two co-occurrence prediction sub-tasks, from
coarse- and fine-grained perspectives, to capture relation correlations.
Finally, the learned correlation-aware embeddings are used to guide the
extraction of relational facts. Extensive experiments on two popular DocRE
datasets show that our method achieves superior results compared to baselines.
Further analysis also demonstrates the potential of relation correlations to
address the above challenges.
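The co-occurrence correlation the abstract analyzes can be estimated directly from multi-label annotations: for each entity pair labeled with several relations, count how often relations appear together and normalize by per-relation frequency. The sketch below is illustrative only (the paper does not publish this code); `pair_labels`, the toy relation names, and the conditional normalization are assumptions for demonstration.

```python
from collections import Counter
from itertools import combinations

def cooccurrence(pair_labels):
    """Estimate P(b | a): the fraction of entity pairs labeled with
    relation a that are also labeled with relation b."""
    single = Counter()   # how many pairs carry each relation
    joint = Counter()    # how many pairs carry each unordered relation pair
    for relations in pair_labels:
        rels = sorted(set(relations))
        single.update(rels)
        for a, b in combinations(rels, 2):
            joint[(a, b)] += 1
    cond = {}
    for (a, b), n in joint.items():
        cond[(a, b)] = n / single[a]  # P(b | a)
        cond[(b, a)] = n / single[b]  # P(a | b)
    return cond

# Toy multi-label annotations: each entry is the relation set of one entity pair.
pairs = [
    {"country", "capital_of"},
    {"country", "capital_of"},
    {"country"},
    {"member_of"},
]
cond = cooccurrence(pairs)
# Every pair labeled capital_of is also labeled country, so
# cond[("capital_of", "country")] == 1.0, while only two of the
# three country-labeled pairs carry capital_of (2/3).
```

A matrix of such statistics is one natural supervision signal for co-occurrence prediction sub-tasks: frequently co-occurring relations should receive nearby embeddings, letting data-rich relations inform data-scarce ones.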
Related papers
- Document-Level Relation Extraction with Relation Correlation Enhancement [10.684005956288347]
Document-level relation extraction (DocRE) is a task that focuses on identifying relations between entities within a document.
Existing DocRE models often overlook the correlation between relations and lack a quantitative analysis of relation correlations.
We propose a relation graph method, which aims to explicitly exploit the interdependency among relations.
arXiv Detail & Related papers (2023-10-06T10:59:00Z)
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose a Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Relation-dependent Contrastive Learning with Cluster Sampling for Inductive Relation Prediction [30.404149577013595]
We introduce Relation-dependent Contrastive Learning (ReCoLe) for inductive relation prediction.
The GNN-based encoder is optimized by contrastive learning, which ensures satisfactory performance on long-tail relations.
Experimental results suggest that ReCoLe outperforms state-of-the-art methods on commonly used inductive datasets.
arXiv Detail & Related papers (2022-11-22T13:30:49Z)
- SAIS: Supervising and Augmenting Intermediate Steps for Document-Level Relation Extraction [51.27558374091491]
We propose to explicitly teach the model to capture relevant contexts and entity types by supervising and augmenting intermediate steps (SAIS) for relation extraction.
Based on a broad spectrum of carefully designed tasks, our proposed SAIS method not only extracts relations of better quality due to more effective supervision, but also retrieves the corresponding supporting evidence more accurately.
arXiv Detail & Related papers (2021-09-24T17:37:35Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an EGA mechanism is introduced to guide the attention to filter out information causing confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z)
- Improving Long-Tail Relation Extraction with Collaborating Relation-Augmented Attention [63.26288066935098]
We propose a novel neural network, Collaborating Relation-augmented Attention (CoRA), to handle both the wrong labeling and long-tail relations.
In the experiments on the popular benchmark dataset NYT, the proposed CoRA improves the prior state-of-the-art performance by a large margin.
arXiv Detail & Related papers (2020-10-08T05:34:43Z)
- Leveraging Semantic Parsing for Relation Linking over Knowledge Bases [80.99588366232075]
We present SLING, a relation linking framework which leverages semantic parsing using AMR and distant supervision.
SLING integrates multiple relation linking approaches that capture complementary signals such as linguistic cues, rich semantic representations, and information from the knowledge base.
Experiments on relation linking using three KBQA datasets (QALD-7, QALD-9, and LC-QuAD 1.0) demonstrate that the proposed approach achieves state-of-the-art performance on all benchmarks.
arXiv Detail & Related papers (2020-09-16T14:56:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.