Summarization as Indirect Supervision for Relation Extraction
- URL: http://arxiv.org/abs/2205.09837v1
- Date: Thu, 19 May 2022 20:25:29 GMT
- Title: Summarization as Indirect Supervision for Relation Extraction
- Authors: Keming Lu, I-Hung Hsu, Wenxuan Zhou, Mingyu Derek Ma, Muhao Chen
- Abstract summary: We present SuRE, which converts relation extraction (RE) into a summarization formulation.
We develop sentence and relation conversion techniques that essentially bridge the formulation of summarization and RE tasks.
Experiments on three datasets demonstrate the effectiveness of SuRE in both full-dataset and low-resource settings.
- Score: 23.98136192661566
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Relation extraction (RE) models have been challenged by their reliance on
training data with expensive annotations. Considering that summarization tasks
aim at acquiring concise expressions of synoptical information from the longer
context, these tasks naturally align with the objective of RE, i.e., extracting
a kind of synoptical information that describes the relation of entity
mentions. We present SuRE, which converts RE into a summarization formulation.
SuRE leads to more precise and resource-efficient RE based on indirect
supervision from summarization tasks. To achieve this goal, we develop sentence
and relation conversion techniques that essentially bridge the formulation of
summarization and RE tasks. We also incorporate constraint decoding techniques
with Trie scoring to further enhance summarization-based RE with robust
inference. Experiments on three RE datasets demonstrate the effectiveness of
SuRE in both full-dataset and low-resource settings, showing that summarization
is a promising source of indirect supervision to improve RE models.
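The abstract's mention of "constraint decoding techniques with Trie scoring" can be illustrated with a minimal sketch. The relation verbalizations and the toy scorer below are hypothetical stand-ins, not SuRE's actual templates or model probabilities; the sketch only shows how a prefix trie restricts generation to valid relation verbalizations.

```python
# Illustrative sketch of Trie-constrained decoding, assuming hypothetical
# relation verbalizations and a toy scorer in place of a real summarization
# model's token probabilities.

END = "<eos>"

def build_trie(sequences):
    """Insert each token sequence (plus an end marker) into a nested-dict prefix trie."""
    root = {}
    for seq in sequences:
        node = root
        for tok in list(seq) + [END]:
            node = node.setdefault(tok, {})
    return root

def allowed_next(trie, prefix):
    """Return the tokens the decoder may emit after `prefix`, per the trie."""
    node = trie
    for tok in prefix:
        node = node.get(tok)
        if node is None:
            return []
    return sorted(node)

def constrained_decode(trie, score):
    """Greedy decode: at each step emit the highest-scoring token the trie allows."""
    prefix = []
    while True:
        options = allowed_next(trie, prefix)
        best = max(options, key=lambda tok: score(prefix, tok))
        if best == END:
            return prefix
        prefix.append(best)

# Hypothetical verbalized relation candidates (token sequences).
candidates = [
    ["city", "of", "birth"],
    ["city", "of", "death"],
    ["no", "relation"],
]
trie = build_trie(candidates)

# Toy scorer standing in for the model's conditional token scores.
def toy_score(prefix, tok):
    preferred = ["city", "of", "birth", END]
    step = len(prefix)
    return 1.0 if step < len(preferred) and tok == preferred[step] else 0.0

print(constrained_decode(trie, toy_score))  # ['city', 'of', 'birth']
```

Because every decoding step is restricted to the trie's continuations, the output is guaranteed to be one of the candidate verbalizations, which is what makes inference robust even when the generator would otherwise drift off-template.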
Related papers
- AMR-RE: Abstract Meaning Representations for Retrieval-Based In-Context Learning in Relation Extraction [9.12646853282321]
We propose an AMR-enhanced retrieval-based ICL method for relation extraction.
Our model retrieves in-context examples based on semantic structure similarity between task inputs and training samples.
arXiv Detail & Related papers (2024-06-14T22:36:08Z) - Retrieval-Augmented Generation-based Relation Extraction [0.0]
Retrieval-Augmented Generation-based Relation Extraction (RAG4RE) is proposed to enhance the performance of relation extraction tasks.
This work evaluates the effectiveness of the RAG4RE approach using different Large Language Models (LLMs).
The results of our study demonstrate that RAG4RE surpasses the performance of traditional RE approaches.
arXiv Detail & Related papers (2024-04-20T14:42:43Z) - Enhancing Low-Resource Relation Representations through Multi-View Decoupling [21.32064890807893]
We propose a novel prompt-based relation representation method, named MVRE.
MVRE decouples each relation into different perspectives to encompass multi-view relation representations.
Our method can achieve state-of-the-art in low-resource settings.
arXiv Detail & Related papers (2023-12-26T14:16:16Z) - Continual Contrastive Finetuning Improves Low-Resource Relation Extraction [34.76128090845668]
Relation extraction has been particularly challenging in low-resource scenarios and domains.
Recent literature has tackled low-resource RE by self-supervised learning.
We propose to pretrain and finetune the RE model using consistent objectives of contrastive learning.
arXiv Detail & Related papers (2022-12-21T07:30:22Z) - Towards Realistic Low-resource Relation Extraction: A Benchmark with Empirical Baseline Study [51.33182775762785]
This paper presents an empirical study to build relation extraction systems in low-resource settings.
We investigate three schemes to evaluate the performance in low-resource settings: (i) different types of prompt-based methods with few-shot labeled data; (ii) diverse balancing methods to address the long-tailed distribution issue; and (iii) data augmentation technologies and self-training to generate more labeled in-domain data.
arXiv Detail & Related papers (2022-10-19T15:46:37Z) - Should We Rely on Entity Mentions for Relation Extraction? Debiasing Relation Extraction with Counterfactual Analysis [60.83756368501083]
We propose the CORE (Counterfactual Analysis based Relation Extraction) debiasing method for sentence-level relation extraction.
Our CORE method is model-agnostic to debias existing RE systems during inference without changing their training processes.
arXiv Detail & Related papers (2022-05-08T05:13:54Z) - SAIS: Supervising and Augmenting Intermediate Steps for Document-Level Relation Extraction [51.27558374091491]
We propose to explicitly teach the model to capture relevant contexts and entity types by supervising and augmenting intermediate steps (SAIS) for relation extraction.
Based on a broad spectrum of carefully designed tasks, our proposed SAIS method not only extracts relations of better quality due to more effective supervision, but also retrieves the corresponding supporting evidence more accurately.
arXiv Detail & Related papers (2021-09-24T17:37:35Z) - D-REX: Dialogue Relation Extraction with Explanations [65.3862263565638]
This work focuses on extracting explanations that indicate that a relation exists while using only partially labeled data.
We propose our model-agnostic framework, D-REX, a policy-guided semi-supervised algorithm that explains and ranks relations.
We find that about 90% of the time, human annotators prefer D-REX's explanations over a strong BERT-based joint relation extraction and explanation model.
arXiv Detail & Related papers (2021-09-10T22:30:48Z) - Adjacency List Oriented Relational Fact Extraction via Adaptive Multi-task Learning [24.77542721790553]
We show that all of the fact extraction models can be organized according to a graph-oriented analytical perspective.
An efficient model, aDjacency lIst oRientational faCT (Direct), is proposed based on this analytical framework.
arXiv Detail & Related papers (2021-06-03T02:57:08Z) - Abstractive Query Focused Summarization with Query-Free Resources [60.468323530248945]
In this work, we consider the problem of leveraging only generic summarization resources to build an abstractive QFS system.
We propose Marge, a Masked ROUGE Regression framework composed of a novel unified representation for summaries and queries.
Despite learning from minimal supervision, our system achieves state-of-the-art results in the distantly supervised setting.
arXiv Detail & Related papers (2020-12-29T14:39:35Z) - Multi-task Collaborative Network for Joint Referring Expression Comprehension and Segmentation [135.67558811281984]
We propose a novel Multi-task Collaborative Network (MCN) to achieve joint learning of referring expression comprehension (REC) and segmentation (RES).
In MCN, RES can help REC to achieve better language-vision alignment, while REC can help RES to better locate the referent.
We address a key challenge in this multi-task setup, i.e., the prediction conflict, with two innovative designs, namely Consistency Energy Maximization (CEM) and Adaptive Soft Non-Located Suppression (ASNLS).
arXiv Detail & Related papers (2020-03-19T14:25:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.