A Simple yet Effective Relation Information Guided Approach for Few-Shot
Relation Extraction
- URL: http://arxiv.org/abs/2205.09536v1
- Date: Thu, 19 May 2022 13:03:01 GMT
- Title: A Simple yet Effective Relation Information Guided Approach for Few-Shot
Relation Extraction
- Authors: Yang Liu, Jinpeng Hu, Xiang Wan, Tsung-Hui Chang
- Abstract summary: Few-Shot Relation Extraction aims at predicting the relation for a pair of entities in a sentence by training with a few labelled examples in each relation.
Some recent works have introduced relation information to assist model learning based on Prototype Network.
We argue that relation information can be introduced more explicitly and effectively into the model.
- Score: 22.60428265210431
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Few-Shot Relation Extraction aims at predicting the relation for a pair of
entities in a sentence by training with a few labelled examples in each
relation. Some recent works have introduced relation information (i.e.,
relation labels or descriptions) to assist model learning based on the
Prototype Network. However, most of them constrain the prototypes of each
relation class only implicitly with relation information, generally by
designing complex network structures, such as generating hybrid features or
combining the prototypes with contrastive learning or attention networks. We
argue that relation information can be introduced more explicitly and
effectively into the model. Thus, this paper proposes a direct addition
approach to introduce relation information.
Specifically, for each relation class, the relation representation is first
generated by concatenating two views of relations (i.e., [CLS] token embedding
and the mean value of embeddings of all tokens) and then directly added to the
original prototype for both training and prediction. Experimental results on
the benchmark dataset FewRel 1.0 show significant improvements and achieve
results comparable to the state of the art, demonstrating the effectiveness
of our proposed approach. In addition, further analyses verify that direct
addition is a much more effective way to integrate the relation
representations with the original prototypes.
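As an illustration of the direct-addition idea described above, the following is a minimal PyTorch-style sketch, not the authors' released code. It assumes a BERT-style encoder and that class prototypes (built elsewhere from support instances) already have dimension 2H, matching the concatenation of the two relation views; all names are illustrative.
```python
# Minimal sketch of the direct-addition approach; illustrative only.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def relation_representations(descriptions):
    """Two views per relation description: the [CLS] embedding and the mean
    of all token embeddings, concatenated into one vector per relation."""
    batch = tokenizer(descriptions, padding=True, truncation=True,
                      return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state            # (N, T, H)
    cls_view = hidden[:, 0, :]                              # (N, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()    # (N, T, 1)
    mean_view = (hidden * mask).sum(1) / mask.sum(1)        # (N, H)
    return torch.cat([cls_view, mean_view], dim=-1)         # (N, 2H)

def classify(queries, prototypes, rel_reps):
    """Direct addition: shift each prototype by its relation representation,
    then score query embeddings by negative Euclidean distance."""
    enhanced = prototypes + rel_reps                         # (N, 2H)
    logits = -torch.cdist(queries, enhanced)                 # (Q, N)
    return logits.argmax(dim=-1)
```
In the paper the same addition is applied during both episodic training and prediction; the sketch shows only the scoring step.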
Related papers
- Entity or Relation Embeddings? An Analysis of Encoding Strategies for Relation Extraction [19.019881161010474]
Relation extraction is essentially a text classification problem, which can be tackled by fine-tuning a pre-trained language model (LM).
Existing approaches therefore solve the problem in an indirect way: they fine-tune an LM to learn embeddings of the head and tail entities, and then predict the relationship from these entity embeddings.
Our hypothesis in this paper is that relation extraction models can be improved by capturing relationships in a more direct way.
arXiv Detail & Related papers (2023-12-18T09:58:19Z)
- A Novel Few-Shot Relation Extraction Pipeline Based on Adaptive Prototype Fusion [5.636675879040131]
Few-shot relation extraction (FSRE) aims at recognizing unseen relations by learning with merely a handful of annotated instances.
This paper proposes a novel pipeline for the FSRE task based on adaptive prototype fusion.
Experiments on the benchmark dataset FewRel 1.0 show a significant improvement of our method against state-of-the-art methods.
arXiv Detail & Related papers (2022-10-15T09:44:21Z)
- Link Prediction on N-ary Relational Data Based on Relatedness Evaluation [61.61555159755858]
We propose a method called NaLP to conduct link prediction on n-ary relational data.
We represent each n-ary relational fact as a set of its role and role-value pairs.
Experimental results validate the effectiveness and merits of the proposed methods.
arXiv Detail & Related papers (2021-04-21T09:06:54Z)
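To make the NaLP entry above concrete, here is a small illustrative example of writing an n-ary relational fact as role and role-value pairs; the fact, role names, and candidate set are invented for illustration and are not taken from the paper.
```python
# Illustrative only: an n-ary fact decomposed into role -> value pairs,
# in the spirit of the NaLP representation described above.
fact = {
    "athlete": "Michael Phelps",
    "event": "200m butterfly",
    "games": "Beijing 2008",
    "medal": "gold",
}

# Link prediction on such data asks for the missing value of one role,
# given the remaining role-value pairs of the same fact.
incomplete = {role: value for role, value in fact.items() if role != "medal"}
candidates = ["gold", "silver", "bronze"]  # hypothetical candidate values
```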
- Prototypical Representation Learning for Relation Extraction [56.501332067073065]
This paper aims to learn predictive, interpretable, and robust relation representations from distantly-labeled data.
We learn prototypes for each relation from contextual information to best explore the intrinsic semantics of relations.
Results on several relation learning tasks show that our model significantly outperforms the previous state-of-the-art relational models.
arXiv Detail & Related papers (2021-03-22T08:11:43Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- A Frustratingly Easy Approach for Entity and Relation Extraction [25.797992240847833]
We present a simple pipelined approach for entity and relation extraction.
We establish the new state-of-the-art on standard benchmarks (ACE04, ACE05 and SciERC).
Our approach essentially builds on two independent encoders and merely uses the entity model to construct the input for the relation model.
arXiv Detail & Related papers (2020-10-24T07:14:01Z)
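For the pipelined approach above, the following sketch shows how entity predictions can be used only to construct the input of an independent relation encoder. The typed-marker scheme, names, and example sentence are assumptions for illustration, not necessarily the paper's exact formulation.
```python
# Sketch of the pipelined idea: the entity model's predicted spans are used
# only to build the relation model's input, here by inserting typed marker
# tokens around a candidate subject/object pair.
from typing import List, Tuple

Span = Tuple[int, int, str]  # (start token, end token, entity type)

def mark_pair(tokens: List[str], subj: Span, obj: Span) -> List[str]:
    """Wrap the subject and object spans with typed marker tokens."""
    (s0, s1, st), (o0, o1, ot) = subj, obj
    out: List[str] = []
    for i, tok in enumerate(tokens):
        if i == s0:
            out.append(f"<S:{st}>")
        if i == o0:
            out.append(f"<O:{ot}>")
        out.append(tok)
        if i == s1:
            out.append(f"</S:{st}>")
        if i == o1:
            out.append(f"</O:{ot}>")
    return out

tokens = "Barack Obama was born in Hawaii .".split()
subj = (0, 1, "PER")  # pretend the entity model predicted "Barack Obama"
obj = (5, 5, "LOC")   # ... and "Hawaii"
print(" ".join(mark_pair(tokens, subj, obj)))
# The marked sequence is then encoded by a separate relation classifier.
```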
- Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an EGA mechanism is introduced to guide the attention to filter out information causing confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z)
- Type-augmented Relation Prediction in Knowledge Graphs [65.88395564516115]
We propose a type-augmented relation prediction (TaRP) method, where we apply both the type information and instance-level information for relation prediction.
Our proposed TaRP method achieves significantly better performance than state-of-the-art methods on four benchmark datasets.
arXiv Detail & Related papers (2020-09-16T21:14:18Z)
- Relation-Guided Representation Learning [53.60351496449232]
We propose a new representation learning method that explicitly models and leverages sample relations.
Our framework well preserves the relations between samples.
By seeking to embed samples into subspaces, we show that our method can address the large-scale and out-of-sample problems.
arXiv Detail & Related papers (2020-07-11T10:57:45Z)
- Few-shot Relation Extraction via Bayesian Meta-learning on Relation Graphs [35.842356537926]
This paper studies few-shot relation extraction, which aims at predicting the relation for a pair of entities in a sentence by training with a few labeled examples in each relation.
To more effectively generalize to new relations, in this paper we study the relationships between different relations and propose to leverage a global relation graph.
arXiv Detail & Related papers (2020-07-05T17:04:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.