Serial Contrastive Knowledge Distillation for Continual Few-shot
Relation Extraction
- URL: http://arxiv.org/abs/2305.06616v1
- Date: Thu, 11 May 2023 07:25:47 GMT
- Title: Serial Contrastive Knowledge Distillation for Continual Few-shot
Relation Extraction
- Authors: Xinyi Wang and Zitao Wang and Wei Hu
- Abstract summary: We propose a new model, namely SCKD, to accomplish the continual few-shot RE task.
Specifically, we design serial knowledge distillation to preserve the prior knowledge from previous models.
Our experiments on two benchmark datasets validate the effectiveness of SCKD for continual few-shot RE.
- Score: 35.79570854392989
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Continual few-shot relation extraction (RE) aims to continuously train a
model for new relations with few labeled training data, of which the major
challenges are the catastrophic forgetting of old relations and the overfitting
caused by data sparsity. In this paper, we propose a new model, namely SCKD, to
accomplish the continual few-shot RE task. Specifically, we design serial
knowledge distillation to preserve the prior knowledge from previous models and
conduct contrastive learning with pseudo samples to keep the representations of
samples in different relations sufficiently distinguishable. Our experiments on
two benchmark datasets validate the effectiveness of SCKD for continual
few-shot RE and its superiority in knowledge transfer and memory utilization
over state-of-the-art models.
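
The abstract only names the two training objectives, so the following is a minimal PyTorch-style sketch of what they could look like, not the authors' actual implementation. All names (`distillation_loss`, `contrastive_loss`, `temperature`, `tau`, the way features and logits are obtained) are illustrative assumptions.

```python
# Hypothetical sketch of the two losses described in the abstract:
# (1) serial knowledge distillation against the frozen previous-task model,
# (2) supervised contrastive learning over real and pseudo samples.
import torch
import torch.nn.functional as F


def distillation_loss(new_logits, old_logits, temperature=2.0):
    """Knowledge distillation: match the current model's softened relation
    predictions to those of the frozen model from the previous task."""
    p_new = F.log_softmax(new_logits / temperature, dim=-1)
    p_old = F.log_softmax(old_logits / temperature, dim=-1)
    return F.kl_div(p_new, p_old, log_target=True,
                    reduction="batchmean") * temperature ** 2


def contrastive_loss(feats, labels, tau=0.1):
    """Supervised contrastive objective: pull samples of the same relation
    (including pseudo samples) together, push different relations apart."""
    feats = F.normalize(feats, dim=-1)
    n = feats.size(0)
    sim = feats @ feats.t() / tau
    self_mask = torch.eye(n, dtype=torch.bool, device=feats.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    sim = sim.masked_fill(self_mask, float("-inf"))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_count
    return loss[pos_mask.any(dim=1)].mean()


# Illustrative total objective: cross-entropy on the current few-shot task,
# plus the distillation and contrastive terms (weights are assumptions).
# loss = ce_loss + lambda_kd * kd_loss + lambda_con * con_loss
```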