KESA: A Knowledge Enhanced Approach For Sentiment Analysis
- URL: http://arxiv.org/abs/2202.12093v1
- Date: Thu, 24 Feb 2022 13:21:27 GMT
- Title: KESA: A Knowledge Enhanced Approach For Sentiment Analysis
- Authors: Qinghua Zhao, Shuai Ma, Shuo Ren
- Abstract summary: We study sentence-level sentiment analysis and propose two sentiment-aware auxiliary tasks named sentiment word cloze and conditional sentiment prediction.
The experimental results demonstrate that our approach consistently outperforms pre-trained models.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Though some recent works focus on injecting sentiment knowledge into
pre-trained language models, they usually design mask and reconstruction tasks
in the post-training phase. In this paper, we aim to benefit from sentiment
knowledge in a lighter way. To achieve this goal, we study sentence-level
sentiment analysis and, correspondingly, propose two sentiment-aware auxiliary
tasks named sentiment word cloze and conditional sentiment prediction. The
first task learns to select the correct sentiment words within the input, given
the overall sentiment polarity as prior knowledge. Conversely, the second
task predicts the overall sentiment polarity given the sentiment polarity of
a word as prior knowledge. In addition, two kinds of label combination
methods are investigated to unify multiple types of labels in each task. We
argue that this additional information encourages the models to learn deeper
semantic representations. We implement the idea in a straightforward way to verify
this hypothesis. The experimental results demonstrate that our approach
consistently outperforms pre-trained models and is complementary to existing
knowledge-enhanced post-trained models. The code and data are released at
https://github.com/lshowway/KESA.
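The two auxiliary tasks can be illustrated with a minimal sketch. The lexicon, tokenization, and majority-vote heuristic below are hypothetical placeholders chosen for illustration; they are not the released KESA implementation, which learns these decisions with a pre-trained model.

```python
# Hypothetical sketch of KESA's two auxiliary tasks as label-construction
# logic; the lexicon and example sentence are illustrative only.

SENTIMENT_LEXICON = {"great": "positive", "awful": "negative", "boring": "negative"}

def sentiment_word_cloze(tokens, sentence_polarity):
    """Task 1: given the overall sentence polarity as prior knowledge,
    mark which input tokens are the matching sentiment words that the
    model should learn to select."""
    return [
        1 if SENTIMENT_LEXICON.get(tok) == sentence_polarity else 0
        for tok in tokens
    ]

def conditional_sentiment_prediction(tokens):
    """Task 2: given the polarity of sentiment words as prior knowledge,
    predict the overall sentence polarity (here, a majority-vote stand-in
    for the learned classifier)."""
    votes = [SENTIMENT_LEXICON[t] for t in tokens if t in SENTIMENT_LEXICON]
    if not votes:
        return "neutral"
    return max(set(votes), key=votes.count)

tokens = "the plot was great but the pacing was boring and awful".split()
print(sentiment_word_cloze(tokens, "positive"))  # 1 only at the position of "great"
print(conditional_sentiment_prediction(tokens))  # "negative" (two negative cues vs one positive)
```

The two tasks are mirror images: the first conditions on the sentence label to supervise word-level selection, the second conditions on word-level labels to supervise sentence-level prediction.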
Related papers
- What is Sentiment Meant to Mean to Language Models? (2024-05-03)
  "Sentiment" entails a wide variety of concepts depending on the domain and tools used; it has been used to mean emotion, opinions, market movements, or simply a general "good-bad" dimension.
- Towards Unsupervised Recognition of Token-level Semantic Differences in Related Documents (2023-05-22)
  Recognizing semantic differences is formulated as a token-level regression task, and three unsupervised approaches that rely on a masked language model are studied. The results show that an approach based on word alignment and sentence-level contrastive learning correlates robustly with gold labels.
- Unifying the Discrete and Continuous Emotion Labels for Speech Emotion Recognition (2022-10-29)
  In paralinguistic analysis for emotion detection from speech, emotions have been identified with discrete or dimensional (continuous-valued) labels. A model is proposed to jointly predict continuous and discrete emotional attributes.
- Sentiment-Aware Word and Sentence Level Pre-training for Sentiment Analysis (2022-10-18)
  SentiWSP is a sentiment-aware pre-trained language model with combined word-level and sentence-level pre-training tasks. It achieves new state-of-the-art performance on various sentence-level and aspect-level sentiment classification benchmarks.
- Prior Knowledge Guided Unsupervised Domain Adaptation (2022-07-18)
  A Knowledge-guided Unsupervised Domain Adaptation (KUDA) setting is proposed in which prior knowledge about the target class distribution is available, with two specific types considered: Unary Bound and Binary Relationship. A rectification module uses this prior knowledge to refine model-generated pseudo labels.
- A Latent-Variable Model for Intrinsic Probing (2022-01-20)
  A novel latent-variable formulation for constructing intrinsic probes is proposed. Empirical evidence shows that pre-trained representations develop a cross-lingually entangled notion of morphosyntax.
- Enhanced Aspect-Based Sentiment Analysis Models with Progressive Self-supervised Attention Learning (2021-03-05)
  In aspect-based sentiment analysis (ABSA), many neural models are equipped with an attention mechanism to quantify the contribution of each context word to sentiment prediction. A progressive self-supervised attention learning approach for attentional ABSA models is proposed and integrated into three state-of-the-art neural ABSA models.
- A Variational Approach to Unsupervised Sentiment Analysis (2020-08-21)
  A variational approach to unsupervised sentiment analysis that uses target-opinion word pairs as a supervision signal, applied to customer reviews and clinical narratives.
- SKEP: Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis (2020-05-12)
  Sentiment Knowledge Enhanced Pre-training (SKEP) learns a unified sentiment representation for multiple sentiment analysis tasks. With the help of automatically mined knowledge, SKEP conducts sentiment masking and constructs three sentiment knowledge prediction objectives. Experiments on three kinds of sentiment tasks show that SKEP significantly outperforms strong pre-training baselines.
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.