SentiCSE: A Sentiment-aware Contrastive Sentence Embedding Framework with Sentiment-guided Textual Similarity
- URL: http://arxiv.org/abs/2404.01104v1
- Date: Mon, 1 Apr 2024 13:24:20 GMT
- Title: SentiCSE: A Sentiment-aware Contrastive Sentence Embedding Framework with Sentiment-guided Textual Similarity
- Authors: Jaemin Kim, Yohan Na, Kangmin Kim, Sang Rak Lee, Dong-Kyu Chae
- Abstract summary: Sentiment-aware pre-trained language models (PLMs) have demonstrated impressive results in downstream sentiment analysis tasks.
We propose Sentiment-guided Textual Similarity (SgTS), a novel metric for evaluating the quality of sentiment representations.
We then propose SentiCSE, a novel Sentiment-aware Contrastive Sentence Embedding framework for constructing sentiment representations.
- Score: 12.954271451359222
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, sentiment-aware pre-trained language models (PLMs) have demonstrated impressive results in downstream sentiment analysis tasks. However, they neglect to evaluate the quality of their constructed sentiment representations; they focus only on improving fine-tuning performance, which overshadows representation quality. We argue that without guaranteeing representation quality, their downstream performance can depend heavily on the supervision provided by the fine-tuning data rather than on the representations themselves. This makes it difficult for them to transfer to other sentiment-related domains, especially those where labeled data is scarce. We first propose Sentiment-guided Textual Similarity (SgTS), a novel metric for evaluating the quality of sentiment representations, which is designed based on the degree of equivalence in sentiment polarity between two sentences. We then propose SentiCSE, a novel Sentiment-aware Contrastive Sentence Embedding framework for constructing sentiment representations via combined word-level and sentence-level objectives, whose quality is guaranteed by SgTS. Qualitative and quantitative comparisons with previous sentiment-aware PLMs show the superiority of our work. Our code is available at: https://github.com/nayohan/SentiCSE
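The abstract describes SentiCSE's combined word-level and sentence-level objectives only at a high level. As a rough illustration, the minimal sketch below implements one plausible form of the sentence-level part: a supervised contrastive loss in which sentences sharing a sentiment polarity label act as positives and other in-batch sentences act as negatives. This is an assumption based on the abstract alone, not the authors' implementation (see the linked repository for that); the function name, temperature value, and label convention are hypothetical.

```python
# Illustrative sketch of a sentence-level, sentiment-aware contrastive loss.
# Not the authors' code; see https://github.com/nayohan/SentiCSE for the real implementation.
import torch
import torch.nn.functional as F


def senticse_sentence_loss(embeddings: torch.Tensor,
                           labels: torch.Tensor,
                           temperature: float = 0.05) -> torch.Tensor:
    """Supervised contrastive loss: sentences with the same sentiment polarity
    are treated as positives, all other in-batch sentences as negatives.

    embeddings: (batch, dim) sentence representations from the encoder.
    labels:     (batch,) sentiment polarity ids (e.g. 0 = negative, 1 = positive).
    """
    z = F.normalize(embeddings, dim=-1)            # work in cosine-similarity space
    sim = z @ z.t() / temperature                  # (batch, batch) scaled similarities
    batch = z.size(0)

    self_mask = torch.eye(batch, dtype=torch.bool, device=z.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask

    # softmax over all other sentences in the batch (exclude self-similarity)
    sim = sim.masked_fill(self_mask, float("-inf"))
    log_prob = F.log_softmax(sim, dim=1)

    # average log-probability assigned to the positives of each anchor
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts)
    return loss[pos_mask.any(dim=1)].mean()


if __name__ == "__main__":
    emb = torch.randn(8, 768)                      # stand-in for encoder outputs
    lab = torch.tensor([0, 1, 0, 1, 1, 0, 1, 0])
    print(senticse_sentence_loss(emb, lab))
```

An SgTS-style evaluation could then, for example, compare the cosine similarities produced by the trained encoder against binary same-polarity judgments over sentence pairs, rewarding representations that place same-polarity sentences closer together; the exact formulation is defined in the paper.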
Related papers
- Sentiment Reasoning for Healthcare [2.0451307225357427]
Sentiment Reasoning is an auxiliary task in sentiment analysis where the model predicts both the sentiment label and generates the rationale behind it based on the input transcript.
Our study, conducted on both human transcripts and Automatic Speech Recognition (ASR) transcripts, shows that Sentiment Reasoning helps improve model transparency by providing rationales for model predictions with quality semantically comparable to human rationales.
arXiv Detail & Related papers (2024-07-24T12:07:54Z)
- Sentiment-Aware Word and Sentence Level Pre-training for Sentiment Analysis [64.70116276295609]
SentiWSP is a Sentiment-aware pre-trained language model with combined Word-level and Sentence-level Pre-training tasks.
SentiWSP achieves new state-of-the-art performance on various sentence-level and aspect-level sentiment classification benchmarks.
arXiv Detail & Related papers (2022-10-18T12:25:29Z)
- Learning Implicit Sentiment in Aspect-based Sentiment Analysis with Supervised Contrastive Pre-Training [18.711698114617526]
We propose Supervised Contrastive Pre-training on large-scale sentiment-annotated corpora.
By aligning the representation of implicit sentiment expressions to those with the same sentiment label, the pre-training process leads to better capture of both implicit and explicit sentiment orientation towards aspects in reviews.
arXiv Detail & Related papers (2021-11-03T13:03:17Z)
- SentiPrompt: Sentiment Knowledge Enhanced Prompt-Tuning for Aspect-Based Sentiment Analysis [22.758661494710047]
We propose SentiPrompt to tune the language model in the unified framework.
We inject sentiment knowledge regarding aspects, opinions, and polarities into the prompt and explicitly model term relations.
Our approach can outperform strong baselines on Triplet Extraction, Pair Extraction, and Aspect Term Extraction with Sentiment Classification.
arXiv Detail & Related papers (2021-09-17T01:56:06Z)
- Contrastive Semantic Similarity Learning for Image Captioning Evaluation with Intrinsic Auto-encoder [52.42057181754076]
Motivated by the auto-encoder mechanism and contrastive representation learning advances, we propose a learning-based metric for image captioning.
We develop three progressive model structures to learn the sentence level representations.
Experiment results show that our proposed method can align well with the scores generated from other contemporary metrics.
arXiv Detail & Related papers (2021-06-29T12:27:05Z)
- Weakly-Supervised Aspect-Based Sentiment Analysis via Joint Aspect-Sentiment Topic Embedding [71.2260967797055]
We propose a weakly-supervised approach for aspect-based sentiment analysis.
We learn <sentiment, aspect> joint topic embeddings in the word embedding space.
We then use neural models to generalize the word-level discriminative information.
arXiv Detail & Related papers (2020-10-13T21:33:24Z)
- SKEP: Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis [69.80296394461149]
We introduce Sentiment Knowledge Enhanced Pre-training (SKEP) in order to learn a unified sentiment representation for multiple sentiment analysis tasks.
With the help of automatically-mined knowledge, SKEP conducts sentiment masking and constructs three sentiment knowledge prediction objectives.
Experiments on three kinds of sentiment tasks show that SKEP significantly outperforms a strong pre-training baseline.
arXiv Detail & Related papers (2020-05-12T09:23:32Z)
- SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics [82.51956663674747]
SentiBERT is a variant of BERT that effectively captures compositional sentiment semantics.
We show that SentiBERT achieves competitive performance on phrase-level sentiment classification.
arXiv Detail & Related papers (2020-05-08T15:40:17Z)
- A Deep Neural Framework for Contextual Affect Detection [51.378225388679425]
A short and simple text that carries no emotion on its own can convey strong emotions when read along with its context.
We propose a Contextual Affect Detection framework which learns the inter-dependence of words in a sentence.
arXiv Detail & Related papers (2020-01-28T05:03:15Z)