SentiBERT: A Transferable Transformer-Based Architecture for
Compositional Sentiment Semantics
- URL: http://arxiv.org/abs/2005.04114v4
- Date: Thu, 21 May 2020 04:37:43 GMT
- Title: SentiBERT: A Transferable Transformer-Based Architecture for
Compositional Sentiment Semantics
- Authors: Da Yin, Tao Meng, Kai-Wei Chang
- Abstract summary: SentiBERT is a variant of BERT that effectively captures compositional sentiment semantics.
We show that SentiBERT achieves competitive performance on phrase-level sentiment classification.
- Score: 82.51956663674747
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose SentiBERT, a variant of BERT that effectively captures
compositional sentiment semantics. The model incorporates contextualized
representations with a binary constituency parse tree to capture semantic
composition. Comprehensive experiments demonstrate that SentiBERT achieves
competitive performance on phrase-level sentiment classification. We further
demonstrate that the sentiment composition learned from the phrase-level
annotations on SST can be transferred to other sentiment analysis tasks as well
as related tasks, such as emotion classification tasks. Moreover, we conduct
ablation studies and design visualization methods to understand SentiBERT. We
show that SentiBERT is better than baseline approaches in capturing negation
and contrastive relations and in modeling compositional sentiment semantics.
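To make the composition idea concrete, the following is a minimal, hypothetical sketch of composing contextual token vectors bottom-up over a binary constituency tree, using a simple attention over each node's two children. It is not the authors' exact architecture: the class name, the attention-based composition, the hidden size, and the toy parse are all assumptions for illustration; in SentiBERT the token vectors would come from BERT rather than random tensors.

```python
# Minimal sketch of tree-guided composition over contextual token vectors.
# Hypothetical: the attention-based composition below is an illustrative
# stand-in, not the exact SentiBERT formulation.
import torch
import torch.nn as nn

class BinaryTreeComposer(nn.Module):
    """Builds a vector for each internal node of a binary constituency
    tree by composing the vectors of its two children."""

    def __init__(self, hidden_size: int = 768):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)   # attention over the 2 children
        self.proj = nn.Linear(hidden_size, hidden_size)

    def compose(self, left: torch.Tensor, right: torch.Tensor) -> torch.Tensor:
        children = torch.stack([left, right])                 # (2, hidden)
        weights = torch.softmax(self.score(children), dim=0)  # (2, 1)
        return torch.tanh(self.proj((weights * children).sum(dim=0)))

    def forward(self, token_vecs: torch.Tensor, tree):
        # A tree node is either a token index (leaf) or a (left, right) pair.
        if isinstance(tree, int):
            return token_vecs[tree]
        return self.compose(self.forward(token_vecs, tree[0]),
                            self.forward(token_vecs, tree[1]))

# Usage: in SentiBERT the token vectors come from BERT; random here for brevity.
token_vecs = torch.randn(4, 768)       # e.g. "not", "a", "bad", "movie"
tree = (0, ((1, 2), 3))                # hypothetical binarized parse
root_vec = BinaryTreeComposer()(token_vecs, tree)
print(root_vec.shape)                  # torch.Size([768])
```

A phrase-level sentiment classifier head could then be attached to each node's vector and trained on SST's phrase-level labels, which is how the phrase-level supervision described in the abstract would plausibly attach to the tree.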
Related papers
- Two in One Go: Single-stage Emotion Recognition with Decoupled Subject-context Transformer [78.35816158511523]
We present a single-stage emotion recognition approach, employing a Decoupled Subject-Context Transformer (DSCT) for simultaneous subject localization and emotion classification.
We evaluate our single-stage framework on two widely used context-aware emotion recognition datasets, CAER-S and EMOTIC.
arXiv Detail & Related papers (2024-04-26T07:30:32Z) - SentiCSE: A Sentiment-aware Contrastive Sentence Embedding Framework with Sentiment-guided Textual Similarity [12.954271451359222]
Sentiment-aware pre-trained language models (PLMs) demonstrate impressive results in downstream sentiment analysis tasks.
We propose Sentiment-guided Textual Similarity (SgTS), a novel metric for evaluating the quality of sentiment representations.
We then propose SentiCSE, a novel Sentiment-aware Contrastive Sentence Embedding framework for constructing sentiment representations (a brief illustrative sketch of such an objective appears after this list).
arXiv Detail & Related papers (2024-04-01T13:24:20Z) - Sentiment-Aware Word and Sentence Level Pre-training for Sentiment
Analysis [64.70116276295609]
SentiWSP is a Sentiment-aware pre-trained language model with combined Word-level and Sentence-level Pre-training tasks.
SentiWSP achieves new state-of-the-art performance on various sentence-level and aspect-level sentiment classification benchmarks.
arXiv Detail & Related papers (2022-10-18T12:25:29Z) - Transition-based Abstract Meaning Representation Parsing with Contextual
Embeddings [0.0]
We study a way of combining two of the most successful routes to the meaning of language--statistical language models and symbolic semantics formalisms--in the task of semantic parsing.
We explore the utility of incorporating pretrained context-aware word embeddings--such as BERT and RoBERTa--in the problem of parsing.
arXiv Detail & Related papers (2022-06-13T15:05:24Z) - BiSyn-GAT+: Bi-Syntax Aware Graph Attention Network for Aspect-based
Sentiment Analysis [23.223136577272516]
Aspect-based sentiment analysis aims to align aspects and corresponding sentiments for aspect-specific sentiment polarity inference.
Recently, exploiting dependency syntax information with graph neural networks has been the most popular trend.
We propose a Bi-Syntax aware Graph Attention Network (BiSyn-GAT+) to address this problem.
arXiv Detail & Related papers (2022-04-06T22:18:12Z) - SentiPrompt: Sentiment Knowledge Enhanced Prompt-Tuning for Aspect-Based
Sentiment Analysis [22.758661494710047]
We propose SentiPrompt to tune the language model in the unified framework.
We inject sentiment knowledge regarding aspects, opinions, and polarities into prompts and explicitly model term relations.
Our approach can outperform strong baselines on Triplet Extraction, Pair Extraction, and Aspect Term Extraction with Sentiment Classification.
arXiv Detail & Related papers (2021-09-17T01:56:06Z) - Tell Me Why You Feel That Way: Processing Compositional Dependency for
Tree-LSTM Aspect Sentiment Triplet Extraction (TASTE) [0.0]
We present a hybrid neural-symbolic method utilising a Dependency Tree-LSTM's compositional sentiment parse structure and complementary symbolic rules.
We show that this method has the potential to perform in line with state-of-the-art approaches while also simplifying the data required and providing a degree of interpretability.
arXiv Detail & Related papers (2021-03-10T01:52:10Z) - Unsupervised Distillation of Syntactic Information from Contextualized
Word Representations [62.230491683411536]
We tackle the task of unsupervised disentanglement between semantics and structure in neural language representations.
To this end, we automatically generate groups of sentences which are structurally similar but semantically different.
We demonstrate that our transformation clusters vectors in space by structural properties, rather than by lexical semantics.
arXiv Detail & Related papers (2020-10-11T15:13:18Z) - A Unified Dual-view Model for Review Summarization and Sentiment
Classification with Inconsistency Loss [51.448615489097236]
Acquiring accurate summarization and sentiment from user reviews is an essential component of modern e-commerce platforms.
We propose a novel dual-view model that jointly improves the performance of these two tasks.
Experiment results on four real-world datasets from different domains demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2020-06-02T13:34:11Z) - A Deep Neural Framework for Contextual Affect Detection [51.378225388679425]
A short and simple text that carries no emotion on its own can convey strong emotions when read together with its context.
We propose a Contextual Affect Detection framework which learns the inter-dependence of words in a sentence.
arXiv Detail & Related papers (2020-01-28T05:03:15Z)
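As noted in the SentiCSE entry above, a sentiment-guided contrastive objective can be sketched in a few lines. The sketch below is a generic supervised contrastive loss in which sentences sharing a polarity label act as in-batch positives; the actual SentiCSE objective and its SgTS metric may differ, and the function name, temperature, and dummy data are assumptions.

```python
# Hedged sketch of a sentiment-guided contrastive objective in the spirit of
# SentiCSE: sentences sharing a polarity label act as in-batch positives.
# The exact SentiCSE loss and its SgTS metric may differ; names are illustrative.
import torch
import torch.nn.functional as F

def sentiment_contrastive_loss(embeddings: torch.Tensor,
                               polarity: torch.Tensor,
                               temperature: float = 0.05) -> torch.Tensor:
    """embeddings: (batch, dim) sentence vectors; polarity: (batch,) labels."""
    z = F.normalize(embeddings, dim=-1)
    sim = z @ z.t() / temperature                     # scaled cosine similarities
    diag = torch.eye(len(z), dtype=torch.bool)
    sim = sim.masked_fill(diag, float('-inf'))        # ignore self-similarity
    # Positives: other sentences in the batch with the same polarity label.
    pos = (polarity.unsqueeze(0) == polarity.unsqueeze(1)) & ~diag
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_log_prob = log_prob.masked_fill(~pos, 0.0)    # keep positive pairs only
    loss = -pos_log_prob.sum(dim=1) / pos.sum(dim=1).clamp(min=1)
    return loss.mean()

# Usage with dummy data: 4 sentence embeddings, alternating polarity labels.
emb = torch.randn(4, 768)
labels = torch.tensor([1, 0, 1, 0])
print(sentiment_contrastive_loss(emb, labels))
```

In practice the embeddings would come from a pre-trained language model encoder, and a loss of this kind would typically be combined with the framework's other training signals.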