Aspect-Based Sentiment Analysis with Explicit Sentiment Augmentations
- URL: http://arxiv.org/abs/2312.10961v1
- Date: Mon, 18 Dec 2023 06:31:13 GMT
- Title: Aspect-Based Sentiment Analysis with Explicit Sentiment Augmentations
- Authors: Jihong Ouyang, Zhiyao Yang, Silong Liang, Bing Wang, Yimeng Wang,
Ximing Li
- Abstract summary: Implicit sentiment widely exists in ABSA datasets. We
propose ABSA-ESA, an ABSA method that integrates explicit sentiment
augmentations, and test it on two of the most popular ABSA benchmarks.
- Score: 24.44428585727117
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Aspect-based sentiment analysis (ABSA), a fine-grained sentiment
classification task, has received much attention recently. Many works
investigate sentiment information through opinion words such as "good" and
"bad". However, implicit sentiment is widespread in ABSA datasets: a sentence
may contain no distinct opinion words yet still express sentiment toward the
aspect term. To handle implicit sentiment, this paper proposes ABSA-ESA, an
ABSA method that integrates explicit sentiment augmentations, together with an
ABSA-specific augmentation method to create them. Specifically, we post-train
T5 on rule-based data, employing Syntax Distance Weighting and Unlikelihood
Contrastive Regularization during training to guide the model to generate
explicit sentiment expressions. Meanwhile, we use Constrained Beam Search to
ensure that each augmented sentence contains the aspect terms. We evaluate
ABSA-ESA on two of the most popular ABSA benchmarks. The results show that
ABSA-ESA outperforms the SOTA baselines on implicit and explicit sentiment
accuracy.
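The abstract's use of Constrained Beam Search — forcing a required aspect term to appear in every generated sentence — can be illustrated with a minimal, self-contained sketch. This is not the paper's implementation: the bigram "language model" below is a hypothetical toy stand-in for T5, and the constraint handling is a simple bank-free approximation of lexically constrained decoding.

```python
# Minimal sketch of lexically constrained beam search: every finished
# hypothesis must contain a required "aspect" token. TOY_LM is a
# hypothetical bigram scorer; in practice scores would come from T5.
from math import log

TOY_LM = {  # hypothetical bigram log-probabilities
    "<s>": {"the": log(0.6), "pizza": log(0.4)},
    "the": {"pizza": log(0.7), "service": log(0.3)},
    "pizza": {"is": log(0.9), "</s>": log(0.1)},
    "service": {"is": log(1.0)},
    "is": {"great": log(0.8), "</s>": log(0.2)},
    "great": {"</s>": log(1.0)},
}

def constrained_beam_search(required, beam_size=2, max_len=6):
    # Each beam item: (score, tokens, constraint_met)
    beams = [(0.0, ["<s>"], required == "<s>")]
    finished = []
    for _ in range(max_len):
        candidates = []
        for score, toks, met in beams:
            for nxt, lp in TOY_LM.get(toks[-1], {}).items():
                new_met = met or (nxt == required)
                if nxt == "</s>":
                    if new_met:  # only accept hypotheses containing the aspect
                        finished.append((score + lp, toks))
                else:
                    candidates.append((score + lp, toks + [nxt], new_met))
        # Keep the top-k partial hypotheses, preferring those that already
        # satisfy the constraint (a crude approximation of constraint banks).
        candidates.sort(key=lambda c: (c[2], c[0]), reverse=True)
        beams = candidates[:beam_size]
        if not beams:
            break
    finished.sort(key=lambda f: f[0], reverse=True)
    return finished[0][1] if finished else None

print(constrained_beam_search("service"))
# → ['<s>', 'the', 'service', 'is', 'great']
```

Note that an unconstrained search over this toy model would prefer beginning with "pizza"-bearing continuations; the constraint steers decoding toward hypotheses containing "service" and rejects any finished sentence without it, which mirrors the guarantee the paper needs for aspect terms.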
Related papers
- DimABSA: Building Multilingual and Multidomain Datasets for Dimensional Aspect-Based Sentiment Analysis [57.70022214686838]
DimABSA is the first multilingual, dimensional ABSA resource annotated with both traditional ABSA elements and VA scores.
This resource contains 76,958 aspect instances across 42,590 sentences, spanning six languages and four domains.
arXiv Detail & Related papers (2026-01-30T14:30:35Z)
- PanoSent: A Panoptic Sextuple Extraction Benchmark for Multimodal Conversational Aspect-based Sentiment Analysis [74.41260927676747]
This paper bridges the gaps by introducing multimodal conversational Aspect-based Sentiment Analysis (ABSA).
To benchmark the tasks, we construct PanoSent, a dataset annotated both manually and automatically, featuring high quality, large scale, multimodality, multilingualism, multi-scenarios, and covering both implicit and explicit sentiment elements.
To effectively address the tasks, we devise a novel Chain-of-Sentiment reasoning framework, together with a novel multimodal large language model (namely Sentica) and a paraphrase-based verification mechanism.
arXiv Detail & Related papers (2024-08-18T13:51:01Z)
- ROAST: Review-level Opinion Aspect Sentiment Target Joint Detection for ABSA [50.90538760832107]
This research presents a novel task: Review-Level Opinion Aspect Sentiment Target (ROAST) joint detection.
ROAST seeks to close the gap between sentence-level and text-level ABSA by identifying every ABSA constituent at the review level.
We extend the available datasets to enable ROAST, addressing the drawbacks noted in previous research.
arXiv Detail & Related papers (2024-05-30T17:29:15Z)
- OATS: Opinion Aspect Target Sentiment Quadruple Extraction Dataset for Aspect-Based Sentiment Analysis [55.61047894397937]
Aspect-based sentiment analysis (ABSA) delves into understanding sentiments specific to distinct elements within a user-generated review.
We introduce the OATS dataset, which encompasses three fresh domains and consists of 27,470 sentence-level and 17,092 review-level quadruples.
Our initiative seeks to bridge specific observed gaps: the recurrent focus on familiar domains like restaurants and laptops, limited data for intricate quadruple extraction tasks, and an occasional oversight of the synergy between sentence and review-level sentiments.
arXiv Detail & Related papers (2023-09-23T07:39:16Z)
- BERT-ASC: Auxiliary-Sentence Construction for Implicit Aspect Learning in Sentiment Analysis [4.522719296659495]
This paper proposes a unified framework to address aspect categorization and aspect-based sentiment subtasks.
We introduce a mechanism to construct an auxiliary-sentence for the implicit aspect using the corpus's semantic information.
We then encourage BERT to learn aspect-specific representations in response to this auxiliary sentence rather than the aspect itself.
arXiv Detail & Related papers (2022-03-22T13:12:27Z)
- A Survey on Aspect-Based Sentiment Analysis: Tasks, Methods, and Challenges [58.97831696674075]
ABSA aims to analyze and understand people's opinions at the aspect level.
We provide a new taxonomy for ABSA that organizes existing studies along the axes of the sentiment elements concerned.
We summarize the utilization of pre-trained language models for ABSA, which has pushed ABSA performance to a new stage.
arXiv Detail & Related papers (2022-03-02T12:01:46Z)
- Zero-Shot Aspect-Based Sentiment Analysis [19.367516271270446]
This paper aims to train a unified model that can perform zero-shot ABSA without using any annotated data for a new domain.
We evaluate CORN on ABSA tasks ranging from aspect extraction (AE) and aspect sentiment classification (ASC) to end-to-end aspect-based sentiment analysis (E2E ABSA).
arXiv Detail & Related papers (2022-02-04T00:51:46Z)
- Enhanced Aspect-Based Sentiment Analysis Models with Progressive Self-supervised Attention Learning [103.0064298630794]
In aspect-based sentiment analysis (ABSA), many neural models are equipped with an attention mechanism to quantify the contribution of each context word to sentiment prediction.
We propose a progressive self-supervised attention learning approach for attentional ABSA models.
We integrate the proposed approach into three state-of-the-art neural ABSA models.
arXiv Detail & Related papers (2021-03-05T02:50:05Z)
- Understanding Pre-trained BERT for Aspect-based Sentiment Analysis [71.40586258509394]
This paper analyzes the pre-trained hidden representations learned from reviews on BERT for tasks in aspect-based sentiment analysis (ABSA).
It is not clear how the general proxy task of (masked) language modeling, trained on an unlabeled corpus without annotations of aspects or opinions, can provide important features for downstream ABSA tasks.
arXiv Detail & Related papers (2020-10-31T02:21:43Z)
- A Position Aware Decay Weighted Network for Aspect based Sentiment Analysis [3.1473798197405944]
In ABSA, a text can have multiple sentiments depending upon each aspect.
Most existing approaches for ATSA incorporate aspect information through a separate subnetwork.
In this paper, we propose a model that leverages the positional information of the aspect.
arXiv Detail & Related papers (2020-05-03T09:22:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.