ATP: A holistic attention integrated approach to enhance ABSA
- URL: http://arxiv.org/abs/2208.02653v1
- Date: Thu, 4 Aug 2022 13:32:56 GMT
- Title: ATP: A holistic attention integrated approach to enhance ABSA
- Authors: Ashish Kumar (1), Vasundhra Dahiya (2), Aditi Sharan (1) ((1)
Jawaharlal Nehru University, New Delhi, India, (2) Indian Institute of
Technology, Jodhpur, India)
- Abstract summary: Aspect-based sentiment analysis (ABSA) deals with the identification of the sentiment polarity of a review sentence towards a given aspect.
We propose a method that captures position-based information using the dependency parsing tree.
We performed experiments on the SemEval'14 dataset to demonstrate the effect of dependency parsing relation-based attention for ABSA.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Aspect-based sentiment analysis (ABSA) deals with identifying the
sentiment polarity of a review sentence towards a given aspect. Deep learning
sequential models such as RNNs, LSTMs, and GRUs are the current
state-of-the-art methods for inferring sentiment polarity. These methods
capture the contextual relationships between the words of a review sentence
well; however, they fall short in capturing long-term dependencies. The
attention mechanism helps by focusing only on the most crucial parts of the
sentence. In ABSA, the position of the aspect plays a vital role: words near
the aspect contribute more when determining the sentiment towards it. We
therefore propose a method that captures position-based information using the
dependency parsing tree and helps the attention mechanism. Using this type of
position information instead of a simple word-distance-based position enhances
the deep learning model's performance. We performed experiments on the
SemEval'14 dataset to demonstrate the effect of dependency parsing
relation-based attention for ABSA.
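As a rough illustration of the idea in the abstract (a minimal sketch, not the authors' released code), the snippet below builds a graph over spaCy's dependency arcs and uses each word's shortest-path distance to the aspect as its position signal, converting it into inverse-distance weights that an attention layer could consume. The spaCy model name, the networkx graph construction, and the inverse-distance weighting scheme are assumptions made for illustration.

```python
# Minimal sketch of dependency-tree-based position weights for ABSA attention.
# Not the paper's implementation: the spaCy model, graph construction, and
# inverse-distance weighting are illustrative assumptions.
import spacy
import networkx as nx

nlp = spacy.load("en_core_web_sm")  # assumed English pipeline with a dependency parser

def dependency_position_weights(sentence: str, aspect: str):
    """Return (tokens, weights); tokens closer to the aspect in the parse tree get larger weights."""
    doc = nlp(sentence)
    graph = nx.Graph()
    for token in doc:
        graph.add_node(token.i)
        if token.head.i != token.i:              # the root points to itself; skip that self-loop
            graph.add_edge(token.i, token.head.i)
    # First token matching the aspect (multi-word aspects would need extra handling).
    aspect_idx = next(t.i for t in doc if t.text.lower() == aspect.lower())
    distances = nx.single_source_shortest_path_length(graph, aspect_idx)
    weights = [1.0 / (1 + distances.get(t.i, len(doc))) for t in doc]
    return [t.text for t in doc], weights

tokens, weights = dependency_position_weights(
    "The food was great but the service was slow", "food")
print(list(zip(tokens, [round(w, 2) for w in weights])))
```

Words that are syntactically tied to the aspect stay close to it in the tree even when many tokens separate them on the surface, which is the distinction the abstract draws against a simple word-distance-based position.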
Related papers
- Amplifying Aspect-Sentence Awareness: A Novel Approach for Aspect-Based Sentiment Analysis [2.9045498954705886]
Aspect-Based Sentiment Analysis (ABSA) is increasingly crucial in Natural Language Processing (NLP).
ABSA goes beyond traditional sentiment analysis by extracting sentiments related to specific aspects mentioned in the text.
We propose Amplifying Aspect-Sentence Awareness (A3SN), a novel technique designed to enhance ABSA through amplifying aspect-sentence awareness attention.
arXiv Detail & Related papers (2024-05-14T10:29:59Z)
- A Hybrid Approach To Aspect Based Sentiment Analysis Using Transfer Learning [3.30307212568497]
We propose a hybrid approach for Aspect Based Sentiment Analysis using transfer learning.
The approach focuses on generating weakly-supervised annotations by exploiting the strengths of both large language models (LLM) and traditional syntactic dependencies.
arXiv Detail & Related papers (2024-03-25T23:02:33Z)
- Object Localization under Single Coarse Point Supervision [107.46800858130658]
We propose a POL method using coarse point annotations, relaxing the supervision signals from accurate key points to freely spotted points.
CPR constructs point bags, selects semantic-correlated points, and produces semantic center points through multiple instance learning (MIL).
In this way, CPR defines a weakly supervised evolution procedure, which ensures training high-performance object localizer under coarse point supervision.
arXiv Detail & Related papers (2022-03-17T14:14:11Z)
- A Simple Information-Based Approach to Unsupervised Domain-Adaptive Aspect-Based Sentiment Analysis [58.124424775536326]
We propose a simple but effective technique based on mutual information to extract aspect terms.
Experiment results show that our proposed method outperforms the state-of-the-art methods for cross-domain ABSA by 4.32% Micro-F1.
arXiv Detail & Related papers (2022-01-29T10:18:07Z)
- Contextualized Semantic Distance between Highly Overlapped Texts [85.1541170468617]
Overlapping frequently occurs in paired texts in natural language processing tasks like text editing and semantic similarity evaluation.
This paper aims to address the issue with a mask-and-predict strategy.
We take the words in the longest common sequence as neighboring words and use masked language modeling (MLM) to predict the distributions on their positions.
Experiments on Semantic Textual Similarity show NDD to be more sensitive to various semantic differences, especially on highly overlapped paired texts.
arXiv Detail & Related papers (2021-10-04T03:59:15Z)
- Deep Context- and Relation-Aware Learning for Aspect-based Sentiment Analysis [3.7175198778996483]
We propose Deep Contextualized Relation-Aware Network (DCRAN), which allows interactive relations among subtasks with deep contextual information.
DCRAN significantly outperforms previous state-of-the-art methods by large margins on three widely used benchmarks.
arXiv Detail & Related papers (2021-06-07T17:16:15Z)
- Enhanced Aspect-Based Sentiment Analysis Models with Progressive Self-supervised Attention Learning [103.0064298630794]
In aspect-based sentiment analysis (ABSA), many neural models are equipped with an attention mechanism to quantify the contribution of each context word to sentiment prediction.
We propose a progressive self-supervised attention learning approach for attentional ABSA models.
We integrate the proposed approach into three state-of-the-art neural ABSA models.
arXiv Detail & Related papers (2021-03-05T02:50:05Z)
- Understanding Pre-trained BERT for Aspect-based Sentiment Analysis [71.40586258509394]
This paper analyzes the pre-trained hidden representations learned from reviews on BERT for tasks in aspect-based sentiment analysis (ABSA).
It is not clear how the general proxy task of (masked) language modeling, trained on an unlabeled corpus without annotations of aspects or opinions, can provide important features for downstream ABSA tasks.
arXiv Detail & Related papers (2020-10-31T02:21:43Z)
- Weakly-Supervised Aspect-Based Sentiment Analysis via Joint Aspect-Sentiment Topic Embedding [71.2260967797055]
We propose a weakly-supervised approach for aspect-based sentiment analysis.
We learn <sentiment, aspect> joint topic embeddings in the word embedding space.
We then use neural models to generalize the word-level discriminative information.
arXiv Detail & Related papers (2020-10-13T21:33:24Z)
- Simple Unsupervised Similarity-Based Aspect Extraction [0.9558392439655015]
We propose a simple approach called SUAEx for aspect extraction.
SUAEx is unsupervised and relies solely on the similarity of word embeddings.
Experimental results on datasets from three different domains have shown that SUAEx achieves results that can outperform the state-of-the-art attention-based approach at a fraction of the time.
arXiv Detail & Related papers (2020-08-25T04:58:07Z)
- A Position Aware Decay Weighted Network for Aspect based Sentiment Analysis [3.1473798197405944]
In ABSA, a text can express a different sentiment towards each aspect.
Most of the existing approaches for ATSA incorporate aspect information through a separate subnetwork.
In this paper, we propose a model that leverages the positional information of the aspect (a simple sketch of this kind of position-decay weighting appears after this list).
arXiv Detail & Related papers (2020-05-03T09:22:03Z)
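For contrast, here is a minimal sketch of the simple word-distance-based position decay weighting referenced both in the abstract above and in the last related paper; the exponential decay form and its rate are illustrative assumptions, not taken from either paper.

```python
# Minimal sketch of simple word-distance-based position decay weights.
# The exponential decay and its rate are illustrative assumptions.
import math

def word_distance_decay_weights(tokens, aspect_index, decay=0.5):
    """Weight each token by exp(-decay * |i - aspect_index|), so nearer words count more."""
    return [math.exp(-decay * abs(i - aspect_index)) for i in range(len(tokens))]

tokens = "The food was great but the service was slow".split()
weights = word_distance_decay_weights(tokens, tokens.index("food"))
print(list(zip(tokens, [round(w, 2) for w in weights])))
```

Only surface proximity matters here, so inserting extra words between an opinion word and the aspect dilutes its weight; the dependency-tree distance sketched earlier is insensitive to such surface gaps, which is the comparison the main paper draws.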