BERT-ASC: Auxiliary-Sentence Construction for Implicit Aspect Learning in Sentiment Analysis
- URL: http://arxiv.org/abs/2203.11702v3
- Date: Fri, 23 Aug 2024 07:46:00 GMT
- Title: BERT-ASC: Auxiliary-Sentence Construction for Implicit Aspect Learning in Sentiment Analysis
- Authors: Murtadha Ahmed, Bo Wen, Shengfeng Pan, Jianlin Su, Luo Ao, Yunfeng Liu
- Abstract summary: This paper proposes a unified framework that addresses the aspect categorization and aspect-based sentiment subtasks.
We introduce a mechanism that constructs an auxiliary sentence for the implicit aspect from the corpus's semantic information.
We then encourage BERT to learn aspect-specific representations in response to this auxiliary sentence rather than to the aspect itself.
- Score: 4.522719296659495
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Aspect-based sentiment analysis (ABSA) aims to associate a text with a set of aspects and infer their respective sentiment polarities. State-of-the-art approaches are built on fine-tuning pre-trained language models, focusing on learning aspect-specific representations from the corpus. However, aspects are often expressed implicitly, making the mapping challenging without sufficient labeled examples, which may be scarce in real-world scenarios. This paper proposes a unified framework to address the aspect categorization and aspect-based sentiment subtasks. We introduce a mechanism to construct an auxiliary sentence for the implicit aspect using the corpus's semantic information. We then encourage BERT to learn aspect-specific representations in response to this auxiliary sentence rather than to the aspect itself. We evaluate our approach on real benchmark datasets for both ABSA and Targeted-ABSA tasks. Our experiments show that it consistently achieves state-of-the-art performance in aspect categorization and aspect-based sentiment across all datasets, by considerable margins. The BERT-ASC code is available at https://github.com/amurtadha/BERT-ASC.
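To make the auxiliary-sentence idea concrete, here is a minimal sketch: the review and a constructed auxiliary sentence are encoded as a BERT sentence pair, and the [CLS] vector serves as the aspect-specific representation. This is not the authors' implementation (see the repository above); the example review, the hard-coded seed-word auxiliary sentence, and the [CLS] pooling are illustrative assumptions. In the paper's setting, the auxiliary sentence is derived from corpus-level semantic information rather than written by hand.

```python
# Minimal sketch of the auxiliary-sentence idea: pair the review with a
# constructed sentence for the (implicit) aspect and let BERT's sentence-pair
# encoding produce an aspect-specific representation. NOT the authors' code
# (see https://github.com/amurtadha/BERT-ASC); seeds/pooling are assumptions.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

review = "The waiter never came back to refill our drinks."
# Hypothetical auxiliary sentence: aspect name plus corpus-derived seed words
# that are semantically indicative of the implicit aspect.
auxiliary = "service : waiter staff refill attentive"

# BERT sentence-pair input: [CLS] review [SEP] auxiliary [SEP]
inputs = tokenizer(review, auxiliary, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Use the [CLS] vector as the aspect-specific representation; a linear
# classification head over it would predict category presence or polarity.
aspect_repr = outputs.last_hidden_state[:, 0]
print(aspect_repr.shape)  # torch.Size([1, 768])
```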
Related papers
- A Hybrid Approach To Aspect Based Sentiment Analysis Using Transfer Learning [3.30307212568497]
We propose a hybrid approach for Aspect Based Sentiment Analysis using transfer learning.
The approach focuses on generating weakly-supervised annotations by exploiting the strengths of both large language models (LLMs) and traditional syntactic dependencies.
arXiv Detail & Related papers (2024-03-25T23:02:33Z)
- Incorporating Dynamic Semantics into Pre-Trained Language Model for Aspect-based Sentiment Analysis [67.41078214475341]
We propose Dynamic Re-weighting BERT (DR-BERT) to learn dynamic aspect-oriented semantics for ABSA.
Specifically, we first take the Stack-BERT layers as a primary encoder to grasp the overall semantics of the sentence.
We then fine-tune it by incorporating a lightweight Dynamic Re-weighting Adapter (DRA). (A minimal sketch of this re-weighting idea follows the entry.)
arXiv Detail & Related papers (2022-03-30T14:48:46Z)
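The DRA suggests a simple pattern: score each token against the aspect and re-weight the encoder states accordingly. The sketch below is a generic, hypothetical adapter in that spirit, not the released DR-BERT code; the scoring function and dimensions are assumptions.

```python
# Illustrative sketch (not the DR-BERT release): a lightweight adapter that
# re-weights encoder token states by their relevance to an aspect embedding,
# in the spirit of the Dynamic Re-weighting Adapter (DRA).
import torch
import torch.nn as nn

class DynamicReweightingAdapter(nn.Module):
    def __init__(self, hidden: int = 768):
        super().__init__()
        self.score = nn.Linear(2 * hidden, 1)  # [token; aspect] -> relevance

    def forward(self, token_states, aspect_vec):
        # token_states: (batch, seq, hidden); aspect_vec: (batch, hidden)
        aspect = aspect_vec.unsqueeze(1).expand_as(token_states)
        weights = torch.softmax(
            self.score(torch.cat([token_states, aspect], dim=-1)).squeeze(-1),
            dim=-1,
        )  # (batch, seq): distribution over tokens w.r.t. the aspect
        return weights.unsqueeze(-1) * token_states  # re-weighted states

adapter = DynamicReweightingAdapter()
states = torch.randn(2, 16, 768)   # e.g. output of the primary encoder
aspect = torch.randn(2, 768)       # pooled aspect representation
print(adapter(states, aspect).shape)  # torch.Size([2, 16, 768])
```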
- Out of Context: A New Clue for Context Modeling of Aspect-based Sentiment Analysis [54.735400754548635]
ABSA aims to predict the sentiment expressed in a review with respect to a given aspect.
The given aspect should be considered a new clue, out of context, in the context modeling process.
We design several aspect-aware context encoders based on different backbones.
arXiv Detail & Related papers (2021-06-21T02:26:03Z)
- Understanding Pre-trained BERT for Aspect-based Sentiment Analysis [71.40586258509394]
This paper analyzes the pre-trained hidden representations that BERT learns from reviews for tasks in aspect-based sentiment analysis (ABSA).
It is not clear how the general proxy task of a (masked) language model, trained on an unlabeled corpus without annotations of aspects or opinions, can provide important features for downstream ABSA tasks.
arXiv Detail & Related papers (2020-10-31T02:21:43Z)
- Context-Guided BERT for Targeted Aspect-Based Sentiment Analysis [14.394987796101349]
We investigate whether adding context to self-attention models improves performance on (T)ABSA.
We propose two variants of Context-Guided BERT (CG-BERT) that learn to distribute attention under different contexts.
Our work provides more evidence for the utility of adding context dependencies to pretrained self-attention-based language models for context-based natural language tasks. (A hypothetical sketch of context-guided attention follows the entry.)
arXiv Detail & Related papers (2020-10-15T05:01:20Z)
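One common way to let a context "guide" attention is to add a context-dependent bias to the attention logits, shifting attention mass under different (target, aspect) contexts. The sketch below illustrates that general idea only; it is a hypothetical simplification, not the actual CG-BERT formulation, and all names are illustrative.

```python
# Hypothetical sketch of context-guided attention: a context vector adds a
# per-key bias to self-attention logits. Not the CG-BERT formulation itself.
import torch
import torch.nn as nn

class ContextGuidedAttention(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.q = nn.Linear(hidden, hidden)
        self.k = nn.Linear(hidden, hidden)
        self.v = nn.Linear(hidden, hidden)
        self.ctx_bias = nn.Linear(hidden, 1)  # context -> per-key logit bias

    def forward(self, x, context):
        # x: (batch, seq, hidden); context: (batch, hidden)
        q, k, v = self.q(x), self.k(x), self.v(x)
        logits = q @ k.transpose(-2, -1) / k.size(-1) ** 0.5
        # Bias every query's logits toward keys favored by the context.
        bias = self.ctx_bias(x + context.unsqueeze(1)).transpose(-2, -1)
        attn = torch.softmax(logits + bias, dim=-1)
        return attn @ v

layer = ContextGuidedAttention()
out = layer(torch.randn(2, 10, 64), torch.randn(2, 64))
print(out.shape)  # torch.Size([2, 10, 64])
```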
- Weakly-Supervised Aspect-Based Sentiment Analysis via Joint Aspect-Sentiment Topic Embedding [71.2260967797055]
We propose a weakly-supervised approach for aspect-based sentiment analysis.
We learn <sentiment, aspect> joint topic embeddings in the word embedding space.
We then use neural models to generalize the word-level discriminative information. (A toy sketch of joint topic embeddings follows the entry.)
arXiv Detail & Related papers (2020-10-13T21:33:24Z)
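The joint topic embedding idea can be illustrated with a toy example: each <sentiment, aspect> pair is anchored by a few seed words, its topic embedding is the seeds' mean vector, and other words are assigned to the nearest topic in embedding space. The vectors and seeds below are toy data, not the paper's learned embeddings, which are trained jointly over a corpus.

```python
# Toy sketch of <sentiment, aspect> joint topic embeddings: topics anchored
# by seed words, embeddings = mean of seed vectors, nearest-topic assignment.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["delicious", "bland", "friendly", "rude", "tasty", "slow"]
emb = {w: rng.normal(size=8) for w in vocab}  # stand-in word embeddings

seeds = {
    ("positive", "food"): ["delicious", "tasty"],
    ("negative", "food"): ["bland"],
    ("positive", "service"): ["friendly"],
    ("negative", "service"): ["rude", "slow"],
}
topics = {t: np.mean([emb[w] for w in ws], axis=0) for t, ws in seeds.items()}

def assign(word):
    # Cosine similarity to every joint topic; pick the closest.
    v = emb[word]
    sims = {t: v @ c / (np.linalg.norm(v) * np.linalg.norm(c))
            for t, c in topics.items()}
    return max(sims, key=sims.get)

print(assign("tasty"))  # likely ('positive', 'food') with these toy vectors
```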
- Consensus-Aware Visual-Semantic Embedding for Image-Text Matching [69.34076386926984]
Image-text matching plays a central role in bridging vision and language.
Most existing approaches only rely on the image-text instance pair to learn their representations.
We propose a Consensus-aware Visual-Semantic Embedding model to incorporate the consensus information.
arXiv Detail & Related papers (2020-07-17T10:22:57Z)
- A Hybrid Approach for Aspect-Based Sentiment Analysis Using Deep Contextual Word Embeddings and Hierarchical Attention [4.742874328556818]
We extend the state-of-the-art Hybrid Approach for Aspect-Based Sentiment Analysis (HAABSA) in two directions.
First, we replace the non-contextual word embeddings with deep contextual word embeddings to better capture the word semantics of a given text.
Second, we add an extra attention layer over the HAABSA high-level representations, increasing the method's flexibility in modeling the input data. (A generic sketch of such an attention layer follows the entry.)
arXiv Detail & Related papers (2020-04-18T17:54:55Z)
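An "extra attention layer" over high-level representations typically means attention pooling: learn a weight per position and take the weighted sum. The sketch below shows that generic mechanism under assumed dimensions; it is not the authors' HAABSA implementation.

```python
# Generic attention-pooling layer in the spirit of the extended HAABSA:
# pool high-level (e.g. contextual) representations into one summary vector.
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    def __init__(self, hidden: int = 768):
        super().__init__()
        self.proj = nn.Linear(hidden, 1)

    def forward(self, states):
        # states: (batch, seq, hidden) high-level representations
        weights = torch.softmax(self.proj(states).squeeze(-1), dim=-1)
        return (weights.unsqueeze(-1) * states).sum(dim=1)  # (batch, hidden)

pool = AttentionPooling()
print(pool(torch.randn(2, 12, 768)).shape)  # torch.Size([2, 768])
```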
- A Dependency Syntactic Knowledge Augmented Interactive Architecture for End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model fully exploits syntactic knowledge (dependency relations and their types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn).
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach. (A minimal dependency-GCN sketch follows the entry.)
arXiv Detail & Related papers (2020-04-04T14:59:32Z)
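A dependency-aware GCN updates each token state from its syntactic neighbors. The sketch below shows a minimal such layer over a row-normalized dependency adjacency matrix; DreGcn's relation-type embeddings and multi-task interaction are omitted, and all names and dimensions are illustrative assumptions.

```python
# Minimal graph convolution over a dependency parse, the building block
# behind models like DreGcn: each token state averages its neighbors'.
import torch
import torch.nn as nn

class DepGCNLayer(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.lin = nn.Linear(hidden, hidden)

    def forward(self, x, adj):
        # x: (batch, seq, hidden); adj: (batch, seq, seq) dependency edges
        # with self-loops, row-normalized so each token averages its
        # syntactic neighborhood.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        return torch.relu(self.lin((adj / deg) @ x))

layer = DepGCNLayer()
x = torch.randn(1, 5, 64)
adj = torch.eye(5).unsqueeze(0)      # self-loops
adj[0, 1, 2] = adj[0, 2, 1] = 1.0    # edge from one dependency arc
print(layer(x, adj).shape)  # torch.Size([1, 5, 64])
```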
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences arising from its use.