BERT-ASC: Auxiliary-Sentence Construction for Implicit Aspect Learning
in Sentiment Analysis
- URL: http://arxiv.org/abs/2203.11702v1
- Date: Tue, 22 Mar 2022 13:12:27 GMT
- Title: BERT-ASC: Auxiliary-Sentence Construction for Implicit Aspect Learning
in Sentiment Analysis
- Authors: Ahmed Murtadha, Shengfeng Pan, Bo Wen, Jianlin Su, Wenze Zhang,
Yunfeng Liu
- Abstract summary: We propose to address aspect categorization and aspect-based sentiment subtasks in a unified framework.
We first introduce a mechanism that combines semantic and syntactic information to construct auxiliary-sentences for the implicit aspect.
We then encourage BERT to learn the aspect-specific representation in response to the automatically constructed auxiliary-sentence.
- Score: 4.008465268899542
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The aspect-based sentiment analysis (ABSA) task aims to associate a piece of
text with a set of aspects and to infer their respective sentiment
polarities. To date, the state-of-the-art approaches are built upon
fine-tuning of various pre-trained language models; they commonly aim to learn
the aspect-specific representation from the corpus. Unfortunately, the aspect is
often expressed implicitly, through a set of representative words, which renders
the mapping unattainable without sufficient labeled examples.
In this paper, we propose to jointly address the aspect categorization and
aspect-based sentiment subtasks in a unified framework. Specifically, we first
introduce a simple but effective mechanism that combines semantic and
syntactic information to construct auxiliary-sentences for the implicit aspect.
Then, we encourage BERT to learn the aspect-specific representation in response
to the automatically constructed auxiliary-sentence instead of the aspect
itself. Finally, we empirically evaluate the performance of the proposed
solution by a comparative study on real benchmark datasets for both ABSA and
Targeted-ABSA tasks. Our extensive experiments show that it consistently
achieves state-of-the-art performance in terms of aspect categorization and
aspect-based sentiment across all datasets and the improvement margins are
considerable.
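The abstract does not spell out how the semantic and syntactic components are combined. As a rough illustration of the general idea only (toy word vectors, a cosine-similarity threshold, and a hand-written dependency-neighbor map are all hypothetical stand-ins for the paper's actual components), an auxiliary sentence for an implicit aspect might be assembled like this:

```python
import math

# Toy word vectors standing in for real pretrained embeddings (hypothetical).
EMB = {
    "pizza":   [0.9, 0.1, 0.0],
    "waiter":  [0.1, 0.9, 0.0],
    "tasty":   [0.8, 0.2, 0.1],
    "rude":    [0.1, 0.8, 0.2],
    "food":    [0.95, 0.05, 0.0],
    "service": [0.05, 0.95, 0.0],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def build_auxiliary_sentence(tokens, aspect, dep_neighbors, threshold=0.7):
    """Collect semantic representatives (words embedding-similar to the
    aspect) and syntactic representatives (their dependency neighbors),
    then join them into an auxiliary sentence for the implicit aspect."""
    semantic = [t for t in tokens
                if t in EMB and cosine(EMB[t], EMB[aspect]) >= threshold]
    syntactic = [n for t in semantic for n in dep_neighbors.get(t, ())
                 if n in tokens and n not in semantic]
    return " ".join(semantic + syntactic)

tokens = ["the", "pizza", "was", "tasty", "but", "the", "waiter", "was", "rude"]
deps = {"pizza": ["tasty"], "waiter": ["rude"]}  # hypothetical dependency arcs

aux_food = build_auxiliary_sentence(tokens, "food", deps)  # "pizza tasty"
```

BERT would then encode the pair "[CLS] review [SEP] auxiliary sentence [SEP]" so the model attends to an explicit surrogate for the implicit aspect rather than the aspect label alone.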
Related papers
- A Hybrid Approach To Aspect Based Sentiment Analysis Using Transfer Learning [3.30307212568497]
We propose a hybrid approach for Aspect Based Sentiment Analysis using transfer learning.
The approach focuses on generating weakly-supervised annotations by exploiting the strengths of both large language models (LLMs) and traditional syntactic dependencies.
arXiv: 2024-03-25
- Incorporating Dynamic Semantics into Pre-Trained Language Model for Aspect-based Sentiment Analysis [67.41078214475341]
We propose Dynamic Re-weighting BERT (DR-BERT) to learn dynamic aspect-oriented semantics for ABSA.
Specifically, we first take the Stack-BERT layers as a primary encoder to grasp the overall semantic of the sentence.
We then fine-tune it by incorporating a lightweight Dynamic Re-weighting Adapter (DRA).
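The DRA's internals are not given in this summary; as a generic, pure-Python illustration of aspect-oriented re-weighting (the relevance scores here are made-up numbers standing in for learned parameters, not DR-BERT's actual mechanism), token hidden states can be re-scaled by a softmax over aspect-relevance scores:

```python
import math

def reweight(hidden_states, relevance_scores):
    """Scale each token's hidden state by a softmax weight over its
    relevance to the aspect (a toy stand-in for a learned adapter)."""
    m = max(relevance_scores)                      # for numerical stability
    exps = [math.exp(s - m) for s in relevance_scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    scaled = [[w * x for x in h] for w, h in zip(weights, hidden_states)]
    return scaled, weights

# Three token states (dim 2) with toy aspect-relevance scores.
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
scores = [2.0, 0.5, 0.5]
reweighted, w = reweight(states, scores)
```

Tokens scored as more aspect-relevant keep a larger share of their representation, which is the intuition behind dynamically emphasizing aspect-oriented semantics.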
arXiv: 2022-03-30
- Out of Context: A New Clue for Context Modeling of Aspect-based Sentiment Analysis [54.735400754548635]
ABSA aims to predict the sentiment expressed in a review with respect to a given aspect.
The given aspect should be considered as a new clue out of context in the context modeling process.
We design several aspect-aware context encoders based on different backbones.
arXiv: 2021-06-21
- Understanding Pre-trained BERT for Aspect-based Sentiment Analysis [71.40586258509394]
This paper analyzes the pre-trained hidden representations that BERT learns from reviews for tasks in aspect-based sentiment analysis (ABSA).
It is not clear how the general proxy task of (masked) language model trained on unlabeled corpus without annotations of aspects or opinions can provide important features for downstream tasks in ABSA.
arXiv: 2020-10-31
- Context-Guided BERT for Targeted Aspect-Based Sentiment Analysis [14.394987796101349]
We investigate whether adding context to self-attention models improves performance on (T)ABSA.
We propose two variants of Context-Guided BERT (CG-BERT) that learn to distribute attention under different contexts.
Our work provides more evidence for the utility of adding context-dependencies to pretrained self-attention-based language models for context-based natural language tasks.
arXiv: 2020-10-15
- Weakly-Supervised Aspect-Based Sentiment Analysis via Joint Aspect-Sentiment Topic Embedding [71.2260967797055]
We propose a weakly-supervised approach for aspect-based sentiment analysis.
We learn <sentiment, aspect> joint topic embeddings in the word embedding space.
We then use neural models to generalize the word-level discriminative information.
arXiv: 2020-10-13
- Consensus-Aware Visual-Semantic Embedding for Image-Text Matching [69.34076386926984]
Image-text matching plays a central role in bridging vision and language.
Most existing approaches only rely on the image-text instance pair to learn their representations.
We propose a Consensus-aware Visual-Semantic Embedding model to incorporate the consensus information.
arXiv: 2020-07-17
- A Hybrid Approach for Aspect-Based Sentiment Analysis Using Deep Contextual Word Embeddings and Hierarchical Attention [4.742874328556818]
We extend the state-of-the-art Hybrid Approach for Aspect-Based Sentiment Analysis (HAABSA) in two directions.
First, we replace the non-contextual word embeddings with deep contextual word embeddings to better cope with word semantics in a given text.
Second, we use hierarchical attention by adding an extra attention layer to the HAABSA high-level representations to increase the method's flexibility in modeling the input data.
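An extra attention layer over high-level representations can be sketched in a few lines. This toy pooling step (the query vector and representations are invented; HAABSA's actual layer is learned end-to-end) shows the shape of the idea: score each representation against an aspect query, softmax the scores, and return the weighted sum:

```python
import math

def attention_pool(reps, query):
    """Score each high-level representation against a query vector and
    return the attention-weighted sum (one extra attention layer,
    as a toy stand-in for HAABSA's hierarchical attention)."""
    scores = [sum(q * r for q, r in zip(query, rep)) for rep in reps]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(reps[0])
    pooled = [sum(w * rep[i] for w, rep in zip(weights, reps))
              for i in range(dim)]
    return pooled, weights

reps = [[1.0, 0.0], [0.0, 1.0]]   # two toy high-level representations
query = [1.0, 0.0]                # hypothetical aspect query vector
pooled, weights = attention_pool(reps, query)
```

Stacking such a layer on top of existing representations is what adds the "hierarchical" second level of attention.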
arXiv: 2020-04-18
- A Dependency Syntactic Knowledge Augmented Interactive Architecture for End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model fully exploits syntactic knowledge (dependency relations and types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn).
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach.
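The core of any such graph convolution over a dependency tree is neighborhood aggregation along the arcs. A minimal sketch of one such step (mean aggregation over hand-written arcs; the real DreGcn additionally embeds dependency relation types, which this toy omits):

```python
def dep_gcn_layer(features, edges, num_nodes):
    """One graph-convolution step over dependency arcs: each token's new
    feature is the mean of its own feature and its neighbors' features
    (a toy stand-in for DreGcn, which also embeds relation types)."""
    # Build a symmetric adjacency set with self-loops.
    neighbors = {i: {i} for i in range(num_nodes)}
    for head, dep in edges:
        neighbors[head].add(dep)
        neighbors[dep].add(head)
    dim = len(features[0])
    out = []
    for i in range(num_nodes):
        ns = sorted(neighbors[i])
        out.append([sum(features[j][d] for j in ns) / len(ns)
                    for d in range(dim)])
    return out

# "food was great": hypothetical arc linking tokens 0 ("food") and 2 ("great").
feats = [[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]]
edges = [(0, 2)]
out = dep_gcn_layer(feats, edges, 3)
```

After one step, the aspect word's representation already mixes in its opinion word's features via the dependency arc, which is how syntactic knowledge reaches the sentiment decision.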
arXiv: 2020-04-04
This list is automatically generated from the titles and abstracts of the papers on this site.