Understanding Pre-trained BERT for Aspect-based Sentiment Analysis
- URL: http://arxiv.org/abs/2011.00169v1
- Date: Sat, 31 Oct 2020 02:21:43 GMT
- Title: Understanding Pre-trained BERT for Aspect-based Sentiment Analysis
- Authors: Hu Xu, Lei Shu, Philip S. Yu, Bing Liu
- Abstract summary: This paper analyzes the pre-trained hidden representations learned from reviews on BERT for tasks in aspect-based sentiment analysis (ABSA).
It is not clear how the general proxy task of (masked) language modeling, trained on an unlabeled corpus without annotations of aspects or opinions, can provide important features for downstream tasks in ABSA.
- Score: 71.40586258509394
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper analyzes the pre-trained hidden representations learned from reviews on BERT for tasks in aspect-based sentiment analysis (ABSA). Our work is motivated by the recent progress in BERT-based language models for ABSA. However, it is not clear how the general proxy task of (masked) language modeling, trained on an unlabeled corpus without annotations of aspects or opinions, can provide important features for downstream tasks in ABSA. By leveraging the annotated datasets in ABSA, we investigate both the attentions and the learned representations of BERT pre-trained on reviews. We found that BERT uses very few self-attention heads to encode context words (such as prepositions or pronouns that indicate an aspect) and opinion words for an aspect. Most features in the representation of an aspect are dedicated to the fine-grained semantics of the domain (or product category) and the aspect itself, instead of carrying summarized opinions from its context. We hope this investigation can help future research in improving self-supervised learning, unsupervised learning, and fine-tuning for ABSA. The pre-trained model and code can be found at https://github.com/howardhsu/BERT-for-RRC-ABSA.
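As a concrete illustration of the kind of attention probing described in the abstract, the sketch below loads a pre-trained BERT with attention outputs enabled and inspects how strongly each self-attention head connects an aspect token to a candidate opinion word. The generic bert-base-uncased checkpoint and the example sentence are stand-ins; the paper analyzes BERT post-trained on reviews (see the linked repository).

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "The battery life of this laptop is amazing."
inputs = tokenizer(sentence, return_tensors="pt")
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

aspect_idx = tokens.index("battery")   # the aspect token
opinion_idx = tokens.index("amazing")  # a candidate opinion word

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one (batch, heads, seq, seq) tensor per layer.
for layer, att in enumerate(outputs.attentions):
    head_weights = att[0, :, aspect_idx, opinion_idx]  # aspect -> opinion, per head
    print(f"layer {layer:2d}: strongest aspect->opinion head = "
          f"{int(head_weights.argmax())} (weight {head_weights.max():.3f})")
```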
Related papers
- ROAST: Review-level Opinion Aspect Sentiment Target Joint Detection for ABSA [50.90538760832107]
This research presents a novel task, Review-Level Opinion Aspect Sentiment Target (ROAST).
ROAST seeks to close the gap between sentence-level and text-level ABSA by identifying every ABSA constituent at the review level.
We extend the available datasets to enable ROAST, addressing the drawbacks noted in previous research.
arXiv Detail & Related papers (2024-05-30T17:29:15Z)
- Incorporating Dynamic Semantics into Pre-Trained Language Model for Aspect-based Sentiment Analysis [67.41078214475341]
We propose Dynamic Re-weighting BERT (DR-BERT) to learn dynamic aspect-oriented semantics for ABSA.
Specifically, we first take the Stack-BERT layers as a primary encoder to grasp the overall semantics of the sentence.
We then fine-tune it by incorporating a lightweight Dynamic Re-weighting Adapter (DRA).
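The abstract does not spell out DRA's internals; as a hedged sketch, a lightweight adapter that re-weights BERT hidden states against an aspect query might look like the following (the module name, scoring scheme, and dimensions are assumptions):

```python
import torch
import torch.nn as nn

class ReweightingAdapter(nn.Module):
    """Scores each token against an aspect query and re-weights hidden states."""
    def __init__(self, hidden_size=768):
        super().__init__()
        self.score = nn.Linear(hidden_size * 2, 1)

    def forward(self, hidden_states, aspect_vec):
        # hidden_states: (batch, seq, hidden); aspect_vec: (batch, hidden)
        seq_len = hidden_states.size(1)
        aspect = aspect_vec.unsqueeze(1).expand(-1, seq_len, -1)
        weights = torch.softmax(
            self.score(torch.cat([hidden_states, aspect], dim=-1)).squeeze(-1),
            dim=-1,
        )  # (batch, seq): aspect-oriented token weights
        return hidden_states * weights.unsqueeze(-1)
```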
arXiv Detail & Related papers (2022-03-30T14:48:46Z)
- BERT-ASC: Auxiliary-Sentence Construction for Implicit Aspect Learning in Sentiment Analysis [4.522719296659495]
This paper proposes a unified framework to address aspect categorization and aspect-based sentiment subtasks.
We introduce a mechanism to construct an auxiliary sentence for the implicit aspect using the corpus's semantic information.
We then encourage BERT to learn an aspect-specific representation in response to this auxiliary sentence, rather than to the aspect itself.
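A minimal sketch of the sentence-pair encoding this relies on is shown below; how BERT-ASC actually derives the auxiliary sentence from corpus semantics is not detailed in the abstract, so the auxiliary text here is a placeholder:

```python
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

review = "Came back the next day and it still would not boot up."
auxiliary = "aspect: boot, startup, power on"  # hypothetical constructed auxiliary sentence

# Encoded as "[CLS] review [SEP] auxiliary [SEP]" so BERT can learn an
# aspect-specific representation conditioned on the auxiliary sentence.
encoded = tokenizer(review, auxiliary, return_tensors="pt")
print(tokenizer.decode(encoded["input_ids"][0]))
```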
arXiv Detail & Related papers (2022-03-22T13:12:27Z)
- Arabic aspect based sentiment analysis using BERT [0.0]
This article explores the modeling capabilities of contextual embeddings from pre-trained language models, such as BERT.
We build a simple but effective BERT-based neural baseline to handle this task.
Experimental results show that our BERT architecture with a simple linear classification layer surpasses state-of-the-art works.
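A baseline of this shape can be sketched as follows; the multilingual checkpoint standing in for an Arabic BERT and the sentence-pair details are assumptions, since the abstract only specifies BERT plus a linear classification layer:

```python
import torch.nn as nn
from transformers import BertModel

class BertLinearABSA(nn.Module):
    """Pre-trained BERT with a single linear layer over the pooled output."""
    def __init__(self, num_labels=3):  # e.g. positive / negative / neutral
        super().__init__()
        # Multilingual checkpoint as a stand-in for an Arabic BERT model.
        self.bert = BertModel.from_pretrained("bert-base-multilingual-cased")
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask=None, token_type_ids=None):
        out = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask,
                        token_type_ids=token_type_ids)
        return self.classifier(out.pooler_output)  # sentiment logits
```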
arXiv Detail & Related papers (2021-07-28T11:34:00Z)
- Out of Context: A New Clue for Context Modeling of Aspect-based Sentiment Analysis [54.735400754548635]
ABSA aims to predict the sentiment expressed in a review with respect to a given aspect.
We argue that the given aspect should be treated as a new clue from outside the context during context modeling.
We design several aspect-aware context encoders based on different backbones.
arXiv Detail & Related papers (2021-06-21T02:26:03Z)
- Enhanced Aspect-Based Sentiment Analysis Models with Progressive Self-supervised Attention Learning [103.0064298630794]
In aspect-based sentiment analysis (ABSA), many neural models are equipped with an attention mechanism to quantify the contribution of each context word to sentiment prediction.
We propose a progressive self-supervised attention learning approach for attentional ABSA models.
We integrate the proposed approach into three state-of-the-art neural ABSA models.
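The abstract leaves the training loop unspecified; one hedged reading of the progressive idea is to iteratively record the most-attended context word and mask it so later passes must attend elsewhere. The `attn_model` interface below (returning per-token attention weights for a 1-D token sequence) is an assumption about the host ABSA model:

```python
import torch

def progressive_attention_targets(attn_model, input_ids, mask_id, rounds=3):
    """Collect supervision targets by repeatedly masking the most-attended token."""
    ids = input_ids.clone()
    supervised = []
    for _ in range(rounds):
        with torch.no_grad():
            attn = attn_model(ids)        # assumed: (seq_len,) attention weights
        top = int(attn.argmax())
        supervised.append(top)            # the word the model currently relies on
        ids[top] = mask_id                # mask it so the next pass attends elsewhere
    return supervised                     # positions whose attention gets supervised
```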
arXiv Detail & Related papers (2021-03-05T02:50:05Z)
- Exploiting BERT to improve aspect-based sentiment analysis performance on Persian language [0.0]
This research shows the potential of using a pre-trained BERT model and taking advantage of sentence-pair input on an ABSA task.
The results indicate that employing the ParsBERT pre-trained model along with a natural language inference auxiliary sentence (NLI-M) could boost ABSA task accuracy up to 91%.
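For illustration, the NLI-M style input amounts to a BERT sentence pair in which the aspect expression serves as the auxiliary sentence; the ParsBERT checkpoint identifier below is an assumption:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("HooshvareLab/bert-base-parsbert-uncased")

review = "غذای این رستوران عالی بود"  # Persian: "the food at this restaurant was great"
aspect = "غذا"                        # the aspect term, used as the NLI-M auxiliary sentence

# Encoded as "[CLS] review [SEP] aspect [SEP]" for sentence-pair classification.
encoded = tokenizer(review, aspect, return_tensors="pt")
```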
arXiv Detail & Related papers (2020-12-02T16:47:20Z)
- Improving BERT Performance for Aspect-Based Sentiment Analysis [3.5493798890908104]
Aspect-Based Sentiment Analysis (ABSA) studies consumer opinions on market products.
It involves examining the types of sentiment as well as the sentiment targets expressed in product reviews.
We show that applying the proposed models eliminates the need for further training of the BERT model.
arXiv Detail & Related papers (2020-10-22T13:52:18Z)
- Weakly-Supervised Aspect-Based Sentiment Analysis via Joint Aspect-Sentiment Topic Embedding [71.2260967797055]
We propose a weakly-supervised approach for aspect-based sentiment analysis.
We learn <sentiment, aspect> joint topic embeddings in the word embedding space.
We then use neural models to generalize the word-level discriminative information.
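As a toy sketch of the joint-topic idea (not the paper's training procedure), each <sentiment, aspect> topic can be represented by the mean vector of a few seed words in a shared embedding space, with words assigned to the nearest topic; the random vectors and seed lists below are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for a trained word-embedding space.
vocab = ["delicious", "tasty", "bland", "cozy", "cramped"]
emb = {w: rng.normal(size=50) for w in vocab}

# Hypothetical seed words for two <sentiment, aspect> joint topics.
seeds = {
    ("positive", "food"): ["delicious", "tasty"],
    ("negative", "ambience"): ["cramped"],
}

# Each joint topic embedding is the mean of its seed word vectors.
topics = {t: np.mean([emb[w] for w in ws], axis=0) for t, ws in seeds.items()}

def nearest_topic(word):
    v = emb[word]
    cos = lambda a, b: float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(topics, key=lambda t: cos(v, topics[t]))

print(nearest_topic("bland"))  # assigns the word to its nearest joint topic
```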
arXiv Detail & Related papers (2020-10-13T21:33:24Z)
- Adversarial Training for Aspect-Based Sentiment Analysis with BERT [3.5493798890908104]
We propose a novel architecture called BERT Adversarial Training (BAT) to utilize adversarial training in sentiment analysis.
The proposed model outperforms post-trained BERT in both tasks.
To the best of our knowledge, this is the first study on the application of adversarial training in ABSA.
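A compact sketch of embedding-level adversarial training in this spirit is below, assuming a Hugging Face-style model whose forward pass returns a loss when labels are in the batch; the single-step, FGM-style perturbation and the epsilon value are assumptions, as the abstract does not give BAT's exact formulation:

```python
import torch

def adversarial_step(model, batch, epsilon=1.0):
    """One FGM-style adversarial step on BERT's word-embedding matrix."""
    emb = model.bert.embeddings.word_embeddings

    clean_loss = model(**batch).loss   # clean forward pass (batch includes labels)
    clean_loss.backward()              # populates emb.weight.grad

    grad = emb.weight.grad
    delta = epsilon * grad / (grad.norm() + 1e-8)  # normalized perturbation

    emb.weight.data.add_(delta)        # perturb embeddings adversarially
    adv_loss = model(**batch).loss
    adv_loss.backward()                # accumulate adversarial gradients
    emb.weight.data.sub_(delta)        # restore original embeddings

    return clean_loss.item(), adv_loss.item()
```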
arXiv Detail & Related papers (2020-01-30T13:53:58Z)