Zero-Shot Aspect-Based Sentiment Analysis
- URL: http://arxiv.org/abs/2202.01924v2
- Date: Tue, 8 Feb 2022 13:00:23 GMT
- Title: Zero-Shot Aspect-Based Sentiment Analysis
- Authors: Lei Shu, Jiahua Chen, Bing Liu, Hu Xu
- Abstract summary: This paper aims to train a unified model that can perform zero-shot ABSA without using any annotated data for a new domain.
We evaluate CORN on ABSA tasks ranging from aspect extraction (AE) and aspect sentiment classification (ASC) to end-to-end aspect-based sentiment analysis (E2E ABSA).
- Score: 19.367516271270446
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Aspect-based sentiment analysis (ABSA) typically requires in-domain annotated
data for supervised training/fine-tuning. It is a big challenge to scale ABSA
to a large number of new domains. This paper aims to train a unified model that
can perform zero-shot ABSA without using any annotated data for a new domain.
We propose a method called contrastive post-training on review Natural Language
Inference (CORN). ABSA tasks can later be cast as NLI for zero-shot transfer.
We evaluate CORN on ABSA tasks ranging from aspect extraction (AE) and aspect
sentiment classification (ASC) to end-to-end aspect-based sentiment analysis
(E2E ABSA), showing that ABSA can be conducted without any human-annotated ABSA
data.
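For intuition, a minimal sketch of the general "ABSA as NLI" idea follows, using an off-the-shelf MNLI checkpoint from Hugging Face transformers for zero-shot aspect sentiment classification. The model name, hypothesis wording, and label set are illustrative assumptions; this is not the paper's CORN post-training.

```python
# Illustrative sketch only: casting aspect sentiment classification (ASC) as NLI
# with an off-the-shelf MNLI model. This is NOT the paper's CORN method; the
# checkpoint name and hypothesis template below are assumptions.
from transformers import pipeline

# Any NLI-finetuned checkpoint can be plugged in here.
nli = pipeline("zero-shot-classification", model="roberta-large-mnli")

review = "The battery life is amazing but the screen scratches easily."

def classify_aspect_sentiment(text: str, aspect: str) -> str:
    """Score entailment of sentiment hypotheses about one aspect (zero-shot ASC)."""
    result = nli(
        text,
        candidate_labels=["positive", "negative", "neutral"],
        hypothesis_template=f"The sentiment toward the {aspect} is {{}}.",
    )
    return result["labels"][0]  # label with the highest entailment score

for aspect in ["battery life", "screen"]:
    print(aspect, "->", classify_aspect_sentiment(review, aspect))
```

The sketch covers only the ASC case; how aspect extraction and E2E ABSA are cast as NLI is specific to the paper.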
Related papers
- ROAST: Review-level Opinion Aspect Sentiment Target Joint Detection for ABSA [50.90538760832107]
This research presents a novel task, Review-Level Opinion Aspect Sentiment Target (ROAST).
ROAST seeks to close the gap between sentence-level and text-level ABSA by identifying every ABSA constituent at the review level.
We extend the available datasets to enable ROAST, addressing the drawbacks noted in previous research.
arXiv Detail & Related papers (2024-05-30T17:29:15Z)
- A Weak Supervision Approach for Few-Shot Aspect Based Sentiment Analysis [39.33888584498155]
Weak supervision on abundant unlabeled data can be leveraged to improve few-shot performance in sentiment analysis tasks.
We propose a pipeline approach to construct a noisy ABSA dataset, and we use it to adapt a pre-trained sequence-to-sequence model to the ABSA tasks.
Our proposed method preserves the full fine-tuning performance while showing significant improvements (15.84% absolute F1) in the few-shot learning scenario.
arXiv Detail & Related papers (2023-05-19T19:53:54Z)
- Bidirectional Generative Framework for Cross-domain Aspect-based Sentiment Analysis [68.742820522137]
Cross-domain aspect-based sentiment analysis (ABSA) aims to perform various fine-grained sentiment analysis tasks on a target domain by transferring knowledge from a source domain.
We propose a unified bidirectional generative framework to tackle various cross-domain ABSA tasks.
Our framework trains a generative model in both text-to-label and label-to-text directions.
arXiv Detail & Related papers (2023-05-16T15:02:23Z)
- Survey of Aspect-based Sentiment Analysis Datasets [55.61047894397937]
Aspect-based sentiment analysis (ABSA) is a natural language processing problem that requires analyzing user-generated reviews.
Numerous yet scattered corpora for ABSA make it difficult for researchers to quickly identify the corpora best suited to a specific ABSA subtask.
This study aims to present a database of corpora that can be used to train and assess autonomous ABSA systems.
arXiv Detail & Related papers (2022-04-11T16:23:36Z)
- A Survey on Aspect-Based Sentiment Analysis: Tasks, Methods, and Challenges [58.97831696674075]
ABSA aims to analyze and understand people's opinions at the aspect level.
We provide a new taxonomy for ABSA that organizes existing studies along the axes of the concerned sentiment elements.
We summarize the use of pre-trained language models for ABSA, which has raised ABSA performance to a new level.
arXiv Detail & Related papers (2022-03-02T12:01:46Z)
- A Simple Information-Based Approach to Unsupervised Domain-Adaptive Aspect-Based Sentiment Analysis [58.124424775536326]
We propose a simple but effective technique based on mutual information to extract aspect terms.
Experiment results show that our proposed method outperforms the state-of-the-art methods for cross-domain ABSA by 4.32% Micro-F1.
arXiv Detail & Related papers (2022-01-29T10:18:07Z)
- Understanding Pre-trained BERT for Aspect-based Sentiment Analysis [71.40586258509394]
This paper analyzes the pre-trained hidden representations that BERT learns from reviews for tasks in aspect-based sentiment analysis (ABSA).
It is not clear how the general proxy task of (masked) language modeling, trained on an unlabeled corpus without annotations of aspects or opinions, can provide important features for downstream ABSA tasks.
arXiv Detail & Related papers (2020-10-31T02:21:43Z)
- A Position Aware Decay Weighted Network for Aspect based Sentiment Analysis [3.1473798197405944]
In ABSA, a text can have multiple sentiments depending upon each aspect.
Most of the existing approaches for ATSA incorporate aspect information through a different subnetwork.
In this paper, we propose a model that leverages the positional information of the aspect.
arXiv Detail & Related papers (2020-05-03T09:22:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.