To be Closer: Learning to Link up Aspects with Opinions
- URL: http://arxiv.org/abs/2109.08382v1
- Date: Fri, 17 Sep 2021 07:37:13 GMT
- Title: To be Closer: Learning to Link up Aspects with Opinions
- Authors: Yuxiang Zhou, Lejian Liao, Yang Gao, Zhanming Jie, Wei Lu
- Abstract summary: Dependency parse trees are helpful for discovering the opinion words in aspect-based sentiment analysis (ABSA).
In this work, we aim to shorten the distance between aspects and corresponding opinion words by learning an aspect-centric tree structure.
The learning process allows the tree structure to adaptively correlate the aspect and opinion words, enabling us to better identify the polarity in the ABSA task.
- Score: 18.956990787407793
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dependency parse trees are helpful for discovering the opinion words in
aspect-based sentiment analysis (ABSA). However, the trees obtained from
off-the-shelf dependency parsers are static, and could be sub-optimal in ABSA.
This is because the syntactic trees are not designed for capturing the
interactions between opinion words and aspect words. In this work, we aim to
shorten the distance between aspects and corresponding opinion words by
learning an aspect-centric tree structure. The aspect and opinion words are
expected to be closer along such tree structure compared to the standard
dependency parse tree. The learning process allows the tree structure to
adaptively correlate the aspect and opinion words, enabling us to better
identify the polarity in the ABSA task. We conduct experiments on five
aspect-based sentiment datasets, and the proposed model significantly
outperforms recent strong baselines. Furthermore, our thorough analysis
demonstrates that the average distance between aspect and opinion words is
shortened by at least 19% on the standard SemEval Restaurant14 dataset.
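To make the reported "distance" concrete, here is a minimal Python sketch (not the authors' model) that measures the tree distance between an aspect token and an opinion token as the number of edges on the path connecting them in a dependency tree; the example sentence and head indices are hypothetical and serve only to illustrate the metric whose average the paper's analysis reports being shortened.

```python
# Minimal illustration: tree distance between an aspect token and an opinion
# token, i.e. the number of edges on the path connecting them in a parse tree.
# Head indices and the example sentence below are made up for illustration.

def path_to_root(idx, heads):
    """Return the nodes from idx up to the root (head == -1), idx first."""
    path = [idx]
    while heads[idx] != -1:
        idx = heads[idx]
        path.append(idx)
    return path

def tree_distance(i, j, heads):
    """Number of edges between tokens i and j in the tree defined by heads."""
    pi, pj = path_to_root(i, heads), path_to_root(j, heads)
    depth_of = {node: depth for depth, node in enumerate(pi)}
    for depth_j, node in enumerate(pj):
        if node in depth_of:            # lowest common ancestor reached
            return depth_of[node] + depth_j
    raise ValueError("tokens are not in the same tree")

# "The battery life is good but the screen is dull."
# Hypothetical 0-based head indices (-1 marks the root).
tokens = ["The", "battery", "life", "is", "good", "but",
          "the", "screen", "is", "dull", "."]
heads  = [2, 2, 4, 4, -1, 4, 7, 9, 9, 4, 4]

aspect, opinion = tokens.index("screen"), tokens.index("dull")
print(tree_distance(aspect, opinion, heads))   # -> 1 edge in this toy tree
```

Averaging this quantity over gold aspect–opinion pairs is one straightforward way to compare a learned aspect-centric tree against an off-the-shelf dependency parse.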
Related papers
- Syntactic Language Change in English and German: Metrics, Parsers, and Convergences [56.47832275431858]
The current paper looks at diachronic trends in syntactic language change in both English and German, using corpora of parliamentary debates from the last c. 160 years.
We base our observations on five dependency parsers, including the widely used Stanford CoreNLP as well as 4 newer alternatives.
We show that changes in syntactic measures seem to be more frequent at the tails of sentence length distributions.
arXiv Detail & Related papers (2024-02-18T11:46:16Z) - Interpreting Sentiment Composition with Latent Semantic Tree [21.008695645095038]
We propose semantic tree, a new tree form capable of interpreting the sentiment composition in a principled way.
Semantic tree is a derivation of a context-free grammar (CFG) describing the specific composition rules on different semantic roles.
Our method achieves better or competitive results compared to baselines in the setting of regular and domain adaptation classification.
arXiv Detail & Related papers (2023-08-31T09:35:52Z) - Opinion Tree Parsing for Aspect-based Sentiment Analysis [24.29073390167775]
We propose an opinion tree parsing model, aiming to parse all the sentiment elements from an opinion tree, which is much faster and can explicitly reveal a more comprehensive and complete aspect-level sentiment structure.
In particular, we first introduce a context-free opinion grammar to normalize the opinion tree structure. We then employ a neural chart-based opinion tree parser to fully explore the correlations among sentiment elements and parse them into an opinion tree structure.
arXiv Detail & Related papers (2023-06-15T07:53:14Z) - CPTAM: Constituency Parse Tree Aggregation Method [6.011216641982612]
This paper adopts the truth discovery idea to aggregate constituency parse trees from different parsers.
We formulate the constituency parse tree aggregation problem in two steps, structure aggregation and constituent label aggregation.
Experiments are conducted on benchmark datasets in different languages and domains.
arXiv Detail & Related papers (2022-01-19T23:05:37Z) - Linguistic dependencies and statistical dependence [76.89273585568084]
We use pretrained language models to estimate probabilities of words in context.
We find that maximum-CPMI trees correspond to linguistic dependencies more often than trees extracted from non-contextual PMI estimates.
arXiv Detail & Related papers (2021-04-18T02:43:37Z) - Improving Aspect-based Sentiment Analysis with Gated Graph Convolutional
Networks and Syntax-based Regulation [89.38054401427173]
Aspect-based Sentiment Analysis (ABSA) seeks to predict the sentiment polarity of a sentence toward a specific aspect.
Dependency trees can be integrated into deep learning models to produce state-of-the-art performance for ABSA.
We propose a novel graph-based deep learning model to overcome these two issues.
arXiv Detail & Related papers (2020-10-26T07:36:24Z) - A Survey of Unsupervised Dependency Parsing [62.16714720135358]
Unsupervised dependency parsing aims to learn a dependency parser from sentences that have no annotation of their correct parse trees.
Despite its difficulty, unsupervised parsing is an interesting research direction because of its capability of utilizing almost unlimited unannotated text data.
arXiv Detail & Related papers (2020-10-04T10:51:22Z) - Exploiting Syntactic Structure for Better Language Modeling: A Syntactic
Distance Approach [78.77265671634454]
We make use of a multi-task objective, i.e., the models simultaneously predict words as well as ground truth parse trees in a form called "syntactic distances".
Experimental results on the Penn Treebank and Chinese Treebank datasets show that when ground truth parse trees are provided as additional training signals, the model is able to achieve lower perplexity and induce trees with better quality.
arXiv Detail & Related papers (2020-05-12T15:35:00Z) - Relational Graph Attention Network for Aspect-based Sentiment Analysis [35.342467338880546]
Aspect-based sentiment analysis aims to determine the sentiment polarity towards a specific aspect in online reviews.
We propose a relational graph attention network (R-GAT) to encode the new tree structure for sentiment prediction.
Experiments are conducted on the SemEval 2014 and Twitter datasets.
arXiv Detail & Related papers (2020-04-26T12:21:04Z) - A Deep Neural Framework for Contextual Affect Detection [51.378225388679425]
A short and simple text carrying no emotion can represent some strong emotions when read along with its context.
We propose a Contextual Affect Detection framework which learns the inter-dependence of words in a sentence.
arXiv Detail & Related papers (2020-01-28T05:03:15Z)