Tell Me Why You Feel That Way: Processing Compositional Dependency for
Tree-LSTM Aspect Sentiment Triplet Extraction (TASTE)
- URL: http://arxiv.org/abs/2103.05815v1
- Date: Wed, 10 Mar 2021 01:52:10 GMT
- Title: Tell Me Why You Feel That Way: Processing Compositional Dependency for
Tree-LSTM Aspect Sentiment Triplet Extraction (TASTE)
- Authors: A. Sutherland, S. Bensch, T. Hellström, S. Magg, S. Wermter
- Abstract summary: We present a hybrid neural-symbolic method utilising a Dependency Tree-LSTM's compositional sentiment parse structure and complementary symbolic rules.
We show that this method has the potential to perform in line with state-of-the-art approaches while also simplifying the data required and providing a degree of interpretability.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Sentiment analysis has transitioned from classifying the sentiment of an
entire sentence to providing the contextual information of what targets exist
in a sentence, what sentiment the individual targets have, and what the causal
words responsible for that sentiment are. However, this has led to elaborate
requirements being placed on the datasets needed to train neural networks on
the joint triplet task of determining an entity, its sentiment, and the causal
words for that sentiment. Requiring this kind of data for training systems is
problematic, as such datasets suffer from the stacking of subjective annotations and from
domain over-fitting, leading to poor model generalisation when applied in new contexts.
These problems are also likely to be compounded as we attempt to jointly
determine additional contextual elements in the future. To mitigate these
problems, we present a hybrid neural-symbolic method utilising a Dependency
Tree-LSTM's compositional sentiment parse structure and complementary symbolic
rules to correctly extract target-sentiment-cause triplets from sentences
without the need for triplet training data. We show that this method has the
potential to perform in line with state-of-the-art approaches while also
simplifying the data required and providing a degree of interpretability
through the Tree-LSTM.
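For readers unfamiliar with the component, a Dependency Tree-LSTM composes representations bottom-up over the dependency parse; in the standard Child-Sum formulation (Tai et al., 2015), each node's state is computed from its word embedding and the summed states of its children. The following PyTorch sketch shows one such cell; the class and parameter names are illustrative and are not taken from the paper's implementation.

```python
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    """One Child-Sum Tree-LSTM cell (Tai et al., 2015): a node's state is
    computed from its word embedding and the summed states of its children."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.iou = nn.Linear(in_dim + hid_dim, 3 * hid_dim)  # input, output, update gates
        self.f_x = nn.Linear(in_dim, hid_dim)                 # forget gate, word part
        self.f_h = nn.Linear(hid_dim, hid_dim)                # forget gate, per-child part

    def forward(self, x, child_h, child_c):
        # x: (in_dim,) word embedding of this node
        # child_h, child_c: (num_children, hid_dim); pass zeros(0, hid_dim) for a leaf
        h_tilde = child_h.sum(dim=0)                          # sum of children's hidden states
        i, o, u = self.iou(torch.cat([x, h_tilde])).chunk(3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x) + self.f_h(child_h))    # one forget gate per child
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c
```

Roughly, reading out a sentiment label from each node's hidden state yields the compositional sentiment parse structure the abstract refers to; the complementary symbolic rules then operate over that structure to associate targets with the subtrees responsible for their sentiment.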
Related papers
- Semantic Loss Functions for Neuro-Symbolic Structured Prediction [74.18322585177832]
We discuss the semantic loss, which injects knowledge about such structure, defined symbolically, into training.
It is agnostic to the arrangement of the symbols, and depends only on the semantics expressed thereby.
It can be combined with both discriminative and generative neural models.
arXiv Detail & Related papers (2024-05-12T22:18:25Z)
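For a concrete feel of this idea, the semantic loss (introduced by Xu et al., 2018) reduces, for the simple constraint "exactly one label is true", to the negative log-probability that independently sampled indicators satisfy the constraint. The snippet below is a minimal sketch of that special case only, not the general circuit-compiled formulation; the function name and numerical tolerances are assumptions.

```python
import torch

def semantic_loss_exactly_one(p: torch.Tensor) -> torch.Tensor:
    """Semantic loss for the constraint 'exactly one indicator is true'.

    p: (n,) independent Bernoulli probabilities predicted by a network.
    Returns -log P(exactly one indicator equals 1), i.e. the negative log of the
    probability mass the network places on constraint-satisfying assignments.
    """
    # P(only indicator i is on) = p_i * prod_{j != i} (1 - p_j), computed in log space
    log_p = torch.log(p.clamp_min(1e-12))
    log_not_p = torch.log((1.0 - p).clamp_min(1e-12))
    per_i = log_p + (log_not_p.sum() - log_not_p)   # leave-one-out sum over log(1 - p_j)
    return -torch.logsumexp(per_i, dim=0)
```

Because the value depends only on which assignments satisfy the constraint, any logically equivalent rewriting of the constraint gives the same loss, which is the syntax-independence property noted above.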
- A Hybrid Approach To Aspect Based Sentiment Analysis Using Transfer Learning [3.30307212568497]
We propose a hybrid approach for Aspect Based Sentiment Analysis using transfer learning.
The approach focuses on generating weakly-supervised annotations by exploiting the strengths of both large language models (LLM) and traditional syntactic dependencies.
arXiv Detail & Related papers (2024-03-25T23:02:33Z)
- BiSyn-GAT+: Bi-Syntax Aware Graph Attention Network for Aspect-based Sentiment Analysis [23.223136577272516]
Aspect-based sentiment analysis aims to align aspects and corresponding sentiments for aspect-specific sentiment polarity inference.
Recently, exploiting dependency syntax information with graph neural networks has been the most popular trend.
We propose a Bi-Syntax aware Graph Attention Network (BiSyn-GAT+) to address this problem.
arXiv Detail & Related papers (2022-04-06T22:18:12Z)
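As a generic illustration of the trend in the entry above (exploiting dependency syntax with graph neural networks), and not the BiSyn-GAT+ architecture itself, a single graph-attention layer that restricts each token's attention to its dependency-tree neighbours might be sketched as follows; names and dimensions are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DependencyGATLayer(nn.Module):
    """One graph-attention layer in which each token attends only to its
    dependency-tree neighbours (a generic sketch, not BiSyn-GAT+ itself)."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (n_tokens, dim) token states
        # adj: (n_tokens, n_tokens) 0/1 dependency adjacency, assumed to include self-loops
        z = self.proj(h)
        n = z.size(0)
        pairs = torch.cat([z.unsqueeze(1).expand(n, n, -1),
                           z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        scores = F.leaky_relu(self.attn(pairs)).squeeze(-1)   # (n, n) attention logits
        scores = scores.masked_fill(adj == 0, float("-inf"))  # keep only syntactic neighbours
        alpha = torch.softmax(scores, dim=-1)
        return F.elu(alpha @ z)
```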
- Semantic and Syntactic Enhanced Aspect Sentiment Triplet Extraction [18.331779474247323]
Aspect Sentiment Triplet Extraction aims to extract triplets from sentences, where each triplet includes an entity, its associated sentiment, and the opinion span explaining the reason for the sentiment.
We propose a Semantic and Syntactic Enhanced aspect Sentiment triplet Extraction model (S3E2) to fully exploit the syntactic and semantic relationships between the triplet elements and jointly extract them.
arXiv Detail & Related papers (2021-06-07T03:16:51Z)
- Leveraging Recursive Processing for Neural-Symbolic Affect-Target Associations [0.0]
We present a commonsense approach to associate extracted targets, noun chunks determined to be associated with the expressed emotion, with affective labels from a natural language expression.
We leverage a pre-trained neural network that is well adapted to tree and sub-tree processing, the Dependency Tree-LSTM, to learn the affect labels of dynamic targets.
arXiv Detail & Related papers (2021-03-05T15:32:38Z)
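The targets in the entry above are noun chunks obtained from a parse; a minimal way to collect such candidate targets, using spaCy as an assumed, illustrative parser rather than the paper's exact pipeline, is:

```python
import spacy

# Requires the small English model: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

def candidate_targets(sentence: str):
    """Return noun chunks with their dependency relation and head as candidate targets."""
    doc = nlp(sentence)
    return [(chunk.text, chunk.root.dep_, chunk.root.head.text)
            for chunk in doc.noun_chunks]

print(candidate_targets("The pasta was wonderful but the service was painfully slow."))
# e.g. [('The pasta', 'nsubj', 'was'), ('the service', 'nsubj', 'was')]
```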
- Syntactic representation learning for neural network based TTS with syntactic parse tree traversal [49.05471750563229]
We propose a syntactic representation learning method based on syntactic parse tree to automatically utilize the syntactic structure information.
Experimental results demonstrate the effectiveness of our proposed approach.
For sentences with multiple syntactic parse trees, prosodic differences can be clearly perceived from the synthesized speeches.
arXiv Detail & Related papers (2020-12-13T05:52:07Z)
- SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics [82.51956663674747]
SentiBERT is a variant of BERT that effectively captures compositional sentiment semantics.
We show that SentiBERT achieves competitive performance on phrase-level sentiment classification.
arXiv Detail & Related papers (2020-05-08T15:40:17Z)
- Neural Data-to-Text Generation via Jointly Learning the Segmentation and Correspondence [48.765579605145454]
We propose to explicitly segment target text into fragment units and align them with their data correspondences.
The resulting architecture maintains the same expressive power as neural attention models.
On both E2E and WebNLG benchmarks, we show the proposed model consistently outperforms its neural attention counterparts.
arXiv Detail & Related papers (2020-05-03T14:28:28Z)
- Grounded Situation Recognition [56.18102368133022]
We introduce Grounded Situation Recognition (GSR), a task that requires producing structured semantic summaries of images.
GSR presents important technical challenges: identifying semantic saliency, categorizing and localizing a large and diverse set of entities.
We show initial findings on three exciting future directions enabled by our models: conditional querying, visual chaining, and grounded semantic aware image retrieval.
arXiv Detail & Related papers (2020-03-26T17:57:52Z)
- Linguistically Driven Graph Capsule Network for Visual Question Reasoning [153.76012414126643]
We propose a hierarchical compositional reasoning model called the "Linguistically driven Graph Capsule Network".
The compositional process is guided by the linguistic parse tree. Specifically, we bind each capsule in the lowest layer to bridge the linguistic embedding of a single word in the original question with visual evidence.
Experiments on the CLEVR dataset, CLEVR compositional generation test, and FigureQA dataset demonstrate the effectiveness and composition generalization ability of our end-to-end model.
arXiv Detail & Related papers (2020-03-23T03:34:25Z)
- Investigating Typed Syntactic Dependencies for Targeted Sentiment Classification Using Graph Attention Neural Network [10.489983726592303]
We investigate a novel relational graph attention network that integrates typed syntactic dependency information.
Results show that our method can effectively leverage label information for improving targeted sentiment classification performances.
arXiv Detail & Related papers (2020-02-22T11:17:16Z)