A Commonsense Reasoning Framework for Explanatory Emotion Attribution,
Generation and Re-classification
- URL: http://arxiv.org/abs/2101.04017v1
- Date: Mon, 11 Jan 2021 16:44:38 GMT
- Authors: Antonio Lieto, Gian Luca Pozzato, Stefano Zoia, Viviana Patti, Rossana
Damiano
- Abstract summary: We present an explainable system for emotion attribution and recommendation (called DEGARI).
The system exploits the logic TCL to automatically generate novel commonsense semantic representations of compound emotions.
The generated emotions correspond to prototypes, i.e. commonsense representations of given concepts.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work we present an explainable system for emotion attribution and
recommendation (called DEGARI) relying on a recently introduced commonsense
reasoning framework (the TCL logic) which is based on a human-like procedure
for the automatic generation of novel concepts in a Description Logics
knowledge base. Starting from an ontological formalization of emotions (known
as ArsEmotica), the system exploits the logic TCL to automatically generate
novel commonsense semantic representations of compound emotions (e.g. Love as
derived from the combination of Joy and Trust according to the ArsEmotica
model). The generated emotions correspond to prototypes, i.e. commonsense
representations of given concepts, and have been used to reclassify
emotion-related contents in a variety of artistic domains, ranging from art
datasets to the editorial content available in RaiPlay, the online multimedia
platform of RAI Radiotelevisione Italiana (the Italian public broadcasting
company). We have tested our system (1) by reclassifying the available contents
in the tested dataset with respect to the newly generated compound emotions and
(2) by evaluating, in a controlled user study, the feasibility of using the
obtained reclassifications as recommended emotional content. The obtained
results are encouraging and pave the way to many possible further improvements
and research directions.
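
The combination step described above (e.g. Love derived from Joy and Trust) can be sketched informally. The following is a minimal Python sketch, not the authors' implementation: the actual TCL logic operates over typicality inclusions in a Description Logics knowledge base, and the property names and weights below are invented for illustration only.

```python
# Illustrative sketch of a TCL-style HEAD/MODIFIER prototype combination:
# the compound concept keeps all typical properties of the HEAD emotion and
# inherits the MODIFIER's properties that do not conflict with them.
# All property names and probabilities here are hypothetical.

def combine_prototypes(head, modifier):
    """Merge typical properties of a HEAD concept with the compatible
    typical properties of a MODIFIER concept (the head wins on conflicts)."""
    combined = dict(head)
    for prop, p in modifier.items():
        negated = prop[4:] if prop.startswith("not_") else "not_" + prop
        if negated in combined:        # conflicting property: keep the head's
            continue
        combined.setdefault(prop, p)   # on shared properties, head prevails
    return combined

# Hypothetical prototypes of two basic emotions
joy = {"positive_valence": 0.9, "high_arousal": 0.7}
trust = {"positive_valence": 0.8, "social_bond": 0.85, "not_high_arousal": 0.6}

# Joy as HEAD, Trust as MODIFIER: "not_high_arousal" conflicts and is dropped
love = combine_prototypes(joy, trust)
print(sorted(love))
```

The head/modifier asymmetry mirrors the paper's human-like combination procedure: the dominant concept fixes the core of the compound prototype, and only consistent typical properties of the second concept are added.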
Related papers
- Towards Empathetic Conversational Recommender Systems (arXiv, 2024-08-30)
  We propose an empathetic conversational recommender (ECR) framework.
  ECR contains two main modules: emotion-aware item recommendation and emotion-aligned response generation.
  Our experiments on the ReDial dataset validate the efficacy of our framework in enhancing recommendation accuracy and improving user satisfaction.
- EmoTwiCS: A Corpus for Modelling Emotion Trajectories in Dutch Customer Service Dialogues on Twitter (arXiv, 2023-10-10)
  This paper introduces EmoTwiCS, a corpus of 9,489 Dutch customer service dialogues on Twitter annotated for emotion trajectories.
  The term 'emotion trajectory' refers not only to the fine-grained emotions experienced by customers, but also to the events happening prior to the conversation and the responses made by the human operator.
- EmotionIC: Emotional Inertia and Contagion-driven Dependency Modeling for Emotion Recognition in Conversation (arXiv, 2023-03-20)
  We propose an emotional inertia and contagion-driven dependency modeling approach (EmotionIC) for the ERC task.
  EmotionIC consists of three main components: Identity Masked Multi-Head Attention (IMMHA), Dialogue-based Gated Recurrent Unit (DiaGRU), and Skip-chain Conditional Random Field (SkipCRF).
  Experimental results show that the method significantly outperforms state-of-the-art models on four benchmark datasets.
- REDAffectiveLM: Leveraging Affect Enriched Embedding and Transformer-based Neural Language Model for Readers' Emotion Detection (arXiv, 2023-01-21)
  We propose a novel approach for readers' emotion detection from short-text documents using a deep learning model called REDAffectiveLM.
  We leverage context-specific and affect-enriched representations by using a transformer-based pre-trained language model in tandem with an affect-enriched Bi-LSTM+Attention network.
- The Whole Truth and Nothing But the Truth: Faithful and Controllable Dialogue Response Generation with Dataflow Transduction and Constrained Decoding (arXiv, 2022-09-16)
  We describe a hybrid architecture for dialogue response generation that combines the strengths of neural language modeling and rule-based generation.
  Our experiments show that this system outperforms both rule-based and learned approaches in human evaluations of fluency, relevance, and truthfulness.
- Multimodal Emotion Recognition using Transfer Learning from Speaker Recognition and BERT-based Models (arXiv, 2022-02-16)
  We propose a neural network-based emotion recognition framework that uses a late fusion of transfer-learned and fine-tuned models from the speech and text modalities.
  We evaluate the effectiveness of our proposed multimodal approach on the interactive emotional dyadic motion capture dataset.
- Affective Image Content Analysis: Two Decades Review and New Perspectives (arXiv, 2021-06-30)
  We comprehensively review the development of affective image content analysis (AICA) over the recent two decades.
  We focus on state-of-the-art methods with respect to three main challenges: the affective gap, perception subjectivity, and label noise and absence.
  We discuss challenges and promising future research directions, such as image content and context understanding, group emotion clustering, and viewer-image interaction.
- A Circular-Structured Representation for Visual Emotion Distribution Learning (arXiv, 2021-06-23)
  We propose a well-grounded circular-structured representation that exploits prior knowledge for visual emotion distribution learning.
  Specifically, we first construct an Emotion Circle that unifies any emotional state within it.
  On the proposed Emotion Circle, each emotion distribution is represented by an emotion vector defined with three attributes.
- Enhancing Cognitive Models of Emotions with Representation Learning (arXiv, 2021-04-20)
  We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
  Our framework integrates a contextualized embedding encoder with a multi-head probing model.
  Our model is evaluated on the Empathetic Dialogue dataset and achieves state-of-the-art results for classifying 32 emotions.
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.