Multi-Task Learning and Adapted Knowledge Models for Emotion-Cause
Extraction
- URL: http://arxiv.org/abs/2106.09790v1
- Date: Thu, 17 Jun 2021 20:11:04 GMT
- Title: Multi-Task Learning and Adapted Knowledge Models for Emotion-Cause
Extraction
- Authors: Elsbeth Turcan, Shuai Wang, Rishita Anubhai, Kasturi Bhattacharjee,
Yaser Al-Onaizan, Smaranda Muresan
- Abstract summary: We present solutions that tackle both emotion recognition and emotion cause detection in a joint fashion.
Considering that common-sense knowledge plays an important role in understanding implicitly expressed emotions, we propose novel methods that combine adapted knowledge models with multi-task learning.
We show performance improvement on both tasks when including common-sense reasoning and a multitask framework.
- Score: 18.68808042388714
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Detecting what emotions are expressed in text is a well-studied problem in
natural language processing. However, research on finer-grained emotion
analysis, such as what causes an emotion, is still in its infancy. We present
solutions that tackle both emotion recognition and emotion cause detection in a
joint fashion. Considering that common-sense knowledge plays an important role
in understanding implicitly expressed emotions and the reasons for those
emotions, we propose novel methods that combine common-sense knowledge via
adapted knowledge models with multi-task learning to perform joint emotion
classification and emotion cause tagging. We show performance improvement on
both tasks when including common-sense reasoning and a multitask framework. We
provide a thorough analysis to gain insights into model performance.
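The joint setup the abstract describes, where a single shared representation feeds both an emotion-classification head and a token-level cause-tagging head, can be sketched as follows. This is a minimal illustrative sketch of the multi-task idea, not the authors' implementation: the dimensions, the linear encoder stub, and the three-way B/I/O cause tag set are all hypothetical stand-ins.

```python
import math
import random

random.seed(0)

# Hypothetical sizes, for illustration only (not from the paper).
SEQ_LEN, D_MODEL, N_EMOTIONS, N_TAGS = 5, 4, 3, 3  # tags: B/I/O cause spans

def rand_matrix(rows, cols):
    return [[random.gauss(0, 1) for _ in range(cols)] for _ in range(rows)]

def vecmat(v, m):
    """Multiply a vector (len = rows of m) by a matrix given as a list of rows."""
    out = [0.0] * len(m[0])
    for i, vi in enumerate(v):
        for j, mij in enumerate(m[i]):
            out[j] += vi * mij
    return out

def softmax(v):
    mx = max(v)
    exps = [math.exp(x - mx) for x in v]
    s = sum(exps)
    return [x / s for x in exps]

W_enc = rand_matrix(D_MODEL, D_MODEL)    # shared encoder stub
W_cls = rand_matrix(D_MODEL, N_EMOTIONS) # emotion-classification head
W_tag = rand_matrix(D_MODEL, N_TAGS)     # cause-tagging head

def forward(tokens):
    # The shared representation h feeds both task heads: this sharing is
    # the multi-task element; each head has its own parameters.
    h = [[math.tanh(x) for x in vecmat(t, W_enc)] for t in tokens]
    pooled = [sum(col) / len(h) for col in zip(*h)]        # mean-pool tokens
    p_emotion = softmax(vecmat(pooled, W_cls))             # one emotion label
    p_tags = [softmax(vecmat(hi, W_tag)) for hi in h]      # one tag per token
    return p_emotion, p_tags

tokens = [[random.gauss(0, 1) for _ in range(D_MODEL)] for _ in range(SEQ_LEN)]
p_emotion, p_tags = forward(tokens)
```

In training, a combined loss (e.g. a weighted sum of the classification and tagging losses) would update the shared encoder from both tasks; that joint signal is what the multi-task framework contributes.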
Related papers
- Think out Loud: Emotion Deducing Explanation in Dialogues [57.90554323226896] (2024-06-07)
  We propose a new task, "Emotion Deducing Explanation in Dialogues" (EDEN).
  EDEN recognizes emotions and their causes through explicit reasoning.
  It can help Large Language Models (LLMs) better recognize emotions and their causes.
- SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations [53.60993109543582] (2024-05-19)
  SemEval-2024 Task 3, named Multimodal Emotion Cause Analysis in Conversations, aims at extracting all pairs of emotions and their corresponding causes from conversations.
  It comprises two subtasks under different modality settings: Textual Emotion-Cause Pair Extraction in Conversations (TECPE) and Multimodal Emotion-Cause Pair Extraction in Conversations (MECPE).
  In this paper, we introduce the task, dataset, and evaluation settings; summarize the systems of the top teams; and discuss the participants' findings.
- Dynamic Causal Disentanglement Model for Dialogue Emotion Detection [77.96255121683011] (2023-09-13)
  We propose a Dynamic Causal Disentanglement Model based on hidden-variable separation.
  The model decomposes the content of dialogues and investigates the temporal accumulation of emotions.
  Specifically, we propose a dynamic temporal disentanglement model to infer the propagation of utterances and hidden variables.
- Natural Language Processing for Cognitive Analysis of Emotions [0.0] (2022-10-11)
  We introduce a new annotation scheme for exploring emotions and their causes, along with a new French dataset of autobiographical accounts of emotional scenes.
  The texts were collected by applying the Cognitive Analysis of Emotions developed by A. Finkel, which helps people improve their emotion management.
- Speech Synthesis with Mixed Emotions [77.05097999561298] (2022-08-11)
  We propose a novel formulation that measures the relative difference between speech samples of different emotions.
  We then incorporate this formulation into a sequence-to-sequence emotional text-to-speech framework.
  At run time, the model is controlled to produce the desired emotion mixture by manually defining an emotion attribute vector.
- Emotion Recognition from Multiple Modalities: Fundamentals and Methodologies [106.62835060095532] (2021-08-18)
  We discuss several key aspects of multi-modal emotion recognition (MER).
  We begin with a brief introduction to widely used emotion representation models and affective modalities.
  We then summarize existing emotion annotation strategies and the corresponding computational tasks.
  Finally, we outline several real-world applications and discuss future directions.
- EmoDNN: Understanding emotions from short texts through a deep neural network ensemble [2.459874436804819] (2021-06-03)
  We propose a framework that infers latent individual aspects from brief content.
  We also present a novel ensemble classifier equipped with dynamic dropout convnets to extract emotions from textual context.
  The proposed model achieves higher performance in recognizing emotions from noisy content.
- Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947] (2020-09-21)
  The lack of external knowledge makes it difficult for empathetic dialogue systems to perceive implicit emotions and to learn emotional interactions from limited dialogue history.
  We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.