From Multilingual Complexity to Emotional Clarity: Leveraging
Commonsense to Unveil Emotions in Code-Mixed Dialogues
- URL: http://arxiv.org/abs/2310.13080v1
- Date: Thu, 19 Oct 2023 18:17:00 GMT
- Title: From Multilingual Complexity to Emotional Clarity: Leveraging
Commonsense to Unveil Emotions in Code-Mixed Dialogues
- Authors: Shivani Kumar, Ramaneswaran S, Md Shad Akhtar, Tanmoy Chakraborty
- Abstract summary: Understanding emotions during conversation is a fundamental aspect of human communication, driving NLP research for Emotion Recognition in Conversation (ERC).
We propose an innovative approach that integrates commonsense information with dialogue context to facilitate a deeper understanding of emotions.
Our comprehensive experimentation showcases the substantial performance improvement obtained through the systematic incorporation of commonsense in ERC.
- Score: 38.87497808740538
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Understanding emotions during conversation is a fundamental aspect of human
communication, driving NLP research for Emotion Recognition in Conversation
(ERC). While considerable research has focused on discerning emotions of
individual speakers in monolingual dialogues, understanding the emotional
dynamics in code-mixed conversations has received relatively less attention.
This motivates our undertaking of ERC for code-mixed conversations in this
study. Recognizing that emotional intelligence encompasses a comprehension of
worldly knowledge, we propose an innovative approach that integrates
commonsense information with dialogue context to facilitate a deeper
understanding of emotions. To achieve this, we devise an efficient pipeline
that extracts relevant commonsense from existing knowledge graphs based on the
code-mixed input. Subsequently, we develop an advanced fusion technique that
seamlessly combines the acquired commonsense information with the dialogue
representation obtained from a dedicated dialogue understanding module. Our
comprehensive experimentation showcases the substantial performance improvement
obtained through the systematic incorporation of commonsense in ERC. Both
quantitative assessments and qualitative analyses further corroborate the
validity of our hypothesis, reaffirming the pivotal role of commonsense
integration in enhancing ERC.
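As a concrete illustration of the idea, the following is a minimal, hypothetical sketch (not the authors' released code) of how a pooled commonsense representation retrieved from a knowledge graph could be fused with a dialogue representation before emotion classification. The abstract does not specify the fusion mechanism, so the gated design, dimensions, and classifier head below are assumptions for illustration only.

```python
# Hypothetical sketch: gated fusion of a dialogue-context representation with
# a retrieved commonsense representation, followed by emotion classification.
import torch
import torch.nn as nn

class GatedCommonsenseFusion(nn.Module):
    def __init__(self, dialogue_dim: int, commonsense_dim: int, num_emotions: int):
        super().__init__()
        self.project = nn.Linear(commonsense_dim, dialogue_dim)
        self.gate = nn.Linear(2 * dialogue_dim, dialogue_dim)
        self.classifier = nn.Linear(dialogue_dim, num_emotions)

    def forward(self, dialogue_repr: torch.Tensor, commonsense_repr: torch.Tensor):
        # dialogue_repr: (batch, dialogue_dim) from a dialogue understanding module
        # commonsense_repr: (batch, commonsense_dim) pooled from retrieved KG facts
        cs = torch.tanh(self.project(commonsense_repr))
        g = torch.sigmoid(self.gate(torch.cat([dialogue_repr, cs], dim=-1)))
        fused = g * dialogue_repr + (1 - g) * cs  # gate decides how much commonsense to admit
        return self.classifier(fused)

# Example usage with random tensors standing in for real encodings.
model = GatedCommonsenseFusion(dialogue_dim=768, commonsense_dim=300, num_emotions=7)
logits = model(torch.randn(4, 768), torch.randn(4, 300))
print(logits.shape)  # torch.Size([4, 7])
```

The gate lets the model decide, per example, how much of the commonsense signal to admit alongside the dialogue context, which is one simple way to realize the fusion described in the abstract.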
Related papers
- Empathy Through Multimodality in Conversational Interfaces [1.360649555639909]
Conversational Health Agents (CHAs) are redefining healthcare by offering nuanced support that transcends textual analysis to incorporate emotional intelligence.
This paper introduces an LLM-based CHA engineered for rich, multimodal dialogue, especially in the realm of mental health support.
It adeptly interprets and responds to users' emotional states by analyzing multimodal cues, thus delivering contextually aware and empathetically resonant verbal responses.
arXiv Detail & Related papers (2024-05-08T02:48:29Z)
- Facilitating Multi-turn Emotional Support Conversation with Positive Emotion Elicitation: A Reinforcement Learning Approach [58.88422314998018]
Emotional support conversation (ESC) aims to provide emotional support (ES) to improve one's mental state.
Existing works focus on fitting grounded responses and response strategies, ignoring their effect on ES and lacking explicit goals to guide a positive emotional transition.
We introduce a new paradigm to formalize multi-turn ESC as a process of positive emotion elicitation.
arXiv Detail & Related papers (2023-07-16T09:58:44Z)
- Context-Dependent Embedding Utterance Representations for Emotion Recognition in Conversations [1.8126187844654875]
We approach Emotion Recognition in Conversations by leveraging the conversational context.
We propose context-dependent embedding representations of each utterance.
The effectiveness of our approach is validated on the open-domain DailyDialog dataset and on the task-oriented EmoWOZ dataset.
arXiv Detail & Related papers (2023-04-17T12:37:57Z)
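For illustration of the context-dependent utterance representations described above, here is one simple, hypothetical way to obtain them: encode the target utterance together with its preceding turns using a pretrained encoder, so the resulting vector reflects the utterance in its conversational context. The model name and pairing scheme below are assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: a context-dependent utterance embedding obtained by
# encoding the utterance jointly with its preceding dialogue turns.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def contextual_utterance_embedding(context: list, utterance: str) -> torch.Tensor:
    # Concatenate prior turns and pair them with the target utterance; the [CLS]
    # vector then reflects the utterance *in context* rather than in isolation.
    history = " ".join(context)
    inputs = tokenizer(history, utterance, truncation=True, return_tensors="pt")
    with torch.no_grad():
        outputs = encoder(**inputs)
    return outputs.last_hidden_state[:, 0]  # (1, hidden_size)

emb = contextual_utterance_embedding(["How was the exam?", "Honestly, terrible."],
                                      "Oh no, I'm so sorry to hear that.")
print(emb.shape)  # torch.Size([1, 768])
```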
- Deep Learning of Segment-level Feature Representation for Speech Emotion Recognition in Conversations [9.432208348863336]
We propose a conversational speech emotion recognition method that captures attentive contextual dependencies and speaker-sensitive interactions.
First, we use a pretrained VGGish model to extract segment-based audio representation in individual utterances.
Second, an attentive bidirectional gated recurrent unit (GRU) models context-sensitive information and jointly explores intra- and inter-speaker dependencies.
arXiv Detail & Related papers (2023-02-05T16:15:46Z)
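A hedged sketch of the architecture described in the entry above: segment-level audio embeddings (e.g., 128-dimensional VGGish features, assumed to be precomputed) are fed to a bidirectional GRU, and soft attention pools the segments into a single utterance-level vector for emotion classification. Hyperparameters and the number of emotion labels are illustrative assumptions.

```python
# Sketch: attentive bidirectional GRU over precomputed segment-level features.
import torch
import torch.nn as nn

class AttentiveBiGRU(nn.Module):
    def __init__(self, feat_dim: int = 128, hidden: int = 128, num_emotions: int = 4):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.classifier = nn.Linear(2 * hidden, num_emotions)

    def forward(self, segments: torch.Tensor):
        # segments: (batch, num_segments, feat_dim), e.g., VGGish embeddings
        states, _ = self.gru(segments)                     # (batch, T, 2*hidden)
        weights = torch.softmax(self.attn(states), dim=1)  # attention over segments
        utterance = (weights * states).sum(dim=1)          # weighted pooling
        return self.classifier(utterance)

model = AttentiveBiGRU()
logits = model(torch.randn(2, 10, 128))  # 2 utterances, 10 segments each
print(logits.shape)  # torch.Size([2, 4])
```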
- DialogueCRN: Contextual Reasoning Networks for Emotion Recognition in Conversations [0.0]
We propose novel Contextual Reasoning Networks (DialogueCRN) to fully understand the conversational context from a cognitive perspective.
Inspired by the Cognitive Theory of Emotion, we design multi-turn reasoning modules to extract and integrate emotional clues.
The reasoning module iteratively performs an intuitive retrieving process and a conscious reasoning process, imitating humans' unique cognitive thinking.
arXiv Detail & Related papers (2021-06-03T16:47:38Z)
- Target Guided Emotion Aware Chat Machine [58.8346820846765]
The consistency of a response with a given post at both the semantic and emotional levels is essential for a dialogue system to deliver human-like interactions.
This article proposes a unified end-to-end neural architecture, which is capable of simultaneously encoding the semantics and the emotions in a post.
arXiv Detail & Related papers (2020-11-15T01:55:37Z)
- COSMIC: COmmonSense knowledge for eMotion Identification in Conversations [95.71018134363976]
We propose COSMIC, a new framework that incorporates different elements of commonsense such as mental states, events, and causal relations.
We show that COSMIC achieves new state-of-the-art results for emotion recognition on four different benchmark conversational datasets.
arXiv Detail & Related papers (2020-10-06T15:09:38Z)
- Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947]
A lack of external knowledge makes it difficult for empathetic dialogue systems to perceive implicit emotions and to learn emotional interactions from limited dialogue history.
We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
arXiv Detail & Related papers (2020-09-21T09:21:52Z)
- You Impress Me: Dialogue Generation via Mutual Persona Perception [62.89449096369027]
Research in cognitive science suggests that understanding is an essential signal for a high-quality chit-chat conversation.
Motivated by this, we propose P2 Bot, a transmitter-receiver based framework with the aim of explicitly modeling understanding.
arXiv Detail & Related papers (2020-04-11T12:51:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.