Dynamic Demonstration Retrieval and Cognitive Understanding for Emotional Support Conversation
- URL: http://arxiv.org/abs/2404.02505v1
- Date: Wed, 3 Apr 2024 06:47:15 GMT
- Title: Dynamic Demonstration Retrieval and Cognitive Understanding for Emotional Support Conversation
- Authors: Zhe Xu, Daoyuan Chen, Jiayi Kuang, Zihao Yi, Yaliang Li, Ying Shen
- Abstract summary: We tackle two key challenges in ESC: enhancing contextually relevant and empathetic response generation and advancing cognitive understanding.
We introduce \ourwork, a novel approach that synergizes these elements to improve the quality of support provided in ESCs.
Our codes are available for public access to facilitate further research and development.
- Score: 35.49338831485202
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Emotional Support Conversation (ESC) systems are pivotal in providing empathetic interactions, aiding users through negative emotional states by understanding and addressing their unique experiences. In this paper, we tackle two key challenges in ESC: enhancing contextually relevant and empathetic response generation through dynamic demonstration retrieval, and advancing cognitive understanding to grasp implicit mental states comprehensively. We introduce Dynamic Demonstration Retrieval and Cognitive-Aspect Situation Understanding (\ourwork), a novel approach that synergizes these elements to improve the quality of support provided in ESCs. By leveraging in-context learning and persona information, we introduce an innovative retrieval mechanism that selects informative and personalized demonstration pairs. We also propose a cognitive understanding module that utilizes four cognitive relationships from the ATOMIC knowledge source to deepen situational awareness of help-seekers' mental states. Our supportive decoder integrates information from diverse knowledge sources, underpinning response generation that is both empathetic and cognitively aware. The effectiveness of \ourwork is demonstrated through extensive automatic and human evaluations, revealing substantial improvements over numerous state-of-the-art models, with up to 13.79% enhancement in overall performance across ten metrics. Our codes are available for public access to facilitate further research and development.
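The abstract describes a retrieval mechanism that scores stored (context, response) demonstration pairs against the current dialogue context and persona information. The paper's actual retriever is not specified here; the following is a minimal sketch of the general idea, using a toy bag-of-words similarity in place of the learned encoder the authors would use, with an entirely hypothetical corpus:

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; the paper would use a trained encoder.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_demonstrations(query_context, corpus, persona=None, k=2):
    """Rank stored (context, response) pairs by similarity to the query
    context, folding persona attributes into the query representation."""
    query = embed(query_context + " " + " ".join(persona or []))
    ranked = sorted(corpus,
                    key=lambda pair: cosine(query, embed(pair[0])),
                    reverse=True)
    return ranked[:k]

# Hypothetical demonstration corpus of (help-seeker context, supporter response).
corpus = [
    ("I failed my exam and feel hopeless", "That sounds really hard..."),
    ("My dog is sick and I am worried", "I'm sorry to hear about your dog..."),
    ("I lost my job last week", "Losing a job is stressful..."),
]

demos = retrieve_demonstrations("I just lost my job and feel anxious",
                                corpus, persona=["recent graduate"])
print(demos[0][0])  # the job-loss context should rank first
```

The retrieved pairs would then be placed into the prompt as in-context demonstrations before response generation; the persona term is one plausible way to realize the "personalized" selection the abstract mentions.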
Related papers
- Complex Emotion Recognition System using basic emotions via Facial Expression, EEG, and ECG Signals: a review [1.8310098790941458]
The Complex Emotion Recognition System (CERS) deciphers complex emotional states by examining combinations of basic emotions expressed, their interconnections, and the dynamic variations.
The development of AI systems for discerning complex emotions poses a substantial challenge with significant implications for affective computing.
Incorporating physiological signals such as Electrocardiogram (ECG) and Electroencephalogram (EEG) can notably enhance CERS.
arXiv Detail & Related papers (2024-09-09T05:06:10Z) - Cause-Aware Empathetic Response Generation via Chain-of-Thought Fine-Tuning [12.766893968788263]
Empathetic response generation endows agents with the capability to comprehend dialogue contexts and react to expressed emotions.
Previous works predominantly focus on leveraging the speaker's emotional labels, but ignore the importance of emotion cause reasoning.
We propose a cause-aware empathetic generation approach by integrating emotions and causes through a well-designed Chain-of-Thought prompt.
arXiv Detail & Related papers (2024-08-21T13:11:03Z) - APTNESS: Incorporating Appraisal Theory and Emotion Support Strategies for Empathetic Response Generation [71.26755736617478]
Empathetic response generation is designed to comprehend the emotions of others.
We develop a framework that combines retrieval augmentation and emotional support strategy integration.
Our framework can enhance the empathy ability of LLMs from both cognitive and affective empathy perspectives.
arXiv Detail & Related papers (2024-07-23T02:23:37Z) - Empathy Through Multimodality in Conversational Interfaces [1.360649555639909]
Conversational Health Agents (CHAs) are redefining healthcare by offering nuanced support that transcends textual analysis to incorporate emotional intelligence.
This paper introduces an LLM-based CHA engineered for rich, multimodal dialogue-especially in the realm of mental health support.
It adeptly interprets and responds to users' emotional states by analyzing multimodal cues, thus delivering contextually aware and empathetically resonant verbal responses.
arXiv Detail & Related papers (2024-05-08T02:48:29Z) - K-ESConv: Knowledge Injection for Emotional Support Dialogue Systems via Prompt Learning [83.19215082550163]
We propose K-ESConv, a novel prompt learning based knowledge injection method for emotional support dialogue system.
We evaluate our model on an emotional support dataset ESConv, where the model retrieves and incorporates knowledge from external professional emotional Q&A forum.
arXiv Detail & Related papers (2023-12-16T08:10:10Z) - Facilitating Multi-turn Emotional Support Conversation with Positive Emotion Elicitation: A Reinforcement Learning Approach [58.88422314998018]
Emotional support conversation (ESC) aims to provide emotional support (ES) to improve one's mental state.
Existing works stay at fitting grounded responses and responding strategies which ignore the effect on ES and lack explicit goals to guide emotional positive transition.
We introduce a new paradigm to formalize multi-turn ESC as a process of positive emotion elicitation.
arXiv Detail & Related papers (2023-07-16T09:58:44Z) - Improving Empathetic Dialogue Generation by Dynamically Infusing Commonsense Knowledge [39.536604198392375]
In empathetic conversations, individuals express their empathy towards others.
Previous work has mainly focused on generating empathetic responses by utilizing the speaker's emotion.
We propose a novel approach for empathetic response generation, which incorporates an adaptive module for commonsense knowledge selection.
arXiv Detail & Related papers (2023-05-24T10:25:12Z) - CogAlign: Learning to Align Textual Neural Representations to Cognitive Language Processing Signals [60.921888445317705]
We propose a CogAlign approach to integrate cognitive language processing signals into natural language processing models.
We show that CogAlign achieves significant improvements with multiple cognitive features over state-of-the-art models on public datasets.
arXiv Detail & Related papers (2021-06-10T07:10:25Z) - You Impress Me: Dialogue Generation via Mutual Persona Perception [62.89449096369027]
The research in cognitive science suggests that understanding is an essential signal for a high-quality chit-chat conversation.
Motivated by this, we propose P2 Bot, a transmitter-receiver based framework with the aim of explicitly modeling understanding.
arXiv Detail & Related papers (2020-04-11T12:51:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.