Wish I Can Feel What You Feel: A Neural Approach for Empathetic Response Generation
- URL: http://arxiv.org/abs/2212.02000v1
- Date: Mon, 5 Dec 2022 03:20:37 GMT
- Title: Wish I Can Feel What You Feel: A Neural Approach for Empathetic Response Generation
- Authors: Yangbin Chen and Chunfeng Liang
- Abstract summary: We propose a novel approach that integrates three components: emotion cause, knowledge graph, and communication mechanism, for empathetic response generation.
Experimental results show that incorporating these key components generates more informative and empathetic responses.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Expressing empathy is important in everyday conversations, and exploring how
empathy arises is crucial in automatic response generation. Most previous
approaches consider only a single factor that affects empathy. In practice,
however, empathy generation and expression are a complex and dynamic
psychological process. A listener needs to identify the events that cause the
speaker's emotions (emotion cause extraction), project those events onto their own
experience (knowledge extension), and express empathy in the most appropriate
way (communication mechanism). To this end, we propose a novel approach that
integrates three components: emotion cause, knowledge graph, and
communication mechanism, for empathetic response generation. Experimental
results on the benchmark dataset demonstrate the effectiveness of our method
and show that incorporating these key components generates more informative and
empathetic responses.
Related papers
- APTNESS: Incorporating Appraisal Theory and Emotion Support Strategies for Empathetic Response Generation [71.26755736617478]
Empathetic response generation is designed to comprehend the emotions of others.
We develop a framework that combines retrieval augmentation and emotional support strategy integration.
Our framework can enhance the empathy ability of LLMs from both cognitive and affective empathy perspectives.
arXiv Detail & Related papers (2024-07-23T02:23:37Z)
- Empathetic Response Generation via Emotion Cause Transition Graph [29.418144401849194]
Empathetic dialogue is a human-like behavior that requires the perception of both affective factors (e.g., emotion status) and cognitive factors (e.g., cause of the emotion).
We propose an emotion cause transition graph to explicitly model the natural transition of emotion causes between two adjacent turns in empathetic dialogue.
With this graph, the concept words of the emotion causes in the next turn can be predicted and used by a specifically designed concept-aware decoder to generate the empathic response.
arXiv Detail & Related papers (2023-02-23T05:51:17Z)
- Empathetic Dialogue Generation via Sensitive Emotion Recognition and Sensible Knowledge Selection [47.60224978460442]
We propose a Serial and Emotion-Knowledge interaction (SEEK) method for empathetic dialogue generation.
We use a fine-grained encoding strategy that is more sensitive to the emotion dynamics (emotion flow) in conversations to predict the emotion-intent characteristic of the response. In addition, we design a novel framework to model the interaction between knowledge and emotion to generate more sensible responses.
arXiv Detail & Related papers (2022-10-21T03:51:18Z)
- CASE: Aligning Coarse-to-Fine Cognition and Affection for Empathetic Response Generation [59.8935454665427]
Empathetic dialogue models usually consider only the affective aspect or treat cognition and affection in isolation.
We propose the CASE model for empathetic dialogue generation.
arXiv Detail & Related papers (2022-08-18T14:28:38Z)
- CEM: Commonsense-aware Empathetic Response Generation [31.956147246779423]
We propose a novel approach for empathetic response generation that leverages commonsense to draw more information about the user's situation.
We evaluate our approach on EmpatheticDialogues, which is a widely-used benchmark dataset for empathetic response generation.
arXiv Detail & Related papers (2021-09-13T06:55:14Z)
- Exemplars-guided Empathetic Response Generation Controlled by the Elements of Human Communication [88.52901763928045]
We propose an approach that relies on exemplars to cue the generative model on fine stylistic properties that signal empathy to the interlocutor.
We empirically show that these approaches yield significant improvements in empathetic response quality in terms of both automated and human-evaluated metrics.
arXiv Detail & Related papers (2021-06-22T14:02:33Z)
- Emotion-aware Chat Machine: Automatic Emotional Response Generation for Human-like Emotional Interaction [55.47134146639492]
This article proposes a unified end-to-end neural architecture that is capable of simultaneously encoding the semantics and the emotions in a post.
Experiments on real-world data demonstrate that the proposed method outperforms the state-of-the-art methods in terms of both content coherence and emotion appropriateness.
arXiv Detail & Related papers (2021-06-06T06:26:15Z)
- MIME: MIMicking Emotions for Empathetic Response Generation [82.57304533143756]
Current approaches to empathetic response generation view the set of emotions expressed in the input text as a flat structure.
We argue that empathetic responses often mimic the emotion of the user to a varying degree, depending on its positivity or negativity and content.
arXiv Detail & Related papers (2020-10-04T00:35:47Z)
- Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947]
Without external knowledge, empathetic dialogue systems have difficulty perceiving implicit emotions and learning emotional interactions from limited dialogue history.
We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
arXiv Detail & Related papers (2020-09-21T09:21:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.