Generating Emotionally Aligned Responses in Dialogues using Affect Control Theory
- URL: http://arxiv.org/abs/2003.03645v2
- Date: Thu, 16 Apr 2020 06:46:25 GMT
- Title: Generating Emotionally Aligned Responses in Dialogues using Affect Control Theory
- Authors: Nabiha Asghar, Ivan Kobyzev, Jesse Hoey, Pascal Poupart, and Muhammad Bilal Sheikh
- Abstract summary: Affect Control Theory (ACT) is a socio-mathematical model of emotions for human-human interactions.
We investigate how ACT can be used to develop affect-aware neural conversational agents.
- Score: 15.848210524718219
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: State-of-the-art neural dialogue systems excel at syntactic and semantic
modelling of language, but often have a hard time establishing emotional
alignment with the human interactant during a conversation. In this work, we
bring Affect Control Theory (ACT), a socio-mathematical model of emotions for
human-human interactions, to the neural dialogue generation setting. ACT makes
predictions about how humans respond to emotional stimuli in social situations.
Due to this property, ACT and its derivative probabilistic models have been
successfully deployed in several applications of Human-Computer Interaction,
including empathetic tutoring systems, assistive healthcare devices and
two-person social dilemma games. We investigate how ACT can be used to develop
affect-aware neural conversational agents, which produce emotionally aligned
responses to prompts and take into consideration the affective identities of
the interactants.
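To make the ACT machinery concrete, the minimal sketch below computes deflection, ACT's core quantity: identities and behaviours carry 3-dimensional Evaluation-Potency-Activity (EPA) sentiments, an event shifts transient impressions away from those fundamental sentiments, and an agent prefers the behaviour that minimizes the resulting squared distance. The EPA values and the impression-formation rule here are illustrative toys; real ACT implementations use empirically estimated sentiment dictionaries and regression equations.

```python
# Minimal sketch of ACT's deflection minimization.
# The deflection formula follows standard ACT; the EPA values and the
# impression-formation rule are TOY stand-ins for the empirically
# estimated dictionaries and regressions used by real ACT tools.
import numpy as np

# Fundamental sentiments: Evaluation, Potency, Activity (illustrative values).
IDENTITIES = {
    "tutor":   np.array([1.5, 1.4, 0.3]),
    "student": np.array([1.2, 0.2, 0.7]),
}
BEHAVIOURS = {
    "encourage": np.array([2.2, 1.5, 0.9]),
    "scold":     np.array([-1.4, 1.0, 0.9]),
    "ignore":    np.array([-1.8, 0.2, -1.0]),
}

def transient_impression(actor, behaviour, obj):
    """Toy impression-formation rule: the post-event impressions of actor
    and object are pulled toward the behaviour's EPA profile. Real ACT
    uses empirically estimated regression equations instead."""
    pull = 0.5
    return np.concatenate([
        actor + pull * (behaviour - actor),  # impression of the actor
        behaviour,                           # impression of the behaviour
        obj + pull * (behaviour - obj),      # impression of the object
    ])

def deflection(actor, behaviour, obj):
    """ACT deflection: squared distance between fundamental sentiments
    and the post-event transient impressions."""
    fundamentals = np.concatenate([actor, behaviour, obj])
    transients = transient_impression(actor, behaviour, obj)
    return float(np.sum((fundamentals - transients) ** 2))

# An ACT agent picks the behaviour that least deflects the situation's
# affective meanings, here by brute force over a tiny lexicon.
actor, obj = IDENTITIES["tutor"], IDENTITIES["student"]
best = min(BEHAVIOURS, key=lambda b: deflection(actor, BEHAVIOURS[b], obj))
print("least deflecting behaviour:", best)
for name, epa in BEHAVIOURS.items():
    print(f"  {name}: deflection = {deflection(actor, epa, obj):.2f}")
```

On these toy numbers the tutor's least-deflecting act toward the student is "encourage", which is the kind of emotionally aligned choice the paper aims to reproduce in neural response generation.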
Related papers
- CAPE: A Chinese Dataset for Appraisal-based Emotional Generation using Large Language Models [30.40159858361768]
We introduce a two-stage automatic data generation framework to create CAPE, a Chinese Cognitive Appraisal theory-based Emotional corpus.
This corpus facilitates the generation of dialogues with contextually appropriate emotional responses by accounting for diverse personal and situational factors.
Our study shows the potential for advancing emotional expression in conversational agents, paving the way for more nuanced and meaningful human-computer interactions.
arXiv Detail & Related papers (2024-10-18T03:33:18Z)
- SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations [53.60993109543582]
SemEval-2024 Task 3, named Multimodal Emotion Cause Analysis in Conversations, aims at extracting all pairs of emotions and their corresponding causes from conversations.
Under different modality settings, it consists of two subtasks: Textual Emotion-Cause Pair Extraction in Conversations (TECPE) and Multimodal Emotion-Cause Pair Extraction in Conversations (MECPE).
In this paper, we introduce the task, dataset and evaluation settings, summarize the systems of the top teams, and discuss the findings of the participants.
arXiv Detail & Related papers (2024-05-19T09:59:00Z)
- Personality-affected Emotion Generation in Dialog Systems [67.40609683389947]
We propose a new task, Personality-affected Emotion Generation, to generate emotion based on the personality given to the dialog system.
We analyze the challenges in this task, i.e., (1) heterogeneously integrating personality and emotional factors and (2) extracting multi-granularity emotional information in the dialog context.
Results suggest that our method improves emotion generation performance by 13% in macro-F1 and 5% in weighted-F1 over the BERT-base model.
arXiv Detail & Related papers (2024-04-03T08:48:50Z)
- Emotion-Oriented Behavior Model Using Deep Learning [0.9176056742068812]
The accuracy of emotion-based behavior predictions is statistically validated using a two-tailed Pearson correlation test.
This study is a stepping stone toward multi-faceted artificial-agent interaction based on emotion-oriented behaviors.
arXiv Detail & Related papers (2023-10-28T17:27:59Z)
- Think Twice: A Human-like Two-stage Conversational Agent for Emotional Response Generation [16.659457455269127]
We propose a two-stage conversational agent for emotional dialogue generation; a pipeline sketch appears after this list.
First, a dialogue model trained without an emotion-annotated dialogue corpus generates a prototype response that fits the contextual semantics.
Second, a controllable emotion refiner modifies the first-stage prototype according to the empathy hypothesis.
arXiv Detail & Related papers (2023-01-12T10:03:56Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- A MultiModal Social Robot Toward Personalized Emotion Interaction [1.2183405753834562]
This study demonstrates a multimodal human-robot interaction (HRI) framework that uses reinforcement learning to enhance the robot's interaction policy.
The goal is to apply this framework in social scenarios so that robots can produce more natural and engaging interactions.
arXiv Detail & Related papers (2021-10-08T00:35:44Z)
- Emotion-aware Chat Machine: Automatic Emotional Response Generation for Human-like Emotional Interaction [55.47134146639492]
This article proposes a unified end-to-end neural architecture, which is capable of simultaneously encoding the semantics and the emotions in a post.
Experiments on real-world data demonstrate that the proposed method outperforms the state-of-the-art methods in terms of both content coherence and emotion appropriateness.
arXiv Detail & Related papers (2021-06-06T06:26:15Z)
- Disambiguating Affective Stimulus Associations for Robot Perception and Dialogue [67.89143112645556]
We provide a NICO robot with the ability to learn the associations between a perceived auditory stimulus and an emotional expression.
NICO is able to do this for both individual subjects and specific stimuli, with the aid of an emotion-driven dialogue system.
The robot is then able to use this information to determine a subject's enjoyment of perceived auditory stimuli in a real HRI scenario.
arXiv Detail & Related papers (2021-03-05T20:55:48Z)
- Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947]
Lack of external knowledge makes it difficult for empathetic dialogue systems to perceive implicit emotions and to learn emotional interactions from limited dialogue history.
We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
arXiv Detail & Related papers (2020-09-21T09:21:52Z)
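As a companion to the Think Twice entry above, here is a minimal runnable skeleton of a generate-then-refine pipeline. Both stages are deliberately hypothetical stubs (stage1_prototype, stage2_refine, and the emotion markers are inventions for illustration); in the actual paper each stage is a trained neural model.

```python
# Toy skeleton of a two-stage "generate then refine" emotional responder.
# Both stages are hypothetical stubs standing in for trained models.
from dataclasses import dataclass

@dataclass
class Turn:
    speaker: str
    text: str

def stage1_prototype(context: list[Turn]) -> str:
    """Stage 1 (stub): a context-only dialogue model would propose a
    semantically adequate prototype response here."""
    return "That sounds like a big moment for you."

def stage2_refine(prototype: str, target_emotion: str) -> str:
    """Stage 2 (stub): a controllable emotion refiner would rewrite the
    prototype toward the target emotion; a template stands in for it."""
    markers = {
        "joy": "Congratulations! ",
        "sadness": "I'm sorry to hear that. ",
    }
    return markers.get(target_emotion, "") + prototype

context = [Turn("user", "I finally passed my exam!")]
print(stage2_refine(stage1_prototype(context), "joy"))
```

The value of the split is that emotion control is isolated in the second stage, so the first stage can be trained on plain dialogue data without emotion annotations.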
This list is automatically generated from the titles and abstracts of the papers in this site.