Disambiguating Affective Stimulus Associations for Robot Perception and
Dialogue
- URL: http://arxiv.org/abs/2103.03940v1
- Date: Fri, 5 Mar 2021 20:55:48 GMT
- Title: Disambiguating Affective Stimulus Associations for Robot Perception and
Dialogue
- Authors: Henrique Siqueira, Alexander Sutherland, Pablo Barros, Mattias Kerzel,
Sven Magg, Stefan Wermter
- Abstract summary: We provide a NICO robot with the ability to learn the associations between a perceived auditory stimulus and an emotional expression.
NICO is able to do this for both individual subjects and specific stimuli, with the aid of an emotion-driven dialogue system.
The robot is then able to use this information to determine a subject's enjoyment of perceived auditory stimuli in a real HRI scenario.
- Score: 67.89143112645556
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Effectively recognising and applying emotions to interactions is a highly
desirable trait for social robots. Implicitly understanding how subjects
experience different kinds of actions and objects in the world is crucial for
natural HRI, enabling the robot to perform positive actions and
avoid negative ones. In this paper, we utilize the NICO robot's appearance
and capabilities to give NICO the ability to model a coherent affective
association between a perceived auditory stimulus and a temporally asynchronous
emotion expression. This is done by combining evaluations of emotional valence
from vision and language. NICO uses this information to make decisions about
when to extend conversations in order to accrue more affective information if
the representation of the association is not coherent. Our primary contribution
is providing a NICO robot with the ability to learn the affective associations
between a perceived auditory stimulus and an emotional expression. NICO is able
to do this for both individual subjects and specific stimuli, with the aid of
an emotion-driven dialogue system that rectifies emotional expression
incoherences. The robot is then able to use this information to determine a
subject's enjoyment of perceived auditory stimuli in a real HRI scenario.
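As a rough illustration of the decision mechanism described in the abstract, the robot could fuse valence estimates from vision and language and extend the conversation only when the two modalities disagree. This is a minimal sketch under assumed conventions; the function names, the linear fusion, and the coherence threshold are hypothetical and not taken from the paper.

```python
def combined_valence(vision_valence, language_valence, w_vision=0.5):
    """Fuse two valence estimates in [-1, 1] into a single score.

    A simple weighted average stands in for whatever fusion the
    paper's system actually uses (hypothetical).
    """
    return w_vision * vision_valence + (1 - w_vision) * language_valence


def is_coherent(vision_valence, language_valence, threshold=0.5):
    """Treat the association as coherent when the modalities roughly agree."""
    return abs(vision_valence - language_valence) <= threshold


def dialogue_step(vision_valence, language_valence):
    """Decide whether to record the association or ask a follow-up question.

    Returns ("record_association", fused_valence) when the estimates are
    coherent, otherwise ("ask_follow_up", None) to accrue more
    affective information, mirroring the behaviour sketched above.
    """
    if is_coherent(vision_valence, language_valence):
        return ("record_association",
                combined_valence(vision_valence, language_valence))
    return ("ask_follow_up", None)
```

With these assumed values, agreeing estimates such as (0.8, 0.6) are recorded, while conflicting ones such as (0.9, -0.4) trigger a follow-up question to gather more affective evidence.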
Related papers
- SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations [53.60993109543582]
SemEval-2024 Task 3, named Multimodal Emotion Cause Analysis in Conversations, aims at extracting all pairs of emotions and their corresponding causes from conversations.
Under different modality settings, it consists of two subtasks: Textual Emotion-Cause Pair Extraction in Conversations (TECPE) and Multimodal Emotion-Cause Pair Extraction in Conversations (MECPE).
In this paper, we introduce the task, dataset and evaluation settings, summarize the systems of the top teams, and discuss the findings of the participants.
arXiv Detail & Related papers (2024-05-19T09:59:00Z)
- ECR-Chain: Advancing Generative Language Models to Better Emotion-Cause Reasoners through Reasoning Chains [61.50113532215864]
Causal Emotion Entailment (CEE) aims to identify the causal utterances in a conversation that stimulate the emotions expressed in a target utterance.
Current works in CEE mainly focus on modeling semantic and emotional interactions in conversations.
We introduce a step-by-step reasoning method, Emotion-Cause Reasoning Chain (ECR-Chain), to infer the stimulus from the target emotional expressions in conversations.
arXiv Detail & Related papers (2024-05-17T15:45:08Z)
- Emotion Flip Reasoning in Multiparty Conversations [27.884015521888458]
Instigator based Emotion Flip Reasoning (EFR) aims to identify the instigator behind a speaker's emotion flip within a conversation.
We present MELD-I, a dataset that includes ground-truth EFR instigator labels, which are in line with emotional psychology.
We propose a novel neural architecture called TGIF, which leverages Transformer encoders and stacked GRUs to capture the dialogue context.
arXiv Detail & Related papers (2023-06-24T13:22:02Z)
- Nonverbal Cues in Human-Robot Interaction: A Communication Studies Perspective [19.67112387337872]
Communication between people is characterized by a broad range of nonverbal cues.
We offer definitive nonverbal codes for human-robot interaction (HRI).
We argue that integrating robotic nonverbal codes in HRI will afford robots a feeling of "aliveness" or "social agency".
arXiv Detail & Related papers (2023-04-22T02:15:48Z)
- Empathetic Dialogue Generation via Sensitive Emotion Recognition and Sensible Knowledge Selection [47.60224978460442]
We propose a Serial and Emotion-Knowledge interaction (SEEK) method for empathetic dialogue generation.
We use a fine-grained encoding strategy that is more sensitive to the emotion dynamics (emotion flow) in conversations to predict the emotion-intent characteristics of responses. In addition, we design a novel framework to model the interaction between knowledge and emotion, generating more sensible responses.
arXiv Detail & Related papers (2022-10-21T03:51:18Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- A MultiModal Social Robot Toward Personalized Emotion Interaction [1.2183405753834562]
This study demonstrates a multimodal human-robot interaction (HRI) framework with reinforcement learning to enhance the robotic interaction policy.
The goal is to apply this framework in social scenarios so that robots can produce more natural and engaging interactions.
arXiv Detail & Related papers (2021-10-08T00:35:44Z)
- Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947]
A lack of external knowledge makes it difficult for empathetic dialogue systems to perceive implicit emotions and learn emotional interactions from limited dialogue history.
We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
arXiv Detail & Related papers (2020-09-21T09:21:52Z)
- Generating Emotionally Aligned Responses in Dialogues using Affect Control Theory [15.848210524718219]
Affect Control Theory (ACT) is a socio-mathematical model of emotions for human-human interactions.
We investigate how ACT can be used to develop affect-aware neural conversational agents.
arXiv Detail & Related papers (2020-03-07T19:31:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.