Modeling emotion for human-like behavior in future intelligent robots
- URL: http://arxiv.org/abs/2009.14810v2
- Date: Tue, 19 Jul 2022 13:00:46 GMT
- Title: Modeling emotion for human-like behavior in future intelligent robots
- Authors: Marwen Belkaid and Luiz Pessoa
- Abstract summary: We show how neuroscience can help advance the current state of the art.
We argue that a stronger integration of emotion-related processes in robot models is critical for the design of human-like behavior.
- Score: 0.913755431537592
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Over the past decades, research in cognitive and affective neuroscience has
emphasized that emotion is crucial for human intelligence and in fact
inseparable from cognition. Concurrently, there has been growing interest in
simulating and modeling emotion-related processes in robots and artificial
agents. In this opinion paper, our goal is to provide a snapshot of the present
landscape in emotion modeling and to show how neuroscience can help advance the
current state of the art. We start with an overview of the existing literature
on emotion modeling in three areas of research: affective computing, social
robotics, and neurorobotics. Briefly summarizing the current state of knowledge
on natural emotion, we then highlight how existing proposals in artificial
emotion do not make sufficient contact with neuroscientific evidence. We
conclude by providing a set of principles to help guide future research in
artificial emotion and intelligent machines more generally. Overall, we argue
that a stronger integration of emotion-related processes in robot models is
critical for the design of human-like behavior in future intelligent machines.
Such integration will not only contribute to the development of autonomous
social machines capable of tackling real-world problems but will also advance
our understanding of human emotion.
Related papers
- The Good, The Bad, and Why: Unveiling Emotions in Generative AI [73.94035652867618]
We show that EmotionPrompt can boost the performance of AI models while EmotionAttack can hinder it.
EmotionDecode reveals that AI models can comprehend emotional stimuli akin to the mechanism of dopamine in the human brain.
arXiv Detail & Related papers (2023-12-18T11:19:45Z) - World Models and Predictive Coding for Cognitive and Developmental
Robotics: Frontiers and Challenges [51.92834011423463]
We focus on the two concepts of world models and predictive coding.
In neuroscience, predictive coding proposes that the brain continuously predicts its inputs and adapts to model its own dynamics and control behavior in its environment (see the short sketch after this list).
arXiv Detail & Related papers (2023-01-14T06:38:14Z) - HICEM: A High-Coverage Emotion Model for Artificial Emotional
Intelligence [9.153146173929935]
Next-generation artificial emotional intelligence (AEI) is taking center stage to address users' desire for deeper, more meaningful human-machine interaction.
Unlike theories of emotion, which have been the historical focus in psychology, emotion models are descriptive tools.
This work has broad implications in social robotics, human-machine interaction, mental healthcare, and computational psychology.
arXiv Detail & Related papers (2022-06-15T15:21:30Z) - Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z) - Emotion-aware Chat Machine: Automatic Emotional Response Generation for
Human-like Emotional Interaction [55.47134146639492]
This article proposes a unified end-to-end neural architecture, which is capable of simultaneously encoding the semantics and the emotions in a post.
Experiments on real-world data demonstrate that the proposed method outperforms the state-of-the-art methods in terms of both content coherence and emotion appropriateness.
arXiv Detail & Related papers (2021-06-06T06:26:15Z) - Survey and Perspective on Social Emotions in Robotics [0.0]
In robotics, emotion has long been pursued in areas such as recognition, expression, and computational modeling.
Social emotions, also called higher-level emotions, have been studied in psychology.
We believe that these higher-level emotions are worth pursuing in robotics for next-generation social-aware robots.
arXiv Detail & Related papers (2021-05-20T10:25:37Z) - Neuroscience-inspired perception-action in robotics: applying active
inference for state estimation, control and self-perception [2.1067139116005595]
We discuss how neuroscience findings open up opportunities to improve current estimation and control algorithms in robotics.
This paper summarizes some experiments and lessons learned from developing such a computational model on real embodied platforms.
arXiv Detail & Related papers (2021-05-10T10:59:38Z) - Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and shows the state-of-the-art result for classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z) - Disambiguating Affective Stimulus Associations for Robot Perception and
Dialogue [67.89143112645556]
We provide a NICO robot with the ability to learn the associations between a perceived auditory stimulus and an emotional expression.
NICO is able to do this for both individual subjects and specific stimuli, with the aid of an emotion-driven dialogue system.
The robot is then able to use this information to determine a subject's enjoyment of perceived auditory stimuli in a real HRI scenario.
arXiv Detail & Related papers (2021-03-05T20:55:48Z) - Sensorimotor representation learning for an "active self" in robots: A
model survey [10.649413494649293]
In humans, such self-related capabilities are thought to be tied to our ability to perceive our own body in space.
This paper reviews the developmental processes of underlying mechanisms of these abilities.
We propose a theoretical computational framework, which aims to allow the emergence of the sense of self in artificial agents.
arXiv Detail & Related papers (2020-11-25T16:31:01Z) - Simulation of Human and Artificial Emotion (SHArE) [0.0]
The framework for Simulation of Human and Artificial Emotion (SHArE) describes the architecture of emotion in terms of parameters transferable between neuroscience, psychology, and artificial intelligence.
This model enables emotional trajectory design for humans which may lead to novel therapeutic solutions for various mental health concerns.
For artificial intelligence, this work provides a compact notation which can be applied to neural networks as a means to observe the emotions and motivations of machines.
arXiv Detail & Related papers (2020-11-04T06:45:30Z)
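As a concrete illustration of the predictive-coding idea referenced in the world-models entry above, below is a minimal, hypothetical sketch in Python/NumPy of a prediction-error-minimization loop: an agent predicts its input from a latent state, computes the prediction error, and updates its belief (fast) and its generative weights (slow) to reduce that error. The toy linear generative model, dimensions, and learning rates are assumptions made purely for illustration and do not reproduce any specific model from the papers listed here.

```python
# Illustrative predictive-coding sketch (toy linear model, assumed for this example):
# an agent keeps a latent state z and generative weights W, and on each observation
# (1) predicts the input, (2) computes the prediction error, and
# (3) updates z (fast inference) and W (slow learning) to reduce that error.
import numpy as np

rng = np.random.default_rng(0)

n_obs, n_latent = 8, 3                               # input and latent dimensions (arbitrary)
W = rng.normal(scale=0.1, size=(n_obs, n_latent))    # generative weights (the agent's model)
z = np.zeros(n_latent)                               # current latent (belief) state
lr_state, lr_weights = 0.1, 0.01                     # fast vs. slow learning rates

def predictive_coding_step(x, z, W):
    """One predict / error / update cycle for a single observation x."""
    x_hat = W @ z                                    # top-down prediction of the input
    error = x - x_hat                                # bottom-up prediction error
    z = z + lr_state * (W.T @ error)                 # fast inference: adjust belief to explain x
    W = W + lr_weights * np.outer(error, z)          # slow learning: adapt the generative model
    return z, W, float(np.mean(error ** 2))

# Drive the loop with noisy observations generated from a fixed hidden cause;
# the printed mean squared prediction error should shrink over time.
true_z = np.array([1.0, -0.5, 0.25])
true_W = rng.normal(size=(n_obs, n_latent))
for t in range(500):
    x = true_W @ true_z + 0.01 * rng.normal(size=n_obs)
    z, W, mse = predictive_coding_step(x, z, W)
    if t % 100 == 0:
        print(f"step {t:3d}  prediction error (MSE) = {mse:.4f}")
```

Running this loop shows the prediction error decreasing as the agent's model adapts to its inputs, which is the core behavior that predictive-coding accounts attribute to the brain; it is a sketch of the general idea only, not of the architectures proposed in the cited works.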