Survey and Perspective on Social Emotions in Robotics
- URL: http://arxiv.org/abs/2105.09647v1
- Date: Thu, 20 May 2021 10:25:37 GMT
- Title: Survey and Perspective on Social Emotions in Robotics
- Authors: Chie Hieida and Takayuki Nagai
- Abstract summary: In robotics, emotions have long been pursued through work on recognition, expression, and computational modeling.
Social emotions, also called higher-level emotions, have been studied in psychology.
We believe that these higher-level emotions are worth pursuing in robotics for next-generation social-aware robots.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This study reviews research on social emotions in robotics. In robotics,
emotions have long been pursued through work on recognition, expression, and
computational modeling of the basic mechanisms behind them. Research has been
promoted according to well-known psychological findings, such as category and
dimension theories. Many studies have been based on these basic theories,
addressing only basic emotions. However, social emotions, also called
higher-level emotions, have been studied in psychology. We believe that these
higher-level emotions are worth pursuing in robotics for next-generation
social-aware robots. This review paper summarizes findings on social emotions in
psychology and neuroscience and surveys current studies on social emotions in
robotics. Research directions towards the implementation of social emotions in
robots are then discussed.
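The abstract contrasts category theories (discrete basic emotions) with dimension theories (emotions as points in a continuous space such as Russell's valence-arousal circumplex). A minimal sketch of the two representations follows; the coordinates are illustrative assumptions, not values from the survey.

```python
# Category theory: emotions as discrete labels (e.g. the basic six).
# Dimension theory: emotions as points in a continuous valence-arousal
# space. Coordinates below are illustrative, not taken from the survey.

VALENCE_AROUSAL = {
    "joy":      ( 0.8,  0.6),
    "sadness":  (-0.7, -0.4),
    "anger":    (-0.6,  0.8),
    "fear":     (-0.8,  0.7),
    "surprise": ( 0.2,  0.9),
    "disgust":  (-0.7,  0.3),
}

def nearest_category(valence: float, arousal: float) -> str:
    """Map a point in dimension space back to a categorical label."""
    return min(
        VALENCE_AROUSAL,
        key=lambda e: (VALENCE_AROUSAL[e][0] - valence) ** 2
                      + (VALENCE_AROUSAL[e][1] - arousal) ** 2,
    )
```

Mapping between the two representations like this is a common bridge when a robot's internal state is continuous but its expression repertoire is categorical.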
Related papers
- Dealing with Controversy: An Emotion and Coping Strategy Corpus Based on Role Playing [14.255172744243541]
Many emotion fundamentals remain under-explored in natural language processing.
We treat emotions as strategies to cope with salient situations.
We introduce the task of coping identification, together with a corpus to do so, constructed via role-playing.
arXiv Detail & Related papers (2024-09-26T06:49:54Z)
- The Good, The Bad, and Why: Unveiling Emotions in Generative AI [73.94035652867618]
We show that EmotionPrompt can boost the performance of AI models while EmotionAttack can hinder it.
EmotionDecode reveals that AI models can comprehend emotional stimuli akin to the mechanism of dopamine in the human brain.
arXiv Detail & Related papers (2023-12-18T11:19:45Z)
- Developing Social Robots with Empathetic Non-Verbal Cues Using Large Language Models [2.5489046505746704]
We design and label four types of empathetic non-verbal cues, abbreviated as SAFE: Speech, Action (gesture), Facial expression, and Emotion, in a social robot.
Preliminary results show distinct patterns in the robot's responses, such as a preference for calm and positive social emotions like 'joy' and 'lively', and frequent nodding gestures.
Our work lays the groundwork for future studies on human-robot interactions, emphasizing the essential role of both verbal and non-verbal cues in creating social and empathetic robots.
arXiv Detail & Related papers (2023-08-31T08:20:04Z)
- Speech Synthesis with Mixed Emotions [77.05097999561298]
We propose a novel formulation that measures the relative difference between the speech samples of different emotions.
We then incorporate our formulation into a sequence-to-sequence emotional text-to-speech framework.
At run-time, we control the model to produce the desired emotion mixture by manually defining an emotion attribute vector.
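The "emotion attribute vector" above can be sketched as a normalized per-emotion weighting that is manually defined at run time. This is a hypothetical illustration of the idea; the emotion inventory and function name are assumptions, not the paper's API.

```python
import numpy as np

# Hypothetical sketch of an emotion attribute vector: each slot holds
# the relative intensity of one emotion, and a mixture is a manually
# defined weighting. The emotion inventory below is an assumption.

EMOTIONS = ["neutral", "happy", "sad", "angry", "surprise"]

def emotion_attribute_vector(weights: dict) -> np.ndarray:
    """Build a normalized attribute vector from {emotion: weight}."""
    vec = np.array([weights.get(e, 0.0) for e in EMOTIONS], dtype=float)
    total = vec.sum()
    if total <= 0:
        raise ValueError("at least one positive weight is required")
    return vec / total

# e.g. a 70/30 happy-surprise blend fed to the TTS model at run time:
mix = emotion_attribute_vector({"happy": 0.7, "surprise": 0.3})
```

Normalizing the weights keeps the overall expressive intensity comparable across different mixtures.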
arXiv Detail & Related papers (2022-08-11T15:45:58Z)
- HICEM: A High-Coverage Emotion Model for Artificial Emotional Intelligence [9.153146173929935]
Next-generation artificial emotional intelligence (AEI) is taking center stage to address users' desire for deeper, more meaningful human-machine interaction.
Unlike theories of emotion, which have been the historical focus in psychology, emotion models are descriptive tools.
This work has broad implications in social robotics, human-machine interaction, mental healthcare, and computational psychology.
arXiv Detail & Related papers (2022-06-15T15:21:30Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- A MultiModal Social Robot Toward Personalized Emotion Interaction [1.2183405753834562]
This study demonstrates a multimodal human-robot interaction (HRI) framework with reinforcement learning to enhance the robotic interaction policy.
The goal is to apply this framework in social scenarios so that robots can generate more natural and engaging interactions.
arXiv Detail & Related papers (2021-10-08T00:35:44Z)
- Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of face muscles movements.
We determine if there are time-related differences on expressions among emotional groups by using a functional F-test.
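The functional F-test above can be approximated, for illustration, by computing a one-way ANOVA F statistic at each sampled time point across emotion groups. This is a simplified pointwise sketch on synthetic data, not the authors' Functional ANOVA implementation, which treats whole curves as functional data.

```python
import numpy as np

# Simplified, pointwise analogue of a functional F-test: at each
# sampled time point, compute a one-way ANOVA F statistic across
# emotion groups of muscle-movement curves. Synthetic data only.

def f_statistic(samples):
    """One-way ANOVA F statistic for a list of 1-D sample arrays."""
    k = len(samples)                          # number of groups
    n = sum(len(s) for s in samples)          # total observations
    grand = np.concatenate(samples).mean()
    ssb = sum(len(s) * (s.mean() - grand) ** 2 for s in samples)
    ssw = sum(((s - s.mean()) ** 2).sum() for s in samples)
    return (ssb / (k - 1)) / (ssw / (n - k))

rng = np.random.default_rng(0)
T = 50                                        # time samples per curve
groups = {                                    # 10 synthetic curves per emotion
    "happy": rng.normal(size=(10, T)) + np.linspace(0, 2, T),
    "sad":   rng.normal(size=(10, T)),
    "angry": rng.normal(size=(10, T)) - np.linspace(0, 1, T),
}

# Large F at a time point flags a difference among emotion groups there.
f_stats = np.array([
    f_statistic([curves[:, t] for curves in groups.values()])
    for t in range(T)
])
```

A proper functional F-test aggregates over the whole time domain with smoothness assumptions; the pointwise version here only conveys the intuition of "time-related differences among emotional groups".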
arXiv Detail & Related papers (2021-03-01T08:31:08Z)
- MIME: MIMicking Emotions for Empathetic Response Generation [82.57304533143756]
Current approaches to empathetic response generation view the set of emotions expressed in the input text as a flat structure.
We argue that empathetic responses often mimic the emotion of the user to a varying degree, depending on its positivity or negativity and content.
arXiv Detail & Related papers (2020-10-04T00:35:47Z)
- Modeling emotion for human-like behavior in future intelligent robots [0.913755431537592]
We show how neuroscience can help advance the current state of the art.
We argue that a stronger integration of emotion-related processes in robot models is critical for the design of human-like behavior.
arXiv Detail & Related papers (2020-09-30T17:32:30Z)
- ProxEmo: Gait-based Emotion Learning and Multi-view Proxemic Fusion for Socially-Aware Robot Navigation [65.11858854040543]
We present ProxEmo, a novel end-to-end emotion prediction algorithm for robot navigation among pedestrians.
Our approach predicts the perceived emotions of a pedestrian from walking gaits, which is then used for emotion-guided navigation.
arXiv Detail & Related papers (2020-03-02T17:47:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.