Bodily expressed emotion understanding through integrating Laban movement analysis
- URL: http://arxiv.org/abs/2304.02187v1
- Date: Wed, 5 Apr 2023 02:07:15 GMT
- Title: Bodily expressed emotion understanding through integrating Laban movement analysis
- Authors: Chenyan Wu, Dolzodmaa Davaasuren, Tal Shafir, Rachelle Tsachor, James Z. Wang
- Abstract summary: This study develops a high-quality human motor element dataset based on the Laban Movement Analysis movement coding system.
Our long-term ambition is to integrate knowledge from computing, psychology, and performing arts to enable automated understanding and analysis of emotion and mental state through body language.
- Score: 7.73546354173679
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Body movements carry important information about a person's emotions or
mental state and are essential in daily communication. Enhancing the ability of
machines to understand emotions expressed through body language can improve the
communication of assistive robots with children and elderly users, provide
psychiatric professionals with quantitative diagnostic and prognostic
assistance, and aid law enforcement in identifying deception. This study
develops a high-quality human motor element dataset based on the Laban Movement
Analysis movement coding system and utilizes that to jointly learn about motor
elements and emotions. Our long-term ambition is to integrate knowledge from
computing, psychology, and performing arts to enable automated understanding
and analysis of emotion and mental state through body language. This work
serves as a launchpad for further research into recognizing emotions through
analysis of human movement.
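The abstract describes jointly learning motor elements and emotions from the same body-movement features. A minimal multi-task sketch of that idea is below, in PyTorch: a shared encoder feeds two heads, one multi-label head for Laban motor elements and one classification head for emotion. The layer sizes, label counts, and summed loss weighting are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical multi-task sketch: jointly predicting LMA motor elements
# (multi-label) and an emotion class from shared motion features.
# All dimensions below are assumptions for illustration.
import torch
import torch.nn as nn

N_MOTOR_ELEMENTS = 16   # assumed number of motor-element labels
N_EMOTIONS = 4          # assumed number of emotion classes
FEAT_DIM = 64           # assumed motion-feature dimensionality

class JointModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared trunk over extracted body-motion features.
        self.encoder = nn.Sequential(nn.Linear(FEAT_DIM, 128), nn.ReLU())
        # Two task heads trained together.
        self.motor_head = nn.Linear(128, N_MOTOR_ELEMENTS)  # multi-label logits
        self.emotion_head = nn.Linear(128, N_EMOTIONS)      # class logits

    def forward(self, x):
        h = self.encoder(x)
        return self.motor_head(h), self.emotion_head(h)

model = JointModel()
x = torch.randn(8, FEAT_DIM)                               # batch of features
motor_y = torch.randint(0, 2, (8, N_MOTOR_ELEMENTS)).float()
emotion_y = torch.randint(0, N_EMOTIONS, (8,))

motor_logits, emotion_logits = model(x)
# Joint objective: binary cross-entropy for the motor elements plus
# cross-entropy for the emotion class (equal weighting assumed here).
loss = nn.BCEWithLogitsLoss()(motor_logits, motor_y) \
     + nn.CrossEntropyLoss()(emotion_logits, emotion_y)
loss.backward()
```

Sharing the encoder is what makes the learning "joint": gradients from both losses shape the same motion representation, so motor-element supervision can regularize the emotion task and vice versa.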
Related papers
- Emotion Detection through Body Gesture and Face [0.0]
The project addresses the challenge of emotion recognition by focusing on non-facial cues, specifically hand movements and body gestures.
Traditional emotion recognition systems mainly rely on facial expression analysis and often ignore the rich emotional information conveyed through body language.
The project aims to contribute to the field of affective computing by enhancing the ability of machines to interpret and respond to human emotions in a more comprehensive and nuanced way.
arXiv Detail & Related papers (2024-07-13T15:15:50Z)
- Improving Language Models for Emotion Analysis: Insights from Cognitive Science [0.0]
We present the main emotion theories in psychology and cognitive science.
We introduce the main methods of emotion annotation in natural language processing.
We propose directions for improving language models for emotion analysis.
arXiv Detail & Related papers (2024-06-11T07:42:13Z)
- SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations [53.60993109543582]
SemEval-2024 Task 3, named Multimodal Emotion Cause Analysis in Conversations, aims at extracting all pairs of emotions and their corresponding causes from conversations.
Under different modality settings, it consists of two subtasks: Textual Emotion-Cause Pair Extraction in Conversations (TECPE) and Multimodal Emotion-Cause Pair Extraction in Conversations (MECPE).
In this paper, we introduce the task, dataset and evaluation settings, summarize the systems of the top teams, and discuss the findings of the participants.
arXiv Detail & Related papers (2024-05-19T09:59:00Z)
- Unlocking the Emotional World of Visual Media: An Overview of the Science, Research, and Impact of Understanding Emotion [24.920797480215242]
This article provides a comprehensive overview of the field of emotion analysis in visual media.
We discuss the psychological foundations of emotion and the computational principles that underpin the understanding of emotions from images and videos.
We contend that this represents a "Holy Grail" research problem in computing and delineate pivotal directions for future inquiry.
arXiv Detail & Related papers (2023-07-25T12:47:21Z)
- Language-Specific Representation of Emotion-Concept Knowledge Causally Supports Emotion Inference [44.126681295827794]
This study used a form of artificial intelligence known as large language models (LLMs) to assess whether language-based representations of emotion causally contribute to the AI's ability to generate inferences about the emotional meaning of novel situations.
Our findings provide a proof of concept that even an LLM can learn about emotions in the absence of sensory-motor representations and highlight the contribution of language-derived emotion-concept knowledge to emotion inference.
arXiv Detail & Related papers (2023-02-19T14:21:33Z)
- See, Hear, and Feel: Smart Sensory Fusion for Robotic Manipulation [49.925499720323806]
We study how visual, auditory, and tactile perception can jointly help robots to solve complex manipulation tasks.
We build a robot system that can see with a camera, hear with a contact microphone, and feel with a vision-based tactile sensor.
arXiv Detail & Related papers (2022-12-07T18:55:53Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- Disambiguating Affective Stimulus Associations for Robot Perception and Dialogue [67.89143112645556]
We provide a NICO robot with the ability to learn the associations between a perceived auditory stimulus and an emotional expression.
NICO is able to do this for both individual subjects and specific stimuli, with the aid of an emotion-driven dialogue system.
The robot is then able to use this information to determine a subject's enjoyment of perceived auditory stimuli in a real HRI scenario.
arXiv Detail & Related papers (2021-03-05T20:55:48Z)
- Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of face muscles movements.
We determine if there are time-related differences on expressions among emotional groups by using a functional F-test.
arXiv Detail & Related papers (2021-03-01T08:31:08Z)
- Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947]
The lack of external knowledge makes it difficult for empathetic dialogue systems to perceive implicit emotions and learn emotional interactions from limited dialogue history.
We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
arXiv Detail & Related papers (2020-09-21T09:21:52Z)
- Sentiment Analysis: Automatically Detecting Valence, Emotions, and Other Affectual States from Text [31.87319293259599]
This article presents a sweeping overview of sentiment analysis research.
It includes the origins of the field, the rich landscape of tasks, challenges, a survey of the methods and resources used, and applications.
We discuss how, without careful forethought, sentiment analysis has the potential for harmful outcomes.
arXiv Detail & Related papers (2020-05-25T01:37:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.