Analysing the Direction of Emotional Influence in Nonverbal Dyadic
Communication: A Facial-Expression Study
- URL: http://arxiv.org/abs/2012.08780v1
- Date: Wed, 16 Dec 2020 07:52:35 GMT
- Title: Analysing the Direction of Emotional Influence in Nonverbal Dyadic
Communication: A Facial-Expression Study
- Authors: Maha Shadaydeh, Lea Mueller, Dana Schneider, Martin Thuemmel, Thomas
Kessler, Joachim Denzler
- Abstract summary: This study is concerned with the analysis of the direction of emotional influence in dyadic dialogue based on facial expressions only.
We exploit computer vision capabilities along with causal inference theory for quantitative verification of hypotheses on the direction of emotional influence.
- Score: 6.4985954299863
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Identifying the direction of emotional influence in a dyadic dialogue is of
increasing interest in the psychological sciences with applications in
psychotherapy, analysis of political interactions, or interpersonal conflict
behavior. Facial expressions are widely described as automatic and thus
hard to influence deliberately. As such, they are a well-suited measure of
unintentional behavioral cues to social-emotional cognitive processes. With
this in mind, this study is concerned with the analysis of the
direction of emotional influence in dyadic dialogue based on facial expressions
only. We exploit computer vision capabilities along with causal inference
theory for quantitative verification of hypotheses on the direction of
emotional influence, i.e., causal effect relationships, in dyadic dialogues. We
address two main issues. First, in a dyadic dialogue, emotional influence
occurs over transient time intervals, with an intensity and direction that
vary over time. To this end, we propose a relevant interval selection
approach that we use prior to causal inference to identify those transient
intervals where causal inference should be applied. Second, we propose to use
fine-grained facial expressions that are present even when strong, distinct
facial emotions are not visible. To specify the direction of influence, we apply the
concept of Granger causality to the time series of facial expressions over
selected relevant intervals. We tested our approach on newly acquired
experimental data. Based on the quantitative verification of hypotheses on the
direction of emotional influence, we showed that the proposed approach is
highly promising for revealing the causal effect patterns under various
instructed interaction conditions.
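
To make the two-step pipeline concrete, below is a minimal Python sketch; it is not the authors' implementation. The relevant interval selection is approximated by a hypothetical windowed-variance criterion, and the directionality test uses the standard grangercausalitytests routine from statsmodels. All window lengths, thresholds, and lag settings are illustrative assumptions, not values from the paper.

import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def select_relevant_intervals(x, y, win=100, step=50, thresh=0.5):
    # Hypothetical stand-in for the paper's relevant interval selection:
    # keep sliding windows where both signals show enough activity
    # (summed windowed variance) for a causal test to be meaningful.
    intervals = []
    for start in range(0, len(x) - win + 1, step):
        end = start + win
        if np.var(x[start:end]) + np.var(y[start:end]) > thresh:
            intervals.append((start, end))
    return intervals

def granger_direction(x, y, max_lag=5, alpha=0.05):
    # grangercausalitytests checks whether the series in the SECOND column
    # Granger-causes the series in the FIRST column; it also prints its
    # own diagnostics for each tested lag.
    res_xy = grangercausalitytests(np.column_stack([y, x]), max_lag)  # x -> y?
    res_yx = grangercausalitytests(np.column_stack([x, y]), max_lag)  # y -> x?
    # Taking the minimum p-value across lags is an optimistic
    # simplification; a real analysis would correct for multiple testing.
    p_xy = min(r[0]["ssr_ftest"][1] for r in res_xy.values())
    p_yx = min(r[0]["ssr_ftest"][1] for r in res_yx.values())
    return {"x->y": p_xy < alpha, "y->x": p_yx < alpha}

# Toy data: person_a weakly drives person_b with a 3-sample lag
# (np.roll wraps around at the edges, which is harmless for a demo).
rng = np.random.default_rng(0)
person_a = rng.standard_normal(600)
person_b = 0.6 * np.roll(person_a, 3) + 0.4 * rng.standard_normal(600)

for start, end in select_relevant_intervals(person_a, person_b):
    verdict = granger_direction(person_a[start:end], person_b[start:end])
    print((start, end), verdict)

In the paper's setting, x and y would instead be time series of fine-grained facial-expression features (for example, action-unit intensities) for the two interaction partners, and hypotheses on the direction of influence would be checked per selected interval.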
Related papers
- CauESC: A Causal Aware Model for Emotional Support Conversation [79.4451588204647]
Existing approaches ignore the emotion causes of the distress.
They focus on the seeker's own mental state rather than the emotional dynamics of the interaction between speakers.
We propose a novel framework, CauESC, which first recognizes the emotion causes of the distress as well as the emotion effects triggered by those causes.
arXiv Detail & Related papers (2024-01-31T11:30:24Z)
- E-CORE: Emotion Correlation Enhanced Empathetic Dialogue Generation [33.57399405783864]
We propose a novel emotion correlation enhanced empathetic dialogue generation framework.
Specifically, a multi-resolution emotion graph is devised to capture context-based emotion interactions.
We then propose an emotion correlation enhanced decoder, with a novel correlation-aware aggregation and soft/hard strategy.
arXiv Detail & Related papers (2023-11-25T12:47:39Z)
- Dynamic Causal Disentanglement Model for Dialogue Emotion Detection [77.96255121683011]
We propose a Dynamic Causal Disentanglement Model based on hidden variable separation.
This model effectively decomposes the content of dialogues and investigates the temporal accumulation of emotions.
Specifically, we propose a dynamic temporal disentanglement model to infer the propagation of utterances and hidden variables.
arXiv Detail & Related papers (2023-09-13T12:58:09Z)
- Expanding the Role of Affective Phenomena in Multimodal Interaction Research [57.069159905961214]
We examined over 16,000 papers from selected conferences in multimodal interaction, affective computing, and natural language processing.
We identify 910 affect-related papers and present our analysis of the role of affective phenomena in these papers.
We find limited research on how affect and emotion predictions might be used by AI systems to enhance machine understanding of human social behaviors and cognitive states.
arXiv Detail & Related papers (2023-05-18T09:08:39Z)
- Empathetic Response Generation via Emotion Cause Transition Graph [29.418144401849194]
Empathetic dialogue is a human-like behavior that requires the perception of both affective factors (e.g., emotion status) and cognitive factors (e.g., the cause of the emotion).
We propose an emotion cause transition graph to explicitly model the natural transition of emotion causes between two adjacent turns in empathetic dialogue.
With this graph, the concept words of the emotion causes in the next turn can be predicted and used by a specifically designed concept-aware decoder to generate the empathic response.
arXiv Detail & Related papers (2023-02-23T05:51:17Z)
- Learning Graph Representation of Person-specific Cognitive Processes from Audio-visual Behaviours for Automatic Personality Recognition [17.428626029689653]
We propose to represent the target subject's person-specific cognition in the form of a person-specific CNN architecture.
Each person-specific CNN is explored by the Neural Architecture Search (NAS) and a novel adaptive loss function.
Experimental results show that the produced graph representations are well associated with target subjects' personality traits.
arXiv Detail & Related papers (2021-10-26T11:04:23Z)
- I Only Have Eyes for You: The Impact of Masks On Convolutional-Based Facial Expression Recognition [78.07239208222599]
We evaluate how the recently proposed FaceChannel adapts towards recognizing facial expressions from persons with masks.
We also perform specific feature-level visualization to demonstrate how the inherent capabilities of the FaceChannel to learn and combine facial features change when in a constrained social interaction scenario.
arXiv Detail & Related papers (2021-04-16T20:03:30Z)
- Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of face muscles movements.
We determine whether there are time-related differences in expressions among emotional groups by using a functional F-test.
arXiv Detail & Related papers (2021-03-01T08:31:08Z)
- Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947]
A lack of external knowledge makes it difficult for empathetic dialogue systems to perceive implicit emotions and to learn emotional interactions from limited dialogue history.
We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
arXiv Detail & Related papers (2020-09-21T09:21:52Z)