A Multi-Componential Approach to Emotion Recognition and the Effect of
Personality
- URL: http://arxiv.org/abs/2010.11370v1
- Date: Thu, 22 Oct 2020 01:27:23 GMT
- Title: A Multi-Componential Approach to Emotion Recognition and the Effect of
Personality
- Authors: Gelareh Mohammadi and Patrik Vuilleumier
- Abstract summary: This paper applies a componential framework with a data-driven approach to characterize emotional experiences evoked during movie watching.
The results suggest that differences between various emotions can be captured by a few (at least 6) latent dimensions.
Results show that a componential model with a limited number of descriptors is still able to predict the level of experienced discrete emotion.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Emotions are an inseparable part of human nature affecting our behavior in
response to the outside world. Although most empirical studies have been
dominated by two theoretical models, discrete categories of emotion and
dichotomous dimensions, results from neuroscience suggest a multi-process
mechanism underpinning emotional experience, with a large overlap across
different emotions. While these findings are consistent with
influential theories of emotion in psychology that emphasize a role for
multiple component processes to generate emotion episodes, few studies have
systematically investigated the relationship between discrete emotions and a
full componential view. This paper applies a componential framework with a
data-driven approach to characterize emotional experiences evoked during movie
watching. The results suggest that differences between various emotions can be
captured by a few (at least 6) latent dimensions, each defined by features
associated with component processes, including appraisal, expression,
physiology, motivation, and feeling. In addition, the link between discrete
emotions and the component model is explored, and the results show that a
componential model with a limited number of descriptors can still predict the
level of experienced discrete emotion(s) to a satisfactory degree. Finally, as
appraisals may vary according to individual dispositions and biases, we also
study the relationship between personality traits and emotions in our
computational framework and show that the effect of personality on discrete
emotion differences can be better accounted for by the component model.
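The abstract describes the analysis only at a high level, so the following is a purely illustrative sketch of the kind of data-driven componential pipeline it outlines: componential descriptors are projected onto a handful of latent dimensions, and discrete-emotion ratings are then regressed on them, with personality scores as optional covariates. PCA, ridge regression, the synthetic data, and all variable names are assumptions made for illustration, not the authors' actual method.

```python
# Illustrative sketch only -- NOT the authors' pipeline. It assumes a feature
# matrix of componential descriptors (appraisal, expression, physiology,
# motivation, feeling) and per-excerpt ratings of a discrete emotion; the
# synthetic data, PCA + ridge choices, and the 6-dimension setting are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_samples, n_descriptors = 300, 32  # hypothetical: 32 componential descriptors per film excerpt
X = rng.normal(size=(n_samples, n_descriptors))                          # componential descriptors
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_samples)         # hypothetical discrete-emotion rating

# Reduce the descriptors to a few latent dimensions (the abstract reports ~6),
# then predict the intensity of one discrete emotion from those dimensions.
model = make_pipeline(StandardScaler(), PCA(n_components=6), Ridge(alpha=1.0))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2 for one discrete emotion: {scores.mean():.2f}")

# Personality could enter the same framework as additional covariates, e.g. by
# concatenating (hypothetical) Big Five scores to the latent dimensions.
big_five = rng.normal(size=(n_samples, 5))
latent = make_pipeline(StandardScaler(), PCA(n_components=6)).fit_transform(X)
X_aug = np.hstack([latent, big_five])
scores_aug = cross_val_score(Ridge(alpha=1.0), X_aug, y, cv=5, scoring="r2")
print(f"with personality covariates: {scores_aug.mean():.2f}")
```

PCA and ridge regression here merely stand in for whatever dimensionality-reduction and prediction methods the paper actually uses; the point is the two-stage structure the abstract describes: component descriptors, a few latent dimensions, then predicted levels of discrete emotions.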
Related papers
- Exploring Emotions in Multi-componential Space using Interactive VR Games [1.1510009152620668]
We operationalised a data-driven approach using interactive Virtual Reality (VR) games.
We used Machine Learning (ML) methods to identify the unique contributions of each component to emotion differentiation.
These findings also have implications for using VR environments in emotion research.
arXiv Detail & Related papers (2024-04-04T06:54:44Z)
- Emotion Recognition based on Psychological Components in Guided Narratives for Emotion Regulation [0.0]
This paper introduces a new French corpus of emotional narratives collected using a questionnaire for emotion regulation.
We study the interaction of components and their impact on emotion classification with machine learning methods and pre-trained language models.
arXiv Detail & Related papers (2023-05-15T12:06:31Z)
- Speech Synthesis with Mixed Emotions [77.05097999561298]
We propose a novel formulation that measures the relative difference between the speech samples of different emotions.
We then incorporate our formulation into a sequence-to-sequence emotional text-to-speech framework.
At run-time, we control the model to produce the desired emotion mixture by manually defining an emotion attribute vector.
arXiv Detail & Related papers (2022-08-11T15:45:58Z)
- Seeking Subjectivity in Visual Emotion Distribution Learning [93.96205258496697]
Visual Emotion Analysis (VEA) aims to predict people's emotions towards different visual stimuli.
Existing methods often predict visual emotion distribution in a unified network, neglecting the inherent subjectivity in its crowd voting process.
We propose a novel Subjectivity Appraise-and-Match Network (SAMNet) to investigate the subjectivity in visual emotion distribution.
arXiv Detail & Related papers (2022-07-25T02:20:03Z)
- Emotion Recognition from Multiple Modalities: Fundamentals and Methodologies [106.62835060095532]
We discuss several key aspects of multi-modal emotion recognition (MER).
We begin with a brief introduction on widely used emotion representation models and affective modalities.
We then summarize existing emotion annotation strategies and corresponding computational tasks.
Finally, we outline several real-world applications and discuss some future directions.
arXiv Detail & Related papers (2021-08-18T21:55:20Z)
- Emotion Recognition under Consideration of the Emotion Component Process Model [9.595357496779394]
We use the emotion component process model (CPM) by Scherer (2005) to explain emotion communication.
The CPM states that, in reaction to an event, emotions unfold as a coordinated process of various subcomponents, namely the subjective feeling, the cognitive appraisal, the expression, a physiological bodily reaction, and a motivational action tendency (a minimal, hypothetical sketch of these subcomponents as a data record appears after this list).
We find that emotions on Twitter are predominantly expressed by event descriptions or subjective reports of the feeling, while in literature, authors prefer to describe what characters do, and leave the interpretation to the reader.
arXiv Detail & Related papers (2021-07-27T15:53:25Z)
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
- Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and shows the state-of-the-art result for classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z)
- Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of face muscles movements.
We determine if there are time-related differences on expressions among emotional groups by using a functional F-test.
arXiv Detail & Related papers (2021-03-01T08:31:08Z)
- Impact of multiple modalities on emotion recognition: investigation into 3d facial landmarks, action units, and physiological data [4.617405932149653]
We analyze 3D facial data, action units, and physiological data to assess their impact on emotion recognition.
Our analysis indicates that both 3D facial landmarks and physiological data are encouraging for expression/emotion recognition.
On the other hand, while action units can positively impact emotion recognition when fused with other modalities, the results suggest it is difficult to detect emotion using them in a unimodal fashion.
arXiv Detail & Related papers (2020-05-17T18:59:57Z)
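As flagged in the Emotion Component Process Model entry above, here is a minimal, hypothetical sketch of how the five CPM subcomponents (feeling, appraisal, expression, physiological reaction, action tendency) could be operationalized as a single annotation record. The class name, field names, rating scales, and example values are illustrative assumptions and are not drawn from any of the listed papers.

```python
# Hypothetical operationalization of a CPM-style annotation -- for illustration
# only; the schema and scales are assumptions, not taken from the papers above.
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class CPMAnnotation:
    """One emotion episode annotated along the five CPM subcomponents (hypothetical schema)."""
    event: str                   # description of the triggering event
    appraisal: float             # cognitive appraisal, e.g. goal relevance, 0..1
    expression: float            # facial/vocal/bodily expression intensity, 0..1
    physiology: float            # physiological bodily reaction, e.g. arousal, 0..1
    motivation: float            # motivational action tendency (approach +1 .. avoid -1)
    feeling: float               # subjective feeling intensity, 0..1
    label: Optional[str] = None  # optional discrete-emotion label

# Hypothetical example record
example = CPMAnnotation(
    event="missed the last train home",
    appraisal=0.8,
    expression=0.4,
    physiology=0.6,
    motivation=-0.5,
    feeling=0.7,
    label="anger",
)
print(asdict(example))
```

A table of such records is exactly the kind of componential feature matrix that the dimensionality-reduction sketch earlier on this page would take as input.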