Exploring the dynamic interplay of cognitive load and emotional arousal
by using multimodal measurements: Correlation of pupil diameter and emotional
arousal in emotionally engaging tasks
- URL: http://arxiv.org/abs/2403.00366v1
- Date: Fri, 1 Mar 2024 08:49:17 GMT
- Authors: C. Kosel, S. Michel, T. Seidel, M. Foerster
- Abstract summary: The study aims to investigate the correlation between two continuous sensor streams: pupil diameter as an indicator of cognitive workload, and FACS with deep learning as an indicator of emotional arousal.
28 participants worked on three cognitively demanding and emotionally engaging everyday moral dilemmas while eye-tracking and emotion recognition data were collected.
The results show negative and statistically significant correlations between the data streams for emotional arousal and pupil diameter.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Multimodal data analysis and validation based on streams from
state-of-the-art sensor technology such as eye-tracking or emotion recognition
using the Facial Action Coding System (FACS) with deep learning allows
educational researchers to study multifaceted learning and problem-solving
processes and to improve educational experiences. This study aims to
investigate the correlation between two continuous sensor streams, pupil
diameter as an indicator of cognitive workload and FACS with deep learning as
an indicator of emotional arousal (RQ 1a), specifically for epochs of high,
medium, and low arousal (RQ 1b). Furthermore, the time lag between emotional
arousal and pupil diameter data will be analyzed (RQ 2). 28 participants worked
on three cognitively demanding and emotionally engaging everyday moral dilemmas
while eye-tracking and emotion recognition data were collected. The data were
pre-processed in Python (synchronization, blink control, downsampling) and
analyzed using correlation analysis and Granger causality tests. The results
show negative and statistically significant correlations between the data
streams for emotional arousal and pupil diameter. However, the correlation is
negative and significant only for epochs of high arousal, while positive but
non-significant relationships were found for epochs of medium or low arousal.
The average time lag for the relationship between arousal and pupil diameter
was 2.8 ms. In contrast to previous findings without a multimodal approach
suggesting a positive correlation between the constructs, the results
contribute to the state of research by highlighting the importance of
multimodal data validation and research on convergent validity. Future research
should consider emotional regulation strategies and emotional valence.
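The abstract names each analysis step (synchronization, blink control, and downsampling in Python, followed by correlation analysis and Granger causality tests), so a minimal sketch of such a pipeline is given below. It is an illustration under assumptions, not the authors' code: the column names, the 10 Hz grid, the blink heuristic, the tertile-based epoch definition, and the lag range are all hypothetical.

```python
# Minimal sketch of the pipeline described in the abstract. Assumes both
# streams carry a DatetimeIndex and that blinks appear as zero/NaN samples.
import pandas as pd
from scipy.stats import pearsonr
from statsmodels.tsa.stattools import grangercausalitytests

def preprocess(pupil: pd.Series, arousal: pd.Series, target_hz: int = 10) -> pd.DataFrame:
    # Blink control: mask implausible pupil samples, then interpolate the gaps.
    pupil = pupil.mask(pupil <= 0).interpolate(limit_direction="both")
    df = pd.DataFrame({"pupil": pupil, "arousal": arousal})
    # Synchronization + downsampling: resample both streams onto one time grid.
    return df.resample(f"{1000 // target_hz}ms").mean().dropna()

def analyze(df: pd.DataFrame, max_lag: int = 5):
    # RQ 1a: overall linear association between the two streams.
    r, p = pearsonr(df["pupil"], df["arousal"])
    # RQ 1b: per-level correlations; arousal tertiles stand in here for the
    # paper's (unspecified) definition of high/medium/low arousal epochs.
    df["epoch"] = pd.qcut(df["arousal"], 3, labels=["low", "medium", "high"])
    by_epoch = {lvl: pearsonr(g["pupil"], g["arousal"])
                for lvl, g in df.groupby("epoch", observed=True)}
    # RQ 2: lagged prediction of pupil diameter from arousal (Granger causality).
    gc = grangercausalitytests(df[["pupil", "arousal"]].values, maxlag=max_lag)
    return r, p, by_epoch, gc
```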
Related papers
- Auto Detecting Cognitive Events Using Machine Learning on Pupillary Data [0.0]
Pupil size is a valuable indicator of cognitive workload, reflecting changes in attention and arousal governed by the autonomic nervous system.
This study explores the potential of using machine learning to automatically detect cognitive events experienced by individuals.
arXiv Detail & Related papers (2024-10-18T04:54:46Z)
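The entry above describes machine-learning detection of cognitive events from pupillary data. As a hedged illustration of that idea (not the paper's method), the snippet below summarizes fixed windows of a pupil trace into simple features a classifier could consume; the window length, features, and model choice are assumptions.

```python
# Illustrative sketch: turn a pupil-diameter trace into per-window features for
# binary event detection. Everything here is a stand-in, not the paper's code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def window_features(pupil: np.ndarray, win: int = 120) -> np.ndarray:
    # Slice the signal into fixed-length windows and summarize each one.
    windows = pupil[: len(pupil) // win * win].reshape(-1, win)
    return np.column_stack([
        windows.mean(axis=1),                   # tonic pupil size
        windows.std(axis=1),                    # variability within the window
        np.diff(windows, axis=1).mean(axis=1),  # mean dilation velocity
    ])

# Hypothetical usage, with y holding one event label per window:
# X = window_features(pupil_trace)
# clf = RandomForestClassifier(n_estimators=200, random_state=0)
# print(cross_val_score(clf, X, y, cv=5).mean())
```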
- Brain-Cognition Fingerprinting via Graph-GCCA with Contrastive Learning [28.681229869236393]
Longitudinal neuroimaging studies aim to improve the understanding of brain aging and diseases by studying the dynamic interactions between brain function and cognition.
We propose an unsupervised learning model that encodes their relationship via Graph Attention Networks and generalized Canonical Correlation Analysis.
To create brain-cognition fingerprints reflecting the unique neural and cognitive phenotype of each person, the model also relies on individualized and multimodal contrastive learning.
arXiv Detail & Related papers (2024-09-20T20:36:20Z)
- Faces of the Mind: Unveiling Mental Health States Through Facial Expressions in 11,427 Adolescents [12.51443153354506]
Mood disorders, including depression and anxiety, often manifest through facial expressions.
We analyzed facial videos of 11,427 participants, a dataset two orders of magnitude larger than those used in previous studies.
arXiv Detail & Related papers (2024-05-30T14:02:40Z)
- Characterizing Information Seeking Processes with Multiple Physiological Signals [12.771920957950334]
This study examines informational search with four stages: the realization of Information Need (IN), Query Formulation (QF), Query Submission (QS), and Relevance Judgment (RJ).
We analyze the physiological signals across these stages and report outcomes of pairwise non-parametric repeated-measure statistical tests.
Our findings offer valuable insights into user behavior and emotional responses in information seeking processes.
arXiv Detail & Related papers (2024-05-01T05:15:00Z)
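The "pairwise non-parametric repeated-measure statistical tests" in the entry above can be illustrated with Wilcoxon signed-rank tests over all stage pairs; the data below are synthetic, and the choice of feature (one value per participant per stage) is an assumption.

```python
# Hedged sketch: pairwise repeated-measures comparison of a physiological
# feature across the four search stages named in the entry (IN, QF, QS, RJ).
from itertools import combinations
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
# One value per participant (n=24) per stage; synthetic placeholder data.
stages = {name: rng.normal(loc=i, size=24)
          for i, name in enumerate(["IN", "QF", "QS", "RJ"])}

for a, b in combinations(stages, 2):
    stat, p = wilcoxon(stages[a], stages[b])  # paired: same participants per stage
    print(f"{a} vs {b}: W={stat:.1f}, p={p:.4f}")
```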
- Measuring Non-Typical Emotions for Mental Health: A Survey of Computational Approaches [57.486040830365646]
Stress and depression impact the engagement in daily tasks, highlighting the need to understand their interplay.
This survey is the first to simultaneously explore computational methods for analyzing stress, depression, and engagement.
arXiv Detail & Related papers (2024-03-09T11:16:09Z)
- A Hierarchical Regression Chain Framework for Affective Vocal Burst Recognition [72.36055502078193]
We propose a hierarchical framework, based on chain regression models, for affective recognition from vocal bursts.
To address the challenge of data sparsity, we also use self-supervised learning (SSL) representations with layer-wise and temporal aggregation modules.
The proposed systems participated in the ACII Affective Vocal Burst (A-VB) Challenge 2022 and ranked first in the "TWO" and "CULTURE" tasks.
arXiv Detail & Related papers (2023-03-14T16:08:45Z)
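The chain-regression idea in the entry above can be illustrated with scikit-learn's RegressorChain, where each affect dimension is regressed on the input features plus the predictions for earlier dimensions. The embeddings, targets, and base model below are assumptions; the actual system is hierarchical and uses SSL representations.

```python
# Hedged sketch of chain regression for multiple affect dimensions: later
# targets are predicted from the features plus earlier targets' predictions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.multioutput import RegressorChain

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 40))                        # stand-in acoustic embeddings
Y = X[:, :3] + 0.1 * rng.normal(size=(300, 3))        # e.g., arousal, valence, intensity

chain = RegressorChain(Ridge(alpha=1.0))              # chained, in target order
chain.fit(X[:240], Y[:240])
print("held-out R^2:", round(chain.score(X[240:], Y[240:]), 3))
```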
- Co-Located Human-Human Interaction Analysis using Nonverbal Cues: A Survey [71.43956423427397]
We aim to identify the nonverbal cues and computational methodologies resulting in effective performance.
This survey differs from its counterparts by involving the widest spectrum of social phenomena and interaction settings.
Some major observations are: the most often used nonverbal cue is speaking activity, the most common computational method is the support vector machine, and the typical interaction environment and sensing approach are meetings of 3-4 persons equipped with microphones and cameras.
arXiv Detail & Related papers (2022-07-20T13:37:57Z)
- Towards Unbiased Visual Emotion Recognition via Causal Intervention [63.74095927462]
We propose a novel Interventional Emotion Recognition Network (IERN) to alleviate the negative effects brought by dataset bias.
A series of designed tests validate the effectiveness of IERN, and experiments on three emotion benchmarks demonstrate that IERN outperforms other state-of-the-art approaches.
arXiv Detail & Related papers (2021-07-26T10:40:59Z)
- Towards Interaction Detection Using Topological Analysis on Neural Networks [55.74562391439507]
In neural networks, any interacting features must follow a strongly weighted connection to common hidden units.
We propose a new measure for quantifying interaction strength, based upon the well-received theory of persistent homology.
A Persistence Interaction Detection (PID) algorithm is developed to efficiently detect interactions.
arXiv Detail & Related papers (2020-10-25T02:15:24Z)
- Continuous Emotion Recognition via Deep Convolutional Autoencoder and Support Vector Regressor [70.2226417364135]
It is crucial that the machine be able to recognize the emotional state of the user with high accuracy.
Deep neural networks have been used with great success in recognizing emotions.
We present a new model for continuous emotion recognition based on facial expression recognition.
arXiv Detail & Related papers (2020-01-31T17:47:16Z)
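The entry above pairs a deep convolutional autoencoder with a support vector regressor. The snippet below sketches only the regression stage on stand-in bottleneck features, since the encoder itself is out of scope; the feature dimensionality, kernel, and synthetic target are assumptions, not taken from the paper.

```python
# Hedged sketch of the SVR stage: map (hypothetical) autoencoder features to a
# continuous emotion value. Data are synthetic; nothing here is from the paper.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 64))                        # stand-in bottleneck features
y = 0.5 * Z[:, 0] + rng.normal(scale=0.1, size=500)   # synthetic arousal target

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
model.fit(Z[:400], y[:400])
print("held-out R^2:", round(model.score(Z[400:], y[400:]), 3))
```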
- SensAI+Expanse Emotional Valence Prediction Studies with Cognition and Memory Integration [0.0]
This work contributes an artificially intelligent agent able to assist in cognitive science studies.
The developed artificial agent system (SensAI+Expanse) includes machine learning algorithms, empathetic algorithms, and memory.
Results of the present study show evidence of significant emotional behaviour differences between some age ranges and gender combinations.
arXiv Detail & Related papers (2020-01-03T18:17:57Z)