Faces of the Mind: Unveiling Mental Health States Through Facial Expressions in 11,427 Adolescents
- URL: http://arxiv.org/abs/2405.20072v1
- Date: Thu, 30 May 2024 14:02:40 GMT
- Title: Faces of the Mind: Unveiling Mental Health States Through Facial Expressions in 11,427 Adolescents
- Authors: Xiao Xu, Keyin Zhou, Yan Zhang, Yang Wang, Fei Wang, Xizhe Zhang
- Abstract summary: Mood disorders, including depression and anxiety, often manifest through facial expressions.
We analyzed facial videos of 11,427 participants, a dataset two orders of magnitude larger than previous studies.
- Score: 12.51443153354506
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mood disorders, including depression and anxiety, often manifest through facial expressions. While previous research has explored the connection between facial features and emotions, machine learning algorithms for estimating mood disorder severity have been hindered by small datasets and limited real-world application. To address this gap, we analyzed facial videos of 11,427 participants, a dataset two orders of magnitude larger than previous studies. This comprehensive collection includes standardized facial expression videos from reading tasks, along with a detailed psychological scale that measures depression, anxiety, and stress. By examining the relationships among these emotional states and employing clustering analysis, we identified distinct subgroups embodying different emotional profiles. We then trained tree-based classifiers and deep learning models to estimate emotional states from facial features. Results indicate that models previously effective on small datasets experienced decreased performance when applied to our large dataset, highlighting the importance of data scale and mitigating overfitting in practical settings. Notably, our study identified subtle shifts in pupil dynamics and gaze orientation as potential markers of mood disorders, providing valuable information on the interaction between facial expressions and mental health. This research marks the first large-scale and comprehensive investigation of facial expressions in the context of mental health, laying the groundwork for future data-driven advancements in this field.
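The abstract describes a two-step analysis: clustering participants by their depression/anxiety/stress scale scores to identify emotional-profile subgroups, then training tree-based classifiers to estimate emotional states from facial features. A minimal sketch of that pipeline is shown below; all data is synthetic and the feature dimensions, cluster count, and model choices are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# (1) cluster participants by psychological scale scores,
# (2) train a tree-based classifier to predict subgroup membership
#     from facial features. All data is synthetic; dimensions and
#     hyperparameters are illustrative only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_participants = 1000

# Synthetic scale scores: one column each for depression, anxiety, stress.
scale_scores = rng.normal(size=(n_participants, 3))

# Step 1: clustering analysis to identify emotional-profile subgroups.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
subgroups = kmeans.fit_predict(scale_scores)

# Synthetic facial features (e.g. action-unit intensities, gaze, pupil size).
facial_features = rng.normal(size=(n_participants, 32))

# Step 2: tree-based classifier estimating subgroup from facial features.
X_train, X_test, y_train, y_test = train_test_split(
    facial_features, subgroups, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

With random features the classifier performs near chance; the point of the sketch is the structure (cluster labels as targets for a facial-feature classifier), not the numbers.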
Related papers
- Leaving Some Facial Features Behind [0.0]
This study examines how specific facial features influence emotion classification, using facial perturbations on the Fer2013 dataset.
Models trained on data with some important facial features removed experienced accuracy drops of up to 85% relative to baseline for emotions such as happy and surprise.
arXiv Detail & Related papers (2024-10-29T02:28:53Z)
- Large-scale digital phenotyping: identifying depression and anxiety indicators in a general UK population with over 10,000 participants [2.2909783327197393]
We conducted a cross-sectional analysis of data from 10,129 participants recruited from a UK-based general population.
Participants shared wearable (Fitbit) data and self-reported questionnaires on depression (PHQ-8), anxiety (GAD-7), and mood via a study app.
We observed significant associations between the severity of depression and anxiety with several factors, including mood, age, gender, BMI, sleep patterns, physical activity, and heart rate.
arXiv Detail & Related papers (2024-09-24T16:05:17Z)
- Exploring Facial Biomarkers for Depression through Temporal Analysis of Action Units [0.0]
We analyzed facial expressions from video data of participants classified with or without depression.
Results indicate significant differences in the intensities of AUs associated with sadness and happiness between the groups.
arXiv Detail & Related papers (2024-07-18T17:55:01Z)
- EmoScan: Automatic Screening of Depression Symptoms in Romanized Sinhala Tweets [0.0]
This work explores the utilization of Romanized Sinhala social media data to identify individuals at risk of depression.
A machine learning-based framework is presented for the automatic screening of depression symptoms by analyzing language patterns, sentiment, and behavioural cues.
arXiv Detail & Related papers (2024-03-28T10:31:09Z)
- Measuring Non-Typical Emotions for Mental Health: A Survey of Computational Approaches [57.486040830365646]
Stress and depression impact the engagement in daily tasks, highlighting the need to understand their interplay.
This survey is the first to simultaneously explore computational methods for analyzing stress, depression, and engagement.
arXiv Detail & Related papers (2024-03-09T11:16:09Z)
- Dynamic Graph Representation Learning for Depression Screening with Transformer [13.551342607089184]
Social media platforms present research opportunities to investigate mental health and potentially detect instances of mental illness.
Existing depression detection methods are constrained due to the reliance on feature engineering and the lack of consideration for time-varying factors.
We propose ContrastEgo, which treats each user as a dynamic, time-evolving attributed graph (ego-network).
We show that ContrastEgo significantly outperforms the state-of-the-art methods in terms of all the effectiveness metrics in various experimental settings.
arXiv Detail & Related papers (2023-05-10T20:34:40Z)
- Handwriting and Drawing for Depression Detection: A Preliminary Study [53.11777541341063]
Short-term COVID-19 effects on mental health included a significant increase in anxiety and depressive symptoms.
The aim of this study is to use a new tool, the online handwriting and drawing analysis, to discriminate between healthy individuals and depressed patients.
arXiv Detail & Related papers (2023-02-05T22:33:49Z)
- CIAO! A Contrastive Adaptation Mechanism for Non-Universal Facial Expression Recognition [80.07590100872548]
We propose Contrastive Inhibitory Adaptation (CIAO), a mechanism that adapts the last layer of facial encoders to depict specific affective characteristics on different datasets.
CIAO improves facial expression recognition performance across six datasets with distinct affective representations.
arXiv Detail & Related papers (2022-08-10T15:46:05Z)
- The world seems different in a social context: a neural network analysis of human experimental data [57.729312306803955]
We show that it is possible to replicate human behavioral data in both individual and social task settings by modifying the precision of prior and sensory signals.
An analysis of the neural activation traces of the trained networks provides evidence that information is coded in fundamentally different ways in the network in the individual and in the social conditions.
arXiv Detail & Related papers (2022-03-03T17:19:12Z)
- Affect Analysis in-the-wild: Valence-Arousal, Expressions, Action Units and a Unified Framework [83.21732533130846]
The paper focuses on large in-the-wild databases, i.e., Aff-Wild and Aff-Wild2.
It presents the design of two classes of deep neural networks trained with these databases.
A novel multi-task and holistic framework is presented which is able to jointly learn and effectively generalize and perform affect recognition.
arXiv Detail & Related papers (2021-03-29T17:36:20Z)
- Deep Multi-task Learning for Depression Detection and Prediction in Longitudinal Data [50.02223091927777]
Depression is among the most prevalent mental disorders, affecting millions of people of all ages globally.
Machine learning techniques have proven effective in enabling automated detection and prediction of depression for early intervention and treatment.
We introduce a novel deep multi-task recurrent neural network to tackle this challenge, in which depression classification is jointly optimized with two auxiliary tasks.
arXiv Detail & Related papers (2020-12-05T05:14:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.