Investigating Acoustic-Textual Emotional Inconsistency Information for Automatic Depression Detection
- URL: http://arxiv.org/abs/2412.18614v1
- Date: Mon, 09 Dec 2024 02:52:52 GMT
- Title: Investigating Acoustic-Textual Emotional Inconsistency Information for Automatic Depression Detection
- Authors: Rongfeng Su, Changqing Xu, Xinyi Wu, Feng Xu, Xie Chen, Lan Wang, Nan Yan
- Abstract summary: Previous studies have demonstrated that emotional features from a single acoustic sentiment label can enhance depression diagnosis accuracy.
Individuals with depression might convey negative emotional content in an unexpectedly calm manner.
This work is the first to incorporate emotional expression inconsistency information into depression detection.
- Score: 18.797661194307683
- Abstract: Previous studies have demonstrated that emotional features from a single acoustic sentiment label can enhance depression diagnosis accuracy. Additionally, according to the Emotion Context-Insensitivity theory and our pilot study, individuals with depression might convey negative emotional content in an unexpectedly calm manner, showing a high degree of inconsistency in emotional expressions during natural conversations. So far, few studies have recognized and leveraged the emotional expression inconsistency for depression detection. In this paper, a multimodal cross-attention method is presented to capture the Acoustic-Textual Emotional Inconsistency (ATEI) information. This is achieved by analyzing the intricate local and long-term dependencies of emotional expressions across acoustic and textual domains, as well as the mismatch between the emotional content within both domains. A Transformer-based model is then proposed to integrate this ATEI information with various fusion strategies for detecting depression. Furthermore, a scaling technique is employed to adjust the ATEI feature degree during the fusion process, thereby enhancing the model's ability to discern patients with depression across varying levels of severity. To the best of our knowledge, this work is the first to incorporate emotional expression inconsistency information into depression detection. Experimental results on a counseling conversational dataset illustrate the effectiveness of our method.
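The abstract describes cross-attention between acoustic and textual emotion representations, from which a mismatch (ATEI) signal is derived. The sketch below is a minimal illustration of that idea, not the authors' exact architecture: the dimensions, the single-head attention, and the distance-based inconsistency readout are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    """Scaled dot-product attention: one modality's frames (queries)
    attend over the other modality's tokens (keys/values)."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)   # (Tq, Tk) similarity
    weights = softmax(scores, axis=-1)       # rows sum to 1
    return weights @ values                  # (Tq, d) attended view

rng = np.random.default_rng(0)
T_a, T_t, d = 6, 4, 8                        # acoustic frames, text tokens, dim (illustrative)
acoustic = rng.standard_normal((T_a, d))     # stand-in acoustic emotion embeddings
textual = rng.standard_normal((T_t, d))      # stand-in textual emotion embeddings

# Text-conditioned view of the acoustic stream.
attended = cross_attention(acoustic, textual, textual)

# A simple per-frame inconsistency signal: distance between each acoustic
# frame and its text-attended counterpart (larger = greater mismatch).
inconsistency = np.linalg.norm(acoustic - attended, axis=-1)
print(inconsistency.shape)
```

In the paper, such an inconsistency feature would then be fused with the modality features in a Transformer-based classifier, with a scaling factor modulating its contribution.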
Related papers
- Deep Learning-Based Feature Fusion for Emotion Analysis and Suicide Risk Differentiation in Chinese Psychological Support Hotlines [18.81118590515144]
This study introduces a method that combines pitch acoustic features with deep learning-based features to analyze and understand emotions expressed during hotline interactions.
Using data from China's largest psychological support hotline, our method achieved an F1-score of 79.13% for negative binary emotion classification.
Our findings suggest that emotional fluctuation intensity and frequency could serve as novel features for psychological assessment scales and suicide risk prediction.
arXiv Detail & Related papers (2025-01-15T10:09:38Z)
- Enhancing Depression-Diagnosis-Oriented Chat with Psychological State Tracking [27.96718892323191]
Depression-diagnosis-oriented chat aims to guide patients in self-expression to collect key symptoms for depression detection.
Recent work focuses on combining task-oriented dialogue and chitchat to simulate the interview-based depression diagnosis.
No explicit framework has been explored to guide the dialogue, resulting in unproductive exchanges.
arXiv Detail & Related papers (2024-03-12T07:17:01Z)
- Measuring Non-Typical Emotions for Mental Health: A Survey of Computational Approaches [57.486040830365646]
Stress and depression impact engagement in daily tasks, highlighting the need to understand their interplay.
This survey is the first to simultaneously explore computational methods for analyzing stress, depression, and engagement.
arXiv Detail & Related papers (2024-03-09T11:16:09Z)
- CauESC: A Causal Aware Model for Emotional Support Conversation [79.4451588204647]
Existing approaches ignore the emotion causes of the distress.
They focus on the seeker's own mental state rather than the emotional dynamics during interaction between speakers.
We propose a novel framework, CauESC, which first recognizes the emotion causes of the distress, as well as the emotion effects triggered by those causes.
arXiv Detail & Related papers (2024-01-31T11:30:24Z)
- Language and Mental Health: Measures of Emotion Dynamics from Text as Linguistic Biosocial Markers [30.656554495536618]
We study the relationship between tweet emotion dynamics and mental health disorders.
We find that each of the UED metrics studied varied by the user's self-disclosed diagnosis.
This work provides important early evidence for how linguistic cues pertaining to emotion dynamics can play a crucial role as biosocial markers for mental illnesses.
arXiv Detail & Related papers (2023-10-26T13:00:26Z)
- Dynamic Causal Disentanglement Model for Dialogue Emotion Detection [77.96255121683011]
We propose a Dynamic Causal Disentanglement Model based on hidden variable separation.
This model effectively decomposes the content of dialogues and investigates the temporal accumulation of emotions.
Specifically, we propose a dynamic temporal disentanglement model to infer the propagation of utterances and hidden variables.
arXiv Detail & Related papers (2023-09-13T12:58:09Z)
- NLP meets psychotherapy: Using predicted client emotions and self-reported client emotions to measure emotional coherence [44.82634301507483]
Coherence between emotional experience and emotional expression (EC) is considered important to clients' well-being.
No study has examined EC between the subjective experience of emotions and emotion expression in therapy.
This work presents an end-to-end approach where we use emotion predictions from our transformer based emotion recognition model to study emotional coherence.
arXiv Detail & Related papers (2022-11-22T14:28:41Z)
- Climate and Weather: Inspecting Depression Detection via Emotion Recognition [25.290414205116107]
This paper uses pretrained features extracted from an emotion recognition model to form a multimodal depression detection system.
The proposed emotion transfer improves depression detection performance on DAIC-WOZ as well as increases the training stability.
arXiv Detail & Related papers (2022-04-29T13:44:22Z)
- Emotion Intensity and its Control for Emotional Voice Conversion [77.05097999561298]
Emotional voice conversion (EVC) seeks to convert the emotional state of an utterance while preserving the linguistic content and speaker identity.
In this paper, we aim to explicitly characterize and control the intensity of emotion.
We propose to disentangle the speaker style from linguistic content and encode the speaker style into a style embedding in a continuous space that forms the prototype of emotion embedding.
arXiv Detail & Related papers (2022-01-10T02:11:25Z)
- Emotion-aware Chat Machine: Automatic Emotional Response Generation for Human-like Emotional Interaction [55.47134146639492]
This article proposes a unified end-to-end neural architecture, which is capable of simultaneously encoding the semantics and the emotions in a post.
Experiments on real-world data demonstrate that the proposed method outperforms the state-of-the-art methods in terms of both content coherence and emotion appropriateness.
arXiv Detail & Related papers (2021-06-06T06:26:15Z)
- Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947]
Lack of external knowledge makes it difficult for empathetic dialogue systems to perceive implicit emotions and learn emotional interactions from limited dialogue history.
We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
arXiv Detail & Related papers (2020-09-21T09:21:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.