Multimodal Emotion Recognition among Couples from Lab Settings to Daily Life using Smartwatches
- URL: http://arxiv.org/abs/2212.13917v1
- Date: Wed, 21 Dec 2022 16:41:11 GMT
- Title: Multimodal Emotion Recognition among Couples from Lab Settings to Daily Life using Smartwatches
- Authors: George Boateng
- Abstract summary: Recognizing the emotions of each partner in daily life could provide insight into their emotional well-being in chronic disease management.
Currently, no comprehensive overview of works on emotion recognition among couples exists.
This thesis contributes toward building automated emotion recognition systems that would eventually enable partners to monitor their emotions in daily life.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Couples generally manage chronic diseases together and the management takes
an emotional toll on both patients and their romantic partners. Consequently,
recognizing the emotions of each partner in daily life could provide insight
into their emotional well-being in chronic disease management. The emotions of
partners are currently inferred in the lab and daily life using self-reports,
which are not practical for continuous emotion assessment, or observer reports,
which are manual, time-intensive, and costly. Currently, there exists no
comprehensive overview of works on emotion recognition among couples.
Furthermore, approaches for emotion recognition among couples have (1) focused
on English-speaking couples in the U.S., (2) used data collected from the lab,
and (3) performed recognition using observer ratings rather than partners'
self-reported/subjective emotions. In the body of work contained in this
thesis (8 papers: 5 published and 3 currently under review in various
journals), we fill the current literature gap on couples' emotion recognition,
develop emotion recognition systems using 161 hours of data from a total of
1,051 individuals, and make contributions towards taking couples' emotion
recognition from the lab, which is the status quo, to daily life. This thesis
contributes toward building automated emotion recognition systems that would
eventually enable partners to monitor their emotions in daily life and enable
the delivery of interventions to improve their emotional well-being.
Related papers
- SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations [53.60993109543582]
SemEval-2024 Task 3, named Multimodal Emotion Cause Analysis in Conversations, aims at extracting all pairs of emotions and their corresponding causes from conversations.
Under different modality settings, it consists of two subtasks: Textual Emotion-Cause Pair Extraction in Conversations (TECPE) and Multimodal Emotion-Cause Pair Extraction in Conversations (MECPE)
In this paper, we introduce the task, dataset and evaluation settings, summarize the systems of the top teams, and discuss the findings of the participants.
arXiv Detail & Related papers (2024-05-19T09:59:00Z) - Emotion in Cognitive Architecture: Emergent Properties from Interactions
with Human Emotion [0.0]
This document presents endeavors to represent emotion in a computational cognitive architecture.
The advantage of the cognitive human-agent interaction approach is in representing human internal states and processes.
arXiv Detail & Related papers (2022-12-28T23:50:27Z) - Face Emotion Recognization Using Dataset Augmentation Based on Neural
Network [0.0]
Facial expression is one of the most external indications of a person's feelings and emotions.
It plays an important role in coordinating interpersonal relationships.
As a branch of sentiment analysis, facial expression recognition offers broad application prospects.
arXiv Detail & Related papers (2022-10-23T10:21:45Z) - Why Do You Feel This Way? Summarizing Triggers of Emotions in Social
Media Posts [61.723046082145416]
We introduce CovidET (Emotions and their Triggers during Covid-19), a dataset of 1,900 English Reddit posts related to COVID-19.
We develop strong baselines to jointly detect emotions and summarize emotion triggers.
Our analyses show that CovidET presents new challenges in emotion-specific summarization, as well as multi-emotion detection in long social media posts.
arXiv Detail & Related papers (2022-10-22T19:10:26Z) - "Are you okay, honey?": Recognizing Emotions among Couples Managing
Diabetes in Daily Life using Multimodal Real-World Smartwatch Data [8.355190969810305]
Couples generally manage chronic diseases together and the management takes an emotional toll on both patients and their romantic partners.
Recognizing the emotions of each partner in daily life could provide an insight into their emotional well-being in chronic disease management.
We extracted physiological, movement, acoustic, and linguistic features, and trained machine learning models to recognize each partner's self-reported emotions.
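As a rough illustration of the multimodal pipeline this abstract describes, the sketch below computes simple summary statistics per sensor window and concatenates them into one feature vector for a downstream classifier. The sensor values and the choice of features are invented for illustration, not taken from the paper.

```python
import statistics

def window_features(samples):
    """Summary statistics for one sensor window (hypothetical feature set)."""
    return [
        statistics.fmean(samples),   # mean level
        statistics.pstdev(samples),  # variability
        min(samples),
        max(samples),
    ]

# Toy smartwatch windows: heart rate (bpm) and accelerometer magnitude.
hr = [72, 75, 78, 74, 90, 88]
acc = [0.1, 0.2, 0.15, 0.9, 0.8, 0.3]

# Early fusion: concatenate per-modality features into a single vector.
feature_vector = window_features(hr) + window_features(acc)
print(len(feature_vector))  # 4 features per modality, 8 in total
```

In practice each window would feed a trained model (e.g. gradient boosting or a neural network) that maps the vector to the partner's self-reported emotion label.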
arXiv Detail & Related papers (2022-08-16T22:04:12Z) - Speech Synthesis with Mixed Emotions [77.05097999561298]
We propose a novel formulation that measures the relative difference between the speech samples of different emotions.
We then incorporate our formulation into a sequence-to-sequence emotional text-to-speech framework.
At run-time, we control the model to produce the desired emotion mixture by manually defining an emotion attribute vector.
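The run-time control described above, a manually defined emotion attribute vector, can be pictured as a normalized weight vector over an emotion inventory. The emotion set and weights below are assumptions for illustration, not the paper's actual formulation.

```python
# Illustrative emotion inventory; the paper's categories may differ.
EMOTIONS = ["neutral", "happy", "sad", "angry"]

def mix(weights):
    """Normalize raw attribute weights so the mixture sums to 1."""
    total = sum(weights.values())
    return {e: weights.get(e, 0.0) / total for e in EMOTIONS}

# A run-time request for a 70% happy / 30% sad mixed emotion.
vector = mix({"happy": 0.7, "sad": 0.3})
print(vector)
```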
arXiv Detail & Related papers (2022-08-11T15:45:58Z) - Bridging the gap between emotion and joint action [0.0]
Joint action brings individuals (and embodiments of their emotions) together, in space and in time.
Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion.
In this review, we first identify the gap and then gather evidence of strong entanglement between emotion and acting together from various branches of science.
arXiv Detail & Related papers (2021-08-13T14:21:37Z) - Emotion-aware Chat Machine: Automatic Emotional Response Generation for
Human-like Emotional Interaction [55.47134146639492]
This article proposes a unified end-to-end neural architecture, which is capable of simultaneously encoding the semantics and the emotions in a post.
Experiments on real-world data demonstrate that the proposed method outperforms the state-of-the-art methods in terms of both content coherence and emotion appropriateness.
arXiv Detail & Related papers (2021-06-06T06:26:15Z) - "You made me feel this way": Investigating Partners' Influence in
Predicting Emotions in Couples' Conflict Interactions using Speech Data [3.618388731766687]
How romantic partners interact with each other during a conflict influences how they feel at the end of the interaction.
In this work, we used BERT to extract linguistic features (i.e., what partners said) and openSMILE to extract paralinguistic features (i.e., how they said it) from a data set of 368 German-speaking Swiss couples.
Based on those features, we trained machine learning models to predict if partners feel positive or negative after the conflict interaction.
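A minimal sketch of the fusion step this abstract implies: pre-extracted linguistic and paralinguistic features are concatenated and passed to a binary classifier. The feature values and classifier weights below are hypothetical stand-ins for BERT and openSMILE outputs, not values from the study.

```python
import math

# Pretend these were produced upstream by BERT (linguistic) and
# openSMILE (paralinguistic); dimensions and values are illustrative.
linguistic = [0.2, -0.4, 0.9]
paralinguistic = [1.3, 0.1]

# Early fusion: concatenate both views into one feature vector.
x = linguistic + paralinguistic

# Hypothetical trained weights for a logistic-regression classifier
# predicting positive (1) vs. negative (0) post-conflict emotion.
w = [0.5, -0.2, 0.3, 0.1, 0.4]
b = -0.1

z = sum(wi * xi for wi, xi in zip(w, x)) + b
p_positive = 1 / (1 + math.exp(-z))  # sigmoid maps the score to [0, 1]
print(round(p_positive, 3))
```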
arXiv Detail & Related papers (2021-06-03T01:15:41Z) - Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of facial muscle movements.
We determine if there are time-related differences on expressions among emotional groups by using a functional F-test.
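The functional F-test extends the classical one-way ANOVA F-statistic from single values to whole time curves. The toy example below computes that statistic at a single time point; the activation values and group labels are made up for illustration.

```python
import statistics

# Does mean muscle activation at one time point differ across
# emotion groups? (The functional F-test applies this idea to the
# entire curve rather than one point.)
groups = {
    "happy": [2.1, 2.4, 2.0, 2.3],
    "sad":   [1.1, 1.3, 1.0, 1.2],
    "angry": [3.0, 3.2, 2.9, 3.1],
}

k = len(groups)                              # number of groups
n = sum(len(g) for g in groups.values())     # total observations
grand = statistics.fmean(v for g in groups.values() for v in g)

# Between-group and within-group sums of squares.
ss_between = sum(len(g) * (statistics.fmean(g) - grand) ** 2
                 for g in groups.values())
ss_within = sum((v - statistics.fmean(g)) ** 2
                for g in groups.values() for v in g)

# F = (between-group variance) / (within-group variance).
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(f_stat))
```

A large F (here far above the critical value for 2 and 9 degrees of freedom) indicates the group means differ more than within-group noise can explain.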
arXiv Detail & Related papers (2021-03-01T08:31:08Z) - Emotion Recognition From Gait Analyses: Current Research and Future
Directions [48.93172413752614]
Gait conveys information about the walker's emotion.
The mapping between various emotions and gait patterns provides a new source for automated emotion recognition.
Gait is remotely observable, more difficult to imitate, and requires less cooperation from the subject.
arXiv Detail & Related papers (2020-03-13T08:22:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.