WEARS: Wearable Emotion AI with Real-time Sensor data
- URL: http://arxiv.org/abs/2308.11673v1
- Date: Tue, 22 Aug 2023 11:03:00 GMT
- Title: WEARS: Wearable Emotion AI with Real-time Sensor data
- Authors: Dhruv Limbani, Daketi Yatin, Nitish Chaturvedi, Vaishnavi Moorthy,
Pushpalatha M, Harichandana BSS
- Abstract summary: We propose a system to predict user emotion using smartwatch sensors.
We design a framework to collect ground truth in real time, using a mix of English and regional-language videos.
We also conducted an ablation study to understand the impact of features including heart rate, accelerometer, and gyroscope sensor data on mood.
- Score: 0.8740570557632509
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Emotion prediction is the field of study concerned with
understanding human emotions. Existing methods focus on modalities like
text, audio, facial expressions, etc., which could be private to the user.
Emotion can be derived from the subject's physiological data as well.
Various approaches that employ combinations of physiological sensors for
emotion recognition have been proposed. Yet, not all such sensors are easy
to use or convenient for individuals in their daily lives. Thus, we propose
a system to predict user emotion using smartwatch sensors. We design a
framework to collect ground truth in real time, using a mix of English and
regional-language videos to elicit emotions in participants while the data
is collected. Further, we modeled the problem as binary classification due
to the limited dataset size and experimented with multiple machine-learning
models. We also conducted an ablation study to understand the impact of
features including heart rate, accelerometer, and gyroscope sensor data on
mood. In our experiments, a Multi-Layer Perceptron achieved the highest
accuracy of 93.75 percent for pleasant versus unpleasant (high/low valence)
classification.
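To make the setup concrete, below is a minimal sketch of binary valence
classification with a Multi-Layer Perceptron plus a per-sensor ablation loop,
in the spirit of the study. The windowed summary features, synthetic data, and
hyperparameters are illustrative assumptions, not the authors' exact pipeline.

    # Hedged sketch: binary valence classification from smartwatch sensors,
    # with a per-sensor ablation. Data here is synthetic; real windows would
    # come from heart rate (HR), accelerometer, and gyroscope streams.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_windows = 400

    # Assumed per-window summary features: mean/std per channel.
    features = {
        "heart_rate": rng.normal(size=(n_windows, 2)),     # mean, std of HR
        "accelerometer": rng.normal(size=(n_windows, 6)),  # mean, std per axis
        "gyroscope": rng.normal(size=(n_windows, 6)),
    }
    y = rng.integers(0, 2, size=n_windows)  # 1 = pleasant, 0 = unpleasant

    def evaluate(feature_names):
        """Train an MLP on the given feature groups; return test accuracy."""
        X = np.hstack([features[name] for name in feature_names])
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000,
                          random_state=0),
        )
        clf.fit(X_tr, y_tr)
        return clf.score(X_te, y_te)

    all_groups = list(features)
    print("all sensors:", evaluate(all_groups))
    # Ablation: drop one sensor group at a time and compare.
    for dropped in all_groups:
        kept = [g for g in all_groups if g != dropped]
        print(f"without {dropped}:", evaluate(kept))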
Related papers
- SensEmo: Enabling Affective Learning through Real-time Emotion Recognition with Smartwatches [3.7303587372123315]
SensEmo is a smartwatch-based system designed for affective learning.
SensEmo recognizes student emotion with an average of 88.9% accuracy.
SensEmo helps students achieve better online learning outcomes.
arXiv Detail & Related papers (2024-07-13T15:10:58Z)
- Language Models (Mostly) Do Not Consider Emotion Triggers When Predicting Emotion [87.18073195745914]
We investigate how well human-annotated emotion triggers correlate with features deemed salient in their prediction of emotions.
Using EmoTrigger, we evaluate the ability of large language models to identify emotion triggers.
Our analysis reveals that emotion triggers are largely not considered salient features by emotion prediction models; instead, there is an intricate interplay between various features and the task of emotion detection.
arXiv Detail & Related papers (2023-11-16T06:20:13Z)
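As a rough illustration of this kind of evaluation, the sketch below scores a
model's predicted trigger phrase against a human annotation with token-level
F1. The query_llm stub and the example data are hypothetical stand-ins, not
the EmoTrigger protocol or a real model API.

    # Hedged sketch of an EmoTrigger-style evaluation: ask a model for the
    # trigger phrase behind an emotion and score token overlap against human
    # annotations. `query_llm` is a hypothetical stand-in, not a real API.
    def query_llm(text: str, emotion: str) -> str:
        # Placeholder; a real study would call an actual language model here.
        return "lost my job"

    def token_f1(pred: str, gold: str) -> float:
        """Token-level F1 between predicted and annotated trigger spans."""
        p, g = set(pred.lower().split()), set(gold.lower().split())
        if not p or not g:
            return 0.0
        overlap = len(p & g)
        if overlap == 0:
            return 0.0
        precision, recall = overlap / len(p), overlap / len(g)
        return 2 * precision * recall / (precision + recall)

    examples = [  # (text, emotion, human-annotated trigger)
        ("I lost my job this morning and can't stop crying.",
         "sadness", "lost my job"),
    ]
    scores = [token_f1(query_llm(t, e), gold) for t, e, gold in examples]
    print("mean trigger F1:", sum(scores) / len(scores))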
- Towards Generalizable SER: Soft Labeling and Data Augmentation for Modeling Temporal Emotion Shifts in Large-Scale Multilingual Speech [3.86122440373248]
We propose a soft labeling system to capture gradational emotional intensities.
Using the Whisper encoder and data augmentation methods inspired by contrastive learning, our method emphasizes the temporal dynamics of emotions.
We publish our open-source model weights and promising initial results after fine-tuning on Hume-Prosody.
arXiv Detail & Related papers (2023-11-15T00:09:21Z)
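A minimal sketch of the soft-labeling idea above: train a classification head
against probability targets rather than one-hot labels. The random features
stand in for pooled Whisper-encoder embeddings, and the target construction is
an assumption for illustration.

    # Hedged sketch of soft-label training for speech emotion recognition.
    # Pooled encoder features are random stand-ins for Whisper embeddings;
    # the soft targets encode gradational intensity across emotion classes.
    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    n_clips, feat_dim, n_emotions = 64, 512, 4

    feats = torch.randn(n_clips, feat_dim)  # stand-in pooled encoder output
    # Soft labels: e.g. a clip that is mostly angry but partly sad gets
    # probability mass on both classes instead of a one-hot target.
    soft_targets = torch.softmax(torch.randn(n_clips, n_emotions), dim=1)

    head = torch.nn.Linear(feat_dim, n_emotions)
    opt = torch.optim.Adam(head.parameters(), lr=1e-3)

    for step in range(100):
        logits = head(feats)
        # cross_entropy accepts class probabilities as targets, which makes
        # gradational (soft) emotion labels straightforward to train on.
        loss = F.cross_entropy(logits, soft_targets)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print("final loss:", loss.item())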
- Implicit Design Choices and Their Impact on Emotion Recognition Model Development and Evaluation [5.534160116442057]
The subjectivity of emotions poses significant challenges in developing accurate and robust computational models.
This thesis examines critical facets of emotion recognition, beginning with the collection of diverse datasets.
To handle the challenge of non-representative training data, this work collects the Multimodal Stressed Emotion dataset.
arXiv Detail & Related papers (2023-09-06T02:45:42Z)
- Emotion Analysis on EEG Signal Using Machine Learning and Neural Network [0.0]
The main purpose of this study is to improve emotion recognition performance using brain signals.
Various approaches to human-machine interaction technologies have been ongoing for a long time, and in recent years, researchers have had great success in automatically understanding emotion using brain signals.
arXiv Detail & Related papers (2023-07-09T09:50:34Z)
- EmoSens: Emotion Recognition based on Sensor data analysis using LightGBM [1.6197357532363172]
The study examines the performance of various supervised learning models such as Decision Trees, Random Forests, XGBoost, LightGBM on the dataset.
With our proposed model, we obtained a high recognition rate of 92.5% using XGBoost and LightGBM for 9 different emotion classes.
arXiv Detail & Related papers (2022-07-12T13:52:32Z)
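A hedged sketch of the kind of model comparison described above, using
scikit-learn estimators on synthetic features; if installed,
xgboost.XGBClassifier and lightgbm.LGBMClassifier drop into the same loop via
their scikit-learn-compatible interfaces.

    # Hedged sketch of a supervised-model comparison on sensor features.
    # The data is synthetic; 9 classes mirror the EmoSens emotion set.
    import numpy as np
    from sklearn.ensemble import (GradientBoostingClassifier,
                                  RandomForestClassifier)
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 12))    # stand-in sensor features
    y = rng.integers(0, 9, size=300)  # 9 emotion classes

    models = {
        "decision_tree": DecisionTreeClassifier(random_state=0),
        "random_forest": RandomForestClassifier(random_state=0),
        "gradient_boosting": GradientBoostingClassifier(random_state=0),
    }
    for name, model in models.items():
        acc = cross_val_score(model, X, y, cv=5).mean()
        print(f"{name}: {acc:.3f}")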
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
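A toy sketch of representing emotions as vectors on a circle. The attribute
names below (a polarity, an angle for emotion type, an intensity for vector
length) are illustrative guesses at the three attributes, not the paper's
exact definitions.

    # Hedged sketch of an emotion vector on a circle. The three attributes
    # are illustrative assumptions about the representation described above.
    import math
    from dataclasses import dataclass

    @dataclass
    class EmotionVector:
        polarity: int     # +1 positive, -1 negative; kept as data here
        angle: float      # emotion type as a position on the circle (radians)
        intensity: float  # vector length in [0, 1]

        def to_xy(self) -> tuple[float, float]:
            """Cartesian coordinates of the vector inside the unit circle."""
            return (self.intensity * math.cos(self.angle),
                    self.intensity * math.sin(self.angle))

    def similarity(a: EmotionVector, b: EmotionVector) -> float:
        """Cosine-style similarity between two emotion vectors."""
        ax, ay = a.to_xy()
        bx, by = b.to_xy()
        dot = ax * bx + ay * by
        na, nb = math.hypot(ax, ay), math.hypot(bx, by)
        return dot / (na * nb) if na and nb else 0.0

    joy = EmotionVector(polarity=+1, angle=math.pi / 4, intensity=0.9)
    contentment = EmotionVector(polarity=+1, angle=math.pi / 3, intensity=0.5)
    print("similarity:", similarity(joy, contentment))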
- Affect2MM: Affective Analysis of Multimedia Content Using Emotion Causality [84.69595956853908]
We present Affect2MM, a learning method for time-series emotion prediction for multimedia content.
Our goal is to automatically capture the varying emotions depicted by characters in real-life human-centric situations and behaviors.
arXiv Detail & Related papers (2021-03-11T09:07:25Z)
- Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition [55.44502358463217]
We propose a modality-transferable model with emotion embeddings to tackle low-resource multimodal emotion recognition.
Our model achieves state-of-the-art performance on most of the emotion categories.
Our model also outperforms existing baselines in the zero-shot and few-shot scenarios for unseen emotions.
arXiv Detail & Related papers (2020-09-21T06:10:39Z)
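A small sketch of how emotion embeddings enable zero-shot recognition: rank
emotion labels by similarity between a predicted embedding and embeddings of
the emotion words themselves, so labels unseen in training can still be
scored. The random vectors stand in for real word embeddings and model
outputs.

    # Hedged sketch of zero-shot emotion classification via emotion
    # embeddings. Vectors are random stand-ins for real word embeddings
    # (e.g. GloVe) and a real model's predicted embedding.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 50

    # Embeddings for emotion words; "awe" stands in for an unseen emotion.
    emotion_words = ["happy", "sad", "angry", "awe"]
    emotion_embs = {w: rng.normal(size=dim) for w in emotion_words}

    predicted = rng.normal(size=dim)  # stand-in for the model's output

    def cosine(u: np.ndarray, v: np.ndarray) -> float:
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    ranked = sorted(emotion_words,
                    key=lambda w: cosine(predicted, emotion_embs[w]),
                    reverse=True)
    print("ranked emotions (including unseen):", ranked)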
- Emotion Recognition From Gait Analyses: Current Research and Future Directions [48.93172413752614]
Gait conveys information about the walker's emotion.
The mapping between various emotions and gait patterns provides a new source for automated emotion recognition.
Gait is remotely observable, more difficult to imitate, and requires less cooperation from the subject.
arXiv Detail & Related papers (2020-03-13T08:22:33Z)
- ProxEmo: Gait-based Emotion Learning and Multi-view Proxemic Fusion for Socially-Aware Robot Navigation [65.11858854040543]
We present ProxEmo, a novel end-to-end emotion prediction algorithm for robot navigation among pedestrians.
Our approach predicts the perceived emotions of a pedestrian from walking gaits, which is then used for emotion-guided navigation.
arXiv Detail & Related papers (2020-03-02T17:47:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.