Personalized Emotion Detection using IoT and Machine Learning
- URL: http://arxiv.org/abs/2209.06464v1
- Date: Wed, 14 Sep 2022 07:36:03 GMT
- Title: Personalized Emotion Detection using IoT and Machine Learning
- Authors: Fiona Victoria Stanley Jothiraj and Afra Mashhadi
- Abstract summary: This paper presents a non-invasive IoT system that tracks patients' emotions, especially those with autism spectrum disorder.
With a few affordable sensors and cloud computing services, the individual's heart rate is monitored and analyzed to study how changes in sweat and heartbeats per minute relate to different emotions.
The proposed system could detect the right emotion using machine learning algorithms with up to 92% accuracy.
- Score: 6.09170287691728
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Medical Internet of Things, a recent technological advancement in
medicine, is incredibly helpful in providing real-time monitoring of health
metrics. This paper presents a non-invasive IoT system that tracks patients'
emotions, especially those with autism spectrum disorder. With a few affordable
sensors and cloud computing services, the individual's heart rate is monitored
and analyzed to study how changes in sweat and heartbeats per minute relate to
different emotions. Under normal resting conditions of the
individual, the proposed system could detect the right emotion using machine
learning algorithms with up to 92% accuracy. The result of the
proposed approach is comparable with the state-of-the-art solutions in medical
IoT.
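The pipeline described above, aggregating heart-rate and sweat readings per window and classifying them with a standard machine learning algorithm, can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's code: the feature set, the cluster statistics, and the choice of a random forest are all assumptions.

```python
# Minimal sketch: classifying emotion from heart-rate (BPM) and
# skin-conductance features, assuming windowed sensor readings have
# already been aggregated into one feature vector per sample.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical features: [mean BPM, BPM variance, mean skin conductance].
# Synthetic stand-in data: "calm" samples cluster at lower BPM and
# conductance than "stressed" samples.
calm = rng.normal([70, 5, 2.0], [5, 1, 0.3], size=(100, 3))
stressed = rng.normal([95, 12, 4.0], [5, 2, 0.4], size=(100, 3))
X = np.vstack([calm, stressed])
y = np.array(["calm"] * 100 + ["stressed"] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)  # well-separated clusters => high accuracy
```

On real sensor streams the feature extraction (windowing, artifact removal) dominates the effort; the classifier itself stays this simple.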
Related papers
- EEG Emotion Copilot: Pruning LLMs for Emotional EEG Interpretation with Assisted Medical Record Generation [13.048477440429195]
This paper presents the EEG Emotion Copilot, a system leveraging a lightweight large language model (LLM) operating in a local setting.
The system is designed to first recognize emotional states directly from EEG signals and subsequently generate personalized diagnostic and treatment suggestions.
Privacy concerns are also addressed, with a focus on ethical data collection, processing, and the protection of users' personal information.
arXiv Detail & Related papers (2024-09-30T19:15:05Z)
- A Health Monitoring System Based on Flexible Triboelectric Sensors for Intelligence Medical Internet of Things and its Applications in Virtual Reality [4.522609963399036]
The Internet of Medical Things (IoMT) is a platform that combines Internet of Things (IoT) technology with medical applications.
In this study, we designed a robust and intelligent IoMT system through the synergistic integration of flexible wearable triboelectric sensors and deep learning-assisted data analytics.
We embedded four triboelectric sensors into a wristband to detect and analyze limb movements in patients suffering from Parkinson's Disease (PD).
This approach enabled us to accurately capture and scrutinize the subtle movements and fine motor skills of PD patients, providing insightful feedback and a comprehensive assessment of the patients' conditions.
arXiv Detail & Related papers (2023-09-13T01:01:16Z)
- WEARS: Wearable Emotion AI with Real-time Sensor data [0.8740570557632509]
We propose a system to predict user emotion using smartwatch sensors.
We design a framework to collect ground truth in real-time utilizing a mix of English and regional language-based videos.
We also conducted an ablation study to understand the impact of features including Heart Rate, Accelerometer, and Gyroscope sensor data on mood.
arXiv Detail & Related papers (2023-08-22T11:03:00Z)
- Emotion Analysis on EEG Signal Using Machine Learning and Neural Network [0.0]
The main purpose of this study is to improve emotion recognition performance using brain signals.
Various approaches to human-machine interaction technologies have been ongoing for a long time, and in recent years, researchers have had great success in automatically understanding emotion using brain signals.
arXiv Detail & Related papers (2023-07-09T09:50:34Z)
- Multimodal Emotion Recognition using Transfer Learning from Speaker Recognition and BERT-based models [53.31917090073727]
We propose a neural network-based emotion recognition framework that uses a late fusion of transfer-learned and fine-tuned models from speech and text modalities.
We evaluate the effectiveness of our proposed multimodal approach on the interactive emotional dyadic motion capture dataset.
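The late-fusion step this entry describes can be sketched as a weighted average of the per-modality class probabilities, with the fused class taken as the argmax. This is an illustration, not the paper's implementation: the class set, the probabilities, and the fusion weight are made up.

```python
# Minimal late-fusion sketch: combine a speech model's and a text
# model's class-probability vectors by weighted averaging.
import numpy as np

def late_fusion(p_speech, p_text, w_speech=0.5):
    """Weighted average of two modality probability vectors, renormalized."""
    p_speech, p_text = np.asarray(p_speech), np.asarray(p_text)
    fused = w_speech * p_speech + (1.0 - w_speech) * p_text
    return fused / fused.sum()

# Hypothetical 4-class output (e.g. angry, happy, sad, neutral):
p_speech = [0.10, 0.60, 0.20, 0.10]  # speech model favors class 1
p_text = [0.05, 0.15, 0.70, 0.10]    # text model favors class 2
fused = late_fusion(p_speech, p_text, w_speech=0.4)
predicted = int(np.argmax(fused))    # text weighted higher -> class 2 wins
```

Late fusion keeps each modality's model independent, so either branch can be retrained or swapped without touching the other.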
arXiv Detail & Related papers (2022-02-16T00:23:42Z)
- Stimuli-Aware Visual Emotion Analysis [75.68305830514007]
We propose a stimuli-aware visual emotion analysis (VEA) method consisting of three stages, namely stimuli selection, feature extraction and emotion prediction.
To the best of our knowledge, this is the first work to introduce a stimuli selection process into VEA in an end-to-end network.
Experiments demonstrate that the proposed method consistently outperforms the state-of-the-art approaches on four public visual emotion datasets.
arXiv Detail & Related papers (2021-09-04T08:14:52Z)
- Interpretable SincNet-based Deep Learning for Emotion Recognition from EEG brain activity [13.375254690028225]
SincNet is a convolutional neural network that efficiently learns customized band-pass filters.
In this study, we use SincNet to analyze the neural activity of individuals with Autism Spectrum Disorder (ASD).
We found that our system automatically learns the high-$\alpha$ (9-13 Hz) and $\beta$ (13-30 Hz) band suppression often present in individuals with ASD.
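The band-pass filters SincNet parameterizes can be sketched as windowed sinc kernels: the difference of two low-pass sinc functions gives a band-pass response. This is an illustration only; the cutoff frequencies here are hand-picked (SincNet learns them), and the sampling rate and kernel length are assumptions.

```python
# Sketch of a parameterized sinc band-pass FIR kernel, the building
# block SincNet learns cutoffs for.
import numpy as np

def sinc_bandpass(f_low, f_high, fs, kernel_len=129):
    """Band-pass FIR kernel passing f_low..f_high (Hz) at sampling rate fs."""
    t = np.arange(kernel_len) - (kernel_len - 1) / 2
    # ideal low-pass with cutoff fc; difference of two low-passes = band-pass
    lp = lambda fc: 2 * fc / fs * np.sinc(2 * fc / fs * t)
    return (lp(f_high) - lp(f_low)) * np.hamming(kernel_len)

# e.g. the high-alpha band (9-13 Hz) at an assumed 256 Hz EEG sampling rate
kernel = sinc_bandpass(9.0, 13.0, fs=256)
```

Because only the two cutoffs are free parameters per filter, the learned filterbank stays interpretable: each kernel maps directly to a frequency band.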
arXiv Detail & Related papers (2021-07-18T14:44:53Z)
- Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of face muscles movements.
We determine whether there are time-related differences in expressions among emotional groups by using a functional F-test.
arXiv Detail & Related papers (2021-03-01T08:31:08Z)
- AEGIS: A real-time multimodal augmented reality computer vision based system to assist facial expression recognition for individuals with autism spectrum disorder [93.0013343535411]
This paper presents the development of a multimodal augmented reality (AR) system which combines the use of computer vision and deep convolutional neural networks (CNNs).
The proposed system, which we call AEGIS, is an assistive technology deployable on a variety of user devices including tablets, smartphones, video conference systems, or smartglasses.
We leverage both spatial and temporal information in order to provide an accurate expression prediction, which is then converted into its corresponding visualization and drawn on top of the original video frame.
arXiv Detail & Related papers (2020-10-22T17:20:38Z)
- A Novel Transferability Attention Neural Network Model for EEG Emotion Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns the emotional discriminative information by adaptively highlighting transferable EEG brain-region data and samples.
This can be implemented by measuring the outputs of multiple brain-region-level discriminators and one single sample-level discriminator.
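One plausible reading of the discriminator-based weighting described above can be sketched as follows. This is an illustration, not the paper's implementation: the idea that a region whose domain discriminator cannot tell source from target (output near 0.5) counts as transferable, and the specific weighting formula, are assumptions here.

```python
# Sketch: turn brain-region-level domain-discriminator outputs into
# normalized attention weights, favoring domain-confused (transferable)
# regions.
import numpy as np

def transferability_weights(domain_probs):
    """domain_probs[i]: region i's discriminator output P(source domain)."""
    domain_probs = np.asarray(domain_probs)
    # 1.0 when the discriminator is maximally confused (p = 0.5),
    # 0.0 when it is certain (p = 0 or 1)
    confusion = 1.0 - np.abs(2.0 * domain_probs - 1.0)
    return confusion / confusion.sum()  # normalized attention weights

# Hypothetical discriminator outputs for four brain regions:
weights = transferability_weights([0.5, 0.9, 0.55, 0.1])  # region 0 weighted most
```

The same construction applies at the sample level with the single sample-level discriminator.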
arXiv Detail & Related papers (2020-09-21T02:42:30Z)
- Continuous Emotion Recognition via Deep Convolutional Autoencoder and Support Vector Regressor [70.2226417364135]
It is crucial that the machine be able to recognize the emotional state of the user with high accuracy.
Deep neural networks have been used with great success in recognizing emotions.
We present a new model for continuous emotion recognition based on facial expression recognition.
arXiv Detail & Related papers (2020-01-31T17:47:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.