SensEmo: Enabling Affective Learning through Real-time Emotion Recognition with Smartwatches
- URL: http://arxiv.org/abs/2407.09911v1
- Date: Sat, 13 Jul 2024 15:10:58 GMT
- Title: SensEmo: Enabling Affective Learning through Real-time Emotion Recognition with Smartwatches
- Authors: Kushan Choksi, Hongkai Chen, Karan Joshi, Sukrutha Jade, Shahriar Nirjon, Shan Lin
- Abstract summary: SensEmo is a smartwatch-based system designed for affective learning.
SensEmo recognizes student emotion with an average accuracy of 88.9%.
SensEmo helps students achieve better online learning outcomes.
- Score: 3.7303587372123315
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent research has demonstrated the capability of physiological signals to infer both user emotional and attention responses. This presents an opportunity to leverage the widely available physiological sensors in smartwatches to detect real-time emotional cues in users, such as stress and excitement. In this paper, we introduce SensEmo, a smartwatch-based system designed for affective learning. SensEmo uses multiple types of physiological sensor data, including heart rate and galvanic skin response, to recognize a student's motivation and concentration levels during class. This recognition is facilitated by a personalized emotion recognition model that predicts emotional states based on degrees of valence and arousal. Using real-time emotion and attention feedback from students, we design a Markov decision process-based algorithm that enhances student learning effectiveness and experience by offering suggestions to the teacher regarding teaching content and pacing. We evaluate SensEmo with 22 participants in real-world classroom environments. Evaluation results show that SensEmo recognizes student emotion with an average accuracy of 88.9%. More importantly, SensEmo helps students achieve better online learning outcomes, e.g., an average of 40.0% higher quiz grades, than traditional learning without student emotional feedback.
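To make the pipeline concrete, below is a minimal, hypothetical Python sketch: a placeholder mapping from heart rate and galvanic skin response to valence/arousal scores, followed by a toy Markov decision process whose actions are pacing suggestions for the teacher. The feature mapping, engagement states, transition probabilities, and rewards are all illustrative assumptions, not the values used by SensEmo.

```python
import numpy as np

def affect_from_sensors(hr_bpm, gsr_us, hr_base=70.0, gsr_base=2.0):
    """Map heart rate (bpm) and galvanic skin response (microsiemens) to
    rough (valence, arousal) scores in [-1, 1]; the mapping is a placeholder,
    not SensEmo's personalized model."""
    arousal = np.tanh(0.5 * (gsr_us - gsr_base) + 0.02 * (hr_bpm - hr_base))
    valence = np.tanh(0.01 * (75.0 - hr_bpm))  # hypothetical proxy only
    return valence, arousal

# Toy MDP: states are aggregate engagement levels, actions are pacing
# suggestions shown to the teacher. All numbers are made up for illustration.
STATES = ["low", "medium", "high"]
ACTIONS = ["slow_down", "keep_pace", "speed_up"]
P = {  # P[a][s, s']: transition probabilities under action a
    "slow_down": np.array([[0.3, 0.6, 0.1], [0.1, 0.6, 0.3], [0.1, 0.3, 0.6]]),
    "keep_pace": np.array([[0.7, 0.2, 0.1], [0.2, 0.6, 0.2], [0.1, 0.2, 0.7]]),
    "speed_up":  np.array([[0.8, 0.1, 0.1], [0.3, 0.5, 0.2], [0.1, 0.4, 0.5]]),
}
R = np.array([0.0, 0.5, 1.0])  # reward increases with engagement

def best_pacing(gamma=0.9, iters=200):
    """Value iteration; returns the greedy pacing suggestion per state."""
    v = np.zeros(len(STATES))
    for _ in range(iters):
        q = np.stack([P[a] @ (R + gamma * v) for a in ACTIONS])
        v = q.max(axis=0)
    return dict(zip(STATES, (ACTIONS[i] for i in q.argmax(axis=0))))

print(affect_from_sensors(hr_bpm=82, gsr_us=3.1))
print(best_pacing())
```

Value iteration simply picks the pacing action that maximizes expected long-term engagement; the paper's actual state space and reward design may differ.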
Related papers
- Real-time EEG-based Emotion Recognition Model using Principal Component Analysis and Tree-based Models for Neurohumanities [0.0]
This project proposes a solution that incorporates emotional monitoring into the learning process inside an immersive space.
A real-time, EEG-based emotion detection system was developed to interpret and classify specific emotions.
This system aims to integrate emotional data into the Neurohumanities Lab interactive platform, creating a comprehensive and immersive learning environment.
arXiv Detail & Related papers (2024-01-28T20:02:13Z)
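As a rough illustration of the pipeline this entry names, the sketch below chains PCA with a random-forest classifier in scikit-learn; the synthetic features stand in for windowed EEG data, and the hyperparameters are assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 64))      # stand-in for per-window EEG features
y = rng.integers(0, 4, size=400)    # four hypothetical emotion classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(PCA(n_components=16),  # compress EEG feature space
                      RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))  # ~chance on random data
```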
- Enhancing Emotional Generation Capability of Large Language Models via Emotional Chain-of-Thought [50.13429055093534]
Large Language Models (LLMs) have shown remarkable performance in various emotion recognition tasks.
We propose the Emotional Chain-of-Thought (ECoT) to enhance the performance of LLMs on various emotional generation tasks.
arXiv Detail & Related papers (2024-01-12T16:42:10Z)
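The entry gives no template, so the following is a speculative sketch of what an emotional chain-of-thought prompt could look like; the step wording and the helper name `ecot_prompt` are our assumptions, not the paper's prompt.

```python
def ecot_prompt(task: str, emotion: str) -> str:
    """Build a chain-of-thought style prompt for emotional generation
    (hypothetical; the actual ECoT steps come from the cited paper)."""
    steps = [
        "1. Identify the emotion the response should convey.",
        "2. Recall situations and phrasing associated with that emotion.",
        "3. Draft a response, then check it is consistent with the emotion.",
    ]
    return (f"Task: {task}\nTarget emotion: {emotion}\n"
            "Reason step by step before answering:\n" + "\n".join(steps))

print(ecot_prompt("Comfort a student who failed a quiz", "empathy"))
```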
- Enhancing Student Engagement in Online Learning through Facial Expression Analysis and Complex Emotion Recognition using Deep Learning [1.3812010983144802]
This paper introduces a novel approach that employs deep learning techniques based on facial expressions to assess students' engagement levels during online learning sessions.
To address this challenge, the proposed work generates four complex emotions, namely confusion, satisfaction, disappointment, and frustration, by combining basic emotions.
The proposed work utilizes a Convolutional Neural Network (CNN) model to accurately categorize the fundamental emotional states of learners.
arXiv Detail & Related papers (2023-11-17T06:07:54Z)
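One way to read "combining basic emotions" is as a fusion of the basic-emotion probabilities from the CNN's softmax output. The pairings below are purely hypothetical illustrations; the paper defines its own combination rules.

```python
BASIC = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def complex_emotions(p: dict) -> dict:
    """p maps basic emotions (e.g. CNN softmax outputs) to probabilities.
    The products below are assumed pairings, not the paper's rules."""
    return {
        "confusion":      p["surprise"] * p["fear"],
        "satisfaction":   p["happiness"] * p["neutral"],
        "disappointment": p["sadness"] * p["neutral"],
        "frustration":    p["anger"] * p["sadness"],
    }

probs = {e: 1 / len(BASIC) for e in BASIC}   # placeholder CNN output
print(complex_emotions(probs))
```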
- WEARS: Wearable Emotion AI with Real-time Sensor data [0.8740570557632509]
We propose a system to predict user emotion using smartwatch sensors.
We design a framework to collect ground truth in real time using a mix of English and regional-language videos.
We also conducted an ablation study to understand the impact of features, including heart rate, accelerometer, and gyroscope sensor data, on mood.
arXiv Detail & Related papers (2023-08-22T11:03:00Z)
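A minimal sketch of a sensor-ablation loop in the spirit of this entry: train the same classifier while dropping one sensor's features at a time. The data, feature dimensions, and logistic-regression model are placeholders, not the WEARS setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
features = {"heart_rate": rng.normal(size=(300, 4)),      # placeholder data
            "accelerometer": rng.normal(size=(300, 12)),
            "gyroscope": rng.normal(size=(300, 12))}
mood = rng.integers(0, 2, size=300)        # binary mood label (placeholder)

for drop in [None, *features]:             # None = keep all sensors
    X = np.hstack([v for k, v in features.items() if k != drop])
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, mood, cv=5).mean()
    print(f"without {drop!s:>13}: accuracy {acc:.3f}")
```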
- Large Language Models Understand and Can be Enhanced by Emotional Stimuli [53.53886609012119]
We take the first step towards exploring the ability of Large Language Models to understand emotional stimuli.
Our experiments show that LLMs have a grasp of emotional intelligence, and their performance can be improved with emotional prompts.
Our human study results demonstrate that EmotionPrompt significantly boosts the performance of generative tasks.
arXiv Detail & Related papers (2023-07-14T00:57:12Z)
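The mechanism is simple enough to sketch: append an emotional stimulus sentence to an ordinary task prompt. The stimulus shown is one commonly quoted example; the paper's full set of stimuli differs.

```python
STIMULUS = "This is very important to my career."  # example emotional stimulus

def emotion_prompt(task: str) -> str:
    """Append an emotional stimulus to a plain task prompt (hypothetical helper)."""
    return f"{task} {STIMULUS}"

print(emotion_prompt("Summarize the following abstract in one sentence."))
```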
- Emotion Analysis on EEG Signal Using Machine Learning and Neural Network [0.0]
The main purpose of this study is to improve emotion recognition performance using brain signals.
Various approaches to human-machine interaction technologies have been ongoing for a long time, and in recent years researchers have had great success in automatically understanding emotion using brain signals.
arXiv Detail & Related papers (2023-07-09T09:50:34Z)
- EmoSens: Emotion Recognition based on Sensor data analysis using LightGBM [1.6197357532363172]
The study examines the performance of various supervised learning models, such as Decision Trees, Random Forests, XGBoost, and LightGBM, on the dataset.
With our proposed model, we obtained a high recognition rate of 92.5% using XGBoost and LightGBM for 9 different emotion classes.
arXiv Detail & Related papers (2022-07-12T13:52:32Z)
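A minimal sketch of the gradient-boosting setup this entry names, assuming the `lightgbm` Python package; the synthetic features and hyperparameters stand in for the paper's sensor dataset and tuning.

```python
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(900, 20))       # placeholder sensor features
y = rng.integers(0, 9, size=900)     # 9 emotion classes, as in the entry

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LGBMClassifier(n_estimators=300, learning_rate=0.05)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```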
- Emotion Intensity and its Control for Emotional Voice Conversion [77.05097999561298]
Emotional voice conversion (EVC) seeks to convert the emotional state of an utterance while preserving the linguistic content and speaker identity.
In this paper, we aim to explicitly characterize and control the intensity of emotion.
We propose to disentangle the speaker style from linguistic content and encode the speaker style into a style embedding in a continuous space that forms the prototype of emotion embedding.
arXiv Detail & Related papers (2022-01-10T02:11:25Z)
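The entry suggests that emotion intensity lives on a continuous style embedding. The sketch below assumes a simple linear interpolation toward an emotion prototype; the paper's actual disentanglement and conversion model are far more involved.

```python
import numpy as np

def apply_emotion(content_embedding, emotion_prototype, intensity):
    """Blend a neutral content embedding with an emotion prototype;
    intensity in [0, 1] scales how far we move toward the prototype.
    This interpolation rule is our assumption."""
    return content_embedding + intensity * (emotion_prototype - content_embedding)

content = np.zeros(8)        # placeholder linguistic content code
happy_proto = np.ones(8)     # placeholder emotion prototype
print(apply_emotion(content, happy_proto, intensity=0.3))
```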
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize prior knowledge for visual emotion distribution learning.
Specifically, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented by an emotion vector, which is defined with three attributes.
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
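The three attributes of the emotion vector are not spelled out here; one plausible reading, sketched below, treats them as polarity, angle, and length on a valence-arousal circle. This interpretation is our assumption, not the paper's definition.

```python
import math

def emotion_vector(valence, arousal):
    """Map a (valence, arousal) point to (polarity, angle, length);
    the attribute choice is hypothetical."""
    angle = math.atan2(arousal, valence)             # direction on the circle
    length = min(1.0, math.hypot(valence, arousal))  # intensity, clipped to 1
    polarity = 1 if valence >= 0 else -1
    return polarity, angle, length

print(emotion_vector(valence=0.6, arousal=0.4))
```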
- Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and achieves state-of-the-art results for classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z)
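As a rough PyTorch sketch of "a contextualized embedding encoder with a multi-head probing model", the module below averages several linear probing heads over a sentence embedding. The 32-class output follows the entry; the 768-dim input, head count, and pooling are assumptions.

```python
import torch
import torch.nn as nn

class MultiHeadProbe(nn.Module):
    def __init__(self, hidden=768, heads=4, classes=32):
        super().__init__()
        self.heads = nn.ModuleList(nn.Linear(hidden, classes) for _ in range(heads))

    def forward(self, sentence_embedding):
        # Average per-head logits; the paper's pooling may differ.
        return torch.stack([h(sentence_embedding) for h in self.heads]).mean(0)

probe = MultiHeadProbe()
logits = probe(torch.randn(2, 768))  # stand-in for encoder outputs
print(logits.shape)                  # torch.Size([2, 32])
```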
- Continuous Emotion Recognition via Deep Convolutional Autoencoder and Support Vector Regressor [70.2226417364135]
It is crucial that the machine be able to recognize the emotional state of the user with high accuracy.
Deep neural networks have been used with great success in recognizing emotions.
We present a new model for continuous emotion recognition based on facial expression recognition.
arXiv Detail & Related papers (2020-01-31T17:47:16Z)
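A compact sketch of the two-stage idea in this entry: a convolutional encoder (untrained here) produces face embeddings, and a support vector regressor maps them to a continuous affect score. Image size, architecture, and labels are placeholders, not the paper's model.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVR

encoder = nn.Sequential(             # toy stand-in for the CAE encoder
    nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(16 * 12 * 12, 32),
)
faces = torch.randn(64, 1, 48, 48)   # placeholder face crops
with torch.no_grad():
    z = encoder(faces).numpy()       # latent codes

valence = np.random.default_rng(3).uniform(-1, 1, size=64)  # placeholder labels
svr = SVR(kernel="rbf").fit(z, valence)
print("train R^2:", svr.score(z, valence))
```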