Deep Learning-Enabled Sleep Staging From Vital Signs and Activity
Measured Using a Near-Infrared Video Camera
- URL: http://arxiv.org/abs/2306.03711v1
- Date: Tue, 6 Jun 2023 14:21:22 GMT
- Title: Deep Learning-Enabled Sleep Staging From Vital Signs and Activity
Measured Using a Near-Infrared Video Camera
- Authors: Jonathan Carter, João Jorge, Bindia Venugopal, Oliver Gibson, Lionel Tarassenko
- Abstract summary: We use heart rate, breathing rate and activity measures, all derived from a near-infrared video camera, to perform sleep stage classification.
We achieve an accuracy of 73.4% and a Cohen's kappa of 0.61 in four-class sleep stage classification.
- Score: 1.0499611180329802
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conventional sleep monitoring is time-consuming, expensive and uncomfortable,
requiring a large number of contact sensors to be attached to the patient.
Video data is commonly recorded as part of a sleep laboratory assessment. If
accurate sleep staging could be achieved solely from video, this would overcome
many of the problems of traditional methods. In this work we use heart rate,
breathing rate and activity measures, all derived from a near-infrared video
camera, to perform sleep stage classification. We use a deep transfer learning
approach to overcome data scarcity, by using an existing contact-sensor dataset
to learn effective representations from the heart and breathing rate time
series. Using a dataset of 50 healthy volunteers, we achieve an accuracy of
73.4% and a Cohen's kappa of 0.61 in four-class sleep stage classification,
establishing a new state-of-the-art for video-based sleep staging.
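As an illustration of the transfer learning approach described in the abstract, here is a minimal sketch: a small encoder for heart rate and breathing rate time series that could be pre-trained on an existing contact-sensor dataset and then reused as the backbone of a four-class sleep stage classifier trained on the smaller video-derived dataset. Every module name, layer size, and training detail below is an illustrative assumption (the paper's actual architecture and hyperparameters are not given in this summary), and the activity measures are omitted for brevity.

```python
# Illustrative sketch only: not the paper's actual model or hyperparameters.
import torch
import torch.nn as nn

class CardioRespEncoder(nn.Module):
    """Encodes a heart-rate + breathing-rate time series (2 channels) per epoch."""
    def __init__(self, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(2, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # -> (batch, hidden, 1)
        )

    def forward(self, x):                       # x: (batch, 2, time)
        return self.conv(x).squeeze(-1)         # (batch, hidden)

class SleepStager(nn.Module):
    """Four-class head (Wake / Light / Deep / REM) on top of the encoder."""
    def __init__(self, encoder, hidden=64, n_classes=4):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):
        return self.head(self.encoder(x))

# Stage 1 (hypothetical): pre-train the encoder on the large contact-sensor
# dataset and save its weights, e.g. torch.save(encoder.state_dict(), "encoder.pt").
encoder = CardioRespEncoder()
# encoder.load_state_dict(torch.load("encoder.pt"))  # reuse contact-sensor weights

# Stage 2: fine-tune on the smaller dataset of camera-derived HR/BR signals.
model = SleepStager(encoder)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
```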
Related papers
- SleepVST: Sleep Staging from Near-Infrared Video Signals using Pre-Trained Transformers [0.6599755599064447]
We introduce SleepVST, a transformer model which enables state-of-the-art performance in camera-based sleep stage classification.
We show that SleepVST can be successfully transferred to cardio-respiratory waveforms extracted from video, enabling fully contact-free sleep staging.
arXiv Detail & Related papers (2024-04-04T23:24:14Z)
- SlAction: Non-intrusive, Lightweight Obstructive Sleep Apnea Detection using Infrared Video [1.850099608285478]
Obstructive sleep apnea (OSA) is a prevalent sleep disorder affecting approximately one billion people worldwide.
We present SlAction, a non-intrusive OSA detection system for daily sleep environments using infrared videos.
arXiv Detail & Related papers (2023-09-06T04:52:02Z)
- Sleep Quality Prediction from Wearables using Convolution Neural Networks and Ensemble Learning [0.0]
Sleep is among the most important factors affecting one's daily performance, well-being, and life quality.
Rather than relying on camera recordings and extracting sleep state from images, wrist-worn devices can measure it directly via accelerometer, heart rate, and heart rate variability sensors.
Measured features include time to bed, time out of bed, bedtime duration, minutes to fall asleep, and minutes after wake-up.
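As a concrete illustration of the summary features listed above, the sketch below derives them from hypothetical bed-entry, sleep-onset, wake-up, and bed-exit timestamps; the function name and the datetime-based computation are assumptions, not the paper's actual pipeline.

```python
# Hypothetical derivation of the wearable-based sleep summary features listed above.
from datetime import datetime

def sleep_summary(time_to_bed, sleep_onset, wake_up, time_out_of_bed):
    """Return the summary features, with durations expressed in minutes."""
    return {
        "time_to_bed": time_to_bed.isoformat(),
        "time_out_of_bed": time_out_of_bed.isoformat(),
        "bedtime_duration_min": (time_out_of_bed - time_to_bed).total_seconds() / 60,
        "minutes_to_fall_asleep": (sleep_onset - time_to_bed).total_seconds() / 60,
        "minutes_after_wake_up": (time_out_of_bed - wake_up).total_seconds() / 60,
    }

features = sleep_summary(
    time_to_bed=datetime(2023, 3, 7, 23, 10),
    sleep_onset=datetime(2023, 3, 7, 23, 32),
    wake_up=datetime(2023, 3, 8, 6, 50),
    time_out_of_bed=datetime(2023, 3, 8, 7, 5),
)
print(features)
```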
arXiv Detail & Related papers (2023-03-08T18:08:08Z)
- Sleep Activity Recognition and Characterization from Multi-Source Passively Sensed Data [67.60224656603823]
Sleep Activity Recognition methods can provide indicators to assess, monitor, and characterize subjects' sleep-wake cycles and detect behavioral changes.
We propose a general method that continuously operates on passively sensed data from smartphones to characterize sleep and identify significant sleep episodes.
Thanks to their ubiquity, these devices constitute an excellent alternative data source to profile subjects' biorhythms in a continuous, objective, and non-invasive manner.
arXiv Detail & Related papers (2023-01-17T15:18:45Z)
- TempNet: Temporal Attention Towards the Detection of Animal Behaviour in Videos [63.85815474157357]
We propose an efficient computer vision- and deep learning-based method for the detection of biological behaviours in videos.
TempNet uses an encoder bridge and residual blocks to maintain model performance with a two-stage encoder that processes spatial features first and temporal features second.
We demonstrate its application to the detection of sablefish (Anoplopoma fimbria) startle events.
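The two-stage design mentioned above can be illustrated with a generic sketch: a per-frame spatial CNN followed by a temporal self-attention layer over the frame embeddings, pooled into a clip-level prediction. This is not TempNet's actual architecture (the encoder bridge and residual blocks are not detailed here); all layer choices are assumptions.

```python
# Generic spatial-then-temporal video classifier sketch (not the actual TempNet).
import torch
import torch.nn as nn

class SpatialTemporalNet(nn.Module):
    def __init__(self, embed_dim=128, n_classes=2):
        super().__init__()
        # Stage 1: spatial encoder applied to each frame independently.
        self.spatial = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, embed_dim),
        )
        # Stage 2: temporal self-attention over the sequence of frame embeddings.
        self.temporal = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=4, batch_first=True
        )
        self.head = nn.Linear(embed_dim, n_classes)

    def forward(self, clip):                       # clip: (batch, time, 3, H, W)
        b, t = clip.shape[:2]
        frames = clip.flatten(0, 1)                # (batch*time, 3, H, W)
        emb = self.spatial(frames).view(b, t, -1)  # (batch, time, embed_dim)
        emb = self.temporal(emb)                   # temporal attention across frames
        return self.head(emb.mean(dim=1))          # clip-level class logits

logits = SpatialTemporalNet()(torch.randn(2, 8, 3, 64, 64))  # e.g. startle vs. none
```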
arXiv Detail & Related papers (2022-11-17T23:55:12Z)
- How Would The Viewer Feel? Estimating Wellbeing From Video Scenarios [73.24092762346095]
We introduce two large-scale datasets with over 60,000 videos annotated for emotional response and subjective wellbeing.
The Video Cognitive Empathy dataset contains annotations for distributions of fine-grained emotional responses, allowing models to gain a detailed understanding of affective states.
The Video to Valence dataset contains annotations of relative pleasantness between videos, which enables predicting a continuous spectrum of wellbeing.
arXiv Detail & Related papers (2022-10-18T17:58:25Z)
- Using Ballistocardiography for Sleep Stage Classification [2.360019611990601]
Current methods of sleep stage detection are expensive, disruptive to a person's sleep, and not practical in a modern home setting.
Ballistocardiography (BCG) is a non-invasive sensing technology that collects information by measuring the ballistic forces generated by the heart.
We propose to implement a sleep stage detection algorithm and compare it against sleep stages extracted from a Fitbit Sense Smart Watch.
arXiv Detail & Related papers (2022-02-02T14:02:48Z)
- In-Bed Person Monitoring Using Thermal Infrared Sensors [53.561797148529664]
We use 'Griddy', a prototype with a Panasonic Grid-EYE, a low-resolution infrared thermopile array sensor, which offers more privacy.
For this purpose, two datasets were captured: one (480 images) under constant conditions and a second (200 images) under varying conditions.
We test three machine learning algorithms: Support Vector Machines (SVM), k-Nearest Neighbors (k-NN), and a Neural Network (NN).
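A minimal sketch of how such a three-classifier comparison might be set up with scikit-learn is shown below, using randomly generated stand-ins for the flattened thermal frames; the feature representation, labels, and train/test split are assumptions rather than the study's actual protocol.

```python
# Illustrative comparison of SVM, k-NN and a small neural network (scikit-learn).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Placeholder data: flattened low-resolution thermal frames and binary in-bed labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(480, 64))        # e.g. 8x8 thermopile frames, flattened
y = rng.integers(0, 2, size=480)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "SVM": SVC(kernel="rbf"),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "NN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.3f}")
```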
arXiv Detail & Related papers (2021-07-16T15:59:07Z)
- Human POSEitioning System (HPS): 3D Human Pose Estimation and Self-localization in Large Scenes from Body-Mounted Sensors [71.29186299435423]
We introduce the Human POSEitioning System (HPS), a method to recover the full 3D pose of a human registered with a 3D scan of the surrounding environment.
We show that our optimization-based integration exploits the benefits of the two, resulting in pose accuracy free of drift.
HPS could be used for VR/AR applications where humans interact with the scene without requiring direct line of sight with an external camera.
arXiv Detail & Related papers (2021-03-31T17:58:31Z)
- Temporal convolutional networks and transformers for classifying the sleep stage in awake or asleep using pulse oximetry signals [0.0]
We develop a network architecture with the aim of classifying the sleep stage as awake or asleep using only HR signals from a pulse oximeter.
Transformers are able to model the sequence, learning the transition rules between sleep stages.
The overall accuracy, specificity, sensitivity, and Cohen's kappa coefficient were 90.0%, 94.9%, 78.1%, and 0.73, respectively.
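For reference, these four metrics can be computed from a binary awake/asleep confusion matrix as in the sketch below; the label vectors are placeholders and the choice of "asleep" as the positive class is an assumption, not necessarily the study's convention.

```python
# Illustrative computation of the reported metrics from placeholder predictions.
import numpy as np
from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

y_true = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])   # example labels: 1 = asleep, 0 = awake
y_pred = np.array([1, 1, 0, 1, 1, 1, 0, 0, 1, 1])

# Treating "asleep" (1) as the positive class; the original study's convention may differ.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("accuracy   :", accuracy_score(y_true, y_pred))
print("sensitivity:", tp / (tp + fn))   # true positive rate
print("specificity:", tn / (tn + fp))   # true negative rate
print("kappa      :", cohen_kappa_score(y_true, y_pred))
```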
arXiv Detail & Related papers (2021-01-29T22:58:33Z)
- Multimodal In-bed Pose and Shape Estimation under the Blankets [77.12439296395733]
We propose a pyramid scheme to fuse different modalities in a way that best leverages the knowledge captured by the multimodal sensors.
We employ an attention-based reconstruction module to generate the uncovered modalities, which are further fused to update the current estimate.
arXiv Detail & Related papers (2020-12-12T05:35:23Z)