EventSleep: Sleep Activity Recognition with Event Cameras
- URL: http://arxiv.org/abs/2404.01801v1
- Date: Tue, 2 Apr 2024 10:03:23 GMT
- Title: EventSleep: Sleep Activity Recognition with Event Cameras
- Authors: Carlos Plou, Nerea Gallego, Alberto Sabater, Eduardo Montijano, Pablo Urcola, Luis Montesano, Ruben Martinez-Cantin, Ana C. Murillo
- Abstract summary: Event cameras are a promising technology for activity recognition in dark environments.
We present EventSleep, a new dataset and methodology to study the suitability of event cameras for a medical application.
- Score: 12.584362614255213
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are a promising technology for activity recognition in dark environments due to their unique properties. However, real event camera datasets under low-lighting conditions are still scarce, which also limits the number of approaches to solve these kinds of problems, hindering the potential of this technology in many applications. We present EventSleep, a new dataset and methodology to address this gap and study the suitability of event cameras for a very relevant medical application: sleep monitoring for sleep disorder analysis. The dataset contains synchronized event and infrared recordings emulating common movements that happen during sleep, resulting in a new, challenging and unique dataset for activity recognition in dark environments. Our novel pipeline achieves high accuracy under these challenging conditions and incorporates a Bayesian approach (Laplace ensembles) to increase the robustness of the predictions, which is fundamental for medical applications. Our work is the first application of Bayesian neural networks to event cameras, the first use of Laplace ensembles in a realistic problem, and also demonstrates for the first time the potential of event cameras in a new application domain: enhancing current sleep evaluation procedures. Our activity recognition results highlight the potential of event cameras under dark conditions, their capacity and robustness for sleep activity recognition, and open problems such as the adaptation of event data pre-processing techniques to dark environments.
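The abstract names Laplace ensembles as the source of the robust, uncertainty-aware predictions but gives no implementation details. Below is a minimal, hypothetical sketch (plain PyTorch, not the authors' code) of one common reading of the idea: fit a diagonal Laplace approximation around each independently trained ensemble member and average Monte Carlo predictive samples across members. The function names, the diagonal empirical-Fisher choice, and the hyperparameters are illustrative assumptions; the input `x` is assumed to be an already pre-processed event representation (e.g., event frames).

```python
import torch
import torch.nn.functional as F


def fit_diag_laplace(model, loader, prior_precision=1.0):
    """Fit a diagonal Laplace approximation around a trained model's weights.

    The posterior precision of each parameter is approximated by the diagonal
    of the empirical Fisher information plus an isotropic Gaussian prior.
    """
    precision = [torch.full_like(p, prior_precision) for p in model.parameters()]
    model.eval()
    for x, y in loader:
        loss = F.cross_entropy(model(x), y, reduction="sum")
        grads = torch.autograd.grad(loss, list(model.parameters()))
        for h, g in zip(precision, grads):
            h.add_(g.detach() ** 2)  # accumulate squared gradients (empirical Fisher)
    return precision


@torch.no_grad()
def laplace_predict(model, precision, x, n_samples=20):
    """Monte Carlo predictive for one Laplace-approximated ensemble member.

    Weights are sampled around the trained (MAP) values with variance
    1 / precision, and the softmax outputs of the samples are averaged.
    """
    map_weights = [p.detach().clone() for p in model.parameters()]
    probs = 0.0
    for _ in range(n_samples):
        for p, m, h in zip(model.parameters(), map_weights, precision):
            p.copy_(m + torch.randn_like(m) / h.sqrt())
        probs = probs + F.softmax(model(x), dim=-1)
    for p, m in zip(model.parameters(), map_weights):  # restore MAP weights
        p.copy_(m)
    return probs / n_samples


def laplace_ensemble_predict(members, x):
    """Average the Laplace predictives of independently trained members."""
    outputs = [laplace_predict(model, precision, x) for model, precision in members]
    return torch.stack(outputs).mean(dim=0)
```

Under this reading, disagreement among weight samples and among ensemble members widens the predictive distribution, providing the kind of uncertainty signal the abstract argues is fundamental for a medical application such as sleep monitoring.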
Related papers
- Low-power, Continuous Remote Behavioral Localization with Event Cameras [9.107129038623242]
Event cameras offer unique advantages for battery-dependent remote monitoring.
We use this sensor to quantify a behavior in Chinstrap penguins called ecstatic display.
Experiments show that the event cameras' natural response to motion is effective for continuous behavior monitoring and detection.
arXiv Detail & Related papers (2023-12-06T14:58:03Z) - Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z) - MISO: Monitoring Inactivity of Single Older Adults at Home using RGB-D Technology [5.612499701087411]
A new application is proposed for real-time monitoring of inactivity (lack of movement) in older adults' own homes.
A lightweight camera monitoring system was developed and piloted in community homes to observe the daily behavior of older adults.
arXiv Detail & Related papers (2023-11-03T21:51:33Z) - In the Blink of an Eye: Event-based Emotion Recognition [44.12621619057609]
We introduce a wearable single-eye emotion recognition device and a real-time approach to recognizing emotions from partial observations of an emotion.
At the heart of our method is a bio-inspired event-based camera setup and a newly designed lightweight Spiking Eye Emotion Network (SEEN)
arXiv Detail & Related papers (2023-10-06T06:33:20Z) - EventTransAct: A video transformer-based framework for Event-camera based action recognition [52.537021302246664]
Event cameras offer new opportunities compared to standard action recognition in RGB videos.
In this study, we employ a computationally efficient model, namely the video transformer network (VTN), which initially acquires spatial embeddings per event-frame.
In order to better adapt the VTN to the sparse and fine-grained nature of event data, we design an Event-Contrastive Loss ($\mathcal{L}_{EC}$) and event-specific augmentations.
arXiv Detail & Related papers (2023-08-25T23:51:07Z) - Cross-modal Place Recognition in Image Databases using Event-based Sensors [28.124708490967713]
We present the first cross-modal visual place recognition framework that is capable of retrieving regular images from a database given an event query.
Our method demonstrates promising results with respect to the state-of-the-art frame-based and event-based methods on the Brisbane-Event-VPR dataset.
arXiv Detail & Related papers (2023-07-03T14:24:04Z) - Event-based Simultaneous Localization and Mapping: A Comprehensive Survey [52.73728442921428]
This paper reviews event-based vSLAM algorithms that exploit the benefits of asynchronous and irregular event streams for localization and mapping tasks.
It categorizes event-based vSLAM methods into four main categories: feature-based, direct, motion-compensation, and deep learning methods.
arXiv Detail & Related papers (2023-04-19T16:21:14Z) - Multi Visual Modality Fall Detection Dataset [4.00152916049695]
Falls are one of the leading causes of injury-related deaths among the elderly worldwide.
Effective detection of falls can reduce the risk of complications and injuries.
Video cameras provide a passive alternative; however, regular RGB cameras are impacted by changing lighting conditions and privacy concerns.
arXiv Detail & Related papers (2022-06-25T21:54:26Z) - EventNeRF: Neural Radiance Fields from a Single Colour Event Camera [81.19234142730326]
This paper proposes the first approach for 3D-consistent, dense and novel view synthesis using just a single colour event stream as input.
At its core is a neural radiance field trained entirely in a self-supervised manner from events while preserving the original resolution of the colour event channels.
We evaluate our method qualitatively and numerically on several challenging synthetic and real scenes and show that it produces significantly denser and more visually appealing renderings.
arXiv Detail & Related papers (2022-06-23T17:59:53Z) - In-Bed Person Monitoring Using Thermal Infrared Sensors [53.561797148529664]
We use 'Griddy', a prototype with a Panasonic Grid-EYE, a low-resolution infrared thermopile array sensor, which offers more privacy.
For this purpose, two datasets were captured, one (480 images) under constant conditions, and a second one (200 images) under different variations.
We test three machine learning algorithms: Support Vector Machines (SVM), k-Nearest Neighbors (k-NN) and Neural Network (NN)
arXiv Detail & Related papers (2021-07-16T15:59:07Z) - EventHands: Real-Time Neural 3D Hand Reconstruction from an Event Stream [80.15360180192175]
3D hand pose estimation from monocular videos is a long-standing and challenging problem.
We address it for the first time using a single event camera, i.e., an asynchronous vision sensor reacting to brightness changes.
Our approach has characteristics previously not demonstrated with a single RGB or depth camera.
arXiv Detail & Related papers (2020-12-11T16:45:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all summaries) and is not responsible for any consequences arising from its use.