Contactless Cardiac Pulse Monitoring Using Event Cameras
- URL: http://arxiv.org/abs/2505.09529v2
- Date: Tue, 24 Jun 2025 13:38:00 GMT
- Title: Contactless Cardiac Pulse Monitoring Using Event Cameras
- Authors: Mohamed Moustafa, Joseph Lemley, Peter Corcoran
- Abstract summary: This study investigates the contact-free reconstruction of an individual's cardiac pulse signal from a time event recording of their face. An end-to-end model is trained to extract the cardiac signal from a two-dimensional representation of the event stream. The experimental results confirm that physiological cardiac information in the facial region is effectively preserved within the event stream.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Time event cameras are a novel technology for recording scene information at extremely low latency and with low power consumption. Event cameras output a stream of events that encapsulate pixel-level light intensity changes within the scene, capturing information with a higher dynamic range and temporal resolution than traditional cameras. This study investigates the contact-free reconstruction of an individual's cardiac pulse signal from a time event recording of their face using a supervised convolutional neural network (CNN) model. An end-to-end model is trained to extract the cardiac signal from a two-dimensional representation of the event stream, with model performance evaluated based on the accuracy of the calculated heart rate. The experimental results confirm that physiological cardiac information in the facial region is effectively preserved within the event stream, showcasing the potential of this novel sensor for remote heart rate monitoring. The model trained on event frames achieves a root mean square error (RMSE) of 3.32 beats per minute (bpm) compared to the RMSE of 2.92 bpm achieved by the baseline model trained on standard camera frames. Furthermore, models trained on event frames generated at 60 and 120 FPS outperformed the 30 FPS standard camera results, achieving an RMSE of 2.54 and 2.13 bpm, respectively.
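As a rough sketch of the kind of processing the abstract describes: accumulating events into 2D frames, estimating a heart rate from a reconstructed pulse signal, and scoring with RMSE in bpm. The event-tuple layout (x, y, t, polarity), the FFT-based rate extraction, and the 0.7-4 Hz physiological band are generic conventions assumed here, not details taken from the paper:

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate (x, y, t, polarity) events into one 2D event frame.

    Each event increments (positive polarity) or decrements (negative
    polarity) its pixel over the accumulation window.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, _, p in events:
        frame[y, x] += 1.0 if p > 0 else -1.0
    return frame

def heart_rate_bpm(pulse_signal, fs):
    """Estimate heart rate as the dominant frequency of the pulse
    signal within the ~0.7-4 Hz (42-240 bpm) physiological band."""
    spectrum = np.abs(np.fft.rfft(pulse_signal - pulse_signal.mean()))
    freqs = np.fft.rfftfreq(len(pulse_signal), d=1.0 / fs)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

def rmse_bpm(predicted, reference):
    """Root mean square error between predicted and reference heart
    rates, the metric the abstract reports in bpm."""
    predicted, reference = np.asarray(predicted), np.asarray(reference)
    return float(np.sqrt(np.mean((predicted - reference) ** 2)))
```

For example, a clean 1.2 Hz sinusoid sampled at 30 Hz yields an estimate of 72 bpm from `heart_rate_bpm`.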
Related papers
- Inter-event Interval Microscopy for Event Cameras [52.05337480169517]
Event cameras, innovative bio-inspired sensors, differ from traditional cameras by sensing changes in intensity rather than directly perceiving intensity. We achieve event-to-intensity conversion using a static event camera for both static and dynamic scenes in fluorescence microscopy. We have collected the IEIMat dataset under various scenes, including high-dynamic-range and high-speed scenarios.
arXiv Detail & Related papers (2025-04-07T11:05:13Z) - Time-Series U-Net with Recurrence for Noise-Robust Imaging Photoplethysmography [14.749406169315554]
The photoplethysmography system consists of three modules: face and landmark detection, time-series extraction, and pulse signal/pulse rate estimation. The pulse signal estimation module, which we call TURNIP, allows the system to faithfully reconstruct the underlying pulse signal waveform. Our algorithm provides reliable heart rate estimates without the need for specialized sensors or contact with the skin.
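The middle stage of the three-module pipeline named above can be illustrated in a few lines. The ROI averaging and the band limits below are generic rPPG conventions assumed for illustration, not details from the paper:

```python
import numpy as np

def extract_roi_series(frames, roi):
    """Time-series extraction: spatially average each frame over a
    face ROI (y0, y1, x0, x1), giving one sample per frame."""
    y0, y1, x0, x1 = roi
    return np.array([f[y0:y1, x0:x1].mean() for f in frames])

def bandpass(signal, fs, low=0.7, high=4.0):
    """Keep only the physiological band (~42-240 bpm) with a simple
    frequency-domain mask, a common rPPG preprocessing step."""
    spectrum = np.fft.rfft(signal - signal.mean())
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))
```

The band-limited series would then be handed to a learned pulse estimator such as TURNIP for waveform reconstruction.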
arXiv Detail & Related papers (2025-03-21T17:52:33Z) - EvDNeRF: Reconstructing Event Data with Dynamic Neural Radiance Fields [80.94515892378053]
EvDNeRF is a pipeline for generating event data and training an event-based dynamic NeRF.
NeRFs offer geometric-based learnable rendering, but prior work with events has only considered reconstruction of static scenes.
We show that by training on varied batch sizes of events, we can improve test-time predictions of events at fine time resolutions.
arXiv Detail & Related papers (2023-10-03T21:08:41Z) - Heart Rate Detection Using an Event Camera [1.8020166013859684]
Event cameras, also known as neuromorphic cameras, are an emerging technology that offer advantages over traditional shutter and frame-based cameras.
We propose to harness the capabilities of event-based cameras to capture subtle changes in the surface of the skin caused by the pulsatile flow of blood in the wrist region.
arXiv Detail & Related papers (2023-09-21T08:51:30Z) - Neuromorphic Seatbelt State Detection for In-Cabin Monitoring with Event Cameras [0.932065750652415]
This research provides a proof of concept to expand event-based DMS techniques to include seatbelt state detection.
In a binary classification task, the fastened/unfastened frames were identified with an F1 score of 0.989 and 0.944 on the simulated and real test sets respectively.
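For reference, the F1 scores quoted above combine precision and recall as their harmonic mean, computed from the classifier's confusion counts:

```python
def f1_score(tp, fp, fn):
    """F1 = harmonic mean of precision and recall, from the counts of
    true positives, false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```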
arXiv Detail & Related papers (2023-08-15T14:27:46Z) - Recurrent Vision Transformers for Object Detection with Event Cameras [62.27246562304705]
We present Recurrent Vision Transformers (RVTs), a novel backbone for object detection with event cameras.
RVTs can be trained from scratch to reach state-of-the-art performance on event-based object detection.
Our study brings new insights into effective design choices that can be fruitful for research beyond event-based vision.
arXiv Detail & Related papers (2022-12-11T20:28:59Z) - LFPS-Net: a lightweight fast pulse simulation network for BVP estimation [4.631302854901082]
Heart rate estimation based on remote photoplethysmography plays an important role in several specific scenarios, such as health monitoring and fatigue detection.
Existing methods take the average of the predicted HRs of multiple overlapping video clips as the final result for a 30-second facial video.
We propose a lightweight fast pulse simulation network (LFPS-Net), pursuing the best accuracy within a very limited computational and time budget.
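The overlapping-clip averaging scheme attributed to existing methods above can be sketched as follows; the 10-second clip length and 1-second stride are illustrative values, not parameters from the paper:

```python
import numpy as np

def overlapping_clips(signal, fs, clip_sec=10.0, stride_sec=1.0):
    """Split a facial-video signal into overlapping clips, one HR
    prediction per clip."""
    clip_len = int(clip_sec * fs)
    stride = int(stride_sec * fs)
    return [signal[i:i + clip_len]
            for i in range(0, len(signal) - clip_len + 1, stride)]

def clip_averaged_hr(hr_per_clip):
    """Final heart rate as the mean of the per-clip predictions."""
    return float(np.mean(hr_per_clip))
```

A 30-second video at 30 fps (900 samples) yields 21 overlapping 10-second clips under these settings.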
arXiv Detail & Related papers (2022-06-25T05:24:52Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - Combining Events and Frames using Recurrent Asynchronous Multimodal Networks for Monocular Depth Prediction [51.072733683919246]
We introduce Recurrent Asynchronous Multimodal (RAM) networks to handle asynchronous and irregular data from multiple sensors.
Inspired by traditional RNNs, RAM networks maintain a hidden state that is updated asynchronously and can be queried at any time to generate a prediction.
We show an improvement over state-of-the-art methods by up to 30% in terms of mean depth absolute error.
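The core RAM idea, a hidden state updated whenever any sensor delivers data at irregular times and queryable at any moment, can be caricatured in a few lines. The leaky blend below stands in for the learned recurrent update; it is a minimal sketch, not the actual RAM architecture:

```python
import numpy as np

class AsyncState:
    """Hidden state updated asynchronously per incoming sensor sample
    and queryable at any time, in the spirit of RAM networks."""

    def __init__(self, dim, decay=0.9):
        self.h = np.zeros(dim)
        self.decay = decay

    def update(self, features):
        # Blend newly arrived sensor features into the running state;
        # a trained network would use a learned gated update instead.
        self.h = self.decay * self.h + (1.0 - self.decay) * np.asarray(features, dtype=float)

    def query(self):
        # A prediction head would map the state to an output here;
        # this sketch just returns the state itself.
        return self.h.copy()
```

Because `update` is driven by data arrival rather than a fixed clock, events and frames can feed the same state at different rates.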
arXiv Detail & Related papers (2021-02-18T13:24:35Z) - EventHands: Real-Time Neural 3D Hand Reconstruction from an Event Stream [80.15360180192175]
3D hand pose estimation from monocular videos is a long-standing and challenging problem.
We address it for the first time using a single event camera, i.e., an asynchronous vision sensor reacting to brightness changes.
Our approach has characteristics previously not demonstrated with a single RGB or depth camera.
arXiv Detail & Related papers (2020-12-11T16:45:34Z) - Real-Time Face & Eye Tracking and Blink Detection using Event Cameras [3.842206880015537]
Event cameras are emerging neuromorphic vision sensors that capture local light intensity changes at each pixel, generating a stream of asynchronous events.
Driver monitoring systems (DMS) are in-cabin safety systems designed to sense and understand a driver's physical and cognitive state.
This paper proposes a novel method to simultaneously detect and track faces and eyes for driver monitoring.
arXiv Detail & Related papers (2020-10-16T10:02:41Z)