Full-Body Cardiovascular Sensing with Remote Photoplethysmography
- URL: http://arxiv.org/abs/2303.09638v1
- Date: Thu, 16 Mar 2023 20:37:07 GMT
- Title: Full-Body Cardiovascular Sensing with Remote Photoplethysmography
- Authors: Lu Niu, Jeremy Speth, Nathan Vance, Ben Sporrer, Adam Czajka, Patrick Flynn
- Abstract summary: Remote photoplethysmography (rPPG) allows for noncontact monitoring of blood volume changes from a camera by detecting minor fluctuations in reflected light.
We explored the feasibility of rPPG from non-face body regions such as the arms, legs, and hands.
- Score: 4.123458880886283
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Remote photoplethysmography (rPPG) allows for noncontact monitoring of blood
volume changes from a camera by detecting minor fluctuations in reflected
light. Prior applications of rPPG focused on face videos. In this paper we
explored the feasibility of rPPG from non-face body regions such as the arms,
legs, and hands. We collected a new dataset titled Multi-Site Physiological
Monitoring (MSPM), which will be released with this paper. The dataset consists
of 90 frames per second video of exposed arms, legs, and face, along with 10
synchronized PPG recordings. We performed baseline heart rate estimation
experiments from non-face regions with several state-of-the-art rPPG
approaches, including chrominance-based (CHROM), plane-orthogonal-to-skin (POS)
and RemotePulseNet (RPNet). To our knowledge, this is the first evaluation of
the fidelity of rPPG signals simultaneously obtained from multiple regions of a
human body. Our experiments showed that skin pixels from arms, legs, and hands
are all potential sources of the blood volume pulse. The best-performing
approach, POS, achieved a mean absolute error peaking at 7.11 beats per minute
from non-facial body parts compared to 1.38 beats per minute from the face.
Additionally, we performed experiments on pulse transit time (PTT) from both
the contact PPG and rPPG signals. We found that remote PTT is possible with
moderately high frame rate video when distal locations on the body are visible.
These findings and the supporting dataset should facilitate new research on
non-face rPPG and monitoring blood flow dynamics over the whole body with a
camera.
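The abstract above sketches the whole baseline pipeline: extract a pulse signal from skin-pixel RGB traces (e.g., with POS), read heart rate off the dominant spectral peak, and compare pulse arrival times between body sites for PTT. The snippet below is not code from the paper; it is a minimal Python sketch of such a baseline, assuming spatially averaged RGB traces per region are already available. The window length, the 0.7-3.5 Hz pulse band, and the maximum PTT lag are illustrative defaults, not the authors' settings.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, periodogram, correlate, correlation_lags

def pos_pulse(rgb, fps, win_sec=1.6):
    """Plane-Orthogonal-to-Skin (POS) pulse extraction from mean RGB traces.

    rgb: (T, 3) array of spatially averaged R, G, B values over one skin region.
    Returns a zero-mean pulse signal of length T built by overlap-adding windows.
    """
    T = rgb.shape[0]
    w = int(round(win_sec * fps))                        # sliding-window length in frames
    P = np.array([[0.0, 1.0, -1.0],                      # projection onto the plane
                  [-2.0, 1.0, 1.0]])                     # orthogonal to the skin tone
    H = np.zeros(T)
    for t in range(T - w + 1):
        C = rgb[t:t + w].T                               # (3, w) window of RGB traces
        Cn = C / (C.mean(axis=1, keepdims=True) + 1e-9)  # temporal normalization
        S = P @ Cn
        h = S[0] + (S[0].std() / (S[1].std() + 1e-9)) * S[1]  # alpha-tuned combination
        H[t:t + w] += h - h.mean()                       # overlap-add
    return H

def estimate_hr(pulse, fps, lo=0.7, hi=3.5):
    """Heart rate in BPM from the dominant spectral peak inside [lo, hi] Hz."""
    sos = butter(3, [lo, hi], btype="band", fs=fps, output="sos")
    filtered = sosfiltfilt(sos, pulse)
    freqs, power = periodogram(filtered, fs=fps)
    band = (freqs >= lo) & (freqs <= hi)
    return 60.0 * freqs[band][np.argmax(power[band])]

def pulse_delay(proximal, distal, fps, max_lag_s=0.5):
    """Delay in seconds between two pulse signals via the cross-correlation peak.

    A positive value means the pulse reaches the distal site after the proximal
    one (SciPy's correlation convention); this is one simple way to estimate PTT.
    """
    a = (proximal - proximal.mean()) / (proximal.std() + 1e-9)
    b = (distal - distal.mean()) / (distal.std() + 1e-9)
    xc = correlate(b, a, mode="full")
    lags = correlation_lags(len(b), len(a), mode="full")
    keep = np.abs(lags) <= int(max_lag_s * fps)
    return lags[keep][np.argmax(xc[keep])] / fps
```

For the 90 fps MSPM videos this would be used as, e.g., estimate_hr(pos_pulse(arm_rgb, 90), 90); the paper's actual CHROM, POS, and RPNet implementations may differ in windowing, filtering, and how skin pixels are selected.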
Related papers
- Summit Vitals: Multi-Camera and Multi-Signal Biosensing at High Altitudes [22.23531900474421]
Video photoplethysmography is an emerging method for non-invasive and convenient measurement of physiological signals.
This dataset is designed to validate video-based vital sign estimation algorithms and the fusion of videos captured from different positions.
Our findings suggest that simultaneous training on multiple indicators, such as PPG and blood oxygen, can reduce MAE in SpO2 estimation by 17.8%.
arXiv Detail & Related papers (2024-09-28T03:36:16Z)
- MSPM: A Multi-Site Physiological Monitoring Dataset for Remote Pulse, Respiration, and Blood Pressure Estimation [6.2250341321698155]
We present the Multi-Site Physiological Monitoring dataset.
It is the first dataset collected to support the study of simultaneous camera-based vital sign estimation from multiple sites on the body.
arXiv Detail & Related papers (2024-02-03T17:50:18Z)
- PhysFormer++: Facial Video-based Physiological Measurement with SlowFast Temporal Difference Transformer [76.40106756572644]
Recent deep learning approaches focus on mining subtle clues using convolutional neural networks with limited spatio-temporal receptive fields.
In this paper, we propose two end-to-end video transformer architectures, PhysFormer and PhysFormer++, to adaptively aggregate both local and global features for rPPG representation enhancement.
Comprehensive experiments are performed on four benchmark datasets to show our superior performance on both intra-dataset and cross-dataset testing.
arXiv Detail & Related papers (2023-02-07T15:56:03Z)
- Facial Video-based Remote Physiological Measurement via Self-supervised Learning [9.99375728024877]
We introduce a novel framework that learns to estimate rPPG signals from facial videos without the need for ground-truth signals.
Negative samples are generated via a learnable frequency module, which performs nonlinear signal frequency transformation.
Next, we introduce a local rPPG expert aggregation module to estimate rPPG signals from augmented samples.
It encodes complementary pulsation information from different face regions and aggregates them into one rPPG prediction.
arXiv Detail & Related papers (2022-10-27T13:03:23Z)
- Benchmarking Joint Face Spoofing and Forgery Detection with Visual and Physiological Cues [81.15465149555864]
We establish the first joint face spoofing and forgery detection benchmark using both visual appearance and physiological rPPG cues.
To enhance rPPG periodicity discrimination, we design a two-branch physiological network that takes both the facial spatio-temporal rPPG signal map and its continuous wavelet transformed counterpart as inputs.
arXiv Detail & Related papers (2022-08-10T15:41:48Z)
- Identifying Rhythmic Patterns for Face Forgery Detection and Categorization [46.21354355137544]
We propose a framework for face forgery detection and categorization consisting of: 1) a Spatial-Temporal Filtering Network (STFNet) for PPG signals, and 2) a Spatial-Temporal Interaction Network (STINet) for constraint and interaction of PPG signals.
With insight into the generation of forgery methods, we further propose intra-source and inter-source blending to boost the performance of the framework.
arXiv Detail & Related papers (2022-07-04T04:57:06Z)
- ReViSe: Remote Vital Signs Measurement Using Smartphone Camera [0.0]
Remote Photoplethysmography (rPPG) is a fast, effective, inexpensive and convenient method for collecting biometric data.
We propose an end-to-end framework to measure people's vital signs based on the rPPG signal from a user's face captured with a smartphone camera.
We extract face landmarks with a deep learning-based neural network model in real-time.
arXiv Detail & Related papers (2022-06-13T19:20:11Z)
- PhysFormer: Facial Video-based Physiological Measurement with Temporal Difference Transformer [55.936527926778695]
Recent deep learning approaches focus on mining subtle rPPG clues using convolutional neural networks with limited spatio-temporal receptive fields.
In this paper, we propose PhysFormer, an end-to-end video transformer-based architecture.
arXiv Detail & Related papers (2021-11-23T18:57:11Z)
- Real Time Video based Heart and Respiration Rate Monitoring [5.257115841810259]
Smartphone cameras can measure heart rate (HR) and respiration rate (RR).
Variation in the intensity of the green channel can be measured from the iPPG signals of the video.
This study aimed to provide a method to extract heart rate and respiration rate from video of individuals' faces (a minimal sketch of this green-channel approach appears after this list).
arXiv Detail & Related papers (2021-06-04T19:03:21Z)
- Video-based Remote Physiological Measurement via Cross-verified Feature Disentangling [121.50704279659253]
We propose a cross-verified feature disentangling strategy to disentangle physiological features from non-physiological representations.
We then use the distilled physiological features for robust multi-task physiological measurements.
The disentangled features are finally used for the joint prediction of multiple physiological signals, such as average HR values and rPPG signals.
arXiv Detail & Related papers (2020-07-16T09:39:17Z)
- AutoHR: A Strong End-to-end Baseline for Remote Heart Rate Measurement with Neural Searching [76.4844593082362]
We investigate why existing end-to-end networks perform poorly in challenging conditions and establish a strong baseline for remote HR measurement with neural architecture search (NAS).
Comprehensive experiments are performed on three benchmark datasets on both intra-dataset and cross-dataset testing.
arXiv Detail & Related papers (2020-04-26T05:43:21Z)
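The "Real Time Video based Heart and Respiration Rate Monitoring" entry above measures both rates from the variation of the face's green-channel intensity. The sketch below is not taken from that paper; it is a minimal illustration of the idea, assuming a fixed rectangular skin ROI (a real system would detect and track the face) and illustrative frequency bands for heart rate and respiration.

```python
import cv2                      # used only to read frames; any frame source works
import numpy as np
from scipy.signal import butter, sosfiltfilt, periodogram

def green_channel_trace(video_path, roi=None):
    """Mean green-channel intensity per frame over a fixed skin ROI.

    roi: (x, y, w, h) in pixels; a hypothetical placeholder for face tracking.
    Returns the per-frame trace and the video frame rate.
    """
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    trace = []
    while True:
        ok, frame = cap.read()                   # frames are BGR in OpenCV
        if not ok:
            break
        if roi is not None:
            x, y, w, h = roi
            frame = frame[y:y + h, x:x + w]
        trace.append(frame[:, :, 1].mean())      # channel 1 = green
    cap.release()
    return np.asarray(trace), fps

def dominant_rate(trace, fps, band):
    """Dominant frequency of `trace` inside `band` (Hz), returned in cycles/min."""
    lo, hi = band
    sos = butter(3, [lo, hi], btype="band", fs=fps, output="sos")
    filtered = sosfiltfilt(sos, trace - trace.mean())
    freqs, power = periodogram(filtered, fs=fps)
    sel = (freqs >= lo) & (freqs <= hi)
    return 60.0 * freqs[sel][np.argmax(power[sel])]

# Illustrative usage (the path, ROI, and bands are placeholders):
# g, fps = green_channel_trace("face.mp4", roi=(200, 100, 160, 160))
# hr_bpm = dominant_rate(g, fps, (0.7, 3.5))    # heart-rate band
# rr_bpm = dominant_rate(g, fps, (0.1, 0.5))    # respiration band
```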