Real-Time Face & Eye Tracking and Blink Detection using Event Cameras
- URL: http://arxiv.org/abs/2010.08278v1
- Date: Fri, 16 Oct 2020 10:02:41 GMT
- Title: Real-Time Face & Eye Tracking and Blink Detection using Event Cameras
- Authors: Cian Ryan, Brian O Sullivan, Amr Elrasad, Joe Lemley, Paul Kielty,
Christoph Posch and Etienne Perot
- Abstract summary: Event cameras are emerging, neuromorphic vision sensors that capture local light intensity changes at each pixel, generating a stream of asynchronous events.
Driver monitoring systems (DMS) are in-cabin safety systems designed to sense and understand a driver's physical and cognitive state.
This paper proposes a novel method to simultaneously detect and track faces and eyes for driver monitoring.
- Score: 3.842206880015537
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are emerging, neuromorphic vision sensors that
capture local light intensity changes at each pixel, generating a stream of
asynchronous events. This way of acquiring visual information constitutes a
departure from traditional frame-based cameras and offers several significant
advantages: low energy consumption, high temporal resolution, high dynamic
range and low latency. Driver monitoring systems (DMS) are in-cabin safety
systems designed to sense and understand a driver's physical and cognitive
state. Event cameras are particularly suited to DMS due to these inherent
advantages. This paper proposes a novel method to simultaneously detect and
track faces and eyes for driver monitoring. A unique, fully convolutional
recurrent neural network architecture is presented. To train this network, a
synthetic event-based dataset, called Neuromorphic HELEN, is simulated with
accurate bounding box annotations. Additionally, a method to detect and
analyse a driver's eye blinks is proposed, exploiting the high temporal
resolution of event cameras. Blinking behaviour provides greater insight into
a driver's level of fatigue or drowsiness. We show that blinks have a unique
temporal signature that can be better captured by event cameras.
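As a rough, hedged sketch (not the paper's actual pipeline, whose representation details are not given here), a stream of asynchronous (x, y, t, polarity) events can be accumulated into fixed-interval frames that a convolutional recurrent network could consume:

```python
import numpy as np

def events_to_frames(events, height, width, window_us=10_000):
    """Accumulate asynchronous events into signed event-count frames.

    events: (N, 4) array with columns x, y, timestamp (microseconds),
    polarity (+1 / -1), sorted by timestamp. Summing polarities over
    fixed time windows is one simple event representation among many;
    the window length and signed accumulation here are assumptions.
    """
    t0 = events[0, 2]
    frame_idx = ((events[:, 2] - t0) // window_us).astype(int)
    frames = np.zeros((frame_idx[-1] + 1, height, width), dtype=np.float32)
    # Scatter-add each event's polarity into its (frame, y, x) cell.
    np.add.at(frames,
              (frame_idx, events[:, 1].astype(int), events[:, 0].astype(int)),
              events[:, 3])
    return frames
```

The high temporal resolution of event cameras comes from the fact that `window_us` can be made very small (or skipped entirely in fully asynchronous pipelines), which is what makes fine-grained signals like blink dynamics recoverable.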
Related papers
- BlinkTrack: Feature Tracking over 100 FPS via Events and Images [50.98675227695814]
We propose a novel framework, BlinkTrack, which integrates event data with RGB images for high-frequency feature tracking.
Our method extends the traditional Kalman filter into a learning-based framework, utilizing differentiable Kalman filters in both event and image branches.
Experimental results indicate that BlinkTrack significantly outperforms existing event-based methods.
arXiv Detail & Related papers (2024-09-26T15:54:18Z) - SpikeMOT: Event-based Multi-Object Tracking with Sparse Motion Features [52.213656737672935]
SpikeMOT is an event-based multi-object tracker.
SpikeMOT uses spiking neural networks to extract sparse spatiotemporal features from event streams associated with objects.
arXiv Detail & Related papers (2023-09-29T05:13:43Z) - Neuromorphic Seatbelt State Detection for In-Cabin Monitoring with Event Cameras [0.932065750652415]
This research provides a proof of concept to expand event-based DMS techniques to include seatbelt state detection.
In a binary classification task, the fastened/unfastened frames were identified with an F1 score of 0.989 and 0.944 on the simulated and real test sets respectively.
arXiv Detail & Related papers (2023-08-15T14:27:46Z) - Dual Memory Aggregation Network for Event-Based Object Detection with Learnable Representation [79.02808071245634]
Event-based cameras are bio-inspired sensors that capture brightness changes at every pixel in an asynchronous manner.
Event streams are divided into grids in the x-y-t coordinates for both positive and negative polarity, producing a set of pillars as 3D tensor representation.
Long memory is encoded in the hidden state of adaptive convLSTMs while short memory is modeled by computing spatial-temporal correlation between event pillars.
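The x-y-t grid idea above can be sketched as follows; this is a minimal, assumed analogue of the pillar representation (one grid per polarity over a fixed number of time bins), not the authors' exact binning:

```python
import numpy as np

def events_to_pillars(events, height, width, t_bins=5):
    """Bin (x, y, t, p) events into a (2, t_bins, H, W) count tensor:
    one x-y-t grid per polarity. The bin count and normalisation are
    illustrative assumptions, not taken from the paper.
    """
    t = events[:, 2]
    span = max(t[-1] - t[0], 1)  # avoid division by zero
    t_idx = np.minimum(((t - t[0]) / span * t_bins).astype(int), t_bins - 1)
    p_idx = (events[:, 3] > 0).astype(int)  # 0 = negative, 1 = positive
    grid = np.zeros((2, t_bins, height, width), dtype=np.float32)
    # Count events per (polarity, time bin, y, x) cell.
    np.add.at(grid,
              (p_idx, t_idx, events[:, 1].astype(int), events[:, 0].astype(int)),
              1.0)
    return grid
```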
arXiv Detail & Related papers (2023-03-17T12:12:41Z) - Traffic Sign Detection With Event Cameras and DCNN [0.0]
Event cameras (DVS) have been used in vision systems as an alternative or supplement to traditional cameras.
In this work, we test whether these rather novel sensors can be applied to the popular task of traffic sign detection.
arXiv Detail & Related papers (2022-07-27T08:01:54Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - Moving Object Detection for Event-based vision using Graph Spectral Clustering [6.354824287948164]
Moving object detection has been a central topic of discussion in computer vision for its wide range of applications.
We present an unsupervised Graph Spectral Clustering technique for Moving Object Detection in Event-based data.
We additionally show how the optimum number of moving objects can be automatically determined.
arXiv Detail & Related papers (2021-09-30T10:19:22Z) - Fusion-FlowNet: Energy-Efficient Optical Flow Estimation using Sensor Fusion and Deep Fused Spiking-Analog Network Architectures [7.565038387344594]
We present a sensor fusion framework for energy-efficient optical flow estimation using both frame- and event-based sensors.
Our network is end-to-end trained using unsupervised learning to avoid expensive video annotations.
arXiv Detail & Related papers (2021-03-19T02:03:33Z) - Combining Events and Frames using Recurrent Asynchronous Multimodal Networks for Monocular Depth Prediction [51.072733683919246]
We introduce Recurrent Asynchronous Multimodal (RAM) networks to handle asynchronous and irregular data from multiple sensors.
Inspired by traditional RNNs, RAM networks maintain a hidden state that is updated asynchronously and can be queried at any time to generate a prediction.
We show an improvement over state-of-the-art methods by up to 30% in terms of mean depth absolute error.
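A toy illustration of the asynchronously updated, query-at-any-time hidden state described above; this is purely an assumed simplification (exponential decay of the old state by elapsed time), not the RAM network architecture itself:

```python
import math

class AsyncHiddenState:
    """Illustrative analogue of a hidden state that is updated whenever
    any sensor delivers a measurement and can be queried at any time.
    The exponential-decay blending rule and tau constant are assumptions
    made for this sketch, not the paper's learned update."""

    def __init__(self, size, tau=1.0):
        self.h = [0.0] * size
        self.t = 0.0
        self.tau = tau

    def update(self, x, t):
        # Older state counts for less the longer the gap since last update.
        decay = math.exp(-(t - self.t) / self.tau)
        self.h = [decay * h + (1 - decay) * xi for h, xi in zip(self.h, x)]
        self.t = t

    def query(self):
        # A prediction head could read the state at any wall-clock time.
        return list(self.h)
```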
arXiv Detail & Related papers (2021-02-18T13:24:35Z) - Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes as a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z) - End-to-end Learning of Object Motion Estimation from Retinal Events for Event-based Object Tracking [35.95703377642108]
We propose a novel deep neural network to learn and regress a parametric object-level motion/transform model for event-based object tracking.
To achieve this goal, we propose a synchronous Time-Surface with Linear Time Decay representation.
We feed the sequence of TSLTD frames to a novel Retinal Motion Regression Network (RMRNet) to perform end-to-end 5-DoF object motion regression.
arXiv Detail & Related papers (2020-02-14T08:19:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.