Event Vision Sensor: A Review
- URL: http://arxiv.org/abs/2502.06116v1
- Date: Mon, 10 Feb 2025 02:50:57 GMT
- Title: Event Vision Sensor: A Review
- Authors: Xinyue Qin, Junlin Zhang, Wenzhong Bao, Chun Lin, Honglei Chen,
- Abstract summary: Event-based vision sensors provide high temporal resolution and low latency while maintaining low power consumption and simplicity in circuit structure.
The application of back-illuminated (BSI) technology, wafer stacking techniques, and industrial interfaces has brought new opportunities for enhancing the performance of event-based vision sensors.
This paper will review the progression from neuromorphic engineering to state-of-the-art event-based vision sensor technologies.
- Score: 1.2323240746125856
- Abstract: By monitoring temporal contrast, event-based vision sensors can provide high temporal resolution and low latency while maintaining low power consumption and simplicity in circuit structure. These characteristics have garnered significant attention in both academia and industry. In recent years, the application of back-illuminated (BSI) technology, wafer stacking techniques, and industrial interfaces has brought new opportunities for enhancing the performance of event-based vision sensors. This is evident in the substantial advancements made in reducing noise, improving resolution, and increasing readout rates. Additionally, the integration of these technologies has enhanced the compatibility of event-based vision sensors with current and edge vision systems, providing greater possibilities for their practical applications. This paper will review the progression from neuromorphic engineering to state-of-the-art event-based vision sensor technologies, including their development trends, operating principles, and key features. Moreover, we will delve into the sensitivity of event-based vision sensors and the opportunities and challenges they face in the realm of infrared imaging, providing references for future research and applications.
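The temporal-contrast principle mentioned in the abstract can be illustrated with a small model: each pixel tracks a reference log-intensity level and emits a signed event whenever the current log intensity deviates from that reference by more than a contrast threshold. The sketch below is a simplified, illustrative model of this behavior, not any particular sensor's circuit; the function name and threshold value are assumptions, not taken from the paper.

```python
def temporal_contrast_events(log_intensity, threshold=0.25):
    """Emit (sample_index, polarity) events whenever the log-intensity
    signal moves by at least `threshold` from the stored reference level,
    mimicking the temporal-contrast principle of an event pixel.
    (threshold=0.25 is chosen as an exactly representable float.)"""
    events = []
    ref = log_intensity[0]  # reference level, updated after each event
    for t, x in enumerate(log_intensity[1:], start=1):
        while x - ref >= threshold:   # brightness increased: ON event
            ref += threshold
            events.append((t, +1))
        while ref - x >= threshold:   # brightness decreased: OFF event
            ref -= threshold
            events.append((t, -1))
    return events

# A unit step in log intensity with a 0.25 threshold yields four ON events,
# all timestamped at the sample where the step occurs.
signal = [0.0] * 5 + [1.0] * 5
print(temporal_contrast_events(signal))  # [(5, 1), (5, 1), (5, 1), (5, 1)]
```

Note that a static scene produces no events at all, which is the source of the low power consumption and low data rate the abstract describes.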
Related papers
- Research, Applications and Prospects of Event-Based Pedestrian Detection: A Survey [10.494414329120909]
Event-based cameras, inspired by the biological retina, have evolved into cutting-edge sensors distinguished by their minimal power requirements, negligible latency, superior temporal resolution, and expansive dynamic range.
Event-based cameras address the limitations of conventional frame-based cameras by eschewing extraneous data transmission and obviating motion blur in high-speed imaging scenarios.
This paper offers an exhaustive review of research and applications particularly in the autonomous driving context.
arXiv Detail & Related papers (2024-07-05T06:17:00Z) - Decisive Data using Multi-Modality Optical Sensors for Advanced Vehicular Systems [1.3315340349412819]
This paper focuses on various optical technologies for design and development of state-of-the-art out-cabin forward vision systems and in-cabin driver monitoring systems.
The optical sensors in focus include Longwave Thermal Imaging (LWIR) cameras, Near Infrared (NIR) cameras, Neuromorphic/event cameras, Visible CMOS cameras, and Depth cameras.
arXiv Detail & Related papers (2023-07-25T16:03:47Z) - High-precision and low-latency widefield diamond quantum sensing with neuromorphic vision sensors [17.98109004256033]
A neuromorphic vision sensor pre-processes the detected signals in optically detected magnetic resonance measurements for quantum sensing.
An experiment with an off-the-shelf event camera demonstrated a 13x improvement in temporal resolution.
This development provides new insights for high-precision and low-latency widefield quantum sensing.
arXiv Detail & Related papers (2023-06-25T02:37:44Z) - Event-based Simultaneous Localization and Mapping: A Comprehensive Survey [52.73728442921428]
This survey reviews event-based vSLAM algorithms that exploit the benefits of asynchronous and irregular event streams for localization and mapping tasks.
It categorizes event-based vSLAM methods into four main categories: feature-based, direct, motion-compensation, and deep-learning methods.
arXiv Detail & Related papers (2023-04-19T16:21:14Z) - Deep Learning for Event-based Vision: A Comprehensive Survey and Benchmarks [55.81577205593956]
Event cameras are bio-inspired sensors that capture the per-pixel intensity changes asynchronously.
Deep learning (DL) has been brought to this emerging field and inspired active research endeavors in mining its potential.
arXiv Detail & Related papers (2023-02-17T14:19:28Z) - DensePose From WiFi [86.61881052177228]
We develop a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions.
Our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches.
arXiv Detail & Related papers (2022-12-31T16:48:43Z) - Towards Energy Efficient Mobile Eye Tracking for AR Glasses through Optical Sensor Technology [1.52292571922932]
Eye tracking is a crucial technology for helping AR glasses achieve a breakthrough via optimized display technology and gaze-based interaction concepts.
This thesis contributes to a significant scientific advancement towards energy-efficient mobile eye-tracking for AR glasses.
arXiv Detail & Related papers (2022-12-06T18:09:25Z) - Enabling energy efficient machine learning on a Ultra-Low-Power vision sensor for IoT [3.136861161060886]
This paper presents the development, analysis, and embedded implementation of a real-time detection, classification, and tracking pipeline.
Inference takes 8 ms and consumes 7.5 mW.
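For context, the figures quoted above imply an energy cost of roughly 60 microjoules per inference. This is simple arithmetic on the stated numbers, not a figure reported by the paper:

```python
# Back-of-the-envelope energy per inference from the quoted figures.
power_w = 7.5e-3     # 7.5 mW average power during inference
latency_s = 8e-3     # 8 ms per inference
energy_uj = power_w * latency_s * 1e6  # joules -> microjoules
print(f"{energy_uj:.0f} uJ per inference")  # prints "60 uJ per inference"
```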
arXiv Detail & Related papers (2021-02-02T06:39:36Z) - Cognitive Visual Inspection Service for LCD Manufacturing Industry [80.63336968475889]
This paper discloses a novel visual inspection system for liquid crystal displays (LCDs), currently the dominant type in the flat-panel display (FPD) industry.
The system is based on two cornerstones: a robust, high-performance defect recognition model and a cognitive visual inspection service architecture.
arXiv Detail & Related papers (2021-01-11T08:14:35Z) - Energy Aware Deep Reinforcement Learning Scheduling for Sensors Correlated in Time and Space [62.39318039798564]
We propose a scheduling mechanism capable of taking advantage of correlated information.
The proposed mechanism is capable of determining the frequency with which sensors should transmit their updates.
We show that our solution can significantly extend the sensors' lifetime.
arXiv Detail & Related papers (2020-11-19T09:53:27Z) - Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition [131.6328804788164]
We propose a framework, named Semantics-aware Adaptive Knowledge Distillation Networks (SAKDN), to enhance action recognition in the vision-sensor modality (videos).
SAKDN uses multiple wearable sensors as teacher modalities and RGB videos as the student modality.
arXiv Detail & Related papers (2020-09-01T03:38:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.