Hybrid Event Frame Sensors: Modeling, Calibration, and Simulation
- URL: http://arxiv.org/abs/2511.18037v1
- Date: Sat, 22 Nov 2025 12:32:07 GMT
- Title: Hybrid Event Frame Sensors: Modeling, Calibration, and Simulation
- Authors: Yunfan Lu, Nico Messikommer, Xiaogang Xu, Liming Chen, Yuhan Chen, Nikola Zubic, Davide Scaramuzza, Hui Xiong
- Abstract summary: Event frame hybrid sensors integrate an Active Pixel Sensor (APS) and an Event Vision Sensor (EVS) within a single chip. We present the first unified, statistics-based imaging noise model that jointly describes the noise behavior of APS and EVS pixels.
- Score: 46.93612436763656
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event frame hybrid sensors integrate an Active Pixel Sensor (APS) and an Event Vision Sensor (EVS) within a single chip, combining the high dynamic range and low latency of the EVS with the rich spatial intensity information from the APS. While this tight integration offers compact, temporally precise imaging, the complex circuit architecture introduces non-trivial noise patterns that remain poorly understood and unmodeled. In this work, we present the first unified, statistics-based imaging noise model that jointly describes the noise behavior of APS and EVS pixels. Our formulation explicitly incorporates photon shot noise, dark current noise, fixed-pattern noise, and quantization noise, and links EVS noise to illumination level and dark current. Based on this formulation, we further develop a calibration pipeline to estimate noise parameters from real data and offer a detailed analysis of both APS and EVS noise behaviors. Finally, we propose HESIM, a statistically grounded simulator that generates RAW frames and events under realistic, jointly calibrated noise statistics. Experiments on two hybrid sensors validate our model across multiple imaging tasks (e.g., video frame interpolation and deblurring), demonstrating strong transfer from simulation to real data.
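The abstract's noise decomposition (photon shot noise, dark current noise, fixed-pattern noise, quantization noise for the APS, and illumination-linked thresholding for the EVS) can be illustrated with a minimal synthesis sketch. This is not the paper's calibrated HESIM model; function names and all parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def synthesize_aps_raw(photons, exposure_s, dark_e_per_s=0.1,
                       fpn_sigma=0.01, gain=1.0, bit_depth=10):
    """Simulate a RAW APS frame with shot, dark-current, fixed-pattern,
    and quantization noise (illustrative, not the paper's calibration)."""
    # Photon shot noise: photon arrivals are Poisson-distributed.
    signal_e = rng.poisson(photons)
    # Dark current also accumulates Poisson-distributed electrons.
    dark_e = rng.poisson(dark_e_per_s * exposure_s, size=photons.shape)
    # Fixed-pattern noise: per-pixel gain variation (in a real sensor this
    # map is fixed; sampling it per call here is a simplification).
    pixel_gain = 1.0 + fpn_sigma * rng.standard_normal(photons.shape)
    analog = gain * pixel_gain * (signal_e + dark_e)
    # Quantization to the ADC's bit depth.
    max_code = 2 ** bit_depth - 1
    return np.clip(np.round(analog), 0, max_code).astype(np.uint16)

def synthesize_events(log_i_prev, log_i_new, threshold=0.2, noise_sigma=0.03):
    """Generate EVS polarities by thresholding the log-intensity change,
    with Gaussian threshold noise (a common simplification)."""
    delta = (log_i_new - log_i_prev
             + noise_sigma * rng.standard_normal(log_i_new.shape))
    return np.sign(delta) * (np.abs(delta) >= threshold)
```

For example, `synthesize_aps_raw(np.full((480, 640), 500.0), exposure_s=0.01)` produces a 10-bit RAW frame whose variance grows with signal level, the heteroscedastic behavior the paper's calibration pipeline estimates from real data.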
Related papers
- Denoising the Deep Sky: Physics-Based CCD Noise Formation for Astronomical Imaging [47.83642412662346]
Learning-based denoising is promising, yet progress is hindered by scarce paired training data. We propose a physics-based noise synthesis framework tailored to CCD noise formation.
arXiv Detail & Related papers (2026-01-30T18:47:54Z)
- Physics-Guided Rectified Flow for Low-light RAW Image Enhancement [0.0]
Enhancing RAW images captured under low-light conditions is a challenging task. Recent deep-learning-based RAW enhancement methods have shifted from using real paired data to relying on synthetic datasets.
arXiv Detail & Related papers (2025-09-10T07:08:43Z)
- Towards General Low-Light Raw Noise Synthesis and Modeling [37.87312467017369]
We introduce a new perspective: synthesizing signal-independent noise with a generative model.
Specifically, we synthesize signal-dependent and signal-independent noise in a physics- and learning-based manner.
In this way, our method serves as a general model: it can simultaneously learn different noise characteristics for different ISO levels.
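The signal-dependent/signal-independent split described above is conventionally grounded in the Poisson-Gaussian raw-noise model. A minimal sketch of that baseline follows; note the paper replaces the Gaussian signal-independent term with a learned generative model, which is not reproduced here, and the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_gaussian_noise(clean_electrons, iso_gain=2.0, read_sigma=1.5):
    """Heteroscedastic raw-noise synthesis: signal-dependent shot noise
    (Poisson, scaled by the ISO/system gain) plus signal-independent
    Gaussian read noise. Parameter values are illustrative."""
    # Shot noise variance scales with the signal: Var = iso_gain * signal.
    shot = iso_gain * rng.poisson(clean_electrons / iso_gain)
    # Read noise is independent of the signal level.
    read = read_sigma * rng.standard_normal(clean_electrons.shape)
    return shot + read
```

Raising `iso_gain` amplifies the signal-dependent variance, which is why per-ISO calibration (or a model that learns across ISO levels, as the paper claims) matters.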
arXiv Detail & Related papers (2023-07-31T09:10:10Z)
- Realistic Noise Synthesis with Diffusion Models [44.404059914652194]
Deep denoising models require extensive real-world training data, which is challenging to acquire. We propose a novel Realistic Noise Synthesis Diffusor (RNSD) method using diffusion models to address these challenges.
arXiv Detail & Related papers (2023-05-23T12:56:01Z)
- Advancing Unsupervised Low-light Image Enhancement: Noise Estimation, Illumination Interpolation, and Self-Regulation [55.07472635587852]
Low-Light Image Enhancement (LLIE) techniques have made notable advancements in preserving image details and enhancing contrast.
These approaches encounter persistent challenges in efficiently mitigating dynamic noise and accommodating diverse low-light scenarios.
We first propose a method for quickly and accurately estimating the noise level in low-light images.
We then devise a Learnable Illumination Interpolator (LII) to satisfy general constraints between illumination and input.
arXiv Detail & Related papers (2023-05-17T13:56:48Z)
- E-MLB: Multilevel Benchmark for Event-Based Camera Denoising [12.698543500397275]
Event cameras are more sensitive to junction leakage current and photocurrent as they output differential signals.
We construct a large-scale event denoising dataset (multilevel benchmark for event denoising, E-MLB) for the first time.
We also propose the first nonreference event denoising metric, the event structural ratio (ESR), which measures the structural intensity of given events.
arXiv Detail & Related papers (2023-03-21T16:31:53Z) - Rethinking Noise Synthesis and Modeling in Raw Denoising [75.55136662685341]
We introduce a new perspective to synthesize noise by directly sampling from the sensor's real noise.
It inherently generates accurate raw image noise for different camera sensors.
arXiv Detail & Related papers (2021-10-10T10:45:24Z)
- CERL: A Unified Optimization Framework for Light Enhancement with Realistic Noise [81.47026986488638]
Low-light images captured in the real world are inevitably corrupted by sensor noise.
Existing light enhancement methods either overlook the important impact of real-world noise during enhancement, or treat noise removal as a separate pre- or post-processing step.
We present Coordinated Enhancement for Real-world Low-light Noisy Images (CERL), which seamlessly integrates light enhancement and noise suppression into a unified, physics-grounded framework.
arXiv Detail & Related papers (2021-08-01T15:31:15Z)
- Designing a Practical Degradation Model for Deep Blind Image Super-Resolution [134.9023380383406]
Single image super-resolution (SISR) methods do not perform well when the assumed degradation model deviates from the degradations in real images.
This paper proposes to design a more complex but practical degradation model that consists of randomly shuffled blur, downsampling and noise degradations.
arXiv Detail & Related papers (2021-03-25T17:40:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.