Autofocus for Event Cameras
- URL: http://arxiv.org/abs/2203.12321v1
- Date: Wed, 23 Mar 2022 10:46:33 GMT
- Title: Autofocus for Event Cameras
- Authors: Shijie Lin, Yinqiang Zhang, Lei Yu, Bin Zhou, Xiaowei Luo and Jia Pan
- Abstract summary: We develop a novel event-based autofocus framework consisting of an event-specific focus measure called event rate (ER) and a robust search strategy called event-based golden search (EGS).
The experiments on this dataset and additional real-world scenarios demonstrated the superiority of our method over state-of-the-art approaches in terms of efficiency and accuracy.
- Score: 21.972388081563267
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Focus control (FC) is crucial for cameras to capture sharp images in
challenging real-world scenarios. The autofocus (AF) facilitates the FC by
automatically adjusting the focus settings. However, due to the lack of
effective AF methods for the recently introduced event cameras, their FC still
relies on naive AF like manual focus adjustments, leading to poor adaptation in
challenging real-world conditions. In particular, the inherent differences
between event and frame data in terms of sensing modality, noise, temporal
resolutions, etc., bring many challenges in designing an effective AF method
for event cameras. To address these challenges, we develop a novel event-based
autofocus framework consisting of an event-specific focus measure called event
rate (ER) and a robust search strategy called event-based golden search (EGS).
To verify the performance of our method, we have collected an event-based
autofocus dataset (EAD) containing well-synchronized frames, events, and focal
positions in a wide variety of challenging scenes with severe lighting and
motion conditions. The experiments on this dataset and additional real-world
scenarios demonstrated the superiority of our method over state-of-the-art
approaches in terms of efficiency and accuracy.
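The two named components can be sketched in a few lines. This is a minimal illustrative sketch under stated assumptions, not the authors' implementation: the function names are hypothetical, ER is reduced to a scalar events-per-second count, and EGS is rendered as a standard golden-section search over focal positions assuming ER is unimodal in the focus setting.

```python
# Illustrative sketch (not the authors' code): an event rate (ER) focus
# measure and an event-based golden-section search (EGS) over focal positions.

def event_rate(timestamps, t_start, t_end):
    """ER focus measure: number of events per second in a time window.
    A sharper image has more high-contrast edges, so a lens sweep or scene
    motion triggers more events near the in-focus position."""
    count = sum(1 for t in timestamps if t_start <= t < t_end)
    return count / (t_end - t_start)

def egs(measure, lo, hi, tol=1.0):
    """Golden-section search for the focal position maximizing `measure`.
    `measure(pos)` moves the lens to motor position `pos` and returns the
    ER observed there; `measure` is assumed unimodal in `pos`."""
    phi = (5 ** 0.5 - 1) / 2  # inverse golden ratio, ~0.618
    a, b = lo, hi
    x1, x2 = b - phi * (b - a), a + phi * (b - a)
    f1, f2 = measure(x1), measure(x2)
    while b - a > tol:
        if f1 < f2:              # maximum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + phi * (b - a)
            f2 = measure(x2)
        else:                    # maximum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - phi * (b - a)
            f1 = measure(x1)
    return 0.5 * (a + b)
```

Compared with a full focal sweep, a golden-section strategy needs only one new focus-measure evaluation per iteration and shrinks the search interval by a constant factor, which is consistent with the efficiency claim above.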
Related papers
- Event-assisted Low-Light Video Object Segmentation [47.28027938310957]
Event cameras offer promise in enhancing object visibility and aiding VOS methods under low-light conditions.
This paper introduces a pioneering framework tailored for low-light VOS, leveraging event camera data to elevate segmentation accuracy.
arXiv Detail & Related papers (2024-04-02T13:41:22Z)
- EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z)
- $\text{DC}^2$: Dual-Camera Defocus Control by Learning to Refocus [38.24734623691387]
We propose a defocus-control system that synthetically varies camera aperture and focus distance and renders arbitrary defocus effects.
Our key insight is to leverage real-world smartphone camera dataset by using image refocus as a proxy task for learning to control defocus.
We demonstrate creative post-capture defocus control enabled by our method, including tilt-shift and content-based defocus effects.
arXiv Detail & Related papers (2023-04-06T17:59:58Z)
- Improving Fast Auto-Focus with Event Polarity [5.376511424333543]
This paper presents a new high-speed and accurate event-based focusing algorithm.
Experiments on the public event-based autofocus dataset (EAD) show the robustness of the model.
Precise focus within one depth of focus is achieved in 0.004 seconds on our self-built high-speed focusing platform.
arXiv Detail & Related papers (2023-03-15T13:36:13Z)
- Learning to See Through with Events [37.19232535463858]
This paper presents an Event-based SAI (E-SAI) method by relying on asynchronous events with extremely low latency and high dynamic range.
The collected events are first refocused by a Re-focus-Net module to align in-focus events while scattering out off-focus ones.
A hybrid network composed of spiking neural networks (SNNs) and convolutional neural networks (CNNs) is proposed to encode the spatio-temporal information from the refocused events and reconstruct a visual image of the occluded scenes.
arXiv Detail & Related papers (2022-12-05T12:51:22Z)
- Bridging the Gap between Events and Frames through Unsupervised Domain Adaptation [57.22705137545853]
We propose a task transfer method that allows models to be trained directly with labeled images and unlabeled event data.
We leverage the generative event model to split event features into content and motion features.
Our approach unlocks the vast amount of existing image datasets for the training of event-based neural networks.
arXiv Detail & Related papers (2021-09-06T17:31:37Z)
- An End-to-End Autofocus Camera for Iris on the Move [48.14011526385088]
In this paper, we introduce a novel rapid autofocus camera for active refocusing of the iris area of moving objects using a focus-tunable lens.
Our end-to-end computational algorithm can predict the best focus position from one single blurred image and generate a lens diopter control signal automatically.
The results demonstrate the advantages of our proposed camera for biometric perception in static and dynamic scenes.
arXiv Detail & Related papers (2021-06-29T03:00:39Z)
- Rapid Whole Slide Imaging via Learning-based Two-shot Virtual Autofocusing [57.90239401665367]
Whole slide imaging (WSI) is an emerging technology for digital pathology.
We propose the concept of virtual autofocusing, which does not rely on mechanical adjustment to conduct refocusing.
arXiv Detail & Related papers (2020-03-14T13:40:33Z)
- Asynchronous Tracking-by-Detection on Adaptive Time Surfaces for Event-based Object Tracking [87.0297771292994]
We propose an Event-based Tracking-by-Detection (ETD) method for generic bounding box-based object tracking.
To achieve this goal, we present an Adaptive Time-Surface with Linear Time Decay (ATSLTD) event-to-frame conversion algorithm.
We compare the proposed ETD method with seven popular object tracking methods, that are based on conventional cameras or event cameras, and two variants of ETD.
arXiv Detail & Related papers (2020-02-13T15:58:31Z)
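The time-surface representation named in the last entry can be roughly illustrated as follows. This sketch shows only a basic time surface with linear time decay, under assumed conventions for event tuples; the function name and parameters are hypothetical, and the ATSLTD paper's adaptive windowing is not reproduced here.

```python
import numpy as np

def linear_decay_time_surface(events, shape, t_now, tau):
    """Time surface with linear time decay: each pixel holds
    max(0, 1 - (t_now - t_last) / tau), where t_last is the timestamp of
    the most recent event at that pixel (pixels with no event stay 0)."""
    t_last = np.full(shape, -np.inf)   # shape = (rows, cols)
    for x, y, t in events:             # assumed event tuple: (column, row, time)
        t_last[y, x] = max(t_last[y, x], t)
    return np.clip(1.0 - (t_now - t_last) / tau, 0.0, 1.0)
```

Recent events map to bright pixels and older ones fade linearly toward zero, yielding a frame-like image that conventional detection pipelines can consume.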
This list is automatically generated from the titles and abstracts of the papers in this site.