Panoramas from Photons
- URL: http://arxiv.org/abs/2309.03811v1
- Date: Thu, 7 Sep 2023 16:07:31 GMT
- Title: Panoramas from Photons
- Authors: Sacha Jungerman, Atul Ingle, Mohit Gupta
- Abstract summary: We present a method capable of estimating extreme scene motion under challenging conditions, such as low light or high dynamic range.
Our method relies on grouping and aggregating frames after-the-fact, in a stratified manner.
We demonstrate the creation of high-quality panoramas under fast motion and extremely low light, and super-resolution results using a custom single-photon camera prototype.
- Score: 22.437940699523082
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Scene reconstruction in the presence of high-speed motion and low
illumination is important in many applications such as augmented and virtual
reality, drone navigation, and autonomous robotics. Traditional motion
estimation techniques fail in such conditions, suffering from too much blur in
the presence of high-speed motion and strong noise in low-light conditions.
Single-photon cameras have recently emerged as a promising technology capable
of capturing hundreds of thousands of photon frames per second thanks to their
high speed and extreme sensitivity. Unfortunately, traditional computer vision
techniques are not well suited for dealing with the binary-valued photon data
captured by these cameras because these are corrupted by extreme Poisson noise.
Here we present a method capable of estimating extreme scene motion under
challenging conditions, such as low light or high dynamic range, from a
sequence of high-speed image frames such as those captured by a single-photon
camera. Our method relies on iteratively improving a motion estimate by
grouping and aggregating frames after-the-fact, in a stratified manner. We
demonstrate the creation of high-quality panoramas under fast motion and
extremely low light, and super-resolution results using a custom single-photon
camera prototype. For code and supplemental material see our
$\href{https://wisionlab.com/project/panoramas-from-photons/}{\text{project
webpage}}$.
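The core grouping-and-aggregating idea can be illustrated with a toy 1-D simulation. The sketch below is not the authors' algorithm: the scene, panning speed, group size, and brute-force integer-shift registration are all illustrative assumptions. It only shows the underlying principle that averaging blocks of binary photon frames suppresses Bernoulli/Poisson noise enough for the aggregates to be registered against each other.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D scene: per-pixel photon-detection probabilities.
scene = 0.1 + 0.5 * rng.random(300)

# Simulate binary photon frames from a camera panning 0.05 px/frame:
# each pixel fires (1) with probability given by the local intensity.
n_frames, width, speed = 800, 200, 0.05
frames = np.empty((n_frames, width), dtype=np.uint8)
for t in range(n_frames):
    shift = t * speed
    lo = int(shift)
    frac = shift - lo
    window = (1 - frac) * scene[lo:lo + width] \
        + frac * scene[lo + 1:lo + width + 1]
    frames[t] = rng.random(width) < window

# Group-and-aggregate: average blocks of frames to suppress the
# binary shot noise, then register consecutive aggregates.
group = 100
aggregates = frames.reshape(-1, group, width).mean(axis=1)

def shift_between(a, b, max_shift=10):
    """Brute-force integer-shift registration of two 1-D aggregates."""
    a = a - a.mean()
    b = b - b.mean()
    best_score, best_s = -np.inf, 0
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            score = np.dot(a[s:], b[:len(b) - s]) / (len(a) - s)
        else:
            score = np.dot(a[:s], b[-s:]) / (len(a) + s)
        if score > best_score:
            best_score, best_s = score, s
    return best_s

# By construction the true inter-group motion is group * speed = 5 px.
shifts = [shift_between(aggregates[i], aggregates[i + 1])
          for i in range(len(aggregates) - 1)]
print(shifts)
```

The paper's actual method goes further, iteratively refining the motion estimate and aggregating in a stratified manner; this sketch performs only a single level of grouping with integer registration.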
Related papers
- Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
arXiv Detail & Related papers (2024-09-26T15:57:20Z)
- Towards Real-world Event-guided Low-light Video Enhancement and Deblurring [39.942568142125126]
Event cameras have emerged as a promising solution for improving image quality in low-light environments.
We introduce an end-to-end framework to effectively handle these tasks.
Our framework incorporates a module to efficiently leverage temporal information from events and frames.
arXiv Detail & Related papers (2024-08-27T09:44:54Z)
- Radiance Fields from Photons [18.15183252935672]
We introduce quanta radiance fields, a class of neural radiance fields that are trained at the granularity of individual photons using single-photon cameras (SPCs).
We demonstrate, both via simulations and a prototype SPC hardware, high-fidelity reconstructions under high-speed motion, in low light, and for extreme dynamic range settings.
arXiv Detail & Related papers (2024-07-12T16:06:51Z)
- Event Cameras Meet SPADs for High-Speed, Low-Bandwidth Imaging [25.13346470561497]
Event cameras and single-photon avalanche diode (SPAD) sensors have emerged as promising alternatives to conventional cameras.
We show that these properties are complementary, and can help achieve low-light, high-speed image reconstruction with low bandwidth requirements.
arXiv Detail & Related papers (2024-04-17T16:06:29Z)
- Robust e-NeRF: NeRF from Sparse & Noisy Events under Non-Uniform Motion [67.15935067326662]
Event cameras offer low power, low latency, high temporal resolution and high dynamic range.
NeRF is seen as the leading candidate for efficient and effective scene representation.
We propose Robust e-NeRF, a novel method to directly and robustly reconstruct NeRFs from moving event cameras.
arXiv Detail & Related papers (2023-09-15T17:52:08Z)
- E-NeRF: Neural Radiance Fields from a Moving Event Camera [83.91656576631031]
Estimating neural radiance fields (NeRFs) from ideal images has been extensively studied in the computer vision community.
We present E-NeRF, the first method which estimates a volumetric scene representation in the form of a NeRF from a fast-moving event camera.
arXiv Detail & Related papers (2022-08-24T04:53:32Z)
- High Dynamic Range and Super-Resolution from Raw Image Bursts [52.341483902624006]
This paper introduces the first approach to reconstruct high-resolution, high-dynamic range color images from raw photographic bursts captured by a handheld camera with exposure bracketing.
The proposed algorithm is fast, with low memory requirements compared to state-of-the-art learning-based approaches to image restoration.
Experiments demonstrate its excellent performance with super-resolution factors of up to $\times 4$ on real photographs taken in the wild with hand-held cameras.
arXiv Detail & Related papers (2022-07-29T13:31:28Z)
- Learning Spatially Varying Pixel Exposures for Motion Deblurring [49.07867902677453]
We present a novel approach of leveraging spatially varying pixel exposures for motion deblurring.
Our work illustrates the promising role that focal-plane sensor--processors can play in the future of computational imaging.
arXiv Detail & Related papers (2022-04-14T23:41:49Z)
- Real-Time Optical Flow for Vehicular Perception with Low- and High-Resolution Event Cameras [3.845877724862319]
Event cameras capture changes of illumination in the observed scene rather than accumulating light to create images.
We propose an optimized framework for computing optical flow in real-time with both low- and high-resolution event cameras.
We evaluate our approach on both low- and high-resolution driving sequences, and show that it often achieves better results than the current state of the art.
arXiv Detail & Related papers (2021-12-20T15:09:20Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- Photon-Starved Scene Inference using Single Photon Cameras [14.121328731553868]
We propose the photon scale-space, a collection of high-SNR images spanning a wide range of photons-per-pixel (PPP) levels.
We develop training techniques that push images with different illumination levels closer to each other in feature representation space.
Based on the proposed approach, we demonstrate, via simulations and real experiments with a SPAD camera, high-performance on various inference tasks.
arXiv Detail & Related papers (2021-07-23T02:27:03Z)
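The photon scale-space idea above can be sketched in a few lines. The simulation below is a hedged illustration, not the paper's pipeline: the scene, frame count, and detection model are assumptions. It builds a stack of images by averaging exponentially growing numbers of binary frames, so each level corresponds to a different effective PPP and a different blur/noise trade-off.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical static scene: per-pixel photon-detection probability
# (for a single-photon camera, roughly 1 - exp(-photon flux)).
scene = rng.uniform(0.05, 0.6, size=(32, 32))

# Simulate a burst of binary single-photon frames (Bernoulli per pixel).
n_frames = 1024
frames = rng.random((n_frames,) + scene.shape) < scene

# Photon scale-space: average 2^k frames for k = 0..10, trading
# temporal resolution for SNR, so each level sees a different PPP.
scale_space = {k: frames[:2 ** k].mean(axis=0) for k in range(11)}

for k in (0, 5, 10):
    rmse = np.sqrt(np.mean((scale_space[k] - scene) ** 2))
    print(f"2^{k:2d} frames averaged: RMSE vs. true intensity = {rmse:.3f}")
```

For a static scene the RMSE shrinks roughly as the square root of the number of frames averaged; the cited paper additionally trains inference networks whose feature representations are consistent across these illumination levels.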
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.