Seeing Far in the Dark with Patterned Flash
- URL: http://arxiv.org/abs/2207.12570v1
- Date: Mon, 25 Jul 2022 23:16:50 GMT
- Title: Seeing Far in the Dark with Patterned Flash
- Authors: Zhanghao Sun, Jian Wang, Yicheng Wu, Shree Nayar
- Abstract summary: We propose a new flash technique, named ``patterned flash'', for flash imaging at a long distance.
Patterned flash concentrates optical power into a dot array.
We develop a joint image reconstruction and depth estimation algorithm with a convolutional neural network.
- Score: 5.540878289831889
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Flash illumination is widely used in imaging under low-light environments.
However, illumination intensity falls off with propagation distance
quadratically, which poses significant challenges for flash imaging at a long
distance. We propose a new flash technique, named ``patterned flash'', for
flash imaging at a long distance. Patterned flash concentrates optical power
into a dot array. Compared with the conventional uniform flash where the signal
is overwhelmed by the noise everywhere, patterned flash provides stronger
signals at sparsely distributed points across the field of view to ensure the
signals at those points stand out from the sensor noise. This enables
post-processing to resolve important objects and details. Additionally, the
patterned flash projects texture onto the scene, which can be treated as a
structured light system for depth perception. Given the novel system, we
develop a joint image reconstruction and depth estimation algorithm with a
convolutional neural network. We build a hardware prototype and test the
proposed flash technique on various scenes. The experimental results
demonstrate that our patterned flash has significantly better performance at
long distances in low-light environments.
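The advantage claimed above follows from the inverse-square falloff: a uniform flash spreads its power over every pixel in the field of view, while a patterned flash concentrates the same power into a sparse dot array, so each illuminated pixel receives roughly (pixels / dots) times more light. The minimal sketch below illustrates that trade-off under a simplified shot-plus-read-noise model; the power, pixel-count, dot-count, and noise figures are illustrative placeholders rather than values from the paper, and the paper's actual reconstruction is performed by a learned CNN, not by this back-of-the-envelope calculation.

```python
import numpy as np

def flash_snr(distance_m, total_power_w, albedo=0.5, n_dots=None,
              fov_pixels=1_000_000, read_noise=2.0, qe_scale=1e4):
    """Rough per-pixel SNR of flash return at a given distance.

    A uniform flash (n_dots=None) spreads the returned power over all
    fov_pixels; a patterned flash concentrates it into n_dots points,
    so each dot pixel gets fov_pixels / n_dots times more signal.
    All constants here are assumed, illustrative values.
    """
    # Irradiance falls off quadratically with propagation distance.
    returned = total_power_w * albedo / distance_m**2
    if n_dots is None:
        per_pixel = returned / fov_pixels      # uniform flash
    else:
        per_pixel = returned / n_dots          # patterned-flash dot
    signal = qe_scale * per_pixel              # photo-electrons (arbitrary scale)
    noise = np.sqrt(signal + read_noise**2)    # shot noise + read noise
    return signal / noise

for d in (5, 20, 50):
    print(f"{d} m: uniform SNR {flash_snr(d, 1.0):.3f}, "
          f"patterned SNR {flash_snr(d, 1.0, n_dots=10_000):.3f}")
```

At long range the uniform-flash signal sinks below the read-noise floor while the dot signal remains detectable, which is the regime the paper targets.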
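Because the dot array also projects texture onto the scene, depth can in principle be recovered as in any structured-light or active-stereo system: a dot's shift on the sensor (disparity) is inversely proportional to its depth. The snippet below shows only that standard triangulation relation, with assumed focal-length and baseline values; the paper itself estimates depth jointly with image reconstruction using a convolutional neural network.

```python
def depth_from_disparity(disparity_px, focal_px=1400.0, baseline_m=0.05):
    """Structured-light / stereo triangulation: depth = f * b / disparity.

    focal_px and baseline_m are assumed example values, not the
    prototype's calibration.
    """
    return focal_px * baseline_m / disparity_px

# A dot displaced by 3.5 pixels from its reference position would
# correspond to a depth of ~20 m under these assumed parameters.
print(depth_from_disparity(3.5))
```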
Related papers
- Flash-Splat: 3D Reflection Removal with Flash Cues and Gaussian Splats [13.27784783829039]
We introduce a simple yet effective approach for separating transmitted and reflected light.
Our method, Flash-Splat, accurately reconstructs both transmitted and reflected scenes in 3D.
arXiv Detail & Related papers (2024-10-03T17:59:59Z)
- Computational Flash Photography through Intrinsics [0.0]
We study the computational control of the flash light in photographs taken with or without flash.
We present a physically motivated intrinsic formulation for flash photograph formation and develop flash decomposition and generation methods.
arXiv Detail & Related papers (2023-06-09T17:51:20Z)
- WildLight: In-the-wild Inverse Rendering with a Flashlight [77.31815397135381]
We propose a practical photometric solution for in-the-wild inverse rendering under unknown ambient lighting.
Our system recovers scene geometry and reflectance using only multi-view images captured by a smartphone.
We demonstrate by extensive experiments that our method is easy to implement, casual to set up, and consistently outperforms existing in-the-wild inverse rendering techniques.
arXiv Detail & Related papers (2023-03-24T17:59:56Z)
- Robust Reflection Removal with Flash-only Cues in the Wild [88.13531903652526]
We propose a reflection-free cue for robust reflection removal from a pair of flash and ambient (no-flash) images.
Our model outperforms state-of-the-art reflection removal approaches by more than 5.23dB in PSNR.
We extend our approach to handheld photography to address the misalignment between the flash and no-flash pair.
arXiv Detail & Related papers (2022-11-05T14:09:10Z)
- Robust Reflection Removal with Reflection-free Flash-only Cues [52.46297802064146]
We propose a reflection-free cue for robust reflection removal from a pair of flash and ambient (no-flash) images.
Our model outperforms state-of-the-art reflection removal approaches by more than 5.23dB in PSNR, 0.04 in SSIM, and 0.068 in LPIPS.
arXiv Detail & Related papers (2021-03-07T05:27:43Z)
- Deep Denoising of Flash and No-Flash Pairs for Photography in Low-Light Environments [51.74566709730618]
We introduce a neural network-based method to denoise pairs of images taken in quick succession, with and without a flash, in low-light environments.
Our goal is to produce a high-quality rendering of the scene that preserves the color and mood from the ambient illumination of the noisy no-flash image.
arXiv Detail & Related papers (2020-12-09T15:41:16Z)
- Light Stage Super-Resolution: Continuous High-Frequency Relighting [58.09243542908402]
We propose a learning-based solution for the "super-resolution" of scans of human faces taken from a light stage.
Our method aggregates the captured images corresponding to neighboring lights in the stage, and uses a neural network to synthesize a rendering of the face.
Our learned model is able to produce renderings for arbitrary light directions that exhibit realistic shadows and specular highlights.
arXiv Detail & Related papers (2020-10-17T23:40:43Z)
- Towards Geometry Guided Neural Relighting with Flash Photography [26.511476565209026]
We propose a framework for image relighting from a single flash photograph with its corresponding depth map using deep learning.
We experimentally validate the advantage of our geometry guided approach over state-of-the-art image-based approaches in intrinsic image decomposition and image relighting.
arXiv Detail & Related papers (2020-08-12T08:03:28Z) - Deep Bilateral Retinex for Low-Light Image Enhancement [96.15991198417552]
Low-light images suffer from poor visibility caused by low contrast, color distortion and measurement noise.
This paper proposes a deep learning method for low-light image enhancement with a particular focus on handling the measurement noise.
The proposed method is very competitive to the state-of-the-art methods, and has significant advantage over others when processing images captured in extremely low lighting conditions.
arXiv Detail & Related papers (2020-07-04T06:26:44Z)