Robust photon-efficient imaging using a pixel-wise residual shrinkage network
- URL: http://arxiv.org/abs/2201.01453v1
- Date: Wed, 5 Jan 2022 05:08:12 GMT
- Title: Robust photon-efficient imaging using a pixel-wise residual shrinkage network
- Authors: Gongxin Yao, Yiwei Chen, Yong Liu, Xiaomin Hu and Yu Pan
- Abstract summary: Single-photon light detection and ranging (LiDAR) has been widely applied to 3D imaging in challenging scenarios.
However, limited signal photon counts and high noise in the collected data pose great challenges for predicting the depth image precisely.
We propose a pixel-wise residual shrinkage network for photon-efficient imaging from high-noise data.
- Score: 7.557893223548758
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Single-photon light detection and ranging (LiDAR) has been widely applied to 3D imaging in challenging scenarios. However, the limited signal photon counts and high noise in the collected data make it difficult to predict the depth image precisely. In this paper, we propose a pixel-wise residual shrinkage network for photon-efficient imaging from high-noise data, which adaptively generates an optimal threshold for each pixel and denoises the intermediate features by soft thresholding. In addition, redefining the optimization target as pixel-wise classification yields markedly more confident and accurate depth estimates than existing approaches. Comprehensive experiments on both simulated and real-world datasets demonstrate that the proposed model outperforms state-of-the-art methods and maintains robust imaging performance under different signal-to-noise ratios, including the extreme case of 1:100.
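To make the two ideas above concrete, here is a minimal sketch, assuming a PyTorch implementation, of a residual block that soft-thresholds its intermediate features with a threshold generated separately for each pixel. The module and layer names (PixelwiseShrinkage, thresh_net, ShrinkageResidualBlock) and the exact sub-network layout are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch of pixel-wise residual shrinkage (not the paper's code).
import torch
import torch.nn as nn


class PixelwiseShrinkage(nn.Module):
    """Soft-thresholds features with a threshold learned per pixel."""

    def __init__(self, channels: int):
        super().__init__()
        # Small sub-network mapping per-pixel statistics to a coefficient in
        # (0, 1); the threshold is this coefficient times the per-pixel mean
        # absolute activation, following the residual-shrinkage idea.
        self.thresh_net = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) intermediate features
        abs_x = x.abs()
        # Per-pixel scale: mean |x| over channels, shape (B, 1, H, W)
        scale = abs_x.mean(dim=1, keepdim=True)
        # Per-pixel threshold in [0, scale]
        tau = self.thresh_net(abs_x) * scale
        # Soft thresholding: shrink small (noise-dominated) responses to zero
        return torch.sign(x) * torch.clamp(abs_x - tau, min=0.0)


class ShrinkageResidualBlock(nn.Module):
    """Residual block whose noise suppression uses pixel-wise shrinkage."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        self.shrink = PixelwiseShrinkage(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.shrink(self.body(x))
```

Under the pixel-wise classification reformulation, the network's head would output a distribution over discretized time bins at each pixel and be trained with a cross-entropy loss rather than regressing depth directly.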
Related papers
- bit2bit: 1-bit quanta video reconstruction via self-supervised photon prediction [57.199618102578576]
We propose bit2bit, a new method for reconstructing high-quality image stacks at the original spatiotemporal resolution from sparse binary quanta image data.
Inspired by recent work on Poisson denoising, we developed an algorithm that creates a dense image sequence from sparse binary photon data.
We present a novel dataset containing a wide range of real SPAD high-speed videos under various challenging imaging conditions.
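As background for the binary quanta data mentioned above, the sketch below simulates single-photon (SPAD) frames from a clean intensity video using the standard Bernoulli detection model with probability 1 - exp(-flux); the function name and the flux_scale parameter are illustrative assumptions, and this is not the bit2bit pipeline itself.

```python
# Illustrative quanta data model: each SPAD frame records at most one photon
# per pixel, so binary frames can be sampled from a clean intensity video.
import numpy as np

rng = np.random.default_rng(0)

def simulate_binary_frames(clean_video: np.ndarray, flux_scale: float = 0.1) -> np.ndarray:
    """clean_video: (T, H, W) non-negative intensities; returns binary (T, H, W)."""
    # Expected photon count per pixel per frame
    flux = flux_scale * clean_video
    # Probability that at least one photon is detected (Poisson arrivals)
    p_detect = 1.0 - np.exp(-flux)
    return (rng.random(clean_video.shape) < p_detect).astype(np.uint8)

# Example: an extremely sparse binary stack from a synthetic ramp video
video = np.linspace(0.0, 1.0, 16 * 32 * 32).reshape(16, 32, 32)
binary_stack = simulate_binary_frames(video)
```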
arXiv Detail & Related papers (2024-10-30T17:30:35Z)
- Super-resolving Real-world Image Illumination Enhancement: A New Dataset and A Conditional Diffusion Model [43.93772529301279]
We propose the SRRIIE dataset together with an efficient method based on conditional diffusion probabilistic models.
We capture images using an ILDC camera and an optical zoom lens with exposure levels ranging from -6 EV to 0 EV and ISO levels ranging from 50 to 12800.
We show that most existing methods are less effective in preserving the structures and sharpness of restored images from complicated noises.
arXiv Detail & Related papers (2024-10-16T18:47:04Z)
- Sparse-DeRF: Deblurred Neural Radiance Fields from Sparse View [17.214047499850487]
This paper focuses on constructing deblurred neural radiance fields (DeRF) from sparse views for more pragmatic real-world scenarios.
Sparse-DeRF successfully regularizes the complicated joint optimization, presenting alleviated overfitting artifacts and enhanced quality on radiance fields.
We demonstrate the effectiveness of the Sparse-DeRF with extensive quantitative and qualitative experimental results by training DeRF from 2-view, 4-view, and 6-view blurry images.
arXiv Detail & Related papers (2024-07-09T07:36:54Z)
- Towards High-quality HDR Deghosting with Conditional Diffusion Models [88.83729417524823]
High Dynamic Range (HDR) images can be recovered from several Low Dynamic Range (LDR) images with existing Deep Neural Network (DNN) techniques.
However, DNNs still generate ghosting artifacts when the LDR images contain saturation and large motion.
We formulate HDR deghosting as an image generation problem that leverages LDR features as the diffusion model's condition.
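As a rough, generic illustration of conditioning a denoiser on LDR features, the sketch below concatenates the condition channel-wise with the noisy input; the layer sizes and the name CondDenoiser are assumptions, and the timestep embedding and sampling loop of a full diffusion model are omitted.

```python
# Generic conditioning-by-concatenation sketch (not the paper's architecture).
import torch
import torch.nn as nn


class CondDenoiser(nn.Module):
    def __init__(self, hdr_ch: int = 3, ldr_feat_ch: int = 32, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(hdr_ch + ldr_feat_ch, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hdr_ch, kernel_size=3, padding=1),
        )

    def forward(self, noisy_hdr: torch.Tensor, ldr_features: torch.Tensor) -> torch.Tensor:
        # Predict the noise added to the HDR image, guided by aligned LDR features
        return self.net(torch.cat([noisy_hdr, ldr_features], dim=1))
```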
arXiv Detail & Related papers (2023-11-02T01:53:55Z)
- Advancing Unsupervised Low-light Image Enhancement: Noise Estimation, Illumination Interpolation, and Self-Regulation [55.07472635587852]
Low-Light Image Enhancement (LLIE) techniques have made notable advancements in preserving image details and enhancing contrast.
These approaches encounter persistent challenges in efficiently mitigating dynamic noise and accommodating diverse low-light scenarios.
We first propose a quick and accurate method for estimating the noise level in low-light images.
We then devise a Learnable Illumination Interpolator (LII) to satisfy general constraints between illumination and input.
arXiv Detail & Related papers (2023-05-17T13:56:48Z)
- Complex-valued Retrievals From Noisy Images Using Diffusion Models [26.467188665404727]
In microscopy, sensors measure only real-valued intensities. Additionally, the sensor readouts are affected by Poissonian-distributed photon noise.
Traditional restoration algorithms aim to minimize the mean squared error (MSE) between the original and recovered images.
This often leads to blurry outcomes with poor perceptual quality.
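A one-line derivation explains why MSE-optimal restoration tends to look blurry: the estimator that minimizes the expected squared error is the posterior mean, which averages every reconstruction consistent with the noisy measurement and thereby washes out fine detail.

```latex
% The MSE-optimal estimator given the noisy measurement y is the posterior
% mean, i.e. an average over all reconstructions x consistent with y.
\[
  \hat{x}_{\mathrm{MMSE}}(y)
    = \arg\min_{\hat{x}} \, \mathbb{E}\!\left[\,\lVert x - \hat{x}\rVert^{2} \,\middle|\, y\,\right]
    = \mathbb{E}\!\left[\,x \,\middle|\, y\,\right]
\]
```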
arXiv Detail & Related papers (2022-12-06T18:57:59Z)
- Ultra Low-Parameter Denoising: Trainable Bilateral Filter Layers in Computed Tomography [7.405782253585339]
This work presents an open-source CT denoising framework based on the idea of bilateral filtering.
We propose a bilateral filter that can be incorporated into a deep learning pipeline and optimized in a purely data-driven way.
The denoiser reaches 0.7053 SSIM and 33.10 dB PSNR on x-ray microscope bone data, and 0.9674 SSIM and 43.07 dB PSNR on the 2016 Low Dose CT Grand Challenge dataset.
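For context, a bilateral filter becomes trainable once its spatial and range bandwidths are exposed as differentiable parameters. The sketch below, assuming a PyTorch implementation of a single-channel layer, illustrates this idea and is not the paper's open-source framework.

```python
# Illustrative trainable bilateral filter layer (single-channel images).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TrainableBilateralFilter(nn.Module):
    def __init__(self, kernel_size: int = 5):
        super().__init__()
        self.k = kernel_size
        # Learnable bandwidths (kept positive via softplus in forward)
        self.sigma_spatial = nn.Parameter(torch.tensor(1.0))
        self.sigma_range = nn.Parameter(torch.tensor(0.1))
        # Precompute squared spatial distances within the window
        r = kernel_size // 2
        ys, xs = torch.meshgrid(torch.arange(-r, r + 1), torch.arange(-r, r + 1), indexing="ij")
        self.register_buffer("dist2", (ys**2 + xs**2).float().reshape(1, -1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, 1, H, W) single-channel image (e.g. a CT slice)
        b, c, h, w = x.shape
        pad = self.k // 2
        # Neighborhood patches: (B, k*k, H*W)
        patches = F.unfold(x, kernel_size=self.k, padding=pad)
        center = x.reshape(b, 1, h * w)
        s_sp = F.softplus(self.sigma_spatial)
        s_rg = F.softplus(self.sigma_range)
        # Spatial weights depend on pixel distance, range weights on intensity difference
        w_spatial = torch.exp(-self.dist2 / (2 * s_sp**2))
        w_range = torch.exp(-((patches - center) ** 2) / (2 * s_rg**2))
        weights = w_spatial * w_range
        out = (weights * patches).sum(dim=1) / weights.sum(dim=1).clamp_min(1e-8)
        return out.reshape(b, c, h, w)
```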
arXiv Detail & Related papers (2022-01-25T14:33:56Z)
- Deep Domain Adversarial Adaptation for Photon-efficient Imaging Based on Spatiotemporal Inception Network [11.58898808789911]
In single-photon LiDAR, photon-efficient imaging captures the 3D structure of a scene with only a few signal photons detected per pixel.
Existing deep learning models for this task are trained on simulated datasets, which poses the domain shift challenge when applied to realistic scenarios.
We propose a spatiotemporal inception network (STIN) for photon-efficient imaging, which precisely predicts depth from a sparse, high-noise photon-counting histogram by fully exploiting spatial and temporal information.
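As background on the data format, each pixel in photon-efficient imaging holds a histogram of photon arrival times, and depth follows from the round-trip time of the signal peak, d = c*t/2. The sketch below uses an assumed 80 ps bin width and a simple argmax readout purely for illustration; it is not taken from the STIN paper.

```python
# Background sketch: converting a per-pixel photon-counting histogram to depth.
import numpy as np

C = 3e8             # speed of light, m/s
BIN_WIDTH = 80e-12  # assumed time-bin width in seconds (80 ps)

def depth_from_histogram(hist: np.ndarray) -> np.ndarray:
    """hist: (T, H, W) photon counts per time bin; returns depth map (H, W) in meters."""
    # Pick the most likely round-trip time bin at each pixel ...
    peak_bin = hist.argmax(axis=0)
    # ... and convert round-trip time to one-way distance: d = c * t / 2
    return C * (peak_bin * BIN_WIDTH) / 2.0

# Example: a noisy histogram with a signal peak at bin 200 of 1024
rng = np.random.default_rng(0)
hist = rng.poisson(0.02, size=(1024, 4, 4)).astype(float)
hist[200] += 5.0
print(depth_from_histogram(hist))  # ~2.4 m at every pixel
```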
arXiv Detail & Related papers (2022-01-07T14:51:48Z)
- Designing a Practical Degradation Model for Deep Blind Image Super-Resolution [134.9023380383406]
Single image super-resolution (SISR) methods do not perform well when the assumed degradation model deviates from the actual degradation in real images.
This paper proposes to design a more complex but practical degradation model that consists of randomly shuffled blur, downsampling and noise degradations.
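The shuffled degradation idea can be sketched as follows; the specific blur, downsampling, and noise settings here are assumptions rather than the paper's exact configuration.

```python
# Illustrative randomly-shuffled degradation pipeline for blind SR training data.
import random
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def blur(img):       return gaussian_filter(img, sigma=random.uniform(0.5, 3.0))
def downsample(img): return img[::2, ::2]  # simple 2x decimation
def add_noise(img):  return img + rng.normal(0, random.uniform(0.01, 0.1), img.shape)

def degrade(hr_image: np.ndarray) -> np.ndarray:
    """Apply blur, downsampling and noise in a random order."""
    ops = [blur, downsample, add_noise]
    random.shuffle(ops)  # the shuffled order is what makes the model more practical
    out = hr_image
    for op in ops:
        out = op(out)
    return np.clip(out, 0.0, 1.0)

lr_image = degrade(np.random.rand(128, 128))
```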
arXiv Detail & Related papers (2021-03-25T17:40:53Z)
- Correlation Plenoptic Imaging between Arbitrary Planes [52.77024349608834]
We show that the protocol makes it possible to change the focused planes in post-processing and to achieve an unprecedented combination of image resolution and depth of field.
Results lead the way towards the development of compact designs for correlation plenoptic imaging devices based on chaotic light, as well as high-SNR plenoptic imaging devices based on entangled photon illumination.
arXiv Detail & Related papers (2020-07-23T14:26:14Z)
- Deep Bilateral Retinex for Low-Light Image Enhancement [96.15991198417552]
Low-light images suffer from poor visibility caused by low contrast, color distortion and measurement noise.
This paper proposes a deep learning method for low-light image enhancement with a particular focus on handling the measurement noise.
The proposed method is highly competitive with state-of-the-art methods and has a significant advantage over them when processing images captured under extremely low lighting conditions.
arXiv Detail & Related papers (2020-07-04T06:26:44Z)