Deep Photon Mapping
- URL: http://arxiv.org/abs/2004.12069v1
- Date: Sat, 25 Apr 2020 06:59:10 GMT
- Title: Deep Photon Mapping
- Authors: Shilin Zhu, Zexiang Xu, Henrik Wann Jensen, Hao Su, Ravi Ramamoorthi
- Abstract summary: In this paper, we develop the first deep learning-based method for particle-based rendering.
We train a novel deep neural network to predict a kernel function to aggregate photon contributions at shading points.
Our network encodes individual photons into per-photon features, aggregates them in the neighborhood of a shading point, and infers a kernel function from the per-photon and photon local context features.
- Score: 59.41146655216394
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, deep learning-based denoising approaches have led to dramatic
improvements in low sample-count Monte Carlo rendering. These approaches are
aimed at path tracing, which is not ideal for simulating challenging light
transport effects like caustics, where photon mapping is the method of choice.
However, photon mapping requires very large numbers of traced photons to
achieve high-quality reconstructions. In this paper, we develop the first deep
learning-based method for particle-based rendering, and specifically focus on
photon density estimation, the core of all particle-based methods. We train a
novel deep neural network to predict a kernel function to aggregate photon
contributions at shading points. Our network encodes individual photons into
per-photon features, aggregates them in the neighborhood of a shading point to
construct a photon local context vector, and infers a kernel function from the
per-photon and photon local context features. This network is easy to
incorporate in many previous photon mapping methods (by simply swapping the
kernel density estimator) and can produce high-quality reconstructions of
complex global illumination effects like caustics with an order of magnitude
fewer photons compared to previous photon mapping methods.
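The photon density estimation the paper builds on can be illustrated with a conventional fixed-kernel estimator. The sketch below uses Jensen's classic k-nearest-neighbor estimate with a cone kernel; the deep method described above would replace these hand-crafted kernel weights with weights inferred by the trained network from per-photon and local-context features. All function and variable names here are illustrative, not from the paper:

```python
import numpy as np

def photon_density_estimate(shading_point, photon_positions, photon_powers, k=32):
    """Classic k-nearest-neighbor photon density estimate with a cone kernel.

    shading_point:    (3,) query position
    photon_positions: (N, 3) stored photon positions
    photon_powers:    (N, 3) RGB photon power (flux)

    Deep photon mapping would swap the fixed cone-kernel weights below
    for per-photon weights predicted by a neural network.
    """
    d = np.linalg.norm(photon_positions - shading_point, axis=1)
    idx = np.argsort(d)[:k]                  # k nearest photons
    r = d[idx].max()                         # search radius
    w = 1.0 - d[idx] / r                     # cone kernel (farthest photon -> 0)
    area = (1.0 - 2.0 / 3.0) * np.pi * r**2  # cone-filter normalization (k_cone = 1)
    # Weighted sum of photon power over the normalized disc area -> irradiance estimate
    return (w[:, None] * photon_powers[idx]).sum(axis=0) / area
```

Swapping the kernel density estimator, as the abstract suggests, would amount to replacing the `w` computation with a network forward pass while keeping the surrounding gathering logic intact.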
Related papers
- Tutorial: Shaping the Spatial Correlations of Entangled Photon Pairs [1.2316671400812602]
This tutorial describes how the concepts of classical light shaping can be applied to imaging schemes based on entangled photon pairs.
We detail two basic experimental configurations in which a spatial light modulator is used to shape the spatial correlations of a photon pair state.
We showcase two recent examples that expand on these concepts to perform aberration and scattering correction with photon pairs.
arXiv Detail & Related papers (2024-02-12T14:20:33Z)
- NePF: Neural Photon Field for Single-Stage Inverse Rendering [6.977356702921476]
This work applies classical light-shaping concepts to quantum imaging rather than rendering, but shares the theme of controlling and exploiting photon-level statistics.
We present a novel single-stage framework, Neural Photon Field (NePF), to address the ill-posed inverse rendering from multi-view images.
NePF achieves this unification by fully utilizing the physical implication behind the weight function of neural implicit surfaces.
We evaluate our method on both real and synthetic datasets.
arXiv Detail & Related papers (2023-11-20T06:15:46Z)
- Deep Richardson-Lucy Deconvolution for Low-Light Image Deblurring [48.80983873199214]
We develop a data-driven approach to model the saturated pixels by a learned latent map.
Based on the new model, the non-blind deblurring task can be formulated into a maximum a posterior (MAP) problem.
To estimate high-quality deblurred images without amplified artifacts, we develop a prior estimation network.
arXiv Detail & Related papers (2023-08-10T12:53:30Z)
- Image Denoising and the Generative Accumulation of Photons [63.14988413396991]
We show that a network trained to predict where the next photon could arrive is in fact solving the minimum mean square error (MMSE) denoising task.
We present a new strategy for self-supervised denoising.
We present a new method for sampling from the posterior of possible solutions by iteratively sampling and adding small numbers of photons to the image.
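The iterative posterior-sampling procedure described above can be sketched as an accumulation loop: repeatedly draw a small number of photon arrival locations from a per-pixel probability map and add them to a count image. In the actual method the map would be re-predicted by the network after each step; the version below holds it fixed for simplicity, and all names are hypothetical:

```python
import numpy as np

def accumulate_photons(prob_map, n_steps=100, photons_per_step=10, rng=None):
    """Illustrative photon-accumulation loop.

    prob_map: 2-D array of (unnormalized) per-pixel arrival probabilities.
    Returns a count image with n_steps * photons_per_step photons total.

    Simplification: the probability map is fixed here, whereas the paper's
    method would recompute it with a neural network after each step.
    """
    rng = rng or np.random.default_rng()
    flat = prob_map.ravel() / prob_map.sum()   # normalize to a distribution
    counts = np.zeros_like(flat)
    for _ in range(n_steps):
        # Sample a small batch of photon arrival pixels and accumulate them
        hits = rng.choice(flat.size, size=photons_per_step, p=flat)
        np.add.at(counts, hits, 1)
    return counts.reshape(prob_map.shape)
```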
arXiv Detail & Related papers (2023-07-13T08:03:32Z)
- NeFII: Inverse Rendering for Reflectance Decomposition with Near-Field Indirect Illumination [48.42173911185454]
Inverse rendering methods aim to estimate geometry, materials and illumination from multi-view RGB images.
We propose an end-to-end inverse rendering pipeline that decomposes materials and illumination from multi-view images.
arXiv Detail & Related papers (2023-03-29T12:05:19Z)
- Fast simulation for multi-photon, atomic-ensemble quantum model of linear optical systems addressing the curse of dimensionality [0.0]
We decompose the time-evolution operator acting on multiple photons into a group of time-evolution operators, each acting on a single photon.
Our method visualizes the spatial propagation of photons and hence provides insights that aid experiment design for quantum-enabled technologies.
arXiv Detail & Related papers (2023-02-27T16:54:58Z)
- Pixelated Reconstruction of Foreground Density and Background Surface Brightness in Gravitational Lensing Systems using Recurrent Inference Machines [116.33694183176617]
We use a neural network based on the Recurrent Inference Machine to reconstruct an undistorted image of the background source and the lens mass density distribution as pixelated maps.
When compared to more traditional parametric models, the proposed method is significantly more expressive and can reconstruct complex mass distributions.
arXiv Detail & Related papers (2023-01-10T19:00:12Z)
- Uncalibrated Neural Inverse Rendering for Photometric Stereo of General Surfaces [103.08512487830669]
This paper presents an uncalibrated deep neural network framework for the photometric stereo problem.
Existing neural network-based methods either require exact light directions or ground-truth surface normals of the object or both.
We propose an uncalibrated neural inverse rendering approach to this problem.
arXiv Detail & Related papers (2020-12-12T10:33:08Z)
- Fast Correlated-Photon Imaging Enhanced by Deep Learning [5.2618075333626075]
Correlated photon pairs, carrying strong quantum correlations, have been harnessed to bring quantum advantages to various fields.
We present an experimental fast correlated-photon imaging enhanced by deep learning.
arXiv Detail & Related papers (2020-06-16T18:00:42Z)
- Adaptive optics with reflected light and deep neural networks [0.0]
We develop a method for adaptive optics with reflected light and deep neural networks compatible with an epi-detection configuration.
Large datasets of sample aberrations, consisting of excitation- and detection-path aberrations together with the corresponding reflected focus images, are generated.
Deep neural networks can disentangle and independently correct excitation and detection aberrations based on reflected light images recorded from scattering samples.
arXiv Detail & Related papers (2020-04-09T15:39:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.