From Particles to Fields: Reframing Photon Mapping with Continuous Gaussian Photon Fields
- URL: http://arxiv.org/abs/2512.12459v1
- Date: Sat, 13 Dec 2025 21:09:09 GMT
- Title: From Particles to Fields: Reframing Photon Mapping with Continuous Gaussian Photon Fields
- Authors: Jiachen Tao, Benjamin Planche, Van Nguyen Nguyen, Junyi Wu, Yuchun Liu, Haoxuan Wang, Zhongpai Gao, Gengyu Zhang, Meng Zheng, Feiran Wang, Anwesa Choudhuri, Zhenghao Zhao, Weitai Kang, Terrence Chen, Yan Yan, Ziyan Wu,
- Abstract summary: We introduce a learnable representation that encodes photon distributions as anisotropic 3D Gaussian primitives parameterized by position, rotation, scale, and spectrum. Experiments on scenes with complex light transport, such as caustics and specular-diffuse interactions, demonstrate that GPF attains photon-level accuracy while reducing computation by orders of magnitude.
- Score: 33.039991323985056
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurately modeling light transport is essential for realistic image synthesis. Photon mapping provides physically grounded estimates of complex global illumination effects such as caustics and specular-diffuse interactions, yet its per-view radiance estimation remains computationally inefficient when rendering multiple views of the same scene. The inefficiency arises from independent photon tracing and stochastic kernel estimation at each viewpoint, leading to inevitable redundant computation. To accelerate multi-view rendering, we reformulate photon mapping as a continuous and reusable radiance function. Specifically, we introduce the Gaussian Photon Field (GPF), a learnable representation that encodes photon distributions as anisotropic 3D Gaussian primitives parameterized by position, rotation, scale, and spectrum. GPF is initialized from physically traced photons in the first SPPM iteration and optimized using multi-view supervision of final radiance, distilling photon-based light transport into a continuous field. Once trained, the field enables differentiable radiance evaluation along camera rays without repeated photon tracing or iterative refinement. Extensive experiments on scenes with complex light transport, such as caustics and specular-diffuse interactions, demonstrate that GPF attains photon-level accuracy while reducing computation by orders of magnitude, unifying the physical rigor of photon-based rendering with the efficiency of neural scene representations.
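The abstract describes evaluating radiance along camera rays from a field of anisotropic 3D Gaussians. The following is a minimal sketch of that idea only, not the authors' implementation: all function and parameter names are illustrative assumptions, each primitive is reduced to a (position, inverse covariance, RGB spectrum) triple, and the optimization against multi-view supervision is omitted.

```python
import numpy as np

# Illustrative Gaussian Photon Field (GPF) sketch: each primitive is an
# anisotropic 3D Gaussian with an RGB "spectrum"; radiance along a camera
# ray is accumulated by sampling the field at points on the ray.

def gaussian_density(x, mean, cov_inv):
    """Unnormalized anisotropic 3D Gaussian evaluated at point x."""
    d = x - mean
    return np.exp(-0.5 * d @ cov_inv @ d)

def radiance_along_ray(origin, direction, gaussians, n_samples=64, t_max=10.0):
    """Accumulate radiance along a ray through the field.

    `gaussians` is a list of (mean, cov_inv, spectrum) triples; in the paper
    these parameters would be optimized from multi-view radiance, whereas
    here they are fixed inputs.
    """
    ts = np.linspace(0.0, t_max, n_samples)
    radiance = np.zeros(3)
    for t in ts:
        x = origin + t * direction
        for mean, cov_inv, spectrum in gaussians:
            radiance += gaussian_density(x, mean, cov_inv) * spectrum
    return radiance * (t_max / n_samples)  # simple quadrature weight

# Smoke test: one Gaussian centered on the ray axis
g = (np.array([0.0, 0.0, 2.0]), np.eye(3) * 4.0, np.array([1.0, 0.5, 0.2]))
c = radiance_along_ray(np.zeros(3), np.array([0.0, 0.0, 1.0]), [g])
```

Because evaluation is a sum of smooth Gaussian terms, the result is differentiable in all primitive parameters, which is what makes gradient-based fitting of the field possible.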
Related papers
- Multiphoton Hong-Ou-Mandel Interference Enables Superresolution of Bright Thermal Sources [0.0]
The scheme employs multiphoton interference with a reference single-photon Fock state at a beamsplitter. Even-photon-number coincidences exhibit constant precision in the sub-Rayleigh regime. Our scheme offers a robust alternative for non-invasive single-particle tracking and imaging of bright sources in nanoscopic chemical and biological systems.
arXiv Detail & Related papers (2026-02-23T12:24:22Z)
- Image-Plane Detection of Spatially Entangled Photon Pairs with a CMOS Camera [0.0]
Spatially entangled photon pairs (biphotons) generated by spontaneous parametric down-conversion offer unique opportunities for quantum imaging. Previous camera-based biphoton imaging experiments have relied on photon-counting detection. We demonstrate the detection of spatial biphoton joint probability distributions in both the image plane and the pupil plane.
arXiv Detail & Related papers (2025-12-31T14:15:59Z)
- Optical Integration With Heralded Single Photons [0.0]
We experimentally harness the transverse spatial degrees of freedom of light within an optical processing framework based on heralded single photons. The integration is performed over binary phase patterns encoded via a phase-only spatial light modulator, with polarization serving as an auxiliary degree of freedom.
arXiv Detail & Related papers (2025-08-28T18:52:43Z)
- A framework for extracting the rates of photophysical processes from biexponentially decaying photon emission data [0.0]
We develop a model that includes trapping and release of carriers by optically inactive states.
The model also allows determination of likelihood intervals for all the transition rates involved in the emission dynamics.
We demonstrate the value of this model by applying it to time resolved photoluminescence measurements of CdSeTe/CdS heterostructures.
arXiv Detail & Related papers (2024-08-22T08:14:51Z)
- Spatial super-resolution in nanosensing with blinking emitters [79.16635054977068]
We propose a method of spatial resolution enhancement in metrology (thermometry, magnetometry, pH estimation, and similar methods) with blinking fluorescent nanosensors. We believe that blinking fluorescent sensing agents, complemented with the developed image analysis technique, could be used routinely in the life science sector.
arXiv Detail & Related papers (2024-02-27T10:38:05Z)
- NeFII: Inverse Rendering for Reflectance Decomposition with Near-Field Indirect Illumination [48.42173911185454]
Inverse rendering methods aim to estimate geometry, materials and illumination from multi-view RGB images.
We propose an end-to-end inverse rendering pipeline that decomposes materials and illumination from multi-view images.
arXiv Detail & Related papers (2023-03-29T12:05:19Z)
- On-chip quantum information processing with distinguishable photons [55.41644538483948]
Multi-photon interference is at the heart of photonic quantum technologies.
Here, we experimentally demonstrate that detection can be implemented with a temporal resolution sufficient to interfere photons detuned on the scales necessary for cavity-based integrated photon sources.
We show how time-resolved detection of non-ideal photons can be used to improve the fidelity of an entangling operation and to mitigate the reduction of computational complexity in boson sampling experiments.
arXiv Detail & Related papers (2022-10-14T18:16:49Z)
- Second-order correlations and purity of unheralded single photons from spontaneous parametric down-conversion [1.7396274240172125]
Various quantum technology applications require high-purity single photons with high generation rate.
We present a revised expression to calculate the second-order temporal correlation function, $g^{(2)}$, for any fixed time window (bin).
arXiv Detail & Related papers (2022-07-14T15:09:58Z)
- Learning Neural Transmittance for Efficient Rendering of Reflectance Fields [43.24427791156121]
We propose a novel method based on precomputed Neural Transmittance Functions to accelerate rendering of neural reflectance fields.
Results on real and synthetic scenes demonstrate almost two orders of magnitude speedup for renderings under environment maps with minimal accuracy loss.
arXiv Detail & Related papers (2021-10-25T21:12:25Z)
- Regularization by Denoising Sub-sampled Newton Method for Spectral CT Multi-Material Decomposition [78.37855832568569]
We propose to solve a model-based maximum a posteriori problem to reconstruct multi-material images, with application to spectral CT.
In particular, we propose to solve a regularized optimization problem based on a plug-in image-denoising function.
We show numerical and experimental results for spectral CT materials decomposition.
arXiv Detail & Related papers (2021-03-25T15:20:10Z)
- A bright and fast source of coherent single photons [46.25143811066789]
A single photon source is a key enabling technology in device-independent quantum communication.
We report a single photon source with an especially high system efficiency.
arXiv Detail & Related papers (2020-07-24T17:08:46Z)
- Deep Photon Mapping [59.41146655216394]
In this paper, we develop the first deep learning-based method for particle-based rendering.
We train a novel deep neural network to predict a kernel function to aggregate photon contributions at shading points.
Our network encodes individual photons into per-photon features, aggregates them in the neighborhood of a shading point, and infers a kernel function from the per-photon and photon local context features.
arXiv Detail & Related papers (2020-04-25T06:59:10Z)
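The Deep Photon Mapping summary above describes predicting a kernel function to aggregate photon contributions at shading points. The sketch below shows only the classical baseline that such a network replaces, kernel-weighted photon density estimation with a fixed Epanechnikov kernel; the function and parameter names are illustrative assumptions, not the paper's code.

```python
import numpy as np

# Classical photon-mapping radiance estimate at a shading point: average the
# powers of nearby photons under a fixed density-estimation kernel. Deep
# Photon Mapping replaces this fixed kernel with one inferred by a neural
# network from per-photon and local-context features.

def kernel_radiance_estimate(shading_point, photon_positions, photon_powers, radius):
    """Kernel-weighted estimate from photons within `radius` of the point."""
    d = np.linalg.norm(photon_positions - shading_point, axis=1)
    inside = d < radius
    # Epanechnikov kernel, normalized over the disc of the given radius
    w = np.where(inside, 1.0 - (d / radius) ** 2, 0.0)
    area = np.pi * radius ** 2
    return (w[:, None] * photon_powers).sum(axis=0) * (2.0 / area)

# Synthetic photons with uniform positions and equal RGB power
rng = np.random.default_rng(0)
positions = rng.uniform(-1.0, 1.0, size=(1000, 3))
powers = np.full((1000, 3), 0.01)
est = kernel_radiance_estimate(np.zeros(3), positions, powers, radius=0.5)
```

The fixed kernel's bias-variance trade-off depends entirely on `radius`; learning the aggregation instead lets the kernel adapt to the local photon distribution.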
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.