HDR-NeRF: High Dynamic Range Neural Radiance Fields
- URL: http://arxiv.org/abs/2111.14451v4
- Date: Tue, 25 Apr 2023 11:49:08 GMT
- Title: HDR-NeRF: High Dynamic Range Neural Radiance Fields
- Authors: Xin Huang, Qi Zhang, Ying Feng, Hongdong Li, Xuan Wang, Qing Wang
- Abstract summary: We present High Dynamic Range Neural Radiance Fields (HDR-NeRF) to recover an HDR radiance field from a set of low dynamic range (LDR) views with different exposures.
We are able to generate both novel HDR views and novel LDR views under different exposures.
- Score: 70.80920996881113
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We present High Dynamic Range Neural Radiance Fields (HDR-NeRF) to recover an
HDR radiance field from a set of low dynamic range (LDR) views with different
exposures. Using the HDR-NeRF, we are able to generate both novel HDR views and
novel LDR views under different exposures. The key to our method is to model
the physical imaging process, which dictates that the radiance of a scene point
transforms to a pixel value in the LDR image with two implicit functions: a
radiance field and a tone mapper. The radiance field encodes the scene radiance
(with values ranging from 0 to +infty) and outputs the density and radiance of a
ray given the corresponding ray origin and ray direction. The tone mapper models
the process by which a ray hitting the camera sensor becomes a pixel
value. The color of the ray is predicted by feeding the radiance and the
corresponding exposure time into the tone mapper. We use the classic volume
rendering technique to project the output radiance, colors, and densities into
HDR and LDR images, while only the input LDR images are used as supervision.
We collect a new forward-facing HDR dataset to evaluate the
proposed method. Experimental results on synthetic and real-world scenes
validate that our method can not only accurately control the exposures of
synthesized views but also render views with a high dynamic range.
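For readers who want a concrete picture of the two implicit functions, here is a minimal PyTorch sketch of the pipeline described in the abstract. The module structure, layer sizes, and the log-domain input to the tone mapper are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class RadianceField(nn.Module):
    """Implicit scene function: (point, view direction) -> (density, HDR radiance)."""
    def __init__(self, hidden=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(6, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),  # 1 density + 3 log-radiance channels
        )

    def forward(self, xyz, direction):
        out = self.mlp(torch.cat([xyz, direction], dim=-1))
        sigma = torch.relu(out[..., :1])       # density >= 0
        radiance = torch.exp(out[..., 1:])     # HDR radiance in (0, +inf)
        return sigma, radiance

class ToneMapper(nn.Module):
    """Implicit camera response: log radiance + log exposure time -> LDR value in [0, 1]."""
    def __init__(self, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, radiance, exposure_time):
        x = torch.log(radiance + 1e-8) + torch.log(exposure_time)
        return self.mlp(x.unsqueeze(-1)).squeeze(-1)  # applied per color channel

def composite(sigma, values, deltas):
    """Classic volume rendering: alpha-composite per-sample values along each ray.
    sigma: (rays, samples, 1), values: (rays, samples, C), deltas: (rays, samples)."""
    alpha = 1.0 - torch.exp(-sigma.squeeze(-1) * deltas)
    trans = torch.cumprod(
        torch.cat([torch.ones_like(alpha[:, :1]), 1.0 - alpha + 1e-10], dim=1),
        dim=1)[:, :-1]
    weights = alpha * trans
    return (weights.unsqueeze(-1) * values).sum(dim=1)

# Rendering a batch of rays (the LDR output is supervised; the HDR output is free):
field, tone = RadianceField(), ToneMapper()
xyz = torch.rand(1024, 64, 3)                 # sample points along 1024 rays
dirs = torch.rand(1024, 64, 3)
deltas = torch.full((1024, 64), 0.01)         # distances between adjacent samples
dt = torch.tensor(1 / 60.0)                   # exposure time of the target view
sigma, e = field(xyz, dirs)
ldr = composite(sigma, tone(e, dt), deltas)   # compared against the input LDR pixels
hdr = composite(sigma, e, deltas)             # novel HDR view, no HDR supervision needed
```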
Related papers
- HDR-GS: Efficient High Dynamic Range Novel View Synthesis at 1000x Speed via Gaussian Splatting [76.5908492298286]
Existing HDR NVS methods are mainly based on NeRF.
They suffer from long training times and slow inference speed.
We propose a new framework, High Dynamic Range Gaussian Splatting (HDR-GS).
arXiv Detail & Related papers (2024-05-24T00:46:58Z) - Fast High Dynamic Range Radiance Fields for Dynamic Scenes [39.3304365600248]
We propose a dynamic HDR NeRF framework, named HDR-HexPlane, which can learn 3D scenes from dynamic 2D images captured with various exposures.
With the proposed model, high-quality novel-view images at any time point can be rendered with any desired exposure.
arXiv Detail & Related papers (2024-01-11T17:15:16Z) - Pano-NeRF: Synthesizing High Dynamic Range Novel Views with Geometry
from Sparse Low Dynamic Range Panoramic Images [82.1477261107279]
We propose irradiance fields from sparse LDR panoramic images, which increase the observation counts for faithful geometry recovery.
Experiments demonstrate that the irradiance fields outperform state-of-the-art methods on both geometry recovery and HDR reconstruction.
arXiv Detail & Related papers (2023-12-26T08:10:22Z) - Towards High-quality HDR Deghosting with Conditional Diffusion Models [88.83729417524823]
High Dynamic Range (HDR) images can be recovered from several Low Dynamic Range (LDR) images with existing Deep Neural Network (DNN) techniques.
However, DNNs still generate ghosting artifacts when the LDR images contain saturation and large motion.
We formulate the HDR deghosting problem as an image generation task that leverages LDR features as the diffusion model's condition (a schematic sketch follows this entry).
arXiv Detail & Related papers (2023-11-02T01:53:55Z) - GlowGAN: Unsupervised Learning of HDR Images from LDR Images in the Wild [74.52723408793648]
- GlowGAN: Unsupervised Learning of HDR Images from LDR Images in the Wild [74.52723408793648]
We present the first method for learning a generative model of HDR images from in-the-wild LDR image collections in a fully unsupervised manner.
The key idea is to train a generative adversarial network (GAN) to generate HDR images which, when projected to LDR under various exposures, are indistinguishable from real LDR images.
Experiments show that our method GlowGAN can synthesize photorealistic HDR images in many challenging cases such as landscapes, lightning, or windows (a toy sketch of the exposure-projection idea follows this entry).
arXiv Detail & Related papers (2022-11-22T15:42:08Z) - Casual Indoor HDR Radiance Capture from Omnidirectional Images [6.558757117312684]
- Casual Indoor HDR Radiance Capture from Omnidirectional Images [6.558757117312684]
We present PanoHDR-NeRF, a novel pipeline to casually capture a plausible full HDR radiance field of a large indoor scene.
The resulting PanoHDR-NeRF pipeline can estimate full HDR panoramas from any location in the scene.
arXiv Detail & Related papers (2022-08-16T18:45:27Z) - HDR-Plenoxels: Self-Calibrating High Dynamic Range Radiance Fields [15.32264927462068]
We learn a plenoptic function of 3D HDR radiance fields, geometry information, and varying camera settings inherent in 2D low dynamic range (LDR) images.
Our voxel-based volume rendering pipeline reconstructs HDR radiance fields with only multi-view LDR images.
Our experiments show that HDR-Plenoxels can render detailed, high-quality HDR novel views from only LDR images captured with various cameras.
arXiv Detail & Related papers (2022-08-14T06:12:22Z) - HDR-GAN: HDR Image Reconstruction from Multi-Exposed LDR Images with
Large Motions [62.44802076971331]
We propose a novel GAN-based model, HDR-GAN, for synthesizing HDR images from multi-exposed LDR images.
By incorporating adversarial learning, our method is able to produce faithful information in the regions with missing content.
arXiv Detail & Related papers (2020-07-03T11:42:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.