PanDORA: Casual HDR Radiance Acquisition for Indoor Scenes
- URL: http://arxiv.org/abs/2407.06150v1
- Date: Mon, 8 Jul 2024 17:22:27 GMT
- Title: PanDORA: Casual HDR Radiance Acquisition for Indoor Scenes
- Authors: Mohammad Reza Karimi Dastjerdi, Frédéric Fortier-Chouinard, Yannick Hold-Geoffroy, Marc Hébert, Claude Demers, Nima Kalantari, Jean-François Lalonde
- Abstract summary: We present PanDORA: a PANoramic Dual-Observer Radiance Acquisition system for the casual capture of indoor scenes in high dynamic range.
Our proposed system comprises two 360° cameras rigidly attached to a portable tripod.
The cameras simultaneously acquire two 360° videos: one at a regular exposure and the other at a very fast exposure.
The resulting images are fed to a NeRF-based algorithm that reconstructs the scene's full high dynamic range.
- Score: 13.790885617434197
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most novel view synthesis methods such as NeRF are unable to capture the true high dynamic range (HDR) radiance of scenes since they are typically trained on photos captured with standard low dynamic range (LDR) cameras. While the traditional exposure bracketing approach, which captures several images at different exposures, has recently been adapted to the multi-view case, we find such methods fall short of capturing the full dynamic range of indoor scenes, which includes very bright light sources. In this paper, we present PanDORA: a PANoramic Dual-Observer Radiance Acquisition system for the casual capture of indoor scenes in high dynamic range. Our proposed system comprises two 360° cameras rigidly attached to a portable tripod. The cameras simultaneously acquire two 360° videos: one at a regular exposure and the other at a very fast exposure, allowing a user to simply wave the apparatus casually around the scene in a matter of minutes. The resulting images are fed to a NeRF-based algorithm that reconstructs the scene's full high dynamic range. Compared to HDR baselines from previous work, our approach reconstructs the full HDR radiance of indoor scenes without sacrificing visual quality while retaining the ease of capture of recent NeRF-like approaches.
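The dual-exposure capture described above can be merged into an HDR radiance estimate along these lines. This is a minimal sketch and not the authors' NeRF-based implementation; it assumes linearized pixel values (inverse camera response already applied) and known exposure times, with the fast exposure filling in pixels that saturate in the regular one.

```python
import numpy as np

def merge_dual_exposure(ldr_regular, ldr_fast, t_regular, t_fast, sat=0.95):
    """Merge two linearized LDR exposures into an HDR radiance estimate.

    ldr_regular, ldr_fast: float arrays in [0, 1], already linearized.
    t_regular, t_fast: exposure times of the two captures.
    Pixels that clip in the regular exposure are filled from the fast one.
    """
    # Per-pixel radiance estimates: linear pixel value / exposure time.
    rad_regular = ldr_regular / t_regular
    rad_fast = ldr_fast / t_fast
    # Trust the regular exposure except where it clips near saturation.
    weight = (ldr_regular < sat).astype(np.float64)
    return weight * rad_regular + (1.0 - weight) * rad_fast
```

For a pixel at 0.5 in the regular exposure the regular estimate is used; for a fully saturated pixel the (noisier but unclipped) fast-exposure estimate takes over, recovering bright light sources.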
Related papers
- Exposure Completing for Temporally Consistent Neural High Dynamic Range Video Rendering [17.430726543786943]
We propose a novel paradigm to render HDR frames via completing the absent exposure information.
Our approach involves interpolating neighbor LDR frames in the time dimension to reconstruct LDR frames for the absent exposures.
This benefits the fusion process for HDR results, reducing noise and ghosting artifacts and thereby improving temporal consistency.
arXiv Detail & Related papers (2024-07-18T09:13:08Z)
- Cinematic Gaussians: Real-Time HDR Radiance Fields with Depth of Field [23.92087253022495]
Radiance field methods represent the state of the art in reconstructing complex scenes from multi-view photos.
Their reliance on a pinhole camera model, assuming all scene elements are in focus in the input images, presents practical challenges and complicates refocusing during novel-view synthesis.
We present a lightweight method based on 3D Gaussian Splatting that takes multi-view LDR images with varying exposure times, apertures, and focus distances as input to reconstruct a high-dynamic-range scene.
arXiv Detail & Related papers (2024-06-11T15:00:24Z)
- HDR-GS: Efficient High Dynamic Range Novel View Synthesis at 1000x Speed via Gaussian Splatting [76.5908492298286]
Existing HDR NVS methods are mainly based on NeRF.
They suffer from long training time and slow inference speed.
We propose a new framework, High Dynamic Range Gaussian Splatting (HDR-GS).
arXiv Detail & Related papers (2024-05-24T00:46:58Z)
- Fast High Dynamic Range Radiance Fields for Dynamic Scenes [39.3304365600248]
We propose a dynamic HDR NeRF framework, named HDR-HexPlane, which can learn 3D scenes from dynamic 2D images captured with various exposures.
With the proposed model, high-quality novel-view images at any time point can be rendered with any desired exposure.
arXiv Detail & Related papers (2024-01-11T17:15:16Z)
- Pano-NeRF: Synthesizing High Dynamic Range Novel Views with Geometry from Sparse Low Dynamic Range Panoramic Images [82.1477261107279]
We propose irradiance fields estimated from sparse LDR panoramic images, which increase the observation counts for faithful geometry recovery.
Experiments demonstrate that the irradiance fields outperform state-of-the-art methods on both geometry recovery and HDR reconstruction.
arXiv Detail & Related papers (2023-12-26T08:10:22Z)
- GlowGAN: Unsupervised Learning of HDR Images from LDR Images in the Wild [74.52723408793648]
We present the first method for learning a generative model of HDR images from in-the-wild LDR image collections in a fully unsupervised manner.
The key idea is to train a generative adversarial network (GAN) to generate HDR images which, when projected to LDR under various exposures, are indistinguishable from real LDR images.
Experiments show that our method GlowGAN can synthesize photorealistic HDR images in many challenging cases such as landscapes, lightning, or windows.
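GlowGAN's key idea of projecting a generated HDR image to LDR under a given exposure can be sketched as follows. This is an illustrative approximation, assuming a simple clipping sensor model and a fixed gamma curve rather than the learned camera response a full implementation would use.

```python
import numpy as np

def project_hdr_to_ldr(hdr, exposure, gamma=2.2):
    """Simulate an LDR capture of an HDR image at a given exposure.

    Scales radiance by the exposure, clips to the sensor's [0, 1] range,
    and applies a gamma curve as a stand-in for the camera response.
    """
    scaled = np.clip(hdr * exposure, 0.0, 1.0)
    return scaled ** (1.0 / gamma)
```

Because clipping discards information, many such projections at different exposures are needed before the discriminator's "indistinguishable from real LDR" signal constrains the underlying HDR radiance.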
arXiv Detail & Related papers (2022-11-22T15:42:08Z)
- Casual Indoor HDR Radiance Capture from Omnidirectional Images [6.558757117312684]
We present PanoHDR-NeRF, a novel pipeline to casually capture a plausible full HDR radiance field of a large indoor scene.
The resulting PanoHDR-NeRF pipeline can estimate full HDR panoramas from any location of the scene.
arXiv Detail & Related papers (2022-08-16T18:45:27Z)
- Self-supervised HDR Imaging from Motion and Exposure Cues [14.57046548797279]
We propose a novel self-supervised approach for learnable HDR estimation that alleviates the need for HDR ground-truth labels.
Experimental results show that the HDR models trained using our proposed self-supervision approach achieve performance competitive with those trained under full supervision.
arXiv Detail & Related papers (2022-03-23T10:22:03Z)
- Multi-Bracket High Dynamic Range Imaging with Event Cameras [46.81570594990517]
We propose the first multi-bracket HDR pipeline combining a standard camera with an event camera.
Our results show better overall robustness when using events, with improvements in PSNR by up to 5 dB on synthetic data and up to 0.7 dB on real-world data.
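The PSNR figures quoted in such comparisons follow the standard definition; a minimal computation for images normalized to [0, 1] looks like:

```python
import numpy as np

def psnr(reference, estimate, max_val=1.0):
    """Peak signal-to-noise ratio in dB between two images in [0, max_val]."""
    mse = np.mean((reference - estimate) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)
```

A uniform error of 0.1 against a [0, 1] reference gives an MSE of 0.01 and hence 20 dB; each 10x reduction in MSE adds 10 dB.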
arXiv Detail & Related papers (2022-03-13T11:10:47Z)
- Neural Radiance Fields for Outdoor Scene Relighting [70.97747511934705]
We present NeRF-OSR, the first approach for outdoor scene relighting based on neural radiance fields.
In contrast to the prior art, our technique allows simultaneous editing of both scene illumination and camera viewpoint.
It also includes a dedicated network for shadow reproduction, which is crucial for high-quality outdoor scene relighting.
arXiv Detail & Related papers (2021-12-09T18:59:56Z)
- HDR-NeRF: High Dynamic Range Neural Radiance Fields [70.80920996881113]
We present High Dynamic Range Neural Radiance Fields (HDR-NeRF) to recover an HDR radiance field from a set of low dynamic range (LDR) views with different exposures.
We are able to generate both novel HDR views and novel LDR views under different exposures.
arXiv Detail & Related papers (2021-11-29T11:06:39Z)