Towards Physically-Based Sky-Modeling
- URL: http://arxiv.org/abs/2412.11883v1
- Date: Mon, 16 Dec 2024 15:32:05 GMT
- Title: Towards Physically-Based Sky-Modeling
- Authors: Ian J. Maquignaz
- Abstract summary: We propose an all-weather sky-model, learning weathered skies directly from physically captured HDR imagery. Our model (AllSky) allows for emulation of physically captured environment maps with improved retention of the Extended Dynamic Range (EDR) of the sky.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate environment maps are a key component in rendering photorealistic outdoor scenes with coherent illumination. They enable captivating visual arts, immersive virtual reality and a wide range of engineering and scientific applications. Recent works have extended sky-models to be more comprehensive and inclusive of cloud formations, but existing approaches fall short of faithfully recreating key characteristics of physically captured HDRI. As we demonstrate, environment maps produced by sky-models do not relight scenes with the same tones, shadows, and illumination coherence as physically captured HDR imagery. Though the visual quality of DNN-generated LDR and HDR imagery has greatly progressed in recent years, we demonstrate this progress to be tangential to sky-modelling. Due to the Extended Dynamic Range (EDR) of 14 EV required for outdoor environment maps inclusive of the sun, sky-modelling extends beyond the conventional paradigm of High Dynamic Range Imagery (HDRI). In this work, we propose an all-weather sky-model, learning weathered skies directly from physically captured HDR imagery. Given user-controlled positioning of the sun and cloud formations, our model (AllSky) emulates physically captured environment maps with improved retention of the Extended Dynamic Range (EDR) of the sky.
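As an illustration of the Extended Dynamic Range argument above, the following minimal sketch (not part of the paper) measures the dynamic range of a captured environment map in EV stops. It assumes the map has already been loaded as a linear-radiance float32 array with any HDR-capable reader; a sun-inclusive outdoor capture should report on the order of 14 EV, well beyond what conventional clipped HDRI retains.

```python
import numpy as np

def dynamic_range_ev(envmap: np.ndarray, dark_percentile: float = 0.1) -> float:
    """Dynamic range of a linear-radiance HxWx3 environment map, in EV stops."""
    # Rec. 709 weights reduce RGB radiance to a scalar luminance per pixel.
    lum = (0.2126 * envmap[..., 0]
           + 0.7152 * envmap[..., 1]
           + 0.0722 * envmap[..., 2])
    lum = lum[lum > 0.0]                       # ignore empty or zero-valued pixels
    lo = np.percentile(lum, dark_percentile)   # robust dark bound
    hi = lum.max()                             # keep the peak: the sun covers only a few pixels
    return float(np.log2(hi / lo))

# Usage (hypothetical file name): load a .hdr/.exr capture into `envmap` with
# your HDR reader of choice, then:
#   print(f"{dynamic_range_ev(envmap):.1f} EV")
```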
Related papers
- Towards Physically-Based Sky-Modeling For Image Based Lighting [0.0]
Environment maps are a key component for rendering photorealistic outdoor scenes with coherent illumination. Recent works have extended sky-models to be more comprehensive and inclusive of cloud formations but, as we demonstrate, existing methods fall short in faithfully recreating natural skies. We propose AllSky, a flexible all-weather sky-model learned directly from physically captured HDRI.
arXiv Detail & Related papers (2025-12-15T16:44:38Z) - Reconstructing 3D Scenes in Native High Dynamic Range [82.90064638813185]
We present the first method for 3D scene reconstruction that directly models native HDR observations. We propose Native High Dynamic Range 3D Gaussian Splatting (NH-3DGS), which preserves the full dynamic range throughout the reconstruction pipeline. We demonstrate on both synthetic and real multi-view HDR datasets that NH-3DGS significantly outperforms existing methods in reconstruction quality and dynamic range preservation.
arXiv Detail & Related papers (2025-11-17T02:33:31Z) - Dynamic Novel View Synthesis in High Dynamic Range [78.72910306733607]
Current methods primarily focus on static scenes, implicitly assuming all scene elements remain stationary and non-living. We introduce HDR-4DGS, a Gaussian Splatting-based architecture featuring an innovative dynamic tone-mapping module. Experiments demonstrate that HDR-4DGS surpasses existing state-of-the-art methods in both quantitative performance and visual fidelity.
arXiv Detail & Related papers (2025-09-26T04:29:22Z) - S2R-HDR: A Large-Scale Rendered Dataset for HDR Fusion [4.684215759472536]
S2R-HDR is the first large-scale high-quality synthetic dataset for HDR fusion, with 24,000 HDR samples.
We design a diverse set of realistic HDR scenes that encompass various dynamic elements, motion types, high dynamic range scenes, and lighting.
We introduce S2R-Adapter, a domain adaptation method designed to bridge the gap between synthetic and real-world data.
arXiv Detail & Related papers (2025-04-10T11:39:56Z) - LEDiff: Latent Exposure Diffusion for HDR Generation [11.669442066168244]
LEDiff is a method that enables a generative model to produce HDR content through latent-space exposure fusion techniques.
It also functions as an LDR-to-HDR converter, expanding the dynamic range of existing low dynamic range images.
arXiv Detail & Related papers (2024-12-19T02:15:55Z) - Radiance Fields from Photons [16.150757184515037]
We introduce quanta radiance fields, a class of neural radiance fields that are trained at the granularity of individual photons using single-photon cameras (SPCs).
We demonstrate, both via simulations and prototype SPC hardware, high-fidelity reconstructions under high-speed motion, in low light, and for extreme dynamic range settings.
arXiv Detail & Related papers (2024-07-12T16:06:51Z) - Cinematic Gaussians: Real-Time HDR Radiance Fields with Depth of Field [23.92087253022495]
Radiance field methods represent the state of the art in reconstructing complex scenes from multi-view photos.
Their reliance on a pinhole camera model, assuming all scene elements are in focus in the input images, presents practical challenges and complicates refocusing during novel-view synthesis.
We present a lightweight analytical approach based on 3D Gaussian Splatting that utilizes multi-view LDR images with varying exposure times, aperture sizes, and focus distances as input to reconstruct a high-dynamic-range scene.
arXiv Detail & Related papers (2024-06-11T15:00:24Z) - HDR-GS: Efficient High Dynamic Range Novel View Synthesis at 1000x Speed via Gaussian Splatting [76.5908492298286]
Existing HDR NVS methods are mainly based on NeRF.
They suffer from long training time and slow inference speed.
We propose a new framework, High Dynamic Range Gaussian Splatting (HDR-GS).
arXiv Detail & Related papers (2024-05-24T00:46:58Z) - Pano-NeRF: Synthesizing High Dynamic Range Novel Views with Geometry from Sparse Low Dynamic Range Panoramic Images [82.1477261107279]
We propose irradiance fields from sparse LDR panoramic images to increase the observation counts for faithful geometry recovery.
Experiments demonstrate that the irradiance fields outperform state-of-the-art methods on both geometry recovery and HDR reconstruction.
arXiv Detail & Related papers (2023-12-26T08:10:22Z) - Spatiotemporally Consistent HDR Indoor Lighting Estimation [66.26786775252592]
We propose a physically-motivated deep learning framework to solve the indoor lighting estimation problem.
Given a single LDR image with a depth map, our method predicts spatially consistent lighting at any given image position.
Our framework achieves photorealistic lighting prediction with higher quality compared to state-of-the-art single-image or video-based methods.
arXiv Detail & Related papers (2023-05-07T20:36:29Z) - Deep Dynamic Cloud Lighting [3.4442294678697385]
We propose a solution which enables whole-sky dynamic cloud lighting for the first time.
We synthesise a multi-timescale sky appearance model which learns to predict the sky illumination over various timescales.
arXiv Detail & Related papers (2023-04-18T22:02:54Z) - GlowGAN: Unsupervised Learning of HDR Images from LDR Images in the Wild [74.52723408793648]
We present the first method for learning a generative model of HDR images from in-the-wild LDR image collections in a fully unsupervised manner.
The key idea is to train a generative adversarial network (GAN) to generate HDR images which, when projected to LDR under various exposures, are indistinguishable from real LDR images (a minimal sketch of this exposure projection appears after this list).
Experiments show that our method GlowGAN can synthesize photorealistic HDR images in many challenging cases such as landscapes, lightning, or windows.
arXiv Detail & Related papers (2022-11-22T15:42:08Z) - HDR-NeRF: High Dynamic Range Neural Radiance Fields [70.80920996881113]
We present High Dynamic Range Neural Radiance Fields (HDR-NeRF) to recover an HDR radiance field from a set of low dynamic range (LDR) views with different exposures.
We are able to generate both novel HDR views and novel LDR views under different exposures.
arXiv Detail & Related papers (2021-11-29T11:06:39Z) - Beyond Visual Attractiveness: Physically Plausible Single Image HDR Reconstruction for Spherical Panoramas [60.24132321381606]
We introduce physical illuminance constraints into our single-shot HDR reconstruction framework.
Our method can generate HDRs which are not only visually appealing but also physically plausible.
arXiv Detail & Related papers (2021-03-24T01:51:19Z)
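Below is a minimal sketch of the HDR-to-LDR exposure projection referenced in the GlowGAN entry above. It assumes a simple gain, clip, and gamma camera model; the exposure range and gamma value are illustrative assumptions rather than any paper's exact camera response.

```python
import numpy as np

def project_to_ldr(hdr: np.ndarray, exposure_ev: float, gamma: float = 2.2) -> np.ndarray:
    """Simulate an LDR capture of a linear-radiance HDR image at a given exposure."""
    exposed = hdr * (2.0 ** exposure_ev)               # exposure gain in EV stops
    ldr = np.clip(exposed, 0.0, 1.0) ** (1.0 / gamma)  # clip highlights, apply display gamma
    return ldr

# In a GlowGAN-style setup, a random exposure would be drawn per sample so the
# discriminator only ever sees LDR projections of the generated HDR image:
#   ldr_fake = project_to_ldr(hdr_fake, exposure_ev=np.random.uniform(-3.0, 3.0))
```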