EverLight: Indoor-Outdoor Editable HDR Lighting Estimation
- URL: http://arxiv.org/abs/2304.13207v2
- Date: Mon, 21 Aug 2023 18:53:15 GMT
- Title: EverLight: Indoor-Outdoor Editable HDR Lighting Estimation
- Authors: Mohammad Reza Karimi Dastjerdi, Jonathan Eisenmann, Yannick
Hold-Geoffroy, Jean-François Lalonde
- Abstract summary: We propose a method which combines a parametric light model with 360° panoramas, ready to use as HDRI in rendering engines.
In our representation, users can easily edit light direction, intensity, number, etc. to impact shading, while the generated panorama provides rich, complex reflections that blend seamlessly with the edits.
- Score: 9.443561684223514
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Because of the diversity in lighting environments, existing illumination
estimation techniques have been designed explicitly for indoor or outdoor
environments. Methods have focused either on capturing accurate energy
(e.g., through parametric lighting models), which emphasizes shading and strong
cast shadows, or on producing plausible texture (e.g., with GANs), which
prioritizes plausible reflections. Approaches that provide editable lighting
capabilities have been proposed, but these tend to rely on simplified lighting
models, offering limited realism. In this work, we propose to bridge the gap
between these recent trends in the literature with a method that combines a
parametric light model with 360° panoramas, ready to use as HDRI in rendering
engines. We leverage recent advances in GAN-based LDR panorama extrapolation
from a regular image, which we extend to HDR using parametric spherical
Gaussians. To achieve this, we introduce a novel lighting co-modulation method
that injects lighting-related features throughout the generator, tightly
coupling the original or edited scene illumination with the panorama generation
process. In our representation, users can easily edit light direction,
intensity, number, etc. to impact shading, while the generated panorama
provides rich, complex reflections that blend seamlessly with the edits.
Furthermore, our method encompasses indoor and outdoor environments,
demonstrating state-of-the-art results even when compared to domain-specific
methods.
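The abstract's key representational idea is extending an LDR panorama to HDR with parametric spherical Gaussian lobes, each defined by a direction, sharpness, and color amplitude. A minimal sketch of evaluating such lobes over an equirectangular panorama (not the authors' implementation; all parameter values and function names here are illustrative):

```python
import numpy as np

def sg_radiance(directions, axis, sharpness, amplitude):
    """Evaluate one spherical Gaussian lobe G(v) = a * exp(s * (v . axis - 1))."""
    cosine = directions @ axis                      # (H, W) dot products
    return amplitude * np.exp(sharpness * (cosine - 1.0))[..., None]

def panorama_directions(h, w):
    """Unit view direction for each pixel of an equirectangular panorama."""
    theta = (np.arange(h) + 0.5) / h * np.pi        # polar angle, 0 at zenith
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi    # azimuth
    t, p = np.meshgrid(theta, phi, indexing="ij")
    return np.stack([np.sin(t) * np.cos(p),
                     np.sin(t) * np.sin(p),
                     np.cos(t)], axis=-1)           # (H, W, 3)

# Add an HDR light lobe on top of an LDR panorama (illustrative numbers:
# a bright, tight lobe at the zenith, standing in for a sun or lamp).
h, w = 64, 128
ldr = np.zeros((h, w, 3))                           # placeholder LDR content
dirs = panorama_directions(h, w)
sun = sg_radiance(dirs, axis=np.array([0.0, 0.0, 1.0]),
                  sharpness=50.0,
                  amplitude=np.array([500.0, 480.0, 450.0]))
hdr = ldr + sun                                     # HDR panorama
```

Because each lobe is parametric, edits to light direction, intensity, or count amount to changing `axis`, `amplitude`, or the number of lobes summed in; EverLight's contribution is conditioning the panorama generator on these parameters so the texture follows the edits.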
Related papers
- DifFRelight: Diffusion-Based Facial Performance Relighting [12.909429637057343]
We present a novel framework for free-viewpoint facial performance relighting using diffusion-based image-to-image translation.
We train a diffusion model for precise lighting control, enabling high-fidelity relit facial images from flat-lit inputs.
The model accurately reproduces complex lighting effects like eye reflections, subsurface scattering, self-shadowing, and translucency.
arXiv Detail & Related papers (2024-10-10T17:56:44Z)
- NieR: Normal-Based Lighting Scene Rendering [17.421326290704844]
NieR (Normal-Based Lighting Scene Rendering) is a novel framework that takes into account the nuances of light reflection on diverse material surfaces.
We present the LD (Light Decomposition) module, which captures the lighting reflection characteristics on surfaces.
We also propose the HNGD (Hierarchical Normal Gradient Densification) module to overcome the limitations of sparse Gaussian representation.
arXiv Detail & Related papers (2024-05-21T14:24:43Z)
- Controllable Light Diffusion for Portraits [8.931046902694984]
We introduce light diffusion, a novel method to improve lighting in portraits.
Inspired by professional photographers' diffusers and scrims, our method softens lighting given only a single portrait photo.
arXiv Detail & Related papers (2023-05-08T14:46:28Z)
- Spatiotemporally Consistent HDR Indoor Lighting Estimation [66.26786775252592]
We propose a physically-motivated deep learning framework to solve the indoor lighting estimation problem.
Given a single LDR image with a depth map, our method predicts spatially consistent lighting at any given image position.
Our framework achieves photorealistic lighting prediction with higher quality compared to state-of-the-art single-image or video-based methods.
arXiv Detail & Related papers (2023-05-07T20:36:29Z)
- Neural Fields meet Explicit Geometric Representation for Inverse Rendering of Urban Scenes [62.769186261245416]
We present a novel inverse rendering framework for large urban scenes capable of jointly reconstructing the scene geometry, spatially-varying materials, and HDR lighting from a set of posed RGB images with optional depth.
Specifically, we use a neural field to account for the primary rays, and use an explicit mesh (reconstructed from the underlying neural field) for modeling secondary rays that produce higher-order lighting effects such as cast shadows.
arXiv Detail & Related papers (2023-04-06T17:51:54Z)
- WildLight: In-the-wild Inverse Rendering with a Flashlight [77.31815397135381]
We propose a practical photometric solution for in-the-wild inverse rendering under unknown ambient lighting.
Our system recovers scene geometry and reflectance using only multi-view images captured by a smartphone.
We demonstrate by extensive experiments that our method is easy to implement, casual to set up, and consistently outperforms existing in-the-wild inverse rendering techniques.
arXiv Detail & Related papers (2023-03-24T17:59:56Z)
- Neural Light Field Estimation for Street Scenes with Differentiable Virtual Object Insertion [129.52943959497665]
Existing works on outdoor lighting estimation typically simplify the scene lighting into an environment map.
We propose a neural approach that estimates the 5D HDR light field from a single image.
We show the benefits of our AR object insertion in an autonomous driving application.
arXiv Detail & Related papers (2022-08-19T17:59:16Z)
- StyleLight: HDR Panorama Generation for Lighting Estimation and Editing [98.20167223076756]
We present a new lighting estimation and editing framework to generate high-dynamic-range (HDR) indoor panorama lighting from a single limited field-of-view (LFOV) image.
Our framework achieves superior performance over state-of-the-art methods on indoor lighting estimation.
arXiv Detail & Related papers (2022-07-29T17:58:58Z)
- GMLight: Lighting Estimation via Geometric Distribution Approximation [86.95367898017358]
This paper presents a lighting estimation framework that employs a regression network and a generative projector for effective illumination estimation.
We parameterize illumination scenes in terms of the geometric light distribution, light intensity, ambient term, and auxiliary depth, and estimate them as a pure regression task.
With the estimated lighting parameters, the generative projector synthesizes panoramic illumination maps with realistic appearance and frequency.
arXiv Detail & Related papers (2021-02-20T03:31:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.