GMLight: Lighting Estimation via Geometric Distribution Approximation
- URL: http://arxiv.org/abs/2102.10244v1
- Date: Sat, 20 Feb 2021 03:31:52 GMT
- Title: GMLight: Lighting Estimation via Geometric Distribution Approximation
- Authors: Fangneng Zhan, Yingchen Yu, Rongliang Wu, Changgong Zhang, Shijian Lu,
Ling Shao, Feiying Ma, Xuansong Xie
- Abstract summary: This paper presents a lighting estimation framework that employs a regression network and a generative projector for effective illumination estimation.
We parameterize illumination scenes in terms of the geometric light distribution, light intensity, ambient term, and auxiliary depth, and estimate them as a pure regression task.
With the estimated lighting parameters, the generative projector synthesizes panoramic illumination maps with realistic appearance and frequency.
- Score: 86.95367898017358
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Lighting estimation from a single image is an essential yet challenging task
in computer vision and computer graphics. Existing works estimate lighting by
regressing representative illumination parameters or generating illumination
maps directly. However, these methods often suffer from poor accuracy and
generalization. This paper presents Geometric Mover's Light (GMLight), a
lighting estimation framework that employs a regression network and a
generative projector for effective illumination estimation. We parameterize
illumination scenes in terms of the geometric light distribution, light
intensity, ambient term, and auxiliary depth, and estimate them as a pure
regression task. Inspired by the earth mover's distance, we design a novel
geometric mover's loss to guide the accurate regression of light distribution
parameters. With the estimated lighting parameters, the generative projector
synthesizes panoramic illumination maps with realistic appearance and
frequency. Extensive experiments show that GMLight achieves accurate
illumination estimation and superior fidelity in relighting for 3D object
insertion.
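The abstract's key idea, a loss inspired by the earth mover's distance for comparing light distributions over anchor points, can be illustrated with a minimal numpy sketch. This is not the paper's implementation: it uses entropic (Sinkhorn) regularization as a standard smooth approximation to the earth mover's distance, and the anchor directions, distribution sizes, and hyperparameters (`eps`, `iters`) are all illustrative assumptions.

```python
import numpy as np

def sinkhorn_emd(p, q, cost, eps=0.1, iters=500):
    """Entropic-regularized optimal transport (Sinkhorn iterations),
    a smooth approximation to the earth mover's distance."""
    K = np.exp(-cost / eps)              # Gibbs kernel from the ground cost
    u = np.ones_like(p)
    for _ in range(iters):               # alternating marginal projections
        v = q / (K.T @ u)
        u = p / (K @ v)
    plan = u[:, None] * K * v[None, :]   # approximate transport plan
    return float(np.sum(plan * cost))    # transport cost = approximate EMD

# Illustrative anchor directions on the unit sphere (random for this sketch)
rng = np.random.default_rng(0)
anchors = rng.normal(size=(16, 3))
anchors /= np.linalg.norm(anchors, axis=1, keepdims=True)

# Ground cost: angular distance between anchor directions, so moving light
# energy between nearby directions is cheap and between opposite ones is dear
cost = np.arccos(np.clip(anchors @ anchors.T, -1.0, 1.0))

# Two normalized light-energy distributions over the anchors
p = rng.random(16); p /= p.sum()
q = rng.random(16); q /= q.sum()

loss = sinkhorn_emd(p, q, cost)
```

Unlike a per-anchor L2 loss, this cost respects the sphere's geometry: predicting light one anchor away from the ground truth is penalized less than predicting it on the opposite hemisphere, which is the intuition behind the geometric mover's loss.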
Related papers
- GS-Phong: Meta-Learned 3D Gaussians for Relightable Novel View Synthesis [63.5925701087252]
We propose a novel method for representing a scene illuminated by a point light using a set of relightable 3D Gaussian points.
Inspired by the Blinn-Phong model, our approach decomposes the scene into ambient, diffuse, and specular components.
To facilitate the decomposition of geometric information independent of lighting conditions, we introduce a novel bilevel optimization-based meta-learning framework.
arXiv Detail & Related papers (2024-05-31T13:48:54Z) - MixLight: Borrowing the Best of both Spherical Harmonics and Gaussian Models [69.39388799906409]
Existing works estimate illumination by generating illumination maps or regressing illumination parameters.
This paper presents MixLight, a joint model that utilizes the complementary characteristics of SH and SG to achieve a more complete illumination representation.
arXiv Detail & Related papers (2024-04-19T10:17:10Z) - GIR: 3D Gaussian Inverse Rendering for Relightable Scene Factorization [62.13932669494098]
This paper presents a 3D Gaussian Inverse Rendering (GIR) method, employing 3D Gaussian representations to factorize the scene into material properties, light, and geometry.
We compute the normal of each 3D Gaussian using the shortest eigenvector, with a directional masking scheme forcing accurate normal estimation without external supervision.
We adopt an efficient voxel-based indirect illumination tracing scheme that stores direction-aware outgoing radiance in each 3D Gaussian to disentangle secondary illumination for approximating multi-bounce light transport.
arXiv Detail & Related papers (2023-12-08T16:05:15Z) - EverLight: Indoor-Outdoor Editable HDR Lighting Estimation [9.443561684223514]
We propose a method which combines a parametric light model with 360° panoramas, ready to use as HDRI in rendering engines.
In our representation, users can easily edit light direction, intensity, number, etc. to impact shading, while the panorama provides rich, complex reflections that blend seamlessly with the edits.
arXiv Detail & Related papers (2023-04-26T00:20:59Z) - A CNN Based Approach for the Point-Light Photometric Stereo Problem [26.958763133729846]
We propose a CNN-based approach capable of handling realistic assumptions by leveraging recent improvements of deep neural networks for far-field Photometric Stereo.
Our approach outperforms the state-of-the-art on the DiLiGenT real world dataset.
In order to measure the performance of our approach for near-field point-light source PS data, we introduce LUCES, the first real-world 'dataset for near-fieLd point light soUrCe photomEtric Stereo'.
arXiv Detail & Related papers (2022-10-10T12:57:12Z) - Physically-Based Editing of Indoor Scene Lighting from a Single Image [106.60252793395104]
We present a method to edit complex indoor lighting from a single image with its predicted depth and light source segmentation masks.
We tackle this problem using two novel components: 1) a holistic scene reconstruction method that estimates scene reflectance and parametric 3D lighting, and 2) a neural rendering framework that re-renders the scene from our predictions.
arXiv Detail & Related papers (2022-05-19T06:44:37Z) - Sparse Needlets for Lighting Estimation with Spherical Transport Loss [89.52531416604774]
NeedleLight is a new lighting estimation model that represents illumination with needlets, allowing lighting estimation jointly in the frequency and spatial domains.
Extensive experiments show that NeedleLight achieves superior lighting estimation consistently across multiple evaluation metrics as compared with state-of-the-art methods.
arXiv Detail & Related papers (2021-06-24T15:19:42Z) - EMLight: Lighting Estimation via Spherical Distribution Approximation [33.26530733479459]
We propose an illumination estimation framework that leverages a regression network and a neural projector for accurate illumination estimation.
We decompose the illumination map into spherical light distribution, light intensity and the ambient term.
Under the guidance of the predicted spherical distribution, light intensity and ambient term, the neural projector synthesizes panoramic illumination maps with realistic light frequency.
arXiv Detail & Related papers (2020-12-21T04:54:08Z) - Deep Lighting Environment Map Estimation from Spherical Panoramas [0.0]
We present a data-driven model that estimates an HDR lighting environment map from a single LDR monocular spherical panorama.
We exploit the availability of surface geometry to employ image-based relighting as a data generator and supervision mechanism.
arXiv Detail & Related papers (2020-05-16T14:23:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.