PointAR: Efficient Lighting Estimation for Mobile Augmented Reality
- URL: http://arxiv.org/abs/2004.00006v4
- Date: Fri, 17 Jul 2020 20:13:40 GMT
- Title: PointAR: Efficient Lighting Estimation for Mobile Augmented Reality
- Authors: Yiqin Zhao, Tian Guo
- Abstract summary: We propose an efficient lighting estimation pipeline that is suitable to run on modern mobile devices.
PointAR takes a single RGB-D image captured from the mobile camera and a 2D location in that image, and estimates 2nd order spherical harmonics coefficients.
- Score: 7.58114840374767
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose an efficient lighting estimation pipeline that is suitable to run on modern mobile devices, with resource complexities comparable to state-of-the-art mobile deep learning models. Our pipeline, PointAR, takes a single RGB-D image captured from the mobile camera and a 2D location in that image, and estimates 2nd order spherical harmonics coefficients. These estimated spherical harmonics coefficients can be directly used by rendering engines to support spatially variant indoor lighting in the context of augmented reality. Our key insight is to formulate lighting estimation as a learning problem directly on point clouds, in part inspired by the Monte Carlo integration leveraged by real-time spherical harmonics lighting. While existing approaches estimate lighting information with complex deep learning pipelines, our method focuses on reducing the computational complexity. Through both quantitative and qualitative experiments, we demonstrate that PointAR achieves lower lighting estimation errors compared to state-of-the-art methods. Further, our method requires an order of magnitude fewer resources, comparable to mobile-specific DNNs.
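As a rough illustration of the pipeline described in the abstract, the sketch below (hypothetical function names; camera intrinsics fx, fy, cx, cy are assumed inputs) unprojects an RGB-D image into a colored point cloud and projects the observed colors around a query point onto the nine coefficients of the 2nd order spherical harmonics basis, treating the points as Monte Carlo samples of incoming radiance. PointAR itself learns this mapping with a network operating on the point cloud rather than computing it analytically, not least because the observed points cover only part of the sphere, so this is only a sketch of the underlying formulation.

```python
import numpy as np

def sh_basis_order2(dirs):
    """Evaluate the 9 real 2nd order SH basis functions for unit directions (N, 3)."""
    x, y, z = dirs[:, 0], dirs[:, 1], dirs[:, 2]
    return np.stack([
        0.282095 * np.ones_like(x),        # l=0
        0.488603 * y,                      # l=1, m=-1
        0.488603 * z,                      # l=1, m=0
        0.488603 * x,                      # l=1, m=1
        1.092548 * x * y,                  # l=2, m=-2
        1.092548 * y * z,                  # l=2, m=-1
        0.315392 * (3 * z ** 2 - 1),       # l=2, m=0
        1.092548 * x * z,                  # l=2, m=1
        0.546274 * (x ** 2 - y ** 2),      # l=2, m=2
    ], axis=1)                             # shape (N, 9)

def rgbd_to_point_cloud(rgb, depth, fx, fy, cx, cy):
    """Unproject an RGB-D image into a colored point cloud in camera coordinates."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    colors = rgb.reshape(-1, 3).astype(np.float32)
    valid = depth.reshape(-1) > 0
    return points[valid], colors[valid]

def project_sh(points, colors, anchor):
    """Monte Carlo projection of observed colors onto 2nd order SH around `anchor`."""
    dirs = points - anchor
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True) + 1e-8
    basis = sh_basis_order2(dirs)                       # (N, 9)
    # Treating points as uniform samples over the sphere: c ≈ (4π/N) Σ color_i · Y(dir_i)
    return 4 * np.pi / len(points) * basis.T @ colors   # (9, 3): 9 coefficients per channel
```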
Related papers
- MixLight: Borrowing the Best of both Spherical Harmonics and Gaussian Models [69.39388799906409]
Existing works estimate illumination by generating illumination maps or regressing illumination parameters.
This paper presents MixLight, a joint model that utilizes the complementary characteristics of spherical harmonics (SH) and spherical Gaussians (SG) to achieve a more complete illumination representation.
arXiv Detail & Related papers (2024-04-19T10:17:10Z)
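To make that complementarity concrete: 2nd order SH captures smooth, low-frequency ambient light but blurs out concentrated sources, while spherical Gaussian lobes capture exactly those sharp sources. Below is a minimal, hypothetical evaluation of such a mixed representation; the parameterization is generic and is not MixLight's actual model.

```python
import numpy as np

def sh9(dirs):
    """2nd order real SH basis for unit directions of shape (N, 3)."""
    x, y, z = dirs[:, 0], dirs[:, 1], dirs[:, 2]
    return np.stack([0.282095 * np.ones_like(x), 0.488603 * y, 0.488603 * z, 0.488603 * x,
                     1.092548 * x * y, 1.092548 * y * z, 0.315392 * (3 * z ** 2 - 1),
                     1.092548 * x * z, 0.546274 * (x ** 2 - y ** 2)], axis=1)

def mixed_radiance(dirs, sh_coeffs, sg_lobes):
    """Radiance = smooth SH component + sum of spherical Gaussian lobes.

    sh_coeffs: (9, 3) SH coefficients per RGB channel.
    sg_lobes:  list of (axis (3,), sharpness, rgb_amplitude (3,)) tuples.
    """
    out = sh9(dirs) @ sh_coeffs                                            # low-frequency part
    for axis, sharpness, amplitude in sg_lobes:
        # Each lobe: amplitude * exp(sharpness * (dot(dir, axis) - 1))
        out = out + amplitude * np.exp(sharpness * (dirs @ axis - 1.0))[:, None]
    return out
```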
- LightOctree: Lightweight 3D Spatially-Coherent Indoor Lighting Estimation [4.079873017864992]
We present a lightweight solution for estimating spatially-coherent indoor lighting from a single RGB image.
We introduce a unified, voxel octree-based illumination estimation framework to produce 3D spatially-coherent lighting.
arXiv Detail & Related papers (2024-04-05T07:15:06Z)
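A sparse voxel octree is a natural structure for spatially-coherent lighting: coarse voxels cover the whole scene, and finer voxels are allocated only where lighting changes quickly, so a query returns the finest probe available at a 3D position. The sketch below is a hypothetical probe store along those lines (holding SH coefficients per voxel); it is only an illustration, not LightOctree's actual estimation framework.

```python
import numpy as np

class LightingProbeOctree:
    """Sparse octree of lighting probes keyed by (level, ix, iy, iz)."""

    def __init__(self, scene_min, root_size, max_level):
        self.origin = np.asarray(scene_min, dtype=float)
        self.root_size = float(root_size)   # edge length of the level-0 voxel
        self.max_level = max_level
        self.probes = {}                    # (level, ix, iy, iz) -> (9, 3) SH coefficients

    def _key(self, position, level):
        cell = self.root_size / (2 ** level)
        idx = np.floor((np.asarray(position, dtype=float) - self.origin) / cell).astype(int)
        return (level, int(idx[0]), int(idx[1]), int(idx[2]))

    def insert(self, position, level, sh_coeffs):
        self.probes[self._key(position, level)] = sh_coeffs

    def query(self, position):
        """Return the finest-level probe whose voxel contains `position`, if any."""
        for level in range(self.max_level, -1, -1):
            probe = self.probes.get(self._key(position, level))
            if probe is not None:
                return probe
        return None
```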
- SplitNeRF: Split Sum Approximation Neural Field for Joint Geometry, Illumination, and Material Estimation [65.99344783327054]
We present a novel approach for digitizing real-world objects by estimating their geometry, material properties, and lighting.
Our method incorporates the split sum approximation used with image-based lighting for real-time physically-based rendering into Neural Radiance Field (NeRF) pipelines.
Our method is capable of attaining state-of-the-art relighting quality after only $\sim$1 hour of training on a single NVIDIA A100 GPU.
arXiv Detail & Related papers (2023-11-28T10:36:36Z)
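The split sum approximation named in the title is the standard factorization from real-time image-based lighting: the Monte Carlo estimate of the specular reflection integral is split into two sums that can be precomputed independently, one over incoming radiance (a prefiltered environment map) and one over the BRDF (a 2D lookup table). How SplitNeRF embeds these terms in a NeRF pipeline is specific to the paper, but the approximation itself reads:

$$
\frac{1}{N}\sum_{k=1}^{N}\frac{L_i(\mathbf{l}_k)\, f(\mathbf{l}_k,\mathbf{v})\,\cos\theta_{k}}{p(\mathbf{l}_k,\mathbf{v})}
\;\approx\;
\left(\frac{\sum_{k} L_i(\mathbf{l}_k)\,\cos\theta_{k}}{\sum_{k}\cos\theta_{k}}\right)
\left(\frac{1}{N}\sum_{k=1}^{N}\frac{f(\mathbf{l}_k,\mathbf{v})\,\cos\theta_{k}}{p(\mathbf{l}_k,\mathbf{v})}\right)
$$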
- Spatiotemporally Consistent HDR Indoor Lighting Estimation [66.26786775252592]
We propose a physically-motivated deep learning framework to solve the indoor lighting estimation problem.
Given a single LDR image with a depth map, our method predicts spatially consistent lighting at any given image position.
Our framework achieves photorealistic lighting prediction with higher quality compared to state-of-the-art single-image or video-based methods.
arXiv Detail & Related papers (2023-05-07T20:36:29Z)
- Sparse Needlets for Lighting Estimation with Spherical Transport Loss [89.52531416604774]
NeedleLight is a new lighting estimation model that represents illumination with needlets and allows lighting estimation jointly in both the frequency and spatial domains.
Extensive experiments show that NeedleLight achieves superior lighting estimation consistently across multiple evaluation metrics as compared with state-of-the-art methods.
arXiv Detail & Related papers (2021-06-24T15:19:42Z)
- Xihe: A 3D Vision-based Lighting Estimation Framework for Mobile Augmented Reality [9.129335351176904]
We design an edge-assisted framework called Xihe that provides mobile AR applications with the ability to obtain accurate omnidirectional lighting estimation in real time.
We develop a tailored GPU pipeline for on-device point cloud processing and use an encoding technique that reduces the number of bytes transmitted over the network.
Our results show that Xihe performs a lighting estimation in as little as 20.67 ms and achieves 9.4% better estimation accuracy than a state-of-the-art neural network.
arXiv Detail & Related papers (2021-05-30T13:48:29Z)
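The entry above mentions an encoding that reduces the bytes sent to the edge server. Purely as an illustration of the kind of saving involved (an assumption, not Xihe's actual encoding), quantizing a normalized point cloud and its colors to 8 bits shrinks a 24-byte float32 record per point down to 6 bytes:

```python
import numpy as np

def encode_point_cloud(points, colors):
    """Quantize an (N, 3) float32 point cloud and (N, 3) colors in [0, 1] to 8 bits."""
    center = points.mean(axis=0)
    scale = np.abs(points - center).max() + 1e-8
    q_xyz = np.round((points - center) / scale * 127).astype(np.int8)   # 3 bytes per point
    q_rgb = np.round(np.clip(colors, 0.0, 1.0) * 255).astype(np.uint8)  # 3 bytes per point
    header = np.concatenate([center, [scale]]).astype(np.float32)       # 16-byte header
    return header.tobytes() + q_xyz.tobytes() + q_rgb.tobytes()

def decode_point_cloud(buffer, n_points):
    """Invert encode_point_cloud, returning float32 points and colors."""
    header = np.frombuffer(buffer[:16], dtype=np.float32)
    center, scale = header[:3], header[3]
    q_xyz = np.frombuffer(buffer[16:16 + 3 * n_points], dtype=np.int8).reshape(-1, 3)
    q_rgb = np.frombuffer(buffer[16 + 3 * n_points:], dtype=np.uint8).reshape(-1, 3)
    return q_xyz.astype(np.float32) / 127 * scale + center, q_rgb.astype(np.float32) / 255
```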
- Leveraging Spatial and Photometric Context for Calibrated Non-Lambertian Photometric Stereo [61.6260594326246]
We introduce an efficient fully-convolutional architecture that can leverage both spatial and photometric context simultaneously.
Using separable 4D convolutions and 2D heat-maps reduces the model size and makes it more efficient.
arXiv Detail & Related papers (2021-03-22T18:06:58Z)
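The separable 4D convolution in the previous entry can be read as factoring one convolution over the joint spatial (H, W) and photometric observation-map (P, Q) axes into two cheaper 2D convolutions. The PyTorch sketch below shows that generic factorization with assumed tensor shapes; it is an illustration of the idea, not the paper's architecture.

```python
import torch
import torch.nn as nn

class Separable4dConv(nn.Module):
    """Approximate a 4D conv over (H, W, P, Q) with a 2D conv over the photometric
    axes (P, Q) followed by a 2D conv over the spatial axes (H, W)."""

    def __init__(self, in_ch, mid_ch, out_ch, k=3):
        super().__init__()
        self.photometric = nn.Conv2d(in_ch, mid_ch, k, padding=k // 2)
        self.spatial = nn.Conv2d(mid_ch, out_ch, k, padding=k // 2)

    def forward(self, x):
        # x: (B, C, H, W, P, Q) — a (P, Q) observation map for every pixel
        b, c, h, w, p, q = x.shape
        x = x.permute(0, 2, 3, 1, 4, 5).reshape(b * h * w, c, p, q)
        x = torch.relu(self.photometric(x))           # convolve within each observation map
        mid = x.shape[1]
        x = x.reshape(b, h, w, mid, p, q).permute(0, 4, 5, 3, 1, 2).reshape(b * p * q, mid, h, w)
        x = self.spatial(x)                           # convolve across neighboring pixels
        out = x.shape[1]
        return x.reshape(b, p, q, out, h, w).permute(0, 3, 4, 5, 1, 2)  # (B, out_ch, H, W, P, Q)
```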
- Object-based Illumination Estimation with Rendering-aware Neural Networks [56.01734918693844]
We present a scheme for fast environment light estimation from the RGBD appearance of individual objects and their local image areas.
With the estimated lighting, virtual objects can be rendered in AR scenarios with shading that is consistent with the real scene.
arXiv Detail & Related papers (2020-08-06T08:23:19Z)
- Deep Lighting Environment Map Estimation from Spherical Panoramas [0.0]
We present a data-driven model that estimates an HDR lighting environment map from a single LDR monocular spherical panorama.
We exploit the availability of surface geometry to employ image-based relighting as a data generator and supervision mechanism.
arXiv Detail & Related papers (2020-05-16T14:23:05Z)