Factorized and Controllable Neural Re-Rendering of Outdoor Scene for
Photo Extrapolation
- URL: http://arxiv.org/abs/2207.06899v1
- Date: Thu, 14 Jul 2022 13:28:08 GMT
- Title: Factorized and Controllable Neural Re-Rendering of Outdoor Scene for
Photo Extrapolation
- Authors: Boming Zhao, Bangbang Yang, Zhenyang Li, Zuoyue Li, Guofeng Zhang,
Jiashu Zhao, Dawei Yin, Zhaopeng Cui, Hujun Bao
- Abstract summary: We propose a factorized neural re-rendering model to produce novel views from cluttered outdoor Internet photo collections.
We also propose a novel realism augmentation process to complement appearance details, automatically propagating texture details from the narrow input photo to the extrapolated, neurally rendered image.
- Score: 50.00344639039158
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Expanding an existing tourist photo from a partially captured
scene to a full scene is a desired experience for photography applications.
Although photo extrapolation has been well studied, it is much more
challenging to extrapolate a photo (e.g., a selfie) from a narrow field of
view to a wider one while maintaining a similar visual style. In this
paper, we propose a factorized neural re-rendering model that produces
photorealistic novel views from cluttered outdoor Internet photo
collections, enabling applications including controllable scene
re-rendering, photo extrapolation, and even extrapolated 3D photo
generation. Specifically, we first develop a novel factorized re-rendering
pipeline to handle the ambiguity in the decomposition of geometry,
appearance, and illumination. We also propose a composited training
strategy to tackle the unexpected occlusions common in Internet images.
Moreover, to enhance photo-realism when extrapolating tourist photographs,
we propose a novel realism augmentation process that complements appearance
details by automatically propagating texture details from the narrow input
photo to the extrapolated, neurally rendered image. Experiments and
photo-editing examples on outdoor scenes demonstrate the superior
performance of our method in both photo-realism and downstream
applications.
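The factorization described in the abstract maps naturally onto a NeRF-style network with separate geometry, appearance, and illumination heads whose outputs are recombined at render time. Below is a minimal, hypothetical sketch of such a decomposition; the module layout, latent-code sizes, and all names are illustrative assumptions rather than the authors' released code:

```python
# Hypothetical sketch of a factorized re-rendering model (not the
# authors' code): geometry, appearance, and illumination live in
# separate heads so each factor can be edited independently.
import torch
import torch.nn as nn

class FactorizedReRenderer(nn.Module):
    def __init__(self, pos_dim=63, app_dim=32, illum_dim=16, hidden=256):
        super().__init__()
        # Geometry head: encoded 3D point -> density + shared feature.
        self.geometry = nn.Sequential(
            nn.Linear(pos_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1 + hidden))
        # Appearance head: per-image latent absorbs each photo's style.
        self.appearance = nn.Sequential(
            nn.Linear(hidden + app_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 3))          # albedo-like base color
        # Illumination head: low-dim lighting code -> per-point shading.
        self.illumination = nn.Sequential(
            nn.Linear(hidden + illum_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 3))          # RGB shading multiplier

    def forward(self, x_enc, app_code, illum_code):
        h = self.geometry(x_enc)
        sigma, feat = h[..., :1], h[..., 1:]
        albedo = torch.sigmoid(
            self.appearance(torch.cat([feat, app_code], dim=-1)))
        shading = torch.relu(
            self.illumination(torch.cat([feat, illum_code], dim=-1)))
        return sigma, albedo * shading     # density and recomposed color
```

Because appearance and illumination enter only through their latent codes, swapping either code at inference time re-renders the same geometry under a new photo style or lighting, which is the kind of independent control the abstract's applications rely on.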
Related papers
- Photorealistic Object Insertion with Diffusion-Guided Inverse Rendering [56.68286440268329]
The correct insertion of virtual objects into images of real-world scenes requires a deep understanding of the scene's lighting, geometry, and materials.
We propose using a personalized large diffusion model as guidance to a physically based inverse rendering process.
Our method recovers scene lighting and tone-mapping parameters, allowing the photorealistic composition of arbitrary virtual objects in single frames or videos of indoor or outdoor scenes.
arXiv Detail & Related papers (2024-08-19T05:15:45Z)
- Neural Fields meet Explicit Geometric Representation for Inverse Rendering of Urban Scenes [62.769186261245416]
We present a novel inverse rendering framework for large urban scenes capable of jointly reconstructing the scene geometry, spatially-varying materials, and HDR lighting from a set of posed RGB images with optional depth.
Specifically, we use a neural field to account for the primary rays, and use an explicit mesh (reconstructed from the underlying neural field) for modeling secondary rays that produce higher-order lighting effects such as cast shadows.
arXiv Detail & Related papers (2023-04-06T17:51:54Z)
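A minimal sketch of the hybrid scheme described in the entry above, with secondary shadow rays traced against an explicit mesh via trimesh; the mesh file, sun direction, and shading model are placeholder assumptions, not the paper's actual pipeline:

```python
# Sketch: primary radiance comes from a neural field (not shown);
# secondary (shadow) rays are traced against an explicit mesh that is
# assumed to have been extracted from that field.
import numpy as np
import trimesh

mesh = trimesh.load("scene_mesh.ply")  # placeholder mesh path
intersector = trimesh.ray.ray_triangle.RayMeshIntersector(mesh)

def sun_shading(points, normals, base_color, sun_dir):
    """Direct sun shading with mesh-traced cast shadows."""
    origins = points + 1e-3 * normals          # avoid self-intersection
    dirs = np.tile(sun_dir, (len(points), 1))  # one shadow ray per point
    occluded = intersector.intersects_any(origins, dirs)
    n_dot_l = np.clip(normals @ sun_dir, 0.0, None)[:, None]
    visibility = (~occluded)[:, None].astype(np.float32)
    return base_color * n_dot_l * visibility   # shadowed Lambertian term
```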
- Neural Radiance Transfer Fields for Relightable Novel-view Synthesis with Global Illumination [63.992213016011235]
We propose a method for scene relighting under novel views by learning a neural precomputed radiance transfer function.
Our method can be supervised solely on a set of real images of the scene captured under a single, unknown lighting condition.
Results show that the recovered disentanglement of scene parameters improves significantly over the current state of the art.
arXiv Detail & Related papers (2022-07-27T16:07:48Z)
- Multi-view Gradient Consistency for SVBRDF Estimation of Complex Scenes under Natural Illumination [6.282068591820945]
This paper presents a process for estimating the spatially varying surface reflectance of complex scenes observed under natural illumination.
An end-to-end process uses a model of the scene's geometry and several images capturing the scene's surfaces.
Experiments show that our technique produces realistic results for arbitrary outdoor scenes with complex geometry.
arXiv Detail & Related papers (2022-02-25T23:49:39Z)
- Neural Radiance Fields for Outdoor Scene Relighting [70.97747511934705]
We present NeRF-OSR, the first approach for outdoor scene relighting based on neural radiance fields.
In contrast to the prior art, our technique allows simultaneous editing of both scene illumination and camera viewpoint.
It also includes a dedicated network for shadow reproduction, which is crucial for high-quality outdoor scene relighting.
arXiv Detail & Related papers (2021-12-09T18:59:56Z)
- Neural Reflectance Fields for Appearance Acquisition [61.542001266380375]
We present Neural Reflectance Fields, a novel deep scene representation that encodes volume density, normal and reflectance properties at any 3D point in a scene.
We combine this representation with a physically-based differentiable ray marching framework that can render images from a neural reflectance field under any viewpoint and light.
arXiv Detail & Related papers (2020-08-09T22:04:36Z)
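A minimal sketch of the differentiable ray marching described in the entry above, assuming a hypothetical `field` network that returns density, normal, and a Lambertian albedo per sample; shading under a single point light is composited with standard volume-rendering weights:

```python
# Sketch of ray marching a reflectance field (all names are assumptions).
import torch
import torch.nn.functional as F

def render_rays(field, rays_o, rays_d, light_pos, near=0.1, far=4.0, n=64):
    t = torch.linspace(near, far, n)           # sample depths along each ray
    pts = rays_o[:, None, :] + t[None, :, None] * rays_d[:, None, :]
    sigma, normal, albedo = field(pts)         # (R,n,1), (R,n,3), (R,n,3)
    # Lambertian shading under one point light.
    l = F.normalize(light_pos - pts, dim=-1)
    shaded = albedo * (normal * l).sum(-1, keepdim=True).clamp(min=0)
    # Standard alpha-compositing weights from densities.
    delta = (far - near) / n
    alpha = 1 - torch.exp(-sigma.squeeze(-1).relu() * delta)
    trans = torch.cumprod(
        torch.cat([torch.ones_like(alpha[:, :1]), 1 - alpha + 1e-10], dim=1),
        dim=1)[:, :-1]
    weights = alpha * trans                    # (R,n) per-sample weights
    return (weights[..., None] * shaded).sum(dim=1)  # (R,3) pixel colors
```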