IE-NeRF: Inpainting Enhanced Neural Radiance Fields in the Wild
- URL: http://arxiv.org/abs/2407.10695v1
- Date: Mon, 15 Jul 2024 13:10:23 GMT
- Title: IE-NeRF: Inpainting Enhanced Neural Radiance Fields in the Wild
- Authors: Shuaixian Wang, Haoran Xu, Yaokun Li, Jiwei Chen, Guang Tan
- Abstract summary: We present a novel approach for synthesizing realistic novel views using Neural Radiance Fields (NeRF) with uncontrolled photos in the wild.
Our framework, called Inpainting Enhanced NeRF (IE-NeRF), enhances the conventional NeRF by drawing inspiration from the technique of image inpainting.
- Score: 15.86621086993995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a novel approach for synthesizing realistic novel views using Neural Radiance Fields (NeRF) with uncontrolled photos in the wild. While NeRF has shown impressive results in controlled settings, it struggles with transient objects commonly found in dynamic and time-varying scenes. Our framework, called Inpainting Enhanced NeRF (IE-NeRF), enhances the conventional NeRF by drawing inspiration from the technique of image inpainting. Specifically, our approach extends the Multi-Layer Perceptrons (MLP) of NeRF, enabling it to simultaneously generate intrinsic properties (static color, density) and extrinsic transient masks. We introduce an inpainting module that leverages the transient masks to effectively exclude occlusions, resulting in improved volume rendering quality. Additionally, we propose a new training strategy with frequency regularization to address the sparsity issue of low-frequency transient components. We evaluate our approach on internet photo collections of landmarks, demonstrating its ability to generate high-quality novel views and achieve state-of-the-art performance.
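The abstract's decomposition — an MLP that jointly emits static color, density, and a transient mask, with the mask used to exclude occluded pixels before inpainting — can be pictured with a short sketch. The PyTorch code below is illustrative only: the layer widths, the mask head, and the masked_photometric_loss weighting are assumptions for exposition, not the authors' implementation, and the paper's inpainting module and frequency-regularization schedule are not shown.

```python
# Illustrative sketch (PyTorch) of a NeRF MLP extended with a transient-mask
# head, loosely following the decomposition described in the abstract.
# All architecture choices here are assumptions, not the paper's code.
import torch
import torch.nn as nn

class MaskedNeRFMLP(nn.Module):
    def __init__(self, pos_dim=63, dir_dim=27, hidden=256):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(pos_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.sigma_head = nn.Linear(hidden, 1)            # intrinsic: static density
        self.color_head = nn.Sequential(                  # intrinsic: view-dependent static color
            nn.Linear(hidden + dir_dim, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 3), nn.Sigmoid(),
        )
        self.mask_head = nn.Sequential(nn.Linear(hidden, 1), nn.Sigmoid())  # extrinsic: transient mask

    def forward(self, x_enc, d_enc):
        h = self.trunk(x_enc)                              # positional-encoded 3D point features
        sigma = torch.relu(self.sigma_head(h))
        rgb = self.color_head(torch.cat([h, d_enc], dim=-1))
        mask = self.mask_head(h)                           # ~1 where content is likely transient
        return rgb, sigma, mask

def masked_photometric_loss(pred_rgb, gt_rgb, mask):
    """Down-weight pixels flagged as transient so occluders do not corrupt the
    static reconstruction; an inpainting step would then fill those regions."""
    weight = 1.0 - mask.detach()
    return (weight * (pred_rgb - gt_rgb) ** 2).mean()
```

A real training loop would additionally need supervision or a sparsity prior on the mask, plus the paper's frequency-regularized schedule; both are omitted from this sketch.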
Related papers
- NeRF On-the-go: Exploiting Uncertainty for Distractor-free NeRFs in the Wild [55.154625718222995]
We introduce NeRF On-the-go, a simple yet effective approach that enables the robust synthesis of novel views in complex, in-the-wild scenes.
Our method demonstrates a significant improvement over state-of-the-art techniques.
arXiv Detail & Related papers (2024-05-29T02:53:40Z) - NeRF-Casting: Improved View-Dependent Appearance with Consistent Reflections [57.63028964831785]
Recent works have improved NeRF's ability to render detailed specular appearance of distant environment illumination, but are unable to synthesize consistent reflections of closer content.
We address these issues with an approach based on ray tracing.
Instead of querying an expensive neural network for the outgoing view-dependent radiance at points along each camera ray, our model casts rays from these points and traces them through the NeRF representation to render feature vectors.
arXiv Detail & Related papers (2024-05-23T17:59:57Z) - Taming Latent Diffusion Model for Neural Radiance Field Inpainting [63.297262813285265]
Neural Radiance Field (NeRF) is a representation for 3D reconstruction from multi-view images.
We propose tempering the diffusion model's stochasticity with per-scene customization and mitigating the textural shift with masked training.
Our framework yields state-of-the-art NeRF inpainting results on various real-world scenes.
arXiv Detail & Related papers (2024-04-15T17:59:57Z) - Dynamic Mesh-Aware Radiance Fields [75.59025151369308]
This paper designs a two-way coupling between mesh and NeRF during rendering and simulation.
We show that a hybrid system approach outperforms alternatives in visual realism for mesh insertion.
arXiv Detail & Related papers (2023-09-08T20:18:18Z) - Learning Neural Duplex Radiance Fields for Real-Time View Synthesis [33.54507228895688]
We propose a novel approach to distill and bake NeRFs into highly efficient mesh-based neural representations.
We demonstrate the effectiveness and superiority of our approach via extensive experiments on a range of standard datasets.
arXiv Detail & Related papers (2023-04-20T17:59:52Z) - Pre-NeRF 360: Enriching Unbounded Appearances for Neural Radiance Fields [8.634008996263649]
We propose a new framework to boost the performance of NeRF-based architectures.
Our solution overcomes several obstacles that plagued earlier versions of NeRF.
We introduce an updated version of the Nutrition5k dataset, known as the N5k360 dataset.
arXiv Detail & Related papers (2023-03-21T23:29:38Z) - NeRF++: Analyzing and Improving Neural Radiance Fields [117.73411181186088]
Neural Radiance Fields (NeRF) achieve impressive view synthesis results for a variety of capture settings.
NeRF fits multi-layer perceptrons representing view-invariant opacity and view-dependent color volumes to a set of training images.
We address a parametrization issue involved in applying NeRF to 360 captures of objects within large-scale, unbounded 3D scenes.
arXiv Detail & Related papers (2020-10-15T03:24:14Z) - NeRF in the Wild: Neural Radiance Fields for Unconstrained Photo Collections [47.9463405062868]
We present a learning-based method for synthesizing novel views of complex scenes using only unstructured collections of in-the-wild photographs.
We build on Neural Radiance Fields (NeRF), which uses the weights of a multilayer perceptron to model the density and color of a scene as a function of 3D coordinates.
We introduce a series of extensions to NeRF that address the variable illumination and transient occluders present in such photos, thereby enabling accurate reconstructions from unstructured image collections taken from the internet (a minimal sketch of the shared volume-rendering step follows this list).
arXiv Detail & Related papers (2020-08-05T17:51:16Z)
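All of the works above, including NeRF in the Wild, share the same volume-rendering backbone: an MLP maps 3D coordinates (and view direction) to density and color, and a pixel is rendered by alpha-compositing samples along its ray. The following minimal PyTorch sketch shows that compositing step with placeholder sample values; it is a generic illustration, not code from any of the listed papers.

```python
# Minimal sketch of NeRF-style volume rendering (PyTorch): composite
# per-sample densities and colors along one ray into a pixel color.
import torch

def composite(sigmas, rgbs, deltas):
    """sigmas: (N,) densities, rgbs: (N, 3) colors, deltas: (N,) sample spacings."""
    alphas = 1.0 - torch.exp(-sigmas * deltas)               # per-sample opacity
    trans = torch.cumprod(1.0 - alphas + 1e-10, dim=0)       # transmittance through each sample
    trans = torch.cat([torch.ones(1), trans[:-1]])           # shift: transmittance *before* each sample
    weights = alphas * trans
    return (weights[:, None] * rgbs).sum(dim=0)              # composited pixel color

# Toy usage with placeholder values (64 samples along one ray):
sigmas = torch.rand(64)
rgbs = torch.rand(64, 3)
deltas = torch.full((64,), 0.02)
pixel = composite(sigmas, rgbs, deltas)
```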