Fast Sparse View Guided NeRF Update for Object Reconfigurations
- URL: http://arxiv.org/abs/2403.11024v1
- Date: Sat, 16 Mar 2024 22:00:16 GMT
- Title: Fast Sparse View Guided NeRF Update for Object Reconfigurations
- Authors: Ziqi Lu, Jianbo Ye, Xiaohan Fei, Xiaolong Li, Jiawei Mo, Ashwin Swaminathan, Stefano Soatto
- Abstract summary: We develop the first method for updating NeRFs to reflect physical scene changes.
Our method takes only sparse new images as extra inputs and updates the pre-trained NeRF in around 1 to 2 minutes.
Our core idea is to use a second, helper NeRF to learn the local geometry and appearance changes.
- Score: 42.947608325321475
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Radiance Field (NeRF), as an implicit 3D scene representation, lacks the inherent ability to accommodate changes made to the initial static scene. If objects are reconfigured, it is difficult to update the NeRF to reflect the new state of the scene without time-consuming data re-capturing and NeRF re-training. To address this limitation, we develop the first method for updating NeRFs to accommodate physical changes. Our method takes only sparse new images (e.g. 4) of the altered scene as extra inputs and updates the pre-trained NeRF in around 1 to 2 minutes. In particular, we develop a pipeline to identify scene changes and update the NeRF accordingly. Our core idea is to use a second, helper NeRF to learn the local geometry and appearance changes, which sidesteps the optimization difficulties of direct NeRF fine-tuning. The interpolation power of the helper NeRF is key to accurately reconstructing unoccluded object regions under sparse view supervision. Our method imposes no constraints on NeRF pre-training, and requires no extra user input or explicit semantic priors. It is an order of magnitude faster than re-training a NeRF from scratch while maintaining on-par and even superior performance.
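The compositing idea behind the helper NeRF can be sketched in a few lines. The following is a minimal illustration under assumed interfaces (`query_old`, `query_helper`, and `change_mask` are hypothetical stand-ins, not the paper's code): outside the estimated change region the pre-trained field is queried unchanged, while inside it the helper field supplies the updated geometry and appearance.

```python
import numpy as np

def query_old(pts):
    # Stand-in for the pre-trained NeRF: uniform density, gray color.
    return np.ones(len(pts)), np.full((len(pts), 3), 0.5)

def query_helper(pts):
    # Stand-in for the helper NeRF trained on the sparse new views.
    return np.full(len(pts), 2.0), np.full((len(pts), 3), 0.9)

def change_mask(pts, center=np.zeros(3), radius=1.0):
    # Crude change region: a sphere around the reconfigured object.
    return np.linalg.norm(pts - center, axis=-1) < radius

def composite_query(pts):
    """Query the helper field inside the change region, the old field outside."""
    sigma_old, rgb_old = query_old(pts)
    sigma_new, rgb_new = query_helper(pts)
    inside = change_mask(pts)
    sigma = np.where(inside, sigma_new, sigma_old)
    rgb = np.where(inside[:, None], rgb_new, rgb_old)
    return sigma, rgb

# One point inside the change region, one far outside it.
pts = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
sigma, rgb = composite_query(pts)
```

In the actual method the change region is identified automatically from the sparse new views rather than given as a sphere; the sketch only shows how two fields can be blended per sample point during volume rendering.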
Related papers
- InsertNeRF: Instilling Generalizability into NeRF with HyperNet Modules [23.340064406356174]
Generalizing Neural Radiance Fields (NeRF) to new scenes is a significant challenge.
We introduce InsertNeRF, a method for INStilling gEneRalizabiliTy into NeRF.
arXiv Detail & Related papers (2023-08-26T14:50:24Z)
- DReg-NeRF: Deep Registration for Neural Radiance Fields [66.69049158826677]
We propose DReg-NeRF to solve the NeRF registration problem on object-centric annotated scenes without human intervention.
Our proposed method beats the SOTA point cloud registration methods by a large margin.
arXiv Detail & Related papers (2023-08-18T08:37:49Z)
- Learning a Diffusion Prior for NeRFs [84.99454404653339]
We propose to use a diffusion model to generate NeRFs encoded on a regularized grid.
We show that our model can sample realistic NeRFs, while at the same time allowing conditional generations, given a certain observation as guidance.
arXiv Detail & Related papers (2023-04-27T19:24:21Z)
- FreeNeRF: Improving Few-shot Neural Rendering with Free Frequency Regularization [32.1581416980828]
We present Frequency regularized NeRF (FreeNeRF), a surprisingly simple baseline that outperforms previous methods.
We analyze the key challenges in few-shot neural rendering and find that frequency plays an important role in NeRF's training.
arXiv Detail & Related papers (2023-03-13T18:59:03Z)
- Removing Objects From Neural Radiance Fields [60.067117643543824]
We propose a framework to remove objects from a NeRF representation created from an RGB-D sequence.
Our NeRF inpainting method leverages recent work in 2D image inpainting and is guided by a user-provided mask.
We show that our method for NeRF editing is effective for synthesizing plausible inpaintings in a multi-view coherent manner.
arXiv Detail & Related papers (2022-12-22T18:51:06Z)
- NeRF-RPN: A general framework for object detection in NeRFs [54.54613914831599]
NeRF-RPN aims to detect all bounding boxes of objects in a scene.
NeRF-RPN is a general framework and can be applied to detect objects without class labels.
arXiv Detail & Related papers (2022-11-21T17:02:01Z)
- Compressing Explicit Voxel Grid Representations: fast NeRFs become also small [3.1473798197405944]
Re:NeRF aims to reduce the memory footprint of NeRF models while maintaining comparable performance.
We benchmark our approach with three different EVG-NeRF architectures on four popular benchmarks.
arXiv Detail & Related papers (2022-10-23T16:42:29Z)
- Aug-NeRF: Training Stronger Neural Radiance Fields with Triple-Level Physically-Grounded Augmentations [111.08941206369508]
We propose Augmented NeRF (Aug-NeRF), which for the first time brings the power of robust data augmentations into regularizing the NeRF training.
Our proposal learns to seamlessly blend worst-case perturbations into three distinct levels of the NeRF pipeline.
Aug-NeRF effectively boosts NeRF performance in both novel view synthesis and underlying geometry reconstruction.
arXiv Detail & Related papers (2022-07-04T02:27:07Z)
- VaxNeRF: Revisiting the Classic for Voxel-Accelerated Neural Radiance Field [28.087183395793236]
We propose Voxel-Accelerated NeRF (VaxNeRF) to integrate NeRF with the visual hull.
VaxNeRF achieves about 2-8x faster learning on top of the highly performant JaxNeRF.
We hope VaxNeRF can empower and accelerate new NeRF extensions and applications.
arXiv Detail & Related papers (2021-11-25T14:56:53Z)
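The visual-hull idea VaxNeRF builds on can be illustrated with a toy setup (the two-view orthographic camera model and all names here are simplifying assumptions, not VaxNeRF's actual implementation): binary silhouettes are intersected into a boolean occupancy grid, and ray samples that fall in carved-away empty space are skipped before the NeRF is ever queried.

```python
import numpy as np

def carve_hull(mask_z, mask_x):
    """Intersect two orthographic silhouettes into a boolean voxel hull.

    mask_z[x, y]: silhouette seen looking down the z axis
    mask_x[y, z]: silhouette seen looking down the x axis
    A voxel survives only if it projects inside every silhouette.
    """
    return mask_z[:, :, None] & mask_x[None, :, :]

def prune_samples(pts, hull):
    """Keep only ray samples whose voxel lies inside the visual hull.

    Points are assumed to live in the unit cube [0, 1)^3.
    """
    res = hull.shape[0]
    idx = np.clip((pts * res).astype(int), 0, res - 1)
    keep = hull[idx[:, 0], idx[:, 1], idx[:, 2]]
    return pts[keep]

res = 4
mask_z = np.zeros((res, res), dtype=bool); mask_z[1, 2] = True
mask_x = np.zeros((res, res), dtype=bool); mask_x[2, 3] = True
hull = carve_hull(mask_z, mask_x)   # only voxel (1, 2, 3) survives

# First sample falls in the surviving voxel; second is in empty space.
pts = np.array([[0.4, 0.6, 0.9], [0.1, 0.1, 0.1]])
kept = prune_samples(pts, hull)
```

Because the hull is computed once from the input masks, the per-sample test is a cheap array lookup, which is where the reported speedup over dense sampling comes from.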
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.