PaletteNeRF: Palette-based Color Editing for NeRFs
- URL: http://arxiv.org/abs/2212.12871v1
- Date: Sun, 25 Dec 2022 08:01:03 GMT
- Title: PaletteNeRF: Palette-based Color Editing for NeRFs
- Authors: Qiling Wu, Jianchao Tan, Kun Xu
- Abstract summary: We propose a simple but effective extension of vanilla NeRF, named PaletteNeRF, to enable efficient color editing on NeRF-represented scenes.
Our method achieves efficient, view-consistent, and artifact-free color editing on a wide range of NeRF-represented scenes.
- Score: 16.49512200561126
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Radiance Field (NeRF) is a powerful tool to faithfully generate novel
views for scenes with only sparse captured images. Despite its strong
capability for representing 3D scenes and their appearance, its editing ability
is very limited. In this paper, we propose a simple but effective extension of
vanilla NeRF, named PaletteNeRF, to enable efficient color editing on
NeRF-represented scenes. Motivated by recent palette-based image decomposition
works, we approximate each pixel color as a sum of palette colors modulated by
additive weights. Instead of predicting pixel colors as in vanilla NeRFs, our
method predicts additive weights. The underlying NeRF backbone could also be
replaced with more recent NeRF models such as KiloNeRF to achieve real-time
editing. Experimental results demonstrate that our method achieves efficient,
view-consistent, and artifact-free color editing on a wide range of
NeRF-represented scenes.
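The decomposition described in the abstract can be illustrated with a minimal sketch: each pixel color is approximated as a sum of global palette colors modulated by per-pixel additive weights, and an edit amounts to replacing a palette entry while the predicted weights stay fixed. The NumPy snippet below is an illustrative toy model of this composition step only (the function names, shapes, and values are hypothetical, not the paper's implementation); in PaletteNeRF the weights would be predicted by the NeRF backbone per ray rather than stored per pixel.

```python
import numpy as np

def compose_colors(weights: np.ndarray, palette: np.ndarray) -> np.ndarray:
    """Compose pixel colors from additive palette weights.

    weights: (H, W, K) non-negative per-pixel weights (what the network
             predicts, instead of raw RGB as in vanilla NeRF).
    palette: (K, 3) RGB palette colors in [0, 1].
    Returns an (H, W, 3) image, clipped to [0, 1].
    """
    return np.clip(weights @ palette, 0.0, 1.0)

# Toy example: a 2x2 image with a 2-color palette.
palette = np.array([[1.0, 0.0, 0.0],   # red
                    [0.0, 0.0, 1.0]])  # blue
weights = np.array([[[1.0, 0.0], [0.0, 1.0]],
                    [[0.5, 0.5], [0.25, 0.75]]])
img = compose_colors(weights, palette)

# Color editing: replace red with green in the palette. The weights are
# untouched, so the edit is applied consistently across all views that
# share the same palette.
edited_palette = palette.copy()
edited_palette[0] = [0.0, 1.0, 0.0]
edited = compose_colors(weights, edited_palette)
```

Because only the small palette changes at edit time, recoloring requires no retraining of the underlying radiance field, which is what makes the editing efficient and view-consistent.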
Related papers
- IReNe: Instant Recoloring of Neural Radiance Fields [54.94866137102324]
We introduce IReNe, enabling swift, near real-time color editing in NeRF.
We leverage a pre-trained NeRF model and a single training image with user-applied color edits.
This adjustment allows the model to generate new scene views, accurately representing the color changes from the training image.
arXiv Detail & Related papers (2024-05-30T09:30:28Z) - Taming Latent Diffusion Model for Neural Radiance Field Inpainting [63.297262813285265]
Neural Radiance Field (NeRF) is a representation for 3D reconstruction from multi-view images.
We propose tempering the diffusion model's stochasticity with per-scene customization and mitigating the textural shift with masked training.
Our framework yields state-of-the-art NeRF inpainting results on various real-world scenes.
arXiv Detail & Related papers (2024-04-15T17:59:57Z) - LAENeRF: Local Appearance Editing for Neural Radiance Fields [4.681790910494339]
LAENeRF is a framework for photorealistic and non-photorealistic appearance editing of NeRFs.
We learn a mapping from expected ray terminations to final output color, which can be supervised by a style loss.
Relying on a single point per ray for our mapping, we limit memory requirements and enable fast optimization.
arXiv Detail & Related papers (2023-12-15T16:23:42Z) - ProteusNeRF: Fast Lightweight NeRF Editing using 3D-Aware Image Context [26.07841568311428]
We present a very simple but effective neural network architecture that is fast and efficient while maintaining a low memory footprint.
Our representation allows straightforward object selection via semantic feature distillation at the training stage.
We propose a local 3D-aware image context to facilitate view-consistent image editing that can then be distilled into fine-tuned NeRFs.
arXiv Detail & Related papers (2023-10-15T21:54:45Z) - RePaint-NeRF: NeRF Editing via Semantic Masks and Diffusion Models [36.236190350126826]
We propose a novel framework that can take RGB images as input and alter the 3D content in neural scenes.
Specifically, we semantically select the target object and a pre-trained diffusion model will guide the NeRF model to generate new 3D objects.
Experiment results show that our algorithm is effective for editing 3D objects in NeRF under different text prompts.
arXiv Detail & Related papers (2023-06-09T04:49:31Z) - RecolorNeRF: Layer Decomposed Radiance Fields for Efficient Color Editing of 3D Scenes [21.284044381058575]
We present RecolorNeRF, a novel user-friendly color editing approach for neural radiance fields.
Our key idea is to decompose the scene into a set of pure-colored layers, forming a palette.
To support efficient palette-based editing, the color of each layer needs to be as representative as possible.
arXiv Detail & Related papers (2023-01-19T09:18:06Z) - Removing Objects From Neural Radiance Fields [60.067117643543824]
We propose a framework to remove objects from a NeRF representation created from an RGB-D sequence.
Our NeRF inpainting method leverages recent work in 2D image inpainting and is guided by a user-provided mask.
We show that our method for NeRF editing is effective for synthesizing plausible inpaintings in a multi-view coherent manner.
arXiv Detail & Related papers (2022-12-22T18:51:06Z) - PaletteNeRF: Palette-based Appearance Editing of Neural Radiance Fields [60.66412075837952]
We present PaletteNeRF, a novel method for appearance editing of neural radiance fields (NeRF) based on 3D color decomposition.
Our method decomposes the appearance of each 3D point into a linear combination of palette-based bases.
We extend our framework with compressed semantic features for semantic-aware appearance editing.
arXiv Detail & Related papers (2022-12-21T00:20:01Z) - iNeRF: Inverting Neural Radiance Fields for Pose Estimation [68.91325516370013]
We present iNeRF, a framework that performs mesh-free pose estimation by "inverting" a Neural Radiance Field (NeRF).
NeRFs have been shown to be remarkably effective for the task of view synthesis.
arXiv Detail & Related papers (2020-12-10T18:36:40Z) - NeRF++: Analyzing and Improving Neural Radiance Fields [117.73411181186088]
Neural Radiance Fields (NeRF) achieve impressive view synthesis results for a variety of capture settings.
NeRF fits multi-layer perceptrons representing view-invariant opacity and view-dependent color volumes to a set of training images.
We address a parametrization issue involved in applying NeRF to 360-degree captures of objects within large-scale, 3D scenes.
arXiv Detail & Related papers (2020-10-15T03:24:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.