GaussianEditor: Swift and Controllable 3D Editing with Gaussian
Splatting
- URL: http://arxiv.org/abs/2311.14521v4
- Date: Wed, 20 Dec 2023 14:35:27 GMT
- Title: GaussianEditor: Swift and Controllable 3D Editing with Gaussian
Splatting
- Authors: Yiwen Chen, Zilong Chen, Chi Zhang, Feng Wang, Xiaofeng Yang, Yikai
Wang, Zhongang Cai, Lei Yang, Huaping Liu, Guosheng Lin
- Abstract summary: 3D editing plays a crucial role in many areas such as gaming and virtual reality.
Traditional 3D editing methods, which rely on representations like meshes and point clouds, often fall short in realistically depicting complex scenes.
Our paper presents GaussianEditor, an innovative and efficient 3D editing algorithm based on Gaussian Splatting (GS), a novel 3D representation.
- Score: 66.08674785436612
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: 3D editing plays a crucial role in many areas such as gaming and virtual
reality. Traditional 3D editing methods, which rely on representations like
meshes and point clouds, often fall short in realistically depicting complex
scenes. On the other hand, methods based on implicit 3D representations, like
Neural Radiance Field (NeRF), render complex scenes effectively but suffer from
slow processing speeds and limited control over specific scene areas. In
response to these challenges, our paper presents GaussianEditor, an innovative
and efficient 3D editing algorithm based on Gaussian Splatting (GS), a novel 3D
representation. GaussianEditor enhances precision and control in editing
through our proposed Gaussian semantic tracing, which traces the editing target
throughout the training process. Additionally, we propose Hierarchical Gaussian
Splatting (HGS) to achieve stabilized and fine results under stochastic
generative guidance from 2D diffusion models. We also develop editing
strategies for efficient object removal and integration, a challenging task for
existing methods. Our comprehensive experiments demonstrate GaussianEditor's
superior control, efficacy, and rapid performance, marking a significant
advancement in 3D editing. Project Page:
https://buaacyw.github.io/gaussian-editor/
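The abstract describes Gaussian semantic tracing only at a high level. The snippet below is a minimal, hypothetical PyTorch sketch of the underlying idea (not the authors' released code): each Gaussian carries a label marking whether it belongs to the editing target, and gradients for all other Gaussians are zeroed so the rest of the scene stays fixed during diffusion-guided optimization. The class name, `mask_gradients`, and the commented `diffusion_guidance`/`render` calls are illustrative placeholders.

```python
# Hypothetical sketch of Gaussian semantic tracing (illustration only):
# only Gaussians traced as the editing target receive gradient updates.
import torch

class TracedGaussians(torch.nn.Module):
    def __init__(self, num_gaussians: int, target_mask: torch.Tensor):
        super().__init__()
        # Simplified per-Gaussian parameters (position, opacity, colour).
        self.xyz = torch.nn.Parameter(torch.randn(num_gaussians, 3))
        self.opacity = torch.nn.Parameter(torch.zeros(num_gaussians, 1))
        self.color = torch.nn.Parameter(torch.rand(num_gaussians, 3))
        # Boolean mask marking Gaussians traced as the editing target.
        self.register_buffer("target_mask", target_mask)

    def mask_gradients(self) -> None:
        """Zero gradients of Gaussians outside the traced editing target,
        so untouched scene regions are not modified by the 2D guidance."""
        keep = self.target_mask.float().unsqueeze(-1)
        for p in (self.xyz, self.opacity, self.color):
            if p.grad is not None:
                p.grad.mul_(keep)

# Simplified training step (placeholder function names):
#   loss = diffusion_guidance(render(gaussians, camera), prompt)
#   loss.backward()
#   gaussians.mask_gradients()
#   optimizer.step()
```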
Related papers
- GSEditPro: 3D Gaussian Splatting Editing with Attention-based Progressive Localization [11.170354299559998]
We propose GSEditPro, a novel 3D scene editing framework that allows users to perform diverse, creative, and precise edits using only text prompts.
We introduce an attention-based progressive localization module to add semantic labels to each Gaussian during rendering.
This enables precise localization of editing areas by classifying Gaussians according to their relevance to the editing prompts, derived from the cross-attention layers of the T2I model.
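The summary above only names the attention-based classification step. A minimal, hypothetical sketch of that idea is given below, assuming a per-Gaussian relevance score has already been accumulated from the T2I model's cross-attention maps over rendered views; the function name and threshold are illustrative, not taken from the paper.

```python
import torch

def label_gaussians_by_attention(attn_scores: torch.Tensor,
                                 threshold: float = 0.5) -> torch.Tensor:
    """Classify Gaussians as editable vs. frozen from accumulated attention.

    attn_scores: (N,) relevance of each Gaussian to the editing prompt,
    e.g. obtained by projecting 2D cross-attention maps back onto the
    Gaussians over several rendered views (assumed precomputed here).
    Returns a boolean mask of Gaussians inside the editing region.
    """
    # Normalise to [0, 1] so a single threshold behaves consistently.
    scores = (attn_scores - attn_scores.min()) / \
             (attn_scores.max() - attn_scores.min() + 1e-8)
    return scores > threshold
```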
arXiv Detail & Related papers (2024-11-15T08:25:14Z)
- 3DitScene: Editing Any Scene via Language-guided Disentangled Gaussian Splatting [100.94916668527544]
Existing methods focus solely on either individual 2D object editing or global 3D scene editing.
We propose 3DitScene, a novel and unified scene editing framework.
It enables seamless editing from 2D to 3D, allowing precise control over scene composition and individual objects.
arXiv Detail & Related papers (2024-05-28T17:59:01Z)
- TIGER: Text-Instructed 3D Gaussian Retrieval and Coherent Editing [12.50147114409895]
This paper proposes a systematic approach, namely TIGER, for coherent text-instructed 3D Gaussian retrieval and editing.
To overcome over-smoothing and inconsistency issues in editing, we propose Coherent Score Distillation (CSD), which aggregates a 2D image-editing diffusion model and a multi-view diffusion model for score distillation.
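How the two diffusion priors are combined is not spelled out in the summary. The sketch below is a hypothetical SDS-style aggregation of the two noise predictions; the weighting scheme and names are assumptions for illustration, and the actual CSD formulation in the paper may differ.

```python
import torch

def coherent_score_distillation_grad(eps_edit: torch.Tensor,
                                     eps_mv: torch.Tensor,
                                     eps_added: torch.Tensor,
                                     weight_mv: float = 0.5,
                                     guidance_scale: float = 7.5) -> torch.Tensor:
    """Hypothetical aggregation of two diffusion priors for score distillation.

    eps_edit:  noise predicted by a 2D image-editing diffusion model.
    eps_mv:    noise predicted by a multi-view diffusion model.
    eps_added: the noise actually added to the rendered view.
    The blended residual serves as the gradient pushed back through the
    rendered views in an SDS-style update; the weights are illustrative.
    """
    eps_combined = (1.0 - weight_mv) * eps_edit + weight_mv * eps_mv
    return guidance_scale * (eps_combined - eps_added)
```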
arXiv Detail & Related papers (2024-05-23T11:37:17Z)
- DragGaussian: Enabling Drag-style Manipulation on 3D Gaussian Representation [57.406031264184584]
DragGaussian is a 3D object drag-editing framework based on 3D Gaussian Splatting.
Our contributions include the introduction of a new task, the development of DragGaussian for interactive point-based 3D editing, and comprehensive validation of its effectiveness through qualitative and quantitative experiments.
arXiv Detail & Related papers (2024-05-09T14:34:05Z) - DGE: Direct Gaussian 3D Editing by Consistent Multi-view Editing [72.54566271694654]
We consider the problem of editing 3D objects and scenes based on open-ended language instructions.
A common approach to this problem is to use a 2D image generator or editor to guide the 3D editing process.
This process is often inefficient due to the need for iterative updates of costly 3D representations.
arXiv Detail & Related papers (2024-04-29T17:59:30Z) - GSEdit: Efficient Text-Guided Editing of 3D Objects via Gaussian Splatting [10.527349772993796]
We present GSEdit, a pipeline for text-guided 3D object editing based on Gaussian Splatting models.
Our method enables the editing of the style and appearance of 3D objects without altering their main details, all in a matter of minutes on consumer hardware.
arXiv Detail & Related papers (2024-03-08T08:42:23Z) - Gaussian Grouping: Segment and Edit Anything in 3D Scenes [65.49196142146292]
We propose Gaussian Grouping, which extends Gaussian Splatting to jointly reconstruct and segment anything in open-world 3D scenes.
Compared to the implicit NeRF representation, we show that the grouped 3D Gaussians can reconstruct, segment and edit anything in 3D with high visual quality, fine granularity and efficiency.
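One common way to realize such grouping, sketched hypothetically below, is to attach a learned identity embedding to each Gaussian and train a small classifier on those embeddings (supervised by 2D instance masks on rendered views) so that Gaussians can be selected and edited per object. The class, dimensions, and supervision source here are assumptions, not details from the paper's summary.

```python
import torch

class GroupedGaussians(torch.nn.Module):
    """Hypothetical sketch: Gaussians augmented with identity embeddings so
    they can be grouped into objects and selected for editing."""
    def __init__(self, num_gaussians: int, id_dim: int = 16, num_groups: int = 32):
        super().__init__()
        # Per-Gaussian identity embedding, learned alongside the usual parameters.
        self.identity = torch.nn.Parameter(torch.randn(num_gaussians, id_dim))
        # Classifier mapping identity features to group labels; in practice it
        # would be supervised by 2D instance masks rendered from the scene.
        self.classifier = torch.nn.Linear(id_dim, num_groups)

    def group_of(self) -> torch.Tensor:
        """Assign each Gaussian to its most likely group for selection/editing."""
        return self.classifier(self.identity).argmax(dim=-1)
```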
arXiv Detail & Related papers (2023-12-01T17:09:31Z) - GaussianEditor: Editing 3D Gaussians Delicately with Text Instructions [90.38892097863814]
We propose a systematic framework, named GaussianEditor, to edit 3D scenes delicately via 3D Gaussians with text instructions.
Our framework can achieve more delicate and precise editing of 3D scenes than previous methods while enjoying much faster training speed.
arXiv Detail & Related papers (2023-11-27T17:58:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.