Reference-based Controllable Scene Stylization with Gaussian Splatting
- URL: http://arxiv.org/abs/2407.07220v1
- Date: Tue, 9 Jul 2024 20:30:29 GMT
- Title: Reference-based Controllable Scene Stylization with Gaussian Splatting
- Authors: Yiqun Mei, Jiacong Xu, Vishal M. Patel
- Abstract summary: Reference-based scene stylization that edits the appearance based on a content-aligned reference image is an emerging research area.
We propose ReGS, which adapts 3D Gaussian Splatting (3DGS) for reference-based stylization to enable real-time stylized view synthesis.
- Score: 30.321151430263946
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reference-based scene stylization, which edits a scene's appearance to match a content-aligned reference image, is an emerging research area. Starting from a pretrained neural radiance field (NeRF), existing methods typically learn a novel appearance that matches the given style. Despite their effectiveness, they inherently suffer from time-consuming volume rendering and are thus impractical for many real-time applications. In this work, we propose ReGS, which adapts 3D Gaussian Splatting (3DGS) to reference-based stylization to enable real-time stylized view synthesis. Editing the appearance of a pretrained 3DGS is challenging because it uses discrete Gaussians as its 3D representation, which tightly bind appearance to geometry. Simply optimizing the appearance, as prior methods do, is often insufficient for modeling the continuous textures in the given reference image. To address this challenge, we propose a novel texture-guided control mechanism that adaptively adjusts the responsible local Gaussians into a new geometric arrangement that serves the desired texture details. The process is guided by texture cues for effective appearance editing and regularized by scene depth to preserve the original geometric structure. With these designs, we show that ReGS produces state-of-the-art stylization results that respect the reference texture while supporting real-time rendering for free-view navigation.
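The mechanism the abstract describes, appearance optimization against the reference regularized by scene depth, with a texture cue flagging Gaussians for adaptive densification, can be summarized in a minimal sketch. This is not the authors' implementation: the L1 loss forms, the depth weight, and the quantile-based texture cue are illustrative assumptions, and the rendered color and depth are presumed to come from a differentiable 3DGS renderer.

```python
# Minimal sketch of depth-regularized appearance editing for 3DGS stylization.
# NOT the ReGS implementation: losses, weights, and the quantile-based texture
# cue are illustrative assumptions; `rgb` and `depth` stand in for the outputs
# of an assumed differentiable splatting renderer.
import torch
import torch.nn.functional as F

def stylization_loss(rgb, depth, ref_rgb, scene_depth, depth_weight=0.1):
    """Match the content-aligned reference image while keeping the depth of
    the pretrained scene (geometry preservation)."""
    loss_tex = F.l1_loss(rgb, ref_rgb)         # appearance term
    loss_geo = F.l1_loss(depth, scene_depth)   # depth regularizer
    return loss_tex + depth_weight * loss_geo

def texture_cue_mask(rgb, ref_rgb, quantile=0.99):
    """Texture cue: pixels with the largest reference error mark regions whose
    responsible Gaussians are too coarse to express the reference texture and
    are candidates for densification (cloning/splitting, as in 3DGS adaptive
    density control)."""
    err = (rgb - ref_rgb).abs().mean(dim=0)    # per-pixel error, (H, W)
    thresh = torch.quantile(err.flatten(), quantile)
    return err > thresh                        # boolean (H, W) mask

# Toy usage with random tensors standing in for rendered outputs.
rgb = torch.rand(3, 64, 64, requires_grad=True)
depth = torch.rand(64, 64)
ref_rgb, scene_depth = torch.rand(3, 64, 64), depth.clone()
loss = stylization_loss(rgb, depth, ref_rgb, scene_depth)
loss.backward()
mask = texture_cue_mask(rgb.detach(), ref_rgb)
```

In ReGS the texture cue drives a geometric rearrangement of the responsible Gaussians; here it is reduced to a per-pixel mask for brevity.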
Related papers
- StylizedGS: Controllable Stylization for 3D Gaussian Splatting [53.0225128090909]
StylizedGS is a 3D neural style transfer framework, built on the 3D Gaussian Splatting representation, with adaptable control over perceptual factors.
Our method attains high-quality stylization results characterized by faithful brushstrokes and geometric consistency, with flexible controls.
arXiv Detail & Related papers (2024-04-08T06:32:11Z)
- DN-Splatter: Depth and Normal Priors for Gaussian Splatting and Meshing [19.437747560051566]
High-fidelity 3D reconstruction of common indoor scenes is crucial for VR and AR applications.
We extend 3D Gaussian splatting with depth and normal cues to tackle challenging indoor datasets.
We propose an adaptive depth loss based on the gradient of color images, improving depth estimation and novel view synthesis results (a generic edge-aware variant is sketched after this list).
arXiv Detail & Related papers (2024-03-26T16:00:31Z)
- Bridging 3D Gaussian and Mesh for Freeview Video Rendering [57.21847030980905]
GauMesh bridges 3D Gaussians and meshes for modeling and rendering dynamic scenes.
We show that our approach adapts the appropriate type of primitives to represent the different parts of the dynamic scene.
arXiv Detail & Related papers (2024-03-18T04:01:26Z)
- Recent Advances in 3D Gaussian Splatting [31.3820273122585]
3D Gaussian Splatting has greatly accelerated the rendering speed of novel view synthesis.
The explicit representation of 3D Gaussian Splatting facilitates editing tasks like dynamic reconstruction, geometry editing, and physical simulation.
We present a literature review of recent 3D Gaussian Splatting methods, which can be roughly classified into 3D reconstruction, 3D editing, and other downstream applications.
arXiv Detail & Related papers (2024-03-17T07:57:08Z)
- Texture-GS: Disentangling the Geometry and Texture for 3D Gaussian Splatting Editing [79.10630153776759]
3D Gaussian splatting, emerging as a groundbreaking approach, has drawn increasing attention for its capabilities of high-fidelity reconstruction and real-time rendering.
We propose a novel approach, namely Texture-GS, that disentangles appearance from geometry by representing the appearance as a 2D texture mapped onto the 3D surface.
Our method not only facilitates high-fidelity appearance editing but also achieves real-time rendering on consumer-level devices.
arXiv Detail & Related papers (2024-03-15T06:42:55Z)
- Gaussian Splatting in Style [35.376015119962354]
We propose a novel architecture, trained on a collection of style images, that at test time produces high-quality stylized novel views.
We show our method achieves state-of-the-art performance with superior visual quality on various indoor and outdoor real-world data.
arXiv Detail & Related papers (2024-03-13T13:06:31Z)
- Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
The recent 3D Gaussian Splatting method has achieved state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z)
- GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performing representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)
- NeuMesh: Learning Disentangled Neural Mesh-based Implicit Field for Geometry and Texture Editing [39.71252429542249]
We present a novel mesh-based representation by encoding the neural implicit field with disentangled geometry and texture codes on mesh vertices.
We develop several techniques, including learnable sign indicators, to magnify the spatial distinguishability of the mesh-based representation.
Experiments and editing examples on both real and synthetic data demonstrate the superiority of our method on representation quality and editing ability.
arXiv Detail & Related papers (2022-07-25T05:30:50Z)
- ARF: Artistic Radiance Fields [63.79314417413371]
We present a method for transferring the artistic features of an arbitrary style image to a 3D scene.
Previous methods that perform 3D stylization on point clouds or meshes are sensitive to geometric reconstruction errors.
We propose to stylize the more robust radiance field representation.
arXiv Detail & Related papers (2022-06-13T17:55:31Z)
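The adaptive depth loss mentioned in the DN-Splatter entry above can be illustrated with a generic edge-aware formulation: depth supervision is down-weighted where the color image has strong gradients, since depth discontinuities tend to coincide with image edges and are less reliable there. This is a common pattern, not necessarily DN-Splatter's exact loss; the exponential weighting is an assumption.

```python
# Generic edge-aware depth loss (a common pattern; DN-Splatter's exact
# formulation may differ): color-gradient weights soften depth supervision
# near image edges and keep it strong in flat, reliable regions.
import torch

def edge_aware_depth_loss(depth, depth_gt, image):
    """depth, depth_gt: (H, W); image: (3, H, W) in [0, 1]."""
    # Horizontal and vertical color gradients, averaged over channels.
    gx = (image[:, :, 1:] - image[:, :, :-1]).abs().mean(dim=0)  # (H, W-1)
    gy = (image[:, 1:, :] - image[:, :-1, :]).abs().mean(dim=0)  # (H-1, W)
    # Per-pixel weight: near 1 in flat regions, small across strong edges.
    w = torch.ones_like(depth)
    w[:, 1:] = w[:, 1:] * torch.exp(-gx)
    w[1:, :] = w[1:, :] * torch.exp(-gy)
    return (w * (depth - depth_gt).abs()).mean()

# Toy check with random inputs of matching shapes.
d, d_gt, img = torch.rand(48, 64), torch.rand(48, 64), torch.rand(3, 48, 64)
print(edge_aware_depth_loss(d, d_gt, img).item())
```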