StylizedGS: Controllable Stylization for 3D Gaussian Splatting
- URL: http://arxiv.org/abs/2404.05220v2
- Date: Tue, 13 Aug 2024 03:43:30 GMT
- Title: StylizedGS: Controllable Stylization for 3D Gaussian Splatting
- Authors: Dingxi Zhang, Yu-Jie Yuan, Zhuoxun Chen, Fang-Lue Zhang, Zhenliang He, Shiguang Shan, Lin Gao
- Abstract summary: StylizedGS is an efficient 3D neural style transfer framework with adaptable control over perceptual factors.
Our method achieves high-quality stylization results characterized by faithful brushstrokes and geometric consistency with flexible controls.
- Score: 53.0225128090909
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As XR technology continues to advance rapidly, 3D generation and editing are increasingly crucial. Among these, stylization plays a key role in enhancing the appearance of 3D models. By utilizing stylization, users can achieve consistent artistic effects in 3D editing using a single reference style image, making it a user-friendly editing method. However, recent NeRF-based 3D stylization methods encounter efficiency issues that impact the user experience, and their implicit nature limits their ability to accurately transfer geometric pattern styles. Additionally, the ability for artists to apply flexible control over stylized scenes is considered highly desirable to foster an environment conducive to creative exploration. To address the above issues, we introduce StylizedGS, an efficient 3D neural style transfer framework with adaptable control over perceptual factors based on the 3D Gaussian Splatting (3DGS) representation. We propose a filter-based refinement to eliminate floaters that affect the stylization effects in the scene reconstruction process. A nearest neighbor-based style loss is introduced to achieve stylization by fine-tuning the geometry and color parameters of 3DGS, while a depth preservation loss with other regularizations is proposed to prevent undesired changes to the geometric content. Moreover, facilitated by specially designed losses, StylizedGS enables users to control color, stylized scale, and regions during stylization, providing customization capabilities. Our method achieves high-quality stylization results characterized by faithful brushstrokes and geometric consistency with flexible controls. Extensive experiments across various scenes and styles demonstrate the effectiveness and efficiency of our method concerning both stylization quality and inference speed.
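The nearest neighbor-based style loss mentioned in the abstract matches each feature of the rendered view to its closest feature from the style image and penalizes the distance. The sketch below is a minimal, hedged illustration of that idea with cosine distance over flattened feature maps; the function name, NumPy implementation, and feature shapes are assumptions for illustration, not the paper's actual code:

```python
import numpy as np

def nn_style_loss(feat_render, feat_style):
    """Nearest neighbor-based style loss (illustrative sketch).

    feat_render: (N, D) feature vectors from the rendered 3DGS view
                 (e.g. a flattened VGG feature map).
    feat_style:  (M, D) feature vectors from the reference style image.
    Each rendered feature is matched to its cosine-nearest style feature,
    and the mean matched distance is returned.
    """
    # Normalize rows to unit length so dot products are cosine similarities.
    r = feat_render / (np.linalg.norm(feat_render, axis=1, keepdims=True) + 1e-8)
    s = feat_style / (np.linalg.norm(feat_style, axis=1, keepdims=True) + 1e-8)
    # Cosine distance matrix of shape (N, M); smaller means more similar.
    dist = 1.0 - r @ s.T
    # Minimize the distance to the nearest style feature, averaged over pixels.
    return dist.min(axis=1).mean()
```

Minimizing this loss with respect to the 3DGS color and geometry parameters (via a differentiable renderer) pulls the rendered features toward the style image's local statistics, which is what produces the brushstroke-like patterns described above.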
Related papers
- StyleSplat: 3D Object Style Transfer with Gaussian Splatting [0.3374875022248866]
Style transfer can enhance 3D assets with diverse artistic styles, transforming creative expression.
We introduce StyleSplat, a method for stylizing 3D objects in scenes represented by 3D Gaussians from reference style images.
We demonstrate its effectiveness across various 3D scenes and styles, showcasing enhanced control and customization in 3D creation.
arXiv Detail & Related papers (2024-07-12T17:55:08Z)
- CoARF: Controllable 3D Artistic Style Transfer for Radiance Fields [7.651502365257349]
We introduce Controllable Artistic Radiance Fields (CoARF), a novel algorithm for controllable 3D scene stylization.
CoARF provides user-specified controllability of style transfer and superior style transfer quality with more precise feature matching.
arXiv Detail & Related papers (2024-04-23T12:22:32Z)
- 3DStyleGLIP: Part-Tailored Text-Guided 3D Neural Stylization [1.2499537119440243]
3DStyleGLIP is a novel framework specifically designed for text-driven, part-tailored 3D stylization.
Our method achieves significant part-wise stylization capabilities, demonstrating promising potential in advancing the field of 3D stylization.
arXiv Detail & Related papers (2024-04-03T10:44:06Z)
- GaussianStyle: Gaussian Head Avatar via StyleGAN [64.85782838199427]
We propose a novel framework that integrates the volumetric strengths of 3DGS with the powerful implicit representation of StyleGAN.
We show that our method achieves state-of-the-art performance in reenactment, novel view synthesis, and animation.
arXiv Detail & Related papers (2024-02-01T18:14:42Z)
- DeformToon3D: Deformable 3D Toonification from Neural Radiance Fields [96.0858117473902]
3D toonification involves transferring the style of an artistic domain onto a target 3D face with stylized geometry and texture.
We propose DeformToon3D, an effective toonification framework tailored for hierarchical 3D GAN.
Our approach decomposes 3D toonification into subproblems of geometry and texture stylization to better preserve the original latent space.
arXiv Detail & Related papers (2023-09-08T16:17:45Z)
- ARF-Plus: Controlling Perceptual Factors in Artistic Radiance Fields for 3D Scene Stylization [11.841897748330302]
Radiance field style transfer is an emerging field that has recently gained popularity as a means of 3D scene stylization.
We highlight a research gap in radiance fields style transfer, the lack of sufficient perceptual controllability.
We present ARF-Plus, a 3D neural style transfer framework offering manageable control over perceptual factors.
arXiv Detail & Related papers (2023-08-23T22:22:20Z)
- StyleRF: Zero-shot 3D Style Transfer of Neural Radiance Fields [52.19291190355375]
StyleRF (Style Radiance Fields) is an innovative 3D style transfer technique.
It employs an explicit grid of high-level features to represent 3D scenes, with which high-fidelity geometry can be reliably restored via volume rendering.
It transforms the grid features according to the reference style which directly leads to high-quality zero-shot style transfer.
arXiv Detail & Related papers (2023-03-19T08:26:06Z)
- ARF: Artistic Radiance Fields [63.79314417413371]
We present a method for transferring the artistic features of an arbitrary style image to a 3D scene.
Previous methods that perform 3D stylization on point clouds or meshes are sensitive to geometric reconstruction errors.
We propose to stylize the more robust radiance field representation.
arXiv Detail & Related papers (2022-06-13T17:55:31Z)
- 3DStyleNet: Creating 3D Shapes with Geometric and Texture Style Variations [81.45521258652734]
We propose a method to create plausible geometric and texture style variations of 3D objects.
Our method can create many novel stylized shapes, resulting in effortless 3D content creation and style-aware data augmentation.
arXiv Detail & Related papers (2021-08-30T02:28:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.